Senior Data Operations Engineer
Melbourne Office (33X)
Bupa
Bupa is an international healthcare company. Our purpose is helping people live longer, healthier, happier lives and making a better world.
General information
Name: Senior Data Operations Engineer | Ref #: 48397 | Date: Monday, April 28, 2025 | Full/Part Time: Full Time
Description & Requirements
Opportunity snapshot:
An exciting, 12-month fixed-term opportunity has become available for a Senior Data Operations Engineer.
Our Data Operations Team is a team with specialist skills that enables Bupa ANZ to execute on its Data Strategy. Collectively, we are building the future state of Bupa’s data platforms on Microsoft Azure services and enabling the business to work strategically with data to maximise competitive performance. Data Operations contributes to the evolution of Bupa’s data estate via squads, product-based teams, and business-partnering approaches.
We are growing our DevOps and DataOps capabilities at Bupa, maturing our practices toward industry-leading approaches so we can proactively solve problems before they affect customers.
Our DataOps people have a broad skill set to continue our consolidation and migration journey to our Strategic Data Platform. Skills in Operational Data and Analytics, Data Warehouse and BI, and Advanced Analytics and Self-Service, along with experience across business units including Health Insurance, Health Services, and corporate functions, are as important as cloud-based skills in Databricks and Azure.
The Senior Data Operations Engineer is embedded in and closely coupled with product-based teams, helping design and deliver outcomes aligned to business and data platform squads. They also play an important role in defining better practices, standards, and guidelines, and are a key contributor to the team’s continual improvement initiatives.
What will I be doing?
- Lead, shape, and influence outcomes across Data Operations, Cloud Operations, Data Engineering, and broader engineering teams in Technology (such as Infrastructure).
- Work directly with product-based teams and the broader Chief Data Office to understand requirements and deliver outcomes based on our customers’ needs.
- Drive continuous improvement in DevOps and Cloud Platform capability, ensuring it remains relevant and meets the needs of internal customer groups. Maturity is measured against industry standards such as DORA.
- Establish and refine best practices for source control, continuous integration, automated testing, and service management. In this role, you will create our future-state CI/CD and test automation, and improve and automate aspects of service management, with progress measured against DevOps metrics.
- Mature infrastructure as code (IaC), automation scripts, and data pipeline code, ensuring optimal performance, security, and maintainability. In this role, you will create our future-state IaC approaches.
- Create secure tools and products for data engineers to support and streamline their CI/CD processes.
- Create and share knowledge and operational support documents to enable faster resolution. In this role, you will ensure that all members of the team can operate and use the products you produce.
Requirements for the role:
- Educated to a minimum of degree level in engineering, computer science, or a related technology discipline.
- 7+ years’ experience in DevOps, Data Engineering, and/or Software Development.
- Demonstrated experience leading code reviews and improving code quality across a team.
- Strong background in cloud platforms (AWS, Azure, GCP) with a focus on building scalable, resilient infrastructure for both applications and data workflows.
- Deep understanding of data engineering principles, including building and managing ETL/ELT pipelines, data lake solutions (e.g., Databricks), and streaming data technologies (e.g., Kafka, Spark).
- Hands-on experience with containerization (Docker) and orchestration (Kubernetes) for managing large-scale, distributed systems.
- Proficiency in programming and scripting languages such as Python, PowerShell, or Bash for automating infrastructure and data processing tasks.
About us:
Bupa has a strategic goal of being the most customer-centric digital healthcare organisation, with the use of data as an explicit pillar of this strategy.
The program pillars include:
- A Customer-Centric Focus - Bupa aims to grow our data products, the data that people have access to, and the utilisation of data as a strategic asset that drives our Connected Care, Core Modernisation, and other strategic change initiatives.
- Data Access and Democratization - Ensuring everyone has access to and can understand the data Bupa holds; reducing barriers to entry and making the complex simple for people who want to leverage our data assets.
- Support for Access - Help leaders ensure easy access to data for the people they work with.
- Empowerment - Enabling business people to do their role as owners, stewards, and leaders.
- Delivery & Customer Value First - Treat data as an asset that creates value for customers, removing compliance, risk, and other barriers to value.
What’s in it for you?
As well as a competitive salary, a range of Bupa benefits, and flexible working / work-from-home options, you’ll be challenged and encouraged to innovate. You will collaborate closely with colleagues who are committed to delivering exceptional experiences. We trust, respect, and consider everyone, knowing your difference will make the difference.
Other benefits include subsidies on our health insurance, travel, car, home, contents and pet insurance products as well as Bupa services such as Dental and Optical. You can also access a ‘People First’ wellness program which provides a range of services such as health coaches, annual skin checks and flu vaccinations, assistance with nutrition, mental and general well-being guides and product discounts.
You’ll feel happier & healthier working at Bupa!
Location: Melbourne Office (33X) | Recruiter: Luke Sist*