Senior Data Operations Engineer

Melbourne, Australia

Company Description

At Intellihub, we're not just about metering and data solutions – we're about our people. Our team of passionate individuals drives our mission to simplify the energy transition and make a real difference in the lives of millions of people across Australia and New Zealand.  But our success is not just measured by numbers. It's measured by the diverse and inclusive culture we foster within our team, where everyone is valued and given equal opportunities to thrive in a positive, safe and productive workplace.

Providing smart metering devices and services to over 50 energy retailers across ANZ, our reach is growing rapidly. But the true breadth of what we do goes beyond the meter, touching everything from solar to water, virtual power plants to electric vehicle charging.

Job Description

Intellihub is seeking a Senior DataOps Engineer to design, develop, and operationalise data pipelines and integration frameworks for our enterprise data platform. You will be part of the DataOps team, ensuring the ongoing operation, support, and maintenance of data products and platform services.

In this role, you'll drive improvements in data operations, enhance data pipeline efficiency, and leverage new technology trends such as advanced analytics and Generative AI (GenAI). You'll also mentor junior engineers and contribute to their development.

Key Responsibilities:

  • Enhance and innovate ETL design patterns
  • Develop data ingestion processes and models to meet business needs (a brief illustrative sketch follows this list)
  • Improve data engineering processes, frameworks, and automation
  • Maintain scalable, high-quality solutions using CI/CD and modular design
  • Support platform engineering, including infrastructure and DevOps practices
  • Define operational metrics for system availability, reliability, and data quality, following ITIL processes
  • Automate cloud infrastructure management using Infrastructure as Code (IaC)
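
To give a flavour of this kind of work, below is a minimal, illustrative Python sketch of a modular ingestion step with a basic data-quality check. It is a generic example only: the file name, field names, and logic are hypothetical and do not describe Intellihub's actual platform or pipelines.

    import csv
    import logging
    from dataclasses import dataclass
    from typing import Iterable, List

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("ingest")

    @dataclass
    class MeterReading:
        meter_id: str   # hypothetical fields, for illustration only
        kwh: float

    def extract(path: str) -> Iterable[dict]:
        # Read raw rows from a CSV file (a stand-in for any source system).
        with open(path, newline="") as f:
            yield from csv.DictReader(f)

    def transform(rows: Iterable[dict]) -> List[MeterReading]:
        # Simple data-quality rule: skip rows with missing, non-numeric, or negative kWh.
        clean = []
        for row in rows:
            try:
                kwh = float(row["kwh"])
            except (KeyError, ValueError, TypeError):
                log.warning("Skipping malformed row: %s", row)
                continue
            if kwh >= 0:
                clean.append(MeterReading(meter_id=row.get("meter_id", "unknown"), kwh=kwh))
        return clean

    def load(readings: List[MeterReading]) -> None:
        # Stand-in load step; a real pipeline might write to a warehouse or lake table here.
        log.info("Loaded %d validated readings", len(readings))

    if __name__ == "__main__":
        load(transform(extract("readings.csv")))

Keeping each step as a small, testable function is what makes a pipeline like this straightforward to cover with automated tests and ship through CI/CD.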

Qualifications

  • Tertiary qualification in Information Technology, Engineering, or a related discipline
  • Experience with Agile Program Increment (PI) planning / Scaled Agile Framework (SAFe) and backlog grooming
  • Strong problem-solving skills and motivation
  • Experience with the AWS cloud platform and dbt is essential
  • Experience in Data Warehouse, Data Lake, and Data Lakehouse implementation
  • Expertise in SQL and Python
  • Experience with data platforms and tools such as Databricks (see the illustrative sketch after this list)
  • Knowledge of data modelling techniques for Data Warehouse and Data Mart applications
  • In-depth knowledge of large-scale data sets (structured and unstructured)
  • Experience with DevOps and agile delivery (code management, CI/CD, modular design)
  • Familiarity with monitoring and logging solutions for data platforms and tools
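
As a loose illustration of the SQL/Python, Databricks, and data-quality skills listed above, here is a short PySpark sketch of a data-quality gate followed by a simple daily aggregation. It assumes a generic Spark/Delta environment; the table and column names (raw.meter_readings, curated.meter_daily_usage, reading_ts, kwh) are hypothetical and are not Intellihub's schema.

    from pyspark.sql import SparkSession, functions as F

    # Illustrative only: table and column names are hypothetical.
    spark = SparkSession.builder.appName("meter-daily-usage").getOrCreate()

    readings = spark.read.table("raw.meter_readings")

    # Data-quality gate: fail fast if negative consumption values appear.
    bad_rows = readings.filter(F.col("kwh") < 0).count()
    if bad_rows > 0:
        raise ValueError(f"{bad_rows} readings have negative kWh; halting the load")

    # Simple transformation: total consumption per meter per day.
    daily_usage = (
        readings
        .withColumn("reading_date", F.to_date("reading_ts"))
        .groupBy("meter_id", "reading_date")
        .agg(F.sum("kwh").alias("total_kwh"))
    )

    daily_usage.write.mode("overwrite").saveAsTable("curated.meter_daily_usage")

In a dbt or Databricks job, logic of this kind would typically live as a versioned model or scheduled task, monitored through the platform rather than run by hand.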

Additional Information

We offer a dynamic and inclusive workplace that values collaboration and diversity. With a flexible hybrid working model, we prioritise work-life balance while fostering personal and professional growth. Our innovative culture provides opportunities for career development, access to industry-leading tools, and a strong commitment to employee well-being, all supported by a competitive salary and benefits package.

Sound exciting? Apply now and be part of a team that's shaping the future of energy. For further information, please contact Alicia at careers@intellihub.com.au.
