Data Engineer

Sydney, Australia

Company Description

Nine is Australia’s largest locally owned media company – the home of Australia’s most trusted and loved brands spanning News, Sport, Lifestyle, and Entertainment. We pride ourselves on creating the best content, accessed by consumers when and how they want – across Publishing, Broadcasting and Digital.

Nine’s assets include the 9Network, major mastheads such as The Sydney Morning Herald, The Age and The Australian Financial Review, radio stations 2GB, 3AW, 4BC and 6PR, digital properties such as nine.com.au, 9Now, 9Honey, Pedestrian.TV, Drive, subscription video platform Stan and a majority investment in Domain Group.

Our Purpose: We shape culture by sparking conversations, challenging perspectives, and entertaining our communities.

We bring people together by celebrating the big occasions and connecting the everyday moments. Australia belongs here. We bring our purpose to life via three shared values: We walk the talk, turn over every stone and keep it human.

Job Description

Data Engineering builds and manages the pipelines, applications and infrastructure for collecting, processing, and storing Nine’s data. We are responsible for collecting hundreds of millions of events from client devices each day and streaming them directly into our data warehouse. We extract data from the majority of Nine’s enterprise systems, centralise it in our data warehouse, and create rich data models for analysis and to inform business operations and strategy.

The team is also responsible for building solutions to surface data to users including dashboards, Slack integrations and automated static reports, as well as providing data feeds to downstream systems for personalising audience experiences and commercialisation.

Role responsibilities

  • Design, develop and maintain data pipelines that extract, load and transform data from various sources into the data warehouse.

  • Design and implement data models that align with business requirements and support efficient querying and analysis.

  • Create dashboards, reports and visualisations that directly address business problems and inform and drive action.

  • Ensure the pipelines are reliable, scalable, and efficient, considering factors including data volume, velocity, and variety.

  • Implement data quality checks and validation processes to maintain accurate and reliable data.

  • Set up monitoring and alerting to proactively identify and resolve issues.

  • Perform root cause analysis for incidents and implement corrective actions.

  • Contribute to the technical design and vision for the data platform.

  • Collaborate with cross-functional teams, including product managers, data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions that meet business needs.

  • Communicate technical concepts and solutions effectively to non-technical stakeholders.

  • Stay updated with emerging data engineering technologies and trends, and evaluate their potential benefits for the organisation's data ecosystem.

Qualifications

  • Fluent in at least one programming language (e.g. Python, C#, Java).

  • Proficient in SQL.

  • Understanding of OLTP concepts including normalised schema design.

  • Understanding of OLAP concepts and data model design.

  • Experience in designing and building data pipelines, ETL processes, and managing data infrastructure.

  • Experience automating workflows and managing task dependencies using a tool such as Apache Airflow.
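To make the last requirement concrete, here is a minimal, self-contained sketch of the extract → load → transform dependency pattern that an orchestrator like Apache Airflow manages. All task names and data are illustrative assumptions, not Nine's actual pipeline; a real Airflow DAG would declare the same ordering with `extract >> load >> transform`.

```python
# Hypothetical sketch: a tiny ELT "DAG" run in topological order.
# Task names, data and the in-memory "warehouse" are illustrative only.

warehouse = {}  # stands in for the real data warehouse

def extract():
    """Pull raw events from a source system (stubbed here)."""
    return [{"user": "a", "event": "view"}, {"user": "b", "event": "click"}]

def load(rows):
    """Land raw rows in the warehouse."""
    warehouse["raw_events"] = rows
    return rows

def transform(rows):
    """Build a simple aggregate model from the loaded rows."""
    counts = {}
    for row in rows:
        counts[row["event"]] = counts.get(row["event"], 0) + 1
    warehouse["event_counts"] = counts
    return counts

# Each task lists its upstream dependencies, mirroring
# Airflow's `extract >> load >> transform` arrow syntax.
DEPENDENCIES = {"extract": [], "load": ["extract"], "transform": ["load"]}
TASKS = {"extract": extract, "load": load, "transform": transform}

def run_pipeline():
    """Run tasks once their upstreams are done, passing results downstream."""
    done, results = set(), {}
    while len(done) < len(TASKS):
        for name, upstream in DEPENDENCIES.items():
            if name not in done and all(u in done for u in upstream):
                args = [results[u] for u in upstream]
                results[name] = TASKS[name](*args)
                done.add(name)
    return results

run_pipeline()
```

An orchestrator adds what this sketch omits: scheduling, retries, backfills and the monitoring and alerting called out in the responsibilities above.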

Desirable:

  • A bachelor's or master's degree in computer science, software engineering, data engineering, data science, information technology or a related field.

  • Experience with Google Cloud Platform, AWS or Azure including services for data storage, processing, and analytics.

  • Experience with cloud deployment and test automation using a CI/CD solution such as Cloud Build or Concourse.

  • Experience deploying cloud infrastructure as code (IaC) using Terraform or a similar tool.

Additional Information

Our Commitment to Diversity and Inclusion:

At Nine, we are committed to fostering a workforce that embraces all aspects of diversity and inclusion and where practices are equitable to ensure our people experience a sense of belonging. From day one, you'll be encouraged to bring your whole self to work and will be supported to perform at your best. Should you require any adjustments to the recruitment process in order to equitably participate, we encourage you to advise us at the time of application.

We encourage applications from Aboriginal and Torres Strait Islander people, people with disabilities, and of all ages, nationalities, backgrounds and cultures.

Disclaimer: We do not accept unsolicited agency resumes and are not responsible for any fees related to unsolicited resumes.

