Specialist Data Engineer

Australia

BHP

At BHP what we produce is essential for the world to continue to grow and make the transition to cleaner energy possible.


 

 

About BHP

 

At BHP we support our people to grow, learn, develop their skills and reach their potential. With a global portfolio of operations, we offer a diverse and inclusive environment with extraordinary career opportunities. Our strategy is to focus on creating a safe work environment where our employees feel strongly connected to our values and objectives, and where the capability of our people is key to our success.

 

Come and be a part of this success.

 

About the Role

 

As a Data Engineer, you will support projects that fundamentally change BHP’s performance across safety, productivity and people. You will play a pivotal role in enabling data-driven decision-making and ensuring data availability, scalability, and reliability.


The work:
 

The Data Engineer will be critical to the successful development and implementation of digital solutions and products, as these are built on effective access to high-quality data.
 

The Data Engineer’s primary responsibilities are to define the data lifecycle (including data models and data sources for analytics platforms) and to gather and clean data from across the business, providing ready-to-work inputs for Data Scientists and other consumers.
 

The Data Engineer will also apply strong expertise in data mining and information retrieval to design, develop, optimise, and maintain data architecture and pipelines that adhere to ETL principles and business goals.
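
To make the ETL responsibility concrete, here is a minimal Python sketch of the extract-transform-load pattern; the CSV source, column names and SQLite target are illustrative assumptions, not BHP systems.

```python
# Minimal extract-transform-load (ETL) sketch. File name, columns and the
# SQLite target are hypothetical stand-ins for real source and warehouse systems.
import csv
import sqlite3


def extract(path: str) -> list[dict]:
    """Extract: read raw rows from a source system's CSV export."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def transform(rows: list[dict]) -> list[tuple]:
    """Transform: enforce types and drop incomplete records."""
    cleaned = []
    for row in rows:
        if not row.get("asset_id") or not row.get("tonnes"):
            continue  # skip rows that cannot feed downstream analytics
        cleaned.append((row["asset_id"].strip(), float(row["tonnes"])))
    return cleaned


def load(records: list[tuple], db_path: str = "warehouse.db") -> None:
    """Load: write ready-to-work inputs into an analytics table."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS production (asset_id TEXT, tonnes REAL)")
    con.executemany("INSERT INTO production VALUES (?, ?)", records)
    con.commit()
    con.close()


if __name__ == "__main__":
    load(transform(extract("daily_production.csv")))
```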
 

As such, the Data Engineer is accountable for ensuring that the data made available to applications and digital products is of high quality. As a successful Data Engineer:

  • You’ll work closely with internal BHP customers to understand their data requirements, develop and model data structures, and design and build the ingestion processes that provide access to data from operational and enterprise source systems
  • You will be involved in the design and development of data integration and data pipelines (ETL)
  • You will work with various on-prem and cloud-based data platforms, technologies and services
  • You’ll plan and deliver secure, best-practice data integration strategies and approaches
  • You’ll work closely with database teams on topics such as data requirements, cleanliness and quality

About You


Ideally the successful candidate will have all of the following skills, but we have prioritised the experience into [M]ust Have, [S]hould Have and [C]ould Have.
 

  • Data Warehousing / Data Governance experience, including:
    • Experience with multiple database technologies such as Distributed Processing, Traditional RDBMS, MPP and NoSQL [M]
    • Strong experience with traditional data warehousing / ETL tools (Informatica, Talend, Pentaho, DataStage) [M]
    • Demonstrated experience working across structured, semi-structured, and unstructured data [S]
    • Experience with data processing systems such as Hadoop, Spark, Storm, Impala, etc. [S]
    • Strong understanding of traditional ETL tools and RDBMS, and of end-to-end data pipelines [M]
    • Knowledge of Data Governance and strong understanding of data lineage and data quality [S]
    • Experience with dbt for analytics, modelling and quality checking [S]
       
  • AWS Cloud experience, including:
    • Experience with AWS cloud services: S3, EC2, EMR, RDS, Redshift and Kinesis [M]
    • Experience with multiple database technologies such as Distributed Processing (Spark, Hadoop, EMR), Traditional RDBMS (MS SQL Server, Oracle, MySQL, PostgreSQL), MPP (AWS Redshift, Teradata) and NoSQL (MongoDB, DynamoDB, Cassandra, Neo4j, Titan) [S]
    • Experience in designing and building streaming data ingestion, analysis and processing pipelines using Kafka, Kafka Streams, Spark Streaming and similar cloud-native technologies (see the streaming sketch after this list) [S]
    • Ideally have experience with Infrastructure as Code using Terraform [C]

  • Back-End Software Development, including:
    • Strong experience with Python and at least two of the following technologies: Scala, SQL, Java [M]
    • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (see the DAG sketch after this list) [S]
    • Experience deploying applications into production environments e.g. code packaging, integration testing, monitoring, release management. [M]
    • Experience with source control tools, along with branching and merging concepts; ideally experience with GitLab for source control and CI/CD [S]
    • Experience in software engineering best practices such as code reviews, testing frameworks, maintainability and readability [S]
    • Ideally have experience with making data available for consumption (i.e. APIs, Event based publish / subscribe, Data Mart provisioning) [S]
    • Ideally have experience with MuleSoft, Solace and StreamSets [C]
       
  • General
    • Bachelor’s degree required; Computer Science, MIS, or Engineering preferred [S]
    • Demonstrated experience working in data engineering or architecture role [S]
    • Experience working in DevOps, Agile, Scrum, Continuous Delivery and/or Rapid Application Development environments [M]
    • Attitude to thrive in a fun, fast-paced, start-up-like environment [M]
    • Experience working on a collaborative Agile product team [M]
    • Self-motivated with strong problem-solving and learning skills [M]
    • Flexibility to changes in work direction as the project develops [M]
    • Excellent communication, listening, and influencing skills [M]
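
As referenced in the AWS section above, here is a minimal streaming-consumption sketch using the kafka-python client; the topic name, broker address and message shape are assumptions for illustration only.

```python
# Minimal Kafka streaming-consumption sketch using kafka-python.
# Topic, broker and message schema are hypothetical.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "sensor-readings",                   # hypothetical topic
    bootstrap_servers="localhost:9092",  # hypothetical broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

# Consume indefinitely; a real pipeline would validate, enrich and forward
# each record to a sink such as S3 or Redshift rather than printing it.
for message in consumer:
    print(message.value)
```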
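
And, as referenced in the back-end section, a minimal Apache Airflow DAG sketch (assuming Airflow 2.4+); the DAG id, task names and schedule are hypothetical.

```python
# Minimal Airflow DAG sketch: a daily two-step pipeline where load runs
# after extract. Names and schedule are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from a source system")


def load():
    print("write cleaned data to the warehouse")


with DAG(
    dag_id="example_daily_ingestion",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # extract must succeed before load runs
```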

Applications close on Friday, 21 Feb 2025.

 

About Our Process

 

At BHP, we are committed to employing individuals who align with the BHP Charter Values and meet the requirements of the role. As part of the recruitment process, a number of checks may be conducted to demonstrate an applicant's suitability for a role, including police / criminal background checks, medical, drug and alcohol testing, due diligence checks, right to work checks, and/or reference checks.

If you are already employed directly by BHP, please log in using your BHP email address or apply via our internal jobs portal. 

 

Supporting a Diverse Workforce
 

The size, stability and magnitude of our business not only provides significant opportunity for professional development, but also attractive salary packages with performance-based bonuses and a best-in-class employee share program. We know there are many aspects of our employees' lives that are important, and work is only one of these, so we offer benefits to enable your work to fit with your life. These benefits include flexible working options, a generous paid parental leave policy, other extended leave entitlements and parent rooms.  

At BHP, we know that we are strengthened by diversity. We are an Equal Opportunity employer committed to making BHP a safe and inclusive workplace where everyone can thrive and be at their best every day. We are focused on creating a workforce that’s more diverse and represents the communities where we work and live, providing a work environment in which everyone is included, treated fairly and with respect. We recognise that true diversity includes gender, age, race, disability status, sexual orientation, religion, neurodiversity, education levels, and many more aspects of your identity.

BHP is committed to providing a recruitment process that is fair, equitable and accessible for all. If you have a disability, we know that it may be helpful for us to adjust our process to make it equitable for your individual situation. If you would like to reach out to someone about your situation and our recruitment process, please email us at inclusion@bhp.com.

