Data Engineer

Melbourne

Easygo

At Easygo, we pride ourselves on building smart, industry-leading entertainment products online. Learn more about our games, apps & career opportunities.

Are you a passionate and ambitious data engineer ready to dive into an environment that fosters innovation, continuous learning, and professional growth? We're seeking talented individuals who are eager to tackle complex big data problems, build scalable solutions, and collaborate with some of the finest engineers in the entertainment industry.

  • Complex Projects, Creative Solutions: Dive into intricate projects that challenge and push boundaries. Solve complex technical puzzles and craft scalable solutions.
  • Accelerate Your Growth: Access mentorship, training, and hands-on experiences to level up your skills. Learn from industry experts and gain expertise in scaling software.
  • Collaborate with Industry Leaders: Work alongside exceptional engineers, exchanging ideas and driving innovation forward through collaboration.
  • Caring Culture, Career Development: We deeply care about your career. Our culture prioritizes your growth with tailored learning programs and mentorship.
  • Embrace Challenges, Celebrate Success: Take on challenges, learn from failures, and celebrate achievements together.
  • Shape the Future: Your contributions will shape the future of entertainment.

About the team

You’ll be joining the Data Engineering team on an exciting mission to build a top-tier data platform that powers everything we do. We manage data from our in-house products like Stake and Kick, as well as third-party tools and services, centralising it and ensuring it’s reliable, scalable, and ready to drive smarter decisions across the business.

We’re focused on advanced data tools and services that truly make a difference. Our goal is to help teams unlock the full potential of data, empowering them to create impactful outcomes for our customers and the business. If you’re someone who loves tackling big challenges and shaping the future of data, you’ll fit right in!

Key Responsibilities:

  • Design, develop, and maintain scalable ETL pipelines using AWS Glue and orchestrate workflows with Airflow to extract, transform, and load data from various sources (e.g., databases, APIs, flat files, streaming services) into the data lake, following medallion architecture principles.
  • Build and implement secure and efficient data systems using AWS services and Terraform, ensuring performance and compliance.
  • Collaborate with cross-functional teams to transform data from the gold layer in the data lake to Redshift using dbt, enabling high-quality analytics and machine learning insights.
  • Monitor and optimise data pipelines for performance, scalability, and cost-efficiency, ensuring observability through monitoring and alerting systems.
  • Document end-to-end processes, including ingestion, transformation, storage and governance, to support knowledge sharing and scalability.
  • Implement data governance practices such as data lineage, classification, access control, and compliance with GDPR and other regulatory requirements.
  • Build and optimise real-time data pipelines using PySpark, Glue Spark, and Kinesis, focusing on Change Data Capture (CDC) for seamless operations and reliability.
  • Ensure pipelines are thoroughly tested and optimised, with comprehensive monitoring and alerting systems for reliability and performance.
  • Participate in peer code reviews to ensure adherence to best practices, coding standards, and high-quality development. 
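The real-time responsibilities above centre on Change Data Capture (CDC): replaying an ordered stream of insert/update/delete events onto a target table. A minimal, library-free Python sketch of that apply step is below; the event shape and field names (`op`, `key`, `data`) are illustrative assumptions, not Easygo's actual schema, and a production pipeline would do this with PySpark or Glue Spark rather than plain dicts.

```python
def apply_cdc_events(snapshot, events):
    """Apply ordered CDC events (insert/update/delete) to a snapshot dict.

    `snapshot` maps primary key -> row; `events` is an ordered list of
    change records. Field names here are illustrative assumptions.
    """
    for event in events:
        op, key = event["op"], event["key"]
        if op in ("insert", "update"):
            snapshot[key] = event["data"]   # upsert the new row image
        elif op == "delete":
            snapshot.pop(key, None)         # tolerate already-deleted keys
        else:
            raise ValueError(f"unknown CDC operation: {op!r}")
    return snapshot

# Example: replay a small change stream against a one-row table
table = {1: {"email": "a@example.com"}}
changes = [
    {"op": "update", "key": 1, "data": {"email": "b@example.com"}},
    {"op": "insert", "key": 2, "data": {"email": "c@example.com"}},
    {"op": "delete", "key": 1},
]
result = apply_cdc_events(table, changes)
# result == {2: {"email": "c@example.com"}}
```

The same idea scales to the Kinesis/Spark setting: events must be applied in commit order per key, and deletes must be idempotent so a replayed stream converges to the same snapshot.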

Minimum Qualifications:

  • A Bachelor’s degree in Computer Science, Software Engineering, Information Systems, or a related field, or equivalent practical experience.
  • 3–6 years of experience in data engineering, with a focus on ETL development, data modelling, database management, and real-time data pipelines.
  • Proficiency in SQL, Python, or PySpark, with hands-on experience using cloud services such as Glue, Redshift, Kinesis, Lambda, S3, and DMS.
  • Experience with orchestration tools (e.g., Apache Airflow), version control systems (e.g., GitHub), and big data technologies such as Spark or Hadoop.
  • Experience designing and implementing modern cloud-based data platforms, preferably on AWS, using Infrastructure as Code (IaC) tools like Terraform.
  • Knowledge of data governance and compliance standards.
  • Strong problem-solving, analytical, and communication skills for engaging with cross-functional teams.

Preferred Qualifications:

  • Experience with DataOps principles, CI/CD pipelines, and agile development methodologies.
  • Knowledge of machine learning concepts and their application in data engineering.
  • AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified Data Analytics) or similar cloud certifications.

Some of the perks of joining us:

  • Champion engineering excellence and drive data-driven impact across global-scale software products.
  • Work alongside the top 5% of engineering talent in Australia, using a vast AWS cloud-native and big data technology stack.
  • Exposure to building global, large-scale data pipelines, data warehouses, and data lakes that serve thousands of requests per second.
  • EAP access for you and your family
  • Access to over 9,000 courses across our Learning and Development Platform 
  • Lucrative Annual Bonuses
  • Paid volunteer day
  • Two full-time baristas who will make your daily coffee, tea, or fresh juice!
  • Daily catered breakfast
  • On-site masseuse on Wednesdays
  • Team lunches and happy hour in the office from 4pm on Fridays
  • Fun office environment with pool tables, table tennis and all your favourite gaming consoles
  • Help-yourself drink fridges and snack shelves

We believe that the unique contributions of everyone at Easygo are the driver of our success. To make sure that our products and culture continue to incorporate everyone's perspectives and experience we never discriminate on the basis of race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. We are passionate about providing a workplace that encourages great participation and an equal playing field, where merit and accomplishment are the only criteria for success.
