Senior Data Engineer
Melbourne
Easygo
At Easygo, we pride ourselves on building smart, industry-leading entertainment products online.
Are you a passionate and ambitious data engineer ready to dive into an environment that fosters innovation, continuous learning, and professional growth? We're seeking talented individuals who are eager to tackle complex big data problems, build scalable solutions, and collaborate with some of the finest engineers in the entertainment industry.
- Complex Projects, Creative Solutions: Dive into intricate projects that challenge and push boundaries. Solve complex technical puzzles and craft scalable solutions.
- Accelerate Your Growth: Access mentorship, training, and hands-on experiences to level up your skills. Learn from industry experts and gain expertise in scaling software.
- Collaborate with Industry Leaders: Work alongside exceptional engineers, exchanging ideas and driving innovation forward through collaboration.
- Caring Culture, Career Development: We deeply care about your career. Our culture prioritizes your growth with tailored learning programs and mentorship.
- Embrace Challenges, Celebrate Success: Take on challenges, learn from failures, and celebrate achievements together.
- Shape the Future: Your contributions will shape the future of entertainment.
About the team
You’ll be joining the Data Engineering team on an exciting mission to build a top-tier data platform that powers everything we do. We manage data from our in-house products like Stake and Kick, as well as third-party tools and services, centralising it and ensuring it’s reliable, scalable, and ready to drive smarter decisions across the business.
We’re focused on advanced data tools and services that truly make a difference. Our goal is to help teams unlock the full potential of data, empowering them to create impactful outcomes for our customers and the business. If you’re someone who loves tackling big challenges and shaping the future of data, you’ll fit right in!
Key Responsibilities:
- Architect, design, and optimise scalable ETL pipelines using AWS Glue, orchestrating complex workflows with Airflow and enabling efficient data processing from multiple sources (a minimal illustrative sketch follows this list).
- Lead the development of secure, scalable, and cost-efficient data infrastructure using AWS services and Terraform, ensuring best practices in infrastructure automation.
- Collaborate with stakeholders to define and drive strategic data initiatives, including data lakehouse architecture, real-time data processing, and analytics enablement.
- Own and enhance observability practices, implementing robust monitoring, alerting, and logging systems to minimise downtime and optimise performance.
- Develop and enforce automated data governance processes, including data lineage tracking, classification, access control, masking, encryption, and policy enforcement, to protect sensitive data and ensure compliance with GDPR and other regulatory standards, especially in regulated markets.
- Architect and optimise data pipelines for Change Data Capture (CDC), reconciliation workflows, and reliability enhancements.
- Drive improvements in engineering best practices, including code quality, system design, documentation, and operational efficiency, to foster high-quality outcomes with measurable business value.
- Identify opportunities and implement innovative solutions to enhance data platform capabilities, proactively resolving performance bottlenecks and improving data reliability.
- Perform advanced code reviews, mentor engineers, and foster a collaborative, high-performance engineering culture.
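To give a flavour of the Glue-and-Airflow orchestration described above, here is a minimal, hypothetical sketch of an Airflow DAG that triggers an AWS Glue job. The DAG ID, job name, schedule, and script arguments are illustrative assumptions, not a description of Easygo's actual pipelines; it assumes Airflow 2.4+ with the Amazon provider package installed.

```python
# Illustrative sketch only: a minimal Airflow DAG that triggers an AWS Glue ETL job.
# All names, schedules, and arguments below are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="example_glue_etl",             # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                     # Airflow 2.4+ scheduling API
    catchup=False,
) as dag:
    run_etl = GlueJobOperator(
        task_id="run_glue_etl_job",
        job_name="example-etl-job",        # assumes a Glue job with this name already exists
        script_args={"--source": "s3://example-bucket/raw/"},  # hypothetical job argument
        wait_for_completion=True,          # block until the Glue run finishes
    )
```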
Some of the perks of joining us:
- Champion engineering excellence and drive data-driven impact across global-scale software products.
- Work alongside the top 5% of engineering talent in Australia, using a broad AWS cloud-native and big data technology stack.
- Exposure to building global, high-volume data pipelines, data warehouses, and data lakes that handle thousands of requests per second.
- Employee Assistance Program (EAP) access for you and your family
- Access to over 9,000 courses across our Learning and Development Platform
- Lucrative annual bonuses
- Paid volunteer day
- Two full-time baristas who will make your daily coffee, tea or fresh juice!
- Daily catered breakfast
- On-site masseuse on Wednesdays
- Team lunches and happy hour in the office from 4pm on Fridays
- Fun office environment with pool tables, table tennis and all your favourite gaming consoles
- Help-yourself drinks fridges and snack shelves
We believe that the unique contributions of everyone at Easygo are the driver of our success. To make sure that our products and culture continue to incorporate everyone's perspectives and experience, we never discriminate on the basis of race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. We are passionate about providing a workplace that encourages participation and a level playing field, where merit and accomplishment are the only criteria for success.