Data Engineer
Serbia
Gambling.com Group
Gambling.com expertly reviews and compares gambling services to identify the best regulated operators and products available online. Gambling.com Group (Nasdaq: GAMB) is a multi-award-winning provider of marketing and sports data services looking for exceptional talent interested in the fast-paced, high-growth online gambling industry. The company operates a portfolio of renowned websites and brands – including flagship site Gambling.com and sports betting site Bookies.com, as well as iGaming-focused sites Casinos.com, BonusFinder.com and the UK-centric Freebets.com. In the U.S., the company operates state-specific websites such as NewYorkBets.com, BetCarolina.com, and BetArizona.com, helping consumers discover and connect with legal gambling options.
In addition to its marketing operations, Gambling.com Group provides sports data services through consumer subscription platforms like OddsJam and RotoWire, along with B2B services through OpticOdds. These offerings deliver real-time data, actionable insights, and technology-driven tools to both consumers and enterprise partners.
As the first and only online gambling affiliate publicly traded in the U.S., Gambling.com Group has earned recognition as a leader in its field – most recently winning Casino Affiliate of the Year at the 2024 EGR Operator Awards. Gambling.com Group embraces a majority remote-first hybrid work model, offering employees the flexibility to work remotely while being part of a global team. For some roles, in-office presence is required for operational reasons, ensuring seamless collaboration and effectiveness where needed.
With flagship offices in Dublin, Charlotte, and Malta, and satellite offices in Madison, Helsinki, Serbia, and Costa Rica, we operate on a "think local, act global" mindset. Gambling.com Group is on the lookout for innovative, solution-driven, and fast-paced thinkers to join our growing team.
We have an opening for a Data Engineer with a focus on providing accurate and consistent data for analysis. This person will have the opportunity to work with bespoke tools designed for capturing data and processing it into a state-of-the-art Data Warehouse, primarily using Python and AWS technologies.
What you will do:
- Mentor and guide the technical development of junior team members
- Design, develop, maintain, and document Python-based data processing and ETL jobs, ensuring data is complete and accurate and that failures are minimised and resolved efficiently.
- Develop and maintain a state-of-the-art Data Warehouse
- Adhere to coding standards, participate in code reviews, and assist with controlled releases as part of best practice workflows.
- Ensure data pipelines are optimised to minimise costs, whilst remaining efficient enough to meet the reporting needs of the business.
- Ensure that requirements are captured effectively and efficiently
- Utilise leading-edge technologies to ensure pipelines are using the best and most appropriate tools to achieve company goals.
- Communicate results of technical projects to both technical and non-technical users.
- Work with a globally distributed team, be comfortable using Slack/Zoom for meetings and communication, and organise development tasks remotely.
What you need:
- A Master’s Degree in Engineering, Computer Science or related field, or equivalent work experience.
- 5+ years of commercial development experience as a Data Engineer or in the Data Warehousing domain.
- Large-scale delivery and data management experience
- Experience building and maintaining data processing pipelines and ETL/ELT processes
- In-depth knowledge of data warehousing methodologies.
- Comfortable writing complex SQL.
- Experience working with a major relational database or cloud data warehouse such as Snowflake, Oracle, Teradata, Redshift, etc.
- Significant experience working with Python
- Experience with orchestration tools like Airflow, Dagster, etc.
- Experience working effectively in a distributed team environment.
- Clear ability to work in teams, contribute to technical decisions and learn/suggest/assess new technologies.
- Ability to be a creative problem solver with a solution-focused attitude.
- Comfortable with Git
- Experience working within an AWS architecture
- Possess a strong sense of ownership, urgency, and drive.
- A habit of documenting all code and features for maintainability.
- Excellent written and oral communication skills in English.
Perks/benefits: Startup environment