Senior Data Engineer - Flutter UK&I, Hybrid (Fixed-term - 12 months)
Cluj-Napoca, Romania
Betfair
We are the largest technology hub of Flutter Entertainment Plc, an FTSE 100 company, with over 1,900 people powering the world’s leading betting and gaming brands. Betfair Romania Development is Paddy Power Betfair’s Software Development Center in Cluj.
We offer career opportunities in Java, Web, Data Warehouse, QA, Linux, Application & Network Security, Business Analysis, Project Management, Product, User Experience, Agile and SCRUM, in multiple project streams: Gaming, Platform Development, Data Services, Sports Operations, Marketing.
We’re expanding the Data team and we’re looking for people passionate about data, with attention to accuracy and latency, and interests in Machine Learning, predictive analytics, Kafka-based stream consumption and Cloud architecture.
The Role
We are looking for a Senior Data Engineer with proven data integration skills to develop and support Integration, ETL and Data Warehouse solutions, as well as data-driven services such as personalisation or customer profiles, offered as APIs and streams to other business units.
You will have the opportunity to work in an agile environment where you will team up with colleagues having experience in Data Warehouse, Machine Learning and more.
The successful candidate should have excellent technical and problem-solving skills, a positive and results-driven attitude, and the communication skills to interact with both technical and non-technical people.
Key Responsibilities / Duties:
- Be an effective team player of a Scrum Team - understanding and contributing to the agile delivery process and taking ownership of your team’s processes.
- Create and promote the use of behaviour/test-driven development at multiple levels within the software by pairing with production code developers and product owners.
- Deliver excellent production-quality code, and encourage and mentor other developers to do so.
- Be proactive in identifying holistic approaches across diverse programs, projects and technologies.
- Promote the production of reusable code and modules where appropriate, with a view to reducing duplicated effort and development costs, applying principles of Agile delivery.
- Be proactive in identifying opportunities to improve and rationalise applications and processes across the whole Data team, working in close collaboration with other Data team members and subject matter experts.
- Liaise closely with other Data team members and subject matter experts to ensure that accurate and effective support documentation is maintained to reflect code development and changes.
- Ensure risks and issues are identified in a timely manner and effectively communicated, with proposed resolution and mitigation strategies, to the Data Delivery Manager.
- Be proactive in building effective relationships across Data and key areas of the Paddy Power Betfair business stakeholder community.
- Write and maintain functional and technical specifications.
- Monitor, optimise and troubleshoot database, microservice, stream and ML service performance.
- Analyse code for problem resolution.
- Carry out thorough, demonstrable unit testing.
Experience & Qualifications:
Essential
- Excellent SQL, preferably on both RDBMS and an MPP platform like Redshift or another big data platform like Hadoop
- Dimensional data modelling
- Demonstrable experience with high-volume data loads (terabytes and above)
- Knowledge of ETL from highly transactional (thousands of records per second) OLTP systems
- Demonstrable experience of object-oriented programming in Python
- A proven ability to influence technical decisions in a fast-moving commercial environment
- Excellent people skills, exceptional communication and interpersonal skills, and consistently high energy levels
- Proven development skills in at least one and preferably two or more of the following:
- Microservices
- Kafka or another message based streaming platform
- Machine learning and analytics in either Python or SageMaker
- Open source NoSQL technologies (e.g. MongoDB, CouchDB, ElasticSearch).
- ETL Tools such as Talend or Airflow
- AWS (preferable) or experience of data engineering on another leading cloud vendor such as GCP or Azure
- Experience of database performance analysis and design
- Unit testing knowledge
- Exposure to Continuous Integration / Continuous Delivery tools (e.g. Go, Jenkins)
Desirable:
- Agile
- Experience of large data warehouses (10 TB+) with multiple sources and outputs
- Knowledge of the online gaming/gambling industry
- Educated to degree level in a science or technology related field
Key Skills and Attributes:
- Proactive work ethic with the ability to deliver results and meet challenging deadlines
- Passion and flexibility to work the hours required to see projects through to completion in a timely, accurate and efficient manner.
- Self-motivated.
- Attention to detail with a high degree of pride in work produced.
- Proven ability & desire to innovate.
- Strong analytical skills.
- Enthusiasm for the software development process.
- Good English language skills.
What you can expect:
- 25 days of annual leave
- ShareSave scheme and “Flexible Benefits” of your choice
- Private health insurance (includes dental insurance and health assessments)
- Excellent development opportunities, including thousands of online courses through Udemy
- Working from home options
We thank all applicants for their interest; however, only suitable candidates will be contacted for an interview. By submitting your application online, you agree that your details will be used to progress your application for employment.
If your application is successful, your details will be used to administer your personnel record. If your application is unsuccessful, we will retain your details for a period no longer than three years, in order to consider you for prospective roles within our company.