Data Engineer - Fraud Team
Bucharest, Romania
Deutsche Bank
Discover Deutsche Bank, one of the world’s leading financial services providers.
Job Description:
DB Global Technology is Deutsche Bank’s technology center in Central and Eastern Europe. Opened in January 2014, the Bucharest office is constantly expanding.
The team is made up of enthusiastic professionals who work in an international environment, learning new technologies as part of Deutsche Bank’s businesses.
Changing the Bank is a challenging endeavor that we tackle every day, and we enjoy our success when our efforts fundamentally change how Deutsche Bank works.
Responsibilities:
As a Data Engineer within the Surveillance analytics space, you will be responsible for the design and implementation of data engineering pipelines for use cases defined by both internal and external stakeholders. You will work closely with Data Architects, Data Analysts, Data Scientists and Machine Learning Engineers to understand their needs from both a technical and a data governance perspective.
The pipelines are developed in the Google Cloud ecosystem using Apache Airflow (Cloud Composer), Apache Beam (Dataflow), or a combination of both, depending on the use case. A standard pipeline may include batch data migration to the cloud, data transformations of varying complexity, data enrichment, and data serving for use cases such as data analysis or machine learning training.
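For illustration, a pipeline of this shape might look like the following minimal Cloud Composer (Airflow) DAG sketch. The bucket, project, dataset and table names are hypothetical placeholders introduced here for the example, not actual resources from this role:

```python
# A minimal sketch of a batch migration + enrichment pipeline, assuming
# hypothetical GCS/BigQuery resources (example-landing-bucket, example_ds, ...).
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="fraud_batch_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # Airflow 2.x style scheduling
    catchup=False,
) as dag:
    # Batch data migration: land the daily raw extract in a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="example-landing-bucket",  # hypothetical bucket
        source_objects=["payments/{{ ds }}/*.csv"],
        destination_project_dataset_table="example_ds.payments_raw",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
    )

    # Transformation/enrichment: join reference data and serve a curated
    # table for downstream analysis or ML training.
    enrich = BigQueryInsertJobOperator(
        task_id="enrich",
        configuration={
            "query": {
                "query": """
                    SELECT p.*, c.risk_segment
                    FROM example_ds.payments_raw AS p
                    JOIN example_ds.customers AS c USING (customer_id)
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "example-project",  # hypothetical project
                    "datasetId": "example_ds",
                    "tableId": "payments_enriched",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )

    load_raw >> enrich
```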
In addition,
- Adopt an automation-first approach to testing, deployment, security and compliance of solutions through Infrastructure as Code and automated policy enforcement (see the test sketch after this list).
- As part of a development team, collaborate with other team members to understand requirements, analyse and refine stories, design solutions, implement them, test them, and support them in production.
- Provide Level 3 support for technical components and contribute to problem and root cause analysis.
- Collaborate with Functional Analysts and Technical Specialists to complete work.
- Ensure that the Bank’s SDLC controls are always adhered to.
- Participate in agile ceremonies and contribute to backlog refinement and planning sessions.
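As a sketch of what the automation-first testing above could mean in practice, a CI job (for example via GitHub Actions) might run DAG integrity tests like these. The folder layout and DAG id refer to the hypothetical example earlier, not a real repository:

```python
# A minimal sketch of automated pipeline testing, assuming DAG files
# live under dags/ and the hypothetical fraud_batch_pipeline DAG above.
from airflow.models import DagBag


def test_dags_load_without_errors():
    # Fail fast if any DAG file has import or parse errors.
    dag_bag = DagBag(dag_folder="dags/", include_examples=False)
    assert dag_bag.import_errors == {}, dag_bag.import_errors


def test_fraud_batch_pipeline_structure():
    # Guard the expected task structure against accidental changes.
    dag_bag = DagBag(dag_folder="dags/", include_examples=False)
    dag = dag_bag.get_dag("fraud_batch_pipeline")
    assert dag is not None
    assert {"load_raw", "enrich"} <= {t.task_id for t in dag.tasks}
```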
Skills:
- [Must Haves] Python, SQL
- [Strong Plus] Apache Airflow (Cloud Composer), Apache Beam (Dataflow), BigQuery, Cloud Storage
- [Support Techs] Git, GitHub Actions, Terraform
- Engineering qualification with strong hands-on experience of working on enterprise-level big data platforms and solutions.
- Candidates with experience in the payments domain will be preferred.
Well-being & Benefits
- 24 days’ holiday + loyalty days + bank holidays (weekdays off in lieu of bank holidays that fall on weekends);
- Flexible working hours and working from home;
- Private healthcare and life insurance;
- A culture of continuous learning with coaching and support from experts in your team.
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.
Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.
We welcome applications from all people and promote a positive, fair and inclusive work environment.