Senior Data Engineer
Warszawa (Zajęcza 4), Poland
Tech stack / knowledge:
- Mandatory:
- Proficient in relational databases and SQL.
- Proven big data experience, either from an implementation or a data science perspective.
- Strong coding experience in languages such as Java, Scala or Python.
- Experience with the GCP tech stack (BigQuery, Dataflow, Cloud Composer, Google Cloud Storage) or equivalents from other public cloud providers (AWS, Azure).
- Experience with Git, Maven, CI/CD (e.g. Azure DevOps) and documentation tools.
- Experience with Infrastructure as Code (e.g. Terraform).
- Experience with, or an affinity for, IT Security concepts.
- English at B2 level.
- Nice to have:
- dbt
- Looker
- Data governance, Data lineage, Data Catalogue (Metadata)
- Apache Beam
- Spark
- Flink
- Kafka
- Automated testing frameworks
Here are your daily responsibilities:
- As a Senior Data Engineer, you contribute to and are responsible for the design, development, test automation, hardening (security, stability, deployment) and documentation of components and data pipelines that extract and provide structured and unstructured data within a cloud infrastructure or between cloud and on-premises systems.
- You contribute to technology choices and the architecture of newly built components and services.
- You have a strong focus on performance, large data sets and event-driven architecture.
- You assess the efficiency of end-to-end processes by identifying and mitigating risks.
- You continuously improve and optimize the ingestion, post-processing and reporting layer of the data platform.
- You standardize and reuse automation patterns across the data ingestion and processing stages.
- You design and develop automated unit, integration and regression tests.
- You demonstrate strong analytical skills, proactivity and the ability to work in cross-border, international teams.
- You work in an Agile environment, collaborating closely with the Product Owner and Customer Journey Expert to interpret business requirements and to estimate, plan, implement, test and deliver in short delivery cycles.
- You focus on delivering working software of high quality that satisfies customer requirements.
The Mission
The ING Pricing Architecture (IPA) platform consists of multiple components that enable real-time and batch calculations of financial risk metrics and simulations (e.g. XVA, PFE, Value at Risk, Expected Shortfall, Bilateral Margining, pre-deal derivatives pricing), driven by new banking regulations as well as advanced risk analysis on the derivative product portfolio of Financial Markets (interest rates, credit, foreign exchange). The platform sits at the heart of the IT landscape for our global dealing rooms and risk managers in Asia, Europe and the Americas, with 7-9 scrum teams spread across 4 locations (Amsterdam, Brussels, Bucharest and Singapore) cooperating to evolve it towards the target ING Financial Markets IT vision.
The IPA DARE Purpose
We provide a fast, easy, secure and reliable Data, Analysis and Reporting Environment (DARE), based largely on GCP technology in the public cloud. The purpose of this asset is to provide a single system for Front Office (FO) and Risk Management business users, used for both intraday and end-of-day pricing, risk management and position management.