Senior Data Engineer

AME (Amsterdam - Maple), Netherlands


The Mission 

 

The ING Pricing Architecture (IPA) platform consists of multiple components that enable real-time and batch calculation of financial risk metrics and simulations (e.g. XVA, PFE, Value at Risk, Expected Shortfall, Bilateral Margining, pre-deal derivatives pricing), driven by new banking regulations as well as advanced risk analysis on the derivative product portfolio of Financial Markets (interest rates, credits, foreign exchange). The platform sits at the heart of the IT landscape for our global dealing rooms and risk managers in Asia, Europe and the Americas, with 7-9 scrum teams spread across 4 locations (Amsterdam, Brussels, Bucharest and Singapore) cooperating to evolve it towards the target ING Financial Markets IT vision. 

The IPA DARE Purpose 

We provide a fast, easy, secure and reliable Data, Analysis and Reporting Environment (DARE), based largely on GCP technology in the public cloud. The purpose of this asset is to provide a single system for FO (Front Office) and Risk Management business users, covering both intraday and end-of-day pricing, risk management and position management. 

 

Your day to day 

Whether you start your day from the comfort of your home or over a morning coffee in our office cafeteria, your tasks will look much the same. Here are your daily responsibilities: 

  • As a Senior Data Engineer, you contribute to and are responsible for the design, development, test automation, hardening (security, stability, deployment) and documentation of components and data pipelines for the extraction and provision of structured and unstructured data within a cloud infrastructure or between cloud and on-premises systems (see the pipeline sketch after this list). 

  • You contribute to technology choices and architecture of newly built components and services 

  • You have a strong focus on performance, large data sets and event-driven architecture 

  • You assess the efficiency of end-to-end processes by identifying and mitigating risks 

  • You continuously improve and optimize the ingestion, post-processing and reporting layer of the data platform 

  • You standardize and reuse automation patterns as part of the data ingestion and processing stages 

  • You design and develop automated unit, integration and regression tests 

  • You demonstrate strong analytical skills, proactivity and the ability to work in cross-border, international teams. 

  • You work in an Agile environment, collaborate closely with the Product Owner and Customer Journey Expert to interpret business requirements and estimate, plan, implement, test and deliver as part of short delivery cycles. 

  • You focus on delivering working software of high quality that satisfies customer requirements. 
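
To give a flavour of the pipeline work described above, here is a minimal sketch of a batch ingestion pipeline built with Apache Beam (runnable on Dataflow), reading trade records from Google Cloud Storage and writing them to BigQuery. All project, bucket, dataset and field names are illustrative placeholders, not ING's actual environment.

    # Illustrative only: a hypothetical GCS-to-BigQuery batch pipeline.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_trade(line: str) -> dict:
        # One CSV line -> one trade record (hypothetical schema).
        trade_id, desk, notional = line.split(",")
        return {"trade_id": trade_id, "desk": desk, "notional": float(notional)}

    def run() -> None:
        # Pass --runner=DataflowRunner, --project, --region, --temp_location etc. on the command line.
        options = PipelineOptions()
        with beam.Pipeline(options=options) as p:
            (
                p
                | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/trades/*.csv")
                | "ParseTrades" >> beam.Map(parse_trade)
                | "KeepPositiveNotionals" >> beam.Filter(lambda t: t["notional"] > 0)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:risk_dataset.trades",
                    schema="trade_id:STRING,desk:STRING,notional:FLOAT",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                )
            )

    if __name__ == "__main__":
        run()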

 

 

What you’ll bring to the team 

 

Experience: 5+ years of engineering experience. 

Tech stack/ knowledge: 

Mandatory: 

  • Proficient in relational databases and SQL. 

  • Proven big data experience, either from an implementation or a data science perspective. 

  • Strong coding experience in languages such as Java, Scala or Python. 

  • Experience with the GCP tech stack (BigQuery, Dataflow, Cloud Composer, Google Cloud Storage) or the equivalents from other public cloud providers such as AWS and Azure (see the query sketch after this list) 

  • Experience with Git, Maven, CI/CD (e.g. Azure DevOps) and documentation tools 

  • Experience with Infrastructure as Code (e.g. Terraform) 

  • Experience or affinity with IT Security concepts 
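
As an indication of the level of SQL and BigQuery familiarity expected, here is a minimal sketch using the official google-cloud-bigquery Python client to run an aggregation query; the project, dataset and table names are placeholders and reuse the illustrative schema from the pipeline sketch above.

    # Illustrative only: run an aggregation query against a hypothetical trades table.
    from google.cloud import bigquery

    def total_notional_per_desk(project_id: str) -> None:
        client = bigquery.Client(project=project_id)
        query = """
            SELECT desk, SUM(notional) AS total_notional
            FROM `example-project.risk_dataset.trades`
            GROUP BY desk
            ORDER BY total_notional DESC
        """
        # client.query() submits the job; .result() waits for it and returns the rows.
        for row in client.query(query).result():
            print(f"{row.desk}: {row.total_notional:,.2f}")

    if __name__ == "__main__":
        total_notional_per_desk("example-project")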

Nice to have: 

  • DBT 

  • Looker 

  • Data governance, Data lineage, Data Catalogue (Metadata) 

  • Apache Beam 

  • Spark 

  • Flink 

  • Kafka 

  • Automated testing framework 
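
The "Automated testing framework" item above ties back to the test-automation responsibility earlier. As a minimal sketch (assuming pytest and Apache Beam's bundled testing utilities), a unit test for the illustrative parse_trade transform from the pipeline sketch could look like this:

    # Illustrative only: unit-testing a Beam transform with pytest.
    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    def parse_trade(line: str) -> dict:
        trade_id, desk, notional = line.split(",")
        return {"trade_id": trade_id, "desk": desk, "notional": float(notional)}

    def test_parse_trade_transform():
        # The assertion is registered before the pipeline runs on exiting the with-block.
        with TestPipeline() as p:
            output = p | beam.Create(["T1,RATES,1000000.0"]) | beam.Map(parse_trade)
            assert_that(
                output,
                equal_to([{"trade_id": "T1", "desk": "RATES", "notional": 1000000.0}]),
            )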

Foreign languages: English (advanced) 

Education: A University degree in Computer Science, Software Engineering, Financial Engineering or equivalent. 

 

What’s in it for you 

  • A salary tailored to your qualities and experience. 

  • 24-27 vacation days depending on the contract. 

  • Pension scheme. 

  • 13th-month salary. 

  • Individual Savings Contribution (BIS), 3.5% of your gross annual salary. 

  • 8% Holiday payment. 

  • Hybrid working to blend home working for focus and office working for collaboration and co-creation. 

  • Growth opportunities 

  • Defining a clear career path for the short, mid and long term, and identifying the competencies you need to build or develop to reach the next level: vertically, towards a managerial position, or horizontally, towards an expert or architect level, locally or globally 

  • Internal mobility is encouraged 

  • Possibility to access International Short-Term Assignments or Long-Term Assignments  

  • Upskilling/reskilling programs 

  • Learning & Development opportunities 

  • Annual training & certifications budget 

  • Udemy, CloudSkillBoost & e-learning platforms 

  • CSR activities: tree planting, plastic cleaning, helping the elderly, coding lessons for teenagers etc. 

 

