Data Engineer

Bangalore Office, India

Fidelity International

Fidelity International offers investment solutions and retirement expertise to institutions, individuals and their advisers around the world.

About the Opportunity

Job Type: Permanent

Application Deadline: 30 June 2025

Responsibilities

  • Design, develop, and optimize complex SQL queries, stored procedures, and data models for Oracle-based systems
  • Create and maintain efficient data pipelines for extract, transform, and load (ETL) processes using Informatica or Python
  • Implement data quality controls and validation processes to ensure data integrity
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications
  • Document database designs, procedures, and configurations to support knowledge sharing and system maintenance
  • Troubleshoot and resolve database performance issues through query optimization and indexing strategies
  • Integrate Oracle systems with cloud services, particularly AWS S3 and related technologies
  • Participate in code reviews and contribute to best practices for database development
  • Support migration of data and processes from legacy systems to modern cloud-based solutions
  • Work within an Agile framework, participating in sprint planning, refinement, and retrospectives

Required Qualifications

  • 3+ years of experience with Oracle databases, including advanced SQL and PL/SQL development
  • Strong knowledge of data modelling principles and database design
  • Proficiency with Python for data processing and automation
  • Experience implementing and maintaining data quality controls
  • Experience with AI-assisted development tools (e.g., GitHub Copilot)
  • Ability to reverse engineer existing database schemas and understand complex data relationships
  • Experience with version control systems, preferably Git/GitHub
  • Excellent written communication skills for technical documentation
  • Demonstrated ability to work within Agile development methodologies
  • Knowledge of investment data concepts, particularly security reference data, fund reference data, transactions, orders, holdings, and fund accounting
Additional Qualifications

  • Experience with ETL tools like Informatica and Control-M
  • Unix shell scripting skills for data processing and automation
  • Familiarity with CI/CD pipelines for database code
  • Experience with AWS services, particularly S3, Lambda, and Step Functions
  • Knowledge of database security best practices
  • Experience with data visualization tools (Power BI)
  • Familiarity with investment data domains (Security Reference, Trades, Orders, Holdings, Funds, Accounting, Index, etc.)
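To give a flavour of the ETL and data-quality work described above, here is a minimal, illustrative sketch in Python: an extract-transform-load step with a validation gate. It uses the standard-library sqlite3 module as a stand-in for an Oracle source; all table and column names (`holdings`, `holdings_clean`, `fund_id`, etc.) are hypothetical, not part of any Fidelity system.

```python
import sqlite3

# Illustrative ETL step with a data quality gate. sqlite3 stands in
# for an Oracle source; table/column names are hypothetical.

def extract(conn):
    """Pull raw holdings rows from the source database."""
    return conn.execute(
        "SELECT fund_id, security_id, quantity FROM holdings"
    ).fetchall()

def validate(rows):
    """Data quality control: reject rows with missing keys or
    negative quantities; return (clean_rows, rejected_rows)."""
    clean, rejects = [], []
    for fund_id, security_id, qty in rows:
        if fund_id and security_id and qty is not None and qty >= 0:
            clean.append((fund_id, security_id, qty))
        else:
            rejects.append((fund_id, security_id, qty))
    return clean, rejects

def load(conn, rows):
    """Write only validated rows to the target table."""
    conn.executemany(
        "INSERT INTO holdings_clean VALUES (?, ?, ?)", rows
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE holdings (fund_id, security_id, quantity)")
    conn.execute("CREATE TABLE holdings_clean (fund_id, security_id, quantity)")
    conn.executemany(
        "INSERT INTO holdings VALUES (?, ?, ?)",
        [("F1", "S1", 100), ("F1", None, 50), ("F2", "S2", -5)],
    )
    clean, rejects = validate(extract(conn))
    load(conn, clean)
    # One valid row loaded; two rejected (missing key, negative quantity).
```

In a production pipeline the same shape applies, with the extract running against Oracle, the rejects routed to an exception table or report, and the load targeting a warehouse table or an AWS S3 landing area.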