Data Engineering Lead

India

WTW

WTW provides data-driven, insight-led solutions in the areas of people, risk and capital.



The Role

    Design and implement clean, modular, efficient Python (3.x) codebases for backend services, data pipelines, and LLM integrations.
    Integrate with external Document AI/LLM systems via RESTful APIs, codifying prompts into production-grade code and managing their lifecycle (versioning, tuning concepts, template integration); a minimal sketch follows this list.
    Architect and evolve MongoDB schemas, with expert handling of embedding vs referencing strategies, schema migrations, and performance tuning.
    Perform CRUD operations, indexing, backup strategies, and monitoring on MongoDB Atlas hosted on AWS; manage VPC peering, IAM roles, and serverless triggers if needed.
    Build and maintain cloud-native, scalable, secure data systems primarily on Azure or AWS.
    Ensure high standards of quality with unit testing, CI/CD pipelines, and coding best practices.
    Lead hands-on development while collaborating closely with Data Scientists, Product Managers, and DevOps teams.
    Champion a high-quality, production-grade approach to LLM prompt engineering and backend data services.
    Monitor technological trends in AI integration, NoSQL technologies, and cloud-native data architectures to keep Neuron's tech stack future-proof.
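
The prompt-codification point above can be illustrated with a minimal Python sketch: a prompt kept in code as a versioned, testable template, rendered and sent to a REST-style LLM endpoint. The endpoint URL, environment variable names, model name, and the OpenAI-style request/response shape are illustrative assumptions, not a description of WTW's actual stack.

import os
from dataclasses import dataclass

import requests


@dataclass(frozen=True)
class PromptTemplate:
    """A prompt kept in code so it can be reviewed, versioned, and unit tested."""
    name: str
    version: str
    template: str

    def render(self, **kwargs: str) -> str:
        # Fill the template placeholders with request-specific values.
        return self.template.format(**kwargs)


# Hypothetical prompt for a Document AI extraction task; bump the version
# whenever the wording changes so behaviour can be traced and rolled back.
EXTRACT_PARTIES_V2 = PromptTemplate(
    name="extract_parties",
    version="2.0.0",
    template=(
        "Extract the insured party and policy number from the document below. "
        "Respond as JSON with keys 'insured' and 'policy_number'.\n\n{document_text}"
    ),
)


def call_llm(prompt: str, api_url: str, api_key: str) -> str:
    """POST the rendered prompt to a REST-style LLM endpoint (payload shape assumed)."""
    response = requests.post(
        api_url,
        headers={"Authorization": f"Bearer {api_key}"},
        json={"model": "gpt-4o-mini",
              "messages": [{"role": "user", "content": prompt}]},
        timeout=30,
    )
    response.raise_for_status()
    # Assumes an OpenAI-style chat-completions response body.
    return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    rendered = EXTRACT_PARTIES_V2.render(
        document_text="Policy 123-ABC issued to Acme Ltd."
    )
    print(call_llm(rendered, os.environ["LLM_API_URL"], os.environ["LLM_API_KEY"]))

Keeping templates as plain code objects like this lets them be unit tested and reviewed through the same CI/CD pipeline as the rest of the backend.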


The Requirements
Mandatory Skills
•    Python Development
     o    Strong proficiency in Python 3.x for backend and scripting tasks
     o    Experience integrating with Document AI/LLM systems (e.g., OpenAI, Azure OpenAI) via APIs
     o    Good understanding of RESTful API concepts and integration patterns
     o    Ability to codify prompts, manage their lifecycle, and integrate templates into production LLM pipelines
     o    Familiarity with unit testing and CI/CD pipelines (e.g., GitHub Actions, Azure DevOps)
•    Data Analytics Exposure
     o    Familiarity with end-to-end data analytics workflows, including data preparation, transformation, and insight delivery
     o    Ability to support or collaborate with analytics teams to ensure backend systems support analytical use cases
     o    Experience working with WTW’s Radar platform is a strong plus
•    Document Database Expertise
     o    Strong experience working with document databases for high-performance applications (MongoDB preferred)
     o    Proficiency in schema design, including embedding vs referencing strategies (a minimal sketch follows the requirements)
     o    Hands-on experience with CRUD operations, indexing, performance tuning, and schema evolution
•    Cloud Platform Familiarity
     o    Strong familiarity with Azure or AWS services relevant to data and backend application hosting
     o    Bonus: experience with AWS/Azure SDKs in Python
•    General
     o    8–12 years of total experience in data engineering, backend development, and/or cloud-native application development
     o    Ability to operate both strategically (solution architecture) and tactically (hands-on coding)
     o    Excellent communication and documentation skills to share complex ideas with technical and non-technical stakeholders
Nice-to-Have Skills
    Experience fine-tuning or customizing LLMs beyond just API integration.
    Familiarity with data governance frameworks and data quality best practices.
    Experience in insurance, financial services, or digital platform environments.
    Exposure to serverless cloud-native architectures.
    Understanding of secure software development practices in regulated environments.
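
The embedding vs referencing requirement can be illustrated with a short pymongo sketch. The collection names, fields, connection string, and the policy/claims example are hypothetical; in practice the client would point at the MongoDB Atlas cluster mentioned in the role description.

from pymongo import ASCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")  # Atlas connection string in practice
db = client["policies_demo"]

# Embedding: claims that are always read together with their policy can live
# inside the policy document, so a single read returns everything.
db.policies_embedded.insert_one({
    "policy_number": "123-ABC",
    "insured": "Acme Ltd.",
    "claims": [
        {"claim_id": "C-1", "amount": 1200.0},
        {"claim_id": "C-2", "amount": 480.0},
    ],
})

# Referencing: claims that grow without bound, or are queried on their own,
# go in a separate collection keyed by the policy number.
db.policies.insert_one({"policy_number": "123-ABC", "insured": "Acme Ltd."})
db.claims.insert_many([
    {"policy_number": "123-ABC", "claim_id": "C-1", "amount": 1200.0},
    {"policy_number": "123-ABC", "claim_id": "C-2", "amount": 480.0},
])

# An index on the reference field keeps the lookup fast as the collection grows.
db.claims.create_index([("policy_number", ASCENDING)])

# Referenced documents can still be joined back together with $lookup when needed.
joined = db.policies.aggregate([
    {"$match": {"policy_number": "123-ABC"}},
    {"$lookup": {
        "from": "claims",
        "localField": "policy_number",
        "foreignField": "policy_number",
        "as": "claims",
    }},
])
print(list(joined))

The rule of thumb the sketch encodes: embed data that is read with its parent and bounded in size; reference data that grows without limit or is accessed independently, and index the reference field.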
