Senior Specialist Data Engineering

IND - Telangana - Hyderabad (HITEC City), India

MSD

At MSD, we're following the science to tackle some of the world's greatest health threats. Get a glimpse of how we work to improve lives.


Job Description

Senior Specialist Data Engineering

A Data Engineer is responsible for designing, building, and maintaining robust data pipelines and infrastructure that facilitate the collection, storage, and processing of large datasets. They collaborate with data scientists and analysts to ensure data is accessible, reliable, and optimized for analysis. Key tasks include data integration, ETL (Extract, Transform, Load) processes, and managing databases and cloud-based systems. Data engineers play a crucial role in enabling data-driven decision-making and ensuring data quality across organizations.

What will you do in this role:

  • Develop comprehensive High-Level Technical Design and Data Mapping documents to meet specific business integration requirements.
  • Own the data integration and ingestion solutions throughout the project lifecycle, delivering key artifacts such as data flow diagrams and source system inventories.
  • Provide end-to-end delivery ownership for assigned data pipelines, performing cleansing, processing, and validation on the data to ensure its quality.
  • Define and implement robust Test Strategies and Test Plans, ensuring end-to-end accountability for middleware testing and evidence management.
  • Collaborate with the Solutions Architecture and Business Analysis teams to analyze system requirements and prototype innovative integration methods.
  • Exhibit a hands-on leadership approach, ready to engage in coding, debugging, and all necessary actions to ensure the delivery of high-quality, scalable products.
  • Influence and drive cross-product teams and collaboration while coordinating the execution of complex, technology-driven initiatives within distributed and remote teams.
  • Work closely with various platform and competency teams to advance the Enterprise Integration mission and guide their roadmaps to address current and emerging data integration and ingestion capabilities.
  • Design ETL/ELT solutions, lead comprehensive system and integration testing, and outline standards and architectural toolkits to underpin our data integration efforts.
  • Analyze data requirements and translate them into technical specifications for ETL processes.
  • Develop and maintain ETL workflows, ensuring optimal performance and error handling mechanisms are in place.
  • Monitor and troubleshoot ETL processes to ensure timely and successful data delivery.
  • Collaborate with data analysts and other stakeholders to ensure alignment between data architecture and integration strategies.
  • Document integration processes, data mappings, and ETL workflows to maintain clear communication and ensure knowledge transfer.

What should you have:

  • Bachelor’s degree in Information Technology, Computer Science, or a related technology stream
  • 8+ years of working experience with enterprise data integration technologies – Informatica PowerCenter, Informatica Intelligent Data Management Cloud services (CDI, CAI, Mass Ingest, Orchestration)
  • 5+ years of integration experience utilizing REST and custom API integration
  • 8+ years of working experience with relational database technologies and cloud data stores on AWS, GCP, and Azure
  • 2+ years of work experience applying the AWS Well-Architected Framework, deployment and integration, and data engineering
  • Preferred experience with CI/CD processes and related tools, including Terraform, GitHub Actions, Artifactory, etc.
  • Proven expertise in Python and shell scripting, with a strong focus on leveraging these languages for data integration and orchestration to optimize workflows and enhance data processing efficiency
  • Extensive experience designing reusable integration patterns using cloud-native technologies
  • Extensive experience with process orchestration and scheduling of integration jobs in AutoSys and Airflow
  • Experience with Agile development methodologies and release management techniques
  • Excellent analytical and problem-solving skills
  • Good understanding of data modeling and data architecture principles

Current Employees apply HERE

Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully 
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company.  No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. 

Employee Status:

Regular

Relocation:

VISA Sponsorship:

Travel Requirements:

Flexible Work Arrangements:

Hybrid

Shift:

Valid Driving License:

Hazardous Material(s):


Required Skills:


Preferred Skills:

Job Posting End Date:

05/30/2025

*A job posting is effective until 11:59:59PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
