Senior Engineer - ETL Support

Gurgaon

Guardian

We provide life insurance, disability insurance, dental insurance, and other benefits that help protect people and inspire their well-being.



Job Description:

• Determine the design and work out the strategy for new projects. Lead the project lifecycle, closely following governance models.
• Identify and resolve reported issues within defined SLAs as part of routine maintenance of existing applications built in SharePoint Online.
• Build report extraction and automation to support the metrics maintained in the applications.
• Work with stakeholders on enhancements to existing applications.

Project Planning, Tracking, & Reporting
• Work out budgetary estimates at kick-off, after understanding the high-level requirements.
• Support the team in project-planning activities, proactively identify risks, escalate through appropriate channels as required, and closely track actual effort.
• Regularly communicate project status, emerging risks, and any impediments.

Design
• Use Microsoft Visio and Word to create HLD, LLD, and DLD documents at various stages of SharePoint site development and maintenance.

Qualifications:

Job Description Summary

Seeking a motivated Data Ingestion and Integration Engineer to join our team as an individual contributor.

In this role, you will primarily support, design, build, and optimize automated data pipelines to meet data integration and ingestion needs.

 

Position Objective:

Informatica platform administration, development, and production support. Review and analyze business data requirements with IT and business stakeholders for data warehousing, data integration, and reporting projects. Prepare technical ETL design documents; develop and support ETL processes using Informatica PowerCenter, Informatica Cloud, Syncsort DMX-h, and SSIS (SQL Server Integration Services); and provide 24/7 production support with on-call coverage.

 

Competencies/Skills: Individual Contributor Competencies

 

Proficient in the following:

Any ETL tool (e.g., Precisely, Informatica)

Datalake/Databricks Concepts

PySpark

SQL/Unix (basics)

Strong understanding of Lakehouse concepts (e.g., Delta format, workflow creation)

 

Knowledge:

Strong knowledge of ETL concepts, especially as applied to Informatica and Informatica Cloud.

Strong knowledge of data analysis, data transformation, conceptual data modeling, and metadata management.

Strong knowledge of data design for various applications.

Strong knowledge of SQL.

Good knowledge of various RDBMSs and mainframe systems.

Good knowledge of project life-cycle methodologies.

Strong verbal & written communication skills.

Strong knowledge of Unix & Unix shell scripting.

Experience with Enterprise Scheduling tools such as Control-M is a plus.

 

Job responsibilities:

Provide production & on-call support for all ETL platforms.

Review & understand business data-related ETL requirements for database, application, and reporting projects.

Develop & enhance new and existing data systems.

Participate in development planning & resource estimation for tasks.

Assist with the development of logical data models.

Monitor production jobs & resolve any issues in a timely manner.

Partner with other IT areas in resolving issues & improving processes.

Develop Informatica or ETL Workflows.

Develop ETL code both in Informatica & at the shell script level.

Design, build, and maintain automated data pipelines that ensure smooth data ingestion and integration from various sources.

Develop scalable and efficient solutions for data ingestion, using best practices in ETL/ELT processes.

Implement data validation and quality checks to ensure data integrity throughout the pipeline.

Optimize data pipelines to handle large volumes of data in real-time and batch processing environments.

Collaborate with data engineers, analysts, and other stakeholders to understand data requirements and deliver seamless solutions.

Monitor pipeline performance, troubleshoot issues, and continuously improve pipeline efficiency.

Document pipeline processes, configuration, and maintenance routines to facilitate knowledge sharing and onboarding.
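As a flavour of the "data validation and quality checks" responsibility above, here is a minimal sketch of a batch validation step in plain Python. The column names (`policy_id`, `premium`) and the specific checks are illustrative assumptions, not taken from this posting; real pipelines here would likely use PySpark or Informatica features instead.

```python
import csv
import io

# Hypothetical schema for an ingested batch; names are illustrative only.
REQUIRED_COLUMNS = {"policy_id", "premium"}

def validate_batch(raw_csv: str, min_rows: int = 1) -> dict:
    """Run basic quality checks on one ingested CSV batch and
    return a report with the row count and any failed checks."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    rows = list(reader)
    failures = []

    # Check 1: schema - all required columns must be present.
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        failures.append(f"missing columns: {sorted(missing)}")

    # Check 2: volume - reject empty or truncated batches.
    if len(rows) < min_rows:
        failures.append(f"too few rows: {len(rows)} < {min_rows}")

    # Check 3: completeness - no empty values in required columns.
    if not missing:
        bad_rows = [i for i, r in enumerate(rows)
                    if any(not r[c] for c in REQUIRED_COLUMNS)]
        if bad_rows:
            failures.append(f"empty values in rows: {bad_rows}")

    return {"row_count": len(rows),
            "passed": not failures,
            "failures": failures}
```

A batch that fails any check would typically be quarantined and alerted on rather than loaded, keeping bad data out of downstream reporting.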

Location:

This position can be based in any of the following locations:

Gurgaon

Current Guardian Colleagues: Please apply through the internal Jobs Hub in Workday



Region: Asia/Pacific
Country: India
