Senior Data Engineer (Azure)

Delhi, IN (Remote)

Applications have closed

MindTech

MindTech offers IT outsourcing services with a nearshore focus to help you accelerate your business profitably, backed by 17+ years of experience.

Senior Data Engineer

We seek a highly skilled and motivated Senior Data Engineer to join our team.

Work Schedule Expectation (Non-Negotiable but Flexible)

One of the key responsibilities for this role is to ensure availability during the Enterprise Data Warehouse (EDW) batch job runtime, which occurs daily from 12:30 AM to 3:30 AM EST. Although failures are rare (>90% of jobs succeed without intervention), we need someone experienced who can quickly address any issues that arise and rerun failed jobs before US users begin their day.

In addition to this 3-hour window, the candidate is expected to be available from 8:00 AM EST until at least 1:00 PM EST (roughly 8 hours per day in total, including the early shift) to provide overlap with US business hours. The schedule can be managed flexibly, and we do not consider this an “on-call” role.

We value outcomes over micromanagement and trust the engineer to self-manage their time responsibly.
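As an illustration of the batch-window duty described above, the sketch below polls a job-run audit table for failed EDW jobs so they can be rerun before US business hours. It is only a sketch: the connection string, the dbo.edw_job_run_log table, and its columns are hypothetical placeholders, not a description of the actual environment.

```python
# Minimal sketch: detect failed EDW batch jobs after the nightly window so they
# can be rerun before US users start their day. Everything named here
# (connection string, dbo.edw_job_run_log, its columns) is a hypothetical
# placeholder, not part of the actual stack.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=edw-sql.example.com;DATABASE=EDW;"   # placeholder server/database
    "UID=batch_monitor;PWD=<secret>;Encrypt=yes"
)

FAILED_JOBS_SQL = """
    SELECT job_name, run_id, error_message
    FROM dbo.edw_job_run_log          -- hypothetical batch audit table
    WHERE run_date = CAST(GETDATE() AS DATE)
      AND status = 'FAILED'
    ORDER BY job_name;
"""

def find_failed_jobs():
    """Return (job_name, run_id, error_message) rows for today's failed runs."""
    conn = pyodbc.connect(CONN_STR)
    try:
        cursor = conn.cursor()
        cursor.execute(FAILED_JOBS_SQL)
        return cursor.fetchall()
    finally:
        conn.close()

if __name__ == "__main__":
    failures = find_failed_jobs()
    if not failures:
        print("All EDW batch jobs succeeded; no intervention needed.")
    for job_name, run_id, error_message in failures:
        # The actual rerun is triggered in the scheduler (e.g. ADF or SQL Agent);
        # this sketch only surfaces which jobs need attention.
        print(f"Rerun required: {job_name} (run {run_id}): {error_message}")
```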

Job Responsibilities

● Design and develop scalable, efficient, high-performance ETL/ELT data pipelines.

● Collaborate with internal and external stakeholders to facilitate seamless data integration across platforms.

● Implement data quality checks and monitoring systems to ensure data integrity and accuracy.

● Optimize data storage and processing systems, addressing bottlenecks and improving performance.

● Troubleshoot and resolve data issues, including pipeline failures and performance concerns.

● Provide guidance to teams on data-related challenges.

● Document processes and technical specifications clearly for cross-team visibility.

Requirements

● Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

● 8+ years of experience in data engineering and data warehouse environments.

● Strong experience in data integration and automation.

● Proficient in SQL and PL/SQL across multiple database platforms (SQL Server, Snowflake).

● 5+ years of Python experience.

● 3+ years working with AWS or Azure cloud platforms.

● Familiarity with Property & Casualty Insurance industry is a plus.

● Knowledge of CI/CD tools (Azure DevOps, Jenkins, GitLab).

● Experience with BI tools like Tableau, Power BI, or Cognos.

● Excellent communication and collaboration skills.

Category: Engineering Jobs

Tags: AWS Azure CI/CD Computer Science Data pipelines Data quality Data warehouse DevOps ELT Engineering ETL GitLab Jenkins Pipelines Power BI Python Snowflake SQL Tableau

Perks/benefits: Flex hours

Regions: Remote/Anywhere Asia/Pacific
Country: India
