Lead Data Engineer

Bengaluru, Karnataka, India

Weekday

At Weekday, we help companies hire engineers who are vouched for by other software engineers. We enable engineers to earn passive income by leveraging and monetizing the unused information in their heads about the best people they have worked...

This role is for one of Weekday's clients.

Min Experience: 8 years

Location: Bangalore, Mumbai

Job Type: Full-time

We are seeking a highly experienced and motivated Lead Data Engineer to join our data engineering team. This role is perfect for someone with 8–10 years of hands-on experience in designing and building scalable data infrastructure, data pipelines, and high-performance data platforms. You will lead a team of engineers, set data engineering standards, and work cross-functionally with data scientists, analysts, and software engineers to enable a data-driven culture within the organization.

Key Responsibilities:

  • Technical Leadership:
    Lead the design and development of robust, scalable, and high-performance data architectures, including batch and real-time data pipelines using modern technologies.
  • Data Pipeline Development:
    Architect, implement, and maintain complex ETL/ELT workflows using tools like Apache Airflow, Spark, Kafka, or similar.
  • Data Warehouse Management:
    Design and maintain cloud-based data warehouses and data lakes (e.g., Snowflake, Redshift, BigQuery, Delta Lake), ensuring optimized storage and query performance.
  • Data Quality and Governance:
    Implement data validation, monitoring, and governance processes to ensure data accuracy, completeness, and security across all platforms.
  • Collaboration:
    Work closely with stakeholders, including business analysts, data scientists, and application developers, to understand data needs and deliver effective solutions.
  • Mentorship and Team Management:
    Guide and mentor junior and mid-level data engineers, fostering best practices in code, architecture, and agile delivery.
  • Automation and CI/CD:
    Develop and manage data pipeline deployment processes using DevOps and CI/CD principles.

Required Skills & Qualifications:

  • 8–10 years of proven experience in data engineering or a related field.
  • Strong programming skills in Python, Scala, or Java.
  • Expertise in building scalable and fault-tolerant ETL/ELT processes using frameworks such as Apache Spark, Kafka, Airflow, or similar.
  • Hands-on experience with cloud platforms (AWS, GCP, or Azure) and tools like S3, Redshift, Snowflake, BigQuery, Glue, EMR, or Databricks.
  • In-depth understanding of relational and NoSQL databases (PostgreSQL, MongoDB, Cassandra, etc.).
  • Strong SQL skills with the ability to write complex and optimized queries.
  • Familiarity with data modeling, data warehousing concepts, and OLAP/OLTP systems.
  • Experience in deploying data services using containerization (Docker, Kubernetes) and CI/CD tools like Jenkins, GitHub Actions, or similar.
  • Excellent communication skills with a collaborative and proactive attitude.

Preferred Qualifications:

  • Experience working in fast-paced, agile environments or startups.
  • Exposure to machine learning pipelines, MLOps, or real-time analytics.
  • Familiarity with data governance frameworks and data privacy regulations (GDPR, CCPA).