Data Engineer (Snowflake + Airflow)
India - Remote
Weekday
At Weekday, we help companies hire engineers who are vouched for by other software engineers. We enable engineers to earn passive income by leveraging and monetizing the unused knowledge in their heads about the best people they have worked with. This role is for one of Weekday's clients.
Salary range: Rs 18,00,000 - Rs 25,00,000 (i.e., INR 18-25 LPA)
Min Experience: 5 years
Job Type: Full-time
We are looking for a highly skilled and motivated Data Engineer with deep expertise in Snowflake and Airflow to join our growing data team. In this role, you will be responsible for designing, developing, and maintaining robust data pipelines and ETL workflows that power our data analytics and business intelligence solutions. You’ll work closely with data scientists, analysts, and product teams to ensure data is accessible, reliable, and optimized for performance.
This role requires strong hands-on experience in Python, PySpark, SQL, Snowflake, and Airflow, and familiarity with AWS cloud services and Elasticsearch for scalable, secure, and efficient data processing.
Requirements
Key Responsibilities:
- Design, develop, and maintain scalable and reliable ETL workflows and data pipelines using Airflow, Python, and PySpark (a minimal sketch follows this list).
- Build and optimize data models in Snowflake to support analytical and operational use cases.
- Write efficient and reusable SQL queries for data extraction, transformation, and loading into Snowflake.
- Work with large-scale datasets to ensure high data quality, integrity, and availability.
- Integrate data from multiple sources, including APIs, databases, and Elasticsearch, into a centralized data platform.
- Optimize and monitor data pipelines and workflow performance for reliability and scalability.
- Collaborate with data scientists and business analysts to understand data needs and provide technical support for analytics and reporting.
- Implement and enforce best practices for data engineering, including coding standards, version control, and documentation.
- Utilize AWS tools and services such as S3, Lambda, EC2, and CloudWatch to support and deploy data pipelines in the cloud environment.
- Maintain data security and compliance in line with company and regulatory requirements.
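To make the stack concrete, here is a minimal sketch (assuming Airflow 2.4+, PySpark, and snowflake-connector-python) of the kind of daily pipeline this role owns: extract a raw partition from S3, clean it with PySpark, and load it into Snowflake. Every name here (buckets, tables, the ETL_S3_INT storage integration, the credentials) is a placeholder; a real deployment would pull secrets from Airflow connections or a secrets backend rather than hard-coding them.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_etl():
    """Daily pipeline: pick up raw order events, clean them, load into Snowflake."""

    @task
    def extract(ds: str = None) -> str:
        # Airflow injects the logical date as `ds`; bucket and prefix are placeholders.
        return f"s3://example-raw-bucket/orders/dt={ds}/"

    @task
    def transform(source_path: str) -> str:
        # In production this step would usually run via SparkSubmitOperator on a
        # cluster; an in-task local SparkSession keeps the sketch self-contained.
        from pyspark.sql import SparkSession, functions as F

        spark = SparkSession.builder.appName("orders_transform").getOrCreate()
        cleaned = (
            spark.read.json(source_path)
            .dropDuplicates(["order_id"])
            .withColumn("order_ts", F.to_timestamp("order_ts"))
            .filter(F.col("amount") > 0)
        )
        curated_path = source_path.replace("example-raw-bucket", "example-curated-bucket")
        cleaned.write.mode("overwrite").parquet(curated_path)
        spark.stop()
        return curated_path

    @task
    def load(curated_path: str) -> None:
        import snowflake.connector

        # Placeholder credentials; use an Airflow connection or secrets manager instead.
        conn = snowflake.connector.connect(
            account="example_account", user="etl_user", password="***",
            warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
        )
        try:
            # Assumes a (hypothetical) storage integration named ETL_S3_INT grants
            # Snowflake read access to the curated bucket.
            conn.cursor().execute(
                f"COPY INTO orders FROM '{curated_path}' "
                "STORAGE_INTEGRATION = ETL_S3_INT "
                "FILE_FORMAT = (TYPE = PARQUET) "
                "MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE"
            )
        finally:
            conn.close()

    load(transform(extract()))


orders_etl()
```

Passing only S3 paths between tasks keeps XCom payloads small: the data itself stays in object storage, and the Snowflake COPY reads it directly through the storage integration.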
Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Minimum of 5 years of professional experience in a Data Engineering role.
- Proficiency in Python and PySpark for building robust data workflows and handling big data.
- Expertise in SQL and experience optimizing complex queries.
- Strong hands-on experience with Snowflake, including data modeling and performance tuning (see the sketch after this list).
- Experience using Apache Airflow for orchestrating workflows and scheduling data pipelines.
- Solid understanding of ETL concepts, data warehousing, and data pipeline architecture.
- Experience with the AWS cloud platform and its services.
- Familiarity with Elasticsearch for data indexing and search solutions is a plus.
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication skills and ability to work collaboratively in a cross-functional team.
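As a small illustration of the Snowflake modeling and performance-tuning skills called out above, the sketch below uses snowflake-connector-python to create a fact table clustered on its most common filter column (so Snowflake can prune micro-partitions on date-range scans) and to run a parameterized aggregate query. Table, column, and credential names are illustrative only.

```python
import snowflake.connector

# Placeholder credentials; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="example_account", user="analyst", password="***",
    warehouse="ANALYTICS_WH", database="ANALYTICS", schema="MARTS",
)
cur = conn.cursor()

# Fact table clustered on the usual filter column, so date-range queries
# scan only the relevant micro-partitions.
cur.execute("""
    CREATE TABLE IF NOT EXISTS fact_orders (
        order_id     NUMBER,
        customer_key NUMBER,
        order_date   DATE,
        amount       NUMBER(12, 2)
    )
    CLUSTER BY (order_date)
""")

# Bind parameters (%s is the connector's default pyformat style) keep the
# query reusable and avoid unsafe string interpolation.
cur.execute(
    """
    SELECT customer_key, SUM(amount) AS total_spend
    FROM fact_orders
    WHERE order_date BETWEEN %s AND %s
    GROUP BY customer_key
    ORDER BY total_spend DESC
    LIMIT 10
    """,
    ("2024-01-01", "2024-03-31"),
)
for customer_key, total_spend in cur.fetchall():
    print(customer_key, total_spend)

cur.close()
conn.close()
```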