Data Engineering Specialist
India - Remote
Weekday
At Weekday, we help companies hire engineers who are vouched for by other software engineers. We enable engineers to earn passive income by monetizing the otherwise unused knowledge they carry about the best people they have worked with. This role is for one of Weekday's clients.
Salary range: Rs 15,00,000 - Rs 25,00,000 (i.e., INR 15-25 LPA)
Minimum experience: 5 years
Job type: Full-time
We are looking for a skilled Data Engineer to design, develop, and maintain scalable and efficient data pipelines and warehousing solutions. The ideal candidate will have strong experience in ELT/ETL processes, data modeling, and working with modern cloud and big data technologies to support data-driven decision-making across the organization.
Key Responsibilities:
- Design, build, and maintain robust ETL/ELT data pipelines ensuring accuracy, completeness, and timely delivery of data.
- Collaborate with cross-functional teams to gather data requirements and translate them into scalable data models and solutions.
- Develop and optimize data pipelines using technologies such as Elasticsearch, AWS S3, Snowflake, and NFS (see the illustrative sketch after this list).
- Design and manage data warehouse schemas to support analytics and business intelligence initiatives.
- Implement data validation, quality checks, and monitoring systems to ensure data integrity and proactively address issues.
- Partner with data scientists and analysts to ensure data is accessible and usable for analytical applications.
- Stay up to date with best practices in data engineering, including CI/CD, DevSecFinOps, and Agile/Scrum methodologies.
- Contribute to the continuous improvement of our data infrastructure and warehousing architecture.
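To make the stack concrete, here is a minimal, illustrative sketch of the kind of pipeline described above: an Airflow DAG that stages a daily S3 extract into Snowflake and then runs a simple row-count quality check. It assumes Airflow 2.x with snowflake-connector-python installed; the DAG id, stage, table, column, and credential values are hypothetical placeholders, not details of the client's environment.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def _connect():
    """Open a Snowflake connection. Credentials here are hypothetical;
    in practice they would come from an Airflow connection or a secrets
    backend, never hard-coded."""
    import snowflake.connector  # assumes snowflake-connector-python

    return snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )


def load_s3_extract(ds: str, **_) -> None:
    """COPY the day's files from an external S3 stage into a raw table."""
    conn = _connect()
    try:
        conn.cursor().execute(
            f"COPY INTO raw_events FROM @s3_events_stage/dt={ds}/ "
            "FILE_FORMAT = (TYPE = 'JSON')"
        )
    finally:
        conn.close()


def check_row_count(ds: str, **_) -> None:
    """Fail the run if the load produced no rows -- a minimal quality gate."""
    conn = _connect()
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT COUNT(*) FROM raw_events WHERE load_date = %s", (ds,)
        )
        (count,) = cur.fetchone()
        if count == 0:
            raise ValueError(f"No rows loaded for {ds}")
    finally:
        conn.close()


with DAG(
    dag_id="daily_events_to_snowflake",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow >= 2.4; older versions use schedule_interval
    catchup=False,
) as dag:
    load = PythonOperator(task_id="load_s3_extract", python_callable=load_s3_extract)
    check = PythonOperator(task_id="check_row_count", python_callable=check_row_count)
    load >> check
```

In a real deployment the COPY step could equally be handled by an operator from the Airflow Snowflake provider package; the PythonOperator version is used here only to keep the sketch self-contained.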
Requirements:
Mandatory:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in Data Engineering with a focus on ELT/ETL workflows.
- 3+ years of hands-on experience with Snowflake data warehousing solutions.
- 3+ years of experience in building and maintaining ETL pipelines using Airflow.
- Minimum 3 years of professional experience using Python for data processing and automation tasks.
- Experience with Elasticsearch in the context of data pipelines and analytics.
- Strong command of SQL and data modeling best practices.
- Hands-on experience with AWS S3 and other cloud-based data storage services.
- Familiarity with NFS and similar file storage systems.
- Strong analytical thinking and problem-solving skills.
- Excellent communication and collaboration abilities.
Technical Skills:
- Languages: Python, SQL
- Data Engineering: ETL/ELT, Airflow, Data Modeling
- Tools & Platforms: Snowflake, Elastic Search, AWS S3, NFS
- Methodologies: CI/CD, Agile/Scrum, DevSecFinOps
* Salary range is an estimate based on our AI, ML, and Data Science Salary Index.