Data Engineer
Bengaluru, Karnataka, India
- Remote-first
Weekday
At Weekday, we help companies hire engineers who are vouched for by other software engineers. We enable engineers to earn passive income by leveraging and monetizing the unused knowledge in their heads about the best people they have worked... This role is for one of Weekday's clients.
Salary range: Rs 6,00,000 - Rs 17,00,000 (i.e., INR 6-17 LPA)
Min Experience: 3 years
Location: Bangalore, Chennai, Pune, Kolkata, Gurugram
Job Type: Full-time
Experience: 6+ years in IT with 3+ years in Data Warehouse/ETL projects
Requirements
Primary Responsibilities:
- Design and develop modern data warehouse solutions using Snowflake, Databricks, and Azure Data Factory (ADF).
- Deliver forward-looking data engineering and analytics solutions that scale with business needs.
- Work with DW/BI leads to gather and implement requirements for new ETL pipelines.
- Troubleshoot and resolve issues in existing pipelines, identifying root causes and implementing fixes.
- Partner with business stakeholders to understand reporting requirements and build corresponding data models.
- Provide technical mentorship to junior team members and assist with issue resolution.
- Engage in technical discussions with client architects and team members to align on best practices.
- Orchestrate data workflows using scheduling tools such as Apache Airflow; a minimal DAG sketch follows this list.
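To give candidates a concrete picture of the Airflow orchestration mentioned above, here is a minimal DAG sketch. The DAG id, schedule, and load step are hypothetical placeholders for illustration, not this client's actual pipeline.

```python
# Minimal Airflow DAG sketch; dag_id, schedule, and the load step are
# hypothetical placeholders, not the client's actual pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_to_warehouse():
    """Placeholder ETL step, e.g. a Snowflake or Databricks load."""
    print("loading daily batch into the warehouse")


with DAG(
    dag_id="daily_warehouse_load",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # run once per day (Airflow 2.4+ syntax)
    catchup=False,                   # skip backfilling missed runs
) as dag:
    PythonOperator(
        task_id="load_to_warehouse",
        python_callable=load_to_warehouse,
    )
```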
Qualifications:
- Bachelor's or Master's degree in Computer Science or a related field.
- Expertise in Snowflake, including security, SQL, and object design/implementation.
- Proficient with Snowflake tools such as SnowSQL, Snowpipe, Snowsight, and Snowflake connectors.
- Strong understanding of Star and Snowflake schema modeling.
- Deep knowledge of data management principles and data warehousing.
- Experience with Databricks and a solid grasp of Delta Lake architecture.
- Hands-on with SQL and Spark (preferably PySpark); an illustrative sketch follows this list.
- Experience developing ETL processes and transformations for data warehousing solutions.
- Familiarity with NoSQL and open-source databases such as MongoDB, Cassandra, or Neo4j.
- Exposure to structured and unstructured data, including imaging and geospatial formats.
- Proficient in DevOps tools and practices, including Terraform, CircleCI, and Git.
- Strong background in RDBMS, PL/SQL, Unix Shell Scripting, and query performance tuning.
- Databricks Certified Data Engineer Associate/Professional certification is a plus.
- Ability to thrive in a fast-paced, dynamic environment managing multiple projects.
- Experience working within Agile development frameworks.
- Excellent communication, analytical, and problem-solving skills with strong attention to detail.
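As a sketch of the PySpark and Delta Lake experience called for above, here is a minimal transform-and-write step. The input path, column names, and output location are assumptions made for illustration only; it assumes the Delta Lake libraries are available, as they are by default on Databricks.

```python
# Minimal PySpark sketch; input path, columns, and output location are
# hypothetical assumptions made for illustration only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Read a raw fact table (hypothetical path and schema).
orders = spark.read.parquet("/data/raw/orders")

# Aggregate order amounts per day, a typical warehouse transformation.
daily_revenue = (
    orders.withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Write as a Delta table; Delta format is standard on Databricks clusters.
daily_revenue.write.format("delta").mode("overwrite").save(
    "/data/curated/daily_revenue"
)
```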
Mandatory Skills:
Snowflake, Azure Data Factory, PySpark, Databricks, SQL, Python
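Since Snowflake, SQL, and Python head the mandatory skills, a minimal Snowflake Python connector sketch follows. Credentials are read from environment variables, and the warehouse, database, and schema names are hypothetical.

```python
# Minimal Snowflake Python connector sketch; credentials come from
# environment variables and the object names are hypothetical.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",   # hypothetical names
    database="DW",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT CURRENT_VERSION()")  # simple connectivity check
    print(cur.fetchone())
finally:
    conn.close()
```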
* Salary range is an estimate based on our AI, ML, Data Science Salary Index 💰