Snowflake Developer
Pune, Maharashtra, India
Weekday
At Weekday, we help companies hire engineers who are vouched for by other software engineers. We enable engineers to earn passive income by leveraging and monetizing the unused information in their heads about the best people they have worked...This role is for one of Weekday's clients.
Salary range: Rs 7,00,000 - Rs 25,00,000 (i.e., INR 7-25 LPA)
Min Experience: 6 years
Location: Bengaluru, Pune, Chennai, Kolkata, Gurgaon
Job Type: Full-time
Requirements
Primary Roles and Responsibilities
- Design and implement modern data warehouse solutions utilizing Snowflake, Databricks, and Azure Data Factory (ADF).
- Deliver scalable, future-proof data engineering and analytics solutions.
- Collaborate with data warehouse and business intelligence leads to gather and understand ETL pipeline development requirements.
- Troubleshoot and resolve issues in existing data pipelines by identifying gaps and implementing fixes.
- Translate business reporting needs into effective data models and solutions.
- Mentor and support junior team members by helping them overcome technical challenges.
- Engage in technical discussions with client architects and internal teams.
- Manage and schedule data pipelines using orchestration tools like Airflow.
Skills and Qualifications
- Bachelor's or Master's degree in Computer Science or a related field.
- Minimum of 6 years of experience in IT, with at least 3 years focused on data warehouse and ETL projects.
- In-depth knowledge of Snowflake including security, SQL, and object design/implementation.
- Hands-on experience with Snowflake tools such as SnowSQL, Snowpipe, Snowsight, and various connectors.
- Strong understanding of dimensional data modeling techniques (star and snowflake schemas).
- Solid foundation in data management principles.
- Experience with Databricks platform and Delta Lake architecture.
- Proficient in SQL and Spark, specifically PySpark.
- Skilled in building ETL processes and data transformation workflows.
- Familiarity with NoSQL databases such as MongoDB, Cassandra, or Neo4j.
- Capable of handling structured, unstructured, image, and geospatial data.
- Experience working in DevOps environments using tools like Terraform, CircleCI, and Git.
- Strong expertise in RDBMS, advanced SQL and PL/SQL, Unix shell scripting, query optimization, and performance tuning.
- Databricks Certified Data Engineer Associate/Professional certification is a plus.
- Agile methodology experience is preferred.
- Excellent verbal and written communication skills.
- Strong problem-solving abilities and attention to detail.
Mandatory Skills
- Snowflake
- Azure Data Factory (ADF)
- PySpark
- Databricks
* Salary range is an estimate based on our AI, ML, Data Science Salary Index