Snowflake Developer

Pune, Maharashtra, India

Weekday

At Weekday, we help companies hire engineers who are vouched for by other software engineers. We enable engineers to earn passive income by leveraging and monetizing the otherwise unused knowledge in their heads about the best people they have worked...

This role is for one of Weekday's clients

Salary range: Rs 22,00,000 – Rs 24,00,000 (i.e., INR 22–24 LPA)

Min Experience: 5 years

Location: Pune, Bengaluru, Chennai, Kolkata, Gurgaon

Job Type: Full-time

We are looking for an experienced Snowflake Developer to join our Data Engineering team. The ideal candidate will possess a deep understanding of Data Warehousing, SQL, ETL tools like Informatica, and visualization platforms such as Power BI. This role involves building scalable data pipelines, optimizing data architectures, and collaborating with cross-functional teams to deliver impactful data solutions.

Requirements

Key Responsibilities

  • Data Engineering & Warehousing: Apply 5+ years of hands-on Data Engineering experience, with a focus on Data Warehousing and Business Intelligence.
  • Pipeline Development: Design and maintain ELT pipelines using Snowflake, Fivetran, and DBT to ingest and transform data from multiple sources.
  • SQL Development: Write and optimize complex SQL queries and stored procedures to support robust data transformations and analytics.
  • Data Modeling & ELT: Implement advanced data modeling practices, including SCD Type-2, and build high-performance ELT workflows using DBT (see the sketch after this list).
  • Requirement Analysis: Partner with business stakeholders to capture data needs and convert them into scalable technical solutions.
  • Data Quality & Troubleshooting: Conduct root cause analysis on data issues, maintain high data integrity, and ensure reliability across systems.
  • Collaboration & Documentation: Collaborate with engineering and business teams. Develop and maintain thorough documentation for pipelines, data models, and processes.
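
For illustration, the SCD Type-2 modeling referenced above is commonly implemented in DBT as a snapshot. The following is a minimal sketch, not taken from the client's stack: the erp.customers source, customer_id key, and updated_at column are all hypothetical names.

```sql
-- snapshots/customers_snapshot.sql
-- Hypothetical DBT snapshot: captures SCD Type-2 history for a source table.
{% snapshot customers_snapshot %}

{{
    config(
      target_schema='snapshots',
      unique_key='customer_id',
      strategy='timestamp',
      updated_at='updated_at'
    )
}}

select * from {{ source('erp', 'customers') }}

{% endsnapshot %}
```

Running dbt snapshot maintains the dbt_valid_from and dbt_valid_to columns automatically, closing out the prior row and inserting a new version whenever a tracked record changes.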

Skills & Qualifications

  • Expertise in Snowflake for large-scale data warehousing and ELT operations.
  • Strong SQL skills, with the ability to create and manage complex queries and stored procedures (see the sketch following this list).
  • Proven experience with Informatica PowerCenter for ETL development.
  • Proficiency with Power BI for data visualization and reporting.
  • Hands-on experience with Fivetran for automated data integration.
  • Familiarity with DBT, Sigma Computing, Tableau, and Oracle.
  • Solid understanding of data analysis, requirement gathering, and source-to-target mapping.
  • Knowledge of cloud ecosystems such as Azure (including ADF, Databricks); experience with AWS or GCP is a plus.
  • Experience with workflow orchestration tools like Airflow, Azkaban, or Luigi.
  • Proficiency in Python for scripting and data processing (Java or Scala is a plus).
  • Bachelor’s or Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related field.
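
As a rough illustration of the stored-procedure work called out above, here is a minimal Snowflake Scripting sketch; the staging.events table, load_ts column, and procedure name are hypothetical, not taken from the posting.

```sql
-- Hypothetical housekeeping procedure: deletes staging rows older than a
-- caller-supplied retention window and reports how many rows were removed.
CREATE OR REPLACE PROCEDURE purge_stale_rows(retention_days NUMBER)
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
BEGIN
  -- Arguments are bound into SQL statements with a colon prefix.
  DELETE FROM staging.events
    WHERE load_ts < DATEADD(day, -1 * :retention_days, CURRENT_TIMESTAMP());
  -- SQLROWCOUNT holds the row count of the most recent DML statement.
  RETURN 'Deleted ' || SQLROWCOUNT || ' rows';
END;
$$;

-- Example invocation: purge anything older than 30 days.
CALL purge_stale_rows(30);
```

Snowflake Scripting (BEGIN ... END blocks in LANGUAGE SQL procedures) keeps such logic close to the data; where heavier processing is needed, the same work could instead be done in Snowpark with Python.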

Key Tools & Technologies

  • Snowflake, SnowSQL, Snowpark
  • SQL, Informatica, Power BI, DBT
  • Python, Fivetran, Sigma Computing, Tableau
  • Airflow, Azkaban, Azure, Databricks, ADF