Snowflake Engineer

Bengaluru, Karnataka, India

Weekday

At Weekday, we help companies hire engineers who are vouched for by other software engineers. We enable engineers to earn passive income by leveraging and monetizing the untapped knowledge in their heads about the best people they have worked...

This role is for one of Weekday's clients.

Salary range: Rs 10,00,000 - Rs 40,00,000 (i.e., INR 10-40 LPA)

Min Experience: 8 years

Location: Bangalore, Pune, Chennai, Kolkata, Gurgaon

Job Type: Full-time

We are seeking an experienced Snowflake Engineer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools. The ideal candidate will have 8+ years of experience designing, developing, and maintaining data pipelines, integrating data from multiple sources, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Requirements

Key Responsibilities

  • Data Pipeline Development & Integration: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT (Extract, Load, Transform) processes across various data sources.
  • SQL Query Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis (a minimal stored-procedure sketch follows this list).
  • Data Modeling & ELT Implementation: Implement advanced data modeling techniques (e.g., SCD Type-2) using DBT (see the snapshot sketch after this list). Design and optimize high-performance data architectures in Snowflake.
  • Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
  • Troubleshooting & Data Quality: Perform root cause analysis of data issues, ensure effective resolution, and maintain high data quality standards.
  • Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Maintain clear documentation for data processes, data models, and pipelines.
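
As a flavor of the stored-procedure work above, here is a minimal sketch using Snowflake Scripting to rebuild a reporting aggregate. All table and column names (raw.orders, reporting.daily_sales, amount) are hypothetical, not taken from this posting.

    -- Hypothetical Snowflake Scripting procedure: rebuilds a daily sales
    -- aggregate and reports how many rows were written.
    CREATE OR REPLACE PROCEDURE refresh_daily_sales()
    RETURNS VARCHAR
    LANGUAGE SQL
    AS
    $$
    DECLARE
        rows_written INTEGER;
    BEGIN
        -- Full rebuild for simplicity; an incremental MERGE would suit
        -- larger tables.
        CREATE OR REPLACE TABLE reporting.daily_sales AS
        SELECT
            order_date,
            region,
            SUM(amount) AS total_amount,
            COUNT(*)    AS order_count
        FROM raw.orders
        GROUP BY order_date, region;

        SELECT COUNT(*) INTO :rows_written FROM reporting.daily_sales;
        RETURN 'daily_sales rebuilt with ' || rows_written::VARCHAR || ' rows';
    END;
    $$;

It would be invoked with CALL refresh_daily_sales(); procedures like this are typically scheduled via Snowflake tasks or an orchestrator such as Airflow.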
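
One common way to implement SCD Type-2 in DBT is with a snapshot, which is the technique the sketch below illustrates. It assumes a raw.customers source with an updated_at column; those names are assumptions for illustration only.

    -- snapshots/customers_snapshot.sql (hypothetical source and columns)
    {% snapshot customers_snapshot %}

    {{
        config(
            target_schema='snapshots',
            unique_key='customer_id',
            strategy='timestamp',
            updated_at='updated_at'
        )
    }}

    -- dbt adds dbt_valid_from / dbt_valid_to columns on each change,
    -- which is the SCD Type-2 history pattern: changed rows are
    -- versioned rather than overwritten.
    select
        customer_id,
        customer_name,
        customer_segment,
        updated_at
    from {{ source('raw', 'customers') }}

    {% endsnapshot %}

Running dbt snapshot then preserves the full change history of every customer row.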

Skills & Qualifications

  • Snowflake expertise for data warehousing and ELT processes.
  • Strong proficiency in SQL for relational databases and writing complex queries.
  • Experience with Informatica PowerCenter for data integration and ETL development.
  • Familiarity with Power BI for data visualization and business intelligence reporting.
  • Experience with Fivetran for automated ELT pipelines.
  • Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
  • Strong data analysis, requirement gathering, and mapping skills.
  • Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
  • Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
  • Proficiency in Python for data processing (knowledge of Java or Scala is a plus).

Education

  • Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Skills

  • Snowflake Cloud
  • Snowflake
  • Snowpipe
  • SQL
  • Informatica
  • DBT (Data Build Tool)