Snowflake Engineer

Bengaluru, Karnataka, India

Weekday

At Weekday, we help companies hire engineers who are vouched for by other software engineers. We are enabling engineers to earn passive income by leveraging & monetizing the unused information in their head about the best people they have worked...



This role is for one of Weekday's clients
Salary range: Rs 10,00,000 - Rs 40,00,000 (i.e., INR 10-40 LPA)
Min Experience: 6 years
Location: Karnataka, Pune, Kolkata, Gurgaon, Chennai, Bengaluru
JobType: full-time

Requirements

Primary Roles and Responsibilities:

● Develop Modern Data Warehouse solutions using Snowflake, Databricks, and Azure Data Factory (ADF).

● Provide forward-thinking solutions in the data engineering and analytics space.

● Collaborate with DW/BI leads to understand new ETL pipeline development requirements.

● Triage issues, identify gaps in existing pipelines, and fix them.

● Work with business stakeholders to understand reporting-layer needs and develop data models that fulfill them.

● Help junior team members resolve issues and technical challenges.

● Drive technical discussions with client architects and team members.

● Orchestrate data pipelines in the scheduler via Airflow.

Skills and Qualifications

● Bachelor's and/or master's degree in computer science, or equivalent experience.

● 6+ years of total IT experience, including 3+ years in Data Warehouse/ETL projects.

● Expertise in Snowflake security, Snowflake SQL, and designing/implementing Snowflake objects.

● Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, Snowsight and Snowflake connectors.

● Deep understanding of Star and Snowflake dimensional modeling.

● Strong knowledge of Data Management principles

● Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.

● Hands-on experience with SQL and Spark (PySpark).

● Experience in building ETL / data warehouse transformation processes

● Experience with open-source non-relational/NoSQL data repositories (e.g., MongoDB, Cassandra, Neo4j).

● Experience working with structured and unstructured data including imaging & geospatial data.

● Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.

● Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting, and query optimization.

● Databricks Certified Data Engineer Associate/Professional Certification (Desirable).

● Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent ongoing projects.

● Experience working in an Agile methodology.

● Strong verbal and written communication skills.

● Strong analytical and problem-solving skills with a high attention to detail.

Mandatory Skills: Snowflake / Azure Data Factory / PySpark / Databricks



Category: Engineering Jobs


Region: Asia/Pacific
Country: India
