AWS / Databricks / Snowflake

Pune, Maharashtra, IN


Description

Department: Sales and Delivery Team - Empower

Industry: Information Technology & Services, Computer Software, Management Consulting

Location: WFH / India Remote

Experience Range: 6 - 10 years

Basic Qualification: Bachelor of Engineering or equivalent

Travel Requirements: Not required

Website: www.exusia.com

 

Exusia, a cutting-edge digital transformation consultancy, is looking for top talent in the DWH & Data Engineering space with specific skills in AWS/Databricks/Snowflake to join our global delivery team's Industry Analytics practice in India.


What’s the Role?

This is a full-time role working with Exusia's clients to design, develop and maintain large-scale data warehouses and data lakes. The right candidate will work with client stakeholders to capture requirements, design and model the data repository, and extract and transform data from various source systems into the target DWH or data lake to support analytical and reporting needs. The candidate should have experience with cloud-based DWH and data engineering tools, and should have worked on large-scale data platforms running on AWS, leveraging Snowflake as the DWH and Databricks for data processing. A minimal sketch of that pattern follows.
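To make the stack concrete, here is a minimal, hypothetical sketch of that pattern: a PySpark job of the kind run on Databricks, aggregating raw S3 data and writing the result into the Snowflake DWH via the Spark Snowflake connector. Every bucket, table, credential and warehouse name below is an illustrative placeholder, not an Exusia or client specific.

```python
# Sketch of a Databricks-style PySpark job landing an aggregate in Snowflake.
# All names and credentials are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("raw-to-snowflake").getOrCreate()

# Read raw landing-zone data from S3 (hypothetical bucket/path).
orders = spark.read.parquet("s3://example-landing-zone/orders/")

# A simple transformation: daily order totals per customer.
daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("total_amount"))
)

# Connection options for the Spark Snowflake connector (placeholder values).
sf_options = {
    "sfURL": "example_account.snowflakecomputing.com",
    "sfUser": "etl_user",
    "sfPassword": "********",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ETL_WH",
}

# Write the aggregate into the Snowflake DWH.
(daily_totals.write
    .format("snowflake")
    .options(**sf_options)
    .option("dbtable", "DAILY_ORDER_TOTALS")
    .mode("overwrite")
    .save())
```

Recent Databricks runtimes bundle this connector; outside Databricks, the spark-snowflake package would need to be attached to the cluster.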

Criteria for the Role!

·       Have a minimum of 6 years' experience in the Data Engineering & Analytics space, with hands-on project experience using Snowflake & Databricks

·       Should have worked on the AWS platform for large data initiatives and have exposure to AWS-native services relevant to Data Engineering & Analytics

Education Qualification – Bachelor of Science (preferably in Computer and Information Sciences or Business Information Technology) or an Engineering degree in the above areas

Requirements

Key Responsibilities:

·       Work with business stakeholders to gather and analyze business requirements, building a solid understanding of the Data Analytics domain

·       Document, discuss and resolve business, data and reporting issues within the team, across functional teams, and with business stakeholders

·       Should be able to work independently and develop solution designs

·       Build optimized data processing solutions using the given toolset

·       Provide hands-on technical leadership and work on multiple projects if required

·       Design pipelines to ingest & integrate structured/unstructured data from heterogeneous sources (see the orchestration sketch after this list)

·       Collaborate with delivery leadership to deliver projects on time while adhering to quality standards

·       Contribute to talent recruitment and competency building, and provide mentorship to junior team members in these skills
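As a purely illustrative companion to the pipeline-design responsibility above, the sketch below shows a minimal Airflow DAG (Airflow 2.4+ assumed) wiring ingest, transform and load steps in sequence; the DAG id, schedule and task bodies are placeholders rather than an actual Exusia pipeline.

```python
# Minimal orchestration sketch, assuming Airflow 2.4+ (schedule= keyword).
# All ids and callables are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest():
    # Pull files from heterogeneous sources into a landing zone (placeholder).
    pass


def transform():
    # Kick off the Spark/Databricks transformation job (placeholder).
    pass


def load():
    # Load curated output into the Snowflake DWH (placeholder).
    pass


with DAG(
    dag_id="example_dwh_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_ingest = PythonOperator(task_id="ingest", python_callable=ingest)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: ingest -> transform -> load.
    t_ingest >> t_transform >> t_load
```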

Mandatory Skills:

·       Must have strong DWH and ETL fundamentals

·       Setting up and designing a virtual DWH using Snowflake

·       Optimized data loading into Snowflake using tools like Snowpipe / Fivetran & Databricks

·       Experience handling bulk loading vs. continuous loading using Snowpipe, along with hands-on experience with scripting / stored procedures (a sketch contrasting the two loading modes follows this list)

·       Using Snowflake features like staging, Time Travel, data sharing and data security to deliver a performant, optimized DWH while working with large datasets / big data

·       Experience working with multiple databases and an understanding of data migration from Oracle/SQL Server/other DBs to Snowflake

·       Hands-on experience using Databricks in at least 2 projects; exposure to Python, PySpark & Spark SQL is mandatory

·       Hands-on experience using Delta Lake to support large-scale datasets

·       Experience handling structured/unstructured data and batch/real-time data processing use cases

·       Using Databricks scheduling capabilities or Airflow to orchestrate data pipelines

·       Working knowledge of AWS and hands-on use of its storage/compute services
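To illustrate the bulk-vs-continuous loading and Time Travel items above, here is a minimal sketch using the snowflake-connector-python driver. The account, stage, table and credential values are placeholders, and an AUTO_INGEST pipe additionally assumes an external stage wired to cloud event notifications.

```python
# Sketch contrasting one-off bulk COPY with continuous Snowpipe loading,
# plus a Time Travel query. All identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="********",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# Bulk load: a one-off COPY INTO from an (assumed pre-created) external stage.
cur.execute("""
    COPY INTO orders
    FROM @landing_stage/orders/
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")

# Continuous load: a Snowpipe that auto-ingests files as they arrive
# (requires event notifications on the stage's cloud storage).
cur.execute("""
    CREATE PIPE IF NOT EXISTS orders_pipe AUTO_INGEST = TRUE AS
    COPY INTO orders
    FROM @landing_stage/orders/
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")

# Time Travel: count rows as the table looked one hour (3600 s) ago.
cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
print(cur.fetchone())

cur.close()
conn.close()
```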

 


Nice-to-Have Skills:

 

·       Exposure to the Healthcare domain is a big plus

·       Provisioning and managing Databricks clusters

·       Knowledge of AWS services such as Lambda and AWS DevOps tools

·       Understanding of data modeling & other ETL tools

·       Prior migration experience: on-prem to cloud and legacy databases to Snowflake

·       Self-starter who can pick up other cloud skills like Azure & GCP



