Data Engineer

Bengaluru

Fractal

Fractal Analytics helps global Fortune 100 companies power every human decision in the enterprise by bringing analytics and AI to the decision.

It's fun to work in a company where people truly BELIEVE in what they are doing!

We're committed to bringing passion and customer focus to the business.

Fractal Analytics is a strategic analytics partner to the most admired Fortune 500 companies globally and helps them power every human decision in the enterprise by bringing analytics & AI to the decision-making process.

Fortune 500 companies recognize analytics as a competitive advantage to understand customers and make better decisions. We deliver insight, innovation, and impact to them by leveraging Big Data, analytics, and technology, helping them drive smarter, faster, and more accurate decisions in every aspect of their business.

Fractal has consistently been rated one of India’s best companies to work for by The Great Place to Work® Institute. Fractal has been featured as a leader in the Customer Analytics Service Providers Wave™ 2019 by Forrester Research and recognized as an “Honorable Vendor” in Gartner’s 2019 Magic Quadrant for data & analytics.

Job Description:

  • 3+ years of hands-on experience with Snowflake utilities and features such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, the Optimizer, the Account Usage schema, data sharing, and stored procedures (a brief Streams/Tasks sketch follows this list).

  • Experience in handling semi-structured data (JSON, Parquet) in Snowflake (see the illustrative sketch after this list).

  • Hands-on experience with dbt Core or dbt Cloud for performing transformations within the Snowflake data platform.

  • Good understanding of how micro-partitioning works within Snowflake.

  • SQL expertise with experience in handling large-scale data.

  • Strong knowledge and hands-on experience in data modeling.

  • Hands-on experience with at least one cloud platform such as AWS or Azure.

  • Strong expertise in ETL/ELT tools such as Informatica, Talend, or Matillion.

  • In-depth understanding of ETL/ELT concepts and data modeling principles.

  • Ability to develop effective validation processes between data sources and BI assets to ensure accurate and timely availability of reliable data.

  • Experience in data warehousing: OLTP, OLAP, dimensions, facts, and data modeling.

  • Experience in setting up CI/CD pipelines for the Snowflake data platform.

  • Experience gathering and analyzing system requirements.

  • SnowPro Core certification is a plus.

  • Experience with a cloud-based data warehouse (DWH) is a plus.
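
To give a concrete flavor of the semi-structured data work above, the following is a minimal illustrative sketch (not part of the role's actual codebase) using the snowflake-connector-python package; the connection settings and the table, stage, and field names (raw_events, my_json_stage, payload, items) are hypothetical placeholders.

    # Illustrative sketch only: land JSON into a VARIANT column and flatten a
    # nested array with LATERAL FLATTEN. All object names are hypothetical.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="YOUR_ACCOUNT",      # placeholder connection details
        user="YOUR_USER",
        password="YOUR_PASSWORD",
        warehouse="YOUR_WH",
        database="YOUR_DB",
        schema="YOUR_SCHEMA",
    )
    cur = conn.cursor()

    # Land raw JSON files from a named stage into a VARIANT column.
    cur.execute("CREATE TABLE IF NOT EXISTS raw_events (payload VARIANT)")
    cur.execute(
        "COPY INTO raw_events FROM @my_json_stage/events/ "
        "FILE_FORMAT = (TYPE = 'JSON')"
    )

    # Query nested fields with path notation and flatten an embedded array.
    cur.execute("""
        SELECT payload:user.id::STRING    AS user_id,
               payload:event_type::STRING AS event_type,
               item.value:sku::STRING     AS sku
        FROM raw_events,
             LATERAL FLATTEN(input => payload:items) item
    """)
    for user_id, event_type, sku in cur:
        print(user_id, event_type, sku)

    cur.close()
    conn.close()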
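
Likewise, a rough sketch of the Streams/Tasks pattern from the first bullet, run through the same connector; object names (raw_events_stream, curated_events, MY_WH) are again hypothetical, and in practice such DDL would usually be versioned in SnowSQL scripts or a CI/CD pipeline rather than run from ad-hoc Python.

    # Illustrative sketch only: a stream captures changes on raw_events and a
    # scheduled task merges new rows into a (hypothetical) curated table.
    import snowflake.connector

    conn = snowflake.connector.connect(account="YOUR_ACCOUNT", user="YOUR_USER",
                                        password="YOUR_PASSWORD", warehouse="MY_WH")
    cur = conn.cursor()

    # Change-data-capture stream on the raw table.
    cur.execute("CREATE OR REPLACE STREAM raw_events_stream ON TABLE raw_events")

    # Task that wakes every 5 minutes but only runs when the stream has new rows.
    cur.execute("""
        CREATE OR REPLACE TASK load_curated_events
          WAREHOUSE = MY_WH
          SCHEDULE  = '5 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('RAW_EVENTS_STREAM')
        AS
          INSERT INTO curated_events (user_id, event_type, loaded_at)
          SELECT payload:user.id::STRING,
                 payload:event_type::STRING,
                 CURRENT_TIMESTAMP()
          FROM raw_events_stream
    """)

    # Tasks are created suspended; resume to start the schedule.
    cur.execute("ALTER TASK load_curated_events RESUME")

    cur.close()
    conn.close()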

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Not the right fit?  Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page, or create an account to set up email alerts for new job postings that match your interests!

Category: Engineering Jobs

Tags: AWS Azure Big Data Data Warehousing dbt ELT ETL Informatica JSON Matillion OLAP Parquet Pipelines Python Research Snowflake SQL Talend

Region: Asia/Pacific
Country: India
