Data Engineer (Snowflake + AWS)

Mumbai

Fractal

Fractal Analytics helps global Fortune 100 companies power every human decision in the enterprise by bringing analytics and AI to the decision.



It's fun to work in a company where people truly BELIEVE in what they are doing!

We're committed to bringing passion and customer focus to the business.

Role - Data Engineer (Snowflake + AWS)
Years of Experience - 5-10 years
Location - Mumbai, Pune, Chennai, Bangalore, Gurugram

We are looking for a talented Snowflake Data Engineer with expertise in dbt to join our dynamic data team. In this role, you will design, build, and maintain our data infrastructure, leveraging Snowflake for data warehousing and dbt for data transformation. Your work will enable the organization to derive actionable insights and make data-driven decisions.
 

Key Responsibilities:

  • Design, develop, and maintain data pipelines in Snowflake (a minimal load sketch follows this list).
  • Perform data transformations, mappings, and scheduling of ETL processes.
  • Set up and manage dbt models to ensure data quality and consistency.
  • Monitor and troubleshoot data jobs to ensure seamless operation.
  • Collaborate with data analysts and engineers to optimize data workflows.
  • Implement best practices for data storage, retrieval, and security.
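
The sketch below is a rough, hypothetical illustration of this kind of pipeline work: an incremental upsert from a staging table into a reporting table in Snowflake using the snowflake-connector-python package. The connection settings, table names, and columns are placeholders, not details of any Fractal project.

```python
import os

import snowflake.connector

# Connection details come from environment variables; all names here are
# illustrative placeholders rather than project-specific values.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

# A typical incremental step: upsert newly landed order rows from a staging
# table into a curated reporting table (table and column names are made up).
MERGE_SQL = """
MERGE INTO ANALYTICS.REPORTING.ORDERS AS tgt
USING ANALYTICS.STAGING.ORDERS_RAW AS src
    ON tgt.ORDER_ID = src.ORDER_ID
WHEN MATCHED THEN UPDATE SET
    ORDER_STATUS = src.ORDER_STATUS,
    UPDATED_AT = src.UPDATED_AT
WHEN NOT MATCHED THEN INSERT (ORDER_ID, ORDER_STATUS, UPDATED_AT)
    VALUES (src.ORDER_ID, src.ORDER_STATUS, src.UPDATED_AT)
"""

cur = conn.cursor()
try:
    cur.execute(MERGE_SQL)
    print(f"Rows affected: {cur.rowcount}")
finally:
    cur.close()
    conn.close()
```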

Required Skills:

  • Proficiency in Snowflake and strong SQL skills for data querying and transformation.
  • Experience with dbt for managing data models.
  • Familiarity with ETL/ELT processes and data pipeline orchestration with Airflow (a minimal DAG sketch follows this list).
  • Ability to monitor, debug, and optimize data workflows.
  • Excellent problem-solving skills and attention to detail.
  • Expertise in AWS for data engineering and cloud-based solutions.
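
As a sketch of what pipeline orchestration with Airflow typically looks like in this stack, the DAG below schedules a Snowflake load step followed by a dbt run. The DAG id, schedule, file paths, and commands are assumptions for illustration only; shell commands stand in for whatever provider operators a real project would use.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A minimal daily pipeline: land raw data in Snowflake, then run dbt models.
# The dag_id, paths, and commands are placeholders for illustration.
with DAG(
    dag_id="snowflake_dbt_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    # In practice this step might use Snowflake provider operators or a
    # COPY INTO script; a shell command stands in here.
    load_raw = BashOperator(
        task_id="load_raw_to_snowflake",
        bash_command="python /opt/pipelines/load_raw_to_snowflake.py",
    )

    # Rebuild dbt models in the warehouse once the raw load has finished.
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    load_raw >> run_dbt
```

Monitoring and troubleshooting of such jobs then happens through the Airflow UI and task logs, which is where the debugging expectations above come into play.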

Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Proven experience as a Data Engineer, with a strong focus on Snowflake and dbt.
  • Proficiency in SQL and experience with data modeling concepts.
  • Strong understanding of ETL processes and data transformation best practices.
  • Familiarity with cloud platforms (e.g., AWS, Azure, GCP).
  • Excellent problem-solving skills and attention to detail.
  • Strong communication skills and the ability to work collaboratively in a team environment.

Preferred Qualifications:

  • Experience with version control systems (e.g., Git) for dbt projects.

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

