DE&A - Core - Data Engineer (Snowflake, Python, AWS)

India

Zensar

Zensar is a global organization that conceptualizes, builds, and manages digital products through experience design, data engineering, and advanced analytics for over 200 leading companies. Our solutions leverage industry-leading platforms to...


Job Title: Data Engineer (Snowflake, Python, AWS)

Experience: 5-8 Years

 

Job Summary:

We are looking for a highly skilled Data Engineer with expertise in Snowflake, Python, and AWS to join our team. The ideal candidate will have 5-8 years of experience in designing, building, and optimizing data pipelines, ensuring data integrity, and enabling efficient data storage and retrieval. This role requires strong analytical skills, problem-solving abilities, and hands-on experience with cloud-based data architectures.

Key Responsibilities:

  • Design, develop, and optimize ETL/ELT pipelines using Python and Snowflake.
  • Implement and maintain cloud-based data solutions on AWS (S3, Lambda, Glue, Redshift, etc.).
  • Work with structured and unstructured data from multiple sources to build efficient data models.
  • Develop and maintain data lakes and data warehouses, ensuring high availability and security.
  • Optimize SQL queries and Snowflake performance using best practices such as clustering, caching, and appropriate warehouse sizing.
  • Automate data workflows and support CI/CD pipelines for data engineering processes.
  • Collaborate with data scientists, analysts, and business teams to provide high-quality data solutions.
  • Ensure data quality, security, and compliance with industry best practices.

Required Skills & Qualifications:

  • 5-8 years of experience in data engineering.
  • Expertise in Snowflake architecture, optimization, and data modeling.
  • Strong proficiency in Python for data processing and automation.
  • Experience with AWS services like S3, Lambda, Glue, Redshift, and Step Functions.
  • Hands-on experience with ETL/ELT tools and data pipeline orchestration.
  • Strong SQL skills and experience in query optimization.
  • Knowledge of data security, governance, and compliance best practices.
  • Experience with Airflow, dbt, or similar orchestration and transformation tools is a plus.
  • Excellent problem-solving, analytical, and communication skills.

Preferred Qualifications:

  • Experience with Big Data technologies (Spark, Kafka, etc.).
  • Knowledge of machine learning pipelines or real-time data processing.
  • Certification in AWS, Snowflake, or Python is a plus.

 
