Data Engineer (3-5 years) (Remote)

Pune, Maharashtra, India

Codvo.ai

Codvo is an AI, Cloud & UX Development Company, helping enterprises and start-ups build remote tech teams that create high-performing software across disruptive industries.


Company Overview

We are a global empathy-led technology services company where software and people transformations go hand-in-hand.

Product innovation and mature software engineering are part of our core DNA. Our mission is to help our customers accelerate their digital journeys through a global, diverse, and empathetic talent pool following outcome-driven agile execution. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day.

We continue to invest in our digital strategy, design, cloud engineering, data, and enterprise AI capabilities required to bring a truly integrated approach to solving our clients' most ambitious digital journey challenges.

Job Description:

We are looking for a skilled Data Engineer with 3-5 years of experience in data processing, database management, and cloud data solutions. The ideal candidate should have a strong background in Python, SQL, and big data technologies to optimize and manage large-scale data workflows.

Key Responsibilities:

Develop, optimize, and administer Microsoft SQL Server databases.

Work with Python for data manipulation, ETL processes, and automation.

Process large-scale data with Databricks and PySpark, or with BigQuery.

Design and manage data workflows using Apache Airflow or similar orchestration tools (see the sketch after this list).

Implement and maintain cloud data solutions with Azure Data Factory, Delta Lake, and Microsoft Fabric.

Apply strong problem-solving skills, data modeling expertise, and effective communication.

(Plus) Work with SAP HANA for enterprise data solutions.
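
For illustration, a minimal sketch of the kind of daily ETL workflow this role involves, written with the Airflow 2.x TaskFlow API; the DAG, task, and data names are hypothetical placeholders, not an actual Codvo pipeline:

    # Hypothetical daily ETL DAG (Airflow 2.4+ TaskFlow API).
    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def sales_etl():
        @task
        def extract() -> list[dict]:
            # A real task would pull from an API, SQL Server, or blob storage.
            return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": -5.0}]

        @task
        def transform(rows: list[dict]) -> list[dict]:
            # Simple cleanup step: drop rows with non-positive amounts.
            return [r for r in rows if r["amount"] > 0]

        @task
        def load(rows: list[dict]) -> None:
            # A real task would write to a warehouse table; here we just log.
            print(f"loading {len(rows)} rows")

        load(transform(extract()))

    sales_etl()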


Required Skills:

  • Programming: Proficiency in Python and SQL for data manipulation.
  • Database Management: Experience with Microsoft SQL Server optimization and administration.
  • Big Data Processing: Hands-on experience with Databricks, PySpark, or BigQuery (see the PySpark sketch after this list).
  • Data Orchestration: Familiarity with Apache Airflow or similar workflow management tools.
  • Cloud Technologies: Expertise in Azure Data Factory, Delta Lake, and Microsoft Fabric, or comparable services on AWS or GCP.
  • Bonus Skills: Understanding of SAP HANA is a plus.
  • Soft Skills: Strong problem-solving abilities, data modeling expertise, and excellent communication skills.
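
By way of example, a short hypothetical PySpark sketch of the kind of big data processing listed above: deduplicating raw events and writing them out as a Delta Lake table. The paths and column names are invented, and it assumes a Spark session with Delta Lake support (as on Databricks):

    # Hypothetical PySpark job: keep the latest record per event_id, write Delta.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.appName("events_dedup").getOrCreate()

    raw = spark.read.json("/mnt/raw/events/")  # hypothetical landing path

    # Rank records per event_id by ingestion time, newest first.
    w = Window.partitionBy("event_id").orderBy(F.col("ingested_at").desc())
    deduped = (
        raw.withColumn("rn", F.row_number().over(w))
           .filter(F.col("rn") == 1)
           .drop("rn")
    )

    # Overwrite the curated Delta table (path is a placeholder).
    deduped.write.format("delta").mode("overwrite").save("/mnt/curated/events/")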

    Experience: 3-5 years
    Work Location: Remote

Timing: 2:30 PM – 11:30 PM

