Data Engineer

Noida, Uttar Pradesh

ShyftLabs

ShyftLabs is not just a software company; we're your partners in propelling digital transformation at unprecedented speed. As experts, we specialize in crafting end-to-end solutions through our collaborative approach. With a deep-rooted...



Position Overview

We are looking for an experienced Lead Data Engineer to join our dynamic team. If you are passionate about building scalable software solutions and working collaboratively with cross-functional teams to define requirements and deliver them, we would love to hear from you. ShyftLabs is a growing data product company, founded in early 2020, that works primarily with Fortune 500 companies. We deliver digital solutions that help accelerate business growth across industries by focusing on creating value through innovation.

Job Responsibilities:

  • Develop and maintain data pipelines and ETL/ELT processes using Python
  • Design and implement scalable, high-performance applications
  • Work collaboratively with cross-functional teams to define requirements and deliver solutions
  • Develop and manage near real-time data streaming solutions using Pub/Sub or Beam
  • Contribute to code reviews, architecture discussions, and continuous improvement initiatives
  • Monitor and troubleshoot production systems to ensure reliability and performance
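To give candidates a feel for the day-to-day work, here is a minimal sketch of the kind of Pandas transform step these pipelines involve; the dataset, column names, and cleaning rules are hypothetical, not taken from any ShyftLabs codebase:

```python
import pandas as pd

def transform_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Transform step of a toy ETL pipeline (hypothetical schema:
    order_id, amount, country)."""
    df = raw.dropna(subset=["order_id"]).copy()        # drop incomplete records
    # Coerce amounts to numbers; unparseable values become 0.0
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce").fillna(0.0)
    df["country"] = df["country"].str.upper()          # normalize for grouping
    return df.groupby("country", as_index=False)["amount"].sum()

raw = pd.DataFrame({
    "order_id": [1, 2, None, 3],
    "amount": ["10.5", "4", "7", "bad"],
    "country": ["in", "IN", "us", "us"],
})
summary = transform_orders(raw)
```

In a production pipeline this step would typically sit between an extract (e.g., reading from Cloud Storage or BigQuery) and a load into a warehouse table, with unit tests around each transform.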

Basic Qualifications:

  • 5+ years of professional software development experience with Python
  • Strong understanding of software engineering best practices (testing, version control, CI/CD)
  • Experience building and optimizing ETL/ELT processes and data pipelines
  • Proficiency with SQL and database concepts
  • Experience with data processing frameworks (e.g., Pandas)
  • Understanding of software design patterns and architectural principles
  • Ability to write clean, well-documented, and maintainable code
  • Experience with unit testing and test automation
  • Experience working with any cloud provider (GCP is preferred)
  • Experience with CI/CD pipelines and infrastructure as code
  • Experience with containerization technologies such as Docker or Kubernetes
  • Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience)
  • Proven track record of delivering complex software projects
  • Excellent problem-solving and analytical thinking skills
  • Strong communication skills and ability to work in a collaborative environment

Preferred Qualifications:

  • Experience with GCP services, particularly Cloud Run and Dataflow
  • Experience with stream processing technologies (Pub/Sub)
  • Familiarity with workflow orchestration tools (e.g., Airflow)
  • Experience with data visualization tools and libraries
  • Knowledge of CI/CD pipelines with Gitlab and infrastructure as code with Terraform
  • Familiarity with data platforms such as Snowflake, BigQuery, or Databricks
  • GCP Data Engineer certification

We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.




Perks/benefits: Career development Competitive pay Startup environment

Region: Asia/Pacific
Country: India
