Data Engineer

Pune, India

Valtech

At Valtech, we offer solutions designed to achieve the exceptional, helping our clients leap beyond sameness and maximize their full potential.

We are seeking a skilled Data Engineer to join our growing data team in India. You will be responsible for designing, building, and maintaining scalable data infrastructure and pipelines that enable data-driven decision making across our organization and client projects. This role offers the opportunity to work with cutting-edge technologies and contribute to innovative data solutions for global clients.

What we ask

Technical Skills

  • 3+ years of experience in data engineering or a related field
  • Strong programming skills in Python and/or Scala/Java
  • Experience with SQL and database technologies (PostgreSQL, MySQL, MongoDB)
  • Hands-on experience with data processing frameworks (see the sketch after this list):
    • Apache Spark, Hadoop ecosystem
    • Apache Kafka for streaming data
    • Apache Airflow or similar workflow orchestration tools
  • Knowledge of data warehouse concepts and technologies
  • Experience with containerization (Docker, Kubernetes)
  • Understanding of data modeling principles and best practices
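
To make the framework expectations above concrete, here is a minimal sketch of the kind of batch transformation this role involves, written in PySpark. The file and column names (orders.csv, order_ts, amount) are illustrative placeholders rather than a real Valtech dataset.

    # Minimal PySpark batch rollup sketch; assumes a local Spark installation.
    # File and column names are illustrative placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily-order-rollup").getOrCreate()

    # Ingest raw CSV; schema inference keeps the sketch short.
    orders = spark.read.csv("orders.csv", header=True, inferSchema=True)

    # Aggregate revenue and order counts per day for reporting.
    daily = (
        orders.withColumn("order_date", F.to_date("order_ts"))
              .groupBy("order_date")
              .agg(F.sum("amount").alias("revenue"),
                   F.count(F.lit(1)).alias("order_count"))
    )

    # Write analytics-ready Parquet, partitioned by date.
    daily.write.mode("overwrite").partitionBy("order_date").parquet("out/daily_orders")

    spark.stop()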

Cloud & Platform Experience

  • Experience with at least one major cloud platform (AWS, Azure, or GCP)
  • Familiarity with cloud-native data services:
    • Data lakes, data warehouses, and analytics services
    • Serverless computing and event-driven architectures
    • Identity and access management for data systems
  • Knowledge of Infrastructure as Code (Terraform, CloudFormation, ARM templates)

Data & Analytics

  • Understanding of data governance and security principles
  • Experience with data quality frameworks and monitoring (see the sketch after this list)
  • Knowledge of dimensional modeling and data warehouse design
  • Familiarity with business intelligence and analytics tools
  • Understanding of data privacy regulations (GDPR, CCPA)
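
As an illustration of what data quality monitoring can mean in practice, below is a minimal, framework-free completeness check in Python with pandas. The column names and the 1% null-rate threshold are assumptions for the example; on a real project a framework such as Great Expectations or dbt tests would typically play this role.

    # Minimal data quality check sketch using pandas only.
    # Column names and thresholds are illustrative assumptions.
    import pandas as pd

    def check_completeness(df, required_columns, max_null_rate=0.01):
        """Return a list of human-readable issues; an empty list means the check passed."""
        issues = []
        if df.empty:
            return ["dataframe is empty"]
        for col in required_columns:
            if col not in df.columns:
                issues.append(f"missing required column: {col}")
                continue
            null_rate = df[col].isna().mean()
            if null_rate > max_null_rate:
                issues.append(f"{col}: null rate {null_rate:.1%} exceeds {max_null_rate:.1%}")
        return issues

    if __name__ == "__main__":
        sample = pd.DataFrame({"customer_id": [1, 2, None], "amount": [10.0, 20.0, 30.0]})
        for issue in check_completeness(sample, ["customer_id", "amount"]):
            print("DATA QUALITY:", issue)  # in production this would feed an alerting channel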

Preferred Qualifications

Advanced Technical Skills

  • Experience with modern data stack tools (dbt, Fivetran, Snowflake, Databricks)
  • Knowledge of machine learning pipelines and MLOps practices
  • Experience with event-driven architectures and microservices
  • Familiarity with data mesh and data fabric concepts
  • Experience with graph databases (Neo4j, Amazon Neptune)

Industry Experience

  • Experience in digital agency or consulting environment
  • Background in financial services, e-commerce, retail, or customer experience platforms
  • Knowledge of marketing technology and customer data platforms
  • Experience with real-time analytics and personalization systems

Soft Skills

  • Strong problem-solving and analytical thinking abilities
  • Excellent communication skills for client-facing interactions
  • Ability to work independently and manage multiple projects
  • Adaptability to rapidly changing technology landscape
  • Experience mentoring junior team members

What you do

Data Infrastructure & Architecture

  • Design and implement robust, scalable data architectures and pipelines
  • Build and maintain ETL/ELT processes for batch and real-time data processing (a streaming sketch follows this list)
  • Develop data models and schemas optimized for analytics and reporting
  • Ensure data quality, consistency, and reliability across all data systems
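
For the real-time half of this responsibility, the sketch below shows one common pattern: reading a Kafka topic with Spark Structured Streaming and landing it as Parquet. The broker address, topic name, and paths are placeholders, it assumes the spark-sql-kafka connector is available, and the actual stack will vary by client.

    # Minimal streaming ETL sketch: Kafka topic -> Parquet sink via
    # Spark Structured Streaming. Broker, topic, and paths are placeholders,
    # and the spark-sql-kafka connector must be on the classpath.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("events-stream-ingest").getOrCreate()

    events = (
        spark.readStream.format("kafka")
             .option("kafka.bootstrap.servers", "localhost:9092")
             .option("subscribe", "events")
             .option("startingOffsets", "latest")
             .load()
    )

    # Kafka values arrive as bytes; cast to string and keep the broker timestamp.
    parsed = events.select(
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp").alias("event_ts"),
    )

    query = (
        parsed.writeStream.format("parquet")
              .option("path", "out/events")
              .option("checkpointLocation", "chk/events")
              .trigger(processingTime="1 minute")
              .start()
    )
    query.awaitTermination()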

Platform-Agnostic Development

  • Work with multiple cloud platforms (AWS, Azure, GCP) based on client requirements
  • Implement data solutions using various technologies and frameworks
  • Adapt quickly to new tools and platforms as project needs evolve
  • Maintain expertise across different cloud ecosystems and services

Data Pipeline Development

  • Create automated data ingestion pipelines from various sources (APIs, databases, files, streaming); see the DAG sketch after this list
  • Implement data transformation logic using modern data processing frameworks
  • Build monitoring and alerting systems for data pipeline health
  • Optimize pipeline performance and cost-efficiency
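
As an illustration of what such a pipeline can look like in Apache Airflow, here is a minimal daily-ingest DAG with a failure callback standing in for the monitoring and alerting hook. The DAG id, task names, and function bodies are hypothetical placeholders, and the sketch assumes Airflow 2.4 or later.

    # Minimal Airflow DAG sketch: daily extract -> load, with a failure callback
    # as the alerting hook. Names and function bodies are placeholders;
    # assumes Airflow 2.4+ for the "schedule" argument.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def notify_failure(context):
        # Placeholder: in practice this would post to Slack, PagerDuty, email, etc.
        print(f"Pipeline failed in task: {context['task_instance'].task_id}")

    def extract(**_):
        print("pulling data from a source API, database, or file drop")  # placeholder

    def load(**_):
        print("loading transformed data into the warehouse")  # placeholder

    with DAG(
        dag_id="daily_ingest",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        on_failure_callback=notify_failure,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task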

Collaboration & Integration

  • Work closely with data scientists, analysts, and business stakeholders
  • Collaborate with DevOps teams to implement CI/CD for data pipelines
  • Partner with client teams to understand data requirements and deliver solutions
  • Participate in architecture reviews and technical decision-making

What we offer

You’ll join an international network of data professionals within our organisation. We support continuous development through our dedicated Academy. If you're looking to push the boundaries of innovation and creativity in a culture that values freedom and responsibility, we encourage you to apply.

At Valtech, we’re here to engineer experiences that work and reach every single person. To do this, we are proactive about creating workplaces that work for every person at Valtech. Our goal is to create an equitable workplace which gives people from all backgrounds the support they need to thrive, grow and meet their goals (whatever they may be). You can find out more about what we’re doing to create a Valtech for everyone here.

Please do not worry if you do not meet all of the criteria or if you have some gaps in your CV. We’d love to hear from you and see if you’re our next member of the Valtech team!
