DataOps Engineer

Bengaluru, Karnataka, India


Velotio


Velotio Technologies is a product engineering company working with innovative startups and enterprises. We are a certified Great Place to Work® and recognized as one of the best companies to work for in India. We have provided full-stack product development for 110+ startups across the globe, building products in the cloud-native, data engineering, B2B SaaS, IoT & machine learning space. Our team of 400+ elite software engineers solves hard technical problems while transforming customer ideas into successful products.

Job Overview:

We are seeking a skilled DataOps Engineer with a strong foundation in DevOps practices and data engineering principles. The ideal candidate will be responsible for the smooth deployment, observability, and performance optimization of data pipelines and platforms. You will work at the intersection of software engineering, DevOps, and data engineering, bridging the gaps between development, operations, and data teams.

Requirements

Key Responsibilities:

  • Design, implement, and manage CI/CD pipelines using tools such as Jenkins, Git, and Terraform.
  • Manage and maintain Kubernetes (K8s) clusters for scalable and resilient data infrastructure.
  • Develop and maintain observability tools and dashboards (e.g., Prometheus, Grafana, ELK stack) for monitoring pipeline and platform health.
  • Automate infrastructure provisioning and deployments using Infrastructure as Code (IaC) tools, preferably Terraform.
  • Collaborate with data engineers to debug, optimize, and track the performance of data pipelines (e.g., Airflow, Airbyte); a minimal pipeline sketch follows this list.
  • Implement and monitor data quality, lineage, and orchestration workflows.
  • Develop custom scripts and tools in Python to enhance pipeline reliability and automation.
  • Work closely with data teams to manage and optimize Snowflake environments, focusing on performance tuning and cost efficiency.
  • Ensure compliance with security, scalability, and operational best practices across the data platform.
  • Act as a liaison between development and operations to maintain SLAs for data availability and reliability.
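
To give candidates a concrete feel for the pipeline work above, here is a minimal, illustrative Airflow 2.x DAG in Python. The DAG id, task bodies, schedule, and data-quality threshold are hypothetical placeholders for this posting, not an actual Velotio pipeline.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        # Stand-in for an ingestion step (e.g., triggering an Airbyte sync).
        print("extract: pulling raw events")


    def check_quality():
        # Stand-in data-quality gate: fail the run if the check does not pass,
        # so downstream loads never see bad data.
        row_count = 1_000  # placeholder for a real freshness/row-count metric
        if row_count == 0:
            raise ValueError("quality check failed: no rows ingested")


    def load():
        # Stand-in for a load step into the warehouse (e.g., Snowflake).
        print("load: writing curated tables")


    with DAG(
        dag_id="example_events_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@hourly",
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ):
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_check = PythonOperator(task_id="check_quality", python_callable=check_quality)
        t_load = PythonOperator(task_id="load", python_callable=load)

        # Ordering enforces the quality gate between ingestion and load.
        t_extract >> t_check >> t_load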

Required Skills & Experience:

  • 4–8 years of experience in DevOps / DataOps / Platform Engineering roles.
  • Proficient in managing Kubernetes clusters and associated tooling (Helm, Kustomize, etc.).
  • Hands-on experience with CI/CD pipelines, especially using Jenkins, GitOps, and automated testing frameworks.
  • Strong scripting and automation skills in Python.
  • Experience with workflow orchestration tools like Apache Airflow and data ingestion tools like Airbyte.
  • Solid experience with Infrastructure as Code tools, preferably Terraform.
  • Familiarity with observability and monitoring tools such as Prometheus, Grafana, Datadog, or New Relic; a small Python instrumentation sketch follows this list.
  • Working knowledge of data platforms, particularly Snowflake, including query performance tuning and monitoring; see the Snowflake sketch after this list.
  • Strong debugging and problem-solving skills, especially in production data pipeline scenarios.
  • Excellent communication skills and ability to collaborate across engineering, operations, and analytics teams.
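
As an illustration of the observability skills above, the following sketch instruments a toy batch loop with the prometheus_client library. The metric names and scrape port are assumptions for the example only.

    import random
    import time

    from prometheus_client import Counter, Histogram, start_http_server

    ROWS_PROCESSED = Counter(
        "pipeline_rows_processed_total", "Rows processed by the pipeline"
    )
    BATCH_SECONDS = Histogram(
        "pipeline_batch_duration_seconds", "Wall-clock time per batch"
    )


    def process_batch() -> int:
        # Stand-in for real pipeline work.
        time.sleep(random.uniform(0.1, 0.5))
        return random.randint(100, 1000)


    if __name__ == "__main__":
        start_http_server(8000)  # Prometheus scrapes http://localhost:8000/metrics
        while True:
            with BATCH_SECONDS.time():  # records batch duration in the histogram
                rows = process_batch()
            ROWS_PROCESSED.inc(rows)  # counter feeds rate() panels in Grafana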
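
And as a flavor of the Snowflake performance and cost work, here is a sketch using snowflake-connector-python to surface the slowest queries of the last day from ACCOUNT_USAGE. The environment variables and the one-minute threshold are illustrative assumptions, and ACCOUNT_USAGE access requires appropriate privileges.

    import os

    import snowflake.connector

    SLOW_QUERY_MS = 60_000  # hypothetical threshold: flag queries over one minute

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    try:
        cur = conn.cursor()
        # QUERY_HISTORY records elapsed time, bytes scanned, and warehouse per
        # query: the raw material for both performance tuning and cost review.
        cur.execute(
            """
            SELECT query_id, warehouse_name, total_elapsed_time, bytes_scanned
            FROM snowflake.account_usage.query_history
            WHERE start_time > DATEADD('day', -1, CURRENT_TIMESTAMP())
              AND total_elapsed_time > %s
            ORDER BY total_elapsed_time DESC
            LIMIT 20
            """,
            (SLOW_QUERY_MS,),
        )
        for query_id, warehouse, elapsed_ms, scanned in cur:
            print(f"{query_id} on {warehouse}: {elapsed_ms} ms, {scanned} bytes scanned")
    finally:
        conn.close()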

Preferred Qualifications:

  • Experience with cloud platforms (AWS and/or GCP) and cloud-native DevOps practices.
  • Familiarity with data cataloging and lineage tools.
  • Exposure to container security, policy management, and data governance tools.
  • Background in data modeling, SQL optimization, or data warehousing concepts is a plus.

Benefits

Our Culture:

  • We have an autonomous, empowered work culture that encourages individuals to take ownership and grow quickly
  • Flat hierarchy with fast decision-making and a startup-oriented “get things done” culture
  • A strong, fun & positive environment with regular celebrations of our success; we pride ourselves on creating an inclusive, diverse & authentic workplace

At Velotio, we embrace diversity. Inclusion is a priority for us, and we are eager to foster an environment where everyone feels valued. We welcome applications regardless of ethnicity or cultural background, age, gender, nationality, religion, disability or sexual orientation.
