Senior MLOps Engineer

Bengaluru, Karnataka, India

Weekday

At Weekday, we help companies hire engineers who are vouched for by other software engineers. We enable engineers to earn passive income by leveraging and monetizing the unused information in their heads about the best people they have worked...

This role is for one of Weekday's clients
Min Experience: 9 years
Location: Bengaluru
Job Type: Full-time

About the Role

We are seeking a highly experienced Senior MLOps Engineer to join our data science and engineering team. In this role, you will be responsible for designing, building, and maintaining scalable MLOps pipelines to support the deployment, monitoring, and optimization of machine learning models in production. You will play a critical role in operationalizing machine learning workflows and ensuring the seamless integration of models with data systems and applications.

The ideal candidate will have a strong foundation in machine learning operations, ETL processes, big data frameworks, and cloud infrastructure (preferably AWS). You’ll work closely with data scientists, engineers, DevOps, and product teams to streamline the path from model development to production deployment, while ensuring performance, reliability, and compliance standards.

Key Responsibilities

  • End-to-End MLOps Lifecycle: Design, build, and manage machine learning pipelines for data preprocessing, model training, evaluation, deployment, and monitoring in production environments (a minimal example sketch follows this list).
  • Automation & CI/CD: Implement CI/CD pipelines for ML models to automate testing, versioning, and deployment workflows.
  • Data Engineering: Work with ETL and big data tools to build robust data ingestion and transformation pipelines that support ML use cases.
  • Model Monitoring: Establish performance tracking, model drift detection, and alerting mechanisms for deployed ML models to ensure continued reliability.
  • Infrastructure Management: Provision and manage cloud-based ML infrastructure using tools like AWS SageMaker, Lambda, ECS, S3, RDS, and EMR.
  • Collaboration: Partner with data scientists and software engineers to streamline the ML lifecycle, reduce friction in deployment, and improve iteration speed.
  • Governance & Compliance: Implement best practices in model governance, reproducibility, and auditability across different stages of the ML workflow.
  • Scalability & Optimization: Ensure the ML infrastructure is scalable, cost-efficient, and optimized for high availability and performance.
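
As an illustration of the lifecycle and CI/CD responsibilities above, the sketch below shows a tracked training step that logs metrics and registers a model for automated promotion. It is a minimal sketch only: it assumes MLflow and scikit-learn are available, and the tracking URI, experiment name, and model name ("churn-model") are hypothetical placeholders, not details of this role.

# Minimal sketch of a tracked, registry-backed training step (assumes MLflow + scikit-learn;
# all names and URIs below are hypothetical placeholders).
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

mlflow.set_tracking_uri("http://mlflow.example.internal:5000")  # hypothetical tracking server
mlflow.set_experiment("churn-model")                            # hypothetical experiment name

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

with mlflow.start_run():
    params = {"n_estimators": 200, "max_depth": 8}
    model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)

    # Log parameters and evaluation metrics so runs stay comparable and auditable.
    mlflow.log_params(params)
    mlflow.log_metric("test_accuracy", accuracy_score(y_test, model.predict(X_test)))

    # Register the model so a CI/CD job can promote it to staging or production.
    mlflow.sklearn.log_model(model, artifact_path="model", registered_model_name="churn-model")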

Required Skills & Experience

  • 9+ years of combined experience across machine learning engineering, DevOps, and data engineering roles.
  • Solid expertise in MLOps tools (MLflow, Kubeflow, Airflow, SageMaker Pipelines, etc.); see the orchestration sketch after this list.
  • Strong understanding of machine learning workflows, model lifecycle management, and production deployment.
  • Proficiency with ETL frameworks, data processing pipelines, and big data tools (e.g., Spark, Kafka, Hadoop).
  • Hands-on experience with cloud services — particularly AWS (EC2, S3, Lambda, SageMaker, EKS, EMR).
  • Proficient in scripting and programming languages like Python, Bash, and SQL.
  • Experience with containerization and orchestration tools like Docker and Kubernetes.
  • Strong problem-solving skills and ability to debug production ML systems.
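
As a small illustration of the orchestration and scripting skills above, the sketch below wires preprocessing, training, and evaluation tasks into an Airflow DAG. It is only a sketch: the DAG name, schedule, and task bodies are hypothetical placeholders, with no-op callables standing in for real data and training jobs.

# Minimal Airflow DAG sketch (Airflow 2.x); dag_id, schedule, and task bodies
# are hypothetical placeholders for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def preprocess():
    print("preprocess: build features")  # placeholder for a real ETL step


def train():
    print("train: fit and log the model")  # placeholder for a real training job


def evaluate():
    print("evaluate: compare against the current champion")  # placeholder


with DAG(
    dag_id="ml_training_pipeline",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",              # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    t_pre = PythonOperator(task_id="preprocess", python_callable=preprocess)
    t_train = PythonOperator(task_id="train", python_callable=train)
    t_eval = PythonOperator(task_id="evaluate", python_callable=evaluate)

    # Run the steps sequentially: preprocess -> train -> evaluate.
    t_pre >> t_train >> t_eval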

Preferred Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field.
  • Certifications in AWS or MLOps platforms are a plus.
  • Exposure to security, compliance, and regulatory practices in data and ML workflows.