SE II - Data Operations

India

Uplight

Uplight supports energy providers around the world with clean energy solutions for customer engagement and grid flexibility management.


The Position

We are seeking a seasoned engineer with a passion for changing the way millions of people save energy. You’ll work within the Deliver and Operate team to build and improve our platforms, delivering flexible and creative solutions to our utility partners and end users, and helping us achieve our ambitious goals for our business and the planet.

We are seeking a highly skilled and detail-oriented Software Engineer II for the Data Operations team to maintain our data infrastructure, pipelines, and workflows. You will play a key role in ensuring the smooth ingestion, transformation, validation, and delivery of data across systems. This role is ideal for someone with a strong understanding of data engineering and operational best practices who thrives in high-availability environments.

Responsibilities & Skills

You should:
  • Monitor and maintain data pipelines and ETL processes to ensure reliability and performance.
  • Automate routine data operations tasks and optimize workflows for scalability and efficiency.
  • Troubleshoot and resolve data-related issues, ensuring data quality and integrity.
  • Collaborate with data engineering, analytics, and DevOps teams to support data infrastructure.
  • Implement monitoring, alerting, and logging systems for data pipelines.
  • Maintain and improve data governance, access controls, and compliance with data policies.
  • Support deployment and configuration of data tools, services, and platforms.
  • Participate in on-call rotation and incident response related to data system outages or failures.
Required Skills:
  • 5+ years of experience in data operations, data engineering, or a related role.
  • Strong SQL skills and experience with relational databases (e.g., PostgreSQL, MySQL).
  • Proficiency with data pipeline tools (e.g., Apache Airflow).
  • Experience with cloud platforms (AWS, GCP) and cloud-based data services (e.g., Redshift, BigQuery).
  • Hands-on experience with scripting languages such as Python, Bash, or Shell.
  • Knowledge of version control (e.g., Git) and CI/CD workflows.
Qualifications 
  • Bachelor's degree in Computer Science, Engineering, Data Science, or a related field.
  • Experience with data observability tools (e.g., Splunk, DataDog).
  • Background in DevOps or SRE with focus on data systems.
  • Exposure to infrastructure-as-code (e.g., Terraform, CloudFormation).
  • Knowledge of streaming data platforms (e.g., Kafka, Spark Streaming).

Region: Asia/Pacific
Country: India
