DataOps Engineer

Ukraine - Remote

Intellectsoft

Trusted IT software development company. 17 years of innovation, user-centric designs, agile methods, and support for businesses and startups.



Join our team in building a modern, high-impact Analytical Platform for one of the largest integrated resort and entertainment companies in Southeast Asia. This platform will serve as a unified environment for data collection, transformation, analytics, and AI-driven insights—powering decisions across marketing, operations, gaming, and more.

You’ll work closely with Data Architects, Data Engineers, Business Analysts, and DevOps Engineers to design and implement scalable data solutions.

Requirements

  • Bachelor's or Master’s degree in Computer Science, Engineering, or a related field.
  • 3+ years of experience in DataOps, DevOps, or Data Engineering roles.
  • Proficiency in scripting languages (Python, Bash, etc.).
  • Strong experience with orchestration tools (e.g., Apache Airflow, Prefect, or Dagster).
  • Hands-on experience with cloud platforms (e.g., AWS, GCP, Azure) and cloud-native data tools.
  • Familiarity with CI/CD tools (e.g., GitLab CI, Jenkins, CircleCI).
  • Knowledge of containerization and orchestration technologies (Docker, Kubernetes).
  • Experience with infrastructure-as-code tools (Terraform, CloudFormation).
  • Strong understanding of data privacy, security, and compliance practices.
  • Experience with modern data warehouses (e.g., Snowflake, Redshift, Yellowbrick) and ETL/ELT tools.
  • Understanding of data governance, metadata management, and data cataloging tools.
  • Experience collaborating in Agile/Scrum teams and working with version-controlled data models (e.g., via Git).

Nice to have skills

  • Experience with real-time data processing (e.g., Kafka, Spark Streaming).
  • Familiarity with data observability platforms (e.g., Monte Carlo, Datadog, Great Expectations).
  • Experience working in regulated industries (e.g., gaming, finance, hospitality).

Responsibilities:

  • Design, build, and manage CI/CD pipelines for data applications, models, and pipelines.
  • Develop and maintain infrastructure-as-code (IaC) for data platform components.
  • Automate data quality checks, validation, and monitoring processes.
  • Collaborate with data engineers and analysts to optimize data ingestion and transformation pipelines.
  • Implement robust logging, alerting, and observability tools for data pipelines.
  • Manage orchestration frameworks (e.g., Airflow) and ensure timely execution of workflows.
  • Maintain compliance with data governance, privacy, and security policies.
  • Support and troubleshoot production data issues and infrastructure outages.
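As a small illustration of the "automate data quality checks" responsibility above, here is a minimal sketch of a completeness check that could gate a pipeline step. The column names, sample data, and threshold are illustrative assumptions, not part of the role description:

```python
# Minimal sketch of an automated data-quality gate. Column names,
# sample rows, and the threshold below are illustrative assumptions.

def check_not_null(rows, column):
    """Return the fraction of rows where `column` is present and non-null."""
    if not rows:
        return 0.0
    valid = sum(1 for row in rows if row.get(column) is not None)
    return valid / len(rows)

def run_quality_gate(rows, column, threshold=0.99):
    """Report whether the column's completeness meets the allowed threshold."""
    completeness = check_not_null(rows, column)
    return {
        "column": column,
        "completeness": completeness,
        "passed": completeness >= threshold,
    }

if __name__ == "__main__":
    sample = [{"user_id": 1}, {"user_id": 2}, {"user_id": None}]
    print(run_quality_gate(sample, "user_id", threshold=0.9))
```

In practice a check like this would typically run as a task inside the orchestrator (e.g. an Airflow task that fails the DAG run when `passed` is false), with results shipped to the team's observability tooling.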

Benefits

  • 35 absence days per year for work-life balance
  • Udemy courses of your choice
  • English courses with native speakers
  • Regular soft skills training
  • Excellence Centers meetups
  • Online/offline team-buildings
  • Business trips

Category: Engineering Jobs

Tags: Agile Airflow AWS Azure CI/CD CloudFormation Computer Science Dagster Data governance DataOps Data pipelines Data quality DevOps Docker ELT Engineering ETL Finance GCP Git GitLab Jenkins Kafka Kubernetes Monte Carlo Pipelines Privacy Python Redshift Scrum Security Snowflake Spark Streaming Terraform

Regions: Remote/Anywhere Europe
Country: Ukraine
