Data Engineer

Kyiv, Kyiv City

Ajax Systems

Ajax alarm system includes everything you need to protect your property | Professional home security system that combines CCTV, fire alarm and automation

The Data Engineer will be responsible for designing, building, and optimizing scalable data pipelines and cloud-based infrastructure under the guidance of the Lead Data Platform Engineer. The role involves working with Databricks, SAP Data Sphere, and AWS to enable seamless data ingestion, transformation, and integration across cloud environments, supporting enterprise-wide analytics and a scalable, efficient, and secure data architecture. The Data Engineer will collaborate with cross-functional teams to support analytics, reporting, and data-driven decision-making while upholding performance, security, and data governance best practices.

Key Responsibilities:

    Data Pipeline Development & Optimization:
  • Design, develop, and maintain ETL/ELT pipelines for batch and streaming data processing.
  • Implement data transformations, cleansing, and enrichment using Databricks (Spark, PySpark, SQL, Delta Lake, MLflow) and SAP Data Sphere (Data Builder, Business Builder).
  • Automate pipeline deployment and orchestration.
  • Ensure data quality, validation, and consistency by implementing robust monitoring frameworks.

    Cloud Data Platform Implementation & Maintenance:
  • Develop and maintain data lakehouse solutions on AWS.
  • Optimize Databricks workflows, job clusters, and cost-efficiency strategies.
  • Implement data governance, lineage tracking, and access controls using Databricks Unity Catalog.

    SAP Data Sphere & Data Integration:
  • Build real-time and batch data integrations between SAP Data Sphere and cloud-based platforms.
  • Develop logical and physical data models within SAP Data Sphere, ensuring scalability and efficiency.
  • Enable cross-system data harmonization and replication between SAP and non-SAP environments.

    Performance Monitoring & Troubleshooting:
  • Monitor data pipeline performance, identify bottlenecks, and optimize query execution.
  • Implement logging, alerting, and monitoring.
  • Work with the Lead Data Platform Engineer to drive continuous improvements in scalability, observability, and security.

    Collaboration & Continuous Learning:
  • Work closely with Architects, Data Analysts, and BI teams to support analytical solutions.
  • Follow best practices in DevOps, CI/CD, and infrastructure-as-code (Terraform).
  • Actively learn and apply the latest advancements in cloud, data engineering, and SAP Data Sphere.

Key Requirements:

  • 3+ years of experience in data engineering, cloud platforms, and distributed systems.
  • Proficiency in SQL, Python, and Spark.
  • Experience with Databricks (Delta Lake, Spark, MLflow) and AWS data services.
  • Experience with SAP Data Sphere, SAP data modeling, and integration frameworks (OData, API management) is a plus.
  • Familiarity with data pipeline orchestration tools.
  • Experience with DevOps & CI/CD pipelines (Terraform, GitHub Actions, Jenkins).
  • Strong problem-solving skills and a passion for scalable and efficient data processing.

We offer:

  • A dynamic team working within a zero-bullshit culture;
  • Working in a comfortable office at UNIT.City (Kyiv). The office is safe as it has a bomb shelter;
  • Reimbursement for external training for professional development;
  • Ajax's security system kit to use;
  • Official employment with Diia City;
  • Medical insurance;
  • Flexible work schedule.

The Data Engineer plays a vital role in building and maintaining scalable, efficient, and secure data pipelines, ensuring seamless SAP and cloud data integration. This role directly supports the Lead Data Platform Engineer in driving enterprise-wide analytics, AI/ML innovation, and data-driven decision-making.
