Data Engineer
Ukraine - Remote
Intellectsoft
Trusted IT software development company. 17 years of innovation, user-centric designs, agile methods, and support for businesses and startups.
Join our team in building a modern, high-impact Analytical Platform for one of the largest integrated resort and entertainment companies in Southeast Asia. This platform will serve as a unified environment for data collection, transformation, analytics, and AI-driven insights, powering decisions across marketing, operations, gaming, and more.
You’ll work closely with Data Architects, ML Engineers, Business Analysts, and DevOps to design and implement scalable data solutions.
Requirements
- 5+ years of experience in data engineering or backend data development.
- Strong knowledge of data pipeline design, integration frameworks, and ETL tools.
- Experience working with cloud or hybrid data architectures.
- Proficiency in SQL and at least one programming language (e.g., Python, Scala).
- Hands-on experience with distributed data processing (e.g., Spark, Flink) is a plus.
- Familiarity with data lake, data warehouse, or lakehouse architectures.
- Experience with real-time data streaming and ingestion frameworks is a strong advantage.
- Understanding of data security, privacy, and compliance best practices.
- Experience working in Agile/Scrum environments.
Nice-to-have skills
- Experience with modern open-source tools (e.g., Airflow, dbt, Delta Lake, Apache Kafka).
- Exposure to machine learning pipelines or working alongside ML teams.
- Familiarity with BI tools and data visualization concepts.
- Experience working in regulated industries (e.g., gaming, finance, hospitality).
Responsibilities
- Design, build, and maintain scalable and reliable data pipelines for ingesting data from various sources (internal systems, APIs, external platforms).
- Work with structured, semi-structured, and unstructured data, ensuring data quality, consistency, and integrity.
- Develop and maintain ETL/ELT processes to support real-time and batch analytics.
- Collaborate with Data Architects to design optimal data models and storage structures for analytics workloads.
- Implement data validation, deduplication, and transformation logic.
- Contribute to the definition of data governance, security, and access policies.
- Participate in platform scaling and performance optimization initiatives.
- Work closely with business and analytics teams to understand data needs and translate them into technical solutions.
Benefits
- 35 absence days per year for work-life balance
- Udemy courses of your choice
- English courses with a native speaker
- Regular soft-skills training
- Excellence Centers meetups
- Online and offline team-building events
- Business trips