Senior Data Engineer
Bulgaria, Lithuania, Poland, Romania
Exadel
Advance your business through technology and pave the way to becoming a digital leader with Exadel, an enterprise software development and consulting company. We’re searching for a Senior Data Engineer to turn data into insight at scale. You’ll work alongside a collaborative team and contribute to solutions that go live and make a difference.
Work at Exadel - Who We Are
We don’t just follow trends—we help define them. For 25+ years, Exadel has transformed global enterprises. Now, we’re leading the charge in AI-driven solutions that scale with impact. And it’s our people who make it happen—driven, collaborative, and always learning.
About Our Customer
The customer is an educational technology company based in Salt Lake City, Utah. It is the developer and publisher of a web-based learning management system and a massive open online course (MOOC) platform. Millions of students and teachers around the world use the customer’s products.
Requirements
- 5+ years of experience as a Data Engineer or in a similar data-focused engineering role
- Strong hands-on experience with Snowflake for data warehousing, including complex queries and performance optimization
- Proficiency in Scala, Java, or Python
- Experience working with Apache Spark for distributed data processing
- Hands-on experience with Amazon Web Services (AWS)
- Experience with Infrastructure as Code (IaC), particularly using Terraform
- Understanding of Change Data Capture (CDC) architectures and tools such as Debezium
- Familiarity with Kafka-based streaming data pipelines
- Strong problem-solving skills in distributed systems environments
- Comfortable working in agile development teams
- Effective communication skills with both technical and non-technical stakeholders
Nice to Have
- Experience with orchestration tools such as Apache Airflow or AWS Step Functions
- Familiarity with CI/CD pipelines in cloud-native environments
- Ability to write clean, maintainable, and well-documented code following engineering best practices
English level
Upper-Intermediate+
Responsibilities
- Design, build, and maintain scalable data pipelines that support real-time and batch data processing
- Implement and optimize CDC-based workflows using Debezium for near real-time data synchronization across systems
- Develop and manage infrastructure as code using Terraform to automate cloud deployments
- Ensure the efficient ingestion, transformation, and storage of data in Snowflake
- Monitor and troubleshoot data workflows to ensure high performance and reliability
- Collaborate with cross-functional teams, including data scientists, architects, and business stakeholders, to translate requirements into robust data solutions
- Combine data from multiple sources and ensure its quality, consistency, and accuracy
- Apply best practices in data modeling, versioning, and schema management
- Contribute to the evolution of data architecture and engineering standards
- Maintain clear documentation of data processes, pipelines, and infrastructure configurations