Lead Data Engineer
Remote · Full Time · Senior-level / Expert · EUR 43K–60K
LITIT
LITIT drives innovation in IT solutions for the DACH region, serving diverse industries and nurturing Lithuanian talent for a brighter digital future.
ABOUT THE COMPANY
LITIT, a joint venture between NTT DATA and Reiz Tech, is a company with deep-rooted industry know-how, dedicated to innovation within the IT sector. Its primary focus is delivering high-quality solutions in the DACH region. With a commitment to excellence, LITIT combines the best of German precision, Japanese work ethics, and Lithuanian talent to provide unparalleled IT service and support to its clients.
ABOUT THE CLIENT
A globally recognized leader in high-performance engineering, specializing in advanced propulsion and power systems for the aerospace, defense, and energy sectors. With a strong focus on innovation and precision, the company develops complex technologies that enhance efficiency, reliability, and sustainability. Its expertise in digital transformation and data-driven solutions ensures continuous improvement in performance and operational excellence, supporting critical industries worldwide.
ABOUT THE ROLE
As a Lead Data Engineer, you will be at the forefront of designing, building, and maintaining scalable data architectures that support our organization’s data-driven initiatives. You will lead a team of data engineers, ensuring the development of robust, reliable, and high-performance data pipelines. Collaborating closely with data scientists, analytics teams, and key stakeholders, you will enhance data accessibility, optimize data workflows, and drive innovation in data engineering best practices.
RESPONSIBILITIES
Design, develop, and maintain scalable data pipelines and architectures to support analytical and machine learning use cases.
Lead a team of data engineers, providing mentorship, technical guidance, and fostering a culture of continuous learning.
Implement best practices for data management, integration, governance, and security across various platforms.
Optimize data processing workflows to enhance performance, reliability, and scalability.
Collaborate with data scientists and analysts to understand business needs and deliver efficient data solutions.
Ensure data quality and consistency across ETL processes and data platforms.
Drive the adoption of new technologies and methodologies to improve data engineering capabilities.
Implement and manage CI/CD pipelines for data workflows, ensuring seamless deployments and automation.
REQUIREMENTS
Experience in developing data pipelines and scalable data engineering solutions.
Proficiency with AWS services (e.g., SageMaker, Glue, EMR).
Strong programming skills in Python and R.
Experience with CI/CD pipelines (e.g., GitHub Actions) for data workflows.
Solid understanding of data modeling and ETL processes.
Familiarity with machine learning concepts and their integration with data engineering.
Strong communication skills and the ability to work effectively with cross-functional teams.
WHAT WE OFFER
Learning opportunities with compensated certificates, learning lunches, and language lessons.
Opportunity to switch projects after one year.
Compensated team-building and victory-celebration events every quarter.
A pet-friendly office in Vilnius, Lithuania, with themed lunches.
Remote work opportunities.
Flexible time off depending on the project.
Seasonal activities with colleagues.
Health insurance for Lithuanian residents.
Referral bonuses.
Loyalty days.
Recognition of important occasions in your life.
Tags: Architecture, AWS, CI/CD, Data management, Data pipelines, Data quality, Engineering, ETL, GitHub, Machine Learning, Pipelines, Python, R, SageMaker, Security
Perks/benefits: Career development, Flex vacation, Pet friendly, Team events