Data Engineering Lead
Sofia City, Bulgaria
Sofia Stars
Sofia Stars is an operational services company based in Sofia. We offer a range of solutions for online businesses, including R&D, Marketing, Customer Support, KYC, Risk, and Anti-Fraud services. With 300+ bright stars on our team, we deliver secure, reliable solutions with a touch of quality that shines. When you join us, you’ll be part of a place where ideas light up, and growth isn’t just a promise—it’s a journey.
The Data Engineering Lead will spearhead our data engineering efforts, drawing on expertise in Apache Airflow, Snowflake, and team leadership.
This role involves not only hands-on technical responsibilities but also leadership in guiding, mentoring, and managing a team of data engineers to architect and maintain robust data solutions.
You could be our first Data person in Sofia! Do you have what it takes to lead a team of 5? If all of that sounds interesting, keep reading.
✅ Responsibilities:
- Technical Leadership: Lead the design, development, and maintenance of scalable data pipelines using Apache Airflow and Snowflake. Provide technical guidance, best practices, and mentorship to the data engineering team.
- Team Management: Manage a team of data engineers, fostering a collaborative and innovative environment. Assign tasks, set goals, and conduct performance evaluations to ensure the team's success and growth.
- Architecture and Strategy: Drive the architecture and strategy for data infrastructure, ensuring scalability, reliability, and efficiency. Collaborate with cross-functional teams to align data engineering initiatives with business objectives.
- Data Warehousing Expertise: Oversee Snowflake data warehouse management, including schema design, optimization, security, and performance tuning. Ensure adherence to best practices and governance standards.
- ETL Implementation: Lead the implementation of complex data workflows and scheduling using Airflow, ensuring robustness, monitoring, and optimization.
- Collaboration and Communication: Collaborate with stakeholders, data scientists, analysts, and other teams to understand data requirements. Communicate effectively to translate business needs into technical solutions.
- Continuous Improvement: Drive continuous improvement initiatives, identify areas for enhancement, and implement best practices to optimize data engineering processes and workflows.
✅ Requirements:
- Technical Expertise: Proven experience with Apache Airflow (MWAA) and Snowflake, including designing and implementing scalable data pipelines and data warehouse solutions.
- Cloud-Based Stack: Hands-on experience with AWS services (especially S3 and MWAA), Python, and SQL within cloud-based environments.
- Collaboration & Communication: Strong interpersonal skills with the ability to work effectively across cross-functional teams and communicate complex technical concepts to both technical and non-technical stakeholders.
- Strategic Mindset: Ability to align technical solutions with business goals, contributing to innovation and efficiency across data engineering processes.
- Analytical & Problem-Solving Skills: Strong troubleshooting abilities, with a focus on optimizing data workflows and maintaining data accuracy and integrity.
❗️IMPORTANT: This position is office-based; our offices are located at 51 Cherni Vrah Blvd., Office „X“/Energy Tower.
Join Sofia Stars and rock with us! 🚀
Ready to shine? Let’s make it real.