Data Engineering Lead

Birkirkara, Birkirkara, Malta

BrainRocket

BrainRocket — committed developers at your disposal. We are an innovative software development company delivering IT solutions and services.



We’re BrainRocket — an international software development and digital solutions company driven by 1,300 talented professionals across Cyprus, Poland and Portugal.
Here, everything moves at rocket speed: driving innovation, pioneering projects, and fast-tracking careers.
Together, we turn ideas into action—let’s get started!

The Data Engineering Lead will spearhead our data engineering efforts, drawing on expertise in Apache Airflow, Snowflake, and team leadership.

This role involves not only hands-on technical responsibilities but also leadership in guiding, mentoring, and managing a team of data engineers to architect and maintain robust data solutions.

 

✅ Responsibilities:

✔️Leadership: Lead the design, development, and maintenance of scalable data pipelines using Apache Airflow and Snowflake. Provide technical guidance, best practices, and mentorship to the data engineering team.

✔️Team Management: Manage a team of data engineers, fostering a collaborative and innovative environment. Assign tasks, set goals, and conduct performance evaluations to ensure the team's success and growth.

✔️Architecture and Strategy: Drive the architecture and strategy for data infrastructure, ensuring scalability, reliability, and efficiency. Collaborate with cross-functional teams to align data engineering initiatives with business objectives.

✔️Data Warehousing Expertise: Oversee Snowflake data warehouse management, including schema design, optimization, security, and performance tuning. Ensure adherence to best practices and governance standards.

✔️ETL Implementation: Lead the implementation of complex data workflows and scheduling using Airflow, ensuring robustness, monitoring, and optimization.

✔️Collaboration and Communication: Collaborate with stakeholders, data scientists, analysts, and other teams to understand data requirements. Communicate effectively to translate business needs into technical solutions.

✔️Continuous Improvement: Drive continuous improvement initiatives, identify areas for enhancement, and implement best practices to optimize data engineering processes and workflows.

 

✅ Requirements:

✔️Technical Expertise: Proven experience with Apache Airflow (MWAA) and Snowflake, including designing and implementing scalable data pipelines and data warehouse solutions.

✔️Cloud-Based Stack: Hands-on experience with AWS services (especially S3 and MWAA), Python, and SQL within cloud-based environments.

✔️Collaboration & Communication: Strong interpersonal skills with the ability to work effectively across cross-functional teams and communicate complex technical concepts to both technical and non-technical stakeholders.

✔️Strategic Mindset: Ability to align technical solutions with business goals, contributing to innovation and efficiency across data engineering processes.

✔️Analytical & Problem-Solving Skills: Strong troubleshooting abilities, with a focus on optimizing data workflows and maintaining data accuracy and integrity.
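To illustrate the kind of work this stack implies (not a BrainRocket artifact): a common pattern with S3, Airflow, and Snowflake is a task that issues a `COPY INTO` statement to load staged S3 files into a warehouse table. The sketch below is a minimal, hypothetical example; the table and stage names are illustrative only.

```python
# Hypothetical sketch of a Snowflake bulk-load helper, the kind of logic
# an Airflow task in this stack might run. Names below are illustrative,
# not taken from the posting.

def build_copy_sql(table: str, stage: str, file_format: str = "CSV") -> str:
    """Return a Snowflake COPY INTO statement for an external S3 stage."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"FILE_FORMAT = (TYPE = {file_format}) "
        f"ON_ERROR = 'ABORT_STATEMENT'"  # fail fast so Airflow can retry
    )

# Example: load staged event files into an analytics table.
sql = build_copy_sql("analytics.events", "raw_s3_stage")
print(sql)
```

In practice such a statement would be executed via a Snowflake connection from an Airflow (MWAA) DAG, with scheduling, retries, and monitoring handled by the orchestrator.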

 

✅ Why join us?

✔️Opportunities for career progression in a fast-growing European company

✔️Private Health Insurance

✔️Corporate Discounts

✔️Regular team & company events

✔️People-oriented management without bureaucracy

✔️24 days of paid holidays

✔️Friendly team

✔️Full-time, in-house, standard business hours

✔️Competitive salary

Bold moves start here. Make yours. Apply today! 🚀





