Data Engineer (Cloud: GCP)
Porto, Portugal
Alter Solutions
Looking for an IT partner? We're an IT consulting company with expertise in Nearshore software development and Cybersecurity, with offices in 13 countries.
Job Description
We are seeking a skilled and motivated Data Engineer to join our dynamic team. As a Data Engineer, you will play a key role in understanding business and technological challenges, developing efficient data pipelines, and ensuring the smooth deployment of solutions. You will apply industry best practices while helping evolve the data architecture and infrastructure of our company.
Key Responsibilities:
Project Understanding and Communication:
- Understand business challenges from the user’s perspective and communicate clearly to ensure a shared, deep understanding of the issues.
- Collaborate with the Data Architect to fully comprehend the provided architecture and ensure alignment with your work.
- Communicate technical solutions to your peers and provide regular updates to the Project Manager overseeing the project.
Development:
- Write clear interface contracts for new or updated features and ensure they are communicated effectively.
- Develop data pipelines based on the defined architecture (a purely illustrative sketch follows this list).
- Apply best practices to ensure quality code and maintainable solutions.
- Deploy infrastructure as requested, with a particular focus on using Terraform.
- Conduct peer code reviews and actively review colleagues’ code before new versions are merged.
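For flavor only, and not part of the responsibilities themselves: a minimal sketch of the kind of Airflow data pipeline this role builds. It assumes Airflow 2.4+, and the DAG id, task names, and data are hypothetical.

```python
# Purely illustrative Airflow 2.x DAG; names and logic are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from a source system.
    return [{"id": 1, "value": 42}]


def transform(**context):
    # Placeholder: apply business rules to the extracted records.
    records = context["ti"].xcom_pull(task_ids="extract")
    return [{**r, "value": r["value"] * 2} for r in records]


def load(**context):
    # Placeholder: write the transformed records to the target warehouse.
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

In a real project the extract/transform/load callables would read from and write to actual sources and sinks, and the DAG structure would follow the architecture defined by the Data Architect.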
Testing:
- Collaborate with the Project Manager to define tests based on functional and technical requirements.
- Perform tests and regularly communicate the results to the team.
- Document and summarize test results for ongoing monitoring and improvements.
Deployments:
- Present completed work to the Data Architect and Lead DataOps during Deployment Reviews.
- Track and communicate any potential issues during the active monitoring phase after deployment.
- Diligently apply the deployment process, including logging and monitoring strategies.
Qualifications
Required Hard Skills:
- Google Cloud Platform (GCP): Strong knowledge of GCP and at least one year of hands-on experience with its services (see the illustrative sketch after this list).
- Azure Cloud Platform: General knowledge of Azure is a plus.
- Apache Airflow: Minimum of two years of experience with Airflow orchestration; experience with Google Composer is an advantage.
- Terraform: Hands-on experience deploying and managing infrastructure as code with Terraform.
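Purely as an illustration of the kind of hands-on GCP work referenced above (not an excerpt from any project), a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical.

```python
# Purely illustrative BigQuery query from Python; project/dataset/table are hypothetical.
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="my-gcp-project")  # uses Application Default Credentials

query = """
    SELECT status, COUNT(*) AS n
    FROM `my-gcp-project.analytics.orders`
    GROUP BY status
    ORDER BY n DESC
"""

job = client.query(query)   # starts an asynchronous query job
for row in job.result():    # blocks until the job finishes, then iterates rows
    print(row["status"], row["n"])
```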
Desired Soft Skills:
- Strong communication skills for effective collaboration with cross-functional teams.
- Ability to tackle problems from a user perspective and ensure clear understanding across technical teams.
- Strong problem-solving abilities and a proactive approach to resolving issues.
- Detail-oriented, results-driven, and capable of working independently and as part of a team.
Additional Information
If you have experience with data pipelines, cloud technologies, and orchestrators like Apache Airflow, and you're looking for a new challenge, we’d love to hear from you!