AWS Data Engineer – English – Remote

Spain - Remote

IRIUM

At Irium we are experts in designing technology solutions, delivering managed services, providing IT support, and offering consultancy for your company's digital transformation.



🚀 At IRIUM we care that you never stop chasing your dreams. Get ready to conquer your goals, and always remember to enjoy the journey.
We are looking for a Senior Data Engineer to contribute to the development and optimization of data infrastructure. The role requires proficiency in DBT, Snowflake, GitHub, and additional experience with Apache Airflow, Python, and AWS. This position demands senior-level expertise, fluency in English, and the ability to work remotely from Spain.

🔍 What are we looking for?

MUST HAVE:
 
    • DBT: Experience in developing and maintaining data transformation workflows using DBT.
    • Snowflake: Proficiency in Snowflake for data storage and integration.
    • GitHub: Strong skills in version control and collaboration using GitHub.
    • English: C1 level required.

RESPONSIBILITIES:
 
  • Data Pipeline Development: Design, develop, and maintain robust data pipelines using DBT, Apache Airflow, and Python.
  • Data Integration: Integrate data from various sources into Snowflake, ensuring data quality and consistency.
  • Collaboration: Work closely with Data Scientists and ML Engineers to ensure seamless data processing and integration.
  • Optimization: Optimize data storage and retrieval processes to enhance performance and scalability.
  • Version Control: Utilize GitHub for version control and collaboration on data engineering projects.
  • Cloud Infrastructure: Manage and optimize AWS cloud infrastructure for data processing and storage.
  • Troubleshooting: Identify and resolve issues related to data pipelines and infrastructure.
  • Documentation: Maintain comprehensive documentation of data processes, pipelines, and infrastructure.

QUALIFICATIONS:
 
  • Education: Degree in Computer Science, Data Engineering, or a related field.
  • Experience: Minimum 5 years of experience in data engineering, with a strong focus on DBT, Snowflake, and GitHub.
  • Technical Skills: Proficiency in Python, Apache Airflow, and AWS.
  • Communication: Fluency in English, with excellent communication and collaboration skills.
  • Problem-Solving: Strong analytical and problem-solving skills, with attention to detail.




NICE TO HAVE: AWS


⭐ What do we offer?
 
• Workplace: 100% work from home is allowed (in Spain)
• Permanent contract with IRIUM
• Flexible compensation (restaurant, transport, and childcare vouchers) ✌
• Salary range: depending on skills and experience (40-46K)
• 23 days of vacation 🏕️
• Good working environment 🙍‍♀️🙍‍♂️
• Unlimited, all-you-can-learn access to cutting-edge technology training. 📚
• Employee benefits club with direct discounts and thousands of offers on brands, hotels, travel agencies, cinemas, clothing… 💰

✨ You will become part of a great team of people who will always be ready to help you.
IRIUM is a company made up of curious, dynamic, and resourceful professionals. Our values are responsibility and commitment to a job well done; this is the spirit we look for at IRIUM, whatever your age. If you recognize yourself in this, this is your company!
We can build the future together. Shall we talk?
🟢🔵🟣