735 - Senior Data Engineer (7+ yrs · Databricks/Warehousing/Retail)

Córdoba, Córdoba Province, Argentina

Darwoft

Darwoft is an industry-leading custom software development company specializing in mobile and web app UX and development.



Data Engineer (Databricks) | Retail Industry

We're looking for an experienced Data Engineer with strong hands-on expertise in Databricks to join our team and support key initiatives in the Retail sector. You'll be responsible for building and maintaining scalable, high-performance data solutions that help drive intelligent decision-making across the business.

If you thrive working with large datasets, love building robust pipelines, and have a passion for solving complex problems in the retail world, this role is for you.

What You'll Do:

  • Design, develop, and optimize data pipelines using Databricks to support large-scale data processing and analytics (see the sketch after this list).

  • Collaborate with Tech Leads, Product Managers, and fellow Engineers to integrate and transform data from multiple sources (structured and unstructured).

  • Develop and maintain high-quality, production-grade Databricks workflows to support analytics, reporting, and machine learning initiatives.

  • Ensure all solutions are built with scalability, performance, and cost-efficiency in mind, tailored to retail-specific needs like sales, inventory, and customer behavior.

  • Monitor and improve data quality, reliability, and observability across data systems.

  • Drive best practices in data engineering, automation, and continuous improvement.
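
As an illustration of the kind of work involved, here is a minimal PySpark sketch of a Databricks pipeline that rolls raw point-of-sale transactions up into a daily sales table. It is only a sketch: the table names, columns, and job name are hypothetical, not taken from an actual Darwoft project.

    # Hypothetical sketch: aggregate raw POS transactions into a daily sales
    # table on Databricks. Table and column names are illustrative only.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_sales_rollup").getOrCreate()

    # Read raw point-of-sale transactions (Delta table, hypothetical name).
    transactions = spark.read.table("retail_raw.pos_transactions")

    # Roll revenue and units up to one row per store per day.
    daily_sales = (
        transactions
        .withColumn("sale_date", F.to_date("sold_at"))
        .groupBy("store_id", "sale_date")
        .agg(
            F.sum("amount").alias("revenue"),
            F.count("*").alias("units_sold"),
        )
    )

    # Overwrite the reporting table so downstream dashboards stay current.
    (
        daily_sales.write
        .format("delta")
        .mode("overwrite")
        .saveAsTable("retail_reporting.daily_sales")
    )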

What We're Looking For:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

  • 7+ years of experience in data engineering, with a proven track record in Databricks and Spark.

  • Deep understanding of data warehousing concepts, data lake architecture, and ETL/ELT workflows.

  • Hands-on experience with cloud platforms such as AWS or GCP.

  • Experience working with retail datasets: transactions, supply chain, point-of-sale systems, etc.

  • Strong experience with orchestration tools like Apache Airflow (see the DAG sketch after this list).

  • Solid programming skills in Python and SQL.

  • Ability to work collaboratively in cross-functional teams, with clear communication and a problem-solving mindset.
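
For reference, a minimal Airflow DAG that triggers an existing Databricks job on a daily schedule might look like the sketch below. It assumes Airflow 2.4+ with the Databricks provider installed; the DAG ID, connection ID, and job ID are hypothetical.

    # Hypothetical sketch: schedule an existing Databricks job from Airflow
    # using the official Databricks provider. IDs and names are illustrative.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

    with DAG(
        dag_id="daily_sales_rollup",
        start_date=datetime(2024, 1, 1),
        schedule="0 6 * * *",  # run every day at 06:00
        catchup=False,
    ) as dag:
        DatabricksRunNowOperator(
            task_id="run_daily_sales_rollup",
            databricks_conn_id="databricks_default",  # hypothetical connection
            job_id=12345,  # hypothetical Databricks job ID
        )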




    Apply now and join the Darwoft family!
    talento@darwoft.com
    Questions? Follow the recruiter:

    https://www.linkedin.com/in/hernanvietto/

