Data Engineer – Data Pipelines & Modeling

Buenos Aires


Ryz Labs

Unlock the power of LatAm's elite nearshore talent with Ryz Labs. Our top-tier staff augmentation services provide access to the brightest minds in software development, IT, sales, ops, and CS. Elevate your team and achieve unparalleled results...


Apply now

This position is only for professionals based in Argentina or Uruguay
We're looking for a data engineer to join one of our clients' teams. You will help enhance and scale the data transformation and modeling layer. This role focuses on building robust, maintainable pipelines using dbt, Snowflake, and Airflow to support analytics and downstream applications. You'll work closely with the data, analytics, and software engineering teams to create scalable data models, improve pipeline orchestration, and ensure trusted, high-quality data delivery.
Key Responsibilities:
- Design, implement, and optimize data pipelines that extract, transform, and load data into Snowflake from multiple sources using Airflow and AWS services
- Build modular, well-documented dbt models with strong test coverage to serve business reporting, lifecycle marketing, and experimentation use cases
- Partner with analytics and business stakeholders to define source-to-target transformations and implement them in dbt
- Maintain and improve our orchestration layer (Airflow/Astronomer) to ensure reliability, visibility, and efficient dependency management
- Collaborate on data model design best practices, including dimensional modeling, naming conventions, and versioning strategies
Core Skills & Experience:
- dbt: Hands-on experience developing dbt models at scale, including use of macros, snapshots, testing frameworks, and documentation; familiarity with dbt Cloud or CLI workflows
- Snowflake: Strong SQL skills and understanding of Snowflake architecture, including query performance tuning, cost optimization, and use of semi-structured data
- Airflow: Solid experience managing Airflow DAGs, scheduling jobs, and implementing retry logic and failure handling; familiarity with Astronomer is a plus
- Data Modeling: Proficient in dimensional modeling and building reusable data marts that support analytics and operational use cases
- AWS (Nice to Have): Familiarity with AWS services such as DMS, Kinesis, and Firehose for ingesting and transforming data
- Segment (Nice to Have): Familiarity with event data and related flows, piping data in and out of Segment


Category: Engineering Jobs

Tags: Airflow Architecture AWS Data pipelines dbt Engineering Firehose Kinesis Model design Pipelines Snowflake SQL Testing

Region: South America
Country: Argentina
