786 - Data Engineer Ssr (Python/SQL/Snowflake/AWS) · LATAM
Buenos Aires, Argentina
Darwoft
Darwoft is an industry-leading custom software development company specializing in mobile and web app UX and development.
Senior Data Engineer (Snowflake/Airflow/Python/AWS) · LATAM
- Location: Anywhere in LATAM
- Job Type: Remote
- Project: Data Engineering for US-based Health Client
- Time Zone: GMT-3 to GMT-5 preferred
- English Level: B2 / C1
Get to Know Us
At Darwoft, we build digital products with heart. We're a Latin American tech company focused on creating impactful, human-centered software in partnership with companies around the globe. Our remote-first culture is based on trust, continuous learning, and collaboration.
We're passionate about tech, but even more about people. If you're looking to join a team where your ideas matter and your impact is real, welcome to Darwoft.
We're Looking for a Senior Data Engineer
You'll be joining a fast-moving, collaborative environment where your role will focus on designing and optimizing data pipelines using Python, Snowflake, Airflow, and AWS. You'll work closely with data scientists and analysts to build scalable solutions that support critical business decisions.
What You'll Be Doing
70% Data Pipeline Development
- Design and optimize ingestion, storage, and transformation pipelines using Python, SQL, Snowflake, and Snowpark (see the illustrative sketch after this list)
- Build and enhance real-time data pipelines with AWS Lambda and Snowpipe
- Collaborate with data scientists and analysts to deliver business-ready datasets
- Create internal and external data views (logical, materialized, and secure)
- Test and evaluate new features in Snowflake, Airflow, and AWS for proofs of concept
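To make the pipeline work above concrete, here is a minimal, illustrative sketch of a daily Airflow DAG that copies staged S3 files into Snowflake. Every name in it (connection details, the S3_EVENTS_STAGE stage, the RAW.EVENTS table, the DAG id) is a hypothetical placeholder rather than the client's actual environment, and the Airflow 2.x APIs shown are assumptions about the stack, not a prescribed implementation.

    # Illustrative sketch only: a daily Airflow DAG that copies newly staged S3
    # files into a Snowflake table. All object names are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    import snowflake.connector


    def load_events_from_stage():
        # In practice, credentials would come from a secrets backend, not literals.
        conn = snowflake.connector.connect(
            account="my_account",    # hypothetical account identifier
            user="etl_user",         # hypothetical service user
            password="***",
            warehouse="ETL_WH",
            database="ANALYTICS",
            schema="RAW",
        )
        try:
            # COPY INTO loads only files from the external stage that have not
            # already been ingested into the target table.
            conn.cursor().execute(
                "COPY INTO RAW.EVENTS FROM @S3_EVENTS_STAGE "
                "FILE_FORMAT = (TYPE = PARQUET)"
            )
        finally:
            conn.close()


    with DAG(
        dag_id="s3_to_snowflake_events",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="copy_events_into_snowflake",
            python_callable=load_events_from_stage,
        )

In a real-time variant of the same flow, a Snowpipe object with auto-ingest from S3 event notifications (or an AWS Lambda triggering the load) would replace the scheduled COPY INTO step.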
15% Code Review
- Participate in peer code reviews and provide constructive feedback
- Maintain clean, efficient, and scalable code
10% Agile Collaboration
- Join sprint ceremonies (planning, stand-ups, reviews, retrospectives)
- Ensure alignment with stakeholders on deliverables and timelines
5% Release Support
- Coordinate deployments with PMs and IT
- Ensure smooth release cycles with minimal downtime
What You Bring
- 5+ years of experience as a Data Engineer
- Strong expertise with Snowflake and orchestration tools like Airflow
- Advanced Python and SQL programming skills
- Hands-on experience with AWS services: Lambda, S3, and real-time data streaming
- Solid understanding of ELT pipelines, data modeling, and efficient storage strategies
- Great communication and collaboration skills
Nice to Have
- Experience working with healthcare data in the US
- Familiarity with data privacy regulations (HIPAA, GDPR)
- Experience with Snowpark and Snowflake's data sharing capabilities (a brief Snowpark sketch follows this list)
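Since Snowpark appears both in the core responsibilities and here, a short, hedged example of the kind of DataFrame-style transformation it enables may help set expectations. The table and column names (RAW.EVENTS, EVENT_DATE, EVENT_TYPE, CURATED.DAILY_EVENT_COUNTS) are hypothetical, and the connection configuration is a sketch rather than a working setup.

    # Illustrative sketch only: aggregate a raw events table into a curated,
    # business-ready table using the Snowpark DataFrame API.
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col, count

    # Connection parameters would normally be read from a secrets manager.
    session = Session.builder.configs({
        "account": "my_account",   # hypothetical
        "user": "etl_user",        # hypothetical
        "password": "***",
        "warehouse": "ETL_WH",
        "database": "ANALYTICS",
        "schema": "RAW",
    }).create()

    # Count events per day and type, then persist the result as a table
    # that analysts can query directly.
    daily_counts = (
        session.table("RAW.EVENTS")
        .group_by(col("EVENT_DATE"), col("EVENT_TYPE"))
        .agg(count(col("EVENT_TYPE")).alias("EVENT_COUNT"))
    )
    daily_counts.write.save_as_table("CURATED.DAILY_EVENT_COUNTS", mode="overwrite")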
Education
- Bachelor's degree in Computer Science, Engineering, or a related field
Perks & Benefits
- Contractor agreement with payment in USD
- 100% remote work
- Argentina's public holidays
- English classes
- Referral program
- Access to learning platforms
Explore More Opportunities
Check out all our open roles at www.darwoft.com/careers