Data Engineer Ssr/Sr (ETL · NiFi/Glue · PySpark · AWS)

Buenos Aires, Argentina

Darwoft

Darwoft is an industry-leading custom software development company specializing in mobile and web app UX and development.

  • Location: LATAM (preferably Argentina CABA/AMBA region)

  • Job Type: Remote (hybrid expected for CABA/AMBA)

  • Project: Data Infrastructure Modernization

  • Time Zone: GMT-3 (Buenos Aires time)

  • English Level: B2 / C1

Get to Know Us

At Darwoft, we build digital products with heart. We're a Latin American tech company focused on creating impactful, human-centered software in partnership with companies around the globe. Our remote-first culture is built on trust, continuous learning, and collaboration.

We're passionate about tech, but even more about people. If you're looking to join a team where your ideas matter and your impact is real, welcome to Darwoft.

We're Looking For a Data Engineer

We're looking for a skilled Data Engineer to help lead our migration from on-premise data processes to a scalable, cloud-native environment. You'll design and optimize ETL/ELT pipelines using modern tools and play a key role in ensuring efficiency, scalability, and high-quality data integration as we move to the cloud.

Note: Candidates located in CABA/AMBA will be expected to attend some in-person ceremonies.

What You'll Be Doing

  • Design, develop, and optimize ETL/ELT processes using NiFi, AWS Glue, EMR, or Informatica Cloud.

  • Migrate and adapt existing ETL workflows from on-premise to cloud infrastructure.

  • Ingest and process unstructured data into Data Lakes, ensuring seamless integration with other data sources.

  • Implement and fine-tune scalable data workflows with Apache Spark or PySpark (see the sketch after this list).

  • Develop automation and data processing scripts in Python as part of robust data pipelines.

  • Collaborate closely with solution architects and cross-functional teams to ensure smooth cloud transitions.

  • Build cloud-based pipelines (preferably on AWS) with scalability and performance in mind.

  • Maintain comprehensive documentation of data architecture and processes for clarity and support.
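
The sketch below illustrates, in broad strokes, the kind of pipeline work these responsibilities describe: ingesting semi-structured data from a raw zone, applying a light PySpark transformation, and writing partitioned Parquet to a curated Data Lake zone. It is a minimal sketch only; the bucket paths and column names are hypothetical, and a production AWS Glue job would typically add a GlueContext, Data Catalog integration, and job bookmarks.

```python
# Minimal PySpark ETL sketch (hypothetical paths and columns, not a production Glue job).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Ingest semi-structured JSON landed in a raw S3 zone (hypothetical bucket/path;
# assumes S3 credentials and the hadoop-aws connector are configured).
raw = spark.read.json("s3://example-raw-zone/orders/")

# Light transformation: normalize column names, derive a partition date,
# and deduplicate on the business key ("orderId"/"created_at" are illustrative fields).
cleaned = (
    raw.withColumnRenamed("orderId", "order_id")
       .withColumn("order_date", F.to_date("created_at"))
       .dropDuplicates(["order_id"])
)

# Write partitioned Parquet to the curated zone of the Data Lake (hypothetical bucket).
(cleaned.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-zone/orders/"))

spark.stop()
```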

What You Bring

  • Proven experience developing ETL/ELT jobs in both on-premise and cloud environments.

  • Hands-on experience with tools such as NiFi, AWS Glue, EMR, or Informatica Cloud.

  • Solid background in migrating data workflows to the cloud.

  • Familiarity with ingesting and transforming unstructured data within Data Lakes.

  • Proficiency in Apache Spark or PySpark for distributed data processing.

  • Strong Python scripting skills for automation within data pipelines.

  • Cloud experience, ideally with AWS (other cloud platforms are a plus).

  • Strong analytical and problem-solving skills with a focus on optimizing complex data workflows.

Nice to Have

  • AWS or other cloud platform certifications.

  • Experience with other data orchestration/integration tools.

  • Background in large-scale migration or cloud optimization projects.

  • Understanding of data governance and security in cloud environments.

Perks & Benefits

  • Full-time contract with payment in ARS

  • 100% remote work

  • Competitive salaries

  • Legal leave and vacation days

  • 5 extra personal days off per year

  • Access to top learning platforms

  • Benefits and discounts card

  • Welcome kit

  • Reimbursement programs

  • English classes

  • Referral program

  • Birthday gift

  • Healthy Break

  • Darwoft-style celebrations: anniversaries, year-end parties, birthdays, and fun team-building events
