Data Engineer

Latin America

Jahnel Group

Whether it be custom enterprise software, a web or mobile application, a software product, or game development, we are here to help you succeed.



Jahnel Group’s mission is to provide the absolute best environment for software creators to pursue their passion by connecting them with great clients doing meaningful work. We get to build some of the most complex and compelling applications for our clients located across the country.

We’re a fast-growing, Inc. 5000-recognized company, yet we still work as a very close-knit team (100+ employees). If you’re looking for the next place to call home, hit us up for a beer or coffee.

Who We're Looking For

We’re on the hunt for a Data Engineer with strong experience designing and maintaining modern data workflows in cloud environments. You’re someone who thrives in high-impact environments, can translate business questions into clean datasets, and knows how to optimize data pipelines for performance and scalability. You’ll work closely with engineering and analytics teams to build a solid data foundation for insights and decision-making.

Primary Responsibilities

  • Build and maintain scalable data pipelines using Python, dbt, and SQL
  • Design and implement ELT/ETL processes to support business intelligence efforts
  • Work with data warehouses and databases such as Snowflake and PostgreSQL to organize and optimize data models
  • Collaborate cross-functionally with analytics, engineering, and product teams
  • Support data governance by standardizing practices, ensuring data quality, and improving data accessibility
  • Leverage AWS services (e.g., S3, Lambda, Glue) to create scalable, cost-efficient cloud data solutions
  • Drive automation and efficiency in data workflows using tools like PySpark and Databricks
  • Participate in code reviews and improve data tooling and infrastructure

Some Must-Haves

  • 3+ years of experience in Data Engineering or Analytics Engineering
  • Strong proficiency with Python, SQL, and dbt
  • Hands-on experience with Snowflake, PostgreSQL, and data modeling best practices
  • Working knowledge of AWS cloud ecosystem
  • Experience with ELT tools, workflow orchestration, and scalable data transformation (e.g., Airflow, Databricks, PySpark)
  • Strong collaboration skills and the ability to communicate technical concepts clearly
  • Passion for clean, maintainable, and well-documented code

Where We're Looking For It

Latin America, open to 100% fully remote work

Other Information

Work hours are approximately 9:00 am to 5:00 pm EST, depending on workload, with the occasional late night when a tight deadline calls for it. We work with security-conscious clients, so background checks are required.

Position available immediately.
