Mid Data Engineer

Brazil - Remote Office

WEX

WEX is the global commerce platform for fuel and fleet, employee benefits, and business payments. Simplify your business and let WEX handle the complex.

About the Team/Role

We’re the Data Platform team: the core data infrastructure and workflow group at the company, working with cutting-edge cloud data technologies. Our team works hard, covers for one another, and maintains work-life balance. We own our results and take pride in everything we do.

As a Data Engineer at WEX, you’ll be responsible for building and maintaining the bridge between our data and the rest of the organization. You’ll work with stakeholders to understand business requirements, then implement SQL-first transformation workflows to deploy analytics code. You’ll help ensure the integrity, reliability, and usability of data so stakeholders can make critical data-driven decisions. You’ll be part of a collaborative scrum team consisting of an Agility Engineer, a Technical Program Manager, Data Engineers, QA, and DevOps, and you’ll be supported by a manager who is here to listen and help you grow.

What we’re looking for:

  • A highly motivated individual who loves working as part of a high-performing team

  • Someone who cares deeply about team results, checks their ego at the door, and takes pride in owning outcomes

  • You can share one or more passion projects or have contributed to open-source projects in your own time.

  • You are constantly learning and upskilling, because that's who you are.

  • You are a critical thinker with strong analytical and problem-solving abilities.

  • You are self-motivated and able to work independently with minimal supervision.

How you’ll make an impact

Our core stack for this position consists of dbt (data build tool) or a similar SQL-first transformation tool, SQL, Snowflake, and Airflow. You possess the following skills and experience:

  • Solid understanding of SQL to perform data transformation tasks.

  • Strong understanding of data design principles and dimensional data modeling.

  • Advanced SQL skills and understanding of query optimization strategies in Snowflake.

  • Fundamental understanding of DAGs and operators in Airflow, with hands-on experience using action and sensor operators.

  • Solid understanding of basic programming concepts (in Python or similar modern languages).
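To illustrate the kind of SQL-first transformation work described above, here is a minimal Snowflake-style sketch; all table and column names are hypothetical, not taken from WEX's actual warehouse:

```sql
-- Hypothetical example: roll raw fuel transactions up to a daily
-- fleet-spend summary. Filtering in the CTE before the join lets
-- Snowflake prune micro-partitions early, a basic query
-- optimization strategy.
with fleet_txns as (
    select
        transaction_id,
        vehicle_id,
        cast(transaction_ts as date) as transaction_date,
        amount_usd
    from raw.fuel_transactions
    where transaction_ts >= dateadd(day, -90, current_date)
),
vehicles as (
    select vehicle_id, fleet_id
    from raw.vehicles
)
select
    t.transaction_date,
    v.fleet_id,
    sum(t.amount_usd) as total_spend_usd,
    count(*)          as txn_count
from fleet_txns t
join vehicles v
  on v.vehicle_id = t.vehicle_id
group by 1, 2
```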

Experience you’ll bring

  • Strong experience as a Data Engineer developing complex data models using dbt (including macros and Jinja) and building data pipelines.

  • Exceptional analytical and problem-solving skills

  • BS in a technical or quantitative field; or you can make us feel intensely confident that you don’t need one.
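For context on the dbt experience listed above, a dbt model combines SQL with Jinja templating. A minimal sketch follows; the model and source names are hypothetical, and `generate_surrogate_key` assumes the dbt_utils package is installed:

```sql
-- models/marts/fct_fleet_spend.sql (hypothetical dbt model)
{{ config(materialized='incremental', unique_key='spend_id') }}

select
    -- Macro from the dbt_utils package: hashes the listed columns
    -- into a deterministic surrogate key.
    {{ dbt_utils.generate_surrogate_key(['transaction_date', 'fleet_id']) }} as spend_id,
    transaction_date,
    fleet_id,
    total_spend_usd
from {{ ref('stg_fuel_transactions') }}

{% if is_incremental() %}
-- On incremental runs, only process days newer than what the
-- target table already contains.
where transaction_date > (select max(transaction_date) from {{ this }})
{% endif %}
```

The `ref()` call is how dbt wires models into a dependency graph, which is what lets it build the pipeline in the right order.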

Category: Engineering Jobs

Tags: Airflow Data pipelines dbt DevOps Open Source Pipelines Python Scrum Snowflake SQL

Regions: Remote/Anywhere South America
Country: Brazil
