Data Architect

Brazil (Remote)

dLocal

Simplify your cross-border payment operations in high-growth markets. Send and receive funds locally, reaching new customers. One easy integration, unlimited secure transactions.



Why should you join dLocal?
dLocal enables the biggest companies in the world to collect payments in 40 emerging-market countries. Global brands rely on us to increase conversion rates and simplify payment expansion. As both a payments processor and a merchant of record in the countries where we operate, we make it possible for our merchants to make inroads into the world’s fastest-growing emerging markets.
By joining us, you will be part of an amazing global team that makes it all happen, in a flexible, dynamic, remote-first culture with travel, health, and learning benefits, among others. Being part of dLocal means working with 900+ teammates of 25+ nationalities and developing an international career that impacts millions of people’s daily lives. We are builders, we never run from a challenge, and we are customer-centric — if this sounds like you, we know you will thrive in our team.

What will I be doing?

  • Defining critical data architectures and evaluating trade-offs to determine the optimal technology stack.
  • Establishing and promoting frameworks for data access, stewardship, and governance to improve transparency and accountability.
  • Taking ownership of critical issues, collaborating closely with teams and stakeholders to identify and implement the best solutions.
  • Acting as a trusted advisor to the entire data team, enhancing their productivity and effectiveness.

What skills do I need?

  • 8-10+ years of proven experience in designing and managing scalable data architectures, particularly in enterprise environments.
  • Experience in defining and implementing various types of data architectures, analyzing trade-offs, and selecting technology stacks accordingly.
  • Extensive experience in building, maintaining, and optimizing data platforms, including data warehouse design, data modeling, monitoring, and operations.
  • Proficiency in open-source technologies such as Spark, MapReduce, Airflow, DBT, and Kafka.
  • Comfortable working in cloud environments, specifically AWS and GCP.
  • Expertise in data modeling, data lake/warehouse patterns, data pipelines, and data management technologies.
  • Ability to establish and promote frameworks for data access, stewardship, and governance to enhance transparency and accountability.
  • Strong stakeholder management skills, with the ability to balance delivery expectations and manage pressure.
  • Skilled in managing risks and conflicts effectively, addressing challenges head-on.
  • Ability to influence others and advocate for technical excellence while remaining adaptable to change.
  • Self-sufficient and proactive, with a clear understanding of when to seek help.

What happens after you apply?
Our Talent Acquisition team is invested in creating the best candidate experience possible, so don’t worry, you will definitely hear from us. We will review your CV and keep you posted by email at every step of the process!
Also, you can check out our webpage, LinkedIn, Instagram, and YouTube for more about dLocal!


Category: Architecture Jobs

Tags: Airflow Architecture AWS Data management Data pipelines Data warehouse dbt GCP Kafka Open Source Pipelines Spark

Perks/benefits: Career development Flex hours Health care

Regions: Remote/Anywhere South America
Country: Brazil