DataOps Engineer

Vitória, Espírito Santo

Trustly

Trustly is a simple and fast online banking payments solution that enables consumers and merchants to make pay-ins and payouts directly from their bank accounts.



WHO WE ARE

The Trustly Americas team combines PayWithMyBank, a 2012 Silicon Valley startup, with Trustly AB of Sweden, following their 2019 merger. Our team represents 30 nationalities, serving 9,000 merchants and connecting 650 million consumers to 12,000 banks across 33 countries. Our global network processes over $100 billion annually.
We are the leader in Open Banking Payments, and we’re on a mission to give merchants and consumers a better payment experience through Pay by Bank. For merchants, we deliver payment acceptance at a significantly lower cost than cards. For consumers, we enable them to use their bank accounts as payment methods instead of being held captive by high-interest cards. And it all happens through the security of Open Banking, only with Trustly.
With offices in Vitória, Brazil, Silicon Valley in the US, and global headquarters in Stockholm, Sweden, we are a culturally diverse team. Across Brazil, we have embraced a remote work-from-home policy.
At Trustly, we believe that inclusion and diversity are essential foundations for building a fair and equitable society. We do not discriminate based on race, religion, ancestry, color, national origin, gender identity, sexual orientation, age, citizenship, marital status, or disability status. Our main goal is to provide a fair, welcoming, diverse environment with opportunities for all collaborators. The stages of our selection process take place online and without distinction of any kind.
It’s a great time to join Trustly as the Americas team is growing fast. If you thrive in an entrepreneurially-minded, fast-paced, casual, professional, positive, and rewarding work environment, check us out!
About the team

Trustly's DataOps team is responsible for delivering the data generated by our application, along with data from APIs and other tools, to the areas that need it. We do this in a secure, structured, scalable, and generic way, because we operate multiple environments in different regions and need to keep them consistent. We work on both the batch layer (using Airflow) and the streaming layer (using Kafka), and we help other areas automate processes so data is delivered more quickly and reliably. We also care about Data Quality, building an observability layer over our data so we can act preventively and immediately on inconsistencies and failures in our processes. In addition, we deliver products and services that make our data easier to use and query, maintaining tools such as Debezium and QuickSight. Last but not least, we interact with the Data Science area to provide the support and infrastructure needed to put models into production. To achieve our goals, we follow good coding and development practices, apply end-to-end encryption in all our processes, and use infrastructure as code.
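As a minimal sketch of the kind of data-quality check such an observability layer might run, the snippet below scans a batch of records and flags rows that would make downstream data inconsistent. All field names and thresholds are hypothetical illustrations, not Trustly's actual implementation:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class QualityReport:
    """Summary a monitoring layer could alert on."""
    total: int
    null_keys: int   # rows missing their primary key
    stale: int       # rows older than the freshness cutoff

    @property
    def ok(self) -> bool:
        return self.null_keys == 0 and self.stale == 0

def check_batch(records, max_age_hours=24, now=None):
    """Flag records with missing keys or stale timestamps.

    `records` is a list of dicts with hypothetical fields
    "id" and "updated_at" (an aware datetime).
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(hours=max_age_hours)
    null_keys = sum(1 for r in records if r.get("id") is None)
    stale = sum(1 for r in records if r["updated_at"] < cutoff)
    return QualityReport(total=len(records), null_keys=null_keys, stale=stale)
```

A failing report would typically block the load and page the team, which is the "preventive and immediate" reaction described above.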

What you'll do:

  • Work closely with the data team to build, scale, and optimize the Trustly big data environment, including the data lake setup, AWS infrastructure stack, BI data warehouse, and automation/visualization tools.
  • Implement ETL processes and QA checks to ensure data in the data lake is accurate and up to date.
  • Implement data pipelines to automate the timely delivery of customer reporting and dashboards.
  • Partner with DevOps to ensure our environment and tools are compliant with security protocols.
  • Partner with data scientists to productionalize machine learning models.
  • Document data flows, the architectural setup, and data models.
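As a hedged illustration of the ETL-plus-QA idea in the responsibilities above, here is a toy batch step against SQLite. Table and column names are invented for the example; a real pipeline would read from the application database and load the data lake or Redshift rather than SQLite:

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw rows, transform them, and load only if QA passes."""
    # Extract: pull the raw batch (hypothetical source table).
    rows = conn.execute(
        "SELECT payment_id, amount_cents, currency FROM raw_payments"
    ).fetchall()

    # Transform: convert cents to units and normalize currency codes.
    transformed = [
        (pid, cents / 100.0, cur.upper()) for pid, cents, cur in rows
    ]

    # QA gate: reject the whole batch on bad amounts or blank currencies,
    # so inaccurate data never reaches the target table.
    if any(amount < 0 or not cur for _, amount, cur in transformed):
        raise ValueError("QA check failed: batch not loaded")

    # Load: write the validated batch to the target table.
    conn.executemany(
        "INSERT INTO payments (payment_id, amount, currency) VALUES (?, ?, ?)",
        transformed,
    )
    conn.commit()
    return len(transformed)
```

In an orchestrated setup, the extract, transform, QA, and load steps would each be tasks in a scheduler such as Airflow, so a failed QA gate halts the downstream load.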

Who you are:

  • Bachelor’s or Master’s degree in IT/Math/CS/Engineering or other technical discipline.
  • A successful history of building big data pipelines and data sets.
  • Experience with AWS cloud services (DMS, EC2, EMR, RDS) and big data tools (Redshift).
  • Desirable experience with Spark and Delta Lake.
  • Experience with relational databases (preferably Postgres), strong SQL coding, data modeling, data warehouses.
  • Experience with infrastructure as code (e.g. Terraform).
  • Desirable experience with Kubernetes, Docker.
  • Experience with CI/CD.
  • Experience with automation and workflow management tools (e.g. Airflow, Kubeflow).
  • Intermediate Python programming skills.

Our perks and benefits:

  • Bradesco health and dental plan, for you and your dependents, with no co-payment cost;
  • Life insurance with differentiated coverage;
  • Meal voucher and supermarket voucher;
  • Home Office Allowance;
  • Wellhub - Platform that gives access to spaces for physical activities and online classes;
  • Trustly Club - Discount at educational institutions and partner stores;
  • English Program - Online group classes with a private teacher;
  • Extended maternity and paternity leave;
  • Birthday Off;
  • Flexible hours/Home Office - our culture is remote-first! You can work from any city in Brazil;
  • Welcome Kit - We work with Apple equipment (MacBook Pro, iPhone) and send plenty of other treats! Spoiler alert: the equipment can be purchased by you according to internal criteria;
  • Annual premium - As a member of our team, you are eligible to receive an annual bonus, at the company's discretion, based on the achievement of our KPIs and individual performance;
  • Referral Program - If you refer a candidate and we hire the person, you will receive a reward for that!
Check out our Glassdoor or Brazil Life page on LinkedIn for more details about Brazil, our culture, and much more.
At Trustly, we embrace and celebrate diversity of all forms and the value it brings to our employees and customers. We are proud and committed to being an Equal Opportunity Employer and believe an open and inclusive environment enables people to do their best work.  All decisions regarding hiring, advancement, and any other aspects of employment are made solely on the basis of qualifications, merit, and business need.
Want to make a difference in a fast-growing business? Apply now!


Category: Engineering Jobs

Tags: Airflow APIs AWS Banking Big Data CI/CD DataOps Data pipelines Data quality Data warehouse DevOps Docker EC2 Engineering ETL Kafka KPIs Kubeflow Kubernetes Machine Learning Mathematics ML models Pipelines PostgreSQL Python QuickSight RDBMS Redshift Security Spark SQL Streaming Terraform

Perks/benefits: Career development Flex hours Flex vacation Gear Health care Home office stipend Insurance Parental leave Salary bonus Startup environment Team events

Region: South America
Country: Brazil
