UW - Ssr. Data Engineer - 126

Brazil - Remote

Thaloz

Simplify your digital product journey with Thaloz. From product strategy to team expansion with LatAm talent, we've got you covered.

The Data Engineer plays a crucial role in our organization by building robust data ingestion pipelines and integrating data into usable datasets. This position is essential for ensuring that our finance and accounting teams have access to accurate, efficient data. The ideal candidate will work closely with data architects and other stakeholders to create refined data objects, using Snowflake on Azure as the data platform and dbt as the primary coding tool. This role requires a self-starter who can quickly understand data schemas and proactively address issues, contributing to the overall success of our data initiatives.

Responsibilities:

As a Data Engineer, you will be responsible for the following key tasks and objectives:

  • Data Ingestion and Integration: Build and maintain ingestion pipelines to integrate data from various sources into Snowflake, ensuring that the data is transformed into usable datasets for finance and accounting purposes.
  • Collaboration with Data Architects: Work closely with data architects to design and create refined data objects that meet the needs of the finance and accounting teams, ensuring data accuracy and efficiency.
  • Batch Accuracy Processes: Follow established processes for batch accuracy, ensuring that data is ingested and processed correctly and efficiently.
  • Model Building: Develop and maintain data models that support business intelligence and reporting needs, ensuring that the data is structured in a way that is easy to analyze.
  • Deployment Management: Manage deployment processes for data pipelines and models, ensuring that changes are implemented smoothly and without disruption to existing workflows.
  • Schema Understanding: Quickly understand and adapt to existing data schemas, making necessary adjustments to improve data ingestion processes.
  • Python Environment Structuring: Help structure the current Python environment for API ingestion, ensuring that it is optimized for performance and reliability.
  • Proactive Issue Resolution: Identify and address any issues that arise in the data ingestion process, working collaboratively with team members to implement solutions.

Requirements:

  • Snowflake: Proficiency in using Snowflake as a cloud data platform, including experience with data warehousing, data modeling, and SQL querying within Snowflake.
  • DBT (Data Build Tool): Experience with DBT as the primary coding platform for transforming raw data into a structured format, including writing and managing DBT models and documentation.
  • Python: Strong programming skills in Python, particularly for data ingestion and transformation tasks, including experience with libraries such as Pandas and NumPy.
  • SQL: Proficiency in SQL for querying and manipulating data within relational databases, with a focus on performance optimization and best practices.
  • Data Ingestion: Experience in building and maintaining data ingestion pipelines, including knowledge of ETL (Extract, Transform, Load) processes and tools.
  • Azure Data Factory: Familiarity with Azure Data Factory for orchestrating data workflows and integrating data from various sources into Snowflake.
  • Data Modeling: Strong understanding of data modeling concepts and best practices, with the ability to design and implement effective data models that support business needs.

Nice to Have:

  • Tableau: Experience with Tableau for data visualization and reporting, including the ability to create interactive dashboards and reports that communicate insights effectively.
  • CI/CD Pipelines: Familiarity with Continuous Integration and Continuous Deployment (CI/CD) practices, particularly in the context of data engineering and data pipeline management.
  • Agile Methodology: Understanding of Agile methodologies and practices, with experience working in Agile teams to deliver data solutions iteratively and collaboratively.