[Job - 22687] Master Data Developer, Brazil

Brazil


We are tech transformation specialists, uniting human expertise with AI to create scalable tech solutions. With over 6,500 CI&Ters around the world, we've built partnerships with more than 1,000 clients during our 30 years of history. Artificial Intelligence is our reality.

We are looking for a Data Platform Developer to join our team in our development center, working with a major US client in a fast-paced, high-impact environment. In this role, you will architect and implement robust data pipelines and modern analytics platforms, collaborating closely with cross-functional teams across the US and Brazil. You will be responsible for building scalable data solutions leveraging modern cloud-native tools. Fluent English communication is essential for engaging with our global stakeholders and ensuring alignment across distributed teams.
Requirements for this challenge:
- Solid experience as a Data Developer.
- Strong SQL expertise with the ability to optimize, refactor, and validate large-scale data transformations.
- Proficiency in Python (or a similar language) for scripting and automation of data workflows.
- Hands-on experience with Snowflake, including performance tuning, data governance, masking, and workload management.
- Advanced knowledge and production experience with dbt for transformation logic, testing, documentation, and CI/CD integrations.
- Proven experience implementing Data Vault 2.0 models, including Hubs, Links, Satellites, PIT tables, and business vault patterns using AutomateDV or similar frameworks.
- Experience orchestrating ETL/ELT pipelines using Airflow, with knowledge of DAG structuring, dependency management, and dynamic task generation.
- Familiarity with modern data orchestration tools such as Prefect, Dagster, or AWS Glue.
- Comfortable working in environments using CI/CD pipelines with GitHub Actions, integrating dbt, testing, and deployment to Snowflake or similar platforms.
- Solid understanding of data modeling best practices, including normalization, dimensional modeling, and historization.
- Ability to translate business requirements into scalable data architectures and to communicate technical concepts effectively with stakeholders.
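To illustrate the Data Vault 2.0 requirement above: a Hub stores one row per distinct business key, identified by a deterministic hash key, so loads stay idempotent. The following is a minimal, framework-free sketch in plain Python (not AutomateDV or dbt, which the role actually uses); the table, columns, and sample keys are illustrative assumptions, not part of the client's model:

```python
import hashlib
import sqlite3

def hash_key(*business_keys: str) -> str:
    """Deterministic hash key over normalized business keys, mirroring
    the Data Vault 2.0 hub-key pattern (normalization rules are assumptions)."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE hub_customer (
        customer_hk   TEXT PRIMARY KEY,   -- hash key
        customer_bk   TEXT NOT NULL,      -- business key (normalized)
        load_ts       TEXT NOT NULL,
        record_source TEXT NOT NULL
    )
""")

# Staged keys; "c-1001 " duplicates "C-1001" after normalization.
staged = ["C-1001", "c-1001 ", "C-2002"]
for bk in staged:
    # INSERT OR IGNORE skips keys already in the hub, keeping the load idempotent.
    conn.execute(
        "INSERT OR IGNORE INTO hub_customer VALUES (?, ?, datetime('now'), ?)",
        (hash_key(bk), bk.strip().upper(), "crm_export"),
    )

rows = conn.execute("SELECT COUNT(*) FROM hub_customer").fetchone()[0]
print(rows)  # 2 — only distinct business keys survive the load
```

In production this pattern is typically expressed as a dbt model (AutomateDV generates the hashing and hub-load SQL), but the deterministic-hash-plus-idempotent-insert idea is the same.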
Nice to have:
- Experience with data observability tools such as Monte Carlo, Datafold, or Great Expectations to ensure trust in data pipelines.
- Experience with containerization technologies such as Docker or Kubernetes for reproducible environments and scalable deployments.
- Exposure to SAP.
- Knowledge of GraphQL, RESTful APIs, or streaming ingestion frameworks such as Kinesis or Firehose.
- Experience working in hybrid architectures, including data lakehouses and multi-cloud strategies.
#LI-JP3
Our benefits:
- Health and dental insurance
- Meal and food allowance
- Childcare assistance
- Extended paternity leave
- Wellhub (Gympass)
- TotalPass
- Profit-sharing (PLR)
- Life insurance
- CI&T University
- Discount club
- Free online platform dedicated to physical, mental, and overall well-being
- Pregnancy and responsible parenting course
- Partnerships with online learning platforms
- Language learning platform
And many more! More details about our benefits here: https://ciandt.com/br/pt-br/carreiras


Collaboration is our superpower, diversity unites us, and excellence is our standard. We value diverse identities and life experiences, fostering a diverse, inclusive, and safe work environment. We encourage candidates from diverse and underrepresented groups to apply for our job positions.

