[Job - 22687] Master Data Developer, Colombia
Colombia
We are tech transformation specialists, uniting human expertise with AI to create scalable tech solutions. With over 6,500 CI&Ters around the world, we've built partnerships with more than 1,000 clients during our 30 years of history. Artificial Intelligence is our reality.
We are looking for a Data Platform Developer to join our team in our development center, working with a major US client in a fast-paced, high-impact environment. In this role, you will architect and implement robust data pipelines and modern analytics platforms, working in close collaboration with cross-functional teams across the US and Brazil. You will be responsible for building scalable data solutions leveraging modern cloud-native tools. Fluent English communication is essential for engaging with our global stakeholders and ensuring alignment across distributed teams.
Requirements for this challenge:
- Solid experience as a Data Developer.
- Strong SQL expertise with the ability to optimize, refactor, and validate large-scale data transformations.
- Proficiency in Python (or a similar language) for scripting and automation of data workflows.
- Hands-on experience with Snowflake, including performance tuning, data governance, masking, and workload management.
- Advanced knowledge and production experience with dbt for transformation logic, testing, documentation, and CI/CD integrations.
- Proven experience implementing Data Vault 2.0 models, including Hubs, Links, Satellites, PIT tables, and business vault patterns using AutomateDV or similar frameworks.
- Experience orchestrating ETL/ELT pipelines using Airflow, with knowledge of DAG structuring, dependency management, and dynamic task generation (a brief illustrative sketch follows this list).
- Familiarity with modern data orchestration tools such as Prefect, Dagster, or AWS Glue.
- Comfortable working in environments using CI/CD pipelines with GitHub Actions, integrating dbt, testing, and deployment to Snowflake or similar platforms.
- Solid understanding of data modeling best practices, including normalization, dimensional modeling, and historization.
- Ability to translate business requirements into scalable data architectures, and to communicate technical concepts effectively with stakeholders.
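For illustration only, and not part of the formal requirements, the following is a minimal sketch of the Airflow pattern mentioned above: a DAG that generates one load task per source table in a loop. It assumes Airflow 2.4+; the table list, the DAG id, and the load_table() helper are hypothetical placeholders rather than details of the client's actual pipelines.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical list of source tables; in practice this would come from config or metadata.
SOURCE_TABLES = ["customers", "orders", "payments"]

def load_table(table_name: str) -> None:
    # Placeholder for real extract/load logic (e.g. staging the table into Snowflake).
    print(f"Loading {table_name}")

with DAG(
    dag_id="example_dynamic_elt",      # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Dynamic task generation: one independent task per source table.
    for table in SOURCE_TABLES:
        PythonOperator(
            task_id=f"load_{table}",
            python_callable=load_table,
            op_kwargs={"table_name": table},
        )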
Nice to have:
- Experience with data observability tools like Monte Carlo, Datafold, or Great Expectations to ensure trust in data pipelines.
- Experience with containerization technologies like Docker or Kubernetes for reproducible environments and scalable deployments.
- Exposure to SAP.
- Knowledge of GraphQL, RESTful APIs, or streaming ingestion frameworks such as Kinesis or Firehose.
- Experience working in hybrid architectures, including data lakehouses or multi-cloud strategies.
#LI-JP3
Our benefits include:
- Premium Healthcare
- Meal voucher
- Maternity and Parental leaves
- Mobile services subsidy
- Sick pay
- Life insurance
- CI&T University
- Colombian Holidays
- Paid Vacations
And many others.
Collaboration is our superpower, diversity unites us, and excellence is our standard. We value diverse identities and life experiences, fostering a diverse, inclusive, and safe work environment. We encourage candidates from diverse and underrepresented groups to apply to our positions.