[Job - 22687] Master Data Developer, Colombia
Colombia
We are tech transformation specialists, uniting human expertise with AI to create scalable tech solutions. With over 6,500 CI&Ters around the world, we’ve built partnerships with more than 1,000 clients during our 30 years of history. Artificial Intelligence is our reality.
We are looking for a Data Platform Developer to join our team in our development center, working with a major US client in a fast-paced, high-impact environment. In this role, you will architect and implement robust data pipelines and modern analytics platforms, working in close collaboration with cross-functional teams across the US and Brazil. You will be responsible for building scalable data solutions leveraging modern cloud-native tools. Fluent English communication is essential for engaging with our global stakeholders and ensuring alignment across distributed teams.
Requirements for this challenge:
- Solid experience as a Data Developer.
- Strong SQL expertise with the ability to optimize, refactor, and validate large-scale data transformations.
- Proficiency in Python (or a similar language) for scripting and automation of data workflows.
- Hands-on experience with Snowflake, including performance tuning, data governance, masking, and workload management.
- Advanced knowledge and production experience with dbt for transformation logic, testing, documentation, and CI/CD integrations.
- Proven experience implementing Data Vault 2.0 models, including Hubs, Links, Satellites, PIT tables, and business vault patterns using AutomateDV or similar frameworks.
- Experience orchestrating ETL/ELT pipelines using Airflow, with knowledge of DAG structuring, dependency management, and dynamic task generation (see the sketch after this list).
- Familiarity with modern data orchestration tools such as Prefect, Dagster, or AWS Glue.
- Comfortable working in environments using CI/CD pipelines with GitHub Actions, integrating dbt, testing, and deployment to Snowflake or similar platforms.
- Solid understanding of data modeling best practices, including normalization, dimensional modeling, and historization.
- Ability to translate business requirements into scalable data architectures and to communicate technical concepts effectively with stakeholders.
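For illustration only, here is a minimal sketch of the kind of Airflow DAG structuring and dynamic task generation described above. It assumes Airflow 2.4+ and uses hypothetical table names and a placeholder dbt command; it is not part of any actual client codebase.

```python
# Minimal Airflow 2.x sketch: one DAG with dynamically generated
# per-table load tasks feeding a single downstream dbt step.
# Table names and the dbt selector are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

SOURCE_TABLES = ["customers", "orders", "payments"]  # hypothetical sources


def load_table(table_name: str) -> None:
    """Placeholder extract/load step for a single source table."""
    print(f"Loading raw data for {table_name} into the staging area")


with DAG(
    dag_id="elt_raw_to_vault",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Dynamic task generation: one load task per configured source table.
    load_tasks = [
        PythonOperator(
            task_id=f"load_{table}",
            python_callable=load_table,
            op_kwargs={"table_name": table},
        )
        for table in SOURCE_TABLES
    ]

    # Downstream dbt run (e.g. raw vault models) depends on all load tasks.
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --select tag:raw_vault",
    )

    load_tasks >> run_dbt
```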
Nice to have:
- Experience with data observability tools like Monte Carlo, Datafold, or Great Expectations to ensure trust in data pipelines.
- Experience with containerization technologies like Docker or Kubernetes for reproducible environments and scalable deployments.
- Exposure to SAP.
- Knowledge of GraphQL, RESTful APIs, or streaming ingestion frameworks such as Kinesis or Firehose (a small sketch follows this list).
- Experience working in hybrid architectures, including data lakehouses or multi-cloud strategies.
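As a small illustration of the streaming ingestion item above, the sketch below pushes a JSON event to a hypothetical Kinesis data stream with boto3. The stream name, region, and payload are made up for the example.

```python
# Illustrative only: send one JSON event to a Kinesis data stream.
import json

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

event = {"order_id": 12345, "status": "created"}  # example payload

response = kinesis.put_record(
    StreamName="orders-events",               # hypothetical stream name
    Data=json.dumps(event).encode("utf-8"),    # record payload as bytes
    PartitionKey=str(event["order_id"]),       # controls shard routing
)

print(response["SequenceNumber"])
```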
#LI-JP3
Our benefits include:
- Premium Healthcare
- Meal voucher
- Maternity and Parental leaves
- Mobile services subsidy
- Sick pay
- Life insurance
- CI&T University
- Colombian Holidays
- Paid Vacations
And many others.
Collaboration is our superpower, diversity unites us, and excellence is our standard. We value diverse identities and life experiences, fostering a diverse, inclusive, and safe work environment. We encourage candidates from diverse and underrepresented groups to apply for our job positions.