Data Engineer

Medellín, Colombia

Kiwibot

Rent a Kiwibot delivery robot by the hour. Speed up deliveries, lower costs, and go green with autonomous, branded robots for your business.

Kiwibot is at the forefront of robotic delivery solutions, transforming urban logistics with autonomous technology. Our mission is to make last-mile delivery efficient, sustainable, and accessible. We are a data-driven company, and our Data Team is crucial to our success, enabling intelligent decision-making, optimizing robot performance, and supporting our rapid growth.

The Opportunity:

We are seeking a highly skilled and motivated Data Engineer to join our dynamic Data Team. In this role you will be instrumental in designing, developing, and maintaining our data architecture and pipelines, ensuring data quality, and providing essential data support for our AI and Robotics initiatives. The ideal candidate is an end-to-end data professional, capable of leading complex projects, collaborating with cross-functional teams, and upholding the highest standards of data governance and security.

Key Responsibilities:

  • Data Pipeline Development & Management:
    • Design, develop, and maintain scalable, reliable, and efficient ETL/ELT pipelines for batch and real-time data processing (e.g., MQTT/Kafka data ingestion); a minimal illustrative sketch follows this list.

    • Manage and optimize our data warehousing solutions, primarily Google BigQuery, ensuring efficient data storage, querying, and cost-effectiveness.

    • Implement and maintain data quality assertions across all data pipelines to ensure data integrity from source to consumption.

    • Develop and integrate new data sources into our existing data ecosystem.

    • Troubleshoot and resolve data pipeline issues, ensuring minimal disruption to data availability.

  • Data Support:
    • Collaborate closely with teams across the company to understand their data needs and develop tailored data solutions.

    • Design and implement data pipelines that support machine learning workflows.

    • Contribute to the development of data-driven insights that improve robot autonomy and performance.

  • Data Architecture & Infrastructure:
    • Contribute to the design and evolution of our overall data architecture, ensuring scalability, performance, and maintainability.

    • Implement and adhere to best practices for data modeling, schema design, and data governance.

    • Work with cloud infrastructure (GCP preferred) to deploy and manage data services.

    • Apply knowledge of spatial data wrangling and best practices for warehousing and consuming spatial data (see the GIS sketch after this list).

  • Monitoring, Reporting & Dashboards:
    • Develop and maintain monitoring solutions for data pipeline health and performance.

    • Ensure data consistency and accuracy in reporting tools.

  • Team Collaboration & Leadership:
    • Collaborate effectively with cross-functional teams to gather requirements and deliver data solutions.

    • Mentor junior team members and contribute to a culture of continuous learning and knowledge sharing within the data team.

    • Take ownership of projects from conception to deployment, ensuring timely and high-quality deliverables.
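
To make the streaming-ingestion and data-quality responsibilities above concrete, here is a minimal illustrative sketch of a Kafka-to-BigQuery loader with an inline quality assertion. The topic, table, broker address, and field names are hypothetical placeholders, not Kiwibot's actual schema (an MQTT source would look similar with a client such as paho-mqtt), and a production pipeline would add dead-letter routing, retries, and schema management.

```python
# Illustrative sketch only: Kafka -> validation -> BigQuery streaming insert.
# Topic, table, broker, and field names are assumptions, not Kiwibot's schema.
import json

from confluent_kafka import Consumer   # pip install confluent-kafka
from google.cloud import bigquery      # pip install google-cloud-bigquery

TOPIC = "robot.telemetry"                    # hypothetical topic
TABLE = "my-project.telemetry.events"        # hypothetical BigQuery table
REQUIRED_FIELDS = {"robot_id", "ts", "lat", "lon"}


def is_valid(row: dict) -> bool:
    """Quality assertion: required fields present and coordinates in range."""
    if not REQUIRED_FIELDS <= row.keys():
        return False
    return -90 <= row["lat"] <= 90 and -180 <= row["lon"] <= 180


def run() -> None:
    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",   # assumption: local broker
        "group.id": "telemetry-loader",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe([TOPIC])
    bq = bigquery.Client()  # uses application-default GCP credentials

    batch: list[dict] = []
    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            row = json.loads(msg.value())
            if not is_valid(row):
                continue  # in practice, route rejects to a dead-letter table
            batch.append(row)
            if len(batch) >= 500:  # stream in small batches to control cost
                errors = bq.insert_rows_json(TABLE, batch)
                if errors:
                    raise RuntimeError(f"BigQuery insert failed: {errors}")
                batch.clear()
    finally:
        if batch:
            bq.insert_rows_json(TABLE, batch)  # flush the final partial batch
        consumer.close()


if __name__ == "__main__":
    run()
```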
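
For the spatial-data point, BigQuery's GIS functions can answer proximity questions directly in the warehouse. The dataset, column names, and the 500 m radius below are again assumptions for the example; in practice one would persist dropoff locations as a GEOGRAPHY column rather than rebuilding them per query.

```python
# Illustrative sketch only: a BigQuery GIS query counting deliveries within
# 500 m of a point of interest. Table and column names are placeholders.
from google.cloud import bigquery

SQL = """
SELECT
  robot_id,
  COUNT(*) AS nearby_deliveries
FROM `my-project.raw.deliveries`
WHERE ST_DWITHIN(
    ST_GEOGPOINT(dropoff_lon, dropoff_lat),  -- build a GEOGRAPHY from lon/lat
    ST_GEOGPOINT(@poi_lon, @poi_lat),
    500)                                     -- radius in metres
GROUP BY robot_id
"""


def nearby_deliveries(poi_lon: float, poi_lat: float) -> None:
    client = bigquery.Client()
    job = client.query(
        SQL,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[
                bigquery.ScalarQueryParameter("poi_lon", "FLOAT64", poi_lon),
                bigquery.ScalarQueryParameter("poi_lat", "FLOAT64", poi_lat),
            ]
        ),
    )
    for row in job.result():
        print(row.robot_id, row.nearby_deliveries)
```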

Technical Skills & Qualifications:

  • Required:
    • Strong proficiency in Python for data engineering and scripting.

    • Extensive experience with SQL and relational databases (PostgreSQL preferred).

    • Proven expertise with Google Cloud Platform (GCP) services, especially BigQuery, Cloud Storage, and Cloud Functions.

    • Experience designing, building, and maintaining robust ETL/ELT data pipelines.

    • Familiarity with data orchestration tools (e.g., Apache Airflow; see the DAG sketch after this list).

    • Experience with real-time data processing technologies (e.g., Kafka, MQTT).

    • Understanding of data modeling techniques (e.g., Kimball-style dimensional modeling).

    • Familiarity with version control systems (Git/GitHub).

  • Plus:
    • Exposure to AI/ML data pipelines and MLOps principles.

    • Knowledge of AI Agents for internal product development.

    • Familiarity with containerization technologies (Docker, Kubernetes).

    • Exposure to company KPIs, OKRs, and other performance metrics.
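
As one illustration of how these tools fit together, below is a minimal Airflow DAG sketch (see the orchestration bullet above) that rebuilds a hypothetical Kimball-style fact table in BigQuery and then asserts its quality. All table names are placeholders, and the syntax assumes Airflow 2.4+ with the Google provider package installed.

```python
# Illustrative sketch only: a daily Airflow DAG that rebuilds a hypothetical
# deliveries fact table in BigQuery, then runs a row-level quality assertion.
import pendulum

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryCheckOperator,
    BigQueryInsertJobOperator,
)

# Table names below are placeholders, not Kiwibot's real schema.
FACT_SQL = """
CREATE OR REPLACE TABLE `my-project.marts.fct_deliveries` AS
SELECT
  delivery_id,
  robot_id,
  DATE(completed_at) AS delivery_date,  -- grain: one row per completed delivery
  distance_m,
  duration_s
FROM `my-project.raw.deliveries`
WHERE completed_at IS NOT NULL
"""

with DAG(
    dag_id="daily_deliveries_mart",
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
) as dag:
    build_fact = BigQueryInsertJobOperator(
        task_id="build_fct_deliveries",
        configuration={"query": {"query": FACT_SQL, "useLegacySql": False}},
    )

    # Quality assertion: the task fails (and can alert) if the check is false.
    check_no_nulls = BigQueryCheckOperator(
        task_id="assert_no_null_keys",
        sql="SELECT COUNT(*) = 0 FROM `my-project.marts.fct_deliveries` "
            "WHERE delivery_id IS NULL OR robot_id IS NULL",
        use_legacy_sql=False,
    )

    build_fact >> check_no_nulls
```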

Category: Engineering Jobs

Tags: Airflow Architecture BigQuery Data governance Data pipelines Data quality Data Warehousing Docker ELT Engineering ETL GCP Git GitHub Google Cloud Kafka KPIs Kubernetes Machine Learning MLOps MQTT OKR Pipelines PostgreSQL Python RDBMS Robotics Security SQL

Perks/benefits: Career development Startup environment

Region: South America
Country: Colombia
