Senior Data Analytics Engineer
LATAM
Zartis
Zartis provides bespoke software development teams, software development outsourcing, and consulting services you can trust. Zartis is a digital solutions provider working across technology strategy, software engineering, and product development. We partner with firms across financial services, MedTech, media, logistics technology, renewable energy, EdTech, e-commerce, and more. Our engineering hubs in EMEA and LATAM are full of talented professionals delivering business success and digital improvement across application development, software architecture, CI/CD, business intelligence, QA automation, and new technology integrations.
We are looking for a Senior Data Analytics Engineer to work on a project in the Technology industry.
The project:
Our teammates are talented people who come from a variety of backgrounds. We’re committed to building an inclusive culture based on trust and innovation.
You will be part of a distributed team developing new technologies to solve real business problems. Our client empowers organizations to make smarter, faster decisions through the seamless integration of strategy, technology, and analytics. They have helped leading brands harness their marketing, advertising, and customer experience data to unlock insights, enhance performance, and drive digital transformation.
We are looking for someone with strong communication skills who is proactive, comfortable making decisions, used to building software from scratch, and has good attention to detail.
What you will do:
- Designing performant data pipelines for the ingestion and transformation of complex datasets into usable data products.
- Building scalable infrastructure to support hourly, daily, and weekly update cycles.
- Implementing automated QA checks and monitoring systems to catch data anomalies before they reach clients.
- Re-architecting system components to improve performance or reduce costs.
- Supporting team members through code reviews and collaborative development.
- Building enterprise-grade batch and real-time data processing pipelines on AWS, with a focus on serverless architectures.
- Designing and implementing automated ELT processes to integrate disparate datasets.
- Collaborating across multiple teams to ingest, extract, and process data using Python, R, Zsh, SQL, REST, and GraphQL APIs.
- Transforming clickstream and CRM data into meaningful metrics and segments for visualization.
- Creating automated acceptance, QA, and reliability checks to ensure business logic and data integrity.
- Designing appropriately normalized schemas and making informed decisions between SQL and NoSQL solutions.
- Optimizing infrastructure and schema design for performance, scalability, and cost efficiency.
- Defining and maintaining CI/CD and deployment pipelines for data infrastructure.
- Containerizing and deploying solutions using Docker and AWS ECS.
- Proactively identifying and resolving data discrepancies, and implementing safeguards to prevent recurrence.
- Contributing to documentation, onboarding materials, and cross-team enablement efforts.
What you will bring:
- Bachelor’s degree in Computer Science, Software Engineering, or a related field; additional training in statistics, mathematics, or machine learning is a strong plus.
- 5+ years of experience building scalable and reliable data pipelines and data products in a cloud environment (AWS preferred).
- Deep understanding of ELT processes and data modelling best practices.
- Strong programming skills in Python or a similar scripting language.
- Advanced SQL skills, with intermediate to advanced experience in relational database design.
- Familiarity with joining and analyzing large behavioral datasets, such as Adobe and GA4 clickstream data.
- Excellent problem-solving abilities and strong attention to data accuracy and detail.
- Proven ability to manage and prioritize multiple initiatives with minimal supervision.
Nice to have:
- Experience working with data transformation tools such as dbt (Data Build Tool) or similar technologies.
- Familiarity with Docker containerization and orchestration.
- Experience in API design or integration for data pipelines.
- Development experience in a Linux or Mac environment.
- Exposure to data QA frameworks or observability tools (e.g., Great Expectations, Monte Carlo).
What we offer:
- 100% Remote Work
- WFH allowance: a monthly payment as financial support for remote working.
- Career Growth: an established career development program, accessible to all employees, with 360º feedback to guide your career progression.
- Training: you have time allocated during the week for tech training at Zartis. You can choose from a variety of options, such as online courses (from Pluralsight and Educative.io, for example), English classes, books, conferences, and events.
- Mentoring Program: you can become a mentor at Zartis, receive mentorship, or both.
- Zartis Wellbeing Hub (Kara Connect): a platform that provides sessions with a range of specialists, including mental health professionals, nutritionists, physiotherapists, and fitness coaches, as well as webinars with these professionals.
- Multicultural working environment: we organize tech events, webinars, parties, and online team-building games and contests.