AWS Data Engineer

Colombia


Lean Tech

Lean Solutions Group is a top workforce optimization company. Explore our offshore and nearshore staffing solutions to transform your business operations.



Company Overview:
Lean Tech is a rapidly expanding organization based in Medellín, Colombia. We pride ourselves on possessing one of the most influential networks within software development and IT services for the entertainment, financial, and logistics sectors. Our corporate projections offer a multitude of opportunities for professionals to elevate their careers and experience substantial growth. Joining our team means engaging with expansive engineering teams across Latin America and the United States, contributing to cutting-edge developments in multiple industries. Currently, we are seeking a Mid+ AWS Data Engineer to join our team. Here are the challenges our next warrior will face and the requirements we look for:

Position Title: AWS Data Engineer
Location: Remote - LATAM

What you will be doing:
This position involves translating high-level solution designs into functional AWS data pipelines and services. You will collaborate closely with an onshore Data Architect to implement real-time change data capture using Oracle GoldenGate and integrate it with Amazon MSK for downstream processing. Key responsibilities include developing containerized transformation microservices on AWS EKS, designing efficient NoSQL storage in Apache Cassandra, implementing graph-based pipelines for Amazon Neptune clusters, and enabling ad-hoc analytics via Amazon Athena on S3 data lakes. Automating deployments with Terraform or CloudFormation and integrating CI/CD processes for continuous improvement are also critical. The role requires seamless collaboration with cross-functional teams to onboard new data sources, support analytics/ML use cases, and maintain data quality and governance across all pipelines. Working within Lean Tech's distributed team, the position presents opportunities to tackle complex data challenges using advanced AWS services.

Responsibilities:
Collaborate with the Onshore Data Architect to transform high-level solution designs into functional AWS data pipelines and services.
Ingest real-time change data from legacy systems using Oracle GoldenGate and publish it to Amazon MSK (Kafka) for processing by downstream services.
Develop containerized microservices for data transformation on AWS EKS, ensuring code is modular and reusable.
Build and maintain batch workflows to copy data nightly into Kafka topics and S3 landing zones.
Design and optimize NoSQL storage solutions in Apache Cassandra for high-throughput event data, ensuring efficient read/write patterns.
Implement custom graph-based pipelines to populate and update Amazon Neptune clusters via periodic jobs.
Facilitate ad-hoc analytics by organizing data lakes on Amazon S3 and configuring Amazon Athena schemas for effective, self-service querying.
Automate infrastructure and deployments through Terraform or CloudFormation, integrating CI/CD processes for microservices and Infrastructure-as-Code (IaC) changes.
Monitor and troubleshoot comprehensive data flows using AWS CloudWatch, Kubernetes dashboards, and Kafka tools, implementing alerts on key Service Level Agreements (SLAs).
Ensure robust data quality and governance by embedding validation checks, evolving schema strategies, and documenting all data pipelines.
Work closely with cross-functional teams to onboard new data sources, supporting analytical and machine learning use cases.
Document processes and facilitate knowledge transfer to ensure long-term maintainability and continuous improvement within the nearshore team.

Required Skills & Experience:
3+ years of AWS data engineering experience, including proficiency with MSK, EKS, S3, Athena, and Neptune.
Advanced programming skills in Python and Java for developing transformation microservices in Kubernetes (EKS).
Experience with Oracle GoldenGate for real-time change data capture from legacy systems to Amazon MSK (Kafka).
Proficiency with NoSQL databases, specifically Apache Cassandra, for high-throughput event data management.
Comprehensive understanding of Infrastructure-as-Code (IaC) using Terraform and CloudFormation.
Intermediate to advanced skills in CI/CD methodologies, utilizing tools such as Jenkins, GitHub Actions, or AWS CodePipeline.
Hands-on experience with container orchestration using Docker and Kubernetes.
Advanced knowledge of Apache Kafka for data parsing, enrichment, and restructuring in data pipelines.
Practiced expertise in implementing graph-based data architectures using Amazon Neptune.
Skilled in data lake management and analytics using Amazon S3 and configuring Amazon Athena schemas.
Ability to automate monitoring and alert systems using AWS CloudWatch.
Excellent written and verbal communication skills, especially for remote collaboration with cross-functional teams.

Good to Have:
Experience with Apache Spark for distributed data processing and analytics.
Familiarity with ElasticSearch for advanced search capabilities and data exploration.
Certification in AWS Certified Solutions Architect or AWS Certified DevOps Engineer.
Exposure to data governance frameworks and tools for maintaining data quality.
Knowledge of RESTful API development and integration.
Strong problem-solving skills and adaptability in fast-paced environments.
Experience with advanced data visualization tools, such as Tableau or Power BI.
Proven experience in agile methodologies for project management.

Soft Skills:
Strong written and verbal communication skills, essential for creating clear documentation of data pipelines and collaborating effectively using remote communication tools like Slack and Confluence.
Excellent problem-solving abilities, crucial for troubleshooting complex AWS data flows and ensuring the robustness of infrastructure deployments and microservices.
Effective teamwork and collaboration skills, demonstrated by working closely with onshore and cross-functional teams to align on deliverables and expectations.
Adaptability and flexibility in handling dynamic project requirements, shown by integrating tooling such as CI/CD processes and infrastructure automation with Terraform and CloudFormation.
Leadership in managing remote collaborations by delivering proactive updates on progress and coordinating with various stakeholders to meet timelines and objectives.

Why you will love Lean Tech:
Join a powerful tech workforce and help us change the world through technology
Professional development opportunities with international customers
Collaborative work environment
Career path and mentorship programs that will take you to new levels.

Join Lean Tech and contribute to shaping the data landscape within a dynamic and growing organization. Your skills will be honed, and your contributions will play a vital role in our continued success. Lean Tech is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.




