Senior Data Engineer

Brazil

Lean Tech



Company Overview: Lean Tech is a rapidly expanding organization headquartered in Medellín, Colombia. We pride ourselves on possessing one of the most influential networks within software development and IT services for the entertainment, financial, and logistics sectors. Our corporate projections offer many opportunities for professionals to elevate their careers and experience substantial growth. Joining our team means engaging with expansive engineering teams across Latin America and the United States, contributing to cutting-edge developments in multiple industries. We are seeking a dedicated Senior Data Engineer to join our team.

Position Title: Senior Data Engineer

Location: Remote - LATAM

What you will be doing: The Senior Data Engineer will be responsible for the development, optimization, and maintenance of Extract, Transform, Load (ETL) processes using Python, focusing on building a data ingestion engine to support customer reporting solutions. Your responsibilities will include:
  • Develop, optimize, and maintain ETL processes using Python.
  • Troubleshoot ETL-related issues to ensure data accuracy and integrity.
  • Collaborate with departmental teams to understand data and reporting requirements.
  • Implement best practices for data retrieval and reporting performance.
  • Maintain documentation of ETL processes and data architectures.
  • Automate the ingestion of external data via Snowflake, including validation and transformation.
  • Create standard SQL queries for report generation and upload files to a server.
  • Ensure seamless delivery of reports, including email notifications for customers.
  • Work closely with stakeholders to define and refine data requirements and reporting strategies.
  • Develop and maintain data pipelines to support analytics and business intelligence efforts.
Project Scope

Initial Scoping
  • Receiving the raw data from external solution via Snowflake share.
  • Lightweight tracking of a few report attributes, such as codes, state purchased, and delivery destination.
  • Manual validation of codes to ensure they are valid.
  • Creating a standard SQL query to pull together a report per Product specification.
  • Uploading the report file(s) to a server for the customer to download.
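The scoping steps above could be sketched in Python. Everything here is illustrative: the table name, the column names, and the code format (assumed to be 12 alphanumeric characters) are placeholders, not taken from any Product specification.

```python
import re

# Assumed code format for validation: 12 uppercase alphanumerics (illustrative only).
CODE_PATTERN = re.compile(r"^[A-Z0-9]{12}$")

def validate_codes(codes):
    """Split raw codes into valid and invalid lists (mirrors the manual validation step)."""
    valid, invalid = [], []
    for code in codes:
        (valid if CODE_PATTERN.match(code) else invalid).append(code)
    return valid, invalid

def build_report_query(table, state):
    """Compose a standard per-Product report query against the shared table.

    In Snowflake, the shared data would live in a database created from the
    inbound share; the fully qualified table name here is a placeholder.
    """
    return (
        "SELECT code, state_purchased, delivery_destination "
        f"FROM {table} "
        f"WHERE state_purchased = '{state}' "
        "ORDER BY code"
    )

valid, invalid = validate_codes(["ABC123DEF456", "bad-code"])
query = build_report_query("RAW_SHARE.PUBLIC.CODES", "TX")
```

The query string would then be executed against the share (e.g. via the Snowflake Python connector) and the result written out as the report file.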
 Generally Available Solution
  • Automate the regular ingestion of external solution data (via Snowflake share) into a landing zone.
  • Automate any transformations required to structure data to generate reports more efficiently.
  • Integrate with data collected from the UI as a system-of-record attribute, including automated validation of codes.
  • Automate the data pipeline and creation of the final report(s).
  • Automate QA consistency checks and integration with data team QA processes.
  • Automate the delivery of reports to the correct customer SFTP location, including email notification to the customer when new data is available.
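A minimal sketch of the last delivery step, assuming a per-customer SFTP folder layout and a stdlib-composed notification email. The path layout, sender address, and helper names are assumptions; the actual SFTP transfer is left as a comment.

```python
from email.message import EmailMessage
from pathlib import PurePosixPath

def customer_sftp_path(customer_id, report_name):
    """Map a customer to its SFTP drop location (assumed layout: /reports/<customer>/)."""
    return str(PurePosixPath("/reports") / customer_id / report_name)

def build_notification(customer_email, report_name):
    """Compose the 'new data available' email sent after each upload."""
    msg = EmailMessage()
    msg["From"] = "reports@example.com"  # placeholder sender address
    msg["To"] = customer_email
    msg["Subject"] = f"New report available: {report_name}"
    msg.set_content(
        f"A new report ({report_name}) has been uploaded to your SFTP folder."
    )
    return msg

path = customer_sftp_path("acme", "daily_codes.csv")
msg = build_notification("ops@acme.example", "daily_codes.csv")
# The actual upload would use an SFTP client here, e.g. paramiko:
#   sftp.put(local_file, path)
# followed by sending msg via smtplib.
```

Keeping the path mapping and message construction as pure functions makes this step easy to unit-test independently of the SFTP server and mail relay.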
 Requirements & Qualifications: To excel in this role, you should possess: 
  • Proficiency in SQL and experience with Snowflake.
  • Strong ETL experience using Python.
  • Experience with large, structured datasets.
  • Ability to collaborate effectively with cross-functional teams.
  • In-depth knowledge of Amazon Web Services (AWS) and Platform as a Service (PaaS) (Good to have).
  • Working knowledge of Bitbucket and Git (Good to have).
  • Foundational knowledge of CI/CD platforms, preferably TeamCity (Good to have).
  • Experience working with data in various formats, including XML, JSON, and CSV (Good to have).
  • Ability to automate data ingestion from Amazon S3, FTP, and other technologies (Good to have).
  • Strong problem-solving skills and attention to detail.
  • Proven ability to manage multiple tasks and meet deadlines.
  • Experience in designing and implementing data architectures.
  • Understanding of data governance and data security best practices.
  • Experience with data warehousing concepts and technologies.
  • Ability to work independently and as part of a team.
  • Strong analytical and critical thinking skills.
  Why you will love Lean Tech:  
  • Join a powerful tech workforce and help us change the world through technology
  • Professional development opportunities with international customers
  • Collaborative work environment
  • Career paths and mentorship programs that will help you reach new levels.
  Join Lean Tech and contribute to shaping the data landscape within a dynamic and growing organization. Your skills will be honed, and your contributions will play a vital role in our continued success. Lean Tech is an equal-opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Category: Engineering Jobs

Tags: Architecture AWS Bitbucket Business Intelligence CI/CD CSV Data governance Data pipelines Data Warehousing Engineering ETL Excel Git JSON Pipelines Python Security Snowflake SQL XML

Perks/benefits: Career development

Region: South America
Country: Brazil
