Data Engineer
Vicente López, Argentina
Magoya
Digital Product Development for Agri Businesses. With a 100% service-focused approach, we develop custom solutions & teams to drive growth and sustainability.🌟 Join Magoya! 🌟
About Magoya:
Magoya is a growing AgTech company focused on Ag Digital Product Development Services. We help Agri-businesses in the US thrive by providing innovative digital solutions tailored to their unique needs. At Magoya, we prioritize customer success and are committed to delivering exceptional service and results. 🚀
Role Overview:
We are looking for a skilled Data Engineer to design, build, and maintain robust data pipelines and scalable architectures that support analytics and reporting needs across the organization. This role involves integrating various data sources, optimizing data processes, and ensuring high data quality and reliability. You’ll collaborate closely with stakeholders to deliver clean, structured datasets while actively contributing to data architecture strategies.
Key Responsibilities:
- Design, develop, and maintain data extraction, transformation, and loading pipelines (ETL/ELT).
- Optimize data structures and processes to ensure performance, scalability, and data quality.
- Integrate diverse internal and external data sources, including APIs, databases, and cloud-based storage solutions such as Amazon S3, Azure Blob Storage, or Google Cloud Storage.
- Manage and enhance data storage systems, such as data warehouses (e.g., Amazon Redshift, Azure Synapse, BigQuery), data lakes, and cloud-based data platforms.
- Collaborate with stakeholders to provide reliable, clean, and structured datasets for analytics and reporting.
- Participate actively in data architecture decisions and strategy discussions.
- Monitor, troubleshoot, and ensure the reliable operation of data pipelines in production environments.
- Apply best practices in version control, testing, and automation of data ingestion and processing workflows.
- Document technical processes, data structures, and architectural decisions clearly and thoroughly.
Skills & Qualifications:
- Strong proficiency in programming languages for data manipulation, particularly Python.
- Experience with workflow orchestration tools (e.g., Apache Airflow, Azure Data Factory, AWS Glue, or Google Cloud Composer).
- Familiarity with cloud services and storage solutions (AWS, Azure, GCP, or equivalent).
- Knowledge of relational and non-relational databases, data warehousing, and data lakes.
- Experience with data modeling and designing scalable data architectures.
- Proficiency in version control systems (e.g., Git) and familiarity with CI/CD practices.
- Ability to apply software engineering best practices to data engineering and data science workflows.
What We Offer:
- A chance to work with a fast-growing AgTech company committed to innovation and customer success.
- A collaborative and supportive environment.
- Competitive compensation with performance-based incentives.
- Opportunities for professional development and growth.
Are you ready to make an impact in AgTech? Join us! 💪