Data Engineer
Brazil - Remote
PartnerOne
PartnerOne acquires and grows enterprise software companies for the long term. We combine strong financial resources with an agile, entrepreneurial mindset.

We are seeking a skilled and proactive Data Integration Engineer to join our remote team. In this role, you will be responsible for creating, optimizing, maintaining, and decommissioning data integrations across a diverse and complex hybrid IT landscape, spanning on-premises systems, SaaS platforms, and cloud environments (primarily AWS). You will play a key role in ensuring accurate, complete, and secure data flows that support both business and security initiatives.
Key Responsibilities:
- Design, build, and manage data workflows and integration pipelines across a hybrid IT environment (a brief illustrative sketch follows this list).
- Ensure data integrity, observability, and resilience across internal systems and third-party platforms.
- Collaborate with internal stakeholders to gather requirements and deliver end-to-end integration solutions that are scalable and future-ready.
- Refactor or replace legacy data workflows to support modernized business processes.
- Identify integration inefficiencies and recommend improvements with long-term technical implications in mind.
- Maintain and develop scripts and automation tools using Python, Bash, PowerShell, and other relevant languages.
- Use DevOps practices to implement CI/CD workflows, maintain source control, and deploy changes reliably.
- Work with cloud services (especially AWS), containers, and both Linux and Windows environments.
- Integrate with and develop solutions for Splunk, including custom Splunk Apps (primarily Python-based).
- Collaborate occasionally with external stakeholders to support partner or cross-company integration efforts.
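To give a flavor of the day-to-day work, here is a minimal, hypothetical sketch of the kind of integration script this role involves: pulling records from a fictional SaaS REST API and landing them in an S3 bucket. The endpoint, bucket name, and environment variables are illustrative assumptions, not details of any actual PartnerOne system.

```python
"""Illustrative sketch only: pull records from a hypothetical SaaS API
and land them as JSON in S3. Endpoint, bucket, and env vars are made up."""
import json
import os
from datetime import datetime, timezone

import boto3      # AWS SDK for Python
import requests   # HTTP client for the SaaS API

API_URL = os.environ["SAAS_API_URL"]        # e.g. a hypothetical /v1/records endpoint
API_TOKEN = os.environ["SAAS_API_TOKEN"]    # secret injected by the CI/CD pipeline
BUCKET = os.environ.get("LANDING_BUCKET", "example-data-landing")

def extract() -> list[dict]:
    """Fetch records from the (hypothetical) SaaS API."""
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

def load(records: list[dict]) -> str:
    """Write the raw records to S3, partitioned by extraction timestamp."""
    key = f"raw/records/{datetime.now(timezone.utc):%Y/%m/%d/%H%M%S}.json"
    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key=key,
        Body=json.dumps(records).encode("utf-8"),
    )
    return key

if __name__ == "__main__":
    records = extract()
    key = load(records)
    print(f"Landed {len(records)} records at s3://{BUCKET}/{key}")
```

In practice, a script like this would run under the CI/CD and observability tooling described above rather than as a standalone job.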
Requirements:
- Experience in a similar Data Integration, DevOps, or Systems Engineering role.
- Strong experience with Python, Bash, PowerShell, or similar scripting languages.
- Hands-on experience with hybrid IT environments: on-premises, SaaS, and cloud (especially AWS).
- Experience with containers (e.g., Docker), and cloud-based deployment and monitoring practices.
- Proficient in using CI/CD tools and managing code in source control systems (e.g., Git).
- Strong understanding of secure and reliable data pipeline design and observability.
- Familiarity with integrating and maintaining Splunk, including the development of custom apps (a minimal example of sending data to Splunk appears after this list).
- Ability to gather requirements, communicate tradeoffs, and deliver complete solutions independently.
- Strong communication and collaboration skills with both internal and external stakeholders.
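As a rough, hypothetical illustration of the Splunk integration work mentioned above (not an actual PartnerOne integration), the sketch below forwards a single event to Splunk's HTTP Event Collector; the host, token, and index are placeholder assumptions.

```python
"""Illustrative sketch only: forward one event to Splunk's HTTP Event Collector.
The HEC host, token, and index are placeholder assumptions."""
import os
import time

import requests

HEC_URL = os.environ.get(
    "SPLUNK_HEC_URL",
    "https://splunk.example.com:8088/services/collector/event",
)
HEC_TOKEN = os.environ["SPLUNK_HEC_TOKEN"]   # secret, never hard-coded

def send_event(payload: dict, index: str = "main", sourcetype: str = "_json") -> None:
    """POST a single JSON event to the HEC endpoint."""
    resp = requests.post(
        HEC_URL,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        json={
            "time": time.time(),
            "index": index,
            "sourcetype": sourcetype,
            "event": payload,
        },
        timeout=10,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    send_event({"pipeline": "example-saas-to-s3", "status": "success", "records": 42})
```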
Preferred Qualifications:
- Experience working with infrastructure as code (e.g., Terraform, CloudFormation).
- Familiarity with additional monitoring or logging tools and data orchestration platforms.
- Prior experience supporting security-related or compliance-driven projects.