Senior DevOps Engineer (SRE)
São Paulo-SP, Brazil or Porto Alegre-RS, Brazil
Who We Are
In 2021, Poatek was acquired by WillowTree, an award-winning digital product consultancy recognized as one of the fastest-growing and best places to work in the United States. Now a TELUS Digital company with offices across the globe, we continue to partner with the world’s leading brands, such as HBO, PepsiCo, and Domino’s, to design, build, and transform their digital products and strategies. We’re looking to grow our Poatek offices in Brazil with top talent excited to collaborate with team members across the globe to deliver innovative solutions for our clients.
Location and Flexibility
This is a hybrid role, based in one of our offices: São Paulo (2 times/week, i.e., 8 days/month) or Porto Alegre (3 times/week, i.e., 12 days/month). Our office culture is designed to foster in-person innovation, collaboration, and connection with team members, both local and visiting from other global offices.
The Opportunity
As a Senior DevOps Engineer (SRE), you will play a crucial role in ensuring the continuous delivery of our data products and services to our clients and internal stakeholders. DevOps needs vary by project, so we value flexibility and a willingness to learn.
Responsibilities
Proactive Responsibilities
- Design and implement scalable data pipeline architectures in collaboration with the EDE FTE Engineers.
- Implement and manage automated alerting systems for data pipeline issues (a minimal sketch follows this list).
- Automate repetitive tasks in data processing and management.
- Continuously optimize data pipeline efficiency and reduce operational costs and the number of issues/failures month over month (MoM).
- Implement and manage disaster recovery plans / backup plans.
- Conduct capacity planning for data storage and processing needs.
- Develop and maintain documentation for data pipeline systems and processes, and provide knowledge transfer (KT) to EDE FTE, WillowTree, and TELUS Digital engineers.
- Continuously improve data pipeline reliability through analysis and testing.
- Collaborate with EDE FTE Engineers to improve pipeline reliability.
- Monitor performance and reliability of data pipelines, enterprise datahub, HPBI, and MDM systems.
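To illustrate the kind of alerting automation described above, here is a minimal Python sketch of a pipeline-failure alert. It is a sketch only: `fetch_run_statuses` and the webhook URL are hypothetical placeholders standing in for whatever orchestrator API and alert channel a given project uses.

```python
"""Minimal sketch: alert on failed data pipeline runs via a webhook.

`fetch_run_statuses` and ALERT_WEBHOOK are hypothetical placeholders;
a real implementation would query the project's orchestrator
(e.g. Dataflow, Airflow, or Control-M) and post to the team's channel.
"""
import requests

ALERT_WEBHOOK = "https://example.com/hooks/data-alerts"  # hypothetical endpoint

def fetch_run_statuses():
    # Placeholder: substitute a call to the orchestrator's status API.
    return [{"pipeline": "orders_daily", "state": "FAILED", "run_id": "2024-05-01"}]

def alert_on_failures():
    """Post one alert per failed pipeline run."""
    for run in fetch_run_statuses():
        if run["state"] == "FAILED":
            requests.post(
                ALERT_WEBHOOK,
                json={"text": f"Pipeline {run['pipeline']} failed (run {run['run_id']})"},
                timeout=10,
            )

if __name__ == "__main__":
    alert_on_failures()
```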
Reactive Responsibilities
- Monitor, troubleshoot, and resolve production issues in data processing workflows (leveraging ITSM).
- Participate in on-call rotations for data pipeline incident response.
- Conduct post-incident reviews and implement improvements for data pipelines.
- Maintain infrastructure reliability for data pipelines, enterprise datahub, HPBI, and MDM systems.
Qualifications
- 5+ years of industry experience in data engineering support and enhancement.
- Proficiency with Google Cloud Platform (GCP), AWS, or Azure, including services such as Dataflow, BigQuery, Cloud Storage, and Pub/Sub (a small illustration follows this list).
- Strong understanding of data pipeline architectures and ETL processes.
- Experience using Python for data processing.
- Knowledge of SQL and experience with relational databases.
- Familiarity with version control systems like Git.
- Ability to analyze, troubleshoot, and resolve complex data pipeline issues.
- Software engineering experience in optimizing data pipelines to improve performance and reliability.
- Ability to continuously optimize data pipeline efficiency, reducing operational costs and the number of issues/failures.
- Ability to automate repetitive tasks in data processing and management.
- Experience with monitoring and alerting for data pipelines.
- Ability to continuously improve data pipeline reliability through analysis and testing.
- Experience performing SLA-oriented monitoring for critical pipelines, suggesting improvements for SLA adherence, and implementing them after business approval when needed.
- Experience monitoring the performance and reliability of GCP data pipelines, Informatica ETL workflows, MDM, and Control-M jobs.
- Experience maintaining infrastructure reliability for GCP data pipelines, Informatica ETL workflows, MDM, and Control-M jobs.
- Experience conducting post-incident reviews and implementing improvements for data pipelines.
- Experience developing and maintaining documentation for data pipeline systems and processes.
- Excellent communication and documentation skills.
- Strong problem-solving and analytical skills.
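As a small illustration of the Python and SQL skills listed above, here is a minimal sketch using the official google-cloud-bigquery client to count yesterday's failed pipeline runs. The project, dataset, table, and column names are purely illustrative; only the client API calls are the library's real interface.

```python
"""Minimal sketch: count yesterday's failed pipeline runs in BigQuery.

Table and column names are illustrative; the google-cloud-bigquery
calls (Client.query, QueryJob.result) are the library's real API.
"""
from google.cloud import bigquery

def failed_runs_yesterday(client: bigquery.Client) -> int:
    sql = """
        SELECT COUNT(*) AS n
        FROM `my_project.pipeline_logs.runs`   -- illustrative table
        WHERE state = 'FAILED'
          AND DATE(started_at) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
    """
    rows = client.query(sql).result()  # blocks until the query completes
    return next(iter(rows)).n

if __name__ == "__main__":
    print(failed_runs_yesterday(bigquery.Client()))
```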
Why Poatek?
In addition to being part of an international and innovative consultancy company, you will have:
- Flexible hours and autonomy
- Work with cutting-edge technologies
- Partner with leading global brands
- Collaborative team and learning ecosystem
- Career development plan & growth
- International travel opportunities (optional)
Some of our benefits:
- Health and dental plan
- Life insurance
- Monthly voucher for meals, culture, education, health and mobility
- Child care assistance and more!
We will only use the information you provide to process your application and to produce tracking statistics. Since we do not request personal data deemed sensitive, we ask that you refrain from sharing such information with us.
For more information on how we use your information, see our Privacy Policy.