Intermediate Data Engineer - OP01793
São Paulo, State of São Paulo, Brazil - Remote
Dev.Pro
A globally distributed software development partner with 850+ tech talent. Results-driven. Quality-obsessed. Scale your business with Dev.Pro. 🟢 Are you in Brazil, Argentina, or Colombia? Join us as we actively recruit in these locations, offering a comfortable remote environment. Submit your CV in English, and we'll get back to you!
We invite an Intermediate Data Engineer to contribute to a large-scale data modernization effort for a major enterprise client. You’ll help migrate and transform complex legacy data pipelines to a modern, custom-built cloud environment for improved scalability, maintainability, and compliance. You’ll work closely with architects, DevOps, QA, and product stakeholders to deliver scalable, reliable data solutions that meet unique business needs.
🟩 What's in it for you:
- Join a fully integrated delivery team built on collaboration, transparency, and mutual respect
- Contribute to high-impact data platform transformation and gain experience with Google Landing Zones
- Work hands-on with modern, in-demand technologies like GCP, Snowflake, Apache Iceberg, dbt, Airflow, Dataflow, and BigQuery
✅ Is that you?
- 3+ years in data engineering and data warehouse modeling
- Strong proficiency in designing and building ETL pipelines for large data volumes and streaming solutions
- Expert-level SQL skills and experience with Snowflake and Apache Iceberg tables
- Hands-on experience with GCP services (BigQuery, GCS, Airflow, Dataflow, Dataproc, Pub/Sub)
- Proficiency in Python for ETL scripting and DAG development (an illustrative sketch follows the Desirable list below)
- Experience using dbt for data transformation and orchestration
- Familiarity with CI/CD processes and tools (Git, Terraform, Serverless)
- Degree in Computer Science, Data Engineering, Information Systems, or related fields
- Strong communication and collaboration abilities
- Upper-Intermediate+ English level
Desirable:
- Experience building and managing streaming data pipelines and event-driven architectures
- Experience writing Bash scripts
- Experience with Java for Dataflow jobs
- Familiarity with data lakehouse architectures using Iceberg tables
- Proficiency with Docker for containerizing data pipelines and supporting orchestration
- Familiarity with AI-assisted tools like GitHub Copilot
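To give a feel for the Python DAG development mentioned above, here is a minimal, hypothetical Airflow 2.x sketch: a daily pipeline that loads raw files from GCS into a BigQuery staging table and then triggers dbt models. The bucket, dataset, table, and dbt selector names are illustrative assumptions, not details of the actual project.

```python
# A minimal, hypothetical Airflow 2.x DAG sketch (requires the Google provider
# package). All resource names are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_sales_load",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load the day's raw Parquet files from GCS into a BigQuery staging table
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_to_bq",
        bucket="example-landing-bucket",                     # assumed bucket
        source_objects=["sales/{{ ds }}/*.parquet"],
        source_format="PARQUET",
        destination_project_dataset_table="analytics_staging.sales_raw",
        write_disposition="WRITE_TRUNCATE",
    )

    # Run the dbt models that transform staged data into marts
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --select staging+ --profiles-dir /opt/dbt",
    )

    load_raw >> run_dbt
```

In a real project, connection IDs, retries, and alerting callbacks would be added to match the team's monitoring standards.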
🧩 Key responsibilities and your contribution
In this role, you'll be actively involved in key data engineering activities, helping ensure the project’s success and timely delivery.
- Review and analyze existing ETL solutions for migration to the new architecture
- Design, optimize, and migrate batch and streaming data pipelines to the GCP Landing Zone (see the streaming sketch after this list)
- Build and manage data transformations with dbt, supporting ELT pipelines in Snowflake
- Ensure the new data infrastructure meets performance and quality SLAs/SLOs
- Implement monitoring and alerting for pipelines to ensure system fault tolerance
- Develop migration scripts to transfer historical data to Iceberg tables
- Collaborate closely with the team and other stakeholders to align on data requirements and solutions
- Participate in code reviews, design discussions, and technical planning
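As a hedged illustration of the streaming work referenced above, the snippet below sketches an Apache Beam pipeline (the Python SDK used by Dataflow) that reads messages from Pub/Sub and appends them to BigQuery. The topic, table, and schema are placeholder assumptions, not taken from the client's environment.

```python
# A hypothetical Apache Beam (Python SDK) streaming sketch of a
# Pub/Sub -> BigQuery pipeline of the kind that could run on Dataflow.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming mode so the pipeline keeps consuming from Pub/Sub
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/sales-events"  # assumed topic
            )
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:analytics.sales_events",       # assumed table
                schema="order_id:STRING,amount:FLOAT,created_at:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

Dead-letter handling, windowing, and schema management would of course depend on the project's actual requirements.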
🎾 What's working at Dev.Pro like?
Dev.Pro is a global company that's been building great software since 2011. Our team values fairness, high standards, openness, and inclusivity for everyone, no matter your background.
🌐 We are 99.9% remote — you can work from anywhere in the world
🌴 Get 30 paid days off per year to use however you like — vacations, holidays, or personal time
✔️ 5 paid sick days, up to 60 days of medical leave, and up to 6 paid days off per year for major family events like weddings, funerals, or the birth of a child
⚡️ Partially covered health insurance after the probation period, plus a wellness bonus for gym memberships, sports nutrition, and similar needs after 6 months
💵 We pay in U.S. dollars and cover all approved overtime
📓 Join English lessons and Dev.Pro University programs, and take part in fun online activities and team-building events
Our next steps:
✅ Submit a CV in English — ✅ Intro call with a Recruiter — ✅ Internal interview — ✅ Client interview — ✅ Offer