Data Engineer Tech Lead - OP01764

São Paulo, State of São Paulo, Brazil - Remote


Dev.Pro

A globally distributed software development partner with 850+ tech talents. Result-driven. Quality-obsessed. Scale your business with Dev.Pro.



🟢 Are you in Brazil, Argentina or Colombia? Join us as we actively recruit in these locations, offering a comfortable remote environment. Submit your CV in English, and we'll get back to you!

We invite a Senior Data Engineer to play a key part in a large-scale data modernization effort for a major enterprise client. You’ll lead the Data Engineering team in migrating and transforming complex legacy data pipelines to a modern custom-built cloud environment for improved scalability, maintainability, and compliance. You’ll also collaborate closely with architects, DevOps, QA, and product stakeholders to deliver scalable, reliable data solutions that meet unique business needs.

🟩 What's in it for you:

  • Join a fully integrated delivery team built on collaboration, transparency, and mutual respect
  • Lead a skilled Data Engineering team through high-impact data platform transformation in a production environment
  • Work hands-on with modern, in-demand technologies like GCP, Snowflake, Apache Iceberg, dbt, Airflow, Dataflow, and BigQuery
  • Gain hands-on experience with Google Landing Zones

Is that you?

  • 10+ years of experience in software engineering, with the last 5+ years in data engineering and data warehouse modeling, and a deep understanding of the data lifecycle
  • Proven track record in designing, building, and scaling ETL/ELT pipelines for both batch and streaming data
  • Strong hands-on experience with Google Cloud Platform (GCP) — 2+ years, ideally recent (2021 or later) — including: BigQuery, GCS, Cloud Composer (Airflow), Dataflow, Dataproc, Pub/Sub, Cloud Functions
  • Proficient in Python for ETL scripting, DAG development, and data manipulation
  • Experience using dbt for data transformation, testing, and orchestration
  • Familiar with CI/CD pipelines and Infrastructure as Code (IaC) using tools like Git, Terraform, and Serverless Framework
  • Strong architectural mindset: able to design scalable, cost-efficient, and maintainable data solutions
  • Acts as a tech lead and architect hybrid: drives technical direction (in customer negotiations, planning, and implementation), mentors engineers, and remains hands-on
  • Strong leadership and team collaboration skills; able to influence decisions, set standards, and support delivery
  • Continuously explores new technologies, best practices, and modern data stack tools
  • Degree in Computer Science, Information Systems, Data Engineering, or a related technical field
  • Upper-Intermediate or higher level of English (B2+); excellent communicator and team player

Desirable:

  • Experience building and managing streaming data pipelines and event-driven architectures
  • Experience writing Bash scripts
  • Experience with Java for Dataflow jobs
  • Familiarity with data lakehouse architectures using Iceberg tables
  • Proficiency with Docker for containerizing data pipelines and supporting orchestration
  • Familiarity with AI-assisted tools like GitHub Copilot

🧩 Key responsibilities and your contribution

In this role, you’ll combine hands-on engineering work with technical leadership, acting as a bridge between engineering and solution architecture: guiding technical direction, mentoring engineers, and contributing to code and design to ensure scalable, high-impact project delivery.

  • Review and analyze existing ETL solutions for migration to the new architecture
  • Design, optimize, and migrate batch and streaming data pipelines to the GCP Landing Zone
  • Build and manage data transformations with dbt, supporting ELT pipelines in Snowflake
  • Ensure the new data infrastructure meets performance and quality SLAs/SLOs
  • Implement monitoring and alerting for pipelines to ensure system fault tolerance
  • Develop migration scripts to transfer historical data to Iceberg tables
  • Act as a liaison between the technical team and the client to ensure clear communication
  • Break down complex tasks into smaller, manageable technical deliverables for the team
  • Proactively identify risks and take steps to mitigate them

🎾 What's working at Dev.Pro like?

Dev.Pro is a global company that's been building great software since 2011. Our team values fairness, high standards, openness, and inclusivity for everyone — no matter your background.

🌐 We are 99.9% remote — you can work from anywhere in the world
🌴 Get 30 paid days off per year to use however you like — vacations, holidays, or personal time
✔️ 5 paid sick days, up to 60 days of medical leave, and up to 6 paid days off per year for major family events like weddings, funerals, or the birth of a child
⚡️ Partially covered health insurance after the probation period, plus a wellness bonus for gym memberships, sports nutrition, and similar needs after 6 months
💵 We pay in U.S. dollars and cover all approved overtime
📓 Join English lessons and Dev.Pro University programs, and take part in fun online activities and team-building events

Our next steps:

✅ Submit a CV in English — ✅ Intro call with a Recruiter — ✅ Internal interview — ✅ Client interview — ✅ Offer

Interested? Find out more:

📋 How we work

💻 LinkedIn Page

📈 Our website

💻 IG Page

