Data Engineer - Tech Lead
São Paulo
About Digibee
Digibee is an iPaaS that scales integration workflows while reducing cost and technical debt. Rather than require specialized integration experts, Digibee lets every developer quickly build, test, deploy, govern, and monitor integrations across on-premise and cloud environments using a simple but powerful low-code interface.
Founded in São Paulo, Brazil, in 2017 and headquartered in Weston, Florida, our team is widely distributed throughout the Americas. In May 2023, Digibee closed a $60 million Series B funding round to drive our expansion in the United States.
About the role
Join our growing team as a Lead Data Engineer and architect the data backbone that powers Digibee's business intelligence. In this high-impact, hands-on role, you will design, build, and manage Digibee's internal data platform, a position that requires both deep technical expertise and leadership capability: shaping our data architecture, developing scalable data solutions, and driving the strategic use of data across the organization. You will lead the end-to-end data product lifecycle, from planning and development to operational maintenance, ensuring that our data systems are robust, compliant, and cost-effective. As a technical leader within our lean data team, you'll set best practices, guide your engineering partner in execution, deliver data solutions that align with Digibee's strategic objectives, and apply FinOps principles to build efficient, cost-conscious pipelines on Google Cloud Platform (GCP) and best-in-class SaaS tools. If you thrive on shaping data strategy while staying close to the code and driving tangible results, this is your opportunity to make a significant impact.
Key Responsibilities:
- Data Strategy & Architecture: Define and implement data management strategies, building robust data lakes, warehouses, and pipelines that support data-driven decisions. Ensure compliance with data security standards such as GDPR, LGPD, HIPAA, and CCPA.
- Pipeline & Processing Development: Develop and optimize end-to-end ELT/ETL pipelines that connect various data sources (databases, APIs) to user interfaces and analytics platforms. Utilize Google Cloud services, including Cloud Storage, BigQuery, and Pub/Sub.
- Cost & Operational Management: Forecast and manage data-related costs for tools, storage, and processing. Post-implementation, oversee data ingestion, tool evaluation, and ongoing maintenance for continuous improvements.
- Data Visualization & Insights: Recommend and utilize visualization tools to deliver actionable insights and uncover trends across datasets. Communicate findings and project value effectively to stakeholders.
- Team Leadership & Mentorship: Lead data engineering teams, mentor junior engineers, and champion best practices in software engineering, compliance, and cost-effective data management.
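To give a flavor of the pipeline work above, here is a minimal sketch of one common ELT step on GCP: loading a CSV export from Cloud Storage into BigQuery with the official google-cloud-bigquery client. The bucket, dataset, and table names are hypothetical placeholders, not Digibee's actual resources.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # uses Application Default Credentials

# Hypothetical source object and destination table, for illustration only.
SOURCE_URI = "gs://example-bucket/exports/orders.csv"
DEST_TABLE = "example-project.analytics.orders"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the header row
    autodetect=True,      # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(SOURCE_URI, DEST_TABLE, job_config=job_config)
load_job.result()  # block until the load job completes (raises on failure)
print(f"Loaded {client.get_table(DEST_TABLE).num_rows} rows into {DEST_TABLE}")
```

In practice, a step like this would run under an orchestrator (see the DAG sketch further below) with monitoring and cost controls around it.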
Required Qualifications:
- Technical Expertise: Proven experience in large-scale data management, including data lake and warehouse architectures, automation, access control, and segregation of data for security and compliance.
- Programming Skills: Demonstrated proficiency in Python, Java, or Go, plus experience with big data processing frameworks such as Hadoop, Spark, and Kafka.
- Cloud Computing: Extensive experience with Google Cloud Platform (e.g., Cloud Run, Cloud Functions, Pub/Sub); see the Pub/Sub consumer sketch after this list.
- Data Modeling & Compliance: Strong skills in data modeling, ETL processes, and knowledge of data privacy laws and best practices.
- Cost Management: Ability to forecast and manage costs associated with data tools and services, optimizing for scalability and efficiency.
- Communication & Stakeholder Management: Ability to effectively present technical concepts and project impact to both technical and non-technical stakeholders.
Preferred Qualifications:
- Data Visualization & Orchestration: Experience with visualization tools (e.g., Tableau, Power BI, Looker Studio) and orchestration frameworks (e.g., Apache Airflow, Dagster).
- DevOps & CI/CD: Familiarity with Terraform or OpenTofu for infrastructure management and GitLab CI for continuous integration.
Our Perks and Benefits
- We're remote-first, with a flexible working schedule
- Health care
- R$ 1.200,00/month on Caju card (for food and meal allowance, mobility, home office supplies, culture, health, and education)
- Life insurance
- Child care assistance
- Gympass
- English course: we have a partnership offering group classes for R$100 per month
Our culture
We believe in a highly collaborative work environment that fosters constant development and exchange between teams. We encourage learning, sharing knowledge, and using new technologies to create disruptive ideas - we want to create something great together!
At Digibee, we know it's our people who make the difference. We embrace and value diversity and are dedicated to encouraging a supportive and respectful culture in our community.