Staff Data Engineer, Revenue & Growth Data
Remote - Poland
Dropbox
Discover Dropbox – secure, easy cloud storage for file sharing and collaboration. With Dash, save time and organize all your company content in one place.
Role Description
Dropbox is rebuilding the foundation of its monetization and financial data systems, and at the center of that effort is our revenue and growth data platform. We’re looking for a Staff Data Engineer to lead the design and delivery of this critical foundation, enabling the insights and systems that drive ARR tracking, financial reporting, and product monetization.
You’ll define architectural direction, elevate engineering standards across teams, and build scalable systems that directly power executive reporting, monetization experiments, and product strategy. Your work will have executive visibility and serve as a linchpin for Dropbox’s monetization and growth strategy.
We’re seeking an engineer who combines deep technical expertise with a strong sense of ownership and the ability to drive change across org boundaries. If you are passionate about building for scale, reliability, and business impact, this is your opportunity.
Our Engineering Career Framework is viewable by anyone outside the company and describes what’s expected of our engineers at each career level. Check out our blog post on this topic and more here.
Responsibilities
- Architect the next-generation data platform for ARR, revenue attribution, and growth analytics: setting vision, driving alignment, and delivering at scale.
- Own and evolve core data models and systems used across Finance, Product, and Analytics teams, ensuring accuracy, trust, and accessibility.
- Lead platform modernization, including the adoption of scalable lakehouse architectures (Databricks, Spark, Delta), CI/CD for data, and observability frameworks.
- Drive adoption of scalable data practices across Dropbox through reusable tooling, process improvements, and cross-team collaboration.
- Partner with stakeholders (e.g., Finance, Product, Data Science, and Infrastructure) to understand data needs and deliver solutions that drive real business outcomes.
- Mentor and grow junior engineers, and cultivate a high-performing, innovation-driven team culture.
Requirements
- BS degree in Computer Science or related technical field involving coding (e.g., physics or mathematics), or equivalent technical experience.
- 10+ years building large-scale data systems, with a demonstrated track record of technical leadership, including ownership of architectural direction and cross-team platform work.
- Proven ability to set architectural direction, lead platform evolution, and influence technical strategy across teams.
- Deep hands-on expertise with Spark, Spark SQL, and Databricks, along with experience orchestrating data pipelines using Apache Airflow, and writing performant, maintainable Python and SQL code.
- Track record of implementing data quality, testing, and observability systems at scale.
- Experience supporting monetization, financial, or product growth analytics with trusted and governed data models.
- Familiarity with cloud platforms (AWS, GCP, or Azure) and lakehouse paradigms (e.g., Delta Lake, Iceberg).
Preferred Qualifications
- Experience leading data migrations or platform transformations (e.g., from on-prem to cloud, Hadoop to Databricks).
- Familiarity with tools for data contracts, lineage, and governance (e.g., dbt, Monte Carlo, Great Expectations).
- Understanding of data privacy and compliance frameworks, including GDPR, SOX, and audit-readiness.
Compensation
Dropbox applies increased tax deductible costs to remuneration earned by certain qualifying employees (to the extent an employee will be involved in the creation of the software as an “author”) for the transfer of copyrights, in accordance with the relevant provisions of the Personal Income Tax Act.
Poland Pay Range: 249 900 zł to 338 100 zł
*Salary range is an estimate based on our AI, ML, Data Science Salary Index.
Perks/benefits: Career development, startup environment