Senior Data Engineer
US Timezones
Dynamic
Dynamic combines authentication, smart wallets, and secure key management into one flexible SDK, offering the broadest coverage across chains and third-party wallets.

Dynamic started with a simple vision: every app and website will have a wallet. Three years in, that vision is no longer just an idea. It's happening now. Wallets are no longer just for crypto apps. They're becoming the backbone of fintech, payroll, and global remittances. They power faster, cheaper, and more accessible transactions. The best crypto apps, like Ondo Finance, Story, and Magic Eden, already run on Dynamic. Now, the world's top fintech and HR platforms are integrating wallets and payments through Dynamic, tapping into crypto rails. We are at a pivotal moment as we scale from supporting leading crypto apps to becoming the wallet infrastructure of the internet.
Why join Dynamic now?
Own the next wave of apps and fintechs: Your work will directly impact how the world's biggest fintech players adopt wallets and stablecoin payments.
Join at the perfect moment: We're scaling fast, but still early enough that your contributions will define our trajectory.
Build the foundation of modern money: Backed by a16z crypto, Founders Fund, and other top investors, we're making money more connected across chains and ecosystems.
Our product:
Check out a product demo here
What we are looking for:
As a Data Engineer at Dynamic, you will play a critical role in enabling seamless access to data across our internal research and product teams. You'll be responsible for building and optimizing the pipelines and infrastructure that power our data aggregation, transformation, and analysis efforts. Your work will ensure our analysts and researchers can efficiently work with data to drive insights and product innovation.
You'll collaborate cross-functionally with stakeholders across Research, Product, and Engineering to improve internal data systems, automate workflows, and contribute to the future of data science at Dynamic.
The anticipated U.S. base salary range for this full-time position is $165,000–$200,000 for candidates located in the NY/Bay Area. In addition to base compensation, we offer a comprehensive total rewards package that includes equity and a competitive benefits program.
Actual base salary will be determined based on a variety of factors, including the scope and responsibilities of the role, required skills and experience, and your geographic location. Salary ranges are reviewed annually and may be adjusted based on market trends and internal equity. Offers are made within the applicable range at the time of hiring.
You will be a fantastic fit for this role if:
You have 3+ years of experience as a Data Engineer, Analytics Engineer, or similar role focused on building and maintaining data pipelines in production environments.
You're fluent in Python and SQL.
You have experience with ETL workflows, data modeling, and infrastructure to support high-quality, reliable data delivery.
You've worked on automating data workflows and scaling internal analytics systems.
You're comfortable collaborating cross-functionally with technical and non-technical stakeholders.
To ensure timezone overlap with our team, you are based in the Americas.
Nice to Haves:
2 to 3 years of experience building in the Web3 space for a notable Web3 company
You're excited about Web3 and fintech, and want to work on infrastructure used by millions of users.
Experience using dbt and Snowflake
You are located close to NYC, SF, Chicago, or Floripa (Brazil)
You will:
Build and maintain data pipelines: Develop and optimize ETL processes that power research and analytics workflows across the company.
Support internal data access: Ensure researchers and analysts have timely and seamless access to data needed for protocol analysis and product development.
Automate internal processes: Identify and automate repetitive or manual data workflows to improve efficiency and scalability across teams.
Collaborate on infrastructure and tooling: Work with Product, Research, and Engineering to ensure data systems are reliable, extensible, and aligned with team needs.
Develop and maintain educational tools: Assist in creating content and resources to help internal teams level up their data literacy and technical skills.
Drive data quality and structure: Champion data integrity, consistency, and best practices across our internal datasets and pipelines.
Contribute to long-term strategy: Help shape the evolution of our data architecture and prepare systems for advanced analysis and future data science efforts.
Tags: Architecture, Crypto, Data pipelines, Data quality, dbt, Engineering, ETL, Finance, FinTech, Pipelines, Python, Research, Snowflake, SQL
Perks/benefits: Competitive pay, Equity / stock options