Senior Data Engineer
New York, New York, United States
About 21Shares
21Shares makes investing in digital assets as easy as buying shares through your conventional broker or bank. Investors can purchase our crypto ETPs easily, safely, and securely within a regulated framework on the SIX Swiss Exchange in USD, Euros, and GBP; on BX Swiss in CHF; on Boerse Stuttgart in Euros; and on DB Xetra and Wiener Boerse (Vienna Exchange) in Euros. We offer the most expansive suite of crypto ETPs available on regulated European exchanges.
Founded in 2018, 21Shares is led by a team of talented entrepreneurs and experienced professionals from the asset management and banking industries. Headquartered in Zurich, the company has launched several exchange-traded products, such as NEAR and ONDO, in the last twelve months.
About the Role
We are seeking a highly motivated Senior Data Engineer with strong team leadership skills to join our growing team of analytics experts in our New York office. You will be responsible for designing, expanding, and optimizing our data infrastructure and pipeline architecture while mentoring and leading a team of data engineers. You will work with cutting-edge technologies, playing a key role in ensuring efficient data flow and collection across cross-functional teams and helping to establish best practices and drive technical excellence. The ideal candidate is a hands-on builder and leader, experienced in developing scalable data systems, building robust data pipelines, managing complex datasets, and fostering a high-performing team environment.
Our culture values diversity, communication, collaboration, and a shared passion for leveraging data to drive insights, innovation, and business outcomes.
Responsibilities and Scope
- Design, build, and maintain robust and scalable data pipeline architectures.
- Assemble large, complex datasets that meet both functional and non-functional business requirements.
- Identify, design, and implement internal process improvements, including automation of manual workflows, optimization of data delivery, and re-architecting infrastructure for greater scalability and reliability.
- Design, build, and optimize ETL infrastructure to enable scalable, high-quality data workflows across diverse sources, leveraging SQL and modern data processing frameworks.
- Build analytics tools that utilize the data pipeline to deliver actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Collaborate with stakeholders across Executive, Product, Data, and Design teams to resolve data-related technical issues and ensure their data infrastructure needs are met.
- Ensure data integrity, separation, and security across multiple data centers and AWS regions.
- Create data tools and frameworks to empower analytics and data science teams in building and optimizing products that drive innovation and establish market leadership.
- Lead and mentor a small team of data engineers, fostering a culture of technical excellence, collaboration, and continuous improvement.
- Provide technical guidance, set coding standards, conduct code reviews, and support career development for team members.
- Work closely with data and analytics experts to continually enhance the functionality, reliability, and scalability of our data systems.
What You Will Need To Be Great In This Role
- 6+ years of experience in a Data Engineering role, designing, building, and managing scalable and reliable data systems.
- Proficient with big data and stream-processing technologies such as Spark and Kafka.
- Hands-on experience with cloud platforms, particularly AWS services like EC2 and RDS.
- Skilled in building and orchestrating data pipelines using tools like Airflow.
- Experience with Databricks for scalable data processing and advanced analytics.
- Strong knowledge of SQLMesh for modern data workflow management.
- Extensive experience integrating and working with external data sources via REST APIs, GraphQL endpoints, and SFTP servers.
- Strong communication skills and leadership capabilities.
Databases and Data Management:
- Expertise with relational and NoSQL databases, including Postgres and MongoDB.
- Solid understanding of data modeling, data governance, and data security best practices.
Programming and Development:
- Proficient in Python for data engineering, automation, and workflow scripting.
- Familiarity with software engineering best practices, including version control, testing, and CI/CD pipelines for data workflows.
- Experience with JavaScript and TypeScript is a plus.
Analytics, Visualization, and BI:
- Skilled in implementing and supporting self-service BI tools to enable business teams with accessible, actionable insights.
- Experience with Streamlit for building interactive data visualizations is a plus.
Blockchain and Financial Data Expertise:
- Knowledge of blockchain technology and the cryptocurrency ecosystem is a nice-to-have, as is a strong interest in staying up to date with emerging trends.
- Experience working with financial datasets and financial engineering concepts is considered a strong advantage.
Our Stack
We work with a modern and evolving technology stack, including but not limited to:
- Cloud Infrastructure: AWS for cloud services and infrastructure management
- Databases: PostgreSQL for relational data, MongoDB for non-relational (NoSQL) data, and Redis for caching and real-time data management
- Backend: NestJS (Node.js, TypeScript) and Python for building scalable backend services
- Frontend: React for web applications, Streamlit for building interactive data visualizations
- Data Engineering: Airflow and SQLMesh for data pipeline orchestration and modern workflow management
- Big Data & Processing: Databricks and Kafka for scalable data processing, analytics, and streaming
- Integrations & APIs: Extensive use of REST APIs, GraphQL, SFTP, and Slack integrations to enable seamless data exchange and operational workflows
- Messaging & Event Streaming: Kafka for real-time data pipelines and event-driven architectures
We are continuously exploring and integrating new technologies to meet business needs and drive innovation.
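To give a concrete flavor of the orchestration work in this stack, below is a minimal, hypothetical sketch of an Airflow DAG using the TaskFlow API. The pipeline name, tasks, schedule, and values are illustrative assumptions only, not our production code.

# Hypothetical illustration of an Airflow DAG; names and logic are examples only.
from datetime import datetime

from airflow.decorators import dag, task


# The "schedule" parameter assumes Airflow 2.4 or newer.
@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["example"])
def market_data_pipeline():
    """Extract raw market data, transform it, and load it for analytics."""

    @task
    def extract() -> list[dict]:
        # In practice this step might call a REST API or pull files over SFTP.
        return [{"asset": "BTC", "price_usd": 65_000.0}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Normalize and enrich the raw records (conversion rate is a placeholder).
        return [{**r, "price_chf": r["price_usd"] * 0.88} for r in records]

    @task
    def load(records: list[dict]) -> None:
        # In practice this step might write to PostgreSQL or a Databricks table.
        print(f"Loaded {len(records)} records")

    # Chain the tasks: extract -> transform -> load.
    load(transform(extract()))


# Instantiating the DAG at module level registers it with the Airflow scheduler.
market_data_pipeline()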
This role is based in New York City, and you will be expected to work from our New York office Monday through Wednesday.
Compensation (NYC Only)
Pursuant to Section 8-102 of Title 8 of the New York City Administrative Code, the base salary range for this role is $170,000.00 - $220,000.00. Total compensation packages are based on various factors unique to each candidate, including but not limited to skill set, years and depth of experience, certifications, and specific office location.
Perks/benefits: Career development