Data Engineer
Remote - United States
Paxos
Paxos builds regulated blockchain and digital asset solutions for global leaders in financial services. Designed for enterprises. Built for the future of finance.
About Paxos
Today’s financial infrastructure is archaic, expensive, inefficient and risky — supporting a system that leaves out more people than it lets in. So we’re rebuilding it.
We’re on a mission to open the world’s financial system to everyone by enabling the instant movement of any asset, any time, in a trustworthy way. For over a decade, we’ve built blockchain infrastructure that tokenizes, custodies, trades and settles assets for the world’s leading financial institutions, like PayPal, Venmo, Mastercard and Interactive Brokers.
About the team
The Data Engineering team at Paxos builds and maintains the data infrastructure that powers our engineering partners and analytics teams. Our mission is to provide scalable, reliable, and secure data access to support critical business needs, including product analytics, business intelligence, client and regulatory reporting, and operational efficiencies. By continuously enhancing our data platform, we drive innovation, optimize costs, and unlock new revenue opportunities.
You’ll be working closely with talented colleagues such as Joe, Chris and Mike, collaborating across engineering, security, and analytics teams.
About the role
We are looking for a Data Engineer to help scale and optimize our data platform. You will play a key role in ensuring efficient data management, enforcing governance policies, and driving automation across our cloud infrastructure. This role is ideal for someone passionate about data architecture, security, and performance optimization in data warehouses (Snowflake/Redshift/BigQuery) and/or cloud infrastructure (AWS/GCP/Azure).
Your responsibilities will include managing and scaling our Snowflake environment, implementing role-based access controls (RBAC), and leveraging AWS services and Infrastructure as Code (IaC) to streamline operations (a brief illustrative sketch follows the list below). You’ll work with a cutting-edge tech stack, including:
- AWS: EKS/Kubernetes, MSK/Kafka, RDS/Postgres, Lambda, Glue, S3
- Snowflake: RBAC, Data Governance & Security, Account Management, Warehouse Optimization
- Infrastructure as Code: Terraform/Pulumi
- Data Tooling: DBT, Airbyte/Fivetran, Debezium, Dagster/Airflow, Acryl/DataHub, Monte Carlo, Looker
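To give a flavor of the RBAC and account-management side of this work, below is a minimal sketch of automating Snowflake grants from Python. It is an illustration only: the role, warehouse, database, and environment-variable names are hypothetical placeholders, and a real grant set would be driven by our governance policies.

```python
# Minimal sketch: applying a Snowflake RBAC grant set from Python.
# All object names (ANALYTICS_READER, ANALYTICS_WH, ANALYTICS_DB) are
# hypothetical placeholders, not actual Paxos resources.
import os

import snowflake.connector

GRANTS = [
    "CREATE ROLE IF NOT EXISTS ANALYTICS_READER",
    "GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE ANALYTICS_READER",
    "GRANT USAGE ON DATABASE ANALYTICS_DB TO ROLE ANALYTICS_READER",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS_DB.PUBLIC TO ROLE ANALYTICS_READER",
]


def apply_grants() -> None:
    # Credentials come from the environment (e.g. injected by a CI job).
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="SECURITYADMIN",  # a role allowed to manage grants
    )
    try:
        cur = conn.cursor()
        for stmt in GRANTS:
            cur.execute(stmt)
    finally:
        conn.close()


if __name__ == "__main__":
    apply_grants()
```

In practice the same grants can be declared in Terraform so they are versioned and reviewed like any other infrastructure change.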
This is a pivotal moment for our team, and your contributions will directly shape the future of our data ecosystem.
What you’ll do
- Shape the future of our data infrastructure by designing, scaling, and optimizing our data architecture and cloud infrastructure.
- Collaborate across teams to define and enforce data governance, security policies, and access controls.
- Develop and maintain scalable data pipelines and ELT frameworks using AWS-native services, DBT, and Snowflake.
- Oversee Snowflake account management, ensuring resource optimization, RBAC implementation, and compliance with best practices.
- Automate and streamline data ingestion, monitoring, and access management across AWS and Snowflake.
- Implement Infrastructure as Code (IaC) using Terraform to efficiently manage AWS and Snowflake resources.
- Ensure data quality and reliability by designing and implementing validation frameworks and automated testing (see the sketch after this list).
- Advocate for modern data tooling and architectures, driving efficiency and scalability.
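As one concrete example of the data-quality work called out above, here is a minimal sketch of an automated validation job that runs a few SQL checks against Snowflake and fails loudly if any of them find violations. Table, column, and environment-variable names are hypothetical placeholders, not actual Paxos schemas.

```python
# Minimal sketch: SQL-based data-quality checks against Snowflake.
# ANALYTICS_DB.PUBLIC.ORDERS and its columns are hypothetical placeholders.
import os

import snowflake.connector

# Each check is (description, SQL returning the number of violating rows).
CHECKS = [
    (
        "orders table is not empty",
        "SELECT CASE WHEN COUNT(*) = 0 THEN 1 ELSE 0 END FROM ANALYTICS_DB.PUBLIC.ORDERS",
    ),
    (
        "no orders with a NULL customer_id",
        "SELECT COUNT(*) FROM ANALYTICS_DB.PUBLIC.ORDERS WHERE CUSTOMER_ID IS NULL",
    ),
]


def run_checks() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    failures = []
    try:
        cur = conn.cursor()
        for name, sql in CHECKS:
            violations = cur.execute(sql).fetchone()[0]
            if violations:
                failures.append(f"{name}: {violations} violating row(s)")
    finally:
        conn.close()
    if failures:
        raise AssertionError("Data quality checks failed:\n" + "\n".join(failures))


if __name__ == "__main__":
    run_checks()
```

Checks like these are typically scheduled from the orchestrator (Dagster/Airflow) or expressed as DBT tests rather than run ad hoc.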
About you
You have 3+ years of experience in data engineering, with a background in:
- Cloud Infrastructure & Security – Familiarity with AWS services, including EKS/Kubernetes, MSK/Kafka, RDS/Postgres, Lambda, Glue, and S3, along with security best practices (IAM, KMS, ASM).
- Infrastructure as Code (IaC) – Proficient in Terraform to manage AWS and Snowflake resources.
- Cloud Data Warehousing – Hands-on experience with Snowflake, Redshift, BigQuery, or Azure Data Warehouse, including account management, RBAC, data governance & security, and warehouse optimization.
- SQL & Performance Optimization – Strong skills in query optimization and performance tuning within cloud-based data warehouses.
- Programming & Automation – Proficiency in Python (or similar) for automation, data pipeline development, and workflow orchestration.
- Workflow Orchestration – Experience with Dagster, Airflow, AWS Step Functions, or Azure Data Factory to manage data workflows (see the sketch after this list).
- ETL/ELT Design & Maintenance – Experience in building and maintaining scalable, reliable data pipelines.
- Data Governance & Compliance – Understanding of security, governance, and compliance best practices for handling sensitive data.
- Modern Data Tooling – Experience with tools like DBT, Airbyte, Debezium, Dagster, Acryl, Monte Carlo, and Looker to enhance data reliability and observability.
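To illustrate the orchestration bullet above, here is a minimal Dagster sketch: two software-defined assets, where the downstream asset depends on the upstream one simply by naming it as a parameter. The asset names and data are hypothetical; an equivalent pipeline could be written as an Airflow DAG or a Step Functions state machine.

```python
# Minimal sketch: two Dagster assets with an implicit dependency.
# Asset names and the data they return are hypothetical placeholders.
from dagster import asset, materialize


@asset
def raw_orders():
    # In a real pipeline this would read from S3, Kafka, Postgres, etc.
    return [{"id": 1, "amount": 10.0}, {"id": 2, "amount": -3.0}]


@asset
def cleaned_orders(raw_orders):
    # Dagster wires raw_orders in automatically because the parameter
    # name matches the upstream asset's name.
    return [row for row in raw_orders if row["amount"] > 0]


if __name__ == "__main__":
    result = materialize([raw_orders, cleaned_orders])
    assert result.success
```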
This is a unique opportunity to influence the future of our data infrastructure, work with cutting-edge cloud technologies, and drive meaningful impact across Paxos.
Important Notice for Paxos Applicants
We’ve become aware of fraudulent accounts posing as Paxos recruiters on LinkedIn and other platforms. These scammers attempt to deceive applicants into paying for job opportunities or providing personal financial information.
To verify a legitimate Paxos recruiter:
- We only use @paxos.com email addresses
- We never ask for payment or financial details to apply, interview, or work here
- For technical roles, we do not perform a coding interview without prior screening by our engineering team
Thanks for your interest in Paxos!