Data Infrastructure Engineer
Global Remote
About Bastion
Bastion enables financial institutions and enterprises to issue regulated stablecoins, generate revenue on reserves, and expand their ecosystems. Bastion’s platform combines stablecoin issuance, secure custody, and seamless orchestration for cross-border transfers, on/off-ramps, and stablecoin conversions. With Bastion’s platform and APIs, businesses can create and scale their stablecoin network, while optimizing revenue, compliance, and control.
Overview
As a Data Infrastructure Engineer, you will be responsible for building and maintaining critical data infrastructure. You will build ingestion, analysis, and reporting pipelines that are at the core of Bastion’s leading compliance and product offerings. You’ll also work with teams across the entire Bastion organization, including compliance, legal, and finance.
Given the foundational nature of this role, you will also be responsible for selecting appropriate technologies, managing external vendor relationships, and fostering a data-driven culture across the company. You will architect and build both real-time and batch data pipelines that ingest data from a variety of sources and deliver it to our data warehouse. In addition, you will be responsible for establishing strong security and privacy controls around sensitive data.
Responsibilities
Architect, build, and maintain modern and robust real-time and batch data analytics pipelines.
Develop and maintain declarative data models and transformations.
Implement data ingestion integrations for streaming and traditional sources such as Postgres, Kafka, and DynamoDB.
Deploy and configure BI tooling for data analysis.
Work closely with product, finance, legal, and compliance teams to build dashboards and reports to support business operations, regulatory obligations, and customer needs.
Establish, communicate, and enforce data governance policies.
Document and share best practices with regard to schema management, data integrity, availability, and security.
Protect and limit access to sensitive data by implementing a secure permissioning model and establishing data masking and tokenization processes.
Identify and communicate data platform needs, including additional tooling and staffing.
Work with cross-functional teams to define requirements, plan projects, and execute on the plan.
Qualifications
5+ years of professional engineering and data analytics experience, startup experience a plus.
Strong proficiency and comfort using SQL and Python to perform complex data analysis.
Recent experience building automation tooling and pipelines using a general-purpose language such as Python, Golang, and/or TypeScript.
Experience with modern data pipeline and warehouse technologies (e.g. Snowflake, Databricks, Apache Spark, AWS Glue).
Strong proficiency in writing declarative data models and transformations using modern technologies (e.g. dbt).
Experience building and maintaining cloud-based data lakes.
Prior experience integrating real-time data streaming technologies (e.g. Kafka, Spark).
Prior experience configuring and maintaining modern data orchestration platforms (e.g. Airflow).
Comfort with infrastructure-as-code tooling (e.g. Terraform) and container orchestration platforms (e.g. Kubernetes).
Strong preference to keep things simple, ship fast, and avoid overengineering.
Self-driven, with the ability to work autonomously.
Professional Web3 / Crypto experience is a plus.
What We Look For
Ownership and Proactivity: Demonstrated sense of ownership and accountability, combined with a forward-thinking approach and an unwavering motivation to excel in the role.
Customer-Centric Mindset: Proven track record of placing customers at the heart of all decisions, striving beyond satisfaction to truly impress and exceed expectations.
Detail-Oriented: Demonstrated ability to produce high-quality work with meticulous attention to detail, ensuring consistency and precision in every task.
Team Player with a Competitive Edge: Strong collaboration skills, understanding that success is achieved collectively.
Continuous Improvement Mindset: Never settles for the status quo; always looking for growth opportunities and new challenges, with a vision of powering the future of web3. Recognizes that the journey to excellence is ongoing and embraces the challenge.
Empathetic Insight: Demonstrated ability to understand and share the feelings of others, fostering genuine connections and promoting a supportive environment.
Bastion provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, and placement. Bastion participates in E-Verify to verify eligibility for employment in the United States.