Senior Data Engineer (7-month Contract)
Canada (Remote-first)
StackAdapt
StackAdapt is the leading programmatic advertising platform used by the most exceptional digital marketers. Experience the difference.
We have an exciting opportunity in the newly formed Enterprise Data Office (EDO), whose mandate is to serve business leaders and stakeholders at StackAdapt with trusted official reporting and governed self-service analytics. Reporting to the Director of the Enterprise Data Office, the Senior Data Engineer will be accountable for the data architecture, data pipelines, data operations, and platform management of the data lake and enterprise data warehouse currently under development.
This role will assess the functional and non-functional business requirements gathered by the Manager of Business Data Analysis from the business departments in order to formulate a holistic data solution that meets or exceeds their business intelligence needs. This includes building ingestion data pipelines from multiple data sources and architecting data models that are fit for purpose for immediate business needs as well as for broader reporting and analytical use cases. This role will build the data transformation pipelines that materialize the data model, configure and automate the orchestration of those pipelines for seamless execution, and collaborate with BI Engineers to ensure our business stakeholders’ needs are met as they consume data from within the BI platform.
StackAdapt is a Remote-First company. We are open to candidates located anywhere in Canada for this position.
What You'll Be Doing
- This role will be responsible for all data architecture, data engineering, and data operations within the Data Lake and Enterprise Data Warehouse (Snowflake) including:
- Build reliable data ingestion pipelines to extract data from a variety of data sources, including databases (e.g., RDBMS/NoSQL/file stores), applications (via API), and flat files, into the Data Lake with appropriate metadata tagging
- Orchestrate data pipelines via batch, near-real-time, or real-time operations depending on requirements to ensure a seamless and predictable execution
- Advise the team on all data architecture, data modeling, data operations, and data platform decisions
- Support the day-to-day operation of the EDO pipelines by monitoring alerts and investigating, troubleshooting, and remediating production issues
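As an illustrative sketch only (not StackAdapt's actual stack), the ingestion-with-metadata-tagging responsibility above might look like the following in plain Python. The landing-zone layout, the `land_batch` helper, and its sidecar file are all hypothetical:

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def land_batch(records: list[dict], source: str, landing_dir: str) -> Path:
    """Write one extracted batch to a landing zone as JSON Lines, with a
    metadata sidecar for lineage and auditing (hypothetical layout)."""
    batch_ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    out_dir = Path(landing_dir) / source / batch_ts
    out_dir.mkdir(parents=True, exist_ok=True)

    data_path = out_dir / "data.jsonl"
    payload = "\n".join(json.dumps(r, sort_keys=True) for r in records)
    data_path.write_text(payload)

    # Metadata tagging: record provenance, row count, and a content checksum
    meta = {
        "source": source,
        "ingested_at_utc": batch_ts,
        "row_count": len(records),
        "sha256": hashlib.sha256(payload.encode()).hexdigest(),
    }
    (out_dir / "_metadata.json").write_text(json.dumps(meta, indent=2))
    return data_path
```

In practice the payload would more likely be Parquet or Avro in object storage, but the shape of the work is the same: every batch lands with provenance metadata attached.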
What You'll Bring to the Table
- Minimum 5 years of experience leading data engineering within an Enterprise Data Warehouse environment
- Hands-on experience with cloud-based data warehouses (e.g., Snowflake, BigQuery, Redshift) and big data storage formats (e.g., Delta Lake, Parquet, Avro)
- Knowledgeable in data warehousing architecture fundamentals (e.g., Kimball/Inmon methodology, dimensional modeling, conformed dimensions, SCDs, etc.)
- Proven experience managing cloud-based infrastructure on AWS/Azure/GCP, with knowledge of container technologies such as Kubernetes and networking fundamentals
- Hands-on experience building ETL/ELT data pipelines via custom-coded scripts (e.g., Spark, Python, Java, SQL stored procedures) and via integration platforms (e.g., PowerCenter, DataStage, Talend)
- Highly experienced in orchestrating data operations via tools such as Apache Airflow, Cron, Astronomer, etc., and administering the data platform via Infrastructure-as-Code (e.g., Terraform)
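To make the dimensional-modeling vocabulary above concrete, here is a minimal Type 2 slowly changing dimension (SCD) update sketched in plain Python. This is a toy under stated assumptions — the `customer_id` business key and `segment` attribute are hypothetical, and a real warehouse would do this with a SQL MERGE rather than in-memory rows:

```python
from datetime import date

def scd2_upsert(dim_rows: list[dict], incoming: dict, today: date) -> list[dict]:
    """Apply a Type 2 SCD change: expire the current row for a changed
    business key and append a new current row, preserving history."""
    key = incoming["customer_id"]
    updated = []
    changed = False
    for row in dim_rows:
        if (row["customer_id"] == key and row["is_current"]
                and row["segment"] != incoming["segment"]):
            # Expire the old version instead of overwriting it
            updated.append({**row, "is_current": False, "valid_to": today})
            changed = True
        else:
            updated.append(row)
    is_new_key = not any(r["customer_id"] == key for r in dim_rows)
    if changed or is_new_key:
        updated.append({
            "customer_id": key,
            "segment": incoming["segment"],
            "valid_from": today,
            "valid_to": None,
            "is_current": True,
        })
    return updated
```

The design choice Type 2 captures is history preservation: an unchanged record is a no-op, while a changed record closes out the old version and opens a new one, so point-in-time reporting can join facts to the dimension as it was.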
About StackAdapt
We've been recognized for our diverse and supportive workplace, high performing campaigns, award-winning customer service, and innovation. We've been awarded:
- Ad Age Best Places to Work 2024
- G2 Top Software and Top Marketing and Advertising Product for 2024
- Campaign’s Best Places to Work 2023 for the UK
- 2024 Best Workplaces for Women and in Canada by Great Place to Work®
- #1 DSP on G2 and leader in a number of categories including Cross-Channel Advertising