Data Engineer
CAN - Vancouver, Canada
Full Time · Mid-level / Intermediate · CAD 140K - 260K * est.
Boeing
Company: The Boeing Company
Boeing Canada Vancouver is looking for a Data Engineer to join our team in Vancouver, Canada. This position supports the development and sustainment of cloud data infrastructure for the Safety Data Analytics (SDA) program in Boeing Global Services (BGS) Training, Digital & Analytics (TD&A). The position will be fully on-site.
In this role, you will be part of a growing team of Data Engineers working to build and maintain data infrastructure that processes large amounts of commercial airplane flight data, making curated datasets available to data consumers through a Delta Lake in Azure Databricks and a data warehouse in Google Cloud BigQuery. You will partner directly with Architects, Data Scientists, Software Developers, and Business Intelligence Analysts to define technical requirements for highly scalable streaming and batch data pipeline features; then you will design, implement, test, and support these features. Attention to detail and an obsession with quality are a must, because the data products that our Data Engineering team produces serve both internally facing and externally facing applications.
A successful candidate will be a self-starter who understands the importance of collaborating with partner teams to fully define functional requirements before starting technical work. This role requires prior experience managing and analyzing data at scale in Azure Databricks and/or Google Cloud Platform (GCP), along with a dependable aptitude for Python/Spark, containers, and DevOps to deliver scalable Cloud Data Engineering solutions. Prior experience managing and analyzing aerospace flight data (QAR/CPL, CVR/FDR, and/or ADS-B) at scale is highly desirable. Candidates with ETL and workflow orchestration experience in both GCP and Azure Databricks are desired, as our team will support an existing container-based GCP data infrastructure until our migration to Azure Databricks is complete.
The Safety Data Analytics program is a multi-disciplinary team of data professionals, Analytical Insights Managers, and Pilots who work together to identify and communicate data-driven insights to external aerospace customers, with the end goal of making flying safer. Join us as our Data Engineering team creates a new generation of scalability and efficiency for data consumers in an Azure Databricks Delta Lake.
Position Responsibilities:
Support the development and improvement of streaming and batch data pipelines in Azure Databricks and/or Google Cloud Platform to provide curated datasets of airplane flight data to partner teams
Partner with internal stakeholders to define desired user experiences with data; use these requirements to design orchestration strategies, data models, and partition strategies that optimize performance and cost of data curation and access
Use Python, PySpark, and/or Scala to programmatically analyze, reduce, transform, and load datasets that are too large for in-memory, single-machine workflows
Develop Extract, Transform, Load (ETL) and analytics processes that consume data from: (1) unstructured files and sources, (2) SQL / NoSQL datastores, and (3) messaging queues / streams
Develop and maintain CI/CD pipelines used to automate DevOps tasks
Build, test, and deploy container workers for serverless, scalable ETL workflow orchestration
Implement testing and alerts that identify undesirable performance in codebases and data pipelines
Analyze large, cloud-based datasets to identify, disposition, and explain the root cause for unanticipated data trends
Develop and maintain partnerships while working in a cross-functional team of Data Engineers, Data Scientists, Software Developers, and Business Intelligence Analysts
Implement access-control and security systems for compliance with data governance policies
Exercise critical thinking and innovative problem solving
Ensure commitments and service levels are achieved or exceeded
Demonstrate the Boeing Behaviors in a deliberate and observable way
Occasional travel may be required
Basic Qualifications (Required Skills/Experience):
Education/experience typically acquired through advanced technical education (e.g., an Associate's or Bachelor's degree)
5+ years of related work experience
3+ years of experience developing Python scripts, packages, and notebooks to analyze and process large datasets
3+ years of experience developing complex SQL queries to interact with relational datasets
3+ years of experience using Git for version control of software
2+ years of experience working in Azure, Google Cloud Platform, or Databricks
2+ years of experience modeling, developing, and maintaining SQL or NoSQL datastores and analytics datastores (Databricks Delta Tables or Google BigQuery preferred)
2+ years of experience developing and maintaining ETL workflow orchestration using tools like Apache Airflow, GCP Workflows, or Databricks Workflows
2+ years of experience developing and maintaining CI/CD pipelines
1+ years of workplace experience using Apache Spark, Apache Beam, and/or Dask for distributed processing of large, cloud-based datasets
1+ years of experience with PySpark or Scala
Ability to support our global customer base with a flexible work schedule which may include late or early meetings
Preferred Qualifications (Desired Skills/Experience):
7+ years of related work experience or an equivalent combination of technical education and experience
Experience working with aerospace datasets including timeseries flight data (QAR/CPL, CVR/FDR, and/or ADS-B) at scale is highly desirable
Experience working in a role that develops Data Engineering solutions as part of a cross-functional team that includes Data Engineers, Data Scientists, Software Engineers, and/or Business Intelligence Analysts
Relocation:
Relocation assistance is not a negotiable benefit for this position. Candidates must live in the immediate area or relocate at their own expense.
Additional Information:
This requisition is for a locally hired position in Canada. The employer is Boeing Canada. Candidates must be legally authorized to work in Canada. Benefits and pay are determined by Canada and are not on Boeing US-based payroll. This is not an expatriate assignment.
Please also submit a CV or resume written in English.
Applications for this position will be accepted until June 20, 2025.
Language Requirements:
Not Applicable
Education:
Not Applicable
Security Clearance:
This position does not require a Security Clearance.
Visa Sponsorship:
Employer will not sponsor applicants for employment visa status.
Contingent Upon Award Program:
This position is not contingent upon program award.
Shift:
* Salary range is an estimate based on our AI, ML, Data Science Salary Index 💰