Staff Data Engineer

USA - CA - 500 Paula Ave

The Walt Disney Company

The mission of The Walt Disney Company is to be one of the world's leading producers and providers of entertainment and information.


Job Posting Title:

Staff Data Engineer

Req ID:

10100975

Job Description:

As a Staff Data Engineer at The Walt Disney Studios, you will play a pivotal role in the transformation of data into actionable insights. Collaborate with our dynamic team of technologists to develop cutting-edge data solutions that drive innovation and fuel business growth. Your responsibilities will include managing complex data structures and delivering scalable and efficient data solutions. Your expertise in data engineering will be crucial in optimizing our data-driven decision-making processes. If you're passionate about leveraging data to make a tangible impact, we welcome you to join us in shaping the future of our organization.

Key Responsibilities:

● Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines

● Build tools and services to support data discovery, lineage, governance, and privacy

● Collaborate with other software/data engineers and cross-functional teams

● Build and maintain continuous integration and deployment pipelines

● Provision and support cloud resources (AWS/Azure/GCP)

● Work with a tech stack that includes Airflow, Spark, Snowflake, Databricks, and dbt

● Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform

● Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more

● Ensure high operational efficiency and quality of the Core Data platform datasets so that our solutions meet SLAs and deliver reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)

● Actively participate in and advocate for agile/scrum ceremonies to collaborate and improve processes for our team

● Engage with and understand our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental platform improvements

● Maintain detailed documentation of your work and changes to support data quality and data governance requirements

Qualifications:

● 7+ years of data engineering experience developing large data pipelines

● Proficient in SQL engines, with advanced performance-tuning capabilities

● Strong understanding of data modeling principles, including Dimensional Modeling and data normalization

● Proficiency in at least one major programming language (e.g., Python, Java, or Scala)

● Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines

● Experience with the Snowflake platform; familiarity with Databricks is a plus

● Experience designing, developing, and optimizing scalable data pipelines and ETL processes, integrating diverse structured and unstructured data sources, and ensuring the reliability and efficiency of data processing workflows

● Experience implementing data quality checks, monitoring, and logging to ensure the integrity and reliability of data pipelines

● Experience designing and developing world-class CI/CD and DevOps practices

● Proficiency with Terraform or CDKTF

● Proficiency with container technologies (Docker, Kubernetes, etc.)

● Deep understanding of AWS or other cloud providers, as well as infrastructure as code

● Excellent conceptual and analytical reasoning competencies

● Advanced understanding of OLTP vs. OLAP environments

● Willingness and ability to learn and pick up new skill sets

● Self-starting problem solver with an eye for detail and excellent analytical and communication skills

● Familiar with Scrum and Agile methodologies

Required Education:

● Bachelor’s degree in Computer Science, Information Systems, or a related field, or equivalent work experience

● Master’s degree is a plus

The hiring range for this position in California is $149,300 - $200,200 per year based on a 40-hour workweek. The number of hours scheduled per week may vary based on business needs. The base pay actually offered will take into account internal equity and may also vary depending on the candidate’s geographic region, job-related knowledge, skills, and experience, among other factors. A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to the full range of medical, financial, and/or other benefits, dependent on the level and position offered.

Job Posting Segment:

TWDSTECH

Job Posting Primary Business:

Technology-Technology Innovation Group

Primary Job Posting Category:

Data Engineering

Employment Type:

Full time

Primary City, State, Region, Postal Code:

Glendale, CA, USA

Alternate City, State, Region, Postal Code:

Date Posted:

2024-09-25

