Lead Software Engineer, Big Data Infrastructure
USA - CA - 2450 Broadway, United States
Full Time Senior-level / Expert USD 152K - 223K
The Walt Disney Company
The mission of The Walt Disney Company is to be one of the world's leading producers and providers of entertainment and information.

Job Posting Title:
Lead Software Engineer, Big Data Infrastructure

Req ID:
10115534

Job Description:
Disney Entertainment & ESPN Product & Technology
On any given day at Disney Entertainment & ESPN Product & Technology (DEEP&T), we’re reimagining ways to create magical viewing experiences for the world’s most beloved stories while also transforming Disney’s media business for the future. Whether that’s evolving our streaming and digital products in new and immersive ways, powering worldwide advertising and distribution to maximize flexibility and efficiency, or delivering Disney’s unmatched entertainment and sports content, every day is a moment to make a difference to partners and to hundreds of millions of people around the world.
A few reasons why we think you’d love working for Disney Entertainment & ESPN Product & Technology:
Building the future of Disney’s media business: DE&E Technologists are designing and building the infrastructure that will power Disney’s media, advertising, and distribution businesses for years to come.
Reach & Scale: The products and platforms this group builds and operates delight millions of consumers every minute of every day – from Disney+ and Hulu, to ABC News and Entertainment, to ESPN and ESPN+, and much more.
Innovation: We develop and execute groundbreaking products and techniques that shape industry norms and enhance how audiences experience sports, entertainment & news.
About Our Team
The Big Data Infrastructure team manages big data services such as Hadoop, Spark, Flink, Presto, and Hive. Our services are distributed across the data center and the cloud, supporting massive data volumes on thousands of physical resources. We focus on the virtualization of big data environments, cost efficiency, resiliency, and performance.
The right person for this role has proven experience working in mission-critical infrastructure and enjoys building and maintaining large-scale data systems with varied requirements and large storage capacities. If you enjoy building large-scale big data infrastructure, this is a great role for you.
Responsibilities
Develop, scale, and improve in-house, cloud, and open-source Hadoop-related systems (e.g., Spark, Flink, Presto/Trino).
Investigate new big data technologies and apply them to the Disney Streaming production environment.
Build next-gen cloud-based big data infrastructure for batch and streaming data applications, and continuously improve performance, scalability, and availability.
Handle architectural and design considerations such as performance, scalability, reusability, and flexibility issues.
Advocate engineering best practices, including the use of design patterns, code review, and automated unit/functional testing.
Work together with other engineering teams to influence them on big data system design and optimization.
Define and lead the adoption of best practices and processes.
Collaborate efficiently with Product Managers and other developers to build datastores as a service.
Collaborate with senior internal team members and external stakeholders to gather requirements and drive implementation.
Basic Qualifications
At least 7 years of professional programming and design experience
Hands-on experience with big data components (e.g., HDFS, HBase, YARN, Hive, Spark, Flink, Presto, Impala, Terraform, EKS, Spinnaker, IAM, EMR)
Experience in building in-house big data infrastructure.
Experience in developing and optimizing ETL and ad-hoc query engines (e.g., Spark, Flink, Hive, Presto/Trino, Greenplum)
Experience with CI/CD, fine-grained metrics, and security and compliance enhancements on compute engines
Experience with modern table formats (Iceberg, Delta Lake, Hudi)
Preferred Qualifications
Experience in catalog and metadata management would be a plus
Experience in developing and optimizing Hadoop-related and containerized technologies would be a plus (e.g., HDFS, HBase, YARN, Kubernetes, Docker, RocksDB)
Demonstrated ability with cloud infrastructure technologies, including Terraform, Kubernetes, IAM, ELB, Ranger, KMS, S3, and Glue
Experience in managing a big data cluster with over 1000 nodes.
Required Education
Bachelor's degree in Computer Science, Information Systems, Software Engineering, Electrical or Electronics Engineering, or a comparable field of study, and/or equivalent work experience
Additional Information
#DISNEYTECH
The hiring range for this position in Santa Monica, California is $152,200 to $204,100 per year; in Seattle, Washington is $159,500 to $213,900 per year; and in San Francisco, California is $166,800 to $223,600 per year. The base pay actually offered will take into account internal equity and may vary depending on the candidate’s geographic region, job-related knowledge, skills, and experience, among other factors. A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to the full range of medical, financial, and/or other benefits, dependent on the level and position offered.

Job Posting Segment:
Commerce, Data & Identity

Job Posting Primary Business:
PDE - Data Platform Engineering

Primary Job Posting Category:
Software Engineer

Employment Type:
Full time

Primary City, State, Region, Postal Code:
Santa Monica, CA, USA

Alternate City, State, Region, Postal Code:
USA - CA - Market St, USA - WA - 925 4th Ave

Date Posted:
2025-03-10