Engineer 4, Data Engineering - 6319
PA - Philadelphia, 1800 Arch St, United States
Comcast
Comcast NBCUniversal creates incredible technology and entertainment that connects millions of people to the moments and experiences that matter most.
Job Summary
Job Description
DUTIES:
- Provide technical leadership in manipulating and transforming large, complex data using SQL and ETL processes.
- Manage code repositories using GitHub.
- Work in an Agile development environment.
- Use Python for data manipulation, analysis, and scripting.
- Create data visualizations and dashboards using Tableau and ThoughtSpot.
- Work with telecommunications architecture and wireless technologies.
- Analyze and process large volumes of data using Spark and Hadoop.
- Process data using Databricks, Linux, and Unix shell scripting.
- Deploy, manage, and scale data infrastructure and services using AWS.
- Design and architect databases and applications and lead projects through implementation, drawing on an in-depth understanding of the product life cycle and industry technical knowledge.
- Determine the appropriate data storage approach for optimal data organization.
- Determine how tables relate to each other and how fields interact within the tables to develop relational models.
- Collaborate with technology and platform management partners to optimize data sourcing and processing rules and ensure appropriate data quality.
- Create system architecture, designs, and specifications, using in-depth engineering skills and knowledge to solve difficult development problems and achieve engineering goals.
- Work closely with a variety of team members to clearly define data product requirements and technical roadmaps.
- Determine and source appropriate data for a given analysis.
- Work with data modelers/analysts to understand the business problems they are trying to solve, then create or augment data assets to feed their analysis.
- Integrate knowledge of business and functional priorities.
- Guide and mentor junior-level engineers.
Position is eligible to work remotely one or more days per week, per company policy.
REQUIREMENTS: Bachelor's degree, or foreign equivalent, in Computer Science, Engineering, or a related technical field, and five (5) years of experience manipulating and transforming data using SQL and ETL processes, managing code repositories using GitHub, and working in an Agile development environment; of which three (3) years of experience include using Python for data manipulation, analysis, and scripting, creating data visualizations and dashboards using Tableau and ThoughtSpot, and working with telecommunications architecture and wireless technologies; of which one (1) year of experience includes analyzing and processing large volumes of data using Spark and Hadoop, processing data using Databricks, Linux, and Unix shell scripting, and deploying, managing, and scaling data infrastructure and services using AWS.
Employees at all levels are expected to:
- Understand our Operating Principles; make them the guidelines for how you do your job.
- Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services.
- Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences.
- Win as a team - make big things happen by working together and being open to new ideas.
- Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making callbacks and helping us elevate opportunities to do better for our customers.
- Drive results and growth.
- Respect and promote inclusion & diversity.
- Do what's right for each other, our customers, investors and our communities.
Disclaimer:
This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications.
Skills
Data Visualization, Extract Transform Load (ETL), Structured Query Language (SQL)
We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality, to help support you physically, financially and emotionally through the big milestones and in your everyday life.
Please visit the benefits summary on our careers site for more details.