Sr. Software Engineer (Python, PySpark, AWS)
Reston, VA - 11951 Freedom Dr, Ste 900
Comcast
Comcast NBCUniversal creates incredible technology and entertainment that connects millions of people to the moments and experiences that matter most.
Job Summary
Job Description
What You'll Do:
- Architect and develop scalable data pipelines using ETL tools such as Pentaho and PySpark, ensuring efficient data extraction, transformation, and loading processes from multiple sources.
- Design and optimize data warehousing solutions to support high-performance data analytics, reporting, and business intelligence needs.
- Implement cloud-based solutions in AWS, leveraging services such as S3, Redshift, Lambda, Glue, and EMR for processing, storage, and integration of data at scale.
- Utilize the HashiCorp stack (Terraform, Vault, Consul, etc.) to manage infrastructure as code, ensuring secure and scalable deployments across cloud environments.
- Collaborate closely with cross-functional teams, including domain experts in cable construction, to capture data requirements and design solutions that meet both technical and business objectives.
- Integrate microservices architecture with data engineering pipelines to enhance modularity, scalability, and the robustness of the overall system.
- Develop and maintain data pipelines using Python and PySpark for batch and real-time data processing.
- Ensure data quality by implementing rigorous data validation, monitoring, and transformation processes.
- Perform ETL pipeline optimization for improved performance, reliability, and scalability, addressing issues proactively to minimize downtime.
- Lead technical discussions and provide strategic input to improve data architecture and workflows, bringing innovation to the team.
- Mentor junior engineers, sharing best practices in cloud-based data engineering and pipeline automation.
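The batch-pipeline and data-validation responsibilities above can be sketched as a minimal extract-transform-validate-load flow. This is an illustrative sketch only, in plain Python rather than a full PySpark job; the source rows, field names (`node_id`, `signal_db`), and load target are hypothetical, and a production pipeline would read from S3 or a database and write to Redshift.

```python
# Minimal batch ETL sketch: extract -> transform -> validate -> load.
# All data, field names, and targets here are hypothetical illustrations.

def extract():
    # Stand-in for reading from S3, a relational source, or a stream.
    return [
        {"node_id": "N-100", "signal_db": "42.5"},
        {"node_id": "N-101", "signal_db": None},   # bad row: missing measurement
        {"node_id": "N-102", "signal_db": "38.0"},
    ]

def transform(rows):
    # Cast the measurement to float; leave bad rows for validation to catch.
    return [
        {"node_id": r["node_id"],
         "signal_db": float(r["signal_db"]) if r["signal_db"] is not None else None}
        for r in rows
    ]

def validate(rows):
    # Quarantine bad rows instead of failing the whole batch, so the
    # pipeline stays reliable and rejects can be monitored separately.
    good = [r for r in rows if r["signal_db"] is not None]
    rejected = [r for r in rows if r["signal_db"] is None]
    return good, rejected

def load(rows):
    # Stand-in for a bulk write to Redshift or S3; returns rows written.
    return len(rows)

good, rejected = validate(transform(extract()))
loaded = load(good)
print(loaded, len(rejected))  # 2 rows loaded, 1 quarantined
```

The same shape carries over to PySpark, where `extract` becomes a `spark.read` call, `transform`/`validate` become DataFrame operations, and `load` becomes a `DataFrame.write`.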
What You'll Need:
- 7+ years of experience in data engineering, with at least 3 years working in AWS environments.
- Expertise in Python and PySpark for large-scale data transformations and processing.
- Proficient in using HashiCorp tools like Terraform, Vault, and Consul for infrastructure management and security.
- Strong proficiency in data warehousing concepts, data modeling, and database performance optimization.
- Advanced experience with ETL pipeline development, particularly with Pentaho or similar tools.
- Knowledge of the cable construction domain and its specific data challenges.
- Experience in designing and integrating microservices architecture into ETL and data pipelines.
- Experience with CI/CD pipelines, version control (Git), and Agile methodologies.
- Strong skills in SQL for querying, optimizing, and maintaining relational databases.
- Excellent communication skills with the ability to convey complex technical concepts to both technical and non-technical stakeholders.
Employees at all levels are expected to:
- Understand our Operating Principles; make them the guidelines for how you do your job.
- Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services.
- Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences.
- Win as a team - make big things happen by working together and being open to new ideas.
- Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making callbacks and helping us elevate opportunities to do better for our customers.
- Drive results and growth.
- Respect and promote inclusion & diversity.
- Do what's right for each other, our customers, investors and our communities.
Disclaimer:
- This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications.
Skills
Amazon Web Services (AWS), PL/SQL (Programming Language), PySpark, Python (Programming Language)
We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality - to help support you physically, financially and emotionally through the big milestones and in your everyday life.
Please visit the benefits summary on our careers site for more details.
Education
Bachelor's Degree
While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.
Certifications (if applicable)
Relative Work Experience
7-10 Years
Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.