Advisor, Data Engineer (4)
PA - Philadelphia, 1701 John F Kennedy Blvd, United States
Comcast
Comcast NBCUniversal creates incredible technology and entertainment that connects millions of people to the moments and experiences that matter most.
Job Summary
The Data Engineer (Level 4) will play a critical role in building and maintaining a modern audit data platform within Comcast’s Internal Audit organization. This role focuses on designing scalable, resilient data pipelines using AWS-native tools, with a strong emphasis on automation, quality, governance, and observability.
You’ll be responsible for turning fragmented, manual data inputs into production-grade pipelines that support audit engagements across Comcast’s business units. The ideal candidate brings deep technical expertise, strong systems thinking, and the ability to lead technical decisions that drive scalability, trust, and transparency across the audit data ecosystem.
Job Description
What You Deliver
- Designs, builds, and maintains resilient, automated data pipelines using AWS-native services (e.g., S3, Glue, Lambda, Redshift) and orchestration tools (e.g., Airflow, Step Functions).
- Implements modular, well-documented transformation logic and data models to support standardized audit workflows.
- Introduces and maintains data quality checks and observability tooling (e.g., Monte Carlo, Great Expectations) to ensure data accuracy, lineage, and pipeline reliability.
- Builds end-to-end pipeline monitoring: logging, alerting, and traceability from source ingestion to report-ready datasets.
- Develops and supports CI/CD pipelines (e.g., GitHub Actions) for secure, auditable deployments of infrastructure and code.
- Manages data integration across diverse sources (APIs, SFTP, databases) using secure, repeatable patterns with version control and infrastructure-as-code practices.
How You Deliver
- Independently delivers robust engineering solutions with minimal supervision, including root-cause analysis and long-term fixes for data pipeline issues.
- Leads code reviews and contributes to evolving team-wide engineering standards, documentation, and architecture.
- Balances short-term audit needs with long-term architectural improvements that reduce technical debt and manual effort.
- Maintains strong operational rigor: writes testable, observable, well-documented code that scales across engagements.
- Continuously explores and evaluates new tools, libraries, and frameworks relevant to data quality, automation, and audit transparency.
How You Partner
- Works closely with Data Solutions Advisors and audit engagement leads to align technical work with business and controls objectives.
- Partners with audit teams, internal stakeholders, and source system owners to ensure secure, reliable access to critical business data.
- Engages in collaborative planning and sprint cycles with cross-functional teams to align on scope, milestones, and blockers.
- Shares expertise across the CGA team by mentoring junior engineers and championing knowledge sharing practices.
How You Develop
- Takes initiative to deepen technical skills in areas like orchestration, observability, and data governance.
- Applies audit and regulatory awareness (e.g., PII, data access controls) to technical implementation, ensuring systems are not only functional but trustworthy.
- Stays current on emerging data tools, cloud patterns, and engineering best practices relevant to regulated environments.
Qualifications
- 7–10+ years of experience in data engineering, data infrastructure, or equivalent roles.
- Advanced proficiency in Python, SQL, and scripting (e.g., Bash); experience developing production-grade pipelines.
- Deep experience with AWS (S3, Glue, Lambda, Redshift, IAM, CloudWatch); infrastructure-as-code experience preferred.
- Experience building and orchestrating workflows using tools like Airflow, AWS Step Functions, or Dagster.
- Proficient with Git-based development workflows and CI/CD automation (e.g., GitHub Actions, CodePipeline).
- Hands-on experience with data quality and observability tooling (e.g., Monte Carlo, Great Expectations, or custom alerting/monitoring frameworks).
- Familiarity with secure data integrations (e.g., APIs, database connectors, SFTP), credential management, and audit logging.
- Strong written and verbal communication skills, with an ability to articulate engineering tradeoffs to technical and non-technical audiences.
Skills
AWS DevOps, CI/CD, Collaboration, Data Engineering, Data Infrastructure, Data Pipelines
We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality, to help support you physically, financially and emotionally through the big milestones and in your everyday life.
Please visit the benefits summary on our careers site for more details.
Education
Bachelor's Degree
While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.
Certifications (if applicable)
Relevant Work Experience
7-10 Years
Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.