Sr Data Engineer (PySpark) - Hybrid

Hartford, CT - Home Office

The Hartford



Staff Data Engineer - GE07CE

Data Engineer - GE08AE

We’re determined to make a difference and are proud to be an insurance company that goes well beyond coverages and policies. Working here means having every opportunity to achieve your goals – and to help others accomplish theirs, too. Join our team as we help shape the future.          

Join a fast-paced and talented team to deliver Data Engineering capabilities for The Hartford’s Commercial Data Science Data Delivery. You will help build well-architected, cloud-based data solutions for Entity Resolution using technologies such as AWS and Snowflake.

This role has a Hybrid work arrangement, with the expectation of working in an office (Columbus, OH; Chicago, IL; Hartford, CT; or Charlotte, NC) three days a week (Tuesday through Thursday).

Responsibilities:

  • Drive end-to-end solution delivery involving multiple platforms and technologies for medium-to-large, complex implementations, or oversee major parts of very large, complex implementations, leveraging ELT solutions to acquire, integrate, and operationalize data.
  • Partner with architects and stakeholders to influence and implement the vision of the pipeline and data product architecture while safeguarding the integrity and scalability of the environment.
  • Own physical solution designs for data pipelines and data products across teams, as well as tool recommendations.
  • Accountable for data engineering practices across all participating teams.
  • Implement and apply leading big data methodologies (AWS, Hadoop/EMR, Spark, Kafka, and Snowflake) with cloud hosting solutions at a multi-team/product level; a minimal PySpark sketch of this kind of pipeline follows this list.
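
A minimal sketch, assuming a Parquet landing zone in S3 and the Spark-Snowflake connector, of the kind of ELT pipeline this role delivers; all bucket names, table names, columns, and connection options are hypothetical placeholders.

```python
# Illustrative ELT pipeline: land raw records from S3, standardize them,
# and publish to Snowflake. Every name and option here is a placeholder.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("entity-resolution-elt").getOrCreate()

# Extract: read raw records landed in S3 as Parquet.
raw = spark.read.parquet("s3://example-raw-zone/policies/")

# Transform: light standardization so entity-resolution matching keys
# are comparable downstream.
clean = (
    raw.withColumn("name_key", F.lower(F.trim(F.col("insured_name"))))
       .withColumn("load_ts", F.current_timestamp())
       .dropDuplicates(["policy_id"])
)

# Load: write to Snowflake via the Spark-Snowflake connector
# (assumes the connector is on the classpath; auth options omitted).
sf_options = {
    "sfURL": "example.snowflakecomputing.com",
    "sfUser": "ELT_SVC",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "ENTITY_RESOLUTION",
    "sfWarehouse": "ELT_WH",
}
(
    clean.write.format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "POLICY_CLEAN")
    .mode("append")
    .save()
)
```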

Knowledge, Skills, and Abilities:  

  • Strong technical knowledge of cloud data pipelines and data consumption products.
  • Team player with a transformation mindset.
  • Ability to lead successfully in a lean, agile, and fast-paced organization, leveraging Scaled Agile principles and ways of working.
  • Guides teams to mature code quality management, DataOps principles, automated testing, and environment management practices to deliver incremental customer value; a brief testing sketch follows this list.
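
As one illustration of the automated-testing practice above: a small pytest unit test for a PySpark transform, run against a local SparkSession. The transform, column names, and sample data are hypothetical.

```python
# Illustrative pytest unit test for a PySpark transform; the function,
# columns, and data below are placeholders, not a prescribed design.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def standardize(df):
    """Normalize the matching key and drop duplicate policy rows."""
    return (
        df.withColumn("name_key", F.lower(F.trim(F.col("insured_name"))))
          .dropDuplicates(["policy_id"])
    )


@pytest.fixture(scope="module")
def spark():
    session = (
        SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()
    )
    yield session
    session.stop()


def test_standardize_dedupes_and_normalizes(spark):
    df = spark.createDataFrame(
        [(1, "  ACME Corp "), (1, "ACME Corp"), (2, "Beta LLC")],
        ["policy_id", "insured_name"],
    )
    out = standardize(df)
    assert out.count() == 2  # duplicate policy_id removed
    assert "acme corp" in {r["name_key"] for r in out.collect()}
```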

Qualifications:           

  • Candidates must be authorized to work in the US without company sponsorship. The company will not support the STEM OPT I-983 Training Plan endorsement for this position.
  • 3+ years of large-scale big data engineering experience focused on PySpark, including designing best practices in programming, SDLC, distributed systems, data warehousing solutions (SQL and NoSQL), ETL tools, CI/CD, cloud technologies (AWS/Azure), and data lakes.
  • 3+ years of developing and operating production workloads in cloud infrastructure (AWS, Azure, etc.).
  • 3+ years of operating in a technical leadership capacity for 2+ teams.
  • Must have PySpark experience.
  • Exposure to AWS best practices.
  • Knowledge of core functional components/services of AWS: compute, storage, edge, database, migration and transfer, networking, and governance.

Certifications/Licenses (as applicable)

  • Cloud certifications preferred.

Compensation

The listed annualized base pay range is primarily based on analysis of similar positions in the external market. Actual base pay could vary and may be above or below the listed range based on factors including but not limited to performance, proficiency and demonstration of competencies required for the role. The base pay is just one component of The Hartford’s total compensation package for employees. Other rewards may include short-term or annual bonuses, long-term incentives, and on-the-spot recognition. The annualized base pay range for this role is:

$97,600 - $170,040

The posted salary range reflects our ability to hire at different position titles and levels depending on background and experience.

Equal Opportunity Employer/Females/Minorities/Veterans/Disability/Sexual Orientation/Gender Identity or Expression/Religion/Age

