Manager of Data Engineering
Hartford, CT - Home Office, United States
Full Time Senior-level / Expert USD 127K - 190K
The Hartford
We’re determined to make a difference and are proud to be an insurance company that goes well beyond coverages and policies. Working here means having every opportunity to achieve your goals – and to help others accomplish theirs, too. Join our team as we help shape the future.
The Hartford is seeking a Manager of Data Engineering within Workers’ Compensation and Group Benefits Claims Data Science to lead a team of Data Engineers who design, develop, and implement modern, sustainable data assets that fuel machine learning and artificial intelligence solutions across a wide range of strategic initiatives designed to drive efficiency within the Workers’ Compensation and Group Benefits claims process.
As a Manager of Data Engineering, you will lead and mentor a small team through the entire software development lifecycle in support of continuous data delivery, while growing your knowledge of emerging technologies. We use the latest data technologies, software engineering practices, MLOps, and Agile delivery frameworks, and we are passionate about building well-architected and innovative solutions that drive business value. This cutting-edge, forward-focused organization offers opportunities for collaboration, self-organization within the team, and visibility as we focus on continuous business data delivery.
This role will have a Hybrid work schedule, with the expectation of working in an office (Columbus, OH, Chicago, IL, Hartford, CT or Charlotte, NC) 3 days a week (Tuesday through Thursday).
Responsibilities:
- Lead and mentor data engineers to deliver and maintain reusable and sustainable data assets and production pipelines that assist the functional business units in meeting their strategic objectives
- Prototype and lead deployment of high impact innovations, catering to changing business needs, by leveraging new technologies
- Consult with cross-functional stakeholders in the analysis of short and long-range business requirements and recommend innovations which anticipate the future impact of changing business needs. Distill these requirements into user stories and action items for team members.
- Formulate logical statements of business problems and devise, test and implement efficient, cost-effective application program solutions
- Identify and validate internal and external data sources for availability and quality. Work with SMEs to describe and understand data lineage and suitability for a use case
- Create data assets and build data pipelines that align to modern software development principles for further analytical consumption. Perform data analysis to ensure the quality of data assets (an illustrative pipeline sketch follows this list)
- Develop code that enables real-time modeling solutions to be ingested into front-end systems
- Produce code artifacts and documentation using GitHub for reproducible results and hand-off to other data science teams, and ensure high-quality standards for direct reports
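To give a concrete flavor of the pipeline and data-quality work described above, here is a minimal, hypothetical sketch in Python; the file paths, column names, and checks are illustrative assumptions, not The Hartford’s actual stack.

```python
# Illustrative sketch only: a small batch pipeline that builds a curated data asset.
# All paths and column names (raw_claims.csv, claim_id, report_date) are hypothetical.
import pandas as pd


def build_claims_asset(source_path: str, output_path: str) -> pd.DataFrame:
    """Load raw claims data, apply basic quality checks, and publish a curated asset."""
    raw = pd.read_csv(source_path, parse_dates=["report_date"])

    # Basic data-quality gates before the asset is handed to downstream consumers.
    if raw["claim_id"].isna().any():
        raise ValueError("claim_id contains nulls; failing this pipeline run")
    curated = raw.drop_duplicates(subset="claim_id")

    # Persist in a columnar format (requires pyarrow or fastparquet) for analytical use.
    curated.to_parquet(output_path, index=False)
    return curated


if __name__ == "__main__":
    asset = build_claims_asset("raw_claims.csv", "curated_claims.parquet")
    print(f"Published {len(asset)} curated claim records")
```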
Qualifications:
- 6+ years of relevant experience recommended
- Bachelor’s degree in Computer Science, Engineering, IT, MIS, or a related discipline
- Experience in managing Data Engineering initiatives in an Agile environment
- Expertise in Python and SQL
- Proficiency in ingesting data from a variety of sources and formats, including relational databases, Hadoop/Spark, cloud data sources, XML, and JSON
- Proficiency in ETL, including metadata management and data validation
- Expertise in Unix and Git
- Proficiency in automation tools (Autosys, cron, Airflow, etc.); an illustrative orchestration sketch follows this list
- Proficiency with AWS services (e.g., S3, EMR) a plus
- Proficiency with cloud data warehouses, automation, and data pipelines (e.g., Snowflake, Redshift) a plus
- Able to communicate effectively with both technical and non-technical teams
- Able to translate complex technical topics into business solutions and strategies as well as turn business requirements into a technical solution
- Experience with leading project execution and driving change to core business processes through the innovative use of quantitative techniques
- Experience building CI/CD pipelines using Jenkins or equivalent
- Experience with Solution Design and Architecture of data and ML pipelines as well as integrating with Enterprise systems.
- Solid understanding of, and experience building, orchestration frameworks for real-time and batch services
- Experience building asynchronous or event-driven services in a cloud environment
- Familiarity with BI tools (Tableau, Power BI, ThoughtSpot, etc.)
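As a hedged illustration of the orchestration and automation skills listed above (not the team’s actual code), a minimal Airflow DAG wiring together extract, transform/validate, and load steps might look like the sketch below; it assumes Airflow 2.4+, and all DAG/task ids and callables are hypothetical.

```python
# Illustrative sketch only: a daily Airflow DAG for a claims data pipeline.
# DAG id, task ids, and the placeholder callables are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw claims data from source systems")


def transform_and_validate():
    print("apply transformations and data-quality checks")


def load():
    print("publish the curated asset to the warehouse")


with DAG(
    dag_id="claims_data_asset_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # cron-style schedules also work here
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(
        task_id="transform_and_validate", python_callable=transform_and_validate
    )
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract -> transform/validate -> load.
    extract_task >> transform_task >> load_task
```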
Candidate must be authorized to work in the US without company sponsorship. The company will not support the STEM OPT I-983 Training Plan endorsement for this position.
Compensation
The listed annualized base pay range is primarily based on analysis of similar positions in the external market. Actual base pay could vary and may be above or below the listed range based on factors including but not limited to performance, proficiency and demonstration of competencies required for the role. The base pay is just one component of The Hartford’s total compensation package for employees. Other rewards may include short-term or annual bonuses, long-term incentives, and on-the-spot recognition. The annualized base pay range for this role is:
$127,200 - $190,800
Equal Opportunity Employer/Females/Minorities/Veterans/Disability/Sexual Orientation/Gender Identity or Expression/Religion/Age
Tags: Agile Airflow Architecture AWS Computer Science Data analysis Data pipelines Engineering ETL Git GitHub Hadoop Jenkins JSON Machine Learning MLOps Pipelines Power BI Python RDBMS Redshift Snowflake Spark SQL STEM Tableau XML
Perks/benefits: Career development Equity / stock options Insurance