Senior Staff Data Engineer - Hybrid
Hartford, CT - Home Office, United States
Full Time Senior-level / Expert USD 132K - 198K
The Hartford
Get business, home and car insurance from The Hartford. Choose from a broad selection of business insurance coverages and design the right solution for your company. The Hartford offers AARP members great ways to save on car and home insurance. We’re determined to make a difference and are proud to be an insurance company that goes well beyond coverages and policies. Working here means having every opportunity to achieve your goals – and to help others accomplish theirs, too. Join our team as we help shape the future.
The Enterprise Data Services department’s IT team supporting Global Specialty is seeking a hands-on Senior Staff Data Engineer to enhance and support its data assets on the Snowflake and SQL Server platforms. We are looking for a talented professional with a proven track record of engineering ELT development and integration using Snowflake. Our ideal candidate will leverage deep technical expertise and problem-solving skills to deliver investment, maintenance, and enhancement projects within the Data & Analytics value stream.
This role will have a Hybrid work arrangement, with the expectation of working in an office location (Hartford, CT; Charlotte, NC; Chicago, IL; Frisco, TX; Columbus, OH; Danbury, CT; Alpharetta, GA) 3 days a week (Tuesday through Thursday). Candidates must be authorized to work in the US without company sponsorship.
Role Description:
The Senior Staff Data Engineer will be proficient in data platform architecture and design, data curation, and multi-dimensional modeling, with a strong understanding of data architecture and the principles of ETL and data warehousing. Responsibilities also include technical delivery review and resolution of architecture issues on the AWS Snowflake platform.
Responsibilities:
- Demonstrate expertise in Snowflake’s cloud-native architecture and Microsoft SQL Server technology.
- Create, troubleshoot, and enhance complex code in Snowflake and SQL Server.
- Build data pipelines (ELT) on the Snowflake cloud data platform using AWS compute (EC2) and storage (S3) layers.
- Build Snowflake SQL data warehouses using virtual warehouses according to best practices.
- Hands-on experience with Talend or SSIS as an ELT tool for Snowflake and SQL Server data integration.
- Implement and leverage materialized views, data sharing, zero-copy cloning, and dynamic data masking.
- Bring a solid understanding of delivery methodology (SDLC) and lead teams in implementing solutions according to the design/architecture.
- Hands-on experience with SnowSQL, stored procedures, JavaScript UDFs, Snowpipe, and other Snowflake utilities.
- Experience with data migration from RDBMS sources to the Snowflake cloud data warehouse.
- Experience with data security, data access controls, and their design.
- Design data loading and unloading processes to and from Snowflake.
- Experience working with data lakes, loading disparate data sources: structured and semi-structured data (flat files, XML, JSON, Parquet) as well as unstructured data.
- Experience building data pipelines using Talend and automating data ingestion, including change data capture (CDC).
- Integrate data pipelines with a source control repository and build CI/CD pipelines and DevOps practices.
- Experience performance tuning Talend and SQL Agent jobs to reduce CPU time and load duration.
- Deep knowledge of the Snowflake licensing model and its continuous data protection lifecycle.
- Architect reusable Talend components, such as job audit and reconciliation.
- Research and evaluate alternative solutions and recommend the most efficient and cost-effective systems design.
- Support and quickly respond to production issues and requirement clarifications.
- Coordinate as needed among Architects, Business Analysts, Scrum Masters, and Developers to gain the technical clarity needed to design, develop, and implement business solutions.
- Oversee the quality and completeness of detailed technical specifications, solution designs, and code reviews, as well as adherence to non-functional requirements.
- Experience delivering technical solutions in an iterative, agile environment (Scrum/Kanban).
- Participate as an active agile team member to help drive feature refinement, user story completion, code reviews, etc.
- Identify, document, and communicate technical risks, issues, and alternative technical solutions discovered during projects.
- Collaborate with a high-performing, forward-focused team, Release Train Engineer, Product Owner(s), and business stakeholders.
- Ability to work on innovative new projects with a "fail-fast" approach to provide optimal solutions that bring the most value to the business.
- Passion for learning new skills and the ability to adjust priorities across multiple projects based on changing demands.
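For candidates less familiar with the Snowflake features named above (dynamic data masking, zero-copy cloning, and semi-structured loading), a minimal sketch of what day-to-day work might look like is shown below. All object, stage, and role names are hypothetical illustrations, not references to The Hartford's actual environment:

```sql
-- Hypothetical example: mask email addresses for all but a privileged role.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('DATA_ADMIN') THEN val
       ELSE '***MASKED***'
  END;

ALTER TABLE policyholders MODIFY COLUMN email
  SET MASKING POLICY email_mask;

-- Zero-copy clone: an instant, metadata-only copy for dev/testing.
CREATE TABLE policyholders_dev CLONE policyholders;

-- Load semi-structured JSON from an S3-backed stage into a VARIANT column.
COPY INTO raw_claims (payload)
  FROM @claims_stage
  FILE_FORMAT = (TYPE = 'JSON');
```

In practice these statements would typically be orchestrated through Talend jobs or Snowpipe rather than run ad hoc, per the responsibilities listed above.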
Qualifications & Key Skills:
- Candidates must be authorized to work in the US without company sponsorship. The company will not support the STEM OPT I-983 Training Plan endorsement for this position.
- 5+ years with Snowflake on AWS and Talend Data Integration.
- 7+ years of hands-on data warehouse and data integration (ELT/ETL) experience.
- 7+ years of ETL proficiency with Microsoft Business Intelligence (SSIS, SSRS) and other tools.
- 2+ years of hands-on experience with data visualization (preferably Tableau).
- Strong background and problem-solving skills in enterprise data warehousing, ETL/ELT development, database replication, metadata management, and data quality.
- Hands-on experience in all phases of the SDLC developing ETL solutions using T-SQL, stored procedures, and SSIS.
- Strong data warehouse application knowledge, preferably in the financial/insurance domain, is required.
- Knowledge of version control, CI/CD pipelines, and DevOps tools such as GitHub, Jenkins, Nexus, and uDeploy.
- Knowledge of data profiling, data modeling, and database design is key to this role.
Compensation
The listed annualized base pay range is primarily based on analysis of similar positions in the external market. Actual base pay could vary and may be above or below the listed range based on factors including but not limited to performance, proficiency and demonstration of competencies required for the role. The base pay is just one component of The Hartford’s total compensation package for employees. Other rewards may include short-term or annual bonuses, long-term incentives, and on-the-spot recognition. The annualized base pay range for this role is:
$132,400 - $198,600
Equal Opportunity Employer/Females/Minorities/Veterans/Disability/Sexual Orientation/Gender Identity or Expression/Religion/Age
Tags: Agile Architecture AWS Business Intelligence CI/CD Data pipelines Data quality Data visualization Data warehouse Data Warehousing DevOps EC2 ELT Engineering ETL GitHub JavaScript Jenkins JSON Kanban Parquet Pipelines Scrum SDLC Security Snowflake SQL SSIS STEM Tableau Talend T-SQL Unstructured data XML
Perks/benefits: Career development Equity / stock options Gear Insurance Team events