Data Architect

Erie, PA, United States


Company Description

Derex Technologies Inc specializes in providing IT consulting, staffing solutions, and software services. Globally headquartered in Harrison, New Jersey, since 1996, Derex has delivered the highest-quality technology professionals and an array of customized IT talent solutions designed to improve productivity and drive results for clients throughout North America.

With over two decades of unparalleled experience, Derex provides support to its clientele across industries such as Systems Integration, Banking and Finance, Telecommunications, Pharmaceuticals and Life Sciences, Energy, Healthcare, Technology, Transportation, and local and federal government agencies.

Job Description

Role: Data Architect

Location: Remote

Duration: Long term

 

Must have experience with AWS and with Guidewire Claim Data Access (CDA) or similar insurance domain data.

 

About the Role:

We are seeking an experienced Data Architect to lead and design modern data solutions for a Property & Casualty (P&C) customer undergoing a major data modernization initiative involving Guidewire Claim Data Access (CDA). The ideal candidate will possess strong technical expertise, hands-on experience, and excellent communication skills to successfully deliver enterprise-grade data solutions in Azure/Informatica. This role requires a proactive problem solver who can troubleshoot and optimize complex data pipelines and workflows for maximum efficiency and reliability.

 

Key Responsibilities:

  1. Architect and implement enterprise metadata-driven data pipelines using ETL tools such as Azure Data Factory (ADF) and Informatica.
  2. Design and develop an Operational Data Store (ODS) sourced from Azure Data Lake, ensuring robust, scalable, and high-performing architecture.
  3. Collaborate with stakeholders to integrate and optimize Guidewire Data (CDA) into the data lake architecture, enabling advanced analytics and reporting.
  4. Troubleshoot and resolve issues in data pipelines, workflows, and related processes to ensure reliability and data accuracy.
  5. Continuously monitor and optimize current workflows for performance, scalability, and cost-efficiency, adhering to best practices.
  6. Develop and maintain custom processes using Python, T-SQL, and Spark, tailored to business requirements.
  7. Leverage Azure Functions to design serverless compute solutions for event-driven and scheduled data workflows.
  8. Optimize data workflows and resource usage to ensure cost-efficiency in Azure Cloud environments.
  9. Provide leadership and guidance for implementing Hadoop-based big data solutions where applicable.
  10. Develop a comprehensive understanding of P&C domain data, ensuring alignment with business objectives and compliance requirements.
  11. Communicate technical solutions effectively with cross-functional teams, stakeholders, and non-technical audiences.

 

Required Qualifications:

  1. 13+ years of experience in data architecture, data engineering, and/or ETL development roles, with at least 3 years in the P&C insurance domain.
  2. Proven experience with Azure Cloud Services, including Azure Data Lake, Azure Data Factory, and SQL Server.
  3. Proven ability to leverage Informatica for robust ETL workflows, data integration, and metadata-driven pipeline automation to streamline data processing.
  4. Experience building end-to-end metadata-driven frameworks and continuously optimizing existing workflows for improved performance, scalability, and efficiency.
  5. Strong knowledge of Guidewire Claim Data Access (CDA) or similar insurance domain data.
  6. Expertise in troubleshooting and optimizing data pipelines and workflows for enhanced reliability and performance.
  7. Proficiency in scripting and programming with Python, T-SQL, and Spark for custom data workflows.
  8. Hands-on expertise in building and managing ODS systems from data lakes.
  9. Experience with Azure Functions for serverless architecture.
  10. Familiarity with Hadoop ecosystems (preferred but not mandatory).
  11. Demonstrated ability to design solutions for Azure Cloud Cost Optimization.
  12. Excellent communication skills to engage with technical and business stakeholders effectively.
  13. Experience with metadata management and data cataloging for large-scale data ecosystems.

 

Preferred Skills:

  1. Familiarity with Guidewire systems and their integration patterns.
  2. Experience in implementing Data Governance frameworks.
  3. Certification in Azure (e.g., Azure Data Engineer Associate or Azure Solutions Architect).
  4. Experience with other data platforms/tools such as Hadoop and Databricks.

 

Regards,

 

Manoj

Derex Technologies Inc

Contact: 973-834-5005, Ext. 206

Additional Information

All your information will be kept confidential according to EEO guidelines.

