Technical Architect (BIBA)

United States

Hexaware

We empower clients with world-class technology products, services, and solutions.


Data Architect

Location: Washington, DC (3-4 days onsite per week)

 

Job Description:

We are seeking an experienced and skilled Data Architect/Data Engineer to join our team. The ideal candidate will have a strong background in Data Warehousing, Data Engineering, and Analytics, along with extensive experience in cloud data architecture, specifically on Azure. This role requires hands-on expertise in current technologies and delivery, with a focus on developing and managing complex data pipelines, data lakes, and lakehouses.

Key Responsibilities:
  • Data Warehousing & Engineering: 15+ years of experience in Data Warehousing, Data Engineering, and Analytics.
  • Cloud Data Architecture: 8+ years of experience in cloud data architecture, including at least 4 years specializing in Azure.
  • Azure Databricks Expertise: Hands-on experience (5-7 years) in Azure Databricks, including end-to-end delivery of one large program.
  • ETL Development: High proficiency in developing complex ETL data pipelines using Databricks with PySpark, and in managing and maintaining the Data Lake & Lakehouse, including CDC (Change Data Capture) and SCD (Slowly Changing Dimensions) handling (see the sketch after this list).
  • Unity Catalog: Experience in Unity Catalog is preferred.
  • Integration with Power BI: Knowledge of integrating with Power BI and defining approaches for data self-service.
  • Information Management Solutions: Experience in developing progressive information management solutions and supporting end-to-end development life-cycle processes (SDLC).
  • Business Alignment: Extensive experience in aligning application development with business needs.
  • Problem-Solving: Exceptional analytical and problem-solving skills to address complex challenges.
  • Team Collaboration: Ability to work effectively within a team environment, with excellent communication and presentation skills.
  • Objective Setting: Ability to articulate clear objectives and define qualitative/quantitative measures of success.
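
For context on the CDC/SCD responsibility above, here is a minimal, hedged sketch of a simplified SCD Type 2 upsert on a Delta table in Azure Databricks using PySpark. All table and column names (customer_updates, dim_customer, customer_id, email, address, is_current, start_date, end_date) are hypothetical assumptions for illustration only; they are not taken from this posting.

```python
# Illustrative sketch only: simplified SCD Type 2 upsert on a Delta table with PySpark.
# All table and column names below are hypothetical assumptions.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

updates = spark.table("customer_updates")                      # assumed CDC feed, latest row per key
current = spark.table("dim_customer").filter("is_current = true")

# Keep only incoming rows that are new keys or carry changed attributes.
changed_or_new = (updates.alias("s")
    .join(current.alias("t"),
          F.col("s.customer_id") == F.col("t.customer_id"), "left")
    .filter("t.customer_id IS NULL OR s.email <> t.email OR s.address <> t.address")
    .select("s.*"))

# Step 1: close out the current dimension rows that are being superseded.
(DeltaTable.forName(spark, "dim_customer").alias("t")
    .merge(changed_or_new.alias("s"),
           "t.customer_id = s.customer_id AND t.is_current = true")
    .whenMatchedUpdate(set={"is_current": F.lit(False),
                            "end_date": F.current_date()})
    .execute())

# Step 2: append the new row versions flagged as current.
(changed_or_new
    .withColumn("is_current", F.lit(True))
    .withColumn("start_date", F.current_date())
    .withColumn("end_date", F.lit(None).cast("date"))
    .write.format("delta").mode("append").saveAsTable("dim_customer"))
```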
Preferred Qualifications:
  • Hands-on experience with current technologies and delivery methodologies.
  • Strong understanding of cloud data architecture principles and best practices.
  • Proficiency in managing data governance frameworks, including Unity Catalog.
  • Ability to design and implement scalable data solutions that meet business requirements.
Skills and Competencies:
  • Expertise in Azure Databricks and PySpark.
  • Proficiency in ETL pipeline development and data lake management.
  • Strong communication and presentation skills.
  • Analytical mindset with a methodical approach to problem-solving.
  • Collaborative and team-oriented work style.



