Data Architect II

Bangalore, KA, IN


About the role:

The Data Architect plays a critical role in the production of data for operations and actionable insights. The Data Architect designs, develops, automates, and supports the data products used to extract, transform, and load data from operational systems and other sources into the enterprise data platform for onward consumption by business users, data scientists, and other applications.
 

We are seeking a motivated Data Architect & Engineer who will deliver end-to-end data pipelines to production environments, working in close partnership with peers, internal experts, and business clients to support, organize, and lead various activities within the team.

 

Key tasks:

  • Understand business requirements and convert them into technical specifications
  • Understand the end-to-end application and integration landscape
  • Design and implement data storage solutions like data lakes and data warehouses
  • Optimize data storage for performance, cost, and reliability
  • Integrate data from multiple sources, including APIs, databases, and external data providers
  • Ensure seamless data flow across systems and platforms
  • Build data pipelines to ingest data from source systems to the enterprise data platform
  • Become the subject matter expert for data and integration related topics
  • Develop and maintain data models to support analytical and operational requirements
  • Collaborate with business data analysts to design and implement data schemas
  • Monitor and optimize the performance of data systems and pipelines
  • Identify and resolve bottlenecks and performance issues
  • Build sustainable relationships with key business and IT stakeholders, becoming a trusted partner alongside the product owner
  • Meet with business stakeholders, understand the requirements and advise on efficient solutions
  • Ensure timely customer communication and coordination of follow-up activities
  • Apply Agile (Scrum) and DevOps standard methodologies while driving the initiative forward

 

Your main responsibilities are:

As a Data Architect for digital solutions, you will:

  • Design, develop, and maintain data pipelines on Palantir, Azure, or similar platforms
  • Ensure data quality and integrity through robust validation and transformation processes
  • Create and manage scalable, efficient data pipelines to process and integrate data from various source systems, e.g. Planon, MS Dynamics, SuccessFactors
  • Document data engineering processes, systems, and standards
  • Promote and enforce best practices for data management and engineering
  • Collaborate with the Product Owner and Engineering Lead to build a target architecture for the products in scope for the team and to determine how to improve business processes and generate value
    • Advise the squad on design practices for developing, integrating, and managing digital solutions and services, and on reducing technical debt
    • Collaborate with Engineering Leads to ensure design decisions are aligned across the product area

 

About the team:

The role is assigned to the Data Engineering & Analytics Product Area (Area III), which is the data engineering and analytics backbone for business teams spanning HR, Legal & Compliance, Procurement, Communications, and Corporate Real Estate.

 

About you:

As a successful candidate for this role, you possess:

  • Bachelor's or Master's degree in a relevant quantitative field, e.g. Computer Science, Mathematics, Engineering, or Statistics, is an advantage
  • 4 to 6 years of experience in designing and implementing end-to-end data pipelines, data models, and analytical dashboards for insights
  • Strong understanding of data engineering and analytical engineering techniques with modern programming languages and analytical frameworks; proficient in Python, PySpark, SQL, and TypeScript
  • Well versed in visualization tools such as Palantir Workshop, Slate, and Contour
  • A recent Palantir certification, Azure Data Engineering certification, or AI-102 Azure AI Engineer certification is an advantage
  • Proficient in Spark-based data lake design and operations
  • Experience with integration technologies such as REST/SOAP APIs and event-based architecture
  • Problem-solving skills and commitment to quality and timely delivery
  • Strong interpersonal, written and verbal communication skills, including clear articulation of technical topics to a non-technical audience
  • A desire to continuously upskill and stay relevant with emerging technologies like Palantir Foundry
  • Proficiency in SQL and experience with relational databases (e.g. Oracle, Azure SQL)
  • Familiarity with cloud platforms (e.g. Azure) and their data services
  • Experience with data modeling and schema design

 

About Swiss Re

 

Swiss Re is one of the world’s leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient. We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime. We cover both Property & Casualty and Life & Health. Combining experience with creative thinking and cutting-edge expertise, we create new opportunities and solutions for our clients. This is possible thanks to the collaboration of more than 14,000 employees across the world.

Our success depends on our ability to build an inclusive culture encouraging fresh perspectives and innovative thinking. We embrace a workplace where everyone has equal opportunities to thrive and develop professionally regardless of their age, gender, race, ethnicity, gender identity and/or expression, sexual orientation, physical or mental ability, skillset, thought or other characteristics. In our inclusive and flexible environment everyone can bring their authentic selves to work and their passion for sustainability.

 


If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience.

 

 

Keywords:  
Reference Code: 133374 

 

 

