Data Architect with Databricks (Kraków, Poland)

Kraków, Małopolskie, Poland


Applications have closed

Xebia

Leading global technology consultancy providing strategy, software engineering, advanced training, and managed services to help businesses thrive in the AI-enabled digital era.


Hello, let’s meet!

We are Xebia - a place where experts grow. For nearly two decades now, we've been developing digital solutions for clients from many industries and places across the globe. Among the brands we’ve worked with are UPS, McLaren, Aviva, Deloitte, and many, many more.

We're passionate about Cloud-based solutions. So much so, that we have a partnership with three of the largest Cloud providers in the business – Amazon Web Services (AWS), Microsoft Azure & Google Cloud Platform (GCP). We even became the first AWS Premier Consulting Partner in Poland.

Formerly we were known as PGS Software. In 2021, we joined Xebia Group – a family of interlinked companies driven by the desire to make a difference in the world of technology.

Xebia stands for innovation, talented team members, and technological excellence. Xebia means worldwide recognition, and thought leadership. This regularly provides us with the opportunity to work on global, innovative projects.

Our mission can be captured in one word: Authority. We want to be recognized as the authority in our field of expertise.

What makes us stand out? It's the little details, like our attitude, dedication to knowledge, and belief in people's potential – emphasizing every team member's development. Obviously, these things are not easy to present on paper – so make sure to visit us to see it with your own eyes!

Now, we've talked a lot about ourselves – but we'd love to hear more about you.

Send us your resume to start the conversation and join #Xebia.

You will be:

  • designing and implementing scalable, reliable, and secure data platform architectures using Azure services and Azure Databricks,
  • defining end-to-end data processing pipelines, including ingestion, transformation, and consumption layers,
  • reviewing and optimizing the current solution based on Medallion Architecture principles,
  • collaborating with Product Owners, Data Engineers, Data Scientists, and Business Stakeholders to align the platform's architecture with business objectives,
  • providing technical leadership and guidance to cross-functional teams,
  • optimizing Databricks workloads and resourcing management for performance and cost-efficiency,
  • architecting solutions that scale with growing datasets and evolving business needs,
  • ensuring compliance with data governance and security standards,
  • implementing best practices for managing data privacy, access control, and auditability,
  • staying updated on the latest Azure and Databricks features and identifying opportunities to enhance the data platform,
  • establishing architectural standards and reusable frameworks.

Requirements

Your profile:

  • ready to start immediately,
  • openness to working on-site, from the Kraków office, 2-3 days per week,
  • at least 7 years of relevant experience in design and implementation of data models for enterprise data warehouse/lake initiatives,
  • experience in designing normalized and de-normalized data models,
  • hands-on experience with integrating various data sources, APIs, and real-time streaming solutions (e.g., Event Hub, Kafka),
  • proficiency in optimizing Databricks clusters and Spark jobs,
  • familiarity with Azure-native security features, such as Key Vault, Managed Identities, and role-based access control (RBAC),
  • knowledge and understanding of Databricks Unity Catalog,
  • ability to lead architectural discussions, resolve conflicts, and drive consensus across technical teams,
  • mentorship and upskilling of engineering teams on Azure and Databricks best practices,
  • strong hands-on experience with Azure services, including but not limited to Azure Databricks, Azure Data Factory, Azure Synapse Analytics, Azure Data Lake (Gen2), Azure Blob Storage, Azure Functions, and Azure Key Vault,
  • proficiency in designing and implementing ETL/ELT pipelines with Databricks notebooks and Workflows,
  • expertise in Spark, SQL, and Python programming within Databricks.

Work from the European Union region and a work permit are required.


Recruitment Process:

CV review – HR call – Interview – Client Interview – Decision

Category: Architecture Jobs

Tags: APIs Architecture AWS Azure Consulting Databricks Data governance Data warehouse ELT Engineering ETL GCP Google Cloud Kafka Pipelines Privacy Python Security Spark SQL Streaming

Region: Europe
Country: Poland
