Data Architect

Krakow

Alfa Laval

Enhancing customers’ competitiveness through world-leading sustainable solutions within the Energy, Food, Water and Marine industries.



Every day, we get opportunities to make a positive impact – on our colleagues, partners, customers and society. Together, we’re pioneering the solutions of the future and unlocking the full potential of precious resources. Trusted to act on initiative, we challenge conventional thinking to develop world-leading technologies that inspire progress in vital areas, including energy, food, water and shipping.

As we push forward, the innovative, open spirit that fuels our 140-year-old start-up culture and rapid growth also drives our personal growth. So, as we shape a more resourceful, less wasteful world, we build our careers too.

The data architect is responsible for designing, creating, and managing an organization's data architecture. This role is critical in establishing a solid foundation for data management within the organization, ensuring that data is organized, accessible, secure, and aligned with business objectives. The data architect designs data warehouses, file systems, and databases, and defines how data will be collected, organized, and shared.

Your Responsibilities 

  • Interpret and deliver impactful strategic plans improving data integration, data quality, and data delivery in support of business initiatives and roadmaps 

  • Design the structure and layout of data systems, including databases, warehouses, and lakes 

  • Select and implement database management systems that meet the organization’s needs by defining data schemas, optimizing data storage, and establishing data access controls and security measures 

  • Define and implement the long-term technology strategy and innovations roadmaps across analytics, data engineering, and data platforms 

  • Design and implement ETL processes that move data from various sources into the organization’s data systems 

  • Translate high-level business requirements into data models and appropriate metadata, test data, and data quality standards 

  • Manage senior business stakeholders to secure strong engagement and ensure that project delivery aligns with longer-term strategic roadmaps 

  • Simplify the existing data architecture and deliver reusable services and cost-saving opportunities in line with company policies and standards 

  • Lead and participate in the peer review and quality assurance of project architectural artifacts across the Enterprise Architecture group through governance forums  

  • Define and manage standards, guidelines, and processes to ensure data quality 

  • Work with IT teams, business analysts, and data analytics teams to understand data consumers’ needs and develop solutions 

  • Evaluate and recommend emerging technologies for data management, storage, and analytics 

  • Ensure, through promotion and hands-on examples/templates, that the agreed data architecture and governance are adhered to across the organization 

Job Requirements 

Education 

  • A bachelor’s degree in computer science, data science, engineering, or a related field 

Experience 

  • At least 5 years of relevant experience in design and implementation of data models for enterprise data warehouse/lake initiatives 

  • Experience leading projects involving data warehousing, data modeling, and data analysis 

  • Design experience in Azure Databricks, PySpark, and Power BI 

Skills 

  • Strong proficiency in programming languages such as Python and SQL 

  • Proficiency in the design and implementation of modern data architectures and concepts, including cloud services (Azure preferred; also AWS, GCP), real-time data distribution (Kafka, Dataflow), and modern data warehouse tools (Databricks preferred; also Snowflake) 

  • Experience with data storage technologies such as SQL and NoSQL databases 

  • Understanding of entity-relationship modeling, metadata systems, and data quality tools and techniques 

  • Ability to think strategically and relate architectural decisions and recommendations to business needs and client culture 

  • Ability to assess traditional and modern data architecture components based on business needs 

  • Experience with business intelligence tools and technologies such as ETL tools and Power BI 

  • Ability to regularly learn and adopt modern technologies, especially in the ML/AI realm 

  • Strong analytical and problem-solving skills 

  • Ability to synthesize and clearly communicate large volumes of complex information to senior management with varying levels of technical understanding 

  • Ability to collaborate and excel in complex, cross-functional teams involving data scientists, business analysts, and stakeholders 

  • Ability to guide solution design and architecture to meet business needs 

What we Offer

  • An open environment where you are expected to work independently, with the possibility to influence the content of your work

  • Attractive salary and benefits package

  • Flexible working hours: you can start between 7:30 and 9:30 am

  • Hybrid work schedule (our office is located at Przybyszewskiego 56, Kraków)

  • No formal dress code

  • Annual integration events

  • Employee volunteering opportunities and interesting CSR projects

  • Relocation support within Poland, if needed

We review applications continually, so please submit your application as soon as possible. Please note that due to GDPR, we do not accept applications sent via email; please submit your application online.

We care about diversity, inclusion, and equity in our recruitment processes. We also believe behavioral traits can provide important insights into a candidate's fit for a role. To help us achieve this, we use Pymetrics assessments; upon application, you will be invited to play the assessment games.

