Data Architect/Data Vault Modeller

Kraków, PL, 30-302

GFT Technologies

We see opportunity in technology. In domains such as cloud, AI, mainframe modernisation, DLT and IoT, we blend established practice with new thinking to help our clients stay ahead.

What will you do?

As a Data Architect/Data Vault Modeller, you will design and implement state-of-the-art data processing systems for some of the biggest and most technologically advanced companies in the financial, IoT and retail sectors. Often working directly with stakeholders, up to C-level client representatives, our architects are experts in top-level system design and project scoping.

Your tasks

  • Design and maintain conceptual, logical, and physical data models aligned with business needs
  • Lead the development of relational and analytical database structures, ensuring data integrity and performance
  • Optimize queries and storage for large-scale datasets, collaborating with engineering and operations teams
  • Provide architectural guidance on SQL development, data quality, and metadata management

Your skills

  • Proven experience as a Data Architect, Lead Data Engineer, or in a similar role on a Data Vault 2.0 project
  • Expertise in relational and analytical database design principles
  • Proficiency in dimensional modelling and data normalization techniques
  • Knowledge of performance optimization for large-scale datasets
  • Familiarity with enterprise-scale data warehouse environments
  • Proficiency in SQL and a strong understanding of performance optimization techniques
  • Strong knowledge of the Data Vault 2.0 methodology, including hub, link, and satellite modelling (see the sketch after this list)
  • Experience in designing scalable and flexible data models using the Data Vault approach
  • Understanding of data lineage and auditability within Data Vault frameworks
  • Excellent communication and stakeholder engagement skills
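
For orientation, here is a minimal sketch of the three Data Vault 2.0 building blocks named above: hubs hold business keys, links hold relationships between hubs, and satellites hold descriptive attributes over time. It is written in generic ANSI SQL; the customer and account entities, column names, and types are hypothetical illustrations, not a GFT or client standard.

    -- Hub: one row per unique business key (here, a customer number)
    CREATE TABLE hub_customer (
        customer_hk    CHAR(32)     NOT NULL,  -- hash of the business key
        customer_no    VARCHAR(20)  NOT NULL,  -- the business key itself
        load_date      TIMESTAMP    NOT NULL,  -- when the row arrived
        record_source  VARCHAR(50)  NOT NULL,  -- which system supplied it
        PRIMARY KEY (customer_hk)
    );

    -- Link: one row per unique relationship between hubs
    -- (assumes an analogous hub_account table exists)
    CREATE TABLE link_customer_account (
        customer_account_hk  CHAR(32)    NOT NULL,  -- hash of both keys
        customer_hk          CHAR(32)    NOT NULL,
        account_hk           CHAR(32)    NOT NULL,
        load_date            TIMESTAMP   NOT NULL,
        record_source        VARCHAR(50) NOT NULL,
        PRIMARY KEY (customer_account_hk)
    );

    -- Satellite: descriptive attributes, versioned by load date
    CREATE TABLE sat_customer_details (
        customer_hk    CHAR(32)     NOT NULL,  -- points back to the hub
        load_date      TIMESTAMP    NOT NULL,
        hash_diff      CHAR(32)     NOT NULL,  -- detects attribute changes
        customer_name  VARCHAR(100),
        country        CHAR(2),
        record_source  VARCHAR(50)  NOT NULL,
        PRIMARY KEY (customer_hk, load_date)
    );

The load_date and record_source columns on every table are what give Data Vault its auditability and lineage: each row carries when it arrived and where it came from, and history is kept by inserting new satellite rows rather than updating old ones. This is also what makes the approach more flexible than a classic dimensional star schema, since new sources and relationships become new hubs, links, and satellites instead of changes to existing tables.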

Nice to have

  • Experience in BigQuery, including SQL querying, table partitioning, and clustering (see the sketch after this list)
  • Experience with GCP-native services such as Cloud Storage and Cloud SQL
  • Knowledge of GCP IAM roles and security best practices
  • Familiarity with Google Cloud Data Fusion for data integration pipelines
  • Experience with Apache Airflow for building and managing ETL workflows
  • Proficiency in building efficient and reusable ELT/ETL pipelines
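
To make the BigQuery points above concrete, here is a short sketch using standard BigQuery DDL: a table partitioned by day and clustered by customer, plus a query whose filter lets BigQuery prune partitions and scan only the days it needs. The analytics.events dataset and its columns are hypothetical examples.

    -- Hypothetical events table: partitioned by day, clustered by customer
    CREATE TABLE analytics.events (
        event_ts     TIMESTAMP NOT NULL,
        customer_id  STRING    NOT NULL,
        event_type   STRING,
        payload      JSON
    )
    PARTITION BY DATE(event_ts)          -- one partition per calendar day
    CLUSTER BY customer_id, event_type;  -- co-locates rows for cheap filters

    -- Filtering on the partition column prunes partitions, so BigQuery
    -- scans (and bills for) only the seven days referenced here
    SELECT event_type, COUNT(*) AS events
    FROM analytics.events
    WHERE DATE(event_ts) BETWEEN DATE '2024-01-01' AND DATE '2024-01-07'
      AND customer_id = 'C-1001'
    GROUP BY event_type;

Partitioning and clustering are the main cost and performance levers for large BigQuery tables, which is why they sit alongside SQL querying in the list above.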

We offer you

  • Remote or hybrid work (2 office days per week)
  • Working in a highly experienced and dedicated team
  • Competitive salary and an extra benefits package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
  • Contract of employment or B2B contract
  • Online training and certifications suited to your career path
  • Free online foreign language lessons
  • Regular social events
  • Access to an e-learning platform