Data Architect
Kraków, PL, 30-302
GFT Technologies
We see opportunity in technology. In domains such as cloud, AI, mainframe modernisation, DLT and IoT, we blend established practice with new thinking to help our clients stay ahead.
What will you do?
As a Data Architect, you will design and implement state-of-the-art data processing systems for some of the biggest and most technologically advanced companies in the financial sector. Often working directly with stakeholders, up to C-level client representatives, our architects are experts in top-level system design and project scoping.
Your tasks
- Design and maintain conceptual, logical, and physical data models aligned with business needs, using Data Vault 2.0 (a minimal sketch follows this list)
- Drive data governance processes and documentation
- Lead data model and platform innovations and improvements
- Provide architectural guidance on SQL development, data quality, and metadata management
- Lead the development of analytical and data warehouse systems, ensuring data integrity and performance
- Optimize queries and storage for large-scale datasets, collaborating with engineering and operations teams
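To give a flavour of the modelling work involved, here is a minimal sketch of how a Data Vault 2.0 hub and satellite might be declared as partitioned, clustered BigQuery tables. The dv dataset and all table and column names are invented purely for illustration; real models, keys, and naming conventions are project-specific.

```python
"""Illustrative only: a Data Vault 2.0 hub and satellite declared as
partitioned, clustered BigQuery tables. The dv dataset and every table
and column name below are invented for this sketch."""
from google.cloud import bigquery

DDL = """
CREATE TABLE IF NOT EXISTS dv.hub_customer (
  customer_hk STRING    NOT NULL,  -- hash of the business key
  customer_bk STRING    NOT NULL,  -- business key from the source system
  load_dts    TIMESTAMP NOT NULL,  -- load timestamp
  record_src  STRING    NOT NULL   -- source system identifier
)
PARTITION BY DATE(load_dts)   -- prune scans over the load history
CLUSTER BY customer_hk;       -- co-locate rows for key lookups and joins

CREATE TABLE IF NOT EXISTS dv.sat_customer_details (
  customer_hk STRING    NOT NULL,
  load_dts    TIMESTAMP NOT NULL,
  hash_diff   STRING    NOT NULL,  -- change detection over the attributes
  full_name   STRING,
  email       STRING,
  record_src  STRING    NOT NULL
)
PARTITION BY DATE(load_dts)
CLUSTER BY customer_hk;
"""


def create_raw_vault_tables() -> None:
    client = bigquery.Client()   # uses application-default credentials
    client.query(DDL).result()   # run the multi-statement DDL and wait


if __name__ == "__main__":
    create_raw_vault_tables()
```

Partitioning insert-only vault tables on the load timestamp and clustering on the hash key is one common way to keep full-history tables cheap to scan and fast to join.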
Your skills
- Proven experience as a Data Architect, Lead Data Engineer, or in a similar role
- Experience in designing scalable and flexible data models using the Data Vault methodology or another approach (Kimball, Inmon, Anchor, etc.)
- Expertise in relational and analytical database design principles
- Proficiency in dimensional modelling and data normalization techniques
- Knowledge of performance optimization for large-scale datasets
- Familiarity with enterprise-scale data warehouse environments
- Proficiency in SQL and strong understanding of performance optimization techniques
- Understanding of data lineage and auditability
- Excellent communication and stakeholder engagement skills
Nice to have
- Experience in BigQuery or similar Data Warehouse engines, including SQL querying, table partitioning, and clustering
- Experience with GCP-native services such as Cloud Storage and Cloud SQL
- Knowledge of GCP IAM roles and security best practices
- Familiarity with Google Cloud Data Fusion for data integration pipelines
- Experience with Apache Airflow for building and managing ETL workflows (see the DAG sketch after this list)
- Proficiency in building efficient and reusable ELT/ETL pipelines
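For the orchestration side, here is a minimal sketch of an Airflow 2.x DAG for a simple ELT flow on GCP: land a daily CSV extract from Cloud Storage into a staging table, then merge it into the raw vault. The bucket, staging table, and stored procedure it references are hypothetical.

```python
"""Illustrative only: an Airflow 2.x ELT workflow that lands a CSV extract
from Cloud Storage into a staging table, then merges it into the raw vault.
Bucket, dataset, and procedure names are invented for this sketch."""
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="customer_raw_vault_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the daily extract from GCS into a truncate-and-reload staging table.
    load_staging = GCSToBigQueryOperator(
        task_id="load_staging",
        bucket="example-landing-bucket",                 # hypothetical bucket
        source_objects=["customers/{{ ds }}/*.csv"],
        destination_project_dataset_table="stg.customers",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Merge staged rows into hub/satellite tables via a (hypothetical) stored procedure.
    merge_raw_vault = BigQueryInsertJobOperator(
        task_id="merge_raw_vault",
        configuration={
            "query": {
                "query": "CALL dv.load_customer_raw_vault()",
                "useLegacySql": False,
            }
        },
    )

    load_staging >> merge_raw_vault  # staging must finish before the merge
```

Keeping the transformation itself in SQL (here a stored procedure) and letting Airflow handle only scheduling and dependencies is one way to keep such ELT pipelines reusable.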
We offer you
- Remote or hybrid work (2 office days per week)
- Working in a highly experienced and dedicated team
- Competitive salary and extra benefit package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
- Contract of employment or B2B contract
- Online training and certifications aligned with your career path
- Free online foreign language lessons
- Regular social events
- Access to an e-learning platform