Data Engineer with Scala
Kraków, PL, 30-302
GFT Technologies
We see opportunity in technology. In domains such as cloud, AI, mainframe modernisation, DLT and IoT, we blend established practice with new thinking to help our clients stay ahead.
What will you do?
As a Data Engineer with Scala, your mission will be to develop, test and deploy the technical and functional specifications produced by Solution Designers, Business Architects and Business Analysts, ensuring correct operation and compliance with internal quality standards.
Our client is developing a DataHub to store accounting information from different geographical locations. The main goal is to create efficient, user-friendly and scalable solutions that can be used by different teams.
Your tasks
- You will develop end-to-end ETL processes with Spark/Scala. This includes transferring data from/to the data lake, technical validations, business logic, etc.
- You will work in Scrum and be part of a high-performance team
- You will document your solutions in tools such as JIRA, Confluence, ALM
- You will certify your delivery and its integration with other components, designing and performing the relevant tests to ensure the quality of your team's deliverables
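To make the first task concrete: a minimal sketch of the validate-then-transform shape such an ETL step might take. It uses plain Scala collections in place of Spark DataFrames so it stays self-contained; the record fields, validation rules and aggregation are all hypothetical, not taken from the client's actual pipeline.

```scala
// Hypothetical accounting record; fields are illustrative only.
case class LedgerEntry(id: String, region: String, amountCents: Long)

object EtlSketch {
  // Technical validation: reject malformed raw records before business logic.
  def validate(raw: Map[String, String]): Option[LedgerEntry] =
    for {
      id     <- raw.get("id").filter(_.nonEmpty)
      region <- raw.get("region").filter(_.nonEmpty)
      amount <- raw.get("amount_cents").flatMap(_.toLongOption)
    } yield LedgerEntry(id, region, amount)

  // Business logic: aggregate amounts per region (a stand-in transformation).
  def totalsByRegion(entries: Seq[LedgerEntry]): Map[String, Long] =
    entries.groupBy(_.region).view.mapValues(_.map(_.amountCents).sum).toMap

  def main(args: Array[String]): Unit = {
    val raw = Seq(
      Map("id" -> "1", "region" -> "PL", "amount_cents" -> "1500"),
      Map("id" -> "2", "region" -> "PL", "amount_cents" -> "2500"),
      Map("id" -> "",  "region" -> "DE", "amount_cents" -> "oops") // dropped by validation
    )
    val valid = raw.flatMap(validate)
    println(totalsByRegion(valid)) // Map(PL -> 4000)
  }
}
```

In a real Spark job the same two stages would typically become DataFrame/Dataset transformations, with the validation step routing rejected rows to a quarantine location instead of silently dropping them.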
Your skills
- At least 4 years of experience in data engineering
- At least 2 years of experience working with Spark and Scala
- Strong SQL and Python
- Experience in working with big data – Spark, Hadoop, Hive
- Knowledge of GCP or Azure Databricks is a strong plus
- Experience and expertise in data integration and data management with high data volumes
- Experience working in agile continuous integration/DevOps paradigm and tool set (Git, GitHub, Jenkins, Sonar, Nexus, Jira)
- Experience with different database systems, including Postgres, SQL and Hive
- Fluent English is a must (both written and spoken)
Nice to have
- CI/CD: Jenkins, GitHub Actions
- Orchestration: Control-M, Airflow
- Scripting: Bash, Python
We offer you
- Working in a highly experienced and dedicated team
- Contract of employment or B2B contract
- Hybrid work from our offices – 2 office days per week
- Competitive salary and extra benefit package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
- Online training and certifications fit to your career path
- Online foreign language lessons
- Social events
- Access to e-learning platform
- Ergonomic and functional working space