MID Big Data Engineer (Python, GCP) - BANKING
Poland
Capco
Capco is a global management and technology consultancy dedicated to the financial services and energy industries.

CAPCO POLAND
*We are looking for a candidate based in Kraków (or nearby); this role has a hybrid work mode.
Joining Capco means joining an organisation that is committed to an inclusive working environment where you’re encouraged to #BeYourselfAtWork. We celebrate individuality and recognize that diversity and inclusion, in all forms, is critical to success. It’s important to us that we recruit and develop as diverse a range of talent as we can and we believe that everyone brings something different to the table – so we’d love to know what makes you different. Such differences may mean we need to make changes to our process to allow you the best possible platform to succeed, and we are happy to cater to any reasonable adjustments you may require. You will find the section to let us know of these at the bottom of your application form or you can mention it directly to your recruiter at any stage and they will be happy to help.
Capco Poland is a leading global technology and management consultancy, dedicated to driving digital transformation across the financial services industry. Our passion lies in helping our clients navigate the complexities of the financial world, and our expertise spans banking and payments, capital markets, wealth, and asset management. We pride ourselves on maintaining a nimble, agile, and entrepreneurial culture, and we are committed to growing our business by hiring top talent.
We're seeking a skilled Big Data Engineer to join our team. The ideal candidate will be responsible for designing, implementing, and maintaining scalable data pipelines and solutions for large-scale data processing and analytics, across on-premise, migration, and cloud projects.
BIG DATA ENGINEER @ CAPCO - WHAT TO EXPECT
- You will work with some of the largest banks in the world on engaging projects that will transform the financial services industry.
- You’ll be part of a digital engineering team that develops new financial and data solutions and enhances existing ones, with the opportunity to work on exciting greenfield projects as well as established Tier 1 bank applications used by millions of users.
- You’ll be involved in digital and data transformation processes through a continuous delivery model.
- You will automate and optimize data engineering processes, develop robust and fault-tolerant data solutions, and enhance security standards across both cloud and on-premise deployments.
- You’ll be able to work across different data, cloud and messaging technology stacks.
- You’ll have the opportunity to learn and work with specialized data and cloud technologies to widen your skill set.
THINGS YOU WILL DO
As a key member of the technical team, you will be expected to define and contribute at a high level to many aspects of our collaborative Agile development process:
- Design, implement, and manage scalable data pipelines using Google Cloud Platform (GCP) services such as BigQuery, Google Cloud Storage (GCS), and Dataproc.
- Build and optimize data workflows using SQL and NoSQL databases and APIs.
- Collaborate with cross-functional teams to gather requirements and ensure that data infrastructure supports the broader business objectives.
- Write efficient, maintainable code in Python for data processing and integration tasks.
- Troubleshoot and resolve issues related to data pipelines, databases, and cloud services.
- Ensure data quality, reliability, and security throughout the data pipeline lifecycle.
- Actively contribute to the design of new systems and maintain or upgrade existing data infrastructure.
- Contribute to security designs and maintain advanced knowledge of key security technologies, e.g. TLS, OAuth, and encryption.
- Support internal Capco capabilities by sharing insight and experience.
TECH STACK: Python, GCP, GCS, Dataproc, BigQuery, Spark
Nice to have: PySpark
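As a purely illustrative aside: one of the responsibilities listed above is ensuring data quality throughout the pipeline lifecycle. A minimal sketch of such a check, in plain Python with an invented schema and field names (real pipelines would typically do this in PySpark or before loading into BigQuery), might look like:

```python
# Hypothetical example: a record-validation step run before loading rows
# into a warehouse. The schema below is invented for illustration.
REQUIRED_FIELDS = {"id": int, "amount": float, "currency": str}

def validate(record: dict) -> bool:
    """Return True if the record contains every required field with the expected type."""
    return all(
        field in record and isinstance(record[field], ftype)
        for field, ftype in REQUIRED_FIELDS.items()
    )

def partition_records(records):
    """Split records into (valid, rejected) lists for quality reporting."""
    valid, rejected = [], []
    for rec in records:
        (valid if validate(rec) else rejected).append(rec)
    return valid, rejected

rows = [
    {"id": 1, "amount": 9.99, "currency": "PLN"},
    {"id": 2, "amount": "bad", "currency": "PLN"},  # wrong type: rejected
]
valid, rejected = partition_records(rows)
print(len(valid), len(rejected))  # → 1 1
```

In practice the rejected records would be routed to a dead-letter location for inspection rather than silently dropped, which keeps the main pipeline fault-tolerant.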
SKILLS & EXPERIENCES YOU NEED TO GET THE JOB DONE
- Hands-on experience with Google Cloud Platform (GCP), including BigQuery, GCS, and Dataproc.
- Strong proficiency in Python programming for data engineering tasks.
- Proven experience working with SQL and NoSQL databases.
- Solid understanding of building and maintaining data pipelines using APIs.
- Knowledge of data pipeline orchestration and optimization techniques.
- Experience working with large-scale data sets and cloud-based data storage systems.
- Strong problem-solving abilities and a keen attention to detail.
Nice to have:
- Knowledge of PySpark for large-scale data processing.
- Familiarity with data engineering best practices and emerging technologies.
WHY JOIN CAPCO?
- Employment contract and/or Business to Business - whichever you prefer
- Possibility to work remotely
- Speaking English on a daily basis, mainly with foreign stakeholders and peers
- Multiple employee benefits packages (MyBenefit Cafeteria, private medical care, life-insurance)
- Access to a platform with 3,000+ business courses (Udemy)
- Access to required IT equipment
- Paid Referral Program
- Participation in charity events e.g. Szlachetna Paczka
- Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
- Being part of the core squad focused on the growth of the Polish business unit
- A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
- A work culture focused on innovation and creating lasting value for our clients and employees
ONLINE RECRUITMENT PROCESS STEPS
- Screening call with the Recruiter
- Technical interview: first stage
- Client Interview
- Feedback/Offer
We have been informed of several recruitment scams targeting the public. We strongly advise you to verify identities before engaging in recruitment-related communication. All official Capco communication will be conducted via a Capco recruiter.