Senior Data Engineer (GCP)
Remote job
Addepto
Addepto is a leading consulting and technology company specializing in AI and Big Data, delivering custom-made AI solutions and Machine Learning services tailored to even the most niche industries. We partner with top-tier global enterprises and pioneering startups, including Rolls Royce, Continental, Porsche, ABB, and WGU. Our exclusive focus on AI and Big Data has earned us recognition by Forbes as one of the top 10 AI companies.
As a Senior Data Engineer, you will have the exciting opportunity to work with a team of technology experts on challenging projects across various industries, leveraging cutting-edge technologies. Here are some of the projects for which we are currently seeking talented individuals:
Centralized reporting platform for a growing US telecommunications company. This project involves implementing BigQuery and Looker as the central platform for data reporting. It focuses on centralizing data, integrating various CRMs, and building executive reporting solutions to support decision-making and business growth.
Design and development of a universal data platform for global aerospace companies. This Azure- and Databricks-powered initiative combines diverse enterprise and public data sources. The platform is in the early stages of development, covering the design of architecture and processes, and offers freedom in technology selection.
Design of data transformation and downstream DataOps pipelines for a global car manufacturer. This project aims to build a data processing system for both real-time streaming and batch data. We’ll handle data for business uses like process monitoring, analysis, and reporting, while also exploring LLMs for chatbots and data analysis. Key tasks include data cleaning, normalization, and optimizing the data model for performance and accuracy.
🚀 Your main responsibilities:
- Design, implement, and optimize scalable data pipelines on GCP using BigQuery for both batch and real-time data processing.
- Develop, enhance, and maintain dashboards in Looker to provide insightful and executive-level reporting.
- Integrate external CRM systems and other data sources through API connectors to centralize and streamline data access.
- Leverage SQL, Python, and API connectors for efficient ETL processes, data transformation, and automation.
- Conduct data cleaning, transformation, and modeling to ensure high-quality, consistent data across the platform.
- Collaborate with cross-functional teams to understand business needs and translate them into effective data solutions that align with reporting and strategic goals.
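To give a flavor of the data cleaning and transformation work above, here is a minimal sketch of normalizing a raw CRM record before loading it into a warehouse table. The field names and normalization rules are illustrative assumptions, not the actual pipeline:

```python
from datetime import datetime


def normalize_crm_record(record: dict) -> dict:
    """Normalize one raw CRM record before loading.

    Illustrative only: field names and rules are assumptions.
    """
    return {
        # Trim whitespace and standardize casing on identifiers.
        "customer_id": record["id"].strip().upper(),
        # Lowercase emails; map empty strings to None.
        "email": record.get("email", "").strip().lower() or None,
        # Accept ISO or US-style dates; always emit ISO 8601.
        "signup_date": _to_iso(record["signup_date"]),
    }


def _to_iso(raw: str) -> str:
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):
        try:
            return datetime.strptime(raw.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"unrecognized date: {raw!r}")
```

In a real pipeline, a function like this would run inside an ETL step (or be expressed as SQL transformations in BigQuery) before the data reaches Looker dashboards.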
Requirements
🎯 What you'll need to succeed in this role:
- At least 4 years of commercial experience in a Data Engineer role.
- Experience with BigQuery on GCP as a data warehouse; the project's warehouse is partially built (roughly 10% of tables in place), so you will extend an existing schema.
- Expertise in Looker for building and optimizing dashboards for management reporting.
- Strong experience with API integration, especially for connecting external CRM systems and other third-party services.
- Proficiency in SQL and Python for data processing, transformation, and automation.
- Knowledge of using API connectors for seamless data integration and ETL pipelines.
- Experience in data cleaning, modeling, and ensuring data consistency and quality across systems.
- Hands-on experience with Google Cloud Platform (GCP) and its suite of services to support data processing, storage, and reporting.
- Fluent English (C1 level) is a must.
- Excellent communication skills and consulting experience, including direct client interaction.
- Ability to work independently and take ownership of project deliverables.
- Master’s or Ph.D. in Computer Science, Data Science, Mathematics, Physics, or a related field.
🎁 Discover our perks & benefits:
- Work in a supportive team of passionate enthusiasts of AI & Big Data.
- Engage with top-tier global enterprises and cutting-edge startups on international projects.
- Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
- Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences, including a partnership with Databricks, which offers industry-leading training materials and certifications.
- Choose from various employment options: B2B, employment contracts, or contracts of mandate.
- Make use of 20 fully paid days off available for B2B contractors and individuals under contracts of mandate.
- Participate in team-building events and utilize the integration budget.
- Celebrate work anniversaries, birthdays, and milestones.
- Access medical and sports packages, eye care, and well-being support services, including psychotherapy and coaching.
- Get full work equipment for optimal productivity, including a laptop and other necessary devices.
- With our backing, you can boost your personal brand by speaking at conferences, writing for our blog, or participating in meetups.
- Experience a smooth onboarding with a dedicated buddy, and start your journey in our friendly, supportive, and autonomous culture.