Expert-Level Data Engineer - GCP

Bangalore, Karnataka, India

Aqilea

Welcome to Aqilea, a leading provider of IT and Industrial Services. Our diverse team with a strong technical background delivers exceptional solutions. Discover our innovative services and meet our amazing people with amazing skills.



Company Description

We are a consulting company with a team of technology-loving and happy people!

We love technology, we love design and we love quality. Our diversity makes us unique and creates an inclusive and welcoming workplace where each individual is highly valued.

With us, each individual can be themselves and respects others for who they are. We believe that when a fantastic mix of people gather and share their knowledge, experiences and ideas, we can help our customers on a completely different level.

We are looking for someone who wants to grow with us!

With us, you have great opportunities to take real steps in your career and to take on real responsibility.

Job description

You will be involved in one of the biggest data transformation journeys. As a data engineer, you will work on building data products in the context of the Data Mesh concept, based on a defined target vision and requirements.

We appreciate a multitude of technical backgrounds, and we believe you will enjoy working here if you are passionate about data. In this role, you will be required to implement data-intensive solutions for a data-driven organization.

You will join the Data Engineering Competence area within the AI (Artificial Intelligence), Analytics & Data Domain and be an individual contributor in one of the data product teams. The area supports all our brands globally in creating, structuring and guarding data, and in ensuring that data is available, understandable and of high quality.

Responsibilities:

Apply experience with data query languages (SQL or similar), BigQuery, and different data formats such as Parquet and Avro (see the BigQuery sketch after this list)

Take end-to-end responsibility for designing, developing and maintaining the large-scale data infrastructure required for machine learning projects

Apply a DevOps mindset and principles to manage CI/CD pipelines, Terraform and cloud infrastructure; in our context, this is GCP (Google Cloud Platform)

Leverage your understanding of software architecture and software design patterns to write scalable, maintainable, well-designed and future-proof code

Work in a cross-functional, agile team of highly skilled engineers, data scientists and business stakeholders to build the AI ecosystem
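
To give a flavour of this kind of work, below is a minimal sketch of querying BigQuery from Python with the google-cloud-bigquery client. The project, dataset, table and column names are hypothetical placeholders, not details from this posting.

    from google.cloud import bigquery

    # Minimal sketch: project, dataset, table and column names are hypothetical.
    client = bigquery.Client(project="my-gcp-project")

    query = """
        SELECT order_id, order_date, total_amount
        FROM `my-gcp-project.sales_dataset.orders`  -- e.g. a table loaded from Parquet/Avro files
        WHERE order_date >= '2024-01-01'
        LIMIT 100
    """

    # Run the query and iterate over the resulting rows.
    for row in client.query(query).result():
        print(row.order_id, row.order_date, row.total_amount)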

Tech skills:

GCP services (BigQuery, Cloud Run, Cloud Functions, Pub/Sub, Dataflow, Cloud Composer, etc.; see the Pub/Sub sketch after this list)
SQL
Python
dbt
Terraform
Basics of Azure
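
As an illustration of the GCP services listed above, here is a minimal, hedged sketch of publishing an event to a Pub/Sub topic with the google-cloud-pubsub client; the project and topic names are hypothetical placeholders.

    from google.cloud import pubsub_v1

    # Minimal sketch: project and topic names are hypothetical placeholders.
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-gcp-project", "order-events")

    # publish() takes a bytes payload; extra keyword arguments become message attributes.
    future = publisher.publish(topic_path, b'{"order_id": 42}', source="example-producer")
    print("Published message id:", future.result())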

Required cloud certification: GCP

Additional information

Practical details:

Start: Immediately
Location: Bangalore
Form of employment: Full-time, permanent (until further notice); we apply a 6-month probationary period

We interview candidates on an ongoing basis, so do not wait to submit your application.
