2025 Data Integration Engineer SDS/BSV-ES Bengaluru

Bangalore, India

Bosch Group


Company Description

Bosch Global Software Technologies Private Limited is a 100% owned subsidiary of Robert Bosch GmbH, one of the world's leading global suppliers of technology and services, offering end-to-end Engineering, IT, and Business Solutions. With over 28,200 associates, it is the largest software development center of Bosch outside Germany, making it the technology powerhouse of Bosch in India, with a global footprint and presence in the US, Europe, and the Asia Pacific region.

Job Description

Roles & Responsibilities:
We are seeking a motivated Data Integration Engineer to support the development and operation of reliable, efficient, and secure data integration workflows across various systems. The candidate will work closely with senior engineers and cross-functional teams to understand integration requirements and deliver robust data pipelines.

Work experience:

·       3–5 years of experience in data integration, ETL/ELT workflows, or API-based integration.

·       Exposure to integration tools and platforms like Azure Data Factory, MuleSoft, Boomi, Talend, or Informatica.

·       Familiarity with REST/SOAP APIs and data formats including JSON, XML, CSV.

·       Experience with SQL and NoSQL databases such as PostgreSQL, SQL Server, MongoDB.

·       Exposure to scripting languages such as Python, Shell scripting, or JavaScript.

·       Understanding of message queuing technologies (e.g., Kafka, RabbitMQ, Azure Service Bus).

·       Basic understanding of cloud platforms (Azure, AWS, or GCP).

·       Exposure to version control systems (e.g., Git), CI/CD tools, and Agile methodologies.

·       Basic understanding of data warehousing concepts and dimensional modelling.

·       Exposure to monitoring tools such as Grafana, Prometheus, or ELK stack.

 

Required Skills & Qualifications:

·       Good analytical and debugging skills.

·       Strong written and verbal communication.

·       Willingness to learn and adopt new technologies.

·       Ability to collaborate effectively within a team environment.

·       Understanding of secure data handling and governance principles.

·       Ability to create technical documentation and flow diagrams.

·       Enthusiastic attitude toward continuous improvement.

·       Awareness of data security, compliance (e.g., GDPR), and API rate-limiting practices.

·       Knowledge of error handling, retry policies, and scheduling mechanisms.

Qualifications

Educational qualification:

Graduate/postgraduate degree in Engineering, or a Master's in Computer Science/Computer Applications, from a reputed university.

Experience:

3–5 years of experience in data integration, ETL/ELT workflows, or API-based integration

Mandatory/Required Skills:
·       Familiarity with REST/SOAP APIs and data formats including JSON, XML, CSV.

·       Experience with SQL and NoSQL databases such as PostgreSQL, SQL Server, MongoDB.

·       Exposure to scripting languages such as Python, Shell scripting, or JavaScript.

Preferred Skills:
