Sr. Software Engineer

Johnson Controls India COEE1

Johnson Controls

Applying data from both inside buildings and beyond, our customers can now manage operations systemically.

Job Title - Senior Software Engineer (AI Engineering)

Posting Title

Senior Software Engineer (Full Stack - AI & Data Engineering)

Preferred Locations      

India (Pune)

Introduction

The future is being built today and Johnson Controls is making that future more productive, more secure and more sustainable. We are harnessing the power of cloud, AI/ML and Data analytics, the Internet of Things (IoT), and user design thinking to deliver on the promise of intelligent buildings and smart cities that connect communities in ways that make people’s lives and the world better.

What you will do

The Johnson Controls AI Hub’s mission is to infuse AI capabilities into products through a collaborative approach, working alongside multiple business units. One of the hub’s charters is to create end-to-end enablers that streamline AI/ML operations, from data supply strategy and data discovery through model training and development to deployment of AI services in the cloud as well as at the edge.

The AI Hub team is looking to accelerate the creation of tools, services, and workflows that enable the quick and widespread deployment of AI services on a global scale. We are looking for a hands-on Senior Software Engineer with industry experience to contribute to foundational AI/ML engineering with repeatability in mind. The Senior Engineer will work with data scientists, platform/data architects, and domain experts from teams across JCI to build enablers that help productionize AI/ML models.

AI Engineering: Use sound, widely accepted software engineering principles to deliver high-quality software that forms the foundation of our end-to-end AI/ML solutions and makes buildings smarter.

How you will do it

  • Be part of a high-performing technical team of backend, MLOps, and DevOps engineers and architects, bringing to life workflows that aid in the development and widespread deployment of AI services in the cloud and at the edge
  • Work with Product and Data Science teams to understand and translate requirements into well-designed modular components, accounting for variability in data sources and deployment targets
  • Work with Data Architects, Product Owners, and other specialists to rapidly design, secure, build, test, and release new data-enablement capabilities for data scientists to use in model development and training
  • Help evaluate vendors and open-source and proprietary technologies, and present recommendations for onboarding potential partners and for automating machine learning workflows, model training and versioned experimentation, digital feedback, and monitoring

What we look for

Required

  • BS in Computer Science, Electrical Engineering, or Computer Engineering, or a degree with demonstrated technical abilities in similar areas
  • 5+ years of experience as a Software Engineer in any of the following fields: Finance, Data Science, Cloud Services, IoT
  • 3+ years of programming and object-oriented design experience in a modern language such as Python (preferred), Node.js (preferred), Java, Scala, or C#
  • API-first design experience accounting for security, authentication/authorization, logging and usage patterns
  • Strong hands-on experience with test-driven development and taking ownership of code quality and performance, including automating API tests using SoapUI, REST Assured, or other API automation frameworks, and running performance tests using JMeter
  • Experience with publish-subscribe messaging systems for managing real-time data feeds, such as Kafka or Redpanda (preferred), Apache Spark, or RabbitMQ
  • Container experience using technologies such as Kubernetes, Docker, AKS, OpenShift, Service Fabric
  • Experience building and deploying backend services on managed Kubernetes (AKS on Azure, EKS on AWS) is a plus
  • Experience developing data processing frameworks (ETL and ELT) for an enterprise data warehouse, and building large-scale batch and real-time data pipelines using cloud data technologies such as Snowflake and Apache Airflow with Python, is a plus
  • Knowledgeable in Scrum/Agile development methodologies
  • Strong spoken and written communication skills

Preferred Qualifications

  • MS in Computer Science/Electrical or Computer Engineering
  • 7+ years of experience as a Software Engineer in any of the following fields: Finance, Data Science, Cloud Services, IoT
  • 5+ years of programming and object-oriented design experience in a modern language such as Python (preferred), Node.js (preferred), Java, Scala, or C#
  • 1+ years of experience working alongside Data Scientists to productize AI/ML Models
