Data Engineer - DataOps

Bangalore, Karnataka, India

Wesco

We build, connect, power, and protect the world. As a leading global supply chain solutions provider, we use inspiration to drive innovation.


As a Data Engineer - DataOps, you are responsible for software development lifecycle activities for the business's data applications. You will design, code, test, debug, and document both new features and enhancements to existing ones.

Responsibilities: 

  • Partner with data engineering and business teams to set KPIs, SLAs, and SLIs for data quality and availability.
  • Validate data pipeline executions to ensure they complete within their SLAs (see the sketch after this list).
  • Troubleshoot data pipeline and data integration issues, including bug fixes and performance bottlenecks.
  • Work with multiple cross-functional teams such as Data Engineering, Business Analytics, IT, and Vendor Support to resolve incidents and problems.
  • Continuously monitor the data and alert on any anomalies.
  • Implement role-based security, including AD integration, security policies, and auditing across the data platform environment.
  • Test data to make sure it matches business logic and meets basic operational thresholds.
  • Work with Data Scientists to support AI/ML model development environments and the operationalization of models.
  • Closely monitor and optimize the data platform infrastructure cost.
  • Provide on-call support for P1 issues on weekends and holidays on a rotational basis.
  • A desire to work in a collaborative, intellectually curious environment.
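To make the SLA-validation and anomaly-alerting responsibilities above concrete, here is a minimal, illustrative Python sketch (not Wesco's actual tooling). It assumes pipeline run metadata is already available as a list of records, for example pulled from ADF run history; the pipeline names, thresholds, and fields are hypothetical.

```python
from statistics import mean, stdev

# Hypothetical run metadata; in practice this might come from ADF run
# history or a job-monitoring store (an assumption, not stated in the posting).
runs = [
    {"pipeline": "sales_ingest", "duration_min": 42, "rows_loaded": 1_250_000},
    {"pipeline": "sales_ingest", "duration_min": 38, "rows_loaded": 1_190_000},
    {"pipeline": "sales_ingest", "duration_min": 95, "rows_loaded": 310_000},
]

SLA_MINUTES = 60  # assumed SLA threshold for this pipeline


def check_sla(runs, sla_minutes=SLA_MINUTES):
    """Return runs whose duration exceeds the agreed SLA."""
    return [r for r in runs if r["duration_min"] > sla_minutes]


def flag_volume_anomalies(runs, z_threshold=2.0):
    """Return runs whose row count deviates strongly from the historical mean."""
    volumes = [r["rows_loaded"] for r in runs]
    if len(volumes) < 3:
        return []
    mu, sigma = mean(volumes), stdev(volumes)
    return [r for r in runs if sigma and abs(r["rows_loaded"] - mu) / sigma > z_threshold]


for r in check_sla(runs):
    print(f"SLA breach: {r['pipeline']} took {r['duration_min']} min (SLA {SLA_MINUTES} min)")
for r in flag_volume_anomalies(runs):
    print(f"Volume anomaly: {r['pipeline']} loaded {r['rows_loaded']} rows")
```

In a real DataOps setup the print statements would typically be replaced by alerts routed to the on-call rotation.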

Qualifications:

  • A bachelor's degree in computer science or equivalent
  • 5+ years of experience as a Data Engineer.
  • Experience with Azure big data tools: Azure Databricks, Azure Synapse, HDInsight, ADLS
  • Experience with relational SQL and NoSQL databases.
  • Excellent problem-solving and analytical skills for working with structured and unstructured datasets using Azure big data tools.
  • Experience with data pipeline and workflow management tools: ADF and Logic Apps
  • Experience with Azure cloud services: VM, Azure Databricks, Azure SQL DB, Azure Synapse
  • Experience with stream-processing systems: Azure Stream Analytics
  • Experience with scripting languages such as Python, and strong knowledge of developing Python notebooks.
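Purely as an illustration of the Databricks / ADLS / Python-notebook stack named above, the sketch below shows a minimal notebook-style validation-and-write step. The storage paths, column names, and rules are hypothetical placeholders, not part of the posting, and it assumes a cluster already configured with access to ADLS.

```python
# Minimal Databricks-notebook-style sketch; all paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided implicitly in Databricks notebooks

raw_path = "abfss://raw@<storage_account>.dfs.core.windows.net/orders/"       # placeholder
curated_path = "abfss://curated@<storage_account>.dfs.core.windows.net/orders/"  # placeholder

orders = spark.read.format("parquet").load(raw_path)

# Basic operational checks of the kind a DataOps engineer might automate:
# non-null keys and non-negative amounts.
valid = orders.filter(F.col("order_id").isNotNull() & (F.col("amount") >= 0))
rejected_count = orders.count() - valid.count()
if rejected_count > 0:
    print(f"{rejected_count} rows failed validation and were excluded")

valid.write.format("delta").mode("overwrite").save(curated_path)
```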