Software Engineer III - Databricks Engineer

Bengaluru, Karnataka, India


We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.

As a Software Engineer III - Databricks Engineer at JPMorgan Chase within the Commercial & Investment Bank Technology Team, you'll serve as a seasoned member of an agile team to design and deliver trusted, market-leading data engineering solutions and data products in a secure, stable, and scalable way. You will carry out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job Responsibilities:

  • Design and implement scalable data architectures using Databricks at enterprise scale.
  • Design and implement Databricks integration and interoperability with cloud providers such as AWS, Azure, and GCP, as well as platforms such as Snowflake, Immuta, and OpenAI.
  • Collaborate with data scientists, analysts, and business stakeholders to understand requirements and deliver solutions.
  • Develop and maintain data architecture standards, including data product interfaces, data contracts, and governance frameworks.
  • Implement data governance and security measures to ensure data quality and compliance with industry and regulatory standards.
  • Monitor and optimize the performance and scalability of data products and infrastructure.
  • Stay up-to-date with industry trends and emerging technologies in data mesh and cloud computing.

Required qualifications, capabilities, and skills:

  • Formal training or certification in software engineering concepts and 3+ years of applied experience.
  • Applied experience in the data engineering space using enterprise tools and home-grown frameworks, with 5+ years of specialization in end-to-end Databricks implementation.
  • Experience with multiple cloud and data platforms such as AWS, Azure, Google Cloud, Databricks, and Snowflake.
  • Hands-on practical experience delivering system design, application development, testing, and operational stability
  • An influencer with a proven record of successfully driving change and transformation across organizational boundaries.
  • Strong leadership skills, with the ability to present and effectively communicate to Senior Leaders and Executives.
  • Experience with data governance, security, and industry and regulatory compliance best practices.
  • Deep understanding of Apache Spark, Delta Lake, Delta Live Tables (DLT), and other big data technologies.
  • Strong experience in Python programming 

Preferred qualifications, capabilities, and skills:

  • Experience with Snowflake is preferred.
  • Experience with Kafka is preferred.
  • Experience working in development teams using agile techniques, object-oriented development, and scripting languages is preferred.


Category: Engineering Jobs


Perks/benefits: Career development

Region: Asia/Pacific
Country: India
