Databricks Bigdata Delivery Lead - Assistant Vice President

Hyderabad, India

State Street

State Street provides investment servicing, investment management, investment research and trading services to institutional investors worldwide.



Delivery Lead with Oracle and AWS Databricks

  • Expected to spend 80% of the time on hands-on development, design, and architecture, and the remaining 20% guiding the team on technology and removing other impediments.
  • Capital Markets project experience preferred.
  • Provides advanced technical expertise in analyzing, designing, estimating, and developing software applications to the project schedule.
  • Oversees systems design and implementation of the most complex design components.
  • Creates project plans and deliverables and monitors task deadlines.
  • Oversees, maintains, and supports existing software applications.
  • Provides subject matter expertise in reviewing, analyzing, and resolving complex issues.
  • Designs and executes end-to-end system tests of new installations and/or software prior to release to minimize failures and impact to business and end users.
  • Responsible for resolution, communication, and escalation of critical technical issues.
  • Prepares user and systems documentation as needed.
  • Identifies and recommends industry best practices.
  • Serves as a mentor to junior staff.

  • Acts as a technical architect, lead, and mentor for developers in day-to-day and overall project areas.
  • Proven experience in designing and implementing data warehousing and ETL processes.
  • Strong background in performance tuning and optimization of data processing systems.
  • Hands-on experience using Delta Lake to support large-scale data sets.
  • Experience handling structured/unstructured data and batch/real-time data processing use cases.
  • Experience using Databricks scheduling capabilities or Airflow for orchestrating data pipelines.
  • Familiarity with data governance and security best practices (e.g., Unity Catalog).

Technical Skills

  • Experience with SQL, Python, and Spark.
  • Deep skills in Python, Java, or Scala.
  • Experience with Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT).
  • Hands-on experience developing batch and streaming data pipelines.
  • End-to-end understanding of software architecture, design, development, and implementation.
  • Strong practical experience using Scrum, Agile modelling, and adaptive software development.
  • Continuous integration and build process, test automation, and deployment experience.
  • Ability to understand and grasp the big picture of system components.
  • Experience building environments, architecture and design guides, and architecture and application blueprints.
  • Able to work independently.
  • Excellent problem-solving and analytical skills.
  • Excellent oral and written communication skills.

Oracle SQL, PL/SQL, Big Data tech stack, AWS Databricks

AWS Cloud migration

Shift timing: 12 PM IST to 9 PM IST

Oracle, Impala, Hive, Spark, Sqoop, Scala






