Sr Lead Data Engineer
Pune, Maharashtra, India
JPMorgan Chase & Co.
Embrace this pivotal role as an essential member of a high-performing team dedicated to reaching new heights in data engineering. Your contributions will be instrumental in shaping the future of one of the world’s largest and most influential companies.
As a Senior Lead Data Engineer at JPMorgan Chase within the Consumer and Community Banking, Chase Travel Team, you are an integral part of an agile team that works to enhance, build, and deliver data collection, storage, access, and analytics in a secure, stable, and scalable way. Leverage your deep technical expertise and problem-solving capabilities to drive significant business impact and tackle a diverse array of challenges that span multiple data pipelines, data architectures, and other data consumers.
Job Responsibilities:
- Provide recommendations and insight on data management, governance procedures, and intricacies applicable to the acquisition, maintenance, validation, and utilization of data
- Design and deliver trusted data collection, storage, access, and analytics data platform solutions in a secure, stable, and scalable way
- Define database back-up, recovery, and archiving strategy
- Generate advanced data models for one or more teams using firmwide tooling, linear algebra, statistics, and geometrical algorithms
- Approve data analysis tools and processes
- Create functional and technical documentation supporting best practices
- Advise junior engineers and technologists
- Evaluate and report on access control processes to determine the effectiveness of data asset security
- Add to team culture of diversity, equity, inclusion, and respect
- Develop data strategy and enterprise data models for applications
- Manage data infrastructure: design, construct, install, and maintain large-scale processing systems and infrastructure
- Drive data quality and ensure data is accessible to analysts and data scientists
- Ensure compliance with data governance requirements and that data engineering practices align with business goals
- Author, review, and approve technical requirements and designs
- Re-engineer processes to drive cost-effective business solutions
Required Qualifications, Capabilities, and Skills:
- Formal training or certification on software engineering concepts and 5+ years applied experience
- Advanced knowledge of linear algebra, statistics, and geometrical algorithms
- Advanced understanding of database back-up, recovery, and archiving strategies
- Experience presenting and delivering visual data
- Expert proficiency in multiple programming languages - ideally Python & Java
- Expert proficiency in at least one cluster computing framework (preferably Spark, alternatively Flink or Storm)
- Expert proficiency in at least one cloud data lakehouse platform (preferably AWS data lake services or Databricks, alternatively Hadoop), at least one relational data store (Postgres, Oracle or similar), and at least one NoSQL data store (Cassandra, Dynamo, MongoDB or similar)
- Expert proficiency in at least one scheduling/orchestration tool (preferably Airflow, alternatively AWS Step Functions or similar)
- Expert in Unix scripting, data structures, data serialization formats (JSON, Avro, Protobuf, or similar), and big-data storage formats (Parquet, Iceberg, or similar)
- Expert in data processing methodologies (batch, micro-batch, and streaming) and in one or more data modeling techniques (Dimensional, Data Vault, Kimball, Inmon, etc.)
- Expert in Agile methodology (developing PI plans and roadmaps), TDD (or BDD), and CI/CD tools
- Proficiency in microservices architecture, serverless computing, and distributed cluster computing tools (Docker, Kubernetes, etc.)
- Ability to coach and mentor team members on design and development practices, and to develop strong partnerships with cross-functional teams and leadership to drive data strategy
Preferred Qualifications, Capabilities, and Skills:
- Proficiency in IaC (preferably Terraform, alternatively AWS CloudFormation)
- Proficiency in cloud-based data pipeline technologies such as Fivetran, dbt, or Prophecy.io
- Proficiency in front-end technologies (preferably React, alternatively Angular)
- Proficiency in Snowflake data platform
J.P. Morgan is a global leader in financial services, providing strategic advice and products to the world’s most prominent corporations, governments, wealthy individuals and institutional investors. Our first-class business in a first-class way approach to serving clients drives everything we do. We strive to build trusted, long-term partnerships to help our clients achieve their business objectives.
We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.