Vice President - Data Engineer

BL3 - 11 Kasturba Road, Suite 34, Level 23, Bangalore, Karnataka, India

BlackRock

BlackRock is the world's largest investment manager, helping individuals and financial professionals build a better financial future.



About this role

At BlackRock, technology has always been at the core of what we do – and today, our technologists continue to shape the future of the industry with their innovative work. We are not only curious but also collaborative and eager to embrace experimentation as a means to solve complex challenges. Here you’ll find an environment that promotes working across teams, businesses, regions and specialties – and a firm committed to supporting your growth as a technologist through curated learning opportunities, tech-specific career paths, and access to experts and leaders around the world.

We are seeking a highly skilled and motivated Lead Data Engineer to join the Private Market Data Engineering team within Aladdin Data at BlackRock, driving our Private Market Data Engineering vision of making private markets more accessible and transparent for clients. In this role, you will work cross-functionally with Product, Data Research, Engineering, and Program Management.

Ideal candidates are engineers looking to work in the areas of orchestration, data modeling, data pipelines, APIs, storage, distribution, distributed computation, consumption, and infrastructure. The candidate will have extensive experience leading, designing, and developing data pipelines using Python, the Apache Airflow orchestration platform, DBT (Data Build Tool), Great Expectations for data validation, Apache Spark, MongoDB, Elasticsearch, Snowflake, and PostgreSQL. In this role, you will be responsible for leading, designing, developing, and maintaining robust and scalable data pipelines, and you will collaborate with stakeholders to ensure the pipelines are efficient, reliable, and meet the needs of the business.

Key Responsibilities
  • Design, develop, and maintain data pipelines using Aladdin Data Enterprise Data Platform framework.

  • Develop data transformations using DBT (Data Build Tool) with SQL or Python.

  • Design and develop ETL/ELT data pipelines using Python and SQL, and deploy them as containerized apps on a Kubernetes cluster.

  • Develop APIs for data distribution on top of the standard data model of the Enterprise Data Platform.

  • Ensure data quality and integrity through automated testing and validation using tools like Great Expectations.

  • Implement all observability requirements in the data pipeline.

  • Optimize data workflows for performance and scalability.

  • Perform code and design reviews for tasks completed by other team members.

  • Work with other senior technical staff on the team to evaluate tools and technologies, ensuring a high-quality data platform.

  • Work on the development of technical standards for the product and platform.

  • Collaborate with Engineering Managers and the business team to understand system requirements and develop the best possible technical solution for the product.

  • Provide technical guidance to the team on best practices and programming techniques, and provide solutions to technical issues as they arise.

  • Document data engineering processes and best practices whenever required.
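
To make the pipeline work above concrete, here is a minimal, stdlib-only Python sketch of the kind of extract-transform-load step such a pipeline might containerize. All names and data are hypothetical; in practice steps like these would run as Airflow tasks, with transforms pushed into DBT models and loads targeting a warehouse such as Snowflake.

```python
# Hypothetical ETL sketch (stdlib only) of a single pipeline step.
# In production, each function would map to an orchestrated task.
from dataclasses import dataclass

@dataclass
class Holding:
    fund: str
    asset: str
    market_value: float

def extract() -> list[dict]:
    # Stand-in for reading from a source system (e.g. PostgreSQL, MongoDB).
    return [
        {"fund": "PE-1", "asset": "AcmeCo", "market_value": "1250000.00"},
        {"fund": "PE-1", "asset": "BetaCo", "market_value": "980000.50"},
    ]

def transform(rows: list[dict]) -> list[Holding]:
    # Normalize types, then apply a simple business rule:
    # drop holdings with a non-positive market value.
    holdings = [Holding(r["fund"], r["asset"], float(r["market_value"])) for r in rows]
    return [h for h in holdings if h.market_value > 0]

def load(holdings: list[Holding]) -> int:
    # Stand-in for writing to a warehouse; returns the row count loaded.
    return len(holdings)

if __name__ == "__main__":
    print(load(transform(extract())))  # prints 2
```

The value of splitting extract/transform/load into separate functions is that each step can be tested, retried, and observed independently once wrapped by an orchestrator.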

Required Skills and Qualifications
  • Must have 8 to 12 years of experience in data engineering, with a focus on designing and building data pipelines.

  • Experience leading complex software projects.

  • Strong programming skills in Python.

  • Experience with Apache Airflow or a similar data orchestration framework.

  • Proficiency in DBT for data transformation and modeling.

  • Experience with data quality validation tools such as Great Expectations.

  • Strong SQL skills and experience with relational databases such as SQL Server and PostgreSQL.

  • Experience with cloud-based data warehouse platforms such as Snowflake.

  • Experience working with NoSQL databases such as Elasticsearch and MongoDB.

  • Experience working with container orchestration platforms such as Kubernetes in AWS and/or Azure cloud environments.

  • Experience with cloud platforms such as AWS and/or Azure.

  • Experience working with backend microservices and APIs using Java or C#.

  • Exposure to message-oriented middleware technologies such as Kafka is a plus.

  • Ability to work collaboratively in a team environment.

  • Detail-oriented, with a passion for learning new technologies and strong analytical and problem-solving skills.

  • Experience with financial services applications is a plus.

  • Effective communication skills, both written and verbal.

  • Bachelor’s or master’s degree in Computer Science, Engineering, or a related field.
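
Tools like Great Expectations, listed above, automate declarative data-quality checks. The following is a stdlib-only sketch of that idea; the function names echo Great Expectations' style but are illustrative stand-ins, not the library's actual API.

```python
# Illustrative, stdlib-only data-quality checks in the declarative style
# that Great Expectations automates. Names and data are hypothetical.
def expect_column_values_not_null(rows, column):
    failures = [r for r in rows if r.get(column) is None]
    return {"success": not failures, "unexpected_count": len(failures)}

def expect_column_values_between(rows, column, low, high):
    failures = [r for r in rows if not (low <= r[column] <= high)]
    return {"success": not failures, "unexpected_count": len(failures)}

rows = [
    {"fund": "PE-1", "irr": 0.12},
    {"fund": "PE-2", "irr": 0.31},
    {"fund": "PE-3", "irr": None},
]

null_check = expect_column_values_not_null(rows, "irr")
range_check = expect_column_values_between(
    [r for r in rows if r["irr"] is not None], "irr", 0.0, 1.0
)
print(null_check["success"], range_check["success"])  # prints False True
```

Running checks like these as an automated pipeline stage lets bad records fail loudly before they reach downstream consumers, which is the integrity guarantee the responsibilities above call for.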

Our benefits

To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model

BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock

At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being.  Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress.

This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.

For additional information on BlackRock, please visit www.blackrock.com | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer.  We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.


Region: Asia/Pacific
Country: India
