Data Pipeline Engineering, Associate

HA3-Gurgaon - DLF Cyber City


BlackRock

For over 30 years, BlackRock has worked to strengthen the economy and help investors reach their financial goals.


About this role

Data is at the heart of Aladdin, and the ability to consume, store, analyze, and gain insight from data has increasingly become a key component of our competitive advantage. The DOE team is responsible for the data ecosystem within BlackRock. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, notably investors, operations teams, and data scientists. We focus on evolving our platform to deliver exponential scale to the firm, powering the future growth of Aladdin.

Data Pipeline Engineers at BlackRock get to experience working at one of the most recognized financial companies in the world while being part of a software development team responsible for next-generation technologies and solutions. Our engineers design and build large-scale data storage, computation, and distribution systems. They partner with data and analytics experts to deliver high-quality analytical and derived data to our consumers.

We are looking for data engineers who like to innovate and seek out complex problems. We recognize that strength comes from diversity, and we will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. We are committed to open source, and we regularly give our work back to the community. Engineers looking to work in the areas of data acquisition, ETL workflow orchestration, data modeling, data pipeline engineering, RESTful APIs, relational databases, and data distribution are ideal candidates.

Responsibilities

Data Pipeline Engineers are expected to be involved from the inception of projects: understanding requirements and designing, developing, deploying, and maintaining data pipelines (ETL/ELT). Our goal is to drive up user engagement and adoption of the modern data platform while constantly working to improve platform performance and scalability. (A minimal, illustrative ETL sketch follows the responsibilities list below.)

Deployment and maintenance require close interaction with various teams. This requires maintaining a positive and collaborative working relationship with teams within DOE as well as with the wider Aladdin developer community. Creative and inventive problem-solving skills that reduce turnaround times are highly valued. Preparing user documentation to maintain both development and operations continuity is integral to the role.

  • Brainstorm data engineering ideas and solutions; propose process enhancements that lead to development projects

  • Develop and maintain existing client data distribution, tools, and processes, including participation in tactical and strategic development projects

  • Work alongside program managers, product teams, and business analysts throughout the SDLC

  • Provide L2 or L3 support for technical and/or operational issues

  • Help the team conduct end-to-end tests to ensure production operations run successfully during monthly cycles

  • Speak with confidence to individuals outside the team, both technical and non-technical

  • Lead process engineering for a data pipeline engineering area

  • Contribute to design decisions for complex systems

  • Take responsibility for work beyond assigned tasks; take emotional ownership of projects

  • Address issues promptly and transparently

  • Take on challenges outside of day-to-day responsibilities
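For illustration only, a minimal ETL step of the kind described above might look like the following sketch in Python. Every name here (the trades.csv input, the notional column, the staging_trades table, and the SQLite target) is a hypothetical placeholder, not part of Aladdin or of this role.

    # Minimal, illustrative ETL sketch. File, column, and table names are hypothetical;
    # a real pipeline would target the firm's actual sources and warehouse.
    import sqlite3

    import pandas as pd

    def extract(path: str) -> pd.DataFrame:
        """Read raw records from a delimited file."""
        return pd.read_csv(path)

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        """Drop incomplete rows and derive a notional column."""
        df = df.dropna(subset=["trade_id"])
        df["notional"] = df["quantity"] * df["price"]
        return df

    def load(df: pd.DataFrame, conn: sqlite3.Connection) -> None:
        """Append the transformed frame to a staging table."""
        df.to_sql("staging_trades", conn, if_exists="append", index=False)

    if __name__ == "__main__":
        with sqlite3.connect("warehouse.db") as conn:
            load(transform(extract("trades.csv")), conn)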

Essential Skills

  • BE/BS/B.Tech in Computer Science or equivalent practical experience

  • At least 3 years’ experience as a software engineer

  • Experience in Python and unit, integration, and functional testing (an illustrative test sketch follows this list)

  • Experience in SQL and PL/SQL programming, including stored procedures and UDFs

  • Experience or familiarity with database modeling and normalization techniques

  • Experience or familiarity with object-oriented design patterns

  • Experience with DevOps tools for CI/CD such as Git, Maven, Jenkins, GitLab CI, and Azure DevOps

  • Experience with Agile development concepts and related tools (e.g., ADO, Jira)

  • Ability to troubleshoot and fix performance issues across the codebase and database queries

  • Excellent written and verbal communication skills

  • Passion for learning and implementing new technologies

  • Ability to operate in a fast-paced environment
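As a hedged illustration of the Python and testing items above, a unit test for a small pipeline transform might look like the sketch below. The derive_notional helper and its column names are assumptions made for this example, not an Aladdin API.

    # Hypothetical pytest-style unit test; derive_notional is an assumed helper.
    import pandas as pd

    def derive_notional(df: pd.DataFrame) -> pd.DataFrame:
        """Derive a notional column as quantity * price."""
        out = df.copy()
        out["notional"] = out["quantity"] * out["price"]
        return out

    def test_derive_notional():
        df = pd.DataFrame({"quantity": [10, 2], "price": [5.0, 3.5]})
        result = derive_notional(df)
        assert list(result["notional"]) == [50.0, 7.0]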

Desired Skills

  • Experience with cloud-native services (AWS/Azure)

  • ETL background in any language or tool

  • Experience with Snowflake or other cloud data warehousing products

  • Exposure to workflow management tools such as Airflow (a minimal DAG sketch follows this list)

  • Data ingestion and transformation tools for ETL, e.g., dbt, Spark, Kafka
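As a hedged sketch of the workflow management item above: Airflow is the tool named in the posting, but the dag_id, schedule, and task bodies below are hypothetical, and the example assumes Airflow 2.4 or later (earlier 2.x releases spell the schedule argument schedule_interval).

    # Minimal, illustrative Airflow DAG; dag_id, schedule, and task bodies are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw data from an upstream source")

    def load():
        print("write transformed data to the warehouse")

    with DAG(
        dag_id="example_daily_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task

In practice, such a DAG would invoke the pipeline's real extract, transform, and load steps rather than printing placeholders.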

Our benefits

To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model

BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock

At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being.  Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress.

This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.

For additional information on BlackRock, please visit www.blackrock.com | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer.  We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
