Data Engineer

India - Bengaluru

Capco

Capco is a global management and technology consultancy dedicated to the financial services and energy industries.


Job Title: Senior Data Engineer/Developer

Number of Positions: 2

Job Description:

The Senior Data Engineer will be responsible for designing, developing, and maintaining scalable data pipelines and building out new API integrations to support continuing increases in data volume and complexity. They will collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.

Responsibilities:

  • Design, construct, install, test, and maintain highly scalable data management systems and data pipelines.
  • Ensure systems meet business requirements and industry best practices.
  • Build high-performance algorithms, prototypes, predictive models, and proof of concepts.
  • Research opportunities for data acquisition and new uses for existing data.
  • Develop data set processes for data modeling, mining, and production.
  • Integrate new data management technologies and software engineering tools into existing structures.
  • Create custom software components and analytics applications.
  • Implement and update disaster recovery procedures.
  • Collaborate with data architects, modelers, and IT team members on project goals.
  • Provide senior level technical consulting to peer data engineers during data application design and development for highly complex and critical data projects.

Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience.
  • 5-8 years of proven experience as a Senior Data Engineer or in a similar role.
  • Experience with big data and infrastructure tools: Hadoop, Spark, Kafka, Ansible, Chef, Terraform, Airflow, Protobuf RPC, etc.
  • Expert-level SQL skills for data manipulation (DML) and validation (DB2).
  • Experience with data pipeline and workflow management tools.
  • Experience with object-oriented/functional scripting languages: Python, Java, Go, etc.
  • Strong problem solving and analytical skills.
  • Excellent verbal communication skills.
  • Good interpersonal skills.
  • Ability to provide technical leadership for the team.