Data Engineer

Pune, Maharashtra


About Hevo

Hevo (www.hevodata.com) is a simple, intuitive, and powerful No-code Data Pipeline platform that enables companies to consolidate data from multiple software sources for faster analytics. Hevo powers data analytics for 2,000+ data-driven companies across multiple industry verticals, including Cult.fit, Postman, ThoughtSpot, and Jawa Motorcycles. By automating complex data integration tasks, Hevo allows data teams to focus on deriving groundbreaking insights and driving their businesses forward.

Hevo's mission is simple but bold: build technology from India, for the world, that is simple to adopt and easy to access, so that everyone can unlock the potential of data.

Based in San Francisco and Bangalore, Hevo has seen exponential growth since its inception. With total funding of $42 million from Sequoia India, Qualgro, and Chiratae Ventures, Hevo is now entering a new phase of hyper-growth. Hevoites are thoughtful, helpful problem solvers who are obsessed with making a difference in the lives of their customers, their colleagues, and their own individual trajectories. If you are passionate about redefining the future of technology, Hevo is the place for you.
About the Role

We’re looking for a Data Engineer to help build reliable and scalable data pipelines that power reports, dashboards, and business decisions at Hevo. You’ll work closely with engineering, product, and business teams to make sure data is accurate, available, and easy to use.

Key Responsibilities

  • Independently design and implement scalable ELT workflows using tools like Hevo, dbt, Airflow, and Fivetran.
  • Ensure the availability, accuracy, and timeliness of datasets powering analytics, dashboards, and operations.
  • Collaborate with Platform and Engineering teams to address issues related to ingestion, schema design, and transformation logic.
  • Escalate blockers and upstream issues proactively to minimize delays for stakeholders.
  • Maintain strong documentation and ensure discoverability of all models, tables, and dashboards.
  • Own end-to-end pipeline quality, minimizing escalations or errors in models and dashboards.
  • Implement data observability practices such as freshness checks, lineage tracking, and incident alerts.
  • Regularly audit and improve accuracy across business domains.
  • Identify gaps in instrumentation, schema evolution, and transformation logic.
  • Ensure high availability and data freshness through monitoring, alerting, and incident resolution processes.
  • Set up internal SLAs, runbooks, and knowledge bases (data catalog, transformation logic, FAQs).
  • Improve onboarding material and templates for future engineers and analysts.
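The observability practices named above (freshness checks, incident alerts) can be sketched minimally. The helper, SLA window, and timestamps below are illustrative assumptions, not part of Hevo's actual stack:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness check: a dataset is "fresh" when its last
# successful load falls inside an agreed staleness window (the SLA).
def is_fresh(last_loaded_at: datetime, max_staleness: timedelta) -> bool:
    return datetime.now(timezone.utc) - last_loaded_at <= max_staleness

# A table loaded 30 minutes ago, checked against a 1-hour freshness SLA.
recent_load = datetime.now(timezone.utc) - timedelta(minutes=30)
print(is_fresh(recent_load, timedelta(hours=1)))  # True: within SLA
```

In practice a check like this would run on a schedule (e.g. from an Airflow task) and page the on-call engineer when it returns False.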

Required Skills & Experience

  • 3–5 years of experience in Data Engineering, Analytics Engineering, or related roles.
  • Proficient in SQL and Python for data manipulation, automation, and pipeline creation.
  • Strong understanding of ELT pipelines, schema management, and data transformation concepts.
  • Experience with modern data stack: dbt, Airflow, Hevo, Fivetran, Snowflake, Redshift, or BigQuery.
  • Solid grasp of data warehousing concepts: OLAP/OLTP, star/snowflake schemas, relational & columnar databases.
  • Understanding of REST APIs, webhooks, and event-based data ingestion.
  • Strong debugging skills and ability to troubleshoot issues across systems.
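Event-based ingestion of the kind mentioned above usually starts by normalizing a nested webhook payload into a flat row for loading. The payload shape and field names here are hypothetical; real providers differ:

```python
import json

# Flatten a hypothetical webhook event into a row ready for warehouse loading.
def normalize_event(raw: str) -> dict:
    event = json.loads(raw)
    return {
        "event_id": event["id"],
        "event_type": event["type"],
        # Nested, possibly-missing fields are read defensively.
        "user_id": event.get("user", {}).get("id"),
        "occurred_at": event["timestamp"],
    }

payload = (
    '{"id": "evt_1", "type": "signup", '
    '"user": {"id": "u_9"}, "timestamp": "2024-01-01T00:00:00Z"}'
)
print(normalize_event(payload)["event_type"])  # signup
```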

Preferred Background

  • Experience in high-growth industries such as eCommerce, FinTech, or hyper-commerce environments.
  • Experience working with or contributing to a data platform (ELT/ETL tools, observability, lineage, etc.).

Core Competencies

  • Excellent communication and problem-solving skills
  • Attention to detail and a self-starter mindset
  • High ownership and urgency in execution
  • Collaborative and coachable team player
  • Strong prioritization and resilience under pressure

