Senior ETL Developer

Hyderabad, IN

Milestone Technologies, Inc.

The world's leading companies partner with Milestone Technologies, an IT Services and Digital Solutions company, to deliver IT services and technologies at scale, accelerate digital operations, develop innovative applications, and drive...

We are looking for a Senior ETL Developer. This pivotal role is dedicated to harnessing state-of-the-art data processing technologies such as Snowpark, Snowpipe, Apache Spark, and Databricks Delta Live Tables. The successful candidate will play a key role in evolving, deploying, and overseeing our advanced data infrastructure, which underpins large-scale data processing, analytics, and business intelligence. With deep knowledge of cloud-native tools and platforms including AWS, Azure, Oracle, Snowflake, and Databricks, the Senior ETL Developer will also excel at advanced data integration using cloud-native ETL tools (e.g., AWS Glue, Azure Data Factory) and Oracle Data Integrator (ODI).

Key Responsibilities:

  • Lead the architecture and development of scalable data solutions across both batch and streaming platforms, optimizing the use of cloud technologies (AWS or similar) to meet business and technical demands, including real-time data processing.
  • Analyze complex data elements and systems; develop conceptual, logical, and physical data models; and validate and execute ETL/ELT mappings and transformation logic.
  • Craft and peer-review data models and schemas, enhancing new and existing data sources for the data warehouse to ensure efficiency and scalability.
  • Conduct thorough unit, integration, and system testing of data sources to ensure data integrity against source systems, aiming for continuous performance optimization.
  • Innovate with Snowflake’s Snowpark and Snowpipe to architect scalable data management solutions, facilitating real-time data ingestion (an illustrative Snowpark sketch follows this list).
  • Leverage Apache Spark for comprehensive data processing and analytics, enhancing data workflow efficiency and scalability.
  • Implement Databricks Delta Live Tables to automate ETL processes and enforce data quality and consistency for real-time analytics (a minimal sketch also follows this list).
  • Build and maintain high-performance data pipelines across AWS, Azure, and Databricks, integrating seamlessly with Oracle Data Integrator (ODI).
  • Utilize cloud-native ETL tools to streamline data integration and transformation across diverse data platforms.
  • Develop sophisticated data transformations and machine learning models within Snowflake using Snowpark to boost analytics capabilities.
  • Ensure the availability of real-time data and analytics through efficient configuration and optimization of Snowpipe.
  • Collaborate with business stakeholders and IT departments to transform business needs into technical specifications and practical data models.
  • Uphold data governance, security, and compliance standards across all data platforms and tools.
  • Remain abreast of emerging trends and technologies in data engineering, promoting innovative solutions that propel the data strategy.
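
For illustration only, a minimal Snowpark (Python) sketch of the kind of in-Snowflake transformation referenced above. The connection parameters, table, and column names are hypothetical assumptions, not taken from this posting.

    # Hypothetical Snowpark aggregation running inside Snowflake.
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col, sum as sum_

    # Placeholder connection settings (assumed; supply real values via config or secrets).
    connection_parameters = {
        "account": "...", "user": "...", "password": "...",
        "warehouse": "...", "database": "...", "schema": "...",
    }
    session = Session.builder.configs(connection_parameters).create()

    # Aggregate an assumed ORDERS table into a daily revenue table.
    daily_revenue = (
        session.table("ORDERS")
        .group_by(col("ORDER_DATE"))
        .agg(sum_(col("AMOUNT")).alias("REVENUE"))
    )
    daily_revenue.write.save_as_table("DAILY_REVENUE", mode="overwrite")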
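
Similarly, a hedged sketch of the Delta Live Tables style mentioned above, pairing a declarative table definition with a data-quality expectation. The storage path, table names, and columns are assumptions.

    # Hypothetical Databricks Delta Live Tables (Python) pipeline definition.
    import dlt
    from pyspark.sql import functions as F

    @dlt.table(comment="Raw orders landed from cloud storage (assumed path)")
    def raw_orders():
        # `spark` is provided by the Delta Live Tables runtime.
        return spark.read.format("json").load("/mnt/raw/orders")

    @dlt.table(comment="Cleaned orders for downstream real-time analytics")
    @dlt.expect_or_drop("valid_amount", "amount > 0")  # drop rows failing the quality rule
    def clean_orders():
        return dlt.read("raw_orders").withColumn("order_date", F.to_date("order_ts"))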

Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • 3-5 years of proven expertise in data processing and analytics using Apache Spark, with hands-on experience in Databricks Delta Live Tables for managing real-time data.
  • Proficiency in Snowflake, including mastery of Snowpark and Snowpipe, paired with experience in cloud-native ETL tools (AWS Glue, Azure Data Factory, Google Cloud Dataflow).
  • Comprehensive background in Oracle, Snowflake, Databricks, and Oracle Data Integrator (ODI) for in-depth data integration and processing.
  • Strong programming skills in SQL, Python, Java, or Scala, with the capability to develop intricate data models and algorithms.
  • In-depth knowledge of data warehousing, data modeling, and data lake concepts within a cloud environment.
  • Exceptional problem-solving skills, complemented by the ability to work both independently and collaboratively.
  • Outstanding communication skills, with proficiency in engaging with both technical and non-technical team members.

Category: Engineering Jobs

Perks/benefits: Career development

Region: Asia/Pacific
Country: India
