Lead Data Engineer - R01553055
Bangalore, Karnataka, India
Brillio
From data ingestion and transformation to advanced analytics and visualization, we provide end-to-end solutions to help you drive business growth.
Primary Skills
- GCP, BigQuery, Python, Airflow, SQL, dbt
Job requirements
- About the Role
- We are seeking a Senior Data Engineer with deep expertise in Google Cloud Platform (GCP) and BigQuery to lead cloud modernization initiatives, develop scalable data pipelines, and enable real-time data processing for enterprise-level systems. This is a high-impact role focused on driving the transformation of legacy infrastructure into a robust, cloud-native data ecosystem.
- Key Responsibilities
- 1. Data Migration & Cloud Modernization
- Analyze legacy on-premises and hybrid cloud data warehouse environments (e.g., SQL Server).
- Lead the migration of large-scale datasets to Google BigQuery.
- Design and implement data migration strategies ensuring data quality, integrity, and performance.
- 2. Data Integration & Streaming
- Integrate data from various structured and unstructured sources, including APIs, relational databases, and IoT devices.
- Build real-time streaming pipelines for large-scale ingestion and processing of IoT and telemetry data.
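To illustrate the kind of telemetry processing this responsibility covers, here is a minimal pure-Python sketch of tumbling-window aggregation over sensor readings. The field names and 60-second window are assumptions for illustration; a production pipeline would run this logic in a streaming framework such as Spark Structured Streaming or a GCP-native service rather than in plain Python:

```python
from collections import defaultdict

def tumbling_window_avg(readings, window_s=60):
    """Group (timestamp, sensor_id, value) readings into fixed-size
    time windows and return the per-sensor average for each window."""
    buckets = defaultdict(list)
    for ts, sensor_id, value in readings:
        window_start = ts - (ts % window_s)  # align timestamp to window boundary
        buckets[(window_start, sensor_id)].append(value)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

# Hypothetical sensor readings: (unix_seconds, sensor_id, value)
readings = [
    (0, "s1", 10.0), (30, "s1", 20.0),  # both fall in the window starting at 0
    (65, "s1", 30.0),                   # falls in the window starting at 60
]
averages = tumbling_window_avg(readings)
```

The same windowing idea carries over directly to streaming engines, where the framework handles ordering, late data, and state for you.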
- 3. ETL / Data Pipeline Development
- Modernize and refactor legacy SSIS packages into cloud-native ETL pipelines.
- Develop scalable, reliable workflows using Apache Airflow, Python, Spark, and GCP-native tools.
- Ensure high-performance data transformation and loading into BigQuery for analytical use cases.
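As a sketch of what refactoring an SSIS-style transform into cloud-native code can look like, the snippet below cleans a legacy CSV export into newline-delimited JSON, a format BigQuery load jobs accept natively. The column names (`OrderID`, `Amount`, `Region`) are hypothetical; in practice this callable would run as a task inside an Airflow DAG:

```python
import csv
import io
import json

def transform_orders(raw_csv):
    """Clean a legacy CSV export into newline-delimited JSON rows
    suitable for a BigQuery load job."""
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        rows.append(json.dumps({
            "order_id": int(rec["OrderID"]),
            "amount": round(float(rec["Amount"]), 2),  # normalize precision
            "region": rec["Region"].strip().upper() or "UNKNOWN",
        }))
    return "\n".join(rows)

# Hypothetical legacy export with messy whitespace and precision
raw = "OrderID,Amount,Region\n1,19.991,us-east \n2,5.5,eu-west\n"
ndjson = transform_orders(raw)
```

Keeping the transform as a plain, testable function makes it easy to schedule from Airflow and to unit-test outside the orchestrator.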
- 4. Programming & Query Optimization
- Write and optimize complex SQL queries, stored procedures, and scheduled jobs within BigQuery.
- Develop modular, reusable transformation scripts using Python, Java, Spark, and SQL.
- Continuously monitor and optimize query performance and cost efficiency in the cloud data environment.
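One common BigQuery cost lever behind the responsibilities above is partition pruning: restricting a query to a date range on the partitioning column so only the relevant partitions are scanned. Below is a small illustrative helper (the table and column names are assumptions) that appends such a filter to a query string:

```python
def add_partition_filter(base_sql, partition_col, start_date, end_date):
    """Append a date-range predicate so BigQuery scans only the
    relevant partitions instead of the full table."""
    clause = f"{partition_col} BETWEEN '{start_date}' AND '{end_date}'"
    # Naive check for an existing WHERE clause; fine for a sketch,
    # a real implementation would parse the SQL properly.
    joiner = " AND " if " where " in base_sql.lower() else " WHERE "
    return base_sql + joiner + clause

sql = add_partition_filter(
    "SELECT device_id, AVG(temp) FROM telemetry.readings",
    "event_date", "2024-01-01", "2024-01-07",
)
```

Running the result with a dry run first (the BigQuery client supports dry-run query jobs) is a cheap way to confirm the predicate actually reduced the bytes scanned before the query is scheduled.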
- Required Skills & Experience
- 5+ years in Data Engineering with a strong focus on cloud and big data technologies.
- At least 2 years of hands-on experience with GCP, specifically BigQuery.
- Proven experience migrating on-premises data systems to the cloud.
- Strong development experience with Apache Airflow, Python, and Apache Spark.
- Expertise in streaming data ingestion, particularly in IoT or sensor data environments.
- Strong SQL development skills; experience with BigQuery performance tuning.
- Solid understanding of cloud architecture, data modeling, and data warehouse design.
- Familiarity with Git and CI/CD practices for managing data pipelines.
- Preferred Qualifications
- GCP Professional Data Engineer certification.
- Experience with modern data stack tools like dbt, Kafka, or Terraform.
- Exposure to ML pipelines, analytics engineering, or DataOps/DevOps methodologies.
- Why Join Us?
- Work with cutting-edge technologies in a fast-paced, collaborative environment.
- Lead cloud transformation initiatives at scale.
- Competitive compensation and benefits.
- Remote flexibility and growth opportunities.