Senior Data Engineer - GCP

Karachi, Sindh

NorthBay Solutions

We are currently recruiting Senior Data Engineers (Big Data) with a strong emphasis on Google Cloud Platform (GCP) skills. This role is ideal for individuals with a solid background in data engineering and a passion for Big Data technologies. The successful candidate will have 4 to 6+ years of experience in data engineering, with a particular focus on GCP services.

Location: Karachi, Lahore, Islamabad
Work Arrangement: Hybrid

Key Responsibilities:

  • Primary Responsibility: Leverage GCP technologies to design, develop, and manage complex ETL pipelines, big data solutions, and distributed systems. This includes extensive work with GCP services such as BigQuery, Cloud Composer, and Cloud Storage (see the sketch after this list).
  • Collaborate with a global, multi-cultural team of Data Engineers on a variety of exciting and challenging projects.
  • Apply cutting-edge Big Data and Cloud technologies to build critical, highly complex distributed systems from scratch.
  • Support and guide team members, fostering their development and contributing to a culture of continuous improvement and innovation.
  • Solve complex issues and recommend process improvements that positively impact the business.
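
As an illustration of the first responsibility above, here is a minimal sketch of a Cloud Composer (Airflow) pipeline that loads raw files from Cloud Storage into BigQuery and then runs a SQL transform. The bucket, dataset, and table names are hypothetical placeholders, not part of the role description.

    # Minimal Cloud Composer (Airflow) DAG: GCS -> BigQuery load, then a SQL transform.
    # All resource names below (bucket, dataset, tables) are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="daily_sales_load",       # hypothetical pipeline name
        schedule="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        # Stage raw CSV files from a Cloud Storage bucket into a BigQuery table.
        load_raw = GCSToBigQueryOperator(
            task_id="load_raw",
            bucket="example-raw-bucket",                                # placeholder
            source_objects=["sales/{{ ds }}/*.csv"],
            destination_project_dataset_table="example_ds.raw_sales",  # placeholder
            source_format="CSV",
            skip_leading_rows=1,
            write_disposition="WRITE_TRUNCATE",
        )

        # Transform the staged rows with standard SQL inside BigQuery.
        transform = BigQueryInsertJobOperator(
            task_id="transform",
            configuration={
                "query": {
                    "query": (
                        "CREATE OR REPLACE TABLE example_ds.daily_sales AS "
                        "SELECT order_id, SUM(amount) AS total_amount "
                        "FROM example_ds.raw_sales GROUP BY order_id"
                    ),
                    "useLegacySql": False,
                }
            },
        )

        load_raw >> transform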

Technology Stack & Required Knowledge:

  • GCP Cloud Expertise: Extensive experience with core GCP services including BigQuery, Cloud Composer, and Cloud Storage.
  • 4 to 6+ years of experience as a Data Engineer with a focus on Big Data technologies such as Hadoop, Spark, Hive, and other distributed processing technologies.
  • Proficiency in ETL and data pipelines, data modeling, and SQL scripting, with a focus on integrating data from multiple sources.
  • Strong software development skills in Python, PySpark, and/or Scala for data engineering (a brief PySpark sketch follows this list).
  • Hands-on experience with distributed storage systems (e.g., HDFS, S3) and data warehousing concepts, including relational and columnar databases (e.g., PostgreSQL, MySQL, Redshift, BigQuery).
  • Familiarity with containerization technologies and CI/CD tools, such as Docker and Jenkins, for automating software delivery.
  • Experience with ETL schedulers and workflow orchestrators such as AWS Step Functions or Apache Airflow.
  • Understanding of AWS-hosted data management tools such as data lakes, Databricks, or Snowflake is a plus.
  • Must be familiar with Jira and Agile development methodologies.
  • Self-motivated with strong problem-solving skills, and a passion for technology.
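
To make the Python/PySpark expectation above concrete, here is a minimal sketch of a typical batch job: read Parquet from distributed storage, aggregate, and write a partitioned result back. The paths and column names are hypothetical; gs:// paths apply on GCP (e.g., Dataproc), while s3a:// or hdfs:// would be used elsewhere.

    # Minimal PySpark batch job; all paths and column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily_sales_rollup").getOrCreate()

    # Read raw order events from distributed storage (GCS here; S3/HDFS also work).
    orders = spark.read.parquet("gs://example-bucket/raw/orders/")

    # Aggregate to one row per date and region.
    daily_totals = (
        orders
        .withColumn("order_date", F.to_date("created_at"))
        .groupBy("order_date", "region")
        .agg(
            F.sum("amount").alias("total_amount"),
            F.countDistinct("order_id").alias("order_count"),
        )
    )

    # Partition output by date so downstream warehouse loads can prune files.
    daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
        "gs://example-bucket/curated/daily_totals/"
    )

    spark.stop()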

This is an excellent opportunity to be part of a dynamic team where you will play a key role in leveraging GCP to drive the company's data strategy forward. If you have the required skills and experience, and you're excited about the prospect of working on large-scale, complex projects, we encourage you to apply.
