Cloud Engineer

India - Bengaluru

Aptos

Unify your omnichannel experience and enterprise with Aptos' leading solutions. See why the world's top retailers trust Aptos ONE, POS, Merchandising and more.

Making a job change is a big decision. Why consider Aptos?

You will join a team of remarkable colleagues who are committed and passionate about creating and delivering leading-edge solutions to the retail market. You will be part of an exciting growth journey where we will do everything possible to help you reach and exceed your career dreams. Our colleagues have access to industry-leading training and development opportunities, and the chance to work in a global, diverse culture with offices in 13 countries. You will be part of an inclusive culture that is grounded in our Company's purpose: to make a difference for every colleague, every client, every day.

With years of deep retail DNA, Aptos is a market-leading platform that drives the world’s largest retailers’ product, promotion, commerce and merchandising decisions across online and brick-and-mortar operations. The opportunity at Aptos has never been greater as we transition our solutions to a cloud-native, microservices architecture. Aptos’ solutions optimize more than 135,000 retail locations that impact nearly $2 trillion in annual revenue across fashion, grocery, drug, convenience, general merchandise, discount and sporting goods stores. We hope you’ll be a part of taking innovative solutions to market with the leader in Unified Commerce.

Job Overview:
We are seeking an experienced Cloud Data Engineer with strong expertise in Google Cloud Platform (GCP) to join our Operations Enablement team. The ideal candidate will have experience in supporting, developing, testing, and maintaining GCP-based data pipelines. Proficiency in BigQuery and familiarity with handling support tasks will be crucial in this role. You will collaborate closely with cross-functional teams including support, development, data science, analysts, and other engineers to ensure that data workflows are streamlined and align with business objectives. This position will also contribute to a significant project focused on migrating legacy ETL pipelines from Informatica to GCP-based solutions.

The successful candidate will play a key role in optimizing our data systems as part of an important transformation initiative. We are looking for a driven, motivated Cloud Data Engineer ready to make a substantial impact on our business.

Experience: 4+ years

Key Skills:

  • GCP Data Pipelines
  • BigQuery
  • Composer
  • GitLab
  • Cloud Functions
  • Python
  • SQL Server
  • ETL
  • Automation
  • Performance Tuning
  • Data Warehousing
  • Data Integration

Key Responsibilities:

Pipeline Development & Maintenance:

  • Design, implement, and optimize GCP-based data pipelines utilizing services like Cloud Functions, Cloud Composer, and others (see the sketch after this list).
  • Contribute to complex workstreams from start to finish, ensuring high-quality, on-time delivery.
  • Develop and automate ETL processes for large-scale data integration and transformation.
  • Optimize data pipeline scalability, performance, and reliability.
  • Take part in technical discussions and communicate complex ideas to non-technical stakeholders.
  • Align technical decisions with business objectives and propose impactful solutions.
  • Mentor junior engineers and share expertise across teams.
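
To make the pipeline work above concrete, here is a minimal sketch of the kind of Cloud Composer (Airflow) DAG this role might build: it loads a daily extract from Cloud Storage into a BigQuery staging table and then transforms it with a SQL job. The DAG, bucket, dataset, and table names are illustrative assumptions, not references to Aptos systems.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

    with DAG(
        dag_id="daily_sales_load",          # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Load the day's extract from Cloud Storage into a BigQuery staging table.
        load_to_staging = GCSToBigQueryOperator(
            task_id="load_to_staging",
            bucket="example-landing-bucket",              # assumed bucket
            source_objects=["sales/{{ ds }}/*.csv"],
            destination_project_dataset_table="analytics.stg_sales",
            source_format="CSV",
            skip_leading_rows=1,
            write_disposition="WRITE_TRUNCATE",
        )

        # Transform the staged rows into the reporting table with a SQL job.
        transform = BigQueryInsertJobOperator(
            task_id="transform_sales",
            configuration={
                "query": {
                    "query": (
                        "INSERT INTO analytics.fct_sales "
                        "SELECT * FROM analytics.stg_sales "
                        "WHERE sale_date = '{{ ds }}'"
                    ),
                    "useLegacySql": False,
                }
            },
        )

        load_to_staging >> transform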

BigQuery Management:

  • Design, optimize, and maintain BigQuery environments to support large datasets.
  • Develop efficient, cost-effective queries and data models within BigQuery (see the sketch after this list).
  • Monitor and troubleshoot BigQuery performance, while managing data storage and associated costs.
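
As an illustration of the cost discipline described above, the sketch below parameterizes a query against an assumed date-partitioned table and dry-runs it to estimate bytes scanned before anything is spent; the project, table, and column names are hypothetical.

    from google.cloud import bigquery

    client = bigquery.Client()

    # Filter on the (assumed) partition column and select only the needed fields,
    # so BigQuery prunes partitions instead of scanning the whole table.
    sql = """
        SELECT store_id, SUM(net_amount) AS revenue
        FROM `example-project.analytics.fct_sales`
        WHERE sale_date BETWEEN @start AND @end
        GROUP BY store_id
    """

    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
            bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
        ],
        dry_run=True,            # estimate bytes scanned without running the query
        use_query_cache=False,
    )

    dry_run_job = client.query(sql, job_config=job_config)
    print(f"Estimated bytes processed: {dry_run_job.total_bytes_processed}")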

Data Integration & Transformation:

  • Collaborate with internal teams to understand data needs and design solutions to integrate data from multiple sources into GCP.
  • Ensure consistency, quality, and compliance of data across pipelines and systems.

Collaboration & Support:

  • Work alongside Data Scientists, Analysts, and other engineers to develop and deploy data-driven solutions.
  • Provide support for troubleshooting data pipeline issues and optimizing performance.

Documentation & Reporting:

  • Create and maintain comprehensive documentation for GCP data pipeline architectures and configurations.
  • Provide regular reports and dashboards on pipeline performance, data quality, and health.

Required Skills & Qualifications:

  • Extensive experience with Google Cloud Platform (GCP), including services like BigQuery, Kubernetes, Airflow, Cloud Storage, Dataflow, Cloud Composer, Pub/Sub, and more.
  • Familiarity with big data technologies such as Spark, and experience with programming languages like Python or Scala.
  • Strong proficiency in SQL, particularly for querying and optimizing large datasets in BigQuery.
  • Expertise in Python for building data pipelines and automation scripts.
  • Experience with ETL/ELT for both batch and streaming workloads, with strong skills in troubleshooting pipelines.
  • Knowledge of advanced pipeline concepts, including idempotency, scaling, security, testing, version control, and handling schema changes (an idempotent-load sketch follows this list).
  • Some experience with Informatica is a plus.
  • Familiarity with CI/CD pipelines and automation tools, preferably GitLab.
  • Experience in optimizing pipeline performance and cost efficiency.
  • Some understanding of GCP IAM, monitoring, and logging tools to maintain secure and efficient operations.
  • Background in data warehousing, data integration, and performance tuning.
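
For the idempotency concept noted in the list above, a minimal sketch under assumed project, table, and column names: loading via MERGE means a retried pipeline run updates existing rows and inserts new ones exactly once, rather than duplicating data.

    from google.cloud import bigquery

    client = bigquery.Client()

    # MERGE makes the load safe to re-run: existing orders are updated in place
    # and new orders are inserted once, so retries do not create duplicate rows.
    merge_sql = """
        MERGE `example-project.analytics.fct_sales` AS target
        USING `example-project.analytics.stg_sales` AS source
        ON target.order_id = source.order_id
        WHEN MATCHED THEN
          UPDATE SET target.net_amount = source.net_amount
        WHEN NOT MATCHED THEN
          INSERT (order_id, store_id, sale_date, net_amount)
          VALUES (source.order_id, source.store_id, source.sale_date, source.net_amount)
    """

    client.query(merge_sql).result()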

 

We offer a competitive total rewards package including a base salary determined by the role, experience, skill set, and location. For those in eligible roles, discretionary incentive compensation may be awarded in recognition of individual achievements and contributions. We also offer a range of benefits and programs to meet employee needs, based on eligibility.

We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. By submitting an application for this job, you acknowledge that any personal data or personally identifiable information that you provide to us will be processed in accordance with our Candidate Privacy Notice.
