Cloud Data Platform - Technical Anchor

Chennai, Tamil Nadu, India

Ford Motor Company

Since 1903, we have helped to build a better world for the people and communities that we serve. Welcome to Ford Motor Company.



Job Overview:

As a Cloud Data Platform Engineering Specialist, you will be instrumental in building and optimizing our Enterprise Data Platform on Google Cloud Platform (GCP). Your role will focus on designing, developing, and deploying scalable data solutions that integrate cloud-native technologies, service-oriented architectures, and microservices principles. You’ll also bring full-stack knowledge to the development process, ensuring seamless data integration and access across various layers of the platform.
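
For concreteness, the following is a minimal, purely illustrative sketch (in Python, one of the languages listed under Qualifications) of the kind of data-access microservice this overview describes: a small FastAPI endpoint exposing curated BigQuery data to front-end or downstream consumers. The service, project, dataset, and table names are placeholders, not Ford systems.

    # Hypothetical data-access microservice: a FastAPI endpoint serving
    # curated BigQuery data to front-end or downstream consumers.
    # All project, dataset, and table names are illustrative placeholders.
    from fastapi import FastAPI
    from google.cloud import bigquery

    app = FastAPI()
    bq = bigquery.Client()  # authenticates via Application Default Credentials

    @app.get("/models/{model}/daily-builds")
    def daily_builds(model: str, limit: int = 30):
        """Return recent daily build counts for a vehicle model."""
        job = bq.query(
            """
            SELECT build_date, COUNT(*) AS builds
            FROM `example-project.manufacturing.vehicle_builds`
            WHERE model = @model
            GROUP BY build_date
            ORDER BY build_date DESC
            LIMIT @limit
            """,
            job_config=bigquery.QueryJobConfig(
                query_parameters=[
                    bigquery.ScalarQueryParameter("model", "STRING", model),
                    bigquery.ScalarQueryParameter("limit", "INT64", limit),
                ]
            ),
        )
        return [dict(row) for row in job.result()]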

We are looking for a hands-on developer with in-depth knowledge of cloud fundamentals and infrastructure solutions who can translate business requirements into key functionalities by selecting the right cloud technology for each use case. We value individuals who are eager to learn, capable of mastering a variety of multi-cloud technologies, and dedicated to understanding and meeting the needs of developers with empathy and precision.

Key Responsibilities:

  • Design and Build Data Pipelines: Architect, develop, and maintain scalable data pipelines and microservices that support real-time and batch processing on GCP (an illustrative pipeline sketch follows this list).
  • Service-Oriented Architecture (SOA) and Microservices: Design and implement SOA and microservices-based architectures to ensure modular, flexible, and maintainable data solutions.
  • Full-Stack Integration: Leverage your full-stack expertise to contribute to the seamless integration of front-end and back-end components, ensuring robust data access and UI-driven data exploration.
  • Data Ingestion and Integration: Lead the ingestion and integration of data from various sources into the data platform, ensuring data is standardized and optimized for analytics.
  • GCP Data Solutions: Utilize GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that meet business needs.
  • Data Governance and Security: Implement and manage data governance, access controls, and security best practices while leveraging GCP’s native row- and column-level security features.
  • Performance Optimization: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions.
  • Collaboration and Best Practices: Work closely with data architects, software engineers, and cross-functional teams to define best practices, design patterns, and frameworks for cloud data engineering.
  • Automation and Reliability: Automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency.
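
As a concrete illustration of the pipeline responsibilities above, the sketch below shows a minimal streaming Apache Beam job that reads events from Pub/Sub and appends them to BigQuery, which could be submitted to Dataflow. It is illustrative only; the project, topic, and table identifiers are placeholders, not Ford resources.

    # Hypothetical streaming pipeline: Pub/Sub events parsed and appended to
    # BigQuery, runnable locally or on Dataflow depending on pipeline options.
    # Topic and table identifiers are placeholders, not real resources.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def run():
        # Command-line flags such as --runner=DataflowRunner, --project, and
        # --region are picked up here when the script is launched.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    topic="projects/example-project/topics/vehicle-events")
                | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:telemetry.vehicle_events",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )


    if __name__ == "__main__":
        run()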

Qualifications:

  • Education:
    • Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field. Master’s degree or equivalent experience preferred.
  • Experience:
    • 5+ years of experience in data engineering or software engineering, with at least 2 years focused on cloud data platforms (GCP preferred).
  • Technical Skills: Proficiency in Python, Java, or Scala, with experience designing and deploying cloud-based data pipelines and microservices using GCP tools such as BigQuery, Dataflow, and Dataproc.
  • Service-Oriented Architecture and Microservices: Strong understanding of SOA and microservices, and of their application within a cloud data platform context.
  • Full-Stack Development: Knowledge of front-end and back-end technologies (e.g., React, Node.js), enabling collaboration on data access and visualization layers.
  • Database Management: Experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar engines such as BigQuery.
  • Data Governance and Security: Understanding of data governance frameworks and experience implementing RBAC, encryption, and data masking in cloud environments (an illustrative snippet follows this list).
  • CI/CD and Automation: Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools such as Terraform, and automation frameworks.
  • Problem-Solving: Strong analytical skills with the ability to troubleshoot complex data platform and microservices issues.
  • Certifications (Preferred): GCP Professional Data Engineer, GCP Professional Cloud Architect.
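
To illustrate the row-level security and governance items above, the snippet below applies a BigQuery row access policy through the Python client. It is a hedged example only; the policy, project, dataset, table, column, and group names are placeholders, not Ford resources.

    # Hypothetical example of the row-level security requirement: applying a
    # BigQuery row access policy through the Python client. Project, dataset,
    # table, column, and group names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()

    ddl = """
    CREATE OR REPLACE ROW ACCESS POLICY emea_analysts_only
    ON `example-project.sales.orders`
    GRANT TO ('group:emea-analysts@example.com')
    FILTER USING (region = 'EMEA')
    """

    client.query(ddl).result()  # DDL statements run as ordinary query jobs
    print("Row access policy applied.")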