Senior Platform Engineer – Data Pipelines
Texas, United States
Carrier
Carrier is the leading global provider of healthy, safe, and sustainable building and cold chain solutions, with a world-class, diverse workforce and business segments covering HVAC, refrigeration, and fire and security. We make modern life possible by delivering safer, smarter, and more sustainable services that make a difference to people and our planet while revolutionizing industry trends. This is why we come to work every day. Join us and we can make a difference together.
About This Role
The Senior Platform Engineer – Data Pipelines plays a critical role in designing, building, and optimizing the data infrastructure that enables efficient and scalable ELT (Extract, Load, Transform) pipelines. This role ensures that data is ingested, processed, and made available for analytics, AI, and business intelligence across our data platforms. The ideal candidate has experience with cloud-native architectures, modern data lakehouse technologies, and automation best practices, and will ensure the high availability, security, and performance of our data pipelines.
Key Responsibilities:
- Design and develop scalable ELT pipelines to support enterprise data ingestion, transformation, and storage.
- Optimize data movement and processing workflows using Apache Iceberg, Snowflake, AWS Glue, and other modern data engineering tools.
- Implement best practices for data orchestration, monitoring, and automation, ensuring high-performance and cost-efficient operations.
- Collaborate with Data Engineers and Platform Operations teams to enhance data pipeline reliability, security, and scalability.
- Leverage DevOps and DataOps methodologies to enable CI/CD for data workflows and improve operational efficiency.
- Ensure data quality and governance compliance by integrating metadata management, lineage tracking, and validation frameworks.
- Continuously evaluate and enhance the ELT framework to align with evolving business and analytical needs.
Required Qualifications:
- 10+ years of experience in data engineering, platform engineering, or cloud infrastructure with a focus on ELT pipelines.
- 5+ years of experience with SQL, Python, and distributed computing frameworks such as Spark.
- 5+ years of experience with technologies such as Snowflake, AWS Glue, Airflow, or Qlik.
- Bachelor’s Degree.
Preferred Qualifications:
- Hands-on experience with technologies such as Apache Iceberg.
- Strong understanding of cloud-based data architectures (AWS, Azure, or GCP) and infrastructure automation.
- Experience implementing observability and monitoring for data pipelines, ensuring reliability and performance.
- Familiarity with CI/CD pipelines, containerization (Docker, Kubernetes), and infrastructure as code (Terraform, CloudFormation).