Data Infrastructure Engineer

Orlando, Florida, United States - Remote

Worth AI

Complete applications effortlessly with Worth Pre-Filling. Our AI-powered technology guarantees real-time accuracy and eliminates manual data entry.

Worth AI, a leader in the computer software industry, is looking for a talented and experienced Data Infrastructure Engineer to join its innovative team. At Worth AI, we are on a mission to revolutionize decision-making with the power of artificial intelligence while fostering an environment of collaboration and adaptability, aiming to make a meaningful impact in the tech landscape. Our team values include extreme ownership, one team, and creating raving fans among both our employees and customers.

Worth is looking for a Data Infrastructure Engineer to build and maintain the foundational systems that power our data platform. In this role, you will design, implement, and optimize scalable, reliable, and secure data infrastructure that supports analytics, data science, and product applications across the company.

The ideal candidate is deeply experienced with modern data architectures, cloud platforms, and data orchestration tools. You are passionate about automation, performance tuning, and ensuring high availability of data services. This is a critical role on our data team and offers the opportunity to shape the long-term data strategy of the organization.

Responsibilities

  • Design, build, and maintain scalable and resilient data infrastructure in a cloud environment (AWS, Azure, or GCP).
  • Develop and maintain ETL/ELT pipelines using orchestration tools such as Airflow, Dagster, or dbt.
  • Optimize data workflows for reliability, performance, and cost efficiency across structured and unstructured datasets.
  • Manage data lake and data warehouse environments (e.g., Snowflake, BigQuery, Redshift, Delta Lake).
  • Ensure data security, privacy, and compliance, including role-based access control, data encryption, and audit logging.
  • Collaborate with data scientists, analysts, and product teams to ensure data accessibility, accuracy, and availability.
  • Support real-time and batch data processing frameworks, including Kafka, Spark, Flink, or similar tools.
  • Monitor, troubleshoot, and improve the observability and performance of data systems using tools like Prometheus, Grafana, or Datadog.
  • Maintain CI/CD pipelines for data infrastructure using Terraform, GitHub Actions, or similar tools.

Requirements

  • 5+ years of experience in data engineering, infrastructure engineering, or a related field.
  • Strong programming skills in Python and Node.js.
  • Proficient in SQL and experience with distributed query engines (e.g., Trino, Presto).
  • Experience with cloud-native data platforms such as AWS Glue.
  • Hands-on experience with infrastructure-as-code tools (Terraform, Pulumi, CloudFormation).
  • Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
  • Solid understanding of data governance, quality frameworks, and data lifecycle management.
  • Ability to work in a fast-paced, collaborative environment with a focus on impact and delivery.
  • Experience with streaming data architectures and tools such as Apache Kafka, Kinesis, or Pub/Sub.
  • Background in supporting machine learning or analytics platforms.
  • Exposure to data mesh, data contracts, or modern data stack concepts.
  • Knowledge of DevOps principles applied to data systems.

Benefits

  • Health Care Plan (Medical, Dental & Vision)
  • Retirement Plan (401k, IRA)
  • Life Insurance
  • Unlimited Paid Time Off
  • 9 Paid Holidays
  • Family Leave
  • Work From Home
  • Free Food & Snacks (Access to Industrious Co-working Membership!)
  • Wellness Resources