Principal Data Engineer
Remote, Dallas, Texas
Las Vegas Sands
Las Vegas Sands Corporation is the world leader in developing and operating international, world-class integrated resorts.

Job Description:

Position Overview
The primary responsibility of the Principal Data Engineer is to lead the design and implementation of our data architecture and pipelines for a casino management system being developed from the ground up. This role requires extensive technical expertise and leadership skills to build scalable, reliable, and high-performance data solutions that support real-time processing, analytics, and reporting. The Principal Data Engineer will collaborate with cross-functional teams to ensure seamless integration and data flow across various systems.
All duties are to be performed in accordance with departmental and Las Vegas Sands Corp.’s policies, practices, and procedures. All Las Vegas Sands Corp. Team Members are expected to conduct and carry themselves in a professional manner at all times. Team Members are required to observe the Company’s standards, work requirements and rules of conduct.
Essential Duties & Responsibilities
Lead the design and development of a robust data architecture that aligns with business goals, ensuring it is scalable, secure, and adaptable for future needs.
Architect, develop, and maintain complex data pipelines for efficient data ingestion, transformation, and storage, ensuring high availability and quality.
Oversee the integration of diverse data sources (e.g., transactional systems, third-party APIs, IoT devices) to create a unified data ecosystem for the casino management system.
Define and implement Extract, Transform, Load (ETL) processes that support both batch and real-time analytics, optimizing data movement and processing efficiency.
Establish data governance policies and best practices to ensure compliance with regulatory standards, data security, and privacy protocols.
Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions that enable data-driven decision-making.
Monitor and optimize data pipeline performance, identifying bottlenecks and implementing enhancements to ensure rapid data processing and retrieval.
Provide technical leadership and mentorship to data engineering teams, promoting best practices in data engineering and fostering a culture of innovation.
Maintain comprehensive documentation of data architecture, workflows, and processes to support ongoing development and operational maintenance.
Perform job duties in a safe manner.
Attend work as scheduled on a consistent and regular basis.
Perform other related duties as assigned.
Minimum Qualifications
At least 21 years of age.
Proof of authorization to work in the United States.
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
Must be able to obtain and maintain any certification or license, as required by law or policy.
8+ years of experience in data engineering, with at least 3 years in a principal or lead role, preferably in the gaming or casino industry.
Expertise with data pipeline technologies (e.g., Apache Kafka, Apache Airflow, Apache NiFi) for orchestrating data workflows at scale.
Advanced experience with ETL frameworks and tools (e.g., Talend, Informatica, AWS Glue) for seamless data integration and processing.
Deep knowledge of relational and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra) and experience with data warehousing solutions.
Proficiency in cloud data solutions (e.g., AWS, Azure, Google Cloud) and familiarity with their associated data services (e.g., AWS Redshift, Google BigQuery).
Strong programming skills in languages relevant to data engineering (e.g., Python, Java, Scala) for developing data pipelines and processing workflows.
Expertise in data modeling techniques (e.g., star schema, snowflake schema) to support complex analytics and reporting requirements.
Demonstrated experience with data quality frameworks and tools to ensure data integrity and compliance with business standards.
Strong understanding of continuous integration/continuous deployment (CI/CD) practices and tools (e.g., Jenkins, GitLab CI) for automating deployment processes.
Exceptional analytical and problem-solving abilities with a focus on delivering high-quality data solutions.
Proven ability to lead and mentor technical teams, fostering a culture of knowledge sharing and continuous improvement.
Strong interpersonal skills with the ability to communicate effectively and interact appropriately with management, other Team Members, and outside contacts of different backgrounds and levels of experience.
Physical Requirements
Must be able to:
Physically access assigned workspace areas with or without reasonable accommodation.
Work indoors and be exposed to various environmental factors such as, but not limited to, CRT, noise, and dust.
Utilize laptop and standard keyboard to perform essential functions of the job.