Data Engineer - Las Vegas, USA

Photon

Photon, a global leader in digital transformation services and IT consulting, works with 40% of the Fortune 100 companies as their digital agency of choice.

Greetings, everyone!

Who are we? 

For the past 20 years, we have powered digital experiences for the Fortune 500. Since 1999, we have grown from a handful of people to more than 4,000 team members across the globe, engaged in a wide range of digital modernization work. For a brief one-minute video about us, see https://youtu.be/uJWBWQZEA6o.

What will you do? What are we looking for?

Job Description:

Develop and maintain data pipelines, ELT processes, and workflow orchestration using Apache Airflow, Python, and PySpark to ensure efficient and reliable delivery of data (a minimal orchestration sketch follows this list).
Design and implement custom connectors to facilitate the ingestion of diverse data sources into our platform, including structured and unstructured data from various document formats.
Collaborate closely with cross-functional teams to gather requirements, understand data needs, and translate them into technical solutions.  
Implement DataOps principles and best practices to ensure robust data operations and efficient data delivery.  
Design and implement data CI/CD pipelines to enable automated and efficient data integration, transformation, and deployment processes.  
Monitor and troubleshoot data pipelines, proactively identifying and resolving issues related to data ingestion, transformation, and loading.  
Conduct data validation and testing to ensure the accuracy, consistency, and compliance of data.  
Stay up-to-date with emerging technologies and best practices in data engineering.  
Document data workflows, processes, and technical specifications to facilitate knowledge sharing and ensure data governance.  
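
As a rough illustration of the orchestration work described above, here is a minimal sketch of an Airflow DAG that runs a single PySpark ELT step. This is a sketch only, not Photon's actual pipeline: the DAG id, schedule, and data paths are hypothetical placeholders, and it assumes Airflow 2.4+ with PySpark available to the worker.

```python
# A minimal sketch only, assuming Airflow 2.4+ and PySpark; the DAG id,
# schedule, and data paths below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_elt_step():
    """One ELT step: read raw JSON, deduplicate, write curated Parquet."""
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("daily_orders_elt").getOrCreate()
    raw = spark.read.json("/data/landing/orders/")  # hypothetical landing path
    curated = raw.dropDuplicates(["order_id"]).na.drop(subset=["order_id"])
    curated.write.mode("overwrite").parquet("/data/curated/orders/")
    spark.stop()


with DAG(
    dag_id="daily_orders_elt",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="run_elt_step", python_callable=run_elt_step)
```

In practice, heavier Spark jobs would typically be submitted to a cluster (for example, via the Spark provider's SparkSubmitOperator) rather than run inside the Airflow worker process.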
  
Experience:  
Bachelor's degree in Computer Science, Engineering, or a related field.  
8+ years of experience in data engineering, ELT development, and data modeling.  
Proficiency in Apache Airflow and Spark for data transformation, data integration, and data management.  
Experience implementing workflow orchestration using tools such as Apache Airflow, SSIS, or similar platforms.  
Demonstrated experience in developing custom connectors for data ingestion from various sources (a skeletal connector example follows this list).  
Strong understanding of SQL and database concepts, with the ability to write efficient queries and optimize performance.  
Experience implementing DataOps principles and practices, including data CI/CD pipelines.  
Excellent problem-solving and troubleshooting skills, with a strong attention to detail.  
Effective communication and collaboration abilities, with a proven track record of working in cross-functional teams.  
Familiarity with data visualization tools such as Apache Superset, and with dashboard development.  
Understanding of distributed systems and working with large-scale datasets.  
Familiarity with data governance frameworks and practices.  
Knowledge of data streaming and real-time data processing technologies (e.g., Apache Kafka).  
Strong understanding of software development principles and practices, including version control (e.g., Git) and code review processes.  
Experience with Agile development methodologies and working in cross-functional Agile teams.  
Ability to adapt quickly to changing priorities and work effectively in a fast-paced environment.  
Excellent analytical and problem-solving skills, with a keen attention to detail.  
Strong written and verbal communication skills, with the ability to effectively communicate complex technical concepts to both technical and non-technical stakeholders.  
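
To make the custom-connector requirement concrete, here is a skeletal Python connector that pages through a hypothetical REST endpoint and yields records for downstream loading. Every name in it (the class, the endpoint, the pagination parameters) is an illustrative assumption, not an actual Photon interface.

```python
# A skeletal example only; the endpoint, pagination scheme, and field names
# are hypothetical stand-ins for whatever source a real connector would wrap.
from typing import Any, Dict, Iterator

import requests


class RestSourceConnector:
    """Pull records from a paginated JSON API, one page at a time."""

    def __init__(self, base_url: str, page_size: int = 100) -> None:
        self.base_url = base_url
        self.page_size = page_size

    def extract(self) -> Iterator[Dict[str, Any]]:
        page = 1
        while True:
            resp = requests.get(
                self.base_url,
                params={"page": page, "per_page": self.page_size},
                timeout=30,
            )
            resp.raise_for_status()
            rows = resp.json()
            if not rows:  # an empty page signals the end of the source data
                return
            yield from rows
            page += 1
```

A connector shaped like this plugs naturally into an orchestration step: a task calls extract() and streams the records into a staging area or warehouse loader.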
  
Required Skills:  
Python, PySpark, SQL, Airflow, Trino, Hive, Snowflake, Agile Scrum  
Good to have:  
Linux, OpenShift, Kubernetes, Superset  

Compensation, Benefits and Duration

Minimum Compensation: USD 34,000
Maximum Compensation: USD 120,000
Compensation is based on the actual experience and qualifications of the candidate. The above is a reasonable, good-faith estimate for the role.
Medical, vision, and dental benefits, a 401(k) retirement plan, variable pay/incentives, paid time off, and paid holidays are available for full-time employees.
This position is also available to independent contractors.
No applications will be considered if received more than 120 days after the date of this posting.

Category: Engineering Jobs

Tags: Agile, Airflow, CI/CD, Computer Science, Data governance, Data management, DataOps, Data pipelines, Data visualization, Distributed Systems, ELT, Engineering, Git, Kafka, Linux, Pipelines, PySpark, Python, Scrum, Snowflake, Spark, SQL, SSIS, Streaming, Superset, Testing, Unstructured data

Perks/benefits: 401(k) matching, health care

Region: North America
Country: United States
