Data Engineer

Cincinnati, OH; New York, NY

Luma Financial Technologies

Luma Financial Technologies: The leading platform for learning, comparing, and managing structured products, annuities, and life insurance.


About Luma Financial Technologies


Founded in 2018, Luma Financial Technologies (“Luma”) has pioneered a cutting-edge fintech software platform adopted by broker/dealer firms, RIA offices, and private banks around the world. With Luma, institutional and retail investors have a fully customizable, independent, buy-side technology platform that helps financial teams more efficiently learn about, research, purchase, and manage alternative investments as well as annuities. Luma gives these users the ability to oversee the full, end-to-end process lifecycle through a suite of solutions, including education resources and training materials; creation and pricing of custom structured products; electronic order entry; and post-trade management. By prioritizing transparency and ease of use, Luma is a multi-issuer, multi-wholesaler, and multi-product option that advisors can use to best meet their clients’ specific portfolio needs. Headquartered in Cincinnati, OH, Luma also has offices in New York, NY; Miami, FL; Zurich, Switzerland; and Lisbon, Portugal. For more information, please visit Luma’s website.

About the role


We are looking for an experienced Data Engineer to lead our data infrastructure development, focusing on building robust, scalable, and efficient data solutions. The ideal candidate will bring expertise in modern data engineering technologies and a proven track record of delivering high-performance data pipelines.


Key Responsibilities & Opportunities

Data Pipeline Development

  • Design, develop, and maintain advanced data pipelines in Snowflake using dbt
  • Design, develop, and maintain data pipelines using Python
  • Implement and optimize complex ETL/ELT processes
  • Ensure comprehensive data quality and consistency across multiple systems

Performance and Optimization

  • Create and optimize sophisticated SQL queries for advanced reporting and analysis
  • Develop efficient database queries with a focus on performance optimization
  • Troubleshoot complex data transformation challenges

Monitoring and Reliability

  • Implement and manage production data pipeline monitoring
  • Develop proactive health checks and monitoring protocols
  • Diagnose and rapidly resolve data integration issues

Cross-Functional Collaboration

  • Interface effectively with product, engineering, and business intelligence teams
  • Translate complex technical requirements into comprehensive data solutions
  • Provide technical leadership and guidance on data engineering challenges

Qualifications


  • 3-5 years of professional experience in data engineering
  • Bachelor's degree in Computer Science, Data Science, or related field
  • Excellent written and verbal communication skills
  • Proven ability to collaborate effectively across geographical boundaries
  • Demonstrated technical expertise in:

  • Python
  • Advanced SQL
  • dbt
  • Snowflake
  • Data pipeline architecture
