Senior Data Engineer

Bulgaria, Georgia, Hungary, Lithuania, Poland, Romania, Uzbekistan

Exadel

Advance your business through technology and pave the way to becoming a digital leader with Exadel, an enterprise software development and consulting company.



We are seeking a Senior Data Engineer specializing in streaming systems to join our platform team. In this individual contributor role, you will design and implement high-throughput streaming pipelines that process AI telemetry data for carbon emissions calculations. You'll work with cutting-edge technology to help organizations understand and optimize their AI carbon footprint in near real time.

Work at Exadel - Who We Are 

Since 1998, Exadel has been engineering its products and custom software for clients of all sizes. Headquartered in Walnut Creek, California, Exadel has 2,000+ employees in development centers across America, Europe, and Asia. Our people drive Exadel’s success and are at the core of our values.

About Our Customer

Our customer is a public benefit corporation on a mission to decarbonize the advertising and media industry. It has developed an accurate emissions model for the digital advertising ecosystem, providing the data foundation to measure, reduce, and offset digital emissions. The team's experience includes inventing the ad exchange and developing technology used to monetize the Internet.

Requirements

  • 5+ years of experience building high-throughput data pipelines
  • Strong expertise in streaming architectures and ETL design
  • Proficiency with Apache Beam or similar streaming frameworks
  • Experience with cloud platforms (preferably GCP)
  • Solid understanding of data modeling and optimization
  • Competency in time-series databases and real-time analytics

Nice to Have

  • Experience with ClickHouse or similar columnar databases
  • Background in AI/ML infrastructure monitoring
  • Skills in Golang development
  • Track record of working in distributed teams

English level

Intermediate+

Responsibilities 

  • Design and implement scalable data pipelines processing 50k+ events per second
  • Build robust file-based ingestion systems with sub-30-second latency requirements
  • Develop real-time analytics capabilities for carbon emissions visualization
  • Create reprocessing frameworks for historical data analysis
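To give a flavor of the windowed-aggregation work these responsibilities involve, here is a minimal sketch of tumbling-window aggregation of AI telemetry into CO2e estimates. This is an illustration only, not the customer's actual pipeline: the event fields, the 60-second window, and the fixed grid-intensity constant are all hypothetical, and a production system would use a streaming framework such as Apache Beam with real, time-varying carbon-intensity data.

```python
from collections import defaultdict
from dataclasses import dataclass

WINDOW_SECONDS = 60  # hypothetical tumbling-window width


@dataclass
class TelemetryEvent:
    timestamp: float   # event time, seconds since epoch
    model_id: str      # which AI model produced the telemetry
    energy_kwh: float  # energy drawn during the event


def window_start(ts: float, width: int = WINDOW_SECONDS) -> float:
    """Align an event timestamp to the start of its tumbling window."""
    return ts - (ts % width)


def aggregate(events, grid_intensity_g_per_kwh: float = 400.0):
    """Sum energy per (window, model) and convert to grams of CO2e.

    The grid-intensity constant is an illustrative assumption; a real
    system would look up regional, time-varying carbon intensity.
    """
    totals = defaultdict(float)
    for e in events:
        totals[(window_start(e.timestamp), e.model_id)] += e.energy_kwh
    return {k: v * grid_intensity_g_per_kwh for k, v in totals.items()}


events = [
    TelemetryEvent(0.0, "model-a", 0.002),
    TelemetryEvent(30.0, "model-a", 0.003),
    TelemetryEvent(65.0, "model-a", 0.001),
]
print(aggregate(events))
```

At the 50k+ events-per-second scale mentioned above, the same grouping logic would run sharded across workers with watermark handling for late events, which is where a framework like Beam earns its keep.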


Category: Engineering Jobs

Tags: Architecture Data analysis Data pipelines Engineering ETL GCP Golang Machine Learning ML infrastructure Pipelines Streaming

Perks/benefits: Team events

Regions: Asia/Pacific Europe
