Data Engineer - Kosovo

Complex Ramiz Sadiku, Prishtinë

Radix


About Radix

Radix is a fast-growing SaaS company transforming how the multifamily industry uses data. Our values of Curiosity, Resilience, Impact, Courage, and Responsibility guide how we work, grow, and lead. At Radix, data is our superpower: from benchmarking rents to powering predictive analytics, everything we build starts with clean, reliable, and accessible data.

We believe exceptional people build exceptional companies, and our Data Engineer will play a key role in scaling the platforms and pipelines that turn raw information into industry-shaping intelligence.

This role is based onsite in our Prishtina, Kosovo office.

 

Your Impact

As a Data Engineer at Radix, you'll design, build, and optimize the infrastructure that powers our AI/ML models, analytics dashboards, and customer-facing products. You'll work closely with data scientists, product managers, and engineers to ensure the right data shows up at the right time: securely, accurately, and efficiently. Your work will directly shape how thousands of multifamily professionals discover insights and make data-driven decisions.

 

Key Outcomes

  • Reliable Data Pipelines: Deliver highly available, low-latency ETL/ELT pipelines to ingest and transform high-volume data
  • Scalable Architecture: Build cloud-native systems (e.g., CDC, streaming, lakehouse) that scale with our business
  • Data Quality & Governance: Implement automated testing, monitoring, and alerting to reduce manual fixes
  • Cross-Team Enablement: Provide self-service data access to accelerate analytics, experimentation, and modeling
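For candidates wondering what "automated testing, monitoring, and alerting" might look like in practice, here is a minimal pure-Python sketch of a batch data-quality check. It is an illustration only; the field names (`unit_id`, `rent`) and thresholds are hypothetical, not Radix's actual schema:

```python
def check_rent_records(records):
    """Return a list of (row_index, issue) tuples for a batch of rent records.

    Hypothetical checks: required fields, duplicate keys, and range bounds.
    In production these would feed a monitoring/alerting system rather
    than be inspected by hand.
    """
    issues = []
    seen_ids = set()
    for i, rec in enumerate(records):
        # Completeness and uniqueness of the primary key.
        unit_id = rec.get("unit_id")
        if unit_id is None:
            issues.append((i, "missing unit_id"))
        elif unit_id in seen_ids:
            issues.append((i, "duplicate unit_id"))
        else:
            seen_ids.add(unit_id)
        # Range check: a rent must be a positive, plausible number.
        rent = rec.get("rent")
        if rent is None or not (0 < rent < 100_000):
            issues.append((i, "rent out of range"))
    return issues
```

A pipeline step would run checks like these on every batch and page an engineer (or quarantine the batch) when the issue count crosses a threshold, which is what turns manual firefighting into the "reduce manual fixes" outcome above.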

 

Key Responsibilities

  • Design and maintain ETL/ELT workflows using Python, SQL, and orchestration tools (Airflow, Prefect, or Dagster)
  • Build and optimize data lakes/warehouses (Snowflake, BigQuery, Redshift, etc.) following lakehouse and cost-efficiency principles
  • Ingest data from APIs, files, and third-party feeds using scalable, maintainable pipelines
  • Process real-time data with Kafka, Kinesis, or Pub/Sub for event-driven features
  • Build monitoring and alerting into pipelines; champion data quality and governance
  • Partner with data scientists to deliver features; collaborate with engineers to expose data via APIs
  • Coordinate with DevOps to support CI/CD and infrastructure-as-code (Terraform, Pulumi)
  • Enforce SOC 2-compliant data security and privacy standards
  • Troubleshoot issues, track performance metrics, and contribute to continuous improvement
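As a miniature illustration of the first responsibility above, an extract-transform-load workflow can be sketched with only the Python standard library. Everything here is hypothetical (the feed format, the `rents` table, the field names); a production version would run under an orchestrator such as Airflow, Prefect, or Dagster against a warehouse like Snowflake:

```python
import json
import sqlite3


def extract(raw_json):
    """Extract: parse a third-party feed payload (here, a JSON string)."""
    return json.loads(raw_json)


def transform(rows):
    """Transform: normalize types and drop rows missing required fields."""
    out = []
    for r in rows:
        if r.get("unitId") is not None and r.get("rent") is not None:
            out.append((r["unitId"], float(r["rent"])))
    return out


def load(conn, rows):
    """Load: idempotent upsert so re-running the pipeline is safe."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS rents (unit_id TEXT PRIMARY KEY, rent REAL)"
    )
    conn.executemany(
        "INSERT INTO rents VALUES (?, ?) "
        "ON CONFLICT(unit_id) DO UPDATE SET rent = excluded.rent",
        rows,
    )
    conn.commit()


def run_pipeline(conn, raw_json):
    """One end-to-end run: extract -> transform -> load."""
    load(conn, transform(extract(raw_json)))
```

The design point worth noting is the idempotent upsert in `load`: orchestrators retry failed tasks, so each step should be safe to run twice without duplicating data.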

 

What You Bring

Experience

  • 4–6 years in data engineering or backend engineering within cloud-based environments
  • Proven experience designing, building, and maintaining production-grade data pipelines that support analytics or machine learning workloads
  • Comfortable working independently and partnering cross-functionally in a fast-paced, iterative environment

Skills

  • Strong Python and advanced SQL; comfortable with Spark
  • Hands-on experience with orchestration tools (Airflow, Prefect, or Dagster)
  • Experience with dbt or version-controlled ELT frameworks
  • Depth in at least one cloud platform (AWS, GCP, or Azure); familiarity with Docker/Kubernetes
  • Understanding of data modeling, performance tuning, and cloud cost optimization
  • Excellent collaboration and communication skills

Preferred

  • Experience supporting AI/ML pipelines or MLOps tools (e.g., Feature Store, MLflow)
  • Exposure to property tech, real estate, or asset-heavy industries
  • Familiarity with Data Mesh or domain-oriented data products

 

Personal Attributes

  • Curiosity: You ask "why" and dive deep into learning new tools and systems.
  • Resilience: You build systems that stay stable, and recover fast when they don't.
  • Impact-Driven: You focus on delivering business value, not just writing code.
  • Courage: You advocate for good architecture, even when it's hard.
  • Responsibility: You own what you build, end to end.

 

How We Work at Radix

We move fast, work smart, and trust each other. We don't micromanage; we empower. Our values show up in every sprint, stand-up, and code review. You'll have the autonomy to innovate and the backing of a team that cares deeply about craft, impact, and growth.
