Data Architect

Remote (Wrocław or Poland)

Kellton

Kellton is a “Born Digital” technology consulting and IT services company that delivers innovative solutions and exceptional business value.



Who are we? 🤔
  • a global team with high standards: we maintain Silicon Valley quality and vibe, even though we've spread far beyond it
  • we've made a real difference in the world since 2008! We deliver for Zoetis, Apple, Tesla, NATO, UNICEF, and many more
  • an 85-person A-Team in Wrocław, with other offices in the US, UK, and Ireland
View our website for more details and case study examples. Because everyone is empowered at Kellton, new team members enjoy making a difference right away and progressing quickly in responsibility and ownership.
Contract type: B2B, hourly rate

We're looking for you if you have:

  • experience with Databricks and its associated tech stack: PySpark, Delta, Unity Catalog, Databricks Asset Bundles (DAB),
  • 6+ years of hands-on experience in software engineering with high proficiency in Python (experience with modern Python tooling such as Rye, Ruff, Pydantic, and Typer is highly desirable),
  • 4+ years of experience with SQL and large-scale data systems (preferably in Azure),
  • strong understanding of software development best practices, including clean code principles, automated testing, version control (GitHub), CI/CD (GitHub Actions), and building and deploying software packages,
  • a strong work ethic, a pragmatic approach, attention to detail, excellent communication skills, and the ability to work without supervision.

Your tasks:

  • championing software engineering excellence in all aspects of data engineering by applying best practices in code quality, performance, and maintainability of our data ingestion and transformation frameworks,
  • contributing individually and collaborating with engineering teams and consulting partners to build, test, debug, and ship high-impact, production-ready framework features,
  • owning the frameworks’ backlog, design decisions, and release management process, and driving collaboration across the Data Engineering community to deliver timely enhancements, bug fixes, and framework optimizations,
  • serving as the Subject Matter Expert (SME) for the entire Data Engineering framework, shaping the product roadmap and driving strategic enhancements,
  • advising other Data Engineers in the team.

💪🏼 We care about your growth…

  • you'll sense a knowledge-exchange culture from Day 1: workshops, meetups, hackathons, and coffee-machine chats; we inspire each other every day 💡
  • you’ll work in an international and multicultural environment on a daily basis 🌎

🧘🏻‍♀️ …and your well-being

  • ever wondered what a bar, an office, and a dev-cave basement would look like combined? That's our office, a.k.a. the natural habitat for devs (outranked only by your home office)
  • multiple food benefits: in-office lunches, a kitchen full of snacks and drinks, and many more 🍱
  • and of course, we provide a Multisport card
Still not sure about applying to us? Visit our website for more details and case study examples.


Category: Architecture Jobs

Tags: Azure CI/CD Consulting Databricks Engineering GitHub PySpark Python SQL Testing

Perks/benefits: Lunch / meals

Regions: Remote/Anywhere Europe
Country: Poland
