Senior Data Engineer
Latvia - Nationwide
Emergn
Emergn combines consulting, product-focused capabilities development, and software engineering services to establish long-term success.
Department: Data & Analytics
Employment Type: Full Time
Location: Latvia - Nationwide
Compensation: €4,500 - €5,000 / month
Description
We are a global digital business services organization with a mission to improve the way people and companies work. Forever. Our Consulting, Delivery and Learning teams design and deliver transformational digital products and experiences that add value to our clients’ businesses and to their customers’ lives. Every day, across the world, our teams are pioneering faster, better ways to bring our clients’ most exciting ideas to life. We are looking for a seasoned Senior Data Engineer to help us shape Emergn’s exciting future and play an important role in our growth.
We want you to:
- Design, build, and maintain scalable data pipelines and workflows using Microsoft Fabric, including Data Factory, Lakehouse, and Synapse Pipelines.
- Use Apache Spark in Microsoft Fabric Notebooks for large-scale data processing, cleansing, and transformation tasks.
- Develop efficient SQL-based solutions for data modeling, data warehousing, and analytics layers.
- Leverage Python and PySpark to automate data flows, integrate sources, and apply advanced data logic.
- Collaborate with analysts, engineers, and stakeholders to deliver clean, trustworthy datasets to reporting and ML pipelines.
- Implement and monitor ETL/ELT processes for batch and near-real-time use cases across diverse data sources.
- Optimize performance of queries and transformations in Spark and SQL environments.
- Support Lakehouse Architecture within Microsoft Fabric and ensure best practices in data lake management.
- Assist in establishing data quality, data lineage, and governance processes across the data stack.
- Act as a subject matter expert on data workflows within the Microsoft ecosystem, helping to guide best practices across teams.
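To illustrate the kind of cleansing and transformation work this role involves, here is a minimal sketch in plain Python. All field names and rules are hypothetical examples, not part of any actual Emergn pipeline; in practice this logic would typically run as PySpark transformations in a Fabric notebook.

```python
from typing import Optional

def clean_record(record: dict) -> Optional[dict]:
    """Trim string fields, normalise the (hypothetical) email column,
    and drop rows missing a primary key."""
    if not record.get("customer_id"):
        return None  # discard rows without a key
    cleaned = {k: v.strip() if isinstance(v, str) else v
               for k, v in record.items()}
    if cleaned.get("email"):
        cleaned["email"] = cleaned["email"].lower()
    return cleaned

# Example input: one valid row and one orphan row without a key.
raw = [
    {"customer_id": "C1", "email": "  Ann@Example.COM "},
    {"customer_id": "", "email": "orphan@example.com"},
]
cleaned = [r for r in (clean_record(x) for x in raw) if r is not None]
```

The same pattern scales out naturally: in PySpark the per-record function becomes column expressions applied across a distributed DataFrame.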
This job might be for you if you have:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- Hands-on experience with Microsoft Fabric components including Spark Notebooks, Data Factory, and Synapse.
- Strong understanding of Apache Spark (especially via PySpark) for distributed data processing.
- Proficiency in SQL for data manipulation and optimization.
- Solid Python skills for scripting, automation, and transformation logic.
- Experience with cloud-native data solutions—preferably on Microsoft Azure.
- Understanding of data warehouse design, dimensional modeling, and Lakehouse patterns.
- Familiarity with CI/CD and version control tools (e.g., Git, Azure DevOps).
- Comfort working in agile, iterative data development cycles.
- Excellent communication and stakeholder collaboration skills.
Nice to Have:
- Familiarity with OneLake architecture and Delta Lake implementation in Fabric.
- Knowledge of Power BI data modeling and how backend data impacts reports.
- Experience with streaming data ingestion (e.g., Azure Event Hubs, Kafka, Fabric Real-Time Analytics).
- Exposure to notebook-based development workflows in Jupyter or Databricks.
- Awareness of data privacy, security best practices, and compliance (e.g., GDPR, DLP tools).
- Previous experience with other Spark platforms like Databricks or HDInsight is a plus.
What we offer:
- Work within a dynamic international team of experts
- Excellent opportunity for personal and professional development
- Comfortable office and flexible working hours
- Ability to work with modern technologies
- A large catalogue of educational programs, with training and certification at the company's expense
- 20 working days of vacation per year
- Life and disability insurance
- Health insurance
- Ample free parking
Ready to make an impact?
Join us today!