Senior Data Engineer
Bulgaria, Hungary, Lithuania, Poland
Exadel
Advance your business through technology and pave the way to becoming a digital leader with Exadel, an enterprise software development and consulting company. We are looking for a certified Data Engineer who will turn data into information, information into insight, and insight into business decisions. This is a unique opportunity to be one of the key drivers of change in our expanding company.
Work at Exadel - Who We Are
Since 1998, Exadel has been engineering its products and custom software for clients of all sizes. Headquartered in Walnut Creek, California, Exadel has 2,000+ employees in development centers across America, Europe, and Asia. Our people drive Exadel’s success and are at the core of our values.
About Our Customer
The Customer is a prominent company in the life sciences sector that creates and markets tools, supplies, and services for managing liquids, samples, and cells in laboratories across the globe. Their diverse product lineup features pipettes and automated pipetting machines, dispensers, centrifuges, mixers, spectrometers, and equipment for DNA amplification, along with ultra-low temperature freezers, fermenters, bioreactors, CO2 incubators, shakers, and systems for cell manipulation.
Requirements
- Proven expertise in Databricks, including architecture, configuration, and optimization
- Hands-on experience with Databricks integrations
- Proficiency in data engineering tasks such as ETL/ELT pipelines, orchestration, and quality assurance
- Hands-on experience with cloud platforms, particularly Azure and GCP
- Strong understanding of Git and CI/CD pipeline creation and integration
- Familiarity with Delta Tables and utility pipelines for data optimization
- Knowledge of SDLC processes, including code reviews and version control best practices
- Ability to open a new client account as the first Engineer/Developer joining the project
- Experience working with cross-functional teams, including Data Engineers and Project Managers
- Ability to lead workshops, meetings, and pair-programming sessions
- Strong documentation and proposal development skills
- Adept at presenting technical concepts to non-technical stakeholders
Nice to Have
- Certification in Databricks, Azure, or GCP
- Experience with large-scale migration projects (e.g., moving solutions to GCP)
- Familiarity with QA frameworks for data pipelines
English level
Intermediate+
Responsibilities
- Analyze existing data sources
- Identify anomalies/trends
- Create data pipelines, visualizations, and reports
- Evaluate business needs and objectives
- Explore ways to enhance data quality and reliability
- Identify opportunities for data acquisition