Data Quality Engineer
Bnei Brak, Israel
eToro
Description
eToro is the trading and investing platform that empowers users to invest, share and learn. We were founded in 2007 with the vision of a world where everyone can trade and invest in a simple and transparent way. We have created an investment platform that is built around collaboration and investor education. On our platform, users can view other investors’ portfolios and statistics, and interact with them to exchange ideas, discuss strategies and benefit from shared knowledge. We have over 38 million registered users from 75 countries, and our platform is available in 20 languages. We are a fast-growing business with over 1,500 employees across 13 offices around the globe, strategically positioned to serve the needs of users. You can find out more about eToro here.
We are seeking a Data Quality Engineer with a strong commitment to maintaining high data quality standards and to working effectively with cross-functional teams. The ideal candidate is a proactive individual with a "can-do" attitude who is dedicated to continuous learning and excels at collaboration.
What You'll Do:
- Monitor and Maintain Data Quality: Use the Azure Data Platform, including Azure Databricks, Azure Data Factory, Azure Data Lake Storage, Azure Log Analytics, and Azure Logic Apps, to create and manage monitoring workflows, alerts, and controls that ensure high data quality.
- Automate Data Quality Processes: Leverage strong programming skills in Python and PySpark to build automated data quality workflows. Write efficient, optimized code that implements data quality checks in production and CI/CD processes (see the PySpark sketch after this list).
- Quality Assurance: Conduct rigorous testing and validation of data pipelines and processes. Design and execute test cases, identify and report defects, and ensure data quality is maintained throughout the development lifecycle.
- Collaboration and Communication: Work closely with cross-functional teams, stakeholders, and developers to address data quality issues. Communicate your findings, recommendations, and solutions clearly and effectively.
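To make the automation responsibility above concrete, a minimal automated data quality check in PySpark might look like the sketch below. The table name (bronze.trades), the columns (trade_id, trade_date), and the rules are hypothetical placeholders for illustration, not eToro's actual schema or tooling.

```python
# Minimal sketch of an automated data quality check in PySpark.
# The table, columns, and rules are hypothetical examples only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

trades = spark.table("bronze.trades")  # hypothetical source table
total = trades.count()

# Rule 1: trade_id must be non-null and unique.
null_ids = trades.filter(F.col("trade_id").isNull()).count()
non_null = trades.filter(F.col("trade_id").isNotNull())
dup_ids = non_null.count() - non_null.select("trade_id").distinct().count()

# Rule 2: no trades dated in the future.
future_rows = trades.filter(F.col("trade_date") > F.current_date()).count()

failures = {
    "null_trade_id": null_ids,
    "duplicate_trade_id": dup_ids,
    "future_trade_date": future_rows,
}
failed = {rule: n for rule, n in failures.items() if n > 0}

# Failing loudly lets the surrounding pipeline or CI/CD step block bad data
# before it reaches downstream consumers.
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
print(f"All checks passed on {total} rows.")
```

In a CI/CD setting, the same kind of script can run against a staging table so that a failed check fails the build rather than the production pipeline.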
A Data Quality Engineer plays a critical role in ensuring the accuracy, reliability, and integrity of an organization’s data. This position involves designing, implementing, and maintaining data quality processes and frameworks to monitor, control, and enhance data quality across various data platforms and pipelines. The Data Quality Engineer collaborates closely with data engineers, data scientists, analysts, and other stakeholders to identify and resolve data quality issues, ensuring that high-quality data is available for decision-making and analytical purposes.
Requirements
What We Are Looking For:
- At least 3 years of experience in Python automation development.
- Advanced skills in SQL for querying, validating, and assessing data quality, including the ability to write and understand complex SQL queries (see the example query after this list).
- Proficiency in working with Azure data services.
- Experience in designing and executing test cases and ensuring data quality throughout the development lifecycle.
- Willingness to stay updated with the latest technologies, tools, and best practices in data quality management.
- Ability to adapt to new technologies and continuously improve data quality processes.
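As an illustration of the kind of SQL-based validation referred to above, the sketch below profiles a table with a single query, run through PySpark so it fits the same automated workflow. The silver.users table and its columns (user_id, country, signup_ts) are hypothetical placeholders, not an actual eToro dataset.

```python
# Minimal sketch of a SQL-based data quality assessment run via PySpark.
# The table and columns are hypothetical examples only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dq-sql-check").getOrCreate()

profile = spark.sql("""
    SELECT
        COUNT(*)                                         AS total_rows,
        COUNT(*) - COUNT(user_id)                        AS null_user_ids,
        COUNT(user_id) - COUNT(DISTINCT user_id)         AS duplicate_user_ids,
        SUM(CASE WHEN country IS NULL THEN 1 ELSE 0 END) AS missing_country,
        SUM(CASE WHEN signup_ts > current_timestamp()
                 THEN 1 ELSE 0 END)                      AS future_signups
    FROM silver.users
""").first()

# Any non-zero counter other than total_rows is treated as a failed check.
issues = {k: v for k, v in profile.asDict().items()
          if k != "total_rows" and v and v > 0}
if issues:
    raise ValueError(f"SQL data quality checks failed: {issues}")
```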
Join our team and help us ensure the highest standards of data quality across our processes.
Perks/benefits: Career development