Data Scientist

Poland

GroupM

GroupM is the world’s leading media investment company. We make advertising work better for people.


Title: Data Scientist

WHO WE ARE

Choreograph is WPP’s global data products and technology company. We’re on a mission to transform marketing by building the fastest, most connected data platform that bridges marketing strategy to scaled activation.

We work with agencies and clients to transform the value of data by bringing together technology, data, and analytics capabilities.  We deliver this through the Open Media Studio, an AI-enabled media and data platform for the next era of advertising.

We’re endlessly curious. Our team of thinkers, builders, creators, and problem solvers is over 1,000 strong, across 50+ markets around the world.

 

WHO ARE WE LOOKING FOR?

We are seeking a highly motivated and experienced Data Scientist to join our global team of data professionals. In this role, you will play a key part in building and maintaining our Centralized Data Hub, focusing on data integration, data modeling, and extending that data into downstream products. You will apply your expertise across a range of methodologies and applications to solve data challenges in the media landscape, driving business value by developing impactful data solutions in collaboration with Product, Engineering, and fellow data scientists.

Your responsibilities will include data exploration, data integration, data modeling, and contributing to the development of data pipelines. You will engage with stakeholders to understand their data requirements, provide regular updates on project progress, and translate technical concepts into understandable information for both technical and non-technical audiences.

You will be part of a vibrant group, collaborating with specialist and business teams, applying your technical knowledge, growing your interpersonal skills, and helping develop and improve Choreograph's data infrastructure and product offerings. You will also act as a buddy for new team members.

WHAT WILL YOU DO?

  • Data Integration: Design, develop, and maintain data pipelines to ingest, transform, and load data from various sources into the Centralized Data Hub.
  • Data Modeling: Develop and implement data models that support efficient data access and analysis for various use cases.
  • Data Hub Maintenance: Contribute to the ongoing maintenance, optimization, and scalability of the Centralized Data Hub.
  • Data Quality: Implement data quality checks and monitoring to ensure the accuracy, completeness, and consistency of data within the Data Hub. Proactively investigate bias and other data quality issues without needing specific instructions.
  • Data Exploration & Analysis: Explore and analyze data to identify patterns, trends, and insights that can be used to improve data quality and inform business decisions.
  • Data Product Support & Integration: Collaborate with data engineers, product teams, and stakeholders to integrate data from the Centralized Data Hub into data products, ensuring data quality and alignment with specific business needs and requirements.
  • Data Storytelling: Gather, analyze, and visualize data to create compelling and informative narratives for stakeholders, using storytelling techniques to effectively communicate insights and recommendations.
  • Communication & Presentation: Communicate and present findings, results, and outcomes to key stakeholders from Business, Product, and Delivery teams. Interpret complex messages, outputs, and algorithms and distill them into actionable steps. Present project and team work internally, both within the team and across teams.
  • Project Management: Take ownership of your tasks on projects and complete them with limited guidance, working with others to stay on schedule.
  • Stakeholder Management: Understand the key stakeholders of the team and its projects, and the role each plays. Provide regular, detailed technical updates on projects. Keep your own and the team's project plans up to date, and escalate risks.
  • Continuous Learning: Stay up to date with the latest advancements, trends, and research in the field of data science, data engineering, and related technologies.
  • Data Governance: Adhere to data governance and privacy policies to ensure compliance with relevant regulations and industry best practices.
  • Team Culture & Standards: Understand, embrace, and champion our team's established ways of working, contributing to a positive and productive environment. Present your own work internally to the team and department, and speak confidently about your team and the DS department.

 

WHAT WILL YOU NEED?

  • Bachelor's or Master's degree in Computer Science, Mathematics, Statistics, Physics, Economics, or a related field.
  • 3+ years of experience analyzing large volumes of data and solving analytical problems using quantitative approaches.
  • 3+ years of programming proficiency in Python and SQL, with confident command of the relevant languages and paradigms. The ability to develop efficient, scalable code for data processing, analysis, and visualization is crucial for success in this role.
  • A good understanding and working knowledge of the main data science concepts, techniques, and algorithms.
  • Proficiency with the full range of tools used by the team (Docker, Git, notebooks, etc.).
  • Experience with data integration tools and techniques (e.g., ETL processes, data pipelines).
  • Experience with data modeling concepts and techniques (e.g., relational modeling, dimensional modeling).
  • Experience with cloud-based data platforms (e.g., GCP, AWS, Azure).
  • Working experience with GCP, PySpark, Vertex AI, or equivalent (preferred).
  • Familiarity with data warehousing concepts and technologies (e.g., Snowflake, BigQuery).
  • Knowledge of data quality principles and practices.
  • Experience with stakeholder engagement and collaboration.
  • Ability to work with key stakeholders and subject matter experts, and to carry out any research or investigation needed to deliver the work.
  • Excellent analytical skills and strong statistical knowledge, as well as interest in applying them to complex problems.
  • Excellent communication skills, with the ability to explain complex concepts within and beyond the team and to interact on good terms with managers, colleagues, and communities.
  • Able to see the "big picture": understands company objectives and is clear on the value the team brings to the company and the individual value you contribute to the team's goals. Understands the DS department and the surrounding department structures, as well as their roles on projects, and contributes to successful cross-team relationships.
  • A passion for working with data and building data-driven solutions.
  • A growth mindset, both for yourself and the team. We value honesty, helpfulness, and humility.
  • Ability to communicate complex quantitative analysis in a clear, precise, and actionable manner.

If you are ready to be at the forefront of the AdTech industry, shaping its future, and driving success for both Choreograph and our clients, we encourage you to apply and join our team.

Choreograph is the beating heart of data inside WPP’s media investment group, GroupM, the world’s leading media investment company responsible for more than $60 billion in annual media investment. Discover more about Choreograph at www.choreograph.com.

GroupM and all its affiliates embrace and celebrate diversity, inclusivity, and equal opportunity. We are committed to building a team that represents a variety of backgrounds, perspectives, and skills. We are a worldwide media agency network that represents global clients. The more inclusive we are, the greater work we can create together.

 
