Lead Data Engineer (Remote)
Remote, United Kingdom
Circana
Circana's business tools provide in-depth consumer behavior data, industry trends, and expert analysis of market research to drive business growth.
At Circana, we are fueled by our passion for continuous learning and growth, we seek and share feedback freely, and we celebrate victories both big and small in an environment that is flexible and accommodating to our work and personal lives. We have a global commitment to diversity, equity, and inclusion as we believe in the undeniable strength that diversity brings to our business, employees, clients, and communities. With us, you can always bring your full self to work. Join our inclusive, committed team to be a challenger, own outcomes, and stay curious together. Circana is proud to be Certified™ by Great Place To Work®. This prestigious award is based entirely on what current employees say about their experience working at Circana.
Learn more at www.circana.com.
What will you be doing?
We are seeking a skilled and motivated Lead Data Engineer to join a growing global team based in the UK. In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a desire to make a significant impact, we encourage you to apply!
Job Responsibilities
Data Engineering & Data Pipeline Development
- Design, develop, and optimize scalable data workflows using Python, PySpark, and Airflow
- Implement real-time and batch data processing using Spark
- Enforce best practices for data quality, governance, and security throughout the data lifecycle
- Ensure data availability, reliability and performance through monitoring and automation.
Cloud Data Engineering
- Manage cloud infrastructure and cost optimization for data processing workloads
- Implement CI/CD pipelines for data workflows to ensure smooth and reliable deployments.
Big Data & Analytics
- Build and optimize large-scale data processing pipelines using Apache Spark and PySpark
- Implement data partitioning, caching, and performance tuning for Spark-based workloads.
- Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning initiatives.
Workflow Orchestration (Airflow)
- Design and maintain DAGs (Directed Acyclic Graphs) in Airflow to automate complex data workflows
- Monitor, troubleshoot, and optimize job execution and dependencies
Team Leadership & Collaboration
- Lead a team of data engineers, providing technical guidance and mentorship
- Foster a collaborative environment and promote best practices for coding standards, version control, and documentation.
Requirements
- This is a client-facing role, so strong communication and collaboration skills are vital
- Experience in data engineering with expertise in Azure, PySpark, Spark, and Airflow.
- Strong programming skills in Python and SQL, with the ability to write efficient and maintainable code
- Deep understanding of Spark internals (RDDs, DataFrames, DAG execution, partitioning, etc.)
- Experience with Airflow DAGs, scheduling, and dependency management
- Knowledge of Git, Docker, Kubernetes, and Terraform, and the ability to apply DevOps best practices to CI/CD workflows
- Excellent problem-solving skills and ability to optimize large-scale data processing.
- Experience in leading teams and working in Agile/Scrum environments
- A proven track record of working effectively in global remote teams
Desirable:
- Experience with data modelling and data warehousing concepts
- Familiarity with data visualization tools and techniques
- Knowledge of machine learning algorithms and frameworks
Circana Behaviours
As well as the technical skills, experience and attributes that are required for the role, our shared behaviours sit at the core of our organization. Therefore, we always look for people who can continuously champion these behaviours throughout the business within their day-to-day role:
- Stay Curious: Being hungry to learn and grow, always asking the big questions.
- Seek Clarity: Embracing complexity to create clarity and inspire action.
- Own the Outcome: Being accountable for decisions and taking ownership of our choices.
- Centre on the Client: Relentlessly adding value for our customers.
- Be a Challenger: Never complacent, always striving for continuous improvement.
- Champion Inclusivity: Fostering trust in relationships by engaging with empathy, respect, and integrity.
- Commit to Each Other: Contributing to making Circana a great place to work for everyone.
Location
This position can be located in the following area(s): Remote or Bracknell, UK
#LI-KM1
Perks/benefits: Career development, flexible hours