Data Engineer

Cambridge, England, United Kingdom

Axiom Software Solutions Limited

Axiom Software Solutions is a well-known software consulting company, offering business intelligence analysis and DevOps expertise for developers. Trust us for all your software needs.



Position: Data Engineer

Location: Cambridge / Luton, UK (Hybrid, 2-3 days onsite per week)

Duration: Long Term B2B Contract

Job Description:

The ideal candidate has a minimum of 5 years of experience, with strong hands-on experience in Snowflake, DBT, Python, and AWS, delivering ETL/ELT pipelines from a variety of sources.
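For illustration, a minimal sketch of the kind of Python-driven load step this stack implies, using the snowflake-connector-python package; the connection parameters, stage, and table names below are hypothetical placeholders, not details from this role.

```python
# Minimal sketch of a Python load step into Snowflake, assuming the
# snowflake-connector-python package. All names (account, user, MY_WH,
# MY_DB, RAW, ORDERS) are hypothetical placeholders.
import snowflake.connector

def load_orders(csv_stage_path: str) -> None:
    # In practice, credentials would come from a secrets manager or
    # environment variables, never from literals in code.
    conn = snowflake.connector.connect(
        account="my_account",   # hypothetical account identifier
        user="etl_user",        # hypothetical service user
        password="***",
        warehouse="MY_WH",
        database="MY_DB",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # COPY INTO pulls staged files into a raw table; the stage path
        # and target table are placeholders.
        cur.execute(
            f"COPY INTO ORDERS FROM {csv_stage_path} "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
        conn.commit()
    finally:
        conn.close()
```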

• Proficiency in Snowflake data warehouse architecture; design, build, and optimize ETL/ELT pipelines using DBT (Data Build Tool) and Snowflake.

• Experience with DBT (Data Build Tool) for data transformation and modelling; implement data transformation workflows using DBT (Core or Cloud).

• Strong Python programming skills for automation and data processing; leverage Python to write automation scripts and optimize data processing tasks.

• Proficiency in SQL performance tuning and query optimization techniques in Snowflake.

• Troubleshoot and optimize DBT models and Snowflake performance.

• Knowledge of CI/CD and version control (Git) tools; experience with orchestration tools such as Airflow (see the DAG sketch after this list).

• Strong analytical and problem-solving skills, with the ability to work independently in an agile development environment.

• Ensure data quality, reliability, and consistency across different environments.

• Collaborate with other data engineers, data analysts, and business stakeholders to understand data needs and translate them into engineering solutions.

• Certification in AWS, Snowflake, or DBT is a plus.
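As referenced in the orchestration bullet above, here is a hedged sketch of how a nightly dbt run might be scheduled with Airflow. It assumes Airflow 2.4+ with dbt Core installed on the worker; the dag_id, schedule, and project path are hypothetical, not details from this posting.

```python
# Minimal sketch of an Airflow DAG orchestrating a nightly dbt build,
# assuming Airflow 2.4+ and dbt Core at a hypothetical project path.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_nightly_build",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",            # run daily at 02:00
    catchup=False,
) as dag:
    # Build the models, then run the schema tests.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/my_project && dbt run",  # hypothetical path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/my_project && dbt test",
    )

    dbt_run >> dbt_test
```

Splitting dbt run and dbt test into separate tasks keeps model build failures distinguishable from test failures in the Airflow UI.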


Category: Engineering Jobs

Tags: Agile Airflow Architecture AWS CI/CD Data quality Data warehouse dbt ELT Engineering ETL Git Pipelines Python Snowflake SQL

Region: Europe
Country: United Kingdom
