DBT Data Engineer

Pune, Maharashtra, India

Codvo.ai

Codvo is an AI, Cloud & UX Development Company, helping enterprises and start-ups build remote tech teams that create high-performing software across disruptive industries.

About the Role:

As a Senior Data Engineer, you will play a crucial role in designing and building data infrastructure. You will be part of the data engineering team, responsible for ensuring data quality, accessibility, and reliability.

Responsibilities:

Data Pipeline Development:
  • Design, develop, and maintain robust and scalable data pipelines using tools and technologies such as Airflow, Kafka, and Spark.
  • Implement and manage ETL/ELT processes to efficiently load and transform data from various sources into our data warehouse.

Data Modeling and Transformation:
  • Develop and maintain complex data models using dbt, ensuring data accuracy, consistency, and adherence to business requirements.
  • Write efficient and reusable SQL code within dbt for data transformation, validation, and documentation.
  • Implement data governance and data quality practices within dbt models (an illustrative dbt sketch follows the qualifications below).

Data Warehousing:
  • Work with cloud-based data warehousing solutions.
  • Optimize data warehouse performance and scalability.

Technical Leadership:
  • Evaluate and recommend new technologies and tools to improve our data infrastructure.
  • Lead the design and implementation of complex data engineering projects.
  • Stay up to date with the latest trends and advancements in data engineering.

Qualifications:

Experience:
  • 5+ years of experience as a Data Engineer or in a similar role.
  • 2+ years of hands-on experience with dbt, including building, testing, and deploying dbt models.

Technical Skills:
  • Strong proficiency in SQL.
  • Expertise in data modeling, data warehousing, and ETL/ELT processes.
  • Experience with cloud-based data platforms (e.g., AWS, GCP, Azure).
  • Experience with data warehousing solutions (e.g., Snowflake, BigQuery, Redshift).
  • Experience with workflow management tools (e.g., Airflow, Dagster).
  • Familiarity with version control systems (e.g., Git).
  • Experience with containerization (e.g., Docker, Kubernetes) is a plus.
  • Experience with programming languages like Python is a plus.
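
For context on the dbt work described above, the sketch below shows the kind of model this role involves: a staging model that reads from a declared source, applies light cleanup, and materializes as a view. It is a minimal illustration only; the source, table, and column names (raw, orders, stg_orders, amount_usd, etc.) are hypothetical assumptions, not part of any actual Codvo project.

```sql
-- models/staging/stg_orders.sql
-- Minimal illustrative dbt model (all names hypothetical):
-- reads from a declared source, standardizes a few columns,
-- and materializes as a view.
{{ config(materialized='view') }}

select
    order_id,
    customer_id,
    cast(order_date as date) as order_date,
    lower(status)            as order_status,
    amount_usd
from {{ source('raw', 'orders') }}
where order_id is not null
```

The data quality practices mentioned above can be illustrated with a dbt "singular" data test: dbt runs the query and treats any returned rows as failures. Again, the model and column names are assumptions for the sketch.

```sql
-- tests/assert_order_amount_positive.sql
-- Hypothetical dbt singular test: fails if any order has a non-positive amount.
select *
from {{ ref('stg_orders') }}
where amount_usd <= 0
```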

Apply now
