Senior Data Engineer

Plymouth, MA, United States


We are seeking an experienced Senior Data Engineer to join our team and help build and maintain our enterprise data lakehouse. The ideal candidate will have expertise in Azure Data Factory (ADF), Apache Airflow, dbt, and implementing medallion-style data architectures within modern data warehouse platforms such as BigQuery, Snowflake, and Redshift.

Responsibilities:
  • Design, implement, and maintain scalable data pipelines using ADF and dbt
  • Develop and optimize ELT processes within a medallion architecture (Bronze, Silver, Gold, and Semantic layers)
  • Collaborate with data governance teams, analysts, and other stakeholders to understand data requirements and deliver high-quality datasets
  • Implement data quality checks and monitoring throughout the data lifecycle
  • Optimize query performance and data models for efficient analytics
  • Contribute to data governance and documentation efforts
  • Design and implement analytical models to enhance data insights and automation
  • Integrate analytical models into existing data pipelines and workflows
  • Stay updated with the latest AI and machine learning technologies and best practices
Requirements:
  • Bachelor's degree in Computer Science, Engineering, or related field
  • 5+ years of experience as a Data Engineer
  • Strong proficiency in SQL and Python
  • Hands-on experience with Azure Data Factory, AWS Glue, or Apache Airflow for workflow orchestration
  • Expertise in using dbt for data transformation and modeling
  • Experience implementing medallion architecture or similar multi-layer data architectures
  • Familiarity with cloud data platforms (e.g., BigQuery, Snowflake, or Redshift)
  • Knowledge of data warehousing concepts and dimensional modeling
  • Experience with developing ML and statistical models
  • Strong problem-solving skills and attention to detail
  • Excellent communication skills and ability to work in a collaborative environment
Preferred Qualifications:
  • Experience with Delta Lake or similar data lakehouse technologies
  • Familiarity with data science and machine learning concepts
  • Knowledge of data governance and compliance requirements
  • Experience with CI/CD practices for data pipelines
  • Experience dealing with data at financial institutions/banks
  • Experience with FIS IBS core banking system
  • Familiarity with Kafka, Kinesis, or a similar data streaming service
  • Familiarity with microservices-based and event-driven architectures
  • Experience with efficient code development and debugging using GenAI tools such as GitHub Copilot
  • Understanding of MLOps practices and tools
