Senior Data Engineer
Remote
GBH
We transform businesses, measuring our achievements by their impact. For 15 years we have used tech to help ambitious companies get further, faster. This is a remote position.
At GBH, we don’t just do tech: we live it, breathe it, and build it with purpose. We’re the dreamers, the builders, the strategists who turn ideas into digital experiences that actually matter. Whether it’s crafting seamless mobile and web apps, unlocking insights through big data, or rethinking tech strategies, we do it all with impact in mind and belonging at heart.
We’re Geared for Impact. Built for Belonging. And always ready for what’s next.
Description
GBH is seeking an experienced Senior Data Engineer to design, develop, and maintain data pipelines and infrastructure. The ideal candidate will work with modern data engineering tools and practices, including Apache Airflow, Elasticsearch, PostgreSQL, and Oracle, ensuring high data integrity, availability, and performance.
This role involves processing and transforming data from multiple sources into structured, scalable pipelines for downstream analytics and reporting tools such as Apache Superset. The Senior Data Engineer will be a key contributor to the design and implementation of robust ETL/ELT processes and will work closely with software developers, analysts, and DevOps engineers in an agile development environment.
You will be responsible for:
- Work with talented and creative professionals.
- Work on a variety of projects and develop your career within one or more industries.
- Collaborate closely with a multidisciplinary team.
- Design, build, and maintain robust Airflow DAGs for data extraction, transformation, and loading (ETL/ELT) from multiple sources including Oracle, PostgreSQL, file systems, and SharePoint (see the illustrative sketch after this list).
- Transform raw data into clean, deduplicated, and well-structured formats suitable for Elasticsearch and Superset dashboards.
- Develop and optimize Elasticsearch indexes, ensuring performance and data accuracy for analytical workloads.
- Collaborate with software engineers and analysts to translate business needs into data engineering solutions.
- Contribute to the CI/CD pipelines and automation for data integration and deployment using GitLab.
- Monitor, troubleshoot, and optimize data workflows and pipelines in production environments.
- Write and maintain technical documentation and support training initiatives.
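For illustration only (not a description of GBH's actual pipelines), the sketch below shows the general shape of the Airflow work described above: a minimal DAG with hypothetical task names, a stubbed extract step standing in for an Oracle/PostgreSQL query, an in-memory deduplication step, and a stubbed load step standing in for Elasticsearch indexing. The DAG id, task names, and sample data are all invented for the example.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Hypothetical extract step: a real DAG would query Oracle or PostgreSQL
    # through an Airflow connection/hook; here we return stub rows.
    return [{"id": 1, "status": "shipped"}, {"id": 1, "status": "shipped"}]


def transform_orders(**context):
    # Deduplicate and reshape rows pulled from the extract task via XCom.
    rows = context["ti"].xcom_pull(task_ids="extract_orders")
    seen, clean = set(), []
    for row in rows:
        if row["id"] not in seen:
            seen.add(row["id"])
            clean.append(row)
    return clean


def load_orders(**context):
    # Stubbed load step: a real pipeline would bulk-index into Elasticsearch.
    rows = context["ti"].xcom_pull(task_ids="transform_orders")
    print(f"Would index {len(rows)} documents")


with DAG(
    dag_id="orders_etl",            # hypothetical name, for the example only
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform_orders", python_callable=transform_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)

    extract >> transform >> load
```

In practice the extract and load steps would use database hooks and Elasticsearch bulk indexing rather than stubs, but the extract → transform → load task dependency pattern is the same.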
Requirements
- Bachelor’s degree in Computer Science, Data Engineering, or related technical field.
- 5 years of experience in data engineering, working on large-scale ETL/ELT systems.
- 3 years of professional experience with Apache Airflow, building and maintaining DAGs.
- 3 years of experience with Elasticsearch, including index design and optimization.
- Strong experience with Python for data pipeline development and scripting.
- Experience with PostgreSQL and Oracle databases, including SQL optimization and data modeling.
- Experience working in UNIX/Linux environments.
- Solid understanding of CI/CD pipelines using GitLab.
- Familiarity with DevOps practices, containerization (e.g. Docker), and optionally Kubernetes.
- Experience working in Agile software development teams.
- Proficiency with Jira for project and issue tracking.
- Excellent communication skills in English (verbal and written).
Meet the Team
On this team you’ll work with a group of talented professionals such as Hector Aristy.
Benefits
Why Join GBH?
- Our Culture: A friendly, fast-paced, and inclusive environment. We rely on an open and empathetic culture that constantly promotes the growth of our team.
- Learning & Development: We work to give you the best possible foundation to accelerate your career.
- Benefits & Rewards: We strive to offer competitive, unbiased, and fair rewards for all our people. We empower you to manage your own time and promote flexible working opportunities, along with family-friendly policies.
What Happens Once You Apply?
Once you have completed your application, we will review your profile and you will hear from us within 5 to 10 days.
Equal Opportunity: The selection process for this position ensures compliance with the principle of non-discrimination by sex, origin (including racial or ethnic), age, marital status, disability, religion or belief, political opinion, sexual orientation, union affiliation, social status, or language.
Know anyone perfect for this role? Refer a friend here.