Data Engineer - Consultant (Remote)
Sacramento, CA
Contract Senior-level / Expert USD 120K - 170K
Releady
Releady is a diverse, women-led talent solutions firm specializing in tech, engineering, data, digital, marketing, and creative, with decades of experience in consulting, staffing, and hyper-scaling.

This Senior Data Engineer position is with a healthcare insurance industry client, where you'll join their Data Engineering development team. You'll be responsible for the design, development, testing, and deployment of enterprise data solutions using both on-premises and cloud technologies. Reporting to the client's Manager of Data Engineering Development, you'll build and maintain data pipelines and warehousing solutions using their modern cloud-based tech stack centered on Snowflake, dbt Cloud, and Azure.
- Duration: 6+ months contract
- Location: Remote, but must reside in California, Arizona, Washington, Oregon, or Nevada (California preferred). Working hours will be PST.
- Rate: $60/hr - $85/hr DOE
***Must be able to work in the United States without sponsorship***
RESPONSIBILITIES
- Design, develop, and implement data integration pipelines and data warehouse solutions using Snowflake, dbt Cloud, and Azure technologies (ADLS, Synapse)
- Build and optimize production-ready data workflows, ensuring high performance, reliability, and scalability
- Create and maintain data pipelines following Data Vault architectural principles for warehouse modeling
- Work with Collibra for data governance, quality assurance, and metadata management
- Leverage Refuel.ai for data mastering and Striim for data validation processes
- Assist in troubleshooting and resolving data pipeline issues by analyzing end-to-end workflows
- Collaborate with client stakeholders to translate requirements into effective data solutions
- Support data visualization and reporting needs through Tableau
- Implement CI/CD practices using Git repositories and modern DevOps tools
- Participate in an Agile/DevSecOps pod model alongside solution architects, data modelers, analysts, and business partners
REQUIREMENTS
- Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience)
- 5+ years of experience in data engineering or related roles
- Strong proficiency in SQL and database technologies including Snowflake, Oracle, and SQL Server
- Hands-on experience with dbt Cloud for data transformation and pipeline development
- Demonstrated experience with Azure cloud technologies, particularly ADLS and Synapse
- Knowledge of Data Vault modeling principles and implementation techniques
- Experience with data governance and data quality tools, particularly Collibra
- Familiarity with data visualization platforms, especially Tableau
- Understanding of version control systems (Git, Bitbucket) and CI/CD practices
- Experience with scheduling systems like Tidal or Control-M
- Working knowledge of Agile methodologies and DevOps principles applied to data pipelines
PREFERRED SKILLS
- Experience with data observability platforms and data quality monitoring
- Knowledge of Python, R, KNIME, or Alteryx for data science applications
- Experience with Refuel.ai and Striim technologies
- Background in data migration from traditional databases (Oracle, SQL Server) to cloud platforms
Tags: Agile Azure Bitbucket CI/CD Computer Science Data governance Data pipelines Data quality Data visualization Data warehouse dbt DevOps Engineering Git KNIME Oracle Pipelines Python R Snowflake SQL Tableau Testing
Perks/benefits: Insurance