Snowflake Data Engineer

Los Angeles


Open to local candidates only; Green Card holders or U.S. citizens (GC/USC)

As a Snowflake Data Engineer / Expert, you will manage and optimize our Snowflake data platform. You will design, develop, and maintain data pipelines, ensure data integrity, and work closely with Salesforce integration teams and business analysts. This role requires hands-on Snowflake experience, ETL/ELT pipeline development skills, and strong collaboration with cross-functional teams.

Key Responsibilities:

  • Design, develop, and maintain scalable Snowflake data solutions.
  • Build, optimize, and manage ETL/ELT data pipelines.
  • Integrate Snowflake with Salesforce and other CRM platforms.
  • Collaborate with business and technical teams to gather requirements and deliver data-driven solutions.
  • Implement data quality checks, monitoring, and error-handling mechanisms.
  • Perform advanced data analysis and deliver insights to business stakeholders.
  • Ensure compliance with data security, privacy, and governance policies.
  • Optimize Snowflake performance, including query optimization and storage usage.
  • Automate processes and workflows to improve efficiency.

Required Qualifications:

  • 8+ years of hands-on experience with the Snowflake Data Platform and 10+ years of overall IT experience.
  • Snowflake certifications (SnowPro Core Certification, SnowPro Advanced preferred).
  • Proficient in SQL, SnowSQL, and Snowflake-specific features (Streams, Tasks, Clustering, etc.).
  • Experience in integrating Snowflake with Salesforce (via Salesforce Connect, APIs, or third-party ETL tools).
  • Solid experience with ETL/ELT tools (e.g., Talend, Fivetran, Informatica, dbt, Matillion).
  • Strong understanding of data modeling, warehousing concepts, and best practices.
  • Proficiency in data analysis, writing complex queries, and delivering actionable insights.
  • Experience with cloud platforms (AWS, Azure, or GCP) and their services related to Snowflake.
  • Familiarity with Python, Java, or other scripting languages is a plus.
  • Knowledge of CI/CD processes and tools related to data pipelines.
  • Strong communication and collaboration skills.

Preferred Qualifications:

  • Experience with Salesforce Data Architecture and APIs.
  • Familiarity with Salesforce objects and schema for seamless integration.
  • Experience in data visualization tools (Tableau, Power BI, CRM Analytics, etc.).
  • Exposure to Big Data technologies and related frameworks.

Hybrid - Downtown Los Angeles (3 days per week in office)




Category: Engineering Jobs


Region: North America
Country: United States
