Data Analytics Engineer - GTM

Palo Alto, CA


Obsidian Security was founded in 2017 to solve the unaddressed blind spot of SaaS Security. SaaS applications provide the tools employees need to succeed and hold the business's most critical information. If those tools become unavailable or that data is jeopardized, there is a detrimental impact on the organization.

Obsidian proudly offers the industry's most comprehensive and powerful SaaS defense solution. We are committed to solving the challenge of SaaS Security for our customers as efficiently and effectively as possible.

We’re a passionate team optimizing for impact by solving some of the biggest challenges in cybersecurity today. We listen closely to our customers, iterate quickly, and (over) deliver to delight them. Working at Obsidian means contributing to an industry-leading cybersecurity product in an environment where customer satisfaction, privacy, and data ethics are paramount.

We are seeking a talented and driven Data Analytics Engineer to join our team and play a pivotal role in building the foundation for data and analytics to support our business objectives. This position involves designing, developing, and optimizing data pipelines and analytics platforms using cutting-edge tools such as Databricks, Snowflake, Sigma Computing, and other technologies. The ideal candidate will have a strong technical background, a passion for data, and a collaborative mindset to partner with business teams and enable data-driven decision-making. This is a full-time, on-site/hybrid position in our Palo Alto office.

The Data Analytics Engineer will be responsible for the scripts and processes required to extract, transform, clean, and move data and metadata so they can be loaded into a data warehouse, data mart, or operational data store. This role analyzes what the company wants to accomplish with its data and designs the best possible ETL processes around those goals.

Key Responsibilities

  • Design, implement, and maintain robust, scalable, and efficient data pipelines to process and store large volumes of data from multiple sources
  • Develop and run tests to validate all data flows, and prepare ETL processes that incorporate business requirements into design specifications
  • Define and capture metadata and rules associated with ETL processes
  • Leverage tools like Databricks, Snowflake, and Sigma Computing to develop, maintain, and optimize the data architecture and analytics environment
  • Work closely with business stakeholders to understand data needs, translate requirements into technical solutions, and enable self-service analytics capabilities
  • Proactively communicate and collaborate with external and internal customers to understand processes, analyze information needs and define functional requirements
  • Implement best practices for data quality, security, and governance, ensuring compliance with organizational and regulatory standards
  • Continuously monitor and enhance the performance of data pipelines, ensuring timely delivery of high-quality data
  • Review and edit requirements, specifications, business processes, and recommendations related to proposed solutions
  • Provide application analysis and data modeling design to collect data for a centralized data warehouse
  • Standardize data collection by developing methods for database design and validation reports
  • Lead and perform complex analysis in an evolving data environment
  • Extract and analyze data, patterns, and related trends as needed, and synthesize them into information the organization can readily consume
  • Provide data extraction, reports, dashboards, analysis, and consultation services across the organization
  • Manage multiple initiatives simultaneously while meeting project deliverable deadlines
  • Work independently with users to define concepts

Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • 3+ years of experience in data engineering or a related role.
  • Proficiency with Databricks, Snowflake, and Sigma Computing, with the ability to integrate and optimize their functionalities.
  • Strong programming skills in Python, Scala, SAS, or SQL.
  • Experience designing and implementing ETL/ELT pipelines for large-scale data systems.
  • Knowledge of cloud platforms such as AWS, Azure, or GCP.
  • Experience building multi-dimensional data models to serve as a foundation for future analyses.
  • Experience connecting data sets to data visualization tools and creating reports.
  • Strong project management and organizational skills, with experience working on complex initiatives alongside cross-functional teams in a dynamic environment.
  • Strong communication (verbal and written) and interpersonal skills, with the ability to translate complex analyses into actionable business insights.

Preferred Skills

  • Familiarity with data visualization tools and reporting frameworks.
  • Experience with orchestration tools like Airflow or similar.
  • Understanding of CI/CD processes and version control (e.g., Git).
  • Knowledge of data governance frameworks and principles.
  • Ability to adapt and thrive in a fast-paced, dynamic environment.


Pay Transparency

Please note that the base pay range is a guideline. For candidates who receive an offer, base pay will vary based on factors such as work location and the knowledge, skills, and experience of the candidate. In addition to a competitive base salary, this position is eligible for equity awards and may be eligible for incentive compensation based on factors such as experience, skills, and location.

At Obsidian, we are proud to be an equal-opportunity employer. We value diversity and hire for talent, passion, and compassion. In compliance with federal law, all persons hired will be required to submit satisfactory proof of identity and legal authorization. If you have a need that requires accommodation, please contact accommodations@obsidiansecurity.com.

Information collected and processed as part of any job applications you choose to submit is subject to Obsidian’s Applicant Privacy Policy.

Base Salary Range: $113,000–$158,000 USD

Employee Benefits:

Our competitive benefits packages are designed to support our employees' well-being, both at work and at home.

  • Competitive compensation with equity and 401k
  • Comprehensive healthcare with dental and vision coverage
  • Flexible paid time off and paid holiday time off 
  • 12 weeks of new parent or family leave
  • Personal and professional development resources