Analytics Engineer Internship

Salt Lake City, Utah


Lucid Software is the leader in visual collaboration, helping teams see and build the future from idea to reality. Our products, business, and workplace culture have received numerous awards, such as being named to the Forbes Cloud 100 and a Fortune Best Workplace in Technology. Lucid is a hybrid workplace, allowing employees to work remotely, from one of our offices, or a combination of the two depending on the needs of the role and team. At Lucid, we hold true to our core values of teamwork over ego, innovation in everything we do, individual empowerment, initiative, and ownership, and passion and excellence in every area. We value diversity and are dedicated to creating an environment that is respectful and inclusive for everyone.

Here at Lucid, data is key to making decisions that improve the product for our users, fuel the growth of the business, and allow the company to operate efficiently. As an Analytics Engineer Intern, you will help design, build, and maintain our analytics data warehouse, which is used across the company to quickly and accurately answer important questions that drive impact. This includes cleaning, testing, documenting, and modeling data, as well as maintaining the systems that orchestrate our production data pipelines. You will get to work with data from a variety of sources, including clickstream data, CRM systems, marketing platforms, subscription and payment data, and support tickets. Our data stack consists of Stitch, Fivetran, Airflow, Snowflake, Databricks, dbt, and Hightouch. In this position, you will focus on SQL-based data transformation in Snowflake and dbt.
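
To give a flavor of what SQL-based transformation in dbt looks like, here is a minimal sketch of a dbt model: a SELECT statement that dbt compiles and materializes as a table or view in the warehouse. The model, source, and column names below are hypothetical, not taken from Lucid's actual warehouse.

    -- models/fct_daily_active_users.sql (hypothetical model)
    -- dbt resolves ref() to the object built by the upstream staging model,
    -- then materializes this query as a table or view in Snowflake.
    select
        event_date,
        count(distinct user_id) as daily_active_users
    from {{ ref('stg_clickstream_events') }}
    group by event_date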

This role is part of the Strategy and Analytics team, which supports the data and decision-making needs of every other function at Lucid. As such, you will have many opportunities to work cross-functionally with other teams. For example, you may work with data engineers on the ingestion of a new data source into our data warehouse or on moving data from the warehouse to other systems. You may also work with business leaders and stakeholders to help them self-serve their own data needs or automate a manual process. You may work closely with analysts on the Strategy and Analytics team to understand business needs and craft data sets to meet those needs. While analysts also contribute to data modeling, testing, and documentation, you will serve as an advisor and advocate for best practices, and as a technical expert when analysts run into difficult and complex data challenges.

Responsibilities:

  • Write complex, production-quality (i.e., accurate, performant, and maintainable) data transformation code to solve the needs of analysts, data scientists, and business stakeholders
  • Implement effective data tests to ensure the accuracy and reliability of data and ELT pipelines (a minimal example follows this list)
  • Assist in coaching and advising analysts on data modeling, SQL query structure and optimization, and software engineering best practices (e.g., version control, testing, code deployment)
  • Assist in designing and maintaining the architecture and organizational structure of our data warehouse
  • Collaborate with data engineers on infrastructure projects to implement new systems/tools/processes, ingest and model data from new sources, and pipe data between systems
  • Troubleshoot and resolve data issues as they arise
  • Ensure that data, systems, business logic, and metrics are well-documented
  • Maintain the quality of our analytics codebase by cleaning up old code, identifying and addressing tech debt, and ensuring consistent style
  • Other duties as assigned
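
To make the data-testing responsibility above concrete, the sketch below shows one way such a test can be written in dbt: a "singular" test, i.e. a SQL file whose query should return zero rows; any rows it does return are reported as failures when the test suite runs. The model and column names are hypothetical.

    -- tests/assert_no_negative_payment_amounts.sql (hypothetical test)
    -- dbt runs this query as part of `dbt test`; any returned rows are
    -- reported as test failures.
    select
        payment_id,
        amount
    from {{ ref('fct_payments') }}
    where amount < 0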

Requirements:

  • Currently pursuing a graduate or undergraduate degree, ideally in a technical or quantitative field
  • Ability and desire to develop the technical skills needed to work with large data sets (SQL, Python, R)
  • Comfortable using SQL for data transformations
  • Ability and desire to develop familiarity with version control workflows and with Python or another modern programming language
  • Ability to communicate clearly about data to both technical and non-technical audiences
  • Detail-oriented, organized, and a good team player
  • Passion for structure, organization, and efficiency, down to the details (e.g., maintaining consistent naming conventions and coding style)
  • This position is intended for current undergraduate or graduate students who will graduate in December 2024 or later

Preferred Qualifications:

  • Passion for problem-solving - If you’ve ever been so absorbed in a problem that your mind couldn’t rest until you figured it out, you’ll be in good company.
  • Willing to help and be helped - Our impact comes only through helping others make better decisions using data. We recognize that we’re stronger together - there’s no shame in asking for help, we’re not afraid to say “I don’t know”, and we actively seek feedback.
  • Desire to learn - You’ll often be answering questions that have never been answered before, which requires a high level of intellectual curiosity and an eagerness to dive into new problems, domains, tools, and techniques.

#LI-DA1 

