Junior Data Engineer

Las Vegas, NV, US


Description

ABOUT US

We are a full-service, commercial real estate firm that delivers highly specialized Asset Management Services and focuses primarily on Manufactured Housing and Self-Storage. We own and operate over 115 properties across 29 states. We are seeking innovative, passionate, and motivated individuals looking for an opportunity to join a fast-growing organization with tremendous professional growth opportunities. Our achievements hinge on our capacity to implement our guiding principles, fostering a distinctive organizational culture that sets us apart from others. Only by doing so can we cultivate an atmosphere where meaningful relationships and productive work converge, paving the way for continuous improvement and innovation. Our team is made up of uniquely qualified, professional individuals who understand the complexities and challenges of acquiring and managing our key assets. We are known for providing a space where your contributions are valued, your ideas are heard, and the value you provide is recognized through career advancement and financial opportunities.


POSITION PURPOSE

We are looking for a passionate and driven Junior Data Engineer to join our dynamic Data team!   

The Junior Data Engineer is responsible for supporting the design, development, and maintenance of data pipelines, data warehouses, and integrations that ensure clean, accurate, and accessible data for business operations and analytics. This role works closely with cross-functional teams to manage data workflows, ensure data quality, and optimize system performance to support business intelligence and reporting needs.


**Please note: this is an in-person role in Las Vegas. THIS IS NOT A REMOTE ROLE. Candidates must reside or be attending school in Las Vegas, NV to be considered.**


JUNIOR DATA ENGINEER RESPONSIBILITIES:

ETL Development & Management:

  • Design, develop, and maintain ETL (Extract, Transform, Load) pipelines to automate the flow of data from multiple systems, including Reonomy, Rent Manager, StorEDGE, Pipedrive, and Sage.
  • Ensure data is consistently cleaned, transformed, and integrated from disparate sources into a centralized Fabric Lakehouse.

Database Management:

  • Assist in the design and optimization of Fabric Lakehouse tables to ensure efficient data storage and retrieval.
  • Write and optimize SQL queries for high performance and minimal resource usage.
  • Perform routine database maintenance tasks, including indexing and performance tuning within Microsoft Fabric environments.

Data Warehousing:

  • Assist in the design and implementation of scalable data warehouses to consolidate data from multiple sources.
  • Develop data marts and semantic models tailored for specific business units or analytical use cases within Power BI and Fabric.

Data Quality Assurance:

  • Implement data validation and quality checks to ensure the accuracy, integrity, and completeness of data.
  • Develop scripts and processes to clean, transform, and prepare data for reporting and analysis.

Data Pipeline Automation:

  • Automate data workflows using tools such as Apache Airflow, Luigi, or custom scripting solutions.
  • Manage and schedule data pipeline updates to ensure timely data availability.

API Integration:

  • Develop and manage API integrations within Microsoft Fabric Notebooks or Data Pipelines to ingest and process third-party data.
  • Utilize external APIs to fetch data from third-party sources and integrate it into internal data infrastructure.

Big Data Technologies:

  • Support distributed computing solutions using big data technologies such as Hadoop, Spark, or Kafka.
  • Implement real-time data processing solutions for streaming data applications.

Data Security & Compliance:

  • Assist in implementing data governance policies to ensure data security, privacy, and regulatory compliance.
  • Manage and enforce data access controls to protect sensitive information and ensure authorized use.

Monitoring & Troubleshooting:

  • Set up monitoring tools to track the performance, availability, and health of data pipelines.
  • Troubleshoot and resolve data pipeline, database, and data quality issues as they arise.

Documentation & Reporting:

  • Create and maintain detailed technical documentation for data pipelines, database schemas, and data workflows.
  • Develop and maintain reports and dashboards to provide insights into data pipeline performance, data quality, and system health.

JUNIOR DATA ENGINEER QUALIFICATIONS:

  • Recently graduated or currently pursuing a degree in Data Science, Statistics, Real Estate, or a related field.
  • NOT A REMOTE ROLE – In-Person Only – Must reside in Las Vegas, NV, or be currently enrolled in a Las Vegas, NV school.
  • Understanding of ETL/ELT concepts and data modeling best practices.
  • Familiarity with scripting languages like Python.
  • Knowledge of Power BI, especially working with semantic models and DirectQuery.
  • Experience working with APIs (REST, JSON) for data extraction and integration. 
  • Strong attention to detail and a commitment to data quality and accuracy.
  • Ability to work collaboratively in a cross-functional team environment.
  • Possess a genuine interest in commercial real estate and investment analysis.
  • Must have excellent verbal and written communication skills.