Data Analytics Engineer
New York Office
Perchwell
Who We Are
Perchwell is the modern data and workflow platform for real estate professionals and consumers. Built on the industry’s foundational data, Perchwell’s software suite empowers real estate professionals to do their best work, provide differentiated service to their clients, and grow their businesses.
Backed by Lux Capital, Founders Fund, and some of the country’s leading Multiple Listing Services (MLSs), Perchwell builds next-generation workflow software and data products for the multi-trillion-dollar residential real estate industry. Perchwell is the first new entrant to come to market in decades and is currently scaling its best-in-class platform.
What We're Looking For
We are seeking a motivated and experienced Data Analytics Engineer to join Perchwell’s growing data team. This role will be pivotal in developing and maintaining scalable data pipelines, transforming raw data into usable formats, and enabling self-service analytics across the organization. The ideal candidate will have a strong technical foundation, exceptional problem-solving skills, and experience in modern data stack technologies. You will collaborate closely with data engineers, data analysts, and product managers to deliver high-quality, timely data solutions.
What You’ll Do
Data Pipeline Development: Design, build, and maintain ETL/ELT pipelines to process large volumes of structured and unstructured data
Data Transformation: Use tools like dbt to transform raw data into clean, reusable data models that support self-service analytics
Data Integration: Integrate data from multiple sources, ensuring data accuracy, quality, and consistency across platforms
Data Governance: Ensure adherence to data quality and security standards, and support the implementation of data governance policies
Automation & Orchestration: Utilize Airflow to schedule and monitor workflows, ensuring timely delivery of data
CI/CD & Version Control: Implement CI/CD processes for analytics code, enabling rapid development, testing, and deployment
Collaboration & Support: Work closely with cross-functional teams (Data Analysts, Data Scientists, Product Managers) to deliver data products that meet business needs
Innovation: Stay up to date on the latest advancements in data engineering, large language models, and AI-driven analytics to drive innovation in Perchwell’s data strategy
Required for the Role
5-10 years of work experience in data analytics, data engineering, or a similar role
SQL: Advanced proficiency in SQL for querying, transforming, and optimizing large datasets
Data Warehousing: Strong experience with Amazon Redshift or other cloud data warehouses (Snowflake, BigQuery, etc.)
ETL/ELT Tools: Expertise with dbt to build modular, testable, and reusable transformation logic
Programming: Proficiency in Python for scripting, automation, and working with APIs
Workflow Orchestration: Experience with Airflow or similar orchestration tools to manage workflows
CI/CD: Experience with CI/CD pipelines for version control, testing, and deployment of analytics code
Data Modeling: Knowledge of data modeling best practices and the ability to design star/snowflake schemas
Data Quality: Experience with tools, frameworks, or processes for data validation, testing, and reconciliation
LLM & AI Knowledge: Exposure to Large Language Models (LLMs) and experience using them to support analytics, such as enabling natural language query interfaces
Analytical Mindset: Strong problem-solving and critical-thinking skills
Communication: Ability to communicate complex technical concepts to non-technical stakeholders
Detail-Oriented: Precision in handling large datasets and ensuring data integrity
Collaboration: A team player who thrives in cross-functional environments
In this role, you’ll work out of our New York City office in Soho, Manhattan, three days per week.
Bonus Points for the Following
Experience with DataOps principles and practices
Familiarity with data lake technologies (e.g., S3, Delta Lake, etc.)
Exposure to API development and experience in ingesting API-based data sources
Knowledge of monitoring and observability tools for data pipelines
Exposure to AI/ML workflows and experience incorporating them into data pipelines
Salary Range
The compensation for this position is $140K–$175K base salary, plus equity and benefits.
Benefits
Unlimited PTO, plus 10 paid company holidays
401K with a company match
Medical, dental, and vision plans
HSA and FSA options
Commuter benefits
Parental leave
Company-wide onsite or offsite each year
Beautiful office in Soho, Manhattan with a stocked kitchen, catered breakfast and lunch once per week, happy hours and meet-ups
Note: At this time, we are only considering candidates who are authorized to work in the U.S.