Enterprise Data Architect - Databricks
Reno, Nevada, United States
ITS Logistics, LLC
ITS Logistics is a premier 3PL with dedicated fleet and asset-lite transportation services plus omnichannel distribution and fulfillment.
ABOUT ITS LOGISTICS
Are you ready to unleash your potential and be a part of one of the fastest-growing, most exciting logistics companies in the US? ITS Logistics is a premier Third-Party Logistics company that provides creative supply chain solutions. With the highest level of service, unmatched industry experience and work ethic, and a laser focus on innovation and technology, our purpose is to improve the quality of life by delivering excellence in everything we do.
At ITS, we invest in your personal and professional growth, providing the tools, resources, and support you need to unleash your full potential, collaborate with like-minded teammates, and seize limitless opportunities. By joining our all-star team, you will be part of an organization that values your unique skills, encourages your drive for excellence, and recognizes your unwavering commitment to achieving our shared goals.
We empower our team members to become champions in their respective fields by nurturing a culture of collaboration, competition, and unyielding resilience. We believe that together, we can conquer any challenge and achieve remarkable victories.
Want to learn more about ITS Logistics? Check out our website! www.its4logistics.com
About The Job
Position Summary
At a pivotal stage of data modernization and scale, ITS is seeking an Enterprise Data Architect with deep expertise in Databricks, Python, and data engineering to lead the architectural evolution of our enterprise data platform. This individual will be instrumental in defining, designing, and delivering scalable data solutions that fuel strategic decision-making, empower analytics teams, and support advanced AI/ML initiatives. This is an exciting opportunity to shape the future of our data architecture at a high-growth, fast-paced organization.
Key Responsibilities
- Data Architecture Leadership: Define and drive the enterprise data architecture strategy with a focus on scalability, performance, security, and modularity across lakehouse and data warehouse environments.
- Databricks Platform Mastery: Architect and implement solutions within the Databricks ecosystem, including Delta Lake, Unity Catalog, Workflows, and advanced Spark operations.
- Cloud & Engineering Strategy: Design and guide the implementation of distributed, cloud-based data pipelines, leveraging modern frameworks and infrastructure (Azure preferred).
- Python-Centric Development: Provide leadership on enterprise-grade Python solutions for ETL/ELT, auditing, orchestration, and automation of data flows across systems.
- Governance & Standards: Establish standards and patterns for metadata management, lineage, data quality, and security using Unity Catalog and access control best practices.
- Business Partnership: Collaborate with data analysts, data scientists, product owners, and operational leaders to understand business needs and translate them into architectural solutions.
- Modeling & Design: Drive robust dimensional modeling practices to support both real-time and batch processing across Bronze, Silver, and Gold layers (see the sketch after this list).
- Technical Mentorship: Guide data engineers and developers across projects by promoting architectural excellence, peer reviews, and code quality. Encourage a culture of continuous learning and innovation.
- Technology Evaluation and Selection: Evaluate and select appropriate data technologies, tools, and platforms. Stay abreast of emerging data trends and technologies.
- Data Quality Management: Establish and manage data quality standards, processes, and metrics to ensure data accuracy, consistency, and completeness. Deliver initiatives to monitor, improve, and report on data quality.
- CI/CD Enhancements: Enable and enhance DevOps for data through integration with Git-based workflows, test automation, environment management, and release strategies using Azure DevOps.
- Documentation & Communication: Document architectural decisions and communicate clearly with both technical and non-technical stakeholders.
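For illustration only, the following is a minimal sketch of the kind of Bronze-to-Silver medallion transformation described in the responsibilities above, written in PySpark against Delta Lake on Databricks; the table names (bronze.shipments, silver.shipments) and columns are hypothetical, not part of this posting.

```python
# Minimal sketch of a Bronze -> Silver step in a medallion architecture.
# Table names (bronze.shipments, silver.shipments) and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` is already provided

# Read raw ingested records from the Bronze layer.
bronze = spark.read.table("bronze.shipments")

# Cleanse and conform: drop duplicates, standardize types, filter out bad rows.
silver = (
    bronze
    .dropDuplicates(["shipment_id"])
    .withColumn("ship_date", F.to_date("ship_date"))
    .filter(F.col("shipment_id").isNotNull())
)

# Write the curated result as a Delta table in the Silver layer.
silver.write.format("delta").mode("overwrite").saveAsTable("silver.shipments")
```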
Basic Qualifications
- Education: Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent experience.
- Experience: 8+ years in data engineering and architecture, including 5+ years in architect-level roles focused on scalable data platform design.
- Databricks Expertise: 4+ years of experience with Databricks including Spark (PySpark), Delta Lake, Unity Catalog, MLflow, and Databricks Workflows.
- Python Proficiency: Advanced hands-on development using Python for ETL, orchestration, and scalable data transformations.
- Modern Data Stack: Strong knowledge of cloud-native data engineering tools and concepts (e.g., Azure Data Lake, Azure Data Factory, Snowflake, or similar platforms).
- Design Thinking: Expertise in designing medallion architectures, data modeling (dimensional and normalized), and handling large-scale structured and unstructured data.
- Security & Governance: Familiarity with data access policies, role-based access controls, PII management, and compliance frameworks (see the sketch after this list).
- Communication: Strong written and verbal skills; capable of translating complex technical ideas into actionable insights for non-technical stakeholders.
- Project Leadership: Proven success delivering architectural outcomes on large-scale initiatives; effective at balancing speed and quality in a rapidly evolving environment.
- Scalability and Performance Optimization: Design and support the implementation of scalable data pipelines, ensuring performance that meets SLAs and maintains data system reliability.
- Documentation & Standards: Experience defining and implementing architectural standards, reusable frameworks, and technical design documents.
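As a further illustration of the governance work referenced above, here is a hedged sketch of applying Unity Catalog access controls from a Databricks notebook; the catalog, schema, table, and group names are illustrative assumptions only.

```python
# Hypothetical Unity Catalog grants issued from a Databricks notebook.
# Catalog (main), schema (silver), table (shipments), and group (data-analysts)
# names are illustrative, not taken from this posting.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `data-analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.silver TO `data-analysts`")
spark.sql("GRANT SELECT ON TABLE main.silver.shipments TO `data-analysts`")
```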
Other Qualifications
- Experience in requirements gathering methodologies and processes.
- Excellent communication skills, especially with end users and leadership.
- Experienced working in a collaborative environment.
- Excellent judgment skills (e.g., when to escalate technical problems and issues).
- Ability to absorb and retain complex technical information quickly.
- Strong diligence.
- Demonstrated analytical, critical, and problem-solving skills.
Perks/benefits: Career development, startup environment