Senior Enterprise Architect - Data Architecture
Salt Lake City Office, United States
Western Governors University
Western Governors University is an online university where you can earn an affordable, accredited, career-focused college degree at an accelerated pace. If you're passionate about building a better future for individuals, communities, and our country, and you're committed to working hard to play your part in building that future, consider WGU as the next step in your career.
Driven by a mission to expand access to higher education through online, competency-based degree programs, WGU is also committed to being a great place to work for a diverse workforce of student-focused professionals. The university has pioneered a new way to learn in the 21st century, one that has received praise from academic, industry, government, and media leaders. Whatever your role, working for WGU gives you a part to play in helping students graduate, creating a better tomorrow for themselves and their families.
The salary range for this position takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs.
At WGU, it is not typical for an individual to be hired at or near the top of the range for their position, and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is:
Job Description
We are seeking a visionary and detail-oriented Senior Enterprise Architect - Data Architecture to lead the design and implementation of advanced data ecosystems built on data mesh principles, big data processing, streaming technologies, modern data governance practices, and AI/ML/Generative AI. In this role, you will collaborate with stakeholders across the organization to create a scalable, decentralized, and secure data architecture that supports strategic decision-making, AI/ML innovation, and advanced analytics.
This is a critical role in defining the technical blueprint for how data is collected, processed, stored, shared, and governed across multiple domains while ensuring the architecture scales to meet future needs.
Key Responsibilities
Data Architecture Design and Strategy
Develop and maintain a comprehensive enterprise data architecture that aligns with business goals, ensuring scalability, flexibility, and performance.
Create data models, pipelines, and storage solutions optimized for structured, semi-structured, and unstructured data.
Standardize data integration patterns, defining the technical stack and best practices for the organization.
Data Mesh Implementation
Lead the transition from a centralized data approach to a domain-driven, decentralized data mesh architecture.
Define and promote the concept of data as a product, ensuring domain teams own and manage their data with self-serve capabilities.
Design cross-domain data contracts to ensure interoperability and governance (a minimal contract sketch follows this list).
Partner with engineering teams to implement domain-oriented data ownership and infrastructure.
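For illustration only, here is a minimal sketch of what a cross-domain data contract could look like in Python: a producing domain publishes a versioned schema, and consumers validate records against it at the domain boundary. The "enrollments" domain, dataset, and field names are hypothetical examples, not WGU systems.

```python
# Minimal data-contract sketch: a versioned schema owned by a producing domain,
# validated by consumers before use. Names below are illustrative assumptions.
from dataclasses import dataclass
from typing import Any


@dataclass
class DataContract:
    domain: str              # owning domain (the data-as-a-product owner)
    dataset: str             # name of the published data product
    version: str             # semantic version of the contract
    schema: dict[str, type]  # required field -> expected Python type

    def validate(self, record: dict[str, Any]) -> list[str]:
        """Return a list of violations; an empty list means the record conforms."""
        problems = []
        for column, expected_type in self.schema.items():
            if column not in record:
                problems.append(f"missing field: {column}")
            elif not isinstance(record[column], expected_type):
                problems.append(f"{column}: expected {expected_type.__name__}")
        return problems


# A consuming team checks records against the contract at the domain boundary.
enrollment_contract = DataContract(
    domain="enrollments",
    dataset="enrollment_events",
    version="1.2.0",
    schema={"student_id": str, "program_code": str, "enrolled_at": str},
)
violations = enrollment_contract.validate(
    {"student_id": "S-1001", "program_code": "BSCS", "enrolled_at": "2024-07-01"}
)
assert not violations
```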
Big Data and Streaming Solutions
Architect and implement scalable big data platforms using tools like Hadoop, Snowflake, BigQuery, or Redshift.
Design real-time data streaming and event-driven architectures using tools like Kafka, Apache Flink, or Apache Pulsar (see the consumer sketch after this list).
Build systems to process and analyze large datasets efficiently, optimizing for latency and throughput.
Integrate real-time and batch processing solutions to create unified data platforms.
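As one illustration of the streaming side, the sketch below consumes events from a Kafka topic and flushes micro-batches downstream, standing in for a load into a warehouse table. It assumes a broker at localhost:9092, the kafka-python client, and a hypothetical student_activity topic; none of these reflect a prescribed stack.

```python
# Minimal event-consumer sketch (assumed broker, topic, and field names).
import json

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "student_activity",                      # hypothetical topic name
    bootstrap_servers="localhost:9092",
    group_id="activity-aggregator",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

batch = []
for message in consumer:
    batch.append(message.value)              # each value is one activity event
    if len(batch) >= 500:
        # Flush a micro-batch downstream, e.g. into a warehouse table
        # (Snowflake/BigQuery/Redshift), unifying streaming and batch paths.
        print(f"flushing {len(batch)} events")
        batch.clear()
```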
AI/ML and Generative AI Enablement
Design data infrastructure to support AI/ML and Generative AI initiatives, ensuring data pipelines deliver high-quality training data.
Collaborate with data scientists to define and implement data architectures for machine learning workflows, including feature engineering and model deployment.
Enable scalable experimentation and deployment of Generative AI models like GPT, Stable Diffusion, or domain-specific LLMs (Large Language Models).
Develop strategies for model monitoring, data drift detection, and retraining pipelines, as sketched below.
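A minimal drift-check sketch: a stored sample of training feature values is compared against a recent serving sample with a two-sample Kolmogorov-Smirnov test. The feature name, synthetic data, and alert threshold are illustrative assumptions.

```python
# Minimal data-drift check between a training sample and recent serving data.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
training_values = rng.normal(loc=0.0, scale=1.0, size=5_000)  # reference sample
serving_values = rng.normal(loc=0.4, scale=1.0, size=5_000)   # recent live sample

statistic, p_value = ks_2samp(training_values, serving_values)
if p_value < 0.01:
    # In practice this would raise an alert and/or enqueue a retraining job.
    print(f"drift detected on 'session_duration' (KS={statistic:.3f}, p={p_value:.2e})")
```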
Data Governance, Reporting, and Security
Establish and implement data governance frameworks, policies, and standards that ensure security, compliance, and privacy across the organization.
Deploy tools for data cataloging, metadata management, and lineage tracking (e.g., Collibra, Alation, Amundsen).
Define and implement data quality management processes, including monitoring, profiling, and validation tools (a minimal validation sketch follows this list).
Architect and implement data reporting and visualization tools (e.g., Power BI, Tableau, Looker) to enable data-driven decision-making across the organization.
Partner with the security team to implement robust data security protocols for encryption, access control, and threat prevention, ensuring compliance with regulations (e.g., GDPR, CCPA, HIPAA).
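A minimal data-quality validation sketch in pandas, assuming checks run before a dataset is published; the column names, thresholds, and rules are illustrative, not a mandated toolchain.

```python
# Minimal data-quality checks on an example dataset (illustrative rules).
import pandas as pd

df = pd.DataFrame({
    "student_id": ["S-1", "S-2", "S-3", "S-3"],
    "credits_earned": [12, 9, None, 200],
})

checks = {
    "student_id is unique": df["student_id"].is_unique,
    "credits_earned null rate < 5%": df["credits_earned"].isna().mean() < 0.05,
    "credits_earned within 0-180": df["credits_earned"].dropna().between(0, 180).all(),
}

failures = [name for name, passed in checks.items() if not passed]
if failures:
    # In production these results would feed catalog/lineage and quality
    # dashboards and could block the publish step for the data product.
    print("data quality failures:", failures)
```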
Collaboration and Leadership
Work closely with data engineers, data scientists, analysts, and business stakeholders to understand data needs and deliver solutions.
Drive the adoption of modern data architecture principles and best practices within engineering and business teams.
Lead data workshops and training sessions to enable teams to effectively use and manage data products.
Stay up-to-date with emerging technologies and industry trends to continually enhance the organization's data capabilities.
Qualifications
Required Skills & Experience
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
Minimum of 10 years of experience in data architecture, engineering, or a related field.
Hands-on experience with data mesh principles and implementing decentralized data architectures.
Expertise in big data technologies (e.g., Hadoop, Spark, Hive, Presto, Snowflake, Redshift, BigQuery).
Strong experience with streaming technologies such as Kafka, Apache Flink, Pulsar, or Storm.
Proven expertise in building and scaling cloud-native data platforms (AWS, Azure, GCP).
Deep understanding of data governance concepts, including metadata management, lineage tracking, and compliance tools.
Strong knowledge of relational and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra).
Proficiency with programming languages (e.g., Python, Java, Scala) and data pipeline frameworks (e.g., Airflow, dbt).
Experience integrating AI/ML workflows and MLOps practices into data architecture.
Certifications (Preferred)
Cloud Certifications:
AWS Certified Data Analytics – Specialty
AWS Certified Solutions Architect – Professional
Big Data and Streaming Certifications:
Cloudera Certified Professional (CCP) Data Engineer
Databricks Certified Data Engineer Associate/Professional
Confluent Certified Developer for Apache Kafka
AI/ML Certifications:
AWS Certified Machine Learning – Specialty
TensorFlow Developer Certification
Data Governance and Reporting Certifications:
Certified Data Management Professional (CDMP)
Collibra Data Governance Certification
Preferred Skills
Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation).
Knowledge of container orchestration tools (e.g., Kubernetes, Docker).
Familiarity with advanced analytics and machine learning platforms like Databricks, SageMaker, or Vertex AI.
Experience working in highly regulated industries (e.g., finance, healthcare).
#LI-ZARD
Position & Application Details
Full-Time Regular Position: This is a full-time, regular position (40 standard weekly hours) that is eligible for bonuses; medical, dental, vision, telehealth, and mental healthcare; health savings account and flexible spending account; basic and voluntary life insurance; disability coverage; accident, critical illness, and hospital indemnity supplemental coverages; legal and identity theft coverage; retirement savings plan; wellbeing program; discounted WGU tuition; and flexible paid time off for rest and relaxation with no need for accrual, flexible paid sick time with no need for accrual, 11 paid holidays, and other paid leaves, including up to 12 weeks of parental leave.
How to Apply: If interested, submit an application online. Internal WGU employees will need to apply through the internal job board in Workday.
Additional Information
Disclaimer: The job posting highlights the most critical responsibilities and requirements of the job. It’s not all-inclusive.
Accommodations: Applicants with disabilities who require assistance or accommodation during the application or interview process should contact our Talent Acquisition team at recruiting@wgu.edu.
Equal Opportunity Employer: We are an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. #DEI