Data Architect | Las Vegas, NV, United States
Full Time · Senior-level / Expert · USD 46K – 161K
Photon
Photon, a global leader in digital transformation services and IT consulting, works with 40% of the Fortune 100 companies as their digital agency of choice.

Job Summary:
We are seeking a highly skilled and experienced Data Architect to design, implement, and manage scalable, high-performance data solutions. The ideal candidate will have expertise in data modeling, database design, data warehousing, and cloud-based data architecture. The Data Architect will work closely with business stakeholders, data engineers, and software development teams to ensure that data solutions align with organizational goals and support advanced analytics and reporting capabilities.
Key Responsibilities:
- Design and develop enterprise-wide data architecture strategies to support business objectives.
- Define and enforce data governance, data security, and data quality standards.
- Create and optimize database schemas, ETL processes, and data pipelines for efficient data integration.
- Lead the implementation of data warehousing solutions using modern cloud technologies (AWS, Azure, GCP).
- Collaborate with data engineers, data scientists, and application developers to ensure seamless data flow across systems.
- Evaluate and select appropriate database management systems (SQL, NoSQL, etc.) based on business needs.
- Define and document best practices for data modeling, metadata management, and master data management (MDM).
- Monitor, troubleshoot, and optimize database performance and scalability.
- Stay updated with emerging trends in data architecture, big data, and analytics technologies.
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- [X+] years of experience in data architecture, data modeling, or database design.
- Expertise in relational and NoSQL databases (MySQL, PostgreSQL, MongoDB, Cassandra, etc.).
- Strong experience with cloud platforms (AWS, Azure, GCP) and cloud data solutions (Snowflake, BigQuery, Redshift, etc.).
- Hands-on experience with ETL tools (Informatica, Talend, Apache NiFi) and data pipeline orchestration (Apache Airflow, AWS Glue, etc.).
- Proficiency in data modeling techniques (OLTP, OLAP, Star Schema, Snowflake Schema).
- Knowledge of programming and scripting languages such as SQL, Python, Java, or Scala.
- Familiarity with data governance, compliance (GDPR, CCPA), and data security best practices.
- Excellent problem-solving skills and ability to work in an agile environment.
Preferred Qualifications:
- Experience with big data technologies such as Hadoop, Spark, Kafka.
- Knowledge of machine learning and AI-driven data architectures.
- Certifications in cloud data technologies (AWS Certified Data Analytics, Google Professional Data Engineer, etc.).
- Strong communication and stakeholder management skills.
Compensation, Benefits and Duration
Minimum Compensation: USD 46,000
Maximum Compensation: USD 161,000
Compensation is based on the candidate's actual experience and qualifications. The range above is a reasonable, good-faith estimate for the role.
Medical, vision, and dental benefits, 401k retirement plan, variable pay/incentives, paid time off, and paid holidays are available for full time employees.
This position is available to independent contractors.
No applications will be considered if received more than 120 days after the date of this post.