Director, Principal Data Architect
Paris, France
Tiffany & Co.
Discover fine jewelry creations of timeless beauty and superlative craftsmanship that will be treasured always.
Position Overview:
Tiffany & Co. is seeking a Director - Principal Data Architect to join our team as we embark on a journey to the Google Cloud Platform. We are modernizing and optimizing our Data & Analytics program by consolidating our legacy data platforms onto GCP through re-design, and we will be enabling a suite of leading capabilities to support the spectrum of business needs with key cloud and big data technologies, including AI/ML, streaming data, data lake and data warehouse, as well as self-service and delivered reporting. This important role will help lead our GCP architecture, roadmap, and integration strategy, and will play a key role in building these capabilities.
This is a very exciting time for the Tiffany Data & Analytics program, and we are looking for an individual with the right combination of multi-faceted skills, experiences, and energy to join our team!
As a successful candidate, you will demonstrate a robust track record of leading data engineering and platform enablement projects and enhancement efforts on GCP. You will be a subject matter expert on GCP services, tools, and best practices for data engineering, data warehousing, and data delivery. You will help drive our key initiatives and high-value projects, including the migration of our existing data warehouse platforms (iSeries-based) onto GCP, as well as integration with operational systems located within AWS, Azure, hosted as SaaS, and on-prem, using the best-fit integration technologies and methodologies. You will work with an internal project team augmented by onshore and offshore third-party project resources.
This role can be based in either Paris, France or Parsippany, New Jersey.
Responsibilities:
- Define the enterprise data architecture for the data platforms.
- Lead the data architecture practice and represent Data Engineering at the enterprise level.
- Serve as technical subject matter expert in the data ecosystem, including GCP, providing input into architecture, platforms, and development strategies and methodologies; this includes mentoring engineering and platform teams via design reviews, code reviews, and similar forums.
- Define data engineering and data platform integration frameworks and standards. Mentor, coach, and build a learning program to onboard new team members quickly and efficiently, and ensure standards are understood and followed.
- Facilitate cross-team collaboration to define and build the enterprise data management architecture, from principles to tools; oversee cross-functional adoption of the new architecture and enable a new level of engineering efficiency when working with data.
- Direct the strategy and implementation for migration from the existing data warehouse platforms onto GCP.
- Lead the technical direction of the team, driving necessary changes and recommending appropriate technology choices in collaboration with the Architecture, Platform, DevOps, Security, and Project teams; influence technical direction with expert input into GCP-related and other project decisions.
- Lead the shift toward DevOps processes for the Data & Analytics delivery function, emphasizing continuous integration, release management, and automated testing to maximize development agility and improve time to market.
- Drive the data platform technical roadmap as the data platform product owner, prioritizing and building new features that serve business teams and standardizing technical work (reusable templates for data engineers).
- Interface with key business functions (e.g., Marketing, Sales, Finance, Merchandising), as well as IT Business Analysis teams, to assess business functional requirements and translate them into data and integration requirements.
- Direct technical resources from consulting partners (onsite and offshore), communicate architecture standards and best practices, establish a high-impact development process, and drive excellence in all deliverables.
- Establish a daily cadence with the project team to prioritize and execute work items through adoption of Agile principles and processes.
- Manage relationships with external vendors to determine technical competence and identify integration opportunities.
- Be a hands-on leader for end-to-end delivery of GCP platforms, capabilities, and content.
Success Factors:
- Ability to understand, drive, and deliver technology solutions in GCP.
- Strong understanding of data architecture and its components, especially for data warehousing, including BigQuery, Dataform, Cloud Composer, and Data Catalog; other key platforms include Fivetran/HVR/LDP, Power BI, and Dataiku.
- Strong understanding of and experience with modern data warehouse concepts, including data lake/data warehouse/data mart implementations, SQL-based transformations, and ELT methodology (a minimal sketch follows this list).
- Strong understanding of DevOps and Agile methodology; comfortable with Jira, Azure DevOps, and GitHub.
- Ability to lead project and technical teams and deliver solutions.
- Strong planning, coordination, and execution skills.
- Ability to work with business partners collaboratively and successfully.
- Ability to provide technical leadership to GCP-related project and support teams.
- Knowledgeable and skilled in GCP-related technologies, including platforms and tools for meeting business use cases for all forms of analytics and AI/ML functionality.
- Ability to prioritize well, communicate clearly and understand how to drive a high level of focus and excellence in deliverables.
- Ability to collaborate, propose solutions, and advise on the best course of action.
- Communicate accurately, concisely, and with tact and diplomacy when appropriate.
- Be conscientious, reliable, and inquisitive, with a keen desire to learn not just the knowledge necessary for the job but also the underlying reasons and drivers.
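For illustration only, here is a minimal sketch of the SQL-based, ELT-style transformation referenced above: the SQL runs inside BigQuery via the official Python client. All project, dataset, and table names below are hypothetical, not part of this posting.

```python
# Minimal ELT sketch: data is loaded first, then transformed inside
# BigQuery with SQL. All names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="tiffany-analytics-demo")  # hypothetical project

# SQL-based transformation: derive a curated daily sales mart
# from a raw landing table, entirely within the warehouse (ELT).
transform_sql = """
CREATE OR REPLACE TABLE `tiffany-analytics-demo.curated.daily_sales` AS
SELECT
  DATE(order_ts)           AS order_date,
  store_id,
  SUM(net_amount)          AS net_sales,
  COUNT(DISTINCT order_id) AS order_count
FROM `tiffany-analytics-demo.raw.sales_orders`
GROUP BY order_date, store_id
"""

job = client.query(transform_sql)  # submit the query job to BigQuery
job.result()                       # block until the transformation completes
print(f"Transformation finished: job {job.job_id}")
```

In practice, a transformation like this would typically be managed in Dataform or dbt rather than ad-hoc client code, which is why those tools appear in this posting.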
Qualifications:
Required:
- BS in Computer Science or related field or equivalent experience.
- 7+ years of experience in Data & Analytics field.
- 3+ years of experience in GCP-specific architecture, development, or support capacity.
- 2+ years of experience leading an offshore team in a Data & Analytics environment, with responsibilities in data integrations and data platform development.
- 3+ years of experience building data integration solutions in GCP with BigQuery as the target data warehouse repository.
- 5+ years of experience designing and building data storage capabilities, such as data warehouse or data lake.
- An understanding of large-scale computing solutions, including software design and development, and database architectures.
- Knowledge of GCP cloud security, orchestration, management, and data management, in particular metadata management and data quality checks.
- Knowledge of FinOps.
- High level of comfort communicating effectively across internal and external organizations.
- Ability to build data pipelines as part of integration design (see the ingestion sketch after this list).
- Strong knowledge of ELT, ETL, CDC, API, messaging, streaming, and all forms of data ingestion techniques applicable to cloud-based Data & Analytics.
- Strong SQL skills.
- Experience with DevOps and Agile projects.
- Experience with NoSQL as well as relational and non-relational platforms; experience with multiple file formats, including Parquet, Avro, JSON, XML, and CSV.
- Fluent in English; fluency in French is a plus.
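As a non-authoritative illustration of the pipeline and file-format skills listed above, the sketch below batch-loads Parquet files from Cloud Storage into a BigQuery landing table using the official Python client. The bucket, project, dataset, and table names are hypothetical.

```python
# Minimal ingestion sketch: batch-load Parquet files from GCS into BigQuery.
# Bucket, project, dataset, and table names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="tiffany-analytics-demo")  # hypothetical project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,               # schema inferred from Parquet
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,  # append to the landing table
)

load_job = client.load_table_from_uri(
    "gs://tiffany-landing-demo/sales/*.parquet",  # hypothetical bucket and prefix
    "tiffany-analytics-demo.raw.sales_orders",    # target landing table
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish

table = client.get_table("tiffany-analytics-demo.raw.sales_orders")
print(f"Load complete; table now has {table.num_rows} rows")
```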
Preferred:
- Experience with Dataform or dbt strongly preferred.
- Experience with Fivetran LDP and the Fivetran Managed Service Platform (or similar technologies for API-based integration and CDC-based replication) strongly preferred.
- Python and Power Apps development experience strongly preferred.
- Experience with Power BI.
- Experience with data science platforms, such as Dataiku.
- Experience designing and building API interfaces for efficient data extraction.
- Experience working in the global retail industry.
- Google Cloud Professional Cloud Architect certification is a plus.