Data Management Architect

Taipei: 9F, No. 100, Songren Rd., Xinyi Dist., Taiwan

dentsu

At dentsu, innovation is our strength, and your growth is our mission. We help you keep up with technological changes in the digital economy.

As a Data Architect at Merkle, dentsu Taiwan, you will focus primarily on developing marketing data integration solutions such as CDP (Customer Data Platform), CRM, and data middleware. Leveraging technologies including databases, data lakes, data warehouses, and ETL data pipelines, the Data Architect designs and implements solutions throughout the project development lifecycle. This role acts as the solution owner, directly contributing to solution architecture, design, development, testing, and deployment, and serves as a subject matter expert in solution technology.

Job Description:

Responsibilities:

  • Concentrate on the construction of marketing data integration applications, such as CDP, CRM, and Data Middleware.
  • Participate in early-stage architecture planning and implementation design for data application projects to ensure robust data architecture.
  • Design and implement big-data processing workflows and ETL pipelines, covering data integration, cleansing, transformation, and processing, as well as data modeling and the design of databases and data flows. Assist data scientists with data analysis and modeling.
  • Ensure accuracy, completeness, consistency, availability, security, and privacy of diverse data sources.
  • Collaborate with clients' data departments in industries like retail, finance, telecommunications, and e-commerce, utilizing data engineering and data science foundations to provide data application support and solutions.
  • Conduct requirement interviews, architecture planning, construction, development, and maintenance for data integration platforms and data warehouses.
  • Research data technology tools, perform proof-of-concept (POC) studies, and validate feature capabilities.

Additional Requirements:

  • Availability for on-site client service; frequency to be determined.
  • Over 3 years of experience in data engineering development, with extensive knowledge of data engineering and data platform technologies; proficiency in at least one programming language (such as Java or Python), relational databases, and SQL.
  • Experience with data warehouse/data integration platform architecture planning, requirement interviews, model design, and development/maintenance using commercial ETL tools (e.g., Trinity, SSIS, SAS, Cloud Data Fusion, Dataproc, AWS Glue, Databricks) or open-source ETL tools (e.g., Airflow, NiFi, dbt, Apache Beam, Apache Flink).
  • Familiarity with data development and operational processes, including CI/CD, DevOps, MLOps practices, infrastructure as code, containerization, virtualization, etc.
  • Demonstrated capability in technical architecture planning and defining technical standards, encompassing system architecture design, data flow design, technical feasibility assessment, data storage, and streaming evaluation.
  • Effective collaboration with business and technical teams, excellent understanding of business requirements, and enthusiasm for programming.
  • Strong problem-solving skills, excellent teamwork and communication abilities, and a positive, proactive attitude.
  • Practical experience in data architecture, ETL data processing transformations, and data analysis in industries such as retail, finance, telecommunications, and e-commerce is a plus.

Location:

Taipei- 9F, No.100, Songren Rd., Xinyi Dist.

Brand:

Merkle

Time Type:

Full time

Contract Type:

Permanent