Data & Integration Architect
USA - Remote - Ohio, United States
Full Time · Senior-level / Expert · USD 113K - 182K
dentsu
We are dentsu. A group of optimists, visionaries, and pioneers. In a constantly changing world, we help brands grow, transform, and develop their business responsibly.
Job Description:
This role is remote-friendly.
Responsibilities:
You will architect, design, and build cloud solutions using Azure Data Factory.
You will design and build global data warehouse solutions, ensuring data consistency, quality, and compliance across international data sources.
You will develop and optimize ETL/ELT CI/CD workflows using ADF pipelines, Data Flows, Linked Services, Integration Runtimes, and Triggers.
You will use PySpark, Kafka, Kinesis, and Python for data transformation, cleansing, and enrichment tasks within Azure Synapse or Databricks environments.
You will collaborate with cross-functional teams to define data architecture standards, governance, and best practices.
You will provide technical leadership and mentorship to junior engineers.
You will ensure performance tuning, monitoring, and troubleshooting of data pipelines and workflows.
You will report to the Vice President, Data Engineering Lead.
Required Experience:
9+ years of experience in data warehousing/engineering.
3+ years of experience in Azure Data Factory architecture and implementation (migration or new implementation).
Bachelor's degree in Computer Science, Information Systems, or a related field.
Experience with ADF components: Pipelines, Datasets, Linked Services, Integration Runtime, Data Flows, and Triggers.
Proven experience in building and managing global data warehouse solutions, integrating data from multiple countries and ensuring localization and compliance.
Experience with Azure tool stack.
Experience with Python, PySpark, Kafka, and Kinesis for data processing and scripting.
Familiarity with Azure Synapse Analytics, Azure Data Lake, and Azure Key Vault.
Hands-on experience with any ETL tool; Informatica PowerCenter/Cloud and Oracle PL/SQL preferred. Good understanding of modern ELT practices, data ingestion patterns, and streaming pipelines.
Knowledge of data modeling, data governance, and data security principles.
Expertise in work estimation and resource management.
Experience with data privacy regulations (e.g., GDPR, HIPAA) in multi-country data environments.
Experience with pipeline automation tools like Fivetran or custom connectors.
Expertise in Databricks (on Azure) for large-scale data engineering and transformation workflows, including the use of PySpark, Scala, Delta Lake, and MLflow. Familiarity with Notebook-based collaboration and version-controlled data pipelines.
Proficiency in SQL (T-SQL or SparkSQL) for developing complex queries, views, stored procedures, and optimization. Solid experience in Python, especially data manipulation libraries like pandas, numpy, and integration with PySpark.
Experience with REST APIs and experience building/consuming APIs for data exchange. Familiarity with OAuth2.0, token-based authentication, and secure API practices in cloud environments.
Working knowledge of Microsoft Fabric (OneLake, Lakehouse, Notebooks, Pipelines) as an interactive environment for unified data analytics and collaborative workflows across Power BI, Synapse, and Data Engineering workloads.
Experience with Azure Databricks Unity Catalog and Azure Purview for data cataloging and lineage.
Experience working with structured, semi-structured (JSON, Parquet), and unstructured data.
Experience with schema design and optimization for performance in Azure.
Azure certifications (e.g., Azure Data Engineer Associate, Azure Solutions Architect Expert) are a plus.
Experience with CI/CD pipelines for data solutions using Azure DevOps is a plus.
Experience with stored procedures in SQL Server and Oracle is a plus.
The annual salary range for this position is $113,000-$182,850. Placement within the salary range is based on a variety of factors, including relevant experience, knowledge, skills, and other factors permitted by law.
Benefits available with this position include:
Medical, vision, and dental insurance,
Life insurance,
Short-term and long-term disability insurance,
401k,
Flexible paid time off,
At least 15 paid holidays per year,
Paid sick and safe leave, and
Paid parental leave.
Dentsu also complies with applicable state and local laws regarding employee leave benefits, including, but not limited to providing time off pursuant to the Colorado Healthy Families and Workplaces Act, in accordance with its plans and policies. For further details regarding Dentsu benefits, please visit www.dentsubenefitsplus.com.
To begin the application process, please click on the "Apply" button at the top of this job posting. Applications will be reviewed on an ongoing basis, and qualified candidates will be contacted for next steps.
#LI-Merkle
#LI-AB2
Location:
USA - Remote - Ohio
Brand:
Merkle
Time Type:
Full time
Contract Type:
Permanent
Dentsu is committed to providing equal employment opportunities to all applicants and employees. We do this without regard to race, color, national origin, sex, sexual orientation, gender identity, age, pregnancy, childbirth or related medical conditions, ancestry, physical or mental disability, marital status, political affiliation, religious practices and observances, citizenship status, genetic information, veteran status, or any other basis protected under applicable federal, state, or local law.
Dentsu is committed to providing reasonable accommodation to, among others, individuals with disabilities and disabled veterans. If you need an accommodation because of a disability to search and apply for a career opportunity with us, please send an e-mail to ApplicantAccommodations@dentsu.com by clicking on the link to let us know the nature of your accommodation request and your contact information. We are here to support you.