Solution Architect
US - NJ - Birlasoft Office
Birlasoft
At Birlasoft we combine the power of domain, enterprise, and digital technologies to reimagine business potential. Surpassing expectations, breaking convention!
Job Title: Data Architect
Location: Raritan, New Jersey
Client Name: Johnson & Johnson
Job Type: Full-Time
Role Overview
The Data Architect will be responsible for designing, creating, deploying, and managing the organization’s data architecture while ensuring optimal data integration, storage, and analysis capabilities. The role works closely with data engineers and analysts to develop scalable data pipelines and analytical tools that support decision-making. The ideal candidate will combine deep technical expertise with a strategic mindset to drive data-driven initiatives across the organization.
Key Responsibilities
Data Architecture Design: Develop and maintain an end-to-end data architecture strategy, ensuring scalability, security, and alignment with business goals.
Data Engineering: Lead the design and implementation of robust, scalable, and efficient data pipelines to ingest, process, and store data from diverse data sources.
Data Modelling: Create and maintain conceptual, logical, and physical data models to support databases, data warehouses, and data lakes.
Data Integration: Implement and oversee ETL (Extract, Transform, Load) processes for seamless integration of structured and unstructured data from multiple sources.
Data Analysis: Collaborate with data analysts to ensure data accessibility and deliver actionable insights from business data. Develop architecture that supports advanced analytics, reporting, and machine learning use cases.
Data Governance: Implement and enforce best practices for data governance, data quality, and data stewardship, ensuring compliance with privacy regulations (e.g., GDPR, CCPA).
Data Performance Optimization: Monitor and optimize data systems to ensure high availability, reliability, and performance.
Data Warehousing: Oversee the design and maintenance of data warehouses and data lakes to support analytical queries and real-time reporting.
Data Security & Compliance: Ensure the data architecture adheres to data security protocols and industry standards, implementing encryption, masking, and other privacy measures.
Collaboration: Work closely with data engineers, analysts, business stakeholders, and IT teams to define data needs, data flow, and analytics requirements.
Data Tooling: Identify and recommend tools and technologies for data management, analysis, and reporting, based on current trends and business needs.
Documentation: Prepare comprehensive documentation for data architecture, processes, and workflows, ensuring clear communication between teams.
Problem Solving: Lead troubleshooting efforts for data-related issues, providing strategic solutions that support long-term goals.
Required Experience:
- 10+ years of experience in data architecture, data engineering, or data management.
- Strong experience in building and optimizing data pipelines, architectures, and data sets.
- Proven experience in data modelling, database design, and data integration.
- Experience with both structured (SQL) and unstructured (NoSQL, Hadoop) data management.
- Demonstrated experience in data analysis and delivering data-driven insights to stakeholders.
- Experience with cloud platforms for data management (e.g., AWS, Azure, Google Cloud).
- Knowledge of data governance and compliance frameworks.
Technical Skills:
- Proficiency with ETL tools (e.g., Talend, Informatica) and data pipeline orchestration frameworks (e.g., Apache Airflow, AWS Glue).
- Strong SQL and Python programming skills.
- Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) and big data technologies (e.g., Hadoop, Spark).
- Familiarity with data analysis and visualization tools (e.g., Tableau, Power BI, Looker).
- Experience with advanced analytics and machine learning platforms is a plus.
- Knowledge of data lake architecture and working with real-time streaming data solutions (e.g., Kafka, Kinesis) is highly desirable.
Preferred Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Data Science, Information Systems, or a related field.
- Certifications such as AWS Certified Solutions Architect, Google Cloud Professional Data Engineer, or Microsoft Certified: Azure Data Engineer Associate.
- Strong problem-solving and analytical thinking with experience in delivering strategic data solutions.
- Excellent communication skills to interact with both technical and non-technical stakeholders.
- Experience with Agile methodologies and data engineering best practices.