Data Architect
Latin America (LATAM), Remote, Colombia (Hybrid)
Parser
Elevating organisations through groundbreaking AI innovation. Parser is a global technology consultancy firm that specialises in custom technology solutions that enhance digital experiences and boost productivity through AI and Data. This position offers you the opportunity to join a fast-growing technology organisation that is redefining productivity paradigms in the software engineering industry. Thanks to our flexible, distributed model of global operation and the high calibre of our experts, we have enjoyed triple-digit growth over the past five years, creating amazing career opportunities for our people.
If you want to accelerate your career working with like-minded subject matter experts, solving interesting problems and building the products of tomorrow, this opportunity is for you.
Parser is searching for a meticulous and experienced Data Architect to join our talented team. Your central responsibility will be to develop, optimize, and oversee the conceptual design and logic of our clients' data systems. Your duties may include preparing architecture reports, monitoring systems, and supervising system migrations.
To succeed in this role, you should know how to examine new data system requirements and implement migration models. The ideal candidate will also have proven experience in data analysis, data management, and data engineering, as well as excellent analytical and problem-solving abilities.
The impact you'll make:
- Data Model Design: Design and optimize data models for heavy-load, high-performance systems, ensuring scalability, reliability, and responsiveness under high data volumes.
- Data Gateway Design: Architect and develop scalable data gateways to integrate disparate data sources, enabling efficient data flow and interoperability.
- Data Integration: Collaborate with stakeholders to understand data requirements and ensure optimal integration of structured, semi-structured, and unstructured data.
- AI-driven Data Validation: Implement artificial intelligence and machine learning techniques to automate data validation processes, identify inconsistencies, and certify data quality.
- Data Governance: Establish and maintain best practices for data governance, ensuring security, accuracy, and compliance with industry standards.
- Performance Optimization: Analyze and improve the performance of data pipelines and integrations to handle large-scale data efficiently.
- Technical Documentation: Develop comprehensive documentation for data gateway architectures, validation methodologies, and certification processes.
- Stakeholder Collaboration: Partner with business leaders, data scientists, and developers to align architectural solutions with organizational objectives.
Your skills:
- Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field.
- 8+ years of experience in data architecture, integration, and management.
- Strong knowledge of data gateways, APIs, and data flow management tools.
- Hands-on experience with AI and machine learning models for data validation and quality assessment.
- Proficiency in programming languages such as Python, R, or Java for AI implementations.
- Proven expertise in designing and optimizing data models for heavy-load, high-traffic, high-performance systems, with a focus on scalability, reliability, and seamless integration with existing architectures.
- In-depth understanding of ETL processes and data pipeline optimization for high-performance systems.
- Exceptional communication skills to collaborate effectively across technical and non-technical teams.
- Expertise in database technologies (SQL, NoSQL) and cloud platforms (e.g., AWS, Azure, Google Cloud).
- Experience with advanced AI and machine learning frameworks to enhance data validation and quality assurance processes.
- Knowledge of advanced data modeling and data governance tools to facilitate seamless data management across systems.
- Familiarity with Big Data technologies like Hadoop, Spark, or Kafka for managing high-scale data flows and analytics.
- Excellent English communication skills.
Some of the benefits you’ll enjoy working with us:
- The chance to work on innovative projects with leading brands that use the latest technologies to fuel transformation.
- The opportunity to be part of an amazing, multicultural community of tech experts.
- A competitive compensation package and medical insurance.
- The opportunity to grow and develop your career with the company.
- A flexible and remote working environment.
Come and join our #ParserCommunity.
Follow us on LinkedIn.