Data Architect
Bucharest · North Macedonia
Encora
Encora provides its clients with tailored, innovative software engineering solutions across a wide range of leading-edge technologies.
Important Information
- Experience: 7–10+ years
- Job Mode: Full-time
- Work Mode: Hybrid
Job Summary
- Assemble large, complex datasets that meet functional and non-functional business requirements.
- Plan, create, and maintain data architectures aligned with business goals.
- Create and maintain optimal data pipeline architecture, focusing on automation and scalability.
- Identify, design, and implement internal process improvements, including automation of manual processes and optimization of data delivery.
- Propose infrastructure for optimal extraction, transformation, and loading (ETL) of data from diverse sources using SQL and Big Data technologies (a minimal ETL sketch follows this list).
- Continuously audit data management systems to ensure performance, address breaches or gaps, and report findings to stakeholders.
- Recommend analytics tools to generate actionable insights into business performance metrics, including customer acquisition and operational efficiency.
- Collaborate with stakeholders (executives, product teams, and data teams) to resolve technical issues and support data infrastructure needs.
- Build and maintain strong relationships with senior stakeholders to help them leverage Big Data technologies for business solutions.
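For illustration only: the sketch below shows the kind of batch ETL step referenced above, using PySpark to extract raw CSV files, apply a light transformation, and load partitioned Parquet. All paths, column names, and the job name are hypothetical placeholders rather than details of this role.

```python
# Minimal PySpark ETL sketch; paths, column names, and the app name are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-etl-sketch")   # hypothetical job name
    .getOrCreate()
)

# Extract: read raw CSV files dropped by an upstream source system.
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/data/raw/orders/")        # placeholder input path
)

# Transform: basic cleansing and a derived partition column.
orders = (
    raw.dropDuplicates(["order_id"])                      # assumed key column
       .filter(F.col("order_status").isNotNull())         # assumed status column
       .withColumn("order_date", F.to_date("order_ts"))   # assumed timestamp column
)

# Load: write partitioned Parquet for downstream analytics.
(
    orders.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("/data/curated/orders/")                     # placeholder output path
)

spark.stop()
```

In practice the same extract-transform-load pattern scales by swapping the placeholder file source for Kafka, a database, or cloud storage, and the Parquet sink for Delta Lake or a data warehouse.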
Responsibilities and Duties
- Assemble, optimize, and maintain large datasets tailored to business needs.
- Design and implement scalable, high-quality data architectures and pipelines.
- Automate workflows, optimize performance, and ensure scalability in infrastructure design.
- Conduct continuous performance audits of data systems and implement improvements as needed.
- Design tools to deliver actionable insights for business intelligence and analytics.
- Collaborate with cross-functional teams to address technical issues and enhance data operations.
- Support data migrations, including integration with platforms like MS Dynamics CRM or SharePoint.
- Actively participate in Agile delivery frameworks (Scrum, DSDM) to ensure quality results.
Qualifications and Skills
- Education: BS/MS in Computer Science, Engineering, Information Technology, or related field with programming experience.
- Proven experience (7–10+ years) in engineering, database modeling, design, and architecture for large-scale analytics projects.
- Expertise in SQL and relational database management, as well as Big Data technologies (Apache Spark, Databricks, Kafka, Hadoop).
- Deep knowledge of modern data architectures (e.g., Lambda architecture, Streaming, Delta Lake).
- Experience with data pipeline tools (Azure Data Factory, Airflow) and Business Intelligence tools (SSAS, Power BI, Tableau); see the orchestration sketch after this list.
- Familiarity with cloud services (Azure, AWS).
- Proficiency in programming languages such as Python, R, C#, or Java.
- Knowledge of Data Science, Machine Learning, and Artificial Intelligence trends.
- Strong understanding of industry best practices in data design, integration, and architecture.
- Experience working with Agile methodologies (Scrum, DSDM).
- Excellent English communication skills, both written and spoken.
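For illustration only: the sketch below shows a minimal Airflow DAG of the sort implied by the pipeline-tool requirement, chaining an extract task and a load task on a daily schedule. The DAG name, schedule, and task callables are hypothetical; an equivalent Azure Data Factory pipeline would instead be authored in the portal or as JSON artifacts.

```python
# Minimal Airflow DAG sketch; dag_id, schedule, and callables are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder: pull data from a source system into staging.
    print("extracting orders...")


def load_orders(**context):
    # Placeholder: load staged data into the warehouse.
    print("loading orders...")


with DAG(
    dag_id="orders_daily_sketch",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ argument; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)

    extract >> load                    # run the extract task before the load task
```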
Role-specific Requirements
- Extensive experience building and optimizing Big Data pipelines and architectures.
- Knowledge of Business Intelligence, analytics, and reporting technologies.
- Experience with data migrations and platforms such as MS Dynamics CRM and SharePoint.
- Strong knowledge of data trends, modern architectures, and scalable design.
- Customer-centric approach, with the ability to explain technical concepts to non-technical stakeholders.
- Strong communication and collaboration skills in an international and virtual team setting.
- Proven ability to deliver quality results and foster strong client relationships.
Technologies
- Big Data: Apache Spark, Databricks, Snowflake, Kafka, Hadoop
- Data Pipeline Tools: Azure Data Factory, Airflow
- Business Intelligence Tools: SSAS, Power BI, Tableau
- Cloud Services: Azure, AWS
- Programming Languages: Python, R, C#, Java
Skillset Competencies
- Advanced SQL and Big Data pipeline optimization (illustrated by the sketch after this list).
- Expertise in modern data architectures and ETL processes.
- Strong data migration and integration experience.
- Proficiency in analytics and reporting technologies.
- Excellent problem-solving, negotiation, and communication skills.
- Ability to work effectively in cross-functional, international teams.
- Strong client relationship management and quality delivery focus.
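For illustration only: the snippet below pairs Spark SQL with an assumed Delta Lake setup (for example on Databricks) to show an incremental MERGE upsert, a common pipeline optimization compared with full reloads. Table and column names are placeholders, not details of this role.

```python
# Incremental upsert sketch using Spark SQL over assumed Delta Lake tables (e.g., on Databricks).
# Outside Databricks this requires the delta-spark package to be configured on the SparkSession.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("incremental-merge-sketch").getOrCreate()

# Merge only the latest staged rows into the curated table instead of
# truncating and reloading it, keeping the pipeline incremental and cheaper.
spark.sql("""
    MERGE INTO curated.orders AS target
    USING staging.orders_updates AS source
    ON target.order_id = source.order_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```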
About Encora
Encora is a trusted partner for digital engineering and modernization, working with some of the world’s leading enterprises and digital-native companies. With over 9,000 experts in 47+ offices worldwide, Encora offers expertise in areas such as Product Engineering, Cloud Services, Data & Analytics, AI & LLM Engineering, and more. At Encora, hiring is based on skills and qualifications, embracing diversity and inclusion regardless of age, gender, nationality, or background.