Solution Engineer & Technical Project Manager
Remote, RO
NTT DATA Romania
Who we are
We are seeking a highly skilled and proactive IT Technical Project Manager & Solution Engineer to lead complex, data-intensive projects from conception through delivery. This role demands a unique combination of hands-on technical expertise in big data and data warehouse platforms, strategic planning, and agile project management. You will work closely with data engineers, architects, business stakeholders, and cross-functional teams to build scalable, efficient, and reliable data solutions.
What you'll be doing
- Manage and oversee daily activities of the data warehouse, including image creation, ETL/ELT processes, data loads, performance monitoring, and troubleshooting.
- Proactively engage with the relevant stakeholders to address security and compliance topics.
- Provide hands-on support and development in managing data pipelines, SQL optimization, and data validation processes.
- Collaborate with data engineering, analytics, BI, and infrastructure teams to ensure seamless data availability and accuracy.
- Proactively monitor and optimize warehouse performance, ensuring uptime, scalability, and cost-efficiency.
- Develop and enforce best practices for data quality, governance, backup, and disaster recovery.
- Create and maintain documentation for data workflows, data dictionaries, operational procedures, and technical designs.
- Participate in and lead root cause analysis efforts to identify and resolve data-related issues quickly.
- Evaluate and implement new tools, automation scripts, or practices that improve the reliability and efficiency of warehouse operations.
- Lead end-to-end project lifecycle of big data solutions, ensuring timely delivery and alignment with business goals.
- Define project scope, timelines, resources, milestones, and deliverables.
- Coordinate and collaborate with cross-functional teams, vendors, and stakeholders.
- Apply Agile/Scrum or other project management methodologies to drive iterative development and continuous improvement.
- Manage risks, issues, and dependencies across project phases.
- Architect and implement big data solutions using platforms such as Hadoop, Spark, Kafka, Hive, and cloud-native technologies (e.g., AWS, Azure, GCP).
- Develop and optimize ETL/ELT pipelines and data integration strategies.
- Design scalable data architectures and storage solutions that support analytics, machine learning, and real-time data processing.
- Ensure data quality, security, and compliance standards are met.
- Conduct code reviews and guide development best practices across teams.
What you'll bring along
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or related field.
- Minimum 3-5 years of experience in big data engineering or solution architecture, combined with experience in a project management or technical leadership role.
- Strong experience with Apache Spark, Kafka, or similar technologies.
- Proficiency with cloud platforms (AWS, Azure) and modern data tools (Synapse/Fabric, Databricks, Snowflake, etc.).
- Expertise in data modeling, data lakes, and data warehouses.
- Hands-on experience with CI/CD pipelines, Git, Jenkins, or similar DevOps tools.
- Knowledge of programming languages: Python, Scala, Java, or SQL.
- Background in data governance, security, and compliance (e.g., GDPR, DORA).
- Familiarity with agile methodologies and incident management systems (e.g., Jira, PagerDuty).
- Experience in financial services or telecom domains.
- Familiarity with data governance frameworks and tools (e.g., IDMC/Informatica).
- Strong communication, leadership, and stakeholder management abilities.
- Familiarity with model-driven approaches to guide and automate the design, development, and maintenance of software systems (and with tools such as dbt Core/Platform).
- Excellent command of both spoken and written English.
Perks/benefits: Career development