Data Engineer Intern DDIT Dev. TRD
Hyderabad (Office), India
Novartis
Working together, we can reimagine medicine to improve and extend people’s lives.
Job Description Summary
- Entry-level internship for project delivery and/or operations in a less complex business sub-capability.
Job Description
MAJOR ACCOUNTABILITIES
- Assist in designing scalable data ingestion and integration solutions for smooth onboarding of business data into the TRD Data platform.
- Ensure data is ‘fit for use’ by applying business and technical rules during its lifecycle.
- Identify automation opportunities or accelerators where possible.
- Build data pipelines using Python, CI/CD, and transformation workflows to process and integrate various data sets from the ingestion layer to the consumption layer (see the Python sketch after this list).
- Align with Solution Architects and vendors on best practices.
- Apply data management practices covering data quality, data modeling, harmonization, standards, and ontologies.
- Handle metadata effectively and leverage enterprise ontology management.
- Ensure FAIR data principles are adhered to wherever applicable.
- Perform requirement scoping assessments to determine project feasibility.
- Highlight gaps in existing functionality and review requirements with stakeholders.
- Develop comprehensive requirement specifications to estimate cost, time, and resources for deploying solutions.
- Liaise with the service development team to suggest high-level functional solutions.
- Develop project estimates and complete financial models (costs, savings, revenue opportunities, investment horizon, etc.).
- Ensure relevant stakeholders are involved in specifying new services and/or major upgrades to existing services.
- Ensure the overall user experience is considered when designing and deploying data solutions and services.
- Ensure implemented solutions are according to specifications and fit for purpose.
- Support end-user training and self-service activities.
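To make the pipeline accountability above concrete, here is a minimal sketch of an ingestion-to-consumption flow in Python with pandas. The file names, column names, and quality rules are assumptions chosen for illustration, not details of the actual TRD Data platform or its tooling.

```python
# Minimal sketch of an ingestion-to-consumption pipeline in Python with pandas.
# File names, column names, and the quality rules below are illustrative
# assumptions, not details of the actual TRD Data platform.
import pandas as pd


def ingest(path: str) -> pd.DataFrame:
    """Ingestion layer: load a raw business data extract."""
    return pd.read_csv(path)


def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Transformation layer: apply simple business and technical rules
    so the data is 'fit for use' downstream."""
    df = raw.copy()
    df.columns = [c.strip().lower() for c in df.columns]  # harmonize column names
    df = df.dropna(subset=["batch_id"])                   # technical rule: key must be present
    df = df[df["measurement"] >= 0]                       # business rule: no negative readings
    return df


def publish(df: pd.DataFrame, path: str) -> None:
    """Consumption layer: write a curated, analysis-ready data set."""
    df.to_csv(path, index=False)


if __name__ == "__main__":
    publish(transform(ingest("raw_batches.csv")), "curated_batches.csv")
```

In a real setting each stage would be version-controlled and exercised by a CI/CD pipeline rather than run by hand, but the ingest/transform/publish split is the pattern the role describes.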
KEY PERFORMANCE INDICATORS / MEASURES OF SUCCESS
- Feedback on project execution (implementation quality, time)
- Programming and process efficiency
- Steady/error-free data process flow
- Completeness and accuracy of project deliverables
- Documentation and testing of activities kept up to date
IDEAL BACKGROUND
Education:
- Currently pursuing an Engineering Degree in Informatics, Computer Sciences, Life Sciences, or a related field.
Skills & Knowledge:
- Basic knowledge of Python programming as well as ETL and BI concepts
- Familiarity with generative AI and Streamlit for creating interactive data applications (see the sketch after this list)
- Understanding of data ingestion patterns in the AWS cloud and with third-party tools
- Basic SQL query writing skills on relational databases like Oracle/MS SQL Server
- Understanding of data consumption topics like data science, reports, dashboards, and KPIs
- Familiarity with ETL tools like Alteryx and BI tools like Power BI
- Understanding of metadata and data discovery techniques
- Basic knowledge of AWS cloud data integrations and data management technologies
- Familiarity with data architecture and data engineering concepts such as data modeling, data lakes, and data analytics
- Strong analytical, critical thinking, and problem-solving skills
- Experience working on Agile projects and with Agile methodologies (preferred)
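As a small illustration of the SQL and Streamlit items above, the sketch below builds an interactive data application over a toy relational table. The table name, columns, and sample rows are assumptions for this example and stand in for a real source such as Oracle or MS SQL Server.

```python
# Minimal sketch of a Streamlit data app backed by a basic SQL query.
# The table, columns, and sample rows are invented for illustration and stand in
# for a real relational source such as Oracle or MS SQL Server.
import sqlite3

import pandas as pd
import streamlit as st


@st.cache_data
def load_data() -> pd.DataFrame:
    # Stand-in for a real database: an in-memory SQLite table with toy rows.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE batches (site TEXT, product TEXT, measurement REAL)")
    con.executemany(
        "INSERT INTO batches VALUES (?, ?, ?)",
        [("Hyderabad", "A", 4.2), ("Hyderabad", "B", 3.9), ("Basel", "A", 4.5)],
    )
    # Basic SQL: aggregate measurements per site and product.
    df = pd.read_sql_query(
        "SELECT site, product, AVG(measurement) AS avg_measurement "
        "FROM batches GROUP BY site, product",
        con,
    )
    con.close()
    return df


st.title("Batch overview (illustrative sketch)")
df = load_data()
site = st.selectbox("Site", sorted(df["site"].unique()))        # interactive filter
filtered = df[df["site"] == site]
st.dataframe(filtered)                                           # tabular detail
st.bar_chart(filtered.set_index("product")["avg_measurement"])   # simple aggregated view
```

Saved as, say, app.py, the sketch would be launched with "streamlit run app.py"; the same pattern extends to real database connections, dashboards, and KPI views.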
Skills Desired
Budget Management, Business Acumen, Performance Management, Planning, Risk Management, Service Delivery Management, Stakeholder Management, Waterfall Model
Perks/benefits: Career development