Data Engineer, ETL/Hadoop Developer
Budapest, Budapest, Hungary
MP Solutions Ltd.
MPS RPO: full-scale outsourcing of recruitment processes, IT consulting, and software engineering.
Primary responsibilities
- Design and development of ETL and Hadoop/Snowflake applications.
- Undertaking end-to-end project delivery (from inception to post-implementation support), including review and finalization of business requirements, creation of functional specifications and/or system designs, and ensuring that end-solution meets business needs and expectations.
- Providing deployment support, including occasional late-hour and weekend work.
- Development of new transformation processes to load data from source to target, and performance tuning of existing ETL code (mappings, sessions) and the Hadoop/Snowflake platform.
- Analysis of existing designs and interfaces and applying design modifications or enhancements.
- Coding and documenting data processing scripts and stored procedures.
- Providing business insights and analysis findings for ad-hoc data requests.
- Testing software components and complete solutions (including debugging and troubleshooting) and preparing migration documentation. Providing reporting-line transparency through periodic updates on project or task status.
Requirements
- Bachelor's/Master's degree in engineering, preferably Computer Science/Engineering.
- 3+ years of experience with the technical analysis, design, development, and implementation of data warehousing/data lake solutions.
- Strong SQL programming and stored procedure development skills.
- 2+ years of experience developing in Informatica or any other ETL tool.
- 2+ years of relational database experience.
- Strong UNIX Shell scripting experience to support data warehousing solutions.
- Process-oriented, with a focus on standardization, streamlining, and a best-practices delivery approach.
- Excellent problem-solving and analytical skills.
- Excellent verbal and written communication skills.
- Experience in optimizing large data loads.
Advantages
- Understanding/experience in Hive/Impala/Spark/Snowflake.
- Experience with Teradata is a big plus.
- Ability to architect an ETL solution and data conversion strategy.
- Exposure to an Agile Development environment.
- Knowledge about TWS Scheduler.
- Strong understanding of the data warehousing domain.
- Good understanding of dimensional modelling.
- A good team player.
Benefits
- The opportunity to gain experience in exciting, long-term, innovative projects
- Flexible working arrangements (core hours and the opportunity to work from home)
- Work in a multinational team/environment
- A team of great engineers
- Cafeteria