DE&A - Core - Cloud Data Engineering - Informatica Cloud
India
Zensar
Zensar is a global organization that conceptualizes, builds, and manages digital products through experience design, data engineering, and advanced analytics for over 200 leading companies. Our solutions leverage industry-leading platforms to...

We are looking for an experienced Informatica Cloud Developer with 5–8 years of hands-on experience in data integration, ETL, and Informatica Intelligent Cloud Services (IICS). The ideal candidate should have a strong understanding of cloud-based data architecture and ETL/ELT best practices, and be able to build and manage scalable data pipelines using Informatica Cloud.
Key Responsibilities:
Design, develop, and deploy data integration workflows using Informatica Intelligent Cloud Services (IICS).
Migrate and optimize ETL processes from on-premises to cloud environments.
Build scalable and reusable data pipelines for ingestion, transformation, and loading into cloud data platforms (e.g., Snowflake, Azure, AWS Redshift, GCP BigQuery).
Collaborate with data architects and business analysts to gather data integration requirements.
Implement performance tuning and error handling in ETL jobs.
Monitor and troubleshoot ETL jobs to ensure smooth data flow and accurate reporting.
Work closely with cross-functional teams including Data Engineering, BI, and Application teams.
Ensure data quality and integrity throughout the integration process.
Prepare technical documentation, unit test cases, and deployment guidelines.
Required Skills:
5–8 years of experience in ETL development with a strong focus on Informatica Cloud (IICS).
Proficiency in developing Cloud Data Integration (CDI) and Application Integration solutions.
Solid experience working with REST/SOAP APIs and JSON/XML formats.
Good understanding of SQL and relational databases (e.g., Oracle, SQL Server, PostgreSQL).
Hands-on experience with at least one cloud data platform like Snowflake, Azure Synapse, AWS Redshift, or GCP BigQuery.
Experience with job scheduling and monitoring tools.
Exposure to version control tools (e.g., Git) and CI/CD practices.
Strong problem-solving, analytical, and communication skills.
Preferred Qualifications:
Informatica IICS or PowerCenter certifications.
Knowledge of data warehousing concepts, data lakes, and cloud data architecture.
Experience with Python, Shell scripting, or automation frameworks is a plus.
Prior experience in Agile/Scrum delivery environments.
Educational Qualification:
Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or a related field.