Specialist IS Engineer
India - Hyderabad
Amgen
Amgen is committed to unlocking the potential of biology for patients suffering from serious illnesses by discovering, developing, manufacturing and delivering innovative human therapeutics.
Career Category: Information Systems
Job Description
Role Description:
In this vital role as Specialist IS Engineer, you will be responsible for designing, developing, and maintaining software applications and solutions that meet business needs, and for ensuring the availability and performance of critical systems and applications. You will build and maintain scalable software solutions that address complex business needs, with a focus on establishing robust data foundations for data repositories, integrations, and reporting. This role ensures system performance and availability while minimizing downtime through automation and proactive incident management.
The ideal candidate will collaborate with product owners, architects, and engineers to design, implement and manage next-generation metrics engine and data governance capability, including metadata and reference data management for analytics. Additional responsibilities include designing interfaces, workflows, and data models, conducting data mapping across systems, optimizing data migration, and deploying integrations in development and production environments, ensuring compliance and operational excellence.
This position is perfect for a collaborative, detail-oriented professional passionate about enhancing clinical operations through advanced data engineering and integrations.
Roles & Responsibilities:
Collaborate closely with product owners, data architects, business SMEs and engineers to develop and deliver high-quality solutions, enhancing and maintaining integrations across clinical systems.
Design and architect the next-generation metrics engine on modern infrastructure to support operational analytics leveraging cloud technologies.
Design and implement a new data governance capability incorporating essential features like metadata and reference data management for analytics.
Take ownership of complex software projects from conception to deployment, managing scope, risk, and timelines.
Utilize rapid prototyping skills to quickly translate concepts into working solutions and code.
Leverage modern AI/ML technologies to enable predictive analytics, NLP/NLQ capabilities, and enhance the overall data analytics process.
Analyze functional and technical requirements of applications, translating them into software architecture and design specifications.
Develop and execute unit tests, integration tests, and other testing strategies to ensure software quality and reliability.
Integrate systems and platforms to ensure seamless data flow, functionality, and interoperability.
Provide ongoing support and maintenance for applications, ensuring smooth and efficient operation.
Collaborate on building advanced analytics capabilities to empower data-driven decision-making and operational insights.
Provide technical guidance and mentorship to junior developers, fostering team growth and skill development.
Basic Qualifications and Experience:
Doctorate Degree OR
Master’s degree with 4 - 6 years of experience in Computer Science, IT or related field OR
Bachelor’s degree with 6 - 8 years of experience in Computer Science, IT or related field OR
Diploma with 10 - 12 years of experience in Computer Science, IT or related field OR
Diploma with 14 - 18 years of experience in Computer Science, IT or related field
Functional Skills:
Must-Have Skills:
Strong understanding of cloud platforms (e.g., AWS, GCP, Azure) and containerization technologies (e.g., Docker, Kubernetes)
Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.)
Hands-on experience with programming languages (e.g., SQL, C++, JavaScript) and markup (e.g., XML)
Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, SparkSQL), including workflow orchestration and performance tuning of big data processing
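As a small illustration of the hands-on SQL skill above, here is a minimal sketch using Python's built-in sqlite3 module in place of a production RDBMS such as Redshift or Postgres; the table and column names are hypothetical:

```python
import sqlite3

# In-memory SQLite stands in for a production RDBMS (Redshift, Postgres, etc.);
# the clinical_metrics table and its columns are hypothetical examples.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE clinical_metrics (study_id TEXT, site TEXT, enrolled INTEGER)"
)
conn.executemany(
    "INSERT INTO clinical_metrics VALUES (?, ?, ?)",
    [("S-001", "Hyderabad", 40), ("S-001", "Boston", 25), ("S-002", "Hyderabad", 10)],
)

# A typical operational-analytics aggregation: total enrollment per study.
rows = conn.execute(
    """
    SELECT study_id, SUM(enrolled) AS total_enrolled
    FROM clinical_metrics
    GROUP BY study_id
    ORDER BY study_id
    """
).fetchall()
print(rows)  # [('S-001', 65), ('S-002', 10)]
```

The same GROUP BY / aggregate pattern carries over directly to SparkSQL on Databricks.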
Good-to-Have Skills:
Experience with software DevOps CI/CD tools, such as Git, Jenkins, Linux, and shell scripting
Experience with Spark, Hive, Kafka, Kinesis, Spark Streaming, and Airflow
Experience with data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines
Experience working in an agile environment (e.g., user stories, iterative development)
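The extract-transform-load pipeline experience listed above can be sketched in miniature with plain Python; function and field names here are illustrative, and a real pipeline would read from source systems and write to a warehouse, typically via Spark or an orchestrator such as Airflow:

```python
from typing import Dict, Iterable, List

# Hypothetical extract step: in practice this would pull from files,
# APIs, or database tables rather than a literal.
def extract() -> List[Dict[str, str]]:
    return [
        {"site": "hyd", "visits": "12"},
        {"site": "bos", "visits": "7"},
        {"site": "hyd", "visits": "3"},
    ]

# Transform step: normalize string fields to integers and aggregate per site.
def transform(records: Iterable[Dict[str, str]]) -> Dict[str, int]:
    totals: Dict[str, int] = {}
    for rec in records:
        totals[rec["site"]] = totals.get(rec["site"], 0) + int(rec["visits"])
    return totals

# Load step: returned here for demonstration; a real pipeline would
# write the result to a target table.
def load(totals: Dict[str, int]) -> Dict[str, int]:
    return totals

result = load(transform(extract()))
print(result)  # {'hyd': 15, 'bos': 7}
```

Keeping each stage a pure function like this also makes the unit and integration testing mentioned in the responsibilities straightforward.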
Soft Skills:
Excellent analytical and troubleshooting skills
Strong verbal and written communication skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Strong presentation and public speaking skills
Perks/benefits: Career development