Specialist, IT
BANGL/GDCi - BANGALORE GDCi, India
Baxter International Inc.
This is where you save and sustain lives
At Baxter, we are deeply connected by our mission. No matter your role at Baxter, your work makes a positive impact on people around the world. You'll feel a sense of purpose throughout the organization, as we know our work improves outcomes for millions of patients.
Baxter's products and therapies are found in almost every hospital worldwide, in clinics and in the home. For over 85 years, we have pioneered significant medical innovations that transform healthcare.
Together, we create a place where we are happy, successful and inspire each other. This is where you can do your best work.
Join us at the intersection of saving and sustaining lives—where your purpose accelerates our mission.
Performs development work and technical support for our data transformation and ETL jobs in support of a global data warehouse. Communicates results to internal customers. Requires the ability to work independently, as well as in cooperation with a variety of customers and other technical professionals.
Essential Duties and Responsibilities:
The primary responsibilities of this role are listed below. The incumbent will also perform other duties as assigned.
- Develop new ETL/data transformation jobs using PySpark, AWS Glue, Snowflake, and IBM DataStage on AWS (a PySpark sketch follows this list).
- Knowledge of developing data products with Starburst Galaxy is an added advantage.
- Knowledge of developing in Oracle APEX is an added advantage.
- Enhance and support existing ETL/data transformation jobs.
- Explain technical solutions and resolutions to internal customers and communicate their feedback to the ETL team.
- Perform technical code reviews for peers moving code into production.
- Perform and review integration testing before production migrations.
- Provide a high level of technical support and perform root cause analysis for problems experienced within the area of functional responsibility.
- Document technical specifications from business communications.
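To make the flavor of this work concrete, below is a minimal PySpark ETL sketch of the read-transform-write pattern the role centers on. All paths, column names, and the app name are hypothetical placeholders, not references to Baxter systems.

```python
# Minimal PySpark ETL sketch: extract raw data, apply business rules, load output.
# Every path, column, and name here is a hypothetical placeholder.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

# Extract: read raw order data from a (hypothetical) S3 landing zone.
raw = spark.read.parquet("s3://example-landing-zone/orders/")

# Transform: drop rejected rows, derive a date column, compute a net amount.
curated = (
    raw.filter(F.col("order_status") != "REJECTED")
       .withColumn("order_date", F.to_date("order_timestamp"))
       .withColumn("net_amount", F.col("gross_amount") - F.col("discount"))
)

# Load: write curated data, partitioned by date, to a warehouse zone.
curated.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-warehouse-zone/orders_curated/"
)
```

In an AWS Glue job, the same logic typically runs inside a GlueContext with Glue Data Catalog tables standing in for the raw paths; the transformation core is unchanged.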
Qualifications:
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The knowledge, skills, and abilities required are listed below. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
- 5+ years of ETL experience.
- Experience with core Python programming for data transformation.
- Advanced-level PySpark skills: able to read, understand, and debug existing code, and to write PySpark code from scratch.
- Strong understanding of PySpark architecture, with the ability to tune long-running PySpark jobs and suggest improvements to the architecture.
- Strong knowledge of SQL fundamentals, including subqueries, and the ability to tune queries with execution hints to improve performance (see the Spark SQL sketch after this list).
- Strong knowledge of Snowflake concepts and experience setting up Snowpipes (see the Snowpipe sketch after this list).
- IBM DataStage experience is nice to have.
- Able to write SQL code sufficient for most business requirements: pulling data from sources, applying rules to the data, and loading target data.
- Proven track record in troubleshooting ETL jobs and addressing production issues like performance tuning, reject handling, and ad-hoc reloads.
- Proficient in developing optimization strategies for ETL processes.
- Intermediate AWS technical support skills.
- Knowledge of Control-M batch scheduling, with the ability to run and monitor jobs via Control-M.
- Able to create clear and concise documentation and communications, including technical specifications drawn from business communications.
- Ability to coordinate and aggressively follow up on incidents and problems, perform diagnosis, and provide resolution to minimize service interruption.
- Ability to prioritize and work on multiple tasks simultaneously.
- Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently with strong communication skills.
- A self-starter who can work well independently and on team projects.
- Experienced in analyzing business requirements, defining data granularity, mapping source to target data elements, and writing full technical specifications.
- Understands data dependencies and how to schedule jobs in Control-M.
- Experienced working at the command line in various flavors of UNIX, with a basic understanding of shell scripting in Bash and Korn shell.
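As an illustration of the SQL and tuning expectations above (subqueries, execution hints, long-running job tuning), here is a small Spark SQL sketch; the tables, columns, and the shuffle-partition setting are assumptions for the example, not a prescription.

```python
# Spark SQL sketch: a subquery plus a BROADCAST execution hint, and one common
# tuning knob for long-running jobs. Tables and columns are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql_tuning_sketch").getOrCreate()

# Right-sizing shuffle parallelism is a frequent first step when tuning.
spark.conf.set("spark.sql.shuffle.partitions", "200")

spark.read.parquet("s3://example-warehouse-zone/orders_curated/") \
     .createOrReplaceTempView("orders")
spark.read.parquet("s3://example-ref-data/regions/") \
     .createOrReplaceTempView("regions")

# The subquery isolates recent orders; the BROADCAST hint ships the small
# regions table to every executor instead of shuffling both sides of the join.
result = spark.sql("""
    SELECT /*+ BROADCAST(r) */
           o.region_id, r.region_name, SUM(o.net_amount) AS total_net
    FROM (SELECT * FROM orders WHERE order_date >= '2024-01-01') o
    JOIN regions r ON o.region_id = r.region_id
    GROUP BY o.region_id, r.region_name
""")
result.show()
```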
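Likewise, "setting up Snowpipes" generally means creating a pipe over an external stage so Snowflake auto-ingests new files as they land. The sketch below uses the snowflake-connector-python package; the account, credentials, and all object names are placeholders, and a real setup would also need S3 event notifications and a storage integration.

```python
# Snowpipe setup sketch using snowflake-connector-python.
# Account, credentials, and object names are hypothetical placeholders;
# a production setup would use a storage integration, not a bare URL.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="EXAMPLE_WH",
    database="EXAMPLE_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# External stage pointing at the location the ETL jobs write to.
cur.execute("""
    CREATE STAGE IF NOT EXISTS orders_stage
    URL = 's3://example-warehouse-zone/orders_curated/'
    FILE_FORMAT = (TYPE = PARQUET)
""")

# The pipe: with AUTO_INGEST = TRUE, Snowflake loads new files as S3 event
# notifications arrive, rather than relying on scheduled COPY statements.
cur.execute("""
    CREATE PIPE IF NOT EXISTS orders_pipe AUTO_INGEST = TRUE AS
    COPY INTO orders_curated
    FROM @orders_stage
    FILE_FORMAT = (TYPE = PARQUET)
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
""")
cur.close()
conn.close()
```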
Education and/or Experience:
The following education and experience are necessary to perform the job satisfactorily.
- Bachelor of Science in Computer Science or equivalent
- 5+ years of ETL and SQL experience
- 3+ years of Python and PySpark experience
- 3+ years of AWS and UNIX experience
- 3+ years of Snowflake experience
- Preferred certifications:
  - AWS Certified Cloud Practitioner
  - Python and PySpark certifications
  - Snowflake certifications
Equal Employment Opportunity
Baxter is an equal opportunity employer. Baxter evaluates qualified applicants without regard to race, color, religion, gender, national origin, age, sexual orientation, gender identity or expression, protected veteran status, disability/handicap status or any other legally protected characteristic.
Reasonable Accommodations
Baxter is committed to working with and providing reasonable accommodations to individuals with disabilities globally. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the application or interview process, please click on the link here and let us know the nature of your request along with your contact information.
Recruitment Fraud Notice
Baxter has discovered incidents of employment scams, where fraudulent parties pose as Baxter employees, recruiters, or other agents, and engage with online job seekers in an attempt to steal personal and/or financial information. To learn how you can protect yourself, review our Recruitment Fraud Notice.