Asset & Wealth Management-Dallas-Associate-Data Engineering

Dallas, Texas, United States

Goldman Sachs

The Goldman Sachs Group, Inc. is a leading global investment banking, securities, and asset and wealth management firm that provides a wide range of financial services.

Goldman Sachs Asset & Wealth Management:

GSAM, the asset management arm of Goldman Sachs, supervises more than $1.8 trillion in assets. Goldman Sachs Asset Management has been providing discretionary investment advisory services since 1988 and has investment professionals in all major financial centers around the world.

Our team of engineers in Quant Data Engineering (QDE) helps support quantitative research and investment strategies through efficient data acquisition, transformation, and access. We build critical data pipelines that enable seamless ingestion, storage, and processing of vendor datasets of any size, format, or structure. QDE empowers researchers and portfolio managers by rapidly onboarding and productionizing datasets and by providing distributed compute infrastructure (Spark/Dask) for large-scale data analysis.
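To make the above concrete, here is a minimal sketch of the kind of ingest-transform-load pipeline described, written in PySpark. The vendor feed, column names, app name, and file paths are hypothetical placeholders for illustration, not actual QDE code.

```python
# Illustrative ETL sketch only: the vendor feed, schema, and paths below are
# hypothetical placeholders, not a real QDE pipeline.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("vendor-ingest-sketch").getOrCreate()

# Ingest: read a raw vendor CSV drop (path is a placeholder).
raw = spark.read.option("header", True).csv("/data/vendor/daily_prices.csv")

# Transform: cast types, normalize the date column, and drop malformed rows.
clean = (
    raw.withColumn("price", F.col("price").cast("double"))
       .withColumn("as_of_date", F.to_date("as_of_date", "yyyy-MM-dd"))
       .dropna(subset=["price", "as_of_date"])
)

# Load: write date-partitioned Parquet for downstream research access.
(clean.write
      .mode("overwrite")
      .partitionBy("as_of_date")
      .parquet("/data/curated/daily_prices"))
```

Writing the curated output as date-partitioned, columnar Parquet is a common choice for this kind of workload, since it keeps large research scans and date-range filters cheap.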

We are seeking an Associate Software Engineer for the Asset Management Division to join our Quant Engineering team in Dallas.

Responsibilities and Qualifications

  • Bachelor’s or Master’s degree in Computer Science or a related technical discipline.
  • Work in an Agile environment, managing the end-to-end systems development cycle from requirements analysis through coding, testing, UAT, implementation, and maintenance.
  • Work in a dynamic, fast-paced environment that provides exposure to big data platforms and research.
  • Contribute to production support and maintenance of the data platform, including incident management, troubleshooting, alert monitoring, and problem management.

Skills and Experience We Are Looking For

  • 2-5 years’ experience working as a data engineer, with hands-on experience building ETL pipelines.
  • Knowledge of software development, design, and core programming concepts in at least one of these languages: Python, Scala, Java.
  • Expertise in big data technologies such as Hadoop, Spark, and distributed computing.
  • In-depth knowledge of relational and columnar SQL databases, with SQL and PL/SQL querying skills.
  • Strong programming experience in PySpark or Scala at a minimum (see the sketch after this list).
  • Knowledge of AWS and Snowflake cloud technologies is a plus.
  • Experience with shell scripting in Unix or Linux environments.
  • Experience working in a Git-based CI/CD SDLC environment.
  • Comfortable multi-tasking, open to learning new technologies, and able to work as part of a global team.
  • Strong problem-solving and analytical skills, with excellent communication skills.
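As a hedged illustration of the PySpark and SQL querying skills listed above, the sketch below registers the placeholder dataset from the earlier example as a temporary view and runs a simple columnar-friendly aggregation; the table and column names are again hypothetical.

```python
# Illustrative only: reuses the placeholder dataset from the sketch above.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-query-sketch").getOrCreate()

# Expose the curated Parquet data to Spark SQL as a temporary view.
spark.read.parquet("/data/curated/daily_prices") \
     .createOrReplaceTempView("daily_prices")

# A typical columnar aggregation: average price per as-of date.
daily_avg = spark.sql("""
    SELECT as_of_date, AVG(price) AS avg_price
    FROM daily_prices
    GROUP BY as_of_date
    ORDER BY as_of_date
""")

daily_avg.show()
```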