Manager, Data Engineer - PGIM Global Services (Newark, NJ or Tampa, FL - Hybrid)
Prudential Tower, 655 Broad Street, Newark, NJ, United States
Full Time Mid-level / Intermediate USD 120K+
Prudential Financial
Helping individuals and institutions improve their financial wellness through life & health insurance, retirement services, annuities, and investment products.
Job Classification: Technology - Engineering & Cloud
A GLOBAL FIRM WITH A DIVERSE & INCLUSIVE CULTURE
As the Global Asset Management business of Prudential, we’re always looking for ways to improve financial services. We’re passionate about making a meaningful impact - touching the lives of millions and solving financial challenges in an ever-changing world.
We also believe talent is key to achieving our vision and are intentional about building a culture of respect and collaboration. When you join PGIM, you’ll unlock a motivating and impactful career – all while growing your skills and advancing your profession at one of the world’s leading global asset managers!
If you’re not afraid to think differently and challenge the status quo, come and be a part of a dedicated team that’s investing in your future by shaping tomorrow today.
At PGIM, You Can!
Your Team & Role
As a Lead Data Engineer at PGIM Global Services, you will partner with talented architects, infrastructure engineers, machine learning engineers, data scientists, and data analysts to improve our core data platforms and supporting applications. You will analyze, design, develop, test, and perform ongoing maintenance to build high-quality data pipelines that drive platform delivery. You will implement capabilities to solve sophisticated business problems and deploy innovative products, services, and experiences that delight our customers! In addition to advanced technical expertise and experience, you will bring excellent problem-solving, communication, and teamwork skills, along with agile ways of working, strong business insight, an inclusive leadership attitude, and a continuous learning focus to all that you do.
This role is based in our office in Newark, NJ or Tampa, FL. Our organization follows a hybrid work structure where employees can work remotely and from the office, as needed, based on the demands of specific tasks or personal work preferences. This position is hybrid and requires your on-site presence on a recurring weekly basis, at least 3 days per week.
Here is What You Can Expect on a Typical Day
- Build and optimize data pipelines, logic, and storage systems with the latest coding practices and industry standards, modern design patterns, and architectural principles; remove technical impediments.
- Develop high-quality, well-documented, and efficient code adhering to all applicable Prudential standards.
- Conduct complex data analysis and report on results, prepare data for prescriptive and predictive modeling, and combine raw information from different sources.
- Collaborate with data analysts, scientists, and architects on data projects to enhance data acquisition, transformation, organization processes, data reliability, efficiency, and quality.
- Write unit tests, integration tests, and functional automation; research problems discovered by quality assurance or product support and develop solutions to address them.
- Bring a strong understanding of relevant and emerging technologies, provide input and coach team members, and embed learning and innovation in the day-to-day.
- Work on complex problems in which analysis of situations or data requires an evaluation of intangible variables.
- Use programming languages including but not limited to Python, R, SQL, Java, Scala, PySpark/Apache Spark, and shell scripting; a minimal PySpark pipeline sketch follows this list.
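To illustrate the kind of day-to-day pipeline work described above, here is a minimal, hedged PySpark sketch; the storage account, paths, column names, and values are hypothetical assumptions for the example, not details from this posting:

```python
# Illustrative sketch only: read a hypothetical raw trades extract, apply a few
# simple transformations, and write a curated Parquet dataset.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate-trades").getOrCreate()

# Hypothetical raw-zone path; in practice this might point at ADLS Gen2 or a lakehouse table.
raw = spark.read.option("header", True).csv(
    "abfss://raw@examplestorage.dfs.core.windows.net/trades/"
)

curated = (
    raw
    .withColumn("trade_date", F.to_date("trade_date", "yyyy-MM-dd"))
    .withColumn("notional", F.col("quantity").cast("double") * F.col("price").cast("double"))
    .dropDuplicates(["trade_id"])
    .filter(F.col("notional") > 0)
)

# Hypothetical curated-zone path, partitioned by trade date for downstream analytics.
curated.write.mode("overwrite").partitionBy("trade_date").parquet(
    "abfss://curated@examplestorage.dfs.core.windows.net/trades/"
)
```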
The Skills & Expertise You Bring
- Bachelor's degree in Computer Science or Engineering, or equivalent experience in a related field.
- Experience working with DevOps automation tools and practices; knowledge of the full software development life cycle (SDLC).
- Ability to coach others with minimal guidance and effectively leverage diverse ideas, experiences, thoughts, and perspectives to the benefit of the organization.
- Knowledge of business concepts, tools, and processes that are needed for making sound decisions in the context of the company's business.
- Ability to learn new skills and knowledge on an ongoing basis through self-initiative and tackling challenges.
- Excellent problem-solving, communication, and collaboration skills; enjoy learning new skills!
- Advanced experience and/or expertise with several of the following:
- Programming Languages: Python, R, SQL, Java, Scala, PySpark/Apache Spark, shell scripting.
- Data Ingestion, Integration & Transformation: Moving data of varying sources, formats, and volumes into analytics platforms through various tools; preparing data for further analysis by transforming, mapping, and wrangling raw data to generate insights. Extensive knowledge of Microsoft Fabric.
- Database Management Systems: Storing, organizing, managing, and delivering data using relational DBs, NoSQL DBs, Graph DBs, and data warehouse technologies including Azure SQL Database and Azure Synapse Analytics.
- Database Tools: Data architecture to store, organize, and manage data. Experience with SQL and NoSQL based databases for storage and processing of structured, semi-structured & unstructured data.
- Real-Time Analytics: Azure Stream Analytics, Azure Event Hubs.
- Data Buffering: Azure Event Hubs, Azure Service Bus.
- Workflow Orchestration: Azure Data Factory, Azure Logic Apps.
- Data Visualization: Power BI, MS Excel.
- Data Lakes & Warehousing: Building Data Models, Data Lakes, and Data Warehousing using Azure Data Lake Storage and Azure Synapse Analytics.
- Data Protection and Security: Knowledge of data protection, security principles, and services; data loss prevention, role-based access controls, data encryption, data access capture, and core security services.
- Common Infrastructure as Code (IaC) Frameworks: Azure Resource Manager (ARM) templates, Terraform.
- Cloud Computing: Knowledge of Azure architectural principles, fundamentals, and services; strong ability to write and deploy code using Azure services; end-to-end knowledge of Microsoft Fabric.
- Testing/Quality: Unit, interface, and end-user testing concepts and tooling, inclusive of non-functional requirements (performance, usability, reliability, security/vulnerability scanning, etc.), including how testing is integrated into DevOps; accessibility awareness. A brief unit-test sketch follows this list.
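As one hedged illustration of the unit-testing expectation above, the following pytest sketch exercises a hypothetical PySpark transformation against a local SparkSession; the function name, schema, and values are assumptions made for the example:

```python
# Illustrative unit test: run a small, pure transformation function locally
# and assert on the computed column.
import pytest
from pyspark.sql import SparkSession, functions as F


def add_notional(df):
    """Hypothetical transformation under test: notional = quantity * price."""
    return df.withColumn("notional", F.col("quantity") * F.col("price"))


@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()


def test_add_notional_computes_quantity_times_price(spark):
    df = spark.createDataFrame([(10, 2.5), (4, 100.0)], ["quantity", "price"])
    result = add_notional(df).orderBy("quantity").collect()
    assert [row.notional for row in result] == [400.0, 25.0]
```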
Preferred Qualifications
- Serverless data pipeline development using Azure Functions and Azure Logic Apps (see the sketch after this list).
- Relevant certifications in Azure technologies.
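For the serverless preference above, a minimal Azure Functions sketch (Python v2 programming model) might look like the following; the container name, connection setting, and downstream hand-off are hypothetical assumptions:

```python
# Illustrative blob-triggered function: fires when a new file lands in a
# hypothetical "raw" container and logs it before a downstream hand-off.
import logging

import azure.functions as func

app = func.FunctionApp()


@app.blob_trigger(arg_name="blob", path="raw/{name}", connection="AzureWebJobsStorage")
def process_new_file(blob: func.InputStream):
    # In a real pipeline this step might validate the file and then trigger an
    # Azure Data Factory or Synapse pipeline, or publish an event for consumers.
    logging.info("Received blob %s (%d bytes)", blob.name, blob.length)
```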
Note: Prudential is required by state-specific laws to include the salary range for this role when hiring a resident in applicable locations. The salary range for this role is from $120,000 to $155,000. Specific pricing for the role may vary within the above range based on many factors including geographic location, candidate experience, and skills. Roles may also be eligible for additional compensation and/or benefits. Eligibility to participate in a discretionary annual incentive program is subject to the rules governing the program, whereby an award, if any, depends on various factors including, without limitation, individual and organizational performance.
#LI-LR1 #LI-Hybrid
What we offer you:
- Market competitive base salaries, with a yearly bonus potential at every level.
- Medical, dental, vision, life insurance, disability insurance, Paid Time Off (PTO), and leaves of absence, such as parental and military leave.
- 401(k) plan with company match (up to 4%).
- Company-funded pension plan.
- Wellness Programs including up to $1,600 a year for reimbursement of items purchased to support personal wellbeing needs.
- Work/Life Resources to help support topics such as parenting, housing, senior care, finances, pets, legal matters, education, emotional and mental health, and career development.
- Education Benefit to help finance traditional college enrollment toward obtaining an approved degree and many accredited certificate programs.
- Employee Stock Purchase Plan: Shares can be purchased at 85% of the lower of two prices (beginning or end of the purchase period), after one year of service.
To find out more about our Total Rewards package, visit Work Life Balance | Prudential Careers. Some of the above benefits may not apply to part-time employees scheduled to work less than 20 hours per week.
Prudential Financial, Inc. of the United States is not affiliated with Prudential plc, which is headquartered in the United Kingdom.
Prudential is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, ancestry, sex, sexual orientation, gender identity, genetics, disability, marital status, age, veteran status, domestic partner status, medical condition, or any other characteristic protected by law.
If you need an accommodation to complete the application process, please email accommodations.hw@prudential.com.
If you are experiencing a technical issue with your application or an assessment, please email careers.technicalsupport@prudential.com to request assistance.