Feature Lead - Insights
Newark, United States
Bank of America
What would you like the power to do? At Bank of America, our purpose is to help make financial lives better through the power of every connection.
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. We do this by driving Responsible Growth and delivering for our clients, teammates, communities and shareholders every day.
Being a Great Place to Work is core to how we drive Responsible Growth. This includes our commitment to being an inclusive workplace, attracting and developing exceptional talent, supporting our teammates’ physical, emotional, and financial wellness, recognizing and rewarding performance, and how we make an impact in the communities we serve.
Bank of America is committed to an in-office culture, with specific requirements for office-based attendance, while allowing an appropriate level of flexibility for our teammates and businesses based on role-specific considerations.
At Bank of America, you can build a successful career with opportunities to learn, grow, and make an impact. Join us!
Job Description:
This job is responsible for providing leadership, technical direction, and oversight to a team delivering technology solutions. Key responsibilities include overseeing the design, implementation, and maintenance of complex computer programs, aligning technical solutions to business objectives, and ensuring that coding practices and quality comply with software development standards. Job expectations include leading multiple software implementations and applying both depth and breadth of knowledge across several technical competencies.
Position Summary:
- Develop high-performance and scalable Analytics solutions using the Big Data platform to facilitate the collection, storage, and analysis of massive data sets from multiple channels.
- Utilize your in-depth knowledge of Hadoop stack and storage technologies, including HDFS, Spark, MapReduce, YARN, Hive, Sqoop, Impala, Hue, and Oozie, to design and optimize data processing workflows.
- Apply your expertise in NoSQL technologies like MongoDB, SingleStore, or HBase to efficiently handle diverse data types and storage requirements.
- Implement near-real-time and streaming data solutions to provide up-to-date information to millions of Bank customers.
- Collaborate with cross-functional teams to identify system bottlenecks, benchmark performance, and propose innovative solutions to enhance system efficiency.
- Take ownership of defining Big Data strategies and roadmaps for the enterprise, aligning them with business objectives.
- Stay abreast of emerging technologies and industry trends related to Big Data, continuously evaluating new tools and frameworks for potential integration.
- Provide guidance and mentorship to junior teammates.
Responsibilities:
- Designs, develops and is accountable for feature delivery
- Applies enterprise standards for solution design, coding and quality
- Ensures solution meets product acceptance criteria with minimal technical debt
- Guides the team on work breakdown and execution
- Works with the Product Owner to ensure that product backlog/requirements are healthy, with clear acceptance criteria
- Plays a team lead role (as an individual contributor) and mentors the team
- Guides team members with skills and practices (planning and estimation, peer reviews, and other engineering practices)
Required Qualifications:
- Bachelor's or Master's degree in Science or Engineering, or a related field.
- Minimum of 8 years of industry experience, with at least 5 years focused on hands-on work in the Big Data domain.
- Highly skilled in Hadoop stack technologies such as HDFS, Spark, YARN, Hive, Sqoop, Impala, and Hue.
- Strong proficiency in programming languages such as Python, Scala, and Bash/Shell Scripting.
- Excellent problem-solving abilities and the capability to deliver effective solutions for business-critical applications.
- Strong command of Visual Analytics Tools, with a focus on Tableau.
Desired Qualifications:
- Proficiency in NoSQL technologies like HBase, MongoDB, SingleStore, etc.
- Experience in Real-time streaming technologies like Spark Streaming, Kafka, Flink, or Storm.
- Familiarity with Cloud Technologies such as Azure, AWS, or GCP.
- Working knowledge of machine learning algorithms, statistical analysis, and programming languages (Python or R) to conduct data analysis and develop predictive models to uncover valuable patterns and trends.
- Proficiency in Data Integration and Data Security within the Hadoop ecosystem, including knowledge of Kerberos.
Skills:
- Automation
- Influence
- Result Orientation
- Stakeholder Management
- Technical Strategy Development
- Architecture
- Business Acumen
- Risk Management
- Solution Delivery Process
- Solution Design
- Agile Practices
- Analytical Thinking
- Collaboration
- Data Management
- DevOps Practices
Shift:
1st shift (United States of America)
Hours Per Week:
40
Perks/benefits: Career development