Senior Data Engineer

Remote - India


ABOUT OPORTUN

Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009.

WORKING AT OPORTUN


Working at Oportun means enjoying a differentiated experience as part of a team that fosters a diverse, equitable, and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and our ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups.

Engineering Business Unit Overview:

The charter of the Engineering group at Oportun is to be the world-class engineering force behind our innovative products. The group plays a vital role in designing, developing, and maintaining cutting-edge software solutions that power our mission and advance our business. We strike a balance between leveraging leading tools and developing in-house solutions to create member experiences that empower financial independence.

The talented engineers in this group are dedicated to delivering and maintaining performant, elegant, and intuitive systems to our business partners and retail members. Our platform combines service-oriented platform features with sophisticated user experience and is enabled through a best-in-class (and fun to use!) automated development infrastructure. We prove that FinTech is more fun, more challenging, and in our case, more rewarding as we build technology that changes our members’ lives.

Engineering at Oportun is responsible for high-quality, scalable technical execution to achieve business goals and the product vision. The group ensures business continuity for members by effectively managing systems and services, overseeing technical architectures and system health. In addition, it is responsible for identifying and executing on the technical roadmap that enables the product vision and fosters member and business growth in a scalable and efficient manner.

The Enterprise Data and Technology (EDT) pillar within the Engineering Business Unit focuses on enabling wide use of corporate data assets while ensuring quality, availability, and security across the data landscape.

Position Overview:

As a Senior Data Engineer at Oportun, you will be a key member of our EDT team, responsible for designing, developing, and maintaining sophisticated software and data platforms in service of the engineering group's charter. Your mastery of a technical domain enables you to take on business problems and solve them with technical solutions. With your depth of expertise and leadership abilities, you will actively contribute to architectural decisions, mentor junior engineers, and collaborate closely with cross-functional teams to deliver high-quality, scalable software solutions that advance our impact in the market. In this role you will have the opportunity to lead the technology effort, from technical requirements gathering to final successful delivery of the product, for large initiatives (cross-functional, multi-month projects).

Responsibilities:

1. Data Architecture and Design:
  • Lead the design and implementation of scalable, efficient, and robust data architectures to meet business needs and analytical requirements.
  • Collaborate with stakeholders to understand data requirements, build subject matter expertise, and define optimal data models and structures.
2. Data Pipeline Development and Optimization:
  • Design and develop data pipelines, ETL processes, and data integration solutions for ingesting, processing, and transforming large volumes of structured and unstructured data.
  • Optimize data pipelines for performance, reliability, and scalability.
3. Database Management and Optimization:
  • Oversee the management and maintenance of databases, data warehouses, and data lakes to ensure high performance, data integrity, and security.
  • Implement and manage ETL processes for efficient data loading and retrieval.
4. Data Quality and Governance:
  • Establish and enforce data quality standards, validation rules, and data governance practices to ensure data accuracy, consistency, and compliance with regulations.
  • Drive initiatives to improve data quality and documentation of data assets.
5. Mentorship and Leadership:
  • Provide technical leadership and mentorship to junior team members, assisting in their skill development and growth.
  • Lead and participate in code reviews, ensuring best practices and high-quality code.
6. Collaboration and Stakeholder Management:
  • Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand their data needs and deliver solutions that meet those needs.
  • Communicate effectively with non-technical stakeholders to translate technical concepts into actionable insights and business value.
7. Performance Monitoring and Optimization:
  • Implement monitoring systems and practices to track data pipeline performance, identify bottlenecks, and optimize for improved efficiency and scalability.

8. Common Software Engineering Requirements:
  • You actively contribute to the end-to-end delivery of complex software applications, ensuring adherence to best practices and high overall quality standards.
  • You have a strong understanding of a business or system domain with sufficient knowledge & expertise around the appropriate metrics and trends. You collaborate closely with product managers, designers, and fellow engineers to understand business needs and translate them into effective software solutions.
  • You provide technical leadership and expertise, guiding the team in making sound architectural decisions and solving challenging technical problems. Your solutions anticipate scale, reliability, monitoring, integration, and extensibility.
  • You conduct code reviews and provide constructive feedback to ensure code quality, performance, and maintainability. You mentor and coach junior engineers, fostering a culture of continuous learning, growth, and technical excellence within the team.
  • You play a significant role in the ongoing evolution and refinement of current tools and applications used by the team, and drive adoption of new practices within your team.
  • You take ownership of (customer) issues, including initial troubleshooting, identification of root cause and issue escalation or resolution, while maintaining the overall reliability and performance of our systems.
  • You set the benchmark for responsiveness, ownership, and overall accountability of engineering systems.
  • You independently drive and lead multiple features, contribute to large projects, and lead smaller projects. You can orchestrate work that spans multiple engineers within your team and keep all relevant stakeholders informed. You keep your lead/EM informed about your work and that of the team so they can share it with stakeholders, including escalating issues when needed.
Qualifications:
  • Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
  • 5+ years of experience in data engineering, with a focus on data architecture, ETL, and database management.
  • Proficiency in programming languages such as Python/PySpark and Java/Scala.
  • Expertise in big data technologies such as Hadoop, Spark, Kafka, etc.
  • In-depth knowledge of SQL and experience with various database technologies (e.g., PostgreSQL, MySQL, NoSQL databases).
  • Experience and expertise in building complex end-to-end data pipelines.
  • Experience with orchestration and job scheduling using tools such as Airflow, and with CI/CD tools such as Jenkins.
  • Ability to work in an Agile environment (Scrum, Lean, Kanban, etc.).
  • Ability to mentor junior team members.
  • Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., AWS Redshift, S3, Azure SQL Data Warehouse).
  • Strong leadership, problem-solving, and decision-making skills.
  • Excellent communication and collaboration abilities.

We are proud to be an Equal Opportunity Employer and consider all qualified applicants for employment opportunities without regard to race, age, color, religion, gender, national origin, disability, sexual orientation, veteran status or any other category protected by the laws or regulations in the locations where we operate.

California applicants can find a copy of Oportun's CCPA Notice here:  https://oportun.com/privacy/california-privacy-notice/.

We will never request personal identifiable information (bank, credit card, etc.) before you are hired. We do not charge you for pre-employment fees such as background checks, training, or equipment. If you think you have been a victim of fraud by someone posing as us, please report your experience to the FBI’s Internet Crime Complaint Center (IC3).
