
Oportun

Senior Data Engineer (R12745)

Remote
Hiring Remotely in India
Senior level
Lead design, development, and maintenance of data platforms; optimize data pipelines; manage databases; mentor junior team members.
ABOUT OPORTUN

Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009.

 

WORKING AT OPORTUN


Working at Oportun means being part of a team that fosters a diverse, equitable, and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and to our ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups.

Position Overview:

As a Sr. Data Engineer at Oportun, you will be a key member of our team, responsible for designing, developing, and maintaining sophisticated software and data platforms in support of the engineering group's charter. Your mastery of a technical domain enables you to take on business problems and solve them with technical solutions. With your depth of expertise and leadership abilities, you will actively contribute to architectural decisions, mentor junior engineers, and collaborate closely with cross-functional teams to deliver high-quality, scalable software solutions that advance our impact in the market. In this role you will have the opportunity to lead the technology effort for large initiatives (cross-functional, multi-month projects), from technical requirements gathering through final, successful delivery of the product.

Responsibilities

Data Architecture and Design:

  • Lead the design and implementation of scalable, efficient, and robust data architectures to meet business needs and analytical requirements.
  • Collaborate with stakeholders to understand data requirements, build subject matter expertise, and define optimal data models and structures.

Data Pipeline Development and Optimization:

  • Design and develop data pipelines, ETL processes, and data integration solutions for ingesting, processing, and transforming large volumes of structured and unstructured data (an illustrative sketch follows these bullets).
  • Optimize data pipelines for performance, reliability, and scalability.
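
For illustration only, a minimal PySpark batch pipeline of the kind described above might read raw events, clean and reshape them, and write a partitioned, analytics-ready dataset; every path, column, and table name in this sketch is a hypothetical placeholder rather than an Oportun system.

    # Illustrative sketch only: a minimal PySpark batch ETL job.
    # All paths, columns, and dataset names are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example_daily_etl").getOrCreate()

    # Extract: read raw, semi-structured events (hypothetical location).
    raw = spark.read.json("s3://example-bucket/raw/events/2024-01-01/")

    # Transform: normalize types, drop rows without a member id, derive a date.
    cleaned = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("member_id").isNotNull())
           .withColumn("event_date", F.to_date("event_ts"))
    )

    # Load: write partitioned Parquet for downstream warehouse/lake consumers.
    (cleaned.write
            .mode("overwrite")
            .partitionBy("event_date")
            .parquet("s3://example-bucket/curated/events/"))

    spark.stop()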

Database Management and Optimization:

  • Oversee the management and maintenance of databases, data warehouses, and data lakes to ensure high performance, data integrity, and security.
  • Implement and manage ETL processes for efficient data loading and retrieval.

Data Quality and Governance:

  • Establish and enforce data quality standards, validation rules, and data governance practices to ensure data accuracy, consistency, and compliance with regulations (see the illustrative checks sketched below).
  • Drive initiatives to improve data quality and documentation of data assets.
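
For illustration only, validation rules like those described above can be expressed as simple PySpark checks that fail the run before bad data reaches downstream consumers; the column names and thresholds in this sketch are hypothetical.

    # Illustrative sketch only: basic data-quality gates on a curated dataset.
    # Column names, thresholds, and failure behavior are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example_dq_checks").getOrCreate()
    df = spark.read.parquet("s3://example-bucket/curated/events/")

    total = df.count()
    null_ids = df.filter(F.col("member_id").isNull()).count()
    duplicate_events = total - df.dropDuplicates(["event_id"]).count()

    # Fail loudly when a rule is violated so bad data never propagates.
    assert total > 0, "dataset is empty"
    assert null_ids == 0, f"{null_ids} rows are missing member_id"
    assert duplicate_events == 0, f"{duplicate_events} duplicate event_id rows"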

Mentorship and Leadership:

  • Provide technical leadership and mentorship to junior team members, assisting in their skill development and growth.
  • Lead and participate in code reviews, ensuring best practices and high-quality code.

Collaboration and Stakeholder Management:

  • Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand their data needs and deliver solutions that meet those needs.
  • Communicate effectively with non-technical stakeholders to translate technical concepts into actionable insights and business value.

Performance Monitoring and Optimization:

  • Implement monitoring systems and practices to track data pipeline performance, identify bottlenecks, and optimize for improved efficiency and scalability (a minimal sketch follows).
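
For illustration only, even lightweight instrumentation goes a long way here; the sketch below wraps a pipeline stage with wall-clock logging so slow stages surface quickly (the stage names are hypothetical).

    # Illustrative sketch only: log how long each pipeline stage takes so
    # bottlenecks are easy to spot. Stage names are hypothetical.
    import logging
    import time

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("example_pipeline_metrics")

    def timed_stage(name, fn, *args, **kwargs):
        """Run a single pipeline stage and log its wall-clock duration."""
        start = time.monotonic()
        result = fn(*args, **kwargs)
        log.info("stage=%s duration_s=%.1f", name, time.monotonic() - start)
        return result

    # Example usage with hypothetical stage functions:
    # cleaned = timed_stage("transform", transform_events, raw_df)
    # timed_stage("load", write_curated, cleaned)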

Common Requirements

  • You have a strong understanding of a business or system domain, with sufficient knowledge and expertise around the relevant metrics and trends. You collaborate closely with product managers, designers, and fellow engineers to understand business needs and translate them into effective solutions.
  • You provide technical leadership and expertise, guiding the team in making sound architectural decisions and solving challenging technical problems. Your solutions anticipate scale, reliability, monitoring, integration, and extensibility.
  • You conduct code reviews and provide constructive feedback to ensure code quality, performance, and maintainability. You mentor and coach junior engineers, fostering a culture of continuous learning, growth, and technical excellence within the team.
  • You play a significant role in the ongoing evolution and refinement of current tools and applications used by the team, and drive adoption of new practices within your team.
  • You take ownership of (customer) issues, including initial troubleshooting, identification of root cause and issue escalation or resolution, while maintaining the overall reliability and performance of our systems.
  • You set the benchmark for responsiveness, ownership, and overall accountability of engineering systems.
  • You independently drive and lead multiple features, contribute to large projects, and lead smaller projects. You can orchestrate work that spans multiple engineers within your team and keep all relevant stakeholders informed. You keep your lead/EM up to date on your work and the team's so they can share progress with stakeholders, including escalating issues when needed.
     

Qualifications
 

  • Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
  • 5+ years of experience in data engineering, with a focus on data architecture, ETL, and database management.
  • Proficiency in programming languages like Python/PySpark and Java or Scala
  • Expertise in big data technologies such as Hadoop, Spark, Kafka, etc.
  • In-depth knowledge of SQL and experience with various database technologies (e.g., PostgreSQL, MariaDB, NoSQL databases).
  • Experience and expertise in building complex end-to-end data pipelines.
  • Experience with orchestration and job scheduling using CI/CD and workflow tools such as Jenkins, Airflow, or Databricks (an illustrative Airflow DAG sketch follows this list).
  • Ability to work in an Agile environment (Scrum, Lean, Kanban, etc.).
  • Ability to mentor junior team members.
  • Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., AWS Redshift, S3, Azure SQL Data Warehouse).
  • Strong leadership, problem-solving, and decision-making skills.
  • Excellent communication and collaboration abilities.
  • Familiarity or certification in Databricks is a plus.
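
For illustration only, the orchestration experience mentioned above might look like a minimal Airflow DAG (assuming Airflow 2.4+) that schedules extract, transform, and load steps daily; the DAG id, schedule, and task bodies are hypothetical placeholders.

    # Illustrative sketch only: a minimal daily Airflow DAG (Airflow 2.4+).
    # Task bodies, DAG id, and schedule are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        ...  # e.g., pull raw files to a staging location

    def transform():
        ...  # e.g., run the PySpark cleaning job

    def load():
        ...  # e.g., publish curated tables to the warehouse

    with DAG(
        dag_id="example_daily_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load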

We are proud to be an Equal Opportunity Employer and consider all qualified applicants for employment opportunities without regard to race, age, color, religion, gender, national origin, disability, sexual orientation, veteran status or any other category protected by the laws or regulations in the locations where we operate.

 

California applicants can find a copy of Oportun's CCPA Notice here: https://oportun.com/privacy/california-privacy-notice/.

 

We will never request personally identifiable information (bank, credit card, etc.) before you are hired. We do not charge you for pre-employment fees such as background checks, training, or equipment. If you think you have been a victim of fraud by someone posing as us, please report your experience to the FBI's Internet Crime Complaint Center (IC3).

Top Skills

Airflow
AWS
Azure
Databricks
GCP
Hadoop
Java
Jenkins
Kafka
MariaDB
NoSQL
PostgreSQL
PySpark
Python
Scala
Spark
SQL
