Loan Document Review System

A system for loan structuring analysts to streamline the process of comparing loan term sheet files and loan agreement files.

ABOUT

Project Purpose

The system automates the extraction of terms from files to reduce manual effort in document comparison. It uses AI to accurately extract all terms from the term sheet and match them to the agreement file. This enhances productivity, minimizes errors, and simplifies the document comparison process.

Challenges

Simplifying the Workflow Maze:

Creating a streamlined, intuitive process across multiple document types.

Taming Document Overload:

Designing a clear information architecture for complex, lengthy documents.

Making Comparison Results Actionable:

Presenting comparison results intuitively, with clarity on confidence scoring.

Roles & Responsibilities

  • Solely responsible for leading the full end-to-end UX/UI design process.
  • Conducted user research, translated requirements into user-friendly interfaces, and performed usability testing for optimal user experience.

The Scope of Work

01. RESEARCH

User Interviews

I conducted several client interviews as part of my primary qualitative research. These interviews were valuable in identifying the key pain points and challenges users experience.

  • Research subjects: 9 loan structuring analysts
  • Research methods: interviews, product prototype testing

Target User Analysis

Loan structuring analysts often need to compare loan term sheet files and loan agreement files to make sure that all terms in the term sheet are included in the agreement. This can be a time-consuming and labor-intensive process, as it requires manual search and comparison.

According to interviews, users would like a system that can automatically extract terms from the term sheet and locate corresponding terms in the agreement. This would streamline the workflow and reduce error rates.

Current Loan Document Review Process: A Tedious and Error-Prone Workflow

The steps users typically take to compare loan documents:

1. Collect digital versions of the term sheet and agreement

2. Find all involved terms in the term sheet and read them one by one

3. Manually search the agreement for each term and find the corresponding term

4. Carefully compare the details of each term to confirm consistency

5. Record the final review results and opinions

User pain points

  • Comparing documents line by line is extremely time-consuming, taking 3-5 hours per document set. This lowers productivity.

  • Manual search often leads to missed terms, resulting in inaccurate reviews. The error rate is estimated at 5-10%.

  • Keeping track of already-compared terms is difficult. Users often lose track and repeat comparisons.

  • Constant switching between documents creates a fragmented workflow. Users have to scroll back and forth through long documents to compare terms.

  • Recording notes and review results relies on paper and Excel sheets, leaving data scattered and unorganized.

User Requirement Analysis

  1. Automatically extract all involved terms from the term sheet
  2. Precisely locate each corresponding term in the loan agreement
  3. Highlight differences and inconsistencies between terms
  4. Provide text and visual output for easy review
  5. Be simple and easy to learn
  6. Support common file formats such as PDF and Word
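To make requirements 1-4 concrete, here is a minimal sketch of how extracted terms and comparison results might be modeled. All class and field names here are hypothetical illustrations, not the product's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class Term:
    """A term extracted from the term sheet (hypothetical model)."""
    term_id: str
    name: str      # e.g. "Interest Rate"
    content: str   # full clause text, which is often long

@dataclass
class ComparisonResult:
    """A candidate match for a term, located in the loan agreement."""
    term: Term
    agreement_excerpt: str   # matched clause from the agreement
    confidence: float        # matching score in [0.0, 1.0]
    differences: list = field(default_factory=list)  # highlighted inconsistencies

# Example: one term with one candidate match
term = Term("T1", "Interest Rate", "Interest shall accrue at 5.25% per annum.")
result = ComparisonResult(term, "Interest accrues at 5.25% per annum.", 0.92)
print(result.confidence)  # 0.92
```

A structure like this covers extraction (Term), location (agreement_excerpt), difference highlighting (differences), and a confidence score for the visual output.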

02. IDEATION

Streamlined Steps

After the user research, we designed this streamlined flow to address the specific pain points faced by loan analysts. Each step targets the frustrations they expressed, aiming to make their work more efficient and accurate.

Wireframes

After understanding the product goals and user needs, I first quickly sketched out possible layouts and content blocks for key pages. This allowed me to freely think through different structural combinations and arrive at a reasonable initial concept.

By rapidly sketching lo-fi wireframes, I was able to thoroughly explore different page relationships and layout options, and quickly make changes and test them out. This ultimately yielded a layout approach that the team collectively found reasonable, laying the foundation for high-fidelity prototyping down the road.

Information Architecture

Designing proper information architecture is crucial for a product’s user experience. A logical information structure can make complex content more organized so users can quickly find needed information without getting lost in trivial details.

For this project, I first needed to understand the logical relationships between key information pieces – terms are extracted from uploaded files, and comparison results are generated by matching terms with the loan agreement file. There are clear hierarchical and sequential connections between the information.
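The sequential relationship described above, terms extracted from the uploaded term sheet and then matched against agreement clauses, can be sketched as a simple similarity search. This is only an illustrative sketch using fuzzy string matching; the real system's AI matching is more sophisticated, and all names here are assumptions:

```python
from difflib import SequenceMatcher

def match_term(term_text, agreement_clauses):
    """Return the index and similarity score of the best-matching clause."""
    scores = [SequenceMatcher(None, term_text.lower(), clause.lower()).ratio()
              for clause in agreement_clauses]
    best = max(range(len(scores)), key=scores.__getitem__)
    return best, scores[best]

clauses = [
    "The borrower shall repay the loan in 60 monthly installments.",
    "Interest shall accrue at a rate of 5.25% per annum.",
]
idx, score = match_term("Interest rate: 5.25% per annum", clauses)
print(idx)  # 1
```

The hierarchy follows the same shape: each uploaded file yields terms, and each term yields a ranked list of candidate matches with scores, which is what the results pages later present.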

03. PROTOTYPING

Interactive prototype

Using the completed set of digital wireframes, I created a clickable prototype. The primary user flow was viewing term comparison results, which allowed the prototype to be used in a usability study.

Usability Testing

Research goals

  1. Evaluate the usability and effectiveness of the UX flow
  2. Evaluate the understandability of workflows like file upload and term extraction
  3. Test usability of term comparison and result presentation functions
  4. Discover issues and improvement opportunities in interface elements
  5. Determine user satisfaction and gather feedback for improvements

Participants

  1. Loan Structuring Analysts: 3 participants
  2. Loan Approval Officers: 2 participants

Testing tasks

  1. Create a review task
  2. Compare terms against the agreement file
  3. Verify the comparison results and highlight the correct clauses in the loan agreement file
  4. Annotate term sheets and loan agreement files manually

Data Collection

  1. Record task time and steps taken
  2. Note down problems encountered in flows
  3. Gather subjective feedback on interfaces and interactions

Results Analysis

  1. Analyze task success rates and workflow usability
  2. Identify problematic interface elements and solutions
  3. Compile feedback to inform optimization

Testing KPIs

  1. Task success rate
  2. Task time
  3. User Satisfaction

Results

  • 4 users encountered difficulties during the term comparison task, with only a 50% complete success rate.
  • 4 users said a save feature is needed so they can review the work in parts instead of all at once.
  • 4 users found the design of the comparison document unintuitive.

Solving UX-problems

I considered the user feedback and made changes to the designs to improve usability.
Here are some examples of the improvements.

1. User roles and permissions


After user testing, the client requested two additional user roles, maker and checker, to support a smoother workflow. Before diving into UI or feature changes, I defined the relationship between the new roles and their corresponding permissions.

This ensured that each role had the appropriate access and capabilities to perform its specific tasks, contributing to a more intuitive and efficient user experience.

*Relationship diagram between roles and tasks

*Role and permission
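The maker/checker split can be illustrated as a simple role-to-permission mapping. The permission names below are hypothetical placeholders for illustration, not the product's actual permission set:

```python
# Hypothetical maker/checker permission mapping
PERMISSIONS = {
    "maker": {"create_task", "upload_files", "run_comparison", "annotate"},
    "checker": {"view_results", "verify_results", "mark_completed"},
}

def can(role, action):
    """Check whether a role is allowed to perform an action."""
    return action in PERMISSIONS.get(role, set())

print(can("maker", "upload_files"))    # True
print(can("checker", "upload_files"))  # False
```

Making the mapping explicit up front meant each screen could be scoped to exactly the actions its role needed.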

2. View results page


  • Problem

The layout of the results page was unclear, and users could not interpret the results at a glance.

  • Solution

  • Modified the UI layout to make it more user-friendly. The results tab is now displayed by default, and the source files (term sheet and loan agreement) appear in tabs next to the results.
  • Updated the term section to include the number of terms and to give more space for term content, which is often long.
  • Added a file status indicator to track task progress.
  • Created diagrams of roles and permissions to help users understand the roles involved and their respective responsibilities.

Before usability study

After usability study

3. File upload and processing page


  • Problem

  • The upload entry point was not intuitive, and processing progress was not displayed.
  • Task details were missing, making it harder for analysts to process the files.
  • Backend colleagues reported that file preprocessing (parsing and matching) takes a long time.

  • Solution

  • Added a more intuitive file upload flow that shows the user the task's progress.
  • Added task detail fields.
  • Added a preprocessed-file entry point and list page, so users can either upload files directly from their local devices or choose files that have already been preprocessed, saving significant waiting time.

Before usability study

After usability study

04. VISUALIZATION

Sign in

User access is controlled through the client’s user authentication system.

Statistics

Upon login, users land on the Statistics page, which provides an overview of the system’s performance over the last month. Key metrics include:

  • Matching Accuracy Rate
  • Distribution of Confirmed Results
  • Number of Terms Processed

Role: Maker

File upload and processing

Users can directly upload files from their local devices or choose files that have already undergone preprocessing.

Role: Maker

File comparison (1/4)

  • Automatically select the result with the highest score (Top 1) as the default option.
  • Allow users to search the results by entering keywords; the list updates dynamically to show relevant matches.
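The two behaviors above, defaulting to the Top 1 score and keyword filtering, can be sketched in a few lines. The field names are assumptions for illustration:

```python
def default_selection(results):
    """Pick the highest-scoring candidate (Top 1) as the default option."""
    return max(results, key=lambda r: r["score"])

def search_results(results, keyword):
    """Filter candidates whose text contains the keyword (case-insensitive)."""
    kw = keyword.lower()
    return [r for r in results if kw in r["text"].lower()]

results = [
    {"text": "Interest shall accrue at 5.25% per annum.", "score": 0.92},
    {"text": "Repayment in 60 monthly installments.", "score": 0.41},
]
print(default_selection(results)["score"])       # 0.92
print(len(search_results(results, "interest")))  # 1
```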

File comparison (2/4)

  • Allow users to expand the view to see additional candidate results for each term beyond the default selection.

File comparison (3/4)

  • Enable users to manually select an alternative result in the agreement file if they believe it better corresponds to the term's subsection.

File comparison (4/4)

  • Highlight the manually added results.

Role: Checker

File comparison list (1/2)

The history page lists past file comparisons, with the most recent at the top. Each entry includes:

  • Task name
  • File name
  • Date and time
  • Status
  • Available task actions

File comparison list (2/2)

If the checker marks a task as “completed,” a real-time report is generated, detailing:

  • Number of terms analyzed
  • Number of term contents
  • Analysis accuracy
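The report fields above could be computed as in this minimal sketch; the field names and data shapes are assumptions, not the system's actual API:

```python
def build_report(tasks):
    """Summarize a completed review task (hypothetical field names)."""
    terms = [t for task in tasks for t in task["terms"]]
    confirmed = sum(1 for t in terms if t["confirmed"])
    return {
        "terms_analyzed": len(terms),
        "term_contents": sum(t["contents"] for t in terms),
        "accuracy": round(confirmed / len(terms), 2) if terms else 0.0,
    }

report = build_report([{"terms": [
    {"confirmed": True, "contents": 3},
    {"confirmed": True, "contents": 2},
    {"confirmed": False, "contents": 1},
]}])
print(report)  # {'terms_analyzed': 3, 'term_contents': 6, 'accuracy': 0.67}
```

Generating the report in real time from the task data means the checker's "completed" action needs no separate export step.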

05. RESULT SUMMARY

Success Metrics

  • Improved efficiency by 30%
  • Term extraction accuracy reached 95%

Work Completed

  • Researched core user needs
  • Designed information architecture and iterated on prototypes
  • Optimized complex file comparison workflow
  • Provided clear result visualization
  • Applied intuitive visual designs
  • Added navigational cues
  • Enabled auto-save progress

Results

Through collaboration with engineering, we delivered a streamlined UI design and efficient interactions. Core workflows became more intelligent and user-friendly, significantly improving user productivity and satisfaction.

Next Steps

The next steps would involve gathering feedback to optimize the product after its release, adding new features as needed, and improving the user experience.