Presenter Script | Loan Approval Process
Second-screen presenter guide

Loan Approval Process

Use this on a second screen while you run the demo. It is deliberately prescriptive: what each section covers, what to say, how to frame it, what to point at, and what to practice before you present.
Audience
Sales reps, sales engineers, mixed technical and executive audiences
Core message
The problem is not lack of data. The problem is operationalizing context in time. Redis becomes the operational context layer that makes the live decision possible.
Suggested runtime
12 to 15 minutes.
Use this for
Practice, repetition, and live second-screen support while demoing.

How to use this script

Assume the rep or the SA has this script open on a second screen while running the demo. This is not background reading. This is the coaching layer. Use the “Say this exactly” line as the default talk track, then adapt with the framing notes if the room is more executive or more technical.

Opening message

Northstar does not win by making more decisions in batch. It wins by making the right lending decision in the live borrower moment. Credit data, fraud data, income signals, and policy data already exist in the stack. The issue is that they do not assemble fast enough to turn a raw application into the best approval path before the borrower drops or calls support.
This is built for digital lending, underwriting, fraud, and platform teams. The visible demo stays customer-safe. The script is where the presenter carries the detail, the stakes, and the ask.
The data is already there: loan origination system, bureau and income verification APIs, fraud graph, document repository, feature lakehouse, and Kafka streams for application and identity events. The issue is not whether the data exists. The issue is whether it can be assembled and acted on in milliseconds while the moment is still live.
That is what this demo shows. Nine stages, one primary decision moment centered on Daniel Lee, and a decisioning pipeline that turns scattered operational context into the winning action: Instant conditional approval.

Demo objective

Show how Redis helps a lender make the right lending decision in the live borrower moment by assembling credit, fraud, income, and policy context in milliseconds.

What the audience should remember

Redis is the operational context layer, not a rip-and-replace. Better decisions come from better live context, assembled and served inside the latency budget.

Stage 1
The Architecture

This section is about

This section explains the purpose of the click and why this moment matters in the overall real-time decisioning story.

Say this exactly

This is the full architecture for what you are about to see. At the top are the systems of record and live signal sources — the loan origination system, core banking platform, credit bureau APIs, document repository, fraud and identity platform, and Kafka event streams. None of these go away. They stay exactly where they are.
The ingest layer has two jobs. RDI handles change data capture and operational sync from the core repositories — near-real-time, no custom pipeline code. Redis Feature Form handles the feature pipeline from the analytical and streaming systems into the context layer, with full train-serve parity. Two tools, two roles, one unified ingest layer.
The context layer is the operational working set. Hot data and live session state stay in Redis RAM for sub-millisecond access. Larger history, document embeddings, and warm bureau context sit in Redis Flex. Redis Context Retriever connects those stores to the decision engine as the semantic access layer — it assembles the Applicant 360 — credit state, account history, and policy context — and exposes it as structured MCP tools the decision engine can call directly.
The decision engine is where rules, eligibility, ML ranking, and policy arbitration come together. The output channels are where the business sees the result. And the learning loop makes every accepted or rejected action improve the next one.
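
For the SE in the room, the rules-then-policy-then-ranking flow of the decision engine can be sketched in a few lines of plain Python. Everything here — field names, thresholds, scoring weights, action names — is an illustrative assumption, not the demo's actual rule set:

```python
# Toy, server-free sketch of the decision-engine flow described above:
# eligibility rules, then policy arbitration, then ranking.

def decide(applicant, actions):
    """Apply eligibility rules, then policy filters, then rank the survivors."""
    # Eligibility rules: hard filters applied before any scoring.
    eligible = [a for a in actions if applicant["fico"] >= a["min_fico"]]
    # Policy arbitration: suppress anything out of policy.
    in_policy = [a for a in eligible if applicant["dti"] <= a["max_dti"]]
    # ML ranking stand-in: a simple weighted score over relevance and economics.
    for a in in_policy:
        a["score"] = 0.6 * a["relevance"] + 0.4 * a["economics"]
    return max(in_policy, key=lambda a: a["score"]) if in_policy else None

applicant = {"fico": 742, "dti": 0.34}
actions = [
    {"name": "instant_conditional_approval", "min_fico": 700, "max_dti": 0.40,
     "relevance": 0.95, "economics": 0.92},
    {"name": "manual_review", "min_fico": 0, "max_dti": 1.0,
     "relevance": 0.70, "economics": 0.60},
    {"name": "jumbo_offer", "min_fico": 760, "max_dti": 0.30,
     "relevance": 0.90, "economics": 0.99},  # filtered out by eligibility
]
winner = decide(applicant, actions)
print(winner["name"])  # instant_conditional_approval
```

The point of the sketch is the ordering: hard rules and policy run before the model score, so the ranking never surfaces an ineligible or out-of-policy action.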

Frame it this way

Frame this as one step in the larger real-time decisioning story, with Redis turning scattered data into an action while the moment is still live. Emphasize this point: Lead with Redis as the operational context layer, not a rip-and-replace. The architecture matters because it makes the live decision possible.

What to point at on screen

The five-tier Redis Real-Time Decisioning reference architecture with Data Sources, Ingest Layer, Unified Context Layer, Decision Engine, Output Channels, and the learning loop. In the Unified Context Layer, Redis RAM, Redis Flex, and Feature Store sit in the top row. Redis Context Retriever sits centered in a second row below them — visually connecting those stores to the Decision Engine as the MCP access layer.

Practice note

Practice landing on this transition cleanly: "This is the architecture. Now let me show you what happens when the live customer moment actually starts."

Message to reinforce

Lead with Redis as the operational context layer, not a rip-and-replace. The architecture matters because it makes the live decision possible.

Transition to the next click

This is the architecture. Now let me show you what happens when the live customer moment actually starts.

Stage 2
Decision Moment

This section is about

This section explains the purpose of the click and why this moment matters in the overall real-time decisioning story.

Say this exactly

This is Daniel Lee. Prime borrower, 742 FICO, payroll direct deposit, and a repeat applicant at Northstar Lending. He just submitted a digital loan application and is waiting on the confirmation screen for an approval path.
This is not an edge case. This is the repeatable decision moment Northstar handles thousands of times a day. And it has to resolve before the confirmation screen finishes rendering.
If the system waits too long, Daniel bounces. If it acts on partial context, it routes him to manual review when he qualifies for conditional approval. Either way, Northstar loses the moment. If the system decides in time with full context — credit state, fraud score, income signals, and active policy all assembled together — it captures higher funded-loan conversion, reduces unnecessary manual review, and holds a cleaner risk posture.

Frame it this way

Frame this as one step in the larger real-time decisioning story, with Redis turning scattered data into an action while the moment is still live. Emphasize this point: Make the business stakes concrete. This is the live moment where latency and context determine whether the company captures value or misses it.

What to point at on screen

The live trigger centered on Daniel Lee, plus the side panel explaining why this moment matters right now.

Practice note

Practice landing on this transition cleanly: "We have one live moment to recognize Daniel Lee correctly and act before the old process falls back to something generic."

Message to reinforce

Make the business stakes concrete. This is the live moment where latency and context determine whether the company captures value or misses it.

Transition to the next click

We have one live moment to recognize Daniel Lee correctly and act before the old process falls back to something generic.

Stage 3
Ingest

This section is about

This section is about how the existing systems stay in place while Redis operationalizes their data. Emphasize additive architecture, not rip-and-replace.

Say this exactly

Northstar Lending keeps everything you see at the top of this architecture. The loan origination system, core banking platform, FICO and Experian bureau APIs, core deposits ledger, payroll and income verification APIs, and Kafka event streams all stay exactly where they are. Redis is not the new system of record. Redis is the operational serving layer that makes those existing systems act together in the live decision window.
The ingest layer has two jobs. RDI handles change data capture from the loan origination system, core banking, and bureau repositories — near-real-time sync with no custom pipeline code required. Redis Feature Form handles the feature pipeline from the analytical and streaming systems into the online feature store, with full train-serve parity. Two tools, clear separation of concerns, one unified ingest layer.
The result is a working set that is always current. Not a nightly batch. Not a stale snapshot. Milliseconds behind the source.
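
If the room wants to see the pattern rather than the product, the change-data-capture loop RDI implements can be sketched server-free: change events from the system of record are applied to an operational working set so reads stay milliseconds behind the source. The event shape and key scheme here are illustrative assumptions, not RDI's actual schema:

```python
# Server-free sketch of the CDC sync pattern described above.
working_set = {}  # stand-in for the Redis operational working set

def apply_change(event):
    """Apply one CDC event (insert/update/delete) to the working set."""
    key = f"{event['table']}:{event['pk']}"
    if event["op"] == "delete":
        working_set.pop(key, None)
    else:  # insert or update: upsert the after-image of the row
        working_set[key] = event["after"]

# A burst of events from the loan origination system's change log.
for ev in [
    {"table": "application", "pk": "A-1001", "op": "insert",
     "after": {"applicant": "daniel.lee", "status": "submitted"}},
    {"table": "application", "pk": "A-1001", "op": "update",
     "after": {"applicant": "daniel.lee", "status": "in_decision"}},
]:
    apply_change(ev)

print(working_set["application:A-1001"]["status"])  # in_decision
```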

Frame it this way

Frame this as additive architecture. Existing systems remain the systems of record; Redis makes their data usable in the live decision path. Emphasize this point: Reinforce additive architecture. RDI and Redis Feature Form make existing systems operational in the moment without replacing systems of record.

What to point at on screen

Industry repositories and streaming APIs flowing into Redis through RDI and Redis Feature Form, with pipeline status visible on the right.

Practice note

Practice landing on this transition cleanly: "Redis does not replace the existing stack. RDI and Redis Feature Form make that stack operational in the live decision window."

Message to reinforce

Reinforce additive architecture. RDI and Redis Feature Form make existing systems operational in the moment without replacing systems of record.

Transition to the next click

Redis does not replace the existing stack. RDI and Redis Feature Form make that stack operational in the live decision window.

Stage 4
Context Assembly

This section is about

This section is about the unified context layer. Slow down here and show how live signals and durable history come together to produce decision-ready context.

Say this exactly

This is the heart of the decisioning stack. The left panel is Daniel Lee's durable profile — customer value band, relationship tenure, prior interaction pattern, eligibility state, policy constraints, and frequency cap history. The right panel is what is happening right now in this session — current intent, streaming signal state, inventory availability, risk and compliance check, and surface readiness.
Most systems have one or the other. They can look up a customer record. Or they can capture a live event. The gap is serving both together at request time, inside the latency budget.
Redis Context Retriever is what makes that possible. It assembles the Applicant 360 — credit state, account history, and active policy constraints — and surfaces it as structured tools the decision engine can call directly. No fan-out queries, no manual joins across repositories. History without live state is stale. Live state without history is shallow. Redis is the layer that serves both in the same response path.
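
For a technical audience, the request-time assembly can be reduced to its essence: one durable profile plus live session state, merged into a single decision context in one pass. The field names below are illustrative assumptions, not the demo's actual Applicant 360 schema:

```python
# Server-free sketch of the context assembly the retriever performs at
# request time: history and live state served together in one response path.

def assemble_context(profile, session):
    """Merge durable history with live state; live signals win on conflict."""
    ctx = dict(profile)   # durable: value band, tenure, eligibility, policy
    ctx.update(session)   # live: intent, streaming signals, surface readiness
    # A decision is only as good as its completeness check.
    ctx["complete"] = all(k in ctx for k in ("fico_band", "intent", "policy"))
    return ctx

profile = {"fico_band": "prime", "tenure_years": 4, "policy": "standard"}
session = {"intent": "personal_loan", "surface": "web", "fraud_score": 0.04}

ctx = assemble_context(profile, session)
print(ctx["complete"], ctx["intent"])  # True personal_loan
```

History without live state is stale; live state without history is shallow. The sketch shows why both inputs have to arrive in the same response path.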

Frame it this way

Frame this as the heart of the demo. If the audience remembers one thing, it should be that better decisions come from better live context, not from more static rules. Emphasize this point: Slow down here. This is where unified context becomes tangible: history, live signals, policy, and situational awareness in one decision path.

What to point at on screen

Two panels: historical context on the left and live context on the right, merged into one working view.

Practice note

Practice landing on this transition cleanly: "A profile tells you who the customer is. Context tells you what the business should do next."

Message to reinforce

Slow down here. This is where unified context becomes tangible: history, live signals, policy, and situational awareness in one decision path.

Transition to the next click

A profile tells you who the customer is. Context tells you what the business should do next.

Stage 5
Feature Serving

This section is about

This section is about why the model or rules engine can act in real time. The message is that online features arrive fast, consistently, and with train-serve parity.

Say this exactly

You are looking at six features served live from Redis in under a millisecond each — credit score band, fraud risk, income stability, debt-to-income ratio, relationship value score, and verification readiness. One hundred eighty-six features total across this decision path. P99 lookup latency under fifteen milliseconds.
The point is not the feature names themselves. The point is that these are the same features used to train the model, served online at decision time with the same definitions and the same logic. That is train-serve parity. Most teams can train a model. The hard part is serving the right features fast enough in production without drift between the notebook and the live application.
Redis Feature Form on Redis closes that gap. These features are specifically why the system chooses Instant conditional approval instead of defaulting to manual review or surfacing an out-of-policy offer that fails compliance.
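
Train-serve parity is easiest to show as code: the same feature definition runs in the offline training path and the online serving path, so there is no drift between notebook and production. The feature name and values are illustrative assumptions:

```python
# Server-free sketch of train-serve parity: one shared feature definition
# used to build the training set and to serve the live decision.

def debt_to_income(monthly_debt, monthly_income):
    """The single feature definition shared by training and serving."""
    return round(monthly_debt / monthly_income, 2)

# Offline: the training set is built with the shared definition...
train_row = {"debt": 1700.0, "income": 5000.0}
offline_value = debt_to_income(train_row["debt"], train_row["income"])

# Online: decision time runs the exact same function on live values.
online_value = debt_to_income(1700.0, 5000.0)

assert offline_value == online_value  # same definition, no drift
print(online_value)  # 0.34
```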

Frame it this way

Frame this as the bridge between models and production outcomes. The point is not model training; the point is serving the right features inside the latency budget. Emphasize this point: Differentiate analytics from execution. The model is not the hard part; serving trustworthy online features in milliseconds is the hard part.

What to point at on screen

Online feature cards plus the feature-serving performance panel.

Practice note

Practice landing on this transition cleanly: "Your model is only as good as the features you can serve in milliseconds, not the features you can describe in a slide deck."

Message to reinforce

Differentiate analytics from execution. The model is not the hard part; serving trustworthy online features in milliseconds is the hard part.

Transition to the next click

Your model is only as good as the features you can serve in milliseconds, not the features you can describe in a slide deck.

Stage 6
Ranking

This section is about

This section is about the actual decision. The audience should understand that this is not a generic recommendation; it is ranked next-best-action arbitration based on live context.

Say this exactly

The winner is Instant conditional approval, with an NBA score of 0.94. It wins because it fits this exact moment — Daniel is a prime borrower with a 742 FICO, verified income, and a payroll direct deposit relationship that makes the auto-pay incentive a natural fit. High relevance, strong economics, fully within policy.
Manual review route scores 0.79. That is the path the legacy process typically takes because it is the simplest generic fallback. It is not wrong — it just misses the moment. A conditional approval with payroll verification captures the same risk controls at materially higher funded-loan conversion.
Out-of-policy jumbo offer is suppressed entirely. A model operating on partial context might have surfaced it. The full picture — debt-to-income at 34 percent, collateral constraints — removes it before it reaches the decision engine.
Redis Search is what powers the similarity matching in this ranking step. Vector search is not a separate product you bolt on — it is a query type that Redis Search handles natively, the same way it handles full-text and numeric filtering. Most teams still think of vector search as a specialty database problem. It is not. It is just another data type Redis can search at sub-millisecond speed.
This is not content ranking. This is decisioning — arbitrating across policy, economics, relevance, and vector similarity in one low-latency response.
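
In the demo this similarity matching runs as a native KNN query inside Redis Search; the idea itself can be sketched server-free with an inline cosine similarity. The vectors and action names below are illustrative assumptions, not the demo's actual embeddings:

```python
# Server-free sketch of the vector-similarity input to the ranking step:
# the applicant's embedded profile is matched against the action space.
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

applicant_vec = [0.9, 0.1, 0.4]  # embedded risk/intent profile
actions = {
    "instant_conditional_approval": [0.88, 0.12, 0.42],
    "manual_review_route":          [0.30, 0.80, 0.10],
}
ranked = sorted(actions, key=lambda n: cosine(applicant_vec, actions[n]),
                reverse=True)
print(ranked[0])  # instant_conditional_approval
```

Similarity is one input among several; eligibility rules and policy arbitration still gate what the ranking is allowed to choose from.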

Frame it this way

Frame this as decision arbitration. The system is not just surfacing options; it is choosing the best action for this exact moment. Emphasize this point: Show that Redis is not just scoring content; it is helping the decisioning stack rank actions in the real business moment.

What to point at on screen

The ranked candidate actions — powered by Redis Search running vector similarity to match applicant risk profile against the action space, combined with eligibility rules and policy arbitration. Point to the Redis Search component tag and call out that vector search is one of the inputs into the ranking, not the whole story. Instant conditional approval is the winner; Manual review route and Out-of-policy jumbo offer are lower-ranked or suppressed.

Practice note

Practice landing on this transition cleanly: "We are not surfacing random recommendations. We are ranking the actions the business already cares about and choosing the one that fits this moment best."

Message to reinforce

Show that Redis is not just scoring content; it is helping the decisioning stack rank actions in the real business moment.

Transition to the next click

We are not surfacing random recommendations. We are ranking the actions the business already cares about and choosing the one that fits this moment best.

Stage 7
Business Impact

This section is about

This section translates the technical story into business value. Tie the decision quality back to revenue, retention, risk reduction, or operating efficiency.

Say this exactly

These numbers are direct results of the architecture. Decision latency of 11.6 milliseconds means the approval path is ready before the confirmation screen finishes rendering. A 37 percent reduction in manual review means underwriters are spending time on genuinely complex files, not routing cases that should have been auto-approved. A 14-point approval yield lift means more qualified borrowers are getting the right offer at the right moment instead of falling through to a generic path.
The value is not this single transaction. It is what happens when this decision gets repeated across the full book of business — every application, every product line, every channel where the system is choosing between conditional approval, manual review, or a suboptimal fallback. That is where the math compounds.
That is also why the next step is a pilot, not a deeper technical evaluation. The question is not whether Redis is fast. The question is what one product line looks like when the decisioning stack runs on live context instead of batch-retrieved profile data.

Frame it this way

Frame this in business terms only. This is where the rep should own the room and make the value feel measurable. Emphasize this point: Translate the technical story into measurable business outcomes. This is where the architecture earns the right to exist.

What to point at on screen

The decision economics panel and the side-by-side business impact summary.

Practice note

Practice landing on this transition cleanly: "The math is not the single transaction in front of us. It is what happens when this decision gets repeated across the full book of business."

Message to reinforce

Translate the technical story into measurable business outcomes. This is where the architecture earns the right to exist.

Transition to the next click

The math is not the single transaction in front of us. It is what happens when this decision gets repeated across the full book of business.

Stage 8
Outcome

This section is about

This section is the visible before-and-after. Keep it simple and let the audience see the difference between a generic or legacy experience and a Redis-powered one.

Say this exactly

Same customer. Same portal. Same moment. The left side shows what happens without the context layer — partial profile, delayed retrieval, limited live signals. The system routes Daniel Lee to manual review because that is the safest generic fallback available.
On the right, the same screen opens with the right action already staged. Instant conditional approval — approve with payroll verification and auto-pay incentive. Manual review reduction of 37 percent is the visible result.
The product is not the UI. The UI is identical on both sides. The product is the decision layer underneath it — the one that assembled credit state, fraud score, income signals, and active policy before the screen finished loading.

Frame it this way

Frame this as the payoff slide. Keep it simple: same customer or user, same surface, different decision layer. Emphasize this point: Keep the contrast visual and simple: same surface, different decision layer, very different outcome.

What to point at on screen

The side-by-side comparison of the generic or delayed path versus the Redis-powered path on the same end-user surface.

Practice note

Practice landing on this transition cleanly: "Same surface. Same moment. Different decision layer. That is the product."

Message to reinforce

Keep the contrast visual and simple: same surface, different decision layer, very different outcome.

Transition to the next click

Same surface. Same moment. Different decision layer. That is the product.

Stage 9
Architecture Recap

This section is about

This section closes the loop. Re-state the architectural lesson and remind the audience that the visible output is only possible because the context layer works in real time.

Say this exactly

This is the same architecture you saw at the start. Every tier looks the same. What is different now is that you have seen what each one contributed to the outcome.
Three takeaways. First, this is not a science project. This is a practical reference architecture that Northstar can operate today. Second, it is additive — the loan origination system, core banking platform, bureau APIs, and fraud graph stay exactly where they are. Redis sits in the operational path so those systems can act together. Third, this is a business story first. Higher funded-loan conversion, 37 percent fewer manual reviews, and a cleaner risk posture are the reasons to do it — not the platform architecture.
The next step is a focused working session to map this against your actual environment. We scope one product line, one score band, and one pilot that runs Redis-powered conditional approval alongside your current routing logic. That is a clean comparison with a real KPI before you commit to broader rollout.

Frame it this way

Frame this as the close. Re-state the architectural lesson and the next logical step to pilot the approach. Emphasize this point: Close the loop on context and real-time decisioning. End with a pilot-oriented ask tied to one segment, one workflow, and a clear KPI.

What to point at on screen

The architecture returns with the proven latency, outcome, and scale callouts visible.

Practice note

Practice landing on this transition cleanly: "You already have the systems and the data. What you need is the layer that lets them act together in the live decision window. That is Redis."

Message to reinforce

Close the loop on context and real-time decisioning. End with a pilot-oriented ask tied to one segment, one workflow, and a clear KPI.

Transition to the next click

You already have the systems and the data. What you need is the layer that lets them act together in the live decision window. That is Redis.

## Anticipated objections
- We already score with the bureau and LOS.
- Acknowledge the existing investment first. Then explain that Redis is additive: the current system stays in place, and Redis becomes the low-latency context and decisioning layer on top of it.
- This sounds like replacing underwriting.
- Make it clear that policy remains upstream and explicit. Redis executes the approved rules, audit trail, and model versions in real time; it does not replace governance.
- How do we keep policy and compliance control?
- Make it clear that policy remains upstream and explicit. Redis executes the approved rules, audit trail, and model versions in real time; it does not replace governance.

## Pacing guidance
- Total runtime: 12 to 15 minutes end to end. Budget roughly 60 to 90 seconds per stage, with a little more time on Stages 1, 4, 7, and 9.
- Stage 1: 90 to 120 seconds. Orient the room and establish the additive architecture pattern.
- Stage 2: 60 seconds. Introduce the person and the stakes.
- Stage 3: 60 to 90 seconds. Keep it light for business audiences, deeper for technical audiences.
- Stage 4: 90 to 120 seconds. Slow down. This is where the contextual-intelligence story lands.
- Stage 5: 60 to 75 seconds. Go deeper only if the room wants ML detail.
- Stage 6: 75 to 90 seconds. Walk the winner, then contrast the alternatives.
- Stage 7: 90 to 120 seconds. Translate the demo into business math.
- Stage 8: 60 to 90 seconds. Let the visual comparison land.
- Stage 9: 90 to 120 seconds. Recap and close on the pilot ask.

## Audience calibration
- If the room skews executive, spend more time on Stages 1, 7, and 9 and compress the detailed ingestion and feature content.
- If the room skews technical, spend more time on Stages 3, 4, and 5 and let the SE take the lead on RDI, Redis Feature Form, latency, and train-serve parity.
- If the room is mixed, have the rep own the framing and close, and let the SE step in for the technical middle of the story.

## Closing reminder
Keep the close simple: the customer already has the data and the decisioning ambition. Redis is the context layer that makes those signals usable in the live moment so the business can deliver better approval accuracy, lower fraud loss, and faster borrower conversion.
