Use case #0001

How the Onboarding Quality Agent AI reviews 100% of files without increasing headcount

A manual quality control team that samples 15% of onboarding files catches, on average, 15% of the errors — and the 85% it did not review goes to disbursement with whatever errors it contains. The consequences emerge over the next 6 to 24 months: a KYC document that was a photocopy rather than an original, a NACH mandate with the wrong account number, an income document that does not cover the required 12-month period, a property valuation that expired 3 months before sanction. Each of these errors creates a compliance finding, a collections complication, or a documentation gap when the account is examined. The Onboarding Quality Agent AI checks 100% of files against a 48-point quality checklist before any application moves to sanction — not as an auditor reviewing a sample, but as the quality gate every file must pass.

Why 15% sampling is not quality control — it is quality sampling

Statistical sampling is appropriate for research — estimating what proportion of a population has a property, with a confidence interval. It is not appropriate for a quality gate — a process whose purpose is to prevent any defective unit from proceeding. A quality gate that allows 85% of files to proceed without review is not a quality gate; it is a sample review with a label. The reason most institutions use sampling is not that it is adequate — it is that 100% review by trained human reviewers is too expensive at disbursement scale. A trained reviewer working through complete files covers roughly 8 per day; at an institution disbursing 500 loans per month — about 24 files per working day — that means a QC team of 3 people working full-time on quality review, plus a manager and systems for managing the queue. The cost of 100% QC at scale is prohibitive without automation.

The Onboarding Quality Agent AI changes the economics: it reviews every file against a 48-point checklist in approximately 4 minutes per file — running all 48 checks in parallel from the institution's document repository. A file that passes all 48 checks is cleared automatically. A file that fails any check is flagged with the specific failure, the specific document or field that failed, and a recommended resolution — and routed to the appropriate team for human intervention only on the specific issue. The human QC team's time is spent on failures, not on reviewing files that passed. At 500 applications per month, the AI reviews all 500 in approximately 33 hours of compute time; the human QC team reviews only the flagged items, which at a typical 22% error rate represents 110 files requiring human attention — a manageable queue for a team of 2 rather than 3.
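
The pass/flag routing described above can be sketched as a parallel check runner. This is a minimal illustration, not the product's implementation: the check functions, result fields, and routing labels below are hypothetical stand-ins for the 48-point checklist.

```python
from concurrent.futures import ThreadPoolExecutor

def check_pan_present(f):
    # Check 4 (illustrative): PAN document present in the file
    ok = bool(f.get("pan"))
    return {"check": 4, "passed": ok,
            "detail": None if ok else "PAN missing", "route": "KYC team"}

def check_nach_account_match(f):
    # Check 34 (illustrative): account on NACH mandate matches application
    ok = f.get("nach_account") == f.get("application_account")
    return {"check": 34, "passed": ok,
            "detail": None if ok else "NACH account mismatch", "route": "Ops team"}

CHECKS = [check_pan_present, check_nach_account_match]  # ... up to all 48

def review(file_record):
    """Run every check in parallel; a file passes only if all checks pass."""
    with ThreadPoolExecutor(max_workers=len(CHECKS)) as pool:
        results = list(pool.map(lambda chk: chk(file_record), CHECKS))
    failures = [r for r in results if not r["passed"]]
    return {"status": "PASS" if not failures else "HOLD", "failures": failures}

clean = {"pan": "ABCPE1234F", "nach_account": "123", "application_account": "123"}
bad = {"pan": "ABCPE1234F", "nach_account": "999", "application_account": "123"}
print(review(clean)["status"])              # PASS
print(review(bad)["failures"][0]["check"])  # 34
```

Because the checks are independent reads against the document repository, wall-clock time per file approaches the slowest single check rather than the sum of all 48.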

"A quality gate that passes 85% of files without looking at them is not a gate — it is a filter on the 15% you happened to pull. The AI reviews all 500. Every month."

The 48-point QC checklist: structure and coverage

Group 1 — KYC completeness and validity (checks 1–10)

Identity documents present, current, and consistent across all fields

Checks 1–3: Aadhaar present, name on Aadhaar matches application name, address on Aadhaar within permitted deviation from declared address. Checks 4–5: PAN present and NSDL-validated — not a minor's PAN, not a duplicate or cancelled PAN. Checks 6–7: Photograph on ID matches applicant photograph in file (face verification against submitted selfie). Checks 8–9: For MSME borrowers, GSTIN present and active — GST filing current within 3 months. Check 10: For co-applicants and guarantors, all required KYC documents present for each profile.

→ Most common failure: Check 8 (GSTIN filing gap) · Check 3 (address deviation) · Together account for 41% of KYC group failures
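
As an illustration of the KYC group's name-match step (check 2), a sketch of normalization before comparison. The honorific list and the use of exact match after normalization are assumptions; a production matcher would also tolerate initials and transliteration variants.

```python
import re

HONORIFICS = r"\b(MRS|MR|MS|SHRI|SMT|DR)\b\.?"

def normalize(name):
    """Uppercase, drop honorifics and punctuation, collapse whitespace."""
    name = re.sub(HONORIFICS, "", name.upper())
    name = re.sub(r"[^A-Z ]", "", name)
    return " ".join(name.split())

def names_match(aadhaar_name, application_name):
    # Check 2 (illustrative): exact match after normalization
    return normalize(aadhaar_name) == normalize(application_name)

print(names_match("Shri Ravi K. Sharma", "RAVI K SHARMA"))  # True
print(names_match("Ravi Sharma", "Ravi Verma"))             # False
```
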

Group 2 — Income document completeness and validity (checks 11–20)

Income documents cover the required period, come from permitted sources, and match bureau and bank data

Checks 11–12: ITR present for the required 2 years (1 year for new business exception path); ITR acknowledgement present — not just the ITR computation sheet. Check 13: Salary slip covers the required 3 consecutive months for salaried borrowers. Checks 14–15: Bank statements cover 12 consecutive months with no missing months — not 12 individual monthly statements with gaps. Check 16: Form 16 present for salaried borrowers and matches ITR declared income within 10%. Checks 17–18: Bank statement credits are consistent with declared income — no month where credited income is more than 30% below declared income. Check 19: For AA-sourced data, consent is current (not expired). Check 20: Income document dates — none of the required income documents predates the required period start.

→ Most common failure: Check 15 (bank statement gap months) · Check 18 (bank credits vs declared income) · Together account for 58% of income group failures
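
The two most common income failures above are both mechanical checks on structured data. A sketch, assuming bank statement coverage is represented as a set of (year, month) tuples and monthly credits as a month-to-amount mapping (both representations are invented for illustration):

```python
def missing_months(statement_months, period_end):
    """Checks 14-15 (illustrative): statements must cover 12 consecutive
    months ending at period_end, with no gaps."""
    y, m = period_end
    required = []
    for _ in range(12):
        required.append((y, m))
        y, m = (y, m - 1) if m > 1 else (y - 1, 12)
    return [ym for ym in required if ym not in statement_months]

def low_credit_months(monthly_credits, declared_monthly_income):
    """Checks 17-18 (illustrative): flag any month where credited income
    is more than 30% below declared income."""
    floor = declared_monthly_income * 0.70
    return [ym for ym, amt in monthly_credits.items() if amt < floor]

# Statements for Nov 2024 - Oct 2025, with August 2025 missing
have = {(2024, 11), (2024, 12)} | {(2025, m) for m in range(1, 11) if m != 8}
print(missing_months(have, (2025, 10)))  # [(2025, 8)]
print(low_credit_months({(2025, 9): 95000, (2025, 8): 60000}, 100000))  # [(2025, 8)]
```
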

Group 3 — Property and security documentation (checks 21–30)

For HL and LAP: title chain complete, encumbrance-free, valuation current, insurance arranged

Checks 21–22: Property documents present and cover the required title search period (13-year minimum for ancestral property; standard for others). Check 23: Encumbrance certificate date — EC is not older than 6 months at the time of sanction. Check 24: CERSAI search completed and no prior charge exists. Check 25: Property valuation report — valuation not older than 6 months; valuer on the institution's approved panel. Check 26: Valuation report contains the required photographs, locality map, and square footage calculation. Checks 27–28: Sale deed / registered agreement to sell present and relevant parties' signatures complete. Check 29: Property insurance policy in force — not lapsed, not a proposal form. Check 30: For under-construction property, RERA registration number present and verified.

→ Most common failure: Check 23 (old EC) · Check 25 (valuation older than 6 months) · Together 62% of property group failures
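
Checks 23 and 25 — the two dominant property failures — are both document-age tests against the sanction date. A sketch, approximating the 6-month window as 183 days (the source states the limit in calendar months, so a production check would use calendar arithmetic):

```python
from datetime import date

MAX_AGE_DAYS = 183  # illustrative approximation of the 6-month policy window

def is_current(doc_date, sanction_date, max_age_days=MAX_AGE_DAYS):
    """Checks 23 and 25 (illustrative): EC and valuation report must be
    no older than 6 months on the sanction date."""
    return (sanction_date - doc_date).days <= max_age_days

sanction = date(2025, 11, 14)
print(is_current(date(2025, 7, 1), sanction))   # True  (~4.5 months old)
print(is_current(date(2025, 3, 10), sanction))  # False (~8 months old)
```
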

Group 4 — Loan documentation and disclosures (checks 31–40)

KFS issued pre-sanction, NACH mandate executed, all required disclosures signed

Check 31: KFS present and dated before sanction date — not on or after the sanction date. Check 32: KFS contains all required fields per the current template (including post-RBI/DOR/2025-26/84 fields effective December 1). Check 33: Borrower acknowledgement of KFS is signed and dated — signature present, date matches. Check 34: NACH mandate present, signed, and submitted — bank details on mandate match bank details in application. Check 35: NACH mandate bank account confirmed to exist (penny drop or bank letter). Check 36: Loan sanction letter issued and borrower-acknowledged copy returned. Check 37: Most Important Terms and Conditions (MITC) signed. Check 38: Pre-EMI or moratorium letter issued where applicable. Check 39: eSign/wet signature confirmed for each document requiring it — no unsigned pages in the agreement. Check 40: Borrower consent for credit bureau enquiry present and dated before the bureau pull date.

→ Most common failure: Check 34 (NACH bank details mismatch) · Check 31 (KFS dated on sanction date, not before) · Combined 44% of loan docs group failures
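
Check 31's strictly-before rule and check 34's field match are simple to express. A sketch with hypothetical field names; the real mandate and application records would carry more fields:

```python
from datetime import date

def kfs_issued_in_time(kfs_date, sanction_date):
    """Check 31 (illustrative): the KFS must be dated strictly before the
    sanction date; same-day issuance fails."""
    return kfs_date < sanction_date

def nach_details_match(mandate, application):
    """Check 34 (illustrative): account number and IFSC on the NACH mandate
    must equal the bank details captured in the application."""
    return (mandate["account"] == application["account"]
            and mandate["ifsc"] == application["ifsc"])

print(kfs_issued_in_time(date(2025, 11, 10), date(2025, 11, 10)))  # False (same day)
print(kfs_issued_in_time(date(2025, 11, 8), date(2025, 11, 10)))   # True
```
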

Group 5 — Regulatory and compliance checks (checks 41–48)

AML checks complete, FPC adherence confirmed, no prohibited transactions

Checks 41–42: AML/KYC check completed — borrower not on OFAC, UN sanctions, or CIBIL negative list. Check 43: PEP (Politically Exposed Person) check result present and actioned if PEP flag was raised. Check 44: Property type and loan purpose — property is a permitted type for the product sanctioned (e.g., no commercial property for a home loan product). Check 45: FPC breach flag — no FPC-prohibited terms appear in the loan agreement (no prepayment penalty exceeding the regulated cap). Check 46: Rate of interest is within the Board-approved range for this credit tier and product. Check 47: Processing fee charged matches the schedule (no manual override not covered by an approved waiver). Check 48: For co-lending arrangements, co-lender confirmation document present and co-lender verification complete.

→ Most common failure: Check 47 (processing fee override without waiver document) · Check 44 (property type check for LAP) · 38% of compliance group failures
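Checks 46 and 47 are lookups against the institution's product master. A sketch with invented rate bands, fee schedule values, and product/tier codes:

```python
# Illustrative board-approved rate bands and fee schedule; real values
# would come from the institution's product master.
RATE_BANDS = {("HL", "A"): (8.5, 10.5), ("LAP", "B"): (11.0, 14.0)}
FEE_SCHEDULE = {"HL": 0.005, "LAP": 0.01}  # fraction of loan amount

def rate_within_band(product, tier, rate):
    """Check 46 (illustrative): rate inside the board-approved range."""
    lo, hi = RATE_BANDS[(product, tier)]
    return lo <= rate <= hi

def fee_matches_schedule(product, loan_amount, fee_charged, waiver_doc=None):
    """Check 47 (illustrative): a lower fee is acceptable only when an
    approved waiver document is on file."""
    expected = loan_amount * FEE_SCHEDULE[product]
    return fee_charged == expected or (fee_charged < expected and waiver_doc is not None)

print(rate_within_band("HL", "A", 9.75))              # True
print(fee_matches_schedule("HL", 4_200_000, 21_000))  # True
print(fee_matches_schedule("HL", 4_200_000, 10_000))  # False: override, no waiver
```
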

The QC dashboard: live throughput, November 14, 2025

Onboarding QC Dashboard — November 14, 2025 · Live
Today: 28 files submitted · 22 reviewed · 6 in progress · Month-to-date: 214 reviewed of 214 submitted · 100% coverage
Files reviewed today: 22 of 28 submitted
Pass (all 48 checks): 17 (77.3%)
Flagged (1+ failures): 5 (22.7%)
Month-to-date error rate: 21.5% · vs 23.8% Oct
Recent files — review status and outcome
LA-2025-18881
48/48 checks pass · Home loan ₹42L · No failures
PASS
LA-2025-18882
46/48 · MSME ₹18L · Check 15 (bank gap Aug) · Check 34 (NACH mismatch)
2 ERRORS · HOLD
LA-2025-18883
48/48 checks pass · LAP ₹35L
PASS
LA-2025-18884
30/48 · Home loan ₹28L · Multiple property doc failures · EC expired · Valuation 8 months old
4 ERRORS · REJECT
LA-2025-18885
48/48 · MSME ₹12L · Clean
PASS
LA-2025-18886
47/48 · Home loan ₹55L · Check 31 (KFS dated same day as sanction)
1 ERROR · HOLD
LA-2025-18887
In progress · Home loan ₹38L · 6 of 10 KYC checks done
IN PROGRESS
Month-to-date (Nov 1–14)
214 of 214 files reviewed · 100% coverage
Human QC team reviewed: 46 flagged files (21.5%) · AI reviewed: all 214 · Zero backlog
Avg review time per file
4.2 minutes (AI) · 18 minutes (human flagged files)
Human team capacity freed: ~118 hours/month vs 100% manual (150 hours of review cut to ~32 on flagged files)
● 100% coverage · Month-to-date: 214 reviewed · Error rate 21.5% (improving from 23.8% October) · Human team reviews flagged files only · Zero backlog maintained
100% — File coverage: every application reviewed against the 48-point checklist before sanction · 214 of 214 files month-to-date
4.2 min — Average AI review time per file: all 48 checks in parallel · 100 files reviewed in 7 hours · No human time for passing files
21.5% — Error rate month-to-date: 46 files flagged · Down from 23.8% in October · Improvement driven by the feedback loop reducing the top error types
118 hrs — Human QC hours freed per month: team reviews only flagged files · At 100% manual, 500 files × 18 min = 150 hours/month of pure review; at a 21.5% flag rate, human review falls to ~32 hours

A file in the unreviewed 85% under the manual sampling regime was not a better file — it was a file that got lucky. The AI removes luck from quality control.

In November 2025, 214 applications have been submitted. Under a 15% sampling regime, 32 of those would have been reviewed and 182 would have proceeded to sanction without quality review. Of those 182, approximately 39 (21.5% error rate) would have contained errors that only emerge later — at the audit stage, at the examination stage, or when collections tries to enforce against a NACH mandate with the wrong account number. The Onboarding Quality Agent AI reviewed all 214 files and flagged 46 with errors before any of them reached sanction. The 46 were corrected before disbursement. None of them will generate a collections complication or a compliance finding that needs to be explained in 18 months. 100% coverage is not a luxury — it is what quality control means when you take the word seriously.
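
The arithmetic in this paragraph can be stated as a tiny expected-value calculation, assuming errors are spread evenly across sampled and unsampled files:

```python
def escaped_errors(files, error_rate, sample_rate):
    """Expected number of error-containing files that reach sanction
    unreviewed under a given sampling rate."""
    unreviewed = files * (1 - sample_rate)
    return unreviewed * error_rate

print(round(escaped_errors(214, 0.215, 0.15)))  # ~39 errors escape at 15% sampling
print(round(escaped_errors(214, 0.215, 1.00)))  # 0 escape at 100% coverage
```
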

← Back to Onboarding Quality Agent AI