Validation and Quality

How Fund Analyst Intelligence enforces deterministic validation, measures quality operationally, and keeps outputs reliable across monthly cycles.

Fund Analyst Intelligence is designed to be reliable in production.
Reliability comes from validating facts before any narrative is generated.
Quality is treated as an operational metric, not a marketing claim.

This page explains how validation works, how quality is measured, and how human review fits into the production model.

Objectives

A production validation system must ensure:

  • structured fund fields remain consistent over time
  • contradictions are detected and surfaced explicitly
  • missing or stale evidence becomes an exception
  • material changes are prioritised and reviewable
  • outputs remain comparable across funds and cycles

Validation philosophy

Deterministic gates first

Validation is rule-driven where possible.
The system should not “reason” about facts that can be checked deterministically.
Narrative is produced only after factual gates pass.

Human review is part of quality

Regulated workflows require accountability.
Review is not a failure of automation.
It is a control that makes automation safe.

Fail explicitly rather than guess

If evidence is missing or conflicting, the system should surface it.
It should not fabricate a clean output.
Exceptions are a feature, not an inconvenience.

Validation layers

Fund Analyst Intelligence validates across five layers, each with its own checks and outcomes.

Layer 1 — Completeness

Ensures required fields are present for the chosen scope.

Examples

  • identifiers and domicile
  • strategy and mandate fields
  • fees and liquidity terms
  • key people and operational providers

Outcome

  • completeness score per fund
  • missing-field exceptions
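
This gate reduces to a rule over a required-field schema. A minimal sketch in Python, assuming a hypothetical per-scope schema; the field names and scopes are illustrative, not the product's actual data model:

    # Completeness check: required fields per scope, yielding a score and
    # missing-field exceptions. Field names and scopes are illustrative.
    REQUIRED_FIELDS = {
        "core": ["fund_id", "domicile", "strategy", "mandate"],
        "full": ["fund_id", "domicile", "strategy", "mandate",
                 "management_fee", "redemption_frequency",
                 "notice_period_days", "portfolio_manager", "administrator"],
    }

    def check_completeness(fund: dict, scope: str = "full"):
        required = REQUIRED_FIELDS[scope]
        missing = [f for f in required if fund.get(f) in (None, "")]
        score = 1 - len(missing) / len(required)
        exceptions = [{"type": "missing_field", "field": f,
                       "fund": fund.get("fund_id")} for f in missing]
        return score, exceptions

    score, exceptions = check_completeness(
        {"fund_id": "F-001", "domicile": "LU", "mandate": "equity",
         "strategy": "long/short equity", "management_fee": "1.5%"})
    print(f"completeness={score:.0%}, exceptions={len(exceptions)}")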

Layer 2 — Consistency

Checks logical coherence across fields and sources.

Examples

  • redemption frequency consistent with notice periods
  • fee descriptions consistent across documents
  • share class terms aligned with fee schedules
  • strategy claims consistent across quarters

Outcome

  • contradiction exceptions
  • “requires reviewer decision” flags for high-severity conflicts
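
Each consistency check is a deterministic rule over two or more fields. One such rule is sketched below; the field names, cycle lengths, and severity labels are assumptions for the example, not the product's actual rule set:

    # Illustrative rule: a notice period should fit inside the redemption
    # cycle it applies to. Day counts here are assumed policy values.
    REDEMPTION_CYCLE_DAYS = {"daily": 1, "weekly": 7,
                             "monthly": 30, "quarterly": 90}

    def check_notice_vs_redemption(fund: dict):
        cycle = REDEMPTION_CYCLE_DAYS.get(fund.get("redemption_frequency"))
        notice = fund.get("notice_period_days")
        if cycle is None or notice is None:
            return None  # missing inputs belong to the completeness layer
        if notice > cycle:
            return {"type": "contradiction", "severity": "high",
                    "detail": (f"notice period of {notice} days exceeds the "
                               f"{fund['redemption_frequency']} redemption cycle"),
                    "requires_reviewer_decision": True}
        return None

    print(check_notice_vs_redemption(
        {"redemption_frequency": "monthly", "notice_period_days": 45}))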

Layer 3 — Freshness and recency

Ensures evidence meets policy expectations.

Examples

  • factsheet older than allowed threshold
  • DDQ out of date for required fields
  • online source updated but internal record stale

Outcome

  • stale-evidence exceptions
  • follow-up prompts and escalation rules
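
Freshness is a pure policy comparison between a document's as-of date and a per-source threshold. A minimal sketch, with thresholds that are illustrative policy values rather than product defaults:

    from datetime import date

    # Maximum acceptable evidence age per document type, in days (assumed).
    MAX_AGE_DAYS = {"factsheet": 45, "ddq": 365, "audited_financials": 455}

    def check_freshness(doc_type: str, as_of: date, today: date | None = None):
        today = today or date.today()
        age = (today - as_of).days
        limit = MAX_AGE_DAYS[doc_type]
        if age > limit:
            return {"type": "stale_evidence", "doc_type": doc_type,
                    "age_days": age, "limit_days": limit,
                    "action": "follow_up_with_manager"}
        return None

    print(check_freshness("factsheet", date(2025, 1, 31),
                          today=date(2025, 6, 1)))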

Layer 4 — Change detection integrity

Ensures deltas are meaningful and correctly classified.

Examples

  • distinguish true changes from mere rewordings
  • separate material changes from minor wording differences
  • detect hidden changes in operational terms

Outcome

  • structured change log with category labels
  • materiality scoring inputs
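
Separating true changes from rewordings can start with normalisation before comparison. A sketch under the assumption that fields are short strings; production change detection would be richer, but the shape is the same:

    import re

    # A delta that disappears after normalisation is a rewording; anything
    # else is a true change, labelled by field category. The category map
    # and normalisation rules are illustrative assumptions.
    FIELD_CATEGORY = {"management_fee": "fees", "notice_period_days": "liquidity",
                      "strategy": "mandate", "administrator": "operational"}

    def normalise(text: str) -> str:
        return re.sub(r"\s+", " ", text.strip().lower())

    def classify_change(field: str, old: str, new: str) -> dict:
        kind = "rewording" if normalise(old) == normalise(new) else "true_change"
        return {"field": field, "kind": kind,
                "category": FIELD_CATEGORY.get(field, "other"),
                "old": old, "new": new}

    print(classify_change("strategy", "Long/Short  Equity", "long/short equity"))
    print(classify_change("management_fee", "1.5%", "1.75%"))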

Layer 5 — Template and output validity

Ensures reports conform to a stable structure.

Examples

  • required report sections present
  • change summary included
  • evidence links present for key statements
  • approval stamp present before publication

Outcome

  • report readiness checks
  • publication gates
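
The publication gate is all-or-nothing: a report ships only when every readiness check passes. A sketch with illustrative check names and report shape:

    # Readiness checks and the gate that consumes them. Section names and
    # the report structure are assumptions for the example.
    REQUIRED_SECTIONS = {"overview", "change_summary", "fees", "liquidity"}

    def readiness_checks(report: dict) -> dict:
        return {
            "sections_present": REQUIRED_SECTIONS <= set(report.get("sections", [])),
            "change_summary_included": bool(report.get("change_summary")),
            "key_claims_evidenced": all(c.get("evidence_link")
                                        for c in report.get("key_claims", [])),
            "approved": report.get("approval_stamp") is not None,
        }

    def can_publish(report: dict) -> bool:
        return all(readiness_checks(report).values())

    report = {"sections": ["overview", "change_summary", "fees", "liquidity"],
              "change_summary": "Fee increased.", "key_claims": [],
              "approval_stamp": None}
    print(can_publish(report))  # False: every check passes except approval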

Quality signals

Quality in Fund Analyst Intelligence is monitored continuously.

Per-cycle quality signals

  • completeness score for required fields
  • number of contradictions detected
  • proportion of key claims with evidence links
  • count and severity of exceptions
  • review time and time-to-close for follow-ups

Portfolio-level quality signals

  • exception trends by category over time
  • recurring issues by manager or strategy
  • funds with repeated stale evidence
  • funds with frequent material term changes
  • adherence to cycle cadence and SLAs

These signals turn quality into a steering tool.
They also reduce dependence on individual memory and effort.
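
These signals can be aggregated directly from the artefacts the validation layers already produce. A minimal sketch of the per-cycle rollup, assuming the exception and claim shapes used in the earlier sketches:

    # Per-cycle quality rollup over validated fund records (shapes assumed).
    def cycle_quality_signals(funds: list[dict]) -> dict:
        exceptions = [e for f in funds for e in f["exceptions"]]
        claims = [c for f in funds for c in f["key_claims"]]
        return {
            "avg_completeness": sum(f["completeness"] for f in funds) / len(funds),
            "contradictions": sum(e["type"] == "contradiction" for e in exceptions),
            "evidence_coverage": (sum(bool(c.get("evidence_link")) for c in claims)
                                  / len(claims) if claims else 1.0),
            "exceptions_by_severity": {
                s: sum(e.get("severity") == s for e in exceptions)
                for s in ("high", "medium", "low")},
        }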

Managing exceptions

Exceptions are the mechanism that keeps quality honest.
They convert uncertainty into action.

Exception lifecycle

  1. detected by validation or change logic
  2. triaged and prioritised by severity and materiality
  3. reviewed and resolved, or marked as monitoring
  4. recorded with decision notes and ownership
  5. closed or escalated under SLA policy

A system without exception discipline does not scale.
Fund Analyst Intelligence is built around exception discipline.
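
Discipline here means the lifecycle is an explicit state machine: every transition carries an owner and a note, so nothing closes silently. A sketch whose states mirror the numbered lifecycle above; the transition table is an illustrative assumption:

    from enum import Enum

    class ExceptionState(Enum):
        DETECTED = "detected"
        TRIAGED = "triaged"
        IN_REVIEW = "in_review"
        MONITORING = "monitoring"
        CLOSED = "closed"
        ESCALATED = "escalated"

    # Which moves are legal from each state (assumed policy).
    ALLOWED = {
        ExceptionState.DETECTED: {ExceptionState.TRIAGED},
        ExceptionState.TRIAGED: {ExceptionState.IN_REVIEW,
                                 ExceptionState.ESCALATED},
        ExceptionState.IN_REVIEW: {ExceptionState.MONITORING,
                                   ExceptionState.CLOSED,
                                   ExceptionState.ESCALATED},
        ExceptionState.MONITORING: {ExceptionState.IN_REVIEW,
                                    ExceptionState.CLOSED},
    }

    def transition(record: dict, to: ExceptionState,
                   owner: str, note: str) -> dict:
        if to not in ALLOWED.get(record["state"], set()):
            raise ValueError(f"illegal transition "
                             f"{record['state'].value} -> {to.value}")
        record["history"].append({"from": record["state"].value,
                                  "to": to.value, "owner": owner, "note": note})
        record["state"] = to
        return record

    exc = {"state": ExceptionState.DETECTED, "history": []}
    transition(exc, ExceptionState.TRIAGED, "ops", "high-severity fee conflict")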

Human review controls

Review is implemented as explicit workflow states.

Typical controls

  • reviewer assignment and ownership
  • decision states: accept, edit, reject, follow-up, monitor
  • rationale capture for manual overrides
  • approval gate before publication

Manual overrides are recorded.
They are not silent.
This protects governance and auditability.
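
These controls translate naturally into an explicit decision record. A sketch in which overrides must carry a rationale before they can be stored; the schema is illustrative, not the product's actual model:

    from datetime import datetime, timezone
    from enum import Enum

    class Decision(Enum):
        ACCEPT = "accept"
        EDIT = "edit"
        REJECT = "reject"
        FOLLOW_UP = "follow_up"
        MONITOR = "monitor"

    def record_decision(item_id: str, reviewer: str, decision: Decision,
                        rationale: str | None = None) -> dict:
        # Anything other than a plain accept is an override and must
        # carry a recorded rationale; silent overrides are rejected.
        if decision is not Decision.ACCEPT and not rationale:
            raise ValueError(f"{decision.value} requires a rationale")
        return {"item": item_id, "reviewer": reviewer,
                "decision": decision.value, "rationale": rationale,
                "at": datetime.now(timezone.utc).isoformat()}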

Minimising variability in narrative outputs

Narrative quality is maintained by constraints.

  • templates enforce stable structure
  • narrative is restricted to validated facts and resolved exceptions
  • change summaries are generated from structured deltas
  • confidence and evidence requirements prevent unsupported statements

This reduces subjective variance across funds and cycles.
It also improves comparability for committees.
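
Generating summaries from structured deltas rather than free prose is what holds the variance down: the wording per change category is fixed. A sketch that consumes the delta shape from the change-detection sketch above; the templates are illustrative:

    # Fixed per-category templates keep change summaries uniform across funds.
    SUMMARY_TEMPLATES = {
        "fees": "Fee term '{field}' changed from {old} to {new}.",
        "liquidity": "Liquidity term '{field}' changed from {old} to {new}.",
        "other": "'{field}' changed from {old} to {new}.",
    }

    def summarise_deltas(deltas: list[dict]) -> list[str]:
        true_changes = [d for d in deltas if d["kind"] == "true_change"]
        return [SUMMARY_TEMPLATES.get(d["category"], SUMMARY_TEMPLATES["other"])
                    .format(field=d["field"], old=d["old"], new=d["new"])
                for d in true_changes]

    print(summarise_deltas([
        {"field": "management_fee", "kind": "true_change", "category": "fees",
         "old": "1.5%", "new": "1.75%"},
        {"field": "strategy", "kind": "rewording", "category": "mandate",
         "old": "Long/Short Equity", "new": "long/short equity"}]))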

Continuous improvement loop

Production quality improves through iteration.

A typical improvement loop is:

  1. observe recurring exception patterns
  2. refine validation rules or source requirements
  3. tune materiality thresholds by category
  4. update templates for clarity and stability
  5. measure impact in the next cycle

This creates compounding operational value.
It converts experience into system capability.

Definition of done

Validation and quality are production-grade when:

  • required fields meet completeness thresholds
  • contradictions and gaps are surfaced as exceptions
  • stale evidence is detected by policy
  • outputs pass readiness checks before publication
  • review decisions and overrides are recorded
  • quality KPIs are tracked per cycle and portfolio-wide

This is how Fund Analyst Intelligence keeps outputs reliable.