Monthly Validation Cycle
The monthly cycle is the core operating unit of Fund Analyst Intelligence.
It converts scattered updates into a controlled validation workflow, producing a current fund profile, a prioritised set of exceptions, and a monthly memo with full provenance.
The cycle is designed to be repeatable, auditable, and able to scale across many funds.
Objective
A monthly cycle answers four questions:
- What is the current validated state of the fund?
- What changed since the last cycle?
- Which changes are material and require action?
- What evidence supports the update and the narrative?
Inputs
User materials
- DDQ / ODD files
- factsheets and marketing decks
- manager letters and periodic updates
- fee schedules and share class documents
- internal notes and committee decisions
Approved online sources
- manager website and official publications
- regulated filings where applicable
- trusted third-party references (policy-defined)
All inputs are treated as artefacts: versioned, dated, and attached to the cycle.
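The artefact model can be sketched minimally as follows. This is an illustrative assumption, not a defined schema: the `Artefact` name, its fields, and the hash-derived `version_id` are all hypothetical choices showing one way to make versions deterministic.

```python
from dataclasses import dataclass
from datetime import date
import hashlib

@dataclass(frozen=True)
class Artefact:
    """One versioned, dated input attached to a cycle (names are illustrative)."""
    name: str       # e.g. "DDQ_2024.pdf"
    source: str     # "user" or "online"
    as_of: date     # document date
    content: bytes  # raw bytes, used to derive the version identifier

    @property
    def version_id(self) -> str:
        # A content hash makes the version deterministic and tamper-evident:
        # the same bytes always yield the same identifier.
        return hashlib.sha256(self.content).hexdigest()[:12]

ddq = Artefact("DDQ_2024.pdf", "user", date(2024, 3, 31), b"...ddq bytes...")
```

Hashing content rather than assigning sequential version numbers means re-ingesting an unchanged document never creates a spurious new version.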
Cycle stages
Stage 0 — Baseline readiness (first month only)
A fund needs a stable baseline snapshot.
This snapshot is the reference point for all future deltas.
Outputs
- baseline profile
- required-field completeness check
- initial evidence pack
Stage 1 — Ingest
The system collects and registers the cycle’s source set.
Artefacts are stored with metadata and version identifiers.
Checks
- duplicates and stale documents
- missing required artefacts
- source freshness and recency rules
Output
- cycle source inventory
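The ingest checks above can be sketched as one pass over the registered documents. All names (`SourceDoc`, `ingest_checks`) and the 45-day freshness threshold are hypothetical, assumed for illustration:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass(frozen=True)
class SourceDoc:
    doc_id: str
    kind: str       # e.g. "factsheet", "letter", "ddq"
    as_of: date
    checksum: str   # content hash, used to spot duplicates

def ingest_checks(docs, required_kinds, max_age_days=45, today=date(2024, 7, 1)):
    """Flag duplicates, stale documents, and missing required artefacts."""
    issues, seen = [], set()
    for d in docs:
        if d.checksum in seen:
            issues.append(("duplicate", d.doc_id))
        seen.add(d.checksum)
        if (today - d.as_of) > timedelta(days=max_age_days):
            issues.append(("stale", d.doc_id))
    present = {d.kind for d in docs}
    for kind in required_kinds - present:
        issues.append(("missing", kind))
    return issues

docs = [
    SourceDoc("d1", "factsheet", date(2024, 6, 30), "h1"),
    SourceDoc("d2", "factsheet", date(2024, 6, 30), "h1"),  # same content as d1
    SourceDoc("d3", "letter", date(2024, 1, 15), "h2"),     # older than 45 days
]
issues = ingest_checks(docs, required_kinds={"factsheet", "letter", "ddq"})
```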
Stage 2 — Extract
Fund fields and claims are extracted into structured form.
Every extracted item is linked to evidence.
Examples of extracted items
- AUM and date
- fee levels and breakpoints
- liquidity and redemption terms
- benchmark and risk limits
- key people and governance statements
Output
- extracted field set + claim list + evidence links
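One way to represent the evidence-linked output of extraction, as a minimal sketch (the `Claim` and `Evidence` names and fields are illustrative assumptions, not a defined schema):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Evidence:
    artefact_id: str  # which document supports the item
    locator: str      # where in it, e.g. a page or section reference

@dataclass(frozen=True)
class Claim:
    field: str        # e.g. "aum_usd_m"
    value: object
    as_of: str
    evidence: tuple   # one or more Evidence entries; never empty

aum = Claim("aum_usd_m", 1250, "2024-06-30",
            (Evidence("factsheet_2024_06", "p.1, header"),))
assert aum.evidence, "every extracted item must link to evidence"
```

Making the evidence link a structural part of the claim, rather than an optional annotation, is what lets later stages enforce evidence sufficiency deterministically.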
Stage 3 — Validate
Deterministic checks validate consistency and completeness.
Validation rules are transparent and configurable.
Validation types
- required-field completeness
- cross-field consistency
- policy constraints (allocator-specific)
- contradiction detection across sources
- source coverage and evidence sufficiency
Output
- validation results + flagged issues
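Because the rules are deterministic, they can be expressed as plain predicates over the extracted profile. A minimal sketch, assuming hypothetical rule and field names:

```python
def validate(profile: dict, rules) -> list:
    """Run transparent, configurable rules; collect flagged issues."""
    flagged = []
    for rule in rules:
        flagged.extend(rule(profile))
    return flagged

def required_fields(required):
    """Rule factory: flag any required field that is missing or empty."""
    return lambda p: [("missing_field", f) for f in required if p.get(f) in (None, "")]

def fee_consistency(p):
    """Illustrative cross-field rule: management fee cannot exceed the TER."""
    if p.get("mgmt_fee_bps", 0) > p.get("ter_bps", float("inf")):
        return [("inconsistent", "mgmt_fee_bps exceeds ter_bps")]
    return []

rules = [required_fields(["aum_usd_m", "benchmark"]), fee_consistency]
profile = {"aum_usd_m": 1250, "mgmt_fee_bps": 150, "ter_bps": 120}
issues = validate(profile, rules)
```

Keeping each rule as a small named function is what makes the rule set both transparent (readable one rule at a time) and configurable (allocator-specific rules are just more entries in the list).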
Stage 4 — Compare
The system compares the current extracted state to the prior snapshot.
It produces a structured delta set.
Delta properties
- category (fees, liquidity, team, strategy, operations, risk, reporting)
- direction and magnitude where relevant
- effective date
- evidence links
- confidence indicators
Output
- change log draft for the cycle
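The comparison reduces to a field-by-field diff against the prior snapshot. A sketch under assumed names (`compute_deltas`, the category map), with magnitude filled in only where both values are numeric:

```python
def compute_deltas(prior: dict, current: dict, categories: dict) -> list:
    """Compare the current extracted state to the prior snapshot."""
    deltas = []
    for field in sorted(set(prior) | set(current)):
        old, new = prior.get(field), current.get(field)
        if old != new:
            d = {"field": field,
                 "category": categories.get(field, "other"),
                 "old": old, "new": new}
            # Direction and magnitude only make sense for numeric fields.
            if isinstance(old, (int, float)) and isinstance(new, (int, float)):
                d["magnitude"] = new - old
            deltas.append(d)
    return deltas

prior = {"mgmt_fee_bps": 150, "benchmark": "MSCI World"}
current = {"mgmt_fee_bps": 125, "benchmark": "MSCI World"}
deltas = compute_deltas(prior, current, {"mgmt_fee_bps": "fees"})
```

Effective dates, evidence links, and confidence indicators would attach to each delta from the underlying claims; they are omitted here to keep the diff logic visible.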
Stage 5 — Materiality and exceptions
Not every delta matters.
This stage applies thresholds and rules to prioritise attention.
Materiality signals
- magnitude of change (e.g., fee level, liquidity constraints)
- category severity (e.g., key person, legal/operational terms)
- persistence across sources
- conflict or ambiguity in evidence
- allocator policy preferences
Output
- prioritised exceptions queue
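The prioritisation step can be sketched as a simple score over the delta set. The severity map and thresholds below are illustrative placeholders; in practice they would come from allocator policy:

```python
# Hypothetical category severities; allocator policy would define the real ones.
CATEGORY_SEVERITY = {"fees": 3, "liquidity": 3, "team": 2, "reporting": 1}

def prioritise(deltas, magnitude_threshold=10, min_score=3):
    """Score each delta; return only material ones, highest score first."""
    queue = []
    for d in deltas:
        score = CATEGORY_SEVERITY.get(d["category"], 1)
        if abs(d.get("magnitude", 0)) >= magnitude_threshold:
            score += 2  # large numeric moves escalate attention
        if score >= min_score:
            queue.append({**d, "score": score})
    return sorted(queue, key=lambda d: -d["score"])

deltas = [
    {"field": "mgmt_fee_bps", "category": "fees", "magnitude": -25},
    {"field": "report_format", "category": "reporting"},
]
queue = prioritise(deltas)
```

Here the fee change clears the bar while the reporting tweak does not, which is the point of the stage: reviewers see the fee cut first and the cosmetic change not at all.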
Stage 6 — Review and resolution
Review is an explicit part of production.
The reviewer resolves exceptions and finalises the fund profile update.
Reviewer actions
- accept an update
- edit a field and add rationale
- request follow-up questions
- reject a claim due to weak evidence
- mark items as “monitor” rather than “change”
Output
- approved updated profile + resolution notes
Stage 7 — Report
Reporting is generated from validated facts and resolved exceptions.
Narrative is constrained and evidence-linked.
Monthly memo includes
- cycle scope and source list
- validation summary and key signals
- material changes and their interpretation
- open questions and follow-ups
- evidence pack reference
- approval signature and timestamp
Output
- monthly validation memo + evidence pack
Stage 8 — Publish and archive
The cycle is closed and stored.
All artefacts remain accessible for audit and follow-up.
Output
- immutable cycle record
- snapshot history updated
What changes for allocator teams
Before
Teams rebuild fund records each month.
Evidence is scattered.
Quality depends on individual habits.
After
Teams review exceptions.
Evidence is packaged once and reused.
Quality is enforced by the workflow.
Operational KPIs (what you track)
- cycle completion time per fund
- number of exceptions per cycle
- exceptions by category and severity
- review time per fund
- repeat issues and unresolved follow-ups
- evidence coverage for key fields
These KPIs support day-to-day steering as well as governance and scale.
Definition of done for a monthly cycle
A cycle is considered complete only if:
- the source inventory is recorded
- validation checks have run
- deltas are computed against the prior snapshot
- exceptions are reviewed and resolved
- the monthly memo is generated and approved
- the evidence pack is attached
- the snapshot history is updated
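The definition of done is effectively a gate over the cycle state, which could be enforced mechanically. A minimal sketch with assumed criterion names mirroring the list above:

```python
# Illustrative criterion keys; one per item in the definition of done.
DONE_CRITERIA = (
    "source_inventory_recorded",
    "validation_checks_run",
    "deltas_computed",
    "exceptions_resolved",
    "memo_approved",
    "evidence_pack_attached",
    "snapshot_history_updated",
)

def cycle_complete(cycle_state: dict) -> bool:
    """A cycle closes only when every criterion is satisfied."""
    return all(cycle_state.get(c) for c in DONE_CRITERIA)
```

Treating completion as a hard gate, rather than a convention, is what stops a partially reviewed cycle from being published and archived.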
This is what makes the process production-grade.