Audit trail and evidence trace
Maintaining a complete, immutable record of every AI input, output, human decision, and override — so any action can be reconstructed and reviewed after the fact.
Why it matters
Trust in AI systems depends on accountability. When something goes wrong — or when an auditor asks — the organization needs to show exactly what the AI produced, what humans changed, and why. Without an audit trail, AI becomes a liability instead of an asset.
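The core idea of "immutable" and "reconstructable" can be made concrete with tamper evidence: each audit entry records the AI input, the AI output, the human decision, and any override reason, and is chained to the previous entry by a hash so after-the-fact edits are detectable. A minimal sketch in Python (class and field names are illustrative, not a real library API):

```python
import hashlib
import json
import time

class AuditTrail:
    """Append-only, hash-chained log of AI-assisted decisions (illustrative)."""

    def __init__(self):
        self._entries = []

    def append(self, ai_input, ai_output, human_decision, override_reason=None):
        prev_hash = self._entries[-1]["hash"] if self._entries else "0" * 64
        body = {
            "timestamp": time.time(),
            "ai_input": ai_input,
            "ai_output": ai_output,            # original AI draft, preserved verbatim
            "human_decision": human_decision,  # final, possibly edited, outcome
            "override_reason": override_reason,
            "prev_hash": prev_hash,            # links this entry to the one before it
        }
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        self._entries.append(body)
        return body["hash"]

    def verify(self):
        """Recompute the hash chain; returns False if any entry was altered."""
        prev = "0" * 64
        for entry in self._entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if body["prev_hash"] != prev:
                return False
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if recomputed != entry["hash"]:
                return False
            prev = entry["hash"]
        return True
```

In practice the chain would be persisted to write-once storage; the point of the sketch is that verification is cheap, so any reviewer can confirm the record was not rewritten.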
Where it shows up
Finance
Every piece of auto-generated commentary in a reporting pack is tagged as AI-generated, with the original draft preserved alongside the final human-edited version.
HR
All manager guidance interactions are logged — including the question, the AI response, the policy citations, and whether HR was involved in the answer.
Procurement
Vendor scoring methodology, individual criterion scores, human adjustments, and the final recommendation are all recorded for post-decision review.
Common mistakes
- Logging only the final output without preserving the AI's original recommendation
- Not recording when and why humans overrode AI suggestions
- Making audit logs too technical for non-technical reviewers to understand
- Storing audit data in a format that's difficult to query or report on
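The mistakes above suggest some record-design choices: keep the original AI recommendation in its own field so it can never be overwritten, require a stated reason whenever a human overrides, and store entries in a structured form that non-technical reviewers can query. A sketch using SQLite, with a check constraint enforcing the override rule (schema and field names are assumptions, not a prescribed standard):

```python
import sqlite3

def init_db(path=":memory:"):
    """Create an audit_log table; overrides without a reason are rejected at write time."""
    db = sqlite3.connect(path)
    db.execute("""
        CREATE TABLE audit_log (
            id INTEGER PRIMARY KEY,
            created_at TEXT DEFAULT CURRENT_TIMESTAMP,
            workflow TEXT NOT NULL,            -- e.g. 'finance', 'hr', 'procurement'
            ai_recommendation TEXT NOT NULL,   -- original AI output, never overwritten
            final_decision TEXT NOT NULL,      -- what was actually decided
            overridden INTEGER NOT NULL,
            override_reason TEXT,
            CHECK (overridden = 0 OR override_reason IS NOT NULL)
        )
    """)
    return db

def log_decision(db, workflow, ai_recommendation, final_decision, override_reason=None):
    # Any divergence between the AI recommendation and the final decision
    # counts as an override and must carry a reason.
    overridden = int(final_decision != ai_recommendation)
    db.execute(
        "INSERT INTO audit_log (workflow, ai_recommendation, final_decision,"
        " overridden, override_reason) VALUES (?, ?, ?, ?, ?)",
        (workflow, ai_recommendation, final_decision, overridden, override_reason),
    )
    db.commit()
```

Because the log is plain SQL, a reviewer can list every override and its reason with a single query (`SELECT * FROM audit_log WHERE overridden = 1`), which directly addresses the "difficult to query or report on" failure mode.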
Signals that a workflow needs this pattern
- The function is subject to external audit (financial audit, compliance review, government inspection)
- Decisions could be challenged legally or by stakeholders after the fact
- The organization needs to demonstrate that AI systems are governed responsibly
- Regulatory frameworks require explainability or record-keeping for AI-assisted decisions
