The first question in an audit is usually easy to answer. It’s the second or third question that causes problems.
“Do you have an information security policy?” “Yes.”
“When was it last reviewed?” “Um, let me check…”
“Who approved the changes?” Silence.
This pattern repeats across different controls, different audits, different organisations. The initial question about existence gets a confident answer. The probing questions about operation create uncertainty.
Understanding why this happens explains most of what goes wrong with audit evidence, and points to how to fix it.
The existence question is easy
When an auditor asks whether you have a particular control, policy, or process, you can usually produce something. A document exists. A procedure has been written down. Someone created a framework at some point.
This is the easy part of audit scrutiny because it’s about artefacts. You either have the document or you don’t. You can point to it, share it, demonstrate that it exists.
Most organisations pass this test. The failure comes later.
The operation question is harder
The second tier of questioning shifts from existence to operation. Not “do you have this?” but “can you show me this working?”
These questions sound simple:
“When was the last time you used this incident response procedure?”
“How many access reviews have you completed this year?”
“Can you show me a recent vendor risk assessment?”
But they require something different from the first question. They require evidence that connects the policy to actual activity. They require a trail showing that the control operates in reality, not just in documentation.
This is where what auditors look for shifts from artefacts to patterns. They’re trying to establish whether your stated controls reflect operational reality or wishful thinking.
Follow-up questions aren’t about catching organisations out. They’re about testing whether decisions were made deliberately, or just assumed.
Why the trail breaks down
The trail from policy to action breaks down for reasons that have nothing to do with whether the work is being done.
Records exist in someone’s head
A manager knows they reviewed system access last month. They remember making changes. They recall the conversation with their team lead about a former employee’s credentials.
But there’s no record of this. The decision happened verbally. The action happened in the admin panel. The system logged the change but didn’t capture who made it or why.
When the auditor asks “can you show me your most recent access review?”, the answer is either “I did one but I’m not sure where the record is” or “let me try to recreate what I remember.”
Neither is convincing.
Evidence lives in the wrong format
Risk assessments get done in conversations and meetings. Someone takes notes. Those notes might be thorough and accurate, but they live in someone’s notebook, or in a message thread, or in a document saved somewhere nobody else can find.
The work happened. The thinking was sound. But the evidence isn’t in a form that can be retrieved or verified later.
The trail is fractured
A supplier assessment involves checking their security documentation, reviewing their contract terms, assessing their financial stability, and making a decision about risk acceptance.
These activities happen across different tools and involve different people. The security review happens in email. The contract check happens in a legal system. The financial check happens in a spreadsheet. The decision gets made in a meeting and recorded… nowhere, or in someone’s email.
When an auditor asks “how do you assess suppliers?”, you can describe the process accurately. But showing a complete example requires pulling together fragments from five different places, and hoping nothing has been deleted or lost.
The problem with reconstruction
When organisations realise they can’t answer follow-up questions with clean evidence, they often try to reconstruct what happened.
Someone remembers that they did a particular risk assessment in May. They find some related emails. They create a document that captures what they recall about the decisions they made.
This rarely works well, for three reasons.
First, memory is unreliable. What felt like a thorough assessment at the time becomes fuzzy in hindsight. The specific risks identified, the mitigations agreed, the reasoning behind decisions – these details erode quickly.
Second, reconstruction is obvious. An auditor can usually tell the difference between evidence that accumulated naturally and evidence that was assembled after the fact. Creation dates don’t match. The language feels summarised rather than contemporaneous. Details are missing that should be present if the evidence was created in the moment.
Third, reconstruction doesn’t scale. You might be able to reconstruct one example when an auditor asks for it. But if they ask for three examples, or if they want to see a pattern over time, reconstruction becomes impossible. You can’t recreate six months of access reviews or a year of vendor assessments from memory and fragments.
What auditors are actually testing
When an auditor moves from existence questions to operation questions, they’re testing three things:
Whether controls are real: Do the policies and procedures you’ve documented actually happen, or are they aspirational?
Whether controls are consistent: Do they happen once because someone remembered, or do they happen reliably as part of how work gets done?
Whether controls are visible: Can someone who wasn’t directly involved understand what happened and why, or does it require institutional knowledge and archaeology?
These tests aren’t about catching organisations out. They’re about establishing whether your compliance programme operates in reality or only on paper.
Assertions don’t answer these questions. Only evidence does. And evidence that stands up to scrutiny has a particular structure: it’s connected, timestamped, attributed, and retrievable.
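Those four properties can be made concrete. Here is a minimal sketch in Python of what a single evidence record might hold; the field names and identifier scheme are illustrative assumptions, not a prescribed schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative sketch of an evidence record carrying the four
# properties: connected, timestamped, attributed, retrievable.
# Field names are hypothetical, not a standard.
@dataclass
class EvidenceRecord:
    control_id: str          # connected: which stated control this supports
    performed_by: str        # attributed: who did the work
    performed_at: datetime   # timestamped: when it happened
    summary: str             # what was done and decided
    record_id: str = ""      # retrievable: a stable identifier to cite later

    def __post_init__(self):
        if not self.record_id:
            # Derive a findable identifier from the control and the date.
            self.record_id = f"{self.control_id}-{self.performed_at:%Y%m%dT%H%M%S}"

review = EvidenceRecord(
    control_id="access-review",
    performed_by="j.smith",
    performed_at=datetime(2024, 9, 12, tzinfo=timezone.utc),
    summary="Quarterly access review; two stale accounts disabled.",
)
print(review.record_id)  # access-review-20240912T000000
```

The point of the sketch is not the data structure itself but the test it enables: if any one of the four fields is missing, one of the auditor’s follow-up questions becomes unanswerable.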
The timing problem
One of the most revealing moments in an audit is when someone says “let me find that for you” and then disappears for three hours to hunt through email and folders.
This signals that evidence exists somewhere, but it wasn’t designed to be found. The work happened. Someone did it. But the evidence doesn’t exist as a coherent trail – it exists as fragments that require detective work to assemble.
The underlying problem is that evidence was never treated as something that needed to persist. It was created for the moment – to support a decision, to satisfy a requirement, to check a box – but not in a way that made it retrievable later.
This is why organisations that handle audits smoothly aren’t necessarily doing more work. They’re doing the same work in systems that naturally create retrievable evidence.
When a risk assessment happens in a system designed for risk management, the evidence is the assessment record itself. When access reviews happen in a system that logs actions and decisions, the evidence is the audit trail. When vendor assessments follow a structured process that captures each step, the evidence is the process output.
The work and the evidence become the same thing.
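One way to read “the work and the evidence become the same thing”: the function that performs the task also writes the record, so there is no separate evidence step to forget. A hedged sketch, where the log shape and function names are assumptions for illustration:

```python
import json
from datetime import datetime, timezone

# In a real system this would be durable, append-only storage.
AUDIT_LOG = []

def disable_account(username, reviewer, reason):
    """Perform the change and record the evidence in one step.

    The actual disabling is stubbed out; the point is that the
    evidence entry is written as a side effect of doing the work,
    not as a separate chore.
    """
    # ... call into the identity system here ...
    AUDIT_LOG.append({
        "action": "disable_account",
        "target": username,
        "performed_by": reviewer,
        "performed_at": datetime.now(timezone.utc).isoformat(),
        "reason": reason,
    })

disable_account("former.employee", "j.smith", "Left the company 2024-08-31")
print(json.dumps(AUDIT_LOG[-1], indent=2))
```

The design choice is that the reviewer never decides whether to create evidence; doing the work creates it.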
What good evidence looks like in practice
When follow-up questions don’t cause anxiety, it’s because evidence exists in a particular form.
Someone asks: “When did you last test your backup restoration process?”
A good answer isn’t: “I think we did that in September, let me check with the IT team.”
A good answer is: “We tested it on 12 September. Here’s the test plan, here are the results, here’s the list of issues we identified, and here’s the follow-up action log showing what we fixed.”
The evidence isn’t perfect. The test might have identified problems. But the trail is complete and credible.
Or: “Can you show me how you handled the last subject access request?”
A good answer isn’t: “We handle those when they come in, they usually take a week or two.”
A good answer is: “The most recent one came in on 3 October, was assigned to Alice on 4 October, required input from three departments, and was completed on 15 October. Here’s the request, here’s the response, and here’s the log showing each step.”
Again, the process might not be perfect. The request might have taken longer than ideal. But the evidence is there, and it tells a clear story.
The difference between doing work and proving work
The hardest thing about compliance evidence is that doing the work well doesn’t automatically create good evidence.
You can be managing risks thoughtfully, reviewing access diligently, assessing vendors thoroughly, and still struggle when someone asks you to prove it.
The gap isn’t in the work itself. It’s in how the work gets captured, recorded, and made retrievable.
This is why how evidence actually works matters more than how much effort you put into compliance. Evidence isn’t about effort. It’s about structure. It’s about whether the systems you use to do compliance work naturally create trails that someone can follow later.
When evidence accumulates naturally from operational activity, follow-up questions become straightforward. Not because you prepared perfectly, but because the trail was there all along.
What to do about it
If your organisation struggles with follow-up questions, the fix isn’t to work harder at audit time. It’s to change how evidence accumulates during normal work.
Ask: when we complete a risk assessment, where does the record live and how would someone find it six months later?
Ask: when we make a decision about a security control, who records that decision and in what format?
Ask: when we review access permissions, does the system capture who did the review, what they found, and what they changed?
These questions shift the focus from audit preparation to operational hygiene. They’re about making sure that when you do work worth doing, there’s a trail showing it happened.
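The retrievability question above (“how would someone find it six months later?”) is only answerable if records can be filtered by activity and period. A minimal sketch, assuming records shaped like simple audit-log entries with an action name and an ISO-format timestamp:

```python
from datetime import datetime

def find_evidence(records, action, since):
    """Return records for a given action performed on or after `since`.

    `records` is assumed to be a list of dicts with 'action' and
    ISO-format 'performed_at' keys; the shape is illustrative.
    """
    return [
        r for r in records
        if r["action"] == action
        and datetime.fromisoformat(r["performed_at"]) >= since
    ]

records = [
    {"action": "access_review", "performed_at": "2024-03-01T10:00:00"},
    {"action": "access_review", "performed_at": "2024-09-12T09:30:00"},
    {"action": "vendor_assessment", "performed_at": "2024-09-20T14:00:00"},
]
recent = find_evidence(records, "access_review", datetime(2024, 6, 1))
print(len(recent))  # 1
```

If answering “show me your access reviews since June” is a one-line query rather than three hours of email archaeology, the trail was designed to be found.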
That’s what makes audits less stressful. Not having perfect answers, but having credible evidence that connects your stated controls to operational reality.
Because the organisations that handle follow-up questions well aren’t necessarily doing different work. They’re just making sure that when they do the work, it creates evidence that persists.

