Audit Insights & Preparation

SOC 2 Evidence Collection Guide: What Auditors Actually Want


Agency Team · 12 min read

We have watched more SOC 2 audits stall in the evidence collection phase than in any other part of the process. The pattern is remarkably consistent: GRC platforms auto-collect seventy to eighty percent of evidence — cloud configurations, access logs, MFA enforcement, endpoint compliance — but the remaining twenty to thirty percent requires manual evidence that your team must produce on schedule. That manual gap is where audits break down. In our experience, understanding exactly what auditors expect to see for each Trust Services Criteria category, how evidence should be organized, and what freshness standards apply prevents last-minute scrambles during fieldwork and eliminates the most common audit exceptions.

This guide covers evidence types and expectations by control category, the gap between automated and manual evidence, naming and organization conventions, freshness requirements, common evidence gaps that delay audits, and how to build a sustainable evidence collection cadence across the observation period. We wrote this for compliance leads and engineers responsible for gathering and organizing SOC 2 audit evidence.

Evidence Types: Automated vs Manual

Every SOC 2 control requires evidence — documentation that demonstrates the control is designed and operating as intended. Evidence falls into two categories, and we recommend understanding the distinction before you begin planning.

Automated Evidence

Automated evidence is collected continuously by your GRC platform through integrations with your infrastructure and tools. You configure the integration once and the platform collects evidence on an ongoing basis.

| Evidence Type | Source | Example |
| --- | --- | --- |
| Cloud configuration snapshots | AWS, Azure, GCP integrations | S3 encryption status, security group rules, IAM policies |
| MFA enforcement records | Identity provider integration | Per-user MFA status across Okta, Google Workspace, Entra ID |
| Endpoint compliance status | GRC platform agent | Disk encryption, screen lock, OS version, firewall status per device |
| Code review records | GitHub, GitLab, Bitbucket | Pull request approval records, branch protection configuration |
| User provisioning/deprovisioning logs | Identity provider | Account creation and termination timestamps |
| Vulnerability scan results | Scanner integration | Scan reports from Snyk, Qualys, or similar tools |
| Uptime monitoring data | Monitoring integration | Availability records from Datadog, PagerDuty, or similar |

Automated evidence runs continuously without human intervention. The risk we see most often is integration disconnection — if an integration breaks, evidence collection stops, creating a gap that cannot be backfilled.
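Detecting such gaps early is straightforward to automate. The sketch below assumes you can export a list of evidence snapshot timestamps from your GRC platform (the export itself is hypothetical; the gap-detection logic is the point):

```python
from datetime import datetime, timedelta

def find_collection_gaps(timestamps, max_interval=timedelta(days=1)):
    """Return (start, end) pairs wherever consecutive evidence
    snapshots are further apart than the expected interval."""
    ordered = sorted(timestamps)
    return [
        (earlier, later)
        for earlier, later in zip(ordered, ordered[1:])
        if later - earlier > max_interval
    ]

# Daily cloud-config snapshots with a collection outage Jan 3 to Jan 12.
snapshots = [datetime(2026, 1, d) for d in (1, 2, 3, 12, 13)]
gaps = find_collection_gaps(snapshots)
# One gap is flagged: Jan 3 to Jan 12. That window cannot be backfilled,
# which is why catching the disconnection early matters.
```

Running a check like this as part of a monthly dashboard review surfaces a broken integration in days rather than at fieldwork.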

Manual Evidence

Manual evidence requires human action to produce and upload. No GRC platform can fully automate these activities because they involve judgment, analysis, or coordination that cannot be reduced to an API call.

| Evidence Type | Frequency | What the Auditor Expects |
| --- | --- | --- |
| Access review records | Quarterly | Documented review of user access across critical systems with findings, approvals, and remediation |
| Risk assessment report | Annually | Formal risk register with identified risks, likelihood/impact ratings, and treatment plans |
| Vendor security assessments | Annually per vendor | Assessment records for critical vendors; vendor SOC 2 reports or security certifications |
| Incident response tabletop results | Annually | Exercise documentation including scenario, participants, findings, and action items |
| Penetration test report | Annually | Report from independent testing firm with findings and remediation tracking |
| Business continuity/DR test results | Annually | Test documentation including recovery procedures, actual recovery times, and findings |
| Security awareness training completion | Annually (all employees) | Completion records with employee names and timestamps |
| Policy approval records | Annually | Documented management sign-off on all policies with dates |
| Board/leadership security updates | Quarterly | Meeting minutes or presentation records showing security was discussed at leadership level |
| Background check records | Per employee (onboarding) | Verification that background checks were completed for all employees |

In our experience, manual evidence is the source of most audit exceptions because it depends on human scheduling and follow-through.

Evidence by Control Category

CC1-CC2: Control Environment and Communication

What auditors want:

  • Organizational chart showing security governance structure
  • Evidence of security responsibilities in job descriptions or role definitions
  • Communication records showing security policies were distributed to all employees
  • Policy acknowledgment records (all employees, all policies)
  • Board or leadership meeting minutes discussing security topics

Common gap: We frequently see missing leadership security discussion documentation. Our recommendation is to add a standing security update item to quarterly leadership meetings and document it in meeting minutes. This is low-effort but high-impact.

CC3: Risk Assessment

What auditors want:

  • Formal risk assessment report dated within the observation period
  • Risk register listing identified risks with likelihood, impact, and risk score
  • Treatment plans for each identified risk (accept, mitigate, transfer, avoid)
  • Evidence that the risk assessment methodology was followed consistently
  • Documentation of any changes to risk ratings during the period

Common gap: We see risk assessments completed at the start of the engagement but never updated when new risks emerge. We recommend treating the risk register as a living document and updating it when material changes occur — new services launched, infrastructure changes, vendor changes.

CC5: Control Activities

What auditors want:

  • Documented policies covering all required areas (ten core policies minimum)
  • Evidence that policies were reviewed and approved within the past twelve months
  • Records showing policies were communicated to and acknowledged by all employees
  • Evidence of policy exceptions with documented rationale and approval

Common gap: Policies approved but not acknowledged by all employees. We recommend using your GRC platform's policy distribution workflow and tracking acknowledgment to 100% completion.

CC6: Logical and Physical Access

What auditors want:

  • Quarterly access review records showing systematic review of user access across all critical systems
  • Access provisioning evidence: documented approval before access was granted
  • Deprovisioning evidence: timely access removal for terminated employees (within twenty-four hours)
  • MFA enforcement records showing all users have MFA enabled
  • Password or authentication policy configuration evidence
  • Evidence of least-privilege access implementation
  • Physical security evidence (if applicable): badge access logs, visitor management records

Common gap: Quarterly access reviews missed or not documented. We tell every client the same thing: schedule access reviews on fixed calendar dates (first week of each quarter) and document even when no findings are identified. A clean review is still evidence.
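The fixed-date convention can be computed rather than remembered. This is a minimal sketch, assuming "first week of the quarter" means the first business day of January, April, July, and October (adjust to your own convention):

```python
from datetime import date, timedelta

def quarterly_review_dates(year):
    """First business day of each quarter: the fixed dates on
    which access reviews are scheduled under this convention."""
    dates = []
    for month in (1, 4, 7, 10):
        d = date(year, month, 1)
        while d.weekday() >= 5:  # roll Saturday/Sunday forward to Monday
            d += timedelta(days=1)
        dates.append(d)
    return dates

# quarterly_review_dates(2026) yields Jan 1, Apr 1, Jul 1, and Oct 1,
# all of which fall on weekdays in 2026.
```

Feeding these dates into calendar invitations and GRC platform tasks once a year removes the scheduling failure mode entirely.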

CC7: System Operations

What auditors want:

  • Security monitoring configuration evidence: alerting rules, detection mechanisms
  • Evidence of alert response: logs showing security alerts were investigated and resolved
  • Vulnerability scan results and remediation evidence across the observation period
  • Penetration test report with findings and remediation tracking
  • Incident response plan documentation
  • Incident response tabletop exercise records
  • Actual incident records (if any incidents occurred) with documented response and resolution

Common gap: Vulnerability scan findings not remediated in a timely manner. We recommend establishing remediation SLAs (critical: seventy-two hours, high: thirty days, medium: ninety days) and tracking remediation in your project management tool.
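Tracking those SLAs is easy to encode. The sketch below uses the windows from the text (critical: seventy-two hours, high: thirty days, medium: ninety days); the function and field names are illustrative, not from any particular tool:

```python
from datetime import datetime, timedelta

# SLA windows from the recommendation above; adjust to your own policy.
SLA = {
    "critical": timedelta(hours=72),
    "high": timedelta(days=30),
    "medium": timedelta(days=90),
}

def sla_status(severity, found, remediated=None, now=None):
    """Return 'met', 'breached', or 'open' for a single finding."""
    deadline = found + SLA[severity]
    if remediated is not None:
        return "met" if remediated <= deadline else "breached"
    now = now or datetime.utcnow()
    return "open" if now <= deadline else "breached"

# A critical finding fixed the next day meets its 72-hour SLA;
# a high finding fixed after 37 days breaches its 30-day SLA.
```

Exporting one such status per open finding gives the auditor exactly the remediation-tracking evidence CC7 calls for.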

CC8: Change Management

What auditors want:

  • Evidence of code review requirements: pull request configurations requiring approval
  • Branch protection settings preventing direct production commits
  • Sample of change records showing the change management process was followed: request, approval, testing, deployment
  • Evidence that emergency changes followed an expedited but documented process
  • Release documentation or deployment records

Common gap: Emergency or hotfix deployments that bypass the standard approval process without documentation. In our experience, the fix is straightforward: define an emergency change procedure that requires at minimum a post-deployment review and approval.

CC9: Risk Mitigation (Vendor Management)

What auditors want:

  • Complete vendor inventory listing all third-party vendors that handle customer data or provide critical services
  • Vendor security assessment records for critical vendors
  • Vendor SOC 2 reports, ISO 27001 certificates, or security questionnaire responses
  • Business continuity and disaster recovery test results
  • Backup verification records showing backups are running and recoverable

Common gap: Vendor security documentation not collected or expired. We always advise starting vendor assessments early, sending requests to all critical vendors simultaneously, and following up aggressively with non-responsive vendors. Vendor responsiveness is outside your control, so the earlier you start the better.

Evidence Organization Best Practices

Naming Conventions

Consistent naming makes evidence retrieval during fieldwork dramatically faster. We recommend this format:

[Date]-[Category]-[Description].[ext]
2026-01-15-access-review-q1-production-systems.pdf
2026-03-01-risk-assessment-annual-report.pdf
2026-02-20-vendor-assessment-aws.pdf
2026-01-10-pen-test-report-acme-security.pdf
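The convention can be enforced with a small validator. This sketch assumes lowercase hyphenated tokens after an ISO date, matching the examples above:

```python
import re

# [Date]-[Category]-[Description].[ext]: ISO date, then lowercase
# alphanumeric tokens separated by hyphens, then an extension.
EVIDENCE_NAME = re.compile(
    r"^\d{4}-\d{2}-\d{2}-[a-z0-9]+(?:-[a-z0-9]+)*\.[a-z0-9]+$"
)

def is_valid_evidence_name(filename):
    """True when a filename follows the evidence naming convention."""
    return bool(EVIDENCE_NAME.match(filename))
```

Running the check over an evidence folder before upload catches stragglers like "Access Review Q1.pdf" before they reach the auditor.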

Folder Structure

We advise organizing evidence by control category to match how auditors review:

/CC1-CC2-Control-Environment/
/CC3-Risk-Assessment/
/CC5-Control-Activities/
/CC6-Access-Controls/
/CC7-System-Operations/
/CC8-Change-Management/
/CC9-Risk-Mitigation/

Most GRC platforms organize evidence by control category automatically. If you are supplementing with manual uploads, we recommend following the platform's organizational structure rather than inventing your own.

Evidence Freshness Requirements

| Evidence Type | Freshness Standard | Consequence of Staleness |
| --- | --- | --- |
| Cloud configuration | Real-time or daily | Audit exception if evidence gaps exist during observation period |
| Access reviews | Completed quarterly during observation period | Exception if any quarter is missed |
| Risk assessment | Completed within the past 12 months | Exception if outdated or missing |
| Vulnerability scans | Continuous or monthly during observation period | Exception if significant gaps exist |
| Penetration test | Completed within the past 12 months | Exception if no current test on file |
| Training records | Completed within the past 12 months for all employees | Exception for employees without current training |
| Vendor assessments | Reviewed within the past 12 months per vendor | Exception for critical vendors without current assessment |
| Policy approval | Reviewed and approved within the past 12 months | Exception for stale policies |
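A freshness check against this table is a one-screen script. The register and windows below are illustrative; substitute your own evidence types and your auditor's standards:

```python
from datetime import date, timedelta

# Illustrative freshness windows for a few evidence types.
FRESHNESS = {
    "risk-assessment": timedelta(days=365),
    "pen-test": timedelta(days=365),
    "training": timedelta(days=365),
    "access-review": timedelta(days=92),  # roughly one quarter
}

def stale_evidence(register, today):
    """Given {evidence_type: last_completed_date}, return the types
    whose most recent evidence exceeds its freshness window."""
    return sorted(
        kind for kind, completed in register.items()
        if today - completed > FRESHNESS[kind]
    )

register = {
    "risk-assessment": date(2025, 3, 1),
    "pen-test": date(2025, 11, 10),
    "access-review": date(2025, 10, 5),
    "training": date(2025, 6, 15),
}
# As of 2026-03-15, the risk assessment (over 12 months old) and the
# access review (over a quarter old) are stale; the rest are current.
```

Wiring this into a monthly job gives you the staleness warning months before the auditor would find it.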

Building a Sustainable Evidence Cadence

In our experience, the key to clean Type II audits is establishing a recurring evidence collection schedule that prevents gaps over the six-to-twelve-month observation period. We walk every client through this cadence.

Monthly Tasks

  • Review GRC platform dashboard for any disconnected integrations or failing evidence collection
  • Verify vulnerability scans are running and results are being collected
  • Check for any new employees who need onboarding compliance tasks (training, policy acknowledgment, background check, agent installation)
  • Verify any departing employees were properly deprovisioned

Quarterly Tasks

  • Conduct access review across all critical systems
  • Document leadership security update (meeting minutes or presentation)
  • Review and update vendor inventory if new vendors were added
  • Verify all automated evidence streams are collecting without gaps

Annual Tasks

  • Conduct formal risk assessment and update risk register
  • Complete security awareness training for all employees
  • Review and approve all policies (update if needed)
  • Conduct incident response tabletop exercise
  • Complete penetration test
  • Test business continuity and disaster recovery procedures
  • Conduct or update vendor security assessments for critical vendors

Setting Up Reminders

We recommend configuring your GRC platform to alert you when manual evidence tasks are approaching their due dates. Supplement with calendar invitations for quarterly and annual tasks assigned to the compliance lead. Build a shared compliance calendar visible to all team members who contribute evidence. In our experience, the companies that maintain a clean evidence trail are the ones that treat evidence collection as an ongoing operational discipline, not a pre-audit scramble.
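Computing reminder dates from the cadence is simple enough to script. The cadence map and fourteen-day lead time below are assumptions for illustration:

```python
from datetime import date, timedelta

# Hypothetical cadence map: task -> interval between occurrences.
CADENCE = {
    "integration-health-check": timedelta(days=30),
    "access-review": timedelta(days=91),
    "risk-assessment": timedelta(days=365),
}

def next_due(last_completed, lead_time=timedelta(days=14)):
    """Return {task: reminder_date}, where each reminder fires
    lead_time before the task's next scheduled occurrence."""
    return {
        task: completed + CADENCE[task] - lead_time
        for task, completed in last_completed.items()
    }

reminders = next_due({"access-review": date(2026, 1, 5)})
# Reminder fires two weeks before the next quarterly review is due.
```

The output is easy to push into calendar invitations or GRC platform tasks so the lead time stays consistent across owners.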

Key Takeaways

  • We consistently see that GRC platforms automate seventy to eighty percent of evidence collection; the remaining twenty to thirty percent requires manual effort on a defined schedule, and that manual portion is where audits stall
  • In our experience, access reviews, risk assessments, vendor assessments, incident response testing, and penetration testing are the most critical manual evidence categories to stay on top of
  • We advise treating quarterly access reviews as non-negotiable calendar events — they are the most commonly missed manual evidence task and the most frequent source of audit exceptions
  • We recommend organizing evidence by control category with consistent naming conventions and proper date stamping from day one
  • Freshness requirements vary by evidence type: cloud configurations need real-time collection; annual activities (risk assessment, pen test, training) must be completed within twelve months
  • We tell every client to build a monthly, quarterly, and annual evidence cadence to prevent gaps during the Type II observation period
  • Integration disconnections are the most common cause of automated evidence gaps — we recommend monitoring your GRC platform dashboard monthly at minimum

Frequently Asked Questions

What is the most common evidence gap that delays SOC 2 audits?

What we tell clients is that missed quarterly access reviews are the single most common evidence gap we encounter. Organizations complete the first access review during implementation but forget to conduct reviews in subsequent quarters — we see this particularly during the second and third quarters of a Type II observation period. The second most common gap is vendor security assessments: critical vendors without current SOC 2 reports or completed security questionnaires on file. In our experience, both gaps are entirely preventable with calendar-based scheduling and GRC platform task reminders.

How detailed do access review records need to be?

What we tell clients is that auditors expect to see the following for each access review: the list of systems reviewed, the users who had access at the time of review, the reviewer's assessment of whether each user's access was appropriate, any findings of inappropriate access, and evidence that inappropriate access was remediated (removed or modified). A spreadsheet or GRC platform export showing user-by-system access with reviewer approval stamps is the standard format. We always emphasize that simply stating an access review was conducted is insufficient — the actual review records must be available.

Can I collect evidence retroactively for the observation period?

What we tell clients is no — evidence must be collected contemporaneously, meaning at the time the control was operating. You cannot recreate or backfill evidence for a period that has already passed. If your GRC platform was disconnected from AWS for two weeks during the observation period, that two-week gap in cloud configuration evidence cannot be filled after the fact. The auditor will document it as an exception. This is why we stress the importance of monitoring integration health throughout the observation period.

How should we handle evidence for controls that were changed mid-observation period?

In our advisory work, we see this scenario regularly — for example, a client migrates from one cloud provider to another mid-observation period. The auditor expects evidence from both the old and new environments for their respective periods. We recommend documenting the change date, the reason for the change, and the evidence trail from each environment. We also advise communicating control changes to your auditor as early as possible so they can plan their testing accordingly.

Do auditors accept screenshots as evidence?

What we tell clients is that screenshots are acceptable for certain evidence types, particularly configuration settings that cannot be exported in structured format. However, auditors prefer structured exports (CSV, JSON, PDF reports) over screenshots because they are harder to fabricate and provide more detail. If you must use screenshots, we recommend ensuring they include timestamps, the system URL or identifier, and enough context to verify what is being shown. GRC platform automated evidence is always preferred over manual screenshots because the platform provides tamper-resistant collection with metadata.
