A11Y-003 recommended wcag-compliance

Screen reader testing for critical paths

Automated tools such as axe catch static markup issues but miss interaction and flow problems. Screen reader testing validates the real user experience on critical paths like signup and checkout.

Question to ask

"When did someone last test checkout with a screen reader?"

Verification guide

Severity: Recommended (B2C apps), Optional (internal tools)

Check automatically:

# Look for manual testing docs/checklists
grep -riE --include="*.md" "screen reader|nvda|voiceover|jaws|narrator|manual.*test" docs/ 2>/dev/null

# Check for testing checklists or QA docs
find . -maxdepth 4 -type f \( -name "*qa*" -o -name "*testing*" -o -name "*checklist*" \) -name "*.md" -print0 2>/dev/null | xargs -0 grep -liE "screen reader|a11y" /dev/null 2>/dev/null
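The two checks above can be rolled into one reusable evidence signal. A minimal sketch, assuming docs live in a directory you pass in (the keyword list is an assumption; extend it to match your repo's conventions):

```shell
# Count markdown files under a given directory that mention
# screen reader testing (hypothetical helper; keywords are assumptions).
count_a11y_testing_docs() {
  grep -rliE --include="*.md" \
    "screen reader|nvda|voiceover|jaws|narrator" \
    "$1" 2>/dev/null | wc -l
}
```

Usage: `count_a11y_testing_docs docs/`. A result of 0 is not an automatic fail, only a signal that there is no written evidence and you should ask the team directly.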

Ask user:

  • "Are critical user paths tested with a screen reader?" (signup, checkout, core flows)
  • "Who does this testing?" (dedicated QA, developers, external audit)
  • "How often?" (every release, quarterly, once)
  • "Which screen readers?" (VoiceOver, NVDA, JAWS)

Pass criteria:

  • Critical paths have been tested with at least one screen reader
  • Testing recurs on a defined cadence (not just once at launch)
  • Issues found are tracked and fixed

Fail criteria:

  • Never tested with screen reader
  • "We assume axe catches everything" (it doesn't)
  • Tested once years ago, never since

Evidence to capture:

  • Screen readers used
  • Testing cadence
  • Who performs testing
  • Last test date
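If the team keeps a testing log, the "last test date" evidence can be pulled out mechanically. A sketch assuming ISO dates appear somewhere in the log file (the log path and date format are assumptions, not a requirement of this check):

```shell
# Print the most recent ISO-format date found in a testing log
# (hypothetical helper; assumes YYYY-MM-DD dates in the file).
latest_test_date() {
  grep -rhoE "[0-9]{4}-[0-9]{2}-[0-9]{2}" "$1" 2>/dev/null | sort -r | head -n1
}
```

Usage: `latest_test_date docs/a11y-testing.md`. Empty output means no date is on record, which is itself a fail signal; a date years in the past matches the "tested once, never since" fail criterion.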

Section

41. Accessibility

Team & Development