Section 41 · Team & Development
Accessibility
Accessibility practices for user-facing applications
This guide walks you through auditing a project's accessibility practices for user-facing applications.
The Goal: Accessible by Default
Accessibility isn't a feature — it's a quality attribute. The goal is:
- Standards-based — Working toward a defined target (WCAG 2.1 AA)
- Automated — Catching issues in CI before they ship
- Tested — Manual validation for what automation misses
- Tracked — Known gaps visible and prioritized
Accessibility debt compounds like technical debt. This guide verifies the practices that prevent it.
Cross-references:
- Section 8 (Testing & Code Metrics) — Testing infrastructure
- Section 14 (Documentation) — Documentation practices
- Section 22 (Front-End Performance) — Lighthouse overlaps
- Section 40 (Technical Debt Tracking) — Same patterns for a11y debt
Before You Start
- Confirm this applies — This section is for user-facing applications (web apps, mobile apps, customer-facing sites). Internal tools, CLIs, and backend services can skip this audit.
- Identify the tech stack — React, Vue, vanilla HTML affects which patterns to look for
- Check for compliance requirements — Legal (ADA, EAA) or contractual obligations change severity levels
WCAG Compliance
An explicit accessibility target gives the team something to measure against. Without it, "accessible enough" is undefined.
“What's your accessibility target, and who set it?”
Automated testing catches roughly 30-40% of accessibility issues before they ship. Integrate tools like axe-core, pa11y, or Lighthouse CI into the build pipeline so violations fail the build.
“Would a missing label on a form input ship undetected?”
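To make the question concrete, here is a minimal sketch of the kind of static check a CI gate runs, using only Python's stdlib html.parser. The helper names are illustrative, and a real pipeline would run axe-core or pa11y instead; this only shows how a missing form label becomes a build failure rather than a shipped bug.

```python
from html.parser import HTMLParser

class LabelCheck(HTMLParser):
    """Collect <label for="..."> targets and the inputs that need them."""
    def __init__(self):
        super().__init__()
        self.label_for = set()   # ids referenced by <label for="...">
        self.inputs = []         # (input id or None, has its own labelling)

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "label" and "for" in a:
            self.label_for.add(a["for"])
        elif tag == "input" and a.get("type") not in ("hidden", "submit", "button"):
            labelled = "aria-label" in a or "aria-labelledby" in a
            self.inputs.append((a.get("id"), labelled))

def unlabelled_inputs(html: str) -> int:
    """Count inputs with no label association at all.

    Sketch only: implicit labelling (<label><input></label>) is not handled.
    """
    checker = LabelCheck()
    checker.feed(html)
    return sum(
        1 for input_id, labelled in checker.inputs
        if not labelled and input_id not in checker.label_for
    )
```

Wired into CI as a script that exits nonzero when the count is positive, this is the difference between "detected in the pipeline" and "reported by a user."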
Automated tools miss interaction and flow issues. Screen reader testing validates the real user experience for critical paths like signup and checkout.
“When did someone last test checkout with a screen reader?”
Core flows work with keyboard only. Focus is visible, focus order is logical, skip links exist, and modals trap focus correctly.
“Tab through your app — where does focus go invisible?”
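Most keyboard issues need manual tabbing to find, but one smell is cheap to catch statically: positive tabindex values, which override the natural DOM focus order and almost always produce an illogical tab sequence. A hedged sketch (illustrative names, stdlib only):

```python
from html.parser import HTMLParser

class TabindexCheck(HTMLParser):
    """Flag elements whose tabindex is greater than zero."""
    def __init__(self):
        super().__init__()
        self.positive = []  # (tag, tabindex) pairs to report

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        try:
            value = int(a.get("tabindex", "0"))
        except ValueError:
            return
        if value > 0:  # tabindex="0" and "-1" are fine; positive values are not
            self.positive.append((tag, value))

def positive_tabindexes(html: str) -> list:
    checker = TabindexCheck()
    checker.feed(html)
    return checker.positive
```

Anything this reports is worth a manual tab-through: positive values force those elements to the front of the focus order regardless of where they appear on screen.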
Testing
Major releases get a thorough accessibility review beyond what CI catches. Blocking issues prevent release.
“Last major release — did accessibility get reviewed?”
Color choices validated against WCAG contrast requirements (4.5:1 for normal text, 3:1 for large text). Automated tools or a design system with pre-validated colors.
“Does your gray placeholder text pass 4.5:1 contrast?”
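The contrast ratio itself is a straightforward calculation defined by WCAG 2.x, so it is easy to validate in code or in design tooling. A sketch of the spec's relative-luminance and contrast-ratio formulas:

```python
def _channel(c8: int) -> float:
    """Linearize one sRGB channel (0-255) per the WCAG definition."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb: tuple) -> float:
    """WCAG relative luminance of an (r, g, b) color."""
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    """Contrast ratio between two colors, from 1.0 up to 21.0."""
    lighter, darker = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# A typical light-gray placeholder on white, e.g. #AAAAAA, comes out
# around 2.3:1 -- well below the 4.5:1 AA threshold for normal text.
```

This is why gray placeholder text is the canonical contrast failure: it looks subtle in a mockup and fails AA by a factor of two.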
An ESLint rule (e.g., jsx-a11y/alt-text in React codebases) enforces the alt attribute on images. Meaningful alt text on content images, empty alt="" for decorative images.
“What does a screen reader say about your hero image?”
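The distinction that matters here is missing versus empty: an absent alt attribute is a failure, while an explicit alt="" is the correct way to mark a decorative image. A minimal stdlib sketch of that check (names are illustrative; in a JSX codebase the ESLint rule does this at lint time):

```python
from html.parser import HTMLParser

class AltCheck(HTMLParser):
    """Count <img> tags that have no alt attribute at all."""
    def __init__(self):
        super().__init__()
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        # alt="" is present in attrs, so decorative images pass;
        # only a truly absent attribute is counted.
        if tag == "img" and "alt" not in dict(attrs):
            self.missing_alt += 1

def images_missing_alt(html: str) -> int:
    checker = AltCheck()
    checker.feed(html)
    return checker.missing_alt
```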
All form inputs have associated labels. Error messages connected to inputs via aria-describedby. ARIA used correctly, supplementing semantic HTML rather than replacing it.
“Error on a form field — does a screen reader announce it?”
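A common way this silently breaks is a dangling reference: aria-describedby points at an id that no longer exists, so the screen reader announces nothing. A hedged sketch that verifies every referenced id resolves (illustrative names, stdlib only):

```python
from html.parser import HTMLParser

class DescribedByCheck(HTMLParser):
    """Collect all element ids and all ids referenced by aria-describedby."""
    def __init__(self):
        super().__init__()
        self.ids = set()
        self.referenced = set()

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if "id" in a:
            self.ids.add(a["id"])
        if "aria-describedby" in a:
            # The attribute may hold a space-separated list of ids.
            self.referenced.update(a["aria-describedby"].split())

def dangling_describedby(html: str) -> set:
    """Return referenced ids that do not exist in the document."""
    checker = DescribedByCheck()
    checker.feed(html)
    return checker.referenced - checker.ids
```

An empty result means every error message is at least wired up; whether it is announced at the right moment still needs a screen reader pass.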
Documentation
Developers have documented guidance on building accessibly. Component-level a11y guidelines, pattern library, or contributing guide.
“Where do devs look up how to build an accessible modal?”
Dedicated label or tag for a11y issues. Known issues logged and visible, not just remembered. Issues from audits get captured.
“How many a11y issues have been open for 6+ months?”
Known gaps have a prioritized plan with ownership and timelines. Progress tracked over time, not just an unprioritized backlog.
“Known a11y gaps — who owns fixing them and by when?”