Overview
Accessibility testing is a critical part of ensuring digital products are usable by everyone, including people with disabilities. This comprehensive guide covers best practices for testing web applications, mobile apps, and digital content for accessibility compliance and usability.
Effective accessibility testing requires a combination of automated tools, manual testing procedures, screen reader testing, keyboard navigation testing, and most importantly, testing with real users with disabilities. No single testing method can catch all accessibility issues – a multi-faceted approach is essential.
Key Principles of Accessibility Testing
- Test early and often: Integrate accessibility testing throughout the development lifecycle
- Use multiple methods: Combine automated tools with manual testing
- Test with real users: Include people with disabilities in testing
- Test across platforms: Verify accessibility in different browsers, devices, and assistive technologies
- Document everything: Keep detailed records of testing procedures and findings
- Continuous improvement: Treat accessibility as an ongoing process, not a one-time effort
Why Accessibility Testing Matters
Accessibility testing is essential for several compelling reasons:
Legal and Compliance
- Regulatory Requirements: Many jurisdictions require WCAG 2.1 Level AA compliance or equivalent standards
- Section 508: US federal agencies must ensure accessibility
- European Accessibility Act: EU member states must meet accessibility requirements
- ADA Compliance: US courts have increasingly applied Title III to websites and mobile apps
- Litigation Risk: Accessibility lawsuits are increasing globally
Business Benefits
- Larger Market Reach: 15% of the world's population has some form of disability
- Better SEO: Accessible websites often rank better in search results
- Improved Usability: Accessibility improvements benefit all users
- Brand Reputation: Demonstrates commitment to inclusion
- Cost Savings: Finding issues early is cheaper than fixing them later
User Experience
- Screen Reader Users: Enable access for blind and low-vision users
- Keyboard Users: Support people who cannot use a mouse
- Cognitive Disabilities: Ensure content is understandable
- Hearing Impairments: Provide captions and transcripts
- Situational Disabilities: Help users in challenging environments
Testing Methodology
A comprehensive accessibility testing strategy requires structured methodologies that integrate testing throughout the development lifecycle.
Shift-Left Testing
Shift-left testing means testing earlier in the development process, ideally starting from design and continuing throughout development.
Benefits of Shift-Left
- Early Detection: Catch issues before they are coded
- Lower Costs: Fixing bugs early is exponentially cheaper
- Better Architecture: Accessibility is built in from the start, not added later
- Team Alignment: Everyone understands accessibility requirements from the beginning
Implementation
- Design Review: Review wireframes and designs for accessibility issues
- Component Library: Test components in isolation during development
- Code Linting: Use eslint-plugin-jsx-a11y or similar tools
- Unit Tests: Write accessibility unit tests with jest-axe or similar
- Integration Tests: Test accessibility in end-to-end scenarios
The Testing Pyramid
The accessibility testing pyramid shows the distribution of tests across different levels:
Base: Automated Tests (Many)
- Linters: eslint-plugin-jsx-a11y, axe-linter
- Unit Tests: jest-axe, Testing Library with accessibility queries
- Integration Tests: axe-core in Cypress/Playwright
- CI/CD Checks: Automated accessibility testing on every build
Middle: Manual Tests (Some)
- Keyboard Navigation: Test all functionality keyboard-only
- Screen Reader Testing: NVDA, JAWS, VoiceOver checks
- Browser Dev Tools: Inspect accessibility tree and ARIA attributes
- Color Contrast: Manual verification of contrast ratios
Top: User Testing (Few)
- Assistive Tech Users: Testing with real users with disabilities
- Usability Studies: Observe users interacting with your product
- Accessibility Audits: Professional review by experts
Automated Testing
Automated tools are the foundation of accessibility testing. They typically catch only around 30-40% of accessibility issues, but they are fast and repeatable and should be integrated into CI/CD pipelines.
Browser Extensions
- axe DevTools:
- Available for Chrome, Firefox, Edge
- Runs comprehensive accessibility checks
- Provides detailed issue reports with severity
- Shows WCAG conformance level
- Offers remediation guidance
- WAVE:
- Browser extension by WebAIM
- Visual overlay of accessibility errors
- Shows structural elements
- Identifies ARIA issues
- Free to use
- Lighthouse:
- Integrated into Chrome DevTools
- Accessibility scoring
- Performance and SEO checks
- Automatable via CLI
- Part of Chrome developer tools
JavaScript Libraries
axe-core (for Jest/React Testing Library):
```javascript
import { render } from '@testing-library/react';
import { axe, toHaveNoViolations } from 'jest-axe';

expect.extend(toHaveNoViolations);

test('Button should have no accessibility violations', async () => {
  const { container } = render(<Button>Click me</Button>);
  const results = await axe(container);
  expect(results).toHaveNoViolations();
});
```
pa11y (Command Line Tool):
```shell
# Run from the command line
pa11y https://example.com
```

Or programmatically in Node.js:

```javascript
const pa11y = require('pa11y');

async function testAccessibility() {
  const results = await pa11y('https://example.com', {
    standard: 'WCAG2AA',
    runners: ['axe', 'htmlcs']
  });
  console.log(results.issues);
}

testAccessibility();
```
Linters
eslint-plugin-jsx-a11y (for React):
In .eslintrc.json:

```json
{
  "extends": ["plugin:jsx-a11y/recommended"],
  "plugins": ["jsx-a11y"],
  "rules": {
    "jsx-a11y/alt-text": "error",
    "jsx-a11y/anchor-has-content": "error",
    "jsx-a11y/aria-props": "error",
    "jsx-a11y/aria-role": "error",
    "jsx-a11y/heading-has-content": "error",
    "jsx-a11y/no-autofocus": "error"
  }
}
```
CI/CD Integration
Automated accessibility testing should be part of your continuous integration pipeline:
GitHub Actions Example
```yaml
name: Accessibility Testing

on: [push, pull_request]

jobs:
  a11y-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Setup Node
        uses: actions/setup-node@v3
        with:
          node-version: '18'
      - name: Install dependencies
        run: npm ci
      - name: Run accessibility tests
        run: npm run test:a11y
      - name: Run pa11y on build
        run: |
          npm run build
          # pa11y-ci needs the built site served locally; this assumes a static build in ./build
          npx serve -s build -l 3000 &
          sleep 5
          npx pa11y-ci --sitemap http://localhost:3000/sitemap.xml
```
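For multi-page runs, pa11y-ci can also read its configuration from a .pa11yci JSON file; a minimal sketch (the URLs and values are placeholders for your own site):

```json
{
  "defaults": {
    "standard": "WCAG2AA",
    "runners": ["axe", "htmlcs"],
    "timeout": 30000
  },
  "urls": [
    "http://localhost:3000/",
    "http://localhost:3000/checkout"
  ]
}
```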
Limitations of Automated Testing
While automated tools are invaluable, they have significant limitations:
Automated Tools Cannot Detect:
- Alt Text Quality: Tools can detect missing alt attributes but not whether the text is meaningful
- Logical Tab Order: They can check tabindex but not whether the order makes sense
- ARIA Correctness: Tools validate syntax but not whether ARIA is used appropriately
- Content Comprehension: Cannot assess whether content is clear and understandable
- Screen Reader Experience: Cannot simulate the actual user experience
- Keyboard Traps: Cannot automatically detect if focus is trapped
- Contextual Appropriateness: Cannot judge if accessibility features are appropriate in context
Important: Automated testing is a starting point, not a replacement for manual testing and user testing.
Manual Testing
Manual testing is essential to identify issues that automated tools miss. This includes keyboard navigation, screen reader testing, and visual inspections.
Keyboard Navigation Testing
Testing keyboard accessibility ensures that all functionality is available without a mouse.
Testing Procedure
- Disconnect your mouse and rely solely on the keyboard
- Use Tab to navigate forward, Shift + Tab to navigate backward
- Use Enter or Space to activate buttons and links
- Use arrow keys for menus, sliders, and custom components
- Test all interactions: forms, modals, dropdowns, carousels
Test For:
- Focus Visibility: Is it always clear which element has focus?
- Logical Order: Does focus follow a sensible path?
- No Keyboard Traps: Can you navigate out of all components?
- Skip Links: Can you skip repetitive navigation?
- Custom Controls: Do custom widgets support proper keyboard navigation?
Common Issues
- Missing or removed focus indicators (e.g., outline: none)
- Incorrect tab order due to CSS layout
- Focus management issues in modals and single-page apps
- Inaccessible custom widgets (dropdowns, sliders, etc.)
- Keyboard traps in embedded content or modals
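Several of these issues come down to focus-index bookkeeping. A minimal, framework-free sketch of wrap-around focus movement for a modal or custom widget (the function name is illustrative, not a library API):

```javascript
// Compute the next index in a list of focusable elements, wrapping at
// both ends so Tab/Shift+Tab can never trap the keyboard user.
function nextFocusIndex(current, count, backward) {
  if (count === 0) return -1;              // nothing focusable
  const step = backward ? -1 : 1;
  return (current + step + count) % count; // wrap around
}
```

In a real focus trap, `current` would be the position of `document.activeElement` within the modal's focusable descendants, and the result would be used to call `.focus()` on the next element.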
Screen Reader Testing Basics
Screen reader testing is critical to understanding how blind and low-vision users experience your site.
Which Screen Readers to Use
- NVDA (Windows, Free): Popular for testing, open source
- JAWS (Windows, Commercial): Widely used in professional settings
- VoiceOver (macOS/iOS, Built-in): Default for Apple devices
- TalkBack (Android, Built-in): Default for Android devices
Screen Reader Basics
To test with screen readers:
- Start the screen reader and learn basic commands
- Navigate by headings (H key in NVDA/JAWS)
- List landmarks (D for regions)
- Navigate through links and forms
- Listen to the entire page linearly
- Test all interactive components
Visual Accessibility Testing
Visual testing ensures content is perceivable and understandable for users with vision impairments.
Visual Testing Checklist
- Color Contrast: Text has at least 4.5:1 contrast (3:1 for large text)
- Not Color Alone: Information is not conveyed by color alone
- Text Size: Text is scalable to 200% without loss of content
- Spacing: Adequate spacing between interactive elements (minimum 44x44px target size)
- Focus Styles: Clear focus styles with sufficient contrast
- Responsive Design: Layout works at different zoom levels
WCAG Compliance Testing
The Web Content Accessibility Guidelines (WCAG) 2.1 provide a comprehensive framework for testing accessibility. Level AA is the most commonly targeted standard.
1. Perceivable
Information and user interface components must be presentable to users in ways they can perceive.
Text Alternatives
- All images have meaningful alt text or are marked as decorative
- Complex graphics have detailed descriptions
- Form buttons and inputs have descriptive labels
- Icon buttons have accessible names
Time-Based Media
- Videos have captions for deaf users
- Audio has transcripts
- Videos have audio descriptions for important visual information
- Live audio has live captions
Adaptable
- Content can be presented in different ways without losing information
- Correct semantic HTML (headings, landmarks, lists)
- Reading order is logical
- Instructions don't rely solely on sensory characteristics
Distinguishable
- Color contrast meets WCAG minimum requirements (4.5:1 for normal text)
- Audio that plays automatically for more than 3 seconds can be paused, stopped, or have its volume controlled
- Text can be scaled to 200% without loss of content
- Images of text are only used for decoration or when essential
2. Operable
User interface components and navigation must be operable.
Keyboard Accessible
- All functionality is available via keyboard
- No keyboard traps
- Keyboard shortcuts can be disabled or remapped
- Focus order is logical and intuitive
Enough Time
- Time limits can be extended or turned off
- Moving, blinking, or scrolling can be paused
- Auto-updates can be delayed or stopped
Seizures and Physical Reactions
- Nothing flashes more than 3 times per second
- No content known to cause seizures
- Animations can be disabled (prefers-reduced-motion)
Navigable
- Skip links to bypass repetitive content
- Pages have descriptive titles
- Focus order preserves meaning
- Link purpose is clear from text or context
- Multiple navigation methods available (menu, search, sitemap)
- Headings and labels are descriptive
- Focus is always visible
Input Modalities
- Functionality using pointer events has keyboard alternatives
- Multipoint or path-based gestures have single-point alternatives
- Touch targets are at least 44x44 CSS pixels (Level AAA in WCAG 2.1; WCAG 2.2 adds a 24x24 Level AA minimum)
- Labels are included in accessible names
3. Understandable
Information and the operation of user interface must be understandable.
Readable
- Page language is programmatically set
- Language changes are marked
- Unusual words have definitions
- Abbreviations are expanded
Predictable
- Focus does not trigger context changes
- Input does not trigger unexpected context changes
- Navigation is consistent across pages
- Components are consistently identified
Input Assistance
- Errors are automatically detected and described
- Labels or instructions are provided for inputs
- Error suggestions are provided
- Undo, confirmation, or verification for legal/financial transactions
4. Robust
Content must be robust enough to be interpreted by a wide variety of user agents, including assistive technologies.
Compatible
- HTML is valid and well-formed
- No duplicate IDs
- No ARIA errors in validation
- Status messages are programmatically determinable
- Name, role, value available for all UI components
Screen Reader Testing
Comprehensive screen reader testing requires familiarity with multiple screen readers and browser combinations.
NVDA Testing (Windows)
NVDA is a free, open-source screen reader for Windows and an excellent tool for testing.
Setup
- Download NVDA from nvaccess.org
- Install or run the portable version
- Learn basic keyboard commands (the NVDA key is usually Insert or Caps Lock)
- Use Firefox or Chrome for testing
Essential NVDA Commands
- NVDA + Space: Toggle between browse and focus mode
- H: Next heading
- D: Next landmark
- K: Next link
- B: Next button
- F: Next form field
- Insert + F7: Elements list (headings, links, landmarks)
- NVDA + Down Arrow: Read next item
- NVDA + Shift + Down Arrow: Read all from here
NVDA Testing Checklist
- Navigate through all headings – is the structure logical?
- Check landmarks – are regions properly identified?
- Test all form fields – do they have correct labels?
- Activate all buttons and links – is their purpose clear?
- Verify images – is alt text meaningful?
- Test modals and dialogs – is focus managed correctly?
- Check live regions – are dynamic updates announced?
- Test custom widgets – does ARIA work correctly?
JAWS Testing (Windows)
JAWS is the most widely used commercial screen reader. While paid, a 40-minute demo version is available for testing.
Essential JAWS Commands
- H: Next heading
- R: Next region/landmark
- T: Next table
- G: Next graphic
- F: Next form field
- Insert + F6: Headings list
- Insert + F7: Links list
- Insert + Down Arrow: Read all
VoiceOver Testing (macOS/iOS)
VoiceOver is the built-in screen reader for macOS and iOS.
VoiceOver macOS Commands
- Cmd + F5: Turn VoiceOver on/off
- VO + A: Start reading
- VO + Right/Left Arrow: Next/previous item
- VO + U: Rotor (navigate by headings, links, etc.)
- VO + Cmd + H: Next heading
- VO + Space: Activate item
- Ctrl: Stop reading
iOS VoiceOver Testing
- Triple-click Home/Power: Turn VoiceOver on/off
- Swipe right/left: Next/previous item
- Double-tap: Activate
- Rotor: Rotate two fingers to change navigation mode
- Three-finger swipe: Scroll through page
Browser & Device Testing
Accessibility can vary across browsers and devices. Test across multiple combinations.
Recommended Testing Matrix
| Browser | Platform | Screen Reader | Priority |
| ------- | -------- | ------------- | -------- |
| Chrome  | Windows  | NVDA          | High     |
| Firefox | Windows  | NVDA          | High     |
| Chrome  | Windows  | JAWS          | Medium   |
| Safari  | macOS    | VoiceOver     | High     |
| Safari  | iOS      | VoiceOver     | High     |
| Chrome  | Android  | TalkBack      | Medium   |
| Edge    | Windows  | Narrator      | Low      |
Device-Specific Testing
- Desktop: Test in common desktop browsers at different screen sizes
- Tablet: Test both portrait and landscape orientations
- Mobile: Test on both iOS and Android devices
- Responsive: Test all breakpoints from 320px to 1920px+
Common Browser-Specific Issues
- Safari: Focus management issues in single-page apps
- Firefox: ARIA live region support differs
- Edge: Sometimes inconsistent keyboard navigation
- Mobile browsers: Touch target size requirements
Mobile Accessibility Testing
Mobile accessibility testing requires understanding platform-specific accessibility features and testing on real devices.
iOS Accessibility Testing
- Enable VoiceOver in Settings → Accessibility → VoiceOver
- Test with VoiceOver gestures (swipe, double-tap, rotor)
- Check Dynamic Type support (text sizing)
- Test with Voice Control for hands-free operation
- Verify Reduce Motion support
- Test with Display Accommodations (color filters, increased contrast)
Android Accessibility Testing
- Enable TalkBack in Settings → Accessibility → TalkBack
- Test with TalkBack gestures
- Verify font size scaling
- Test with Switch Access for external switches
- Check high contrast mode
- Test Voice Access for voice control
Mobile-Specific Considerations
- Touch Targets: Minimum 44x44 CSS pixels (48x48 preferred)
- Spacing: Adequate space between interactive elements
- Orientation: Support both portrait and landscape
- Gestures: Provide alternatives to complex gestures
- Form Fields: Use correct input types for better mobile keyboards
- Focus Management: Manage focus correctly in single-page apps
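The touch-target minimum above is easy to verify mechanically in component tests once you have measured dimensions; a minimal sketch (the 44px default follows the guidance above, Android's Material guidance recommends 48):

```javascript
// Illustrative check against a minimum touch-target size in CSS pixels.
// 44x44 follows WCAG 2.5.5 and Apple's guidance; pass 48 for Android.
function meetsTouchTarget(widthPx, heightPx, minPx = 44) {
  return widthPx >= minPx && heightPx >= minPx;
}
```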
Color Contrast Testing
Color contrast testing ensures text and UI elements are readable for users with vision impairments.
WCAG Contrast Requirements
- Level AA: 4.5:1 for normal text (under 24px)
- Level AA: 3:1 for large text (24px+ or 19px+ bold)
- Level AAA: 7:1 for normal text
- Level AAA: 4.5:1 for large text
- UI Components: 3:1 for UI components and graphical objects
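These ratios come from the WCAG 2.x relative-luminance formula, which is straightforward to implement if you want contrast checks in your own unit tests; a sketch (the function names are ours, not a standard API):

```javascript
// WCAG 2.x contrast ratio from two sRGB colors given as [r, g, b] in 0-255.
function channel(c) {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function luminance([r, g, b]) {
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05); // black on white yields the maximum, 21:1
}

// Level AA pass/fail, using the large-text thresholds listed above
// (24px, or roughly 18.66px when bold).
function meetsAA(ratio, fontPx, bold) {
  const large = fontPx >= 24 || (bold && fontPx >= 18.66);
  return ratio >= (large ? 3 : 4.5);
}
```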
Contrast Testing Tools
- WebAIM Contrast Checker: Web-based tool for quick checks
- Colour Contrast Analyser: Desktop app with eyedropper
- Browser DevTools: Chrome/Firefox have built-in contrast checkers
- axe DevTools: Checks contrast automatically
- WAVE: Shows contrast issues visually on page
Color Blindness Testing
Also test for different types of color blindness:
- Red-Green Deficiencies (protanopia, deuteranopia, and their milder anomalous forms): By far the most common, affecting roughly 8% of men combined
- Tritanopia (Blue-Yellow): Rare
- Tools: Color Oracle, Sim Daltonism, Chrome DevTools "Emulate vision deficiencies"
Color Contrast Checklist
- All text meets minimum contrast ratios
- UI component boundaries have sufficient contrast
- Focus indicators are visible at 3:1 against background
- Information is not conveyed by color alone
- Links are identifiable without color
- Error indicators don't rely only on color
Assistive Technology Testing
Beyond screen readers, test for various assistive technologies.
Screen Magnification
- Windows Magnifier: Built into Windows
- macOS Zoom: Built into macOS
- ZoomText: Commercial magnifier with additional features
- Testing Procedure: Zoom to 200-400% and check readability
- What to Check: Text remains readable, layouts don't break, horizontal scrolling is minimal
Voice Control
- Dragon NaturallySpeaking: Commercial speech recognition
- Windows Speech Recognition: Built into Windows
- macOS Dictation: Built into macOS
- Voice Control (iOS/macOS): Advanced voice control
- Testing Procedure: Navigate and interact using voice commands only
Switch Control Devices
- Switch Access (Android): For external switches
- Switch Control (iOS): For external switches
- Testing Procedure: Navigate using single switch or scanning mode
- What to Check: All functionality is reachable with switch access
User Testing with People with Disabilities
Testing with real users with disabilities is the most valuable form of accessibility testing. It uncovers issues no other testing method can find.
Recruiting Participants
- Accessibility Testing Firms: Companies specializing in accessibility usability testing
- Disability Organizations: Partner with local disability advocacy organizations
- User Research Platforms: UserTesting.com, Respondent.io with accessibility screening
- Community Outreach: Reach out to accessibility communities online
- Compensation: Pay participants fairly, often higher rates for specialized testing
Accessibility Participant Diversity
- Screen Reader Users: Blind and low-vision users
- Keyboard-Only Users: Motor disabilities
- Magnification Users: Low vision
- Voice Control Users: Motor disabilities
- Cognitive Disabilities: Learning or neurological differences
- Deaf/Hard of Hearing: For video/audio content
User Testing Best Practices
- Test Early and Often: Not just at the end of the project
- Focus on Experience: Don't just ask "is this accessible?" but "how is the experience?"
- Observe, Don't Intervene: Let users find issues without assistance
- Document Everything: Record sessions (with permission)
- Ask Why: Understand not just what didn't work but why
- Test Realistic Scenarios: Not just checklist compliance
What to Test with Users
- Critical User Flows: Registration, checkout, main features
- Complex Components: Custom widgets, data visualizations
- Forms: Especially important for e-commerce and applications
- Navigation: Can the user find what they need?
- Content: Is content understandable and well-structured?
Accessibility Bug Reporting
Effective bug reports help developers understand and fix accessibility issues quickly.
Writing Accessibility Bugs
A good accessibility bug report should include:
Clear Title
- Good: "Button missing accessible name for screen reader"
- Bad: "Accessibility issue on homepage"
Severity
- Critical: Blocks main functionality
- High: Difficult to use but workarounds possible
- Medium: Confusing or frustrating
- Low: Cosmetic or minor improvement
WCAG Reference
- Which WCAG success criterion is violated
- Level (A, AA, or AAA)
- Link to WCAG documentation
User Impact
- Which users are affected
- Which assistive technology is affected
- What users cannot do
Reproduction Steps
- Browser and version
- Screen reader/assistive technology if applicable
- Step-by-step reproduction instructions
- Actual vs. expected behavior
Test Environment
- Operating system and version
- Browser and version
- Screen reader and version (if applicable)
- Device (desktop, tablet, mobile)
Expected Behavior
- What should happen
- How it should behave for assistive technology users
- How other sites/apps handle this correctly
Suggested Fix
- Code suggestion if possible
- Link to documentation
- Reference to ARIA patterns if applicable
Screenshots/Recordings
- Annotated screenshot
- Screen reader recording
- Video of the issue
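The fields above can be collected into a reusable issue template; a minimal sketch in which every concrete value (page, versions, suggested fix) is illustrative:

```markdown
## [A11y] Button missing accessible name for screen readers

**Severity:** High | **WCAG:** 4.1.2 Name, Role, Value (Level A)

**User impact:** Screen reader users hear only "button" and cannot
tell what the control does.

**Steps to reproduce:**
1. Open the checkout page with NVDA + Chrome on Windows
2. Tab to the cart icon button
3. Listen to the announcement

**Actual:** Announced as "button".
**Expected:** Announced as "View cart, button".

**Suggested fix:** Add `aria-label="View cart"` or visually hidden text.
```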
Bug Tracking Tools
- JIRA: Use accessibility labels and custom fields for WCAG criteria
- GitHub Issues: Use accessibility labels and templates
- Dedicated Tools: Some organizations use specialized accessibility trackers
- Spreadsheets: For smaller projects or initial audits
Issue Prioritization
Critical (Fix Immediately)
- Blocks main functionality (e.g., can't purchase, can't log in)
- No workaround available
- Violates Level A WCAG criteria
- Affects large user base
High (Next Release)
- Difficult to use but workarounds exist
- Violates Level AA WCAG criteria
- Affects important user flows
- Commonly encountered issue
Medium (Backlog)
- Confusing but usable
- Violates Level AAA or best practices
- Affects minor features
- Moderate user impact
Low (Nice to Have)
- Cosmetic issues
- Very rare scenarios
- Improvements beyond compliance
- Minimal user impact
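The buckets above can be encoded as a small triage helper for automated labeling in a bug tracker; a sketch following the same rules (the field and function names are illustrative):

```javascript
// Map an accessibility finding to a triage priority, mirroring the
// Critical/High/Medium/Low buckets described above.
function triage({ wcagLevel, blocksCoreFlow = false, hasWorkaround = true }) {
  if (blocksCoreFlow && !hasWorkaround) return 'Critical'; // no way around it
  if (wcagLevel === 'A') return 'Critical';                // Level A violation
  if (wcagLevel === 'AA') return 'High';                   // Level AA violation
  if (wcagLevel === 'AAA') return 'Medium';                // beyond-AA best practice
  return 'Low';                                            // cosmetic or minor
}
```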
Accessibility in QA Process
Accessibility should be integrated into every step of the QA process, not treated as a separate concern.
Definition of Done
Add accessibility criteria to your "Definition of Done":
- Automated accessibility tests pass
- Keyboard navigation testing completed
- Screen reader testing performed
- Color contrast meets WCAG AA
- All interactions are keyboard accessible
- Focus management is correct
- Semantic HTML is used
- ARIA is correctly implemented
QA Stages
Design QA
- Review designs for sufficient color contrast
- Verify touch targets are large enough
- Check that focus states are designed
- Ensure information is not conveyed by color alone
- Verify readable typography sizes
Development QA
- Code review for accessibility best practices
- Linters check ARIA and semantic HTML
- Unit tests include accessibility checks
- Component documentation includes accessibility usage
Testing QA
- Automated accessibility tests in CI/CD pipeline
- Manual keyboard navigation testing
- Screen reader testing for new features
- Cross-browser accessibility testing
- Mobile accessibility testing
Pre-Release QA
- Comprehensive accessibility audit
- User testing with people with disabilities
- WCAG conformance check
- Documentation of known accessibility issues
Accessibility Champion Role
- Designate a Champion: One QA team member becomes accessibility specialist
- Responsibilities: Training team, reviewing accessibility tests, updating standards
- Time Allocation: Dedicate 20-30% of time to accessibility activities
- Continuous Learning: Stay updated on accessibility best practices
- Advocacy: Promote accessibility across the organization
Continuous Accessibility Testing
Accessibility is not a one-time project – it requires ongoing testing and monitoring.
Regression Testing
- Automated Regression: Run axe tests on every build
- Manual Regression: Re-check critical flows with keyboard/screen reader
- CI/CD Gating: Block deployments that fail accessibility tests
- Component Library: Accessibility tests for shared components
Production Monitoring
- Regular Audits: Quarterly or monthly full accessibility audits
- User Feedback: Provide a way for accessibility feedback
- Analytics: Monitor assistive technology usage
- Bug Tracking: Track accessibility bugs over time
Regular Audits
- Quarterly Audits: Comprehensive WCAG review
- Post-Release Audits: Test new features after deployment
- Third-Party Audits: Bring in external experts for annual review
- Self-Audits: Team performs regular accessibility reviews
Staying Current
- WCAG Updates: Follow new WCAG versions (WCAG 2.2 is published; WCAG 3.0 is in development)
- Browser Changes: New browser features may impact accessibility
- AT Updates: Screen readers and assistive technologies evolve
- Best Practices: Accessibility community is constantly evolving
- Legal Requirements: New laws and regulations may appear
Comprehensive Accessibility Testing Checklist
Use this checklist to ensure you've covered all aspects of accessibility.
Automated Testing
- axe DevTools tests pass
- Lighthouse accessibility score is 100
- WAVE shows no errors
- eslint-plugin-jsx-a11y rules configured
- jest-axe tests written for components
- CI/CD pipeline runs accessibility tests
Keyboard Accessibility
- All functionality available via keyboard
- Tab order is logical
- Focus indicators are always visible
- No keyboard traps
- Skip links present
- Custom widgets have correct keyboard interaction
- Keyboard shortcuts can be disabled if needed
Screen Reader
- Tested with NVDA/JAWS (Windows)
- Tested with VoiceOver (macOS/iOS)
- Tested with TalkBack (Android)
- All images have alt text
- Heading structure is logical
- Landmarks are correctly used
- Form labels are correct
- Links are descriptive
- Error messages are announced
- Dynamic content uses ARIA live regions
Visual Accessibility
- Color contrast meets WCAG AA (4.5:1)
- Information not conveyed by color alone
- Text is scalable to 200%
- Layout doesn't break at zoom
- Touch targets are at least 44x44px
- Focus indicators have sufficient contrast
Content Accessibility
- Page language is set
- Headings are hierarchical
- Lists use correct markup
- Tables have headers
- Abbreviations are expanded
- Error messages are descriptive
Mobile Accessibility
- Tested with iOS VoiceOver
- Tested with Android TalkBack
- Touch targets are sufficiently large
- Supports portrait and landscape
- Works with Dynamic Type/font scaling
- Respects prefers-reduced-motion
Form Accessibility
- All inputs have labels
- Error identification is clear
- Error suggestions provided
- Required fields properly marked
- Group-related fields together
- Input purpose is programmatically identifiable
Multimedia Accessibility
- Videos have captions
- Audio has transcripts
- Videos have audio descriptions
- Media players are keyboard accessible
- Auto-play can be paused
WCAG Conformance
- Level A criteria are met
- Level AA criteria are met
- ARIA is correctly used
- HTML is valid
- No duplicate IDs