How to Do a Website Accessibility Audit in 2026: A Step-by-Step Guide
Learn how to conduct a thorough website accessibility audit. This practical guide covers automated scanning, manual testing, WCAG criteria, and tools — everything you need to find and fix accessibility issues.
Whether you're preparing for the April 2026 ADA deadline or the European Accessibility Act, or you simply want to make your website usable by everyone, the first step is always the same: an accessibility audit.
But where do you start? What do you check? And how do you prioritize what to fix first?
This guide walks you through a complete website accessibility audit, step by step.
What Is a Website Accessibility Audit?
An accessibility audit is a systematic review of your website against established standards — typically WCAG 2.1 Level AA — to identify barriers that prevent people with disabilities from using your site.
A thorough audit combines:
- Automated scanning — tools that crawl your pages and flag code-level issues
- Manual testing — human review of keyboard navigation, screen reader compatibility, and user experience
- Assistive technology testing — verifying your site works with screen readers, voice control, and other tools
No single method catches everything. Automated tools find roughly 30-50% of WCAG issues. The rest requires human judgment.
Step 1: Define Your Scope
Before scanning anything, decide what you're auditing:
- Full site or key pages? For large sites, start with your most critical user journeys: homepage, navigation, signup/login, checkout, contact forms, and your top 10 most-visited pages.
- Which standard? WCAG 2.1 Level AA is the legal benchmark in most jurisdictions. Level AAA is aspirational but rarely required.
- Mobile too? Responsive design issues often create accessibility barriers. Test at multiple breakpoints.
Tip: Document your scope. You'll need it for compliance records and to track progress over time.
Step 2: Run an Automated Scan
Start with automated tools to get a baseline. This catches the "low-hanging fruit" — issues that are clearly detectable in code.
What automated scans catch well:
- Missing alt text on images
- Poor color contrast ratios
- Missing form labels
- Incorrect heading hierarchy
- Missing document language
- ARIA attribute errors
- Empty links and buttons
What they miss:
- Whether alt text is actually meaningful (see the example after this list)
- Keyboard trap issues in complex widgets
- Logical reading order
- Context-dependent issues (like whether a link makes sense out of context)
- Dynamic content and single-page app interactions
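For example, both images below would pass an automated "images must have alt text" check, but only the second is useful to a screen reader user (the file name and chart description are illustrative):

```html
<!-- Passes the automated check, but tells the user nothing -->
<img src="/img/dsc_0034.jpg" alt="dsc_0034">

<!-- Passes the same check and actually conveys the content -->
<img src="/img/dsc_0034.jpg"
     alt="Bar chart: support tickets dropped 40% after the redesign">
```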
How to scan with AccessiGuard
- Go to accessiguard.app and enter your URL
- Our scanner crawls your pages and tests against WCAG 2.1 criteria
- Review the report — issues are categorized by severity and WCAG success criterion
- Export or share the report with your development team
A free scan gives you an immediate snapshot. Paid plans add multi-page scanning, monitoring, and historical tracking.
Step 3: Manual Keyboard Testing
This is where you find the issues automated tools can't. Put your mouse aside and try to use your entire site with only your keyboard.
What to test:
Tab order: Press Tab repeatedly. Does focus move through the page in a logical order? Can you reach every interactive element (links, buttons, form fields, menus)?
Focus visibility: Can you always see where you are on the page? A visible focus indicator (outline, highlight) should be present on every focused element.
Keyboard traps: Can you always Tab out of every component? Modal dialogs, dropdown menus, and embedded content (like maps or videos) are common trap points.
Skip navigation: Does the site have a "Skip to main content" link that appears when you first press Tab? Without it, keyboard users must tab through the entire header on every page. (A minimal skip-link sketch appears after these checks.)
Interactive elements: Can you open menus, submit forms, activate buttons, and close dialogs using only Enter, Space, Escape, and arrow keys?
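As a rough illustration of the skip-navigation and focus-visibility checks above, here is one common skip-link pattern; the class name, colors, and offsets are just one way to do it:

```html
<style>
  /* Keep the link out of view until it receives keyboard focus */
  .skip-link {
    position: absolute;
    left: -999px;
  }
  .skip-link:focus {
    left: 8px;
    top: 8px;
    outline: 3px solid #005fcc; /* visible focus indicator */
  }
</style>

<body>
  <a class="skip-link" href="#main">Skip to main content</a>
  <header><!-- site header and navigation --></header>
  <main id="main" tabindex="-1">
    <!-- page content -->
  </main>
</body>
```

Pressing Tab on page load reveals the link; activating it jumps focus past the header into the main content.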
Common keyboard failures:
- Custom dropdown menus that only respond to mouse clicks
- Modal dialogs that don't trap focus (focus escapes behind the overlay)
- Carousels with no keyboard controls
- "Click here" elements built with
<div>instead of<button>
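A minimal before-and-after sketch of that last failure; the `openMenu()` handler is hypothetical:

```html
<!-- Mouse-only: not focusable, no Enter/Space activation, no button role -->
<div class="btn" onclick="openMenu()">Click here</div>

<!-- Keyboard-accessible by default: focusable, activates on Enter and Space -->
<button type="button" class="btn" onclick="openMenu()">Open menu</button>
```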
Step 4: Screen Reader Testing
Test with at least one screen reader to understand how assistive technology users experience your site.
Free screen reader options:
- NVDA (Windows) — free, widely used
- VoiceOver (macOS/iOS) — built into Apple devices, activate with `Cmd+F5`
- TalkBack (Android) — built into Android devices
What to listen for:
- Images: Are they described? Do decorative images get skipped? ("Image: DSC_0034.jpg" is not helpful.)
- Headings: Does the heading structure make sense when navigated by heading? (`H1 → H2 → H3`, not random jumps)
- Forms: Are labels announced when you focus each field? Are error messages associated with the correct field?
- Links: Do link texts make sense without surrounding context? ("Click here" and "Read more" repeated 20 times is unusable.)
- Landmarks: Can the screen reader identify page regions (header, nav, main, footer)?
- Dynamic content: When content updates (notifications, live regions, loading states), is the screen reader notified?
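A rough sketch of the landmark and dynamic-content points: native landmark elements give screen readers a map of the page, and a polite live region announces updates without moving focus (the structure and status text are illustrative):

```html
<header><!-- logo and primary navigation --></header>
<nav aria-label="Main"><!-- menu links --></nav>

<main>
  <h1>Your orders</h1>
  <!-- Announced automatically when its text changes, e.g. after new data loads -->
  <div role="status" aria-live="polite">3 new orders loaded</div>
</main>

<footer><!-- footer links --></footer>
```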
Step 5: Check Color and Visual Design
Visual accessibility goes beyond color contrast.
Color contrast
Text must meet minimum contrast ratios against its background:
- Normal text: 4.5:1 ratio (WCAG AA)
- Large text (24px+ regular, or about 18.5px+ if bold): 3:1 ratio
- UI components and graphics: 3:1 ratio against adjacent colors
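As a rough worked example, the first gray below fails AA for normal-size text on white (about 2.8:1), while the darker gray passes (about 4.5:1); the hex values are only illustrations:

```html
<!-- Fails AA for normal text: #999999 on #ffffff is roughly 2.8:1 -->
<p style="color: #999999; background: #ffffff;">Shipping is calculated at checkout.</p>

<!-- Passes AA for normal text: #767676 on #ffffff is roughly 4.5:1 -->
<p style="color: #767676; background: #ffffff;">Shipping is calculated at checkout.</p>
```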
Use tools like the WebAIM Contrast Checker or check contrast results in your AccessiGuard scan report.
Beyond contrast:
- Don't rely on color alone to convey information (red/green for error/success). Add icons, text, or patterns.
- Text resizing: Does your layout hold up at 200% zoom? Users with low vision depend on this.
- Motion: Can animations and auto-playing content be paused? Some users experience vestibular disorders.
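For the motion point, one common approach (a sketch, not the only option) is to honor the user's operating-system setting via the standard `prefers-reduced-motion` media query; the class and animation names here are hypothetical:

```html
<style>
  .hero-banner {
    animation: slide-in 0.6s ease-out; /* hypothetical entrance animation */
  }

  /* Disable non-essential animation for users who request reduced motion */
  @media (prefers-reduced-motion: reduce) {
    .hero-banner {
      animation: none;
    }
  }
</style>
```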
Step 6: Test Forms and Error Handling
Forms are where accessibility failures have the most direct impact — they block users from signing up, purchasing, or contacting you.
Checklist:
- Every input has a visible, programmatically associated `<label>`
- Required fields are indicated (not just with color)
- Error messages are specific ("Email format is invalid" not just "Error")
- Errors are associated with the correct field (via `aria-describedby` or `aria-errormessage`)
- Focus moves to the first error when a form submission fails
- Success confirmation is announced to screen readers
- Autocomplete attributes are used where appropriate (`autocomplete="email"`, etc.)
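A minimal sketch pulling several of these checklist items together for a single email field, shown in its error state; the IDs and wording are illustrative:

```html
<label for="email">Email address (required)</label>
<input
  id="email"
  name="email"
  type="email"
  autocomplete="email"
  required
  aria-invalid="true"
  aria-describedby="email-error"
>
<!-- Read out together with the field because of aria-describedby -->
<p id="email-error">Email format is invalid. Example: name@example.com</p>

<!-- Announced to screen readers when the form submits successfully -->
<p role="status">Your message has been sent.</p>
```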
Step 7: Review Media Content
Images
- Informative images have descriptive `alt` text
- Decorative images have empty `alt=""` (so screen readers skip them)
- Complex images (charts, infographics) have extended descriptions
Video
- Captions are provided for all pre-recorded video
- Audio descriptions are available for video content where the visual information isn't conveyed through audio alone
- Auto-playing video can be paused
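A minimal sketch of a captioned, non-autoplaying video using the standard `<track>` element; the file paths are illustrative:

```html
<video controls preload="metadata">
  <source src="/media/onboarding-demo.mp4" type="video/mp4">
  <!-- Captions for users who are deaf or hard of hearing -->
  <track kind="captions" src="/media/onboarding-demo.en.vtt"
         srclang="en" label="English" default>
</video>
```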
Audio
- Transcripts are provided for audio-only content (podcasts, audio clips)
Step 8: Prioritize and Fix
You'll likely find dozens (or hundreds) of issues. Don't panic. Prioritize by impact:
Fix first (Critical):
- Keyboard traps — users literally can't proceed
- Missing form labels — forms are unusable
- No focus indicators — keyboard users are lost
- Auto-playing audio/video that can't be stopped
Fix next (High):
- Missing alt text on informative images
- Poor color contrast on body text
- Heading hierarchy issues
- Missing skip navigation
Fix later (Medium):
- Decorative images with non-empty alt text
- Minor contrast issues on non-essential elements
- Missing autocomplete attributes
- Inconsistent focus styles
Track over time (Low):
- ARIA best practices (using `role` where native HTML would suffice)
- Minor heading level skips in secondary content
Step 9: Document and Monitor
An audit isn't a one-time event. Accessibility is ongoing.
- Save your baseline report — you need to show progress
- Set up monitoring — new code deployments can introduce regressions
- Create an accessibility statement — publish your commitment and contact info for feedback (see ours)
- Schedule regular audits — a reasonable cadence is quarterly automated scans plus a semi-annual manual review
AccessiGuard's monitoring feature automatically rescans your sites on a schedule and alerts you to new issues — so you catch regressions before your users do.
How Long Does an Audit Take?
| Site Size | Automated Scan | Manual Review | Total |
|---|---|---|---|
| Small (5-10 pages) | 5 minutes | 2-4 hours | Half a day |
| Medium (50-100 pages) | 15 minutes | 1-2 days | 2-3 days |
| Large (500+ pages) | 30-60 minutes | 1-2 weeks | 2-3 weeks |
Automated scanning is fast. The manual review is where the real time goes — and where the most impactful issues are found.
Start Your Audit Now
The best time to audit your website was before you launched. The second best time is now.
- Run a free AccessiGuard scan to get your baseline
- Follow this guide for manual testing
- Prioritize fixes by impact
- Set up monitoring to catch regressions
With the ADA deadline approaching and accessibility lawsuits surging, proactive auditing isn't just good practice — it's risk management.
Need help interpreting your scan results? Contact us — we're happy to help.