axe vs WAVE vs Pa11y: Accessibility Testing Tools Compared

Written by the Crosscheck Content Team
December 25, 2025 · 8 minute read


With 95.9% of the top one million website home pages failing basic WCAG 2 conformance checks (WebAIM, 2024) and the ADA Title II compliance deadline for large public entities landing on April 24, 2026, accessibility testing has shifted from a nice-to-have into a legal and operational necessity.

The problem is that the tooling landscape is fragmented. Ask three developers which accessibility tool they use and you'll get three different answers: axe, WAVE, or Pa11y. Each has a distinct philosophy, a different target user, and meaningful trade-offs.

This article cuts through the noise with a straight comparison so you can make the right call for your team.


Why Automated Accessibility Testing Matters Right Now

The numbers are sobering. The 2025 WebAIM Million report found 50,960,288 distinct accessibility errors across one million home pages — an average of 51 errors per page. Low-contrast text appeared on 79.1% of home pages. Nearly one in five images had no alternative text.

And those are only the errors automated tools can detect. Automated scanners catch roughly 30–40% of WCAG issues. The remaining 60–70% require human judgment.

That gap matters enormously as the ADA Title II rule — requiring WCAG 2.1 Level AA compliance for all state and local government digital properties — approaches its first enforcement deadline. Non-compliance carries federal civil penalties of up to $150,000 for subsequent violations.

So which tool should be part of your testing stack? Let's look at the three most widely adopted options.


Tool #1: axe (axe-core / axe DevTools)

Made by: Deque Systems
License: axe-core is open source (MPL 2.0); axe DevTools has a free tier and a paid Pro tier
Downloads: 3 billion+; 800,000+ Chrome extension installs

What it is

axe-core is the accessibility engine underlying the majority of the tooling ecosystem. It is a JavaScript library that analyses the DOM against a rule set mapped to WCAG 2.0, 2.1, and 2.2 at levels A, AA, and AAA, as well as Section 508 and EN 301 549.

The axe DevTools browser extension exposes those rules through a Chrome, Firefox, and Edge panel. The axe-core npm package embeds them directly into unit tests, integration tests, and browser tests.

Key features

  • WCAG 2.0, 2.1, and 2.2 coverage across A, AA, and AAA levels
  • Framework integrations: Playwright, Cypress, Selenium, Jest, and more — accessibility checks slot into existing test suites with minimal configuration
  • CI/CD ready: axe-core runs headlessly inside pipelines; failures can gate deployments
  • Low false positives: Deque has invested heavily in precision; axe-core flags incomplete results separately so teams know when manual review is required
  • Intelligent Guided Tests (IGTs): the Pro tier adds AI-assisted prompts that walk testers through issues automated rules cannot resolve on their own
  • Configurable rule tags: scope scans to specific WCAG success criteria using AxeBuilder.withTags()
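The CI gating idea above can be sketched as a small check over axe-core's result shape — `analyze()` and `axe.run()` return a `violations` array whose entries carry an `id`, an `impact`, and the affected `nodes`. The sample data below is illustrative, not real scan output:

```javascript
// Sketch: gating a CI run on axe-core results by impact level.
// The results shape (violations with id, impact, nodes) mirrors
// what axe-core returns; the sample data itself is made up.

function gateOnViolations(results, blockingImpacts = ['critical', 'serious']) {
  const blocking = results.violations.filter((v) =>
    blockingImpacts.includes(v.impact)
  );
  return {
    pass: blocking.length === 0,
    blocking: blocking.map((v) => `${v.id} (${v.nodes.length} nodes)`),
  };
}

// Illustrative results object shaped like axe-core output:
const results = {
  violations: [
    { id: 'color-contrast', impact: 'serious', nodes: [{}, {}] },
    { id: 'region', impact: 'moderate', nodes: [{}] },
  ],
};

console.log(gateOnViolations(results).pass); // false — a serious violation blocks
```

A gate like this is typically the last step after an AxeBuilder scan in a Playwright or Cypress test, failing the build only on the impact levels the team has agreed to block on.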

On average, axe-core finds 57% of WCAG issues automatically — significantly above the 30–40% industry benchmark for automated tools.

Who it's for

Developers and QA engineers who want accessibility woven into their existing test framework. axe shines in component testing, pull request checks, and any workflow where automated regression prevention matters more than visual exploration.

Limitations

  • The free browser extension is useful but lacks the guided testing and reporting depth of the paid Pro tier
  • Non-technical stakeholders may struggle with the DevTools panel interface
  • Like every automated tool, it cannot catch keyboard navigation issues or screen reader announcements that require a human to experience

Tool #2: WAVE (Web Accessibility Evaluation Tool)

Made by: WebAIM at Utah State University
License: Free (browser extension and web interface); licensed API for enterprise
Available as: Chrome, Firefox, and Edge extension; web interface at wave.webaim.org

What it is

WAVE is the accessibility evaluation tool built and maintained by WebAIM, the same organisation behind the annual WebAIM Million report. It has been a reference tool in the accessibility community since WebAIM was founded in 1999.

Rather than integrating into a test framework, WAVE overlays colour-coded icons and outlines directly onto the live page you are viewing. Red icons mark errors. Green icons mark accessibility features. Yellow icons flag areas that need closer human review.

Key features

  • In-page visual overlay: issues are shown in context on the actual page, making it immediately obvious where a problem lives in the layout
  • Navigation Order panel: reveals tab order, element roles, and accessible names — what a screen reader would actually announce
  • Contrast checking: checks foreground/background colour contrast ratios against WCAG thresholds, including support for foreground alpha opacity (added in version 3.2.6)
  • Privacy-first: the extension processes everything locally in the browser; no page data is sent to WebAIM's servers, making it safe for intranet and authenticated pages
  • WCAG and Section 508 checks: covers a broad range of success criteria, with inline links to relevant guidelines for each flagged issue
  • CSS disabling: strips styles to reveal the underlying reading and navigation order
  • Enterprise monitoring: Pope Tech is an enterprise platform built on WAVE that tracks accessibility over time across entire sites
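The contrast check described above is grounded in WCAG's relative-luminance definition. A minimal, dependency-free sketch of that published formula (this illustrates the WCAG calculation, not WAVE's internal implementation):

```javascript
// WCAG 2.x contrast ratio between two sRGB colours given as [r, g, b] (0–255).
// Uses the relative-luminance formula from the WCAG definitions.

function relativeLuminance([r, g, b]) {
  const [R, G, B] = [r, g, b].map((c) => {
    const s = c / 255;
    // Linearise the sRGB channel value
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

function contrastRatio(fg, bg) {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Black on white hits the maximum ratio of 21:1;
// WCAG AA requires 4.5:1 for normal-size text.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(2)); // "21.00"
```

The 4.5:1 AA threshold mentioned in WAVE's reports comes straight out of this ratio.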

Who it's for

Designers, content authors, and accessibility specialists who need to understand issues in visual context. WAVE is also excellent for training: the icons and inline explanations make it a teaching tool, not just a reporting tool. It is the fastest path from zero to "I understand what this error means."

Limitations

  • No CI/CD integration — WAVE is not scriptable and cannot gate a build pipeline
  • No API for bulk testing in the free tier (the licensed API is a separate paid product)
  • Not designed for developers who want programmatic access to results
  • Can surface more alerts than axe on the same page, which some teams find noisy

Tool #3: Pa11y

Made by: The Pa11y open source project
License: Open source (MIT)
Available as: CLI tool, CI integration (Pa11y CI), dashboard (Pa11y Dashboard), web service API

What it is

Pa11y (pronounced "pally") is a command-line accessibility tool built for developers and DevOps engineers who want to run accessibility checks as part of an automated pipeline without a browser GUI in the loop. It uses Puppeteer under the hood and supports two test runners: axe-core and HTML_CodeSniffer.

Key features

  • CLI-first: pa11y https://example.com runs a scan from the terminal and returns a structured list of issues with CSS selectors pointing to each failing element
  • Multiple output formats: CLI, JSON, CSV, HTML, and TSV — pipe results into whatever reporting or alerting system your team already uses
  • Configurable standards: WCAG2A, WCAG2AA (default), or WCAG2AAA; easily switched with the --standard flag
  • Threshold control: use --threshold to allow a permitted number of issues before the test fails — useful for incremental remediation
  • Pa11y CI: a dedicated CI runner that tests against a list of URLs or a sitemap, designed to slot into GitHub Actions, CircleCI, Jenkins, or any other pipeline
  • Pa11y Dashboard: a web interface that tracks accessibility metrics over time and graphs regressions and improvements
  • Pa11y 9 (2025): the current major release requires Node.js 20, 22, or 24; Pa11y CI v4 pairs it with a current Puppeteer version for compatibility with modern Linux environments
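The --threshold behaviour above is easy to replicate downstream when you consume Pa11y's JSON output in your own tooling. The issue shape (type, code, selector, message) mirrors what `pa11y --reporter json` emits; the sample issues below are illustrative:

```javascript
// Sketch: applying a --threshold-style gate to parsed Pa11y JSON output.
// Pa11y classifies issues as 'error', 'warning', or 'notice'; only
// errors count against the threshold here, matching the CLI default.

function exceedsThreshold(issues, threshold = 0) {
  const errors = issues.filter((issue) => issue.type === 'error');
  return {
    errorCount: errors.length,
    failed: errors.length > threshold,
    selectors: errors.map((issue) => issue.selector),
  };
}

// Illustrative issues shaped like Pa11y's JSON reporter output:
const issues = [
  { type: 'error', code: 'WCAG2AA.Principle1.Guideline1_1.1_1_1.H37', selector: 'img' },
  { type: 'warning', code: 'WCAG2AA.Principle1.Guideline1_4.1_4_3.G18', selector: 'p' },
];

console.log(exceedsThreshold(issues, 0).failed); // true — one error, threshold of 0
```

The same pattern works for warnings or notices if a team wants a stricter gate during incremental remediation.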

Who it's for

Developers and DevOps engineers who want maximum flexibility in how they run, configure, and consume accessibility results. Pa11y is the right choice for teams that want to test an entire site on a schedule, monitor a design system's components for regressions, or integrate accessibility into a custom reporting dashboard.

Limitations

  • Purely CLI/API — no visual overlay or GUI. Non-technical team members cannot use it without help
  • Setup requires Node.js knowledge and configuration effort
  • Accessibility coverage depends on which engine is selected (axe-core or HTML_CodeSniffer); the tool itself adds no additional rules
  • Cannot test authenticated or session-dependent pages without additional Puppeteer scripting

Side-by-Side Comparison

| | axe DevTools | WAVE | Pa11y |
|---|---|---|---|
| Type | Browser extension + library | Browser extension + web app | CLI + CI runner |
| Open source | Core only (axe-core) | No | Yes |
| Cost | Free tier; paid Pro | Free; paid API | Free |
| CI/CD integration | Yes | No | Yes |
| Visual overlay | Good | Excellent | None |
| Bulk / site-wide testing | Via axe Monitor (paid) | Via Pope Tech (paid) | Yes (Pa11y CI, free) |
| WCAG version support | 2.0, 2.1, 2.2 | 2.2 | 2.0, 2.1, 2.2 (via engine) |
| Learning curve | Medium | Low | Medium–High |
| Best for | Dev/QA integration | Visual audits and training | Automated pipelines |
| False positive rate | Low | Moderate | Depends on engine |

Which Tool Should You Choose?

The honest answer is: you probably need more than one.

Here is a practical starting point based on team composition:

If you are a developer building accessibility into a test suite — start with axe-core. Integrate @axe-core/playwright or cypress-axe into your existing tests and let accessibility checks run on every pull request. The low false-positive rate keeps the signal clean.

If you are a designer or content owner who needs to understand issues in context — use WAVE. Open the extension, scan the page you are working on, and use the inline explanations to understand what needs to change and why. It is the fastest path to accessibility literacy.

If you are a DevOps engineer or want to monitor an entire site on a schedule — use Pa11y CI. Point it at your sitemap, set thresholds, and push results to your monitoring platform. The JSON output integrates with almost anything.
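A minimal Pa11y CI configuration along these lines might look like the `.pa11yci` file below — the URLs and threshold values are placeholders, not recommendations:

```json
{
  "defaults": {
    "standard": "WCAG2AA",
    "timeout": 10000
  },
  "urls": [
    "https://example.com/",
    { "url": "https://example.com/checkout", "threshold": 5 }
  ]
}
```

Alternatively, pa11y-ci can be pointed at a sitemap with the --sitemap flag instead of an explicit URL list.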

If you need enterprise-grade coverage — combine all three with paid features: axe DevTools Pro for guided testing, Pope Tech (WAVE-based) for site-wide monitoring, and axe Monitor for continuous scanning.


The Gap That Automated Tools Cannot Fill

Even the most comprehensive automated scan will not tell you whether your modal traps keyboard focus correctly, whether your error messages are announced at the right time by a screen reader, or whether a user with low vision can actually complete your checkout flow.

Automated tools find the rule violations. Human testers find the experience failures.

This is where teams often hit a friction point: when a tester manually works through a page with a screen reader and discovers a broken focus sequence, documenting that issue with enough context for a developer to reproduce and fix it is harder than it sounds. What was the exact state of the page? Which element had focus? What network requests were in flight? What did the console say?

This is exactly where Crosscheck fills the gap. While axe, WAVE, and Pa11y handle automated accessibility audits, Crosscheck is a Chrome extension built for capturing and documenting accessibility issues found during manual testing — with full context attached automatically. Console logs, network requests, user action sequences, and performance data are all recorded alongside the bug report, so the developer receiving the ticket has everything they need to reproduce the issue. No back-and-forth. No "I can't reproduce it." Just clear, context-rich reports that go straight into Jira or ClickUp.

For teams doing serious accessibility work — especially ahead of the ADA Title II deadline — pairing automated scanners with a thorough manual testing workflow documented through Crosscheck is the most complete approach available.


Summary

The accessibility tooling ecosystem has matured considerably. axe, WAVE, and Pa11y are each excellent at what they do — and each has real limits.

  • axe is the developer's tool: precise, framework-native, and CI-ready
  • WAVE is the auditor's tool: visual, educational, and immediately understandable
  • Pa11y is the automation engineer's tool: flexible, scriptable, and pipeline-friendly

None of them replace manual testing. And none of them make documenting what you find during manual testing any easier — that is a separate problem worth solving.

With the April 24, 2026 ADA Title II deadline only months away for large public entities, the time to build a comprehensive accessibility testing workflow is now. Start with automated scans to catch the low-hanging fruit. Layer in manual testing to catch the issues the scanners miss. And make sure every issue your team finds gets documented with enough context to actually be fixed.


Try Crosscheck for Free

If your team does manual accessibility testing — or any kind of QA — and you are tired of writing bug reports that lack the context developers need, give Crosscheck a try.

Install the Chrome extension, run your next accessibility audit, and see how much faster issues get resolved when every report comes with console logs, network requests, and a full action replay attached automatically. Start your free trial at crosscheck.cloud.
