ADA digital accessibility lawsuits are on pace to exceed 4,975 filings in 2025 — a 20% jump from the year before (UsableNet, 2025). Meanwhile, the European Accessibility Act started enforcement in June 2025 with fines reaching EUR 500,000 per violation (Fieldfisher, 2025). The regulatory window for treating accessibility as optional has closed.
Yet most developers still face the same three bad choices: enterprise tools that start at five figures, credit-based APIs with unpredictable costs, or building and maintaining their own Playwright + axe-core infrastructure. This post explains why API-based scanning solves that problem, what automated testing actually catches, and what the legal landscape looks like heading into 2026.
If you're weighing options, see our guide to comparing accessibility testing tools.
TL;DR: 94.8% of top websites fail WCAG checks, averaging 51 errors per page (WebAIM, 2025). API-based accessibility scanning lets developers run WCAG 2.2 audits via a single HTTP request — no browser infrastructure, no enterprise contracts. Automated scans catch 57% of issues by volume, making them essential for CI/CD gating before manual review.
Why Is Web Accessibility Scanning Critical in 2026?
Web accessibility lawsuits in the US surpassed 4,000 in 2024, and 2025 projections exceed 4,975 (UsableNet, 2025). Across the Atlantic, the EAA now mandates WCAG 2.1 AA compliance for any digital product sold in EU markets. Ignoring accessibility isn't just an ethical gap — it's a financial liability.
The European Accessibility Act Is Live
The EAA took effect on June 28, 2025. Penalties range from EUR 5,000 to EUR 500,000 depending on the member state (Fieldfisher, 2025). This directive applies to e-commerce, banking, transport, and telecommunications — essentially any digital service available to EU consumers.
Enforcement isn't theoretical. Member states have appointed national authorities with investigation and penalty powers. Companies must demonstrate ongoing compliance, not just a one-time audit.
ADA Litigation Keeps Climbing
Settlement costs for ADA website lawsuits range from $5,000 to $75,000, but total costs including legal fees often hit $30,000 to $225,000 or more (Accessible.org, 2025). That's per case. Repeat plaintiffs file dozens of suits annually against the same categories of businesses.
The Business Case Beyond Compliance
Around 1.3 billion people — 16% of the global population — experience significant disability (WHO, 2023). Their combined annual disposable income reaches $6.9 trillion globally (W3C WAI, 2024). Inaccessible websites don't just create legal risk. They lock out paying customers.
The digital accessibility software market reflects this urgency. It grew to $768 million in 2024 and is projected to reach $1.3 billion by 2030, expanding at a 9.2% CAGR (Grand View Research, 2024).
Citation capsule: ADA digital accessibility lawsuits are projected to exceed 4,975 in 2025, a 20% increase over 2024 (UsableNet, 2025). The European Accessibility Act enforces penalties from EUR 5,000 to EUR 500,000 per violation (Fieldfisher, 2025). Combined US and EU regulatory pressure makes automated WCAG scanning a business requirement, not an option.
Learn more about how A11yFlow's scanning features address these requirements.
What Percentage of Websites Actually Fail WCAG?
Nearly all of them. The WebAIM Million study found 94.8% of the top one million homepages had detectable WCAG failures in 2025, with an average of 51 errors per page (WebAIM, 2025). Despite years of awareness campaigns and better tooling, the web remains overwhelmingly inaccessible.
The six most common error types account for the vast majority of detected failures:
- Low contrast text — found on 79.1% of homepages
- Missing alt text — 56.5% of homepages
- Missing form labels — 48.2% of homepages
- Empty links — 45.4% of homepages
- Empty buttons — 29.6% of homepages
- Missing document language — 15.8% of homepages
(WebAIM, 2025)
When we tested the top e-commerce sites during development, every single one had at least three of these six error types. The pattern is consistent: these aren't obscure edge cases. They're structural problems that automated scanners catch reliably.
What's striking is how fixable these issues are. Missing alt text takes seconds to add. Form labels require a single HTML attribute. Document language is one line in the <html> tag. But without automated scanning in the development workflow, these gaps slip through code review unnoticed.
Only 26% of engineers say they "always" write accessible code (Level Access, 2024). That's not a skills gap — it's a tooling gap. Developers need automated feedback at the point where they can actually fix things: in the CI/CD pipeline.
Citation capsule: The WebAIM Million (2025) found that 94.8% of the top one million homepages had WCAG failures, averaging 51 errors per page. Low contrast text appeared on 79.1% of sites, missing alt text on 56.5%, and missing form labels on 48.2%. These six error types represent the bulk of detectable accessibility barriers on the web.
See how the scanning process works in detail.
How Does API-Based Accessibility Scanning Work?
API-based scanning removes the infrastructure burden. Instead of running headless browsers locally, you send an HTTP request with a URL and receive structured WCAG violation data in JSON. A study by Applause found 73% of organizations aren't adequately equipped for ongoing accessibility testing (Applause, 2024) — an API approach closes that gap without adding operational complexity.
Here's the workflow in three steps.
Step 1: Get an API Key
curl -X POST https://api.a11yflow.dev/v1/keys \
-H "Content-Type: application/json" \
-d '{"email": "you@example.com"}'

Step 2: Submit a URL for Scanning
curl -X POST https://api.a11yflow.dev/v1/scans \
-H "Authorization: Bearer sk_live_your_key_here" \
-H "Content-Type: application/json" \
-d '{"url": "https://example.com"}'

Step 3: Retrieve Results
curl https://api.a11yflow.dev/v1/scans/scan_id_here \
-H "Authorization: Bearer sk_live_your_key_here"Scans typically complete in 5 to 15 seconds. The response includes every violation with its WCAG criterion, impact level, failing element selector, XPath, contrast ratios where relevant, and a specific fix suggestion.
After building and maintaining our own Playwright + axe-core infrastructure for over a year, the pain points became clear: browser dependency management in CI, flaky headless Chrome processes, and the constant overhead of keeping scan workers healthy. Wrapping that complexity behind an API call was the obvious next step.
The key difference from browser extensions or CLI tools? An API fits into any workflow. Pipe results into your CI/CD pipeline, issue tracker, Slack channel, or custom dashboard. No plugins, no browser dependencies, no local setup.
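As one example of that piping, here is a hedged sketch that condenses a scan result into a one-line digest for a Slack message or issue comment. The `violations` array and `impact` values mirror axe-core's result shape, but the exact response schema here is an assumption for illustration.

```python
def summarize(scan_result):
    """Group violations by impact level and render a short plain-text
    digest, suitable for posting to Slack or attaching to an issue.

    Assumes an axe-core-style shape: {"url": ..., "violations":
    [{"impact": "critical"}, ...]} -- verify against the real response.
    """
    counts = {}
    for v in scan_result.get("violations", []):
        counts[v["impact"]] = counts.get(v["impact"], 0) + 1
    order = ["critical", "serious", "moderate", "minor"]
    parts = [f"{counts[i]} {i}" for i in order if i in counts]
    return f"{scan_result['url']}: " + (", ".join(parts) or "no violations")
```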
For the full endpoint reference, see the API documentation.
Citation capsule: 73% of organizations lack adequate tooling for ongoing accessibility testing (Applause, 2024). API-based scanning addresses this by accepting a URL via HTTP request and returning structured WCAG violation data in JSON, removing the need for local browser infrastructure or manual audit scheduling.
What Can Automated Scanning Actually Catch?
Automated accessibility testing catches approximately 57% of issues by volume, based on Deque's analysis of over 2,000 audits covering 13,000 pages (Deque, 2021). That 57% includes the highest-frequency violations: contrast failures, missing labels, broken ARIA attributes, and structural HTML problems.
Where Automation Excels
The errors automation catches well are binary. An image either has alt text or it doesn't. A form input either has a label or it doesn't. A contrast ratio either meets the 4.5:1 threshold or it doesn't. These pattern-matching checks run reliably at scale.
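The contrast check is a good illustration of why these checks automate so cleanly: it is pure arithmetic defined by the WCAG 2.x spec. Linearize each sRGB channel, compute relative luminance, then take the ratio of the lighter to the darker color, each offset by a 0.05 flare term. A self-contained sketch:

```python
def relative_luminance(r, g, b):
    """WCAG 2.x relative luminance from 8-bit sRGB components."""
    def channel(c):
        c = c / 255
        # sRGB linearization per the WCAG definition
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg, bg):
    """Contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter color as L1."""
    l1, l2 = sorted((relative_luminance(*fg), relative_luminance(*bg)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible ratio, 21:1
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # → 21.0
```

Either a color pair clears 4.5:1 or it doesn't, which is exactly the kind of binary, deterministic check a scanner can run on every element of every page.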
With semi-automated testing — where tools flag potential issues for human review — coverage rises to 80% or more during development (Deque, 2022). The combination of automated flags plus human judgment covers significant ground.
Where Automation Falls Short
The remaining 43% requires human evaluation. Keyboard navigation, reading order, focus management, timeout handling, and meaningful alt text (not just present, but accurate) all need a person interacting with the page using assistive technology.
In production, we gate on zero critical violations before deployment. Moderate and minor issues get tracked as metrics rather than blockers. This approach prevents the worst barriers from reaching users while keeping the pipeline fast enough that developers don't bypass it. It's a practical middle ground that we've found works better than all-or-nothing gating.
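A gating step like that comes down to a few lines. This is a sketch of the policy, not actual pipeline code; the `impact`, `id`, and `selector` field names are assumed to follow axe-core's output format.

```python
import sys

def gate(scan_result, block_on=("critical",)):
    """Return a process exit code for CI: nonzero if any violation's
    impact level is in the blocking set, zero otherwise.

    Moderate and minor issues pass through the gate so they can be
    tracked as metrics instead of blocking the deploy.
    """
    blocking = [v for v in scan_result.get("violations", [])
                if v.get("impact") in block_on]
    for v in blocking:
        print(f"BLOCKED: {v.get('id')} at {v.get('selector')}", file=sys.stderr)
    return 1 if blocking else 0
```

In a pipeline you would call `sys.exit(gate(result))`; widening `block_on` to `("critical", "serious")` is the obvious knob once the team clears the initial backlog.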
For a deeper dive, see our guide to comparing accessibility testing tools.
Citation capsule: Deque's study of 2,000+ audits found automated tools detect 57% of accessibility issues by volume (Deque, 2021). Semi-automated approaches raise coverage to 80% or more during development (Deque, 2022). The remaining issues — keyboard navigation, reading order, cognitive clarity — require manual testing with assistive technology.
Why Do Overlay Widgets Create More Legal Risk?
Overlay widgets promise one-line fixes for accessibility compliance, but the data tells a different story. In 2024, 25% of ADA lawsuits cited overlay widgets as barriers to access rather than solutions (Accessibility.works, 2025). Courts have not accepted overlays as evidence of WCAG compliance.
Overlays work by injecting JavaScript that modifies the visual presentation of a page. They don't fix the underlying HTML structure. A missing form label remains missing. A broken ARIA role stays broken. Screen readers still encounter the same barriers.
Why does this increase legal risk rather than reduce it? Because installing an overlay creates a documented record that the site owner knew about accessibility problems but chose a surface-level fix instead of addressing the code. Plaintiffs' attorneys cite this as evidence of negligence.
The alternative is straightforward: fix the actual code. Automated scanning identifies the specific elements that fail WCAG criteria. Each violation comes with a selector, a rule ID, and a fix suggestion. That's actionable data a developer can resolve in minutes — not a JavaScript band-aid that papers over the problem.
How Much Does an Accessibility Lawsuit Cost?
ADA website accessibility settlements typically range from $5,000 to $75,000, but the full cost — including legal fees, remediation, and ongoing monitoring — often reaches $30,000 to $225,000 or more (Accessible.org, 2025). For businesses facing multiple suits, costs compound quickly.
In the EU, EAA penalties are separate and additive. A single product violating accessibility requirements in multiple member states could face fines from each jurisdiction. The Fieldfisher analysis notes penalties of EUR 5,000 to EUR 500,000 per member state (Fieldfisher, 2025).
Compare that to the cost of prevention. Automated scanning on a flat-rate plan costs a fraction of a single settlement. Running scans on every deployment catches regressions before they create legal exposure. The math isn't close.
Compare pricing plans to see what fits your team.
Citation capsule: ADA website accessibility lawsuits cost $30,000 to $225,000 including settlements and legal fees (Accessible.org, 2025). EAA penalties add EUR 5,000 to EUR 500,000 per member state (Fieldfisher, 2025). Automated scanning on every deployment prevents violations from reaching production at a cost far below a single legal action.
What's on the Roadmap?
A11yFlow launched with core scanning capabilities, but the roadmap targets the gaps developers ask about most. Here's what's coming:
- Scheduled monitoring — set a scan cadence and receive alerts when your accessibility score drops below a threshold
- Webhooks — trigger actions in your own systems (Slack, Jira, PagerDuty) when scans complete
- GitHub Action — run WCAG checks on every pull request without custom workflow configuration
- Public rule pages — a searchable reference for every axe-core rule with code-level fix guidance
These priorities came directly from early adopter feedback during the beta period. Scheduled monitoring was the most requested feature by a wide margin, followed by CI/CD integration via GitHub Actions. We're building what developers told us they need, not what looks good on a feature comparison chart.
Check the full API documentation for endpoint details.
Frequently Asked Questions
How is API-based scanning different from browser extensions?
Browser extensions require manual interaction — you open a page, click a button, review results. API-based scanning accepts a URL via HTTP request and returns structured JSON. This makes it automatable: you can integrate scans into CI/CD pipelines, cron jobs, or monitoring dashboards without human intervention. Extensions work well for spot checks during development; APIs work for systematic, repeatable testing at scale.
What WCAG standards does the scanner cover?
The scanner uses axe-core, which includes 89 rules covering WCAG 2.0 A/AA, WCAG 2.1, and WCAG 2.2 criteria. Every scan tests against the full rule set. Results include the specific WCAG success criterion each violation relates to, so you can map findings directly to compliance requirements. See the full rule reference in the API documentation.
Can automated scanning replace a manual accessibility audit?
No. Automated tools detect roughly 57% of accessibility issues by volume (Deque, 2021). Keyboard navigation, reading order, cognitive clarity, and meaningful content evaluation still require human testers — ideally people who use assistive technology daily. Automated scanning is a filter that catches the majority of common violations before they reach users. Manual audits cover the rest.
Is there a free tier?
Yes. The free plan includes 25 scans per month with no credit card required and no expiration. Every feature is available at every tier — there are no artificial limits on what data you can see. Check the pricing plans for details on higher-volume options.
How long does a scan take?
Most scans complete in 5 to 15 seconds. Each scan launches headless Chromium via Playwright, fully renders the page (including JavaScript-driven SPAs and client-rendered content), then runs axe-core against the rendered DOM. You poll the scan endpoint until results are ready, or set up webhooks to receive a notification on completion.
Do accessibility overlays count as compliance?
No. Courts have not accepted overlay widgets as evidence of WCAG compliance. In 2024, 25% of ADA lawsuits involved sites with overlays installed (Accessibility.works, 2025). Overlays modify visual presentation without fixing underlying HTML structure. The only path to compliance is fixing the code — which starts with identifying what's broken through automated and manual testing.