Using AI for Neuroaccessibility in Web Design


Design has learned how to remove obstacles. It has not yet learned how to reduce strain. As our digital environments become unavoidable, the question is no longer whether interfaces function, but whether they allow people to think.


On a Tuesday morning, Elena opens her laptop at the kitchen table.

The house has the hollow calm that follows a rush. Breakfast dishes sit stacked but unrinsed. A backpack leans against the wall where her son dropped it on the way out. The coffee she poured an hour ago has cooled into something bitter and forgotten.

She is trying to complete a task she has already abandoned twice.

The website loads cleanly. No errors. No warnings. The kind of interface that signals competence at a glance. Pale background. Muted color palette. Plenty of space.

Somewhere on the page is the form she needs to finish.

She scrolls.

The sections repeat themselves visually, like paragraphs written in the same tone. Headings exist, but they don’t assert themselves. Links are plentiful but imprecise. Each one promises progress without revealing its cost.

Elena pauses, fingers hovering above the trackpad, trying to reconstruct where she left off the last time she attempted this. She feels the faint, familiar pressure behind her eyes. Not frustration exactly. More like static.

After a few minutes, she closes the tab.

Later, she will tell herself she was distracted. That she should try again when she has more time. What she won’t name is the quieter feeling beneath it: the sense that the system did not care whether she stayed.


Accessibility solved the body. Not the mind.

If Elena were asked whether the site was accessible, she would struggle to answer.

The text was legible. The buttons were clickable. Nothing prevented her from using it in the literal sense. By most contemporary standards, the site was doing its job.

And yet it demanded something from her that she could not easily give.

Traditional accessibility standards were built to remove physical and sensory barriers. They address what can be seen, heard, or manipulated. These standards have changed lives. They remain essential.

But they stop short of addressing what happens inside the head.

How much information must be held at once. How clearly the system signals importance. How easily a user can recover context after interruption.

A 2021 review in ACM Transactions on Computer–Human Interaction noted that current accessibility guidelines provide little practical guidance for reducing cognitive load or supporting attention regulation, despite decades of research demonstrating their impact on usability.

The result is a peculiar kind of exclusion. Nothing is technically inaccessible. And yet the experience quietly drains the people it is meant to serve.

Elena does not experience this as a design failure. She experiences it as personal insufficiency.

Neurodiversity is not a diagnosis. It is a very common condition of living.

Nevertheless, Elena does not consider herself neurodivergent. She has no label for what happens when her attention slips or her memory falters. She has simply learned, over time, to attribute those moments to fatigue, stress, or her own limitations.


Neuroscience tells a different story.

Cognitive capacities such as attention, working memory, and processing speed are not fixed traits. They are shaped partly by genetics, but they also fluctuate constantly, influenced by stress hormones, emotional load, sleep, illness, and context. Stanislas Dehaene, writing on attention, describes it as a limited and fragile resource, shaped as much by environment as by intent.

In other words, the brain Elena brings to the screen at 10:12 a.m. is not the same brain she had the day before.

This variability is not pathological. It is human.

Neuroaccessibility begins with a simple acknowledgment: design must account not only for different kinds of people, but for different states of the same person.

Design is not neutral. It creates a mental climate.

Every interface establishes a kind of weather system for the mind.

Hierarchy tells the eye where to land. Spacing sets the pace of reading. Navigation determines how much must be remembered. Labels shape expectation and risk.

When these elements are vague or flattened, the brain compensates. It scans harder. It rereads. It hesitates before acting.

Cognitive load theory explains why this matters. When extraneous load increases—mental effort unrelated to the task itself—performance declines. Decisions slow. Errors multiply. Fatigue accumulates.

The Nielsen Norman Group has documented for years that users abandon tasks not because they lack motivation, but because the interface asks them to interpret too much too quickly.

Elena’s hesitation is not confusion in the abstract. It is the predictable response of a nervous system trying to conserve energy.


The complete absence of consent

There is a moment Elena recognizes but cannot articulate: she did not choose this complexity.

She did not opt into ambiguity. She did not consent to holding the map in her head.

In digital systems, consent is rarely discussed, yet every interface operates on cognitive-level assumptions.

Does the interface allow users to choose depth, or does it impose it? Does it preserve context, or does it reset without warning? Does it guide, or does it simply present?

Research published in Human Factors has shown that forced navigation sequences significantly increase stress markers and task abandonment, especially under cognitive strain.

Neuroaccessible design restores consent by offering structure without coercion. It allows users to decide how they move, how deeply they engage, and when they pause.

“Designing for neuroaccessibility does not lower standards. It clarifies them.” This claim often meets resistance, and it deserves explanation. The fear is that accommodating cognitive variability means simplifying content, diluting rigor, or designing for the least capable user.

What actually happens is something quieter and more precise.

Clarity removes guesswork. Hierarchy reveals intention. Chunking makes complexity navigable rather than opaque.

Educational psychology research supports this. A meta-analysis in Educational Psychology Review found that structured presentation improved comprehension across ability levels, not only for learners with identified differences.

Standards are not lowered. Ambiguity is.

Elena does not need less information. She needs information that knows how to introduce itself.


When compliance isn’t enough

Chris, a digital art director, has been working on a series of updates for a client’s website for several weeks.

This is not a startup or a lifestyle brand chasing attention. It is a public-facing organization operating under a government contract. Which means the rules are explicit. Under Section 508 of the Rehabilitation Act, aligned in practice with WCAG 2.2 Level AA, the site must meet established accessibility standards.

Chris’s team knows this terrain well. Contrast ratios are checked and rechecked. Keyboard navigation is tested. Focus states behave properly. Alt text is present and descriptive. Error messages are announced, not hidden. Accessibility is not an afterthought here; it is embedded into the workflow, discussed early and audited often.
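The contrast checks Chris's team runs are one of the few parts of this work that reduce to a formula. WCAG 2.x defines contrast as the ratio of the relative luminances of two colors, and Level AA requires at least 4.5:1 for normal text. A minimal sketch of that computation:

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an sRGB color given as (r, g, b) in 0-255."""
    def channel(c):
        c = c / 255
        # Linearize the sRGB channel value per the WCAG definition.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio (L1 + 0.05) / (L2 + 0.05), ranging from 1:1 to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Level AA thresholds: 4.5:1 for normal text, 3:1 for large text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # black on white: 21.0
```

The formula is mechanical, which is exactly why it automates so well, and exactly why it captures none of the cognitive weight the rest of this section describes.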

On paper, the site is compliant. And yet, something isn’t sitting right.

Not visually. Not technically. But cognitively.

It is the kind of weight that does not appear in checklists or automated scans. The kind that does not throw errors, but accumulates quietly as someone moves through the page. The kind of strain that has no formal name in most accessibility frameworks, but lives squarely in the space that neuroaccessibility is beginning to describe.

WCAG 2.2 is not new. It became an official W3C Recommendation in October 2023, and by now, it has been part of the standards landscape for some time.

What is new is the way Chris’s team is reading it.

On paper, the update appears modest. A handful of additional success criteria. Refinements around focus visibility. Clarifications on target size. Adjustments for dragging movements. Nothing that suggests a fundamental shift.
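The target-size refinement, for instance, is WCAG 2.2's Success Criterion 2.5.8 (Target Size, Minimum): pointer targets should be at least 24 by 24 CSS pixels unless an exception applies. The rule itself is almost trivially checkable; a sketch, with the criterion's exceptions collapsed into a single placeholder flag a real audit would evaluate case by case:

```python
MIN_TARGET_CSS_PX = 24  # WCAG 2.2 SC 2.5.8 (Target Size, Minimum), Level AA

def meets_target_size(width, height, exempt=False):
    """True if a pointer target is at least 24x24 CSS pixels.

    `exempt` stands in for the criterion's exceptions (an equivalent target
    elsewhere on the page, inline text links, sufficient spacing, or
    essential presentation), which cannot be decided from dimensions alone.
    """
    return exempt or (width >= MIN_TARGET_CSS_PX and height >= MIN_TARGET_CSS_PX)

print(meets_target_size(44, 44))               # True
print(meets_target_size(20, 20))               # False
print(meets_target_size(16, 16, exempt=True))  # True: e.g. an inline link
```

Each criterion is this concrete in isolation. The disconnect the team feels is that the sum of passing checks still says nothing about mental effort.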

But as the team revisits the site through this lens, a deeper disconnect becomes difficult to ignore.

The site passes AA.
The content is technically accessible.
And still, something feels heavy.

The problem is not color or contrast.
It is not navigation or input methods.
It is the amount of mental work the interface quietly demands.

For years, the accessibility of digital content has been defined primarily by physical and sensory access. The field rarely asks a more uncomfortable question: how hard does this make someone think?

As WCAG 2.2 edges closer to acknowledging cognitive accessibility—the terrain neuroaccessibility is concerned with—Chris realizes the challenge has shifted. The work is no longer only about making information visible or operable. It is about how information is structured, paced, and metabolized by different minds.

Some people understand best through diagrams. Others through tables. Some scan first. Others read linearly, provided the path is explicit.

This is not a new standard that has arrived suddenly.

It is an existing standard finally being taken seriously.

Compliance, Chris realizes, is no longer the finish line, but the start. He just wishes there were a tool to guide them more quickly through the process.


Why AI enters the picture

For years, cognitive friction was sensed but not formally accounted for. Designers “felt it.” Users complained vaguely. Research was scarce and often biased, so teams argued from intuition and untested assumptions.

What has changed is not the ambition of design teams, but the range of what they can now perceive and respond to.

Pattern-recognition systems can analyze visual density, navigational depth, interaction loops, and decision points across complex interfaces. They can surface where attention fragments, where hierarchy collapses, and where users are likely to hesitate or abandon tasks.
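To make this concrete, here is a toy version of the kind of structural signals such a system might surface. Everything here is illustrative: the function names, thresholds, and metrics are hypothetical, not the API of any real tool.

```python
# Illustrative only: two simple structural signals a review tool might compute.

def hierarchy_skips(heading_levels):
    """Count places where the heading outline jumps more than one level
    (e.g. an h2 followed directly by an h4), a common source of lost context."""
    return sum(
        1 for prev, nxt in zip(heading_levels, heading_levels[1:])
        if nxt - prev > 1
    )

def link_density(link_count, word_count):
    """Links per 100 words: a rough proxy for how many competing choices
    a reader must weigh while scanning."""
    return 100 * link_count / max(word_count, 1)

outline = [1, 2, 4, 2, 3]               # h1 -> h2 -> h4 skips a level
print(hierarchy_skips(outline))          # 1
print(round(link_density(30, 600), 1))   # 5.0 links per 100 words
```

Individually these numbers prove little; their value is in directing a human reviewer's attention to where hesitation is likely.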

A 2023 study from MIT’s Computer Science and Artificial Intelligence Laboratory demonstrated that AI-assisted UX analysis predicted task abandonment more accurately than heuristic review alone.

AI does not design. It reveals.


Filling the Gap

This realization led to the creation of NAC — the Neurodivergent Accessibility Companion.

NAC is a GPT-based system designed to guide teams through the often-invisible terrain of cognitive accessibility. It uses established accessibility standards, including WCAG, as a shared reference point—not as a verdict, but as a lens—to evaluate cognitive load, hierarchy, navigation, and emotional friction. From there, it provides concrete, human-readable suggestions to help teams shape their work so it is easier to understand and move through.

NAC assigns scores, not as verdicts, but as signals — a way of pointing to where cognitive effort concentrates and where guidance can help.

It does not certify or replace formal compliance audits. Instead, it interprets WCAG-informed signals and patterns to show where systems quietly ask too much, and where small, intentional changes can restore clarity, consent, and flow.
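The idea of "scores as signals, not verdicts" can be sketched as a weighted combination of normalized friction signals, where the output is a pointer for reviewers rather than a pass/fail grade. This sketch is purely hypothetical; the signal names and weights are invented for illustration and are not NAC's actual model.

```python
# Hypothetical sketch: combine normalized friction signals into one number
# that points reviewers at where cognitive effort concentrates.

def friction_score(signals, weights):
    """Weighted average of signals in [0, 1]; higher means more friction."""
    total = sum(weights.values())
    return sum(signals[name] * w for name, w in weights.items()) / total

signals = {"hierarchy": 0.2, "navigation": 0.7, "labeling": 0.4}
weights = {"hierarchy": 2, "navigation": 3, "labeling": 1}
print(round(friction_score(signals, weights), 2))  # 0.48
```

A number like this does not certify anything; it tells a team where to look first, which is the role the scores play.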

The work is led by Federico Abrahams, a designer and founder of Internauts, whose research centers on human context and contextual interaction. Following a late diagnosis on the autism spectrum, he has dedicated the last couple of years to studying neuroaccessibility not as accommodation, but as design literacy.

He partners with Dr. Julie Hartman, whose background spans design psychology, consumption psychology, and neuroscience, informed by clinical practice. Her focus is not labels, but regulation: how people disengage, recover, and re-enter systems that ask something of them.

Together, they are trying to help designers, and everyone else, notice how information is presented and paced, how it becomes understandable, and where that process quietly starts to fail. The goal is not a one-size-fits-all solution, but closing the gap for most.


That evening, Elena tries again.

Not the same site.

She opens a competitor’s website almost by accident, following a recommendation buried in an email. The page loads. It looks less refined at first glance. Fewer design flourishes. More contrast.

But something shifts.

The hierarchy is unmistakable. She knows where to begin. Each section announces its purpose before asking for her attention. When she pauses, the interface seems to wait with her rather than rushing ahead.

She completes the task without noticing the time passing.

When she closes the laptop, there is no sense of relief. No small victory. Just the quiet absence of strain.

This is what good web design often feels like: not delight, not frictionless magic, but the restoration of normalcy. The feeling of being allowed to think in one’s own way.

No single solution fits every mind. But again and again, the most enduring design case studies point to the same truth: when a problem is approached by simplifying around a clear goal—stripping away what is incidental, clarifying what matters—the result tends to work for more people, not fewer.

The internet is no longer a tool we occasionally use. It is an environment we inhabit.


You can see this pattern in places that quietly earned trust over time:
in how Virgin rethought the airline booking flow, in how Lemonade restructured multi-step insurance forms, in how the GOV.UK team redesigned public-sector services around clear language and predictable structure, in how Stripe normalized complex financial flows through progressive disclosure, and in how Airbnb steadily reduced decision fatigue in booking by clarifying choices rather than multiplying them. 

The question is no longer whether systems function.
It is whether systems recognize the minds moving through them.


If you have encountered a digital experience that felt unusually accommodating—clear without being simplistic, supportive without being prescriptive—consider sharing it. These examples, scattered across the web and often unnoticed, are how this work becomes collective rather than theoretical.

Thanks for reading.
