axe-con 2024: Day One (Haley)

Accessibility
Team Insights

This year marks my second axe-con, but the first where I'm able to attend the sessions live (I was on maternity leave during axe-con 2023, so to say I was a bit distracted would be an understatement!). I've become very interested in AI and how it can be leveraged for UX design as well as accessibility, so I was excited to see that there were many AI-centric talks this year.

Responsible and Ethical AI

Speaker: Dr. Rumman Chowdhury

For an in-depth review of this talk, take a look at Riley's axe-con Day One article.

Human-Centered AI and Accessibility

Speakers: Dylan Barrell and Preety Kumar

For an in-depth review of this talk, take a look at Ashley's axe-con Day One article.

Empowering People with Disabilities Using GitHub Copilot

Speakers: Ed Summers, Carie Fisher, and Jesse Dugas

GitHub Copilot is an AI coding assistant that integrates with IDEs like Visual Studio Code. It analyzes your current code and project structure to suggest code that fits seamlessly into your project. It can be an extremely helpful tool in your accessibility toolbox, as it can help you write new code that is accessible and find and remedy existing accessibility problems in your own code.

As with all AI tools, there are pitfalls that everyone should be aware of when using GitHub Copilot. These include:

  • Bias - AI learns from the code that’s currently out in the world, and a lot of that code is (unfortunately) inaccessible. This can lead AI to suggest inaccessible code.
  • Hallucinations - When AI makes a suggestion, it will appear confident in itself, even if the information it provides is incorrect or incomplete. Humans should always check AI’s work.
  • Ethics - AI tools need to be appropriately trained so as not to exclude or mistreat anyone. In the accessibility space, this is especially important to be aware of.
  • Dependence - AI is meant to enhance productivity, but we should not rely on AI alone.

Using Copilot to improve the accessibility of your code requires a few things:

  • Crafting effective prompts - provide the AI with detailed context and expectations, and avoid being too general (see the sketch after this list).
  • Humans in the loop - validate information provided by AI, ask for references, and verify that the provided references are trustworthy.
  • Strong testing plans - involve real users in testing to ensure that the code generated by Copilot works and is accessible.
  • Continuous improvement - always refine prompts and challenge results over time.
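
To make the prompting advice concrete, here is a hypothetical sketch (my own illustration, not an example from the talk) of the kind of before-and-after suggestion a well-prompted Copilot might produce for an icon-only button. The prompt text and function names are invented for illustration.

  // Example prompt: "This delete button only contains an SVG icon. Refactor it so
  // screen reader users get a meaningful accessible name, per WCAG 2.1 AA."

  // Before: the button has no accessible name, so a screen reader announces only "button".
  function createDeleteButton(): HTMLButtonElement {
    const button = document.createElement("button");
    button.innerHTML = '<svg aria-hidden="true" focusable="false" width="16" height="16"></svg>';
    return button;
  }

  // After: an explicit aria-label gives the control an accessible name, and the
  // decorative SVG stays hidden from assistive technology.
  function createAccessibleDeleteButton(): HTMLButtonElement {
    const button = document.createElement("button");
    button.setAttribute("aria-label", "Delete item");
    button.innerHTML = '<svg aria-hidden="true" focusable="false" width="16" height="16"></svg>';
    return button;
  }

Either way, a human still needs to verify the suggestion: the accessible name has to actually describe what the button does.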

Assembling a Network of Accessibility Advocacy Through UX Design

Speaker: Ray Dahl, PhD

Ray Dahl is a UX Strategist/Manager at the Church of Jesus Christ of Latter-day Saints. In his talk, he details his journey in developing a network of accessibility allies within his large organization to help drive change despite accessibility not being a formal organizational priority.

At the beginning of his quest, Ray was tasked with creating a design system for the organization. As someone who had consistently advocated for accessibility but was met with pushback, he took this opportunity to control what he could control: improving accessibility through color, type, focus indicators, target sizes, etc. When it came time to implement the new design system across the organization’s digital platforms, he found allies in the engineering department who were also interested in improving accessibility. Implementing sensible ARIA labels for icon buttons and requiring image alt text for valid builds were just two small changes that the engineering team made to move the needle even further. Ray then worked with the content creation team to figure out ways to make accessible digital content. The branches of the organization that were once separate were now starting to band together to work towards a united goal.
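
The "image alt text required for a valid build" idea can be enforced with a very small automated check. As a rough sketch of my own (not the team's actual implementation, which Ray didn't show), a CI script like the following could fail the build when any <img> tag in the generated HTML is missing an alt attribute; the file paths are placeholders.

  import { readFileSync } from "node:fs";

  // Rough sketch: fail the build if any <img> tag lacks an alt attribute.
  // A real project would more likely use an HTML parser or an accessibility
  // lint rule instead of a regex, but this shows the shape of the check.
  function findImagesMissingAlt(html: string): string[] {
    const imgTags = html.match(/<img\b[^>]*>/gi) ?? [];
    return imgTags.filter((tag) => !/\balt\s*=/i.test(tag));
  }

  // Usage (hypothetical): npx ts-node check-alt.ts dist/index.html dist/about.html
  let failed = false;
  for (const file of process.argv.slice(2)) {
    const missing = findImagesMissingAlt(readFileSync(file, "utf8"));
    if (missing.length > 0) {
      console.error(`${file}: ${missing.length} <img> tag(s) missing alt text`);
      failed = true;
    }
  }
  process.exit(failed ? 1 : 0);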

This expanding network effect resulted in accessibility advocacy within multiple spheres of influence within the organization. Eventually, senior leadership became aware of and more involved in this grassroots effort. Rather than “sneaking things in” to improve accessibility on a small scale, the organization saw the value in accessibility and made it more of a priority from the top. The lesson learned: When accessibility is a UX priority, it can influence other nodes in the network to join the cause.

Ray’s advice for UX teams:

  • Audit your design environment and find ways to enhance accessibility through small changes
  • Look for allies in other departments
  • Advocate, advocate, advocate!

Integrate Accessibility Snapshot Testing into the Mobile Build Process

Speakers: Andrea Skeries, Nicole Bergstrom, David Brunow, Lauren Ludovicy, and Wilfred Ilagan

Snapshot testing is a software testing technique that captures and visually compares screenshots of a component or page before and after a change. The team at Hilton uses snapshot testing to ensure that accessibility issues are caught and fixed before any code gets pushed to production. They do this by predefining a set of tests that instantly capture snapshots of a component in a number of different accessibility modes. This allows anyone on the design, development, QA, and accessibility teams, even non-developers, to view and compare screenshots in one place, increasing both the efficiency and accuracy of their testing efforts.

The tests that Hilton runs for each new component/feature include (a rough web analogue is sketched after the list):

  • Dark mode vs. light mode - snapshots are captured in both modes to ensure both have sufficient color contrast
  • Text sizes/magnification - snapshots are taken at different text size/magnification levels to ensure text doesn’t break or truncate oddly
  • Touch target size - snapshots add a visible border around each clickable element’s bounding box to ensure it is large enough and/or has enough space around it
  • Screen reader output/order - snapshots annotate the elements on the page in a color-coded, easy to read/scan list. This list displays what the screen reader will read for each element, what order each element is announced in, and what role each element has.
  • Localization - snapshots are captured for each supported language to ensure that text and layouts are not broken in any language (especially helpful for comparing LTR vs. RTL layouts)
  • Responsive - snapshots are captured at multiple device sizes to catch issues with layout, sizing, spacing, etc. across devices.
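
Hilton's pipeline is mobile-native, but the pattern translates to the web. As a rough analogue of my own (not Hilton's setup), a Playwright test can capture paired light- and dark-mode baselines of a page so reviewers can diff them like any other snapshot; the URL and snapshot names below are placeholders.

  import { test, expect } from "@playwright/test";

  // Rough web analogue of the dark mode vs. light mode snapshot check.
  for (const colorScheme of ["light", "dark"] as const) {
    test(`booking card renders correctly in ${colorScheme} mode`, async ({ page }) => {
      // Emulate the user's preferred color scheme before loading the page.
      await page.emulateMedia({ colorScheme });
      await page.goto("https://example.com/booking");

      // Compare against a stored baseline image; the test fails and produces a
      // visual diff if the rendered page drifts from the approved snapshot.
      await expect(page).toHaveScreenshot(`booking-card-${colorScheme}.png`);
    });
  }

The same loop shape extends naturally to other checks in the list, such as viewport sizes or locales.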

I am particularly intrigued by the ability to capture screenshots that output the screen reader text. We currently use Diffy as our snapshot testing service, and I would be interested to see if there is any equivalent functionality that we could leverage there. I can see this being particularly helpful in reducing the amount of time needed to set up and use VoiceOver during manual testing.

AI Systems for Accessibility

Speakers: Noé Barrell and Dylan Sheffer

AI is a hot topic this year at axe-con, so it’s no surprise that Deque took the opportunity to announce two new AI-centric tools that they’re releasing soon: axe Advisor for Deque University and AI Accessibility Remediation for axe DevTools. Because AI models are trained on content from across the internet, they also inherit the biases in that content. This can result in AI failing to consider accessibility when generating code and providing biased information about disabilities. Deque created these new tools in an attempt to solve these common AI problems.

Axe Advisor for Deque University is a human-centric AI assistant within Deque University that provides users with information about accessibility topics ranging from general information to specific technical challenges in a nonjudgmental way. The system analyzes all of the information within Deque University, formulates a response that the user will understand based on their experience level, and links to relevant courses within Deque University. The result is an LLM that acts as an accessibility expert/advisor without needing to be prompted to consider accessibility and avoid disability biases in its responses.

AI Accessibility Remediation for axe DevTools is a tool that analyzes code for accessibility issues. It details the causes of any issues it finds, determines the intention of the developer behind the code, attempts to address the issue while maintaining the desired code style, and provides actionable tips and feedback so the user knows how to avoid the issue in the future.

Both of these tools seem like useful additions to any accessibility professional’s toolbox.
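
The AI remediation layer isn't public yet, but the open-source axe-core engine that axe DevTools builds on can already be run programmatically today. As a minimal sketch (using the @axe-core/playwright package rather than the new tool; the URL is a placeholder), an automated test can surface violations on every build:

  import { test, expect } from "@playwright/test";
  import AxeBuilder from "@axe-core/playwright";

  // Minimal sketch: run an axe-core scan against a page and fail the test
  // if any violations are reported.
  test("page has no detectable axe violations", async ({ page }) => {
    await page.goto("https://example.com/");
    const results = await new AxeBuilder({ page }).analyze();
    expect(results.violations).toEqual([]);
  });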
