Archive for the 'accessibility web standards' Category

Reading List 273

Reading List 272

Reading List 271

Get that dream tech gig by solving Mike Taylr’s Silicon Valley whiteboard interview question

Like every other thought-leader, I follow Mike Taylr on social media. Ever since Shingy left AOL, “Mikey” has moved to the top spot of everyone’s “Futurist Gurus” Twitter list. This morning I awoke to read Twitter abuzz with excitement over Mike’s latest Nolidge Bom:

Of course, like anyone who’s ever sat a maths exam and been told to “show your working out”, you know that the widely diverse interview panel of white 20-ish year old men is as interested in how you arrived at your answer as in the answer itself. Given Mikey’s standing in the industry and the efficiency of his personal branding consultants, this question will soon be common for those interviewing in Big Tech, as it’s an industry that prides itself on innovative disruption by blindly copying each other. So let’s analyse it.

It’s obvious that the real test is your choice of marker colour. So, how would you go about making the right decision? Obviously, that depends where you’re interviewing.

If you’re interviewing for Google or one of its wannabes, simply set up a series of focus groups to choose the correct shade of blue.

If you’re interviewing for Apple or its acolytes, sadly, white ink won’t work on a whiteboard, no matter how aesthetically satisfying that would be. So choose a boring metallic colour and confidently assert any answer you give with “I KNOW BEST”.

If you’re interviewing for Microsoft, the colour doesn’t matter; just chain the marker to the whiteboard and say “you can’t change the marker, it’s an integral part of the whiteboard”, even after it stops working.

If you’re interviewing for Facebook or one of its wannabes, trawl through previous posts by the panellists, cross-reference them with those of their spouses, friends and friends of friends to find their favourite colours, factor in their Instagram posts, give a weighting to anything they’ve ever bought on a site they’ve signed into using Facebook, and use that colour while whispering “Earth is flat. Vaccines cause cancer. Trump is the saviour. Muslims are evil. Hire me” subliminally over and over again.

Good luck in the new job! May your stocks vest well.

Don’t put pointer-events: none on form labels

The other day I was tearing my hair out, wondering why an HTML form I was debugging wouldn’t focus the form field when I tapped on its associated label. The HTML was fine:

<label for="squonk">What's your name, ugly?</label>
<input id="squonk">

I asked my good chum Pat “Pattypoo” Lauke for help, and without even looking at the form or the code, he asked “Does it turn off pointer events in the CSS?”

Lo and FFS, there it was: label {pointer-events:none;}! This daft bit of CSS breaks the browser’s default behaviour for an associated label, and makes the hit target smaller than it would otherwise be. Try clicking in the “What’s your name, ugly?” text:

[Interactive demo: two identical label/input pairs. The first, “Try me, I’m good”, focuses its input when you click the label text; the second, “Try me, I’m crap”, has pointer events disabled on its label, so clicking the label text does nothing.]
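If you want to poke at this yourself, here’s a minimal reconstruction of the demo (the IDs and class name are mine, not from the original page):

<label for="good">Try me, I'm good</label>
<input id="good">

<div class="crap">
  <label for="crap">Try me, I'm crap</label>
  <input id="crap">
</div>

<style>
  /* The culprit: clicks on the label no longer reach it,
     so they no longer focus the associated input */
  .crap label { pointer-events: none; }
</style>

Clicking the first label focuses its input; clicking the second label does nothing at all.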
I’m jolly lucky to have the editor of the Pointer Events spec as my chum. But why would anyone ever do this? (That line of CSS, I mean, not edit a W3C spec; you do the editing for the sex and the glory.)

Once again, Pat whipped out his code ouija board:

And, yes—the presentation had originally been Material Design floating labels, and this line of CSS had been cargo-culted into the new design. So don’t disable pointer events on forms—and, while you’re at it, Stop using Material Design text fields!

The clown in Stephen King’s IT down a storm drain, saying “We all float labels down here, Georgie”
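For context, the floating-label pattern positions the label visually on top of the input, so the rule makes a kind of sense there: clicks must pass through the label to reach the field underneath. Something like this sketch (my reconstruction, not the actual production CSS) is probably where the line came from:

.field { position: relative; }
.field label {
  position: absolute;
  top: 0.5em;
  left: 0.75em;
  transition: transform 0.2s;
  /* Sensible here: the label overlaps the input, so let clicks fall through */
  pointer-events: none;
}
.field input:focus + label,
.field input:not(:placeholder-shown) + label {
  /* "Float" the label up out of the way once the field is in use */
  transform: translateY(-1.25em) scale(0.8);
}

Copy that pointer-events line into a design where the label sits beside the input, and you get the breakage above.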

Review: Evinced accessibility site scanner

That nice Marcy Sutton asked me to test and give feedback on a new product she’s involved with called Evinced, which describes itself as an “Enterprise grade digital accessibility platform for modern software development teams”. Quite what “enterprise grade” means is beyond me, but it’s basically software that can crawl a website from a root domain, check its code against some rules, and report back. There are similar tools on the market, and I’ve recently been working with a client to integrate Tenon into their workflow, so I wanted to compare them.

“Now hang on!”, I hear you say. “Automated tools are terrible!”. Well, yes and no. Certainly, overlays etc. that claim to automatically fix the problems are terrible, but tools that help you identify potential problems can be very useful.

It’s also true that automated tools can’t spot every single accessibility error; they can tell you if an image is missing alternative text, but not that <img src=dog.png alt="a cat"> has useless alt text. Only a human can find all the errors.

However, many errors are machine-findable. The most common errors in WebAIM’s survey of the top million homepages are low-contrast text, missing alternative text, empty links, missing form input labels, empty buttons and missing document language, all of which were found by running automated tests on them (which, presumably, the developers never did before the pages were pushed to production).

I personally feel that a good automated scanner is a worthwhile investment for any large site, to catch the “lowest-hanging fruit”. Some things can’t be automatically tested, some can, and others live in a grey area depending on the rigour of the test.

For example, a naive colour contrast test might compare CSS color with background-color and give a pass or fail; a less naive test will factor in any CSS opacity set on the text and ignore off-screen or hidden text. A sophisticated contrast test could take a screenshot of text over an image or gradient and do a pixel-by-pixel analysis, which requires actually rendering the page. Like Tenon, Evinced doesn’t just scan the HTML, but renders the pages in a headless browser, which allows the DOM to be tested (although I don’t believe it tests colour contrast in this way).
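To make the “naive” end of that spectrum concrete, here’s a plain JavaScript sketch of the WCAG 2.x contrast formula for two flat sRGB colours; note that it has none of the opacity or screenshot cleverness described above:

// Naive WCAG 2.x contrast check for two flat sRGB colours.
// A real tool must also handle opacity, gradients, images and hidden text.
function relativeLuminance([r, g, b]) {
  const [R, G, B] = [r, g, b].map((channel) => {
    const c = channel / 255;
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

function contrastRatio(foreground, background) {
  const [lighter, darker] = [
    relativeLuminance(foreground),
    relativeLuminance(background),
  ].sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

// #777 text on a white background: roughly 4.48:1, a marginal fail
// (WCAG AA requires 4.5:1 for normal text, 3:1 for large text)
console.log(contrastRatio([119, 119, 119], [255, 255, 255]));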

Evinced uses Axe Core, an open-source library also used by Google Lighthouse. It also contains other (presumably proprietary secret-source) tests so that “if interactable components are built with divs, spans, or images – Evinced will detect if they are broken”.
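As an illustration of the general technique (this is the standard axe-core injection pattern, not Evinced’s proprietary pipeline), here’s how you might run axe-core against a rendered DOM in headless Chrome with Puppeteer:

const puppeteer = require('puppeteer');
// axe-core publishes its own source as a string, ready for injection
const { source: axeSource } = require('axe-core');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('https://example.com/');
  await page.evaluate(axeSource); // inject axe into the live page
  // axe.run() tests the rendered DOM, not just the raw HTML
  const results = await page.evaluate(() => axe.run());
  console.log(results.violations);
  await browser.close();
})();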

Reporting

The proof of the pudding with automated site scanners is how well they report the errors they’ve found. It’s all very well reporting umpty-squillion errors, but if they’re not reported in an actionable way, that’s not helpful.

Like all the automated scanners I’ve tried, Evinced groups errors according to severity. However, whether those severity categories correspond to WCAG A, AA and AAA violations isn’t made clear anywhere.

Graph showing number of errors by type

It’s a fact of corporate life that most organisations will attempt to claim AA compliance, so they need to see the errors broken down by WCAG conformance level.

One innovative and useful reporting method is by what Evinced calls component grouping: “Consolidates hundreds of issues to a handful of broken code components”.

With other scanners, it takes a trained eye to look through thousands of site-wide errors and realise that a good percentage of them are because of one dodgy piece of code that is repeated on every page. Evinced analyses pages and identifies these repeated components for you, so you know where to concentrate your efforts to get gross numbers down. (We all know that in the corporate world, a quick fix that reduces 10,000 errors to 5,000 errors buys you time to concentrate on the really gnarly remaining problems.)

graph showing 32% of all issues are grouped into 38 components; 1 component accounts for 81% of critical issues

There’s a vague suggestion that this grouping is done by Artificial Intelligence/Machine Learning. The algorithm obviously has quite clever rules, and shows me a screenshot of the areas on my pages it has identified as components. It’s unclear whether this is real Machine Learning, e.g. whether it will improve as its corpus of naughty pages gets larger.

list of components automatically identified on my site with screenshots and the relevant areas highlighted

I don’t recall signing anything to allow my data to be used to improve the corpus; perhaps a discount for not-for-profits or time-limited demos could be offered to organisations allowing their data to be added to the training data, if indeed that’s how it “learns”.

User Interface

Many of these site scanners are made by engineers for engineers, and share the high level of UX polish one would expect from JavaScripters.

Tenon has some clunkers in its web interface (for example, it’s hard to re-run a previously defined scan) because it’s most commonly accessed via its API rather than its web back-end.

Evinced makes it easy to re-run a scan from the web interface, and also promises an API (pricing not yet announced), but it suffers from UI problems too. For example, one of my pet peeves is pages telling me I have errors but not letting me click through to see them, leaving me to hunt. The only link on this error page, for example, goes to the “knowledge base” that describes the generic error, not to a list of scanned pages containing the error.

Page showing how many errors I have but not linking to them

(After I gave feedback to the developers, they told me the info is there if you go to the components view. But that requires me to learn and remember that. Don’t make me think!)

There are also terminology oddities. When setting up a new site for testing, the web interface asks for a seed URL and a press of a button marked “start mapping”, after which the term “mapping” is never used again and I’m told the system is “crawling”. Once the crawl was complete, I couldn’t see any results. It took me a while to realise that “crawling” and “mapping” are the same thing (getting a list of candidate URLs), and that after the mapping/crawling stage I then need to do a “scan”.

A major flaw is the inability to customise tests. In Tenon I can turn off tests on an ad-hoc basis if, for example, one particular test is giving me false positives, or I only want to test for level A failures. This is unavailable in Evinced’s web interface.
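For comparison, this is what that kind of customisation looks like in the underlying axe-core API (which, as noted above, Evinced builds on); it’s a capability the web UI doesn’t currently surface:

// Restrict a run to WCAG 2.0 level A rules, and switch off one noisy rule
axe.run(document, {
  runOnly: { type: 'tag', values: ['wcag2a'] },
  rules: { 'color-contrast': { enabled: false } },
}).then((results) => console.log(results.violations));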

Another important but often-overlooked UI aspect of these “Enterprise” site scanners is the need to share results across the enterprise. While it’s common to sell “per-seat” licences, it’s also necessary for the licensee to be able to share information with managers, bean-counters, legal eagles and the like. Downloading a CSV doesn’t really help; it’s much more useful to be able to share a link to the results of a run and let the recipient investigate the reports and issues, but not let them change any configuration or kick off new scans. This is missing in both Evinced and Tenon.

Conclusion

The system is currently in Beta and definitely needs some proper usability testing with real target users and UI love. One niggle is the inaccuracy of its knowledge base (linked from the error reports). For example, about the footer element, Evinced says

Since the <footer> element includes information about the entire document, it should be at the document level (e.g., it should not be a child element of another landmark element). There should be no more than one <footer> element on the same document

This is directly contradicted by the HTML specification, which says

The footer element represents a footer for its nearest ancestor sectioning content or sectioning root element… Here is a page with two footers, one at the top and one at the bottom… Here is an example which shows the footer element being used both for a site-wide footer and for a section footer.
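For instance, this (invented) markup is perfectly valid:

<body>
  <article>
    <h1>A hot take</h1>
    <p>…</p>
    <footer>Posted last Tuesday</footer> <!-- footer for the article -->
  </article>
  <footer>© example.com</footer> <!-- footer for the whole page -->
</body>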

I saw no evidence of this incorrect assumption about the footer element in the tests, however.

All in all, the ability of Evinced to identify repeated ‘components’ and understand the intended use of some JavaScripted labyrinth of divs is a welcome feature and its main selling point. It’s definitely one to watch when the UX is sleeker (presumably when it comes out of Beta).

Reading List 270

Reading List 269

Reading List 268

  • Link o’ The Week: Why it’s good for users that HTML, CSS and JS are separate languages by Hidde de Vries
  • Bonus Link o’ The Week: Web Histories – “The Web Histories project is an attempt to document the stories of those who have helped to shape the open web platform, with a focus on those people who are often underrepresented in the formal histories.” by Rachel Andrew
  • The F-word episode 7 – Leningrad Lothario, Russian roister-doister Vadim spared a moment from his dizzying social life to sit with me and discuss Chrome 88 and the latest web development Grand Unification Proposal (= make it all JSON)
  • A Comprehensive Guide to Accessible User Research – “Researchers often want to include people with access needs in their studies but don’t know where to begin. This three-part series covers the various considerations for adapting your practice to include people with disabilities.”
  • Under-Engineered Responsive Tables – “I am just going to show you the bare minimum you need to make a WCAG-compliant responsive HTML table” says Uncle Adrian Roselli: <div role="region" aria-labelledby="Caption01" tabindex="0"> <table>[…]</table> </div>
  • Resources for developing accessible cards/tiles
  • Accessible SVGs – I needed this again this week.
  • Alt vs figcaption – what’s the difference, when should you use which, and how?
  • Weaving Web Accessibility With Usability gives actionable advice on usability testing with disabled people (and quotes me!)
  • Welcome to Your Bland New World “The VC community is an increasingly predictable and lookalike bunch that just seems to follow each other around from one trivial idea to another”. Excellent article!
  • Is Stencil a Better React? by Wix Engineering (a React contributor). “the same JSX as React & some of the same concepts… Compiles to standard web components with optimal performance…Stencil was faster compared to React… also faster compared to Svelte, at least within this use case.”
  • CPRA Passes, Further Bolstering Privacy Regulations and Requirements in California “agreement obtained through use of dark patterns does not constitute consent”
  • 2020 Affordability Report – “Over a billion people live in the 57 countries in our survey that are yet to meet the UN Broadband Commission’s ‘1 for 2’ affordability threshold. 1GB is the minimum that allows someone to use the internet effectively”
  • Clean Advertising – an interesting article by Jezza with a bonus useful tip about tigers which I’ll be trying tonight.
  • New UK tech regulator to limit power of Google and Facebook – “the dominance of just a few big tech companies is leading to less innovation, higher advertising prices and less choice and control for consumers”

Reading List 267