September Newsletter 310
Test predicts eye disease that leads to blindness
A new eye test carried out at Western Eye Hospital can predict a condition that can lead to blindness. As part of a clinical trial involving 113 patients, retinal imaging technology was used to identify areas of the eye showing signs of geographic atrophy (GA), a common condition that causes reduced vision and blindness.
The trial was led by researchers at Imperial College London and UCL, who believe the technology could be used as a screening test for GA and could help advance the development of new treatments for the disease. Because GA is often difficult to identify at an early stage, earlier detection of this kind is expected to help prevent sight loss. The study found that the technology was able to predict GA three years in advance.
Alexa allows people with sight loss to access audiobooks
Amazon’s Alexa virtual assistant can now be used by blind and partially sighted people to get instant access to thousands of audiobooks. Customers of the Royal National Institute of Blind People’s Talking Books library can say “Alexa, open RNIB Talking Books” to access their audiobooks instantly.
In the past year, 1.33 million Talking Books have been sent out, with many users describing the service as a “lifeline” during the pandemic.
A review of a documentary series exploring how creative people who are blind or have low vision can survive and thrive.
Assistive devices for managing visual impairment:
Full details at the link below.
What it’s like browsing Instagram while blind
Using a screen reader to navigate Instagram, as some people with low vision do, is a strange patchwork of sounds. It can be overwhelming, especially if you’re used to quickly scanning information with your eyes, to hear a synthetic voice clunkily rattle off usernames, timestamps, and like counts as though they’re all as important as the actual content of the post. Amid all that auditory stimulation, if someone has added alt text to their photo, you might hear something like “John and I standing with our ankles in the water at the beach. John is making a distressed face while I menacingly hold out a dead crab and laugh.”
The image descriptions read aloud by screen readers have to be added by users, and like many accessibility features on social media, those fields are regularly neglected. In those cases, the voice will sometimes recite alt text that Instagram or the user’s device generates automatically. The result, Danielle McCann, the social media coordinator for the National Federation of the Blind, tells me, can be pretty funny. Despite years of machine-learning refinement, these automatic descriptions still often misidentify what’s happening in photos.