Even in a world of digital devices, braille continues to be a vital part of life for blind people. For nearly 200 years, this versatile writing system has allowed them to learn, work and live more independently.
Technology undoubtedly has a role to play in enabling blind people to live independent lives. The news that the world’s first braille mobile phone has gone on sale is a step in the right direction but it is also clear that more people need to learn braille in the first place.
A 1998 study of 74 blind adults found that 77% of those who had not learnt braille were unemployed, compared with just 44% of braille readers.
Despite this, a report by the National Federation of the Blind in 2009 revealed that fewer than 10% of legally blind people in the US are braille readers.
We are looking at how learners can make use of the touchscreen and keyboard devices that have become part of most people’s daily lives to learn braille, which, in turn, could help them gain better access to work and education.
Getting to grips with braille
The classic method of mastering braille involves a braille typewriter called a Perkins Brailler. But this can be an expensive piece of equipment to get hold of, so it isn’t an option for many.
In an attempt to make braille more accessible, the Georgia Institute of Technology has developed an app called BrailleTouch. This transposes the six-key braille keyboard onto the smartphone.
The user holds the phone with the screen facing away from them, then, using the same fingers as they would on the Perkins Brailler (the index, middle and ring fingers of each hand), they can tap braille chords on the touchscreen. Different combinations of fingers produce different characters. Placing the left index and middle fingers on the screen will enter the character “b”, for example.
However, there is a lack of feedback to the fingers because touchscreen devices have flat, featureless surfaces. It isn’t clear which fingers have been recognised by the device because the user only receives feedback once the chord has been entered. So if they attempt to enter the character “b” but the touchscreen fails to register the middle finger, the device will recognise the character “a” instead. This can lead to errors.
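To make the problem concrete, here is a minimal Python sketch of six-dot chord recognition. It is our own illustration rather than the BrailleTouch code: each finger maps to one braille dot, a chord is the set of dots pressed together, and dropping one finger from the “b” chord silently turns it into an “a”.

```python
# A minimal sketch of six-dot braille chord recognition (illustrative, not BrailleTouch itself).
# Perkins-style finger assignment: left index/middle/ring -> dots 1/2/3,
# right index/middle/ring -> dots 4/5/6. A chord is the set of dots pressed at once.

# Partial dot-pattern table, enough for the example (letters a-e).
CHORD_TO_CHAR = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
}

def recognise(dots):
    """Return the character for a chord, or '?' if the pattern is unknown."""
    return CHORD_TO_CHAR.get(frozenset(dots), "?")

# Intended chord for "b": left index (dot 1) + left middle (dot 2).
print(recognise({1, 2}))   # -> "b"

# If the touchscreen misses the middle finger, only dot 1 registers
# and the chord is silently read as "a" instead.
print(recognise({1}))      # -> "a"
```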
In partnership with INESC-ID at the Technical University of Lisbon and LaSIGE at the University of Lisbon’s Department of Informatics, we are developing a system called HoliBraille that combines chord input with a series of vibrations that tell the user what the system is registering. The HoliBraille case can be attached to a Samsung Android phone and feeds information to the user as vibrations felt through the fingers before the chord is committed, so mistakes can be caught before they are made.
We use an Arduino, an open-source microcontroller board, to talk to the phone case via Bluetooth. The case then passes the information on by activating individual vibro-tactile motors next to the fingers that make up the chord.
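As a rough illustration of how this kind of per-finger feedback might be driven, here is a hypothetical Python sketch of the phone-side logic. The serial port name, the one-byte motor command and the dot-to-motor layout are all assumptions for the sake of the example, not the actual HoliBraille protocol.

```python
# Hypothetical sketch of the phone-to-case feedback loop (not the HoliBraille firmware).
# Assumes the Arduino in the case listens on a Bluetooth serial link and pulses one
# vibro-tactile motor per received finger index; the single-byte command format is invented.
import serial  # pyserial

# Assumed motor layout: braille dots 1-6 map to motors 0-5, one motor under each finger.
DOT_TO_MOTOR = {1: 0, 2: 1, 3: 2, 4: 3, 5: 4, 6: 5}

def echo_touches(link, dots):
    """Vibrate the motor under each finger the touchscreen has registered,
    so the user can check the chord before committing it."""
    for dot in sorted(dots):
        link.write(bytes([DOT_TO_MOTOR[dot]]))

if __name__ == "__main__":
    # "/dev/rfcomm0" stands in for whatever Bluetooth serial port the case exposes.
    with serial.Serial("/dev/rfcomm0", 9600, timeout=1) as link:
        echo_touches(link, {1, 2})  # user is holding dots 1 and 2: they expect a "b"
```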
Preliminary results indicate that HoliBraille is 100% accurate for single-finger vibrations and 82% accurate on chord input. Because it works over Bluetooth, it is conceivable that the same feedback could be passed between a range of devices, such as cash machines or desktop computers.
The learning curve
Motivation is undoubtedly an issue when it comes to learning braille, and technology can play a part here too. Working with a user centre in Portugal, the Raquel and Martin Sain Foundation, we are developing three applications that make learning braille more entertaining through gaming.
One of these is a game called BazingaBraille, which is designed to teach braille from scratch by speaking a word and sending a vibration to the learner’s fingers at the same time. We have also developed games such as BrailleHero, a variation on the popular Guitar Hero series that encourages the user to enter braille chords to keep the music going. Chord-based text entry is a fast and effective way of inputting text, even compared with QWERTY keyboards.
Our aim now is to continue reducing errors. We’re also developing an “autocorrect” system for multi-touch braille on touchscreens called B#, which uses an algorithm to correct chord errors in much the same way as a standard smartphone corrects spelling errors. When the wrong chord is tapped in, B# draws on a list of chords that are similar to the one entered and substitutes the one that fits best with the sentence around it.
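The sketch below illustrates the general idea in Python. The dot patterns are standard braille, but the one-dot “neighbour” rule, the toy word list and the frequency scores are simplified assumptions rather than the actual B# algorithm.

```python
# Simplified sketch of chord-aware correction in the spirit of B# (not the published algorithm).
# Idea: letters whose dot patterns differ by a single dot are likely substitution errors, so
# candidate words are generated by swapping such letters and ranked by corpus frequency.

BRAILLE = {  # partial dot patterns, enough for the example
    "a": {1}, "b": {1, 2}, "l": {1, 2, 3}, "k": {1, 3},
    "e": {1, 5}, "i": {2, 4}, "t": {2, 3, 4, 5},
}

# Toy word-frequency "language model"; a real system would use a large corpus.
FREQ = {"ball": 120, "tall": 80, "bell": 60, "till": 10}

def one_dot_neighbours(ch):
    """Letters whose chord differs from ch's chord by exactly one dot."""
    dots = BRAILLE[ch]
    return [c for c, d in BRAILLE.items() if len(dots ^ d) == 1]

def correct(word):
    """Return the most frequent known word reachable by one chord-level substitution."""
    if word in FREQ:
        return word
    candidates = []
    for i, ch in enumerate(word):
        for alt in one_dot_neighbours(ch):
            cand = word[:i] + alt + word[i + 1:]
            if cand in FREQ:
                candidates.append(cand)
    return max(candidates, key=FREQ.get, default=word)

# "aall": the user meant "ball" but the middle finger was missed on the first chord,
# turning "b" (dots 1,2) into "a" (dot 1).
print(correct("aall"))  # -> "ball"
```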
At the moment, B# provides correct suggestions for 72% of words, compared with the 38% achieved by the Android spellchecker. And we’re making all of this work open source so that it can be adapted and improved even further.
The touchscreen has become ubiquitous remarkably quickly, but being able to use it effectively is something that sighted people often take for granted. The aim now is to refine the technologies that are available to us so that they can be used to empower blind and partially-sighted people worldwide.
http://theconversation.com/good-vibrations-bring-braille-into-the-21st-century-27002