Many people with a brain injury, dementia or a mental health condition are unable to do everyday tasks like getting dressed, not because of any physical disability, but because their cognitive impairments leave them unable to plan tasks and initiate the actions necessary to achieve their goals. In these cases, caregivers spend many hours each day verbally prompting the actions necessary for tasks like getting dressed.
Being told every step of how to get up in the morning or make a snack is frustrating for all concerned. Yet there are many thousands of people in the UK who are either providing or receiving such care – and the numbers are set to rapidly increase as the population ages.
Receiving personal care is often demeaning and providing it is always costly. So can technology provide a supportive, more dignified and cost-efficient way to deliver these services?
Dr Alex Gillespie and Cath Best of the University of Stirling’s Department of Psychology, and Dr Brian O’Neil from the Brain Injury Rehabilitation Trust, have received funding from the Chief Scientist Office Scotland to develop a technology which simulates the verbal prompting that caregivers provide.
The new technology, called Guide, helps people who have difficulty carrying out everyday tasks, such as doing their laundry or following a morning routine. Think of it as a SatNav for everyday activities. Instead of telling you to turn left or right, this technology talks you through routine activities – for example, reminding you how to get dressed, to ensure that you are wearing appropriate clothes.
The routine activities of daily living may seem simple but they actually involve complex sequences of thought and action. For example, before making a cup of tea the person has to make sure that all the equipment is to hand, that the kettle contains sufficient water and that it is plugged in and switched on. Unless these steps are taken in the right order, it’s impossible to make the tea.
Guide simulates the verbal prompts provided by carers. For example, it asks: “Is there water in the kettle?” and then: “Is the red light on the kettle?” and so on. For each question, users respond verbally, saying “yes” or “no.” Speech recognition software understands what is said and uses the answer to advance the dialogue, sometimes helping the user to solve a problem along the way. Guide is sophisticated enough to let people carry out sequences in a different order – enabling them to take several possible routes to their goal.
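Guide’s internal workings are not described in the article, but the yes/no prompting it illustrates can be sketched as a simple dialogue loop. In this hypothetical sketch (all names and structure are assumptions, not Guide’s actual design), each step in a task has a question and a corrective hint; a “no” answer triggers the hint and re-asks the question, while a “yes” advances to the next step:

```python
# A minimal sketch of verbal-prompt dialogue of the kind the article
# describes. All names and structure are hypothetical -- Guide's actual
# implementation is not public. Each step asks a yes/no question;
# "no" triggers a corrective hint before re-asking, "yes" advances.

from dataclasses import dataclass

@dataclass
class Step:
    question: str   # prompt spoken to the user
    fix_hint: str   # corrective prompt spoken if the answer is "no"

# The kettle example from the article, as a sequence of steps.
KETTLE_TASK = [
    Step("Is there water in the kettle?", "Please fill the kettle with water."),
    Step("Is the kettle plugged in?", "Please plug the kettle in."),
    Step("Is the red light on the kettle?", "Please switch the kettle on."),
]

def run_task(steps, answer_fn):
    """Walk through the task, re-asking each step until the user says 'yes'.

    `answer_fn(question)` stands in for the speech-recognition layer:
    it takes the spoken question and returns 'yes' or 'no'.
    Returns the full transcript of prompts, in order.
    """
    transcript = []
    for step in steps:
        while True:
            transcript.append(step.question)
            if answer_fn(step.question) == "yes":
                break
            transcript.append(step.fix_hint)
    transcript.append("Task complete.")
    return transcript
```

A real system would add the branching the article mentions, letting users reach the goal via several step orders; a fixed list is the simplest case.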
The new project’s aim is to develop the system to the point where it helps people to complete their morning routines, including dressing, before going on to do their own laundry. The latter part of the study will entail talking people with brain injury through how to use a washing machine, a common difficulty among those with cognitive disabilities.