by Gianpaolo D’Amico, Lea Landucci and Daniele Pezzatini
We describe the design and development of a natural interaction system which provides a novel solution for neurocognitive rehabilitation in people affected by neglect syndrome.
Experimentation with natural interaction principles and advanced interactive solutions is a new approach to the rehabilitation and evaluation of patients with neglect syndrome. Our work aims to improve on the conventional pen-and-paper approach that has been used in this field for many years.
This project is the result of a multidisciplinary collaboration between MICC - Media Integration and Communication Centre of the University of Florence (Italy), the Faculty of Psychology of the University of Florence (Italy) and Montedomini A.S.P., an Italian public agency offering welfare and healthcare services for self-sufficient and disabled elderly people.
Neglect syndrome is a neuropsychological condition caused by damage to one of the brain’s hemispheres. This syndrome causes a deficit in processing and perceiving stimuli on one side of the body and/or the environment. The main symptom is a loss of awareness of one side of the field of view. For example, a stroke affecting the right hemisphere of the brain can lead to neglect of the left side of the field of view, causing a patient to behave as if sensory space does not exist on that side. Such patients are unable to go through a door without hitting the jamb, or to eat a meal without leaving the left part of the plate completely untouched.
Rehabilitation treatments that have been tested with variable degrees of success include enhancing patients’ awareness of their perception issues, for example by teaching them to rotate their heads to receive information from the ignored visual field. Doctors and therapists currently use pen-and-paper cognitive tests to assess the severity of the syndrome, and use similar conventional techniques in rehabilitation training.
The natural interaction rehabilitation system
Our solution consists of an interactive environment in which patients accomplish different tasks, both to assess their neurocognitive condition (testing phase) and to support rehabilitation activities (training phase). The tasks consist of predefined exercises focused on the following elements: attention, memory, perceptual disturbances, visual-spatial disturbances and difficulties with executive functions.
Patients interact with an augmented reality system which provides digital content aimed at stimulating gestures and tasks similar to those that would occur in their daily life. A natural interaction multi-touch table is used to simulate a familiar situation: a dirty table to clean. The table supports the manipulation of digital content, letting users interact through natural gestures. Patients are asked to move a real sponge on the interactive table in order to physically erase spots displayed on the screen.
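The erasing interaction described above can be sketched in a few lines of code. The following Python fragment is purely illustrative (the names `Spot` and `apply_sponge_contact`, and the circular contact model, are our assumptions, not the project's actual implementation): it marks a virtual stain as erased whenever the sponge's contact area overlaps it.

```python
from dataclasses import dataclass
import math

@dataclass
class Spot:
    """A virtual stain displayed on the tabletop (hypothetical model)."""
    x: float          # centre position in table coordinates (cm)
    y: float
    radius: float     # stain radius (cm)
    erased: bool = False

def apply_sponge_contact(spots, cx, cy, sponge_radius):
    """Mark as erased every spot that overlaps the sponge's circular
    contact area; return how many spots were erased by this contact."""
    wiped = 0
    for spot in spots:
        overlapping = math.hypot(spot.x - cx, spot.y - cy) <= sponge_radius + spot.radius
        if not spot.erased and overlapping:
            spot.erased = True
            wiped += 1
    return wiped
```

In practice the tabletop would report a stream of such contact events as the patient wipes the sponge across the surface.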
Training phase: an elderly patient using the interactive tabletop at Montedomini.
The testing phase consists of gathering information on the severity of the condition and the progress of the patient, whilst the training phase encourages the exploration of the neglected hemifield through various procedures. The set of spots (stimuli) displayed on the table can be configured according to four parameters: number, location, size and nature. Currently it is possible to visualize stains of coffee, oil, water and dust. Doctors and therapists can create different degrees of difficulty, individually tuned to each patient according to several parameters (response speed, exposure time of a stimulus, spatial distribution of stimuli, involved sensory channels, audiovisual tasks, number of stimuli to control, etc.).
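To make the four stimulus parameters concrete, here is a hedged sketch of how a session configuration might be represented and used to generate a trial. All names and defaults (`SessionConfig`, `left_bias`, the table dimensions) are hypothetical illustrations, not the system's actual parameters; the only grounded details are the four parameters (number, location, size, nature) and the four stain types.

```python
import random
from dataclasses import dataclass

STAIN_TYPES = ("coffee", "oil", "water", "dust")

@dataclass
class SessionConfig:
    """Hypothetical per-patient configuration covering the four
    stimulus parameters: number, location, size and nature."""
    n_spots: int = 10
    left_bias: float = 0.7            # fraction of spots placed in the neglected (left) hemifield
    size_range: tuple = (1.0, 4.0)    # stain radius range (cm)
    stain_types: tuple = STAIN_TYPES
    exposure_time_s: float = 30.0     # how long stimuli remain on screen

def generate_spots(cfg, table_w=100.0, table_h=60.0, rng=None):
    """Generate one trial's worth of stains from a configuration."""
    rng = rng or random.Random()
    spots = []
    for _ in range(cfg.n_spots):
        # Bias placement toward the neglected hemifield to encourage
        # exploration of the ignored side during training.
        if rng.random() < cfg.left_bias:
            x = rng.uniform(0, table_w / 2)
        else:
            x = rng.uniform(table_w / 2, table_w)
        spots.append({
            "x": x,
            "y": rng.uniform(0, table_h),
            "radius": rng.uniform(*cfg.size_range),
            "nature": rng.choice(cfg.stain_types),
        })
    return spots
```

A therapist-facing tool would expose these knobs so that difficulty can be raised or lowered per patient, as the article describes.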
All the activities during the rehabilitation sessions are monitored and automatically stored in a database, so that a personal profile of each patient can be built to estimate performance in terms of accuracy (number of erased spots), time spent accomplishing the task and trajectory of movements. In this way, medical staff can work with a novel diagnostic tool which provides useful information, statistics, charts and high-level data for the evaluation of patients.
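The three performance measures mentioned above (accuracy, time spent, and movement trajectory) could be derived from a session log along these lines. The event format and function below are our own illustrative assumptions, not the project's database schema: each event is a timestamped sponge position, optionally tagged with the id of a spot erased at that moment.

```python
import math

def session_metrics(events, total_spots):
    """Compute accuracy, completion time and trajectory length from a
    session log. Each event is (timestamp_s, x, y, erased_spot_id_or_None);
    this log format is a hypothetical sketch."""
    erased = {e[3] for e in events if e[3] is not None}
    accuracy = len(erased) / total_spots if total_spots else 0.0
    duration = events[-1][0] - events[0][0] if len(events) > 1 else 0.0
    # Trajectory length: sum of straight-line distances between
    # consecutive sponge positions.
    path_len = sum(
        math.hypot(b[1] - a[1], b[2] - a[2])
        for a, b in zip(events, events[1:])
    )
    return {"accuracy": accuracy, "duration_s": duration, "path_cm": path_len}
```

Aggregating such metrics across sessions is what would allow the charts and high-level statistics offered to the medical staff.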
The system was installed in the laboratories of Montedomini A.S.P. in Florence in July 2010. During the first year, the Montedomini staff experimented with the system on about 30 patients of different ages and degrees of disease severity. The first results are very promising from both the medical and the patient viewpoint. The collected data are currently being analysed for scientific validation.
Future directions of the project will address the design of new training modalities by means of enriched dynamic content (audio-visual and moving objects, diverse shapes, etc.) and collaborative functionalities involving medical trainers and patients simultaneously.
The authors would like to thank Nicola Torpei, Prof. Maria Pia Viggiano, Sergio Costanzo and Monica Dainelli for their invaluable work during the project.
University of Florence, Italy