by Cristina Farmaki, Matthew Pediaditis, and Vangelis Sakkalis (ICS-FORTH)

Unlocking the true potential of assistive rehabilitation technologies depends heavily on how well they can adapt to the specific needs and abilities of the people who rely on them. Previous advances in brain-computer interfacing (BCI) promise an alternative communication path, but do they actually contribute to social inclusion, or do they remain demonstrations only?

The i-AMA project aims to develop complete, closed-loop BCI navigation applications, including low-cost hardware and software, that help patients suffering from severe paralysis regain a sense of autonomy and initiative.

Brainstem stroke, spinal cord injury and neurodegenerative diseases such as amyotrophic lateral sclerosis (ALS) are only some of the conditions that can lead to paralysis, quadriplegia, or even locked-in syndrome (LIS), in which patients lose control of most of their muscles while their cognitive abilities remain intact. The development of assistive systems that move beyond the typical means of control is of paramount importance for these individuals, as their brain signals are their only way of communicating with their environment. Patients suffering from these conditions can benefit greatly from assistive BCI technologies, since such systems rely solely on brain signals to control external devices, without involving peripheral nerves and muscles. BCIs typically exploit the excellent temporal resolution, portability, affordability and non-intrusiveness of electroencephalographic (EEG) recordings, which serve as control signals in real-time tasks. More specifically, an EEG-based BCI analyses the user’s EEG signals and decodes specific brain patterns into control commands for the devices to be controlled.

The i-AMA project [L1] stemmed from the collaboration of FORTH-ICS in Crete (GR) with the rehabilitation and recovery centre ANIMUS in Larisa (GR) and WHEEL (GR), a company in Salonica that specialises in manufacturing wheelchairs. The i-AMA research, running from 2018 to 2022, is supported by the European Union and Greek national funds through the call RESEARCH CREATE INNOVATE (project code: T1EDK-01675). In the i-AMA project, we focused on navigation-oriented applications and chose the Steady-State Visually Evoked Potential (SSVEP) stimulation protocol, which employs specific visual targets flickering at different frequencies. When someone focuses their gaze on a flickering stimulus, a response at the same frequency appears in the EEG over their visual cortex and can be detected by dedicated signal processing algorithms. Each distinct frequency can therefore be assigned to a different movement command in a navigation system. SSVEPs have proven ideal for such applications due to their high Information Transfer Rate (ITR), fast response time and the minimal training they require.
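The article does not name the decoding algorithm used in i-AMA; purely as an illustration of how SSVEP frequency detection is commonly done, the sketch below scores an EEG window against reference sinusoids at each candidate flicker frequency using canonical correlation analysis (CCA). The sampling rate, channel count and stimulation frequencies are assumed values, not the project’s actual parameters.

```python
# Hedged sketch: SSVEP target detection with canonical correlation analysis (CCA).
# CCA is shown only as a common baseline, not as the i-AMA pipeline; all
# parameters below (sampling rate, frequencies, channel count) are assumptions.
import numpy as np
from sklearn.cross_decomposition import CCA

FS = 256                        # sampling rate in Hz (assumed)
STIM_FREQS = [8.0, 10.0, 12.0]  # candidate flicker frequencies in Hz (assumed)
N_HARMONICS = 2                 # sine/cosine harmonics per reference

def reference_signals(freq, n_samples, fs=FS, n_harmonics=N_HARMONICS):
    """Build sine/cosine reference signals for one stimulation frequency."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)            # shape: (n_samples, 2 * n_harmonics)

def detect_ssvep(eeg_window):
    """Return the flicker frequency whose references correlate best with the
    EEG window (shape: n_samples x n_channels), plus that correlation score."""
    scores = []
    for freq in STIM_FREQS:
        refs = reference_signals(freq, eeg_window.shape[0])
        cca = CCA(n_components=1)
        x_c, y_c = cca.fit_transform(eeg_window, refs)
        scores.append(abs(np.corrcoef(x_c[:, 0], y_c[:, 0])[0, 1]))
    best = int(np.argmax(scores))
    return STIM_FREQS[best], scores[best]

# Example with synthetic data: 2 s of 8-channel noise plus a weak 10 Hz component.
rng = np.random.default_rng(0)
t = np.arange(2 * FS) / FS
eeg = rng.normal(size=(2 * FS, 8)) + 0.5 * np.sin(2 * np.pi * 10.0 * t)[:, None]
print(detect_ssvep(eeg))   # expected to pick 10.0 Hz
```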

The initial application our team developed was an SSVEP-based BCI for the navigation of a telepresence robotic car, which has already been successfully evaluated and presented [1,2]. The natural next step after navigating a telepresence robotic car was the independent movement of the patients themselves. To this end, we applied the principles of the developed BCI to the control of an electric wheelchair. To establish communication between the BCI and the wheelchair, we developed an electric wheelchair controller (EWC), which replaces the standard joystick module and supports both wireless and wired communication with the BCI. The BCI system can thus send the detected user commands directly to the EWC and control the wheelchair’s direction. The wheelchair’s battery has a large capacity and can power both the EWC and the computer system responsible for the interface presentation, the signal processing and the machine learning modules, creating an autonomous system. The SSVEP-based BCI uses a three-target scheme: three red–black checkerboards reverse their pattern at three different frequencies on a laptop screen. The three targets correspond to the movement commands FORWARD and SELF-ROTATION TO THE RIGHT or LEFT, while the STOP command is detected when the user focuses their gaze at the centre of the screen, or even off-centre, where no flickering occurs. An EEG device records the user’s brain signals at all times, and the developed algorithms analyse and decode them in real time into movement commands, which are forwarded to the wheelchair controller.
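To illustrate this closed loop, the sketch below maps a detected flicker frequency to a movement command and forwards it to the wheelchair controller. The frequency-to-command table, the confidence threshold, the serial transport and the byte encoding are hypothetical: the article states that the BCI communicates with the EWC over a wired or wireless link but does not publish the actual protocol.

```python
# Hedged sketch: turning a decoded SSVEP frequency into a wheelchair command and
# forwarding it to the electric wheelchair controller (EWC). The mapping, the
# threshold and the one-byte command codes are illustrative assumptions only.

# Hypothetical mapping of flicker frequencies (Hz) to movement commands.
FREQ_TO_COMMAND = {8.0: "FORWARD", 10.0: "ROTATE_LEFT", 12.0: "ROTATE_RIGHT"}
CONFIDENCE_THRESHOLD = 0.35   # below this, assume the user is not fixating a target

def frequency_to_command(freq, score):
    """Return STOP when no target is reliably detected (the user looks at a
    non-flickering area), otherwise translate the frequency into a command."""
    if score < CONFIDENCE_THRESHOLD or freq not in FREQ_TO_COMMAND:
        return "STOP"
    return FREQ_TO_COMMAND[freq]

def send_to_ewc(port, command):
    """Write a single-byte command code to a port-like object (encoding is hypothetical)."""
    codes = {"STOP": b"\x00", "FORWARD": b"\x01",
             "ROTATE_LEFT": b"\x02", "ROTATE_RIGHT": b"\x03"}
    port.write(codes[command])

# Example wiring (requires the pyserial package; port name and baud rate are assumptions):
# import serial
# with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as ewc:
#     freq, score = detect_ssvep(eeg_window)            # from the CCA sketch above
#     send_to_ewc(ewc, frequency_to_command(freq, score))
```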

The BCI-based wheelchair navigation system has been tested under both indoor and outdoor conditions on able-bodied subjects [3], exhibiting high accuracy, robustness and ease of use. The latest version of the system, which includes high-sensitivity distance sensors for collision avoidance, is being tested on patients suffering from various neuromuscular dysfunctions at the ANIMUS rehabilitation centre (Figure 1). Applying an assistive system to real patients is always a challenging task, as each patient suffers from a different disease with distinct characteristics, which affect the effectiveness of the system to different extents. Brain signals captured from able-bodied users are robust and well-characterised, and the resulting BCI performance is very promising, but in many cases these results cannot be transferred directly to patients. i-AMA therefore focuses its efforts on personalising the overall system according to each patient’s specific needs and limitations. Our research expands beyond able-bodied test users to account for real-world difficulties, such as excessive motion artifacts and users who cannot maintain focus, including the particularly complex domain of applications for children.
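The collision-avoidance logic is likewise not detailed in the article; the following minimal sketch shows one plausible way readings from the distance sensors could override a decoded FORWARD command. The safety threshold and the sensor interface are assumptions for illustration only.

```python
# Hedged sketch: a simple collision-avoidance override for the decoded command.
# The safety distance and the idea of blocking only forward motion are assumptions,
# not the i-AMA system's documented behaviour.
SAFETY_DISTANCE_M = 0.5   # assumed minimum front clearance before FORWARD is blocked

def apply_collision_guard(command, front_distance_m):
    """Block forward motion when an obstacle is closer than the safety distance;
    rotation in place and STOP are always allowed."""
    if command == "FORWARD" and front_distance_m < SAFETY_DISTANCE_M:
        return "STOP"
    return command

# Example: a FORWARD command is suppressed when an obstacle is 0.3 m ahead.
print(apply_collision_guard("FORWARD", 0.3))       # -> STOP
print(apply_collision_guard("ROTATE_LEFT", 0.3))   # -> ROTATE_LEFT
```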

Figure 1: Evaluation of the SSVEP-based BCI for wheelchair navigation at ANIMUS Rehabilitation Center.

Our future goals include grouping specific disorders according to their characteristic traits, in order to develop user-tailored initialisations of the system. An important insight from the evaluation with patients is the need for a user-friendly, cost-effective, highly adaptive system that requires minimal preparation and training. Hence, we exploit dry electrodes that require no preparation time, different interface schemes, pre-trained models for the machine learning algorithms, and low-cost EEG hardware. All these considerations have the potential to enable assistive interfaces that can be integrated into the daily life of patients, build social trust and pave the way towards a more accessible and inclusive society.

Link:
[L1] https://i-ama.gr/

References:
[1] C. Farmaki, V. Sakkalis: “Low Cost Brain-Controlled Telepresence Robot: A Brain-Computer Interface for Robot Car Navigation”, ERCIM News, no. 114, July 2018, pp. 42-43.
[2] C. Farmaki, et al.: “Single-channel SSVEP-based BCI for robotic car navigation in real world conditions”, IEEE International Conference on Bioinformatics and Bioengineering (BIBE), Athens, October 2019.
[3] M. Krana, et al.: “SSVEP based wheelchair navigation in outdoor environments”, Annual International Conference of the IEEE Engineering in Medicine and Biology Society, November 2021, pp. 6424-6427.

Please contact:
Vangelis Sakkalis, FORTH-ICS, Greece
