by Cristina Farmaki and Vangelis Sakkalis (ICS-FORTH)
An innovative and reliable EEG brain-computer navigation system has been developed in the Computational Biomedicine Laboratory (CBML), at ICS-FORTH, in Crete, Greece. The implemented system is intended to enable patients suffering from neuromuscular paralysis to act independently, make their own decisions and, to some extent, take part in social life. Using such a system engages mental abilities in various ways and is expected to improve quality of life and benefit mental health.
A variety of neurological conditions can lead to severe paralysis, even to typical locked-in syndrome, in which patients retain their mental abilities and consciousness but suffer from complete paralysis (quadriplegia and anarthria), except for eye movement control. Locked-in syndrome usually results from an insult to the ventral pons, most commonly a brainstem haemorrhage or infarct. Other causes affecting this part of the brainstem include trauma, infections such as encephalitis, as well as neurodegenerative diseases of motor neurons, such as amyotrophic lateral sclerosis, in which the patient gradually loses muscle control and, consequently, the ability to communicate.
Since these patients' mental functions remain unaffected, their motor impairment often results in social exclusion, usually leading to depression and resignation. As a consequence, providing even minimal means of communication and control can substantially improve the quality of life of both patients and their families. To this end, we have been developing brain-computer interfaces (BCIs), which constitute a direct communication pathway between the human brain and the external world. A BCI system relies only on brain signals, without the use of peripheral nerves, and can therefore provide communication and control for patients suffering from severe neuromuscular paralysis. BCIs capture brain signals using electroencephalography (EEG), owing to its low cost, non-invasiveness, portability and good temporal resolution.
Bearing this in mind, our team, under the supervision of Dr. Vangelis Sakkalis, has designed and implemented an integrated EEG brain-computer interface for the navigation of a robot car, using a low-cost smartphone camera, so that a patient can “move” (virtually) through both nearby and remote environments. Our BCI system is based on the SSVEP (steady-state visual evoked potentials) stimulation protocol: when a user attends to a light source (an LED or, as in our case, a reversing screen pattern) that flickers at a frequency above 4 Hz, a characteristic response at that frequency can be detected over the visual cortex, located in the occipital lobe. A brief, user-tailored training session before using the interface individualises the process, leading to higher system accuracy. To wirelessly control the mobile robot car, the user focuses his/her gaze on one of four checkerboard targets displayed on a computer screen. Each target reverses its pattern at a distinct, constant frequency. A mobile wireless EEG device continuously records visual cortex activity through four channels. A specialised algorithm analyses the brain signals in real time and, using sophisticated machine learning techniques, recognises which target the user is focusing on. The user’s intention is then translated into a corresponding motion command (forward, backward, right, left) and transmitted to the robot car over a wireless link. The robot car moves in the desired direction, while a smartphone camera mounted on it captures the robot’s surroundings and streams them to the user’s screen. Thus, the user can adjust his/her next command according to the live camera feedback (Figure 1).
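The article does not name the specific classification algorithm used; canonical correlation analysis (CCA) is a widely used technique for SSVEP target detection and gives a flavour of how the attended frequency can be recognised from a multichannel EEG window. The sketch below (channel count, window length and stimulation frequencies are all illustrative assumptions, not the system's actual parameters) correlates the EEG against sinusoidal reference signals at each candidate frequency and picks the best match:

```python
import numpy as np

def max_canon_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(X)            # orthonormal basis of EEG window
    Qy, _ = np.linalg.qr(Y)            # orthonormal basis of references
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return s[0]                        # top singular value = top correlation

def classify_ssvep(eeg, fs, freqs, n_harmonics=2):
    """Return the index of the stimulation frequency whose sinusoidal
    reference set best matches the EEG window.
    eeg: (n_samples, n_channels) array; fs: sampling rate in Hz."""
    t = np.arange(eeg.shape[0]) / fs
    scores = []
    for f in freqs:
        refs = []
        for h in range(1, n_harmonics + 1):   # fundamental + harmonics
            refs.append(np.sin(2 * np.pi * h * f * t))
            refs.append(np.cos(2 * np.pi * h * f * t))
        scores.append(max_canon_corr(eeg, np.column_stack(refs)))
    return int(np.argmax(scores))

# Toy demo: a 2-second, 4-channel window dominated by a 10 Hz component.
fs = 256
t = np.arange(2 * fs) / fs
rng = np.random.default_rng(0)
sig = np.sin(2 * np.pi * 10 * t)
eeg = np.column_stack([sig + 0.5 * rng.standard_normal(t.size) for _ in range(4)])
freqs = [6.0, 7.5, 10.0, 12.0]         # hypothetical target frequencies
print(freqs[classify_ssvep(eeg, fs, freqs)])  # → 10.0
```

Because CCA pools information across all channels at once, it tends to be more robust to noise than inspecting a single occipital channel's power spectrum, which is one reason it is a common baseline for SSVEP decoding.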
Figure 1: The user focuses his/her gaze on one of four checkerboard targets on a computer screen (right) in order to remotely control the robot car (left). The robot car moves in the desired direction, while a smartphone camera mounted on it captures the robot’s surroundings and streams them to the user’s screen.
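On an LCD screen, pattern reversals are locked to the monitor's refresh rate, so each target's stimulation frequency must divide the refresh rate evenly. The helper below illustrates this constraint; the refresh rate and frequencies shown are assumptions for illustration, not the values used by the system:

```python
def reversal_schedule(stim_hz, refresh_hz=60):
    """Number of display frames between pattern reversals for a given
    stimulation frequency. refresh_hz / stim_hz must be an integer,
    otherwise the on-screen frequency cannot be held constant."""
    frames = refresh_hz / stim_hz
    if abs(frames - round(frames)) > 1e-9:
        raise ValueError(f"{stim_hz} Hz not realisable at {refresh_hz} Hz refresh")
    return int(round(frames))

# Four hypothetical target frequencies on a 60 Hz monitor:
for f in (6.0, 7.5, 10.0, 12.0):
    print(f"{f} Hz -> reverse every {reversal_schedule(f)} frames")
```

This is why screen-based SSVEP interfaces typically choose frequencies such as 6, 7.5, 10 and 12 Hz on a 60 Hz display: each yields a whole number of frames per reversal, keeping the flicker frequency stable.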
The use of a low-cost EEG device in combination with our custom brain-signal interpretation algorithms, implemented by C. Farmaki (computer engineer), the custom-built robot car based on an Arduino Due onboard microcontroller, assembled by G. Christodoulakis (robotics engineer), and the addition of a conventional smartphone camera ensure the affordability and wide accessibility of the overall solution. The implemented system has been published and successfully demonstrated in public [L1, L2], proving its efficiency and robustness under a variety of conditions (daylight or artificial light in enclosed spaces, as well as noisy, crowded environments). The WiFi protocol is used to transmit motion commands to the robot car, although other solutions, such as the Zigbee protocol, have also been explored and tested.
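The article does not describe the wire format of the motion commands; as a minimal sketch, assuming one plain UDP datagram per command and a hypothetical robot address, the transmission step might look like this:

```python
import socket

# Hypothetical mapping from the classifier's target index to a motion
# command; the system's actual command encoding is not specified.
COMMANDS = {0: b"FWD", 1: b"BACK", 2: b"LEFT", 3: b"RIGHT"}

def send_command(target_idx, robot_addr=("192.168.1.50", 9000)):
    """Send a single motion command to the robot car as a UDP datagram.
    robot_addr is an assumed address for the car's WiFi receiver."""
    cmd = COMMANDS[target_idx]
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(cmd, robot_addr)
    return cmd

# Example: target index 0 ("forward") sent to a local test receiver.
send_command(0, ("127.0.0.1", 9000))
```

UDP fits a setup like this because each command is small, self-contained and superseded by the next one: a lost datagram is better replaced by the following command than retransmitted late, as TCP would do. A Zigbee transport would keep the same index-to-command mapping and swap only the link layer.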
The major advantage of our interface is that it requires minimal training, works under realistic conditions and can be adapted to the user’s needs and to alternative application scenarios, including electric wheelchair navigation. Our team has secured a three-year grant under the Operational Programme on Competitiveness, Entrepreneurship and Innovation [L3] to build on this prototype and realise an industrial design, along with a pilot study proving and extending the possibilities of the current implementation.
The implemented system enables patients suffering from severe neuromuscular paralysis to regain a certain level of autonomy and communication with the world around them. The proposed technology paves the way for removing physical barriers, so that locked-in patients can live with their families and even access, “virtually” or “physically” (under certain conditions), schools, universities, museums, etc.
References:
[1] L. F. Nicolas-Alonso, J. Gomez-Gil: “Brain computer interfaces, a review”, Sensors (Basel), vol. 12, no. 2, 2012, pp. 1211-1279.
[2] U. Chaudhary et al.: “Brain-computer interfaces for communication and rehabilitation”, Nature Reviews Neurology, vol. 12, 2016, pp. 513-525.
[3] C. Farmaki et al.: “Applicability of SSVEP-based BCIs for robot navigation in real environments”, in Proc. IEEE EMBC, 2016, pp. 2768-2771.
Please contact:
+30 (281) 0391448