Image Presentation Method for Human Machine Interface Using Deep Learning Object Recognition and P300 Brain Wave

Rio Nakajima - Gifu University, 1-1 Yanagido, Gifu, 501-1193, Japan
Muhammad Ilhamdi Rusydi - Universitas Andalas, Padang City, 25164, Indonesia
Salisa Asyarina Ramadhani - Universitas Andalas, Padang City, 25164, Indonesia
Joseph Muguro - Dedan Kimathi University of Technology, Private Bag, Nyeri 10143, Kenya
Kojiro Matsushita - Gifu University, 1-1 Yanagido, Gifu, 501-1193, Japan
Minoru Sasaki - Gifu University, 1-1 Yanagido, Gifu, 501-1193, Japan

Welfare robots, as a category of robotics, seek to improve the quality of life of elderly people and patients by providing control mechanisms that enable users to be self-dependent. This is achieved through man-machine interfaces that operate external processes such as feeding or communication. This research aims to realize a man-machine interface, applicable to patients with locked-in syndrome, that combines brainwave measurement with object recognition. The system uses a camera with a pretrained object-detection model to recognize the environment and displays the detected objects in an interface that solicits a choice via P300 signals. Because the system is camera-based, field of view and luminance level were identified as possible influences on performance. We designed six experiments by varying the arrangement of the stimuli (triangular or horizontal) and the brightness/colour levels. The results showed that the horizontal arrangement achieved better accuracy than the triangular one. Further, colour was identified as a key parameter for successful discrimination of the target stimulus. The precision of discrimination can therefore be improved by adopting a harmonized arrangement and selecting an appropriate saturation/brightness for the interface.
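The abstract describes selecting one of the camera-detected objects from the interface via P300 responses. As a rough illustration of the discrimination step only (not the authors' actual method), the following sketch averages the EEG epochs time-locked to each stimulus flash and picks the stimulus with the largest positive deflection in the typical P300 window. The labels, sampling rate, and synthetic data are illustrative assumptions.

```python
import numpy as np

def select_target(epochs, fs=250, window=(0.25, 0.45)):
    """Pick the stimulus whose trial-averaged epoch shows the largest
    mean amplitude in the assumed P300 window (~250-450 ms post-flash).

    epochs: dict mapping stimulus label -> array of shape
            (n_trials, n_samples), EEG epochs time-locked to each flash.
    """
    lo, hi = int(window[0] * fs), int(window[1] * fs)
    scores = {
        label: trials.mean(axis=0)[lo:hi].mean()
        for label, trials in epochs.items()
    }
    return max(scores, key=scores.get)

# Synthetic demo: three hypothetical detected objects; "cup" is the
# attended target, so its epochs carry a positive bump near 300 ms.
rng = np.random.default_rng(0)
fs, n_samples = 250, 200                      # 0.8 s epochs at 250 Hz
t = np.arange(n_samples) / fs
bump = np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # Gaussian at 300 ms

epochs = {}
for label in ("cup", "spoon", "phone"):
    noise = rng.normal(0.0, 1.0, size=(20, n_samples))
    epochs[label] = noise + (3.0 * bump if label == "cup" else 0.0)

print(select_target(epochs, fs=fs))           # prints "cup"
```

In a real P300 speller-style interface, each detected object would flash repeatedly in the chosen arrangement, and a trained classifier (rather than a simple window average) would typically score the epochs.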


Keywords: image; human machine interface; electroencephalogram; object recognition; P300.




This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

JOIV : International Journal on Informatics Visualization
ISSN 2549-9610 (print) | 2549-9904 (online)
Organized by the Department of Information Technology, Politeknik Negeri Padang; the Institute of Visual Informatics, UKM; and the Soft Computing and Data Mining Centre, UTHM.

