Owing to its connection to sensory information and behavior control, neural activity research has recently attracted considerable attention. However, current methods of brain activity sensing involve expensive equipment and physical proximity to the subject. Sensation is a physical process in which the sensory systems of the body respond to stimuli and provide data for perception1. Human sensory systems are involved in daily activities, both consciously and unconsciously, and the study of the senses, especially their connection with brain activity, has been gaining popularity in recent years. Brain activity analysis using electroencephalography (EEG) has received considerable attention2–6, particularly in relation to the five basic human senses: sight2, touch7, hearing8,9, smell10–13 and taste14. In addition to EEG, a number of optical techniques have been employed for monitoring human brain activity, using image contrast analysis15,16 and cross-correlation-based analysis of laser speckle imaging17. However, these methods mainly relate the laser speckle image to its temporal fluctuations; extracting semantic information from sensory activity is still lacking.
The temple area of the human head lies in front of the cerebral cortex and is not an optical-quality surface. Therefore, when it is illuminated by a laser beam, the back-scattered light forms secondary speckle patterns, which can be imaged by a digital camera with defocused optics. Analysis of temporal changes in the spatial distribution of these speckle patterns can be related to nano-vibrations of the illuminated surface caused by the hemodynamic process associated with the transient blood flow occurring during human brain activation17.
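The core of such temporal analysis is tracking the lateral shift of the speckle pattern between consecutive frames, since a tilt of the illuminated surface translates the defocused pattern across the sensor. The following is a minimal sketch (not the authors' implementation) of recovering that shift via FFT-based cross-correlation; the frames are synthetic and the function name is illustrative.

```python
import numpy as np

def speckle_shift(frame_a, frame_b):
    """Estimate the lateral (row, col) shift between two speckle frames
    by locating the peak of their circular cross-correlation."""
    fa = np.fft.fft2(frame_a - frame_a.mean())
    fb = np.fft.fft2(frame_b - frame_b.mean())
    corr = np.fft.ifft2(fa * np.conj(fb)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peak indices into signed shifts (e.g. 126 on a 128 axis -> -2).
    return tuple(int(p) if p <= s // 2 else int(p - s)
                 for p, s in zip(peak, corr.shape))

# Synthetic demo: a random speckle-like frame shifted by (3, -2) pixels.
rng = np.random.default_rng(0)
frame = rng.random((128, 128))
shifted = np.roll(frame, shift=(3, -2), axis=(0, 1))
print(speckle_shift(shifted, frame))  # → (3, -2)
```

Tracking this shift frame by frame yields a one-dimensional vibration signal whose temporal spectrum reflects the hemodynamic activity of the illuminated region.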
Speckle-based remote sensing has been used to develop a range of biomedical applications, such as monitoring heart rate18, breathing19, blood pressure20, blood oximetry21, blood coagulation22,23, bone fractures24, melanoma25, and neural activity17. Prior methods for classifying speckle patterns used either a single frame26 or the full video frame by frame21, averaging the model predictions over all frames of the video and applying a threshold to select the desired output. These prior classification methods used a Convolutional Neural Network (CNN) to encode data from a single frame; however, speckle patterns recorded over successive periods can be characterized as time-series data27. Owing to this temporal dependency, we hypothesize that a recurrent neural network architecture would yield improved results.
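The frame-averaging decision rule of the prior methods can be sketched as follows; the threshold value and function name are illustrative assumptions, not taken from the cited works.

```python
import numpy as np

def video_label(frame_probs, threshold=0.5):
    """Prior-style decision rule: average the per-frame class
    probabilities over the whole video, then accept the argmax class
    only if its mean probability clears the threshold."""
    mean_probs = np.asarray(frame_probs).mean(axis=0)  # shape: (n_classes,)
    best = int(np.argmax(mean_probs))
    return best if mean_probs[best] >= threshold else None

# Toy example: per-frame softmax outputs for 4 frames, 3 classes.
probs = [[0.7, 0.2, 0.1],
         [0.6, 0.3, 0.1],
         [0.5, 0.4, 0.1],
         [0.8, 0.1, 0.1]]
print(video_label(probs))  # → 0 (mean probability 0.65 ≥ 0.5)
```

Note that averaging discards the ordering of frames entirely, which is precisely the temporal information a recurrent architecture (e.g. a CNN encoder followed by an LSTM over the frame sequence) is designed to exploit.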
In this study, we propose a method for the classification and detection of three basic senses: smell, taste, and hearing. The detection is based on projecting a laser beam onto a specific area of the human head associated with cerebral cortex activity (see Fig. 1) and analyzing the recorded speckle patterns using a deep neural network (DNN). To ascertain the reliability of our approach, the results were compared with synchronized, simultaneously recorded EEG, known to be an effective method for detecting brain activity related to the human senses2. We trained an EEG-based DNN on the recorded EEG data and compared its results with those of the speckle-based DNN to assess the conformity between the two approaches.
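Conformity between two classifiers' outputs can be quantified with a chance-corrected agreement statistic; Cohen's kappa, sketched below, is one standard choice (the choice of metric here is our illustration, and the label sequences are invented).

```python
import numpy as np

def cohens_kappa(a, b):
    """Chance-corrected agreement between two label sequences."""
    a, b = np.asarray(a), np.asarray(b)
    classes = np.union1d(a, b)
    po = np.mean(a == b)  # observed agreement
    # Expected agreement under independent labeling with marginal rates.
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in classes)
    return (po - pe) / (1 - pe)

# Hypothetical per-trial predictions (0 = smell, 1 = taste, 2 = hearing).
speckle_pred = [0, 1, 2, 1, 0, 2, 1, 0]
eeg_pred     = [0, 1, 2, 2, 0, 2, 1, 1]
print(round(cohens_kappa(speckle_pred, eeg_pred), 3))  # → 0.628
```

A kappa near 1 would indicate that the speckle-based DNN reproduces the EEG-based decisions well beyond chance agreement.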
Our study could be of importance for patients suffering from stroke or cancer28–31 who experience irregularities in their basic senses, especially taste and smell. The frequently occurring loss of taste and smell associated with COVID-19 is also noteworthy32,33. The olfactory neurons, which detect odors in the air and send signals to the brain, are one possible pathway for such sensory loss34. The ability to detect sensory loss remotely and with relative simplicity could also contribute to identifying carriers of the COVID-19 virus.