Emotion detection scheme using facial skin temperature and heart rate variability

Technology nowadays aims to provide a better quality of life, and schools and universities are working for the convenience of their students while ensuring that a high quality of education is attained. An emotion detection system can contribute to better educational results and may also be used as part of human-computer interaction applications such as robotics, games, and intelligent tutoring systems. This study presents a method of detecting emotions using mobile computing to recognize and identify four emotions (relax, fear, sadness, and joy) based on facial skin temperature, more specifically at five spots on the face (the nose, the glabellar line between the eyes and eyebrows, the right and left cheeks, and the chin), in addition to heart rate variability (HRV). An experiment was conducted with 20 healthy subjects (10 females and 10 males, 20 to 31 years old); both visual and auditory media were used to induce the emotions. The collected data were analysed with an artificial neural network (ANN); the Multilayer Perceptron (MLP) was selected as the classifier and achieved an accuracy of 88.75%. This mechanism shows that human emotions can be identified without physical interaction with the subject and with high reliability, with a misprediction rate of only 0.11.


Introduction
In the last decade [1][2][3][4][5][6][7][8][9][10][11][12], people have developed several methods to detect and analyse emotions through different models such as facial expressions, walking gestures, and speech. However, these methods tend to misjudge emotions because humans restrain themselves from showing their true emotions in public, especially negative emotions, as described by the social mask theory [1]. Another factor that can cause misclassification is that averaging the results, as well as normalizing/standardizing methods, may yield high accuracy but at the same time fails to take individual differences into consideration, as every person differs from the others. On the other hand, emotion recognition using physiological signals [2] has shown its reliability in detecting different kinds of emotions. Consider the Autonomic Nervous System (ANS): the ANS has a strong connection with human emotions [2,7], and both its sympathetic and parasympathetic outflows control and regulate the functions of our internal organs (the viscera), such as the heart rate, heart rate variability, blood pressure, and body temperature. From this observation, heart rate variability and facial skin temperature are among the best parameters for detecting emotions. Facial skin temperature can easily be observed and analysed with a thermal camera; for the HRV, an HRV monitor was used to collect the HRV score calculated and provided by Elite HRV. In essence, human emotions can be identified using facial skin temperature and heart rate variability.

Related work
Microsoft [4] has a working emotion detection system, called the Emotion Application Programming Interface (API), which can detect anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise. The main disadvantage of this system is its reliance on facial expressions, as humans often do not show their negative emotions or are able to hide their emotions.
Viviane C.R. Appel, Valdinei L. Belini, Denny H. Jong, Daniel V. Magalhães, and Glauco A.P. Caurin [5] proposed a system that uses a thermal camera to detect three emotional states (neutral, motivated, overstressed) from images of the face, with a detection accuracy of 99.2%. However, the use of facial expressions to classify these states is a disadvantage, as the subject can easily hide his/her emotions.
Valderas MT, Bolea J, Laguna P, Vallverdú M, and Bailón R. [6] proposed a new system to detect four major emotions (joy, fear, anger, and sadness) using heart rate variability (HRV) with spectral bands varying with the respiratory frequency (RF). On the other hand, the use of a database of electrocardiogram (ECG), respiration, blood pressure (BP), skin temperature (ST), and galvanic skin response (GSR) signals makes this proposal less ubiquitous, due to the need to attach several devices to the user to collect the required data.

Proposed method
In this paper, a method to detect human emotions using facial skin temperature and HRV is proposed. Figure 1 shows the five facial spots that are investigated to estimate the emotion, and Figure 2 shows the HRV results of one subject. The face is usually not covered, yet its expression does not necessarily reveal a person's true emotion. Hence, the facial skin temperature is expected to carry essential features of the human's emotional state. In this research, the temperatures of the five spots are used as a representative of the facial skin temperature, together with the HRV. This method should be especially suitable for young students, since they are usually under much stress and have no time or money to consult an expert.
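Extracting the spot temperatures from a thermal frame can be sketched as below. This is a minimal illustration, not the authors' implementation: the region coordinates are hypothetical placeholders, since the actual positions depend on where the face sits in the FLIR C2 frame.

```python
import numpy as np

# Hypothetical ROI bounding boxes (row_min, row_max, col_min, col_max) in pixels.
# Real coordinates would come from locating the face in each thermal frame.
ROIS = {
    "nose":        (60, 80, 70, 90),
    "glabellar":   (30, 45, 65, 95),
    "right_cheek": (55, 75, 95, 115),
    "left_cheek":  (55, 75, 45, 65),
    "chin":        (90, 110, 70, 90),
}

def extract_spot_temperatures(frame):
    """Return the maximum temperature (deg C) inside each facial region,
    used as the representative temperature of that region."""
    return {name: float(frame[r0:r1, c0:c1].max())
            for name, (r0, r1, c0, c1) in ROIS.items()}

# Example with a synthetic 120x160 thermal frame around 34 deg C.
frame = np.full((120, 160), 34.0)
frame[65, 80] = 36.2  # one warmer pixel inside the nose region
temps = extract_spot_temperatures(frame)
```

Taking the per-region maximum mirrors the paper's choice of using the maximum temperature of each spot as the area's temperature.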

Subjects
An experiment was conducted on 20 healthy volunteer subjects (10 females and 10 males, 20 to 31 years old) without any medical history. The subjects came from 10 different countries (Japan, China, Korea, Saudi Arabia, Indonesia, Senegal, Brazil, Mexico, Malaysia, and Thailand) and were all students at Shibaura Institute of Technology. All subjects agreed not to consume any caffeine, alcohol, or cigarettes for 6 hours before the experiment, and it was confirmed that all subjects had had breakfast and rested well the night before.

Procedures
The subjects were instructed to accept the emotions and try to be honest during the experiment, and the possible risks of the fear and sadness stimuli were explained to all of them. Each subject conducted the experiment individually. The subjects were asked to sit in front of a laptop in a dimly lit room, facing a thermal camera (FLIR C2) controlled remotely by the FLIR Tools software, and wearing an HRV monitor (Zoom HRV) connected by Bluetooth to a Huawei MediaPad M5 running Elite HRV. Sony 1000XM2 headphones were used with noise cancellation, as seen in Fig. 3. The room temperature was set to 22.5 degrees C. The thermal camera recorded the subject's face as video data. When a subject became too emotional under the fear or sadness stimuli, the experiment was suspended for a while and then carried on after the subject calmed down.
As in previous studies [2], the movie-clip method has been confirmed as an effective emotion-inducing tool. All subjects watched the same set of videos in the same order (relax, fear, joy, sadness); the content of each clip is described in Table 1. These clips were selected carefully from YouTube according to the number of views, the number of likes, and the viewers' confirming comments as evidence of an effective emotional stimulus. Each video triggers one emotion and was 3 minutes long; the subjects then took a 3- to 5-minute break to ensure that their mental state returned to normal. After the break, the next video was played, and this was repeated until the subject had watched all the videos. Before finishing the experiment, the subjects answered a questionnaire on how effectively the emotion was induced by each video. Finally, the subjects were rewarded for their contributions.

Table 1 (excerpt). Sadness: death and sick animals.

Data collection and analysis
After collecting the biological information (facial skin temperature and HRV) from the experiment, a database was built for all subjects. The maximum temperature of the five spots on the face and the HRV of each subject were recorded as a baseline before the experiment started; then, once a video started, both the HRV and the maximum temperatures of the five spots were recorded every 30 seconds for each emotion. First, the maximum temperature within each of the five spots was extracted and used as the temperature of that area. Each subject provided 28 images across the four emotions; each image has five temperatures, for the nose, glabellar line, right/left cheeks, and chin. In total, 560 thermal images were extracted from the thermal recordings of all the subjects. On the other hand, the HRV monitor was set to start recording when the video started and to stop when the video finished; with the help of Kubios HRV Standard, the HRV score of each subject was extracted every 30 seconds for each emotion.
An Artificial Neural Network (ANN) was used to estimate the emotion. In the ANN configuration, the five temperatures in each image are used as an input data set together with the HRV in a time-series database, and one of the emotions is output. In the evaluation, cross-validation was performed. Subsequently, all the prepared data were divided into a training data set and a testing data set: a first evaluation was made using a randomly divided 80% training set and 20% testing set, and a second evaluation used a randomly divided 50% training set and 50% testing set to confirm the validation. Table 2 shows the result of processing the data with the ANN: with cross-validation, an estimation accuracy of 88.75% was achieved. Table 3 shows the correct prediction rate of each emotion under cross-validation: 95.00% for relax, 90.00% for fear, and 85.00% each for joy and sadness.
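The 80%/20% evaluation described above can be sketched with scikit-learn's MLPClassifier. The data here are synthetic stand-ins for the real recordings (random values shaped like the 560 samples of five spot temperatures plus an HRV score), so the resulting accuracy is illustrative only.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in: 560 samples x 6 features
# (five spot temperatures + HRV score), 4 emotion classes.
n = 560
X = rng.normal(loc=34.0, scale=1.0, size=(n, 6))
y = rng.integers(0, 4, size=n)  # 0=relax, 1=fear, 2=joy, 3=sadness
X += y[:, None] * 0.5           # shift class means so classes are separable

# Randomly divided 80% training / 20% testing split, as in the paper.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

scaler = StandardScaler().fit(X_train)
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(scaler.transform(X_train), y_train)
accuracy = clf.score(scaler.transform(X_test), y_test)
```

The hidden-layer size and scaling step are assumptions for the sketch; the paper does not report the MLP's architecture or preprocessing.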
Besides, when using an 80% training set and a 20% testing set randomly selected from all the observed data, the total estimation accuracy was 88.75%. Furthermore, when using a 50% training set and a 50% testing set, the total estimation accuracy was 70%.
Since the estimation accuracy using the ANN was as expected, other algorithms, such as the decision tree (J48) and the Support Vector Machine (SVM), were also applied to the same data sets. When the J48 decision tree algorithm was used, the lowest estimation accuracy, 62.5%, was obtained. When the SVM was used, the total estimation accuracy was 68.75%.
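A classifier comparison like the one above can be sketched as follows, again on synthetic stand-in data. Note that scikit-learn's DecisionTreeClassifier is used as a stand-in for Weka's J48; the two are related (CART vs. C4.5) but not identical.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(34.0, 1.0, size=(560, 6))  # five temperatures + HRV score
y = rng.integers(0, 4, size=560)          # four emotion classes
X += y[:, None] * 0.5

models = {
    "MLP": MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                         random_state=0),
    "DecisionTree (J48 stand-in)": DecisionTreeClassifier(random_state=0),
    "SVM": SVC(),
}

# Mean cross-validated accuracy for each model on identical data.
scores = {name: cross_val_score(make_pipeline(StandardScaler(), model),
                                X, y, cv=5).mean()
          for name, model in models.items()}
```

Because all three models see the same folds, the comparison isolates the classifier choice, which mirrors how the paper evaluates MLP against J48 and SVM.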

Conclusion and future work
In this paper, a new method for detecting human emotions using facial skin temperature and HRV was proposed, and its effectiveness was evaluated using several types of machine learning techniques. As a result, it was revealed that the proposed method with the MLP as a classifier achieved the best accuracy, 88.75%, in estimating the emotions of the normal state (relax), fear, joy, and sadness. As future work, more emotions will be included to confirm the proposed method, and the number of subjects will be increased to investigate individual differences. Furthermore, other parameters from the HRV, such as the LF/HF ratio, will be considered, and the Skin Conductance Response (SCR) will also be examined as an additional parameter in order to improve the estimation accuracy of emotion detection.