Wavelet-based study of valence–arousal model of emotions on EEG signals with LabVIEW
© The Author(s) 2016
- Received: 13 November 2015
- Accepted: 2 January 2016
- Published: 21 January 2016
This paper illustrates wavelet-based feature extraction for emotion assessment from electroencephalogram (EEG) signals using a graphical coding design. The two-dimensional (valence–arousal) emotion model was adopted, and four emotions (happy, joy, melancholy, and disgust) were assessed. These emotions were stimulated by video clips. EEG signals obtained from four subjects were decomposed into five frequency bands (gamma, beta, alpha, theta, and delta) using the "db5" wavelet function, and relative features were calculated to obtain further information. The impact of the emotions according to valence value was observed most clearly in the power spectral density of the gamma band. The main objective of this work is not only to investigate the influence of emotions on different frequency bands but also to overcome the difficulties of text-based programming. This work offers an alternative approach for emotion evaluation through EEG processing. There are a number of methods for emotion recognition, such as those based on the wavelet transform, the Fourier transform, and the Hilbert–Huang transform; however, the majority of these methods have been implemented in text-based programming languages. In this study, we propose and implement experimental feature extraction in a graphics-based language, which provides great convenience in bioelectrical signal processing.
Emotions are among the most complex and most important features that people have. They constantly guide people in everyday life, influence the decisions people make, and provide information about human experience. Since emotions are an inseparable part of people, distinguishing between them automatically is important. Currently, the most widely used techniques for discriminating between emotions rely on facial expression, skin conductance, and brain activity. Recently, studies have focused more on electroencephalogram (EEG)-based emotion recognition [1–9]. The brain produces a low-amplitude electrical signal resulting from the ionic activities that take place in it. Special electrodes are attached to the scalp to pick up the electrical signals resulting from ionic currents between neurons; these recordings are called the EEG. The EEG measures voltage fluctuations with varying amplitudes and frequencies. It is not strictly periodic but exhibits rhythmic activity. It has a low amplitude, ranging between 1 and 400 µV, and a wide frequency band, ranging between 0.5 and 100 Hz. The EEG is most often used for distinguishing emotions and for diagnosing epilepsy, sleep disorders, coma, brain death, etc. [2–4]. The EEG is divided into five frequency bands: delta (between 0.4 and 4 Hz), theta (between 4 and 7 Hz), alpha (between 8 and 13 Hz), beta (between 14 and 40 Hz), and gamma (over 40 Hz). These frequencies emerge while the brain performs different functions.
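The five bands listed above are simple frequency intervals, so their mapping can be expressed directly in code. The following Python sketch is purely illustrative (the band edges are those quoted in the text; the `band_of` helper is a hypothetical name, not part of the study's LabVIEW code):

```python
# Illustrative sketch: mapping a frequency to the EEG band named in the text.
# Band edges follow the paper (delta 0.4-4, theta 4-7, alpha 8-13, beta 14-40 Hz,
# gamma over 40 Hz, capped here at the 100 Hz EEG bandwidth mentioned above).

EEG_BANDS = [
    ("delta", 0.4, 4.0),
    ("theta", 4.0, 7.0),
    ("alpha", 8.0, 13.0),
    ("beta", 14.0, 40.0),
    ("gamma", 40.0, 100.0),
]

def band_of(freq_hz):
    """Return the name of the first band whose interval contains freq_hz."""
    for name, lo, hi in EEG_BANDS:
        if lo <= freq_hz <= hi:
            return name
    return None

print(band_of(10.0))  # alpha
```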
The EEG has long been investigated as a means of distinguishing emotions, and many researchers have analyzed and extracted EEG features to classify them. Different stimulus modalities are used to evoke emotions, such as auditory, visual, or audio-visual components [5–13]. Murugappan et al. studied the recognition of human emotions (disgust, fear, happy, surprise, and neutral) in EEG with the discrete wavelet transform via the db4 wavelet function. Features were obtained by the conventional method and by energy-based features. Liu et al. proposed a real-time fractal-dimension-based algorithm for quantifying basic emotions using the valence–arousal emotion model. Fear, frustration, sad, happy, pleasant, and satisfied emotions were stimulated by music and sound. Lokannavar et al. introduced an EEG-based emotion recognition system for emotions such as happy, relaxed, sad, and fear, using autoregression (AR) and the fast Fourier transform (FFT). Lin et al. investigated whether there is a link between four emotional states (joy, happy, sadness, and pleasure) and EEG activity during music listening. They found a correlation between emotion and the EEG derived from electrodes near the frontal and parietal lobes. Polat et al. investigated how emotions based on different stories are reflected in the EEG. The study covered fifteen different emotions stimulated by fifteen different stories, and distinctions between emotions were observed between 5 and 8 Hz. Kvaale proposed an artificial neural network model to detect emotions, using a two-dimensional model of emotion. Schmidt et al. investigated whether the pattern of EEG activity distinguished emotions induced by musical excerpts. The excerpts were selected according to valence and intensity, and the authors found that the valence of musical excerpts was reflected in asymmetrical frontal EEG activity; according to their study, frontal EEG activity may separate emotions along valence and intensity dimensions.
The aim of this paper is to investigate LabVIEW-based feature extraction methods that relate the EEG to emotions. The structure of the paper is as follows: Sect. 2 gives a brief description of emotions, the database signals and their handling in LabVIEW, and wavelet analysis. The third section evaluates the results, and the last section presents conclusions and future work.
Emotions are caused by internal and environmental influences. They are complex psychophysiological changes involving many different factors. Defining emotion is difficult, and distinguishing one emotion from another is harder still: because emotions are expressed differently in every culture and language, there is no clear boundary between them. Opinions also differ on how many emotions exist; since the count grows or shrinks across languages, races, religions, and cultures, an exact number is hard to establish. However, some emotions, such as joy, fear, anger, and sadness, are shared by all people regardless of culture and language. Therefore, when analyzing and classifying emotions, such common emotions should be selected or a common model should be used. Today, many researchers prefer the two-dimensional model owing to its lower complexity. The model consists of two fixed perpendicular axes: the horizontal axis shows valence and the vertical axis shows arousal. Emotions are specified by their position in this model. High arousal refers to excitement and low arousal to calmness. Valence is used as a measure of satisfaction, such that a low value represents sadness and a high value represents happiness.
The data required for the study were taken from the DEAP dataset, a dataset for emotion analysis using EEG, physiological, and video signals. This database consists of two parts; in this study, the second part was used. In the second part, 40 videos that can evoke different emotions were watched by 32 volunteers. These videos have been tagged with about 15 different emotions such as happy, fun, sad, sentimental, relaxed, etc. EEG signals were recorded while each participant watched and rated the videos. The valence–arousal–dominance–liking–familiarity model was used as the rating criterion. Valence, a measure of satisfaction, is rated between 1 and 9 (float); a low value represents sadness and a high value represents happiness. Arousal is rated between 1 and 9 (float), ranging from calmness to excitement. Dominance expresses the intensity of emotions and is rated between 1 and 9 (float). Liking is rated between 1 and 9 (float); in this section, users are asked how much they liked the videos. Familiarity is rated between 1 and 5 (integer); here, users are asked how often they watch such videos. EEG signals were recorded according to the 10–20 system. There are 32 files in .bdf format, each with 48 channels recorded at 512 Hz; the data were then down-sampled to 128 Hz. In each file, 32 channels contain EEG records and the remainder are peripheral and status channels. LabVIEW can work with the .bdf file format, and the file format converter (FFC) application can convert it into the TDMS file type, which is compatible with LabVIEW.
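The down-sampling step described above (512 Hz to 128 Hz) can be sketched as follows. This is an illustrative Python example on synthetic data, not the DEAP preprocessing pipeline itself; `scipy.signal.decimate` applies an anti-aliasing low-pass filter before discarding samples:

```python
import numpy as np
from scipy.signal import decimate

fs_raw, fs_target = 512, 128              # sampling rates stated for the DEAP recordings
t = np.arange(0, 2.0, 1.0 / fs_raw)       # 2 s of synthetic data standing in for one channel
x = np.sin(2 * np.pi * 10.0 * t)          # 10 Hz test tone (alpha range)

# decimate() low-pass filters first (anti-aliasing), then keeps every 4th sample
x_ds = decimate(x, fs_raw // fs_target, zero_phase=True)

print(len(x), "->", len(x_ds))            # 1024 -> 256
```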
2.3 Graphics-based programming with LabVIEW
LabVIEW is a graphics-based software platform developed by National Instruments (USA). Its use is growing in engineering and signal processing applications [15–18]. It provides a visual platform for algorithm development, and its design and operations are modeled on physical instruments such as oscilloscopes and multimeters, which is why a LabVIEW program is called a virtual instrument. Because it is easier to use than text-based programs, its user base is growing steadily, and thanks to its graphical representation and Biomedical Toolkit it is especially preferred in biomedical fields.
The developed front panel has five tabs, namely the delta, theta, alpha, beta, and gamma band tabs. Each tab shows the power spectral density (PSD) of the corresponding frequency band, with four PSDs plotted in different colors for the four videos. For instance, the gamma band tab shows the PSD of the gamma band for the four videos: V3 in dark blue, V11 in red, V23 in green, and V38 in light blue, as shown in Fig. 2a. Figure 2b shows part of the graphical code used to obtain the features available on the front panel.
The graphical user interface was created in LabVIEW version 15.0. The FFT, wavelet transform, biomedical toolkit, and advanced signal processing toolkit of the LabVIEW platform were employed to analyze the EEG signals.
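The PSDs displayed on the front panel were computed with LabVIEW's toolkits; as a rough text-based equivalent, a Welch PSD estimate over a surrogate signal can be sketched in Python (the segment length and surrogate data are illustrative assumptions, not the parameters used on the front panel):

```python
import numpy as np
from scipy.signal import welch

fs = 128                                  # Hz, the DEAP rate after down-sampling
rng = np.random.default_rng(0)
eeg = rng.standard_normal(fs * 4)         # 4 s of surrogate EEG

# Welch averaged-periodogram PSD; nperseg sets the frequency resolution (fs/nperseg Hz)
f, psd = welch(eeg, fs=fs, nperseg=256)

# f runs from 0 Hz to the 64 Hz Nyquist frequency in 0.5 Hz steps;
# the gamma-band portion of psd corresponds to f > 40 Hz
```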
The EEG signal carries vital information about the operation of the brain and body, but the raw signal is very difficult to interpret correctly and needs to be processed with appropriate methods [19–22]. Signal processing applications often consist of three stages: preprocessing, feature extraction, and classification.
The recorded EEG signal is contaminated by noise and artifacts. These must be suppressed to recover the required information correctly and to prepare the signal for further processing. Therefore, the data were band-pass filtered between 0.1 and 60 Hz.
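The 0.1–60 Hz band-pass step can be sketched as follows. This Python example assumes a fourth-order Butterworth filter applied with zero-phase filtering; the paper does not specify the filter type or order:

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 128.0                                 # Hz (DEAP rate after down-sampling)
low, high = 0.1, 60.0                      # pass band stated in the text

# Assumed 4th-order Butterworth band-pass; filtfilt runs the filter forward
# and backward so the result has zero phase distortion
b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")

t = np.arange(0, 4, 1 / fs)
raw = np.sin(2 * np.pi * 10.0 * t) + 0.05 * np.sin(2 * np.pi * 63.0 * t)
clean = filtfilt(b, a, raw)                # 10 Hz component passes; 63 Hz is attenuated
```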
2.5 Feature extraction
Feature extraction is the operation of deriving a descriptive set of features, and it is important because it largely determines how efficient the subsequent analysis can be. The main objective is to obtain reliable data for classification and effective analysis of the signals [23–26]. In this study, wavelet-based feature extraction was applied to the EEG records.
2.6 Wavelet analysis
The Fourier transform is the most popular transformation and gives very good results for stationary signals, whose frequency content does not change over time. However, it is not well suited to non-stationary signals, whose statistical characteristics vary with time: the Fourier transform shows which frequency components a signal contains, but not at what times those components occur, because time and frequency information cannot be obtained simultaneously. Therefore, although it is a suitable method for the frequency analysis of time-invariant signals, it is not suitable for signals with transient behavior. A wavelet function, by contrast, is a small oscillating wave, and wavelet analysis uses a window whose size can be changed: a wide window analyzes low frequencies and a narrow window analyzes high frequencies. Thus, near-optimal time–frequency resolution can be achieved over the entire signal, making multi-resolution analysis possible.
Frequencies corresponding to four-level decomposition of an EEG signal sampled at 128 Hz:

| Decomposition component | Frequency range (Hz) | Corresponding EEG frequency band |
|---|---|---|
| D1 | 32–64 | Gamma |
| D2 | 16–32 | Beta |
| D3 | 8–16 | Alpha |
| D4 | 4–8 | Theta |
| A4 | 0–4 | Delta |
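The four-level db5 decomposition and the relative (energy-based) features mentioned earlier can be sketched in Python with the PyWavelets package, an assumed stand-in for the LabVIEW wavelet tools used in the study; the band labels follow the four-level scheme described above:

```python
import numpy as np
import pywt  # PyWavelets, assumed here in place of the LabVIEW wavelet toolkit

fs = 128                                   # Hz
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 10.0 * t)         # surrogate alpha-dominated signal

# Four-level DWT with db5: coefficients come back as [A4, D4, D3, D2, D1],
# corresponding roughly to delta, theta, alpha, beta, gamma at fs = 128 Hz
coeffs = pywt.wavedec(eeg, "db5", level=4)
labels = ["delta (A4)", "theta (D4)", "alpha (D3)", "beta (D2)", "gamma (D1)"]

# Relative wavelet energy: each band's coefficient energy as a fraction of the total
energies = np.array([np.sum(c ** 2) for c in coeffs])
relative = energies / energies.sum()
for name, r in zip(labels, relative):
    print(f"{name}: {r:.3f}")
```

For this surrogate 10 Hz tone, most of the relative energy lands in the D3 (alpha) component, mirroring how the study compares band energies across videos.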
Emotions play an important role in human life. They teach people how to decide and respond, and they have a significant role in strengthening relations between people. Therefore, understanding emotions is intriguing for researchers. Although a number of studies have been conducted on emotion recognition, the majority were implemented in text-based programming languages.
The objective of this study was to investigate the effect of different emotions (happy, joy, melancholy, and disgust) on different frequency bands. Videos labeled happy, joy, melancholy, and disgust were used to stimulate the emotions, and the required data were taken from the DEAP dataset. The emotions were selected based on the valence–arousal model, and participants 2, 8, 12, and 28 were included. The db5 wavelet function was applied to divide the EEG signal into five frequency bands. When the results were analyzed, the impact of the valence value was observed in the gamma band: the emotion effect according to valence score can be captured by multi-resolution, db5 wavelet-based feature extraction on the gamma band. However, EEG-based emotion recognition has some limitations. A major one is that the same emotion does not cause the same effect in every person, so a definitive judgment is not possible. In this study, the general conclusion reached is that emotions with low valence values produce higher amplitudes in the PSD of the gamma band than emotions with high valence values. The study was conducted with a graphical code-based program. Creating a program with the LabVIEW software is quite easy and simple: developing a signal processing algorithm takes much less time than in a text-based program [27–29], and execution time is also shorter in signal processing applications. Moreover, LabVIEW offers many digital signal processing options, such as the biomedical toolkit and the digital signal processing toolkit. The results of this study also show that the graphical way of programming successfully distinguished emotions according to valence score.
Future work will include classification methods for further analysis, to establish a specific link between the EEG and emotional states and to understand the details of the association between different emotional states and various EEG features. In addition, the effects of different wavelet functions will be investigated to draw firmer conclusions.
This research was funded by a grant (No. MF.13.21.) from Firat University.
Open Access: This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
- Garcia-Molina G, Tsoneva T, Nijholt A (2013) Emotional brain–computer interfaces. Int J Auton Adapt Commun Syst 6(1):9–25
- Guzel S, Kaya T, Guler H (2015) LabVIEW-based analysis of EEG signals in determination of sleep stages. In: Signal processing and communications applications conference (SIU), 23rd, IEEE, pp 799–802
- Gur D, Kaya T, Turk M (2014) Analysis of normal and epileptic EEG signals with filtering methods. In: IEEE 22nd signal processing and communications applications conference (SIU 2014), Trabzon
- Gur D, Kaya T, Turk M (2014) The detection of epileptic seizures based on discrete wavelet transform and Fourier transform. INISTA
- Arı İ, Alsaran FO, Akarun L (2011) Vision-based real-time emotion recognition. In: Signal processing and communications applications (SIU), 2011 IEEE 19th conference, pp 1149–1152
- Lin YP, Wang CH, Jung TP, Wu TL, Jeng SK, Duann JR, Chen JH (2010) EEG-based emotion recognition in music listening. IEEE Trans Biomed Eng 57(7):1798–1806
- Murugappan M, Ramachandran N, Sazali Y (2010) Classification of human emotion from EEG using discrete wavelet transform. J Biomed Sci Eng 3(04):390
- Liu Y, Sourina O, Nguyen MK (2011) Real-time EEG-based emotion recognition and its applications. In: Transactions on computational science XII, Springer, Berlin, pp 256–277
- Lokannavar S, Lahane P, Gangurde A, Chidre P (2015) Emotion recognition using EEG signals. Emotion 4(5). doi:10.17148/IJARCCE.2015.4512
- Polat H, Ozerdem MS (2015) Reflection emotions based on different stories onto EEG signal. In: Signal processing and communications applications conference (SIU), 2015 23rd, IEEE, pp 2618–2618
- Kvaale SP (2012) Emotion recognition in EEG: a neuro evolutionary approach. Master thesis, Norwegian University of Science and Technology
- Schmidt LA, Trainor LJ (2001) Frontal brain electrical activity (EEG) distinguishes valence and intensity of musical emotions. Cogn Emot 15(4):487–500
- Scherer KR (2005) What are emotions? And how can they be measured? Soc Sci Inf 44(4):695–729
- Guler H, Turkoglu I, Ata F (2014) Designing intelligent mechanical ventilator and user interface using LabVIEW. Arab J Sci Eng 39(6):4805–4813
- Guler H, Ata F (2013) Design and implementation of training mechanical ventilator set for clinicians and students. Procedia Soc Behav Sci 83:493–496. doi:10.1016/j.sbspro.2013.06.095
- Guler H, Ata F (2014) The comparison of manual and LabVIEW-based fuzzy control on mechanical ventilation. J Eng Med 228(9):916–925
- Dumitrescu C, Costea IM, Banica CK, Potlog S (2015) LabVIEW brain computer interface for EEG analysis during sleep stages. In: Advanced topics in electrical engineering (ATEE), 2015 9th international symposium, IEEE, pp 285–288
- Kaya T, Ince MC (2012) Design of FIR filter using modelled window function with helping of artificial neural networks. J Faculty Eng Archit Gazi Univ 27(3):599–606
- Kaya T, Ince MC (2012) The design of analog active filter with different component value using genetic algorithm. Int J Comput Appl 45(8):43–47. doi:10.5120/6804-9142
- Kaya T, Ince MC (2009) The FIR filter design by using window parameters calculated with GA. In: ICSCCW 2009, fifth international conference on soft computing, computing with words and perceptions in system analysis, decision and control, 2–4 Sept 2009, North Cyprus
- Kaya T, Ince MC (2012) The obtaining of window function having useful spectral parameters by helping of genetic algorithm. In: 2nd world conference on educational technology researches, Near East University, 27–30 June 2012, Nicosia, North Cyprus
- Zhu JY, Zheng WL, Peng Y, Duan RN, Lu BL (2014) EEG-based emotion recognition using discriminative graph regularized extreme learning machine. In: Neural networks (IJCNN), 2014 international joint conference, IEEE, pp 525–532
- Nie D, Wang XW, Shi LC, Lu BL (2011) EEG-based emotion recognition during watching movies. In: Neural engineering (NER), 2011 5th international IEEE/EMBS conference, pp 667–670
- Li M, Lu BL (2009) Emotion classification based on gamma-band EEG. In: Engineering in medicine and biology society, EMBC 2009, annual international conference of the IEEE, pp 1223–1226
- Lee G, Kwon M, Sri SK, Lee M (2014) Emotion recognition based on 3D fuzzy visual and EEG features in movie clips. Neurocomputing 144:560–568
- Guler H, Ata F (2014) Design of a fuzzy-LabVIEW-based mechanical ventilator. Int J Comput Syst Sci Eng 29(3):219–229
- Guler H, Ata F (2009) Estimation of inspiration and expiration time by using fuzzy control with respect to lung's dynamics. In: ICSCCW 2009, fifth international conference on soft computing, computing with words and perceptions in system analysis, decision and control, 1–4 Sept 2009, North Cyprus
- Guler H, Ata F (2014) Development of a fuzzy-based tidal volume algorithm for patients with respiratory distress. J Fac Eng Archit Gazi Univ 29(4):699–706