different emotions should be recognized and used 
(Guler and Ubeyli, 2007). Before using different 
emotions on a BCI, their characteristic expressions 
(activation sites and specific features) have to be 
discovered and compared. In the present paper, the 
classification strategy proposed in (Iacoviello et al., 
2015c) is used on EEG signals collected in the 
DEAP dataset (Koelstra et al., 2012), a database of physiological EEG signals recorded from different subjects during both negative and positive emotions. In particular, the participants
watched music videos and rated each video in terms 
of arousal, valence, like/dislike, dominance, and 
familiarity. As the subjects watched the videos, their 
EEG and physiological signals were recorded. The 
stimuli used in the experiment were selected in 
different steps: first, 120 initial stimuli were 
selected; then, a one-minute highlight part was 
determined for each stimulus; finally, through a 
web-based subjective assessment experiment, 40 
final stimuli were selected. Since DEAP is a freely usable reference database of tagged emotional EEG signals, we selected some of the stored experiments in order to study the brain activations due to both negative and positive emotions and to recognize the most significant ones. In particular, the goals of this paper are: a) to verify that, for a subset of subjects from
the DEAP dataset, the activated brain region for a 
negative emotion (negative valence and high 
arousal) is located in the right brain hemisphere; b) 
to classify positive emotions (high valence and high 
arousal) from the selected subjects; c) to verify the 
separation, in terms of activated channels and 
selected features, between negative and positive 
patterns; d) to propose a method for classifying 
several emotional states to be used in future multi-emotional BCIs. The paper is organized as follows. In
Section II, the DEAP dataset and the experimental 
protocol adopted are described along with the 
considered classification method. In Section III the 
obtained results are proposed and discussed, whereas 
in Section IV the conclusions and future works are 
outlined. 
2 MATERIALS AND METHODS 
The DEAP database consists of the EEG 
physiological signals of 32 participants (16 men and 
16 women, aged between 19 and 37, average: 26.9) 
recorded while watching 40 one-minute long music 
videos on different topics. Before the viewing started, a two-minute baseline EEG signal was collected from each subject while relaxing and watching a fixation cross on the screen. The EEG signals,
sampled at 512 Hz, were recorded from the following 32 positions (according to the international 10-20 positioning system, see Figure 1): Fp1, AF3, F3,
F7, FC5, FC1, C3, T7, CP5, CP1, P3, P7, PO3, O1, 
Oz, Pz, Fp2, AF4, Fz, F4, F8, FC6, FC2, Cz, C4, T8, 
CP6, CP2, P4, P8, PO4, and O2. The selected music videos were shown to induce emotions in different users (Koelstra et al., 2012), represented on the valence-arousal scale (Russell, 1980). The participants had to rate each video in terms of arousal, valence, like/dislike, dominance, and familiarity (the degrees of valence and arousal were rated using the Self-Assessment Manikin questionnaire). The same videos had an on-line
evaluation that could be used for comparison. The 
videos were the same for all the participants but the 
sequence of visualization for each subject was 
random. As a first step, only the valence and arousal dimensions were considered in the present study.
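In DEAP, valence and arousal are self-rated on continuous 1-to-9 scales, so each trial can be placed in the valence-arousal plane by thresholding the two ratings. A minimal sketch in Python (the midpoint threshold of 5 and the quadrant labels are our illustrative assumptions, not part of the dataset):

```python
def va_quadrant(valence, arousal, midpoint=5.0):
    """Map a (valence, arousal) self-assessment pair, each rated on
    DEAP's 1-9 scale, to one of the four valence-arousal quadrants.
    The midpoint threshold and the label scheme are illustrative choices."""
    v = "HV" if valence >= midpoint else "NV"  # high vs. negative (low) valence
    a = "HA" if arousal >= midpoint else "LA"  # high vs. low arousal
    return v + a

# A low-valence, high-arousal rating falls in the NVHA quadrant,
# a high-valence, high-arousal rating in the HVHA quadrant:
print(va_quadrant(2.1, 7.8))  # NVHA
print(va_quadrant(8.3, 8.0))  # HVHA
```

With this convention, the NVHA and HVHA conditions studied below correspond to the low-valence/high-arousal and high-valence/high-arousal quadrants, respectively.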
Data were provided both as acquired (raw data) and in a preprocessed form. In this study, the raw data were used and, before use, they were band-pass filtered between 1 Hz and 46 Hz.
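The 1-46 Hz band-pass step can be sketched as follows; the zero-phase Butterworth design, the filter order, and the use of SciPy are our assumptions, since the filtering details are not specified here:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_eeg(signal, fs=512.0, low=1.0, high=46.0, order=4):
    """Zero-phase band-pass filter for one EEG channel between `low` and
    `high` Hz. fs is DEAP's raw sampling rate (512 Hz); the fourth-order
    Butterworth design is an illustrative choice."""
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal)  # filtfilt applies the filter forward and
                                   # backward, avoiding phase distortion

# Example: a 10 Hz (alpha-band) tone passes, a 100 Hz tone is attenuated.
t = np.arange(0, 4.0, 1.0 / 512.0)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 100 * t)
y = bandpass_eeg(x, fs=512.0)
```

A fourth-order Butterworth is a common default for EEG band selection; any equivalent FIR or IIR design covering the same 1-46 Hz band would serve the same purpose.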
2.1  The Experimental Protocol 
The main goal of this study was to use DEAP to map 
the emotions through the EEG signals from different 
subjects, by considering the results on the 
classification of a strong negative emotion, disgust (Placidi et al., 2015b). To this aim, we started by selecting subjects who experienced the "strongest" and mutually "farthest" pair of emotions, one corresponding to minimum valence and maximum arousal (in the following indicated as NVHA) and the other corresponding to maximum valence and maximum arousal (in the following indicated as HVHA). Among the selected subjects, we further selected those whose
self-assessment of NVHA and HVHA corresponded 
to videos having the same on-line evaluation: this 
was done in order to eliminate careless subjects 
(possible cases of wrong evaluations). From the 
selected subjects, besides the EEG signals corresponding to these two emotional states, we extracted
the EEG signals corresponding to the relaxing phase. 
In fact, after selecting the subjects and the signals of the chosen emotions, we aimed at classifying each of the two emotional states both against the corresponding relaxing signals and against each other. The
one-minute signal corresponding to the emotional state elicited by a music video was broken into non-overlapping trials, each 3.52 seconds long, and separately