SMC2017: Dataset

Dataset [Dropbox Folder]

Experimental paradigm

This dataset consists of EEG data from 12 subjects. The experimental protocol was designed to test the influence of tactile feedback on interaction ErrPs and consists of a simulated human-robot interaction task. In this paradigm, the subject does not directly control the interface: the simulated robot spontaneously moves toward the intended target with sporadic direction errors (correct-to-erroneous movement ratio of 4:1), while the subject is only asked to evaluate the robot's movements, thus (ideally) generating an ErrP whenever the robot moves in the wrong direction.

 

Fig. 1 Experimental protocol: scene at the beginning (a) and end (b) of a session.

 

Fig. 1a and 1b show the experimental scene at the beginning and at the end of a reaching-task session, respectively. Each trial consists of a movement of the cursor (green square) and of the robot arm from the current position (among 21 possible positions) to the next one. Timing is set so that each cursor movement is followed by a pause of random duration between 1.5 and 2.5 s. Trials are repeated until the target (orange square) is reached; the target is then randomly repositioned at either the far right or far left side of the screen and the cursor is placed back at the middle position. At the end of each task a simple animation shows the robot extending and grasping the target.

Two Myo (Thalmic Labs) armbands positioned on the subject's forearms were used to provide tactile stimuli, consisting of 0.5 s of vibration of one of the armbands immediately after each robot movement. The task is performed in three conditions:

  1. no armband activation (visual stimuli - V);
  2. vibration from the armband on the side corresponding to the cursor movement direction (concordant visuo-tactile stimuli - concordant VT);
  3. vibration from the armband on the side of the target position, i.e., an erroneous movement makes the cursor move left (right) and the right (left) armband activate (discordant visuo-tactile stimuli - discordant VT).

Twelve healthy subjects (31.9±4.2 years old, 8 males and 4 females) participated in the study. Each subject performed 750 trials with a 20% error probability (cursor movement in the direction opposite to the target) in both the V and VT conditions. Each subject tested the visual-only feedback condition and one of the two visuo-tactile conditions, for a grand total of 24 recordings. Each condition lasted about half an hour, with pauses every 6 reached targets.

 

Data recording

 

The experimental acquisition setup consisted of 16 active g.LADYbird (g.tec) gel electrodes located at Fz, FC3, FC1, FCz, FC2, FC4, C3, C1, Cz, C2, C4, CP3, CP1, CPz, CP2 and CP4 according to the international 10/20 system, connected to a g.USBamp biosignal amplifier for EEG signal acquisition. Ground and reference electrodes were placed on the forehead (AFz) and on the left earlobe, respectively. Signals were sampled at 256 Hz and hardware band-pass filtered between 0.1 and 30 Hz, while an additional notch filter suppressed line noise at 50 Hz. Experiments were started only after the impedance of all electrodes was stably below 5 kΩ.

 

Data file description

 

All recordings are stored in both CSV and MAT format, in separate files. Each CSV file contains an Nx17 matrix, where N is the number of time samples of the recording. The first 16 columns contain raw electrode data (electrode order: Fz, FC3, FC1, FCz, FC2, FC4, C3, C1, Cz, C2, C4, CP3, CP1, CPz, CP2, CP4), while the last column contains trial labels. During the second following each cursor movement, the label equals 1 for correct movements and 2 for incorrect ones; everywhere else the label is 0. The file name indicates the subject (S01 to S12) and the feedback condition of the session (V, conVT and disVT for visual-only, concordant visuo-tactile and discordant visuo-tactile feedback, respectively).
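
As a quick-start illustration, the MATLAB sketch below loads one CSV file and cuts 1 s epochs after each cursor movement using the label column; it is a minimal sketch, not part of the dataset. The file name S01_V.csv is only a placeholder for an actual file from the Dropbox folder, and readmatrix (available from MATLAB R2019a) can be replaced by dlmread on older releases.

data   = readmatrix('S01_V.csv');                   % N x 17 matrix (16 channels + label column)
eeg    = data(:, 1:16);                             % raw electrode data, Fz ... CP4
labels = data(:, 17);                               % 0 = no event, 1 = correct, 2 = erroneous

fs     = 256;                                       % sampling rate (Hz)
onsets = find(diff([0; labels]) > 0);               % first sample of each labelled 1 s window
onsets = onsets(onsets + fs - 1 <= size(eeg, 1));   % drop windows truncated at the end of the file
types  = labels(onsets);                            % 1 (correct) or 2 (erroneous) per trial

epochs = zeros(fs, 16, numel(onsets));              % 1 s epochs: samples x channels x trials
for k = 1:numel(onsets)
    epochs(:, :, k) = eeg(onsets(k):onsets(k) + fs - 1, :);
end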

Each MAT file contains a single struct, expData. An example struct is shown below:

 

expData = 

                rawData: [432953x16 double]
                expType: 'ConVT'
        labelTimeStamps: [2x750 double]
           channelNames: {1x16 cell}
           samplingRate: 256
                subject: 'S10'
      recordingDateTime: '20170328T152240'

 

Most fields should be self-explanatory, such as rawData or subject. channelNames is a cell array containing the names of the channels in the same order as the columns of rawData (Fz to CP4, as detailed above). recordingDateTime marks the beginning of the recording in the format YYYYMMDDThhmmss (where ‘T’ is a literal ‘T’ character). Finally, labelTimeStamps is a 2x750 matrix: the first row contains the sample index at which each cursor movement occurred, while the second row contains 1 for correct movements and 2 for erroneous ones.
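
As an illustration of how the struct can be used, the following MATLAB sketch loads one MAT file and averages the 1 s window following erroneous movements on channel Cz; it is a minimal sketch, and the file name S10_conVT.mat is only a placeholder for an actual recording.

s       = load('S10_conVT.mat');                    % loads the struct expData
expData = s.expData;

fs      = expData.samplingRate;                     % 256 Hz
onsets  = expData.labelTimeStamps(1, :);            % sample index of each cursor movement
types   = expData.labelTimeStamps(2, :);            % 1 = correct, 2 = erroneous

cz      = find(strcmp(expData.channelNames, 'Cz')); % column index of the Cz channel
errIdx  = find(types == 2);                         % erroneous trials only
win     = zeros(numel(errIdx), fs);
for k = 1:numel(errIdx)
    win(k, :) = expData.rawData(onsets(errIdx(k)):onsets(errIdx(k)) + fs - 1, cz)';
end
avgErrP = mean(win, 1);                             % average 1 s response at Cz after erroneous movements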

Partners

IIT ADVR · Fondazione Roma · Fondazione Sanità e Ricerca