---
license: pddl
tags:
  - eeg
  - medical
  - clinical
  - classification
  - parkinson
  - interval estimation
---

# Singh2021: EEG Parkinson's Classification Dataset with Interval Estimation Task

The Singh2021 dataset contains EEG recordings collected during an interval timing task designed to study cognitive control in individuals with Parkinson's disease (PD). It includes data from 83 PD patients and 37 demographically matched healthy controls. Most PD patients (n = 74) completed the task while ON medication, and a subset (n = 9) completed both ON and OFF dopaminergic medication sessions.

Participants performed a peak-interval timing task with intermixed 3-second and 7-second trials. They were instructed to press a key when they estimated the target interval had elapsed. Visual distractions were included to discourage counting. Each participant completed 80 trials (40 per interval type). Only trials with a minimum of 20 valid keypresses per interval condition were included in analyses.

EEG was recorded using a 64-channel actiCAP system at 500 Hz.

## Paper

Singh, A., Cole, R. C., Espinoza, A. I., Evans, A., Cao, S., Cavanagh, J. F., & Narayanan, N. S. (2021). Timing variability and midfrontal ~4 Hz rhythms correlate with cognition in Parkinson's disease. *npj Parkinson's Disease*, 7(1), 14.

DISCLAIMER: We (DISCO) are NOT the owners or creators of this dataset; we merely uploaded it here to support our own (EEG-Bench) and others' work on EEG benchmarking.

## Dataset Structure

- `data/` contains the raw experimental EEG data.
- `Copy_of_IntervalTiming_Subj_Info_AIE.xlsx` contains information about the participants (except the 9 ON+OFF PD participants).

Note that the recordings of 6 PD participants (1205, 1255, 1345, 1355, 1495, and 1545) and 4 controls (1005, 1045, 1165, and 1355) were excluded due to artifacts and noise.

## Filename Format

A recording session consists of 3 files:

- `[GROUP][SID].vhdr`: the header file with meta information
- `[GROUP][SID].eeg`: the EEG (and accelerometer) data
- `[GROUP][SID].vmrk`: the event information

where `GROUP` is either `Control` or `PD` (Parkinson's Disease) and `SID` is the session ID.
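
For convenience, such a filename can be parsed with a small helper like the sketch below (the helper name and the assumption that all `.vhdr` files sit directly under `data/` are ours, not part of the dataset):

```python
import re
from pathlib import Path

# Matches e.g. "PD1815.vhdr" or "Control1005.vhdr".
FNAME_RE = re.compile(r"^(Control|PD)(\d+)\.vhdr$")

def parse_header_name(path):
    """Return (group, session_id) parsed from a [GROUP][SID].vhdr filename."""
    match = FNAME_RE.match(Path(path).name)
    if match is None:
        raise ValueError(f"Unexpected filename: {path}")
    return match.group(1), match.group(2)

# Example: list all (group, session ID) pairs found in data/.
sessions = [parse_header_name(p) for p in sorted(Path("data").glob("*.vhdr"))]
```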

While the paper reports some aggregate information (e.g., age and MoCA scores) about the 9 ON+OFF PD participants, their recordings can only be inferred to be the 18 sessions missing from `Copy_of_IntervalTiming_Subj_Info_AIE.xlsx`. To identify the 9 pairs of recordings belonging to these 9 ON+OFF PD participants, one can make an educated guess from similarities in the session IDs:

- 1815 (ON) and 2865 (OFF)
- 1835 (ON) and 2835 (OFF)
- 1845 (ON) and 2845 (OFF)
- 1855 (ON) and 2855 (OFF)
- 1865 (ON) and 2865 (OFF)
- 3445 (ON) and 2445 (OFF)
- 3515 (ON) and 2515 (OFF)
- 3565 (ON) and 2565 (OFF)
- 3625 (ON) and 2625 (OFF)

where ON refers to the session ON medication and OFF to the session OFF medication (withdrawal 12h prior to recording).

To distinguish ON and OFF recordings, we assumed that ON sessions were recorded prior to OFF sessions (as stated in the paper: "We first tested [...] (ON sessions), and then [...] (OFF sessions)", p. 14, Methods → Participants) and used the measurement time stored in each recording to determine the earlier session as the ON session.

To further support the hypothesis that these session pairs belong to the same participant, one may notice that the measurement times within each pair are always precisely one day apart (allowing for the 12-hour medication withdrawal window).
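
A minimal sketch of this check, assuming the candidate pairs above and that the PD recordings live at `data/PD<SID>.vhdr` (both assumptions on our part):

```python
import mne

# Candidate ON/OFF session-ID pairs inferred above (educated guesses, not ground truth).
pairs = [
    ("1815", "2865"), ("1835", "2835"), ("1845", "2845"),
    ("1855", "2855"), ("1865", "2865"), ("3445", "2445"),
    ("3515", "2515"), ("3565", "2565"), ("3625", "2625"),
]

for on_sid, off_sid in pairs:
    on = mne.io.read_raw_brainvision(f"data/PD{on_sid}.vhdr", verbose="error")
    off = mne.io.read_raw_brainvision(f"data/PD{off_sid}.vhdr", verbose="error")
    # The earlier measurement date is assumed to be the ON session;
    # the gap should be roughly one day.
    print(on_sid, off_sid, off.info["meas_date"] - on.info["meas_date"])
```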

## Fields in each File

In Python, the 3 files that make up a raw recording can be read via:

```python
import mne
raw = mne.io.read_raw_brainvision("path_to/[GROUP][SID].vhdr")
```

Now, `raw.get_data(units='uV')` yields a NumPy array of shape `(#channels, time_len)` in microvolt units.

Some general info can be inspected with `raw.info`, such as the sampling rate (`raw.info["sfreq"]`) and the measurement time (`raw.info["meas_date"]`).

The channel names (in their correct order) can be seen via `raw.ch_names`.
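
Putting these calls together (the path is a placeholder and the variable names are ours):

```python
import mne

raw = mne.io.read_raw_brainvision("path_to/[GROUP][SID].vhdr")

data = raw.get_data(units="uV")     # NumPy array of shape (#channels, time_len), in microvolts
sfreq = raw.info["sfreq"]           # sampling rate (500 Hz for this dataset)
meas_date = raw.info["meas_date"]   # recording start time
channels = raw.ch_names             # channel names in their correct order

print(data.shape, sfreq, meas_date, channels[:5])
```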

Events can be read with

```python
events_list, events_dict = mne.events_from_annotations(raw)
```

where `events_dict` contains the mapping from the original event types (like `"Stimulus/S 1"`) to the event IDs in `[1, 2, ..., 7, 255]`, the latter of which are used in `events_list`.

`events_list` is a list of events, ordered by time. Each entry `e = [timestamp, (not important), event ID]` consists of the event onset `timestamp`, which indexes the `time_len` dimension of the `raw.get_data()` EEG array, and the event ID.
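
For example, the integer IDs can be mapped back to their original labels to print a readable event log (a small illustration, reusing `raw`, `events_list`, and `events_dict` from above):

```python
# Invert the mapping so each integer ID resolves back to its label, e.g. 1 -> "Stimulus/S 1".
id_to_label = {eid: label for label, eid in events_dict.items()}

sfreq = raw.info["sfreq"]
for timestamp, _, eid in events_list[:10]:
    # The timestamp indexes the time_len dimension; divide by sfreq to get seconds.
    print(f"{timestamp / sfreq:8.3f} s  {id_to_label[eid]}")
```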

(See the [mne.io.Raw documentation](https://mne.tools/stable/generated/mne.io.Raw.html) for more details.)

The different event types likely have the following meaning (inferred by inspecting `events_list` together with the trial description in the paper):

- `Stimulus/S 1`: Instruction to estimate the short (3 s) interval is shown on screen for 1 s.
- `Stimulus/S 2`: Instruction to estimate the long (7 s) interval is shown on screen for 1 s.
- `Stimulus/S 3`: Blue rectangle "GO" cue is shown on screen.
- `Stimulus/S 4`: Participant presses the spacebar key.
- `Stimulus/S 5`: Participant releases the spacebar key.
- `Stimulus/S 6`: Distracting vowel appears on screen.
- `Stimulus/S 7`: Trial feedback is shown on screen.
- `Stimulus/S255`: End of the last trial.
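
As an illustration of how these codes could be used (assuming the inferred mapping above is correct, and reusing `raw`, `events_list`, and `events_dict` from the snippets above), the sketch below measures the time from each "GO" cue (`S 3`) to the first subsequent keypress (`S 4`):

```python
import numpy as np

sfreq = raw.info["sfreq"]

def event_id_for(code):
    # Look up e.g. "Stimulus/S 3" regardless of how the number is padded with spaces.
    return next(eid for label, eid in events_dict.items()
                if label.replace(" ", "") == f"Stimulus/S{code}")

go_id, press_id = event_id_for(3), event_id_for(4)

go_to_press_s = []   # seconds from each GO cue to the first following keypress
last_go = None
for timestamp, _, eid in events_list:
    if eid == go_id:
        last_go = timestamp
    elif eid == press_id and last_go is not None:
        go_to_press_s.append((timestamp - last_go) / sfreq)
        last_go = None

print(np.round(go_to_press_s[:10], 2))
```

Trials cued with `S 1` versus `S 2` can then be separated to compare responses for the 3-second and 7-second intervals.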

## License

The original authors have licensed this work under the PDDL v1.0 license (see `LICENSE.txt`).