
hcph-fmri-tasks's Introduction

Functional MRI tasks of the Human Connectome PHantom (HCPh) study

This repository contains three functional MRI tasks implemented as Psychopy3 experiments:

Quality control task

The task contains a visual trial block, an eye-movement trial block, and a finger-tapping trial block. The task was adapted from «Version A» of the task proposed by Harvey et al.:

 Harvey J, Demetriou L, McGonigle J, Wall MB. 2018. A short, robust brain activation
 control task optimised for pharmacological fMRI studies. PeerJ 6:e5540
 doi:https://doi.org/10.7717/peerj.5540

The original implementation of their task is found at mattwall1103/fMRI-Control-Task.

The most prominent changes to the original task are:

  • Elimination of the auditory block (since we do not plan for the participant to wear headphones)
  • Substitution of the motor task (button pressing) with a simple finger-tapping paradigm
  • Enabling eye-tracking with our SR Research device.

Detailed description of QCT

The QCT consists of four paradigms whose presentation order and realization are randomly selected. The background for all tasks is black (HEX #000000). The units reported here correspond to Psychopy's "normalized units". A hedged code sketch of the four paradigms follows the list below.

  1. Visual grating pattern: Visual trials consist of a centrally-presented sine-wave grating subtending approximately 10° of visual angle, with a spatial frequency of 1.2 cycles/degree. The grating drifts laterally at a rate of 6 cycles per second, and the direction of the drift is reversed every 0.5 s (Harvey et al. 2018). A small green, circular fixation point is also displayed at the center of the screen. The total duration of the stimulus is 3 s.

  2. Finger tapping: Motor trials correspond to a simple finger-tapping task with the left or right hand. The participant is instructed to tap their thumb sequentially against each of the other four fingers of the hand indicated on the screen, reversing direction at the extremes (from index finger to pinkie). The cue word 'LEFT' or 'RIGHT' is presented in random order, in white, displaced 0.5 units from the center of the screen toward the corresponding side, for 5 s.

  3. Gaze movement: Cognitive trials involve a series of fixation points moving across the screen. Each fixation point comprises a small green dot (HEX #00ff00) surrounded by a larger concentric gray circle (radius ratio is ~2.2). Participants are instructed to focus their gaze on the center of the fixation point and follow it with their eyes while avoiding head movements. The fixation point moves to six different locations corresponding to the compass directions North–East, East, South–East, North–West, West, and South–West. These points are mapped on a circle with a radius of approximately 8.75° of visual angle. Each location is maintained for 0.5 s, and all six are presented (in random order) in each three-second trial (Harvey et al. 2018).

  4. Fixation: Blank trials consist of a fixation point presented at the center of the screen for 3 s. The fixation point is constructed as in the gaze-movement task.
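A minimal Psychopy sketch of the four paradigms is given below. It is a hedged reading of the description above, not the repository's actual implementation: the window and monitor settings, refresh rate, fixation-point radii, and the angular placement of the six gaze targets are assumptions.

    # Hedged sketch of the four QCT paradigms; parameters not stated in this
    # README (window/monitor settings, refresh rate, exact radii, gaze-target
    # angles) are assumptions.
    import math
    import random

    from psychopy import core, visual

    win = visual.Window(color="#000000", units="deg", monitor="testMonitor")

    def make_fixation(pos=(0, 0)):
        """Small green dot inside a larger gray circle (radius ratio ~2.2)."""
        outer = visual.Circle(win, radius=0.44, pos=pos, fillColor="gray", lineColor=None)
        inner = visual.Circle(win, radius=0.2, pos=pos, fillColor="#00ff00", lineColor=None)
        return outer, inner

    def visual_trial():
        """3 s drifting grating: ~10 deg, 1.2 cycles/deg, drifting at
        6 cycles/s with the direction reversed every 0.5 s."""
        grating = visual.GratingStim(win, tex="sin", size=10, sf=1.2)
        fixation = make_fixation()
        clock = core.Clock()
        while clock.getTime() < 3.0:
            direction = 1 if int(clock.getTime() / 0.5) % 2 == 0 else -1
            grating.phase += direction * 6.0 / 60.0  # assumes ~60 Hz refresh
            grating.draw()
            for part in fixation:
                part.draw()
            win.flip()

    def motor_trial():
        """5 s 'LEFT'/'RIGHT' cue, 0.5 norm units toward the cued side."""
        side = random.choice(["LEFT", "RIGHT"])
        xpos = -0.5 if side == "LEFT" else 0.5
        cue = visual.TextStim(win, text=side, color="white", pos=(xpos, 0), units="norm")
        clock = core.Clock()
        while clock.getTime() < 5.0:
            cue.draw()
            win.flip()

    def cognitive_trial():
        """Fixation point visiting NE, E, SE, NW, W, SW (0.5 s each, random
        order) on a circle of ~8.75 deg radius; exact angles assumed."""
        angles = [45, 0, -45, 135, 180, -135]
        random.shuffle(angles)
        for angle in angles:
            pos = (8.75 * math.cos(math.radians(angle)),
                   8.75 * math.sin(math.radians(angle)))
            for part in make_fixation(pos):
                part.draw()
            win.flip()
            core.wait(0.5)

    def blank_trial():
        """3 s central fixation, built like the gaze-movement fixation."""
        for part in make_fixation():
            part.draw()
        win.flip()
        core.wait(3.0)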

License

These tasks are released under the terms of the Apache License 2.0, in order to abide by the NiPreps licensing principles. See the NOTICE file for further details.

Contributors

celprov, esavary, mattwall1103, oesteban

hcph-fmri-tasks's Issues

Update information prompt at the beginning (ALL TASKS)

Pre-fill it with the session day and perhaps a trial number that starts at zero (or one) and increments automatically if a previous log file exists for the same session (meaning the task had to be stopped for whatever reason and is being launched again).
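A minimal sketch of what this could look like, assuming logs are written to a local logs/ directory with a *_ses-<date>_run-<n>.log naming scheme (both the directory and the pattern are hypothetical):

    # Hypothetical sketch: pre-fill the info dialog and auto-increment the
    # trial/run number from existing log files; paths and naming are assumed.
    from datetime import date
    from pathlib import Path

    from psychopy import core, gui

    log_dir = Path("logs")  # assumed log location
    session = date.today().isoformat()
    # count previous logs of this session to pick the next run number
    previous = sorted(log_dir.glob(f"*_ses-{session}_run-*.log"))
    run = len(previous) + 1  # starts at 1; use len(previous) to start at 0

    info = {"session": session, "run": run}
    dlg = gui.DlgFromDict(info, title="Experiment info")
    if not dlg.OK:
        core.quit()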

DOC: Add detailed description of the tasks in the README

I think it would be nice to write a detailed textual description of the tasks in the README (especially for our QCT abstract). We want to be as precise as possible, describing the size, color, duration, and number of repetitions of the stimuli.

  • Verify that the parameters of our task are exactly as described in Harvey's paper, in which case we don't have to rewrite a detailed description here. If that's not the case, we should clearly state the differences.
  • Resting-state
  • Breath-holding task

WDYT @esavary @oesteban?

Design and implement a brief ET validation "task"

The idea is to run this after the main 9-point calibration, before the DWI, and then again before the resting state (after ET drift correction).

The task would start and end with a fixation point in the center: the point would move to the four corners of the screen, then to some random positions (e.g., 5 of them), and finally be rendered in the middle again before finishing.
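A minimal sketch of that sequence, assuming normalized units and a 1 s dwell per target (both are assumptions; a real task would also log the position and onset of each target for comparison against the ET trace):

    # Sketch of the proposed ET-validation sequence; positions, dwell time,
    # and window settings are assumptions.
    import random

    from psychopy import core, visual

    win = visual.Window(color="black", units="norm", fullscr=True)
    dot = visual.Circle(win, radius=0.02, fillColor="#00ff00", lineColor=None)

    corners = [(-0.9, 0.9), (0.9, 0.9), (0.9, -0.9), (-0.9, -0.9)]
    extras = [(random.uniform(-0.9, 0.9), random.uniform(-0.9, 0.9)) for _ in range(5)]
    sequence = [(0, 0)] + corners + extras + [(0, 0)]  # start and end centered

    for pos in sequence:
        dot.pos = pos
        dot.draw()
        win.flip()
        core.wait(1.0)  # assumed dwell time per target
    win.close()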

Unknown version 2022.3.0dev6 when running task-rest

I tried installing psychopy (which worked well), then I tried to test it by running the rest experiment, and I get this type of error:

Traceback (most recent call last):
  File "/home/acionca/Documents/code/psychopy/psychopy/app/builder/builder.py", line 1249, in runFile
    self.app.runner.panel.runLocal(event)
  File "/home/acionca/Documents/code/psychopy/psychopy/app/runner/runner.py", line 749, in runLocal
    generateScript(experimentPath=currentFile.replace('.psyexp', '_lastrun.py'),
  File "/home/acionca/Documents/code/psychopy/psychopy/scripts/psyexpCompile.py", line 86, in generateScript
    raise LegacyScriptError(
psychopy.scripts.psyexpCompile.LegacyScriptError: Error: Script compile exited with code 1. Traceback:
Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/home/acionca/Documents/code/psychopy/psychopy/scripts/psyexpCompile.py", line 237, in <module>
    compileScript(args.infile, args.version, args.outfile)
  File "/home/acionca/Documents/code/psychopy/psychopy/scripts/psyexpCompile.py", line 226, in compileScript
    version = _setVersion(version)
  File "/home/acionca/Documents/code/psychopy/psychopy/scripts/psyexpCompile.py", line 125, in _setVersion
    useVersion(version)
  File "/home/acionca/Documents/code/psychopy/psychopy/tools/versionchooser.py", line 205, in useVersion
    raise ValueError(msg.format(requestedVersion))
ValueError: Unknown version `2022.3.0dev6`

Is this something to be expected?
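The traceback comes from PsychoPy's useVersion() rejecting the version pinned in the experiment file's settings. One hedged workaround, assuming the pin lives in a Param named "Use version" inside the .psyexp XML (the file name below is hypothetical), is to clear it so the locally installed PsychoPy is used:

    # Hedged sketch: clear the "Use version" pin in a .psyexp file; the file
    # name is hypothetical and the param name assumes the usual .psyexp schema.
    import xml.etree.ElementTree as ET

    path = "task-rest.psyexp"  # hypothetical file name
    tree = ET.parse(path)
    for param in tree.iter("Param"):
        if param.get("name") == "Use version":
            param.set("val", "")  # empty value = use the running PsychoPy
    tree.write(path, encoding="utf-8", xml_declaration=True)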

Send probe to trigger daemon at the start of the experiment

If the trigger daemon is down, we currently only find out when the first signal is attempted.

We should send a probe (e.g., all channels set to 1 except for the trigger channel, or alternatively a zero -- but the latter could be problematic in that we would not really know what's going on).

The probe could be sent even before the modal dialog collecting the experiment's info is rendered.
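A hypothetical sketch of such a probe, assuming the daemon accepts raw bytes over a local TCP socket (the host, port, and one-byte channel encoding are invented for illustration; only the "all channels high except the trigger bit" idea comes from the issue text):

    # Hypothetical probe; host, port, and byte layout are assumptions.
    import socket

    TRIGGER_BIT = 0x01  # assumed: trigger on channel 0
    PROBE = bytes([0xFF & ~TRIGGER_BIT])  # all channels set to 1 except trigger

    def probe_trigger_daemon(host="127.0.0.1", port=2023, timeout=1.0):
        """Return True if the daemon accepted the probe, False otherwise."""
        try:
            with socket.create_connection((host, port), timeout=timeout) as sock:
                sock.sendall(PROBE)
            return True
        except OSError:
            return False

    # Run before the experiment-info dialog is rendered:
    if not probe_trigger_daemon():
        raise RuntimeError("Trigger daemon is not responding; start it first.")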
