
A Switching Bayesian observer

Steeve Laquitaine & Justin L. Gardner

This repository contains the code for our Neuron 2018 paper, A switching observer for human perceptual estimation. We use perceptual estimation experiments to probe and model whether and how human subjects exploit statistical priors, analysing inference behavior in motion direction and spatial location estimation tasks in which optimal estimation requires knowledge of the statistical distribution of the stimuli.

See also the project website.

Task

To run the task, git clone the project and follow the instructions in .../task/README.txt:

  1. Set your screen parameters (the task runs on screenNumber = 1; you can change the screen number in taskDotDir.m)

  2. Open runTask

Run a 10º prior:

taskDotDir('steeve_exp12_metho_Pstd010_mean225_coh006012024_dir36_t107_073_033perCoh_130217')

or an 80º prior:

taskDotDir('steeve_exp12_metho_Pstd080_mean225_coh006012024_dir36_t106_075_034perCoh_130217')

Note: taskDotDir.m loads the stimulus parameters (each trial's motion direction and coherence, plus a few statistics) from a file named after the prior, e.g., for the 10º prior:

steeve_exp12_metho_Pstd010_mean225_coh006012024_dir36_t107_074_033perCoh_controlRndInitPosSymPrior_131224.mat

There is one file per prior.
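To inspect a stimulus parameter file before running the task, here is a minimal sketch (the variables stored inside are not documented here, so the snippet just enumerates them):

% load the 10º-prior stimulus parameter file and list the stored variables
params = load('steeve_exp12_metho_Pstd010_mean225_coh006012024_dir36_t107_074_033perCoh_controlRndInitPosSymPrior_131224.mat');
disp(fieldnames(params))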

  3. While the task is running, enter a response by moving your mouse to the desired direction, then pressing 1 on the keyboard (make sure mglEditScreenParameters takes "1" as input)

Data extraction

  1. Download the dataset from Mendeley into your project path, e.g., proj_path = "Desktop/project".

Run from the terminal:

curl https://prod-dcd-datasets-cache-zipfiles.s3.eu-west-1.amazonaws.com/nxkvtrj9ps-1.zip --output dataset.zip
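If you prefer to stay in MATLAB, the download can be scripted as below; the extraction target is an assumption, so adjust it until the files land under data/mat/ as shown next:

% download and unpack the Mendeley archive from within MATLAB
url = 'https://prod-dcd-datasets-cache-zipfiles.s3.eu-west-1.amazonaws.com/nxkvtrj9ps-1.zip';
websave('dataset.zip', url);   % large download
unzip('dataset.zip', '.');     % assumed: the zip unpacks to data/mat/...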

After unzipping the archive, the dataset should look like this:

Desktop/project/
- data/mat/
  - data01_direction4priors/ # four-prior motion experiment
      - data/
        - sub01/  # subject 1
          - steeve_exp12_data_sub01_sess06_run26_Pstd040_mean225_coh006_012_024_dir36_randInitPosSymPrior80_140106.mat # task & behavioral data
          - steeve_exp12_data_sub01_sess06_run26_Pstd040_mean225_coh006_012_024_dir36_randInitPosSymPrior80_140106.edf # eye tracking data
          ...
        - sub12/
  - data02_direction1prior/ # one-prior motion experiment
      - data/
        - sub01/
          - steeve_data_sub01_sess01_run01_Pstd080_mean225_coh006_012_024_dir36_170704.mat
          - steeve_data_sub01_sess01_run02_Pstd080_mean225_coh006_012_024_dir36_170704.edf
          ...
        - sub06/
  - data03_orientation/ # four-prior location experiment
      - data/
        - sub01/
          - steeve_exp04_data_sub01_sess01_run01_Pstd080_mean225_con010_0156_1_loc36perCon_151208.mat 
          - steeve_exp04_data_sub01_sess01_run01_Pstd080_mean225_con010_0156_1_loc36perCon_151208.edf
          ...
        - sub09/  
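As a quick sanity check that the archive unpacked correctly, here is a sketch that counts each subject's session files in the four-prior motion experiment (paths assume the layout above):

% count .mat session files per subject in the four-prior motion experiment
subjects = dir('data/mat/data01_direction4priors/data/sub*');
for i = 1:numel(subjects)
    runs = dir(fullfile(subjects(i).folder, subjects(i).name, '*.mat'));
    fprintf('%s: %d runs\n', subjects(i).name, numel(runs));
end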
  2. Install the mgl package for data extraction. Task and behavioral data are saved in the .mat files; you can extract them with getTaskParameters (see below). Clone the mgl software and add it to your MATLAB path. In a terminal, then in MATLAB:
# in a terminal: move to the project path and clone the package
cd Desktop/project
git clone https://github.com/justingardner/mgl.git mgl

% in MATLAB: add the package to your path
addpath(genpath('Desktop/project/mgl'))

% test your installation
help mgl

ans = 
  MGL library functions

  Main screen functions
    mglOpen                   : Opens the screen
    mglFlush                  : Flips front and back buffer
    mglClose                  : Closes the screen
  3. Extract the data from each file (see the mgl task reference in (1)). The extracted data is a 1x2 MATLAB cell array; the task data are contained in the second element:
data = getTaskParameters('data/mat/data01_direction4priors/data/sub01/steeve_exp12_data_sub01_sess06_run26_Pstd040_mean225_coh006_012_024_dir36_randInitPosSymPrior80_140106.mat')
data{2}

ans = 

  struct with fields:

                  nTrials: 202            % trial count
              trialVolume: [1x202 double]
                trialTime: [1x202 double]
             trialTicknum: [1x202 double]
                   trials: [1x202 struct]
                 blockNum: [1x202 double]
            blockTrialNum: [1x202 double]
                 response: [1x202 double]
             reactionTime: [1x202 double]  % reaction time in seconds, relative to the trial segment start
    originalTaskParameter: [1x1 struct]
         originalRandVars: [1x1 struct]
           responseVolume: [1x202 double]
          responseTimeRaw: [1x202 double]
                 randVars: [1x1 struct]
                parameter: [1x1 struct]

data{2}.reactionTime    

ans =

  Columns 1 through 14

    2.1582    2.5499    1.7992    2.6472    1.9805    1.9602    1.1360    1.9403
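For a quick summary of the reaction times, a sketch (assumes missing responses are coded as NaN):

% median reaction time across the run
rt = data{2}.reactionTime;
fprintf('median RT = %.2f s over %d trials\n', median(rt, 'omitnan'), numel(rt));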

Get the subject's estimated directions (Cartesian coordinates):

data{2}.randVars.prodcoor

Get the task's displayed directions (degrees):

data{2}.randVars.myRandomDir

Get the task's displayed motion coherences (proportion of coherently moving dots):

data{2}.randVars.myRandomCoh

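Putting these fields together, here is a minimal analysis sketch (the shape of prodcoor is an assumption; check size(data{2}.randVars.prodcoor) first):

% convert Cartesian estimates to degrees and compute signed circular error
xy    = data{2}.randVars.prodcoor;            % assumed nTrials-by-2 (x, y) coordinates
shown = data{2}.randVars.myRandomDir(:);      % displayed directions, in degrees
est   = mod(atan2d(xy(:,2), xy(:,1)), 360);   % estimates mapped to [0, 360) degrees
err   = mod(est - shown + 180, 360) - 180;    % signed error in (-180, 180] degrees
fprintf('mean absolute error = %.1f deg\n', mean(abs(err), 'omitnan'));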

Please cite:

@article{laquitaine2018switching,
  title={A switching observer for human perceptual estimation},
  author={Laquitaine, Steeve and Gardner, Justin L},
  journal={Neuron},
  volume={97},
  number={2},
  pages={462--474},
  year={2018},
  publisher={Elsevier}
}

References

(1) mgl task reference: https://gru.stanford.edu/doku.php/mgl/taskreference?s[]=reactiontime
