
Join the chat: https://gitter.im/LernerLab/GuPPy

GuPPy

Guided Photometry Analysis in Python, a free and open-source fiber photometry data analysis tool.

Installation Instructions

GuPPy can be run on Windows, Mac or Linux.

Follow the instructions below to install GuPPy:

  • Current users: download new code updates by following steps 1.a to 1.c, then visit the GitHub Wiki page to get started on your analysis
  • New users: follow all the installation steps, then visit the GitHub Wiki page to get started on your analysis
  1. Download the Guppy code
    a. Click the green button labeled “Code” at the top right corner of the repository page; this opens a pull-down menu.

    b. Click on Download ZIP. (Ensure that you save this ZIP locally, not in external cloud storage such as iCloud, OneDrive, Box, etc. We suggest saving it in your User folder on the C drive.)

    c. Once downloaded, open the ZIP file and you should have a folder named “GuPPy-main”. Place this GuPPy-main folder wherever is most convenient (avoiding cloud storage).

    d. Inside the GuPPy-main folder there is a subfolder named “GuPPy”. Take note of the GuPPy subfolder's location (path); it will be needed in later steps of the GuPPy workflow.

    • Mac: Right click folder → Click Get Info → Text next to “Where:”
      ~ Ex: /Users/LernerLab/Desktop/GuPPy-main
    • Windows/Linux: Right click folder → Properties → Text next to “Location:”
  2. Install Anaconda, a distribution of the Python and R programming languages for scientific computing. Download the installer for your operating system (Mac, Windows or Linux) and follow the prompts when you run it.

  3. Once installed, open an Anaconda Prompt window (Windows) or Terminal window (Mac or Linux). You can search for "anaconda prompt" or "terminal" on your computer to open this window.

  4. Navigate to the location of the GuPPy folder (from Step 1d) by executing the following command in the Anaconda Prompt or terminal window:

cd path_to_GuPPy_folder
  • Ex: cd /Users/LernerLab/Desktop/GuPPy-main
  5. Next, execute the following commands, in this specific order, in the Anaconda Prompt or terminal window (a worked example of the full sequence appears after these steps):
    • Note: filename in the first command should be replaced by spec_file_windows10.txt, spec_file_mac.txt or spec_file_linux.txt (based on your OS)
    • Some of these commands will download and install packages; wait until each one finishes before executing the next line
    • If the Anaconda Prompt or Terminal window asks Proceed ([y]/n)?, respond with y
conda create --name guppy --file filename
conda activate guppy
  6. Lastly, execute the following command to open the GuPPy User Interface:
panel serve --show GuPPy/savingInputParameters.ipynb

GuPPy is now officially installed and ready to use!
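As a concrete example, the full sequence from steps 4 to 6 on a Windows machine might look like the following (the path is illustrative; substitute your own from Step 1d):

cd C:\Users\LernerLab\GuPPy-main
conda create --name guppy --file spec_file_windows10.txt
conda activate guppy
panel serve --show GuPPy/savingInputParameters.ipynb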

  • The full instructions, along with detailed descriptions of each step of the GuPPy workflow, are on the GitHub Wiki page.

Uninstalling GuPPy

  1. Open an Anaconda Prompt window (Windows) or Terminal window (Mac or Linux).

  2. Execute the following command in the Anaconda Prompt or terminal window:

conda remove --name guppy --all
  3. To reinstall, follow steps 1 (Download the GuPPy code) and 4 to 6 from the Installation Instructions.

Tutorial Videos

[Tutorial video embeds are not reproduced here; see the repository page.]

Sample Data

  • Sample data are provided for users to explore the tool when getting started. The sample data folder includes several datasets: 1) clean data recorded with a TDT system, 2) TDT data with artifacts (to practice removing them), 3) Neurophotometrics data, and 4) Doric system data. Finally, it has a control channel, signal channel and event timestamps file in 'csv' format, to give an idea of how to structure other data in the 'csv' file format accepted by GuPPy (a minimal sketch of this structure follows below).
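A hedged sketch of that 'csv' structure in Python (the lower-case column names timestamps, data and sampling_rate come from GuPPy's own error messages quoted in the issues below; the file names, rate and values here are placeholders):

import numpy as np
import pandas as pd

fs = 20.0                                   # example sampling rate (Hz)
t = np.arange(0, 60, 1.0 / fs)              # 60 s of timestamps
trace = np.random.randn(t.size)             # placeholder photometry trace

# signal and control channel files carry three columns
pd.DataFrame({"timestamps": t, "data": trace, "sampling_rate": fs}).to_csv(
    "signal_region.csv", index=False)

# an event timestamps file carries a single timestamps column
pd.DataFrame({"timestamps": [5.2, 12.8, 33.1]}).to_csv("event.csv", index=False)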

Discussions

  • GuPPy was initially developed with our own data (FP data recorded using TDT systems) in mind. GuPPy now supports data collected using Neurophotometrics, Doric systems and other data types/formats using 'csv' files as input, but these are less extensively tested because of a lack of sample data. If you have any issues, please get in touch in the chat room or by raising an issue, so that we can continue to improve this tool.

Citation

Venus N. Sherathiya, Michael D. Schaid, Jillian L. Seiler, Gabriela C. Lopez, and Talia N. Lerner. GuPPy, a Python toolbox for the analysis of fiber photometry data. Sci Rep 11, 24212 (2021). https://doi.org/10.1038/s41598-021-03626-9

Contributors

glopez924, talialerner, venus-sherathiya


Issues

Guppy "Read raw data" error - "handles"

Hello:

I'm using the latest GuPPy as of 7/14/22. With one of my three current datasets, when I try to "Read Raw Data" I get the error below and the GUI gets stuck (the green progress bar keeps moving). Windows 10, running on an AMD Ryzen Threadripper 3970x 32-core processor, 64 logical processors. Thank you for any help.

Manny E.

[screenshot of the error]

KeyError: 'No object named df in the file'

So I have this recent issue and have not found an obvious fix for it. I am currently at the stage of attempting to get this program to work using the sample data provided, following the tutorials exactly as shown. I have fixed many user errors, but this error is not obvious. It occurs on the last step, visualization (the full traceback appears in the command prompt log below):
['C:\\Users\\me\\Guppyinfo\\SampleData_Clean\\SampleData_Clean\\Photo_63_207-181030-103332\\Photo_63_207-181030-103332_output_1']
KeyError: 'No object named df in the file'
Context on all of my specifications: Windows 10 latest update, used the windows10.txt file at install, Anaconda 2.0.3, CMD.exe Prompt 0.1.1. The Anaconda and GuPPy applications are located on my A: non-boot drive partition, and the sample data is located on my C: boot partition.

Here is the whole command prompt log from following the tutorial, if that helps:

(base) C:\Users\me>conda activate guppy

(guppy) C:\Users\me>A:

(guppy) A:>cd A:\Desktop\Undergrad Research\Fiber photometry\GuPPy\GuPPy-main\GuPPy-main\GuPPy

(guppy) A:\Desktop\Undergrad Research\Fiber photometry\GuPPy\GuPPy-main\GuPPy-main\GuPPy>panel serve --show savingInputParameters.ipynb
2021-10-27 22:31:50,942 Starting Bokeh server version 2.3.1 (running on Tornado 6.0.4)
2021-10-27 22:31:50,944 User authentication hooks NOT provided (default user enabled)
2021-10-27 22:31:50,947 Bokeh app running at: http://localhost:5006/savingInputParameters
2021-10-27 22:31:50,947 Starting Bokeh server with process id: 10312
Launching server at http://localhost:49655
C:\Users\me\Guppyinfo\SampleData_Clean\SampleData_Clean\inputParameters\inputParameters.json
Input Parameters File Saved.
['C:\Users\me\Guppyinfo\SampleData_Clean\SampleData_Clean\Photo_63_207-181030-103332']
Launching server at http://localhost:5167
[['Dv1A' 'Dv2A' 'Dv3B' 'Dv4B' 'LNRW' 'LNnR' 'PrtN' 'PrtR' 'RNPS']
['Dv1A' 'Dv2A' 'Dv3B' 'Dv4B' 'LNRW' 'LNnR' 'PrtN' 'PrtR' 'RNPS']]
run

Reading raw data...

C:\Users\me\Guppyinfo\SampleData_Clean\SampleData_Clean\Photo_63_207-181030-103332

Reading tsq file....

Data from tsq file fetched....
Reading data for event Dv1A ...
Reading data for event Dv2A ...
Reading data for event Dv3B ...
Reading data for event Dv4B ...
Reading data for event LNRW ...
Reading data for event LNnR ...
Reading data for event PrtN ...
Reading data for event PrtR ...
Data for event LNRW fetched and stored.
Reading data for event RNPS ...
Data for event LNnR fetched and stored.
Data for event PrtN fetched and stored.
Data for event PrtR fetched and stored.
Data for event RNPS fetched and stored.
Data for event Dv1A fetched and stored.
Data for event Dv2A fetched and stored.
Data for event Dv3B fetched and stored.
Data for event Dv4B fetched and stored.
Time taken = 5.82118
Raw data fetched and saved.
Extracting signal data and event timestamps...
Remove Artifacts : False
Combine Data : False
Correcting timestamps by getting rid of the first 1 seconds and convert timestamps to seconds...
Timestamps corrected and converted to seconds.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Computing z-score for each of the data...
z-score for the data computed.
Signal data and event timestamps are extracted.
Computing PSTH, Peak and Area for each event...
Average for group : False
PSTH, Area and Peak are computed for all events.
Finding transients in z-score data and calculating frequency and amplitude....
calculating frequency and amplitude of transients in z-score data....
Frequency and amplitude of transients in z_score data are calculated.
Transients in z-score data found and frequency and amplitude are calculated.
False
['C:\Users\me\Guppyinfo\SampleData_Clean\SampleData_Clean\Photo_63_207-181030-103332\Photo_63_207-181030-103332_output_1']
tornado.application - ERROR - Exception in callback functools.partial(<bound method IOLoop._discard_future_result of <tornado.platform.asyncio.AsyncIOMainLoop object at 0x00000221D08B88B0>>, <Task finished name='Task-832' coro=<_needs_document_lock.<locals>._needs_document_lock_wrapper() done, defined at A:\Anaconda\envs\guppy\lib\site-packages\bokeh\server\session.py:51> exception=KeyError('No object named df in the file')>)
Traceback (most recent call last):
File "A:\Anaconda\envs\guppy\lib\site-packages\tornado\ioloop.py", line 743, in _run_callback
ret = callback()
File "A:\Anaconda\envs\guppy\lib\site-packages\tornado\ioloop.py", line 767, in _discard_future_result
future.result()
File "A:\Anaconda\envs\guppy\lib\site-packages\bokeh\server\session.py", line 71, in _needs_document_lock_wrapper
result = await result
File "A:\Anaconda\envs\guppy\lib\site-packages\tornado\gen.py", line 191, in wrapper
result = func(*args, **kwargs)
File "A:\Anaconda\envs\guppy\lib\site-packages\panel\reactive.py", line 249, in _change_coroutine
self._change_event(doc)
File "A:\Anaconda\envs\guppy\lib\site-packages\panel\reactive.py", line 259, in _change_event
self._process_events(events)
File "A:\Anaconda\envs\guppy\lib\site-packages\panel\reactive.py", line 242, in _process_events
self.param.set_param(**self.process_property_change(events))
File "A:\Anaconda\envs\guppy\lib\site-packages\param\parameterized.py", line 1472, in set_param
self
._batch_call_watchers()
File "A:\Anaconda\envs\guppy\lib\site-packages\param\parameterized.py", line 1611, in batch_call_watchers
self
._execute_watcher(watcher, events)
File "A:\Anaconda\envs\guppy\lib\site-packages\param\parameterized.py", line 1573, in _execute_watcher
watcher.fn(*args, **kwargs)
File "A:\Desktop\Undergrad Research\Fiber photometry\GuPPy\GuPPy-main\GuPPy-main\GuPPy\savingInputParameters.ipynb", line 202, in onclickVisualization
"\n",
File "A:\Desktop\Undergrad Research\Fiber photometry\GuPPy\GuPPy-main\GuPPy-main\GuPPy\visualizePlot.py", line 475, in visualizeResults
createPlots(filepath, storesList[1,:], inputParameters)
File "A:\Desktop\Undergrad Research\Fiber photometry\GuPPy\GuPPy-main\GuPPy-main\GuPPy\visualizePlot.py", line 417, in createPlots
helper_plots(filepath, event, name_arr)
File "A:\Desktop\Undergrad Research\Fiber photometry\GuPPy\GuPPy-main\GuPPy-main\GuPPy\visualizePlot.py", line 70, in helper_plots
frames.append(read_Df(filepath, new_event[i], ''))
File "A:\Desktop\Undergrad Research\Fiber photometry\GuPPy\GuPPy-main\GuPPy-main\GuPPy\visualizePlot.py", line 33, in read_Df
df = pd.read_hdf(op, key='df', mode='r')
File "A:\Anaconda\envs\guppy\lib\site-packages\pandas\io\pytables.py", line 415, in read_hdf
return store.select(
File "A:\Anaconda\envs\guppy\lib\site-packages\pandas\io\pytables.py", line 842, in select
raise KeyError(f"No object named {key} in the file")
KeyError: 'No object named df in the file'

I do have coding experience, but it's pretty basic and does not include file formats like HDF5. If you need screenshots or code sections, you'll need to explain the less obvious parts. Thanks for the help.
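For anyone else hitting this, a small, hedged way to see what pandas actually finds inside one of the output .h5 files (the path here is hypothetical; point it at a file in your *_output_* folder; the traceback above shows GuPPy's visualizePlot.py reads it with key='df'):

import pandas as pd

path = r"C:\Users\me\Guppyinfo\some_output\some_event.h5"   # hypothetical path
with pd.HDFStore(path, mode="r") as store:
    print(store.keys())   # an intact file should list '/df'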

Cannot extract timestamps from .csv inputs

Hello,

I am running into a problem with Step 4: Extract timestamps and its correction. I am using .csv files to read in data, and Steps 1-3 work fine; I can see on the command prompt that they are all being executed correctly. However, at Step 4, when I click the button, GuPPy throws this error:

Extracting signal data and event timestamps...
Remove Artifacts : False
Combine Data : False
Isosbestic Control Channel : True
Correcting timestamps by getting rid of the first 1 seconds and convert timestamps to seconds...
tornado.application - ERROR - Exception in callback functools.partial(<bound method IOLoop._discard_future_result of <tornado.platform.asyncio.AsyncIOMainLoop object at 0x00000190742FB5B0>>, <Task finished name='Task-982' coro=<_needs_document_lock.<locals>._needs_document_lock_wrapper() done, defined at C:\Users\Isabella\.conda\envs\guppy\lib\site-packages\bokeh\server\session.py:51> exception=Exception('Error in naming convention of files or Error in storesList file')>)
Traceback (most recent call last):

Then it goes on to list all my input TTL and signal/control files, and then repeats:

Exception: Error in naming convention of files or Error in storesList file

I have double-checked each of my files; the formatting is as specified in the video:
[screenshot: signal/isosbestic file format]
[screenshot: TTL file format]

The storesList file also looks fine, with the two rows as specified in the "Storenames GUI" tab.

Not sure what I'm doing wrong, any help would be appreciated!
Thank you so much!
-IC
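A hedged way to sanity-check the storesList file from Python (the two-row structure is visible in the logs elsewhere on this page: row one holds the raw store/file names, row two the names you assigned; the file name is an assumption):

import numpy as np

storesList = np.genfromtxt("storesList.csv", dtype=str, delimiter=",")
print(storesList)                 # row 0: raw names, row 1: assigned names
assert storesList.shape[0] == 2   # both rows must be present and aligned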

Flow diagram of GuPPy trace processing

Hello,
I am doing a bit of analysis to make comparisons between different methods for fiber photometry analysis. I have noticed the methods are all very similar, but with slightly different flavors, and there is more than one option in each pipeline.
You have some hints on the wiki, and there's a figure in the paper with the traces, but I was wondering if you had a diagram or something sequential, closer to code with the parameter values, so that I can implement the same functions outside GuPPy to compare pipelines.
If not, I am happy to make one, but I would need a bit of help with the order of function calls 😄!

Error at "extract timestamps and its corrections"

First, I love the preprint and am excited to play around with this. I think it's a great resource.

Preface: using TDT GCaMP data with 405 and 470 channels, and csv files with behavioral timepoints (in seconds). Running the GUI on Ubuntu 18.04. When I get to the "extract timestamps and its corrections" section, I'm getting a type error: cannot use a string pattern on a bytes-like object. Here is the full error:
"Computing z-score for each of the data...
tornado.application - ERROR - Exception in callback functools.partial(<bound method IOLoop._discard_future_result of <tornado.platform.asyncio.AsyncIOMainLoop object at 0x7f90adfec470>>, <Task finished coro=<_needs_document_lock.<locals>._needs_document_lock_wrapper() done, defined at /home/samuel/anaconda3/envs/guppy/lib/python3.6/site-packages/bokeh/server/session.py:51> exception=TypeError('cannot use a string pattern on a bytes-like object',)>)"

Note: this is occurring with the sample data as well.

Storenames GUI

Hello,

The instructions up to this point have been great. This issue is similar to a previous question. I plan to use Neurophotometrics, so I used the Neurophotometrics sample data from the Box folder. I got up to choosing input parameters, but I cannot choose storenames; the dropdown says "no choices to choose from". When I select "open storenames GUI" there is no error in the terminal. What is causing this problem?

Link to Sample Data broken

Dear Lerner lab,

Firstly, thank you so much for putting this resource on GitHub. I believe I have successfully installed GuPPy, but I have hit a hurdle: I don't have access to the "sample data", so I am not sure how I should format my csv. Any assistance at your next availability would be greatly appreciated.

Jupyter and PIP

Can you make a version that runs in a notebook (like Colab) and is installed via pip?

#feature-request

Missing sampling rate with manually made ttl files

Hi, I have been using GuPPy for Neurophotometrics data and it works great when I'm only working with one or two behaviors that can be represented as booleans. However, when I need to work with more than two behaviors and I manually make multiple ttl files (one per behavior) using the single timestamps column, my generated chev and chod files suddenly lose the sampling_rate column and the whole thing stops working. I noticed someone on the gitter page had the exact same issue, so maybe you already know what I'm talking about. My goal is to be able to use GuPPy for Neurophotometrics with multiple different behaviors, so I hope you can help with this.

Group Analysis

Hello,

I am analyzing FP data from NPM. I have 3 events during my recording, generating 3 PSTH datasets. My individual analysis goes well, and I am able to check all the PSTH events for each animal. However, when I try the group analysis, only one of the events appears in the PSTH Visualization GUI (image below). Also, the terminal window presents an error (image below).

[screenshot: PSTH Visualization GUI]

[screenshot: terminal error]

Could you help me?

Thank you,

Augusto

Error when running "Extract timestamps and its correction"

Hi!

I keep getting this error message "Exception: Error in naming convention of files or Error in storesList file" when I run "Step 4 : Extract timestamps and its correction". I have attached a screenshot of what is being shown on my terminal.

Is there anything I might have done wrong in previous steps?

What can I do to correct this?

Thank you!

[screenshot: terminal output]

Can't read raw data

Hey,
I just installed GuPPy and tried to get the hang of it using the sample CSV files.
Unfortunately, even after double-checking every step of the process, I get the error message:

"Traceback (most recent call last):
File "readTevTsq.py", line 409, in
readRawData(json.loads(sys.argv[1]))
File "C:\Users\Massa\Anaconda3\envs\guppy\lib\json_init_.py", line 357, in loads
return _default_decoder.decode(s)
File "C:\Users\Massa\Anaconda3\envs\guppy\lib\json\decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "C:\Users\Massa\Anaconda3\envs\guppy\lib\json\decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)"

I'm new to this, so thanks in advance for your time.

Neurophotometrics file - storenames not recognised

Hello,

I get this alert every time I try to get storenames.

Alert !!
No storenames found. There are not any TDT files or csv files to look for storenames.

I am using files collected directly from Neurophotometrics, and they appear to be formatted the same as the sample data. Any ideas?

FileNotFoundError: REWARD_VS_z_score_VS.h5 does not exist

Hi! I got the error 'REWARD_VS_z_score_VS.h5 does not exist' when I push "open visualization gui". I couldn't find this file in the output filepath; I only got 'REWARD_VS.hdf5'. Please let me know if you need more information. Thanks!
[screenshot of the error]

Analyze data on C drive rather than user folder?

Hi, I'm trying to run GuPPy directly on my C drive on Windows because, unfortunately, I have a space in my username. I have installed it on the C drive, but when I run the GUI, it defaults to searching for files only within my user folder, and fails due to the whitespace. Is there a way to analyze data from the C drive? Thanks!

Issues with running GuPPy

Hi! I am trying to run GuPPy on some of my own data acquired from TDT. I do not have storenames for behaviors, since I hand-scored the timestamps of the different behaviors of interest. Do I need to import these somehow into the storenames GUI?
Thank you for your help!

PSTH Computation Error Message

I keep running into an issue during PSTH computation where the program throws the following error: "ValueError: could not broadcast input array from shape (1956940) into shape (1981)" (see image 1 below).

This results in the required files for PSTH analysis and visualization not being created, so I can't proceed to the visualization step ("beep_VTA_z_score_VTA.h5 does not exist") (see image 2 below). What is causing this error? The rest of the steps seem to work just fine.

Image 1
[screenshot: PSTH computation error]

Image 2
[screenshot: visualization error]

can't open file 'readTevTsq.py': [Errno 2] No such file or directory

Hello, I am a PhD student teaching myself to use fiber photometry on a TDT setup. My question may seem stupid, but when I run the "read raw data" step, the following message is displayed in my terminal: "can't open file 'readTevTsq.py': [Errno 2] No such file or directory". The Python file is, however, in the "guppy-main" folder. Can someone help me solve this? Thank you

Input Parameters GUI Neurophotometrics

Hi LernerLab,
Thank you very much for this amazing tool. I am getting started with GuPPy and some questions arose, which are perhaps too basic. In the lab we are using Neurophotometrics to record GCaMP photometry in operant boxes, where we also measure licks in the shape of TTLs. As data, we obtain four channels corresponding to one mouse each instead of regions (as the sample file shows). Each channel has an isosbestic (flag 17) and a GCaMP signal (flag 18), and this is all included in a single .csv file. However, the TTL signal is given as a binary set of data (0 for no lick, 1 for lick) inside another .csv file. After 'playing' with GuPPy for days, I still have some open questions:

  1. Can I process the data with all mice together in the 4 channels or should I split them into one file per mouse?
  2. Is it possible to include the TTLs for each mouse in said file even if the nature of the signal is different?
  3. Since the timestamp is common for the three parameters, is there any way the software can detect them all and plot them in the same graph to observe the Calcium signal given at a certain TTL? The purpose would be to see the neural encoding of the licks.

Thank you in advance for your time and attention!
Best regards,
cguerreromar

Clarification on input parameter - 'Number of channels (Neurophotometrics only)'

I am asking this here because the wiki page does not clarify it. Can you please explain what exactly the input parameter 'Number of channels (Neurophotometrics only)' is?
I understand that it should be the number of channels used while recording, when the data file has no column named "Flags" or "LedState". I am wondering what to enter if my Neurophotometrics data file has a "LedState" column. Does it override my entry and decide the number of channels based on the "LedState" column?
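While waiting for an authoritative answer, a hedged way to see how many interleaved channels a Neurophotometrics file actually contains (the file name is hypothetical; assumes the file has the "LedState" column discussed above):

import pandas as pd

df = pd.read_csv("npm_recording.csv")       # hypothetical NPM export
print(sorted(df["LedState"].unique()))      # distinct states = interleaved channels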

GuPPy v1.2 doesn't run, .dll not found

I updated to v1.2 by extracting the .zip. I also deleted my old anaconda environment and rebuilt it like so:

conda create --name guppy --file spec_file_windows10.txt
conda activate guppy
conda install -c anaconda ipykernel
python -m ipykernel install --user --name=guppy

Now when I try to start the server I get this ".dll not found" error:

(guppy) D:\Packages\zOLD VERSIONS\GuPPy-main OLD\GuPPy-main\GuPPy>panel serve --show savingInputParameters.ipynb
2022-11-15 15:12:06,166 Starting Bokeh server version 2.4.3 (running on Tornado 6.1)
2022-11-15 15:12:06,166 User authentication hooks NOT provided (default user enabled)
2022-11-15 15:12:06,166 Bokeh app running at: http://localhost:5006/savingInputParameters
2022-11-15 15:12:06,166 Starting Bokeh server with process id: 6804
2022-11-15 15:12:06,435 Error running application handler <bokeh.application.handlers.notebook.NotebookHandler object at 0x000001C0C8DB2C70>: Could not find module 'C:\Users\Windows\anaconda3\envs\guppy\lib\site-packages\scipy\.libs\libbanded5x.4LIW6FJ2MYAF374XJSSB2KHHIEBRW45R.gfortran-win_amd64.dll' (or one of its dependencies). Try using the full path with constructor syntax.
File '__init__.py', line 373, in __init__:
self._handle = _dlopen(self._name, mode) Traceback (most recent call last):
  File "C:\Users\Windows\anaconda3\envs\guppy\lib\site-packages\bokeh\application\handlers\code_runner.py", line 231, in run
    exec(self._code, module.__dict__)
  File "D:\Packages\zOLD VERSIONS\GuPPy-main OLD\GuPPy-main\GuPPy\savingInputParameters.ipynb", line 15, in <module>
    "import numpy as np\n",
  File "D:\Packages\zOLD VERSIONS\GuPPy-main OLD\GuPPy-main\GuPPy\preprocess.py", line 11, in <module>
    from scipy import signal as ss
  File "C:\Users\Windows\anaconda3\envs\guppy\lib\site-packages\scipy\__init__.py", line 144, in <module>
    from . import _distributor_init
  File "C:\Users\Windows\anaconda3\envs\guppy\lib\site-packages\scipy\_distributor_init.py", line 59, in <module>
    WinDLL(os.path.abspath(filename))
  File "C:\Users\Windows\anaconda3\envs\guppy\lib\ctypes\__init__.py", line 373, in __init__
    self._handle = _dlopen(self._name, mode)
FileNotFoundError: Could not find module 'C:\Users\Windows\anaconda3\envs\guppy\lib\site-packages\scipy\.libs\libbanded5x.4LIW6FJ2MYAF374XJSSB2KHHIEBRW45R.gfortran-win_amd64.dll' (or one of its dependencies). Try using the full path with constructor syntax.

2022-11-15 15:12:06,611 WebSocket connection opened
2022-11-15 15:12:06,612 ServerConnection created

No Storenames extracted for Neurophotometrics file

I have Neurophotometrics data (saved in .csv files: one for the timestamps, one containing the timeframe and the interleaved signal and isosbestic) that I am trying to analyse via GuPPy, but when I open the Storenames GUI after selecting these files, no storenames appear, as if the file has not been read. I've checked, and the layout of the .csv files is the same as the example data provided.

Moreover, if I drag a folder containing my .csv files over to 'selected files' (instead of the individual files), I get the following error and no Storenames GUI opens up.

tornado.application - ERROR - Exception in callback functools.partial(<bound method IOLoop._discard_future_result of <tornado.platform.asyncio.AsyncIOMainLoop object at 0x7fce4dadacf8>>, <Task finished coro=<_needs_document_lock.<locals>._needs_document_lock_wrapper() done, defined at /Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/bokeh/server/session.py:51> exception=ValueError("could not convert string to float: 'H'",)>)
Traceback (most recent call last):
File "pandas/_libs/parsers.pyx", line 1152, in pandas._libs.parsers.TextReader._convert_tokens
TypeError: Cannot cast array from dtype('O') to dtype('float64') according to the rule 'safe'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/tornado/ioloop.py", line 743, in _run_callback
ret = callback()
File "/Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/tornado/ioloop.py", line 767, in _discard_future_result
future.result()
File "/Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/bokeh/server/session.py", line 71, in _needs_document_lock_wrapper
result = await result
File "/Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/tornado/gen.py", line 191, in wrapper
result = func(*args, **kwargs)
File "/Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/panel/reactive.py", line 249, in _change_coroutine
self._change_event(doc)
File "/Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/panel/reactive.py", line 259, in _change_event
self._process_events(events)
File "/Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/panel/reactive.py", line 242, in _process_events
self.param.set_param(**self.process_property_change(events))
File "/Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/param/parameterized.py", line 1472, in set_param
self
._batch_call_watchers()
File "/Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/param/parameterized.py", line 1611, in batch_call_watchers
self
._execute_watcher(watcher, events)
File "/Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/param/parameterized.py", line 1573, in _execute_watcher
watcher.fn(*args, **kwargs)
File "/Users/evaguerrero/GuPPy-main/GuPPy/savingInputParameters.ipynb", line 243, in onclickStoresList
"\n",
File "/Users/evaguerrero/GuPPy-main/GuPPy/saveStoresList.py", line 665, in execute
event_name, flag = import_np_csv(filepath, isosbestic_control, num_ch)
File "/Users/evaguerrero/GuPPy-main/GuPPy/saveStoresList.py", line 484, in import_np_csv
df = pd.read_csv(path[i], dtype=float)
File "/Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/pandas/io/parsers.py", line 676, in parser_f
return _read(filepath_or_buffer, kwds)
File "/Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/pandas/io/parsers.py", line 454, in _read
data = parser.read(nrows)
File "/Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/pandas/io/parsers.py", line 1133, in read
ret = self._engine.read(nrows)
File "/Users/evaguerrero/opt/anaconda3/envs/guppy/lib/python3.6/site-packages/pandas/io/parsers.py", line 2037, in read
data = self._reader.read(nrows)
File "pandas/_libs/parsers.pyx", line 860, in pandas._libs.parsers.TextReader.read
File "pandas/_libs/parsers.pyx", line 875, in pandas._libs.parsers.TextReader._read_low_memory
File "pandas/_libs/parsers.pyx", line 952, in pandas._libs.parsers.TextReader._read_rows
File "pandas/_libs/parsers.pyx", line 1084, in pandas._libs.parsers.TextReader._convert_column_data
File "pandas/_libs/parsers.pyx", line 1158, in pandas._libs.parsers.TextReader._convert_tokens
ValueError: could not convert string to float: 'H'

can't open .doric file

Hi,

I downloaded GuPPy 1.2.0, which is said to support the .doric file format.
When I went to "open Storenames GUI", there was an error (see the picture):
[screenshot: Storenames GUI error]
Also, see my .doric file in the "Input Parameters GUI":
[screenshot: Input Parameters GUI]
Would you happen to have any idea about this?

Cannot see plots after 'Extract timestamps and apply corrections'

Hi,
I am trying to analyze Neurophotometrics data using the instructions on the wiki page. It says that GuPPy displays the signal after selecting 'Extract timestamps and apply corrections', but I do not see any plots after this step. I even tried it with the Neurophotometrics data in your 'Sample Data' folder, and I still do not see the plots pop up. I would appreciate your help in figuring this out.

Here is the output in the command prompt -
Raw data fetched and saved.
Extracting signal data and event timestamps...
Remove Artifacts : True
Combine Data : False
Isosbestic Control Channel : True
Correcting timestamps by getting rid of the first 5 seconds and convert timestamps to seconds...
Timestamps corrected and converted to seconds.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Removing Artifacts from the data and correcting timestamps...
Computing z-score for each of the data...
z-score for the data computed.
Processing timestamps to get rid of artifacts...
Timestamps processed and artifacts are removed.
Artifacts from the data are removed and timestamps are corrected.
Signal data and event timestamps are extracted.

Error "Column names should be timestamps, data and sampling_rate (all lower-cases)"

Hello,

We are currently trying to use GuPPy to analyze our Doric data. We have two signals (480 nm and 405 nm) with four DIOs. I extracted the data from Doric Neuroscience in .csv format.

We obtained two files for the signal, which we named:

  • RLA8_signal_DA.csv
  • RLA8_control_DA.csv
    In these files, the first line originally reads: Time, Value (see picture 1).
    [picture 1]
    We changed Time to timestamps and Value to data, and we added the sampling_rate (see picture 2). We saved as type CSV (comma delimited).
    [picture 2]

I obtained four files for the 4 DIOs, which I named:

  • RLA8_DIO01_IN.csv
  • RLA8_DIO02_IN.csv
  • RLA8_DIO03_IN.csv
  • RLA8_DIO04_IN.csv
    In these files, the first line reads: DIOXX, Time (see picture 3).
    [picture 3]

We changed Time to timestamps. As we are only interested in the beginning of the behavior (we keep the timestamps where the DIO switches from 0 to 1), we removed all the useless timestamps using a Matlab script. We also removed the DIOXX column to keep only the timestamps column (see picture 4).
[picture 4]

Then we put the 6 files in a folder (see picture 5) and pointed GuPPy to this folder.
[picture 5]

When we launched step 2 (Open Storenames GUI and save storenames), we obtained this error:

['/Users/raphaelgoutaudier/Desktop/RLA8-PC_TRAINING']
tornado.application - ERROR - Exception in callback functools.partial(<bound method IOLoop._discard_future_result of <tornado.platform.asyncio.AsyncIOMainLoop object at 0x7fd0ab859cf8>>, <Task finished coro=<_needs_document_lock.<locals>._needs_document_lock_wrapper() done, defined at /Users/raphaelgoutaudier/anaconda3/envs/guppy/lib/python3.6/site-packages/bokeh/server/session.py:51> exception=Exception('\x1b[1mColumn names should be timestamps, data and sampling_rate (all lower-cases)\x1b[0m',)>)
Traceback (most recent call last):
File "/Users/raphaelgoutaudier/anaconda3/envs/guppy/lib/python3.6/site-packages/tornado/ioloop.py", line 743, in _run_callback
ret = callback()
File "/Users/raphaelgoutaudier/anaconda3/envs/guppy/lib/python3.6/site-packages/tornado/ioloop.py", line 767, in _discard_future_result
future.result()
File "/Users/raphaelgoutaudier/anaconda3/envs/guppy/lib/python3.6/site-packages/bokeh/server/session.py", line 71, in _needs_document_lock_wrapper
result = await result
File "/Users/raphaelgoutaudier/anaconda3/envs/guppy/lib/python3.6/site-packages/tornado/gen.py", line 191, in wrapper
result = func(*args, **kwargs)
File "/Users/raphaelgoutaudier/anaconda3/envs/guppy/lib/python3.6/site-packages/panel/reactive.py", line 249, in _change_coroutine
self._change_event(doc)
File "/Users/raphaelgoutaudier/anaconda3/envs/guppy/lib/python3.6/site-packages/panel/reactive.py", line 259, in _change_event
self._process_events(events)
File "/Users/raphaelgoutaudier/anaconda3/envs/guppy/lib/python3.6/site-packages/panel/reactive.py", line 242, in _process_events
self.param.set_param(**self.process_property_change(events))
File "/Users/raphaelgoutaudier/anaconda3/envs/guppy/lib/python3.6/site-packages/param/parameterized.py", line 1472, in set_param
self
._batch_call_watchers()
File "/Users/raphaelgoutaudier/anaconda3/envs/guppy/lib/python3.6/site-packages/param/parameterized.py", line 1611, in batch_call_watchers
self
._execute_watcher(watcher, events)
File "/Users/raphaelgoutaudier/anaconda3/envs/guppy/lib/python3.6/site-packages/param/parameterized.py", line 1573, in _execute_watcher
watcher.fn(*args, **kwargs)
File "/Users/raphaelgoutaudier/Documents/GuPPy-main/GuPPy/savingInputParameters.ipynb", line 321, in onclickStoresList
"def onclickStoresList(event=None):\n",
File "/Users/raphaelgoutaudier/Documents/GuPPy-main/GuPPy/saveStoresList.py", line 763, in execute
event_name, flag = import_np_doric_csv(filepath, isosbestic_control, num_ch)
File "/Users/raphaelgoutaudier/Documents/GuPPy-main/GuPPy/saveStoresList.py", line 595, in import_np_doric_csv
raise Exception("\033[1m"+"Column names should be timestamps, data and sampling_rate (all lower-cases)"+"\033[0m")
Exception: Column names should be timestamps, data and sampling_rate (all lower-cases)

We tried several things. If we remove the two signal files, we can get to step 2, but no storenames appear. If we leave the DIO files with the two columns (DIO and timestamps), we cannot get to step 2.

Also, we noticed that another user had a similar problem and that you advised them to use text files saved as .csv, but for us it did not work, as our text files are saved as .txt.

Do you have any idea how to solve our problem?
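A hedged sketch of the renaming step described above, done in pandas instead of by hand (the column names come from the error message; the 12 kHz value is a placeholder; use your system's actual sampling rate):

import pandas as pd

df = pd.read_csv("RLA8_signal_DA.csv")       # columns 'Time' and 'Value', per above
df = df.rename(columns={"Time": "timestamps", "Value": "data"})
df["sampling_rate"] = 12000.0                # placeholder sampling rate
df.to_csv("RLA8_signal_DA.csv", index=False)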

Cannot open CSV file

Hi,

I converted my Doric files using the format conventions you mentioned in the YouTube video, but the Storenames GUI still doesn't recognize them
(Alert !!
No storenames found. There are not any TDT files or CSV files to look for storenames.)

In the video, you also opened a folder, which is impossible for me, as the storenames window only opens if I select CSV files.

Control Fit Negative

When the recording is short and the signal is very pronounced, the control fit can turn out negative.
[attached figure: Figure_4_cfc]

IndexError: list index out of range

Hi,

I get this error when I try to run a group analysis through GuPPy. Any ideas?

data_csv
event_csv
Launching server at http://localhost:5056
['20230623_00602_06557_signal', '20230623_00602_06557_control', '20230623_00602_06557_timestamp']
[['20230623_00602_06557_signal' '20230623_00602_06557_control'
'20230623_00602_06557_timestamp']
['signal_06557' 'control_06557' 'Injection1']]
Input Parameters File Saved.
Computing PSTH, Peak and Area for each event...
Average for group : True
Averaging group of data...
Traceback (most recent call last):
File "computePsth.py", line 682, in
psthForEachStorename(json.loads(sys.argv[1]))
File "computePsth.py", line 636, in psthForEachStorename
averageForGroup(storesListPath, storesList[1,k], inputParameters)
File "computePsth.py", line 489, in averageForGroup
new_path[idx].append(path[i])
IndexError: list index out of range
Finding transients in z-score data and calculating frequency and amplitude....
Combining results for frequency and amplitude of transients in z-score data...
Traceback (most recent call last):
File "findTransientsFreqAndAmp.py", line 326, in
executeFindFreqAndAmp(json.loads(sys.argv[1]))
File "findTransientsFreqAndAmp.py", line 294, in executeFindFreqAndAmp
averageForGroup(storesListPath, inputParameters)
File "findTransientsFreqAndAmp.py", line 245, in averageForGroup
new_path[idx].append(path[i])
IndexError: list index out of range

Please help super beginner!

[screenshot of the error]

I am a super beginner at this, and one step takes me a whole day!
I can see this ERROR, and I have tried changing the names (underscores, numbers, shorter names...).

But I still see the same problem.
Could you let me know how to fix this?

Vizualization GUI error

Hello,

When I run the Step 6 visualization GUI, I get the following error in my command prompt:
FileNotFoundError: File C:\Users\Isabella\use_guppy\use_guppy_output_1\WN_GCaMP_z_score_GCaMP.h5 does not exist

where WN is my event file and GCaMP is my signal file. I've looked through the other responses about this issue and there didn't seem to be a clear solution -- #41 suggested making the timestamps in my signal and control files the same, so I did, but that didn't fix my problem.

I'm using a .csv input, and the previous steps all seem to be working fine (graphs pop up and I can change the peak detection etc).

Thank you for your help!
-Isabella

Wrong Timestamps during Visualization-step

Hi,

I have recently started using GuPPy and ran into a problem during the visualization step.

It appears that the timestamps used during this step are consistently 1 second less than what is entered in the TTL file (see image).

Why is this?

[screenshot]

Error with computing PSTH

Hi, I'm struggling with an error when computing the PSTH (I use Windows 10 Pro).
Reading the raw data and extracting timestamps seem to succeed.
However, when I click the PSTH computation button, the Anaconda prompt shows the error below.


Reading raw data...

C:\Users\MyPC\Desktop\실험실\2.실험데이터\StartReact\SAF-MDM2\Reaching\220207\m14\220207_m14s1

Reading tsq file....

No tsq file found.
Trying to read data for TTL1 from csv file.
Reading data for TTL1 from csv file is completed.
Trying to read data for m14s1 from csv file.
Reading data for m14s1 from csv file is completed.
Time taken = 0.94547
Raw data fetched and saved.
Extracting signal data and event timestamps...
Remove Artifacts : False
Combine Data : False
Isosbestic Control Channel : False
Correcting timestamps by getting rid of the first 1 seconds and convert timestamps to seconds...
Timestamps corrected and converted to seconds.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Creating control channel from signal channel using curve-fitting
Control channel from signal channel created using curve-fitting
Computing z-score for each of the data...
Remove Artifacts : False
z-score for the data computed.
Signal data and event timestamps are extracted.
Computing PSTH, Peak and Area for each event...
Average for group : False
Computing PSTH for event Turntable...
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\multiprocessing\pool.py", line 125, in worker
result = (True, func(*args, **kwds))
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\multiprocessing\pool.py", line 51, in starmapstar
return list(itertools.starmap(args[0], args[1]))
File "C:\Users\MyPC\Desktop\실험실\2.실험데이터\GuPPy-main\GuPPy\computePsth.py", line 328, in storenamePsth
create_Df(filepath, event+'_'+name_1+'_baselineUncorrected', basename, psth_baselineUncorrected, columns=cols) # extra
File "C:\Users\MyPC\Desktop\실험실\2.실험데이터\GuPPy-main\GuPPy\computePsth.py", line 122, in create_Df
df.to_hdf(op, key='df', mode='w')
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\site-packages\pandas\core\generic.py", line 2434, in to_hdf
pytables.to_hdf(
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\site-packages\pandas\io\pytables.py", line 267, in to_hdf
with HDFStore(
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\site-packages\pandas\io\pytables.py", line 553, in init
self.open(mode=mode, **kwargs)
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\site-packages\pandas\io\pytables.py", line 697, in open
self._handle = tables.open_file(self._path, self._mode, **kwargs)
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\site-packages\tables\file.py", line 315, in open_file
return File(filename, mode, title, root_uep, filters, **kwargs)
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\site-packages\tables\file.py", line 778, in init
self._g_new(filename, mode, **params)
File "tables/hdf5extension.pyx", line 492, in tables.hdf5extension.File._g_new
tables.exceptions.HDF5ExtError: HDF5 error back trace

File "C:\ci\hdf5_1545244154871\work\src\H5F.c", line 444, in H5Fcreate
unable to create file
File "C:\ci\hdf5_1545244154871\work\src\H5Fint.c", line 1364, in H5F__create
unable to open file
File "C:\ci\hdf5_1545244154871\work\src\H5Fint.c", line 1557, in H5F_open
unable to open file: time = Thu Feb 10 12:50:55 2022
, name = 'C:\Users\MyPC\Desktop\실험실\2.실험데이터\StartReact\SAF-MDM2\Reaching\220207\m14\220207_m14s1\220207_m14s1_output_1\Turntable_MD_baselineUncorrected_z_score_MD.h5', tent_flags = 13
File "C:\ci\hdf5_1545244154871\work\src\H5FD.c", line 734, in H5FD_open
open failed
File "C:\ci\hdf5_1545244154871\work\src\H5FDsec2.c", line 346, in H5FD_sec2_open
unable to open file: name = 'C:\Users\MyPC\Desktop\실험실\2.실험데이터\StartReact\SAF-MDM2\Reaching\220207\m14\220207_m14s1\220207_m14s1_output_1\Turntable_MD_baselineUncorrected_z_score_MD.h5', errno = -1, error message = 'Unknown error', flags = 13, o_flags = 302

End of HDF5 error back trace

Unable to open/create file 'C:\Users\MyPC\Desktop\실험실\2.실험데이터\StartReact\SAF-MDM2\Reaching\220207\m14\220207_m14s1\220207_m14s1_output_1\Turntable_MD_baselineUncorrected_z_score_MD.h5'
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "computePsth.py", line 629, in
psthForEachStorename(sys.argv[1:][0])
File "computePsth.py", line 616, in psthForEachStorename
p.starmap(storenamePsth, zip(repeat(filepath), storesList[1,:], repeat(inputParameters)))
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\multiprocessing\pool.py", line 372, in starmap
return self._map_async(func, iterable, starmapstar, chunksize).get()
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\multiprocessing\pool.py", line 771, in get
raise self._value
tables.exceptions.HDF5ExtError: HDF5 error back trace

File "C:\ci\hdf5_1545244154871\work\src\H5F.c", line 444, in H5Fcreate
unable to create file
File "C:\ci\hdf5_1545244154871\work\src\H5Fint.c", line 1364, in H5F__create
unable to open file
File "C:\ci\hdf5_1545244154871\work\src\H5Fint.c", line 1557, in H5F_open
unable to open file: time = Thu Feb 10 12:50:55 2022
, name = 'C:\Users\MyPC\Desktop\실험실\2.실험데이터\StartReact\SAF-MDM2\Reaching\220207\m14\220207_m14s1\220207_m14s1_output_1\Turntable_MD_baselineUncorrected_z_score_MD.h5', tent_flags = 13
File "C:\ci\hdf5_1545244154871\work\src\H5FD.c", line 734, in H5FD_open
open failed
File "C:\ci\hdf5_1545244154871\work\src\H5FDsec2.c", line 346, in H5FD_sec2_open
unable to open file: name = 'C:\Users\MyPC\Desktop\실험실\2.실험데이터\StartReact\SAF-MDM2\Reaching\220207\m14\220207_m14s1\220207_m14s1_output_1\Turntable_MD_baselineUncorrected_z_score_MD.h5', errno = -1, error message = 'Unknown error', flags = 13, o_flags = 302

End of HDF5 error back trace

Unable to open/create file 'C:\Users\MyPC\Desktop\실험실\2.실험데이터\StartReact\SAF-MDM2\Reaching\220207\m14\220207_m14s1\220207_m14s1_output_1\Turntable_MD_baselineUncorrected_z_score_MD.h5'
Finding transients in z-score data and calculating frequency and amplitude....
calculating frequency and amplitude of transients in z-score data....
Creating chunks for multiprocessing...
Chunks are created for multiprocessing.
Traceback (most recent call last):
File "findTransientsFreqAndAmp.py", line 314, in
executeFindFreqAndAmp(sys.argv[1:][0])
File "findTransientsFreqAndAmp.py", line 307, in executeFindFreqAndAmp
findFreqAndAmp(filepath, inputParameters, window=moving_window)
File "findTransientsFreqAndAmp.py", line 184, in findFreqAndAmp
create_Df(filepath, arr, basename, index=fileName ,columns=['freq (events/min)', 'amplitude'])
File "findTransientsFreqAndAmp.py", line 127, in create_Df
df.to_hdf(op, key='df', mode='w')
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\site-packages\pandas\core\generic.py", line 2434, in to_hdf
pytables.to_hdf(
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\site-packages\pandas\io\pytables.py", line 267, in to_hdf
with HDFStore(
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\site-packages\pandas\io\pytables.py", line 553, in init
self.open(mode=mode, **kwargs)
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\site-packages\pandas\io\pytables.py", line 697, in open
self._handle = tables.open_file(self._path, self._mode, **kwargs)
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\site-packages\tables\file.py", line 315, in open_file
return File(filename, mode, title, root_uep, filters, **kwargs)
File "C:\Users\MyPC\anaconda3\envs\guppy\lib\site-packages\tables\file.py", line 778, in init
self._g_new(filename, mode, **params)
File "tables/hdf5extension.pyx", line 492, in tables.hdf5extension.File._g_new
tables.exceptions.HDF5ExtError: HDF5 error back trace

File "C:\ci\hdf5_1545244154871\work\src\H5F.c", line 444, in H5Fcreate
unable to create file
File "C:\ci\hdf5_1545244154871\work\src\H5Fint.c", line 1364, in H5F__create
unable to open file
File "C:\ci\hdf5_1545244154871\work\src\H5Fint.c", line 1557, in H5F_open
unable to open file: time = Thu Feb 10 12:50:59 2022
, name = 'C:\Users\MyPC\Desktop\실험실\2.실험데이터\StartReact\SAF-MDM2\Reaching\220207\m14\220207_m14s1\220207_m14s1_output_1\freqAndAmp_z_score_MD.h5', tent_flags = 13
File "C:\ci\hdf5_1545244154871\work\src\H5FD.c", line 734, in H5FD_open
open failed
File "C:\ci\hdf5_1545244154871\work\src\H5FDsec2.c", line 346, in H5FD_sec2_open
unable to open file: name = 'C:\Users\MyPC\Desktop\실험실\2.실험데이터\StartReact\SAF-MDM2\Reaching\220207\m14\220207_m14s1\220207_m14s1_output_1\freqAndAmp_z_score_MD.h5', errno = -1, error message = 'Unknown error', flags = 13, o_flags = 302

End of HDF5 error back trace

Unable to open/create file 'C:\Users\MyPC\Desktop\실험실\2.실험데이터\StartReact\SAF-MDM2\Reaching\220207\m14\220207_m14s1\220207_m14s1_output_1\freqAndAmp_z_score_MD.h5'


Any support would be appreciated.

Thank you,
June

peak analysis for peri-event

Hi! GuPPy is a really great tool, but I was wondering if you were planning on implementing peak amplitude/frequency analysis for events? It could be interesting for us, but unfortunately it is not currently offered by GuPPy.
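Until something like this lands in GuPPy, a hedged sketch of peri-event peak detection with scipy (not GuPPy behavior; the trace, sampling rate and 2 SD threshold are all placeholder assumptions):

import numpy as np
from scipy.signal import find_peaks

fs = 20.0                                    # assumed sampling rate (Hz)
z = np.random.randn(int(10 * fs))            # placeholder 10 s peri-event z-score
peaks, props = find_peaks(z, height=2.0)     # peaks above 2 SD
amplitudes = props["peak_heights"]           # amplitude of each detected peak
freq_per_min = len(peaks) / (z.size / fs) * 60.0
print(len(peaks), freq_per_min)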

Storenames not populating

Hi! I'm a grad student new to fiber photometry, and I'm interested in using GuPPy to analyze my csv files. I have some fake sample data from pyPhotometry that I'm testing, but I'm having some issues with the storenames GUI: nothing comes up, and there is an alert saying no TDT or csv files are found. This is my first time using GuPPy, so is there a certain step that I'm missing? In the input parameters I have moved over my sample data csv file and followed the guidelines for reading generic csv files, but I'm a little confused about the storenames. Thanks so much for your help!

Error with step4 "Extract timestamps and its correction" after "Remove Artifacts"

Hi, I'm struggling with removing artifacts now.
Everything else works well without removing artifacts.
However, when I start removing artifacts, I get an error.
I followed your instructions: first I executed steps 1-3; after selecting the chunks to analyze, I went back to input parameters and clicked 'save to files...' after changing 'removeArtifacts' to True.
Then an error showed up when I clicked "Extract timestamps and its correction".

[screenshot of the error]

Any help would be appreciated.

Best,
Junesu

Misalignment of PSTH when using csv input

I am running into a problem where the timestamps of my behavioral events are being distorted/misaligned during the photometry processing by GuPPy. I can't figure out what is causing the misalignment, but it looks like there is a gradual, across-trials accumulation of a longer and longer lag between the actual behavioral event and the time 0 assigned in the PSTHs. Note that I am using an Inper FP system, with a 20.4992 Hz sampling rate (per wavelength). To demonstrate what I am seeing, I've attached a screenshot of the heatmap I can generate directly within the Inper software, and a heatmap generated within GuPPy. Please note that the order of the trials is reversed from top to bottom in the GuPPy visualization compared to the Inper one. I am happy to provide my csv data and the input parameters .json file if they would be helpful for troubleshooting purposes.

[screenshot: Inper heatmap]

[screenshot: GuPPy heatmap]
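One possible mechanism for a lag that grows across a session (an assumption for illustration, not a diagnosis of GuPPy's code): timestamps reconstructed with a slightly wrong sampling rate drift linearly with recording time. A worked example in Python:

fs_true = 20.4992      # Inper per-wavelength rate quoted above
fs_assumed = 20.0      # hypothetical rounded rate used in reconstruction
duration = 600         # a 10-minute recording (s)

n = int(duration * fs_true)                 # samples actually recorded
lag = n / fs_assumed - n / fs_true          # apparent drift by session end (s)
print(f"{lag:.1f} s of lag after {duration} s")   # ~15 s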

Cannot find any file in the storenames

Hi,
I am trying to upload some csv files acquired with a Doric system. I ran a test using a csv file with 3 columns named as shown in one of your YouTube videos (timestamps, data, sampling_rate). However, no storenames appear in the Storenames GUI.
Do you know why? Could it be a matter of how the folder is organized?
Thanks!

Could not parse explicit URL

Hello Lerner Lab

I was recommended GuPPy by another lab that is using fiber photometry, but I have had some issues with the installation. I've tried installing GuPPy on two different Windows devices and got the same error on both attempts when trying to create the guppy env. Upon executing the initial command "conda create --name guppy --file spec_file_windows10.txt" in the C:\Users\MyUsername\Desktop\GuPPy-main folder where the files are held, I get this parsing error:

ParseError: Could not parse explicit URL: https://repo.anaconda.com/pkgs/main/win-64/blas-1.0-mkl.conda

It seems to be hung up on the first URL in the spec_file_windows10.txt file. I'm not sure whether this is an issue with how I have Anaconda set up, or something that would be consistent across both of my computers. If you happen to know what causes this issue, I would greatly appreciate the feedback.

Nick
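
Two hedged first checks for this kind of ParseError, assuming the spec file itself downloaded intact (re-downloading the whole ZIP rather than saving the raw spec file alone can rule out mangled line endings): make sure conda is current, and confirm the URL is actually reachable from your machine:

conda update -n base conda
curl -I https://repo.anaconda.com/pkgs/main/win-64/blas-1.0-mkl.conda

If the curl check fails, a firewall or proxy blocking repo.anaconda.com is a likely culprit.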

time correction file does not exist + bokeh error

Hello,
Thank you for this great tool! I have been trying to implement it with my data, but I am getting an error message that I am not able to interpret. Everything works fine up to the "Extract timestamps and apply corrections" step, where I get the error message that you can see below in the attached files.

[Screenshots of the bokeh error messages attached]

When I look at the output folder, the time correction file is called "timeCorrection_CeA.hdf5" (CeA being the region name that I used in the storenames) and not "timeCorrection_channel.hdf5", but this is the same name I get when I do the exact same analysis with the sample data, which works perfectly well until the end. So I am a bit at a loss as to what could have gone wrong.

I am also attaching my csv files, I tried to make them as similar to the sample data as possible but I might have missed something.

[Attached csv files: signal channel, control channel, TTL]

Thank you in advance for your help!

Best Regards,

Celia

Storenames GUI with CSVs

Hi, thanks for taking the time to make this tool for photometry analysis. I'm using a Doric system, so all of my files are saved as CSVs in the format specified. I was following along with the YouTube video you made for working with CSVs, and I keep getting stuck on the "Storenames GUI" step. I have 12 files in my folder with unique names, but when I click "Open Storenames GUI", none of my file names show up in the selection box; instead it repeats "chev1, chev2, chev3, chod1, chod2, chod3" 12 times, and there are now an additional 6 files named "chev1, chev2, chev3, chod1, chod2, chod3" in the folder I'm trying to analyze. Any idea why this is happening?
[Screenshots of the Storenames GUI and the resulting folder contents attached]

'TKAgg' backend error on headless server

When starting the GuPPy User Interface from a server running in 'headless' mode, there is an issue with the 'TKAgg' matplotlib backend.
The full error stack looks like this:

panel serve --show GuPPy/savingInputParameters.ipynb

2023-09-07 16:44:46,478 Starting Bokeh server version 2.3.2 (running on Tornado 6.1)
2023-09-07 16:44:46,479 User authentication hooks NOT provided (default user enabled)
2023-09-07 16:44:46,482 Bokeh app running at: http://localhost:5006/savingInputParameters
2023-09-07 16:44:46,482 Starting Bokeh server with process id: 96012
2023-09-07 16:44:47,365 Error running application handler <bokeh.application.handlers.notebook.NotebookHandler object at 0x7ff0e76917b8>: Cannot load backend 'TKAgg' which requires the 'tk' interactive framework, as 'headless' is currently running
File "pyplot.py", line 287, in switch_backend:
newbackend, required_framework, current_framework)) Traceback (most recent call last):
  File "/home/wanglab/mambaforge/envs/guppy/lib/python3.6/site-packages/bokeh/application/handlers/code_runner.py", line 197, in run
    exec(self._code, module.__dict__)
  File "/home/wanglab/code/FP/GuPPy/GuPPy/savingInputParameters.ipynb", line 16, in <module>
    "import numpy as np\n",
  File "/home/wanglab/code/FP/GuPPy/GuPPy/preprocess.py", line 16, in <module>
    plt.switch_backend('TKAgg')
  File "/home/wanglab/mambaforge/envs/guppy/lib/python3.6/site-packages/matplotlib/pyplot.py", line 287, in switch_backend
    newbackend, required_framework, current_framework))
ImportError: Cannot load backend 'TKAgg' which requires the 'tk' interactive framework, as 'headless' is currently running

This issue can be resolved by using a virtual display server, for example:

sudo apt-get install xvfb
xvfb-run -a panel serve --show GuPPy/savingInputParameters.ipynb

Then the server runs fine (if using VS Code, it takes care of the required port forwarding).

2023-09-07 16:49:06,688 Starting Bokeh server version 2.3.2 (running on Tornado 6.1)
2023-09-07 16:49:06,689 User authentication hooks NOT provided (default user enabled)
2023-09-07 16:49:06,693 Bokeh app running at: http://localhost:5006/savingInputParameters
2023-09-07 16:49:06,693 Starting Bokeh server with process id: 97214
Launching server at http://localhost:44699

There may be other ways to solve this; this one works at least for Linux servers and workstations.
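
An alternative, untested sketch: patch the plt.switch_backend('TKAgg') call in GuPPy/preprocess.py (line 16 in the traceback above) to fall back to matplotlib's non-interactive Agg backend when no display is present. Note that Agg cannot open the interactive windows GuPPy relies on (e.g., for artifact removal), so the xvfb route above is likely the safer fix:

import os
import matplotlib.pyplot as plt

# fall back to a headless-safe backend when no X display is available
if os.environ.get('DISPLAY'):
    plt.switch_backend('TKAgg')   # interactive backend, as in the original code
else:
    plt.switch_backend('Agg')     # non-interactive; figures render off-screen only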

Stuck on Timestamps

Using the sample Clean TDT data, I get the following sequence in the terminal window, followed by an error that hangs the program:

Trying to read tsq file....

Data from tsq file fetched....
Reading data for event Dv1A ...
Reading data for event Dv3B ...
Data for event Dv1A fetched and stored.
Data for event Dv3B fetched and stored.
Reading data for event Dv2A ...
Reading data for event Dv4B ...
Data for event Dv4B fetched and stored.
Reading data for event LNRW ...
Data for event LNRW fetched and stored.
Reading data for event LNnR ...
Data for event LNnR fetched and stored.
Reading data for event PrtN ...
Data for event PrtN fetched and stored.
Reading data for event PrtR ...
Data for event PrtR fetched and stored.
Reading data for event RNPS ...
Data for event Dv2A fetched and stored.
Data for event RNPS fetched and stored.
Time taken = 8.60388
Raw data fetched and saved.
Extracting signal data and event timestamps...
Remove Artifacts : False
Combine Data : False
Isosbestic Control Channel : True
Correcting timestamps by getting rid of the first 1 seconds and convert timestamps to seconds...
Timestamps corrected and converted to seconds.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Applying correction of timestamps to the data and event timestamps...
Timestamps corrections applied to the data and event timestamps.
Computing z-score for each of the data...
Remove Artifacts : False
Remove Artifacts : False
z-score for the data computed.
invalid command name "140348261455104stayOnTop"
while executing
"140348261455104stayOnTop"
("after" script)

Error: Only chunked datasets can be resized

Hello,

I am working through GuPPy using the sample CSV data files and have run into "TypeError: Only chunked datasets can be resized" when I try "Extract timestamps and its correction".

Sample data files are stored in a folder under my user name on the C: drive.

I've tried multiple input parameters, including the defaults, but always get this error.

Here is a link to a screenshot of the Anaconda Prompt showing the error:
https://user-images.githubusercontent.com/57728387/152023014-54e71753-c03c-495a-9239-b5de3de412f8.png
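
For context on the error itself: in h5py, only datasets created with chunked storage (and an unlimited maxshape) can be resized, while contiguous datasets raise exactly this TypeError. A minimal illustration of the underlying behavior, not GuPPy's actual code:

import h5py
import numpy as np

with h5py.File('example.h5', 'w') as f:
    fixed = f.create_dataset('fixed', data=np.zeros(10))   # contiguous layout
    grow = f.create_dataset('grow', shape=(10,), maxshape=(None,), chunks=True)
    grow.resize((20,))     # works: chunked, unlimited first dimension
    # fixed.resize((20,))  # would raise TypeError: Only chunked datasets can be resized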

No storenames found in Neurophotometrics file

Hi,
thank you so much for this amazing tool. I'm just getting started and wanted to try my Neurophotometrics data to compare GuPPy with my Matlab pipeline... but I'm getting stuck at the very beginning! I am attaching screenshots of my NPM .csv file (I'm recording both green and red channels in both hemispheres, so 4 channels total) and the ALERT note I get when trying to assign names to storenames. Any help will be greatly appreciated!
[Screenshots of the NPM csv file and the ALERT message attached]

Visualization GUI Error

Hello, I'm currently getting the attached error when attempting to open the Visualization GUI. I have downloaded the new code, followed the naming conventions, and used the file format from the sample data.

[Screenshot of the error attached]

Stuck on read raw data

SOLVED (I had made a mistake in the Open Storenames GUI; I now understand that the saving option at the bottom is only needed if you want to make changes to the storenames.) - Hi, as many have commented before, this is a fantastic tool and such a treat for someone with little (no) coding experience. Thank you! But for the same reason, I cannot troubleshoot (yet). I do not get any data displayed when hitting "Extract timestamps and its correction", and I realised that after "Read raw data" there is no output file in my source folder. Can you please advise how I can fix that?
