aiephys / acq4
This project forked from acq4/acq4
Electrophysiology data acquisition and analysis platform
Home Page: http://www.acq4.org
License: MIT License
The primary use is for tracking the focus depth for every frame in a stack. If the transform changes rapidly, then there will be some lag between each frame and its recorded transform (but we can live with this).
Complications:
Suggestions:
Add a translation field: metaarray._info[0]['translation'] = array((N,3)). When we record image stacks, we currently append a new frame time to _info[0]['values'] whenever a new frame comes in. We can extend this mechanism to optionally append values to the translation field as well. Perhaps it would look something like ma.write(filename, appendAxis='Time', appendKeys=['translation']) (this would be in acq4/util/imaging/record_thread.py : writeFrames()).

The primary purpose is to extract 3D cell locations from manipulator position data.
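As a rough illustration of the per-frame appending idea proposed for writeFrames(), here is a plain-Python sketch. The class and method names are hypothetical, and the real MetaArray API may differ; this only shows the intended shape of the accumulated metadata.

```python
import numpy as np

class FrameMetadataRecorder:
    """Hypothetical sketch: accumulate per-frame time and translation
    values for an image stack, mirroring metaarray._info[0]."""

    def __init__(self):
        # one entry per frame along the Time axis
        self._times = []
        self._translations = []

    def append_frame(self, timestamp, translation):
        """Record the frame time and the (x, y, z) translation of its
        transform; some lag between frame and transform is acceptable."""
        self._times.append(timestamp)
        self._translations.append(tuple(translation))

    def finalize(self):
        """Return axis info with translation shaped (N, 3)."""
        return {
            'name': 'Time',
            'values': np.array(self._times),
            'translation': np.array(self._translations),  # shape (N, 3)
        }

rec = FrameMetadataRecorder()
rec.append_frame(0.0, (0.0, 0.0, 10e-6))
rec.append_frame(0.033, (0.0, 0.0, 12e-6))
info = rec.finalize()
```

The key point is that 'translation' grows in lockstep with 'values', so every frame keeps its own focus depth.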
Manipulators that are locked in the multipatch module are currently excluded from any automatic movements, but it is still possible to move these pipettes manually with the rotary controllers. Ideally, manual movement should be blocked as well, to prevent accidental pipette motion.
Behavior should be:
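A minimal sketch of the intended lock behavior (class and method names here are illustrative, not acq4's actual device API):

```python
class Pipette:
    """Illustrative sketch: a locked pipette ignores both automated
    and manual (rotary controller) move requests."""

    def __init__(self):
        self.locked = False
        self.position = 0.0

    def manual_move(self, delta):
        # Currently only automated moves check the lock; the proposal
        # is that rotary input is also ignored while locked.
        if self.locked:
            return
        self.position += delta

p = Pipette()
p.manual_move(1.0)   # unlocked: moves
p.locked = True
p.manual_move(1.0)   # locked: ignored
```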
The main purpose is to be able to reload the complete state of a MosaicEditor session, including the list/order of files loaded, transparency, filters, etc.
Scientifica's new motion cards have a very low minimum speed, so we should be able to automate pipette resealing.
This feature was developed for 2p imaging devices but never tested with cameras. Should be fixable with minor effort.
Generate a single NWB file per experiment and send to LIMS
Should be part of a module for streamlining multipatch connectivity experiments (#13)
Camera.getScopeState() currently returns the scopeState dictionary stored by the camera. This should probably return a copy of the dict instead, but we should do a quick check to be sure performance is not affected.
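A minimal sketch of the proposed change (the internal attribute name used here is an assumption; only the getScopeState method name comes from the issue above):

```python
import copy

class Camera:
    """Sketch: return a copy of the cached scope-state dict so callers
    cannot mutate the camera's internal state."""

    def __init__(self):
        self._scopeState = {'objective': '40x', 'position': [0.0, 0.0, 0.0]}

    def getScopeState(self):
        # deepcopy because the dict holds mutable values (lists); if
        # profiling shows this is too slow, a shallow dict() copy may
        # suffice when nested values are treated as read-only.
        return copy.deepcopy(self._scopeState)

cam = Camera()
state = cam.getScopeState()
state['position'][0] = 99.0   # mutating the returned copy...
```

...leaves the camera's own state untouched, which is the point of the change.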
I always forget to actually hit the record button in the multipatch module to record pipette positions during an experiment. Could this be automatically activated when a "New Site" is created in the Data Manager, assuming that this means one is starting a new experiment?
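One possible wiring, sketched with hypothetical callback names (the real Data Manager and multipatch module APIs may differ):

```python
class MultipatchModule:
    """Sketch: position recording that can be started programmatically."""

    def __init__(self):
        self.recording = False

    def start_recording(self):
        self.recording = True

class DataManager:
    """Sketch: notify subscribers whenever a new site is created."""

    def __init__(self):
        self._callbacks = []

    def on_new_site(self, cb):
        self._callbacks.append(cb)

    def create_site(self, name):
        for cb in self._callbacks:
            cb(name)

mp = MultipatchModule()
dm = DataManager()
# auto-start recording whenever a "New Site" is created
dm.on_new_site(lambda name: mp.start_recording())
dm.create_site('site_000')
```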
Currently, making changes to the coordinate system of a stage (such as when we click the "zero" button) causes the calibration of pipettes to become invalid.
Probably the correct solution is to compensate for the sudden CS change by introducing the opposite change into the stage's transform, but then do we need to automatically update the config file as well?
Or maybe we just need to add a button that lets the user write the transform back out to config (this would be really nice when calibrating stage/scope/camera anyway).
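The compensation idea can be sketched as follows: when the stage's coordinate system jumps (e.g. the user hits "zero"), fold the opposite offset into the stage's transform so that mapped global positions are unchanged. All names here are illustrative, not acq4's actual API.

```python
import numpy as np

class Stage:
    """Sketch: keep global positions stable across a coordinate-system
    change by adjusting the stage transform's translation."""

    def __init__(self):
        self.offset = np.zeros(3)                    # translation of stage transform
        self.stage_pos = np.array([1.0, 2.0, 3.0])   # raw hardware coordinates

    def global_pos(self):
        return self.stage_pos + self.offset

    def handle_cs_change(self, new_stage_pos):
        """Hardware reports a sudden coordinate jump; compensate."""
        delta = new_stage_pos - self.stage_pos
        self.stage_pos = new_stage_pos
        self.offset -= delta   # opposite change keeps global_pos stable

s = Stage()
before = s.global_pos().copy()
s.handle_cs_change(np.zeros(3))   # user zeroed the stage
```

After the zero, global_pos() is unchanged, so pipette calibrations relative to global coordinates remain valid; whether the adjusted offset should then be written back to the config file is the open question above.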
I think right now we are only recording pipette positions.
Each value needs to be recorded once when the log file is created, and subsequently whenever the value changes.
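The record-once-then-on-change rule can be sketched like this (illustrative only; not the multipatch module's actual logging code):

```python
import time

class ChangeLogger:
    """Sketch: write each monitored value once at creation, then again
    only when the value actually changes."""

    def __init__(self, sink):
        self.sink = sink      # e.g. a list standing in for the log file
        self._last = {}

    def record(self, key, value):
        if key not in self._last or self._last[key] != value:
            self._last[key] = value
            self.sink.append({'time': time.time(), key: value})

log = []
logger = ChangeLogger(log)
logger.record('surface_depth', 50e-6)   # initial value: logged
logger.record('surface_depth', 50e-6)   # unchanged: skipped
logger.record('surface_depth', 55e-6)   # changed: logged again
```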
The surface depth is stored on the Microscope device and can be monitored with Microscope.sigSurfaceDepthChanged. Use man.listInterfaces('microscope') to list the names of all configured microscopes. Pipette targets are stored on PatchPipette devices and can be monitored with PatchPipette.sigTargetChanged.

The primary purposes of this module will be to:
Implement on/off control of LEDs with buttons on the XKeys pad for easier accessibility when switching between light sources
Goal: when images are opened in MosaicEditor, they should look exactly as they looked onscreen when they were saved.
This will require saving userLevels and userLookupTable info keys with each image. These may be modified by MosaicEditor like userTransform.

Also add a backgroundImage info key that references a stored background frame, and a 'backgroundRemovalMode' key that can be either 'divide' or 'subtract'.

The main purpose of this module is to facilitate patching cells and monitoring cell health when using multiple patch electrodes.
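The two proposed backgroundRemovalMode behaviors described above could look like this (plain NumPy sketch; the function name is illustrative):

```python
import numpy as np

def remove_background(image, background, mode):
    """Sketch of the proposed 'backgroundRemovalMode' options:
    'subtract' removes additive offsets, 'divide' corrects for
    multiplicative illumination non-uniformity."""
    if mode == 'subtract':
        return image - background
    elif mode == 'divide':
        # clip to avoid division by zero in dark background pixels
        return image / np.clip(background, 1e-12, None)
    raise ValueError(f"unknown backgroundRemovalMode: {mode!r}")

img = np.array([[10.0, 20.0]])
bg = np.array([[2.0, 4.0]])
sub = remove_background(img, bg, 'subtract')
div = remove_background(img, bg, 'divide')
```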
Current development is in https://github.com/campagnola/acq4/tree/multipatch
Every image should have metadata on the illumination / filtering / imaging devices used to generate the image (when available).
This feature should be implemented in the Microscope class.
Should build on acq4#45
Right now we have two config options, tipHeight and tipSearchHeight, that are confusing and have weird interactions. Let's replace these with something more straightforward like:
minSearchHeight - minimum distance above the slice surface when bringing pipettes in for search
searchFocusOffset - Z-distance from pipette to focal plane in search mode (positive values give the pipette extra room and prevent crashing into the objective; negative values make the pipette easier to find)
For backward compatibility, we can catch the old options and raise a super-helpful error message that provides the new parameters and values to use.
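A sketch of that backward-compatibility check (note: the old-to-new key mapping shown here is an assumption for illustration; the actual correspondence between the old and new options would need to be decided):

```python
def check_pipette_config(config):
    """Sketch: reject the deprecated config keys with an error message
    that names the replacement keys. The mapping below is illustrative,
    not a confirmed equivalence between old and new options."""
    renamed = {
        'tipHeight': 'searchFocusOffset',
        'tipSearchHeight': 'minSearchHeight',
    }
    old = [k for k in renamed if k in config]
    if old:
        hints = ', '.join(f"'{k}' -> '{renamed[k]}'" for k in old)
        raise ValueError(
            f"Config options {old} are no longer supported; "
            f"replace them as follows: {hints}"
        )
    return config

# old-style config triggers a helpful error naming the new keys
try:
    check_pipette_config({'tipHeight': 100e-6})
except ValueError as exc:
    message = str(exc)
```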
Would like to see several features added to MosaicEditor image handling:
Currently, lines in the multipatch log file give the name of the manipulator to which they belong, but there is no way to determine anything else about the manipulator, such as:
It might be the case that some of this information should be logged elsewhere. It would be nice if we could at least graphically reconstruct the pipette orientations when reading the log file.
An easy fix for orientation might be to just include the complete transform for each pipette as part of its initial state. We could also have each pipette provide a describe method that returns more custom information (like MIES channel associations).
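The suggested describe method might look like this sketch (field names and constructor arguments are illustrative; only the describe name and the idea of including the transform and MIES channel come from the notes above):

```python
import numpy as np

class PatchPipette:
    """Sketch: a pipette that can describe itself for the log's
    initial-state record."""

    def __init__(self, name, transform, mies_channel):
        self.name = name
        self.transform = transform          # 4x4 device transform
        self.mies_channel = mies_channel

    def describe(self):
        # enough detail to reconstruct orientation and channel
        # associations when reading the log file later
        return {
            'name': self.name,
            'transform': self.transform.tolist(),
            'miesChannel': self.mies_channel,
        }

pip = PatchPipette('Manipulator1', np.eye(4), 0)
desc = pip.describe()
```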