
telemetyr's People

Contributors

kevinsee, mackerman44


telemetyr's Issues

Summarise Sentinel Data

Lower priority. Do we want to add functions to summarise data from so-called "sentinel" tags, i.e., tags deployed in the field to simulate a fish for evaluating read range, detection probability, and the like?

tag_data argument in summarise_test_data()

I'd like to add the ability for the tag_data argument in the summarise_test_data() function to accommodate either an object (e.g., the tag_releases data) or a file path (e.g., the path to the tag release metadata file).
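A hypothetical sketch of what that dispatch could look like; the helper name and the read step are illustrative, not the package's actual internals:

```r
resolve_tag_data <- function(tag_data) {
  if (is.character(tag_data) && length(tag_data) == 1) {
    # a single string is treated as a path to the tag release metadata file
    tag_data <- readr::read_csv(tag_data, show_col_types = FALSE)
  }
  if (!is.data.frame(tag_data)) {
    stop("tag_data must be a data frame or a file path to tag release metadata")
  }
  tag_data
}
```

summarise_test_data() could then call this helper on its tag_data argument before doing anything else, so downstream code always sees a data frame.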

Four date/time columns preserved after clean_raw_data()

This one has perhaps been resolved. Maybe we just need to re-run clean_raw_data() and determine whether it now functions as intended.

@mackerman44 April 13, 2020:
When running the clean_raw_data() function on the "raw" data, multiple date and/or time columns are generated, and we end up with four columns (orig_date, date, time, and date_time). Those columns are also preserved through round_tag_codes().

Are those columns duplicative, i.e., can we bring only the date_time column through clean_raw_data()? Or is there a reason for keeping the additional columns?

@KevinSee April 14, 2020:
The orig_date column was left to identify which rows had an original date stamp of "00/00/00", for QA/QC purposes, or in case we realized that assigning a date to those observations based on the file name was not a good idea. If we're settled on how that works, we could remove the orig_date, date, and time columns from the data when it's run through clean_raw_data().

@mackerman44 April 14, 2020:
That makes sense to me. But maybe leave the orig_date, date, and time columns for now, until we're further along in the build process? And keep this issue open to remind ourselves to remove them later?
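When that time comes, the removal itself should be a one-liner at the end of clean_raw_data(); a sketch with a toy data frame (column names taken from the discussion above, everything else illustrative):

```r
library(dplyr)

# toy stand-in for the output of clean_raw_data()
obs <- tibble::tibble(
  tag_id    = "164.480",
  orig_date = "00/00/00",
  date      = as.Date("2020-04-13"),
  time      = "12:34:56",
  date_time = as.POSIXct("2020-04-13 12:34:56", tz = "UTC")
)

# drop the redundant columns, keeping only date_time
obs <- select(obs, -orig_date, -date, -time)
```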

Document QA/QC procedure

From @KevinSee : I added csv files of the raw and compressed data to the NAS for each year. We should define a QA/QC procedure to ensure this data is ready to go. Might involve several different things to check and clean up. Hopefully Nick can help write a document describing this.

Package may be incompatible with R 4.0.2

From @ntfuch1:

I was not able to successfully install this package running R 4.0.2. The issue was resolved only after uninstalling R, RStudio, and Rtools and re-installing R version 3.6, after which the package installed properly.

Import TidBit Temperature Data

Whether here or in the LemhiRT package we need to import, clean, and summarize the TidBit temperature data at some point. Putting this here to remind myself.
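A rough sketch of what an import function could look like. Note the skip/column layout here is an assumption about a typical HOBOware .csv export (header row first, then a row-number column, a date-time column, and a temperature column) and would need to be checked against the actual files:

```r
read_tidbit <- function(path, datetime_col = 2, temp_col = 3) {
  raw <- readr::read_csv(path, skip = 1, show_col_types = FALSE)
  tibble::tibble(
    # the mdy_hms format is also an assumption about the export settings
    date_time = lubridate::mdy_hms(raw[[datetime_col]]),
    temp_c    = as.numeric(raw[[temp_col]])
  )
}
```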

Missing Dates

This one seems more related to data QA/QC for the Lemhi RT project. Consider moving to KevinSee/LemhiRT?

From @KevinSee Apr 22, 2020:
Even after updating several functions, there are still a number of rows with missing "start" and "end" date/times in the compressed dataframe. We'll need to track down which raw files these correspond to, and figure out what's going on.

2018-2019: 95 rows, all from receiver DC1, across multiple tag IDs
2019-2020: 370,451 rows, all from receivers TT1 and TT2
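Tracking these back to their raw files could be a simple filter-and-count, assuming the compressed data frame carries receiver and file_name columns (names here are illustrative; adjust to whatever the compression step actually outputs):

```r
library(dplyr)

# toy stand-in for the compressed data frame
compressed <- tibble::tibble(
  receiver  = c("DC1", "TT1", "TT1"),
  file_name = c("DC1_2019.txt", "TT1_2020.txt", "TT1_2020.txt"),
  start     = as.POSIXct(c(NA, NA, "2020-01-05 08:00:00"), tz = "UTC"),
  end       = as.POSIXct(c(NA, NA, "2020-01-05 08:10:00"), tz = "UTC")
)

# count rows with missing start/end, by receiver and source file
compressed %>%
  filter(is.na(start) | is.na(end)) %>%
  count(receiver, file_name, sort = TRUE)
```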

Survival Models

Some of this could perhaps, at some point, be moved to KevinSee/LemhiRT.

From @KevinSee :
On April 22, 2020 -
Things to look into:

Simple CJS model using batch_1 tags and RT detections only

Bring in detections from batch_2 and batch_3 tags, starting them in the reach with their first detection.

Kaplan-Meier known-fates model

CJS models for right censored data (when batteries die)

Other ideas?
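For reference on the first item, a minimal constant-parameter CJS model in JAGS (the standard state-space formulation, with phi = survival and p = detection; f is the occasion of first capture, y the detection histories, both supplied as data). This is a textbook sketch, not the package's implementation:

```r
cjs_model <- "
model {
  phi ~ dunif(0, 1)   # apparent survival
  p   ~ dunif(0, 1)   # detection probability
  for (i in 1:n_ind) {
    z[i, f[i]] <- 1                        # known alive at first capture
    for (t in (f[i] + 1):n_occ) {
      z[i, t] ~ dbern(phi * z[i, t - 1])   # latent survival process
      y[i, t] ~ dbern(p * z[i, t])         # detection process
    }
  }
}"
```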

On May 5, 2020 -
Completed CJS models using:

PIT tag observations only.
RT observations only, including bringing in batch 2 and batch 3 tags based on their first observation (batch 2/3 tags that are never detected are not included in the model. Does this bias results?)

On May 29, 2020 -
Note that Kaplan-Meier and other known-fate models usually have data about when (or where) at least some individuals die. Right-censored data in that context are individuals that have survived up to some point, after which no further data are collected (e.g., the study ends, or the radio tag dies). With this study, such models are probably not appropriate, since we never know when a fish actually dies; we just stop detecting it.

Package Installation Fail

Today, I attempted to install telemetyr on my new machine.

library(devtools)
install_github("BiomarkABS/telemetyr", build_vignettes = T)

Unfortunately, the first attempt failed because rjags was not installed. Then, after installing rjags, the second attempt failed with:

E> The rjags package is just an interface to the JAGS library
E> Make sure you have installed JAGS-4.x.y.exe (for any x >=0, y>=0) from
E> http://www.sourceforge.net/projects/mcmc-jags/files

After downloading and installing JAGS the install worked fine.

Is this the intended or expected behavior? I need to remind myself, but wasn't jagsUI added (or something to that effect) to remove the need for JAGS? Not high priority; I just ran into this while getting telemetyr onto the new machine.
