
Magnetar Propeller Model with Fallback Accretion

⚠️ This repo is archived and read-only! ⚠️

A suite of code that models fallback accretion onto a magnetar and uses Markov Chain Monte Carlo (MCMC) to fit the model to samples of Gamma-Ray Bursts (GRBs).



Installation

Begin by cloning this repo.

git clone https://github.com/sgibson91/magprop.git
cd magprop

Install the requirements using pip.

pip install -r requirements.txt

Use setup.py to install the magnetar library.

python setup.py install

Usage

Reproducing Model Figures 1-5

Execute a figure script by running:

python code/figure_<number>.py

These scripts will reproduce the model figures 1-5 in the Short GRBs paper. The figures will be saved to the plots/ directory.

Running MCMC on Synthetic Datasets

An MCMC simulation can be run on a synthetic dataset of one of the four GRB types in order to evaluate the performance of the model and MCMC algorithm. The four GRB types are: Humped, Classic, Sloped, and Stuttering.

First, generate a dataset by running the following script.

python code/synthetic_dataset/generate_synthetic_dataset.py --grb <GRB-type>

The dataset will be saved to data/synthetic_datasets/<GRB-type>/<GRB-type>.csv.
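Conceptually, a synthetic dataset is a model light curve with noise added. The sketch below is a toy, stdlib-only illustration of that idea; the real generator script draws from the magnetar propeller model, and its parameters and output format will differ:

```python
import random

def synthetic_light_curve(n_points=50, index=-1.2, noise_frac=0.1, seed=42):
    """Toy noisy power-law light curve.

    Hypothetical stand-in for generate_synthetic_dataset.py: the real
    script uses the magnetar propeller model, not a bare power law.
    """
    rng = random.Random(seed)
    # Log-spaced times from 1 s to 1e4 s
    times = [10 ** (i * 4.0 / (n_points - 1)) for i in range(n_points)]
    lums = []
    for t in times:
        model = t ** index                          # underlying "model" luminosity
        noisy = model * (1.0 + rng.gauss(0.0, noise_frac))  # add Gaussian noise
        lums.append(noisy)
    return times, lums

times, lums = synthetic_light_curve()
```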

Then run the MCMC simulation on the synthetic dataset.

python code/synthetic_dataset/mcmc_synthetic.py --grb <GRB-type> --n-walk <Nwalk> --n-step <Nstep>

where:

  • Nwalk is the number of MCMC walkers to use, and
  • Nstep is the number of MCMC steps to take.

This will optimise for 6 parameters: B, P, MdiscI, RdiscI, epsilon and delta. Generated datafiles will be saved to data/synthetic_datasets/<GRB-type> and figures will be saved to plots/synthetic_datasets/<GRB-type>.
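To illustrate what the walker and step counts control, here is a toy, stdlib-only Metropolis sampler with independent walkers fitting a single parameter. This is a sketch of the general technique only, not the repository's sampler, which fits all six parameters against the magnetar model likelihood:

```python
import math
import random

def log_likelihood(mu, data):
    # Gaussian log-likelihood with unit variance: a toy stand-in for
    # the magnetar model likelihood over B, P, MdiscI, RdiscI, epsilon, delta.
    return -0.5 * sum((x - mu) ** 2 for x in data)

def run_mcmc(data, n_walk=4, n_step=200, step_size=0.5, seed=0):
    """Run n_walk independent Metropolis chains of n_step steps each."""
    rng = random.Random(seed)
    chains = []
    for _ in range(n_walk):
        mu = rng.uniform(-5.0, 5.0)      # random starting position
        chain = []
        for _ in range(n_step):
            prop = mu + rng.gauss(0.0, step_size)   # propose a move
            # Accept with probability min(1, L(prop)/L(mu))
            if math.log(rng.random()) < log_likelihood(prop, data) - log_likelihood(mu, data):
                mu = prop
            chain.append(mu)
        chains.append(chain)
    return chains

data = [1.8, 2.1, 2.0, 1.9, 2.2]
chains = run_mcmc(data)
```

More walkers explore the parameter space from more starting points; more steps give each walker longer to converge and sample the posterior.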

If you need to re-run an analysis with the same input random seed, pass the --re-run flag.

Once the MCMC has completed, run the analysis script to generate figures and fitting statistics.

python code/synthetic_dataset/plot_synthetic.py --grb <GRB-type> --n-burn <Nburn>

where Nburn is the number of steps to remove as burn-in.
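Burn-in removal simply discards the first Nburn samples of each walker's chain, taken before the walker has converged onto the posterior. A minimal illustration:

```python
# Hypothetical walker trace: early samples drift toward the posterior mode
chain = [5.0, 3.0, 2.2, 2.05, 2.01, 1.99, 2.0, 2.02]
n_burn = 3

# Keep only post-burn-in samples for parameter estimation
posterior = chain[n_burn:]
posterior_mean = sum(posterior) / len(posterior)
```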

The optimal model will be saved to data/synthetic_datasets/<GRB-type>/<GRB-type>_model.csv and plots/synthetic_datasets/<GRB-type>/<GRB-type>_model.png. Another important file to check is data/synthetic_datasets/<GRB-type>/<GRB-type>_stats.json which will contain the optimised parameters and fitting statistics.
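The stats file can be inspected with the standard json module. The keys below are hypothetical placeholders; check the actual file for the real field names:

```python
import json

# Hypothetical shape of <GRB-type>_stats.json -- the real keys may differ.
stats_text = json.dumps({
    "parameters": {"B": 1.0, "P": 5.0, "MdiscI": 1e-3,
                   "RdiscI": 100.0, "epsilon": 1.0, "delta": 1.0},
    "redchisq": 1.2,
})

stats = json.loads(stats_text)
best = stats["parameters"]   # the six optimised parameters
```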

Preparing the GRB samples

The raw datafiles for the Short GRB sample are stored in data/SGRBS/. The dataset must first be cleaned to remove comments generated by the website that hosts the data, and converted to CSV format.

python code/clean_data.py

The last stage of preparing the dataset is a k-correction. A k-correction accounts for the distance at which the GRB exploded and the energy bandpass of the telescope that captured the data, making the observations compatible with the magnetar model. See this paper for more detail.
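For a power-law photon spectrum N(E) ∝ E^-Γ, the k-correction of Bloom, Frail & Sari (2001) is the ratio of the rest-frame bandpass energy integral to the observed bandpass integral. Here is a toy sketch under that assumption; the paper's actual spectral model and bandpasses may differ:

```python
def k_correction(z, gamma, band=(0.3, 10.0), rest_band=(1.0, 10000.0)):
    """Toy k-correction for a power-law photon spectrum N(E) ~ E**-gamma.

    z         : redshift of the burst
    gamma     : photon index (must not equal 2 in this toy form)
    band      : observed instrument bandpass in keV (hypothetical default)
    rest_band : target rest-frame bandpass in keV (hypothetical default)
    """
    def band_energy(lo, hi):
        # Integral of E * E**-gamma dE between lo and hi
        p = 2.0 - gamma
        return (hi ** p - lo ** p) / p

    e1, e2 = rest_band
    E1, E2 = band
    # Rest-frame band limits blueshifted into the observer frame
    return band_energy(e1 / (1.0 + z), e2 / (1.0 + z)) / band_energy(E1, E2)
```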

Run the k-correction on the Short GRB sample by running the following command.

python code/kcorr.py -t S

Binder

Binder

To run this repo in Binder, click the launch button above. When your server launches, you will see a JupyterLab interface.

Running a script in the terminal works the same as running it locally. If you run a script in the Python console, you'll need to modify the command as follows.

%run code/figure_<number>.py

You will NOT be able to run the MCMC simulations inside the Binder instance, as the servers are limited to 1 GB of memory and 0.5 CPU cores. Please follow the instructions in the Installation section in order to run the MCMC simulations locally.

Running Tests

To run tests, execute the following command.

python -m pytest -vvv

To see the code coverage of the test suite, run the following commands.

coverage run -m pytest -vvv
coverage report

Citing this work

Please use the following citation when referring to this work.

Paper Citation

@article{doi:10.1093/mnras/stx1531,
author = {Gibson, S. L. and Wynn, G. A. and Gompertz, B. P. and O'Brien, P. T.},
title = {Fallback accretion on to a newborn magnetar: short GRBs with extended emission},
journal = {Monthly Notices of the Royal Astronomical Society},
volume = {470},
number = {4},
pages = {4925-4940},
year = {2017},
doi = {10.1093/mnras/stx1531},
URL = {http://dx.doi.org/10.1093/mnras/stx1531},
eprint = {/oup/backfile/content_public/journal/mnras/470/4/10.1093_mnras_stx1531/1/stx1531.pdf}
}

Software Citation

To cite the software contained in this repo, please use the metadata contained in CITATION.cff.

License


This work is published under the MIT license. Please see the LICENSE file for further information.


magprop's Issues

Restructure postBuild and postBuild.py

Abstract some features of postBuild.py for Binder out so it's friendlier for local users.

  • Remove file deletion from postBuild.py and abstract out into postBuild
  • Rename postBuild.py to something like clean_data.py under code folder (reflect this in postBuild)
  • Update docs to reflect that clean_data.py should be run before kcorr.py and beginning MCMC simulations

Update to JupyterLab

Problem

"Vanilla" binder won't allow a terminal loaded from the dropdown Jupyter environment. JupyterLab should be able to provide a terminal however.

Update

Repo is now launching in JupyterLab, but running the scripts from the terminal does not work: the modules are not loaded. However, the scripts run fine in an IPython console. Usage instructions have been added to the README to reflect this.

Reorganise README

Reorganise the README so it's clear which scripts need to be run to gain which result.

Add tests

Begin to scope what kind of tests can be written and implemented.

Install magnetar library

Main Goals

Install magnetar library which contains:

  • script with functions to model the light curves
  • script with likelihood functions for Bayesian inference
  • CSV files containing appropriate parameter limits for prior function
  • script with fitting statistics in-built: (reduced) chi-square, AICc, chain convergence
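As an illustration of the first two statistics under their usual definitions (the library's actual implementation may differ):

```python
def redchisq(obs, model, err, n_params):
    """Reduced chi-square: chi^2 divided by N - k degrees of freedom."""
    chisq = sum(((o - m) / e) ** 2 for o, m, e in zip(obs, model, err))
    return chisq / (len(obs) - n_params)

def aicc(log_like, n_params, n_data):
    """Corrected Akaike Information Criterion (AICc)."""
    aic = 2 * n_params - 2 * log_like
    # Small-sample correction term
    return aic + (2 * n_params * (n_params + 1)) / (n_data - n_params - 1)
```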

Other necessities

  • Update setup.py file for this repo

Links to Swift data are not permanent

Problem

Want to implement a postBuild file which would download the required GRB sample from the Swift website using wget. However, the links to the data files are not permanent and often throw a Cannot find file message resulting in a Cannot convert to float error when the postBuild.py script tries to read in the downloaded file.

Potential Solutions

  • Make a copy of the raw files and host them somewhere (e.g. Dropbox) so that the links are permanent (Don't need permissions to share either)
  • Upload sample straight into Git repo - but binder may not like this

Other issues

  • Attempting to wget files from Dropbox results in scraping the HTML, not the file. There is a Dropbox API available for Python, but its compatibility with v2.7 is not clear.

Solution

Upload raw files into SGRBS/ folder. postBuild file runs script which cleans the data and sorts it into appropriate sub-directories.

Filepath dependencies

Investigate the os module to control the working directory, where files are read from, where files are saved etc. so that we don't have to explicitly tell users to run code from the repo root.

Develop a Makefile to run the pipelines

Develop a Makefile to run different pipelines, such as:

  • Set up environment: make output subdirectories and activate the kernel environment, etc. Run kcorr.py?
  • Generate first 5 model testing figures: A rule to run code/figure%.py where % is 1-5
  • Clean up: Remove generated files and subdirectories
  • Run the test suite
  • Run synthetic data example: Create synthetic datasets, run MCMC on each synth dataset, produce analysis outputs

This may fix #10 as running make from root will fix filepath dependencies.

Will need to add argparse arguments to scripts for output files.

Include some synthetic datasets and mcmc scripts

To demonstrate the MCMC fitting routine without HPC support, include the following:

  • small synthetic datasets with noise for each burst type: humped, classic, sloped, stuttering
  • MCMC optimisation script with few walkers and steps that can be run without HPC
  • analysis script to generate plots and statistics
