
PyCBC Inference: A Python-based parameter estimation toolkit for compact-object merger signals

C. M. Biwer1,2, Collin D. Capano3, Soumi De2, Miriam Cabero3, Duncan A. Brown2, Alexander H. Nitz3, V. Raymond4,5

1Los Alamos National Laboratory, Los Alamos, NM 87545, USA

2Department of Physics, Syracuse University, Syracuse, NY 13244, USA

3Albert-Einstein-Institut, Max-Planck-Institut für Gravitationsphysik, D-30167 Hannover, Germany

4Albert-Einstein-Institut, Max-Planck-Institut für Gravitationsphysik, D-14476 Potsdam, Germany

5School of Physics and Astronomy, Cardiff University, Cardiff CF24 3AA, Wales, UK

License

Creative Commons License

This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 United States License.

Introduction

This repository is a companion to the paper posted at arXiv:1807.10312. We release the posterior probability density files from the MCMC analyses obtained by running PyCBC Inference on the gravitational-wave data for the binary black hole events GW150914, GW151226, and LVT151012, detected during LIGO's first observing run. We provide a notebook demonstrating how to read these posterior files and reconstruct Figures 4, 5, and 6 of the paper. We also provide the configuration files and sample scripts with command lines to run the PyCBC Inference parameter estimation analysis on the gravitational-wave data for the three events.

We encourage use of these data in derivative works. If you use the material provided here, please cite the paper using the reference:

@article{Biwer:2018,
      author         = "Biwer, C. M. and Capano, Collin D. and De, Soumi and
                        Cabero, Miriam and Brown, Duncan A. and Nitz, Alexander H. and Raymond, V.",
      title          = "{PyCBC Inference: A Python-based parameter estimation toolkit for compact-object merger signals}",
      year           = "2018",
      eprint         = "1807.10312",
      archivePrefix  = "arXiv",
      primaryClass   = "astro-ph.IM",
      SLACcitation   = "%%CITATION = ARXIV:1807.10312;%%"
}

The posterior samples from the PyCBC Inference analyses of GW150914, LVT151012, and GW151226 presented in the paper are stored in the folders posteriors/GW150914, posteriors/LVT151012, and posteriors/GW151226, respectively, in the files listed below. These files contain the thinned posterior samples from the MCMC chains used to produce the posterior probability density and histogram plots; a short example of inspecting these files follows the list:

  1. gw150914_posteriors_thinned.hdf contains the posterior samples from the MCMC for measuring properties of GW150914.
  2. gw151226_posteriors_thinned.hdf contains the posterior samples from the MCMC for measuring properties of GW151226.
  3. lvt151012_posteriors_thinned.hdf contains the posterior samples from the MCMC for measuring properties of LVT151012.
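
The companion notebook (data_release_pycbc-inference-paper_companion.ipynb) contains the intended reading code. As a quick orientation only, the sketch below opens one of the released files with h5py and lists whatever groups and datasets it contains; it does not assume any particular internal layout, since the layout is defined by PyCBC Inference:

import h5py

# Minimal sketch: inspect one of the released posterior files.
# It does not assume specific dataset names; it just walks the file
# and prints what it finds. See the companion notebook for the code
# that actually reproduces the paper's figures.
posterior_file = "posteriors/GW150914/gw150914_posteriors_thinned.hdf"

with h5py.File(posterior_file, "r") as fp:
    # File-level attributes (run metadata, if present)
    for name, value in fp.attrs.items():
        print("attr:", name, "=", value)

    # Recursively list every group and dataset in the file
    def show(name, obj):
        kind = "dataset" if isinstance(obj, h5py.Dataset) else "group"
        shape = getattr(obj, "shape", "")
        print(kind, name, shape)

    fp.visititems(show)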

The sample scripts with command lines and configuration files for performing the analyses presented in the paper for GW150914, LVT151012, and GW151226 can be found in the folders samples/GW150914, samples/LVT151012, and samples/GW151226 respectively.

The results for the astrophysical events in version 1 of the paper at arXiv:1807.10312 were generated with the PyCBC v1.9.4 release.

The results in the final version of the paper accepted to the Publications of the Astronomical Society of the Pacific were generated with the PyCBC v1.12.3 release.

Running the notebook in a Docker container

This notebook can be run from a PyCBC Docker container, or on a machine with PyCBC installed. Instructions for downloading the Docker container are available from the PyCBC home page. To start a container with an instance of the Jupyter notebook server, run the commands:

docker pull pycbc/pycbc-el7:v1.12.3
docker run -p 8888:8888 --name pycbc_notebook -it pycbc/pycbc-el7:v1.12.3 /bin/bash -l

Once the container has started, this git repository can be downloaded with the command:

git clone https://github.com/gwastro/pycbc-inference-paper.git

The notebook server can be started inside the container with the command:

jupyter notebook --ip 0.0.0.0 --no-browser

You can then connect to the notebook server at the URL printed by Jupyter. Navigate to the directory pycbc-inference-paper in the cloned git repository and open data_release_pycbc-inference-paper_companion.ipynb (the companion notebook).

Funding

This work was supported by NSF awards PHY-1404395 (DAB, CMB), PHY-1707954 (DAB, SD), and PHY-1607169 (SD). SD was also supported by the Inaugural Kathy '73 and Stan '72 Walters Endowed Fund for Science Research Graduate Fellowship at Syracuse University. Computations were supported by Syracuse University and NSF award OAC-1541396. We also acknowledge the Max Planck Gesellschaft for support and the Atlas cluster computing team at AEI Hannover. The authors thank the LIGO Scientific Collaboration for access to the data and gratefully acknowledge the support of the United States National Science Foundation (NSF) for the construction and operation of the LIGO Laboratory and Advanced LIGO as well as the Science and Technology Facilities Council (STFC) of the United Kingdom, and the Max-Planck-Society (MPS) for support of the construction of Advanced LIGO. Additional support for Advanced LIGO was provided by the Australian Research Council. This research has made use of data obtained from the LIGO Open Science Center (https://losc.ligo.org).


Issues

How to thin the posterior samples?

I used the script in run_files and got a "gw150914_inference.hdf" file,
but it is not the same as "gw150914_posteriors_thinned.hdf".

Could you explain how to thin the posterior samples?
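
For orientation only (this is not the authors' exact procedure): "thinning" generally means discarding an initial burn-in segment of each chain and then keeping every n-th remaining sample, with n chosen from the measured autocorrelation length so the kept samples are approximately independent. A generic sketch with made-up numbers:

import numpy as np

# Generic illustration of thinning MCMC samples; the burn-in and
# autocorrelation length (ACL) used for the released files were measured
# from the actual chains and are not reproduced here.
def thin_chain(chain, burn_in, acl):
    """Drop the first `burn_in` iterations, then keep every `acl`-th sample."""
    return chain[burn_in::acl]

rng = np.random.default_rng(0)
chain = rng.normal(size=(10000, 200))  # (iterations, walkers), hypothetical
independent = thin_chain(chain, burn_in=2000, acl=15)
print(independent.shape)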

RuntimeError: Can't find MKL libraries

Hello, I am using the Docker version, but I get this error about missing MKL libraries.

(pycbc-software) [pycbc@24f913201669 GW151226]$ ls
extract_independent_samples.sh gw151226_inference.ini run_pycbc_inference_gw151226.sh
(pycbc-software) [pycbc@24f913201669 GW151226]$ pwd
/home/pycbc/pycbc-inference-paper/run_files/GW151226
(pycbc-software) [pycbc@24f913201669 GW151226]$ bash run_pycbc_inference_gw151226.sh
1135136340
1135136356
1135136238
1135137262
/home/pycbc/pycbc-software/lib/python2.7/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from float to np.floating is deprecated. In future, it will be treated as np.float64 == np.dtype(float).type.
  from ._conv import register_converters as _register_converters
2019-06-03 16:42:42,073 Using seed 12
Traceback (most recent call last):
  File "/home/pycbc/pycbc-software/bin/pycbc_inference", line 4, in <module>
    __import__('pkg_resources').run_script('PyCBC==1.12.3', 'pycbc_inference')
  File "/home/pycbc/pycbc-software/lib/python2.7/site-packages/pkg_resources/__init__.py", line 654, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/home/pycbc/pycbc-software/lib/python2.7/site-packages/pkg_resources/__init__.py", line 1434, in run_script
    exec(code, namespace, namespace)
  File "/home/pycbc/pycbc-software/lib/python2.7/site-packages/PyCBC-1.12.3-py2.7-linux-x86_64.egg/EGG-INFO/scripts/pycbc_inference", line 195, in <module>
    ctx = scheme.from_cli(opts)
  File "/home/pycbc/pycbc-software/lib/python2.7/site-packages/PyCBC-1.12.3-py2.7-linux-x86_64.egg/pycbc/scheme.py", line 262, in from_cli
    ctx = MKLScheme()
  File "/home/pycbc/pycbc-software/lib/python2.7/site-packages/PyCBC-1.12.3-py2.7-linux-x86_64.egg/pycbc/scheme.py", line 149, in __init__
    raise RuntimeError("Can't find MKL libraries")
RuntimeError: Can't find MKL libraries
(pycbc-software) [pycbc@24f913201669 GW151226]$
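
A hedged diagnostic, not an official fix: the MKL processing scheme needs the Intel MKL runtime to be loadable inside the container. The snippet below (an assumption about the failure mode) only checks whether that shared library can be found; if it cannot, either MKL needs to be installed and made visible, or the run script would need to select a non-MKL processing scheme (check pycbc_inference --help in the container for the available options).

import ctypes

# Assumption: PyCBC's MKL backend loads the MKL runtime library via ctypes.
# If this raises OSError, the MKL libraries are not visible in the container,
# which is consistent with the RuntimeError above.
try:
    ctypes.CDLL("libmkl_rt.so")
    print("MKL runtime found")
except OSError as err:
    print("MKL runtime not found:", err)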

How to work with the latest release, v1.13.5?

Is it possible to run the code using the latest PyCBC release, 1.13.5? I tried changing the import pycbc.io.inference_hdf.InferenceFile used in 1.12.3 to pycbc.inference.io.base_hdf.BaseInferenceFile, but it didn't work, giving the error message: TypeError: Can't instantiate abstract class BaseInferenceFile with abstract methods read_raw_samples, write_posterior, etc.
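
A version-agnostic alternative, offered as a suggestion rather than a confirmed fix: the released posterior files are ordinary HDF5 files, so they can be read with h5py directly, bypassing the PyCBC file classes that changed between releases. The dataset path below is a placeholder; use the listing approach shown earlier in this README to find the actual group and parameter names in the file.

import h5py
import numpy as np

# Placeholder path: "samples/mass1" is only an example of what a parameter
# dataset might be called; list the file contents first to get real names.
with h5py.File("posteriors/GW150914/gw150914_posteriors_thinned.hdf", "r") as fp:
    print(list(fp.keys()))          # top-level groups
    data = fp["samples/mass1"][()]  # hypothetical dataset path
    print(np.asarray(data).shape)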
