
Posterior samples of the parameters of binary black holes from Advanced LIGO--Virgo's second observing run

Soumi De [1], Christopher M. Biwer [2], Collin D. Capano [3,4], Alexander H. Nitz [3,4], Duncan A. Brown [1]

[1] Department of Physics, Syracuse University, Syracuse, NY 13244, USA

[2] Los Alamos National Laboratory, Los Alamos, NM 87545, USA

[3] Albert-Einstein-Institut, Max-Planck-Institut für Gravitationsphysik, D-30167 Hannover, Germany

[4] Leibniz Universität Hannover, D-30167 Hannover, Germany

License

This work is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).

Introduction

This is a public data release of posterior samples from the parameter estimation analysis of the seven binary black hole mergers---GW170104, GW170608, GW170729, GW170809, GW170814, GW170818, and GW170823---detected during the second observing run of the Advanced LIGO and Virgo observatories, performed using gravitational-wave open data. The analysis that generated these data is presented in the paper posted at arxiv:1811.09232. We provide a notebook demonstrating how to read the files containing the posterior samples and handle the data, along with tools for visualizing the data and commands for reconstructing Figures 1, 2, and 3 of the paper. We also provide the configuration files and sample scripts with command lines to replicate our analyses for the seven events and generate these data.
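
As a quick illustration of how the released posterior files can be read, the short sketch below opens one of them with h5py and lists its contents. This is not the release's own tooling: it assumes the HDF5 layout written by PyCBC Inference, in which the posterior samples sit under a top-level "samples" group, and the notebook data_release_o2_bbh_pe.ipynb remains the authoritative reference for reading these files.

# Minimal sketch for inspecting one of the released posterior files.
# Assumes the samples live under a top-level "samples" group, as in files
# written by PyCBC Inference; see the notebook for the authoritative code.
import h5py
import numpy as np

path = "posteriors/GW170814/gw170814_posteriors_thinned.hdf"
with h5py.File(path, "r") as posterior_file:
    print("top-level groups:", list(posterior_file.keys()))
    parameters = list(posterior_file["samples"].keys())
    print("sampled parameters:", parameters)
    first = np.array(posterior_file["samples"][parameters[0]])
    print(parameters[0], "->", first.size, "posterior samples")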

We encourage use of these data in derivative works. If you use the material provided here, please cite the companion paper for this data release using the following reference; the companion paper describes the data and the analyses used to generate them.

@article{De:2018,
      author         = "De, Soumi and Biwer, C. M. and Capano, Collin D. and Nitz, Alexander H. and Brown, Duncan A.",
      title          = "{Posterior samples of the parameters of black hole mergers in the second Advanced LIGO--Virgo observing run}",
      year           = "2018",
      eprint         = "1811.09232",
      archivePrefix  = "arXiv",
      primaryClass   = "astro-ph.IM",
      SLACcitation   = "%%CITATION = ARXIV:1811.09232;%%"
}

Please also cite Biwer et al. (2019) using the following reference. This paper describes and validates the PyCBC Inference parameter estimation toolkit that was used to generate the data.

@article{Biwer_2019,
	doi = {10.1088/1538-3873/aaef0b},
	year = 2019,
	month = {jan},
	publisher = {{IOP} Publishing},
	volume = {131},
	number = {996},
	pages = {024503},
	author = {C. M. Biwer and Collin D. Capano and Soumi De and Miriam Cabero and Duncan A. Brown and Alexander H. Nitz and V. Raymond},
	title = {{PyCBC} Inference: A Python-based Parameter Estimation Toolkit for Compact Binary Coalescence Signals},
	journal = {Publications of the Astronomical Society of the Pacific},
}

The parameter estimation analyses that generated the posterior samples, and the construction of the figures visualizing the results, were performed with PyCBC v1.12.3.

The contents of this repository are organized as follows:

  • data_release_o2_bbh_pe.ipynb : Notebook demonstrating tools to handle the released posteriors, visualize them, and reconstruct Figures in the paper arxiv:1811.09232

  • posteriors : Directory containing the posterior files (a minimal plotting sketch follows this list)

    • GW170104 : Directory for GW170104
      • gw170104_posteriors_thinned.hdf : File containing posterior samples from the MCMC for measuring properties of GW170104. Also contains prior samples and PSDs used in the analysis.
    • GW170608 : Directory for GW170608
      • gw170608_posteriors_thinned.hdf : File containing posterior samples from the MCMC for measuring properties of GW170608. Also contains prior samples and PSDs used in the analysis.
    • GW170729 : Directory for GW170729
      • gw170729_posteriors_thinned.hdf : File containing posterior samples from the MCMC for measuring properties of GW170729. Also contains prior samples and PSDs used in the analysis.
    • GW170809 : Directory for GW170809
      • gw170809_posteriors_thinned.hdf : File containing posterior samples from the MCMC for measuring properties of GW170809. Also contains prior samples and PSDs used in the analysis.
    • GW170814 : Directory for GW170814
      • gw170814_posteriors_thinned.hdf : File containing posterior samples from the MCMC for measuring properties of GW170814. Also contains prior samples and PSDs used in the analysis.
    • GW170818 : Directory for GW170818
      • gw170818_posteriors_thinned.hdf : File containing posterior samples from the MCMC for measuring properties of GW170818. Also contains prior samples and PSDs used in the analysis.
    • GW170823 : Directory for GW170823
      • gw170823_posteriors_thinned.hdf : File containing posterior samples from the MCMC for measuring properties of GW170823. Also contains prior samples and PSDs used in the analysis.
  • run_files : Directory containing run scripts and configuration files to replicate the analyses

    • GW170104 : Directory for GW170104
      • gw170104_inference.ini : Configuration file for GW170104 analysis
      • run_pycbc_inference_gw170104.sh : Run script for GW170104 analysis
    • GW170608 : Directory for GW170608
      • gw170608_inference.ini : Configuration file for GW170608 analysis
      • run_pycbc_inference_gw170608.sh : Run script for GW170608 analysis
    • GW170729 : Directory for GW170729
      • gw170729_inference.ini : Configuration file for GW170729 analysis
      • run_pycbc_inference_gw170729.sh : Run script for GW170729 analysis
    • GW170809 : Directory for GW170809
      • gw170809_inference.ini : Configuration file for GW170809 analysis
      • run_pycbc_inference_gw170809.sh : Run script for GW170809 analysis
    • GW170814 : Directory for GW170814
      • gw170814_inference.ini : Configuration file for GW170814 analysis
      • run_pycbc_inference_gw170814.sh : Run script for GW170814 analysis
    • GW170818 : Directory for GW170818
      • gw170818_inference.ini : Configuration file for GW170818 analysis
      • run_pycbc_inference_gw170818.sh : Run script for GW170818 analysis
    • GW170823 : Directory for GW170823
      • gw170823_inference.ini : Configuration file for GW170823 analysis
      • run_pycbc_inference_gw170823.sh : Run script for GW170823 analysis
    • run_pycbc_inference_extract_samples.sh : Script containing the command for extracting independent samples from the full chains produced by the MCMC runs.
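
As noted in the posteriors listing above, here is a minimal plotting sketch that turns one of the released files into a simple scatter plot of two mass parameters with matplotlib. The parameter names mass1 and mass2 are assumptions about what the "samples" group contains (print its keys, as in the earlier sketch, to check); the notebook provides the actual visualization tools used for the figures in the paper.

# Minimal plotting sketch, not the release's own figure code. The parameter
# names "mass1" and "mass2" are assumed; inspect the "samples" group for the
# names actually stored in the file.
import h5py
import matplotlib.pyplot as plt

path = "posteriors/GW170104/gw170104_posteriors_thinned.hdf"
with h5py.File(path, "r") as posterior_file:
    m1 = posterior_file["samples/mass1"][:]
    m2 = posterior_file["samples/mass2"][:]

plt.scatter(m1, m2, s=1, alpha=0.3)
plt.xlabel("mass1 (solar masses)")
plt.ylabel("mass2 (solar masses)")
plt.title("GW170104 component-mass posterior (illustrative)")
plt.savefig("gw170104_masses.png")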

Running the notebook in a Docker container

This notebook can be run from a PyCBC Docker container, or on a machine with PyCBC installed. Instructions for downloading the Docker container are available from the PyCBC home page. To start a container with an instance of the Jupyter notebook server, run the commands:

docker pull pycbc/pycbc-el7:v1.12.3
docker run -p 8888:8888 --name pycbc_notebook -it pycbc/pycbc-el7:v1.12.3 /bin/bash -l

Once the container has started, this git repository can be downloaded with the command:

git clone https://github.com/gwastro/o2-bbh-pe.git

The notebook server can be started inside the container with the command:

jupyter notebook --ip 0.0.0.0 --no-browser

You can then connect to the notebook server at the URL printed by Jupyter. Navigate to the o2-bbh-pe directory in the cloned git repository and open data_release_o2_bbh_pe.ipynb.

Acknowledgements

This research has made use of data from the Gravitational Wave Open Science Center (https://www.gw-openscience.org). Computations were performed on the Syracuse University SUGWG cluster.

Funding: This work was supported by NSF awards PHY-1707954 (DAB, SD) and PHY-1607169 (SD). SD was also supported by the Inaugural Kathy '73 and Stan '72 Walters Endowed Fund for Science Research Graduate Fellowship at Syracuse University. Computations were supported by Syracuse University and NSF award OAC-1541396.

Authors' contributions

Conceptualization: DAB; Methodology: SD, CMB, CDC, AHN; Software: CMB, CDC, SD, AHN, DAB; Validation: CDC, CMB, AHN; Formal Analysis: SD; Investigation: SD, CMB, CDC, AHN; Resources: DAB; Data Curation: DAB, CDC, CMB, AHN, and SD; Writing: SD, CMB, CDC, DAB, and AHN; Visualization: SD, CMB, CDC, AHN; Supervision: DAB; Project Administration: DAB; Funding Acquisition: DAB.
