Commit d9725afe authored by Avi Vajpeyi

Merge branch 'add_citation' into 'master'

update docs with new install instructions and citations

See merge request !88
Publications
------------
Please add the following line within your methods, conclusion or acknowledgements
sections:
"This research has made use of Parallel Bilby vX.Y (version citation), an
open-source and free community-developed parallelised Bayesian inference Python
package (citation)."
"This research has made use of Parallel Bilby vX.Y \cite{pbilby_paper, bilby_paper},
a parallelised Bayesian inference Python package, and Dynesty vX.Y
\cite{dynesty_paper, skilling2004, skilling2006}, a nested sampler,
to perform Bayesian parameter estimation."
.. code:: bibtex
@ARTICLE{pbilby_paper,
author = {{Smith}, Rory J.~E. and {Ashton}, Gregory and {Vajpeyi}, Avi and {Talbot}, Colm},
title = "{Massively parallel Bayesian inference for transient gravitational-wave astronomy}",
journal = {\mnras},
keywords = {gravitational waves, methods: data analysis, General Relativity and Quantum Cosmology, Astrophysics - Instrumentation and Methods for Astrophysics},
year = 2020,
month = aug,
volume = {498},
number = {3},
pages = {4492-4502},
doi = {10.1093/mnras/staa2483},
archivePrefix = {arXiv},
eprint = {1909.11873},
primaryClass = {gr-qc},
adsurl = {https://ui.adsabs.harvard.edu/abs/2020MNRAS.498.4492S},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
@ARTICLE{bilby_paper,
author = {{Ashton}, Gregory and {H{\"u}bner}, Moritz and {Lasky}, Paul D. and {Talbot}, Colm and {Ackley}, Kendall and {Biscoveanu}, Sylvia and {Chu}, Qi and {Divakarla}, Atul and {Easter}, Paul J. and {Goncharov}, Boris and {Hernandez Vivanco}, Francisco and {Harms}, Jan and {Lower}, Marcus E. and {Meadors}, Grant D. and {Melchor}, Denyz and {Payne}, Ethan and {Pitkin}, Matthew D. and {Powell}, Jade and {Sarin}, Nikhil and {Smith}, Rory J.~E. and {Thrane}, Eric},
title = "{BILBY: A User-friendly Bayesian Inference Library for Gravitational-wave Astronomy}",
journal = {\apjs},
keywords = {gravitational waves, methods: data analysis, methods: statistical, stars: black holes, stars: neutron, Astrophysics - Instrumentation and Methods for Astrophysics, Astrophysics - High Energy Astrophysical Phenomena, General Relativity and Quantum Cosmology},
year = 2019,
month = apr,
volume = {241},
number = {2},
eid = {27},
pages = {27},
doi = {10.3847/1538-4365/ab06fc},
archivePrefix = {arXiv},
eprint = {1811.02042},
primaryClass = {astro-ph.IM},
adsurl = {https://ui.adsabs.harvard.edu/abs/2019ApJS..241...27A},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
@ARTICLE{dynesty_paper,
author = {{Speagle}, Joshua S.},
title = "{DYNESTY: a dynamic nested sampling package for estimating Bayesian posteriors and evidences}",
journal = {\mnras},
keywords = {methods: data analysis, methods: statistical, Astrophysics - Instrumentation and Methods for Astrophysics, Statistics - Computation},
year = 2020,
month = apr,
volume = {493},
number = {3},
pages = {3132-3158},
doi = {10.1093/mnras/staa278},
archivePrefix = {arXiv},
eprint = {1904.02180},
primaryClass = {astro-ph.IM},
adsurl = {https://ui.adsabs.harvard.edu/abs/2020MNRAS.493.3132S},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
@INPROCEEDINGS{skilling2004,
author = {{Skilling}, John},
title = "{Nested Sampling}",
keywords = {02.50.Tt, Inference methods},
booktitle = {Bayesian Inference and Maximum Entropy Methods in Science and Engineering: 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering},
year = 2004,
editor = {{Fischer}, Rainer and {Preuss}, Roland and {Toussaint}, Udo Von},
series = {American Institute of Physics Conference Series},
volume = {735},
month = nov,
pages = {395-405},
doi = {10.1063/1.1835238},
adsurl = {https://ui.adsabs.harvard.edu/abs/2004AIPC..735..395S},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
@ARTICLE{skilling2006,
author = {{Skilling}, John},
doi = {10.1214/06-BA127},
journal = {Bayesian Analysis},
month = dec,
number = 4,
pages = {833-859},
publisher = {International Society for Bayesian Analysis},
title = {Nested sampling for general Bayesian computation},
url = "https://doi.org/10.1214/06-BA127",
volume = 1,
year = 2006
}
The citation should be to the `Parallel Bilby paper`_, and :code:`vX.Y` should be
replaced with the specific version used in your work.
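For example, in a LaTeX manuscript the suggested acknowledgement could be written as follows (the version numbers :code:`v1.0.0` are placeholders; substitute the versions you actually used):

.. code-block:: latex

   This research has made use of Parallel Bilby v1.0.0
   \cite{pbilby_paper, bilby_paper}, a parallelised Bayesian inference
   Python package, and Dynesty v1.0.0
   \cite{dynesty_paper, skilling2004, skilling2006}, a nested sampler,
   to perform Bayesian parameter estimation.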
Posters and talks
-----------------
Please include the `Parallel Bilby logo`_ on the title, conclusion slide, or about page.
.. _Parallel Bilby paper: http://dx.doi.org/10.1093/mnras/staa2483
.. _Parallel Bilby logo: https://git.ligo.org/uploads/-/system/project/avatar/1846/bilby.jpg?width=40
==================
Usage and Examples
==================
See https://git.ligo.org/lscsoft/parallel_bilby/-/tree/master/examples
Usage notes
-----------
The steps to analyse data with :code:`Parallel Bilby` are:
#. Ini Creation:
Create an :code:`ini` with paths to the `prior`, `PSD` and `data` files, along with other `kwargs`.
#. Parallel Bilby Generation:
Setup your :code:`Parallel Bilby` jobs with

.. code-block:: console

   $ parallel_bilby_generation <your ini file>
This generates
* Plots of the `PSD` (review before submitting your job)
* :code:`Slurm` submission scripts
* a :code:`data dump` pickle (object packed with the `PSD`, `data`, etc)
#. Parallel Bilby Analysis:
To submit the :code:`Slurm` jobs on a cluster, run
.. code-block:: console
$ bash outdir/submit/bash_<label>.sh
Alternatively, to run locally without submitting a job, check the :code:`bash` file
for the required command. It should look something like:
.. code-block:: console
$ mpirun parallel_bilby_analysis outdir/data/<label>_data_dump.pickle --label <label> --outdir outdir/result --sampling-seed 1234
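Putting the steps together, a typical end-to-end run might look like the following sketch (the :code:`GW150914` label and file name are illustrative placeholders):

.. code-block:: console

   $ # 1. Generation: build the data dump, PSD plots and Slurm scripts
   $ parallel_bilby_generation GW150914.ini
   $ # 2. Analysis: submit the Slurm jobs on the cluster
   $ bash outdir/submit/bash_GW150914.sh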
Example ini files
-----------------
Refer to the `Parallel Bilby Examples Folder`_ for example :code:`ini` files along with :code:`Jupyter Notebooks`
explaining how to set up :code:`Parallel Bilby` jobs.
The folder has three ini examples:
#. GW150914 ini
#. GW170817 ini
#. Instructions to setup multiple injection inis
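As a rough sketch, a minimal :code:`ini` file might contain entries along the following lines. The option names follow the :code:`bilby_pipe` style and the values are illustrative; refer to the examples folder for the authoritative files:

.. code-block:: ini

   label = GW150914
   outdir = outdir
   detectors = [H1, L1]
   trigger-time = 1126259462.4
   prior-file = GW150914.prior
   psd-dict = {H1: psd_H1.txt, L1: psd_L1.txt}
   data-dict = {H1: H1_data.gwf, L1: L1_data.gwf}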
.. _Parallel Bilby Examples Folder: https://git.ligo.org/lscsoft/parallel_bilby/-/tree/master/examples
=============
Installation
=============
Stable Installation
-------------------
Install the most recent stable version of :code:`Parallel Bilby` with :code:`pip`:
.. code-block:: console
$ pip install parallel_bilby
Alternatively, you can install :code:`Parallel Bilby` with :code:`conda`:
.. code-block:: console
$ conda install -c conda-forge parallel_bilby
Dependencies
------------
Install dependencies using
$ conda install -c conda-forge bilby_pipe schwimmbad
Development Install
-------------------
Install the package locally with
.. code-block:: console
$ python setup.py develop
This installs :code:`Parallel Bilby` in development mode: edits you make to the
source code take effect immediately, without re-installation.
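Alternatively, an editable :code:`pip` install achieves the same. A sketch, assuming you have cloned the repository:

.. code-block:: console

   $ git clone https://git.ligo.org/lscsoft/parallel_bilby.git
   $ cd parallel_bilby
   $ pip install -e .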
Executables
-----------
Installing :code:`Parallel Bilby` gives you access to two executables
.. code-block:: console
$ parallel_bilby_generation --help
$ parallel_bilby_analysis --help
Roughly speaking, the generation executable is run locally to prepare the data
for analysis. It takes as input a :code:`gwf` file of the strain data, the PSD text