Commit 88e2e742 authored by Cort Posnansky, committed by Shio Sakon

doc/source/cbc_analysis.rst: updated tutorial down to Section: PSD

parent 8f7260ec
Merge request !303: Updated cbc offline configuration tutorial
@@ -15,7 +15,7 @@ Open Science Grid (OSG).
Running Workflows
^^^^^^^^^^^^^^^^^^
1.A Build Singularity image (using the gstlal master branch)
1 Build Singularity image
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
NOTE: If you are using a reference Singularity container (suitable in most
@@ -31,46 +31,8 @@ To pull a container with gstlal installed, run:
$ singularity build --sandbox --fix-perms <image-name> docker://containers.ligo.org/lscsoft/gstlal:master
1.B Build Singularity image (using a gstlal non-master branch)
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
If using a non-master branch, create a singularity build directory by running:
.. code:: bash
$ mkdir <singularity-dir>
$ cd <singularity-dir>
$ singularity build --sandbox --fix-perms <image-name> docker://containers.ligo.org/lscsoft/gstlal:<name-of-branch>
If running on the ICDS (PSU cluster), add a directory called ``ligo`` inside
``<image-name>``, and add ``-B /ligo`` to the singularity commands below.
In the directory where ``<image-name>`` exists, run:
.. code:: bash
$ singularity run --writable <image-name>
$ cd gstlal
If one is modifying code, apply changes at this step.
Then, install gstlal by running the following for each subpackage
``<gstlal-sub>`` in turn: ``gstlal``, ``gstlal-burst``, ``gstlal-inspiral``, and ``gstlal-ugly``.
.. code:: bash
$ cd <gstlal-sub> && echo | ./00init.sh
$ ./configure --prefix /usr
$ make
$ make install
$ cd ..
To exit the singularity container, run:
.. code:: bash
$ exit
To use a branch other than master, you can replace `master` in the above command with the name of the desired branch. To use a custom build instead, gstlal will need to be installed into the container from your modified source code. For installation instructions, see the
`installation page <https://docs.ligo.org/lscsoft/gstlal/installation.html>`_.
2. Set up workflow
""""""""""""""""""""
@@ -81,35 +43,33 @@ First, we create a new analysis directory and switch to it:
$ mkdir <analysis-dir>
$ cd <analysis-dir>
$ mkdir mass_model
$ mkdir bank
$ mkdir bank mass_model idq dtdphi
Default configuration files and environment (``env.sh``) for a
variety of different banks are contained in the
`offline-configuration <https://git.ligo.org/gstlal/offline-configuration/configs>`_
`offline-configuration <https://git.ligo.org/gstlal/offline-configuration>`_
repository.
One can run the commands below to grab the configuration files, or clone the
repository and copy the files as needed into the analysis directory.
To download data files (mass model, template banks) that may be needed for
offline runs, see
`offline-configuration README <https://git.ligo.org/gstlal/offline-configuration/-/blob/main/README.md>`_
Move the template bank(s) into ``bank`` and the mass model into ``mass_model``
offline runs, see the
`README <https://git.ligo.org/gstlal/offline-configuration/-/blob/main/README.md>`_
in the offline-configuration repo. Move the template bank(s) into ``bank`` and the mass model into ``mass_model``.
For example, to grab the configuration file and environment for the a small BNS dag:
For example, to grab all the relevant files for a small BNS dag:
.. code:: bash
$ curl -O https://git.ligo.org/gstlal/offline-configuration/-/raw/main/configs/bns-small/config.yml
$ curl -O https://git.ligo.org/gstlal/offline-configuration/-/raw/main/env.sh
$ curl -O https://dcc.ligo.org/DocDB/0184/T2200318/002/gstlal_bank_small.xml.gz
$ curl -O https://dcc.ligo.org/DocDB/0184/T2200318/002/mass_model_small.h5
$ curl -O https://dcc.ligo.org/DocDB/0184/T2200318/002/H1L1-IDQ_TIMESERIES-1239641219-692847.h5
$ curl -O https://dcc.ligo.org/DocDB/0184/T2200318/002/inspiral_dtdphi_pdf.h5
Then run the following to get the template banks and mass models.
.. code:: bash
$ conda activate igwn
$ dcc archive --archive-dir=. --files --interactive T2200318-v2
$ conda deactivate
Then move the template bank, mass model, idq file, and dtdphi file into their corresponding directories.
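The moves can be sketched as follows (for illustration the files are first created with ``touch`` as empty placeholders; in a real analysis they are the files downloaded above):

.. code:: bash

   # Placeholders standing in for the downloaded files (illustration only).
   touch gstlal_bank_small.xml.gz mass_model_small.h5 \
         H1L1-IDQ_TIMESERIES-1239641219-692847.h5 inspiral_dtdphi_pdf.h5
   # Create the directories and move each file into place.
   mkdir -p bank mass_model idq dtdphi
   mv gstlal_bank_small.xml.gz bank/
   mv mass_model_small.h5 mass_model/
   mv H1L1-IDQ_TIMESERIES-1239641219-692847.h5 idq/
   mv inspiral_dtdphi_pdf.h5 dtdphi/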
Now, we'll need to modify the configuration as needed to run the analysis. At
the very least, this means setting the start/end times and the instruments to run over:
@@ -121,8 +81,7 @@ the very least, setting the start/end times and the instruments to run over:
instruments: H1L1
We also required template bank(s) and a mass model. Ensure these are pointed to
the right place in the configuration:
Ensure the configuration points at the template bank, mass model, idq file, and dtdphi file:
.. code-block:: yaml
@@ -133,6 +92,8 @@ the right place in the configuration:
prior:
mass-model: bank/mass_model_small.h5
idq-timeseries: idq/H1L1-IDQ_TIMESERIES-1239641219-692847.h5
dtdphi: dtdphi/inspiral_dtdphi_pdf.h5
If you're creating a summary page for results, you'll need to point at a
location where they are web-viewable:
@@ -140,7 +101,7 @@ location where they are web-viewable:
.. code-block:: yaml
summary:
webdir: /path/to/summary
webdir: ~/public_html/
If you're running on LIGO compute resources and your username doesn't match your
albert.einstein username, you'll also need to specify the
@@ -151,17 +112,17 @@ accounting group user for condor to track accounting information:
condor:
accounting-group-user: albert.einstein
In addition, update the ``singularity-image`` the ``condor`` section of your configuration if needed:
In addition, update the ``singularity-image`` in the ``condor`` section of your configuration if needed:
.. code-block:: yaml
condor:
singularity-image: /cvmfs/singularity.opensciencegrid.org/lscsoft/gstlal:master
If not using the reference Singularity image, you can replace this line with the
If not using a reference Singularity image, you can replace this with the
full path to a local singularity container ``<image>``.
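For instance, with a local container the same block might read (the path here is hypothetical):

.. code-block:: yaml

    condor:
      # hypothetical path to a locally built sandbox image
      singularity-image: /path/to/<image>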
For more detailed configuration options, take a look at the :ref:`configuration
section <analysis-configuration>` below.
If you haven't installed site-specific profiles yet (per-user), you can run:
@@ -240,7 +201,7 @@ If one desires to see detailed error messages, add ``PYTHONUNBUFFERED=1`` to
.. code:: bash
$ sed -i 's@environment = "LAL_DATA_PATH=/cvmfs/oasis.opensciencegrid.org/ligo/sw/pycbc/lalsuite-extra/current/share/lalsimulation"@environment = "LAL_DATA_PATH=/cvmfs/oasis.opensciencegrid.org/ligo/sw/pycbc/lalsuite-extra/current/share/lalsimulation PYTHONUNBUFFERED=1"@g' *.sub
$ sed -i '/^environment = / s/\"$/ PYTHONUNBUFFERED=1\"/' *.sub
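As a self-contained illustration of what this ``sed`` command does, here it is applied to a throwaway submit file (the file and its single ``environment`` line are made up for the demo):

.. code:: bash

   # Create a toy .sub file with an environment line (contents illustrative).
   cat > example.sub <<'EOF'
   environment = "LAL_DATA_PATH=/cvmfs/oasis.opensciencegrid.org/ligo/sw/pycbc/lalsuite-extra/current/share/lalsimulation"
   EOF
   # Append PYTHONUNBUFFERED=1 inside the closing quote of the environment string.
   sed -i '/^environment = / s/"$/ PYTHONUNBUFFERED=1"/' example.sub

Afterwards the line ends with ``PYTHONUNBUFFERED=1"``, so condor passes the variable to each job.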
3. Launch workflows
@@ -264,7 +225,7 @@ After the DAG has completed, you can generate the summary page for the analysis:
$ singularity exec <image> make summary
To make an open page, run:
To make an open-box page after this, run:
.. code:: bash
@@ -285,13 +246,13 @@ The top-level configuration consists of the analysis times and detector configuration:
instruments: H1L1
min-instruments: 1
These set the start and stop times of the analysis, plus the detectors to use
(H1=Hanford, L1=Livingston, V1=Virgo). The start and stop times are gps times,
there is a nice online converter that can be used here:
https://www.gw-openscience.org/gps/. You can also use the program `gpstime` as
These set the start and stop gps times of the analysis, plus the detectors to use
(H1=Hanford, L1=Livingston, V1=Virgo). There is a nice online converter for gps times
here: https://www.gw-openscience.org/gps/. You can also use the program `gpstime` as
well. Note that these start and stop times have no knowledge of science-quality
data; the science-quality data actually analyzed are typically a
subset of the total time.
subset of the total time. Information about which detectors were on at different
times is available here: https://www.gw-openscience.org/data/.
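Since only the span between the two times is analyzed, a quick duration check with shell arithmetic can catch typos in the configured times (the values below are illustrative; 1238166018 is the commonly quoted GPS time of the start of O3, 2019-04-01 15:00 UTC):

.. code:: bash

   start=1238166018   # example start time (start of O3)
   stop=1238170018    # example stop time, 4000 s later
   echo $(( stop - start ))   # prints the analysis span in seconds: 4000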
``min-instruments`` sets the minimum number of instruments we will allow to form
an event, e.g. setting it to 1 means the analysis will consider single detector
@@ -313,7 +274,11 @@ bank in particular contains a table that lists the parameters of all of the
templates; it does not contain the actual waveforms themselves. Metadata such as
the waveform approximant and the frequency cutoffs are also listed in this file.
One can use multiple sub template banks. Then, the configuration file would be like:
The ``analysis-dir`` option is used if the user wishes to point to an existing
analysis to perform a rerank or an injection-only workflow. This grabs existing files
from this directory to seed the rerank/injection workflows.
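As a sketch, seeding a rerank from an earlier run would add a single entry (the path is hypothetical):

.. code-block:: yaml

    # hypothetical path to a completed analysis
    analysis-dir: /path/to/previous-analysis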
One can use multiple sub template banks. In this case, the configuration might look like:
.. code-block:: yaml
@@ -326,10 +291,6 @@ One can use multiple sub template banks. Then, the configuration file would be like:
imbh: bank/sub_bank/imbh_low_q.xml.gz
The ``analysis-dir`` option is used if the user wishes to point to an existing
analysis to perform a rerank or an injection-only workflow. This grabs existing files
from this directory to seed the rerank/injection workflows.
Section: Source
""""""""""""""""
@@ -379,7 +340,7 @@ An example of configuration with the ``gwosc`` backend looks like:
vetoes:
category: CAT1
Here, the ``backend`` is set to ``gwosc`` so both segments are vetoes are determined
Here, the ``backend`` is set to ``gwosc`` so both segments and vetoes are determined
by querying the GWOSC server. There is no additional configuration needed to query
segments, but for vetoes, we also need to specify the ``category`` used for vetoes.
This can be one of ``CAT1``, ``CAT2``, or ``CAT3``. By default, segments are generated
@@ -402,7 +363,7 @@ An example of configuration with the ``dqsegdb`` backend looks like:
version: O3b_CBC_H1L1V1_C01_v1.2
epoch: O3
Here, the ``backend`` is set to ``dqsegdb`` so both segments are vetoes are determined
Here, the ``backend`` is set to ``dqsegdb`` so both segments and vetoes are determined
by querying the DQSEGDB server. To query segments, one needs to specify the
per-instrument flag to query segments with. For vetoes, we need to specify the
``category`` used for vetoes, as with the ``gwosc`` backend. Additionally, a veto definer file is