Charlie Hoy / CW Inference in Python · Commits

Commit dfa85205, authored 1 year ago by Matthew David Pitkin

Update documentation

parent eeb12d21
Changes: 2 changed files, with 62 additions and 10 deletions

- docs/heterodyne/heterodyne.rst: 32 additions, 5 deletions
- docs/knope/knope.rst: 30 additions, 5 deletions
docs/heterodyne/heterodyne.rst (+32 −5) @ dfa85205

@@ -71,13 +71,40 @@ as those `available to the LVK <https://computing.docs.ligo.org/guide/grid/>`__,
 Science Grid <https://opensciencegrid.org/>`__), or an individual machine (see below), running the
 `HTCondor <https://htcondor.readthedocs.io/en/latest/>`_ job scheduler system then the analysis can
 be split up using the ``cwinpy_heterodyne_pipeline`` pipeline script (see :ref:`Running using
-HTCondor`). In some cases you may need to
-`generate a SciToken <https://computing.docs.ligo.org/guide/auth/scitokens/#get>`__ to allow the
-analysis script to access proprietary frame files, e.g.,:
-
-.. code:: bash
-
-   htgettoken --audience https://datafind.ligo.org --scope gwdatafind.read -a vault.ligo.org --issuer igwn
-   condor_vault_storer -v igwn
+HTCondor`).
+
+.. note::
+
+   For LVK users, if requiring access to `proprietary IGWN frame data
+   <https://computing.docs.ligo.org/guide/auth/scitokens/#data>`__ (i.e., non-public frames visible
+   to those within the LVK collaboration) via `CVMFS
+   <https://computing.docs.ligo.org/guide/cvmfs/>`__ rather than frames locally stored on a cluster,
+   you will need to `generate a SciToken
+   <https://computing.docs.ligo.org/guide/auth/scitokens/#get>`__ to allow the analysis script to
+   access them. To enable the pipeline script to find science segments and frame URLs you must
+   `generate a token <https://computing.docs.ligo.org/guide/auth/scitokens/#get-default>`__ with:
+
+   .. code:: bash
+
+      htgettoken -a vault.ligo.org -i igwn
+
+   .. note::
+
+      On some systems still using `X.509 authentication
+      <https://computing.docs.ligo.org/guide/auth/x509/>`__ you may instead need to run the
+      following to find the science segments and frame URLs:
+
+      .. code:: bash
+
+         ligo-proxy-init -p albert.einstein
+
+   and then run:
+
+   .. code:: bash
+
+      condor_vault_storer -v igwn
+
+   to allow the HTCondor jobs to access the credentials.
 
 The ``cwinpy_heterodyne_pipeline`` should be used for most practical purposes, while
 ``cwinpy_heterodyne`` is generally useful for short tests.
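The commands in this note come from external tooling (the htgettoken and HTCondor packages) rather than CWInPy itself. As a quick sanity check before submitting a pipeline, you could verify which of the authentication tools are actually installed; this is a hypothetical helper sketch, not part of the documented workflow:

```shell
#!/usr/bin/env bash
# Hypothetical pre-flight check: report which IGWN authentication tools
# referenced in the note above are available on this machine.

for tool in htgettoken condor_vault_storer ligo-proxy-init; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "$tool: found"
    else
        echo "$tool: missing (install the htgettoken / HTCondor packages)"
    fi
done
```

On a machine without the tools this simply reports them as missing, so it is safe to run anywhere before deciding which of the token flows above applies.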
docs/knope/knope.rst (+30 −5) @ dfa85205

@@ -36,13 +36,38 @@ provided :ref:`here<Local use of HTCondor>`) and the instructions here will focu
 brief description of using ``cwinpy_knope`` will be provided, although this should primarily be
 used, if required, for quick testing.
 
-For LVK users, if running on proprietary data you may need to
-`generate a SciToken <https://computing.docs.ligo.org/guide/auth/scitokens/#get>`__ to allow the
-analysis scripts to access frame files, e.g.,:
-
-.. code:: bash
-
-   htgettoken --audience https://datafind.ligo.org --scope gwdatafind.read -a vault.ligo.org --issuer igwn
-   condor_vault_storer -v igwn
+.. note::
+
+   For LVK users, if requiring access to `proprietary IGWN frame data
+   <https://computing.docs.ligo.org/guide/auth/scitokens/#data>`__ (i.e., non-public frames visible
+   to those within the LVK collaboration) via `CVMFS
+   <https://computing.docs.ligo.org/guide/cvmfs/>`__ rather than frames locally stored on a cluster,
+   you will need to `generate a SciToken
+   <https://computing.docs.ligo.org/guide/auth/scitokens/#get>`__ to allow the analysis script to
+   access them. To enable the pipeline script to find science segments and frame URLs you must
+   `generate a token <https://computing.docs.ligo.org/guide/auth/scitokens/#get-default>`__ with:
+
+   .. code:: bash
+
+      htgettoken -a vault.ligo.org -i igwn
+
+   .. note::
+
+      On some systems still using `X.509 authentication
+      <https://computing.docs.ligo.org/guide/auth/x509/>`__ you may instead need to run the
+      following to find the science segments and frame URLs:
+
+      .. code:: bash
+
+         ligo-proxy-init -p albert.einstein
+
+   and then run:
+
+   .. code:: bash
+
+      condor_vault_storer -v igwn
+
+   to allow the HTCondor jobs to access the credentials.
 
 In many of the examples below we will assume that you are able to access the open LIGO and Virgo
 data available from the `GWOSC <https://gwosc.org/>`__ via `CVMFS
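The token-then-submit sequence described in the note can be sketched as a small wrapper script. This is a hypothetical example, not part of CWInPy: ``knope_config.ini`` is an illustrative configuration filename, and ``DRY_RUN`` defaults to printing the commands rather than executing them, so the sketch is safe to try outside a cluster:

```shell
#!/usr/bin/env bash
# Hypothetical wrapper for the credential + submission sequence in the note.
# With DRY_RUN=1 (the default) each command is only printed; set DRY_RUN=0
# on a real cluster with htgettoken and HTCondor installed.
set -euo pipefail

DRY_RUN="${DRY_RUN:-1}"

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

run htgettoken -a vault.ligo.org -i igwn     # obtain a vault SciToken
run condor_vault_storer -v igwn              # store it for the HTCondor jobs
run cwinpy_knope_pipeline knope_config.ini   # build and submit the analysis DAG
```

The dry-run guard is just a convenience for inspecting the sequence; the three commands themselves are the ones the updated documentation describes.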