@@ -43,12 +43,12 @@ An accounting tag used to measure LDG computational use. See https://ldas-gridmo
GROUP_USER=albert.einstein
-This should be your albert.einstein user idenification. This is only needed if using a shared account. ::
+This should be your albert.einstein user identification. This is only needed if using a shared account. ::
IFOS = H1 L1
MIN_IFOS = 2
-Define which detectors to include within the analysis. H1, L1, and V1 are currently supported. Set minimum number of operational dectors for which to analyise. Able to analyse single detector time. ::
+Define which detectors to include in the analysis. H1, L1, and V1 are currently supported. Set the minimum number of operational detectors required to analyse a stretch of time; single-detector time can be analysed. ::
START = 1187000000
STOP = 1187100000
...
...
@@ -149,20 +149,20 @@ Produce CAT1 vetoes file. ::
-Include gating times into CAT3 veto times files. The gating files contain aditional times to veto that are not included within the veto definer file. The ascii files are converted into readable xml files with lauras_txt_files_to_xml. ::
+Include gating times into the CAT3 veto times files. The gating files contain additional times to veto that are not included within the veto definer file. The ASCII files are converted into readable XML files with lauras_txt_files_to_xml. ::
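The real lauras_txt_files_to_xml tool writes LIGO_LW XML tables; as a rough illustration only of the ASCII-to-XML conversion step (simplified schema and toy data, not the tool's actual output format), the idea is:

```python
import xml.etree.ElementTree as ET

def txt_to_xml(lines):
    """Toy stand-in for lauras_txt_files_to_xml: read "start stop" GPS
    pairs from ASCII lines and emit them as XML. The real tool writes
    LIGO_LW XML tables; this simplified schema is for illustration only."""
    root = ET.Element("gating_vetoes")
    for line in lines:
        start, stop = line.split()
        ET.SubElement(root, "segment", start=start, stop=stop)
    return ET.tostring(root, encoding="unicode")

print(txt_to_xml(["1187000000 1187000010", "1187000100 1187000105"]))
```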
-Combine all vetoe files into single vetoes.xml.gz file.
+Combine all veto files into a single vetoes.xml.gz file.
tisi.xml.gz and inj_tisi.xml.gz files
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...
...
@@ -280,7 +280,7 @@ After obtaining a bank gstlal_inspiral_add_template_ids needs to be run on it in
--num-banks $(NUMBANKS) \
H1-TMPLTBANK-$(START)-2048.xml
-This program needs to be run on the template bank being used to split it up into sub banks that will be passed to the singular value decompositon code within the pipeline.
+This program needs to be run on the template bank being used to split it up into sub banks that will be passed to the singular value decomposition code within the pipeline.
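Conceptually, the splitter partitions the template list into consecutive sub banks that the SVD jobs then consume. A toy sketch of that partitioning (this is not the gstlal bank splitter's implementation, which also sorts templates by waveform parameters and has more options):

```python
def split_into_sub_banks(templates, sub_bank_size):
    """Partition a list of templates into consecutive sub banks.

    Toy illustration of the idea behind bank splitting for SVD; the
    real splitter also orders templates by parameters such as chirp
    mass so that each sub bank contains similar waveforms.
    """
    return [templates[i:i + sub_bank_size]
            for i in range(0, len(templates), sub_bank_size)]

# Example: 10 template IDs split into sub banks of 4.
print(split_into_sub_banks(list(range(10)), 4))
# -> [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```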
Run gstlal_inspiral_pipe to produce the offline analysis DAG
-A set of sed commands to to make the memory requet of jobs dynamical. These commands shouldn't be needed for most standard cases, but if you notice that jobs are being placed on hold by condor for going over their requested memory allowcation, then these should allow the jobs to run. ::
+A set of sed commands to make the memory requests of jobs dynamic. These commands shouldn't be needed for most standard cases, but if you notice that jobs are being placed on hold by Condor for exceeding their requested memory allocation, then these should allow the jobs to run. ::
sed -i "/^environment/s?\$$?GSTLAL_FIR_WHITEN=0;?" *.sub
-A sed command to set 'GSTLAL_FIR_WHITEN=0' for all jobs. Required in all cases. This environment variable is sometimes also set within the env.sh file when sourcing an enviroment, if it was built by the user. This sed command should be included if using the system build. ::
+A sed command to set 'GSTLAL_FIR_WHITEN=0' for all jobs. Required in all cases. This environment variable is sometimes also set within the env.sh file when sourcing an environment, if it was built by the user. This sed command should be included if using the system build. ::
sed -i 's@environment = GST_REGISTRY_UPDATE=no;@environment = "GST_REGISTRY_UPDATE=no LD_PRELOAD=$(MKLROOT)/lib/intel64/libmkl_core.so"@g' gstlal_inspiral_injection_snr.sub
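The dynamic memory-request rewrite described earlier can be sketched like this; the submit file created here and the exact Condor ClassAd expression are illustrative assumptions, not the Makefile's actual commands:

```shell
# Create a toy Condor submit file (hypothetical content) to demonstrate.
printf 'request_memory = 2000\nqueue 1\n' > example.sub
# Replace the static request with a dynamic expression: a common Condor
# idiom that retries with a larger request after a job is evicted for
# exceeding the memory it originally asked for.
sed -i 's/^request_memory = .*/request_memory = ifthenelse(isUndefined(MemoryUsage), 2048, 3*MemoryUsage)/' example.sub
cat example.sub
```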
...
...
@@ -357,5 +357,5 @@ Assuming you have all the prerequisites, running the BNS Makefile as it is only
* Line 129: Set path to veto definer file
* Line 183: Set path to Makefile.offline_analysis_rules
-Then to run it, ensuring you have the correct envirnment set, run with: make -f Makefile.BNS_HL_test_dag_O2
+Then, ensuring you have the correct environment set, run it with: make -f Makefile.BNS_HL_test_dag_O2