Pipeline Trigger Uploads to GraceDB
The online pipelines should conform to the following policies on the upload of trigger information and on the quality of the uploaded data.
Each pipeline/search should cluster its triggers and upload a single event to GraceDB. For searches other than "EarlyWarning", a maximum of TWO uploads may be provided: the two uploads must not be identical, the second upload supersedes the first, and the second event should have a lower FAR than the first. The pipeline/search should clearly state the reason for the second upload to the "lowlatency" group. "EarlyWarning" searches are allowed to submit multiple uploads, but they should provide the "lowlatency" group with the schedule/timeline of the envisioned uploads.
To participate in the generation of alerts during O4, the online pipeline MUST analyze (at least once) the Mock Data Challenge data, both the “Gaussian Injection Set” and the “O3 Replay Injection Set”, and upload the resulting events to GraceDB (playground).
Pipelines should communicate a possible “detection” by creating a “G-event” in GraceDB, following the same procedure used during O3, i.e., they should create a new event in GraceDB using the command:
gdb.createEvent(group=group, search=search, pipeline=pipeline, filename=filename, filecontents=data)
and upload the one additional file that is required to generate the localization. The required additional file differs depending on the group to which the pipeline belongs:
- if (group=="CBC") they should upload a file named “psd.xml.gz” containing the PSD associated with the event. The upload should be tagged "psd".
- if (group=="Burst") they should upload an event localization “fits” file whose name depends on the pipeline: it should be named [pipeline].fits (e.g., if the pipeline name is pipeline="cWB", then filename="cWB.fits") and should be tagged "sky_loc".
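The full sequence (event creation plus the required group-dependent follow-up upload) could be sketched as follows. This is a sketch only: it assumes the `ligo-gracedb` Python client, the actual network calls are commented out, and the trigger payload, filenames, and GraceID are placeholders, not real pipeline output.

```python
import json

# Hypothetical trigger payload; a real pipeline would upload its own
# trigger file contents here instead.
data = b"<LIGO_LW>...</LIGO_LW>"
group, pipeline, search = "CBC", "gstlal", "AllSky"
filename = "coinc.xml"

# The actual client calls, commented out so this sketch runs offline:
# from ligo.gracedb.rest import GraceDb
# gdb = GraceDb("https://gracedb-playground.ligo.org/api/")
# response = gdb.createEvent(group=group, search=search, pipeline=pipeline,
#                            filename=filename, filecontents=data)
# graceid = response.json()["graceid"]
graceid = "G123456"  # placeholder for the id returned by createEvent

# Pick the required follow-up file and tag from the pipeline's group.
if group == "CBC":
    followup_file, tag = "psd.xml.gz", "psd"
else:  # group == "Burst"
    followup_file, tag = f"{pipeline}.fits", "sky_loc"

# The follow-up upload itself, again commented out:
# gdb.writeLog(graceid, "required follow-up file", filename=followup_file,
#              tag_name=[tag])
print(graceid, followup_file, tag)
```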
Additionally, CBC pipelines are allowed (but not required) to upload a property file (JSON-serialized) and, optionally, a pipeline-provided localization file. The optional uploaded files should be named:
- p_astro.json: a file containing the p_astro evaluation of the trigger. The file should conform to the following format: 'json.dumps({"Terrestrial": 0.0043, "BNS": 0.3111, "BBH": 0.6022, "NSBH": 0.0824})'. The sum of the entries should be 1.0, and the upload should be tagged "p_astro".
- descr.json: a file containing additional information about the trigger provided by the pipeline. It may provide classification information and p_astro information; the format of the file, the required syntax, and the required information are shown below. It may replace the upload of the p_astro.json file; if both files are uploaded, their contents should be consistent.
- [pipeline].fits: a localization “fits” file whose name depends on the pipeline (e.g., if the pipeline name is pipeline="gstlal", then filename="gstlal.fits"). The upload should be tagged "sky_loc".
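A minimal sketch of producing a conforming p_astro.json payload and validating it before upload. The probability values are taken from the format example above, not from real pipeline output, and the commented-out `writeLog` call assumes the `ligo-gracedb` client.

```python
import json
import math

# Example p_astro values from the format specification above.
p_astro = {"Terrestrial": 0.0043, "BNS": 0.3111, "BBH": 0.6022, "NSBH": 0.0824}

# The entries must sum to 1.0; check before uploading.
assert math.isclose(sum(p_astro.values()), 1.0)

payload = json.dumps(p_astro)

# The upload itself, commented out so this sketch runs offline
# (graceid would come from the earlier createEvent call):
# gdb.writeLog(graceid, "p_astro", filename="p_astro.json",
#              filecontents=payload, tag_name=["p_astro"])
```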
Participating pipelines are:
| group | pipeline | search | .fits | #uploads | p_astro.json | descr.json |
|---|---|---|---|---|---|---|
| CBC | gstlal | AllSky | NO | (?) | YES | NO |
| CBC | pycbc | AllSky | NO | 2 | NO | NO |
| CBC | MBTAOnline | AllSky | NO | 1 | NO | NO |
| CBC | spiir | AllSky | NO | 1 | NO | NO |
| CBC | spiir | HighMass | NO | 1 | NO | NO |
| CBC | spiir | LowMass | NO | 1 | NO | NO |
| Burst | cWB | AllSky | cWB.fits | 1 | NO | NO |
| Burst | cWB | BBH | cWB.fits | 1 | NO | NO |
| Burst | cWB | IBBH | cWB.fits | 1 | NO | NO |
IT NEEDS TO BE CLARIFIED HOW WE COLLECT INFORMATION ON THE INVOLVED DETECTOR(S): H, L, V, K, HL, HLV, HV, LV, HLVK, .....
A possible example of the contents of the file descr.json is:
json.dumps({'p_astro': {"Terrestrial": 0.0043, "BNS": 0.3111, "BBH": 0.6022, "NSBH": 0.0824},
            'em_bright': {'HasNs': 0.3952, 'HasRemnant': 0.3421, 'HasMassGap': 0.27}})
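Since descr.json may replace or accompany p_astro.json, the two p_astro blocks must agree. A sketch of such a consistency check, using the example values from this document (the variable names and the check itself are illustrative, not a mandated procedure):

```python
import json
import math

# Contents of descr.json as in the example above.
descr = {
    "p_astro": {"Terrestrial": 0.0043, "BNS": 0.3111, "BBH": 0.6022, "NSBH": 0.0824},
    "em_bright": {"HasNs": 0.3952, "HasRemnant": 0.3421, "HasMassGap": 0.27},
}
# Contents of p_astro.json, if it is also uploaded.
p_astro = {"Terrestrial": 0.0043, "BNS": 0.3111, "BBH": 0.6022, "NSBH": 0.0824}

# The two p_astro blocks must be consistent, and the entries must sum to 1.0.
assert descr["p_astro"] == p_astro
assert math.isclose(sum(p_astro.values()), 1.0)

descr_payload = json.dumps(descr)
```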