Compare revisions

Changes are shown as if the source revision was being merged into the target revision.

Commits on Source (20): 251 additions and 55 deletions
@@ -62,6 +62,9 @@ test/poetry/python3.10:
 test/poetry/python3.11:
   extends: .test-poetry
   image: python:3.11
+test/poetry/python3.12:
+  extends: .test-poetry
+  image: python:3.12
 
 # Run test suite using wheel and bleeding-edge dependencies
 .test-wheel:
@@ -81,6 +84,9 @@ test/wheel/python3.10:
 test/wheel/python3.11:
   extends: .test-wheel
   image: python:3.11
+test/wheel/python3.12:
+  extends: .test-wheel
+  image: python:3.12
 
 lint:
   stage: test
......
@@ -31,6 +31,19 @@ Changelog
 - Roll back ligo-followup-advocate to 1.2.9 until SSM triggers are planned
   in production.
+- Direct GWSkyNet tasks to the ``openmp`` queue and retire the old ``skynet``
+  queue. The dedicated ``gwskynet`` queue was necessary due to high memory
+  usage that was fixed in GWSkyNet 2.5.1.
+- Reallocate the tasks for flattening and unflattening sky maps to a
+  dedicated Celery queue for high memory usage tasks. This should prevent
+  out-of-memory conditions that had resulted from these tasks being routed in
+  round-robin fashion to workers in the high-concurrency general-purpose
+  queue.
+- Fix coincidence search so SubGRB events can only be found in coincidence
+  with CBC-like events (group is CBC or from CWB BBH search).
 
 2.5.1 "Cactus cat" (2024-08-20)
 -------------------------------
......
@@ -299,13 +299,18 @@ of several processes:
     A Celery worker that is dedicated to computing source properties of
     compact binary coalescences.
 
-10. **General-Purpose Worker**
+10. **High Memory Worker**
+
+    A Celery worker with low concurrency that is dedicated to running tasks
+    that use a large amount of memory.
+
+11. **General-Purpose Worker**
 
     A Celery worker that accepts all other tasks. This worker also runs an
     :doc:`embedded IGWN Alert listener service <gwcelery.igwn_alert>` that is started
     and stopped as a bootstep.
 
-10. **Flask Web Application**
+12. **Flask Web Application**
 
     A web application that provides forms to manually initiate certain tasks,
     including sending an update alert or creating a mock event.
......
@@ -139,6 +139,7 @@ following commands::
     $ gwcelery worker -l info -n gwcelery-superevent-worker -Q superevent -c 1
     $ gwcelery worker -l info -n gwcelery-voevent-worker -Q voevent -P solo
     $ gwcelery worker -l info -n gwcelery-em-bright-worker -Q em-bright -c 2 --prefetch-multiplier 1
+    $ gwcelery worker -l info -n gwcelery-highmem-worker -Q highmem -c 2 --prefetch-multiplier 1
     $ gwcelery flask run
 
 .. hint::
......
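The new ``highmem`` queue only receives work from tasks that name it explicitly in their decorator. A minimal sketch of the routing pattern (the app and task below are illustrative stand-ins, not gwcelery code)::

    from celery import Celery

    app = Celery('example')

    @app.task(shared=False, queue='highmem')
    def digest_large_sky_map(filecontents):
        """Toy stand-in for a memory-hungry task that should run on the
        low-concurrency high-memory worker instead of the general pool."""
        return len(filecontents)

Only workers started with ``-Q highmem``, such as ``gwcelery-highmem-worker`` above, will consume tasks routed this way.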
"""Application configuration for ``minikube`` local installation.""" """Application configuration for ``minikube`` local installation."""
import os
from . import * # noqa: F401, F403 from . import * # noqa: F401, F403
expose_to_public = True expose_to_public = True
"""Set to True if events meeting the public alert threshold really should be """Set to True if events meeting the public alert threshold really should be
exposed to the public.""" exposed to the public."""
igwn_alert_server = 'kafka://hopskotch-server' gracedb_host = os.getenv('GRACEDB_HOSTNAME',
'gracedb.default.svc.cluster.local')
"""GraceDB host."""
igwn_alert_server = os.getenv('IGWN_HOSTNAME',
'kafka://hopskotch-server')
"""IGWN alert server: None == DEFAULT_SERVER""" """IGWN alert server: None == DEFAULT_SERVER"""
igwn_alert_noauth = True igwn_alert_noauth = True
...@@ -15,9 +23,6 @@ igwn_alert_noauth = True ...@@ -15,9 +23,6 @@ igwn_alert_noauth = True
igwn_alert_group = 'default' igwn_alert_group = 'default'
"""IGWN alert group.""" """IGWN alert group."""
gracedb_host = 'gracedb.default.svc.cluster.local'
"""GraceDB host."""
mock_events_simulate_multiple_uploads = False mock_events_simulate_multiple_uploads = False
"""If True, then upload each mock event several times in rapid succession with """If True, then upload each mock event several times in rapid succession with
random jitter in order to simulate multiple pipeline uploads.""" random jitter in order to simulate multiple pipeline uploads."""
...@@ -28,8 +33,10 @@ kafka_consumer_config = { ...@@ -28,8 +33,10 @@ kafka_consumer_config = {
messages to be consumed. The values are a dictionary of the URL to listen to messages to be consumed. The values are a dictionary of the URL to listen to
and information about the message serializer.""" and information about the message serializer."""
kafka_alert_server = os.getenv('KAFKA_HOSTNAME',
'kafka://hopskotch-server')
kafka_alert_config = { kafka_alert_config = {
'scimma': {'url': 'kafka://hopskotch-server/igwn.gwalert-minikube', 'scimma': {'url': kafka_alert_server + '/igwn.gwalert-minikube',
'suffix': 'avro', 'skymap_encoder': lambda _: _, 'suffix': 'avro', 'skymap_encoder': lambda _: _,
'auth': False} 'auth': False}
} }
......
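With the hostnames now read through ``os.getenv``, the minikube configuration can be pointed at different service endpoints purely through the environment. A self-contained illustration of the fallback behaviour (the override hostname is made up)::

    import os

    # Without an override, the packaged in-cluster default applies.
    os.environ.pop('GRACEDB_HOSTNAME', None)
    default = os.getenv('GRACEDB_HOSTNAME',
                        'gracedb.default.svc.cluster.local')
    assert default == 'gracedb.default.svc.cluster.local'

    # Exporting the variable (for example in the deployment manifest)
    # repoints the application without editing the config module.
    os.environ['GRACEDB_HOSTNAME'] = 'gracedb-playground.example.org'
    assert os.getenv('GRACEDB_HOSTNAME') == 'gracedb-playground.example.org'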
@@ -43,18 +43,18 @@ arguments = "gwcelery worker -l info -n gwcelery-voevent-worker@%h -f %n.log -Q
 description = gwcelery-voevent-worker
 queue
 
-arguments = "gwcelery worker -l info -n gwcelery-kafka-producer-worker@%h -f %n.log -Q kafka-producer -P solo"
-description = gwcelery-kafka-producer-worker
-queue
-
-arguments = "gwcelery worker -l info -n gwcelery-kafka-consumer-worker@%h -f %n.log -Q kafka-consumer -P solo"
-description = gwcelery-kafka-consumer-worker
+arguments = "gwcelery worker -l info -n gwcelery-kafka-worker@%h -f %n.log -Q kafka -P solo"
+description = gwcelery-kafka-worker
 queue
 
 arguments = "gwcelery worker -l info -n gwcelery-em-bright-worker@%h -f %n.log -Q em-bright -c 2 --prefetch-multiplier 1"
 description = gwcelery-em-bright-worker
 queue
 
+arguments = "gwcelery worker -l info -n gwcelery-highmem-worker@%h -f %n.log -Q highmem -c 2 --prefetch-multiplier 1"
+description = gwcelery-highmem-worker
+queue
+
 # Jobs defined below this point will run on specially configured cluster nodes.
 +Online_EMFollow = True
 Requirements = (TARGET.Online_EMFollow =?= True)
@@ -72,10 +72,3 @@ queue
 arguments = "--unset OMP_NUM_THREADS gwcelery-condor-submit-helper gwcelery worker -l info -n gwcelery-openmp-worker-$(Process)@%h -f %n.log -Q openmp -c 1 --prefetch-multiplier 1"
 description = gwcelery-openmp-worker-$(Process)
 queue 15
-
-+Online_GWSkyNet = True
-Requirements = (TARGET.Online_GWSkyNet =?= True)
-
-arguments = "gwcelery-condor-submit-helper gwcelery worker -l info -n gwcelery-skynet-worker@%h -f %n.log -Q skynet -c 1 --prefetch-multiplier 1"
-description = gwcelery-skynet-worker
-queue
@@ -53,9 +53,9 @@ class Receiver(IGWNAlertBootStep):
     def stop(self, consumer):
         super().stop(consumer)
-        if self._client.running:
-            self._client.running = False
-            self._client.stream_obj._consumer.stop()
+        self._client.fatal_restart_running = False
+        self._client.listening = False
+        self._client.listen_stream._consumer.stop()
         self.thread.join()
 
     def info(self, consumer):
......
@@ -234,7 +234,7 @@ class Consumer(KafkaBootStep):
     name = 'Kafka consumer'
 
     def include_if(self, consumer):
-        return 'kafka-consumer' in consumer.app.amqp.queues
+        return 'kafka' in consumer.app.amqp.queues
 
     def start(self, consumer):
         log.info(f'Starting {self.name}, topics: ' +
@@ -285,7 +285,7 @@ class Producer(KafkaBootStep):
     name = 'Kafka producer'
 
     def include_if(self, consumer):
-        return 'kafka-producer' in consumer.app.amqp.queues
+        return 'kafka' in consumer.app.amqp.queues
 
     def start(self, consumer):
         log.info(f'Starting {self.name}, topics: ' +
......
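For context, ``include_if`` is the standard Celery bootstep hook that decides, per worker, whether the step should be installed, so keying it on the worker's queue set is what lets the single ``kafka`` queue drive both the consumer and producer steps. A minimal, self-contained sketch of the pattern (the step class and its body are illustrative, not gwcelery code)::

    from celery import Celery, bootsteps

    app = Celery('example')

    class KafkaStep(bootsteps.ConsumerStep):
        """Start only on workers launched with ``-Q kafka``."""

        def include_if(self, consumer):
            # ``consumer.app.amqp.queues`` holds the queues this worker
            # was told to consume from.
            return 'kafka' in consumer.app.amqp.queues

        def get_consumers(self, channel):
            return []  # a real step would return kombu Consumer objects

    app.steps['consumer'].add(KafkaStep)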
@@ -206,7 +206,7 @@ def _add_external_coinc_to_alert(alert_dict, superevent,
     return alert_dict, combined_skymap
 
 
-@app.task(bind=True, shared=False, queue='kafka-producer', ignore_result=True)
+@app.task(bind=True, shared=False, queue='kafka', ignore_result=True)
 def _upload_notice(self, payload, brokerhost, superevent_id):
     '''
     Upload serialized alert notice to GraceDB
@@ -229,7 +229,7 @@ def _upload_notice(self, payload, brokerhost, superevent_id):
         superevent_id, message, tags=['public', 'em_follow'])
 
 
-@app.task(bind=True, queue='kafka-producer', shared=False)
+@app.task(bind=True, queue='kafka', shared=False)
 def _send(self, alert_dict, skymap, brokerhost, combined_skymap=None):
     """Write the alert to the Kafka topic"""
     # Copy the alert dictionary so we dont modify the original
@@ -258,14 +258,14 @@ def _send(self, alert_dict, skymap, brokerhost, combined_skymap=None):
     return payload
 
 
-@app.task(bind=True, queue='kafka-producer', shared=False)
+@app.task(bind=True, queue='kafka', shared=False)
 def _send_with_combined(self, alert_dict_combined_skymap, skymap, brokerhost):
     alert_dict, combined_skymap = alert_dict_combined_skymap
     return _send(alert_dict, skymap, brokerhost,
                  combined_skymap=combined_skymap)
 
 
-@app.task(bind=True, ignore_result=True, queue='kafka-producer', shared=False)
+@app.task(bind=True, ignore_result=True, queue='kafka', shared=False)
 def send(self, skymap_and_classification, superevent, alert_type,
          raven_coinc=False, combined_skymap_filename=None):
     """Send an public alert to all currently connected kafka brokers.
......
@@ -250,14 +250,28 @@ def handle_grb_igwn_alert(alert):
                                      group='Burst', se_searches=['MDC'])
             return
-        if alert['object']['search'] in ['SubGRB', 'SubGRBTargeted']:
+        elif alert['object']['search'] == 'SubGRB':
+            # Launch search with standard CBC
+            raven.coincidence_search(
+                graceid, alert['object'],
+                searches=['SubGRB'],
+                group='CBC',
+                pipelines=[alert['object']['pipeline']])
+            # Launch search with CWB BBH
+            raven.coincidence_search(
+                graceid, alert['object'],
+                searches=['SubGRB'],
+                group='Burst',
+                se_searches=['BBH'],
+                pipelines=[alert['object']['pipeline']])
+        elif alert['object']['search'] == 'SubGRBTargeted':
             # if sub-threshold GRB, launch search with that pipeline
             raven.coincidence_search(
                 graceid, alert['object'],
-                searches=['SubGRB', 'SubGRBTargeted'],
+                searches=['SubGRBTargeted'],
                 se_searches=['AllSky', 'BBH'],
                 pipelines=[alert['object']['pipeline']])
-        else:
+        elif alert['object']['search'] == 'GRB':
             # launch standard Burst-GRB search
             raven.coincidence_search(graceid, alert['object'],
                                      group='Burst', se_searches=['AllSky'])
......
@@ -24,7 +24,9 @@ def GWSkyNet_model():
     return GWSkyNet.load_GWSkyNet_model()
 
 
-@app.task(queue='skynet', shared=False)
+# FIXME: run GWSkyNet on general-purpose workers
+# once https://git.ligo.org/manleong.chan/gwskynet/-/issues/6 is fixed.
+@app.task(queue='openmp', shared=False)
 def gwskynet_annotation(input_list, SNRs, superevent_id):
     """Perform the series of tasks necessary for GWSkyNet to
......
@@ -384,8 +384,8 @@ def update_coinc_far(coinc_far_dict, superevent, ext_event):
     gracedb.update_superevent(
         superevent_id,
         em_type=ext_event['graceid'],
-        time_coinc_far=coinc_far_dict['temporal_coinc_far'],
-        space_coinc_far=coinc_far_dict['spatiotemporal_coinc_far'])
+        time_coinc_far=coinc_far_dict.get('temporal_coinc_far'),
+        space_coinc_far=coinc_far_dict.get('spatiotemporal_coinc_far'))
 
     return coinc_far_dict
@@ -423,10 +423,10 @@ def keyfunc(event_far):
     search_rank = app.conf['external_search_preference'].get(
         event['search'], -1)
     # Map so more significant FAR is a larger number
-    spacetime_far = coinc_far['spatiotemporal_coinc_far']
+    spacetime_far = coinc_far.get('spatiotemporal_coinc_far')
     spacetime_rank = \
         -spacetime_far if spacetime_far is not None else -float('inf')
-    temporal_far = coinc_far['temporal_coinc_far']
+    temporal_far = coinc_far.get('temporal_coinc_far')
     temporal_rank = \
         -temporal_far if temporal_far is not None else -float('inf')
......
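Switching from item access to ``dict.get`` means a missing FAR entry yields ``None`` instead of raising ``KeyError``, and ``None`` is exactly the case the surrounding ranking code already maps to the lowest possible rank. A standalone illustration (the FAR value is made up)::

    coinc_far = {'temporal_coinc_far': 1e-8}  # no spatiotemporal entry

    spacetime_far = coinc_far.get('spatiotemporal_coinc_far')  # None, no error
    spacetime_rank = \
        -spacetime_far if spacetime_far is not None else -float('inf')
    assert spacetime_rank == -float('inf')

    temporal_far = coinc_far.get('temporal_coinc_far')
    assert -temporal_far == -1e-8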
 import json
 
+import numpy as np
 from ligo.skymap.io import read_sky_map
 from ligo.skymap.postprocess.crossmatch import crossmatch
@@ -32,12 +33,25 @@ def check_high_profile(skymap, em_bright,
             far_list_sorted[0]["search"] != "BBH":
         gracedb.create_label.si(
             'HIGH_PROFILE', superevent_id).delay()
+        gracedb.upload.delay(None, None, superevent_id,
+            'Superevent labeled <font color="red">HIGH_PROFILE</font> since '
+            'event with lowest FAR is a Burst event.',
+            tags=['em_follow'])
         return "Event with the lowest FAR is a Burst event. Applying label"
 
     # annotation number condition
     preferred_event = superevent['preferred_event_data']
+    if preferred_event["search"] == "SSM":
+        gracedb.create_label.si(
+            'HIGH_PROFILE', superevent_id).delay()
+        gracedb.upload.delay(None, None, superevent_id,
+            'Superevent labeled <font color="red">HIGH_PROFILE</font> since '
+            'preferred event is from SSM search.',
+            tags=['em_follow'])
+        return "Preferred event is from SSM. Applying label"
     if preferred_event["group"] == "CBC":
         em_bright_dict = json.loads(em_bright)
         has_remnant = em_bright_dict['HasRemnant']
         pastro_dict = json.loads(p_astro)
@@ -50,10 +64,40 @@ def check_high_profile(skymap, em_bright,
         cl = 90
         result = crossmatch(gw_skymap, contours=[cl / 100])
         sky_area = result.contour_areas[0]
+        # This is commented out while we figure out the distance cutoff
+        # is_far_away = not (gw_skymap.meta.get('distmean', np.nan) < 2000)
 
         if p_terr < 0.5:
-            if (p_bns > 0.1 or p_nsbh > 0.1 or has_remnant > 0.1 or sky_area < 100):  # noqa: E501
+            if p_bns > 0.1:
+                gracedb.create_label.si(
+                    'HIGH_PROFILE', superevent_id).delay()
+                gracedb.upload.delay(None, None, superevent_id,
+                    'Superevent labeled <font color="red">HIGH_PROFILE</font> since '
+                    'because p_BNS > 10%.',
+                    tags=['em_follow'])
+                return "p_BNS condition satisfied. Applying label"
+            elif p_nsbh > 0.1:
+                gracedb.create_label.si(
+                    'HIGH_PROFILE', superevent_id).delay()
+                gracedb.upload.delay(None, None, superevent_id,
+                    'Superevent labeled <font color="red">HIGH_PROFILE</font> since '
+                    'because p_NSBH> 10%.',
+                    tags=['em_follow'])
+                return "p_NSBH condition satisfied. Applying label"
+            elif has_remnant > 0.1:
+                gracedb.create_label.si(
+                    'HIGH_PROFILE', superevent_id).delay()
+                gracedb.upload.delay(None, None, superevent_id,
+                    'Superevent labeled <font color="red">HIGH_PROFILE</font> since '
+                    'because p_HasRemnant> 10%.',
+                    tags=['em_follow'])
+                return "p_HasRemnant condition satisfied. Applying label"
+            elif sky_area < 100:
                 gracedb.create_label.si(
                     'HIGH_PROFILE', superevent_id).delay()
-                return "Annotations condition satisfied. Applying label"
+                gracedb.upload.delay(None, None, superevent_id,
+                    'Superevent labeled <font color="red">HIGH_PROFILE</font> since '
+                    'because area of 90% confidence level '
+                    'in the skymap is < 100 sq.deg ',
+                    tags=['em_follow'])
+                return "Skymap condition satisfied. Applying label"
     return "No conditions satisfied. Skipping"
@@ -151,7 +151,7 @@ def plot_volume(filecontents):
         return pngfile.read()
 
 
-@app.task(shared=False)
+@app.task(shared=False, queue='highmem')
 def flatten(filecontents, filename):
     """Convert a HEALPix FITS file from multi-resolution UNIQ indexing to the
     more common IMPLICIT indexing using the command-line tool
@@ -165,7 +165,7 @@ def flatten(filecontents, filename):
     return open(outfilename, 'rb').read()
 
 
-@app.task(shared=False)
+@app.task(shared=False, queue='highmem')
 def unflatten(filecontents, filename):
     """Convert a HEALPix FITS file to multi-resolution UNIQ indexing from
     the more common IMPLICIT indexing using the command-line tool
......
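Because the queue is set in the task decorator itself, callers of ``flatten`` and ``unflatten`` need no changes: a plain asynchronous invocation is delivered to whichever worker consumes ``highmem``. An illustrative call, assuming the usual module path ``gwcelery.tasks.skymaps`` and a locally available FITS file (the file name is made up)::

    from gwcelery.tasks import skymaps

    with open('bayestar.multiorder.fits', 'rb') as fitsfile:
        # Runs on the high-memory worker, not the general-purpose pool.
        result = skymaps.flatten.delay(fitsfile.read(), 'bayestar.fits.gz')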
<?xml version = '1.0' encoding = 'UTF-8'?>
<voe:VOEvent
ivorn="ivo://nasa.gsfc.gcn/Fermi#GBM_Fin_Pos_2018-05-24T09:58:26.31_548848711_0-566"
role="observation" version="2.0"
xmlns:voe="http://www.ivoa.net/xml/VOEvent/v2.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.ivoa.net/xml/VOEvent/v2.0 http://www.ivoa.net/xml/VOEvent/VOEvent-v2.0.xsd" >
<Who>
<AuthorIVORN>ivo://nasa.gsfc.tan/gcn</AuthorIVORN>
<Author>
<shortName>Fermi (via VO-GCN)</shortName>
<contactName>Julie McEnery</contactName>
<contactPhone>+1-301-286-1632</contactPhone>
<contactEmail>Julie.E.McEnery@nasa.gov</contactEmail>
</Author>
<Date>2018-05-24T18:35:45</Date>
<Description>This VOEvent message was created with GCN VOE version: 1.25 07feb18</Description>
</Who>
<What>
<Param name="Packet_Type" value="115" />
<Param name="Pkt_Ser_Num" value="15" />
<Param name="TrigID" value="548848711" ucd="meta.id" />
<Param name="Sequence_Num" value="0" ucd="meta.id.part" />
<Param name="Burst_TJD" value="18262" unit="days" ucd="time" />
<Param name="Burst_SOD" value="35906.31" unit="sec" ucd="time" />
<Param name="Burst_Inten" value="0" unit="cts" ucd="phot.count" />
<Param name="Data_Integ" value="0.000" unit="sec" ucd="time.interval" />
<Param name="Burst_Signif" value="0.00" unit="sigma" ucd="stat.snr" />
<Param name="Phi" value="262.01" unit="deg" ucd="pos.az.azi" />
<Param name="Theta" value="64.10" unit="deg" ucd="pos.az.zd" />
<Param name="Algorithm" value="415" unit="dn" />
<Param name="Lo_Energy" value="50000" unit="keV" />
<Param name="Hi_Energy" value="300000" unit="keV" />
<Param name="Trigger_ID" value="0x0" />
<Param name="Misc_flags" value="0x40000001" />
<Group name="Trigger_ID" >
<Param name="Def_NOT_a_GRB" value="false" />
<Param name="Target_in_Blk_Catalog" value="false" />
<Param name="Human_generated" value="false" />
<Param name="Robo_generated" value="true" />
<Param name="Spatial_Prox_Match" value="false" />
<Param name="Temporal_Prox_Match" value="false" />
<Param name="Test_Submission" value="false" />
</Group>
<Group name="Misc_Flags" >
<Param name="Values_Out_of_Range" value="false" />
<Param name="Flt_Generated" value="false" />
<Param name="Gnd_Generated" value="true" />
<Param name="CRC_Error" value="false" />
</Group>
<Param name="LightCurve_URL" value="http://heasarc.gsfc.nasa.gov/FTP/fermi/data/gbm/triggers/2018/bn180524416/quicklook/glg_lc_medres34_bn180524416.gif" ucd="meta.ref.url" />
<Param name="LocationMap_URL" value="http://heasarc.gsfc.nasa.gov/FTP/fermi/data/gbm/triggers/2018/bn180524416/quicklook/glg_locplot_all_bn180524416.png" ucd="meta.ref.url" />
<Param name="Coords_Type" value="1" unit="dn" />
<Param name="Coords_String" value="source_object" />
<Group name="Obs_Support_Info" >
<Description>The Sun and Moon values are valid at the time the VOEvent XML message was created.</Description>
<Param name="Sun_RA" value="61.53" unit="deg" ucd="pos.eq.ra" />
<Param name="Sun_Dec" value="20.86" unit="deg" ucd="pos.eq.dec" />
<Param name="Sun_Distance" value="94.75" unit="deg" ucd="pos.angDistance" />
<Param name="Sun_Hr_Angle" value="-5.25" unit="hr" />
<Param name="Moon_RA" value="187.65" unit="deg" ucd="pos.eq.ra" />
<Param name="Moon_Dec" value="1.42" unit="deg" ucd="pos.eq.dec" />
<Param name="MOON_Distance" value="59.39" unit="deg" ucd="pos.angDistance" />
<Param name="Moon_Illum" value="77.31" unit="%" ucd="arith.ratio" />
<Param name="Galactic_Long" value="264.32" unit="deg" ucd="pos.galactic.lon" />
<Param name="Galactic_Lat" value="7.50" unit="deg" ucd="pos.galactic.lat" />
<Param name="Ecliptic_Long" value="160.83" unit="deg" ucd="pos.ecliptic.lon" />
<Param name="Ecliptic_Lat" value="-50.93" unit="deg" ucd="pos.ecliptic.lat" />
</Group>
<Description>The Fermi-GBM location of a transient.</Description>
</What>
<WhereWhen>
<ObsDataLocation>
<ObservatoryLocation id="GEOLUN" />
<ObservationLocation>
<AstroCoordSystem id="UTC-FK5-GEO" />
<AstroCoords coord_system_id="UTC-FK5-GEO">
<Time unit="s">
<TimeInstant>
<ISOTime>2018-05-24T09:58:26.31Z</ISOTime>
</TimeInstant>
</Time>
<Position2D unit="deg">
<Name1>RA</Name1>
<Name2>Dec</Name2>
<Value2>
<C1>140.0500</C1>
<C2>-39.0499</C2>
</Value2>
<Error2Radius>5.6000</Error2Radius>
</Position2D>
</AstroCoords>
</ObservationLocation>
</ObsDataLocation>
<Description>The RA,Dec coordinates are of the type: source_object.</Description>
</WhereWhen>
<How>
<Description>Fermi Satellite, GBM Instrument</Description>
<Reference uri="http://gcn.gsfc.nasa.gov/fermi.html" type="url" />
</How>
<Why importance="0.95">
<Inference probability="1.0">
<Concept>process.variation.burst;em.gamma</Concept>
</Inference>
</Why>
<Description>
</Description>
</voe:VOEvent>
 <?xml version = '1.0' encoding = 'UTF-8'?>
 <voe:VOEvent
-ivorn="ivo://nasa.gsfc.gcn/Fermi#GBM_Fin_Pos2018-05-24T09:58:26.31_548848711_0-566"
+ivorn="ivo://nasa.gsfc.gcn/Fermi#GBM_Flt_Pos_2018-05-24T09:58:26.31_548848711_0-566"
 role="observation" version="2.0"
 xmlns:voe="http://www.ivoa.net/xml/VOEvent/v2.0"
 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
@@ -17,7 +17,7 @@
 <Description>This VOEvent message was created with GCN VOE version: 1.25 07feb18</Description>
 </Who>
 <What>
-<Param name="Packet_Type" value="115" />
+<Param name="Packet_Type" value="111" />
 <Param name="Pkt_Ser_Num" value="15" />
 <Param name="TrigID" value="548848711" ucd="meta.id" />
 <Param name="Sequence_Num" value="0" ucd="meta.id.part" />
......
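The renamed fixture hinges on the GCN ``Packet_Type`` parameter: 111 identifies a Fermi-GBM flight-position notice and 115 the final-position notice. A small sketch of pulling that value out of a VOEvent payload (a standalone example, not the gwcelery parsing code)::

    import lxml.etree

    NOTICE_KINDS = {111: 'Fermi-GBM flight position',
                    115: 'Fermi-GBM final position'}

    def notice_kind(voevent_bytes):
        """Classify a GCN/Fermi VOEvent by its Packet_Type parameter."""
        root = lxml.etree.fromstring(voevent_bytes)
        param = root.find("./What/Param[@name='Packet_Type']")
        return NOTICE_KINDS.get(int(param.attrib['value']), 'other')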
No preview for this file type
@@ -12,6 +12,7 @@ from . import data
 @pytest.mark.parametrize('pipeline, path',
                          [['Fermi', 'fermi_grb_gcn.xml'],
+                          ['Fermi_final', 'fermi_final_gcn.xml'],
                           ['INTEGRAL', 'integral_grb_gcn.xml'],
                           ['INTEGRAL_MDC', 'integral_mdc_gcn.xml']])
 @patch('gwcelery.tasks.external_skymaps.create_upload_external_skymap.run')
@@ -41,7 +42,7 @@ def test_handle_create_grb_event(mock_create_event,
     mock_create_event.assert_called_once_with(
         filecontents=text,
         search='GRB',
-        pipeline='INTEGRAL' if pipeline == 'INTEGRAL_MDC' else pipeline,
+        pipeline=pipeline.split('_')[0],
         group='External',
         labels=None)
     calls = [
@@ -65,8 +66,10 @@ def test_handle_create_grb_event(mock_create_event,
                 ['data_quality'])
     ]
     mock_upload.assert_has_calls(calls, any_order=True)
-    gcn_type_dict = {'Fermi': 115, 'INTEGRAL': 53, 'INTEGRAL_MDC': 53}
+    gcn_type_dict = {'Fermi': 111, 'Fermi_final': 115,
+                     'INTEGRAL': 53, 'INTEGRAL_MDC': 53}
     time_dict = {'Fermi': '2018-05-24T18:35:45',
+                 'Fermi_final': '2018-05-24T18:35:45',
                  'INTEGRAL': '2017-02-03T19:00:05',
                  'INTEGRAL_MDC': '2023-04-04T06:31:24'}
     mock_create_upload_external_skymap.assert_called_once_with(
@@ -90,7 +93,7 @@ def test_handle_create_grb_event(mock_create_event,
         },
         gcn_type_dict[pipeline], time_dict[pipeline])
     # If Fermi FINAL notice, check we try to grab sky map from HEASARC
-    if pipeline == 'Fermi':
+    if 'final' in path:
         mock_get_upload_external_skymap.assert_called_once_with(
             {'graceid': 'E1',
              'gpstime': 1,
@@ -708,8 +711,10 @@ def test_handle_subgrb_exttrig_creation(mock_raven_coincidence_search):
     # Check that the correct tasks were dispatched.
     mock_raven_coincidence_search.assert_has_calls([
-        call('E1234', alert['object'], searches=['SubGRB', 'SubGRBTargeted'],
-             se_searches=['AllSky', 'BBH'], pipelines=['Fermi'])])
+        call('E1234', alert['object'], searches=['SubGRB'],
+             group='CBC', pipelines=['Fermi']),
+        call('E1234', alert['object'], searches=['SubGRB'],
+             group='Burst', se_searches=['BBH'], pipelines=['Fermi'])])
 
 
 @patch('gwcelery.tasks.external_skymaps.create_upload_external_skymap')
@@ -726,7 +731,7 @@ def test_handle_subgrb_targeted_creation(mock_raven_coincidence_search,
     # Check that the correct tasks were dispatched.
     mock_raven_coincidence_search.assert_has_calls([
         call('E1234', alert['object'], se_searches=['AllSky', 'BBH'],
-             searches=['SubGRB', 'SubGRBTargeted'],
+             searches=['SubGRBTargeted'],
              pipelines=['Swift'])])
......
@@ -38,12 +38,10 @@ def test_pick_coinc():
 def test_upload_event(mock_create_signoff, mock_get_superevents,
                       mock_create_event):
     num = 16 if app.conf['mock_events_simulate_multiple_uploads'] else 1
-    coinc = pick_coinc()
     upload_event()
-    mock_create_event.has_calls(
-        [call(coinc, 'MDC', 'gstlal', 'CBC') for count in range(num)])
+    assert mock_create_event.call_count == num
     mock_get_superevents.assert_called_once_with('MDC event: M1234')
     mock_create_signoff.assert_called_once()
     msg = ('If this had been a real gravitational-wave event candidate, '
......
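The rewritten assertion fixes a silent no-op: ``Mock`` objects auto-create attributes, so calling the misnamed ``has_calls`` (instead of ``assert_has_calls``) never failed regardless of what the task actually did. A standalone demonstration of the pitfall and of the explicit ``call_count`` check now used::

    from unittest.mock import Mock, call

    m = Mock()
    # ``has_calls`` is auto-created as a child mock; this "check" passes
    # even though ``m`` was never called with these arguments (or at all).
    m.has_calls([call('anything')])

    # An explicit count really asserts how many times the mock ran.
    assert m.call_count == 0
    m('payload')
    assert m.call_count == 1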
@@ -86,11 +86,11 @@ def test_handle_create_grb_event(monkeypatch,
                            labels=None))
     if expected_result:
         mock_create_event.assert_has_calls(calls)
-        if ext_search == 'SubGRB' or \
-                ('Fermi' in pipelines and ext_search == 'GRB'):
+        if ext_search == 'SubGRB':
             mock_get_upload_external_skymap.assert_called()
         else:
             mock_create_upload_external_skymap.assert_called()
+            mock_get_upload_external_skymap.assert_not_called()
     else:
         mock_create_event.assert_not_called()
         mock_create_upload_external_skymap.assert_not_called()
......