Gen 3 results are in the CMR this time, with a note that this is the last time. Will probably continue this until we have full pipelines parity.
Schedule Dan from early August.
RHL has a draft of a notebook from Chris - will pass on details to Simon when there are more.
AO for community engagement with commissioning out – good interest.
Interest from DESC (see #desc-announce); not sure about other SCs yet.
Bugs, issues, topics, etc. of the week
Use of logger in faro. Some guidelines have emerged from recent work:
Do not use lsst.log; use Python's 'logging' module instead. See an example in 'separations.py':
import logging
log = logging.getLogger(__name__)
log.debug("No matching visits found for objs: %d and %d", obj1, obj2)
Except if a class is using 'Task' - then the logger attached to each 'Task' should be used, e.g. 'self.log.info()'. See example in 'TractMeasurementTask':
class TExTask(Task):
    ...
    self.log.info("Measuring %s", metricName)
Do not use f-strings in any log call. 'Task' has been set to use % formatting, and at the log handler level f-string and % formatting cannot be mixed. This only applies to logging; f-strings can be used elsewhere.
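The two logging rules above can be illustrated with a minimal, self-contained sketch (the logger name, object IDs, and message are illustrative, not taken from faro):

```python
import logging

log = logging.getLogger(__name__)

obj1, obj2 = 101, 102  # illustrative object IDs

# Correct: pass arguments to the logger and let it apply %-formatting
# lazily; the message is only built if the level is enabled, and it
# stays consistent with the %-style formatting that Task loggers use.
log.debug("No matching visits found for objs: %d and %d", obj1, obj2)

# Avoid: an f-string builds the message eagerly and mixes formatting
# styles at the handler level.
# log.debug(f"No matching visits found for objs: {obj1} and {obj2}")
```

Note that the f-string version is not just a style issue: the string is formatted even when DEBUG logging is disabled.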
DM-31013: NaNs being reported by all metrics. Problem with validation datasets - FGCM format changed? Jeff will look at it.
Keith has been working on switching to parquet tables (DM-26214).
Has switched to the src table for single-detector metrics. Very fast. Per-visit and per-detector src catalogs only so far.
Implemented as additional base classes rather than replacing the old ones.
Complete this PR first, then move on to using parquet in the remainder of the code.
DM-30748: Reference catalogs
Gen 3 magic - it will give the dataIds associated with the shards that overlap the spatial region of the quantum being computed, so there is no need to load the whole reference catalog! Access the spatial info and the timestamp of the visit from the dataId associated with the quantum. Can then apply the correct proper motion.
Can now proceed to implementing all metrics that require comparison to external catalogs.
Simon has run DRP pipelines on a reduced dataset. Can get through coadd and coadd measurement; one patch with complete coverage in all bands. The issue is time: 240 minutes for SFP, so maybe not a nightly reprocessing. Deblending and measurement are time consuming. So we have a coherent dataset that can be run fully, but time is an issue.
Keith Bechtol to make a ticket to better understand the mapping of these camera and calibration product characterization efforts to verification documents, and the focus of these efforts. Discuss with the SCLT.