Spring 2022 epic: DM-33385 - Performance Characterization in S22 (In Progress)
Backlog epic: DM-29525 - Backlog epic for DM performance metrics (To Do)
Peter and Erik were able to run photometric repeatability on AuxTel data
Make a SQuaSH dashboard for AuxTel? Currently we use SQuaSH to track metrics with fixed data and changing pipelines; the data processing and workflow are not yet set up to easily push metrics to SQuaSH in a more automated way.
Question of how to chunk up the data for evaluating photometric repeatability, for example. More thought is needed to delineate subsets of data, e.g., a time window of data.
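To make the chunking question concrete, here is a minimal sketch of a PA1-style repeatability calculation evaluated on time-windowed subsets. The function names and the simple nightly-window scheme are illustrative assumptions, not the faro implementation.

```python
import numpy as np

def photometric_repeatability(mags_by_source):
    """PA1-style summary: median RMS of repeated magnitude measurements
    across sources (illustrative sketch, not the faro code)."""
    rms_per_source = [np.std(m, ddof=1) for m in mags_by_source if len(m) > 1]
    return np.median(rms_per_source)

def chunk_by_time(times, values, window):
    """Split measurements into contiguous time windows of width `window`
    (e.g., nightly chunks) so the metric can be evaluated per window."""
    times = np.asarray(times)
    order = np.argsort(times)
    times, values = times[order], np.asarray(values)[order]
    edges = np.arange(times.min(), times.max() + window, window)
    bins = np.digitize(times, edges)
    return [values[bins == b] for b in np.unique(bins)]
```

Evaluating `photometric_repeatability` per chunk, rather than on the full dataset, is one way to track how the metric drifts with observing conditions over time.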
AA1, absolute separations against a reference catalog (DM-29368)
Investigate how to compute the metric in different coordinate systems (e.g., Alt-Az, instrument coordinates). Suggestion for this ticket: do the calculation with RA, Dec only, and make a backlog ticket for the other coordinate systems.
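For the RA, Dec-only case suggested above, an AA1-style summary reduces to the median angular separation between matched science and reference positions. A minimal sketch follows; the function names are illustrative and this is not the faro implementation.

```python
import numpy as np

def angular_separation_mas(ra1, dec1, ra2, dec2):
    """Great-circle separation between matched positions (inputs in
    degrees, output in milliarcseconds), using the haversine formula
    for numerical stability at small separations."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    dra, ddec = ra2 - ra1, dec2 - dec1
    h = np.sin(ddec / 2) ** 2 + np.cos(dec1) * np.cos(dec2) * np.sin(dra / 2) ** 2
    return np.degrees(2 * np.arcsin(np.sqrt(h))) * 3600.0 * 1000.0

def aa1(ra, dec, ref_ra, ref_dec):
    """AA1-style summary: median separation (mas) of matched sources
    against a reference catalog (illustrative sketch)."""
    return np.median(angular_separation_mas(ra, dec, ref_ra, ref_dec))
```

Repeating the same reduction with positions transformed to Alt-Az or instrument coordinates would be the content of the proposed backlog ticket.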
Now that the SIT-Com reorganization is in place, we want to think about the scope of this meeting. Potential science verification topics to be diagonalized across various meetings (while trying to reduce the total number of meetings to the extent possible) include:
More formal systems engineering aspects of developing requirement specifications (including documenting the algorithms to compute metrics and pass/fail criteria), developing test cases, organizing test campaigns, and documenting outcomes.
There is partial overlap between DMSR requirements and system-level requirements from the OSS and LSR. Formally, the requirements are usually
Science verification and validation analysis tooling
"Nuts and bolts" of analysis_drp and faro infrastructure
Visualization tools
Adding metrics
Pushing on-sky data through Science Pipelines (e.g., AuxTel)
Analysis of on-sky data (e.g., AuxTel)
Tactical planning of activity on day to week timescale
Strategic planning on month(s) timescale
More detailed discussion of specific science verification and validation topics
PSF modeling
Weak lensing shear
Galaxy photometry
Astrometry
Low surface brightness features
...
How to modify/configure Science Pipelines in response to lessons learned from commissioning science verification?
Context questions:
What meetings do we expect (representatives from) in-kind contributing groups to attend?
Is all of the formal systems engineering within SVV scope?
Plan to embed in-kind groups in existing Project structures
Potential ways to diagonalize the meetings
Organizing by deliverable
Organizing by expertise
Current sketch of a plan is based more on an operational stance.
Tactical vs. strategic
"Technical" (both within DM and cross system) vs. "science"
DM works on components and delivers them. Are these fit to begin full system integration? Formally, we need to verify the DMSR. SVV would evaluate with on-sky data in an integrated system.
DMSR includes topics, such as network and Gen3 Butler verification, that are more DM-specific.
Suggest organizing by timescale of deliverables (e.g., weekly/monthly updates, or story/epic reviews), in accordance with the tactical/technical/strategic vision, as opposed to organizing by topic (e.g., AuxTel data processing). This leaves the floor open to a wide range of topics and invites experts across subsystems to attend and present or respond to their topic of need. The benefit is that we don't have to define the discussion beforehand; we let the scale of the problem, issue, or update being presented dictate the amount of meeting time and follow-up to dedicate. One downside is that we probably won't have enough time to cover all topics (if we are considering, e.g., only one hour a week).
Keith Bechtol to make a ticket to better understand the mapping of these camera and calibration products characterization efforts to verification documents, and the focus of these efforts. Discuss with the SCLT.