Due to the ongoing COVID-19 situation, this meeting will be virtual. Please do not attempt to travel to Seattle — or to anywhere else — to participate.
Is this document acceptable to the DMLT? What are the remaining open questions? How will we resolve them?
DMTN-148 is almost there; suggest a two-week review by the DMLT.
John to set up a feedback system with Chris.
This should be baselined (change-controlled).
Robert asks when we will start "acting on this", e.g. when it could be used for LATISS on the mountain. Ongoing work from Andres and Merlin; where do the ingest and validation steps live?
KT: the last stage is getting calibrations from the production system via the OODS to the summit, to be used for ISR there. They are certified and then transferred to where they are needed.
Jim: good to separate operations concerns (how it is used on the mountain) from the code and how we implement it. DMTN-111 could hold the summit details. Tim: no agreement yet that every curated calibration has a class somewhere; that is one end of the process, the other is certification.
Headline: The DMLT agrees that the story we tell the community is that our data model is effectively two tables, and users will need to join them themselves.
General agreement about using PyVO and Pandas.
Are DIAForcedSources included?
The same considerations apply to mapping DIAObject to DIAForcedSource.
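The client-side join described above can be sketched in pandas. This is an illustration only: the column names (`diaObjectId`, `psFlux`, etc.) are assumptions rather than the final schema, and in practice the two tables would be retrieved via PyVO TAP queries rather than built by hand.

```python
# Minimal sketch of the client-side join users would perform themselves.
# Column names are illustrative, not the final Rubin schema; in practice
# each DataFrame would come from a PyVO TAP query result.
import pandas as pd

dia_object = pd.DataFrame({
    "diaObjectId": [1, 2],
    "ra": [150.1, 150.2],
    "decl": [2.1, 2.2],
})
dia_source = pd.DataFrame({
    "diaSourceId": [10, 11, 12],
    "diaObjectId": [1, 1, 2],
    "psFlux": [3.5e-29, 3.6e-29, 1.2e-29],
})

# The service exposes the two tables separately; joining them is the user's job.
lightcurves = dia_source.merge(dia_object, on="diaObjectId", how="inner")
# One row per DIASource, annotated with its DIAObject's position.
```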
Our feature computation may be based on DIASources or DIAForcedSources; a recommendation from Eric will be forthcoming.
Adding support for, e.g., non-detection upper limits in feature computation is possible, and may make the inputs to feature computation more complex. However, this should not be unmanageable.
How tightly coupled is the AP pipeline with the database? Is this a technical risk?
Reconstructing data structures from the AP pipelines based on VO interfaces would be challenging.
The details of feature computation are well abstracted and testable; they are not tightly coupled.
Plugins are implemented for feature computation below the task level; the master task takes a Pandas data frame as input.
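The decoupling described above can be illustrated with a toy plugin registry. This is not the real AP pipeline API; the registry, decorator, and feature names are hypothetical, and only the shape of the design (plugins below a master task that consumes a pandas DataFrame) is taken from the discussion.

```python
# Hypothetical sketch of the plugin pattern: feature plugins registered
# below a master task that takes a pandas DataFrame as input.
# This is NOT the actual AP pipeline API, just an illustration of the decoupling.
import pandas as pd

PLUGINS = {}

def register(name):
    """Register a feature-computation plugin under the given name."""
    def wrap(func):
        PLUGINS[name] = func
        return func
    return wrap

@register("meanFlux")
def mean_flux(df):
    return df["psFlux"].mean()

@register("nSources")
def n_sources(df):
    return len(df)

def compute_features(df):
    """Master-task stand-in: run every registered plugin on the DataFrame."""
    return {name: plugin(df) for name, plugin in PLUGINS.items()}

features = compute_features(pd.DataFrame({"psFlux": [1.0, 2.0, 3.0]}))
# features == {"meanFlux": 2.0, "nSources": 3}
```

Because each plugin sees only a DataFrame, the features are testable in isolation and the pipeline is not tightly coupled to any particular database backend.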
None of these proposals changes promises previously made to the community.
In terms of announcements to the community, we suggest that this should be rolled into discussion of capabilities available for DP0.
Some discussion of a PST-SciCollab talk if necessary.
Eric Bellm — update time-series technote to contain a discussion of the way in which data will be presented to users.
Gregory Dubois-Felsmann — update the Science Platform design documentation to reflect that data access services should be tested with PyVO.
How will DM respond to slips in the overall project schedule?
Calabrese coordinating mail pickup in Tucson.
Services which were used in commissioning/integration are easy to define as “done”.
Would be good to get a statement of thanks from project leadership to DM staff.
Aim to make “blurring” between construction and commissioning a positive opportunity.
Also look for opportunities in the deliveries to ops (but be careful that this is not blurring).
The details of financing, ramps, etc through FY22 & FY23 will have to be addressed on a case-by-case basis, depending on guidance from construction project management and the agencies.
Comments from Victor:
Covid 19 costs are not an appropriate use of current funding (baseline or contingency).
We will therefore adopt a new baseline, the so-called “over target baseline”.
The earlier we do this, the riskier it will be and the less accuracy it will have.
Currently seems like NSF will accept late replanning (October/November).
This is not an opportunity for us to reinstate previously-accepted descopes.
This information can be shared with the rest of the project.
Some concern expressed that operational priorities are different from construction priorities; we should be clear that staff transitioning do so in the project's interest, rather than just because it is financially expedient.
Victor is petitioning the agencies for a minimal status review this year.
The drawback of waiting longer for a rebaselining is that we have to live with bad metrics until it kicks in; might be an issue for reviews.
Not clear what the rebaselining process will be: could imagine an FDR-like process, but it's not clear that will be practical.
Concern raised that DM may be able to reach completion on close to the original timescale.
Expectation is a 12-month delay at a cost of $3.5M per month (roughly $42M in total). Do not believe there is a serious risk of this not being approved at the moment. Also do not believe there is a serious risk of being forced to accept technical compromises.
Kian-Tat Lim: convene a meeting with Colin, Tim, Robert, and Yusra to resolve graph generation with per-dataset quantities (likely based on Consolidated DB work).
Everybody's talking about it, but what does it mean? Who will have to do what when?
Can we use this opportunity to get ahead of whatever Victor/Kevin/etc will ask for, and make sure DM comes out of the rebaselining process in good shape?