Agenda for Technical Review

This page currently contains a summary of discussion from recent meetings; feel free to comment or add ideas. Content is arranged in the (proposed) order of presentation at the Technical Review. For each presentation we attempt to identify appropriate presenters and the intended message, the "Supporting Documentation" section below itemizes the title and lead author of each document, and the last column of the agenda table lists the specific Charge question(s) addressed.

 

Charge Questions

  1. Are the requirements for the Operations Simulator understood at a level appropriate to guide its development into the construction phase of the project (including the level of fidelity required)?
  2. Are there possible design elements and system constraints that affect the survey cadence that have been overlooked?
  3. Are the OpSim inputs adequate and representative of the expected operational environment for LSST?
  4. Is the software architecture of OpSim sufficient to explore a wide range of scheduling algorithms and observing modes?
  5. Is the suite of post-processing tools (both existing and in development, taken together) adequate to evaluate simulations for their performance with regard to science priorities?
  6. Do the outputs from OpSim represent a reasonable prediction of the expected sequence of observations from LSST for a given set of science priorities?
  7. Does the architecture of OpSim sufficiently capture the logic of telescope scheduling to serve as a credible tool for prototyping the development of the OCS Scheduler?
  8. Are the development plans for OpSim, including the proposed timeline and allocated resources, credible given the requirement that a validated OCS Scheduler be delivered by the start of commissioning in 2019?

 

Agenda & Presentations

| Topic | Presentation (presenter) | Charge Element |
| --- | --- | --- |
| Introduction | Chuck (Systems Engineering view): how OpSim fits into the project, what it is supposed to do, and why it is important to the project. Bill (T&S view): what the Scheduler is, its function and importance, and what resources are available for providing it. Zeljko (scientist's perspective): why scientists care about OpSim and what functionality they need; delineate project needs and science needs. | 1 |
| Simulator Requirements | Abi/Francisco: what functionality is needed | 1, 2 |
| Building the Simulator | Francisco/Srini: how it works (design, architecture, and how you run it) | 2, 7 |
| Validation / Reference Run | Kem/Cathy: how we know it works as intended, and how a "Reference Run" is defined and what that means | 6 |
| Cadence Exploration | Kem/Cathy: project vs. science | 2, 4, 6 |
| SSTAR / Metrics | Steve/Srini/Cathy: post-processing tools, motivation and status | 5 |
| Analysis Framework Vision | Srini/Lynne/Peter/Steve: requirements and development plans | 5 |
| Scheduler | Kem/Abi/Steve: requirements, etc. | 1, 2, 3, 4, 7 |
| What's Next | Abi: objectives, plans, timelines, and resources | 8 |

 

Supporting Documentation

  1. Simulator requirements [Francisco] (see below; Srini to send Docushare link)
  2. Overview [Abi] **
    1. overview of how we are doing what we are doing: the overarching philosophy (why a tool, why this form, why not someone else's code?)
    2. motivate both the scheduler AND the simulator
    3. include or refer to the requirements document [Francisco]
    4. structure of how the models are broken out
    5. logical flowchart of the decision making
    6. point out the various functions at different times of the project
  3. Simulator description **: how it actually works (technical journal article; refer to the SPIE paper: based on this and different in these ways…) [Francisco]
  4. Conceptual APIs for the Scheduler (interface with OCS); see the API sketch after this list
    1. but don't turn this into a review of what it is going to become
    2. framework in which we are going to test the new scheduling algorithms
    3. API between the simulated scheduler and the actual scheduler
  5. Sphinx documentation on SSTAR * (how to install it, how to run it, and what comes out) [Srini]
  6. Sphinx documentation on OpSim * (how to install it, how to change parameters, and what the output is) [Srini]
  7. Post-processing tools: SSTAR+ (explains) plus other output (validates code; annotate a sample SSTAR report) [Srini]
  8. Post-processing tools: Metrics (SSTAR+ doesn't do it all; other science cases are needed; give examples from item 7; reshape PST), so MAF is now needed… [Steve/Srini]
  9. MAF design and development plan [Steve/Srini]; see the metric sketch after this list
  10. Validation v3.0 [Cathy]
  11. Cadence runs [Kem; defer until after the review?]
  12. Baseline run like opsim3.61: the reference run? [Kem]
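
To make item 4 above concrete, here is a minimal sketch, in Python, of what a common scheduler contract could look like. Every class and method name below is a hypothetical illustration, not the actual OpSim or OCS API; the point is only that the simulated scheduler and the eventual OCS Scheduler could implement the same interface, so the simulator can drive (and prototype) either one.

```python
# Hypothetical sketch only: names are illustrative assumptions,
# not the actual OpSim/OCS interface.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Target:
    """One candidate pointing (fields are illustrative)."""
    field_id: int
    ra_deg: float
    dec_deg: float
    band: str  # e.g. 'u', 'g', 'r', 'i', 'z', or 'y'


@dataclass
class Observation:
    """Record of a completed visit, reported back to the scheduler."""
    target: Target
    mjd: float
    seeing_arcsec: float


class Scheduler(ABC):
    """Contract that both a simulated scheduler and the eventual
    OCS Scheduler could satisfy, so OpSim can drive either one."""

    @abstractmethod
    def suggest_next_target(self, mjd: float) -> Target:
        """Return the highest-ranked target for the current time."""

    @abstractmethod
    def register_observation(self, obs: Observation) -> None:
        """Feed a completed visit back so internal state stays current."""
```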
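
Similarly, for items 8 and 9, here is a minimal sketch of the kind of pluggable metric MAF is meant to support. The `BaseMetric`/`run` names are assumptions made for illustration, not the framework's actual API; the idea is simply that each science case contributes a small class that reduces the visits to a field (or sky point) down to one number.

```python
# Hypothetical sketch only: a pluggable metric in the spirit of the
# planned Metrics Analysis Framework; class and method names are assumed.
import numpy as np


class BaseMetric:
    """Minimal contract: reduce the visits for one field to a number."""

    def run(self, visit_mjds: np.ndarray) -> float:
        raise NotImplementedError


class MedianInterVisitGap(BaseMetric):
    """Median gap, in days, between successive visits to a field."""

    def run(self, visit_mjds: np.ndarray) -> float:
        mjds = np.sort(visit_mjds)
        if mjds.size < 2:
            return float("nan")  # not enough visits to define a gap
        return float(np.median(np.diff(mjds)))


# Example: visits spaced 3 and 5 days apart -> median gap of 4.0 days.
print(MedianInterVisitGap().run(np.array([59000.0, 59003.0, 59008.0])))
```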

1 Comment

  1. I do understand what you meant by "project vs. science" but I find it open to misinterpretation - the main goal of the project IS science! Perhaps a better choice of words would be "construction vs. deployment" or "engineering studies vs. cadence studies"?