...

  • DPDD: make a precovery service available via an API. We have pieces of this in the diagram; it should be clear how we will do this, and who will do it.
  • Who maintains image/chip/visit metadata? Provided by metaserve.
  • Use case: "Find me a 3-sigma object in this (arcmin) area." Useful for asteroids, where you have an error ellipse. This requires going back to the images, since we are not detecting at this level. The interface is probably SUI/T, but who supplies the code for the image processing? Will the current image access service return an image of an arbitrary area? Yes. Then someone just needs to write the code that works on the cutout (see the first sketch after this list). Is this using direct or difference images? It is useful to have difference images, but we might have to generate them from the latest template rather than the original template.
  • Solar system processing. Not on this diagram; it is covered in LDM-151 (MOPS). On the MOPS diagram, asteroid assignment is a time-evolving problem. Do we have a requirement to retain this history? Not yet decided; this is Mario and K-T's responsibility.
  • Zeljko: Simon and Andy, where do we stand? How hard will implementing this system be, and are we prepared to build it?
    • Single Frame Processing, Simon: there is some worry about some of the pixel-level corrections, which may be complex, but for all of these we have at least some implementation. Doable with the nominal 2 FTEs over two years.
    • Alert Detection: Template generation is a hard problem and requires both Princeton and UW effort. Jim: there are two hard parts, varying seeing effects (which Princeton is thinking about) and DCR (UW). Image differencing and measurement is not a big concern. We might need to measure on likelihood images (offline discussion required). Robert thinks there is more to worry about in differencing.
    • It is hard to understand the overall complexity of the operational system and how to coordinate all of the data flows in processing. There are two general risks here: the integration risk of all the associated pipelines, and the individual scientific/algorithmic risks. UW/Princeton are better set up to retire the latter; addressing the former requires cross-site work. This also requires significant QA infrastructure for evaluating science products, and prioritization decisions from Mario and Jacek.
    • The association pipeline is hopefully not beyond the state of the art; the plan is to implement the Budavari algorithm. There is debate as to whether this belongs inside or outside the database. The association toolkit is also important for QA purposes, so we want to preserve the ability to run it on, e.g., a laptop (see the second sketch after this list).
    • Where does the finding/masking of ghosts and glints appear in this diagram?
    • The aggregate measurements bubble is not implemented. There is no milestone in the plan for variability characterization; one needs to be added.
    • Alert broker: this is going to require significant work. Community prototypes are available for alert-related work, but we will want to improve on them. There are lots of interfaces in this area to SUI/T and NCSA.
  • What is the impact of these L1 pipelines on the QA pipelines? 
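
A minimal sketch of the cutout-measurement code discussed in the 3-sigma use case above, assuming the image access service has already returned a (difference-)image cutout as a FITS file. The function name, aperture radius, and noise estimate are illustrative assumptions, not the actual SUI/T or image access interfaces.

```python
# Hypothetical sketch: aperture-measure a predicted position on a cutout
# returned by the image access service, and report a signal-to-noise ratio
# that the caller can compare against the 3-sigma threshold.
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord
from astropy.io import fits
from astropy.wcs import WCS
from photutils.aperture import SkyCircularAperture, aperture_photometry


def measure_candidate(cutout_path, ra_deg, dec_deg, radius_arcsec=2.0):
    """Return (flux, flux_err, snr) at the predicted position on the cutout."""
    with fits.open(cutout_path) as hdul:
        data = hdul[0].data.astype(float)
        wcs = WCS(hdul[0].header)

    # Crude background and per-pixel noise estimate from the cutout itself.
    sky = np.nanmedian(data)
    noise = np.nanstd(data - sky)

    position = SkyCoord(ra_deg * u.deg, dec_deg * u.deg)
    aperture = SkyCircularAperture(position, r=radius_arcsec * u.arcsec).to_pixel(wcs)
    phot = aperture_photometry(data - sky, aperture)

    flux = float(phot["aperture_sum"][0])
    flux_err = noise * np.sqrt(aperture.area)  # assumes uncorrelated pixel noise
    return flux, flux_err, flux / flux_err
```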
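
For the association item, a toy illustration of the "run it on a laptop" requirement: a plain nearest-neighbour sky match using astropy, not the Budavari probabilistic algorithm itself. The function and argument names are assumptions for illustration only.

```python
# Hypothetical sketch: associate new detections with existing objects by
# nearest-neighbour sky match; -1 marks detections that seed a new object.
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord


def associate(det_ra, det_dec, obj_ra, obj_dec, max_sep_arcsec=1.0):
    """Return, per detection, the matched object index or -1 (coordinates in degrees)."""
    detections = SkyCoord(det_ra * u.deg, det_dec * u.deg)
    objects = SkyCoord(obj_ra * u.deg, obj_dec * u.deg)
    idx, sep, _ = detections.match_to_catalog_sky(objects)
    matched = sep < max_sep_arcsec * u.arcsec
    return np.where(matched, idx, -1)
```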

...