The 2019-06-03 DM-SST F2F meeting will focus on DM algorithms, in preparation for the algorithms workshop in Nov/Dec this year. This should be a deep dive into challenging and important algorithms. Please list here algorithms or topics related to algorithms that you would like to cover. Key points to address for all algorithms are:

  • Current status and performance
  • Will we meet LSST SRD requirements with the current algorithmic baseline?
  • For which algorithms might we engage and leverage effort from science collaborations (e.g. the DESC PSF task force)?
  • What are the plans to verify our choice of algorithms? Precursor data, commissioning ComCam data?
Each topic below lists who requested it, the estimated time required, its priority (High/Normal/Low), and notes.
Status & Performance of Image Differencing (decorrelated A&L, ZOGY, etc.)

(RHL) This means processing a lot of data, analysing the failure modes (including e.g. astrometry), and tweaking things to improve matters.  There's HSC data available, and I think UW's been processing DES – so maybe this is done and I just haven't seen the results.

We need to know the false positive rate (probably via injecting fakes; a minimal sketch follows below), and classify what's going wrong. Variability in the cores of galaxies is something we need to do well.

We need to remember that the diffim code has to work both at high latitude and down in the bulge.
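
A minimal sketch of the fake-injection approach to measuring the false positive rate. The Gaussian PSF, flux range, detection threshold, and matching radius are illustrative assumptions, not the pipeline's actual choices:

    import numpy as np
    from scipy.ndimage import gaussian_filter, label, center_of_mass

    rng = np.random.default_rng(42)
    ny, nx, sigma_psf, sky_rms = 512, 512, 2.0, 1.0

    # Toy "difference image": pure noise plus injected point-source fakes.
    image = rng.normal(0.0, sky_rms, (ny, nx))
    n_fakes = 50
    xy_true = rng.uniform(20, nx - 20, (n_fakes, 2))
    yy, xx = np.mgrid[0:ny, 0:nx]
    for x0, y0 in xy_true:
        flux = rng.uniform(50, 200)  # illustrative flux range
        r2 = (xx - x0) ** 2 + (yy - y0) ** 2
        image += flux * np.exp(-r2 / (2 * sigma_psf**2)) / (2 * np.pi * sigma_psf**2)

    # Detect: threshold the PSF-filtered image at 5 sigma of its own noise.
    smoothed = gaussian_filter(image, sigma_psf)
    noise = sky_rms / (2 * np.sqrt(np.pi) * sigma_psf)
    labels, n_det = label(smoothed > 5 * noise)
    centers = center_of_mass(smoothed, labels, range(1, n_det + 1))
    xy_det = np.array([(c[1], c[0]) for c in centers])

    # Detections with no injected fake within 2 pixels are false positives.
    if n_det:
        d = np.sqrt(((xy_det[:, None, :] - xy_true[None, :, :]) ** 2).sum(-1)).min(1)
        n_false = int((d > 2.0).sum())
    else:
        n_false = 0
    print(f"{n_det} detections, {n_false} false positives")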

DCR correction
(RHL) We need to know if this works. What is the proposal for validating the current code? We also need to know if it's needed, which will require some serious study of DECam subtractions over a range of airmasses. E.g. I can think of ways of processing dipoles so as not to cause false positives (but I don't know if they are good ideas).
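
For scale, a back-of-the-envelope estimate of the DCR amplitude, using the Filippenko (1982) approximation to the refractive index of air at standard conditions; this is a sketch, not the stack's atmosphere model:

    import numpy as np

    def n_air_minus_1(wl_um):
        """Refractive index of air minus 1 (Filippenko 1982, standard conditions)."""
        s2 = 1.0 / wl_um**2
        return (64.328 + 29498.1 / (146.0 - s2) + 255.4 / (41.0 - s2)) * 1e-6

    def dcr_offset_arcsec(wl1_um, wl2_um, airmass):
        """Differential refraction between two wavelengths, in arcsec.
        R(wl) ~ (n(wl) - 1) * tan(z), with sec(z) = airmass."""
        tanz = np.sqrt(airmass**2 - 1.0)
        dn = n_air_minus_1(wl1_um) - n_air_minus_1(wl2_um)
        return np.degrees(dn * tanz) * 3600.0

    # Of order an arcsecond across the g band at airmass 1.5:
    print(f"{dcr_offset_arcsec(0.40, 0.55, 1.5):.2f} arcsec")
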
Status of MOPS (requested by Eric Bellm)

Source Association: single-frame to reference, N-way matching (requested by Eric Bellm)
(RHL) We need this for QA.
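
A minimal sketch of single-frame-to-reference association with a k-d tree (scipy); a real implementation would match on the sphere and handle proper motions, but the shape of the problem is this:

    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)
    ref = rng.uniform(0, 1000, (5000, 2))   # reference positions (tangent-plane pixels)
    src = ref[rng.choice(5000, 3000, replace=False)] + rng.normal(0, 0.1, (3000, 2))

    tree = cKDTree(ref)
    match_radius = 0.5
    dist, idx = tree.query(src, distance_upper_bound=match_radius)
    matched = np.isfinite(dist)             # unmatched sources come back with dist = inf
    print(f"matched {matched.sum()} of {len(src)} sources")
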
Real-Bogus (requested by Eric Bellm)
(RHL) We need to analyse the problems in the tuned difference imaging before looking at this.  The DPDD has a number of placeholders (dipoles etc.) that we need to validate as useful as part of this work.
Brighter-Fatter Kernel generation

Is the current approach to calculating the BF kernel sufficient to meet our SRD requirement?

How much better could we do if we adopted a fuller treatment of the Photon Transfer Curve (see the sketch after these questions)?

What set of tests would we want to see from ComCam lab data and on-sky data to verify our algorithmic choice?
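
As a sketch of what a fuller PTC treatment could buy us: fit the flat-pair variance-vs-mean curve with a quadratic and read off both the gain and the BF-induced turnover, rather than assuming pure Poisson behaviour. The model and coefficients below are illustrative, not our actual PTC code:

    import numpy as np

    # Toy PTC: flat-pair variance vs mean (ADU), with a BF-induced quadratic downturn.
    gain, a00 = 1.5, 2e-6          # e-/ADU and a BF-like coefficient (illustrative)
    mean = np.linspace(1000, 80000, 40)
    var = mean / gain - 2 * a00 * mean**2

    # Quadratic fit: var = c2*mean^2 + c1*mean + c0; pure Poisson would have c2 = 0.
    c2, c1, c0 = np.polyfit(mean, var, 2)
    print(f"gain = {1 / c1:.3f} e-/ADU (true {gain})")
    print(f"BF-like coefficient = {-c2 / 2:.2e} (true {a00:.2e})")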

Deep object measurement

Cell-based coadds? Multifit?

(RHL) see multiple rows towards the bottom of the table

PSF Determination

Chromatic PSFs?  (RHL): I think that PIFF will probably work for us, so we need to implement this as soon as the code is stable.  We shouldn't be using PSFEx in 6 months.

(RHL) I think we need to handle chromatic PSFs to take out the DCR in the astrometry;  there are other approaches but we do need to decide.

(RHL) We need to be able to use external PSF catalogues at some point in the processing. Jim Bosch argues that we don't need them for SFM (but I worry about robustness under poor conditions in u), and I'm OK with doing the PSF estimation post-jointcal; if we're going to do this, we should do it soon.

DM's plans for computing photo-z (requested by Leanne Guy)

#dm-sst discussion on 22 Apr 2019

Machine learning for source measurement and/or deblending.
#dm-sst discussion on 7 May 2019.
Deblender

(RHL) I don't believe that we should assume the current approach (scarlet) will work. We don't have a backup plan, except some sort of simplistic forward modelling (Sérsic?); that would probably work for some measurements.

I think we should be considering a class of simultaneous measurements in addition to measuring on the children.

Stellar crowded fields

(RHL) This comes in two parts: modifications to the current approach so that it can process all 18,000 square degrees (probably needing some sort of recursive subtraction of sources, including in PSF estimation; see the sketch below), and a "real" crowded-field code that can handle the bulge fields, where we can neglect the galaxies.

 LPG: Seems mostly a question of resources at this stage. 
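
A minimal sketch of the recursive-subtraction idea, assuming a known Gaussian PSF: detect the brightest stars, subtract them, then re-detect at successively lower thresholds. Real code would feed this back into PSF estimation; all numbers are illustrative:

    import numpy as np
    from scipy.ndimage import gaussian_filter, maximum_filter

    rng = np.random.default_rng(1)
    ny = nx = 256
    sigma_psf = 2.0
    yy, xx = np.mgrid[0:ny, 0:nx]

    def psf(x0, y0, flux):
        r2 = (xx - x0) ** 2 + (yy - y0) ** 2
        return flux * np.exp(-r2 / (2 * sigma_psf**2)) / (2 * np.pi * sigma_psf**2)

    # A crowded toy field: many overlapping stars on a noisy background.
    image = rng.normal(0, 1.0, (ny, nx))
    for _ in range(2000):
        image += psf(rng.uniform(0, nx), rng.uniform(0, ny), rng.uniform(100, 5000))

    residual = image.copy()
    stars = []
    for threshold in (50, 15, 5):  # successively deeper detection passes
        sm = gaussian_filter(residual, sigma_psf)
        peaks = (sm == maximum_filter(sm, 5)) & (sm > threshold)
        for y0, x0 in np.argwhere(peaks):
            flux = sm[y0, x0] * 4 * np.pi * sigma_psf**2  # crude matched-filter flux
            residual -= psf(x0, y0, flux)
            stars.append((x0, y0, flux))
    print(f"{len(stars)} stars subtracted; residual rms {residual.std():.2f}")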

Initial Astrometry
(RHL) We need an astrometric solver that never ever fails to get a 100 mas solution for every CCD in the camera, even in u with significant cloud. This will probably require a full focal-plane solution, in the sense that knowing where one chip is tells you where the rest are. The guider is also probably a sufficient source of information to achieve this.
Final Astrometry
(RHL) We need a version of astrometric jointcal that runs fast enough to be used on 100s of visits in a reasonable amount of memory. We need to include a structured model (atmosphere, optics, camera, CCD) with control over the degrees of freedom (e.g. freeze optics/camera/CCD for the data and fit the atmosphere). We need to be able to extract e.g. the CCD positions and tree-ring distortions from the solutions. We need to do the clipping properly, and in ways that we understand. We probably need to include the parallax/proper motion as part of this processing (not just in the catalogues); if not, we need to work on this ASAP. We should be able to demonstrate sqrt(n) scaling converging to (I expect) c. 1 mas (see the sketch below); we need to demonstrate that we are rigidly on the Gaia system.
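
A minimal sketch of the sqrt(n) demonstration we'd want from QA: simulate repeat visits with a per-visit scatter plus a systematic floor, and check where the error on the mean position stops improving. The 10 mas / 1 mas numbers are illustrative:

    import numpy as np

    rng = np.random.default_rng(2)
    sigma_visit, sigma_floor = 10.0, 1.0      # mas; illustrative values
    n_objects = 20000

    for n_visits in (4, 16, 64, 256):
        # Per-object systematic offset (floor) plus per-visit noise, averaged over visits.
        systematic = rng.normal(0, sigma_floor, n_objects)
        mean_pos = systematic + rng.normal(0, sigma_visit, (n_visits, n_objects)).mean(0)
        expected = np.hypot(sigma_visit / np.sqrt(n_visits), sigma_floor)
        print(f"n={n_visits:4d}: rms {mean_pos.std():5.2f} mas (expect {expected:5.2f})")
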
Photometry for stars

(RHL) FGCM is working well, and we need to decide if we are going to use it to replace photometric jointcal.  If we keep jointcal, we need to separate the components (atmosphere, camera, CCD) and include the estimated SEDs of the stars.  It must run in finite time/memory.

We need to identify a photometric catalogue to use (ultimately from HSC, but tied to external catalogues). Gaia probably plays a rôle (at least in gri); Pan-STARRS seems to be adding noise, but we need QA to confirm this. DES?

Feature extraction from light curves
We were waiting for feedback from the TVS group; we will now start an SST study on this, to be completed before the workshop.
Galaxy photometry
(RHL) We don't have great galaxy measurement code at present, and we don't know how bad it is; it appears to be sensitive to background levels, and it doesn't handle "B+D" (bulge+disc) fitting or forced photometry where we free up some parameters (n.b. I'm not sure we need forced photometry, but we say we'll do it). We have no code for PSF-matched colours (using e.g. the Kuijken approach). We need robust fake-source injection running at scale as part of the QA for this.
Forced photometry
(RHL) If we're going to use forced photometry we need to figure out how to make it useful in the presence of blending.
Shape measurement
(RHL) The current shape measurement is via HSM re-Gaussianization. This isn't good enough. We are investigating metacal, but this must be integrated and run at scale. N.b. this may require a different structure for the measurement code (see Deblender) if the child shapes are measured by simultaneous fit in metacal.
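
A minimal sketch of the metacal bookkeeping (the shear response and calibrated shear estimate), with the image-level step stubbed out: e_plus/e_minus stand in for ellipticities remeasured on counterfactually sheared images (the real code deconvolves, shears by ±Δγ, reconvolves, and remeasures each galaxy):

    import numpy as np

    rng = np.random.default_rng(3)
    n, dgamma, true_shear = 100000, 0.01, 0.02

    # Stand-ins for remeasured ellipticities; response_true is illustrative.
    shape_noise = rng.normal(0, 0.2, n)
    response_true = 0.5
    e_plus = shape_noise + response_true * (true_shear + dgamma)
    e_minus = shape_noise + response_true * (true_shear - dgamma)
    e_nominal = shape_noise + response_true * true_shear

    # Metacal: R = <e(+dg) - e(-dg)> / (2 dg); shear estimate = <e> / R.
    R = (e_plus - e_minus).mean() / (2 * dgamma)
    print(f"R = {R:.3f}; recovered shear = {e_nominal.mean() / R:.4f} (true {true_shear})")
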
ISR
(RHL) AP and DRP rely on good ISR. I'm not convinced that it is good enough (esp. for LSST CCDs); some of this is about calibration product production, but some is how we use it. This is separate from the BF work that MWV mentioned; an example is CTE corrections, if needed. Another is using flats with the colour of the sky (and then switching to flat νF_ν). This may only be possible on imsim data for now.
Cosmic rays
(RHL) We haven't worked on the CR code for ever. We need to do better; e.g. the HSC search for z-band dropouts is hard.
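
A minimal sketch in the spirit of Laplacian-edge CR rejection (van Dokkum's L.A.Cosmic): CRs are sharper than the PSF, so the Laplacian separates them from real sources. The threshold is illustrative:

    import numpy as np
    from scipy.ndimage import laplace

    rng = np.random.default_rng(4)
    ny = nx = 256
    image = rng.normal(0, 1.0, (ny, nx))

    # A PSF-blurred star and a single-pixel "cosmic ray" hit.
    yy, xx = np.mgrid[0:ny, 0:nx]
    image += 500 * np.exp(-((xx - 64) ** 2 + (yy - 64) ** 2) / (2 * 2.0**2)) / (2 * np.pi * 4)
    image[128, 128] += 200.0

    # CRs are sharper than the PSF, so the negative Laplacian flags them
    # far more strongly than real (PSF-convolved) sources.
    sharp = -laplace(image)
    crmask = sharp > 8 * sharp.std()
    print(crmask.sum(), "pixel(s) flagged, at", np.argwhere(crmask)[:5])
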
Improved interpolation
(RHL) We can do a better job on interpolation using "live" Gaussian processes (GPs) rather than the hard-coded version we inherited from SDSS. This will probably improve the photometry/astrometry of saturated stars, and may be important for building coadds. If it's just for pretty pictures it's less important (but EPO should care!).
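
A minimal sketch of GP interpolation over a bad column using scikit-learn; in the real stack the kernel would come from the PSF/noise model and the solver would need to be far faster, but it shows the idea:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(5)
    yy, xx = np.mgrid[0:24, 0:24]
    truth = 50 * np.exp(-((xx - 12.0) ** 2 + (yy - 12.0) ** 2) / (2 * 3.0**2))
    image = truth + rng.normal(0, 1.0, truth.shape)

    mask = np.abs(xx - 12) < 2                 # a fake bad column through the star
    coords = np.column_stack([xx.ravel(), yy.ravel()])
    good = ~mask.ravel()

    gp = GaussianProcessRegressor(RBF(3.0) + WhiteKernel(1.0), normalize_y=True)
    gp.fit(coords[good], image.ravel()[good])
    filled = image.copy().ravel()
    filled[~good] = gp.predict(coords[~good])  # predict the masked pixels
    resid = filled.reshape(image.shape)[mask] - truth[mask]
    print(f"interpolation rms error in masked region: {resid.std():.2f}")
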
Global sky subtraction
(RHL) We need at least full focal-plane sky subtraction; the HSC code only subtracts the first PCA component and won't be good enough for LSST. It is possible that we'll be able to get away with full focal-plane subtraction rather than implementing background matching.
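
A minimal sketch of going beyond the first PCA component: build the sky basis from a stack of (heavily binned) sky frames and reconstruct each visit with the leading k modes. Shapes and mode counts are illustrative:

    import numpy as np

    rng = np.random.default_rng(6)
    n_visits, npix = 200, 4096     # e.g. heavily binned full focal-plane sky frames

    # Toy sky stack: two spatial modes with visit-to-visit amplitudes, plus noise.
    modes = rng.normal(0, 1, (2, npix))
    amps = rng.normal(0, 1, (n_visits, 2)) * np.array([5.0, 2.0])
    skies = amps @ modes + rng.normal(0, 0.5, (n_visits, npix))

    mean_sky = skies.mean(0)
    resid = skies - mean_sky
    U, s, Vt = np.linalg.svd(resid, full_matrices=False)
    for k in (1, 2):               # HSC currently keeps only the first component
        model = mean_sky + (resid @ Vt[:k].T) @ Vt[:k]
        print(f"k={k}: residual rms {np.std(skies - model):.3f}")
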
Background matching
(RHL) Once we have good flats and non-linearity we should revisit this (but see previous row).
Star wing subtraction
(RHL) It is clear that doing a good job on sky subtraction (i.e. preserving extended light rather than over-subtracting it) leaves the wings of bright stars in the image, and so makes the deblending and measurement harder. One way to make this less bad is to subtract the wings of bright stars.
Detection, modelling and removal of extended emission (e.g. ICL; tidal features; IR cirrus)
(See "Global sky subtraction" above; the same full focal-plane sky modelling concerns apply here.)
Building coadds

(RHL) The current coadd code doesn't clip as well as it needs to, at least for HSC wide (too few epochs) and Deep/UDeep (too small dithers).  We need to work on how we'll do this – PSF matching and iterating?  Something else?

Note that we also need to generate ip_diffim templates, so there's an overlap with DCR concerns.
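
A minimal sketch of per-pixel clipping across epochs (median/MAD based); with HSC-Wide-like epoch counts this is exactly the regime where naive clipping gets fragile, which is the point above. Numbers are illustrative:

    import numpy as np

    rng = np.random.default_rng(7)
    n_epochs, ny, nx = 5, 64, 64                     # few epochs, as in HSC Wide
    stack = rng.normal(100.0, 3.0, (n_epochs, ny, nx))
    stack[2, 10:12, 20:40] += 500.0                  # a satellite trail in one epoch

    med = np.median(stack, axis=0)
    mad = np.median(np.abs(stack - med), axis=0)
    sigma = 1.4826 * mad                             # MAD -> Gaussian sigma
    good = np.abs(stack - med) < 4 * sigma
    coadd = np.nanmean(np.where(good, stack, np.nan), axis=0)
    print("trail residual:", coadd[11, 30] - 100.0)  # ~0 if the trail was clipped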

QA
(RHL) We need QA, including roll-up to high-level metrics, for all the processing: in particular PSF, astrometry, stellar photometry, and galaxy photometry. This must apply per-visit as well as per-tract, and handle multi-band as well as single-band where appropriate.
Data access
(RHL) As part of processing (e.g. jointcal) and QA we need fast ways to read and process catalogue data.  If we're using parquet, we need to have efficient ways to generate the files.
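
A minimal sketch of generating catalogue Parquet efficiently with pyarrow: stream batch-sized chunks rather than materialising the whole table, then read back only the columns needed. Column names are illustrative:

    import numpy as np
    import pyarrow as pa
    import pyarrow.parquet as pq

    rng = np.random.default_rng(8)
    schema = pa.schema([("id", pa.int64()), ("ra", pa.float64()),
                        ("dec", pa.float64()), ("psf_flux", pa.float32())])

    # Stream the catalogue out in chunks rather than holding it all in memory.
    with pq.ParquetWriter("catalog.parquet", schema, compression="zstd") as writer:
        for chunk in range(10):
            n = 100000
            batch = pa.record_batch([
                pa.array(np.arange(n, dtype=np.int64) + chunk * n),
                pa.array(rng.uniform(0, 360, n)),
                pa.array(rng.uniform(-90, 90, n)),
                pa.array(rng.normal(1000, 50, n).astype(np.float32)),
            ], schema=schema)
            writer.write_batch(batch)

    # Column-projected read: only what the QA actually needs.
    tbl = pq.read_table("catalog.parquet", columns=["ra", "dec"])
    print(tbl.num_rows, tbl.column_names)
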
Asteroids
(RHL) If we care about trailed objects we need to measure them.  If we really care we need to detect them specially too.
Satellite trails
(RHL) we may need to explicitly detect and remove satellite trails as part of the coadd generation.
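
A minimal sketch of trail detection with a Hough transform (scikit-image), run here on a per-visit detection mask; the trail and thresholds are illustrative:

    import numpy as np
    from skimage.transform import hough_line, hough_line_peaks

    rng = np.random.default_rng(9)
    image = rng.normal(0, 1.0, (256, 256))
    # Paint a fake satellite trail: a bright line at a known slope.
    for x in range(256):
        y = int(0.4 * x + 30)
        if 0 <= y < 256:
            image[y, x] += 20.0

    mask = image > 5.0                     # 5-sigma detection mask
    h, angles, dists = hough_line(mask)
    for _, angle, dist in zip(*hough_line_peaks(h, angles, dists,
                                                threshold=0.5 * h.max())):
        print(f"trail: normal angle {np.degrees(angle):.1f} deg, offset {dist:.1f} px")
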
Ghosts and ghouls
(RHL) We need to think about how to handle ghosts and ghouls. I'd probably defer this until we see how bad LSSTCam really is.
Star/Galaxy separation
(RHL) We need to get the S/G work (I think using an SVM) into the codebase so that we can retrain and use it.
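
A minimal sketch of the retrain-and-apply loop for an SVM-based S/G classifier with scikit-learn; the two features and the training labels are simulated stand-ins for e.g. a psf-minus-model magnitude and a size ratio:

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(10)
    n = 5000
    is_star = rng.random(n) < 0.5
    # Illustrative features: concentration (psf - model mag) and a size ratio.
    conc = np.where(is_star, rng.normal(0.0, 0.02, n), rng.normal(0.3, 0.15, n))
    size = np.where(is_star, rng.normal(1.0, 0.05, n), rng.normal(1.6, 0.4, n))
    X = np.column_stack([conc, size])

    X_tr, X_te, y_tr, y_te = train_test_split(X, is_star, random_state=0)
    clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
    print(f"held-out accuracy: {clf.score(X_te, y_te):.3f}")
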
Variability in the sky v. templates
(RHL) Giants vary; stars move. Think about how to generate consistent ip_diffim photometry as the templates evolve (or don't). Note that stars varying breaks the assumptions that go into A&L, but I think we can sort out how to handle this.