
Introduction

We will hold a bootcamp to bring together members of the camera, commissioning, and data management teams to explore instrument signature removal (ISR) in the context of all three teams.

Slack channel: #dm-bootcamps

Goals

  • To empower people to work with calibration-related data and pipelines using the LSP as the primary interface.
  • To explore how algorithms and tooling can be better integrated among the LSST teams.
  • To compare how calibration products are generated and applied via various input data and processing pipelines.


The following goals, put forward by each of the participating groups, expand in more detail on the high-level goals above.

Camera: 

  • Take specific steps toward using DM tools to analyze Camera test data, with the eventual goal of completely using DM tools for Camera reverification. Start with specific EOtest tasks and compare with the existing Camera analysis chain results, then build from there. (See tasks below.)
  • Produce Getting Started documentation for others to follow.
  • Make this a common, standard platform for people interested in looking at Camera test data.
  • Transition from "Camera team", "DM team", "Commissioning team", ... toward one LSST team.

Data Management:

  • To better understand the status of algorithms for ISR in both the DM and Camera ecosystems, including both calibration product production and application.
  • To compare an example calibration exercise using both Camera and DM code.
  • To understand the best route toward unifying code between Camera and DM, and the extent to which that is possible.

Commissioning:

  • To run an example of generating a calibration product and applying it as part of ISR, using the DM stack running on the LSP with relevant camera data hosted at NCSA. Start with the simplest end-to-end test.
  • To create a baseline of documentation from which to spread knowledge of the ISR implementation.

Logistics

When: 14-15 Nov. 2018, with space available on the afternoon of 13 Nov. and the morning of 16 Nov.

Where: Redwood Conference Room, SLAC National Accelerator Laboratory

We had a block of 10 to 20 rooms reserved at the SLAC Guest House; this block is now fully used, so please book a hotel ASAP.

Other suggested hotels: Comfort Inn, Atherton Inn

For practical reasons, we are not supporting remote participation via BlueJeans for the entire meeting. (During the meeting, we expect to reach out with specific questions via Slack and can set up specific BlueJeans calls if needed.) Our goal is that the products of the bootcamp will leave enough breadcrumbs and documentation that others can quickly get up to speed. We are planning a series of bootcamps that will each focus on a different topic, so there will be many future opportunities to participate.

Attendees

Please add your name if you are interested in participating in the bootcamp. We sized the bootcamp for 40 total attendees, though we do not necessarily need to be that large and might not have the budget for that many travelers. The Nov 2018 bootcamp is specifically focused on ISR. We expect to hold a series of bootcamps that focus on different themes and are intended to be independent of each other (i.e., one does not need to participate in this bootcamp to participate in future bootcamps on other topics).


Name (Institution): Notes/specific interests

1. Simon Krughoff (LSST/AURA): I want to continue to find out how best to use the LSP
2. Leanne Guy (LSST/AURA): LSST subsystem integration, feedback on the DM system, cross-subsystem verification and validation
3. Brian Stalder (LSST/AURA): ComCam integration, verification
4. Tony Tyson (UCD-LSST): ComCam integration, ISR, verification, metrics for SV
5. Keith Bechtol (UW-Madison): Get the commissioning team looking at data from LSST hardware using the interfaces we expect to use during commissioning
6. Rob Morgan (UW-Madison): Interested in photometric calibration and associated calibration products
7. Craig Lage (UC Davis): ISR, especially brighter-fatter
8. Andrew Bradshaw (UC Davis): Camera systematics and weak lensing
9. Imran Hasan (UC Davis): Weak lensing
10. Colin Slater (U. Washington): DM tooling for commissioning/camera analysis
11. Bo Xin (LSST/AURA): Sensor characterization, ISR
12. Michael Reuter (LSST/AURA): Understand sensor characterization and how to apply that knowledge during commissioning
13. Emily Phillips Longley (Duke): Confirmed (KB); eotest and ISR, results access and displays/plotting
14. Steve Ritz (UCSC): EOTEST and ISR
15. Duncan Wood (UCSC): EOTEST and ISR
16. Pierre Antilogus (LPNHE-IN2P3): Strong interest in pixel/eotest-like/ISR operation in the DM framework
17. Bela Abolfathi (UC-Irvine / SLAC): EOTEST and ISR
18. Bryce Kalmbach (U. Washington)
19. Andrés Alejandro Plazas Malagón (Princeton): Will be joining the DM calibration products production team at the start of November
20. Andrei Nomerotski (BNL): EOTEST & ISR in DM
21. Hsin-Fang Chiang (NCSA)
22. Scott Daniel (U. Washington)
23. Melissa Graham (U. Washington)
24. Christopher Waters (Princeton): DM/pipeline version of ISR
25. Jim Chiang (SLAC)
26. Tony Johnson (SLAC): Interested in learning more about the use of DM tools, especially in the context of running EO-like analysis on the camera diagnostic cluster (Tuesday+Wednesday only)
27. Robert Lupton (Princeton)
28. Stuart Marshall (SLAC): Hopes to learn to work with TS8 data using DM tools
29. Andy Connolly (U. Washington): DM/pipeline version of ISR
30. Seth Digel (SLAC)
31. David Thomas (Stanford): AOS
32. Eric Charles (SLAC)

TBC: Andy Connolly, Patrick Ingraham, Jeff Carlin

Others contacted who are unable to attend: Chris Walter, Chris Stubbs, Michael Wood-Vasey, Paul O'Connor, Johann Cohen-Tanugi, Robert G.


Agenda

General plan for full days: start at 9am, lunch at noon, close at 5pm, with coffee/tea breaks at 10:30am and 3pm.


Tuesday 13 Nov
  • Morning: Open
  • Afternoon: Intro, 14:00-17:00
      • Terminology: matching up terms
      • An introduction to the DM data model
      • Introduction to ISR
      • Verification of accounts and accessibility

Wednesday 14 Nov
  • Morning: Focused Hack
  • Afternoon: Focused Hack

Thursday 15 Nov
  • Morning: Focused Hack
  • Afternoon: Focused Hack

Friday 16 Nov
  • Morning: Coda, 9-noon
  • Afternoon: Open

Proposed Tasks

Tasks are listed below with the organizer/contact for each.

1. Use the science platform hosted at NCSA and DM tools (CPP and ISR) with IR2 Camera test data to reproduce EOTest results, and compare in detail with the Camera software chain (which is based on the DM stack; see the hack day slides linked below). The following entries are subtasks. Start with the gain determination, comparing 55Fe and PTC using both the current standard Camera software suite and the DM tools. Then check non-linearity. Then work through as much of the EO test set as possible. See the Camera EO Test Package and EO Test Plan links in the References below for details. Ingest Camera-generated bad pixel masks. (Contact: Steve Ritz)
1a. Determine PTC gains using DM primitives (IsrTask, assemble images, calculate means and variances). (Contact: Steve Ritz)
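To illustrate the mean-variance relation behind 1a, here is a minimal NumPy sketch of a single-point photon-transfer gain estimate from a pair of flats. It stands in for the DM primitives named above, not for them; the function name and interface are hypothetical.

```python
import numpy as np

def ptc_gain(flat1, flat2):
    """Single-point photon-transfer gain estimate (e-/ADU).

    For Poisson-dominated signal, gain ~= mean / variance. Taking the
    variance from the difference of two equal-exposure flats cancels
    fixed-pattern noise; the difference variance is twice the
    per-image shot-noise variance.
    """
    flat1 = np.asarray(flat1, dtype=float)
    flat2 = np.asarray(flat2, dtype=float)
    mean = 0.5 * (flat1.mean() + flat2.mean())
    shot_var = np.var(flat1 - flat2) / 2.0
    return mean / shot_var
```

A full PTC analysis would repeat this at many exposure levels and fit the mean-variance curve rather than using a single point.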

1b. Determine 55Fe gains using DM (Merlin to supply workflow). (Contact: Steve Ritz)

Fe55 analysis outline:
  • Run ISR with nearly everything turned off
  • For each amp in the CCD:
      • Background subtract (KSK: I'm worried about this step because of discontinuities between amps; it could be OK if done amp by amp.)
      • Detect footprints
      • Grow the footprints (grow = [0, 1, 2])
      • Sum the flux in each footprint
      • Histogram the footprint fluxes
      • Fit the histogram peak with a Gaussian
      • Profit
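A toy version of the workflow above, in plain NumPy rather than DM footprint tools. The 1620 e- figure is the standard yield of a 5.9 keV K-alpha photon in silicon (~3.65 eV per electron-hole pair); the function name, default threshold, and histogram-peak shortcut (instead of a Gaussian fit) are simplifying assumptions.

```python
import numpy as np

# ~5.9 keV K-alpha photon / ~3.65 eV per electron-hole pair in silicon.
FE55_ELECTRONS = 1620.0

def fe55_gain(image, threshold=50.0, grow=1):
    """Toy Fe55 gain estimate following the outline above.

    `image` is a background-subtracted single-amp array in ADU. Find
    local maxima above `threshold`, sum the flux in a (2*grow+1)^2 box
    around each hit, histogram the sums, and take the histogram peak
    as the K-alpha line in ADU; a real analysis would fit a Gaussian
    to the peak instead. The brute-force loop is for clarity, not speed.
    """
    img = np.asarray(image, dtype=float)
    ny, nx = img.shape
    fluxes = []
    for y in range(grow, ny - grow):
        for x in range(grow, nx - grow):
            box = img[y - grow:y + grow + 1, x - grow:x + grow + 1]
            # A candidate X-ray hit is a local maximum above threshold.
            if img[y, x] > threshold and img[y, x] == box.max():
                fluxes.append(box.sum())
    counts, edges = np.histogram(fluxes, bins=50)
    peak = np.argmax(counts)
    peak_adu = 0.5 * (edges[peak] + edges[peak + 1])
    return FE55_ELECTRONS / peak_adu
```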
1c. Compare results from the two methods and investigate differences (also see this work by Seth Digel, who will likely post more information here). Then also compare with the standard EO Test results, linked for the data runs below. (Contact: Steve Ritz)
1d. Use the results of 1a and 1b to look at non-linearity distributions and compare in detail with the standard EO Test results. (Contact: Steve Ritz)
1e. Use the overscans to estimate serial and parallel CTE. Start with the test description below. Compare with the standard EO Test results. (Contact: Steve Ritz)
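One common overscan-based approach for 1e is the EPER (extended-pixel edge response) method. The sketch below estimates serial CTI from the first overscan column of a bias-subtracted flat; the function name and interface are illustrative, not the EO test code.

```python
import numpy as np

def serial_cti(segment, n_imaging_cols):
    """EPER estimate of serial charge-transfer inefficiency (CTI).

    `segment` is a bias-subtracted 2-D flat-field array whose columns
    0..n_imaging_cols-1 are the imaging region and whose remaining
    columns are serial overscan. Charge deferred into the first
    overscan column, relative to the last imaging column and divided
    by the number of serial transfers, gives the per-transfer CTI;
    CTE = 1 - CTI.
    """
    seg = np.asarray(segment, dtype=float)
    last_imaging = seg[:, n_imaging_cols - 1].mean()
    deferred = seg[:, n_imaging_cols].mean()
    return deferred / (last_imaging * n_imaging_cols)
```

The parallel CTI estimate is analogous, using rows and the parallel overscan instead of columns.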
1f. Determine the noise distributions for each amplifier. (Contact: Steve Ritz)
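For 1f, a simple per-amplifier read-noise estimate can come from a stack of bias frames. This NumPy sketch (the function name is hypothetical) uses the pixel-wise standard deviation across the stack and reports the median over pixels, in ADU.

```python
import numpy as np

def read_noise_adu(bias_frames):
    """Per-amplifier read noise in ADU from a stack of bias frames.

    Computes the pixel-wise standard deviation across the stack (robust
    to fixed-pattern structure, which is constant along that axis) and
    summarizes with the median over pixels. Multiply by the gain from
    1a/1b to express the result in electrons.
    """
    stack = np.asarray(bias_frames, dtype=float)
    per_pixel_std = stack.std(axis=0, ddof=1)
    return float(np.median(per_pixel_std))
```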
1g. If everything else above is done (which is very unlikely), start looking at the bad pixel determinations by category (see the EO test description for details). (Contact: Steve Ritz)
2. Create a Getting Started page for people with Camera expertise, posting links to related notebooks. (Contact: Steve Ritz)
3. Set up infrastructure to support development of algorithms to remove effects of deferred signal in "high-bias" ITL sensors. (Contact: Steve Ritz)
4. Port this script to use DM routines; see this function for inspiration. (Contact: Simon Krughoff)
5. Produce a document that describes the process of ingesting TS8 data into an LSST data repository. (Contact: Simon Krughoff)
6. Identify a calibration production step missing from the DM calibration products production pipelines and implement a prototype using DM primitives. (Contact: Simon Krughoff)
7. Generate a porting guide for eotest routines: e.g., the functionality here is essentially duplicated here. (Contact: Simon Krughoff)

Pre-bootcamp checklist for attendees

(1) SLAC is requesting that all non-SLAC employees who do not have SLAC badges fill out and return the attached DOE FACTS Questionnaire. Please fill it out, save it as a PDF, and email it to: regina@slac.stanford.edu

We need to receive your DOE FACTS Questionnaire ***by November 9, 2018***. Please contact Regina Matter if you have any questions.

(2) Join the #dm-bootcamps channel in slack. That is where updates and discussion for this bootcamp are being posted.

(3) Make sure you have access to the notebook aspect of the LSST Science Platform (LSP) at the LSST Data Facility (LDF), i.e., NCSA.
* If you have an account on any of the resources at the LDF, e.g., lsst-dev.ncsa.illinois.edu, you already have an account on the LSP.
* If you do not (or don't know if you) have an account, contact Simon Krughoff (SKrughoff@lsst.org)

(4) Once an account is established, attempt to connect to https://lsst-lspdev.ncsa.illinois.edu/nb.
* You will need to activate two-factor authentication and connect through the NCSA VPN.  Instructions can be found at https://nb.lsst.io.
* You will need to start the VPN and then log in to the service via CILogon. Both require two-factor authentication. Both will use the same credentials.

(5) Upon successful connection to the LSP
* See documentation for the notebook aspect of the LSP: https://nb.lsst.io
* Check out the repo for the bootcamps: https://github.com/lsst/bootcamp-work
* Attempt to run the example notebook: https://github.com/lsst/bootcamp-work/blob/master/examples/welcome_to_FE55.ipynb

(6) Background reading and task list. We expect that most bootcamp participants will start the hack days with gain and linearity measurements and then move on to other EO tests as time allows. The proposed task list and links to many documentation sources are available below.

Please list your name under the different tasks on this page if you have a preference. Based on your expressed interest, we'll start forming hack groups this week so that we can get straight to work next Wednesday.

If you run into any trouble or have questions, feel free to use the #dm-bootcamps channel for discussion or contact any of the organizers directly.

Background Reading / References

Slides:

Papers / documents / notes:

Notebooks:

Source code:

  • A repository to hold the description for the LSST 3.2 GPix camera. https://github.com/lsst/obs_lsstCam
  • Code to produce calibration products, required to perform ISR and other calibration tasks. https://github.com/lsst/cp_pipe
  • The ip_isr package provides Instrument Signature Removal related tasks. ISR includes steps such as combining multiple amplifiers into one full CCD image, corrections for overscans, crosstalk, bias and dark frames, and the creation of variance and mask planes. https://github.com/lsst/ip_isr
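To make the ISR steps listed above concrete, here is a heavily simplified single-amp sketch covering overscan correction, bias subtraction, and a variance plane. It is illustrative NumPy, not the ip_isr API, and the gain-of-1 variance model is an assumption.

```python
import numpy as np

def minimal_isr(raw, n_imaging_cols, master_bias):
    """Illustrative (not ip_isr) overscan + bias correction for one amp.

    `raw` has imaging columns 0..n_imaging_cols-1 followed by serial
    overscan columns. Subtracts the per-row median overscan level,
    trims the overscan, subtracts a master bias frame, and builds a
    simple Poisson variance plane assuming a hypothetical gain of
    1 e-/ADU plus the overscan (read-noise) variance.
    """
    raw = np.asarray(raw, dtype=float)
    # Per-row serial-overscan level, subtracted from the whole row.
    overscan = np.median(raw[:, n_imaging_cols:], axis=1, keepdims=True)
    trimmed = raw[:, :n_imaging_cols] - overscan
    debiased = trimmed - master_bias
    variance = np.clip(debiased, 0.0, None) + np.var(raw[:, n_imaging_cols:])
    return debiased, variance
```

The real pipeline adds the remaining steps (crosstalk, dark, flat, defect masking, amp assembly) as configurable stages.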

Data:
