
Introduction

We will be holding a bootcamp to bring together members of the camera, commissioning, and data management teams to explore instrument signature removal (ISR) in the context of all three teams.

Slack channel: #dm-bootcamps

Goals

  • To empower people to work with calibration-related data and pipelines using the LSP as the primary interface.
  • To explore how algorithms and tooling can be better integrated among the LSST team.
  • To conduct comparisons of how calibration products are generated and applied via various input data and processing pipelines.


The following goals, put forward by each of the participating groups, expand in more detail on the high-level goals above.

Camera: 

  • Take specific steps toward using DM tools to analyze Camera test data, with the eventual goal of complete use of DM tools for Camera reverification. Start with specific EOtest tasks and compare with the existing Camera analysis chain results. Build from there. (See the tasks below.)
  • Produce Getting Started documentation for others to follow.
  • Make this a common, standard platform for people interested in looking at Camera test data.
  • Transition from "Camera team", "DM team", "Commissioning team", ... toward one LSST team.

Data Management:

  • To better understand the status of algorithms for ISR in both the DM and Camera ecosystems. This includes calibration product production as well as application.
  • To compare an example calibration exercise using both Camera and DM code.
  • To understand the best route toward unifying code between Camera and DM, and the extent to which that is possible.

Commissioning:

  • To be able to run an example of generating a calibration product and applying it as part of ISR, using the DM stack running on the LSP with relevant camera data hosted at NCSA. Start with the simplest end-to-end test.
  • To create a baseline of documentation from which to spread knowledge of the ISR implementation.

Logistics

When: 14, 15 Nov. 2018 with space available on the afternoon of 13 Nov. and morning of 16 Nov.

Where: Redwood Conference Room, SLAC National Accelerator Laboratory

We will have from 10 to 20 rooms reserved at the SLAC Guest House. This block is now fully booked. Please book a hotel ASAP.

Other suggested hotels: Comfort Inn, Atherton Inn

For practical reasons, we are not supporting remote participation via BlueJeans for the entire meeting. (During the meeting, we expect to reach out with specific questions via Slack and can arrange specific BlueJeans calls if needed.) Our goal is that the products of the bootcamp will leave enough breadcrumbs and documentation that others can quickly get up to speed. We are planning a series of bootcamps, each focused on a different topic, so there will be many future opportunities to participate.

LOGISTICAL INFO for those of you who are new to SLAC:

Address: SLAC National Accelerator Laboratory
Building 48/ Redwood Conference room
2575 Sand Hill Road
Menlo Park, CA  94025

See the attached map of SLAC (Building 48 and parking are highlighted).

Directions to SLAC:
https://vue.slac.stanford.edu/maps-and-directions

If you need any additional assistance, please contact Regina Matter, who has kindly agreed to help:
Regina@slac.stanford.edu
650-926-3783

Wi-Fi: eduroam works at SLAC. If you don't have eduroam, you can sign on to the SLAC visitor wifi, after agreeing to the conditions.


Attendees

Please add your name if you are interested in participating in the bootcamp. We sized the bootcamp for 40 total attendees, though we do not necessarily need to be that large and might not have the budget for that many travelers. The Nov 2018 bootcamp is specifically focused on ISR. We expect to have a series of bootcamps that focus on different themes and are intended to be independent of each other (i.e., one does not need to participate in this bootcamp to participate in future bootcamps on other topics).


Name – Institution – Notes/specific interests:

  1. Simon Krughoff – LSST/AURA – I want to continue to find out how best to use the LSP
  2. Leanne Guy – LSST/AURA – LSST subsystem integration, feedback on the DM system, cross-subsystem verification and validation
  3. Brian Stalder – LSST/AURA – ComCam integration, verification
  4. Tony Tyson – UCD-LSST – ComCam integration, ISR, verification, metrics for SV
  5. Keith Bechtol – UW-Madison – Get the commissioning team looking at data from LSST hardware using the interfaces we expect to use during commissioning
  6. Rob Morgan – UW-Madison – Interested in photometric calibration and associated calibration products
  7. Craig Lage – UC Davis – ISR, especially brighter-fatter
  8. Andrew Bradshaw – UC Davis – Camera systematics and weak lensing
  9. Imran Hasan – UC Davis – Weak lensing
  10. Colin Slater – U. Washington – DM tooling for commissioning/camera analysis
  11. Bo Xin – LSST/AURA – Sensor characterization, ISR
  12. Michael Reuter – LSST/AURA – Understand sensor characterization and how to apply knowledge during commissioning
  13. Emily Phillips Longley – Duke – Confirmed KB; eotest and ISR, results access and displays/plotting
  14. Steve Ritz – UCSC – EOTest and ISR
  15. Duncan Wood – UCSC – EOTest and ISR
  16. Pierre Antilogus – LPNHE-IN2P3 – Strong interest in pixel / eotest-like / ISR operation in the DM framework
  17. Bela Albolfathi – UC-Irvine / SLAC – EOTest and ISR
  18. Bryce Kalmbach – U. Washington
  19. Andrés Alejandro Plazas Malagón – Princeton – Will be joining the DM calibration products production team at the start of November
  20. Andrei Nomerotski – BNL – EOTest & ISR in DM
  21. Hsin-Fang Chiang – NCSA
  22. Scott Daniel – U. Washington
  23. Melissa Graham – U. Washington
  24. Christopher Waters – Princeton – DM/pipeline version of ISR
  25. Jim Chiang – SLAC
  26. Tony Johnson – SLAC – Interested in learning more about use of DM tools, especially in the context of running EO-like analysis on the camera diagnostic cluster (Tuesday + Wednesday only)
  27. Robert Lupton – Princeton
  28. Stuart Marshall – SLAC – Hope to learn to work with TS8 data using DM tools
  29. Andy Connolly – U. Washington – DM/pipeline version of ISR
  30. Seth Digel – SLAC
  31. David Thomas – Stanford – AOS
  32. Eric Charles – SLAC
  33. Yousuke Utsumi – SLAC
  34. Homer Neal – SLAC
  35. Aaron Roodman – SLAC – EO testing
  36. Andy Rasmussen – SLAC – EO, ISR, PSF estimation, TS8 data using DM tools


TBC: Andy Connolly, Patrick Ingraham, Jeff Carlin

Others contacted who are unable to attend: Chris Walter, Chris Stubbs, Michael Wood-Vasey, Paul O'Connor, Johann Cohen-Tanugi, Robert G.


Agenda

The general plan for full days is to start at 9am, break for lunch at noon, and close at 5pm, with coffee/tea at 10:30am and 3pm.


Tuesday 13 Nov
  • Morning: Open
  • Afternoon: Intro, 14:00–17:00

Wednesday 14 Nov and Thursday 15 Nov
  • 08:00 – Refreshments
  • 09:00 – Welcome
  • 09:45 – Work in small teams
  • 10:30 – Coffee break
  • 11:00 – Work in small teams
  • 12:30 – Lunch
  • 13:30 – Work in small teams
  • 15:00 – Coffee break
  • 15:30 – Work in small teams
  • 16:30 – Review work
  • 17:00 – Adjourn

Friday 16 Nov
  • Morning, 9:00–noon (Coda): Actions, Future work, Close out
  • Afternoon: Open

Proposed Tasks

Spreadsheet of hack groups and task assignments (still preliminary)

Slide deck for day one progress

Slide deck for day two close-out

Task list (organizer/contact in parentheses; interested attendees, necessary data, and additional info/links to be filled in):

  1. Use the science platform hosted at NCSA and DM tools (CPP and ISR) with IR2 Camera test data to reproduce EOTest results, and compare in detail with the Camera software chain (which is based on the DM stack; see the hack day slides linked below). The following entries show subtasks. Start with the gain determination, comparing 55Fe and PTC using both the current standard Camera software suite and the DM tools. Then check non-linearity. Then work through as much of the EO test set as possible. See the Camera EO Test Package and EO Test Plan links in the References below for details. Ingest Camera-generated bad pixel masks. (Steve Ritz)
  1a. Determine PTC gains using DM primitives (IsrTask, assemble images, calculate means and variances). (Steve Ritz)
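A minimal illustration of the mean–variance arithmetic behind 1a, in plain numpy on simulated data. This is a sketch only: the function name and the simulated flat pair are invented here, and a real run would compute the statistics from IsrTask-processed flat pairs instead.

```python
import numpy as np

def ptc_gain(flat1, flat2):
    """Estimate gain (e-/ADU) from one flat pair.

    Differencing the two flats cancels fixed-pattern structure; the
    variance of the difference is twice the single-frame shot-noise
    variance. For Poisson statistics, gain = mean / variance.
    """
    mean = 0.5 * (flat1.mean() + flat2.mean())
    var = np.var(flat1 - flat2) / 2.0
    return mean / var

# Simulate a flat pair: ~50k electrons/pixel at a true gain of 0.7 e-/ADU.
rng = np.random.default_rng(42)
true_gain = 0.7
flat1 = rng.poisson(50_000, size=(500, 500)) / true_gain  # ADU
flat2 = rng.poisson(50_000, size=(500, 500)) / true_gain  # ADU
g = ptc_gain(flat1, flat2)
print(f"estimated gain: {g:.3f} e-/ADU")
```

In a full PTC analysis one would repeat this over many flux levels and fit variance vs. mean; the single-pair ratio above is the shot-noise-limited shortcut.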

  1b. Determine 55Fe gains using DM (Merlin to supply workflow). (Steve Ritz)

  Fe55 analysis workflow:
    • Run ISR with ~everything off
    • For each amp in the CCD:
      • Background subtract
      • Footprint detect
      • Grow footprints (grow = [0, 1, 2])
      • Sum flux in each footprint
      • Histogram the sums
      • Fit with a Gaussian
      • Profit
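The histogram-and-Gaussian-fit step of the workflow above can be sketched with numpy and scipy on simulated cluster sums. Illustrative only: the function names and the simulation are invented here, and it assumes the commonly used figure of ~1620 electrons liberated in silicon by a 5.9 keV 55Fe K-alpha photon.

```python
import numpy as np
from scipy.optimize import curve_fit

K_ALPHA_ELECTRONS = 1620  # approx. e- from a 5.9 keV Fe55 K-alpha photon in Si

def gaussian(x, amp, mu, sigma):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def fe55_gain(cluster_sums_adu, nbins=100):
    """Fit a Gaussian to the K-alpha peak of summed footprint fluxes
    and convert the peak position (ADU) to a gain in e-/ADU."""
    counts, edges = np.histogram(cluster_sums_adu, bins=nbins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p0 = [counts.max(), centers[np.argmax(counts)], np.std(cluster_sums_adu)]
    (amp, mu, sigma), _ = curve_fit(gaussian, centers, counts, p0=p0)
    return K_ALPHA_ELECTRONS / mu

# Simulate K-alpha cluster sums at a true gain of 0.7 e-/ADU.
rng = np.random.default_rng(0)
sums = rng.normal(K_ALPHA_ELECTRONS / 0.7, 25.0, size=5000)
gain = fe55_gain(sums)
print(f"estimated gain: {gain:.3f} e-/ADU")
```

A real analysis would also need to separate the K-alpha and K-beta peaks and restrict the fit window accordingly.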
  1c. Compare results from the two methods and investigate differences (also see this work by Seth Digel, who will likely post more information here). Then also compare with the standard EO Test results, linked for the data runs below. (Steve Ritz)
  1d. Use the results of 1a and 1b to look at non-linearity distributions and compare in detail with the standard EO Test results. (Steve Ritz)
  1e. Use the overscans to estimate serial and parallel CTE. Start with the test description below. Compare with standard EO Test results. (Steve Ritz)
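For 1e, a much-simplified EPER-style (extended pixel edge response) estimate of serial CTI can be sketched in numpy. This is not the eotest implementation; the function, its defaults, and the toy frame are invented for illustration, and it assumes at least ~10 far-overscan columns to measure the bias level.

```python
import numpy as np

def serial_cti_eper(image, ncols_imaging, n_overscan_sum=2):
    """Simplified EPER estimate of serial CTI.

    image: 2-D array with the serial direction along axis 1 and the
    serial overscan in the columns after `ncols_imaging`.
    CTI ~ (deferred charge in the first overscan columns) /
          (signal in the last imaging column * number of transfers).
    """
    overscan = image[:, ncols_imaging:]
    bias = overscan[:, -10:].mean()  # far overscan gives the bias level
    last_col = image[:, ncols_imaging - 1].mean() - bias
    deferred = (overscan[:, :n_overscan_sum] - bias).sum(axis=1).mean()
    return deferred / last_col / ncols_imaging

# Toy frame: 512 imaging columns at 10000 ADU, 32 overscan columns at a
# bias of 1000 ADU, with 5 ADU of deferred charge in the first one.
frame = np.full((100, 544), 1000.0)
frame[:, :512] += 9000.0
frame[:, 512] += 5.0
cti = serial_cti_eper(frame, 512)
print(f"serial CTI ~ {cti:.2e}")
```

CTE is then 1 - CTI; the parallel estimate is the same arithmetic applied along the other axis with the parallel overscan rows.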
  1f. Determine the noise distributions for each amplifier. See total_noise_histograms.py and read_noise.py. For example plots, see https://lsst-camera.slac.stanford.edu/DataPortal/SummaryReport.jsp?run=5943D&dataSourceMode=Dev (Steve Ritz)
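One common way to estimate per-amplifier read noise, sketched in plain numpy on simulated bias frames. Illustrative only; the scripts linked above define the actual procedure, and the function name here is invented.

```python
import numpy as np

def read_noise_adu(bias1, bias2):
    """Estimate read noise (ADU) from a pair of bias frames.

    Differencing removes fixed structure; dividing by sqrt(2) restores
    the single-frame noise. Scaling the median absolute deviation by
    1.4826 gives a Gaussian-equivalent sigma that is robust to cosmic
    rays and hot pixels.
    """
    diff = (bias1 - bias2).ravel()
    mad = np.median(np.abs(diff - np.median(diff)))
    return 1.4826 * mad / np.sqrt(2.0)

# Simulate two bias frames with 5 ADU of Gaussian read noise.
rng = np.random.default_rng(1)
b1 = rng.normal(1000.0, 5.0, size=(200, 200))
b2 = rng.normal(1000.0, 5.0, size=(200, 200))
rn = read_noise_adu(b1, b2)
print(f"read noise ~ {rn:.2f} ADU")
```

Multiplying by the gain from 1a or 1b converts the result to electrons.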
  1g. If everything else above is done (which is very unlikely), start looking at the bad pixel determinations by category (see the EO test description for details). (Steve Ritz)
  2. Getting started page for people with Camera expertise, posting links to related notebooks. (Steve Ritz)
  3. Set up infrastructure to support development of algorithms to remove effects of deferred signal in "high-bias" ITL sensors. (Steve Ritz)

  • Port this script to use DM routines. See this function for inspiration. (Simon Krughoff)
  • Produce a document that describes the process of ingesting TS8 data into an LSST data repository. (Simon Krughoff)
  • Identify a calibration production step missing from the DM calibration products production pipelines and implement a prototype using DM primitives. (Simon Krughoff)
  • Generate a porting guide for eotest routines: e.g., the functionality here is essentially duplicated here. (Simon Krughoff)

Pre-bootcamp checklist for attendees

(1) SLAC is requesting that all non-SLAC employees who do not have SLAC badges fill out the attached DOE FACTS Questionnaire, save it as a PDF, and email it to: regina@slac.stanford.edu

We need to receive your DOE FACTS Questionnaire ***by November 9, 2018***. Please contact Regina Matter if you have any questions.

(2) Join the #dm-bootcamps channel in slack. That is where updates and discussion for this bootcamp are being posted.

(3) Make sure you have access to the notebook aspect of the LSST Science Platform (LSP) at the LSST Data Facility (LDF), i.e., NCSA.
* If you have an account on any of the resources at the LDF, e.g., lsst-dev.ncsa.illinois.edu, you already have an account on the LSP.
* If you do not (or don't know if you) have an account, contact Simon Krughoff (SKrughoff@lsst.org)

(4) Once an account is established, attempt to connect to https://lsst-lspdev.ncsa.illinois.edu/nb.
* You will need to activate two-factor authentication and connect through the NCSA VPN.  Instructions can be found at https://nb.lsst.io.
* You will need to start the VPN and then log in to the service via CILogon. Both require two-factor authentication. Both will use the same credentials.

(5) Upon successful connection to the LSP
* See documentation for the notebook aspect of the LSP: https://nb.lsst.io
* Check out the repo for the bootcamps: https://github.com/lsst/bootcamp-work
* Attempt to run the example notebook: https://github.com/lsst/bootcamp-work/blob/master/examples/welcome_to_FE55.ipynb

(6) Background reading and task list. We expect that most bootcamp participants will be starting the hack days with gain and linearity measurements, and then move on to other EO tests as time allows. The proposed task list and links to many documentation sources are available below.

Please visit the Confluence page to list your name under the different tasks if you have a preference. Based on your expressed interest, we'll start forming hack groups this week so that we can get straight to work next Wednesday.

If you run into any trouble or have questions, feel free to use the #dm-bootcamps channel for discussion or contact any of the organizers directly.

Background Reading / References

Slides:

Papers / documents / notes:

Notebooks:

Source code:

  • A repository to hold the description for the LSST 3.2 GPix camera. https://github.com/lsst/obs_lsstCam
  • Code to produce calibration products, required to perform ISR and other calibration tasks. https://github.com/lsst/cp_pipe
  • The ip_isr package provides Instrument Signature Removal related tasks. ISR includes steps such as combining multiple amplifiers into one full CCD image, corrections for overscans, crosstalk, bias and dark frames, and the creation of variance and mask planes. https://github.com/lsst/ip_isr
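As a toy illustration of one of the ISR steps listed above, here is an overscan correction and trim in plain numpy. The real implementation is IsrTask in ip_isr; the function name and array shapes here are invented for the sketch.

```python
import numpy as np

def overscan_correct_and_trim(raw, ncols_imaging):
    """Subtract a per-row overscan level and trim to the imaging region.

    raw: 2-D amp segment with serial overscan columns at the end.
    A per-row median of the overscan tracks slow bias drift along the
    parallel direction; subtracting it removes the bias level.
    """
    overscan = raw[:, ncols_imaging:]
    row_bias = np.median(overscan, axis=1, keepdims=True)
    return raw[:, :ncols_imaging] - row_bias

raw = np.full((4, 10), 1500.0)  # 8 imaging + 2 overscan columns, bias 1500
raw[:, :8] += 250.0             # 250 ADU of signal in the imaging region
corrected = overscan_correct_and_trim(raw, 8)
print(corrected.shape, corrected.mean())  # (4, 8) 250.0
```

IsrTask performs this per amplifier (with configurable fit types for the overscan model) before assembling amps into a full CCD image and applying the remaining corrections.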

Data:

Technical Notes

Setting up obs_lsst

Before running the notebooks in bootcamp-work, you'll need to set up the obs_lsst package. If you don't, you will get an error similar to "ModuleNotFoundError: No module named 'lsst.obs.lsst'". You will only need to do this once.

Step-by-step instructions:

  1. Start a terminal in JupyterLab. In the terminal, set up the Stack with the command source /opt/lsst/software/stack/loadLSST.bash and then issue the command setup lsst_distrib to allow you to run scons in a subsequent step.

  2. Create and/or switch into a folder where you want to put your local versions of the LSST Stack (e.g., ~/repos). Run the following commands:

    git clone https://github.com/lsst/obs_lsstCam.git
    cd obs_lsstCam
    setup -j -r .
    scons
  3. Add setup -k -r path_to_repos/obs_lsstCam to $HOME/notebooks/.user_setups. Note that "path_to_repos" should be replaced by the directory where you put the obs_lsstCam package. By adding the setup command to this file, all future notebooks you start will automatically set up obs_lsstCam.

  4. Restart your kernel using the Restart Kernel and Clear All Outputs option under the Kernel menu item.

Code repository

For this bootcamp, we will be doing development in https://github.com/lsst/bootcamp-work

To clone this repository, open a terminal from the File menu item: File → New → Terminal. Then execute the following code block.

cd ~/notebooks/
git clone https://github.com/lsst/bootcamp-work.git

Now you will be able to select the notebooks from the file browser by navigating to notebooks → bootcamp-work → examples.

Stack version

We will be using Stack version w_2018_45, meaning "weekly release 45 of the year 2018". You can select the Stack version when starting the LSST Science Platform (LSP).

git workflow

As a starting point, it may be helpful to review the LSST DM git workflow and best practices.

During the bootcamp, we should plan to do development on separate branches for our different projects:

  • Concept and naming convention for user branches:  `u/{{username}}/{{topic}}`
  • Committing and pushing changes
git add file_you_want_to_add
git commit -m "your commit message; see best practices linked above"
git push -u origin name_of_your_branch


  • Making a pull request on github
  • Practice doing code reviews at the end of each day as a group (?)
  • Merging:
git checkout master
git pull  # Sanity check; rebase ticket if master was updated.
git merge --no-ff name_of_your_branch
git push

Shared datasets

There is a shared space that we should all have access to. This folder is for datasets of general interest.

/project/shared/data

There is also a user folder for sharing:

/project/username
