We will hold a bootcamp to bring together members of the camera, commissioning, and data management teams to explore instrument signature removal (ISR) in the context of all three teams.
Slack channel: #dm-bootcamps
The following goals, put forward by each of the participating groups, expand in more detail on the high-level goals above.
Camera:
- Take specific steps toward using DM tools to analyze Camera test data, with the eventual goal of complete use of DM tools for Camera reverification. Start with specific EOtest tasks and compare with the existing Camera analysis chain results. Build from there. (See tasks below)
- Produce Getting Started documentation for others to follow.
- Make this a common, standard platform for people interested in looking at Camera test data.
- Transition from "Camera team", "DM team", "Commissioning team", ... toward LSST team.
Data Management:
- To better understand the status of ISR algorithms in both the DM and Camera ecosystems, including both calibration product production and application.
- To compare an example calibration exercise with both Camera and DM code
- To understand the best route toward unifying code between Camera and DM, and the extent to which that's possible.
Commissioning:
- To be able to run an example of generating a calibration product and applying that calibration product as part of ISR using the DM stack running on the LSP with relevant camera data hosted at NCSA. Start with simplest end-to-end test.
- Create a baseline of documentation from which to spread knowledge of the ISR implementation.
When: 14–15 Nov. 2018, with space available on the afternoon of 13 Nov. and the morning of 16 Nov.
Where: Redwood Conference Room, SLAC National Accelerator Laboratory
We will have from 10 to 20 rooms reserved at the SLAC Guest House. This block is now in use, so please book your hotel ASAP.
Other suggested hotels: Comfort Inn, Atherton Inn
For practical reasons, we are not supporting remote participation via BlueJeans for the entire meeting. (During the meeting, we expect to reach out with specific questions via Slack and can set up specific BlueJeans calls if needed.) Our goal is that the products of the bootcamp will leave enough breadcrumbs and documentation that others can quickly get up to speed. We are planning a series of bootcamps, each focused on a different topic, so there will be many future opportunities to participate.
LOGISTICAL INFO for those of you who are new to SLAC:
Address: SLAC National Accelerator Laboratory
Building 48/ Redwood Conference room
2575 Sand Hill Road
Menlo Park, CA 94025
See the attached map of SLAC (Building 48 and parking are highlighted).
Directions to SLAC: https://vue.slac.stanford.edu/maps-and-directions
If you need any additional assistance, please contact Regina Matter, who has kindly agreed to help:
Regina@slac.stanford.edu
650-926-3783
Wi-Fi: eduroam works at SLAC. If you don't have eduroam, you can sign on to the SLAC visitor Wi-Fi after agreeing to the conditions.
Attendees
Please add your name if you are interested in participating in the bootcamp. We sized the bootcamp for 40 attendees total, though we do not necessarily need to be that large and might not have the budget for that many travelers. The Nov 2018 bootcamp is specifically focused on ISR. We expect to have a series of bootcamps that focus on different themes and are intended to be independent of each other (i.e., one does not need to participate in this bootcamp to participate in future bootcamps on other topics).
# | Name | Institution | Notes/specific interests | Confirmed | Intro | Coda |
---|---|---|---|---|---|---|
1 | Simon Krughoff | LSST/AURA | I want to continue to find out how best to use the LSP | |||
2 | Leanne Guy | LSST/AURA | LSST subsystem integration, feedback on the DM system, cross subsystem verification and validation | |||
3 | Brian Stalder | LSST/AURA | ComCam integration, verification | |||
4 | Tony Tyson | UCD-LSST | ComCam integration, ISR, verification, metrics for SV | |||
5 | Keith Bechtol | UW-Madison | Get the commissioning team looking at data from LSST hardware using the interfaces we expect to use during commissioning | |||
6 | Rob Morgan | UW-Madison | Interested in photometric calibration and associated calibration products | |||
7 | Craig Lage | UC Davis | ISR, especially brighter-fatter. | |||
8 | Andrew Bradshaw | UC Davis | Camera systematics and weak lensing | |||
9 | Imran Hasan | UC Davis | Weak lensing | |||
10 | Colin Slater | U. Washington | DM tooling for commissioning/camera analysis. | |||
11 | Bo Xin | LSST/AURA | Sensor characterization, ISR | |||
12 | Michael Reuter | LSST/AURA | Understand sensor characterization and how to apply knowledge during commissioning | |||
13 | Emily Phillips Longley | Duke | Confirmed KB, eotest and ISR, results access and displays/plotting | |||
14 | Steve Ritz | UCSC | EOTEST and ISR | |||
15 | Duncan Wood | UCSC | EOTEST and ISR | |||
16 | Pierre Antilogus | LPNHE-IN2P3 | Strong interest in pixel / eotest like / ISR operation in DM framework | |||
17 | Bela Albolfathi | UC-Irvine / SLAC | EOTEST and ISR | |||
18 | Bryce Kalmbach | U. Washington | ||||
19 | Andrés Alejandro Plazas Malagón | Princeton | Will be joining the DM calibration products production team at the start of November. | |||
20 | Andrei Nomerotski | BNL | EOTEST & ISR in DM | |||
21 | Hsin-Fang Chiang | NCSA | ||||
22 | Scott Daniel | U. Washington | ||||
23 | Melissa Graham | U. Washington | ||||
24 | Christopher Waters | Princeton | DM/pipeline version of ISR | |||
25 | Jim Chiang | SLAC | ||||
26 | Tony Johnson | SLAC | Interested in learning more about use of DM tools, especially in the context of running EO like analysis on the camera diagnostic cluster. (Tuesday+Wednesday only) | |||
27 | Robert Lupton | Princeton | ||||
28 | Stuart Marshall | SLAC | Hope to learn to work with TS8 data using DM tools. | |||
29 | Andy Connolly | U. Washington | DM/pipeline version of ISR | |||
30 | Seth Digel | SLAC | ||||
31 | David Thomas | Stanford | AOS | |||
32 | Eric Charles | SLAC | ||||
33 | Yousuke Utsumi | SLAC | ||||
34 | Homer Neal | SLAC | ||||
35 | Aaron Roodman | SLAC | EO testing | |||
36 | Andy Rasmussen | SLAC | EO, ISR, PSF estimation, TS8 data using DM tools
TBC: Andy Connolly, Patrick Ingraham, Jeff Carlin,
Others contacted who are unable to attend: Chris Walter, Chris Stubbs, Michael Wood-Vasey, Paul O'Connor, Johann Cohen-Tanugi, Robert G.
General plan for full days: start at 9am, lunch at noon, close at 5pm; coffee/tea at 10:30am and 3pm.
 | Tuesday 13 Nov | Wednesday 14 Nov | Thursday 15 Nov | Friday 16 Nov |
---|---|---|---|---|
Morning | Open | 08:00 – refreshments; 09:00 – Welcome; 09:45 – work in small teams; 10:30 – coffee break; 11:00 – work in small teams | 08:00 – refreshments; 09:00 – Welcome; 09:45 – work in small teams; 10:30 – coffee break; 11:00 – work in small teams | Coda, 09:00–noon |
Afternoon | Intro, 14:00–17:00 | 12:30 – lunch; 13:30 – work in small teams; 15:00 – coffee break; 15:30 – work in small teams; 16:30 – review work; 17:00 – adjourn | 12:30 – lunch; 13:30 – work in small teams; 15:00 – coffee break; 15:30 – work in small teams; 16:30 – review work; 17:00 – adjourn | Open |
Spreadsheet of hack groups and task assignments (still preliminary)
Slide deck for day one progress
Slide deck for day two close-out
Task Name | Interested attendees | Necessary Data | Additional info/links | Organizers/Contacts |
---|---|---|---|---|
1a. Determine PTC gains using DM primitives (IsrTask, assemble images, calculate means and variances). | | | | Steve Ritz |
1b. Determine Fe55 gains using DM (Merlin to supply workflow). Fe55 analysis: | | | | Steve Ritz |
1c. Compare results from the two methods and investigate differences (also see this work by Seth Digel, who will likely post more information here). Then also compare with the standard EO test results, linked for the data runs below. | | | | Steve Ritz |
1d. Use the results of 1a and 1b to look at non-linearity distributions and compare in detail with the standard EO test results. | | | | Steve Ritz |
1e. Use the overscans to estimate serial and parallel CTE. Start with the test description below. Compare with the standard EO test results. | | | | Steve Ritz |
1f. Determine the noise distributions for each amplifier. | | | For example plots, see https://lsst-camera.slac.stanford.edu/DataPortal/SummaryReport.jsp?run=5943D&dataSourceMode=Dev | Steve Ritz |
1g. If everything else above is done (which is very unlikely), start looking at the bad-pixel determinations by category (see the EO test description for details). | | | | Steve Ritz |
2. Getting-started page for people with Camera expertise, posting links to related notebooks. | | | | Steve Ritz |
3. Set up infrastructure to support development of algorithms to remove effects of deferred signal in "high-bias" ITL sensors. | | | | Steve Ritz |
Port this script to use DM routines. See this function for inspiration. | | | | Simon Krughoff |
Produce a document that describes the process of ingesting TS8 data into an LSST data repository. | | | | Simon Krughoff |
Identify a calibration production step missing from the DM calibration products production pipelines and implement a prototype using DM primitives. | | | | Simon Krughoff |
Generate a porting guide for eotest routines: e.g., the functionality here is essentially duplicated here. | | | | Simon Krughoff |
(1) SLAC is requesting that all non-SLAC employees who do not have SLAC badges fill out and return the attached DOE FACTS Questionnaire. Please fill it out, save it as a PDF, and email it to: regina@slac.stanford.edu
We need to receive your DOE FACTS Questionnaire ***by November 9, 2018***. Please contact Regina Matter if you have any questions.
(3) Join the #dm-bootcamps channel in Slack. That is where updates and discussion for this bootcamp are being posted.
(3) Make sure you have access to the notebook aspect of the LSST Science Platform (LSP) at the LSST Data Facility (LDF), i.e., NCSA.
* If you have an account on any of the resources at the LDF, e.g., lsst-dev.ncsa.illinois.edu, you already have an account on the LSP.
* If you do not (or don't know if you) have an account, contact Simon Krughoff (SKrughoff@lsst.org)
(4) Once an account is established, attempt to connect to https://lsst-lspdev.ncsa.illinois.edu/nb.
* You will need to activate two-factor authentication and connect through the NCSA VPN. Instructions can be found at https://nb.lsst.io.
* You will need to start the VPN and then log in to the service via CILogon. Both require two-factor authentication. Both will use the same credentials.
(5) Upon successful connection to the LSP
* See documentation for the notebook aspect of the LSP: https://nb.lsst.io
* Check out the repo for the bootcamps: https://github.com/lsst/bootcamp-work
* Attempt to run the example notebook: https://github.com/lsst/bootcamp-work/blob/master/examples/welcome_to_FE55.ipynb
(6) Background reading and task list. We expect that most bootcamp participants will start the hack days with gain and linearity measurements and then move on to other EO tests as time allows. The proposed task list and links to many documentation sources are available below.
Please visit the confluence page to list your name under the different tasks if you have a preference. Based on your expressed interest, we’ll start forming hack groups this week so that we can get straight to work next Wednesday.
If you run into any trouble or have questions, feel free to use the #dm-bootcamps channel for discussion or contact any of the organizers directly.
Slides:
Papers / documents / notes:
Notebooks:
Source code:
Data:
After successfully looking at RTM-007, we could try RTM-011. Here, use run 5943D (https://lsst-camera.slac.stanford.edu/DataPortal/SummaryReport.jsp?run=5943D&dataSourceMode=Dev ). That would be interesting because we have similar runs for 2 sec and 3 sec readout.
Before running the notebooks in bootcamp-work, you'll need to set up the obs_lsst package. If you don't, you will get an error similar to "ModuleNotFoundError: No module named 'lsst.obs.lsst'". You will only need to do this once.
Step-by-step instructions:

1. Start a terminal in JupyterLab. In the terminal, set up the Stack and then lsst_distrib (needed so you can run scons in a subsequent step):

```
source /opt/lsst/software/stack/loadLSST.bash
setup lsst_distrib
```

2. Create and/or switch into a folder where you want to put your local versions of the LSST Stack (e.g., ~/repos), then run the following commands:

```
git clone https://github.com/lsst/obs_lsstCam.git
cd obs_lsstCam
setup -j -r .
scons
```

3. Add setup -k -r path_to_repos/obs_lsstCam to $HOME/notebooks/.user_setups. Note that "path_to_repos" should be replaced by the directory where you put the obs_lsst package. By adding the setup command to this file, all future notebooks you start will automatically load obs_lsst.
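The .user_setups edit above can also be done from the terminal. This sketch keeps the literal "path_to_repos" placeholder from the text, which you should replace with your actual clone location:

```shell
# Sketch: make obs_lsstCam load automatically in new notebooks by appending
# the EUPS setup line to the LSP user-setups file.
# "path_to_repos" is the placeholder from the text above.
mkdir -p "$HOME/notebooks"
echo 'setup -k -r path_to_repos/obs_lsstCam' >> "$HOME/notebooks/.user_setups"
grep obs_lsstCam "$HOME/notebooks/.user_setups"   # confirm the line was added
```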
After editing .user_setups, use the "Restart Kernel and Clear All Outputs" option under the Kernel menu so that already-running notebooks pick up the change.

For this bootcamp, we will be doing development in https://github.com/lsst/bootcamp-work. To clone this repository, open a terminal from the File menu: File → New → Terminal. Then execute the following code block.

```
cd ~/notebooks/
git clone https://github.com/lsst/bootcamp-work.git
```

Now you will be able to select the notebooks from the file browser by navigating to notebooks → bootcamp-work → examples.
We will be using Stack version w_2018_45, meaning "weekly release 45 of the year 2018". You can select the Stack version when starting the LSST Science Platform (LSP).
As a starting point, it may be helpful to review the LSST DM git workflow and best practices. During the bootcamp, we should plan to do development on separate branches for our different projects.
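As a sketch of that branch-per-project workflow (the repository and branch name below are illustrative, not prescribed):

```shell
# Sketch: start a topic branch for your bootcamp project before committing
# work. The demo repository and branch name are hypothetical.
cd "$(mktemp -d)"
git init -q demo && cd demo
git checkout -b bootcamp/isr-gains    # one branch per bootcamp project
git symbolic-ref --short HEAD         # confirm which branch you are on
```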
To commit work and push your branch:

```
git add file_you_want_to_add
git commit -m "your commit message; see best practices linked above"
git push -u origin name_of_your_branch
```
To merge a finished branch back into master:

```
git checkout master
git pull   # Sanity check; rebase ticket if master was updated.
git merge --no-ff name_of_your_branch
git push
```
There is a shared space that we should all have access to; this folder is for datasets of general interest:

/project/shared/data

There is also a per-user folder for sharing:

/project/username
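A minimal sketch of using these areas; the username jdoe and the dataset name are hypothetical, and a scratch root is used so the commands run anywhere (on the LSP you would use the real /project paths directly):

```shell
# Sketch: stage a file into the shared and per-user areas.
# ROOT stands in for the LSP filesystem root so this runs anywhere.
ROOT=$(mktemp -d)
mkdir -p "$ROOT/project/shared/data" "$ROOT/project/jdoe"
echo demo > "$ROOT/dataset.fits"                        # hypothetical dataset
cp "$ROOT/dataset.fits" "$ROOT/project/shared/data/"    # general-interest data
cp "$ROOT/dataset.fits" "$ROOT/project/jdoe/"           # personal sharing area
ls "$ROOT/project/shared/data"
```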