...

  • Follow the steps above to evaluate metric data values for each opsim simulated survey, on whatever timescale is appropriate (as the runs become available, for instance), and then save the calculated metric data values to disk, together with their pickled binner object.
  • Instantiate a binMetric object. 
  • Use the binMetric object to read the pickled binner object (or alternatively, create a new binner identical to the first and set that as the binner for the binMetric).
  • Use the binMetric object to read all the relevant metric data values from disk, from a list of files passed in 'filenames' (note that they must all use the same binner).
  • Use the binMetric object 'comparison' methods to generate visualizations of the differences between the different metric data values (such as a plot with histogram data from multiple metrics, or a sky map of the difference in metric values across RA/Dec points), where the metrics included in each comparison are determined by a passed list of the metric object names (see the sketch following this list).
  • Note - other kinds of comparisons, such as ranking 'summary statistics' of the metric values, will be done in the TBD full presentation/visualization layer.  
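
A minimal sketch of this workflow is shown below. The import path and the specific class and method names (BaseBinMetric, setBinner, readMetricValues, plotComparisons) are assumptions for illustration and may differ from the actual binMetric API.

```python
import pickle

# Hypothetical import path; the actual location of the binMetric class may differ.
from lsst.sims.maf.binMetrics import BaseBinMetric

# Metric data values computed earlier for each opsim run and saved to disk,
# alongside a pickled binner object (hypothetical filenames).
filenames = ['opsim1_CoaddM5.npz', 'opsim2_CoaddM5.npz']

# Instantiate a binMetric object.
bin_metric = BaseBinMetric()

# Restore the pickled binner and set it as the binner for the binMetric.
# (Alternatively, construct a new identical binner and set that instead.)
with open('binner.pkl', 'rb') as f:
    binner = pickle.load(f)
bin_metric.setBinner(binner)              # assumed method name

# Read all the relevant metric data values from disk; they must all use the same binner.
bin_metric.readMetricValues(filenames)    # assumed method name

# Generate comparison visualizations (e.g. overlaid histograms or a sky map of
# differences), selecting the metrics to compare by their names.
bin_metric.plotComparisons(['CoaddM5 opsim1', 'CoaddM5 opsim2'])  # assumed method name
```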

 

Details of the classes

db

The db classes provide methods to query simulated data stored in a database (supporting a variety of database backends, including sqlite) and return that data in a standard data structure, either a numpy recarray or a catalogs.generation chunk iterator (useful if the data set is especially large). Classes in the db package include 'Table' and 'Database'.
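
A rough sketch of how the 'Table' class might be used to read opsim pointing data into a numpy recarray follows. The import path, constructor arguments, and query method shown are assumptions for illustration, not the confirmed API.

```python
# Hypothetical usage of the db 'Table' class; the import path, constructor
# arguments, and query method here are assumptions for illustration only.
from lsst.sims.maf.db import Table

# Point a Table object at the opsim output table in a sqlite database file.
table = Table(tableName='Summary',
              idColKey='obsHistID',
              database='sqlite:///opsim_survey.db')   # hypothetical database file

# Query a subset of columns, returned as a numpy recarray.
simdata = table.query_columns(colnames=['fieldRA', 'fieldDec', 'fiveSigmaDepth'],
                              constraint='filter = "r"')
print(simdata['fieldRA'][:10])
```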

...