id | issuekey | created | title | description | storypoint |
---|---|---|---|---|---|
899 | DM-4753 | 01/06/2016 15:56:44 | Cleanup location of anonymous namespaces |
We place anonymous namespaces in two ways: (a) INSIDE the lsst::qserv::<module> namespace, or (b) BEFORE it. This story involves cleaning this up - moving them all to before lsst::qserv::<module>.
| 1 |
900 | DM-4759 | 01/07/2016 09:35:18 | Port Data set info converter architecture |
Defines various image data types, how to get them, groupings, and artifacts. I am not quite happy with how we did it in GWT, so the design needs to be improved. It must be less complex.
| 8 |
901 | DM-4780 | 01/08/2016 18:08:24 | meas_extensions_shapeHSM seems to be broken |
I have installed the meas_extensions_shapeHSM package together with galsim and tmv (I documented it at : https://github.com/DarkEnergyScienceCollaboration/ReprocessingTaskForce/wiki/Installing-the-LSST-DM-stack-and-the-related-packages#installing-meas_extensions_shapehsm) and tried to run it on CFHT cluster data. My config file is the following: {code:python} import lsst.meas.extensions.shapeHSM config.measurement.plugins.names |= ["ext_shapeHSM_HsmShapeRegauss", "ext_shapeHSM_HsmMoments", "ext_shapeHSM_HsmPsfMoments"] config.measurement.plugins['ext_shapeHSM_HsmShapeRegauss'].deblendNChild='' config.measurement.slots.shape = "ext_shapeHSM_HsmMoments" {code} When I run measCoaddSources.py, I get the following error : {code} Traceback (most recent call last): File "/sps/lsst/Library/lsstsw/stack/Linux64/pipe_tasks/2015_10.0-10-g1170fd0/bin/measureCoaddSources.py", line 3, in <module> MeasureMergedCoaddSourcesTask.parseAndRun() File "/sps/lsst/Library/lsstsw/stack/Linux64/pipe_base/2015_10.0-3-g24e103a/python/lsst/pipe/base/cmdLineTask.py", line 444, in parseAndRun resultList = taskRunner.run(parsedCmd) File "/sps/lsst/Library/lsstsw/stack/Linux64/pipe_base/2015_10.0-3-g24e103a/python/lsst/pipe/base/cmdLineTask.py", line 192, in run if self.precall(parsedCmd): File "/sps/lsst/Library/lsstsw/stack/Linux64/pipe_base/2015_10.0-3-g24e103a/python/lsst/pipe/base/cmdLineTask.py", line 279, in precall task = self.makeTask(parsedCmd=parsedCmd) File "/sps/lsst/Library/lsstsw/stack/Linux64/pipe_base/2015_10.0-3-g24e103a/python/lsst/pipe/base/cmdLineTask.py", line 363, in makeTask return self.TaskClass(config=self.config, log=self.log, butler=butler) File "/sps/lsst/Library/lsstsw/stack/Linux64/pipe_tasks/2015_10.0-10-g1170fd0/python/lsst/pipe/tasks/multiBand.py", line 530, in __init__ self.makeSubtask("measurement", schema=self.schema, algMetadata=self.algMetadata) File "/sps/lsst/Library/lsstsw/stack/Linux64/pipe_base/2015_10.0-3-g24e103a/python/lsst/pipe/base/task.py", line 255, in makeSubtask subtask = configurableField.apply(name=name, parentTask=self, **keyArgs) File "/sps/lsst/Library/lsstsw/stack/Linux64/pex_config/2015_10.0-1-gc006da1/python/lsst/pex/config/configurableField.py", line 77, in apply return self.target(*args, config=self.value, **kw) File "/sps/lsst/dev/lsstprod/clusters/my_packages/meas_base/python/lsst/meas/base/sfm.py", line 247, in __init__ self.initializePlugins(schema=self.schema) File "/sps/lsst/dev/lsstprod/clusters/my_packages/meas_base/python/lsst/meas/base/baseMeasurement.py", line 298, in initializePlugins self.plugins[name] = PluginClass(config, name, metadata=self.algMetadata, **kwds) File "/sps/lsst/dev/lsstprod/clusters/my_packages/meas_base/python/lsst/meas/base/wrappers.py", line 15, in __init__ self.cpp = self.factory(config, name, schema, metadata) File "/sps/lsst/dev/lsstprod/clusters/my_packages/meas_base/python/lsst/meas/base/wrappers.py", line 223, in factory return AlgClass(config.makeControl(), name, schema) File "/sps/lsst/dev/lsstprod/clusters/my_packages/meas_extensions_shapeHSM/python/lsst/meas/extensions/shapeHSM/hsmLib.py", line 964, in __init__ def __init__(self, *args, **kwargs): raise AttributeError("No constructor defined - class is abstract") AttributeError: No constructor defined - class is abstract {code}
| 1 |
902 | DM-4781 | 01/09/2016 00:25:30 | MariaDB does not work together with mysql-proxy |
We have switched to MariaDB, but there is one issue that complicates things - the mysql client from MariaDB fails to connect to mysql-proxy with an error: {noformat} ERROR 1043 (08S01): Bad handshake {noformat} so Fabrice had to find a workaround for our setup to use the client from the mysqlclient package instead. This workaround is not perfect and it complicates other things. It would be nice to make things work transparently for MariaDB.
| 2 |
903 | DM-4782 | 01/10/2016 21:52:31 | JIRA project for the publication board |
The LSST Publication Board requests a JIRA project for managing its workload.
| 2 |
904 | DM-4785 | 01/11/2016 16:01:59 | Update provenance in baseline schema |
The current provenance schema in baseline (cat/sql) is very old and no longer reflects the latest thinking. This story involves bringing cat/sql up to date and replacing the existing prv_* tables with the tables we came up with in the epic.
| 2 |
905 | DM-4786 | 01/12/2016 03:28:31 | Package mysqlproxy 0.8.5 |
See https://mariadb.atlassian.net/browse/MDEV-9389
| 2 |
906 | DM-4789 | 01/12/2016 11:31:00 | FITS Visualizer porting: Mouse Readout: part 3: Lock by click & 3 color support |
Add a toggle button that makes the mouse readout lock to the last position clicked on. It will no longer update on move, only on click. Include: 3 Color Support.
| 8 |
907 | DM-4793 | 01/12/2016 17:45:21 | Refactor prototype docs into “Developer Guide” and Science Pipelines doc projects |
Refactor [lsst_stack_docs|https://github.com/lsst-sqre/lsst_stack_docs] into two doc projects: the LSST DM Developer Guide, which will be published to {{developer.lsst.io}}, and the LSST Science Pipelines, which will be published to {{pipelines.lsst.io}}.
| 3 |
908 | DM-4794 | 01/12/2016 18:04:15 | Write Zoom Options Popup |
Write the simple zoom options popup that is shown when the user clicks zoom too fast or the zoom level exceeds the maximum size. Activate this popup from visualize/ui/ZoomButton.jsx.
| 2 |
909 | DM-4798 | 01/13/2016 10:03:31 | DetectCoaddSourcesTask.scaleVariance gets wrong result |
DetectCoaddSourcesTask.scaleVariance is used to adjust the variance plane in the coadd to match the observed variance in the image plane (necessary after warping because we've lost variance into covariance). The current implementation produces the wrong scaling in cases where the image has strongly variable variance (e.g., 10 inputs contributed to half the image, but only 1 input contributed to the other half) because it calculates the variance of the image and the mean of the variance separately so that clipping can affect different pixels. Getting this scaling very wrong can make us dig into the dirt when detecting objects, with drastic implications for the resultant catalog. This is a port of [HSC-1357|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1357] and [HSC-1383|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1383].
| 1 |
910 | DM-4801 | 01/13/2016 14:48:50 | Update the ground truth values in the lsst_dm_demo to reflect new defaults in deblending |
In DM-4410, default configuration options were changed such that footprints are now grown in the detection task and the deblender is run by default. This breaks the lsst_dm_demo, as the results of processing are now slightly different. The short-term solution as part of DM-4410 was to run the demo with the defaults overridden to be what they were prior to DM-4410. In the long term, the values used in the compare script should be updated to reflect what would be generated by running processCcd with the stack defaults.
| 0.5 |
911 | DM-4806 | 01/14/2016 13:27:18 | Test stack with mariadbclient |
Now that we have switched Qserv to MariaDB, it would be good to switch the rest of the stack. This story involves checking whether things still work if we switch mysqlclient to mariadbclient.
| 2 |
912 | DM-4820 | 01/15/2016 11:33:49 | Improvement of raw data handling in DecamMapper |
Two minor improvements for better coding practice: - Be more specific when copying FITS header keywords, to avoid potential problems if unwelcome keywords appear in the header in the future. Suggested in the discussions in DM-4133. - Reuse {{isr.getDefectListFromMask}} for converting defects. A more efficient method that uses the FootprintSet constructor with a Mask and a threshold has just been adopted in DM-4800. The processing is effectively unchanged.
| 1 |
913 | DM-4821 | 01/15/2016 15:58:30 | HSC backport: Remove interpolated background before detection to reduce junk sources |
This is a port of [HSC-1353|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1353] and [HSC-1360|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1360]. Descriptions from HSC: {panel:title=HSC-1353} We typically get a large number of junk detections around bright objects due to noise fluctuations in the elevated background. We can try to reduce the number of junk detections by adding an additional local background subtraction before object detection. We can then add this back in after detection of footprints and peaks. {panel} {panel:title=HSC-1360} I forgot to set the useApprox=True for the background subtraction that runs before footprint and peak detection. This will then use the Chebyshev instead of the spline. {panel}
| 1 |
914 | DM-4823 | 01/15/2016 17:45:01 | Add Dropdowns to Vis toolbar |
Add the dropdown to the vis tool bar
| 2 |
915 | DM-4824 | 01/15/2016 18:02:30 | Clean up div and css layout on FitsDownloadDialog |
The FitsDownloadDialog's HTML and CSS are not quite right and need some cleanup.
| 1 |
916 | DM-4825 | 01/15/2016 18:35:53 | makeDiscreteSkymap has a default dataset of 'raw' |
The default dataset type for command line tasks is raw. In this case, MakeDiscreteSkyMapTask is asking the butler for calexp images. This shouldn't be a problem, but in my case I have calexp images but no raw images. This causes the task to think there is no data to work on, so it exits.
| 1 |
917 | DM-4831 | 01/18/2016 10:45:46 | Add bright object masks to pipeline outputs |
Given per-patch inputs providing {code} id, B, V, R, ra, dec, radius {code} for each star to be masked, use this information to set: * A bit in the mask plane for each affected pixel * A flag in the source catalogues for each object that has a centroid lying within this mask area This is a port of [HSC-1342|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1342] and [HSC-1381|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1381].
| 3 |
918 | DM-4833 | 01/18/2016 11:08:13 | Update configuration for Suprime-Cam |
The {{obs_subaru}} configuration for Suprime-Cam needs updating to match recent changes in the stack. Port of [HSC-1372|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1372].
| 1 |
919 | DM-4834 | 01/18/2016 11:11:55 | Preliminaries for LSST vs HSC pipeline comparison through coadd processing |
This is the equivalent of DM-3942 but through coadd processing. Relevant HSC tickets include: * [HSC-1371|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1371]
| 1 |
920 | DM-4835 | 01/18/2016 11:17:39 | Allow slurm to request total CPUs rather than nodes*processors. |
On some systems, we are asked to request a total number of tasks, rather than specify a combination of nodes and processors per node. It also makes sense to use the SMP option this way. This is a port of [HSC-1369|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1369].
| 2 |
921 | DM-4841 | 01/18/2016 15:07:01 | Use high S/N band as reference for multiband forced photometry |
We are currently choosing the priority band as the reference band for forced photometry as long as it has a peak in the priority band regardless of the S/N. Please change this to pick the highest S/N band as the reference band when the priority band S/N is sufficiently low. This is a port of [HSC-1349|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1349].
| 1 |
922 | DM-4842 | 01/18/2016 15:17:15 | Don't write HeavyFootprints in forced photometry |
There's no need to persist {{HeavyFootprint}}s while performing forced photometry since retrieving them is as simple as loading the _meas catalog. This is a port of [HSC-1345|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1345].
| 0.5 |
923 | DM-4847 | 01/18/2016 17:08:07 | Add new blendedness metric |
[HSC-1316|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1316] shifts the calculation of blendedness from {{meas_deblender}} to {{meas_algorithms}} and defines a new blendedness metric in the process. Please port it.
| 3 |
924 | DM-4849 | 01/18/2016 21:26:22 | LDM-151 - comments from Jacek |
I am reading your https://github.com/lsst/LDM-151/blob/draft/DM_Applications_Design.tex, and I have some minor comments and suggestions. I am going to add comments to this story to capture them. Feel free to apply or ignore :)
| 1 |
925 | DM-4850 | 01/19/2016 06:20:23 | Factor out duplicate setIsPrimaryFlag from MeasureMergedCoaddSourcesTask and ProcessCoaddTask |
{{MeasureMergedCoaddSourcesTask.setIsPrimaryFlag()}} and {{ProcessCoaddTask.setIsPrimaryFlag()}} are effectively the same code. Please split this out into a separate task which both of the above can call. This is a (partial) port of [HSC-1112|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1112] and should include fixes from [HSC-1297|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1297].
| 2 |
926 | DM-4856 | 01/19/2016 11:19:54 | Add __setitem__ for columns in afw.table |
It's confusing to have to use an extra {{[:]}} to set a column in afw.table, and we can make that unnecessary if we override {{\_\_setitem\_\_}} as well as {{\_\_getitem\_\_}}.
| 2 |
927 | DM-4858 | 01/19/2016 15:13:06 | imagesDiffer doesn't handle overflow for unsigned integers |
I'm seeing a test failure in afw's testTestMethods.py, apparently due to my numpy (1.8.2) treating images that differ by -1 as differing by 65535 in both {{numpy.allclose}} and array subtraction (which doesn't promote to an unsigned type). Does this still cause problems in more recent versions of {{numpy}}? If not, I imagine it's up to me to find a workaround for older versions if I want it fixed? (assigning to [~rowen] for now, just because I know he originally wrote this test and I hope he might know more)
| 1 |
928 | DM-4862 | 01/20/2016 10:08:08 | Add point selection |
Click and highlight a point. This is on when the mouse readout "Lock by Click" is on. However, it can be turned on externally by adding toolbar context menu options.
| 2 |
929 | DM-4867 | 01/20/2016 15:25:37 | scisql build scripts are buggy |
The scisql build script logic for MySQL/MariaDB version checking is broken on all platforms. There are also assumptions about shared library naming that do not hold on OS/X, which means that the deployment scripts are likely broken on all platforms other than Linux.
| 2 |
930 | DM-4873 | 01/20/2016 17:02:45 | Test the matchOptimisticB astrometric matcher |
The matchOptimisticB matcher fails on many visits of the bulge verification dataset. This prompted a deeper investigation of the performance of the matcher. Angelo and David developed a test script and discovered that the matcher works well with offsets of the two source catalogs of up to 80 arcsec, but fails beyond that. This should be robust enough for nearly all datasets that the LSST stack will be used on.
| 3 |
931 | DM-4876 | 01/20/2016 17:09:10 | Compile list of DM simulation needs for Andy Connolly |
Compile list of DM simulation needs over the next ~6 months to give to Andy Connolly (simulations lead).
| 3 |
932 | DM-4878 | 01/20/2016 17:18:51 | Propagate flags from individual visit measurements to coadd measurements |
It is useful to be able to identify suitable PSF stars from a coadd catalogue. However, the PSF is not determined on the coadd, but from all the inputs. Add a mechanism for propagating flags from the input catalogues to the coadd catalogue indicating stars that were used for measuring the PSF. Make the inclusion fraction threshold configurable so we can tweak it (so we only get stars that were consistently used for the PSF model; the threshold might be set to 0 for "or", 1 for "all" and something in between for "some"). Make the task sufficiently general that it can be used for propagating arbitrary flags. This is a port of work carried out on [HSC-1052|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1052] and (part of) [HSC-1293|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1293].
| 2 |
933 | DM-4882 | 01/20/2016 20:41:57 | base_Variance plugin generates errors in lsst_dm_stack_demo |
Since DM-4235 was merged, we see a bunch of messages along the lines of: {code} processCcd.measurement WARNING: Error in base_Variance.measure on record 427969358631797076: The center is outside the Footprint of the source record {code} in the output from {{lsst_dm_stack_demo}}. (See e.g. [here|https://ci.lsst.codes/job/stack-os-matrix/label=centos-6/7482/console#console-section-3]). It's not fatal, but the warnings are disconcerting and could be indicative of a deeper problem.
| 2 |
934 | DM-4885 | 01/20/2016 21:44:41 | Improve/simplify multi-worker tests |
Our current integration test uses a "mono" configuration which is only useful for the integration test and is not used anywhere else. It would be more useful to have an integration test which is closer to a real setup, e.g. uses more than one worker. It should still be possible to run the whole shebang on a single node, though, to keep it usable for regular development tasks.
| 5 |
935 | DM-4887 | 01/21/2016 09:48:04 | Refactor measurement afterburners into a new plugin system |
Some of the operations we currently run as part of measurement (or would like to) share some features that make them a bit different from most plugin algorithms: - They must be run after at least some other high-level plugins, and may be run after all of them. - They do not require access to pixel data, as they derive their outputs entirely from other plugins' catalog outputs. - They may require an aggregation stage of some sort to be run on the regular plugin output before they can be run. Some examples include: - Star/Galaxy classification (with training done after measurement and before classification). - Applying aperture corrections (estimating the correction must be done first). - BFD's P, Q, R statistics (requires a prior estimated from deep data). We should move these algorithms to a new plugin system that's run by a new subtask, allowing these plugins to be run entirely separately from {{SingleFrameMeasurementTask}}. This will simplify some of the currently contorted logic required to make S/G classification happen after aperture correction, while making room for hierarchical inference algorithms like BFD and Bayesian S/G classification in the future. (We will not be able to support BFD immediately, as this will also require changes to our parallelization approach, but this will be a step in the right direction). This work should *probably* be delayed until after the HSC merge and [~rowen]'s rewrite of {{ProcessCcdTask}} are complete, but it's conceivable that this refactoring could solve emergent problems there and be worth doing earlier as a result.
| 8 |
936 | DM-4893 | 01/22/2016 10:00:42 | Write tutorial describing remote IPython + ds9 on lsst-dev |
[~mfisherlevine] recently figured out how to set up his system to run a remote IPython kernel on {{lsst-dev}} and interact with it from his laptop, including streaming image display from the remote system to a local instance of {{ds9}}. He will write all this up so that others in the community can easily do the same.
| 2 |
937 | DM-4894 | 01/22/2016 10:07:13 | Ingest DECam/CBP data into LSST stack |
[~mfisherlevine] will ingest the data taken in DM-4892 into the LSST stack. Initial experiments indicate problems with: * Bias subtraction * Flat fielding * Bad pixel masks These may already be remedied by work on {{obs_decam}}; if not, he will file stories and fix them.
| 3 |
938 | DM-4904 | 01/22/2016 20:59:03 | Buffer overrun in wcslib causes stack corruption |
The buffer 'msg' in wcsfix.c is used to report attempts by wcslib to re-format units found in fits files. It is allocated on the stack (in function 'unitfix') using a pre-processor macro defined size of 160 chars (set in wcserr.h). When attempting to run the function 'unitfix' in wcsfix, this buffer can overflow on some fits files (the raw files generated by HSC seem particularly prone to triggering this behavior) and results in the session being terminated on Ubuntu 14.04 as stack protection is turned on by default i.e. the stack crashes with a 'stack smashing detected' error. We have reported the bug to the creators of wcslib. As a temporary workaround, users affected by the bug should increase the default size of 'msg' by increasing WCSERR_MSG_LENGTH defined in wcserr.h We are providing a small python example that demonstrates the problem. Run it as python test.py <path to ci_hsc>/raw/<any fits file in this directory> We are also providing a simple c program to demonstrate the bug. Compile it as cc -fsanitize=address -g -I$WCSLIB_DIR/include/wcslib -o test test.c -L$WCSLIB_DIR/lib -lwcs (on Linux) cc -fsanitize=address -g -L$WCSLIB_DIR/lib -lwcs -I$WCSLIB_DIR/include/wcslib -o test test.c (on Mac OS X)
| 2 |
939 | DM-4916 | 01/26/2016 09:18:58 | Test obs_decam with processed data |
Sometimes DECam-specific bugs only show up in, or affect, the processed data. For example, the bug of DM-4859 shows up in the {{postISRCCD}} products. If the bugs are DECam-specific, some changes in {{obs_decam}} are likely needed. It would be useful to have a more convenient way to test those changes. In this ticket I modify {{testdata_decam}} so that those data can be processed, and then allow wider options in the {{obs_decam}} unit tests. I add {{testProcessCcd.py}} in {{obs_decam}} that runs {{processCcd.py}} with raw and calibration data in {{testdata_decam}}. Besides a short sanity check, I add a test (testWcsPostIsr) that tests DM-4859. {{testWcsPostIsr}} fails without the DM-4859 fix, and passes with it.
| 3 |
940 | DM-4917 | 01/26/2016 11:37:24 | Porting encodeURL of the java FitsDownloadDialog code to javascript |
When downloading an image, the proper name needs to be resolved based on the URL and the information about the image. The Java code has the following three methods: {code} encodeUrl makeFileName makeTitleFileName {code} These methods should be ported to javascript, so that the javascript version of the FitsDownloadDialog will save the file in the same manner.
| 2 |
941 | DM-4921 | 01/26/2016 14:23:55 | Make obs_subaru build with OS X SIP |
Because of OS X SIP, {{obs_subaru}} fails to build on OS X 10.11. In the {{hsc/SConscript}} file, the library environment variables need to be properly set, and scripts need to be delayed until the shebang rewriting occurs.
| 0.5 |
942 | DM-4926 | 01/27/2016 07:02:07 | Centroids fall outside Footprints |
In DM-4882, we observed a number of centroids measured while running the {{lsst_dm_stack_demo}} routines fall outside their associated {{Footprints}}. This was seen with both the {{NaiveCentroid}} and the {{SdssCentroid}} centroiders. For the purposes of DM-4882 we quieted the warnings arising from this, but we should investigate why this is happening and, if necessary, weed out small {{Footprints}} entirely.
| 8 |
943 | DM-4929 | 01/27/2016 11:32:50 | Fix build of MariaDB on OS X El Capitan |
The current MariaDB EUPS package does not build on OS X El Capitan because OS X no longer ships with OpenSSL developer files. MariaDB has a build option to use a bundled SSL library in preference to OpenSSL but the logic for automatically switching to this version breaks when the Anaconda OpenSSL libraries are present.
| 1 |
944 | DM-4931 | 01/27/2016 11:48:11 | Qserv build fails on El Capitan with missing OpenSSL |
Qserv does not build on OS X El Capitan due to the absence of OpenSSL include files. Apple now only ship the OpenSSL library (for backwards compatibility reasons). Qserv only uses SSL in two places to calculate digests (MD5 and SHA). This functionality is available in the Apple CommonCrypto library. Qserv digest code needs to be taught how to use CommonCrypto.
| 2 |
945 | DM-4933 | 01/27/2016 13:50:01 | Create a utility function to do spherical geometry averaging |
I would like to calculate a correct average and RMS for a set of RA, Dec positions. Neither [~jbosch] nor [~price] knew of an easy, simple function to do that that existed in the stack. [~price] suggested: {code} mean = sum(afwGeom.Extent3D(coord.toVector()) for coord in coordList, afwGeom.Point3D(0, 0, 0)) mean /= len(coordList) mean = afwCoord.IcrsCoord(mean) {code} That makes sense, but it's a bit unobvious (it's obvious how it works, but would likely never occur to someone that they should do it that way in the stack). Pedantically it's also not the best way to do a mean while preserving precision, but I don't anticipate that to be an issue in practice. Creating a function that did this would provide clarity. I don't know where that function should live. Note: I know how to do this in Astropy. I'm intentionally not using astropy here. But part of the astropy dependency discussion is likely "how much are we otherwise rewriting in the LSST stack".
| 1 |
946 | DM-4934 | 01/27/2016 14:57:01 | on-going support to Camera team in visualization at UIUC |
Attend the weekly meeting and answer questions as needed
| 2 |
947 | DM-4936 | 01/28/2016 09:11:22 | Enable validateMatches in ci_hsc |
{{python/lsst/ci/hsc/validate.py}} in {{ci_hsc}} [says|https://github.com/lsst/ci_hsc/blob/69c7a62f675b8fb4164065d2c8c1621e296e40ad/python/lsst/ci/hsc/validate.py#L78]: {code:python} def validateMatches(self, dataId): # XXX lsst.meas.astrom.readMatches is gone! return {code} {{readMatches}} (or its successor) should be back in place as of DM-3633. Please enable this test.
| 2 |
948 | DM-4937 | 01/28/2016 09:12:05 | multiple CVEs relevant to mariadb 10.1.9 and mysql |
Multiple CVEs have been released this week for mysql & mariadb. The current eups product for mariadb is bundling 10.1.9, which is affected. Several of the CVEs do not yet provide details, which typically means they are "really bad". https://github.com/lsst/mariadb/blob/master/upstream/mariadb-10.1.9.tar.gz https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0505 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0546 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0596 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0597 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0598 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0600 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0606 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0608 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0609 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-0616 https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-2047
| 0.5 |
949 | DM-4938 | 01/28/2016 10:33:52 | Update scisql to v0.3.5 |
In order to update MariaDB to v10.1.10 {{scisql}} needs to also be updated to deal with the hard-coded version checking. For the current version we get this error with the latest MariaDB: {code} ::::: [2016-01-28T16:51:40.539306Z] user_function(self) ::::: [2016-01-28T16:51:40.539334Z] File "/home/build0/lsstsw/build/scisql/wscript", line 63, in configure ::::: [2016-01-28T16:51:40.539346Z] ctx.check_mysql() ::::: [2016-01-28T16:51:40.539392Z] File "/home/build0/lsstsw/build/scisql/.waf-1.6.11-30618c54883417962c38f5d395f83584/waflib/Configure.py", line 221, in fun ::::: [2016-01-28T16:51:40.539410Z] return f(*k,**kw) ::::: [2016-01-28T16:51:40.539432Z] File "tools/mysql_waf.py", line 85, in check_mysql ::::: [2016-01-28T16:51:40.539451Z] (ok, msg) = mysqlversion.check(version) ::::: [2016-01-28T16:51:40.539473Z] File "tools/mysqlversion.py", line 74, in check ::::: [2016-01-28T16:51:40.539514Z] if not comparison_op(version_nums, constraint_nums): ::::: [2016-01-28T16:51:40.539547Z] UnboundLocalError: local variable 'constraint_nums' referenced before assignment Failed during rebuild of DM stack. {code}
| 0.5 |
950 | DM-4939 | 01/28/2016 11:51:18 | IRSA developer mentoring effort |
IRSA is contributing to the Firefly package development. We need to put in time to mentor the developers.
| 2 |
951 | DM-4940 | 01/28/2016 11:56:35 | IRSA developer mentoring effort |
IRSA is contributing to Firefly development. We need to mentor the new developers.
| 2 |
952 | DM-4952 | 01/29/2016 09:44:52 | delegate argument parsing to CmdLineTask instances |
Command-line argument parsing of data IDs for {{CmdLineTask}} s is currently defined at the class level, which means that we cannot make data ID definitions dependent on task configuration. That in turn requires custom {{processCcd}} scripts for cameras that start processing at a level other than "raw" (SDSS, DECam with community pipeline ISR, possibly CFHT). Instead, we should let {{CmdLineTask}} *instances* setup command-line parsing; after a {{CmdLineTask}} is constructed, it will have access to its final configuration tree, and can better choose how to parse its ID arguments. I've assigned this to Process Middleware for now, since that's where it lives in the codebase, but it may make more sense to give this to [~rowen], [~price], or [~jbosch], just because we've already got enough familiarity with the code in question that we could do it quickly. I'll leave that up to [~swinbank], [~krughoff], and [~mgelman2] to decide.
| 2 |
953 | DM-4955 | 01/29/2016 12:44:30 | Update pyfits |
The final version of {{pyfits}} has just been released. This ticket covers updating to that version. This will be helpful in determining whether the migration to {{astropy.io.fits}} will be straightforward or complicated.
| 1 |
954 | DM-4957 | 01/29/2016 13:42:33 | Generate JSON output from validate_drp for inclusion in a test harness |
Generate JSON output from validate_drp for inclusion in a test harness. Generate a file that summarizes the key metrics calculated by `validate_drp`. Develop naming conventions that will make it easy to plug into the eventual harness being developed as part of DM-2050.
| 2 |
955 | DM-4959 | 01/30/2016 19:50:52 | ci_hsc fails to execute tasks from within SCons on OSX 10.11/SIP |
The {{ci_hsc}} package executes a number of command line tasks directly from SCons based on {{Command}} directives in a {{SConstruct}} file. On an OSX 10.11 system with SIP enabled, there are two distinct problems which prevent the necessary environment being propagated to the tasks: * -The {{scons}} executable starts with a {{#!/usr/bin/env python}}. Running through {{/usr/bin/env}} strips {{DYLD_LIBRARY_PATH}} from the environment.- (duplicates DM-4954) * SCons executes command using the [{{sh}} shell on posix systems|https://bitbucket.org/scons/scons/src/09e1f0326b7678d1248dab88b28b456fd7d6fb54/src/engine/SCons/Platform/posix.py?at=default&fileviewer=file-view-default#posix.py-105]. By default, that means {{/bin/sh}} on a Mac, which, again, will strip {{DYLD_LIBRARY_PATH}}. Please make it possible to run {{ci_hsc}} on such a system.
| 0.5 |
956 | DM-4961 | 01/31/2016 15:37:38 | Obs_Subaru camera mapper has wrong deep_assembleCoadd_config |
When LSST switched to using SafeClipAssembleCoaddTask, the camera mapper for HSC was not updated accordingly. This causes ci_hsc to fail when it attempts to verify the config class type for the deep_coadd. The camera mapper should be updated.
| 0.5 |
957 | DM-4983 | 02/01/2016 11:39:26 | upstream patches/deps from conda-lsst |
Wherever possible, missing dep information and patches from conda-lsst should be upstreamed. The patches have already been observed to cause builds to fail due to upstream changes.
| 3 |
958 | DM-18241 | 02/01/2016 14:19:59 | Create initial M1M3, M2 simulators |
Initial simulator support
| 3 |
959 | DM-17268 | 02/01/2016 14:34:10 | SAL release 4 build and distribute |
Release new version
| 5 |
960 | DM-4991 | 02/01/2016 14:45:18 | Save algorithm metadata in multiband.py |
The various {{Tasks}} in {{multiband.py}} do not attach the {{self.algMetadata}} instance attribute to their output tables before writing them out, so we aren't actually saving information like which radii were used for apertures. We should also make sure this feature is maintained in the processCcd.py rewrite.
| 3 |
961 | DM-4993 | 02/01/2016 20:37:35 | review of dependency on the third party packages |
We need to periodically review the status of the third party software packages that Firefly depends on. Making a plan to do upgrade if needed. package.json lists out the dependencies Firefly has on the third party software. The attached file was last modified 2016-02-09. package.json_version lists the current version of the third party packages, major changes were indicated by (M). The attached file was created on 2016-02-29. bq. "babel" : "5.8.34", 6.5.2 (M) "history" : "1.17.0", 2.0.0 (M) "icepick" : "0.2.0", 1.1.0 (M) "react-highcharts": "5.0.6", 7.0.0 (M) "react-redux": "3.1.2", 4.4.0 (M) "react-split-pane": "0.1.22", 2.0.1 (M) "redux-thunk": "0.1.0", 1.0.3 (M) "redux-logger": "1.0.9", 2.6.1 (M) "validator" : "4.5.0", 5.1.0 (M) "chai": "^2.3.0", 3.5.0 (M) "esprima-fb": "^14001.1.0-dev-harmony-fb", 15001.1001.0-dev-harmony-fb (M) "babel-eslint" : "^4.1.3", 5.0.0 (M) "babel-loader" : "^5.3.2", 6.2.4 (M) "babel-plugin-react-transform": "^1.1.0", 2.0.0 (M) "babel-runtime" : "^5.8.20", 6.6.0 (M) "eslint" : "^1.10.3", 2.2.0 (M) "eslint-config-airbnb": "0.1.0", 6.0.2 (M) works with eslint 2.2.0 "eslint-plugin-react": "^3.5.1", 4.1.0 (M) works with eslint 2.2.0 "extract-text-webpack-plugin": "^0.8.0", 1.0.1 (M) "html-webpack-plugin": "^1.6.1", 2.9.0 (M) "karma-sinon-chai": "^0.3.0", 1.2.0 (M) "redux-devtools" : "^2.1.2", 3.3.1 (M) "webpack": "^1.8.2" 1.12.14, 2.1.0 beta4 (M)
| 2 |
962 | DM-4995 | 02/02/2016 01:07:10 | Extend webserv API to pass security tokens |
Extend the [API|https://confluence.lsstcorp.org/display/DM/AP] to pass security tokens.
| 8 |
963 | DM-4996 | 02/02/2016 09:34:33 | Update validate_drp for El Capitan |
validate_drp does not work on El Capitan due to SIP (System Integrity Protection) stripping DYLD_LIBRARY_PATH from shell scripts. The simple fix is to add {code} export DYLD_LIBRARY_PATH=${LSST_LIBRARY_PATH} {code} near the top of the scripts.
| 1 |
964 | DM-4998 | 02/02/2016 12:54:03 | Fix rotation for isr in obs_subaru |
Approximately half of the HSC CCDs are rotated 180 deg with respect to the others. Two others have 90 deg rotations and another two have 270 deg rotations (see [HSC CCD layout|http://www.naoj.org/Observing/Instruments/HSC/CCDPosition_20150804.png]). The raw images for the rotated CCDs thus need to be rotated to match the rotation of their associated calibration frames prior to applying the corrections. This is accomplished by rotating the exposure using the *rotated* context manager function in {{obs_subaru}}'s *isr.py* and the *nQuarter* specification in the policy file for each CCD. Currently, *rotated* uses {{afw}}'s *rotateImageBy90* (which apparently rotates in a counter-clockwise direction) to rotate the exposure by 4 - nQuarter turns. This turns out to be the wrong rotation for the odd nQuarter CCDs as shown here: !ccd100_nQuarter3.png|width=200! top left = raw exposure as read in top right = flatfield exposure as read in bottom left = _incorrectly_ rotated raw exposure prior to flatfield correction
| 2 |
965 | DM-5002 | 02/02/2016 16:02:48 | Make ci_hsc resumable |
If ci_hsc fails for any reason (or is cancelled), it must start processing from the beginning again. This is because functools.partial is used to generate dynamic functions. These differ enough in their byte code that scons thinks each build has a new function definition passed to the env.command function. Using lambda would suffer from the same problem. This ticket should change how the function signature is calculated such that scons can be resumed. This work does not prevent this from being used as a ci tool, as the .scons directory can be deleted, which will force the whole SConstruct file to run again.
| 2 |
966 | DM-5005 | 02/02/2016 16:11:36 | Please trim config overrides in validate_drp |
validate_drp will test more of our code if it uses default config parameters wherever possible. To that effect I would like to ask you to eliminate all config overrides that are not essential and document the reasons for the remaining overrides. For DECam there are no overrides that are different than the defaults, so the file can simply be emptied (for now). For CFHT there are many overrides that are different, and an important question is whether the overrides in this package are better for CFHT data than the overrides in obs_cfht; if so, please move them to obs_cfht. As a heads up: the default star selector is changing from "secondMoment" to "objectSize" in DM-4692 and I hope to allow that in validate_drp, since it works better and is better supported. Sorry for the incorrect component, but validate_drp is not yet a supported component in JIRA (see DM-5004)
| 0.5 |
967 | DM-5013 | 02/03/2016 09:34:31 | Convert Confluence DM Developer Guide to Sphinx (hack day) |
This is a hack day sprint to convert all remaining content on https://confluence.lsstcorp.org/display/LDMDG to reStructuredText content in the Sphinx project at https://github.com/lsst-sqre/dm_dev_guide and published at http://developer.lsst.io. The top priority for this sprint is to port all content into reST and have it tracked by Git. h2. Sprint ground rules # Before the sprint, clone {{https://github.com/lsst-sqre/dm_dev_guide.git}} and {{pip install -r requirements.txt}} in a Python 2.7 environment so that you can locally build the docs ({{make html}}). # Claim a page from the list below by putting your name on it. Put a checkmark on the page when you’ve merged it to the ticket branch (see below). # See http://developer.lsst.io/en/latest/docs/rst_styleguide.html for guidance on writing our style of reStructuredText. Pay attention to the [heading hierarchy|http://developer.lsst.io/en/latest/docs/rst_styleguide.html#sections] and [labelling for internal links|http://developer.lsst.io/en/latest/docs/rst_styleguide.html#internal-links-to-labels]. # If you use Pandoc to do an initial content conversion, you still need to go through the content line-by-line to standardize the reStructuredText. I personally recommend copy-and-pasting-and-formatting instead of using Pandoc. # Your Git commit messages should include the URL of the original content from Confluence. # Merge your work onto the {{tickets/DM-5013}} ticket branch. Rebase your personal work branch before merging. JSick is responsible for merging this ticket branch to {{master}}. # Put a note at the top of the confluence page with the new URL; root is {{http://developer.lsst.io/en/latest/}}. h2. Planned Developer Guide Table of Contents We’re improving the organization of DM’s Developer Guide; there isn’t a 1:1 mapping of Confluence pages to developer.lsst.io pages. Below is a proposed section organization and page structure. These sections can still be refactored based on discussion during the hack day. h3. Getting Started — /getting-started/ * ✅ *Onboarding Checklist* (Confluence: [Getting Started in DM|https://confluence.lsstcorp.org/display/LDMDG/Getting+Started+in+DM]). I’d like this to eventually be a quick checklist of things a new developer should do. It should be both a list of accounts the dev needs to have created, and a list of important developer guide pages to read next. The NCSA-specific material should be spun out. [[~jsick]] * *Communication Tools* (new + DM Confluence [Communication and Links|https://confluence.lsstcorp.org/display/DM/Communication+and+Links]). I see this as being an overview of what methods DM uses to communicate, and what method should be chosen for any circumstance. * *Finding Code on GitHub* (new). This should point out all of the GitHub organizations that a developer might come across (DM and LSST-wide), and point out important repositories within each organization. Replaces the confluence page [LSST Code Repositories|https://confluence.lsstcorp.org/display/LDMDG/LSST+Code+Repositories] h3. 
Processes — /processes/ * ✅ *Team Culture and Conduct Standards* (confluence) * ✅ *DM Development Workflow with Git, GitHub, JIRA and Jenkins* (new & Confluence: [git development guidelines for LSST|https://confluence.lsstcorp.org/display/LDMDG/git+development+guidelines+for+LSST] + [Git Commit Best Practices|https://confluence.lsstcorp.org/display/LDMDG/Git+Commit+Best+Practices] + [DM Branching Policy|https://confluence.lsstcorp.org/display/LDMDG/DM+Branching+Policy]) * ✅ *Discussion and Decision Making Process* (new & [confluence|https://confluence.lsstcorp.org/display/LDMDG/Discussion+and+Decision+Making+Process]) * ✅ *DM Wiki Use* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/DM+Wiki+Use]) [[~swinbank]] * ✅ *Policy on Updating Doxygen* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Policy+on+Updating+Doxygen]); needs to be addressed with TCT. Inter-link with the developer workflow page. [[~jsick]] (we’re just re-pointing the Confluence page to the workflow document) * ✅ *Transferring Code Between Packages* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Transferring+Code+Between+Packages]) [[~swinbank]] * -*Policy on Changing a Baseline Requirement*- ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Policy+on+Changing+a+Baseline+Requirement]) * ✅ *Project Planning for Software Development* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Project+Planning+for+Software+Development]) [[~swinbank]] * ✅ *JIRA Agile Usage* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/JIRA+Agile+Usage]) [[~swinbank]] * -*Technical/Control Account Manager Guide*- ([confluence|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=21397653]) (Do not port; see discussion below.) * *Licensing* (new) Need a centralized page to discuss license and copyright policies; include boilerplate statements. h3. Coding Guides — /coding/ * ✅ *Introduction* and note on stringency language (confluence: [DM Coding Style Policy|https://confluence.lsstcorp.org/display/LDMDG/DM+Coding+Style+Policy]) * ✅ *DM Python Style Guide* (confluence: [Python Coding Standard|https://confluence.lsstcorp.org/display/LDMDG/Python+Coding+Standard]) * ✅ *DM C++ Style Guide* (confluence pages: [C++ Coding Standard|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=16908666] + [C++ General Recommendations|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=16908756] + [C++ Naming Conventions|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=16908685] + [C++ Files|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=16908674] + [C++ Statements|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=16908706] + [C++ Layout and Comments|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=16908737] + [Policy on use of C++11/14|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=20283399] + [On Using ‘Using’|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=20283856]) * Coding Style Linters (new; draft from confluence [C++ Coding Standards Compliance|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=20283861] and [Python Coding Standards Compliance|https://confluence.lsstcorp.org/display/LDMDG/Python+Coding+Standards+Compliance] * ✅ *Using C++ Templates* ([confluence|https://confluence.lsstcorp.org/pages/viewpage.action?pageId=20284190]); this page needs to severely edited or re-written, however. * ✅ *Profiling* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Profiling|]). 
Also add a section ‘Using Valgrind with Python' (new) [[~jsick]] * ✅ *Boost Usage* ([TRAC|https://dev.lsstcorp.org/trac/wiki/TCT/BoostUsageProposal]) [[~tjenness]] * ✅ *Software Unit Test Policy* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Software+Unit+Test+Policy]) [[~swinbank]] * ✅ *Unit Test Coverage Analysis* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Coverage+Analysis]) [[~swinbank]] * ✅ *Unit Testing Private C++ Functions* ([trac|https://dev.lsstcorp.org/trac/wiki/UnitTestingPrivateFunctions]) [[~swinbank]] h3. Writing Docs — /docs/ * *Introduction* (new): Overview of DM’s documentation needs; links resources on technical writing. * *English Style Guide* (new): Supplement the [LSST Style Manual|https://www.lsstcorp.org/docushare/dsweb/Get/Document-13016/LSSTStyleManual.pdf] and provide English style guidance specific to DM. Capitalization of different heading levels; use of Chicago Manual of Style; a ‘this, not that’ table of spelling and word choices. * ✅ *ReStructuredText Style Guide* (new) * ✅ *Documenting Stack Packages* (new) * ✅ *Documenting Python Code* (new) * ✅ *Documenting C++ Code* (confluence, adapted from [Documentation Standards|https://confluence.lsstcorp.org/display/LDMDG/Documentation+Standards]); needs improvement * ✅ *Writing Technotes* (new; port README from [lsst-technote-bootstrap|https://github.com/lsst-sqre/lsst-technote-bootstrap/blob/master/README.rst]) h3. Developer Tools — /tools/ * ✅ *Git Setup and Best Practices* (new) * ✅ *Using Git Large File Storage (LFS) for Data Repositories* (new) * ✅ *JIRA Work Management Recipes* (new) * ✅ *Emacs Configuration* ([Confluence|https://confluence.lsstcorp.org/display/LDMDG/Emacs+Support+for+LSST+Development]). See DM-5045 for issue with Emacs config repo - [~jsick] * ✅ *Vim Configuration* ([Confluence|https://confluence.lsstcorp.org/display/LDMDG/Config+for+VIM]) - [~jsick] h3. Developer Services — /services/ * ✅ *NCSA Nebula OpenStack Guide* (Confluence: [User Guide|https://confluence.lsstcorp.org/display/LDMDG/NCSA+Nebula+OpenStack+User+Guide] + [Starting an Instance|https://confluence.lsstcorp.org/display/LDMDG/Introduction+to+Starting+a+Nebula+Instance] + [Using Snapshots|https://confluence.lsstcorp.org/display/LDMDG/Start+an+Instance+using+a+base+snapshot+with+the+LSST+Stack]. Add the [Vagrant instructions too from SQR-002|http://sqr-002.lsst.io]? [[~jsick]] * ✅ *Using lsst-dev* (Confluence: [notes Getting Started|https://confluence.lsstcorp.org/display/LDMDG/Getting+Started+in+DM] + [Developer Tools at NCSA|https://confluence.lsstcorp.org/display/LDMDG/Developer+Tools+at+NCSA] * ✅ *Using the Bulk Transfer Server at NCSA* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Using+the+Bulk+Transfer+Server+at+NCSA]) [[~jsick]] h3. Build, Test, Release — /build-ci/ * *Eups for LSST Developers* (new) [[~swinbank]] * ✅ *The LSST Software Build Tool* → ‘Using lsstsw and lsst-build' ([confluence|https://confluence.lsstcorp.org/display/LDMDG/The+LSST+Software+Build+Tool]); lsstsw and lsst-build documentation. 
[[~swinbank]] * *Using DM’s Jenkins for Continuous Integration* (new) [~frossie] * ✅ *Adding a New Package to the Build*([confluence|https://confluence.lsstcorp.org/display/LDMDG/Adding+a+new+package+to+the+build]) [[~swinbank]] * ✅ *Distributing Third-Party Packages with Eups* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Distributing+third-party+packages+with+EUPS]) [[~swinbank]] * ✅ *Triggering a Buildbot Build* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Triggering+a+Buildbot+Build]) [~frossie] * ✅ *Buildbot Errors FAQ* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Buildbot+FAQ+on+Errors]) [~frossie] * * Buildbot configuration ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Buildbot+Configuration+and+Setup] [~frossie] * *Creating a new DM Stack Release* ([confluence|https://confluence.lsstcorp.org/display/LDMDG/Creating+a+new+DM+Stack+Release]); though this page or a modern equivalent should probably belong with the software docs? [~frossie] _A lot of work should go into this section._ Have something about Scons? Or maybe that belongs in the doc of each relevant software product. h2. Leftover Confluence pages h3. The following pages should be moved to a separate Confluence space run by NCSA: * [NCSA Nebula OpenStack Issues|https://confluence.lsstcorp.org/display/LDMDG/NCSA+Nebula+OpenStack+Issues] * [DM System Announcements|https://confluence.lsstcorp.org/display/LDMDG/DM+System+Announcements] * [NCSA Development Servers|https://confluence.lsstcorp.org/display/LDMDG/DM+Development+Servers] h3. The following pages are either not relevant, generally misplaced, or need to be updated/recalibrated: * [Git Crash Course|https://confluence.lsstcorp.org/display/LDMDG/Git+Crash+Course] * [Basic Git Operations|https://confluence.lsstcorp.org/display/LDMDG/Basic+Git+Operations] * [Handling Git Push Problems|https://confluence.lsstcorp.org/display/LDMDG/Handling+Git+Push+Problems] * [LSST Code Repositories|https://confluence.lsstcorp.org/display/LDMDG/LSST+Code+Repositories]; see the proposed “Finding Code on GitHub” page for a replacement. * [Standards and Policies|https://confluence.lsstcorp.org/display/LDMDG/Standards+and+Policies]: this is a good TOC for the Confluence docs; but not longer needed for the new docs. * [Documentation Guidelines|https://confluence.lsstcorp.org/display/LDMDG/Documentation+Guidelines]. Some of this could be re-purposed into an intro to the ‘Writing Documentation’ section; some of this should go in a ‘Processes' page. * [DM Acknowledgements of Use|https://confluence.lsstcorp.org/display/LDMDG/DM+Acknowledgements+of+Use]: this probably belongs in documentation for the software projects that actually used this work.
| 5 |
968 | DM-5014 | 02/03/2016 11:35:35 | Set doRenorm default to False in AssembleCcdTask |
Change the default value of {{AssembleCcdConfig.doRenorm}} to {{False}} for the reasons given in RFC-157 and to implement that RFC.
| 1 |
969 | DM-5018 | 02/03/2016 12:17:28 | Modernize version check scripts in matplotlib and numpy packages |
The version check scripts in the stub {{matplotlib}} and {{numpy}} eups packages use old Python conventions. They should be updated to work with 2.7+.
| 0.5 |
970 | DM-5022 | 02/03/2016 13:55:47 | Modernize python code in Qserv scons package |
The {{site_scons}} Python code is not using current project standards. For example, print is not a function, exceptions are not caught {{as e}}, {{map}} is called without storing the result and {{map/filter/lambda}} are used where list comprehensions would be clearer. Most of these fixes are trivial with {{futurize}}.
| 0.5 |
971 | DM-5026 | 02/03/2016 14:45:53 | Fix dependencies for eups-packaged sqlalchemy |
Eups-packaged sqlalchemy lists {{mysqlclient}} as a required dependency, which is not really right. sqlalchemy does not directly depend on mysql client code; instead, it determines at run time which python modules it needs to load depending on what exact driver the client code is requesting (and {{mysqlclient}} does not actually provide a python module, so this dependency does not achieve anything useful). So the dependency on a specific external package should be declared on the client side and not in sqlalchemy; {{mysqlclient}} should be removed from sqlalchemy.table.
| 1 |
972 | DM-5030 | 02/03/2016 16:56:05 | Tests fail on Qserv on OS X El Capitan because of SIP |
OS X El Capitan introduced System Integrity Protection which leads to dangerous environment variables being stripped when executing trusted binaries. Since {{scons}} is launched using {{/usr/bin/env}} the tests that run do not get to see {{DYLD_LIBRARY_PATH}}. This causes them to fail. The same fix that was applied to {{sconsUtils}} (copying the path information from {{LSST_LIBRARY_PATH}}) needs to be applied to the test execution code used by Qserv's private {{site_scons}} utility code.
| 2 |
973 | DM-5050 | 02/04/2016 13:10:42 | SingleFrameVariancePlugin takes variance of entire image |
{{SingleFrameVariancePlugin}} takes the median variance of the entire image, rather than within an aperture around the source of interest. A {{Footprint}} is constructed with the aperture, but it is unused. This means that this plugin takes an excessive amount of run time (255/400 sec in a recent run of processCcd on HSC {{visit=1248 ccd=49}} with DM-4692).
| 1 |
974 | DM-5052 | 02/04/2016 17:52:19 | Design replacement for A.net index files |
We need a simple way to hold index files that will be easy to use and simple to set up.
| 2 |
975 | DM-5084 | 02/05/2016 13:32:30 | PropagateVisitFlags doesn't work with other pipeline components |
{{PropagateVisitFlags}}, which was recently ported over from HSC on DM-4878, doesn't work due to some inconsistencies with earlier packages/tasks: - The default fields to transfer have new names: "calib_psfCandidate" and "calib_psfUsed" - We're not currently transferring these fields from icSrc to src, so those fields aren't present in src anyway. I propose we just match against icSrc for now, since it has all of the fields we're concerned with. - It makes a call to {{afw.table.ExposureCatalog.subsetContaining(Point, Wcs, bool)}}, which apparently exists in C++ but not in Python; I'll look into seeing which HSC commits may have been missed in that port.
| 1 |
976 | DM-5085 | 02/05/2016 13:35:55 | Please add a package that includes obs_decam, obs_cfht and all validation_data datasets |
It would be very helpful to have an lsstsw package that added all supported obs_* packages (certainly including obs_cfht and obs_decam, and I hope obs_subaru) and all validation_data_* packages. This could be something other than lsst_apps, but I'm not sure what to call it.
| 0.5 |
977 | DM-5086 | 02/05/2016 13:58:51 | Enable aperture correction on coadd processing |
Aperture corrections are now coadded, so we can enable aperture corrections in measurements done on coadds.
| 0.5 |
978 | DM-5094 | 02/08/2016 14:15:59 | HSC backport: Set BAD mask for dead amps instead of SAT |
This is a port of [HSC-1095|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1095] and a leftover commit from [HSC-1231|https://hsc-jira.astro.princeton.edu/jira/browse/HSC-1231]: [isr: don't perform overscan subtraction on bad amps|https://github.com/HyperSuprime-Cam/obs_subaru/commit/d6fe6cf5c4ecadebd5a344d163e1f1e60137c7e4] (noted in DM-3942).
| 3 |
979 | DM-5095 | 02/08/2016 14:51:18 | Redirect confluence based pages to new developer guide. |
Delete and apply redirects to all migrated pages in old Confluence-based Developer Guide
| 0.5 |
980 | DM-5100 | 02/09/2016 10:02:18 | Docs for ltd-keeper |
Create a documentation project within ltd-keeper that documents the RESTful API while it is being developed. This will allow the [SQR-006|http://sqr-006.lsst.io] technote to have a place to link to for detailed information.
| 1 |
981 | DM-5107 | 02/09/2016 17:12:57 | Fix effective coordinates for defects in obs_subaru |
The defects as defined in {{obs_subaru}} (in the {{hsc/defects/20NN-NN-NN/defects.dat}} files) are defined in a coordinate system where pixel (0, 0) is the lower left pixel. However, the LSST stack does not use this interpretation, preferring to maintain the coordinate system tied to the electronics. As such, the defect positions are being misinterpreted for the rotated CCDs in HSC (see [HSC CCD layout|http://www.naoj.org/Observing/Instruments/HSC/CCDPosition_20150804.png]). This needs to be remedied.
| 2 |
982 | DM-5120 | 02/10/2016 12:46:51 | Add intelligence to `validate_drp` so it does "A Reasonable Thing" on an unknown output repo |
validate_drp currently takes as input both a repository and a configuration file. The configuration file contains information to construct the list of dataIds to analyze. However, these dataIds could be extracted from the repo itself, in cases where the desire is to analyze the entire repo. 1. Add a function that loads the set of dataIds from the repo. (/) 2. Select reasonable defaults for the additional parameters specified in the config file. (/) 3. Design how to handle multiple filters. (/)
| 5 |
983 | DM-5121 | 02/10/2016 12:50:27 | Add multiple-filter capabilities to `validate_drp` |
Design and refactor `validate_drp` to produce results for multiple filters. 1. Decide on the syntax for the YAML configuration file that denotes the multiple filters. E.g., which visit goes with what filter? (/) 2. Organize the running of multiple filters in `validate.run` to sequentially generate statistics and plots for each filter. (/) 3. Add a filter designation to the default output prefix. (/) Note: matching objects *across* filters is out-of-scope for this ticket.
| 1 |
984 | DM-5122 | 02/10/2016 13:30:01 | LOAD DATA LOCAL does not work with mariadb |
After we un-messed mariadb-mysqlclient we see errors now when trying to run integration tests: {noformat} File "/usr/local/home/salnikov/dm-yyy/lib/python/lsst/qserv/wmgr/client.py", line 683, in _request raise ServerError(exc.response.status_code, exc.response.text) ServerError: Server returned error: 500 (body: "{"exception": "OperationalError", "message": "(_mysql_exceptions.OperationalError) (1148, 'The used command is not allowed with this MariaDB version') [SQL: 'LOAD DATA LOCAL INFILE %(file)s INTO TABLE qservTest_case01_mysql.LeapSeconds FIELDS TERMINATED BY %(delimiter)s ENCLOSED BY %(enclose)s ESCAPED BY %(escape)s LINES TERMINATED BY %(terminate)s'] [parameters: {'terminate': u'\\n', 'delimiter': u'\\t', 'enclose': u'', 'file': '/home/salnikov/qserv-run/2016_02/tmp/tmpWeAj6u/tabledata.dat', 'escape': u'\\\\'}]"}") 2016-02-10 14:17:40,836 - lsst.qserv.admin.commons - CRITICAL - Error code returned by command : qserv-data-loader.py -v --config=/usr/local/home/salnikov/testdata-repo/datasets/case01/data/common.cfg --host=127.0.0.1 --port=5012 --secret=/home/salnikov/qserv-run/2016_02/etc/wmgr.secret --delete-tables --chunks-dir=/home/salnikov/qserv-run/2016_02/tmp/qserv_data_loader/LeapSeconds --no-css --skip-partition --one-table qservTest_case01_mysql LeapSeconds /usr/local/home/salnikov/testdata-repo/datasets/case01/data/LeapSeconds.schema /usr/local/home/salnikov/testdata-repo/datasets/case01/data/LeapSeconds.tsv.gz {noformat} It looks like mariadb client by default disables LOCAL option for data loading and it needs to be explicitly enabled.
| 1 |
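One plausible fix, sketched below under the assumption that the loader talks to the database through SQLAlchemy/MySQLdb as the traceback suggests: pass {{local_infile=1}} to the client so that {{LOAD DATA LOCAL INFILE}} is allowed again. The connection URL and file path are made up for illustration.
{code:python}
from sqlalchemy import create_engine

# connect_args is forwarded to MySQLdb.connect(); local_infile=1 re-enables
# the LOCAL option that the mariadb client library disables by default.
engine = create_engine(
    "mysql+mysqldb://qsmaster@127.0.0.1:13306/qservTest_case01_mysql",  # illustrative URL
    connect_args={"local_infile": 1},
)

with engine.connect() as conn:
    conn.execute(
        "LOAD DATA LOCAL INFILE %(file)s INTO TABLE LeapSeconds "
        "FIELDS TERMINATED BY %(delimiter)s",
        {"file": "/tmp/tabledata.dat", "delimiter": "\t"},
    )
{code}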
985 |
DM-5125
|
02/10/2016 15:05:54
|
qserv fails when it mixes mariadb and mariadbclient directories
|
When I tried to run qserv-configure after installing qserv 2016_01-7-gbd0349f I got this error:
{noformat}
2016-02-10 16:03:16,915 - lsst.qserv.admin.commons - CRITICAL - Error code returned by command : /home/salnikov/qserv-run/2016_02/tmp/configure/mysql.sh
{noformat}
Running the configure/mysql.sh script:
{noformat}
$ sh -x /home/salnikov/qserv-run/2016_02/tmp/configure/mysql.sh
+ echo '-- Installing mysql database files.'
-- Installing mysql database files.
+ /u2/salnikov/STACK/Linux64/mariadbclient/10.1.11/scripts/mysql_install_db --basedir=/u2/salnikov/STACK/Linux64/mariadbclient/10.1.11 --defaults-file=/home/salnikov/qserv-run/2016_02/etc/my.cnf --user=salnikov
+ echo 'ERROR : mysql_install_db failed, exiting'
ERROR : mysql_install_db failed, exiting
+ exit 1
{noformat}
and
{noformat}
$ /u2/salnikov/STACK/Linux64/mariadbclient/10.1.11/scripts/mysql_install_db --basedir=/u2/salnikov/STACK/Linux64/mariadbclient/10.1.11 --defaults-file=/home/salnikov/qserv-run/2016_02/etc/my.cnf --user=salnikov
FATAL ERROR: Could not find mysqld
The following directories were searched:
/u2/salnikov/STACK/Linux64/mariadbclient/10.1.11/libexec
/u2/salnikov/STACK/Linux64/mariadbclient/10.1.11/sbin
/u2/salnikov/STACK/Linux64/mariadbclient/10.1.11/bin
{noformat}
So it looks for mysqld in mariadbclient, the same directory the mysql_install_db script comes from; mysql_install_db should actually be run against the mariadb (server) directory (see the sketch following this record).
| 1 |
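The direction of the fix, sketched below: build the {{mysql_install_db}} command against the mariadb (server) product directory rather than the mariadbclient one. The {{MARIADB_DIR}} environment variable follows the usual eups {{<PRODUCT>_DIR}} convention, but whether the qserv configuration templates have that variable available at this point is an assumption.
{code:python}
import os

def mysqlInstallDbCommand(runDir, user):
    # mariadbclient ships a copy of scripts/mysql_install_db but not bin/mysqld,
    # so point both the script and --basedir at the server product instead.
    basedir = os.environ["MARIADB_DIR"]
    return [
        os.path.join(basedir, "scripts", "mysql_install_db"),
        "--basedir=%s" % basedir,
        "--defaults-file=%s" % os.path.join(runDir, "etc", "my.cnf"),
        "--user=%s" % user,
    ]
{code}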
986 |
DM-5129
|
02/11/2016 10:26:34
|
Create InputField for generic use cases.
|
Create a composable, validating InputField so it can be used outside of the form/submit use case.
| 2 |
987 |
DM-5130
|
02/11/2016 11:50:35
|
B-F correction breaks non-HSC custom ISR, ci_hsc
|
The addition of brighter-fatter correction on DM-4837 breaks obs_cfht's custom ISR, since it slightly changes an internal ISR API by adding an argument that isn't expected by the obs_cfht version. It also breaks ci_hsc, since the B-F kernel file isn't included in the calibrations packaged there.
| 0.5 |
988 |
DM-5132
|
02/11/2016 19:27:30
|
obs_subaru install with eups distrib fails
|
Thus:
{code}
$ eups distrib install -t w_2016_06 obs_subaru
...
[ 52/52 ]  obs_subaru 5.0.0.1-60-ge4efae7+2 ...
***** error: from /Users/jds/Projects/Astronomy/LSST/stack/EupsBuildDir/DarwinX86/obs_subaru-5.0.0.1-60-ge4efae7+2/build.log:
----------------------------------------------------------------------
Traceback (most recent call last):
  File "tests/hscRepository.py", line 91, in setUp
    self.repoPath = createDataRepository("lsst.obs.hsc.HscMapper", rawPath)
  File "tests/hscRepository.py", line 63, in createDataRepository
    check_call([ingest_cmd, repoPath] + glob(os.path.join(inputPath, "*.fits.gz")))
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 540, in check_call
    raise CalledProcessError(retcode, cmd)
CalledProcessError: Command '['/Users/jds/Projects/Astronomy/LSST/stack/EupsBuildDir/DarwinX86/obs_subaru-5.0.0.1-60-ge4efae7+2/obs_subaru-5.0.0.1-60-ge4efae7+2/bin/hscIngestImages.py', '/var/folders/jp/lqz3n0m17nqft7bwtw3b8n380000gp/T/tmptUSKuf', '/Users/jds/Projects/Astronomy/LSST/stack/DarwinX86/testdata_subaru/master-gf9ba9abdbe/hsc/raw/HSCA90402512.fits.gz']' returned non-zero exit status 1
----------------------------------------------------------------------
Ran 8 tests in 9.928s
FAILED (errors=7)
The following tests failed:
/Users/jds/Projects/Astronomy/LSST/stack/EupsBuildDir/DarwinX86/obs_subaru-5.0.0.1-60-ge4efae7+2/obs_subaru-5.0.0.1-60-ge4efae7+2/tests/.tests/hscRepository.py.failed
1 tests failed
scons: *** [checkTestStatus] Error 1
scons: building terminated because of errors.
+ exit -4
{code}
Please fix it.
| 1 |
989 |
DM-5135
|
02/12/2016 10:01:49
|
Make ci_hsc buildable by Jenkins
|
1. Make sure {{ci_hsc}} is buildable by {{lsstsw}} / {{lsst_build}}. (/)
2. Add {{ci_hsc}} to lsstsw/etc/repos.yaml so that one can request that Jenkins builds it. (/)
3. Verify that the test in {{ci_hsc}} fails on known broken tags and passes on known successful tags. (/)
No dependencies will be added to {{lsst_sims}} or {{lsst_distrib}}. This is meant to provide the ability to request that Jenkins do these builds and to fail if something has broken them. This will later be expanded to new packages {{ci_cfht}}, {{ci_decam}}, and {{ci_sim}}. The key goal is to make sure one hasn't broken the obs_ packages, either in their butler interface or in their processCcd processing.
Additional Notes and Thoughts from HipChat Discussion:
[~ktl] Sounds good to me; we might have an "lsst_ci" top-level metapackage depending on all of them, which is what Jenkins would run regularly. If the goal is to test obs_ packages, then my first instinct would be to put that in the obs_ package. A longer-term goal is to test the stack with different precursor datasets. If this is testing obs_ packages on a slower cadence than the built-in tests, it's OK for that to be a separate package.
[~jbosch] Eventually, I think we need to run a CI dataset for each camera, then run some camera-generic tests on each of those, then run some camera-specific tests on each of those. So we don't want to go too far down a road in which all tests are camera-specific, but maybe we don't have a choice until we have some better unifying framework for them. I've certainly been putting some checks in {{ci_hsc}} that would be valid for all other cameras, if we had a CI package for them that went through to coadd processing.
| 2 |
990 |
DM-5139
|
02/12/2016 11:57:08
|
Update apr and apr_util
|
{{apr}} and {{apr-util}} are outdated and lagging behind the versions on RHEL6. They should be updated as agreed in RFC-76.
| 0.5 |
991 |
DM-5140
|
02/12/2016 12:39:57
|
Move luaxmlrpc to lsst-dm/legacy-
|
We no longer need luaxmlrpc because we run the czar inside the proxy. We should move it to lsst-dm/legacy-, and remove mentions of it from the README.
| 0.5 |
992 |
DM-5147
|
02/12/2016 21:12:52
|
Provide usable repos in {{validation_data_*}} packages.
|
Re-interpreted ticket:
1. Provide already-initialized repositories in the `validation_data_cfht`, `validation_data_decam`, and `validation_data_hsc` packages alongside the raw data. The goal is to allow both easy quick-start analyses and comparisons of the outputs from processCcd.py and friends at each step of the processing. (/)
2. Add (Cfht,Decam,HSC).list files to provide for easy processing of the available dataIds in the example data. (/)
3. Update README files to explain the available data. (/)
[Original request:] In validate_drp, when I run examples/runXTest.sh I find that any data I had saved in CFHT or DECam is lost, even if I have carefully renamed it. This is very dangerous and I lost a lot of work due to it. At a bare minimum, please do NOT touch any directories not named "input" or "output". Lower-priority requests that I hope you will consider:
- Have the input repo be entirely contained in the validation_data_X packages, ready to use "as is". That would simplify the use of those packages by other code. It would also simplify validate_drp, and it would just leave the output repo to generate (which already has a link back to the input repo).
- Have runXTest.sh accept a single argument: the path to the output. (The path to the input is not necessary if you implement the first suggestion.)
| 3 |
993 |
DM-5156
|
02/15/2016 12:43:37
|
Please document MemoryTestCase
|
{{lsst.utils.tests.MemoryTestCase}} is used extensively throughout our test suite, but it is lacking documentation and it is not clear under what circumstances its use is required or encouraged. Please add appropriate documentation to the [Software Unit Test Policy |http://developer.lsst.io/en/latest/coding/unit_test_policy.html] (a sketch of the current boilerplate follows this record). See also [this thread on clo|https://community.lsst.org/t/what-is-the-policy-for-using-lsst-utils-tests-memorytestcase].
| 0.5 |
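For reference when writing the documentation, this is roughly the boilerplate that stack test files use at the moment (sketched from memory, so double-check against lsst.utils before publishing): {{MemoryTestCase}} is appended after the other suites so it can check for leaked {{Citizen}} objects once they have run.
{code:python}
import unittest
import lsst.utils.tests as utilsTests

class ExampleTestCase(unittest.TestCase):
    def testSomething(self):
        self.assertTrue(True)  # stand-in for real test code

def suite():
    """Return a suite containing all of the test cases in this module."""
    utilsTests.init()
    suites = []
    suites += unittest.makeSuite(ExampleTestCase)
    suites += unittest.makeSuite(utilsTests.MemoryTestCase)  # leak checks run last
    return unittest.TestSuite(suites)

def run(shouldExit=False):
    """Run the tests."""
    utilsTests.run(suite(), shouldExit)

if __name__ == "__main__":
    run(True)
{code}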
994 |
DM-5159
|
02/15/2016 16:59:16
|
Please use angle and Coord where possible
|
validate_drp would be easier to follow and safer if it took advantage of {{lsst.afw.geom.Angle}} and {{lsst.afw.coord.IcrsCoord}}. For instance, {{averageRaDecFromCat}} could return an IcrsCoord, and {{positionRms}} could use {{coord1.angularSeparation(coord2)}} to handle wraparound and other effects simply and safely (see the sketch following this record).
| 1 |
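A minimal sketch, assuming the current afw Python API, of the kind of usage the ticket is asking for:
{code:python}
import lsst.afw.geom as afwGeom
import lsst.afw.coord as afwCoord

# Positions carried as IcrsCoord rather than bare floats in degrees.
c1 = afwCoord.IcrsCoord(214.8846 * afwGeom.degrees, 52.6981 * afwGeom.degrees)
c2 = afwCoord.IcrsCoord(214.8849 * afwGeom.degrees, 52.6982 * afwGeom.degrees)

# angularSeparation() returns an afwGeom.Angle and handles RA wraparound
# and the cos(dec) factor internally.
sep = c1.angularSeparation(c2)
print(sep.asArcseconds())
{code}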
995 |
DM-5161
|
02/16/2016 14:53:45
|
HSC backport: Support a full background model when detecting cosmic rays
|
This is a port of the following two standalone HSC commits: [Support a full background model when detecting cosmic rays|https://github.com/HyperSuprime-Cam/pipe_tasks/commit/3bae328e0fff4b2a02267e97cc1e53b5bbe431cb] {code} If there are strong gradients (e.g. M31's nucleus) we need to do more than treat the background as a constant. However, this requires making a copy of the data so the background-is-a-constant model is preserved as a special case {code} [Fixed cosmicRay() in RepairTask for the case background is subtracted.|https://github.com/HyperSuprime-Cam/pipe_tasks/commit/2cdb7c606270d84c7a05baf9949ff5724463fa6b] {code} When the background is subtracted with finer binsize, new exposure will be created and cosmic rays will be detected on that exposure. But the image of that exposure was not properly returned back. {code}
| 1 |
996 |
DM-5164
|
02/17/2016 12:25:31
|
Tests in daf_persistence should skip properly
|
Some of the tests in {{daf_persistence}} have a couple of problems that cause difficulties with modern test frameworks:
# unittest is not being used at all in some cases
# Skipping is done with a print and a {{sys.exit}}
They need to be modernized (see the sketch following this record).
| 2 |
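A sketch of the modernization direction (the test name and data check are illustrative, not the actual daf_persistence tests): report skips through unittest rather than printing and calling {{sys.exit}}, so test runners record them correctly.
{code:python}
import os
import unittest

class PosixStorageTestCase(unittest.TestCase):
    @unittest.skipUnless(os.path.exists(os.path.join("tests", "data")),
                         "test data not available")
    def testRead(self):
        self.assertTrue(True)  # stand-in for the real assertions

if __name__ == "__main__":
    unittest.main()
{code}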
997 |
DM-5169
|
02/17/2016 16:16:22
|
Fastly API interactions for LSST the Docs
|
Using Fastly’s API, have ltd-keeper set up new builds and editions:
- Add {{Surrogate-Key}} to the headers of objects uploaded to S3 (happens on the ltd-mason side; see the sketch following this record)
- Configure Varnish to serve specific bucket directories as specific domains (DM-4951 has added Route 53 interactions to ltd-keeper)
- Purge content when editions switch or content is deleted.
DM-5167 is covering the non-API-driven work to configure Fastly. See https://www.hashicorp.com/blog/serving-static-sites-with-fastly.html for a write-up on serving static sites via Fastly. See also http://sqr-006.lsst.io for an overview of LSST the Docs.
| 8 |
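A hedged sketch of the first bullet on the ltd-mason side: store the surrogate key as S3 object metadata (surfaced as the {{x-amz-meta-surrogate-key}} header) so the Fastly/Varnish configuration can copy it into {{Surrogate-Key}} and purge by key later. The bucket and key names are made up; the exact VCL wiring belongs to DM-5167.
{code:python}
import boto3

s3 = boto3.client("s3")
with open("index.html", "rb") as body:
    s3.put_object(
        Bucket="lsst-the-docs",                    # assumed bucket name
        Key="example-project/builds/b1/index.html",
        Body=body,
        ContentType="text/html",
        Metadata={"surrogate-key": "build-b1"},    # stored as x-amz-meta-surrogate-key
    )
{code}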
998 |
DM-5179
|
02/18/2016 15:02:48
|
miniconda2 eups package fails to install on OS X
|
The {{miniconda2}} eups package attempts to install the relevant conda packages by downloading a list from the {{lsstsw}} repository. This fails for the same reason that {{lsstsw}} fails in DM-5178 in that the list of packages is not OS-specific. This means that {{newinstall.sh}} does not work any more on OS X.
| 0.5 |