id (int64, 0–5.38k) | issuekey (string, 4–16 chars) | created (string, 19 chars) | title (string, 5–252 chars) | description (string, 1–1.39M chars) | storypoint (float64, 0–100) |
---|---|---|---|---|---|
1,799 | DM-8698 | 12/20/2016 09:12:54 | Jenkins build failure in meas_modelfit | {code} tests/testMeasureImage.py /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py2/lsstsw/miniconda/lib/python2.7/site-packages/astropy/config/configuration.py:687: ConfigurationMissingWarning: Configuration defaults will be used due to OSError:Could not find unix home directory to search for astropy config dir on None warn(ConfigurationMissingWarning(msg)) Traceback (most recent call last): File "tests/testMeasureImage.py", line 38, in <module> import lsst.meas.modelfit.display File "/home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py2/lsstsw/build/meas_modelfit/python/lsst/meas/modelfit/display/__init__.py", line 1, in <module> from .densityPlot import * File "/home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py2/lsstsw/build/meas_modelfit/python/lsst/meas/modelfit/display/densityPlot.py", line 63, in <module> class HistogramLayer(object): File "/home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py2/lsstsw/build/meas_modelfit/python/lsst/meas/modelfit/display/densityPlot.py", line 80, in HistogramLayer defaults2d = dict(cmap=matplotlib.cm.Blues, vmin=0.0, interpolation='nearest') AttributeError: 'module' object has no attribute 'cm' ----------------------------------------------------- {code} | 1 |
1,800 | DM-8701 | 12/20/2016 14:24:53 | Copy and ingest HSC Cosmos data | Once RFC-266 is accepted, {{/gpfs/fs0/scratch/pprice/UH-Cosmos/*.fits}} needs to be copied into {{/datasets/hsc/raw/cosmos}} and ingested into {{/datasets/hsc/repo}}. | 1 |
1,801 | DM-8714 | 12/21/2016 13:21:23 | Fix position of psf computation for base_SdssShape_psf | There is an error in the psf computation of *base_SdssShape_psf* in {{meas_base}}'s *src/SdssShape.cc* in that it does not provide the position of the source, so is just getting the measurement at the position returned by {{afw}}'s {{getAveragePosition()}} in *src/detection/Psf.cc* for all sources. Please fix. | 1 |
1,802 | DM-8729 | 12/22/2016 16:32:47 | Publish display_firefly to eups | This ticket captures work done to publish the display_firefly backend for afw.display to eups, including researching and learning the steps to do it. * tag the ws4py and firefly_client dependencies appropriately * rebuild display_firefly using Jenkins * publish display_firefly using Jenkins, enabling: {code} export EUPS_PKGROOT=https://sw.lsstcorp.org/eupspkg eups distrib list display_firefly # returns version to use in next line eups distrib install display_firefly master-g0a33da8b30+2 {code} | 2 |
1,803 | DM-8741 | 12/22/2016 19:15:17 | Bug fixes, improvements, and code refactoring in SUIT S19 | This epic includes the tickets for improvements, code refactoring, and bug fixes needed for the LSP portal. | 100 |
1,804 | DM-8742 | 12/22/2016 19:18:12 | Firefly code refactor and bug fixes (S19) | This version of the pipelineQA portal could be used for ComCam commissioning. | 100 |
1,805 | DM-8750 | 12/22/2016 22:25:28 | eliminate jointcal compile warnings | Jointcal produces a number of compile-time warnings (which one can miss without the scons fix described in RFC-246). We should clean these up, which might be helped by compiling with both gcc and clang, and comparing their messages. It may be best to do this after we've dealt with other problems (like the boost pointer memory management), as that may take care of some of the warnings along the way. On the other hand, there's quite a bit of low-hanging fruit in the form of variables that are defined but never used. | 2 |
1,806 | DM-8751 | 12/23/2016 09:17:53 | Fix skipped testPickle in testSourceTable.py | There is one skipped test in {{testSourceTable.py}}: {{testPickle}}. This segfaults, likely when a source catalog is pickled. Fix this. Should all kinds of catalog be pickleable? If so, make sure all are being tested. | 2 |
1,807 | DM-8757 | 12/23/2016 12:17:12 | monthly test of tickets related to UI changes and bug fixes (Jan. 2017) | Test all the tickets affecting the UI when merged into dev. File Jira tickets with the steps to duplicate the bugs. Jan. 3, 2017: Tested https://jira.lsstcorp.org/browse/DM-8548 and found the bug. See comment in the ticket. Jan. 3, 2017: Reviewed DM-8648 on GitHub. Jan. 4, 2017: Tested/reviewed DM-8548. Jan. 9, 2017: Found a bug and issued ticket DM-8957: Original plot Inconsistent does not abort. Jan. 12, 2017: Reviewed DM-8957: Original plot does not abort and overwrites active plot. Jan. 12, 2017: Reviewed DM-8049: Crop related classes need to be refactored. Jan. 17, 2017: Reviewed DM-7827: Create LSST File group processor for packaging (download the LSST images). Jan. 20, 23, 2017: Reviewed DM-8668: LC period finder. Jan. 31, 2017: Tested DM-8985: mask | 5 |
1,808 | DM-8758 | 12/23/2016 12:18:30 | monthly test of tickets related to UI changes and bug fixes (Feb. 2017) | Monthly test of tickets related to UI changes and bug fixes. File tickets with steps to reproduce the bugs. 2/1/2017: Reviewed DM-8660: Unit test for Geom. 2/2/2017: Reviewed DM-8670: Light Curve UI (wise data). 2/3/2017: Reviewed DM-7780: FlipXYTest. 2/7/2017: Reviewed DM-9247: All Sky catalog query. 2/8/2017: Reviewed DM-9248: All Sky mode. 2/9/2017: Reviewed DM-8661: Unit Test for ImageData. 2/12/2017: Reviewed DM-7946: UnitTestForCentralPoint. 2/15/2017: Reviewed DM-7780: FlipXYTest. 2/16/2017: Reviewed DM-8839: Image header unit test. 2/17/2017: Reviewed DM-9470: Compass layout. 2/21/2017: Reviewed DM-8845: UnitTest for ImagePlot class. | 5 |
1,809 | DM-8759 | 12/23/2016 12:19:25 | monthly test of tickets related to UI changes and bug fixes (Mar. 2017) | Monthly test of tickets related to UI changes and bug fixes. File tickets with steps to reproduce the bugs. 3/1/2017: Tested DM-8578. | 0.5 |
1,810 | DM-8760 | 12/23/2016 12:20:10 | monthly test of tickets related to UI changes and bug fixes (Apr. 2017) | Monthly test of tickets related to UI changes and bug fixes. File tickets with steps to reproduce the bugs. DM-10065 needs to be thoroughly tested. | 5 |
1,811 | DM-8761 | 12/23/2016 12:21:13 | monthly test of tickets related to UI changes and bug fixes (May 2017) | Monthly test of tickets related to UI changes and bug fixes. File tickets with steps to reproduce the bugs. | 5 |
1,812 | DM-8791 | 12/29/2016 10:22:47 | lsst-dm-mac.lsst.org is inaccessible | From my office desktop: {code:java} $ ping -i0.2 -c 10 -q lsst-dm-mac.lsst.org PING lsst-dm-mac.lsst.org (140.252.32.108) 56(84) bytes of data. --- lsst-dm-mac.lsst.org ping statistics --- 10 packets transmitted, 0 received, +10 errors, 100% packet loss, time 1869ms pipe 10 $ ssh -vvv [email protected] OpenSSH_7.2p2, OpenSSL 1.0.2j-fips 26 Sep 2016 debug1: Reading configuration data /home/jhoblitt/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: /etc/ssh/ssh_config line 3: Applying options for * debug1: auto-mux: Trying existing master debug1: Control socket "/home/jhoblitt/.ssh/master/[email protected]:22" does not exist debug2: resolving "lsst-dm-mac.lsst.org" port 22 debug2: ssh_connect_direct: needpriv 0 debug1: Connecting to lsst-dm-mac.lsst.org [140.252.32.108] port 22. debug2: fd 3 setting O_NONBLOCK debug1: connect to address 140.252.32.108 port 22: No route to host ... {code} From the production jenkins instance: {code:java} $ ssh -i /var/lib/jenkins/.ssh/id_rsa [email protected] -vvv OpenSSH_6.6.1, OpenSSL 1.0.1e-fips 11 Feb 2013 debug1: Reading configuration data /etc/ssh/ssh_config debug1: /etc/ssh/ssh_config line 56: Applying options for * debug2: ssh_connect: needpriv 0 debug1: Connecting to lsst-dm-mac.lsst.org [140.252.32.108] port 22. debug1: connect to address 140.252.32.108 port 22: Connection timed out ssh: connect to host lsst-dm-mac.lsst.org port 22: Connection timed out {code} | 0.5 |
1,813 | DM-8817 | 01/03/2017 13:31:04 | Port DMTN-017 LaTeX document to technote platform | DMTN-017 was written as a LaTeX doc before technotes existed. This story is to port DMTN-017 to a PDF viewer template (DM-4602) for PDF documents that can be published with LSST the Docs. | 1 |
1,814 | DM-8822 | 01/03/2017 14:19:48 | Fix octal umask handling in ctrl_pool | While reprocessing some HSC data, I received the error given below. This is because Python 3 no longer accepts integer literals with a leading zero as octal (they must be written as {{0o...}}). We should make the ctrl_pool code use an explicit octal int for the umask defined at the top of parallel.py, and then use a better formatting string where necessary. {code} SyntaxError: invalid token File "<string>", line 1 import os; os.umask(002); import lsst.ctrl.pool.log; lsst.ctrl.pool.log.jobLog("singleFrame"); import lsst.pipe.drivers.singleFrameDriver; lsst.pipe.drivers.singleFrameDriver.SingleFrameDriverTask.parseAndRun(); ^ SyntaxError: invalid token Tue Jan 3 12:38:28 PST 2017 Done. {code} | 1 |
1,815 | DM-8825 | 01/03/2017 17:31:51 | cannot run singleFrameDriver on hsc data with python3 due to "__builtin__.str" in *Mapper.paf | When attempting to reprocess some HSC data, I ran into a number of problems (one fixed in DM-8822). In this case, some of the Python type definitions in the obs_subaru {{Mapper.paf}} files are not valid: {{\_\_builtin\_\_.str}} doesn't exist on Python 3. Fortunately, because of our use of futurize, we have {{builtins.str}}. Although I don't plan to write it as part of this ticket, having a python2+3 validation system for the policy files would be good. | 1 |
1,816 | DM-8830 | 01/04/2017 11:09:57 | Fix accounting for fraction of successful measurements | The final accounting for success/fail of the KPMs in `validateDrp.py` appears to be off: {code} ================================================================= design level summary ================================================================= FAILED (37/21 measurements) ================================================================= {code} 1. Investigate why this is happening (likely a counter not being reset, or the denominator not being updated). 2. Decide on correct accounting and implement. | 1 |
1,817 | DM-8832 | 01/04/2017 12:12:17 | Butler mapper issue in validate_drp since Jenkins build 663 | Since [build 663 on Jenkins|https://ci.lsst.codes/job/validate_drp/dataset=cfht,label=centos-7,python=py2/663/console] we're seeing a new Butler-related issue in {{validate_drp}} even with the CFHT dataset. Could there be a butler change that is triggering this issue? This would have happened January 3-4 2017. {noformat} [py2] $ /bin/bash -e /tmp/hudson8266472982640135803.sh notice: lsstsw tools have been set up. Ingesting Raw data root INFO: Loading config overrride file '/home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/lsstsw/stack/Linux64/obs_cfht/12.1-14-gddeedbd+20/config/ingest.py' CameraMapper INFO: Unable to locate registry registry in root: /home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/validate_drp/Cfht/input/registry.sqlite3 CameraMapper INFO: Unable to locate registry registry in current dir: ./registry.sqlite3 CameraMapper INFO: Loading Posix registry from /home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/validate_drp/Cfht/input CameraMapper INFO: Unable to locate calibRegistry registry in root: /home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/validate_drp/Cfht/input/calibRegistry.sqlite3 CameraMapper INFO: Unable to locate calibRegistry registry in current dir: ./calibRegistry.sqlite3 CameraMapper INFO: Loading Posix registry from /home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/validate_drp/Cfht/input /home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/lsstsw/stack/Linux64/validation_data_cfht/master-g2016f8e221+2/raw/849375p.fits.fz --<link>--> /home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/validate_drp/Cfht/input/raw/06AL01/D3/2006-05-20/r/849375p.fits.fz /home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/lsstsw/stack/Linux64/validation_data_cfht/master-g2016f8e221+2/raw/850587p.fits.fz --<link>--> /home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/validate_drp/Cfht/input/raw/06AL01/D3/2006-06-02/r/850587p.fits.fz running processCcd validating Traceback (most recent call last): File "/home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/lsstsw/stack/Linux64/validate_drp/master-gf3b529ce90+2/bin/validateDrp.py", line 95, in <module> validate.run(args.repo, **kwargs) File "/home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/lsstsw/stack/Linux64/validate_drp/master-gf3b529ce90+2/python/lsst/validate/drp/validate.py", line 104, in run **kwargs) File "/home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/lsstsw/stack/Linux64/validate_drp/master-gf3b529ce90+2/python/lsst/validate/drp/validate.py", line 204, in runOneFilter verbose=verbose) File "/home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/lsstsw/stack/Linux64/validate_drp/master-gf3b529ce90+2/python/lsst/validate/drp/matchreduce.py", line 147, in __init__ repo, dataIds, matchRadius) File "/home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/lsstsw/stack/Linux64/validate_drp/master-gf3b529ce90+2/python/lsst/validate/drp/matchreduce.py", line 175, in _loadAndMatchCatalogs butler = dafPersist.Butler(repo) File "/home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/lsstsw/stack/Linux64/daf_persistence/12.1-17-g9654cba+1/python/lsst/daf/persistence/butler.py", line 304, in __init__ self._addRepo(args, inout='out', defaultMapper=defaultMapper, butlerIOParents=butlerIOParents) File "/home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/lsstsw/stack/Linux64/daf_persistence/12.1-17-g9654cba+1/python/lsst/daf/persistence/butler.py", line 399, in _addRepo "Could not infer mapper and one not specified in repositoryArgs:%s" % args) RuntimeError: Could not infer mapper and one not specified in repositoryArgs:RepositoryArgs(root='Cfht/output', cfgRoot=None, mapper=None, mapperArgs={}, tags=set([]), mode='rw', policy=None) Validation failed Build step 'Execute shell' marked build as failure [PostBuildScript] - Execution post build scripts. [py2] $ /bin/sh -xe /tmp/hudson1544818061461777296.sh ++ lsof -d 200 -t + Z= + [[ ! -z '' ]] + rm -rf /home/jenkins-slave/workspace/validate_drp/dataset/cfht/label/centos-7/python/py2/lsstsw/stack/.lockDir Archiving artifacts [INFO] HipChat notification sent to the following rooms: Bot: Jenkins Finished: FAILURE {noformat} | 1 |
1,818 | DM-8837 | 01/04/2017 17:38:15 | Reference catalog proper motions, parallaxes and errors | We will need to apply proper motion corrections when matching reference catalogs to measured catalogs. This epic will implement that feature. This will require several steps: # Implement standardized schema (per RFC-271) # Implement correction code # Write tests and verify performance The construction of reference catalogs is briefly discussed in LDM-151 v4.1 §6.1, but we defer to RFC-271 in terms of implementation detail. A deliverable for this epic will be documentation describing the format and contents of reference catalogs. Per the comment below, this epic also covers implementing RFC-368, including adding parallaxes to the reference catalogs. | 40 |
1,819 | DM-8841 | 01/04/2017 19:35:38 | ci_hsc is broken | [~Parejkoj] in DM-8825 points out that ci_hsc is broken. It appears to have broken between build [#932 (Jan 3, 2017 5:57:59 PM)|https://ci.lsst.codes/job/ci_hsc/932/] and [#933 (Jan 4, 2017 1:57:00 AM)|https://ci.lsst.codes/job/ci_hsc/933/]. Diagnose and fix. | 1 |
1,820 | DM-8842 | 01/05/2017 08:57:31 | LeastSqFitter1d(..., unsigned int order) should be signed | LeastSqFitter1d's constructor's {{order}} parameter is {{unsigned int}}. However, it is stored as {{int}}, and {{LeastSqFitter2d}} uses {{int}} in both places. I suggest switching to {{int}}. The existing code always supplies a small positive integer, so it's safe and trivial to fix. However, it will require the same change to the pybind11 interface file. I further suggest not changing this until after the pybind11 transition; then change it and make sure the whole stack builds. | 0.5 |
1,821 | DM-8866 | 01/05/2017 12:43:11 | Wrap datarel with pybind11 | May not have any work associated with it, but is an {{lsst_distrib}} dependency. Investigate and update SP. | 0.5 |
1,822 | DM-8867 | 01/05/2017 12:43:51 | Wrap lsst_apps with pybind11 | May not have any work associated with it, but is an {{lsst_distrib}} dependency. Investigate and update SP. | 0 |
1,823 | DM-8872 | 01/05/2017 12:52:20 | Disable locking in shared-stack.py | [~price] points out that we can (and likely should) disable EUPS locking globally by adding {code} hooks.config.site.lockDirectoryBase = None {code} to the EUPS configuration. | 1 |
1,824 | DM-8874 | 01/05/2017 12:54:24 | Wrap display_ds9 with pybind11 | May not have any work associated with it, but is an {{lsst_distrib}} dependency. Investigate and update SP. | 3 |
1,825 | DM-8877 | 01/05/2017 12:56:13 | Wrap skymap with pybind11 | May not have any work associated with it, but is an {{lsst_distrib}} dependency. Investigate and update SP. | 2 |
1,826 | DM-8890 | 01/05/2017 15:41:46 | Investigate if MariaDB 10.1+ DynamicField() can be used for storing JSON blobs | During deployment of SQuaSH we realized that JSONField() as implemented in DM-8414 works only with MySQL, and that the corresponding field type for MariaDB is DynamicField(). This ticket is to make sure we can use MariaDB features and stick with it in production. | 2 |
1,827 | DM-8914 | 01/06/2017 06:04:40 | Improve container build in Jenkins | Build script is here: https://github.com/lsst-sqre/jenkins-dm-jobs/blob/master/pipelines/qserv/docker/build.groovy#L25-L26 Dev containers should be created at each build, whereas release containers should only be created once a month. | 5 |
1,828 | DM-8923 | 01/06/2017 09:54:12 | remove jenkins hipchat notifications | The Slack migration appears to be a success and HipChat is essentially unused. HipChat notifications from Jenkins jobs should be safe to remove at this point, and it is an opportunity to prune another plugin. | 1 |
1,829 | DM-8933 | 01/06/2017 11:14:02 | Fix formatting in validateDrp.py --help message | The "description" currently reads as {code} Calculate and plot validation Key Project Metrics from the LSST SRD. http://ls.st/LPM-17\n Produces results to: STDOUT Summary of key metrics REPONAME*.png Plots of key metrics. Generated in current working directory. REPONAME*.json JSON serialization of each KPM. where REPONAME is based on the repository name but with path separators replaced with underscores. E.g., "Cfht/output" -> "Cfht_output_" {code} But it should read (as written in the string): {code} description = """ Calculate and plot validation Key Project Metrics from the LSST SRD. http://ls.st/LPM-17 Produces results to: STDOUT Summary of key metrics REPONAME*.png Plots of key metrics. Generated in current working directory. REPONAME*.json JSON serialization of each KPM. where REPONAME is based on the repository name but with path separators replaced with underscores. E.g., "Cfht/output" -> "Cfht_output_" """ {code} * I think this is just a matter of passing {{formatter_class=argparse.RawDescriptionHelpFormatter}} to {{argparse.ArgumentParser}} | 0 |
1,830 | DM-8934 | 01/06/2017 11:28:28 | Service Management & Emergent Work (December) | This story captures service management / emergent work for December activities related to the LSST development servers and Nebula OpenStack. This includes issues and project communications related to the home directory transition from NFS to GPFS, network settings maintenance in NPCF, tuning of HTCondor settings & installation on lsst-dev01, Nebula instance shutdowns / migrations to help support live migration functionality, and similar issues. | 5 |
1,831 | DM-8944 | 01/09/2017 10:02:02 | migrate squash production DB to MariaDB 10.1.x | Migrate the production RDS instance from MariaDB 10.0 -> 10.1 in preparation for the next squash release. | 0.5 |
1,832 | DM-8948 | 01/09/2017 12:14:15 | ctrl_stats fails tests in 2017 | {{lsst_py3}} is no longer building because {{ctrl_stats}} is getting the year wrong in tests. {code} tests/testTerminated.py ...F.F.. ====================================================================== FAIL: test4 (__main__.TestTerminated) ---------------------------------------------------------------------- Traceback (most recent call last): File "tests/testTerminated.py", line 79, in test4 self.assertEqual(rec.timestamp, self.year+"-08-21 10:27:31") AssertionError: '2016-08-21 10:27:31' != '2017-08-21 10:27:31' ====================================================================== FAIL: test6 (__main__.TestTerminated) ---------------------------------------------------------------------- Traceback (most recent call last): File "tests/testTerminated.py", line 102, in test6 self.assertEqual(rec.timestamp, self.year+"-08-21 10:29:43") AssertionError: '2016-08-21 10:29:43' != '2017-08-21 10:29:43' ---------------------------------------------------------------------- Ran 8 tests in 0.040s FAILED (failures=2) {code} 3 other test files fail. | 1 |
1,833 | DM-8949 | 01/09/2017 13:01:39 | ctrl_stats doesn't calculate year to next year progress properly | An issue in DM-8948 brought up a problem in how times are calculated within ctrl_stats. It made an assumption and used a file's creation date as the year to start with; this worked throughout the year, but failed if the files were pulled in one year and run in the next (time stamps of 2016 vs. the year being 2017). This assumption was fixed in DM-8948. There is still a problem in year-to-year calculations, for jobs that start in one year and end in the next: negative times will be calculated. This ticket will address this issue. | 8 |
1,834 | DM-8957 | 01/09/2017 17:45:41 | Original plot does not abort and overwrites active plot | Sometimes in the IRSA tri-view, the images don't match the table and the xy-plot. For example: in http://localhost:8080/firefly/lsst-pdac-triview.html;a=layout.showDropDown, select "Images" and "Science Ccd Exposure", target: ra = 9.6, dec = -1.1. Then in the tri-view, five images have been downloaded (one of the scienceCcdExposureId values is 1755410440). Now click any row whose images were never downloaded (for example, scienceCcdExposureId=4203410469); this new set of images starts to be downloaded. BEFORE (!) the downloading is finished, click back to the very first row (1755410440) and wait for the downloading to finish. When the downloading is done, you will see the inconsistent tri-view: the first row is selected in the table, the dot of the first row in the xy-plot is highlighted, but the images are the new set. If the user keeps clicking on the first row, the tri-view won't change and the inconsistency stays. Only when the user clicks any other row will the inconsistency be gone. Bug? | 5 |
1,835 | DM-8963 | 01/10/2017 12:41:07 | FITS download (save) does not work for 3 color image if the type is FITS | To reproduce the problem: # Start the IRSA viewer in the browser (localhost:8080/firefly/) # Select "Create a New Plot 3-color" # Select Red panel and Wise data # Enter "m31" # Click "Search" # Click the file save icon in the toolbar # Click "Download" # But nothing happens after clicking download NOTE: If the type of file is PNG, it works. | 0 |
1,836 | DM-8964 | 01/10/2017 13:14:57 | Update singleFrameDriver following changes to reference catalogs | The signature of {{ProcessCcdTask.\_\_init\_\_}} has changed (in DM-8232 to support distinct reference catalogs). {{SingleFrameDriverTask.\_\_init\_\_}} needs to be changed to match. | 0.5 |
1,837 | DM-8965 | 01/10/2017 14:08:18 | Extend Alert Production prototype with new index type | I want to try to improve indexing in the DiaObject table to make both search and insertion faster. | 8 |
1,838 | DM-8967 | 01/10/2017 14:53:19 | Documenting using emoji reactions in GitHub code reviews | After a discussion in Slack (a while ago in Dec 2016) we agreed that we're getting too many emails during code reviews when a developer replies "Done" to each line comment. A way around this is to use GitHub's emoji reactions: the dev can check off a comment, but an email is not sent out. This ticket documents that process in the [Developer Workflow page|https://developer.lsst.io/processes/workflow.html] of developer.lsst.io. | 0.5 |
1,839 | DM-8972 | 01/11/2017 11:15:00 | obs_cfht table file uses envAppend | obs_cfht.table includes: {code} envAppend(DYLD_LIBRARY_PATH, ${PRODUCT_DIR}/lib) envAppend(PYTHONPATH, ${PRODUCT_DIR}/python) envAppend(PATH, ${PRODUCT_DIR}/bin) {code} These {{envAppend}} calls should be {{envPrepend}}. | 1 |
1,840 | DM-8973 | 01/11/2017 12:47:31 | Wrap new Footprints with pybind11 and create python unit test | Wrap the new Footprints class with pybind11. A success criterion for this is porting and updating the Footprints Python unit test so that it runs successfully. | 8 |
1,841 | DM-8978 | 01/11/2017 16:58:39 | writing jointcal output is slow due to dataRef lookup | [~boutigny] noticed that jointcal is now very slow to write its output. This is at least in part due to the way I rewrote the output code to work with decam, which does not have "ccd" in its dataRefs. I think the new slowdown is coming from the repeated calls to {{get("calexp").getDetector().getId()}}. [~price] suggested making a dictionary to map visit and ccd name to each dataRef: {code} visit_ccd_to_dataRef = {(dataRef.dataId['visit'], dataRef.get('calexp').getDetector().getId()): dataRef for dataRef in dataRefs} {code} and then replacing the for loop and if statement with a lookup in the dict. Note to Butler people: this problem is related to the problem of having no standard set of identifiers in the dataIds: if we can guarantee that "visit" and "ccd" are always there (and, I'd argue, some other things), this code would be quite a bit simpler. | 2 |
1,842 | DM-8980 | 01/12/2017 09:36:54 | Revise Python Style Guide for RFC-107 (79 character docstring lengths) | This ticket will implement RFC-107, which states that all Python docstrings and comments must have a maximum line length of 79 characters. | 1 |
1,843 | DM-8982 | 01/12/2017 10:37:10 | Incorrect binning in overscan spline interpolation | The ordinates for the overscan spline interpolation can violate the requirement of being monotonically increasing in the presence of masked rows, causing GSL to reject them, so we end up raising an exception. For example (notice the third element): {code} #1 0x00002aaadf6ccae4 in lsst::afw::math::InterpolateGsl::InterpolateGsl ( this=0x6f4ef0, x=std::vector of length 30, capacity 32 = {...}, y=std::vector of length 30, capacity 32 = {...}, style=lsst::afw::math::Interpolate::AKIMA_SPLINE) at src/math/Interpolate.cc:211 211 int const status = ::gsl_interp_init(_interp, &x[0], &y[0], _y.size()); (gdb) p x $19 = std::vector of length 30, capacity 32 = {-4.6668978729026289, -1.0006934865900383, -2.2712418300653598, -0.76676245210727956, -0.70019157088122619, -0.63362068965517238, -0.5668103448275863, -0.5, -0.43342911877394619, -0.36685823754789276, -0.30028735632183906, -0.23371647509578544, -0.1669061302681992, -0.10009578544061301, -0.033524904214559385, 0.033045977011494247, 0.099616858237547928, 0.16618773946360157, 0.23299808429118776, 0.29980842911877403, 0.36637931034482762, 0.43295019157088127, 0.49952107279693497, 0.56609195402298851, 0.6329022988505747, 0.69971264367816088, 0.7662835249042147, 0.83285440613026807, 0.89942528735632199, 0.96623563218390807} {code} This is because the code that generates these ordinates (binning the overscan data vector) is incorrect. | 0.5 |
1,844 | DM-8990 | 01/13/2017 00:22:03 | wmgr database connection leak | Igor reported a MySQL error when loading lots of data into the qserv cluster at IN2P3: {noformat} I ran into an interesting problem when doing bulk loading of the KPM 20% data into the second Qserv cluster at *IN2P3*. At some point my loaders (at some point I had ~20 parallel loaders per each worker node) began to fail with the following complain: [DEBUG] lsst.qserv.wmgr.client: Response body: {"exception": "OperationalError", "message": "(_mysql_exceptions.OperationalError) (1040, 'Too many connections')"} [CRITICAL] Loader: Failed to create chunk 10772 for table 'ForcedSource' {noformat} I think it happens due to wmgr creating a new connection on every data load request (this was verified), because it creates a new sqlalchemy engine and: - every new sqlalchemy engine has its separate connection pool - engines are not destroyed by sqlalchemy (and not reused) To avoid this issue we should reuse engine instances and avoid creating new ones. May need to verify first whether my guesses above are true. | 2 |
1,845 | DM-9004 | 01/16/2017 11:14:41 | Check uses of darktime for NAN | The darktime can be {{NaN}} if not set explicitly in the obs package's {{makeRawVisitInfo}}. Any scaling of an exposure by the darktime can therefore result in a useless image full of {{NaN}} values. We therefore need to catch the case {{isnan(darktime)}} wherever we use it: pipe_drivers for construction of the dark, and ip_isr for application of the dark. | 1 |
1,846 | DM-9005 | 01/16/2017 11:55:17 | Shared stack build failures on lsst-dev01 | lsst-dev01 shared stack builds are having exactly the same problem with w_2017_2 & _3 as recorded for w_2017_1 in DM-8803: {code} ***** error: from /software/lsstsw/stack/EupsBuildDir/Linux64/meas_algorithms-12.1-15-g09aec8f/build.log: tests/testPsfCaching.cc(32): last checkpoint *** 1 failure is detected in the test module "PsfCaching" tests/testWarpedPsf Running 2 test cases... unknown location(0): fatal error: in "warpedPsf": signal: SIGSEGV, si_code: 0 (memory access violation at address: 0x00000080) tests/testWarpedPsf.cc(260): last checkpoint: "warpedPsf" entry. *** 1 failure is detected in the test module "DISTORTION" The following tests failed: /software/lsstsw/stack/EupsBuildDir/Linux64/meas_algorithms-12.1-15-g09aec8f/meas_algorithms-12.1-15-g09aec8f/tests/.tests/testExecutables.py.failed /software/lsstsw/stack/EupsBuildDir/Linux64/meas_algorithms-12.1-15-g09aec8f/meas_algorithms-12.1-15-g09aec8f/tests/.tests/testImagePsf.failed /software/lsstsw/stack/EupsBuildDir/Linux64/meas_algorithms-12.1-15-g09aec8f/meas_algorithms-12.1-15-g09aec8f/tests/.tests/testPsfAttributes.failed /software/lsstsw/stack/EupsBuildDir/Linux64/meas_algorithms-12.1-15-g09aec8f/meas_algorithms-12.1-15-g09aec8f/tests/.tests/testPsfCaching.failed /software/lsstsw/stack/EupsBuildDir/Linux64/meas_algorithms-12.1-15-g09aec8f/meas_algorithms-12.1-15-g09aec8f/tests/.tests/testWarpedPsf.failed 5 tests failed scons: *** [checkTestStatus] Error 1 scons: building terminated because of errors. + exit -4 eups distrib: Failed to build meas_algorithms-12.1-15-g09aec8f.eupspkg: Command: source /software/lsstsw/stack/eups/bin/setups.sh; export EUPS_PATH=/software/lsstsw/stack; (/software/lsstsw/stack/EupsBuildDir/Linux64/meas_algorithms-12.1-15-g09aec8f/build.sh) >> /software/lsstsw/stack/EupsBuildDir/Linux64/meas_algorithms-12.1-15-g09aec8f/build.log 2>&1 4>/software/lsstsw/stack/EupsBuildDir/Linux64/meas_algorithms-12.1-15-g09aec8f/build.msg exited with code 252 {code} | 1 |
1,847 | DM-9011 | 01/17/2017 12:13:38 | Make simpleShape less chatty | simpleShape can be verbose due to throwing {{pex::exceptions::RuntimeError}}. Throwing {{meas::base::MeasurementError}} would make it quieter. | 0.5 |
1,848 |
DM-9012
|
01/17/2017 12:33:12
|
Refactor FileInfo and FileData classes into one class
|
We have 2 FileData classes and 1 FileInfo class, all of which hold information about a file. Refactor these into one class - FileInfo.
| 3 |
1,849 |
DM-9013
|
01/17/2017 12:33:55
|
Re-enable MKL/OpenBLAS
|
Use of MKL/OpenBLAS was disabled (in commit {{6fe95ec}}) while adapting meas_mosaic to work with the LSST pipeline. It needs to be re-enabled so we're not limited to the slow matrix inversion using Eigen. Reverting that single commit is sufficient to get the threaded matrix inversion: {code} PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND 24920 pprice 32 12 14.7g 13g 45m R 717.0 14.3 506:09.03 python {code}
| 1 |
1,850 |
DM-9014
|
01/17/2017 12:44:26
|
Add 2-d version of cppIndex
|
Add a 2-d version of cppindex to pybind11.h in utils in order to give better error messages when used on 2-d arrays. Also fix the casting in the old cppIndex to avoid compiler warnings. Note that this is pybind11-related so the work should branch from and merge to DM-8467
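The intended 2-d semantics can be sketched in Python (function name and message format here are illustrative, not the actual utils API): normalize possibly-negative Python-style indices against both dimensions, and raise an error that reports both positions so the user can tell which index was out of range.

```python
def cpp_index_2d(size_i, size_j, i, j):
    """Normalize possibly-negative 2-d indices against a 2-d shape.

    Returns the pair of non-negative indices, or raises IndexError with
    a message that reports both requested positions and the array shape.
    """
    ni = i + size_i if i < 0 else i
    nj = j + size_j if j < 0 else j
    if not (0 <= ni < size_i and 0 <= nj < size_j):
        raise IndexError(
            "index (%d, %d) out of range for 2-d array of shape (%d, %d)"
            % (i, j, size_i, size_j)
        )
    return ni, nj
```

For example, `cpp_index_2d(3, 4, -1, -1)` normalizes to `(2, 3)`, while an out-of-range pair produces a message naming both indices rather than just one.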
| 0.5 |
1,851 |
DM-9016
|
01/17/2017 14:14:08
|
Drivers should be able to be made less verbose about eups
|
{{singleFrameDriver.py}} (and likely other pipe_drivers scripts) echoes a bunch of eups commands and their output on startup. Those commands aren't part of the standard logging system and it should be possible to turn them on/off with an argument. I suggest either: * {{--verbose-eups-dump}} to turn them on or * {{--quiet-eups-dump}} to turn them off. I'm not sure whether on or off is the better default, and I'd be happy with either. One can turn them off currently via {{--batch-type none}}, but that changes the actual type of processing, which is not desired.
| 1 |
1,852 |
DM-9024
|
01/18/2017 09:11:51
|
Amend Python test naming guidelines in Developer Guide Following RFC-229
|
Amend the Developer Guide to require that test modules be prefixed with {{test_}} to enable automatic pytest discovery: {code} tests/test_example.py {code} See RFC-229.
| 1 |
1,853 |
DM-9026
|
01/18/2017 10:35:38
|
Firefly IPAC table reader should handle data type "real"
|
IRSA's IPAC table reader handles real numbers as double. See details in [http://irsa.ipac.caltech.edu/applications/DDGEN/Doc/ipac_tbl.html]. Firefly's IPAC table reader currently treats real as char (the data type shown in the description) and String (the data type class). See the 4th column in the attached table. This can be confirmed by saving the table after upload: the type for column 'V' changed from "Real" to "Char". We need to follow IRSA's IPAC table definition and treat real as double.
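To illustrate the intended behavior, here is a minimal Python sketch of an IPAC-style type mapping (the mapping table and function names are hypothetical, not Firefly's actual implementation), in which "real" parses to a floating-point value rather than a string:

```python
# Hypothetical mapping of IPAC table data types to Python types.
# Per the IRSA IPAC table format, "real" (and its abbreviation "r")
# is a floating-point type and must not be treated as character data.
IPAC_TYPE_MAP = {
    "int": int, "i": int,
    "long": int, "l": int,
    "float": float, "f": float,
    "double": float, "d": float,
    "real": float, "r": float,
    "char": str, "c": str,
    "date": str,
}

def parse_value(ipac_type, token):
    """Convert a raw table token according to its declared IPAC type."""
    py_type = IPAC_TYPE_MAP.get(ipac_type.lower(), str)
    return py_type(token.strip())
```

With a mapping like this, a column declared "real" round-trips through save/reload as a numeric column instead of degrading to "char".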
| 1 |
1,854 |
DM-9041
|
01/18/2017 18:55:45
|
Make indexed reference loader agnostic to ingest name
|
The indexed reference loader reads the dataset name out of the ingest config, but that's ridiculous because you need to give it the dataset name to find the config. I'm removing the bit that loads the config and gets the name out of it, so that the loader will be agnostic to what name it was called at ingest time.
| 2 |
1,855 |
DM-9045
|
01/19/2017 09:07:45
|
Remove or revive bitrotted code in meas_modelfit
|
meas_modelfit contains a lot of code that was used for FDR-era prototyping of model fitting and has not been used since. Some of this is worth keeping and reviving, because it's hard-won algorithmic code we may want to use in the future, while some of it should be removed. In many cases, the unwanted code is being kept around because it's the only way the code we want is still tested; in other cases a small, easy-to-replace fraction of it is used by the bitrotted code we want to keep or by code in active use (i.e. CModel). This ticket is for cleaning up that mess. Roughly, that means: - Remove the custom table/record/catalog classes. - Remove the custom CmdLineTasks. - Remove the Sampler and Interpreter classes. Some of this work may be best done on other tickets, which should be linked here.
| 2 |
1,856 |
DM-9049
|
01/19/2017 13:57:00
|
Enable autolinking in Doxygen
|
During review of DM-7891, [~jsick] said he intends for the final LSST documentation to link all mentions of API components, and requested that a style rule concerning manual links ({{@ref}}) be removed from the draft style guidelines. Without such a rule, however, current documentation will have no links at all. Since it would be simpler to not introduce the rule now than to request/approve its removal later, the best solution is to enable Doxygen's autolinking, removing the need for {{@ref}} tags. This change should not cause compatibility problems for existing documentation that uses {{@ref}}.
| 1 |
1,857 |
DM-9050
|
01/19/2017 14:14:02
|
Add flags for sources used in astrometric and photometric calibration
|
The PSF modeling tasks conveniently create and set flags indicating which sources were used to determine the PSF model. The single-frame astrometric and photometric calibration tasks should do the same, indicating at least (for each procedure): - Which sources were selected for potential matching. - Which sources were actually matched. - Which sources were actually used for calibration. As we have done in PSF modeling, it would also be good to have the ability to reserve a set of candidate sources for validation purposes. [~Parejkoj] and/or [~cmorrison] may have opinions on how this interacts with the new source selection stuff they've been working on.
| 5 |
1,858 |
DM-9055
|
01/19/2017 15:39:22
|
DarkCombineTask broken
|
DM-8913 changed {{DarkCombineTask}} to use {{VisitInfo}}, but this assumed that the {{combined}} variable is an {{Exposure}}, but it's actually a {{DecoratedImage}}. That means we need to use a different means of getting the metadata in.
| 0.5 |
1,859 |
DM-9060
|
01/19/2017 19:46:27
|
Add metadata access to get wcs, visitInfo, and calib from a calexp dataset
|
This is regarding the recent discussion in the science-pipelines room about using composites to get components of a calexp. In {{CameraMapper}} we will add metadata readers with object constructors for wcs, calib, and visitInfo. The API to get the components and entire calexp would be {code} wcs = butler.get(‘calexp_wcs’, dataId={…}) calib = butler.get(‘calexp_calib’, dataId={…}) visitInfo = butler.get(‘calexp_visitInfo’, dataId={…}) calexp = butler.get(‘calexp’, dataId={…}) {code} (note: we were originally going to use butler composites for this, but decided this is a better way to go)
| 8 |
1,860 |
DM-9064
|
01/20/2017 09:20:17
|
Fix memory leak in SpanSets Persistence
|
A schema used in the persistence layer of SpanSets needs to be marked as persistent or it causes a memory leak. Mark the schema as such.
| 0.5 |
1,861 |
DM-9072
|
01/20/2017 16:24:51
|
Chart container tracking an active table in a table group
|
Currently, ChartsContainer only displays the default chart viewer. In the future, we want to add a tbl_group property and logic that ties a chart container to a given group. When a table group property is provided, the Chart Container would create a chart viewer for the given table group (if one does not yet exist) and display the charts related to the active table in that table group.
| 8 |
1,862 |
DM-9076
|
01/23/2017 12:08:33
|
Keep LS as only periodogram calculation option
|
Update the periodogram panel options so they reflect only the LS algorithm, keeping it as the single option for now. The others don't make sense with the current API, and there won't be time to make progress on that until the March release.
| 2 |
1,863 |
DM-9080
|
01/23/2017 18:22:48
|
labels for tabs were cut off a little at the bottom
|
The bottom of the letter 'g' was cut off in the labels. See attached image.
| 2 |
1,864 |
DM-9081
|
01/23/2017 18:39:29
|
testExposure.testGetWcs docstring is wrong, and tests should be assertIsNone
|
The docstring for {{testExposure.testGetWcs}} does not match the implemented tests in the method, and at least some of those tests appear to be incorrect, given the current (SWIGed) behavior of {{lsst.afw.image.Wcs}}. I've listed several obvious problems below: * The docstring claims exceptions should be raised, but none of the tests check for exceptions. * In addition, {{exposure.getWcs()}} returns None if the exposure was initialized without a Wcs, not False. None is "falsey", but if the API really wants None returned, we should test for that explicitly. * The two unadorned {{getWcs()}} calls should have an {{assertEqual}} against self.wcs, since that's what those exposures were initialized with. * I also noticed testSetMembers, which catches a pex.Exception, prints a message, and then continues on its merry way. This should also be fixed. This whole test suite needs to be looked at. That might be beyond scope for this ticket, but all of the above points are very worrying.
| 2 |
1,865 |
DM-9096
|
01/24/2017 15:50:38
|
Wrap geom with pybind11
|
Wrap package {{geom}} with pybind11 instead of Swig.
| 0 |
1,866 |
DM-9097
|
01/24/2017 17:19:42
|
Fix lsst-sphinx-bootstrap-theme deployment
|
https://github.com/lsst-sqre/lsst-sphinx-bootstrap-theme is on PyPI, but the current v0.1.0 distribution is missing the {{templates/}} directory. Could have been human error in setting up the package, or a problem with wheels. This ticket will fix the PyPI distribution, and possibly set up automated continuous delivery with Travis to PyPI.
| 0.5 |
1,867 |
DM-9098
|
01/24/2017 18:41:49
|
Update mariadb eups packages
|
Experimentation with the latest MariaDB containers shows indications of significant performance improvements. We'd like to update the MariaDB server and client to the latest versions before undertaking our upcoming qserv KPMs.
| 1 |
1,868 |
DM-9102
|
01/25/2017 12:54:42
|
Update the import package in LSSTFileGroupProcessor
|
The FileInfo class was updated and moved to a new location. This caused LSSTFileGroupProcessor to fail. LSSTFileGroupProcessor should be modified to import from the new location.
| 0.5 |
1,869 |
DM-9105
|
01/25/2017 14:22:43
|
Make SpanSet operator templates more generic
|
Expand the flatten and unflatten methods of SpanSets so that they can operate on multi-dimensional ndarrays. This work involves making the template parameters for these functions and the getter classes more generic.
| 2 |
1,870 |
DM-9109
|
01/25/2017 16:40:15
|
Create ellipticity residuals quiver plots
|
Include ellipticity residual comparison and quiver plots in the analysis script, including: *Intra-stack:* scatter/histogram/sky plots for *psfUsed* stars of > residual e1 & e2 ellipticities between the source and model psf at the position of the source (for both SDSS and HSM measurements), where: {code} e1 = (shape_xx - shape_yy)/(shape_xx + shape_yy) e2 = 2*shape_xy/(shape_xx + shape_yy) {code} > residual ellipticity, δe, between source and model psf at source position quiver plot where: {code} δe = sqrt((e1_src - e1_psf)^2 + (e2_src - e2_psf)^2) {code} *Inter-stack:* scatter/histogram/sky comparison plots for: > comparison of the trace radii of the psf models at the position of *psfUsed* stars matched between the two stack catalogs, where: {code} traceRadius = sqrt(0.5*(shape_xx + shape_yy)) {code}
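The quantities above can be sketched in Python as follows (function names are illustrative, not the analysis script's actual API; each moment tuple is assumed to be (shape_xx, shape_yy, shape_xy) measured at the same position):

```python
import math

def ellipticity(xx, yy, xy):
    """e1, e2 ellipticity components from second moments."""
    denom = xx + yy
    return (xx - yy) / denom, 2.0 * xy / denom

def ellipticity_residual(src, psf):
    """Magnitude of the ellipticity residual, delta-e, between a source
    and the PSF model evaluated at the source position."""
    e1s, e2s = ellipticity(*src)
    e1p, e2p = ellipticity(*psf)
    return math.hypot(e1s - e1p, e2s - e2p)

def trace_radius(xx, yy):
    """Trace radius sqrt(0.5*(shape_xx + shape_yy)) used for PSF size
    comparisons between stacks."""
    return math.sqrt(0.5 * (xx + yy))
```

For example, a round source (xy = 0) with xx = yy gives e1 = e2 = 0, and identical source and PSF moments give a zero residual, as expected.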
| 5 |
1,871 |
DM-9113
|
01/26/2017 08:26:02
|
the bias ingestion should not care about the filter
|
In the CP MasterCal bias products, the filter in the header can be anything. Currently the ingest code updates the validity range for each filter, which can result in more than one bias being valid at one time.
| 1 |
1,872 |
DM-9120
|
01/26/2017 19:09:57
|
matcherSourceSelector incorrectly uses nChild and footprints in isMultiple test.
|
This bug was found by dm-square as a drop in match RMS quality. The replacement for the SourceInfo class in matchOptimisticB.py has a test for isMultiple, but this test was not used in the subsequent isGood or isUsable tests. The current implementation of the new matcherSourceSelector incorrectly uses this test; to parrot the performance of SourceInfo (which was the goal of DM-6824), this test should be removed.
| 1 |
1,873 |
DM-9126
|
01/27/2017 11:42:10
|
qserv_distrib does not setup qserv_testdata
|
[~jammes] reported on slack that the {{qserv:dev}} container is unable to setup the {{qserv}}. Demonstration of issue: {code:java} [master] ~ $ docker run -ti qserv/qserv:dev qserv@f405e3798cb4:~$ . /qserv/stack/loadLSST.bash qserv@f405e3798cb4:~$ setup qserv_distrib -t qserv-dev qserv@f405e3798cb4:~$ eups list -s | egrep ^qserv qserv_distrib 1.0.0+664 b2666 qserv-dev setup {code} Expected result: {code:java} [master] ~ $ docker run -ti qserv/qserv:dev qserv@ab194158b222:~$ . /qserv/stack/loadLSST.bash qserv@ab194158b222:~$ setup qserv_distrib qserv@ab194158b222:~$ eups list -s | egrep ^qserv qserv 12.1-27-gc42959c b2531 current b2541 w_2016_51 setup qserv_distrib 1.0.0+652 b2531 current b2541 w_2016_51 setup qserv_testdata 12.0+79 current b2541 qserv-dev w_2016_51 b2531 setup {code} This appears to be caused by this changed commited on Dec 19th as part of DM-8256: https://github.com/lsst/lsstsw/commit/49acd3e33364d0b4b67a7900d910fe121a0ac8fb
| 0.5 |
1,874 |
DM-9131
|
01/27/2017 12:52:21
|
Apply second round of Robert's LDM-151 comments
|
Having applied Robert's first round of corrections, a second marked-up PDF with further comments exists, which should be applied during this iteration on LDM-151.
| 8 |
1,875 |
DM-9135
|
01/27/2017 13:57:00
|
bulk rename of jointcal variables
|
Jointcal currently uses very non-standard, non-stylistic, and/or non-helpful variable and method names (e.g. {{void FittedStar::SetRefStar(const RefStar *R)}}). I've been cleaning them up piecemeal as I go, but that results in confusing commits and can be a pain. The clang format package provides "clang-rename", which might help take care of most/all of these in one fell swoop. Other suggestions are welcome (SublimeText's smart selection isn't quite smart enough, particularly for single-character variable names).
| 2 |
1,876 |
DM-9136
|
01/27/2017 14:30:51
|
Include CBP coordinate transformation system in LDM-151
|
Per [LCR-581|https://docushare.lsstcorp.org/docushare/dsweb/Get/Version-37829/LCR-581CalibrationHardwareRequirementsUpdateApproved.pdf]: {quote} To be added to LSE-30 (OSS): Beam Projector Coordinate Relationship Specification: Coordinate system transformations shall be measured and/or computed relating the collimated beam projector position and telescope pupil position to the illumination position on the telescope optical elements and focal plane, and a software interface shall be developed to represent these relationships, including their possible evolution in time. Justification: This is necessary to facilitate the data acquisition and reduction. The user shall be able to specify an LSST pupil and focal plane position for a given spot, then have the CBP and telescope offset accordingly. Similarily, the spot positions should be predictable based on the CBP and telescope position. This requirement will need to be flowed down appropriately to the Data Management and Observatory Control System requirements documents. It is assumed that Data Management will develop a Python interface that represents the relationships, and that the Observatory Control System will use those in the construction of the control system functions relating to the collimated beam projector. {quote} Note the requirements on DM above. These should be reflected in LDM-151.
| 2 |
1,877 |
DM-9162
|
01/30/2017 17:41:15
|
magnitude should be plotted in decreasing order in period finding layout
|
The preview phase-folded curve in the period finding layout shows the magnitude axis in increasing order, but it should be in decreasing order to be consistent with the result layout.
| 1 |
1,878 |
DM-9163
|
01/30/2017 18:47:33
|
newinstall.sh broken by conda package removal from public channels
|
{code:java} Installing Miniconda2 Python Distribution ... [ 1/1 ] miniconda2 4.2.12 ... ***** error: from /newinstall/EupsBuildDir/Linux64/miniconda2-4.2.12/build.log: Fetching package metadata ....... .Solving package specifications: . PackageNotFoundError: Package not found: '' Packages missing in current linux-64 channels: - fontconfig 2.12.1 0 - glib 2.50.2 0 {code}
| 1 |
1,879 |
DM-9165
|
01/30/2017 18:57:49
|
Add feature to overlay the searched position on the coverage image.
|
The old Firefly API 'addCoveragePlot' overlays the coverage image with the searched position given by the 'OverlayPosition' parameter. This feature is missing from the new showCoverage function. Catalog search results should also have the searched position overlaid. Please confirm.
| 3 |
1,880 |
DM-9166
|
01/31/2017 07:19:35
|
Help IN2P3 scientist to load stack processed data inside Qserv
|
Help [~nchotard] to load LSST processed data inside Qserv.
| 5 |
1,881 |
DM-9182
|
01/31/2017 14:46:13
|
Cleanup pybind11 code in afw
|
Following the review of DM-9063, I've put together a list of fixes to make for most of afw and its dependencies, which I have attached to the ticket. I have *not* looked at afw::table, as that will be reviewed and fixed independently on DM-8716. First, a list of *generic* issues that weren't worth capturing on a file-by-file basis (though I did start by doing that). This is essentially a checklist that should be used by all pybind11 cleanup issues: - Move trivial Python extensions to C++. - Use continueClass decorator in remaining Python files. - Reorganize/rename files as per RFC. - Address any TODOs - Remove commented-out headers. - Make sure pybind11 header is included first (because Python.h needs to be included first). - Remove (or otherwise address) commented-out code - Remove any use of ndarray/converters.h (redundant with ndarray/pybind11.h) - Make sure all functions that should have kwargs do. - Replace py::arg("foo") with "foo"_a (with using pybind11::literals) - Make sure shared_ptr holder type is used for all but trivial classes - Look for comment headings that are unused or disrupt readability - Make sure {{py::is_operator}} is used on binary, non-in-place operators, and is not used anywhere else. - Look for lambdas with non-const reference arguments; could they be const references? - Look for getters that should be using reference_internal - Remove spurious (empty) wrapper files. - Determine whether enums are used as enums or integer constants, and adjust wrappers accordingly. - Check for anonymous namespace and `static` usage. - Check for worthless module docstrings. - Define typedefs for py::class_ instantiations, or otherwise ensure they're not repeated. - Don't use ::Ptr (here, or anywhere). - Run clang-format? Some of the indentation is really terrible and definitely not conformant, and clang-format improves it. - Delete trailing whitespace.
| 3 |
1,882 |
DM-9186
|
01/31/2017 16:30:51
|
Create hscIsr.py script
|
I have some unusual data (HSC pinhole filter images) for which I'd like to run only very basic Isr tasks (overscan subtraction and bias subtraction), which is tricky to manage with processCcd.py. This ticket is to create a subaruIsr.py command line executable to accomplish this.
| 2 |
1,883 |
DM-9187
|
01/31/2017 16:35:29
|
port jointcal to pybind11
|
Jointcal's python interface is currently SWIG-based. Now that most (all?) of the dependencies are converted, it's time to convert jointcal to pybind11.
| 8 |
1,884 |
DM-9196
|
01/31/2017 18:26:33
|
cutout size parameter is not passed in the request by the download dialog
|
The download dialog has an option 'cutout_size', but it is not passed to the server. Please fix.
| 1 |
1,885 |
DM-9206
|
02/01/2017 07:50:18
|
Precision of expression columns
|
This is a bug introduced in the DM-8367 line chart work. The precision of the expression columns is not set when the values are saved to an IPAC table. When the table is read back and the first row happens to be 0.0, all values will be passed to the client with 1 decimal digit. Test: load the sample table, change the y column to -count/time. Notice that all values in the point tooltip have 1 digit after the decimal point and that highlighted points do not match the points on the line. (Highlighted values are calculated on the client from the values of the table; plot points on the server.) https://github.com/Caltech-IPAC/firefly/pull/279
| 2 |
1,886 |
DM-9233
|
02/01/2017 13:25:17
|
Add default constructor to new Footprints
|
Having a default constructor for Footprints would be useful in many cases. Add one which takes only a PeakSchema as a default argument and creates a null SpanSet.
| 1 |
1,887 |
DM-9242
|
02/02/2017 09:47:02
|
squash KPM plots should label the time axis with a timezone
|
At present, the KPM time series plots do not label the time axis with the timezone.
| 0.5 |
1,888 |
DM-9245
|
02/02/2017 10:22:13
|
Replacing JS package manager npm with yarn
|
Yarn is a new package manager for JS. It increases performance and is more reliable. You do need to install yarn. From your command prompt: npm install yarn -g Depending on where you install node, you may need sudo for this to work. I've updated our docs to reflect this new requirement. In order to get consistent installs across machines, yarn.lock is used. If you update package.json, you need to run yarn install to create a new yarn.lock file. You must commit the new yarn.lock file with your updated package.json for the build to work.
| 1 |
1,889 |
DM-9249
|
02/02/2017 11:26:09
|
Modify FlagHandler C++ and flagDecorator.py to make flag identification robust
|
As discovered by DM-6561, the FlagHandler mechanism for connecting the enumeration of flags in C++ and the order in which the flags appear in the schema and internal FlagHandler structures is not robust. Fix this problem so that the identifier used to identify a particular flag and the lookup of the Flag Key are guaranteed to match. Then fix the flagDecorator (it will be simpler) and all of the algorithms to match the new FlagHandler scheme.
| 8 |
1,890 |
DM-9253
|
02/02/2017 16:12:44
|
Prepare for IRSA Time series viewer release
|
This story is to collect the relevant tickets and track them in order to prepare for the release of the Time Series viewer (old LC) in March. Links to issues should be added. The RC is expected to happen around mid-February.
| 8 |
1,891 |
DM-9254
|
02/02/2017 17:41:36
|
Firefly is not working in Windows browser IE 11
|
I've tested Firefly in Windows browser IE11 and it doesn't render. We should try to figure out the issues first. Make new tickets for large bugs.
| 8 |
1,892 |
DM-9261
|
02/03/2017 11:34:20
|
Update git-lfs repositories to address deprecations.
|
Update all git-lfs repositories to be compliant with current git-lfs best practices. 1. Remove {{batch = false}} configurations. 2. Ensure {{.lfsconfig}} files exist.
| 2 |
1,893 |
DM-9274
|
02/03/2017 18:15:11
|
Build Python doc using Sphinx
|
Jonathon Sick has a boilerplate to build docs for numpydoc docstrings. This is a ticket to use that for the SUIT Python doc build. https://community.lsst.org/t/what-packages-have-numpydoc-docstrings-so-far/1612 Please add the cheat sheet for Python doc generation in https://confluence.lsstcorp.org/display/DM/Python+document+generation+cheat+sheet.
| 8 |
1,894 |
DM-9275
|
02/04/2017 08:35:19
|
Port obs_monocam to pybind11
|
Port the obs_monocam package to pybind11. Note that obs_monocam is part of lsst_distrib.
| 0.5 |
1,895 |
DM-9294
|
02/06/2017 13:16:10
|
makeCamera.py has undefined variables
|
Function {{makeAmp}} in {{makeCamera.py}} has two undefined variables: {{nExtended}} and {{nOverclock}}.
| 1 |
1,896 |
DM-9298
|
02/06/2017 15:54:41
|
The stripMetadata argument of makeWcs doesn't work reliably
|
The function {{afw::image::makeWcs(metadata, stripMetadata)}} with its second argument true does not strip metadata if certain values are present, because those values induce a deep copy of the metadata, and the keywords are stripped from the copy.
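A minimal Python sketch of this failure mode (the key names and the copy trigger are illustrative, not afw's actual logic): when certain values force the function to work on a deep copy, the strip removes keywords from the copy, so the caller's metadata object is left untouched.

```python
import copy

def make_wcs(metadata, strip_metadata=False):
    """Toy model of a WCS factory that sometimes deep-copies its input.

    If a triggering key is present, the function operates on a deep copy,
    so stripping removes keywords from the copy rather than from the
    caller's dict -- mirroring the bug described above.
    """
    # Hypothetical trigger: some values induce a deep copy of the metadata.
    md = copy.deepcopy(metadata) if "LTV1" in metadata else metadata
    wcs = {"crval1": md.get("CRVAL1")}
    if strip_metadata:
        md.pop("CRVAL1", None)  # may strip only the copy
    return wcs

md = {"CRVAL1": 10.0, "LTV1": 0.0}
make_wcs(md, strip_metadata=True)
# "CRVAL1" is still present in md, even though stripping was requested.
```

Without the triggering key, the same call removes "CRVAL1" from the caller's dict as expected, which is why the bug only shows up for some inputs.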
| 0.5 |
1,897 |
DM-9301
|
02/07/2017 01:07:05
|
Implement parallel processing in L1DB prototype
|
Next step in understanding alert production performance is to see if parallelising processing can help with reducing database access overhead.
| 8 |
1,898 |
DM-9313
|
02/07/2017 10:47:25
|
obs_decam should not call stripWcsKeywords
|
{{DecamMapper}} presently calls {{stripWcsKeywords}} in several places. This function is not part of afw's public interface (it is in namespace {{lsst::afw::image::detail}}), and the call is unnecessary because {{makeWcs}} will strip the metadata if told to do so (by setting the second argument true). Fixing this will help the pybind11 conversion effort because the pybind11-wrapped version of afw does not make {{stripWcsKeywords}} available, and we'd rather not change that.
| 0.5 |