message | diff
---|---
adding logo for license, version, download, status
add slack channel link | +| |license| |status| |versions| |downloads|
+
+.. |license| image:: https://img.shields.io/pypi/l/buildtest-framework.svg
+.. |status| image:: https://img.shields.io/pypi/status/buildtest-framework.svg
+.. |versions| image:: https://img.shields.io/pypi/pyversions/buildtest-framework.svg
+.. |downloads| image:: https://img.shields.io/pypi/dw/buildtest-framework.svg
+
+
buildtest-framework
-------------------
-Description
------------
-The buildtest repository is an **Automatic Test Generation Framework** for writing test cases suited to work well in a HPC environment. This framework heavily relies that your application is built with `EasyBuild <https://easybuild.readthedocs.io/en/latest/>`_
-and your system has `Lmod <http://lmod.readthedocs.io/en/latest/010_user.html>`_ or `Environment Modules <http://modules.sourceforge.net>`_ for managing modules
+`buildtest <http://buildtestdocs.readthedocs.io/en/latest/>`_ is a HPC Software
+Testing Framework designed to build tests quickly and verify an entire software stack
+
+
Documentation
-------------
Please see `buildtest documentation <http://buildtestdocs.readthedocs.io/en/latest/>`_ for more details.
-Contact
--------
-email: [email protected]
+
+Slack
+------
+
+Join us at hpcbuildtest.slack.com
+
+
|
Fix broken link on installing Qiskit Terra from Source
Fix broken link on how to install Qiskit Terra from source in the file CONTRIBUTING.md.
Fixes | @@ -121,10 +121,8 @@ Issue #190: Short summary of the issue
Installing Qiskit Terra from source
-----------------------------------
-
Please see the [Installing Qiskit Terra from
-Source](https://qiskit.org/documentation/install/terra.html) section of
-the Qiskit documentation.
+Source](https://qiskit.org/documentation/contributing_to_qiskit.html#installing-terra-from-source) section of the Qiskit documentation.
### Test
|
selector form fixed
HG--
branch : feature/microservices | @@ -132,9 +132,8 @@ Ext.define("NOC.sa.managedobjectselector.Application", {
{
xtype: "fieldset",
title: __("Filter by Object Attributes"),
- layout: "hbox",
defaults: {
- labelAlign: "top",
+ labelAlign: "left",
padding: 4
},
items: [
@@ -174,9 +173,8 @@ Ext.define("NOC.sa.managedobjectselector.Application", {
{
xtype: "fieldset",
title: __("Filter by Network Attributes"),
- layout: "hbox",
defaults: {
- labelAlign: "top",
+ labelAlign: "left",
padding: 4
},
items: [
|
function/FHNIntegrator: Update to devel version
("functions/FHNIntegrator: cast
input to 1d array")
("function/FHNIntegrator: Make
sure dv value is always a vector")
both made local modifications that did not make it to devel. Revert back
to devel version. | @@ -7703,7 +7703,10 @@ class FHNIntegrator(Integrator): # --------------------------------------------
#Gilzenrat paper - hardcoded for testing
# val = (v - 0.5*w)
- return np.broadcast_to(val, np.atleast_1d(variable).shape)
+ if not np.isscalar(variable):
+ val = np.broadcast_to(val, variable.shape)
+
+ return val
def function(self,
variable=None,
|
cirrus CI build: fix docker context to make COPY instructions work
see :
> CIRRUS_DOCKER_CONTEXT: Docker build's context directory to use for Dockerfile as a CI environment. Defaults to project's root directory. | @@ -168,6 +168,7 @@ task:
path: "contrib/build-wine/dist/*"
env:
CIRRUS_WORKING_DIR: /opt/wine64/drive_c/electrum
+ CIRRUS_DOCKER_CONTEXT: contrib/build-wine
task:
name: Android build
@@ -206,6 +207,8 @@ task:
- ./contrib/build-linux/appimage/make_appimage.sh
binaries_artifacts:
path: "dist/*"
+ env:
+ CIRRUS_DOCKER_CONTEXT: contrib/build-linux/appimage
task:
name: tarball build
|
DOC: add `printoptions` as a context manager to `set_printoptions`
DOC: add `printoptions` as a context manager to `set_printoptions` | @@ -200,6 +200,8 @@ def set_printoptions(precision=None, threshold=None, edgeitems=None,
-----
`formatter` is always reset with a call to `set_printoptions`.
+ Use `printoptions` as a context manager to set the values temporarily.
+
Examples
--------
Floating point precision can be set:
@@ -239,6 +241,13 @@ def set_printoptions(precision=None, threshold=None, edgeitems=None,
>>> np.set_printoptions(edgeitems=3, infstr='inf',
... linewidth=75, nanstr='nan', precision=8,
... suppress=False, threshold=1000, formatter=None)
+
+ Also to temporarily override options, use `printoptions` as a context manager:
+
+ >>> with np.printoptions(precision=2, suppress=True, threshold=5):
+ ... np.linspace(0, 10, 10)
+ array([ 0. , 1.11, 2.22, ..., 7.78, 8.89, 10. ])
+
"""
legacy = kwarg.pop('legacy', None)
if kwarg:
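
For illustration (not part of the patch, only assuming the public NumPy API), a small sketch of the restore-on-exit behaviour the context manager provides:

import numpy as np

before = np.get_printoptions()["precision"]          # NumPy's default precision is 8
with np.printoptions(precision=2, suppress=True):
    print(np.get_printoptions()["precision"])        # 2 while inside the block
print(np.get_printoptions()["precision"] == before)  # True: the old options are restored on exit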
|
Added deprecation warning
Added deprecation warning to untransform_grad method of NormalizationTransformer class. Related to issue | @@ -587,9 +587,10 @@ class NormalizationTransformer(Transformer):
return z * y_stds
def untransform_grad(self, grad, tasks):
- """
- Undo transformation on gradient.
- """
+ """DEPRECATED. DO NOT USE."""
+ logger.warning(
+ "NormalizationTransformer.untransform_grad is DEPRECATED and will be removed in a future version of DeepChem. Manually implement transforms to perform force calculations."
+ )
if self.transform_y:
grad_means = self.y_means[1:]
|
robocorp-code: update basic tutorial URL
- Allows us to get rid of an old redirect | @@ -24,7 +24,7 @@ Note: the use of cloud-based orchestration in [Robocorp Cloud](https://robocorp.
1. Install this extension together with the [Robot Framework Language Server extension](https://marketplace.visualstudio.com/items?itemName=robocorp.robotframework-lsp).
-1. Download [Robocorp VS Code extension - basic tutorial](https://robocorp.com/robots/robot/robocorp-vs-code-extension-basic-tutorial), and open it in VS Code.
+1. Download [Robocorp VS Code extension - basic tutorial](https://robocorp.com/portal/robot/robocorp/example-vscode-basics), and open it in VS Code.
1. Open the command palette - (Windows, Linux): `ctrl-shift-P` (macOS): `cmd-shift-P`
|
[Core] fix wrong memory size reporting
The current resource reporting is run in OSS. Revert the change. For example it reported
InitialConfigResources: {node:172.31.45.118: 1.000000}, {object_store_memory: 468605759.960938 GiB},
For 10GB memory object_store. | @@ -197,8 +197,7 @@ double ResourceSet::GetNumCpusAsDouble() const {
std::string format_resource(std::string resource_name, double quantity) {
if (resource_name == "object_store_memory" ||
resource_name.find(kMemory_ResourceLabel) == 0) {
- // The memory resources (in 50MiB unit) are converted to GiB
- return std::to_string(quantity * 50 / 1024) + " GiB";
+ return std::to_string(quantity / (1024 * 1024 * 1024)) + " GiB";
}
return std::to_string(quantity);
}
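
A rough sanity check of the two formulas, sketched in Python; it assumes the reported quantity is a raw byte count for a 10 GB object store, as in the example above:

object_store_bytes = 10 * 10**9                  # a 10 GB object store

old_gib = object_store_bytes * 50 / 1024         # old code assumed the value was in 50 MiB units
new_gib = object_store_bytes / (1024 ** 3)       # new code converts bytes to GiB

print(old_gib)   # ~4.9e8 "GiB": the absurd hundreds-of-millions figure from the bug report
print(new_gib)   # ~9.31 GiB, the expected order of magnitude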
|
Migrate from louie/dispatcher to eventdispatcher: media_renderer_client.py
Add/fix module documentation and rename events as follows:
- Coherence.UPnP.DeviceClient.detection_completed => device_client_detection_completed | # http://opensource.org/licenses/mit-license.php
# Copyright 2006, Frank Scholz <[email protected]>
+# Copyright 2018, Pol Canelles <[email protected]>
+
+'''
+:class:`MediaRendererClient`
+----------------------------
+
+A class representing an media renderer client device.
+'''
+
+from eventdispatcher import EventDispatcher, Property
-import coherence.extern.louie as louie
from coherence import log
from coherence.upnp.services.clients.av_transport_client import \
AVTransportClient
@@ -13,25 +22,45 @@ from coherence.upnp.services.clients.rendering_control_client import \
RenderingControlClient
-class MediaRendererClient(log.LogAble):
+class MediaRendererClient(EventDispatcher, log.LogAble):
+ '''
+ .. versionchanged:: 0.9.0
+
+ * Introduced inheritance from EventDispatcher
+ * The emitted events changed:
+
+ - Coherence.UPnP.DeviceClient.detection_completed =>
+ device_client_detection_completed
+
+ * Changed class variable :attr:`detection_completed` to benefit
+ from the EventDispatcher's properties
+ '''
logCategory = 'mr_client'
+ detection_completed = Property(False)
+ '''
+ To know whenever the device detection has completed. Defaults to `False`
+ and it will be set automatically to `True` by the class method
+ :meth:`service_notified`.
+ '''
+
def __init__(self, device):
log.LogAble.__init__(self)
+ EventDispatcher.__init__(self)
+ self.register_event(
+ 'device_client_detection_completed',
+ )
+
self.device = device
+ self.device.bind(device_service_notified=self.service_notified)
self.device_type = self.device.get_friendly_device_type()
+
self.version = int(self.device.get_device_type_version())
self.icons = device.icons
self.rendering_control = None
self.connection_manager = None
self.av_transport = None
- self.detection_completed = False
-
- louie.connect(self.service_notified,
- signal='Coherence.UPnP.DeviceClient.Service.notified',
- sender=self.device)
-
for service in self.device.get_services():
if service.get_type() in [
"urn:schemas-upnp-org:service:RenderingControl:1",
@@ -120,7 +149,8 @@ class MediaRendererClient(log.LogAble):
if self.av_transport.service.last_time_updated is None:
return
self.detection_completed = True
- louie.send('Coherence.UPnP.DeviceClient.detection_completed', None,
+ self.dispatch_event(
+ 'device_client_detection_completed',
client=self, udn=self.device.udn)
def state_variable_change(self, variable):
|
Fixed Mouser API missing data cases
No Availability declared
No price list declared | @@ -214,7 +214,7 @@ class MouserPartSearchRequest(MouserBaseRequest):
part_data = parts[0]
# Merge
for key in cleaned_data:
- cleaned_data[key] = part_data[key]
+ cleaned_data[key] = part_data.get(key, cleaned_data[key])
return cleaned_data
def print_clean_response(self):
@@ -346,7 +346,7 @@ class api_mouser(distributor_class):
if res_stock:
dd.qty_avail = int(res_stock.group(1))
pb = data['PriceBreaks']
- dd.currency = pb[0]['Currency']
+ dd.currency = pb[0]['Currency'] if pb else currency
dd.price_tiers = {p['Quantity']: get_number(p['Price']) for p in pb}
# Extra information
product_description = data['Description']
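
The fix relies on dict.get() with a fallback, so fields the API omits keep their defaults instead of raising KeyError; a minimal sketch with made-up field values:

cleaned_data = {"Availability": "", "PriceBreaks": []}           # defaults
part_data = {"PriceBreaks": [{"Quantity": 1, "Price": "0.10"}]}  # "Availability" missing from the response

merged = {key: part_data.get(key, cleaned_data[key]) for key in cleaned_data}
print(merged)   # missing keys fall back to the defaults instead of raising KeyError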
|
Close sys.stdin before assigning to it
This is an attempt to fix issue (ResourceWarning: unclosed file).
I cannot reproduce it myself, but closing sys.stdin before assigning to it
seems to be the right thing to do. | @@ -360,6 +360,7 @@ def reader_process(file, file2, connections, queue, buffer_size, stdin_fd):
and finally sends "poison pills" (the value -1) to all connections.
"""
if stdin_fd != -1:
+ sys.stdin.close()
sys.stdin = os.fdopen(stdin_fd)
try:
with xopen(file, 'rb') as f:
|
CLN: Remove character
Remove extra 9 | @@ -189,4 +189,4 @@ https://github.com/statsmodels/statsmodels/issues
.. |Conda Version| image:: https://anaconda.org/conda-forge/statsmodels/badges/version.svg
:target: https://anaconda.org/conda-forge/statsmodels/
.. |License| image:: https://img.shields.io/pypi/l/statsmodels.svg
- :9target: https://github.com/statsmodels/statsmodels/blob/master/LICENSE.txt
+ :target: https://github.com/statsmodels/statsmodels/blob/master/LICENSE.txt
|
Print more helpful error message for corrupt event
fixes | @@ -218,9 +218,18 @@ class Event:
@property
def recurring(self):
- return 'RRULE' in self._vevents[self.ref] or \
+ try:
+ rval = 'RRULE' in self._vevents[self.ref] or \
'RECURRENCE-ID' in self._vevents[self.ref] or \
'RDATE' in self._vevents[self.ref]
+ except KeyError:
+ logger.fatal(
+ f"The event at {self.href} might be broken. You might want to "
+ "file an issue at https://github.com/pimutils/khal/issues"
+ )
+ raise
+ else:
+ return rval
@property
def recurpattern(self):
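
The patch uses the try/except/else idiom so the original exception is preserved after logging; a generic sketch of the same shape:

def lookup(mapping, key):
    try:
        value = mapping[key]
    except KeyError:
        print("helpful context about which entry looked broken")
        raise            # bare raise re-raises the original KeyError with its traceback intact
    else:
        return value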
|
make 'is_active' of payment modifiers use the identifier
instead of the namespace of the payment_provider (the latter is used
by default nevertheless if there is no identifier set) | @@ -35,7 +35,7 @@ class PaymentModifier(BaseCartModifier):
:returns: ``True`` if this payment modifier is active.
"""
assert hasattr(self, 'payment_provider'), "A Payment Modifier requires a Payment Provider"
- return payment_modifier == self.payment_provider.namespace
+ return payment_modifier == self.identifier
def is_disabled(self, cart):
"""
|
Minor updates to Install documentation
Update Python download link
Move jupyter server extension troubleshooting steps close to
the expected result from "jupyter serverextension list" command. | @@ -16,11 +16,11 @@ limitations under the License.
{% endcomment %}
-->
## Installation
-Elyra can be installed via PyPi:
+Elyra can be installed via PyPI:
### Prerequisites
-* [NodeJS 12+](https://nodejs.org/en/)
-* [Python 3.X](https://www.anaconda.com/distribution/)
+* [Node.js 12+](https://nodejs.org/en/)
+* [Python 3.x](https://www.python.org/downloads/)
##### Optional
* [Anaconda](https://www.anaconda.com/distribution/)
@@ -30,7 +30,7 @@ Elyra can be installed via PyPi:
* [JupyterLab](https://github.com/jupyterlab/jupyterlab) 1.x is supported on **Elyra 0.10.1 and below**
* [JupyterLab](https://github.com/jupyterlab/jupyterlab) 2.x is supported on **Elyra 0.11.0-rc0 and above**
-via PyPi:
+via PyPI:
```bash
pip install elyra && jupyter lab build
```
@@ -57,6 +57,10 @@ config dir: /usr/local/etc/jupyter
nbdime 2.0.0 OK
```
+
+NOTE: If you don't see the elyra server extension enabled, you may need to explicitly enable
+it with `jupyter serverextension enable elyra`
+
```bash
jupyter labextension list
```
@@ -72,5 +76,3 @@ Known labextensions:
@jupyterlab/toc v3.0.0 enabled OK
nbdime-jupyterlab v2.0.0 enabled OK
```
-NOTE: If you don't see the elyra server extension enabled, you may need to explicitly enable
-it with `jupyter serverextension enable elyra`
|
Update README about python to use
We now run python 3.6 in production and there are issues with marshmallow-sqlalchemy using 3.5. | @@ -12,7 +12,7 @@ Contains:
### Python version
-This codebase is Python 3 only. At the moment we run 3.5 in production. You will run into problems if you try to use Python 3.4 or older, or Python 3.7 or newer.
+This codebase is Python 3 only. At the moment we run 3.6 in production. You will run into problems if you try to use Python 3.5 or older, or Python 3.7 or newer.
### AWS credentials
|
Removing invalid imports and URL patterns
urls for upgrade_patch and upgrade_patch_retry were removed | @@ -29,7 +29,6 @@ from logical.views import database_details, database_hosts, \
database_credentials, database_resizes, database_backup, database_dns, \
database_metrics, database_destroy, database_delete_host, \
database_upgrade, database_upgrade_retry, database_resize_retry, \
- database_upgrade_patch, database_upgrade_patch_retry, \
database_resize_rollback, database_make_backup, \
database_change_parameters, database_change_parameters_retry, \
database_switch_write, database_reinstall_vm, database_reinstall_vm_retry,\
@@ -550,18 +549,6 @@ class DatabaseAdmin(admin.DjangoServicesAdmin):
name="upgrade_retry"
),
- url(
- r'^/?(?P<id>\d+)/upgrade_patch/$',
- self.admin_site.admin_view(database_upgrade_patch),
- name="upgrade_patch"
- ),
-
- url(
- r'^/?(?P<id>\d+)/upgrade_patch_retry/$',
- self.admin_site.admin_view(database_upgrade_patch_retry),
- name="upgrade_patch_retry"
- ),
-
url(
r'^/?(?P<id>\d+)/resize_retry/$',
self.admin_site.admin_view(database_resize_retry),
|
Improve multiple create server tests
multiple create server tests just verify the
reservation id in response.
Let's check whether the requested number of servers
got created or not. | # under the License.
from tempest.api.compute import base
+from tempest.common import compute
from tempest.lib import decorators
@@ -21,13 +22,16 @@ class MultipleCreateTestJSON(base.BaseV2ComputeTest):
@decorators.idempotent_id('61e03386-89c3-449c-9bb1-a06f423fd9d1')
def test_multiple_create(self):
- body = self.create_test_server(wait_until='ACTIVE',
- min_count=1,
- max_count=2)
+ body, servers = compute.create_test_server(self.os,
+ wait_until='ACTIVE',
+ min_count=2)
+ for server in servers:
+ self.addCleanup(self.servers_client.delete_server, server['id'])
# NOTE(maurosr): do status response check and also make sure that
# reservation_id is not in the response body when the request send
# contains return_reservation_id=False
self.assertNotIn('reservation_id', body)
+ self.assertEqual(2, len(servers))
@decorators.idempotent_id('864777fb-2f1e-44e3-b5b9-3eb6fa84f2f7')
def test_multiple_create_with_reservation_return(self):
|
Avoid executing the cell that starts TensorBoard for docs generation.
Why?
TensorBoard integration is not yet supported in TF docs generation. | },
"outputs": [],
"source": [
+ "#docs_infra: no_execute\n",
+ "\n",
"# Get the URI of the output artifact representing the training logs,\n",
"# which is a directory\n",
"model_dir = train_uri\n",
|
Update skfda/misc/metrics.py
Fixing doctest | @@ -95,7 +95,7 @@ def vectorial_norm(fdatagrid, p=2):
Examples:
>>> from skfda.datasets import make_multimodal_samples
- >>> from skfda.preprocessing.dim_reduction import vectorial_norm
+ >>> from skfda.misc.metrics import vectorial_norm
First we will construct an example dataset with curves in
:math:`\mathbb{R}^2`.
|
ENH: add full_output to f2py.compile
fixes by providing a straightforward way to return the stdout/stderr
from compiling the FORTRAN module to the caller | @@ -21,7 +21,8 @@ def compile(source,
extra_args='',
verbose=True,
source_fn=None,
- extension='.f'
+ extension='.f',
+ full_output=False
):
"""
Build extension module from a Fortran 77 source string with f2py.
@@ -55,10 +56,19 @@ def compile(source,
.. versionadded:: 1.11.0
+ full_output : bool, optional
+ If True, return a `subprocess.CompletedProcess` containing
+ the stdout and stderr of the compile process, instead of just
+ the status code.
+
+ .. versionadded:: 1.20.0
+
+
Returns
-------
- result : int
- 0 on success
+ result : int or `subprocess.CompletedProcess`
+ 0 on success, or a `subprocess.CompletedProcess` if
+ ``full_output=True``
Examples
--------
@@ -95,23 +105,21 @@ def compile(source,
'-c',
'import numpy.f2py as f2py2e;f2py2e.main()'] + args
try:
- output = subprocess.check_output(c)
- except subprocess.CalledProcessError as exc:
- status = exc.returncode
- output = ''
+ cp = subprocess.run(c, stdout=subprocess.PIPE,
+ stderr=subprocess.PIPE)
except OSError:
# preserve historic status code used by exec_command()
- status = 127
- output = ''
- else:
- status = 0
- output = output.decode()
+ cp = subprocess.CompletedProcess(c, 127, stdout='', stderr='')
if verbose:
- print(output)
+ print(cp.stdout.decode())
finally:
if source_fn is None:
os.remove(fname)
- return status
+
+ if full_output:
+ return cp
+ else:
+ return cp.returncode
from numpy._pytesttester import PytestTester
test = PytestTester(__name__)
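
A hedged usage sketch of the new flag (the Fortran snippet and module name are made up, and a working Fortran compiler is assumed):

import numpy.f2py

fsource = '''
      subroutine add(a, b, c)
      integer a, b, c
      c = a + b
      end
'''
result = numpy.f2py.compile(fsource, modulename='add_mod', full_output=True)
print(result.returncode)        # 0 on success
print(result.stdout.decode())   # full compiler output, handy when the build fails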
|
Removed the self.sig_close_split.emit() logic
Replaced the signal by a slot that explicitly closes the editorstack if
it is closable. | @@ -463,7 +463,6 @@ class EditorStack(QWidget):
edit_goto = Signal(str, int, str)
sig_split_vertically = Signal()
sig_split_horizontally = Signal()
- sig_close_split = Signal()
sig_new_file = Signal((str,), ())
sig_save_as = Signal()
sig_prev_edit_pos = Signal()
@@ -736,7 +735,7 @@ def create_shortcuts(self):
context="Editor",
name="split horizontally",
parent=self)
- close_split = config_shortcut(lambda: self.sig_close_split.emit(),
+ close_split = config_shortcut(self.close_split,
context="Editor",
name="close split panel",
parent=self)
@@ -819,7 +818,14 @@ def setup_editorstack(self, parent, layout):
def add_corner_widgets_to_tabbar(self, widgets):
self.tabs.add_corner_widgets(widgets)
+ @Slot()
+ def close_split(self):
+ """Closes the editorstack if it is not the last one opened."""
+ if self.is_closable:
+ self.close()
+
def closeEvent(self, event):
+ """Overrides QWidget closeEvent()."""
self.threadmanager.close_all_threads()
self.analysis_timer.timeout.disconnect(self.analyze_script)
@@ -1380,7 +1386,7 @@ def __get_split_actions(self):
context=Qt.WidgetShortcut)
self.close_action = create_action(self, _("Close this panel"),
icon=ima.icon('close_panel'),
- triggered=lambda: self.sig_close_split.emit(),
+ triggered=self.close_split,
shortcut=get_shortcut(context='Editor', name='close split panel'),
context=Qt.WidgetShortcut)
actions = [MENU_SEPARATOR, self.undock_action,
@@ -2564,7 +2570,6 @@ def __init__(self, parent, plugin, menu_actions, first=False,
lambda: self.split(orientation=Qt.Vertical))
self.editorstack.sig_split_horizontally.connect(
lambda: self.split(orientation=Qt.Horizontal))
- self.editorstack.sig_close_split.connect(lambda: self.close())
self.addWidget(self.editorstack)
def closeEvent(self, event):
|
Intents: initial setup
For now, we require the privileged 'Guild Members' intent, to maintain
all current functionality (e.g. Member converters working with IDs).
In the future, we may look into disabling this intent. | @@ -204,8 +204,18 @@ class SeasonalBot(commands.Bot):
_allowed_roles = [discord.Object(id_) for id_ in MODERATION_ROLES]
+
+_intents = discord.Intents().all()
+_intents.bans = False
+_intents.integrations = False
+_intents.invites = False
+_intents.presences = False
+_intents.typing = False
+_intents.webhooks = False
+
bot = SeasonalBot(
command_prefix=Client.prefix,
activity=discord.Game(name=f"Commands: {Client.prefix}help"),
allowed_mentions=discord.AllowedMentions(everyone=False, roles=_allowed_roles),
+ intents=_intents,
)
|
Textual correction on TLS Authentication
Correct wording on the TLS Authentication section of the configure.rst page. | @@ -238,7 +238,7 @@ TLS Authentication
------------------
Ray can be configured to use TLS on it's gRPC channels.
-This has means that connecting to the Ray client on the head node will
+This means that connecting to the Ray client on the head node will
require an appropriate set of credentials and also that data exchanged between
various processes (client, head, workers) will be encrypted.
|
opt_code_ada.mako: refactor references to the subparser
TN: | -- Start opt_code
-${parser.parser.generate_code()}
-
<%
-parser_type = parser.parser.type
+subparser = parser.parser
+
+parser_type = subparser.type
if parser._booleanize:
base = parser.booleanized_type
if not base.is_bool_type:
alt_true, alt_false = base._alternatives
%>
-if ${parser.parser.pos_var} = No_Token_Index then
+${subparser.generate_code()}
+
+if ${subparser.pos_var} = No_Token_Index then
% if parser._booleanize:
% if base.is_bool_type:
${parser.res_var} := False;
@@ -26,18 +28,18 @@ if ${parser.parser.pos_var} = No_Token_Index then
${parser.res_var}.Self_Env := AST_Envs.Empty_Env;
% endif
% elif parser_type and parser_type.is_list_type:
- ${parser.parser.res_var} :=
+ ${subparser.res_var} :=
(${parser_type.storage_type_name}_Alloc.Alloc (Parser.Mem_Pool));
${parser.res_var}.Kind := ${parser_type.ada_kind_name};
- ${parser.parser.res_var}.Unit := Parser.Unit;
- ${parser.parser.res_var}.Count := 0;
- ${parser.parser.res_var}.Nodes :=
+ ${subparser.res_var}.Unit := Parser.Unit;
+ ${subparser.res_var}.Count := 0;
+ ${subparser.res_var}.Nodes :=
Alloc_AST_List_Array.Alloc (Parser.Mem_Pool, 0);
- ${parser.parser.res_var}.Token_Start_Index := ${parser.start_pos} - 1;
- ${parser.parser.res_var}.Token_End_Index := No_Token_Index;
+ ${subparser.res_var}.Token_Start_Index := ${parser.start_pos} - 1;
+ ${subparser.res_var}.Token_End_Index := No_Token_Index;
${parser.res_var}.Self_Env := AST_Envs.Empty_Env;
% elif parser_type:
- ${parser.parser.res_var} :=
+ ${subparser.res_var} :=
${parser_type.storage_nullexpr};
% endif
@@ -46,10 +48,10 @@ if ${parser.parser.pos_var} = No_Token_Index then
## succeeded.
Append (Parser.Diagnostics,
Get_Token (Parser.TDH.all, ${parser.start_pos}).Sloc_Range,
- To_Text ("Missing '${parser.parser.error_repr}'"));
+ To_Text ("Missing '${subparser.error_repr}'"));
% endif
- ${parser.parser.pos_var} := ${parser.start_pos};
+ ${subparser.pos_var} := ${parser.start_pos};
% if parser._booleanize:
else
|
renamed parallel_measurement to parallel_meas for consistency reasons
implemented WAT wait_time method | @@ -918,7 +918,7 @@ class AgilentB1500(Instrument):
######################################
@property
- def parallel_measurement(self):
+ def parallel_meas(self):
""" Enable/Disable parallel measurements.
Effective for SMUs using HSADC and measurement modes 1,2,10,18. (``PAD``)
"""
@@ -926,8 +926,8 @@ class AgilentB1500(Instrument):
response = bool(int(response))
return response
- @parallel_measurement.setter
- def parallel_measurement(self, setting):
+ @parallel_meas.setter
+ def parallel_meas(self, setting):
setting = int(setting)
self.write('PAD %d' % setting)
self.check_errors()
@@ -1045,6 +1045,21 @@ class AgilentB1500(Instrument):
def query_time_stamp_setting(self):
return self.query_learn_header(60)
+ def wait_time(self, wait_type, N, offset=0):
+ """Configure wait time. (``WAT``)
+
+ :param wait_type: Wait time type
+ :type wait_type: :class:`.WaitTimeType`
+ :param N: Coefficient for initial wait time, default: 1
+ :type N: float
+ :param offset: Offset for wait time, defaults to 0
+ :type offset: int, optional
+ """
+ wait_type = WaitTimeType.get(wait_type).value
+ self.write('WAT %d, %f, %d' % (wait_type, N, offset))
+ self.check_errors()
+
+
######################################
# Sweep Setup
######################################
@@ -1974,3 +1989,9 @@ class StaircaseSweepPostOutput(CustomIntEnum):
class CompliancePolarity(CustomIntEnum):
AUTO = 0
MANUAL = 1
+
+class WaitTimeType(CustomIntEnum):
+ """Wait time type"""
+ SMU_SOURCE = 1 #:
+ SMU_MEASUREMENT = 2 #:
+ CMU_MEASUREMENT = 3 #:
|
Use IPv6 addresses with brackets also when not using hostnames
Fixes using an IPv6 address for the master within the minion configuration | @@ -1745,7 +1745,7 @@ def dns_check(addr, port, safe=False, ipv6=None):
for h in hostnames:
# It's an IP address, just return it
if h[4][0] == addr:
- resolved = addr
+ resolved = salt.utils.zeromq.ip_bracket(addr)
break
if h[0] == socket.AF_INET and ipv6 is True:
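
For illustration, a hypothetical helper with behaviour similar to salt.utils.zeromq.ip_bracket (this sketch is not salt's implementation): wrap IPv6 literals in brackets so they can later be joined with a port without ambiguity.

import ipaddress

def ip_bracket(addr):
    # Hypothetical stand-in: return IPv6 literals as "[addr]" so "addr:port" stays unambiguous.
    try:
        if isinstance(ipaddress.ip_address(addr), ipaddress.IPv6Address):
            return "[" + addr + "]"
    except ValueError:
        pass          # not a bare IP literal; leave it untouched
    return addr

print(ip_bracket("2001:db8::1"))   # [2001:db8::1]
print(ip_bracket("192.0.2.1"))     # 192.0.2.1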
|
Update setup.py
Update setup.py to find pip dependencies for pip >= 10 | @@ -24,8 +24,12 @@ import time
from setuptools import find_packages
from setuptools import setup
-from pip.req import parse_requirements
+try: # for pip >= 10
+ from pip._internal.download import PipSession
+ from pip._internal.req import parse_requirements
+except ImportError: # for pip <= 9.0.3
from pip.download import PipSession
+ from pip.req import parse_requirements
timesketch_version = u'20170721'
|
Bugfix accept bytes as cookie value argument
Werkzeug accepted either and dealt with it accordingly; now Quart does the
same. | @@ -550,10 +550,10 @@ class Response(_BaseRequestResponse, JSONMixin):
if self.automatically_set_content_length:
self.headers['Content-Length'] = str(len(bytes_data))
- def set_cookie(
+ def set_cookie( # type: ignore
self,
key: str,
- value: str='',
+ value: AnyStr='',
max_age: Optional[Union[int, timedelta]]=None,
expires: Optional[datetime]=None,
path: str='/',
@@ -566,7 +566,9 @@ class Response(_BaseRequestResponse, JSONMixin):
The arguments are the standard cookie morsels and this is a
wrapper around the stdlib SimpleCookie code.
"""
- cookie = create_cookie(key, value, max_age, expires, path, domain, secure, httponly)
+ if isinstance(value, bytes):
+ value = value.decode() # type: ignore
+ cookie = create_cookie(key, value, max_age, expires, path, domain, secure, httponly) # type: ignore # noqa: E501
self.headers.add('Set-Cookie', cookie.output(header=''))
def delete_cookie(self, key: str, path: str='/', domain: Optional[str]=None) -> None:
|
use requests
brings uniformity
closes | # !!! This uses the https://newsapi.org/ api. TO comply with the TOU
# !!! we must link back to this site whenever we display results.
import json
+import requests
import webbrowser
-from six import PY3
from colorama import Fore
from plugin import plugin, require
-if PY3:
- import urllib.request
-else:
- import urllib
@require(network=True)
@@ -225,26 +221,18 @@ class News:
def _get(self, jarvis, url):
"""fetch a webpage"""
- try:
- if PY3:
- response = urllib.request.urlopen(url)
+ response = requests.get(url)
+ if response.status_code == requests.codes.ok:
+ data = json.loads(response.text)
+ return data
else:
- response = urllib.urlopen(url)
- except urllib.error.HTTPError as err:
- # Catch invalid key(Unauthorized) error
- if err.code == 401:
+ if response.status_code == 401:
jarvis.say("API key not valid", Fore.RED)
- return None
- # Catch some other errors
else:
jarvis.say("An error occured: Error code: "
- + str(err.code), Fore.RED)
+ + response.raise_for_status(), Fore.RED)
return None
- # Load json
- data = json.loads(response.read().decode('utf-8'))
- return data
-
def parse_articles(self, data, jarvis):
article_list = {}
index = 1
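
A compact, generic sketch of the requests-based flow (independent of the plugin's jarvis object; the messages here are illustrative only):

import requests

def fetch_json(url):
    # Return parsed JSON on success; report 401 and other HTTP errors readably.
    response = requests.get(url, timeout=10)
    if response.status_code == requests.codes.ok:
        return response.json()
    if response.status_code == 401:
        print("API key not valid")
    else:
        print("An error occurred: status code " + str(response.status_code))
    return None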
|
Update show_nbar.py
Updated to remove pdb imports and changed keys to lower case as per request on naming standards | @@ -13,14 +13,14 @@ class ShowIpNbarDiscoverySchema(MetaParser):
Any(): {
'protocol': {
Any(): {
- 'IN Packet Count': int,
- 'OUT Packet Count': int,
- 'IN Byte Count': int,
- 'OUT Byte Count': int,
- 'IN 5min Bit Rate (bps)': int,
- 'OUT 5min Bit Rate (bps)': int,
- 'IN 5min Max Bit Rate (bps)': int,
- 'OUT 5min Max Bit Rate (bps)': int,
+ 'in_packet_count': int,
+ 'out_packet_count': int,
+ 'in_byte_count': int,
+ 'out_byte_count': int,
+ 'in_5min_bit_rate_bps': int,
+ 'out_5min_bit_rate_bps': int,
+ 'in_5min_max_bit_rate_bps': int,
+ 'out_5min_max_bit_rate_bps': int,
}
}
}
@@ -61,10 +61,8 @@ class ShowIpNbarDiscovery(ShowIpNbarDiscoverySchema):
for line in out.splitlines():
- #import pdb; pdb.set_trace()
if line:
line = line.strip()
- # print(line)
else:
continue
@@ -83,16 +81,16 @@ class ShowIpNbarDiscovery(ShowIpNbarDiscoverySchema):
group = m.groupdict()
protocol=group['protocol']
result_dict[interface]['protocol'][protocol]={}
- result_dict[interface]['protocol'][protocol].update({'IN Packet Count': int(group['In_Packet_Count'])})
- result_dict[interface]['protocol'][protocol].update({'OUT Packet Count': int(group['Out_Packet_Count'])})
+ result_dict[interface]['protocol'][protocol].update({'in_packet_count': int(group['In_Packet_Count'])})
+ result_dict[interface]['protocol'][protocol].update({'out_packet_count': int(group['Out_Packet_Count'])})
continue
m = p3.match(line)
if m:
group = m.groupdict()
- result_dict[interface]['protocol'][protocol].update({'IN Byte Count': int(group['In_Byte_Count'])})
- result_dict[interface]['protocol'][protocol].update({'OUT Byte Count': int(group['Out_Byte_Count'])})
+ result_dict[interface]['protocol'][protocol].update({'in_byte_count': int(group['In_Byte_Count'])})
+ result_dict[interface]['protocol'][protocol].update({'out_byte_count': int(group['Out_Byte_Count'])})
@@ -101,8 +99,8 @@ class ShowIpNbarDiscovery(ShowIpNbarDiscoverySchema):
if m:
group = m.groupdict()
- result_dict[interface]['protocol'][protocol].update({'IN 5min Bit Rate (bps)': int(group['In_Bitrate'])})
- result_dict[interface]['protocol'][protocol].update({'OUT 5min Bit Rate (bps)': int(group['Out_Bitrate'])})
+ result_dict[interface]['protocol'][protocol].update({'in_5min_bit_rate_bps': int(group['In_Bitrate'])})
+ result_dict[interface]['protocol'][protocol].update({'out_5min_bit_rate_bps': int(group['Out_Bitrate'])})
@@ -110,8 +108,8 @@ class ShowIpNbarDiscovery(ShowIpNbarDiscoverySchema):
if m:
group = m.groupdict()
- result_dict[interface]['protocol'][protocol].update({'IN 5min Max Bit Rate (bps)': int(group['In_Bitrate_Max'])})
- result_dict[interface]['protocol'][protocol].update({'OUT 5min Max Bit Rate (bps)': int(group['Out_Bitrate_Max'])})
+ result_dict[interface]['protocol'][protocol].update({'in_5min_max_bit_rate_bps': int(group['In_Bitrate_Max'])})
+ result_dict[interface]['protocol'][protocol].update({'out_5min_max_bit_rate_bps': int(group['Out_Bitrate_Max'])})
|
BM - Add encoding as parameter for reading data files
Add encoding as parameter for HIReader class, defaulted to latin-1. The DataReader class gets the encoding value for a specific manifest row and uses that value in its init. | @@ -26,18 +26,19 @@ class HIReader(object):
File can be local (path_type="file") or remote (path_type="s3"). Note, local files are preferred
when possible for faster processing time and lower bandwidth usage.
"""
- def __init__(self, path, path_type="file"):
+ def __init__(self, path, path_type="file", encoding="latin-1"):
self.path = path
self._length = None
self._keys = None
self.path_type = path_type
+ self.encoding = encoding
def __iter__(self):
self._length = 0
self._counter = Counter()
if self.path_type == "file":
- with open(self.path, 'r', newline='', encoding='latin-1') as data:
+ with open(self.path, 'r', newline='', encoding=self.encoding) as data:
reader = DictReader(data)
self._keys = reader.fieldnames
for row in reader:
@@ -155,6 +156,7 @@ class DataReader(HIReader):
self.manifest_row = manifest_row #a dictionary from the manifest
self.destination_table = manifest_row['destination_table']
+ self.encoding = manifest_row.get('encoding', 'latin-1') # Defaults to latin-1 in case key not present in manifest.
self.load_from=load_from
self.s3_path = os.path.join(manifest_row['s3_folder'], manifest_row['filepath'].strip("\/")).replace("\\","/")
@@ -170,7 +172,7 @@ class DataReader(HIReader):
self.not_found = [] #Used to log missing fields compared to meta data
- super().__init__(self.path, self.path_type)
+ super().__init__(self.path, self.path_type, self.encoding)
def validate_or_create_path(self):
root_path = os.path.abspath(os.path.dirname(self.path))
|
refactor(cli): add back --version and remove subcommand required constraint
these changes will apply when we move to 2.0 | @@ -20,10 +20,16 @@ data = {
"arguments": [
{"name": "--debug", "action": "store_true", "help": "use debug mode"},
{"name": ["-n", "--name"], "help": "use the given commitizen"},
+ {
+ "name": ["--version"],
+ "action": "store_true",
+ "help": "get the version of the installed commitizen",
+ },
],
"subcommands": {
"title": "commands",
- "required": True,
+ # TODO: Add this constraint back in 2.0
+ # "required": True,
"commands": [
{
"name": "ls",
@@ -123,6 +129,13 @@ def main():
if args.name:
conf.update({"name": args.name})
+ if args.version:
+ warnings.warn(
+ "'cz --version' will be deprecated in next major version. "
+ "Please use 'cz version' command from your scripts"
+ )
+ logging.getLogger("commitizen").setLevel(logging.DEBUG)
+
if args.debug:
warnings.warn(
"Debug will be deprecated in next major version. "
@@ -130,4 +143,8 @@ def main():
)
logging.getLogger("commitizen").setLevel(logging.DEBUG)
+ # TODO: This try block can be removed after command is required in 2.0
+ try:
args.func(conf, vars(args))()
+ except AttributeError:
+ out.error("Command is required")
|
Update elf_ransomware.txt
Minus socks-proxy address, not an IoC. | @@ -27,7 +27,68 @@ sg3dwqfpnr4sl5hh.onion
y7mfrrjkzql32nwcmgzwp3zxaqktqywrwvzfni4hm4sebtpw5kuhjzqd.onion
# Reference: https://twitter.com/joakimkennedy/status/1268243062611984384
+# Reference: https://unit42.paloaltonetworks.com/ech0raix-ransomware-soho/
# Reference: https://www.virustotal.com/gui/file/88a73f1c1e5a7c921f61638d06f3fed7389e1b163da7a1cc62a666d0a88baf47/detection
-176.122.23.54:9100
veqlxhq7ub5qze3qy56zx2cig2e6tzsgxdspkubwbayqije6oatma6id.onion
+/crp_linux_arc
+/crp_linux_arcle-hs38
+/crp_linux_arm
+/crp_linux_arm4
+/crp_linux_arm4l
+/crp_linux_arm4t
+/crp_linux_arm4tl
+/crp_linux_arm4tll
+/crp_linux_arm5
+/crp_linux_arm5l
+/crp_linux_arm5n
+/crp_linux_arm6
+/crp_linux_arm64
+/crp_linux_arm6l
+/crp_linux_arm7
+/crp_linux_arm7l
+/crp_linux_arm8
+/crp_linux_armv4
+/crp_linux_armv4l
+/crp_linux_armv5l
+/crp_linux_armv6
+/crp_linux_armv61
+/crp_linux_armv6l
+/crp_linux_armv7l
+/crp_linux_dbg
+/crp_linux_exploit
+/crp_linux_i4
+/crp_linux_i486
+/crp_linux_i586
+/crp_linux_i6
+/crp_linux_i686
+/crp_linux_kill
+/crp_linux_m68
+/crp_linux_m68k
+/crp_linux_mips
+/crp_linux_mips64
+/crp_linux_mipseb
+/crp_linux_mipsel
+/crp_linux_mpsl
+/crp_linux_pcc
+/crp_linux_powerpc
+/crp_linux_powerpc-440fp
+/crp_linux_powerppc
+/crp_linux_ppc
+/crp_linux_pp-c
+/crp_linux_ppc2
+/crp_linux_ppc440
+/crp_linux_ppc440fp
+/crp_linux_root
+/crp_linux_root32
+/crp_linux_sh
+/crp_linux_sh4
+/crp_linux_sparc
+/crp_linux_spc
+/crp_linux_ssh4
+/crp_linux_x32
+/crp_linux_x32_64
+/crp_linux_x64
+/crp_linux_x86
+/crp_linux_x86_32
+/crp_linux_x86_64
|
Better addon handling in Kubeflow addon
As a workaround, this specifically waits for enabled addons to
finish setting up before proceeding to bootstrapping Juju.
Also runs black formatter over kubeflow enable script and fixes lints | @@ -41,12 +41,12 @@ def run(*args, die=True, debug=False, stdout=True):
else:
raise
- result_stdout = result.stdout.decode('utf-8')
+ result_stdout = result.stdout.decode("utf-8")
if debug and stdout:
print(result_stdout)
if result.stderr:
- print(result.stderr.decode('utf-8'))
+ print(result.stderr.decode("utf-8"))
return result_stdout
@@ -59,9 +59,9 @@ def get_random_pass():
def juju(*args, **kwargs):
if strtobool(os.environ.get("KUBEFLOW_DEBUG") or "false"):
- return run('microk8s-juju.wrapper', "--debug", *args, debug=True, **kwargs)
+ return run("microk8s-juju.wrapper", "--debug", *args, debug=True, **kwargs)
else:
- return run('microk8s-juju.wrapper', *args, **kwargs)
+ return run("microk8s-juju.wrapper", *args, **kwargs)
def get_hostname():
@@ -79,7 +79,7 @@ def get_hostname():
die=False,
)
return json.loads(output)["spec"]["rules"][0]["host"]
- except (KeyError, subprocess.CalledProcessError) as err:
+ except (KeyError, subprocess.CalledProcessError):
pass
# Otherwise, see if we've set up metallb with a custom service
@@ -95,7 +95,7 @@ def get_hostname():
)
pub_ip = json.loads(output)["status"]["loadBalancer"]["ingress"][0]["ip"]
return "%s.xip.io" % pub_ip
- except (KeyError, subprocess.CalledProcessError) as err:
+ except (KeyError, subprocess.CalledProcessError):
pass
# If all else fails, just use localhost
@@ -107,6 +107,7 @@ def main():
channel = os.environ.get("KUBEFLOW_CHANNEL") or "stable"
no_proxy = os.environ.get("KUBEFLOW_NO_PROXY") or None
hostname = os.environ.get("KUBEFLOW_HOSTNAME") or None
+ debug = strtobool(os.environ.get("KUBEFLOW_DEBUG") or "false")
password_overlay = {
"applications": {
@@ -129,9 +130,21 @@ def main():
"metallb:10.64.140.43-10.64.140.49",
]:
print("Enabling %s..." % service)
- run("microk8s-enable.wrapper", service)
+ run("microk8s-enable.wrapper", service, debug=debug)
+
+ run("microk8s-status.wrapper", "--wait-ready", debug=debug)
- run("microk8s-status.wrapper", '--wait-ready')
+ print("Waiting for DNS and storage plugins to finish setting up")
+ run(
+ "microk8s-kubectl.wrapper",
+ "wait",
+ "--for=condition=available",
+ "-nkube-system",
+ "deployment/coredns",
+ "deployment/hostpath-provisioner",
+ "--timeout=10m",
+ debug=debug,
+ )
try:
juju("show-controller", "uk8s", die=False, stdout=False)
@@ -237,6 +250,7 @@ def main():
"pod",
"--timeout=-1s",
"--all",
+ debug=debug,
)
hostname = hostname or get_hostname()
|
re.search in function _regex_to_static() should use re.MULTILINE, otherwise it causes a bug
modified: file.py | @@ -1555,7 +1555,7 @@ def _regex_to_static(src, regex):
return None
try:
- src = re.search(regex, src)
+ src = re.search(regex, src, re.M)
except Exception as ex:
raise CommandExecutionError("{0}: '{1}'".format(_get_error_message(ex), regex))
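
A minimal sketch (separate from the patch) of why the MULTILINE flag matters once the managed file has more than one line: without it, ^ only anchors at the start of the whole string.

import re

contents = "foo=1\nbar=2\n"
print(re.search(r"^bar=2$", contents))         # None: ^ does not match at line starts
print(re.search(r"^bar=2$", contents, re.M))   # a match object: ^ and $ now anchor per line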
|
TST: Fixup string cast test to not use `tiny`
There is not much value in these values anyway probably, but tiny
isn't reliable for double-double (maybe it should be, but that is
a different issue).
Fixup for and | @@ -90,8 +90,8 @@ def test_string_comparisons_empty(op, ufunc, sym, dtypes):
def test_float_to_string_cast(str_dt, float_dt):
float_dt = np.dtype(float_dt)
fi = np.finfo(float_dt)
- arr = np.array([np.nan, np.inf, -np.inf, fi.max, fi.tiny], dtype=float_dt)
- expected = ["nan", "inf", "-inf", repr(fi.max), repr(fi.tiny)]
+ arr = np.array([np.nan, np.inf, -np.inf, fi.max, fi.min], dtype=float_dt)
+ expected = ["nan", "inf", "-inf", repr(fi.max), repr(fi.min)]
if float_dt.kind == 'c':
expected = [f"({r}+0j)" for r in expected]
|
Check CERT_MANAGER_API if True or False
Follow-up on "Change 529818" to check variable value "True" or "False".
Related-bug: | @@ -57,7 +57,7 @@ if [ -n "$TRUST_ID" ]; then
KUBE_CONTROLLER_MANAGER_ARGS="$KUBE_CONTROLLER_MANAGER_ARGS --cloud-config=/etc/kubernetes/kube_openstack_config --cloud-provider=openstack"
fi
-if [ -n "$CERT_MANAGER_API" ]; then
+if [ "$(echo $CERT_MANAGER_API | tr '[:upper:]' '[:lower:]')" = "true" ]; then
KUBE_CONTROLLER_MANAGER_ARGS="$KUBE_CONTROLLER_MANAGER_ARGS --cluster-signing-cert-file=$CERT_DIR/ca.crt --cluster-signing-key-file=$CERT_DIR/ca.key"
fi
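
The same case-insensitive boolean check, expressed as a small Python sketch for illustration (mirroring the shell tr '[:upper:]' '[:lower:]' comparison):

import os

def env_flag(name, default="false"):
    # Treat the variable as true only when its lowercased value is exactly "true".
    return os.environ.get(name, default).strip().lower() == "true"

os.environ["CERT_MANAGER_API"] = "True"
print(env_flag("CERT_MANAGER_API"))          # True
os.environ["CERT_MANAGER_API"] = "anything-else"
print(env_flag("CERT_MANAGER_API"))          # False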
|
Use full spacy pipeline for dependency parse needed for sentence
tokenization | @@ -843,7 +843,7 @@ class RnnEntityGuesser(AbstractGuesser):
guesser.learning_rate = params['learning_rate']
guesser.max_grad_norm = params['max_grad_norm']
guesser.model = torch.load(os.path.join(directory, 'rnn_entity.pt'))
- guesser.nlp = spacy.load('en', create_pipeline=custom_spacy_pipeline)
+ guesser.nlp = spacy.load('en')
guesser.features = params['features']
guesser.rel_position_vocab = params['rel_position_vocab']
guesser.rel_position_lookup = params['rel_position_lookup']
|
Removed the sorting of the input files, based on code review
David was right in the code review - the user should be able to control
the ordering. I also improved the task documentation a bit. | @@ -16,7 +16,7 @@ from robot.libraries.BuiltIn import RobotNotRunningError
class RobotLibDoc(BaseTask):
task_options = {
"path": {
- "description": "The path to the robot library to be documented. Can be a python file or a .robot file.",
+ "description": "The path to the robot library to be documented. Can be single a python file or a .robot file, or a comma separated list of those files. The order of the files will be preserved in the generated documentation.",
"required": True,
},
"output": {
@@ -54,7 +54,7 @@ class RobotLibDoc(BaseTask):
def _run_task(self):
libraries = []
processed_files = {}
- for input_file in sorted(self.options["path"]):
+ for input_file in self.options["path"]:
try:
libdoc = DocumentationBuilder(input_file).build(input_file)
libraries.append(libdoc)
|
Update Recursion.md
Fixed the factorial function so that factorial(0) returns 1 (as it should) | @@ -39,7 +39,7 @@ For example, this function will perform multiplication by recursively adding :
Exercise
--------
-Define a new function called `factorial()` that will compute the factorial by recursive multiplication (5! = 5 x 4 x 3 x 2 x 1).
+Define a new function called `factorial()` that will compute the factorial by recursive multiplication (5! = 5 x 4 x 3 x 2 x 1). Note that by convention, the factorial of 0 is equal to 1 (0! = 1).
Tutorial Code
-------------
@@ -48,6 +48,7 @@ Tutorial Code
int main() {
/* testing code */
+ printf("0! = %i\n", factorial(0));
printf("1! = %i\n", factorial(1));
printf("3! = %i\n", factorial(3));
printf("5! = %i\n", factorial(5));
@@ -58,6 +59,7 @@ Tutorial Code
Expected Output
---------------
+ 0! = 1
1! = 1
3! = 6
5! = 120
@@ -71,15 +73,17 @@ Solution
int main() {
/* testing code */
+ printf("0! = %i\n", factorial(0));
printf("1! = %i\n", factorial(1));
printf("3! = %i\n", factorial(3));
printf("5! = %i\n", factorial(5));
}
int factorial(int number) {
- int f = number;
if (number > 1) {
- f *= factorial(number-1);
+ return number * factorial(number-1);
+ }
+ else {
+ return 1;
}
- return f;
}
|
utils/doc: Add support for dicts to format literal
Now supports cleaner outputting of Python dicts | @@ -263,6 +263,9 @@ def format_literal(lit):
return '``\'{}\'``'.format(lit)
elif hasattr(lit, 'pattern'): # regex
return '``r\'{}\'``'.format(lit.pattern)
+ elif isinstance(lit, dict):
+ content = indent(',\n'.join("{}: {}".format(key,val) for (key,val) in lit.iteritems()))
+ return '::\n\n{}'.format(indent('{{\n{}\n}}'.format(content)))
else:
return '``{}``'.format(lit)
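
The added branch relies on Python 2's dict.iteritems(); a hedged Python 3 sketch of the same formatting idea (the project's indent helper is assumed, so a local one is defined here):

def _indent(text, prefix="    "):
    return "\n".join(prefix + line for line in text.splitlines())

def format_dict_literal(d):
    # Render a dict as an indented reST literal block, one "key: value" pair per line.
    content = _indent(",\n".join("{}: {}".format(k, v) for k, v in d.items()))
    return "::\n\n{}".format(_indent("{{\n{}\n}}".format(content)))

print(format_dict_literal({"a": 1, "b": 2}))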
|
fix direction of NZ exchange arrow
North Island is generally to the east of South Island.
Thus an arrow for transfer from North to South must point west :) | @@ -392,6 +392,6 @@ exports.addExchangesConfiguration = function(exchanges) {
}
exchanges['NZ-NZN->NZ-NZS'] = {
lonlat: [174.424066, -41.140732],
- rotation: 90
+ rotation: -90
};
}
|
Minor Changes Made
Code changed to check if a directory exists under the given name; if the directory exists, output: "The directory exists". No directory will be created. | # Description : Tests to see if the directory testdir exists, if not it will create the directory for you
-import os # Import the OS module
-DirCheck = raw_input("Please enter directory name to check : ")
+import os #Import the OS Module
+CheckDir = raw_input("Enter the name of the directory to check : ")
print
-print "There was no directory under the name " +DirCheck
+if os.path.exists(CheckDir):#Checks if the dir exists
+ print "The directory exists"
+else:
+ print "No directory found for "+CheckDir #Output if no directory
print
-print "So, a new directory under the name " +DirCheck + " has been created!"
-if not os.path.exists(DirCheck): # Check to see if it exists
- os.makedirs(DirCheck) # Create the directory
+ os.makedirs(CheckDir)#Creates a new dir for the given name
+ print "Directory created for "+CheckDir
|
Clear caches on DynamicRateDefinition deletion for completeness
and to help with tests | @@ -10,6 +10,13 @@ class DynamicRateDefinition(models.Model):
per_second = models.FloatField(default=None, blank=True, null=True)
def save(self, *args, **kwargs):
+ self._clear_caches()
+ super().save(*args, **kwargs)
+
+ def delete(self, *args, **kwargs):
+ self._clear_caches()
+ super().delete(*args, **kwargs)
+
+ def _clear_caches(self):
from corehq.project_limits.rate_limiter import get_dynamic_rate_definition
get_dynamic_rate_definition.clear(self.key, {})
- super().save(*args, **kwargs)
|
Update setup.py
add missing Python 3.7 tag | @@ -56,5 +56,6 @@ setup(
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
+ 'Programming Language :: Python :: 3.7',
]
)
|
Predicate: fix the generated debug image
TN: | @@ -459,8 +459,8 @@ class Predicate(AbstractExpression):
# Append the debug image for the predicate
closure_exprs.append(untyped_literal_expr('"{}.{}"'.format(
- self.pred_property.name.camel_with_underscores,
- self.pred_property.struct.name.camel_with_underscores
+ self.pred_property.struct.name.camel_with_underscores,
+ self.pred_property.name.camel_with_underscores
)))
logic_var_exprs.append(
|
add docker-ce to docker subtype grains check
Fixes | @@ -699,9 +699,10 @@ def _virtual(osdata):
with salt.utils.fopen('/proc/1/cgroup', 'r') as fhr:
if ':/lxc/' in fhr.read():
grains['virtual_subtype'] = 'LXC'
+ dstrings = (':/system.slice/docker', ':/docker/', ':/docker-ce/')
with salt.utils.fopen('/proc/1/cgroup', 'r') as fhr:
fhr_contents = fhr.read()
- if ':/docker/' in fhr_contents or ':/system.slice/docker' in fhr_contents:
+ if any(dstring in fhr_contents for dstring in dstrings):
grains['virtual_subtype'] = 'Docker'
except IOError:
pass
|
[kern] Allow compiling kern tables with more than 64k entries
Fixes | @@ -161,7 +161,7 @@ class KernTable_format_0(object):
len(data) - 6 * nPairs)
def compile(self, ttFont):
- nPairs = len(self.kernTable)
+ nPairs = min(len(self.kernTable), 0xFFFF)
searchRange, entrySelector, rangeShift = getSearchRange(nPairs, 6)
searchRange &= 0xFFFF
data = struct.pack(
|
some minor doc fixes to make the defaults clearer.
(They're still pretty hard to navigate though.) | @@ -670,22 +670,25 @@ def process(configfile='salt://hubblestack_pulsar/hubblestack_pulsar_config.yaml
* attrib - File metadata changed
* close_nowrite - Unwritable file closed
* close_write - Writable file closed
- * create - File created in watched directory
- * delete - File deleted from watched directory
+ * create [def] - File created in watched directory
+ * delete [def] - File deleted from watched directory
* delete_self - Watched file or directory deleted
- * modify - File modified
+ * modify [def] - File modified
* moved_from - File moved out of watched directory
* moved_to - File moved into watched directory
* move_self - Watched file moved
* open - File opened
- The mask can also contain the following options:
+ The mask can also contain the following options (none enabled by default):
* dont_follow - Don't dereference symbolic links
* excl_unlink - Omit events for children after they have been unlinked
* oneshot - Remove watch after one event
* onlydir - Operate only if name is directory
+ All the below options regarding further recursion and file watches default
+ to False.
+
recurse:
Recursively watch directories under the named directory
auto_add:
@@ -701,6 +704,9 @@ def process(configfile='salt://hubblestack_pulsar/hubblestack_pulsar_config.yaml
Can use regex if regex is set to True
contents:
Retrieve the contents of changed files based on checksums (which must be enabled)
+ When enabled, the options contents_size (default 20480) is also used to
+ decide, "Don't fetch contents for any file over contents_size or where
+ the checksum is unchanged."
If pillar/grains/minion config key `hubblestack:pulsar:maintenance` is set to
True, then changes will be discarded.
|
Include the guild ID in mod-log embed.
This gives easier access to the Guild ID in the place where you're most
likely to want to use the whitelist command. | @@ -111,7 +111,7 @@ class Filtering(Cog):
def _get_allowlist_items(self, allow: bool, list_type: str, compiled: Optional[bool] = False) -> list:
"""Fetch items from the allow_deny_list_cache."""
- items = self.bot.allow_deny_list_cache[f"{list_type}.{allow}"]
+ items = self.bot.allow_deny_list_cache.get(f"{list_type.upper()}.{allow}", [])
if compiled:
return [re.compile(fr'{item.get("content")}', flags=re.IGNORECASE) for item in items]
@@ -371,14 +371,14 @@ class Filtering(Cog):
# They have no data so additional embeds can't be created for them.
if name == "filter_invites" and match is not True:
additional_embeds = []
- for invite, data in match.items():
+ for _, data in match.items():
embed = discord.Embed(description=(
f"**Members:**\n{data['members']}\n"
f"**Active:**\n{data['active']}"
))
embed.set_author(name=data["name"])
embed.set_thumbnail(url=data["icon"])
- embed.set_footer(text=f"Guild Invite Code: {invite}")
+ embed.set_footer(text=f"Guild ID: {data['id']}")
additional_embeds.append(embed)
additional_embeds_msg = "For the following guild(s):"
@@ -489,6 +489,7 @@ class Filtering(Cog):
invite_data[invite] = {
"name": guild["name"],
+ "id": guild['id'],
"icon": guild_icon,
"members": response["approximate_member_count"],
"active": response["approximate_presence_count"]
|
Remove broad-except pylint directive
The block of code distinguishes among enough categories of exceptions
that pylint doesn't feel the need to complain. | @@ -46,7 +46,6 @@ def run():
# handled only there; it is just reraised here.
except KeyboardInterrupt as err:
raise err
- # pylint: disable=broad-except
except BaseException as err:
raise StratisCliActionError(command_line_args, result) from err
except StratisCliActionError as err:
|
settings: Fix display_emoji_reaction_users org setting save/discard.
Follow-up to commit
The new display setting introduced in above commit was not registered
properly and enabling/disabling it from Organization settings > Default
user settings did not display a "Save/Discard" widget. This has been
fixed by modifying the `settings_org.get_subsection_property_elements`
function. | @@ -211,6 +211,9 @@ function get_subsection_property_elements(element) {
// Because the emojiset widget has a unique radio button
// structure, it needs custom code.
const $color_scheme_elem = $subsection.find(".setting_color_scheme");
+ const $display_emoji_reaction_users_elem = $subsection.find(
+ ".display_emoji_reaction_users",
+ );
const $emojiset_elem = $subsection.find("input[name='emojiset']:checked");
const $user_list_style_elem = $subsection.find("input[name='user_list_style']:checked");
const $translate_emoticons_elem = $subsection.find(".translate_emoticons");
@@ -219,6 +222,7 @@ function get_subsection_property_elements(element) {
$emojiset_elem,
$user_list_style_elem,
$translate_emoticons_elem,
+ $display_emoji_reaction_users_elem,
];
}
return Array.from($subsection.find(".prop-element"));
|
fixed crash when --domain not provided for rasa train command
An error message will be displayed in case an invalid domain file is provided
or no domain file provided at all for 'rasa train core' only | @@ -247,6 +247,13 @@ async def train_core_async(
skill_imports = SkillSelector.load(config, stories)
+ if isinstance(domain, type(None)):
+ print_error(
+ "Core training is skipped because no domain was found. "
+ "Please specify a valid domain using '--domain' argument or check if provided domain file does exists"
+ )
+ return None
+
if isinstance(domain, str):
try:
domain = Domain.load(domain, skill_imports)
|
Fix typo
``Revision.commment`` to ``Revision.comment`` | @@ -68,7 +68,7 @@ django-reversion changelog
3.0.0 - 2018-07-19
------------------
-- **Breaking:** ``Revision.commment`` now contains the raw JSON change message generated by django admin, rather than
+- **Breaking:** ``Revision.comment`` now contains the raw JSON change message generated by django admin, rather than
a string. Accesing ``Revision.comment`` directly is no longer recommended. Instead, use ``Revision.get_comment()``.
(@RamezIssac).
- **BREAKING:** django-reversion now uses ``_base_manager`` to calculate deleted models, not ``_default_manager``. This
|
modify SelectToggle to use JSONEncoder so we can pass gettext_lazy text choices
and fix spacing | @@ -9,6 +9,8 @@ from django.utils.safestring import mark_safe
from django.utils.html import format_html, conditional_escape
from django.utils.translation import gettext_noop
+from corehq.util.json import CommCareJSONEncoder
+
from dimagi.utils.dates import DateSpan
from corehq.apps.hqwebapp.templatetags.hq_shared_tags import html_attr
@@ -153,11 +155,16 @@ class SelectToggle(forms.Select):
id: '{id}',
value: {value},
options: {options}"></select-toggle>
- '''.format(apply_bindings="true" if self.apply_bindings else "false",
+ '''.format(
+ apply_bindings="true" if self.apply_bindings else "false",
name=name,
id=html_attr(attrs.get('id', '')),
value=html_attr(self.params['value'] or '"{}"'.format(html_attr(value))),
- options=html_attr(json.dumps([{'id': c[0], 'text': c[1]} for c in self.choices])))
+ options=html_attr(json.dumps(
+ [{'id': c[0], 'text': c[1]} for c in self.choices],
+ cls=CommCareJSONEncoder
+ ))
+ )
class GeoCoderInput(Input):
|
bugfix set_tags
Bugfix to `BaseObject.clone_tags`: `set_tags` was called incorrectly from `clone_tags` without the kwargs symbol. | @@ -190,7 +190,7 @@ class BaseObject(_BaseEstimator):
update_dict = {key: tags_est[key] for key in tag_names}
- self.set_tags(update_dict)
+ self.set_tags(**update_dict)
return self
|
Fix - handle inputLinks and hero version
Hero version doesn't store inputLinks, but with changes in comparing it should work. | @@ -164,7 +164,6 @@ def get_linked_representation_id(
# Recursive graph lookup for inputs
{"$graphLookup": graph_lookup}
]
-
conn = get_project_connection(project_name)
result = conn.aggregate(query_pipeline)
referenced_version_ids = _process_referenced_pipeline_result(
@@ -213,7 +212,7 @@ def _process_referenced_pipeline_result(result, link_type):
for output in sorted(outputs_recursive, key=lambda o: o["depth"]):
output_links = output.get("data", {}).get("inputLinks")
- if not output_links:
+ if not output_links and output["type"] != "hero_version":
continue
# Leaf
@@ -232,6 +231,9 @@ def _process_referenced_pipeline_result(result, link_type):
def _filter_input_links(input_links, link_type, correctly_linked_ids):
+ if not input_links: # to handle hero versions
+ return
+
for input_link in input_links:
if link_type and input_link["type"] != link_type:
continue
|
feat(stock_wc_hot_top): add stock_wc_hot_top interface
add stock_wc_hot_top interface | @@ -369,13 +369,13 @@ def stock_zh_a_minute(
if __name__ == "__main__":
- stock_zh_a_daily_hfq_df_one = stock_zh_a_daily(symbol="sz000001", start_date="20201103", end_date="20201104", adjust="qfq")
+ stock_zh_a_daily_hfq_df_one = stock_zh_a_daily(symbol="sz000002", start_date="20201103", end_date="20201104", adjust="qfq")
print(stock_zh_a_daily_hfq_df_one)
stock_zh_a_daily_hfq_df_three = stock_zh_a_daily(symbol="sz000001", start_date="19900103", end_date="20210118", adjust="qfq")
print(stock_zh_a_daily_hfq_df_three)
- stock_zh_a_daily_hfq_df_two = stock_zh_a_daily(symbol="sz000001", start_date="19900103", end_date="20210118")
+ stock_zh_a_daily_hfq_df_two = stock_zh_a_daily(symbol="sz000002", start_date="20101103", end_date="20210118")
print(stock_zh_a_daily_hfq_df_two)
qfq_factor_df = stock_zh_a_daily(symbol="sz300857", adjust='hqf-factor')
|
tools: Rename server to zulip in fetch-contributor-data.
This was missed out in | @@ -106,12 +106,12 @@ def update_contributor_data_file() -> None:
with open(duplicate_commits_file) as f:
duplicate_commits = json.load(f)
for committer in duplicate_commits:
- if committer in contributor_username_to_data and contributor_username_to_data[committer].get('server'):
- total_commits = contributor_username_to_data[committer]['server']
+ if committer in contributor_username_to_data and contributor_username_to_data[committer].get('zulip'):
+ total_commits = contributor_username_to_data[committer]['zulip']
assert isinstance(total_commits, int)
duplicate_commits_count = duplicate_commits[committer]
original_commits = total_commits - duplicate_commits_count
- contributor_username_to_data[committer]['server'] = original_commits
+ contributor_username_to_data[committer]['zulip'] = original_commits
data['contributors'] = list(contributor_username_to_data.values())
write_to_disk(data, settings.CONTRIBUTOR_DATA_FILE_PATH)
|
Update vidar.txt
> netwire | @@ -1419,33 +1419,3 @@ sinelnikovd.ru
wzqyuwtdxyee.ru
zpuxmwmwdxxk.ru
zyzkikpfewuf.ru
-
-# Reference: https://www.virustotal.com/gui/file/196e5f9c769a45e6cebd587d193d53eb6aa8872ffb6f627988cb0ce457dad88e/detection
-
-riotvalorantgame.com
-
-# Reference: https://www.virustotal.com/gui/file/be4a188bcaa832f0adc28a0ab376a0b55b0cb2c8d6bbc57fe74b1ea72f1e520a/detection
-
-generalmotorshelp.com
-
-# Reference: https://www.virustotal.com/gui/file/c75a9108d565dda4d08d4673f221c53cce07b50680e62df43f30a1aa56a9957b/detection
-
-phonecallvoicemail.com
-
-# Reference: https://www.virustotal.com/gui/file/d46e5aaba3d0e10005c5cb1a313e3f10736b8d4dee4ddde464737aa363edeb6c/detection
-
-microphonesupport.com
-
-# Reference: https://www.virustotal.com/gui/file/e8e5df1b5ee0b46a3a5a63f789f039ddc338719227b5d16e16c28e9cf3e6e776/detection
-
-78.142.18.37:1980
-xlongphotography.com
-
-# Reference: https://www.virustotal.com/gui/file/3b9420f430267a8a7c2e29f69175590566c5f3c7f7136de19dde95994f65d972/detection
-
-lookingtotomorrow.com
-
-# Reference: https://www.virustotal.com/gui/file/99e80d903d29ba2d80d5074b036e94174a15f5fc8b08a5488cfb6c4efb1b766e/detection
-
-204.9.187.130:1986
-ohjddjhjfjd.com
|
Support PyG Linear in GraphGym pipeline.
Lazy init can be used in GraphGym now | @@ -3,6 +3,7 @@ from dataclasses import dataclass, replace
import torch
import torch.nn as nn
+from torch_geometric.nn import Linear as Linear_pyg
import torch.nn.functional as F
import torch_geometric as pyg
@@ -161,7 +162,7 @@ class Linear(nn.Module):
"""
def __init__(self, layer_config: LayerConfig, **kwargs):
super(Linear, self).__init__()
- self.model = nn.Linear(
+ self.model = Linear_pyg(
layer_config.dim_in,
layer_config.dim_out,
bias=layer_config.has_bias)
@@ -310,9 +311,9 @@ class GINConv(nn.Module):
def __init__(self, layer_config: LayerConfig, **kwargs):
super(GINConv, self).__init__()
gin_nn = nn.Sequential(
- nn.Linear(layer_config.dim_in, layer_config.dim_out),
+ Linear_pyg(layer_config.dim_in, layer_config.dim_out),
nn.ReLU(),
- nn.Linear(layer_config.dim_out, layer_config.dim_out))
+ Linear_pyg(layer_config.dim_out, layer_config.dim_out))
self.model = pyg.nn.GINConv(gin_nn)
def forward(self, batch):
|
works OK, but still back up the project first.
Works by looking for identical string segments in the replacement file to match codings and annotations. If the replacement file contains multiple matches, then only the first match is used.
return
self.get_codings_annotations_case()
self.load_file_text()
- self.update_annotation_positions()
- self.update_code_positions()
- self.update_case_positions()
- Message(self.app,_("File replaced"), _("Text file replaced.")).exec_()
+ errs = self.update_annotation_positions()
+ errs += self.update_code_positions()
+ errs += self.update_case_positions()
+ msg = _("Reload the other tabs.\nCheck accuracy of codings and annotations.") + "\n" + errs
+ Message(self.app, _("File replaced"), msg).exec_()
def update_case_positions(self):
""" Update case if all file is assigned to case or portions assigned to case. """
@@ -114,7 +115,7 @@ class ReplaceTextFile:
# Entire file assigned to case
if self.case_is_full_file is not None:
cur = self.app.conn.cursor()
- cur.execute("update casetext set pos1=? where caseid=?", [len(self.new_file['fulltext']) - 1, self.case_is_full_file])
+ cur.execute("update case_text set pos1=? where caseid=?", [len(self.new_file['fulltext']) - 1, self.case_is_full_file])
self.app.conn.commit()
return
# Find matching text segments and assign to case
@@ -133,12 +134,11 @@ class ReplaceTextFile:
cur.execute("update case_text set pos0=?, pos1=? where id=?", [pos, pos + c_len, c['id']])
self.app.conn.commit()
for id_ in to_delete:
- cur.execute("delete from code_text where ctid=?", [id_])
+ cur.execute("delete from case_text where id=?", [id_])
self.app.conn.commit()
- '''if len(to_delete) > 0:
- err_msg += _("\nDeleted ") + str(len(to_delete)) + _(" unmatched case text segments")'''
if err_msg != "":
- Message(self.app, "Case text warnings", err_msg).exec_()
+ return "\n" + err_msg
+ return err_msg
def update_code_positions(self):
""" Find matching text and update pos0 and pos1.
@@ -164,7 +164,8 @@ class ReplaceTextFile:
if len(to_delete) > 0:
err_msg += _("\nDeleted ") + str(len(to_delete)) + _(" unmatched codings")
if err_msg != "":
- Message(self.app, "Coding warnings", err_msg).exec_()
+ return err_msg
+ return err_msg
def update_annotation_positions(self):
""" Find matching text and update pos0 and pos1.
@@ -190,7 +191,8 @@ class ReplaceTextFile:
if len(to_delete) > 0:
err_msg += _("\nDeleted ") + str(len(to_delete)) + _(" unmatched codings")
if err_msg != "":
- Message(self.app, "Annotation warnings", err_msg).exec_()
+ return err_msg + "\n"
+ return err_msg
def get_codings_annotations_case(self):
""" Get codings and annotations for old file. """
|
Optionally create items in harvest_template.py
Needs both WikidataBot and harvest_template.py rewrites.
Depends-On:
Depends-On: | @@ -21,6 +21,10 @@ These command line parameters can be used to specify which pages to work on:
¶ms;
+You can also use additional parameters:
+
+-create Create missing items before importing.
+
The following command line parameters can be used to change the bot's behavior.
If you specify them before all parameters, they are global and are applied to
all param-property pairs. If you specify them after a param-property pair,
@@ -115,9 +119,12 @@ class HarvestRobot(WikidataBot):
@type fields: dict
@keyword islink: Whether non-linked values should be treated as links
@type islink: bool
+ @keyword create: Whether to create a new item if it's missing
+ @type create: bool
"""
self.availableOptions.update({
'always': True,
+ 'create': False,
'islink': False,
})
super(HarvestRobot, self).__init__(**kwargs)
@@ -133,6 +140,7 @@ class HarvestRobot(WikidataBot):
self.cacheSources()
self.templateTitles = self.getTemplateSynonyms(self.templateTitle)
self.linkR = textlib.compileLinkR()
+ self.create_missing_item = self.getOption('create')
def getTemplateSynonyms(self, title):
"""Fetch redirects of the title, so we can check against them."""
@@ -321,6 +329,8 @@ def main(*args):
u'Please enter the template to work on:')
else:
template_title = arg[10:]
+ elif arg.startswith('-create'):
+ options['create'] = True
elif gen.handleArg(arg):
if arg.startswith(u'-transcludes:'):
template_title = arg[13:]
|
New entry: young woman shot in the head by rubber bullet
Added a new entry. | @@ -29,3 +29,12 @@ While the prison transport vehicle was being pushed around, the police open fire
**Links**
* https://old.reddit.com/r/PublicFreakout/comments/gutezm/multiple_kentucky_state_police_troopers_tackled/
+
+### Young woman shot in the head by a rubber bullet | May 30th
+
+A young woman was injured by a rubber bullet she took to the head.
+
+**Links**
+
+* https://twitter.com/shannynsharyse/status/1267015577266249728
+* https://twitter.com/shannynsharyse/status/1266631722239766528?s=21
|
measurement location routine
Added routine to calculate lat/lon of points measured by the JRO ISR using the instrument's data keys. | @@ -200,3 +200,48 @@ def clean(self):
self.data = self[idx]
return
+
+def calc_measurement_loc(self):
+ """ Calculate the instrument measurement location in geographic coordinates
+
+ Returns
+ -------
+ Void : adds 'gdlat#', 'gdlon#' to the instrument, for all directions that
+ have azimuth and elevation keys that match the format 'eldir#' and 'azdir#'
+
+ """
+
+ az_keys = [kk[5:] for kk in list(self.data.keys()) if kk.find('azdir') == 0]
+ el_keys = [kk[5:] for kk in list(self.data.keys()) if kk.find('eldir') == 0]
+ good_dir = list()
+
+ for i,kk in enumerate(az_keys):
+ if kk in el_keys:
+ try:
+ good_dir.append(int(kk))
+ except:
+ print("WARNING: unknown direction number [{:}]".format(kk))
+
+ # Calculate the geodetic latitude and longitude for each direction
+ for dd in good_dir:
+ # Format the direction location keys
+ az_key = 'azdir{:d}'.format(dd)
+ el_key = 'eldir{:d}'.format(dd)
+ lat_key = 'gdlat{:d}'.format(dd)
+ lon_key = 'gdlat{:d}'.format(dd)
+ # JRO is located 520 m above sea level (jro.igp.gob.pe./english/)
+ # Also, altitude has already been calculated
+ gdaltr = np.ones(shape=self['gdlonr'].shape) * 0.52
+ gdlat, gdlon, _ = local_horizontal_to_global_geo(self[az_key],
+ self[el_key],
+ self['range'],
+ self['gdlatr'],
+ self['gdlonr'], gdaltr,
+ geodetic=True)
+
+ self[lat_key] = pds.Series(gdlat, index=self.data.index)
+ self[lon_key] = pds.Series(gdlon, index=self.data.index)
+ else:
+ raise ValueError("No matching azimuth and elevation data included")
+
+ return
|
Update morphology.py
Fixing backprop problems | @@ -71,7 +71,7 @@ def dilation(
neighborhood = torch.zeros_like(kernel)
neighborhood[kernel == 0] = -max_val
else:
- neighborhood = structuring_element
+ neighborhood = structuring_element.clone()
neighborhood[kernel == 0] = -max_val
output = output.unfold(2, se_h, 1).unfold(3, se_w, 1)
@@ -148,7 +148,7 @@ def erosion(
neighborhood = torch.zeros_like(kernel)
neighborhood[kernel == 0] = -max_val
else:
- neighborhood = structuring_element
+ neighborhood = structuring_element.clone()
neighborhood[kernel == 0] = -max_val
output = output.unfold(2, se_h, 1).unfold(3, se_w, 1)
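
A small, self-contained illustration of the autograd pitfall the added `.clone()` avoids (tensor values are made up):

import torch

se = torch.tensor([[0., 1., 0.],
                   [1., 1., 1.],
                   [0., 1., 0.]], requires_grad=True)  # structuring element

neighborhood = se.clone()      # the copy participates in the graph,
neighborhood[se == 0] = -1e4   # and the in-place write leaves `se` untouched
neighborhood.sum().backward()  # gradients still flow back to `se`

# Without the clone, `neighborhood` aliases `se`, so the masked assignment is an
# in-place op on a leaf tensor that requires grad and PyTorch raises a RuntimeError
# (and it would also silently mutate the caller's structuring element).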
|
Update robotstxt.py
Add a message to the IgnoreRequest exception so that it can be detected in the errback method of a spider
to_native_str(self._useragent), request.url):
logger.debug("Forbidden by robots.txt: %(request)s",
{'request': request}, extra={'spider': spider})
- raise IgnoreRequest()
+ raise IgnoreRequest("Forbidden by robots.txt")
def robot_parser(self, request, spider):
url = urlparse_cached(request)
|
bugfix
Fixed bug introduced into scale_units, which didn't account for the key difference between the accepted units dictionary and the scales dictionary.
if in_key != out_key:
raise ValueError('Cannot scale {:s} and {:s}'.format(out_unit,
in_unit))
+ # Recast units as keys for the scales dictionary
+ out_key = out_unit
+ in_key = in_unit
- unit_scale = scales[out_unit.lower()] / scales[in_unit.lower()]
+ unit_scale = scales[out_key.lower()] / scales[in_key.lower()]
return unit_scale
|
Fix "trove module-instances" command which doesn't work.
The "trove module-instances" command doesn't work.
cat <<EOF>> myping.data
message=Module.V1
EOF
trove module-create myping ping myping.data
trove module-apply \
myping
trove module-instances myping
ERROR: Module with ID \
could not be found.
Closes-Bug: | @@ -310,6 +310,8 @@ def _print_instances(instances, is_admin=False):
setattr(instance, 'datastore_version',
instance.datastore['version'])
setattr(instance, 'datastore', instance.datastore['type'])
+ if not hasattr(instance, 'region'):
+ setattr(instance, 'region', '')
fields = ['id', 'name', 'datastore',
'datastore_version', 'status',
'flavor_id', 'size', 'region']
|
Update README.md
Added environment variable setup on Windows
```bash
set SENDGRID_API_KEY=YOUR_API_KEY
```
-Permanently set the environment variable:
+Permanently set the environment variable(accessible in all subsequent cli sessions):
```bash
setx SENDGRID_API_KEY "YOUR_API_KEY"
```
|
Fix Inotify
FSStore throws an error when detecting inotify events, due to missing flag_to_human attribute. | @@ -69,7 +69,7 @@ from coherence.upnp.core import utils
try:
from twisted.internet.inotify import (
INotify, IN_CREATE, IN_DELETE, IN_MOVED_FROM, IN_MOVED_TO,
- IN_ISDIR, IN_CHANGED)
+ IN_ISDIR, IN_CHANGED, _FLAG_TO_HUMAN)
except Exception as msg:
INotify = None
no_inotify_reason = msg
@@ -847,7 +847,7 @@ class FSStore(BackendStore):
def notify(self, ignore, path, mask, parameter=None):
self.info("Event %s on %s - parameter %r",
- ', '.join(self.inotify.flag_to_human(mask)), path.path,
+ ', '.join(_FLAG_TO_HUMAN(mask)), path.path,
parameter)
if mask & IN_CHANGED:
|
refactor: add default to pop
[skip ci] | @@ -354,7 +354,7 @@ def login():
args = frappe.form_dict
ldap: LDAPSettings = frappe.get_doc("LDAP Settings")
- user = ldap.authenticate(frappe.as_unicode(args.usr), frappe.as_unicode(args.pop("pwd")))
+ user = ldap.authenticate(frappe.as_unicode(args.usr), frappe.as_unicode(args.pop("pwd", None)))
frappe.local.login_manager.user = user.name
if should_run_2fa(user.name):
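
For context, the behaviour the default changes (plain dict shown; frappe.form_dict is assumed to behave like one here):

args = {"usr": "jane"}    # request without a "pwd" field
args.pop("pwd", None)     # -> None
args.pop("pwd")           # -> KeyError: 'pwd'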
|
Update publish-flow-graphql.md
Small bug fix: replace "branin" stuff (wrong) with "ocean-subgraph" / "data NFTs". Other minor tweaks to streamline UX | @@ -5,7 +5,7 @@ SPDX-License-Identifier: Apache-2.0
# Quickstart: Publish & Consume Flow for GraphQL data type
-This quickstart describes a flow to publish & consume GraphQL-style URIs.
+This quickstart describes a flow to publish & consume GraphQL-style URIs. In our example, the data asset is a query to find data NFTs via ocean-subgraph.
Here are the steps:
@@ -29,22 +29,19 @@ From [data-nfts-and-datatokens-flow](data-nfts-and-datatokens-flow.md), do:
Then in the same python console:
```python
-from ocean_lib.web3_internal.constants import ZERO_ADDRESS
-
-# Specify metadata and services, using the Branin test dataset
+# Specify metadata and services
date_created = "2021-12-28T10:55:11Z"
-
metadata = {
"created": date_created,
"updated": date_created,
- "description": "Branin dataset",
- "name": "Branin dataset",
+ "description": "ocean-subgraph data NFTs",
+ "name": "ocean-subgraph data NFTs ",
"type": "dataset",
- "author": "Trent",
+ "author": "Alex",
"license": "CC0: PublicDomain",
}
-# we use just a simple graphql query
+# construct the graphql query itself
from ocean_lib.structures.file_objects import GraphqlQuery
graphql_query = GraphqlQuery(
url="https://v4.subgraph.rinkeby.oceanprotocol.com/subgraphs/name/oceanprotocol/ocean-subgraph",
@@ -59,7 +56,8 @@ graphql_query = GraphqlQuery(
"""
)
-# Publish dataset. It creates the data NFT, datatoken, and fills in metadata.
+# Publish dataset. It creates the data NFT, datatoken, and fills in metadata
+from ocean_lib.web3_internal.constants import ZERO_ADDRESS
asset = ocean.assets.create(
metadata,
alice_wallet,
|
Add setup details
Add details of prerequisites needed for development environment to function. | @@ -17,11 +17,15 @@ This is the open source repository for the free interactive tutorial websites:
Please feel free to contribute your tutorials or exercises by sending a pull request and adding yourself on the list.
+Developers will require the programming language Python https://www.python.org/ and the web framework Flask http://flask.pocoo.org/ installed in order to use this repository.
+
The web server will locally compile and load all Markdown files into memory. The development version of the web server
requires that you specify the domain you are working on, e.g.:
python main.py -d learnpython.org
+By default, the server process will run at http://localhost:5000
+
Contributors
============
superreg
|
Fix vlan in Qtech.QSW2800 config parser
HG--
branch : feature/microservices | @@ -170,8 +170,11 @@ class BaseQSW2800Parser(BaseParser):
:param tokens:
:return:
"""
- if "-" not in tokens[-1] or "database" not in tokens:
+ if "-" not in tokens[-1] and "database" not in tokens:
self.get_vlan_fact(int(tokens[-1].strip()))
+ elif "-" in tokens[-1]:
+ for v in ranges_to_list(tokens[-1].strip()):
+ self.get_vlan_fact(int(v))
def on_vlan_name(self, tokens):
self.get_current_vlan().name = tokens[-1]
|
Mathematicalize 2nd `ExtractionTurbineCHP` equation
Besides adding one more symbol which needs to be explained, this also
means we can put the timestep and variable quantification at the bottom
to have it range over both equations. This means less repetition, which
IMHO is easier to understand and less error prone. | @@ -894,7 +894,14 @@ class ExtractionTurbineCHPBlock(SimpleBlock):
&
f(i, n, t) =
\frac{(f(n, o_m, t) + f(n, o_t, t) \cdot I_{mfl}(n, t))}
- {E_c(n, t)} \\
+ {E_c(n, t)}.
+
+ Out flow relation :attr:`om.ExtractionTurbineCHP.relation[i,o,t]`
+ .. math::
+ &
+ f(n, o_m, t) = f(n, o_t, t) \cdot
+ \frac{\eta(n, o_m, t)}
+ {\eta(n, o_t, t)}\\
&
\forall t \in \textrm{TIMESTEPS}, \\
&
@@ -906,19 +913,16 @@ class ExtractionTurbineCHPBlock(SimpleBlock):
node :math:`o` at timestep :math:`t`,
* :math:`o_m` is the :py:obj:`main_output`,
* :math:`o_t` is the :py:obj:`tapped_output`,
+ * :math:`\eta(n, o, t)` is the efficiency (:py:obj:`conversion_factor`)
+ applied to the output from node :math:`n` to node :math:`o` at
+ timestep :math:`t`,
* :math:`I_{mfl}(n, t)` is the :py:obj:`main_flow_loss_index` at
node :math:`n`, at timestep :math:`t` and
* :math:`E_c(n, t)` is the condensing efficiency
(:py:obj:`efficiency_condensing`) at node :math:`n` at timestep
:math:`t`.
- Out flow relation :attr:`om.ExtractionTurbineCHP.relation[i,o,t]`
- .. math::
- flow(n, main\_output, t) = flow(n, tapped\_output, t) \cdot \\
- conversion\_factor(n, main\_output, t) / \
- conversion\_factor(n, tapped\_output, t)\\
- \forall t \in \textrm{TIMESTEPS}, \\
- \forall n \in \textrm{VARIABLE\_FRACTION\_TRANSFORMERS}.
+
"""
CONSTRAINT_GROUP = True
|
misc/rewriting: fix coding style issue
TN: | with Ada.Text_IO; use Ada.Text_IO;
with Libfoolang.Analysis; use Libfoolang.Analysis;
-with Libfoolang.Rewriting; use Libfoolang.Rewriting;
with Libfoolang.Common;
+with Libfoolang.Rewriting; use Libfoolang.Rewriting;
with Process_Apply;
|
include actual exception type and message in bug report tickets
and format it the way the python shell does | @@ -80,6 +80,22 @@ def is_deploy_in_progress():
return cache.get(DEPLOY_IN_PROGRESS_FLAG) is not None
+def format_traceback_the_way_python_does(type, exc, tb):
+ """
+ Returns a traceback that looks like the one python gives you in the shell, e.g.
+
+ Traceback (most recent call last):
+ File "<stdin>", line 2, in <module>
+ NameError: name 'name' is not defined
+ """
+
+ return u'Traceback (most recent call last):\n{}{}: {}'.format(
+ ''.join(traceback.format_tb(tb)),
+ type.__name__,
+ unicode(exc)
+ )
+
+
def server_error(request, template_name='500.html'):
"""
500 error handler.
@@ -91,7 +107,7 @@ def server_error(request, template_name='500.html'):
t = loader.get_template(template_name)
type, exc, tb = sys.exc_info()
- traceback_text = ''.join(traceback.format_tb(tb))
+ traceback_text = format_traceback_the_way_python_does(type, exc, tb)
traceback_key = uuid.uuid4().hex
cache.cache.set(traceback_key, traceback_text, 60*60)
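
A runnable sketch of the helper in use (Python 3 spelling; the KeyError below is just a stand-in for a real crash):

import sys
import traceback

def format_traceback_the_way_python_does(exc_type, exc, tb):
    # frames first, then "ExceptionType: message" on the last line, shell-style
    return 'Traceback (most recent call last):\n{}{}: {}'.format(
        ''.join(traceback.format_tb(tb)), exc_type.__name__, exc)

try:
    {}['missing']
except KeyError:
    print(format_traceback_the_way_python_does(*sys.exc_info()))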
|
Redirect + snackbar notification on resource selection save.
Moving resource array shaping responsibility to template.
Adding light validation. | @@ -4,6 +4,7 @@ import { setClassState } from './main';
import { LearnerGroupResource, LessonResource, ContentNodeResource } from 'kolibri.resources';
import { ContentNodeKinds } from 'kolibri.coreVue.vuex.constants';
import { createTranslator } from 'kolibri.utils.i18n';
+import every from 'lodash/every';
const translator = createTranslator('lessonsPageTitles', {
lessons: 'Lessons',
@@ -175,10 +176,11 @@ export function showLessonSelectionTopicPage(store, classId, lessonId, topicId)
);
}
-export function saveLessonResources(store, lessonId, resourceArray) {
- LessonResource.getModel(lessonId).save({
- resources: resourceArray.map(resourceId => ({ contentnode_id: resourceId })),
- });
+export function saveLessonResources(store, lessonId, resources) {
+ if (every(resources, resource => resource.contentnode_id)) {
+ return LessonResource.getModel(lessonId).save({ resources });
+ }
+ return Promise.reject();
}
export function showLessonSelectionSearchPage(store, classId, lessonId, searchTerm) {}
|
fix: correct stacklevel for warnings
Stacklevel=2 just points to frame that called warning and not frame
where it originated. This frame is useless in most cases as you can just
`grep` for it instead of looking at log.
stacklevel=3 gives frame which is calling the code with warnings.
[skip ci] | @@ -59,7 +59,7 @@ def log(message, colour=""):
print(colour + message + end_line)
-def warn(message, category=None, stacklevel=2):
+def warn(message, category=None, stacklevel=3):
from warnings import warn
warn(message=message, category=category, stacklevel=stacklevel)
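
An illustrative call chain (function names made up) showing what each stacklevel points at when warnings go through this wrapper:

import warnings

def warn(message):                       # wrapper, as in the diff above
    warnings.warn(message, stacklevel=3)

def deprecated_api():
    warn("deprecated_api() is deprecated")

def user_code():
    deprecated_api()   # stacklevel=3 blames this line (the real caller);
                       # 2 would blame deprecated_api()'s body, 1 the warnings.warn() call

user_code()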
|
More readable, link to log of example test suite
I really have to break my markdown habits! | @@ -13,15 +13,12 @@ Introduction
:target: https://travis-ci.org/adafruit/Adafruit__Micropython_Blinka
:alt: Build Status
-Description
-===========
-
This repository contains a selection of packages mirroring the CircuitPython API
-on hosts running micropython. At the time of writing drafts exist for
+on hosts running micropython. Working code exists to emulate the CircuitPython packages;
-* board - breakout-specific pin identities
-* microcontroller - chip-specific pin identities
-* digitalio - digital input/output pins, using pin identities from board/microcontroller
+* **board** - breakout-specific pin identities
+* **microcontroller** - chip-specific pin identities
+* **digitalio** - digital input/output pins, using pin identities from board+microcontroller packages
Dependencies
@@ -31,16 +28,18 @@ The Micropython compatibility layers described above are intended to provide a C
are running Micropython. Since corresponding packages should be built-in to any standard
CircuitPython image, they have no value on a device already running CircuitPython and would likely conflict in unhappy ways.
-However, the test suites under *testing.implementation.all* are by design
-intended to run on either CircuitPython or on Micropython+compatibility layer to prove conformance, while the test suites under *testing.implementation.micropython* will only run
-on Micropython and *testing.implementation.circuitpython* will only run on CircuitPython
+The test suites under **testing.implementation.all** are by design
+intended to run on *either* CircuitPython *or* Micropython+compatibility layer to prove conformance.
+
+The test suites under **testing.implementation.micropython** will only run
+on Micropython and **testing.implementation.circuitpython** will only run on CircuitPython
Usage Example
=============
At the time of writing (`git:3b2fc268 <https://github.com/cefn/Adafruit_Micropython_Blinka/tree/3b2fc268d89aee6a648da456224e6d48d2476baa>`_),
-the following sequence runs through some basic testing of the digitalio compatibility layer.
+the following sequence runs through some basic testing of the digitalio compatibility layer, which looks like `this <https://github.com/cefn/Adafruit_Micropython_Blinka/issues/2#issuecomment-366713394>`_ .
.. code-block:: python
|
Add more log messages around exiting
This is to help understand ongoing issues with
parsl not exiting cleanly / hanging at exit. | @@ -1075,6 +1075,7 @@ class DataFlowKernel(object):
logger.info("Closing flowcontrol")
self.flowcontrol.close()
+ logger.info("Terminated flow control")
logger.info("Scaling in and shutting down executors")
@@ -1093,14 +1094,17 @@ class DataFlowKernel(object):
self.monitoring.send(MessageType.BLOCK_INFO, msg)
logger.info(f"Shutting down executor {executor.label}")
executor.shutdown()
+ logger.info(f"Shut down executor {executor.label}")
elif executor.managed and executor.bad_state_is_set: # and bad_state_is_set
logger.warning(f"Not shutting down executor {executor.label} because it is in bad state")
else:
logger.info(f"Not shutting down executor {executor.label} because it is unmanaged")
+ logger.info("Terminated executors")
self.time_completed = datetime.datetime.now()
if self.monitoring:
+ logger.info("Sending final monitoring message")
self.monitoring.send(MessageType.WORKFLOW_INFO,
{'tasks_failed_count': self.task_state_counts[States.failed],
'tasks_completed_count': self.task_state_counts[States.exec_done],
@@ -1109,7 +1113,9 @@ class DataFlowKernel(object):
'run_id': self.run_id, 'rundir': self.run_dir,
'exit_now': True})
+ logger.info("Terminating monitoring")
self.monitoring.close()
+ logger.info("Terminated monitoring")
logger.info("DFK cleanup complete")
|
Handle empty yaml documents in annotations
When a service annotation contains an empty yaml document, Ambassador crashloops.
This can happen when a developer comments out a yaml doc for testing,
or with a bad deployment.
Ambassador should not crash in this case and should continue to process the
other documents.
else:
self.logger.debug(f"not saving K8s Service {resource_name}.{resource_namespace} with no ports")
- objects: List[Any] = []
+ result: List[Any] = []
if annotations:
if (self.filename is not None) and (not self.filename.endswith(":annotation")):
@@ -866,15 +866,19 @@ class ResourceFetcher:
objects = parse_yaml(annotations)
for obj in objects:
+ if not obj:
+ self.logger.warning(f"empty YAML document found in ambassador service: {resource_name}.{resource_namespace}")
+ continue
if obj.get('metadata_labels') is None and metadata_labels:
obj['metadata_labels'] = metadata_labels
if obj.get('namespace') is None:
obj['namespace'] = resource_namespace
+ result.append(obj)
except yaml.error.YAMLError as e:
self.logger.debug("could not parse YAML: %s" % e)
- return resource_identifier, objects
+ return resource_identifier, result
# Handler for K8s Secret resources.
def handle_k8s_secret(self, k8s_object: AnyDict) -> HandlerResult:
|
[BUG] Fixing overlap in `NaiveVariance` train/test set due to inclusive indexing for timestamp limits
Fixes issue
Pandas treats slicing with integer vs timestamp index / period index differently (the first has an exclusive end and the second inclusive). Use of `get_slice` in `NaiveVariance` ensures that both are exclusive. | @@ -19,6 +19,7 @@ import numpy as np
import pandas as pd
from scipy.stats import norm
+from sktime.datatypes._utilities import get_slice
from sktime.forecasting.base._base import DEFAULT_ALPHA, BaseForecaster
from sktime.forecasting.base._sktime import _BaseWindowForecaster
from sktime.forecasting.compose import ColumnEnsembleForecaster
@@ -632,8 +633,8 @@ class NaiveVariance(BaseForecaster):
for id in y_index:
forecaster = forecaster.clone()
- y_train = y[:id] # subset on which we fit
- y_test = y[id:] # subset on which we predict
+ y_train = get_slice(y, start=None, end=id) # subset on which we fit
+ y_test = get_slice(y, start=id, end=None) # subset on which we predict
try:
forecaster.fit(y_train, fh=y_test.index)
except ValueError:
|
Update export_tflite_ssd_graph_lib.py
Correcting comments | @@ -41,7 +41,7 @@ def get_const_center_size_encoded_anchors(anchors):
boxes
Returns:
- encoded_anchors: a float32 constant tensor of shape [4, num_anchors]
+ encoded_anchors: a float32 constant tensor of shape [num_anchors, 4]
containing the anchor boxes.
"""
anchor_boxlist = box_list.BoxList(anchors)
@@ -83,10 +83,10 @@ def append_postprocessing_op(frozen_graph_def, max_detections,
TFLite_Detection_PostProcess custom op node has four outputs:
detection_boxes: a float32 tensor of shape [1, num_boxes, 4] with box
locations
- detection_scores: a float32 tensor of shape [1, num_boxes]
- with class scores
detection_classes: a float32 tensor of shape [1, num_boxes]
with class indices
+ detection_scores: a float32 tensor of shape [1, num_boxes]
+ with class scores
num_boxes: a float32 tensor of size 1 containing the number of detected
boxes
"""
|
[cleanup] use codes instead of languages_by_size
languages_by_size was never sorted by site size but contains the site codes
in alphabetical order. Use codes tuple instead to hold the site codes
Create a class property for languages_by_size for compatibility purpose | @@ -18,14 +18,20 @@ class Family(family.SubdomainFamily, family.FandomFamily):
name = 'wowwiki'
domain = 'wowwiki.fandom.com'
- languages_by_size = [
+ codes = (
'ar', 'cs', 'da', 'de', 'el', 'en', 'es', 'et', 'fa', 'fi', 'fr', 'he',
'hu', 'is', 'it', 'ja', 'ko', 'lt', 'lv', 'nl', 'nn', 'no', 'pl', 'pt',
'pt-br', 'ru', 'sk', 'tr', 'uk', 'zh', 'zh-tw'
- ]
+ )
interwiki_removals = ['hr', 'ro', 'sr', 'sv']
+ @classproperty
+ @deprecated('codes attribute', since='20190422')
+ def languages_by_size(cls):
+ """DEPRECATED. languages_by_size property for compatibility purpose."""
+ return list(cls.codes)
+
@classproperty
def langs(cls):
"""Property listing family languages."""
|
[dagit] Store whitespace state in localStorage
## Summary
Resolves
Track the "toggle whitespace" launchpad setting in localStorage.
## Test Plan
View launchpad, turn on whitespace visibility. Reload page, verify persistence. Turn it off, repeat, verify same. | @@ -33,6 +33,7 @@ import {
responseToYamlValidationResult,
} from '../configeditor/ConfigEditorUtils';
import {isHelpContextEqual} from '../configeditor/isHelpContextEqual';
+import {useStateWithStorage} from '../hooks/useStateWithStorage';
import {DagsterTag} from '../runs/RunTag';
import {RepositorySelector} from '../types/globalTypes';
import {repoAddressToSelector} from '../workspace/repoAddressToSelector';
@@ -81,7 +82,6 @@ interface ILaunchpadSessionState {
previewedDocument: any | null;
configLoading: boolean;
editorHelpContext: ConfigEditorHelpContext | null;
- showWhitespace: boolean;
tagEditorOpen: boolean;
}
@@ -97,8 +97,7 @@ type Action =
}
| {type: 'toggle-tag-editor'; payload: boolean}
| {type: 'toggle-config-loading'; payload: boolean}
- | {type: 'set-editor-help-context'; payload: ConfigEditorHelpContext | null}
- | {type: 'toggle-whitepsace'; payload: boolean};
+ | {type: 'set-editor-help-context'; payload: ConfigEditorHelpContext | null};
const reducer = (state: ILaunchpadSessionState, action: Action) => {
switch (action.type) {
@@ -119,8 +118,6 @@ const reducer = (state: ILaunchpadSessionState, action: Action) => {
return {...state, configLoading: action.payload};
case 'set-editor-help-context':
return {...state, editorHelpContext: action.payload};
- case 'toggle-whitepsace':
- return {...state, showWhitespace: action.payload};
default:
return state;
}
@@ -131,7 +128,6 @@ const initialState: ILaunchpadSessionState = {
previewLoading: false,
previewedDocument: null,
configLoading: false,
- showWhitespace: true,
editorHelpContext: null,
tagEditorOpen: false,
};
@@ -147,6 +143,11 @@ const LaunchpadSessionContainer: React.FC<LaunchpadSessionContainerProps> = (pro
const editorSplitPanelContainer = React.useRef<SplitPanelContainer | null>(null);
const previewCounter = React.useRef(0);
+ const [showWhitespace, setShowWhitespace] = useStateWithStorage(
+ 'launchpad-whitespace',
+ (json: any) => (typeof json === 'boolean' ? json : true),
+ );
+
const {isJob, presets} = pipeline;
const initialDataForMode = React.useMemo(() => {
@@ -515,7 +516,6 @@ const LaunchpadSessionContainer: React.FC<LaunchpadSessionContainerProps> = (pro
previewedDocument,
configLoading,
editorHelpContext,
- showWhitespace,
tagEditorOpen,
} = state;
@@ -606,7 +606,11 @@ const LaunchpadSessionContainer: React.FC<LaunchpadSessionContainerProps> = (pro
title="Toggle whitespace"
icon={<Icon name="toggle_whitespace" />}
active={showWhitespace}
- onClick={() => dispatch({type: 'toggle-whitepsace', payload: !showWhitespace})}
+ onClick={() =>
+ setShowWhitespace((current: boolean | undefined) =>
+ current === undefined ? true : !current,
+ )
+ }
/>
<SessionSettingsSpacer />
<SecondPanelToggle axis="horizontal" container={editorSplitPanelContainer} />
|
no longer accept HTML entities in xml files
the script that produces these files has been updated
to produce UTF-8 rather than HTML entities.
def load_volume_xml(xml_data)
xml_data.force_encoding('UTF-8').encode('UTF-8', :invalid => :replace, :undef => :replace, :replace => '')
- xml_data.gsub!(/&/, '&amp;') # three chars that need to stay
- xml_data.gsub!(/>/, '&gt;') # escaped in xml
- xml_data.gsub!(/</, '&lt;') # will go back to & > <
-
- xml_data = HTMLEntities.new.decode xml_data # Change all escape characters to Unicode
- # handles html entities such as é that are not known in xml
xml_data.gsub!(/<</, '<<')
xml_data.gsub!(/>>/, '>>')
|
Strip items of the split list
And not the initial string
def generate_auth_options(auth_list):
auth_options = {}
- methods = auth_list.strip().split(',')
+ methods = [item.strip() for item in auth_list.split(',')]
for m in methods:
if m in AUTH_MODULES:
auth_options[m] = AUTH_MODULES[m]
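
What the change does, on a toy input:

auth_list = " ldap , oauth2 "
auth_list.strip().split(',')                     # ['ldap ', ' oauth2'] - inner spaces survive
[item.strip() for item in auth_list.split(',')]  # ['ldap', 'oauth2']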
|
change mode
At Snwp=0, we have no invasion sequence, so we won't calculate Keff for that point. (masking for calculation of the saturation points is based on inv_seq<i) | @@ -89,11 +89,11 @@ class RelativePermeability(GenericAlgorithm):
wp = self.project[self.settings['wp']]
modelwp = models.physics.multiphase.conduit_conductance
wp.add_model(model=modelwp, propname=prop,
- throat_conductance=prop_q, mode='loose')
+ throat_conductance=prop_q, mode='medium')
nwp = self.project[self.settings['nwp']]
modelnwp = models.physics.multiphase.conduit_conductance
nwp.add_model(model=modelnwp, propname=prop,
- throat_conductance=prop_q, mode='loose')
+ throat_conductance=prop_q, mode='medium')
def _abs_perm_calc(self, flow_pores):
r"""
|
Remove test_generic_storage_with_old_parameters
With explicit kwargs, the test will no longer be needed. | @@ -97,37 +97,6 @@ def test_generic_storage_4():
)
-def test_generic_storage_with_old_parameters():
- deprecated = {
- "nominal_capacity": 45,
- "initial_capacity": 0,
- "capacity_loss": 0,
- "capacity_min": 0,
- "capacity_max": 0,
- }
- # Make sure an `AttributeError` is raised if we supply all deprecated
- # parameters.
- with pytest.raises(AttributeError) as caught:
- components.GenericStorage(
- label="`GenericStorage` with all deprecated parameters",
- **deprecated,
- )
- for parameter in deprecated:
- # Make sure every parameter used is mentioned in the exception's
- # message.
- assert parameter in str(caught.value)
- # Make sure an `AttributeError` is raised for each deprecated
- # parameter.
- pytest.raises(
- AttributeError,
- components.GenericStorage,
- **{
- "label": "`GenericStorage` with `{}`".format(parameter),
- parameter: deprecated[parameter],
- },
- )
-
-
def test_generic_storage_with_non_convex_investment():
"""Tests error if `offset` and `existing` attribute are given."""
with pytest.raises(
|
Add ordered list styling for /help/ pages.
This adds a styling that puts the numbers in a Zulip brand green bubble
with white text for the number. | @@ -57,10 +57,51 @@ body {
font-size: 17px;
}
-li {
+.markdown ul,
+.markdown ol {
+ margin-left: 30px;
+}
+
+.markdown li {
line-height: 150%;
}
+.markdown ol {
+ counter-reset: item;
+ list-style: none;
+}
+
+.markdown ol li {
+ counter-increment: item;
+ margin-bottom: 5px;
+}
+
+.markdown ol li:before {
+ content: counter(item);
+
+ display: inline-block;
+ vertical-align: top;
+
+ padding: 3px 6.5px 3px 7.5px;
+ margin-right: 5px;
+ background-color: #52c2af;
+ color: white;
+ border-radius: 100%;
+ font-size: 0.9em;
+ line-height: 1.1;
+ text-align: center;
+}
+
+.markdown ol li p {
+ display: inline-block;
+ vertical-align: top;
+
+ max-width: calc(100% - 28px);
+
+ position: relative;
+ top: -2px;
+}
+
.title {
font-family: Helvetica;
font-size: 100px;
|
integrations: Update HomeAssistant Documentation.
I have updated the docs for the homeassistant integration to include
numbers to increase visibility.
Fixes part of
+1. {!create-stream.md!}
-Next, on your {{ settings_html|safe }}, create a bot and
+1. Next, on your {{ settings_html|safe }}, create a bot and
note its email and API key.
-In Home Assistant, you need to add the `notify` service to your
+1. In Home Assistant, you need to add the `notify` service to your
`configuration.yaml` file. This should look something like this:

-The `api_key` parameter should correspond to your bot's key. The `stream`
+1. The `api_key` parameter should correspond to your bot's key. The `stream`
parameter is not necessarily required; if not given, it will default to
the `homeassistant` stream.
-And the URL under `resource` should start with:
+1. And the URL under `resource` should start with:
`{{ api_url }}/v1/external/homeassistant`
-Finally, you need to configure a trigger for the service by adding
+1. Finally, you need to configure a trigger for the service by adding
an automation entry in the HomeAssistant `configuration.yaml` file.

|
Bugfix: be clear about header encoding
latin1 is technically allowed. | @@ -40,7 +40,7 @@ class ASGIHTTPConnection:
headers = CIMultiDict()
headers['Remote-Addr'] = (self.scope.get('client') or ['<local>'])[0]
for name, value in self.scope['headers']:
- headers.add(name.decode().title(), value.decode())
+ headers.add(name.decode("latin1").title(), value.decode("latin1"))
if self.scope['http_version'] < '1.1':
headers.setdefault('Host', self.app.config['SERVER_NAME'] or '')
@@ -120,7 +120,7 @@ class ASGIWebsocketConnection:
headers = CIMultiDict()
headers['Remote-Addr'] = (self.scope.get('client') or ['<local>'])[0]
for name, value in self.scope['headers']:
- headers.add(name.decode().title(), value.decode())
+ headers.add(name.decode("latin1").title(), value.decode("latin1"))
path = self.scope["path"]
path = path if path[0] == "/" else urlparse(path).path
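
Why the codec matters: a single non-ASCII byte in a header value is valid Latin-1 but invalid UTF-8 (example bytes made up):

raw = b"attachment; filename=r\xe9sum\xe9.txt"   # 0xe9 is 'é' in Latin-1
raw.decode("latin1")   # 'attachment; filename=résumé.txt'
raw.decode()           # UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe9 ...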
|
utils/files remove temp file upon move failure
Fixes | @@ -26,6 +26,16 @@ REMOTE_PROTOS = ('http', 'https', 'ftp', 'swift', 's3')
VALID_PROTOS = ('salt', 'file') + REMOTE_PROTOS
+def __clean_tmp(tmp):
+ '''
+ Remove temporary files
+ '''
+ try:
+ salt.utils.rm_rf(tmp)
+ except Exception:
+ pass
+
+
def guess_archive_type(name):
'''
Guess an archive type (tar, zip, or rar) by its file extension
@@ -93,7 +103,15 @@ def copyfile(source, dest, backup_mode='', cachedir=''):
fstat = os.stat(dest)
except OSError:
pass
+
+ # The move could fail if the dest has xattr protections, so delete the
+ # temp file in this case
+ try:
shutil.move(tgt, dest)
+ except Exception:
+ __clean_tmp(tgt)
+ raise
+
if fstat is not None:
os.chown(dest, fstat.st_uid, fstat.st_gid)
os.chmod(dest, fstat.st_mode)
@@ -111,10 +129,7 @@ def copyfile(source, dest, backup_mode='', cachedir=''):
subprocess.call(cmd, stdout=dev_null, stderr=dev_null)
if os.path.isfile(tgt):
# The temp file failed to move
- try:
- os.remove(tgt)
- except Exception:
- pass
+ __clean_tmp(tgt)
def rename(src, dst):
|
Add the changelog entry
Add the changelog entry about the addition of nologo to RCFLAGS in MSVC tool. | @@ -38,6 +38,8 @@ RELEASE 3.0.5.alpha.yyyymmdd - NEW DATE WILL BE INSERTED HERE
From Bernhard M. Wiedemann:
- Do not store build host+user name if reproducible builds are wanted
+ From Maciej Kumorek:
+ - Update the MSVC tool to include the nologo flag by default in RCFLAGS
RELEASE 3.0.4 - Mon, 20 Jan 2019 22:49:27 +0000
|
MockBot needs to be aware of redis_ready
Forgot to update the additional_spec_asyncs when changing the name of
this Bot attribute to be public. | @@ -287,7 +287,7 @@ class MockBot(CustomMockMixin, unittest.mock.MagicMock):
For more information, see the `MockGuild` docstring.
"""
spec_set = Bot(command_prefix=unittest.mock.MagicMock(), loop=_get_mock_loop())
- additional_spec_asyncs = ("wait_for", "_redis_ready")
+ additional_spec_asyncs = ("wait_for", "redis_ready")
def __init__(self, **kwargs) -> None:
super().__init__(**kwargs)
|
[BUG] fixed loc/iloc indexing bug in nested_df_has_nans
This PR fixes the following bug:
The `_nested_dataframe_has_nans` utility function did not work for nested data frames whose index was not an integer index starting at zero - fixed by replacing `loc` indexing with `iloc`.
"""
cases = len(X)
dimensions = len(X.columns)
- for i in range(0, cases):
- for j in range(0, dimensions):
+ for i in range(cases):
+ for j in range(dimensions):
s = X.iloc[i, j]
- for k in range(0, s.size):
- if pd.isna(s[k]):
+ for k in range(s.size):
+ if pd.isna(s.iloc[k]):
return True
return False
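
The pitfall in miniature - with a non-default index, `s[k]` is label-based while `s.iloc[k]` stays positional (toy series):

import pandas as pd

s = pd.Series([1.0, None, 3.0], index=[10, 11, 12])
s.iloc[1]   # nan -> position 1, regardless of labels
s[1]        # KeyError: 1 (there is no label 1 in the index)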
|