Columns: message (string, lengths 13 to 484) | diff (string, lengths 38 to 4.63k)
update `create_pydantic_model` docs Didn't mention the `include_columns` option.
@@ -54,8 +54,8 @@ We can then create model instances from data we fetch from the database: You have several options for configuring the model, as shown below. -exclude_columns -~~~~~~~~~~~~~~~ +include_columns / exclude_columns +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ If we want to exclude the ``popularity`` column from the ``Band`` table: @@ -63,6 +63,12 @@ If we want to exclude the ``popularity`` column from the ``Band`` table: BandModel = create_pydantic_model(Band, exclude_columns=(Band.popularity,)) +Conversely, if you only wanted the ``popularity`` column: + +.. code-block:: python + + BandModel = create_pydantic_model(Band, include_columns=(Band.popularity,)) + nested ~~~~~~ @@ -148,8 +154,9 @@ By default the primary key column isn't included - you can add it using: Source ~~~~~~ -.. automodule:: piccolo.utils.pydantic - :members: +.. currentmodule:: piccolo.utils.pydantic + +.. autofunction:: create_pydantic_model .. hint:: A good place to see ``create_pydantic_model`` in action is `PiccoloCRUD <https://github.com/piccolo-orm/piccolo_api/blob/master/piccolo_api/crud/endpoints.py>`_, as it uses ``create_pydantic_model`` extensively to create Pydantic models
Add warning for a known autograd issue on the XLA backend. Summary: Pull Request resolved:
@@ -384,6 +384,20 @@ unsigned VariableHooks::_register_hook(const Tensor& self, std::function<Tensor( } void handle_view_on_rebase(DifferentiableViewMeta* diff_view_meta, bool indirect) { + // TODO: Remove this warning once we allow XLA to workaround CopySlices. + if (diff_view_meta->base_.device().type() == c10::DeviceType::XLA) { + std::string msg; + if (indirect) { + msg = "This view requires gradients but its base or another view of the same base has been modified inplace. "; + } else { + msg = "This view requires gradients and it's being modified inplace. "; + } + msg = c10::str(msg, "Backward through inplace update on view tensors is WIP for XLA backwend. " + "Gradient might be wrong in certain cases. Running forward alone is fine. " + "To work around it, please replace the inplace operation by an out-of-place one."); + TORCH_WARN(msg); + } + /// See NOTE [ View + Inplace detection ] for justification of the logic below if (diff_view_meta->creation_meta != CreationMeta::DEFAULT) { auto grad_fn = diff_view_meta->grad_fn_.get();
BUG: toggling contemporaneous on Dendrogram now toggles [FIXED] Dendrogram display now reflects new value
@@ -423,6 +423,7 @@ class Dendrogram(Drawable): self._edge_mapping = {} self._contemporaneous = contemporaneous self._tips_as_text = True + self._length_attr = self.tree._length @property def label_pad(self): @@ -442,9 +443,10 @@ class Dendrogram(Drawable): def contemporaneous(self, value): if not type(value) == bool: raise TypeError - if not self._contemporaneous == value: + if self._contemporaneous != value: klass = self.tree.__class__ - self.tree = klass(self.tree, length_attr="frac_pos") + length_attr = "frac_pos" if value else self._length_attr + self.tree = klass(self.tree, length_attr=length_attr) self.tree.propagate_properties() self._traces = [] if value: # scale bar not needed
Fixed bad hash method iteration example Updated the iteration docs to a working example. Made consistent with iteration and method example in previous lines.
@@ -173,7 +173,7 @@ def generate_password_hash(password, method="pbkdf2:sha256", salt_length=8): :param password: the password to hash. :param method: the hash method to use (one that hashlib supports). Can - optionally be in the format ``pbkdf2:<method>[:iterations]`` + optionally be in the format ``pbkdf2:method:iterations`` to enable PBKDF2. :param salt_length: the length of the salt in letters. """
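For reference, a small usage sketch of the documented format, assuming Werkzeug's security helpers (the password and iteration count here are made up):

```python
from werkzeug.security import generate_password_hash, check_password_hash

# "pbkdf2:method:iterations" spells out both the digest and the iteration count
hashed = generate_password_hash("s3cret", method="pbkdf2:sha256:150000")
assert check_password_hash(hashed, "s3cret")
```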
[revert]: removed the default pse route statement as it is maintained by the routers Github Issue: Authored-by: Shubham Bansal
@@ -121,10 +121,6 @@ function MapOrSectorController($location, storageService, locationsService) { break; } } - // PSE is the deafult tab - if (awcReportPath === 'awc_reports') { - awcReportPath += '/pse'; - } $location.path(awcReportPath); } });
update ldap documentation use new ldap cache configuration in documentation
@@ -110,8 +110,8 @@ AUTH_LDAP_USER_FLAGS_BY_GROUP = { AUTH_LDAP_FIND_GROUP_PERMS = True # Cache groups for one hour to reduce LDAP traffic -AUTH_LDAP_CACHE_GROUPS = True -AUTH_LDAP_GROUP_CACHE_TIMEOUT = 3600 +AUTH_LDAP_CACHE_TIMEOUT = 3600 + ``` * `is_active` - All users must be mapped to at least this group to enable authentication. Without this, users cannot log in.
[Train] Monkeypatch environment variables in `test_json` If we use `os.environ` to set environment variables in tests, then our tests become coupled. By using `monkeypatch`, we can safely set environment variables while ensuring our tests remain decoupled. For more information, see the [monkeypatching documentation](https://docs.pytest.org/en/6.2.x/monkeypatch.html#monkeypatching-environment-variables).
@@ -59,10 +59,10 @@ class TestBackend(Backend): @pytest.mark.parametrize("workers_to_log", [0, None, [0, 1]]) @pytest.mark.parametrize("detailed", [False, True]) @pytest.mark.parametrize("filename", [None, "my_own_filename.json"]) -def test_json(ray_start_4_cpus, make_temp_dir, workers_to_log, detailed, - filename): +def test_json(monkeypatch, ray_start_4_cpus, make_temp_dir, workers_to_log, + detailed, filename): if detailed: - os.environ[ENABLE_DETAILED_AUTOFILLED_METRICS_ENV] = "1" + monkeypatch.setenv(ENABLE_DETAILED_AUTOFILLED_METRICS_ENV, "1") config = TestConfig() @@ -119,9 +119,6 @@ def test_json(ray_start_4_cpus, make_temp_dir, workers_to_log, detailed, all(not any(key in worker for key in DETAILED_AUTOFILLED_KEYS) for worker in element) for element in log) - os.environ.pop(ENABLE_DETAILED_AUTOFILLED_METRICS_ENV, 0) - assert ENABLE_DETAILED_AUTOFILLED_METRICS_ENV not in os.environ - def _validate_tbx_result(events_dir): events_file = list(glob.glob(f"{events_dir}/events*"))[0]
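A minimal sketch (with a hypothetical variable name) of why `monkeypatch.setenv` keeps tests decoupled: the change is undone automatically when the test ends, so no manual `os.environ.pop` cleanup is needed:

```python
import os

def test_detailed_metrics_enabled(monkeypatch):
    monkeypatch.setenv("MY_DETAILED_METRICS", "1")  # reverted after this test
    assert os.environ["MY_DETAILED_METRICS"] == "1"

def test_env_is_clean_afterwards():
    # passes without any explicit cleanup in the previous test
    assert "MY_DETAILED_METRICS" not in os.environ
```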
PerfRegion uses PLT_LINKLET_TIMES env var instead of verbose Fixes
@@ -27,6 +27,14 @@ def console_log_after_boot(print_str, given_verbosity_level=0, debug=False): if glob.is_boot_completed(): console_log(print_str, given_verbosity_level, debug) +def os_check_env_var(var_str): + import os + return var_str in os.environ.keys() + +def os_get_env_var(var_str): + import os + return os.environ.get(var_str) if var_str in os.environ.keys() else "" + ## this code is a port of cs/linklet/performance.ss class PerfRegion(object): @@ -54,16 +62,11 @@ class PerfRegionCPS(PerfRegion): return None # using False here confuses rtyper def start_perf_region(label): - from pycket.env import w_global_config - current_v_level = w_global_config.get_config_val('verbose') - if current_v_level > 0: + if os_check_env_var("PLT_LINKLET_TIMES"): linklet_perf.current_start_time.append(rtime.time()) - def finish_perf_region(label): - from pycket.env import w_global_config - current_v_level = w_global_config.get_config_val('verbose') - if current_v_level > 0: + if os_check_env_var("PLT_LINKLET_TIMES"): assert (len(linklet_perf.current_start_time) > 0) delta = rtime.time() - linklet_perf.current_start_time[-1] table_add(linklet_perf.region_times, label, delta) @@ -119,9 +122,7 @@ class LinkletPerf(object): self.loop(sub_ht, level+1) def print_report(self): - from pycket.env import w_global_config - current_v_level = w_global_config.get_config_val('verbose') - if current_v_level > 0: + if os_check_env_var("PLT_LINKLET_TIMES"): total = 0 self.name_len = 0 for k in self.region_times:
Remove object as base class for MutableChain Plus some minor styling adjustments
""" This module contains essential stuff that should've come with Python itself ;) """ +import errno import gc +import inspect import os import re -import inspect +import sys import weakref -import errno from functools import partial, wraps from itertools import chain -import sys from scrapy.utils.decorators import deprecated @@ -371,7 +371,7 @@ else: gc.collect() -class MutableChain(object): +class MutableChain: """ Thin wrapper around itertools.chain, allowing to add iterables "in-place" """
Don't use checkboxes in bug issue form GitHub's form schema treats the `checkbox` type as a tasklist. Since the intention is to allow the user to select more than one command environment, we should use a dropdown and give the user the option to select more than one option.
@@ -47,16 +47,17 @@ body: attributes: label: Operating System description: Your operating system and version. - - type: checkboxes + - type: dropdown id: win attributes: label: Windows environment description: If using Windows, how are you running CumulusCI? + multiple: true options: - - label: Command Prompt - - label: PowerShell - - label: Bash on Windows - - label: WSL + - Bash on Windows + - Command Prompt + - PowerShell + - WSL - type: dropdown id: install attributes:
[LLVM] Fix build errors in CodeGenCPU::AddDebugInformation This code is guarded by TVM_LLVM_VERSION >= 50 and < 70, so the errors were not detected in local tests or in CI.
@@ -203,11 +203,12 @@ void CodeGenCPU::AddDebugInformation(PrimFunc f_tir, llvm::Function* f_llvm) { ICHECK(f_llvm->getReturnType() == t_void_ || f_llvm->getReturnType() == t_int_) << "Unexpected return type"; auto ret_type_tir = f_llvm->getReturnType() == t_int_ ? DataType::Int(32) : DataType::Void(); - llvm::DIType* returnTy = GetDebugType(ret_type_tir, f_llvm->getReturnType()); + llvm::DIType* returnTy = + GetDebugType(GetTypeFromRuntimeDataType(ret_type_tir), f_llvm->getReturnType()); paramTys.push_back(returnTy); for (size_t i = 0; i < f_llvm->arg_size(); ++i) { paramTys.push_back( - GetDebugType(GetType(f_tir->args[i]), f_llvm->getFunctionType()->getParamType(i))); + GetDebugType(GetType(f_tir->params[i]), f_llvm->getFunctionType()->getParamType(i))); } auto* DIFunctionTy = dbg_info_->di_builder_->createSubroutineType( dbg_info_->di_builder_->getOrCreateTypeArray(paramTys)); @@ -240,7 +241,7 @@ void CodeGenCPU::AddDebugInformation(PrimFunc f_tir, llvm::Function* f_llvm) { std::string paramName = "arg" + std::to_string(i + 1); auto param = dbg_info_->di_builder_->createParameterVariable( DIFunction, paramName, i + 1, dbg_info_->file_, 0, - GetDebugType(GetType(f_tir->args[i]), f_llvm->getFunctionType()->getParamType(i)), + GetDebugType(GetType(f_tir->params[i]), f_llvm->getFunctionType()->getParamType(i)), /*alwaysPreserve=*/true); auto* store = builder.CreateStore(f_llvm->arg_begin() + i, paramAlloca); dbg_info_->di_builder_->insertDeclare(paramAlloca, param,
Add native/quantized to the list of header rewrites Summary: Pull Request resolved: same as title. I am not sure why this was not added in the first place. Test Plan: wait for build to succeed.
@@ -632,6 +632,7 @@ def preprocessor(output_directory, filepath, stats, hip_clang_launch): if ( f.startswith("ATen/cuda") or f.startswith("ATen/native/cuda") + or f.startswith("ATen/native/quantized/cuda") or f.startswith("ATen/native/sparse/cuda") or f.startswith("THC/") or f.startswith("THCUNN/")
fix nightly job Summary: not sure what's up with tag cleaning; it seems to be trying to delete the same ones every day. Also we deleted the script it's trying to run. Test Plan: ??? Reviewers: max, schrockn
@@ -32,6 +32,7 @@ jobs: - run: name: Clean Phabricator Tags command: | + git fetch -p -P origin git tag | grep phabricator | xargs git push -d origin - run: @@ -75,4 +76,4 @@ jobs: command: | python -m venv . source bin/activate - . dev_env_setup.sh + make install_dev_python_modules
changed the digits regex to optionally accept signed numbers; edited the unit test accordingly; documented the get_remove_digits function to properly describe the new way that we handle digits; updated the test suite accordingly, added two new files
@@ -462,11 +462,13 @@ def get_remove_punctuation_map( def get_remove_digits(text: str) -> str: - """Removes all digits. + """Removes signed / unsigned numbers, removes decimal / delimiter + separated numbers, does not remove currency symbols, will modify + some tokens where digits appear. :param text: A unicode string representing the whole text that is being manipulated. - :return: A string with all digits removed + :return: A string with all digits removed. """ # Using "." to represent any unicode character used to indicate @@ -474,7 +476,7 @@ def get_remove_digits(text: str) -> str: # unicode digits, this pattern will match: # 1) *** # 2) ***.*** - pattern = re.compile(r"(\d+)|((?<=\d)[\u0027|\u002C|\u002E|\u00B7" + pattern = re.compile(r"([+-]?\d+)|((?<=\d)[\u0027|\u002C|\u002E|\u00B7" r"|\u02D9|\u066B|\u066C|\u2396]\d+)", re.UNICODE) remove_digits = str(re.sub(pattern, r"", text))
Fix katex math rendering Summary: I'm 80% sure that this fixes the math bug. But I can't repro locally so I don't know. Pull Request resolved:
@@ -55,22 +55,6 @@ extensions = [ 'sphinxcontrib.katex', ] -# katex (mathjax replacement) macros -# -# - -katex_macros = r''' -"\\op": "\\operatorname{{#1}}", -"\\i": "\\mathrm{i}", -"\\e": "\\mathrm{e}^{#1}", -"\\w": "\\omega", -"\\vec": "\\mathbf{#1}", -"\\x": "\\vec{x}", -"\\d": "\\operatorname{d}\\!{}", -"\\dirac": "\\operatorname{\\delta}\\left(#1\\right)", -"\\scalarprod": "\\left\\langle#1,#2\\right\\rangle", -''' - # katex options # # @@ -78,10 +62,9 @@ katex_macros = r''' katex_options = r''' delimiters : [ {left: "$$", right: "$$", display: true}, - {left: "\\(", right: "\\)", display: true}, + {left: "\\(", right: "\\)", display: false}, {left: "\\[", right: "\\]", display: true} -], -strict : false +] ''' napoleon_use_ivar = True
TST: Test real ?gtsvx with NAG f07cbf example This commit adds a test for ?gtsvx using the example provided by NAG. The example solves a system of equations of the form AX=B where A is a tridiagonal matrix. See
@@ -922,6 +922,36 @@ class TestHetrd(object): ) +class TestGtsvx: + + @pytest.mark.parametrize('dtype', REAL_DTYPES) + def test_nag_f07cbf(self, dtype): + """Find the solution that satisfies the set of equations Ax=b. + + For the full reference see: + https://www.nag.com/numeric/fl/nagdoc_latest/examples/source/f07cbf.html + + """ + du = np.array([2.1, -1.0, 1.9, 8.0], dtype=dtype) + d = np.array([3.0, 2.3, -5.0, -0.9, 7.1], dtype=dtype) + dl = np.array([3.4, 3.6, 7.0, -6.0], dtype=dtype) + b = np.array([[2.7, 6.6], + [-0.5, 10.8], + [2.6, -3.2], + [0.6, -11.2], + [2.7, 19.1]]) + + gtsvx = get_lapack_funcs('gtsvx', dtype=dtype) + dlf,df,duf,du2,ipiv,x,rcond,ferr,berr,info = gtsvx(dl,d,du,b,fact='N') + + assert_equal(info, 0) + assert_allclose(x, [[-4.0000, 5.0000], + [ 7.0000, -4.0000], + [ 3.0000, -3.0000], + [-4.0000, -2.0000], + [-3.0000, 1.0000]], + atol=1e-5) + def test_gglse(): # Example data taken from NAG manual for ind, dtype in enumerate(DTYPES):
Write more often on district migration To avoid tasks OOMing
@@ -211,14 +211,11 @@ class AdminCreateDistrictsDo(LoggedInHandler): year = int(year) year_dcmps = DistrictListQuery(year).fetch() districts_to_write = [] - events_to_write = [] - districtteams_to_write = [] + for dcmp in year_dcmps: district_abbrev = DistrictType.type_abbrevs[dcmp.event_district_enum] district_key = District.renderKeyName(year, district_abbrev) logging.info("Creating {}".format(district_key)) - district_events_future = DistrictEventsQuery(district_key).fetch_async() - districtteams_future = DistrictTeam.query(DistrictTeam.year == year, DistrictTeam.district == DistrictType.abbrevs.get(district_abbrev, None)).fetch_async() district = District( id=district_key, @@ -229,22 +226,29 @@ class AdminCreateDistrictsDo(LoggedInHandler): ) districts_to_write.append(district) + logging.info("Writing {} new districts".format(len(districts_to_write))) + DistrictManipulator.createOrUpdate(districts_to_write, run_post_update_hook=False) + + for dcmp in year_dcmps: + district_abbrev = DistrictType.type_abbrevs[dcmp.event_district_enum] + district_key = District.renderKeyName(year, district_abbrev) + district_events_future = DistrictEventsQuery(district_key).fetch_async() + districtteams_future = DistrictTeam.query(DistrictTeam.year == year, DistrictTeam.district == DistrictType.abbrevs.get(district_abbrev, None)).fetch_async() + district_events = district_events_future.get_result() logging.info("Found {} events to update".format(len(district_events))) + events_to_write = [] + districtteams_to_write = [] for event in district_events: event.district_key = district.key events_to_write.append(event) + EventManipulator.createOrUpdate(events_to_write) districtteams = districtteams_future.get_result() logging.info("Found {} DistrictTeams to update".format(len(districtteams))) for districtteam in districtteams: districtteam.district_key = district.key districtteams_to_write.append(districtteam) - - logging.info("Writing {} new districts".format(len(districts_to_write))) - DistrictManipulator.createOrUpdate(districts_to_write, run_post_update_hook=False) - - EventManipulator.createOrUpdate(events_to_write) DistrictTeamManipulator.createOrUpdate(districtteams_to_write)
public_export.py: Reorder the creation of the RealmAuditLog object. This reordering was originally made with regard to the delete after access feature for the public export. However, this reordering is more correct overall, i.e., the object should be created before the event pertaining to the object is sent.
@@ -25,15 +25,14 @@ def public_only_realm_export(request: HttpRequest, user: UserProfile) -> HttpRes if len(limit_check) >= time_delta_limit: return json_error(_('Exceeded rate limit.')) - # Using the deferred_work queue processor to avoid killing the process after 60s + RealmAuditLog.objects.create(realm=realm, + event_type=event_type, + event_time=event_time) + # Using the deferred_work queue processor to avoid + # killing the process after 60s. event = {'type': event_type, 'time': event_time, 'realm_id': realm.id, 'user_profile_id': user.id} queue_json_publish('deferred_work', event) - - RealmAuditLog.objects.create(realm=realm, - event_type=event_type, - event_time=event_time) - return json_success()
[core/theme] Fix loading of iconsets * First, make iconsets override anything already present in the "base" configuration * Second, make sure that CLI provided iconsets have higher priority than "built-in" ones see
@@ -51,10 +51,11 @@ class Theme(object): self.__keywords = {} self.__value_idx = {} self.__data = raw_data if raw_data else self.load(name) + + for icons in self.__data.get("icons", []): + self.__data = util.algorithm.merge(self.load(icons, "icons"), self.__data) if iconset != "auto": self.__data = util.algorithm.merge(self.load(iconset, "icons"), self.__data) - for icons in self.__data.get("icons", []): - util.algorithm.merge(self.__data, self.load(icons, "icons")) for colors in self.__data.get("colors", []): util.algorithm.merge(self.__keywords, self.load_keywords(colors))
Fix Travis builds on default Trusty infrastructure After the Travis container infrastructure was deprecated, boto imports stopped working in our tests; this hack fixes that issue.
@@ -7,3 +7,8 @@ install: - make develop extras=[aws,google] # adding extras to avoid import errors script: - TOIL_TEST_QUICK=True make test_offline +env: + # Necessary to get boto to work in Travis's Ubuntu Precise + # environment (see #2498). Consider removing this if/when we + # transition to the Xenial environment. + - BOTO_CONFIG=/dev/null
bugfix: show each license in a new line styling: almost the same as the deposit form
<dt><b>{{ _('Licenses')}}</b></dt> <dd> {%- for right in rights%} - <a href="{{ right.link }}" target="_blank">{{ right.title }}</a> + <div class="content"> + <div class="header"> + {{ right.title }} + </div> + <div class="description"> + {% if right.description %} + <span style="color: rgba(0,0,0,.7);"> + {{ right.description }} + </span> + {% endif %} + {% if right.link %} + <span style="display: inline;"> + <a href="{{ right.link }}" target="_blank">Read more</a> + </span> + {% endif %} + </div> + </div> + <br> {% endfor %} </dd> {% endif %}
Add installation instructions for openSUSE magic-wormhole has been included in openSUSE since Leap 15.1, and is also available in Tumbleweed: So document this explicitly.
@@ -70,6 +70,12 @@ $ sudo apt install magic-wormhole $ sudo dnf install magic-wormhole ``` +### Linux (openSUSE) + +``` +$ sudo zypper install python-magic-wormhole +``` + ### Linux (Snap package) Many linux distributions (including Ubuntu) can install ["Snap"
Fix openssl 1.0.2x shared library permissions Invoking `conan install` a second time fails if the openssl/1.0.2x package is used. Shared libraries are built with permissions that prevent the owner from writing, a behaviour that produces the mentioned install failure. Update the openssl recipe to fix shared library permissions on Unix-like systems. Closes
@@ -555,6 +555,12 @@ class OpenSSLConan(ConanFile): with tools.chdir(os.path.join(self.package_folder, 'lib')): os.rename('libssl.lib', 'libssld.lib') os.rename('libcrypto.lib', 'libcryptod.lib') + # Old OpenSSL version family has issues with permissions. + # See https://github.com/conan-io/conan/issues/5831 + if self._full_version < "1.1.0" and self.options.shared and self.settings.os in ("Android", "FreeBSD", "Linux"): + with tools.chdir(os.path.join(self.package_folder, "lib")): + os.chmod("libssl.so.1.0.0", 0o755) + os.chmod("libcrypto.so.1.0.0", 0o755) tools.rmdir(os.path.join(self.package_folder, "lib", "pkgconfig")) def package_info(self):
Update Bno055.py Cleanup for automated test
-arduino = Runtime.createAndStart("arduino","Arduino") -arduino.connect("COM11") - -bno = Runtime.createAndStart("bno","Bno055") +# config +port = "COM11" +# Code to be able to use this script with virtalArduino +if ('virtual' in globals() and virtual): + virtualArduino = Runtime.start("virtualArduino", "VirtualArduino") + virtualArduino.connect(port) +arduino = Runtime.start("arduino","Arduino") +arduino.connect(port) + +bno = Runtime.start("bno","Bno055") # From version 1.0.2316 use attach instead of setController # bno.setController(arduino)
Add n_link argument This argument will be used if we want to link the bearing to a different node (instead of ground).
@@ -127,7 +127,11 @@ class BearingElement(Element): Array with the speeds (rad/s). tag: str, optional A tag to name the element - Default is None + Default is None. + n_link: int, optional + Node to which the bearing will connect. If None the bearing is + connected to ground. + Default is None. Examples -------- >>> # A bearing element located in the first rotor node, with these @@ -147,7 +151,19 @@ class BearingElement(Element): """ def __init__( - self, n, kxx, cxx, kyy=None, kxy=0, kyx=0, cyy=None, cxy=0, cyx=0, w=None, tag=None + self, + n, + kxx, + cxx, + kyy=None, + kxy=0, + kyx=0, + cyy=None, + cxy=0, + cyx=0, + w=None, + tag=None, + n_link=None, ): args = ["kxx", "kyy", "kxy", "kyx", "cxx", "cyy", "cxy", "cyx"]
[docs] Fix bug in old LaTeX package expdlist This bug is an extra space in the hacked \@item. It shows up with Sphinx 1.0's \pysiglinewithargsret and gives overfull hboxes which do not modify PDF output but they slow down compilation and fill the LaTeX log with dozens (at least) of overfull hboxes warnings.
@@ -187,6 +187,11 @@ latex_elements = { \renewenvironment{description}% {\begin{latexdescription}[\setleftmargin{60pt}\breaklabel\setlabelstyle{\bfseries\itshape}]}% {\end{latexdescription}} +% Fix bug in expdlist's modified \@item +\usepackage{etoolbox} +\makeatletter +\patchcmd\@item{{\@breaklabel} }{{\@breaklabel}}{}{} +\makeatother % Make Examples/etc section headers smaller and more compact \titlespacing*{\paragraph}{0pt}{1ex}{0pt}
Fix oslo.vmware change that added new keyword argument Next release of oslo.vmware adds a new keyword argument that this mock needs to consume (otherwise this whole test breaks).
@@ -45,7 +45,8 @@ class VsphereOperationsTest(base.BaseTestCase): vm_object.propSet[0].val = vm_instance return vm_object - def retrieve_props_side_effect(pc, specSet, options): + def retrieve_props_side_effect(pc, specSet, + options, skip_op_id=False): # assert inputs self.assertEqual(self._vsphere_ops._max_objects, options.maxObjects)
Switch __nonzero__ to __bool__. This is the right magic method to use for Python 3.
@@ -123,6 +123,10 @@ class DataFrameCollection: """Returns number of tables that are stored in this DataFrameCollection.""" return len(self._table_ids) + def __bool__(self): + """Returns true if this collection contains something.""" + return bool(self._table_ids) + def items(self) -> Iterator[Tuple[str, pd.DataFrame]]: """Iterates over table names and the corresponding pd.DataFrame objects.""" for name in self.get_table_names():
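A tiny sketch of the difference being fixed: Python 3 only consults `__bool__` (falling back to `__len__`), while `__nonzero__` is a Python 2-only name and is silently ignored:

```python
class Tables:
    def __init__(self, ids):
        self._ids = ids

    def __bool__(self):            # used by `if tables:` on Python 3
        return bool(self._ids)

assert not Tables([])
assert Tables(["foo"])
```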
Trivial: use default value in next() func This patch replaces the StopIteration exception handler for the next() function by setting a default value in the next() argument.
@@ -693,14 +693,13 @@ class HostManager(object): timeout = context_module.CELL_TIMEOUT nodes_by_cell = context_module.scatter_gather_cells( ctxt, cells, timeout, target_fnc) - try: - # Only one cell should have a value for the compute nodes - # so we get it here + + # Only one cell should have values for the compute nodes + # so we get them here, or return an empty list if no cell + # has a value nodes = next( - nodes for nodes in nodes_by_cell.values() if nodes) - except StopIteration: - # ...or we find no node if none of the cells has a value - nodes = objects.ComputeNodeList() + (nodes for nodes in nodes_by_cell.values() if nodes), + objects.ComputeNodeList()) return nodes
Update QuantumCircuit.barrier docstrings - Issue8076 Fix * Update QuantumCircuit.barrier docstrings * Update docstring formatting * Update qiskit/circuit/barrier.py Committed string suggestion * Ran black ($tox -eblack) to update formatting
@@ -21,7 +21,12 @@ from .instruction import Instruction class Barrier(Instruction): - """Barrier instruction.""" + """Barrier instruction. + + A barrier is a visual indicator of the grouping of a circuit section. + It also acts as a directive for circuit compilation to separate pieces + of a circuit so that any optimizations or re-writes are constrained + to only act between barriers.""" _directive = True
search: fix rdm search bar 'executeSearch'. Fixed an issue where 'executeSearch' is undefined.
@@ -141,18 +141,15 @@ export const RDMRecordSearchBarElement = withState( placeholder: passedPlaceholder, queryString, onInputChange, - executeSearch, updateQueryState, }) => { const placeholder = passedPlaceholder || i18next.t("Search"); const onBtnSearchClick = () => { - updateQueryState({ filters: [] }); - executeSearch(); + updateQueryState({ filters: [], queryString }); }; const onKeyPress = (event) => { if (event.key === "Enter") { - updateQueryState({ filters: [] }); - executeSearch(); + updateQueryState({ filters: [], queryString }); } }; return (
Fixup hover popup behavior We used show_popup two times, but it's a smoother experience to use show_popup and then update_popup. There's a bug in ST where it thinks the view is modified when doing a show_popup or an add_regions call with an edit token. Work around this by running sublime.set_timeout. See:
@@ -109,11 +109,11 @@ class LspHoverCommand(LspTextCommand): def handle_code_actions(self, responses: Dict[str, List[CodeActionOrCommand]], point: int) -> None: self._actions_by_config = responses - sublime.set_timeout(lambda: self.show_hover(point)) + self.show_hover(point) def handle_response(self, response: Optional[Any], point: int) -> None: self._hover = response - sublime.set_timeout(lambda: self.show_hover(point)) + self.show_hover(point) def symbol_actions_content(self) -> str: actions = [] @@ -175,6 +175,9 @@ class LspHoverCommand(LspTextCommand): return minihtml(self.view, content, allowed_formats=FORMAT_MARKED_STRING | FORMAT_MARKUP_CONTENT) def show_hover(self, point: int) -> None: + sublime.set_timeout(lambda: self._show_hover(point)) + + def _show_hover(self, point: int) -> None: contents = self.diagnostics_content() + self.hover_content() if contents and settings.show_symbol_action_links: contents += self.symbol_actions_content() @@ -183,6 +186,14 @@ class LspHoverCommand(LspTextCommand): _test_contents.append(contents) # for testing only if contents: + if self.view.is_popup_visible(): + mdpopups.update_popup( + self.view, + contents, + css=popups.stylesheet, + md=False, + wrapper_class=popups.classname) + else: mdpopups.show_popup( self.view, contents,
Deleted check when SFR is not completely fulfilled This is currently blocking when diversified SFRs cannot complete due to a lack of a certain type of instance. If we are scaling up, the target capacity may be overwritten with the same value. If we are scaling down, there is no point checking this.
@@ -572,12 +572,6 @@ class SpotAutoscaler(ClusterAutoscaler): self.resource['id'], )) raise ClusterAutoscalingError - if self.is_aws_launching_instances() and self.sfr['SpotFleetRequestState'] == 'active': - self.log.warning( - "AWS hasn't reached the TargetCapacity that is currently set. We won't make any " - "changes this time as we should wait for AWS to launch more instances first.", - ) - return 0, 0 current, target = self.get_spot_fleet_delta(error) if self.sfr['SpotFleetRequestState'] == 'cancelled_running': self.resource['min_capacity'] = 0
Fix for `plot_field` function failing on non-square grids Fix for issue
@@ -16,7 +16,7 @@ def plot_field(field, xmax=2., ymax=2., zmax=None, view=None, linewidth=0): y_coord = np.linspace(0, ymax, field.shape[1]) fig = pyplot.figure(figsize=(11, 7), dpi=100) ax = fig.gca(projection='3d') - X, Y = np.meshgrid(x_coord, y_coord) + X, Y = np.meshgrid(x_coord, y_coord, indexing='ij') ax.plot_surface(X, Y, field[:], cmap=cm.viridis, rstride=1, cstride=1, linewidth=linewidth, antialiased=False)
Changelog for 0.5.9 Test Plan: N/A Reviewers: #ft, alangenfeld
# Changelog +## 0.5.9 +- Fixes an issue using custom types for fan-in dependencies with intermediate storage. + +## 0.5.8 +- Fixes an issue running some Dagstermill notebooks on Windows. +- Fixes a transitive dependency issue with Airflow. +- Bugfixes, performance improvements, and better documentation. + ## 0.5.7 - Fixed an issue with specifying composite output mappings (#1674) - Added support for specifying
Update apt_adwind.txt Removing dup.
@@ -744,13 +744,16 @@ tradcan.duckdns.org 185.165.153.150:4145 # Reference: https://pastebin.com/29uSdMAk +# Reference: https://app.any.run/tasks/6272b39e-7fea-4134-819e-6d3b6b5a0d2b +# Reference: https://www.virustotal.com/gui/file/7a01202131c133a5f78134f264383e827a68164a05e5927da485527da00f8b32/detection 0000rrrvvv.duckdns.org addahost.ddns.net lexd.duckdns.org respainc.duckdns.org -# Reference: https://app.any.run/tasks/6272b39e-7fea-4134-819e-6d3b6b5a0d2b -# Reference: https://www.virustotal.com/gui/file/7a01202131c133a5f78134f264383e827a68164a05e5927da485527da00f8b32/detection +# Reference: https://twitter.com/wwp96/status/1192098993158918145 +# Reference: https://app.any.run/tasks/4c70e0e0-ce08-4bd8-ae00-77791545807f/ -0000rrrvvv.duckdns.org \ No newline at end of file +95.213.195.71:3999 +mamased.duckdns.org
docs/Running: Section-ise the setup.py install section * Moves the "you need to edit these files in the monitor" into a section related to this, rather than up in #concepts.
@@ -306,16 +306,8 @@ from the Gateway to the Relay and Monitor. As the code currently (2021-05-16) stands it MUST run on a standalone host such that everything is served relative to the path root, not a path prefix. -Also all of the `contrib/monitor` files have `eddn.edcd.io` hard-coded. You -will need to perform search and replace on the installed/live files to use a -test host. The files in question are: - - monitor/js/eddn.js - monitor/schemas.html - -Replace the string `eddn.edcd.io` with the hostname you're using. You'll need -to perform similar substitutions if you change the configuration to use any -different port numbers. +See also the [post-installation notes](#post-installation-steps) for some +caveats about running this other than on the actual eddn.edcd.io host. --- @@ -403,15 +395,18 @@ It sets: # Running You have some choices for how to run the application components: -1. If you are just testing out code changes then you can choose to run +## Running scripts from source +If you are just testing out code changes then you can choose to run this application directly from the source using the provided script in - `contrib/run-from-source.sh`. +`contrib/run-from-source.sh`. This assumes the `dev` environment. -1. Otherwise you will want to utilise the `setup.py` file to build and +## Running from installation +Otherwise you will want to utilise the `setup.py` file to build and install the application files. You'll need to do some setup first as there are necessary files *not* checked into git, because they're per environment: +### Performing the installation 1. Change directory to the top level of the git clone. 1. Create a file `setup_env.py` with contents: @@ -452,6 +447,21 @@ You have some choices for how to run the application components: with an example config override file if you didn't already have a `config.json` here. +### Post-installation steps +If you're not using the `live` environment then there are some edits you +need to make. + +All of the `contrib/monitor` files have the hostname `eddn.edcd.io` + hard-coded. You will need to perform search and replace on the + installed/live files to use a test host. The files in question are: + + monitor/js/eddn.js + monitor/schemas.html + +Replace the string `eddn.edcd.io` with the hostname you're using. +You'll need to perform similar substitutions if you change the +configuration to use any different port numbers. + --- # Accessing the Monitor
Fixes to reconnect to the phone and specify a serial number * Add ability to close and reconnect to phone. * Add specifying an Android phone's serial number. This allows supporting plugs for multiple Android phones in a single test with subclasses: conf.declare('adb_serial_number1', default=None, description='SN for my Android') class AdbPlug1(usb.AdbPlug): serial_number = conf.adb_serial_number1
@@ -60,7 +60,7 @@ def init_dependent_flags(): parser.parse_known_args() -def _open_usb_handle(**kwargs): +def _open_usb_handle(serial_number=None, **kwargs): """Open a UsbHandle subclass, based on configuration. If configuration 'remote_usb' is set, use it to connect to remote usb, @@ -74,13 +74,13 @@ def _open_usb_handle(**kwargs): plug_port: 5 Args: + serial_number: Optional serial number to connect to. **kwargs: Arguments to pass to respective handle's Open() method. Returns: Instance of UsbHandle. """ init_dependent_flags() - serial = None remote_usb = conf.remote_usb if remote_usb: if remote_usb.strip() == 'ethersync': @@ -92,9 +92,9 @@ def _open_usb_handle(**kwargs): raise ValueError('Ethersync needs mac_addr and plug_port to be set') else: ethersync = cambrionix.EtherSync(mac_addr) - serial = ethersync.get_usb_serial(port) + serial_number = ethersync.get_usb_serial(port) - return local_usb.LibUsbHandle.open(serial_number=serial, **kwargs) + return local_usb.LibUsbHandle.open(serial_number=serial_number, **kwargs) class FastbootPlug(plugs.BasePlug): @@ -118,21 +118,41 @@ class FastbootPlug(plugs.BasePlug): class AdbPlug(plugs.BasePlug): """Plug that provides ADB.""" + serial_number = None + def __init__(self): - kwargs = {} if conf.libusb_rsa_key: - kwargs['rsa_keys'] = [adb_device.M2CryptoSigner(conf.libusb_rsa_key)] + self._rsa_keys = [adb_device.M2CryptoSigner(conf.libusb_rsa_key)] + else: + self._rsa_keys = None + self._device = None + self.connect() + + def tearDown(self): + if self._device: + self._device.close() + + def connect(self): + if self._device: + try: + self._device.close() + except (usb_exceptions.UsbWriteFailedError, + usb_exceptions.UsbReadFailedError): + pass + self._device = None + + kwargs = {} + if self._rsa_keys: + kwargs['rsa_keys'] = self._rsa_keys self._device = adb_device.AdbDevice.connect( _open_usb_handle( interface_class=adb_device.CLASS, interface_subclass=adb_device.SUBCLASS, - interface_protocol=adb_device.PROTOCOL), + interface_protocol=adb_device.PROTOCOL, + serial_number=self.serial_number), **kwargs) - def tearDown(self): - self._device.close() - def __getattr__(self, attr): """Forward other attributes to the device.""" return getattr(self._device, attr)
switch2container: remove deb systemd units When running the switch2container playbook on a Debian based system, the systemd unit path isn't the same as on a Red Hat based system. Because the systemd unit files aren't removed, the new container systemd unit isn't taken into account.
- name: remove old systemd unit files file: - path: /usr/lib/systemd/system/{{ item }} + path: "{{ item }}" state: absent with_items: - - [email protected] - - ceph-mon.target + - /usr/lib/systemd/system/[email protected] + - /usr/lib/systemd/system/ceph-mon.target + - /lib/systemd/system/[email protected] + - /lib/systemd/system/ceph-mon.target - import_role: name: ceph-defaults - name: remove old systemd unit files file: - path: /usr/lib/systemd/system/{{ item }} + path: "{{ item }}" state: absent with_items: - - [email protected] - - ceph-mgr.target + - /usr/lib/systemd/system/[email protected] + - /usr/lib/systemd/system/ceph-mgr.target + - /lib/systemd/system/[email protected] + - /lib/systemd/system/ceph-mgr.target - import_role: name: ceph-defaults - /usr/lib/systemd/system/ceph-osd.target - /usr/lib/systemd/system/[email protected] - /usr/lib/systemd/system/[email protected] + - /lib/systemd/system/ceph-osd.target + - /lib/systemd/system/[email protected] + - /lib/systemd/system/[email protected] - import_role: name: ceph-facts - name: remove old systemd unit files file: - path: /usr/lib/systemd/system/{{ item }} + path: "{{ item }}" state: absent with_items: - - [email protected] - - ceph-mds.target + - /usr/lib/systemd/system/[email protected] + - /usr/lib/systemd/system/ceph-mds.target + - /lib/systemd/system/[email protected] + - /lib/systemd/system/ceph-mds.target - import_role: name: ceph-defaults - name: remove old systemd unit files file: - path: /usr/lib/systemd/system/{{ item }} + path: "{{ item }}" state: absent with_items: - - [email protected] - - ceph-radosgw.target + - /usr/lib/systemd/system/[email protected] + - /usr/lib/systemd/system/ceph-radosgw.target + - /lib/systemd/system/[email protected] + - /lib/systemd/system/ceph-radosgw.target - import_role: name: ceph-handler - name: remove old systemd unit files file: - path: /usr/lib/systemd/system/{{ item }} + path: "{{ item }}" state: absent with_items: - - [email protected] - - ceph-rbd-mirror.target + - /usr/lib/systemd/system/[email protected] + - /usr/lib/systemd/system/ceph-rbd-mirror.target + - /lib/systemd/system/[email protected] + - /lib/systemd/system/ceph-rbd-mirror.target - import_role: name: ceph-defaults
fixup: ensure default blinder source 'INTERMISSION' shows an SMPTE signal fixed documentation
@@ -99,7 +99,7 @@ Without any further configuration this will produce two test sources named `cam1 Without any further configuration a source becomes a **test source** by default. Every test source will add a [videotestsrc](https://gstreamer.freedesktop.org/documentation/videotestsrc/index.html?gi-language=python) and an [audiotestsrc](https://gstreamer.freedesktop.org/documentation/audiotestsrc/index.html?gi-language=python) element to the internal GStreamer pipeline and so it produces a test video and sound. -As in the order they appear in `mix/sources` the test patterns of all test sources will iterate through the [GStreamer test pattern values](https://gstreamer.freedesktop.org/documentation/videotestsrc/index.html?gi-language=python#members-2). +As in the order they appear in `mix/sources` the test patterns of all test sources will iterate through the [GStreamer test pattern values](https://gstreamer.freedesktop.org/documentation/videotestsrc/index.html?gi-language=python#members-2) beginning with the value `snow` (followed by `black`, `white`, `red`, `green`, `blue`, ...). To set the pattern of a test source explicitly you need to add an own section `source.x` (where `x` is the source's identifier) to the configuration
Drop use of deprecated allow_tags attribute. mozilla/sumo-project#136
@@ -5,6 +5,7 @@ from django import forms from django.contrib import admin from django.urls import reverse from django.db import models +from django.utils.safestring import mark_safe from kitsune.kbadge.models import Badge, Award @@ -20,13 +21,12 @@ def show_image(obj): if not obj.image: return "None" img_url = obj.image.url - return '<a href="%s" target="_new"><img src="%s" width="48" height="48" /></a>' % ( + return mark_safe('<a href="%s" target="_new"><img src="%s" width="48" height="48" /></a>' % ( img_url, img_url, - ) + )) -show_image.allow_tags = True show_image.short_description = "Image" @@ -50,10 +50,9 @@ def build_related_link(self, model_name, name_single, name_plural, qs): def related_awards_link(self): - return build_related_link(self, "award", "award", "awards", self.award_set) + return mark_safe(build_related_link(self, "award", "award", "awards", self.award_set)) -related_awards_link.allow_tags = True related_awards_link.short_description = "Awards" @@ -92,10 +91,9 @@ class BadgeAdmin(admin.ModelAdmin): def badge_link(self): url = reverse("admin:kbadge_badge_change", args=[self.badge.id]) - return '<a href="%s">%s</a>' % (url, self.badge) + return mark_safe('<a href="%s">%s</a>' % (url, self.badge)) -badge_link.allow_tags = True badge_link.short_description = "Badge" @@ -123,10 +121,9 @@ class AwardAdmin(admin.ModelAdmin): def award_link(self): url = reverse("admin:kbadge_award_change", args=[self.award.id]) - return '<a href="%s">%s</a>' % (url, self.award) + return mark_safe('<a href="%s">%s</a>' % (url, self.award)) -award_link.allow_tags = True award_link.short_description = "award"
[IMPR] catch ServerError as a whole in reflinks.py All server errors derive from ServerError which can be used to catch them all.
@@ -64,11 +64,7 @@ from pywikibot import comms, config, i18n, pagegenerators, textlib from pywikibot.backports import removeprefix from pywikibot.bot import ConfigParserBot, ExistingPageBot, SingleSiteBot from pywikibot.comms.http import get_charset_from_content_type -from pywikibot.exceptions import ( - FatalServerError, - Server414Error, - Server504Error, -) +from pywikibot.exceptions import ServerError from pywikibot.pagegenerators import ( XMLDumpPageGenerator as _XMLDumpPageGenerator, ) @@ -638,9 +634,7 @@ class ReferencesRobot(SingleSiteBot, ConfigParserBot, ExistingPageBot): except (ValueError, # urllib3.LocationParseError derives from it OSError, httplib.error, - FatalServerError, - Server414Error, - Server504Error) as e: + ServerError) as e: pywikibot.output( "{err.__class__.__name__}: Can't retrieve url {url}: {err}" .format(url=ref.url, err=e))
Remove required-by for tensorboard tests Not sure why this isn't showing up (recent refactored use of setup tools?) but it's non-critical and not worth chasing.
@@ -15,7 +15,7 @@ installed `pypi.tensorboard` package. license: Apache 2.0 location: ... requires: ... - required-by:...guildai... + required-by:... <exit 0> >>> run("guild packages info tensorboard") # doctest: -PY3 @@ -28,5 +28,5 @@ installed `pypi.tensorboard` package. license: Apache 2.0 location: ... requires: ... - required-by:...guildai... + required-by:... <exit 0>
Update distro.linux_distribution & mariadb_ver [10.8] Fix: DeprecationWarning: distro.linux_distribution() is deprecated Deprecated since version 3.5, removed in version 3.7. Update: OVH MariaDB mirror has been updated and support Ubuntu 22.04 [jammy]
"""WordOps core variable module""" import configparser import os +import sys from datetime import datetime from re import match from socket import getfqdn from shutil import copy2 -from distro import linux_distribution +from distro import distro, linux_distribution from sh import git @@ -30,6 +31,7 @@ class WOVar(): # WordOps core variables # linux distribution + if sys.version_info <= (3, 7): wo_distro = linux_distribution( full_distribution_name=False)[0].lower() wo_platform_version = linux_distribution( @@ -37,6 +39,11 @@ class WOVar(): # distro codename (bionic, xenial, stretch ...) wo_platform_codename = linux_distribution( full_distribution_name=False)[2].lower() + else: + wo_distro = distro.id() + wo_platform_version = distro.version() + # distro codename (bionic, xenial, stretch ...) + wo_platform_codename = distro.codename() # Get timezone of system if os.path.isfile('/etc/timezone'): @@ -170,6 +177,9 @@ class WOVar(): mariadb_ver = '10.1' else: mariadb_ver = '10.3' + else: + if wo_platform_codename == 'jammy': + mariadb_ver = '10.8' else: mariadb_ver = '10.5' wo_mysql = wo_mysql + ["mariadb-backup"] @@ -183,8 +193,9 @@ class WOVar(): # APT repositories wo_mysql_repo = ("deb [arch=amd64,arm64,ppc64el] " "http://mariadb.mirrors.ovh.net/MariaDB/repo/" - "10.5/{distro} {codename} main" - .format(distro=wo_distro, + "{version}/{distro} {codename} main" + .format(version=mariadb_ver, + distro=wo_distro, codename=wo_platform_codename)) if wo_distro == 'ubuntu': wo_php_repo = "ppa:ondrej/php"
Fixed broken pytest I'm not convinced this is the best way to do the mock, but my mock fu is weak. I'm going to check it in anyway, so I can run the test in ci.
@@ -6,11 +6,21 @@ from unittest import mock import cumulusci.robotframework.utils as robot_utils from cumulusci.utils import touch +mock_SeleniumLibrary = mock.Mock() + + +class MockBuiltIn: + get_library_instance = mock.Mock( + return_value={"SeleniumLibrary": mock_SeleniumLibrary} + ) + + +robot_utils.BuiltIn = MockBuiltIn + class TestRobotframeworkUtils: def setup_method(self): - robot_utils.BuiltIn = mock.Mock(name="BuiltIn") - self.mock_selib = robot_utils.BuiltIn().get_library_instance("SeleniumLibrary") + mock_SeleniumLibrary.reset_mock() def test_screenshot_decorator_fail(self): """Verify that the decorator will capture a screenshot on keyword failure""" @@ -23,7 +33,7 @@ class TestRobotframeworkUtils: example_function() except Exception: pass - self.mock_selib.failure_occurred.assert_called_once() + mock_SeleniumLibrary.capture_page_screenshot.assert_called_once() def test_screenshot_decorator_pass(self): """Verify that decorator does NOT capture screenshot on keyword success""" @@ -33,7 +43,8 @@ class TestRobotframeworkUtils: return True example_function() - self.mock_selib.failure_occurred.assert_not_called() + + mock_SeleniumLibrary.capture_page_screenshot.assert_not_called() class TestGetLocatorModule:
navbar: Fix search icon click event. This block was accidentally deleted in
@@ -69,6 +69,13 @@ function append_and_display_title_area(tab_bar_data) { } function bind_title_area_handlers() { + $(".search_closed").on("click", function (e) { + exports.open_search_bar_and_close_narrow_description(); + search.initiate_search(); + e.preventDefault(); + e.stopPropagation(); + }); + $("#tab_list span:nth-last-child(2)").on("click", function (e) { if (document.getSelection().type === "Range") { // Allow copy/paste to work normally without interference.
go: check whether the race detector is supported on the platform Check whether the race detector is supported on the current platform and disable it if it is not supported.
# Licensed under the Apache License, Version 2.0 (see LICENSE). from __future__ import annotations +import logging from dataclasses import dataclass from typing import Iterable @@ -12,8 +13,9 @@ from pants.backend.go.target_types import ( GoRaceDetectorEnabledField, GoTestRaceDetectorEnabledField, ) -from pants.backend.go.util_rules import go_mod +from pants.backend.go.util_rules import go_mod, goroot from pants.backend.go.util_rules.go_mod import OwningGoMod, OwningGoModRequest +from pants.backend.go.util_rules.goroot import GoRoot from pants.build_graph.address import Address from pants.engine.engine_aware import EngineAwareParameter from pants.engine.internals import graph @@ -21,6 +23,8 @@ from pants.engine.internals.selectors import Get from pants.engine.rules import collect_rules, rule from pants.engine.target import FieldSet, WrappedTarget, WrappedTargetRequest +logger = logging.getLogger(__name__) + @dataclass(frozen=True) class GoBuildOptions: @@ -63,9 +67,23 @@ def _first_non_none_value(items: Iterable[bool | None]) -> bool: return False +# Adapted from https://github.com/golang/go/blob/920f87adda5412a41036a862cf2139bed24aa533/src/internal/platform/supported.go#L7-L23. +def _race_detector_supported(goroot: GoRoot) -> bool: + """Returns True if the Go data race detector is supported for the `goroot`'s platform.""" + if goroot.goos == "linux": + return goroot.goarch in ("amd64", "ppc64le", "arm64", "s390x") + elif goroot.goos == "darwin": + return goroot.goarch in ("amd64", "arm64") + elif goroot.goos in ("freebsd", "netbsd", "openbsd", "windows"): + return goroot.goarch == "amd64" + else: + return False + + @rule async def go_extract_build_options_from_target( request: GoBuildOptionsFromTargetRequest, + goroot: GoRoot, golang: GolangSubsystem, go_test_subsystem: GoTestSubsystem, ) -> GoBuildOptions: @@ -119,6 +137,12 @@ async def go_extract_build_options_from_target( False, ] ) + if with_race_detector and not _race_detector_supported(goroot): + logger.warning( + f"The Go data race detector would have been enabled for target `{request.address}, " + f"but the race detector is not supported on platform {goroot.goos}/{goroot.goarch}." + ) + with_race_detector = False return GoBuildOptions( cgo_enabled=cgo_enabled, @@ -130,5 +154,6 @@ def rules(): return ( *collect_rules(), *go_mod.rules(), + *goroot.rules(), *graph.rules(), )
swarming: actually retry the pool finding function 4 attempts with 10s between = ~30s
@@ -43,7 +43,7 @@ def retry_exception(exc_type, max_attempts, delay): return deco -@retry_exception(ValueError, 1, 10) +@retry_exception(ValueError, 4, 10) def pick_best_pool(url, server_version): """Pick the best pool to run the health check task on.
fix: Stock Balance Report Shows Fatal Error -2 Old: skip_total_row = result['skip_total_row'] if 'skip_total_row' in result else '' result["add_total_row"] = report.add_total_row and not skip_total_row New: result["add_total_row"] = report.add_total_row and not result.get('skip_total_row')
@@ -185,8 +185,7 @@ def run(report_name, filters=None, user=None, ignore_prepared_report=False): else: result = generate_report_result(report, filters, user) - skip_total_row = result['skip_total_row'] if 'skip_total_row' in result else '' - result["add_total_row"] = report.add_total_row and not skip_total_row + result["add_total_row"] = report.add_total_row and not result.get('skip_total_row') return result
Proper warning level of deprecation notice This enables us to control emitted messages via the PYTHONWARNINGS environment variable or with the -W option.
@@ -876,7 +876,8 @@ class Response: def stream(self): # type: ignore warnings.warn( # pragma: nocover "Response.stream() is due to be deprecated. " - "Use Response.aiter_bytes() instead." + "Use Response.aiter_bytes() instead.", + DeprecationWarning, ) return self.aiter_bytes # pragma: nocover @@ -884,7 +885,8 @@ class Response: def raw(self): # type: ignore warnings.warn( # pragma: nocover "Response.raw() is due to be deprecated. " - "Use Response.aiter_raw() instead." + "Use Response.aiter_raw() instead.", + DeprecationWarning, ) return self.aiter_raw # pragma: nocover
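An illustrative sketch (caller-side, not part of the diff) of the control this warning category gives users:

```python
import warnings

# escalate deprecations to hard errors during a test run
warnings.filterwarnings("error", category=DeprecationWarning)

# the same effect from outside the process:
#   PYTHONWARNINGS="error::DeprecationWarning" python app.py
#   python -W error::DeprecationWarning app.py
```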
Insert a shallow copy of default_settings Prevents pymongo from adding an "_id" key to the default_values dict itself.
@@ -222,7 +222,7 @@ def get_settings(): db = api.db.get_conn() settings = db.settings.find_one({}, {"_id": 0}) if settings is None: - db.settings.insert(default_settings) + db.settings.insert(default_settings.copy()) return default_settings return settings
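A self-contained stand-in (fake_insert is hypothetical) for the side effect being avoided: as the commit message notes, PyMongo's insert adds an `_id` field to the exact dict it is handed, so inserting a shallow copy keeps the module-level defaults clean:

```python
# Stand-in for the mutating insert: it writes an "_id" key into the
# document object it receives, just like PyMongo does.
def fake_insert(doc):
    doc["_id"] = "generated-object-id"
    return doc["_id"]

default_settings = {"enable_feedback": True}
fake_insert(default_settings.copy())       # the copy absorbs the "_id"
assert "_id" not in default_settings       # the shared defaults stay clean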
Fixed dupe bug for comments, enhanced logging info Fixed the dupe bug by altering the find duplicates method
@@ -370,7 +370,8 @@ class GitHubWorker: # Increment our global track of the cntrb id for the possibility of it being used as a FK self.cntrb_id_inc += 1 - except: + except Exception as e: + logging.info("Caught exception: " + str(e)) logging.info("Contributor not defined. Please contact the manufacturers of Soylent Green " + url + " ...\n") logging.info("Cascading Contributor Anomalie from missing repo contributor data: " + url + " ...\n") else: @@ -416,9 +417,6 @@ class GitHubWorker: issues += j i += 1 - # To store GH's issue numbers that are used in other endpoints for events and comments - issue_numbers = [] - # Discover and remove duplicates before we start inserting need_insertion = self.filter_duplicates({'gh_issue_id': 'id'}, ['issues'], issues) logging.info("Count of issues needing insertion: " + str(len(need_insertion)) + "\n") @@ -685,14 +683,15 @@ class GitHubWorker: colSQL = s.sql.text(""" SELECT {} FROM {} """.format(col, table_str)) - + logging.info(str(colSQL) + "\n\n") values = pd.read_sql(colSQL, self.db, params={}) for obj in og_data: if values.isin([obj[cols[col]]]).any().any(): logging.info("value of tuple exists: " + str(obj[cols[col]]) + "\n") - else: + elif obj not in need_insertion: need_insertion.append(obj) + logging.info(str(len(og_data)) + str(len(need_insertion)) + "\n\n") return need_insertion def find_id_from_login(self, login): @@ -772,7 +771,7 @@ class GitHubWorker: def update_rate_limit(self, response): # self.rate_limit -= 1 # logging.info("OUR TRACK OF LIMIT: " + str(self.rate_limit) + " ACTUAL: " + str(response.headers['X-RateLimit-Remaining'])) - self.rate_limit = response.headers['X-RateLimit-Remaining'] + self.rate_limit = int(response.headers['X-RateLimit-Remaining']) logging.info("Updated rate limit, you have: " + str(self.rate_limit) + " requests remaining.\n") if self.rate_limit <= 0:
Hypothesis tests: add ability to enforce shape inference Summary: Pull Request resolved: Add parameter to enforce that outputs are inferred
@@ -510,7 +510,12 @@ class HypothesisTestCase(test_util.TestCase): np.testing.assert_allclose(indices, ref_indices, atol=1e-4, rtol=1e-4) - def _assertInferTensorChecks(self, name, shapes, types, output): + def _assertInferTensorChecks(self, name, shapes, types, output, + ensure_output_is_inferred=False): + self.assertTrue( + not ensure_output_is_inferred or (name in shapes), + 'Shape for {0} was not inferred'.format(name)) + if name not in shapes: # No inferred shape or type available return @@ -565,6 +570,7 @@ class HypothesisTestCase(test_util.TestCase): grad_reference=None, atol=None, outputs_to_check=None, + ensure_outputs_are_inferred=False, ): """ This runs the reference Python function implementation @@ -643,7 +649,8 @@ class HypothesisTestCase(test_util.TestCase): ) if test_shape_inference: self._assertInferTensorChecks( - output_blob_name, shapes, types, output) + output_blob_name, shapes, types, output, + ensure_output_is_inferred=ensure_outputs_are_inferred) outs.append(output) if grad_reference is not None: assert output_to_grad is not None, \
Make MD work again by disabling SSL verification Moldelectrica still hasn't replaced their expired cert after >2 weeks Closes
@@ -26,8 +26,8 @@ def get_data(session=None): s = session or requests.Session() #In order for the data url to return data, cookies from the display url must be obtained then reused. - response = s.get(display_url) - data_response = s.get(data_url) + response = s.get(display_url, verify=False) + data_response = s.get(data_url, verify=False) raw_data = data_response.text try: data = [float(i) for i in raw_data.split(',')]
fix typo in caesar_cipher.py very character-> every character
@@ -27,7 +27,7 @@ def encrypt(input_string: str, key: int, alphabet: str | None = None) -> str: ========================= The caesar cipher is named after Julius Caesar who used it when sending secret military messages to his troops. This is a simple substitution cipher - where very character in the plain-text is shifted by a certain number known + where every character in the plain-text is shifted by a certain number known as the "key" or "shift". Example:
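A minimal illustration of the shift described in that docstring (not the project's own implementation):

```python
alphabet = "abcdefghijklmnopqrstuvwxyz"
key = 3
plain = "attack"
cipher = "".join(alphabet[(alphabet.index(c) + key) % len(alphabet)] for c in plain)
print(cipher)  # dwwdfn
```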
load source files as binary If we load them as text on a system without a locale set up (so it defaults to C or POSIX), then no modules/states with unicode in the documentation can be loaded.
@@ -53,7 +53,7 @@ if six.PY3: for suffix in importlib.machinery.BYTECODE_SUFFIXES: SUFFIXES.append((suffix, 'rb', 2)) for suffix in importlib.machinery.SOURCE_SUFFIXES: - SUFFIXES.append((suffix, 'r', 1)) + SUFFIXES.append((suffix, 'rb', 1)) # pylint: enable=no-member,no-name-in-module,import-error else: SUFFIXES = imp.get_suffixes()
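A self-contained sketch of the failure mode the commit avoids: under the C/POSIX locale a text-mode read decodes with ASCII, so any non-ASCII docstring raises, while a binary read just returns bytes:

```python
source = "def helper():\n    '''Grüße from the docs'''\n".encode("utf-8")

try:
    source.decode("ascii")          # roughly what a text-mode read does when LANG=C
except UnicodeDecodeError as exc:
    print("text mode would fail:", exc)

print("binary mode returns", len(source), "bytes untouched")
```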
Python API: no AnalysisUnit.get_from_provider without default provider This API is not supposed to be available when the language spec does not defines a default unit provider. TN:
@@ -544,6 +544,7 @@ class AnalysisContext: GrammarRule._unwrap(rule)) return AnalysisUnit._wrap(c_value) +% if ctx.default_unit_provider: def get_from_provider(self, name, kind, charset=None, reparse=False): ${py_doc('langkit.get_unit_from_provider', 8)} if isinstance(name, bytes): @@ -561,6 +562,7 @@ class AnalysisContext: raise InvalidUnitNameError('Invalid unit name: {} ({})'.format( repr(name), kind )) +% endif def discard_errors_in_populate_lexical_env(self, discard): ${py_doc('langkit.context_discard_errors_in_populate_lexical_env', 8)}
Hebbian learning fix * Hebbian learning fix: simply changed it from an error to a warning. * Removed no-targets error: removed the error raised when the system is not given targets for learning, because Hebbian learning doesn't need targets
@@ -1861,9 +1861,13 @@ class System(System_Base): def _instantiate_target_inputs(self, context=None): if self.learning and self.targets is None: + # MODIFIED CW and KM 1/29/18: changed below from error to warning if not self.target_mechanisms: - raise SystemError("PROGRAM ERROR: Learning has been specified for {} but it has no target_mechanisms". + if self.verbosePref: + warnings.warn("WARNING: Learning has been specified for {} but it has no target_mechanisms. This " + "is okay if the learning (e.g. Hebbian learning) does not need a target.". format(self.name)) + return # # MODIFIED 6/25/17 OLD: # raise SystemError("Learning has been specified for {} so its \'targets\' argument must also be specified". # format(self.name)) @@ -2673,8 +2677,11 @@ class System(System_Base): if isinstance(self.targets, function_type): self.current_targets = self.targets() - if self.current_targets is None: - raise SystemError("No targets were specified in the call to execute {} with learning".format(self.name)) + # MODIFIED CW 1/29/18: removed this because some learning (e.g. Hebbian) doesn't need targets + # if self.current_targets is None: + # if self.verbosePref: + # warnings.warn("No targets were specified in the call to execute {} with learning. This is okay if " + # "your learning (e.g. Hebbian learning) does not need a target.".format(self.name)) for i, target_mech in zip(range(len(self.target_mechanisms)), self.target_mechanisms): # Assign each item of targets to the value of the targetInputState for the TARGET mechanism
Add new environment option KOLIBRI_DEBUG, make get_logger internal to kolibri.utils.options. __get_logger does not return a fully configured logger, since logging may not be configured at the time the function is called.
@@ -130,6 +130,11 @@ base_option_spec = { "default": False, "envvars": ("KOLIBRI_SERVER_PROFILE",), }, + "DEBUG": { + "type": "boolean", + "default": False, + "envvars": ("KOLIBRI_DEBUG",), + }, }, "Paths": { "CONTENT_DIR": { @@ -177,9 +182,13 @@ base_option_spec = { } -def get_logger(KOLIBRI_HOME): +def __get_logger(KOLIBRI_HOME): """ - We define a minimal default logger config here, since we can't yet load up Django settings. + We define a minimal default logger config here, since we can't yet + load up Django settings. + + NB! Since logging can be defined by options, the logging from some + of the functions in this module do not use fully customized logging. """ from kolibri.utils.conf import LOG_ROOT @@ -233,7 +242,7 @@ def clean_conf(conf): def read_options_file(KOLIBRI_HOME, ini_filename="options.ini"): - logger = get_logger(KOLIBRI_HOME) + logger = __get_logger(KOLIBRI_HOME) ini_path = os.path.join(KOLIBRI_HOME, ini_filename) @@ -326,7 +335,7 @@ def _expand_paths(basepath, pathdict): def update_options_file(section, key, value, KOLIBRI_HOME, ini_filename="options.ini"): - logger = get_logger(KOLIBRI_HOME) + logger = __get_logger(KOLIBRI_HOME) # load the current conf from disk into memory conf = read_options_file(KOLIBRI_HOME, ini_filename=ini_filename)
[dagit] Fix AssetView.test key warnings ### Summary & Motivation Repair these warnings by adjusting some of the mocks. ### How I Tested These Changes yarn jest AssetView
@@ -13372,9 +13372,9 @@ __metadata: linkType: hard "caniuse-lite@npm:^1.0.0, caniuse-lite@npm:^1.0.30001109, caniuse-lite@npm:^1.0.30001214, caniuse-lite@npm:^1.0.30001219, caniuse-lite@npm:^1.0.30001286, caniuse-lite@npm:^1.0.30001297, caniuse-lite@npm:^1.0.30001299, caniuse-lite@npm:^1.0.30001332": - version: 1.0.30001336 - resolution: "caniuse-lite@npm:1.0.30001336" - checksum: 05577b295f2c3780f4a2c814c4255b8b73353ff5a7238f5f97fe3b2bb61b78d77be7df52e9646b829bbde8f0efbfbad971324001086f9069ec144e4fc88ed5b8 + version: 1.0.30001426 + resolution: "caniuse-lite@npm:1.0.30001426" + checksum: e8b9c14ee33410d95b27da619f50648f373a7be712748970643f25d95fa80687b4755ba365f34a7a1cea00f9137193943aa6e742eedf0a4d7857f83809f49435 languageName: node linkType: hard
Wrong Client is also a FatalClientError A FatalClientError SHOULD NOT be redirected to the client (redirect_uri), but MUST be redirected to the USER (error_uri).
@@ -224,7 +224,7 @@ class TemporarilyUnavailableError(OAuth2Error): error = 'temporarily_unavailable' -class InvalidClientError(OAuth2Error): +class InvalidClientError(FatalClientError): """ Client authentication failed (e.g. unknown client, no client authentication included, or unsupported authentication method).
Update Redis exporter to 1.14.0 PR - allow configuring whether the port is included in the client's details (by default it is not)
@@ -58,7 +58,7 @@ packages: context: static: <<: *default_static_context - version: 1.13.1 + version: 1.14.0 license: MIT summary: Prometheus exporter for Redis server metrics. description: Prometheus Exporter for Redis Metrics. Supports Redis 2.x, 3.x, 4.x, 5.x and 6.x
Fix pylint warning in test_l3_hamode_db.py ************* Module neutron.tests.unit.db.test_l3_hamode_db C:841, 0: Line too long (80/79) (line-too-long) Trivialfix
@@ -838,8 +838,8 @@ class L3HATestCase(L3HATestFramework): self.admin_ctx, states, self.agent1['host']) def test_exclude_dvr_agents_for_ha_candidates(self): - """Test dvr agents configured with "dvr" only, as opposed to "dvr_snat", - are excluded. + """Test dvr agents configured with "dvr" only, as opposed to + "dvr_snat", are excluded. This test case tests that when get_number_of_agents_for_scheduling is called, it does not count dvr only agents. """
Remove unnecessary log statement. All paths are covered by more specific statements, so remove extra log line in the most common case.
@@ -427,13 +427,13 @@ class Interchange: try: msg = Message.unpack(raw_msg) - log.debug("received Message/Heartbeat? on task queue") except Exception: log.exception(f"Failed to unpack message, RAW:{raw_msg}") continue if msg == "STOP": # TODO: Yadu. This should be replaced by a proper MessageType + log.debug("Received STOP message.") kill_event.set() break elif isinstance(msg, Heartbeat):
make `setxor1d` a bit clearer and speed it up We need to find the indices whose values differ from both the left and right neighbours; np.logical_and expresses that meaning more clearly, and testing shows a speed-up.
@@ -378,8 +378,9 @@ def setxor1d(ar1, ar2, assume_unique=False): # flag = ediff1d( aux, to_end = 1, to_begin = 1 ) == 0 flag = np.concatenate(([True], aux[1:] != aux[:-1], [True])) # flag2 = ediff1d( flag ) == 0 - flag2 = flag[1:] == flag[:-1] - return aux[flag2] +# flag2 = flag[1:] == flag[:-1] +# return aux[flag2] + return aux[np.logical_and(flag[1:], flag[:-1])] def in1d(ar1, ar2, assume_unique=False, invert=False):
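A self-contained sketch of the idea (simplified, not the NumPy source itself): after sorting the concatenated inputs, a value belongs to the symmetric difference exactly when it differs from both its left and right neighbour, which np.logical_and over the two boundary masks states directly:

import numpy as np

def setxor1d_sketch(ar1, ar2):
    # Deduplicate each input, concatenate and sort; shared values end up adjacent.
    aux = np.concatenate((np.unique(ar1), np.unique(ar2)))
    aux.sort()
    # flag marks positions where a new run of equal values starts (sentinels at both ends).
    flag = np.concatenate(([True], aux[1:] != aux[:-1], [True]))
    # Keep values that both start and end a run, i.e. appear exactly once.
    return aux[np.logical_and(flag[1:], flag[:-1])]

print(setxor1d_sketch([1, 2, 3, 4], [3, 4, 5]))   # [1 2 5]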
Update options-for-quality-metrics.md specified which reports it applies to
## Available Options -These options apply to different plots in the Evidently reports: Data Drift, Classification Performance, Probabilistic classification performance, Regression Performance. +These options apply to different plots in the Evidently reports: Data Drift, Categorical Target Drift, Classification Performance, Probabilistic classification performance. You can specify the following parameters:
smp: Fix smp_call macros with fewer than 4 arguments The full 4-argument version of smp_call is smp_call4; smp_call doesn't exist.
@@ -15,10 +15,10 @@ void smp_secondary_entry(void); void smp_start_secondaries(void); -#define smp_call0(i, f) smp_call(i, f, 0, 0, 0, 0) -#define smp_call1(i, f, a) smp_call(i, f, a, 0, 0, 0) -#define smp_call2(i, f, a, b) smp_call(i, f, a, b, 0, 0) -#define smp_call3(i, f, a, b, c) smp_call(i, f, a, b, c, 0) +#define smp_call0(i, f) smp_call4(i, f, 0, 0, 0, 0) +#define smp_call1(i, f, a) smp_call4(i, f, a, 0, 0, 0) +#define smp_call2(i, f, a, b) smp_call4(i, f, a, b, 0, 0) +#define smp_call3(i, f, a, b, c) smp_call4(i, f, a, b, c, 0) void smp_call4(int cpu, void *func, u64 arg0, u64 arg1, u64 arg2, u64 arg3);
[Doc] NN Modules Edit for grammar and style * Edit for grammar and style Improve the flow and readability * Update docs/source/features/nn.rst Better now? NN Modules as a title is vague
.. currentmodule:: dgl -NN Modules +Graph neural network modules =============== -A set of high-level pre-defined modules are provided to build graph neural networks. +This topic provides a link to several high-level, pre-defined modules you can use to build graph neural networks (NN). .. toctree:: @@ -13,11 +13,10 @@ A set of high-level pre-defined modules are provided to build graph neural netwo ../api/python/nn -Contribution Guide +Contribution guide ~~~~~~~~~~~~~~~~~~~~~ -We welcome your contribution! If you want a model to be implemented in DGL as a NN module, -please `create an issue <https://github.com/dmlc/dgl/issues>`_ started with "[Feature Request] NN Module XXXModel". +We welcome your contribution! If you want a model to be implemented in DGL as an NN module, +please `create an issue <https://github.com/dmlc/dgl/issues>`_ and use this title style: "[Feature Request] NN Module XXXModel". -If you want to contribute a NN module, please `create a pull request <https://github.com/dmlc/dgl/pulls>`_ started -with "[NN] XXXModel in MXNet/PyTorch NN Modules" and our team member would review this PR. +If you want to contribute an NN module, please `create a pull request <https://github.com/dmlc/dgl/pulls>`_ and use this title style: "[NN] XXXModel in MXNet/PyTorch NN Modules". A team member will review the pull request.
Update italics formatting help to use asterisks See for context. One user didn't realize `*text*` italicizes it, and asterisks are used in other help docs more consistently, e.g. `_italics_` kept as an alternative for italics.
@@ -22,9 +22,9 @@ Text Style You can use either ``_`` or ``*`` around a word to make it italic. Use two to make it bold. -* ``_italics_`` renders as `italics` +* ``*italics*`` (or ``_italics_``) renders as *italics* * ``**bold**`` renders as **bold** -* ``**_bold-italic_**`` renders as |bold_italics| +* ``***bold-italic***`` renders as |bold_italics| * ``~~strikethrough~~`` renders as |strikethrough| .. |bold_italics| image:: ../../images/bold_italics.PNG
m1n1.constructutils: Improve recursive reload Reload more things, but also avoid reloading the same class multiple times.
@@ -5,35 +5,70 @@ from .utils import Reloadable, ReloadableMeta import inspect import textwrap -def recusive_reload(obj): +g_struct_trace = set() +g_depth = 0 + +def recusive_reload(obj, token=None): + global g_depth + + if token is None: + g_depth = 0 + token = object() + + cur_token = getattr(obj, "_token", None) + if cur_token is token: + return + + g_depth += 1 + #print(" " * g_depth + f"> {obj}", id(obj), id(token)) + if isinstance(obj, Construct) and hasattr(obj, 'subcon'): + # Single subcon types + if inspect.isclass(obj.subcon): + #print("> isclass") + if hasattr(obj.subcon, "_reloadcls"): + #print("> Recursive (subcon)") + obj.subcon = obj.subcon._reloadcls(token=token) + else: + if isinstance(obj.subcon, Construct): + recusive_reload(obj.subcon, token) if isinstance(obj, Construct) and hasattr(obj, 'subcons'): # Construct types that have lists + new_subcons = [] for i, item in enumerate(obj.subcons): if inspect.isclass(item): - if issubclass(item, Reloadable): - obj.subcons[i] = item._reloadcls() + if hasattr(item, "_reloadcls"): + #print("> Recursive (subcons)") + item = item._reloadcls() else: if isinstance(item, Construct): - recusive_reload(item) + recusive_reload(item, token) + new_subcons.append(item) + obj.subcons = new_subcons if isinstance(obj, Construct) and hasattr(obj, 'cases'): # Construct types that have lists - for i, item in obj.cases.items(): + for i, item in list(obj.cases.items()): if inspect.isclass(item): - if issubclass(item, Reloadable): - obj.cases[i] = item._reloadcls() + if hasattr(item, "_reloadcls"): + #print("> Recursive (cases)") + obj.cases[i] = item._reloadcls(token=token) else: if isinstance(item, Construct): - recusive_reload(item) + recusive_reload(item, token) for field in dir(obj): value = getattr(obj, field) if inspect.isclass(value): - if issubclass(value, Reloadable): - setattr(obj, field, value._reloadcls()) + if hasattr(value, "_reloadcls"): + #print("> Recursive (value)") + setattr(obj, field, value._reloadcls(token=token)) else: if isinstance(value, Construct): - recusive_reload(value) + recusive_reload(value, token) + + obj._token = token + + g_depth -= 1 def str_value(value): if isinstance(value, DecDisplayedInteger): @@ -149,9 +184,11 @@ class ConstructClassBase(Reloadable, metaclass=ReloadableConstructMeta): return cls.subcon._sizeof(context, f"{path} -> {cls.name}") @classmethod - def _reloadcls(cls, force=False): - newcls = super()._reloadcls(force) - recusive_reload(newcls.subcon) + def _reloadcls(cls, force=False, token=None): + #print(f"_reloadcls({cls})", id(cls)) + newcls = Reloadable._reloadcls.__func__(cls, force) + if hasattr(newcls, "subcon"): + recusive_reload(newcls.subcon, token) return newcls def _apply(self, obj):
Capirca should be able to use non-int ports Adding feature to allow capirca_acl module to translate ports specified using the service name to their integer value, as mapped in the IANA /etc/services file
@@ -149,7 +149,7 @@ _IP_FILEDS = [ 'next_ip' ] -_DEFAULT_SERVICES = {} +_SERVICES = {} # ------------------------------------------------------------------------------ # helper functions -- will not be exported @@ -208,8 +208,9 @@ def _get_services_mapping(): services shortcut and they will need to specify the protocol / port combination using the source_port / destination_port & protocol fields. ''' - services = {} - services.update(_DEFAULT_SERVICES) + if _SERVICES: + return _SERVICES + global _SERVICES services_txt = '' try: with salt.utils.fopen('/etc/services', 'r') as srv_f: @@ -217,27 +218,38 @@ def _get_services_mapping(): except IOError as ioe: log.error('Unable to read from /etc/services:') log.error(ioe) - return services # no mapping possible, sorry + return _SERVICES # no mapping possible, sorry # will return the default mapping service_rgx = re.compile(r'^([a-zA-Z0-9-]+)\s+(\d+)\/(tcp|udp)(.*)$') for line in services_txt.splitlines(): service_rgx_s = service_rgx.search(line) if service_rgx_s and len(service_rgx_s.groups()) == 4: srv_name, port, protocol, _ = service_rgx_s.groups() - if srv_name not in services: - services[srv_name] = { + if srv_name not in _SERVICES: + _SERVICES[srv_name] = { 'port': [], 'protocol': [] } try: - services[srv_name]['port'].append(int(port)) + _SERVICES[srv_name]['port'].append(int(port)) except ValueError as verr: log.error(verr) log.error('Did not read that properly:') log.error(line) log.error('Please report the above error: {port} does not seem a valid port value!'.format(port=port)) - services[srv_name]['protocol'].append(protocol) - return services + _SERVICES[srv_name]['protocol'].append(protocol) + return _SERVICES + + +def _translate_port(port): + ''' + Look into services and return the port value using the + service name as lookup value. + ''' + services = _get_services_mapping() + if port in services and services[port]['port']: + return services[port]['port'][0] + return port def _make_it_list(dict_, field_name, value): @@ -269,9 +281,23 @@ def _make_it_list(dict_, field_name, value): portval.append((port, port)) else: portval.append(port) - return list(set(prev_value + portval)) + translated_portval = [] + # and the ports sent as string, e.g. ntp instead of 123 + # needs to be translated + # again, using the same /etc/services + for port_start, port_end in portval: + if not isinstance(port_start, int): + port_start = _translate_port(port_start) + if not isinstance(port_end, int): + port_end = _translate_port(port_end) + translated_portval.append( + (port_start, port_end) + ) + return list(set(prev_value + translated_portval)) return list(set(prev_value + list(value))) if field_name in ('source_port', 'destination_port'): + if not isinstance(value, int): + value = _translate_port(value) return list(set(prev_value + [(value, value)])) # a list of tuples # anything else will be enclosed in a list-type return list(set(prev_value + [value]))
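A stripped-down sketch of the lookup step (the services mapping is hard-coded here instead of being parsed from /etc/services, and the entries are illustrative): a named port is replaced by its first numeric value, while anything already numeric passes through unchanged:

# Hypothetical, pre-parsed subset of /etc/services.
SERVICES = {
    "ntp":  {"port": [123], "protocol": ["tcp", "udp"]},
    "http": {"port": [80],  "protocol": ["tcp"]},
}

def translate_port(port):
    # Replace a known service name with its numeric port; leave numbers untouched.
    if port in SERVICES and SERVICES[port]["port"]:
        return SERVICES[port]["port"][0]
    return port

print(translate_port("ntp"))   # 123
print(translate_port(8080))    # 8080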
Add python-dateutil as a dependency of the EBCLI Components of the 'dateutil' module are invariably found in the response objects that botocore generates SIM: cr
@@ -11,6 +11,7 @@ requires = [ 'cement==2.8.2', 'colorama==0.3.7', 'pathspec==0.5.5', + 'python-dateutil>=2.1,<3.0.0', # use the same range that 'botocore' uses 'pyyaml>=3.11', 'setuptools >= 20.0', 'docker-compose >= 1.21.2, < 1.22.0',
Update README.md Add some instructions to solve a common port number issue when 7000 is occupied
# Docker -Make sure you have Docker installed +Make sure you have Docker installed, and Docker Desktop is running. ## Build an image @@ -16,6 +16,12 @@ docker build -t knowledge -f docker/Dockerfile.dev . docker run --rm -it -p 7000:7000 -e KNOWLEDGE_REPO=test_repo knowledge </pre> +After the server is up and running in Docker, the knowledge-repo service will be accessible at http://localhost:7000. If the 7000 port number is being used by another process, you may choose another local port number to use, e.g., the following command starts the service at port number 7001 (i.e., http://localhost:7001): + +<pre> +docker run --rm -it -p 7001:7000 -e KNOWLEDGE_REPO=test_repo knowledge +</pre> + ## Run it locally with a mapped notebook-folder <pre>
Update 35-gaussian_density_fit.py pbc/RHF does not support k-point meshes.
@@ -91,7 +91,7 @@ mf.kernel() # below. Assuming in the first pass, the GDF 3-index tensors are saved with # the following code # -mf = scf.RHF(cell, cell.make_kpts([2,2,2])).density_fit(auxbasis=auxbasis) +mf = scf.KRHF(cell, cell.make_kpts([2,2,2])).density_fit(auxbasis=auxbasis) mf.with_df._cderi_to_save = 'pbc_gdf.h5' mf.kernel()
Improve ensure scene settings: only one call to Harmony, a message box for missing attributes, and new resolution attributes.
import os import time +import sys from avalon import api, harmony +from avalon.vendor import Qt import pyblish.api from pype import lib def ensure_scene_settings(): - fps = lib.get_asset()["data"]["fps"] - frame_start = lib.get_asset()["data"]["frameStart"] - frame_end = lib.get_asset()["data"]["frameEnd"] + asset_data = lib.get_asset()["data"] + fps = asset_data["fps"] + frame_start = asset_data["frameStart"] + frame_end = asset_data["frameEnd"] + resolution_width = asset_data.get("resolutionWidth") + resolution_height = asset_data.get("resolutionHeight") settings = { - "setFrameRate": fps, - "setStartFrame": frame_start, - "setStopFrame": frame_end + "fps": fps, + "frameStart": frame_start, + "frameEnd": frame_end, + "resolutionWidth": resolution_width, + "resolutionHeight": resolution_height } - func = """function func(arg) - {{ - scene.{method}(arg); - }} - func - """ + + invalid_settings = [] + valid_settings = {} for key, value in settings.items(): if value is None: - continue - - # Need to wait to not spam Harmony with multiple requests at the same - # time. - time.sleep(1) - - harmony.send({"function": func.format(method=key), "args": [value]}) + invalid_settings.append(key) + else: + valid_settings[key] = value + + # Warn about missing attributes. + print("Starting new QApplication..") + app = Qt.QtWidgets.QApplication(sys.argv) + + message_box = Qt.QtWidgets.QMessageBox() + message_box.setIcon(Qt.QtWidgets.QMessageBox.Warning) + msg = "Missing attributes:" + if invalid_settings: + for item in invalid_settings: + msg += f"\n{item}" + message_box.setText(msg) + message_box.exec_() + + # Garbage collect QApplication. + del app - time.sleep(1) - - func = """function func(arg) + func = """function func(args) { - frame.remove(arg, frame.numberOf() - arg); + if (args["fps"]) + { + scene.setFrameRate(); + } + if (args["frameStart"]) + { + scene.setStartFrame(args[1]); + } + if (args["frameEnd"]) + { + scene.setStopFrame(args[2]); + frame.remove(args[2], frame.numberOf() - args[2]); + } + if (args["resolutionWidth"] && args["resolutionHeight"]) + { + scene.setDefaultResolution( + args["resolutionWidth"], args["resolutionHeight"], 41.112 + ) + } } func """ - harmony.send({"function": func, "args": [frame_end]}) + + harmony.send({"function": func, "args": [valid_settings]}) def install():
Add ordering operators to Location TN:
@@ -82,6 +82,16 @@ class Location(object): self.line = line self.text = text + @property + def as_tuple(self): + return (self.file, self.line) + + def __eq__(self, other): + return self.as_tuple == other.as_tuple + + def __lt__(self, other): + return self.as_tuple < other.as_tuple + def __repr__(self): return "<Location {} {}>".format(self.file, self.line)
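The pattern in isolation (a toy class, with placeholder file names and line numbers): delegating __eq__ and __lt__ to a (file, line) tuple gives lexicographic ordering for free:

class Loc:
    def __init__(self, file, line):
        self.file, self.line = file, line

    @property
    def as_tuple(self):
        return (self.file, self.line)

    def __eq__(self, other):
        return self.as_tuple == other.as_tuple

    def __lt__(self, other):
        return self.as_tuple < other.as_tuple

print(Loc("a.adb", 3) < Loc("a.adb", 10))   # True: same file, earlier line wins
print(Loc("a.adb", 3) < Loc("b.adb", 1))    # True: file names compare first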
Fix trailing slash in base_url. * Strip trailing slash in base_url. * Revert "Strip trailing slash in base_url." This reverts commit * Combine urls using urllib.parse.urljoin().
@@ -5,6 +5,7 @@ import os import re from collections import UserList from typing import Any, Dict, List, Union +from urllib.parse import urljoin import requests.utils from requests import Response, Session @@ -152,7 +153,7 @@ class APIClient: def _get_base_url_with_base_path(self): base_path = "/api/{}/projects/{}".format(self._api_version, self._config.project) if self._api_version else "" - return self._config.base_url + base_path + return urljoin(self._config.base_url, base_path) def _is_retryable(self, method, path): valid_methods = ["GET", "POST", "PUT", "DELETE"]
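A quick illustration of why urljoin sidesteps the trailing-slash problem (the host here is a placeholder, not the real base_url):

from urllib.parse import urljoin

base_with_slash = "https://example.com/"      # hypothetical base_url values
base_without_slash = "https://example.com"
base_path = "/api/v1/projects/my-project"

# Plain concatenation duplicates the slash when base_url ends with one:
print(base_with_slash + base_path)             # https://example.com//api/v1/projects/my-project
# urljoin normalises both cases to the same URL:
print(urljoin(base_with_slash, base_path))     # https://example.com/api/v1/projects/my-project
print(urljoin(base_without_slash, base_path))  # https://example.com/api/v1/projects/my-project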
Update broad-references.yaml Updating readme URL
Name: Broad Genome References Description: Broad maintained human genome reference builds hg19/hg38 and decoy references. -Documentation: https://s3.amazonaws.com/broad-references/README +Documentation: https://s3.amazonaws.com/broad-references/broad-references-readme.html Contact: [email protected] ManagedBy: Broad Institute UpdateFrequency: Monthly
[doc] Update documentation lintrails --> linktrails
@@ -142,7 +142,7 @@ def update_family_file(): text += yield except GeneratorExit: text += ' }' - # write lintrails to family file + # write linktrails to family file pywikibot.output('Writing family file...') family_file_name = join('pywikibot', 'family.py') with codecs.open(family_file_name, 'r', 'utf8') as family_file:
Update prototype_index.rst Adds module recipe to prototype index.
@@ -91,6 +91,15 @@ Prototype features are not available as part of binary distributions like PyPI o :link: ../prototype/vulkan_workflow.html :tags: Mobile +.. Modules + +.. customcarditem:: + :header: Skipping Module Parameter Initialization in PyTorch 1.10 + :card_description: Describes skipping parameter initialization during module construction in PyTorch 1.10, avoiding wasted computation. + :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png + :link: ../prototype/skip_param_init.html + :tags: Modules + .. TorchScript .. customcarditem::
STY: take dict before Series name Changed `_custom.py` 'add' logic to take the function-supplied metadata in the dictionary before the Series metadata, since the Series metadata is more likely to carry over from the input and the dictionary input is more likely to be deliberately specified.
@@ -163,14 +163,14 @@ class Custom(object): sat[newData['data'].columns] = newData # if a series is returned, add it as a column elif isinstance(newData['data'], pds.Series): - # look for name attached to series first - if newData['data'].name is not None: - sat[newData['data'].name] = newData - # look if name is provided as part of dict - # returned from function - elif 'name' in newData.keys(): + # Look if name is provided as part of dict + # returned from function first + if 'name' in newData.keys(): name = newData.pop('name') sat[name] = newData + # look for name attached to Series second + elif newData['data'].name is not None: + sat[newData['data'].name] = newData # couldn't find name information else: raise ValueError(
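The precedence change, shown on toy data rather than the real instrument object: when the returned dict carries both a 'name' key and a named Series, the explicit 'name' key now wins:

import pandas as pd

new_data = {"data": pd.Series([1.0, 2.0], name="name_carried_over_from_input"),
            "name": "deliberately_specified_name"}

# New ordering: the explicit 'name' key is checked before the Series' own name.
if "name" in new_data:
    column = new_data.pop("name")
elif new_data["data"].name is not None:
    column = new_data["data"].name
print(column)   # deliberately_specified_name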
Update detection-testing.yml Accidentally removed change directory. We were not in the correct directory for the file we were trying to upload to S3.
@@ -323,7 +323,7 @@ jobs: - name: Upload S3 Badge and Summary Artifacts for Nightly Scheduled Run if: ${{ github.event_name == 'schedule' }} run: | - #cd bin/docker_detection_tester + cd bin/docker_detection_tester #python generate_detection_coverage_badge.py --input_summary_file summary_test_results.json --output_badge_file detection_coverage.svg --badge_string "Pass Rate"
user status: Decrease user status modal width by 20% User status modal width is decreased by 20% to make it look better with the user status message options added in the previous commit.
.user_status_overlay { .overlay-content { - width: 480px; + width: 384px; margin: 0 auto; position: relative; top: calc((30vh - 50px) / 2); } input.user_status { - width: 420px; + width: 336px; }; .user-status-header {
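A quick arithmetic check of the 20% figure against the two widths touched in the diff:

print(480 * 0.8, 420 * 0.8)   # 384.0 336.0 -> the new overlay and input widths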
Fix typo in Exception type Hasn't been hit unless there is an error in the migrations which is why it didn't cause tests to fail
@@ -126,7 +126,7 @@ def run_migrations_online(): for rec in engines.values(): rec["transaction"].commit() - except Except: + except Exception: for rec in engines.values(): rec["transaction"].rollback() raise
Update cobaltstrike.txt Fixed
@@ -10608,7 +10608,17 @@ yazorac.com # Reference: https://twitter.com/TheDFIRReport/status/1382757614094852103 -apigw.tencentcs.com +service-3ehlvob0-1301977346.gz.apigw.tencentcs.com +service-7swl0aox-1257100087.cd.apigw.tencentcs.com +service-fooemyjn-1304230653.sh.apigw.tencentcs.com +service-hzt1fyzo-1305236517.gz.apigw.tencentcs.com +service-ijuzpjsx-1255997775.bj.apigw.tencentcs.com +service-iwos0gcv-1257776894.sh.apigw.tencentcs.com +service-pvgy9r42-1257357125.gz.apigw.tencentcs.com +service-0dibtqsv-1255352921.cd.apigw.tencentcs.com +service-4ng7k4aw-1256691685.gz.apigw.tencentcs.com +service-dlijjgbw-1304664184.hk.apigw.tencentcs.com +service-ln18385c-1253152225.hk.apigw.tencentcs.com # Generic
feat(api): schedule scavenge-unused management command daily Running during daytime so that any potential errors have a higher chance of being noticed quickly. Load is neglibile.
*/5 * * * * /usr/local/bin/python3 -u /usr/src/app/manage.py chores >> /var/log/cron.log 2>&1 */5 * * * * /usr/local/bin/python3 -u /usr/src/app/manage.py check-slaves >> /var/log/cron.log 2>&1 +7 11 * * * /usr/local/bin/python3 -u /usr/src/app/manage.py scavenge-unused >> /var/log/cron.log 2>&1
Close Add URM connector to the list of available types for front and rear ports. There are URM-P2, URM-P4 and URM-P8 connectors available.
@@ -958,6 +958,9 @@ class PortTypeChoices(ChoiceSet): TYPE_SPLICE = 'splice' TYPE_CS = 'cs' TYPE_SN = 'sn' + TYPE_URM_P2 = 'urm-p2' + TYPE_URM_P4 = 'urm-p4' + TYPE_URM_P8 = 'urm-p8' CHOICES = ( ( @@ -998,6 +1001,9 @@ class PortTypeChoices(ChoiceSet): (TYPE_ST, 'ST'), (TYPE_CS, 'CS'), (TYPE_SN, 'SN'), + (TYPE_URM_P2, 'URM-P2'), + (TYPE_URM_P4, 'URM-P4'), + (TYPE_URM_P8, 'URM-P8'), (TYPE_SPLICE, 'Splice'), ) )
npm 5.0.0 added a second line after fsevents Remove that line from the output as well so we only get the JSON
@@ -188,7 +188,7 @@ def _extract_json(npm_output): # macOS with fsevents includes the following line in the return # when a new module is installed which is invalid JSON: # [fsevents] Success: "..." - while lines and lines[0].startswith('[fsevents]'): + while lines and (lines[0].startswith('[fsevents]') or lines[0].startswith('Pass ')): lines = lines[1:] try: return json.loads(''.join(lines))
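A rough standalone sketch of the stripping logic (the sample npm output is invented): leading status lines such as "[fsevents] Success: ..." and, since npm 5.0.0, "Pass ..." have to be dropped before json.loads can parse the rest:

import json

def extract_json(npm_output):
    lines = npm_output.splitlines(True)
    # Drop leading non-JSON noise emitted by fsevents / npm >= 5.0.0.
    while lines and (lines[0].startswith('[fsevents]') or lines[0].startswith('Pass ')):
        lines = lines[1:]
    return json.loads(''.join(lines))

sample = '[fsevents] Success: "ok"\nPass built ok\n{"added": 3}\n'   # hypothetical output
print(extract_json(sample))   # {'added': 3}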
CoreDNS support EndpointSlices In order to properly support EndpointSlices, enhance ClusterRole. story: task: 44582
@@ -40,6 +40,13 @@ rules: - nodes verbs: - get +- apiGroups: + - discovery.k8s.io + resources: + - endpointslices + verbs: + - list + - watch --- apiVersion: rbac.authorization.k8s.io/v1 kind: ClusterRoleBinding
Fix format of tip; improve info order Correct the indent so that the tip is formatted correctly. Also move the tip after the info about replacing the VALxKEY strings, because the tip is less important (the example shows the current version number).
@@ -75,6 +75,11 @@ genesis block also includes the keys for the other nodes in the initial network. sawtooth.consensus.algorithm.version=1.0 \ sawtooth.consensus.pbft.members='["VAL1KEY","VAL2KEY",...,"VALnKEY"]' + Replace ``"VAL1KEY","VAL2KEY","VAL3KEY",...,"VALnKEY"`` with the validator public + keys of all the nodes (including this node). This information is in the + file ``/etc/sawtooth/keys/validator.pub`` on each node. Be sure to use + single quotes and double quotes correctly, as shown in the example. + .. tip:: The PBFT version number is in the file ``sawtooth-pbft/Cargo.toml`` @@ -82,11 +87,6 @@ genesis block also includes the keys for the other nodes in the initial network. digits (major and minor release numbers); omit the patch number. For example, if the version is 1.0.3, use ``1.0`` for this setting. - Replace ``"VAL1KEY","VAL2KEY","VAL3KEY",...,"VALnKEY"`` with the validator public - keys of all the nodes (including this node). This information is in the - file ``/etc/sawtooth/keys/validator.pub`` on each node. Be sure to use - single quotes and double quotes correctly, as shown in the example. - * For PoET: .. code-block:: console
Update README, post-merge fix Remove redundant phrase.
@@ -195,7 +195,6 @@ If you want to run the pipeline on a subset of your BIDS dataset, you can use the `-tsv` flag to specify in a TSV file the participants belonging to your subset. -A description of the arguments for the `preprocessing` task is presented below: <details> <summary> Here is a description of the arguments present for the preprocessing task.
Fix sporadic dry_run test failures By using sets instead of lists. Apparently events aren't always ordered perfectly when received by the client, so using sets removes the need for order.
@@ -234,7 +234,7 @@ class ExecutionsTest(AgentlessTestCase): self.assertDictEqual(invocations[0], {'before-sleep': None}) def test_dry_run_execution(self): - expected_messages = [ + expected_messages = { "Starting 'install' workflow execution (dry run)", "Creating node", "Sending task 'cloudmock.tasks.provision'", @@ -249,7 +249,7 @@ class ExecutionsTest(AgentlessTestCase): "Task started 'cloudmock.tasks.get_state'", "Task succeeded 'cloudmock.tasks.get_state (dry run)'", "'install' workflow execution succeeded (dry run)" - ] + } dsl_path = resource("dsl/basic.yaml") _, execution_id = self.deploy_application(dsl_path, @@ -260,6 +260,6 @@ class ExecutionsTest(AgentlessTestCase): # "termindated", because it arrives via a different mechanism time.sleep(3) events = self.client.events.list(execution_id=execution_id) - event_messages = [event['message'] for event in events] + event_messages = {event['message'] for event in events} self.assertEquals(event_messages, expected_messages)
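The gist of the fix on toy data: set comparison ignores the order in which events arrive, while list equality does not:

received = ["Creating node", "Starting 'install' workflow execution (dry run)"]
expected = ["Starting 'install' workflow execution (dry run)", "Creating node"]

print(received == expected)             # False - same events, different arrival order
print(set(received) == set(expected))   # True  - order no longer matters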
Tidy up two checks that do the same thing This also is in the regular language spirit.
@@ -167,11 +167,9 @@ def _interpret_errors(errors): if isinstance( # pylint: disable=bad-continuation error, - DbusClientMissingSearchPropertiesError, + (DbusClientMissingSearchPropertiesError, DbusClientMissingPropertyError), ): # pragma: no cover return _DBUS_INTERFACE_MSG - if isinstance(error, DbusClientMissingPropertyError): # pragma: no cover - return _DBUS_INTERFACE_MSG if isinstance(error, StratisCliEngineError): fmt_str = (
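The idiom in isolation (the exception classes below are stand-ins for the real dbus-client errors): isinstance accepts a tuple of classes, so two checks that take the same branch collapse into one:

class MissingSearchPropertiesError(Exception):
    pass

class MissingPropertyError(Exception):
    pass

err = MissingPropertyError()
# Equivalent to: isinstance(err, MissingSearchPropertiesError) or isinstance(err, MissingPropertyError)
print(isinstance(err, (MissingSearchPropertiesError, MissingPropertyError)))   # True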
Remove use of format! to change &str to String Turning an &str into String requires only .into(), or .to_string(), whereas format! is more complex than necessary.
@@ -34,7 +34,7 @@ use err::CliError; pub fn run<'a>(args: &ArgMatches<'a>) -> Result<(), CliError> { let genesis_file_path = if args.is_present("output") { args.value_of("output") - .ok_or_else(|| CliError::ArgumentError(format!("Failed to read `output` arg"))) + .ok_or_else(|| CliError::ArgumentError("Failed to read `output` arg".into())) .map(|pathstr| Path::new(pathstr).to_path_buf()) } else { Ok(config::get_path_config().data_dir.join("genesis.batch"))
Add minimal optional method For now it only reads in the datapackage and returns without doing anything else.
@@ -133,6 +133,15 @@ class EnergySystem: g(entity, groups) return groups + if not datapackage is NOT_AVAILABLE: + @classmethod + def from_datapackage(cls, path): + package = datapackage.Package(path) + # This is necessary because before reading a resource for the first + # time its `headers` attribute ist `None`. + for r in package.resources: r.read() + + def _add(self, entity): self.entities.append(entity) self._groups = partial(self._regroup, entity, self.groups,
Only unsubscribe dropdown hooks when they were subscribed WindowVisibilityToggler subscribes to two hooks conditional upon self.on_focus_lost_hide, but unconditionally unsubscribes them. This pollutes the log with lots of erroneous exception messages. Closes
@@ -127,6 +127,7 @@ class WindowVisibilityToggler: def unsubscribe(self): """unsubscribe all hooks""" + if self.on_focus_lost_hide: try: hook.unsubscribe.client_focus(self.on_focus_change) except utils.QtileError as err:
Prevent duplicate check_queued_build executions by populating and checking the task_id_check field on the build
@@ -61,11 +61,7 @@ def check_queued_build(build_id): reset_database_connection() from mrbelvedereci.build.models import Build - try: build = Build.objects.get(id = build_id) - except Build.DoesNotExist: - time.sleep(1) - check_queued_build.delay(build_id) if build.status != 'queued': return 'Build is not queued. Current build status is {}'.format(build.status) @@ -74,11 +70,16 @@ def check_queued_build(build_id): try: org = Org.objects.get(name = build.plan.org, repo = build.repo) except Org.DoesNotExist: + message = 'Could not find org configuration for org {}'.format(build.plan.org) + build.log = message + build.status = 'error' + build.save() return 'Could not find org configuration for org {}'.format(build.plan.org) if org.scratch: # For scratch orgs, we don't need concurrency blocking logic, just run the build res_run = run_build.delay(build.id) + build.task_id_check = None build.task_id_run = res_run.id build.save() return "Org is a scratch org, running build concurrently as task {}".format(res_run.id) @@ -96,6 +97,8 @@ def check_queued_build(build_id): return "Got a lock on the org, running as task {}".format(res_run.id) else: # Failed to get lock, queue next check + build.task_id_check = None + build.save() return "Failed to get lock on org. {} has the org locked. Queueing next check.".format(cache.get(lock_id)) @django_rq.job('short', timeout=60) @@ -104,9 +107,11 @@ def check_queued_builds(): from mrbelvedereci.build.models import Build builds = [] - for build in Build.objects.filter(status = 'queued').order_by('time_queue'): + for build in Build.objects.filter(status = 'queued', task_id_check__isnull = True).order_by('time_queue'): builds.append(build.id) - check_queued_build.delay(build.id) + res_check = check_queued_build.delay(build.id) + build.task_id_check = res_check.id + build.save() if builds: return 'Checked queued builds: {}'.format(', '.join(builds)) @@ -135,7 +140,7 @@ def delete_scratch_orgs(): else: orgs_failed += 1 - if orgs_deleted and not orgs_failed: + if not orgs_deleted and not orgs_failed: return 'No orgs found to delete' return 'Deleted {} orgs and failed to delete {} orgs'.format(orgs_deleted, orgs_failed)
Remove maintainer (myself) I have been away from this project too long to still qualify as a maintainer. I shall continue to contribute as and when possible.
@@ -21,4 +21,4 @@ _pywebview_ is a BSD licensed open source project. It is an independent project -_pywebview_ is created by [Roman Sirokov](https://github.com/r0x0r/). Maintained by Roman and [Shiva Prasad](https://github.com/shivaprsdv). +_pywebview_ is created and maintained by [Roman Sirokov](https://github.com/r0x0r/).
Add release timeline The examples and the release timeline will help developers plan function deprecation with respect to the insights-core release cycle.
@@ -464,7 +464,7 @@ Functions from insights.util import deprecated def old_feature(arguments): - deprecated(old_feature, "Use the new_feature() function instead") + deprecated(old_feature, "Use the new_feature() function instead", "3.1.25") ... Class methods @@ -478,7 +478,7 @@ Class methods ... def old_method(self, *args, **kwargs): - deprecated(self.old_method, "Use the new_method() method instead") + deprecated(self.old_method, "Use the new_method() method instead", "3.1.25") self.new_method(*args, **kwargs) ... @@ -491,7 +491,7 @@ Class class ThingParser(Parser): def __init__(self, *args, **kwargs): - deprecated(ThingParser, "Use the new_feature() function instead") + deprecated(ThingParser, "Use the new_feature() function instead", "3.1.25") super(ThingParser, self).__init__(*args, **kwargs) ... @@ -508,3 +508,30 @@ The :py:func:`insights.util.deprecated` function takes three arguments: ``new_parser`` module." - For a specific method being replaced by a general mechanism: "Please use the ``search`` method with the arguments ``state="LISTEN"``." +- The last ``version`` of insights-core that the functions will be available + before it is removed. For example: + + - For version 3.1.0 the last revision will be 3.1.25. If the deprecation + message indicate that the last version is 3.1.25, the function will be + removed in 3.2.0. + + +Insights-core release timeline +------------------------------ + +.. table:: + :widths: auto + + ======= ===================== + Version Expected release date + ======= ===================== + 3.0.300 December 2022 (Initial release) + 3.1.0 December 2022 + 3.2.0 June 2023 + 3.3.0 December 2023 + 3.4.0 June 2024 + ======= ===================== + +.. note:: + - We bump the insights-core revision every week. Please refer the `CHANGELOG.md file <https://github.com/RedHatInsights/insights-core/blob/3.0/CHANGELOG.md>`_ for more info. + - The minor version will be bumped after every 25 revisions. For example, after 3.1.25, we would move to 3.2.0 except for 3.0.300 which marks the first planned release. After 3.0.300, we bump the minor version to 3.1.0.
[Context] Add region_name in credential If we don't add region_name in a multi-region cloud, there is a chance that the validate function will fail. Also, the previous region_name parameter was not passed in the right location.
@@ -210,11 +210,13 @@ class UserGenerator(context.Context): user_credential = objects.Credential( self.credential.auth_url, user.name, password, self.context["tenants"][tenant_id]["name"], - consts.EndpointPermission.USER, self.credential.region_name, - project_domain_name=project_dom, user_domain_name=user_dom, + consts.EndpointPermission.USER, + project_domain_name=project_dom, + user_domain_name=user_dom, endpoint_type=self.credential.endpoint_type, https_insecure=self.credential.insecure, - https_cacert=self.credential.cacert) + https_cacert=self.credential.cacert, + region_name=self.credential.region_name) users.append({"id": user.id, "credential": user_credential, "tenant_id": tenant_id})