Dataset columns: "Unnamed: 0" (int64, 0 to 389k), "code" (string, lengths 26 to 79.6k), "docstring" (string, lengths 1 to 46.9k).
378,000
def multi_ping(dest_addrs, timeout, retry=0, ignore_lookup_errors=False): retry = int(retry) if retry < 0: retry = 0 timeout = float(timeout) if timeout < 0.1: raise MultiPingError("Timeout < 0.1 seconds not allowed") retry_timeout = float(timeout) / (retry + 1) if retry_timeout < 0.1: raise MultiPingError("Time between ping retries < 0.1 seconds") mp = MultiPing(dest_addrs, ignore_lookup_errors=ignore_lookup_errors) results = {} retry_count = 0 while retry_count <= retry: mp.send() single_results, no_results = mp.receive(retry_timeout) results.update(single_results) if not no_results: break retry_count += 1 return results, no_results
Combine send and receive measurement into a single function. This offers a retry mechanism: the overall timeout time is divided by the number of retries. Additional ICMP echo packets are sent to those addresses from which we have not received answers yet. The retry mechanism is useful because individual ICMP packets may get lost. If 'retry' is set to 0 then only a single packet is sent to each address. If 'ignore_lookup_errors' is set then any issues with resolving target names or looking up their address information will silently be ignored. Those targets simply appear in the 'no_results' return list.
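A minimal usage sketch for multi_ping as described above; it assumes the function comes from the multiping package and that the process may open raw ICMP sockets (both assumptions, not stated in the record):

    # Hedged usage sketch, not part of the original record.
    from multiping import multi_ping  # assumed import location

    addrs = ["8.8.8.8", "127.0.0.1", "somehost.example.com"]

    # Overall timeout of 2 seconds, split across the initial send plus 2 retries.
    responses, no_responses = multi_ping(addrs, timeout=2, retry=2,
                                         ignore_lookup_errors=True)

    for addr, rtt in responses.items():
        print("%s responded in %.2f ms" % (addr, rtt * 1000))
    for addr in no_responses:
        print("%s gave no answer" % addr)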
378,001
def status(self): status_request = etcdrpc.StatusRequest() status_response = self.maintenancestub.Status( status_request, self.timeout, credentials=self.call_credentials, metadata=self.metadata ) for m in self.members: if m.id == status_response.leader: leader = m break else: leader = None return Status(status_response.version, status_response.dbSize, leader, status_response.raftIndex, status_response.raftTerm)
Get the status of the responding member.
378,002
def dprint(s): import inspect frameinfo = inspect.stack()[1] callerframe = frameinfo.frame d = callerframe.f_locals if isinstance(s, str): val = eval(s, d) else: val = s cc = frameinfo.code_context[0] import re regex = re.compile(r"dprint\((.*)\)") res = regex.search(cc) s = res.group(1) text = bcolors.OKBLUE + "At <{}>\n".format(str(frameinfo)) + bcolors.ENDC text += bcolors.WARNING + "{}: ".format(s) + bcolors.ENDC text += str(val) text += str() print(text)
Prints `s` with additional debugging information
378,003
def default_targets(self): from dvc.stage import Stage msg = "assuming default target '{}'.".format(Stage.STAGE_FILE) logger.warning(msg) return [Stage.STAGE_FILE]
Default targets for `dvc repro` and `dvc pipeline`.
378,004
def song(self): if self._song is None: self._song = Song(self._song_data) return self._song
the song associated with the project
378,005
def iter_chain(cur): select = "SELECT nodes FROM chain" for nodes, in cur.execute(select): yield json.loads(nodes)
Iterate over all of the chains in the database. Args: cur (:class:`sqlite3.Cursor`): An sqlite3 cursor. This function is meant to be run within a :obj:`with` statement. Yields: list: The chain.
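A small hedged sketch of how iter_chain is meant to be used; the chain(nodes TEXT) schema below is inferred from the SELECT statement and is an assumption:

    # Hedged usage sketch; the schema is inferred, not given in the record.
    import json
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE chain (nodes TEXT)")
    conn.execute("INSERT INTO chain VALUES (?)", (json.dumps([0, 1, 2]),))

    cur = conn.cursor()
    for chain in iter_chain(cur):
        print(chain)  # -> [0, 1, 2]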
378,006
def parseinput(inputlist,outputname=None, atfile=None): files = [] newoutputname = outputname assoclist.append(fileutil.buildRootname(f)) files.remove(file) files.extend(assoclist) return files, newoutputname
Recursively parse user input based upon the irafglob program and construct a list of files that need to be processed. This program addresses the following deficiencies of the irafglob program: parseinput can extract filenames from association tables. Parameters ---------- inputlist - string specification of input files using either wild-cards, @-file or comma-separated list of filenames outputname - string desired name for output product to be created from the input files atfile - object function to use in interpreting the @-file columns that gets passed to irafglob Returns ------- files - list of strings names of input files to be processed, in addition to the name of any outfiles specified in an association table newoutputname - string name of output file to be created. See Also -------- stsci.tools.irafglob
378,007
def highlight_nodes(graph: BELGraph, nodes: Optional[Iterable[BaseEntity]] = None, color: Optional[str]=None): color = color or NODE_HIGHLIGHT_DEFAULT_COLOR for node in nodes if nodes is not None else graph: graph.node[node][NODE_HIGHLIGHT] = color
Adds a highlight tag to the given nodes. :param graph: A BEL graph :param nodes: The nodes to add a highlight tag on :param color: The color to highlight (use something that works with CSS)
378,008
def get_paged(self, res, **kwargs): if self.page_size is not None: kwargs[] = self.page_size if self.page_size <= 0: return res(**kwargs) def worker(): kwargs[] = 1 while True: response = res(**kwargs) yield response[] if response[]: kwargs[] += 1 else: break return itertools.chain.from_iterable(worker())
This call is equivalent to ``res(**kwargs)``, only it retrieves all pages and returns the results joined into a single iterable. The advantage over retrieving everything at once is that the result can be consumed immediately. :param res: what resource to connect to :param kwargs: filters to be used :: # Example: Iterate over all active releases for release in client.get_paged(client['releases']._, active=True): ... This function is obsolete and not recommended.
378,009
def child_context(self, *args, **kwargs): expected_args = { : [], : [], : [], : [], : [], } ifargs = { arg: self.config_handler.config.get(arg.upper(), default) for arg, default in expected_args.items() } ifargs.update(kwargs) if self.interface is None: self.setup(*args, **ifargs) with super(PyrosBase, self).child_context(*args, **kwargs) as cctxt: yield cctxt
Context setup first in child process, before returning from start() call in parent. Result is passed in as argument of update :return:
378,010
def generate_dumper(self, mapfile, names): return self.build_template(mapfile, names, self._dumpdata_template)
Build dumpdata commands
378,011
def locations_for(self, city_name, country=None, matching='nocase'): if not city_name: return [] if matching not in self.MATCHINGS: raise ValueError("Unknown type of matching: " "allowed values are %s" % ", ".join(self.MATCHINGS)) if country is not None and len(country) != 2: raise ValueError("Country must be a 2-char string") splits = self._filter_matching_lines(city_name, country, matching) return [Location(item[0], float(item[3]), float(item[2]), int(item[1]), item[4]) for item in splits]
Returns a list of Location objects corresponding to the int IDs and relative toponyms and 2-chars country of the cities matching the provided city name. The rule for identifying matchings is according to the provided `matching` parameter value. If `country` is provided, the search is restricted to the cities of the specified country. :param country: two character str representing the country where to search for the city. Defaults to `None`, which means: search in all countries. :param matching: str among `exact` (literal, case-sensitive matching), `nocase` (literal, case-insensitive matching) and `like` (matches cities whose name contains as a substring the string fed to the function, no matter the case). Defaults to `nocase`. :raises ValueError if the value for `matching` is unknown :return: list of `weatherapi25.location.Location` objects
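A short hedged usage sketch; `registry` stands in for whatever object exposes locations_for, since the receiver class is not shown in the record:

    # Hedged usage sketch; `registry` is a hypothetical instance of the defining class.
    matches = registry.locations_for("London", country="GB", matching="nocase")
    for loc in matches:
        print(loc)  # each item is a weatherapi25.location.Location object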
378,012
def to_dict(self): session = self._get_session() snapshot = self._get_snapshot() return { "session_id": session._session_id, "transaction_id": snapshot._transaction_id, }
Return state as a dictionary. Result can be used to serialize the instance and reconstitute it later using :meth:`from_dict`. :rtype: dict
378,013
def postprocess_authors_init(self, entry): if type(entry.authors_init) is not list: entry.authors_init = [entry.authors_init]
If only a single author was found, ensure that ``authors_init`` is nonetheless a list.
378,014
def generate_func_call(name, args=None, kwargs=None): all_args = [] if args: all_args.extend(args) if kwargs: all_args.extend('{}={}'.format(k, v) for k, v in kwargs if v is not None) return '{}({})'.format(name, ', '.join(all_args))
Generates code to call a function. Args: name (str): The function name. args (list[str]): Each positional argument. kwargs (list[tuple]): Each tuple is (arg: str, value: str). If value is None, then the keyword argument is omitted. Otherwise, if the value is not a string, then str() is called on it. Returns: str: Code to call a function.
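A tiny hedged example of the intended output, assuming the elided format strings are '{}={}' and '{}({})' as restored above:

    # Hedged example of the generated call string.
    code = generate_func_call(
        "requests.get",
        args=["url"],
        kwargs=[("timeout", "30"), ("verify", None)],  # None values are omitted
    )
    print(code)  # -> requests.get(url, timeout=30)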
378,015
def select(self, choice_scores): alg_scores = {} for algorithm, choices in self.by_algorithm.items(): if not set(choices) & set(choice_scores.keys()): continue sublists = [choice_scores.get(c, []) for c in choices] alg_scores[algorithm] = sum(sublists, []) best_algorithm = self.bandit(alg_scores) best_subset = self.by_algorithm[best_algorithm] normal_ucb1 = UCB1(choices=best_subset) return normal_ucb1.select(choice_scores)
Groups the frozen sets by algorithm and first chooses an algorithm based on the traditional UCB1 criteria. Next, from that algorithm's frozen sets, makes the final set choice.
378,016
def _compute_weights(self): n = self.n c = 1. / (n + 1) self.Wm = np.full(n + 1, c) self.Wc = self.Wm
Computes the weights for the scaled unscented Kalman filter.
378,017
def get_signing_keys(eid, keydef, key_file): if os.path.isfile(key_file): kj = KeyJar() kj.import_jwks(json.loads(open(key_file, 'r').read()), eid) else: kj = build_keyjar(keydef)[1] fp = open(key_file, 'w') fp.write(json.dumps(kj.export_jwks())) fp.close() kj.issuer_keys[eid] = kj.issuer_keys[''] return kj
If the *key_file* file exists then read the keys from there, otherwise create the keys and store them a file with the name *key_file*. :param eid: The ID of the entity that the keys belongs to :param keydef: What keys to create :param key_file: A file name :return: A :py:class:`oidcmsg.key_jar.KeyJar` instance
378,018
def set_input(self, p_name, value): name = self.python_names.get(p_name) if p_name is None or name not in self.get_input_names(): raise ValueError(.format(p_name)) self.step_inputs[name] = value
Set a Step's input variable to a certain value. The value comes either from a workflow input or output of a previous step. Args: name (str): the name of the Step input value (str): the name of the output variable that provides the value for this input. Raises: ValueError: The name provided is not a valid input name for this Step.
378,019
def _find_feature_type(self, feature_name, eopatch): for feature_type in self.allowed_feature_types: if feature_type.has_dict() and feature_name in eopatch[feature_type]: return feature_type return None
Iterates over allowed feature types of given EOPatch and tries to find a feature type for which there exists a feature with given name :return: A feature type or `None` if such feature type does not exist :rtype: FeatureType or None
378,020
def _processArgs(self, entry, *_args, **_kwargs): args = list(_args) kwargs = copy.deepcopy(_kwargs) reqArgs = entry[] routeParams = {} query = {} payload = None kwApiArgs = {} paginationHandler = None paginationLimit = None if len(kwargs) == 0: if in entry and len(args) == len(reqArgs) + 1: payload = args.pop() if len(args) != len(reqArgs): log.debug(args) log.debug(reqArgs) raise exceptions.TaskclusterFailure() log.debug() else: else: isFlatKwargs = False for arg in args: if not isinstance(arg, six.string_types) and not isinstance(arg, int): raise exceptions.TaskclusterFailure( % (arg, entry[])) for name, arg in six.iteritems(kwApiArgs): if not isinstance(arg, six.string_types) and not isinstance(arg, int): raise exceptions.TaskclusterFailure( % (name, arg, entry[])) if len(args) > 0 and len(kwApiArgs) > 0: raise exceptions.TaskclusterFailure() if len(reqArgs) > len(args) + len(kwApiArgs): raise exceptions.TaskclusterFailure( % ( entry[], len(reqArgs), len(args) + len(kwApiArgs))) if len(args) > len(reqArgs): raise exceptions.TaskclusterFailure(, entry[]) i = 0 for arg in args: log.debug(, arg) routeParams[reqArgs[i]] = arg i += 1 log.debug(, routeParams) routeParams.update(kwApiArgs) log.debug(, routeParams) if len(reqArgs) != len(routeParams): errMsg = % ( entry[], .join(reqArgs), routeParams.keys()) log.error(errMsg) raise exceptions.TaskclusterFailure(errMsg) for reqArg in reqArgs: if reqArg not in routeParams: errMsg = % ( entry[], reqArg) log.error(errMsg) raise exceptions.TaskclusterFailure(errMsg) return routeParams, payload, query, paginationHandler, paginationLimit
Given an entry, positional and keyword arguments, figure out what the query-string options, payload and api arguments are.
378,021
def write(self, text, fg=, bg=): if isinstance(text, str): sys.stdout.write(text) else: sys.stdout.write(str(text)) sys.stdout.flush()
write to the console
378,022
def parse_kal_scan(kal_out): kal_data = [] scan_band = determine_scan_band(kal_out) scan_gain = determine_scan_gain(kal_out) scan_device = determine_device(kal_out) sample_rate = determine_sample_rate(kal_out) chan_detect_threshold = determine_chan_detect_threshold(kal_out) for line in kal_out.splitlines(): if "chan:" in line: p_line = line.split() chan = str(p_line[1]) modifier = str(p_line[3]) power = str(p_line[5]) mod_raw = str(p_line[4]).replace(, ) base_raw = str((p_line[2]).replace(, )) mod_freq = herz_me(mod_raw) base_freq = herz_me(base_raw) final_freq = to_eng(determine_final_freq(base_freq, modifier, mod_freq)) kal_run = {"channel": chan, "base_freq": base_freq, "mod_freq": mod_freq, "modifier": modifier, "final_freq": final_freq, "power": power, "band": scan_band, "gain": scan_gain, "device": scan_device, "sample_rate": sample_rate, "channel_detect_threshold": chan_detect_threshold} kal_data.append(kal_run.copy()) return kal_data
Parse kal band scan output.
378,023
def advection(scalar, wind, deltas): wind = _stack(wind) if wind.ndim > scalar.ndim: wind = wind[::-1] grad = _stack(gradient(scalar, deltas=deltas[::-1])) grad, wind = atleast_2d(grad, wind) return (-grad * wind).sum(axis=0)
Calculate the advection of a scalar field by the wind. The order of the dimensions of the arrays must match the order in which the wind components are given. For example, if the winds are given [u, v], then the scalar and wind arrays must be indexed as x,y (which puts x as the rows, not columns). Parameters ---------- scalar : N-dimensional array Array (with N-dimensions) with the quantity to be advected. wind : sequence of arrays Length M sequence of N-dimensional arrays. Represents the flow, with a component of the wind in each dimension. For example, for horizontal advection, this could be a list: [u, v], where u and v are each a 2-dimensional array. deltas : sequence of float or ndarray A (length M) sequence containing the grid spacing(s) in each dimension. If using arrays, in each array there should be one item less than the size of `scalar` along the applicable axis. Returns ------- N-dimensional array An N-dimensional array containing the advection at all grid points.
378,024
def list_files(self, dataset_id, glob=".", is_dir=False): data = { "list": { "glob": glob, "isDir": is_dir } } return self._get_success_json(self._post_json(routes.list_files(dataset_id), data, failure_message="Failed to list files for dataset {}".format(dataset_id)))[]
List matched filenames in a dataset on Citrination. :param dataset_id: The ID of the dataset to search for files. :type dataset_id: int :param glob: A pattern which will be matched against files in the dataset. :type glob: str :param is_dir: A boolean indicating whether or not the pattern should match against the beginning of paths in the dataset. :type is_dir: bool :return: A list of filepaths in the dataset matching the provided glob. :rtype: list of strings
378,025
def _peg_pose_in_hole_frame(self): peg_pos_in_world = self.sim.data.get_body_xpos("cylinder") peg_rot_in_world = self.sim.data.get_body_xmat("cylinder").reshape((3, 3)) peg_pose_in_world = T.make_pose(peg_pos_in_world, peg_rot_in_world) hole_pos_in_world = self.sim.data.get_body_xpos("hole") hole_rot_in_world = self.sim.data.get_body_xmat("hole").reshape((3, 3)) hole_pose_in_world = T.make_pose(hole_pos_in_world, hole_rot_in_world) world_pose_in_hole = T.pose_inv(hole_pose_in_world) peg_pose_in_hole = T.pose_in_A_to_pose_in_B( peg_pose_in_world, world_pose_in_hole ) return peg_pose_in_hole
A helper function that returns the pose of the peg in the frame of the hole.
378,026
def reversals(self, transfer_id, data={}, **kwargs): url = "{}/{}/reversals".format(self.base_url, transfer_id) return self.get_url(url, data, **kwargs)
Get all Reversal Transfer from given id Args: transfer_id : Id for which reversal transfer object has to be fetched Returns: Transfer Dict
378,027
def list(self, *args, **kwargs): return [ self.prepare_model(n) for n in self.client.api.nodes(*args, **kwargs) ]
List swarm nodes. Args: filters (dict): Filters to process on the nodes list. Valid filters: ``id``, ``name``, ``membership`` and ``role``. Default: ``None`` Returns: A list of :py:class:`Node` objects. Raises: :py:class:`docker.errors.APIError` If the server returns an error. Example: >>> client.nodes.list(filters={'role': 'manager'})
378,028
def perform(self, command, params=None, **kwargs): self._check_session() if not params: params = {} if kwargs: params.update(kwargs) params[] = command status, data = self._rest.post_request(, None, params) return data
Execute a command. Arguments can be supplied either as a dictionary or as keyword arguments. Examples: stc.perform('LoadFromXml', {'filename':'config.xml'}) stc.perform('LoadFromXml', filename='config.xml') Arguments: command -- Command to execute. params -- Optional. Dictionary of parameters (name-value pairs). kwargs -- Optional keyword arguments (name=value pairs). Return: Data from command.
378,029
def unpublish(self): self._client._delete( "{0}/published".format( self.__class__.base_url( self.sys[].id, self.sys[], environment_id=self._environment_id ), ), headers=self._update_headers() ) return self.reload()
Unpublishes the resource.
378,030
def is_uncertainty_edition_allowed(self, analysis_brain): if not self.is_result_edition_allowed(analysis_brain): return False obj = api.get_object(analysis_brain) if not obj.getAllowManualUncertainty(): return False if obj.getDetectionLimitOperand() in [LDL, UDL]: return False return True
Checks if the edition of the uncertainty field is allowed :param analysis_brain: Brain that represents an analysis :return: True if the user can edit the uncertainty field, otherwise False
378,031
def byte_bounds_offset(self): if self.data.base is None: if self.is_indexed: basearray = self.data.np_data else: basearray = self.data return 0, len(basearray) return int(self.data_start - self.base_start), int(self.data_end - self.base_start)
Return start and end offsets of this segment's data into the base array's data. This ignores the byte order index. Arrays using the byte order index will have the entire base array's raw data.
378,032
def imagetransformer_sep_channels_12l_16h_imagenet_large(): hparams = imagetransformer_sep_channels_8l_8h() hparams.num_hidden_layers = 12 hparams.batch_size = 1 hparams.filter_size = 2048 hparams.num_heads = 16 hparams.learning_rate_warmup_steps = 16000 hparams.sampling_method = "random" hparams.learning_rate = 0.1 return hparams
separate rgb embeddings.
378,033
def set_many(self, mapping, timeout=None): rv = True for key, value in _items(mapping): if not self.set(key, value, timeout): rv = False return rv
Sets multiple keys and values from a mapping. :param mapping: a mapping with the keys/values to set. :param timeout: the cache timeout for the key (if not specified, it uses the default timeout). :returns: Whether all given keys have been set. :rtype: boolean
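A brief hedged usage sketch; `cache` stands in for an instance of the backend class that defines set_many, which is not shown in the record:

    # Hedged usage sketch with a hypothetical cache instance.
    ok = cache.set_many({"user:1": "alice", "user:2": "bob"}, timeout=300)
    if not ok:
        print("at least one key could not be stored")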
378,034
def artUrl(self): art = self.firstAttr(, ) return self._server.url(art, includeToken=True) if art else None
Return the first art URL, starting with the most specific one for that item.
378,035
def refresh_fqdn_cache(force=False): ** if not isinstance(force, bool): raise CommandExecutionError("Force option must be boolean.") if force: query = {: , : } else: query = {: , : } return __proxy__[](query)
Force refreshes all FQDNs used in rules. force Forces all fqdn refresh CLI Example: .. code-block:: bash salt '*' panos.refresh_fqdn_cache salt '*' panos.refresh_fqdn_cache force=True
378,036
def is_switched_on(self, refresh=False): if refresh: self.refresh() val = self.get_value() return val ==
Get armed state. Refresh data from Vera if refresh is True, otherwise use local cache. Refresh is only needed if you're not using subscriptions.
378,037
def until(self, condition, is_true=None, message=""): rv = None last_exc = None until = is_true or until_pred start = self.clock.now while not until(self.clock, self.end): try: rv = condition() except (KeyboardInterrupt, SystemExit) as e: raise e except self.exceptions as e: last_exc = sys.exc_info() if isinstance(rv, bool) and not rv: time.sleep(self.interval) continue if rv is not None: return rv self.clock.sleep(self.interval) if message: message = " with message: %s" % message raise TimeoutException( "Timed out after %s seconds%s" % ((self.clock.now - start), message), cause=last_exc)
Repeatedly runs condition until its return value evaluates to true, or its timeout expires or the predicate evaluates to true. This will poll at the given interval until the given timeout is reached, or the predicate or conditions returns true. A condition that returns null or does not evaluate to true will fully elapse its timeout before raising a ``TimeoutException``. If an exception is raised in the condition function and it's not ignored, this function will raise immediately. If the exception is ignored, it will continue polling for the condition until it returns successfully or a ``TimeoutException`` is raised. The return value of the callable `condition` will be returned once it completes successfully. :param condition: A callable function whose return value will be returned by this function if it evaluates to true. :param is_true: An optional predicate that will terminate and return when it evaluates to False. It should be a function that will be passed `clock` and an end time. The default predicate will terminate a wait when the clock elapses the timeout. :param message: An optional message to include in the exception's message if this function times out. :returns: Return value of `condition`.
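A hedged polling sketch; the constructor arguments for the Wait-style object below are assumptions, since only the until() method appears in the record:

    # Hedged usage sketch; Wait(...) construction and ping_server() are hypothetical.
    waiter = Wait(timeout=10, interval=0.5)

    def server_ready():
        try:
            return ping_server()   # truthy once the server answers
        except ConnectionError:
            return False           # keep polling until the timeout elapses

    result = waiter.until(server_ready, message="server never became ready")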
378,038
def syncView(self): if not self.updatesEnabled(): return for item in self.topLevelItems(): try: item.syncView(recursive=True) except AttributeError: continue
Syncs all the items to the view.
378,039
def write_branch_data(self, file): writer = self._get_writer(file) writer.writerow(BRANCH_ATTRS) for branch in self.case.branches: writer.writerow([getattr(branch, a) for a in BRANCH_ATTRS])
Writes branch data as CSV.
378,040
def _authenticate(self, params, headers): if self.authentication: user = self.authentication.get_user() params.update({: user.firebase_auth_token}) headers.update(self.authentication.authenticator.HEADERS)
Method that simply adjusts authentication credentials for the request. `params` is the querystring of the request. `headers` is the header of the request. If auth instance is not provided to this class, this method simply returns without doing anything.
378,041
def to_pb(self): union = table_v2_pb2.GcRule.Union(rules=[rule.to_pb() for rule in self.rules]) return table_v2_pb2.GcRule(union=union)
Converts the union into a single GC rule as a protobuf. :rtype: :class:`.table_v2_pb2.GcRule` :returns: The converted current object.
378,042
def keyword_hookup(self, noteId, keywords): try: self.cur.execute("DELETE FROM notekeyword WHERE noteid=?", [noteId]) except: self.error("ERROR: cannot unhook previous keywords") for keyword in keywords: keyword = keyword.decode() self.fyi(" inserting keyword:", keyword) keywordId = self.con.execute("SELECT keywordId FROM keyword WHERE keyword = ?;", [keyword]).fetchone() try: if keywordId: self.fyi(" (existing keyword with id: %s)" % keywordId) keywordId = keywordId[0] else: self.fyi(" (new keyword)") self.cur.execute("INSERT INTO keyword(keyword) VALUES (?);", [keyword]) keywordId = self.cur.lastrowid self.con.execute("INSERT INTO notekeyword(noteId, keywordID) VALUES(?, ?)", [noteId, keywordId]) except: self.error("error hooking up keyword %s" % keyword) self.con.commit()
Hook up keywords to a note, first unhooking any existing cross-linking entries.
378,043
def from_offset(self, value): if not self.params: self.params = dict({:value}) return self self.params[] = value return self
The starting from index of the hits to return. Defaults to 0.
378,044
def adjustHeight(self, column): tree = self.treeWidget() if not tree: return w = tree.width() if tree.verticalScrollBar().isVisible(): w -= tree.verticalScrollBar().width() doc = QtGui.QTextDocument() doc.setTextWidth(w) doc.setHtml(self.text(0)) height = doc.documentLayout().documentSize().height() self.setFixedHeight(height+2)
Adjusts the height for this item based on the column and its text. :param column | <int>
378,045
def interpret(self, msg): slides = msg.get(, []) self.cache = msg.get(, ) self.gallery = msg.get(, []) self.finder.interpret(dict(galleries=self.gallery)) slides = [slide for slide in slides] logname = msg.get() if logname: self.write_slide_list(logname, slides) for slide in slides: image = self.draw_slide(slide) heading = slide[][] filename = self.get_image_name(heading) self.cache_image(filename, image) return
Load input
378,046
def dns_resource_reference(self): api_version = self._get_api_version() if api_version == '2018-05-01': from .v2018_05_01.operations import DnsResourceReferenceOperations as OperationClass else: raise NotImplementedError("APIVersion {} is not available".format(api_version)) return OperationClass(self._client, self.config, Serializer(self._models_dict(api_version)), Deserializer(self._models_dict(api_version)))
Instance depends on the API version: * 2018-05-01: :class:`DnsResourceReferenceOperations<azure.mgmt.dns.v2018_05_01.operations.DnsResourceReferenceOperations>`
378,047
def set(self, key, value, **kw): self.impl.set(key, value, **self._get_cache_kw(kw, None))
Place a value in the cache. :param key: the value's key. :param value: the value. :param \**kw: cache configuration arguments.
378,048
def in_use(self): state = State.objects.filter(flow=self).first() return bool(state)
Returns True if there is a :class:`State` object that uses this ``Flow``
378,049
def choice_voters_changed_update_cache( sender, instance, action, reverse, model, pk_set, **kwargs): if action not in (, , ): return if model == User: assert type(instance) == Choice choices = [instance] if pk_set: users = list(User.objects.filter(pk__in=pk_set)) else: users = [] else: if pk_set: choices = list(Choice.objects.filter(pk__in=pk_set)) else: choices = [] users = [instance] from .tasks import update_cache_for_instance for choice in choices: update_cache_for_instance(, choice.pk, choice) for user in users: update_cache_for_instance(, user.pk, user)
Update cache when choice.voters changes.
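A hedged sketch of wiring the handler above to Django's m2m_changed signal, assuming Choice.voters is the ManyToManyField to User (consistent with the handler's logic but not shown in the record):

    # Hedged wiring sketch for the signal handler above.
    from django.db.models.signals import m2m_changed

    m2m_changed.connect(
        choice_voters_changed_update_cache,
        sender=Choice.voters.through,
    )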
378,050
def add(self, layer, verbosity = 0, position = None): layer._verbosity = verbosity layer._maxRandom = self._maxRandom layer.minTarget = 0.0 layer.maxTarget = 1.0 layer.minActivation = 0.0 layer.maxActivation = 1.0 if position == None: self.layers.append(layer) else: self.layers.insert(position, layer) self.layersByName[layer.name] = layer
Adds a layer. Layer verbosity is optional (default 0).
378,051
def generate_enums(basename, xml): directory = os.path.join(basename, ) mavparse.mkdir_p(directory) for en in xml.enum: f = open(os.path.join(directory, en.name+".java"), mode=) t.write(f, , en) f.close()
generate main header per XML file
378,052
def check_output(self, cmd): ret, output = self._call(cmd, True) if ret != 0: raise RemoteCommandFailure(command=cmd, ret=ret) logger.debug("Output: %r", output) return output
Calls a command through SSH and returns its output.
378,053
def get_collection_in_tower(self, key): new = tf.get_collection(key) old = set(self.original.get(key, [])) return [x for x in new if x not in old]
Get items from this collection that are added in the current tower.
378,054
def sign_ssh_challenge(self, blob, identity): msg = _parse_ssh_blob(blob) log.debug(, msg[], msg[], msg[], msg[]) log.debug(, msg[]) fp = msg[][] log.debug(, fp) log.debug(, len(blob)) log.info(, msg[].decode(), identity.to_string(), self.device) with self.device: return self.device.sign(blob=blob, identity=identity)
Sign given blob using a private key on the device.
378,055
def get_kernel_id(self): sessions_url = self.get_session_url() sessions_req = requests.get(sessions_url).content.decode() sessions = json.loads(sessions_req) if os.name == : path = self.path.replace(, ) else: path = self.path for session in sessions: notebook_path = session.get(, {}).get() if notebook_path is not None and notebook_path == path: kernel_id = session[][] return kernel_id
Get the kernel id of the client. Return a str with the kernel id or None.
378,056
def _find_link_target(self, tarinfo): if tarinfo.issym(): linkname = os.path.dirname(tarinfo.name) + "/" + tarinfo.linkname limit = None else: linkname = tarinfo.linkname limit = tarinfo member = self._getmember(linkname, tarinfo=limit, normalize=True) if member is None: raise KeyError("linkname %r not found" % linkname) return member
Find the target member of a symlink or hardlink member in the archive.
378,057
def set(self, name: str, value: Any) -> None: self.agent.set(name, value)
Stores a knowledge item in the agent knowledge base. Args: name (str): name of the item value (Any): value of the item
378,058
def table(self) -> Table: if self._table is not None: return self._table assert self._metadata, ( "Must specify metadata (in constructor or via set_metadata()/" "set_metadata_if_none() before you can get a Table from a " "tablename" ) for table in self._metadata.tables.values(): if table.name == self._tablename: return table raise ValueError("No table named {!r} is present in the " "metadata".format(self._tablename))
Returns a SQLAlchemy :class:`Table` object. This is either the :class:`Table` object that was used for initialization, or one that was constructed from the ``tablename`` plus the ``metadata``.
378,059
def vcsNodeState_originator_switch_info_switchIpV6Address(self, **kwargs): config = ET.Element("config") vcsNodeState = ET.SubElement(config, "vcsNodeState", xmlns="urn:brocade.com:mgmt:brocade-vcs") originator_switch_info = ET.SubElement(vcsNodeState, "originator-switch-info") switchIpV6Address = ET.SubElement(originator_switch_info, "switchIpV6Address") switchIpV6Address.text = kwargs.pop() callback = kwargs.pop(, self._callback) return callback(config)
Auto Generated Code
378,060
def _encode_params(data): if hasattr(data, ): data = dict(data) if hasattr(data, ): result = [] for k, vs in data.items(): for v in isinstance(vs, list) and vs or [vs]: result.append((k.encode() if isinstance(k, unicode) else k, v.encode() if isinstance(v, unicode) else v)) return result, urllib.urlencode(result, doseq=True) else: return data, data
Encode parameters in a piece of data. If the data supplied is a dictionary, encodes each parameter in it, and returns a list of tuples containing the encoded parameters, and a urlencoded version of that. Otherwise, assumes the data is already encoded appropriately, and returns it twice.
378,061
def friedmanchisquare(*args): k = len(args) if k < 3: raise ValueError() n = len(args[0]) data = map(zip, tuple(args)) for i in range(len(data)): data[i] = rankdata(data[i]) ssbn = 0 for i in range(k): ssbn = ssbn + sum(args[i]) ** 2 chisq = 12.0 / (k * n * (k + 1)) * ssbn - 3 * n * (k + 1) return chisq, chisqprob(chisq, k - 1)
Friedman Chi-Square is a non-parametric, one-way within-subjects ANOVA. This function calculates the Friedman Chi-square test for repeated measures and returns the result, along with the associated probability value. It assumes 3 or more repeated measures. Only 3 levels requires a minimum of 10 subjects in the study. Four levels requires 5 subjects per level(??). Usage: lfriedmanchisquare(*args) Returns: chi-square statistic, associated p-value
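A hedged usage sketch: three repeated measures on the same ten subjects, as the docstring recommends; the numbers are illustrative only:

    # Hedged usage sketch with made-up measurements.
    cond_a = [7.0, 9.9, 8.5, 5.1, 10.3, 8.3, 7.7, 8.1, 9.0, 6.5]
    cond_b = [5.3, 5.7, 4.7, 3.5, 7.7, 5.2, 6.2, 5.1, 6.6, 4.4]
    cond_c = [4.7, 7.8, 9.5, 2.7, 6.4, 5.3, 5.7, 6.1, 7.2, 4.9]

    chisq, p = friedmanchisquare(cond_a, cond_b, cond_c)
    print("chi-square = %.3f, p = %.4f" % (chisq, p))
    # scipy.stats.friedmanchisquare offers an independent cross-check.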
378,062
def delete(self): pipeline = self.db.pipeline() self._delete_from_indices(pipeline) self._delete_membership(pipeline) pipeline.delete(self.key()) pipeline.execute()
Deletes the object from the datastore.
378,063
def state(name): contextkey = .format(name) if contextkey in __context__: return __context__[contextkey] __context__[contextkey] = _get_state(inspect_container(name)) return __context__[contextkey]
Returns the state of the container name Container name or ID **RETURN DATA** A string representing the current state of the container (either ``running``, ``paused``, or ``stopped``) CLI Example: .. code-block:: bash salt myminion docker.state mycontainer
378,064
def _pfp__parse(self, stream, save_offset=False): res = super(Enum, self)._pfp__parse(stream, save_offset) if self._pfp__value in self.enum_vals: self.enum_name = self.enum_vals[self._pfp__value] else: self.enum_name = "?? UNK_ENUM ??" return res
Parse the IO stream for this enum :stream: An IO stream that can be read from :returns: The number of bytes parsed
378,065
def stop(self): if not self.device.is_streaming: return self.device.stop_stream() self._writer.close() self._bins = None self._repeats = None self._base_buffer_size = None self._max_buffer_size = None self._buffer_repeats = None self._buffer = None self._tune_delay = None self._reset_stream = None self._psd = None self._writer = None
Stop streaming samples from device and delete samples buffer
378,066
def reconstitute_path(drive, folders): reconstituted = os.path.join(drive, os.path.sep, *folders) return reconstituted
Reverts a tuple from `get_path_components` into a path. :param drive: A drive (eg 'c:'). Only applicable for NT systems :param folders: A list of folder names :return: A path comprising the drive and list of folder names. The path terminate with a `os.path.sep` *only* if it is a root directory
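A hedged round-trip sketch for reconstitute_path; the example components mirror what get_path_components presumably returns, which is an assumption:

    # Hedged usage sketch (first output shown for an NT-style os.path).
    print(reconstitute_path("c:", ["temp", "foo"]))        # -> c:\temp\foo on Windows

    # On POSIX systems the drive is simply the empty string.
    print(reconstitute_path("", ["usr", "local", "bin"]))  # -> /usr/local/bin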
378,067
def map_transaction(txn): if isinstance(txn[], dict): sid = txn[][] symbol = txn[][] else: sid = txn[] symbol = txn[] return {: sid, : symbol, : txn[], : txn[], : txn[], : txn[], : txn[]}
Maps a single transaction row to a dictionary. Parameters ---------- txn : pd.DataFrame A single transaction object to convert to a dictionary. Returns ------- dict Mapped transaction.
378,068
def download_file(cls, url, local_file_name=None, force=False, chunk_size=1024): local_file_name = local_file_name if local_file_name else url.split()[-1] filepath = os.path.join(cls.data_path, local_file_name) if not os.path.exists(filepath) or force: try: headers = requests.head(url, allow_redirects=True).headers length = headers.get() logger.info("Starting download of {} file with {} bytes ...".format(url, length)) widgets = [ , progressbar.Percentage(), , progressbar.Bar(), , progressbar.ETA(), , progressbar.FileTransferSpeed(), ] bar = progressbar.ProgressBar(widgets=widgets, max_value=int(length) + chunk_size).start() r = requests.get(url, stream=True) with open(filepath, ) as f: total_chunk = 0 for chunk in r.iter_content(chunk_size): if chunk: f.write(chunk) total_chunk += chunk_size bar.update(total_chunk) bar.finish() except: if os.path.exists(filepath): os.remove(filepath) raise return filepath
Download file from a given url
378,069
def mget(self, body, doc_type=None, index=None, params=None): if body in SKIP_IN_PATH: raise ValueError("Empty value passed for a required argument 'body'.") return self.transport.perform_request( "GET", _make_path(index, doc_type, "_mget"), params=params, body=body )
Get multiple documents based on an index, type (optional) and ids. `<http://www.elastic.co/guide/en/elasticsearch/reference/current/docs-multi-get.html>`_ :arg body: Document identifiers; can be either `docs` (containing full document information) or `ids` (when index and type is provided in the URL. :arg index: The name of the index :arg _source: True or false to return the _source field or not, or a list of fields to return :arg _source_exclude: A list of fields to exclude from the returned _source field :arg _source_include: A list of fields to extract and return from the _source field :arg preference: Specify the node or shard the operation should be performed on (default: random) :arg realtime: Specify whether to perform the operation in realtime or search mode :arg refresh: Refresh the shard containing the document before performing the operation :arg routing: Specific routing value :arg stored_fields: A comma-separated list of stored fields to return in the response
378,070
def init_app(self, app): if not hasattr(app, ): app.extensions = {} app.extensions[] = self app.add_template_global(self.elems, ) for args in self._renderers: register_renderer(app, *args)
Initialize an application. :param app: A :class:`~flask.Flask` app.
378,071
def idle_print_status(self): now = time.time() if (now - self.last_idle_status_printed_time) >= 10: print(self.status()) self.last_idle_status_printed_time = now
print out statistics every 10 seconds from idle loop
378,072
def pairwise_kernel(self, X, Y): check_is_fitted(self, ) if X.shape[0] != Y.shape[0]: raise ValueError() val = pairwise_continuous_ordinal_kernel(X[self._numeric_columns], Y[self._numeric_columns], self._numeric_ranges) if len(self._nominal_columns) > 0: val += pairwise_nominal_kernel(X[self._nominal_columns].astype(numpy.int8), Y[self._nominal_columns].astype(numpy.int8)) val /= X.shape[0] return val
Function to use with :func:`sklearn.metrics.pairwise.pairwise_kernels` Parameters ---------- X : array, shape = (n_features,) Y : array, shape = (n_features,) Returns ------- similarity : float Similarities are normalized to be within [0, 1]
378,073
def wait_until_page_ready(page_object, timeout=WTF_TIMEOUT_MANAGER.NORMAL): try: do_until(lambda: page_object.webdriver.execute_script("return document.readyState").lower() == 'complete', timeout) except wait_utils.OperationTimeoutError: raise PageUtilOperationTimeoutError( "Timeout occurred while waiting for page to be ready.")
Waits until document.readyState == Complete (e.g. ready to execute javascript commands) Args: page_object (PageObject) : PageObject class Kwargs: timeout (number) : timeout period
378,074
def newgroups_gen(self, timestamp): if timestamp.tzinfo: ts = timestamp.astimezone(date.TZ_GMT) else: ts = timestamp.replace(tzinfo=date.TZ_GMT) args = ts.strftime("%Y%m%d %H%M%S %Z") code, message = self.command("NEWGROUPS", args) if code != 231: raise NNTPReplyError(code, message) for line in self.info_gen(code, message): yield utils.parse_newsgroup(line)
Generator for the NEWGROUPS command. Generates a list of newsgroups created on the server since the specified timestamp. See <http://tools.ietf.org/html/rfc3977#section-7.3> Args: timestamp: Datetime object giving 'created since' datetime. Yields: A tuple containing the name, low water mark, high water mark, and status for the newsgroup. Note: If the datetime object supplied as the timestamp is naive (tzinfo is None) then it is assumed to be given as GMT.
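A hedged usage sketch; `client` stands in for a connected instance of the NNTP class that defines newgroups_gen:

    # Hedged usage sketch with a hypothetical connected client.
    from datetime import datetime, timedelta

    since = datetime.utcnow() - timedelta(days=30)   # naive, so treated as GMT
    for name, low, high, status in client.newgroups_gen(since):
        print(name, low, high, status)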
378,075
def autocomplete(query, country=None, hurricanes=False, cities=True, timeout=5): data = {} data[] = quote(query) data[] = country or data[] = 1 if hurricanes else 0 data[] = 1 if cities else 0 data[] = r = requests.get(AUTOCOMPLETE_URL.format(**data), timeout=timeout) results = json.loads(r.content)[] return results
Make an autocomplete API request This can be used to find cities and/or hurricanes by name :param string query: city :param string country: restrict search to a specific country. Must be a two letter country code :param boolean hurricanes: whether to search for hurricanes or not :param boolean cities: whether to search for cities or not :param integer timeout: timeout of the api request :returns: result of the autocomplete API request :rtype: dict
378,076
def update(self): self._controller.update(self._id, wake_if_asleep=False) data = self._controller.get_charging_params(self._id) if data and (time.time() - self.__manual_update_time > 60): self.__maxrange_state = data[]
Update the status of the range setting.
378,077
def cluster_centers_(self): for attr in (,): try: return getattr(self.estimator, attr) except AttributeError: continue raise AttributeError( "could not find or make cluster_centers_ for {}".format( self.estimator.__class__.__name__ ))
Searches for or creates cluster centers for the specified clustering algorithm. This algorithm ensures that the centers are appropriately drawn and scaled so that distances between clusters are maintained.
378,078
def writeSentence(self, cmd, *words): encoded = self.encodeSentence(cmd, *words) self.log(, cmd, *words) self.transport.write(encoded)
Write encoded sentence. :param cmd: Command word. :param words: Additional words.
378,079
def example_value(self): from .serializable import Serializable inst = self._static_example_value() if inst is tr.Undefined and issubclass(self.klass, Serializable): return self.klass.example_instance() return inst
If we're an instance of a Serializable, fall back to its `example_instance()` method.
378,080
def log_call(call_name): def decorator(f): @wraps(f) def wrapper(*args, **kw): instance = args[0] instance.logger.info(call_name, {"content": request.get_json()}) return f(*args, **kw) return wrapper return decorator
Log the API call to the logger.
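A hedged sketch of applying the decorator above; it assumes the decorated method lives on an object exposing a `logger` attribute and runs inside a Flask request context (both inferred from the wrapper body):

    # Hedged usage sketch; the resource class below is hypothetical.
    class WebhookResource(object):
        def __init__(self, logger):
            self.logger = logger

        @log_call("webhook.received")
        def post(self):
            # request.get_json() is logged by the decorator before this runs
            return "ok"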
378,081
def plot_grid(step): rad = get_rprof(step, )[0] drad = get_rprof(step, )[0] _, unit = step.sdat.scale(1, ) if unit: unit = .format(unit) fig, (ax1, ax2) = plt.subplots(2, sharex=True) ax1.plot(rad, ) ax1.set_ylabel( + unit) ax2.plot(drad, ) ax2.set_ylabel( + unit) ax2.set_xlim([-0.5, len(rad) - 0.5]) ax2.set_xlabel() misc.saveplot(fig, , step.istep)
Plot cell position and thickness. The figure is called grid_N.pdf where N is replaced by the step index. Args: step (:class:`~stagpy.stagyydata._Step`): a step of a StagyyData instance.
378,082
def convert_column(self, values): assert all(values >= 0), total = sum(values) if total > 0: return values / total else: return values
Normalize values.
378,083
def positionToIntensityUncertaintyForPxGroup(image, std, y0, y1, x0, x1): fy, fx = y1 - y0, x1 - x0 if fy != fx: raise Exception() image = _coarsenImage(image, fx) k = _kSizeFromStd(std) y0 = int(round(y0 / fy)) x0 = int(round(x0 / fx)) arr = image[y0 - k:y0 + k, x0 - k:x0 + k] U = positionToIntensityUncertainty(arr, std / fx, std / fx) return U[k:-k, k:-k]
like positionToIntensityUncertainty but calculates the average uncertainty for an area [y0:y1, x0:x1]
378,084
def jwt_optional(fn): @wraps(fn) def wrapper(*args, **kwargs): verify_jwt_in_request_optional() return fn(*args, **kwargs) return wrapper
A decorator to optionally protect a Flask endpoint If an access token in present in the request, this will call the endpoint with :func:`~flask_jwt_extended.get_jwt_identity` having the identity of the access token. If no access token is present in the request, this endpoint will still be called, but :func:`~flask_jwt_extended.get_jwt_identity` will return `None` instead. If there is an invalid access token in the request (expired, tampered with, etc), this will still call the appropriate error handler instead of allowing the endpoint to be called as if there is no access token in the request.
378,085
def get_percentile(self, percentile): assert 0 <= percentile <= 100, \ .format(percentile) return self._percentile(self._values, percentile)
Get a value representing the input percentile of the Data Collection. Args: percentile: A float value from 0 to 100 representing the requested percentile. Return: The Data Collection value at the input percentile
378,086
def generate(basename, xml): if basename.endswith(): filename = basename else: filename = basename + msgs = [] enums = [] filelist = [] for x in xml: msgs.extend(x.message) enums.extend(x.enum) filelist.append(os.path.basename(x.filename)) for m in msgs: if xml[0].little_endian: m.fmtstr = else: m.fmtstr = for f in m.ordered_fields: m.fmtstr += mavfmt(f) m.order_map = [ 0 ] * len(m.fieldnames) for i in range(0, len(m.fieldnames)): m.order_map[i] = m.ordered_fieldnames.index(m.fieldnames[i]) print("Generating %s" % filename) outf = open(filename, "w") generate_preamble(outf) generate_msg_table(outf, msgs) generate_body_fields(outf) for m in msgs: generate_msg_fields(outf, m) for m in msgs: generate_payload_dissector(outf, m) generate_packet_dis(outf) generate_epilog(outf) outf.close() print("Generated %s OK" % filename)
generate complete python implementation
378,087
def CheckBreakpointsExpiration(self): with self._lock: current_time = BreakpointsManager.GetCurrentTime() if self._next_expiration > current_time: return expired_breakpoints = [] self._next_expiration = datetime.max for breakpoint in six.itervalues(self._active): expiration_time = breakpoint.GetExpirationTime() if expiration_time <= current_time: expired_breakpoints.append(breakpoint) else: self._next_expiration = min(self._next_expiration, expiration_time) for breakpoint in expired_breakpoints: breakpoint.ExpireBreakpoint()
Completes all breakpoints that have been active for too long.
378,088
def set_link(self, link,y=0,page=-1): "Set destination of internal link" if(y==-1): y=self.y if(page==-1): page=self.page self.links[link]=[page,y]
Set destination of internal link
378,089
def to_xdr_object(self): trustor = account_xdr_object(self.trustor) length = len(self.asset_code) assert length <= 12 pad_length = 4 - length if length <= 4 else 12 - length asset_code = bytearray(self.asset_code, ) + b * pad_length asset = Xdr.nullclass() if len(asset_code) == 4: asset.type = Xdr.const.ASSET_TYPE_CREDIT_ALPHANUM4 asset.assetCode4 = asset_code else: asset.type = Xdr.const.ASSET_TYPE_CREDIT_ALPHANUM12 asset.assetCode12 = asset_code allow_trust_op = Xdr.types.AllowTrustOp(trustor, asset, self.authorize) self.body.type = Xdr.const.ALLOW_TRUST self.body.allowTrustOp = allow_trust_op return super(AllowTrust, self).to_xdr_object()
Creates an XDR Operation object that represents this :class:`AllowTrust`.
378,090
def serialize_to_flat(self, name, datas): keys = datas.get(, None) values = datas.get(, None) splitter = datas.get(, self._DEFAULT_SPLITTER) if not keys: msg = ("Flat reference lacks of required variable or " "is empty") raise SerializerError(msg.format(name)) else: keys = self.value_splitter(name, , keys, mode=splitter) if not values: msg = ("Flat reference lacks of required variable " "or is empty") raise SerializerError(msg.format(name)) else: values = self.value_splitter(name, , values, mode=splitter) if len(values) != len(keys): msg = ("Flat reference have different length of ands " " variable") raise SerializerError(msg.format(name)) return OrderedDict(zip(keys, values))
Serialize given datas to a flat structure ``KEY:VALUE`` where ``KEY`` comes from ``keys`` variable and ``VALUE`` comes from ``values`` variable. This means both ``keys`` and ``values`` are required variables to be correctly filled (each one is a string of items separated with an empty space). Both resulting lists must be the same length. Arguments: name (string): Name only used inside possible exception message. datas (dict): Datas to serialize. Returns: dict: Flat dictionary of serialized reference datas.
378,091
def macshim(): import subprocess, sys subprocess.call([ sys.argv[0] + ]+sys.argv[1:], env={"VERSIONER_PYTHON_PREFER_32_BIT":"yes"} )
Shim to run 32-bit on 64-bit mac as a sub-process
378,092
def title(self): tmp = c.namemap_lookup(self.id) if c.namemap_lookup(self.id) is not None else self._title return secure_filename(tmp)
get title of this node. If an entry for this course is found in the configuration namemap it is used, otherwise the default value from stud.ip is used.
378,093
def _set_member_entry(self, v, load=False): if hasattr(v, "_utype"): v = v._utype(v) try: t = YANGDynClass(v,base=YANGListType("member",member_entry.member_entry, yang_name="member-entry", rest_name="member-entry", parent=self, is_container=, user_ordered=False, path_helper=self._path_helper, yang_keys=, extensions={u: {u: u, u: None, u: u, u: None, u: None}}), is_container=, yang_name="member-entry", rest_name="member-entry", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u: {u: u, u: None, u: u, u: None, u: None}}, namespace=, defining_module=, yang_type=, is_config=True) except (TypeError, ValueError): raise ValueError({ : , : "list", : , }) self.__member_entry = t if hasattr(self, ): self._set()
Setter method for member_entry, mapped from YANG variable /rbridge_id/secpolicy/defined_policy/policies/member_entry (list) If this variable is read-only (config: false) in the source YANG file, then _set_member_entry is considered as a private method. Backends looking to populate this variable should do so via calling thisObj._set_member_entry() directly.
378,094
def readlist(self): printtime(, self.start) for i in range(self.cpus): threads = Thread(target=self.listread, args=()) threads.setDaemon(True) threads.start() for sample in self.runmetadata.samples: self.listqueue.put(sample) self.listqueue.join() self.fastqfilter()
Sort the reads, and create lists to be used in creating sorted .fastq files
378,095
def length( cls, request, vector: (Ptypes.body, Vector())) -> [ (200, , Float), (400, )]: log.info(.format(vector)) try: Respond(200, sqrt(vector[] ** 2 + vector[] ** 2 + vector.get(, 0) ** 2)) except ValueError: Respond(400)
Return the modulus (length) of a vector.
378,096
def auto2unicode(text): _all_unique_encodes_, _all_common_encodes_ = _get_unique_common_encodes() unique_chars = _get_unique_ch(text, _all_common_encodes_) clen = len(_all_common_encodes_) msg = "Sorry, couldnNeed more words to find unique encode out side of %d common compound characters'
This function tries to identify encode in available encodings. If it finds, then it will convert text into unicode string. Author : Arulalan.T 04.08.2014
378,097
async def download_file( self, input_location, file=None, *, part_size_kb=None, file_size=None, progress_callback=None, dc_id=None): if not part_size_kb: if not file_size: part_size_kb = 64 else: part_size_kb = utils.get_appropriated_part_size(file_size) part_size = int(part_size_kb * 1024) if part_size % 4096 != 0: raise ValueError( ) in_memory = file is None or file is bytes if in_memory: f = io.BytesIO() elif isinstance(file, str): config = await self(functions.help.GetConfigRequest()) for option in config.dc_options: if option.ip_address == self.session.server_address: self.session.set_dc( option.id, option.ip_address, option.port) self.session.save() break sender = self._sender exported = False else: sender = self._sender self._log[__name__].info(, part_size) try: offset = 0 while True: try: result = await sender.send(functions.upload.GetFileRequest( input_location, offset, part_size )) if isinstance(result, types.upload.FileCdnRedirect): raise NotImplementedError except errors.FileMigrateError as e: self._log[__name__].info() sender = await self._borrow_exported_sender(e.new_dc) exported = True continue offset += part_size if not result.bytes: if in_memory: f.flush() return f.getvalue() else: return getattr(result, , ) self._log[__name__].debug(, len(result.bytes)) f.write(result.bytes) if progress_callback: progress_callback(f.tell(), file_size) finally: if exported: await self._return_exported_sender(sender) elif sender != self._sender: await sender.disconnect() if isinstance(file, str) or in_memory: f.close()
Downloads the given input location to a file. Args: input_location (:tl:`InputFileLocation`): The file location from which the file will be downloaded. See `telethon.utils.get_input_location` source for a complete list of supported types. file (`str` | `file`, optional): The output file path, directory, or stream-like object. If the path exists and is a file, it will be overwritten. If the file path is ``None`` or ``bytes``, then the result will be saved in memory and returned as `bytes`. part_size_kb (`int`, optional): Chunk size when downloading files. The larger, the less requests will be made (up to 512KB maximum). file_size (`int`, optional): The file size that is about to be downloaded, if known. Only used if ``progress_callback`` is specified. progress_callback (`callable`, optional): A callback function accepting two parameters: ``(downloaded bytes, total)``. Note that the ``total`` is the provided ``file_size``. dc_id (`int`, optional): The data center the library should connect to in order to download the file. You shouldn't worry about this.
378,098
def read_job(self, job_id, checkout=False): self.job_id = job_id commit = self.get_head_commit() self.logger.debug( + commit) self.command_exec([, self.ref_head]) if checkout: self.logger.debug( + self.work_tree) if os.path.exists(self.work_tree): shutil.rmtree(self.work_tree) os.makedirs(self.work_tree) self.command_exec([, self.work_tree, , self.ref_head, , ])
Reads head and reads the tree into index, and checkout the work-tree when checkout=True. This does not fetch the job from the actual server. It needs to be in the local git already.
378,099
def on_post(self, req, resp): grant_type = req.get_param() password = req.get_param() username = req.get_param() resp.disable_caching() if not grant_type or not password or not username: resp.status = falcon.HTTP_400 resp.serialize({ : , : , : , }) elif grant_type != : resp.status = falcon.HTTP_400 resp.serialize({ : , : % grant_type, : , }) else: try: token = self.auth_creds(username, password) resp.serialize({ : token, : , }) except AuthRejected as exc: resp.status = falcon.HTTP_401 resp.set_header(, self._realm) resp.serialize({ : , : exc.detail, })
Validate the access token request for spec compliance The spec also dictates the JSON based error response on failure & is handled in this responder.