code (stringlengths 4-4.48k) | docstring (stringlengths 1-6.45k) | _id (stringlengths 24) |
---|---|---|
class handlers(_Stacked): <NEW_LINE> <INDENT> def __init__(self, *bindings): <NEW_LINE> <INDENT> for t, c in bindings: <NEW_LINE> <INDENT> if not (((isinstance(t, tuple) and all(safeissubclass(x, BaseException) for x in t)) or safeissubclass(t, BaseException)) and callable(c)): <NEW_LINE> <INDENT> error(TypeError("Each binding must be of the form (type, callable) or ((t0, ..., tn), callable)")) <NEW_LINE> <DEDENT> <DEDENT> super().__init__(bindings) <NEW_LINE> self.dq = _stacks.handlers | Set up condition handlers. Known as `HANDLER-BIND` in Common Lisp.
Usage::
with handlers((cls, callable), ...):
...
where `cls` is a condition type (class), or a `tuple` of such types,
just like in `except`.
The `callable` may optionally accept one positional argument, the condition
instance (like an `except ... as ...` clause). If you don't need data from
the condition object (just using its type for control purposes, like an
`except ...` clause), the handler doesn't need to accept any arguments.
To *handle* the condition, a handler must call `invoke()` for one
of the restarts currently in scope. This immediately terminates the handler,
transferring control to the restart.
To cancel, and delegate to the next (outer) handler for the same condition
type, a handler may return normally without calling `invoke()`. The
return value of the handler is ignored. Any side effects the canceled
handler performed (such as logging), up to the point where it returned,
still occur.
**Notes**
If you use only `with handlers` and `error` (no restarts), the conditions
system reduces to an exceptions system. The `error` function plays the
role of `raise`, and `with handlers` plays the role of `try/except`.
If that's all you need, just use exceptions - the purpose of the conditions
system is to allow customizing the semantics.
Also, the condition system does not have a `finally` form. For that, use
the usual `try/finally`; it works fine with conditions, too. Just keep
in mind that the call stack unwinding actually occurs later than usual.
The `finally` block will fire at unwind time, as usual.
(Exception systems often perform double duty, providing both a throw/catch
mechanism and an `unwind-protect` mechanism. This conditions system provides
only a resumable throw/catch/restart mechanism.) | 625990753539df3088ecdbea |
class FenwickTree: <NEW_LINE> <INDENT> def __init__(self, n): <NEW_LINE> <INDENT> self.n = n <NEW_LINE> self.arr = [0] * (n + 1) <NEW_LINE> <DEDENT> def add(self, i, v): <NEW_LINE> <INDENT> i += 1 <NEW_LINE> while i <= self.n: <NEW_LINE> <INDENT> self.arr[i] += v <NEW_LINE> i += i & -i <NEW_LINE> <DEDENT> <DEDENT> def sum_to(self, i): <NEW_LINE> <INDENT> i += 1 <NEW_LINE> s = 0 <NEW_LINE> while i > 0: <NEW_LINE> <INDENT> s += self.arr[i] <NEW_LINE> i -= i & -i <NEW_LINE> <DEDENT> return s <NEW_LINE> <DEDENT> def sum_range(self, i, j): <NEW_LINE> <INDENT> return self.sum_to(j) - self.sum_to(i) | Summed array with O(log(N)) edits and O(log(N)) sums.
Is 1-indexed internally but presents a 0-indexed interface. | 62599075460517430c432d02 |
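A minimal usage sketch for the FenwickTree row above, assuming its flattened code is restored to normal Python; the values are illustrative:

```python
# Usage sketch for the FenwickTree above; values are illustrative.
ft = FenwickTree(8)              # eight 0-indexed slots
ft.add(3, 5)                     # slot 3 += 5
ft.add(5, 2)                     # slot 5 += 2
assert ft.sum_to(4) == 5         # inclusive prefix sum over slots 0..4
assert ft.sum_to(5) == 7         # inclusive prefix sum over slots 0..5
assert ft.sum_range(3, 5) == 2   # sum_to(5) - sum_to(3): covers slots 4..5
```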
class CubeAxesActor2D(tvtk.CubeAxesActor2D): <NEW_LINE> <INDENT> use_data_bounds = true <NEW_LINE> input_info = PipelineInfo(datasets=['any'], attribute_types=['any'], attributes=['any']) <NEW_LINE> traits_view = View(Group( Group( Item('visibility'), HGroup( Item('x_axis_visibility', label='X axis'), Item('y_axis_visibility', label='Y axis'), Item('z_axis_visibility', label='Z axis'), ), show_border=True, label='Visibity'), Group( Item('use_ranges'), HGroup( Item('ranges', enabled_when='use_ranges'), ), show_border=True), Group( Item('use_data_bounds'), HGroup( Item('bounds', enabled_when='not use_data_bounds'), ), show_border=True), Group( Item('x_label'), Item('y_label'), Item('z_label'), Item('label_format'), Item('number_of_labels'), Item('font_factor'), show_border=True), HGroup(Item('show_actual_bounds', label='Use size bigger than screen', editor=BooleanEditor())), Item('fly_mode'), Item('corner_offset'), Item('layer_number'), springy=True, ), scrollable=True, resizable=True, ) | Just has a different view than the tvtk.CubesAxesActor2D, with an
additional tick box. | 62599075aad79263cf43010b |
class iGPReaderCase(unittest.TestCase): <NEW_LINE> <INDENT> def setUp(self) -> None: <NEW_LINE> <INDENT> self.igp_reader = iGPReader(test_data_path() / "test_implicit_to_explicit.gid", project_name="test-implicit_to_explicit" ) <NEW_LINE> <DEDENT> def test_implicit_to_explicit(self): <NEW_LINE> <INDENT> self.igp_reader.build_mesh_data() <NEW_LINE> self.igp_reader.implicit_to_explicit() | This class tests the iGPReader[PyFLOTRAN.readers.iGPReader.io.iGPReader.iGPReader] class implementation | 6259907555399d3f05627e6c |
class OperationProgress(_messages.Message): <NEW_LINE> <INDENT> class StatusValueValuesEnum(_messages.Enum): <NEW_LINE> <INDENT> STATUS_UNSPECIFIED = 0 <NEW_LINE> PENDING = 1 <NEW_LINE> RUNNING = 2 <NEW_LINE> DONE = 3 <NEW_LINE> ABORTING = 4 <NEW_LINE> <DEDENT> metrics = _messages.MessageField('Metric', 1, repeated=True) <NEW_LINE> name = _messages.StringField(2) <NEW_LINE> stages = _messages.MessageField('OperationProgress', 3, repeated=True) <NEW_LINE> status = _messages.EnumField('StatusValueValuesEnum', 4) | Information about operation (or operation stage) progress.
Enums:
StatusValueValuesEnum: Status of an operation stage. Unset for single-
stage operations.
Fields:
metrics: Progress metric bundle, for example: metrics: [{name: "nodes
done", int_value: 15}, {name: "nodes total",
int_value: 32}] or metrics: [{name: "progress", double_value:
0.56}, {name: "progress scale", double_value: 1.0}]
name: A non-parameterized string describing an operation stage. Unset for
single-stage operations.
stages: Substages of an operation or a stage.
status: Status of an operation stage. Unset for single-stage operations. | 6259907501c39578d7f143de |
@enum.unique <NEW_LINE> class VideoLength(enum.IntEnum): <NEW_LINE> <INDENT> three_minutes = 1 <NEW_LINE> five_minutes = 2 <NEW_LINE> ten_minutes = 3 | Dashcam video length | 62599075baa26c4b54d50c02 |
class SchemaError(Exception): <NEW_LINE> <INDENT> pass | Raised on schema validation error. | 62599075e1aae11d1e7cf4b8 |
class TYPE(BASE, mixin.BasicMixin, mixin.UserMixin, mixin.ExtraDataMixin, mixin.TimestampMixin): <NEW_LINE> <INDENT> pass | TYPE that defines the data type.
It is usually distinguished according to the definitions used in VFX production, e.g.: plt, mod, cam, flip, pyro, ani, srf, tex... | 625990755fc7496912d48f13 |
class NoGPUEmptySolver(object): <NEW_LINE> <INDENT> def __init__(self, max_N, bend_coefs): <NEW_LINE> <INDENT> d = 3 <NEW_LINE> self.max_N = max_N <NEW_LINE> self.bend_coefs = bend_coefs <NEW_LINE> self.cur_solver = None <NEW_LINE> <DEDENT> def get_solver(self, x_na, K_nn, bend_coefs, rot_coef): <NEW_LINE> <INDENT> n,d = x_na.shape <NEW_LINE> assert len(bend_coefs) <= len(self.bend_coefs) <NEW_LINE> assert n <= self.max_N <NEW_LINE> if not self.cur_solver is None: <NEW_LINE> <INDENT> self.cur_solver.valid = False <NEW_LINE> <DEDENT> Q = np.c_[np.ones((n, 1)), x_na, K_nn] <NEW_LINE> A = np.r_[np.zeros((d+1, d+1)), np.c_[np.ones((n, 1)), x_na]].T <NEW_LINE> R = np.zeros((n+d+1, d)) <NEW_LINE> R[1:d+1, :d] = np.diag(rot_coef) <NEW_LINE> n_cnts = A.shape[0] <NEW_LINE> _u,_s,_vh = np.linalg.svd(A.T) <NEW_LINE> N = _u[:,n_cnts:] <NEW_LINE> QN = Q.dot(N) <NEW_LINE> NR = N.T.dot(R) <NEW_LINE> NON = {} <NEW_LINE> for i, b in enumerate(bend_coefs): <NEW_LINE> <INDENT> O_b = np.zeros((n+d+1, n+d+1), np.float64) <NEW_LINE> O_b[d+1:, d+1:] += b * K_nn <NEW_LINE> O_b[1:d+1, 1:d+1] += np.diag(rot_coef) <NEW_LINE> NON[b] = N.T.dot(O_b.dot(N)) <NEW_LINE> <DEDENT> self.cur_solver = NoGPUTPSSolver(bend_coefs, N, QN, NON, NR, x_na, K_nn, rot_coef) <NEW_LINE> return self.cur_solver | computes solution params and returns a NoGPUTPSSolver | 625990757047854f46340d0c |
class VariableDistribution(base.PlotterBase): <NEW_LINE> <INDENT> def plot(self): <NEW_LINE> <INDENT> fig, ax = plt.subplots() <NEW_LINE> variable = self.graph.settings['variable'] <NEW_LINE> for run in self.runs: <NEW_LINE> <INDENT> data = run.get_dataset("stats-performance-raw-*.csv") <NEW_LINE> data = data[variable] <NEW_LINE> ecdf = sm.distributions.ECDF(data) <NEW_LINE> ax.plot(ecdf.x, ecdf.y, drawstyle='steps', linewidth=2) <NEW_LINE> <DEDENT> ax.set_xlabel(variable.capitalize()) <NEW_LINE> ax.set_ylabel('Kumulativna verjetnost') <NEW_LINE> ax.grid() <NEW_LINE> ax.axis((0, None, 0, 1.01)) <NEW_LINE> self.convert_axes_to_bw(ax) <NEW_LINE> fig.savefig(self.get_figure_filename()) | Draws distribution of some variable over nodes | 62599075a05bb46b3848bdd5 |
class Host(object): <NEW_LINE> <INDENT> def __init__(self, hostName): <NEW_LINE> <INDENT> self.hostName = hostName <NEW_LINE> <DEDENT> def getHostName(self): <NEW_LINE> <INDENT> return self.hostName | openstack entity, including instances and compute hosts | 62599075bf627c535bcb2e21 |
class Overlord(Zerg): <NEW_LINE> <INDENT> def __init__(self, ticks, refined_minerals): <NEW_LINE> <INDENT> self.maps = {} <NEW_LINE> self.zerg = {} <NEW_LINE> self.deployedzerg = {} <NEW_LINE> self.refined_minerals = refined_minerals <NEW_LINE> self.ticks = ticks <NEW_LINE> self.dashboard = Dashboard.update <NEW_LINE> for _ in range(6): <NEW_LINE> <INDENT> ztype = Miner() <NEW_LINE> self.zerg[id(ztype)] = ztype <NEW_LINE> self.refined_minerals -= 9 <NEW_LINE> <DEDENT> <DEDENT> def add_map(self, map_id, summary): <NEW_LINE> <INDENT> self.maps[map_id] = summary <NEW_LINE> <DEDENT> def action(self): <NEW_LINE> <INDENT> if self.ticks < 10: <NEW_LINE> <INDENT> return 'RETURN {}'.format(choice(list(self.zerg.keys()))) <NEW_LINE> <DEDENT> act = randint(0, 3) <NEW_LINE> self.ticks -= 1 <NEW_LINE> for i in self.zerg.keys(): <NEW_LINE> <INDENT> if self.zerg[i].getme is True: <NEW_LINE> <INDENT> return 'RETURN {}'.format(choice(list(self.zerg.keys()))) <NEW_LINE> <DEDENT> <DEDENT> if act == 0: <NEW_LINE> <INDENT> return 'RETURN {}'.format(choice(list(self.zerg.keys()))) <NEW_LINE> <DEDENT> elif act == 1 or act == 2: <NEW_LINE> <INDENT> return 'DEPLOY {} {}'.format(choice(list(self.zerg.keys())), choice(list(self.maps.keys()))) <NEW_LINE> <DEDENT> return 'NONE' | The Overlord class is used to control the drones and visualize the map
in order to determine whether the map is worth mining. | 62599075167d2b6e312b823b |
class TextLexer(Lexer): <NEW_LINE> <INDENT> name = 'Text only' <NEW_LINE> aliases = ['text'] <NEW_LINE> filenames = ['*.txt'] <NEW_LINE> mimetypes = ['text/plain'] <NEW_LINE> def get_tokens_unprocessed(self, text): <NEW_LINE> <INDENT> yield 0, Text, text | "Null" lexer, doesn't highlight anything. | 6259907544b2445a339b7608 |
class FinishedJobExeMetrics(object): <NEW_LINE> <INDENT> def __init__(self): <NEW_LINE> <INDENT> self.completed_metrics = JobExeMetricsByType() <NEW_LINE> self.failed_alg_metrics = JobExeMetricsByType() <NEW_LINE> self.failed_data_metrics = JobExeMetricsByType() <NEW_LINE> self.failed_system_metrics = JobExeMetricsByType() <NEW_LINE> <DEDENT> @property <NEW_LINE> def count(self): <NEW_LINE> <INDENT> failed_count = self.failed_alg_metrics.total_count <NEW_LINE> failed_count += self.failed_data_metrics.total_count <NEW_LINE> failed_count += self.failed_system_metrics.total_count <NEW_LINE> return self.completed_metrics.total_count + failed_count <NEW_LINE> <DEDENT> def add_job_execution(self, job_exe): <NEW_LINE> <INDENT> if job_exe.status == 'COMPLETED': <NEW_LINE> <INDENT> self.completed_metrics.add_job_execution(job_exe) <NEW_LINE> <DEDENT> elif job_exe.status == 'FAILED': <NEW_LINE> <INDENT> if job_exe.error_category == 'ALGORITHM': <NEW_LINE> <INDENT> self.failed_alg_metrics.add_job_execution(job_exe) <NEW_LINE> <DEDENT> elif job_exe.error_category == 'DATA': <NEW_LINE> <INDENT> self.failed_data_metrics.add_job_execution(job_exe) <NEW_LINE> <DEDENT> elif job_exe.error_category == 'SYSTEM': <NEW_LINE> <INDENT> self.failed_system_metrics.add_job_execution(job_exe) <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> def generate_status_json(self, json_dict): <NEW_LINE> <INDENT> completed_dict = {} <NEW_LINE> self.completed_metrics.generate_status_json(completed_dict) <NEW_LINE> alg_dict = {} <NEW_LINE> data_dict = {} <NEW_LINE> system_dict = {} <NEW_LINE> failed_count = self.failed_alg_metrics.total_count <NEW_LINE> failed_count += self.failed_data_metrics.total_count <NEW_LINE> failed_count += self.failed_system_metrics.total_count <NEW_LINE> self.failed_alg_metrics.generate_status_json(alg_dict) <NEW_LINE> self.failed_data_metrics.generate_status_json(data_dict) <NEW_LINE> self.failed_system_metrics.generate_status_json(system_dict) <NEW_LINE> failed_dict = {'total': failed_count, 'algorithm': alg_dict, 'data': data_dict, 'system': system_dict} <NEW_LINE> json_dict['completed'] = completed_dict <NEW_LINE> json_dict['failed'] = failed_dict <NEW_LINE> <DEDENT> def subtract_metrics(self, metrics): <NEW_LINE> <INDENT> self.completed_metrics.subtract_metrics(metrics.completed_metrics) <NEW_LINE> self.failed_alg_metrics.subtract_metrics(metrics.failed_alg_metrics) <NEW_LINE> self.failed_data_metrics.subtract_metrics(metrics.failed_data_metrics) <NEW_LINE> self.failed_system_metrics.subtract_metrics(metrics.failed_system_metrics) | This class holds metrics for finished job executions | 625990754e4d562566373d5c |
class Body3(Model): <NEW_LINE> <INDENT> def __init__(self, id: int=None, username: str=None, first_name: str=None, last_name: str=None, email: str=None, password: str=None, phone: str=None, user_status: int=None): <NEW_LINE> <INDENT> self.swagger_types = { 'id': int, 'username': str, 'first_name': str, 'last_name': str, 'email': str, 'password': str, 'phone': str, 'user_status': int } <NEW_LINE> self.attribute_map = { 'id': 'id', 'username': 'username', 'first_name': 'firstName', 'last_name': 'lastName', 'email': 'email', 'password': 'password', 'phone': 'phone', 'user_status': 'userStatus' } <NEW_LINE> self._id = id <NEW_LINE> self._username = username <NEW_LINE> self._first_name = first_name <NEW_LINE> self._last_name = last_name <NEW_LINE> self._email = email <NEW_LINE> self._password = password <NEW_LINE> self._phone = phone <NEW_LINE> self._user_status = user_status <NEW_LINE> <DEDENT> @classmethod <NEW_LINE> def from_dict(cls, dikt) -> 'Body3': <NEW_LINE> <INDENT> return deserialize_model(dikt, cls) <NEW_LINE> <DEDENT> @property <NEW_LINE> def id(self) -> int: <NEW_LINE> <INDENT> return self._id <NEW_LINE> <DEDENT> @id.setter <NEW_LINE> def id(self, id: int): <NEW_LINE> <INDENT> self._id = id <NEW_LINE> <DEDENT> @property <NEW_LINE> def username(self) -> str: <NEW_LINE> <INDENT> return self._username <NEW_LINE> <DEDENT> @username.setter <NEW_LINE> def username(self, username: str): <NEW_LINE> <INDENT> self._username = username <NEW_LINE> <DEDENT> @property <NEW_LINE> def first_name(self) -> str: <NEW_LINE> <INDENT> return self._first_name <NEW_LINE> <DEDENT> @first_name.setter <NEW_LINE> def first_name(self, first_name: str): <NEW_LINE> <INDENT> self._first_name = first_name <NEW_LINE> <DEDENT> @property <NEW_LINE> def last_name(self) -> str: <NEW_LINE> <INDENT> return self._last_name <NEW_LINE> <DEDENT> @last_name.setter <NEW_LINE> def last_name(self, last_name: str): <NEW_LINE> <INDENT> self._last_name = last_name <NEW_LINE> <DEDENT> @property <NEW_LINE> def email(self) -> str: <NEW_LINE> <INDENT> return self._email <NEW_LINE> <DEDENT> @email.setter <NEW_LINE> def email(self, email: str): <NEW_LINE> <INDENT> self._email = email <NEW_LINE> <DEDENT> @property <NEW_LINE> def password(self) -> str: <NEW_LINE> <INDENT> return self._password <NEW_LINE> <DEDENT> @password.setter <NEW_LINE> def password(self, password: str): <NEW_LINE> <INDENT> self._password = password <NEW_LINE> <DEDENT> @property <NEW_LINE> def phone(self) -> str: <NEW_LINE> <INDENT> return self._phone <NEW_LINE> <DEDENT> @phone.setter <NEW_LINE> def phone(self, phone: str): <NEW_LINE> <INDENT> self._phone = phone <NEW_LINE> <DEDENT> @property <NEW_LINE> def user_status(self) -> int: <NEW_LINE> <INDENT> return self._user_status <NEW_LINE> <DEDENT> @user_status.setter <NEW_LINE> def user_status(self, user_status: int): <NEW_LINE> <INDENT> self._user_status = user_status | NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually. | 62599075460517430c432d03 |
class PamIdentity(object): <NEW_LINE> <INDENT> def authenticate(self, username, password, **kwargs): <NEW_LINE> <INDENT> if pam.authenticate(username, password): <NEW_LINE> <INDENT> metadata = {} <NEW_LINE> if username == 'root': <NEW_LINE> <INDENT> metadata['is_admin'] = True <NEW_LINE> <DEDENT> tenant = {'id': username, 'name': username} <NEW_LINE> user = {'id': username, 'name': username} <NEW_LINE> return (tenant, user, metadata) <NEW_LINE> <DEDENT> <DEDENT> def get_tenants(self, username): <NEW_LINE> <INDENT> return [{'id': username, 'name': username}] | Very basic identity based on PAM.
Tenant is always the same as User; the root user has the admin role. | 6259907597e22403b383c858 |
class UpdateUserProfileView(UpdateView): <NEW_LINE> <INDENT> model = UserProfile <NEW_LINE> form_class = UpdateUserProfileForm <NEW_LINE> template_name = 'user/form.html' <NEW_LINE> def dispatch(self, request, *args, **kwargs): <NEW_LINE> <INDENT> self.obj = self.get_object() <NEW_LINE> if (self.obj.id == request.user.id or self.request.user.is_admin) and self.obj.is_developer: <NEW_LINE> <INDENT> return super(UpdateUserProfileView, self).dispatch( request, *args, **kwargs ) <NEW_LINE> <DEDENT> if self.obj.id == request.user.id and self.request.user.is_admin: <NEW_LINE> <INDENT> return super(UpdateUserProfileView, self).dispatch( request, *args, **kwargs ) <NEW_LINE> <DEDENT> return HttpResponseRedirect(self.get_success_url()) <NEW_LINE> <DEDENT> def get_success_url(self): <NEW_LINE> <INDENT> return reverse('home') <NEW_LINE> <DEDENT> def get(self, request, *args, **kwargs): <NEW_LINE> <INDENT> context = { 'form': UpdateUserProfileForm(instance=self.obj), 'title': 'Update User' } <NEW_LINE> return render(request, 'user/form.html', context) | Update User View Definition. | 625990757d43ff24874280be |
class URLFileMapperMiddleware(object): <NEW_LINE> <INDENT> def __init__(self): <NEW_LINE> <INDENT> self.uploadfiles = [] <NEW_LINE> <DEDENT> def process_request(self, request): <NEW_LINE> <INDENT> if request.META.get("CONTENT_TYPE") == "application/json": <NEW_LINE> <INDENT> items = json.loads(request.raw_post_data).items() <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> items = request.POST.items() <NEW_LINE> <DEDENT> for key, val in items: <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> fp = FilepickerFile(val) <NEW_LINE> self.uploadfiles.append(fp) <NEW_LINE> <DEDENT> except ValueError: <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> except TypeError: <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> splits = val.split(",") <NEW_LINE> for url in splits: <NEW_LINE> <INDENT> if key in request.FILES: <NEW_LINE> <INDENT> request.FILES.setlist(key, list( request.FILES.getlist(key) + [fp.get_file()])) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> request.FILES[key] = fp.get_file() <NEW_LINE> request.POST = request.POST.copy() <NEW_LINE> request.POST[key] = request.FILES[key] <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> <DEDENT> <DEDENT> def process_response(self, request, response): <NEW_LINE> <INDENT> while self.uploadfiles: <NEW_LINE> <INDENT> ff = self.uploadfiles.pop() <NEW_LINE> del ff <NEW_LINE> <DEDENT> return response | This middleware will take any Filepicker.io urls that are posted to the server via a POST
and put a matching File object into request.FILES. This way, if you're used to grabbing files out of
request.FILES, you don't have to change your backend code when using the filepicker.io widgets.
This middleware is rather aggressive in that it will automatically fetch any and all filepicker
urls passed to the server, so if you are already processing the files via FPFileField or similar,
this functionality is redundant.
Note that the original filepicker.io url will still be available in POST if you need it. | 62599075fff4ab517ebcf16c |
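For context, middleware like this is typically enabled in a Django settings module; the dotted path below is hypothetical, and the old-style `MIDDLEWARE_CLASSES` setting is assumed from the `process_request`/`process_response` hooks used above:

```python
# settings.py (sketch; the module path is hypothetical)
MIDDLEWARE_CLASSES = (
    # ... other middleware ...
    'myapp.middleware.URLFileMapperMiddleware',
)
```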
class AuthenticationHandler: <NEW_LINE> <INDENT> def __init__(self, email, password): <NEW_LINE> <INDENT> self.email = email <NEW_LINE> self.password = password <NEW_LINE> <DEDENT> def get_token(self): <NEW_LINE> <INDENT> glb_token = "" <NEW_LINE> token_file = Path(TOKEN_FILE_NAME) <NEW_LINE> try: <NEW_LINE> <INDENT> token_file.resolve() <NEW_LINE> with token_file.open('r') as file: <NEW_LINE> <INDENT> glb_token = file.readline() <NEW_LINE> file.close() <NEW_LINE> <DEDENT> self._test_token(glb_token) <NEW_LINE> <DEDENT> except FileNotFoundError or RuntimeError or HTTPError: <NEW_LINE> <INDENT> glb_token = self._authenticate() <NEW_LINE> with token_file.open('w') as file: <NEW_LINE> <INDENT> file.write(glb_token) <NEW_LINE> file.close() <NEW_LINE> <DEDENT> <DEDENT> finally: <NEW_LINE> <INDENT> return glb_token <NEW_LINE> <DEDENT> <DEDENT> @staticmethod <NEW_LINE> def _test_token(token): <NEW_LINE> <INDENT> header = {'X-GLB-Token': token} <NEW_LINE> resp = requests.get(url=VALIDOR_URL, headers=header) <NEW_LINE> resp.raise_for_status() <NEW_LINE> print("O token ainda é válido!") <NEW_LINE> <DEDENT> def _authenticate(self): <NEW_LINE> <INDENT> url = GLOBO_LOGIN_URL <NEW_LINE> body = { 'payload': { 'email': self.email, 'password': self.password, 'serviceId': 438 }, 'captcha': "" } <NEW_LINE> try: <NEW_LINE> <INDENT> resp = requests.post(url=url, json=body) <NEW_LINE> resp.raise_for_status() <NEW_LINE> <DEDENT> except HTTPError as e: <NEW_LINE> <INDENT> print(e.response) <NEW_LINE> print(e.args) <NEW_LINE> return "" <NEW_LINE> <DEDENT> glb_token = resp.json().get('glbId') <NEW_LINE> print("Usuário %s autenticado com sucesso" % self.email) <NEW_LINE> return glb_token | Class responsible for handling the authentication to globo.com | 625990757b180e01f3e49d0f |
class BulkTransformer(CryptoChildTransformerCreatorMixin, GetPrimaryKeyTransformerMixin, SupportedModelsMixin, ManyTransformer): <NEW_LINE> <INDENT> child_transformer_class = CryptoBulkEntryTransformer <NEW_LINE> def __init__(self, crypto_controller, **kwargs): <NEW_LINE> <INDENT> self.crypto_controller = crypto_controller <NEW_LINE> super(BulkTransformer, self).__init__(**kwargs) <NEW_LINE> self.deleted_sets_transformer = DeleteSetsTransformer( storage=self.storage, account_manager=self.account_manager ) <NEW_LINE> <DEDENT> def to_model(self, payload): <NEW_LINE> <INDENT> models = {} <NEW_LINE> models['last_synced'] = payload.pop('now') <NEW_LINE> deleted_sets = payload.pop('deleted_sets') <NEW_LINE> for set_name, transformer in self.mapping.items(): <NEW_LINE> <INDENT> models[set_name] = [ transformer.to_model(i) for i in payload[set_name] ] <NEW_LINE> <DEDENT> models['deleted_sets'] = self.deleted_sets_transformer.to_model( deleted_sets ) <NEW_LINE> return models <NEW_LINE> <DEDENT> def to_payload(self, model): <NEW_LINE> <INDENT> payload = {} <NEW_LINE> payload['last_synced'] = model.pop('last_synced') <NEW_LINE> payload['delete_sets'] = self.deleted_sets_transformer.to_payload(None) <NEW_LINE> for set_name, transformer in self.mapping.items(): <NEW_LINE> <INDENT> internal_model = self.storage.filter( transformer.model_class, any, **{ 'remote_instance.state.rcontains': ['created', 'updated'], 'remote_instance': None } ) <NEW_LINE> payload[set_name] = [ transformer.to_payload(i) for i in internal_model ] <NEW_LINE> <DEDENT> return payload | Transformer for entry list. | 6259907556ac1b37e630398d |
class IListedSingle(Interface): <NEW_LINE> <INDENT> def single(self, obj=None, pos=None): <NEW_LINE> <INDENT> pass | Listed Single
| 62599075796e427e538500d0 |
class RespVoiceprintThresholdResponse(object): <NEW_LINE> <INDENT> openapi_types = { 'has_error': 'bool', 'error_id': 'str', 'error_desc': 'str', 'data': 'VoiceprintThresholdResponse' } <NEW_LINE> attribute_map = { 'has_error': 'hasError', 'error_id': 'errorId', 'error_desc': 'errorDesc', 'data': 'data' } <NEW_LINE> def __init__(self, has_error=None, error_id=None, error_desc=None, data=None): <NEW_LINE> <INDENT> self._has_error = None <NEW_LINE> self._error_id = None <NEW_LINE> self._error_desc = None <NEW_LINE> self._data = None <NEW_LINE> self.discriminator = None <NEW_LINE> if has_error is not None: <NEW_LINE> <INDENT> self.has_error = has_error <NEW_LINE> <DEDENT> if error_id is not None: <NEW_LINE> <INDENT> self.error_id = error_id <NEW_LINE> <DEDENT> if error_desc is not None: <NEW_LINE> <INDENT> self.error_desc = error_desc <NEW_LINE> <DEDENT> if data is not None: <NEW_LINE> <INDENT> self.data = data <NEW_LINE> <DEDENT> <DEDENT> @property <NEW_LINE> def has_error(self): <NEW_LINE> <INDENT> return self._has_error <NEW_LINE> <DEDENT> @has_error.setter <NEW_LINE> def has_error(self, has_error): <NEW_LINE> <INDENT> self._has_error = has_error <NEW_LINE> <DEDENT> @property <NEW_LINE> def error_id(self): <NEW_LINE> <INDENT> return self._error_id <NEW_LINE> <DEDENT> @error_id.setter <NEW_LINE> def error_id(self, error_id): <NEW_LINE> <INDENT> self._error_id = error_id <NEW_LINE> <DEDENT> @property <NEW_LINE> def error_desc(self): <NEW_LINE> <INDENT> return self._error_desc <NEW_LINE> <DEDENT> @error_desc.setter <NEW_LINE> def error_desc(self, error_desc): <NEW_LINE> <INDENT> self._error_desc = error_desc <NEW_LINE> <DEDENT> @property <NEW_LINE> def data(self): <NEW_LINE> <INDENT> return self._data <NEW_LINE> <DEDENT> @data.setter <NEW_LINE> def data(self, data): <NEW_LINE> <INDENT> self._data = data <NEW_LINE> <DEDENT> def to_dict(self): <NEW_LINE> <INDENT> result = {} <NEW_LINE> for attr, _ in six.iteritems(self.openapi_types): <NEW_LINE> <INDENT> value = getattr(self, attr) <NEW_LINE> if isinstance(value, list): <NEW_LINE> <INDENT> result[attr] = list(map( lambda x: x.to_dict() if hasattr(x, "to_dict") else x, value )) <NEW_LINE> <DEDENT> elif hasattr(value, "to_dict"): <NEW_LINE> <INDENT> result[attr] = value.to_dict() <NEW_LINE> <DEDENT> elif isinstance(value, dict): <NEW_LINE> <INDENT> result[attr] = dict(map( lambda item: (item[0], item[1].to_dict()) if hasattr(item[1], "to_dict") else item, value.items() )) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> result[attr] = value <NEW_LINE> <DEDENT> <DEDENT> return result <NEW_LINE> <DEDENT> def to_str(self): <NEW_LINE> <INDENT> return pprint.pformat(self.to_dict()) <NEW_LINE> <DEDENT> def __repr__(self): <NEW_LINE> <INDENT> return self.to_str() <NEW_LINE> <DEDENT> def __eq__(self, other): <NEW_LINE> <INDENT> if not isinstance(other, RespVoiceprintThresholdResponse): <NEW_LINE> <INDENT> return False <NEW_LINE> <DEDENT> return self.__dict__ == other.__dict__ <NEW_LINE> <DEDENT> def __ne__(self, other): <NEW_LINE> <INDENT> return not self == other | NOTE: This class is auto generated by OpenAPI Generator.
Ref: https://openapi-generator.tech
Do not edit the class manually. | 62599075cc0a2c111447c77c |
class ExternalAuthMap(models.Model): <NEW_LINE> <INDENT> class Meta(object): <NEW_LINE> <INDENT> app_label = "external_auth" <NEW_LINE> unique_together = (('external_id', 'external_domain'), ) <NEW_LINE> <DEDENT> external_id = models.CharField(max_length=255, db_index=True) <NEW_LINE> external_domain = models.CharField(max_length=255, db_index=True) <NEW_LINE> external_credentials = models.TextField(blank=True) <NEW_LINE> external_email = models.CharField(max_length=255, db_index=True) <NEW_LINE> external_name = models.CharField(blank=True, max_length=255, db_index=True) <NEW_LINE> user = models.OneToOneField(User, unique=True, db_index=True, null=True) <NEW_LINE> internal_password = models.CharField(blank=True, max_length=31) <NEW_LINE> dtcreated = models.DateTimeField('creation date', auto_now_add=True) <NEW_LINE> dtsignup = models.DateTimeField('signup date', null=True) <NEW_LINE> def __unicode__(self): <NEW_LINE> <INDENT> return "[%s] = (%s / %s)" % (self.external_id, self.external_name, self.external_email) | Model class for external auth. | 625990755fcc89381b266e04 |
class FragmentCacheExtension(Extension): <NEW_LINE> <INDENT> tags = set(['cache']) <NEW_LINE> def __init__(self, environment): <NEW_LINE> <INDENT> super(FragmentCacheExtension, self).__init__(environment) <NEW_LINE> <DEDENT> def preprocess(self, source, name, filename=None): <NEW_LINE> <INDENT> self.name = filename or name <NEW_LINE> return source <NEW_LINE> <DEDENT> def parse(self, parser): <NEW_LINE> <INDENT> lineno = parser.stream.next().lineno <NEW_LINE> name = '%s+%s' % (self.name, lineno) <NEW_LINE> args = [nodes.Const(name), parser.parse_expression()] <NEW_LINE> timeout = nodes.Const(None) <NEW_LINE> extra = nodes.Const([]) <NEW_LINE> while parser.stream.skip_if('comma'): <NEW_LINE> <INDENT> x = parser.parse_expression() <NEW_LINE> if parser.stream.current.type == 'assign': <NEW_LINE> <INDENT> next(parser.stream) <NEW_LINE> extra = parser.parse_expression() <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> timeout = x <NEW_LINE> <DEDENT> <DEDENT> args.extend([timeout, extra]) <NEW_LINE> body = parser.parse_statements(['name:endcache'], drop_needle=True) <NEW_LINE> self.process_cache_arguments(args) <NEW_LINE> return nodes.CallBlock(self.call_method('_cache_support', args), [], [], body).set_lineno(lineno) <NEW_LINE> <DEDENT> def process_cache_arguments(self, args): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def _cache_support(self, name, obj, timeout, extra, caller): <NEW_LINE> <INDENT> if settings.TEMPLATE_DEBUG: <NEW_LINE> <INDENT> return caller() <NEW_LINE> <DEDENT> extra = ':'.join(map(encoding.smart_str, extra)) <NEW_LINE> key = 'fragment:%s:%s' % (name, extra) <NEW_LINE> return base.cached_with(obj, caller, key, timeout) | Cache a chunk of template code based on a queryset. Since looping over
querysets is the slowest thing we do, you should wrap your for loop with the
cache tag. Uses the default timeout unless you pass a second argument.
{% cache queryset[, timeout] %}
...template code...
{% endcache %}
Derived from the jinja2 documentation example. | 625990755166f23b2e244d2c |
class CustomMetric(EvalMetric): <NEW_LINE> <INDENT> def __init__(self, feval, name=None, allow_extra_outputs=False): <NEW_LINE> <INDENT> if name is None: <NEW_LINE> <INDENT> name = feval.__name__ <NEW_LINE> if name.find('<') != -1: <NEW_LINE> <INDENT> name = 'custom(%s)' % name <NEW_LINE> <DEDENT> <DEDENT> super(CustomMetric, self).__init__(name) <NEW_LINE> self._feval = feval <NEW_LINE> self._allow_extra_outputs = allow_extra_outputs <NEW_LINE> <DEDENT> def update(self, labels, preds): <NEW_LINE> <INDENT> if not self._allow_extra_outputs: <NEW_LINE> <INDENT> check_label_shapes(labels, preds) <NEW_LINE> <DEDENT> for pred, label in zip(preds, labels): <NEW_LINE> <INDENT> label = label.asnumpy() <NEW_LINE> pred = pred.asnumpy() <NEW_LINE> reval = self._feval(label, pred) <NEW_LINE> if isinstance(reval, tuple): <NEW_LINE> <INDENT> (sum_metric, num_inst) = reval <NEW_LINE> self.sum_metric += sum_metric <NEW_LINE> self.num_inst += num_inst <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> self.sum_metric += reval <NEW_LINE> self.num_inst += 1 | Computes a customized evaluation metric.
The `feval` function can return a `tuple` of (sum_metric, num_inst) or return
an `int` sum_metric.
Parameters
----------
feval : callable(label, pred)
Customized evaluation function.
name : str, optional
The name of the metric. (the default is None).
allow_extra_outputs : bool, optional
If true, the prediction outputs can have extra outputs.
This is useful in RNN, where the states are also produced
in outputs for forwarding. (the default is False).
Examples
--------
>>> predicts = [mx.nd.array(np.array([3, -0.5, 2, 7]).reshape(4,1))]
>>> labels = [mx.nd.array(np.array([2.5, 0.0, 2, 8]).reshape(4,1))]
>>> feval = lambda x, y : (x + y).mean()
>>> eval_metrics = mx.metric.CustomMetric(feval=feval)
>>> eval_metrics.update(labels, predicts)
>>> print eval_metrics.get()
('custom(<lambda>)', 6.0) | 6259907576e4537e8c3f0ed6 |
class NullJSONPRenderer(JSONPRenderer, NullJSONRenderer): <NEW_LINE> <INDENT> pass | (JSONPRenderer will call NullJSONRenderer before JSONRenderer) | 62599075167d2b6e312b823c |
class BB(object): <NEW_LINE> <INDENT> def __init__(self, *args): <NEW_LINE> <INDENT> if len(args) == 0: <NEW_LINE> <INDENT> self._bb = cp.cpBB() <NEW_LINE> <DEDENT> elif len(args) == 1: <NEW_LINE> <INDENT> self._bb = args[0] <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> self._bb = cpffi.cpBBNew(args[0], args[1], args[2], args[3]) <NEW_LINE> <DEDENT> <DEDENT> def __repr__(self): <NEW_LINE> <INDENT> return 'BB(%s, %s, %s, %s)' % (self.left, self.bottom, self.right, self.top) <NEW_LINE> <DEDENT> def __eq__(self, other): <NEW_LINE> <INDENT> return self.left == other.left and self.bottom == other.bottom and self.right == other.right and self.top == other.top <NEW_LINE> <DEDENT> def __ne__(self, other): <NEW_LINE> <INDENT> return not self.__eq__(other) <NEW_LINE> <DEDENT> def intersects(self, other): <NEW_LINE> <INDENT> return bool(cpffi.cpBBIntersects(self._bb, other._bb)) <NEW_LINE> <DEDENT> def contains(self, other): <NEW_LINE> <INDENT> return bool(cpffi.cpBBContainsBB(self._bb, other._bb)) <NEW_LINE> <DEDENT> def contains_vect(self, v): <NEW_LINE> <INDENT> return bool(cpffi.cpBBContainsVect(self._bb, v)) <NEW_LINE> <DEDENT> def merge(self, other): <NEW_LINE> <INDENT> return BB(cpffi.cpBBMerge(self._bb, other._bb)) <NEW_LINE> <DEDENT> def expand(self, v): <NEW_LINE> <INDENT> return BB(cpffi.cpBBExpand(self._bb, v)) <NEW_LINE> <DEDENT> left = property(lambda self: self._bb.l) <NEW_LINE> bottom = property(lambda self: self._bb.b) <NEW_LINE> right = property(lambda self: self._bb.r) <NEW_LINE> top = property(lambda self: self._bb.t) <NEW_LINE> def clamp_vect(self, v): <NEW_LINE> <INDENT> return cpffi.cpBBClampVect(self._bb, v) <NEW_LINE> <DEDENT> def wrap_vect(self, v): <NEW_LINE> <INDENT> return cp.cpBBWrapVect(self._bb, v) | Simple bounding box class. Stored as left, bottom, right, top values. | 62599075adb09d7d5dc0bec1 |
class FlowListMixin(object): <NEW_LINE> <INDENT> ns_map = None <NEW_LINE> def __init__(self, *args, **kwargs): <NEW_LINE> <INDENT> self.ns_map = kwargs.get('ns_map', {}) <NEW_LINE> super(FlowListMixin, self).__init__(*args, **kwargs) <NEW_LINE> <DEDENT> @property <NEW_LINE> def flows(self): <NEW_LINE> <INDENT> return self.ns_map.values() | Mixin for a list view that contains multiple flows | 6259907563b5f9789fe86abc |
class JobData(dict): <NEW_LINE> <INDENT> def __init__(self, data, context=None): <NEW_LINE> <INDENT> self.context = context or [] <NEW_LINE> super(JobData, self).__init__(data) <NEW_LINE> <DEDENT> def __getitem__(self, name): <NEW_LINE> <INDENT> full_context = list(self.context) + [name] <NEW_LINE> try: <NEW_LINE> <INDENT> value = super(JobData, self).__getitem__(name) <NEW_LINE> <DEDENT> except KeyError: <NEW_LINE> <INDENT> raise JobDataError("Missing data: {0}.".format( "".join(["['{0}']".format(c) for c in full_context]))) <NEW_LINE> <DEDENT> if isinstance(value, dict): <NEW_LINE> <INDENT> value = self.__class__(value, full_context) <NEW_LINE> <DEDENT> return value | Encapsulates data access from incoming test data structure.
All missing-data errors raise ``JobDataError`` with a useful
message. Unlike regular nested dictionaries, ``JobData`` keeps track of
context, so errors contain not only the name of the immediately-missing
key, but the full parent-key context as well. | 62599075627d3e7fe0e087df |
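A small sketch of the behaviour described above; `JobDataError` is assumed to be importable from the same module as `JobData`:

```python
# Nested reads return JobData-wrapped dicts, so error messages keep full context.
jd = JobData({"build": {"os": "linux"}})
assert jd["build"]["os"] == "linux"

try:
    jd["build"]["arch"]
except JobDataError as exc:
    print(exc)   # Missing data: ['build']['arch'].
```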
class CodeSnippet: <NEW_LINE> <INDENT> _source_executors = { ".cpp" : CppExecutor } <NEW_LINE> def __init__(self, file_path): <NEW_LINE> <INDENT> self._file_path = file_path <NEW_LINE> extension = splitext(file_path)[1] <NEW_LINE> self._source_executor = self._create_source_executor(extension) <NEW_LINE> <DEDENT> def simulate(self, threats): <NEW_LINE> <INDENT> print(self._source_executor.execute_code()) <NEW_LINE> <DEDENT> def _create_source_executor(self, extension): <NEW_LINE> <INDENT> return self._source_executors[extension](self._file_path) | Represents a code snippet. Can be executed by the simulate method. | 6259907560cbc95b06365a19 |
class BookUpdateAPIView(generics.UpdateAPIView): <NEW_LINE> <INDENT> queryset = Book.objects.all() <NEW_LINE> serializer_class = BookSerializer | Update and partial-update API view class for the Book model. | 62599075d268445f2663a809 |
class JobIdGenerator(object): <NEW_LINE> <INDENT> counter = 0 <NEW_LINE> @classmethod <NEW_LINE> def get_new_id(cls): <NEW_LINE> <INDENT> cls.counter += 1 <NEW_LINE> return cls.counter | Generate a pseudo-unique job_id. Every id must be higher than the previous one
(old_id < new_id); this is used in the min_job_id mechanism for forcing difficulty. | 62599075a17c0f6771d5d858 |
class Encoder(object): <NEW_LINE> <INDENT> def __init__(self, input_dir, destination_dir, working_dir='/tmp', delete_source_file=False, handbrake_cli='/usr/bin/HandBrakeCLI'): <NEW_LINE> <INDENT> self.input_dir = input_dir <NEW_LINE> self.destination_dir = destination_dir <NEW_LINE> self.working_dir = working_dir <NEW_LINE> self.delete_source_file = delete_source_file <NEW_LINE> self.handbrake_cli = handbrake_cli <NEW_LINE> self.valid_extensions = ['.mkv', '.mp4', '.m4v', '.avi', '.wmv', '.ts'] <NEW_LINE> <DEDENT> def media_files(self): <NEW_LINE> <INDENT> media_files = [] <NEW_LINE> for root, dirs, files in os.walk(self.input_dir): <NEW_LINE> <INDENT> logger.debug('Searching %s for media files', root) <NEW_LINE> for filename in files: <NEW_LINE> <INDENT> if os.path.splitext(filename)[1] in self.valid_extensions: <NEW_LINE> <INDENT> media_files.append( os.path.abspath(os.path.join(root, filename))) <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> logger.debug('Found media files %s', media_files) <NEW_LINE> return media_files <NEW_LINE> <DEDENT> def encode_media_files(self): <NEW_LINE> <INDENT> for media_file in self.media_files(): <NEW_LINE> <INDENT> if not self.sample_file(media_file): <NEW_LINE> <INDENT> handbrake = Handbrake(media_file, working_dir=self.working_dir, exe=self.handbrake_cli) <NEW_LINE> output_file = handbrake.encode() <NEW_LINE> self.move_file(output_file) <NEW_LINE> if self.delete_source_file: <NEW_LINE> <INDENT> self.delete_file(media_file) <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> <DEDENT> @classmethod <NEW_LINE> def sample_file(cls, filename): <NEW_LINE> <INDENT> match = re.search(r'sample', filename, re.IGNORECASE) <NEW_LINE> if match: <NEW_LINE> <INDENT> logger.debug('%s is a sample file', filename) <NEW_LINE> return True <NEW_LINE> <DEDENT> return False <NEW_LINE> <DEDENT> @classmethod <NEW_LINE> def delete_file(cls, filename): <NEW_LINE> <INDENT> if os.path.exists(filename): <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> os.remove(filename) <NEW_LINE> return True <NEW_LINE> <DEDENT> except OSError: <NEW_LINE> <INDENT> logger.exception('Error attempting to delete %s', filename) <NEW_LINE> return False <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> logger.warning('Unable to delete (file does not exist) %s', filename) <NEW_LINE> return False <NEW_LINE> <DEDENT> <DEDENT> def move_file(self, filename): <NEW_LINE> <INDENT> if not os.path.exists(self.destination_dir): <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> logger.debug('Creating %s', self.destination_dir) <NEW_LINE> os.makedirs(self.destination_dir) <NEW_LINE> <DEDENT> except OSError: <NEW_LINE> <INDENT> logger.exception('Error attempting to create %s', self.destination_dir) <NEW_LINE> return False <NEW_LINE> <DEDENT> <DEDENT> if os.path.exists(filename): <NEW_LINE> <INDENT> destination_file = os.path.join(self.destination_dir, os.path.basename(filename)) <NEW_LINE> try: <NEW_LINE> <INDENT> shutil.move(filename, destination_file) <NEW_LINE> return True <NEW_LINE> <DEDENT> except OSError: <NEW_LINE> <INDENT> logger.exception('Error attempting to move %s to %s', filename, destination_file) <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> logger.error('Unable to move (files does not exist) %s', filename) <NEW_LINE> <DEDENT> return False | Class for encoding media files.
:param str input_dir: The input directory
:param str destination_dir: The destination directory
:param str working_dir: The working directory
:param bool delete_source_file: Whether the source file should be deleted | 6259907567a9b606de547750 |
class ExteriorWallColor(Color): <NEW_LINE> <INDENT> pass | Color of a material or component. Can be applied to opaque surfaces, materials, and so forth. | 62599075ad47b63b2c5a91a6 |
@implementer(IMessage) <NEW_LINE> class Unsubscribe(Message): <NEW_LINE> <INDENT> MESSAGE_TYPE = 34 <NEW_LINE> def __init__(self, request, subscription): <NEW_LINE> <INDENT> Message.__init__(self) <NEW_LINE> self.request = request <NEW_LINE> self.subscription = subscription <NEW_LINE> <DEDENT> @staticmethod <NEW_LINE> def parse(wmsg): <NEW_LINE> <INDENT> assert(len(wmsg) > 0 and wmsg[0] == Unsubscribe.MESSAGE_TYPE) <NEW_LINE> if len(wmsg) != 3: <NEW_LINE> <INDENT> raise ProtocolError("invalid message length {} for WAMP UNSUBSCRIBE".format(len(wmsg))) <NEW_LINE> <DEDENT> request = check_or_raise_id(wmsg[1], "'request' in UNSUBSCRIBE") <NEW_LINE> subscription = check_or_raise_id(wmsg[2], "'subscription' in UNSUBSCRIBE") <NEW_LINE> obj = Unsubscribe(request, subscription) <NEW_LINE> return obj <NEW_LINE> <DEDENT> def marshal(self): <NEW_LINE> <INDENT> return [Unsubscribe.MESSAGE_TYPE, self.request, self.subscription] <NEW_LINE> <DEDENT> def __str__(self): <NEW_LINE> <INDENT> return "WAMP UNSUBSCRIBE Message (request = {}, subscription = {})".format(self.request, self.subscription) | A WAMP `UNSUBSCRIBE` message.
Format: `[UNSUBSCRIBE, Request|id, SUBSCRIBED.Subscription|id]` | 62599075435de62698e9d760 |
class RVR(BaseRVM, RegressorMixin): <NEW_LINE> <INDENT> def _posterior(self): <NEW_LINE> <INDENT> i_s = np.diag(self.alpha_) + self.beta_ * np.dot(self.phi.T, self.phi) <NEW_LINE> self.sigma_ = np.linalg.inv(i_s) <NEW_LINE> self.m_ = self.beta_ * np.dot(self.sigma_, np.dot(self.phi.T, self.y)) <NEW_LINE> <DEDENT> def predict(self, X, eval_MSE=False): <NEW_LINE> <INDENT> phi = self._apply_kernel(X, self.relevance_) <NEW_LINE> y = np.dot(phi, self.m_) <NEW_LINE> if eval_MSE: <NEW_LINE> <INDENT> MSE = (1/self.beta_) + np.dot(phi, np.dot(self.sigma_, phi.T)) <NEW_LINE> return y, MSE[:, 0] <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> return y | Relevance Vector Machine Regression.
Implementation of Mike Tipping's Relevance Vector Machine for regression
using the scikit-learn API. | 625990752c8b7c6e89bd5141 |
class capture_stderr(list): <NEW_LINE> <INDENT> def __enter__(self): <NEW_LINE> <INDENT> self.sys_stderr = sys.stderr <NEW_LINE> self.stringio = StringIO() <NEW_LINE> sys.stderr = self.stringio <NEW_LINE> return self <NEW_LINE> <DEDENT> def __exit__(self, *args): <NEW_LINE> <INDENT> self.append(str(self.stringio.getvalue())) <NEW_LINE> del self.stringio <NEW_LINE> sys.stderr = self.sys_stderr | Replace sys.stderr with a temporary StringIO | 62599075dc8b845886d54f12 |
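A minimal usage sketch for the `capture_stderr` helper above (assuming the class and its `StringIO` import are in scope):

```python
import sys

with capture_stderr() as captured:
    print("warning!", file=sys.stderr)   # goes into the temporary StringIO

print(captured)   # ['warning!\n'] -- the captured text is appended on exit
```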
class PaymentArgumentsResponse(object): <NEW_LINE> <INDENT> def __init__(self, token=None, type=None): <NEW_LINE> <INDENT> self.swagger_types = { 'token': 'str', 'type': 'str' } <NEW_LINE> self.attribute_map = { 'token': 'token', 'type': 'type' } <NEW_LINE> self._token = token <NEW_LINE> self._type = type <NEW_LINE> <DEDENT> @property <NEW_LINE> def token(self): <NEW_LINE> <INDENT> return self._token <NEW_LINE> <DEDENT> @token.setter <NEW_LINE> def token(self, token): <NEW_LINE> <INDENT> self._token = token <NEW_LINE> <DEDENT> @property <NEW_LINE> def type(self): <NEW_LINE> <INDENT> return self._type <NEW_LINE> <DEDENT> @type.setter <NEW_LINE> def type(self, type): <NEW_LINE> <INDENT> self._type = type <NEW_LINE> <DEDENT> def to_dict(self): <NEW_LINE> <INDENT> result = {} <NEW_LINE> for attr, _ in iteritems(self.swagger_types): <NEW_LINE> <INDENT> value = getattr(self, attr) <NEW_LINE> if isinstance(value, list): <NEW_LINE> <INDENT> result[attr] = list(map( lambda x: x.to_dict() if hasattr(x, "to_dict") else x, value )) <NEW_LINE> <DEDENT> elif hasattr(value, "to_dict"): <NEW_LINE> <INDENT> result[attr] = value.to_dict() <NEW_LINE> <DEDENT> elif isinstance(value, dict): <NEW_LINE> <INDENT> result[attr] = dict(map( lambda item: (item[0], item[1].to_dict()) if hasattr(item[1], "to_dict") else item, value.items() )) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> result[attr] = value <NEW_LINE> <DEDENT> <DEDENT> return result <NEW_LINE> <DEDENT> def to_str(self): <NEW_LINE> <INDENT> return pformat(self.to_dict()) <NEW_LINE> <DEDENT> def __repr__(self): <NEW_LINE> <INDENT> return self.to_str() <NEW_LINE> <DEDENT> def __eq__(self, other): <NEW_LINE> <INDENT> return self.__dict__ == other.__dict__ <NEW_LINE> <DEDENT> def __ne__(self, other): <NEW_LINE> <INDENT> return not self == other | NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually. | 62599075cc0a2c111447c77d |
class Bert4LM(BertModel): <NEW_LINE> <INDENT> def __init__(self, *args, **kwargs): <NEW_LINE> <INDENT> super(Bert4LM, self).__init__(*args, **kwargs) <NEW_LINE> self.with_pool = False <NEW_LINE> self.with_nsp = False <NEW_LINE> self.with_mlm = True <NEW_LINE> self.attention_mask = 'history_only' <NEW_LINE> <DEDENT> def compute_attention_mask(self, layer_id, segment_ids): <NEW_LINE> <INDENT> return self.attention_mask | Bert used for language-modeling tasks.
| 62599075aad79263cf430110 |
class DeviceSelector(QtWidgets.QDialog): <NEW_LINE> <INDENT> selected = QtCore.pyqtSignal(str) <NEW_LINE> def __init__(self, parent): <NEW_LINE> <INDENT> super().__init__(parent=parent) <NEW_LINE> self.ui = Ui_DeviceSelector() <NEW_LINE> self.ui.setupUi(self) <NEW_LINE> self.ui.deviceList.addItems(_get_devices()) <NEW_LINE> self.ui.connectButton.clicked.connect(self._selected) <NEW_LINE> prev = qt_util.get_settings('last_used_device') <NEW_LINE> self.ui.deviceList.setCurrentIndex(self.ui.deviceList.findText(prev)) <NEW_LINE> <DEDENT> def _selected(self): <NEW_LINE> <INDENT> device = self.ui.deviceList.currentText() <NEW_LINE> if device: <NEW_LINE> <INDENT> qt_util.store_settings(last_used_device=device) <NEW_LINE> self.selected.emit(device) | Setup Dialog for selecting a device from serial port list | 62599075ec188e330fdfa1fe |
class Classobj: <NEW_LINE> <INDENT> pass | classobj for Python 2 compatibility | 62599075283ffb24f3cf5203 |
@collection( name='microscope-settings-a1', properties={ 'title': "Microscope Settings Tier A1", 'description': 'Collection of Metadata for microscope settings of Tier A1 microscopy files.', }) <NEW_LINE> class MicroscopeSettingA1(MicroscopeSetting): <NEW_LINE> <INDENT> item_type = 'microscope_setting_a1' <NEW_LINE> schema = load_schema('encoded:schemas/microscope_setting_a1.json') <NEW_LINE> embedded_list = MicroscopeSetting.embedded_list | The subclass of microscope settings for Tier A1 files. | 625990753539df3088ecdbf0 |
class PacketCaptureParameters(msrest.serialization.Model): <NEW_LINE> <INDENT> _validation = { 'target': {'required': True}, 'storage_location': {'required': True}, } <NEW_LINE> _attribute_map = { 'target': {'key': 'target', 'type': 'str'}, 'bytes_to_capture_per_packet': {'key': 'bytesToCapturePerPacket', 'type': 'int'}, 'total_bytes_per_session': {'key': 'totalBytesPerSession', 'type': 'int'}, 'time_limit_in_seconds': {'key': 'timeLimitInSeconds', 'type': 'int'}, 'storage_location': {'key': 'storageLocation', 'type': 'PacketCaptureStorageLocation'}, 'filters': {'key': 'filters', 'type': '[PacketCaptureFilter]'}, } <NEW_LINE> def __init__( self, **kwargs ): <NEW_LINE> <INDENT> super(PacketCaptureParameters, self).__init__(**kwargs) <NEW_LINE> self.target = kwargs['target'] <NEW_LINE> self.bytes_to_capture_per_packet = kwargs.get('bytes_to_capture_per_packet', 0) <NEW_LINE> self.total_bytes_per_session = kwargs.get('total_bytes_per_session', 1073741824) <NEW_LINE> self.time_limit_in_seconds = kwargs.get('time_limit_in_seconds', 18000) <NEW_LINE> self.storage_location = kwargs['storage_location'] <NEW_LINE> self.filters = kwargs.get('filters', None) | Parameters that define the create packet capture operation.
All required parameters must be populated in order to send to Azure.
:param target: Required. The ID of the targeted resource, only VM is currently supported.
:type target: str
:param bytes_to_capture_per_packet: Number of bytes captured per packet, the remaining bytes
are truncated.
:type bytes_to_capture_per_packet: int
:param total_bytes_per_session: Maximum size of the capture output.
:type total_bytes_per_session: int
:param time_limit_in_seconds: Maximum duration of the capture session in seconds.
:type time_limit_in_seconds: int
:param storage_location: Required. Describes the storage location for a packet capture session.
:type storage_location: ~azure.mgmt.network.v2019_06_01.models.PacketCaptureStorageLocation
:param filters: A list of packet capture filters.
:type filters: list[~azure.mgmt.network.v2019_06_01.models.PacketCaptureFilter] | 6259907571ff763f4b5e9104 |
class pyOrmBaseException(Exception): <NEW_LINE> <INDENT> pass | Base exception class for all pyorm exceptions | 62599075f9cc0f698b1c5f78 |
class Hsdsch(pkt.Packet): <NEW_LINE> <INDENT> __fields__ = [ ('frmcrc', 7), ('ft', 1) ] <NEW_LINE> _typesw = {} <NEW_LINE> def unpack(self, buf): <NEW_LINE> <INDENT> pkt.Packet.unpack(self, buf) <NEW_LINE> buf = buf[self.__pkt_fields_len__:] <NEW_LINE> try: <NEW_LINE> <INDENT> self.data = self._typesw[self.ft](buf) <NEW_LINE> setattr(self, self.data.__class__.__name__.lower(), self.data) <NEW_LINE> <DEDENT> except (KeyError, pkt.UnpackError): <NEW_LINE> <INDENT> self.data = buf <NEW_LINE> <DEDENT> <DEDENT> @classmethod <NEW_LINE> def set_type(cls, t, pktclass): <NEW_LINE> <INDENT> cls._typesw[t] = pktclass <NEW_LINE> <DEDENT> @classmethod <NEW_LINE> def get_type(cls, t): <NEW_LINE> <INDENT> return cls._typesw[t] | docstring for hsdsch | 625990758a349b6b43687bb3 |
class TensorRunner(BaseRunner[ModelPart]): <NEW_LINE> <INDENT> def __init__(self, output_series: str, toplevel_modelpart: ModelPart, toplevel_tensors: List[tf.Tensor], tensors_by_name: List[str], tensors_by_ref: List[tf.Tensor], batch_dims_by_name: List[int], batch_dims_by_ref: List[int], select_session: int = None) -> None: <NEW_LINE> <INDENT> check_argument_types() <NEW_LINE> BaseRunner[ModelPart].__init__(self, output_series, toplevel_modelpart) <NEW_LINE> self._names = tensors_by_name <NEW_LINE> self._tensors = tensors_by_ref <NEW_LINE> self._batch_dims_name = batch_dims_by_name <NEW_LINE> self._batch_dims_ref = batch_dims_by_ref <NEW_LINE> self._select_session = select_session <NEW_LINE> log("Blessing toplevel tensors for tensor runner:") <NEW_LINE> for tensor in toplevel_tensors: <NEW_LINE> <INDENT> log("Toplevel tensor: {}".format(tensor)) <NEW_LINE> <DEDENT> <DEDENT> def get_executable(self, compute_losses: bool, summaries: bool, num_sessions: int) -> TensorExecutable: <NEW_LINE> <INDENT> fetches = {} <NEW_LINE> batch_ids = {} <NEW_LINE> for name, bid in zip(self._names, self._batch_dims_name): <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> fetches[name] = tf.get_default_graph().get_tensor_by_name(name) <NEW_LINE> batch_ids[name] = bid <NEW_LINE> <DEDENT> except KeyError: <NEW_LINE> <INDENT> warn(("The tensor of name '{}' is not present in the " "graph.").format(name)) <NEW_LINE> <DEDENT> <DEDENT> for tensor, bid in zip(self._tensors, self._batch_dims_ref): <NEW_LINE> <INDENT> fetches[tensor.name] = tensor <NEW_LINE> batch_ids[tensor.name] = bid <NEW_LINE> <DEDENT> return TensorExecutable( self.all_coders, fetches, batch_ids, self._select_session) <NEW_LINE> <DEDENT> @property <NEW_LINE> def loss_names(self) -> List[str]: <NEW_LINE> <INDENT> return [] | Runner class for printing tensors from a model.
Use this runner if you want to retrieve a specific tensor from the model
using a given dataset. The runner generates an output data series which
will contain the tensors in a dictionary of numpy arrays. | 6259907555399d3f05627e72 |
class AddOperation(PatchOperation): <NEW_LINE> <INDENT> def apply(self, obj): <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> value = self.operation["value"] <NEW_LINE> <DEDENT> except KeyError as ex: <NEW_LINE> <INDENT> raise InvalidJsonPatch( "The operation does not contain a 'value' member") <NEW_LINE> <DEDENT> subobj, part = self.pointer.to_last(obj) <NEW_LINE> if isinstance(subobj, list): <NEW_LINE> <INDENT> if part == '-': <NEW_LINE> <INDENT> subobj.append(value) <NEW_LINE> <DEDENT> elif part > len(subobj) or part < 0: <NEW_LINE> <INDENT> raise JsonPatchConflict("can't insert outside of list") <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> subobj.insert(part, value) <NEW_LINE> <DEDENT> <DEDENT> elif isinstance(subobj, dict): <NEW_LINE> <INDENT> if part is None: <NEW_LINE> <INDENT> obj = value <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> subobj[part] = value <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> raise TypeError("invalid document type {0}".format(type(subobj))) <NEW_LINE> <DEDENT> return obj | Adds an object property or an array element. | 62599075e1aae11d1e7cf4bb |
class BatteryServiceData(ServiceData): <NEW_LINE> <INDENT> def __init__(self): <NEW_LINE> <INDENT> super().__init__(BATTERY_UUID) <NEW_LINE> <DEDENT> @property <NEW_LINE> def data(self) -> int: <NEW_LINE> <INDENT> return int(self._data[0]) <NEW_LINE> <DEDENT> @data.setter <NEW_LINE> def data(self, value: int): <NEW_LINE> <INDENT> if isinstance(value, int): <NEW_LINE> <INDENT> self._data = struct.pack("B", value) <NEW_LINE> <DEDENT> elif isinstance(value, (bytes, bytearray)): <NEW_LINE> <INDENT> self._data = value <NEW_LINE> <DEDENT> <DEDENT> def __repr__(self) -> str: <NEW_LINE> <INDENT> return f"Battery capacity remaining: {self.data}%" | This derivative of the `ServiceData` class can be used to represent
battery charge percentage as a 1-byte value. | 625990757d43ff24874280c0 |
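A hedged usage sketch; it assumes the `ServiceData` base class and `BATTERY_UUID` constant referenced in the row are importable from the surrounding module:

```python
battery = BatteryServiceData()   # ServiceData / BATTERY_UUID come from the surrounding module
battery.data = 85                # packed into a single byte internally
print(repr(battery))             # Battery capacity remaining: 85%
```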
class Handler(object): <NEW_LINE> <INDENT> def __init__(self, callback=None, formatter=None, filter=None, reducer=None): <NEW_LINE> <INDENT> self.filters = [] <NEW_LINE> self.formatters = [] <NEW_LINE> self.callbacks = [] <NEW_LINE> self.add_filter(filter) <NEW_LINE> self.add_formatter(formatter) <NEW_LINE> self.add_callback(callback) <NEW_LINE> self.reducer = reducer or delistify <NEW_LINE> <DEDENT> @property <NEW_LINE> def is_async(self): <NEW_LINE> <INDENT> return any(is_async(callback) for callback in self.callbacks) <NEW_LINE> <DEDENT> def add(self, callback=None, formatter=None, filter=None): <NEW_LINE> <INDENT> self.add_formatter(formatter) <NEW_LINE> self.add_filter(filter) <NEW_LINE> self.add_callback(callback) <NEW_LINE> <DEDENT> def add_formatter(self, formatter=None): <NEW_LINE> <INDENT> if formatter: <NEW_LINE> <INDENT> self.formatters += listify(formatter) <NEW_LINE> <DEDENT> <DEDENT> def add_filter(self, filter=None, rule=None): <NEW_LINE> <INDENT> if filter: <NEW_LINE> <INDENT> if rule: <NEW_LINE> <INDENT> if isinstance(filter, list): <NEW_LINE> <INDENT> raise Exception("Filter should be callable if rule is specified. ") <NEW_LINE> <DEDENT> _filter = lambda *args, **kwargs: rule(filter(*args, **kwargs)) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> _filter = filter <NEW_LINE> <DEDENT> self.filters += listify(_filter) <NEW_LINE> <DEDENT> <DEDENT> def add_callback(self, callback=None): <NEW_LINE> <INDENT> if callback: <NEW_LINE> <INDENT> self.callbacks += listify(callback) <NEW_LINE> <DEDENT> <DEDENT> def __call__(self, *args, **kwargs): <NEW_LINE> <INDENT> for formatter in self.formatters: <NEW_LINE> <INDENT> args, kwargs = formatter(*args, **kwargs) <NEW_LINE> <DEDENT> if all(_filter(*args, **kwargs) for _filter in self.filters): <NEW_LINE> <INDENT> return self.reducer( [callback(*args, **kwargs) for callback in self.callbacks] ) <NEW_LINE> <DEDENT> <DEDENT> def call(self, *args, **kwargs): <NEW_LINE> <INDENT> return self.__call__(*args, **kwargs) <NEW_LINE> <DEDENT> async def call_async(self, *args, **kwargs): <NEW_LINE> <INDENT> for formatter in self.formatters: <NEW_LINE> <INDENT> args, kwargs = formatter(*args, **kwargs) <NEW_LINE> <DEDENT> if all(_filter(*args, **kwargs) for _filter in self.filters): <NEW_LINE> <INDENT> res = [] <NEW_LINE> for callback in self.callbacks: <NEW_LINE> <INDENT> if is_async(callback): <NEW_LINE> <INDENT> if hasattr(callback, "aiocall"): <NEW_LINE> <INDENT> res.append(await callback.call_async(*args, **kwargs)) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> res.append(await callback(*args, **kwargs)) <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> res.append(callback(*args, **kwargs)) <NEW_LINE> <DEDENT> <DEDENT> return self.reducer(res) <NEW_LINE> <DEDENT> <DEDENT> def has_coroutine(self): <NEW_LINE> <INDENT> return any(inspect.iscoroutinefunction(callback) for callback in self.callbacks) <NEW_LINE> <DEDENT> def __repr__(self): <NEW_LINE> <INDENT> vals = [] <NEW_LINE> vals += [f"Formatter: {formatter}" for formatter in self.formatters] <NEW_LINE> vals += [f"Filter: {filter}" for filter in self.filters] <NEW_LINE> vals += [f"Callback: {callback}" for callback in self.callbacks] <NEW_LINE> return "\n".join(vals) <NEW_LINE> <DEDENT> def __str__(self): <NEW_LINE> <INDENT> return self.__repr__() | Complex func wrapper. Computational graph with multiple filters, multiple formatters and multiple callbacks. | 625990754f6381625f19a157 |
class TestJsHintQualityReporterTest(JsQualityBaseReporterMixin): <NEW_LINE> <INDENT> quality_name = "jshint" <NEW_LINE> def _get_out(self): <NEW_LINE> <INDENT> return jshint_driver | JsHintQualityReporter tests. Assumes JsHint is not available as a python
library, but is available on the commandline. | 6259907597e22403b383c85c |
class LoggingProjectsLogServicesSinksCreateRequest(messages.Message): <NEW_LINE> <INDENT> logServicesId = messages.StringField(1, required=True) <NEW_LINE> logSink = messages.MessageField('LogSink', 2) <NEW_LINE> projectsId = messages.StringField(3, required=True) | A LoggingProjectsLogServicesSinksCreateRequest object.
Fields:
logServicesId: Part of `serviceName`. See documentation of `projectsId`.
logSink: A LogSink resource to be passed as the request body.
projectsId: Part of `serviceName`. The service in which to create a sink. | 62599075be8e80087fbc09ec |
class CreateServiceTimeCallback(object): <NEW_LINE> <INDENT> def __init__(self, demands, time_per_demand_unit): <NEW_LINE> <INDENT> self.matrix = demands <NEW_LINE> self.time_per_demand_unit = time_per_demand_unit <NEW_LINE> <DEDENT> def ServiceTime(self, from_node, to_node): <NEW_LINE> <INDENT> return int(self.matrix[from_node] * self.time_per_demand_unit) | Create callback to get time windows at each location. | 6259907567a9b606de547751 |
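A usage sketch for the callback above with made-up demand values (assumes the `CreateServiceTimeCallback` class above is importable); note that only the `from_node`'s demand enters the result:

    demands = [0, 3, 5, 2]                 # hypothetical demand per node; node 0 is the depot
    service_time = CreateServiceTimeCallback(demands, time_per_demand_unit=4)
    print(service_time.ServiceTime(1, 2))  # 12 -> demands[1] * 4; to_node is ignored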
class Pipe: <NEW_LINE> <INDENT> def __init__(self, timeout): <NEW_LINE> <INDENT> self.timeout = timeout <NEW_LINE> <DEDENT> def settimeout(self, timeout): <NEW_LINE> <INDENT> self.timeout = timeout <NEW_LINE> <DEDENT> def gettimeout(self): <NEW_LINE> <INDENT> return self.timeout <NEW_LINE> <DEDENT> @contextmanager <NEW_LINE> def _tmp_timeout(self, timeout): <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> if timeout: <NEW_LINE> <INDENT> old_timeout = self.gettimeout() <NEW_LINE> self.settimeout(timeout) <NEW_LINE> <DEDENT> yield <NEW_LINE> <DEDENT> finally: <NEW_LINE> <INDENT> if timeout: <NEW_LINE> <INDENT> self.settimeout(old_timeout) <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> def recv(self, bufsize): <NEW_LINE> <INDENT> raise NotImplementedError <NEW_LINE> <DEDENT> def send(self, data): <NEW_LINE> <INDENT> raise NotImplementedError <NEW_LINE> <DEDENT> def close(self): <NEW_LINE> <INDENT> raise NotImplementedError <NEW_LINE> <DEDENT> def _read_regex(self, pattern, bufsize=4096, timeout=None): <NEW_LINE> <INDENT> cre = re.compile(pattern, re.MULTILINE | re.DOTALL) <NEW_LINE> buf = b'' <NEW_LINE> with self._tmp_timeout(timeout): <NEW_LINE> <INDENT> while True: <NEW_LINE> <INDENT> ret = self.recv(bufsize) <NEW_LINE> if not ret: <NEW_LINE> <INDENT> raise EOFError <NEW_LINE> <DEDENT> buf += ret <NEW_LINE> match = cre.search(buf) <NEW_LINE> if match: <NEW_LINE> <INDENT> break <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> return buf, match <NEW_LINE> <DEDENT> def read_until(self, pattern, bufsize=4096, timeout=None): <NEW_LINE> <INDENT> buf, _match = self._read_regex(pattern, bufsize, timeout) <NEW_LINE> return buf <NEW_LINE> <DEDENT> def read_search(self, pattern, bufsize=4096, timeout=None): <NEW_LINE> <INDENT> _buf, match = self._read_regex(pattern, bufsize, timeout) <NEW_LINE> return match <NEW_LINE> <DEDENT> def readall(self, bufsize=4096, timeout=None): <NEW_LINE> <INDENT> buf = b'' <NEW_LINE> with self._tmp_timeout(timeout): <NEW_LINE> <INDENT> while True: <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> ret = self.recv(bufsize) <NEW_LINE> if not ret: <NEW_LINE> <INDENT> break <NEW_LINE> <DEDENT> buf += ret <NEW_LINE> <DEDENT> except TimeoutError: <NEW_LINE> <INDENT> break <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> return buf <NEW_LINE> <DEDENT> def sendall(self, data): <NEW_LINE> <INDENT> ret = 0 <NEW_LINE> while ret < len(data): <NEW_LINE> <INDENT> ret += self.send(data[ret:]) <NEW_LINE> <DEDENT> <DEDENT> def interact(self): <NEW_LINE> <INDENT> while True: <NEW_LINE> <INDENT> cmd = input('> ') + '\n' <NEW_LINE> self.sendall(cmd.encode('utf-8')) <NEW_LINE> ret = self.readall().decode('utf-8') <NEW_LINE> print(ret) | Base class for Pipe implementations. | 6259907556b00c62f0fb422b |
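`Pipe` leaves `recv`/`send`/`close` abstract; a minimal in-memory subclass sketch shows how `read_until` consumes buffered bytes (hypothetical subclass, assuming the `Pipe` class above plus its `re` and `contextlib.contextmanager` imports are available in the module):

    class BufferPipe(Pipe):
        """Hypothetical Pipe backed by an in-memory byte string, for illustration."""
        def __init__(self, data, timeout=1.0):
            super().__init__(timeout)
            self._buf = data
        def recv(self, bufsize):
            chunk, self._buf = self._buf[:bufsize], self._buf[bufsize:]
            return chunk
        def send(self, data):
            return len(data)      # pretend everything was written
        def close(self):
            self._buf = b''

    p = BufferPipe(b'login: admin\npassword: ')
    # read_until takes a regex pattern; it returns everything received up to and
    # including the chunk in which the pattern first matched.
    print(p.read_until(rb'\n'))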
class ClasseFonction(Fonction): <NEW_LINE> <INDENT> @classmethod <NEW_LINE> def init_types(cls): <NEW_LINE> <INDENT> cls.ajouter_types(cls.xp, "Personnage") <NEW_LINE> cls.ajouter_types(cls.xp_niveau, "Fraction") <NEW_LINE> <DEDENT> @staticmethod <NEW_LINE> def xp(pnj): <NEW_LINE> <INDENT> if pnj.gain_xp: <NEW_LINE> <INDENT> xp = importeur.perso.gen_niveaux.grille_xp[pnj.niveau][1] <NEW_LINE> xp = xp * pnj.gain_xp / 100 <NEW_LINE> return Fraction(xp) <NEW_LINE> <DEDENT> return Fraction(0) <NEW_LINE> <DEDENT> @staticmethod <NEW_LINE> def xp_niveau(niveau): <NEW_LINE> <INDENT> xp = importeur.perso.gen_niveaux.grille_xp[int(niveau)][1] <NEW_LINE> return Fraction(xp) | Returns the XP of an NPC (PNJ) passed as a parameter. | 625990752ae34c7f260aca3f
class readMap: <NEW_LINE> <INDENT> def __init__(self, dataDir = 'data', mapFile = 'map.vtk'): <NEW_LINE> <INDENT> reader = vtk.vtkStructuredGridReader() <NEW_LINE> reader.SetFileName(dataDir + '/' + mapFile) <NEW_LINE> reader.Update() <NEW_LINE> output = reader.GetOutput() <NEW_LINE> field = output.GetFieldData() <NEW_LINE> nArrays = field.GetNumberOfArrays() <NEW_LINE> class params: pass <NEW_LINE> p = params() <NEW_LINE> for i in range(nArrays): <NEW_LINE> <INDENT> arrayName = field.GetArrayName(i) <NEW_LINE> if any(arrayName == np.array(['x0', 'y0', 'sl'])): <NEW_LINE> <INDENT> setattr(self, arrayName, VN.vtk_to_numpy(field.GetArray(arrayName))) <NEW_LINE> <DEDENT> elif any(arrayName == np.array(['sub', 'hMin', 'hMax', 'lMax', 'tol', 'iterMax'])): <NEW_LINE> <INDENT> setattr(self, arrayName, VN.vtk_to_numpy(field.GetArray(arrayName))[0]) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> setattr(p, arrayName, VN.vtk_to_numpy(field.GetArray(arrayName))[0]) <NEW_LINE> <DEDENT> <DEDENT> setattr(self, 'p', p) <NEW_LINE> points = output.GetPoints() <NEW_LINE> data = points.GetData() <NEW_LINE> tracers0 = VN.vtk_to_numpy(data) <NEW_LINE> tracers0 = np.reshape(tracers0, (p.ny+2,p.nx+2,3)) <NEW_LINE> tracers0 = np.swapaxes(tracers0, 0, 2) <NEW_LINE> self.tracers0 = tracers0 <NEW_LINE> pointData = output.GetPointData() <NEW_LINE> nArrays = pointData.GetNumberOfArrays() <NEW_LINE> for i in range(nArrays): <NEW_LINE> <INDENT> name = pointData.GetArrayName(i) <NEW_LINE> array = VN.vtk_to_numpy(pointData.GetArray(i)) <NEW_LINE> array = np.reshape(array, (p.nx+2, p.ny+2)) <NEW_LINE> array = np.swapaxes(array, 0, 1) <NEW_LINE> setattr(self, name, array) | readMap -- Holds the integration maps. | 62599075dc8b845886d54f14 |
class List(ValueField): <NEW_LINE> <INDENT> structcode = None <NEW_LINE> def __init__(self, name, type, pad = 1): <NEW_LINE> <INDENT> ValueField.__init__(self, name) <NEW_LINE> self.type = type <NEW_LINE> self.pad = pad <NEW_LINE> <DEDENT> def parse_binary_value(self, data, display, length, format): <NEW_LINE> <INDENT> if length is None: <NEW_LINE> <INDENT> ret = [] <NEW_LINE> if self.type.structcode is None: <NEW_LINE> <INDENT> while data: <NEW_LINE> <INDENT> val, data = self.type.parse_binary(data, display) <NEW_LINE> ret.append(val) <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> scode = '=' + self.type.structcode <NEW_LINE> slen = struct.calcsize(scode) <NEW_LINE> pos = 0 <NEW_LINE> while pos + slen <= len(data): <NEW_LINE> <INDENT> v = struct.unpack(scode, data[pos: pos + slen]) <NEW_LINE> if self.type.structvalues == 1: <NEW_LINE> <INDENT> v = v[0] <NEW_LINE> <DEDENT> if self.type.parse_value is None: <NEW_LINE> <INDENT> ret.append(v) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> ret.append(self.type.parse_value(v, display)) <NEW_LINE> <DEDENT> pos = pos + slen <NEW_LINE> <DEDENT> data = data[pos:] <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> ret = [None] * int(length) <NEW_LINE> if self.type.structcode is None: <NEW_LINE> <INDENT> for i in range(0, length): <NEW_LINE> <INDENT> ret[i], data = self.type.parse_binary(data, display) <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> scode = '=' + self.type.structcode <NEW_LINE> slen = struct.calcsize(scode) <NEW_LINE> pos = 0 <NEW_LINE> for i in range(0, length): <NEW_LINE> <INDENT> v = struct.unpack(scode, data[pos: pos + slen]) <NEW_LINE> if self.type.structvalues == 1: <NEW_LINE> <INDENT> v = v[0] <NEW_LINE> <DEDENT> if self.type.parse_value is None: <NEW_LINE> <INDENT> ret[i] = v <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> ret[i] = self.type.parse_value(v, display) <NEW_LINE> <DEDENT> pos = pos + slen <NEW_LINE> <DEDENT> data = data[pos:] <NEW_LINE> <DEDENT> <DEDENT> if self.pad: <NEW_LINE> <INDENT> data = data[len(data) % 4:] <NEW_LINE> <DEDENT> return ret, data <NEW_LINE> <DEDENT> def pack_value(self, val): <NEW_LINE> <INDENT> if self.type.structcode and len(self.type.structcode) == 1: <NEW_LINE> <INDENT> if self.type.check_value is not None: <NEW_LINE> <INDENT> val = [self.type.check_value(v) for v in val] <NEW_LINE> <DEDENT> data = array(struct_to_array_codes[self.type.structcode], val).tostring() <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> data = [] <NEW_LINE> for v in val: <NEW_LINE> <INDENT> data.append(self.type.pack_value(v)) <NEW_LINE> <DEDENT> data = b''.join(data) <NEW_LINE> <DEDENT> if self.pad: <NEW_LINE> <INDENT> dlen = len(data) <NEW_LINE> data = data + b'\0' * ((4 - dlen % 4) % 4) <NEW_LINE> <DEDENT> return data, len(val), None | The List, FixedList and Object fields store compound data objects.
The type of data objects must be provided as an object with the
following attributes and methods:
... | 625990754a966d76dd5f0845 |
class EventHandler(Popup): <NEW_LINE> <INDENT> name = ObjectProperty() <NEW_LINE> location = ObjectProperty() <NEW_LINE> begin = ObjectProperty() <NEW_LINE> end = ObjectProperty() <NEW_LINE> def __init__(self, **kwargs): <NEW_LINE> <INDENT> super(EventHandler, self).__init__(**kwargs) <NEW_LINE> self.event = kwargs['event'] <NEW_LINE> self.widget = kwargs['widget'] <NEW_LINE> Clock.schedule_once(lambda dt: self.load_event(), -1) <NEW_LINE> <DEDENT> def load_event(self): <NEW_LINE> <INDENT> self.name.text = self.event.name <NEW_LINE> self.location.text = self.event.location <NEW_LINE> self.begin.text = self.event.begin.strftime("%d/%m/%y") <NEW_LINE> self.end.text = self.event.end.strftime("%d/%m/%y") <NEW_LINE> <DEDENT> def save(self): <NEW_LINE> <INDENT> if not self.name.text or not self.location.text: <NEW_LINE> <INDENT> return <NEW_LINE> <DEDENT> self.event.name = self.name.text.decode('utf-8') <NEW_LINE> self.event.location = self.location.text.decode('utf-8') <NEW_LINE> try: <NEW_LINE> <INDENT> self.event.begin = datetime.datetime. strptime(self.begin.text, "%d/%m/%y").date() <NEW_LINE> self.event.end = datetime.datetime. strptime(self.end.text, "%d/%m/%y").date() <NEW_LINE> <DEDENT> except: <NEW_LINE> <INDENT> self.dismiss() <NEW_LINE> return <NEW_LINE> <DEDENT> models.Session.add(self.event) <NEW_LINE> models.Session.commit() <NEW_LINE> if self.widget and type(self.widget) == Event: <NEW_LINE> <INDENT> self.widget.update_name() <NEW_LINE> <DEDENT> if self.widget and type(self.widget) == EventsList: <NEW_LINE> <INDENT> self.widget.list_layout.add_widget(Event(event=self.event)) <NEW_LINE> <DEDENT> self.dismiss() | A quick form to edit and add an event | 625990758e7ae83300eea9ec |
class VirtualMachineScaleSetOSDisk(Model): <NEW_LINE> <INDENT> _validation = { 'name': {'required': True}, 'create_option': {'required': True}, } <NEW_LINE> _attribute_map = { 'name': {'key': 'name', 'type': 'str'}, 'caching': {'key': 'caching', 'type': 'CachingTypes'}, 'create_option': {'key': 'createOption', 'type': 'DiskCreateOptionTypes'}, 'os_type': {'key': 'osType', 'type': 'OperatingSystemTypes'}, 'image': {'key': 'image', 'type': 'VirtualHardDisk'}, 'vhd_containers': {'key': 'vhdContainers', 'type': '[str]'}, } <NEW_LINE> def __init__(self, *, name: str, create_option, caching=None, os_type=None, image=None, vhd_containers=None, **kwargs) -> None: <NEW_LINE> <INDENT> super(VirtualMachineScaleSetOSDisk, self).__init__(**kwargs) <NEW_LINE> self.name = name <NEW_LINE> self.caching = caching <NEW_LINE> self.create_option = create_option <NEW_LINE> self.os_type = os_type <NEW_LINE> self.image = image <NEW_LINE> self.vhd_containers = vhd_containers | Describes a virtual machine scale set operating system disk.
All required parameters must be populated in order to send to Azure.
:param name: Required. The disk name.
:type name: str
:param caching: Specifies the caching requirements. <br><br> Possible
values are: <br><br> **None** <br><br> **ReadOnly** <br><br> **ReadWrite**
<br><br> Default: **None for Standard storage. ReadOnly for Premium
storage**. Possible values include: 'None', 'ReadOnly', 'ReadWrite'
:type caching: str or ~azure.mgmt.compute.v2016_03_30.models.CachingTypes
:param create_option: Required. Specifies how the virtual machines in the
scale set should be created.<br><br> The only allowed value is:
**FromImage** – This value is used when you are using an image to
create the virtual machine. If you are using a platform image, you also
use the imageReference element described above. If you are using a
marketplace image, you also use the plan element previously described.
Possible values include: 'FromImage', 'Empty', 'Attach'
:type create_option: str or
~azure.mgmt.compute.v2016_03_30.models.DiskCreateOptionTypes
:param os_type: This property allows you to specify the type of the OS
that is included in the disk if creating a VM from user-image or a
specialized VHD. <br><br> Possible values are: <br><br> **Windows**
<br><br> **Linux**. Possible values include: 'Windows', 'Linux'
:type os_type: str or
~azure.mgmt.compute.v2016_03_30.models.OperatingSystemTypes
:param image: The Source User Image VirtualHardDisk. This VirtualHardDisk
will be copied before using it to attach to the Virtual Machine. If
SourceImage is provided, the destination VirtualHardDisk should not exist.
:type image: ~azure.mgmt.compute.v2016_03_30.models.VirtualHardDisk
:param vhd_containers: The list of virtual hard disk container uris.
:type vhd_containers: list[str] | 625990755fcc89381b266e06 |
class EchoServerProtocol(asyncio.Protocol): <NEW_LINE> <INDENT> def __init__(self, loop=None): <NEW_LINE> <INDENT> self.deserializer = EchoPacket.Deserializer() <NEW_LINE> self.loop = loop <NEW_LINE> self.transport = None <NEW_LINE> <DEDENT> def connection_made(self, transport): <NEW_LINE> <INDENT> print("Received a connection from {}".format(transport.get_extra_info("peername"))) <NEW_LINE> self.transport = transport <NEW_LINE> <DEDENT> def connection_lost(self, reason=None): <NEW_LINE> <INDENT> print("Lost connection to client. Cleaning up.") <NEW_LINE> if self.loop: <NEW_LINE> <INDENT> self.loop.stop() <NEW_LINE> <DEDENT> <DEDENT> def data_received(self, data): <NEW_LINE> <INDENT> self.deserializer.update(data) <NEW_LINE> for echoPacket in self.deserializer.nextPackets(): <NEW_LINE> <INDENT> if echoPacket.original: <NEW_LINE> <INDENT> print("Got {} from client.".format(echoPacket.message)) <NEW_LINE> if echoPacket.message == "__QUIT__": <NEW_LINE> <INDENT> print("Client instructed server to quit. Terminating") <NEW_LINE> self.transport.close() <NEW_LINE> return <NEW_LINE> <DEDENT> responsePacket = EchoPacket() <NEW_LINE> responsePacket.original = False <NEW_LINE> responsePacket.message = echoPacket.message <NEW_LINE> self.transport.write(responsePacket.__serialize__()) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> print("Got a packet from client not marked as 'original'. Dropping") | This is our class for the Server's protocol. It simply receives
an EchoProtocolMessage and sends back a response | 6259907501c39578d7f143e2 |
class PlayerRaceMultiClass: <NEW_LINE> <INDENT> def __init__(self,race,klass): <NEW_LINE> <INDENT> PlayerBase.__init__(self, race, klass) | Player Multi Class set | 62599075097d151d1a2c29cf |
class Lista(Expr): <NEW_LINE> <INDENT> def __init__(self,cond,instr): <NEW_LINE> <INDENT> self.type = "lista" <NEW_LINE> self.condicion = cond <NEW_LINE> self.instrucciones = instr <NEW_LINE> self.sig = None | Node that stores the program's identifiers | 62599075a8370b77170f1d27
class SectionB(models.Model): <NEW_LINE> <INDENT> qapp = models.ForeignKey(Qapp, on_delete=models.CASCADE) <NEW_LINE> sectionb_type = models.ForeignKey( SectionBType, on_delete=models.CASCADE) <NEW_LINE> b1_1 = models.TextField(blank=True, null=True) <NEW_LINE> b1_2 = models.TextField(blank=True, null=True) <NEW_LINE> b1_3 = models.TextField(blank=True, null=True) <NEW_LINE> b1_4 = models.TextField(blank=True, null=True) <NEW_LINE> b1_5 = models.TextField(blank=True, null=True) <NEW_LINE> b2_1 = models.TextField(blank=True, null=True) <NEW_LINE> b2_2 = models.TextField(blank=True, null=True) <NEW_LINE> b2_3 = models.TextField(blank=True, null=True) <NEW_LINE> b2_4 = models.TextField(blank=True, null=True) <NEW_LINE> b2_5 = models.TextField(blank=True, null=True) <NEW_LINE> b2_6 = models.TextField(blank=True, null=True) <NEW_LINE> b2_7 = models.TextField(blank=True, null=True) <NEW_LINE> b2_8 = models.TextField(blank=True, null=True) <NEW_LINE> b3_1 = models.TextField(blank=True, null=True) <NEW_LINE> b3_2 = models.TextField(blank=True, null=True) <NEW_LINE> b3_3 = models.TextField(blank=True, null=True) <NEW_LINE> b3_4 = models.TextField(blank=True, null=True) <NEW_LINE> b3_5 = models.TextField(blank=True, null=True) <NEW_LINE> b3_6 = models.TextField(blank=True, null=True) <NEW_LINE> b3_7 = models.TextField(blank=True, null=True) <NEW_LINE> b3_8 = models.TextField(blank=True, null=True) <NEW_LINE> b3_9 = models.TextField(blank=True, null=True) <NEW_LINE> b3_10 = models.TextField(blank=True, null=True) <NEW_LINE> b4_1 = models.TextField(blank=True, null=True) <NEW_LINE> b4_2 = models.TextField(blank=True, null=True) <NEW_LINE> b4_3 = models.TextField(blank=True, null=True) <NEW_LINE> b4_4 = models.TextField(blank=True, null=True) <NEW_LINE> b4_5 = models.TextField(blank=True, null=True) <NEW_LINE> b5_1 = models.TextField(blank=True, null=True) <NEW_LINE> b5_2 = models.TextField(blank=True, null=True) <NEW_LINE> b6_1 = models.TextField(blank=True, null=True) <NEW_LINE> b6_2 = models.TextField(blank=True, null=True) <NEW_LINE> class Meta: <NEW_LINE> <INDENT> unique_together = ('qapp', 'sectionb_type') | Class representing the entirety of SectionB for a given QAPP.
Instead of creating a Section B class for each of the Section B Types,
there will instead be one class with extra nullable fields. There
will likely still be multiple forms for the different Section B Types. | 625990753317a56b869bf1f3 |
class Card: <NEW_LINE> <INDENT> def __init__(self, rank, suit): <NEW_LINE> <INDENT> if rank not in RANKS: <NEW_LINE> <INDENT> raise ValueError('Invalid card rank.') <NEW_LINE> <DEDENT> if suit not in SUITS: <NEW_LINE> <INDENT> raise ValueError('Invalid card suit.') <NEW_LINE> <DEDENT> self._rank = rank <NEW_LINE> self._suit = suit <NEW_LINE> self._value = self._rank if self._rank in range(2, 10) else 1 if self._rank == 'ace' else 0 <NEW_LINE> <DEDENT> @property <NEW_LINE> def value(self): <NEW_LINE> <INDENT> return self._value <NEW_LINE> <DEDENT> @property <NEW_LINE> def rank(self): <NEW_LINE> <INDENT> return self._rank <NEW_LINE> <DEDENT> @property <NEW_LINE> def suit(self): <NEW_LINE> <INDENT> return self._suit <NEW_LINE> <DEDENT> def __add__(self, other): <NEW_LINE> <INDENT> return (self._value + other) % 10 <NEW_LINE> <DEDENT> __radd__ = __add__ <NEW_LINE> def __repr__(self): <NEW_LINE> <INDENT> if isinstance(self._rank, str): <NEW_LINE> <INDENT> return f'Card(\'{self._rank}\', \'{self._suit}\')' <NEW_LINE> <DEDENT> elif isinstance(self._rank, int): <NEW_LINE> <INDENT> return f'Card({self._rank}, \'{self._suit}\')' <NEW_LINE> <DEDENT> <DEDENT> def __str__(self): <NEW_LINE> <INDENT> return f'{self._rank} of {self._suit}' | Playing card to be used to fill a baccarat shoe and
to be drawn to a playing hand.
Args:
rank: int or string, the rank of the card.
suit: string, the suit of the card.
Attributes:
value: int, baccarat value of the card.
rank: int or string, the rank of the card.
suit: string, the suit of the card.
Raises:
ValueError: On invalid card rank or suit. | 6259907567a9b606de547752 |
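A standalone sketch of the baccarat arithmetic the `Card` class encodes: ranks 2-9 keep their face value, aces count as 1, tens and face cards count as 0, and hand totals wrap modulo 10. Plain functions are used here since `Card` also depends on `RANKS`/`SUITS` constants not shown above:

    def baccarat_value(rank):
        """Baccarat value of a rank: 2-9 face value, ace 1, 10/face cards 0."""
        if rank in range(2, 10):
            return rank
        return 1 if rank == 'ace' else 0

    hand = ['ace', 7, 'king']
    total = 0
    for rank in hand:
        total = (total + baccarat_value(rank)) % 10   # mirrors Card.__add__
    print(total)    # 8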
class MAVLink_manual_setpoint_message(MAVLink_message): <NEW_LINE> <INDENT> id = MAVLINK_MSG_ID_MANUAL_SETPOINT <NEW_LINE> name = 'MANUAL_SETPOINT' <NEW_LINE> fieldnames = ['time_boot_ms', 'roll', 'pitch', 'yaw', 'thrust', 'mode_switch', 'manual_override_switch'] <NEW_LINE> ordered_fieldnames = ['time_boot_ms', 'roll', 'pitch', 'yaw', 'thrust', 'mode_switch', 'manual_override_switch'] <NEW_LINE> fieldtypes = ['uint32_t', 'float', 'float', 'float', 'float', 'uint8_t', 'uint8_t'] <NEW_LINE> fielddisplays_by_name = {} <NEW_LINE> fieldenums_by_name = {} <NEW_LINE> fieldunits_by_name = {"time_boot_ms": "ms", "roll": "rad/s", "pitch": "rad/s", "yaw": "rad/s"} <NEW_LINE> format = '<IffffBB' <NEW_LINE> native_format = bytearray('<IffffBB', 'ascii') <NEW_LINE> orders = [0, 1, 2, 3, 4, 5, 6] <NEW_LINE> lengths = [1, 1, 1, 1, 1, 1, 1] <NEW_LINE> array_lengths = [0, 0, 0, 0, 0, 0, 0] <NEW_LINE> crc_extra = 106 <NEW_LINE> unpacker = struct.Struct('<IffffBB') <NEW_LINE> def __init__(self, time_boot_ms, roll, pitch, yaw, thrust, mode_switch, manual_override_switch): <NEW_LINE> <INDENT> MAVLink_message.__init__(self, MAVLink_manual_setpoint_message.id, MAVLink_manual_setpoint_message.name) <NEW_LINE> self._fieldnames = MAVLink_manual_setpoint_message.fieldnames <NEW_LINE> self.time_boot_ms = time_boot_ms <NEW_LINE> self.roll = roll <NEW_LINE> self.pitch = pitch <NEW_LINE> self.yaw = yaw <NEW_LINE> self.thrust = thrust <NEW_LINE> self.mode_switch = mode_switch <NEW_LINE> self.manual_override_switch = manual_override_switch <NEW_LINE> <DEDENT> def pack(self, mav, force_mavlink1=False): <NEW_LINE> <INDENT> return MAVLink_message.pack(self, mav, 106, struct.pack('<IffffBB', self.time_boot_ms, self.roll, self.pitch, self.yaw, self.thrust, self.mode_switch, self.manual_override_switch), force_mavlink1=force_mavlink1) | Setpoint in roll, pitch, yaw and thrust from the operator | 6259907556ac1b37e6303990 |
@command_lib.CommandRegexParser(r'chimps?%s' % SUMMONER_REGEX) <NEW_LINE> class ChimpsCommand(_BaseSummonerCommand): <NEW_LINE> <INDENT> _hypebot_message = messages.HYPEBOT_IS_THE_CHIMP_STRING <NEW_LINE> def _Handle(self, channel, user, smurfs, region, name): <NEW_LINE> <INDENT> name = summoner_lib.NormalizeSummoner(name) <NEW_LINE> if name == 'me': <NEW_LINE> <INDENT> self._core.last_command = partial(self._Handle, smurfs=smurfs, region=region, name=name) <NEW_LINE> <DEDENT> summoners = self._ParseSummoners(channel, user, smurfs, region, name) <NEW_LINE> return [self._core.summoner.Chimps(summoner) for summoner in summoners] | Display chimp mastery. | 625990752c8b7c6e89bd5144 |
class OrderTax(models.Model): <NEW_LINE> <INDENT> order = models.ForeignKey(Order, related_name="taxes") <NEW_LINE> name = models.CharField(max_length=20) <NEW_LINE> rate = models.DecimalField(max_digits=7, decimal_places=4, validators=[MinValueValidator(0.0)]) <NEW_LINE> total = models.DecimalField(max_digits=9, decimal_places=2, validators=[MinValueValidator(0.0)]) | A tax (eg GST or PST) that has been applied to an order. | 6259907566673b3332c31d5a |
class AvailablePrivateEndpointTypesResult(msrest.serialization.Model): <NEW_LINE> <INDENT> _validation = { 'next_link': {'readonly': True}, } <NEW_LINE> _attribute_map = { 'value': {'key': 'value', 'type': '[AvailablePrivateEndpointType]'}, 'next_link': {'key': 'nextLink', 'type': 'str'}, } <NEW_LINE> def __init__( self, *, value: Optional[List["AvailablePrivateEndpointType"]] = None, **kwargs ): <NEW_LINE> <INDENT> super(AvailablePrivateEndpointTypesResult, self).__init__(**kwargs) <NEW_LINE> self.value = value <NEW_LINE> self.next_link = None | An array of available PrivateEndpoint types.
Variables are only populated by the server, and will be ignored when sending a request.
:param value: An array of available privateEndpoint type.
:type value: list[~azure.mgmt.network.v2020_07_01.models.AvailablePrivateEndpointType]
:ivar next_link: The URL to get the next set of results.
:vartype next_link: str | 625990755166f23b2e244d32 |
class BaseVotedEntryComment(models.Model): <NEW_LINE> <INDENT> class Meta: <NEW_LINE> <INDENT> abstract = True <NEW_LINE> <DEDENT> body = models.TextField() <NEW_LINE> created_at = CreationDateTimeField() <NEW_LINE> user = models.ForeignKey(User) <NEW_LINE> def get_absolute_url(self): <NEW_LINE> <INDENT> return self.entry.get_absolute_url() + '#comment-%s' % self.id | Inherit and make sure to add a FK to the VotedEntry subclass named entry
entry = models.ForeignKey(VotedEntry, related_name='comments') | 62599075dc8b845886d54f16 |
class Seg3(Net1D): <NEW_LINE> <INDENT> def __init__(self, *args, **kwds): <NEW_LINE> <INDENT> super(Seg3, self).__init__(*args, **kwds) <NEW_LINE> self.N = lambda ksi,eta: array([ksi*(ksi - 1)/2, ksi*(ksi + 1)/2, -ksi**2 + 1]) <NEW_LINE> self.dNdk = lambda ksi,eta: array([ksi - 1/2, ksi + 1/2, -2*ksi]) | 3-node second order line (2 nodes associated with the vertices and 1 with the edge). | 62599075aad79263cf430114 |
class MenuItem(MenuNode): <NEW_LINE> <INDENT> def __init__(self, id, string): <NEW_LINE> <INDENT> super(MenuItem, self).__init__() <NEW_LINE> self.id = id <NEW_LINE> self.string = string <NEW_LINE> <DEDENT> def render(self, dialog, res): <NEW_LINE> <INDENT> dialog.MenuAddString(self.id, self.string) <NEW_LINE> <DEDENT> def find_node(self, node_id, res): <NEW_LINE> <INDENT> if node_id == self.id: <NEW_LINE> <INDENT> return self <NEW_LINE> <DEDENT> <DEDENT> def copy(self): <NEW_LINE> <INDENT> return MenuItem(self.id, self.string) | This class represents an item added via
:meth:`c4d.gui.GeDialog.MenuAddString`. It is not created from this
module but may be used create dynamic menus.
.. attribute:: id
The integral number of the symbol to add.
.. attribute:: string
The menu-commands item string. | 625990754428ac0f6e659e8b |
@scenario.configure( name="Kubernetes.create_scale_and_delete_replication_controller", platform="kubernetes" ) <NEW_LINE> class CreateScaleAndDeleteRCPlugin(common_scenario.BaseKubernetesScenario): <NEW_LINE> <INDENT> def run(self, image, replicas, scale_replicas, image_pull_policy='IfNotPresent', command=None, status_wait=True): <NEW_LINE> <INDENT> namespace = self.choose_namespace() <NEW_LINE> name = self.client.create_rc( namespace=namespace, replicas=replicas, image=image, image_pull_policy=image_pull_policy, command=command, status_wait=status_wait ) <NEW_LINE> self.client.scale_rc( name, namespace=namespace, replicas=scale_replicas, status_wait=status_wait ) <NEW_LINE> self.client.scale_rc( name, namespace=namespace, replicas=replicas, status_wait=status_wait ) <NEW_LINE> self.client.delete_rc( name, namespace=namespace, status_wait=status_wait ) | Kubernetes replication controller scale test.
Create replication controller, scale it with number of replicas,
scale it with original number of replicas, delete replication controller. | 625990752ae34c7f260aca41 |
class RnnModel(nn.Module): <NEW_LINE> <INDENT> def __init__(self, input_dim, output_dim, hidden_size, dropout_p, cell_type): <NEW_LINE> <INDENT> super(RnnModel, self).__init__() <NEW_LINE> self.output_dim = output_dim <NEW_LINE> self.hidden_size = hidden_size <NEW_LINE> self.cell_type = cell_type <NEW_LINE> self.dropout = nn.Dropout(dropout_p) <NEW_LINE> if cell_type == 'LSTM': <NEW_LINE> <INDENT> self.encoder = nn.LSTM(input_dim, hidden_size) <NEW_LINE> <DEDENT> elif cell_type == 'GRU': <NEW_LINE> <INDENT> self.encoder = nn.GRU(input_dim, hidden_size) <NEW_LINE> <DEDENT> elif cell_type == 'RNN': <NEW_LINE> <INDENT> self.encoder = nn.RNN(input_dim, hidden_size) <NEW_LINE> <DEDENT> self.out = nn.Linear(hidden_size, output_dim) <NEW_LINE> <DEDENT> def forward(self, input_seq, hidden_state): <NEW_LINE> <INDENT> input_seq = self.dropout(input_seq) <NEW_LINE> encoder_outputs, _ = self.encoder(input_seq, hidden_state) <NEW_LINE> score_seq = self.out(encoder_outputs[-1, :, :]) <NEW_LINE> return score_seq, encoder_outputs[-1, :, :] <NEW_LINE> <DEDENT> def init_hidden(self, batch_size): <NEW_LINE> <INDENT> if self.cell_type == 'LSTM': <NEW_LINE> <INDENT> h_init = torch.zeros(1, batch_size, self.hidden_size) <NEW_LINE> c_init = torch.zeros(1, batch_size, self.hidden_size) <NEW_LINE> return (h_init, c_init) <NEW_LINE> <DEDENT> elif self.cell_type == 'GRU': <NEW_LINE> <INDENT> return torch.zeros(1, batch_size, self.hidden_size) <NEW_LINE> <DEDENT> elif self.cell_type == 'RNN': <NEW_LINE> <INDENT> return torch.zeros(1, batch_size, self.hidden_size) | An RNN model using either RNN, LSTM or GRU cells. | 625990755fdd1c0f98e5f8d9 |
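A usage sketch, assuming the `RnnModel` class above is importable and PyTorch is installed; tensor shapes follow the default (seq_len, batch, features) layout of `nn.LSTM`, since `batch_first` is not set:

    import torch

    model = RnnModel(input_dim=8, output_dim=3, hidden_size=16,
                     dropout_p=0.1, cell_type='LSTM')
    x = torch.randn(20, 4, 8)                 # (seq_len, batch, input_dim)
    hidden = model.init_hidden(batch_size=4)  # (h0, c0) tuple for the LSTM cell type
    scores, last_output = model(x, hidden)
    print(scores.shape)        # torch.Size([4, 3])  -> one score vector per sequence
    print(last_output.shape)   # torch.Size([4, 16]) -> last encoder output per sequence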
class LowVarianceRemover(BaseEstimator, TransformerMixin): <NEW_LINE> <INDENT> def fit(self, data, target=None): <NEW_LINE> <INDENT> nunique = data.apply(pd.Series.nunique) <NEW_LINE> few_nunique = nunique[(nunique < 5) & (nunique.index != 'proxy')].index <NEW_LINE> self.toremove = few_nunique <NEW_LINE> return self <NEW_LINE> <DEDENT> def transform(self, data): <NEW_LINE> <INDENT> return data.drop(self.toremove, axis=1) | This transform removes all features with few unique values.
It keeps `proxy` because that is a boolean. | 62599075bf627c535bcb2e29 |
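A small usage sketch with made-up data, assuming pandas and scikit-learn (for `BaseEstimator`/`TransformerMixin`) are available alongside the class above; columns with fewer than five unique values are dropped, except the `proxy` column:

    import pandas as pd

    df = pd.DataFrame({
        'proxy': [0, 1, 0, 1, 0, 1],            # boolean-like, but always kept
        'age':   [21, 34, 45, 52, 63, 18],      # 6 unique values -> kept
        'flag':  [0, 0, 1, 1, 0, 1],            # 2 unique values -> dropped
    })
    remover = LowVarianceRemover()
    print(list(remover.fit(df).transform(df).columns))   # ['proxy', 'age']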
class StarbucksStore: <NEW_LINE> <INDENT> _name: str = '' <NEW_LINE> _address: str = '' <NEW_LINE> _is_open: str = '' <NEW_LINE> def __init__(self, name: str, address: str, is_open: bool): <NEW_LINE> <INDENT> self._name = name <NEW_LINE> self._address = address <NEW_LINE> self._is_open = is_open <NEW_LINE> <DEDENT> def __repr__(self): <NEW_LINE> <INDENT> return self.description <NEW_LINE> <DEDENT> @property <NEW_LINE> def description(self): <NEW_LINE> <INDENT> output = '{}, {}'.format(self.name, self.address) <NEW_LINE> if self.is_open is True: <NEW_LINE> <INDENT> output += ' (open now).' <NEW_LINE> <DEDENT> return output <NEW_LINE> <DEDENT> @property <NEW_LINE> def name(self): <NEW_LINE> <INDENT> return self._name <NEW_LINE> <DEDENT> @property <NEW_LINE> def address(self): <NEW_LINE> <INDENT> return self._address <NEW_LINE> <DEDENT> @property <NEW_LINE> def is_open(self): <NEW_LINE> <INDENT> return self._is_open | Starbucks Store that does not allow the user to change the attributes once it has been created.
This has been done to show you another style of modelling where the model does not allow mutations, therefore, we
get around potential problems of different parts of our code changing a single model that might be referenced in
multiple places. | 62599075a05bb46b3848bdd9 |
class IssueArbitraryStatusUpdater(IssueStatusUpdater): <NEW_LINE> <INDENT> to_state = None <NEW_LINE> def get_parser(self, prog_name): <NEW_LINE> <INDENT> statuses = [s.name for s in self.app.client.statuses()] <NEW_LINE> parser = super(IssueArbitraryStatusUpdater, self).get_parser(prog_name) <NEW_LINE> parser.add_argument('-s', '--to-state', choices=statuses) <NEW_LINE> return parser <NEW_LINE> <DEDENT> def update_issue_status(self, issue, _, parsed_args): <NEW_LINE> <INDENT> super(IssueArbitraryStatusUpdater, self).update_issue_status( issue, parsed_args.to_state, parsed_args) | Update jira issue status | 625990758a43f66fc4bf3af2 |
class AllHelp(HelpRequest): <NEW_LINE> <INDENT> pass | The user requested a dump of all help info. | 62599075d268445f2663a80c |
class IscsiIsnsGetIterKeyTd(NetAppObject): <NEW_LINE> <INDENT> _key_0 = None <NEW_LINE> @property <NEW_LINE> def key_0(self): <NEW_LINE> <INDENT> return self._key_0 <NEW_LINE> <DEDENT> @key_0.setter <NEW_LINE> def key_0(self, val): <NEW_LINE> <INDENT> if val != None: <NEW_LINE> <INDENT> self.validate('key_0', val) <NEW_LINE> <DEDENT> self._key_0 = val <NEW_LINE> <DEDENT> @staticmethod <NEW_LINE> def get_api_name(): <NEW_LINE> <INDENT> return "iscsi-isns-get-iter-key-td" <NEW_LINE> <DEDENT> @staticmethod <NEW_LINE> def get_desired_attrs(): <NEW_LINE> <INDENT> return [ 'key-0', ] <NEW_LINE> <DEDENT> def describe_properties(self): <NEW_LINE> <INDENT> return { 'key_0': { 'class': basestring, 'is_list': False, 'required': 'optional' }, } | Key typedef for table iscsiIsns | 625990757d43ff24874280c2 |
class xml_node(object): <NEW_LINE> <INDENT> def __init__(self, attribs=None, name="", fields=None): <NEW_LINE> <INDENT> if attribs is None: <NEW_LINE> <INDENT> attribs = {} <NEW_LINE> <DEDENT> if fields is None: <NEW_LINE> <INDENT> fields = [] <NEW_LINE> <DEDENT> self.attribs = attribs <NEW_LINE> self.name = name <NEW_LINE> self.fields = fields | Class to handle a particular xml tag.
Tags are generally written in the form
<tag_name attribs="attrib_data"> main_data </tag_name>. This class holds
tag_name, attrib_data and main_data separately so they can be used to
create the objects with the appropriate names and data.
Attributes:
attribs: The attribute data for the tag.
fields: The rest of the data.
name: The tag name. | 625990757d847024c075dd37 |
class SetRI(Opcode): <NEW_LINE> <INDENT> def __init__(self, target: int, intval: int) -> None: <NEW_LINE> <INDENT> self.target = target <NEW_LINE> self.intval = intval <NEW_LINE> <DEDENT> def __str__(self) -> str: <NEW_LINE> <INDENT> return 'r%d = %d' % (self.target, self.intval) | Assign integer literal to register (rN = N). | 62599075a8370b77170f1d29 |
class MelonType: <NEW_LINE> <INDENT> def __init__(self, code, first_harvest, color, is_seedless, is_bestseller, name): <NEW_LINE> <INDENT> self.code = code <NEW_LINE> self.first_harvest=first_harvest <NEW_LINE> self.color=color <NEW_LINE> self.is_seedless=is_seedless <NEW_LINE> self.is_bestseller = is_bestseller <NEW_LINE> self.name=name <NEW_LINE> self.pairings = [] <NEW_LINE> <DEDENT> def add_pairing(self, pairing): <NEW_LINE> <INDENT> self.pairings.append(pairing) <NEW_LINE> return None <NEW_LINE> <DEDENT> def update_code(self, new_code): <NEW_LINE> <INDENT> self.code = new_code <NEW_LINE> return None <NEW_LINE> <DEDENT> def __repr__(self): <NEW_LINE> <INDENT> return "Code: {}, First harvest: {}, Color: {}, Seedless: {}, bestseller: {}, name: {}".format(self.code, self.first_harvest, self.color, self.is_seedless, self.is_bestseller,self.name) | A species of melon at a melon farm. | 62599075a8370b77170f1d2a |
class TagDayJob(BaseJob): <NEW_LINE> <INDENT> def process(self): <NEW_LINE> <INDENT> data = self.data <NEW_LINE> pictures = list(self.db.pictures.find({ 'year': data['year'], 'month': data['month'], 'day': data['day'] })) <NEW_LINE> log.info("Tagging day: %s-%s-%s (%s pictures)" % (data['year'], data['month'], data['day'], len(pictures))) <NEW_LINE> tags = data['tags'] <NEW_LINE> for picture in pictures: <NEW_LINE> <INDENT> self.db.tags.change_for_picture(picture['id'], tags) <NEW_LINE> <DEDENT> log.info("Done") | This job will receive a year/month/day and change the tags
of all pictures on that date to the new ones. | 62599075627d3e7fe0e087e5
class TestRegistrationFieldsResponse(unittest.TestCase): <NEW_LINE> <INDENT> def setUp(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def tearDown(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def testRegistrationFieldsResponse(self): <NEW_LINE> <INDENT> model = kinow_client.models.registration_fields_response.RegistrationFieldsResponse() | RegistrationFieldsResponse unit test stubs | 625990757047854f46340d16 |
class CommandException(ShellException): <NEW_LINE> <INDENT> def __init__(self, message, short='internal command error'): <NEW_LINE> <INDENT> super().__init__(message, short) | Base class for all exceptions relating to command execution
| 6259907616aa5153ce401e37 |
class APIError(Exception): <NEW_LINE> <INDENT> def __init__(self, error, data='', message=''): <NEW_LINE> <INDENT> super(APIError, self).__init__(message) <NEW_LINE> self.error = error <NEW_LINE> self.data = data <NEW_LINE> self.message = message | The base APIError which contains error (required), data (optional) and message (optional). | 62599076dc8b845886d54f18
class DualPLA: <NEW_LINE> <INDENT> def __init__(self, eta, max_iters=None): <NEW_LINE> <INDENT> self._eta = eta <NEW_LINE> self._max_iters = max_iters <NEW_LINE> <DEDENT> def classify(self, y, i): <NEW_LINE> <INDENT> scores = np.sum(np.dot((self._alpha * y), self._gram[i, :])) <NEW_LINE> scores = (scores + self._b) * y[i] <NEW_LINE> return scores <= 0 <NEW_LINE> <DEDENT> def update(self, y, i): <NEW_LINE> <INDENT> self._alpha[:, i] = self._alpha[:, i] + self._eta <NEW_LINE> self._b = self._b + self._eta * y[i] <NEW_LINE> <DEDENT> def train_fit(self, X, y): <NEW_LINE> <INDENT> self.n_features = X.shape[1] <NEW_LINE> self.n_samples = X.shape[0] <NEW_LINE> self._w = np.zeros((1, self.n_features)) <NEW_LINE> self._alpha = np.zeros((1, self.n_samples)) <NEW_LINE> self._b = 0 <NEW_LINE> self._gram = np.dot(X, X.T) <NEW_LINE> step = 0 <NEW_LINE> while not self._max_iters or step < self._max_iters: <NEW_LINE> <INDENT> step += 1 <NEW_LINE> for i in range(self.n_samples): <NEW_LINE> <INDENT> if self.classify(y, i): <NEW_LINE> <INDENT> self.update(y, i) <NEW_LINE> print("Iteration :", step, "misclassified x is ", (i + 1), "alpha: ", self._alpha, "b: ", self._b) <NEW_LINE> break <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> break <NEW_LINE> <DEDENT> <DEDENT> self._w = np.dot(self._alpha * y, X) <NEW_LINE> self.w = np.concatenate( [np.array(self._b).reshape((1, 1)), self._w], axis=1) <NEW_LINE> return self._w, self._b <NEW_LINE> <DEDENT> def predict(self, X): <NEW_LINE> <INDENT> return np.sign(np.dot(add_intercept(X, 1), self.w.T)).flatten() | Dual form of the original PLA
Model: f(x) = sign(w * x + b)
self._X : [[x1], [x2], ... x[m]]
self._Y : [y1, y2, ..., ym] | 62599076379a373c97d9a980 |
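A training sketch on the classic three-point separable toy set, assuming numpy is imported as `np` alongside the `DualPLA` class above (note that `predict` additionally relies on an external `add_intercept` helper that is not shown, so only `train_fit` is exercised here):

    import numpy as np

    X = np.array([[3.0, 3.0],
                  [4.0, 3.0],
                  [1.0, 1.0]])
    y = np.array([1, 1, -1])

    model = DualPLA(eta=1.0)
    w, b = model.train_fit(X, y)   # iterates until no sample is misclassified
    print(w, b)                    # converges to something like [[1. 1.]] -3.0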
class Driver(object): <NEW_LINE> <INDENT> def __init__(self): <NEW_LINE> <INDENT> self.logger = logging.getLogger(__name__) <NEW_LINE> self.config = DriverConfig() <NEW_LINE> <DEDENT> def tops(self): <NEW_LINE> <INDENT> return {} <NEW_LINE> <DEDENT> def fields(self): <NEW_LINE> <INDENT> return {} <NEW_LINE> <DEDENT> def info(self): <NEW_LINE> <INDENT> return {} <NEW_LINE> <DEDENT> def initialize(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def terminate(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def configure(self, config): <NEW_LINE> <INDENT> self.config.parse(config) | Base class Driver | 625990764428ac0f6e659e8d |
class InvalidState(DriverError): <NEW_LINE> <INDENT> pass | Invalid state for the requested operation. | 625990764a966d76dd5f0849 |
class WaterLimiterModifier(BaseMapModifier): <NEW_LINE> <INDENT> def __init__(self, max_water_depth): <NEW_LINE> <INDENT> assert 0 <= max_water_depth <= 9, "max_water_depth must be between 0 and 9." <NEW_LINE> self.max_water_depth = abs(max_water_depth) <NEW_LINE> <DEDENT> def modify_map(self, seed_val, mmap): <NEW_LINE> <INDENT> for y in range(0, mmap.get_map_height()): <NEW_LINE> <INDENT> for x in range(0, mmap.get_map_width()): <NEW_LINE> <INDENT> if mmap.terrain_list[y][x] != '~': <NEW_LINE> <INDENT> continue <NEW_LINE> <DEDENT> if mmap.elevation_list[y][x] > self.max_water_depth: <NEW_LINE> <INDENT> mmap.elevation_list[y][x] = self.max_water_depth | This modifier clamps the max water depth to a certain elevation. | 62599076d486a94d0ba2d917 |
class PixelCopterState(State): <NEW_LINE> <INDENT> def __init__(self, observation): <NEW_LINE> <INDENT> super().__init__() <NEW_LINE> self.observation = observation <NEW_LINE> <DEDENT> def __str__(self) -> str: <NEW_LINE> <INDENT> return str(self.observation) <NEW_LINE> <DEDENT> def copy(self): <NEW_LINE> <INDENT> c = PixelCopterState(self.observation) <NEW_LINE> c.terminal = self.terminal <NEW_LINE> return c <NEW_LINE> <DEDENT> def set_observation(self, observation): <NEW_LINE> <INDENT> self.observation = observation | A PixelCopter environment state | 62599076bf627c535bcb2e2b |
class BJ_Player(BJ_Hand): <NEW_LINE> <INDENT> def is_hitting(self): <NEW_LINE> <INDENT> response = games.ask_yes_no("\n" + self.name + ", do you want a hit? (Y/N): ") <NEW_LINE> return response == "y" <NEW_LINE> <DEDENT> def bust(self): <NEW_LINE> <INDENT> print(self.name, "busts.") <NEW_LINE> self.lose() <NEW_LINE> <DEDENT> def lose(self): <NEW_LINE> <INDENT> print(self.name, "loses.") <NEW_LINE> <DEDENT> def win(self): <NEW_LINE> <INDENT> print(self.name, "wins.") <NEW_LINE> self.bankroll += (self.bet * 2) <NEW_LINE> <DEDENT> def push(self): <NEW_LINE> <INDENT> print(self.name, "pushes.") <NEW_LINE> self.bankroll += (self.bet) | A Blackjack Player. | 625990769c8ee82313040e36 |
class MiningShaft(src.items.Item): <NEW_LINE> <INDENT> type = "MiningShaft" <NEW_LINE> def apply(self, character): <NEW_LINE> <INDENT> character.zPosition -= 1 <NEW_LINE> <DEDENT> def configure(self, character): <NEW_LINE> <INDENT> character.zPosition += 1 | In-game item to change z-levels | 625990767cff6e4e811b739e
class CustomArgument(BaseCLIArgument): <NEW_LINE> <INDENT> def __init__(self, name, help_text='', dest=None, default=None, action=None, required=None, choices=None, nargs=None, cli_type_name=None, group_name=None, positional_arg=False, no_paramfile=False): <NEW_LINE> <INDENT> self._name = name <NEW_LINE> self._help = help_text <NEW_LINE> self._dest = dest <NEW_LINE> self._default = default <NEW_LINE> self._action = action <NEW_LINE> self._required = required <NEW_LINE> self._nargs = nargs <NEW_LINE> self._cli_type_name = cli_type_name <NEW_LINE> self._group_name = group_name <NEW_LINE> self._positional_arg = positional_arg <NEW_LINE> if choices is None: <NEW_LINE> <INDENT> choices = [] <NEW_LINE> <DEDENT> self._choices = choices <NEW_LINE> self.no_paramfile = no_paramfile <NEW_LINE> self.argument_object = None <NEW_LINE> <DEDENT> @property <NEW_LINE> def cli_name(self): <NEW_LINE> <INDENT> if self._positional_arg: <NEW_LINE> <INDENT> return self._name <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> return '--' + self._name <NEW_LINE> <DEDENT> <DEDENT> def add_to_parser(self, parser): <NEW_LINE> <INDENT> cli_name = self.cli_name <NEW_LINE> kwargs = {} <NEW_LINE> if self._dest is not None: <NEW_LINE> <INDENT> kwargs['dest'] = self._dest <NEW_LINE> <DEDENT> if self._action is not None: <NEW_LINE> <INDENT> kwargs['action'] = self._action <NEW_LINE> <DEDENT> if self._default is not None: <NEW_LINE> <INDENT> kwargs['default'] = self._default <NEW_LINE> <DEDENT> if self._choices: <NEW_LINE> <INDENT> kwargs['choices'] = self._choices <NEW_LINE> <DEDENT> if self._required is not None: <NEW_LINE> <INDENT> kwargs['required'] = self._required <NEW_LINE> <DEDENT> if self._nargs is not None: <NEW_LINE> <INDENT> kwargs['nargs'] = self._nargs <NEW_LINE> <DEDENT> parser.add_argument(cli_name, **kwargs) <NEW_LINE> <DEDENT> @property <NEW_LINE> def required(self): <NEW_LINE> <INDENT> if self._required is None: <NEW_LINE> <INDENT> return False <NEW_LINE> <DEDENT> return self._required <NEW_LINE> <DEDENT> @required.setter <NEW_LINE> def required(self, value): <NEW_LINE> <INDENT> self._required = value <NEW_LINE> <DEDENT> @property <NEW_LINE> def documentation(self): <NEW_LINE> <INDENT> return self._help <NEW_LINE> <DEDENT> @property <NEW_LINE> def cli_type_name(self): <NEW_LINE> <INDENT> if self._cli_type_name is not None: <NEW_LINE> <INDENT> return self._cli_type_name <NEW_LINE> <DEDENT> elif self._action in ['store_true', 'store_false']: <NEW_LINE> <INDENT> return 'boolean' <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> return 'string' <NEW_LINE> <DEDENT> <DEDENT> @property <NEW_LINE> def cli_type(self): <NEW_LINE> <INDENT> cli_type = str <NEW_LINE> if self._action in ['store_true', 'store_false']: <NEW_LINE> <INDENT> cli_type = bool <NEW_LINE> <DEDENT> return cli_type <NEW_LINE> <DEDENT> @property <NEW_LINE> def choices(self): <NEW_LINE> <INDENT> return self._choices <NEW_LINE> <DEDENT> @property <NEW_LINE> def group_name(self): <NEW_LINE> <INDENT> return self._group_name | Represents a CLI argument that is configured from a dictionary.
For example, the "top level" arguments used for the CLI
(--region, --output) can use a CustomArgument argument,
as these are described in the cli.json file as dictionaries.
This class is also useful for plugins/customizations that want to
add additional args. | 625990763d592f4c4edbc80c |
class Author(ATCTContent): <NEW_LINE> <INDENT> security = ClassSecurityInfo() <NEW_LINE> implements(IAuthor) <NEW_LINE> meta_type = 'Author' <NEW_LINE> _at_rename_after_creation = False <NEW_LINE> schema = AuthorSchema <NEW_LINE> security.declarePrivate('_renameAfterCreation') <NEW_LINE> def _renameAfterCreation(self, check_auto_id=False): <NEW_LINE> <INDENT> plone_tool = getToolByName(self, 'plone_utils', None) <NEW_LINE> parent = self.aq_inner.aq_parent <NEW_LINE> newId = self.getName() <NEW_LINE> newId = plone_tool.normalizeString(newId) <NEW_LINE> transaction.savepoint(optimistic = True) <NEW_LINE> self.setId(newId) <NEW_LINE> <DEDENT> security.declarePrivate('at_post_edit_script') <NEW_LINE> def at_post_edit_script(self): <NEW_LINE> <INDENT> self._renameAfterCreation() <NEW_LINE> <DEDENT> security.declarePublic('Title') <NEW_LINE> def Title(self): <NEW_LINE> <INDENT> return self.getName() <NEW_LINE> <DEDENT> security.declarePublic('canSetConstrainTypes') <NEW_LINE> def getName(self): <NEW_LINE> <INDENT> personal_names = self.getPersonalNames() <NEW_LINE> family_name = self.getFamilyName() <NEW_LINE> if self.getNameOrder(): <NEW_LINE> <INDENT> return family_name + ' ' + personal_names <NEW_LINE> <DEDENT> return personal_names + ' ' + family_name | An author of a piece of research | 62599076283ffb24f3cf5209
class SquareShoppingPage(SuperPage): <NEW_LINE> <INDENT> def __init__(self, testcase, driver, logger): <NEW_LINE> <INDENT> super(SquareShoppingPage, self).__init__(testcase, driver, logger) <NEW_LINE> <DEDENT> def validSelf(self): <NEW_LINE> <INDENT> API().assertElementByResourceId(self.testcase, self.driver, self.logger, SSPC.resource_id_iv_find_iv, 10) <NEW_LINE> <DEDENT> def clickOnSubCommodity(self): <NEW_LINE> <INDENT> tempText = API().getTextByXpath(self.testcase, self.driver, self.logger, SSPC.xpath_sub_commodity_button, 10) <NEW_LINE> API().clickElementByXpath(self.testcase, self.driver, self.logger, SSPC.xpath_sub_commodity_button, SSPC.click_on_button_timeout) <NEW_LINE> return tempText | Author: 宋波
Home => Square => 爱购物 (Love Shopping) | 62599076f548e778e596ceee
class Notification(QWidget): <NEW_LINE> <INDENT> def __init__(self, parent=None): <NEW_LINE> <INDENT> super(Notification, self).__init__(None, Qt.ToolTip) <NEW_LINE> self._parent = parent <NEW_LINE> self._duration = 3000 <NEW_LINE> self.setAttribute(Qt.WA_TranslucentBackground, True) <NEW_LINE> self.setAttribute(Qt.WA_TransparentForMouseEvents) <NEW_LINE> self.setAttribute(Qt.WA_ShowWithoutActivating) <NEW_LINE> self.setFixedHeight(30) <NEW_LINE> view = QQuickWidget() <NEW_LINE> view.setClearColor(Qt.transparent) <NEW_LINE> view.setResizeMode(QQuickWidget.SizeRootObjectToView) <NEW_LINE> view.setSource(ui_tools.get_qml_resource("Notification.qml")) <NEW_LINE> self._root = view.rootObject() <NEW_LINE> vbox = QVBoxLayout(self) <NEW_LINE> vbox.setContentsMargins(0, 0, 0, 0) <NEW_LINE> vbox.setSpacing(0) <NEW_LINE> vbox.addWidget(view) <NEW_LINE> self._root.close.connect(self.close) <NEW_LINE> <DEDENT> def showEvent(self, event): <NEW_LINE> <INDENT> super(Notification, self).showEvent(event) <NEW_LINE> width, pgeo = self._parent.width(), self._parent.geometry() <NEW_LINE> conditional_vertical = settings.NOTIFICATION_POSITION in (0, 1) <NEW_LINE> conditional_horizontal = settings.NOTIFICATION_POSITION in (0, 2) <NEW_LINE> x = pgeo.left() if conditional_horizontal else pgeo.right() <NEW_LINE> y = (pgeo.bottom() - self.height() if conditional_vertical else pgeo.top()) <NEW_LINE> self.setFixedWidth(width) <NEW_LINE> self.setGeometry(x, y, self.width(), self.height()) <NEW_LINE> background_color = str(settings.NOTIFICATION_COLOR) <NEW_LINE> foreground_color = str( settings.NOTIFICATION_COLOR).lower().maketrans( '0123456789abcdef', 'fedcba9876543210') <NEW_LINE> foreground_color = background_color.translate(foreground_color) <NEW_LINE> self._root.setColor(background_color, foreground_color) <NEW_LINE> self._root.start(self._duration) <NEW_LINE> <DEDENT> def set_message(self, text='', duration=3000): <NEW_LINE> <INDENT> self._root.setText(text) <NEW_LINE> self._duration = duration | Notification class with the Logic for the QML UI | 6259907697e22403b383c861 |
class TokenCreateError(TokenError): <NEW_LINE> <INDENT> pass | Error raised when creating a token
http://docs.pay.jp/docs/token-create | 62599076d268445f2663a80d |
class Packager(DummyPackager): <NEW_LINE> <INDENT> def __init__(self, basepath: Path, exepath: Path, machine: str, version: str): <NEW_LINE> <INDENT> self.paths = { "base": basepath, "packaging": basepath / "builder" / "packaging", "build": basepath / "build" / "packaging", "dist": basepath / "dist" / "packaging", "exe": exepath, } <NEW_LINE> self.machine = machine <NEW_LINE> self.version = version <NEW_LINE> self.downloader: Optional[ThreadedDownloader] = None <NEW_LINE> <DEDENT> def package(self): <NEW_LINE> <INDENT> print("Making portable binary ZIP") <NEW_LINE> zipname = ( f"kithare-{self.version}-{platform.system().lower()}-{self.machine}.zip" ) <NEW_LINE> portable_zip = self.paths["dist"] / zipname <NEW_LINE> portable_zip.parent.mkdir(exist_ok=True) <NEW_LINE> with ZipFile(portable_zip, mode="w") as myzip: <NEW_LINE> <INDENT> for dfile in self.paths["exe"].parent.rglob("*"): <NEW_LINE> <INDENT> zipped_file = Path("Kithare") / dfile.relative_to( self.paths["exe"].parent ) <NEW_LINE> myzip.write(dfile, arcname=zipped_file) <NEW_LINE> <DEDENT> <DEDENT> print(f"Finished making zipfile in '{portable_zip}'\n") | Packager is a base class for all other platform-specific Packager classes,
that help package Kithare. | 62599076097d151d1a2c29d3
class Similarity_Calcs: <NEW_LINE> <INDENT> def __init__(self, dataframe): <NEW_LINE> <INDENT> self.dataframe = dataframe <NEW_LINE> self.series = series_by_team(goals_pruner(dataframe)) <NEW_LINE> self.individ = np.array([]) <NEW_LINE> self.team = np.array([]) | Object's attributes represent data structures generated from applying
similarity calculations across the team survey. | 625990761b99ca40022901e5
class Timeout(MessageQueueError): <NEW_LINE> <INDENT> pass | Exception for timeouts on send, recv operations. | 625990763346ee7daa33830f
class ObjectStatusDefinition(msrest.serialization.Model): <NEW_LINE> <INDENT> _attribute_map = { 'name': {'key': 'name', 'type': 'str'}, 'namespace': {'key': 'namespace', 'type': 'str'}, 'kind': {'key': 'kind', 'type': 'str'}, 'compliance_state': {'key': 'complianceState', 'type': 'str'}, 'applied_by': {'key': 'appliedBy', 'type': 'ObjectReferenceDefinition'}, 'status_conditions': {'key': 'statusConditions', 'type': '[ObjectStatusConditionDefinition]'}, 'helm_release_properties': {'key': 'helmReleaseProperties', 'type': 'HelmReleasePropertiesDefinition'}, } <NEW_LINE> def __init__( self, **kwargs ): <NEW_LINE> <INDENT> super(ObjectStatusDefinition, self).__init__(**kwargs) <NEW_LINE> self.name = kwargs.get('name', None) <NEW_LINE> self.namespace = kwargs.get('namespace', None) <NEW_LINE> self.kind = kwargs.get('kind', None) <NEW_LINE> self.compliance_state = kwargs.get('compliance_state', "Unknown") <NEW_LINE> self.applied_by = kwargs.get('applied_by', None) <NEW_LINE> self.status_conditions = kwargs.get('status_conditions', None) <NEW_LINE> self.helm_release_properties = kwargs.get('helm_release_properties', None) | Statuses of objects deployed by the user-specified kustomizations from the git repository.
:param name: Name of the applied object.
:type name: str
:param namespace: Namespace of the applied object.
:type namespace: str
:param kind: Kind of the applied object.
:type kind: str
:param compliance_state: Compliance state of the applied object showing whether the applied
object has come into a ready state on the cluster. Possible values include: "Compliant",
"Non-Compliant", "Pending", "Suspended", "Unknown". Default value: "Unknown".
:type compliance_state: str or
~azure.mgmt.kubernetesconfiguration.v2022_01_01_preview.models.FluxComplianceState
:param applied_by: Object reference to the Kustomization that applied this object.
:type applied_by:
~azure.mgmt.kubernetesconfiguration.v2022_01_01_preview.models.ObjectReferenceDefinition
:param status_conditions: List of Kubernetes object status conditions present on the cluster.
:type status_conditions:
list[~azure.mgmt.kubernetesconfiguration.v2022_01_01_preview.models.ObjectStatusConditionDefinition]
:param helm_release_properties: Additional properties that are provided from objects of the
HelmRelease kind.
:type helm_release_properties:
~azure.mgmt.kubernetesconfiguration.v2022_01_01_preview.models.HelmReleasePropertiesDefinition | 625990762c8b7c6e89bd5148 |