code: string (lengths 4 to 4.48k)
docstring: string (lengths 1 to 6.45k)
_id: string (length 24)
class DepthFirstCrawler(Crawler): <NEW_LINE> <INDENT> def _crawl(self, nodes): <NEW_LINE> <INDENT> node_list = nodes <NEW_LINE> cur_node = node_list[-1] <NEW_LINE> while cur_node and cur_node.depth < self.max_depth: <NEW_LINE> <INDENT> new_node = None <NEW_LINE> while not new_node and len(cur_node.links) > 0: <NEW_LINE> <INDENT> link = random.choice(cur_node.links) <NEW_LINE> cur_node.links.remove(link) <NEW_LINE> new_node = PageNode.make_pagenode(self.id_gen, link, cur_node, self.end_phrase) <NEW_LINE> <DEDENT> if not new_node: <NEW_LINE> <INDENT> cur_node = node_list[cur_node.parent] <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> yield new_node <NEW_LINE> node_list.append(new_node) <NEW_LINE> cur_node = new_node <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> def _get_unfinished_nodes(self): <NEW_LINE> <INDENT> logging.warning("Retrieving unfinished DFS job") <NEW_LINE> nodes = JobModel.get_from_key(self.job_key).get_results() <NEW_LINE> for node in nodes: <NEW_LINE> <INDENT> node.end_phrase = self.end_phrase <NEW_LINE> <DEDENT> return nodes
Implements a depth-first crawl strategy. This will randomly select a link on each page to follow to the next level. Terminates after reaching the desired depth or when the termination phrase is encountered.
62599062627d3e7fe0e08562
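The backtracking walk in `_crawl` can be sketched on a plain adjacency dict. `depth_first_walk`, the graph shape, and the seeded RNG are illustrative stand-ins for `PageNode` and the crawler machinery, not the original API:

```python
import random

def depth_first_walk(graph, start, max_depth, rng=None):
    """Randomly walk `graph` (node -> list of links), backtracking to the
    parent when a node's links are exhausted, up to `max_depth` levels."""
    rng = rng or random.Random(0)
    # Each entry: (name, depth, parent index in node_list, remaining links)
    node_list = [(start, 0, None, list(graph.get(start, [])))]
    cur = 0
    visited = [start]
    while node_list[cur][1] < max_depth:
        name, depth, parent, links = node_list[cur]
        if links:
            link = rng.choice(links)
            links.remove(link)  # never revisit the same link from this node
            node_list.append((link, depth + 1, cur, list(graph.get(link, []))))
            visited.append(link)
            cur = len(node_list) - 1
        elif parent is None:
            break  # root exhausted: nothing left to explore
        else:
            cur = parent  # dead end: back up one level and retry
    return visited
```

As in `_crawl`, a followed link is removed from the current node's link list before descending, so backtracking can never loop over the same edge twice.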
class ACLAction(Base): <NEW_LINE> <INDENT> def __init__(self, **kwargs): <NEW_LINE> <INDENT> super(ACLAction, self).__init__(ACLActionParam, [ACLActionParam.NAME, ACLActionParam.DESCRIPTION], kwargs) <NEW_LINE> self.grant_rules = [] <NEW_LINE> self.revoke_rules = [] <NEW_LINE> <DEDENT> def grant(self, acl_action_rule): <NEW_LINE> <INDENT> self.grant_rules.append(acl_action_rule) <NEW_LINE> if acl_action_rule in self.revoke_rules: <NEW_LINE> <INDENT> self.revoke_rules.remove(acl_action_rule) <NEW_LINE> <DEDENT> <DEDENT> def revoke(self, acl_action_rule): <NEW_LINE> <INDENT> self.revoke_rules.append(acl_action_rule) <NEW_LINE> if acl_action_rule in self.grant_rules: <NEW_LINE> <INDENT> self.grant_rules.remove(acl_action_rule)
This class represents an ACLAction :param id_unique: ID of the ACLAction :type id_unique: int :param name: Name of the ACLAction :type name: str :param description: Description of the ACLAction :type description: str :param activate: Is the ACLAction enabled? :type activate: bool
625990626e29344779b01d27
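The grant/revoke bookkeeping above keeps the two rule lists mutually exclusive; a self-contained sketch of just that invariant (`RuleSet` is an invented stand-in, without the `Base` superclass):

```python
class RuleSet:
    """Minimal stand-in for ACLAction's grant/revoke bookkeeping:
    a rule lives in at most one of the two lists at a time."""
    def __init__(self):
        self.grant_rules = []
        self.revoke_rules = []

    def grant(self, rule):
        self.grant_rules.append(rule)
        if rule in self.revoke_rules:
            self.revoke_rules.remove(rule)  # granting cancels a pending revoke

    def revoke(self, rule):
        self.revoke_rules.append(rule)
        if rule in self.grant_rules:
            self.grant_rules.remove(rule)  # revoking cancels a pending grant
```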
class BatchNorm2d(_BatchNorm): <NEW_LINE> <INDENT> def _check_input_dim(self, input): <NEW_LINE> <INDENT> if input.dim() != 4: <NEW_LINE> <INDENT> raise ValueError('expected 4D input (got {}D input)' .format(input.dim()))
Applies Batch Normalization over a 4D input (a mini-batch of 2D inputs with additional channel dimension) as described in the paper `Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift`_ . .. math:: y = \frac{x - \mathrm{E}[x]}{ \sqrt{\mathrm{Var}[x] + \epsilon}} * \gamma + \beta The mean and standard-deviation are calculated per-dimension over the mini-batches and :math:`\gamma` and :math:`\beta` are learnable parameter vectors of size `C` (where `C` is the input size). By default, the elements of :math:`\gamma` are sampled from :math:`\mathcal{U}(0, 1)` and the elements of :math:`\beta` are set to 0. Also by default, during training this layer keeps running estimates of its computed mean and variance, which are then used for normalization during evaluation. The running estimates are kept with a default :attr:`momentum` of 0.1. If :attr:`track_running_stats` is set to ``False``, this layer then does not keep running estimates, and batch statistics are instead used during evaluation time as well. .. note:: This :attr:`momentum` argument is different from the one used in optimizer classes and the conventional notion of momentum. Mathematically, the update rule for running statistics here is :math:`\hat{x}_\text{new} = (1 - \text{momentum}) \times \hat{x} + \text{momentum} \times x_t`, where :math:`\hat{x}` is the estimated statistic and :math:`x_t` is the new observed value. Because the Batch Normalization is done over the `C` dimension, computing statistics on `(N, H, W)` slices, it's common terminology to call this Spatial Batch Normalization. Args: num_features: :math:`C` from an expected input of size :math:`(N, C, H, W)` eps: a value added to the denominator for numerical stability. Default: 1e-5 momentum: the value used for the running_mean and running_var computation. Can be set to ``None`` for cumulative moving average (i.e. simple average). 
Default: 0.1 affine: a boolean value that when set to ``True``, this module has learnable affine parameters. Default: ``True`` track_running_stats: a boolean value that when set to ``True``, this module tracks the running mean and variance, and when set to ``False``, this module does not track such statistics and always uses batch statistics in both training and eval modes. Default: ``True`` Shape: - Input: :math:`(N, C, H, W)` - Output: :math:`(N, C, H, W)` (same shape as input) Examples:: >>> # With Learnable Parameters >>> m = nn.BatchNorm2d(100) >>> # Without Learnable Parameters >>> m = nn.BatchNorm2d(100, affine=False) >>> input = torch.randn(20, 100, 35, 45) >>> output = m(input) .. _`Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift`: https://arxiv.org/abs/1502.03167
625990624e4d562566373ade
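The normalization formula and running-stats update quoted in the docstring can be checked numerically with a minimal sketch (single channel, plain Python, not PyTorch's implementation; function names are ours):

```python
import math

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize a flat list of samples for one channel:
    y = (x - E[x]) / sqrt(Var[x] + eps) * gamma + beta  (biased variance)."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    return [(v - mean) / math.sqrt(var + eps) * gamma + beta for v in x]

def update_running(stat, observed, momentum=0.1):
    """Running-estimate update described in the docstring:
    x_hat_new = (1 - momentum) * x_hat + momentum * x_t."""
    return (1 - momentum) * stat + momentum * observed
```

With `gamma=1, beta=0` the normalized outputs have (approximately) zero mean and unit variance, which is what the `y = (x - E[x]) / sqrt(Var[x] + eps)` formula guarantees up to the `eps` term.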
class HelpersDirectoryTests(unittest.TestCase): <NEW_LINE> <INDENT> @unittest.skip('Not yet implemented') <NEW_LINE> def test_make_dirs(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> @unittest.skip('Not yet implemented') <NEW_LINE> def test_delete_empty_folders(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> @unittest.skip('Not yet implemented') <NEW_LINE> def test_make_dir(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> @unittest.skip('Not yet implemented') <NEW_LINE> def test_get_temp_dir(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> @unittest.skip('Not yet implemented') <NEW_LINE> def test_is_hidden_folder(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> @unittest.skip('Not yet implemented') <NEW_LINE> def test_real_path(self): <NEW_LINE> <INDENT> pass
Test directory methods
62599062be8e80087fbc075e
class cutfastqC(): <NEW_LINE> <INDENT> def __init__(self, *fastq, length=0, size=0, outprefix=''): <NEW_LINE> <INDENT> self._file = fastq <NEW_LINE> self._fastq = [] <NEW_LINE> self._length = float(length) <NEW_LINE> self._size = self._normalized(size) <NEW_LINE> self._out = outprefix <NEW_LINE> self._fs = '/annoroad/data1/bioinfo/PROJECT/RD/Medical/Leukemia/chenyl/bin/fastq-sample' <NEW_LINE> self._gzip = '/bin/gzip' <NEW_LINE> self._gunzip = '/bin/gunzip' <NEW_LINE> self._tmp = [] <NEW_LINE> <DEDENT> def _normalized(self, size): <NEW_LINE> <INDENT> return float(size) * 1000000000 <NEW_LINE> <DEDENT> @property <NEW_LINE> def _readcount(self): <NEW_LINE> <INDENT> count = int(self._size / (self._length*len(self._file))) <NEW_LINE> return count <NEW_LINE> <DEDENT> def _Popen(self,cmd): <NEW_LINE> <INDENT> p = subprocess.Popen(cmd,shell=True,stdout=subprocess.PIPE, stderr=subprocess.PIPE) <NEW_LINE> return p <NEW_LINE> <DEDENT> def _getcmd(self): <NEW_LINE> <INDENT> cmd = '{0} -n {1} -s 422 -o {2}'.format(self._fs,self._readcount,self._out) <NEW_LINE> for i in self._fastq: <NEW_LINE> <INDENT> cmd = '{0} {1}'.format(cmd,i) <NEW_LINE> <DEDENT> return cmd <NEW_LINE> <DEDENT> def _gunzipf(self,file): <NEW_LINE> <INDENT> tmp = '{0}.{1}.fastq'.format(self._out,np.random.randint(1000)) <NEW_LINE> self._tmp.append(tmp) <NEW_LINE> self._fastq.append(tmp) <NEW_LINE> cmd = '{0} -c {1} > {2}'.format(self._gunzip,file,tmp) <NEW_LINE> return cmd <NEW_LINE> <DEDENT> def _gzipf(self,file): <NEW_LINE> <INDENT> cmd = '{0} {1}'.format(self._gzip,file) <NEW_LINE> return cmd <NEW_LINE> <DEDENT> def _read(self): <NEW_LINE> <INDENT> pSet =[] <NEW_LINE> for file in self._file: <NEW_LINE> <INDENT> if '.gz' in os.path.basename(file): <NEW_LINE> <INDENT> cmd =self._gunzipf(file) <NEW_LINE> pSet.append(self._Popen(cmd)) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> self._fastq.append(file) <NEW_LINE> <DEDENT> <DEDENT> while len(pSet) > 0: <NEW_LINE> <INDENT> p = pSet.pop() <NEW_LINE> 
print(p.communicate()) <NEW_LINE> <DEDENT> <DEDENT> def _write(self): <NEW_LINE> <INDENT> pSet = [] <NEW_LINE> if len(self._fastq) == 1: <NEW_LINE> <INDENT> cmd = self._gzipf('{}.fastq'.format(self._out)) <NEW_LINE> pSet.append(self._Popen(cmd)) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> for i in range(1,len(self._fastq)+1): <NEW_LINE> <INDENT> cmd = self._gzipf('{0}.{1}.fastq'.format(self._out,i)) <NEW_LINE> pSet.append(self._Popen(cmd)) <NEW_LINE> <DEDENT> <DEDENT> while len(pSet) > 0: <NEW_LINE> <INDENT> p = pSet.pop() <NEW_LINE> print(p.communicate()) <NEW_LINE> <DEDENT> print('gzip finished!') <NEW_LINE> <DEDENT> def _remove(self): <NEW_LINE> <INDENT> for file in self._tmp: <NEW_LINE> <INDENT> cmd = 'rm {}'.format(file) <NEW_LINE> print(self._Popen(cmd).communicate()) <NEW_LINE> <DEDENT> print('Remove tmp files') <NEW_LINE> <DEDENT> def run(self): <NEW_LINE> <INDENT> self._read() <NEW_LINE> p = self._Popen(self._getcmd()) <NEW_LINE> print(p.communicate()) <NEW_LINE> self._remove() <NEW_LINE> self._write()
Trim fastq data (uses the C program fastq-tool).
625990623d592f4c4edbc5b4
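The read-count arithmetic in `_normalized` and `_readcount` reduces to one formula: the requested size in gigabases divided by read length times the number of files. A sketch (the function name is ours):

```python
def reads_per_file(size_gb, read_length, num_files):
    """Number of reads fastq-sample should draw so the total output is
    roughly `size_gb` gigabases spread across `num_files` paired files
    (mirrors _normalized and _readcount above)."""
    total_bases = float(size_gb) * 1_000_000_000
    return int(total_bases / (float(read_length) * num_files))
```

For example, 3 Gb of 150 bp paired-end data (2 files) works out to 10 million reads per file.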
@place(DEVICES) <NEW_LINE> @parameterize((TEST_CASE_NAME, 'x', 'n', 'axis', 'norm'), [ ('test_x_complex128', (np.random.randn(4, 4, 4) + 1j * np.random.randn(4, 4, 4) ).astype(np.complex128), None, None, "backward"), ('test_n_grater_than_input_length', np.random.randn(4, 4, 4) + 1j * np.random.randn(4, 4, 4), [4], None, "backward"), ('test_n_smaller_than_input_length', np.random.randn(4, 4, 4) + 1j * np.random.randn(4, 4, 4), [2], None, "backward"), ('test_axis_not_last', np.random.randn(4, 4, 4) + 1j * np.random.randn(4, 4, 4), None, None, "backward"), ('test_norm_forward', np.random.randn(4, 4, 4) + 1j * np.random.randn(4, 4, 4), None, None, "forward"), ('test_norm_ortho', np.random.randn(4, 4, 4) + 1j * np.random.randn(4, 4, 4), None, None, "ortho"), ]) <NEW_LINE> class Testhfftn(unittest.TestCase): <NEW_LINE> <INDENT> def test_static_hfftn(self): <NEW_LINE> <INDENT> with stgraph(paddle.fft.hfftn, self.place, self.x, self.n, self.axis, self.norm) as y: <NEW_LINE> <INDENT> np.testing.assert_allclose( scipy.fft.hfftn(self.x, self.n, self.axis, self.norm), y, rtol=1e-5, atol=0)
Test hfftn with various norm conditions.
6259906276e4537e8c3f0c63
class BaseType: <NEW_LINE> <INDENT> valid_values = None <NEW_LINE> special = False <NEW_LINE> def __init__(self, none_ok=False): <NEW_LINE> <INDENT> self.none_ok = none_ok <NEW_LINE> <DEDENT> def _basic_validation(self, value): <NEW_LINE> <INDENT> if not value: <NEW_LINE> <INDENT> if self.none_ok: <NEW_LINE> <INDENT> return <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> raise configexc.ValidationError(value, "may not be empty!") <NEW_LINE> <DEDENT> <DEDENT> if any(ord(c) < 32 or ord(c) == 0x7f for c in value): <NEW_LINE> <INDENT> raise configexc.ValidationError(value, "may not contain " "unprintable chars!") <NEW_LINE> <DEDENT> <DEDENT> def transform(self, value): <NEW_LINE> <INDENT> if not value: <NEW_LINE> <INDENT> return None <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> return value <NEW_LINE> <DEDENT> <DEDENT> def validate(self, value): <NEW_LINE> <INDENT> self._basic_validation(value) <NEW_LINE> if not value: <NEW_LINE> <INDENT> return <NEW_LINE> <DEDENT> if self.valid_values is not None: <NEW_LINE> <INDENT> if value not in self.valid_values: <NEW_LINE> <INDENT> raise configexc.ValidationError( value, "valid values: {}".format(', '.join( self.valid_values))) <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> raise NotImplementedError("{} does not implement validate.".format( self.__class__.__name__)) <NEW_LINE> <DEDENT> <DEDENT> def complete(self): <NEW_LINE> <INDENT> if self.valid_values is None: <NEW_LINE> <INDENT> return None <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> out = [] <NEW_LINE> for val in self.valid_values: <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> desc = self.valid_values.descriptions[val] <NEW_LINE> <DEDENT> except KeyError: <NEW_LINE> <INDENT> desc = "" <NEW_LINE> <DEDENT> out.append((val, desc)) <NEW_LINE> <DEDENT> return out
A type used for a setting value. Attributes: none_ok: Whether to convert to None for an empty string. Class attributes: valid_values: Possible values if they can be expressed as a fixed string. ValidValues instance. special: If set, the type is only used for one option and isn't mentioned in the config file.
625990624e4d562566373adf
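A concrete subclass of the validator above only needs to set `valid_values`; a trimmed-down, dependency-free sketch, with a plain `ValueError` standing in for `configexc.ValidationError` and `ColorScheme` as an invented example type:

```python
class BaseType:
    """Trimmed-down sketch of the config type above."""
    valid_values = None

    def __init__(self, none_ok=False):
        self.none_ok = none_ok

    def validate(self, value):
        if not value:
            if self.none_ok:
                return  # empty value explicitly allowed
            raise ValueError("may not be empty!")
        if self.valid_values is None:
            raise NotImplementedError(
                "{} does not implement validate.".format(type(self).__name__))
        if value not in self.valid_values:
            raise ValueError(
                "valid values: {}".format(", ".join(self.valid_values)))

class ColorScheme(BaseType):
    valid_values = ("light", "dark")
```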
class AddColumns(Operation): <NEW_LINE> <INDENT> __metaclass__ = ABCMeta <NEW_LINE> def __init__(self, join_kind="left"): <NEW_LINE> <INDENT> self.join_kind = join_kind <NEW_LINE> <DEDENT> @abstractmethod <NEW_LINE> def build_output(self, story_data): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def transform(self, story_data): <NEW_LINE> <INDENT> output = self.build_output(story_data) <NEW_LINE> return pd.merge(left=story_data, right=output, left_index=True, right_index=True, how=self.join_kind)
A very typical case of an operation that appends (i.e. joins) columns to the previous result.
625990627d847024c075daad
class TestLinearCreateStopOrderResultBase(unittest.TestCase): <NEW_LINE> <INDENT> def setUp(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def tearDown(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def testLinearCreateStopOrderResultBase(self): <NEW_LINE> <INDENT> pass
LinearCreateStopOrderResultBase unit test stubs
6259906232920d7e50bc771f
class OperationTypeValueValuesEnum(_messages.Enum): <NEW_LINE> <INDENT> TYPE_UNSPECIFIED = 0 <NEW_LINE> CREATE_CLUSTER = 1 <NEW_LINE> DELETE_CLUSTER = 2 <NEW_LINE> UPGRADE_MASTER = 3 <NEW_LINE> UPGRADE_NODES = 4 <NEW_LINE> REPAIR_CLUSTER = 5 <NEW_LINE> UPDATE_CLUSTER = 6 <NEW_LINE> CREATE_NODE_POOL = 7 <NEW_LINE> DELETE_NODE_POOL = 8 <NEW_LINE> SET_NODE_POOL_MANAGEMENT = 9
The operation type. Values: TYPE_UNSPECIFIED: Not set. CREATE_CLUSTER: Cluster create. DELETE_CLUSTER: Cluster delete. UPGRADE_MASTER: A master upgrade. UPGRADE_NODES: A node upgrade. REPAIR_CLUSTER: Cluster repair. UPDATE_CLUSTER: Cluster update. CREATE_NODE_POOL: Node pool create. DELETE_NODE_POOL: Node pool delete. SET_NODE_POOL_MANAGEMENT: Set node pool management.
62599062d268445f2663a6c9
class ThomsonDeviceScanner(object): <NEW_LINE> <INDENT> def __init__(self, config): <NEW_LINE> <INDENT> self.host = config[CONF_HOST] <NEW_LINE> self.username = config[CONF_USERNAME] <NEW_LINE> self.password = config[CONF_PASSWORD] <NEW_LINE> self.lock = threading.Lock() <NEW_LINE> self.last_results = {} <NEW_LINE> data = self.get_thomson_data() <NEW_LINE> self.success_init = data is not None <NEW_LINE> <DEDENT> def scan_devices(self): <NEW_LINE> <INDENT> self._update_info() <NEW_LINE> return [client['mac'] for client in self.last_results] <NEW_LINE> <DEDENT> def get_device_name(self, device): <NEW_LINE> <INDENT> if not self.last_results: <NEW_LINE> <INDENT> return None <NEW_LINE> <DEDENT> for client in self.last_results: <NEW_LINE> <INDENT> if client['mac'] == device: <NEW_LINE> <INDENT> return client['host'] <NEW_LINE> <DEDENT> <DEDENT> return None <NEW_LINE> <DEDENT> @Throttle(MIN_TIME_BETWEEN_SCANS) <NEW_LINE> def _update_info(self): <NEW_LINE> <INDENT> if not self.success_init: <NEW_LINE> <INDENT> return False <NEW_LINE> <DEDENT> with self.lock: <NEW_LINE> <INDENT> _LOGGER.info("Checking ARP") <NEW_LINE> data = self.get_thomson_data() <NEW_LINE> if not data: <NEW_LINE> <INDENT> return False <NEW_LINE> <DEDENT> active_clients = [client for client in data.values() if client['status'].find('C') != -1] <NEW_LINE> self.last_results = active_clients <NEW_LINE> return True <NEW_LINE> <DEDENT> <DEDENT> def get_thomson_data(self): <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> telnet = telnetlib.Telnet(self.host) <NEW_LINE> telnet.read_until(b'Username : ') <NEW_LINE> telnet.write((self.username + '\r\n').encode('ascii')) <NEW_LINE> telnet.read_until(b'Password : ') <NEW_LINE> telnet.write((self.password + '\r\n').encode('ascii')) <NEW_LINE> telnet.read_until(b'=>') <NEW_LINE> telnet.write(('hostmgr list\r\n').encode('ascii')) <NEW_LINE> devices_result = telnet.read_until(b'=>').split(b'\r\n') <NEW_LINE> telnet.write('exit\r\n'.encode('ascii')) <NEW_LINE> <DEDENT> except 
EOFError: <NEW_LINE> <INDENT> _LOGGER.exception("Unexpected response from router") <NEW_LINE> return <NEW_LINE> <DEDENT> except ConnectionRefusedError: <NEW_LINE> <INDENT> _LOGGER.exception("Connection refused by router," + " is telnet enabled?") <NEW_LINE> return <NEW_LINE> <DEDENT> devices = {} <NEW_LINE> for device in devices_result: <NEW_LINE> <INDENT> match = _DEVICES_REGEX.search(device.decode('utf-8')) <NEW_LINE> if match: <NEW_LINE> <INDENT> devices[match.group('ip')] = { 'ip': match.group('ip'), 'mac': match.group('mac').upper(), 'host': match.group('host'), 'status': match.group('status') } <NEW_LINE> <DEDENT> <DEDENT> return devices
This class queries a router running THOMSON firmware for connected devices. Adapted from ASUSWRT scanner.
6259906245492302aabfdbb3
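The module's `_DEVICES_REGEX` is defined outside this snippet, so the pattern below is a hypothetical stand-in for one hostmgr-style table row; the `'C'`-in-status filter mirrors what `_update_info` does with `client['status'].find('C')`:

```python
import re

# Hypothetical pattern: the real _DEVICES_REGEX lives elsewhere in the
# module, and the hostmgr column layout is assumed here.
DEVICES_REGEX = re.compile(
    r'(?P<mac>(?:[0-9a-fA-F]{2}:){5}[0-9a-fA-F]{2})\s+'
    r'(?P<ip>\d{1,3}(?:\.\d{1,3}){3})\s+'
    r'(?P<host>\S+)\s+'
    r'(?P<status>\S+)')

def parse_hostmgr(lines):
    """Turn decoded hostmgr output lines into the ip-keyed dict the
    scanner builds at the bottom of get_thomson_data."""
    devices = {}
    for line in lines:
        match = DEVICES_REGEX.search(line)
        if match:
            devices[match.group('ip')] = {
                'ip': match.group('ip'),
                'mac': match.group('mac').upper(),
                'host': match.group('host'),
                'status': match.group('status'),
            }
    return devices

def active_clients(devices):
    # _update_info keeps only clients whose status contains 'C' (connected)
    return [c for c in devices.values() if 'C' in c['status']]
```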
class InstallNodeDependenciesCommand(Command): <NEW_LINE> <INDENT> description = 'Install the node packages required for building static media.' <NEW_LINE> user_options = [ (str('use-npm-cache'), None, 'Use npm-cache to install packages'), ] <NEW_LINE> boolean_options = [str('use-npm-cache')] <NEW_LINE> def initialize_options(self): <NEW_LINE> <INDENT> self.use_npm_cache = None <NEW_LINE> <DEDENT> def finalize_options(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def run(self): <NEW_LINE> <INDENT> if self.use_npm_cache: <NEW_LINE> <INDENT> npm_command = 'npm-cache' <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> npm_command = 'npm' <NEW_LINE> <DEDENT> try: <NEW_LINE> <INDENT> check_run([npm_command, '--version']) <NEW_LINE> <DEDENT> except (subprocess.CalledProcessError, OSError): <NEW_LINE> <INDENT> raise RuntimeError( 'Unable to locate %s in the path, which is needed to ' 'install dependencies required to build this package.' % npm_command) <NEW_LINE> <DEDENT> self.run_command('list_node_deps') <NEW_LINE> print('Installing node.js modules...') <NEW_LINE> result = os.system('%s install' % npm_command) <NEW_LINE> os.unlink('package.json') <NEW_LINE> if result != 0: <NEW_LINE> <INDENT> raise RuntimeError( 'One or more node.js modules could not be installed.')
Installs all node.js dependencies from npm. If ``--use-npm-cache`` is passed, this will use :command:`npm-cache` to install the packages, which is best for Continuous Integration setups. Otherwise, :command:`npm` is used.
625990624a966d76dd5f05cd
class ControlPanel(wx.Panel): <NEW_LINE> <INDENT> def __init__(self, parent, vsProc): <NEW_LINE> <INDENT> wx.Panel.__init__(self, parent, -1, style=wx.WANTS_CHARS) <NEW_LINE> self.parent = parent <NEW_LINE> self.vsProc = vsProc <NEW_LINE> gbs = wx.GridBagSizer(3, 4) <NEW_LINE> label = wx.StaticText(self, -1, _("Source file:")) <NEW_LINE> gbs.Add(label, (0,0)) <NEW_LINE> id = wx.NewId() <NEW_LINE> self.teSource = wx.TextCtrl(self, id, "", size=(200,-1), style=wx.TE_PROCESS_ENTER) <NEW_LINE> wx.EVT_TEXT_ENTER(self.teSource, id, self.enteredSourceText) <NEW_LINE> gbs.Add(self.teSource, (0,1), flag=wx.EXPAND) <NEW_LINE> self.vsProc.registerEditor(self.teSource, "") <NEW_LINE> sBtn = wx.Button(self, wx.ID_OPEN) <NEW_LINE> wx.EVT_BUTTON(sBtn, wx.ID_OPEN, parent.onFileOpen) <NEW_LINE> gbs.Add(sBtn, (0,2)) <NEW_LINE> label = wx.StaticText(self, -1, _("Destination file:")) <NEW_LINE> gbs.Add(label, (1,0)) <NEW_LINE> id = wx.NewId() <NEW_LINE> self.teDestin = wx.TextCtrl(self, id, "", size=(200,-1), style=wx.TE_PROCESS_ENTER) <NEW_LINE> wx.EVT_TEXT_ENTER(self.teDestin, id, self.enteredDestinText) <NEW_LINE> gbs.Add(self.teDestin, (1,1), flag=wx.EXPAND) <NEW_LINE> self.vsProc.registerEditor(self.teDestin, "") <NEW_LINE> dBtn = wx.Button(self, wx.ID_SAVE) <NEW_LINE> wx.EVT_BUTTON(dBtn, wx.ID_SAVE, parent.onDestChooser) <NEW_LINE> gbs.Add(dBtn, (1,2)) <NEW_LINE> id = wx.NewId() <NEW_LINE> self.options = ["HTML", "LaTeX", "XML"] <NEW_LINE> self.rbFormat = wx.RadioBox(self, id, _("Destination format"), wx.DefaultPosition, wx.DefaultSize, self.options, 1, style=wx.RA_SPECIFY_COLS) <NEW_LINE> wx.EVT_RADIOBOX(self.rbFormat, id, self.rbSelection) <NEW_LINE> gbs.Add(self.rbFormat, (0,3), (3,1)) <NEW_LINE> id = wx.NewId() <NEW_LINE> procBtn = wx.Button(self, id, _("Process")) <NEW_LINE> wx.EVT_BUTTON(procBtn, id, parent.goProcess) <NEW_LINE> vsProc.registerAction(procBtn) <NEW_LINE> gbs.Add(procBtn, (2,4)) <NEW_LINE> self.SetSizer(gbs) <NEW_LINE> return <NEW_LINE> <DEDENT> def 
enteredSourceText(self, event): <NEW_LINE> <INDENT> self.parent.onFileOpen(None, self.teSource.GetValue()) <NEW_LINE> return <NEW_LINE> <DEDENT> def enteredDestinText(self, event): <NEW_LINE> <INDENT> self.parent.onDestChooser(None, self.teDestin.GetValue()) <NEW_LINE> return <NEW_LINE> <DEDENT> def rbSelection(self, event): <NEW_LINE> <INDENT> chosen = event.GetSelection() <NEW_LINE> format = self.options[chosen] <NEW_LINE> filename = self.teDestin.GetValue() <NEW_LINE> for ext in [x[1] for x in FORMATS.values()]: <NEW_LINE> <INDENT> if filename.endswith(ext): <NEW_LINE> <INDENT> filename = filename[:-len(ext)] <NEW_LINE> <DEDENT> <DEDENT> filename += FORMATS[format][1] <NEW_LINE> self.teDestin.SetValue(filename) <NEW_LINE> self.enteredDestinText(None) <NEW_LINE> return
Panel with the file selection grid.
625990623eb6a72ae038bd38
class ErrorController(BaseController): <NEW_LINE> <INDENT> def document(self): <NEW_LINE> <INDENT> page = error_document_template % dict(prefix=request.environ.get('SCRIPT_NAME', ''), code=request.params.get('code', ''), message=request.params.get('message', '')) <NEW_LINE> return render('not_found.html') <NEW_LINE> <DEDENT> def img(self, id): <NEW_LINE> <INDENT> return self._serve_file(os.path.join(media_path, 'img', id)) <NEW_LINE> <DEDENT> def style(self, id): <NEW_LINE> <INDENT> return self._serve_file(os.path.join(media_path, 'style', id)) <NEW_LINE> <DEDENT> def _serve_file(self, path): <NEW_LINE> <INDENT> fapp = paste.fileapp.FileApp(path) <NEW_LINE> return fapp(request.environ, self.start_response)
Generates error documents as and when they are required. The ErrorDocuments middleware forwards to ErrorController when error-related status codes are returned from the application. This behaviour can be altered by changing the parameters to the ErrorDocuments middleware in your config/middleware.py file.
625990624f88993c371f108b
class NewsSourceForm(OptionalValidateUniqueMixin, forms.ModelForm): <NEW_LINE> <INDENT> class Meta: <NEW_LINE> <INDENT> model = NewsSourceModel <NEW_LINE> fields = '__all__'
Model form to validate data from newsapi.org
625990628e71fb1e983bd1a4
class Solution: <NEW_LINE> <INDENT> def is_monotonic(self, a): <NEW_LINE> <INDENT> if len(a) < 2: <NEW_LINE> <INDENT> return True <NEW_LINE> <DEDENT> s = set() <NEW_LINE> n = len(a) <NEW_LINE> for i in range(0, n - 1, 1): <NEW_LINE> <INDENT> if a[i + 1] - a[i] == 0: <NEW_LINE> <INDENT> continue <NEW_LINE> <DEDENT> elif a[i + 1] - a[i] > 0: <NEW_LINE> <INDENT> s.add(True) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> s.add(False) <NEW_LINE> <DEDENT> if len(s) > 1: <NEW_LINE> <INDENT> return False <NEW_LINE> <DEDENT> <DEDENT> return True
Iterate over all elements in the array. Time complexity: O(n), a single pass over the array. Space complexity: O(1), since the set of observed directions holds at most two elements.
62599062a17c0f6771d5d711
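The same single-pass check, written standalone: the set holds at most two direction booleans, which is where the O(1) space bound comes from.

```python
def is_monotonic(a):
    """Return True if `a` is entirely non-increasing or non-decreasing."""
    directions = set()
    for prev, cur in zip(a, a[1:]):
        if cur != prev:
            directions.add(cur > prev)  # True = rising, False = falling
            if len(directions) > 1:
                return False  # saw both an increase and a decrease
    return True
```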
class _DictMixed(IValidator): <NEW_LINE> <INDENT> cleaned_data = None <NEW_LINE> errors = None <NEW_LINE> error_msg = 'No validator for {}' <NEW_LINE> error_msg_required = 'Field {} required' <NEW_LINE> def __init__(self, validators, policy='error', required=False): <NEW_LINE> <INDENT> self.validators = validators <NEW_LINE> self.required = required <NEW_LINE> if policy not in ('error', 'except', 'ignore', 'drop'): <NEW_LINE> <INDENT> raise KeyError( 'Bad policy value.' 'Allowed "error", "except", "ignore" or "drop".') <NEW_LINE> <DEDENT> self.policy = policy <NEW_LINE> <DEDENT> def __call__(self, data, **kwargs): <NEW_LINE> <INDENT> self.data_dict = data <NEW_LINE> self.kwargs = kwargs <NEW_LINE> return self <NEW_LINE> <DEDENT> def is_valid(self): <NEW_LINE> <INDENT> self.cleaned_data = {} <NEW_LINE> if self.required: <NEW_LINE> <INDENT> for field in self.validators: <NEW_LINE> <INDENT> if field not in self.data_dict: <NEW_LINE> <INDENT> self.errors = {'__all__': [self.error_msg_required.format(field)]} <NEW_LINE> return False <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> for key, data in self.data_dict.items(): <NEW_LINE> <INDENT> if key in self.validators: <NEW_LINE> <INDENT> validator = self.validators[key](data=data, **self.kwargs) <NEW_LINE> <DEDENT> elif self.policy == 'error': <NEW_LINE> <INDENT> self.errors = {'__all__': [self.error_msg.format(key)]} <NEW_LINE> return False <NEW_LINE> <DEDENT> elif self.policy == 'except': <NEW_LINE> <INDENT> raise KeyError(self.error_msg.format(key)) <NEW_LINE> <DEDENT> elif self.policy == 'ignore': <NEW_LINE> <INDENT> self.cleaned_data[key] = data <NEW_LINE> continue <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> continue <NEW_LINE> <DEDENT> if validator.is_valid(): <NEW_LINE> <INDENT> self.cleaned_data[key] = validator.cleaned_data <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> self.cleaned_data = {} <NEW_LINE> self.errors = validator.errors <NEW_LINE> return False <NEW_LINE> <DEDENT> <DEDENT> return True
Validate dict values with key-specific validators. :param dict validators: mapping of keys to the validators applied to the corresponding values of the dict. :param str policy: policy when no validator is found for a key: "error" - add an error to the `errors` attr and return False. "except" - raise a KeyError exception. "ignore" - add the source value to cleaned_data. "drop" - drop this value and continue.
62599062097d151d1a2c2749
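The four policies can be illustrated with a compact function in which plain callables stand in for the validator classes (a sketch of the behaviour, not the original API):

```python
def validate_dict(data, validators, policy='error'):
    """Sketch of _DictMixed's policy handling. `validators` maps keys to
    callables returning (ok, cleaned_value); unknown keys follow `policy`."""
    if policy not in ('error', 'except', 'ignore', 'drop'):
        raise KeyError('Bad policy value.')
    cleaned = {}
    for key, value in data.items():
        if key in validators:
            ok, cleaned_value = validators[key](value)
            if not ok:
                return None, {key: 'invalid'}
            cleaned[key] = cleaned_value
        elif policy == 'error':
            return None, {'__all__': ['No validator for {}'.format(key)]}
        elif policy == 'except':
            raise KeyError('No validator for {}'.format(key))
        elif policy == 'ignore':
            cleaned[key] = value  # keep the raw value unvalidated
        # 'drop': skip the key entirely
    return cleaned, None
```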
class Subscription(graphene.ObjectType): <NEW_LINE> <INDENT> new_message = graphene.Field(MessageType, channel_id=graphene.ID()) <NEW_LINE> notifications = graphene.Field(RoomType, user_id=graphene.Int()) <NEW_LINE> on_focus = graphene.Boolean(room_id=graphene.ID()) <NEW_LINE> has_unreaded_messages = graphene.Boolean(user_id=graphene.ID()) <NEW_LINE> online_users = graphene.List(UserType) <NEW_LINE> async def resolve_new_message(root, info, channel_id): <NEW_LINE> <INDENT> channel_name = await channel_layer.new_channel() <NEW_LINE> await channel_layer.group_add("new_message_" + str(channel_id), channel_name) <NEW_LINE> try: <NEW_LINE> <INDENT> while True: <NEW_LINE> <INDENT> message = await channel_layer.receive(channel_name) <NEW_LINE> yield message["data"] <NEW_LINE> <DEDENT> <DEDENT> finally: <NEW_LINE> <INDENT> await channel_layer.group_discard("new_message_" + str(channel_id), channel_name) <NEW_LINE> <DEDENT> <DEDENT> async def resolve_notifications(root, info, user_id): <NEW_LINE> <INDENT> channel_name = await channel_layer.new_channel() <NEW_LINE> await channel_layer.group_add("notify_" + str(user_id), channel_name) <NEW_LINE> try: <NEW_LINE> <INDENT> while True: <NEW_LINE> <INDENT> room = await channel_layer.receive(channel_name) <NEW_LINE> yield room["data"] <NEW_LINE> <DEDENT> <DEDENT> finally: <NEW_LINE> <INDENT> await channel_layer.group_discard("notify_" + str(user_id), channel_name) <NEW_LINE> <DEDENT> <DEDENT> async def resolve_on_focus(root, info, room_id): <NEW_LINE> <INDENT> channel_name = await channel_layer.new_channel() <NEW_LINE> await channel_layer.group_add("focused_" + str(room_id) + '_' + str(info.context['user'].id), channel_name) <NEW_LINE> try: <NEW_LINE> <INDENT> while True: <NEW_LINE> <INDENT> focused = await channel_layer.receive(channel_name) <NEW_LINE> yield focused['data'] <NEW_LINE> <DEDENT> <DEDENT> finally: <NEW_LINE> <INDENT> await channel_layer.group_discard("focused_" + str(room_id) + '_' + str(info.context['user'].id), 
channel_name) <NEW_LINE> <DEDENT> <DEDENT> async def resolve_has_unreaded_messages(root, info, user_id): <NEW_LINE> <INDENT> channel_name = await channel_layer.new_channel() <NEW_LINE> await channel_layer.group_add("has_unreaded_messages_" + str(user_id), channel_name) <NEW_LINE> try: <NEW_LINE> <INDENT> while True: <NEW_LINE> <INDENT> data = await channel_layer.receive(channel_name) <NEW_LINE> yield data['data'] <NEW_LINE> <DEDENT> <DEDENT> finally: <NEW_LINE> <INDENT> await channel_layer.group_discard("has_unreaded_messages_" + str(user_id), channel_name) <NEW_LINE> <DEDENT> <DEDENT> async def resolve_online_users(root, info): <NEW_LINE> <INDENT> channel_name = await channel_layer.new_channel() <NEW_LINE> await channel_layer.group_add("users", channel_name) <NEW_LINE> try: <NEW_LINE> <INDENT> while True: <NEW_LINE> <INDENT> online_users = await channel_layer.receive(channel_name) <NEW_LINE> users = get_user_model().objects.only('id', 'email', 'full_name', 'online') <NEW_LINE> yield users <NEW_LINE> <DEDENT> <DEDENT> finally: <NEW_LINE> <INDENT> await channel_layer.group_discard("users", channel_name)
All Subscriptions
62599062d6c5a102081e37fe
class AmbientSpace(ambient_space.AmbientSpace): <NEW_LINE> <INDENT> @lazy_attribute <NEW_LINE> def _dual_space(self): <NEW_LINE> <INDENT> K = self.base_ring() <NEW_LINE> return self.cartan_type().dual().root_system().ambient_space(K) <NEW_LINE> <DEDENT> def dimension(self): <NEW_LINE> <INDENT> return self.root_system.dual.ambient_space().dimension() <NEW_LINE> <DEDENT> @cached_method <NEW_LINE> def simple_root(self, i): <NEW_LINE> <INDENT> dual_coroot = self._dual_space.simple_coroot(i) <NEW_LINE> return self.sum_of_terms(dual_coroot) <NEW_LINE> <DEDENT> @cached_method <NEW_LINE> def fundamental_weights(self): <NEW_LINE> <INDENT> return self.fundamental_weights_from_simple_roots() <NEW_LINE> <DEDENT> @lazy_attribute <NEW_LINE> def _plot_projection(self): <NEW_LINE> <INDENT> dual_space = self.cartan_type().dual().root_system().ambient_space(self.base_ring()) <NEW_LINE> if dual_space._plot_projection == dual_space._plot_projection_barycentric: <NEW_LINE> <INDENT> return self._plot_projection_barycentric <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> RootLatticeRealizations.ParentMethods.__dict__["_plot_projection"]
Ambient space for a dual finite Cartan type. It is constructed in the canonical way from the ambient space of the original Cartan type by switching the roles of simple roots, fundamental weights, etc. .. NOTE:: Recall that, for any finite Cartan type, and in particular a simply laced one, the dual Cartan type is constructed as another preexisting Cartan type. Furthermore the ambient space for an affine type is constructed from the ambient space for its classical type. Thus this code is not actually currently used. It is kept for cross-checking and for reference in case it could become useful, e.g., for duals of general Kac-Moody types. For the doctests, we need to explicitly create a dual type. Subsequently, since reconstruction of the dual of type `F_4` is the relabelled Cartan type, pickling fails on the ``TestSuite`` run. EXAMPLES:: sage: ct = sage.combinat.root_system.type_dual.CartanType(CartanType(['F',4])) sage: L = ct.root_system().ambient_space(); L Ambient space of the Root system of type ['F', 4]^* sage: TestSuite(L).run(skip=["_test_elements","_test_pickling"])
62599062ac7a0e7691f73bbe
class VoxelGeoVolumeFile(GeoprobeVolumeFileV2): <NEW_LINE> <INDENT> _magic_number = 43970
Appears to be identical to a geoprobe volume, but uses a different magic number. It's possible (and likely?) that there are other differences, but I don't have access to VoxelGeo to test and see.
62599062435de62698e9d4e1
class EtlHeadGenerator(object): <NEW_LINE> <INDENT> def __init__(self, source_key): <NEW_LINE> <INDENT> self.source_key = source_key <NEW_LINE> self.next_id = 0 <NEW_LINE> <DEDENT> def next( self, source_etl, **kwargs ): <NEW_LINE> <INDENT> num = self.next_id <NEW_LINE> self.next_id = num + 1 <NEW_LINE> dest_key = self.source_key + "." + text(num) <NEW_LINE> dest_etl = set_default( { "id": num, "source": source_etl, "type": "join", "revision": git.get_revision(), "timestamp": Date.now().unix }, kwargs ) <NEW_LINE> return dest_key, dest_etl
Will return a unique ETL structure, given a source and a destination name
62599062e5267d203ee6cf2c
class FileInit(): <NEW_LINE> <INDENT> @staticmethod <NEW_LINE> def client_init_files(): <NEW_LINE> <INDENT> open("current_request.txt", "w") <NEW_LINE> <DEDENT> @staticmethod <NEW_LINE> def executor_init_files(): <NEW_LINE> <INDENT> FileInit.make_dir("Requests_To_Execute/") <NEW_LINE> <DEDENT> @staticmethod <NEW_LINE> def make_dir(directory): <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> os.makedirs(directory) <NEW_LINE> <DEDENT> except: <NEW_LINE> <INDENT> pass
The class initializes the files the system is going to use on this client.
62599062a8370b77170f1aa8
class SessionPasswordNeeded(Unauthorized): <NEW_LINE> <INDENT> ID = "SESSION_PASSWORD_NEEDED" <NEW_LINE> MESSAGE = __doc__
Two-step verification password required
625990624428ac0f6e659c0d
@dataclass <NEW_LINE> class OffsetData: <NEW_LINE> <INDENT> value: typing.List[float] <NEW_LINE> last_modified: typing.Optional[datetime]
Class to categorize the shape of a given calibration data.
62599062dd821e528d6da4ee
class EditarDependencia(UpdateView): <NEW_LINE> <INDENT> model = Dependencia <NEW_LINE> template_name = 'configuracion/dependencia/modificar.html' <NEW_LINE> context_object_name = "editar_dependencia" <NEW_LINE> success_url = reverse_lazy("listar_dependencia") <NEW_LINE> def get_context_data(self, **kwargs): <NEW_LINE> <INDENT> context = super(EditarDependencia, self).get_context_data(**kwargs) <NEW_LINE> list_institucion = Institucion.objects.all() <NEW_LINE> context['list_institucion'] = list_institucion <NEW_LINE> return context
Class-based view: (`Editar`) :param template_name: path to the template :param model: Model this view refers to :param success_url: name of the route the application is redirected to once the record has been edited successfully :param context_object_name: name of the object held by this view
625990623617ad0b5ee07829
@dataclasses.dataclass <NEW_LINE> class SteelMaterial(): <NEW_LINE> <INDENT> name: str <NEW_LINE> E: unyt.unyt_quantity <NEW_LINE> Fy: unyt.unyt_quantity <NEW_LINE> Fu: unyt.unyt_quantity <NEW_LINE> Ry: float <NEW_LINE> Rt: float <NEW_LINE> def __post_init__(self): <NEW_LINE> <INDENT> get_stress = units.UnitInputParser(default_units='ksi') <NEW_LINE> get_factor = units.UnitInputParser(default_units='', convert=True) <NEW_LINE> self.E = get_stress(self.E) <NEW_LINE> self.Fy = get_stress(self.Fy) <NEW_LINE> self.Fu = get_stress(self.Fu) <NEW_LINE> self.Ry = get_factor(self.Ry).item() <NEW_LINE> self.Rt = get_factor(self.Rt).item() <NEW_LINE> if self.Fy > self.Fu: <NEW_LINE> <INDENT> raise SteelError('SteelMaterial: yield strength must' ' be less than tensile strength') <NEW_LINE> <DEDENT> <DEDENT> @property <NEW_LINE> def eFy(self): <NEW_LINE> <INDENT> return self.Fy*self.Ry <NEW_LINE> <DEDENT> @property <NEW_LINE> def eFu(self): <NEW_LINE> <INDENT> return self.Fu*self.Rt <NEW_LINE> <DEDENT> @classmethod <NEW_LINE> def from_name(cls, name: str, grade: str = None, application: str = None): <NEW_LINE> <INDENT> name, grade = _check_deprecated_material(name, grade) <NEW_LINE> def normalize(input): <NEW_LINE> <INDENT> if pd.isna(input): <NEW_LINE> <INDENT> return slice(None) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> return str(input).casefold() <NEW_LINE> <DEDENT> <DEDENT> material = cls._get_materials_db().loc[normalize(name), normalize(grade), normalize(application)] <NEW_LINE> if isinstance(material, pd.DataFrame): <NEW_LINE> <INDENT> if len(material) != 1: <NEW_LINE> <INDENT> raise ValueError('Multiple materials found: specify grade ' 'and/or application to narrow search') <NEW_LINE> <DEDENT> material = material.iloc[0] <NEW_LINE> <DEDENT> name, grade, application = material.name <NEW_LINE> display_grade = '' if pd.isna(grade) else f' Gr. 
{grade}' <NEW_LINE> display_name = f'{name}{display_grade} ({application})' <NEW_LINE> return cls(display_name.title(), **material) <NEW_LINE> <DEDENT> @classmethod <NEW_LINE> def available_materials(cls): <NEW_LINE> <INDENT> return cls._get_materials_db().index.to_frame(index=False) <NEW_LINE> <DEDENT> @classmethod <NEW_LINE> def _get_materials_db(cls) -> pd.DataFrame: <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> return cls._materials_db <NEW_LINE> <DEDENT> except AttributeError: <NEW_LINE> <INDENT> cls._materials_db = _load_materials_db('steel-materials-us.csv') <NEW_LINE> return cls._materials_db
A steel material. Parameters ---------- name : str Name of the material. E : float, unyt.unyt_array Elastic modulus. If units are not specified, assumed to be ksi. Fy : float, unyt.unyt_array Design yield strength. If units are not specified, assumed to be ksi. Fu : float, unyt.unyt_array Design tensile strength. If units are not specified, assumed to be ksi. Ry : float Expected yield strength factor. Dimensionless. Rt : float Expected tensile strength factor. Dimensionless.
625990627b25080760ed884e
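The expected-strength properties in the `SteelMaterial` row above (`eFy = Fy*Ry`, `eFu = Fu*Rt`) can be sketched without the `unyt` units machinery or the CSV material database; this simplified dataclass uses plain floats in ksi and keeps only the yield-vs-tensile sanity check:

```python
from dataclasses import dataclass

@dataclass
class SimpleSteel:
    """Stripped-down steel material: stresses as plain floats in ksi."""
    Fy: float  # nominal yield strength
    Fu: float  # nominal tensile strength
    Ry: float  # expected yield strength factor
    Rt: float  # expected tensile strength factor

    def __post_init__(self):
        # Mirror the SteelMaterial sanity check: yield must not exceed tensile.
        if self.Fy > self.Fu:
            raise ValueError("yield strength must be less than tensile strength")

    @property
    def eFy(self) -> float:
        """Expected yield strength."""
        return self.Fy * self.Ry

    @property
    def eFu(self) -> float:
        """Expected tensile strength."""
        return self.Fu * self.Rt

# Values resemble ASTM A992 but are illustrative only.
a992 = SimpleSteel(Fy=50.0, Fu=65.0, Ry=1.1, Rt=1.1)
```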
class InvalidTwitterEndpointException(TwitterApplianceException): <NEW_LINE> <INDENT> def __init__(self, endpoint, msg): <NEW_LINE> <INDENT> self.endpoint = endpoint <NEW_LINE> self.msg = msg
Exception raised by looking for invalid resources Attributes: endpoint -- the endpoint looked for msg -- explanation of the error
62599062d486a94d0ba2d6a3
class Event(generics.GenericModel, Generic[IDT, PayloadT]): <NEW_LINE> <INDENT> id: IDT <NEW_LINE> delivery_id: UUID <NEW_LINE> hook_id: int <NEW_LINE> payload: PayloadT
Represents an abstract Github webhook event. All known GitHub webhook events should derive from this class and specify the event 'name' that they should be parsed from. Event models encapsulate information that comes in both the webhook request payload and HTTP headers.
625990623539df3088ecd978
class ResourceItemAdmin(admin.ModelAdmin): <NEW_LINE> <INDENT> list_display = ('title', 'file', 'description',) <NEW_LINE> prepopulated_fields = {'slug':('title',)}
Admin View for ResourceItem
6259906256b00c62f0fb3fa6
class EmptyLayerException(Exception): <NEW_LINE> <INDENT> pass
Raised when the user tries to add a layer with fewer than one neurode
6259906216aa5153ce401bb7
@global_preferences_registry.register <NEW_LINE> class LdapNonOrionADNodesReportSubscription(StringPreference): <NEW_LINE> <INDENT> section = LDAP_PROBE <NEW_LINE> name = 'ldap_non_orion_ad_nodes_subscription' <NEW_LINE> default = 'LDAP: non Orion AD nodes' <NEW_LINE> required = True <NEW_LINE> verbose_name = _( 'Email Subscription for non Orion AD Nodes Reports').title()
Dynamic preferences class controlling the name of the :class:`Email subscription <p_soc_auto_base.models.Subscription>` used for dispatching `LDAP` reports about `AD` nodes not defined on the `Orion` server :access_key: 'ldapprobe__ldap_non_orion_ad_nodes_subscription'
62599062462c4b4f79dbd0e1
class GaussianMP2Test(GenericMP2Test): <NEW_LINE> <INDENT> def testnocoeffs(self): <NEW_LINE> <INDENT> self.assertEquals(self.data.nocoeffs.shape, (self.data.nmo, self.data.nbasis)) <NEW_LINE> <DEDENT> def testnooccnos(self): <NEW_LINE> <INDENT> self.assertEquals(self.data.nooccnos.shape, (self.data.nmo, ))
Customized MP2 unittest
62599062627d3e7fe0e08566
class IperfSessionBuilder(BaseToolBuilder): <NEW_LINE> <INDENT> def __init__(self, *args, **kwargs): <NEW_LINE> <INDENT> super(IperfSessionBuilder, self).__init__(*args, **kwargs) <NEW_LINE> self._test = None <NEW_LINE> self._directions = None <NEW_LINE> self._filename = None <NEW_LINE> return <NEW_LINE> <DEDENT> @property <NEW_LINE> def filename(self): <NEW_LINE> <INDENT> if self._filename is None: <NEW_LINE> <INDENT> self._filename = self.config_map.get(ConfigOptions.test_section, ConfigOptions.output_folder_option, default=None, optional=True) <NEW_LINE> <DEDENT> return self._filename <NEW_LINE> <DEDENT> @property <NEW_LINE> def test(self): <NEW_LINE> <INDENT> if self._test is None: <NEW_LINE> <INDENT> self._test = IperfTestBuilder(self.config_map, self.master.events).test <NEW_LINE> <DEDENT> return self._test <NEW_LINE> <DEDENT> @property <NEW_LINE> def directions(self): <NEW_LINE> <INDENT> if self._directions is None: <NEW_LINE> <INDENT> self._directions = self.config_map.get_list(ConfigOptions.iperf_section, ConfigOptions.directions_option) <NEW_LINE> for direction in self._directions: <NEW_LINE> <INDENT> if self.product.to_node_expression.search(direction): <NEW_LINE> <INDENT> continue <NEW_LINE> <DEDENT> if self.product.from_node_expression.search(direction): <NEW_LINE> <INDENT> continue <NEW_LINE> <DEDENT> raise IperfSessionBuilderError("Unknown Direction: {0}".format(direction)) <NEW_LINE> <DEDENT> <DEDENT> return self._directions <NEW_LINE> <DEDENT> @property <NEW_LINE> def product(self): <NEW_LINE> <INDENT> if self._product is None: <NEW_LINE> <INDENT> self._product = IperfSession(iperf_test=self.test, nodes=self.master.nodes, tpc=self.master.tpc_device, filename_base=self.filename) <NEW_LINE> <DEDENT> return self._product <NEW_LINE> <DEDENT> @property <NEW_LINE> def parameters(self): <NEW_LINE> <INDENT> if self._parameters is None: <NEW_LINE> <INDENT> if not any([p.name == BuilderParameterEnums.nodes for p in self.previous_parameters]): <NEW_LINE> 
<INDENT> self.previous_parameters.append(Parameters(name=BuilderParameterEnums.nodes, parameters=self.master.nodes.keys())) <NEW_LINE> <DEDENT> if not any([p.name == BuilderParameterEnums.iperf_directions for p in self.previous_parameters]): <NEW_LINE> <INDENT> self.previous_parameters.append(Parameters(name=BuilderParameterEnums.iperf_directions, parameters=self.directions)) <NEW_LINE> <DEDENT> self._parameters = self.previous_parameters <NEW_LINE> <DEDENT> return self._parameters
A class to build an iperf session
62599062be8e80087fbc0762
class Cola(object): <NEW_LINE> <INDENT> def __init__(self): <NEW_LINE> <INDENT> self.frente, self.final, self.tamanio = None, None, 0
Queue ADT (TDA cola)
62599062435de62698e9d4e3
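The `Cola` (queue) row above shows only the constructor with `frente`/`final`/`tamanio` fields; a hypothetical completion with linked-list enqueue/dequeue operations might look like the sketch below (the node class and the `encolar`/`desencolar` method names are assumptions, not from the source):

```python
class _Nodo:
    """Singly linked node holding one queue element."""
    def __init__(self, dato):
        self.dato = dato
        self.siguiente = None

class Cola:
    """Queue ADT backed by a linked list with front/rear pointers."""
    def __init__(self):
        self.frente, self.final, self.tamanio = None, None, 0

    def encolar(self, dato):
        nodo = _Nodo(dato)
        if self.final is None:      # empty queue: node becomes the front too
            self.frente = nodo
        else:
            self.final.siguiente = nodo
        self.final = nodo
        self.tamanio += 1

    def desencolar(self):
        if self.frente is None:
            raise IndexError("desencolar on empty Cola")
        dato = self.frente.dato
        self.frente = self.frente.siguiente
        if self.frente is None:     # queue became empty
            self.final = None
        self.tamanio -= 1
        return dato

c = Cola()
for x in (1, 2, 3):
    c.encolar(x)
first = c.desencolar()  # FIFO order: the first element enqueued comes out first
```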
class ForeignKeysResponse: <NEW_LINE> <INDENT> thrift_spec = ( None, (1, TType.LIST, 'foreignKeys', (TType.STRUCT,(SQLForeignKey, SQLForeignKey.thrift_spec)), None, ), ) <NEW_LINE> def __init__(self, foreignKeys=None,): <NEW_LINE> <INDENT> self.foreignKeys = foreignKeys <NEW_LINE> <DEDENT> def read(self, iprot): <NEW_LINE> <INDENT> if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None: <NEW_LINE> <INDENT> fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec)) <NEW_LINE> return <NEW_LINE> <DEDENT> iprot.readStructBegin() <NEW_LINE> while True: <NEW_LINE> <INDENT> (fname, ftype, fid) = iprot.readFieldBegin() <NEW_LINE> if ftype == TType.STOP: <NEW_LINE> <INDENT> break <NEW_LINE> <DEDENT> if fid == 1: <NEW_LINE> <INDENT> if ftype == TType.LIST: <NEW_LINE> <INDENT> self.foreignKeys = [] <NEW_LINE> (_etype300, _size297) = iprot.readListBegin() <NEW_LINE> for _i301 in xrange(_size297): <NEW_LINE> <INDENT> _elem302 = SQLForeignKey() <NEW_LINE> _elem302.read(iprot) <NEW_LINE> self.foreignKeys.append(_elem302) <NEW_LINE> <DEDENT> iprot.readListEnd() <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> iprot.skip(ftype) <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> iprot.skip(ftype) <NEW_LINE> <DEDENT> iprot.readFieldEnd() <NEW_LINE> <DEDENT> iprot.readStructEnd() <NEW_LINE> <DEDENT> def write(self, oprot): <NEW_LINE> <INDENT> if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None: <NEW_LINE> <INDENT> oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec))) <NEW_LINE> return <NEW_LINE> <DEDENT> oprot.writeStructBegin('ForeignKeysResponse') <NEW_LINE> if self.foreignKeys is not None: <NEW_LINE> <INDENT> oprot.writeFieldBegin('foreignKeys', TType.LIST, 1) <NEW_LINE> oprot.writeListBegin(TType.STRUCT, len(self.foreignKeys)) 
<NEW_LINE> for iter303 in self.foreignKeys: <NEW_LINE> <INDENT> iter303.write(oprot) <NEW_LINE> <DEDENT> oprot.writeListEnd() <NEW_LINE> oprot.writeFieldEnd() <NEW_LINE> <DEDENT> oprot.writeFieldStop() <NEW_LINE> oprot.writeStructEnd() <NEW_LINE> <DEDENT> def validate(self): <NEW_LINE> <INDENT> if self.foreignKeys is None: <NEW_LINE> <INDENT> raise TProtocol.TProtocolException(message='Required field foreignKeys is unset!') <NEW_LINE> <DEDENT> return <NEW_LINE> <DEDENT> def __hash__(self): <NEW_LINE> <INDENT> value = 17 <NEW_LINE> value = (value * 31) ^ hash(self.foreignKeys) <NEW_LINE> return value <NEW_LINE> <DEDENT> def __repr__(self): <NEW_LINE> <INDENT> L = ['%s=%r' % (key, value) for key, value in self.__dict__.iteritems()] <NEW_LINE> return '%s(%s)' % (self.__class__.__name__, ', '.join(L)) <NEW_LINE> <DEDENT> def __eq__(self, other): <NEW_LINE> <INDENT> return isinstance(other, self.__class__) and self.__dict__ == other.__dict__ <NEW_LINE> <DEDENT> def __ne__(self, other): <NEW_LINE> <INDENT> return not (self == other)
Attributes: - foreignKeys
6259906263d6d428bbee3df6
class Section(object): <NEW_LINE> <INDENT> subsections_number = 0 <NEW_LINE> @classmethod <NEW_LINE> def reset(cls): <NEW_LINE> <INDENT> cls.subsections_number = 0 <NEW_LINE> Subsection.reset() <NEW_LINE> return <NEW_LINE> <DEDENT> def __init__(self, number, title=None): <NEW_LINE> <INDENT> self.number = number <NEW_LINE> self.title = title <NEW_LINE> self.subsections = [] <NEW_LINE> self.toc = OrderedDict() <NEW_LINE> return <NEW_LINE> <DEDENT> def __str__(self): <NEW_LINE> <INDENT> strings = [str(self.title)] <NEW_LINE> for subsection in self.subsections: <NEW_LINE> <INDENT> strings.append(" " + str(subsection)) <NEW_LINE> <DEDENT> return '\n'.join(strings) <NEW_LINE> <DEDENT> def update_toc(self): <NEW_LINE> <INDENT> self.toc[self.subsections[-1].title] = self.subsections[-1].toc <NEW_LINE> <DEDENT> def add_subsection(self, subsection): <NEW_LINE> <INDENT> Section.subsections_number += 1 <NEW_LINE> self.subsections.append(subsection) <NEW_LINE> self.update_toc() <NEW_LINE> return <NEW_LINE> <DEDENT> def put_html_attributes(self, doc): <NEW_LINE> <INDENT> doc.attr(('sectionnumber', str(self.number))) <NEW_LINE> if self.title is not None: <NEW_LINE> <INDENT> doc.attr(('sectiontitle', str(self.title))) <NEW_LINE> <DEDENT> return
Section object. Attributes ---------- subsections_number: int
62599062009cb60464d02c13
class CAdvCopy(object): <NEW_LINE> <INDENT> fieldlist = {'xxx': 999} <NEW_LINE> template = "" <NEW_LINE> def __init__(self,fieldlist,template): <NEW_LINE> <INDENT> self.fieldlist.clear() <NEW_LINE> self.template = template <NEW_LINE> for field in fieldlist: <NEW_LINE> <INDENT> u = uuid.uuid4() <NEW_LINE> suuid = str(u.hex) <NEW_LINE> self.fieldlist[field] = suuid <NEW_LINE> <DEDENT> <DEDENT> def _get_uuid(self,field_name): <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> return self.fieldlist[field_name] <NEW_LINE> <DEDENT> except IndexError: <NEW_LINE> <INDENT> return "" <NEW_LINE> <DEDENT> <DEDENT> def _get_name(self,uuid_str): <NEW_LINE> <INDENT> for key,value in self.fieldlist: <NEW_LINE> <INDENT> if value == uuid_str: <NEW_LINE> <INDENT> return key <NEW_LINE> <DEDENT> <DEDENT> return "" <NEW_LINE> <DEDENT> def process(self,record): <NEW_LINE> <INDENT> result = ""; <NEW_LINE> result = self.template <NEW_LINE> for key,value in self.fieldlist.items(): <NEW_LINE> <INDENT> result = result.replace("{{"+key+"}}",self._get_uuid(key)) <NEW_LINE> <DEDENT> for (key,value) in list(record.items()): <NEW_LINE> <INDENT> if key in self.fieldlist: <NEW_LINE> <INDENT> suuid_tag = self._get_uuid(key) <NEW_LINE> result = result.replace(suuid_tag,value) <NEW_LINE> <DEDENT> <DEDENT> return result
Generates a universally unique ID. args make it more random, if used.
6259906276e4537e8c3f0c65
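The two-pass idea in `CAdvCopy` above — replace `{{field}}` tags with per-field UUID placeholders, then replace the placeholders with record values — can be sketched without the class machinery. This is a simplified reimplementation for illustration, not the original API:

```python
import uuid

def render(template, fields, record):
    """Two-pass template fill: field tags -> UUID placeholders -> values.

    The intermediate UUID pass means a record value that happens to
    contain a literal "{{name}}" tag is never expanded a second time.
    """
    placeholders = {f: uuid.uuid4().hex for f in fields}
    out = template
    # Pass 1: swap every {{field}} tag for its unique placeholder.
    for field, ph in placeholders.items():
        out = out.replace("{{" + field + "}}", ph)
    # Pass 2: swap each placeholder for the record's value (empty if missing).
    for field, ph in placeholders.items():
        out = out.replace(ph, record.get(field, ""))
    return out

result = render("Hello {{name}}, id={{id}}", ["name", "id"],
                {"name": "Ada", "id": "42"})
```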
class Commands: <NEW_LINE> <INDENT> def __init__(self): <NEW_LINE> <INDENT> self.handlers = {} <NEW_LINE> <DEDENT> def add(self, name, auth_required=True, list_command=True, **validators): <NEW_LINE> <INDENT> def wrapper(func): <NEW_LINE> <INDENT> if name in self.handlers: <NEW_LINE> <INDENT> raise ValueError(f"{name} already registered") <NEW_LINE> <DEDENT> spec = inspect.getfullargspec(func) <NEW_LINE> defaults = dict( zip(spec.args[-len(spec.defaults or []) :], spec.defaults or []) ) <NEW_LINE> if not spec.args and not spec.varargs: <NEW_LINE> <INDENT> raise TypeError("Handler must accept at least one argument.") <NEW_LINE> <DEDENT> if len(spec.args) > 1 and spec.varargs: <NEW_LINE> <INDENT> raise TypeError( "*args may not be combined with regular arguments" ) <NEW_LINE> <DEDENT> if not set(validators.keys()).issubset(spec.args): <NEW_LINE> <INDENT> raise TypeError("Validator for non-existent arg passed") <NEW_LINE> <DEDENT> if spec.varkw or spec.kwonlyargs: <NEW_LINE> <INDENT> raise TypeError("Keyword arguments are not permitted") <NEW_LINE> <DEDENT> def validate(*args, **kwargs): <NEW_LINE> <INDENT> if spec.varargs: <NEW_LINE> <INDENT> return func(*args, **kwargs) <NEW_LINE> <DEDENT> try: <NEW_LINE> <INDENT> ba = inspect.signature(func).bind(*args, **kwargs) <NEW_LINE> ba.apply_defaults() <NEW_LINE> callargs = ba.arguments <NEW_LINE> <DEDENT> except TypeError: <NEW_LINE> <INDENT> raise exceptions.MpdArgError( f'wrong number of arguments for "{name}"' ) <NEW_LINE> <DEDENT> for key, value in callargs.items(): <NEW_LINE> <INDENT> default = defaults.get(key, object()) <NEW_LINE> if key in validators and value != default: <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> callargs[key] = validators[key](value) <NEW_LINE> <DEDENT> except ValueError: <NEW_LINE> <INDENT> raise exceptions.MpdArgError("incorrect arguments") <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> return func(**callargs) <NEW_LINE> <DEDENT> validate.auth_required = auth_required <NEW_LINE> validate.list_command 
= list_command <NEW_LINE> self.handlers[name] = validate <NEW_LINE> return func <NEW_LINE> <DEDENT> return wrapper <NEW_LINE> <DEDENT> def call(self, tokens, context=None): <NEW_LINE> <INDENT> if not tokens: <NEW_LINE> <INDENT> raise exceptions.MpdNoCommand() <NEW_LINE> <DEDENT> if tokens[0] not in self.handlers: <NEW_LINE> <INDENT> raise exceptions.MpdUnknownCommand(command=tokens[0]) <NEW_LINE> <DEDENT> return self.handlers[tokens[0]](context, *tokens[1:])
Collection of MPD commands to expose to users. Normally used through the global instance which command handlers have been installed into.
62599062f7d966606f749427
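A stripped-down version of the registration-plus-validation pattern used by `Commands` above: a decorator binds the call arguments against the handler's signature and coerces them with per-argument validators. The MPD-specific error types are replaced with plain `ValueError`s, so this illustrates the pattern rather than the original behavior:

```python
import inspect

class Registry:
    """Map command names to handlers, coercing args with per-arg validators."""
    def __init__(self):
        self.handlers = {}

    def add(self, name, **validators):
        def wrapper(func):
            def validate(*args):
                ba = inspect.signature(func).bind(*args)  # raises on bad arity
                ba.apply_defaults()
                for key, value in ba.arguments.items():
                    if key in validators:
                        ba.arguments[key] = validators[key](value)
                return func(*ba.args)
            self.handlers[name] = validate
            return func
        return wrapper

    def call(self, tokens):
        if not tokens:
            raise ValueError("no command")
        if tokens[0] not in self.handlers:
            raise ValueError(f"unknown command {tokens[0]!r}")
        return self.handlers[tokens[0]](*tokens[1:])

registry = Registry()

@registry.add("repeat", times=int)   # "times" is coerced from str to int
def repeat(word, times=1):
    return word * times

result = registry.call(["repeat", "ab", "3"])
```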
class StaticRNNMemoryLink(object): <NEW_LINE> <INDENT> def __init__(self, init, pre_mem, mem=None): <NEW_LINE> <INDENT> self.init = init <NEW_LINE> self.pre_mem = pre_mem <NEW_LINE> self.mem = mem
StaticRNNMemoryLink class. StaticRNNMemoryLink class is used to create a link between two memory cells of a StaticRNN. NOTE: This is an internal data structure of a very low-level API. Please use StaticRNN instead. Args: init(Variable): the initial variable for Memory. pre_mem(Variable): the memory variable in previous time step. mem(Variable): the memory variable in current time step.
62599062379a373c97d9a700
class Combatant(DirectionalGameObject): <NEW_LINE> <INDENT> def __init__(self, x, y, direction, score, hp, shield_active, laser_count, teleport_count, shield_count): <NEW_LINE> <INDENT> super().__init__(x, y, direction) <NEW_LINE> self.score = score <NEW_LINE> self.hp = hp <NEW_LINE> self.shield_active = shield_active <NEW_LINE> self.laser_count = laser_count <NEW_LINE> self.teleport_count = teleport_count <NEW_LINE> self.shield_count = shield_count
A general object representing a player on the gameboard. Attributes: score (int): This Combatant's current score. hp (int): This Combatant's current hp (remaining lives/hit points). shield_active (boolean): True iff this Combatant's shield is currently active. laser_count (int): The number of laser power-ups this Combatant has. teleport_count (int): The number of teleport power-ups this Combatant has. shield_count (int): The number of shield power-ups this Combatant has.
6259906224f1403a9268643c
class Coverage(object): <NEW_LINE> <INDENT> config = Config( cmd_run_test = "`which py.test`", branch=True, parallel=False, omit=[]) <NEW_LINE> def __init__(self, pkgs, config=None): <NEW_LINE> <INDENT> self.config = self.config.make(config) <NEW_LINE> self.pkgs = [] <NEW_LINE> for pkg in pkgs: <NEW_LINE> <INDENT> if isinstance(pkg, PythonPackage): <NEW_LINE> <INDENT> self.pkgs.append(pkg) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> self.pkgs.append(PythonPackage(pkg)) <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> def _action_list(self, modules, test=None): <NEW_LINE> <INDENT> run_options = [] <NEW_LINE> if self.config['branch']: <NEW_LINE> <INDENT> run_options.append('--branch') <NEW_LINE> <DEDENT> if self.config['parallel']: <NEW_LINE> <INDENT> run_options.append('--parallel-mode') <NEW_LINE> <DEDENT> report_options = [] <NEW_LINE> if self.config['omit']: <NEW_LINE> <INDENT> omit_list = ','.join(self.config['omit']) <NEW_LINE> report_options.append('--omit {}'.format(omit_list)) <NEW_LINE> <DEDENT> actions = [sep("coverage run", sep(*run_options), self.config['cmd_run_test'], test)] <NEW_LINE> if self.config['parallel']: <NEW_LINE> <INDENT> actions.append('coverage combine') <NEW_LINE> <DEDENT> actions.append(sep("coverage report --show-missing", sep(*report_options), sep(*modules))) <NEW_LINE> return actions <NEW_LINE> <DEDENT> def all(self, basename='coverage'): <NEW_LINE> <INDENT> all_modules = [] <NEW_LINE> for pkg in self.pkgs: <NEW_LINE> <INDENT> for module in pkg.all_modules(): <NEW_LINE> <INDENT> all_modules.append(module) <NEW_LINE> <DEDENT> <DEDENT> yield { 'basename': basename, 'actions': self._action_list(all_modules), 'verbosity': 2, } <NEW_LINE> <DEDENT> def src(self, basename='coverage_src'): <NEW_LINE> <INDENT> all_modules = [] <NEW_LINE> for pkg in self.pkgs: <NEW_LINE> <INDENT> for module in pkg.src: <NEW_LINE> <INDENT> all_modules.append(module) <NEW_LINE> <DEDENT> <DEDENT> yield { 'basename': basename, 'actions': self._action_list(all_modules), 
'verbosity': 2, } <NEW_LINE> <DEDENT> def by_module(self, basename='coverage_module'): <NEW_LINE> <INDENT> for pkg in self.pkgs: <NEW_LINE> <INDENT> to_strip = len('{}/{}'.format(pkg.test_base, pkg.test_prefix)) <NEW_LINE> tests = glob.glob('{}/{}*.py'.format(pkg.test_base, pkg.test_prefix)) <NEW_LINE> for test in tests: <NEW_LINE> <INDENT> source = pkg.src_base + '/' + test[to_strip:] <NEW_LINE> yield { 'basename': basename, 'name': test, 'actions': self._action_list([source, test], test), 'verbosity': 2, }
generate tasks for coverage.py
625990628e71fb1e983bd1a7
class base_case(object): <NEW_LINE> <INDENT> def handle_file(self, handler, full_path): <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> with open(full_path, 'rb') as reader: <NEW_LINE> <INDENT> content = reader.read() <NEW_LINE> <DEDENT> handler.send_content(content) <NEW_LINE> <DEDENT> except IOError as msg: <NEW_LINE> <INDENT> msg = "'{0}' cannot be read: {1}".format(full_path, msg) <NEW_LINE> handler.handle_error(msg) <NEW_LINE> <DEDENT> <DEDENT> def index_path(self, handler): <NEW_LINE> <INDENT> return os.path.join(handler.full_path, 'index.html') <NEW_LINE> <DEDENT> def test(self, handler): <NEW_LINE> <INDENT> assert False, 'Not implemented.' <NEW_LINE> <DEDENT> def act(self, handler): <NEW_LINE> <INDENT> assert False, 'Not implemented.'
Base class for case handling
6259906229b78933be26ac32
class TransformEvaluatorRegistry(object): <NEW_LINE> <INDENT> def __init__(self, evaluation_context): <NEW_LINE> <INDENT> assert evaluation_context <NEW_LINE> self._evaluation_context = evaluation_context <NEW_LINE> self._evaluators = { io.Read: _BoundedReadEvaluator, io.ReadStringsFromPubSub: _PubSubReadEvaluator, core.Flatten: _FlattenEvaluator, core.ParDo: _ParDoEvaluator, core._GroupByKeyOnly: _GroupByKeyOnlyEvaluator, _StreamingGroupByKeyOnly: _StreamingGroupByKeyOnlyEvaluator, _StreamingGroupAlsoByWindow: _StreamingGroupAlsoByWindowEvaluator, _NativeWrite: _NativeWriteEvaluator, TestStream: _TestStreamEvaluator, ProcessElements: _ProcessElementsEvaluator } <NEW_LINE> self._root_bundle_providers = { core.PTransform: DefaultRootBundleProvider, TestStream: _TestStreamRootBundleProvider, } <NEW_LINE> <DEDENT> def get_evaluator( self, applied_ptransform, input_committed_bundle, side_inputs, scoped_metrics_container): <NEW_LINE> <INDENT> assert applied_ptransform <NEW_LINE> assert bool(applied_ptransform.side_inputs) == bool(side_inputs) <NEW_LINE> for cls in applied_ptransform.transform.__class__.mro(): <NEW_LINE> <INDENT> evaluator = self._evaluators.get(cls) <NEW_LINE> if evaluator: <NEW_LINE> <INDENT> break <NEW_LINE> <DEDENT> <DEDENT> if not evaluator: <NEW_LINE> <INDENT> raise NotImplementedError( 'Execution of [%s] not implemented in runner %s.' 
% ( type(applied_ptransform.transform), self)) <NEW_LINE> <DEDENT> return evaluator(self._evaluation_context, applied_ptransform, input_committed_bundle, side_inputs, scoped_metrics_container) <NEW_LINE> <DEDENT> def get_root_bundle_provider(self, applied_ptransform): <NEW_LINE> <INDENT> provider_cls = None <NEW_LINE> for cls in applied_ptransform.transform.__class__.mro(): <NEW_LINE> <INDENT> provider_cls = self._root_bundle_providers.get(cls) <NEW_LINE> if provider_cls: <NEW_LINE> <INDENT> break <NEW_LINE> <DEDENT> <DEDENT> if not provider_cls: <NEW_LINE> <INDENT> raise NotImplementedError( 'Root provider for [%s] not implemented in runner %s' % ( type(applied_ptransform.transform), self)) <NEW_LINE> <DEDENT> return provider_cls(self._evaluation_context, applied_ptransform) <NEW_LINE> <DEDENT> def should_execute_serially(self, applied_ptransform): <NEW_LINE> <INDENT> return isinstance(applied_ptransform.transform, (core._GroupByKeyOnly, _StreamingGroupByKeyOnly, _StreamingGroupAlsoByWindow, _NativeWrite))
For internal use only; no backwards-compatibility guarantees. Creates instances of TransformEvaluator for the application of a transform.
625990627b25080760ed884f
class _DenseAsppBlock(nn.Module): <NEW_LINE> <INDENT> def __init__(self, input_num, num1, num2, dilation_rate, drop_out, bn_start=True,modulation=True,adaptive_d=True): <NEW_LINE> <INDENT> super(_DenseAsppBlock, self).__init__() <NEW_LINE> self.modulation = modulation <NEW_LINE> self.adaptive_d = adaptive_d <NEW_LINE> self.bn_start = bn_start <NEW_LINE> self.bn1 = bn(input_num, momentum=0.0003) <NEW_LINE> self.relu1 = nn.ReLU(inplace = True) <NEW_LINE> self.conv_1 = nn.Conv2d(in_channels=input_num, out_channels=num1, kernel_size=1) <NEW_LINE> self.bn2 = bn(num1, momentum=0.0003) <NEW_LINE> self.relu2 = nn.ReLU(inplace = True) <NEW_LINE> self.deform_conv = DeformConv2d(num1,num2,3,padding=1,dilation=dilation_rate,modulation=self.modulation,adaptive_d=self.adaptive_d) <NEW_LINE> <DEDENT> def forward(self,input): <NEW_LINE> <INDENT> if self.bn_start == True: <NEW_LINE> <INDENT> input = self.bn1(input) <NEW_LINE> <DEDENT> feature = self.relu1(input) <NEW_LINE> feature = self.conv_1(feature) <NEW_LINE> feature = self.bn2(feature) <NEW_LINE> feature1 = self.deform_conv(feature) <NEW_LINE> return feature1
ConvNet block for building DenseASPP.
6259906201c39578d7f142a3
class CThostFtdcCombinationLegField: <NEW_LINE> <INDENT> def __init__(self,**fields): <NEW_LINE> <INDENT> """Combination instrument ID""" <NEW_LINE> self.CombInstrumentID = None <NEW_LINE> """Leg number""" <NEW_LINE> self.LegID = None <NEW_LINE> """Leg instrument ID""" <NEW_LINE> self.LegInstrumentID = None <NEW_LINE> """Buy/sell direction""" <NEW_LINE> self.Direction = None <NEW_LINE> """Leg multiplier""" <NEW_LINE> self.LegMultiple = None <NEW_LINE> """Derivation level""" <NEW_LINE> self.ImplyLevel = None <NEW_LINE> self.__dict__.update(fields) <NEW_LINE> <DEDENT> def toDict(self): <NEW_LINE> <INDENT> return {k:v for k,v in self.__dict__.items() if v != None}
Single leg of a combination trading contract CombInstrumentID Combination instrument ID char[31] LegID Leg number int LegInstrumentID Leg instrument ID char[31] Direction Buy/sell direction char LegMultiple Leg multiplier int ImplyLevel Derivation level int
625990628e71fb1e983bd1a8
class FunctionKinds(IntEnum): <NEW_LINE> <INDENT> interpreted_function = 1 <NEW_LINE> uninterpreted_function = 2 <NEW_LINE> synth_function = 3 <NEW_LINE> macro_function = 4
Function Kinds. builtin_function: represents a builtin function. macro_function: represents a user defined macro. unknown_function: represents a function to be synthesized for.
6259906256b00c62f0fb3fa8
class SessionAuthentication(Authentication): <NEW_LINE> <INDENT> def is_authenticated(self, request, **kwargs): <NEW_LINE> <INDENT> if request.method in ("GET", "HEAD", "OPTIONS", "TRACE"): <NEW_LINE> <INDENT> return request.user.is_authenticated() <NEW_LINE> <DEDENT> if getattr(request, "_dont_enforce_csrf_checks", False): <NEW_LINE> <INDENT> return request.user.is_authenticated() <NEW_LINE> <DEDENT> csrf_token = _sanitize_token(request.COOKIES.get(settings.CSRF_COOKIE_NAME, "")) <NEW_LINE> if request.is_secure(): <NEW_LINE> <INDENT> referer = request.META.get("HTTP_REFERER") <NEW_LINE> if referer is None: <NEW_LINE> <INDENT> return False <NEW_LINE> <DEDENT> good_referer = "https://%s/" % request.get_host() <NEW_LINE> <DEDENT> request_csrf_token = request.META.get("HTTP_X_CSRFTOKEN", "") <NEW_LINE> if not constant_time_compare(request_csrf_token, csrf_token): <NEW_LINE> <INDENT> return False <NEW_LINE> <DEDENT> return request.user.is_authenticated() <NEW_LINE> <DEDENT> def get_identifier(self, request): <NEW_LINE> <INDENT> return request.user.username
An authentication mechanism that piggy-backs on Django sessions. This is useful when the API is talking to Javascript on the same site. Relies on the user being logged in through the standard Django login setup. Requires a valid CSRF token.
625990628da39b475be048c6
class process: <NEW_LINE> <INDENT> def __init__(self, number, numRefs, S, A, B, C): <NEW_LINE> <INDENT> self.number = int(number) <NEW_LINE> self.numRefs = int(numRefs) <NEW_LINE> self.numRefsLeft = int(numRefs) <NEW_LINE> self.refSize = int(S) <NEW_LINE> self.nextWordRef = (111 * self.number) % self.refSize <NEW_LINE> self.numFaults = 0 <NEW_LINE> self.residTime = 0 <NEW_LINE> self.numEvicts = 0 <NEW_LINE> self.A = int(A) <NEW_LINE> self.B =int(B) <NEW_LINE> self.C = int(C)
implementation of process
62599062097d151d1a2c274d
class DbConnectionRefusedError(DatabaseError): <NEW_LINE> <INDENT> def __init__(self): <NEW_LINE> <INDENT> self.message = 'connection refused. are the credentials and replica set name correct?'
Exception raised when a connection to MongoDB is refused Attributes: message -- generated explanation of the error
625990626e29344779b01d2d
class Boolean(TypeEngine, SchemaType): <NEW_LINE> <INDENT> __visit_name__ = 'boolean' <NEW_LINE> def __init__(self, create_constraint=True, name=None): <NEW_LINE> <INDENT> self.create_constraint = create_constraint <NEW_LINE> self.name = name <NEW_LINE> <DEDENT> def _should_create_constraint(self, compiler): <NEW_LINE> <INDENT> return not compiler.dialect.supports_native_boolean <NEW_LINE> <DEDENT> @util.dependencies("sqlalchemy.sql.schema") <NEW_LINE> def _set_table(self, schema, column, table): <NEW_LINE> <INDENT> if not self.create_constraint: <NEW_LINE> <INDENT> return <NEW_LINE> <DEDENT> e = schema.CheckConstraint( type_coerce(column, self).in_([0, 1]), name=self.name, _create_rule=util.portable_instancemethod( self._should_create_constraint) ) <NEW_LINE> assert e.table is table <NEW_LINE> <DEDENT> @property <NEW_LINE> def python_type(self): <NEW_LINE> <INDENT> return bool <NEW_LINE> <DEDENT> def bind_processor(self, dialect): <NEW_LINE> <INDENT> if dialect.supports_native_boolean: <NEW_LINE> <INDENT> return None <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> return processors.boolean_to_int <NEW_LINE> <DEDENT> <DEDENT> def result_processor(self, dialect, coltype): <NEW_LINE> <INDENT> if dialect.supports_native_boolean: <NEW_LINE> <INDENT> return None <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> return processors.int_to_boolean
A bool datatype. Boolean typically uses BOOLEAN or SMALLINT on the DDL side, and on the Python side deals in ``True`` or ``False``.
62599062627d3e7fe0e08568
class CategoryArticleList(ArticleListBase): <NEW_LINE> <INDENT> def get_queryset(self): <NEW_LINE> <INDENT> return super(CategoryArticleList, self).get_queryset().filter( categories=self.category ) <NEW_LINE> <DEDENT> def get(self, request, category): <NEW_LINE> <INDENT> language = translation.get_language_from_request( request, check_path=True) <NEW_LINE> self.category = Category.objects.language(language).active_translations( language, slug=category).first() <NEW_LINE> if not self.category: <NEW_LINE> <INDENT> raise Http404('Category is not found') <NEW_LINE> <DEDENT> return super(CategoryArticleList, self).get(request) <NEW_LINE> <DEDENT> def get_context_data(self, **kwargs): <NEW_LINE> <INDENT> kwargs['newsblog_category'] = self.category <NEW_LINE> ctx = super(CategoryArticleList, self).get_context_data(**kwargs) <NEW_LINE> ctx['newsblog_category'] = self.category <NEW_LINE> return ctx
A list of articles filtered by categories.
62599062ac7a0e7691f73bc2
class GetTestCase(unittest.TestCase): <NEW_LINE> <INDENT> def test_get(self): <NEW_LINE> <INDENT> adapter = get() <NEW_LINE> self.assertIsNotNone(adapter) <NEW_LINE> self.assertEqual('root', adapter.logger.name) <NEW_LINE> <DEDENT> def test_get_with_name(self): <NEW_LINE> <INDENT> adapter = get('logger_test') <NEW_LINE> self.assertIsNotNone(adapter) <NEW_LINE> self.assertEqual('logger_test', adapter.logger.name)
Tests for the monocle.logger.get() function.
625990628e7ae83300eea76b
class IapProjectsIapWebSetIamPolicyRequest(_messages.Message): <NEW_LINE> <INDENT> resource = _messages.StringField(1, required=True) <NEW_LINE> setIamPolicyRequest = _messages.MessageField('SetIamPolicyRequest', 2)
A IapProjectsIapWebSetIamPolicyRequest object. Fields: resource: REQUIRED: The resource for which the policy is being specified. See the operation documentation for the appropriate value for this field. setIamPolicyRequest: A SetIamPolicyRequest resource to be passed as the request body.
62599062fff4ab517ebcef04
@deconstructible <NEW_LINE> class AlphanumericExcludingDiacritic: <NEW_LINE> <INDENT> def __init__(self, start=0): <NEW_LINE> <INDENT> self.start = start <NEW_LINE> <DEDENT> def __call__(self, value): <NEW_LINE> <INDENT> stripped_value = value[self.start :] <NEW_LINE> match = WORD_REGEX.match(stripped_value) <NEW_LINE> if not match: <NEW_LINE> <INDENT> raise ValidationError( 'Waarde "{0}" mag geen diakrieten of non-ascii tekens bevatten{1}'.format( value, " na de eerste {0} karakters".format(self.start) if self.start else "", ) ) <NEW_LINE> <DEDENT> <DEDENT> def __eq__(self, other): <NEW_LINE> <INDENT> return ( isinstance(other, AlphanumericExcludingDiacritic) and self.start == other.start )
All alphanumeric characters except diacritics. RGBZ has a strange definition for this. The origin is that diacritics are awkward in file names, so special characters are excluded.
62599062be8e80087fbc0764
class abstract_declarator_Node(ParseNode): <NEW_LINE> <INDENT> def __init__(self, **kw): <NEW_LINE> <INDENT> ParseNode.__init__(self, **kw) <NEW_LINE> <DEDENT> def dump(self, indent=0): <NEW_LINE> <INDENT> ParseNode.dump(self, indent)
Holds an "abstract_declarator" parse target and its components.
62599062baa26c4b54d50980
class DivExpression(BinaryExpression): <NEW_LINE> <INDENT> pass
{{ foo / bar }}
6259906221bff66bcd724343
class AtLeastImportTestCase(unittest.TestCase): <NEW_LINE> <INDENT> failureException = ImportError <NEW_LINE> def test_misc(self): <NEW_LINE> <INDENT> from twisted import copyright <NEW_LINE> <DEDENT> def test_persisted(self): <NEW_LINE> <INDENT> from twisted.persisted import dirdbm <NEW_LINE> from twisted.persisted import styles <NEW_LINE> <DEDENT> def test_internet(self): <NEW_LINE> <INDENT> from twisted.internet import tcp <NEW_LINE> from twisted.internet import main <NEW_LINE> from twisted.internet import abstract <NEW_LINE> from twisted.internet import udp <NEW_LINE> from twisted.internet import protocol <NEW_LINE> from twisted.internet import defer <NEW_LINE> <DEDENT> def test_unix(self): <NEW_LINE> <INDENT> from twisted.internet import stdio <NEW_LINE> from twisted.internet import process <NEW_LINE> from twisted.internet import unix <NEW_LINE> <DEDENT> if platformType != "posix": <NEW_LINE> <INDENT> test_unix.skip = "UNIX-only modules" <NEW_LINE> <DEDENT> def test_spread(self): <NEW_LINE> <INDENT> from twisted.spread import pb <NEW_LINE> from twisted.spread import jelly <NEW_LINE> from twisted.spread import banana <NEW_LINE> from twisted.spread import flavors <NEW_LINE> <DEDENT> def test_twistedPython(self): <NEW_LINE> <INDENT> from twisted.python import hook <NEW_LINE> from twisted.python import log <NEW_LINE> from twisted.python import reflect <NEW_LINE> from twisted.python import usage <NEW_LINE> from twisted.python import otp <NEW_LINE> <DEDENT> def test_protocols(self): <NEW_LINE> <INDENT> from twisted.protocols import basic <NEW_LINE> from twisted.protocols import ftp <NEW_LINE> from twisted.protocols import telnet <NEW_LINE> from twisted.protocols import policies
I test that there are no syntax errors which would prevent importing.
62599062d7e4931a7ef3d6f4
class ProcPipeMaxSize(KernelProcFileTestBase.KernelProcFileTestBase): <NEW_LINE> <INDENT> def parse_contents(self, contents): <NEW_LINE> <INDENT> return self.parse_line("{:d}\n", contents)[0] <NEW_LINE> <DEDENT> def get_path(self): <NEW_LINE> <INDENT> return "/proc/sys/fs/pipe-max-size" <NEW_LINE> <DEDENT> def get_permission_checker(self): <NEW_LINE> <INDENT> return target_file_utils.IsReadWrite
/proc/sys/fs/pipe-max-size reports the maximum size (in bytes) of individual pipes.
625990620a50d4780f70692e
class InstanceBoundTypeMixin(object): <NEW_LINE> <INDENT> __slots__ = () <NEW_LINE> _type_eq = classmethod(lambda cls, other: cls is type(other)) <NEW_LINE> _type_hash = classmethod(id) <NEW_LINE> def __eq__(self, other): <NEW_LINE> <INDENT> return self._type_eq(type(other)) <NEW_LINE> <DEDENT> def __hash__(self): <NEW_LINE> <INDENT> return self._type_hash()
Base class for per-instance types, that is, types defined for each instance of the target type. Do not use without mixing in an instance-private type.
625990627cff6e4e811b7124
class RandomNumOp(dsl.ContainerOp): <NEW_LINE> <INDENT> def __init__(self, low, high): <NEW_LINE> <INDENT> super(RandomNumOp, self).__init__( name='Random number', image='python:alpine3.6', command=['sh', '-c'], arguments=['python -c "import random; print(random.randint(%s,%s))" | tee /tmp/output' % (low, high)], file_outputs={'output': '/tmp/output'})
Generate a random number between low and high.
62599062379a373c97d9a702
class RouteGuideServicer(route_guide_pb2.EarlyAdopterRouteGuideServicer): <NEW_LINE> <INDENT> def __init__(self): <NEW_LINE> <INDENT> self.db = route_guide_resources.read_route_guide_database() <NEW_LINE> <DEDENT> def GetFeature(self, request, context): <NEW_LINE> <INDENT> feature = get_feature(self.db, request) <NEW_LINE> if not feature: <NEW_LINE> <INDENT> feature = route_guide_pb2.Feature( name="", location=route_guide_pb2.Point( latitude=request.latitude, longitude=request.longitude)) <NEW_LINE> <DEDENT> return feature <NEW_LINE> <DEDENT> def ListFeatures(self, request, context): <NEW_LINE> <INDENT> lo = request.lo <NEW_LINE> hi = request.hi <NEW_LINE> left = min(lo.longitude, hi.longitude) <NEW_LINE> right = max(lo.longitude, hi.longitude) <NEW_LINE> top = max(lo.latitude, hi.latitude) <NEW_LINE> bottom = min(lo.latitude, hi.latitude) <NEW_LINE> for feature in self.db: <NEW_LINE> <INDENT> if (feature.location.longitude >= left and feature.location.longitude <= right and feature.location.latitude >= bottom and feature.location.latitude <= top): <NEW_LINE> <INDENT> yield feature <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> def RecordRoute(self, request_iterator, context): <NEW_LINE> <INDENT> point_count = 0 <NEW_LINE> feature_count = 0 <NEW_LINE> distance = 0.0 <NEW_LINE> prev_point = None <NEW_LINE> start_time = time.time() <NEW_LINE> for point in request_iterator: <NEW_LINE> <INDENT> point_count += 1 <NEW_LINE> if get_feature(self.db, point): <NEW_LINE> <INDENT> feature_count += 1 <NEW_LINE> <DEDENT> if prev_point: <NEW_LINE> <INDENT> distance += get_distance(prev_point, point) <NEW_LINE> <DEDENT> prev_point = point <NEW_LINE> <DEDENT> elapsed_time = time.time() - start_time <NEW_LINE> return route_guide_pb2.RouteSummary(point_count=point_count, feature_count=feature_count, distance=int(distance), elapsed_time=int(elapsed_time)) <NEW_LINE> <DEDENT> def RouteChat(self, request_iterator, context): <NEW_LINE> <INDENT> prev_notes = [] <NEW_LINE> for new_note in 
request_iterator: <NEW_LINE> <INDENT> for prev_note in prev_notes: <NEW_LINE> <INDENT> if prev_note.location == new_note.location: <NEW_LINE> <INDENT> yield prev_note <NEW_LINE> <DEDENT> <DEDENT> prev_notes.append(new_note)
Provides methods that implement functionality of route guide server.
625990624f6381625f19a012
class ln_service_sector_employment_within_walking_distance(Variable): <NEW_LINE> <INDENT> _return_type="float32" <NEW_LINE> service_sector_employment_within_walking_distance = "service_sector_employment_within_walking_distance" <NEW_LINE> def dependencies(self): <NEW_LINE> <INDENT> return [my_attribute_label(self.service_sector_employment_within_walking_distance)] <NEW_LINE> <DEDENT> def compute(self, dataset_pool): <NEW_LINE> <INDENT> return ln_bounded(self.get_dataset().get_attribute(self.service_sector_employment_within_walking_distance))
Natural log of the service_sector_employment_within_walking_distance for this gridcell
62599062e64d504609df9f3d
class Income: <NEW_LINE> <INDENT> name = 'Доход' <NEW_LINE> def __init__(self, date, money, num, comment): <NEW_LINE> <INDENT> self.date = date <NEW_LINE> self.money = money <NEW_LINE> self.comment = comment <NEW_LINE> self.num = num <NEW_LINE> <DEDENT> def __str__(self): <NEW_LINE> <INDENT> return f'*{self.name} {self.num}*\nсумма: {self.money}\nдата: {self.date}\nкоментарий: {self.comment}'
A new income entry
625990621f037a2d8b9e53da
class SmsCodeViewset(CreateModelMixin, viewsets.GenericViewSet): <NEW_LINE> <INDENT> serializer_class = SmsSerializer <NEW_LINE> def generate_code(self): <NEW_LINE> <INDENT> seeds = '1234567890' <NEW_LINE> random_str = [] <NEW_LINE> for i in range(4): <NEW_LINE> <INDENT> random_str.append(choice(seeds)) <NEW_LINE> <DEDENT> return "".join(random_str) <NEW_LINE> <DEDENT> def create(self, request, *args, **kwargs): <NEW_LINE> <INDENT> serializer = self.get_serializer(data=request.data) <NEW_LINE> serializer.is_valid(raise_exception=True) <NEW_LINE> mobile = serializer.validated_data['mobile'] <NEW_LINE> yun_pian = YunPian(APIKEY) <NEW_LINE> code = self.generate_code() <NEW_LINE> sms_status = yun_pian.send_sms(code=code, mobile=mobile) <NEW_LINE> if sms_status['code'] != 0: <NEW_LINE> <INDENT> return Response( {'mobile': sms_status['msg']}, status=status.HTTP_400_BAD_REQUEST) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> code_record = VerifyCode(code=code, mobile=mobile) <NEW_LINE> code_record.save() <NEW_LINE> return Response( {'mobile': mobile}, status=status.HTTP_201_CREATED)
Send an SMS verification code
625990622ae34c7f260ac7c5
class FilterModule(object): <NEW_LINE> <INDENT> def filters(self): <NEW_LINE> <INDENT> return { 'b64decode': base64.b64decode, 'b64encode': base64.b64encode, 'to_json': json.dumps, 'to_nice_json': to_nice_json, 'from_json': json.loads, 'to_yaml': yaml.safe_dump, 'to_nice_yaml': to_nice_yaml, 'from_yaml': yaml.safe_load, 'basename': os.path.basename, 'dirname': os.path.dirname, 'realpath': os.path.realpath, 'failed' : failed, 'success' : success, 'changed' : changed, 'skipped' : skipped, 'mandatory': mandatory, 'bool': bool, 'quote': quote, 'md5': md5s, 'fileglob': fileglob, 'match': match, 'search': search, 'regex': regex, 'unique' : unique, 'intersect': intersect, 'difference': difference, 'symmetric_difference': symmetric_difference, 'union': union, }
Ansible core jinja2 filters
6259906266673b3332c31adb
class StopSystemException(AxonException): <NEW_LINE> <INDENT> pass
This exception is used to stop the whole Axon system.
62599062d486a94d0ba2d6a7
class ConnectionMonitorResultProperties(ConnectionMonitorParameters): <NEW_LINE> <INDENT> _validation = { 'monitoring_interval_in_seconds': {'maximum': 1800, 'minimum': 30}, 'provisioning_state': {'readonly': True}, 'start_time': {'readonly': True}, 'monitoring_status': {'readonly': True}, 'connection_monitor_type': {'readonly': True}, } <NEW_LINE> _attribute_map = { 'source': {'key': 'source', 'type': 'ConnectionMonitorSource'}, 'destination': {'key': 'destination', 'type': 'ConnectionMonitorDestination'}, 'auto_start': {'key': 'autoStart', 'type': 'bool'}, 'monitoring_interval_in_seconds': {'key': 'monitoringIntervalInSeconds', 'type': 'int'}, 'endpoints': {'key': 'endpoints', 'type': '[ConnectionMonitorEndpoint]'}, 'test_configurations': {'key': 'testConfigurations', 'type': '[ConnectionMonitorTestConfiguration]'}, 'test_groups': {'key': 'testGroups', 'type': '[ConnectionMonitorTestGroup]'}, 'outputs': {'key': 'outputs', 'type': '[ConnectionMonitorOutput]'}, 'notes': {'key': 'notes', 'type': 'str'}, 'provisioning_state': {'key': 'provisioningState', 'type': 'str'}, 'start_time': {'key': 'startTime', 'type': 'iso-8601'}, 'monitoring_status': {'key': 'monitoringStatus', 'type': 'str'}, 'connection_monitor_type': {'key': 'connectionMonitorType', 'type': 'str'}, } <NEW_LINE> def __init__( self, **kwargs ): <NEW_LINE> <INDENT> super(ConnectionMonitorResultProperties, self).__init__(**kwargs) <NEW_LINE> self.provisioning_state = None <NEW_LINE> self.start_time = None <NEW_LINE> self.monitoring_status = None <NEW_LINE> self.connection_monitor_type = None
Describes the properties of a connection monitor. Variables are only populated by the server, and will be ignored when sending a request. :param source: Describes the source of connection monitor. :type source: ~azure.mgmt.network.v2020_11_01.models.ConnectionMonitorSource :param destination: Describes the destination of connection monitor. :type destination: ~azure.mgmt.network.v2020_11_01.models.ConnectionMonitorDestination :param auto_start: Determines if the connection monitor will start automatically once created. :type auto_start: bool :param monitoring_interval_in_seconds: Monitoring interval in seconds. :type monitoring_interval_in_seconds: int :param endpoints: List of connection monitor endpoints. :type endpoints: list[~azure.mgmt.network.v2020_11_01.models.ConnectionMonitorEndpoint] :param test_configurations: List of connection monitor test configurations. :type test_configurations: list[~azure.mgmt.network.v2020_11_01.models.ConnectionMonitorTestConfiguration] :param test_groups: List of connection monitor test groups. :type test_groups: list[~azure.mgmt.network.v2020_11_01.models.ConnectionMonitorTestGroup] :param outputs: List of connection monitor outputs. :type outputs: list[~azure.mgmt.network.v2020_11_01.models.ConnectionMonitorOutput] :param notes: Optional notes to be associated with the connection monitor. :type notes: str :ivar provisioning_state: The provisioning state of the connection monitor. Possible values include: "Succeeded", "Updating", "Deleting", "Failed". :vartype provisioning_state: str or ~azure.mgmt.network.v2020_11_01.models.ProvisioningState :ivar start_time: The date and time when the connection monitor was started. :vartype start_time: ~datetime.datetime :ivar monitoring_status: The monitoring status of the connection monitor. :vartype monitoring_status: str :ivar connection_monitor_type: Type of connection monitor. Possible values include: "MultiEndpoint", "SingleSourceDestination". 
:vartype connection_monitor_type: str or ~azure.mgmt.network.v2020_11_01.models.ConnectionMonitorType
625990628e71fb1e983bd1aa
class VocabularyTest(unittest.TestCase): <NEW_LINE> <INDENT> def test_load_from_json(self): <NEW_LINE> <INDENT> voc = Loader(0, "test/data") <NEW_LINE> vocabulary = voc.load_json("test/data/Vocabulary/sample.json") <NEW_LINE> print(*vocabulary) <NEW_LINE> self.assertTrue(len(vocabulary) == 2, f"Found: {len(vocabulary)}") <NEW_LINE> <DEDENT> def test_load_from_text(self): <NEW_LINE> <INDENT> voc = Loader(0, "test/data") <NEW_LINE> vocabulary = voc.load_text("test/data/Vocabulary/sample.txt") <NEW_LINE> self.assertTrue(len(vocabulary) > 0) <NEW_LINE> print(*vocabulary) <NEW_LINE> self.assertTrue(len(vocabulary) == 1, f"Found: {len(vocabulary)}") <NEW_LINE> <DEDENT> def test_load_from_location(self): <NEW_LINE> <INDENT> voc = Loader(0, "data") <NEW_LINE> vocabulary = voc.load_from_location(Path("test/data")) <NEW_LINE> print(*vocabulary) <NEW_LINE> self.assertTrue(len(vocabulary) == 3, f"Found: {len(vocabulary)}")
Vocabulary module tester class
625990627d847024c075dab4
class PortString(object): <NEW_LINE> <INDENT> def __init__(self, message=None, max_values=constants.MAX_COMMA_VALUES): <NEW_LINE> <INDENT> if not message: <NEW_LINE> <INDENT> message = u'Invalid syntax: ' <NEW_LINE> <DEDENT> self.message = message <NEW_LINE> self.max_values = max_values <NEW_LINE> <DEDENT> def __call__(self, form, field): <NEW_LINE> <INDENT> field_data = field.data.split(";") <NEW_LINE> if len(field_data) > self.max_values: <NEW_LINE> <INDENT> raise ValidationError("{} maximum {} comma separated values".format(self.message, self.max_values)) <NEW_LINE> <DEDENT> try: <NEW_LINE> <INDENT> for port_string in field_data: <NEW_LINE> <INDENT> flowspec.to_exabgp_string(port_string, constants.MAX_PORT) <NEW_LINE> <DEDENT> <DEDENT> except ValueError as e: <NEW_LINE> <INDENT> raise ValidationError(self.message + str(e.args[0]))
Validator for port string - must be translatable to ExaBGP syntax. Max number of comma separated values must be <= 6 (default)
62599062d268445f2663a6cc
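The PortString entry above splits the field on ";" and bounds the number of parts before attempting the ExaBGP translation. That bounding step can be sketched on its own in plain Python; the class and parameter names here are illustrative, and the `flowspec.to_exabgp_string` translation from the real validator is deliberately omitted:

```python
class MaxValuesValidator:
    """Reject strings with more than `max_values` semicolon-separated parts.

    Simplified stand-in for the PortString validator above; the ExaBGP
    syntax check performed by the original is not reproduced here.
    """

    def __init__(self, max_values=6, message="Invalid syntax: "):
        self.max_values = max_values
        self.message = message

    def __call__(self, value):
        parts = value.split(";")
        if len(parts) > self.max_values:
            raise ValueError(
                "{} maximum {} comma separated values".format(
                    self.message, self.max_values
                )
            )
        return parts


validator = MaxValuesValidator(max_values=3)
ok = validator("80;443;8080")  # three parts: accepted
```

A fourth part would raise `ValueError`, mirroring how the original raises `ValidationError` for over-long fields.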
@linter(executable=sys.executable, prerequisite_check_command=(sys.executable, '-m', 'mypy', '-V'), output_format='regex', output_regex=r'[^:]+:(?:(?P<line>\d+):)? ' '(?P<severity>error): (?P<message>.*)') <NEW_LINE> class MypyBear: <NEW_LINE> <INDENT> LANGUAGES = {'Python', 'Python 2', 'Python 3'} <NEW_LINE> AUTHORS = {'Petr Viktorin'} <NEW_LINE> REQUIREMENTS = {PipRequirement('mypy-lang', '0.4.6')} <NEW_LINE> AUTHORS_EMAILS = {'[email protected]'} <NEW_LINE> LICENSE = 'AGPL-3.0' <NEW_LINE> ASCIINEMA_URL = 'https://asciinema.org/a/90736' <NEW_LINE> CAN_DETECT = set() <NEW_LINE> @add_param_docs(FLAG_MAP) <NEW_LINE> def create_arguments(self, filename, file, config_file, language: language=language('Python 3'), python_version: str=None, allow_untyped_functions: bool=True, allow_untyped_calls: bool=True, check_untyped_function_bodies: bool=False, strict_optional: bool=False): <NEW_LINE> <INDENT> args = ['-m', 'mypy'] <NEW_LINE> if 'python 2' in language: <NEW_LINE> <INDENT> args.append('--py2') <NEW_LINE> <DEDENT> elif 'python 3' not in language: <NEW_LINE> <INDENT> self.err( 'Language needs to be "Python", "Python 2" or "Python 3". ' 'Assuming Python 3.') <NEW_LINE> <DEDENT> if python_version: <NEW_LINE> <INDENT> args.extend(['--python-version', python_version]) <NEW_LINE> <DEDENT> loc = locals() <NEW_LINE> args.extend(flag.arg for name, flag in FLAG_MAP.items() if flag.want_flag(loc[name])) <NEW_LINE> args.append(filename) <NEW_LINE> return args
Type-checks your Python files! Checks optional static typing using the mypy tool. See <http://mypy.readthedocs.io/en/latest/basics.html> for info on how to add static typing.
625990624e4d562566373ae6
class Data(pyffi.object_models.FileFormat.Data): <NEW_LINE> <INDENT> fileinfos = [] <NEW_LINE> def read(self, stream): <NEW_LINE> <INDENT> raise NotImplementedError
Process archives described by mexscript files.
62599062460517430c432bc3
class TestNewTasks(unittest.TestCase): <NEW_LINE> <INDENT> def test_verify_data(self): <NEW_LINE> <INDENT> parser = setup_args() <NEW_LINE> opt = parser.parse_args(print_args=False) <NEW_LINE> changed_task_files = [ fn for fn in testing_utils.git_changed_files() if testing_utils.is_new_task_filename(fn) ] <NEW_LINE> if not changed_task_files: <NEW_LINE> <INDENT> return <NEW_LINE> <DEDENT> found_errors = False <NEW_LINE> for file in changed_task_files: <NEW_LINE> <INDENT> task = file.split('/')[-2] <NEW_LINE> module_name = "%s.tasks.%s.agents" % ('parlai', task) <NEW_LINE> task_module = importlib.import_module(module_name) <NEW_LINE> subtasks = [ ':'.join([task, x]) for x in dir(task_module) if ('teacher' in x.lower() and x not in BASE_TEACHERS) ] <NEW_LINE> if testing_utils.is_this_circleci(): <NEW_LINE> <INDENT> if len(subtasks) == 0: <NEW_LINE> <INDENT> continue <NEW_LINE> <DEDENT> self.fail( 'test_verify_data plays poorly with CircleCI. Please run ' '`python tests/datatests/test_new_tasks.py` locally and ' 'paste the output in your pull request.' ) <NEW_LINE> <DEDENT> for subt in subtasks: <NEW_LINE> <INDENT> parser = setup_args() <NEW_LINE> opt = parser.parse_args(args=['--task', subt], print_args=False) <NEW_LINE> opt['task'] = subt <NEW_LINE> try: <NEW_LINE> <INDENT> with testing_utils.capture_output(): <NEW_LINE> <INDENT> text, log = verify(opt, print_parser=False) <NEW_LINE> <DEDENT> <DEDENT> except Exception: <NEW_LINE> <INDENT> found_errors = True <NEW_LINE> traceback.print_exc() <NEW_LINE> print("Got above exception in {}".format(subt)) <NEW_LINE> <DEDENT> for key in KEYS: <NEW_LINE> <INDENT> if log[key] != 0: <NEW_LINE> <INDENT> print('There are {} {} in {}.'.format(log[key], key, subt)) <NEW_LINE> found_errors = True <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> <DEDENT> self.assertFalse(found_errors, "Errors were found.")
Make sure any changes to tasks pass verify_data test.
6259906244b2445a339b74d0
class UTF8Deserializer(Serializer): <NEW_LINE> <INDENT> def loads(self, stream): <NEW_LINE> <INDENT> length = read_int(stream) <NEW_LINE> return stream.read(length).decode('utf8') <NEW_LINE> <DEDENT> def load_stream(self, stream): <NEW_LINE> <INDENT> while True: <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> yield self.loads(stream) <NEW_LINE> <DEDENT> except struct.error: <NEW_LINE> <INDENT> return <NEW_LINE> <DEDENT> except EOFError: <NEW_LINE> <INDENT> return
Deserializes streams written by String.getBytes.
625990623cc13d1c6d466e22
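The framing UTF8Deserializer expects is a big-endian 4-byte length followed by that many UTF-8 bytes (what PySpark's `read_int` helper reads). A self-contained round-trip sketch of that framing, using `struct` instead of PySpark's internals (the helper names here are illustrative):

```python
import io
import struct


def read_int(stream):
    # Big-endian 4-byte signed int, matching the length prefix format.
    data = stream.read(4)
    if len(data) < 4:
        raise EOFError
    return struct.unpack(">i", data)[0]


def write_utf8(stream, text):
    # Length-prefixed framing: 4-byte length, then the UTF-8 bytes.
    encoded = text.encode("utf8")
    stream.write(struct.pack(">i", len(encoded)))
    stream.write(encoded)


def load_stream(stream):
    # Yield strings until the stream is exhausted, like
    # UTF8Deserializer.load_stream does.
    while True:
        try:
            length = read_int(stream)
        except EOFError:
            return
        yield stream.read(length).decode("utf8")


buf = io.BytesIO()
for s in ["hello", "wörld"]:
    write_utf8(buf, s)
buf.seek(0)
decoded = list(load_stream(buf))
```

Note that the length prefix counts bytes, not characters, which is why the multi-byte "ö" round-trips correctly.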
class NamespacedAngularAppStorage(AppStaticStorage): <NEW_LINE> <INDENT> source_dir = 'app' <NEW_LINE> def __init__(self, app, *args, **kwargs): <NEW_LINE> <INDENT> self.prefix = os.path.join(*(app.split('.'))) <NEW_LINE> super(NamespacedAngularAppStorage, self).__init__(app, *args, **kwargs)
A file system storage backend that takes an app module and works for the ``app`` directory of it. The app module will be included in the url for the content.
62599062baa26c4b54d50982
class IPV6Sample(object): <NEW_LINE> <INDENT> def __init__(self, u): <NEW_LINE> <INDENT> self.length = u.unpack_uint() <NEW_LINE> self.protocol = u.unpack_uint() <NEW_LINE> self.src_ip = u.unpack_fstring(16) <NEW_LINE> self.dst_ip = u.unpack_fstring(16) <NEW_LINE> self.src_port = u.unpack_uint() <NEW_LINE> self.dst_port = u.unpack_uint() <NEW_LINE> self.tcp_flags = u.unpack_uint() <NEW_LINE> self.priority = u.unpack_uint()
IPv6 sample data
62599062009cb60464d02c17
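XDR encodes each unsigned int as a 4-byte big-endian word and the two addresses as 16 raw bytes each, so the IPV6Sample record above is a fixed 56-byte layout. The same fields can be decoded with `struct` instead of an XDR unpacker; a sketch with illustrative names (field order taken from the class above):

```python
import socket
import struct

# length, protocol, src_ip (16 bytes), dst_ip (16 bytes),
# src_port, dst_port, tcp_flags, priority
IPV6_SAMPLE = struct.Struct(">II16s16sIIII")


def parse_ipv6_sample(data):
    (length, protocol, src_ip, dst_ip,
     src_port, dst_port, tcp_flags, priority) = IPV6_SAMPLE.unpack(data)
    return {
        "length": length,
        "protocol": protocol,
        "src_ip": socket.inet_ntop(socket.AF_INET6, src_ip),
        "dst_ip": socket.inet_ntop(socket.AF_INET6, dst_ip),
        "src_port": src_port,
        "dst_port": dst_port,
        "tcp_flags": tcp_flags,
        "priority": priority,
    }


# Build a record for ::1 -> ::2 and parse it back.
record = IPV6_SAMPLE.pack(
    56, 6,
    socket.inet_pton(socket.AF_INET6, "::1"),
    socket.inet_pton(socket.AF_INET6, "::2"),
    443, 54321, 0x10, 0,
)
sample = parse_ipv6_sample(record)
```

Since 16 is already a multiple of 4, the fixed-length address fields need no XDR padding, which is why a flat `struct` format suffices here.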
@admin.register(models.Photo) <NEW_LINE> class PhotoAdmin(admin.ModelAdmin): <NEW_LINE> <INDENT> list_display = ("__str__", "get_thumbnail") <NEW_LINE> def get_thumbnail(self, obj): <NEW_LINE> <INDENT> return mark_safe(f'<img width="50px" src="{obj.file.url}"/>') <NEW_LINE> <DEDENT> get_thumbnail.short_description = "Thumbnail"
Photo Admin Definition
62599062f548e778e596cc68
class MessageSender(MessageProcessor): <NEW_LINE> <INDENT> def run(self, job: Job, messages: List[Message]): <NEW_LINE> <INDENT> job.distributor.distribute(job, messages) <NEW_LINE> return []
Message distribution processor
62599062379a373c97d9a704
class CollinsClient: <NEW_LINE> <INDENT> def __init__(self, username, passwd, host): <NEW_LINE> <INDENT> self.username = username <NEW_LINE> self.passwd = passwd <NEW_LINE> self.host = host <NEW_LINE> <DEDENT> def async_update_asset(self, tag, params={}): <NEW_LINE> <INDENT> url = "/api/asset/%s" % tag <NEW_LINE> return grequests.post(self.host+url, auth=(self.username, self.passwd), data=params) <NEW_LINE> <DEDENT> def async_asset_finder(self, params={}): <NEW_LINE> <INDENT> url = "/api/assets" <NEW_LINE> return grequests.get(self.host+url, auth=(self.username, self.passwd), params=params) <NEW_LINE> <DEDENT> def assets(self, params={}): <NEW_LINE> <INDENT> url = "/api/assets" <NEW_LINE> response = self._query("get", url, params) <NEW_LINE> return response <NEW_LINE> <DEDENT> def create_asset(self, tag, params={}): <NEW_LINE> <INDENT> url = "/api/asset/%s" % tag <NEW_LINE> response = self._query("put", url, params) <NEW_LINE> return response <NEW_LINE> <DEDENT> def update_asset(self, tag, params={}): <NEW_LINE> <INDENT> url = "/api/asset/%s" % tag <NEW_LINE> response = self._query("post", url, params) <NEW_LINE> return response <NEW_LINE> <DEDENT> def delete_asset(self, tag, params={}): <NEW_LINE> <INDENT> url = "/api/asset/%s" % tag <NEW_LINE> response = self._query("delete", url, params) <NEW_LINE> return response <NEW_LINE> <DEDENT> def delete_asset_attribute(self, tag, attribute): <NEW_LINE> <INDENT> url = "/api/asset/%s/attribute/%s" % (tag, attribute) <NEW_LINE> response = self._query("delete", url, {}) <NEW_LINE> return response <NEW_LINE> <DEDENT> def asset_finder(self, params={}): <NEW_LINE> <INDENT> url = "/api/assets" <NEW_LINE> response = self._query("get", url, params) <NEW_LINE> return response <NEW_LINE> <DEDENT> def asset_info(self, tag, params={}): <NEW_LINE> <INDENT> url = "/api/asset/%s" % tag <NEW_LINE> response = self._query("get", url, params) <NEW_LINE> return response <NEW_LINE> <DEDENT> def assets_logs(self, tag, params={}): <NEW_LINE> 
<INDENT> url = "/api/asset/%s/logs" % (tag) <NEW_LINE> response = self._query("get", url, params) <NEW_LINE> return response <NEW_LINE> <DEDENT> def create_assets_log(self, tag, params={}): <NEW_LINE> <INDENT> url = "/api/asset/%s/log" % (tag) <NEW_LINE> response = self._query("put", url, params) <NEW_LINE> <DEDENT> def _query(self, method, url, params={}): <NEW_LINE> <INDENT> handle = urllib2.build_opener(urllib2.HTTPHandler) <NEW_LINE> if method in ['post', 'put']: <NEW_LINE> <INDENT> request = urllib2.Request(self.host+url, data=urllib.urlencode(params, doseq=True)) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> if params: <NEW_LINE> <INDENT> url += "?" + urllib.urlencode(params, doseq=True) <NEW_LINE> <DEDENT> request = urllib2.Request(self.host+url) <NEW_LINE> <DEDENT> authstring = base64.encodestring("%s:%s" % (self.username, self.passwd)).strip() <NEW_LINE> request.add_header("Authorization", "Basic %s" % authstring) <NEW_LINE> request.get_method = { "get" : lambda: "GET", "post" : lambda: "POST", "put" : lambda: "PUT", "delete" : lambda: "DELETE" }.get(method, "get") <NEW_LINE> response = handle.open(request).read() <NEW_LINE> response = json.loads(response) <NEW_LINE> return response
This client will help you interface with Collins in a meaningful way, giving access to all the different APIs that Collins offers.
6259906232920d7e50bc7726
class BookInfoAdmin(admin.ModelAdmin): <NEW_LINE> <INDENT> list_display = ['id','btitle','bpub_data']
Book model admin class
6259906207f4c71912bb0b17
class TestImobiliariaController(BaseTestCase): <NEW_LINE> <INDENT> def test_imobiliaria_get(self): <NEW_LINE> <INDENT> response = self.client.open( '/api/v1.0/imobiliaria', method='GET') <NEW_LINE> self.assert200(response, 'Response body is : ' + response.data.decode('utf-8')) <NEW_LINE> <DEDENT> def test_imobiliaria_id_get(self): <NEW_LINE> <INDENT> response = self.client.open( '/api/v1.0/imobiliaria/{id}'.format(id=56), method='GET') <NEW_LINE> self.assert200(response, 'Response body is : ' + response.data.decode('utf-8')) <NEW_LINE> <DEDENT> def test_imobiliaria_post(self): <NEW_LINE> <INDENT> body = Body1() <NEW_LINE> response = self.client.open( '/api/v1.0/imobiliaria', method='POST', data=json.dumps(body), content_type='application/json') <NEW_LINE> self.assert200(response, 'Response body is : ' + response.data.decode('utf-8')) <NEW_LINE> <DEDENT> def test_imobiliaria_put(self): <NEW_LINE> <INDENT> body = Imobiliaria() <NEW_LINE> response = self.client.open( '/api/v1.0/imobiliaria', method='PUT', data=json.dumps(body), content_type='application/json') <NEW_LINE> self.assert200(response, 'Response body is : ' + response.data.decode('utf-8'))
ImobiliariaController integration test stubs
6259906245492302aabfdbbb
class CeilingMixin: <NEW_LINE> <INDENT> def __init__(self, ceiling: torch.Tensor, *args, **kwargs): <NEW_LINE> <INDENT> self.ceiling = ceiling <NEW_LINE> super().__init__(*args, **kwargs) <NEW_LINE> <DEDENT> def cdf(self, value: torch.Tensor) -> torch.Tensor: <NEW_LINE> <INDENT> ceiling, value = broadcast_all(self.ceiling, value) <NEW_LINE> return ceiling * super().cdf(value) <NEW_LINE> <DEDENT> def log_prob(self, value: torch.Tensor) -> torch.Tensor: <NEW_LINE> <INDENT> ceiling, value = broadcast_all(self.ceiling, value) <NEW_LINE> return ceiling.log() + super().log_prob(value) <NEW_LINE> <DEDENT> def expand(self, *args, **kwargs): <NEW_LINE> <INDENT> raise NotImplementedError
Mixin for torch.distributions.Distribution subclasses where, instead of assuming that events always happen eventually, the event-probability asymptotes to some probability less than 1.0.
6259906291f36d47f22319ff
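Without pulling in torch, the behaviour the CeilingMixin adds can be sketched with a plain normal distribution: the CDF is scaled by the ceiling so it asymptotes below 1.0, and the log density picks up a `log(ceiling)` term. The function names below are illustrative, not part of the class:

```python
import math


def normal_cdf(x, mu=0.0, sigma=1.0):
    # Closed-form normal CDF via the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))


def ceiling_cdf(x, ceiling, mu=0.0, sigma=1.0):
    # The event-probability asymptotes to `ceiling` instead of 1.0,
    # which is what CeilingMixin.cdf does to the wrapped CDF.
    return ceiling * normal_cdf(x, mu, sigma)


def ceiling_log_prob(x, ceiling, mu=0.0, sigma=1.0):
    # The log density gains a log(ceiling) term, mirroring
    # CeilingMixin.log_prob.
    log_pdf = (-0.5 * ((x - mu) / sigma) ** 2
               - math.log(sigma * math.sqrt(2.0 * math.pi)))
    return math.log(ceiling) + log_pdf


p = ceiling_cdf(0.0, ceiling=0.8)   # half of the 0.8 ceiling
```

Far in the right tail the scaled CDF approaches 0.8, not 1.0, which is the "not all events happen eventually" semantics the mixin's docstring describes.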
class FilterExpression(AWSProperty): <NEW_LINE> <INDENT> props: PropsDictType = { "Expression": (str, True), "ValuesMap": ([FilterValue], True), }
`FilterExpression <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-databrew-dataset-filterexpression.html>`__
62599062a17c0f6771d5d715
class SearchSpaceGenerator(ast.NodeTransformer): <NEW_LINE> <INDENT> def __init__(self, module_name): <NEW_LINE> <INDENT> self.module_name = module_name <NEW_LINE> self.search_space = {} <NEW_LINE> self.last_line = 0 <NEW_LINE> <DEDENT> def visit_Call(self, node): <NEW_LINE> <INDENT> self.generic_visit(node) <NEW_LINE> if type(node.func) is not ast.Attribute: <NEW_LINE> <INDENT> return node <NEW_LINE> <DEDENT> if type(node.func.value) is not ast.Name: <NEW_LINE> <INDENT> return node <NEW_LINE> <DEDENT> if node.func.value.id != 'nni': <NEW_LINE> <INDENT> return node <NEW_LINE> <DEDENT> func = node.func.attr <NEW_LINE> if func not in _ss_funcs: <NEW_LINE> <INDENT> return node <NEW_LINE> <DEDENT> self.last_line = node.lineno <NEW_LINE> if node.keywords: <NEW_LINE> <INDENT> assert len(node.keywords) == 1, 'Smart parameter has keyword argument other than "name"' <NEW_LINE> assert node.keywords[0].arg == 'name', 'Smart paramater\'s keyword argument is not "name"' <NEW_LINE> assert type(node.keywords[0].value) is ast.Str, 'Smart parameter\'s name must be string literal' <NEW_LINE> name = node.keywords[0].value.s <NEW_LINE> specified_name = True <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> name = '__line' + str(str(node.args[-1].lineno)) <NEW_LINE> specified_name = False <NEW_LINE> node.keywords = list() <NEW_LINE> <DEDENT> if func in ('choice', 'function_choice'): <NEW_LINE> <INDENT> assert len(node.args) == 1, 'Smart parameter has arguments other than dict' <NEW_LINE> args = [key.n if type(key) is ast.Num else key.s for key in node.args[0].keys] <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> assert all(type(arg) is ast.Num for arg in node.args), 'Smart parameter\'s arguments must be number literals' <NEW_LINE> args = [arg.n for arg in node.args] <NEW_LINE> <DEDENT> key = self.module_name + '/' + name + '/' + func <NEW_LINE> node.keywords.append(ast.keyword(arg='key', value=ast.Str(s=key))) <NEW_LINE> if func == 'function_choice': <NEW_LINE> <INDENT> func = 'choice' 
<NEW_LINE> <DEDENT> value = {'_type': func, '_value': args} <NEW_LINE> if specified_name: <NEW_LINE> <INDENT> old = self.search_space.get(key) <NEW_LINE> assert old is None or old == value, 'Different smart parameters have same name' <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> assert key not in self.search_space, 'Only one smart parameter is allowed in a line' <NEW_LINE> <DEDENT> self.search_space[key] = value <NEW_LINE> return node
Generate search space from smart parameter APIs
6259906299cbb53fe68325c3
class phi1DTestCase(unittest.TestCase): <NEW_LINE> <INDENT> def setUp(self): <NEW_LINE> <INDENT> self.prior_seterr = numpy.seterr(divide='raise') <NEW_LINE> <DEDENT> def tearDown(self): <NEW_LINE> <INDENT> numpy.seterr(**self.prior_seterr) <NEW_LINE> <DEDENT> xx = dadi.Numerics.default_grid(20) <NEW_LINE> def test_snm(self): <NEW_LINE> <INDENT> dadi.PhiManip.phi_1D(self.xx) <NEW_LINE> <DEDENT> def test_genic(self): <NEW_LINE> <INDENT> dadi.PhiManip.phi_1D(self.xx, gamma=1) <NEW_LINE> dadi.PhiManip.phi_1D(self.xx, gamma=-500) <NEW_LINE> <DEDENT> def test_dominance(self): <NEW_LINE> <INDENT> dadi.PhiManip.phi_1D(self.xx, gamma=1, h=0.3)
Test routines for generating input phi's. These tests are primarily motivated by the desire to avoid divide by zero warnings.
62599062442bda511e95d8ca
class PCA(LocalSystem): <NEW_LINE> <INDENT> def __new__(cls, coords): <NEW_LINE> <INDENT> eigen_vectors, eigen_values = eigen(coords) <NEW_LINE> center = np.mean(coords, axis=0) <NEW_LINE> dim = len(center) <NEW_LINE> T = LocalSystem(identity(dim + 1)).view(cls) <NEW_LINE> T[:dim, :dim] = eigen_vectors.T <NEW_LINE> T = T @ t_matrix(-center) <NEW_LINE> T._eigen_values = eigen_values <NEW_LINE> return T <NEW_LINE> <DEDENT> @property <NEW_LINE> def eigen_values(self): <NEW_LINE> <INDENT> return self._eigen_values <NEW_LINE> <DEDENT> def pc(self, k): <NEW_LINE> <INDENT> if not (k >= 1 and k <= self.dim): <NEW_LINE> <INDENT> raise ValueError("%-'th principal component not available") <NEW_LINE> <DEDENT> return self[k - 1, :self.dim]
Principal Component Analysis (PCA). Parameters ---------- coords : array_like(Number, shape=(n, k)) Represents `n` data points of `k` dimensions. These coordinates are used to fit a PCA. Attributes ---------- eigen_values : np.ndarray(Number, shape=(k)) Characteristic Eigenvalues of the PCA. Notes ----- Implementation idea taken from [1]. References ---------- [1] https://stackoverflow.com/questions/13224362/principal-component-analysis-pca-in-python Examples -------- >>> coords = [(0, 0), (3, 4)] >>> T = PCA(coords) >>> print_rounded(T) [[ 0.6 0.8 -2.5] [-0.8 0.6 0. ] [ 0. 0. 1. ]] >>> print_rounded(np.linalg.inv(T)) [[ 0.6 -0.8 1.5] [ 0.8 0.6 2. ] [ 0. 0. 1. ]] >>> print_rounded(transform(coords, T)) [[-2.5 0. ] [ 2.5 0. ]]
62599062462c4b4f79dbd0e7
class Computer(Player): <NEW_LINE> <INDENT> def __repr__(self): <NEW_LINE> <INDENT> return 'Computer(%s)' % str(self) <NEW_LINE> <DEDENT> def get_move(self): <NEW_LINE> <INDENT> return random.randint(1, 10)
Computer abstract class
6259906238b623060ffaa3c1
class KafkaCompressSink(KafkaSink): <NEW_LINE> <INDENT> def producer_config(self, kafka_config: Dict[str, Any]) -> Dict[str, Any]: <NEW_LINE> <INDENT> config = kafka_config.copy() <NEW_LINE> config.update( { "compression.codec": "gzip", "retry.backoff.ms": 250, "linger.ms": 1000, "batch.num.messages": 50, "delivery.report.only.error": True, "default.topic.config": { "message.timeout.ms": 30000, "request.required.acks": -1, }, } ) <NEW_LINE> return config
Variant of KafkaSink for large documents. Used for, e.g., GROBID output.
62599062d6c5a102081e3806
class PluginsDao(object): <NEW_LINE> <INDENT> def __init__(self, data_path): <NEW_LINE> <INDENT> self.data_path = data_path <NEW_LINE> self.path = self.data_path / Path('plugins_lv2.json') <NEW_LINE> <DEDENT> def load(self): <NEW_LINE> <INDENT> return Persistence.read(self.path) <NEW_LINE> <DEDENT> def save(self, data): <NEW_LINE> <INDENT> Persistence.save(self.path, data) <NEW_LINE> <DEDENT> @property <NEW_LINE> def exists_data(self): <NEW_LINE> <INDENT> return self.path.exists()
Persists and loads Lv2Plugins data
62599062d268445f2663a6cd
class Alien(GSprite): <NEW_LINE> <INDENT> def __init__(self,x1,y1,n): <NEW_LINE> <INDENT> super().__init__(width=ALIEN_WIDTH,height=ALIEN_HEIGHT,x=x1,y=y1, source=ALIEN_IMAGES[n],format=(3,2)) <NEW_LINE> <DEDENT> def collides(self, bolt): <NEW_LINE> <INDENT> collide = False <NEW_LINE> if bolt.isPlayerBolt(): <NEW_LINE> <INDENT> points = [self.contains(((bolt.x - int(bolt.width/2)), bolt.y)), self.contains(((bolt.x + int(bolt.width/2)), bolt.y)), self.contains((bolt.x - int(bolt.width/2), bolt.y + BOLT_HEIGHT)), self.contains(((bolt.x + int(bolt.width/2)), (bolt.y + BOLT_HEIGHT)))] <NEW_LINE> for a in range(len(points)): <NEW_LINE> <INDENT> if points[a] == True: <NEW_LINE> <INDENT> collide = True <NEW_LINE> <DEDENT> <DEDENT> return collide <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> return False
A class to represent a single alien. At the very least, you want a __init__ method to initialize the alien dimensions. These dimensions are all specified in consts.py. You also MIGHT want to add code to detect a collision with a bolt. We do not require this. You could put this method in Wave if you wanted to. But the advantage of putting it here is that Ships and Aliens collide with different bolts. Ships collide with Alien bolts, not Ship bolts. And Aliens collide with Ship bolts, not Alien bolts. An easy way to keep this straight is for this class to have its own collision method. However, there is no need for any more attributes other than those inherited by GSprite. You would only add attributes if you needed them for extra gameplay features (like giving each alien a score value). If you add attributes, list them below. LIST MORE ATTRIBUTES (AND THEIR INVARIANTS) HERE IF NECESSARY
62599062be8e80087fbc0768
class Writer(Collab): <NEW_LINE> <INDENT> def __init__(self): <NEW_LINE> <INDENT> Collab.__init__(self, 'ace_text-input') <NEW_LINE> self.__word_to_type = "type_something" <NEW_LINE> <DEDENT> def run(self): <NEW_LINE> <INDENT> print("=== Writer is starting ===") <NEW_LINE> self.alive = True <NEW_LINE> while self.alive: <NEW_LINE> <INDENT> print("=== Writer is typing : %s ===" % self.__word_to_type) <NEW_LINE> self.select.send_keys(self.__word_to_type) <NEW_LINE> self.content_editor += self.__word_to_type <NEW_LINE> time.sleep(5) <NEW_LINE> <DEDENT> <DEDENT> def stop(self): <NEW_LINE> <INDENT> Collab.stop(self) <NEW_LINE> print("=== Writer is stopping ===")
docstring for Writer
62599062f7d966606f74942a
class Physics(mujoco.Physics): <NEW_LINE> <INDENT> def upright(self): <NEW_LINE> <INDENT> return self.named.data.xmat['torso', 'zz'] <NEW_LINE> <DEDENT> def torso_velocity(self): <NEW_LINE> <INDENT> return self.data.sensordata <NEW_LINE> <DEDENT> def joint_velocities(self): <NEW_LINE> <INDENT> return self.named.data.qvel[_JOINTS] <NEW_LINE> <DEDENT> def joint_angles(self): <NEW_LINE> <INDENT> return self.named.data.qpos[_JOINTS] <NEW_LINE> <DEDENT> def mouth_to_target(self): <NEW_LINE> <INDENT> data = self.named.data <NEW_LINE> mouth_to_target_global = data.geom_xpos['target'] - data.geom_xpos['mouth'] <NEW_LINE> return mouth_to_target_global.dot(data.geom_xmat['mouth'].reshape(3, 3))
Physics simulation with additional features for the Fish domain.
625990629c8ee82313040cfa
class TransportError(Exception): <NEW_LINE> <INDENT> pass
Exception class for Transport errors
625990628a43f66fc4bf3871
class ROSWorld(AbstractWorld): <NEW_LINE> <INDENT> def __init__(self, subscriptions, publications): <NEW_LINE> <INDENT> super(ROSWorld, self).__init__() <NEW_LINE> rospy.init_node(name='rlpy', anonymous=True, disable_signals=True) <NEW_LINE> self.proxy = ROSProxy(subscriptions, publications) <NEW_LINE> self.thread = threading.Thread(target=(lambda: self.proxy.run())) <NEW_LINE> print('Starting ROS thread...') <NEW_LINE> self.thread.start() <NEW_LINE> self.actions = self.proxy.numberOfActions() <NEW_LINE> self.initial = self.proxy.observation()[0] <NEW_LINE> print('Started!') <NEW_LINE> <DEDENT> def nb_actions(self): <NEW_LINE> <INDENT> return self.actions <NEW_LINE> <DEDENT> def reset(self): <NEW_LINE> <INDENT> self.proxy.setAction(self.actions) <NEW_LINE> <DEDENT> def performAction(self, action): <NEW_LINE> <INDENT> self.proxy.setAction(action) <NEW_LINE> return self.proxy.observation()
Bridge between ROS (Robot Operating System) and this framework. This world subscribes to ROS topics and uses them to produce states. When actions are performed, they are transformed into publishings in the ROS network.
6259906267a9b606de547613
class log_with(object): <NEW_LINE> <INDENT> ENTRY_MESSAGE = 'Entering {}' <NEW_LINE> EXIT_MESSAGE = 'Exiting {}' <NEW_LINE> def __init__(self, logger=None): <NEW_LINE> <INDENT> self.logger = logger <NEW_LINE> <DEDENT> def __call__(self, func): <NEW_LINE> <INDENT> if not self.logger: <NEW_LINE> <INDENT> logging.basicConfig() <NEW_LINE> self.logger = logging.getLogger(func.__module__) <NEW_LINE> <DEDENT> @functools.wraps(func) <NEW_LINE> def wrapper(*args, **kwds): <NEW_LINE> <INDENT> self.logger.debug( self.ENTRY_MESSAGE.format(func.__name__)) <NEW_LINE> f_result = func(*args, **kwds) <NEW_LINE> self.logger.debug( self.EXIT_MESSAGE.format(func.__name__)) <NEW_LINE> return f_result <NEW_LINE> <DEDENT> return wrapper
Logging decorator that allows you to log with a specific logger.
6259906245492302aabfdbbd
class Vector(object): <NEW_LINE> <INDENT> def __init__(self,x=0.0,y=0.0,z=0.0): <NEW_LINE> <INDENT> self.coords=[x,y,z] <NEW_LINE> self.x=x <NEW_LINE> self.y=y <NEW_LINE> self.z=z <NEW_LINE> <DEDENT> def length(self): <NEW_LINE> <INDENT> dr = 0.0 <NEW_LINE> for x in self.coords: <NEW_LINE> <INDENT> dr = dr + x**2 <NEW_LINE> <DEDENT> dr = math.sqrt(dr) <NEW_LINE> return dr <NEW_LINE> <DEDENT> def __str__(self): <NEW_LINE> <INDENT> output = "%12.6f %12.6f %12.6f"%(self.coords[0],self.coords[1],self.coords[2]) <NEW_LINE> return output <NEW_LINE> <DEDENT> def __repr__(self): <NEW_LINE> <INDENT> output = "Vector(%g,%g,%g)"%(self.coords[0],self.coords[1],self.coords[2]) <NEW_LINE> return output <NEW_LINE> <DEDENT> def __add__(self,y): <NEW_LINE> <INDENT> z = Vector() <NEW_LINE> for i in range(len(self.coords)): <NEW_LINE> <INDENT> z.coords[i] = self.coords[i]+y.coords[i] <NEW_LINE> <DEDENT> return z <NEW_LINE> <DEDENT> def __sub__(self,y): <NEW_LINE> <INDENT> z = Vector() <NEW_LINE> for i in range(len(self.coords)): <NEW_LINE> <INDENT> z.coords[i] = self.coords[i]-y.coords[i] <NEW_LINE> <DEDENT> return z
classdocs
6259906229b78933be26ac35
class CheckDep(Command): <NEW_LINE> <INDENT> description = "Checks that the required dependencies are installed on the system" <NEW_LINE> user_options = [] <NEW_LINE> def initialize_options(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def finalize_options(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def run(self): <NEW_LINE> <INDENT> for dep in dependencies: <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> require(dep) <NEW_LINE> print(dep + " ...[ok]") <NEW_LINE> <DEDENT> except (DistributionNotFound, VersionConflict): <NEW_LINE> <INDENT> print(dep + "... MISSING!")
Command to check that the required dependencies are installed on the system
6259906232920d7e50bc7729
class ImageSelectionCreateView(CreateView): <NEW_LINE> <INDENT> model = ImageSelection <NEW_LINE> form_class = ImageSelectionCreateForm <NEW_LINE> template_name = 'lumina/base_create_update_form.html' <NEW_LINE> def get_initial(self): <NEW_LINE> <INDENT> initial = super(ImageSelectionCreateView, self).get_initial() <NEW_LINE> if 'id_session' in self.request.GET: <NEW_LINE> <INDENT> initial.update({ 'session': self.request.GET['id_session'], }) <NEW_LINE> <DEDENT> return initial <NEW_LINE> <DEDENT> def form_valid(self, form): <NEW_LINE> <INDENT> form.instance.studio = self.request.user.studio <NEW_LINE> form.instance.customer = form.instance.session.customer <NEW_LINE> ret = super(ImageSelectionCreateView, self).form_valid(form) <NEW_LINE> subject = "Solicitud de seleccion de imagenes" <NEW_LINE> link = self.request.build_absolute_uri( reverse('session_detail', args=[form.instance.session.id])) <NEW_LINE> message = "Tiene una nueva solicitud para seleccionar fotografías.\n" + "Para verlo ingrese a {}".format(link) <NEW_LINE> for customer_user in form.instance.customer.users.all(): <NEW_LINE> <INDENT> to_email = customer_user.email <NEW_LINE> send_email(subject, to_email, message) <NEW_LINE> <DEDENT> messages.success( self.request, 'La solicitud de seleccion de imagenes ' 'fue creada correctamente.') <NEW_LINE> return ret <NEW_LINE> <DEDENT> def get_success_url(self): <NEW_LINE> <INDENT> return reverse('session_detail', args=[self.object.session.id]) <NEW_LINE> <DEDENT> def get_context_data(self, **kwargs): <NEW_LINE> <INDENT> context = super(ImageSelectionCreateView, self).get_context_data(**kwargs) <NEW_LINE> context['form'].fields['session'].queryset = self.request.user.studio.session_set.all() <NEW_LINE> context['title'] = "Solicitud de seleccion de fotos" <NEW_LINE> context['submit_label'] = "Enviar solicitud" <NEW_LINE> return context
With this view, the photographer creates a request to the customer.
625990623539df3088ecd980
class UserProfile(FacebookProfile): <NEW_LINE> <INDENT> class Meta: <NEW_LINE> <INDENT> proxy = True <NEW_LINE> <DEDENT> def get_avatar(self): <NEW_LINE> <INDENT> return get_thumbnail('http://graph.facebook.com/%s/picture' % self.facebook_id, '50')
User Profile Wrapper. TODO: migrate to social_auth.
6259906292d797404e3896cf
class Mimetype(CustomRule): <NEW_LINE> <INDENT> priority = POST_PROCESS <NEW_LINE> dependency = Processors <NEW_LINE> def when(self, matches, context): <NEW_LINE> <INDENT> mime, _ = mimetypes.guess_type(matches.input_string, strict=False) <NEW_LINE> return mime <NEW_LINE> <DEDENT> def then(self, matches, when_response, context): <NEW_LINE> <INDENT> mime = when_response <NEW_LINE> matches.append(Match(len(matches.input_string), len(matches.input_string), name='mimetype', value=mime))
Mimetype post processor :param matches: :type matches: :return: :rtype:
625990620c0af96317c578d0
class MultiInputPolicy(DQNPolicy): <NEW_LINE> <INDENT> def __init__( self, observation_space: gym.spaces.Dict, action_space: gym.spaces.Space, lr_schedule: Schedule, net_arch: Optional[List[int]] = None, activation_fn: Type[nn.Module] = nn.ReLU, features_extractor_class: Type[BaseFeaturesExtractor] = CombinedExtractor, features_extractor_kwargs: Optional[Dict[str, Any]] = None, normalize_images: bool = True, optimizer_class: Type[th.optim.Optimizer] = th.optim.Adam, optimizer_kwargs: Optional[Dict[str, Any]] = None, ): <NEW_LINE> <INDENT> super(MultiInputPolicy, self).__init__( observation_space, action_space, lr_schedule, net_arch, activation_fn, features_extractor_class, features_extractor_kwargs, normalize_images, optimizer_class, optimizer_kwargs, )
Policy class for DQN when using dict observations as input. :param observation_space: Observation space :param action_space: Action space :param lr_schedule: Learning rate schedule (could be constant) :param net_arch: The specification of the policy and value networks. :param activation_fn: Activation function :param features_extractor_class: Features extractor to use. :param normalize_images: Whether to normalize images or not, dividing by 255.0 (True by default) :param optimizer_class: The optimizer to use, ``th.optim.Adam`` by default :param optimizer_kwargs: Additional keyword arguments, excluding the learning rate, to pass to the optimizer
625990628e71fb1e983bd1ae