Columns: code (string, 4–4.48k chars) · docstring (string, 1–6.45k chars) · _id (string, 24 chars)
class TimeSeries(_messages.Message): <NEW_LINE> <INDENT> class MetricKindValueValuesEnum(_messages.Enum): <NEW_LINE> <INDENT> METRIC_KIND_UNSPECIFIED = 0 <NEW_LINE> GAUGE = 1 <NEW_LINE> DELTA = 2 <NEW_LINE> CUMULATIVE = 3 <NEW_LINE> <DEDENT> class ValueTypeValueValuesEnum(_messages.Enum): <NEW_LINE> <INDENT> VALUE_TYPE_UNSPECIFIED = 0 <NEW_LINE> BOOL = 1 <NEW_LINE> INT64 = 2 <NEW_LINE> DOUBLE = 3 <NEW_LINE> STRING = 4 <NEW_LINE> DISTRIBUTION = 5 <NEW_LINE> MONEY = 6 <NEW_LINE> <DEDENT> metadata = _messages.MessageField('MonitoredResourceMetadata', 1) <NEW_LINE> metric = _messages.MessageField('Metric', 2) <NEW_LINE> metricKind = _messages.EnumField('MetricKindValueValuesEnum', 3) <NEW_LINE> points = _messages.MessageField('Point', 4, repeated=True) <NEW_LINE> resource = _messages.MessageField('MonitoredResource', 5) <NEW_LINE> valueType = _messages.EnumField('ValueTypeValueValuesEnum', 6)
A collection of data points that describes the time-varying values of a metric. A time series is identified by a combination of a fully-specified monitored resource and a fully-specified metric. This type is used for both listing and creating time series. Enums: MetricKindValueValuesEnum: The metric kind of the time series. When listing time series, this metric kind might be different from the metric kind of the associated metric if this time series is an alignment or reduction of other time series. When creating a time series, this field is optional. If present, it must be the same as the metric kind of the associated metric. If the associated metric's descriptor must be auto-created, then this field specifies the metric kind of the new descriptor and must be either GAUGE (the default) or CUMULATIVE. ValueTypeValueValuesEnum: The value type of the time series. When listing time series, this value type might be different from the value type of the associated metric if this time series is an alignment or reduction of other time series. When creating a time series, this field is optional. If present, it must be the same as the type of the data in the points field. Fields: metadata: Output only. The associated monitored resource metadata. When reading a time series, this field will include metadata labels that are explicitly named in the reduction. When creating a time series, this field is ignored. metric: The associated metric. A fully-specified metric used to identify the time series. metricKind: The metric kind of the time series. When listing time series, this metric kind might be different from the metric kind of the associated metric if this time series is an alignment or reduction of other time series. When creating a time series, this field is optional. If present, it must be the same as the metric kind of the associated metric. 
If the associated metric's descriptor must be auto-created, then this field specifies the metric kind of the new descriptor and must be either GAUGE (the default) or CUMULATIVE. points: The data points of this time series. When listing time series, points are returned in reverse time order. When creating a time series, this field must contain exactly one point and the point's type must be the same as the value type of the associated metric. If the associated metric's descriptor must be auto-created, then the value type of the descriptor is determined by the point's type, which must be BOOL, INT64, DOUBLE, or DISTRIBUTION. resource: The associated monitored resource. Custom metrics can use only certain monitored resource types in their time series data. valueType: The value type of the time series. When listing time series, this value type might be different from the value type of the associated metric if this time series is an alignment or reduction of other time series. When creating a time series, this field is optional. If present, it must be the same as the type of the data in the points field.
6259907ba8370b77170f1dec
class ArgProcessor(DataProcessor): <NEW_LINE> <INDENT> def get_train_examples(self, data_dir): <NEW_LINE> <INDENT> logger.info("LOOKING AT {}".format(os.path.join(data_dir, "train_everything.tsv"))) <NEW_LINE> return self._create_examples( self._read_tsv(os.path.join(data_dir, "train_everything.tsv")), "train") <NEW_LINE> <DEDENT> def get_dev_examples(self, data_dir): <NEW_LINE> <INDENT> return self._create_examples( self._read_tsv(os.path.join(data_dir, "dev.tsv")), "dev") <NEW_LINE> <DEDENT> def get_labels(self): <NEW_LINE> <INDENT> return ["0", "1", "2"] <NEW_LINE> <DEDENT> def _create_examples(self, lines, set_type): <NEW_LINE> <INDENT> examples = [] <NEW_LINE> for (i, line) in enumerate(lines): <NEW_LINE> <INDENT> guid = "%s-%s" % (set_type, i) <NEW_LINE> if len(line) >= 2: <NEW_LINE> <INDENT> text_a = line[0] <NEW_LINE> text_b = None <NEW_LINE> label = line[1] <NEW_LINE> examples.append( InputExample(guid=guid, text_a=text_a, text_b=text_b, label=label)) <NEW_LINE> <DEDENT> <DEDENT> return examples
Processor for the argument classification data set (three labels: 0, 1, 2).
6259907b796e427e53850198
class RouteErrorRange(msrest.serialization.Model): <NEW_LINE> <INDENT> _attribute_map = { 'start': {'key': 'start', 'type': 'RouteErrorPosition'}, 'end': {'key': 'end', 'type': 'RouteErrorPosition'}, } <NEW_LINE> def __init__( self, *, start: Optional["RouteErrorPosition"] = None, end: Optional["RouteErrorPosition"] = None, **kwargs ): <NEW_LINE> <INDENT> super(RouteErrorRange, self).__init__(**kwargs) <NEW_LINE> self.start = start <NEW_LINE> self.end = end
Range of route errors. :ivar start: Start where the route error happened. :vartype start: ~azure.mgmt.iothub.v2018_04_01.models.RouteErrorPosition :ivar end: End where the route error happened. :vartype end: ~azure.mgmt.iothub.v2018_04_01.models.RouteErrorPosition
6259907b3617ad0b5ee07b6b
class Normalizer(): <NEW_LINE> <INDENT> def __init__(self, n_inputs): <NEW_LINE> <INDENT> self.n = np.zeros(n_inputs) <NEW_LINE> self.mean = np.zeros(n_inputs) <NEW_LINE> self.mean_diff = np.zeros(n_inputs) <NEW_LINE> self.var = np.zeros(n_inputs) <NEW_LINE> <DEDENT> def observe(self, x): <NEW_LINE> <INDENT> self.n += 1.0 <NEW_LINE> last_mean = self.mean.copy() <NEW_LINE> self.mean += (x - self.mean) / self.n <NEW_LINE> self.mean_diff += (x - last_mean) * (x - self.mean) <NEW_LINE> self.var = (self.mean_diff / self.n).clip(min = 1e-2) <NEW_LINE> <DEDENT> def normalize(self, inputs): <NEW_LINE> <INDENT> obs_mean = self.mean <NEW_LINE> obs_std = np.sqrt(self.var) <NEW_LINE> return (inputs - obs_mean) / obs_std
Normalize input values online, tracking a running mean and variance (Welford's algorithm). Args: n_inputs: Number of input values
6259907b3d592f4c4edbc86d
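The Normalizer row above implements Welford-style running statistics; since the row is stored flattened, here is a minimal self-contained sketch of the same update rule together with a small usage example (values chosen for illustration):

```python
import numpy as np

class Normalizer:
    """Running mean/variance tracker (Welford's online algorithm)."""
    def __init__(self, n_inputs):
        self.n = np.zeros(n_inputs)
        self.mean = np.zeros(n_inputs)
        self.mean_diff = np.zeros(n_inputs)
        self.var = np.zeros(n_inputs)

    def observe(self, x):
        self.n += 1.0
        last_mean = self.mean.copy()
        self.mean += (x - self.mean) / self.n
        self.mean_diff += (x - last_mean) * (x - self.mean)
        # Clip the variance away from zero so normalize() never divides by ~0.
        self.var = (self.mean_diff / self.n).clip(min=1e-2)

    def normalize(self, inputs):
        return (inputs - self.mean) / np.sqrt(self.var)

norm = Normalizer(1)
norm.observe(np.array([1.0]))
norm.observe(np.array([3.0]))
print(norm.mean, norm.var)              # mean 2.0, variance 1.0
print(norm.normalize(np.array([3.0])))  # one standard deviation above the mean
```

Note the variance here is the population variance (divides by n, not n-1), matching the entry above.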
class Event(Model): <NEW_LINE> <INDENT> id = fields.IntField(pk=True) <NEW_LINE> name = fields.CharField(max_length=255) <NEW_LINE> created_at = fields.DatetimeField(auto_now_add=True) <NEW_LINE> tournament: fields.ForeignKeyNullableRelation[Tournament] = fields.ForeignKeyField( "models.Tournament", related_name="events", null=True ) <NEW_LINE> class Meta: <NEW_LINE> <INDENT> ordering = ["name"]
The Event model docstring. This is multiline docs.
6259907b92d797404e38986b
class LMTextList(TextList): <NEW_LINE> <INDENT> _bunch = TextLMDataBunch <NEW_LINE> _is_lm = True <NEW_LINE> _label_cls = EmptyLabel
Special `TextList` for a language model.
6259907b5fdd1c0f98e5f99d
class TestFunctions(unittest.TestCase): <NEW_LINE> <INDENT> def testLookup_rpcsvc(self): <NEW_LINE> <INDENT> keys = common.ProgramNumbers.keys() <NEW_LINE> for key in keys: <NEW_LINE> <INDENT> entry = common.ProgramNumbers[ key ] <NEW_LINE> bunch = common.lookup_rpcsvc(key) <NEW_LINE> self.assertEqual( entry[0], bunch.server_receive_prgnum ) <NEW_LINE> self.assertEqual( entry[0], bunch.client_send_prgnum ) <NEW_LINE> self.assertEqual( entry[1], bunch.server_send_prgnum ) <NEW_LINE> self.assertEqual( entry[1], bunch.client_receive_prgnum ) <NEW_LINE> <DEDENT> <DEDENT> def testGet_rpcsvc_keys(self): <NEW_LINE> <INDENT> keysDirect = common.ProgramNumbers.keys() <NEW_LINE> keysFromFn = common.get_rpcsvc_keys() <NEW_LINE> self.assertEqual( keysDirect, keysFromFn ) <NEW_LINE> <DEDENT> def testTime2timestamp(self): <NEW_LINE> <INDENT> sectime = 36001.0 <NEW_LINE> timestamp = common.time2timestamp(sectime) <NEW_LINE> self.assertEqual( '19700101000001.000', timestamp ) <NEW_LINE> sectime = common.timestamp2time(timestamp) <NEW_LINE> self.assertAlmostEqual( 36001.0, sectime, 3 ) <NEW_LINE> timestampIn = '20060125161614.769' <NEW_LINE> sectime = common.timestamp2time(timestampIn) <NEW_LINE> self.assertAlmostEqual( 1138241774.769, sectime, 3 ) <NEW_LINE> timestampOut = common.time2timestamp(sectime) <NEW_LINE> self.assertEqual( timestampIn, timestampOut ) <NEW_LINE> <DEDENT> def testGetMyhost(self): <NEW_LINE> <INDENT> shortHostnamePattern = re.compile(r'^.*?\.') <NEW_LINE> try: <NEW_LINE> <INDENT> fi = os.popen( 'hostname --fqdn' ) <NEW_LINE> hostStr = fi.readline() <NEW_LINE> fi.close() <NEW_LINE> longHostname = hostStr[:-1] <NEW_LINE> mo = shortHostnamePattern.match( hostStr ) <NEW_LINE> if mo: <NEW_LINE> <INDENT> shortHostname = (mo.group(0))[:-1] <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> shortHostname = hostStr[:-1] <NEW_LINE> <DEDENT> <DEDENT> except KeyError: <NEW_LINE> <INDENT> longHostname = None <NEW_LINE> shortHostname = None <NEW_LINE> <DEDENT> 
self.assertEqual( longHostname, common.get_myhost() ) <NEW_LINE> self.assertEqual( longHostname, common.get_myhost(False) ) <NEW_LINE> self.assertEqual( shortHostname, common.get_myhost(True) ) <NEW_LINE> <DEDENT> def testGetRandomPort(self): <NEW_LINE> <INDENT> port = common.get_randomport() <NEW_LINE> self.assertTrue( port >= 20000 ) <NEW_LINE> self.assertTrue( port <= 30000 ) <NEW_LINE> port = common.get_randomport() <NEW_LINE> self.assertTrue( port >= 20000 ) <NEW_LINE> self.assertTrue( port <= 30000 ) <NEW_LINE> port = common.get_randomport(10,12) <NEW_LINE> self.assertTrue( port >= 10 ) <NEW_LINE> self.assertTrue( port <= 12 ) <NEW_LINE> port = common.get_randomport(10,12) <NEW_LINE> self.assertTrue( port >= 10 ) <NEW_LINE> self.assertTrue( port <= 12 ) <NEW_LINE> port = common.get_randomport(10,12) <NEW_LINE> self.assertTrue( port >= 10 ) <NEW_LINE> self.assertTrue( port <= 12 )
Tests for module-level helper functions.
6259907b7d43ff2487428124
class SymbianPlatform(Platform): <NEW_LINE> <INDENT> def __init__(self, config): <NEW_LINE> <INDENT> Platform.__init__(self, config, "symbian") <NEW_LINE> self.requireOrdinals = True <NEW_LINE> self.exportLinkage = "__declspec(dllexport)" <NEW_LINE> self.entryLinkage = "" <NEW_LINE> self.language = "c++" <NEW_LINE> <DEDENT> def createBuild(self, config, library, name, targetName): <NEW_LINE> <INDENT> return SymbianBuild(config, library, self, name, targetName)
Symbian C++ platform
6259907baad79263cf4301d8
class DevConfig(BaseConfig): <NEW_LINE> <INDENT> SQLALCHEMY_DATABASE_URI = 'sqlite:////home/vagrant/skynet/skynet.db'
Local config for running the app on your local machine.
6259907b1f5feb6acb164616
class TestNetworkGroupnetsApi(unittest.TestCase): <NEW_LINE> <INDENT> def setUp(self): <NEW_LINE> <INDENT> self.api = isi_sdk_9_1_0.api.network_groupnets_api.NetworkGroupnetsApi() <NEW_LINE> <DEDENT> def tearDown(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def test_create_groupnet_subnet(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def test_create_subnets_subnet_pool(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def test_delete_groupnet_subnet(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def test_delete_subnets_subnet_pool(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def test_get_groupnet_subnet(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def test_get_subnets_subnet_pool(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def test_list_groupnet_subnets(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def test_list_subnets_subnet_pools(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def test_update_groupnet_subnet(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def test_update_subnets_subnet_pool(self): <NEW_LINE> <INDENT> pass
NetworkGroupnetsApi unit test stubs
6259907b656771135c48ad3f
class Document(models.Model): <NEW_LINE> <INDENT> ticket = models.ForeignKey('tracker.Ticket') <NEW_LINE> filename = models.CharField(max_length=120, help_text='Document filename', validators=[ RegexValidator(r'^[-_\.A-Za-z0-9]+\.[A-Za-z0-9]+$', message=_(u'We need a sane file name, such as my-invoice123.jpg')), ]) <NEW_LINE> size = models.PositiveIntegerField() <NEW_LINE> content_type = models.CharField(max_length=64) <NEW_LINE> description = models.CharField(max_length=255, blank=True, help_text='Optional further description of the document') <NEW_LINE> payload = models.FileField(upload_to='tickets/%Y/', storage=FileSystemStorage(location=settings.TRACKER_DOCS_ROOT)) <NEW_LINE> def __unicode__(self): <NEW_LINE> <INDENT> return self.filename <NEW_LINE> <DEDENT> def inline_intro(self): <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> context = template.Context({'doc':self}) <NEW_LINE> return DOCUMENT_INTRO_TEMPLATE.render(context) <NEW_LINE> <DEDENT> except NoReverseMatch: <NEW_LINE> <INDENT> return self.filename <NEW_LINE> <DEDENT> <DEDENT> def html_item(self): <NEW_LINE> <INDENT> context = template.Context({'doc':self, 'detail':True}) <NEW_LINE> return DOCUMENT_INTRO_TEMPLATE.render(context) <NEW_LINE> <DEDENT> class Meta: <NEW_LINE> <INDENT> permissions = ( ("see_all_docs", "Can see all documents"), ("edit_all_docs", "Can edit all documents"), )
Document related to particular ticket, not publicly accessible.
6259907b5fdd1c0f98e5f99e
class CmdSshMaster(ClusterCompleter): <NEW_LINE> <INDENT> names = ['sshmaster', 'sm'] <NEW_LINE> def addopts(self, parser): <NEW_LINE> <INDENT> parser.add_option("-u", "--user", dest="user", action="store", type="string", default='root', help="login as USER (defaults to root)") <NEW_LINE> parser.add_option("-X", "--forward-x11", dest="forward_x11", action="store_true", default=False, help="enable X11 forwarding") <NEW_LINE> <DEDENT> def execute(self, args): <NEW_LINE> <INDENT> if not args: <NEW_LINE> <INDENT> self.parser.error("please specify a cluster") <NEW_LINE> <DEDENT> clname = args[0] <NEW_LINE> cmd = ' '.join(args[1:]) <NEW_LINE> retval = self.cm.ssh_to_master(clname, user=self.opts.user, command=cmd, forward_x11=self.opts.forward_x11) <NEW_LINE> if cmd and retval is not None: <NEW_LINE> <INDENT> sys.exit(retval)
sshmaster [options] <cluster> [<remote-command>] SSH to a cluster's master node Example: $ sshmaster mycluster You can also execute commands without directly logging in: $ starcluster sshmaster mycluster 'cat /etc/hosts'
6259907b97e22403b383c920
@admin.register(Basin) <NEW_LINE> class BasinAdmin(admin.ModelAdmin): <NEW_LINE> <INDENT> list_display = ('basin_id','location') <NEW_LINE> list_filter = ('basin_id','location') <NEW_LINE> search_fields = ['basin_id','location']
Admin configuration for the Basin model.
6259907b1b99ca4002290245
class VirtualMachineScaleSetUpdateNetworkConfiguration(SubResource): <NEW_LINE> <INDENT> _attribute_map = { 'id': {'key': 'id', 'type': 'str'}, 'name': {'key': 'name', 'type': 'str'}, 'primary': {'key': 'properties.primary', 'type': 'bool'}, 'enable_accelerated_networking': {'key': 'properties.enableAcceleratedNetworking', 'type': 'bool'}, 'network_security_group': {'key': 'properties.networkSecurityGroup', 'type': 'SubResource'}, 'dns_settings': {'key': 'properties.dnsSettings', 'type': 'VirtualMachineScaleSetNetworkConfigurationDnsSettings'}, 'ip_configurations': {'key': 'properties.ipConfigurations', 'type': '[VirtualMachineScaleSetUpdateIPConfiguration]'}, 'enable_ip_forwarding': {'key': 'properties.enableIPForwarding', 'type': 'bool'}, } <NEW_LINE> def __init__( self, *, id: Optional[str] = None, name: Optional[str] = None, primary: Optional[bool] = None, enable_accelerated_networking: Optional[bool] = None, network_security_group: Optional["SubResource"] = None, dns_settings: Optional["VirtualMachineScaleSetNetworkConfigurationDnsSettings"] = None, ip_configurations: Optional[List["VirtualMachineScaleSetUpdateIPConfiguration"]] = None, enable_ip_forwarding: Optional[bool] = None, **kwargs ): <NEW_LINE> <INDENT> super(VirtualMachineScaleSetUpdateNetworkConfiguration, self).__init__(id=id, **kwargs) <NEW_LINE> self.name = name <NEW_LINE> self.primary = primary <NEW_LINE> self.enable_accelerated_networking = enable_accelerated_networking <NEW_LINE> self.network_security_group = network_security_group <NEW_LINE> self.dns_settings = dns_settings <NEW_LINE> self.ip_configurations = ip_configurations <NEW_LINE> self.enable_ip_forwarding = enable_ip_forwarding
Describes a virtual machine scale set network profile's network configurations. :ivar id: Resource Id. :vartype id: str :ivar name: The network configuration name. :vartype name: str :ivar primary: Whether this is a primary NIC on a virtual machine. :vartype primary: bool :ivar enable_accelerated_networking: Specifies whether the network interface is accelerated networking-enabled. :vartype enable_accelerated_networking: bool :ivar network_security_group: The network security group. :vartype network_security_group: ~azure.mgmt.compute.v2017_12_01.models.SubResource :ivar dns_settings: The dns settings to be applied on the network interfaces. :vartype dns_settings: ~azure.mgmt.compute.v2017_12_01.models.VirtualMachineScaleSetNetworkConfigurationDnsSettings :ivar ip_configurations: The virtual machine scale set IP Configuration. :vartype ip_configurations: list[~azure.mgmt.compute.v2017_12_01.models.VirtualMachineScaleSetUpdateIPConfiguration] :ivar enable_ip_forwarding: Whether IP forwarding enabled on this NIC. :vartype enable_ip_forwarding: bool
6259907b5fc7496912d48f7a
class CreatePaymentRequest(models.Model): <NEW_LINE> <INDENT> APP_TYPE = ( ('CHILDMINDER', 'CHILDMINDER'), ('NANNY', 'NANNY'), ('PAY', 'PAY') ) <NEW_LINE> amount = models.DecimalField(max_digits=10, decimal_places=0, blank=False) <NEW_LINE> urn = models.CharField(max_length=9, blank=False) <NEW_LINE> service = models.CharField(choices=APP_TYPE, max_length=50, blank=False) <NEW_LINE> application_id = models.UUIDField(blank=True) <NEW_LINE> applicant_name = models.CharField(max_length=500, blank=True) <NEW_LINE> return_url = models.CharField(max_length=2000, blank=False) <NEW_LINE> description = models.CharField(max_length=255, blank=False) <NEW_LINE> invoice_number = models.CharField(max_length=14, blank=True)
Model used for serialization of create payment JSON objects
6259907b7047854f46340dd5
class Uniform(Distribution): <NEW_LINE> <INDENT> def __init__(self, lb : np.ndarray, ub : np.ndarray, seed=None): <NEW_LINE> <INDENT> self.lb, self.ub = self._check_parameters(lb, ub) <NEW_LINE> self.rng = np.random.RandomState(seed) <NEW_LINE> <DEDENT> def set_parameters(self, params): <NEW_LINE> <INDENT> lb = params[0] <NEW_LINE> ub = params[1] <NEW_LINE> self.lb, self.ub = self._check_parameters(lb, ub) <NEW_LINE> <DEDENT> def reseed(self, seed): <NEW_LINE> <INDENT> self.rng.seed(seed) <NEW_LINE> <DEDENT> def sample(self, k): <NEW_LINE> <INDENT> samples = np.zeros(shape=(k,len(self.lb))) <NEW_LINE> for j in range(0,len(self.lb)): <NEW_LINE> <INDENT> samples[:,j] = self.rng.uniform(self.lb[j], self.ub[j], k) <NEW_LINE> <DEDENT> return samples <NEW_LINE> <DEDENT> def pdf(self,x): <NEW_LINE> <INDENT> if np.prod(np.greater_equal(x,self.lb)*np.less_equal(x,self.ub)): <NEW_LINE> <INDENT> temp = self.ub-self.lb <NEW_LINE> pdf_value = 1/np.prod(temp[np.nonzero(temp)]) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> pdf_value = 0 <NEW_LINE> <DEDENT> return pdf_value <NEW_LINE> <DEDENT> def _check_parameters(self, lb, ub): <NEW_LINE> <INDENT> new_lb = new_ub = None <NEW_LINE> if isinstance(lb, (list,np.ndarray)): <NEW_LINE> <INDENT> new_lb = np.array(lb) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> raise TypeError('The lower bound is not of allowed types') <NEW_LINE> <DEDENT> if isinstance(ub, (list,np.ndarray)): <NEW_LINE> <INDENT> new_ub = np.array(ub) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> raise TypeError('The upper bound is not of allowed types') <NEW_LINE> <DEDENT> if new_lb.shape != new_ub.shape: <NEW_LINE> <INDENT> raise ValueError('Dimension of lower bound and upper bound is not same.') <NEW_LINE> <DEDENT> return (new_lb, new_ub)
This class implements a p-dimensional uniform prior distribution on a closed interval.
6259907b8a349b6b43687c7b
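A minimal sketch of how a box-uniform prior like the one above behaves. This is an independent NumPy reimplementation for self-containment; the class name `BoxUniform` is an assumption, not the entry's API:

```python
import numpy as np

class BoxUniform:
    """Uniform distribution over the axis-aligned box [lb, ub]."""
    def __init__(self, lb, ub, seed=None):
        self.lb = np.asarray(lb, dtype=float)
        self.ub = np.asarray(ub, dtype=float)
        if self.lb.shape != self.ub.shape:
            raise ValueError("lower and upper bounds must have the same shape")
        self.rng = np.random.RandomState(seed)

    def sample(self, k):
        # Draw k points, one independent uniform draw per dimension.
        return self.rng.uniform(self.lb, self.ub, size=(k, len(self.lb)))

    def pdf(self, x):
        x = np.asarray(x, dtype=float)
        inside = np.all(x >= self.lb) and np.all(x <= self.ub)
        volume = np.prod(self.ub - self.lb)
        return 1.0 / volume if inside else 0.0

prior = BoxUniform([0.0, 0.0], [2.0, 4.0], seed=0)
print(prior.pdf([1.0, 1.0]))   # 1 / (2 * 4) = 0.125
print(prior.pdf([3.0, 1.0]))   # outside the box: 0.0
print(prior.sample(5).shape)   # (5, 2)
```

The density is constant inside the box (one over its volume) and zero outside, which is what the `pdf` branch in the entry computes via `np.greater_equal`/`np.less_equal`.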
class Act(NarrativeElement): <NEW_LINE> <INDENT> noun = u'act' <NEW_LINE> def _load(self): <NEW_LINE> <INDENT> super(Act, self)._load() <NEW_LINE> self.description = self.data['description'] <NEW_LINE> self.banner = self.data.get("banner", None) <NEW_LINE> self.banner_background = self.data.get("banner_background", None) <NEW_LINE> self.banner_class = self.data.get("banner_class", None) <NEW_LINE> self.banner_colour = self.data.get("banner_colour", None) <NEW_LINE> self.orbital = self.data.get("orbital", None) <NEW_LINE> self.illustration = self.data.get("illustration", None) <NEW_LINE> self.homepage = self.data.get("homepage", None) <NEW_LINE> stats_data = self.redis_conn.hgetall(u"%s:%s:stats" % (self.noun, self.id)) <NEW_LINE> if stats_data: <NEW_LINE> <INDENT> self.has_stats = True <NEW_LINE> self.stats_image_map = stats_data['image_map'] <NEW_LINE> self.stats_image_map_id = stats_data['image_map_id'] <NEW_LINE> self.stats_image = "graph_%s_%d.png" % (self.mission_name, self.number) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> self.has_stats = False <NEW_LINE> <DEDENT> <DEDENT> def key_scenes(self): <NEW_LINE> <INDENT> return list( KeyScene.Query(self.redis_conn, self.mission_name).act_number(self.number).items() ) <NEW_LINE> <DEDENT> def banner_styles(self): <NEW_LINE> <INDENT> styles = [] <NEW_LINE> if self.banner_colour: <NEW_LINE> <INDENT> styles.append('background-color: %s' % self.banner_colour) <NEW_LINE> <DEDENT> if self.banner_background: <NEW_LINE> <INDENT> from apps.transcripts.templatetags.missionstatic import mission_static <NEW_LINE> url = mission_static( self.mission_name, 'images/banners', self.banner_background, ) <NEW_LINE> styles.append('background-image: url(%s)' % url) <NEW_LINE> <DEDENT> return '; '.join(styles) <NEW_LINE> <DEDENT> class Query(NarrativeElement.Query): <NEW_LINE> <INDENT> all_key_pattern = u"acts:%(mission_name)s" <NEW_LINE> def _key_to_instance(self, key): <NEW_LINE> <INDENT> mission_name, number = key.split(u":", 1) 
<NEW_LINE> return Act(self.redis_conn, mission_name, int(number))
Represents an Act in the mission.
6259907b5fcc89381b266e6b
class _PowerAction(BaseAction): <NEW_LINE> <INDENT> def Validate(self): <NEW_LINE> <INDENT> self._TypeValidator(self._args, list) <NEW_LINE> if len(self._args) not in [1, 2, 3]: <NEW_LINE> <INDENT> raise ValidationError('Invalid args length: %s' % self._args) <NEW_LINE> <DEDENT> if not isinstance(self._args[0], str) and not isinstance(self._args[0], int): <NEW_LINE> <INDENT> raise ValidationError('Invalid argument type: %s' % self._args[0]) <NEW_LINE> <DEDENT> if len(self._args) > 1 and not isinstance(self._args[1], str): <NEW_LINE> <INDENT> raise ValidationError('Invalid argument type: %s' % self._args[1]) <NEW_LINE> <DEDENT> if len(self._args) > 2 and not isinstance(self._args[2], bool): <NEW_LINE> <INDENT> raise ValidationError('Invalid argument type: %s' % self._args[2])
Validation for Power actions.
6259907bd268445f2663a86e
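The validation pattern in the power action above (a typed, length-bounded args list) can be sketched on its own. The function name and the meaning of the three positions are assumptions restated for illustration; the checks mirror the entry:

```python
class ValidationError(Exception):
    """Raised when an action's argument list is malformed."""

def validate_power_args(args):
    # Assumed layout: [timeout: str|int, reason: str (optional), force: bool (optional)]
    if not isinstance(args, list):
        raise ValidationError('args must be a list: %r' % (args,))
    if len(args) not in (1, 2, 3):
        raise ValidationError('Invalid args length: %s' % args)
    if not isinstance(args[0], (str, int)):
        raise ValidationError('Invalid argument type: %s' % args[0])
    if len(args) > 1 and not isinstance(args[1], str):
        raise ValidationError('Invalid argument type: %s' % args[1])
    if len(args) > 2 and not isinstance(args[2], bool):
        raise ValidationError('Invalid argument type: %s' % args[2])

validate_power_args([30, 'maintenance reboot', True])  # passes silently
try:
    validate_power_args([30, 'reboot', 'yes'])  # third arg must be a bool
except ValidationError as e:
    print('rejected:', e)
```

One subtlety carried over from the entry: since `bool` is a subclass of `int` in Python, a boolean first argument also passes the `(str, int)` check.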
class postal_record(record): <NEW_LINE> <INDENT> def __init__(self, global_context_dict, pool, pline): <NEW_LINE> <INDENT> super(postal_record, self).__init__(global_context_dict, pool, pline) <NEW_LINE> self.is_9_pos_adherent = False <NEW_LINE> <DEDENT> def validate_global_context_dict(self): <NEW_LINE> <INDENT> if _is_9_pos_bvr_adherent(self.global_values['partner_bvr']): <NEW_LINE> <INDENT> parts = self.global_values['partner_bvr'].split('-') <NEW_LINE> parts[1] = parts[1].rjust(6, '0') <NEW_LINE> self.global_values['partner_bvr'] = ''.join(parts) <NEW_LINE> self.is_9_pos_adherent = True <NEW_LINE> <DEDENT> elif len(self.global_values['partner_bvr']) == 5: <NEW_LINE> <INDENT> val = self.global_values['partner_bvr'].rjust(9, '0') <NEW_LINE> self.global_values['partner_bvr'] = val <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> raise orm.except_orm( _('Error'), _('Wrong postal number format.\n' 'It must be 12-123456-9 or 12345 format') )
Base class providing behaviour common to all postal account types.
6259907b009cb60464d02f5f
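The postal (BVR) account normalisation above pads the middle segment of a 9-position adherent number to six digits and drops the separators, or left-pads a bare 5-digit number to nine. A standalone sketch: the helper name is an assumption, and the dash-based adherent check stands in for the entry's `_is_9_pos_bvr_adherent`:

```python
def normalize_bvr(bvr):
    """Return (normalized_number, is_9_pos_adherent) for a Swiss BVR account."""
    if '-' in bvr:
        parts = bvr.split('-')
        if len(parts) != 3:
            raise ValueError('Wrong postal number format.\n'
                             'It must be 12-123456-9 or 12345 format')
        # Middle segment is zero-padded to 6 digits, separators dropped.
        parts[1] = parts[1].rjust(6, '0')
        return ''.join(parts), True
    if len(bvr) == 5:
        return bvr.rjust(9, '0'), False
    raise ValueError('Wrong postal number format.\n'
                     'It must be 12-123456-9 or 12345 format')

print(normalize_bvr('01-1234-1'))  # ('010012341', True)
print(normalize_bvr('12345'))      # ('000012345', False)
```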
class TagHelperNode(Node): <NEW_LINE> <INDENT> def __init__(self, takes_context, args, kwargs): <NEW_LINE> <INDENT> self.takes_context = takes_context <NEW_LINE> self.args = args <NEW_LINE> self.kwargs = kwargs <NEW_LINE> <DEDENT> def get_resolved_arguments(self, context): <NEW_LINE> <INDENT> resolved_args = [var.resolve(context) for var in self.args] <NEW_LINE> if self.takes_context: <NEW_LINE> <INDENT> resolved_args = [context] + resolved_args <NEW_LINE> <DEDENT> resolved_kwargs = {k: v.resolve(context) for k, v in self.kwargs.items()} <NEW_LINE> return resolved_args, resolved_kwargs
Base class for tag helper nodes such as SimpleNode and InclusionNode. Manages the positional and keyword arguments to be passed to the decorated function.
6259907b7d43ff2487428125
class RunPylint(object): <NEW_LINE> <INDENT> def test_pylint(self): <NEW_LINE> <INDENT> files_list = [] <NEW_LINE> for root, dirnames, filenames in os.walk(PROJECT_DIR): <NEW_LINE> <INDENT> if not should_check_directory(root): <NEW_LINE> <INDENT> continue <NEW_LINE> <DEDENT> for filename in fnmatch.filter(filenames, '*.py'): <NEW_LINE> <INDENT> files_list.append(os.path.join(root, filename)) <NEW_LINE> <DEDENT> <DEDENT> for file in files_list: <NEW_LINE> <INDENT> call(['pylint', '--errors-only', file])
Run pylint on all Python files.
6259907b3539df3088ecdcb8
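The file-collection loop in the pylint runner above can be sketched on its own. The directory filter is simplified here to "skip hidden directories", which is an assumption standing in for the entry's `should_check_directory`:

```python
import fnmatch
import os

def collect_python_files(root_dir):
    """Walk root_dir and return paths of all *.py files, skipping hidden dirs."""
    files_list = []
    for root, dirnames, filenames in os.walk(root_dir):
        # Prune hidden directories in place so os.walk does not descend into them.
        dirnames[:] = [d for d in dirnames if not d.startswith('.')]
        for filename in fnmatch.filter(filenames, '*.py'):
            files_list.append(os.path.join(root, filename))
    return files_list

if __name__ == '__main__':
    import tempfile
    with tempfile.TemporaryDirectory() as tmp:
        open(os.path.join(tmp, 'a.py'), 'w').close()
        open(os.path.join(tmp, 'notes.txt'), 'w').close()
        print(collect_python_files(tmp))  # only a.py is listed
```

Each collected path would then be passed to `subprocess.call(['pylint', '--errors-only', path])` as in the entry.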
class PredictorDataSetManager(object): <NEW_LINE> <INDENT> def __init__(self, predictor_config_manager=None): <NEW_LINE> <INDENT> self.predictor_config_manager = predictor_config_manager if predictor_config_manager is not None else PredictorConfigManager() <NEW_LINE> <DEDENT> @property <NEW_LINE> def datasets(self): <NEW_LINE> <INDENT> if not hasattr(self, '_datasets'): <NEW_LINE> <INDENT> self._datasets = defaultdict(lambda: defaultdict(dict)) <NEW_LINE> <DEDENT> return self._datasets <NEW_LINE> <DEDENT> def dataset(self, data_name, trait_name, categorical_trait=False, **kwargs): <NEW_LINE> <INDENT> kwargs['categorical_trait'] = categorical_trait <NEW_LINE> dataset = self.new_dataset(data_name, trait_name, **kwargs) <NEW_LINE> if data_name in self.datasets: <NEW_LINE> <INDENT> if trait_name in self.datasets[data_name]: <NEW_LINE> <INDENT> if categorical_trait in self.datasets[data_name][ trait_name] and self.datasets[data_name][trait_name][ categorical_trait] != dataset: <NEW_LINE> <INDENT> sys.stderr.write( "WARNING: over-writing dataset named: {}".format( (data_name, trait_name, categorical_trait))) <NEW_LINE> self.datasets[data_name][trait_name][ categorical_trait] = dataset <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> self.datasets[data_name][trait_name][ categorical_trait] = dataset <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> self.datasets[data_name][trait_name][ categorical_trait] = dataset <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> self.datasets[data_name][trait_name][categorical_trait] = dataset <NEW_LINE> <DEDENT> return dataset <NEW_LINE> <DEDENT> @memoize <NEW_LINE> def new_dataset(self, data_name, trait_name, categorical_trait=False, data=None, trait=None, predictor_config_manager=None): <NEW_LINE> <INDENT> if data is None: <NEW_LINE> <INDENT> args = np.array([data, trait, predictor_config_manager]) <NEW_LINE> if np.any([i is not None for i in args]): <NEW_LINE> <INDENT> raise Exception <NEW_LINE> <DEDENT> try: <NEW_LINE> 
<INDENT> return self.datasets[data_name][trait_name][categorical_trait] <NEW_LINE> <DEDENT> except KeyError: <NEW_LINE> <INDENT> raise KeyError("No such dataset: {}".format( (data_name, trait_name, categorical_trait))) <NEW_LINE> <DEDENT> <DEDENT> if trait is None: <NEW_LINE> <INDENT> raise Exception <NEW_LINE> <DEDENT> if trait_name != trait.name: <NEW_LINE> <INDENT> raise ValueError <NEW_LINE> <DEDENT> if data_name is None: <NEW_LINE> <INDENT> data_name = "MyData" <NEW_LINE> <DEDENT> predictor_config_manager = predictor_config_manager if predictor_config_manager is not None else self.predictor_config_manager <NEW_LINE> return PredictorDataSet( data, trait, data_name, categorical_trait=categorical_trait, predictor_config_manager=predictor_config_manager)
A collection of PredictorDataSet instances. Parameters ---------- predictor_config_manager : PredictorConfigManager, optional (default None) A predictor configuration manager. If None, instantiate a new one. Attributes ---------- datasets : dict Dict of dicts of {data: {trait: {categorical: dataset}}}. For convenient retrieval of predictors
6259907bf548e778e596cfb2
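The triple-keyed cache in the manager above ({data: {trait: {categorical: dataset}}}) is a nested `defaultdict`; a minimal sketch of that caching pattern, with the class and method names invented for illustration:

```python
from collections import defaultdict

class DatasetCache:
    """Cache keyed by (data_name, trait_name, categorical_trait)."""
    def __init__(self):
        # Missing keys materialize empty inner dicts on first access.
        self._datasets = defaultdict(lambda: defaultdict(dict))

    def put(self, data_name, trait_name, categorical_trait, dataset):
        slot = self._datasets[data_name][trait_name]
        if categorical_trait in slot and slot[categorical_trait] != dataset:
            print("WARNING: over-writing dataset named:",
                  (data_name, trait_name, categorical_trait))
        slot[categorical_trait] = dataset

    def get(self, data_name, trait_name, categorical_trait):
        try:
            return self._datasets[data_name][trait_name][categorical_trait]
        except KeyError:
            raise KeyError("No such dataset: {}".format(
                (data_name, trait_name, categorical_trait)))

cache = DatasetCache()
cache.put('expression', 'outcome', True, {'rows': 10})
print(cache.get('expression', 'outcome', True))  # {'rows': 10}
```

As in the entry, only the innermost level is a plain dict, so a lookup with an unknown `categorical_trait` raises `KeyError` instead of silently creating an entry.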
class ENVI_SLayer_Socket(NodeSocket): <NEW_LINE> <INDENT> bl_idname = 'envi_sl_sock' <NEW_LINE> bl_label = 'Shade layer socket' <NEW_LINE> valid = ['GLayer', 'Tlayer'] <NEW_LINE> def draw(self, context, layout, node, text): <NEW_LINE> <INDENT> layout.label(text) <NEW_LINE> <DEDENT> def draw_color(self, context, node): <NEW_LINE> <INDENT> return (0, 0, 0, 1.0) <NEW_LINE> <DEDENT> def ret_valid(self, node): <NEW_LINE> <INDENT> return ['GLayer', 'Tlayer']
EnVi shade layer socket
6259907b1f5feb6acb164618
class ArtifactRepository: <NEW_LINE> <INDENT> __metaclass__ = ABCMeta <NEW_LINE> def __init__(self, artifact_uri): <NEW_LINE> <INDENT> self.artifact_uri = artifact_uri <NEW_LINE> <DEDENT> @abstractmethod <NEW_LINE> def log_artifact(self, local_file, artifact_path=None): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> @abstractmethod <NEW_LINE> def log_artifacts(self, local_dir, artifact_path=None): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> @abstractmethod <NEW_LINE> def list_artifacts(self, path): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def _is_directory(self, artifact_path): <NEW_LINE> <INDENT> listing = self.list_artifacts(artifact_path) <NEW_LINE> return len(listing) > 0 <NEW_LINE> <DEDENT> def download_artifacts(self, artifact_path, dst_path=None): <NEW_LINE> <INDENT> def download_file(fullpath): <NEW_LINE> <INDENT> fullpath = fullpath.rstrip('/') <NEW_LINE> dirpath, _ = posixpath.split(fullpath) <NEW_LINE> local_dir_path = os.path.join(dst_path, dirpath) <NEW_LINE> local_file_path = os.path.join(dst_path, fullpath) <NEW_LINE> if not os.path.exists(local_dir_path): <NEW_LINE> <INDENT> os.makedirs(local_dir_path) <NEW_LINE> <DEDENT> self._download_file(remote_file_path=fullpath, local_path=local_file_path) <NEW_LINE> return local_file_path <NEW_LINE> <DEDENT> def download_artifact_dir(dir_path): <NEW_LINE> <INDENT> local_dir = os.path.join(dst_path, dir_path) <NEW_LINE> dir_content = [ file_info for file_info in self.list_artifacts(dir_path) if file_info.path != "." 
and file_info.path != dir_path] <NEW_LINE> if not dir_content: <NEW_LINE> <INDENT> if not os.path.exists(local_dir): <NEW_LINE> <INDENT> os.makedirs(local_dir) <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> for file_info in dir_content: <NEW_LINE> <INDENT> if file_info.is_dir: <NEW_LINE> <INDENT> download_artifact_dir(dir_path=file_info.path) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> download_file(file_info.path) <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> return local_dir <NEW_LINE> <DEDENT> if dst_path is None: <NEW_LINE> <INDENT> dst_path = tempfile.mkdtemp() <NEW_LINE> <DEDENT> dst_path = os.path.abspath(dst_path) <NEW_LINE> if not os.path.exists(dst_path): <NEW_LINE> <INDENT> raise MlflowException( message=( "The destination path for downloaded artifacts does not" " exist! Destination path: {dst_path}".format(dst_path=dst_path)), error_code=RESOURCE_DOES_NOT_EXIST) <NEW_LINE> <DEDENT> elif not os.path.isdir(dst_path): <NEW_LINE> <INDENT> raise MlflowException( message=( "The destination path for downloaded artifacts must be a directory!" " Destination path: {dst_path}".format(dst_path=dst_path)), error_code=INVALID_PARAMETER_VALUE) <NEW_LINE> <DEDENT> if self._is_directory(artifact_path): <NEW_LINE> <INDENT> return download_artifact_dir(artifact_path) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> return download_file(artifact_path) <NEW_LINE> <DEDENT> <DEDENT> @abstractmethod <NEW_LINE> def _download_file(self, remote_file_path, local_path): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> @experimental <NEW_LINE> def delete_artifacts(self, artifact_path=None): <NEW_LINE> <INDENT> pass
Abstract artifact repo that defines how to upload (log) and download potentially large artifacts from different storage backends.
6259907be1aae11d1e7cf521
class TestInventoryCreateByApplyDelta(TestInventory): <NEW_LINE> <INDENT> def test_add(self): <NEW_LINE> <INDENT> inv = self.make_init_inventory() <NEW_LINE> inv = inv.create_by_apply_delta([ (None, "a", "a-id", self.make_file('a-id', 'a', 'tree-root')), ], 'new-test-rev') <NEW_LINE> self.assertEqual('a', inv.id2path('a-id')) <NEW_LINE> <DEDENT> def test_delete(self): <NEW_LINE> <INDENT> inv = self.make_init_inventory() <NEW_LINE> inv = inv.create_by_apply_delta([ (None, "a", "a-id", self.make_file('a-id', 'a', 'tree-root')), ], 'new-rev-1') <NEW_LINE> self.assertEqual('a', inv.id2path('a-id')) <NEW_LINE> inv = inv.create_by_apply_delta([ ("a", None, "a-id", None), ], 'new-rev-2') <NEW_LINE> self.assertRaises(errors.NoSuchId, inv.id2path, 'a-id') <NEW_LINE> <DEDENT> def test_rename(self): <NEW_LINE> <INDENT> inv = self.make_init_inventory() <NEW_LINE> inv = inv.create_by_apply_delta([ (None, "a", "a-id", self.make_file('a-id', 'a', 'tree-root')), ], 'new-rev-1') <NEW_LINE> self.assertEqual('a', inv.id2path('a-id')) <NEW_LINE> a_ie = inv['a-id'] <NEW_LINE> b_ie = self.make_file(a_ie.file_id, "b", a_ie.parent_id) <NEW_LINE> inv = inv.create_by_apply_delta([("a", "b", "a-id", b_ie)], 'new-rev-2') <NEW_LINE> self.assertEqual("b", inv.id2path('a-id')) <NEW_LINE> <DEDENT> def test_illegal(self): <NEW_LINE> <INDENT> inv = self.make_init_inventory() <NEW_LINE> self.assertRaises(errors.InconsistentDelta, inv.create_by_apply_delta, [ (None, "a", "id-1", self.make_file('id-1', 'a', 'tree-root')), (None, "b", "id-1", self.make_file('id-1', 'b', 'tree-root')), ], 'new-rev-1')
A subset of the inventory delta application tests. See test_inv which has comprehensive delta application tests for inventories, dirstate, and repository based inventories.
6259907b4f88993c371f1232
class AlloyCooking_Profiling: <NEW_LINE> <INDENT> def __init__(self, bounds=None,sd=None): <NEW_LINE> <INDENT> self.input_dim = 4 <NEW_LINE> if bounds == None: <NEW_LINE> <INDENT> self.bounds = OrderedDict([('Time1',(2*3600,4*3600)),('Time2',(2*3600,4*3600)),('Temp1',(175,225)),('Temp2',(225,275))]) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> self.bounds = bounds <NEW_LINE> <DEDENT> self.min = [(0.)*self.input_dim] <NEW_LINE> self.fmin = 150 <NEW_LINE> self.ismax=1 <NEW_LINE> self.name='AlloyCooking_Profiling' <NEW_LINE> <DEDENT> def get_data(self,mystr): <NEW_LINE> <INDENT> data = load_svmlight_file(mystr) <NEW_LINE> return data[0], data[1] <NEW_LINE> <DEDENT> def run_Profiling(self,X): <NEW_LINE> <INDENT> print (X) <NEW_LINE> x1=X[0] <NEW_LINE> x2=X[1] <NEW_LINE> x3=X[2] <NEW_LINE> x4=X[3] <NEW_LINE> if x3<0.000001: <NEW_LINE> <INDENT> x3=0.000001 <NEW_LINE> <DEDENT> if x2<0.000001: <NEW_LINE> <INDENT> x2=0.000001 <NEW_LINE> <DEDENT> myEm=0.45; <NEW_LINE> myxmatrix=0.0006; <NEW_LINE> myiSurfen=0.1097149825; <NEW_LINE> myfSurfen=0.1656804095; <NEW_LINE> myRadsurfenchange=0.0000000041; <NEW_LINE> import matlab.engine <NEW_LINE> import matlab <NEW_LINE> eng = matlab.engine.start_matlab() <NEW_LINE> eng.addpath(r'F:\Dropbox\02.Sharing\Vu_Sunil_Santu\Alloy_Paul\parameters_fitting',nargout=0) <NEW_LINE> myCookTemp=matlab.double([x3,x4]) <NEW_LINE> myCookTime=matlab.double([x1,x2]) <NEW_LINE> strength=eng.PrepNuclGrowthModel_MultipleStages(myxmatrix,myCookTemp,myCookTime,myEm,myiSurfen,myfSurfen,myRadsurfenchange) <NEW_LINE> temp=np.asarray(strength) <NEW_LINE> return temp[0][1] <NEW_LINE> <DEDENT> def func(self,X): <NEW_LINE> <INDENT> X=np.asarray(X) <NEW_LINE> if len(X.shape)==1: <NEW_LINE> <INDENT> Strength=self.run_Profiling(X) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> Strength=np.apply_along_axis( self.run_Profiling,1,X) <NEW_LINE> <DEDENT> return Strength*self.ismax
AlloyCooking_Profiling function. :param sd: standard deviation, to generate noisy evaluations of the function.
6259907b32920d7e50bc7a63
class UserForm(messages.Message): <NEW_LINE> <INDENT> user_name = messages.StringField(1, required=True) <NEW_LINE> email = messages.StringField(2, required=True)
UserForm for username and email information
6259907c7cff6e4e811b7461
class Type(IntEnum): <NEW_LINE> <INDENT> PLAYLIST = 0 <NEW_LINE> SONG = 1
Enum for Playlists Notifications
6259907c97e22403b383c922
class Separator(ChildMixin, Widget): <NEW_LINE> <INDENT> _widget_name = 'ttk::separator' <NEW_LINE> tk_class_name = 'TSeparator' <NEW_LINE> def _repr_parts(self): <NEW_LINE> <INDENT> return ['orient=' + repr(self.config['orient'])]
A horizontal or vertical line, depending on an ``orient`` option. Create a horizontal separator like this... :: separator = teek.Separator(some_widget, orient='horizontal') separator.pack(fill='x') # default is side='top' ...and create a vertical separator like this:: separator = teek.Separator(some_widget, orient='vertical') separator.pack(fill='y', side='left') # can also use side='right' See :source:`examples/separator.py` for more example code. Manual page: :man:`ttk_separator(3tk)`
6259907c4a966d76dd5f0907
class UserWidget(tw2.core.Widget): <NEW_LINE> <INDENT> resources = [photo_css, thumbnail_js] <NEW_LINE> template = 'fedoratagger.widgets.templates.user' <NEW_LINE> @property <NEW_LINE> def gravatar_tag(self): <NEW_LINE> <INDENT> return m.get_user().gravatar_md <NEW_LINE> <DEDENT> @property <NEW_LINE> def formatted_name(self): <NEW_LINE> <INDENT> return tg.request.identity.get( 'ircnick', self.username ) <NEW_LINE> <DEDENT> @property <NEW_LINE> def logged_in(self): <NEW_LINE> <INDENT> return self.username != 'anonymous' <NEW_LINE> <DEDENT> @property <NEW_LINE> def username(self): <NEW_LINE> <INDENT> return m.get_user().username <NEW_LINE> <DEDENT> @property <NEW_LINE> def total_votes(self): <NEW_LINE> <INDENT> user = m.get_user(self.username) <NEW_LINE> return user.total_votes <NEW_LINE> <DEDENT> @property <NEW_LINE> def rank(self): <NEW_LINE> <INDENT> user = m.get_user(self.username) <NEW_LINE> return user.rank <NEW_LINE> <DEDENT> @property <NEW_LINE> def notifications_on(self): <NEW_LINE> <INDENT> user = m.get_user(self.username) <NEW_LINE> return user.notifications_on and "checked='checked'" or "" <NEW_LINE> <DEDENT> @property <NEW_LINE> def _notifications_on(self): <NEW_LINE> <INDENT> user = m.get_user(self.username) <NEW_LINE> return user.notifications_on and "true" or "false"
Gravatar widget
6259907cf9cc0f698b1c5fdd
class KNearestNeighbor(object): <NEW_LINE> <INDENT> def __init__(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def train(self, X, y): <NEW_LINE> <INDENT> self.X_train = X <NEW_LINE> self.y_train = y <NEW_LINE> <DEDENT> def predict(self, X, k=1, num_loops=0): <NEW_LINE> <INDENT> if num_loops == 0: <NEW_LINE> <INDENT> dists = self.compute_distances_no_loops(X) <NEW_LINE> <DEDENT> elif num_loops == 1: <NEW_LINE> <INDENT> dists = self.compute_distances_one_loop(X) <NEW_LINE> <DEDENT> elif num_loops == 2: <NEW_LINE> <INDENT> dists = self.compute_distances_two_loops(X) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> raise ValueError('Invalid value %d for num_loops' % num_loops) <NEW_LINE> <DEDENT> return self.predict_labels(dists, k=k) <NEW_LINE> <DEDENT> def compute_distances_two_loops(self, X): <NEW_LINE> <INDENT> num_test = X.shape[0] <NEW_LINE> num_train = self.X_train.shape[0] <NEW_LINE> dists = np.zeros((num_test, num_train)) <NEW_LINE> for i in xrange(num_test): <NEW_LINE> <INDENT> for j in xrange(num_train): <NEW_LINE> <INDENT> dists[i,j] = np.linalg.norm(X[i]-self.X_train[j]) <NEW_LINE> pass <NEW_LINE> <DEDENT> <DEDENT> return dists <NEW_LINE> <DEDENT> def compute_distances_one_loop(self, X): <NEW_LINE> <INDENT> num_test = X.shape[0] <NEW_LINE> num_train = self.X_train.shape[0] <NEW_LINE> dists = np.zeros((num_test, num_train)) <NEW_LINE> for i in xrange(num_test): <NEW_LINE> <INDENT> dists[i] = np.sqrt(np.sum((X[i] - self.X_train)**2, axis = 1)) <NEW_LINE> <DEDENT> return dists <NEW_LINE> <DEDENT> def compute_distances_no_loops(self, X): <NEW_LINE> <INDENT> num_test = X.shape[0] <NEW_LINE> num_train = self.X_train.shape[0] <NEW_LINE> dists = np.zeros((num_test, num_train)) <NEW_LINE> XX = np.sum(np.square(X), axis = 1, keepdims = True) <NEW_LINE> YY = np.sum(np.square(self.X_train), axis = 1) <NEW_LINE> XY = np.multiply(np.dot(X, self.X_train.T),-2) <NEW_LINE> XX = np.add(XX, YY) <NEW_LINE> XX = np.add(XX, XY) <NEW_LINE> dists = np.sqrt(XX) <NEW_LINE> 
return dists <NEW_LINE> <DEDENT> def predict_labels(self, dists, k=1): <NEW_LINE> <INDENT> num_test = dists.shape[0] <NEW_LINE> y_pred = np.zeros(num_test) <NEW_LINE> for i in xrange(num_test): <NEW_LINE> <INDENT> closest_y = [] <NEW_LINE> closest_y = self.y_train[np.argsort(dists[i])[:k]] <NEW_LINE> y_pred[i] = Counter(closest_y).most_common(1)[0][0] <NEW_LINE> pass <NEW_LINE> <DEDENT> return y_pred
A kNN classifier with L2 distance.
6259907c3d592f4c4edbc86f
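The `compute_distances_no_loops` method in the kNN record above relies on the algebraic expansion ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y to compute all pairwise L2 distances with pure array arithmetic. A minimal self-contained sketch of that trick, cross-checked against the naive double loop (the array shapes here are illustrative, not from the original):

```python
import numpy as np

# Vectorized pairwise L2 distances via ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y
X = np.arange(6, dtype=float).reshape(3, 2)   # 3 "test" points
Y = np.arange(8, dtype=float).reshape(4, 2)   # 4 "train" points

XX = np.sum(X ** 2, axis=1, keepdims=True)    # shape (3, 1), broadcasts over columns
YY = np.sum(Y ** 2, axis=1)                   # shape (4,), broadcasts over rows
dists = np.sqrt(XX + YY - 2 * (X @ Y.T))      # shape (3, 4)

# Cross-check against the explicit two-loop version
naive = np.array([[np.linalg.norm(x - y) for y in Y] for x in X])
assert np.allclose(dists, naive)
```

The broadcasting of the `(3, 1)` and `(4,)` norm arrays is what replaces both loops.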
class PreferredSessionSyncPort(A10BaseClass): <NEW_LINE> <INDENT> def __init__(self, **kwargs): <NEW_LINE> <INDENT> self.ERROR_MSG = "" <NEW_LINE> self.required=[] <NEW_LINE> self.b_key = "preferred-session-sync-port" <NEW_LINE> self.a10_url="/axapi/v3/vrrp-a/preferred-session-sync-port" <NEW_LINE> self.DeviceProxy = "" <NEW_LINE> self.trunk_list = [] <NEW_LINE> self.ethernet_list = [] <NEW_LINE> for keys, value in kwargs.items(): <NEW_LINE> <INDENT> setattr(self,keys, value)
Class Description:: VRRP-A preferred-session-sync-port. Class preferred-session-sync-port supports CRUD Operations and inherits from `common/A10BaseClass`. This class is the `"PARENT"` class for this module.` :param trunk_list: {"minItems": 1, "items": {"type": "trunk"}, "uniqueItems": true, "array": [{"required": ["pre-trunk"], "properties": {"pre-vlan": {"description": "Interface VLAN (VLAN ID)", "format": "number", "type": "number", "maximum": 4094, "minimum": 1, "optional": true}, "pre-trunk": {"optional": false, "type": "number", "description": "Trunk Interface number", "format": "interface"}}}], "type": "array", "$ref": "/axapi/v3/vrrp-a/preferred-session-sync-port/trunk/{pre-trunk}"} :param ethernet_list: {"minItems": 1, "items": {"type": "ethernet"}, "uniqueItems": true, "array": [{"required": ["pre-eth"], "properties": {"pre-eth": {"optional": false, "type": "number", "description": "Ethernet interface number", "format": "interface"}, "pre-vlan": {"description": "Interface VLAN (VLAN ID)", "format": "number", "type": "number", "maximum": 4094, "minimum": 1, "optional": true}}}], "type": "array", "$ref": "/axapi/v3/vrrp-a/preferred-session-sync-port/ethernet/{pre-eth}"} :param DeviceProxy: The device proxy for REST operations and session handling. Refer to `common/device_proxy.py` URL for this object:: `https://<Hostname|Ip address>//axapi/v3/vrrp-a/preferred-session-sync-port`.
6259907c009cb60464d02f61
class DnSet(BaseObject): <NEW_LINE> <INDENT> def __init__(self, **kwargs): <NEW_LINE> <INDENT> BaseObject.__init__(self, "DnSet", "dnSet") <NEW_LINE> if kwargs: <NEW_LINE> <INDENT> for n, v in ucsgenutils.iteritems(kwargs): <NEW_LINE> <INDENT> self.attr_set(n, v)
This is the DnSet class.
6259907c97e22403b383c923
class Server(Connection): <NEW_LINE> <INDENT> def __init__(self, host, port): <NEW_LINE> <INDENT> super(Server, self).__init__(b'server') <NEW_LINE> self.addr = (host, int(port)) <NEW_LINE> <DEDENT> def connect(self): <NEW_LINE> <INDENT> self.conn = socket.socket(socket.AF_INET, socket.SOCK_STREAM) <NEW_LINE> host = self.addr[0].decode('utf-8') if isinstance(self.addr[0], bytes) else self.addr[0] <NEW_LINE> self.conn.connect((host, self.addr[1]))
Establish connection to destination server.
6259907ce1aae11d1e7cf522
class entity_keyword(parser.keyword): <NEW_LINE> <INDENT> def __init__(self, sString): <NEW_LINE> <INDENT> parser.keyword.__init__(self, sString)
unique_id = instantiated_unit : entity_keyword
6259907c2c8b7c6e89bd5208
class Instance(A10BaseClass): <NEW_LINE> <INDENT> def __init__(self, **kwargs): <NEW_LINE> <INDENT> self.ERROR_MSG = "" <NEW_LINE> self.required = [ "name"] <NEW_LINE> self.b_key = "instance" <NEW_LINE> self.a10_url="/axapi/v3/aam/authentication/logon/http-authenticate/instance/{name}" <NEW_LINE> self.DeviceProxy = "" <NEW_LINE> self.uuid = "" <NEW_LINE> self.retry = "" <NEW_LINE> self.name = "" <NEW_LINE> self.auth_method = {} <NEW_LINE> for keys, value in kwargs.items(): <NEW_LINE> <INDENT> setattr(self,keys, value)
Class Description:: HTTP-authenticate Logon. Class instance supports CRUD Operations and inherits from `common/A10BaseClass`. This class is the `"PARENT"` class for this module.` :param uuid: {"description": "uuid of the object", "format": "string", "minLength": 1, "modify-not-allowed": 1, "optional": true, "maxLength": 64, "type": "string"} :param retry: {"description": "Specify max. number of failure retry (1 ~ 32), default is 3", "format": "number", "default": 3, "optional": true, "maximum": 32, "minimum": 1, "type": "number"} :param name: {"description": "Specify HTTP-Authenticate logon name", "format": "string", "minLength": 1, "optional": false, "maxLength": 63, "type": "string"} :param DeviceProxy: The device proxy for REST operations and session handling. Refer to `common/device_proxy.py` URL for this object:: `https://<Hostname|Ip address>//axapi/v3/aam/authentication/logon/http-authenticate/instance/{name}`.
6259907c3317a56b869bf257
class Amount2DecimalType (pyxb.binding.datatypes.decimal): <NEW_LINE> <INDENT> _ExpandedName = pyxb.namespace.ExpandedName(Namespace, 'Amount2DecimalType') <NEW_LINE> _XSDLocation = pyxb.utils.utility.Location('/opt/odoo/tmp/data/datifatture/DatiFatturav2.1.xsd', 468, 2) <NEW_LINE> _Documentation = None
An atomic simple type.
6259907c32920d7e50bc7a65
class CheckFailed(ActionError): <NEW_LINE> <INDENT> pass
Raised when a check for a command fails
6259907c01c39578d7f14446
class SimilarGeneSelector(Recombinator): <NEW_LINE> <INDENT> def __init__(self): <NEW_LINE> <INDENT> super(SimilarGeneSelector, self).__init__() <NEW_LINE> <DEDENT> def choose_genes(self, chr1, chr2): <NEW_LINE> <INDENT> genes = chr1.get_all_genes() <NEW_LINE> for gene1 in random.sample(genes, len(genes)): <NEW_LINE> <INDENT> for gene2 in chr2.get_all_genes(): <NEW_LINE> <INDENT> if gene2.is_equal(gene1): <NEW_LINE> <INDENT> return gene1, gene2 <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> return None, None
Selects similar genes from two chromosomes
6259907c71ff763f4b5e91cf
class UBXStreamer: <NEW_LINE> <INDENT> def __init__(self, port, baudrate, timeout=5, ubx_only=False): <NEW_LINE> <INDENT> self._serial_object = None <NEW_LINE> self._serial_thread = None <NEW_LINE> self._ubxreader = None <NEW_LINE> self._connected = False <NEW_LINE> self._reading = False <NEW_LINE> self._port = port <NEW_LINE> self._baudrate = baudrate <NEW_LINE> self._timeout = timeout <NEW_LINE> self._ubx_only = ubx_only <NEW_LINE> <DEDENT> def __del__(self): <NEW_LINE> <INDENT> self.stop_read_thread() <NEW_LINE> self.disconnect() <NEW_LINE> <DEDENT> def connect(self): <NEW_LINE> <INDENT> self._connected = False <NEW_LINE> try: <NEW_LINE> <INDENT> self._serial_object = Serial( self._port, self._baudrate, timeout=self._timeout ) <NEW_LINE> self._ubxreader = UBXReader(BufferedReader(self._serial_object), ubxonly=self._ubx_only) <NEW_LINE> self._connected = True <NEW_LINE> <DEDENT> except (SerialException, SerialTimeoutException) as err: <NEW_LINE> <INDENT> print(f"Error connecting to serial port {err}") <NEW_LINE> <DEDENT> return self._connected <NEW_LINE> <DEDENT> def disconnect(self): <NEW_LINE> <INDENT> if self._connected and self._serial_object: <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> self._serial_object.close() <NEW_LINE> <DEDENT> except (SerialException, SerialTimeoutException) as err: <NEW_LINE> <INDENT> print(f"Error disconnecting from serial port {err}") <NEW_LINE> <DEDENT> <DEDENT> self._connected = False <NEW_LINE> return self._connected <NEW_LINE> <DEDENT> def start_read_thread(self): <NEW_LINE> <INDENT> if self._connected: <NEW_LINE> <INDENT> self._reading = True <NEW_LINE> self._serial_thread = Thread(target=self._read_thread, daemon=True) <NEW_LINE> self._serial_thread.start() <NEW_LINE> <DEDENT> <DEDENT> def stop_read_thread(self): <NEW_LINE> <INDENT> if self._serial_thread is not None: <NEW_LINE> <INDENT> self._reading = False <NEW_LINE> <DEDENT> <DEDENT> def send(self, data): <NEW_LINE> <INDENT> self._serial_object.write(data) <NEW_LINE> 
<DEDENT> def flush(self): <NEW_LINE> <INDENT> self._serial_object.reset_input_buffer() <NEW_LINE> <DEDENT> def waiting(self): <NEW_LINE> <INDENT> return self._serial_object.in_waiting <NEW_LINE> <DEDENT> def _read_thread(self): <NEW_LINE> <INDENT> while self._reading and self._serial_object: <NEW_LINE> <INDENT> if self._serial_object.in_waiting: <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> (raw_data, parsed_data) = self._ubxreader.read() <NEW_LINE> if parsed_data: <NEW_LINE> <INDENT> print(parsed_data) <NEW_LINE> <DEDENT> <DEDENT> except ( ube.UBXStreamError, ube.UBXMessageError, ube.UBXTypeError, ube.UBXParseError, ) as err: <NEW_LINE> <INDENT> print(f"Something went wrong {err}") <NEW_LINE> continue
UBXStreamer class.
6259907cf9cc0f698b1c5fde
class NullContext(object): <NEW_LINE> <INDENT> def __enter__(self): <NEW_LINE> <INDENT> self.old_contexts = _state.contexts <NEW_LINE> _state.contexts = () <NEW_LINE> <DEDENT> def __exit__(self, type, value, traceback): <NEW_LINE> <INDENT> _state.contexts = self.old_contexts
Resets the StackContext. Useful when creating a shared resource on demand (e.g. an AsyncHTTPClient) where the stack that caused the creation is not relevant to future operations.
6259907c3617ad0b5ee07b71
class L3(CFRegularizer): <NEW_LINE> <INDENT> def compute_norm(self, x): <NEW_LINE> <INDENT> return tf.reduce_sum(tf.abs(x)**3)
L3 regularization.
6259907c7b180e01f3e49d77
class Attributes(Base): <NEW_LINE> <INDENT> def test_path_type(self): <NEW_LINE> <INDENT> b = NonbibFile(self.invalid) <NEW_LINE> self.assertIsInstance(b.path, pathlib.Path) <NEW_LINE> <DEDENT> def test_src_txt_type(self): <NEW_LINE> <INDENT> b = NonbibFile(self.invalid) <NEW_LINE> self.assertIsInstance(b.src_txt, unicode) <NEW_LINE> <DEDENT> def test_bib_type(self): <NEW_LINE> <INDENT> b = NonbibFile(self.invalid) <NEW_LINE> self.assertIsInstance(b.bib, PybtexError) <NEW_LINE> <DEDENT> def test_path_immutability(self): <NEW_LINE> <INDENT> b = NonbibFile(self.invalid) <NEW_LINE> try: <NEW_LINE> <INDENT> b.path = self.empty <NEW_LINE> <DEDENT> except AttributeError: <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> self.fail("NonbibFile.path can be set after instantiation") <NEW_LINE> <DEDENT> <DEDENT> def test_bib_immutability(self): <NEW_LINE> <INDENT> b = NonbibFile(self.invalid) <NEW_LINE> bib = b.bib <NEW_LINE> try: <NEW_LINE> <INDENT> b.bib = bib <NEW_LINE> <DEDENT> except AttributeError: <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> self.fail("NonbibFile.bib can be set after instantiation") <NEW_LINE> <DEDENT> <DEDENT> def test_src_txt_immutability(self): <NEW_LINE> <INDENT> b = NonbibFile(self.invalid) <NEW_LINE> try: <NEW_LINE> <INDENT> b.src_txt = "legitimate text string" <NEW_LINE> <DEDENT> except AttributeError: <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> self.fail("NonbibFile.src_txt can be set after instantiation")
Test attributes of NonbibFile These tests include type checks, setting immutable attributes, etc.
6259907c67a9b606de5477b8
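The immutability tests in the `Attributes` record above use a manual try/except/else/fail idiom. The same check reads more idiomatically with `assertRaises` as a context manager; a small self-contained sketch (the `Point` class is illustrative, not from the original):

```python
import unittest

class Point:
    __slots__ = ('x',)          # forbids setting any attribute other than x

    def __init__(self, x):
        self.x = x

class TestImmutability(unittest.TestCase):
    def test_cannot_add_attribute(self):
        p = Point(1)
        # Equivalent to the try/except AttributeError/else self.fail(...) idiom
        with self.assertRaises(AttributeError):
            p.y = 2

suite = unittest.TestLoader().loadTestsFromTestCase(TestImmutability)
result = unittest.TextTestRunner(verbosity=0).run(suite)
assert result.wasSuccessful()
```

The context-manager form fails the test automatically if no exception is raised, so no explicit `self.fail` branch is needed.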
class UploadImageStoreForm(forms.ModelForm): <NEW_LINE> <INDENT> class Meta: <NEW_LINE> <INDENT> model = Store <NEW_LINE> fields = ('logo',)
Form to handle the upload of the store logo image
6259907cd486a94d0ba2d9dc
class ErrorInvalidCommand(ClientError): <NEW_LINE> <INDENT> pass
Invalid command, wrong formatting of parameters, etc.
6259907c7c178a314d78e8fd
class EngineConfig(object): <NEW_LINE> <INDENT> encoding = True <NEW_LINE> serializer = json <NEW_LINE> compression = False <NEW_LINE> mode = RedisMode.toplevel_blob
Configuration for the `RedisAdapter` engine.
6259907cfff4ab517ebcf23d
@view_defaults(route_name='password:reset:continue', permission=NO_PERMISSION_REQUIRED, renderer='pyramid_fullauth:resources/templates/reset.proceed.mako') <NEW_LINE> class PasswordResetContinueView(BaseView): <NEW_LINE> <INDENT> @view_config(request_method='GET') <NEW_LINE> def get(self): <NEW_LINE> <INDENT> self.request.logout() <NEW_LINE> return { 'status': True, 'csrf_token': self.request.session.get_csrf_token() } <NEW_LINE> <DEDENT> @view_config(request_method='POST', check_csrf=True) <NEW_LINE> def post(self): <NEW_LINE> <INDENT> user = self.request.matchdict.get('user') <NEW_LINE> password = self.request.POST.get('password', None) <NEW_LINE> password_confirm = self.request.POST.get('confirm_password', None) <NEW_LINE> if password == password_confirm: <NEW_LINE> <INDENT> try: <NEW_LINE> <INDENT> self.request.registry.notify(BeforeReset(self.request, user)) <NEW_LINE> validate_passsword(self.request, password, user) <NEW_LINE> user.reset_key = None <NEW_LINE> try: <NEW_LINE> <INDENT> pyramid_basemodel.Session.query(AuthenticationProvider).filter( AuthenticationProvider.user_id == user.id, AuthenticationProvider.provider == text_type('email') ).one() <NEW_LINE> <DEDENT> except NoResultFound: <NEW_LINE> <INDENT> user.providers.append( AuthenticationProvider( provider=text_type('email'), provider_id=user.id ) ) <NEW_LINE> <DEDENT> pyramid_basemodel.Session.flush() <NEW_LINE> <DEDENT> except (ValidateError, AttributeError) as e: <NEW_LINE> <INDENT> return { 'status': False, 'msg': text_type(e), 'csrf_token': self.request.session.get_csrf_token() } <NEW_LINE> <DEDENT> try: <NEW_LINE> <INDENT> self.request.registry.notify(AfterReset(self.request, user)) <NEW_LINE> <DEDENT> except HTTPRedirection as redirect: <NEW_LINE> <INDENT> return redirect <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> return { 'status': False, 'msg': self.request._('password-mismatch', default='Password doesn\'t match', domain='pyramid_fullauth'), 'csrf_token': 
self.request.session.get_csrf_token() } <NEW_LINE> <DEDENT> return self.get()
Password reset views. These views display the actual password reset forms.
6259907c7cff6e4e811b7465
class UnityActionObjectRemove(bpy.types.Operator): <NEW_LINE> <INDENT> bl_idname = "group.unity_action_object_remove" <NEW_LINE> bl_label = "Remove Group Object from Unity Actions" <NEW_LINE> @classmethod <NEW_LINE> def poll(cls, context): <NEW_LINE> <INDENT> return hasattr (context, "group") and context.group != None and hasattr(context, "unity_group_action_object") and context.unity_group_action_object != None <NEW_LINE> <DEDENT> def execute(self, context): <NEW_LINE> <INDENT> index = -1 <NEW_LINE> for i, a in enumerate(context.group.unity_action_objects): <NEW_LINE> <INDENT> if a == context.unity_group_action_object: <NEW_LINE> <INDENT> index = i <NEW_LINE> <DEDENT> <DEDENT> print(index) <NEW_LINE> if index >= 0: <NEW_LINE> <INDENT> context.group.unity_action_objects.remove(index) <NEW_LINE> <DEDENT> return {'FINISHED'}
Remove group object from current action
6259907c97e22403b383c926
class WordInfo(object): <NEW_LINE> <INDENT> def __init__(self, text): <NEW_LINE> <INDENT> super(WordInfo, self).__init__() <NEW_LINE> self.text = text <NEW_LINE> self.freq = 0.0 <NEW_LINE> self.left = [] <NEW_LINE> self.right = [] <NEW_LINE> self.aggregation = 0 <NEW_LINE> self.entropy = 0 <NEW_LINE> <DEDENT> def update(self, left, right): <NEW_LINE> <INDENT> self.freq += 1 <NEW_LINE> if left: self.left.append(left) <NEW_LINE> if right: self.right.append(right) <NEW_LINE> <DEDENT> def compute(self, length): <NEW_LINE> <INDENT> self.freq /= length <NEW_LINE> self.entropy = min(entropyOfList(self.left), entropyOfList(self.right)) <NEW_LINE> <DEDENT> def computeAggregation(self, words_dict): <NEW_LINE> <INDENT> parts = genSubparts(self.text) <NEW_LINE> if len(parts) > 0: <NEW_LINE> <INDENT> self.aggregation = min(map( lambda p: self.freq/words_dict[p[0]].freq/words_dict[p[1]].freq, parts ))
Store information about each word, including its frequency, left neighbors and right neighbors
6259907c283ffb24f3cf52c5
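The `WordInfo` record above calls an external `entropyOfList` helper to score neighbor diversity. A minimal stand-in (an assumption about what that helper computes: Shannon entropy of the neighbor list, in nats) shows why a varied neighborhood scores higher than a fixed one:

```python
import math
from collections import Counter

def entropy_of_list(items):
    # Hypothetical stand-in for entropyOfList: Shannon entropy (in nats)
    # of the empirical distribution over the given neighbor tokens.
    counts = Counter(items)
    total = len(items)
    return -sum((c / total) * math.log(c / total) for c in counts.values())

varied = ['a', 'b', 'c', 'd']   # four distinct left neighbors
fixed = ['a', 'a', 'a', 'a']    # always the same left neighbor

assert entropy_of_list(varied) > entropy_of_list(fixed)
assert entropy_of_list(fixed) == 0.0
```

A word that appears next to many different characters (high neighbor entropy) is more likely to be a standalone word, which is what the `min(left, right)` entropy in `compute` captures.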
class RunTests(Command): <NEW_LINE> <INDENT> description = 'run tests' <NEW_LINE> user_options = [] <NEW_LINE> def initialize_options(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def finalize_options(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def run(self): <NEW_LINE> <INDENT> errno = call(['py.test', '--cov=bonita', '--cov-report=term-missing']) <NEW_LINE> raise SystemExit(errno)
Run all tests.
6259907c71ff763f4b5e91d1
class PeriodicNeighbor(PeriodicSite): <NEW_LINE> <INDENT> def __init__(self, species: Composition, coords: np.ndarray, lattice: Lattice, properties: dict = None, nn_distance: float = 0.0, index: int = 0, image: tuple = (0, 0, 0)): <NEW_LINE> <INDENT> self._lattice = lattice <NEW_LINE> self._frac_coords = coords <NEW_LINE> self._species = species <NEW_LINE> self.properties = properties or {} <NEW_LINE> self.nn_distance = nn_distance <NEW_LINE> self.index = index <NEW_LINE> self.image = image <NEW_LINE> <DEDENT> @property <NEW_LINE> def coords(self): <NEW_LINE> <INDENT> return self._lattice.get_cartesian_coords(self._frac_coords) <NEW_LINE> <DEDENT> def __len__(self): <NEW_LINE> <INDENT> return 4 <NEW_LINE> <DEDENT> def __getitem__(self, i: int): <NEW_LINE> <INDENT> return (self, self.nn_distance, self.index, self.image)[i]
Simple PeriodicSite subclass to contain a neighboring atom that skips all the unnecessary checks for speed. Can be used as a fixed-length tuple of size 4 to retain backwards compatibility with past use cases. (site, distance, index, image). In future, usage should be to call attributes, e.g., PeriodicNeighbor.index, PeriodicNeighbor.distance, etc.
6259907c44b2445a339b7670
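`PeriodicNeighbor` above keeps tuple backwards compatibility by defining `__len__` and `__getitem__`, so an instance still unpacks like `(site, distance, index, image)`. A minimal sketch of that pattern in isolation (the `Neighbor` class and its field names are illustrative, not the pymatgen implementation):

```python
class Neighbor:
    # Object that also behaves like a fixed-length tuple of size 4:
    # (site, distance, index, image), mirroring PeriodicNeighbor above.
    def __init__(self, name, distance, index, image):
        self.name = name
        self.distance = distance
        self.index = index
        self.image = image

    def __len__(self):
        return 4

    def __getitem__(self, i):
        # Indexing past 3 raises IndexError, which also terminates iteration
        return (self, self.distance, self.index, self.image)[i]

n = Neighbor('Fe', 2.5, 7, (0, 0, 1))
site, dist, idx, img = n        # tuple-style unpacking still works
assert site is n and dist == 2.5 and idx == 7 and img == (0, 0, 1)
assert n[1] == 2.5              # old positional access also works
```

Because Python's iteration protocol falls back to `__getitem__` with ascending indices until `IndexError`, unpacking and `list(n)` both work without defining `__iter__`.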
class ItemNotCreatedError(PredictionIOAPIError): <NEW_LINE> <INDENT> pass
Raised when an attempt to create an item fails
6259907c796e427e538501a0
class HardwareTrackerGroupConfig(TrackerGroupConfig): <NEW_LINE> <INDENT> def __init__(self, *args, device_identifier: Optional[DeviceId] = None, is_origin: bool = False, tracked_position_offset: Optional[Sequence[float]] = None, tracked_rotation_offset: Optional[Sequence[float]] = None, mimic_in_sim: bool = False, mimic_ignore_z_axis: bool = False, mimic_ignore_rotation: bool = False, **kwargs): <NEW_LINE> <INDENT> super().__init__(*args, **kwargs) <NEW_LINE> self.device_identifier = device_identifier <NEW_LINE> self.is_origin = is_origin <NEW_LINE> self.mimic_in_sim = mimic_in_sim <NEW_LINE> self.mimic_ignore_z_axis = mimic_ignore_z_axis <NEW_LINE> self.mimic_ignore_rotation = mimic_ignore_rotation <NEW_LINE> self.tracked_position_offset = None <NEW_LINE> if tracked_position_offset is not None: <NEW_LINE> <INDENT> tracked_position_offset = np.array( tracked_position_offset, dtype=np.float32) <NEW_LINE> assert tracked_position_offset.shape == ( 3,), tracked_position_offset.shape <NEW_LINE> self.tracked_position_offset = tracked_position_offset <NEW_LINE> <DEDENT> self.tracked_rotation_offset = None <NEW_LINE> if tracked_rotation_offset is not None: <NEW_LINE> <INDENT> tracked_rotation_offset = np.array( tracked_rotation_offset, dtype=np.float32) <NEW_LINE> assert tracked_rotation_offset.shape == ( 3,), tracked_rotation_offset.shape <NEW_LINE> self.tracked_rotation_offset = euler2quat( *tracked_rotation_offset, axes='rxyz')
Stores group configuration for a VrTrackerComponent.
6259907c7047854f46340ddb
class BingParser(Parser): <NEW_LINE> <INDENT> search_engine = 'bing' <NEW_LINE> search_types = ['normal', 'image'] <NEW_LINE> no_results_selector = ['#b_results > .b_ans::text'] <NEW_LINE> num_results_search_selectors = ['.sb_count'] <NEW_LINE> effective_query_selector = ['#sp_requery a > strong', '#sp_requery + #sp_recourse a::attr(href)'] <NEW_LINE> page_number_selectors = ['.sb_pagS::text'] <NEW_LINE> normal_search_selectors = { 'results': { 'us_ip': { 'container': '#b_results', 'result_container': '.b_algo', 'link': 'h2 > a::attr(href)', 'snippet': '.b_caption > p::text', 'title': 'h2::text', 'visible_link': 'cite::text' }, 'de_ip': { 'container': '#b_results', 'result_container': '.b_algo', 'link': 'h2 > a::attr(href)', 'snippet': '.b_caption > p::text', 'title': 'h2::text', 'visible_link': 'cite::text' }, 'de_ip_news_items': { 'container': 'ul.b_vList li', 'link': ' h5 a::attr(href)', 'snippet': 'p::text', 'title': ' h5 a::text', 'visible_link': 'cite::text' }, }, 'ads_main': { 'us_ip': { 'container': '#b_results .b_ad', 'result_container': '.sb_add', 'link': 'h2 > a::attr(href)', 'snippet': '.sb_addesc::text', 'title': 'h2 > a::text', 'visible_link': 'cite::text' }, 'de_ip': { 'container': '#b_results .b_ad', 'result_container': '.sb_add', 'link': 'h2 > a::attr(href)', 'snippet': '.b_caption > p::text', 'title': 'h2 > a::text', 'visible_link': 'cite::text' } } } <NEW_LINE> image_search_selectors = { 'results': { 'ch_ip': { 'container': '#dg_c .imgres', 'result_container': '.dg_u', 'link': 'a::attr(m)' }, } } <NEW_LINE> def __init__(self, *args, **kwargs): <NEW_LINE> <INDENT> super().__init__(*args, **kwargs) <NEW_LINE> <DEDENT> def after_parsing(self): <NEW_LINE> <INDENT> super().after_parsing() <NEW_LINE> if self.searchtype == 'normal': <NEW_LINE> <INDENT> self.no_results = False <NEW_LINE> if self.no_results_text: <NEW_LINE> <INDENT> self.no_results = self.query in self.no_results_text or 'Do you want results only for' in self.no_results_text <NEW_LINE> 
<DEDENT> <DEDENT> if self.searchtype == 'image': <NEW_LINE> <INDENT> for key, i in self.iter_serp_items(): <NEW_LINE> <INDENT> for regex in ( r'imgurl:"(?P<url>.*?)"', ): <NEW_LINE> <INDENT> result = re.search(regex, self.search_results[key][i]['link']) <NEW_LINE> if result: <NEW_LINE> <INDENT> self.search_results[key][i]['link'] = result.group('url') <NEW_LINE> break
Parses SERP pages of the Bing search engine.
6259907c3d592f4c4edbc871
class DuplicateWalFile(WALFileException): <NEW_LINE> <INDENT> pass
A duplicate WAL file has been found
6259907cdc8b845886d54fe0
class TclCommandAddPolyline(TclCommand.TclCommandSignaled): <NEW_LINE> <INDENT> aliases = ['add_polyline'] <NEW_LINE> arg_names = collections.OrderedDict([ ('name', str) ]) <NEW_LINE> option_types = collections.OrderedDict() <NEW_LINE> required = ['name'] <NEW_LINE> help = { 'main': "Creates a polyline in the given Geometry object.", 'args': collections.OrderedDict([ ('name', 'Name of the Geometry object to which to append the polyline.'), ('xi, yi', 'Coordinates of points in the polyline.') ]), 'examples': [ 'add_polyline <name> <x0> <y0> <x1> <y1> <x2> <y2> [x3 y3 [...]]' ] } <NEW_LINE> def execute(self, args, unnamed_args): <NEW_LINE> <INDENT> name = args['name'] <NEW_LINE> obj = self.app.collection.get_by_name(name) <NEW_LINE> if obj is None: <NEW_LINE> <INDENT> self.raise_tcl_error("Object not found: %s" % name) <NEW_LINE> <DEDENT> if not isinstance(obj, Geometry): <NEW_LINE> <INDENT> self.raise_tcl_error('Expected Geometry, got %s %s.' % (name, type(obj))) <NEW_LINE> <DEDENT> if len(unnamed_args) % 2 != 0: <NEW_LINE> <INDENT> self.raise_tcl_error("Incomplete coordinates.") <NEW_LINE> <DEDENT> points = [[float(unnamed_args[2*i]), float(unnamed_args[2*i+1])] for i in range(len(unnamed_args)/2)] <NEW_LINE> obj.add_polyline(points) <NEW_LINE> obj.plot()
Tcl shell command to create a polyline in the given Geometry object
6259907c2c8b7c6e89bd520c
class NoImageError(VisualSearchEngineError): <NEW_LINE> <INDENT> def __init__(self, image_id): <NEW_LINE> <INDENT> message = 'Image {} does not exist in the index'.format(image_id) <NEW_LINE> VisualSearchEngineError.__init__(self, message)
Raised when trying to delete non-existing image path from the image index.
6259907c656771135c48ad43
class PicturesMyHandler(NewebeAuthHandler): <NEW_LINE> <INDENT> def get(self, startKey=None, tag=None): <NEW_LINE> <INDENT> self.return_documents_since(PictureManager.get_owner_last_pictures, startKey, tag)
This handler handles requests that retrieve last pictures posted by Newebe owner. * GET: Retrieves last pictures posted by newebe owner. * POST: Creates a picture.
6259907c5fdd1c0f98e5f9a6
@register_test() <NEW_LINE> class media_tests(TsBase): <NEW_LINE> <INDENT> cycles = 5 <NEW_LINE> desktop = True <NEW_LINE> url = 'http://localhost:16932/startup_test/media/html/media_tests.html' <NEW_LINE> timeout = 360
Media Performance Tests
6259907ca05bb46b3848be3c
class TestEdxJsonEncoder(unittest.TestCase): <NEW_LINE> <INDENT> def setUp(self): <NEW_LINE> <INDENT> self.encoder = EdxJSONEncoder() <NEW_LINE> class OffsetTZ(tzinfo): <NEW_LINE> <INDENT> def utcoffset(self, _dt): <NEW_LINE> <INDENT> return timedelta(hours=4) <NEW_LINE> <DEDENT> <DEDENT> self.offset_tz = OffsetTZ() <NEW_LINE> class NullTZ(tzinfo): <NEW_LINE> <INDENT> def utcoffset(self, _dt): <NEW_LINE> <INDENT> return None <NEW_LINE> <DEDENT> <DEDENT> self.null_utc_tz = NullTZ() <NEW_LINE> <DEDENT> def test_encode_location(self): <NEW_LINE> <INDENT> loc = Location('org', 'course', 'run', 'category', 'name', None) <NEW_LINE> self.assertEqual(loc.to_deprecated_string(), self.encoder.default(loc)) <NEW_LINE> loc = Location('org', 'course', 'run', 'category', 'name', 'version') <NEW_LINE> self.assertEqual(loc.to_deprecated_string(), self.encoder.default(loc)) <NEW_LINE> <DEDENT> def test_encode_naive_datetime(self): <NEW_LINE> <INDENT> self.assertEqual( "2013-05-03T10:20:30.000100", self.encoder.default(datetime(2013, 5, 3, 10, 20, 30, 100)) ) <NEW_LINE> self.assertEqual( "2013-05-03T10:20:30", self.encoder.default(datetime(2013, 5, 3, 10, 20, 30)) ) <NEW_LINE> <DEDENT> def test_encode_utc_datetime(self): <NEW_LINE> <INDENT> self.assertEqual( "2013-05-03T10:20:30+00:00", self.encoder.default(datetime(2013, 5, 3, 10, 20, 30, 0, pytz.UTC)) ) <NEW_LINE> self.assertEqual( "2013-05-03T10:20:30+04:00", self.encoder.default(datetime(2013, 5, 3, 10, 20, 30, 0, self.offset_tz)) ) <NEW_LINE> self.assertEqual( "2013-05-03T10:20:30Z", self.encoder.default(datetime(2013, 5, 3, 10, 20, 30, 0, self.null_utc_tz)) ) <NEW_LINE> <DEDENT> def test_fallthrough(self): <NEW_LINE> <INDENT> with self.assertRaises(TypeError): <NEW_LINE> <INDENT> self.encoder.default(None) <NEW_LINE> <DEDENT> with self.assertRaises(TypeError): <NEW_LINE> <INDENT> self.encoder.default({})
Tests for xml_exporter.EdxJSONEncoder
6259907c97e22403b383c928
class TimeoutTracer(object): <NEW_LINE> <INDENT> def __init__(self, granularity=5): <NEW_LINE> <INDENT> self.granularity = granularity <NEW_LINE> <DEDENT> def connection_raw_execute(self, connection, raw_cursor, statement, params): <NEW_LINE> <INDENT> remaining_time = self.get_remaining_time() <NEW_LINE> if remaining_time <= 0: <NEW_LINE> <INDENT> raise TimeoutError( statement, params, "%d seconds remaining in time budget" % remaining_time) <NEW_LINE> <DEDENT> last_remaining_time = getattr(connection, "_timeout_tracer_remaining_time", 0) <NEW_LINE> if (remaining_time > last_remaining_time or last_remaining_time - remaining_time >= self.granularity): <NEW_LINE> <INDENT> self.set_statement_timeout(raw_cursor, remaining_time) <NEW_LINE> connection._timeout_tracer_remaining_time = remaining_time <NEW_LINE> <DEDENT> <DEDENT> def connection_raw_execute_error(self, connection, raw_cursor, statement, params, error): <NEW_LINE> <INDENT> raise NotImplementedError("%s.connection_raw_execute_error() must be " "implemented" % self.__class__.__name__) <NEW_LINE> <DEDENT> def connection_commit(self, connection, xid=None): <NEW_LINE> <INDENT> self._reset_timeout_tracer_remaining_time(connection) <NEW_LINE> <DEDENT> def connection_rollback(self, connection, xid=None): <NEW_LINE> <INDENT> self._reset_timeout_tracer_remaining_time(connection) <NEW_LINE> <DEDENT> def _reset_timeout_tracer_remaining_time(self, connection): <NEW_LINE> <INDENT> connection._timeout_tracer_remaining_time = 0 <NEW_LINE> <DEDENT> def set_statement_timeout(self, raw_cursor, remaining_time): <NEW_LINE> <INDENT> raise NotImplementedError("%s.set_statement_timeout() must be " "implemented" % self.__class__.__name__) <NEW_LINE> <DEDENT> def get_remaining_time(self): <NEW_LINE> <INDENT> raise NotImplementedError("%s.get_remaining_time() must be implemented" % self.__class__.__name__)
Provide a timeout facility for connections to prevent rogue operations. This tracer must be subclassed by backend-specific implementations that override C{connection_raw_execute_error}, C{set_statement_timeout} and C{get_remaining_time} methods.
6259907c4a966d76dd5f090d
class CIFWrapper(object): <NEW_LINE> <INDENT> _DATA = collections.OrderedDict() <NEW_LINE> def __init__(self, d, data_id=None): <NEW_LINE> <INDENT> if d is not None: <NEW_LINE> <INDENT> __dictionary = copy.deepcopy(d) <NEW_LINE> self.data_id = data_id if data_id is not None else '' <NEW_LINE> try: <NEW_LINE> <INDENT> (datablock_id, datablock) = list(__dictionary.items())[0] <NEW_LINE> category = list(datablock.values())[0] <NEW_LINE> item = list(category.values())[0] <NEW_LINE> self.data_id = datablock_id <NEW_LINE> self._DATA = datablock <NEW_LINE> <DEDENT> except AttributeError: <NEW_LINE> <INDENT> self._DATA = __dictionary <NEW_LINE> <DEDENT> self.__convertDictToCIFWrapperTable() <NEW_LINE> <DEDENT> <DEDENT> def __getattr__(self, attr_in): <NEW_LINE> <INDENT> return self._DATA.get(attr_in) <NEW_LINE> <DEDENT> def __convertDictToCIFWrapperTable(self): <NEW_LINE> <INDENT> for k in list(self._DATA.keys()): <NEW_LINE> <INDENT> j = collections.OrderedDict() <NEW_LINE> for k2, v2 in list(self._DATA[k].items()): <NEW_LINE> <INDENT> if isinstance(v2, list): <NEW_LINE> <INDENT> j[k2] = v2 <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> j[k2] = [v2, ] <NEW_LINE> <DEDENT> <DEDENT> self._DATA.update({k: CIFWrapperTable(j)}) <NEW_LINE> <DEDENT> <DEDENT> def unwrap(self): <NEW_LINE> <INDENT> cleaned_map = collections.OrderedDict() <NEW_LINE> for k, v in list(self._DATA.items()): <NEW_LINE> <INDENT> cleaned_map.setdefault(k, collections.OrderedDict()) <NEW_LINE> for k2, v2 in list(v._DATA.items()): <NEW_LINE> <INDENT> cleaned_map[k][k2] = v2 <NEW_LINE> <DEDENT> <DEDENT> if self.data_id is not None and self.data_id != '': <NEW_LINE> <INDENT> return {self.data_id: cleaned_map} <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> return {str(id(self)): cleaned_map} <NEW_LINE> <DEDENT> <DEDENT> def __contains__(self, tableNameIn): <NEW_LINE> <INDENT> return tableNameIn in self._DATA <NEW_LINE> <DEDENT> def __getitem__(self, tableNameIn): <NEW_LINE> <INDENT> return self._DATA.get(tableNameIn) <NEW_LINE> <DEDENT> def __delitem__(self, tableName): <NEW_LINE> <INDENT> if tableName in self._DATA: <NEW_LINE> <INDENT> del self._DATA[tableName] <NEW_LINE> <DEDENT> <DEDENT> def contents(self): <NEW_LINE> <INDENT> return list(self._DATA.keys())
CIFWrapper is a wrapper object for the output of the MMCIF2Dict object i.e., an mmCIF-like python dictionary object. This implies that mmCIF-like dictionaries written outside this package may be used to initialize the CIFWrapper class as well. The CIFWrapper object emulates python objects by providing access to mmCIF categories and items using the familiar python 'dot' notation.
6259907c66673b3332c31e26
class VppBondInterface(VppInterface): <NEW_LINE> <INDENT> def __init__(self, test, mode, lb=0, use_custom_mac=0, mac_address=''): <NEW_LINE> <INDENT> super(VppBondInterface, self).__init__(test) <NEW_LINE> self.mode = mode <NEW_LINE> self.lb = lb <NEW_LINE> self.use_custom_mac = use_custom_mac <NEW_LINE> self.mac_address = mac_address <NEW_LINE> <DEDENT> def add_vpp_config(self): <NEW_LINE> <INDENT> r = self.test.vapi.bond_create(self.mode, self.lb, self.use_custom_mac, self.mac_address) <NEW_LINE> self.set_sw_if_index(r.sw_if_index) <NEW_LINE> <DEDENT> def remove_vpp_config(self): <NEW_LINE> <INDENT> self.test.vapi.bond_delete(self.sw_if_index) <NEW_LINE> <DEDENT> def enslave_vpp_bond_interface(self, sw_if_index, is_passive, is_long_timeout): <NEW_LINE> <INDENT> self.test.vapi.bond_enslave(sw_if_index, self.sw_if_index, is_passive, is_long_timeout) <NEW_LINE> <DEDENT> def detach_vpp_bond_interface(self, sw_if_index): <NEW_LINE> <INDENT> self.test.vapi.bond_detach_slave(sw_if_index) <NEW_LINE> <DEDENT> def is_interface_config_in_dump(self, dump): <NEW_LINE> <INDENT> for i in dump: <NEW_LINE> <INDENT> if i.sw_if_index == self.sw_if_index: <NEW_LINE> <INDENT> return True <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> return False
VPP bond interface.
6259907c283ffb24f3cf52c8
class CommandSet(object): <NEW_LINE> <INDENT> _func_base = f'{__package__}.admin_commands' <NEW_LINE> def __init__(self, settings): <NEW_LINE> <INDENT> self._settings = settings <NEW_LINE> <DEDENT> def allows(self, command_name) -> CommandPermission: <NEW_LINE> <INDENT> return CommandPermission(name=command_name, _settings=self._commands.get(command_name, {})) <NEW_LINE> <DEDENT> def func(self, command_name) -> AdminCommandDef: <NEW_LINE> <INDENT> func_name = self._commands.get(command_name, {}) .get('pythonFunction', 'undefined_operation') <NEW_LINE> module = importlib.import_module(self._func_base) <NEW_LINE> segments = func_name.split('.') <NEW_LINE> func = module <NEW_LINE> while len(segments) > 0: <NEW_LINE> <INDENT> func = getattr(func, segments.pop(0)) <NEW_LINE> <DEDENT> return cast(AdminCommandDef, func) <NEW_LINE> <DEDENT> @property <NEW_LINE> def no(self) -> str: <NEW_LINE> <INDENT> return random.choice(self._settings.get('notAuthorizedResponses', [])) <NEW_LINE> <DEDENT> @property <NEW_LINE> def _commands(self) -> Dict[str, Any]: <NEW_LINE> <INDENT> return self._settings.get('commands', {})
A config object for asking questions of `commands.json` for admin command configuration.
6259907ca8370b77170f1df6
class TopoDataFormat(VegaLiteSchema): <NEW_LINE> <INDENT> _schema = {'$ref': '#/definitions/TopoDataFormat'} <NEW_LINE> _rootschema = Root._schema <NEW_LINE> def __init__(self, feature=Undefined, mesh=Undefined, parse=Undefined, type=Undefined, **kwds): <NEW_LINE> <INDENT> super(TopoDataFormat, self).__init__(feature=feature, mesh=mesh, parse=parse, type=type, **kwds)
TopoDataFormat schema wrapper Mapping(required=[]) Attributes ---------- feature : string The name of the TopoJSON object set to convert to a GeoJSON feature collection. For example, in a map of the world, there may be an object set named ``"countries"``. Using the feature property, we can extract this set and generate a GeoJSON feature object for each country. mesh : string The name of the TopoJSON object set to convert to mesh. Similar to the ``feature`` option, ``mesh`` extracts a named TopoJSON object set. Unlike the ``feature`` option, the corresponding geo data is returned as a single, unified mesh instance, not as individual GeoJSON features. Extracting a mesh is useful for more efficiently drawing borders or other geographic elements that you do not need to associate with specific regions such as individual countries, states or counties. parse : anyOf(enum('auto'), Mapping(required=[])) If set to auto (the default), perform automatic type inference to determine the desired data types. Alternatively, a parsing directive object can be provided for explicit data types. Each property of the object corresponds to a field name, and the value to the desired data type (one of ``"number"``, ``"boolean"`` or ``"date"`` ). For example, ``"parse": {"modified_on": "date"}`` parses the ``modified_on`` field in each input record as a Date value. For ``"date"``, we parse data using JavaScript's `Date.parse() <https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date/parse>`_. Specific date formats can be provided (e.g., ``{foo: 'date:"%m%d%Y"'}`` ), using the `d3-time-format syntax <https://github.com/d3/d3-time-format#locale_format>`_. UTC date format parsing is supported similarly (e.g., ``{foo: 'utc:"%m%d%Y"'}`` ). See more about `UTC time <timeunit.html#utc>`_ type : enum('topojson') Type of input data: ``"json"``, ``"csv"``, ``"tsv"``. The default format type is determined by the extension of the file URL. If no extension is detected, ``"json"`` will be used by default.
6259907cf9cc0f698b1c5fe0
class User: <NEW_LINE> <INDENT> def __init__(self): <NEW_LINE> <INDENT> self._file = ".user.txt" <NEW_LINE> self._data = [] <NEW_LINE> self.read_from_user() <NEW_LINE> <DEDENT> def read_file(self): <NEW_LINE> <INDENT> data = [] <NEW_LINE> try: <NEW_LINE> <INDENT> with open(self._file, 'r') as f: <NEW_LINE> <INDENT> for line in f: <NEW_LINE> <INDENT> data.append(line.rstrip('\n')) <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> except FileNotFoundError: <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> return data <NEW_LINE> <DEDENT> def write_file(self): <NEW_LINE> <INDENT> with open(self._file, 'w') as f: <NEW_LINE> <INDENT> for data in self._data: <NEW_LINE> <INDENT> f.write(str(data) + '\n') <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> def read_from_user(self): <NEW_LINE> <INDENT> msg = "Please type in your username and preferred options." <NEW_LINE> title = "Advance Wars by Web Analyzer" <NEW_LINE> options = ["Username", "Minutes to run program again"] <NEW_LINE> values = self.read_file() <NEW_LINE> while True: <NEW_LINE> <INDENT> self._data = easygui.multenterbox(msg, title, options, values) <NEW_LINE> if self._data is None: <NEW_LINE> <INDENT> sys.exit(1) <NEW_LINE> <DEDENT> if self.process_data(): <NEW_LINE> <INDENT> break <NEW_LINE> <DEDENT> <DEDENT> self.write_file() <NEW_LINE> <DEDENT> def process_data(self): <NEW_LINE> <INDENT> name = self._data[0] <NEW_LINE> if name == "": <NEW_LINE> <INDENT> return False <NEW_LINE> <DEDENT> time = self._data[1] <NEW_LINE> if time == "": <NEW_LINE> <INDENT> self._data[1] = "5" <NEW_LINE> <DEDENT> elif not re.search(r'(\d*\.)?\d+', time): <NEW_LINE> <INDENT> return False <NEW_LINE> <DEDENT> return True <NEW_LINE> <DEDENT> @property <NEW_LINE> def data(self): <NEW_LINE> <INDENT> return self._data
Handles the input of the program.
6259907c7d43ff2487428129
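The numeric check in `User.process_data` uses `re.search`, which accepts any string merely *containing* a number (e.g. `"5x"` would pass). A stricter sketch of the same validation with `re.fullmatch` follows; the helper name `is_valid_minutes` is illustrative, not from the source:

```python
import re

def is_valid_minutes(text):
    """Validate the 'minutes' field as a whole: an optional integer
    part plus '.', followed by digits (e.g. '5', '7.5', '.25')."""
    return bool(re.fullmatch(r"(\d*\.)?\d+", text))

print(is_valid_minutes("7.5"), is_valid_minutes("5x"))
# → True False
```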
class TestAnnouncement(unittest.TestCase): <NEW_LINE> <INDENT> def setUp(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def tearDown(self): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> def testAnnouncement(self): <NEW_LINE> <INDENT> model = swagger_client.models.announcement.Announcement()
Announcement unit test stubs
6259907c56b00c62f0fb42fc
class NerProcessor(DataProcessor): <NEW_LINE> <INDENT> def get_labels(self) -> List[str]: <NEW_LINE> <INDENT> label_file = os.path.join(self.data_dir, 'labels.txt') <NEW_LINE> with open(label_file, 'r', encoding='utf-8') as f: <NEW_LINE> <INDENT> labels = [line.strip() for line in f] <NEW_LINE> return labels <NEW_LINE> <DEDENT> <DEDENT> def get_train_examples(self) -> List[InputExample]: <NEW_LINE> <INDENT> return self._create_example( os.path.join(self.data_dir, 'train.txt'), "train" ) <NEW_LINE> <DEDENT> def get_test_examples(self) -> List[InputExample]: <NEW_LINE> <INDENT> return self._create_example( os.path.join(self.data_dir, 'test.txt'), "test" ) <NEW_LINE> <DEDENT> def get_dev_examples(self) -> List[InputExample]: <NEW_LINE> <INDENT> return self._create_example( os.path.join(self.data_dir, 'dev.txt'), "dev" ) <NEW_LINE> <DEDENT> def _read_file(self, input_file): <NEW_LINE> <INDENT> pass <NEW_LINE> <DEDENT> @staticmethod <NEW_LINE> def _create_example(file: str, data_type: str = "train") -> List[InputExample]: <NEW_LINE> <INDENT> examples: List[InputExample] = [] <NEW_LINE> with open(file, 'r', encoding='utf-8') as f: <NEW_LINE> <INDENT> for index, line in enumerate(f): <NEW_LINE> <INDENT> data = json.loads(line) <NEW_LINE> examples.append(InputExample( guid=f"id-{data_type}-{index}", text_a=data['text'], text_b=None, label=data['entities'] )) <NEW_LINE> <DEDENT> <DEDENT> return examples
NER data processor.
6259907c627d3e7fe0e088af
class IFTTTRobot(cozmo.robot.Robot): <NEW_LINE> <INDENT> async def get_in_position(self): <NEW_LINE> <INDENT> if (self.lift_height.distance_mm > 45) or (self.head_angle.degrees < 40): <NEW_LINE> <INDENT> async with self.perform_off_charger(): <NEW_LINE> <INDENT> await self.set_lift_height(0.0).wait_for_completed() <NEW_LINE> await self.set_head_angle(cozmo.robot.MAX_HEAD_ANGLE).wait_for_completed() <NEW_LINE> <DEDENT> <DEDENT> <DEDENT> def display_image_file_on_face(self, image_name): <NEW_LINE> <INDENT> image = Image.open(image_name) <NEW_LINE> resized_image = image.resize(cozmo.oled_face.dimensions(), Image.NEAREST) <NEW_LINE> face_image = cozmo.oled_face.convert_image_to_screen_data(resized_image, invert_image=True) <NEW_LINE> self.display_oled_face_image(face_image, 5000.0)
Add some methods to the base Robot class.
6259907ce1aae11d1e7cf525
class LoginForm(forms.Form): <NEW_LINE> <INDENT> username = forms.CharField(label=_("Username"), error_messages={'required': _("Please enter your username")}) <NEW_LINE> password = forms.CharField(label=_("Password"), widget=forms.PasswordInput, error_messages={'required': _("Please enter your password")}) <NEW_LINE> service = forms.CharField(widget=forms.HiddenInput, required=False) <NEW_LINE> def clean_username(self): <NEW_LINE> <INDENT> username = self.cleaned_data.get('username') <NEW_LINE> return lower(username) <NEW_LINE> <DEDENT> def clean_service(self): <NEW_LINE> <INDENT> service = self.cleaned_data.get('service') <NEW_LINE> return urlunquote_plus(service) <NEW_LINE> <DEDENT> def clean(self): <NEW_LINE> <INDENT> username = self.cleaned_data.get('username') <NEW_LINE> password = self.cleaned_data.get('password') <NEW_LINE> if username and password: <NEW_LINE> <INDENT> user = auth.authenticate(username=username, password=password) <NEW_LINE> if user: <NEW_LINE> <INDENT> if user.is_active: <NEW_LINE> <INDENT> self.user = user <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> logger.warning("User account '%s' is disabled" % username) <NEW_LINE> raise forms.ValidationError(_("This user account is disabled")) <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> logger.warning("Error authenticating user %s" % username) <NEW_LINE> raise forms.ValidationError(_("The username or password is not correct")) <NEW_LINE> <DEDENT> <DEDENT> return self.cleaned_data
Form implementing standard username and password authentication. The ``clean()`` method passes the provided username and password to the active authentication backend(s) and verifies the user account is not disabled.
6259907cec188e330fdfa2d1
class ProcessFrame: <NEW_LINE> <INDENT> def __init__(self, frame_height=84, frame_width=84): <NEW_LINE> <INDENT> self.frame_height = frame_height <NEW_LINE> self.frame_width = frame_width <NEW_LINE> self.frame = tf.placeholder(shape=[210, 160, 3], dtype=tf.uint8) <NEW_LINE> self.processed = tf.image.rgb_to_grayscale(self.frame) <NEW_LINE> self.processed = tf.image.crop_to_bounding_box(self.processed, 34, 0, 160, 160) <NEW_LINE> self.processed = tf.image.resize_images(self.processed, [self.frame_height, self.frame_width], method=tf.image.ResizeMethod.NEAREST_NEIGHBOR) <NEW_LINE> <DEDENT> def process(self, session, frame): <NEW_LINE> <INDENT> return session.run(self.processed, feed_dict={self.frame:frame})
Resizes and converts RGB Atari frames to grayscale
6259907c23849d37ff852ae1
class Solution: <NEW_LINE> <INDENT> def sortColors(self, nums): <NEW_LINE> <INDENT> if len(nums) <= 0: <NEW_LINE> <INDENT> return <NEW_LINE> <DEDENT> left, right = 0, len(nums) - 1 <NEW_LINE> middle = 0 <NEW_LINE> while middle <= right: <NEW_LINE> <INDENT> if nums[middle] == 0: <NEW_LINE> <INDENT> nums[left], nums[middle] = nums[middle], nums[left] <NEW_LINE> left += 1 <NEW_LINE> middle += 1 <NEW_LINE> <DEDENT> elif nums[middle] == 2: <NEW_LINE> <INDENT> nums[right], nums[middle] = nums[middle], nums[right] <NEW_LINE> right -= 1 <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> middle += 1 <NEW_LINE> <DEDENT> <DEDENT> return
@param nums: A list of integers, each equal to 0, 1 or 2 @return: nothing
6259907c091ae35668706667
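The `sortColors` record above is the classic in-place three-way partition (Dutch national flag): 0s are swapped before `left`, 2s after `right`, and 1s settle in between. A standalone sketch of the same algorithm:

```python
def sort_colors(nums):
    """In-place three-way partition of a list of 0/1/2 values,
    as in Solution.sortColors above."""
    left, middle, right = 0, 0, len(nums) - 1
    while middle <= right:
        if nums[middle] == 0:
            # Move 0 into the left region.
            nums[left], nums[middle] = nums[middle], nums[left]
            left += 1
            middle += 1
        elif nums[middle] == 2:
            # Move 2 into the right region; do not advance middle,
            # since the swapped-in value is still unexamined.
            nums[right], nums[middle] = nums[middle], nums[right]
            right -= 1
        else:
            middle += 1

colors = [2, 0, 1, 2, 1, 0]
sort_colors(colors)
print(colors)  # → [0, 0, 1, 1, 2, 2]
```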
class SockCreator(BaseComponent): <NEW_LINE> <INDENT> def __init__(self, process, b10_init, kind, address=None, params=None): <NEW_LINE> <INDENT> BaseComponent.__init__(self, b10_init, kind) <NEW_LINE> self.__creator = None <NEW_LINE> <DEDENT> def _start_internal(self): <NEW_LINE> <INDENT> self._b10_init.curproc = 'b10-sockcreator' <NEW_LINE> self.__creator = isc.bind10.sockcreator.Creator(LIBEXECPATH + ':' + os.environ['PATH']) <NEW_LINE> self._b10_init.register_process(self.pid(), self) <NEW_LINE> self._b10_init.set_creator(self.__creator) <NEW_LINE> self._b10_init.log_started(self.pid()) <NEW_LINE> self._b10_init.change_user() <NEW_LINE> <DEDENT> def _stop_internal(self): <NEW_LINE> <INDENT> self.__creator.terminate() <NEW_LINE> <DEDENT> def name(self): <NEW_LINE> <INDENT> return "Socket creator" <NEW_LINE> <DEDENT> def pid(self): <NEW_LINE> <INDENT> return self.__creator.pid() if self.__creator else None <NEW_LINE> <DEDENT> def kill(self, forceful=False): <NEW_LINE> <INDENT> if self.__creator: <NEW_LINE> <INDENT> self.__creator.kill()
The socket creator component. Will start and stop the socket creator accordingly. Note: _creator shouldn't be reset explicitly once created. The underlying Popen object would then wait() the child process internally, which breaks the assumption of b10-init, who is expecting to see the process die in waitpid().
6259907c32920d7e50bc7a6b
class Track(models.Model): <NEW_LINE> <INDENT> mail = models.ForeignKey(Mail) <NEW_LINE> staff = models.ForeignKey(Staff) <NEW_LINE> start_date = models.DateTimeField(auto_now_add=True, blank=True) <NEW_LINE> hard_copy_transfer_time = models.DateTimeField(null=True) <NEW_LINE> purpose = models.CharField(max_length=300) <NEW_LINE> end_date = models.DateTimeField(blank=True, null=True) <NEW_LINE> soft_copy = models.FileField(upload_to ='uploads/', null=True) <NEW_LINE> def __unicode__(self): <NEW_LINE> <INDENT> return "{0} [Registered {1}] - {2} - Start {3} - End {4} - Mail id {5} - Staff id {6}".format(self.mail.sender.first_name, self.mail.received_time, self.staff.first_name, self.start_date, self.end_date, self.mail.id, self.staff.id)
In this model, we record every place a mail went through.
6259907cbf627c535bcb2ef9
class MarkdownStoryExporter(interface.StoryExporter): <NEW_LINE> <INDENT> EXPORT_FORMAT = 'markdown' <NEW_LINE> _DATAFRAM_HEADER_ROWS = 20 <NEW_LINE> _DATAFRAM_TAIL_ROWS = 5 <NEW_LINE> def _dataframe_to_markdown(self, data_frame): <NEW_LINE> <INDENT> nr_rows, _ = data_frame.shape <NEW_LINE> if not nr_rows: <NEW_LINE> <INDENT> return '*<empty table>*' <NEW_LINE> <DEDENT> if nr_rows <= (self._DATAFRAM_HEADER_ROWS + self._DATAFRAM_TAIL_ROWS): <NEW_LINE> <INDENT> return tabulate.tabulate( data_frame, tablefmt='pipe', headers='keys') <NEW_LINE> <DEDENT> return_lines = [] <NEW_LINE> return_lines.append(tabulate.tabulate( data_frame[:self._DATAFRAM_HEADER_ROWS], tablefmt='pipe', headers='keys')) <NEW_LINE> return_lines.append('| ... |') <NEW_LINE> return_lines.append(tabulate.tabulate( data_frame[-self._DATAFRAM_TAIL_ROWS:], tablefmt='pipe', headers='keys')) <NEW_LINE> return '\n'.join(return_lines) <NEW_LINE> <DEDENT> def export_story(self): <NEW_LINE> <INDENT> return_strings = [] <NEW_LINE> for line_dict in self._data_lines: <NEW_LINE> <INDENT> line_type = line_dict.get('type', '') <NEW_LINE> if line_type == 'text': <NEW_LINE> <INDENT> return_strings.append(line_dict.get('value', '')) <NEW_LINE> <DEDENT> elif line_type == 'aggregation': <NEW_LINE> <INDENT> aggregation_data = line_dict.get('value') <NEW_LINE> aggregation = aggregation_data.get('aggregation') <NEW_LINE> if not aggregation: <NEW_LINE> <INDENT> return_strings.append( '**Unable to fetch aggregation data**') <NEW_LINE> continue <NEW_LINE> <DEDENT> return_strings.append( self._dataframe_to_markdown(aggregation.to_pandas())) <NEW_LINE> <DEDENT> elif line_type == 'dataframe': <NEW_LINE> <INDENT> return_strings.append( self._dataframe_to_markdown(line_dict.get('value'))) <NEW_LINE> <DEDENT> elif line_type == 'chart': <NEW_LINE> <INDENT> return_strings.append( '*<unable_to_display_chart_objects>*') <NEW_LINE> <DEDENT> <DEDENT> return '\n\n'.join(return_strings)
Markdown story exporter.
6259907c4a966d76dd5f090f
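`MarkdownStoryExporter._dataframe_to_markdown` renders short tables whole and truncates long ones to a head, a `| ... |` marker, and a tail (via `tabulate`). A dependency-free sketch of that truncation scheme, building the pipe table by hand; `frame_to_markdown` and its row counts are illustrative, not from the source:

```python
def frame_to_markdown(rows, header, head=3, tail=2):
    """Render rows as a markdown pipe table; long tables are cut to
    the first `head` and last `tail` rows, separated by '| ... |'."""
    def render(subset):
        lines = ["| " + " | ".join(header) + " |",
                 "|" + "|".join("---" for _ in header) + "|"]
        lines += ["| " + " | ".join(str(c) for c in row) + " |"
                  for row in subset]
        return "\n".join(lines)

    if not rows:
        return "*<empty table>*"
    if len(rows) <= head + tail:
        return render(rows)
    return "\n".join([render(rows[:head]), "| ... |", render(rows[-tail:])])

print(frame_to_markdown([[i, i * i] for i in range(10)], ["n", "n^2"]))
```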
class Convertisseur: <NEW_LINE> <INDENT> def depuis_version_0(objet, classe): <NEW_LINE> <INDENT> objet.set_version(classe, 1) <NEW_LINE> objet.cle = supprimer_accents(objet._titre) <NEW_LINE> objet.titre = objet.resume <NEW_LINE> del objet._titre <NEW_LINE> del objet.resume <NEW_LINE> objet.contenu.scriptable = False
Wrapper class for the converters.
6259907c5fc7496912d48f7f
class AppBasePage(object): <NEW_LINE> <INDENT> def __init__(self, appium_driver): <NEW_LINE> <INDENT> self.driver = appium_driver <NEW_LINE> By.IOS_UIAUTOMATION = MobileBy.IOS_UIAUTOMATION <NEW_LINE> By.IOS_PREDICATE = MobileBy.IOS_PREDICATE <NEW_LINE> By.IOS_CLASS_CHAIN = MobileBy.IOS_CLASS_CHAIN <NEW_LINE> By.ANDROID_UIAUTOMATOR = MobileBy.ANDROID_UIAUTOMATOR <NEW_LINE> By.ACCESSIBILITY_ID = MobileBy.ACCESSIBILITY_ID
BasePage encapsulates the methods shared by all pages, e.g. driver, url, FindElement, etc.
6259907c167d2b6e312b82a8
class QRTComponentType(Enum): <NEW_LINE> <INDENT> Component3d = 1 <NEW_LINE> Component3dNoLabels = 2 <NEW_LINE> ComponentAnalog = 3 <NEW_LINE> ComponentForce = 4 <NEW_LINE> Component6d = 5 <NEW_LINE> Component6dEuler = 6 <NEW_LINE> Component2d = 7 <NEW_LINE> Component2dLin = 8 <NEW_LINE> Component3dRes = 9 <NEW_LINE> Component3dNoLabelsRes = 10 <NEW_LINE> Component6dRes = 11 <NEW_LINE> Component6dEulerRes = 12 <NEW_LINE> ComponentAnalogSingle = 13 <NEW_LINE> ComponentImage = 14 <NEW_LINE> ComponentForceSingle = 15 <NEW_LINE> ComponentGazeVector = 16 <NEW_LINE> ComponentTimecode = 17 <NEW_LINE> ComponentSkeleton = 18 <NEW_LINE> ComponentEyeTracker = 19
QTM Component types
6259907c4527f215b58eb6b5
class DecoderNet2d(_DecoderNetNd): <NEW_LINE> <INDENT> def __init__(self, channel, layers, out_size, kernel_size=3, in_length=2, out_planes=1): <NEW_LINE> <INDENT> super().__init__(2, channel=channel, layers=layers, out_size=out_size, kernel_size=kernel_size, in_length=in_length, out_planes=out_planes)
2D convolutional up-scale (decoder) network. This module is a built-in convolutional network model. The network can be used for up-scaling or generating samples. It up-samples the input data according to the network depth, which is given by the length of the argument "layers". Unlike the encoder network, this module requires the output shape; the input shape is inferred from the given output shape.
6259907c63b5f9789fe86b91
class RHContactEditingTeam(RHEditableTypeManagementBase): <NEW_LINE> <INDENT> def _process(self): <NEW_LINE> <INDENT> editors = get_editors(self.event, self.editable_type) <NEW_LINE> return jsonify_template('events/editing/management/editor_list.html', event_persons=editors, event=self.event)
Send emails to editing team.
6259907c009cb60464d02f69
class CollectedLinks: <NEW_LINE> <INDENT> def __init__( self, files, find_links, project_urls, ): <NEW_LINE> <INDENT> self.files = files <NEW_LINE> self.find_links = find_links <NEW_LINE> self.project_urls = project_urls
Encapsulates the return value of a call to LinkCollector.collect_links(). The return value includes both URLs to project pages containing package links, as well as individual package Link objects collected from other sources. This info is stored separately as: (1) links from the configured file locations, (2) links from the configured find_links, and (3) urls to HTML project pages, as described by the PEP 503 simple repository API.
6259907ce1aae11d1e7cf526
class StopSync(Exception): <NEW_LINE> <INDENT> def __init__(self, step=0): <NEW_LINE> <INDENT> self.step = step <NEW_LINE> Exception.__init__(self, "Synchronization aborted at step %s" % self.step )
Raised by the syncworker to tell the syncmanager to stop
6259907c091ae35668706669
class MNISTPlus(dense_design_matrix.DenseDesignMatrix): <NEW_LINE> <INDENT> idx = {'train': slice(0,50000), 'valid': slice(50000,60000), 'test': slice(60000,70000)} <NEW_LINE> def __init__(self, which_set, label_type=None, azimuth=False, rotation=False, texture=False, center = False, contrast_normalize=False, seed=132987): <NEW_LINE> <INDENT> assert which_set in ['train','valid','test'] <NEW_LINE> assert label_type in [None,'label','azimuth','rotation','texture_id'] <NEW_LINE> fname = '${PYLEARN2_DATA_PATH}/mnistplus/mnistplus' <NEW_LINE> if azimuth: <NEW_LINE> <INDENT> fname += '_azi' <NEW_LINE> <DEDENT> if rotation: <NEW_LINE> <INDENT> fname += '_rot' <NEW_LINE> <DEDENT> if texture: <NEW_LINE> <INDENT> fname += '_tex' <NEW_LINE> <DEDENT> data = load(fname + '.pkl') <NEW_LINE> data_x = np.cast[config.floatX](data['data']) <NEW_LINE> data_x = data_x[MNISTPlus.idx[which_set]] <NEW_LINE> if contrast_normalize: <NEW_LINE> <INDENT> meanx = np.mean(data_x, axis=1)[:,None] <NEW_LINE> stdx = np.std(data_x, axis=1)[:,None] <NEW_LINE> data_x = (data_x - meanx) / stdx <NEW_LINE> <DEDENT> if center: <NEW_LINE> <INDENT> data_x -= np.mean(data_x, axis=0) <NEW_LINE> <DEDENT> data_y = None <NEW_LINE> if label_type is not None: <NEW_LINE> <INDENT> data_y = data[label_type] <NEW_LINE> if label_type in ['azimuth','rotation']: <NEW_LINE> <INDENT> data_y = np.cast[config.floatX](data_y / 360.) <NEW_LINE> <DEDENT> data_y = data_y[MNISTPlus.idx[which_set]] <NEW_LINE> <DEDENT> view_converter = dense_design_matrix.DefaultViewConverter((48, 48)) <NEW_LINE> super(MNISTPlus, self).__init__(X = data_x, y = data_y, view_converter = view_converter) <NEW_LINE> assert not np.any(np.isnan(self.X))
Pylearn2 wrapper for the MNIST-Plus dataset. Parameters ---------- which_set : WRITEME Dataset to load. One of ['train','valid','test']. label_type : WRITEME String specifies which contents of dictionary are used as "labels" azimuth : WRITEME Load version where lighting is a factor of variation rotation : WRITEME Load version where MNIST digits are rotated texture : WRITEME Load version where MNIST is jointly embossed on a textured background. center : WRITEME If True, remove mean (across examples) for each pixel contrast_normalize : WRITEME If True, for each image, remove mean and divide by standard deviation.
6259907c26068e7796d4e36a
class TestingConfig(Config): <NEW_LINE> <INDENT> ENV = 'testing' <NEW_LINE> DEBUG = True <NEW_LINE> TESTING = True <NEW_LINE> SQLALCHEMY_DATABASE_URI = os.getenv('TEST_DATABASE_URL') <NEW_LINE> MAIL_SUPPRESS_SEND = False <NEW_LINE> ACTIVATION_TOKEN_EXPIRY_SECONDS = 1
Configurations for Testing, with a separate test database.
6259907c442bda511e95da6d
class Test(models.Model): <NEW_LINE> <INDENT> name = models.CharField(max_length=50) <NEW_LINE> results = models.TextField() <NEW_LINE> release = models.BooleanField(default=False) <NEW_LINE> testPatient = models.ForeignKey(Patient, on_delete=models.CASCADE) <NEW_LINE> testDoctor = models.ForeignKey(Doctor, on_delete=models.CASCADE) <NEW_LINE> testDate = models.DateTimeField(default=django.utils.timezone.now) <NEW_LINE> class Meta: <NEW_LINE> <INDENT> verbose_name = 'Test' <NEW_LINE> verbose_name_plural = 'Tests'
Tests to release to patients
6259907cbe7bc26dc9252b6b
class ThreatIndicator(object): <NEW_LINE> <INDENT> ADDED_ON = Common.ADDED_ON <NEW_LINE> CONFIDENCE = 'confidence' <NEW_LINE> DESCRIPTION = 'description' <NEW_LINE> EXPIRED_ON = 'expired_on' <NEW_LINE> ID = Common.ID <NEW_LINE> INDICATOR = 'indicator' <NEW_LINE> METADATA = Common.METADATA <NEW_LINE> PASSWORDS = 'passwords' <NEW_LINE> PRIVACY_TYPE = 'privacy_type' <NEW_LINE> PRIVACY_MEMBERS = 'privacy_members' <NEW_LINE> REPORT_URLS = 'report_urls' <NEW_LINE> SEVERITY = 'severity' <NEW_LINE> SHARE_LEVEL = Common.SHARE_LEVEL <NEW_LINE> STATUS = Common.STATUS <NEW_LINE> SUBMITTER_COUNT = Common.SUBMITTER_COUNT <NEW_LINE> THREAT_TYPE = 'threat_type' <NEW_LINE> THREAT_TYPES = 'threat_types' <NEW_LINE> TYPE = 'type'
Vocabulary specific to searching for, adding, or modifying a Threat Indicator object.
6259907c4f6381625f19a1c3
class ExcelResponse(HttpResponse): <NEW_LINE> <INDENT> def export_excel(self, file_name, head_data=[], content_data=[]): <NEW_LINE> <INDENT> wb = Workbook() <NEW_LINE> wb.encoding = 'utf-8' <NEW_LINE> font = Font(u'DengXian', size=14, bold=True, color='000000') <NEW_LINE> body_font = Font(u'DengXian', size=14, bold=False, color='000000') <NEW_LINE> alignment = Alignment(horizontal='center', vertical='center') <NEW_LINE> fill = PatternFill("solid", fgColor="d1cbcb") <NEW_LINE> sheet1 = wb.active <NEW_LINE> for i in range(1, len(head_data)+1): <NEW_LINE> <INDENT> sheet1.cell(row=1, column=i).value = head_data[i - 1] <NEW_LINE> sheet1.cell(row=1, column=i).font = font <NEW_LINE> sheet1.cell(row=1, column=i).alignment = alignment <NEW_LINE> <DEDENT> sheet1.row_dimensions[1].height = 30 <NEW_LINE> for row in sheet1.rows: <NEW_LINE> <INDENT> for cell in row: <NEW_LINE> <INDENT> sheet1[cell.coordinate].fill = fill <NEW_LINE> colum_name = cell.coordinate[:-1] <NEW_LINE> sheet1.column_dimensions[colum_name].width = 20 <NEW_LINE> <DEDENT> <DEDENT> for obj in content_data: <NEW_LINE> <INDENT> max_row = sheet1.max_row + 1 <NEW_LINE> for x in range(1, len(obj)+1): <NEW_LINE> <INDENT> sheet1.cell(row=max_row, column=x).value = obj[x-1] <NEW_LINE> sheet1.cell(row=max_row, column=x).font = body_font <NEW_LINE> sheet1.cell(row=max_row, column=x).alignment = alignment <NEW_LINE> <DEDENT> <DEDENT> output = BytesIO() <NEW_LINE> wb.save(output) <NEW_LINE> output.seek(0) <NEW_LINE> response = HttpResponse(output.getvalue(), content_type='application/vnd.ms-excel') <NEW_LINE> file_name = '%s.xls' % file_name <NEW_LINE> file_name = urlquote(file_name) <NEW_LINE> response['Content-Disposition'] = 'attachment; filename=%s' % file_name <NEW_LINE> return response
Excel file export. Supports xls and csv file formats; supports exporting multiple sheets.
6259907c8a349b6b43687c87
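The docstring above mentions csv support, but the class only writes Excel workbooks. A minimal stdlib sketch of what the csv variant might look like, using the same head/content split; `export_csv` is a hypothetical name, not part of the original class:

```python
import csv
import io

def export_csv(head_data, content_data):
    # Render one header row followed by the content rows as CSV text.
    output = io.StringIO()
    writer = csv.writer(output)
    writer.writerow(head_data)
    writer.writerows(content_data)
    return output.getvalue()

print(export_csv(['name', 'age'], [['alice', 30], ['bob', 25]]))
```

In a Django view this string would be wrapped in an `HttpResponse` with `content_type='text/csv'`, mirroring how `export_excel` wraps the workbook bytes.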
@generate_repr("key", "value", "origin", "from_cli") <NEW_LINE> class Setting(StringConverter): <NEW_LINE> <INDENT> def __init__(self, key, value, origin="", strip_whitespaces=True, list_delimiters=(",", ";"), from_cli=False): <NEW_LINE> <INDENT> if not isinstance(from_cli, bool): <NEW_LINE> <INDENT> raise TypeError("from_cli needs to be a boolean value.") <NEW_LINE> <DEDENT> StringConverter.__init__(self, value, strip_whitespaces=strip_whitespaces, list_delimiters=list_delimiters) <NEW_LINE> self.from_cli = from_cli <NEW_LINE> self.key = key <NEW_LINE> self.origin = str(origin) <NEW_LINE> <DEDENT> def __path__(self, origin=None): <NEW_LINE> <INDENT> strrep = str(self).strip() <NEW_LINE> if os.path.isabs(strrep): <NEW_LINE> <INDENT> return strrep <NEW_LINE> <DEDENT> if hasattr(self, "origin") and self.origin != "": <NEW_LINE> <INDENT> origin = self.origin <NEW_LINE> <DEDENT> if origin is None: <NEW_LINE> <INDENT> raise ValueError("Cannot determine path without origin.") <NEW_LINE> <DEDENT> return os.path.abspath(os.path.join(os.path.dirname(origin), strrep)) <NEW_LINE> <DEDENT> def __path_list__(self): <NEW_LINE> <INDENT> return [Setting.__path__(elem, self.origin) for elem in self] <NEW_LINE> <DEDENT> @property <NEW_LINE> def key(self): <NEW_LINE> <INDENT> return self._key <NEW_LINE> <DEDENT> @key.setter <NEW_LINE> def key(self, key): <NEW_LINE> <INDENT> newkey = str(key) <NEW_LINE> if newkey == "": <NEW_LINE> <INDENT> raise ValueError("An empty key is not allowed for a setting.") <NEW_LINE> <DEDENT> self._key = newkey
A Setting consists mainly of a key and a value, and offers conversions into many common data types.
6259907c3d592f4c4edbc874
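The `__path__` conversion above resolves a setting value to an absolute path relative to the file the setting came from. A self-contained sketch of that logic, with `resolve_path` as an assumed helper name:

```python
import os

def resolve_path(value, origin):
    # Absolute values pass through; relative values are resolved
    # against the directory of the file that defined the setting.
    value = value.strip()
    if os.path.isabs(value):
        return value
    return os.path.abspath(os.path.join(os.path.dirname(origin), value))

print(resolve_path('data.txt', '/etc/app/config.ini'))  # /etc/app/data.txt
```

This mirrors why `Setting.__path__` raises `ValueError` when no origin is known: a relative path is meaningless without a base directory.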
class FrontDoorNameAvailabilityOperations: <NEW_LINE> <INDENT> models = _models <NEW_LINE> def __init__(self, client, config, serializer, deserializer) -> None: <NEW_LINE> <INDENT> self._client = client <NEW_LINE> self._serialize = serializer <NEW_LINE> self._deserialize = deserializer <NEW_LINE> self._config = config <NEW_LINE> <DEDENT> async def check( self, check_front_door_name_availability_input: "_models.CheckNameAvailabilityInput", **kwargs ) -> "_models.CheckNameAvailabilityOutput": <NEW_LINE> <INDENT> cls = kwargs.pop('cls', None) <NEW_LINE> error_map = { 401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError } <NEW_LINE> error_map.update(kwargs.pop('error_map', {})) <NEW_LINE> api_version = "2020-05-01" <NEW_LINE> content_type = kwargs.pop("content_type", "application/json") <NEW_LINE> accept = "application/json" <NEW_LINE> url = self.check.metadata['url'] <NEW_LINE> query_parameters = {} <NEW_LINE> query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str') <NEW_LINE> header_parameters = {} <NEW_LINE> header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str') <NEW_LINE> header_parameters['Accept'] = self._serialize.header("accept", accept, 'str') <NEW_LINE> body_content_kwargs = {} <NEW_LINE> body_content = self._serialize.body(check_front_door_name_availability_input, 'CheckNameAvailabilityInput') <NEW_LINE> body_content_kwargs['content'] = body_content <NEW_LINE> request = self._client.post(url, query_parameters, header_parameters, **body_content_kwargs) <NEW_LINE> pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs) <NEW_LINE> response = pipeline_response.http_response <NEW_LINE> if response.status_code not in [200]: <NEW_LINE> <INDENT> map_error(status_code=response.status_code, response=response, error_map=error_map) <NEW_LINE> error = self._deserialize(_models.ErrorResponse, response) <NEW_LINE> raise 
HttpResponseError(response=response, model=error, error_format=ARMErrorFormat) <NEW_LINE> <DEDENT> deserialized = self._deserialize('CheckNameAvailabilityOutput', pipeline_response) <NEW_LINE> if cls: <NEW_LINE> <INDENT> return cls(pipeline_response, deserialized, {}) <NEW_LINE> <DEDENT> return deserialized <NEW_LINE> <DEDENT> check.metadata = {'url': '/providers/Microsoft.Network/checkFrontDoorNameAvailability'}
FrontDoorNameAvailabilityOperations async operations. You should not instantiate this class directly. Instead, you should create a Client instance that instantiates it for you and attaches it as an attribute. :ivar models: Alias to model classes used in this operation group. :type models: ~azure.mgmt.frontdoor.models :param client: Client for service requests. :param config: Configuration of service client. :param serializer: An object model serializer. :param deserializer: An object model deserializer.
6259907c3617ad0b5ee07b79
class PublishedManager(models.Manager): <NEW_LINE> <INDENT> def get_queryset(self): <NEW_LINE> <INDENT> return super(PublishedManager, self).get_queryset().filter(publication_status='P', publication_date__lte=datetime.datetime.now()).order_by('-publication_date')
Published post manager.
6259907cdc8b845886d54fe7
class gRPCMdtDialout(object): <NEW_LINE> <INDENT> @staticmethod <NEW_LINE> def MdtDialout(request_iterator, target, options=(), channel_credentials=None, call_credentials=None, compression=None, wait_for_ready=None, timeout=None, metadata=None): <NEW_LINE> <INDENT> return grpc.experimental.stream_stream(request_iterator, target, '/mdt_dialout.gRPCMdtDialout/MdtDialout', mdt__dialout__pb2.MdtDialoutArgs.SerializeToString, mdt__dialout__pb2.MdtDialoutArgs.FromString, options, channel_credentials, call_credentials, compression, wait_for_ready, timeout, metadata)
Missing associated documentation comment in .proto file.
6259907c1f5feb6acb164624
class SbvHedVob3(Class3Graph1): <NEW_LINE> <INDENT> template_name = 'SbvHedVob3' <NEW_LINE> words_involved = {0: ('ANY', 'SBV'), 1: ('ANY', 'ATT'), 2: ('ANY', 'VOB'), 3: ('ANY', 'HED')} <NEW_LINE> relations_involved = [(3, 1), (1, 0), (1, 2)] <NEW_LINE> targets_involved = [0, 1, 2, 3] <NEW_LINE> def target(self, rule_target, mapping, words, arcs): <NEW_LINE> <INDENT> if rule_target: <NEW_LINE> <INDENT> goal = [OneTerm(mapping[node], words, arcs).get_self() for node in self.targets_involved] <NEW_LINE> return goal <NEW_LINE> <DEDENT> <DEDENT> @classmethod <NEW_LINE> def generate(cls, mapping, words, arcs, results): <NEW_LINE> <INDENT> temp = [OneTerm(mapping[node], words, arcs).get_content() for node in cls.targets_involved] <NEW_LINE> goal = [temp[3], '是', ': '] <NEW_LINE> result = '; '.join(results) <NEW_LINE> return ''.join(goal) + result
The start time of Maradona coaching the Argentina team; a question pattern of the form "the xx of xx doing xx".
6259907c3539df3088ecdcc4
class FakeSNMPDriver(base.BaseDriver): <NEW_LINE> <INDENT> def __init__(self): <NEW_LINE> <INDENT> if not importutils.try_import('pysnmp'): <NEW_LINE> <INDENT> raise exception.DriverLoadError( driver=self.__class__.__name__, reason=_("Unable to import pysnmp library")) <NEW_LINE> <DEDENT> self.power = snmp.SNMPPower() <NEW_LINE> self.deploy = fake.FakeDeploy()
Fake SNMP driver.
6259907c5fdd1c0f98e5f9ac
class tile(object): <NEW_LINE> <INDENT> def __init__(self, screen_pos, grid_pos, terrain_type, tile_surface = None, has_menu = False, sprite_list = None, building_list = None): <NEW_LINE> <INDENT> self.screen_pos = screen_pos <NEW_LINE> self.grid_pos = grid_pos <NEW_LINE> self.terrain_type = terrain_type <NEW_LINE> self.surface = tile_surface if tile_surface is not None else pygame.Surface((2, 2)) <NEW_LINE> self.has_menu = has_menu <NEW_LINE> self.sprite_list = sprite_list if sprite_list is not None else [] <NEW_LINE> self.building_list = building_list if building_list is not None else [] <NEW_LINE> <DEDENT> def get_colour(self): <NEW_LINE> <INDENT> if self.terrain_type == 0: <NEW_LINE> <INDENT> colour = BLUE <NEW_LINE> <DEDENT> elif self.terrain_type == 1: <NEW_LINE> <INDENT> colour = OLIVE <NEW_LINE> <DEDENT> elif self.terrain_type == 2: <NEW_LINE> <INDENT> colour = GREY <NEW_LINE> <DEDENT> return colour <NEW_LINE> <DEDENT> def draw_tile(self, screen, tile_width, tile_height, line_width, line_colour): <NEW_LINE> <INDENT> self.surface = pygame.Surface((tile_width, tile_height)) <NEW_LINE> colour = self.get_colour() <NEW_LINE> pygame.draw.rect(self.surface, colour, [0, 0, tile_width, tile_height]) <NEW_LINE> pointlist = [[0 ,0], [tile_width, 0], [tile_width, tile_height], [0, tile_height], [0, 0]] <NEW_LINE> pygame.draw.lines(self.surface, line_colour, False, pointlist, line_width) <NEW_LINE> for sprite in self.sprite_list: <NEW_LINE> <INDENT> sprite.draw_sprite() <NEW_LINE> <DEDENT> screen.blit(self.surface, self.screen_pos)
For map tiles; contains everything to do with a tile: resources, sprites, etc.
6259907ca05bb46b3848be3f
class ExprLT(ExprOperator): <NEW_LINE> <INDENT> def __init__(self, *args, **kwargs): <NEW_LINE> <INDENT> super(ExprLT, self).__init__(*args, **kwargs) <NEW_LINE> <DEDENT> def Apply(self, value, operand): <NEW_LINE> <INDENT> return value < operand
LT node.
6259907c5fc7496912d48f81
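The `ExprOperator` base class is not shown in this entry. A minimal sketch of how such an operator node is typically evaluated, assuming a constructor that stores the operand (names other than `Apply` are assumptions):

```python
class ExprOperator:
    # Hypothetical base: holds the right-hand operand and delegates to Apply.
    def __init__(self, operand):
        self.operand = operand

    def evaluate(self, value):
        return self.Apply(value, self.operand)

class ExprLT(ExprOperator):
    # LT node: true when the incoming value is less than the operand.
    def Apply(self, value, operand):
        return value < operand

print(ExprLT(5).evaluate(3))  # True, since 3 < 5
```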
class PythonDropSource(wx.DropSource): <NEW_LINE> <INDENT> def __init__(self, source, data, handler=None, allow_move=True): <NEW_LINE> <INDENT> self.handler = handler <NEW_LINE> self.allow_move = allow_move <NEW_LINE> clipboard.data = data <NEW_LINE> clipboard.source = source <NEW_LINE> clipboard.drop_source = self <NEW_LINE> data_object = wx.CustomDataObject(PythonObject) <NEW_LINE> data_object.SetData(b"dummy") <NEW_LINE> wx.DropSource.__init__(self, source) <NEW_LINE> self.SetData(data_object) <NEW_LINE> if allow_move: <NEW_LINE> <INDENT> flags = wx.Drag_DefaultMove | wx.Drag_AllowMove <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> flags = wx.Drag_CopyOnly <NEW_LINE> <DEDENT> self.result = self.DoDragDrop(flags) <NEW_LINE> <DEDENT> def on_dropped(self, drag_result): <NEW_LINE> <INDENT> if self.handler is not None: <NEW_LINE> <INDENT> if hasattr(self.handler, "on_dropped"): <NEW_LINE> <INDENT> args = getfullargspec(self.handler.on_dropped)[0] <NEW_LINE> if len(args) == 3: <NEW_LINE> <INDENT> self.handler.on_dropped(clipboard.data, drag_result) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> self.handler.on_dropped() <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> args = getfullargspec(self.handler)[0] <NEW_LINE> if len(args) == 2: <NEW_LINE> <INDENT> self.handler(clipboard.data, drag_result) <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> self.handler()
Drop source for Python objects.
6259907cfff4ab517ebcf245
class Rescale(object): <NEW_LINE> <INDENT> def __init__(self, output_size): <NEW_LINE> <INDENT> assert isinstance(output_size, (int, tuple)) <NEW_LINE> self.output_size = output_size <NEW_LINE> <DEDENT> def __call__(self, sample): <NEW_LINE> <INDENT> image, landmarks = sample['image'], sample['landmarks'] <NEW_LINE> h, w = image.shape[:2] <NEW_LINE> if isinstance(self.output_size, int): <NEW_LINE> <INDENT> if h > w: <NEW_LINE> <INDENT> new_h, new_w = self.output_size * h / w, self.output_size <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> new_h, new_w = self.output_size, self.output_size * w / h <NEW_LINE> <DEDENT> <DEDENT> else: <NEW_LINE> <INDENT> new_h, new_w = self.output_size <NEW_LINE> <DEDENT> new_h, new_w = int(new_h), int(new_w) <NEW_LINE> img = transform.resize(image, (new_h, new_w)) <NEW_LINE> landmarks = landmarks * [new_w / w, new_h / h] <NEW_LINE> return {'image': img, 'landmarks': landmarks}
Rescale the image in a sample to a given size. Args: output_size (tuple or int): Desired output size. If tuple, output is matched to output_size. If int, smaller of image edges is matched to output_size keeping aspect ratio the same.
6259907c796e427e538501a8
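The size computation in `Rescale.__call__` can be isolated as a pure function (the `rescaled_size` name is ours, for illustration): when `output_size` is an int, the shorter image edge is matched to it and the aspect ratio is preserved.

```python
def rescaled_size(h, w, output_size):
    # Reproduces the (new_h, new_w) arithmetic of the Rescale transform.
    if isinstance(output_size, int):
        if h > w:
            new_h, new_w = output_size * h / w, output_size
        else:
            new_h, new_w = output_size, output_size * w / h
    else:
        new_h, new_w = output_size
    return int(new_h), int(new_w)

print(rescaled_size(400, 200, 100))  # (200, 100): shorter edge -> 100, ratio kept
```

Landmarks are then scaled by `[new_w / w, new_h / h]` so they stay aligned with the resized image.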
class UserManager(object): <NEW_LINE> <INDENT> def __init__(self, path): <NEW_LINE> <INDENT> self.file = os.path.join(path, 'users.json') <NEW_LINE> <DEDENT> def read(self): <NEW_LINE> <INDENT> if not os.path.exists(self.file): <NEW_LINE> <INDENT> return {} <NEW_LINE> <DEDENT> with open(self.file) as f: <NEW_LINE> <INDENT> data = json.loads(f.read()) <NEW_LINE> <DEDENT> return data <NEW_LINE> <DEDENT> def write(self, data): <NEW_LINE> <INDENT> with open(self.file, 'w') as f: <NEW_LINE> <INDENT> f.write(json.dumps(data, indent=2)) <NEW_LINE> <DEDENT> <DEDENT> def add_user(self, name, password, time, active=True, roles=None, authentication_method=None): <NEW_LINE> <INDENT> if roles is None: <NEW_LINE> <INDENT> roles = [] <NEW_LINE> <DEDENT> users = self.read() <NEW_LINE> if users.get(name): <NEW_LINE> <INDENT> return False <NEW_LINE> <DEDENT> if authentication_method is None: <NEW_LINE> <INDENT> authentication_method = get_default_authentication_method() <NEW_LINE> <DEDENT> new_user = { 'active': active, 'roles': roles, 'time': time, 'authentication_method': authentication_method, 'authenticated': False } <NEW_LINE> if authentication_method == 'hash': <NEW_LINE> <INDENT> new_user['hash'] = make_salted_hash(password) <NEW_LINE> <DEDENT> elif authentication_method == 'cleartext': <NEW_LINE> <INDENT> new_user['password'] = password <NEW_LINE> <DEDENT> else: <NEW_LINE> <INDENT> raise NotImplementedError(authentication_method) <NEW_LINE> <DEDENT> users[name] = new_user <NEW_LINE> self.write(users) <NEW_LINE> userdata = users.get(name) <NEW_LINE> return User(self, name, userdata) <NEW_LINE> <DEDENT> def getUsers(self): <NEW_LINE> <INDENT> users = self.read() <NEW_LINE> userarray = [] <NEW_LINE> for i in users: <NEW_LINE> <INDENT> user = User(self, i, users.get(i)) <NEW_LINE> user.time = users.get(i).get("time") <NEW_LINE> userarray.append(user) <NEW_LINE> <DEDENT> return userarray <NEW_LINE> <DEDENT> def get_user(self, name): <NEW_LINE> <INDENT> users = self.read() <NEW_LINE> userdata = users.get(name) <NEW_LINE> if not userdata: <NEW_LINE>
<INDENT> return None <NEW_LINE> <DEDENT> return User(self, name, userdata) <NEW_LINE> <DEDENT> def delete_user(self, name): <NEW_LINE> <INDENT> users = self.read() <NEW_LINE> if not users.pop(name, False): <NEW_LINE> <INDENT> return False <NEW_LINE> <DEDENT> self.write(users) <NEW_LINE> return True <NEW_LINE> <DEDENT> def update(self, name, userdata): <NEW_LINE> <INDENT> data = self.read() <NEW_LINE> data[name] = userdata <NEW_LINE> self.write(data)
A very simple user manager that saves its data as JSON.
6259907ca8370b77170f1dfd
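The storage pattern above is simply "one JSON file as the whole user database": every operation reads the full file, mutates the dict, and writes it back. A self-contained sketch of that read/write core (the `JsonStore` name is hypothetical; the real class also wraps records in `User` objects):

```python
import json
import os
import tempfile

class JsonStore:
    # Minimal version of UserManager's persistence: a single users.json file.
    def __init__(self, path):
        self.file = os.path.join(path, 'users.json')

    def read(self):
        # A missing file means an empty database rather than an error.
        if not os.path.exists(self.file):
            return {}
        with open(self.file) as f:
            return json.load(f)

    def write(self, data):
        with open(self.file, 'w') as f:
            json.dump(data, f, indent=2)

with tempfile.TemporaryDirectory() as d:
    store = JsonStore(d)
    store.write({'alice': {'active': True}})
    print(store.read())  # {'alice': {'active': True}}
```

The trade-off is simplicity over concurrency: two processes writing at once can lose updates, which is acceptable for the small single-process use this class targets.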
class RestrictedManager(models.Manager): <NEW_LINE> <INDENT> def get(self, *args, **kwargs): <NEW_LINE> <INDENT> if 'permission__user' in kwargs: <NEW_LINE> <INDENT> if getattr(kwargs['permission__user'], 'is_superuser'): <NEW_LINE> <INDENT> kwargs.pop('permission__user') <NEW_LINE> <DEDENT> <DEDENT> return super().get(*args, **kwargs) <NEW_LINE> <DEDENT> def get_queryset(self): <NEW_LINE> <INDENT> return RestrictedQuerySet(self.model, using=self._db)
Custom manager that returns a RestrictedQuerySet and skips the permission filter for superusers.
6259907c60cbc95b06365a84
class RsatDatabaseTest(unittest.TestCase): <NEW_LINE> <INDENT> def setUp(self): <NEW_LINE> <INDENT> if not os.path.exists('testcache'): <NEW_LINE> <INDENT> os.mkdir('testcache') <NEW_LINE> <DEDENT> self.database = rsat.RsatDatabase(RSAT_BASE_URL, 'testcache', 'Helicobacter_pylori_26695_uid57787', 85962) <NEW_LINE> <DEDENT> def tearDown(self): <NEW_LINE> <INDENT> if os.path.exists('testcache'): <NEW_LINE> <INDENT> shutil.rmtree('testcache') <NEW_LINE> <DEDENT> <DEDENT> def test_get_taxonomy_id(self): <NEW_LINE> <INDENT> self.assertEqual("85962", self.database.get_taxonomy_id('Helicobacter_pylori_26695_uid57787')) <NEW_LINE> <DEDENT> def test_get_features(self): <NEW_LINE> <INDENT> text = self.database.get_features('Helicobacter_pylori_26695_uid57787') <NEW_LINE> self.assertIsNotNone(text) <NEW_LINE> <DEDENT> def test_get_feature_names(self): <NEW_LINE> <INDENT> text = self.database.get_feature_names('Helicobacter_pylori_26695_uid57787') <NEW_LINE> self.assertIsNotNone(text)
Test class for RsatDatabase. These tests interact with a real web service and will therefore run relatively slowly. They should be run as part of an integration test suite, not the unit test suite. There is no real attempt to check the contents of the files; it is mainly a check for link availability.
6259907c5fdd1c0f98e5f9ad