import os
import shutil
import subprocess
import sys
from urllib.parse import urlparse

import feedparser
import newspaper
import requests
from ffmpy import FFmpeg
from gtts import gTTS

feed = feedparser.parse('http://feeds.feedburner.com/iolandachannel')
chapters = []

for entry in feed['entries']:
    title = entry['title']
    print(f'Parsing: {title}')
    news_link = entry['link']

    # Download and parse the article body.
    article = newspaper.Article(news_link)
    article.download()
    article.parse()

    media = article.top_img
    if not media:
        continue

    # Save the article's top image locally.
    images = []
    response = requests.get(media, stream=True)
    filename = os.path.basename(urlparse(media).path)
    filename = f'images/{filename}'
    with open(filename, 'wb') as out_file:
        shutil.copyfileobj(response.raw, out_file)
    images.append(filename)

    # Synthesize the article text to Portuguese speech.
    text = article.text
    tts = gTTS(text=text, lang='pt', slow=False)
    tts.save('article.mp3')

    # Combine the audio track with the looped image(s) into a video.
    inputs = {'article.mp3': None}
    for image in images:
        inputs[image] = '-loop 1 -r 1'
    ff = FFmpeg(inputs=inputs,
                outputs={'article.avi': '-y -acodec copy -shortest -qscale 5'})
    print(ff.cmd)
    ff.run()

    # Upload the rendered video to YouTube.
    command = f'youtube-upload article.avi --title "{title}"'
    command += f' --description "{text}\n\n'
    command += f'LINK PARA A NOTÍCIA ORIGINAL: {news_link}"'
    subprocess.call(command, shell=True)

    print(f'Article parsed: {title}')
    sys.exit()  # stop after the first successfully processed article

print("That's all folks!")
Learn with us the advantages and drawbacks of contact lenses. Contact lenses come in two types, soft and rigid, each with its own pros and cons, and there are also cases in which lenses cannot be worn at all. German contact-lens specialist Hans Joerg Atzlr says that rigid lenses arguably suit the eye better than soft ones: a rigid lens is smaller than a soft one, so the eye receives a better supply of oxygen while it is worn. Atzlr, a member of the German Association of Contact Lens Specialists in Friedberg, advises those who are free to choose to opt for rigid lenses. German ophthalmologist George Eckert, on the other hand, says he usually recommends soft contact lenses over rigid ones. Eckert, a member of the German Association of Ophthalmologists in Dusseldorf, explains that the reason is comfort: soft lenses feel noticeably more comfortable than rigid ones. Atzlr attributes this to the fact that people usually need a long time to get used to rigid lenses, which at first feel like a foreign body in the eye.
#!/usr/bin/env python
# coding: utf-8
# Copyright 2013 Abram Hindle
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# You can start this by executing it in python:
#     python server.py
#
# remember to:
#     pip install flask

import json

import flask
from flask import Flask, request, send_from_directory, url_for, redirect, jsonify

app = Flask(__name__)
app.debug = True

# An example world
# {
#    'a': {'x': 1, 'y': 2},
#    'b': {'x': 2, 'y': 3}
# }


class World:
    def __init__(self):
        self.clear()

    def update(self, entity, key, value):
        entry = self.space.get(entity, dict())
        entry[key] = value
        self.space[entity] = entry

    def set(self, entity, data):
        self.space[entity] = data

    def clear(self):
        self.space = dict()

    def get(self, entity):
        return self.space.get(entity, dict())

    def world(self):
        return self.space


# you can test your webservice from the commandline
# curl -v -H "Content-Type: application/json" -X PUT http://127.0.0.1:5000/entity/X -d '{"x":1,"y":1}'

myWorld = World()


# I give this to you, this is how you get the raw body/data portion of a
# post in flask; this should come with flask but whatever, it's not my project.
def flask_post_json(request):
    '''Ah the joys of frameworks! They do so much work for you that they
       get in the way of sane operation!'''
    if request.json is not None:
        return request.json
    elif request.data:
        return json.loads(request.data)
    else:
        return json.loads(list(request.form.keys())[0])


@app.route("/")
def hello():
    '''Return something coherent here: redirect to /static/index.html'''
    return send_from_directory('static', 'index.html')


@app.route("/entity/<entity>", methods=['POST', 'PUT'])
def update(entity):
    '''Update the entities via this interface'''
    # Fixed to use Hindle's JSON functions
    myData = flask_post_json(request)
    if request.method == "POST":
        myWorld.set(entity, myData)
    elif request.method == "PUT":
        for myKey, myValue in myData.items():
            myWorld.update(entity, myKey, myValue)
    # Return the entity rather than redirecting to a page
    return jsonify(myWorld.get(entity))


@app.route("/world", methods=['POST', 'GET'])
def world():
    '''Return the world'''
    if request.method == "POST":
        aWorld = flask_post_json(request)
        print(aWorld)
        return jsonify(myWorld.world())
    elif request.method == "GET":
        return jsonify(myWorld.world())


@app.route("/entity/<entity>")
def get_entity(entity):
    '''This is the GET version of the entity interface,
       return a representation of the entity'''
    return jsonify(**myWorld.get(entity))


@app.route("/clear", methods=['POST', 'GET'])
def clear():
    # Call the built-in reset
    myWorld.clear()
    return jsonify(myWorld.world())


if __name__ == "__main__":
    app.run()
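Beyond the curl one-liner in the comments, a short requests session exercises all the endpoints; this is a sketch assuming the server is running locally on Flask's default port, and the entity name "X" is arbitrary.

import requests

base = 'http://127.0.0.1:5000'

# Create an entity, then update a single key in place.
requests.post(f'{base}/entity/X', json={'x': 1, 'y': 1})
requests.put(f'{base}/entity/X', json={'y': 2})

print(requests.get(f'{base}/entity/X').json())   # {'x': 1, 'y': 2}
print(requests.get(f'{base}/world').json())      # {'X': {'x': 1, 'y': 2}}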
This week’s blockchain project of the week is the Cardano blockchain. Cardano is a blockchain technology platform capable of running the kinds of financial applications used every day by individuals, organizations and governments all around the world. The platform is being constructed in layers, which gives the system the flexibility to be maintained more easily and upgraded by way of soft forks. After the settlement layer that runs Ada is complete, a separate computing layer will be built to handle smart contracts, the digital legal agreements that will underpin future commerce and business. Cardano will also run decentralised applications, or dapps: services not controlled by any single party that instead operate on a blockchain. Cardano was created in 2017 by the blockchain development firm Input Output Hong Kong (IOHK), which is run by its CEO Charles Hoskinson, also a co-founder of BitShares, Ethereum and Ethereum Classic. The Cardano blockchain is written in the programming language Haskell. It is expected to run smart contracts alongside other blockchain features. The ADA cryptocurrency is traded on major crypto exchanges.
""" This file contains classes and functionto interact with qcow images servers """ import copy import logging import os import paramiko from common import PromotionError class QcowConnectionClient(object): """ Proxy class for client connection """ _log = logging.getLogger("promoter") def __init__(self, server_conf): self._host = server_conf['host'] self._user = server_conf['user'] self._client_type = server_conf['client'] self._keypath = server_conf['keypath'] self._client = os if self._client_type == "sftp": client = paramiko.SSHClient() client.load_system_host_keys() client.set_missing_host_key_policy(paramiko.WarningPolicy) keypath = os.path.expanduser(self._keypath) self.key = paramiko.rsakey.RSAKey(filename=keypath) self.kwargs = {} if self._user is not None: self.kwargs['username'] = self._user else: self.kwargs['username'] = os.environ.get("USER") self._log.debug("Connecting to %s as user %s", self._host, self._user) self.ssh_client = client def connect(self): if hasattr(self, 'ssh_client'): self.ssh_client.connect(self._host, pkey=self.key, **self.kwargs) self._client = self.ssh_client.open_sftp() def __getattr__(self, item): return getattr(self._client, item) def close(self): if self._client_type == "sftp": self._client.close() class QcowClient(object): """ This class interacts with qcow images servers """ log = logging.getLogger("promoter") def __init__(self, config): self.config = config self.git_root = self.config.git_root self.promote_script = os.path.join(self.git_root, 'ci-scripts', 'promote-images.sh') self.distro_name = self.config.distro_name self.distro_version = self.config.distro_version self.rollback_links = {} server_conf = self.config.overcloud_images.get('qcow_servers') self.user = server_conf['local']['user'] self.root = server_conf['local']['root'] self.host = server_conf['local']['host'] self.client = QcowConnectionClient(server_conf['local']) self.images_dir = os.path.join(self.root, config.distro, config.release, "rdo_trunk") def validate_qcows(self, dlrn_hash, name=None, assume_valid=False): """ Check we have the images dir in the server if name is specified, verify that name points to the hash - maybe qcow ran and failed Check at which point of qcow promotion we stopped 1) did we create a new symlink ? 2) did we create the previous symlink ? 3) are all the images uploaded correctly ? 
:param dlrn_hash: The hash to check :param name: The promotion name :param assume_valid: report everything worked unconditionally :return: A dict with result of the validation """ try: self.client.listdir(self.images_dir) self.client.chdir(self.images_dir) except EnvironmentError as ex: self.log.error("Qcow-client: Image root dir %s does not exist " "in the server, or is not accessible") self.log.exception(ex) raise results = { "hash_valid": False, "promotion_valid": False, "qcow_valid": False, "missing_qcows": copy.copy( self.config.overcloud_images['qcow_images']), "present_qcows": [], } stat = None images = None images_path = os.path.join(self.images_dir, dlrn_hash.full_hash) try: stat = self.client.stat(images_path) images = sorted(self.client.listdir(images_path)) except EnvironmentError: self.log.error("Images path for hash %s not present or " "accessible", dlrn_hash) if not images: self.log.error("No images found") if stat and images: results['hash_valid'] = True results['present_qcows'] = images results['missing_qcows'] = \ list(set(self.config.overcloud_images[ 'qcow_images']).difference( images)) if images == self.config.overcloud_images['qcow_images']: results['qcow_valid'] = True if name is not None: try: link = self.client.readlink(name) if link == dlrn_hash.full_hash: results['promotion_valid'] = True except EnvironmentError: self.log.error("%s was not promoted to %s", dlrn_hash.full_hash, name) return results def rollback(self): """ Rolls back the link to the initial status Rollback is guaranteed to work only for caught exceptions, and it may not be really useful. We have a rollback only if a remove or a symlink fails. - If a remove fails, it means that we don't need to rollback - If a symlink fails, then it will probably fail on rollback too. :return: None """ for name, target in self.rollback_links.items(): self.client.remove(name) self.client.symlink(target, name) self.rollback_links = {} def promote(self, candidate_hash, target_label, candidate_label=None, create_previous=True, validation=True): """ Effective promotion of the images. 
This method will handle symbolic links to the dir containing images from the candidate hash, optionally saving the current link as previous :param candidate_hash: The dlrn hash to promote :param target_label: The name of the link to create :param candidate_label: Currently unused :param create_previous: A bool to determine if previous link is created :param validation: A bool to determine if qcow validation should be done :return: None """ self.client.connect() if validation: self.validate_qcows(candidate_hash) self.client.chdir(self.images_dir) log_header = "Qcow promote '{}' to {}:".format(candidate_hash, target_label) self.log.info("%s Attempting promotion", log_header) # Check if candidate_hash dir is present try: self.client.stat(candidate_hash.full_hash) except EnvironmentError as ex: self.log.error("%s images dir for hash %s not present or not " "accessible", log_header, candidate_hash) self.log.exception(ex) self.client.close() raise PromotionError("{} No images dir for hash {}" "".format(log_header, candidate_hash)) # Check if the target label exists and points to a hash dir current_hash = None try: current_hash = self.client.readlink(target_label) except EnvironmentError: self.log.debug("%s No link named %s exists", log_header, target_label) # If this exists Check if we can remove the symlink if current_hash: self.rollback_links['target_label'] = current_hash try: self.client.remove(target_label) except EnvironmentError as ex: self.log.debug("Unable to remove the target_label: %s", target_label) self.log.exception(ex) self.client.close() raise # Check if a previous link exists and points to an hash-dir previous_label = "previous-{}".format(target_label) previous_hash = None try: previous_hash = self.client.readlink(previous_label) except EnvironmentError: self.log.debug("%s No previous-link named %s exists", log_header, previous_label) self.log.debug("Previous hash %s", previous_hash) # If it exists and we are handling it, check if we can remove and # reassign it if current_hash and previous_hash and create_previous: self.rollback_links[previous_label] = previous_hash try: self.client.remove(previous_label) except EnvironmentError as ex: self.log.debug("Unable to remove the target_label: %s", target_label) self.log.exception(ex) self.client.close() # Rollback is not tested, we enable it later, when tests are # easier to add # self.rollback() raise try: self.client.symlink(current_hash, previous_label) except EnvironmentError as ex: self.log.error("%s failed to link %s to %s", log_header, previous_label, current_hash) self.log.exception(ex) # Rollback is not tested, we enable it later, when tests are # easier to add # self.rollback() self.client.close() raise # Finally the effective promotion try: self.client.symlink(candidate_hash.full_hash, target_label) except EnvironmentError as ex: self.log.error("%s failed to link %s to %s", log_header, target_label, candidate_hash.full_hash) self.log.exception(ex) # Rollback is not tested, we enable it later, when tests are # easier to add # self.rollback() finally: self.client.close() self.log.info("%s Successful promotion", log_header)
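The heart of promote() is a symlink swap: save whatever the target label currently points at under previous-<label>, then repoint the label at the candidate directory. A minimal local sketch of that pattern using plain os calls (paths and names are illustrative; no remote client or rollback bookkeeping):

import os

def promote_link(images_dir, candidate_hash, target_label):
    """Repoint <target_label> at <candidate_hash>, saving the old target
    as previous-<target_label>. Paths here are hypothetical examples."""
    os.chdir(images_dir)
    previous_label = 'previous-' + target_label

    current = None
    if os.path.islink(target_label):
        current = os.readlink(target_label)
        os.remove(target_label)

    if current is not None:
        # Keep the outgoing target so a failed promotion can be undone by hand.
        if os.path.islink(previous_label):
            os.remove(previous_label)
        os.symlink(current, previous_label)

    os.symlink(candidate_hash, target_label)

# promote_link('/var/images/centos/master/rdo_trunk', 'abc123', 'current-tripleo')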
With more than 25 years of experience in organizational and leadership development, strategic planning, facilitation, and coaching, Mary Baker brings a wealth of knowledge and business perspective to each consulting and coaching engagement. Mary’s experience includes ten years of consulting with organizations, both for-profit and nonprofit, including financial institutions, manufacturing, education, faith-based organizations, and social services. She led a consulting center at Belmont University's College of Business and was an employee relations officer at SunTrust Bank. Early in her career she served as CEO of Crittenton Services, a teen pregnancy prevention agency. Most recently she completed nearly seven years as the CEO of Monroe Harding, Inc., a 125-year-old nonprofit providing residential and mental health services to foster children and young adults.
#!/usr/bin/python
# -*- coding: utf-8 -*-

from __future__ import with_statement

import re
import sys
from glob import glob
from os import path
from subprocess import Popen, PIPE
from sys import argv

# Local module: generator for texture lookup builtins
from texture_builtins import generate_texture_functions

builtins_dir = path.join(path.dirname(path.abspath(__file__)), "..")

# Get the path to the standalone GLSL compiler
if len(argv) != 2:
    print "Usage:", argv[0], "<path to compiler>"
    sys.exit(1)

compiler = argv[1]

# Read the files in builtins/ir/*...add them to the supplied dictionary.
def read_ir_files(fs):
    for filename in glob(path.join(path.join(builtins_dir, 'ir'), '*')):
        with open(filename) as f:
            fs[path.basename(filename)] = f.read()

# Return a dictionary containing all builtin definitions (even generated)
def get_builtin_definitions():
    fs = {}
    generate_texture_functions(fs)
    read_ir_files(fs)
    return fs

def stringify(s):
    # Work around MSVC's 65535 byte limit by outputting an array of characters
    # rather than actual string literals.
    if len(s) >= 65535:
        #t = "/* Warning: length " + repr(len(s)) + " too large */\n"
        t = ""
        for c in re.sub('\s\s+', ' ', s):
            if c == '\n':
                t += '\n'
            else:
                t += "'" + c + "',"
        return '{' + t[:-1] + '}'

    t = s.replace('\\', '\\\\').replace('"', '\\"').replace('\n', '\\n"\n   "')
    return '   "' + t + '"\n'

def write_function_definitions():
    fs = get_builtin_definitions()
    for k, v in sorted(fs.iteritems()):
        print 'static const char builtin_' + k + '[] ='
        print stringify(v), ';'

def run_compiler(args):
    command = [compiler, '--dump-lir'] + args
    p = Popen(command, 1, stdout=PIPE, shell=False)
    output = p.communicate()[0]

    # Clean up output a bit by killing whitespace before a closing paren.
    kill_paren_whitespace = re.compile(r'[ \n]*\)', re.MULTILINE)
    output = kill_paren_whitespace.sub(')', output)

    # Also toss any duplicate newlines
    output = output.replace('\n\n', '\n')

    return (output, p.returncode)

def write_profile(filename, profile):
    (proto_ir, returncode) = run_compiler([filename])

    if returncode != 0:
        print '#error builtins profile', profile, 'failed to compile'
        return

    # Kill any global variable declarations.  We don't want them.
    kill_globals = re.compile(r'^\(declare.*\n', re.MULTILINE)
    proto_ir = kill_globals.sub('', proto_ir)

    print 'static const char prototypes_for_' + profile + '[] ='
    print stringify(proto_ir), ';'

    # Print a table of all the functions (not signatures) referenced.
    # This is done so we can avoid bothering with a hash table in the C++ code.
    function_names = set()
    for func in re.finditer(r'\(function (.+)\n', proto_ir):
        function_names.add(func.group(1))

    print 'static const char *functions_for_' + profile + ' [] = {'
    for func in sorted(function_names):
        print '   builtin_' + func + ','
    print '};'

def write_profiles():
    profiles = get_profile_list()
    for (filename, profile) in profiles:
        write_profile(filename, profile)

def get_profile_list():
    profiles = []
    for pfile in sorted(glob(path.join(path.join(builtins_dir, 'profiles'),
                                       '*'))):
        profiles.append((pfile, path.basename(pfile).replace('.', '_')))
    return profiles

if __name__ == "__main__":
    print """/* DO NOT MODIFY - automatically generated by generate_builtins.py */
/*
 * Copyright © 2010 Intel Corporation
 *
 * Permission is hereby granted, free of charge, to any person obtaining a
 * copy of this software and associated documentation files (the "Software"),
 * to deal in the Software without restriction, including without limitation
 * the rights to use, copy, modify, merge, publish, distribute, sublicense,
 * and/or sell copies of the Software, and to permit persons to whom the
 * Software is furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice (including the next
 * paragraph) shall be included in all copies or substantial portions of the
 * Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
 * THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
 * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
 * DEALINGS IN THE SOFTWARE.
 */

#include <stdio.h>
#include "main/core.h" /* for struct gl_shader */
#include "glsl_parser_extras.h"
#include "ir_reader.h"
#include "program.h"
#include "ast.h"

extern "C" struct gl_shader *
_mesa_new_shader(struct gl_context *ctx, GLuint name, GLenum type);

gl_shader *
read_builtins(GLenum target, const char *protos, const char **functions,
              unsigned count)
{
   struct gl_context fakeCtx;
   fakeCtx.API = API_OPENGL;
   fakeCtx.Const.GLSLVersion = 130;
   fakeCtx.Extensions.ARB_ES2_compatibility = true;
   gl_shader *sh = _mesa_new_shader(NULL, 0, target);
   struct _mesa_glsl_parse_state *st =
      new(sh) _mesa_glsl_parse_state(&fakeCtx, target, sh);

   st->language_version = 130;
   st->symbols->language_version = 130;
   st->ARB_texture_rectangle_enable = true;
   st->EXT_texture_array_enable = true;
   _mesa_glsl_initialize_types(st);

   sh->ir = new(sh) exec_list;
   sh->symbols = st->symbols;

   /* Read the IR containing the prototypes */
   _mesa_glsl_read_ir(st, sh->ir, protos, true);

   /* Read ALL the function bodies, telling the IR reader not to scan for
    * prototypes (we've already created them).  The IR reader will skip any
    * signature that does not already exist as a prototype.
    */
   for (unsigned i = 0; i < count; i++) {
      _mesa_glsl_read_ir(st, sh->ir, functions[i], false);

      if (st->error) {
         printf("error reading builtin: %.35s ...\\n", functions[i]);
         printf("Info log:\\n%s\\n", st->info_log);
         ralloc_free(sh);
         return NULL;
      }
   }

   reparent_ir(sh->ir, sh);
   delete st;

   return sh;
}
"""

    write_function_definitions()
    write_profiles()

    profiles = get_profile_list()

    print 'static gl_shader *builtin_profiles[%d];' % len(profiles)

    print """
void *builtin_mem_ctx = NULL;

void
_mesa_glsl_release_functions(void)
{
   ralloc_free(builtin_mem_ctx);
   builtin_mem_ctx = NULL;
   memset(builtin_profiles, 0, sizeof(builtin_profiles));
}

static void
_mesa_read_profile(struct _mesa_glsl_parse_state *state,
                   int profile_index,
                   const char *prototypes,
                   const char **functions,
                   int count)
{
   gl_shader *sh = builtin_profiles[profile_index];

   if (sh == NULL) {
      sh = read_builtins(GL_VERTEX_SHADER, prototypes, functions, count);
      ralloc_steal(builtin_mem_ctx, sh);
      builtin_profiles[profile_index] = sh;
   }

   state->builtins_to_link[state->num_builtins_to_link] = sh;
   state->num_builtins_to_link++;
}

void
_mesa_glsl_initialize_functions(struct _mesa_glsl_parse_state *state)
{
   if (builtin_mem_ctx == NULL) {
      builtin_mem_ctx = ralloc_context(NULL); // "GLSL built-in functions"
      memset(&builtin_profiles, 0, sizeof(builtin_profiles));
   }

   state->num_builtins_to_link = 0;
"""

    i = 0
    for (filename, profile) in profiles:
        if profile.endswith('_vert'):
            check = 'state->target == vertex_shader && '
        elif profile.endswith('_frag'):
            check = 'state->target == fragment_shader && '

        version = re.sub(r'_(vert|frag)$', '', profile)
        if version.isdigit():
            check += 'state->language_version == ' + version
        else:
            # an extension name
            check += 'state->' + version + '_enable'

        print '   if (' + check + ') {'
        print '      _mesa_read_profile(state, %d,' % i
        print '                         prototypes_for_' + profile + ','
        print '                         functions_for_' + profile + ','
        print '                         Elements(functions_for_' + profile + '));'
        print '   }'
        print
        i = i + 1

    print '}'
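The generator writes the C++ source to stdout, taking the path to the standalone GLSL compiler as its only argument; a typical invocation (the compiler path and output filename here are illustrative, the exact ones depend on the build) would be: python generate_builtins.py ./glsl_compiler > builtin_function.cpp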
The SEED Awards for Entrepreneurship in Sustainable Development is an annual awards scheme designed to identify the most innovative and promising locally led start-up eco-inclusive enterprises in developing and emerging economies. Each SEED Award Winner receives a customised SEED Support Package, which includes the tools and guidance required for an enterprise to emerge from the start-up stage and effectively scale up. Each SEED Award Winner is selected by our independent SEED International Jury based on the enterprise's potential to scale up its contributions to poverty eradication and environmental sustainability while leading the transition to a green economy.
from flask import Flask, jsonify
import pymongo
import json

app = Flask(__name__)

client = pymongo.MongoClient()
db = client.grouch
courses = db.courses


@app.route('/')
def index():
    return 'test'


@app.route('/spring2016/')
def year():
    schools = courses.distinct('school')
    response = jsonify({'schools': schools})
    response.headers['Access-Control-Allow-Origin'] = "*"
    return response


@app.route('/spring2016/<school>/')
def for_school(school):
    # Collect every course number offered by the given school.
    aggregationPipeline = [
        {
            '$match': {'school': school},
        },
        {
            '$group': {
                '_id': None,
                'classes': {'$push': '$number'}
            }
        }
    ]
    result = list(courses.aggregate(aggregationPipeline))
    classes = result[0].get('classes') if len(result) > 0 else None
    response = jsonify({'numbers': classes})
    response.headers['Access-Control-Allow-Origin'] = "*"
    return response


@app.route('/spring2016/<school>/<number>')
def single_course(school, number):
    course = courses.find_one({'school': school, 'number': number},
                              {'_id': 0})
    response = jsonify(course)
    response.headers['Access-Control-Allow-Origin'] = "*"
    return response


if __name__ == '__main__':
    app.run()
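A quick sketch of calling the API above, assuming the server is running locally on Flask's default port and MongoDB is populated; the school code 'CS' and number '1301' are hypothetical examples.

import requests

base = 'http://127.0.0.1:5000'

print(requests.get(base + '/spring2016/').json())         # list of schools
print(requests.get(base + '/spring2016/CS/').json())      # course numbers for one school
print(requests.get(base + '/spring2016/CS/1301').json())  # one course document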
The dining table is the focal point of your social center, and being the focal point makes it an important feature. If you have just invested in a high-class dining room table made of wood, it's just as well to understand some of the basics of keeping it in great condition and having it truly last a lifetime. A select range of oak dining furniture consists of high-quality square oak dining tables and chairs. Oak wood furniture is popular with many homeowners, interior designers and architects.
'''
MIT License

Copyright (c) 2017 Matej Usaj

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

Created on Apr 10, 2017

@author: Matej Usaj
'''

import logging

import numpy as np
import pandas as p

from . import USE_C_OPT

logger = logging.getLogger(__name__)


def _c_normalize(data3_tableix, t1, data3_nn, cpu=1):
    from . import c_impl
    c_impl.table_norm(data3_nn, t1, data3_tableix)


def _normalize(index, t1, data3_nn):
    for x in t1:
        index += data3_nn > x
    index[index == 0] = 1


def normalize(table, t1, t2):
    data = table.values.flatten()
    result = np.full_like(data, np.nan)

    data_values = ~np.isnan(data)
    data = data[data_values]
    data_index = np.zeros_like(data, dtype=np.int64)

    if USE_C_OPT:
        _c_normalize(data_index, t1, data)
    else:
        _normalize(data_index, t1, data)

    data_index -= 1  # leftover from matlab code conversion
    result[data_values] = t2[data_index]

    return p.DataFrame(result.reshape(table.shape), index=table.index,
                       columns=table.columns)


def _quantile_normalize(data, refdist):
    percentiles = np.linspace(100. / data.shape[0], 100, num=data.shape[0])
    # interpolation used in matlab
    ref_quantiles = np.percentile(refdist, percentiles,
                                  interpolation='midpoint')
    sort_ind = np.argsort(data, kind='mergesort')  # sorting alg used in matlab
    result = np.zeros_like(data)
    result[sort_ind] = ref_quantiles
    return result


def table_normalize(data1, data2, data3):
    data1 = data1.values.flatten()
    data2 = data2.values.flatten()

    # extract cells with values in both arrays
    nn = ~np.isnan(data1) & ~np.isnan(data2)

    data2_norm = np.full_like(data2, np.nan)
    data2_norm[nn] = _quantile_normalize(data2[nn], data1[nn])

    table = p.DataFrame({'data2': data2[nn], 'data2_norm': data2_norm[nn]})
    table = table.sort_values('data2', kind='mergesort')
    table = table.groupby('data2').median().reset_index()

    return normalize(data3, table.data2, table.data2_norm)
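At its core, _quantile_normalize maps each value in data onto the quantiles of a reference distribution: sort the data, and replace the i-th smallest value with the i-th reference quantile. A standalone sketch of that idea in plain NumPy (toy arrays; it uses NumPy's default linear interpolation rather than the 'midpoint' rule above):

import numpy as np

def quantile_normalize(data, refdist):
    """Replace the i-th smallest value of `data` with the i-th quantile
    of `refdist`, preserving the original ordering of `data`."""
    percentiles = np.linspace(100.0 / data.shape[0], 100, num=data.shape[0])
    ref_quantiles = np.percentile(refdist, percentiles)
    result = np.zeros_like(data, dtype=float)
    result[np.argsort(data, kind='mergesort')] = ref_quantiles
    return result

data = np.array([5.0, 1.0, 3.0])
ref = np.array([10.0, 20.0, 30.0, 40.0])
print(quantile_normalize(data, ref))  # [40. 20. 30.]: ranks mapped onto reference quantiles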
The Jordan Valley and the northern Dead Sea constitute almost 30% of the West Bank. Nearly 65,000 Palestinians and some 11,000 settlers live there. Although the region serves as the Palestinians’ most significant land reserve, Israel has taken over most of the land with a view to enabling its de facto annexation. Israel also endeavors to minimize Palestinian presence there: barring Palestinians from using 85% of the land, restricting their access to water resources and keeping them from building homes. Israeli authorities are also taking measures to drive out over 50 Palestinian communities across the Jordan Valley, by making their lives intolerable.
#!/usr/bin/env python
# -*- coding: utf-8 -*-

"""
Example script accessing data from a WunderBar microphone via MQTT.

This will connect to a microphone, read its noise level and send
an email notification to some receiver if that noise level exceeds
a certain threshold.
"""

import sys
import json
import time
import getpass
import smtplib
from email.mime.text import MIMEText

from relayr import Client
from relayr.resources import Device
from relayr.dataconnection import MqttStream

# Replace with your own values!
ACCESS_TOKEN = '...'
MICROPHONE_ID = '...'

# EMAIL/SMTP settings, please provide your own!
RECEIVER = '...'
SMTP_SERVER = '...'
SMTP_USERNAME = '...'
SMTP_PASSWORD = ''  # will be requested at run time if left empty
SMTP_SENDER = 'WunderBar <[email protected]>'
SMTP_USE_SSL = False

try:
    settings = [ACCESS_TOKEN, MICROPHONE_ID, RECEIVER, SMTP_SERVER,
                SMTP_USERNAME]
    assert not any(map(lambda x: x == '...', settings))
except AssertionError:
    print('Please provide meaningful settings in the code first!')
    sys.exit(1)


class Callbacks(object):
    "A class providing callbacks for incoming data from some device."

    def __init__(self, device):
        "An initializer to capture the device for later use."
        self.device = device

    def send_email(self, text):
        "Send an email notification."
        sender = SMTP_SENDER
        subject = 'WunderBar Notification from Device: %s' % self.device.name
        msg = MIMEText(text)
        msg['Subject'] = subject
        msg['From'] = sender
        msg['To'] = RECEIVER
        if SMTP_USE_SSL == True:
            s = smtplib.SMTP_SSL(SMTP_SERVER)
        else:
            s = smtplib.SMTP(SMTP_SERVER)
        prompt = "SMTP user password for user '%s'? " % SMTP_USERNAME
        global SMTP_PASSWORD
        SMTP_PASSWORD = SMTP_PASSWORD or getpass.getpass(prompt)
        s.login(SMTP_USERNAME, SMTP_PASSWORD)
        s.sendmail(sender, [RECEIVER], msg.as_string())
        s.quit()
        print("Email notification sent to '%s'" % RECEIVER)

    def microphone(self, topic, message):
        "Callback displaying incoming noise level data and email if desired."
        readings = json.loads(message)['readings']
        level = [r for r in readings if r['meaning'] == 'noiseLevel'][0]['value']
        print(level)
        threshold = 75
        if level > threshold:
            dname, did = self.device.name, self.device.id
            text = "Notification from '%s' (%s):\n" % (dname, did)
            text += ("The noise level now is %d (> %d)! " % (level, threshold))
            text += "Put on a sound protection helmet before you get deaf!"
            self.send_email(text)


def connect():
    "Connect to a device and read data for some time."
    c = Client(token=ACCESS_TOKEN)
    mic = Device(id=MICROPHONE_ID, client=c).get_info()
    callbacks = Callbacks(mic)
    print("Monitoring '%s' (%s) for 60 seconds..." % (mic.name, mic.id))
    stream = MqttStream(callbacks.microphone, [mic], transport='mqtt')
    stream.start()
    try:
        time.sleep(60)
    except KeyboardInterrupt:
        print('')
    stream.stop()
    print("Stopped")


if __name__ == "__main__":
    connect()
A no-show (or late cancellation) by a free passenger incurs a $248 charge for the reserved bus seat. Maid of the Mist: This famous boat experience has been taking passengers as close to Niagara Falls as they can get for more than a century. Be ready to get wet: guests will make their way into the spray itself! Niagara Falls Adventure Movie: This movie, shown in the "Adventure Theater" on the American side of the Falls, examines the lengths to which humanity's desire to conquer nature has driven us, and answers the question, "Can you really go over Niagara in a barrel?" In the morning, we will head to Toronto, ON. We will tour this scenic city, stopping at the CN Tower, Casa Loma, and Toronto City Hall. This is also where we will spend the night. We will leave Toronto at 7:00am for the Thousand Islands. When we arrive, we will take a Thousand Islands Cruise.* We will arrive back in Boston at about 8:00pm. This tour was good. However, we did not see all the activities listed for the last day. The tour guide wanted to end our tour at 1:30 pm because two other people had booked flights at 3 pm, even though the trip details state "Book your trip after 5 pm". The trip was supposed to end at 5 pm. We ended the tour at 3 pm. I really had a great time with the Take Tour during those 3 days. My tour guide was Jay! He was so nice to me. Everybody on the tour was traveling with their family. I was the only one by myself, so he was always worried about me, checking whether I felt safe during our tour at Niagara Falls and 1000 Islands. I was also the only one who spoke English on the tour, so he always made sure I understood all the information he was explaining, making me feel more comfortable in the group. It was a great experience. I'm from Brazil, and on this trip I learned a lot about the United States, Canada and also about China. The restaurant we went to was delicious, and the hotel was good too. Just the hair dryer in my room didn't work. This is my second trip with Take Tour. The first one was in 2012 to California, Arizona and Nevada. It is a great way to get to know the U.S. It's cheap and fast for those who don't have much time to spend traveling. I hope next time I can get my Canada visa to go with you guys to Montreal, Quebec, Toronto, Seattle and Vancouver. Thank you for everything!!!
# Do everything properly, and componentize
from twisted.application import internet, service, strports
from twisted.internet import protocol, reactor, defer, endpoints
from twisted.words.protocols import irc
from twisted.protocols import basic
from twisted.python import components
from twisted.web import resource, server, static, xmlrpc
from twisted.spread import pb
from zope.interface import Interface, implementer

from OpenSSL import SSL
import cgi


class IFingerService(Interface):

    def getUser(user):
        """
        Return a deferred returning a string.
        """

    def getUsers():
        """
        Return a deferred returning a list of strings.
        """


class IFingerSetterService(Interface):

    def setUser(user, status):
        """
        Set the user's status to something.
        """


def catchError(err):
    return "Internal error in server"


class FingerProtocol(basic.LineReceiver):

    def lineReceived(self, user):
        d = self.factory.getUser(user)
        d.addErrback(catchError)

        def writeValue(value):
            self.transport.write(value + '\r\n')
            self.transport.loseConnection()

        d.addCallback(writeValue)


class IFingerFactory(Interface):

    def getUser(user):
        """
        Return a deferred returning a string.
        """

    def buildProtocol(addr):
        """
        Return a protocol returning a string.
        """


@implementer(IFingerFactory)
class FingerFactoryFromService(protocol.ServerFactory):

    protocol = FingerProtocol

    def __init__(self, service):
        self.service = service

    def getUser(self, user):
        return self.service.getUser(user)


components.registerAdapter(FingerFactoryFromService,
                           IFingerService,
                           IFingerFactory)


class FingerSetterProtocol(basic.LineReceiver):

    def connectionMade(self):
        self.lines = []

    def lineReceived(self, line):
        self.lines.append(line)

    def connectionLost(self, reason):
        if len(self.lines) == 2:
            self.factory.setUser(*self.lines)


class IFingerSetterFactory(Interface):

    def setUser(user, status):
        """
        Return a deferred returning a string.
        """

    def buildProtocol(addr):
        """
        Return a protocol returning a string.
        """


@implementer(IFingerSetterFactory)
class FingerSetterFactoryFromService(protocol.ServerFactory):

    protocol = FingerSetterProtocol

    def __init__(self, service):
        self.service = service

    def setUser(self, user, status):
        self.service.setUser(user, status)


components.registerAdapter(FingerSetterFactoryFromService,
                           IFingerSetterService,
                           IFingerSetterFactory)


class IRCReplyBot(irc.IRCClient):

    def connectionMade(self):
        self.nickname = self.factory.nickname
        irc.IRCClient.connectionMade(self)

    def privmsg(self, user, channel, msg):
        user = user.split('!')[0]
        if self.nickname.lower() == channel.lower():
            d = self.factory.getUser(msg)
            d.addErrback(catchError)
            d.addCallback(lambda m: "Status of %s: %s" % (msg, m))
            d.addCallback(lambda m: self.msg(user, m))


class IIRCClientFactory(Interface):

    """
    @ivar nickname
    """

    def getUser(user):
        """
        Return a deferred returning a string.
        """

    def buildProtocol(addr):
        """
        Return a protocol.
        """


@implementer(IIRCClientFactory)
class IRCClientFactoryFromService(protocol.ClientFactory):

    protocol = IRCReplyBot
    nickname = None

    def __init__(self, service):
        self.service = service

    def getUser(self, user):
        return self.service.getUser(user)


components.registerAdapter(IRCClientFactoryFromService,
                           IFingerService,
                           IIRCClientFactory)


class UserStatusTree(resource.Resource):

    def __init__(self, service):
        resource.Resource.__init__(self)
        self.service = service

        # add a specific child for the path "RPC2"
        self.putChild("RPC2", UserStatusXR(self.service))

        # need to do this for resources at the root of the site
        self.putChild("", self)

    def _cb_render_GET(self, users, request):
        userOutput = ''.join(["<li><a href=\"%s\">%s</a></li>" % (user, user)
                              for user in users])
        request.write("""
            <html><head><title>Users</title></head><body>
            <h1>Users</h1>
            <ul>
            %s
            </ul></body></html>""" % userOutput)
        request.finish()

    def render_GET(self, request):
        d = self.service.getUsers()
        d.addCallback(self._cb_render_GET, request)

        # signal that the rendering is not complete
        return server.NOT_DONE_YET

    def getChild(self, path, request):
        return UserStatus(user=path, service=self.service)


components.registerAdapter(UserStatusTree, IFingerService, resource.IResource)


class UserStatus(resource.Resource):

    def __init__(self, user, service):
        resource.Resource.__init__(self)
        self.user = user
        self.service = service

    def _cb_render_GET(self, status, request):
        request.write("""<html><head><title>%s</title></head>
        <body><h1>%s</h1>
        <p>%s</p>
        </body></html>""" % (self.user, self.user, status))
        request.finish()

    def render_GET(self, request):
        d = self.service.getUser(self.user)
        d.addCallback(self._cb_render_GET, request)

        # signal that the rendering is not complete
        return server.NOT_DONE_YET


class UserStatusXR(xmlrpc.XMLRPC):

    def __init__(self, service):
        xmlrpc.XMLRPC.__init__(self)
        self.service = service

    def xmlrpc_getUser(self, user):
        return self.service.getUser(user)

    def xmlrpc_getUsers(self):
        return self.service.getUsers()


class IPerspectiveFinger(Interface):

    def remote_getUser(username):
        """
        Return a user's status.
        """

    def remote_getUsers():
        """
        Return a list of all users.
        """


@implementer(IPerspectiveFinger)
class PerspectiveFingerFromService(pb.Root):

    def __init__(self, service):
        self.service = service

    def remote_getUser(self, username):
        return self.service.getUser(username)

    def remote_getUsers(self):
        return self.service.getUsers()


components.registerAdapter(PerspectiveFingerFromService,
                           IFingerService,
                           IPerspectiveFinger)


@implementer(IFingerService)
class FingerService(service.Service):

    def __init__(self, filename):
        self.filename = filename
        self.users = {}

    def _read(self):
        self.users.clear()
        with open(self.filename) as f:
            for line in f:
                user, status = line.split(':', 1)
                user = user.strip()
                status = status.strip()
                self.users[user] = status
        self.call = reactor.callLater(30, self._read)

    def getUser(self, user):
        return defer.succeed(self.users.get(user, "No such user"))

    def getUsers(self):
        return defer.succeed(self.users.keys())

    def startService(self):
        self._read()
        service.Service.startService(self)

    def stopService(self):
        service.Service.stopService(self)
        self.call.cancel()


application = service.Application('finger', uid=1, gid=1)
f = FingerService('/etc/users')
serviceCollection = service.IServiceCollection(application)
f.setServiceParent(serviceCollection)
strports.service("tcp:79", IFingerFactory(f)
                 ).setServiceParent(serviceCollection)
site = server.Site(resource.IResource(f))
strports.service("tcp:8000", site,
                 ).setServiceParent(serviceCollection)
strports.service("ssl:port=443:certKey=cert.pem:privateKey=key.pem", site
                 ).setServiceParent(serviceCollection)
i = IIRCClientFactory(f)
i.nickname = 'fingerbot'
internet.ClientService(
    endpoints.clientFromString(reactor, "tcp:irc.freenode.org:6667"),
    i).setServiceParent(serviceCollection)
strports.service("tcp:8889", pb.PBServerFactory(IPerspectiveFinger(f))
                 ).setServiceParent(serviceCollection)
Tithee Chakma, a student of the Department of Computer Science and Engineering, and Md. Sahin Khan, an alumnus and current employee of DIU, are now attending “Winter for International Learners & Leaders (WILL 2018)” at Chungnam National University (CNU) in South Korea. CNU WILL is a two-week short-term program that started on 27 December 2018 and will continue until 10 January 2019. The program is designed for international students, who will complete an academic course of 3 credits on ‘Asia Business’. They will have opportunities to visit several research institutes in Daedeok Science Town, to enjoy winter sports, and to experience Korean culture. Under this program, they are exploring a new country and spending the last day of the year immersed in the rich culture of South Korea! Chungnam National University (CNU) is one of the top five national universities in South Korea and has been an active partner university of DIU for the last few years.
# Copyright (c) 2017 Fujitsu Limited
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from neutron_lib.callbacks import events
from neutron_lib.callbacks import registry
from neutron_lib.callbacks import resources
from oslo_log import log as logging

from neutron.common import exceptions
from neutron.services.logapi.common import constants as log_const
from neutron.services.logapi.common import db_api
from neutron.services.logapi.common import exceptions as log_exc
from neutron.services.logapi.rpc import server as server_rpc

LOG = logging.getLogger(__name__)


def _get_param(args, kwargs, name, index):
    try:
        return kwargs[name]
    except KeyError:
        try:
            return args[index]
        except IndexError:
            msg = "Missing parameter %s" % name
            raise log_exc.LogapiDriverException(exception_msg=msg)


@registry.has_registry_receivers
class LoggingServiceDriverManager(object):

    def __init__(self):
        self._drivers = set()
        self.rpc_required = False
        registry.publish(log_const.LOGGING_PLUGIN, events.AFTER_INIT, self)

        if self.rpc_required:
            self._start_rpc_listeners()
            self.logging_rpc = server_rpc.LoggingApiNotification()

    @property
    def drivers(self):
        return self._drivers

    def register_driver(self, driver):
        """Register driver with logging plugin.

        This method is called from drivers on INIT event.
        """
        self._drivers.add(driver)
        self.rpc_required |= driver.requires_rpc

    def _start_rpc_listeners(self):
        self._skeleton = server_rpc.LoggingApiSkeleton()
        return self._skeleton.conn.consume_in_threads()

    @property
    def supported_logging_types(self):
        if not self._drivers:
            return set()

        log_types = set()
        for driver in self._drivers:
            log_types |= set(driver.supported_logging_types)
        LOG.debug("Supported logging types (logging types supported "
                  "by at least one loaded log_driver): %s", log_types)
        return log_types

    def call(self, method_name, *args, **kwargs):
        """Helper method for calling a method across all extension drivers."""
        exc_list = []
        for driver in self._drivers:
            try:
                getattr(driver, method_name)(*args, **kwargs)
            except Exception as exc:
                exception_msg = ("Extension driver '%(name)s' failed in "
                                 "%(method)s")
                exception_data = {'name': driver.name, 'method': method_name}
                LOG.exception(exception_msg, exception_data)
                exc_list.append(exc)

        if exc_list:
            raise exceptions.DriverCallError(exc_list=exc_list)

        if self.rpc_required:
            context = _get_param(args, kwargs, 'context', index=0)
            log_obj = _get_param(args, kwargs, 'log_obj', index=1)

            try:
                rpc_method = getattr(self.logging_rpc, method_name)
            except AttributeError:
                LOG.error("Method %s is not implemented in logging RPC",
                          method_name)
                return
            rpc_method(context, log_obj)

    @registry.receives(resources.SECURITY_GROUP_RULE,
                       [events.AFTER_CREATE, events.AFTER_DELETE])
    def _handle_sg_rule_callback(self, resource, event, trigger, **kwargs):
        """Handle sg_rule create/delete events.

        This method handles sg_rule events: if a sg_rule is bound by
        log_resources, it should tell the agent to update log_drivers.
        """
        context = kwargs['context']
        sg_rules = kwargs.get('security_group_rule')
        if sg_rules:
            sg_id = sg_rules.get('security_group_id')
        else:
            sg_id = kwargs.get('security_group_id')

        log_resources = db_api.get_logs_bound_sg(context, sg_id)
        if log_resources:
            self.call(
                log_const.RESOURCE_UPDATE, context, log_resources)
Getting a badkamer renovatie done can be quite a simple task if you put the right people to work. They will know exactly what rabbits to pull out of the hat to give your old bathroom a whole new look. The use of functional and easy-to-clean sanitair is sacrosanct to remodelling your bathroom. Use of natural light, good floor and wall tiles, a bit of work on the ceiling and giving a trendy look to that unused window can give a complete makeover to your bathroom. It's easier to work on large bathrooms than on smaller ones. The latter require innovative concepts to make optimum use of space and light to give a completely new look to your bathroom. For instance, a glass block window could work great with natural stone-look tiles. The usual choices for bathroom tiles are marble, mosaic, glass, ceramic or porcelain. Choose according to your requirements and budget. The idea is to give a spacious look, which is achieved with wall-hung sanitair. Replace the shower with a floor-level bathtub, use the same tiles on wall and floor to give a continuous feel to the space, and you have a spacious small bathroom. In case you have a large bathroom, there is a lot more scope to do something experimental, like a theme remodelling. The designers will give you the look and feel of the place you'd like to be in. Do you prefer a spa-like look? Then the wall tiles adjacent to the bathtub can be where the design starts and spreads across the entire wall. The floor tiles and the mirror have to match the theme as well. A pristine white theme can be achieved with the help of marble. A traditional bathroom can have a marble and black granite theme on the walls and floor. Contemporary small bathrooms can have graphic designs, mosaic tiles, a large mirror and a curtained shower area. Badkamer renovatie is all about digging out innovative concepts. If there's enough space, you can also design separate dressing areas for two users. Trendy designs can give your bathroom remodelling a completely different appeal. The thing to remember is that a bathroom should be functional; this is the first criterion. Then come aesthetics and other factors. Make a sketch of how you want your bathroom to be. Then decide on the sanitary fixtures and the budget. Then the installers will survey the site and submit a project estimate. After you come to an agreement regarding the work order, the renovation work starts. Tearing down what was installed and starting from scratch allows a lot more scope to work. The first step of the building process is to install new electricity and refreshed plumbing lines. Walls are plastered and prepped for the tiles to be placed. Suspended ceilings and recessed lights can be installed if required. If you are looking at a badkamer renovatie without these, then lighting materials have to be chosen accordingly. Installing the sanitair comes after the tiles are placed on floors and walls. There is opportunity to bring in new concepts at every step, from the tiles to the sanitary appliances to the mirrors and lights. Get your badkamer renovatie ( https://www.kerkstoel-bouwmaterialen.be/sanitair/ ) done by a wholesaler offering such services apart from selling sanitair ( https://www.kerkstoel-bouwmaterialen.be/sanitair/ ) to save on expenditure.
#!/usr/bin/env python

"""
    monitor_dynamixels.py - Version 0.1 2013-07-07

    Monitor the /diagnostics topic for Dynamixel messages and disable a
    servo if we receive an ERROR status message (e.g. overheating).

    Created for the Pi Robot Project: http://www.pirobot.org
    Copyright (c) 2014 Patrick Goebel.  All rights reserved.

    This program is free software; you can redistribute it and/or modify
    it under the terms of the GNU General Public License as published by
    the Free Software Foundation; either version 2 of the License, or
    (at your option) any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    GNU General Public License for more details at:

    http://www.gnu.org/licenses/gpl.html
"""

import rospy
from diagnostic_msgs.msg import DiagnosticArray, DiagnosticStatus
from arbotix_msgs.srv import Relax, Enable


class MonitorDynamixels:
    def __init__(self):
        # Initialize the node
        rospy.init_node("monitor_dynamixels")

        # The arbotix controller uses the /arbotix namespace
        namespace = '/arbotix'

        # Get the list of joints (servos)
        self.joints = rospy.get_param(namespace + '/joints', '')

        # Minimum time to rest servos that are hot
        self.minimum_rest_interval = rospy.get_param('~minimum_rest_interval', 60)

        # Initialize the rest timer
        self.rest_timer = 0

        # Are we already resting a servo?
        self.resting = False

        # Are the servos enabled?
        self.servos_enabled = False

        # Have we displayed a warning recently?
        self.warned = False

        # Connect to the servo services
        self.connect_servos()

        rospy.Subscriber('diagnostics', DiagnosticArray, self.get_diagnostics)

    def get_diagnostics(self, msg):
        if self.rest_timer != 0:
            if rospy.Time.now() - self.rest_timer < rospy.Duration(self.minimum_rest_interval):
                return
            else:
                self.resting = False
                # was a bare "rest_timer = 0" in the original, which never
                # actually reset the instance timer
                self.rest_timer = 0

        # Track if we have issued a warning on this pass
        warn = False

        for k in range(len(msg.status)):
            # Check for the Dynamixel identifying string in the name field
            if not '_joint' in msg.status[k].name:
                # Skip other diagnostic messages
                continue

            # Check the DiagnosticStatus level for this servo
            if msg.status[k].level == DiagnosticStatus.ERROR:
                # If the servo is overheating and not already resting,
                # then disable all servos
                if not self.resting:
                    rospy.loginfo("DANGER: Overheating servo: " + str(msg.status[k].name))
                    rospy.loginfo("Disabling servos for a minimum of " + str(self.minimum_rest_interval) + " seconds...")
                    self.disable_servos()
                    self.servos_enabled = False
                    self.rest_timer = rospy.Time.now()
                    self.resting = True
                    break
            elif msg.status[k].level == DiagnosticStatus.WARN:
                # If the servo is starting to get toasty, display a warning
                # but do not disable
                rospy.loginfo("WARNING: Servo " + str(msg.status[k].name) + " getting hot...")
                self.warned = True
                warn = True

        # No servo is overheated so re-enable all servos
        if not self.resting and not self.servos_enabled:
            rospy.loginfo("Dynamixel temperatures OK so enabling")
            self.enable_servos()
            self.servos_enabled = True
            self.resting = False

        # Check if a prior warning is no longer necessary
        if self.warned and not warn:
            rospy.loginfo("All servos back to a safe temperature")
            self.warned = False

    def connect_servos(self):
        # Create a dictionary to hold the relax and enable services
        self.relax = dict()
        self.enable = dict()

        rospy.loginfo("Waiting for joint controllers services...")

        for joint in sorted(self.joints):
            # A service to relax a servo
            relax = '/' + joint + '/relax'
            rospy.wait_for_service(relax)
            self.relax[joint] = rospy.ServiceProxy(relax, Relax)

            # A service to enable/disable a servo
            enable_service = '/' + joint + '/enable'
            rospy.wait_for_service(enable_service)
            self.enable[joint] = rospy.ServiceProxy(enable_service, Enable)

        rospy.loginfo("Connected to servos.")

    def disable_servos(self):
        for joint in sorted(self.joints):
            self.relax[joint]()
            self.enable[joint](False)

    def enable_servos(self):
        for joint in sorted(self.joints):
            self.enable[joint](True)


if __name__ == '__main__':
    MonitorDynamixels()
    rospy.spin()
What is the graph of #y=sin^2 x#? The graph oscillates between 0 and 1 with period #pi#: a cosine-shaped wave with amplitude #1/2# riding on the midline #y=1/2#. Something important to recognize: if you compare this to the graph of #y=cos^2(x)# (http://socratic.org/questions/what-is-the-graph-of-y-cos-2-x), the two are not negatives of each other (both are non-negative everywhere); rather, each is the other reflected about the midline #y=1/2#, or equivalently shifted horizontally by #pi/2#.
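To see why, start from the double-angle identity #cos 2x = 1 - 2sin^2 x#. Solving for #sin^2 x# gives #sin^2 x = (1 - cos 2x)/2#, and likewise #cos^2 x = (1 + cos 2x)/2#. Reading the first form directly: the graph of #y = sin^2 x# is a cosine curve scaled by #-1/2# and shifted up by #1/2#, hence midline #y = 1/2#, amplitude #1/2#, and period #(2pi)/2 = pi#. Adding the two forms recovers #sin^2 x + cos^2 x = 1#, which is exactly the statement that the two graphs mirror each other about #y = 1/2#.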
# -*- coding: utf-8 -*-
from django.db import models
from django.contrib import admin
from django.contrib.auth.models import User
from django.contrib.contenttypes.models import ContentType
from django.contrib.contenttypes import generic

# Create your models here.


class KeyWord(models.Model):
    content = models.CharField(u"内容", max_length=200)  # "content"
    description = models.TextField(u"描述", blank=True)  # "description"
    # "category id"; TODO: work out the exact meaning of this field
    categoryid = models.IntegerField(u"分类编号", null=True, blank=True)
    # "is this a standard keyword"
    standard_judge = models.BooleanField(u"是否为标准关键词", default=False)
    created_at = models.DateTimeField(auto_now_add=True, verbose_name=u"创建时间")  # "created at"
    user = models.ForeignKey(User, verbose_name=u"用户", related_name="keywords", null=True, blank=True)

    # Attach KeyWord to any model via a generic foreign key (GFK)
    content_type = models.ForeignKey(ContentType, null=True, blank=True)
    object_id = models.PositiveIntegerField(null=True, blank=True)
    content_object = generic.GenericForeignKey("content_type", "object_id")

    class Meta:
        verbose_name_plural = u"关键词"  # "keywords"

    def __unicode__(self):
        return "%s" % self.content


# Query {{{
class Query(models.Model):
    content = models.CharField(u"内容", max_length=500)
    level = models.PositiveIntegerField(u"级数", default=1)  # "level"
    # "category id"; TODO: work out the exact meaning of this field
    categoryid = models.IntegerField(u"分类编号", null=True, blank=True, default=1)
    created_at = models.DateTimeField(auto_now_add=True, verbose_name=u"创建时间")
    # "is this a standard question"
    standard_judge = models.BooleanField(u"是否为标准问题", default=False)
    user = models.ForeignKey(User, verbose_name=u"用户", related_name="querys")

    # Attach Query to any model via a generic foreign key (GFK)
    content_type = models.ForeignKey(ContentType, null=True, blank=True)
    object_id = models.PositiveIntegerField(null=True, blank=True)
    content_object = generic.GenericForeignKey("content_type", "object_id")

    class Meta:
        verbose_name_plural = u"问题"  # "questions"

    def __unicode__(self):
        return "< Query: %s >" % self.content

    def show(self):
        """ used in 'search/search.html' to show search result """
        return self.__unicode__()
# }}}


class WordWordRelation(models.Model):
    value = models.FloatField(u"关联度")  # "relevance"
    word1 = models.ForeignKey("KeyWord", verbose_name=u"关键词1", related_name="relations_with_other_words_as_primary", null=True, blank=True)
    word2 = models.ForeignKey("KeyWord", verbose_name=u"关键词2", related_name="relations_with_other_words_as_deputy", null=True, blank=True)

    class Meta:
        verbose_name_plural = u"关键词与关键词的关系"  # "keyword-keyword relations"

    def __unicode__(self):
        return "< WordWordRelation: (%s, %s) >" % (self.word1.content, self.word2.content)


class QueryQueryRelation(models.Model):
    value = models.FloatField(u"关联度")
    query1 = models.ForeignKey("Query", verbose_name=u"问题1", related_name="relations_with_other_querys_as_primary", null=True, blank=True)
    query2 = models.ForeignKey("Query", verbose_name=u"问题2", related_name="relations_with_other_querys_as_deputy", null=True, blank=True)

    class Meta:
        verbose_name_plural = u"问题与问题的关系"  # "question-question relations"

    def __unicode__(self):
        return "< QueryQueryRelation: (%s, %s) >" % (self.query1.content, self.query2.content)


class WordQueryRelation(models.Model):
    value = models.FloatField(u"关联度")
    word = models.ForeignKey("KeyWord", verbose_name=u"关键词", related_name="relations_with_querys", null=True, blank=True)
    # renamed from "query2": __unicode__ below expects self.query
    query = models.ForeignKey("Query", verbose_name=u"问题", related_name="relations_with_words", null=True, blank=True)

    class Meta:
        verbose_name_plural = u"关键词与问题的关系"  # "keyword-question relations"

    def __unicode__(self):
        return "< WordQueryRelation: (%s, %s) >" % (self.word.content, self.query.content)


class BlogQueryRelation(models.Model):
    value = models.FloatField(u"关联度")
    blog = models.ForeignKey("sciblog.SciBlog", verbose_name=u"文章", related_name="relations_with_querys", null=True, blank=True)
    query = models.ForeignKey("Query", verbose_name=u"问题", related_name="relations_with_blogs", null=True, blank=True)

    class Meta:
        verbose_name_plural = u"文章与问题的关系"  # "article-question relations"

    def __unicode__(self):
        return "< BlogRelation: (%s, %s) >" % (self.blog.title, self.query.content)


admin.site.register([
    KeyWord,
    Query,
    WordWordRelation,
    WordQueryRelation,
    BlogQueryRelation,
])
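A quick sketch of wiring a KeyWord to an arbitrary object through the generic foreign key, run inside python manage.py shell; the app module path "search" and all field values are assumptions for illustration:

# Run inside `python manage.py shell`; module paths and data are illustrative.
from django.contrib.auth.models import User
from sciblog.models import SciBlog            # referenced by BlogQueryRelation
from search.models import KeyWord, Query, WordQueryRelation  # app name assumed

user = User.objects.first()
blog = SciBlog.objects.first()

# The generic FK lets a KeyWord point at any model instance, here a SciBlog.
kw = KeyWord.objects.create(content=u"example keyword", user=user, content_object=blog)
q = Query.objects.create(content=u"example question?", user=user)

# Link the keyword to the question with a relevance score.
WordQueryRelation.objects.create(word=kw, query=q, value=0.9)
print(kw.content_object)  # -> the SciBlog instance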
Fall is upon us, and with that comes the “Meet The Cast” releases for all of your favorite shows. Next up is The Amazing Race, embarking on its 19th season of racing around the world with host Phil Keoghan. There are a few familiar faces this season and, once again, a great mixture of people. You’ll recognize Ethan Zohn and Jenna Morasca, two former winners of Survivor. You’ll also meet an Olympic snowboarding pair, twin sisters, an ex-NFL tight end, gay flight attendants (who are domestic partners), and the youngest person ever to sail around the world alone. That’s pretty “amazing” in itself! This season, teams will face a new penalty called the “Hazard,” which will impact one team’s future on the Race right from the beginning. Teams will first depart from Southern California for their first destination, Taipei, Taiwan. Those that make it past the first leg will find themselves in an 8th-century Buddhist temple in Indonesia, riding elephants through the rain forests of Thailand, and racing for their lives when they find out they’ll be facing the series’ first-ever Double Elimination. I love that they’re introducing some new twists! “The Amazing Race” premieres Sunday, September 25th at 8 p.m. ET on CBS. Check out the teams below and vote for your favorite!

I think they’ll be a force to be reckoned with. Both are in shape, and one of them is a project manager, so perhaps they’ll be able to apply their work skills and abilities to the tasks they’ll face. Oh, and Ernie is cute!

Also a great pairing, I think. A doctor obviously has the smarts and patience. Speaking of patience, Jennifer is a special ed teacher, which would give her a lot of patience as well. Hopefully they aren’t the kind of siblings that fight all the time.

Aww, look at these cute twins! Twins always have an advantage because they know each other so well: what each of their skills and abilities are, as well as the second half of their sentences. I should know; my brothers are twins.

Yay, a gay couple! Not only do I think these guys will have the ambition to get to the end, but their experience traveling as a profession will certainly help them make wise decisions with flights, etc. Go team gay!

Who doesn’t love some “celebrity reality stars” thrown into the mix? As winners of Survivor, these two are pretty much the biggest threat in the game. These guys need to get U-Turned at the first opportunity!

These two are so adorable! Of course their age is something that may work against them, but I think they won’t be as high-strung as some of the others, which is often the downfall of a team. Just two high school sweethearts embarking on the adventure of a lifetime!

Well, these two certainly will have what it takes physically, being former Olympians. I also think they have the “chill” factor, meaning I don’t think they’ll be at each other’s throats. Whenever you can eliminate that flaw from a team, it’s a huge edge.

These two seem like they’ll work together well, given their success to date in the NFL and in business. I don’t know much about them otherwise, but they seem pretty gung-ho and perhaps a good team for others to align with.

Gotta throw those showgirls and cocktail waitresses into the mix, don’t you, CBS! They seem to love this profession when selecting new people for their reality shows. Anyway, they seem pretty in shape and could be a good team as long as they are prepared to face a world outside of the bubble that is Las Vegas.

I love a father/son team! What a once-in-a-lifetime experience it is to go through something like this with a parent or child. I think they could make it far, as they’ll complement each other’s skills well.

I can’t seem to identify which team will be the one that fights all the time, so I’m going to go with this one. I could be completely wrong, but there has to be one in every cast, LOL.

Vote for who you are rooting for below! The Amazing Race 19: Which Are You Cheering On?
from __future__ import division

import logging
import os

import pygame

from core import prepare
from core.tools import open_dialog
from core.components import save
from core.components.menu import PopUpMenu
from core.components.menu.interface import MenuItem
from core.components.ui import text

# Create a logger for optional handling of debug messages.
logger = logging.getLogger(__name__)
logger.debug("%s successfully imported" % __name__)


class SaveMenuState(PopUpMenu):
    number_of_slots = 3
    shrink_to_items = True

    def initialize_items(self):
        empty_image = None
        rect = self.game.screen.get_rect()
        slot_rect = pygame.Rect(0, 0, rect.width * 0.80, rect.height // 6)

        for i in range(self.number_of_slots):
            # Check to see if a save exists for the current slot
            if os.path.exists(prepare.SAVE_PATH + str(i + 1) + ".save"):
                image = self.render_slot(slot_rect, i + 1)
                yield MenuItem(image, "SAVE", None, None)
            else:
                if not empty_image:
                    empty_image = self.render_empty_slot(slot_rect)
                yield MenuItem(empty_image, "SAVE", None, None)

    def render_empty_slot(self, rect):
        slot_image = pygame.Surface(rect.size, pygame.SRCALPHA)
        rect = rect.move(0, rect.height // 2 - 10)
        text.draw_text(slot_image, "Empty Slot", rect, font=self.font)
        return slot_image

    def render_slot(self, rect, slot_num):
        slot_image = pygame.Surface(rect.size, pygame.SRCALPHA)

        # TODO: catch missing file
        thumb_image = pygame.image.load(prepare.SAVE_PATH + str(slot_num) + ".png").convert()
        thumb_rect = thumb_image.get_rect().fit(rect)
        thumb_image = pygame.transform.smoothscale(thumb_image, thumb_rect.size)

        # Draw the screenshot
        slot_image.blit(thumb_image, (rect.width * .20, 0))

        # Draw the slot text
        rect = rect.move(0, rect.height // 2 - 10)
        text.draw_text(slot_image, "Slot " + str(slot_num), rect, font=self.font)

        # Try to load the save game and draw details about the save.
        # A corrupted save should show an error entry, not crash the menu,
        # so the exception is logged and a placeholder is rendered instead.
        try:
            save_data = save.load(slot_num)
        except Exception as e:
            logger.error("Failed loading save file: %s", e)
            save_data = {"error": "Save file corrupted"}

        if "error" not in save_data:
            x = int(rect.width * .5)
            text.draw_text(slot_image, save_data['player_name'], (x, 0, 500, 500), font=self.font)
            text.draw_text(slot_image, save_data['time'], (x, 50, 500, 500), font=self.font)

        return slot_image

    def on_menu_selection(self, menuitem):
        logger.info("Saving!")
        try:
            save.save(self.game.player1,
                      self.capture_screenshot(),
                      self.selected_index + 1,
                      self.game)
        except Exception as e:
            logger.error("Unable to save game!!")
            logger.error(e)
            open_dialog(self.game, ["There was a problem saving!"])
            self.game.pop_state(self)
        else:
            open_dialog(self.game, ["Saved!"])
            self.game.pop_state(self)

    def capture_screenshot(self):
        screenshot = pygame.Surface(self.game.screen.get_size())
        world = self.game.get_state_name("WorldState")
        world.draw(screenshot)
        return screenshot
Our professional waste removal team in Eagle River, WI can handle any type of waste the same day. When our team arrives at your location, you can simply point us to all the waste materials you need taken away and we will give you an upfront, discounted price quote – now that’s truly cost effective. Call us at 888-637-1190 to request a free quote! 1000+ satisfied customers can’t be wrong when they picked us for the job. We are the best waste removal company in Eagle River, WI, and we don’t just remove your waste: we also recycle or donate household items, furniture, scrap metal and various kinds of waste materials that we remove from businesses and homes. We pride ourselves on providing only the best and most highly trained waste removal experts in Eagle River, WI. The only thing you need to do is point us to your location and we will remove all your waste using the most advanced techniques and tools available - you don’t have to lift a finger because we will do all the lifting and loading for you! One of the most important pieces of information to supply our company with is the type and amount of waste you want removed from your pit or house. This lets us determine the type of truck we need to bring to your pit in Eagle River, WI and the number of roll-off containers or garbage bins we need to reserve. It also tells us whether the waste you want to remove is dangerous or needs special handling. In that case, there will be an additional cost for removing these types of waste, since it will require several experts with the knowledge and expertise in dangerous waste removal in Eagle River, WI to take away these materials. To learn more about our waste removal services, you can call our office now at 888-637-1190 and request a free quote. To get the right price quote from our waste removal company in Eagle River, WI, you need to call us or send an e-mail. We always have the time to go to the location of your waste and provide you an upfront price quote. The total cost of the service will be based on the amount and type of waste and your location in Eagle River, WI. However, our waste removal company always offers the best price in the area.
import os
import json
import datetime
import logging
import sys

# Path to the admin database directory.
d2 = os.getcwd()
d2 = d2.replace("bin", "")
d2 = d2.replace("\\", "/")
sys.path.append(d2)
d2 = d2 + "db/admin/"

##################

# Path to the user database directory.
d3 = os.getcwd()
d3 = d3.replace("bin", "")
d3 = d3.replace("\\", "/")
sys.path.append(d3)
d3 = d3 + "db/user/"

# Today's date, used to name the log and statement files.
time = datetime.date.today()
time = str(time)
time = time.replace("-", "_")

# In-memory state of the currently logged-in user.
user_stat = {"kahao": "", "user": "", "pass": "", "edu": "", "benyueedu": "", "createdata": "", "status": "", "saving": ""}


def main():
    flag = True
    msg = ["1.取款", "2.存款", "3.转账", "4.还款"]
    while flag:
        for i in msg:
            print(i)
        choose = input("请输入你要选择操作的编号:")
        choose = choose.strip()
        if choose == "1":
            print("你选择了取款")
            qukuan()
        elif choose == "2":
            print("你选择了存款")
            chunkuan()
        elif choose == "3":
            print("你选择了转账")
            zhuanzhang()
        elif choose == "4":
            print("你选择了还款")
            huankuan()
        elif choose == "q":
            break
        else:
            print("你选择的操作编号不正确,请重新输入")


# Withdrawal module
def qukuan():
    flag = True
    kahao = user_stat.get("kahao")
    saving = user_stat.get("saving")
    user = user_stat.get("user")
    edu = user_stat.get("edu")
    saving = int(saving)
    edu = int(edu)
    while flag:
        choose = input("请选择取款金额:")
        choose = choose.strip()
        if choose == "q":
            exit("你选择了退出")
        elif choose == "b":
            break
        if choose.isdigit():
            print("你输入的金额格式正确")
            choose = int(choose)
            if choose <= saving:
                print("你的储存卡的余额足够,可以提现,提现成功!!!!")
                saving = saving - choose
                # Update the user's stored data.
                os.chdir(d3)
                os.chdir(kahao)
                basic_infor = json.load(open("basic_infor", "r"))
                basic_infor["saving"] = saving
                json.dump(basic_infor, open("basic_infor", "w"))
                # Update finished.
                message = "卡号{kahao},用户{user},取款{choose}成功!!!".format(kahao=kahao, user=user, choose=choose)
                log_suer("普通用户取款成功", kahao, message)
                user_stat["saving"] = saving
                ##### Write to the statement.
                zhangdan_user("提现记录", kahao, message)
                break
            elif choose > saving:
                print("你的储存卡的余额不够,需要从信用卡提现")
                kamax = edu * 0.7       # Maximum cash advance allowed on the credit card.
                tixian = choose - saving  # Part drawn from the credit card after the debit balance is used up.
                if tixian <= kamax:
                    print("可以提现,提现成功!!!!")
                    # Update the user's stored data.
                    os.chdir(d3)
                    os.chdir(kahao)
                    basic_infor = json.load(open("basic_infor", "r"))
                    basic_infor["saving"] = "0"
                    edu = edu - tixian - tixian * 0.05  # A 5% fee applies to the cash advance.
                    #basic_infor["edu"] = edu
                    basic_infor["benyueedu"] = edu
                    json.dump(basic_infor, open("basic_infor", "w"))
                    # Update finished.
                    message = "卡号{kahao},用户{user},取款{choose}成功!!!".format(kahao=kahao, user=user, choose=choose)
                    log_suer("普通用户取款成功", kahao, message)
                    user_stat["saving"] = "0"
                    user_stat["benyueedu"] = edu
                    ##### Write to the statement.
                    zhangdan_user("提现记录", kahao, message)
                    break
                elif tixian > kamax:
                    print("不可以提现,你要提现的金额超出范围")
                    message = "不可以提现,你要提现的金额超出范围"
                    log_suer("不可以提现,你要提现的金额超出范围", kahao, message)
        else:
            print("你输入的金额格式错误,请重新输入")
            message = "你输入的金额格式错误,请重新输入"
            log_suer("你输入的金额格式错误,请重新输入", kahao, message)


# Deposit module
def chunkuan():
    flag = True
    kahao = user_stat.get("kahao")
    saving = user_stat.get("saving")
    user = user_stat.get("user")
    while flag:
        choose = input("请选择要存款的金额: ")
        choose = choose.strip()
        if choose == "q":
            exit("你选择了退出")
        elif choose == "b":
            break
        if choose.isdigit():
            print("你输入的金额正确")
            choose = int(choose)
            saving = int(saving)
            saving = choose + saving
            # Update the user's stored data.
            os.chdir(d3)
            os.chdir(kahao)
            basic_infor = json.load(open("basic_infor", "r"))
            basic_infor["saving"] = saving
            json.dump(basic_infor, open("basic_infor", "w"))
            # Update finished.
            message = "卡号{kahao}用户{user}存款{choose}".format(kahao=kahao, user=user, choose=choose)
            log_suer("用户存款成功!!!", kahao, message)
            user_stat["saving"] = saving
            ##### Write to the statement.
            zhangdan_user("存款记录", kahao, message)
            break
        else:
            print("你输入的金额不正确")
            message = "你输入的金额不正确"
            log_suer("你输入的金额不正确!!!", kahao, message)


# Transfer module
def zhuanzhang():
    flag = True
    kahao = user_stat.get("kahao")
    saving = user_stat.get("saving")
    user = user_stat.get("user")
    while flag:
        choose_user = input("请选择转账用户卡号:")
        if choose_user == "q":
            exit("你选择了退出")
        elif choose_user == "b":
            break
        choose_cash = input("请选择转账金额: ")
        choose_user = choose_user.strip()
        choose_cash = choose_cash.strip()
        os.chdir(d3)
        if os.path.exists(choose_user):
            print("你要转账的用户卡号在系统中,可以转账")
            if choose_cash.isdigit():
                print("你输入的金额格式正确")
                choose_cash = int(choose_cash)
                saving = int(saving)
                # Check whether there is enough money to transfer.
                if saving >= choose_cash:
                    print("你账户里面有足够的钱可以转账")
                    # Debit our own account first.
                    os.chdir(kahao)
                    basic_infor = json.load(open("basic_infor", "r"))
                    saving = saving - choose_cash
                    basic_infor["saving"] = saving
                    json.dump(basic_infor, open("basic_infor", "w"))
                    user_stat["saving"] = saving
                    # Debit finished.
                    # Credit the receiving user.
                    os.chdir(d3)
                    os.chdir(choose_user)
                    basic_infor = json.load(open("basic_infor", "r"))
                    old = basic_infor.get("saving")
                    old = int(old)  # Previous balance of the receiving account.
                    new = old + choose_cash
                    basic_infor["saving"] = new
                    json.dump(basic_infor, open("basic_infor", "w"))
                    print("转账成功!!!!!!!!!!!!")
                    # Transfer finished.
                    message = "卡号{kahao}用户{user}转入给{choose_user}转账金额{choose_cash}元".format(kahao=kahao, user=user, choose_user=choose_user, choose_cash=choose_cash)
                    log_suer("用户转账成功!!!", kahao, message)
                    ##### Write to the statement.
                    zhangdan_user("转账记录", kahao, message)
                    break
                else:
                    print("你的账户余额不足,不能转账,重新输入金额")
                    message = "你的账户余额不足,不能转账"
                    log_suer("账户余额不足", kahao, message)
            else:
                print("你输入的金额格式不正确,请重新输入")
                message = "你输入的金额格式不正确,请重新输入"
                log_suer("输入的金额格式不正确", kahao, message)
        else:
            print("你要转账的用户卡号不在系统中,请重新输入")
            message = "你要转账的用户卡号不在系统中,请重新输入"
            log_suer("转账的用户卡号不在系统中", kahao, message)


# Repayment module
def huankuan():
    flag = True
    kahao = user_stat.get("kahao")
    edu = user_stat.get("edu")            # The card's credit limit.
    benyueedu = user_stat.get("benyueedu")  # The credit remaining this month.
    saving = user_stat.get("saving")      # The debit card balance.
    edu = int(edu)
    benyueedu = int(benyueedu)
    saving = int(saving)
    while flag:
        if edu > benyueedu:
            print("你需要还款")
            cash = input("请输入你要还款的金额:")
            cash = cash.strip()
            if cash == "q":
                exit("你选择了退出")
            elif cash == "b":
                break
            if cash.isdigit():
                print("你输入的金额合法")
                cash = int(cash)
                if cash <= saving:
                    print("你的余额足够,开始还款")
                    os.chdir(d3)
                    os.chdir(kahao)
                    basic_infor = json.load(open("basic_infor", "r"))
                    saving = saving - cash
                    benyueedu = benyueedu + cash
                    benyueedu = int(benyueedu)
                    if edu == benyueedu:
                        basic_infor["saving"] = saving
                        basic_infor["benyueedu"] = benyueedu
                        ###### Persist the update.
                        json.dump(basic_infor, open("basic_infor", "w"))
                        user_stat["saving"] = saving
                        user_stat["benyueedu"] = benyueedu
                        ###### Update finished.
                        print("你已经全部还完额度")
                        message = "你已经全部还完额度"
                        log_suer("你已经全部还完额度", kahao, message)
                        ##### Write to the statement.
                        zhangdan_user("还款记录", kahao, message)
                        break
                    elif edu > benyueedu:
                        basic_infor["saving"] = saving
                        basic_infor["benyueedu"] = benyueedu
                        ###### Persist the update.
                        json.dump(basic_infor, open("basic_infor", "w"))
                        user_stat["saving"] = saving
                        user_stat["benyueedu"] = benyueedu
                        message = "你已经还款,但是还没还清"
                        log_suer("你已经还款,但是还没还清", kahao, message)
                        ###### Update finished.
                        ##### Write to the statement.
                        zhangdan_user("还款记录", kahao, message)
                        break
                    elif edu < benyueedu:
                        # Over-payment: roll back the local amounts and ask again.
                        saving = saving + cash
                        benyueedu = benyueedu - cash
                        print("你所还的钱超出了你的欠款,请重新输入")
                        message = "你所还的钱超出了你的欠款,请重新输入"
                        log_suer("你所还的钱超出了你的欠款,请重新输入", kahao, message)
                else:
                    print("你的余额不足,无法还款")
                    message = "你的余额不足,无法还款"
                    log_suer("你的余额不足,无法还款", kahao, message)
            else:
                print("你输入的金额不合法,请重新输入")
                message = "你输入的金额不合法,请重新输入"
                log_suer("你输入的金额不合法,请重新输入", kahao, message)
        elif edu == benyueedu:
            print("你不需要还款")
            message = "你不需要还款"
            log_suer("你不需要还款", kahao, message)
            break


# Login module
def login():
    flag = True
    while flag:
        os.chdir(d3)
        user = input("请输入卡号: ")
        pas = input("请输入密码: ")
        user = user.strip()
        pas = pas.strip()
        if os.path.exists(user):
            print("你输入的用户存在")
            os.chdir(user)
            basic_infor = json.load(open("basic_infor", "r"))
            kahao = basic_infor.get("kahao")
            pas2 = basic_infor.get("pass")
            status = basic_infor.get("status")
            edu = basic_infor.get("edu")
            benyueedu = basic_infor.get("benyueedu")
            user = basic_infor.get("user")
            createdata = basic_infor.get("createdata")
            saving = basic_infor.get("saving")
            if pas == pas2 and status == 0:
                print("账户密码正确,成功登陆")
                user_stat["kahao"] = kahao
                user_stat["user"] = user
                user_stat["pass"] = pas2
                user_stat["edu"] = edu
                user_stat["benyueedu"] = benyueedu
                user_stat["createdata"] = createdata
                user_stat["status"] = status
                user_stat["saving"] = saving
                message = "卡号{kahao},用户{user}".format(kahao=kahao, user=user)
                os.chdir(d2)
                log("普通用户登录成功", "record/", message)
                ###############
                log_suer("普通用户登录成功", kahao, message)
                return True
            else:
                print("账户密码不正确,登陆失败")
                os.chdir(d2)
                message = "账户密码不正确,登陆失败"
                log("账户密码不正确,登陆失败", "record/", message)
        else:
            print("你输入的用户不存在")
            os.chdir(d2)
            message = "你输入的用户不存在"
            log("你输入的用户不存在", "record/", message)


# Logging module
def log_suer(name, kahao, message):
    # Create (or fetch) the logger for this event type.
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    # File handler writing into the user's own record directory.
    os.chdir(d3)
    os.chdir(kahao)
    os.chdir("record")
    fh = logging.FileHandler(time + ".log")
    fh.setLevel(logging.DEBUG)
    # Log line format.
    formatter = logging.Formatter("%(asctime)s-%(name)s-%(levelname)s-%(message)s")
    # Register the formatter with the handler.
    fh.setFormatter(formatter)
    # Attach the handler only for this call; removing it afterwards
    # prevents duplicated lines when the function is called again.
    logger.addHandler(fh)
    logger.info(message)
    logger.removeHandler(fh)
    fh.close()


def log(name, path, message):
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    # File handler writing into the given directory.
    os.chdir(path)
    fh = logging.FileHandler(time + ".log")
    fh.setLevel(logging.DEBUG)
    formatter = logging.Formatter("%(asctime)s-%(name)s-%(levelname)s-%(message)s")
    fh.setFormatter(formatter)
    logger.addHandler(fh)
    logger.info(message)
    logger.removeHandler(fh)
    fh.close()


# Statement (bill) writer: same mechanism as the logs, but a separate .record file.
def zhangdan_user(name, kahao, message):
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    os.chdir(d3)
    os.chdir(kahao)
    os.chdir("record")
    fh = logging.FileHandler(time + ".record")
    fh.setLevel(logging.DEBUG)
    formatter = logging.Formatter("%(asctime)s-%(name)s-%(levelname)s-%(message)s")
    fh.setFormatter(formatter)
    logger.addHandler(fh)
    logger.info(message)
    logger.removeHandler(fh)
    fh.close()


def run():
    r = login()
    if r == True:
        main()


if __name__ == "__main__":
    run()
This brand new development of just 10 very spacious 3-bed ground-floor and penthouse apartments is located just a few minutes’ walk from the town centre of the popular modern coastal town of Pilar De La Horadada. Whilst located in a quiet residential area, the main high street, with a great choice of shops, bars, restaurants and services, is just about a 300m walk away. The town also has a fantastic public sports village with football pitches, bowling green, indoor swimming pool, climbing wall, multi-use courts, gym and tennis courts, as well as all the usual amenities and services you would expect, such as schools, medical centres, parks and leisure facilities. The fantastic Lo Romero Golf Club & Resort is just a 5 minute drive, and the beautiful golden sandy beaches of Torre De La Horadada and Mil Palmeras are around 3.5km away. In addition, the beaches and natural park of San Pedro Del Pinatar on the Mar Menor Sea, with its famous therapeutic mud baths and pink flamingo breeding grounds, are also just a 5 minute drive away. 2 seas and 2 beach resorts, both within a 5 minute drive! The properties consist of large open-plan living areas with a modern fitted kitchen, 3 double bedrooms and 2 bathrooms. All the apartments feature huge outside spaces, with the ground floors having large private gardens and terraces and the penthouses having huge 75m2 to 90m2 private roof solariums. All the apartments come complete with all kitchen appliances, LED lighting, fully furnished bathrooms with underfloor heating, a choice of kitchen units, as well as a large choice of top quality Porcelanosa floor and wall tiles throughout. The development is gated and each property has the use of the large communal pool and gardens.
'''
UForgeCLI
'''

try:
    from urllib.parse import urlencode
except ImportError:
    from urllib import urlencode
import argparse
import getpass
import base64
import httplib2
import os
import sys

import ussclicore.utils.generics_utils
from ussclicore.utils import printer
from ussclicore.cmd import Cmd, CmdUtils
from ussclicore.argumentParser import CoreArgumentParser, ArgumentParser, ArgumentParserError

import commands
from uforge.application import Api
from utils import *

__author__ = "UShareSoft"
__license__ = "Apache License 2.0"


class CmdBuilder(object):
    @staticmethod
    def generateCommands(class_):
        # Create subCmds if it does not exist yet
        if not hasattr(class_, 'subCmds'):
            class_.subCmds = {}

        user = commands.user.User_Cmd()
        class_.subCmds[user.cmd_name] = user

        entitlement = commands.entitlement.Entitlement_Cmd()
        class_.subCmds[entitlement.cmd_name] = entitlement

        subscription = commands.subscription.Subscription_Cmd()
        class_.subCmds[subscription.cmd_name] = subscription

        role = commands.role.Role_Cmd()
        class_.subCmds[role.cmd_name] = role

        images = commands.images.Images_Cmd()
        class_.subCmds[images.cmd_name] = images

        org = commands.org.Org_Cmd()
        class_.subCmds[org.cmd_name] = org

        os = commands.os.Os_Cmd()
        class_.subCmds[os.cmd_name] = os

        pimages = commands.pimages.Pimages_Cmd()
        class_.subCmds[pimages.cmd_name] = pimages

        usergrp = commands.usergrp.Usergrp_Cmd()
        class_.subCmds[usergrp.cmd_name] = usergrp

        template = commands.template.Template_Cmd()
        class_.subCmds[template.cmd_name] = template


## Main cmd
class Uforgecli(Cmd):
    #subCmds = {
    #       'tools': CmdUtils
    #}

    def __init__(self):
        super(Uforgecli, self).__init__()
        self.prompt = 'uforge-cli >'

    def do_exit(self, args):
        return True

    def do_quit(self, args):
        return True

    def arg_batch(self):
        doParser = ArgumentParser("batch", add_help=False, description="Execute uforge-cli batch command from a file (for scripting)")
        mandatory = doParser.add_argument_group("mandatory arguments")
        optional = doParser.add_argument_group("optional arguments")
        mandatory.add_argument('--file', dest='file', required=True, help="uforge-cli batch file commands")
        optional.add_argument('-f', '--fatal', dest='fatal', action='store_true', required=False, help="exit on first error in batch file (default is to continue)")
        # Help is not added at parser creation time because that would create
        # two separate argument groups for optional arguments.
        optional.add_argument('-h', '--help', action='help', help="show this help message and exit")
        return doParser

    def do_batch(self, args):
        try:
            doParser = self.arg_batch()
            try:
                doArgs = doParser.parse_args(args.split())
            except SystemExit as e:
                return
            with open(doArgs.file) as f:
                for line in f:
                    try:
                        self.run_commands_at_invocation([line])
                    except Exception:
                        printer.out("bad command '" + line + "'", printer.ERROR)
                        # If the fatal optional argument is specified, stop at the first error.
                        if doArgs.fatal:
                            printer.out("Fatal error leading to exit task", printer.ERROR)
                            return
                    print("\n")
        except IOError as e:
            printer.out("File error: " + str(e), printer.ERROR)
            return
        except ArgumentParserError as e:
            printer.out("In Arguments: " + str(e), printer.ERROR)
            self.help_batch()

    def help_batch(self):
        doParser = self.arg_batch()
        doParser.print_help()

    def cmdloop(self, args):
        if len(args):
            code = self.run_commands_at_invocation([str.join(' ', args)])
            sys.exit(code)
        else:
            self._cmdloop()


def generate_base_doc(app, uforgecli_help):
    myactions = []
    cmds = sorted(app.subCmds)
    for cmd in cmds:
        myactions.append(argparse._StoreAction(
            option_strings=[],
            dest=str(cmd),
            nargs=None,
            const=None,
            default=None,
            type=str,
            choices=None,
            required=False,
            help=str(app.subCmds[cmd].__doc__),
            metavar=None))
    return myactions


def set_globals_cmds(subCmds):
    for cmd in subCmds:
        if hasattr(subCmds[cmd], 'set_globals'):
            subCmds[cmd].set_globals(api, username, password)
        if hasattr(subCmds[cmd], 'subCmds'):
            set_globals_cmds(subCmds[cmd].subCmds)


# Generate the Uforgecli base commands + the help base command
CmdBuilder.generateCommands(Uforgecli)
app = Uforgecli()
myactions = generate_base_doc(app, uforgecli_help="")

# Args parsing
mainParser = CoreArgumentParser(add_help=False)
CoreArgumentParser.actions = myactions
mainParser.add_argument('-U', '--url', dest='url', type=str, help='the server URL endpoint to use', required=False)
mainParser.add_argument('-u', '--user', dest='user', type=str, help='the user name used to authenticate to the server', required=False)
mainParser.add_argument('-p', '--password', dest='password', type=str, help='the password used to authenticate to the server', required=False)
mainParser.add_argument('-v', action='version', help='displays the current version of the uforge-cli tool', version="%(prog)s version '" + constants.VERSION + "'")
mainParser.add_argument('-h', '--help', dest='help', action='store_true', help='show this help message and exit', required=False)
mainParser.add_argument('-k', '--publickey', dest='publickey', type=str, help='public API key to use for this request. Default: no default', required=False)
mainParser.add_argument('-s', '--secretkey', dest='secretkey', type=str, help='secret API key to use for this request. Default: no default', required=False)
mainParser.add_argument('-c', '--no-check-certificate', dest='crypt', action="store_true", help='Don\'t check the server certificate against the available certificate authorities', required=False)
mainParser.set_defaults(help=False)
mainParser.add_argument('cmds', nargs='*', help='UForge CLI cmds')
mainArgs, unknown = mainParser.parse_known_args()

if mainArgs.help and not mainArgs.cmds:
    mainParser.print_help()
    exit(0)

if mainArgs.user is not None and mainArgs.url is not None:
    if not mainArgs.password:
        mainArgs.password = getpass.getpass()
    username = mainArgs.user
    password = mainArgs.password
    url = mainArgs.url
    if mainArgs.crypt == True:
        sslAutosigned = True
    else:
        sslAutosigned = False
else:
    mainParser.print_help()
    exit(0)

# UForge API instantiation
client = httplib2.Http(disable_ssl_certificate_validation=sslAutosigned, timeout=constants.HTTP_TIMEOUT)
# activate http caching
#client = httplib2.Http(generics_utils.get_Uforgecli_dir()+os.sep+"cache")
headers = {}
headers['Authorization'] = 'Basic ' + base64.encodestring(username + ':' + password)
api = Api(url, client=client, headers=headers)
set_globals_cmds(app.subCmds)

if mainArgs.help and len(mainArgs.cmds) >= 1:
    argList = mainArgs.cmds + unknown
    argList.insert(len(mainArgs.cmds) - 1, "help")
    app.cmdloop(argList)
elif mainArgs.help:
    app.cmdloop(mainArgs.cmds + unknown + ["-h"])
else:
    app.cmdloop(mainArgs.cmds + unknown)
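A quick usage sketch of the batch mode wired up above. The endpoint URL and credentials are placeholders, and the two commands in the example file are only illustrative; any subcommand registered in CmdBuilder (user, org, images, ...) goes one per line:

# commands.txt (illustrative contents)
#   user --help
#   org --help

uforge-cli -U https://uforge.example.com/api -u admin -p secret batch --file commands.txt --fatal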
Firefighter Edward Lindemulder suffered from permanent, irreversible chronic obstructive pulmonary disease. The City of Naperville, Illinois put Lindemulder on medical leave because his COPD prevented him from performing the essential functions of his job. Lindemulder applied for line-of-duty or occupational disease disability benefits. He claimed his COPD was caused or exacerbated by his exposure to diesel fumes at the fire station or to fire smoke. The board denied Lindemulder’s request, but did award a non-duty pension. The board ruled “that any alleged on-duty incidents or exposures did not cause or contribute to plaintiff’s [Lindemulder’s] disability, which instead was caused by cigarette smoking.” Lindemulder requested review of the board’s decision. But the trial court agreed with the board, so Lindemulder appealed. The applicable standard of review depends upon whether the issue is one of fact, one of law, or a mixed question of law and fact … We will reverse a ruling on a question of fact if it is against the manifest weight of the evidence … We review questions of law de novo and mixed questions of law and fact under the “clearly erroneous” standard … The examination of the legal effect of a given set of facts is what requires review under the “clearly erroneous” standard … Here, in finding that plaintiff’s disability was the result of cigarette smoking and that no on-duty incidents or exposures caused or contributed to his disability, the Board ruled on questions of fact. Accordingly, our review is whether the Board’s decision was against the manifest weight of the evidence. In the end, the appellate court affirmed the decision of the board. Read the whole case, Lindemulder v. Board of Trustees of the Naperville Firefighters’ Pension Fund, No. 2-10-0063 (3/8/11).
#!/bin/python3

import sys
import time
from random import randint

import pygame
from pygame import Surface, Color
from pygame.locals import QUIT

from Canopto import Canopto

display_size = (8, 8)
cans = Canopto(display_size[0], display_size[1], True, True)

# Create an image, wider than the display, with randomized lines across it
# and a little buffer to each side, then scroll the image over the display.
while True:
    # Make an image 1000 pixels wide
    lines = Surface((1000, display_size[1] + 2))
    lines.fill(Color(0, 0, 92, 255))

    # Draw lines: come up with a set of points, allow random space between
    # each, and generate up to the end of the surface.
    # Simple algorithm: generate one line.
    margin = 5
    currX = margin
    points = [(margin, randint(0, lines.get_height() - 1))]
    while currX < lines.get_width() - margin:
        currX = randint(currX + 7, currX + 30)
        currX = min(lines.get_width() - margin, currX)
        points.append((currX, randint(1, lines.get_height() - 2)))

    # Draw a line through the points
    #line_color = Color(54, 255, 54, 255)
    line_color = Color(255, 128, 0, 255)
    pygame.draw.aalines(lines, line_color, False, points)

    # Scroll the image across the canopto
    for x in range(0, lines.get_width() - (display_size[0] - 1)):
        frame = Surface(display_size)
        frame.blit(lines, (-x, -1))
        cans.drawSurface(frame)
        cans.update()

        for event in pygame.event.get():
            if event.type == QUIT:
                sys.exit(0)

        time.sleep(0.03)
MAN CITY TEAM NEWS: Manchester City take on Leicester this evening in the Carabao Cup quarter-finals. Pep Guardiola’s men travel to the King Power Stadium to take on the Foxes as they bid to defend the trophy they won last season. The Spaniard has indicated that he will field a strong side for the match; with a number of key players injured right now, a few youngsters are still likely to feature. David Silva is out until early January with a hamstring problem, while Benjamin Mendy won’t return until February as he recovers from a knee injury. Goalkeeper Claudio Bravo is still out with a ruptured Achilles, so Aro Muric should play for the third consecutive Carabao Cup clash. Danilo (knock) and John Stones (knee) could both return to the line-up after brief lay-offs, and Kevin De Bruyne will definitely start for the first time since his second knee injury of the season. Sergio Aguero is also ready to start after a groin injury forced him out for the past couple of weeks. Aymeric Laporte could be rested, with Vincent Kompany coming in and Oleksandr Zinchenko likely to play at left-back. In central midfield, Ilkay Gundogan could play the Fernandinho role, with the Brazilian having been overworked this season and plenty more Premier League games coming up. Phil Foden should get a start after impressing against Hoffenheim last week, and Raheem Sterling is also likely to play after being benched at the weekend. Brahim Diaz may not be involved, with the youngster rumoured to be moving to Real Madrid next month.
from rest_framework import serializers
from rest_framework.exceptions import ValidationError
from django.contrib.auth.models import User

from .models import Account


class AccountCreateSerializer(serializers.ModelSerializer):
    username = serializers.CharField(source='user.username')
    password = serializers.CharField(source='user.password', style={'input_type': 'password'})

    class Meta:
        model = Account
        fields = [
            'id',
            'username',
            'displayName',
            'facebook',
            'password',
        ]

    def create(self, validated_data):
        user_data = validated_data.pop('user')
        user = User.objects.create(**user_data)
        # Hash the password; User.objects.create alone would store it as plain text.
        user.set_password(user_data['password'])
        user.save()
        account = Account.objects.create(user=user, **validated_data)
        account.username = user.username
        account.save()
        return account


class AccountSerializer(serializers.ModelSerializer):
    class Meta:
        model = Account
        fields = [
            'id',
            'username',
            'displayName',
            'facebook',
        ]


class AccountRetrieveSerializer(serializers.ModelSerializer):
    class Meta:
        model = Account
        fields = [
            'id',
            'username',
            'displayName',
            'facebook',
        ]


class UpdateAccountSerializer(serializers.ModelSerializer):
    password = serializers.CharField(source='user.password', allow_blank=True, allow_null=True)
    facebook = serializers.CharField(allow_blank=True, allow_null=True)
    displayName = serializers.CharField(allow_blank=True, allow_null=True)

    class Meta:
        model = Account
        fields = [
            'displayName',
            'facebook',
            'password',
        ]

    def update(self, instance, validated_data):
        user_data = validated_data.pop('user', None)
        user = User.objects.get(id=instance.user.id)
        instance.displayName = self.value_or_keep(instance.displayName, validated_data.get('displayName', instance.displayName))
        instance.facebook = self.value_or_keep(instance.facebook, validated_data.get('facebook', instance.facebook))
        # Only touch the password when one was actually supplied.
        if user_data and user_data.get('password'):
            user.set_password(user_data['password'])
            user.save()
        instance.save()
        return instance

    @staticmethod
    def value_or_keep(field, value):
        # Blank or null means "keep the current value".
        if value in ("", None):
            return field
        return value


class AuthenticateSerializer(serializers.ModelSerializer):
    username = serializers.CharField(source='user.username')
    password = serializers.CharField(source='user.password', style={'input_type': 'password'})
    account = AccountSerializer(allow_null=True, read_only=True)

    class Meta:
        model = User
        depth = 1
        fields = [
            'username',
            'password',
            'account',
        ]
        extra_kwargs = {"password": {"write_only": True}}

    def validate(self, attrs):
        validation_data = dict(attrs)['user']
        username = validation_data.get('username', None)
        password = validation_data.get('password', None)
        try:
            user = User.objects.get(username=username)
        except User.DoesNotExist:
            raise ValidationError("Incorrect username/password.")
        if user.check_password(password):
            attrs['account'] = user.account
            return attrs
        raise ValidationError("Incorrect username/password.")
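A minimal sketch of how the create flow above is exercised, e.g. from a test or a Django shell. The field values are placeholders, and the behavior assumes the Account model implied by the serializers:

# Hypothetical usage of AccountCreateSerializer (values are made up).
data = {"username": "alice", "password": "s3cret", "displayName": "Alice", "facebook": ""}
serializer = AccountCreateSerializer(data=data)
if serializer.is_valid():
    account = serializer.save()  # creates the User, hashes the password, then the Account
else:
    print(serializer.errors)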
Ikea Bathroom Inspiration Landscape Ikea Bathroom Hacks Index is part of our collection of great design ideas. It was created by combining fantastic ideas and interesting arrangements, and by following the current trends in the field, to leave you more inspired and add artistic touches. We'd be honored if you applied some or all of these designs in your home; brilliant ideas are at their best when they can be applied in real life and amaze the people around you. Ikea Bathroom Inspiration Landscape Ikea Bathroom Hacks Index was posted on June 8, 2018 at 5:28 am and has been viewed by 54 users. Click it to download the image.
# Copyright 2019 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # https://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from releasetool.commands.tag.java import ( _parse_release_tag, kokoro_job_name, package_name, ) RELEASE_PLEASE_OUTPUT = """ ✔ creating release v1.20.0 ✔ Created release: https://github.com/googleapis/java-bigtable/releases/tag/v1.20.0. ✔ adding comment to https://github.com/googleapis/java-bigtable/issue/610 ✔ adding label autorelease: tagged to https://github.com/googleapis/java-bigtable/pull/610 ✔ removing label autorelease: pending from 610 """ def test_releasetool_release_tag(): expected = "v1.20.0" assert _parse_release_tag(RELEASE_PLEASE_OUTPUT) == expected def test_kokoro_job_name(): job_name = kokoro_job_name("upstream-owner/upstream-repo", "some-package-name") assert job_name == "cloud-devrel/client-libraries/java/upstream-repo/release/stage" def test_package_name(): name = package_name({"head": {"ref": "release-storage-v1.2.3"}}) assert name is None
Though the Federal government provides some information concerning your Federal retirement benefits, it can be hard to understand and doesn’t offer advice specific to your situation. FEDweek’s reader-friendly Handbooks compile the latest critical information on your FERS and CSRS plans to help you take full advantage of your benefits and avoid making costly mistakes. Through our exclusive partnership with FEDweek, F.F.E.B.A. can provide you with one of five FEDweek Handbooks at no charge. When you consider these publications are sold on the FEDweek website for $12.95 to $13.95, that’s quite a savings. This website provides an overview of the Educational Seminars and Resources F.F.E.B.A. can provide to Federal Agencies seeking to help their employees stay current with the ever-changing rules and requirements of their FERS and CSRS retirement benefits. For complete information, call F.F.E.B.A. at 727-491-7031. At F.F.E.B.A. we consider it our mission to help educate Federal Employees on their FERS and CSRS benefits so they can plan the financially balanced retirement they’ve worked long and hard to earn. As a resource for the latest information on the FERS and CSRS benefit plans, the educational material we provide is designed to help them make strategic retirement-planning decisions specific to their financial situation and needs. F.F.E.B.A. is not affiliated with or endorsed by the Federal Government. Revised and Adopted: July 22, 2016. When you log onto this Website, or submit a benefit analysis request to FFEBA, we collect basic information such as your name, contact information, preferences and date of birth. We collect information about your participation in FFEBA activities. FFEBA also collects information on our Website. We collect both information that identifies you as a particular individual (“personally identifiable information”) and anonymous information that is not associated with a specific individual (“nonpersonally identifiable information”). When you visit our Website, some information may be collected automatically as part of the site’s operation. We also collect information we receive from you during online registration and when you complete other forms. In the following sections, we explain in more detail the types of information we collect online, including unique identifiers when using a mobile device (such as mobile device identification numbers) and other information that may help us to provide customized content or advertising. We may allow third-party analytics companies, research companies or ad networks to collect nonpersonally identifiable information on our website. These companies may use tracking technologies, including cookies and Web beacons, to collect information about users of our site in order to analyze, report on or customize advertising on our site or on other sites. If you stay logged in to your social media accounts, log in to our website using your social media account, use a “like” button or use other social media features while visiting our website, those social media companies may collect information about you. Your interactions with social media companies and the use of their features are governed by the privacy policies of the social media companies who provide those features. Social Media Advertising Networks. We may share information about you with social media networks to allow for the delivery of customized advertising about products and services that you may be interested in.
We provide this information to the social media network so they can deliver the ad to the right social media user. The delivery of customized social media ads, and your ability to opt out of receiving those ads, is governed by the privacy policies of the social media companies who deliver customized ads to you. Licensed Providers. Licensed providers are third parties who make available special FFEBA-branded discounts, products or services. Licensed providers are required to keep FFEBA user or member information confidential, and to use the information only to offer the contracted products or services to FFEBA and FFEBA users or members. Some of our licensed providers may also collect data about our members through their interactions with the members. The licensed providers may share some or all of this data with FFEBA so that we may make available more effective and personalized service to our members. We may share information we collect, including personally identifiable information, with licensed providers, even if you are not a user or member of FFEBA. Corporate Affiliates. We may share information, including your personally identifiable information, with wholly or majority owned affiliates or other affiliated organizations, so they can provide you with information about services and programs that might interest you. Approved Vendors. We may share your information, including your personally identifiable information, with companies we hire to provide certain services such as sending mailings, improving advertising services, providing member benefits and managing databases or other technology. Selected Organizations. We may occasionally engage in “list exchanges” with selected organizations in which we may share your information, including your personally identifiable information. Aggregate Data. We may share aggregate statistics and other nonpersonally identifiable information with the media, government agencies, advertisers and other third parties. These aggregate statistics will not allow anyone to identify your name or other personally identifiable information. Pursuant to Legal Requirements. FFEBA will disclose your personally identifiable information in response to a subpoena or similar investigative demand, a court order, or a request for cooperation from a law enforcement agency or other government agency; to establish or exercise our legal rights; to defend against legal claims; or as otherwise required by law. To Protect Our Website and Users. We will disclose your personally identifiable information when we reasonably believe disclosure is necessary to investigate, prevent or take action regarding illegal activity, suspected fraud or other wrongdoing; to protect and defend the rights, property or safety of our company, our employees, our website users or others; or to enforce our website terms and conditions or other agreements or policies. In Connection With Business Transfers. We may share your personally identifiable information as required by law, such as in response to a subpoena or court order, or in the unlikely event of a substantial corporate transaction, such as the sale of FFEBA or our affiliates, a merger, consolidation, asset sale or bankruptcy. FFEBA uses commercially reasonable physical, procedural and technological security measures to help prevent unauthorized access to and improper use of the information we collect about you.
For example, only authorized employees and authorized third parties are permitted to access personally identifiable information, and they may do so only for lawful purposes. When we share aggregate data with third parties, we prohibit them from re-engineering nonpersonal data to permit identification of individual users. We maintain physical, electronic and procedural safeguards that comply with federal standards to guard your nonpublic personal information. However, no website or Internet network can be completely secure. Although we take steps to secure your information, we do not guarantee, and you should not expect, that your personal information, searches or other communications will always remain secure. Please refer to the U.S. Federal Trade Commission’s website for information on how to protect yourself from identity theft. If you choose to share contact information with us, you will have the opportunity to update or delete that information or to shut down your account by sending an email to: [email protected]. You may request to stop receiving communications from us by sending an email to: [email protected]. Please note that if you opt out of receiving communications from us, you may continue to receive material from Service Providers and/or other parties to whom we provided your information before processing your request to opt out of communications. You will need to directly contact those third parties in order to opt out of receiving such communications. You also may opt out of receiving such emails by clicking on the “unsubscribe” link within the text of those third-party emails. Even if you opt out of the sharing of information for the purpose of customizing ads to your interests, FFEBA may continue to collect and use such information for other purposes. THE WEB PAGE CONTENT ON OR AVAILABLE THROUGH THIS WEB SITE IS PROVIDED “AS IS” AND WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR IMPLIED. TO THE FULLEST EXTENT PERMISSIBLE UNDER APPLICABLE LAW, FFEBA DISCLAIMS ALL WARRANTIES, EXPRESS OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, ALL IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, TITLE AND NON-INFRINGEMENT. FFEBA MAKES NO REPRESENTATION OR WARRANTY REGARDING THE WEB PAGE CONTENT OR ITS USE THEREOF. THE WEB PAGE CONTENT ON OR AVAILABLE THROUGH THIS WEB SITE COULD INCLUDE INACCURACIES OR TYPOGRAPHICAL ERRORS AND COULD BECOME INACCURATE BECAUSE OF DEVELOPMENTS OCCURRING AFTER THEIR RESPECTIVE DATES OF PREPARATION OR PUBLICATION. FFEBA HAS NO OBLIGATION TO MAINTAIN THE CURRENCY OR ACCURACY OF ANY WEB PAGE CONTENT ON OR AVAILABLE THROUGH THIS WEB SITE. YOU ACKNOWLEDGE AND AGREE THAT FFEBA IS NOT, AND SHALL NOT BE, RESPONSIBLE FOR THE RESULTS OF ANY DEFECTS THAT MAY EXIST IN THIS WEB SITE OR ITS OPERATION. AS TO THE OPERATION OF THIS WEB SITE, FFEBA EXPRESSLY DISCLAIMS ALL WARRANTIES OF ANY KIND, WHETHER EXPRESS OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. FFEBA MAKES NO REPRESENTATION OR WARRANTY THAT (A) THE OPERATION OF THIS WEB SITE WILL MEET YOUR OR ANY OTHER USER’S REQUIREMENTS; (B) ACCESS TO THE WEB SITE WILL BE UNINTERRUPTED, TIMELY, SECURE, OR FREE OF ERRORS, VIRUSES OR OTHER HARMFUL COMPONENTS; OR (C) ANY DEFECTS IN THIS WEB SITE WILL BE CORRECTED. YOU AGREE THAT YOU, AND NOT FFEBA, WILL BEAR THE ENTIRE COST OF ALL SERVICING, REPAIR, CORRECTION OR RESTORATION THAT MAY BE NECESSARY FOR YOUR DATA OR SOFTWARE.
YOU AGREE THAT UNDER NO CIRCUMSTANCES WILL FFEBA BE LIABLE TO YOU OR ANY OTHER PERSON OR ENTITY FOR ANY DAMAGES OR INJURY, INCLUDING ANY DIRECT, SPECIAL, INCIDENTAL, CONSEQUENTIAL OR PUNITIVE DAMAGES OR ANY DAMAGES OR INJURY CAUSED BY ERROR, INACCURACY, OMISSION, INTERRUPTION, DEFECT, FAILURE OF PERFORMANCE, DELAY IN OPERATION OR TRANSMISSION, TELECOMMUNICATIONS FAILURE OR COMPUTER VIRUS OR OTHER PROBLEM, THAT MAY RESULT FROM THE USE OF, OR THE INABILITY TO USE, THIS WEB SITE OR THE WEB PAGE CONTENT ON OR AVAILABLE THROUGH THIS WEB SITE, WHETHER IN AN ACTION ALLEGING BREACH OF CONTRACT, NEGLIGENCE OR ANY OTHER CAUSE OF ACTION, OR ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF ANY WEB PAGE CONTENT ON OR AVAILABLE THROUGH THIS WEB SITE. YOU AGREE THAT FFEBA SHALL NOT BE LIABLE EVEN IF WE OR OUR AUTHORIZED REPRESENTATIVES HAVE BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. APPLICABLE LAW MAY NOT ALLOW THE EXCLUSION OF CERTAIN WARRANTIES OR THE LIMITATION OR EXCLUSION OF LIABILITY FOR INCIDENTAL OR CONSEQUENTIAL DAMAGES. ACCORDINGLY, SOME OF THE ABOVE LIMITATIONS OR EXCLUSIONS MAY NOT APPLY TO YOU. HOWEVER, IN NO EVENT SHALL FFEBA’S TOTAL LIABILITY TO YOU FOR DAMAGES, LOSSES, AND CAUSES OF ACTION (WHETHER IN CONTRACT, TORT OR OTHERWISE) EXCEED THE AMOUNT PAID BY YOU, IF ANY, IN ACCESSING OR USING THIS WEB SITE OR THE WEB PAGE CONTENT THEREON. Any recommendations or analysis performed or provided on the Web site are not intended to represent financial advice or guarantee future results. Prior to making any investment or financing decisions, please consult a qualified attorney or financial advisor. Your California Privacy Rights: A Notice to California Users (California Civil Code Section 1798.83). Under the California “Shine The Light” law, California residents may opt out of FFEBA’s disclosure of personal information to third parties for their direct marketing purposes. Any California resident may choose to opt out of the sharing of such personal information with third parties for marketing purposes at any time by sending an email to: [email protected]. For the avoidance of doubt, it is important to understand that this opt-out shall not prohibit any disclosures made for non-marketing purposes. The site is directed to a general audience and not directed to those under the age of 18. FFEBA asks that minors not submit any personal information. FFEBA does not knowingly collect personal information from individuals under 18 years of age, and if we obtain actual knowledge that a user is under 18 years of age, we will take steps to remove that user’s personal information from our systems. The Online Services are operated in the United States and are intended for users located in the United States. If you are located anywhere outside of the United States, please be aware that information we collect, including personal information, will be transferred to, processed and stored in the United States. The data protection laws in the United States may differ from those of the country in which you are located, and your personal information may be subject to access requests from governments, courts, or law enforcement in the United States according to the laws of the United States. By using the Online Services or providing us with any information, you consent to the transfer, processing and storage of your information in the United States. You are also consenting to the application of United States federal and Florida state law in all matters concerning the Online Services and this Policy.
The laws of the State of Florida shall govern any dispute, including those disputes arising from FFEBA’s use of personal information or otherwise relating to privacy. This Policy does not create rights enforceable by third parties. By inserting your information, you give FFEBA consent to contact you through telephonic, electronic or printed forms of communication. You may always opt out of being contacted by sending a written request to [email protected]. FFEBA is registered with the Federal Government’s SAM, the primary database of vendors doing business with the Federal Government. Federation of Federal Employee Benefit Advocates (“FFEBA”) is not affiliated with, endorsed or sponsored by the Federal Government or any U.S. Government Agency. FFEBA is not a broker-dealer, investment advisory firm, insurance company or agency, and does not provide investment or insurance-related advice or recommendations. FFEBA Network Representatives, a network comprising attorneys, CPAs and financial professionals, are independent practitioners and are not employed by FFEBA. A FFEBA Network Representative is available upon request. How much is your annual federal income (in thousands)? Answer: Your salary is used to determine your federal annuity for retirement, as well as the premium cost of FEGLI. What is your SCD/Hire Date? Answer: If you did not start in a career position, your service date will be the date that you began your career service. This date is used to determine your eligibility for retirement. Answer: In order to generate your benefit analysis, we need to know the approximate date that you expect to retire. How much do you estimate your TSP balance to be? Answer: Your TSP balance affects the growth of your TSP dollar amount. Your balance may increase or decrease depending on your fund selections. What percentage or amount of your income are you putting in the TSP? Answer: The government can contribute up to 5% of your salary to the TSP each pay period, as follows: a 1% agency automatic contribution, paid whether or not you are contributing to the TSP; a dollar-for-dollar match on the first 3% of your salary that you contribute each pay period; and 50 cents on the dollar for the next 2% you contribute. Answer: Your date of birth is used to determine your eligibility for retirement, and also affects the premium cost of FEGLI. Answer: If you did not start in a career position, your service date will be the date that you began your career service. This date is used to determine your eligibility for retirement. Answer: Postal Employees do not pay the same premiums for life insurance or medical insurance. Do you have Option A? Do you have Option B? Answer: Option B is additional coverage that will increase your death benefit by (1) one, (2) two, (3) three, (4) four, or (5) five times your basic annual pay. In choosing this election, your premium cost will increase. Did you choose Option C for your spouse? Answer: You may choose Option C to provide your spouse with life insurance, in (1) one, (2) two, (3) three, (4) four or (5) five multiples of coverage. These options provide different coverage amounts for the death benefit. All eligible family members are automatically covered. Did you choose Option C for a dependent? Answer: You may choose Option C to provide your dependent(s) with life insurance, in (1) one, (2) two, (3) three, (4) four or (5) five multiples of coverage. These options provide different coverage amounts for the death benefit. All eligible family members are automatically covered. About how many unused sick leave hours do you have?
Answer: Your unused sick time affects the dollar amount of your federal annuity upon retirement. Any unused sick time remaining on the day you retire is added as service time in the federal annuity calculation.
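Since the TSP answer above describes the agency contribution schedule, here is a small illustrative calculation. A sketch only, assuming the standard FERS matching rules summarized above (1% automatic, dollar-for-dollar on the first 3%, 50 cents on the dollar on the next 2%):

def agency_tsp_contribution(salary, employee_pct):
    """Approximate annual government TSP contribution for a FERS employee."""
    automatic = 0.01                           # 1% automatic, paid regardless of participation
    match = min(employee_pct, 0.03)            # dollar-for-dollar on the first 3%
    match += 0.5 * max(0.0, min(employee_pct, 0.05) - 0.03)  # 50 cents per dollar on the next 2%
    return salary * (automatic + match)

# An employee earning $80,000 who contributes at least 5% gets the full 5% agency amount:
print(agency_tsp_contribution(80000, 0.05))  # 4000.0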
from __future__ import print_function

import getpass
import inspect
import os
import sys
import textwrap

from oslo_utils import encodeutils
from oslo_utils import strutils
import prettytable
import six
from six import moves

from ..common._i18n import _


class MissingArgs(Exception):
    """Supplied arguments are not sufficient for calling a function."""
    def __init__(self, missing):
        self.missing = missing
        msg = _("Missing arguments: %s") % ", ".join(missing)
        super(MissingArgs, self).__init__(msg)


def validate_args(fn, *args, **kwargs):
    """Check that the supplied args are sufficient for calling a function.

    >>> validate_args(lambda a: None)
    Traceback (most recent call last):
        ...
    MissingArgs: Missing arguments: a
    >>> validate_args(lambda a, b, c, d: None, 0, c=1)
    Traceback (most recent call last):
        ...
    MissingArgs: Missing arguments: b, d

    :param fn: the function to check
    :param args: the positional arguments supplied
    :param kwargs: the keyword arguments supplied
    """
    argspec = inspect.getargspec(fn)

    num_defaults = len(argspec.defaults or [])
    required_args = argspec.args[:len(argspec.args) - num_defaults]

    def isbound(method):
        return getattr(method, '__self__', None) is not None

    if isbound(fn):
        required_args.pop(0)

    missing = [arg for arg in required_args if arg not in kwargs]
    missing = missing[len(args):]
    if missing:
        raise MissingArgs(missing)


def arg(*args, **kwargs):
    """Decorator for CLI args.

    Example:

    >>> @arg("name", help="Name of the new entity")
    ... def entity_create(args):
    ...     pass
    """
    def _decorator(func):
        add_arg(func, *args, **kwargs)
        return func
    return _decorator


def env(*args, **kwargs):
    """Returns the first environment variable set.

    If all are empty, defaults to '' or keyword arg `default`.
    """
    for arg in args:
        value = os.environ.get(arg)
        if value:
            return value
    return kwargs.get('default', '')


def add_arg(func, *args, **kwargs):
    """Bind CLI arguments to a shell.py `do_foo` function."""
    if not hasattr(func, 'arguments'):
        func.arguments = []

    # NOTE(sirp): avoid dups that can occur when the module is shared across
    # tests.
    if (args, kwargs) not in func.arguments:
        # Because of the semantics of decorator composition if we just append
        # to the options list positional options will appear to be backwards.
        func.arguments.insert(0, (args, kwargs))


def unauthenticated(func):
    """Adds 'unauthenticated' attribute to decorated function.

    Usage:

    >>> @unauthenticated
    ... def mymethod(f):
    ...     pass
    """
    func.unauthenticated = True
    return func


def isunauthenticated(func):
    """Checks if the function does not require authentication.

    Mark such functions with the `@unauthenticated` decorator.

    :returns: bool
    """
    return getattr(func, 'unauthenticated', False)


def print_list(objs, fields, formatters=None, sortby_index=0,
               mixed_case_fields=None, field_labels=None):
    """Print a list of objects as a table, one row per object.

    :param objs: iterable of :class:`Resource`
    :param fields: attributes that correspond to columns, in order
    :param formatters: `dict` of callables for field formatting
    :param sortby_index: index of the field for sorting table rows
    :param mixed_case_fields: fields corresponding to object attributes that
        have mixed case names (e.g., 'serverId')
    :param field_labels: Labels to use in the heading of the table, default to
        fields.
    """
    formatters = formatters or {}
    mixed_case_fields = mixed_case_fields or []
    field_labels = field_labels or fields
    if len(field_labels) != len(fields):
        raise ValueError(_("Field labels list %(labels)s has different number "
                           "of elements than fields list %(fields)s")
                         % {'labels': field_labels, 'fields': fields})

    if sortby_index is None:
        kwargs = {}
    else:
        kwargs = {'sortby': field_labels[sortby_index]}
    pt = prettytable.PrettyTable(field_labels)
    pt.align = 'l'

    for o in objs:
        row = []
        for field in fields:
            if field in formatters:
                row.append(formatters[field](o))
            else:
                if field in mixed_case_fields:
                    field_name = field.replace(' ', '_')
                else:
                    field_name = field.lower().replace(' ', '_')
                data = getattr(o, field_name, '')
                row.append(data)
        pt.add_row(row)

    print(encodeutils.safe_encode(pt.get_string(**kwargs)))


def print_dict(dct, dict_property="Property", wrap=0):
    """Print a `dict` as a table of two columns.

    :param dct: `dict` to print
    :param dict_property: name of the first column
    :param wrap: wrapping for the second column
    """
    pt = prettytable.PrettyTable([dict_property, 'Value'])
    pt.align = 'l'
    for k, v in six.iteritems(dct):
        # convert dict to str to check length
        if isinstance(v, dict):
            v = six.text_type(v)
        if wrap > 0:
            v = textwrap.fill(six.text_type(v), wrap)
        # if value has a newline, add in multiple rows
        # e.g. fault with stacktrace
        if v and isinstance(v, six.string_types) and r'\n' in v:
            lines = v.strip().split(r'\n')
            col1 = k
            for line in lines:
                pt.add_row([col1, line])
                col1 = ''
        else:
            pt.add_row([k, v])
    print(encodeutils.safe_encode(pt.get_string()))


def get_password(max_password_prompts=3):
    """Read password from TTY."""
    verify = strutils.bool_from_string(env("OS_VERIFY_PASSWORD"))
    pw = None
    if hasattr(sys.stdin, "isatty") and sys.stdin.isatty():
        # Check for Ctrl-D
        try:
            for __ in moves.range(max_password_prompts):
                pw1 = getpass.getpass("OS Password: ")
                if verify:
                    pw2 = getpass.getpass("Please verify: ")
                else:
                    pw2 = pw1
                if pw1 == pw2 and pw1:
                    pw = pw1
                    break
        except EOFError:
            pass
    return pw


def service_type(stype):
    """Adds 'service_type' attribute to decorated function.

    Usage:

    .. code-block:: python

       @service_type('volume')
       def mymethod(f):
           ...
    """
    def inner(f):
        f.service_type = stype
        return f
    return inner


def get_service_type(f):
    """Retrieves service type from function."""
    return getattr(f, 'service_type', None)


def pretty_choice_list(l):
    return ', '.join("'%s'" % i for i in l)


def exit(msg=''):
    if msg:
        print(msg, file=sys.stderr)
    sys.exit(1)
The original Serro SCOTTY factory, circa 1960, which was located near the Irwin exit of the Pennsylvania Turnpike. The idea for SCOTTY trailers was born on the Fourth of July in 1956, on the back of a calendar. John Serro, then a 55-year-old retired auto dealership owner, was on a weekend trip when bad weather prevented him from enjoying outdoor activities. Instead, he had a vision and immediately sketched it out: a bed, sink, two-burner stove and dinette, all tucked into a 13 ft. long trailer that would fit in an average garage. When he returned, he built his first trailer, and it was an immediate hit with consumers. Partnering with his son-in-law Joe Pirschl, he opened a factory in Irwin, PA, in the Pittsburgh region, and started a business that would grow to be a household name. Some Serro SCOTTY travel trailer memorabilia featuring the Scotty dog mascot, still in use by Mobile Concepts. Joe Pirschl’s daughter, Anne Degre, joined the family business in 1990 and discovered a niche for what is now known as the SCOTTY Fire Safety House. Fire departments and other groups of first responders needed the company’s expertise in building lightweight trailers so they could take fire prevention and safety education directly to their communities. The fire that destroyed the Serro SCOTTY factory in 1997. On April 17, 1997, the company’s 45,000 square-foot factory in Irwin burned to the ground. Determined to endure, employees and owners were back at work in a temporary location 25 days later. The company’s name was changed to Mobile Concepts by Scotty in 2001 to reflect a new focus on engineering, designing and manufacturing specialty trailers. The company has grown to specialize in task-specific vehicles and trailers for everything from fire departments to the U.S. Department of Defense. Anne Degre, CEO of Mobile Concepts, whose grandfather and father started the Serro SCOTTY business, with her family. In late 2014, the “by Scotty” was replaced with “Specialty Vehicles” to better portray the company’s expertise. For more than six decades, the Scotty brand has been, and will continue to be, a trusted name. The Scotty dog emblem is still used in the logo and still stands for family, value and integrity.
# -*- coding: utf-8 -*-
from __future__ import unicode_literals

from django.db import models, migrations
from django.conf import settings


class Migration(migrations.Migration):

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('feed', '0004_auto_20160418_0027'),
    ]

    operations = [
        migrations.CreateModel(
            name='FeedSubscription',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('network', models.CharField(max_length=50)),
                ('room', models.CharField(max_length=50)),
                ('periodic', models.CharField(default=b'', max_length=15, blank=True)),
                ('active', models.BooleanField(default=True)),
                ('last_read', models.DateTimeField()),
                ('feed', models.ForeignKey(to='feed.Feed')),
                ('user_added', models.ForeignKey(to=settings.AUTH_USER_MODEL)),
            ],
            options={
            },
            bases=(models.Model,),
        ),
        migrations.RemoveField(
            model_name='cache',
            name='actioned',
        ),
        migrations.RemoveField(
            model_name='cache',
            name='network',
        ),
        migrations.RemoveField(
            model_name='cache',
            name='room',
        ),
        migrations.RemoveField(
            model_name='feed',
            name='network',
        ),
        migrations.RemoveField(
            model_name='feed',
            name='periodic',
        ),
        migrations.RemoveField(
            model_name='feed',
            name='room',
        ),
        migrations.RemoveField(
            model_name='feed',
            name='user_added',
        ),
    ]
The Sales Executive is the key point of contact between Dream Hotels & Resorts properties and the allocated client base: they need to be able to answer queries, offer advice, provide solutions and introduce new products. The work includes full account management: scheduling sales calls and demonstrating and presenting products, with the key mandate of revenue generation for all Dream Hotels & Resorts properties within a prescribed area and market segments: government, local government and corporates within the region; niche markets such as mountain biking and schools within the region; and the promotion of weddings within the area and community. There will also be a large element of cross-selling other properties within the region. Requirements: ability to prioritize workload and juggle multiple tasks; ability to analyse data and present it in a meaningful way.

We at 1fourall Recruitment are looking for one salesman/saleswoman with their own transport to join our business in Durban. Duties & requirements: 3 years' relevant experience, directly interfacing with clients and providing general administrative support services; typing; computer literacy (advanced MS Office, contract creation & procurement systems).

Contact Taylor-Made Recruitment. Contact hours: 9am – 4pm. Fax us your CV: 086 665 2050. We do not charge candidate fees. We will keep your information on our database for future vacancies.

Education: Relevant degree/diploma. Experience: 3+ years. Reference: RDS.
Job description: Do you find yourself talking to random strangers about the most random things? Can you influence people without having to die trying? Let's have a chat: send your CV and get the job that will have you meeting some really interesting people.
Essential skills and requirements:
• Managing communication between the client and the design/development teams
• Assisting with the creation of client proposals and cost estimates as part of the sales pitching process
• Responsibility for multiple clients, retention and care
• Deployment & management of internal team tasks & jobs
• Proactively up-selling additional digital services to your clients
• Ensuring all client jobs are in place, client invoices are prepared and pricing is captured correctly
• Communicating client feedback in a positive manner with fellow employees & ensuring tasks are completed on time
• Microsoft Office skills (Excel, Word, PowerPoint) (expert), basic internet technologies (HTML, CSS) and Microsoft Project
Email your CV to [email protected]. You can also contact any of the consultants Reana, Caren, Liezl, Lisa, Jessica or Christie on 021 555 0952, or alternatively visit our website at http://www.goldmantech.co.za. Correspondence will only be conducted with shortlisted candidates. Should you not hear from us within 3 days, please consider your application unsuccessful.

Education: Bachelor of Science degree. Experience: 3 years. Reference: LF.
Job description: My client in Durban is looking for a talented PHP web developer. If you meet the requirements below, send us your CV today to avoid missing out on an opportunity of a lifetime!
Key requirements:
• Bachelor of Science degree in Computer Science or similar qualification
• Java certification
• Minimum 3 years' experience
Minimum requirements:
• Sublime Text or a similar IDE (expert)
• Apache (web servers)
• Chrome Inspector (expert)
• Version control – Git
• Basic HTML and CSS
• Ability to build and integrate APIs
• FTP and SSH
Permanent. Durban. Should you meet the requirements for this position, please email your CV to [email protected] or alternatively visit our website at www.goldmantech.co.za. You can also contact Liezl on 021 555 0952. Correspondence will only be conducted with shortlisted candidates. Should you not hear from us within 3 days, please consider your application unsuccessful.

Donate your eggs with South Africa's leading egg donor agency! An egg donor is a woman between the ages of 19 and 33 who donates a few of her eggs, which she does not need, to another woman who is unable to fall pregnant with her own eggs. Egg donation is a routine procedure which: welcomes women of all ethnicities; is legal, anonymous and confidential; requires no surgery or cutting; does not affect your own ability to have a baby; helps hundreds of women each year; and is performed by the leading fertility specialists around South Africa. Egg donors are generously compensated per donation for their time and effort. Contact us NOW for more information: email us at [email protected], call us on 021 439 8823, or SMS or WhatsApp 072 887 1495 to find out more.

Are you looking to change your career and put your skills to use? Are you looking to change your career and put your sales skills to use? Are you looking to work alongside energetic and positive people? Do you believe you get out of life what you put in? Do you believe it's vital to enjoy your work? Do you believe you should like the team you work with? Do you believe it's okay to look for the positive in a bad situation? Then this opportunity may be for you! We're looking for exciting, outgoing, honest people with a passion for sales, teaching, training and customer service to fill sales leader positions. Requirements: excellent communication skills; a great attitude and positive outlook; the desire to be recognised for your achievements; team-oriented and driven to achieve goals; able to work independently and self-generate; Matric or an equivalent qualification. SEND YOUR CV TODAY FOR CONSIDERATION! Our interview process involves a 20-minute first-round interview, which is a brilliant chance to find out more about the company. If you are suitable for this position, we will contact you to set up an interview.
#!/usr/bin/env python
# -*- coding: utf-8 -*-

"""
test_django-pin-auth
------------

Tests for `django-pin-auth` models module.
"""

import mock
import datetime

from django.test import TestCase
from django.apps import apps
from django.contrib.auth.models import User
from faker import Faker

from django_pin_auth import models
from django_pin_auth.read_policies import ReadPolicy

fake = Faker('ja_JP')  # anything that's UTF8 will do


class TokenCreate(TestCase):
    def setUp(self):
        self.user = User.objects.create_user(username=fake.email())
        self.token = models.SingleUseToken.objects.create(user=self.user)


class TestCreateDelete(TokenCreate):
    def test_create_token_value(self):
        """Should automatically create a 6 digit token."""
        assert self.token.token.__len__() == 6
        for character in self.token.token:
            try:
                int(character)
            except ValueError:
                raise AssertionError('Character "%s" is not a digit' % character)

    def test_read_gone(self):
        """After read, shouldn't be found by the manager, regardless of policy."""
        self.token.read()
        with self.assertRaises(models.SingleUseToken.DoesNotExist):
            models.SingleUseToken.objects.get(pk=self.token.pk)

    def test_read_full_delete(self):
        """After read, should be totally gone if policy is delete (default)."""
        self.token.read()
        with self.assertRaises(models.SingleUseToken.DoesNotExist):
            models.SingleUseToken.all_objects.get(pk=self.token.pk)

    def test_read_soft_delete(self):
        """After read, should be still there, just disabled, if policy is mark."""
        config = apps.get_app_config('django_pin_auth')
        config.read_policy = ReadPolicy.mark
        self.token.read()
        try:
            models.SingleUseToken.all_objects.get(pk=self.token.pk)
        except models.SingleUseToken.DoesNotExist:
            raise AssertionError('Token should still exist')
        config.read_policy = ReadPolicy.delete


class TestValidity(TokenCreate):
    @mock.patch('django_pin_auth.models.datetime')
    def test_valid_within_timerange(self, mock_dt):
        """Token is valid within the time provided."""
        config = apps.get_app_config('django_pin_auth')
        mock_dt.datetime.now = mock.Mock(
            return_value=datetime.datetime.now(datetime.timezone.utc)
            + config.pin_validity - datetime.timedelta(seconds=1))
        assert self.token.is_valid() is True

    @mock.patch('django_pin_auth.models.datetime')
    def test_invalid_after_timerange(self, mock_dt):
        """Token is invalid after the time provided."""
        config = apps.get_app_config('django_pin_auth')
        mock_dt.datetime.now = mock.Mock(
            return_value=datetime.datetime.now(datetime.timezone.utc)
            + config.pin_validity + datetime.timedelta(seconds=1))
        assert self.token.is_valid() is False

    @mock.patch('django_pin_auth.models.datetime')
    def test_always_valid(self, mock_dt):
        """Token is always valid if no time given."""
        config = apps.get_app_config('django_pin_auth')
        keep_value = config.pin_validity
        config.pin_validity = None
        mock_dt.datetime.now = mock.Mock(return_value=datetime.datetime(2713, 12, 25))
        assert self.token.is_valid() is True
        config.pin_validity = keep_value


class TestUserToken(TokenCreate):
    def setUp(self):
        super().setUp()
        # Copy the values and do it again
        self.user2 = self.user
        self.token2 = self.token
        super().setUp()

    def test_correct_user_token(self):
        """Should find token."""
        self.assertEqual(models.get_user_token(self.user, self.token.token), self.token)

    def test_incorrect_user(self):
        """Should not find token when the user is not the token's owner."""
        self.assertEqual(models.get_user_token(self.user2, self.token.token), None)

    def test_incorrect_token(self):
        """Should not find token when the token is not the user's.

        Well, which is incorrect is relative..."""
        self.assertEqual(models.get_user_token(self.user, self.token2.token), None)
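A minimal sketch of the PIN flow these tests exercise, inferred from the test code above rather than from django-pin-auth documentation (the username is illustrative, and the code assumes a configured Django project):

from django.contrib.auth.models import User
from django_pin_auth import models

user = User.objects.create_user(username='[email protected]')
token = models.SingleUseToken.objects.create(user=user)  # 6-digit PIN
pin = token.token                                        # e.g. '482913'

found = models.get_user_token(user, pin)  # look the PIN up for this user
if found is not None and found.is_valid():
    found.read()  # consume it; later lookups fail per the app's read policy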
Ready to grow your digital revenues? Whether you're a multinational telecoms company or selling cupcakes from your kitchen, Sharpish Insights is a London-based digital consultancy designed to make your online customer touch points as effective and profitable as possible. Social Media - Engaging your audiences on Twitter, Facebook, etc.
# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
from typing import Any, AsyncIterable, Callable, Dict, Generic, List, Optional, TypeVar
import warnings

from azure.core.async_paging import AsyncItemPaged, AsyncList
from azure.core.exceptions import HttpResponseError, ResourceExistsError, ResourceNotFoundError, map_error
from azure.core.pipeline import PipelineResponse
from azure.core.pipeline.transport import AsyncHttpResponse, HttpRequest

from ... import models

T = TypeVar('T')
ClsType = Optional[Callable[[PipelineResponse[HttpRequest, AsyncHttpResponse], T, Dict[str, Any]], Any]]


class AccessControlOperations:
    """AccessControlOperations async operations.

    You should not instantiate this class directly. Instead, you should create a Client instance
    that instantiates it for you and attaches it as an attribute.

    :ivar models: Alias to model classes used in this operation group.
    :type models: ~azure.synapse.accesscontrol.models
    :param client: Client for service requests.
    :param config: Configuration of service client.
    :param serializer: An object model serializer.
    :param deserializer: An object model deserializer.
    """

    models = models

    def __init__(self, client, config, serializer, deserializer) -> None:
        self._client = client
        self._serialize = serializer
        self._deserialize = deserializer
        self._config = config

    def get_role_definitions(
        self,
        **kwargs
    ) -> AsyncIterable["models.RolesListResponse"]:
        """List roles.

        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: An iterator like instance of either RolesListResponse or the result of cls(response)
        :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.synapse.accesscontrol.models.RolesListResponse]
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["models.RolesListResponse"]
        error_map = {404: ResourceNotFoundError, 409: ResourceExistsError}
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-02-01-preview"

        def prepare_request(next_link=None):
            # Construct headers
            header_parameters = {}  # type: Dict[str, Any]
            header_parameters['Accept'] = 'application/json'

            if not next_link:
                # Construct URL
                url = self.get_role_definitions.metadata['url']  # type: ignore
                path_format_arguments = {
                    'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
                }
                url = self._client.format_url(url, **path_format_arguments)
                # Construct parameters
                query_parameters = {}  # type: Dict[str, Any]
                query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

                request = self._client.get(url, query_parameters, header_parameters)
            else:
                url = next_link
                query_parameters = {}  # type: Dict[str, Any]
                path_format_arguments = {
                    'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
                }
                url = self._client.format_url(url, **path_format_arguments)
                request = self._client.get(url, query_parameters, header_parameters)
            return request

        async def extract_data(pipeline_response):
            deserialized = self._deserialize('RolesListResponse', pipeline_response)
            list_of_elem = deserialized.value
            if cls:
                list_of_elem = cls(list_of_elem)
            return deserialized.next_link or None, AsyncList(list_of_elem)

        async def get_next(next_link=None):
            request = prepare_request(next_link)

            pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
            response = pipeline_response.http_response

            if response.status_code not in [200]:
                error = self._deserialize(models.ErrorContract, response)
                map_error(status_code=response.status_code, response=response, error_map=error_map)
                raise HttpResponseError(response=response, model=error)

            return pipeline_response

        return AsyncItemPaged(
            get_next, extract_data
        )
    get_role_definitions.metadata = {'url': '/rbac/roles'}  # type: ignore

    async def get_role_definition_by_id(
        self,
        role_id: str,
        **kwargs
    ) -> "models.SynapseRole":
        """Get role by role Id.

        :param role_id: Synapse Built-In Role Id.
        :type role_id: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: SynapseRole, or the result of cls(response)
        :rtype: ~azure.synapse.accesscontrol.models.SynapseRole
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["models.SynapseRole"]
        error_map = {404: ResourceNotFoundError, 409: ResourceExistsError}
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-02-01-preview"

        # Construct URL
        url = self.get_role_definition_by_id.metadata['url']  # type: ignore
        path_format_arguments = {
            'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
            'roleId': self._serialize.url("role_id", role_id, 'str'),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]
        header_parameters['Accept'] = 'application/json'

        request = self._client.get(url, query_parameters, header_parameters)
        pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            error = self._deserialize(models.ErrorContract, response)
            raise HttpResponseError(response=response, model=error)

        deserialized = self._deserialize('SynapseRole', pipeline_response)

        if cls:
            return cls(pipeline_response, deserialized, {})

        return deserialized
    get_role_definition_by_id.metadata = {'url': '/rbac/roles/{roleId}'}  # type: ignore

    async def create_role_assignment(
        self,
        create_role_assignment_options: "models.RoleAssignmentOptions",
        **kwargs
    ) -> "models.RoleAssignmentDetails":
        """Create role assignment.

        :param create_role_assignment_options: Details of role id and object id.
        :type create_role_assignment_options: ~azure.synapse.accesscontrol.models.RoleAssignmentOptions
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: RoleAssignmentDetails, or the result of cls(response)
        :rtype: ~azure.synapse.accesscontrol.models.RoleAssignmentDetails
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["models.RoleAssignmentDetails"]
        error_map = {404: ResourceNotFoundError, 409: ResourceExistsError}
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-02-01-preview"
        content_type = kwargs.pop("content_type", "application/json")

        # Construct URL
        url = self.create_role_assignment.metadata['url']  # type: ignore
        path_format_arguments = {
            'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]
        header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
        header_parameters['Accept'] = 'application/json'

        body_content_kwargs = {}  # type: Dict[str, Any]
        body_content = self._serialize.body(create_role_assignment_options, 'RoleAssignmentOptions')
        body_content_kwargs['content'] = body_content
        request = self._client.post(url, query_parameters, header_parameters, **body_content_kwargs)
        pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            error = self._deserialize(models.ErrorContract, response)
            raise HttpResponseError(response=response, model=error)

        deserialized = self._deserialize('RoleAssignmentDetails', pipeline_response)

        if cls:
            return cls(pipeline_response, deserialized, {})

        return deserialized
    create_role_assignment.metadata = {'url': '/rbac/roleAssignments'}  # type: ignore

    async def get_role_assignments(
        self,
        role_id: Optional[str] = None,
        principal_id: Optional[str] = None,
        continuation_token_parameter: Optional[str] = None,
        **kwargs
    ) -> List["models.RoleAssignmentDetails"]:
        """List role assignments.

        :param role_id: Synapse Built-In Role Id.
        :type role_id: str
        :param principal_id: Object ID of the AAD principal or security-group.
        :type principal_id: str
        :param continuation_token_parameter: Continuation token.
        :type continuation_token_parameter: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: list of RoleAssignmentDetails, or the result of cls(response)
        :rtype: list[~azure.synapse.accesscontrol.models.RoleAssignmentDetails]
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType[List["models.RoleAssignmentDetails"]]
        error_map = {404: ResourceNotFoundError, 409: ResourceExistsError}
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-02-01-preview"

        # Construct URL
        url = self.get_role_assignments.metadata['url']  # type: ignore
        path_format_arguments = {
            'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
        if role_id is not None:
            query_parameters['roleId'] = self._serialize.query("role_id", role_id, 'str')
        if principal_id is not None:
            query_parameters['principalId'] = self._serialize.query("principal_id", principal_id, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]
        if continuation_token_parameter is not None:
            header_parameters['x-ms-continuation'] = self._serialize.header("continuation_token_parameter", continuation_token_parameter, 'str')
        header_parameters['Accept'] = 'application/json'

        request = self._client.get(url, query_parameters, header_parameters)
        pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            error = self._deserialize(models.ErrorContract, response)
            raise HttpResponseError(response=response, model=error)

        response_headers = {}
        response_headers['x-ms-continuation'] = self._deserialize('str', response.headers.get('x-ms-continuation'))
        deserialized = self._deserialize('[RoleAssignmentDetails]', pipeline_response)

        if cls:
            return cls(pipeline_response, deserialized, response_headers)

        return deserialized
    get_role_assignments.metadata = {'url': '/rbac/roleAssignments'}  # type: ignore

    async def get_role_assignment_by_id(
        self,
        role_assignment_id: str,
        **kwargs
    ) -> "models.RoleAssignmentDetails":
        """Get role assignment by role assignment Id.

        :param role_assignment_id: The ID of the role assignment.
        :type role_assignment_id: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: RoleAssignmentDetails, or the result of cls(response)
        :rtype: ~azure.synapse.accesscontrol.models.RoleAssignmentDetails
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["models.RoleAssignmentDetails"]
        error_map = {404: ResourceNotFoundError, 409: ResourceExistsError}
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-02-01-preview"

        # Construct URL
        url = self.get_role_assignment_by_id.metadata['url']  # type: ignore
        path_format_arguments = {
            'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
            'roleAssignmentId': self._serialize.url("role_assignment_id", role_assignment_id, 'str', min_length=1),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]
        header_parameters['Accept'] = 'application/json'

        request = self._client.get(url, query_parameters, header_parameters)
        pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            error = self._deserialize(models.ErrorContract, response)
            raise HttpResponseError(response=response, model=error)

        deserialized = self._deserialize('RoleAssignmentDetails', pipeline_response)

        if cls:
            return cls(pipeline_response, deserialized, {})

        return deserialized
    get_role_assignment_by_id.metadata = {'url': '/rbac/roleAssignments/{roleAssignmentId}'}  # type: ignore

    async def delete_role_assignment_by_id(
        self,
        role_assignment_id: str,
        **kwargs
    ) -> None:
        """Delete role assignment by role assignment Id.

        :param role_assignment_id: The ID of the role assignment.
        :type role_assignment_id: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: None, or the result of cls(response)
        :rtype: None
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType[None]
        error_map = {404: ResourceNotFoundError, 409: ResourceExistsError}
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-02-01-preview"

        # Construct URL
        url = self.delete_role_assignment_by_id.metadata['url']  # type: ignore
        path_format_arguments = {
            'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
            'roleAssignmentId': self._serialize.url("role_assignment_id", role_assignment_id, 'str', min_length=1),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]

        request = self._client.delete(url, query_parameters, header_parameters)
        pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200, 204]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            error = self._deserialize(models.ErrorContract, response)
            raise HttpResponseError(response=response, model=error)

        if cls:
            return cls(pipeline_response, None, {})
    delete_role_assignment_by_id.metadata = {'url': '/rbac/roleAssignments/{roleAssignmentId}'}  # type: ignore

    async def get_caller_role_assignments(
        self,
        **kwargs
    ) -> List[str]:
        """List role assignments of the caller.

        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: list of str, or the result of cls(response)
        :rtype: list[str]
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType[List[str]]
        error_map = {404: ResourceNotFoundError, 409: ResourceExistsError}
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2020-02-01-preview"

        # Construct URL
        url = self.get_caller_role_assignments.metadata['url']  # type: ignore
        path_format_arguments = {
            'endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]
        header_parameters['Accept'] = 'application/json'

        request = self._client.post(url, query_parameters, header_parameters)
        pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            error = self._deserialize(models.ErrorContract, response)
            raise HttpResponseError(response=response, model=error)

        deserialized = self._deserialize('[str]', pipeline_response)

        if cls:
            return cls(pipeline_response, deserialized, {})

        return deserialized
    get_caller_role_assignments.metadata = {'url': '/rbac/getMyAssignedRoles'}  # type: ignore
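A minimal usage sketch for the operations class above. Generated SDKs like this one are normally reached through a service client; the `AccessControlClient` import path, the `access_control` attribute and the credential flow below follow common AutoRest conventions and are assumptions, not taken from this file:

import asyncio

from azure.identity.aio import DefaultAzureCredential
from azure.synapse.accesscontrol.aio import AccessControlClient  # assumed path

async def main():
    credential = DefaultAzureCredential()
    client = AccessControlClient(
        credential=credential,
        endpoint="https://<workspace>.dev.azuresynapse.net",  # placeholder
    )
    # Page through role definitions, then list the caller's own assignments.
    async for role in client.access_control.get_role_definitions():
        print(role)
    print(await client.access_control.get_caller_role_assignments())

asyncio.run(main())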
The current pattern across Europe is becoming quite dynamic, as a deep trough pushed into the unusually warm Mediterranean last night. The trough is transforming into an upper low today. This will be the focus for dangerous weather this week, with locally huge amounts of rainfall across the southern Mediterranean region and therefore an enhanced threat of damaging flash floods and landslides. Today’s pattern across Europe shows a strong upper ridge across the eastern Atlantic and western Europe, while a deep trough is pushed across Scandinavia into northeastern Europe. A pronounced upper low develops over the Mediterranean, with an associated surface cyclone / frontal system slowly moving south towards northern Africa. On Thursday, the ridge across western Europe expands into central Europe as a deep trough over eastern Europe ejects into western Russia. An upper low pushes over the southern Mediterranean. The overall pattern will be conducive to severe weather, especially extreme amounts of rainfall resulting from both convective and orographic precipitation. Various models (GFS, ARPEGE, ICON-EU, HRRR) suggest locally 150-300 mm or even more rainfall will be possible until Friday evening (a 72-hour period) across parts of southern Italy, including Sicily and Sardinia, as well as Malta, northern Algeria and Tunisia. Such amounts of rainfall are likely to produce flash floods and dangerous conditions along the complex terrain. The extreme rainfall will result from severe storms developing in a moderately sheared and unstable southern Mediterranean region. Here are 24-hour rainfall accumulations simulated by the www.youmeteo.com model. Stay alert for dangerous flash flood conditions and stay tuned for further updates!
import unittest

import vdebug.breakpoint
import vdebug.error
import vdebug.opts
import vdebug.util
import base64

try:
    from unittest.mock import Mock
except ImportError:
    from mock import Mock


class LineBreakpointTest(unittest.TestCase):

    def test_get_file(self):
        """ Test that the file path is retrievable."""
        ui = None
        file = "/path/to/file"
        line = 1
        bp = vdebug.breakpoint.LineBreakpoint(ui, file, line)
        self.assertEqual(bp.get_file(), file)

    def test_get_line(self):
        """ Test that the line number is retrievable."""
        ui = None
        file = "/path/to/file"
        line = 10
        bp = vdebug.breakpoint.LineBreakpoint(ui, file, line)
        self.assertEqual(bp.get_line(), line)

    def test_get_cmd(self):
        """ Test that the dbgp command is correct."""
        ui = None
        file = vdebug.util.FilePath("/path/to/file")
        line = 20
        bp = vdebug.breakpoint.LineBreakpoint(ui, file, line)
        self.assertEqual(bp.get_cmd(),
                         "-t line -f \"file://%s\" -n %i -s enabled" % (file, line))

    def test_on_add_sets_ui_breakpoint(self):
        """ Test that the breakpoint is placed on the source window."""
        ui = Mock()
        file = vdebug.util.FilePath("/path/to/file")
        line = 20
        bp = vdebug.breakpoint.LineBreakpoint(ui, file, line)
        bp.on_add()
        ui.register_breakpoint.assert_called_with(bp)

    def test_on_remove_deletes_ui_breakpoint(self):
        """ Test that the breakpoint is removed from the source window."""
        ui = Mock()
        file = vdebug.util.FilePath("/path/to/file")
        line = 20
        bp = vdebug.breakpoint.LineBreakpoint(ui, file, line)
        bp.on_remove()
        ui.remove_breakpoint.assert_called_with(bp)


class ConditionalBreakpointTest(unittest.TestCase):
    def setUp(self):
        vdebug.opts.Options.set({})

    def test_get_cmd(self):
        """ Test that the dbgp command is correct."""
        ui = None
        file = vdebug.util.FilePath("/path/to/file")
        line = 20
        condition = "$x > 20"
        bp = vdebug.breakpoint.ConditionalBreakpoint(ui, file, line, condition)
        b64cond = base64.encodebytes(condition.encode("UTF-8")).decode("UTF-8")
        exp_cmd = "-t conditional -f \"file://%s\" -n %i -s enabled -- %s" % (file, line, b64cond)
        self.assertEqual(bp.get_cmd(), exp_cmd)


class ExceptionBreakpointTest(unittest.TestCase):
    def test_get_cmd(self):
        """ Test that the dbgp command is correct."""
        ui = None
        exception = "ExampleException"
        bp = vdebug.breakpoint.ExceptionBreakpoint(ui, exception)
        exp_cmd = "-t exception -x %s -s enabled" % exception
        self.assertEqual(bp.get_cmd(), exp_cmd)


class CallBreakpointTest(unittest.TestCase):
    def test_get_cmd(self):
        """ Test that the dbgp command is correct."""
        ui = None
        function = "myfunction"
        bp = vdebug.breakpoint.CallBreakpoint(ui, function)
        exp_cmd = "-t call -m %s -s enabled" % function
        self.assertEqual(bp.get_cmd(), exp_cmd)


class ReturnBreakpointTest(unittest.TestCase):
    def test_get_cmd(self):
        """ Test that the dbgp command is correct."""
        ui = None
        function = "myfunction"
        bp = vdebug.breakpoint.ReturnBreakpoint(ui, function)
        exp_cmd = "-t return -m %s -s enabled" % function
        self.assertEqual(bp.get_cmd(), exp_cmd)


class BreakpointTest(unittest.TestCase):

    def test_id_is_unique(self):
        """Test that each vdebug.breakpoint has a unique ID.

        Consecutively generated breakpoints should have different IDs."""
        bp1 = vdebug.breakpoint.Breakpoint(None)
        bp2 = vdebug.breakpoint.Breakpoint(None)
        self.assertNotEqual(bp1.get_id(), bp2.get_id())

    def test_parse_with_line_breakpoint(self):
        """ Test that a LineBreakpoint is created."""
        Mock.__len__ = Mock(return_value=1)
        ui = Mock()
        ret = vdebug.breakpoint.Breakpoint.parse(ui, "")
        self.assertIsInstance(ret, vdebug.breakpoint.LineBreakpoint)

    def test_parse_with_empty_line_raises_error(self):
        """ Test that an error is raised for an empty line."""
        Mock.__len__ = Mock(return_value=0)
        ui = Mock()
        re = 'Cannot set a breakpoint on an empty line'
        self.assertRaisesRegex(vdebug.error.BreakpointError,
                               re, vdebug.breakpoint.Breakpoint.parse, ui, "")

    def test_parse_with_conditional_breakpoint(self):
        """ Test that a ConditionalBreakpoint is created."""
        ui = Mock()
        ret = vdebug.breakpoint.Breakpoint.parse(ui, "conditional $x == 3")
        self.assertIsInstance(ret, vdebug.breakpoint.ConditionalBreakpoint)
        self.assertEqual(ret.condition, "$x == 3")

    def test_parse_with_conditional_raises_error(self):
        """ Test that an exception is raised with invalid conditional args."""
        ui = Mock()
        args = "conditional"
        re = "Conditional breakpoints require a condition " +\
             "to be specified"
        self.assertRaisesRegex(vdebug.error.BreakpointError,
                               re, vdebug.breakpoint.Breakpoint.parse, ui, args)

    def test_parse_with_exception_breakpoint(self):
        """ Test that a ExceptionBreakpoint is created."""
        ui = Mock()
        ret = vdebug.breakpoint.Breakpoint.parse(ui, "exception ExampleException")
        self.assertIsInstance(ret, vdebug.breakpoint.ExceptionBreakpoint)
        self.assertEqual(ret.exception, "ExampleException")

    def test_parse_with_exception_raises_error(self):
        """ Test that an exception is raised with invalid exception args."""
        ui = Mock()
        args = "exception"
        re = "Exception breakpoints require an exception name " +\
             "to be specified"
        self.assertRaisesRegex(vdebug.error.BreakpointError,
                               re, vdebug.breakpoint.Breakpoint.parse, ui, args)

    def test_parse_with_call_breakpoint(self):
        """ Test that a CallBreakpoint is created."""
        ui = Mock()
        ret = vdebug.breakpoint.Breakpoint.parse(ui, "call myfunction")
        self.assertIsInstance(ret, vdebug.breakpoint.CallBreakpoint)
        self.assertEqual(ret.function, "myfunction")

    def test_parse_with_call_raises_error(self):
        """ Test that an exception is raised with invalid call args."""
        ui = Mock()
        args = "call"
        re = "Call breakpoints require a function name " +\
             "to be specified"
        self.assertRaisesRegex(vdebug.error.BreakpointError,
                               re, vdebug.breakpoint.Breakpoint.parse, ui, args)

    def test_parse_with_return_breakpoint(self):
        """ Test that a ReturnBreakpoint is created."""
        ui = Mock()
        ret = vdebug.breakpoint.Breakpoint.parse(ui, "return myfunction")
        self.assertIsInstance(ret, vdebug.breakpoint.ReturnBreakpoint)
        self.assertEqual(ret.function, "myfunction")

    def test_parse_with_return_raises_error(self):
        """ Test that an exception is raised with invalid return args."""
        ui = Mock()
        args = "return"
        re = "Return breakpoints require a function name " +\
             "to be specified"
        self.assertRaisesRegex(vdebug.error.BreakpointError,
                               re, vdebug.breakpoint.Breakpoint.parse, ui, args)
Hi! This post is for anyone who would like to help others reach their health goals by becoming a Certified Health Coach. * Do you see yourself as someone who has a passion for health & wellness? * Do you post health nuggets on social media to get the word out? * Do you believe there is a better, more natural (and Biblical) way to help people get well and stay well? The program was created by a mentor of mine. She's the Founder of PraiseMoves Fitness and the Certified Health Coach Institute, and she has written numerous books and articles about health & wellness. BE SURE TO LET THEM KNOW THAT YOU WERE REFERRED BY [ARLETIA MAYFIELD].
# -*- coding: utf-8; -*-
#
# This file is part of Superdesk.
#
# Copyright 2013, 2014 Sourcefabric z.u. and contributors.
#
# For the full copyright and license information, please see the
# AUTHORS and LICENSE files distributed with this source code, or
# at https://www.sourcefabric.org/superdesk/license

import logging
import json

from flask import request, current_app as app
from eve.utils import config
from eve.methods.common import serialize_value
from superdesk import privilege
from superdesk.notification import push_notification
from superdesk.resource import Resource
from superdesk.services import BaseService
from superdesk.users import get_user_from_request
from superdesk.utc import utcnow
from superdesk.errors import SuperdeskApiError

logger = logging.getLogger(__name__)

privilege(name="vocabularies", label="Vocabularies Management",
          description="User can manage vocabularies' contents.")


# TODO(petr): add api to specify vocabulary schema
vocab_schema = {
    'crop_sizes': {
        'width': {'type': 'integer'},
        'height': {'type': 'integer'},
    }
}


class VocabulariesResource(Resource):
    schema = {
        '_id': {
            'type': 'string',
            'required': True,
            'unique': True
        },
        'display_name': {
            'type': 'string',
            'required': True
        },
        'type': {
            'type': 'string',
            'required': True,
            'allowed': ['manageable', 'unmanageable']
        },
        'items': {
            'type': 'list',
            'required': True
        },
        'single_value': {
            'type': 'boolean',
        },
        'schema_field': {
            'type': 'string',
            'required': False,
            'nullable': True
        },
        'dependent': {
            'type': 'boolean',
        },
        'service': {
            'type': 'dict',
        },
        'priority': {
            'type': 'integer'
        },
        'unique_field': {
            'type': 'string',
            'required': False,
            'nullable': True
        }
    }

    item_url = r'regex("[\w]+")'
    item_methods = ['GET', 'PATCH']
    resource_methods = ['GET']
    privileges = {'PATCH': 'vocabularies', }


class VocabulariesService(BaseService):

    def on_replace(self, document, original):
        document[app.config['LAST_UPDATED']] = utcnow()
        document[app.config['DATE_CREATED']] = original[app.config['DATE_CREATED']] if original else utcnow()
        logger.info("updating vocabulary item: %s", document["_id"])

    def on_fetched(self, doc):
        """Overriding to filter out inactive vocabularies and pop the
        'is_active' property from the response. The property is kept when
        manageable vocabularies are requested.
        """
        if request and hasattr(request, 'args') and request.args.get('where'):
            where_clause = json.loads(request.args.get('where'))
            if where_clause.get('type') == 'manageable':
                return doc

        for item in doc[config.ITEMS]:
            self._filter_inactive_vocabularies(item)
            self._cast_items(item)

    def on_fetched_item(self, doc):
        """Overriding to filter out inactive vocabularies and pop the
        'is_active' property from the response.
        """
        self._filter_inactive_vocabularies(doc)
        self._cast_items(doc)

    def on_update(self, updates, original):
        """Checks for duplicates if a unique field is defined."""
        unique_field = original.get('unique_field')
        if unique_field:
            self._check_uniqueness(updates.get('items', []), unique_field)

    def on_updated(self, updates, original):
        """Overriding this to send a notification about the update."""
        self._send_notification(original)

    def on_replaced(self, document, original):
        """Overriding this to send a notification about the replacement."""
        self._send_notification(document)

    def _check_uniqueness(self, items, unique_field):
        """Checks uniqueness if a unique field is defined.

        :param items: list of items to check for uniqueness
        :param unique_field: name of the unique field
        """
        unique_values = []
        for item in items:
            # compare only the active items
            if not item.get('is_active'):
                continue

            if not item.get(unique_field):
                raise SuperdeskApiError.badRequestError("{} cannot be empty".format(unique_field))

            unique_value = str(item.get(unique_field)).upper()

            if unique_value in unique_values:
                raise SuperdeskApiError.badRequestError(
                    "Value {} for field {} is not unique".format(item.get(unique_field), unique_field))

            unique_values.append(unique_value)

    def _filter_inactive_vocabularies(self, item):
        vocs = item['items']
        active_vocs = ({k: voc[k] for k in voc.keys() if k != 'is_active'}
                       for voc in vocs if voc.get('is_active', True))

        item['items'] = list(active_vocs)

    def _cast_items(self, vocab):
        """Cast values in vocabulary items using the predefined schema.

        :param vocab
        """
        schema = vocab_schema.get(vocab.get('_id'), {})
        for item in vocab.get('items', []):
            for field, field_schema in schema.items():
                if field in item:
                    item[field] = serialize_value(field_schema['type'], item[field])

    def _send_notification(self, updated_vocabulary):
        """Sends a notification about the updated vocabulary to all connected clients."""
        user = get_user_from_request()
        push_notification('vocabularies:updated',
                          vocabulary=updated_vocabulary.get('display_name'),
                          user=str(user[config.ID_FIELD]) if user else None)

    def get_rightsinfo(self, item):
        rights_key = item.get('source', item.get('original_source', 'default'))
        all_rights = self.find_one(req=None, _id='rightsinfo')
        if not all_rights or not all_rights.get('items'):
            return {}
        try:
            default_rights = next(info for info in all_rights['items'] if info['name'] == 'default')
        except StopIteration:
            default_rights = None
        try:
            rights = next(info for info in all_rights['items'] if info['name'] == rights_key)
        except StopIteration:
            rights = default_rights
        if rights:
            return {
                'copyrightholder': rights.get('copyrightHolder'),
                'copyrightnotice': rights.get('copyrightNotice'),
                'usageterms': rights.get('usageTerms'),
            }
        else:
            return {}
Shipston on Stour and District Angling Club will be holding a fly fishing day on Saturday 9th June 2018. The day is aimed at an introduction to fly fishing on the River Stour. Qualified instructors will run the day, which will feature casting instruction, demonstrations of river fishing techniques, watercraft, entomology and fly selection. The Warwickshire Stour, while not known for its trout fishing, holds a good head of wild trout and has good fly life. It is by no means an easy river, but it would certainly be of interest to anyone looking for some river trouting in the Warwickshire and north Oxfordshire areas. The cost for the day is £70 and includes one year's membership of the Shipston on Stour and District Angling Club; the day is run on a 'non-profit' basis. Places are limited, so please contact me for more details.
from __future__ import annotations

from datetime import datetime, timedelta
from io import StringIO
from typing import Dict, Optional
import csv

from sqlalchemy.exc import IntegrityError

from flask import (
    Response,
    current_app,
    jsonify,
    make_response,
    render_template,
    request,
)

from baseframe import _
from coaster.auth import current_auth
from coaster.utils import getbool, make_name, midnight_to_utc, utcnow
from coaster.views import ClassView, render_with, requestargs, route

from .. import app
from ..models import ContactExchange, Project, TicketParticipant, db
from ..utils import abort_null, format_twitter_handle
from .login_session import requires_login


def contact_details(ticket_participant: TicketParticipant) -> Dict[str, Optional[str]]:
    return {
        'fullname': ticket_participant.fullname,
        'company': ticket_participant.company,
        'email': ticket_participant.email,
        'twitter': format_twitter_handle(ticket_participant.twitter),
        'phone': ticket_participant.phone,
    }


@route('/account/contacts')
class ContactView(ClassView):
    current_section = 'account'

    def get_project(self, uuid_b58):
        return (
            Project.query.filter_by(uuid_b58=uuid_b58)
            .options(db.load_only(Project.id, Project.uuid, Project.title))
            .one_or_404()
        )

    @route('', endpoint='contacts')
    @requires_login
    @render_with('contacts.html.jinja2')
    def contacts(self):
        """Return contacts grouped by project and date."""
        archived = getbool(request.args.get('archived'))
        return {
            'contacts': ContactExchange.grouped_counts_for(
                current_auth.user, archived=archived
            )
        }

    def contacts_to_csv(self, contacts, timezone, filename):
        """Return a CSV of given contacts."""
        outfile = StringIO(newline='')
        out = csv.writer(outfile)
        out.writerow(
            [
                'scanned_at',
                'fullname',
                'email',
                'phone',
                'twitter',
                'job_title',
                'company',
                'city',
            ]
        )
        for contact in contacts:
            proxy = contact.current_access()
            ticket_participant = proxy.ticket_participant
            out.writerow(
                [
                    proxy.scanned_at.astimezone(timezone)
                    .replace(second=0, microsecond=0, tzinfo=None)
                    .isoformat(),  # Strip precision from timestamp
                    ticket_participant.fullname,
                    ticket_participant.email,
                    ticket_participant.phone,
                    ticket_participant.twitter,
                    ticket_participant.job_title,
                    ticket_participant.company,
                    ticket_participant.city,
                ]
            )

        outfile.seek(0)
        return Response(
            outfile.getvalue(),
            content_type='text/csv',
            headers=[
                (
                    'Content-Disposition',
                    f'attachment;filename="{filename}.csv"',
                )
            ],
        )

    @route('<uuid_b58>/<datestr>.csv', endpoint='contacts_project_date_csv')
    @requires_login
    def project_date_csv(self, uuid_b58, datestr):
        """Return contacts for a given project and date in CSV format."""
        archived = getbool(request.args.get('archived'))
        project = self.get_project(uuid_b58)
        date = datetime.strptime(datestr, '%Y-%m-%d').date()

        contacts = ContactExchange.contacts_for_project_and_date(
            current_auth.user, project, date, archived
        )
        return self.contacts_to_csv(
            contacts,
            timezone=project.timezone,
            filename='contacts-{project}-{date}'.format(
                project=make_name(project.title), date=date.strftime('%Y%m%d')
            ),
        )

    @route('<uuid_b58>.csv', endpoint='contacts_project_csv')
    @requires_login
    def project_csv(self, uuid_b58):
        """Return contacts for a given project in CSV format."""
        archived = getbool(request.args.get('archived'))
        project = self.get_project(uuid_b58)

        contacts = ContactExchange.contacts_for_project(
            current_auth.user, project, archived
        )
        return self.contacts_to_csv(
            contacts,
            timezone=project.timezone,
            filename=f'contacts-{make_name(project.title)}',
        )

    @route('scan', endpoint='scan_contact')
    @requires_login
    def scan(self):
        """Scan a badge."""
        return render_template('scan_contact.html.jinja2')

    @route('scan/connect', endpoint='scan_connect', methods=['POST'])
    @requires_login
    @requestargs(('puk', abort_null), ('key', abort_null))
    def connect(self, puk, key):
        """Verify a badge scan and create a contact."""
        ticket_participant = TicketParticipant.query.filter_by(puk=puk, key=key).first()
        if ticket_participant is None:
            return make_response(
                jsonify(status='error', message="Attendee details not found"), 404
            )
        project = ticket_participant.project
        if project.end_at:
            if (
                midnight_to_utc(project.end_at + timedelta(days=1), project.timezone)
                < utcnow()
            ):
                return make_response(
                    jsonify(status='error', message=_("This project has concluded")),
                    403,
                )

            try:
                contact_exchange = ContactExchange(
                    user=current_auth.actor, ticket_participant=ticket_participant
                )
                db.session.add(contact_exchange)
                db.session.commit()
            except IntegrityError:
                current_app.logger.warning("Contact already scanned")
                db.session.rollback()
            return jsonify(contact=contact_details(ticket_participant))
        else:
            # FIXME: when status='error', the message should be in `error_description`.
            return make_response(
                jsonify(status='error', message=_("Unauthorized contact exchange")), 403
            )


ContactView.init_app(app)
Since 2010, every Saturday morning from October through May, we have been distributing a bagged lunch and a beverage to about 80 people in the parking field of the local Home Depot. Those we serve stand seeking work in the cold, in the heat, in the rain and in the snow. Everyone is welcome to join us every Friday evening at 7:30 P.M. as we prepare the sandwiches, and on Saturday morning at 7:30 A.M. as we distribute the food.
# Copyright 2015 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Django settings for stackviz project.

For more information on this file, see
https://docs.djangoproject.com/en/1.7/topics/settings/

For the full list of settings and their values, see
https://docs.djangoproject.com/en/1.7/ref/settings/
"""

# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
import os

BASE_DIR = os.path.dirname(os.path.dirname(__file__))

# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/1.7/howto/deployment/checklist/

# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = '*to^*vlhq&05jo0^kad)=kboy$8@&x9s6i23ukh*^%w_$=5bmh'

# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True

TEMPLATE_DEBUG = True

ALLOWED_HOSTS = []

# Application definition

INSTALLED_APPS = (
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
)

MIDDLEWARE_CLASSES = (
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.common.CommonMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.auth.middleware.SessionAuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
    'django.middleware.clickjacking.XFrameOptionsMiddleware',
)

ROOT_URLCONF = 'stackviz.urls'

WSGI_APPLICATION = 'stackviz.wsgi.application'

# Database
# https://docs.djangoproject.com/en/1.7/ref/settings/#databases
DATABASES = {}

# Internationalization
# https://docs.djangoproject.com/en/1.7/topics/i18n/
LANGUAGE_CODE = 'en-us'

TIME_ZONE = 'UTC'

USE_I18N = True

USE_L10N = True

USE_TZ = True

TEMPLATE_CONTEXT_PROCESSORS = (
    'stackviz.global_template_injector.inject_extra_context',
)

# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/1.7/howto/static-files/
STATIC_URL = '/static/'

STATICFILES_DIRS = [
    os.path.join(BASE_DIR, 'stackviz', 'static')
]

TEMPLATE_DIRS = [
    os.path.join(BASE_DIR, 'stackviz', 'templates')
]

# If True, read a stream from stdin (only valid for exported sites)
TEST_STREAM_STDIN = False

# A list of files containing directly-accessible subunit streams.
TEST_STREAMS = []

# A list of test repositories containing (potentially) multiple subunit
# streams.
TEST_REPOSITORIES = [
    os.path.join(BASE_DIR, 'test_data')
]

# The input dstat file
DSTAT_CSV = 'dstat.log'

# If true, AJAX calls should attempt to load `*.json.gz` files rather than
# plain `*.json` files. This should only ever be toggled `True` for static site
# exports and is not currently supported on live servers.
USE_GZIP = False

# Toggles offline/online mode for static export. Will trigger menu to show
# either the full site or only links supported by static exporter.
OFFLINE = False
It's time to put the fun and romance back into flying! Fly ‘low and slow’ on a classic DC3 airliner and see New Zealand our way – this is an adventure like no other! Your air tour DC3 ‘ZK-DAK’ (operated by ‘Fly DC3 NZ’) is one of only a few in the world to hold a full ‘Airline Operating Certificate’. So make the most of this amazing opportunity – let our professional flight crew and tour director show you New Zealand from a refreshingly different perspective! Stay in luxury hotel accommodation each night (with most meals included) and enjoy plenty of fascinating ground sightseeing along the way. This is small group touring at its very best – fun, sociable and exciting. With only 24 passengers per tour, seats are limited. Don't miss out – choose your tour and book today!
# -*- coding: utf-8 -*-
# --------------------------------------------------------------------
# The MIT License (MIT)
#
# Copyright (c) 2015 Jonathan Labéjof <[email protected]>
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
# --------------------------------------------------------------------

"""Decorators dedicated to asynchronous programming."""

from __future__ import absolute_import

try:
    from threading import Thread, RLock
except ImportError:
    # fall back to the no-op threading substitute on builds without threads
    from dummy_threading import Thread, RLock

from time import sleep

from signal import signal, SIGALRM, alarm

from six.moves.queue import Queue

from .core import Annotation
from .interception import PrivateInterceptor
from .oop import Mixin

__all__ = [
    'Synchronized', 'SynchronizedClass',
    'Asynchronous', 'TimeOut', 'Wait', 'Observable'
]


class Synchronized(PrivateInterceptor):
    """Transform a target into a thread safe target."""

    #: lock attribute name
    _LOCK = '_lock'

    __slots__ = (_LOCK,) + PrivateInterceptor.__slots__

    def __init__(self, lock=None, *args, **kwargs):

        super(Synchronized, self).__init__(*args, **kwargs)

        self._lock = RLock() if lock is None else lock

    def _interception(self, joinpoint):

        # release the lock even if the joinpoint raises
        self._lock.acquire()
        try:
            result = joinpoint.proceed()
        finally:
            self._lock.release()

        return result


class SynchronizedClass(Synchronized):
    """Transform a class into a thread safe class."""

    def on_bind_target(self, target, ctx=None):

        # iterate over attribute values (not names, as before) and wrap
        # every callable with the shared lock
        for attribute in target.__dict__.values():
            if callable(attribute):
                Synchronized(self._lock)(attribute)


class Asynchronous(Annotation):
    """Transform a target into an asynchronous callable target."""

    def __init__(self, *args, **kwargs):

        super(Asynchronous, self).__init__(*args, **kwargs)

        self.queue = None

    def _threaded(self, *args, **kwargs):
        """Call the target and put the result in the Queue."""

        for target in self.targets:
            result = target(*args, **kwargs)
            self.queue.put(result)

    def on_bind_target(self, target, ctx=None):

        # add start function to wrapper
        super(Asynchronous, self).on_bind_target(target, ctx=ctx)

        setattr(target, 'start', self.start)

    def start(self, *args, **kwargs):
        """Start execution of the function."""

        self.queue = Queue()

        thread = Thread(target=self._threaded, args=args, kwargs=kwargs)
        thread.start()

        return Asynchronous.Result(self.queue, thread)

    class NotYetDoneException(Exception):
        """Raised when a result is not yet available."""

    class Result(object):
        """In charge of receiving an asynchronous function result."""

        __slots__ = ('queue', 'thread', 'result')

        def __init__(self, queue, thread):

            super(Asynchronous.Result, self).__init__()

            self.result = None
            self.queue = queue
            self.thread = thread

        def is_done(self):
            """True if result is available."""

            return not self.thread.is_alive()

        def get_result(self, wait=-1):
            """Get result value. Wait for it if necessary.

            :param int wait: maximum wait time.
            :return: result value.
            """

            if not self.is_done():
                if wait >= 0:
                    self.thread.join(wait)
                else:
                    raise Asynchronous.NotYetDoneException(
                        'the call has not yet completed its task'
                    )

            if self.result is None:
                self.result = self.queue.get()

            return self.result


class TimeOut(PrivateInterceptor):
    """Raise an Exception if the target call has not finished in time."""

    class TimeOutError(Exception):
        """Exception thrown if time elapsed before the end of the target
        call.
        """

        #: Default time out error message.
        DEFAULT_MESSAGE = \
            'Call of {0} with parameters {1} and {2} is timed out in frame {3}'

        def __init__(self, timeout_interceptor, frame):

            super(TimeOut.TimeOutError, self).__init__(
                timeout_interceptor.error_message.format(
                    timeout_interceptor.target,
                    timeout_interceptor.args,
                    timeout_interceptor.kwargs,
                    frame
                )
            )

    SECONDS = 'seconds'

    ERROR_MESSAGE = 'error_message'

    __slots__ = (SECONDS, ERROR_MESSAGE) + PrivateInterceptor.__slots__

    def __init__(
            self,
            seconds, error_message=TimeOutError.DEFAULT_MESSAGE,
            *args, **kwargs
    ):

        super(TimeOut, self).__init__(*args, **kwargs)

        self.seconds = seconds
        self.error_message = error_message

    def _handle_timeout(self, frame=None, **_):
        """SIGALRM timeout handler."""

        raise TimeOut.TimeOutError(self, frame)

    def _interception(self, joinpoint):

        # arm the alarm, and always disarm it afterwards
        signal(SIGALRM, self._handle_timeout)
        alarm(self.seconds)
        try:
            result = joinpoint.proceed()
        finally:
            alarm(0)

        return result


class Wait(PrivateInterceptor):
    """Define a time to wait before and after a target call."""

    DEFAULT_BEFORE = 1  #: default seconds to wait before the target call.

    DEFAULT_AFTER = 1  #: default seconds to wait after the target call.

    BEFORE = 'before'  #: before attribute name.

    AFTER = 'after'  #: after attribute name.

    __slots__ = (BEFORE, AFTER) + PrivateInterceptor.__slots__

    def __init__(
            self, before=DEFAULT_BEFORE, after=DEFAULT_AFTER, *args, **kwargs
    ):

        super(Wait, self).__init__(*args, **kwargs)

        self.before = before
        self.after = after

    def _interception(self, joinpoint):

        sleep(self.before)

        result = joinpoint.proceed()

        sleep(self.after)

        return result


class Observable(PrivateInterceptor):
    """Implementation of the observer design pattern.

    It transforms a target into an observable object by adding the methods
    register_observer, unregister_observer and notify_observers.
    Observers listen to pre/post target interception.
    """

    def __init__(self, *args, **kwargs):

        super(Observable, self).__init__(*args, **kwargs)

        self.observers = set()

    def register_observer(self, observer):
        """Register an observer."""

        self.observers.add(observer)

    def unregister_observer(self, observer):
        """Unregister an observer."""

        self.observers.remove(observer)

    def notify_observers(self, joinpoint, post=False):
        """Notify observers with parameter calls and information about
        pre/post call.
        """

        _observers = tuple(self.observers)

        for observer in _observers:
            observer.notify(joinpoint=joinpoint, post=post)

    def on_bind_target(self, target, ctx=None):

        Mixin.set_mixin(target, self.register_observer)
        Mixin.set_mixin(target, self.unregister_observer)
        Mixin.set_mixin(target, self.notify_observers)

    def _interception(self, joinpoint):

        self.notify_observers(joinpoint=joinpoint)

        result = joinpoint.proceed()

        self.notify_observers(joinpoint=joinpoint, post=True)

        return result
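To make the control flow of `Asynchronous` concrete, here is a hedged usage sketch built only from the methods defined above (`start`, `is_done`, `get_result`); the import path, the decorated function, and its timings are assumptions for illustration.

# usage_sketch.py -- illustrative only; the import path below is an
# assumption about how the module above is packaged
from time import sleep

from b3j0f.annotation.async import Asynchronous


@Asynchronous()
def slow_add(x, y):
    """Toy workload that pretends to take a while."""
    sleep(1)
    return x + y


handle = slow_add.start(1, 2)   # runs the target in a worker thread

while not handle.is_done():
    sleep(0.1)                  # free to do other work here

print(handle.get_result())      # -> 3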
Please note, effective November 26, 2018, Simcoe Trauma Recovery Clinic will be permanently located at 15 Gallie Court, Suite 110, Barrie. Our new offices are located in the brand new Quarry Ridge medical building, directly beside our former offices. We are excited to welcome you to our brand new, permanent facility! According to the Centre for Suicide Prevention – Suicide Prevention Toolkit (2015), one in five first responders are affected by trauma. Imagine being an everyday hero – a firefighter, paramedic, police officer, or rescue unit – and having to experience being the first on the scene of a horrific accident, assault, suicide or murder... These events can be extremely traumatic to witness and be a part of. This trauma doesn't just fade away once you leave the scene. Oftentimes the memories linger for weeks, months and years afterward, haunting you over and over again. You might even feel some blame for what took place, perhaps thinking you could have done something more to help prevent the victim's pain. These traumatic memories can compound and become a major hindrance to a first responder, and over time morph into serious mental illness: depression, anxiety, extreme anger, isolation, post-traumatic stress disorder, and thoughts of suicide are common. Here at STRC, we specialize in helping people work through traumatic events and come out the other side healthier and happier. With our methods of EMDR and psychotherapy, we can assist you in breaking down the walls of fear, sadness and regret that are linked to those traumatic events, and help you be you again. It's not too late. If you are suffering, or know someone who is, please reach out to us. We are here to help.
# Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. import os import time import yaml import netifaces from libcloud.compute.base import NodeState from libcloud.compute.deployment import Deployment from libcloud.compute.deployment import ScriptDeployment from libcloud.compute.deployment import SSHKeyDeployment from libcloud.compute.ssh import SSHClient from plumbery.exception import PlumberyException from plumbery.nodes import PlumberyNodes from plumbery.polisher import PlumberyPolisher from plumbery.text import PlumberyText from plumbery.text import PlumberyNodeContext from plumbery.plogging import plogging class FileContentDeployment(Deployment): """ Installs a file on a target node. """ def __init__(self, content, target): """ :type content: ``str`` :keyword content: Content of the target file to create :type target: ``str`` :keyword target: Path to install file on node """ self.content = content self.target = target def run(self, node, client): """ Writes the file. See also :class:`Deployment.run` """ client.put(path=self.target, contents=self.content) return node class RebootDeployment(Deployment): """ Reboots a node and let cloud-init do the dirty job. """ def __init__(self, container): """ :param container: the container of this node :type container: :class:`plumbery.PlumberyInfrastructure` """ self.region = container.region def run(self, node, client): """ Reboots the node. See also :class:`Deployment.run` """ repeats = 0 while True: try: self.region.reboot_node(node) except Exception as feedback: if 'RESOURCE_BUSY' in str(feedback): time.sleep(10) continue if 'VM_TOOLS_INVALID_STATUS' in str(feedback): if repeats < 5: time.sleep(10) repeats += 1 continue plogging.error("- unable to reboot node") plogging.error(str(feedback)) finally: return node class PreparePolisher(PlumberyPolisher): """ Bootstraps nodes via ssh This polisher looks at each node in sequence, and contact selected nodes via ssh to prepare them. The goal here is to accelerate post-creation tasks as much as possible. Bootstrapping steps can consist of multiple tasks: * push a SSH public key to allow for automated secured communications * ask for package update * install docker * install any pythons script * install Stackstorm * configure a Chef client * register a node to a monitoring dashboard * ... To activate this polisher you have to mention it in the fittings plan, like in the following example:: --- safeMode: False actions: - prepare: key: ~/.ssh/myproject_rsa.pub --- # Frankfurt in Europe locationId: EU6 regionId: dd-eu ... Plumbery will only prepare nodes that have been configured for it. 
The example below demonstrates how this can be done for multiple docker containers:: # some docker resources - docker: domain: *vdc1 ethernet: *containers nodes: - docker1: prepare: &docker - run prepare.update.sh - run prepare.docker.sh - docker2: prepare: *docker - docker3: prepare: *docker In the real life when you have to prepare any appliance, you need to be close to the stuff and to touch it. This is the same for virtual fittings. This polisher has the need to communicate directly with target nodes over the network. This connectivity can become quite complicated because of the potential mix of private and public networks, firewalls, etc. To stay safe plumbery enforces a simple beachheading model, where network connectivity with end nodes is a no brainer. This model is based on predefined network addresses for plumbery itself, as in the snippet below:: --- # Frankfurt in Europe locationId: EU6 regionId: dd-eu # network subnets are 10.1.x.y prepare: - beachhead: 10.1.3.4 Here nodes at EU6 will be prepared only if the machine that is executing plumbery has the adress 10.1.3.4. In other cases, plumbery will state that the location is out of reach. """ def upgrade_vmware_tools(self, node): """ Upgrade VMware tools on target node :param node: the node to be polished :type node: :class:`libcloud.compute.base.Node` """ if self.engine.safeMode: return True while True: try: self.region.ex_update_vm_tools(node=node) plogging.info("- upgrading vmware tools") return True except Exception as feedback: if 'RESOURCE_BUSY' in str(feedback): time.sleep(10) continue if 'Please try again later' in str(feedback): time.sleep(10) continue if 'NO_CHANGE' in str(feedback): plogging.debug("- vmware tools is already up-to-date") return True plogging.warning("- unable to upgrade vmware tools") plogging.warning(str(feedback)) return False def _apply_prepares(self, node, steps): """ Does the actual job over SSH :param node: the node to be polished :type node: :class:`libcloud.compute.base.Node` :param steps: the various steps of the preparing :type steps: ``list`` of ``dict`` :return: ``True`` if everything went fine, ``False`` otherwise :rtype: ``bool`` """ if node is None or node.state != NodeState.RUNNING: plogging.warning("- skipped - node is not running") return False # select the address to use if len(node.public_ips) > 0: target_ip = node.public_ips[0] elif node.extra['ipv6']: target_ip = node.extra['ipv6'] else: target_ip = node.private_ips[0] # use libcloud to communicate with remote nodes session = SSHClient(hostname=target_ip, port=22, username=self.user, password=self.secret, key_files=self.key_files, timeout=10) repeats = 0 while True: try: session.connect() break except Exception as feedback: repeats += 1 if repeats > 5: plogging.error("Error: can not connect to '{}'!".format( target_ip)) plogging.error("- failed to connect") return False plogging.debug(str(feedback)) plogging.debug("- connection {} failed, retrying".format(repeats)) time.sleep(10) continue while True: try: if self.engine.safeMode: plogging.info("- skipped - no ssh interaction in safe mode") else: for step in steps: plogging.info('- {}'.format(step['description'])) step['genius'].run(node, session) except Exception as feedback: if 'RESOURCE_BUSY' in str(feedback): time.sleep(10) continue plogging.error("Error: unable to prepare '{}' at '{}'!".format( node.name, target_ip)) plogging.error(str(feedback)) plogging.error("- failed") result = False else: result = True break try: session.close() except: pass return result def 
_get_prepares(self, node, settings, container): """ Defines the set of actions to be done on a node :param node: the node to be polished :type node: :class:`libcloud.compute.base.Node` :param settings: the fittings plan for this node :type settings: ``dict`` :param container: the container of this node :type container: :class:`plumbery.PlumberyInfrastructure` :return: a list of actions to be performed, and related descriptions :rtype: a ``list`` of `{ 'description': ..., 'genius': ... }`` """ if not isinstance(settings, dict): return [] environment = PlumberyNodeContext(node=node, container=container, context=self.facility) prepares = [] for key_file in self.key_files: try: path = os.path.expanduser(key_file) with open(path) as stream: key = stream.read() stream.close() prepares.append({ 'description': 'deploy SSH public key', 'genius': SSHKeyDeployment(key=key)}) except IOError: plogging.warning("no ssh key in {}".format(key_file)) if ('prepare' in settings and isinstance(settings['prepare'], list) and len(settings['prepare']) > 0): plogging.info('- using prepare commands') for script in settings['prepare']: tokens = script.split(' ') if len(tokens) == 1: tokens.insert(0, 'run') if tokens[0] in ['run', 'run_raw']: # send and run a script script = tokens[1] if len(tokens) > 2: args = tokens[2:] else: args = [] plogging.debug("- {} {} {}".format( tokens[0], script, ' '.join(args))) try: with open(script) as stream: text = stream.read() if(tokens[0] == 'run' and PlumberyText.could_expand(text)): plogging.debug("- expanding script '{}'" .format(script)) text = PlumberyText.expand_string( text, environment) if len(text) > 0: plogging.info("- running '{}'" .format(script)) prepares.append({ 'description': ' '.join(tokens), 'genius': ScriptDeployment( script=text, args=args, name=script)}) else: plogging.error("- script '{}' is empty" .format(script)) except IOError: plogging.error("- unable to read script '{}'" .format(script)) elif tokens[0] in ['put', 'put_raw']: # send a file file = tokens[1] if len(tokens) > 2: destination = tokens[2] else: destination = './'+file plogging.debug("- {} {} {}".format( tokens[0], file, destination)) try: with open(file) as stream: content = stream.read() if(tokens[0] == 'put' and PlumberyText.could_expand(content)): plogging.debug("- expanding file '{}'" .format(file)) content = PlumberyText.expand_string( content, environment) plogging.info("- putting file '{}'" .format(file)) prepares.append({ 'description': ' '.join(tokens), 'genius': FileContentDeployment( content=content, target=destination)}) except IOError: plogging.error("- unable to read file '{}'" .format(file)) else: # echo a sensible message eventually if tokens[0] == 'echo': tokens.pop(0) message = ' '.join(tokens) message = PlumberyText.expand_string( message, environment) plogging.info("- {}".format(message)) if ('cloud-config' in settings and isinstance(settings['cloud-config'], dict) and len(settings['cloud-config']) > 0): plogging.info('- using cloud-config') # mandatory, else cloud-init will not consider user-data plogging.debug('- preparing meta-data') meta_data = 'instance_id: dummy\n' destination = '/var/lib/cloud/seed/nocloud-net/meta-data' prepares.append({ 'description': 'put meta-data', 'genius': FileContentDeployment( content=meta_data, target=destination)}) plogging.debug('- preparing user-data') expanded = PlumberyText.expand_string( settings['cloud-config'], environment) user_data = '#cloud-config\n'+expanded plogging.debug(user_data) destination = 
'/var/lib/cloud/seed/nocloud-net/user-data' prepares.append({ 'description': 'put user-data', 'genius': FileContentDeployment( content=user_data, target=destination)}) plogging.debug('- preparing remote install of cloud-init') script = 'prepare.cloud-init.sh' try: path = os.path.dirname(__file__)+'/'+script with open(path) as stream: text = stream.read() if text: prepares.append({ 'description': 'run '+script, 'genius': ScriptDeployment( script=text, name=script)}) except IOError: raise PlumberyException("Error: cannot read '{}'" .format(script)) plogging.debug('- preparing reboot to trigger cloud-init') prepares.append({ 'description': 'reboot node', 'genius': RebootDeployment( container=container)}) return prepares def go(self, engine): """ Starts the prepare process :param engine: access to global parameters and functions :type engine: :class:`plumbery.PlumberyEngine` """ super(PreparePolisher, self).go(engine) self.report = [] self.user = engine.get_shared_user() self.secret = engine.get_shared_secret() self.key_files = engine.get_shared_key_files() if 'key' in self.settings: key = self.settings['key'] key = os.path.expanduser(key) if os.path.isfile(key): plogging.debug("- using shared key {}".format(key)) if self.key_files is None: self.key_files = [key] else: self.key_files.insert(0, key) else: plogging.error("Error: missing file {}".format(key)) def move_to(self, facility): """ Checks if we can beachhead at this facility :param facility: access to local parameters and functions :type facility: :class:`plumbery.PlumberyFacility` This function lists all addresses of the computer that is running plumbery. If there is at least one routable IPv6 address, then it assumes that communication with nodes is possible. If no suitable IPv6 address can be found, then plumbery falls back to IPv4. Beachheading is granted only if the address of the computer running plumbery matches the fitting parameter ``beachhead``. 
""" self.facility = facility self.region = facility.region self.nodes = PlumberyNodes(facility) self.beachheading = False try: self.addresses = [] for interface in netifaces.interfaces(): addresses = netifaces.ifaddresses(interface) if netifaces.AF_INET in addresses.keys(): for address in addresses[netifaces.AF_INET]: # strip local loop if address['addr'].startswith('127.0.0.1'): continue self.addresses.append(address['addr']) if netifaces.AF_INET6 in addresses.keys(): for address in addresses[netifaces.AF_INET6]: # strip local loop if address['addr'].startswith('::1'): continue # strip local link addresses if address['addr'].startswith('fe80::'): continue # we have a routable ipv6, so let's go self.beachheading = True except Exception as feedback: plogging.error(str(feedback)) for item in self.facility.get_setting('prepare', []): if not isinstance(item, dict): continue if 'beachhead' not in item.keys(): continue if item['beachhead'] in self.addresses: self.beachheading = True break if self.beachheading: plogging.debug("- beachheading at '{}'".format( self.facility.get_setting('locationId'))) else: plogging.debug("- not beachheading at '{}'".format( self.facility.get_setting('locationId'))) def attach_node_to_internet(self, node, ports=[]): """ Adds address translation for one node :param node: node that has to be reachable from the internet :type node: :class:`libcloud.common.Node` :param ports: the ports that have to be opened :type ports: a list of ``str`` """ plogging.info("Making node '{}' reachable from the internet" .format(node.name)) domain = self.container.get_network_domain( self.container.blueprint['domain']['name']) internal_ip = node.private_ips[0] external_ip = None for rule in self.region.ex_list_nat_rules(domain): if rule.internal_ip == internal_ip: external_ip = rule.external_ip plogging.info("- node is reachable at '{}'".format(external_ip)) if self.engine.safeMode: plogging.info("- skipped - safe mode") return if external_ip is None: external_ip = self.container._get_ipv4() if external_ip is None: plogging.info("- no more ipv4 address available -- assign more") return while True: try: self.region.ex_create_nat_rule( domain, internal_ip, external_ip) plogging.info("- node is reachable at '{}'".format( external_ip)) except Exception as feedback: if 'RESOURCE_BUSY' in str(feedback): time.sleep(10) continue elif 'RESOURCE_LOCKED' in str(feedback): plogging.info("- not now - locked") return else: plogging.info("- unable to add address translation") plogging.error(str(feedback)) break candidates = self.container._list_candidate_firewall_rules(node, ports) for rule in self.container._list_firewall_rules(): if rule.name in candidates.keys(): plogging.info("Creating firewall rule '{}'" .format(rule.name)) plogging.info("- already there") candidates = {k: candidates[k] for k in candidates if k != rule.name} for name, rule in candidates.items(): plogging.info("Creating firewall rule '{}'" .format(name)) if self.engine.safeMode: plogging.info("- skipped - safe mode") else: try: self.container._ex_create_firewall_rule( network_domain=domain, rule=rule, position='LAST') plogging.info("- in progress") except Exception as feedback: if 'NAME_NOT_UNIQUE' in str(feedback): plogging.info("- already there") else: plogging.info("- unable to create firewall rule") plogging.error(str(feedback)) return external_ip def shine_node(self, node, settings, container): """ prepares a node :param node: the node to be polished :type node: :class:`libcloud.compute.base.Node` :param settings: the fittings 
plan for this node :type settings: ``dict`` :param container: the container of this node :type container: :class:`plumbery.PlumberyInfrastructure` """ self.container = container plogging.info("Preparing node '{}'".format(settings['name'])) if node is None: plogging.error("- not found") return timeout = 300 tick = 6 while node.extra['status'].action == 'START_SERVER': time.sleep(tick) node = self.nodes.get_node(node.name) timeout -= tick if timeout < 0: break if node.state != NodeState.RUNNING: plogging.error("- skipped - node is not running") return self.upgrade_vmware_tools(node) prepares = self._get_prepares(node, settings, container) if len(prepares) < 1: plogging.info('- nothing to do') self.report.append({node.name: { 'status': 'skipped - nothing to do' }}) return if len(node.public_ips) > 0: plogging.info("- node is reachable at '{}'".format( node.public_ips[0])) node.transient = False elif container.with_transient_exposure(): external_ip = self.attach_node_to_internet(node, ports=['22']) if external_ip is None: plogging.error('- no IP has been assigned') self.report.append({node.name: { 'status': 'unreachable' }}) return node.public_ips = [external_ip] node.transient = True elif not self.beachheading: plogging.error('- node is unreachable') self.report.append({node.name: { 'status': 'unreachable' }}) return descriptions = [] for item in prepares: descriptions.append(item['description']) if self._apply_prepares(node, prepares): self.report.append({node.name: { 'status': 'completed', 'prepares': descriptions }}) else: self.report.append({node.name: { 'status': 'failed', 'prepares': descriptions }}) if node.transient: self.container._detach_node_from_internet(node) def reap(self): """ Reports on preparing """ if 'output' not in self.settings: return fileName = self.settings['output'] plogging.info("Reporting on preparations in '{}'".format(fileName)) with open(fileName, 'w') as stream: stream.write(yaml.dump(self.report, default_flow_style=False)) stream.close()
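A pattern worth noting in the module above is the inline retry loop wrapped around every region call that can raise `RESOURCE_BUSY`. As a hedged sketch (an illustration, not plumbery API), the same behaviour could be factored into a single helper:

# sketch of a retry helper -- not part of plumbery; the module's own
# retry loops inline this logic around each region call
import time


def retry_while_busy(action, attempts=30, delay=10):
    """Call action() and retry while the driver reports RESOURCE_BUSY."""
    for _ in range(attempts):
        try:
            return action()
        except Exception as feedback:
            if 'RESOURCE_BUSY' not in str(feedback):
                raise
            time.sleep(delay)
    raise RuntimeError('resource still busy after %d attempts' % attempts)

# e.g.: retry_while_busy(lambda: self.region.reboot_node(node))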
On August 9, 2014, 18-year-old Michael Brown, African-American, was shot to death by Darren Wilson, 28, white, a police officer in Ferguson, Missouri, a suburb of Saint Louis. Brown, a large young man, had stolen a box of cigars from a convenience store in Ferguson. Wilson was patrolling in the area, crossed paths with Brown, and they had a confrontation, which resulted in Wilson shooting Brown six times. At the core of the Ferguson civil fiasco were these two questions: Did Brown instigate the shooting by physically attacking Wilson, putting Wilson in fear of his life? Or did Wilson act "too hastily" and pump six rounds into Brown, senselessly ending the young man's life? Needless to say, investigators and prosecutors commenced sifting through the evidence "to separate fact from fiction" and get to the truth. On Nov. 24, 2014, around 8:30 p.m., a news bulletin interrupted regular TV programming to announce, live, to the nation the verdict in the Aug. 9, 2014 shooting death of Michael Brown by police officer Darren Wilson. In the bulletin, St. Louis County Prosecutor Robert McCulloch, white, told a national audience Ferguson policeman Wilson "would not face charges for fatally shooting" Michael Brown. As mentioned, U.S. Attorney General Eric Holder, black, weighed in on the shooting death of Michael Brown, and charges were not brought against Officer Darren Wilson. This indicates to me Wilson didn't go to the crime scene with the intention of killing a black man. Instead, he got caught up in a situation and acted in self-defense. A major reason I believe Wilson acted in self-defense is that some original "eyewitnesses" claimed Wilson shot Brown in the back as Brown attempted to flee. However, after all the evidence was gathered and examined, Prosecutor McCulloch said that same evidence forensically supported the fact that Brown had been shot only in the front, not in the back. Sadly enough, I wager the looting, demonstrating, and civil unrest will continue in Ferguson, Missouri, as well as all across the country, for some time to come. Though some protesters are genuinely upset (particularly Michael Brown's family and friends), I think it is high time we examine what, I feel, is at the very core of racial problems such as the ongoing one in Ferguson, Missouri: Plain and simple, some blacks hate all whites, and some whites hate all blacks . . . regardless of the facts! These "race-haters," as I call them - black and white, from the East Coast to the West Coast, and all places in-between - spend a majority of their existence just waiting for an excuse to go off on another race. From my perspective, it has nothing to do with how fairly or wrongly their particular race is treated by another race. Rather, it has everything to do with them having an excuse to vent their own viral hatred! On a somewhat lighter note, I heard a real estate agent say something to the effect of, If they don't settle down, the price of land in Ferguson, Missouri will be down to nothin' . . . won't be able to give it away. Apply the real estate agent's sentiment to America as a whole. Scary, isn't it?
""" Show he list of Panda Jobs with LFN or versa verse LFNs by Panda IDs </td><td>$Rev$""" # $Id: joblfn.py 19632 2014-07-06 07:30:10Z jschovan $ from pmUtils.pmState import pmstate from pmCore.pmModule import pmRoles from pmTaskBuffer.pmTaskBuffer import pmtaskbuffer as pmt import pmUtils.pmUtils as utils from pmCore.pmModule import pmModule class joblfn(pmModule): """ Show the list of Panda id with LFN or versa verse LFNs by Panda IDs """ #______________________________________________________________________________________ def __init__(self,name=None,parent=None,obj=None): pmModule.__init__(self,name,parent,obj) self.publishUI(self.doJson) #______________________________________________________________________________________ def doJson(self,lfn=None, jobs=None,type='input',ds=None,table='new',limit=1000,jobstatus=None,site=None,jobtype='production',days=1,user=None,select=None): """ Show the list of Panda id with LFN or versa verse LFNs by Panda IDs <ul> <li><code>lfn</code> - the list of the comma separated files <li><code>ds</code> - the list of the comma separated datasets <li><code>jobs</code> - the list of the comma separated Panda's job IDs <li><code>table</code> = "new" (default) look up the records for last 3 days <br> ="old" - look up the records those more than 3 days old (slow) <br> ="deep" - look up the "old" and "new" tables (slow) <li><code>type</code> - the type selector. <br> = 'input - the default value<br> = '*' | 'all' - list all types available. <li><code>jobstatus</code> = the comma separated list of the job status to filter <br>For example: 'defined, waiting,assigned,activated,sent,starting,running' <li><code>site</code> = the comma separated list of the sizte to list the jobs from <br> For example 'UTA_SWT2' <li><code>jobtype</code> = the comma separated list of the job type to filter <br> For example, "analysis, production" <li><code>days</code> = the number of the days to look up the list of the jobs if either 'jobstatus' or 'site' parameter is defined <li><code>user</code> = the comma separated list of the usernames . <br>NB. The names may contain the the wildcard symbol '*'. Be aware the wildcard slows the search down </ul> """ title = 'The list of files for the ' if jobstatus and jobstatus.strip() =='': jobstatus = None if site and site.strip() =='': site = None if lfn and lfn.strip() =='': lfn = None if jobs and isinstance(jobs,str) and jobs.strip() =='': jobs = None if ds and ds.strip() =='': ds=None if type and type.strip() =='': type='all' if lfn==None and jobs==None and ds==None and jobstatus==None and site==None: self.publishTitle("Ambigios query: lfn=%(lfn)s; pandaid=%(pandaid)s either lfn or padaid can be defined. One can not define lfn and pandaid at once" % { 'lfn': lfn, 'pandaid' : jobs} ) self.publish('<h2>Check the input parameters. Click the "?" to see the API documentaion<h2>', role=pmRoles.html()) else: nav = '' if limit: nav += "Limit %s rows." 
% limit if type=='*' or type==None: type = 'all' if lfn != None: self.publishTitle("The list of the PANDA jobs with the LFN of the '%s' type provided" % type) if not '*' in lfn: # disregard the jobtype parameter if utils.isFilled(jobtype): nav += " Disregarding the jobtype='%s' default parameter" % jobtype jobtype = None if ds != None: self.publishTitle("The list of the PANDA jobs with the DATASETs of the '%s' type provided" % type) if jobs!=None: self.publishTitle("The list of the '%s' LFN with the PANDA Job IDs provided" % type) if utils.isFilled(nav): self.publishNav(nav) main = {} main["buffer"] = {} main["buffer"]["method"] = "joblfn" main["buffer"]["params"] = (lfn if lfn!=None else '',jobs if jobs!= None else '' ) if jobs != None: main["buffer"]["jobs"] = jobs main["buffer"]["type"] = False if (utils.isFilled(jobstatus) or utils.isFilled(site) or utils.isFilled(user)) and not utils.isFilled(jobs): tables = ['atlas_panda.jobsArchived4','atlas_panda.jobsActive4','atlas_panda.jobsWaiting4','atlas_panda.jobsDefined4'] r = pmt.getJobIds(site, jobtype,jobstatus,table=tables,days=days,username=user) jobs = [i[0] for i in r['rows']] if not utils.isFilled(select): select = [] if jobs == None or ( not isinstance(jobs,int) and len(jobs) > 1): select.append('pandaid'); select += ['type', 'lfn', 'fsize', 'dataset', 'guid', 'scope', 'destinationse'] else: select = utils.parseArray(select); main["buffer"]["data"] = pmt.getJobLFN(select=','.join(select),table=table,lfn=lfn,pandaid=jobs,type=type,ds=ds,limit=limit) self.publish(main) self.publish( "%s/%s" % (self.server().fileScriptURL(),"taskBuffer/%s.js" % "getJobLFN"),role=pmRoles.script())
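The run of blank-string checks at the top of `doJson` repeats the same test for every parameter. As a hedged refactoring sketch (a hypothetical helper, not part of pmUtils), the checks could be collapsed:

# hypothetical helper -- not part of pmUtils; shown only to illustrate
# how the repeated blank-string checks above could be collapsed
def blank_to_none(value):
    """Return None for empty or whitespace-only strings, else the value."""
    if isinstance(value, str) and value.strip() == '':
        return None
    return value

# the per-parameter checks then reduce to one line, e.g.:
#   lfn, ds, site, jobstatus = [blank_to_none(v) for v in (lfn, ds, site, jobstatus)]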
Not since the Japanese first stormed the Paris scene, in 1983, has the Far East been such a preoccupation of the fashion flock—both in terms of trends (Roberto Cavalli's Ming vase prints) and business (new stores in China and/or Japan from Prada, Giorgio Armani, and Louis Vuitton). That's good timing for Tsumori Chisato, a designer who is "big in Japan" and now starting to make inroads into the Western market. A graduate of Tokyo's renowned Bunka Fashion College, Chisato started working in 1977 for Issey Miyake, who helped her launch her own line in 1990. She first brought her collection to Paris in 2003 and has been quietly building street cred with cool girls in Los Angeles and New York. Her fall collection stayed faithful to her aesthetic, which is print-heavy with a healthy dose of manga/bohemian cuteness. The show took wing with the first look, an owl-appliqué minidress, and closed with homey quilted frocks that tied into the vague Bloomsbury theme at play throughout. There were a number of pretty dresses: some dramatic with pile stripes, a requisite velvet number, and others with tree appliqués and a patchwork of fun, flirty prints. The second look was a simple white pinafore that recalled Peter Pan's Wendy—and served as a reminder that, for every governess that came down the runways this season, there is a corresponding youthful charge. Chisato is ready to dress her.
#!/usr/bin/env python3
#
# Copyright (c) 2016 Roberto Riggio
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

"""Empower persistence layer."""

import uuid

import empower.datatypes.etheraddress as etheraddress
import empower.datatypes.ssid as ssid

from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, String, Integer, ForeignKey
from sqlalchemy.types import TypeDecorator, Unicode
from sqlalchemy.orm import relationship

from empower.persistence import ENGINE

Base = declarative_base()


class UUID(TypeDecorator):
    """UUID type."""

    impl = Unicode

    def __init__(self):
        self.impl.length = 16
        TypeDecorator.__init__(self, length=self.impl.length)

    def process_bind_param(self, value, dialect=None):
        if value and isinstance(value, uuid.UUID):
            return value.bytes
        elif value and not isinstance(value, uuid.UUID):
            raise ValueError('value %s is not a valid uuid.UUID' % value)
        else:
            return None

    def process_result_value(self, value, dialect=None):
        if value:
            return uuid.UUID(bytes=value)
        else:
            return None

    def is_mutable(self):
        return False


class EtherAddress(TypeDecorator):
    """EtherAddress type."""

    impl = Unicode

    def __init__(self):
        self.impl.length = 6
        TypeDecorator.__init__(self, length=self.impl.length)

    def process_bind_param(self, value, dialect=None):
        if value and isinstance(value, etheraddress.EtherAddress):
            return value.to_raw()
        elif value and not isinstance(value, etheraddress.EtherAddress):
            raise ValueError('value %s is not a valid EtherAddress' % value)
        else:
            return None

    def process_result_value(self, value, dialect=None):
        if value:
            return etheraddress.EtherAddress(value)
        else:
            return None

    def is_mutable(self):
        return False


class SSID(TypeDecorator):
    """SSID type."""

    impl = Unicode

    def __init__(self):
        self.impl.length = 30
        TypeDecorator.__init__(self, length=self.impl.length)

    def process_bind_param(self, value, dialect=None):
        if value and isinstance(value, ssid.SSID):
            return value.to_raw()
        elif value and not isinstance(value, ssid.SSID):
            raise ValueError('value %s is not a valid SSID' % value)
        else:
            return None

    def process_result_value(self, value, dialect=None):
        if value:
            return ssid.SSID(value)
        else:
            return None

    def is_mutable(self):
        return False


class TblFeed(Base):
    """ Energino Feeds Table. """

    __tablename__ = 'Feed'

    feed_id = Column(Integer, primary_key=True)
    title = Column(String)
    created = Column(String)
    updated = Column(String)
    pnfdev_addr = Column(EtherAddress, nullable=True)


class TblAccount(Base):
    """ Account table. """

    __tablename__ = 'account'

    username = Column(String, primary_key=True)
    password = Column(String)
    name = Column(String)
    surname = Column(String)
    email = Column(String)
    role = Column(String)


class TblPendingTenant(Base):
    """ List of pending Tenant requests. """

    __tablename__ = 'pending_tenant'

    tenant_id = Column("tenant_id",
                       UUID(),
                       primary_key=True,
                       default=uuid.uuid4)
    tenant_name = Column(SSID, unique=True)
    desc = Column(String)
    owner = Column(String)
    bssid_type = Column(String)

    def to_dict(self):
        """ Return a JSON-serializable dictionary representing the request """

        return {'tenant_id': self.tenant_id,
                'owner': self.owner,
                'tenant_name': self.tenant_name,
                'desc': self.desc,
                'bssid_type': self.bssid_type}


class TblTenant(Base):
    """ Tenant table. """

    __tablename__ = 'tenant'

    tenant_id = Column("tenant_id",
                       UUID(),
                       primary_key=True,
                       default=uuid.uuid4)
    tenant_name = Column(SSID, unique=True)
    desc = Column(String)
    owner = Column(String)
    bssid_type = Column(String)


class TblPNFDev(Base):
    """ Programmable network fabric device table. """

    __tablename__ = 'pnfdev'

    addr = Column("addr",
                  EtherAddress(),
                  primary_key=True)
    label = Column(String)

    tbl_type = Column(String(20))

    __mapper_args__ = {
        'polymorphic_on': tbl_type,
        'polymorphic_identity': 'pnfdevs'
    }


class TblBelongs(Base):
    """Link PNFDevs with Tenants"""

    __tablename__ = 'belongs'

    addr = Column(EtherAddress(),
                  ForeignKey('pnfdev.addr'),
                  primary_key=True)
    tenant_id = Column(UUID(),
                       ForeignKey('tenant.tenant_id'),
                       primary_key=True)


class TblCPP(TblPNFDev):
    """ Programmable network fabric device table. """

    __mapper_args__ = {
        'polymorphic_identity': 'cpps'
    }


class TblWTP(TblPNFDev):
    """ Wireless Termination point. """

    __mapper_args__ = {
        'polymorphic_identity': 'wtps'
    }


class TblVBS(TblPNFDev):
    """ Virtual Base Station Point. """

    __mapper_args__ = {
        'polymorphic_identity': 'vbses'
    }


class TblAllow(Base):
    """ Allow table. """

    __tablename__ = 'allow'

    addr = Column("addr",
                  EtherAddress(),
                  primary_key=True)
    label = Column(String)


class TblDeny(Base):
    """ Deny table. """

    __tablename__ = 'deny'

    addr = Column("addr",
                  EtherAddress(),
                  primary_key=True)
    label = Column(String)


class TblRule(Base):
    """ Rule table. """

    __tablename__ = 'rule'

    rule_id = Column(Integer, primary_key=True, autoincrement=True)
    slvap = Column("slvap", EtherAddress())
    swtp = Column("swtp", EtherAddress())
    type = Column(String)
    dwtp = Column("dwtp", EtherAddress())
    dlvap = Column("dlvap", EtherAddress())

    def to_dict(self):
        """ Return a JSON-serializable dictionary representing the request """

        return {'rule_id': self.rule_id,
                'slvap': self.slvap,
                'dwtp': self.dwtp,
                'type': self.type,
                'swtp': self.swtp,
                'dlvap': self.dlvap}


class TblRulegroupid(Base):
    """ Rule relation. """

    # note: the original table name carried a trailing space ('groupid '),
    # which was almost certainly accidental; trimmed here
    __tablename__ = 'groupid'

    id = Column(Integer, primary_key=True, autoincrement=True)
    target = Column("target", EtherAddress(), nullable=True)
    lvaps = relationship('TblRulegroup')


class TblRulegroup(Base):
    """ Rule lvaps. """

    __tablename__ = 'rulegroup'

    groupid = Column(Integer, primary_key=True, autoincrement=True)
    lvap = Column("lvap", EtherAddress())
    rule_id = Column(Integer, ForeignKey(TblRulegroupid.id))


Base.metadata.create_all(ENGINE)
The eligibility, types and methods of awarding prizes should be resolved before the Grad Night event. The Prize Committee should decide who is eligible, what types of prizes will be offered, and how the prizes will be awarded.
Incentive Prizes: Awarded during the year to entice the graduates to attend Grad Night.
Game and Contest Prizes: Awarded to entice the graduates to try the games and contests.
Door Prizes: Awarded as an incentive to come to the Grad Night.
Grand Prize: A high-value prize awarded as an incentive to stay the entire night.
Random Prizes: Prizes given randomly to make Grad Night more exciting and fun.
Grad Night Favors: Inexpensive prizes given to all graduates.
Inexpensive Item (CD player, portable telephone, answering machine, etc.).
Carnival and Games of Chance prizes should not be high-value items. Ensure the prize awards are set up to prevent the appearance of gambling. Keep in mind that some graduates will not (or are not allowed to) participate in the games of chance. GAMBLING MUST NOT BE A PART OF THE GRAD NIGHT EVENT. Some Grad Nights give raffle tickets as prizes for the games of chance; the raffle tickets are then entered into a special drawing at the end of the night to win prizes. Some Grad Nights give "funny" money or Grad Bucks as prizes, redeemable at a prize store or silent auction. "Funny" money prizes should be of equal value to discourage the graduates from playing the games of chance for the sole purpose of trying to get an item of high value. These prizes encourage participation in the various contests such as dance contests, hula-hoop, limbo, lip sync, Guess-How-Many, and photo identification. Anytime the contest winner is a team of two or more persons, be sure to give individual prizes. Just because two people competed together doesn't mean they will share the prize. Consider giving inexpensive consolation prizes for non-winners of carnival games. Some businesses give product samples or promotional products suitable for consolation prizes (soft drink coupons, model kits, cosmetics, product samples, etc.). Door prizes are typically large-value items, which all graduates are equally eligible to win. Grad Nights typically award door prizes at the end of the party and require the winner to be present to win. The Grand Prize is typically a high-value item (or items), which all graduates are equally eligible to win. Grad Nights typically select the Grand Prize winners at the end of the party and require the winner to be present to win. Give a prize to every 10th, 50th, and/or 300th graduate to arrive. Give a prize to the first 50 or 100 graduates to arrive. Give prizes to those who find certain "items" at Grad Night (marked cup, chair, etc.). Some schools design a grid on a page of the graduate's passport; each square is for a certain area of Grad Night (like the casino), and the graduates with the most squares filled in with a verification stamp from each area win a prize. Inexpensive favors are fun; the idea is for every graduate to leave with something. Examples include yo-yos, glow necklaces, Grad Night T-shirts, key chains, Grad Night mugs, Grad Night hats, and "goodie" bags (containing product samples, candy, gum, toothbrush and toothpaste, pencils, balloons, etc.). Give incentive prizes to entice graduates to purchase tickets before the Grad Night. Award door prizes all night but distribute them only at the end of the Grad Night; the graduates must be present in order to claim the prizes. Hold the drawing for the Grand Prize as the very last event of the Grad Night.
Give at least one Grad Night favor to each attendee. Give no prizes at all. It's always fun to have a chance to win something! Have a door prize for all attendees. This is not needed or expected by graduates. Word to the Wise: It is possible to order customized items inexpensively. Consider joining with other schools to order items more cost-effectively. Items might say "Class of", or "Grad Night All Night", etc. Some of the more reasonable items are pens, pencils, sun visors, Frisbees, mugs, and stadium cups. Prizes may be purchased from catalogs, warehouse stores, military exchanges, wholesalers, retailers, etc., or donated by the community, family and friends, local clubs, religious organizations, local merchants, the media, professionals, government agencies, corporations, etc. Ensure your Prize Committee resolves the following questions before Grad Night. These can be very difficult decisions for each committee to make depending on the differing philosophies of the committee members.
When will prizes be given?
Must graduates be present to win the prizes?
Will "guests" (like spouses, foreign exchange graduates, etc.) be allowed to win prizes?
How many door or grand prizes can one individual win?
Will cash be given as a prize?
How will unclaimed prizes be handled?
Publicize the rules for prize eligibility to the graduates, parents and volunteers. Don't start the final Grand Prize drawing until all activities have been closed down and all graduates are in the area. If cash will be given, post-date checks or mail checks to winners. Remember that scholarships are typically more of a prize for the parents of the graduates, who will be paying the college bills. A scholarship is also of little value to graduates who do not intend to go on to college. Provide a safe and secure storage area for the prizes. The prizes should be visible but securely stored. Consider announcing winners on the PA system and/or on a billboard with names displayed. Consider putting a "From" tag on donated prizes so graduates will know who donated them. Some schools print thank-you notes and have the winners sign them; the committee then mails them to the donors immediately following the celebration. The question of favoritism usually arises anytime tickets are drawn. One way to eliminate this is to put the names of the prizes in one container and the names of the graduates in another, then draw one ticket from each container. Consider allowing each graduate who stays until the end of the Grad Night an opportunity to pull one piece of real money from a moneybox as they exit the facility. Denominations could range from $1 to $50.
""" Django settings for modelrepository project. Generated by 'django-admin startproject' using Django 1.11.2. For more information on this file, see https://docs.djangoproject.com/en/1.11/topics/settings/ For the full list of settings and their values, see https://docs.djangoproject.com/en/1.11/ref/settings/ """ import os # Build paths inside the project like this: os.path.join(BASE_DIR, ...) BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__))) # Quick-start development settings - unsuitable for production # See https://docs.djangoproject.com/en/1.11/howto/deployment/checklist/ # SECURITY WARNING: keep the secret key used in production secret! SECRET_KEY = '&$)1o$zbwi&y=qu)fb@1o_@p&bzjtnq3f2!gz*h+xex=(e@_&_' # SECURITY WARNING: don't run with debug turned on in production! DEBUG = True ALLOWED_HOSTS = [] # Application definition INSTALLED_APPS = [ 'django.contrib.auth', 'django.contrib.contenttypes', 'django.contrib.sessions', 'django.contrib.messages', 'django.contrib.staticfiles', 'django.contrib.postgres', 'mainapp', 'django_pgviews', 'compressor', ] MIDDLEWARE = [ 'django.middleware.security.SecurityMiddleware', 'django.contrib.sessions.middleware.SessionMiddleware', 'django.middleware.common.CommonMiddleware', 'django.middleware.csrf.CsrfViewMiddleware', 'django.contrib.auth.middleware.AuthenticationMiddleware', 'django.contrib.messages.middleware.MessageMiddleware', 'django.middleware.clickjacking.XFrameOptionsMiddleware', ] ROOT_URLCONF = 'modelrepository.urls' TEMPLATES = [ { 'BACKEND': 'django.template.backends.django.DjangoTemplates', 'DIRS': [], 'APP_DIRS': True, 'OPTIONS': { 'context_processors': [ 'django.template.context_processors.debug', 'django.template.context_processors.request', 'django.contrib.auth.context_processors.auth', 'django.contrib.messages.context_processors.messages', ], }, }, ] LOGIN_REDIRECT_URL = '/login' WSGI_APPLICATION = 'modelrepository.wsgi.application' # Database # https://docs.djangoproject.com/en/1.11/ref/settings/#databases DATABASES = { 'default': { 'ENGINE': 'django.db.backends.postgresql', 'NAME': os.environ.get('RESERVOIR_DB_NAME', 'reservoir'), 'USER': os.environ.get('RESERVOIR_DB_USER', 'reservoir'), 'PASSWORD': os.environ.get('RESERVOIR_DB_PASSWORD','reservoir'), 'HOST': os.environ.get('RESERVOIR_DB_HOST','127.0.0.1'), 'PORT': os.environ.get('RESERVOIR_DB_PORT','5432'), } } # Password validation # https://docs.djangoproject.com/en/1.11/ref/settings/#auth-password-validators AUTH_PASSWORD_VALIDATORS = [ { 'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator', }, { 'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator', }, { 'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator', }, { 'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator', }, ] # Internationalization # https://docs.djangoproject.com/en/1.11/topics/i18n/ LANGUAGE_CODE = 'en-us' TIME_ZONE = 'UTC' USE_I18N = True USE_L10N = True USE_TZ = True # Static files (CSS, JavaScript, Images) # https://docs.djangoproject.com/en/1.11/howto/static-files/ STATIC_URL = '/static/' STATIC_ROOT = '/home/tdmr/static/' STATICFILES_FINDERS = ( 'django.contrib.staticfiles.finders.AppDirectoriesFinder', 'compressor.finders.CompressorFinder', )
To keep in touch with our news, helpful tips and information, sign up to our mailing list. We’ll send you a periodic update. Don’t worry, it’s not the least bit annoying.
from argh import arg from six import iteritems import logging from pnc_cli import swagger_client from pnc_cli.swagger_client.apis.productversions_api import ProductversionsApi from pnc_cli.swagger_client.apis.products_api import ProductsApi from pnc_cli import utils versions_api = ProductversionsApi(utils.get_api_client()) products_api = ProductsApi(utils.get_api_client()) __author__ = 'thauser' def create_product_version_object(**kwargs): created_version = swagger_client.ProductVersionRest() for key, value in iteritems(kwargs): setattr(created_version, key, value) return created_version def version_exists(id): response = utils.checked_api_call(versions_api, 'get_specific', id=id) if not response: return False return True def version_exists_for_product(id, version): existing_products = products_api.get_product_versions(id=id).content if existing_products: return version in [x.version for x in existing_products] else: return False @arg("-p", "--page-size", help="Limit the amount of build records returned") @arg("-s", "--sort", help="Sorting RSQL") @arg("-q", help="RSQL query") def list_product_versions(page_size=200, sort="", q=""): """ List all ProductVersions """ response = utils.checked_api_call(versions_api, 'get_all', page_size=page_size, sort=sort, q=q) if response: return response.content # TODO: Version needs to be checked for validity. @arg("product_id", help="ID of product to add a version to") @arg("version", help="Version to add") @arg("-cm", "--current-product-milestone-id", help="ID of the milestone this version should be on") @arg("-pr", "--product-releases", type=int, nargs="+", help="List of product release IDs for this Product version") @arg("-pm", "--product-milestones", type=int, nargs="+", help="List of milestone IDs to associate with the new version") @arg("-bc", "--build-configuration-set-ids", type=int, nargs="+", help="List of build configuration set IDs to associate with the new version") def create_product_version(product_id, version, **kwargs): """ Create a new ProductVersion. Each ProductVersion represents a supported product release stream, which includes milestones and releases typically associated with a single major.minor version of a Product. Follows the Red Hat product support cycle, and typically includes Alpha, Beta, GA, and CP releases with the same major.minor version. Example: ProductVersion 1.0 includes the following releases: 1.0.Beta1, 1.0.GA, 1.0.1, etc. """ if version_exists_for_product(product_id, version): logging.error("Version {} already exists for product: {}".format( version, products_api.get_specific(id=product_id).content.name)) return kwargs['product_id'] = product_id kwargs['version'] = version product_version = create_product_version_object(**kwargs) response = utils.checked_api_call(versions_api, 'create_new_product_version', body=product_version) if response: return response.content @arg("id", help="ID of the ProductVersion to retrieve") def get_product_version(id): """ Retrieve a specific ProductVersion by ProductVersion ID """ if not version_exists(id): logging.error("No ProductVersion with ID {} exists.".format(id)) return response = utils.checked_api_call(versions_api, 'get_specific', id=id) if response: return response.content # TODO: how should constraints be defined? Can a new productId be specified? # TODO: Version needs to be checked for validity. 
@arg("id", help="ID of the ProductVersion to update.") @arg("-pid", "--product-id", help="ID of product to add a version to") @arg("-v", "--version", help="Version to add") @arg("-cm", "--current-product-milestone-id", type=int, help="ID of the ProductMilestone this version should be on") @arg("-pr", "--product-releases", type=int, nargs="+", help="List of ProductRelease IDs for this Product version") @arg("-pm", "--product-milestones", type=int, nargs="+", help="List of ProductMilestone IDs to associate with the new version") @arg("-bc", "--build-configuration-set-ids", type=int, nargs="+", help="List of BuildConfigurationSet IDs to associate with the new version") def update_product_version(id, **kwargs): """ Update the ProductVersion with ID id with new values. """ if not version_exists(id): logging.error("A ProductVersion with id {} doesn't exist.".format(id)) return to_update = versions_api.get_specific(id=id).content for key, value in kwargs.items(): if value is not None: setattr(to_update, key, value) response = utils.checked_api_call(versions_api, 'update', id=id, body=to_update) if response: return response.content
You’re probably wondering why you need financing. If you’re about to start a franchise, you’ve made a great move. You will have the independence and flexibility of a small business owner with the infrastructure and support of a large corporation. Generally, a franchise comes with a fair share of fees, and you may not have planned for the capital needed to cover them all. You’ll probably want to check out your franchise financing options. There are several franchise lending guides to assist you in weighing those options; you and your team will have to review each one to determine the best course of action. Franchisees have to cover the startup costs. That’s where franchise lending comes in. Luckily, there are business loans that you can use to start or grow your franchise, and there are options out there for potential franchise owners regarding how to cover expenses. 2 Touch POS™ does have a qualification process. We carefully review your financial status because we want you to be able to afford the 90-day start-up period that is necessary to begin your business. When you add up all these fees, the cost of starting and running your franchise can be steep. To cover these costs, you can take out a variety of small business loans to finance your franchise. As always, connect with us to learn more about this exciting opportunity! When you are ready, you will complete the qualification form.
""" Django settings for bball_intel project. For more information on this file, see https://docs.djangoproject.com/en/1.7/topics/settings/ For the full list of settings and their values, see https://docs.djangoproject.com/en/1.7/ref/settings/ """ # Build paths inside the project like this: BASE_DIR.child(...) from unipath import Path PROJECT_DIR = Path(__file__).ancestor(4) BASE_DIR = Path(__file__).ancestor(3) import dj_database_url # Quick-start development settings - unsuitable for production # See https://docs.djangoproject.com/en/1.7/howto/deployment/checklist/ # SECURITY WARNING: keep the secret key used in production secret! SECRET_KEY = '8f03d!k79490u5t@5+pxs(j$%lq@kp$n5od3br#d$#0)0f*14a' # SECURITY WARNING: don't run with debug turned on in production! DEBUG = True TEMPLATE_DEBUG = True ALLOWED_HOSTS = [] # Application definition INSTALLED_APPS = ( 'django.contrib.admin', 'django.contrib.auth', 'django.contrib.contenttypes', 'django.contrib.sessions', 'django.contrib.messages', 'django.contrib.staticfiles', ) MIDDLEWARE_CLASSES = ( 'django.contrib.sessions.middleware.SessionMiddleware', 'django.middleware.common.CommonMiddleware', 'django.middleware.csrf.CsrfViewMiddleware', 'django.contrib.auth.middleware.AuthenticationMiddleware', 'django.contrib.auth.middleware.SessionAuthenticationMiddleware', 'django.contrib.messages.middleware.MessageMiddleware', 'django.middleware.clickjacking.XFrameOptionsMiddleware', ) ROOT_URLCONF = 'bball_intel.urls' WSGI_APPLICATION = 'bball_intel.wsgi.application' # Database # https://docs.djangoproject.com/en/1.7/ref/settings/#databases DATABASES = { 'default': dj_database_url.config( default = 'sqlite:///{base}/db.sqlite3'.format(base=BASE_DIR) ) } # Internationalization # https://docs.djangoproject.com/en/1.7/topics/i18n/ LANGUAGE_CODE = 'en-us' TIME_ZONE = 'UTC' USE_I18N = True USE_L10N = True USE_TZ = True # Static files (CSS, JavaScript, Images) # https://docs.djangoproject.com/en/1.7/howto/static-files/ STATIC_URL = '/static/'
US-65 to Hwy. 76 Exit, left on 76 to stoplight at Fall Creek Rd, turn left, to Wildwood, turn right, follow Wildwood to hotel on the right. Spacious suites with full kitchens, living area with sleeper sofas and separate bedrooms.
#!/usr/bin/python
from __future__ import with_statement
import re
import os
import sys

def get_section_info(binname, secname, info):
    # Parse one row of 'readelf -S' output into named groups.
    pattern = re.compile(r"\s*\[\s*(?P<num>[\d]{1,2})\]\s*"
                         r"(?P<name>[\S]+)\s*"
                         r"(?P<type>[\S]+)\s*"
                         r"(?P<addr>[\S]+)\s*"
                         r"(?P<offset>[\S]+)\s*"
                         r"(?P<size>[\S]+)\s*"
                         r"[^\n]*$")
    cmd = "readelf -S " + binname
    with os.popen(cmd) as file:
        for line in file:
            line = line.strip()
            m = pattern.match(line)
            if (m is not None) and (m.group('name') == secname):
                if info == 'num':
                    return int(m.group(info), 10)
                if info in ('addr', 'offset', 'size'):
                    return int(m.group(info), 16)
                return m.group(info)
    # Section not found
    return None

def extract_data(binname, secname, output):
    gen_asm_offset = get_section_info(binname, secname, "offset")
    gen_asm_size = get_section_info(binname, secname, "size")
    # Read the raw section bytes out of the binary...
    fd = os.open(binname, os.O_RDONLY)
    os.lseek(fd, gen_asm_offset, os.SEEK_SET)
    buf = os.read(fd, gen_asm_size)
    # ...and write them to the output file.
    fd2 = os.open(output, os.O_CREAT | os.O_TRUNC | os.O_RDWR, 0644)
    os.write(fd2, buf)
    os.close(fd2)
    os.close(fd)
    print "text offset %lx" % gen_asm_offset
    print "text size %lx" % gen_asm_size

def main():
    orig_bin = sys.argv[1]
    output_file = sys.argv[2]
    extract_data(orig_bin, ".text", output_file)

if __name__ == "__main__":
    main()
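A hedged invocation sketch for the script above; the script filename is an assumption, and the objcopy cross-check is just one way to verify the output:

# Usage sketch (filenames are assumptions):
#   $ python extract_section.py /bin/ls ls_text.bin
# Optional cross-check with binutils, which can dump the same section:
#   $ objcopy -O binary --only-section=.text /bin/ls ls_text_ref.bin
#   $ cmp ls_text.bin ls_text_ref.bin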
Karen has taught writing workshops at Mendocino College, College of the Redwoods, the Redwood Coast Senior Center and at the Mendocino Cancer Resource Centers. At Mendocino's Kelley House Museum, writers excavated local history. With support from California Poets in the Schools, the Arts Council of Mendocino County’s GASP program, and the Mendocino County Office of Education, Karen leads hands-on creative writing workshops at K-12 schools and after-school programs. She is a 7-time recipient of California Arts Council – Artists in Schools grant awards. Try a haiku hike or a deeper dive into poetic form while crafting sonnets, fractured sonnets, chants, prose poems, or the ever-popular free verse. Turn real life into fiction, memoir, flash, or hybrid literary expressions. Contact Karen to set up a workshop. Ready-to-roll creative writing lessons now available free to classroom teachers!
from django import http from django.views.generic.base import View from skrill.models import PaymentRequest, StatusReport class StatusReportView(View): def post(self, request, *args, **kwargs): payment_request = PaymentRequest.objects.get(pk=request.POST['transaction_id']) report = StatusReport() report.payment_request = payment_request report.pay_to_email = request.POST['pay_to_email'] report.pay_from_email = request.POST['pay_from_email'] report.merchant_id = request.POST['merchant_id'] report.customer_id = request.POST.get('customer_id', None) report.mb_transaction_id = request.POST['mb_transaction_id'] report.mb_amount = request.POST['mb_amount'] report.mb_currency = request.POST['mb_currency'] report.status = request.POST['status'] report.failed_reason_code = request.POST.get('failed_reason_code', None) report.md5sig = request.POST['md5sig'] report.sha2sig = request.POST.get('sha2sig', None) report.amount = request.POST['amount'] report.currency = request.POST['currency'] report.payment_type = request.POST.get('payment_type', None) report.custom_field_1 = request.POST.get('custom_field_1', None) report.custom_field_2 = request.POST.get('custom_field_2', None) report.custom_field_3 = request.POST.get('custom_field_3', None) report.custom_field_4 = request.POST.get('custom_field_4', None) report.custom_field_5 = request.POST.get('custom_field_5', None) report.save() report.validate_md5sig() report.valid = True report.save() report.send_signal() return http.HttpResponse()
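A minimal wiring sketch for the view above, assuming Django 1.x-style URL configuration and that the view lives in skrill.views; the pattern and name are illustrative:

# urls.py sketch -- pattern, name, and module path are assumptions.
from django.conf.urls import url
from django.views.decorators.csrf import csrf_exempt

from skrill.views import StatusReportView

urlpatterns = [
    # Skrill posts status reports from its own servers, so the endpoint
    # must be exempt from CSRF protection.
    url(r'^skrill/status/$', csrf_exempt(StatusReportView.as_view()),
        name='skrill-status-report'),
]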
I have been through several hurricanes and natural disasters. I have seen weeks without access to our office, and I have even helped haul servers down flights of stairs so that we could get offices set up and operational. There was one thing that I did not worry about at all: I never lost a minute’s sleep over our data or how soon our office could be up and running. In fact, there were moments when I was logged in and working during the middle of the hurricane. This is the benefit of cloud computing. We maintain no redundant servers and no backup hardware; in fact, we have no servers in our office at all. Nothing to haul down stairs or set up in a remote location. And no, we aren’t a small firm: over 100 people were potentially affected by Harvey. Today I had our Chief Administrative Officer, Emily Mazey, and our Employee Experience Officer, Amanda Shook, look into missed time for the week and encourage our team to take advantage of the federal benefits for disaster unemployment. They did. They came back to me and said, “Wes, people worked all week. They weren’t unemployed, there are hours charged to clients, the work never stopped.” Wow. We got something right. We were in the cloud. Our team was able to carry on in the midst of the worst hurricane in decades and a flood that was deemed a “500-year flood”. Microsoft 365, in the cloud, gave us uninterrupted email, chat and calendar services. Our Brand Experience Officer Catherine Seitz and Partner Stan Raines set up a text messaging system for us to communicate and assist others. The communication lines were open. CCH Engagement and QuickBooks were both hosted in military-grade data centers that never went down. It is a textbook case study for cloud-based computing. If you aren’t going to change because it is simply the smartest and most efficient way to do business, then consider it for this #1 reason: when a natural disaster strikes, you will be prepared. Your disaster recovery plan will look like ours, a laptop. You are covered. There was one guy who wasn’t stressing over disaster recovery: our internal IT guy, Cory Dial. Can you picture the Maytag repairman? That’s Cory. Cory is important to our organization, but he can focus on service and strategy.
# -*- test-case-name: twisted.conch.test.test_keys -*- # Copyright (c) Twisted Matrix Laboratories. # See LICENSE for details. """ Handling of RSA and DSA keys. """ from __future__ import absolute_import, division import base64 import itertools import warnings from hashlib import md5 from cryptography.exceptions import InvalidSignature from cryptography.hazmat.backends import default_backend from cryptography.hazmat.primitives import hashes, serialization from cryptography.hazmat.primitives.asymmetric import dsa, rsa, padding try: from cryptography.hazmat.primitives.asymmetric.utils import ( encode_dss_signature, decode_dss_signature) except ImportError: from cryptography.hazmat.primitives.asymmetric.utils import ( encode_rfc6979_signature as encode_dss_signature, decode_rfc6979_signature as decode_dss_signature) from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes from pyasn1.error import PyAsn1Error from pyasn1.type import univ from pyasn1.codec.ber import decoder as berDecoder from pyasn1.codec.ber import encoder as berEncoder from twisted.conch.ssh import common, sexpy from twisted.conch.ssh.common import int_from_bytes, int_to_bytes from twisted.python import randbytes from twisted.python.compat import iterbytes, long, izip, nativeString, _PY3 from twisted.python.deprecate import deprecated, getDeprecationWarningString from twisted.python.versions import Version class BadKeyError(Exception): """ Raised when a key isn't what we expected from it. XXX: we really need to check for bad keys """ class EncryptedKeyError(Exception): """ Raised when an encrypted key is presented to fromString/fromFile without a password. """ class Key(object): """ An object representing a key. A key can be either a public or private key. A public key can verify a signature; a private key can create or verify a signature. To generate a string that can be stored on disk, use the toString method. If you have a private key, but want the string representation of the public key, use Key.public().toString(). @ivar keyObject: DEPRECATED. The C{Crypto.PublicKey} object that operations are performed with. """ def fromFile(cls, filename, type=None, passphrase=None): """ Load a key from a file. @param filename: The path to load key data from. @type type: L{str} or C{None} @param type: A string describing the format the key data is in, or C{None} to attempt detection of the type. @type passphrase: L{bytes} or C{None} @param passphrase: The passphrase the key is encrypted with, or C{None} if there is no encryption. @rtype: L{Key} @return: The loaded key. """ with open(filename, 'rb') as f: return cls.fromString(f.read(), type, passphrase) fromFile = classmethod(fromFile) def fromString(cls, data, type=None, passphrase=None): """ Return a Key object corresponding to the string data. type is optionally the type of string, matching a _fromString_* method. Otherwise, the _guessStringType() classmethod will be used to guess a type. If the key is encrypted, passphrase is used as the decryption key. @type data: L{bytes} @param data: The key data. @type type: L{str} or C{None} @param type: A string describing the format the key data is in, or C{None} to attempt detection of the type. @type passphrase: L{bytes} or C{None} @param passphrase: The passphrase the key is encrypted with, or C{None} if there is no encryption. @rtype: L{Key} @return: The loaded key. 
""" if type is None: type = cls._guessStringType(data) if type is None: raise BadKeyError('cannot guess the type of %r' % (data,)) method = getattr(cls, '_fromString_%s' % (type.upper(),), None) if method is None: raise BadKeyError('no _fromString method for %s' % (type,)) if method.__code__.co_argcount == 2: # No passphrase if passphrase: raise BadKeyError('key not encrypted') return method(data) else: return method(data, passphrase) fromString = classmethod(fromString) @classmethod def _fromString_BLOB(cls, blob): """ Return a public key object corresponding to this public key blob. The format of a RSA public key blob is:: string 'ssh-rsa' integer e integer n The format of a DSA public key blob is:: string 'ssh-dss' integer p integer q integer g integer y @type blob: L{bytes} @param blob: The key data. @return: A new key. @rtype: L{twisted.conch.ssh.keys.Key} @raises BadKeyError: if the key type (the first string) is unknown. """ keyType, rest = common.getNS(blob) if keyType == b'ssh-rsa': e, n, rest = common.getMP(rest, 2) return cls( rsa.RSAPublicNumbers(e, n).public_key(default_backend())) elif keyType == b'ssh-dss': p, q, g, y, rest = common.getMP(rest, 4) return cls( dsa.DSAPublicNumbers( y=y, parameter_numbers=dsa.DSAParameterNumbers( p=p, q=q, g=g ) ).public_key(default_backend()) ) else: raise BadKeyError('unknown blob type: %s' % (keyType,)) @classmethod def _fromString_PRIVATE_BLOB(cls, blob): """ Return a private key object corresponding to this private key blob. The blob formats are as follows: RSA keys:: string 'ssh-rsa' integer n integer e integer d integer u integer p integer q DSA keys:: string 'ssh-dss' integer p integer q integer g integer y integer x @type blob: L{bytes} @param blob: The key data. @return: A new key. @rtype: L{twisted.conch.ssh.keys.Key} @raises BadKeyError: if the key type (the first string) is unknown. """ keyType, rest = common.getNS(blob) if keyType == b'ssh-rsa': n, e, d, u, p, q, rest = common.getMP(rest, 6) return cls._fromRSAComponents(n=n, e=e, d=d, p=p, q=q) elif keyType == b'ssh-dss': p, q, g, y, x, rest = common.getMP(rest, 5) return cls._fromDSAComponents(y=y, g=g, p=p, q=q, x=x) else: raise BadKeyError('unknown blob type: %s' % (keyType,)) @classmethod def _fromString_PUBLIC_OPENSSH(cls, data): """ Return a public key object corresponding to this OpenSSH public key string. The format of an OpenSSH public key string is:: <key type> <base64-encoded public key blob> @type data: L{bytes} @param data: The key data. @return: A new key. @rtype: L{twisted.conch.ssh.keys.Key} @raises BadKeyError: if the blob type is unknown. """ blob = base64.decodestring(data.split()[1]) return cls._fromString_BLOB(blob) @classmethod def _fromString_PRIVATE_OPENSSH(cls, data, passphrase): """ Return a private key object corresponding to this OpenSSH private key string. If the key is encrypted, passphrase MUST be provided. Providing a passphrase for an unencrypted key is an error. The format of an OpenSSH private key string is:: -----BEGIN <key type> PRIVATE KEY----- [Proc-Type: 4,ENCRYPTED DEK-Info: DES-EDE3-CBC,<initialization value>] <base64-encoded ASN.1 structure> ------END <key type> PRIVATE KEY------ The ASN.1 structure of a RSA key is:: (0, n, e, d, p, q) The ASN.1 structure of a DSA key is:: (0, p, q, g, y, x) @type data: L{bytes} @param data: The key data. @type passphrase: L{bytes} or C{None} @param passphrase: The passphrase the key is encrypted with, or C{None} if it is not encrypted. @return: A new key. 
@rtype: L{twisted.conch.ssh.keys.Key} @raises BadKeyError: if * a passphrase is provided for an unencrypted key * the ASN.1 encoding is incorrect @raises EncryptedKeyError: if * a passphrase is not provided for an encrypted key """ lines = data.strip().split(b'\n') kind = lines[0][11:14] if lines[1].startswith(b'Proc-Type: 4,ENCRYPTED'): if not passphrase: raise EncryptedKeyError('Passphrase must be provided ' 'for an encrypted key') # Determine cipher and initialization vector try: _, cipherIVInfo = lines[2].split(b' ', 1) cipher, ivdata = cipherIVInfo.rstrip().split(b',', 1) except ValueError: raise BadKeyError('invalid DEK-info %r' % (lines[2],)) if cipher == b'AES-128-CBC': algorithmClass = algorithms.AES keySize = 16 if len(ivdata) != 32: raise BadKeyError('AES encrypted key with a bad IV') elif cipher == b'DES-EDE3-CBC': algorithmClass = algorithms.TripleDES keySize = 24 if len(ivdata) != 16: raise BadKeyError('DES encrypted key with a bad IV') else: raise BadKeyError('unknown encryption type %r' % (cipher,)) # Extract keyData for decoding iv = bytes(bytearray([int(ivdata[i:i + 2], 16) for i in range(0, len(ivdata), 2)])) ba = md5(passphrase + iv[:8]).digest() bb = md5(ba + passphrase + iv[:8]).digest() decKey = (ba + bb)[:keySize] b64Data = base64.decodestring(b''.join(lines[3:-1])) decryptor = Cipher( algorithmClass(decKey), modes.CBC(iv), backend=default_backend() ).decryptor() keyData = decryptor.update(b64Data) + decryptor.finalize() removeLen = ord(keyData[-1:]) keyData = keyData[:-removeLen] else: b64Data = b''.join(lines[1:-1]) keyData = base64.decodestring(b64Data) try: decodedKey = berDecoder.decode(keyData)[0] except PyAsn1Error as e: raise BadKeyError( 'Failed to decode key (Bad Passphrase?): %s' % (e,)) if kind == b'RSA': if len(decodedKey) == 2: # Alternate RSA key decodedKey = decodedKey[0] if len(decodedKey) < 6: raise BadKeyError('RSA key failed to decode properly') n, e, d, p, q, dmp1, dmq1, iqmp = [ long(value) for value in decodedKey[1:9] ] if p > q: # Make p smaller than q p, q = q, p return cls( rsa.RSAPrivateNumbers( p=p, q=q, d=d, dmp1=dmp1, dmq1=dmq1, iqmp=iqmp, public_numbers=rsa.RSAPublicNumbers(e=e, n=n), ).private_key(default_backend()) ) elif kind == b'DSA': p, q, g, y, x = [long(value) for value in decodedKey[1: 6]] if len(decodedKey) < 6: raise BadKeyError('DSA key failed to decode properly') return cls( dsa.DSAPrivateNumbers( x=x, public_numbers=dsa.DSAPublicNumbers( y=y, parameter_numbers=dsa.DSAParameterNumbers( p=p, q=q, g=g ) ) ).private_key(backend=default_backend()) ) else: raise BadKeyError("unknown key type %s" % (kind,)) @classmethod def _fromString_PUBLIC_LSH(cls, data): """ Return a public key corresponding to this LSH public key string. The LSH public key string format is:: <s-expression: ('public-key', (<key type>, (<name, <value>)+))> The names for a RSA (key type 'rsa-pkcs1-sha1') key are: n, e. The names for a DSA (key type 'dsa') key are: y, g, p, q. @type data: L{bytes} @param data: The key data. @return: A new key. 
@rtype: L{twisted.conch.ssh.keys.Key} @raises BadKeyError: if the key type is unknown """ sexp = sexpy.parse(base64.decodestring(data[1:-1])) assert sexp[0] == b'public-key' kd = {} for name, data in sexp[1][1:]: kd[name] = common.getMP(common.NS(data))[0] if sexp[1][0] == b'dsa': return cls._fromDSAComponents( y=kd[b'y'], g=kd[b'g'], p=kd[b'p'], q=kd[b'q']) elif sexp[1][0] == b'rsa-pkcs1-sha1': return cls._fromRSAComponents(n=kd[b'n'], e=kd[b'e']) else: raise BadKeyError('unknown lsh key type %s' % (sexp[1][0],)) @classmethod def _fromString_PRIVATE_LSH(cls, data): """ Return a private key corresponding to this LSH private key string. The LSH private key string format is:: <s-expression: ('private-key', (<key type>, (<name>, <value>)+))> The names for a RSA (key type 'rsa-pkcs1-sha1') key are: n, e, d, p, q. The names for a DSA (key type 'dsa') key are: y, g, p, q, x. @type data: L{bytes} @param data: The key data. @return: A new key. @rtype: L{twisted.conch.ssh.keys.Key} @raises BadKeyError: if the key type is unknown """ sexp = sexpy.parse(data) assert sexp[0] == b'private-key' kd = {} for name, data in sexp[1][1:]: kd[name] = common.getMP(common.NS(data))[0] if sexp[1][0] == b'dsa': assert len(kd) == 5, len(kd) return cls._fromDSAComponents( y=kd[b'y'], g=kd[b'g'], p=kd[b'p'], q=kd[b'q'], x=kd[b'x']) elif sexp[1][0] == b'rsa-pkcs1': assert len(kd) == 8, len(kd) if kd[b'p'] > kd[b'q']: # Make p smaller than q kd[b'p'], kd[b'q'] = kd[b'q'], kd[b'p'] return cls._fromRSAComponents( n=kd[b'n'], e=kd[b'e'], d=kd[b'd'], p=kd[b'p'], q=kd[b'q']) else: raise BadKeyError('unknown lsh key type %s' % (sexp[1][0],)) @classmethod def _fromString_AGENTV3(cls, data): """ Return a private key object corresponsing to the Secure Shell Key Agent v3 format. The SSH Key Agent v3 format for a RSA key is:: string 'ssh-rsa' integer e integer d integer n integer u integer p integer q The SSH Key Agent v3 format for a DSA key is:: string 'ssh-dss' integer p integer q integer g integer y integer x @type data: L{bytes} @param data: The key data. @return: A new key. @rtype: L{twisted.conch.ssh.keys.Key} @raises BadKeyError: if the key type (the first string) is unknown """ keyType, data = common.getNS(data) if keyType == b'ssh-dss': p, data = common.getMP(data) q, data = common.getMP(data) g, data = common.getMP(data) y, data = common.getMP(data) x, data = common.getMP(data) return cls._fromDSAComponents(y=y, g=g, p=p, q=q, x=x) elif keyType == b'ssh-rsa': e, data = common.getMP(data) d, data = common.getMP(data) n, data = common.getMP(data) u, data = common.getMP(data) p, data = common.getMP(data) q, data = common.getMP(data) return cls._fromRSAComponents(n=n, e=e, d=d, p=p, q=q, u=u) else: raise BadKeyError("unknown key type %s" % (keyType,)) def _guessStringType(cls, data): """ Guess the type of key in data. The types map to _fromString_* methods. @type data: L{bytes} @param data: The key data. """ if data.startswith(b'ssh-'): return 'public_openssh' elif data.startswith(b'-----BEGIN'): return 'private_openssh' elif data.startswith(b'{'): return 'public_lsh' elif data.startswith(b'('): return 'private_lsh' elif data.startswith(b'\x00\x00\x00\x07ssh-'): ignored, rest = common.getNS(data) count = 0 while rest: count += 1 ignored, rest = common.getMP(rest) if count > 4: return 'agentv3' else: return 'blob' _guessStringType = classmethod(_guessStringType) @classmethod def _fromRSAComponents(cls, n, e, d=None, p=None, q=None, u=None): """ Build a key from RSA numerical components. 
@type n: L{int} @param n: The 'n' RSA variable. @type e: L{int} @param e: The 'e' RSA variable. @type d: L{int} or C{None} @param d: The 'd' RSA variable (optional for a public key). @type p: L{int} or C{None} @param p: The 'p' RSA variable (optional for a public key). @type q: L{int} or C{None} @param q: The 'q' RSA variable (optional for a public key). @type u: L{int} or C{None} @param u: The 'u' RSA variable. Ignored, as its value is determined by p and q. @rtype: L{Key} @return: An RSA key constructed from the values as given. """ publicNumbers = rsa.RSAPublicNumbers(e=e, n=n) if d is None: # We have public components. keyObject = publicNumbers.public_key(default_backend()) else: privateNumbers = rsa.RSAPrivateNumbers( p=p, q=q, d=d, dmp1=rsa.rsa_crt_dmp1(d, p), dmq1=rsa.rsa_crt_dmq1(d, q), iqmp=rsa.rsa_crt_iqmp(p, q), public_numbers=publicNumbers, ) keyObject = privateNumbers.private_key(default_backend()) return cls(keyObject) @classmethod def _fromDSAComponents(cls, y, p, q, g, x=None): """ Build a key from DSA numerical components. @type y: L{int} @param y: The 'y' DSA variable. @type p: L{int} @param p: The 'p' DSA variable. @type q: L{int} @param q: The 'q' DSA variable. @type g: L{int} @param g: The 'g' DSA variable. @type x: L{int} or C{None} @param x: The 'x' DSA variable (optional for a public key) @rtype: L{Key} @return: A DSA key constructed from the values as given. """ publicNumbers = dsa.DSAPublicNumbers( y=y, parameter_numbers=dsa.DSAParameterNumbers(p=p, q=q, g=g)) if x is None: # We have public components. keyObject = publicNumbers.public_key(default_backend()) else: privateNumbers = dsa.DSAPrivateNumbers( x=x, public_numbers=publicNumbers) keyObject = privateNumbers.private_key(default_backend()) return cls(keyObject) def __init__(self, keyObject): """ Initialize with a private or public C{cryptography.hazmat.primitives.asymmetric} key. @param keyObject: Low level key. @type keyObject: C{cryptography.hazmat.primitives.asymmetric} key. """ # Avoid importing PyCrypto if at all possible if keyObject.__class__.__module__.startswith('Crypto.PublicKey'): warningString = getDeprecationWarningString( Key, Version("Twisted", 16, 0, 0), replacement='passing a cryptography key object') warnings.warn(warningString, DeprecationWarning, stacklevel=2) self.keyObject = keyObject else: self._keyObject = keyObject def __eq__(self, other): """ Return True if other represents an object with the same key. """ if type(self) == type(other): return self.type() == other.type() and self.data() == other.data() else: return NotImplemented def __ne__(self, other): """ Return True if other represents anything other than this key. """ result = self.__eq__(other) if result == NotImplemented: return result return not result def __repr__(self): """ Return a pretty representation of this object. """ lines = [ '<%s %s (%s bits)' % ( nativeString(self.type()), self.isPublic() and 'Public Key' or 'Private Key', self._keyObject.key_size)] for k, v in sorted(self.data().items()): if _PY3 and isinstance(k, bytes): k = k.decode('ascii') lines.append('attr %s:' % (k,)) by = common.MP(v)[4:] while by: m = by[:15] by = by[15:] o = '' for c in iterbytes(m): o = o + '%02x:' % (ord(c),) if len(m) < 15: o = o[:-1] lines.append('\t' + o) lines[-1] = lines[-1] + '>' return '\n'.join(lines) @property @deprecated(Version('Twisted', 16, 0, 0)) def keyObject(self): """ A C{Crypto.PublicKey} object similar to this key. As PyCrypto is no longer used for the underlying operations, this property should be avoided. 
""" # Lazy import to have PyCrypto as a soft dependency. from Crypto.PublicKey import DSA, RSA keyObject = None keyType = self.type() keyData = self.data() isPublic = self.isPublic() if keyType == 'RSA': if isPublic: keyObject = RSA.construct(( keyData['n'], long(keyData['e']), )) else: keyObject = RSA.construct(( keyData['n'], long(keyData['e']), keyData['d'], keyData['p'], keyData['q'], keyData['u'], )) elif keyType == 'DSA': if isPublic: keyObject = DSA.construct(( keyData['y'], keyData['g'], keyData['p'], keyData['q'], )) else: keyObject = DSA.construct(( keyData['y'], keyData['g'], keyData['p'], keyData['q'], keyData['x'], )) else: raise BadKeyError('Unsupported key type.') return keyObject @keyObject.setter @deprecated(Version('Twisted', 16, 0, 0)) def keyObject(self, value): # Lazy import to have PyCrypto as a soft dependency. from Crypto.PublicKey import DSA, RSA if isinstance(value, RSA._RSAobj): rawKey = value.key if rawKey.has_private(): newKey = self._fromRSAComponents( e=rawKey.e, n=rawKey.n, p=rawKey.p, q=rawKey.q, d=rawKey.d, u=rawKey.u, ) else: newKey = self._fromRSAComponents(e=rawKey.e, n=rawKey.n) elif isinstance(value, DSA._DSAobj): rawKey = value.key if rawKey.has_private(): newKey = self._fromDSAComponents( y=rawKey.y, p=rawKey.p, q=rawKey.q, g=rawKey.g, x=rawKey.x, ) else: newKey = self._fromDSAComponents( y=rawKey.y, p=rawKey.p, q=rawKey.q, g=rawKey.g, ) else: raise BadKeyError('PyCrypto key type not supported.') self._keyObject = newKey._keyObject def isPublic(self): """ Check if this instance is a public key. @return: C{True} if this is a public key. """ return isinstance( self._keyObject, (rsa.RSAPublicKey, dsa.DSAPublicKey)) def public(self): """ Returns a version of this key containing only the public key data. If this is a public key, this may or may not be the same object as self. @rtype: L{Key} @return: A public key. """ return Key(self._keyObject.public_key()) def fingerprint(self): """ Get the user presentation of the fingerprint of this L{Key}. As described by U{RFC 4716 section 4<http://tools.ietf.org/html/rfc4716#section-4>}:: The fingerprint of a public key consists of the output of the MD5 message-digest algorithm [RFC1321]. The input to the algorithm is the public key data as specified by [RFC4253]. (...) The output of the (MD5) algorithm is presented to the user as a sequence of 16 octets printed as hexadecimal with lowercase letters and separated by colons. @since: 8.2 @return: the user presentation of this L{Key}'s fingerprint, as a string. @rtype: L{str} """ return ':'.join([x.encode('hex') for x in md5(self.blob()).digest()]) def type(self): """ Return the type of the object we wrap. Currently this can only be 'RSA' or 'DSA'. @rtype: L{str} """ if isinstance( self._keyObject, (rsa.RSAPublicKey, rsa.RSAPrivateKey)): return 'RSA' elif isinstance( self._keyObject, (dsa.DSAPublicKey, dsa.DSAPrivateKey)): return 'DSA' else: raise RuntimeError( 'unknown type of object: %r' % (self._keyObject,)) def sshType(self): """ Get the type of the object we wrap as defined in the SSH protocol, defined in RFC 4253, Section 6.6. Currently this can only be b'ssh-rsa' or b'ssh-dss'. @return: The key type format. @rtype: L{bytes} """ return {'RSA': b'ssh-rsa', 'DSA': b'ssh-dss'}[self.type()] def size(self): """ Return the size of the object we wrap. @return: The size of the key. @rtype: C{int} """ if self._keyObject is None: return 0 return self._keyObject.key_size def data(self): """ Return the values of the public key as a dictionary. 
@rtype: C{dict} """ if isinstance(self._keyObject, rsa.RSAPublicKey): numbers = self._keyObject.public_numbers() return { "n": numbers.n, "e": numbers.e, } elif isinstance(self._keyObject, rsa.RSAPrivateKey): numbers = self._keyObject.private_numbers() return { "n": numbers.public_numbers.n, "e": numbers.public_numbers.e, "d": numbers.d, "p": numbers.p, "q": numbers.q, # Use a trick: iqmp is q^-1 % p, u is p^-1 % q "u": rsa.rsa_crt_iqmp(numbers.q, numbers.p), } elif isinstance(self._keyObject, dsa.DSAPublicKey): numbers = self._keyObject.public_numbers() return { "y": numbers.y, "g": numbers.parameter_numbers.g, "p": numbers.parameter_numbers.p, "q": numbers.parameter_numbers.q, } elif isinstance(self._keyObject, dsa.DSAPrivateKey): numbers = self._keyObject.private_numbers() return { "x": numbers.x, "y": numbers.public_numbers.y, "g": numbers.public_numbers.parameter_numbers.g, "p": numbers.public_numbers.parameter_numbers.p, "q": numbers.public_numbers.parameter_numbers.q, } else: raise RuntimeError("Unexpected key type: %s" % (self._keyObject,)) def blob(self): """ Return the public key blob for this key. The blob is the over-the-wire format for public keys. SECSH-TRANS RFC 4253 Section 6.6. RSA keys:: string 'ssh-rsa' integer e integer n DSA keys:: string 'ssh-dss' integer p integer q integer g integer y @rtype: L{bytes} """ type = self.type() data = self.data() if type == 'RSA': return (common.NS(b'ssh-rsa') + common.MP(data['e']) + common.MP(data['n'])) elif type == 'DSA': return (common.NS(b'ssh-dss') + common.MP(data['p']) + common.MP(data['q']) + common.MP(data['g']) + common.MP(data['y'])) else: raise BadKeyError("unknown key type %s" % (type,)) def privateBlob(self): """ Return the private key blob for this key. The blob is the over-the-wire format for private keys: Specification in OpenSSH PROTOCOL.agent RSA keys:: string 'ssh-rsa' integer n integer e integer d integer u integer p integer q DSA keys:: string 'ssh-dss' integer p integer q integer g integer y integer x """ type = self.type() data = self.data() if type == 'RSA': return (common.NS(b'ssh-rsa') + common.MP(data['n']) + common.MP(data['e']) + common.MP(data['d']) + common.MP(data['u']) + common.MP(data['p']) + common.MP(data['q'])) elif type == 'DSA': return (common.NS(b'ssh-dss') + common.MP(data['p']) + common.MP(data['q']) + common.MP(data['g']) + common.MP(data['y']) + common.MP(data['x'])) else: raise BadKeyError("unknown key type %s" % (type,)) def toString(self, type, extra=None): """ Create a string representation of this key. If the key is a private key and you want the represenation of its public key, use C{key.public().toString()}. type maps to a _toString_* method. @param type: The type of string to emit. Currently supported values are C{'OPENSSH'}, C{'LSH'}, and C{'AGENTV3'}. @type type: L{str} @param extra: Any extra data supported by the selected format which is not part of the key itself. For public OpenSSH keys, this is a comment. For private OpenSSH keys, this is a passphrase to encrypt with. @type extra: L{bytes} or L{NoneType} @rtype: L{bytes} """ method = getattr(self, '_toString_%s' % (type.upper(),), None) if method is None: raise BadKeyError('unknown key type: %s' % (type,)) if method.__code__.co_argcount == 2: return method(extra) else: return method() def _toString_OPENSSH(self, extra): """ Return a public or private OpenSSH string. See _fromString_PUBLIC_OPENSSH and _fromString_PRIVATE_OPENSSH for the string formats. 
If extra is present, it represents a comment for a public key, or a passphrase for a private key. @param extra: Comment for a public key or passphrase for a private key @type extra: L{bytes} @rtype: L{bytes} """ data = self.data() if self.isPublic(): b64Data = base64.encodestring(self.blob()).replace(b'\n', b'') if not extra: extra = b'' return (self.sshType() + b' ' + b64Data + b' ' + extra).strip() else: lines = [b''.join((b'-----BEGIN ', self.type().encode('ascii'), b' PRIVATE KEY-----'))] if self.type() == 'RSA': p, q = data['p'], data['q'] objData = (0, data['n'], data['e'], data['d'], q, p, data['d'] % (q - 1), data['d'] % (p - 1), data['u']) else: objData = (0, data['p'], data['q'], data['g'], data['y'], data['x']) asn1Sequence = univ.Sequence() for index, value in izip(itertools.count(), objData): asn1Sequence.setComponentByPosition(index, univ.Integer(value)) asn1Data = berEncoder.encode(asn1Sequence) if extra: iv = randbytes.secureRandom(8) hexiv = ''.join(['%02X' % (ord(x),) for x in iterbytes(iv)]) hexiv = hexiv.encode('ascii') lines.append(b'Proc-Type: 4,ENCRYPTED') lines.append(b'DEK-Info: DES-EDE3-CBC,' + hexiv + b'\n') ba = md5(extra + iv).digest() bb = md5(ba + extra + iv).digest() encKey = (ba + bb)[:24] padLen = 8 - (len(asn1Data) % 8) asn1Data += (chr(padLen) * padLen).encode('ascii') encryptor = Cipher( algorithms.TripleDES(encKey), modes.CBC(iv), backend=default_backend() ).encryptor() asn1Data = encryptor.update(asn1Data) + encryptor.finalize() b64Data = base64.encodestring(asn1Data).replace(b'\n', b'') lines += [b64Data[i:i + 64] for i in range(0, len(b64Data), 64)] lines.append(b''.join((b'-----END ', self.type().encode('ascii'), b' PRIVATE KEY-----'))) return b'\n'.join(lines) def _toString_LSH(self): """ Return a public or private LSH key. See _fromString_PUBLIC_LSH and _fromString_PRIVATE_LSH for the key formats. @rtype: L{bytes} """ data = self.data() type = self.type() if self.isPublic(): if type == 'RSA': keyData = sexpy.pack([[b'public-key', [b'rsa-pkcs1-sha1', [b'n', common.MP(data['n'])[4:]], [b'e', common.MP(data['e'])[4:]]]]]) elif type == 'DSA': keyData = sexpy.pack([[b'public-key', [b'dsa', [b'p', common.MP(data['p'])[4:]], [b'q', common.MP(data['q'])[4:]], [b'g', common.MP(data['g'])[4:]], [b'y', common.MP(data['y'])[4:]]]]]) else: raise BadKeyError("unknown key type %s" % (type,)) return (b'{' + base64.encodestring(keyData).replace(b'\n', b'') + b'}') else: if type == 'RSA': p, q = data['p'], data['q'] return sexpy.pack([[b'private-key', [b'rsa-pkcs1', [b'n', common.MP(data['n'])[4:]], [b'e', common.MP(data['e'])[4:]], [b'd', common.MP(data['d'])[4:]], [b'p', common.MP(q)[4:]], [b'q', common.MP(p)[4:]], [b'a', common.MP( data['d'] % (q - 1))[4:]], [b'b', common.MP( data['d'] % (p - 1))[4:]], [b'c', common.MP(data['u'])[4:]]]]]) elif type == 'DSA': return sexpy.pack([[b'private-key', [b'dsa', [b'p', common.MP(data['p'])[4:]], [b'q', common.MP(data['q'])[4:]], [b'g', common.MP(data['g'])[4:]], [b'y', common.MP(data['y'])[4:]], [b'x', common.MP(data['x'])[4:]]]]]) else: raise BadKeyError("unknown key type %s'" % (type,)) def _toString_AGENTV3(self): """ Return a private Secure Shell Agent v3 key. See _fromString_AGENTV3 for the key format. 
@rtype: L{bytes} """ data = self.data() if not self.isPublic(): if self.type() == 'RSA': values = (data['e'], data['d'], data['n'], data['u'], data['p'], data['q']) elif self.type() == 'DSA': values = (data['p'], data['q'], data['g'], data['y'], data['x']) return common.NS(self.sshType()) + b''.join(map(common.MP, values)) def sign(self, data): """ Sign some data with this key. SECSH-TRANS RFC 4253 Section 6.6. @type data: L{bytes} @param data: The data to sign. @rtype: L{bytes} @return: A signature for the given data. """ if self.type() == 'RSA': signer = self._keyObject.signer( padding.PKCS1v15(), hashes.SHA1()) signer.update(data) ret = common.NS(signer.finalize()) elif self.type() == 'DSA': signer = self._keyObject.signer(hashes.SHA1()) signer.update(data) signature = signer.finalize() (r, s) = decode_dss_signature(signature) # SSH insists that the DSS signature blob be two 160-bit integers # concatenated together. The sig[0], [1] numbers from obj.sign # are just numbers, and could be any length from 0 to 160 bits. # Make sure they are padded out to 160 bits (20 bytes each) ret = common.NS(int_to_bytes(r, 20) + int_to_bytes(s, 20)) else: raise BadKeyError("unknown key type %s" % (self.type(),)) return common.NS(self.sshType()) + ret def verify(self, signature, data): """ Verify a signature using this key. @type signature: L{bytes} @param signature: The signature to verify. @type data: L{bytes} @param data: The signed data. @rtype: L{bool} @return: C{True} if the signature is valid. """ if len(signature) == 40: # DSA key with no padding signatureType, signature = b'ssh-dss', common.NS(signature) else: signatureType, signature = common.getNS(signature) if signatureType != self.sshType(): return False if self.type() == 'RSA': k = self._keyObject if not self.isPublic(): k = k.public_key() verifier = k.verifier( common.getNS(signature)[0], padding.PKCS1v15(), hashes.SHA1(), ) elif self.type() == 'DSA': concatenatedSignature = common.getNS(signature)[0] r = int_from_bytes(concatenatedSignature[:20], 'big') s = int_from_bytes(concatenatedSignature[20:], 'big') signature = encode_dss_signature(r, s) k = self._keyObject if not self.isPublic(): k = k.public_key() verifier = k.verifier( signature, hashes.SHA1()) else: raise BadKeyError("unknown key type %s" % (self.type(),)) verifier.update(data) try: verifier.verify() except InvalidSignature: return False else: return True @deprecated(Version("Twisted", 15, 5, 0)) def objectType(obj): """ DEPRECATED. Return the SSH key type corresponding to a C{Crypto.PublicKey.pubkey.pubkey} object. @param obj: Key for which the type is returned. @type obj: C{Crypto.PublicKey.pubkey.pubkey} @return: Return the SSH key type corresponding to a PyCrypto object. @rtype: C{str} """ keyDataMapping = { ('n', 'e', 'd', 'p', 'q'): b'ssh-rsa', ('n', 'e', 'd', 'p', 'q', 'u'): b'ssh-rsa', ('y', 'g', 'p', 'q', 'x'): b'ssh-dss' } try: return keyDataMapping[tuple(obj.keydata)] except (KeyError, AttributeError): raise BadKeyError("invalid key object", obj) def _getPersistentRSAKey(location, keySize=4096): """ This function returns a persistent L{Key}. The key is loaded from a PEM file in C{location}. If it does not exist, a key with the key size of C{keySize} is generated and saved. @param location: Where the key is stored. @type location: L{twisted.python.filepath.FilePath) @param keySize: The size of the key, if it needs to be generated. @type keySize: L{int} @returns: A persistent key. 
@rtype: L{Key} """ location.parent().makedirs(ignoreExistingDirectory=True) # If it doesn't exist, we want to generate a new key and save it if not location.exists(): privateKey = rsa.generate_private_key( public_exponent=65537, key_size=keySize, backend=default_backend() ) pem = privateKey.private_bytes( encoding=serialization.Encoding.PEM, format=serialization.PrivateFormat.TraditionalOpenSSL, encryption_algorithm=serialization.NoEncryption() ) location.setContent(pem) # By this point (save any hilarious race conditions) we should have a # working PEM file. Load it! # (Future archaelogical readers: I chose not to short circuit above, # because then there's two exit paths to this code!) with location.open("rb") as keyFile: privateKey = serialization.load_pem_private_key( keyFile.read(), password=None, backend=default_backend() ) return Key(privateKey) if _PY3: # The objectType function is deprecated and not being ported to Python 3. del objectType
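To make the public API above concrete, a short round-trip sketch; the key path is illustrative and the key is assumed to be an unencrypted RSA private key:

# Round-trip sketch for the Key class above; the file path is an assumption.
from twisted.conch.ssh.keys import Key

key = Key.fromFile('/home/user/.ssh/id_rsa')   # private, unencrypted
sig = key.sign(b'payload')                     # NS(type) + NS(signature)
assert key.public().verify(sig, b'payload')
print(key.fingerprint())                       # colon-separated MD5 digest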
A super slick trick-taking game featuring art by Don Maitz and Keith Parkinson! Seriously, you should buy it. This edition is "special" as it's the only one!
""" Python class representing the Agilent4395A Frequency Response Analyzer """ import cmath import datetime import sys import visa from draw import draw # Controller class for the Agilent 4395A Frequency Response Analyzer # - operates over GPIB using a National Instruments USB-GPIB cable # # GPIB uses two methods for communication: # WRITE - Sends a command, doesn't return a response # QUERY - Sends a command, followed by a '?', and returns a response class Agilent4395A: # GPIB Address, used to specify connection ADDRESS = u'GPIB0::16::INSTR' # GPIB ID string, returned by the '*IDN?' query, used to test if connection is successful ID = u'HEWLETT-PACKARD,4395A,MY41101925,REV1.12\n' def __init__(self): self.rm = visa.ResourceManager() self.analyzer = None # Connect to and initialize the analyzer def connect(self): self.analyzer = self.rm.open_resource(self.ADDRESS) currentID = (self.query("*IDN?")) if currentID != self.ID: print "ID discrepancy:" print " expected:", self.ID print " actual: ", currentID return False self.write("*RST") self.write("*CLS") return True # Close the connection (should be done before script exits) def disconnect(self): self.analyzer.close() # Sends a string to the analyzer, does not return a response def write(self, cmd): self.analyzer.write(cmd) # Sends a string (must end with '?'), returns response # If the response is large, it may take several seconds to return def query(self, cmd): return self.analyzer.query(cmd) if __name__ == "__main__": # Test script, sends some configuration commands and reads the measured data import time fra = Agilent4395A() if not fra.connect(): print "Failed to connect to Agilent4395A" exit(1) # dictionaries to store measurements results = {"A": {}, "B": {}} channels = ["A", "B"] filename = sys.argv[1] # Setup parameters Rs = 50.0 # resistance, ohms power = 0.0 # dB nPoints = 201 f1 = 100 f2 = 100000000 # overwrite power from command line if len(sys.argv) == 3: power = eval(sys.argv[2]) # Validate parameters if not (1 <= nPoints <= 801): print "nPoints must be in the range [0, 801]" exit() if not (f1 < f2): print "f1 must be less than f2" exit() if not (10 <= f1 <= 510000000): print "start/stop frequencies must be in the range [10, 510M]" exit() if not (10 <= f2 <= 510000000): print "start/stop frequencies must be in the range [10, 510M]" exit() # BWAUTO 1 fmts = ["LINM", "PHAS", "REAL", "IMAG", "LOGM"] commands = """NA CHAN1 HOLD SWPT LOGF BWAUTO 1 POIN {} FORM4 MEAS {{}} PHAU DEG STAR {} HZ STOP {} HZ POWE {} FMT LINM""".format(nPoints, f1, f2, power) for channel in channels: #print "press enter to measure channel {}".format(channel), raw_input() # Configure analyzer for measurements print "Configuring analyzer for measurement of channel {}".format(channel) commandList = commands.format(channel).split("\n") for cmd in commandList: fra.write(cmd.strip()) time.sleep(15) # Get sweep duration t = 10 try: duration = fra.query("SWET?").strip() t = float(duration) except: print "failed to convert to float: ", duration t = 180 print "sweep time: {}".format(t) # Make measurement t0 = time.time() fra.write("SING") while time.time() < t0 + t + 2: print "waiting" time.sleep(1) # Read data from analyzer for fmt in fmts: print "Reading Channel {}: {} ...".format(channel, fmt), fra.write("FMT {}".format(fmt)) # results are read as list of x1,y1,x2,y2,x3,y3... where every yn value is 0. 
# this line splits the list at every comma, strips out every second value, and converts to floats response = fra.query("OUTPDTRC?") print "done - {} bytes".format(len(response)) results[channel][fmt] = map(float, response.strip().split(",")[::2]) # Read x-axis values (frequency points) freqs = fra.query("OUTPSWPRM?").strip() freqs = map(float, freqs.split(",")) timestamp = datetime.datetime.now().isoformat() print "saving file" filename = "sweepResults_{}.csv".format(filename) output = open(filename, "w") output.write("Impedance Measurement Performed with an Agilent 4395A Network Analyzer\n") output.write("File generated on: {}\n".format(timestamp)) output.write("Rs = {} ohms\n".format(Rs)) output.write("Impedance Calculation: Rs x (Va - Vb) / Vb\n") output.write("Start Frequency: {} Hz\n".format(f1)) output.write("Stop Frequency: {} Hz\n".format(f2)) output.write("Number of data points: {}\n".format(nPoints)) output.write("Source Power (dB): {}\n".format(power)) output.write("Measurement BW: auto \n") output.write("\n") # Store additional info here output.write("\n") # Store additional info here output.write("Frequency,Va (real),Va (imag),Vb (real),Vb (imag),Va Mag,Va Phase,Vb Mag,Vb Phase,Impedance Mag,Impedance Phase\n") for i in range(nPoints): freq = freqs[i] VaReal = results["A"]["REAL"][i] VaImag = results["A"]["IMAG"][i] VbReal = results["B"]["REAL"][i] VbImag = results["B"]["IMAG"][i] Va = VaReal + 1j * VaImag Vb = VbReal + 1j * VbImag Z = Rs * (Va - Vb) / Vb VaMag, VaPhase = cmath.polar(Va) VbMag, VbPhase = cmath.polar(Vb) ZMag, ZPhase = cmath.polar(Z) VaPhase = 180 * VaPhase / cmath.pi VbPhase = 180 * VbPhase / cmath.pi ZPhase = 180 * ZPhase / cmath.pi output.write("{},{},{},{},{},{},{},{},{},{},{}\n".format( freq, VaReal, VaImag, VbReal, VbImag, VaMag, VaPhase, VbMag, VbPhase, ZMag, ZPhase)) output.close() fra.disconnect() #time.sleep(1) draw(filename) exit()
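To isolate the impedance arithmetic used in the sweep loop above, a standalone check with invented readings (same py2 print style as the script):

# Standalone check of the series-resistor impedance math; values invented.
import cmath

Rs = 50.0                 # series resistance, ohms
Va = 1.00 + 0.00j         # channel A voltage (source side)
Vb = 0.40 - 0.10j         # channel B voltage (DUT side)
Z = Rs * (Va - Vb) / Vb   # same formula as in the sweep loop
mag, phase = cmath.polar(Z)
print "|Z| = %.2f ohms, phase = %.1f deg" % (mag, 180 * phase / cmath.pi)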
I would like to program my STEMlab 125-14 v1.0 using the JTAG connector on the board, so I can do it directly from Vivado (instead of over ssh) and also debug my designs using Vivado's Integrated Logic Analyzer. The Digilent JTAG-HS3 cable unfortunately does not really fit on the pins on the board, so I would like to solder a little adapter. Using this adapter together with the JTAG-HS3, I try to program the Red Pitaya in Vivado 2015.4 and get:
ERROR: [Labtools 27-2269] No devices detected on target localhost:3121/xilinx_tcf/Digilent/210299A888F9. Use the disconnect_hw_server and connect_hw_server to re-register this hardware target.
PROBLEM SOLVED: Using a Xilinx Platform Cable USB II did the job for me and let me connect to the board in Vivado.
# Copyright 2014-2015 Eucalyptus Systems, Inc. # # Redistribution and use of this software in source and binary forms, # with or without modification, are permitted provided that the following # conditions are met: # # Redistributions of source code must retain the above copyright notice, # this list of conditions and the following disclaimer. # # Redistributions in binary form must reproduce the above copyright # notice, this list of conditions and the following disclaimer in the # documentation and/or other materials provided with the distribution. # # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. from requestbuilder import Arg from euca2ools.commands.iam import IAMRequest, AS_ACCOUNT, arg_iprofile class GetInstanceProfile(IAMRequest): DESCRIPTION = "Display an instance profile's ARN and GUID" ARGS = [arg_iprofile( help='name of the instance profile to describe (required)'), Arg('-r', dest='show_roles', action='store_true', route_to=None, help='''also list the roles associated with the instance profile'''), AS_ACCOUNT] LIST_TAGS = ['Roles'] def print_result(self, result): print result.get('InstanceProfile', {}).get('Arn') print result.get('InstanceProfile', {}).get('InstanceProfileId') if self.args.get('show_roles'): for role in result.get('InstanceProfile', {}).get('Roles') or []: print role.get('Arn')
In her directorial debut, the Israeli director follows two women who, in their own way, try to break with outdated traditions.
A wedding in a Bedouin village in the desert of southern Israel. Women and men celebrate and dance separately whilst the bride sits alone at the edge of the exuberant gathering. Hostess Jalila is also overwhelmed and covers up her emotions. It is the wedding of her husband Sulimann; he is taking a younger woman as his second wife. On top of that, her oldest daughter, the self-confident Layla, is also causing her concern: she has fallen for a boy at her university. Aware that her husband will not allow this relationship, Jalila forbids Layla from seeing the boy again. But when Sulimann reacts even more harshly, she takes up her daughter’s cause.
In her directorial debut, Elite Zexer follows two women who, in their own way, try to break with outdated traditions. Whilst Jalila initially adopts a quiet approach, her daughter chooses instead to go on the offensive. When this threatens to tear the family apart, it becomes clear that Layla’s temperament, strength and self-assurance are not enough to overcome the rigid social structures. Nonetheless, the women standing together in solidarity could pave the way for more freedom – at least for the next generation.
#!/usr/bin/env python3
#
# rewritemeta.py - part of the FDroid server tools
# This cleans up the original .yml metadata file format.
# Copyright (C) 2010-12, Ciaran Gultnieks, [email protected]
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.

from argparse import ArgumentParser
import os
import logging
import io
import tempfile
import shutil

from . import _
from . import common
from . import metadata

config = None
options = None


def proper_format(app):
    s = io.StringIO()
    # TODO: currently reading entire file again, should reuse first
    # read in metadata.py
    with open(app.metadatapath, 'r', encoding='utf-8') as f:
        cur_content = f.read()
    if app.metadatapath.endswith('.yml'):
        metadata.write_yaml(s, app)
    content = s.getvalue()
    s.close()
    return content == cur_content


def main():
    global config, options

    parser = ArgumentParser()
    common.setup_global_opts(parser)
    parser.add_argument("-l", "--list", action="store_true", default=False,
                        help=_("List files that would be reformatted"))
    parser.add_argument("appid", nargs='*',
                        help=_("application ID of file to operate on"))
    metadata.add_metadata_arguments(parser)
    options = parser.parse_args()
    metadata.warnings_action = options.W

    config = common.read_config(options)

    # Get all apps...
    allapps = metadata.read_metadata(options.appid)
    apps = common.read_app_args(options.appid, allapps, False)

    for appid, app in apps.items():
        path = app.metadatapath
        if path.endswith('.yml'):
            logging.info(_("Rewriting '{appid}'").format(appid=appid))
        else:
            logging.warning(_('Cannot rewrite "{path}"').format(path=path))
            continue

        if options.list:
            if not proper_format(app):
                print(path)
            continue

        newbuilds = []
        for build in app.get('Builds', []):
            new = metadata.Build()
            for k in metadata.build_flags:
                v = build[k]
                if v is None or v is False or v == [] or v == '':
                    continue
                new[k] = v
            newbuilds.append(new)
        app['Builds'] = newbuilds

        # rewrite to temporary file before overwriting existing
        # file in case there's a bug in write_metadata
        with tempfile.TemporaryDirectory() as tmpdir:
            tmp_path = os.path.join(tmpdir, os.path.basename(path))
            metadata.write_metadata(tmp_path, app)
            shutil.move(tmp_path, path)

    logging.debug(_("Finished"))


if __name__ == "__main__":
    main()
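A hedged command-line sketch; this module is normally reached through the fdroid front end, and the app ID below is a placeholder:

# CLI usage sketch ('org.example.app' is a placeholder):
#   $ fdroid rewritemeta --list            # list files that would be reformatted
#   $ fdroid rewritemeta org.example.app   # rewrite one app's metadata in place
#   $ fdroid rewritemeta                   # rewrite all .yml metadata files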
pinehillsholton Inc. ("pinehillsholton") operates pinehillsholton.com and may operate other websites. It is pinehillsholton's policy to respect your privacy regarding any information we may collect while operating our websites. Like most website operators, pinehillsholton collects non-personally-identifying information of the sort that web browsers and servers typically make available, such as the browser type, language preference, referring site, and the date and time of each visitor request. pinehillsholton's purpose in collecting non-personally identifying information is to better understand how pinehillsholton's visitors use its website. From time to time, pinehillsholton may release non-personally-identifying information in the aggregate, e.g., by publishing a report on trends in the usage of its website. pinehillsholton also collects potentially personally-identifying information like Internet Protocol (IP) addresses for logged in users and for users leaving comments on pinehillsholton.com blogs/sites. pinehillsholton only discloses logged in user and commenter IP addresses under the same circumstances that it uses and discloses personally-identifying information as described below, except that commenter IP addresses and email addresses are visible and disclosed to the administrators of the blog/site where the comment was left. Certain visitors to pinehillsholton's websites choose to interact with pinehillsholton in ways that require pinehillsholton to gather personally-identifying information. The amount and type of information that pinehillsholton gathers depends on the nature of the interaction. For example, we ask visitors who sign up at pinehillsholton.com to provide a username and email address. Those who engage in transactions with pinehillsholton are asked to provide additional information, including as necessary the personal and financial information required to process those transactions. In each case, pinehillsholton collects such information only insofar as is necessary or appropriate to fulfill the purpose of the visitor's interaction with pinehillsholton. pinehillsholton does not disclose personally-identifying information other than as described below. And visitors can always refuse to supply personally-identifying information, with the caveat that it may prevent them from engaging in certain website-related activities. pinehillsholton may collect statistics about the behavior of visitors to its websites. pinehillsholton may display this information publicly or provide it to others. However, pinehillsholton does not disclose personally-identifying information other than as described below. pinehillsholton discloses potentially personally-identifying and personally-identifying information only to those of its employees, contractors and affiliated organizations that (i) need to know that information in order to process it on pinehillsholton's behalf or to provide services available at pinehillsholton's websites, and (ii) that have agreed not to disclose it to others. Some of those employees, contractors and affiliated organizations may be located outside of your home country; by using pinehillsholton's websites, you consent to the transfer of such information to them. pinehillsholton will not rent or sell potentially personally-identifying and personally-identifying information to anyone. 
Other than to its employees, contractors and affiliated organizations, as described above, pinehillsholton discloses potentially personally-identifying and personally-identifying information only in response to a subpoena, court order or other governmental request, or when pinehillsholton believes in good faith that disclosure is reasonably necessary to protect the property or rights of pinehillsholton, third parties or the public at large. If you are a registered user of a pinehillsholton website and have supplied your email address, pinehillsholton may occasionally send you an email to tell you about new features, solicit your feedback, or just keep you up to date with what's going on with pinehillsholton and our products. If you send us a request (for example via email or via one of our feedback mechanisms), we reserve the right to publish it in order to help us clarify or respond to your request or to help us support other users. pinehillsholton takes all measures reasonably necessary to protect against the unauthorized access, use, alteration or destruction of potentially personally-identifying and personally-identifying information. If pinehillsholton, or substantially all of its assets, were acquired, or in the unlikely event that pinehillsholton goes out of business or enters bankruptcy, user information would be one of the assets that is transferred or acquired by a third party. You acknowledge that such transfers may occur, and that any acquirer of pinehillsholton may continue to use your personal information as set forth in this policy.
"""Imports/exports arrays and generates artificial ones. The artificial data are DTMs and DSMs are basically numpy arrays with height values. All sizes and 2D coordinates refer to array elements, with (0==row, 0==column) being the top left cell. """ import numpy from scipy.ndimage import measurements def asarray(parameters): """Converts a PIL image to a numpy array. :param parameters['data']: the input image, takes only one :type parameters['data']: PIL.Image :return: numpy.array """ return numpy.asarray(parameters['data'][0]) def get_shape(parameters): """Returns the shape of the input array. :param parameters['data']: the input array, takes only one :type parameters['data']: numpy.array :return: tuple """ return parameters['data'][0].shape def gaussian_noise(parameters): """Generates gaussian noise. .. warning:: If this is to be applied to images keep in mind that the values should be integers and that adding noise will push some pixel values over the supports color depth. e.g. In an 8 bit grey image, normally taking color values in [0, 255] adding noise to it will make some pixels take color values > 255. Scaling these pixels to become white will result in more white pixels than expected. :param parameters['data']: the input array :type parameters['data']: numpy.array :param parameters['mean']: mean value of the distribution :type parameters['mean']: float :param parameters['stddev']: standard deviation of the distribution :type parameters['stddev']: float :return: numpy.array """ return numpy.random.normal(parameters['mean'], parameters['stddev'], parameters['shape']) def load(parameters): """Loads an array from file and returns it. It supports loading from txt and npy files. :param parameters['path']: path to the file :type parameters['path']: string :param parameters['delimiter']: select which delimiter to use for loading a txt to an array, defaults to space :type parameters['delimiter']: string :return: numpy.array """ path = parameters['path'] extension = path.split('.').pop() if extension in 'txt': delimiter = parameters.get('delimiter', ' ') return numpy.loadtxt(path, delimiter=delimiter) elif extension in 'npy': return numpy.load(path) else: raise TypeError("Filetype not supported") def save(parameters): """Saves an object to a file. It supports saving to txt and npy files. :param parameters['data']: the object to be saved, takes only one :type parameters['data']: numpy.array :param parameters['path']: destination path :type parameters['path']: string :param parameters['format']: select output format, defaults to '%.2f' :type parameters['format']: string :param parameters['delimiter']: select which delimiter to use for saving a txt to an array, defaults to space :type parameters['delimiter']: string :return: True or raise TypeError """ path = parameters['path'] data = parameters['data'][0] extension = path.split('.').pop() if extension in 'txt': format = parameters.get('fmt', '%.2f') delimiter = parameters.get('delimiter', ' ') numpy.savetxt(path, data, fmt=format, delimiter=delimiter) elif extension in 'npy': numpy.save(path, data) else: raise TypeError("Filetype not supported") return True def split(parameters): """Splits a 3D array and returns only the layer requested. 
:param parameters['data']: the input 3D array, takes only one :type parameters['data']: numpy.array :param parameters['layer']: the 2D layer to return, 0 is the first one :type parameters['layer']: numpy.array :return: 2D numpy.array """ return parameters['data'][0][:, :, parameters['layer']] def dtm(parameters): """Generates a DTM with linear slope. Slope is applied in row major order, so pixels in each row have the same height value. :param parameters['slope_step']: height difference for neighbouring cells :type parameters['slope_step']: float or integer :param parameters['min_value']: global minimum height value :type parameters['min_value']: float or integer :param parameters['size']: the size of the surface in [rows, columns] :type parameters['size']: list :return: numpy.array """ slope_step = parameters['slope_step'] min_value = parameters['min_value'] size = parameters['size'] data = numpy.zeros(size, dtype=float) for i in range(size[0]): data[i, :] = numpy.arange(min_value, size[1], slope_step) return data def dsm(parameters): """Generates a DSM by elevating groups a cells by certain height. This requires an input array, the DTM, and a mask. The mask designates which cells of the DTM should be elevated in order to produce the DSM. Basically, the mask shows in which cells there are features with significant height, e.g. trees, buildings etc. The tricky part it to account for DTM slope when elevating a group of cells. If you simply add some height to the initial DTM then the features will be elevated parallel to the ground. Especially in the case of buildings, their roof is horizontal, regardless of the underlying DTM slope. To account for this, the algorithm initially labels the mask. As a result you get groups of cells which should all be elevated to the same height. Next, it finds the maximum height value of underlying DTM for each blob. Finally, it assigns `max_blob_height + delta_height` to each blob cell. :param parameters['data'][0]: the base DTM :type parameters['data'][0]: numpy.array :param parameters['data'][1]: the mask of cells to elevate :type parameters['data'][1]: numpy.array with boolean/binary values :param parameters['delta_height']: single cell elevation value :type parameters['delta_height']: float or integer :return: numpy.array """ dtm = parameters['data'][0] mask = parameters['data'][1] delta_height = parameters['delta_height'] # label and find the max height of each blob labels, count = measurements.label(mask) max_heights = measurements.maximum(dtm, labels=labels, index=range(1, count + 1)) # assign the max height at each blob cell, required to copy so it won't # change the initial dtm values dsm = dtm.copy() for blob_id in range(1, count + 1): dsm[numpy.where(labels == blob_id)] = max_heights[blob_id - 1] +\ delta_height return dsm
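For context, a minimal usage sketch of the generators above, assuming it runs in the same module as those functions: the mask footprint, file name and numeric values here are illustrative only, not taken from the module itself.

# Illustrative driver: build a sloped DTM, mark a 3x3 "building" footprint
# in a mask, and derive a DSM whose roof cells share a single height value.
import numpy

terrain = dtm({'slope_step': 0.5, 'min_value': 100.0, 'size': [10, 10]})
mask = numpy.zeros((10, 10), dtype=bool)
mask[2:5, 2:5] = True  # one rectangular feature footprint

surface = dsm({'data': [terrain, mask], 'delta_height': 8.0})

# every cell of the blob gets max underlying DTM height + delta_height
assert numpy.all(surface[mask] == terrain[mask].max() + 8.0)

save({'data': [surface], 'path': 'dsm.txt', 'delimiter': ' '})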
April 24, 1992: The defending champs looked every bit the part as Jordan went on a tear with 46 points in helping the Bulls pull away in the 2nd half.
April 26, 1992: In this Game 2 both Jordan and Pippen scored 30 points in routing the Heat and moving one step closer to the 2nd round.
April 29, 1992: One of the best playoff games of 1992 saw the Heat race out to an 18 point lead and hold Jordan to no points in the first 10 minutes. One problem: he scored 56 over the next 38 minutes!
Special Boxset Price for this 1992 Eastern Conference Semi-Finals. Price will reduce when added to Basket.
1992 Eastern Conference Semi-Finals: After getting embarrassed one year earlier the Knicks made some key offseason moves. The biggest was new coach Pat Riley, along with Xavier McDaniel, Anthony Mason and an up-and-coming John Starks. The Bulls had again dominated the Knicks in the Regular Season but this series was a taste of things to come as it went a full gruelling 7 games.
May 5, 1992: The Knicks made the plays when it mattered most and a relentless defense sent them to an improbable Game 1 victory.
May 7, 1992: The Bulls stepped up the defensive pressure a notch as their backup PG BJ Armstrong played the role of hero. NOTE: About 50 seconds of game time is missing in the final minute of the game.
May 9, 1992: The Bulls used the Knicks' gameplan on the Knicks as they never really allowed NY to get into the swing of things. For the first time in the series both Jordan and Pippen dominated, regaining home court advantage for the Bulls.
May 10, 1992: The Knicks were in a must-win situation and they delivered. With Ewing in foul trouble for most of the game the Knicks had to look elsewhere, and McDaniel, Oakley and Starks all delivered.
May 12, 1992: The series took on a different atmosphere as flagrant fouls were being called left and right. But it was Jordan, who scored 26 in the 2nd half, who would have the final word.
May 14, 1992: The Knicks, pumped up by an amazing crowd, put in a great performance on both ends of the floor, using a 13-0 run to start the 4th quarter to their full advantage.
May 17, 1992: An amazing series comes down to a final 7th game. Jordan steps up and refuses to let his team lose as he puts on a dominating 42-point display and sends Chicago back to the Conference Finals.
Special Boxset Price for the 1992 Eastern Conference Finals. Price will reduce when added to Basket.
1992 Eastern Conference Finals: This was to be the Cavaliers' best shot at a title until the LeBron years. Led by all-stars Brad Daugherty and Mark Price and veteran Larry Nance, the Cavs were out to prove they weren't just cannon fodder for the Champs.
May 21, 1992: Widely criticised after their Game 1 effort, the Cavs turned it around 100% in completely dominating the Bulls on their home floor.
May 25, 1992: With their two stars struggling, the rest of the Cavs had to step up and they did. Great team performance from Cleveland despite Jordan's 35 points. Danny Ferry did his best in trying to take out Jordan with a couple of punches.
May 29, 1992: Probably the best game of the series as it was close throughout. Jordan struggled in the first 3 quarters but his teammates carried the load, and when it mattered most it was MJ who stood head and shoulders above the rest.
May 5, 1992: A thrilling close game that came down to the final seconds. A top defensive play the difference.
May 7, 1992: Blazers get off to a big start but the Suns battle back after Kevin Johnson drops 22 points, including 18 straight, in the 3rd quarter.
Blazers take the lead for good when Johnson is rested in the 4th.
May 9, 1992: This game was the perfect example of the Suns' fast-break offense.
May 11, 1992: One of the best playoff games of all time. Over 300 points in this double OT thriller.
May 14, 1992: The upstart Suns put on a valiant effort but it was not to be. Their time would come one year later.
May 26, 1992: Stockton injured late in the first half but the Jazz held bravely on to take the game into overtime.
May 28, 1992: Utah is undefeated at home in the Playoffs. Portland is 0-4 this season at Utah. Game was close until the Blazers started to pull away in the 3rd. Jazz make a furious comeback that falls just a bit short.
# -*- coding: utf-8 -*- # Generated by Django 1.11.2 on 2017-09-14 05:42 from __future__ import unicode_literals from django.db import migrations, models class Migration(migrations.Migration): initial = True dependencies = [ ] operations = [ migrations.CreateModel( name='Event', fields=[ ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), ('title', models.CharField(max_length=80, verbose_name='Name')), ('time', models.DateTimeField(verbose_name='Date and Time')), ('description', models.TextField(verbose_name='Description')), ], ), migrations.CreateModel( name='Officer', fields=[ ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), ('name', models.CharField(max_length=50, verbose_name='Name')), ('position', models.CharField(max_length=30, verbose_name='Position')), ('username', models.CharField(blank=True, max_length=30, verbose_name='Username')), ('email', models.EmailField(max_length=254, verbose_name='Email')), ('website', models.URLField(blank=True, verbose_name='Website')), ('github', models.URLField(blank=True, verbose_name='GitHub')), ('linkedin', models.URLField(blank=True, verbose_name='LinkedIn')), ('description', models.CharField(max_length=200, verbose_name='Description')), ], ), ]
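For readers unfamiliar with Django migrations, here is a hypothetical reconstruction of the models.py this initial migration would have been generated from; the fields mirror the CreateModel operations above, but the real source module may differ in Meta options or helper methods.

# Hypothetical models.py matching the migration above (reconstructed,
# not taken from the project itself). The first positional argument of
# each field is its verbose_name, as recorded in the migration.
from django.db import models


class Event(models.Model):
    title = models.CharField('Name', max_length=80)
    time = models.DateTimeField('Date and Time')
    description = models.TextField('Description')


class Officer(models.Model):
    name = models.CharField('Name', max_length=50)
    position = models.CharField('Position', max_length=30)
    username = models.CharField('Username', max_length=30, blank=True)
    email = models.EmailField('Email', max_length=254)
    website = models.URLField('Website', blank=True)
    github = models.URLField('GitHub', blank=True)
    linkedin = models.URLField('LinkedIn', blank=True)
    description = models.CharField('Description', max_length=200)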
William Cook - TD Financial Planner at 1633 Ellis Street, Kelowna, BC: opening hours, driving directions, official site, phone numbers & customer reviews. William Cook - TD Financial Planner at 1633 Ellis Street, Kelowna, BC: consumer reviews, opening hours, driving directions, photos etc. William Cook - TD Financial Planner is a business services provider based in British Columbia. William Cook - TD Financial Planner is located at 1633 Ellis Street, Kelowna, BC. You can find William Cook - TD Financial Planner opening hours, address, driving directions and map, photos and phone numbers 250-448-7392. Find helpful customer reviews and write your own review to rate the business service.
""" ControlDemo Example Bill Winder <[email protected]> added HorizontalSlider demo. Bill Winder <[email protected]> added AreaSlider demo. """ import pyjd # dummy in pyjs from pyjamas.ui.RootPanel import RootPanel from pyjamas.ui.CaptionPanel import CaptionPanel from pyjamas.ui.Label import Label from pyjamas.ui.Controls import VerticalDemoSlider from pyjamas.ui.Controls import VerticalDemoSlider2 from pyjamas.ui.Controls import HorizontalDemoSlider from pyjamas.ui.Controls import HorizontalDemoSlider2 from pyjamas.ui.Controls import AreaDemoSlider from pyjamas.ui.Controls import AreaDemoSlider2 from pyjamas.ui.Controls import InputControl from pyjamas.ui.MouseInputControl import MouseInputControl from pyjamas.ui.HorizontalPanel import HorizontalPanel from pyjamas.ui.VerticalPanel import VerticalPanel from pyjamas.ui import HasAlignment class SliderClass(VerticalPanel): """ example of control which pairs up two other controls. should really be made into a control itself. """ def __init__(self, p2): VerticalPanel.__init__(self) self.setSpacing(10) if p2: self.b = VerticalDemoSlider2(0, 100) else: self.b = VerticalDemoSlider(0, 100) self.add(self.b) self.b.setWidth("20px") self.b.setHeight("100px") self.b.addControlValueListener(self) self.label = InputControl(0, 100) self.add(self.label) self.label.addControlValueListener(self) def onControlValueChanged(self, sender, old_value, new_value): if sender == self.label: self.b.setControlPos(new_value) self.b.setValue(new_value, 0) if sender == self.b: self.label.setControlPos(new_value) self.label.setValue(new_value, 0) class HSliderClass(VerticalPanel): """ example of control which pairs up two other controls. should really be made into a control itself. """ def __init__(self, p2): VerticalPanel.__init__(self) self.setSpacing(10) if p2: self.b = HorizontalDemoSlider2(0, 100) else: self.b = HorizontalDemoSlider(0, 100) self.add(self.b) self.b.setHeight("20px") self.b.setWidth("100px") self.b.addControlValueListener(self) self.label = InputControl(0, 100) self.add(self.label) self.label.addControlValueListener(self) def onControlValueChanged(self, sender, old_value, new_value): if sender == self.label: self.b.setControlPos(new_value) self.b.setValue(new_value, 0) if sender == self.b: self.label.setControlPos(new_value) self.label.setValue(new_value, 0) class ASliderClass(VerticalPanel): """ example of control which pairs up two other controls. should really be made into a control itself. """ def __init__(self, p2): VerticalPanel.__init__(self) self.setSpacing(10) if p2: self.b = AreaDemoSlider2([0,0], [100,100], [0.2, 0.2]) else: self.b = AreaDemoSlider([0,0], [100,100], [0.2, 0.2]) self.add(self.b) self.b.setHeight("100px") self.b.setWidth("100px") self.b.addControlValueListener(self) self.label_x = MouseInputControl(0, 100, 0.2) self.add(self.label_x) self.label_x.addControlValueListener(self) self.label_y = MouseInputControl(0, 100, 0.2) self.add(self.label_y) self.label_y.addControlValueListener(self) def onControlValueChanged(self, sender, old_value_xy, new_value_xy): #no use of old_values? 
        # (old_value_x, old_value_y)
        if (sender == self.label_x):
            self.b.setControlPos([new_value_xy, self.b.value_y])
            self.b.setValue([new_value_xy, self.b.value_y], 0)
        elif (sender == self.label_y):
            self.b.setControlPos([self.b.value_x, new_value_xy])
            self.b.setValue([self.b.value_x, new_value_xy], 0)
        elif (sender == self.b):
            (new_value_x, new_value_y) = new_value_xy
            self.label_x.setControlPos(new_value_x)
            self.label_x.setValue(new_value_x, 0)
            self.label_y.setControlPos(new_value_y)
            self.label_y.setValue(new_value_y, 0)

class ControlDemo:

    def onModuleLoad(self):
        v = VerticalPanel(Spacing=10)

        p = HorizontalPanel(Spacing=10,
                            VerticalAlignment=HasAlignment.ALIGN_BOTTOM)
        sc = SliderClass(False)
        p.add(CaptionPanel("clickable only", sc))
        sc = SliderClass(True)
        p.add(CaptionPanel("draggable", sc))
        sc = SliderClass(True)
        p.add(CaptionPanel("draggable", sc))
        v.add(CaptionPanel("Vertical Sliders with inputboxes", p))

        p = HorizontalPanel()
        p.setSpacing(10)
        p.setVerticalAlignment(HasAlignment.ALIGN_BOTTOM)
        sc = HSliderClass(False)
        p.add(CaptionPanel("clickable only", sc))
        sc = HSliderClass(True)
        p.add(CaptionPanel("draggable", sc))
        v.add(CaptionPanel("Horizontal Sliders with inputboxes", p))

        p = HorizontalPanel()
        p.setSpacing(10)
        p.setVerticalAlignment(HasAlignment.ALIGN_BOTTOM)
        sc = ASliderClass(False)
        p.add(CaptionPanel("clickable only", sc))
        sc = ASliderClass(True)
        p.add(CaptionPanel("draggable", sc))
        v.add(CaptionPanel("2D Controls: Inputboxes are draggable as well", p))

        RootPanel().add(v)

if __name__ == '__main__':
    pyjd.setup("./public/ControlDemo.html")
    app = ControlDemo()
    app.onModuleLoad()
    pyjd.run()
As the international leader in licensed sports merchandise, Fanatics ships to over 180 countries, runs stores in over twelve languages and supports multi-lingual call centres covering eleven languages. Fanatics extended the reach of its licensed sports merchandise business by acquiring UK-based international sports e-commerce company Kitbag on February 2nd, 2016. It now supports and complements Fanatics' successful U.S. operation by specializing in partnerships with the largest sporting groups and organisations around the world.
# coding: utf-8 # Copyright (c) Pymatgen Development Team. # Distributed under the terms of the MIT License. from __future__ import division, unicode_literals import unittest from pymatgen import Structure from pymatgen.command_line.critic2_caller import * from monty.os.path import which __author__ = "Matthew Horton" __version__ = "0.1" __maintainer__ = "Matthew Horton" __email__ = "[email protected]" __status__ = "Production" __date__ = "July 2017" @unittest.skipIf(not which('critic2'), "critic2 executable not present") class Critic2CallerTest(unittest.TestCase): def test_from_path(self): # uses chgcars test_dir = os.path.join(os.path.dirname(__file__), "..", "..", "..", 'test_files/bader') c2c = Critic2Caller.from_path(test_dir) # check we have some results! self.assertGreaterEqual(len(c2c._stdout), 500) def test_from_structure(self): # uses promolecular density structure = Structure.from_file(os.path.join(os.path.dirname(__file__), "..", "..", "..", 'test_files/critic2/MoS2.cif')) c2c = Critic2Caller(structure) # check we have some results! self.assertGreaterEqual(len(c2c._stdout), 500) class Critic2OutputTest(unittest.TestCase): def setUp(self): stdout_file = os.path.join(os.path.dirname(__file__), "..", "..", "..", 'test_files/critic2/MoS2_critic2_stdout.txt') with open(stdout_file, 'r') as f: reference_stdout = f.read() structure = Structure.from_file(os.path.join(os.path.dirname(__file__), "..", "..", "..", 'test_files/critic2/MoS2.cif')) self.c2o = Critic2Output(structure, reference_stdout) def test_properties_to_from_dict(self): self.assertEqual(len(self.c2o.critical_points), 6) self.assertEqual(len(self.c2o.nodes), 14) self.assertEqual(len(self.c2o.edges), 10) # reference dictionary for c2o.critical_points[0].as_dict() # {'@class': 'CriticalPoint', # '@module': 'pymatgen.command_line.critic2_caller', # 'coords': None, # 'field': 93848.0413, # 'field_gradient': 0.0, # 'field_hessian': [[-2593274446000.0, -3.873587547e-19, -1.704530713e-08], # [-3.873587547e-19, -2593274446000.0, 1.386877485e-18], # [-1.704530713e-08, 1.386877485e-18, -2593274446000.0]], # 'frac_coords': [0.333333, 0.666667, 0.213295], # 'index': 0, # 'multiplicity': 1.0, # 'point_group': 'D3h', # 'type': < CriticalPointType.nucleus: 'nucleus' >} self.assertEqual(str(self.c2o.critical_points[0].type), "CriticalPointType.nucleus") # test connectivity self.assertDictEqual(self.c2o.edges[3], {'from_idx': 1, 'from_lvec': (0, 0, 0), 'to_idx': 0, 'to_lvec': (1, 0, 0)}) # test as/from dict d = self.c2o.as_dict() self.assertEqual(set(d.keys()), {'@module', '@class', 'structure', 'critic2_stdout'}) self.c2o.from_dict(d) if __name__ == '__main__': unittest.main()
Altitudinal gradients in mountain regions are short-range clines of environmental parameters like temperature or radiation. Common garden experiments with populations from different altitudes have long been used to study local adaptation. Modern genomics allows the investigation of footprints of selection at very high resolution. I will provide an overview of current methods and give examples from the adaptation of the wild plant Arabidopsis thaliana to high altitudes in the European Alps and of maize landraces in Peru.

In Arabidopsis thaliana, we investigated genomic and phenotypic signatures of adaptation to altitude in populations from the North Italian Alps that originated from 580 to 2350 m altitude by resequencing pools of individuals from each population. High-altitude populations showed a lower nucleotide diversity and negative Tajima's D values and were more closely related to each other than to low-altitude populations from the same valley. Despite their close geographic proximity, demographic analysis revealed that low- and high-altitude populations split between 260 000 and 15 000 years before present. Single nucleotide polymorphisms whose allele frequencies were highly differentiated between low- and high-altitude populations identified genomic regions of up to 50 kb length where patterns of genetic diversity are consistent with signatures of local selective sweeps. These regions harbour multiple genes known to be involved in stress response. Variation among populations in two putative adaptive phenotypic traits, frost tolerance and response to light/UV stress, strongly suggests that the main determinant of local adaptation at high altitudes is the highly variable microclimate. Multigenerational reciprocal common garden experiments also suggest a strong epigenetic effect that may contribute to fitness via a home site advantage.

A similar pattern of altitudinal differentiation was observed in the analysis of nearly 2000 genebank accessions of lowland and highland maize from Peru (0 to 3900 m altitude). They show a strong genetic and phenotypic differentiation that results from local adaptation, and FST-based outlier tests indicate genomic regions that likely were targets of differential selection. Taken together, the results provide strong evidence for local adaptation of plants along altitudinal gradients, and also point to further avenues for investigating this diversity using genetic approaches to identify the functional basis of local adaptation.

All organisms live in environments that vary through time and space. Such environmental change has a dramatic impact on the evolutionary dynamics of populations and promotes adaptive evolution to local environments. For organisms with limited dispersal or rapid generation time relative to the pace of environmental change, local adaptation to variable environments is predicted to be common. Seasonal changes in climate, coupled with inter-annual variability and anthropogenic changes over decadal scales, are a potent driver of adaptive evolution. Is adaptive evolution to climate variability predictable at a genetic level? Are polymorphisms that promote adaptive evolution to short-term fluctuations in climate likely to persist in populations, potentially enabling species to adapt to long term changes in climate?
To address these questions, I will present our recent theoretical investigations and empirical assessment of rapid and cyclic adaptation over seasonal time scales in the fruit fly, Drosophila melanogaster. Our theoretical models predict that polymorphisms underlying adaptation to seasonally fluctuating environments could plausibly cycle in frequency repeatedly over multiple years and can persist in populations for long periods of time. Using pooled estimates of allele frequencies from 20 paired spring-fall samples collected in North America and Europe, we show that seasonally selected polymorphisms can vary predictably among populations and that they tend to persist in the species for long periods of time. I will relate our work to the general topic of adaptation to anthropogenic climate change across taxa by discussing the importance of understanding the relationship between the pace of environmental change and an organism’s life-history.
#!/usr/bin/python
#------------------------------------------------------------------------------
#
# Process tile worker arguments from command line
#
# Author: [email protected]
#
# Copyright 2010-1 Mapquest, Inc.  All Rights reserved.
#
import os
import sys
import getopt

class worker_opts:
    def __init__( self, args ):
        self.argv = args
        self.brokeraddress = None
        self.worker_id = "pid%08d" % os.getpid()
        self.mapstyle = None
        self.tile_dir = None
        self.timeouts = 4

    #--------------------------------------------------------------------------
    def usage( self ):
        print "worker.py --address=192.168.0.0:8888 --mapstyle=thestyle --tiledir=/mnt/tiles --id=foo --timeouts=8"

    #--------------------------------------------------------------------------
    def process( self ):
        try:
            opts, args = getopt.getopt( self.argv, "a:d:hi:m:t:",
                                        [ "address=", "help", "id=",
                                          "mapstyle=", "tiledir=",
                                          "timeouts=" ])
        except getopt.GetoptError:
            self.usage()
            sys.exit()

        for opt, arg in opts:
            if opt in ("-h", "--help"):
                self.usage()
                sys.exit()
            elif opt in ("-a", "--address"):
                self.brokeraddress = arg
            elif opt in ("-d", "--tiledir"):
                # -d matches the "d:" entry in the getopt string above;
                # the original handler reused -t for both options
                self.tile_dir = arg
            elif opt in ("-i", "--id"):
                self.worker_id = arg
            elif opt in ("-m", "--mapstyle"):
                self.mapstyle = arg
            elif opt in ("-t", "--timeouts"):
                # was overwriting tile_dir; timeouts is an integer count
                self.timeouts = int(arg)

    #--------------------------------------------------------------------------
    def validate( self ):
        if self.brokeraddress == None:
            return False
        if self.worker_id == None:
            return False
        if self.mapstyle == None:
            return False
        if self.tile_dir == None:
            return False
        return True

    #--------------------------------------------------------------------------
    def getBrokerAddress( self ):
        return self.brokeraddress

    #--------------------------------------------------------------------------
    def getWorkerID( self ):
        return self.worker_id

    #--------------------------------------------------------------------------
    def getMapStyle( self ):
        return self.mapstyle

    #--------------------------------------------------------------------------
    def getTileDir( self ):
        return self.tile_dir

    #--------------------------------------------------------------------------
    def getTimeouts( self ):
        return self.timeouts

#--------------------------------------------------------------------------
#--------------------------------------------------------------------------
if __name__ == "__main__":
    objOpts = worker_opts( sys.argv[1:] )
    objOpts.process()
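A short sketch of the intended calling sequence, assuming the surrounding worker script imports or contains this class; the print line is illustrative only, since the actual worker loop is not part of this chunk.

# Hypothetical driver: parse the command line, gate on validate(), then
# read the settings back through the accessors.
import sys

opts = worker_opts( sys.argv[1:] )
opts.process()
if not opts.validate():
    opts.usage()
    sys.exit( 1 )
print "broker=%s style=%s tiles=%s timeouts=%d" % ( opts.getBrokerAddress(),
    opts.getMapStyle(), opts.getTileDir(), opts.getTimeouts() )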
Let's face it, prepaid phones just don't seem to be as good as the contract phones and I can truly speak from my own experience here. I bought this when I had another Motorola fail in a very short time - yet out of the Verizon warranty period. While this was an "emergency replacement" phone, actually it's been pretty good overall. Call quality and performance is excellent. WiFi and 4G connectivity have been trouble free. The screen is nice, not the best, but better than the similarly priced Motorolas by a long shot... and I keep the brightness lowered to extend battery life. The music player works reasonably well with decent fidelity through several headphones I have used. Overall performance is responsive, but for phones I am not a "power user" and keep the apps down... yet the memory is enough at this point for the apps I use. The camera is respectable - most important is that auto focus actually DOES work well. I do a lot of photography and have both a DSLR and point and shoot camera, but in a pinch this does the job. Note this does have a flash and a basic selfie / conference camera that is good enough and that's it; the only shortcoming vs. other phones I've had. Battery life can be quirky and I haven't figured out why. I have found that cleaning the contacts on the battery and phone does help. Overall battery life is quite good - then out of the blue it becomes a power hog. Not sure why, and not holding this against the phone. I would like to see auto display dimming, a nice feature often found at this price point and lacking on this phone. However at least it does have a multi-color alert LED to let you know you've got mail / messages / missed a call and charge status at a glance.

I purchased this as a first smart phone for my daughter before she went away on a retreat (first time away from home). I wanted her to be able to take pictures and contact me via phone, text and email, so this was perfect. I also purchased this particular phone because I wanted her to have a trial run with a prepay without the commitment of a contract. I am very surprised at how nice this phone is as I am a bit of an iPhone snob. The only slight down side is Best Buy did not have any phone covers that would fit this style, but luckily one of the employees let me know exactly where to go to get one. Overall a great decision if you are not wanting to commit to a cell phone contract. Oh, I almost forgot: the $45 plan is for 3GB if you do Autopay (only 1GB without). I like the fact that I can cancel it at any time so I am not bound to it like a contract. I have the Verizon app on my iPhone so I have complete access to billing, incoming/outgoing calls/texts, etc., which is a nice feature.

I've been through a few phones over the last year, and this one is at the top of my list so far. I'm not fond of the lack of buttons on this model, and I still miss the old slider phones, but overall, it can handle a lot of different apps at once and seems to be pretty durable. The only problem is, whenever you have an alarm or an incoming text/call, any program you're currently using will crash. Actually, the program will freeze for a couple of seconds, and then you'll get the communication request, and then you'll have to close whatever you were doing and re-launch it. However, this is a pretty cheap phone. And it runs on Verizon's network. So it's a good bet for the cost. Just don't expect anything stellar.

I had to get a different phone since my LG Power (aka LG Leon) was not compatible with Verizon CDMA.
That phone was $79, same as this one ON SALE. This phone, even though it has the exact same specs (ram, rom, processor etc), is at least three times slower than the LG, and I'm being generous. I also found out the hard way there is no simultaneous data and phone call capability, something I've had for 10 years starting with Sprint. There are loads of bugs, too, like the word "paste" pops up and you can't get rid of it. If you touch it, it pastes whatever is in your clipboard memory into whatever you're typing. Do not get this phone.

I got this phone because I'm on a budget and it had the exact same specs as my quick LG Power (cpu, sd card, os, internal memory, removable battery, ram etc), and because my LG was not available on Verizon. Big mistake. It freezes constantly, video is choppy, and when switching between apps they restart, presumably due to memory management. Worse, when I used a key chord (i.e. power, home and volume button at the same time) to restart the locked up phone, it factory reset itself with no confirmation. I lost everything. Yes, my apps reinstalled themselves later on, but I lost everything on internal memory including photos, music and all text messages. I won't make the same mistake again. No more Samsung at any price.

This is my first smart phone and although I'm climbing a learning curve, it seems to be full featured without the high monthly cost of a contract phone. The $45 monthly prepaid includes unlimited talk and text, but has limits on Web connectivity, so I only do downloads and such when connected to my Wi-Fi. Camera is great. Has GPS. My son plays games on it. And if I travel internationally, I can exchange the SIM card, as it's an unlocked phone. Comes with 8GB of storage, but can be expanded by adding a micro SD card.
# -*- coding: utf-8 -*-
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
#

import sys
import unittest
from copy import deepcopy

from airflow import configuration
from airflow.exceptions import AirflowException
from airflow.contrib.operators.ecs_operator import ECSOperator

try:
    from unittest import mock
except ImportError:
    try:
        import mock
    except ImportError:
        mock = None

RESPONSE_WITHOUT_FAILURES = {
    "failures": [],
    "tasks": [
        {
            "containers": [
                {
                    "containerArn": "arn:aws:ecs:us-east-1:012345678910:container/e1ed7aac-d9b2-4315-8726-d2432bf11868",
                    "lastStatus": "PENDING",
                    "name": "wordpress",
                    "taskArn": "arn:aws:ecs:us-east-1:012345678910:task/d8c67b3c-ac87-4ffe-a847-4785bc3a8b55"
                }
            ],
            "desiredStatus": "RUNNING",
            "lastStatus": "PENDING",
            "taskArn": "arn:aws:ecs:us-east-1:012345678910:task/d8c67b3c-ac87-4ffe-a847-4785bc3a8b55",
            "taskDefinitionArn": "arn:aws:ecs:us-east-1:012345678910:task-definition/hello_world:11"
        }
    ]
}


class TestECSOperator(unittest.TestCase):

    @mock.patch('airflow.contrib.operators.ecs_operator.AwsHook')
    def setUp(self, aws_hook_mock):
        configuration.load_test_config()

        self.aws_hook_mock = aws_hook_mock
        self.ecs = ECSOperator(
            task_id='task',
            task_definition='t',
            cluster='c',
            overrides={},
            aws_conn_id=None,
            region_name='eu-west-1')

    def test_init(self):
        self.assertEqual(self.ecs.region_name, 'eu-west-1')
        self.assertEqual(self.ecs.task_definition, 't')
        self.assertEqual(self.ecs.aws_conn_id, None)
        self.assertEqual(self.ecs.cluster, 'c')
        self.assertEqual(self.ecs.overrides, {})
        self.assertEqual(self.ecs.hook, self.aws_hook_mock.return_value)
        self.aws_hook_mock.assert_called_once_with(aws_conn_id=None)

    def test_template_fields_overrides(self):
        self.assertEqual(self.ecs.template_fields, ('overrides',))

    @mock.patch.object(ECSOperator, '_wait_for_task_ended')
    @mock.patch.object(ECSOperator, '_check_success_task')
    def test_execute_without_failures(self, check_mock, wait_mock):
        client_mock = self.aws_hook_mock.return_value.get_client_type.return_value
        client_mock.run_task.return_value = RESPONSE_WITHOUT_FAILURES

        self.ecs.execute(None)

        self.aws_hook_mock.return_value.get_client_type.assert_called_once_with('ecs', region_name='eu-west-1')
        client_mock.run_task.assert_called_once_with(
            cluster='c',
            launchType='EC2',
            overrides={},
            startedBy=mock.ANY,  # Can be 'airflow' or 'Airflow'
            taskDefinition='t'
        )

        wait_mock.assert_called_once_with()
        check_mock.assert_called_once_with()
        self.assertEqual(self.ecs.arn, 'arn:aws:ecs:us-east-1:012345678910:task/d8c67b3c-ac87-4ffe-a847-4785bc3a8b55')

    def test_execute_with_failures(self):
        client_mock = self.aws_hook_mock.return_value.get_client_type.return_value
        resp_failures = deepcopy(RESPONSE_WITHOUT_FAILURES)
        resp_failures['failures'].append('dummy error')
        client_mock.run_task.return_value = resp_failures
        with self.assertRaises(AirflowException):
            self.ecs.execute(None)

        self.aws_hook_mock.return_value.get_client_type.assert_called_once_with('ecs', region_name='eu-west-1')
        client_mock.run_task.assert_called_once_with(
            cluster='c',
            launchType='EC2',
            overrides={},
            startedBy=mock.ANY,  # Can be 'airflow' or 'Airflow'
            taskDefinition='t'
        )

    def test_wait_end_tasks(self):
        client_mock = mock.Mock()
        self.ecs.arn = 'arn'
        self.ecs.client = client_mock

        self.ecs._wait_for_task_ended()

        client_mock.get_waiter.assert_called_once_with('tasks_stopped')
        client_mock.get_waiter.return_value.wait.assert_called_once_with(cluster='c', tasks=['arn'])
        self.assertEquals(sys.maxsize, client_mock.get_waiter.return_value.config.max_attempts)

    def test_check_success_tasks_raises(self):
        client_mock = mock.Mock()
        self.ecs.arn = 'arn'
        self.ecs.client = client_mock
        client_mock.describe_tasks.return_value = {
            'tasks': [{
                'containers': [{
                    'name': 'foo',
                    'lastStatus': 'STOPPED',
                    'exitCode': 1
                }]
            }]
        }
        with self.assertRaises(Exception) as e:
            self.ecs._check_success_task()

        # Ordering of str(dict) is not guaranteed.
        self.assertIn("This task is not in success state ", str(e.exception))
        self.assertIn("'name': 'foo'", str(e.exception))
        self.assertIn("'lastStatus': 'STOPPED'", str(e.exception))
        self.assertIn("'exitCode': 1", str(e.exception))
        client_mock.describe_tasks.assert_called_once_with(cluster='c', tasks=['arn'])

    def test_check_success_tasks_raises_pending(self):
        client_mock = mock.Mock()
        self.ecs.client = client_mock
        self.ecs.arn = 'arn'
        client_mock.describe_tasks.return_value = {
            'tasks': [{
                'containers': [{
                    'name': 'container-name',
                    'lastStatus': 'PENDING'
                }]
            }]
        }
        with self.assertRaises(Exception) as e:
            self.ecs._check_success_task()

        # Ordering of str(dict) is not guaranteed.
        self.assertIn("This task is still pending ", str(e.exception))
        self.assertIn("'name': 'container-name'", str(e.exception))
        self.assertIn("'lastStatus': 'PENDING'", str(e.exception))
        client_mock.describe_tasks.assert_called_once_with(cluster='c', tasks=['arn'])

    def test_check_success_tasks_raises_multiple(self):
        client_mock = mock.Mock()
        self.ecs.client = client_mock
        self.ecs.arn = 'arn'
        client_mock.describe_tasks.return_value = {
            'tasks': [{
                'containers': [{
                    'name': 'foo',
                    'exitCode': 1
                }, {
                    'name': 'bar',
                    'lastStatus': 'STOPPED',
                    'exitCode': 0
                }]
            }]
        }
        self.ecs._check_success_task()
        client_mock.describe_tasks.assert_called_once_with(cluster='c', tasks=['arn'])

    def test_check_success_task_not_raises(self):
        client_mock = mock.Mock()
        self.ecs.client = client_mock
        self.ecs.arn = 'arn'
        client_mock.describe_tasks.return_value = {
            'tasks': [{
                'containers': [{
                    'name': 'container-name',
                    'lastStatus': 'STOPPED',
                    'exitCode': 0
                }]
            }]
        }
        self.ecs._check_success_task()
        client_mock.describe_tasks.assert_called_once_with(cluster='c', tasks=['arn'])


if __name__ == '__main__':
    unittest.main()
We’re exactly two months away from the release of COLD BLOODED by Toni Anderson – and we’re so excited to reveal the cover for you now! Romantic suspense lovers – this is the series for you! Check out the cover below and preorder your copy now!
# Reversi/Othello Board Game using Minimax and Alpha-Beta Pruning # https://en.wikipedia.org/wiki/Reversi # https://en.wikipedia.org/wiki/Computer_Othello # https://en.wikipedia.org/wiki/Minimax # https://en.wikipedia.org/wiki/Alpha%E2%80%93beta_pruning # https://en.wikipedia.org/wiki/Negamax # https://en.wikipedia.org/wiki/Principal_variation_search # FB36 - 20160831 import os, copy n = 8 # board size (even) board = [['0' for x in range(n)] for y in range(n)] # 8 directions dirx = [-1, 0, 1, -1, 1, -1, 0, 1] diry = [-1, -1, -1, 0, 0, 1, 1, 1] def InitBoard(): if n % 2 == 0: # if board size is even z = (n - 2) / 2 board[z][z] = '2' board[n - 1 - z][z] = '1' board[z][n - 1 - z] = '1' board[n - 1 - z][n - 1 - z] = '2' def PrintBoard(): m = len(str(n - 1)) for y in range(n): row = '' for x in range(n): row += board[y][x] row += ' ' * m print row + ' ' + str(y) print row = '' for x in range(n): row += str(x).zfill(m) + ' ' print row + '\n' def MakeMove(board, x, y, player): # assuming valid move totctr = 0 # total number of opponent pieces taken board[y][x] = player for d in range(8): # 8 directions ctr = 0 for i in range(n): dx = x + dirx[d] * (i + 1) dy = y + diry[d] * (i + 1) if dx < 0 or dx > n - 1 or dy < 0 or dy > n - 1: ctr = 0; break elif board[dy][dx] == player: break elif board[dy][dx] == '0': ctr = 0; break else: ctr += 1 for i in range(ctr): dx = x + dirx[d] * (i + 1) dy = y + diry[d] * (i + 1) board[dy][dx] = player totctr += ctr return (board, totctr) def ValidMove(board, x, y, player): if x < 0 or x > n - 1 or y < 0 or y > n - 1: return False if board[y][x] != '0': return False (boardTemp, totctr) = MakeMove(copy.deepcopy(board), x, y, player) if totctr == 0: return False return True minEvalBoard = -1 # min - 1 maxEvalBoard = n * n + 4 * n + 4 + 1 # max + 1 def EvalBoard(board, player): tot = 0 for y in range(n): for x in range(n): if board[y][x] == player: if (x == 0 or x == n - 1) and (y == 0 or y == n - 1): tot += 4 # corner elif (x == 0 or x == n - 1) or (y == 0 or y == n - 1): tot += 2 # side else: tot += 1 return tot # if no valid move(s) possible then True def IsTerminalNode(board, player): for y in range(n): for x in range(n): if ValidMove(board, x, y, player): return False return True def GetSortedNodes(board, player): sortedNodes = [] for y in range(n): for x in range(n): if ValidMove(board, x, y, player): (boardTemp, totctr) = MakeMove(copy.deepcopy(board), x, y, player) sortedNodes.append((boardTemp, EvalBoard(boardTemp, player))) sortedNodes = sorted(sortedNodes, key = lambda node: node[1], reverse = True) sortedNodes = [node[0] for node in sortedNodes] return sortedNodes def Minimax(board, player, depth, maximizingPlayer): if depth == 0 or IsTerminalNode(board, player): return EvalBoard(board, player) if maximizingPlayer: bestValue = minEvalBoard for y in range(n): for x in range(n): if ValidMove(board, x, y, player): (boardTemp, totctr) = MakeMove(copy.deepcopy(board), x, y, player) v = Minimax(boardTemp, player, depth - 1, False) bestValue = max(bestValue, v) else: # minimizingPlayer bestValue = maxEvalBoard for y in range(n): for x in range(n): if ValidMove(board, x, y, player): (boardTemp, totctr) = MakeMove(copy.deepcopy(board), x, y, player) v = Minimax(boardTemp, player, depth - 1, True) bestValue = min(bestValue, v) return bestValue def AlphaBeta(board, player, depth, alpha, beta, maximizingPlayer): if depth == 0 or IsTerminalNode(board, player): return EvalBoard(board, player) if maximizingPlayer: v = minEvalBoard for y in range(n): for x in range(n): if 
ValidMove(board, x, y, player): (boardTemp, totctr) = MakeMove(copy.deepcopy(board), x, y, player) v = max(v, AlphaBeta(boardTemp, player, depth - 1, alpha, beta, False)) alpha = max(alpha, v) if beta <= alpha: break # beta cut-off return v else: # minimizingPlayer v = maxEvalBoard for y in range(n): for x in range(n): if ValidMove(board, x, y, player): (boardTemp, totctr) = MakeMove(copy.deepcopy(board), x, y, player) v = min(v, AlphaBeta(boardTemp, player, depth - 1, alpha, beta, True)) beta = min(beta, v) if beta <= alpha: break # alpha cut-off return v def AlphaBetaSN(board, player, depth, alpha, beta, maximizingPlayer): if depth == 0 or IsTerminalNode(board, player): return EvalBoard(board, player) sortedNodes = GetSortedNodes(board, player) if maximizingPlayer: v = minEvalBoard for boardTemp in sortedNodes: v = max(v, AlphaBetaSN(boardTemp, player, depth - 1, alpha, beta, False)) alpha = max(alpha, v) if beta <= alpha: break # beta cut-off return v else: # minimizingPlayer v = maxEvalBoard for boardTemp in sortedNodes: v = min(v, AlphaBetaSN(boardTemp, player, depth - 1, alpha, beta, True)) beta = min(beta, v) if beta <= alpha: break # alpha cut-off return v def Negamax(board, player, depth, color): if depth == 0 or IsTerminalNode(board, player): return color * EvalBoard(board, player) bestValue = minEvalBoard for y in range(n): for x in range(n): if ValidMove(board, x, y, player): (boardTemp, totctr) = MakeMove(copy.deepcopy(board), x, y, player) v = -Negamax(boardTemp, player, depth - 1, -color) bestValue = max(bestValue, v) return bestValue def NegamaxAB(board, player, depth, alpha, beta, color): if depth == 0 or IsTerminalNode(board, player): return color * EvalBoard(board, player) bestValue = minEvalBoard for y in range(n): for x in range(n): if ValidMove(board, x, y, player): (boardTemp, totctr) = MakeMove(copy.deepcopy(board), x, y, player) v = -NegamaxAB(boardTemp, player, depth - 1, -beta, -alpha, -color) bestValue = max(bestValue, v) alpha = max(alpha, v) if alpha >= beta: break return bestValue def NegamaxABSN(board, player, depth, alpha, beta, color): if depth == 0 or IsTerminalNode(board, player): return color * EvalBoard(board, player) sortedNodes = GetSortedNodes(board, player) bestValue = minEvalBoard for boardTemp in sortedNodes: v = -NegamaxABSN(boardTemp, player, depth - 1, -beta, -alpha, -color) bestValue = max(bestValue, v) alpha = max(alpha, v) if alpha >= beta: break return bestValue def Negascout(board, player, depth, alpha, beta, color): if depth == 0 or IsTerminalNode(board, player): return color * EvalBoard(board, player) firstChild = True for y in range(n): for x in range(n): if ValidMove(board, x, y, player): (boardTemp, totctr) = MakeMove(copy.deepcopy(board), x, y, player) if not firstChild: score = -Negascout(boardTemp, player, depth - 1, -alpha - 1, -alpha, -color) if alpha < score and score < beta: score = -Negascout(boardTemp, player, depth - 1, -beta, -score, -color) else: firstChild = False score = -Negascout(boardTemp, player, depth - 1, -beta, -alpha, -color) alpha = max(alpha, score) if alpha >= beta: break return alpha def NegascoutSN(board, player, depth, alpha, beta, color): if depth == 0 or IsTerminalNode(board, player): return color * EvalBoard(board, player) sortedNodes = GetSortedNodes(board, player) firstChild = True for boardTemp in sortedNodes: if not firstChild: score = -NegascoutSN(boardTemp, player, depth - 1, -alpha - 1, -alpha, -color) if alpha < score and score < beta: score = -NegascoutSN(boardTemp, player, depth - 1, -beta, 
                                 -score, -color)
        else:
            firstChild = False
            score = -NegascoutSN(boardTemp, player, depth - 1, -beta, -alpha, -color)
        alpha = max(alpha, score)
        if alpha >= beta:
            break
    return alpha

def BestMove(board, player):
    maxPoints = 0
    mx = -1; my = -1
    for y in range(n):
        for x in range(n):
            if ValidMove(board, x, y, player):
                (boardTemp, totctr) = MakeMove(copy.deepcopy(board), x, y, player)
                if opt == 0:
                    points = EvalBoard(boardTemp, player)
                elif opt == 1:
                    points = Minimax(boardTemp, player, depth, True)
                elif opt == 2:
                    # was passing the unmodified board here; the child
                    # position (boardTemp) is the node to evaluate
                    points = AlphaBeta(boardTemp, player, depth, minEvalBoard, maxEvalBoard, True)
                elif opt == 3:
                    points = Negamax(boardTemp, player, depth, 1)
                elif opt == 4:
                    points = NegamaxAB(boardTemp, player, depth, minEvalBoard, maxEvalBoard, 1)
                elif opt == 5:
                    points = Negascout(boardTemp, player, depth, minEvalBoard, maxEvalBoard, 1)
                elif opt == 6:
                    # same fix as opt == 2: evaluate the child position
                    points = AlphaBetaSN(boardTemp, player, depth, minEvalBoard, maxEvalBoard, True)
                elif opt == 7:
                    points = NegamaxABSN(boardTemp, player, depth, minEvalBoard, maxEvalBoard, 1)
                elif opt == 8:
                    points = NegascoutSN(boardTemp, player, depth, minEvalBoard, maxEvalBoard, 1)
                if points > maxPoints:
                    maxPoints = points
                    mx = x; my = y
    return (mx, my)

print 'REVERSI/OTHELLO BOARD GAME'
print '0: EvalBoard'
print '1: Minimax'
print '2: Minimax w/ Alpha-Beta Pruning'
print '3: Negamax'
print '4: Negamax w/ Alpha-Beta Pruning'
print '5: Negascout (Principal Variation Search)'
print '6: Minimax w/ Alpha-Beta Pruning w/ Sorted Nodes'
print '7: Negamax w/ Alpha-Beta Pruning w/ Sorted Nodes'
print '8: Negascout (Principal Variation Search) w/ Sorted Nodes'
opt = int(raw_input('Select AI Algorithm: '))
if opt > 0 and opt < 9:
    depth = 4
    depthStr = raw_input('Select Search Depth (DEFAULT: 4): ')
    if depthStr != '':
        depth = int(depthStr)  # was int(depth), which ignored the input
print '\n1: User 2: AI (Just press Enter for Exit!)'
InitBoard()
while True:
    for p in range(2):
        print
        PrintBoard()
        player = str(p + 1)
        print 'PLAYER: ' + player
        if IsTerminalNode(board, player):
            print 'Player cannot play! Game ended!'
            print 'Score User: ' + str(EvalBoard(board, '1'))
            print 'Score AI : ' + str(EvalBoard(board, '2'))
            os._exit(0)
        if player == '1': # user's turn
            while True:
                xy = raw_input('X Y: ')
                if xy == '': os._exit(0)
                (x, y) = xy.split()
                x = int(x); y = int(y)
                if ValidMove(board, x, y, player):
                    (board, totctr) = MakeMove(board, x, y, player)
                    print '# of pieces taken: ' + str(totctr)
                    break
                else:
                    print 'Invalid move! Try again!'
        else: # AI's turn
            (x, y) = BestMove(board, player)
            if not (x == -1 and y == -1):
                (board, totctr) = MakeMove(board, x, y, player)
                print 'AI played (X Y): ' + str(x) + ' ' + str(y)
                print '# of pieces taken: ' + str(totctr)
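A quick sanity check on the maxEvalBoard bound used above: a board completely filled with one player's pieces scores 4 for each of the 4 corners, 2 for each of the 4n - 8 remaining edge cells, and 1 for each of the (n - 2)^2 interior cells, which sums to n*n + 4*n + 4; the extra +1 makes the sentinel sit strictly above any reachable score.

# Worked check of the EvalBoard upper bound for the default board size
n = 8
full_board = 4 * 4 + (4 * n - 8) * 2 + (n - 2) ** 2 * 1
assert full_board == n * n + 4 * n + 4  # 100 for n = 8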
A clump-forming woodland perennial that grows wild in eastern North America, from northern Quebec to the southern parts of the United States, through the Appalachian Mountains into northernmost Georgia and west to Minnesota. It is also found on Vancouver Island.

Trillium grandiflorum is a 'spring ephemeral' with a life-cycle that is synchronised with its favoured habitat, namely deciduous woodland. This means that growth, and in particular flowering, takes place in early spring before the leaf canopy above comes into leaf and light levels on the forest floor radically decline; the familiar English Bluebell has precisely the same growth strategy, with an early spring flowering.

Trillium grandiflorum produces a very characteristic whorl of three oval leaves with pointed tips on a stem about 10cm or so long, followed shortly thereafter, during April or May, by an erect, large, three-petalled pure white flower that is devoid of fragrance. The flowers are held above the foliage on a short stalk (pedicel) and are tinged with pink as they age, although this colouration should not be mistaken for the very rare variant that has pink flowers throughout the growth cycle.

First described to science in 1803, Trillium grandiflorum will grow most happily in well-drained, neutral to slightly acid soils, under the protective canopy of deciduous trees or shrubs, conditions that simulate as closely as possible its native wild habitat. Soils that are rich in leaf litter and other organic matter are preferred, so if necessary dig in well-rotted garden compost before planting. Trillium grandiflorum is very slow growing and the plants we sell are always several years old and are either of sufficient maturity to have initiated a flowering cycle, or are very close to such an age.

Trillium grandiflorum, although occasionally locally very abundant, is now considered under significant threat in the wild, with some populations entirely extirpated. It is now thought that bumble bees are the primary pollination agent, and it has also been shown that ants play a critical role in distributing seeds, possibly because Trillium grandiflorum has evolved a mechanism to fool the ants into treating the seeds as an animal corpse which they then carry back to their nests. In addition to the seed dispersal advantage this confers, it also means that Trillium seeds can germinate well below the soil surface, which increases the likelihood that the slow-developing rhizome will survive the several years it takes to reach flowering age.

Awarded the RHS Award of Garden Merit, designated as the provincial emblem of Ontario and the state wild flower of Ohio. With dark to mid-green basal leaves. White flowers in spring, stalked with three petals, often fading to pink. Deep shade and humus-rich soil preferred. Fully hardy. Eastern N. America. H: 40cm.
# -*- coding: utf-8 -*-
'''
    :codeauthor: :email:`Rahul Handay <[email protected]>`
'''

# Import Python libs
from __future__ import absolute_import
import os

# Import Salt Libs
from salt.states import hg

# Import Salt Testing Libs
from salttesting import skipIf, TestCase
from salttesting.helpers import ensure_in_syspath
from salttesting.mock import (
    NO_MOCK,
    NO_MOCK_REASON,
    MagicMock,
    patch
)

ensure_in_syspath('../../')

hg.__opts__ = {}


@skipIf(NO_MOCK, NO_MOCK_REASON)
class HgTestCase(TestCase):
    '''
        Validate the hg state
    '''
    def test_latest(self):
        '''
            Test to make sure the repository is cloned to
            the given directory and is up to date
        '''
        ret = {'changes': {}, 'comment': '', 'name': 'salt', 'result': True}
        mock = MagicMock(return_value=True)
        with patch.object(hg, '_fail', mock):
            self.assertTrue(hg.latest("salt"))

        mock = MagicMock(side_effect=[False, True, False, False, False, False])
        with patch.object(os.path, 'isdir', mock):
            mock = MagicMock(return_value=True)
            with patch.object(hg, '_handle_existing', mock):
                self.assertTrue(hg.latest("salt", target="c:\\salt"))

            with patch.dict(hg.__opts__, {'test': True}):
                mock = MagicMock(return_value=True)
                with patch.object(hg, '_neutral_test', mock):
                    self.assertTrue(hg.latest("salt", target="c:\\salt"))

            with patch.dict(hg.__opts__, {'test': False}):
                mock = MagicMock(return_value=True)
                with patch.object(hg, '_clone_repo', mock):
                    self.assertDictEqual(hg.latest("salt", target="c:\\salt"),
                                         ret)


if __name__ == '__main__':
    from integration import run_tests
    run_tests(HgTestCase, needs_daemon=False)
BC Management's Current Job Activity. New Jobs Are Listed Below! - BC Management, Inc.

PLEASE REVIEW THE COMPLETE JOB DESCRIPTION AND APPLY TO THE JOBS VIA THIS LINK (http://www.bcmanagement.com/searchjobs.html). ENTER THE CORRESPONDING JOB NUMBERS BELOW.

Looking for your next ideal job? Register with BC Management (http://www.bcmanagement.com/my-profile.html) today and stay informed through our BCM Blog (http://www.bcmanagement.com/bcm-blog.html) to ensure you don't miss that next BC/DR job opportunity.

· Senior Disaster Recovery Specialist (Job #2550) – Houston, TX. *Client is seeking a candidate who has 5+ years of DR expertise.
· Director Business Continuity (Job #2541) – Santa Clara, CA. *Successful candidates must possess 8-10+ years of experience managing an enterprise-wide BC/CM program within a financial institution, along with proven experience as the responsible person during multiple crisis situations.
· Sr. Risk Analyst (Job #2542) – Fort Lauderdale, FL. *Seeking candidates with 7 years of expertise in insurance coupled with minimal business continuity expertise. The main focus of this position will be to provide support for the management of the company's insurance programs, as well as serve as a backup and support for the company's business continuity and critical incident planning. Also responsible for risk identification and analysis to support the business units of the organization.
· Disaster Recovery Consultant (Job #2554) – northern NJ, New York City, Boston, Washington DC, Chicago, Atlanta or Toronto. *Candidates must have 3-5 years in a technology consulting, operations or architecture capacity. This is a technical, hands-on position.
· Business Risk Consultant (Job #2544) – San Francisco, CA or NY, NY. Local Candidates Only. *Candidates must have a solid financial analysis consulting background as a primary skill set with BCP expertise as a secondary qualification. Client company is seeking candidates from a Big Accounting Consulting background, e.g., D&T, PWC, KPMG, Protiviti, etc. The BIAs conducted for clients are more focused on the financial risk assessment versus the standard BCP BIA approach.
· Business Continuity Management Consultant (Job #2546) – NYC, NY. Local Candidates Only. *Seeking candidates who have previously worked for a Big 4 consulting company (Deloitte & Touche, KPMG, Accenture, PricewaterhouseCoopers, etc) or a large consulting company. Candidates who are fluent in Spanish and English are highly desirable.
· IT Architect – Disaster Recovery (Job #2547) – San Francisco, CA or Reston, VA. Local Candidates Only. *This role is highly technical. The successful candidate must have an exceptional technical understanding of DR architecture design, storage technologies and high availability solutions.
· Disaster Recovery Manager Senior (Job #2412) – Minneapolis/St. Paul, MN. Relocation Assistance Provided. *Client is seeking a candidate who has either built a DR program from scratch or successfully augmented an effective best-in-class DR program. The candidate must bring real world experience with the ability to clearly articulate what did and did not work in previous programs and why. Experience with highly resilient infrastructure environments is a primary requirement for this position. Must have recent experience designing high availability solutions within multiple platforms within a large IT enterprise environment.
Seeking candidates who have come from a large enterprise financial services environment. Candidates must have exceptional, proven staff management skills.
from datetime import datetime import os import requests import urlparse import unittest import json from subprocess import call, check_output DAEMON = os.environ.get('DAEMON', 'authoritative') class ApiTestCase(unittest.TestCase): def setUp(self): # TODO: config self.server_address = '127.0.0.1' self.server_port = int(os.environ.get('WEBPORT', '8084')) self.server_url = 'http://%s:%s/' % (self.server_address, self.server_port) self.session = requests.Session() self.session.auth = ('foo', os.environ.get('APIKEY', 'super')) #self.session.keep_alive = False # self.session.headers = {'X-API-Key': os.environ.get('APIKEY', 'changeme-key'), 'Origin': 'http://%s:%s' % (self.server_address, self.server_port)} def writeFileToConsole(self, file): fp = open(file) cmds_nl = fp.read() # Lua doesn't need newlines and the console gets confused by them e.g. # function definitions cmds = cmds_nl.replace("\n", " ") return call(["../wforce", "-c", "../wforce.conf", "-e", cmds]) def writeCmdToConsole(self, cmd): return check_output(["../wforce", "-c", "../wforce.conf", "-e", cmd]) def allowFunc(self, login, remote, pwhash): return self.allowFuncAttrs(login, remote, pwhash, {}) def allowFuncAttrs(self, login, remote, pwhash, attrs): payload = dict() payload['login'] = login payload['remote'] = remote payload['pwhash'] = pwhash payload['attrs'] = attrs return self.session.post( self.url("/?command=allow"), data=json.dumps(payload), headers={'Content-Type': 'application/json'}) def reportFunc(self, login, remote, pwhash, success): return self.reportFuncAttrs(login, remote, pwhash, success, {}) def reportFuncAttrs(self, login, remote, pwhash, success, attrs): payload = dict() payload['login'] = login payload['remote'] = remote payload['pwhash'] = pwhash payload['success'] = success payload['attrs'] = attrs return self.session.post( self.url("/?command=report"), data=json.dumps(payload), headers={'Content-Type': 'application/json'}) def resetFunc(self, login, ip): payload = dict() payload['login'] = login payload['ip'] = ip return self.session.post( self.url("/?command=reset"), data=json.dumps(payload), headers={'Content-Type': 'application/json'}) def pingFunc(self): return self.session.get(self.url("/?command=ping")) def url(self, relative_url): return urlparse.urljoin(self.server_url, relative_url) def assert_success_json(self, result): try: result.raise_for_status() except: print result.content raise self.assertEquals(result.headers['Content-Type'], 'application/json')
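To show how this base class is meant to be used, here is a hypothetical concrete test built on it; the login, IP address, password hash and the 'status' response field are assumptions about the wforce policy under test, not taken from the chunk above.

class ExamplePolicyTest(ApiTestCase):
    # Hypothetical scenario: several failed logins are reported for one
    # user, after which the allow command is expected to stop returning
    # an allow status of 0.
    def test_reject_after_failed_logins(self):
        for _ in range(10):
            r = self.reportFunc('testuser', '192.0.2.1', 'deadbeef', False)
            self.assert_success_json(r)
        r = self.allowFunc('testuser', '192.0.2.1', 'deadbeef')
        self.assert_success_json(r)
        self.assertNotEqual(r.json()['status'], 0)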
Twitter is the world’s most important social network. That might sound like the ravings of an addict, but look at the headlines in every morning’s newspaper and the obsessions of every evening’s cable news broadcast. Just about anything you encounter in the news media these days has some foot in the controversies and conversations occurring on the 140-character network. Yet for all its influence, Twitter, as a company, is in trouble. Big trouble. The network has always been too strange and too difficult for new users to get the hang of, and its growth has been slowing for a while. Now, according to an earnings report released on Wednesday, user growth has ground to a halt. Tech investors have not been in a forgiving mood lately for any company, and Jack Dorsey, Twitter’s co-founder and recently reinstated chief executive, is unlikely to enjoy much grace. So what’s to be done? It’s time for Twitter to consider something radical. More than two years ago, the company floated its shares on the New York Stock Exchange. On its first day of trading, investors valued Twitter at nearly $32 billion, a price that established a certain set of expectations: that Twitter would keep altering its service to attract mainstream users, and that its ad business would continue to grow at a breakneck pace. Wall Street has only one template of success for an Internet company: Google and, later, Facebook. By filing for an initial public offering, Twitter was telling the world that it was part of the same club — that there was no upper bound to its business aims, and that it would try to build a money machine that matched the size and importance of its service. But what if the best path for Twitter, as a service, is for Twitter the company to abandon that dream? What if becoming a $25 billion, $50 billion or $100 billion world-swallowing Internet giant just isn’t in the cards for a niche service like Twitter? Perhaps there’s more promise in a future as an independent but private company; as a small and sustainable division of some larger tech or media conglomerate; or even as a venture that operates more like a nonprofit foundation. Even if Twitter intends to remain a public corporation — because there does not seem to be much appetite, nor a very obvious mechanism, for investors to take it private — it’s time for Mr. Dorsey to reset expectations for what his company can become. Twitter should think of itself and portray itself to investors as more of a public utility than as a business that never stops growing, and that could ever hope to approach the market value of Facebook. This does not mean that Twitter should abandon making money. Twitter can still run a fine business that generates enough to maintain and improve the reach of the service. But every Internet company doesn’t have to be the biggest, boldest, always-growing entity we are conditioned to expect. Instead of aiming for something like Facebook, Twitter could mold itself on some other template for success. It could become a venture like Wikipedia, run by a nonprofit that depends on donations, or a business like The New York Times Company, a publicly traded enterprise controlled by a family that has a preference for journalistic ambition. In other words, Twitter should make clear that there are limits to the scope of its business ambitions, and that it is guided by a philosophical bias for the health of the service over an ambition to grow at any cost. There’s a simple reason for Mr. 
Dorsey to consider some other path: Implausible expectations are a setup for a bleak future. After losing its dominance as a search engine, Yahoo has faced more than a decade of struggle mainly because it has tried too long to become the one-stop portal that it isn't. In that effort it has squandered talent and money and run through more chief executives than you'd find at a Brooks Brothers sale. If, instead of pursuing the moon, Yahoo had vastly lowered its ambitions and planned to do one or two things really well, it could have found a sustainable path forward. Clarifying Twitter's mission would be painful for the people who work at the company. There could be more layoffs and a plummeting of the equity they hold in the company if it were to go private. It would also be painful for investors who bought into the company expecting it to be the next big thing. But limiting Twitter's scope would almost certainly be better for the service. Twitter needs to improve its product: It should have better ways of addressing abuse and harassment, and it should be easier for users to find tweets about topics that interest them. But it isn't clear that the solutions to these problems are found in growth. Setting a cap on Twitter's business scope could also tame some of the overreactions that occur whenever the company tries to make reasonable improvements to its service. This week, in anticipation of what turned out to be a modest feature that recommends interesting tweets to people, thousands of users on the service lost their minds predicting that Twitter as we know it would soon be dead. I suspect they are leery because they see a company being pushed to change by Wall Street, for reasons that aren't exactly obvious to the people who now love and value Twitter as it is. They see their favorite band straining to become U2. And they hate U2. Twitter, as it is right now, is mostly pretty good for lots and lots of people. Facebook has five times as many users, Instagram is as luxurious as a European vacation, and Snapchat is too cool for school. But because Twitter is an accessible, real-time network that has become the nerve center of the world's journalists, politicians, activists and agitators, it has, for better or worse, demonstrated an unrivaled capacity to influence real things in the real world. But even beyond these movements, look at all those hashtags littering every Super Bowl ad. Or more simply, look at the thrum of minute-by-minute Twitter commentary that now shapes how any news event or TV show is received. If people don't tweet it, did it even happen? Of course, Twitter could be better. But it does not need a wholesale overhaul just for the sake of fueling growth. "Twitter maybe hasn't evolved the product to the point that Wall Street analysts might like, but the utility that Twitter has for providing real-time news is real, and it hasn't really been disrupted by anyone else," said Mike Jones, the chief executive of a start-up incubator named Science Media. Mr. Jones used to be the chief executive of Myspace, the once-giant social network, but he rejected any comparison between Twitter and that now vastly diminished network. "For me and for lots of people, Twitter has actual utility to it, and for those people, that's what will keep it around," he said.
# Copyright 2013 OpenStack, LLC.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Installation script for python-openstackclient's development virtualenv
"""

import os
import sys

import install_venv_common as install_venv


def print_help():
    help = """
    python-openstackclient development environment setup is complete.

    python-openstackclient development uses virtualenv to track and manage
    Python dependencies while in development and testing.

    To activate the python-openstackclient virtualenv for the extent of your
    current shell session you can run:

    $ source .venv/bin/activate

    Or, if you prefer, you can run commands in the virtualenv on a case by
    case basis by running:

    $ tools/with_venv.sh <your command>

    Also, make test will automatically use the virtualenv.
    """
    print help


def main(argv):
    root = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
    venv = os.path.join(root, ".venv")
    pip_requires = os.path.join(root, "requirements.txt")
    test_requires = os.path.join(root, "test-requirements.txt")
    py_version = "python%s.%s" % (sys.version_info[0], sys.version_info[1])
    project = "python-openstackclient"
    install = install_venv.InstallVenv(root, venv, pip_requires, test_requires,
                                       py_version, project)
    options = install.parse_args(argv)
    install.check_python_version()
    install.check_dependencies()
    install.create_virtualenv(no_site_packages=options.no_site_packages)
    install.install_dependencies()
    print_help()


if __name__ == "__main__":
    main(sys.argv)
O’Brien is a commercial real estate broker engaged in general brokerage as well as tenant/buyer representation. A partner in a combined portfolio of over $40 million in development and rehabilitation work, he demonstrates a clear understanding of all facets of real estate, from approval through Certificate of Occupancy. O’Brien is a leader in the commercial brokerage field who consistently seeks challenging projects, and he has brokered over $250,000,000 in transactions in the past 18 years. Representing local, regional and national clients, Daniel achieved Prudential’s prestigious “Chairman’s Circle” designation in 1999, 2000, 2001 and 2002, as well as Prudential Real Estate Affiliates’ “Pinnacle Award,” which recognized him as the company’s top producer in the nation for commercial real estate in 2001.
#!/cygdrive/c/Anaconda3/python.exe
import os

import requests
from bs4 import BeautifulSoup, SoupStrainer
from PyQt4.QtCore import QThread

from logger import Logger


class ScotusParser(QThread):

    def __init__(self, name='Parser', log_dir='.', save_dir='.', parent=None):
        QThread.__init__(self, parent)
        self.name = name
        self.log_dir = os.path.realpath(log_dir)
        self.save_dir = os.path.realpath(save_dir)
        self.log = Logger(name=self.name, save_dir=self.log_dir)
        self.base_url = 'http://supremecourt.gov'

    @staticmethod
    def fmt_name(inp):
        """Cleans up file names/removes explicitly disallowed characters."""
        not_allowed_chars = '<>:"/\\|?*'  # explicitly not allowed characters
        for char in not_allowed_chars:
            inp = inp.replace(char, '')
        for char in ', ':  # I personally don't want these (spaces and commas)
            inp = inp.replace(char, '')
        inp = inp.replace('..', '.').replace('v.', '_v_')  # 'v.' looks weird
        return os.path.normpath(inp)

    def make_dir(self, dir_path):
        if not os.path.exists(dir_path) or os.path.isfile(dir_path):
            self.log('Directory does not exist: {}'.format(dir_path))
            try:
                os.makedirs(dir_path)
            except OSError:
                self.log('Problem creating: {}'.format(dir_path))
                return False
            else:
                self.log('Directory created: {}'.format(dir_path))
                return True
        else:
            self.log('Directory exists: {}'.format(dir_path))
            return True

    def argument_audio(self, year):
        year = int(year)
        pairs = []
        self.log('Finding argument audio for {}'.format(year))
        audio = {'media': '/'.join([self.base_url, 'media/audio/mp3files']),
                 'dir': os.path.join(self.save_dir, str(year), 'Argument_Audio'),
                 'url': '/'.join([self.base_url, 'oral_arguments/argument_audio', str(year)]),
                 'search': '../audio/{year}/'.format(year=year)}
        if not self.make_dir(audio['dir']):
            return pairs
        res = requests.get(audio['url'])
        if res.status_code == 404:
            self.log('Got 404 from {}'.format(audio['url']))
            return pairs
        soup = BeautifulSoup(res.content, 'lxml')
        for rows in soup('tr'):
            for a in rows('a', class_=None):
                if audio['search'] in a.get('href'):
                    docket = a.string
                    name = rows.find('span').string
                    name = self.fmt_name('{}-{}.mp3'.format(docket, name))
                    file_path = os.path.join(audio['dir'], name)
                    url = '/'.join([audio['media'], '{}.mp3'.format(docket)])
                    pairs.append((url, file_path))
                    url = url.replace(self.base_url, '')
                    file_path = os.path.relpath(file_path)
                    self.log('Found: ({url}, {file})'.format(url=url, file=file_path))
        return pairs

    def slip_opinions(self, year):
        year = int(year)
        pairs = []
        self.log('Finding slip opinions for {}'.format(year))
        slip = {'dir': os.path.join(self.save_dir, str(year), 'Slip Opinions'),
                'url': '/'.join([self.base_url, 'opinions', 'slipopinion', str(year - 2000)]),
                'filter': SoupStrainer('table', class_='table table-bordered')}
        if not self.make_dir(slip['dir']):
            return pairs
        res = requests.get(slip['url'])
        if res.status_code == 404:
            self.log('Got 404 from {}'.format(slip['url']))
            return pairs
        soup = BeautifulSoup(res.content, 'lxml', parse_only=slip['filter'])
        for rows in soup('tr'):
            docket, name = None, None
            for i, cell in enumerate(rows('td')):
                if i == 2:
                    docket = cell.string
                elif i == 3:
                    a = cell.find('a')
                    link = a.get('href')
                    url = ''.join([self.base_url, link])
                    name = a.string
            if docket and name:
                file_name = self.fmt_name('{}-{}.pdf'.format(docket, name))
                file_path = os.path.join(slip['dir'], file_name)
                pairs.append((url, file_path))
                url = url.replace(self.base_url, '')
                file_path = os.path.relpath(file_path)
                self.log('Found: ({url}, {file})'.format(url=url, file=file_path))
        return pairs

    def argument_transcripts(self, year):
        year = int(year)
        pairs = []
        self.log('Finding argument transcripts for {}'.format(year))
        script = {'dir': os.path.join(self.save_dir, str(year), 'Argument Transcripts'),
                  'url': '/'.join([self.base_url, 'oral_arguments', 'argument_transcript', str(year)]),
                  'search': '../argument_transcripts/'}
        if not self.make_dir(script['dir']):
            return pairs
        res = requests.get(script['url'])
        if res.status_code == 404:
            self.log('Got 404 from {}'.format(script['url']))
            return pairs
        soup = BeautifulSoup(res.content, 'lxml')
        for cell in soup('td'):
            for a in cell('a'):
                if script['search'] in a.get('href'):
                    link = a.get('href').replace('../', '')
                    docket = link.replace('argument_transcripts/', '')
                    docket = docket.replace('.pdf', '')
                    url = '/'.join([self.base_url, 'oral_arguments', link])
                    name = cell.find('span').string
                    file_name = self.fmt_name('{}-{}.pdf'.format(docket, name))
                    file_path = os.path.join(script['dir'], file_name)
                    pairs.append((url, file_path))
                    url = url.replace(self.base_url, '')
                    file_path = os.path.relpath(file_path)
                    self.log('Found: ({url}, {file})'.format(url=url, file=file_path))
        return pairs


if __name__ == '__main__':
    p = ScotusParser(log_dir='logs', save_dir='SCOTUS')
    p.argument_audio(2015)
    p.slip_opinions(2015)
    p.argument_transcripts(2015)
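Note that the parser methods only collect (url, file_path) pairs; nothing in the script actually fetches the files. Below is a minimal downloader sketch that consumes those pairs — the function name and the chunk size are my own choices rather than part of the original script, and it reuses the os and requests imports already present:

def download_pairs(pairs, chunk_size=8192):
    # Fetch each (url, file_path) pair produced by the parser methods,
    # skipping files that already exist so re-runs are cheap.
    for url, file_path in pairs:
        if os.path.exists(file_path):
            continue
        res = requests.get(url, stream=True)
        if res.status_code != 200:
            continue
        with open(file_path, 'wb') as fh:
            for chunk in res.iter_content(chunk_size=chunk_size):
                fh.write(chunk)

# Example usage, e.g. appended to the __main__ block above:
# download_pairs(p.argument_audio(2015))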
I remember doing that triple brick with you and Bri, seems easy on the first leg. The last leg you realize just how bad it hurts, especially with Bri walking away from you on the run like you are standing still. Yikes, somebody slow that girl down! I found your blog doing searches for Martial Arts schools in the Raleigh area. I see you're a student at Kung Fu Center in Raleigh. How do you like it? It is a good school, Master Chen is a great teacher. The classes are challenging - we do a warm up, a series of kicks & punches, and then work on forms. You should check a class out sometime. Actually, I'm holding Tassie's paw while taking her picture and SHE is watching YOU!
# Author : Hoang NT
# Date   : 2016-03-08
#
# Simple test case for Update.py

import numpy as np
import tensorflow as tf

import Update as ud

# Made-up matrices for testing:
#   A is a 5-by-5 adjacency matrix
#   O is a 5-by-5 prior-knowledge matrix
#   D is the 5-by-5 diagonal (degree) matrix of O
#   k = 4   - suppose we want 4 clusters
#   l = 0.5 - lambda, the trade-off between topology (A) and prior knowledge (O)
A = np.array([[0, 1, 1, 0, 1],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 0, 1],
              [0, 1, 0, 0, 1],
              [1, 0, 1, 1, 0]])
O = np.array([[1, 1, 1, 1, 1],
              [1, 0, 1, 1, 1],
              [1, 1, 1, 1, 0],
              [1, 1, 1, 1, 1],
              [1, 1, 0, 1, 1]])
D = np.diag(O.sum(axis=1))
k = 4
l = 0.5  # lambda
iterations = 100

# Create a tensorflow graph and add nodes
graph = tf.Graph()
updater = ud.UpdateElem()

# Add the updating-rule computation node to the graph.
# Look at the UpdateElem class (Update.py) for more detail.
updater.add_semi_supervised_rule(A, O, D, k, l)

# Create a session to run
sess = tf.Session()
init = tf.initialize_all_variables()  # pre-1.0 TensorFlow API
sess.run(init)

# Check the initial values of W and H
print(sess.run(updater._W))
print(sess.run(updater._H))

# Update the matrices
for _ in range(iterations):
    # Compute the new values for W and H
    sess.run([updater.update_H_node(), updater.update_W_node()])
    # Assign W and H to the new values
    sess.run([updater.assign_H_node(), updater.assign_W_node()])

# Print results
print('Final result for %d iterations' % iterations)
print(sess.run(updater._W))
print(sess.run(updater._H))
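The loop above runs a fixed 100 iterations with no stopping criterion. A simple sanity check is to watch the reconstruction error shrink as the update rules iterate. The sketch below assumes the factorization has the usual NMF form A ≈ W·Hᵀ and that the _W and _H variables remain readable as in the script — both are assumptions about Update.py's internals, not guarantees:

# Hypothetical convergence check: track the Frobenius norm ||A - W H^T||
# each iteration and stop once it plateaus (A ~ W H^T is an assumption
# about the model implemented in Update.py).
prev_err = float('inf')
for i in range(iterations):
    sess.run([updater.update_H_node(), updater.update_W_node()])
    sess.run([updater.assign_H_node(), updater.assign_W_node()])
    W, H = sess.run([updater._W, updater._H])
    err = np.linalg.norm(A - W.dot(H.T))
    if abs(prev_err - err) < 1e-6:  # error has stopped improving
        print('Converged after %d iterations (error %.6f)' % (i + 1, err))
        break
    prev_err = err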
CloudOne.mobi and the TallOrder team pride ourselves on providing the owners of small and medium-sized businesses with a point of sale solution that empowers their business to succeed. One such empowered business owner is Shameela Olney of Passion Wholesalers Cape Quarter. She is a true representation of someone who is genuinely passionate about adding value to people’s lives! After years of working in a corporate environment, Shameela felt that she was better suited to running her own business. Passion Wholesalers provides quality, high-fashion clothing at affordable prices and has gathered a loyal customer base thanks to its personalised approach to retail. Shameela assures everyone that after visiting the store, you’ll tell all your friends about it – and she’s right. That’s exactly what we did! The TallOrder marketing team spent a morning with Shameela at her store and we were blown away by her positive attitude and bubbly personality. She even managed to convince some of us to buy new items for our wardrobes! Shameela showed us exactly how beneficial TallOrder POS has been to her business. As a retail store, Passion Wholesalers makes use of TallOrder’s Inventory and Barcode Scanning features, as well as our new Barcode Printing capabilities. While we were there, a customer came to the store and paid for their item with TallOrder’s Lay-by feature. TallOrder is built to make the lives of business owners like Shameela easier. Shameela calls herself a TallOrder ambassador, and we could not be prouder to have her as part of the TallOrder family. Getting to know one happy customer at a time. Visit TallOrder POS for more information.