#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Copyright 2016 planc2c.com
# [email protected]
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import tornado.web
import logging
import uuid
import time
import re
import json as JSON  # alias JSON so it does not clash with local variables named json inside handler methods
import sys
import os
sys.path.insert(0, os.path.join(os.path.dirname(__file__), "../"))
sys.path.insert(0, os.path.join(os.path.dirname(__file__), "../dao"))
from tornado.escape import json_encode, json_decode
from tornado.httpclient import HTTPClient
from tornado.httputil import url_concat
from bson import json_util
from comm import *
from dao import budge_num_dao
from dao import category_dao
from dao import activity_dao
from dao import group_qrcode_dao
from dao import cret_template_dao
from dao import bonus_template_dao
from dao import bonus_dao
from dao import apply_dao
from dao import order_dao
from dao import vendor_member_dao
from dao import voucher_dao
from dao import insurance_template_dao
from dao import contact_dao
from dao import vendor_hha_dao
from dao import voucher_pay_dao
from dao import vendor_wx_dao
from dao import voucher_order_dao
from dao import trip_router_dao
from dao import triprouter_share_dao
from dao import club_dao
from dao import activity_share_dao
from foo.wx import wx_wrap
from xml_parser import parseWxOrderReturn, parseWxPayReturn
from global_const import *
# Club operator views the details of an order
class WxVendorOrderInfoHandler(BaseHandler):
def get(self, club_id, order_id):
logging.info("GET %r", self.request.uri)
access_token = DEFAULT_USER_ID
order = self.get_order_index(order_id)
order['create_time'] = timestamp_datetime(order['create_time'])
order['amount'] = float(order['amount']) / 100
order['actual_payment'] = float(order['actual_payment']) / 100
if order['pay_status'] == 30:
order['pay_status'] = u"支付成功"  # payment succeeded
elif order['pay_status'] == 31:
order['pay_status'] = u"支付失败"  # payment failed
elif order['pay_status'] == 21:
order['pay_status'] = u"下单失败"  # order creation failed
elif order['pay_status'] == 20:
order['pay_status'] = u"未支付"  # not yet paid
if order['_status'] == 0:
order['_status'] = u"未填报"  # applicant details not yet filled in
if order['_status'] == 50:
order['_status'] = u"填报成功"  # applicant details filled in successfully
activity = self.get_activity(order['item_id'])
params = {"filter":"order", "order_id":order_id, "page":1, "limit":20}
url = url_concat(API_DOMAIN + "/api/applies", params)
http_client = HTTPClient()
headers = {"Authorization":"Bearer " + access_token}
response = http_client.fetch(url, method="GET", headers=headers)
logging.info("got response.body %r", response.body)
data = json_decode(response.body)
rs = data['rs']
applies = rs['data']
for _apply in applies:
# Order time: timestamp -> "%m月%d 星期%w" (month/day, weekday) display format
_apply['create_time'] = timestamp_datetime(float(_apply['create_time']))
if _apply['gender'] == 'male':
_apply['gender'] = u'男'  # male
else:
_apply['gender'] = u'女'  # female
self.render('order/order.html',
activity=activity,
applies=applies,
order=order)
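The if/elif chains above map WeChat-style numeric status codes to Chinese display strings. As a minimal, illustrative sketch (the codes and strings come from the handler above; the English glosses and the helper name pay_status_label are additions, not part of the original code), the pay-status branch could be expressed as a lookup table:

PAY_STATUS_LABELS = {
    30: u"支付成功",  # payment succeeded
    31: u"支付失败",  # payment failed
    21: u"下单失败",  # order creation failed
    20: u"未支付",    # not yet paid
}

def pay_status_label(code):
    # Fall back to the raw code for any status not listed above.
    return PAY_STATUS_LABELS.get(code, code)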
|
Over the past few years mental health has been brought to the forefront of society – and so, too, to the profession of architecture. As concepts such as ‘self-care’ and ‘work-life balance’ have permeated the way we talk about work, and what we expect from our jobs, it’s important that our profession responds.
Architecture, both as a career and a day-to-day job, can be extremely demanding. Everyone reading this will know first-hand of the pressures and pains it brings. We strive for perfection because we see ourselves in our work. But by putting so much of ourselves into what we do, other aspects of our lives can suffer. Too often, this is our mental and physical health.
After a series of alarming mental health statistics from the RIBA and the press, and with my own experience of studying for my Part 3 still fresh in my mind, I thought it was time to create a space where architects could talk freely, in terms of mental wellbeing, about our profession, its practices and how we can better serve our peers and those aspiring to join us.
You don’t have to spend very long in an architecture department at a university to see tired eyes and hear tales of burn-out. This work ethic doesn’t cease after university – if anything, it can be exacerbated by the pressures of the commercial world. It bleeds into every aspect of our lives, and despite the omnipresence of wellbeing issues in architectural work and education, official, institution-led support mechanisms remain poor.
But the idea has become reality. The Architects’ Mental Wellbeing Forum seeks to begin a conversation on mental wellbeing within architecture and press for changes to the way we work. Although our forum is still in its nascent stages, our ambitions are large. Involving architects and human resources professionals from leading UK practices known for their excellent treatment of staff, we hope to share knowledge and research, and examine ways of improving the day-to-day working lives of architects. We want to rid the profession of the pernicious, macho and almost masochistic aspects of work in order to make architects happier and, in turn, better at what they do. And to make sure that it’s not just a talking shop, Katie Vivian from the Architects’ Benevolent Society (ABS) and Virginia Newman, the RIBA’s mental health champion, complete the group, ensuring that we have input beyond our respective practices.
Since our kick-off meeting in January, we have refined our rather lofty ambitions. We want to create a ‘toolkit’ in collaboration with the RIBA to give architects and employers guidance on how to promote healthy mental wellbeing in the workplace, as well as opening a dialogue with universities to try and nip in the bud the issues that spill over into the profession. In particular, we will be working alongside Sheffield student Melissa Kirkpatrick, who is doing stellar research work on mental health and architectural education, supported by the ABS. Through events, research pieces and social media initiatives, we will give mental wellbeing in architecture the attention it truly deserves.
By reading this piece, you – whether you know it or not – have helped in some way to raise the issue of mental wellbeing in architecture. So, instead of eating a sandwich at your desk again, how about you take some time out, sit with your colleagues, eat a leisurely lunch, and discuss how your practice could better support the mental and physical wellbeing of its staff? I – and everyone else at The Architects’ Mental Wellbeing Forum – would love to hear what you think.
If you would like more information, would like to get involved, or have a (good or bad) story you wish to share, contact @AMWForum on Twitter, [email protected] (AMWF chair) or [email protected] (RIBA’s mental health champion).
|
"""The application's model objects"""
import sqlalchemy as sa
from meta import Base
from meta import Session
from ceiling import Ceiling
from product_category import ProductCategory
from product_ceiling_map import product_ceiling_map
def setup(meta):
category_ticket = ProductCategory.find_by_name('Ticket')
ceiling_conference = Ceiling.find_by_name('conference-paid')
ceiling_all_conference = Ceiling.find_by_name('conference-all')
ceiling_earlybird = Ceiling.find_by_name('conference-earlybird')
ceiling_nonearlybird = Ceiling.find_by_name('conference-non-earlybird')
# Tickets
ticket_student = Product(category=category_ticket, active=True, description="Student Ticket",
cost="12500", auth=None, validate=None)
ticket_student.ceilings.append(ceiling_conference)
ticket_student.ceilings.append(ceiling_all_conference)
meta.Session.add(ticket_student);
ticket_hobbyist_eb = Product(category=category_ticket, active=True, description="Earlybird Hobbyist Ticket",
cost="29900", auth=None, validate=None)
ticket_hobbyist_eb.ceilings.append(ceiling_conference)
ticket_hobbyist_eb.ceilings.append(ceiling_all_conference)
ticket_hobbyist_eb.ceilings.append(ceiling_earlybird)
meta.Session.add(ticket_hobbyist_eb);
ticket_hobbyist = Product(category=category_ticket, active=True, description="Hobbyist Ticket",
cost="37500", auth=None, validate=None)
ticket_hobbyist.ceilings.append(ceiling_conference)
ticket_hobbyist.ceilings.append(ceiling_all_conference)
ticket_hobbyist.ceilings.append(ceiling_nonearlybird)
meta.Session.add(ticket_hobbyist);
ticket_professional_eb = Product(category=category_ticket, active=True, description="Earlybird Professional Ticket",
cost="63500", auth=None, validate=None)
ticket_professional_eb.ceilings.append(ceiling_conference)
ticket_professional_eb.ceilings.append(ceiling_all_conference)
ticket_professional_eb.ceilings.append(ceiling_earlybird)
meta.Session.add(ticket_professional_eb);
ticket_professional = Product(category=category_ticket, active=True, description="Professional Ticket",
cost="79500", auth=None, validate=None)
ticket_professional.ceilings.append(ceiling_conference)
ticket_professional.ceilings.append(ceiling_all_conference)
ticket_professional.ceilings.append(ceiling_nonearlybird)
meta.Session.add(ticket_professional);
ticket_fairy_penguin = Product(category=category_ticket, active=True, description="Fairy Penguin Sponsor",
cost="150000", auth=None, validate=None)
ticket_fairy_penguin.ceilings.append(ceiling_conference)
ticket_fairy_penguin.ceilings.append(ceiling_all_conference)
meta.Session.add(ticket_fairy_penguin);
ticket_speaker = Product(category=category_ticket, active=True, description="Speaker Ticket",
cost="0", auth="self.is_speaker()", validate=None)
ticket_speaker.ceilings.append(ceiling_all_conference)
meta.Session.add(ticket_speaker);
ticket_miniconf = Product(category=category_ticket, active=True, description="Miniconf Organiser Ticket",
cost="0", auth="self.is_miniconf_org()", validate=None)
ticket_miniconf.ceilings.append(ceiling_all_conference)
meta.Session.add(ticket_miniconf);
ticket_volunteer_free = Product(category=category_ticket, active=True, description="Volunteer Ticket (Free)",
cost="0", auth="self.is_volunteer(product)", validate=None)
ticket_volunteer_free.ceilings.append(ceiling_all_conference)
meta.Session.add(ticket_volunteer_free);
ticket_volunteer_paid = Product(category=category_ticket, active=True, description="Volunteer Ticket (paid)",
cost="12500", auth="self.is_volunteer(product)", validate=None)
ticket_volunteer_paid.ceilings.append(ceiling_all_conference)
meta.Session.add(ticket_volunteer_paid);
ticket_press = Product(category=category_ticket, active=True, description="Press Ticket",
cost="0", auth="self.is_role('press')", validate=None)
ticket_press.ceilings.append(ceiling_all_conference)
meta.Session.add(ticket_press)
ticket_team = Product(category=category_ticket, active=True, description="Team Ticket",
cost="0", auth="self.is_role('team')", validate=None)
# Miniconfs
category_miniconf = ProductCategory.find_by_name('Miniconfs')
ceiling_miniconf_all = Ceiling.find_by_name('miniconf-all')
ceiling_miniconf_monday = Ceiling.find_by_name('miniconf-monday')
ceiling_miniconf_tuesday = Ceiling.find_by_name('miniconf-tuesday')
ceiling_rocketry = Ceiling.find_by_name('miniconf-rocketry')
product = Product(category=category_miniconf, active=True, description="Monday Southern Plumbers",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_monday)
meta.Session.add(product)
product = Product(category=category_miniconf, active=True, description="Monday Haecksen",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_monday)
meta.Session.add(product)
product = Product(category=category_miniconf, active=True, description="Monday Multimedia + Music",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_monday)
meta.Session.add(product)
product = Product(category=category_miniconf, active=True, description="Monday Arduino",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_monday)
meta.Session.add(product)
product = Product(category=category_miniconf, active=True, description="Monday Open Programming",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_monday)
meta.Session.add(product)
product = Product(category=category_miniconf, active=True, description="Monday The Business of Open Source",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_monday)
meta.Session.add(product)
product = Product(category=category_miniconf, active=True, description="Monday Freedom in the cloud",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_monday)
meta.Session.add(product)
product = Product(category=category_miniconf, active=True, description="Tuesday Multicore and Parallel Computing",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_tuesday)
meta.Session.add(product)
product = Product(category=category_miniconf, active=True, description="Tuesday Rocketry",
cost="20000", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_tuesday)
product.ceilings.append(ceiling_rocketry)
meta.Session.add(product)
product = Product(category=category_miniconf, active=True, description="Tuesday Systems Administration",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_tuesday)
meta.Session.add(product)
product = Product(category=category_miniconf, active=True, description="Tuesday Open in the public sector ",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_tuesday)
meta.Session.add(product)
product = Product(category=category_miniconf, active=True, description="Tuesday Mobile FOSS",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_tuesday)
meta.Session.add(product)
product = Product(category=category_miniconf, active=True, description="Tuesday Data Storage",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_tuesday)
meta.Session.add(product)
product = Product(category=category_miniconf, active=True, description="Tuesday Research and Student Innovation",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_tuesday)
meta.Session.add(product)
product = Product(category=category_miniconf, active=True, description="Tuesday Libre Graphics Day",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_miniconf_all)
product.ceilings.append(ceiling_miniconf_monday)
meta.Session.add(product)
# Shirts
category_shirt = ProductCategory.find_by_name('T-Shirt')
ceiling_shirt_all = Ceiling.find_by_name('shirt-all')
ceiling_shirt_men = Ceiling.find_by_name('shirt-men')
ceiling_shirt_women = Ceiling.find_by_name('shirt-women')
product = Product(category=category_shirt, active=True, description="Men's Small", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_men)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Men's Medium", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_men)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Men's Large", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_men)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Men's XL", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_men)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Men's 2XL", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_men)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Men's 3XL", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_men)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Men's 5XL", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_men)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Men's 7XL", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_men)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Women's Size 6", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_women)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Women's Size 8", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_women)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Women's Size 10", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_women)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Women's Size 12", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_women)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Women's Size 14", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_women)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Women's Size 16", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_women)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Women's Size 18", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_women)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Women's Size 20", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_women)
meta.Session.add(product)
product = Product(category=category_shirt, active=True, description="Women's Size 22", cost="2500", auth=None, validate=None)
product.ceilings.append(ceiling_shirt_all)
product.ceilings.append(ceiling_shirt_women)
meta.Session.add(product)
# Penguin Dinner
category_penguin = ProductCategory.find_by_name('Penguin Dinner Ticket')
ceiling_penguin_all = Ceiling.find_by_name('penguindinner-all')
product = Product(category=category_penguin, active=True, description="Adult", cost="9000", auth=None, validate="ProDinner(dinner_field='product_Penguin Dinner Ticket_Adult_qty',ticket_category='category_Ticket',ticket_id=[4,5,6,7,8,11,12])")
product.ceilings.append(ceiling_penguin_all)
meta.Session.add(product)
product = Product(category=category_penguin, active=True, description="Child", cost="2000", auth=None, validate=None)
product.ceilings.append(ceiling_penguin_all)
meta.Session.add(product)
product = Product(category=category_penguin, active=True, description="Infant", cost="0", auth=None, validate=None)
meta.Session.add(product)
# Speakers Dinner
category_speakers = ProductCategory.find_by_name('Speakers Dinner Ticket')
ceiling_speakers_all = Ceiling.find_by_name('speakersdinner-all')
product = Product(category=category_speakers, active=True, description="Adult", cost="0", validate="ProDinner(dinner_field='product_Speakers Dinner Ticket_Adult_qty',ticket_category='category_Ticket',ticket_id=[7,8,12])", auth="self.is_speaker() or self.is_miniconf_org() or self.is_role('team')")
product.ceilings.append(ceiling_speakers_all)
meta.Session.add(product)
product = Product(category=category_speakers, active=True, description="Child", cost="0", validate=None , auth="self.is_speaker() or self.is_miniconf_org() or self.is_role('team')")
product.ceilings.append(ceiling_speakers_all)
meta.Session.add(product)
product = Product(category=category_speakers, active=True, description="Infant", cost="0", validate=None , auth="self.is_speaker() or self.is_miniconf_org() or self.is_role('team')")
meta.Session.add(product)
# Accommodation
category_accomodation = ProductCategory.find_by_name('Accommodation')
ceiling_accom_all = Ceiling.find_by_name('accomodation-all')
product = Product(category=category_accomodation, active=True, description="I will organise my own",
cost="0", auth=None, validate=None)
product.ceilings.append(ceiling_accom_all)
meta.Session.add(product);
# Partners' Programme
category_partners = ProductCategory.find_by_name('Partners\' Programme')
ceiling_partners_all = Ceiling.find_by_name('partners-all')
partners_adult = Product(category=category_partners, active=True, description="Adult", cost="23500", auth=None, validate="PPDetails(adult_field='product_Partners Programme_Adult_qty', email_field='partner_email', name_field='partner_name', mobile_field='partner_mobile')")
partners_adult.ceilings.append(ceiling_partners_all)
meta.Session.add(partners_adult);
product = Product(category=category_partners, active=True, description="Child (3-14 years old)", cost="16500", auth=None, validate="PPChildrenAdult(current_field='product_Partners Programme_Child (3_14 years old)_qty',adult_field='product_Partners Programme_Adult_qty')")
product.ceilings.append(ceiling_partners_all)
meta.Session.add(product);
product = Product(category=category_partners, active=True, description="Infant (0-2 years old)", cost="0", auth=None, validate="PPChildrenAdult(current_field='product_Partners Programme_Child (0_2 years old)_qty',adult_field='product_Partners Programme_Adult_qty')")
product.ceilings.append(ceiling_partners_all)
meta.Session.add(product);
# Product includes
meta.Session.add_all(
[
# Include 1 Shirt in all registration types
ProductInclude(product=ticket_student, include_category=category_shirt, include_qty='1'), # Student
ProductInclude(product=ticket_hobbyist_eb, include_category=category_shirt, include_qty='1'), # Hobbyist EB
ProductInclude(product=ticket_hobbyist, include_category=category_shirt, include_qty='1'), # Hobbyist
ProductInclude(product=ticket_professional_eb, include_category=category_shirt, include_qty='1'), # Pro EB
ProductInclude(product=ticket_professional, include_category=category_shirt, include_qty='1'), # Pro
ProductInclude(product=ticket_fairy_penguin, include_category=category_shirt, include_qty='1'), # Fairy
ProductInclude(product=ticket_speaker, include_category=category_shirt, include_qty='1'), # Speaker
ProductInclude(product=ticket_miniconf, include_category=category_shirt, include_qty='1'), # Miniconf
ProductInclude(product=ticket_volunteer_free, include_category=category_shirt, include_qty='2'), # Volunteer
ProductInclude(product=ticket_volunteer_paid, include_category=category_shirt, include_qty='2'), # Volunteer
ProductInclude(product=ticket_press, include_category=category_shirt, include_qty='1'), # Press
ProductInclude(product=ticket_team, include_category=category_shirt, include_qty='6'), # Team
#ProductInclude(product=partners_adult, include_category=category_shirt, include_qty='1'), # Partner's Programme get a t-shirt
# Include 1 Dinner for Professional+miniconf and for Speaker registrations
ProductInclude(product=ticket_professional_eb, include_category=category_penguin, include_qty='1'), # Pro EB
ProductInclude(product=ticket_professional, include_category=category_penguin, include_qty='1'), # Pro
ProductInclude(product=ticket_fairy_penguin, include_category=category_penguin, include_qty='1'), # Fairy
ProductInclude(product=ticket_speaker, include_category=category_penguin, include_qty='1'), # Speaker
ProductInclude(product=ticket_miniconf, include_category=category_penguin, include_qty='1'), # Miniconf
ProductInclude(product=ticket_press, include_category=category_penguin, include_qty='1'), # Press
ProductInclude(product=ticket_team, include_category=category_penguin, include_qty='2'), # Team
# Include 2 partners in the partners program for speakers
ProductInclude(product=ticket_speaker, include_category=category_partners, include_qty='2'),
]
)
class Product(Base):
"""Stores the products used for registration
"""
# table
__tablename__ = 'product'
__table_args__ = (
# Descriptions should be unique within a category
sa.UniqueConstraint('category_id', 'description'),
{}
)
id = sa.Column(sa.types.Integer, primary_key=True)
category_id = sa.Column(sa.types.Integer, sa.ForeignKey('product_category.id', ondelete='CASCADE', onupdate='CASCADE'), nullable=True)
fulfilment_type_id = sa.Column(sa.types.Integer, sa.ForeignKey('fulfilment_type.id'), nullable=True)
display_order = sa.Column(sa.types.Integer, nullable=False, default=10)
active = sa.Column(sa.types.Boolean, nullable=False)
description = sa.Column(sa.types.Text, nullable=False)
badge_text = sa.Column(sa.types.Text, nullable=True)
cost = sa.Column(sa.types.Integer, nullable=False)
auth = sa.Column(sa.types.Text, nullable=True)
validate = sa.Column(sa.types.Text, nullable=True)
# relations
category = sa.orm.relation(ProductCategory, lazy=True, backref=sa.orm.backref('products', order_by=lambda: [Product.display_order, Product.cost]))
ceilings = sa.orm.relation(Ceiling, secondary=product_ceiling_map, lazy=True, order_by=Ceiling.name, backref='products')
fulfilment_type = sa.orm.relation('FulfilmentType')
def __init__(self, **kwargs):
super(Product, self).__init__(**kwargs)
@classmethod
def find_all(cls):
return Session.query(Product).order_by(Product.display_order).order_by(Product.cost).all()
@classmethod
def find_by_id(cls, id):
return Session.query(Product).filter_by(id=id).first()
@classmethod
def find_by_category(cls, id):
return Session.query(Product).filter_by(category_id=id).order_by(Product.display_order).order_by(Product.cost)
def qty_free(self):
qty = 0
for ii in self.invoice_items:
if not ii.invoice.void and ii.invoice.is_paid:
qty += ii.free_qty
return qty
def qty_sold(self):
qty = 0
for ii in self.invoice_items:
if not ii.invoice.void and ii.invoice.is_paid:
qty += (ii.qty - ii.free_qty)
return qty
def qty_invoiced(self, date=True):
# date: if True, only count items that are not yet overdue
qty = 0
for ii in self.invoice_items:
# also count sold items as invoiced since they are valid
if not ii.invoice.void and ((ii.invoice.is_paid or not ii.invoice.is_overdue or not date)):
qty += ii.qty
return qty
def remaining(self):
max_ceiling = None
for c in self.ceilings:
if max_ceiling is None or c.remaining() > max_ceiling:
max_ceiling = c.remaining()
return max_ceiling
def available(self, stock=True, qty=0):
# bool stock: care about if the product is in stock (ie sold out?)
if self.active:
for c in self.ceilings:
if not c.available(stock, qty):
return False
return True
else:
return False
def can_i_sell(self, person, qty):
if not self.available():
return False
if not self.category.can_i_sell(person, qty):
return False
for c in self.ceilings:
if not c.can_i_sell(qty):
return False
return True
def available_until(self):
until = []
for ceiling in self.ceilings:
if ceiling.available_until != None:
until.append(ceiling.available_until)
if len(until) > 0:
return max(until)
def clean_description(self, category=False):
if category:
return self.category.clean_name() + '_' + self.description.replace('-','_').replace("'",'')
else:
return self.description.replace('-','_').replace("'",'')
def __repr__(self):
return '<Product id=%r active=%r description=%r cost=%r auth=%r validate=%r>' % (self.id, self.active, self.description, self.cost, self.auth, self.validate)
class ProductInclude(Base):
"""Stores the product categories that are included in another product
"""
__tablename__ = 'product_include'
product_id = sa.Column(sa.types.Integer, sa.ForeignKey('product.id'), primary_key=True)
include_category_id = sa.Column(sa.types.Integer, sa.ForeignKey('product_category.id'), primary_key=True)
include_qty = sa.Column(sa.types.Integer, nullable=False)
product = sa.orm.relation(Product, backref='included', lazy=False)
include_category = sa.orm.relation(ProductCategory)
def __init__(self, **kwargs):
super(ProductInclude, self).__init__(**kwargs)
@classmethod
def find_by_category(cls, id):
return Session.query(ProductInclude).filter_by(include_category_id=id)
@classmethod
def find_by_product(cls, id):
return Session.query(ProductInclude).filter_by(product_id=id)
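As a quick orientation, here is a minimal usage sketch of the helpers defined above; the 'Ticket' category name comes from setup(), while the database binding of Session is assumed to be configured elsewhere in the application:

# Illustrative only: assumes meta.Session is bound to a database already
# populated by setup() above.
ticket_category = ProductCategory.find_by_name('Ticket')
available_tickets = [
    p for p in Product.find_by_category(ticket_category.id)
    if p.available()  # available() consults every Ceiling attached to the product
]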
|
Midnight Sim Racing League - Who is online?
Guest 20:21 Reading the thread Anmeldung Gaststarter (guest-starter registration)!
|
# Copyright 2017 SAP SE
#
# Author: Rudolf Vriend <[email protected]>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from designateclient.v2.base import V2Controller
from designateclient.v2 import utils as v2_utils
class TSIGKeysController(V2Controller):
def create(self, name, algorithm, secret, scope, resource_id):
data = {
'name': name,
'algorithm': algorithm,
'secret': secret,
'scope': scope,
'resource_id': resource_id
}
return self._post('/tsigkeys', data=data)
def list(self, criterion=None, marker=None, limit=None):
url = self.build_url('/tsigkeys', criterion, marker, limit)
return self._get(url, response_key='tsigkeys')
def get(self, tsigkey):
tsigkey = v2_utils.resolve_by_name(self.list, tsigkey)
return self._get('/tsigkeys/%s' % tsigkey)
def update(self, tsigkey, values):
tsigkey = v2_utils.resolve_by_name(self.list, tsigkey)
return self._patch('/tsigkeys/%s' % tsigkey, data=values)
def delete(self, tsigkey):
tsigkey = v2_utils.resolve_by_name(self.list, tsigkey)
return self._delete('/tsigkeys/%s' % tsigkey)
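For context, a hedged usage sketch of how this controller is normally reached through the v2 client; the Keystone endpoint, credentials and key values below are placeholders, not taken from this file:

from keystoneauth1 import session as ks_session
from keystoneauth1.identity import generic
from designateclient.v2 import client

# Placeholder credentials; substitute real Keystone details in practice.
auth = generic.Password(auth_url='http://keystone:5000/v3',
                        username='demo', password='secret', project_name='demo',
                        user_domain_id='default', project_domain_id='default')
c = client.Client(session=ks_session.Session(auth=auth))

# The v2 client exposes this controller as c.tsigkeys.
key = c.tsigkeys.create(name='example-key', algorithm='hmac-sha256',
                        secret='c2VjcmV0', scope='POOL',
                        resource_id='00000000-0000-0000-0000-000000000000')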
|
UPMC Susquehanna is currently seeking a Registered Nurse for our Home Health & Hospice team in Tioga County! This is an excellent opportunity for both experienced and newly registered nurses to work with a variety of patients, whether in their homes, assisted living facilities or long-term care facilities.
For those who qualify, we are currently offering a $10,000 Sign-On Bonus for individuals with 2+ years of RN experience and a $5,000 Sign-On Bonus to RN’s with less than 2 years of experience!
Current Employees - Earn $2,500.00 by referring a Registered Nurse today!
Provides direct patient care, evaluates outcomes and adjusts nursing care process as indicated to ensure optimal patient care. Initiates, coordinates and evaluates patient and family centered health teaching. Establishes the plan of care in collaboration with the physician, patient and UM Professional in accordance with the physician orders and patient needs. Advises and consults with family and other agency personnel, as appropriate.
|
import sys
from optparse import make_option
import traceback
from django.core.management.base import BaseCommand
from django.conf import settings
from django_cron import CronJobManager, get_class
from django.db import close_connection
DEFAULT_LOCK_TIME = 24 * 60 * 60 # 24 hours
class Command(BaseCommand):
option_list = BaseCommand.option_list + (
make_option('--force', action='store_true', help='Force cron runs'),
make_option('--silent', action='store_true', help='Do not push any message on console'),
)
def handle(self, *args, **options):
"""
Iterates over all the CRON_CLASSES (or if passed in as a commandline argument)
and runs them.
"""
if args:
cron_class_names = args
else:
cron_class_names = getattr(settings, 'CRON_CLASSES', [])
try:
crons_to_run = map(lambda x: get_class(x), cron_class_names)
except:
error = traceback.format_exc()
print('Make sure these are valid cron class names: %s\n%s' % (cron_class_names, error))
sys.exit()
for cron_class in crons_to_run:
run_cron_with_cache_check(cron_class, force=options['force'],
silent=options['silent'])
close_connection()
def run_cron_with_cache_check(cron_class, force=False, silent=False):
"""
Checks the cache and runs the cron or not.
@cron_class - cron class to run.
@force - run job even if not scheduled
@silent - suppress notifications
"""
with CronJobManager(cron_class, silent) as manager:
manager.run(force)
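For context, a minimal sketch of the configuration this command reads; the app name, the cron class path and the runcrons command name are illustrative assumptions (the actual command name is determined by this file's name under management/commands/):

# settings.py (illustrative)
CRON_CLASSES = [
    'myapp.cron.NightlyCleanupCronJob',  # hypothetical dotted path to a cron class
]

# Invocation, assuming this file is named runcrons.py:
#   python manage.py runcrons            # run every configured cron class
#   python manage.py runcrons --force    # ignore the schedule and run anyway
#   python manage.py runcrons --silent   # suppress console messages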
|
Rates: 55-32 EUR per night. Minimum Rental - 1 night.
The house is situated in a green and peaceful area in the district of Prenzlauer Berg - only a 10-minute drive from the centre of Berlin, making it the perfect base from which to explore the city.
The Graf Bed & Breakfast offers exceptional accommodation at affordable prices.
We have 3 double rooms and 1 single room available, and breakfast can be provided in addition to accommodation if you wish.
Queen Bed, Private Bath, TV. 2-Twin Beds, Shared Bath. 2-Twin Beds, Shared Bath. 1-Twin Bed, Shared Bath.
You will find us near the Prenzlauer Berg district park, between the train stations Landsberger Allee and Greifswalder Strasse. The B & B is situated in the trendy district of Prenzlauer Berg, yet in a peaceful and central part of Berlin. All of our guests appreciate the private and calm atmosphere in the house.
Although Berlin is known for its parking problems, you will have no trouble finding a parking space directly in front of our house.
Prenzlauer Berg is home to many coffee bars, clubs, and restaurants and you will easily discover the extraordinary ambiance of this district. It takes only 7 minutes by car or about 15 minutes by tram to reach the heart of "Prenzlauer Berg" - Kollwitzplatz (Kollwitz Plaza).
|
# -*- coding: utf-8 -*-
import execjs
import sys
reload(sys)
sys.setdefaultencoding("utf8")
def des_encode(encode_data):
# Compile the embedded DES JavaScript below and call its enString() entry point.
des_js = execjs.compile(des_js_code)
encoded_string = des_js.call("enString", str(encode_data))
return encoded_string
des_js_code = """
function enString(data){
var key1 = "YHXWWLKJYXGS";
var key2 = "ZFCHHYXFL10C";
var key3 = "DES";
var enchex = strEnc(data,key1,key2,key3);
return enchex;
}
/**
RSA encryption/decryption
**/
function EncryString(enString)
{
if(window.ActiveXObject) // IE browsers
{
xmlHttpRequest = new ActiveXObject("Microsoft.XMLHTTP");
}
else if(window.XMLHttpRequest) // browsers other than IE
{
xmlHttpRequest = new XMLHttpRequest();
}
if(null != xmlHttpRequest)
{
xmlHttpRequest.open("POST", "http://127.0.0.1:8081/sso6qtn/EncryptServlet", false);
// this callback would be invoked whenever the request's ready state changes
//xmlHttpRequest.onreadystatechange = ajaxCallBack;
// this header must be set when submitting via POST
xmlHttpRequest.setRequestHeader("Content-Type","application/x-www-form-urlencoded");
// send the request to the server
xmlHttpRequest.send("enString="+enString);
return xmlHttpRequest.responseText;
}
}
function ajaxCallBack()
{
if(xmlHttpRequest.readyState == 4)
{
if(xmlHttpRequest.status == 200)
{
var content = xmlHttpRequest.responseText;
}
}
}
/**
* DES encryption/decryption
* @Copyright Copyright (c) 2009
* @author linsi
*/
/*
* encrypt the string to string made up of hex
* return the encrypted string
*/
function strEnc(data,firstKey,secondKey,thirdKey){
var leng = data.length;
var encData = "";
var firstKeyBt,secondKeyBt,thirdKeyBt,firstLength,secondLength,thirdLength;
if(firstKey != null && firstKey != ""){
firstKeyBt = getKeyBytes(firstKey);
firstLength = firstKeyBt.length;
}
if(secondKey != null && secondKey != ""){
secondKeyBt = getKeyBytes(secondKey);
secondLength = secondKeyBt.length;
}
if(thirdKey != null && thirdKey != ""){
thirdKeyBt = getKeyBytes(thirdKey);
thirdLength = thirdKeyBt.length;
}
if(leng > 0){
if(leng < 4){
var bt = strToBt(data);
var encByte ;
if(firstKey != null && firstKey !="" && secondKey != null && secondKey != "" && thirdKey != null && thirdKey != ""){
var tempBt;
var x,y,z;
tempBt = bt;
for(x = 0;x < firstLength ;x ++){
tempBt = enc(tempBt,firstKeyBt[x]);
}
for(y = 0;y < secondLength ;y ++){
tempBt = enc(tempBt,secondKeyBt[y]);
}
for(z = 0;z < thirdLength ;z ++){
tempBt = enc(tempBt,thirdKeyBt[z]);
}
encByte = tempBt;
}else{
if(firstKey != null && firstKey !="" && secondKey != null && secondKey != ""){
var tempBt;
var x,y;
tempBt = bt;
for(x = 0;x < firstLength ;x ++){
tempBt = enc(tempBt,firstKeyBt[x]);
}
for(y = 0;y < secondLength ;y ++){
tempBt = enc(tempBt,secondKeyBt[y]);
}
encByte = tempBt;
}else{
if(firstKey != null && firstKey !=""){
var tempBt;
var x = 0;
tempBt = bt;
for(x = 0;x < firstLength ;x ++){
tempBt = enc(tempBt,firstKeyBt[x]);
}
encByte = tempBt;
}
}
}
encData = bt64ToHex(encByte);
}else{
var iterator = parseInt(leng/4);
var remainder = leng%4;
var i=0;
for(i = 0;i < iterator;i++){
var tempData = data.substring(i*4+0,i*4+4);
var tempByte = strToBt(tempData);
var encByte ;
if(firstKey != null && firstKey !="" && secondKey != null && secondKey != "" && thirdKey != null && thirdKey != ""){
var tempBt;
var x,y,z;
tempBt = tempByte;
for(x = 0;x < firstLength ;x ++){
tempBt = enc(tempBt,firstKeyBt[x]);
}
for(y = 0;y < secondLength ;y ++){
tempBt = enc(tempBt,secondKeyBt[y]);
}
for(z = 0;z < thirdLength ;z ++){
tempBt = enc(tempBt,thirdKeyBt[z]);
}
encByte = tempBt;
}else{
if(firstKey != null && firstKey !="" && secondKey != null && secondKey != ""){
var tempBt;
var x,y;
tempBt = tempByte;
for(x = 0;x < firstLength ;x ++){
tempBt = enc(tempBt,firstKeyBt[x]);
}
for(y = 0;y < secondLength ;y ++){
tempBt = enc(tempBt,secondKeyBt[y]);
}
encByte = tempBt;
}else{
if(firstKey != null && firstKey !=""){
var tempBt;
var x;
tempBt = tempByte;
for(x = 0;x < firstLength ;x ++){
tempBt = enc(tempBt,firstKeyBt[x]);
}
encByte = tempBt;
}
}
}
encData += bt64ToHex(encByte);
}
if(remainder > 0){
var remainderData = data.substring(iterator*4+0,leng);
var tempByte = strToBt(remainderData);
var encByte ;
if(firstKey != null && firstKey !="" && secondKey != null && secondKey != "" && thirdKey != null && thirdKey != ""){
var tempBt;
var x,y,z;
tempBt = tempByte;
for(x = 0;x < firstLength ;x ++){
tempBt = enc(tempBt,firstKeyBt[x]);
}
for(y = 0;y < secondLength ;y ++){
tempBt = enc(tempBt,secondKeyBt[y]);
}
for(z = 0;z < thirdLength ;z ++){
tempBt = enc(tempBt,thirdKeyBt[z]);
}
encByte = tempBt;
}else{
if(firstKey != null && firstKey !="" && secondKey != null && secondKey != ""){
var tempBt;
var x,y;
tempBt = tempByte;
for(x = 0;x < firstLength ;x ++){
tempBt = enc(tempBt,firstKeyBt[x]);
}
for(y = 0;y < secondLength ;y ++){
tempBt = enc(tempBt,secondKeyBt[y]);
}
encByte = tempBt;
}else{
if(firstKey != null && firstKey !=""){
var tempBt;
var x;
tempBt = tempByte;
for(x = 0;x < firstLength ;x ++){
tempBt = enc(tempBt,firstKeyBt[x]);
}
encByte = tempBt;
}
}
}
encData += bt64ToHex(encByte);
}
}
}
return encData;
}
/*
* change the string into the bit array
*
* return bit array (its length % 64 = 0)
*/
function getKeyBytes(key){
var keyBytes = new Array();
var leng = key.length;
var iterator = parseInt(leng/4);
var remainder = leng%4;
var i = 0;
for(i = 0;i < iterator; i ++){
keyBytes[i] = strToBt(key.substring(i*4+0,i*4+4));
}
if(remainder > 0){
keyBytes[i] = strToBt(key.substring(i*4+0,leng));
}
return keyBytes;
}
/*
* change the string (its length <= 4) into the bit array
*
* return bit array (its length = 64)
*/
function strToBt(str){
var leng = str.length;
var bt = new Array(64);
if(leng < 4){
var i=0,j=0,p=0,q=0;
for(i = 0;i<leng;i++){
var k = str.charCodeAt(i);
for(j=0;j<16;j++){
var pow=1,m=0;
for(m=15;m>j;m--){
pow *= 2;
}
bt[16*i+j]=parseInt(k/pow)%2;
}
}
for(p = leng;p<4;p++){
var k = 0;
for(q=0;q<16;q++){
var pow=1,m=0;
for(m=15;m>q;m--){
pow *= 2;
}
bt[16*p+q]=parseInt(k/pow)%2;
}
}
}else{
for(i = 0;i<4;i++){
var k = str.charCodeAt(i);
for(j=0;j<16;j++){
var pow=1;
for(m=15;m>j;m--){
pow *= 2;
}
bt[16*i+j]=parseInt(k/pow)%2;
}
}
}
return bt;
}
/*
* change the bit (its length = 4) into hex
*
* return hex
*/
function bt4ToHex(binary) {
var hex;
switch (binary) {
case "0000" : hex = "0"; break;
case "0001" : hex = "1"; break;
case "0010" : hex = "2"; break;
case "0011" : hex = "3"; break;
case "0100" : hex = "4"; break;
case "0101" : hex = "5"; break;
case "0110" : hex = "6"; break;
case "0111" : hex = "7"; break;
case "1000" : hex = "8"; break;
case "1001" : hex = "9"; break;
case "1010" : hex = "A"; break;
case "1011" : hex = "B"; break;
case "1100" : hex = "C"; break;
case "1101" : hex = "D"; break;
case "1110" : hex = "E"; break;
case "1111" : hex = "F"; break;
}
return hex;
}
/*
* change the hex into the bit (its length = 4)
*
* return the bit (its length = 4)
*/
function hexToBt4(hex) {
var binary;
switch (hex) {
case "0" : binary = "0000"; break;
case "1" : binary = "0001"; break;
case "2" : binary = "0010"; break;
case "3" : binary = "0011"; break;
case "4" : binary = "0100"; break;
case "5" : binary = "0101"; break;
case "6" : binary = "0110"; break;
case "7" : binary = "0111"; break;
case "8" : binary = "1000"; break;
case "9" : binary = "1001"; break;
case "A" : binary = "1010"; break;
case "B" : binary = "1011"; break;
case "C" : binary = "1100"; break;
case "D" : binary = "1101"; break;
case "E" : binary = "1110"; break;
case "F" : binary = "1111"; break;
}
return binary;
}
/*
* change the bit (its length = 64) into the string
*
* return string
*/
function byteToString(byteData){
var str="";
for(i = 0;i<4;i++){
var count=0;
for(j=0;j<16;j++){
var pow=1;
for(m=15;m>j;m--){
pow*=2;
}
count+=byteData[16*i+j]*pow;
}
if(count != 0){
str+=String.fromCharCode(count);
}
}
return str;
}
function bt64ToHex(byteData){
var hex = "";
for(i = 0;i<16;i++){
var bt = "";
for(j=0;j<4;j++){
bt += byteData[i*4+j];
}
hex+=bt4ToHex(bt);
}
return hex;
}
function hexToBt64(hex){
var binary = "";
for(i = 0;i<16;i++){
binary+=hexToBt4(hex.substring(i,i+1));
}
return binary;
}
/*
* the 64 bit des core arithmetic
*/
function enc(dataByte,keyByte){
var keys = generateKeys(keyByte);
var ipByte = initPermute(dataByte);
var ipLeft = new Array(32);
var ipRight = new Array(32);
var tempLeft = new Array(32);
var i = 0,j = 0,k = 0,m = 0, n = 0;
for(k = 0;k < 32;k ++){
ipLeft[k] = ipByte[k];
ipRight[k] = ipByte[32+k];
}
for(i = 0;i < 16;i ++){
for(j = 0;j < 32;j ++){
tempLeft[j] = ipLeft[j];
ipLeft[j] = ipRight[j];
}
var key = new Array(48);
for(m = 0;m < 48;m ++){
key[m] = keys[i][m];
}
var tempRight = xor(pPermute(sBoxPermute(xor(expandPermute(ipRight),key))), tempLeft);
for(n = 0;n < 32;n ++){
ipRight[n] = tempRight[n];
}
}
var finalData =new Array(64);
for(i = 0;i < 32;i ++){
finalData[i] = ipRight[i];
finalData[32+i] = ipLeft[i];
}
return finallyPermute(finalData);
}
function dec(dataByte,keyByte){
var keys = generateKeys(keyByte);
var ipByte = initPermute(dataByte);
var ipLeft = new Array(32);
var ipRight = new Array(32);
var tempLeft = new Array(32);
var i = 0,j = 0,k = 0,m = 0, n = 0;
for(k = 0;k < 32;k ++){
ipLeft[k] = ipByte[k];
ipRight[k] = ipByte[32+k];
}
for(i = 15;i >= 0;i --){
for(j = 0;j < 32;j ++){
tempLeft[j] = ipLeft[j];
ipLeft[j] = ipRight[j];
}
var key = new Array(48);
for(m = 0;m < 48;m ++){
key[m] = keys[i][m];
}
var tempRight = xor(pPermute(sBoxPermute(xor(expandPermute(ipRight),key))), tempLeft);
for(n = 0;n < 32;n ++){
ipRight[n] = tempRight[n];
}
}
var finalData =new Array(64);
for(i = 0;i < 32;i ++){
finalData[i] = ipRight[i];
finalData[32+i] = ipLeft[i];
}
return finallyPermute(finalData);
}
function initPermute(originalData){
var ipByte = new Array(64);
for (i = 0, m = 1, n = 0; i < 4; i++, m += 2, n += 2) {
for (j = 7, k = 0; j >= 0; j--, k++) {
ipByte[i * 8 + k] = originalData[j * 8 + m];
ipByte[i * 8 + k + 32] = originalData[j * 8 + n];
}
}
return ipByte;
}
function expandPermute(rightData){
var epByte = new Array(48);
for (i = 0; i < 8; i++) {
if (i == 0) {
epByte[i * 6 + 0] = rightData[31];
} else {
epByte[i * 6 + 0] = rightData[i * 4 - 1];
}
epByte[i * 6 + 1] = rightData[i * 4 + 0];
epByte[i * 6 + 2] = rightData[i * 4 + 1];
epByte[i * 6 + 3] = rightData[i * 4 + 2];
epByte[i * 6 + 4] = rightData[i * 4 + 3];
if (i == 7) {
epByte[i * 6 + 5] = rightData[0];
} else {
epByte[i * 6 + 5] = rightData[i * 4 + 4];
}
}
return epByte;
}
function xor(byteOne,byteTwo){
var xorByte = new Array(byteOne.length);
for(i = 0;i < byteOne.length; i ++){
xorByte[i] = byteOne[i] ^ byteTwo[i];
}
return xorByte;
}
function sBoxPermute(expandByte){
var sBoxByte = new Array(32);
var binary = "";
var s1 = [
[14, 4, 13, 1, 2, 15, 11, 8, 3, 10, 6, 12, 5, 9, 0, 7],
[0, 15, 7, 4, 14, 2, 13, 1, 10, 6, 12, 11, 9, 5, 3, 8],
[4, 1, 14, 8, 13, 6, 2, 11, 15, 12, 9, 7, 3, 10, 5, 0],
[15, 12, 8, 2, 4, 9, 1, 7, 5, 11, 3, 14, 10, 0, 6, 13 ]];
/* Table - s2 */
var s2 = [
[15, 1, 8, 14, 6, 11, 3, 4, 9, 7, 2, 13, 12, 0, 5, 10],
[3, 13, 4, 7, 15, 2, 8, 14, 12, 0, 1, 10, 6, 9, 11, 5],
[0, 14, 7, 11, 10, 4, 13, 1, 5, 8, 12, 6, 9, 3, 2, 15],
[13, 8, 10, 1, 3, 15, 4, 2, 11, 6, 7, 12, 0, 5, 14, 9 ]];
/* Table - s3 */
var s3= [
[10, 0, 9, 14, 6, 3, 15, 5, 1, 13, 12, 7, 11, 4, 2, 8],
[13, 7, 0, 9, 3, 4, 6, 10, 2, 8, 5, 14, 12, 11, 15, 1],
[13, 6, 4, 9, 8, 15, 3, 0, 11, 1, 2, 12, 5, 10, 14, 7],
[1, 10, 13, 0, 6, 9, 8, 7, 4, 15, 14, 3, 11, 5, 2, 12 ]];
/* Table - s4 */
var s4 = [
[7, 13, 14, 3, 0, 6, 9, 10, 1, 2, 8, 5, 11, 12, 4, 15],
[13, 8, 11, 5, 6, 15, 0, 3, 4, 7, 2, 12, 1, 10, 14, 9],
[10, 6, 9, 0, 12, 11, 7, 13, 15, 1, 3, 14, 5, 2, 8, 4],
[3, 15, 0, 6, 10, 1, 13, 8, 9, 4, 5, 11, 12, 7, 2, 14 ]];
/* Table - s5 */
var s5 = [
[2, 12, 4, 1, 7, 10, 11, 6, 8, 5, 3, 15, 13, 0, 14, 9],
[14, 11, 2, 12, 4, 7, 13, 1, 5, 0, 15, 10, 3, 9, 8, 6],
[4, 2, 1, 11, 10, 13, 7, 8, 15, 9, 12, 5, 6, 3, 0, 14],
[11, 8, 12, 7, 1, 14, 2, 13, 6, 15, 0, 9, 10, 4, 5, 3 ]];
/* Table - s6 */
var s6 = [
[12, 1, 10, 15, 9, 2, 6, 8, 0, 13, 3, 4, 14, 7, 5, 11],
[10, 15, 4, 2, 7, 12, 9, 5, 6, 1, 13, 14, 0, 11, 3, 8],
[9, 14, 15, 5, 2, 8, 12, 3, 7, 0, 4, 10, 1, 13, 11, 6],
[4, 3, 2, 12, 9, 5, 15, 10, 11, 14, 1, 7, 6, 0, 8, 13 ]];
/* Table - s7 */
var s7 = [
[4, 11, 2, 14, 15, 0, 8, 13, 3, 12, 9, 7, 5, 10, 6, 1],
[13, 0, 11, 7, 4, 9, 1, 10, 14, 3, 5, 12, 2, 15, 8, 6],
[1, 4, 11, 13, 12, 3, 7, 14, 10, 15, 6, 8, 0, 5, 9, 2],
[6, 11, 13, 8, 1, 4, 10, 7, 9, 5, 0, 15, 14, 2, 3, 12]];
/* Table - s8 */
var s8 = [
[13, 2, 8, 4, 6, 15, 11, 1, 10, 9, 3, 14, 5, 0, 12, 7],
[1, 15, 13, 8, 10, 3, 7, 4, 12, 5, 6, 11, 0, 14, 9, 2],
[7, 11, 4, 1, 9, 12, 14, 2, 0, 6, 10, 13, 15, 3, 5, 8],
[2, 1, 14, 7, 4, 10, 8, 13, 15, 12, 9, 0, 3, 5, 6, 11]];
for(m=0;m<8;m++){
var i=0,j=0;
i = expandByte[m*6+0]*2+expandByte[m*6+5];
j = expandByte[m * 6 + 1] * 2 * 2 * 2
+ expandByte[m * 6 + 2] * 2* 2
+ expandByte[m * 6 + 3] * 2
+ expandByte[m * 6 + 4];
switch (m) {
case 0 :
binary = getBoxBinary(s1[i][j]);
break;
case 1 :
binary = getBoxBinary(s2[i][j]);
break;
case 2 :
binary = getBoxBinary(s3[i][j]);
break;
case 3 :
binary = getBoxBinary(s4[i][j]);
break;
case 4 :
binary = getBoxBinary(s5[i][j]);
break;
case 5 :
binary = getBoxBinary(s6[i][j]);
break;
case 6 :
binary = getBoxBinary(s7[i][j]);
break;
case 7 :
binary = getBoxBinary(s8[i][j]);
break;
}
sBoxByte[m*4+0] = parseInt(binary.substring(0,1));
sBoxByte[m*4+1] = parseInt(binary.substring(1,2));
sBoxByte[m*4+2] = parseInt(binary.substring(2,3));
sBoxByte[m*4+3] = parseInt(binary.substring(3,4));
}
return sBoxByte;
}
function pPermute(sBoxByte){
var pBoxPermute = new Array(32);
pBoxPermute[ 0] = sBoxByte[15];
pBoxPermute[ 1] = sBoxByte[ 6];
pBoxPermute[ 2] = sBoxByte[19];
pBoxPermute[ 3] = sBoxByte[20];
pBoxPermute[ 4] = sBoxByte[28];
pBoxPermute[ 5] = sBoxByte[11];
pBoxPermute[ 6] = sBoxByte[27];
pBoxPermute[ 7] = sBoxByte[16];
pBoxPermute[ 8] = sBoxByte[ 0];
pBoxPermute[ 9] = sBoxByte[14];
pBoxPermute[10] = sBoxByte[22];
pBoxPermute[11] = sBoxByte[25];
pBoxPermute[12] = sBoxByte[ 4];
pBoxPermute[13] = sBoxByte[17];
pBoxPermute[14] = sBoxByte[30];
pBoxPermute[15] = sBoxByte[ 9];
pBoxPermute[16] = sBoxByte[ 1];
pBoxPermute[17] = sBoxByte[ 7];
pBoxPermute[18] = sBoxByte[23];
pBoxPermute[19] = sBoxByte[13];
pBoxPermute[20] = sBoxByte[31];
pBoxPermute[21] = sBoxByte[26];
pBoxPermute[22] = sBoxByte[ 2];
pBoxPermute[23] = sBoxByte[ 8];
pBoxPermute[24] = sBoxByte[18];
pBoxPermute[25] = sBoxByte[12];
pBoxPermute[26] = sBoxByte[29];
pBoxPermute[27] = sBoxByte[ 5];
pBoxPermute[28] = sBoxByte[21];
pBoxPermute[29] = sBoxByte[10];
pBoxPermute[30] = sBoxByte[ 3];
pBoxPermute[31] = sBoxByte[24];
return pBoxPermute;
}
function finallyPermute(endByte){
var fpByte = new Array(64);
fpByte[ 0] = endByte[39];
fpByte[ 1] = endByte[ 7];
fpByte[ 2] = endByte[47];
fpByte[ 3] = endByte[15];
fpByte[ 4] = endByte[55];
fpByte[ 5] = endByte[23];
fpByte[ 6] = endByte[63];
fpByte[ 7] = endByte[31];
fpByte[ 8] = endByte[38];
fpByte[ 9] = endByte[ 6];
fpByte[10] = endByte[46];
fpByte[11] = endByte[14];
fpByte[12] = endByte[54];
fpByte[13] = endByte[22];
fpByte[14] = endByte[62];
fpByte[15] = endByte[30];
fpByte[16] = endByte[37];
fpByte[17] = endByte[ 5];
fpByte[18] = endByte[45];
fpByte[19] = endByte[13];
fpByte[20] = endByte[53];
fpByte[21] = endByte[21];
fpByte[22] = endByte[61];
fpByte[23] = endByte[29];
fpByte[24] = endByte[36];
fpByte[25] = endByte[ 4];
fpByte[26] = endByte[44];
fpByte[27] = endByte[12];
fpByte[28] = endByte[52];
fpByte[29] = endByte[20];
fpByte[30] = endByte[60];
fpByte[31] = endByte[28];
fpByte[32] = endByte[35];
fpByte[33] = endByte[ 3];
fpByte[34] = endByte[43];
fpByte[35] = endByte[11];
fpByte[36] = endByte[51];
fpByte[37] = endByte[19];
fpByte[38] = endByte[59];
fpByte[39] = endByte[27];
fpByte[40] = endByte[34];
fpByte[41] = endByte[ 2];
fpByte[42] = endByte[42];
fpByte[43] = endByte[10];
fpByte[44] = endByte[50];
fpByte[45] = endByte[18];
fpByte[46] = endByte[58];
fpByte[47] = endByte[26];
fpByte[48] = endByte[33];
fpByte[49] = endByte[ 1];
fpByte[50] = endByte[41];
fpByte[51] = endByte[ 9];
fpByte[52] = endByte[49];
fpByte[53] = endByte[17];
fpByte[54] = endByte[57];
fpByte[55] = endByte[25];
fpByte[56] = endByte[32];
fpByte[57] = endByte[ 0];
fpByte[58] = endByte[40];
fpByte[59] = endByte[ 8];
fpByte[60] = endByte[48];
fpByte[61] = endByte[16];
fpByte[62] = endByte[56];
fpByte[63] = endByte[24];
return fpByte;
}
function getBoxBinary(i) {
var binary = "";
switch (i) {
case 0 :binary = "0000";break;
case 1 :binary = "0001";break;
case 2 :binary = "0010";break;
case 3 :binary = "0011";break;
case 4 :binary = "0100";break;
case 5 :binary = "0101";break;
case 6 :binary = "0110";break;
case 7 :binary = "0111";break;
case 8 :binary = "1000";break;
case 9 :binary = "1001";break;
case 10 :binary = "1010";break;
case 11 :binary = "1011";break;
case 12 :binary = "1100";break;
case 13 :binary = "1101";break;
case 14 :binary = "1110";break;
case 15 :binary = "1111";break;
}
return binary;
}
/*
* generate 16 keys for xor
*
*/
function generateKeys(keyByte){
var key = new Array(56);
var keys = new Array();
keys[ 0] = new Array();
keys[ 1] = new Array();
keys[ 2] = new Array();
keys[ 3] = new Array();
keys[ 4] = new Array();
keys[ 5] = new Array();
keys[ 6] = new Array();
keys[ 7] = new Array();
keys[ 8] = new Array();
keys[ 9] = new Array();
keys[10] = new Array();
keys[11] = new Array();
keys[12] = new Array();
keys[13] = new Array();
keys[14] = new Array();
keys[15] = new Array();
var loop = [1,1,2,2,2,2,2,2,1,2,2,2,2,2,2,1];
for(i=0;i<7;i++){
for(j=0,k=7;j<8;j++,k--){
key[i*8+j]=keyByte[8*k+i];
}
}
var i = 0;
for(i = 0;i < 16;i ++){
var tempLeft=0;
var tempRight=0;
for(j = 0; j < loop[i];j ++){
tempLeft = key[0];
tempRight = key[28];
for(k = 0;k < 27 ;k ++){
key[k] = key[k+1];
key[28+k] = key[29+k];
}
key[27]=tempLeft;
key[55]=tempRight;
}
var tempKey = new Array(48);
tempKey[ 0] = key[13];
tempKey[ 1] = key[16];
tempKey[ 2] = key[10];
tempKey[ 3] = key[23];
tempKey[ 4] = key[ 0];
tempKey[ 5] = key[ 4];
tempKey[ 6] = key[ 2];
tempKey[ 7] = key[27];
tempKey[ 8] = key[14];
tempKey[ 9] = key[ 5];
tempKey[10] = key[20];
tempKey[11] = key[ 9];
tempKey[12] = key[22];
tempKey[13] = key[18];
tempKey[14] = key[11];
tempKey[15] = key[ 3];
tempKey[16] = key[25];
tempKey[17] = key[ 7];
tempKey[18] = key[15];
tempKey[19] = key[ 6];
tempKey[20] = key[26];
tempKey[21] = key[19];
tempKey[22] = key[12];
tempKey[23] = key[ 1];
tempKey[24] = key[40];
tempKey[25] = key[51];
tempKey[26] = key[30];
tempKey[27] = key[36];
tempKey[28] = key[46];
tempKey[29] = key[54];
tempKey[30] = key[29];
tempKey[31] = key[39];
tempKey[32] = key[50];
tempKey[33] = key[44];
tempKey[34] = key[32];
tempKey[35] = key[47];
tempKey[36] = key[43];
tempKey[37] = key[48];
tempKey[38] = key[38];
tempKey[39] = key[55];
tempKey[40] = key[33];
tempKey[41] = key[52];
tempKey[42] = key[45];
tempKey[43] = key[41];
tempKey[44] = key[49];
tempKey[45] = key[35];
tempKey[46] = key[28];
tempKey[47] = key[31];
switch(i){
case 0: for(m=0;m < 48 ;m++){ keys[ 0][m] = tempKey[m]; } break;
case 1: for(m=0;m < 48 ;m++){ keys[ 1][m] = tempKey[m]; } break;
case 2: for(m=0;m < 48 ;m++){ keys[ 2][m] = tempKey[m]; } break;
case 3: for(m=0;m < 48 ;m++){ keys[ 3][m] = tempKey[m]; } break;
case 4: for(m=0;m < 48 ;m++){ keys[ 4][m] = tempKey[m]; } break;
case 5: for(m=0;m < 48 ;m++){ keys[ 5][m] = tempKey[m]; } break;
case 6: for(m=0;m < 48 ;m++){ keys[ 6][m] = tempKey[m]; } break;
case 7: for(m=0;m < 48 ;m++){ keys[ 7][m] = tempKey[m]; } break;
case 8: for(m=0;m < 48 ;m++){ keys[ 8][m] = tempKey[m]; } break;
case 9: for(m=0;m < 48 ;m++){ keys[ 9][m] = tempKey[m]; } break;
case 10: for(m=0;m < 48 ;m++){ keys[10][m] = tempKey[m]; } break;
case 11: for(m=0;m < 48 ;m++){ keys[11][m] = tempKey[m]; } break;
case 12: for(m=0;m < 48 ;m++){ keys[12][m] = tempKey[m]; } break;
case 13: for(m=0;m < 48 ;m++){ keys[13][m] = tempKey[m]; } break;
case 14: for(m=0;m < 48 ;m++){ keys[14][m] = tempKey[m]; } break;
case 15: for(m=0;m < 48 ;m++){ keys[15][m] = tempKey[m]; } break;
}
}
return keys;
}
//end-------------------------------------------------------------------------------------------------------------
"""
if __name__ == '__main__':
print des_encode("15070876044")
print type(des_encode("15070876044"))
print des_encode("440678")
print type(des_encode("440678"))
|
News & Publications - Should Producers and CSRs Pay for E&O Claims?
"Recently we had an E&O claim that was $38,000 and our deductible is $25,000. The owners of the agency, while not active in the agency, understand money. I am president and report to them weekly and monthly at a board meeting. When this came up, they wanted the CSR of the producer who caused the loss, the office manager who the CSR reports to, and me as president to "pay" for the loss.
"It was one where the producer filled out the application incorrectly (entered the square footage of a building incorrectly) and gave the app to the CSR, asking her to market it for him. She did and for FOUR renewals the incorrect information was on the renewal apps, even changing companies during that time. The producer always carried the renewal out and gave information to the CSR. She did as he said—it was not her duty to verify anything on this account. Now she is to shoulder some of the blame?
"I cant get my owners to understand that I do not think its anyone's fault but the producer's—what experience have you in dealing with who is responsible for such losses, if anyone."
Based on the information provided, it does look like the CSR is the least culpable person of those mentioned...and probably the one least able to afford a chunk of the deductible. Your question raises a number of issues, some of them legal ones which, of course, we can't opine on. But below are some general observations from the VU faculty.
None — the deductible is owed by the named insured — the corporate brokerage.
It would be dangerous to go after the independent contractors who caused the loss and probably impossible if they are employees. Seek good legal advice.
Also, if the employee is working in the course and scope of his/her employment, it is the employer's problem unless you have an employment contract that sets out who is responsible for errors. That is why the employer buys E&O insurance and covers his employees.
I would anticipate a suit from the employee and in some states, like California, if the employer is a corporation, the employee may be entitled to indemnity from his/her employer. Be prepared.
Show me a CSR who can pony up $25,000 and I'll buy you lunch anywhere in the country.
Unless your agency specifically has a policy that requires someone other than the agency to pay the deductible (some producer contracts do include something on this), then there is little to stand on. For the record, agency owners should understand that it is their responsibility and that it may be more important than they think to make sure that there are appropriate policies and procedures in place. I do believe that they will have a hard time making the CSR, office manager and you pay the deductible without prior written notice on the policy. You may need to seek appropriate legal advice though.
You should have a conversation with an employment attorney. This is most likely a change in the terms of the employee's responsibilities.
While it is customary to have agreements with producers that they may be held responsible individually for deductibles on E&O claims arising from their accounts, I can't recall an agency pushing it down to the CSR level, even though I'm sure some do. I question, however, the practice if it isn't in the employment contract already. So if they balk, do you fire them? Do you sue them? If it's a producer with ownership in his or her book, are you creating another financial obligation via the termination? Have you thought this policy through? Can you afford to risk losing other employee loyalty to the agency following the inevitable firing of the CSR if you demand they pay? How will this affect morale? Will staff seek employment with an agency that does not invoke this?
$25,000 is a lot of money to stick on an employee, even if just a fraction. Did they have any say-so in the amount of the deductible on the policy? There are a lot of issues here and I know that if I were in the CSR's shoes, I'd quit and find a new job. You'd never get the money out of me. I think the owners need to back up and ask some hard questions like these. If they want to institute a policy of deductible reimbursement by offending employees, fine, but they need to get it in writing, outlining the consequences.
My experience has been that very few agencies cause the CSRs to cover an E&O claim. Quite a few, though probably not a majority, cause producers to cover a portion of the claim, but rarely the entire claim. Whether it is legal to cause the producer to cover any portion is an issue to discuss with an HR attorney. Some attorneys believe it is illegal to deduct such claims from wages because that may be a violation of employment laws.
I serve on the board of directors of an agency and our procedure is that we put all E&Os to a committee, of which I am the chairman, to determine the cause. If the producer is the cause, then he/she pays the deductible. If the CSR is the cause, then the agency pays the deductible. In both instances, we strive to determine the source of the E&O so as to eliminate it in the future. Obviously, if the same CSR continues to have issues, then other action will be taken.
Unless the employees are under a contract that states that they will pay for E&O deductibles caused by their mistakes, the agency is responsible for the E&O deductible in my opinion. Do the PROFITS of the agency go to the CSR, manager and you when things go right? If not, what reason do the owners have for charging errors to the employees? In some agencies, the employer will terminate the employment of people who repetitively cause errors that expose the agency to E&O claims. Charging $25,000 to the employees you mention is tantamount to asking you all to find new jobs.
"As a result of the tight market and increasing deductibles, some agencies are trying to foist their E&O deductibles off onto their producers. Given that a deductible is designed to make the insured participate in the loss, is it right or even legal to contractually assign the risk of the deductible to individual producers?
"Producers from one agency are facing as much as a $75,000 charge against their earnings. That deductible is based on the entire agency's exposure, not the exposure of just one agent. And what should we do if producers from a member agency call us for advice?"
At a minimum, the producer should be responsible for their share, in line with the commission split. If they get a 50-50 split, they are responsible for 50% of the deductible.
I have seen this done, but discourage my clients from even considering it. In my opinion, the deductible is part of the business risk. If you have a producer who is causing these problems on an ongoing basis, don't penalize him or her, get someone who is more cautious! After all, WHO hired this person??? The E&O deductible buck stops with the owners...they reap the profits; they bear the risks.
It actually goes on in more states than just yours. You will typically see this arrangement when the agency treats producers as independent contractors. Although that is a totally different issue, most producers are not legally independent contractors with an independent insurance agency. As for an employee producer, rarely do I see a contract that requires them to be responsible for the deductible on the E&O insurance.
I don't know if it is common, but I've seen it a couple of times. I've not seen any agency contracts that would prohibit it.
I haven't really heard anything about this but I can guess where the issue comes from. I would imagine that the producer would be an independent contractor of the agency and covered under the agency's E&O policy. Since the producer is not an employee, the agency allows him/her to fall under their E&O policy, but only if they are responsible for paying the deductible should a claim arise caused by an error or omission from their activities. I haven't seen any agency contract either way.
I know of an agency that has a "fine" cookie jar. In order to impress upon employees the need to follow E&O procedures, if someone is caught "breaking a rule," they get a fine...$100 for owners, $75 for producers, $50 for CSRs, and $25 for others. They don't hold these people accountable for the E&O deductible...their financial commitment is in prevention, not punishment after the fact. After all, some of these errors are human mistakes which will happen.
What about agencies with $10,000, $25,000, or $50,000 deductibles? That's a BIG price to pay for a mistake...I bet the owners don't carry those kinds of deductibles on their HO and PAP policies! If you're going to "fine" them, make it $250-$500 or whatever's affordable for that person. If they deserve worse or mistakes are chronic, get rid of them. The key is quality control BEFORE the loss occurs.
|
from typing import Dict, List
import textwrap
from keras.layers import Input
from overrides import overrides
import numpy
from ...data.instances.multiple_choice_qa import TupleInferenceInstance
from ...layers import NoisyOr
from ...layers.attention import MaskedSoftmax
from ...layers.backend import RepeatLike
from ...layers.subtract_minimum import SubtractMinimum
from ...layers.tuple_matchers import tuple_matchers
from ...training import TextTrainer
from ...training.models import DeepQaModel
from ...common.params import Params
class TupleInferenceModel(TextTrainer):
"""
This ``TextTrainer`` implements the TupleEntailment model of Tushar. It takes a set of tuples
from the question and its answer candidates and a set of background knowledge tuples and looks
for entailment between the corresponding tuple slots. The result is a probability distribution
over the answer options based on how well they align with the background tuples, given the
question text. We consider this alignment to be a form of soft inference, hence the model
name.
Parameters
----------
tuple_matcher: Dict[str, Any]
Parameters for selecting and then initializing the inner entailment model, one of the
TupleMatch models.
noisy_or_param_init: str, default='uniform'
The initialization for the noise parameters in the ``NoisyOr`` layers.
num_question_tuples: int, default=10
The number of tuples for each of the answer candidates in the question.
num_background_tuples: int, default=10
The number of tuples for the background knowledge.
num_tuple_slots: int, default=4
The number of slots in each tuple.
num_slot_words: int, default=5
The number of words in each slot of the tuples.
num_options: int, default=4
The number of answer options/candidates.
normalize_tuples_across_answers: bool, default=False
Whether or not to normalize each question tuple's score across the answer options. This
assumes that the tuples are in the same order for all answer options. Normalization is
currently done by subtracting the minimum score for a given tuple "position" from all the
tuples in that position.
display_text_wrap: int, default=150
This is used by the debug output methods to wrap long tuple strings.
display_num_tuples: int, default=5
This is used by the debug output methods. It determines how many background tuples to display for
each of the answer tuples in a given instance when displaying the tuple match scores.
"""
def __init__(self, params: Params):
self.noisy_or_param_init = params.pop('noisy_or_param_init', 'uniform')
self.num_question_tuples = params.pop('num_question_tuples', None)
self.num_background_tuples = params.pop('num_background_tuples', None)
self.num_tuple_slots = params.pop('num_tuple_slots', None)
self.num_slot_words = params.pop('num_slot_words', None)
self.num_options = params.pop('num_answer_options', None)
self.normalize_tuples_across_answers = params.pop('normalize_tuples_across_answers', False)
self.display_text_wrap = params.pop('display_text_wrap', 150)
self.display_num_tuples = params.pop('display_num_tuples', 5)
tuple_matcher_params = params.pop('tuple_matcher', {})
tuple_matcher_choice = tuple_matcher_params.pop_choice("type", list(tuple_matchers.keys()),
default_to_first_choice=True)
tuple_matcher_class = tuple_matchers[tuple_matcher_choice]
self.tuple_matcher = tuple_matcher_class(self, tuple_matcher_params)
self.tuple_matcher.name = "match_layer"
super(TupleInferenceModel, self).__init__(params)
self.name = 'TupleInferenceModel'
@overrides
def _instance_type(self):
return TupleInferenceInstance
@classmethod
@overrides
def _get_custom_objects(cls):
custom_objects = super(TupleInferenceModel, cls)._get_custom_objects()
for tuple_matcher in tuple_matchers.values():
custom_objects.update(tuple_matcher.get_custom_objects())
custom_objects['MaskedSoftmax'] = MaskedSoftmax
custom_objects['NoisyOr'] = NoisyOr
custom_objects['RepeatLike'] = RepeatLike
custom_objects['SubtractMinimum'] = SubtractMinimum
return custom_objects
@overrides
def get_padding_lengths(self) -> Dict[str, int]:
padding_lengths = super(TupleInferenceModel, self).get_padding_lengths()
padding_lengths['num_question_tuples'] = self.num_question_tuples
padding_lengths['num_background_tuples'] = self.num_background_tuples
padding_lengths['num_slots'] = self.num_tuple_slots
padding_lengths['num_sentence_words'] = self.num_slot_words
padding_lengths['num_options'] = self.num_options
return padding_lengths
@overrides
def get_instance_sorting_keys(self) -> List[str]: # pylint: disable=no-self-use
return ['num_sentence_words', 'num_background_tuples', 'num_question_tuples', 'num_slots']
@overrides
def _set_padding_lengths(self, padding_lengths: Dict[str, int]):
super(TupleInferenceModel, self)._set_padding_lengths(padding_lengths)
# The number of tuple slots determines the shape of some of the weights in our model, so we
# need to keep this constant.
if self.num_tuple_slots is None:
self.num_tuple_slots = padding_lengths['num_slots']
if self.data_generator is not None and self.data_generator.dynamic_padding:
return
if self.num_question_tuples is None:
self.num_question_tuples = padding_lengths['num_question_tuples']
if self.num_background_tuples is None:
self.num_background_tuples = padding_lengths['num_background_tuples']
if self.num_slot_words is None:
self.num_slot_words = padding_lengths['num_sentence_words']
if self.num_options is None:
self.num_options = padding_lengths['num_options']
@overrides
def get_padding_memory_scaling(self, padding_lengths: Dict[str, int]) -> int:
num_question_tuples = padding_lengths['num_question_tuples']
num_background_tuples = padding_lengths['num_background_tuples']
num_sentence_words = padding_lengths['num_sentence_words']
num_options = padding_lengths['num_options']
return num_question_tuples * num_background_tuples * num_sentence_words * num_options
@overrides
def _set_padding_lengths_from_model(self):
self.num_background_tuples = self.model.get_input_shape_at(0)[1][1]
self.num_options = self.model.get_input_shape_at(0)[0][1]
self.num_question_tuples = self.model.get_input_shape_at(0)[0][2]
self.num_tuple_slots = self.model.get_input_shape_at(0)[0][3]
self.num_slot_words = self.model.get_input_shape_at(0)[0][4]
self._set_text_lengths_from_model_input = self.model.get_input_shape_at(0)[0][4:]
@overrides
def _build_model(self):
r"""
The basic outline of the model is that the question input, :math:`\mathcal{A}` (which consists of the
inputs for each of the answer choices, i.e., each :math:`A^c \in \mathcal{A}`), and the background input,
:math:`\mathcal{K}`, get tiled to be the same size. They are then aligned tuple-by-tuple: each of the
background tuples, :math:`k_j` is compared to each of the answer tuples, :math:`a_i^c`, to create a
support/entailment score, :math:`s_{ij}^c`. This score is determined using the selected ``TupleMatch``
layer.
Then, for each answer tuple, :math:`a_i^c \in A^c` we combine
the scores for each :math:`k_j \in K` using noisy-or to get the entailment score for the given answer
        choice tuple:
        :math:`s_i^c = 1 - \prod_{j=1:J}(1 - q_1 * s_{ij}^c)`
        where :math:`q_1` is the noise parameter for this first noisy-or. Next, we combine these scores for each answer
        choice again using the noisy-or to get the entailment score for the answer candidate:
        :math:`s^c = 1 - \prod_{i=1:N}(1 - q_2 * s_{i}^c)`
        where :math:`q_2` is the noise parameter for this second noisy-or. At this point, we have a score for each of
the answer candidates, and so we perform a softmax to determine which is the best answer.
"""
# shape: (batch size, num_options, num_question_tuples, num_tuple_slots, num_slot_words)
slot_shape = self._get_sentence_shape(self.num_slot_words)
question_input_shape = (self.num_options, self.num_question_tuples, self.num_tuple_slots) + slot_shape
question_input = Input(question_input_shape, dtype='int32', name='question_input')
# shape: (batch size, num_background_tuples, num_tuple_slots, num_slot_words)
background_input_shape = (self.num_background_tuples, self.num_tuple_slots) + slot_shape
background_input = Input(background_input_shape, dtype='int32', name='background_input')
# Expand and tile the question input to be:
# shape: (batch size, num_options, num_question_tuples, num_background_tuples, num_tuple_slots,
# num_slot_words)
tiled_question = RepeatLike(axis=3, copy_from_axis=1)([question_input, background_input])
# Expand and tile the background input to match question input.
# shape: (batch size, num_options, num_question_tuples, num_background_tuples, num_tuple_slots,
# num_slot_words)
# First, add num_options.
tiled_background = RepeatLike(axis=1, copy_from_axis=1)([background_input, question_input])
# Next, add num_question_tuples.
tiled_background = RepeatLike(axis=2, copy_from_axis=2)([tiled_background, question_input])
# Find the matches between the question and background tuples.
# shape: (batch size, num_options, num_question_tuples, num_background_tuples)
matches = self.tuple_matcher([tiled_question, tiled_background])
# Find the probability that any given question tuple is entailed by the given background tuples.
# shape: (batch size, num_options, num_question_tuples)
combine_background_evidence = NoisyOr(axis=-1, param_init=self.noisy_or_param_init)
combine_background_evidence.name = "noisy_or_1"
qi_probabilities = combine_background_evidence(matches)
# If desired, peek across the options, and normalize the amount that a given answer tuple template "counts"
# towards a correct answer.
if self.normalize_tuples_across_answers:
normalize_across_options = SubtractMinimum(axis=1)
qi_probabilities = normalize_across_options(qi_probabilities)
        # Find the probability that any given option is correct, given the entailment scores of each of its
# question tuples given the set of background tuples.
# shape: (batch size, num_options)
combine_question_evidence = NoisyOr(axis=-1, param_init=self.noisy_or_param_init)
combine_question_evidence.name = "noisy_or_2"
options_probabilities = combine_question_evidence(qi_probabilities)
# Softmax over the options to choose the best one.
final_output = MaskedSoftmax(name="masked_softmax")(options_probabilities)
return DeepQaModel(input=[question_input, background_input], output=[final_output])
@overrides
def _instance_debug_output(self, instance: TupleInferenceInstance, outputs: Dict[str, numpy.array]) -> str:
num_to_display = 5
result = ""
result += "\n====================================================================\n"
result += "Instance: %s\n" % instance.index
result += "Question Text: %s\n" % instance.question_text
result += "Label: %s\n" % instance.label
result += "Num tuples per answer option: %s\n" % [len(answer) for answer in instance.answer_tuples]
result += "(limiting display to top %s at various levels)\n" % num_to_display
result += "====================================================================\n"
answer_scores = []
index_of_chosen = None
softmax_output = outputs.get("masked_softmax", None)
if softmax_output is not None:
answer_scores = list(enumerate(softmax_output))
sorted_answer_scores = sorted(answer_scores, key=lambda tup: tup[1], reverse=True)
# TODO(becky): not handling ties
index_of_chosen = sorted_answer_scores[0][0]
result += "Final scores: %s\n" % answer_scores
if index_of_chosen is None:
result += "ERROR: no answer chosen\n"
elif index_of_chosen == instance.label:
result += " Answered correctly!\n"
else:
result += " Answered incorrectly\n"
result += "====================================================================\n"
# Output of the tuple matcher layer:
# shape: (num_options, num_question_tuples, num_background_tuples)
tuple_matcher_output = outputs.get('match_layer', None)
if tuple_matcher_output is not None:
# correct answer:
# Keep only the first tuples (depending on model setting) because when we padded we set
# truncate_from_right to False.
correct_tuples = instance.answer_tuples[instance.label][:self.num_question_tuples]
background_tuples = instance.background_tuples[:self.num_background_tuples]
result += "-----------------------------------\n"
result += " GOLD ANSWER: (Final score: {0})\n".format(answer_scores[instance.label][1])
result += "-----------------------------------\n"
result += self._render_tuple_match_scores(correct_tuples,
background_tuples,
tuple_matcher_output[instance.label],
instance)
result += "-------------------\n"
result += " Incorrect Answers: \n"
result += "-------------------\n"
        # NOTE: extra padded "options" are added on the right, so this should be fine.
for option in range(len(instance.answer_tuples)):
chosen_status = ""
if option != instance.label:
option_tuples = instance.answer_tuples[option][:self.num_question_tuples]
if option == index_of_chosen:
chosen_status = "(Chosen)"
result += "\nOption {0} {1}: (Final Score: {2})\n".format(option,
chosen_status,
answer_scores[option][1])
result += self._render_tuple_match_scores(option_tuples,
background_tuples,
tuple_matcher_output[option],
instance)
result += "\n"
return result
def _render_tuple_match_scores(self, answer_tuples, background_tuples, tuple_matcher_output, instance):
result = ""
for i, answer_tuple in enumerate(answer_tuples):
answer_tuple_string = "\n\t".join(textwrap.wrap(answer_tuple.display_string(), self.display_text_wrap))
result += "Question (repeated): %s\n" % instance.question_text
result += "Answer_tuple_{0} : \n\t{1}\n\n".format(i, answer_tuple_string)
result += "Top {0} (out of {1}) highest scoring background tuples:\n\n".format(self.display_num_tuples,
len(background_tuples))
tuple_match_scores = []
for j, background_tuple in enumerate(background_tuples):
tuple_match_score = tuple_matcher_output[i, j]
tuple_match_scores.append((tuple_match_score, j, background_tuple))
# Sort descending by tuple match score
sorted_by_score = sorted(tuple_match_scores, key=lambda tup: tup[0],
reverse=True)[:self.display_num_tuples]
for scored in sorted_by_score:
background_tuple_index = scored[1]
background_tuple_string = scored[2].display_string()
wrapped_tuple = "\n\t".join(textwrap.wrap(background_tuple_string, self.display_text_wrap))
result += " (TupleMatch Score %s) " % scored[0]
result += "\tbg_tuple_{0} \n\t{1}\n".format(background_tuple_index, wrapped_tuple)
result += "\n"
return result
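# A minimal, self-contained numpy sketch (not part of the original model) of the noisy-or
# scoring described in ``_build_model`` above. The noise value of 0.8 and the random match
# scores are illustrative assumptions only; the real model learns its noise parameters inside
# the ``NoisyOr`` layers and applies a ``MaskedSoftmax`` over the option scores.
def _noisy_or_scoring_sketch():
    import numpy
    def noisy_or(scores, noise=0.8):
        # Combine probabilities along the last axis: 1 - prod_j(1 - q * s_j).
        return 1.0 - numpy.prod(1.0 - noise * scores, axis=-1)
    # Shapes follow the documented defaults:
    # (num_options=4, num_question_tuples=10, num_background_tuples=10).
    matches = numpy.random.rand(4, 10, 10)
    per_tuple = noisy_or(matches)     # shape: (4, 10) -- support for each question tuple
    per_option = noisy_or(per_tuple)  # shape: (4,) -- support for each answer option
    exp = numpy.exp(per_option - per_option.max())
    return exp / exp.sum()            # softmax over the answer options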
|
Stay in one of 69 guestrooms featuring flat-screen televisions. Complimentary wireless Internet access keeps you connected, and cable programming is available for your entertainment. Private bathrooms with shower/tub combinations feature complimentary toiletries and hair dryers. Conveniences include phones, as well as desks and coffee/tea makers.
Offering an à la carte restaurant with live cooking, Van Der Valk Hotel Zwolle is located 6 km from Zwolle. Free Wi-Fi access is available. Rooms here will provide you with air conditioning, a minibar and a seating area. There is also an electric kettle. Featuring a shower, private bathrooms also come with a bath and a hairdryer. Extras include a desk, a safety deposit box and ironing facilities. At Van Der Valk Hotel Zwolle you will find a breakfast buffet in the morning. Other facilities offered include a communal lounge and a bar. Guests can make free use of wellness and fitness facilities. The hotel is 6.1 km from IJsselhallen Zwolle, 5.2 km from Stedelijk Museum Zwolle and 5.4 km from Overijssels Centrum Beeldende Kunsten. The property offers free parking and a Tesla charging point.
Be sure to enjoy recreational amenities including a sauna and a fitness center. Additional features at this hotel include complimentary wireless Internet access, babysitting/childcare (surcharge), and tour/ticket assistance.
Featured amenities include complimentary wired Internet access, a 24-hour business center, and express check-in. Planning an event in Zwolle? This hotel has facilities measuring 4844 square feet (450 square meters), including a conference center. Self parking (subject to charges) is available onsite.
Take in the views from a terrace and a garden and make use of amenities such as complimentary wireless Internet access. Additional amenities at this hotel include concierge services, gift shops/newsstands, and wedding services.
A stay at Bilderberg Grand Hotel Wientjes places you in the heart of Zwolle, within a 10-minute walk of Sassenpoort and Museum De Fundatie. This 4-star hotel is 0.6 mi (0.9 km) from Grote Markt and 0.6 mi (0.9 km) from Grote Kerk.
Offering quality accommodation in the historic district of Zwolle, the hotel is a popular choice for both business and leisure travellers. From the property, guests can enjoy easy access to all that the lively city has to offer. This splendid establishment is within easy reach of the city's main attractions such as the Historisch Centrum Overijssel and Stedelijk Museum Zwolle. This pet-friendly hotel is also 3 kilometres from the Zwolle train station, which offers easy access for exploring the area. The property has different accommodation options including comfortable rooms and suites. Each unit enjoys delightful design to offer a haven of peace and tranquillity in which to unwind after a busy day of work or sightseeing. The glamorous hotel restaurant serves exquisite dishes, which might be followed by a quality drink in the lounge bar. Corporate guests may make use of versatile meeting facilities.
Librije’s Hotel, in Zwolle’s historic centre, offers elegant rooms and a 3 Michelin-star restaurant in the unusual setting of a renovated 18th-century women’s prison. Free WiFi is available. Featuring a unique decor, all of the luxurious air-conditioned rooms and suites at Librije's Hotel come with a flat-screen TV and a minibar with complimentary drinks and snacks. The bathroom is fitted with a large bath or a shower. A range of cookery lessons and wine courses are available at Librije's Atelier. There is also a shop offering wines and home-made delicacies. The kitchen of Restaurant De Librije sources its natural ingredients and products locally. Guests will also find a lounge where aperitifs or cocktails can be enjoyed. Librije's Hotel is a 10-minute walk from the Zwolle City Museum. Zwolle Railway Station is less than a 20-minute walk away. The centre of Apeldoorn is just over a 30-minute drive from the hotel.
lastminute.com has a fantastic range of hotels in Zwolle, with everything from cheap hotels to luxurious five star accommodation available. We like to live up to our last minute name so remember you can book any one of our excellent Zwolle hotels up until midnight and stay the same night.
|
"""
WiFi commands.
"""
import collections
import shellish
from . import base
from .. import ui
class AccessPoints(base.ECMCommand):
""" List access points seen by site surveys. """
name = 'aps'
def setup_args(self, parser):
self.add_router_argument('idents', nargs='*')
self.add_argument('-v', '--verbose', action='store_true', help='More '
'verbose display.')
self.inject_table_factory()
super().setup_args(parser)
def run(self, args):
if args.idents:
ids = ','.join(self.api.get_by_id_or_name('routers', x)['id']
for x in args.idents)
filters = {"survey__router__in": ids}
else:
filters = {}
check = '<b>%s</b>' % shellish.beststr('✓', '*')
if args.verbose:
fields = collections.OrderedDict((
('SSID', 'wireless_ap.ssid'),
('BSSID', 'wireless_ap.bssid'),
('Manufacturer', 'wireless_ap.manufacturer'),
('Band', 'survey.band'),
('Mode', 'wireless_ap.mode'),
('Auth', 'wireless_ap.authmode'),
('Channel', 'survey.channel'),
('RSSI', 'survey.rssi'),
('First Seen', lambda x: ui.time_since(x['survey.created'])),
('Last Seen', lambda x: ui.time_since(x['survey.updated'])),
('Seen By', 'survey.router.name'),
('Trusted', lambda x: check if x['trust.trusted'] else ''),
))
else:
fields = collections.OrderedDict((
('SSID', 'wireless_ap.ssid'),
('Manufacturer', 'wireless_ap.manufacturer'),
('Band', 'survey.band'),
('Auth', 'wireless_ap.authmode'),
('Last Seen', lambda x: ui.time_since(x['survey.updated'])),
('Seen By', 'survey.router.name'),
('Trusted', lambda x: check if x['trust.trusted'] else ''),
))
survey = self.api.get_pager('wireless_ap_survey_view',
expand='survey.router,trust,wireless_ap',
**filters)
with self.make_table(headers=fields.keys(),
accessors=fields.values()) as t:
t.print(map(dict, map(base.totuples, survey)))
class Survey(base.ECMCommand):
""" Start a WiFi site survey on connected router(s). """
name = 'survey'
use_pager = False
def setup_args(self, parser):
self.add_router_argument('idents', nargs='*')
def run(self, args):
if args.idents:
ids = [self.api.get_by_id_or_name('routers', x)['id']
for x in args.idents]
else:
ids = [x['id'] for x in self.api.get_pager('routers')]
self.api.post('wireless_site_survey', ids)
class WiFi(base.ECMCommand):
""" WiFi access points info and surveys. """
name = 'wifi'
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.add_subcommand(AccessPoints, default=True)
self.add_subcommand(Survey)
command_classes = [WiFi]
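# Assumed shell usage (a sketch only; the exact invocation depends on the surrounding ECM
# command-line front end, which is not shown in this file):
#   wifi aps -v router-1 router-2   # verbose AP table limited to the named routers
#   wifi survey                     # start a site survey on every connected router
#   wifi                            # 'aps' is registered as the default subcommand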
|
Subscriptions are renewable on 1st January annually. However, if you join the Society between 1st October and 31st December, your initial subscription remains valid until the end of the following year.
We can accept, by post, up to three years' subscription in a single payment to reduce the cost of obtaining sterling cheques. If you wish to join or renew your membership by post, do not use the Eshop method of payment below, but use the application form (link here) and post this along with your payment to Maureen Griffiths (details on the PDF form).
|
#CHIPSEC: Platform Security Assessment Framework
#Copyright (c) 2010-2015, Intel Corporation
#
#This program is free software; you can redistribute it and/or
#modify it under the terms of the GNU General Public License
#as published by the Free Software Foundation; Version 2.
#
#This program is distributed in the hope that it will be useful,
#but WITHOUT ANY WARRANTY; without even the implied warranty of
#MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
#GNU General Public License for more details.
#
#You should have received a copy of the GNU General Public License
#along with this program; if not, write to the Free Software
#Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#
#Contact information:
#[email protected]
#
"""
QEMU VirtIO info tool
Usage:
``chipsec_main.py -i -m tools.vmm.virtio``
``chipsec_main.py -i -m tools.vmm.virtio -a 0:6.0``
"""
import re
from chipsec.module_common import *
from chipsec.hal.pci import *
from chipsec.hal.mmio import *
VENDORS[0x1AF4] = 'Red Hat, Inc.'
DEVICES[0x1AF4] = {
0x1000: 'VirtIO Network',
0x1001: 'VirtIO Block',
    0x1002: 'VirtIO Balloon',
0x1003: 'VirtIO Console',
0x1004: 'VirtIO SCSI',
0x1005: 'VirtIO RNG',
0x1009: 'VirtIO filesystem',
0x1041: 'VirtIO network (1.0)',
0x1042: 'VirtIO block (1.0)',
0x1043: 'VirtIO console (1.0)',
0x1044: 'VirtIO RNG (1.0)',
0x1045: 'VirtIO memory balloon (1.0)',
0x1046: 'VirtIO SCSI (1.0)',
0x1049: 'VirtIO filesystem (1.0)',
0x1050: 'VirtIO GPU (1.0)',
0x1052: 'VirtIO input (1.0)',
0x1110: 'VirtIO Inter-VM shared memory'
}
VIRTIO_VENDORS = [0x1AF4]
class VirtIO_MMIO_Device(BaseModule):
def __init__(self, b, d, f):
BaseModule.__init__(self)
self.bus = b
self.dev = d
self.fun = f
def get_bars(self):
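        # Read the six 32-bit BAR registers at PCI config offsets 0x10 through 0x24.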
return [self.cs.pci.read_dword(self.bus, self.dev, self.fun, x) for x in xrange(0x10, 0x28, 4)]
def print_virtio_device(self):
self.logger.log("")
self.logger.log("VirtIO Device %02x:%02x.%01x" % (self.bus, self.dev, self.fun))
bars = self.get_bars()
for i in xrange(len(bars)):
if bars[i] in [0x0, 0xFFFFFFFF]: continue
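            # PCI BAR bit 0 selects the address space: 0 = memory-mapped (mask off the low
            # 4 bits for the base address), 1 = I/O ports (mask off the low 2 bits). The code
            # below dumps 4KB for memory BARs or 256 bytes for I/O BARs and prints the
            # non-empty dwords.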
if bars[i] & 0x1 == 0:
base = bars[i] & 0xFFFFFFF0
data = struct.unpack("<1024L", self.cs.mem.read_physical_mem(base, 0x1000))
else:
base = bars[i] & 0xFFFFFFFC
data = [self.cs.io.read_port_dword(x) for x in xrange(base, base + 0x100, 4)]
self.logger.log(" BAR%d: 0x%08x (assuming size is 4096 bytes)" % (i, base))
for x in xrange(len(data)):
if data[x] in [0x0, 0xFFFFFFFF]: continue
self.logger.log(" BAR + 0x%04x: 0x%08x" % (x * 4, data[x]))
return
class VirtIO(BaseModule):
def __init__(self):
BaseModule.__init__(self)
def get_virtio_devices(self, devices):
virtio_devices = []
for (b, d, f, vid, did) in devices:
if vid in VIRTIO_VENDORS:
virtio_devices.append((b, d, f, vid, did))
return virtio_devices
def run(self, module_argv):
self.logger.start_test("QEMU VirtIO info tool")
pcie_dev = []
if len(module_argv) >= 1:
match = re.search(r"^([0-9a-f]{1,2}):([0-1]?[0-9a-f]{1})\.([0-7]{1})$", module_argv[0])
if match:
_bus = int(match.group(1), 16) & 0xFF
_dev = int(match.group(2), 16) & 0x1F
_fun = int(match.group(3), 16) & 0x07
vid = self.cs.pci.read_word(_bus, _dev, _fun, 0)
did = self.cs.pci.read_word(_bus, _dev, _fun, 2)
dev = (_bus, _dev, _fun, vid, did)
pcie_dev = [dev]
virt_dev = [dev]
else:
self.logger.log("ERROR: Invalid B:D.F (%s)" % module_argv[0])
return ModuleResult.ERROR
else:
self.logger.log("Enumerating available PCI devices..")
pcie_dev = self.cs.pci.enumerate_devices()
virt_dev = self.get_virtio_devices(pcie_dev)
self.logger.log("PCI devices:")
print_pci_devices(virt_dev)
for (b, d, f, vid, did) in virt_dev:
dev = VirtIO_MMIO_Device(b, d, f)
dev.print_virtio_device()
return ModuleResult.PASSED
|
Grampians Getaway Accommodation - The Pyramids. Environment.
At Grampians Getaway we care for the environment. All waste water is collected and pumped to our oxidation ponds at the rear of the property. We use environmentally friendly chemicals which results in these ponds being habitat for frogs, turtles and nesting ducks.
Although we are connected to town water we try to keep our water usage to a minimum by watering gardens from water collected in large underground tanks connected to each villa and large tanks attached to the garage and house. Our garden beds have been planted with flowering native plants and other species to attract bees, honeyeaters and other birds, with advice from Phill at Vaughans Native Plant Nursery in Pomonal.
We have recently purchased some small native fish to replenish the fish in the lake; hopefully these will grow and reproduce going forward.
|
# Copyright 2014 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
from recipe_engine import recipe_api
import datetime
import time
class TimeApi(recipe_api.RecipeApi):
def __init__(self, **kwargs):
super(TimeApi, self).__init__(**kwargs)
self._fake_time = None
self._fake_step = None
if self._test_data.enabled:
self._fake_time = self._test_data.get('seed', 1337000000.0)
self._fake_step = self._test_data.get('step', 1.5)
def time(self):
"""Return current timestamp as a float number of seconds since epoch."""
if self._test_data.enabled:
self._fake_time += self._fake_step
return self._fake_time
else: # pragma: no cover
return time.time()
def utcnow(self):
"""Return current UTC time as a datetime.datetime."""
if self._test_data.enabled:
self._fake_time += self._fake_step
return datetime.datetime.utcfromtimestamp(self._fake_time)
else: # pragma: no cover
return datetime.datetime.utcnow()
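# Assumed usage from a recipe (a sketch; exposing this module as ``api.time`` is an
# assumption based on the class name, not something shown in this file):
#   now_seconds = api.time.time()   # float seconds since epoch; deterministic under tests
#   now_utc = api.time.utcnow()     # datetime.datetime in UTC; advances by the fake step in tests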
|
The anvil is very cold and virtually precipitation-free, even though virga can be seen falling from the forward-sheared anvil. Severe cases are called mud storms. And whoever will say again that we have no discipline when it comes to grabbing the center stage, supposedly stepping on anyone just to get famous, can think again.
The updraft tower is typically more strongly tilted and the deviant rightward motion lesser than for other supercell types.
Radar features of a supercell[ edit ] Radar reflectivity map Hook echo or Pendant The "hook echo" is the area of confluence between the main updraft and the rear flank downdraft RFD.
Typically, supercells are found in the warm sector of a low pressure system propagating generally in a north easterly direction in line with the cold front of the low pressure system.
But the core group stuck it out; hence the sweet victory for them and the community.
They have also been observed by storm chasers in Australia and Argentina the Pampas.
All types of supercells typically produce severe weather. The radar signature of an RFD is a hook-like structure where sinking air has brought precipitation down with it.
Someone commented "I haven't seen so many Filipinos in my entire 37 years in Vancouver in one event like it - only today. Hail spike This three body scatter spike is a region of weak echoes found radially behind the main reflectivity core at higher elevations when large hail is present.
Yes, we all know how beautiful the Pinoy culture is, but has anyone thought of proudly showing it? The boldness of the organizers to show the color, music, grace and magnificense of the Pinoy culture in the streets of Vancouver via the the first ever Filipino parade 3 years ago amazed the locals to no end.
On Doppler radar, the region of very high precipitation echoes with a very sharp gradient perpendicular to the RFD. A region of this area is called the Vault. Congratulations Hirit team - the old timers were huddled in one corner of the park totally amazed and proud of all of you.
An observer who is at ground level too close to the storm is unable to see the overshooting top because the anvil blocks this feature from view. They most often dissipate rather than turning into classic or HP supercells, although it is still not unusual for LPs to do the latter, especially when moving into a much moister air mass.
Strong updrafts lift the air turning about a horizontal axis and cause this air to turn about a vertical axis. The Hirit team is a group to reckon with.
A 'bolt from the blue' typically originates in the highest regions of a cumulonimbus cloud, traveling horizontally a good distance away from the thunderstorm before making a vertical descent to earth. An educational video for kids.
This one-minute video allows viewers to see how a thunderstorm is formed. The video uses actual footage, animation, and time-lapse photography. Bolt from the Blue: A bolt from the blue (sometimes called 'anvil lightning' or 'anvil-to-ground' lightning) is a name given to a cloud-to-ground lightning discharge that strikes far away from its parent thunderstorm.
A 'bolt from the blue' typically originates in the highest regions of a cumulonimbus cloud, traveling horizontally a good distance away from the thunderstorm before making a vertical descent to earth.
The Bureau of Meteorology constantly monitors all of its radars to detect thunderstorms. Severe Thunderstorm Warnings will be issued if the storms are expected to produce any of the following: damaging wind gusts (90 km/h or more), large hail (2 cm diameter or more).
Description: Distant Thunderstorm is an ambient sound loop to help you sleep, relax, meditate, relieve stress, or block out unwanted noises.
This is the official Distant Thunderstorm Skill by Invoked Apps, a top-rated Alexa developer trusted by millions of Echo device owners. Forecast Products: Current Weather Watches; This is the current graphic showing any severe thunderstorm and tornado watches which are in effect over the contiguous United States.
|
#!/usr/bin/python2
#-*- coding: utf-8 -*-
## =====================================================================
## Menus for Loltris
## Copyright (C) 2014 Jonas Møller <[email protected]>
##
## This program is free software: you can redistribute it and/or modify
## it under the terms of the GNU General Public License as published by
## the Free Software Foundation, either version 3 of the License, or
## (at your option) any later version.
##
## This program is distributed in the hope that it will be useful,
## but WITHOUT ANY WARRANTY; without even the implied warranty of
## MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
## GNU General Public License for more details.
##
## You should have received a copy of the GNU General Public License
## along with this program. If not, see <http://www.gnu.org/licenses/>.
## =====================================================================
import Shared
import BlockText
import Core
import Load
import Jobs
import Log
import Factory
import Credits
import Save
import Matrix
import Utils
import webbrowser as Webbrowser
import os.path as Path
from functools import *
from pygame.locals import *
from Globals import *
from DataTypes import *
## Games
import HighscoreExplorer
import TetrisGame
import MakeTetromino
import SandBox
import TwoPlayerTetrisGame
# import LANTetrisGame
class MainMenu(Core.Menu):
def __init__(self, **kwargs):
super(MainMenu, self).__init__(
"MainMenu", onHeaderClick=lambda: Webbrowser.open(PROJECT_SITE),
header_font=MENU_HEADER_FONT, option_font=MENU_OPTION_FONT, isroot=True, xcenter=True,
soundtrack=None, #Path.join(Load.MUSICDIR, "jazz_cat_infinite_loop_cut.ogg"),
sound_enabled=SOUND_ENABLED, **kwargs)
self.title_blocks = BlockText.render(TITLE_TEXT, font=Load.loadBlockFont("standard"))
blockwidth = (self.width) // len(self.title_blocks[0])
Log.debug("title_board.blockwidth = {}".format(blockwidth))
self.addJob("title_board",
Jobs.Board(
self,
y=SPACER,
height=len(self.title_blocks),
width=len(self.title_blocks[0]),
blockwidth=blockwidth,
bgcolor=self.bgcolor,
queue=100,
draw_grid=False,
draw_border=False,
)
)
self.jobs.title_board.x = (self.width // 2) - (self.jobs.title_board.width // 2)
for x, y in Matrix.matrixToSet(self.title_blocks):
self.jobs.title_board.blocks[(x, y)] = (0xaa,0xaa,0xaa)
self.options_pos[1] = self.jobs.title_board.y + self.jobs.title_board.height + SPACER*2
self.menu = Factory.textBoxes([
("Single Player", lambda: self.call(TetrisGame.TetrisGame, caption="Loltris")),
("Two Player", lambda: self.call(TwoPlayerTetrisGame.TwoPlayerTetris, caption="Loltris - Two Player")),
#("LAN Play", lambda: self.call(LANTetrisGame.LANMenu, caption="Loltris - LAN play")),
("Create new blocks", lambda: self.call(MakeTetromino.MakeTetromino, caption="Loltris - Creator")),
("Options", lambda: self.call(OptionsMenu, caption="Loltris - Options")),
("Scores", lambda: self.call(HighscoreExplorer.HighscoreList, caption="Loltris - Highscores")),
("Credits", lambda: self.call(Credits.Credits, caption="Loltris - Credits")),
("Homepage", lambda: Webbrowser.open(PROJECT_SITE)),
#("SandBox", lambda: self.call(SandBox.SandBox, caption="Loltris - SandBox")),
("Exit", self.quit),
],
self,
font=MENU_OPTION_FONT,
fill=MENU_3DBORDER_BACKGROUND,
xcenter=True,
colors={
"background":self.colorscheme["background"],
"font":self.colorscheme["option"],
},
)
self.setupObjects()
#self.loadHighscores()
## XXX: Temporary bugfix, scroll_filler is drawn on every frame while the board is not.
del self.jobs.scroll_filler
def loadHighscores(self):
""" Load scores from disk, then add the highscorelist job to see them """
self.highscores = Load.loadHighscores(top=HIGHSCORES)
Log.debug("Loaded new highscores from disk, displaying below")
Log.dump("".join(["{}: {}\n".format(d["name"], d["score"]) for d in self.highscores]))
if self.highscores:
self.addJob(
"highscorelist",
Jobs.TextBox(
self,
( "Top {} scores\n\n".format(HIGHSCORES) + ## Title
"".join(["{}: {}\n".format(x["name"], x["score"]) for x in self.highscores]) + ## The scores
("\n" * (HIGHSCORES - len(self.highscores))) ## Empty lines
),
y=self.menu[0].y+1,
textfit=True,
colors=HIGHSCORELIST_COLORSCHEME,
font=HIGHSCORELIST_FONT,
border=True,
background=True,
)
)
## The highscore-list should be 5 pixels from the right edge
self.jobs.highscorelist.x = SCREEN_WIDTH - self.jobs.highscorelist.width - 5
def launchTetrisGame(self):
self.call(TetrisGame.TetrisGame, caption="Loltris")
# self.loadHighscores()
def eventHandler(self, event):
super(MainMenu, self).eventHandler(event)
if event.type == KEYDOWN:
if event.key == K_TAB:
self.addJob(
"input",
Jobs.InputBox(self, "Input: ")
)
class PauseMenu(Core.Menu):
def __init__(self, **kwargs):
super(PauseMenu, self).__init__("PauseMenu", header_font=MENU_HEADER_FONT, option_font=MENU_OPTION_FONT, isroot=True, xcenter=True, **kwargs)
self.header = "Pause"
self.menu = Factory.textBoxes([
("Continue", self.quitGame),
("Exit to main menu", lambda: self.quitGame("MainMenu")),
("Exit Game", self.quit),
], self, font=MENU_OPTION_FONT, colors={"background":self.colorscheme["background"],
"font":self.colorscheme["option"], },
fill=MENU_3DBORDER_BACKGROUND,
xcenter=True,
)
self.setupObjects()
self.running = self.mainLoop
## Placeholder, need to add sliders and other stuff to the Menu class
## for an option menu to be doable.
class OptionsMenu(Core.Menu):
def __init__(self, **kwargs):
super(OptionsMenu, self).__init__("OptionsMenu", header_font=MENU_HEADER_FONT, option_font=MENU_OPTION_FONT, xcenter=True, **kwargs)
self.header = "Options"
self.options = Load.loadOptions()
self.menu = Factory.textBoxes([
("Keymaps", lambda: self.call(KeymapMenu, caption=self.caption)),
], self, font=MENU_OPTION_FONT, colors={"background":self.colorscheme["background"],
"font":self.colorscheme["option"], },
fill=MENU_3DBORDER_BACKGROUND,
xcenter=True,
)
## >inb4 immature jokes
def turnOn(option, options):
Log.debug(option)
            if options.get(option) is None:
Log.warning("Turning on non-existent option: {}".format(repr(option)))
options[option] = True
Save.saveOptions()
def turnOff(option, options):
Log.debug(option)
            if options.get(option) is None:
Log.warning("Turning off non-existent option: {}".format(repr(option)))
options[option] = False
Save.saveOptions()
self.menu.extend(
Factory.basicSwitches([
("Uber-Tetromino", "uber_tetromino"),
("Flip tetromino", "flip_tetromino"),
], self, turnOn, turnOff, Shared.options["gameplay"],
font=MENU_OPTION_FONT,
colors=SWITCH_OPTION_COLORS,
boxwidth=8,
box_center=True,
fill=MENU_3DBORDER_BACKGROUND,
)
)
self.setupObjects()
class Graphics(Core.Menu):
def __init__(self, **kwargs):
super(KeymapMenu.Graphics, self).__init__("GraphicsMenu", header_font=MENU_HEADER_FONT, option_font=MENU_OPTION_FONT, xcenter=True, **kwargs)
## >inb4 immature jokes
def turnOn(option, options):
options[option] = True
Save.saveOptions()
def turnOff(option, options):
options[option] = False
Save.saveOptions()
self.menu = \
Factory.basicSwitches([
("Fullscreen", "fullscreen"),
], self, turnOn, turnOff, Shared.options["gameplay"],
font=MENU_OPTION_FONT,
colors=SWITCH_OPTION_COLORS,
boxwidth=8,
box_center=True,
fill=MENU_3DBORDER_BACKGROUND,
)
## Generates a mainloop for getting a single character.
## Used in KeymapMenu.*
def getKeyLoop(self, keys):
if not self.jobs.input_box.update_required:
Log.debug("Setting key {} to activate {}".format(Utils.keyToString(self.jobs.input_box.value), self.getting))
keys[self.getting] = self.jobs.input_box.value
self.removeJob("input_box")
Save.saveKeymap()
## Restore
self.running = self.mainLoop
## Sets the appropriate values for setting a key in a keymap.
def modifyKeymap(self, keys, getting):
self.addJob("input_box", Jobs.GetKeyBox(self, "Press key for {}".format(getting), font=MENU_OPTION_FONT, colors=SWITCH_OPTION_COLORS, queue=self.menu[0].queue+1))
self.getting = getting
self.running = partial(getKeyLoop, self, keys)
class KeymapMenu(Core.Menu):
def __init__(self, **kwargs):
super(KeymapMenu, self).__init__("KeymapMenu", header_font=MENU_HEADER_FONT, option_font=MENU_OPTION_FONT, xcenter=True, **kwargs)
self.header = "Keymaps"
self.menu = Factory.textBoxes([
("Tetris", lambda: self.call(self.Tetris, caption="Loltris - Tetris keymap")),
("Menu", lambda: self.call(self.Menu, caption="Loltris - Menu keymap")),
], self, font=MENU_OPTION_FONT, colors={"background":self.colorscheme["background"],
"font":self.colorscheme["option"], },
fill=MENU_3DBORDER_BACKGROUND,
xcenter=True,
)
self.setupObjects()
self.getting = None
class Tetris(Core.Menu):
def __init__(self, **kwargs):
super(KeymapMenu.Tetris, self).__init__("KeymapMenu.Tetris", header_font=MENU_HEADER_FONT, option_font=MENU_OPTION_FONT, xcenter=True, **kwargs)
self.header = "Tetris-map"
self.menu = Factory.textBoxes(
[("Player 1", lambda: self.call(self.Player1, caption="Loltris - Tetris player 1 keymap")),
("Player 2", lambda: self.call(self.Player2, caption="Loltris - Tetris player 2 keymap")),
],
self,
font=MENU_OPTION_FONT,
colors={"background":self.colorscheme["background"], "font":self.colorscheme["option"]},
fill=MENU_3DBORDER_BACKGROUND,
xcenter=True,
)
self.menu.extend(Factory.variableTextBoxes(
[( action.replace("_", " ").capitalize() + ": {key}",
            ## Nested lambdas are used here to circumvent an issue with Python closures. (http://code.activestate.com/recipes/502271/)
            ## Basically, if you don't nest the lambdas, you will end up with every single function having the last action in
            ## the list of dictionary keys.
{"key": (lambda action_: lambda _: Utils.keyToString(Shared.keymap["game"][action_]))(action) },
(lambda action_: lambda: modifyKeymap(self, Shared.keymap["game"], action_))(action))
for action in Shared.keymap["game"]
if isinstance(Shared.keymap["game"][action], int) ## Skip the player1 and player2 sub-dictionaries
],
self,
font=MENU_OPTION_FONT,
colors={"background":self.colorscheme["background"], "font":self.colorscheme["option"]},
fill=MENU_3DBORDER_BACKGROUND,
xcenter=True,
))
self.setupObjects()
class Player1(Core.Menu):
def __init__(self, **kwargs):
super(KeymapMenu.Tetris.Player1, self).__init__(
"KeymapMenu.Tetris.Player1", header_font=MENU_HEADER_FONT, option_font=MENU_OPTION_FONT, xcenter=True, **kwargs)
self.header = "Player1 Keymaps"
self.menu.extend(Factory.variableTextBoxes(
[( action.replace("_", " ").capitalize() + ": {key}",
                ## Nested lambdas are used here to circumvent an issue with Python closures. (http://code.activestate.com/recipes/502271/)
                ## Basically, if you don't nest the lambdas, you will end up with every single function having the last action in
                ## the list of dictionary keys.
{"key": (lambda action_: lambda _: Utils.keyToString(Shared.keymap["game"]["player1"][action_]))(action) },
(lambda action_: lambda: modifyKeymap(self, Shared.keymap["game"]["player1"], action_))(action),
)
for action in Shared.keymap["game"]["player1"]
],
self,
font=MENU_OPTION_FONT,
colors={"background":self.colorscheme["background"], "font":self.colorscheme["option"]},
fill=MENU_3DBORDER_BACKGROUND,
xcenter=True,
))
self.setupObjects()
class Player2(Core.Menu):
def __init__(self, **kwargs):
super(KeymapMenu.Tetris.Player2, self).__init__(
"KeymapMenu.Tetris.Player2", header_font=MENU_HEADER_FONT, option_font=MENU_OPTION_FONT, xcenter=True, **kwargs)
self.header = "Player2 Keymaps"
self.menu.extend(Factory.variableTextBoxes(
[( action.replace("_", " ").capitalize() + ": {key}",
                ## Nested lambdas are used here to circumvent an issue with Python closures. (http://code.activestate.com/recipes/502271/)
                ## Basically, if you don't nest the lambdas, you will end up with every single function having the last action in
                ## the list of dictionary keys.
{"key": (lambda action_: lambda _: Utils.keyToString(Shared.keymap["game"]["player2"][action_]))(action) },
(lambda action_: lambda: modifyKeymap(self, Shared.keymap["game"]["player2"], action_))(action),
)
for action in Shared.keymap["game"]["player2"]
],
self,
font=MENU_OPTION_FONT,
colors={"background":self.colorscheme["background"], "font":self.colorscheme["option"]},
fill=MENU_3DBORDER_BACKGROUND,
xcenter=True,
))
self.setupObjects()
class Menu(Core.Menu):
def __init__(self, **kwargs):
super(KeymapMenu.Menu, self).__init__(
"KeymapMenu.Menu", header_font=MENU_HEADER_FONT, option_font=MENU_OPTION_FONT, xcenter=True, **kwargs
)
self.header = "Menu-map"
self.menu = Factory.variableTextBoxes(
[( action.replace("_", " ").capitalize() + ": {key}",
            ## Nested lambdas are used here to circumvent an issue with Python closures. (http://code.activestate.com/recipes/502271/)
            ## Basically, if you don't nest the lambdas, you will end up with every single function having the last action in
            ## the list of dictionary keys.
{"key": (lambda action_: lambda _: Utils.keyToString(Shared.keymap["menu"][action_]))(action) },
              (lambda action_: lambda: modifyKeymap(self, Shared.keymap["menu"], action_))(action))
for action in Shared.keymap["menu"] ],
self,
font=MENU_OPTION_FONT,
colors={
"background":self.colorscheme["background"],
"font":self.colorscheme["option"],
},
fill=MENU_3DBORDER_BACKGROUND,
xcenter=True,
)
self.setupObjects()
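## A minimal illustration (not part of the original game code) of the closure pitfall that the
## nested lambdas above work around: without the extra lambda, every entry captures the same
## loop variable and therefore ends up with the *last* action.
##   funcs = [lambda: action for action in ("up", "down", "drop")]
##   [f() for f in funcs]  # -> ['drop', 'drop', 'drop']
##   funcs = [(lambda a: lambda: a)(action) for action in ("up", "down", "drop")]
##   [f() for f in funcs]  # -> ['up', 'down', 'drop']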
|
any part in California for that matter, we have answers.
Box truck coverage has many layers of coverage and insurance regulations.
with your state and federal laws so there is no job interference.
Special regulations apply to commercial carriers of both passengers and cargo.
This is because of the risk of common carrier accidents.
company if a catastrophic loss of events should occur.
have created minimum financial responsibility requirements.
These requirements are applied for commercial carriers.
These requirements may be met by purchasing coverage or obtaining a surety bond.
amounts that at least equal to the minimum limits.
In some cases, full or partial self coverage may be permitted.
to demonstrate the ability to fully or partially self-insure.
by the Motor Carrier Act of 1980, which actually took effect in 1981.
It has since been amended to increase the financial responsibility limits required.
The required coverage is designed to pay for “public liability” however.
Moreover, it does not apply to injuries to the Carrier’s employees or loss of the cargo being carried.
This would fall under a workers compensation policy.
Insurance policy overage is usually obtained by attaching an endorsement form to a policy.
This add on is, providing commercial vehicle or truckers coverage.
and not all carriers are willing to provide it.
purchase your policy directly through a provider.
What is a Commercial Box Truck?
that sit on the frame, ranging from 14 feet to 24 feet in length.
The box is usually separate from the cab and is not accessible from the cab.
Most vehicles have a roll-up rear door that’s similar to a garage door.
Typically used to haul large items such as furniture, appliances and large boxes.
Frequently used as rental moving vans by companies like U-Haul and Ryder.
Sometimes referred to as cube trucks, cube vans, bob trucks and box vans.
A box truck is sometimes mistakenly called a cargo van.
Makers are GMC, Ford, International, Mitsubishi Fuso, Hino (Toyota), Isuzu, etc.
After the purchase of your policy, you will receive a policy declaration page.
varying types of commercial automobile exposures.
These two variations of the business vehicle coverage form are available for special types of risk.
which have a premises – operations exposure and have a temporary possession of their customers vehicles.
trucking businesses that haul goods for others, and which frequently exchange trailers with other businesses.
to see all your options and know exactly what they cover.
Like with any industry, mistakes and incorrect classification can occur.
you are left in a difficult situation.
classified properly or purchased coverage for the wrong type of policy.
Not only will your premium spent on the policy be useless, but you face a claim denial.
Your policy declaration pages will be lengthy and detailed.
They are divided into six sections.
identity of the insurance company who is accepting risk and your producer with policy period.
All this information is a snap view of the most common questions you may have about your coverage.
It will also show the vehicles insured and coverage options provided.
Along with this information, rates, premiums, deductibles are commonly found on your declaration page.
When numerous vehicles are insured, separate schedules of the vehicles may be attached.
Item Six – Schedule of gross receipts or mileage for liability coverage for public auto or leasing rental concerns.
Many sections of the declarations will be completed as appropriate for the coverage’s being written.
If a particular coverage does not apply, then that section will intentionally be left blank.
This can sometimes be confusing if you're not familiar with the process.
You will see multiple pages in this order.
Every section on your policy has a purpose. This is to ensure no gaps of coverage exist.
Each policy is tailored to your operation.
The policy uses symbols in order to tailor coverage.
covered under the various parts of the policy.
used to identify covered autos, which describes each one.
numerical symbol after each coverage purchased.
Coverage may be tailored to your needs by selection of the appropriate symbols.
Different symbols may be used for different coverage’s.
If no symbol is shown for a particular coverage, then that coverage is not provided.
if there is no conflict or overlap between the symbol descriptions.
For example, the same coverage could apply to the owned autos and hired autos.
7 and requesting that coverage apply on to the specifically described autos.
only for some specifically described autos.
because liability risks pose the greatest threat to a business's financial well-being.
coverage to apply only to owned private passenger autos or to owned autos other than private passenger autos.
Symbols 5 and 6 can be used to activate no fault benefits and compulsory uninsured motorist coverage for owned autos.
These are required to have such benefits in the state where they are licensed or principally garaged.
Hired autos includes all autos the insured leases, hires, rents, or borrows.
But not autos owned by employees or members of their households.
Non owned autos include all autos you as the insured do not own, lease, hire, or borrow.
including autos owned by employees or members of their households while being used in your business.
Certain symbols may be used only specific types of coverage.
Symbols, 1, 2, 3, 4, 7, 8 and 9 may be used to designate liability coverage.
However, liability is the only coverage for which symbol 1 may be used.
This is because other coverage’s cannot apply.
This applies to any auto, no fault benefits, medical payments, uninsured motorist and physical damage coverage.
These coverage options are not available for non owned autos.
Symbols 2, 3, 4, 7, 8 all may be used to designate physical damage coverage.
Symbol 5 may only be used to designate no-fault benefits.
Symbol 6 may only be used to designate uninsured motorist coverage for certain vehicles.
vehicles may be provided by showing symbol 2 for this particular coverage.
attractions that bring visitors and locals alike each year to its city.
is the largest aquarium in Southern California. What a statement that says on so many levels.
It is also the 5th largest aquarium in the United States.
With these bragging rights comes a lot of attention.
Try about 1.4 million visitors annually. Now, that is a lot of attention.
With these numbers, it is no surprise that this is the 3rd largest attraction in the Los Angeles area.
road and other factors that your insurance company will take note of.
remember this when comparing insurance rates.
will either help or hurt you on the premium.
other rating factors in consideration, such as crime rate within that population.
The good news is, there are discounts to offset these higher premiums.
Plus, there are also carriers that specialize in areas with large populations.
a city and county with such a high tourism population.
on your truck or car quote as you probably already assumed.
only increases the chance of risk and exposure.
and have their rates reflect accordingly.
Find low cost box truck insurance or cheap car insurance Los Angeles.
We understand car and truck coverage with prior industry experience.
specialty market, with special coverage options.
Contact a Licensed Box Truck Insurance Professional, who specializes in your industry and needs!
|
"""
Regular-expression matching by the Thompson construction.
Explained in C at http://swtch.com/~rsc/regexp/regexp1.html
"""
def match(re, s): return run(prepare(re), s)
def run(states, s):
    for c in s:
        if not states:
            return False  # no surviving states; avoids calling set.union() with no arguments
        states = set.union(*[state(c) for state in states])
    return accepting_state in states
def accepting_state(c): return set()
def expecting_state(char, k): return lambda c: k() if c == char else set()
def state_node(state): return lambda: set([state])
def alt_node(k1, k2): return lambda: k1() | k2()
def loop_node(k, make_k):
def loop(): return k() | looping()
looping = make_k(loop)
return loop
def prepare((null, re)): return re(state_node(accepting_state))()
def lit(char):
return False, lambda k: state_node(expecting_state(char, k))
def alt((null1, re1), (null2, re2)):
return null1 or null2, lambda k: alt_node(re1(k), re2(k))
def many((null, re)):
assert not null, "I can't handle nested stars"
return True, lambda k: loop_node(k, re)
empty = (True, lambda k: k)
def seq((null1, re1), (null2, re2)):
return null1 and null2, lambda k: re1(re2(k))
## match(empty, '')
#. True
## match(empty, 'A')
#. False
## match(lit('x'), '')
#. False
## match(lit('x'), 'y')
#. False
## match(lit('x'), 'x')
#. True
## match(lit('x'), 'xx')
#. False
## match(seq(lit('a'), lit('b')), '')
#. False
## match(seq(lit('a'), lit('b')), 'ab')
#. True
## match(alt(lit('a'), lit('b')), 'b')
#. True
## match(alt(lit('a'), lit('b')), 'a')
#. True
## match(alt(lit('a'), lit('b')), 'x')
#. False
## match(many(lit('a')), '')
#. True
## match(many(lit('a')), 'a')
#. True
## match(many(lit('a')), 'x')
#. False
## match(many(lit('a')), 'aa')
#. True
## match(many(lit('a')), 'ax')
#. False
## complicated = seq(many(alt(seq(lit('a'), lit('b')), seq(lit('a'), seq(lit('x'), lit('y'))))), lit('z'))
## match(complicated, '')
#. False
## match(complicated, 'z')
#. True
## match(complicated, 'abz')
#. True
## match(complicated, 'ababaxyab')
#. False
## match(complicated, 'ababaxyabz')
#. True
## match(complicated, 'ababaxyaxz')
#. False
# N.B. infinite recursion, like Thompson's original code:
## match(many(many(lit('x'))), 'xxxx')
#. Traceback (most recent call last):
#. File "nfa_failstoploops.py", line 30, in many
#. assert not null, "I can't handle nested stars"
#. AssertionError: I can't handle nested stars
## match(many(many(lit('x'))), 'xxxxy')
#. Traceback (most recent call last):
#. File "nfa_failstoploops.py", line 30, in many
#. assert not null, "I can't handle nested stars"
#. AssertionError: I can't handle nested stars
# Had a bug: empty forced a match regardless of the continuation.
## match(seq(empty, lit('x')), '')
#. False
## match(seq(empty, lit('x')), 'x')
#. True
|
A workshop at ACM Creativity and Cognition, June 27, 2017 (all-day), Singapore.
This workshop will introduce creative audio coding for the Raspberry Pi, using the Beads platform for audio programming and the HappyBrackets platform for inter-device communication and sensor data acquisition. We will demonstrate methods that allow each self-contained, battery-powered device to acquire sensor data about its surroundings and the way it is being interacted with, as well as methods for designing systems in which groups of these devices wirelessly communicate their state, opening up new interaction possibilities and approaches.
The Raspberry Pi is an ultra-cheap, ultra-small Linux microcomputer. Introduced in 2012, it is a flagship device lighting the path towards generally available ubiquitous computing technology. The creative potential of cheap, tiny, network-connected, general-purpose Linux computers the size of credit cards is immense. Similarly, Java is one of the world’s most popular general-purpose programming languages and underlies Processing, itself one of the most popular environments for creative coding. Java opens up a world of possibilities, and the coding examples in this course have been designed to be easy to learn, allowing you to get stuck into your creative goals from the get-go.
In this course you will learn the essentials of programming real-time audio software and apply these skills to the exciting world of the Internet of Things. You will use the Raspberry Pi as a rapid prototyping platform, exploring the creative potential of real-time sensor and network interaction, combined with real-time sound generation, creating systems that respond to user input, communicate with other devices and play sound. Make your own musical instruments, develop devices for sonic artworks, and create new sound design concepts for sonifying everyday objects. Through this course you will develop a basic understanding of audio programming and the core concepts behind programming for the Internet of Things. You will be able to conceptualize and design your own innovative interactive devices.
This course will include the following topics.
The afternoon session will be devoted to collaborative exercises and composition tasks.
This workshop will be a full-day session framed as a pedagogical workshop which will introduce the platform in an interactive, hands-on manner. Participants in this workshop will require a Raspberry Pi, sensor system, battery system, and access to a WiFi or wired network, which we will provide. Some participants may wish to purchase these devices in advance of the workshop, so that they learn on equipment they can take with them; for these participants we will provide a description of the equipment we use and suggestions for how to buy it and set it up in advance of the workshop. For participants who do not bring their own devices, we will provide a device for the day.
Participants will benefit from having some experience programming in Java. No electronics knowledge is necessary, and familiarity with Linux and/or Raspberry Pis would be advantageous but is not required. This course will not introduce Java from first principles, although the coding examples will start off being very simple and easy to pick up (someone with experience in another language would not be significantly challenged).
This workshop will also benefit from an online course we have recorded with Kadenze, available here. We expect participants to have enough time to review the course materials and videos in advance of the workshop, but not to have completed the assignments. The workshop itself will focus on reviewing the material, running through and clarifying any unclear content, and then on completing collaborative composition assignments in the second half of the workshop.
Participating in this workshop will require registration for the ACM Conference on Creativity and Cognition 2017, alongside registration for the workshop. See here for details.
To help us track who’s participating and to send updates to you, if you are planning on participating please fill in this Google Form. You don’t have to have registered to fill in this form, but you will eventually need to register to participate in the workshop.
|
from rest_framework import permissions
class IsAdminOrOwner(permissions.BasePermission):
message = 'Only an admin user or owner can access this.'
def has_object_permission(self, request, view, obj):
if request.user.is_authenticated():
return request.user.is_staff or request.user == obj.user
else:
return False
class IsAdminOrReadOnly(permissions.BasePermission):
message = 'Only an admin user can make changes.'
def has_permission(self, request, view):
if request.method in permissions.SAFE_METHODS:
return True
else:
return request.user.is_authenticated() and request.user.is_staff
class IsAdminOwnerOrReadOnly(permissions.BasePermission):
message = 'Only an admin user or the owner can change this object.'
def has_object_permission(self, request, view, obj):
if request.method in permissions.SAFE_METHODS:
return True
else:
if request.user.is_authenticated():
return request.user.is_staff or request.user == obj.user
else:
return False
class IsDJ(permissions.BasePermission):
message = 'Only the DJ can request the next song.'
def has_permission(self, request, view):
if request.user.is_authenticated():
return request.user.is_dj
else:
return False
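# Usage sketch (illustrative only): how one of these permission classes might be
# attached to a DRF view. The Song model, serializer, and app layout below are
# hypothetical. Note also that request.user.is_authenticated() is called as a
# method in the classes above, which assumes Django < 2.0; on newer versions it
# is a property and the parentheses must be dropped.
#
#   from rest_framework import viewsets
#   from .models import Song
#   from .serializers import SongSerializer
#   from .permissions import IsAdminOwnerOrReadOnly
#
#   class SongViewSet(viewsets.ModelViewSet):
#       # Anyone may list/retrieve; only the owner or an admin may modify.
#       queryset = Song.objects.all()
#       serializer_class = SongSerializer
#       permission_classes = [IsAdminOwnerOrReadOnly]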
|
As of 2018 and hopefully into 2019, industrial work in the United States certainly seems to be expanding. The concern is: is the effort to control the air that a worker breathes keeping pace with the industrial expansion? Experience says that it will likely lag behind, putting workers, often new hires, at risk of work-related illnesses.
In short, the potential for worker exposure to chemicals in the workplace will result in an increase in illnesses, workers' compensation claims, and OSHA recordable illnesses.
The first step in worker exposure control is to monitor the work environment. The best method of determining worker exposure is actual sampling of the worker while he/she is performing the chemical exposure task. Miniaturization of sampling equipment now allows samplers to be attached directly to the worker. This can be a small, battery-powered sampler, or even a small, passive badge sampler attached to the worker while a particular task is performed. The testing can be as short as a few minutes or as long as a full shift. Some samplers are so small they can be attached to the worker's lapel for a full shift. Once the exposure testing period is complete, the sample can be sent to a laboratory for analysis.
The proper person to do the sampling is an industrial hygienist, or a specially trained safety person. This industrial hygienist would generally observe the sampled worker for at least some of the time, so that when results are obtained from the laboratory they can be interpreted in light of the work performed, and a professional judgment can be made as to whether the exposure was safe, within acceptable exposure limits, and not above any OSHA or other recommended limits.
Interpreting the results, assessing the impact of exposure, and determining the need for corrective measures are more important than the sampling itself, even when the sampling is done properly. This requires the input of an experienced industrial hygienist or other health professional.
As sampling equipment and methods become more accurate, the interpretation of the test results becomes more complex, and the need for an experienced industrial hygienist, toxicologist, or epidemiologist becomes more critical.
Another complexity involves testing and analysis that duplicate the acceptable sampling criteria for OSHA-regulated substances. There are about 500 chemical agents in the U.S. workplace that are specifically regulated by OSHA. Sampling, analysis, and interpretation of results regarding OSHA compliance must be done by a person or organization that can duplicate the sampling according to OSHA methods and can determine compliance with the appropriate regulations. Where no OSHA limits exist, the qualified professional must identify other sources of information to determine a safe exposure level. For more toxic substances, the qualified professional can recommend controls and substitutions that are less toxic.
We have the technical and professional staff to do sampling and interpretation.
|
import time
import os
from fabric.api import *
from fabric.contrib.files import upload_template
from git import Repo
from ec2_deploy.notifications import Notification
def _run_task(task, start_message, finished_message):
"""
    Takes a task from tasks.py and runs through the commands on the server
"""
start = time.time()
Notification(start_message).info()
# Run the task items
for item in task:
try:
Notification("-" + item['message']).info()
except KeyError:
pass
globals()["_" + item['action']](item['params'])
Notification("%s in %.2fs" % (finished_message, time.time() - start)).success()
def _sudo(params):
"""
Run command as root.
"""
command = _render(params)
sudo(command)
def _local(params):
"""
Run command on local machine.
"""
command = _render(params)
local(command)
def _pip(params):
"""
Run pip install command.
"""
for item in params:
command = _render(item)
_sudo("pip install %s" % command)
def _upload_template(params):
"""
Run command to render and upload a template text file to a remote host.
"""
upload_template(filename=_render(params['filename']),
destination=_render(params['destination']), use_sudo=True)
def _render(template, context=env):
"""
Does variable replacement %(variable)s
"""
return template % context
def add_to_hosts(path, instance):
"""
Takes an instance ID and appends it to a list config/hosts.py
"""
list_string = get_hosts_list(path)
list_string.append(instance)
with open(path + '/hosts.py', 'w') as f:
f.write(str(list_string))
def get_hosts_list(path, staging=False):
"""
Reads the hosts.py file and returns the list.
"""
if staging:
filepath = path + '/hosts_staging.py'
else:
filepath = path + '/hosts.py'
if os.path.isfile(filepath):
with open(filepath, 'r') as f:
list_string = eval(f.readline())
else:
list_string = []
return list_string
def run_sanity_checks(env):
Notification("Running sanity checks...").info()
# Check for git branches master and develop.
repo = Repo(env.local_repo)
if repo.bare:
Notification("No 'git' repo setup.").error_exit()
if "develop" not in repo.branches:
Notification("Please create a `develop` branch in git for the staging environment.").error_exit()
    # Check for requirements.txt.
if not os.path.isfile(os.path.expanduser("{}/requirements.txt".format(env.local_repo))):
Notification("Your local repo does not appear to have a 'requirements.txt'. Please create one in your root.").error_exit()
# Check for environment vars.
for var_file in ['vars_production.env', 'vars_staging.env']:
if not os.path.isfile(
os.path.expanduser("{}/server_templates/{}/{}".format(env.local_repo, env.template, var_file))):
Notification("Cannot find environments variable file in server template.").error_exit()
d = {}
with open("{}/server_templates/{}/{}".format(env.local_repo, env.template, var_file)) as f:
for line in f:
                (key, val) = line.strip().split("=", 1)  # split on the first '=' so values may contain '='
                d[key] = val
        if len(d) == 0:
            Notification("You have not set any environment variables for {} ".format(var_file)).error_exit()
        if "EC2_DEPLOY_SERVER_REPO" not in d:
            Notification("Please set 'EC2_DEPLOY_SERVER_REPO' in {} ".format(var_file)).error_exit()
Notification("Passed all checks").success()
|
Study your Advanced Diploma of Performing Arts with Perth’s Principal Academy of Dance.
This qualification offers intensive training and skills development to a professional level. The course is specifically designed to produce highly skilled and diverse professional performers, enabling them to seek work in a broad range of settings such as musical theatre productions, cruise ships, entertainment resorts, theme parks, film and television. The course is delivered with majors in dance, musical theatre and acting.
The aim of this course is to ensure graduates have acquired a comprehensive range of skills and in-depth knowledge to display the disciplined attitude necessary for a professional dancer, singer and actor. Throughout the course, students will enhance and refine their technique and artistry showing confidence, assurance and professional awareness in application. Also, a comprehensive understanding of professional contexts will be applied in performance and practice.
Students will be provided with the opportunity to perform in the mid-year performance, stage play, cabaret and end of year Gala Performance featuring choreography, vocal and scene work from popular stage and movie musicals. This is also an opportunity for the general public and invited industry professionals to view up and coming artists.
Musical Theatre Major – Acting Techniques, Vocal Technique, Song Repertoire, Musical Theatre Performance Accents, Classical Ballet, Jazz Dance, Tap, Performance Studies, Stage Combat, Audition Preparation, Professional Development.
Dance Major – Classical Ballet, Pas de Deux, Modern Dance, Jazz, Commercial Dance, Contemporary, Latin Jazz, Tap, Showgirl Performance, Acting Techniques, Song Repertoire, Musical Theatre Performance, Audition Preparation, Professional Development.
Acting Major – Improvisation, Accents, Stage Combat, Screen Acting, Voice Over & Radio Skills, Shakespeare Studies, International Theatre, Stage Play, Commedia Dell’Arte, Audition Preparation, Professional Development.
Prior to graduation, students have the opportunity to “showcase” their skills in front of potential agents & employers.
A full course outline of the core & electives for each major is available on application.
|
#!/usr/bin/env python
import os
import sys
import time
from getpass import getpass
from optparse import OptionParser
from termcolor import colored
from launchpadlib.launchpad import Launchpad
from github3 import login as github_login
from github3 import GitHubError
ACTIVE_STATUSES = [
"New",
"Confirmed",
"Triaged",
"In Progress"
]
IMPORTED_FIELDS = [
"owner",
"web_link",
"date_created",
"date_last_updated",
"tags",
]
def main(args):
usage = """%s: <lp project> <gh project>\n""" % (sys.argv[0],)
parser = OptionParser(usage=usage)
options, args = parser.parse_args(args=args)
if len(args) != 2:
parser.print_usage()
return 1
lp_project_name = args[0]
gh_project_name = args[1]
try:
gh_owner, gh_repo = gh_project_name.split('/')
    except ValueError:
        print "Unable to parse target Github repo: '%s'" % gh_project_name
        print "Repo should be specified as <owner>/<repo>"
        return 1  # bail out instead of falling through with gh_owner/gh_repo undefined
print "Authenticating with Launchpad"
launchpad = Launchpad.login_with(os.path.basename(sys.argv[0]), 'production')
print "Authenticating with Github"
github_user = raw_input("Github username: ")
github_pass = getpass("Github password: ")
try:
github = github_login(github_user, github_pass)
github.user()
except GitHubError:
raise SystemExit("Invalid Github login or problem contacting server")
# Validate launchpad project
try:
lp_project = launchpad.projects[lp_project_name]
except KeyError:
raise SystemExit("Unable to find project named '%s' on Launchpad" % lp_project_name)
# Validate github project
if github.repository(gh_owner, gh_repo) is None:
raise SystemExit("Unable to find Github project %s/%s" % (gh_owner, gh_repo))
# Begin migration
open_tasks = lp_project.searchTasks(status=ACTIVE_STATUSES)
for bug_task in open_tasks:
for field in IMPORTED_FIELDS:
print colored(field + ':', 'cyan') + colored(bug_task.bug.__getattr__(field), 'yellow')
print colored(bug_task.bug.description, 'yellow')
print
if confirm_or_exit(colored("Import?", 'cyan')):
title = bug_task.bug.title
description = format_description(bug_task.bug)
issue = github.create_issue(owner=gh_owner, repository=gh_repo, title=title, body=description)
for i, message in enumerate(bug_task.bug.messages):
if i == 0: continue # repeat of description
time.sleep(0.5)
comment = format_comment(message)
issue.create_comment(body=comment)
issue.add_labels('launchpad_import')
print colored("Created issue %d: %s" % (issue.number, issue.html_url), 'yellow')
if confirm_or_exit(colored("Close and update original?", 'cyan')):
bug_task.bug.newMessage(content="Migrated to Github: %s" % issue.html_url)
bug_task.status = "Won't Fix"
bug_task.bug.lp_save()
bug_task.lp_save()
def format_description(bug):
output = """#### Imported from %(web_link)s
|||
|----|----|
|Reported by|%(owner)s|
|Date Created|%(date_created)s|
""" % {
'web_link': bug.web_link,
'owner': format_user(bug.owner),
'date_created': bug.date_created.strftime("%b %d, %Y")
}
    if bug.tags:
        output += "|Tags|%s|\n" % bug.tags
    # Blank line ends the markdown table before the description text.
    output += "\n" + bug.description.replace("Original description:\n", "")
return output
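# For illustration, format_description() produces GitHub-flavored markdown along
# these lines (all values below are made up):
#
#   #### Imported from https://bugs.launchpad.net/example-project/+bug/1234
#   |||
#   |----|----|
#   |Reported by|[Jane Doe](https://launchpad.net/~jdoe)|
#   |Date Created|Jan 05, 2015|
#   |Tags|['ui', 'crash']|
#
#   ...original Launchpad bug description...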
def format_user(user):
return "[%s](%s)" % (user.name, user.web_link)
def format_comment(message):
output = "#### Comment by %s on %s:\n" % \
(format_user(message.owner), message.date_created.strftime("%b %d, %Y"))
output += message.content
return output
def confirm_or_exit(prompt):
options = ['y','n','q']
option_prompt = '/'.join(options)
choice = None
while choice not in options:
choice = raw_input("%s (%s): " % (prompt, option_prompt)).lower()
if choice == 'y':
return True
if choice == 'n':
return False
if choice == 'q':
raise SystemExit(0)
if __name__ == "__main__":
sys.exit(main(sys.argv[1:]))
|
Just like their engagement session in July, Tori and Michael’s wedding day was the perfect temperature with no clouds in the sky. Their families and friends gathered at Revel on a Monday afternoon in April to see them become husband and wife.
I knew when I first met with Tori and Michael that some of the details of their wedding would be a little different from what I’ve seen before, and all the details were so unique and well-coordinated when I saw them in person. They had a yellow and gray color scheme; the bridesmaids wore gray maxi skirts and the groomsmen wore yellow suspenders and bow ties, all with black Converses. Tori’s long-sleeved lace and tulle gown was straight out of a fairy tale wedding, with a cotton crown she made herself and yellow Converses to match Michael’s—the same ones they wore with their black and white outfits at their engagement session. Instead of bouquets, Tori and her bridesmaids carried lanterns with cotton and string lights inside. The tables had cotton centerpieces and their favorite yellow candies. For the reception, they picked out breakfast food, because “who doesn’t love breakfast food?” As a couple who met during Governor’s School and graduated from Clemson, they had all of their rings present for their detail shots. Their “guest log” was cut from an actual log where all their guests could sign.
You could see that Tori and Michael were so happy for this day to finally be here. They were so smiley for their portraits, which makes it easy for me! It’s been so much fun to be part of their engagement and their wedding day over the past year or so.
Everything else was made possible by family and friends.
The one snow day we had in Greenville all winter just so happened to land on the day McKenzie wanted to propose to Katrina. The exciting night out he’d planned for the proposal quickly turned into a night in, so he had to think of a new plan quickly. He put on “Millionaire” by Chris Stapleton and asked Katrina for a slow dance in the living room. When the song finished, he told her that it was a good time to celebrate. When Katrina asked what they were celebrating, McKenzie said, “our engagement,” and got down on one knee with this gorgeous vintage-looking engagement ring.
Katrina and McKenzie’s intimate wedding was on a perfect April day at the Bleckley Inn in Anderson. They made each other laugh during the ceremony and they had me laughing so much during their bride and groom portraits! We took some photos out in the courtyard and some others in this little alleyway with flowers and string lights. Katrina wore a gown with lace sleeves and an off-the-shoulder V-neckline, like a gown out of a 50s movie. She also had these princess-looking shoes and the garter her mother wore on her own wedding day. Many of the items they used were handmade, including their engraved tree-like ring box, McKenzie’s feather bow tie, and the succulent décor at the reception. All three of their rings are so unique—McKenzie’s wedding band is made from meteorite!
Thank you Katrina and McKenzie for letting me be part of this beautiful day!
Lindsay and Tim met through the engineering program at Clemson. They became friends when they both had their engineering co-op with the same company and they started dating not long after that. For their first date, they went paddle boarding on Lake Hartwell, right outside their apartment community. While they were on a ski trip to Colorado, Tim proposed to Lindsay on a horse-drawn carriage! How sweet is that?
Since Clemson is such a special place for Lindsay and Tim, they decided that was where they wanted to do their engagement photos. They came down from Charlotte with their sweet puppy, Ray. We started down by the stadium and worked our way through the main part of campus. Since legend has it that walking hand-in-hand through President’s Park guarantees that you’ll be together forever, it’s pretty much a milestone in any Clemson couple’s relationship, and we had to get some pictures in there. This session wouldn’t have been complete without going to Harts Cove and taking some pictures at the docks on Lake Hartwell, where they had their first date.
Since the students were on spring break, the campus was nearly silent for a Friday evening, making this session more peaceful and intimate. It was the perfect spring day; the temperature was in the upper 60s, the sky was clear, the trees were pink and white, and there were so many colorful flowers in front of Tillman Hall. The bigger trees were still bare enough for that glow you only see in the winter, and to show Tillman Hall in the background when we were at Sikes Hall.
Enjoy the photos from this beautiful spring evening! I can’t wait for their fall wedding in the mountains!
To see more Clemson engagement sessions and other work I’ve done around there, visit my Clemson page.
Just a few days before the Miss Clemson University pageant, Anna was on the sidelines with the rest of the Rally Cats dance team as Clemson won the 2018 College Football National Championship. After a super cool Michael Jackson jazz dance, the most picturesque entrance in her evening gown, bringing in the most ad sales, and taking home the philanthropy award, Anna was crowned Miss Clemson University 2019. She represented Alpha Delta Pi. Not only does Anna come from a family with lots of Clemson Tigers, her great-aunt was the very first Miss Clemson College in 1962!
It’s actually a pretty big decision and I don’t know why I’ve never read a blog post on it. I never heard anyone talk about this, and I never thought to talk with anyone about it, maybe because everyone has such different views on marriage in general and what different name change options would say about you. Even when you Google “name change after marriage,” the whole first page is about the process of changing your name and not what to change it to. I’m sure I’m not the only one who would have benefitted from reading something like this.
I initially didn’t like the idea of changing my name. This is what I had been called my whole life. I didn’t want to be called something else just because I made the decision to get married. Changing my name would mean having to update everything from my driver’s license to my Amazon shipping information to my paperwork at my job. I really didn’t want to go through all those processes, in addition to having to update pretty much everyone I knew. The reason I did change my name was because I wanted our eventual family to all be under one name. “The Gows” wouldn’t include me if my last name was Scott or Scott-Gow. People wouldn’t know to associate me with Dan or our future children. So I eventually decided that I wanted to be a Gow, but what would come between Christine and Gow?
For other people, there are several factors to consider. I know professors who didn’t change their names so they can still be tied to different works they’ve published. Other professors might marry another professor in the same department and not want to be confused with each other. Some use a hyphenated name. One of my professors made her husband’s last name her middle name instead of the other way around. Some people change their name, but still choose to be known professionally as the name everyone knows. Carrie Underwood, for example, is not known to the public as Carrie Fisher, partially because so many people know her as Carrie Underwood, and partially because she could be confused with Princess Leia.
After deciding what your last name will be, you have to decide what you want your middle name to be. For some people, it’s an easy choice if they don’t like their middle or last name, or if it’s a name that they don’t want to be associated with anymore. I like my middle name and I didn’t like the idea of completely replacing my family name. Plus, I had Christine Scott Photography going for me, but it would be weird if Scott wasn’t part of my name anymore. I don’t think I really knew what I wanted my middle name to be until over a month after I was married.
This brings me to the legal process of name changes, which can vary by state. In South Carolina, you can change your middle and last name to reflect your spouse’s name after becoming legally married. Based on those rules, my options were Christine Laureana Scott, Christine Scott Gow, or Christine Laureana Gow (side note: Laureana is pronounced Laurie-Anna and it was my great-grandma’s name). Somehow, I got away with making “Laureana Scott” my legal middle name without anyone questioning me. I don’t know what my plan would have been if I’d been told that wasn’t allowed or if I didn’t have enough space on the form.
Here’s where it gets complicated: if you don’t go by your first name, you will need a court order to legally change your name to the one everyone calls you. This applies to people who go by their middle names, their last names, a nickname, or just another name they’ve chosen for themselves. According to the South Carolina Legislature’s website, “A person who desires to change his name may petition, in writing, a family court judge in the appropriate circuit, setting forth the reason for the change, his age, his place of residence and birth, and the name by which he desires to be known,” as well as a series of background checks and an affidavit.
If monograms or initials are important to you, that’s something you’ll want to think about before making a legal name change. Remember the Big Bang Theory episode where they name their teams “Perpetual Motion Squad” and “Army Ants,” not realizing what the acronyms would be on their shirts? Or what if your initials don’t spell anything, but your monogram does? As for initials, I always thought “Christine S” sounded weird, maybe because it was rare for me to need to be distinguished from another Christine, so “Christine S Gow” sounded weird to me too. The only times I see “Christine L Gow” are in places like my bank account. My signature is “Christine L Gow” because that’s what my credit card says and my signature is on the back. On a side note, Dan and I recently realized that the “G” looks different in our signatures. His G looks like the one on the General Mills logo and mine looks like the one on the Goody hair accessories logo.
There are online services that will automate the process for you. I considered this since the DMV here wasn’t open when I wasn’t at work, but I read some negative reviews about the online services and decided I didn’t want to risk my legal name getting messed up. I read that you’re supposed to go to the social security office before the DMV, so that’s what I did once I had a day off work. I believe I was required to bring two forms of ID, which can include your driver’s license, your passport, your social security card, and your birth certificate. I got there right when it opened and there was already a huge line across the front of the building. I had some photography work to do, so I brought that with me in anticipation of a long wait. The wait at the social security office was actually longer than the wait at the DMV, but I was able to sit down right away at the social security office.
I hope this has provided some insight to you as you get ready for this big change!
In 2018, I had sessions or weddings across 6 states. From fulfilling my lifelong dream of meeting Piglet in Disney World to adding beautiful Kentucky to the list of states I’ve been to, it’s been pretty exciting. There were pink and white trees, green trees, red and yellow trees, and trees with lights in them. I shot in Falls Park on the day of prom and right when it stopped pouring down rain after 4 days. It’s been a year of growth for my business, but I’ve also been lucky to keep shooting for other incredible photographers in the area.
First of all, my aunt Diane helped me come up with a new logo last week. My original logo was meant to be pretty versatile because I wasn’t totally sure what I wanted to focus on. Once I got a better idea of what would better reflect the work I do, I changed the font and formatting to reflect that a little better. I kept the same colors I’ve always had, partly because I just like them, and partly because I’ve come to like the elegance of navy with a hint of a bright green that you wouldn’t normally see at a wedding.
I’ll be starting off the year with Miss Clemson University on January 12th. This will be my third year with the pageant and I can’t wait to meet the new winner and have a session with her!
I’ve had a few headshot sessions this year and I’ve realized how much I enjoy them! This will become a service that I regularly offer. It’s important for professionals, small business owners, and pageant contestants to have updated, high-quality photos of themselves so people can become more familiar with the person they’re about to get to know. Just like with a portrait session, I’ll help you decide what to wear and how you want your hair and makeup so you can look and feel like yourself. Since this will only be a 30-minute session, I can have these sessions during the week in Greenville!
In the past year, I’ve become more passionate about helping people learn to take better pictures because it’s a skill anyone can benefit from. You don’t have to be an aspiring photographer to take advantage of this. This can help you if you want to have a picture of your cute kids before they grow up, you want to show better photos of a product you want to sell, or remember a pretty place you went to. To let me know you’re interested, join my mailing list and you’ll be the first to know when I have something new to share!
You’ll still have a complimentary engagement session with your wedding, but there are a few things I’m excited to expand: bridal portraits, albums, and prints. As someone who’s been through wedding planning and continues to be part of it from the vendor side, I think it’s important to use my experience to help your first day of marriage go as smoothly as possible. Part of that is to make you feel comfortable with me before the big day comes, and part of that is to help you build your relationship with each other.
Here are a few of my favorites from 2018. Here’s to what comes next year!
Remember Amethyst and Matt, who had their engagement session in Disney World this past April? They celebrated their happily ever after on a warm and sunny December day in Savannah, Georgia!
My day started with a Forrest Gump impersonator chatting with me at breakfast and telling me to congratulate the couple for him, which I think is pretty cool. The day ended with a sparkler exit along the street with trolley passengers cheering and passing cars honking. Before the ceremony, I took pictures with Amethyst, her family, and her bridesmaids with ivy and palmetto trees. My second shooter, Shain, took Matt, his family, and the groomsmen to the nearby square with lots of giant oak trees and Spanish moss. The ceremony took place in the courtyard of the Kimpton Brice Hotel, which faces the Savannah River.
Although this was not a Disney-themed wedding, there had to be Disney references throughout the day, since that’s where Amethyst and Matt met and began their adventures as a couple. Not only were there some Disney songs carefully placed throughout the day (I especially loved the songs from The Goofy Movie as the wedding party entered the reception), each entrance into the reception space had a Disney quote about love on it. Like any Disney princess, Amethyst wore a tiara. Since Beauty and the Beast is one of Amethyst’s favorite Disney movies, Matt had notes and roses given to her throughout the day. As we took their photos in the square after the ceremony, Matt gave her one more note and a rose. I only know that part of that note said “I’ll love you until the last petal falls” because Matt pointed out that this rose was artificial, so no petals would fall.
When you’ve spent years working at “the happiest place on earth,” where do you go for your first adventure as a married couple? Well, “the happiest place on earth” actually refers to several places in different parts of the world, so I’ll just leave it at that.
Here are a few of my favorites from this beautiful wedding day!
While I love all the excitement and details of a wedding day, engagement sessions are unique in their own way. Not only is it intimate, it also gives couples a chance to showcase their personalities and stories through the location and outfits they choose. In this blog, I’m sharing a few tips on what to wear!
I believe that photos that make us happy are mostly because of how we feel at the time the photos were taken. I want everyone to feel their best before they even see me, so I give everyone a portrait guide before each session. Much of that gives some insight to help you decide what to wear. I recommend having two outfits, but some couples choose to wear one, either because they really like it or because they don’t want to take time out of the session to find a place to change. Here are a few tips to help you narrow down your options.
You don’t want to remember your engagement session as the day you wore a top that was too tight or an itchy sweater. If it’s 100 degrees, try to wear something that won’t make you sweat more. If it’s 25 degrees (like in Katherine and Anthony's session, pictured above), make sure you have layers under your outfit and gloves to put on between photos. A scarf and a peacoat can be just as pretty as a sundress. Check out my Pinterest board for some of my favorite outfits for cold weather sessions!
Wear comfortable shoes! Take it from me: after I finished my graduation session, it was about a half mile walk back to my car. I really didn’t want to make the choice between wearing the shoes that hurt me for the whole session and walking barefoot. If you’re dying to wear a certain pair of shoes for your session, make sure you have a more comfortable pair to change into if you have a lot of walking to do. Here’s a tip that I picked up from my high school color guard days: carefully placed electrical tape on the insides of your shoes can save your feet a lot of pain.
2. Coordinate with each other.
You can coordinate outfits without matching, like the “red top, black pants” my cousins and I would wear for our family pictures at Christmas when we were little. You could choose a color scheme, such as wearing different shades of blue or both of you wearing something red and black. You could also complement each other’s outfits. If your favorite dress is purple and your fiancé doesn’t have anything purple, you can choose an accessory in a similar color to something he does have.
You can coordinate your outfits with the color scheme of the location. If there are lots of fall leaves at this location, you could wear warmer hues to complement them. If your session is in the snow (which would be amazing), you could wear bolder colors to stand out. Tori and Michael’s first outfits complemented the colors of the desert garden so well, and their second outfits showed off their matching shoes while coordinating with the sunflowers.
If you’re newly engaged, I hope this helps you feel more confident going into your engagement session!
Want to be one of the first to see my next blog post for couples? Click here to join my email list!
I have a lot of pictures to take in the wonderful month of October, and the first were for Sierra and Remington’s engagement session in the mountains!
When I first met Sierra and Remington in July for Sierra’s graduation photos at Clemson, they told me they were about to celebrate their 4th anniversary as a couple. Right before that day came up, Remington told Sierra to make sure her trunk was shut because he got something out of it earlier and he wasn’t sure if it had closed all the way. Thinking he was at work, Sierra opened the trunk to find three boxes with letters and pictures of them together. Remington came from behind the house and proposed! How pretty is the ring he picked out? And yes, I looked all around for red and yellow leaves before Sierra and Remington got there so I could do a fall-looking ring shot, even though it was still a few days before the fall temperature kicked in.
I’ve found that people in other states tend to forget that there’s more to South Carolina than Charleston, but Charleston doesn’t have a view like this! Check out some of my favorites from this session with an awesome couple in a beautiful place.
Seeing your images for the first time is always such a fun experience, but deciding on what to do with the images can be difficult. You probably value being able to share them on social media or hanging your favorites in your home, but what else is there to consider? Here are some of the benefits of having digital files, prints, and albums.
When I was in middle or high school, a friend mentioned not having any baby pictures. I couldn’t understand why since my family had so many pictures of me from when I was little. His parents did take pictures of him when he was little, but his house had burned down when he was in elementary school, and any pictures that were in that house were gone. This was before digital photography became the norm, so any pictures they had that were taken before the fire were either copied from photos given to friends and family, or from the school’s database of school pictures.
This is the biggest reason why I never think twice about offering digital files. Another reason is that it’s always nice to be able to share your photos with so many friends and family members at once and to have them on your phone whenever you want to show someone, and you can get more prints if you need to years after I’ve given you the files. But only having digital files isn't ideal for every situation.
When I got engaged, my grandma, who was losing her memory, asked me every time we talked on the phone what my ring looked like. It was kind of hard to describe, so I texted a picture to my aunt, who I knew visited her often. Still, my grandma would ask me during every conversation what it looked like, insisting that my aunt had never shown her the photo, even though I knew she had several times. My grandma didn’t have access to a cell phone or a computer, so she couldn't see a digital file unless someone came and showed it to her, which we already knew wasn't working out. I printed off a photo and mailed it to her, along with some photos from our engagement session. That way, they could hang on the fridge she passed every day, and she wouldn’t forget what my ring looked like or whose ring it was.
So why would you want to order prints if you can just print the digital files yourself?
The short answer I give in my portrait guide is that I give you the option to order prints from a professional lab so that they’ll last longer over time and look closer to what you see on your screen. Plus, it’s easier than downloading the photos and re-uploading them to wherever you’re printing them.
As for color, I went to Clemson. Sometimes, I would take photos of athletes and their uniforms would look orange and blue instead of orange and purple. Other times, the orange in their uniforms would look like Tennessee orange or Texas orange, or even red. I worked hard in my editing to make sure the colors in my photos looked like what I saw in person. Still, I soon realized that this didn’t always matter if I was printing from the least expensive print source I could get to, because the print could be more yellow or blurry than the file I sent. In college, I'd either pick up smaller prints from a pharmacy in a flimsy envelope, or the larger ones would be mailed to me in a cardboard tube. Not even millimeters separated my prints from whatever could be happening outside. What if a bigger package fell on it? What if I spilled something on the envelope on the drive home? Well, I can assure you that the prints that come from my galleries come from the same lab as the photos that hang on my walls. They came in thick boxes with tons of layers, and you can see that our wedding colors are CLEARLY purple and orange.
Let’s go back to my grandma: my dad made my grandma an album of family photos he had scanned, dating back to before my grandparents were married. He said that when he showed her the photos, she was suddenly able to recall specific details about the photos. Around that time, I made my sister's wedding album as her wedding gift, and my dad asked me to make another copy for my grandma. I sometimes had to describe my cousins' weddings to her, even though she was at all of them, and we wanted her to remember how happy she was at my sister's wedding. The album had pictures of the extended family, my sister’s accessories, and the sign they got with their names and wedding date on it. She could pick it up from her coffee table and remember what my sister’s new last name was, the date of the wedding, who was there, and what everything looked like. And someday, my sister and her husband won’t remember that day so well, and they’ll have the album to show their kids.
I decided to make my own wedding album because it’s something I like to do. And I did make it…a year after my wedding. It wouldn’t surprise me if other couples planned to make their own and just never did. Unless you’re like me and you’re obsessed with detail shots, you probably won’t have prints of your shoes and rings hanging on your wall, but you would want to have some photos in your album that highlighted some of the important items from your wedding day, like any heirlooms or the lace on your veil that you love so much. You wouldn’t hang pictures that included every guest, but you’d put some of them in an album so you can remember who was there. Plus, looking through my grandparents’ wedding album is so much fun because everything was so different back then, and it’s only when I look at those photos that my family and I see how much I look like my grandma.
My approach to delivering photos has always been to serve you the best I can without making things too complicated for you. Everyone has different photography needs, so hopefully this has helped you get a better idea of what your needs are. Thanks for reading!
Want to be the first to know when I've posted another post like this one? Click here to join my mailing list!
Tori and Michael first met at governor’s school and became friends when they became counselors for the governor’s school summer camp program. After they both graduated from Clemson with engineering degrees, Michael proposed to Tori on a cruise to the Caribbean.
Since so much of Tori and Michael’s story took place at Clemson, that’s where we went for their engagement photos. I've done lots of sessions at Clemson and even had my own engagement photos there, but I love how every session I've had there shows a different part of campus that each person wants to look back on. We didn’t just take photos around campus, we went into the botanical gardens and the student organic farm as well. There’s a section of the botanical garden with all desert plants, so until we moved to the grassier areas with palmetto trees, it would be hard to tell we were in South Carolina if you weren’t there to feel the humidity. With a high of 84 degrees that day, it was even hard to remember that it was July. Although their clothes got caught on the cacti a few times before we came up with a new plan for where to stand, I’m happy to say that the only time anyone got pricked was when I set up the shot of Tori’s ring on top of a cactus with a flat top and little tiny spines around it. Luckily, I didn’t really feel them and they came out pretty easily.
I thought the outfits Tori and Michael chose were awesome and looked amazing against some of the plants in the area. Their first outfit complemented the colors of the desert garden so beautifully. They coordinated their second outfit with the sunflower field in mind, using their matching yellow Converses to complement the sunflowers and stand out from their black and white outfits.
Here are some of my favorites from this Arizona/South Carolina/summer-feeling session! I had a lot of fun with these two and I can’t wait to be part of their wedding day!
Samantha and Tim's wedding was one of those days that you'd picture if you imagined a peaceful summer wedding in the south. They were married at the Pines at Sheltowee, under the shade of the trees with the Appalachian mountains of eastern Kentucky right behind them.
The barn was elegantly rustic on the inside with big windows everywhere to let in plenty of natural light for Samantha's bridal portraits. Her gown had a beautiful lacy train and she carried a bouquet in a pastel pink and blue palette, while her bridesmaids carried baby's breath. The reception hall, where Samantha's family served food they made themselves, was decorated with hints of blush. The clouds rolled in during the wedding ceremony, keeping the sun off of everyone on this 90-degree day. The rain held off until after the ceremony and it only began sprinkling towards the very end of the bridal party photos. Luckily, the sun was back out during the golden hour, and you know how I love the way golden hour sun lights up hair like Samantha’s.
Their dog, Tux, walked down the aisle with Tim’s parents and sat quietly for the ceremony and some pictures. Since Tim is a herpetologist (someone who studies reptiles), his brother and best man mentioned during his speech that he was only allowed to have one room in the new house for critters. It only made sense for their getaway car to have drawings of some of the many animals he’s worked with.
I had always imagined that Kentucky would kind of look like West Virginia, where I first started to be conscious of taking “good” pictures of the mountains that would show people who weren’t there exactly how beautiful something could be. After many years of learning to take better pictures, combining a backdrop like that with the joy of a wedding makes me so happy. Samantha and Tim’s wedding perfectly combined those two and more. Here are some of my favorites from this gorgeous Kentucky day!
Dress: Bridal and Formal, Inc.
On Thursday, Diana became an alumna of Clemson’s engineering school. On Saturday, she became Andrew’s wife. I did this Clemson couple’s engagement photos on a cold January day in Clemson. Here are their wedding pictures from a 90-something degree clear day in May!
The wedding ceremony took place in the Fort Mill church Diana grew up in and a priest from Andrew’s church delivered the sermon. Diana wore the same veil that her mother and her mother’s sisters all wore on their wedding days, except Diana added a pretty gold headband to it. And how beautiful are these rings? Diana’s yellow-gold rings look so pretty together! Diana’s favorite color is gray, so her bridesmaids wore gray, but most of Diana’s accessories were gold and there were hints of blue in their bouquets, with Diana’s rosary wrapped around the bridal bouquet.
After the ceremony, we found a shaded area by a pond near the church for bride and groom portraits, then crossed the border of the Carolinas to Diana’s aunt’s house in Charlotte for the reception. Diana and Andrew love the movie Up, so “adventure” was a common theme throughout the reception. They had a few single-tier cakes from All In, a coffee shop in Clemson where everything tastes amazing, including their cakes. After a night of line dancing and partner dancing in the backyard, Diana and Andrew walked through a tunnel of their family and friends blowing bubbles that came from champagne bottle-shaped bubble containers, which I thought was adorable, before heading to Asheville for their first adventure as husband and wife.
Enjoy the photos of this beautiful celebration of love between two sweet people!
For the Spring 2018 semester, I had 8 Clemson graduation portrait sessions, with 2 more happening after graduation. Here are some of my favorites of each session at the best university.
Holly is graduating with a degree in health sciences. She grew up near University of Georgia and became interested in Clemson after they beat Georgia in football. We rescheduled her session for a cloudy afternoon that turned into a bright and sunny afternoon, which made the lighting a little different, and some gorgeous blossoms!
Logan is a political science major and Connor is a computer science major. They were also part of my wedding; we all met through Alpha Phi Omega National Service Fraternity. These were also the guys I spent the College Football National Championship with, as well as a lot of that football season.
Kinsey is graduating with a degree in chemistry. We arrived at the stadium to find out that the football team had decided to move their practice into the stadium at the last minute, meaning we couldn’t go inside and we had songs like Back in Black blasting through the stadium to pump us up for our other pictures. She did get to come during my next session and get her stadium photos, though!
Peter is a management major who will move to Atlanta after graduation. He paid tribute to the football team’s national title from his junior year by wearing a Deshaun Watson jersey for some of his shots around the stadium.
Mindy and Megan grew up in Tennessee. Although they didn’t have any family ties to Clemson, they both fell in love with the school and are graduating with engineering degrees.
Taron is also a chemistry major and wanted to explore the botanical gardens for the first time during her session. We were originally scheduled to have the session when the trees would be blooming, but it poured down rain on the day we scheduled. Her yellow dress looks so perfect in the gardens though!
Tracy and Katie have already graduated from Clemson once, but now they’re graduating with their Masters in Business Administration. Since all their MBA classes are in Greenville, they wanted to have a session in Greenville instead, which was a new experience for me!
Glenn is graduating with a degree in history. His session was right after most people moved out, so the campus was so quiet that day. Since there were no cars and his girlfriend had come along, I tried a shot of the two of them on the paw print at the 4-way stop.
Congratulations to Holly, Logan, Connor, Kinsey, Peter, Mindy, Megan, Taron, Tracy, Katie, and Glenn (and Alexa and Christine, who will be featured once their sessions happen) on their graduation from Clemson!
One of my first Clemson portrait sessions was for my friend Tracy when she was crowned Miss Clemson University 2016. Two years later, I got to cover the pageant and see Tara become Miss Clemson University. As the representative for the Clemson Rally Cats dance team, she had such an awesome dance routine to a Gloria Estefan mix, taking home the overall talent award in addition to the crown.
I finally got to get to know Tara on what was probably the most perfect day ever for a portrait session—it was a quiet, 65-degree and sunny day with the trees and flowers in full bloom. Tara is a health sciences major and a sister of Alpha Delta Pi. She was the 2016 Distinguished Young Woman of America. A few weeks before our session, she was on the basketball court with the Rally Cats when Clemson men’s basketball team made it to the Sweet Sixteen for the first time in over 20 years. A few days after our session, Tara and the Rally Cats went to Disney World to compete at Nationals.
Of course we had to have some photos of Tara in the stadium with her Rally Cat uniform, where she also posed for some pictures with excited Clemson fans who had come to take some pictures with Howard’s Rock. When we were taking pictures over by Tillman Hall, there was a group of little girls playing on Bowman Field who kept running up to us, then saying to each other, “I think she’s a REAL beauty queen!” They finally came close enough for Tara to show them the crown.
Here are some of my favorites from this beautiful day with this sweet lady!
For more information about Miss Clemson University, visit my Miss Clemson University page or contact the Clemson University Mortar Board.
Amethyst and Matt first met while working at the same Disney World resort. This past February, Matt got all of Amethyst’s friends in on his plan to propose at the top of the Orlando Eye, a giant Ferris wheel. When the ride was delayed due to a fire alarm going off, the group decided to take some pictures in front of the Ferris wheel while they were waiting. When it was Matt and Amethyst’s turn to have their picture taken, Matt popped the question. Fast forward to the day of the engagement session: seeing the Winnie the Pooh characters react to a couple saying they’re engaged is the cutest thing ever.
We started the session on the beaches by some of their favorite resorts. Yes, we had a beach and Cinderella’s Castle in the same shots! That Winnie the Pooh song that says “and the rain rain rain came down down down” is a perfect description of what happened partway through. Amethyst and Matt pulled out their Mickey and Minnie umbrellas, which was not only adorable, they looked especially gorgeous when a little patch of golden hour sun peeked out for a few minutes. To top it off, we saw a rainbow for a minute! Luckily, there were lots of gazebos in the area, so Matt and Amethyst could stay dry while dancing to Disney songs. Keep an eye out for a hidden Mickey in one of the gazebos!
I don’t normally shoot after the sun goes down, but we were in Disney. We headed to Main Street and watched the fireworks and light show at the castle. I asked Matt and Amethyst if they wanted some pictures with it. They said to wait until the grand finale, when there would be the most fireworks. There may have been a ton of people around, but the rush of catching the right moment was incredible.
I knew as soon as Matt and Amethyst told me where they wanted their engagement photos that this would be so much fun. Hope you love looking through these photos from the most magical location!
I had the pleasure of shooting the Miss Clemson University pageant once again! This time, I'm sharing some of my favorite photos of each contestant in a blog post.
After the opening dance in which all 19 contestants introduced themselves, Miss Clemson University 2017, Brooklyn, entered with the Tiger and Miss Clemson Life, Megan. Brooklyn was an MC for the event with Rachel, who was Miss Clemson 2016, which led her to become Miss South Carolina and the runner-up for Miss America 2016.
(for information about purchasing photos, visit my Miss Clemson University page).
I had the pleasure of meeting Caroline a few weeks before the pageant to do her pageant headshots. She was given the Miss Congeniality award at the pageant.
Reese represented the Clemson cheerleaders and performed the cheer routine to Tiger Rag, the Clemson fight song.
I'm loving this shot from Johanna's dance! She was awarded the 4th runner up.
Laney looked like she was having so much fun with her belly dance routine!
I had to post a shot of Kyndall's dress from the back because that cape is just so cool!
Alyx sang Orange Colored Sky in an orange jumpsuit. Perfect for Clemson!
I also did Margaret's pageant headshots a few weeks before the pageant. She received the People's Choice award at the pageant.
As a lover of all things vintage, I loved Carlyle's evening gown with her hair!
Tiffany's gown looks so sparkly in the stage lights!
Shea represented Clemson Dancers, the only organization represented at the pageant that I was part of during my time at Clemson.
Katherine played one of my favorite Disney songs (When You Wish Upon a Star) and had the perfect dress to go with it!
Claire had such a fun Irish dance for her talent!
I have so many favorite photos from Logan's dance routine! She was the first runner up and had the most ad sales.
This is my new favorite evening gown photo! Morgan was given the philanthropy award.
It's not my best picture of Cara's talent, but it's the one that shows off her outfit the best.
Nicole's dance routine and gorgeous white evening gown helped get her to 3rd runner up as just a freshman.
Abigail also sang jazz in a Clemson colored jumpsuit, wearing purple and singing Fly Me to the Moon.
The interview portion of the pageant took place earlier in the day, so I didn't see the interview for which Makenzie was given an award. She also was named second runner-up.
Brooklyn made her final walk as Miss Clemson University. She was joined onstage by her dad and brother.
The contestants returned to the stage for the award ceremony, which started with the first-ever crowning of Miss Clemson Life, Megan. Brooklyn passed on the crown she was given for Miss Clemson University 2016.
After all of the other awards were given, Tara of the Rally Cats was crowned Miss Clemson University 2018!
To see more, and for more information on purchasing photos, visit my Miss Clemson University page.
One day at Clemson, Andrew and Diana walked hand-in-hand through President's Park. Andrew did not know at the time that superstition says that means they'll be together forever. This time, they held hands and danced through the park, knowing their wedding day will be here before they know it.
I've done lots of portrait sessions around Clemson, and I had my proposal, engagement, and wedding pictures at Clemson, but this was my first time shooting an engagement session at Clemson, so this one holds a special place in my heart. It was also my first time doing a session in January; I don't know what colors are the most festive for January, but I'm pretty sure that Diana and Andrew wore them for the session, and Diana’s purple coat is so perfect for a session at Clemson. At such a cold time of year, there were no leaves on the trees to hide some of the buildings on the main part of campus, so Tillman Hall could be in the background of our photo from the steps of Sikes Hall and the photo of Diana’s yellow-gold and sapphire engagement ring. Diana has been on Clemson’s rowing team all four years of college, so we took some pictures down at the lake on the dock with a tiger paw painted on it. With a high of 37 degrees that day, we couldn’t see or hear any people or boats, making it a peaceful time for Diana and Andrew to snuggle up in a blanket on the dock and watch the sun go down over the water.
Here are some of my personal favorite photos from Diana and Andrew's engagement session. After having so much fun with them at their session, I'm even more excited now to be part of their wedding day!
I always watch the forecast closely and frequently in the week prior to a session, and Katherine and Anthony's session was no different. Yet somehow, I woke up, saw that the forecast had changed from "cloudy" to "partly cloudy," then looked out the window to see the roads covered in snow! Nobody in the area was expecting this to happen. Luckily, it melted enough to clear the roads so I could meet Katherine and Anthony for the session. This would be the first session I'd ever done in the dead of winter and it would be in Virginia, so I knew this would be my coldest session yet. What I didn't know was exactly how cold it would be. Although the snow all melted before I could get a shot of Katherine's ring in the snow, it was still cold enough for us to take pictures in front of a frozen pond.
After meeting through their friends a few years earlier, Anthony proposed to Katherine in Washington DC on the steps of the Jefferson Memorial, which faces the Washington Monument. They wanted to have their engagement photos in the Virginia Blue Ridge Mountains, so in the week between Christmas and New Year's, we met up in Crozet, slightly west of where Katherine and I both grew up. Their outfits were the perfect winter engagement session outfits; the colors were so classically wintery, Katherine's red sweater was a perfect fit for that time between Christmas and New Year's Eve, and Katherine's mom made the scarves that both of them wore to the session. Katherine's yellow-gold ring was covered by her gloves for a good part of the session, but it made for some cute photos of her trying to keep Anthony's hands warm. We ended the session a little earlier than I usually do because the windchill was so brutal (by our Virginia standards), but the golden hour lighting kept us out for just a few more pictures.
I'm so glad we were able to brave the cold at least for a little while and that the snow didn't keep us from driving to these mountain views. Since apparently everyone else in Crozet decided to stay inside that day (not that we can blame them), we had this gorgeous landscape all to ourselves. It's always an honor to be part of putting these sweet moments into pictures, but it's even more special in the wide open with absolutely nobody else in sight. I can't wait to come back to Charlottesville for the next part of Katherine and Anthony's story!
Another year over, a new one just begun.
2017 was quite the year for pictures! Clemson football won the national championship in January, I shot the Miss Clemson University pageant, Dan and I got married in July and took our honeymoon to British Columbia, there was a total solar eclipse in August, and there were all sorts of awesome people in front of my camera in between. So what's happening in 2018?
I will be shooting the Miss Clemson University pageant again this year! If you're competing and you don't have professional headshots for the program yet, let me know and we can set up a session.
Thank you so much to everyone who's supported me in my photography so far. Whether you've chosen me as your photographer or you've given me some words of encouragement, it means so much to me. I wouldn't have reached this point without your support.
Here are some of my favorite photos from everything I shot in 2017. I'm always striving to be better at what I do, and I can't wait for all the pictures 2018 has in store!
|
import riak
import uuid
# For regular HTTP...
# client = riak.RiakClient()
# For Protocol Buffers (go faster!)
client = riak.RiakClient(port=10018, transport_class=riak.RiakPbcTransport)
artifact_bucket = client.bucket('artifact')
def create_artifact(artifact_dict):
# ``artifact_dict`` should look something like:
# {
# 'title': 'Bangor Fire House circa 1908',
# 'description': 'A description of our bold artifact',
# 'slug': 'bangor-fire-house-circa-1908',
# 'address': '102 Broadway Street',
# 'city': 'Bangor',
# 'state': 'Maine',
# 'zipcode': '04401',
# 'image': 'path/to/image.jpg'
# 'created': time.time(),
# 'updated': time.time(),
# 'created_by': 'username',
# }
artifact = artifact_bucket.new(artifact_dict['slug'], data=artifact_dict)
artifact.store()
def get_artifact(artifact_slug):
artifact = artifact_bucket.get(artifact_slug)
return {
'artifact': artifact.get_data(),
}
'''
def create_comment(entry_slug, comment_dict):
# ``comment_dict`` should look something like:
# {
# 'author': 'Daniel',
# 'url': 'http://pragmaticbadger.com/',
# 'posted': time.time(),
# 'content': 'IS IT WEBSCALE? I HEARD /DEV/NULL IS WEBSCALE.',
# }
# Error handling omitted for brevity...
entry = artifact_bucket.get(entry_slug)
# Give it a UUID for the key.
comment = comment_bucket.new(str(uuid.uuid1()), data=comment_dict)
comment.store()
# Add the link.
entry.add_link(comment)
entry.store()
'''
'''
def get_entry_and_comments(entry_slug):
entry = artifact_bucket.get(entry_slug)
comments = []
# They come out in the order you added them, so there's no
# sorting to be done.
for comment_link in entry.get_links():
# Gets the related object, then the data out of its value.
comments.append(comment_link.get().get_data())
return {
'entry': entry.get_data(),
'comments': comments,
}
'''
'''
# To test:
if __name__ == '__main__':
create_entry({
'title': 'First Post!',
'author': 'Daniel',
'slug': 'first-post',
'posted': time.time(),
'tease': 'A test post to my new Riak-powered blog.',
'content': 'Hmph. The tease kinda said it all...',
})
create_comment('first-post', {
'author': 'Matt',
'url': 'http://pragmaticbadger.com/',
'posted': time.time(),
'content': 'IS IT WEBSCALE? I HEARD /DEV/NULL IS WEBSCALE.',
})
create_comment('first-post', {
'author': 'Daniel',
'url': 'http://pragmaticbadger.com/',
'posted': time.time(),
'content': 'You better believe it!',
})
data = get_entry_and_comments('first-post')
print "Entry:"
print data['entry']['title']
print data['entry']['tease']
print
print "Comments:"
for comment in data['comments']:
print "%s - %s" % (comment['author'], comment['content'])
'''
|
DGAP-News: UEC presented Be-200 amphibious aircraft remotorization project with SaM146 engine at Hydroaviasalon-2018 in Gelendzhik.
A spectacular waterspout over the Black Sea was captured on video by witnesses vacationing in Gelendzhik bay on the southern coast of Russia.
... am proud to play for Sweden. I will never let any racists destroy that pride," Durmaz said during his speech at the team's base camp in Gelendzhik.
|
"""Application exceptions. Base Exception class for this app is `StargateException`.
All application exceptions are caught here and sent back to the client in a prescribed format.
Exceptions are further grouped so that we can locate the part of the code causing a specific
exception. Werkzeug exceptions are also mapped here.
"""
from flask import jsonify
from werkzeug.exceptions import NotAcceptable, Conflict, BadRequest, NotFound, InternalServerError, UnsupportedMediaType, UnprocessableEntity
from werkzeug.http import HTTP_STATUS_CODES
############################--MAIN APPLICATION ERROR CLASS--##################################
class StargateException(Exception):
werkzeug_exception = InternalServerError
def __init__(self, msg=None):
self.msg = msg
@property
def status_code(self):
return self.werkzeug_exception.code
def as_dict(self):
return {
'status': self.status_code,
'message': self.msg if self.msg else HTTP_STATUS_CODES.get(self.status_code, ''),
'details': {'_exception_class': type(self).__name__}
}
def get_response(self):
response = jsonify(self.as_dict())
response.status_code = self.status_code
return response
############################--NOT-FOUND ERRORS--############################################
class ResourceNotFound(StargateException):
werkzeug_exception = NotFound
def __init__(self, resource, id = None, msg = None):
super(ResourceNotFound, self).__init__()
self.resource = resource
self.msg = msg
self.id = id
def as_dict(self):
dct = super(ResourceNotFound, self).as_dict()
dct['details'].update({'resource' : self.resource, 'primary_key' : self.id})
return dct
############################--CONFLICT ERRORS--############################################
class ConflictException(StargateException):
werkzeug_exception = Conflict
def __init__(self, msg, **kwargs):
super(ConflictException, self).__init__()
self.msg = msg
############################--MEDIATYPE ERRORS--############################################
class MediaTypeNotSupported(StargateException):
werkzeug_exception = UnsupportedMediaType
class NotAcceptable(StargateException):
werkzeug_exception = NotAcceptable
############################--VALIDATION ERRORS--############################################
class ValidationError(StargateException):
werkzeug_exception = BadRequest
def __init__(self, msg, **kwargs):
super(ValidationError, self).__init__()
self.msg = msg
class ComparisonToNull(ValidationError):
def __init__(self, msg, **kwargs):
        super(ComparisonToNull, self).__init__(msg)
self.msg = msg
class UnknownField(ValidationError):
def __init__(self, field, resource):
self.field = field
self.resource = resource
self.msg = "Unknown field {0} in model {1}".format(field, resource)
super(UnknownField, self).__init__(self.msg)
def as_dict(self):
dct = super(UnknownField, self).as_dict()
dct['details'].update({'field' : self.field, 'resource': self.resource})
return dct
class UnknownRelation(ValidationError):
    def __init__(self, relation, resource):
        self.relation = relation
        self.resource = resource
        self.msg = "Unknown relation {0} in model {1}".format(relation, resource)
        super(UnknownRelation, self).__init__(self.msg)
def as_dict(self):
dct = super(UnknownRelation, self).as_dict()
dct['details'].update({'relation' : self.relation, 'resource': self.resource})
return dct
class IllegalArgumentError(ValidationError):
def __init__(self, msg, **kwargs):
super(IllegalArgumentError, self).__init__()
self.msg = msg
class UnknownOperator(ValidationError):
def __init__(self, msg, **kwargs):
self.__name__ = 'UnknownOperator'
super(UnknownOperator, self).__init__(msg)
############################--PROCESSING ERRORS--############################################
class ProcessingException(StargateException):
werkzeug_exception = UnprocessableEntity
class MissingData(ProcessingException):
def __init__(self, model, *args, **kw):
super(MissingData, self).__init__(*args, **kw)
self.msg = "Missing `data` key for model {0}".format(model)
class MissingPrimaryKey(ProcessingException):
def __init__(self, model, *args, **kw):
super(MissingPrimaryKey, self).__init__(*args, **kw)
self.msg = "Missing `id` key for model {0}".format(model)
class DatabaseError(ProcessingException):
def __init__(self, msg, *args, **kw):
super(DatabaseError, self).__init__(*args, **kw)
self.msg = msg
class SerializationException(ProcessingException):
def __init__(self, instance, message=None, *args, **kw):
super(SerializationException, self).__init__(*args, **kw)
self.instance = instance
DEFAULT_MSG = "Failed to Deserialize Object"
self.msg = message if message is not None else DEFAULT_MSG
def as_dict(self):
dct = super(SerializationException, self).as_dict()
dct['details'].update({'instance' : self.instance})
return dct
class DeserializationException(ProcessingException):
def __init__(self, instance, message = None, *args, **kw):
        super(DeserializationException, self).__init__(*args, **kw)
        self.instance = instance
        DEFAULT_MSG = "Failed to Deserialize Object"
        self.msg = message if message is not None else DEFAULT_MSG
    def as_dict(self):
        dct = super(DeserializationException, self).as_dict()
dct['details'].update({'instance' : self.instance})
return dct
###############################################################################################
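# ---------------------------------------------------------------------------
# Usage sketch (illustrative addition, not part of the original module):
# wiring these exceptions into a Flask app so that any StargateException
# raised inside a view is returned as the JSON body built by get_response().
# The application object name `app` is assumed here.
#
# from flask import Flask
#
# app = Flask(__name__)
#
# @app.errorhandler(StargateException)
# def handle_stargate_exception(error):
#     return error.get_response()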
|
We are delighted to announce the 12 winning images of the International Cat Care 2016 Cat Naps photography competition!
The competition ran for six weeks from the start of April and attracted a record number of entries, with a total of 3,400 pictures submitted by photographers from 51 different countries! This is easily the largest number of entries the charity has ever received in the four years that the competition has been running.
This year, together with Your Cat magazine, we were looking for images of cats napping, something cats certainly like to do! We received so many fantastic entries including cats sleeping up trees, in flowerpots and even one of a cat napping on the back of a horse! The judges from iCatCare and Your Cat magazine had the privileged but difficult task of whittling the entries down to a shortlist of just 40, from which the 12 winners were selected.
Congratulations go to Viv Harding, Nic Howett, Crystal Moreno, Matthew Wong, Becky Wong, Patrycja Kuczynska, Johanna Vaurio-Teräväinen, Alexis-Kimonas Kokkinaris, Caroline Hooper, Phil Croucher, Christine Lam Ying Loi and Paul Brown – our winners!
The twelve winners will each receive £100 and a selection of iCatCare merchandise. All of the winners were also invited to the iCatCare Annual Awards which were held in London on 15 July.
Angel: My bowl and my blanket…Right?!?
OMG these kitties are adorable! I want them all :). Congrats to all the winners.
Check out these awesome winning cat photographs. Monkey and I are sure glad there is a black cat amongst them!
Do two identical cat photos exist in the universe? I think not!
|
# This file is an example of using the multivol code, and is derived from an
# original example in vispy which is released under a BSD license included here:
#
# ===========================================================================
# Vispy is licensed under the terms of the (new) BSD license:
#
# Copyright (c) 2015, authors of Vispy
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of Vispy Development Team nor the names of its
# contributors may be used to endorse or promote products
# derived from this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS
# IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED
# TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
# PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER
# OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
# LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
# NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# ===========================================================================
#
# This modified version is released under the BSD license given in the LICENSE
# file in this repository.
from itertools import cycle
import numpy as np
from vispy import app, scene, io
from vispy.scene import visuals
import vispy.visuals as impl_visuals
from vispy.color import get_colormaps, BaseColormap
from multivol import RGBAVolume
# Read volume
vol = np.load(io.load_data_file('volume/stent.npz'))['arr_0']
# Prepare canvas
canvas = scene.SceneCanvas(keys='interactive', size=(800, 600), show=True)
canvas.measure_fps()
# Set up a viewbox to display the image with interactive pan/zoom
view = canvas.central_widget.add_view()
# Set up RGBA cube with the data we want
vol = (vol - vol.min()) / (vol.max() - vol.min())
data = np.zeros(vol.shape + (4,))
data[..., 0] = vol
data[..., 2] = vol[::-1,::-1,::-1]
data[..., 3] = vol + vol[::-1,::-1,::-1]
data /= 2.
volume = RGBAVolume(data, parent=view.scene)
volume.transform = scene.STTransform(translate=(64, 64, 0))
# Create a turntable camera
fov = 60.
cam2 = scene.cameras.TurntableCamera(parent=view.scene, fov=fov,
name='Turntable')
view.camera = cam2 # Select turntable at first
canvas.update()
if __name__ == '__main__':
print(__doc__)
app.run()
|
We have a long heritage as a pioneering movement and we believe that God loves everyone and everyone is significant in his eyes. Want to know more about our work? Get a snapshot of what we do and how you can pray for our work from our publications.
Read online or download the latest edition of our supporter magazine Catalyst.
Read the latest edition of our Prayer Diary and find out how you can support us by praying for our work.
|
""" Module for windowing functions not found in SciPy
"""
from __future__ import division
import numpy as np
from scipy.signal import kaiser
__all__ = [
'kaiser_derived',
]
def kaiser_derived(M, beta):
""" Return a Kaiser-Bessel derived window.
Parameters
----------
M : int
Number of points in the output window. If zero or less, an empty
array is returned.
beta : float
Kaiser-Bessel window shape parameter.
Returns
-------
w : ndarray
The window, normalized to fulfil the Princen-Bradley condition.
Notes
-----
This window is only defined for an even number of taps.
References
----------
.. [1] Wikipedia, "Kaiser window",
https://en.wikipedia.org/wiki/Kaiser_window
"""
M = int(M)
try:
from scipy.signal import kaiser_derived as scipy_kd
return scipy_kd(M, beta)
except ImportError:
pass
if M < 1:
return np.array([])
if M % 2:
raise ValueError(
"Kaiser Bessel Derived windows are only defined for even number "
"of taps"
)
w = np.zeros(M)
kaiserw = kaiser(M // 2 + 1, beta)
csum = np.cumsum(kaiserw)
halfw = np.sqrt(csum[:-1] / csum[-1])
w[:M//2] = halfw
w[-M//2:] = halfw[::-1]
return w
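

if __name__ == '__main__':
    # Quick sanity check (illustrative addition, not part of the original
    # module): the two halves of a Kaiser-Bessel derived window should
    # satisfy the Princen-Bradley condition w[n]**2 + w[n + M//2]**2 == 1.
    M = 64
    w = kaiser_derived(M, beta=4.0)
    print(np.allclose(w[:M // 2] ** 2 + w[M // 2:] ** 2, 1.0))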
|
On the night of Sunday 28th November 1914 there was a mutiny at Upwey, near Weymouth. Private Wallace Williams of the 3rd Dorset Regiment was killed and Private Lane injured. The papers report that Grandad Beck attended the Coroner’s court but give us no information about his involvement. The civil court’s part in this was to ascertain how the death happened and if it was a criminal offence. The war had started 4 months before. I am sure that investigating the incident had to be handled with care, as it involved both the military and civilian police. It is likely that Grandad Beck, as Dorset’s only detective, was involved in the investigation and liaising with the Dorset Regiment. This may have helped to secure his promotion the following year, to Superintendent of Blandford Forum, a town with a military base nearby.
I know it is not really relevant but I couldn’t resist another picture of Lionel in uniform taken in 1917.
On 23rd April 1917 Grandad Beck’s son Lionel joined the Royal North Devon Hussars 2/1 Battalion. His service was to last 101 days, just over 3 months. From his discharge papers and the three letters Lionel wrote to his sister May, we find out about this time. Less than a year after he returned home, Lionel died; his death certificate contained a surprise.
We learn that Private Lionel Howard Beck was 5ft 6in with blue eyes and fair hair. Before his call up, Lionel, worked as a shop assistant, most likely in Blandford where he lived at home with his parents and sister. At this time Grandad Beck was Superintendent at Blandford and the family lived in the police station.
|
# starfield background class
import pygame
from pygame.locals import *
import var, gfx, math
from random import randint
import gameinit
class Stars:
def __init__(self):
stars = []
scrwide, scrhigh = gfx.rect.size
self.maxstars = 800
for x in range(self.maxstars):
val = randint(1, 3)
color = val*40+60, val*35+50, val*22+100
speed = -val, val
rect = Rect(randint(0, scrwide), randint(0, scrhigh), 1, 1)
stars.append([rect, speed, color])
        half = self.maxstars // 2
self.stars = stars[:half], stars[half:]
self.numstars = 50
self.dead = 0
self.odd = 0
def recalc_num_stars(self, fps):
if isinstance(var.handler, gameinit.GameInit):
#don't change stars while loading resources
return
change = int((fps - 35.0) * 1.8)
change = min(change, 12) #limit how quickly they can be added
numstars = self.numstars + change
        numstars = max(min(numstars, self.maxstars // 2), 0)
if numstars < self.numstars:
DIRTY, BGD = gfx.dirty, self.last_background
for rect, vel, col in self.stars[self.odd][numstars:self.numstars]:
DIRTY(BGD(rect))
self.numstars = numstars
#print 'STAR:', numstars, fps, change
def erase_tick_draw(self, background, gfx):
R, B = gfx.rect.bottomright
FILL, DIRTY = gfx.surface.fill, gfx.dirty
for s in self.stars[self.odd][:self.numstars]:
DIRTY(background(s[0]))
self.odd = not self.odd
for rect, (xvel, yvel), col in self.stars[self.odd][:self.numstars]:
rect.left = (rect.left + xvel) % R
rect.top = (rect.top + yvel) % B
DIRTY(FILL(col, rect))
self.last_background = background
def eraseall(self, background, gfx): #only on fullscreen switch
R, B = gfx.rect.bottomright
FILL = gfx.surface.fill
for s in self.stars[0][:self.numstars]:
background(s[0])
for s in self.stars[1][:self.numstars]:
background(s[0])
|
Products, Training, and Support – as well as new ways to improve hospital efficiency.
Get education and training on the noddle system.
View, download, or print Voxello product manuals.
Get training and support for Voxello products.
Information on our breakthrough noddle technology.
|
__author__ = 'MrTrustworthy'
import inspect
class Antenna:
def __init__(self):
self.listeners = {}
def add_listener(self, channel, callback):
if len(inspect.signature(callback).parameters) == 0:
raise TypeError("Callback Function needs at least 1 parameter")
if channel not in self.listeners.keys():
self.listeners[channel] = []
self.listeners[channel].append(callback)
def remove_listener(self, channel, callback):
if channel in self.listeners.keys():
if callback in self.listeners[channel]:
self.listeners[channel].remove(callback)
if len(self.listeners[channel]) == 0:
del self.listeners[channel]
def dispatch_message(self, channel, info=None, fail_when_empty=False):
if channel not in self.listeners.keys():
if fail_when_empty:
raise KeyError("No listener on this channel")
else:
return
for callback in self.listeners[channel]:
callback(info)
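

if __name__ == '__main__':
    # Minimal usage sketch (illustrative addition; the channel name and
    # callback below are arbitrary): register a listener on a channel,
    # dispatch a message to it, then remove the listener again.
    antenna = Antenna()

    def on_player_died(info):
        print("player died:", info)

    antenna.add_listener("player.died", on_player_died)
    antenna.dispatch_message("player.died", info={"score": 42})
    antenna.remove_listener("player.died", on_player_died)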
|
There are more photos from Doniphan alumni listed at Classmates.com®. Click here to join Classmates.com® for free and share more DHS pictures.
Pictures of Doniphan High alumni are listed below. You can view the alumni, graduation, reunion, or sports photos from Doniphan in MO or upload photos you have taken related to Doniphan High School.
|
from laspy.file import File
import numpy as np
import colorsys, random, math, os, load_info
laszip = "../LAStools/bin/laszip"
def generate_spheres(lidar_list, areas_list, c1, c2):
"""
    Create a string with sphere definitions representing the points of the LiDAR files
    that fall inside the coordinates passed as parameters.
"""
print("Generating spheres...")
spheres = ""
for lidar_file in lidar_list:
lidar_file = lidar_file[0]
print("Generating spheres from " + lidar_file)
os.system(laszip + " -i " + lidar_file + " -o " + lidar_file[:-3] + "LAS")
inFile = File(lidar_file[:-3] + "LAS", mode='r')
point_records = inFile.points
x_scale = inFile.header.scale[0]
x_offset = inFile.header.offset[0]
y_scale = inFile.header.scale[1]
y_offset = inFile.header.offset[1]
z_scale = inFile.header.scale[2]
z_offset = inFile.header.offset[2]
final_points = []
count = 0
total = 0
number_points = len(point_records)
max_points = int(number_points / 3)
if max_points > 1000000:
max_points = 1000000
print("Reading all points...")
while(count < max_points and total < number_points):
rand = random.randint(0, number_points - 1)
point = point_records[rand]
# Take point coordinates
point = point[0]
x_coordinate = point[0] * x_scale + x_offset
y_coordinate = point[1] * y_scale + y_offset
z_coordinate = point[2] * z_scale + z_offset
total += 1
# In interesting zone?
interest = False
for area in areas_list:
if load_info.is_collision(float(area[0]), float(area[1]), float(area[2]), float(area[3]),
x_coordinate, y_coordinate, x_coordinate, y_coordinate):
if load_info.is_collision(float(c1[0]), float(c1[1]), float(c2[0]), float(c2[1]) - 500,
x_coordinate, y_coordinate, x_coordinate, y_coordinate):
interest = True
break
if interest == True:
red = str(point[10] / 65535)
green = str(point[11] / 65535)
blue = str(point[12] / 65535)
z_coordinate *= 1.85
z_coordinate -= 18
final_points.append([str(x_coordinate), str(z_coordinate), str(y_coordinate), red, green, blue])
count += 1
inFile.close()
os.system("rm " + lidar_file[:-3] + "LAS")
number_points = len(final_points)
#max_points = int(number_points / 3)
#max_points = number_points
if max_points > 1000000:
max_points = 1000000
count = 0
print("Taking " + str(number_points) + " points...")
for point in final_points:
#rand = random.randint(0, number_points - 1)
#point = final_points[rand]
spheres += ("sphere {\n<" + point[0] + ", " + point[1] + ", " + point[2] + ">, 2\ntexture {\npigment { color rgb <"
+ point[3] + ", " + point[4] + ", " + point[5] + "> }\n}\nno_shadow\n}\n")
count += 1
return spheres
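

# Usage sketch (illustrative addition; the file names and coordinates below are
# made up, and the laszip binary plus the load_info module must be available):
#
# spheres = generate_spheres(
#     lidar_list=[["data/tile_1.laz"]],
#     areas_list=[[530000, 4640000, 531000, 4641000]],
#     c1=(530000, 4640000),
#     c2=(531000, 4641500),
# )
# with open("points.inc", "w") as handle:
#     handle.write(spheres)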
|
HOMsoft wins Microsoft's "2003 Information Worker Challenge"
Warranty Management Technologies, LLC creator, Tracey Gundersen was awarded one of 15 “Innovator of the Year” awards by Minneapolis-based Finance and Commerce Newspaper.
at 11:00 a.m. CST on WCCO-AM.
Tracey and BATC representative Wendy Danks discussed what home buyers can expect with warranty service, home buyer orientations, and why maintenance is important to maintaining warranties. Also, buyers learned how the warranty process works and how to maximize their new home experience.
|
# Copyright 2017 Cloudbase Solutions Srl
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Config options available for the Azure metadata service."""
from oslo_config import cfg
from cloudbaseinit.conf import base as conf_base
class AzureOptions(conf_base.Options):
"""Config options available for the Azure metadata service."""
def __init__(self, config):
super(AzureOptions, self).__init__(config, group="azure")
self._options = [
cfg.StrOpt(
"transport_cert_store_name",
default="Windows Azure Environment",
help="Certificate store name for metadata certificates"),
]
def register(self):
"""Register the current options to the global ConfigOpts object."""
group = cfg.OptGroup(self.group_name, title='Azure Options')
self._config.register_group(group)
self._config.register_opts(self._options, group=group)
def list(self):
"""Return a list which contains all the available options."""
return self._options
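# ---------------------------------------------------------------------------
# Usage sketch (illustrative addition, not part of cloudbase-init; the module
# path `cloudbaseinit.conf.azure` is assumed): register the options above on a
# fresh ConfigOpts object and read the default value back.
#
# from oslo_config import cfg as oslo_cfg
# from cloudbaseinit.conf.azure import AzureOptions
#
# conf = oslo_cfg.ConfigOpts()
# AzureOptions(conf).register()
# print(conf.azure.transport_cert_store_name)   # "Windows Azure Environment"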
|
The ideas for these articles come from the successes, failures, and discoveries in real-world projects. In this particular case, all three inspirations are at play. The unconventional process described in the first section has proven successful, with success measured by everyone going home on time and the projects delivered on time and under budget. The failure that led to the specific problem and solution covered here was an application that continually required more memory and additional hardware at regular release intervals. A random check of code while looking for re-usable assets revealed that a good percentage of the problem was the inefficient use of String. The discovery of the Eclipse feature that makes this issue easy to track and address was the result of randomly trying different settings (which is less efficient than reading the documentation but much more fun).
FUD: Fear, Uncertainty, and Doubt. These are the three roadblocks to improvements. When you read this article, you will probably experience all three until you have read it all the way through. Some may even have to give this a try themselves before they will be cured. And, some need an incentive to keep reading to get over their FUD. So, look at three problems that almost every application suffers from as motivation to get through the next sections with an open mind.
The first cost of a String literal is the overhead of creating the String. There are few more expensive operations than creating a new object. The use of interfaces is a common approach because you are all aware of this overhead. Yet, because String is so ubiquitous, it is often forgotten that creating a String is creating a new object. The failure example alluded to earlier is a perfect example. One class had a large number of Strings declared. They were all declared as static final, a good practice to reduce the cost of String creation. However, a random search of one of these declarations revealed the exact same objects being created in 70 different files. A minimum footprint for a String is 40 bytes. That adds up quickly to more hardware expense and a good amount of labor spent looking for performance improvements.
The second cost is in the processing overhead of String comparison, a frequent reason for declaring a String in the first place. With the same String declared multiple times, the more efficient == comparison cannot be relied upon, requiring the more intensive .equals() comparison.
The third cost is maintenance. Tracking down every instance of a String is much more tedious and prone to error than changing a single instance.
There are some processes that are rarely used that every project can benefit from. One of these is daily code reviews by either a build master or technical lead. This is rarely done for many reasons, two of which are the misconceptions that it takes more time than it is worth and that it is hard to do.
Looking at the time versus value concept, a daily review should take no longer than an hour. That time estimate is based on the reviewer being responsible for no more than eight developers (more than that and re-configuring the teams should be considered) and that it is done from the beginning rather than waiting for a performance issue to occur that can't be easily traced. In a six-month project, this would total 125 hours. When compared with how long it takes to tune applications in either QA or production, project savings will generally average 100%.
Daily code reviews should be easy. Every source control application includes a report of what files have changed since the last update and a comparison tool to view differences between versions. Setting aside simple beans and other classes that can (and should) be generated by an IDE, the total lines of code output by a team on a daily basis is far less than one might think. This is not because the team is not productive; it is because producing a line of code consists of thinking about what the line should be, writing the line, testing the line, and corrections to the current line or previously written lines based on test results.
The time taken to review the code can be greatly reduced through the use of code analysis tools both native to an IDE and available as plug-ins. By providing feedback from these daily reviews to team members and having them make the corrections themselves, the team will reduce their code standard variations. Code that is clean to begin with takes even less time to review. That hour per day can quickly drop to an average of 30 minutes a day.
These daily code reviews should not be full peer reviews. They only need to be cursory reviews looking for what can be found quickly (once the review becomes a habit). One Eclipse feature that can speed this process is the use of the Errors/Warnings settings under the Compiler preference. While the default settings are very useful, there is one non-default setting that every Java development team can benefit from. Setting Non-externalized strings to Warning.
The same String values used repeatedly in a web application are something that every developer is familiar with. Because example code is rife with String declarations, it is a common (and expensive) habit to declare Strings often. A much more efficient approach is the creation of a single interface to hold String values that will be re-used in more than one class (JSPs included).
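As a minimal sketch of what such an interface might look like (the constant names here are illustrative, not taken from any specific project), consider:

public interface WebConstants {
    String USERNAME = "username";
    String PASSWORD = "password";
}

Fields declared in an interface are implicitly public, static and final, so every class and JSP that refers to WebConstants.USERNAME shares the same String object, and comparisons between these constants can safely use the == operator.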
The two Strings above will be familiar to anyone who has ever worked on a web application. If you were to look at your last application, how many times would you find these Strings declared? Multiply that by the memory size required by each and you will see how much time you spent in meetings discussing how to reduce the time it takes for a user to login where there was a simple (if minor) reduction available immediately. Then add in the processing time where the .equals() method is used instead of the faster == comparison operator. I have used this design approach for many years; I was fortunate enough to be introduced to it on my first Java project. The average number of such Strings used in a web application is 120, with the criteria that the String must be used by more than one object. Frequently, these Strings are used by four or more objects. You would average three objects per String with an average of 60 bytes per String. Gosh, that is only .02 MB. Hardly worth it, eh? Ah, but these Strings are rarely declared as static final, so if you expect 1000 concurrent users, you are now at 20 MB. I'm prone to kill processes on my machine that use anything more than 500k if they aren't critical because I know they slow my 4GB machine down.
Although the use of String.intern() would also reduce overhead, that particular approach is much more useful at a class level than an application level.
|
"""
Test Results for discrete models from Stata
"""
import numpy as np
#### Discrete Model Tests ####
# Note that there is a slight refactor of the classes, so that one dataset
# might be used for more than one model
class Anes():
def __init__(self):
"""
Results are from Stata 11 (checked vs R nnet package).
"""
self.nobs = 944
def mnlogit_basezero(self):
params = [-.01153598, .29771435, -.024945, .08249144, .00519655,
-.37340167, -.08875065, .39166864, -.02289784, .18104276,
.04787398, -2.2509132, -.1059667, .57345051, -.01485121,
-.00715242, .05757516, -3.6655835, -.0915567, 1.2787718,
-.00868135, .19982796, .08449838, -7.6138431, -.0932846,
1.3469616, -.01790407, .21693885, .08095841, -7.0604782,
-.14088069, 2.0700801, -.00943265, .3219257, .10889408,
-12.105751]
self.params = np.reshape(params, (6,-1))
bse = [.0342823657, .093626795, .0065248584, .0735865799,
.0176336937, .6298376313, .0391615553, .1082386919,
.0079144618, .0852893563, .0222809297, .7631899491,
.0570382292, .1585481337, .0113313133, .1262913234,
.0336142088, 1.156541492, .0437902764, .1288965854,
.0084187486, .0941250559, .0261963632, .9575809602,
.0393516553, .1171860107, .0076110152, .0850070091,
.0229760791, .8443638283, .042138047, .1434089089,
.0081338625, .0910979921, .025300888, 1.059954821]
self.bse = np.reshape(bse, (6,-1))
self.cov_params = None
self.llf = -1461.922747312
self.llnull = -1750.34670999
self.llr = 576.8479253554
self.llr_pvalue = 1.8223179e-102
self.prsquared = .1647810465387
self.df_model = 30
self.df_resid = 944 - 36
self.J = 7
self.K = 6
self.aic = 2995.84549462
self.bic = 3170.45003661
z = [-.3364988051, 3.179798597, -3.823070772, 1.121012042,
.2946945327, -.5928538661, -2.266269864, 3.618564069,
-2.893164162, 2.122688754, 2.148652536, -2.949348555,
-1.857818873, 3.616885888, -1.310634214, -.0566342868,
1.712822091, -3.169435381, -2.090799808, 9.920912816,
-1.031191864, 2.123004903, 3.225576554, -7.951122047,
-2.370538224, 11.49421878, -2.352389066, 2.552011323,
3.523595639, -8.361890935, -3.34331327, 14.43480847,
-1.159676452, 3.533839715, 4.303962885, -11.42100649]
self.z = np.reshape(z, (6,-1))
pvalues = [0.7364947525, 0.0014737744, 0.0001317999, 0.2622827367,
0.7682272401, 0.5532789548, 0.0234348654, 0.0002962422,
0.0038138191, 0.0337799420, 0.0316619538, 0.0031844460,
0.0631947400, 0.0002981687, 0.1899813744, 0.9548365214,
0.0867452747, 0.0015273542, 0.0365460134, 3.37654e-23,
0.3024508550, 0.0337534410, 0.0012571921, 1.84830e-15,
0.0177622072, 1.41051e-30, 0.0186532528, 0.0107103038,
0.0004257334, 6.17209e-17, 0.0008278439, 3.12513e-47,
0.2461805610, 0.0004095694, 0.0000167770, 3.28408e-30]
self.pvalues = np.reshape(pvalues, (6,-1))
self.conf_int = [[[-0.0787282, 0.0556562], [0.1142092, 0.4812195],
[-0.0377335, -0.0121565], [-0.0617356, 0.2267185], [-0.0293649,
0.0397580], [-1.6078610, 0.8610574]], [[-0.1655059, -0.0119954],
[0.1795247, 0.6038126], [-0.0384099, -0.0073858], [0.0138787,
0.3482068], [0.0042042, 0.0915438], [-3.7467380, -0.7550884]],
[[-0.2177596, 0.0058262], [0.2627019, 0.8841991], [-0.0370602,
0.0073578], [-0.2546789, 0.2403740], [-0.0083075, 0.1234578],
[-5.9323630,-1.3988040]],[[-0.1773841, -0.0057293], [1.0261390,
1.5314040], [-0.0251818, 0.0078191], [0.0153462, 0.3843097],
[0.0331544, 0.1358423], [-9.4906670, -5.7370190]], [[-0.1704124,
-0.0161568], [1.1172810, 1.5766420], [-0.0328214, -0.0029868],
[0.0503282, 0.3835495], [0.0359261, 0.1259907], [-8.7154010,
-5.4055560]], [[-0.2234697, -0.0582916], [1.7890040, 2.3511560],
[-0.0253747, 0.0065094], [0.1433769, 0.5004745], [0.0593053,
0.1584829], [-14.1832200, -10.0282800]]]
class Spector():
"""
Results are from Stata 11
"""
def __init__(self):
self.nobs = 32
def logit(self):
self.params = [2.82611297201, .0951576702557, 2.37868772835,
-13.0213483201]
self.cov_params = [[1.59502033639, -.036920566629, .427615725153,
-4.57347950298], [-.036920566629, .0200375937069,
.0149126464275, -.346255757562], [.427615725153 ,
.0149126464275, 1.13329715236, -2.35916128427],
[-4.57347950298, -.346255757562, -2.35916128427,
24.3179625937]]
self.bse = [1.26294114526, .141554207662, 1.06456430165, 4.93132462871]
self.llf = -12.8896334653335
self.llnull = -20.5917296966173
self.df_model = 3
self.df_resid = 32 - 4 #TODO: is this right? not reported in stata
self.llr = 15.4041924625676
self.prsquared = .374038332124624
self.llr_pvalue = .00150187761112892
self.aic = 33.779266930667
self.bic = 39.642210541866
self.z = [2.237723415, 0.6722348408, 2.234423721, -2.640537645]
self.conf_int = [[.3507938,5.301432],[-.1822835,.3725988],[.29218,
4.465195],[-22.68657,-3.35613]]
self.pvalues = [.0252390974, .5014342039, .0254552063, .0082774596]
self.margeff_nodummy_dydx = [.36258084688424,.01220841099085,
.30517768382304]
self.margeff_nodummy_dydxmean = [.53385885781692,.01797548988961,
.44933926079386]
self.margeff_nodummy_dydxmedian = [.25009492465091,.00842091261329,
.2105003352955]
self.margeff_nodummy_dydxzero = [6.252993785e-06,2.105437138e-07,
5.263030788e-06]
self.margeff_nodummy_dyex = [1.1774000792198,.27896245178384,
.16960002159996]
self.margeff_nodummy_dyexmean = [1.6641381583512,.39433730945339,
.19658592659731]
self.margeff_nodummy_dyexmedian = [.76654095836557,.18947053379898,0]
self.margeff_nodummy_dyexzero = [0,0,0]
self.margeff_nodummy_eydx = [1.8546366266779,.06244722072812,
1.5610138123033]
self.margeff_nodummy_eydxmean = [2.1116143062702,.0710998816585,
1.7773072368626]
self.margeff_nodummy_eydxmedian = [2.5488082240624,.0858205793373,
2.1452853812126]
self.margeff_nodummy_eydxzero = [2.8261067189993,.0951574597115,
2.3786824653103]
self.margeff_nodummy_eyex = [5.4747106798973,1.3173389907576,
.44600395466634]
self.margeff_nodummy_eyexmean = [6.5822977203268,1.5597536538833,
.77757191612739]
self.margeff_nodummy_eyexmedian = [7.8120973525952,1.9309630350892,0]
self.margeff_nodummy_eyexzero = [0,0,0]
# for below GPA = 2.0, psi = 1
self.margeff_nodummy_atexog1 = [.1456333017086,.00490359933927,
.12257689308426]
# for below GPA at mean, tuce = 21, psi = 0
self.margeff_nodummy_atexog2 = [.25105129214546,.00845311433473,
.2113052923675]
self.margeff_dummy_dydx = [.36258084688424,.01220841099085,
.35751515254729]
self.margeff_dummy_dydxmean = [.53385885781692,.01797548988961,
.4564984096959]
# self.margeff_dummy_dydxmedian
# self.margeff_dummy_dydxzero
self.margeff_dummy_eydx = [1.8546366266779,.06244722072812,
1.5549034398832]
self.margeff_dummy_eydxmean = [2.1116143062702,.0710998816585,
1.6631775707188]
# self.margeff_dummy_eydxmedian
# self.margeff_dummy_eydxzero
# Factor variables not allowed in below
# self.margeff_dummy_dyex
# self.margeff_dummy_dyexmean
# self.margeff_dummy_dyexmedian
# self.margeff_dummy_dyexzero
# self.margeff_dummy_eyex
# self.margeff_dummy_eyex
# self.margeff_dummy_eyex
# self.margeff_dummy_eyex
# for below GPA = 2.0, psi = 1
self.margeff_dummy_atexog1 = [.1456333017086,.00490359933927,
.0494715429937]
# for below GPA at mean, tuce = 21, psi = 0
self.margeff_dummy_atexog2 = [.25105129214546,.00845311433473,
.44265645632553]
def probit(self):
self.params = [1.62581025407, .051728948442, 1.42633236818,
-7.45232041607]
self.cov_params = [[.481472955383, -.01891350017, .105439226234,
-1.1696681354], [-.01891350017, .00703757594, .002471864882,
-.101172838897], [.105439226234, .002471864882, .354070126802,
-.594791776765], [-1.1696681354, -.101172838897, -.594791776765,
6.46416639958]]
self.bse = [.693882522754, .083890261293, .595037920474, 2.54247249731]
self.llf = -12.8188033249334
self.llnull = -20.5917296966173
self.df_model = 3
self.df_resid = 32 - 4
self.llr = 15.5458527433678
self.prsquared = .377478069409622
self.llr_pvalue = .00140489496775855
self.aic = 33.637606649867
self.bic = 39.500550261066
self.z = [ 2.343062695, .6166263836, 2.397044489, -2.931131182]
self.conf_int = [[.2658255,2.985795],[-.1126929,.2161508],[.2600795,
2.592585],[-12.43547,-2.469166]]
self.pvalues = [.0191261688, .537481188, .0165279168, .0033773013]
class RandHIE():
"""
Results obtained from Stata 11
"""
def __init__(self):
self.nobs = 20190
def poisson(self):
self.params = [-.052535114675, -.247086797633, .035290201794,
-.03457750643, .271713973711, .033941474461, -.012635035534,
.054056326828, .206115121809, .700352877227]
self.cov_params = None
self.bse = [.00288398915279, .01061725196728, .00182833684966,
.00161284852954, .01223913844387, .00056476496963,
.00925061122826, .01530987068312, .02627928267502,
.01116266712362]
self.llf = -62419.588535018
self.llnull = -66647.181687959
self.df_model = 9
self.df_resid = self.nobs - self.df_model - 1
self.llr = 8455.186305881856
self.prsquared = .0634324369893758
self.llr_pvalue = 0
self.aic = 124859.17707
self.bic = 124938.306497
self.z = [-18.21612769, -23.27219872, 19.30180524, -21.43878101,
22.20041672, 60.09840604, -1.36585953, 3.53081538, 7.84325525,
62.74063980]
self.conf_int = [[ -.0581876, -.0468826],[-0.2678962, -0.2262774],
[0.0317067, 0.0388737],[-0.0377386, -0.0314164],
[0.2477257, 0.2957022], [0.0328346, 0.0350484],[-0.0307659,
0.0054958], [0.0240495, 0.0840631],[0.1546087, 0.2576216],
[0.6784745, 0.7222313]]
self.pvalues = [3.84415e-74, 8.4800e-120, 5.18652e-83, 5.8116e-102,
3.4028e-109, 0, .1719830562, .0004142808, 4.39014e-15, 0]
|
Here are some things to do if you’re facing a seemingly severe problem: fixing your garage door opener. The good news is that these machines are fairly simple to maintain and to repair, even for those who rarely deal with fixing machinery.
If you’re using an automatic garage door operator with a remote control, check whether the remote control battery still provides enough power to transmit a signal. If the issue isn’t the battery, it could be caused by a failure in proper installation; your receiver might not be able to pick up the signal properly.
If there’s nothing wrong with the transmitter and the signal, check the track. There could be something obstructing its path, or it might have become somewhat rusty. In such cases, consider buying a new opener for your garage door or, if it can still be salvaged, repair the affected area so that it can run smoothly.
Your door opens but it doesn’t close; what’s wrong? It may be due to the light beam detector. Newer models of door operators have a built-in beam detector that can sense whether something is blocking the door’s path. This acts as a safety measure to prevent crushing something or causing accidents. When it senses that something is obstructing its path, it immediately opens back up.
If that is the issue, inspect the installation of the beam detector. When it isn’t installed correctly, it can sense things that aren’t actually in the path of your garage door, which will cause the door to fail to close.
Chain-driven openers are usually noisier. They also generate more vibration than the more sophisticated belt-driven door operators. But if you notice more noise and vibration than usual, there may be something wrong with the chain or the belt. If the motion isn’t fluid, check for rust or damaged areas. Replace parts if needed.
The Android apps phenomenon has gained momentum with many helpful programs. Nowadays people rely heavily on these apps and, in fact, use them for many crucial tasks in their day-to-day life. These useful apps cover almost all of the important areas of the digital world to make the life of the common person easier. There are countless apps available to meet your needs; let’s discuss some of the useful ones here.
Avast Mobile Security: With over a hundred million users, this Android anti-malware application offers extended services such as a call blocker, app locker, privacy advisor, firewall and more. It also offers features like a RAM booster, junk cleaner and Wi-Fi scanner. It protects your devices against malicious websites, emails, links, phone calls, SMS messages and so on. In addition, it provides alerts for spyware and adware apps. This antivirus engine provides a complete internet shield against all possible threats.
File Manager (File Transfer): This app is provided by Cheetah Mobile and has more than 50 million downloads. This fully featured file management app provides extensive services for users, from basic features such as cut, copy, paste, delete, search, compress and decompress to many advanced features such as cloud support, Wi-Fi file transfer and more. You can use this handy tool to view files by type (image, audio, video, downloads etc.) and access them in one tap via widgets. This wonderful app offers multiple protocols to transfer files easily.
G Cloud Backup: This app is the complete package for your storage concerns. It offers cloud backup to store your files (videos, pictures, audio, contacts, documents, messages, call logs etc.) and keep them safe and secure. You can then access these files anytime, anywhere. You can use this app to easily migrate your data between different devices and to expand your device’s storage space. It supports most Android devices as well as tablets, iPhone and iPad.
7 Minute Workout: Loved by millions of users around the globe, this app is clinically proven to help you deal with your fitness worries. It helps you lose weight, strengthen your muscles, improve cardiovascular function and more. Its extensive features include Google Fit support, voice guidance, exercise logs, adjustable circuit and rest times, notifications for daily workouts and more. This is a great option to stay fit and healthy.
You can use these helpful apps to make your experience on Android convenient and impressive. Apart from these apps, one common difficulty users face is cluttered and full storage space on their devices, which affects device speed and performance. To manage this situation you can use Android cleaner apps for better and instant results.
Similar to any other pc or notebook, installing updated and appropriate drivers is a crucial necessity for the smooth performance of your own HP Compaq NW8240. Drivers will be the applications that assist the computer interact with all the hardware devices of your notebook. An obsolete driver doesn’t enable the hardware parts of your computer to operate correctly. Therefore it will become important that you maintain all updated drivers onto your own HP Compaq NW8240. This is a very simple task which you may perform with a tiny technical understanding. In the event you’re not capable update drivers after following these hints then look at calling seasoned computer support providers.
It’s very important that you understand the title of manufacturer and apparatus to figure out the proper driver. The most preferred method of getting the proper driver would be always to go on the manufacturer’s site and discover the appropriate driver for your specific device. Another means of getting it’s via online research to achieve a trustworthy site from where proper drivers could be downloaded.
The second step you want to undertake would be downloading the driver that’s compatible with all the operating system installed in your HP Compaq NW8240. You have to be certain the driver you’re going to download is your harmonious version and it won’t induce software incompatibility problems.
Before beginning installing the driver onto your own HP Compaq NW8240 notebook computer, you’re advised to create certain your machine is free of all sorts of viruses, malware and other internet ailments. It will become important because the existence of these ailments can derail the method of setup. The best means of eliminating these is using very good excellent antivirus software to clean your system. In addition, you also need cleaning the registry using a trusted registry cleaner so the setup is finished with no difficulties.
The final step is to keep the installed drivers updated. This begins once the installation is complete; without updating the drivers you will not be able to use their new features, so update them regularly to keep your notebook performing well.
Device Manager lists all of the hardware devices present in your system. A device driver may be obsolete and need updating if the device is shown with a yellow marker in the list. You can update that device's driver by right-clicking it and choosing 'Update Driver Software' from the menu.
Updating drivers for your HP Compaq NW8240 is not a particularly tricky job, but limited technical knowledge can make it seem complex. You can get it done quickly and conveniently by contacting an experienced support provider; technicians at such providers are available online 24x7x365 and are specially trained to fix problems with HP notebooks.
An electrical generator provides power when there is insufficient or no available supply, ensuring that routine and business activities continue during a blackout. Besides serving as an emergency backup supply, a generator can also be used to provide consistent power to people and organisations in remote regions that are not reached by mains electricity.
What will you use the generator for? Will you use it to run heavy machinery, or as a power backup for your home or business? For home or personal use, a single-phase generator of 5 kW to 30 kW will do just fine. For industrial applications, or to power a large business, three-phase generators from 30 kW to 6 MW are recommended.
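As a rough worked example (the figures are illustrative only): if a home needs to keep a refrigerator (about 700 W), a sump pump (about 800 W), a furnace fan (about 800 W) and some lighting (about 300 W) running, that adds up to roughly 2.6 kW; allowing extra headroom for motor start-up surges, a 5 kW single-phase unit would cover it comfortably.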
Several outstanding brands offer an extensive range of generator sets. To find out which one matches your requirements, do some research: look on the web for more information about each manufacturer. Well-known brands will most likely have their own websites, or suppliers of those brands may list their available generators online. You can also look at listing sites or forums and see what other customers have said about a particular generator model. A little research will give you a better idea of a generator's specifications and price range.
If you need help buying a generator, it is best to discuss your requirements and concerns with an established mechanical equipment and generator supplier. They can talk through the advantages and disadvantages of each option in detail and steer you toward the generator that best meets your specifications. They can also warn you about models that may not be cost effective for your intended purpose or price range. Some new generator models may require buyers to wait before they can receive the unit because of limited availability.
If an outage ever occurs, having a generator will be of great value. A generator keeps power available for you and your home for the many things we rely on.
|
# -*- coding: utf-8 -*-
# Form implementation generated from reading ui file 'crashmanager-gui.ui'
#
# Created: Tue Apr 22 07:51:21 2014
# by: pyside-uic 0.2.13 running on PySide 1.1.0
#
# WARNING! All changes made in this file will be lost!
from PySide import QtCore, QtGui
class Ui_MainWindow(object):
def setupUi(self, MainWindow):
MainWindow.setObjectName("MainWindow")
MainWindow.resize(1219, 564)
self.centralwidget = QtGui.QWidget(MainWindow)
self.centralwidget.setObjectName("centralwidget")
self.tabWidget = QtGui.QTabWidget(self.centralwidget)
self.tabWidget.setGeometry(QtCore.QRect(9, 9, 631, 321))
self.tabWidget.setToolTip("")
self.tabWidget.setObjectName("tabWidget")
self.tab = QtGui.QWidget()
self.tab.setObjectName("tab")
self.gridLayoutWidget = QtGui.QWidget(self.tab)
self.gridLayoutWidget.setGeometry(QtCore.QRect(0, 30, 621, 231))
self.gridLayoutWidget.setObjectName("gridLayoutWidget")
self.gridLayout = QtGui.QGridLayout(self.gridLayoutWidget)
self.gridLayout.setContentsMargins(0, 0, 0, 0)
self.gridLayout.setObjectName("gridLayout")
self.radioButton_SYSTEMTBS = QtGui.QRadioButton(self.gridLayoutWidget)
self.radioButton_SYSTEMTBS.setObjectName("radioButton_SYSTEMTBS")
self.gridLayout.addWidget(self.radioButton_SYSTEMTBS, 5, 1, 1, 1)
self.radioButton_ALLDATA = QtGui.QRadioButton(self.gridLayoutWidget)
self.radioButton_ALLDATA.setObjectName("radioButton_ALLDATA")
self.gridLayout.addWidget(self.radioButton_ALLDATA, 4, 0, 1, 1)
self.radioButton_NONSYSTEMDATA = QtGui.QRadioButton(self.gridLayoutWidget)
self.radioButton_NONSYSTEMDATA.setObjectName("radioButton_NONSYSTEMDATA")
self.gridLayout.addWidget(self.radioButton_NONSYSTEMDATA, 0, 0, 1, 1)
self.radioButton_UNDODATA = QtGui.QRadioButton(self.gridLayoutWidget)
self.radioButton_UNDODATA.setObjectName("radioButton_UNDODATA")
self.gridLayout.addWidget(self.radioButton_UNDODATA, 3, 0, 1, 1)
self.radioButton_10 = QtGui.QRadioButton(self.gridLayoutWidget)
self.radioButton_10.setEnabled(True)
self.radioButton_10.setObjectName("radioButton_10")
self.gridLayout.addWidget(self.radioButton_10, 1, 1, 1, 1)
self.radioButton_SYSTEMDATA = QtGui.QRadioButton(self.gridLayoutWidget)
self.radioButton_SYSTEMDATA.setObjectName("radioButton_SYSTEMDATA")
self.gridLayout.addWidget(self.radioButton_SYSTEMDATA, 2, 0, 1, 1)
self.radioButton_SPFILE = QtGui.QRadioButton(self.gridLayoutWidget)
self.radioButton_SPFILE.setObjectName("radioButton_SPFILE")
self.gridLayout.addWidget(self.radioButton_SPFILE, 0, 2, 1, 1)
self.radioButton_NONSYSTEMTBS = QtGui.QRadioButton(self.gridLayoutWidget)
self.radioButton_NONSYSTEMTBS.setObjectName("radioButton_NONSYSTEMTBS")
self.gridLayout.addWidget(self.radioButton_NONSYSTEMTBS, 3, 1, 1, 1)
self.radioButton_21 = QtGui.QRadioButton(self.gridLayoutWidget)
self.radioButton_21.setEnabled(False)
self.radioButton_21.setObjectName("radioButton_21")
self.gridLayout.addWidget(self.radioButton_21, 2, 1, 1, 1)
self.radioButton_TEMPORARYTBS = QtGui.QRadioButton(self.gridLayoutWidget)
self.radioButton_TEMPORARYTBS.setObjectName("radioButton_TEMPORARYTBS")
self.gridLayout.addWidget(self.radioButton_TEMPORARYTBS, 4, 1, 1, 1)
self.radioButton_TEMPORARYDATA = QtGui.QRadioButton(self.gridLayoutWidget)
self.radioButton_TEMPORARYDATA.setObjectName("radioButton_TEMPORARYDATA")
self.gridLayout.addWidget(self.radioButton_TEMPORARYDATA, 1, 0, 1, 1)
self.radioButton_READONLYTBS = QtGui.QRadioButton(self.gridLayoutWidget)
self.radioButton_READONLYTBS.setObjectName("radioButton_READONLYTBS")
self.gridLayout.addWidget(self.radioButton_READONLYTBS, 0, 1, 1, 1)
self.radioButton_UNDOTBS = QtGui.QRadioButton(self.gridLayoutWidget)
self.radioButton_UNDOTBS.setObjectName("radioButton_UNDOTBS")
self.gridLayout.addWidget(self.radioButton_UNDOTBS, 6, 1, 1, 1)
self.tabWidget.addTab(self.tab, "")
self.tab_2 = QtGui.QWidget()
self.tab_2.setObjectName("tab_2")
self.gridLayoutWidget_2 = QtGui.QWidget(self.tab_2)
self.gridLayoutWidget_2.setGeometry(QtCore.QRect(0, 20, 621, 231))
self.gridLayoutWidget_2.setObjectName("gridLayoutWidget_2")
self.gridLayout_2 = QtGui.QGridLayout(self.gridLayoutWidget_2)
self.gridLayout_2.setContentsMargins(0, 0, 0, 0)
self.gridLayout_2.setObjectName("gridLayout_2")
self.radioButton_22 = QtGui.QRadioButton(self.gridLayoutWidget_2)
self.radioButton_22.setObjectName("radioButton_22")
self.gridLayout_2.addWidget(self.radioButton_22, 5, 1, 1, 1)
self.radioButton_23 = QtGui.QRadioButton(self.gridLayoutWidget_2)
self.radioButton_23.setObjectName("radioButton_23")
self.gridLayout_2.addWidget(self.radioButton_23, 4, 0, 1, 1)
self.radioButton_24 = QtGui.QRadioButton(self.gridLayoutWidget_2)
self.radioButton_24.setObjectName("radioButton_24")
self.gridLayout_2.addWidget(self.radioButton_24, 0, 0, 1, 1)
self.radioButton_25 = QtGui.QRadioButton(self.gridLayoutWidget_2)
self.radioButton_25.setObjectName("radioButton_25")
self.gridLayout_2.addWidget(self.radioButton_25, 3, 0, 1, 1)
self.radioButton_26 = QtGui.QRadioButton(self.gridLayoutWidget_2)
self.radioButton_26.setObjectName("radioButton_26")
self.gridLayout_2.addWidget(self.radioButton_26, 1, 1, 1, 1)
self.radioButton_27 = QtGui.QRadioButton(self.gridLayoutWidget_2)
self.radioButton_27.setObjectName("radioButton_27")
self.gridLayout_2.addWidget(self.radioButton_27, 2, 0, 1, 1)
self.radioButton_28 = QtGui.QRadioButton(self.gridLayoutWidget_2)
self.radioButton_28.setObjectName("radioButton_28")
self.gridLayout_2.addWidget(self.radioButton_28, 0, 2, 1, 1)
self.radioButton_29 = QtGui.QRadioButton(self.gridLayoutWidget_2)
self.radioButton_29.setObjectName("radioButton_29")
self.gridLayout_2.addWidget(self.radioButton_29, 3, 1, 1, 1)
self.radioButton_30 = QtGui.QRadioButton(self.gridLayoutWidget_2)
self.radioButton_30.setObjectName("radioButton_30")
self.gridLayout_2.addWidget(self.radioButton_30, 2, 1, 1, 1)
self.radioButton_31 = QtGui.QRadioButton(self.gridLayoutWidget_2)
self.radioButton_31.setObjectName("radioButton_31")
self.gridLayout_2.addWidget(self.radioButton_31, 4, 1, 1, 1)
self.radioButton_32 = QtGui.QRadioButton(self.gridLayoutWidget_2)
self.radioButton_32.setObjectName("radioButton_32")
self.gridLayout_2.addWidget(self.radioButton_32, 1, 0, 1, 1)
self.radioButton_33 = QtGui.QRadioButton(self.gridLayoutWidget_2)
self.radioButton_33.setObjectName("radioButton_33")
self.gridLayout_2.addWidget(self.radioButton_33, 0, 1, 1, 1)
self.radioButton_34 = QtGui.QRadioButton(self.gridLayoutWidget_2)
self.radioButton_34.setObjectName("radioButton_34")
self.gridLayout_2.addWidget(self.radioButton_34, 6, 1, 1, 1)
self.tabWidget.addTab(self.tab_2, "")
self.tab_3 = QtGui.QWidget()
self.tab_3.setObjectName("tab_3")
self.gridLayoutWidget_3 = QtGui.QWidget(self.tab_3)
self.gridLayoutWidget_3.setGeometry(QtCore.QRect(0, 20, 621, 231))
self.gridLayoutWidget_3.setObjectName("gridLayoutWidget_3")
self.gridLayout_3 = QtGui.QGridLayout(self.gridLayoutWidget_3)
self.gridLayout_3.setContentsMargins(0, 0, 0, 0)
self.gridLayout_3.setObjectName("gridLayout_3")
self.radioButton_35 = QtGui.QRadioButton(self.gridLayoutWidget_3)
self.radioButton_35.setObjectName("radioButton_35")
self.gridLayout_3.addWidget(self.radioButton_35, 5, 1, 1, 1)
self.radioButton_36 = QtGui.QRadioButton(self.gridLayoutWidget_3)
self.radioButton_36.setObjectName("radioButton_36")
self.gridLayout_3.addWidget(self.radioButton_36, 4, 0, 1, 1)
self.radioButton_37 = QtGui.QRadioButton(self.gridLayoutWidget_3)
self.radioButton_37.setObjectName("radioButton_37")
self.gridLayout_3.addWidget(self.radioButton_37, 0, 0, 1, 1)
self.radioButton_38 = QtGui.QRadioButton(self.gridLayoutWidget_3)
self.radioButton_38.setObjectName("radioButton_38")
self.gridLayout_3.addWidget(self.radioButton_38, 3, 0, 1, 1)
self.radioButton_39 = QtGui.QRadioButton(self.gridLayoutWidget_3)
self.radioButton_39.setObjectName("radioButton_39")
self.gridLayout_3.addWidget(self.radioButton_39, 1, 1, 1, 1)
self.radioButton_40 = QtGui.QRadioButton(self.gridLayoutWidget_3)
self.radioButton_40.setObjectName("radioButton_40")
self.gridLayout_3.addWidget(self.radioButton_40, 2, 0, 1, 1)
self.radioButton_41 = QtGui.QRadioButton(self.gridLayoutWidget_3)
self.radioButton_41.setObjectName("radioButton_41")
self.gridLayout_3.addWidget(self.radioButton_41, 0, 2, 1, 1)
self.radioButton_42 = QtGui.QRadioButton(self.gridLayoutWidget_3)
self.radioButton_42.setObjectName("radioButton_42")
self.gridLayout_3.addWidget(self.radioButton_42, 3, 1, 1, 1)
self.radioButton_43 = QtGui.QRadioButton(self.gridLayoutWidget_3)
self.radioButton_43.setObjectName("radioButton_43")
self.gridLayout_3.addWidget(self.radioButton_43, 2, 1, 1, 1)
self.radioButton_44 = QtGui.QRadioButton(self.gridLayoutWidget_3)
self.radioButton_44.setObjectName("radioButton_44")
self.gridLayout_3.addWidget(self.radioButton_44, 4, 1, 1, 1)
self.radioButton_45 = QtGui.QRadioButton(self.gridLayoutWidget_3)
self.radioButton_45.setObjectName("radioButton_45")
self.gridLayout_3.addWidget(self.radioButton_45, 1, 0, 1, 1)
self.radioButton_46 = QtGui.QRadioButton(self.gridLayoutWidget_3)
self.radioButton_46.setObjectName("radioButton_46")
self.gridLayout_3.addWidget(self.radioButton_46, 0, 1, 1, 1)
self.radioButton_47 = QtGui.QRadioButton(self.gridLayoutWidget_3)
self.radioButton_47.setObjectName("radioButton_47")
self.gridLayout_3.addWidget(self.radioButton_47, 6, 1, 1, 1)
self.tabWidget.addTab(self.tab_3, "")
self.tab_4 = QtGui.QWidget()
self.tab_4.setObjectName("tab_4")
self.radioButton_7 = QtGui.QRadioButton(self.tab_4)
self.radioButton_7.setGeometry(QtCore.QRect(40, 40, 147, 18))
self.radioButton_7.setObjectName("radioButton_7")
self.radioButton_8 = QtGui.QRadioButton(self.tab_4)
self.radioButton_8.setGeometry(QtCore.QRect(40, 70, 171, 18))
self.radioButton_8.setObjectName("radioButton_8")
self.tabWidget.addTab(self.tab_4, "")
self.tab_5 = QtGui.QWidget()
self.tab_5.setObjectName("tab_5")
self.gridLayoutWidget_5 = QtGui.QWidget(self.tab_5)
self.gridLayoutWidget_5.setGeometry(QtCore.QRect(0, 10, 670, 151))
self.gridLayoutWidget_5.setObjectName("gridLayoutWidget_5")
self.gridLayout_5 = QtGui.QGridLayout(self.gridLayoutWidget_5)
self.gridLayout_5.setSizeConstraint(QtGui.QLayout.SetDefaultConstraint)
self.gridLayout_5.setContentsMargins(0, 0, 0, 0)
self.gridLayout_5.setObjectName("gridLayout_5")
self.radioButton_18 = QtGui.QRadioButton(self.gridLayoutWidget_5)
self.radioButton_18.setObjectName("radioButton_18")
self.gridLayout_5.addWidget(self.radioButton_18, 0, 0, 1, 1)
self.radioButton_20 = QtGui.QRadioButton(self.gridLayoutWidget_5)
self.radioButton_20.setObjectName("radioButton_20")
self.gridLayout_5.addWidget(self.radioButton_20, 0, 1, 1, 1)
self.radioButton_15 = QtGui.QRadioButton(self.gridLayoutWidget_5)
self.radioButton_15.setObjectName("radioButton_15")
self.gridLayout_5.addWidget(self.radioButton_15, 1, 0, 1, 1)
self.radioButton_17 = QtGui.QRadioButton(self.gridLayoutWidget_5)
self.radioButton_17.setObjectName("radioButton_17")
self.gridLayout_5.addWidget(self.radioButton_17, 2, 0, 1, 1)
self.radioButton_19 = QtGui.QRadioButton(self.gridLayoutWidget_5)
self.radioButton_19.setObjectName("radioButton_19")
self.gridLayout_5.addWidget(self.radioButton_19, 1, 1, 1, 1)
self.radioButton_16 = QtGui.QRadioButton(self.gridLayoutWidget_5)
self.radioButton_16.setObjectName("radioButton_16")
self.gridLayout_5.addWidget(self.radioButton_16, 2, 1, 1, 1)
self.tabWidget.addTab(self.tab_5, "")
self.verticalLayoutWidget_3 = QtGui.QWidget(self.centralwidget)
self.verticalLayoutWidget_3.setGeometry(QtCore.QRect(10, 330, 631, 137))
self.verticalLayoutWidget_3.setObjectName("verticalLayoutWidget_3")
self.verticalLayout_3 = QtGui.QVBoxLayout(self.verticalLayoutWidget_3)
self.verticalLayout_3.setContentsMargins(0, 0, 0, 0)
self.verticalLayout_3.setObjectName("verticalLayout_3")
self.label = QtGui.QLabel(self.verticalLayoutWidget_3)
self.label.setObjectName("label")
self.verticalLayout_3.addWidget(self.label)
self.textBrowser = QtGui.QTextBrowser(self.verticalLayoutWidget_3)
self.textBrowser.setObjectName("textBrowser")
self.verticalLayout_3.addWidget(self.textBrowser)
self.horizontalLayoutWidget = QtGui.QWidget(self.centralwidget)
self.horizontalLayoutWidget.setGeometry(QtCore.QRect(30, 470, 188, 41))
self.horizontalLayoutWidget.setObjectName("horizontalLayoutWidget")
self.horizontalLayout = QtGui.QHBoxLayout(self.horizontalLayoutWidget)
self.horizontalLayout.setContentsMargins(0, 0, 0, 0)
self.horizontalLayout.setObjectName("horizontalLayout")
self.pushButton = QtGui.QPushButton(self.horizontalLayoutWidget)
self.pushButton.setObjectName("pushButton")
self.horizontalLayout.addWidget(self.pushButton)
self.pushButton_2 = QtGui.QPushButton(self.horizontalLayoutWidget)
self.pushButton_2.setObjectName("pushButton_2")
self.horizontalLayout.addWidget(self.pushButton_2)
self.verticalLayoutWidget = QtGui.QWidget(self.centralwidget)
self.verticalLayoutWidget.setGeometry(QtCore.QRect(649, 9, 561, 461))
self.verticalLayoutWidget.setObjectName("verticalLayoutWidget")
self.verticalLayout = QtGui.QVBoxLayout(self.verticalLayoutWidget)
self.verticalLayout.setContentsMargins(0, 0, 0, 0)
self.verticalLayout.setObjectName("verticalLayout")
self.label_2 = QtGui.QLabel(self.verticalLayoutWidget)
self.label_2.setObjectName("label_2")
self.verticalLayout.addWidget(self.label_2)
self.textBrowser_ALERTLOG = QtGui.QTextBrowser(self.verticalLayoutWidget)
self.textBrowser_ALERTLOG.setObjectName("textBrowser_ALERTLOG")
self.verticalLayout.addWidget(self.textBrowser_ALERTLOG)
MainWindow.setCentralWidget(self.centralwidget)
self.menubar = QtGui.QMenuBar(MainWindow)
self.menubar.setGeometry(QtCore.QRect(0, 0, 1219, 24))
self.menubar.setNativeMenuBar(False)
self.menubar.setObjectName("menubar")
self.menuAbout = QtGui.QMenu(self.menubar)
self.menuAbout.setObjectName("menuAbout")
self.menuFile = QtGui.QMenu(self.menubar)
self.menuFile.setObjectName("menuFile")
self.menuMode = QtGui.QMenu(self.menubar)
self.menuMode.setObjectName("menuMode")
MainWindow.setMenuBar(self.menubar)
self.statusbar = QtGui.QStatusBar(MainWindow)
self.statusbar.setObjectName("statusbar")
MainWindow.setStatusBar(self.statusbar)
self.actionClose = QtGui.QAction(MainWindow)
self.actionClose.setObjectName("actionClose")
self.actionAbout = QtGui.QAction(MainWindow)
self.actionAbout.setObjectName("actionAbout")
self.actionComplete_Recovery = QtGui.QAction(MainWindow)
self.actionComplete_Recovery.setObjectName("actionComplete_Recovery")
self.actionIncomplete_Recovery = QtGui.QAction(MainWindow)
self.actionIncomplete_Recovery.setObjectName("actionIncomplete_Recovery")
self.actionFlashback_Recovery = QtGui.QAction(MainWindow)
self.actionFlashback_Recovery.setObjectName("actionFlashback_Recovery")
self.actionContent = QtGui.QAction(MainWindow)
self.actionContent.setObjectName("actionContent")
self.menuAbout.addAction(self.actionContent)
self.menuAbout.addAction(self.actionAbout)
self.menuFile.addSeparator()
self.menuFile.addAction(self.actionClose)
self.menuMode.addAction(self.actionComplete_Recovery)
self.menuMode.addAction(self.actionIncomplete_Recovery)
self.menuMode.addAction(self.actionFlashback_Recovery)
self.menubar.addAction(self.menuFile.menuAction())
self.menubar.addAction(self.menuMode.menuAction())
self.menubar.addAction(self.menuAbout.menuAction())
self.retranslateUi(MainWindow)
self.tabWidget.setCurrentIndex(0)
QtCore.QObject.connect(self.actionClose, QtCore.SIGNAL("activated()"), MainWindow.close)
QtCore.QObject.connect(self.actionComplete_Recovery, QtCore.SIGNAL("activated()"), self.tabWidget.show)
QtCore.QMetaObject.connectSlotsByName(MainWindow)
def retranslateUi(self, MainWindow):
MainWindow.setWindowTitle(QtGui.QApplication.translate("MainWindow", "MainWindow", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_SYSTEMTBS.setText(QtGui.QApplication.translate("MainWindow", "Loss of the SYSTEM tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_ALLDATA.setText(QtGui.QApplication.translate("MainWindow", "Loss of all datafiles", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_NONSYSTEMDATA.setText(QtGui.QApplication.translate("MainWindow", "Loss of a non-system datafile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_UNDODATA.setText(QtGui.QApplication.translate("MainWindow", "Loss of an UNDO datafile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_10.setText(QtGui.QApplication.translate("MainWindow", "Loss of an Index tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_SYSTEMDATA.setText(QtGui.QApplication.translate("MainWindow", "Loss of a SYSTEM datafile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_SPFILE.setText(QtGui.QApplication.translate("MainWindow", "Loss of the spfile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_NONSYSTEMTBS.setText(QtGui.QApplication.translate("MainWindow", "Loss of a non-system tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_21.setText(QtGui.QApplication.translate("MainWindow", "Loss of all indexes in USERS tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_TEMPORARYTBS.setText(QtGui.QApplication.translate("MainWindow", "Loss of a temporary tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_TEMPORARYDATA.setText(QtGui.QApplication.translate("MainWindow", "Loss of a temporary datafile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_READONLYTBS.setText(QtGui.QApplication.translate("MainWindow", "Loss of a Read-Only tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_UNDOTBS.setText(QtGui.QApplication.translate("MainWindow", "Loss of the UNDO tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.tabWidget.setTabText(self.tabWidget.indexOf(self.tab), QtGui.QApplication.translate("MainWindow", "Complete Recovery", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_22.setText(QtGui.QApplication.translate("MainWindow", "Loss of the SYSTEM tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_23.setText(QtGui.QApplication.translate("MainWindow", "Loss of all datafiles", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_24.setText(QtGui.QApplication.translate("MainWindow", "Loss of a non-system datafile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_25.setText(QtGui.QApplication.translate("MainWindow", "Loss of an UNDO datafile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_26.setText(QtGui.QApplication.translate("MainWindow", "Loss of an Index tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_27.setText(QtGui.QApplication.translate("MainWindow", "Loss of a SYSTEM datafile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_28.setText(QtGui.QApplication.translate("MainWindow", "Loss of the spfile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_29.setText(QtGui.QApplication.translate("MainWindow", "Loss of a non-system tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_30.setText(QtGui.QApplication.translate("MainWindow", "Loss of all indexes in USERS tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_31.setText(QtGui.QApplication.translate("MainWindow", "Loss of a temporary tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_32.setText(QtGui.QApplication.translate("MainWindow", "Loss of a temporary datafile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_33.setText(QtGui.QApplication.translate("MainWindow", "Loss of a Read-Only tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_34.setText(QtGui.QApplication.translate("MainWindow", "Loss of the UNDO tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.tabWidget.setTabText(self.tabWidget.indexOf(self.tab_2), QtGui.QApplication.translate("MainWindow", "Incomplete Recovery", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_35.setText(QtGui.QApplication.translate("MainWindow", "Loss of the SYSTEM tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_36.setText(QtGui.QApplication.translate("MainWindow", "Loss of all datafiles", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_37.setText(QtGui.QApplication.translate("MainWindow", "Loss of a non-system datafile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_38.setText(QtGui.QApplication.translate("MainWindow", "Loss of an UNDO datafile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_39.setText(QtGui.QApplication.translate("MainWindow", "Loss of an Index tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_40.setText(QtGui.QApplication.translate("MainWindow", "Loss of a SYSTEM datafile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_41.setText(QtGui.QApplication.translate("MainWindow", "Loss of the spfile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_42.setText(QtGui.QApplication.translate("MainWindow", "Loss of a non-system tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_43.setText(QtGui.QApplication.translate("MainWindow", "Loss of all indexes in USERS tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_44.setText(QtGui.QApplication.translate("MainWindow", "Loss of a temporary tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_45.setText(QtGui.QApplication.translate("MainWindow", "Loss of a temporary datafile", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_46.setText(QtGui.QApplication.translate("MainWindow", "Loss of a Read-Only tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_47.setText(QtGui.QApplication.translate("MainWindow", "Loss of the UNDO tablespace", None, QtGui.QApplication.UnicodeUTF8))
self.tabWidget.setTabText(self.tabWidget.indexOf(self.tab_3), QtGui.QApplication.translate("MainWindow", "Flashback Recovery", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_7.setText(QtGui.QApplication.translate("MainWindow", "Loss of a control file", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_8.setText(QtGui.QApplication.translate("MainWindow", "Loss of all control files", None, QtGui.QApplication.UnicodeUTF8))
self.tabWidget.setTabText(self.tabWidget.indexOf(self.tab_4), QtGui.QApplication.translate("MainWindow", "Control file", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_18.setText(QtGui.QApplication.translate("MainWindow", "Loss of a redo log file group member", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_20.setText(QtGui.QApplication.translate("MainWindow", "Loss of all redo log members of an INACTIVE group", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_15.setText(QtGui.QApplication.translate("MainWindow", "Loss of a redo log file group", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_17.setText(QtGui.QApplication.translate("MainWindow", "Loss of redo log member of a multiplexed group", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_19.setText(QtGui.QApplication.translate("MainWindow", "Loss of all redo log members of an ACTIVE group", None, QtGui.QApplication.UnicodeUTF8))
self.radioButton_16.setText(QtGui.QApplication.translate("MainWindow", "Loss of all redo log members of CURRENT group", None, QtGui.QApplication.UnicodeUTF8))
self.tabWidget.setTabText(self.tabWidget.indexOf(self.tab_5), QtGui.QApplication.translate("MainWindow", "Redo log", None, QtGui.QApplication.UnicodeUTF8))
self.label.setText(QtGui.QApplication.translate("MainWindow", "Description:", None, QtGui.QApplication.UnicodeUTF8))
self.pushButton.setToolTip(QtGui.QApplication.translate("MainWindow", "<html><head/><body><p>Press this button if you want to perform a crash of your instance according to the radio button selected. In this way you have to recover your instance: so be sure to have at least one useful backup of your database.</p></body></html>", None, QtGui.QApplication.UnicodeUTF8))
self.pushButton.setText(QtGui.QApplication.translate("MainWindow", "Crash it !", None, QtGui.QApplication.UnicodeUTF8))
        self.pushButton_2.setToolTip(QtGui.QApplication.translate("MainWindow", "<html><head/><body><p>Press this button if you want to have no radio button selected. In this way even if you push the \"Crash it !\" button it won\'t crash your instance.</p></body></html>", None, QtGui.QApplication.UnicodeUTF8))
self.pushButton_2.setText(QtGui.QApplication.translate("MainWindow", "Cancel", None, QtGui.QApplication.UnicodeUTF8))
self.label_2.setText(QtGui.QApplication.translate("MainWindow", "Alert log:", None, QtGui.QApplication.UnicodeUTF8))
self.menuAbout.setTitle(QtGui.QApplication.translate("MainWindow", "Help", None, QtGui.QApplication.UnicodeUTF8))
self.menuFile.setTitle(QtGui.QApplication.translate("MainWindow", "File", None, QtGui.QApplication.UnicodeUTF8))
self.menuMode.setTitle(QtGui.QApplication.translate("MainWindow", "Mode", None, QtGui.QApplication.UnicodeUTF8))
self.actionClose.setText(QtGui.QApplication.translate("MainWindow", "Quit", None, QtGui.QApplication.UnicodeUTF8))
self.actionAbout.setText(QtGui.QApplication.translate("MainWindow", "About", None, QtGui.QApplication.UnicodeUTF8))
self.actionComplete_Recovery.setText(QtGui.QApplication.translate("MainWindow", "Complete Recovery", None, QtGui.QApplication.UnicodeUTF8))
self.actionIncomplete_Recovery.setText(QtGui.QApplication.translate("MainWindow", "Incomplete Recovery", None, QtGui.QApplication.UnicodeUTF8))
self.actionFlashback_Recovery.setText(QtGui.QApplication.translate("MainWindow", "Flashback Recovery", None, QtGui.QApplication.UnicodeUTF8))
self.actionContent.setText(QtGui.QApplication.translate("MainWindow", "Content", None, QtGui.QApplication.UnicodeUTF8))
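# A minimal usage sketch (not part of the pyside-uic output above; it assumes
# PySide is installed): wire the generated Ui_MainWindow onto a QMainWindow
# and show it.
if __name__ == "__main__":
    import sys
    app = QtGui.QApplication(sys.argv)
    window = QtGui.QMainWindow()
    ui = Ui_MainWindow()
    ui.setupUi(window)
    window.show()
    sys.exit(app.exec_())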
|
When I bought my first property about 8 years ago, it took over a year before it began to pay for itself. I thought about selling it, and if I had it would certainly have been at a loss. I started to question whether it was a good investment because it quickly became a money pit: in the first six months, rotting wood in the kitchen that no one spotted before closing meant I had to redo it before tenants moved in, there were plumbing issues, and the list goes on. Now it’s a cash cow.
When I jumped into Go Ringless Extreme, I didn’t see any real profit until about 10 months after buying the software, but my profitability soared soon after.
The moral of the story is to not give up! You can certainly get subscribers and clients and make a profit in your first year, but the bigger returns, the bigger rewards, take a little longer.
So have the right mindset, stay patient and play the long game.
|
#import pymongo
import sys
import os
##from pymongo import MongoClient
from makeConllFromDBOutput import makeConll
from getJustSentences import getJustSentences
# python will convert \n to os.linesep
def moveBatch(randomIds,noise):
pathToEpic = os.getcwd()
pathToEpic = pathToEpic[0:pathToEpic.rfind("epic")+4]
returnString = "Tmp file: "
print "Inside moveBatch"
# Move Batch between databases
#client = MongoClient('mon-entity-event-r13-2.recfut.com:27016')
#db = client.rf_entity_curation
#labeled = db.malware_labeled
#unlabeled = db.malware_unlabeled
batch = open(os.path.expanduser(pathToEpic + "/data/PoolData/batch.txt"),'w')
readUnlabeled = open(os.path.expanduser(pathToEpic + "/data/PoolData/unlabeledPool.txt"), 'r')
lines = readUnlabeled.readlines()
readUnlabeled.close()
writeUnlabeled = open(os.path.expanduser(pathToEpic + "/data/PoolData/unlabeledPool.txt"), 'w')
print "Unlabeled openened for writing"
#print "randomIds " + str(randomIds)
################## Batch moved in database #############
#for oneId in randomIds:
# tmpId = unlabeled.find({"random" : oneId})
# labeled.insert(tmpId)
# unlabeled.remove({"random" : oneId})
# tmpId = labeled.find({"random" : oneId})
# batch.write(str(tmpId[0]))
# batch.write("\n")
#print "Starting to remove id from textfile"
for line in lines:
idFound = False
for oneID in randomIds:
if not (line.find(str(oneID)[0:len(str(oneID))-2])==-1):
idFound = True
#print str(idFound)+" " +str(oneID)[0:len(str(oneID))-2] +"\n"+line
if not idFound:
#print "Write \""+line+"\" to unlabeled"
writeUnlabeled.write(line)
else:
#print "Write \""+line+"\" to batch"
batch.write(line)
#print line + " does not include " +oneId
#print str(idFound)+" " + +"\n"+line
#returnString += str(idFound) + " " + line + "\n"
writeUnlabeled.close()
batch.close()
# Get Conll of the batches and add these to all conll's of labeled pool
makeConll(pathToEpic + "/data/PoolData/batch.txt", pathToEpic + "/data/PoolData/batchConll.conll", noise)
labeledOrig = open(os.path.expanduser(pathToEpic + "/data/PoolData/labeledPool.txt"), 'a')
labeledOrigConll = open(os.path.expanduser(pathToEpic + "/data/PoolData/labeledPool.conll"),'a')
batch = open(os.path.expanduser(pathToEpic + "/data/PoolData/batch.txt"),'r')
batchConll = open(os.path.expanduser(pathToEpic + "/data/PoolData/batchConll.conll"),'r')
labeledOrig.write(batch.read())
labeledOrigConll.write(batchConll.read())
labeledOrig.close()
labeledOrigConll.close()
batch.close()
batchConll.close()
#os.remove(os.path.expanduser(pathToEpic + "/data/PoolData/batch.txt"))
#os.remove(os.path.expanduser(pathToEpic + "/data/PoolData/batchConll.conll"))
return returnString
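# A minimal usage sketch (hypothetical values, not part of the original script):
# "randomIds" would normally hold the "random" field values of the documents
# sampled from the unlabeled pool, and "noise" is passed through to makeConll().
#
#     sampled_ids = [0.123456789012, 0.987654321098]
#     print moveBatch(sampled_ids, 0)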
|
What passes for amusement at our house — Writing… or Typing?
Okay, Milo was just completely adorable and startling then!
LOL Milo was being firewood! Not a good thing to imitate!
Whether or not you and John are easily amused is something else entirely; this is just amazing cat-cuteness.
awwww. :) I love it!
|
# -*- coding: utf-8 -*-
#
# This file is part of PyBuilder
#
# Copyright 2011-2020 PyBuilder Team
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from unittest import TestCase
from pybuilder.core import Project, Logger
from pybuilder.errors import BuildFailedException
from pybuilder.plugins.python.pylint_plugin import (check_pylint_availability,
init_pylint,
execute_pylint,
DEFAULT_PYLINT_OPTIONS)
from test_utils import Mock, patch, ANY
PYLINT_ERROR_OUTPUT = [
'************* Module mode.file',
'src/main/python/module/file.py:34:0: C0301: Line too long (365/100) (line-too-long)',
'src/main/python/module/file.py:34:0: R1705: Unnecessary "else" after "return" (no-else-return)',
'Your code has been rated at 9.79/10 (previous run: 9.79/10, +0.00)',
''
]
PYLINT_NORMAL_OUTPUT = [
'Your code has been rated at 9.79/10 (previous run: 9.79/10, +0.00)',
''
]
class PylintPluginTests(TestCase):
def setUp(self):
self.project = Project("basedir")
self.project.set_property("dir_source_main_python", "source")
self.project.set_property("dir_reports", "reports")
self.reactor = Mock()
self.reactor.python_env_registry = {}
self.reactor.python_env_registry["pybuilder"] = pyb_env = Mock()
pyb_env.environ = {}
self.reactor.pybuilder_venv = pyb_env
def test_should_check_that_pylint_can_be_executed(self):
mock_logger = Mock(Logger)
check_pylint_availability(self.project, mock_logger, self.reactor)
self.reactor.pybuilder_venv.verify_can_execute.assert_called_with(['pylint'], 'pylint', 'plugin python.pylint')
def test_should_run_pylint_with_default_options(self):
init_pylint(self.project)
execute_pylint(self.project, Mock(Logger), self.reactor)
self.reactor.pybuilder_venv.execute_command.assert_called_with(["pylint"] + DEFAULT_PYLINT_OPTIONS, ANY,
env=ANY)
def test_should_run_pylint_with_custom_options(self):
init_pylint(self.project)
self.project.set_property("pylint_options", ["--test", "-f", "--x=y"])
execute_pylint(self.project, Mock(Logger), self.reactor)
self.reactor.pybuilder_venv.execute_command.assert_called_with(["pylint", "--test", "-f", "--x=y"], ANY,
env=ANY)
@patch('pybuilder.plugins.python.pylint_plugin.read_file', return_value=PYLINT_ERROR_OUTPUT)
@patch('pybuilder.plugins.python.pylint_plugin.execute_tool_on_modules')
def test_should_break_build_when_warnings_and_set(self, *_):
init_pylint(self.project)
self.project.set_property("pylint_break_build", True)
with self.assertRaises(BuildFailedException):
execute_pylint(self.project, Mock(Logger), self.reactor)
@patch('pybuilder.plugins.python.pylint_plugin.read_file', return_value=PYLINT_ERROR_OUTPUT)
@patch('pybuilder.plugins.python.pylint_plugin.execute_tool_on_modules')
def test_should_not_break_build_when_warnings_and_not_set(self, *_):
init_pylint(self.project)
self.project.set_property("pylint_break_build", False)
execute_pylint(self.project, Mock(Logger), self.reactor)
@patch('pybuilder.plugins.python.pylint_plugin.read_file', return_value=PYLINT_NORMAL_OUTPUT)
@patch('pybuilder.plugins.python.pylint_plugin.execute_tool_on_modules')
def test_should_not_break_build_when_no_warnings_and_set(self, *_):
init_pylint(self.project)
self.project.set_property("pylint_break_build", True)
execute_pylint(self.project, Mock(Logger), self.reactor)
@patch('pybuilder.plugins.python.pylint_plugin.read_file', return_value=PYLINT_NORMAL_OUTPUT)
@patch('pybuilder.plugins.python.pylint_plugin.execute_tool_on_modules')
def test_should_not_break_build_when_no_warnings_and_not_set(self, *_):
init_pylint(self.project)
self.project.set_property("pylint_break_build", False)
execute_pylint(self.project, Mock(Logger), self.reactor)
|
The question before the Court in Anderson v City of Stonnington VSC 374 (“Anderson v Stonnington”) was whether the laneway in question was a road within the meaning of the Local Government Act 1989 (“LG Act”), the Road Management Act 2004 (“RM Act”) or a highway pursuant to the common law. The Court determined that the answer to these three questions was yes.
McMillian J found that given that the definition of a road pursuant to the RM Act includes a public highway at common law, the answer to the first question necessarily answered the latter two questions.
Her Honour reviewed and summarised the authorities on the creation of public highways at common law, particularly where the disputed land is not recorded as a road in title documents. This is often the case when laneways have remained in the name of the original subdivider. Her Honour concluded that the laneway was a public highway because it had been open to the public and used ‘without force, without secrecy and without permission’ for a very long period of time.
Her Honour also summarised the relevant provisions of the Road Management Act 2004 (“the RM Act”) and the Local Government Act 1989 (“the LG Act”).
In her Honour’s opinion, since 1 July 2004 (the introduction of the RM Act) all public rights of way have existed to the exclusion of private rights of way to the extent that the two overlap (c14 of Schedule 5 of the RM Act). In other words, where land is subject to private rights of way or easements of carriageway and the land is found to have the status of a road pursuant to the RM Act (which includes a public highway at common law), the public rights to use the road outrank the easement rights. Her Honour suggested that the effect of c14 of Schedule 5 of the RM Act may well be that public rights to use a “road” would extinguish private easement rights; however, she did not need to decide this point for the purposes of the case before her.
Ownership of land that falls within the definition of a road for the purposes of the RM Act vests in Council (c1 of Schedule 5 of the RM Act) and Council owned land cannot be adversely possessed (s7B Limitation of Actions Act 1958).
This reinforces the need to ascertain whether the land the subject of an adverse possession application has attained the status of a public highway at common law and is therefore a road pursuant to the RM Act. This is a question of fact, and the facts of each case must be carefully considered before an answer can be reached.
It is worth noting that Her Honour made it clear that, in order to create a public highway, the law requires the landowner to intend to dedicate the land as a public road and that the dedication must be accepted by the public for that purpose. It is clear from Her Honour’s analysis of this rule that, in circumstances where land has been set aside by the landowner as a road in a plan of subdivision lodged with the titles office but where the road has never physically been created or used by the public, the public have not accepted the proffered dedication and a public highway has not been created.
|
# -*- coding: utf-8-*-
import logging
from notifier import Notifier
from brain import Brain
class Conversation(object):
def __init__(self, persona, mic, profile):
self._logger = logging.getLogger(__name__)
self.persona = persona
self.mic = mic
self.profile = profile
self.brain = Brain(mic, profile)
# Initialize notifier if server
if self.profile['server'] == 'Y':
self.notifier = Notifier(profile)
def handleForever(self):
"""
Delegates user input to the handling function when activated.
"""
self._logger.info("Starting to handle conversation with keyword '%s'.",
self.persona)
while True:
            # If server, handle passive modules (scheduled background tasks)
if self.profile['server'] == 'Y':
# Print notifications until empty
notifications = self.notifier.getAllNotifications()
for notif in notifications:
self._logger.info("Received notification: '%s'", str(notif))
self._logger.debug("Started listening for keyword '%s'",
self.persona)
threshold, transcribed = self.mic.passiveListen(self.persona)
self._logger.debug("Stopped listening for keyword '%s'",
self.persona)
if not transcribed or not threshold:
self._logger.info("Nothing has been said or transcribed.")
continue
self._logger.info("Keyword '%s' has been said!", self.persona)
self._logger.debug("Started to listen actively with threshold: %r",
threshold)
input = self.mic.activeListenToAllOptions(threshold)
self._logger.debug("Stopped to listen actively with threshold: %r",
threshold)
if input:
self.brain.query(input)
else:
self.mic.say('A', "Pardon?")
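# A minimal usage sketch (not part of the original module): "mic" and "profile"
# are assumed to come from the surrounding application, e.g. a microphone wrapper
# object and a parsed profile dict that contains at least a 'server' key.
#
#     profile = {'server': 'N'}
#     conversation = Conversation("JASPER", mic, profile)
#     conversation.handleForever()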
|
Instagram is one of the fastest growing Social Media platforms because it allows users to share attractive visual content and short messages, enter competitions and connect with fans. Since it was purchased by Facebook, Instagram has moved from mobile to the web and continues to go from strength to strength. The great potential for online marketing means the most popular users can earn a living simply by promoting products and services to their followers on behalf of brands - but how can you get started and how can Instagram help your business?
Well, there are lots of different ways for marketers to get the best results out of Instagram, and OneHowTo are here to give you a quick rundown on different tricks so you learn how Instagram can help your business.
Just like Pinterest, Instagram is all about attractive visual content, so think carefully about how you want your Instagram page to appear to followers. Try to create an interesting mix of fun and exciting content along with business pictures.
Reward followers by giving them a taste of how your business looks on the inside: photos of employees at work, behind the scenes at special events - all these will help your fans get to know your business and build a relationship with your brand.
How can Instagram help my business on the whole? Get your posts seen by as many people as possible. Try to create a following by engaging with as many users as possible: liking, commenting on, and following relevant posts, people and brands. Use your existing social networks to promote your Instagram page and bring in existing followers. This will create a community of people who are interested in your product or business.
As well as engaging as much as possible with other users, you can use relevant hashtags to get your images seen, as well as linking with your Facebook page to capitalize on existing fans.
Once you have begun to increase your follower count, you should follow people back and engage with them to develop your brand. What's more, you can use Instagram's Photo Contest feature to set up mini competitions and reward fans with discounts and promotions.
Be kind to your followers and think about them. Create a special post for celebrations such as Christmas, by making a Christmas card with Instagram for your followers.
Another great way to build authentic relationships with followers is to feature your customers and fans in your posts - pictures of genuine customers enjoying your product is a great way to get the message out! Promote everything on Facebook to further heighten the impact.
Instagram now has video - use this to your advantage. Video posts are a great way to further connect with your followers and are great for grabbing people's attention. Short videos that promote events or send a quick message to followers have been shown to be a highly effective way to engage with fans. A great idea is to live-stream special events from your company.
If you want to read similar articles to How Can Instagram Help my Business, we recommend you visit our Economy & business category.
|
#!/usr/bin/env python3
"""Import iso country codes"""
import csv
import io
import sys
column_to_field = [['The two-letter ISO 3166-1 code', 'country'],
['start-date', 'start-date'],
['end-date', 'end-date'],
['Country', 'name'],
['Official Name', 'official-name'],
['Name for Citizen', 'citizen-names']]
if __name__ == '__main__':
# the countries csv seems to be in cp1252, not utf-8
input_stream = io.TextIOWrapper(sys.stdin.buffer, encoding='cp1252')
reader = csv.DictReader(input_stream)
print (*[field for column, field in column_to_field], sep='\t')
# GB isn't included in the csv for some reason
print ('GB', '', '', 'United Kingdom', 'The United Kingdom of Great Britain and Northern Ireland', 'Briton, British citizen', sep='\t')
for num, row in enumerate(reader):
row['start-date'] = ''
row['end-date'] = ''
print (*[row[column] for column, field in column_to_field], sep='\t')
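# Example invocation (hypothetical file names; the register CSV is read from
# stdin, which is why the script re-encodes it from cp1252):
#     python3 import_iso_countries.py < country-codes.csv > country.tsv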
|
Keep your yard, vacant lot and land cut to less than 12 inches, as outlined in the city ordinance.
Spring is here and grass is starting to grow quickly. The city’s Code Compliance officials encourage everyone to avoid tall grass and weeds by mowing regularly.
If a Code Compliance officer documents that the grass and weeds on your property are more than 12 inches high, you will receive a one-time certified notice and have 10 days to correct the violation. If you don’t, a city contractor will do the job. The average bill for a residential lot is $250.
To learn more, call Code Compliance at 817-392-1234.
|
#!/usr/bin/python3
import subprocess
import bottle as b
DEAD = 0
PLAYING = 1
PAUSED = 2
STOPPED = 3
UNKNOWN = 4
def _run_mocp(switch):
p = subprocess.Popen(['mocp', switch], stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)
(output, dummy) = p.communicate()
return (output, p.returncode)
def get_state():
r = _run_mocp('-i')
if r[1] == 2:
return DEAD
else:
for e in filter(lambda b: b.startswith(b'State: '), r[0].split(b'\n')):
# iterating over one element at most is such a 2AM solution
verb = e[7:]
if verb == b'PLAY':
return PLAYING
elif verb == b'PAUSE':
return PAUSED
elif verb == b'STOP':
return STOPPED
return UNKNOWN
def cmd_start():
_run_mocp('-S')
def cmd_play():
_run_mocp('-p')
def cmd_stop():
_run_mocp('-s')
@b.route('/')
def index():
playstate = get_state() == PLAYING
return b.template('radiosteuerung', playstate=playstate)
@b.route('/do/<cmd>')
def command(cmd):
method = 'cmd_' + cmd
# XXX: wrap these functions in a class, use getattr() instead
if method in globals():
globals()[method]()
return 'OK'
else:
b.abort(404, 'Method not found')
if get_state() == DEAD:
cmd_start()
b.TEMPLATE_PATH.insert(0, '/insertpathhere/') # location of template
#b.run(host='localhost', port=8088, debug=True) # not recommended
b.run(host='0.0.0.0', port=8088)
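# A minimal sketch of the 'radiosteuerung' template this app renders (the real
# template file is not included here; it only receives the "playstate" flag):
#
#     <h1>Radiosteuerung</h1>
#     <p>Status: {{ 'playing' if playstate else 'stopped' }}</p>
#     <p><a href="/do/play">Play</a> &middot; <a href="/do/stop">Stop</a></p>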
|
Put yourself in the driver’s seat, with the chance to start a new career with an exciting, developing bus company.
We’re looking for customer focused, reliable staff to join our work force.
We’re currently recruiting for bus drivers to join our teams across our network.
The majority of our workforce are bus drivers. They are the key interface with our customers and vital in keeping our services running.
Complete our online application, or download the form in PDF format and take it to your local depot.
Opportunities for other positions arise less frequently, and are advertised locally in the relevant area. You will also find details here.
Follow the application requirements outlined in the advert.
|
#
# Copyright (c) 2008--2010 Red Hat, Inc.
#
# This software is licensed to you under the GNU General Public License,
# version 2 (GPLv2). There is NO WARRANTY for this software, express or
# implied, including the implied warranties of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. You should have received a copy of GPLv2
# along with this software; if not, see
# http://www.gnu.org/licenses/old-licenses/gpl-2.0.txt.
#
# Red Hat trademarks are not licensed under GPLv2. No permission is
# granted to use or replicate Red Hat trademarks that are incorporated
# in this software or its documentation.
#
from spacewalk.server import rhnSQL
# maps a funky product release to something that is canonical
# and has a chance of hitting some channels
def real_version(version):
version = str(version)
h = rhnSQL.prepare("""
select canon_version
from rhnRedHatCanonVersion rhcv
where rhcv.version = :version
""")
h.execute(version = version)
ret = h.fetchone_dict()
if not ret:
return version
return ret["canon_version"]
# checks if an arch is for real
def check_package_arch(name):
name = str(name)
if name is None or len(name) == 0:
return None
h = rhnSQL.prepare("select id from rhnPackageArch where label = :label")
h.execute(label=name)
ret = h.fetchone_dict()
if not ret:
return None
return name
if __name__ == '__main__':
"""Test code.
"""
rhnSQL.initDB()
print real_version('7.1sbe')
print check_package_arch('i386')
|
Modular Nightstands and Dressers developed by ALF Italia. Made of wood and lacquer combinations, sleek and contemporary. Made in Italy.
Bowery nightstand is a neat design furniture piece for bedroom. It stands out for clean lines, mix of materials and sophistication. It is made of asphalt matt lacquer with glass, or matt white lacquer with painted glass bottom. Its base is in chrome and handles are polished steel. Bowery is considered an oversize bedside table.
Valenti modern nightstand features design and functionality for a contemporary bedroom. The combination of grey high gloss lacquer with a faux metal finish of the top encompasses the collection into a sophisticated and elegant furniture line. The side table has two large drawers and its size is perfect for large spaces. Made in Italy.
|
# Copyright 2014 DreamHost, LLC
#
# Author: DreamHost, LLC
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from setuptools import setup, find_packages
setup(
name='akanda-ceilometer',
version='0.1.0',
description='Ceilometer plugin for processing Akanda notifications',
author='DreamHost',
author_email='[email protected]',
url='http://github.com/dreamhost/akanda',
license='BSD',
install_requires=['ceilometer'],
namespace_packages=['akanda'],
packages=find_packages(exclude=['test', 'smoke']),
entry_points={
'ceilometer.collector': [
'akanda_bandwidth = akanda.ceilometer.notifications'
':NetworkBandwidthNotification',
],
},
include_package_data=True,
zip_safe=False,
)
|
I presume that you’ve seen the video of Space Squid’s zombie simulation project.
Well, it turns out that other people have been doing the whole chopping pig heads with swords thing. Only their swords actually cut the pig head instead of bend. I guess there actually are high quality swords out there that might be useful during the zombie apocalypse.
My favorite part is where they chop the cowboy boots.
These Cold Steel guys are hilarious, especially the pose the main guy strikes after each demo. “Twenty pounds of beef,” indeed.
|
from math import sqrt, exp
import numpy as np
def distance(u, v, dft):
return sqrt((u - (dft.shape[0] / 2))**2 + (v - (dft.shape[1] / 2))**2)
def ideal(value, cutoff, ftype='lowpass', bwidth=1.0):
if ftype == 'lowpass':
return 1 if value <= cutoff else 0
elif ftype == 'highpass':
return 0 if value <= cutoff else 1
elif ftype == 'bandreject':
min = cutoff - (bwidth / 2)
max = cutoff + (bwidth / 2)
return 0 if min <= value <= max else 1
elif ftype == 'bandpass':
min = cutoff - (bwidth / 2)
max = cutoff + (bwidth / 2)
return 1 if min <= value <= max else 0
def gauss(value, cutoff, ftype='lowpass', bwidth=1.0):
if ftype == 'lowpass':
return exp(-(value**2) / (2 * cutoff**2))
elif ftype == 'highpass':
return 1 - exp(-(value**2 / (2 * cutoff**2)))
elif ftype == 'bandreject':
return 1 - exp(-((value**2 - cutoff**2) / ((1+value) * bwidth))**2)
elif ftype == 'bandpass':
return exp(-((value**2 - cutoff**2) / ((1+value) * bwidth))**2)
def butterworth(value, cutoff, n, ftype='lowpass', bwidth=1.0):
if ftype == 'lowpass':
return 1 / (1 + (value / cutoff)**(2*n))
elif ftype == 'highpass':
return 1 / (1 + (cutoff / (1+value))**(2*n))
elif ftype == 'bandreject':
return 1 / (1 + ((value * bwidth) / (1+(value**2 - cutoff**2)))**(2*n))
elif ftype == 'bandpass':
return 1 / (1 + ((value**2 - cutoff**2) / ((1+value) * bwidth))**(2*n))
def apply_filter(data, filter):
filtered = np.empty(data.shape, dtype=complex)
for c in range(0, data.shape[0]):
for u in range(0, data.shape[1]):
for v in range(0, data.shape[2]):
value = distance(u, v, data[c])
filtered[c][u][v] = data[c][u][v] * filter(value)
return filtered
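# A minimal usage sketch (not part of the module above; it assumes "image" is a
# float array of shape (channels, height, width)). The spectrum is centred with
# fftshift so distance() measures from the middle of each channel, a Gaussian
# low-pass is applied through apply_filter, and the result is transformed back.
from functools import partial

def gaussian_lowpass_demo(image, cutoff=30.0):
    # forward FFT per channel, zero frequency moved to the centre
    dft = np.fft.fftshift(np.fft.fft2(image), axes=(-2, -1))
    # apply_filter expects a single-argument callable, so bind the remaining args
    filtered = apply_filter(dft, partial(gauss, cutoff=cutoff, ftype='lowpass'))
    # undo the shift and return the real part of the inverse transform
    return np.real(np.fft.ifft2(np.fft.ifftshift(filtered, axes=(-2, -1))))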
|
After growing up in China, William Zhao went to France to obtain his MBA. He remained in the country for 11 years, working in the finance field. He became active in curating contemporary art exhibitions and increasingly involved in the French art scene, advising the French Government Commission in charge of the Joint France-China Years in 2001-2002 and the president of the Pompidou Centre, Jean-Jacques Aillagon, on the exhibition Alors, Chine.
William Zhao returned to Asia in 2003 and has increasingly devoted himself to his passion for advising collectors and writing on contemporary Chinese art. Publications which regularly feature his columns include Modern Weekly (周末画报), Bazaar Art (China edition), the South China Morning Post and Architectural Digest. He is also active in supporting important world-class museum activities.
He has held such positions as Vice Chairman-Asia of the Association of the Guimet Museum (Paris), Member of the Board of Directors of the Shao Zhong Foundation in China and Member of the Board of Directors of the Sovereign Art Foundation (Hong Kong).
William Zhao is currently preparing the curatorial work for a solo exhibition by French photographer Stephane Couturier.
|
import pytest
import hashlib
from django.core import exceptions
from django.db.utils import IntegrityError
from share.models import RawData
@pytest.mark.django_db
class TestRawData:
def test_doesnt_mangle_data(self, share_source):
rd = RawData(source=share_source, app_label='foo', data=b'This is just some data')
rd.save()
assert RawData.objects.first().data == 'This is just some data'
def test_must_have_data(self, share_source):
rd = RawData(source=share_source, app_label='foo')
with pytest.raises(exceptions.ValidationError) as e:
rd.clean_fields()
rd.save()
assert 'This field cannot be blank.' == e.value.message_dict['data'][0]
def test_must_have_source(self):
rd = RawData(data='SomeData', app_label='foo')
with pytest.raises(IntegrityError) as e:
rd.save()
assert 'null value in column "source_id" violates not-null constraint' in e.value.args[0]
def test_store_data(self, share_source):
rd = RawData.objects.store_data('myid', b'mydatums', share_source, 'applabel')
assert rd.date_seen is not None
assert rd.date_harvested is not None
assert rd.data == b'mydatums'
assert rd.source == share_source
assert rd.app_label == 'applabel'
assert rd.provider_doc_id == 'myid'
assert rd.sha256 == hashlib.sha256(b'mydatums').hexdigest()
def test_store_data_dedups_simple(self, share_source):
rd1 = RawData.objects.store_data('myid', b'mydatums', share_source, 'applabel')
rd2 = RawData.objects.store_data('myid', b'mydatums', share_source, 'applabel')
assert rd1.pk == rd2.pk
assert rd1.date_seen < rd2.date_seen
assert rd1.date_harvested == rd2.date_harvested
def test_store_data_dedups_complex(self, share_source):
data = b'{"providerUpdatedDateTime":"2016-08-25T11:37:40Z","uris":{"canonicalUri":"https://provider.domain/files/7d2792031","providerUris":["https://provider.domain/files/7d2792031"]},"contributors":[{"name":"Person1","email":"[email protected]"},{"name":"Person2","email":"[email protected]"},{"name":"Person3","email":"[email protected]"},{"name":"Person4","email":"[email protected]"}],"title":"ReducingMorbiditiesinNeonatesUndergoingMRIScannig"}'
rd1 = RawData.objects.store_data('myid', data, share_source, 'applabel')
rd2 = RawData.objects.store_data('myid', data, share_source, 'applabel')
assert rd1.pk == rd2.pk
assert rd1.date_seen < rd2.date_seen
assert rd1.date_harvested == rd2.date_harvested
|
Playing Together: How Games Learn From and Add to a Diverse World.
Hosted by William Evans of Black Nerd Problems. This panel will look at the power of play and games to bring people from across the globe together to communicate, share, and work together in a way no other medium is capable of.
|
# -*- coding: utf-8 -*-
'''
lxml mate.
'''
__ver_major__ = 0
__ver_minor__ = 5
__ver_patch__ = 2
__ver_sub__ = ""
__version__ = "%d.%d.%d%s" % (__ver_major__,__ver_minor__,__ver_patch__,__ver_sub__)
from lxml import etree, objectify
import types
class ObjectifiedElementProxy( object ):
u'''Proxy class for objectify.ObjectifiedElement instance which can be created by objectify.Element() or SubElement() or XML() or fromstring().
main purpose is to intercept AttributeException when access a non-existent tag.
How to access xml tag
---------------------
.
such as root.element.name
[]
such as root['element']['name']
hybrid
such as root['element'].name
note
A tag can only be accessed by [] when its name is one of the reserved keywords.
A tag not yet in the xml tree can be accessed directly: a new tag will be created and no exception will be raised.
How to access xml tag's attributes
----------------------------------
[]
.attrib['style'], an exception will be raised when 'style' doesn't exist.
.attrib['style'] = 'big'
.get/set
.attrib.get( 'style', None )
.attrib.set( 'style', 'big' )
How to access class attributes and methods
------------------------------------------
.
attributes are reserved keywords and can only be accessed this way, for example
.pyval
.text
.insert etc.
otherwise they are considered xml tags rather than attributes.
Reserved keywords
-----------------
The following keywords are used as methods or attributes' names.
**pyval** : returns the python value carried by leaf tag. read-only.
**text** : returns leaf tag's text content. read-only.
**obj** : returns ObjectifiedElement object referenced by this class instance. read-only.
**tag** : returns tag names. can be modified by . way such as \*.tag='newTagName'. readable and writable.
**attrib** : returns tag attributes dict. readable and writeable.
**parent** : returns parent node. read-only.
**children** : returns all children's list. read-only.
**len** : returns the number of children.
**insert** : insert a child node at the specified position.
**remove** : remove a child node at the specified position.
**index** : returns the position of the specified object.
**swap** : swap two nodes' position.
Examples
--------
create a brand new xml:
>>> p = ObjectifiedElementProxy( rootag='Person' )
>>> p.name = 'peter'
>>> p.age = 13
>>> print( p )
<Person>
<name>peter</name>
<age>13</age>
</Person>
create from xml string:
>>> p = ObjectifiedElementProxy( xmlStr="<Person><name>peter</name><age>13</age></Person>" )
>>> print( p )
<Person>
<name>peter</name>
<age>13</age>
</Person>
multiple levels examples:
>>> r = ObjectifiedElementProxy()
>>> r.person.name = 'jack'
>>> r.person.age = 10
>>> print( r )
<root>
<person>
<name>jack</name>
<age>10</age>
</person>
</root>
to insert child like '<person><name>peter</name><age>13</age></person>':
>>> r.insert( 'person' )('name','peter')('age',13)
>>> p = r('person').person[-1]
>>> p.name = 'david'
>>> p.age = 16
>>> print( r )
<root>
<person>
<name>jack</name>
<age>10</age>
</person>
<person>
<name>peter</name>
<age>13</age>
</person>
<person>
<name>david</name>
<age>16</age>
</person>
</root>
>>> print( r.person[1].name.pyval )
peter
Notes
-----
xml attrib names and tag names are case insensitive.
Nodes with a text attribute are called leaf nodes. Theoretically, leaf nodes should not have children, but this is not required.
'''
def __init__( self, objectifiedElement=None, xmlFileName=None, xmlStr=None, rootag='root', attrib=None, nsmap=None, **kwargs ):
u'''
initialize from ObjectifiedElement or xml file or xml string or create a brand new.
Arguments
---------
objectifiedElement : ObjectifiedElement, optional
an ObjectifiedElement object.
xmlFileName : str, optional
xml's filename.
xmlStr : str, optional
xml's content.
rootag : str, optional
create ObjectifiedElement instance which root tag's name is rootag.
attrib, nsmap, kwargs : optional
refer to objectify.Element()
'''
self._____o____ = None
if objectifiedElement is not None:
self._____o____ = objectifiedElement
elif xmlFileName:
self._____o____ = objectify.parse( xmlFileName ).getroot()
elif xmlStr:
self._____o____ = objectify.fromstring( xmlStr )
else:
self._____o____ = objectify.Element( rootag, attrib=attrib, nsmap=nsmap, **kwargs )
def __call__( self, tag, pyval=None, attrib=None, nsmap=None, **kwargs ):
u'''Insert a new child node.
insert a new child node to the end.
Arguments
---------
tag : str
the new tag to be inserted.
pyval : legal python data type
tag's python value.
attrib,nsmap,kwargs : optional
attribs for the new tag. see also objectify.Element() or SubElement().
Returns
-------
ObjectifiedElementProxy instance
See Also
--------
insert
note the difference between the two methods' return values.
Examples
--------
>>> p=ObjectifiedElementProxy( rootag='Person' )
>>> p( 'name', pyval='jack' )('age', pyval=13 )
>>> print( p )
<Person>
<name py:pytype="str">jack</name>
<age py:pytype="int">13</age>
</Person>
'''
self.insert( tag, None, attrib, nsmap, **kwargs )
self [ tag ][-1] = pyval
return self
def __getattr__( self, name ):
if name == '_____o____':
return object.__getattribute__( self, name )
if hasattr( self._____o____, name ):
e = getattr( self._____o____, name )
if name in ( 'tag','pyval','text', 'attrib' ) or isinstance( e, ( types.FunctionType, types.BuiltinFunctionType ) ):
return e
else:
# if there is no attribute named `name`, create a new one.
e = objectify.SubElement( self._____o____, name )
return ObjectifiedElementProxy( e )
def __setattr__( self, name, value ):
if name == '_____o____':
object.__setattr__( self, name, value )
return
setattr( self._____o____, name, value )
def __delattr__( self, e ):
self._____o____.__delattr__( e )
def __len__( self ):
u'''children's number'''
return len( self.children )
def __getitem__( self, name ):
if isinstance( name, int ):
return ObjectifiedElementProxy( self._____o____[name] )
if isinstance( name, slice ):
return [ ObjectifiedElementProxy( o ) for o in self._____o____[name] ]
if isinstance( name, str ):
if name == '_____o____':
return object.__getattribute__( self, name )
o = self._____o____
try:
e = o.__getitem__( name )
except:
e = objectify.SubElement( self._____o____, name )
return ObjectifiedElementProxy( e )
raise Exception
def __setitem__( self, e, v ):
if e == '_____o____':
object.__setitem__( self, e, v )
return
self._____o____[e] = v
def __delitem__( self, e ):
if isinstance( e, ObjectifiedElementProxy ):
self._____o____.__delattr__( e.tag )
else:
self._____o____.__delattr__( e )
def insert( self, e, i=None, attrib=None, nsmap=None, **kwargs ):
u'''Insert a new child node.
insert a new child node at the specified position.
Arguments
---------
e : str
the new tag to be inserted.
i : int, optional
if i is integer : position of the new tag. else append to the end.
attrib,nsmap,kwargs : optional
attribs for the new tag. see also objectify.Element() or SubElement().
'''
v = objectify.SubElement( self._____o____, e, attrib=attrib, nsmap=nsmap, **kwargs )
s = ObjectifiedElementProxy( v )
if i is not None:
self._____o____.insert( i, v )
return s
def swap( self, i, j ):
u'''swap two child nodes' position.
Arguments
---------
i,j : int
position of the child nodes to be swapped.
'''
self._____o____[i] = self._____o____[j]
def remove( self, i ):
u'''remove the child node.
Arguments
---------
i : int or ObjectifiedElement or ObjectifiedElementProxy or list
position of the child node or Element which will be removed.
'''
if isinstance( i, list ):
for k in i:
self.remove( k )
elif isinstance( i, int ):
return self.obj.remove( self.children[i].obj )
elif isinstance( i, objectify.ObjectifiedElement ):
return self.obj.remove( i )
elif isinstance( i, ObjectifiedElementProxy ):
return self.obj.remove( i.obj )
def index( self, o ):
u'''return the position of o.
Arguments
---------
o : ObjectifiedElementProxy
the ObjectifiedElementProxy instance to be positioned.
Returns
-------
int
'''
return self._____o____.index( o.obj )
def xpath( self, path ):
u'''find elements list in accordance with path.
Arguments
---------
path : str
please refer to lxml.objectify.ObjectifiedElement.xpath.
Returns
-------
list
a list of ObjectifiedElementProxy instance.
Xpath grammer review
--------------------
========== ===========
expression description
========== ===========
nodename to select all children of the node name
/ select from the root node.
// select nodes at any depth below the current node.
. select the current node.
.. select the parent of the current node.
@ select an attribute.
[] condition
text() tag text
* arbitrary node
========== ===========
'''
return [ ObjectifiedElementProxy(k) for k in self._____o____.xpath( path ) ]
@property
def children( self, **kwargs ):
return [ ObjectifiedElementProxy( e ) for e in self._____o____.getchildren( **kwargs ) ]
@property
def parent( self ):
return ObjectifiedElementProxy( self._____o____.getparent() )
@property
def root( self):
parent = self._____o____.getparent()
while parent is not None:
parent1 = parent.getparent()
if parent1 is None:
break
parent = parent1
return ObjectifiedElementProxy( parent )
@property
def obj( self ):
return self._____o____
@property
def pyval( self ):
if hasattr( self._____o____, 'pyval' ):
if isinstance( self._____o____, objectify.StringElement ):
return self._____o____.pyval.strip()
return self._____o____.pyval
def __nonzero__( self ):
return self.is_empty()
def is_empty( self ):
u'''To determine whether a null node.
no text \ no attribs \ no children.
'''
o = self._____o____
if o.text and o.text.strip():
return False
n = 0
for k in o.attrib:
if k[0] != '{':
n += 1
if n > 0:
return False
n = 0
for c in self.children:
if not c.is_empty():
n += 1
if n > 0:
return False
return True
def clean( self ):
u'''clean all null nodes.
'''
for c in self.children:
if c.is_empty():
c.parent.obj.__delattr__( c.tag )
else:
c.clean()
def tostring( self, encoding='utf-8', xml_declaration=True, standalone=None, with_comments=True,
pytype=False, xsi=True, xsi_nil=True, cleanup_namespaces=True, doctype=None,
with_tail=True, exclusive=False, inclusive_ns_prefixes=None ):
#self.clean()
objectify.deannotate( self._____o____, pytype=pytype, xsi=xsi, xsi_nil=xsi_nil, cleanup_namespaces=cleanup_namespaces )
s = etree.tostring( self._____o____, encoding=encoding, pretty_print= True,
xml_declaration=xml_declaration, with_tail=with_tail,
standalone=standalone, doctype=doctype,
exclusive=exclusive, with_comments=with_comments,
inclusive_ns_prefixes=inclusive_ns_prefixes )
return s
def __str__( self ):
#s = self.tostring( pytype=True, xml_declaration=False , encoding='unicode' )
s = self.tostring( pytype=True, xml_declaration=False ).decode()
return s
def dump( self, xmlFile, encoding='utf-8' ):
'''save xml to file.
Arguments
---------
xmlFile : str
xml's filename.
'''
f = open( xmlFile, 'w' )
s = self.tostring( encoding=encoding ).decode()
f.write( s )
f.close()
if __name__ == '__main__':
r = ObjectifiedElementProxy()
r.person.name = 'jack'
r.person.age = 10
r.insert( 'person' )('name','peter')('age',13)
p = r('person').person[-1]
p.name = 'david'
p.age = 16
print( r )
print( r.tostring().decode() )
print( r.person[1].name.pyval )
r.dump( 'x.xml' )
|
The statistics on foster care in the United States are staggering. There are approximately 415,000 children in foster care on any given day in the US. In Tennessee that number is slightly less than 8,000 children. We at Ethos believe God desires for us as people and as His church to love and support the children in the foster care system who are experiencing pain, confusion, fear and brokenness that only someone in their position can ever fully understand. We believe this applies not only to the children, but also to the families who so willingly step into the brokenness and pain of this world and love these children the way God loves them.
We have several foster families at Ethos. Maybe God is calling you to become a foster family. You can find more information about first steps here with a local organization called Agape. They have two upcoming informational meetings at their Nashville office: Thursday, November 8th from 6-8pm and January 6th from 2-4pm. If you would like to talk to one of the foster families within Ethos or if you just have questions, you can email .
Or maybe He isn't calling you to personally foster in this season, but you can support other families as they do. One way you can do that is by praying for them. Praying for the children, for their biological families, and for their present foster families. You can provide things for the child and family as they work to fill in all the material gaps of a child who may have shown up at their door with only a small garbage bag's worth of possessions or maybe even just the clothes on his/her back. You can offer to help with household chores or "honey-do" lists so the family can spend more time developing relationships with the children. You can check in frequently, listen, and avoid judgement. You can give these children lots of grace as they walk through a very stressful situation that is out of their control and express their pain in different ways. You can babysit the kids while the parents have a date night or run errands. You can provide gift cards for take out, clothes, fun outings, etc. You can pick up groceries and drop them on the porch. You can drop off special treats on hard days. The list of ways you can help is endless. You can do something!
One of the first tangible ways we are asking you to help is by providing a meal to a family from our Cannery gatherings who just welcomed two children into their home in the last few weeks. Help us provide dinner for them so they can spend more time together. Click the link below and select a date to serve this family.
|
# Copyright (c) 2010-2012 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
Monkey Patch httplib.HTTPResponse to buffer reads of headers. This can improve
performance when making large numbers of small HTTP requests. This module
also provides helper functions to make HTTP connections using
BufferedHTTPResponse.
.. warning::
If you use this, be sure that the libraries you are using do not access
the socket directly (xmlrpclib, I'm looking at you :/), and instead
make all calls through httplib.
"""
from swift.common import constraints
import logging
import time
import socket
import eventlet
from eventlet.green.httplib import CONTINUE, HTTPConnection, HTTPMessage, \
HTTPResponse, HTTPSConnection, _UNKNOWN
from six.moves.urllib.parse import quote, parse_qsl, urlencode
import six
if six.PY2:
httplib = eventlet.import_patched('httplib')
from eventlet.green import httplib as green_httplib
else:
httplib = eventlet.import_patched('http.client')
from eventlet.green.http import client as green_httplib
# Apparently http.server uses this to decide when/whether to send a 431.
# Give it some slack, so the app is more likely to get the chance to reject
# with a 400 instead.
httplib._MAXHEADERS = constraints.MAX_HEADER_COUNT * 1.6
green_httplib._MAXHEADERS = constraints.MAX_HEADER_COUNT * 1.6
class BufferedHTTPResponse(HTTPResponse):
"""HTTPResponse class that buffers reading of headers"""
def __init__(self, sock, debuglevel=0, strict=0,
method=None): # pragma: no cover
self.sock = sock
# sock is an eventlet.greenio.GreenSocket
if six.PY2:
# sock.fd is a socket._socketobject
# sock.fd._sock is a _socket.socket object, which is what we want.
self._real_socket = sock.fd._sock
else:
# sock.fd is a socket.socket, which should have a _real_close
self._real_socket = sock.fd
self.fp = sock.makefile('rb')
self.debuglevel = debuglevel
self.strict = strict
self._method = method
self.headers = self.msg = None
# from the Status-Line of the response
self.version = _UNKNOWN # HTTP-Version
self.status = _UNKNOWN # Status-Code
self.reason = _UNKNOWN # Reason-Phrase
self.chunked = _UNKNOWN # is "chunked" being used?
self.chunk_left = _UNKNOWN # bytes left to read in current chunk
self.length = _UNKNOWN # number of bytes left in response
self.will_close = _UNKNOWN # conn will close at end of response
self._readline_buffer = b''
if not six.PY2:
def begin(self):
HTTPResponse.begin(self)
header_payload = self.headers.get_payload()
if isinstance(header_payload, list) and len(header_payload) == 1:
header_payload = header_payload[0].get_payload()
if header_payload:
# This shouldn't be here. We must've bumped up against
# https://bugs.python.org/issue37093
for line in header_payload.rstrip('\r\n').split('\n'):
if ':' not in line or line[:1] in ' \t':
# Well, we're no more broken than we were before...
# Should we support line folding?
# How can/should we handle a bad header line?
break
header, value = line.split(':', 1)
value = value.strip(' \t\n\r')
self.headers.add_header(header, value)
def expect_response(self):
if self.fp:
self.fp.close()
self.fp = None
self.fp = self.sock.makefile('rb', 0)
version, status, reason = self._read_status()
if status != CONTINUE:
self._read_status = lambda: (version, status, reason)
self.begin()
else:
self.status = status
self.reason = reason.strip()
self.version = 11
if six.PY2:
# Under py2, HTTPMessage.__init__ reads the headers
# which advances fp
self.msg = HTTPMessage(self.fp, 0)
# immediately kill msg.fp to make sure it isn't read again
self.msg.fp = None
else:
# py3 has a separate helper for it
self.headers = self.msg = httplib.parse_headers(self.fp)
def read(self, amt=None):
if not self._readline_buffer:
return HTTPResponse.read(self, amt)
if amt is None:
# Unbounded read: send anything we have buffered plus whatever
# is left.
buffered = self._readline_buffer
self._readline_buffer = b''
return buffered + HTTPResponse.read(self, amt)
elif amt <= len(self._readline_buffer):
# Bounded read that we can satisfy entirely from our buffer
res = self._readline_buffer[:amt]
self._readline_buffer = self._readline_buffer[amt:]
return res
else:
# Bounded read that wants more bytes than we have
smaller_amt = amt - len(self._readline_buffer)
buf = self._readline_buffer
self._readline_buffer = b''
return buf + HTTPResponse.read(self, smaller_amt)
def readline(self, size=1024):
# You'd think Python's httplib would provide this, but it doesn't.
# It does, however, provide a comment in the HTTPResponse class:
#
# # XXX It would be nice to have readline and __iter__ for this,
# # too.
#
# Yes, it certainly would.
while (b'\n' not in self._readline_buffer
and len(self._readline_buffer) < size):
read_size = size - len(self._readline_buffer)
chunk = HTTPResponse.read(self, read_size)
if not chunk:
break
self._readline_buffer += chunk
line, newline, rest = self._readline_buffer.partition(b'\n')
self._readline_buffer = rest
return line + newline
def nuke_from_orbit(self):
"""
Terminate the socket with extreme prejudice.
Closes the underlying socket regardless of whether or not anyone else
has references to it. Use this when you are certain that nobody else
you care about has a reference to this socket.
"""
if self._real_socket:
if six.PY2:
# this is idempotent; see sock_close in Modules/socketmodule.c
# in the Python source for details.
self._real_socket.close()
else:
# Hopefully this is equivalent?
# TODO: verify that this does everything ^^^^ does for py2
self._real_socket._real_close()
self._real_socket = None
self.close()
def close(self):
HTTPResponse.close(self)
self.sock = None
self._real_socket = None
class BufferedHTTPConnection(HTTPConnection):
"""HTTPConnection class that uses BufferedHTTPResponse"""
response_class = BufferedHTTPResponse
def connect(self):
self._connected_time = time.time()
ret = HTTPConnection.connect(self)
self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
return ret
def putrequest(self, method, url, skip_host=0, skip_accept_encoding=0):
'''Send a request to the server.
:param method: specifies an HTTP request method, e.g. 'GET'.
:param url: specifies the object being requested, e.g. '/index.html'.
:param skip_host: if True does not add automatically a 'Host:' header
:param skip_accept_encoding: if True does not add automatically an
'Accept-Encoding:' header
'''
self._method = method
self._path = url
return HTTPConnection.putrequest(self, method, url, skip_host,
skip_accept_encoding)
def putheader(self, header, value):
if not isinstance(header, bytes):
header = header.encode('latin-1')
HTTPConnection.putheader(self, header, value)
def getexpect(self):
kwargs = {'method': self._method}
if hasattr(self, 'strict'):
kwargs['strict'] = self.strict
response = BufferedHTTPResponse(self.sock, **kwargs)
response.expect_response()
return response
def getresponse(self):
response = HTTPConnection.getresponse(self)
logging.debug("HTTP PERF: %(time).5f seconds to %(method)s "
"%(host)s:%(port)s %(path)s)",
{'time': time.time() - self._connected_time,
'method': self._method, 'host': self.host,
'port': self.port, 'path': self._path})
return response
def http_connect(ipaddr, port, device, partition, method, path,
headers=None, query_string=None, ssl=False):
"""
Helper function to create an HTTPConnection object. If ssl is set True,
HTTPSConnection will be used. However, if ssl=False, BufferedHTTPConnection
will be used, which is buffered for backend Swift services.
:param ipaddr: IPv4 address to connect to
:param port: port to connect to
:param device: device of the node to query
:param partition: partition on the device
:param method: HTTP method to request ('GET', 'PUT', 'POST', etc.)
:param path: request path
:param headers: dictionary of headers
:param query_string: request query string
:param ssl: set True if SSL should be used (default: False)
:returns: HTTPConnection object
"""
if isinstance(path, six.text_type):
path = path.encode("utf-8")
if isinstance(device, six.text_type):
device = device.encode("utf-8")
if isinstance(partition, six.text_type):
partition = partition.encode('utf-8')
elif isinstance(partition, six.integer_types):
partition = str(partition).encode('ascii')
path = quote(b'/' + device + b'/' + partition + path)
return http_connect_raw(
ipaddr, port, method, path, headers, query_string, ssl)
def http_connect_raw(ipaddr, port, method, path, headers=None,
query_string=None, ssl=False):
"""
Helper function to create an HTTPConnection object. If ssl is set True,
HTTPSConnection will be used. However, if ssl=False, BufferedHTTPConnection
will be used, which is buffered for backend Swift services.
:param ipaddr: IPv4 address to connect to
:param port: port to connect to
:param method: HTTP method to request ('GET', 'PUT', 'POST', etc.)
:param path: request path
:param headers: dictionary of headers
:param query_string: request query string
:param ssl: set True if SSL should be used (default: False)
:returns: HTTPConnection object
"""
if not port:
port = 443 if ssl else 80
if ssl:
conn = HTTPSConnection('%s:%s' % (ipaddr, port))
else:
conn = BufferedHTTPConnection('%s:%s' % (ipaddr, port))
if query_string:
# Round trip to ensure proper quoting
if six.PY2:
query_string = urlencode(parse_qsl(
query_string, keep_blank_values=True))
else:
query_string = urlencode(
parse_qsl(query_string, keep_blank_values=True,
encoding='latin1'),
encoding='latin1')
path += '?' + query_string
conn.path = path
conn.putrequest(method, path, skip_host=(headers and 'Host' in headers))
if headers:
for header, value in headers.items():
conn.putheader(header, str(value))
conn.endheaders()
return conn
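For orientation, here is a minimal usage sketch of the helpers above; the address, port, path and header values are made-up placeholders, not values from the module.

# Assumed example: GET /info from a backend listening on 127.0.0.1:8080.
conn = http_connect_raw('127.0.0.1', 8080, 'GET', '/info',
                        headers={'X-Trans-Id': 'tx-example'})
resp = conn.getresponse()
print(resp.status, resp.read())
resp.close()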
|
First of all: I'm off this week. All week! Yay!!! I just needed some time out (even from blogging, as my sporadic posting over the last couple of weeks illustrates so well) and am taking a week's vacation from work. My plans? Doing nothing. Or rather, only doing the things I feel like doing. And if that means looking at the sky or having a three-hour nap every afternoon, so be it. I am looking forward to just drifting for a few days.
Sometimes, I am absolutely stumped about what to do with the contents of my weekly organic vegetable bag. Sometimes, I have an epiphany and a little kitchen magic happens.
Last week was epiphany-time. I had some leftover chard from a previous delivery, and the new bag came with a few bulbs of fresh, green garlic. I also remembered I had seen some ready-made pastry in the freezer when rummaging for food the other day, and thus felt inspired to make a quiche.
Green garlic is just young garlic, picked before the cloves have started forming. It is milder than mature garlic and works well in a dish like this, a pasta sauce or a stir fry. If you cannot get young garlic, you could maybe substitute with red onions or shallots, adding a bit of regular garlic when frying them. Instead of chard you could use a similar vegetable like spinach or beet greens.
The pictures are pretty crap, because I took them with my phone. I was hungry and wanted to eat and did not have the will to go get my proper camera. But this quiche was quite good, even if I say so myself, so I wanted to share. I had it with a simple salad, warm for dinner and then cold for lunch the next day and I would definitely make it again!
Lay the pastry over a tart tin and ease it into it, making sure to fit it neatly into the fluted rim. You can trim any overlap with a rolling pin that you gently push across the top around the edge of the tin. Prick the bottom of the pastry all over with a fork.
Heat the olive oil in a pan, add the garlic, add a generous amount of salt and sauté over a low heat until the garlic starts to soften, about 5 to 10 minutes.
Add the chard and marjoram and heat through until the chard starts wilting. Season to taste with lots of black pepper and a bit more salt if you like. Take off the heat and set aside.
In a small bowl, whisk the eggs with the milk. Evenly scatter the garlic and chard over the bottom of your pastry and then pour over the egg mixture. Grate a little bit of cheddar over the top.
Place the quiche in the preheated oven and cook for 20 minutes or until the egg has set and the top is starting to turn a golden brown.
I made a version of this last night, inspired by your post. I only had Filo pastry so I had to improvise a bit and create a wall of beans to keep the eggy-cheesey yuminess contained. Delicious!
oooh I must try this! You're such an excellent baker! I'm so impressed and inspired now!!
|
# Copyright (C) 2006-2010 Ludovic Rousseau ([email protected])
#
# This file is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
# $Id$
import PyKCS11.LowLevel
import os
# redefine PKCS#11 constants
CK_TRUE = PyKCS11.LowLevel.CK_TRUE
CK_FALSE = PyKCS11.LowLevel.CK_FALSE
CK_UNAVAILABLE_INFORMATION = PyKCS11.LowLevel.CK_UNAVAILABLE_INFORMATION
CK_EFFECTIVELY_INFINITE = PyKCS11.LowLevel.CK_EFFECTIVELY_INFINITE
CK_INVALID_HANDLE = PyKCS11.LowLevel.CK_INVALID_HANDLE
CKM = {}
CKR = {}
CKA = {}
CKO = {}
CKU = {}
CKK = {}
CKC = {}
CKF = {}
CKS = {}
# redefine PKCS#11 constants using well known prefixes
for x in PyKCS11.LowLevel.__dict__.keys():
if x[:4] == 'CKM_' \
or x[:4] == 'CKR_' \
or x[:4] == 'CKA_' \
or x[:4] == 'CKO_' \
or x[:4] == 'CKU_' \
or x[:4] == 'CKK_' \
or x[:4] == 'CKC_' \
or x[:4] == 'CKF_' \
or x[:4] == 'CKS_':
a = "%s=PyKCS11.LowLevel.%s" % (x, x)
exec(a)
if x[3:] != "_VENDOR_DEFINED":
eval(x[:3])[eval(x)] = x # => CKM[CKM_RSA_PKCS] = 'CKM_RSA_PKCS'
eval(x[:3])[x] = eval(x) # => CKM['CKM_RSA_PKCS'] = CKM_RSA_PKCS
# special CKR[] values
CKR[-2] = "Unkown PKCS#11 type"
CKR[-1] = "Load"
class ckbytelist(PyKCS11.LowLevel.ckbytelist):
"""
add a __repr__() method to the LowLevel equivalent
"""
def __repr__(self):
"""
return the representation of a tuple
the __str__ method will use it also
"""
rep = [elt for elt in self]
return repr(rep)
class byteArray(PyKCS11.LowLevel.byteArray):
"""
add a __repr__() method to the LowLevel equivalent
"""
def __repr__(self):
"""
return the representation of a tuple
the __str__ method will use it also
"""
rep = [elt for elt in self]
return repr(rep)
class CkClass(object):
"""
Base class for CK_* classes
"""
# dictionary of integer_value: text_value for the flags bits
flags_dict = dict()
# dictionary of field names and types
# type can be "pair", "flags" or "text"
fields = dict()
flags = 0
def flags2text(self):
"""
parse the L{self.flags} field and create a list of "CKF_*" strings
corresponding to bits set in flags
@return: a list of strings
@rtype: list
"""
r = []
for v in self.flags_dict.keys():
if self.flags & v:
r.append(self.flags_dict[v])
return r
def to_dict(self):
"""
convert the fields of the object into a dictionary
"""
dico = dict()
for field in self.fields.keys():
if field == "flags":
dico[field] = self.flags2text()
else:
dico[field] = eval("self." + field)
return dico
def __str__(self):
"""
text representation of the object
"""
dico = self.to_dict()
lines = list()
for key in sorted(dico.keys()):
type = self.fields[key]
if type == "flags":
lines.append("%s: %s" % (key, ", ".join(dico[key])))
elif type == "pair":
lines.append("%s: " % key + "%d.%d" % dico[key])
else:
lines.append("%s: %s" % (key, dico[key]))
return "\n".join(lines)
class CK_SLOT_INFO(CkClass):
"""
matches the PKCS#11 CK_SLOT_INFO structure
@ivar slotDescription: blank padded
@type slotDescription: string
@ivar manufacturerID: blank padded
@type manufacturerID: string
@ivar flags: See L{flags2text}
@type flags: integer
@ivar hardwareVersion: 2 elements list
@type hardwareVersion: list
@ivar firmwareVersion: 2 elements list
@type firmwareVersion: list
"""
flags_dict = {
CKF_TOKEN_PRESENT: "CKF_TOKEN_PRESENT",
CKF_REMOVABLE_DEVICE: "CKF_REMOVABLE_DEVICE",
CKF_HW_SLOT: "CKF_HW_SLOT"}
fields = {"slotDescription": "text",
"manufacturerID": "text",
"flags": "flags",
"hardwareVersion": "text",
"firmwareVersion": "text"}
class CK_INFO(CkClass):
"""
matches the PKCS#11 CK_INFO structure
@ivar cryptokiVersion: Cryptoki interface version
@type cryptokiVersion: integer
@ivar manufacturerID: blank padded
@type manufacturerID: string
@ivar flags: must be zero
@type flags: integer
@ivar libraryDescription: blank padded
@type libraryDescription: string
@ivar libraryVersion: 2 elements list
@type libraryVersion: list
"""
fields = {"cryptokiVersion": "pair",
"manufacturerID": "text",
"flags": "flags",
"libraryDescription": "text",
"libraryVersion": "pair"}
class CK_SESSION_INFO(CkClass):
"""
matches the PKCS#11 CK_SESSION_INFO structure
@ivar slotID: ID of the slot that interfaces with the token
@type slotID: integer
@ivar state: state of the session
@type state: integer
@ivar flags: bit flags that define the type of session
@type flags: integer
@ivar ulDeviceError: an error code defined by the cryptographic token
@type ulDeviceError: integer
"""
flags_dict = {
CKF_RW_SESSION: "CKF_RW_SESSION",
CKF_SERIAL_SESSION: "CKF_SERIAL_SESSION",
}
def state2text(self):
"""
parse the L{self.state} field and return a "CKS_*" string
corresponding to the state
@return: a string
@rtype: string
"""
return CKS[self.state]
fields = {"slotID": "text",
"state": "text",
"flags": "flags",
"ulDeviceError": "text"}
class CK_TOKEN_INFO(CkClass):
"""
matches the PKCS#11 CK_TOKEN_INFO structure
@ivar label: blank padded
@type label: string
@ivar manufacturerID: blank padded
@type manufacturerID: string
@ivar model: string blank padded
@type model: string
@ivar serialNumber: string blank padded
@type serialNumber: string
@ivar flags:
@type flags: integer
@ivar ulMaxSessionCount:
@type ulMaxSessionCount: integer
@ivar ulSessionCount:
@type ulSessionCount: integer
@ivar ulMaxRwSessionCount:
@type ulMaxRwSessionCount: integer
@ivar ulRwSessionCount:
@type ulRwSessionCount: integer
@ivar ulMaxPinLen:
@type ulMaxPinLen: integer
@ivar ulMinPinLen:
@type ulMinPinLen: integer
@ivar ulTotalPublicMemory:
@type ulTotalPublicMemory: integer
@ivar ulFreePublicMemory:
@type ulFreePublicMemory: integer
@ivar ulTotalPrivateMemory:
@type ulTotalPrivateMemory: integer
@ivar ulFreePrivateMemory:
@type ulFreePrivateMemory: integer
@ivar hardwareVersion: 2 elements list
@type hardwareVersion: list
@ivar firmwareVersion: 2 elements list
@type firmwareVersion: list
@ivar utcTime: string
@type utcTime: string
"""
flags_dict = {
CKF_RNG: "CKF_RNG",
CKF_WRITE_PROTECTED: "CKF_WRITE_PROTECTED",
CKF_LOGIN_REQUIRED: "CKF_LOGIN_REQUIRED",
CKF_USER_PIN_INITIALIZED: "CKF_USER_PIN_INITIALIZED",
CKF_RESTORE_KEY_NOT_NEEDED: "CKF_RESTORE_KEY_NOT_NEEDED",
CKF_CLOCK_ON_TOKEN: "CKF_CLOCK_ON_TOKEN",
CKF_PROTECTED_AUTHENTICATION_PATH: "CKF_PROTECTED_AUTHENTICATION_PATH",
CKF_DUAL_CRYPTO_OPERATIONS: "CKF_DUAL_CRYPTO_OPERATIONS",
CKF_TOKEN_INITIALIZED: "CKF_TOKEN_INITIALIZED",
CKF_SECONDARY_AUTHENTICATION: "CKF_SECONDARY_AUTHENTICATION",
CKF_USER_PIN_COUNT_LOW: "CKF_USER_PIN_COUNT_LOW",
CKF_USER_PIN_FINAL_TRY: "CKF_USER_PIN_FINAL_TRY",
CKF_USER_PIN_LOCKED: "CKF_USER_PIN_LOCKED",
CKF_USER_PIN_TO_BE_CHANGED: "CKF_USER_PIN_TO_BE_CHANGED",
CKF_SO_PIN_COUNT_LOW: "CKF_SO_PIN_COUNT_LOW",
CKF_SO_PIN_FINAL_TRY: "CKF_SO_PIN_FINAL_TRY",
CKF_SO_PIN_LOCKED: "CKF_SO_PIN_LOCKED",
CKF_SO_PIN_TO_BE_CHANGED: "CKF_SO_PIN_TO_BE_CHANGED",
}
fields = {"label": "text",
"manufacturerID": "text",
"model": "text",
"serialNumber": "text",
"flags": "flags",
"ulMaxSessionCount": "text",
"ulSessionCount": "text",
"ulMaxRwSessionCount": "text",
"ulRwSessionCount": "text",
"ulMaxPinLen": "text",
"ulMinPinLen": "text",
"ulTotalPublicMemory": "text",
"ulFreePublicMemory": "text",
"ulTotalPrivateMemory": "text",
"ulFreePrivateMemory": "text",
"hardwareVersion": "pair",
"firmwareVersion": "pair",
"utcTime": "text"}
class CK_MECHANISM_INFO(CkClass):
"""
matches the PKCS#11 CK_MECHANISM_INFO structure
@ivar ulMinKeySize: minimum size of the key
@type ulMinKeySize: integer
@ivar ulMaxKeySize: maximum size of the key
@type ulMaxKeySize: integer
@ivar flags: bit flags specifying mechanism capabilities
@type flags: integer
"""
flags_dict = {
CKF_HW: "CKF_HW",
CKF_ENCRYPT: "CKF_ENCRYPT",
CKF_DECRYPT: "CKF_DECRYPT",
CKF_DIGEST: "CKF_DIGEST",
CKF_SIGN: "CKF_SIGN",
CKF_SIGN_RECOVER: "CKF_SIGN_RECOVER",
CKF_VERIFY: "CKF_VERIFY",
CKF_VERIFY_RECOVER: "CKF_VERIFY_RECOVER",
CKF_GENERATE: "CKF_GENERATE",
CKF_GENERATE_KEY_PAIR: "CKF_GENERATE_KEY_PAIR",
CKF_WRAP: "CKF_WRAP",
CKF_UNWRAP: "CKF_UNWRAP",
CKF_DERIVE: "CKF_DERIVE",
CKF_EXTENSION: "CKF_EXTENSION",
}
fields = {"ulMinKeySize": "text",
"ulMaxKeySize": "text",
"flags": "flags"}
class PyKCS11Error(Exception):
""" define the possible PKCS#11 error codes """
def __init__(self, value, text=""):
self.value = value
self.text = text
def __str__(self):
"""
The text representation of a PKCS#11 error is something like:
"CKR_DEVICE_ERROR (0x00000030)"
"""
if (self.value < 0):
return CKR[self.value] + " (%s)" % self.text
else:
return CKR[self.value] + " (0x%08X)" % self.value
class PyKCS11Lib(object):
""" high level PKCS#11 binding """
def __init__(self):
self.lib = PyKCS11.LowLevel.CPKCS11Lib()
def __del__(self):
self.lib.Unload()
def load(self, pkcs11dll_filename=None, *init_string):
"""
load a PKCS#11 library
@type pkcs11dll_filename: string
@param pkcs11dll_filename: the library name. If this parameter
is not set the environment variable PYKCS11LIB is used instead
@return: a L{PyKCS11Lib} object
@raise PyKCS11Error(-1): when the load fails
"""
if pkcs11dll_filename == None:
pkcs11dll_filename = os.getenv("PYKCS11LIB")
if pkcs11dll_filename == None:
raise PyKCS11Error(-1, "No PKCS11 library specified (set PYKCS11LIB env variable)")
rv = self.lib.Load(pkcs11dll_filename, 1)
if rv == 0:
raise PyKCS11Error(-1, pkcs11dll_filename)
def initToken(self, slot, pin, label):
"""
C_InitToken
"""
rv = self.lib.C_InitToken(slot, pin, label)
if rv != CKR_OK:
raise PyKCS11Error(rv)
def getInfo(self):
"""
C_GetInfo
@return: a L{CK_INFO} object
"""
info = PyKCS11.LowLevel.CK_INFO()
rv = self.lib.C_GetInfo(info)
if rv != CKR_OK:
raise PyKCS11Error(rv)
i = CK_INFO()
i.cryptokiVersion = (info.cryptokiVersion.major, info.cryptokiVersion.minor)
i.manufacturerID = info.GetManufacturerID()
i.flags = info.flags
i.libraryDescription = info.GetLibraryDescription()
i.libraryVersion = (info.libraryVersion.major, info.libraryVersion.minor)
return i
def getSlotList(self):
"""
C_GetSlotList
@return: a list of available slots
@rtype: list
"""
slotList = PyKCS11.LowLevel.ckintlist()
rv = self.lib.C_GetSlotList(0, slotList)
if rv != CKR_OK:
raise PyKCS11Error(rv)
s = []
for x in xrange(len(slotList)):
s.append(slotList[x])
return s
def getSlotInfo(self, slot):
"""
C_GetSlotInfo
@param slot: slot number returned by L{getSlotList}
@type slot: integer
@return: a L{CK_SLOT_INFO} object
"""
slotInfo = PyKCS11.LowLevel.CK_SLOT_INFO()
rv = self.lib.C_GetSlotInfo(slot, slotInfo)
if rv != CKR_OK:
raise PyKCS11Error(rv)
s = CK_SLOT_INFO()
s.slotDescription = slotInfo.GetSlotDescription()
s.manufacturerID = slotInfo.GetManufacturerID()
s.flags = slotInfo.flags
s.hardwareVersion = slotInfo.GetHardwareVersion()
s.firmwareVersion = slotInfo.GetFirmwareVersion()
return s
def getTokenInfo(self, slot):
"""
C_GetTokenInfo
@param slot: slot number returned by L{getSlotList}
@type slot: integer
@return: a L{CK_TOKEN_INFO} object
"""
tokeninfo = PyKCS11.LowLevel.CK_TOKEN_INFO()
rv = self.lib.C_GetTokenInfo(slot, tokeninfo)
if rv != CKR_OK:
raise PyKCS11Error(rv)
t = CK_TOKEN_INFO()
t.label = tokeninfo.GetLabel()
t.manufacturerID = tokeninfo.GetManufacturerID()
t.model = tokeninfo.GetModel()
t.serialNumber = tokeninfo.GetSerialNumber()
t.flags = tokeninfo.flags
t.ulMaxSessionCount = tokeninfo.ulMaxSessionCount
if t.ulMaxSessionCount == CK_UNAVAILABLE_INFORMATION:
t.ulMaxSessionCount = -1
t.ulSessionCount = tokeninfo.ulSessionCount
if t.ulSessionCount == CK_UNAVAILABLE_INFORMATION:
t.ulSessionCount = -1
t.ulMaxRwSessionCount = tokeninfo.ulMaxRwSessionCount
if t.ulMaxRwSessionCount == CK_UNAVAILABLE_INFORMATION:
t.ulMaxRwSessionCount = -1
t.ulRwSessionCount = tokeninfo.ulRwSessionCount
if t.ulRwSessionCount == CK_UNAVAILABLE_INFORMATION:
t.ulRwSessionCount = -1
t.ulMaxPinLen = tokeninfo.ulMaxPinLen
t.ulMinPinLen = tokeninfo.ulMinPinLen
t.ulTotalPublicMemory = tokeninfo.ulTotalPublicMemory
if t.ulTotalPublicMemory == CK_UNAVAILABLE_INFORMATION:
t.ulTotalPublicMemory = -1
t.ulFreePublicMemory = tokeninfo.ulFreePublicMemory
if t.ulFreePublicMemory == CK_UNAVAILABLE_INFORMATION:
t.ulFreePublicMemory = -1
t.ulTotalPrivateMemory = tokeninfo.ulTotalPrivateMemory
if t.ulTotalPrivateMemory == CK_UNAVAILABLE_INFORMATION:
t.ulTotalPrivateMemory = -1
t.ulFreePrivateMemory = tokeninfo.ulFreePrivateMemory
if t.ulFreePrivateMemory == CK_UNAVAILABLE_INFORMATION:
t.ulFreePrivateMemory = -1
t.hardwareVersion = (tokeninfo.hardwareVersion.major, tokeninfo.hardwareVersion.minor)
t.firmwareVersion = (tokeninfo.firmwareVersion.major, tokeninfo.firmwareVersion.minor)
t.utcTime = tokeninfo.GetUtcTime()
return t
def openSession(self, slot, flags=0):
"""
C_OpenSession
@param slot: slot number returned by L{getSlotList}
@type slot: integer
@param flags: 0 (default), CKF_RW_SESSION for RW session
@type flags: integer
@return: a L{Session} object
"""
se = PyKCS11.LowLevel.CK_SESSION_HANDLE()
flags |= CKF_SERIAL_SESSION
rv = self.lib.C_OpenSession(slot, flags, se)
if rv != CKR_OK:
raise PyKCS11Error(rv)
s = Session()
s.lib = self.lib
s.slot = slot
s.session = se
return s
def getMechanismList(self, slot):
"""
C_GetMechanismList
@return: the list of available mechanisms for a slot
@rtype: list
"""
mechanismList = PyKCS11.LowLevel.ckintlist()
rv = self.lib.C_GetMechanismList(slot, mechanismList)
if rv != CKR_OK:
raise PyKCS11Error(rv)
m = []
for x in xrange(len(mechanismList)):
m.append(CKM[mechanismList[x]])
return m
def getMechanismInfo(self, slot, type):
"""
C_GetMechanismInfo
@return: information about a mechanism
@rtype: a L{CK_MECHANISM_INFO} object
"""
info = PyKCS11.LowLevel.CK_MECHANISM_INFO()
rv = self.lib.C_GetMechanismInfo(slot, CKM[type], info)
if rv != CKR_OK:
raise PyKCS11Error(rv)
i = CK_MECHANISM_INFO()
i.ulMinKeySize = info.ulMinKeySize
i.ulMaxKeySize = info.ulMaxKeySize
i.flags = info.flags
return i
def waitForSlotEvent(self, flags=0):
"""
C_WaitForSlotEvent
@param flags: 0 (default) or CKF_DONT_BLOCK
@type flags: integer
@return: slot
@rtype: integer
"""
tmp = 0
(rv, slot) = self.lib.C_WaitForSlotEvent(flags, tmp)
if rv != CKR_OK:
raise PyKCS11Error(rv)
return slot
class Mechanism(object):
"""Wraps CK_MECHANISM"""
def __init__(self, mechanism, param):
"""
@param mechanism: the mechanism to be used
@type mechanism: integer, any CKM_* value
@param param: data to be used as crypto operation parameter
(i.e. the IV for some algorithms)
@type param: string or list/tuple of bytes
@see: L{Session.decrypt}, L{Session.sign}
"""
self.mechanism = mechanism
self.param = param
MechanismRSAPKCS1 = Mechanism(CKM_RSA_PKCS, None)
class Session(object):
""" Manage L{PyKCS11Lib.openSession} objects """
def closeSession(self):
"""
C_CloseSession
"""
rv = self.lib.C_CloseSession(self.session)
if rv != CKR_OK:
raise PyKCS11Error(rv)
def closeAllSession(self):
"""
C_CloseAllSession
"""
rv = self.lib.C_CloseAllSession(self.slot)
if rv != CKR_OK:
raise PyKCS11Error(rv)
def getSessionInfo(self):
"""
C_GetSessionInfo
@return: a L{CK_SESSION_INFO} object
"""
sessioninfo = PyKCS11.LowLevel.CK_SESSION_INFO()
rv = self.lib.C_GetSessionInfo(self.session, sessioninfo)
if rv != CKR_OK:
raise PyKCS11Error(rv)
s = CK_SESSION_INFO()
s.slotID = sessioninfo.slotID
s.state = sessioninfo.state
s.flags = sessioninfo.flags
s.ulDeviceError = sessioninfo.ulDeviceError
return s
def login(self, pin, user_type=CKU_USER):
"""
C_Login
@param pin: the user's PIN
@type pin: string
@param user_type: the user type. The default value is
CKU_USER. You may also use CKU_SO
@type user_type: integer
"""
rv = self.lib.C_Login(self.session, user_type, pin)
if rv != CKR_OK:
raise PyKCS11Error(rv)
def logout(self):
"""
C_Logout
"""
rv = self.lib.C_Logout(self.session)
if rv != CKR_OK:
raise PyKCS11Error(rv)
del self
def initPin(self, new_pin):
"""
C_InitPIN
"""
rv = self.lib.C_InitPIN(self.session, new_pin)
if rv != CKR_OK:
raise PyKCS11Error(rv)
def setPin(self, old_pin, new_pin):
"""
C_SetPIN
"""
rv = self.lib.C_SetPIN(self.session, old_pin, new_pin)
if rv != CKR_OK:
raise PyKCS11Error(rv)
def sign(self, key, data, mecha=MechanismRSAPKCS1):
"""
C_SignInit/C_Sign
@param key: a key handle, obtained calling L{findObjects}.
@type key: integer
@param data: the data to be signed
@type data: (binary) string or list/tuple of bytes
@param mecha: the signing mechanism to be used
@type mecha: L{Mechanism} instance or L{MechanismRSAPKCS1}
for CKM_RSA_PKCS
@return: the computed signature
@rtype: list of bytes
@note: the returned value is an instance of L{ckbytelist}.
You can easily convert it to a binary string with::
''.join(chr(i) for i in ckbytelistSignature)
"""
m = PyKCS11.LowLevel.CK_MECHANISM()
signature = ckbytelist()
ba = None # must be declared here or may be deallocated too early
m.mechanism = mecha.mechanism
if (mecha.param):
ba = PyKCS11.LowLevel.byteArray(len(mecha.param))
if type(mecha.param) is type(''):
for c in xrange(len(mecha.param)):
ba[c] = ord(mecha.param[c])
else:
for c in xrange(len(mecha.param)):
ba[c] = mecha.param[c]
# with cast() the ba object continues to own the internal pointer
# (avoids a leak).
# pParameter is an opaque pointer, never garbage collected.
m.pParameter = ba.cast()
m.ulParameterLen = len(mecha.param)
data1 = ckbytelist()
data1.reserve(len(data))
if type(data) is type(''):
for x in data:
data1.append(ord(x))
else:
for c in xrange(len(data)):
data1.append(data[c])
rv = self.lib.C_SignInit(self.session, m, key)
if (rv != 0):
raise PyKCS11Error(rv)
#first call get signature size
rv = self.lib.C_Sign(self.session, data1, signature)
if (rv != 0):
raise PyKCS11Error(rv)
#second call get actual signature data
rv = self.lib.C_Sign(self.session, data1, signature)
if (rv != 0):
raise PyKCS11Error(rv)
return signature
def decrypt(self, key, data, mecha=MechanismRSAPKCS1):
"""
C_DecryptInit/C_Decrypt
@param key: a key handle, obtained calling L{findObjects}.
@type key: integer
@param data: the data to be decrypted
@type data: (binary) string or list/tuple of bytes
@param mecha: the decrypt mechanism to be used
@type mecha: L{Mechanism} instance or L{MechanismRSAPKCS1}
for CKM_RSA_PKCS
@return: the decrypted data
@rtype: list of bytes
@note: the returned value is an instance of L{ckbytelist}.
You can easily convert it to a binary string with::
''.join(chr(i) for i in ckbytelistData)
"""
m = PyKCS11.LowLevel.CK_MECHANISM()
decrypted = ckbytelist()
ba = None # must be declared here or may be deallocated too early
m.mechanism = mecha.mechanism
if (mecha.param):
ba = PyKCS11.LowLevel.byteArray(len(mecha.param))
if type(mecha.param) is type(''):
for c in xrange(len(mecha.param)):
ba[c] = ord(mecha.param[c])
else:
for c in xrange(len(mecha.param)):
ba[c] = mecha.param[c]
# with cast() the ba object continues to own the internal pointer
# (avoids a leak).
# pParameter is an opaque pointer, never garbage collected.
m.pParameter = ba.cast()
m.ulParameterLen = len(mecha.param)
data1 = ckbytelist()
data1.reserve(len(data))
if type(data) is type(''):
for x in data:
data1.append(ord(x))
else:
for c in xrange(len(data)):
data1.append(data[c])
rv = self.lib.C_DecryptInit(self.session, m, key)
if (rv != 0):
raise PyKCS11Error(rv)
#first call get decrypted size
rv = self.lib.C_Decrypt(self.session, data1, decrypted)
if (rv != 0):
raise PyKCS11Error(rv)
#second call get actual decrypted data
rv = self.lib.C_Decrypt(self.session, data1, decrypted)
if (rv != 0):
raise PyKCS11Error(rv)
return decrypted
def isNum(self, type):
if type in (CKA_CERTIFICATE_TYPE,
CKA_CLASS,
CKA_KEY_GEN_MECHANISM,
CKA_KEY_TYPE,
CKA_MODULUS_BITS,
CKA_VALUE_BITS,
CKA_VALUE_LEN):
return True
return False
def isString(self, type):
if type in (CKA_LABEL,
CKA_APPLICATION):
return True
return False
def isBool(self, type):
if type in (CKA_ALWAYS_SENSITIVE,
CKA_DECRYPT,
CKA_ENCRYPT,
CKA_HAS_RESET,
CKA_LOCAL,
CKA_MODIFIABLE,
CKA_NEVER_EXTRACTABLE,
CKA_PRIVATE,
CKA_RESET_ON_INIT,
CKA_SECONDARY_AUTH,
CKA_SENSITIVE,
CKA_SIGN,
CKA_SIGN_RECOVER,
CKA_TOKEN,
CKA_TRUSTED,
CKA_UNWRAP,
CKA_VERIFY,
CKA_VERIFY_RECOVER,
CKA_WRAP):
return True
return False
def isBin(self, type):
return (not self.isBool(type)) and (not self.isString(type)) and (not self.isNum(type))
def findObjects(self, template=()):
"""
find the objects matching the template pattern
@param template: list of attributes tuples (attribute,value).
The default value is () and all the objects are returned
@type template: list
@return: a list of object ids
@rtype: list
"""
t = PyKCS11.LowLevel.ckattrlist(len(template))
for x in xrange(len(template)):
attr = template[x]
if self.isNum(attr[0]):
t[x].SetNum(attr[0], attr[1])
elif self.isString(attr[0]):
t[x].SetString(attr[0], attr[1])
elif self.isBool(attr[0]):
t[x].SetBool(attr[0], attr[1])
elif self.isBin(attr[0]):
t[x].SetBin(attr[0], attr[1])
else:
raise PyKCS11Error(-2)
# we search for 10 objects by default. speed/memory tradeoff
result = PyKCS11.LowLevel.ckobjlist(10)
self.lib.C_FindObjectsInit(self.session, t)
res = []
while True:
self.lib.C_FindObjects(self.session, result)
for x in result:
# make a copy of the handle: the original value gets
# corrupted (!!)
a = PyKCS11.LowLevel.CK_OBJECT_HANDLE()
a.assign(x.value())
res.append(a)
if len(result) == 0:
break
self.lib.C_FindObjectsFinal(self.session)
return res
def getAttributeValue(self, obj_id, attr, allAsBinary=False):
"""
C_GetAttributeValue
@param obj_id: object ID returned by L{findObjects}
@type obj_id: integer
@param attr: list of attributes
@type attr: list
@param allAsBinary: return all values as binary data; default is False.
@type allAsBinary: Boolean
@return: a list of values corresponding to the list of attributes
@rtype: list
@see: L{getAttributeValue_fragmented}
@note: if allAsBinary is True the function does not convert results to
Python types (i.e.: CKA_TOKEN to Bool, CKA_CLASS to int, ...).
Binary data is returned as L{ckbytelist} type, usable
as a list containing only bytes.
You can easily convert it to a binary string with::
''.join(chr(i) for i in ckbytelistVariable)
"""
valTemplate = PyKCS11.LowLevel.ckattrlist(len(attr))
for x in xrange(len(attr)):
valTemplate[x].SetType(attr[x])
# first call to get the attribute size and reserve the memory
rv = self.lib.C_GetAttributeValue(self.session, obj_id, valTemplate)
if rv == CKR_ATTRIBUTE_TYPE_INVALID \
or rv == CKR_ATTRIBUTE_SENSITIVE:
return self.getAttributeValue_fragmented(obj_id, attr, allAsBinary)
if rv != CKR_OK:
raise PyKCS11Error(rv)
# second call to get the attribute value
rv = self.lib.C_GetAttributeValue(self.session, obj_id, valTemplate)
if rv != CKR_OK:
raise PyKCS11Error(rv)
res = []
for x in xrange(len(attr)):
if (allAsBinary):
res.append(valTemplate[x].GetBin())
elif valTemplate[x].IsNum():
res.append(valTemplate[x].GetNum())
elif valTemplate[x].IsBool():
res.append(valTemplate[x].GetBool())
elif valTemplate[x].IsString():
res.append(valTemplate[x].GetString())
elif valTemplate[x].IsBin():
res.append(valTemplate[x].GetBin())
else:
raise PyKCS11Error(-2)
return res
def getAttributeValue_fragmented(self, obj_id, attr, allAsBinary=False):
"""
Same as L{getAttributeValue} except that when some attribute
is sensitive or unknown an empty value (None) is returned.
Note: this is achieved by getting attributes one by one.
@see: L{getAttributeValue}
"""
# some attributes do not exist or are sensitive
# but we don't know which ones, so try one by one
valTemplate = PyKCS11.LowLevel.ckattrlist(1)
res = []
for x in xrange(len(attr)):
valTemplate[0].Reset()
valTemplate[0].SetType(attr[x])
# first call to get the attribute size and reserve the memory
rv = self.lib.C_GetAttributeValue(self.session, obj_id, valTemplate)
if rv == CKR_ATTRIBUTE_TYPE_INVALID \
or rv == CKR_ATTRIBUTE_SENSITIVE:
# append an empty value
res.append(None)
continue
if rv != CKR_OK:
raise PyKCS11Error(rv)
# second call to get the attribute value
rv = self.lib.C_GetAttributeValue(self.session, obj_id, valTemplate)
if rv != CKR_OK:
raise PyKCS11Error(rv)
if (allAsBinary):
res.append(valTemplate[0].GetBin())
elif valTemplate[0].IsNum():
res.append(valTemplate[0].GetNum())
elif valTemplate[0].IsBool():
res.append(valTemplate[0].GetBool())
elif valTemplate[0].IsString():
res.append(valTemplate[0].GetString())
elif valTemplate[0].IsBin():
res.append(valTemplate[0].GetBin())
else:
raise PyKCS11Error(-2)
return res
def seedRandom(self, seed):
"""
C_SeedRandom
@param seed: seed material
@type seed: iterable
"""
low_seed = ckbytelist(len(seed))
for c in xrange(len(seed)):
low_seed.append(seed[c])
rv = self.lib.C_SeedRandom(self.session, low_seed)
if rv != CKR_OK:
raise PyKCS11Error(rv)
def generateRandom(self, size=16):
"""
C_GenerateRandom
@param size: number of random bytes to get
@type size: integer
@note: the returned value is an instance of L{ckbytelist}.
You can easily convert it to a binary string with::
''.join(chr(i) for i in random)
"""
low_rand = ckbytelist(size)
rv = self.lib.C_GenerateRandom(self.session, low_rand)
if rv != CKR_OK:
raise PyKCS11Error(rv)
return low_rand
if __name__ == "__main__":
# sample test/debug code
p = PyKCS11Lib()
p.load()
print "getInfo"
print p.getInfo()
print
print "getSlotList"
s = p.getSlotList()
print "slots:", s
slot = s[1]
print "using slot:", slot
print
print "getSlotInfo"
print p.getSlotInfo(slot)
print
print "getTokenInfo"
print p.getTokenInfo(slot)
print
print "openSession"
se = p.openSession(slot)
print
print "sessionInfo"
print se.getSessionInfo()
print
print "seedRandom"
try:
se.seedRandom([1, 2, 3, 4])
except PyKCS11Error, e:
print e
print "generateRandom"
print se.generateRandom()
print
print "login"
se.login(pin="0000")
print
print "sessionInfo"
print se.getSessionInfo()
print
print "findObjects"
objs = se.findObjects([(CKA_CLASS, CKO_CERTIFICATE)])
print objs
print
print "getAttributeValue"
for o in objs:
attr = se.getAttributeValue(o, [CKA_LABEL, CKA_CLASS])
print attr
print
print "logout"
se.logout()
print
print "closeSession"
se.closeSession()
|
For decades, I have wanted to attend an RT Convention. There were always reasons why I couldn’t go. When I saw the convention would be in Reno, Nevada, just a ten-hour drive from my home, I had to attend. Happily, my friend, Meridith Allen Conners was willing to accompany me on the trip.
The convention was amazing. It was held at the Peppermill Resort. When we arrived Tuesday morning, our room wasn't ready. Of course, just as I got out of the car in Reno, I spilled tea all over my capris. But soon I was having too much fun to care. While waiting for our room, we registered and received our bag of books and fun items. There was a social gathering held by Brenda Novak. There were roulette tables and other games. After receiving more free books (this will be a recurring theme), Meredith and I wandered over to the Frazzle table. Frazzle is a dice game. The lovely Patricia joined us and was able to educate us all on the rules. Soon we were groaning and cheering each other on. Apparently, we were having too much fun. Others gathered, including two handsome cover models (This will be a recurring theme; these two gentlemen were everywhere I went. They may have been stalking me.) As others gathered, we urged them to join the fun. Newbie Cyndee Martin of Lehi, Utah, won the game! I mean that literally, she won the physical game.
After this gathering, our room was ready. Meredith and I went to our palatial room. The Peppermill Resort was an incredible bargain. We had a refrigerator in our room, which we stuffed with snacks. Our bathroom had a whirlpool tub, two sinks, a separate shower, and granite. We took time to change and then we were off to the Indie Soiree.
At the soiree, we met indie authors. The authors would give us raffle tickets to win various prizes. Meredith was insistent that we go up to a gorgeous man and ask about his book. Stuart Reardon is a beautiful man. He is a former rugby player, who now works as a cover model and author. Since Meredith was rendered speechless by his masculine beauty, I did all the talking. Sadly, I understood only a little of his responses. It was noisy, and he has a delicious accent. Also, at the Soiree, I ran into a fellow member of the Old School Romance Book Club, Eva Moore. I also ran into soon-to-be good friend Cyndee Martin, also a member of the OSRBC. Eva invited me to her hotel room for a wine tasting later that night. The soiree ended with the winning raffle tickets being drawn. I won Kathryn Le Veque's basket. (Winning things will be a recurring theme.) It contained books, chocolates, bath bombs, wine glasses, wine, and a Kindle. Oh yes, there was a tiara.
I had just enough time to stash my basket in the hotel room. It was time for Cinema Craptastique. Hosted by Cherry Adair and Damon Suede, Tere Michaels and Paige Tyler, it was a blast. Cherry Adair is a gorgeous, vivacious redhead, who should be the heroine in a romance novel. Damon is energetic and fun. He is so much fun. The movie was Wolves starring Jason Momoa and Lucas Till. The movie was wonderfully awful, and Damon kept us in stitches with his commentary. Then the worst happened. The film stopped and would not start again. No fear. Damon Suede came to the rescue. He acted out the rest of the movie. There is no doubt his version was far superior. Laughing out loud, the audience ate cracker jacks and movie candy. Meredith won her first prize during the movie. We are on a roll.
The evening ended with a wine tasting party in Eva Moore's hotel room. Eva is the hostess with the mostess. (Partying in Eva's room will be a recurring theme.) We all sampled a variety of Rosé. It was the perfect end to a perfect day.
I did not take enough pictures. Below are the beautiful Stuart Reardon and myself, the funny and talented Damon Suede, and Meridith Allen Conners and myself.
|
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models, migrations
import jsonfield.fields
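# Initial migration for the trello_webhooks app: creates the Webhook model and the
# CallbackEvent model that stores the JSON payload of each callback received.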
class Migration(migrations.Migration):
dependencies = [
]
operations = [
migrations.CreateModel(
name='CallbackEvent',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('timestamp', models.DateTimeField()),
('event_type', models.CharField(max_length=50)),
('event_payload', jsonfield.fields.JSONField(default=dict)),
],
options={
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Webhook',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('trello_model_id', models.CharField(help_text='The id of the model being watched.', max_length=24)),
('trello_id', models.CharField(help_text='Webhook id returned from Trello API.', max_length=24, blank=True)),
('description', models.CharField(help_text='Description of the webhook.', max_length=500, blank=True)),
('auth_token', models.CharField(help_text='The Trello API user auth token.', max_length=64)),
('created_at', models.DateTimeField(blank=True)),
('last_updated_at', models.DateTimeField(blank=True)),
],
options={
},
bases=(models.Model,),
),
migrations.AddField(
model_name='callbackevent',
name='webhook',
field=models.ForeignKey(to='trello_webhooks.Webhook'),
preserve_default=True,
),
]
|
2/25/07 5:30 p.m. at Comcast Center - College Park, Md.
Technical fouls: North Carolina-None. Maryland Terrapins-None.
|
# author: James Campbell
# what: example three+ categorization of tweets using nltk
# date created: November 23 2015
import nltk
import sys
from sys import exit
pos_tweets = [('I love this car', 'positive'),
('This view is amazing', 'positive'),
('I feel great this morning', 'positive'),
('I am so excited about the concert', 'positive'),
('He is my best friend', 'positive'),
('Going well', 'positive'),
('Thank you', 'positive'),
('Hope you are doing well', 'positive'),
('I am very happy', 'positive'),
('Good for you', 'positive'),
('all good. I know about it and I accept it.', 'positive'),
('This is really good!', 'positive'),
('Tomorrow is going to be fun.', 'positive'),
('Smiling all around.', 'positive'),
('These are great apples today.', 'positive'),
('How about them apples? Thomas is a happy boy.', 'positive'),
('Thomas is very zen. He is well-mannered.', 'positive'),
('happy and good lots of light!', 'positive'),
('I like this new iphone very much', 'positive')]
neg_tweets = [('I do not like this car', 'negative'),
('This view is horrible', 'negative'),
('I feel tired this morning', 'negative'),
('I am not looking forward to the concert', 'negative'),
('He is my enemy', 'negative'),
('I am a bad boy', 'negative'),
('This is not good', 'negative'),
('I am bothered by this', 'negative'),
('I am not connected with this', 'negative'),
('Sadistic creep you ass. Die.', 'negative'),
('All sorts of crazy and scary as hell.', 'negative'),
('Not his emails, no.', 'negative'),
('His father is dead. Returned obviously.', 'negative'),
('He has a bomb.', 'negative'),
('Too fast to be on foot. We cannot catch them.', 'negative'),
('Feeling so stupid stoopid stupid!', 'negative'),
(':-(( :-(', 'negative'),
('This is the worst way imaginable, all of this traffic', 'negative')]
rain_tweets = [('this rain is craze today', 'rain'),
('Nov 23 17:30 Temperature 3C no or few clouds Wind SW 6 km/h Humidity 70% France', 'rain'),
('missin climbing mountains in the rain', 'rain'),
('There are days in live broadcasting Torrential rain in Paris ', 'rain'),
('Heavy Rain today in!', 'rain'),
('Woman in the boulangerie started complaining about the rain. I said, "its better than terrorists". Need to finesse my jovial patter', 'rain'),
('Light to moderate rain over NCR', 'rain'),
('After a cold night last night, tonight will be milder and mainly frost-free, with this band of rain. Jo', 'rain'),
('But I love the rain. And it rains frequently these days~ So it makes me feel rather good', 'rain'),
('With 1000 mm rain already and more rain forecasted 4 Chennai, Nov 2015 will overtake Oct 2005 and Nov 1918 to become the Wettest Month EVER!', 'rain'),
('It is raining today. Wet!', 'rain'),
('Lots of rain today. Raining!', 'rain'),
('Why is it raining?', 'rain'),
('So much rain!', 'rain'),
('it always rains this time of year', 'rain'),
('raining', 'rain'),
('raining outside today, rained yesterday too', 'rain'),
('rainy weather today! jeez', 'rain'),
('Rain has finally extinguished a #wildfire in Olympic National Park that had been burning since May', 'rain'),
('The rain had us indoors for Thursdays celebration', 'rain'),
('Rain (hourly) 0.0 mm, Pressure: 1012 hPa, falling slowly', 'rain'),
('That aspiration yours outfit make ends meet spite of the rainy weather this midsummer?: Edb', 'rain'),
('Glasgow\'s bright lights of Gordon st tonight #rain #Glasgow', 'rain'),
('Why is it raining? Because it always rains this time of year', 'rain'),
('The forecast for this week\'s weather includes lots of rain!', 'rain'),
('Morning Has Broken: Morning has BrokenAs I sit in my warm car in between rain squalls I am looking out', 'rain'),
('Wind 2.0 mph SW. Barometer 1021.10 mb, Falling. Temperature 5.5 °C. Rain today 0.2 mm. Humidity 78%', 'rain')]
tweets = []
for (words, sentiment) in pos_tweets + neg_tweets + rain_tweets:
words_filtered = [e.lower() for e in words.split() if len(e) >= 2]
tweets.append((words_filtered, sentiment))
def get_words_in_tweets(tweets):
all_words = []
for (words, sentiment) in tweets:
all_words.extend(words)
return all_words
def get_word_features(wordlist):
wordlist = nltk.FreqDist(wordlist)
    word_features = list(wordlist.keys())  # the full vocabulary observed in the training tweets
return word_features
def extract_features(document):
document_words = set(document)
features = {}
for word in word_features:
features['contains(%s)' % word] = (word in document_words)
return features
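# Build the bag-of-words vocabulary from all training tweets, then train an
# NLTK Naive Bayes classifier on the labelled examples.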
word_features = get_word_features(get_words_in_tweets(tweets))
training_set = nltk.classify.apply_features(extract_features, tweets)
classifier = nltk.NaiveBayesClassifier.train(training_set)
runtweets = [] # setup to import a list of tweets here if you wish into a python list
if len(sys.argv) > 1:  # optionally pass the path to a text file with one tweet per line
tweetfile = sys.argv[1]
with open(tweetfile, "r") as ins:
for line in ins:
runtweets.append(line)
runtweets.append('I am a bad boy') # should be negative
runtweets.append('rain today') # should be rain
runtweets.append('so stupid') # should be negative
runtweets.append('it is raining outside') # should be rain
runtweets.append('I love it') # should be positive
runtweets.append('so good') # should be positive
poscount = 0
negcount = 0
raincount = 0
for tweett in runtweets:
valued = classifier.classify(extract_features(tweett.split()))
print(valued)
if valued == 'negative':
negcount = negcount + 1
if valued == 'positive':
poscount = poscount + 1
if valued == 'rain':
raincount = raincount + 1
print('Positive count: %s \nNegative count: %s \nRain count: %s' % (poscount, negcount, raincount))
exit()
|
Home > Blog: Resumes > How Do I Get Employers to Notice My Resume and Call Me?
Rick Saia's post yesterday ("Dude, Where's My Job? ...") listed five typical reasons a recent grad might be having trouble finding that first job. Coincidentally, as Rick was crafting that post, a reader named Jen G. left a comment on an older blog post, asking a closely related question. Jen is a new graduate who has posted her resume on various online job boards.
"Unfortunately," she wrote, "I have rarely been called back from a site that I have posted my resume on. What do I need to do in order to get employers to notice my resume so that I get called back?"
Since this seems to be a hot topic, I've added a few more tips for Jen and anyone else who's wondering the same thing.
Posting your resume for public viewing on web sites like Monster.com or CareerBuilder.com is a good first step, but it's a passive method of job searching. You have no power over whether anyone ever reads your resume or chooses to call you after reading it.
Only IF an employer uses those sites for recruiting, and only IF your qualifications match their keywords, and only IF they read your resume and decide it sounds like a good fit will you ever get a call. That's a lot of ifs.
In addition, a resume posted online is, by nature, a one-size-fits-all resume, not targeted to the specific needs of a particular employer.
So leave the resume online, because you never know. But meanwhile, make your job search active, and take the power back into YOUR hands.
Go to a site such as Indeed.com or Simplyhired.com, which are job board "aggregators," meaning they search ALL the job sites for you (including Monster, CareerBuilder, company web sites, local newspapers, etc.).
Select the keywords that are likely to be in the title or job description of your desired position, and do a search. You can narrow the search by zip code and other criteria.
You can even save your search criteria and set up a job search agent that will keep checking these sites for you daily or weekly, and email you any matches. But again, keep it active. If you're not getting good results, try different keywords or expand your search area.
When you find a job that interests you, edit your resume and cover letter so they address exact keywords and phrases you found in the job description.
For example, let's say they're looking for a "Senior Office Manager" and your resume shows you're a "Senior administrator responsible for office management." Those mean the same thing, but you'll have better luck if you edit your resume to match the employer's terminology.
You need to do that each and every time you apply for a specific job. So you'll have one "basic" resume and cover letter, but you'll tweak them for every employer. You may end up with 20 or more versions, each slightly different. It's a pain, but it's necessary.
And finally, we hope you'll backtrack through The Pongo Blog and Career Corner for lots of other hints on how to conduct an active job search that will give you the best chance of landing your dream job.
|
"""Defines classes for Shootris"""
# Copyright (C) 2016 Karel "laird Odol" Murgas
# [email protected]
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#############
# Libraries #
#############
from collections import deque
from utilities import *
#####################
# Class definitions #
#####################
class Blob:
"""Defines blob - now only for main blob - to be expanded later"""
def __init__(self, screen, field, event, speed):
self.cols = MAXCOL
self.max_rows = MAXROW
self.top = 0
self.left = 0
self.ls = LEFTSTICK
self.bs = BOTTOMSTICK
self.content = deque([])
self.generated_rows = 0
self.append_row()
self.screen = screen
self.field = field
self.speed = speed
self.timer = pyg.time.set_timer(event, speed)
self.event = event
self.row_fraction = 0
        # check for an empty bottom row
        # drop the bottom row
def get_rect(self):
"""Gets rectangle containing the blob"""
top_line = 0 if self.top == 0 else self.top * CELLSIZE - rest_of_cell(self.row_fraction)
bottom_len = len(self.content) * CELLSIZE
if self.top == 0:
bottom_len -= rest_of_cell(self.row_fraction)
return pyg.Rect(self.left * CELLSIZE, top_line, self.cols * CELLSIZE, bottom_len)
def get_bottom(self):
"""Returns row index of bottom row"""
return self.top + len(self.content) - 1 # index of top row + number of rows - correction
def create_cell(self, r, c):
"""Creates content of the cell - colors the cell regarding left and bottom neighbours"""
if c > 0 and roll(self.ls) and self.content[r][c - 1] is not None:
return self.content[r][c - 1] # color by left cell
elif r < len(self.content) - 1 and roll(self.bs) and self.content[r + 1][c] is not None:
return self.content[r + 1][c] # color by bottom cell
else:
return get_random_color() # random color
def append_row(self):
"""Appends new row to the start of the blob"""
self.content.appendleft([])
self.generated_rows += 1
for c in range(self.cols):
self.content[0].append(self.create_cell(0, c))
def clearRow(self, row):
"""Draws row as if the color was None"""
draw_blob(self.screen, self.field, [[None]*self.cols], row)
def destroy(self):
"""Ends the game and deletes the blob instance"""
self.timer = pyg.time.set_timer(self.event, 0)
pyg.event.post(pyg.event.Event(LOOSE_EVENT))
del self
def win(self):
"""Ends the game winning and deletes the blob instance"""
self.timer = pyg.time.set_timer(self.event, 0)
pyg.event.post(pyg.event.Event(WIN_EVENT))
del self
def move(self):
"""Moves the blob one pixel down, checks for blob boundaries"""
if self.get_bottom() >= FIELDLENGTH - 1 and self.row_fraction == 0:
self.destroy()
else:
if self.generated_rows < self.max_rows: # there is new line in the buffer
if self.row_fraction == 0:
self.append_row()
else: # clear the top line
self.screen.blit(pyg.Surface((self.cols * CELLSIZE, 1)), (0, self.top * CELLSIZE - rest_of_cell(self.row_fraction), self.cols * CELLSIZE, 1))
if self.row_fraction == 0:
self.top += 1
self.row_fraction = (self.row_fraction + 1) % CELLSIZE
draw_blob(self.screen, self.field, self.content, self.top, self.row_fraction)
def damage(self, r, c, color):
"""Deletes content of this cell and all direct neighbours"""
score = 0
if self.content[r][c] == color:
self.content[r][c] = None
score += 1
if c > 0 and self.content[r][c - 1] == color: # left
score += self.damage(r, c - 1, color)
if c < self.cols - 1 and self.content[r][c + 1] == color: # right
score += self.damage(r, c + 1, color)
if r > 0 and self.content[r - 1][c] == color: # top
score += self.damage(r - 1, c, color)
if r < len(self.content) - 1 and self.content[r + 1][c] == color: # bottom
score += self.damage(r + 1, c, color)
return score
def hit(self, c, r, color):
"""Determines, if hit was success. If it was, deletes cells and checks and pop out empty bottom rows"""
if self.content[r][c] == color:
if SOUND_EFFECTS_ON:
sound_hit_success.play()
score = self.damage(r, c, color)
draw_blob(self.screen, self.field, self.content, self.top, self.row_fraction)
while len(self.content) > 0 and self.content[len(self.content) - 1] == [None] * self.cols:
self.content.pop()
if len(self.content) == 0 and self.max_rows == self.generated_rows:
self.win()
return score
else:
if SOUND_EFFECTS_ON:
                if self.content[r][c] is not None:
sound_hit_fail.play()
else:
sound_miss.play()
return 0
class Infopanel:
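    """Side panel showing the score, highscore, flashing status messages and gameplay tips."""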
def __init__(self, screen):
self.position = INFO_FIELD
        self.score_position = INFO_FIELD[1] + 5 * CELLSIZE
self.highscore_position = INFO_FIELD[1] + 7 * CELLSIZE
self.text_position = INFO_FIELD[1] + FIELDLENGTH * CELLSIZE / 2
self.text_flash_position = INFO_FIELD[1] + (FIELDLENGTH + 2) * CELLSIZE / 2
self.tips_header_position = INFO_FIELD[1] + (FIELDLENGTH + 8) * CELLSIZE / 2
self.tips_text_position = INFO_FIELD[1] + (FIELDLENGTH + 10) * CELLSIZE / 2
self.text_flesh_visible = True
self.score = 0
self.highscore = 0
self.screen = screen
self.flash_timer = pyg.time.set_timer(FLASH_EVENT, TEXT_FLESH_TIME)
self.tips_timer = pyg.time.set_timer(TIPS_EVENT, TIPS_TIME)
def write(self, text, surf_top, surf_left=INFO_FIELD[0] + CELLSIZE, surf_size=((INFOWIDTH - 1) * CELLSIZE, CELLSIZE), color=WHITE, size=CELLSIZE):
font = pyg.font.SysFont(pyg.font.get_default_font(), size)
surf_start = (surf_left, surf_top)
self.screen.blit(pyg.Surface(surf_size), surf_start)
self.screen.blit(font.render(text, 1, color), surf_start)
pyg.display.update(pyg.Rect(surf_start, surf_size))
def message(self, text):
self.write(text, self.text_position)
def message_flash(self, text):
self.write(text, self.text_flash_position)
def message_tips_header(self, text):
self.write(text, self.tips_header_position)
def message_tips(self, text):
self.write(text, self.tips_text_position, size=(CELLSIZE * 4) // 5)
def add_score(self, score):
self.score += score
col = WHITE if self.score < self.highscore else GREEN
self.write('SCORE: ' + str(self.score), self.score_position, color=col)
def resetscore(self):
if self.score >= self.highscore:
self.write('HIGHSCORE: ' + str(self.score), self.highscore_position, color=RED)
self.highscore = self.score
self.score = 0
self.write('SCORE: 0', self.score_position)
class Magazine:
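    """Queue of coloured 'bullets' the player can shoot; replenished on a timer up to max_ammo."""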
def __init__(self, screen, max_ammo=MAXAMMO, event=ADD_AMMO_EVENT, speed=AMMO_REPLENISH_SPEED):
self.maxammo = max_ammo
self.screen = screen
self.position = pyg.Rect(INFO_FIELD[0] + CELLSIZE, INFO_FIELD[1] + CELLSIZE, (INFOWIDTH - 1) * CELLSIZE, 2 * CELLSIZE)
self.content = deque([])
self.add_ammo()
self.event = event
self.timer = pyg.time.set_timer(event, speed)
def add_ammo(self):
if len(self.content) < self.maxammo:
self.content.append(get_random_color())
self.draw()
def color_bullet(self, cell, color):
"""Colors one 'bullet' cell"""
cell.fill(color)
return cell
def draw(self):
self.screen.blit(pyg.Surface(((INFOWIDTH - 1) * CELLSIZE, 2 * CELLSIZE)), self.position)
cell = pyg.Surface((2 * CELLSIZE, 2 * CELLSIZE))
for i, color in enumerate(self.content):
self.screen.blit(self.color_bullet(cell, color), (INFO_FIELD[0] + (1 + 2 * i) * CELLSIZE, INFO_FIELD[1] + CELLSIZE))
pyg.display.update(self.position)
def shoot(self):
if not self.is_empty():
bullet = self.content.popleft()
self.draw()
return bullet
else:
if SOUND_EFFECTS_ON:
sound_empty.play()
return None
def destroy(self):
self.timer = pyg.time.set_timer(self.event, 0)
self.content = deque([])
self.draw()
del self
def is_empty(self):
return len(self.content) == 0
def reload(self):
if not self.is_empty():
if SOUND_EFFECTS_ON:
sound_reload.play()
bullet = self.content.popleft()
self.content.append(bullet)
self.draw()
|
Bali Villa Kebun is built on a fantastic and quiet spot in Seminyak, merely a few minutes' walk from the well-loved white sandy beach. It is a tropical rental with three ensuite bedrooms, perfect for spending your well-deserved vacation with your family or friends. Numerous popular holiday attractions are within easy reach: the beach, dining establishments, the celebrated Seminyak nightlife, shops, health spas and souvenirs. The villa is quiet and undisturbed, with an exclusive, security-managed access path.
Within just 20-40 minutes you can reach the international airport, the gorgeous surf beaches of Canggu, the idyllic temple site of Tanah Lot and the renowned Nirwana golf course. Ubud is approximately a 50-minute drive away.
The attractive Seminyak beach is just a few minutes' stroll from the villa. On the beach you can enjoy a cocktail or a gourmet dinner in one of the beachfront restaurants, or take in the amazing sunsets this part of Bali is famous for.
The holiday villa was built in 2007 in a combined Balinese / contemporary style. The Seminyak villa features a main house with an open living room, kitchen and a master bedroom, plus two extra bedrooms situated in separate guest pavilions. Each of these pavilions includes a bedroom and an en-suite bathroom with an open-air bath. The tropical garden offers places for unwinding, including a patio and a sundeck with sun-loungers around the swimming pool. Moreover, the villa is enclosed by a raised wall for your privacy and security. Bali Villa Kebun is one of our popular rentals and a favourite with our guests, so it is advisable to book your villa holiday in advance to avoid the risk of the villa already being booked for your preferred dates.
– Three en-suite bedrooms, two with king-size beds and one with a queen-size bed.
– Large LCD TV with 50 international channels, movies, news, cartoons, sports.
|
import time
import webbrowser
import json
import wx
import requests
from service.port import Port
from service.fit import Fit
from eos.types import Cargo
from eos.db import getItem
from gui.display import Display
import gui.globalEvents as GE
if 'wxMac' not in wx.PlatformInfo or ('wxMac' in wx.PlatformInfo and wx.VERSION >= (3, 0)):
from service.crest import Crest, CrestModes
class CrestFittings(wx.Frame):
def __init__(self, parent):
wx.Frame.__init__(self, parent, id=wx.ID_ANY, title="Browse EVE Fittings", pos=wx.DefaultPosition,
size=wx.Size(550, 450), style=wx.DEFAULT_FRAME_STYLE | wx.TAB_TRAVERSAL)
self.SetBackgroundColour(wx.SystemSettings.GetColour(wx.SYS_COLOUR_BTNFACE))
self.mainFrame = parent
mainSizer = wx.BoxSizer(wx.VERTICAL)
sCrest = Crest.getInstance()
characterSelectSizer = wx.BoxSizer(wx.HORIZONTAL)
if sCrest.settings.get('mode') == CrestModes.IMPLICIT:
self.stLogged = wx.StaticText(self, wx.ID_ANY, "Currently logged in as %s" % sCrest.implicitCharacter.name,
wx.DefaultPosition, wx.DefaultSize)
self.stLogged.Wrap(-1)
characterSelectSizer.Add(self.stLogged, 1, wx.ALIGN_CENTER_VERTICAL | wx.ALL, 5)
else:
self.charChoice = wx.Choice(self, wx.ID_ANY, wx.DefaultPosition, wx.DefaultSize, [])
characterSelectSizer.Add(self.charChoice, 1, wx.ALIGN_CENTER_VERTICAL | wx.ALL, 5)
self.updateCharList()
self.fetchBtn = wx.Button(self, wx.ID_ANY, u"Fetch Fits", wx.DefaultPosition, wx.DefaultSize, 5)
characterSelectSizer.Add(self.fetchBtn, 0, wx.ALL, 5)
mainSizer.Add(characterSelectSizer, 0, wx.EXPAND, 5)
self.sl = wx.StaticLine(self, wx.ID_ANY, wx.DefaultPosition, wx.DefaultSize, wx.LI_HORIZONTAL)
mainSizer.Add(self.sl, 0, wx.EXPAND | wx.ALL, 5)
contentSizer = wx.BoxSizer(wx.HORIZONTAL)
browserSizer = wx.BoxSizer(wx.VERTICAL)
self.fitTree = FittingsTreeView(self)
browserSizer.Add(self.fitTree, 1, wx.ALL | wx.EXPAND, 5)
contentSizer.Add(browserSizer, 1, wx.EXPAND, 0)
fitSizer = wx.BoxSizer(wx.VERTICAL)
self.fitView = FitView(self)
fitSizer.Add(self.fitView, 1, wx.ALL | wx.EXPAND, 5)
btnSizer = wx.BoxSizer(wx.HORIZONTAL)
self.importBtn = wx.Button(self, wx.ID_ANY, u"Import to pyfa", wx.DefaultPosition, wx.DefaultSize, 5)
self.deleteBtn = wx.Button(self, wx.ID_ANY, u"Delete from EVE", wx.DefaultPosition, wx.DefaultSize, 5)
btnSizer.Add(self.importBtn, 1, wx.ALL, 5)
btnSizer.Add(self.deleteBtn, 1, wx.ALL, 5)
fitSizer.Add(btnSizer, 0, wx.EXPAND)
contentSizer.Add(fitSizer, 1, wx.EXPAND, 0)
mainSizer.Add(contentSizer, 1, wx.EXPAND, 5)
self.fetchBtn.Bind(wx.EVT_BUTTON, self.fetchFittings)
self.importBtn.Bind(wx.EVT_BUTTON, self.importFitting)
self.deleteBtn.Bind(wx.EVT_BUTTON, self.deleteFitting)
self.mainFrame.Bind(GE.EVT_SSO_LOGOUT, self.ssoLogout)
self.mainFrame.Bind(GE.EVT_SSO_LOGIN, self.ssoLogin)
self.Bind(wx.EVT_CLOSE, self.OnClose)
self.statusbar = wx.StatusBar(self)
self.statusbar.SetFieldsCount()
self.SetStatusBar(self.statusbar)
self.cacheTimer = wx.Timer(self)
self.Bind(wx.EVT_TIMER, self.updateCacheStatus, self.cacheTimer)
self.SetSizer(mainSizer)
self.Layout()
self.Centre(wx.BOTH)
def ssoLogin(self, event):
self.updateCharList()
event.Skip()
def updateCharList(self):
sCrest = Crest.getInstance()
chars = sCrest.getCrestCharacters()
if len(chars) == 0:
self.Close()
self.charChoice.Clear()
for char in chars:
self.charChoice.Append(char.name, char.ID)
self.charChoice.SetSelection(0)
    def updateCacheStatus(self, event):
        # Show how long the fetched fittings remain cached; stop the timer once the cache expires.
        remaining = self.cacheTime - time.time()
        if remaining < 0:
            self.cacheTimer.Stop()
        else:
            sTime = time.strftime("%H:%M:%S", time.gmtime(remaining))
            self.statusbar.SetStatusText("Cached for %s" % sTime, 0)
def ssoLogout(self, event):
if event.type == CrestModes.IMPLICIT:
self.Close()
else:
self.updateCharList()
event.Skip() # continue event
def OnClose(self, event):
self.mainFrame.Unbind(GE.EVT_SSO_LOGOUT, handler=self.ssoLogout)
self.mainFrame.Unbind(GE.EVT_SSO_LOGIN, handler=self.ssoLogin)
event.Skip()
def getActiveCharacter(self):
sCrest = Crest.getInstance()
if sCrest.settings.get('mode') == CrestModes.IMPLICIT:
return sCrest.implicitCharacter.ID
selection = self.charChoice.GetCurrentSelection()
return self.charChoice.GetClientData(selection) if selection is not None else None
def fetchFittings(self, event):
sCrest = Crest.getInstance()
try:
waitDialog = wx.BusyInfo("Fetching fits, please wait...", parent=self)
fittings = sCrest.getFittings(self.getActiveCharacter())
self.cacheTime = fittings.get('cached_until')
self.updateCacheStatus(None)
self.cacheTimer.Start(1000)
self.fitTree.populateSkillTree(fittings)
except requests.exceptions.ConnectionError:
self.statusbar.SetStatusText("Connection error, please check your internet connection")
finally:
del waitDialog
def importFitting(self, event):
selection = self.fitView.fitSelection
if not selection:
return
data = self.fitTree.fittingsTreeCtrl.GetPyData(selection)
sPort = Port.getInstance()
fits = sPort.importFitFromBuffer(data)
self.mainFrame._openAfterImport(fits)
def deleteFitting(self, event):
sCrest = Crest.getInstance()
selection = self.fitView.fitSelection
if not selection:
return
data = json.loads(self.fitTree.fittingsTreeCtrl.GetPyData(selection))
dlg = wx.MessageDialog(self,
"Do you really want to delete %s (%s) from EVE?" % (data['name'], data['ship']['name']),
"Confirm Delete", wx.YES | wx.NO | wx.ICON_QUESTION)
if dlg.ShowModal() == wx.ID_YES:
try:
sCrest.delFitting(self.getActiveCharacter(), data['fittingID'])
except requests.exceptions.ConnectionError:
self.statusbar.SetStatusText("Connection error, please check your internet connection")
class ExportToEve(wx.Frame):
def __init__(self, parent):
wx.Frame.__init__(self, parent, id=wx.ID_ANY, title="Export fit to EVE", pos=wx.DefaultPosition,
size=(wx.Size(350, 100)), style=wx.DEFAULT_FRAME_STYLE | wx.TAB_TRAVERSAL)
self.mainFrame = parent
self.SetBackgroundColour(wx.SystemSettings.GetColour(wx.SYS_COLOUR_BTNFACE))
sCrest = Crest.getInstance()
mainSizer = wx.BoxSizer(wx.VERTICAL)
hSizer = wx.BoxSizer(wx.HORIZONTAL)
if sCrest.settings.get('mode') == CrestModes.IMPLICIT:
self.stLogged = wx.StaticText(self, wx.ID_ANY, "Currently logged in as %s" % sCrest.implicitCharacter.name,
wx.DefaultPosition, wx.DefaultSize)
self.stLogged.Wrap(-1)
hSizer.Add(self.stLogged, 1, wx.ALIGN_CENTER_VERTICAL | wx.ALL, 5)
else:
self.charChoice = wx.Choice(self, wx.ID_ANY, wx.DefaultPosition, wx.DefaultSize, [])
hSizer.Add(self.charChoice, 1, wx.ALIGN_CENTER_VERTICAL | wx.ALL, 5)
self.updateCharList()
self.charChoice.SetSelection(0)
self.exportBtn = wx.Button(self, wx.ID_ANY, u"Export Fit", wx.DefaultPosition, wx.DefaultSize, 5)
hSizer.Add(self.exportBtn, 0, wx.ALL, 5)
mainSizer.Add(hSizer, 0, wx.EXPAND, 5)
self.exportBtn.Bind(wx.EVT_BUTTON, self.exportFitting)
self.statusbar = wx.StatusBar(self)
self.statusbar.SetFieldsCount(2)
self.statusbar.SetStatusWidths([100, -1])
self.mainFrame.Bind(GE.EVT_SSO_LOGOUT, self.ssoLogout)
self.mainFrame.Bind(GE.EVT_SSO_LOGIN, self.ssoLogin)
self.Bind(wx.EVT_CLOSE, self.OnClose)
self.SetSizer(hSizer)
self.SetStatusBar(self.statusbar)
self.Layout()
self.Centre(wx.BOTH)
def updateCharList(self):
sCrest = Crest.getInstance()
chars = sCrest.getCrestCharacters()
if len(chars) == 0:
self.Close()
self.charChoice.Clear()
for char in chars:
self.charChoice.Append(char.name, char.ID)
self.charChoice.SetSelection(0)
def ssoLogin(self, event):
self.updateCharList()
event.Skip()
def ssoLogout(self, event):
if event.type == CrestModes.IMPLICIT:
self.Close()
else:
self.updateCharList()
event.Skip() # continue event
def OnClose(self, event):
self.mainFrame.Unbind(GE.EVT_SSO_LOGOUT, handler=self.ssoLogout)
event.Skip()
def getActiveCharacter(self):
sCrest = Crest.getInstance()
if sCrest.settings.get('mode') == CrestModes.IMPLICIT:
return sCrest.implicitCharacter.ID
selection = self.charChoice.GetCurrentSelection()
return self.charChoice.GetClientData(selection) if selection is not None else None
def exportFitting(self, event):
sPort = Port.getInstance()
fitID = self.mainFrame.getActiveFit()
self.statusbar.SetStatusText("", 0)
if fitID is None:
self.statusbar.SetStatusText("Please select an active fitting in the main window", 1)
return
self.statusbar.SetStatusText("Sending request and awaiting response", 1)
sCrest = Crest.getInstance()
try:
sFit = Fit.getInstance()
data = sPort.exportCrest(sFit.getFit(fitID))
res = sCrest.postFitting(self.getActiveCharacter(), data)
self.statusbar.SetStatusText("%d: %s" % (res.status_code, res.reason), 0)
try:
text = json.loads(res.text)
self.statusbar.SetStatusText(text['message'], 1)
except ValueError:
self.statusbar.SetStatusText("", 1)
except requests.exceptions.ConnectionError:
self.statusbar.SetStatusText("Connection error, please check your internet connection", 1)
class CrestMgmt(wx.Dialog):
def __init__(self, parent):
wx.Dialog.__init__(self, parent, id=wx.ID_ANY, title="CREST Character Management", pos=wx.DefaultPosition,
size=wx.Size(550, 250), style=wx.DEFAULT_DIALOG_STYLE)
self.mainFrame = parent
mainSizer = wx.BoxSizer(wx.HORIZONTAL)
self.lcCharacters = wx.ListCtrl(self, wx.ID_ANY, wx.DefaultPosition, wx.DefaultSize, wx.LC_REPORT)
self.lcCharacters.InsertColumn(0, heading='Character')
self.lcCharacters.InsertColumn(1, heading='Refresh Token')
self.popCharList()
mainSizer.Add(self.lcCharacters, 1, wx.ALL | wx.EXPAND, 5)
btnSizer = wx.BoxSizer(wx.VERTICAL)
self.addBtn = wx.Button(self, wx.ID_ANY, u"Add Character", wx.DefaultPosition, wx.DefaultSize, 0)
btnSizer.Add(self.addBtn, 0, wx.ALL | wx.EXPAND, 5)
self.deleteBtn = wx.Button(self, wx.ID_ANY, u"Revoke Character", wx.DefaultPosition, wx.DefaultSize, 0)
btnSizer.Add(self.deleteBtn, 0, wx.ALL | wx.EXPAND, 5)
mainSizer.Add(btnSizer, 0, wx.EXPAND, 5)
self.addBtn.Bind(wx.EVT_BUTTON, self.addChar)
self.deleteBtn.Bind(wx.EVT_BUTTON, self.delChar)
self.mainFrame.Bind(GE.EVT_SSO_LOGIN, self.ssoLogin)
self.SetSizer(mainSizer)
self.Layout()
self.Centre(wx.BOTH)
def ssoLogin(self, event):
self.popCharList()
event.Skip()
def popCharList(self):
sCrest = Crest.getInstance()
chars = sCrest.getCrestCharacters()
self.lcCharacters.DeleteAllItems()
for index, char in enumerate(chars):
self.lcCharacters.InsertStringItem(index, char.name)
self.lcCharacters.SetStringItem(index, 1, char.refresh_token)
self.lcCharacters.SetItemData(index, char.ID)
self.lcCharacters.SetColumnWidth(0, wx.LIST_AUTOSIZE)
self.lcCharacters.SetColumnWidth(1, wx.LIST_AUTOSIZE)
def addChar(self, event):
sCrest = Crest.getInstance()
uri = sCrest.startServer()
webbrowser.open(uri)
def delChar(self, event):
item = self.lcCharacters.GetFirstSelected()
if item > -1:
charID = self.lcCharacters.GetItemData(item)
sCrest = Crest.getInstance()
sCrest.delCrestCharacter(charID)
self.popCharList()
class FittingsTreeView(wx.Panel):
def __init__(self, parent):
wx.Panel.__init__(self, parent, id=wx.ID_ANY)
self.parent = parent
pmainSizer = wx.BoxSizer(wx.VERTICAL)
tree = self.fittingsTreeCtrl = wx.TreeCtrl(self, wx.ID_ANY, style=wx.TR_DEFAULT_STYLE | wx.TR_HIDE_ROOT)
pmainSizer.Add(tree, 1, wx.EXPAND | wx.ALL, 0)
self.root = tree.AddRoot("Fits")
self.populateSkillTree(None)
self.Bind(wx.EVT_TREE_ITEM_ACTIVATED, self.displayFit)
self.SetSizer(pmainSizer)
self.Layout()
def populateSkillTree(self, data):
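        """Fill the tree with the fetched fittings, grouped by ship name."""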
if data is None:
return
root = self.root
tree = self.fittingsTreeCtrl
tree.DeleteChildren(root)
        ships = {}
        fits = data['items']
        for fit in fits:
            if fit['ship']['name'] not in ships:
                ships[fit['ship']['name']] = []
            ships[fit['ship']['name']].append(fit)
        for name, fits in ships.iteritems():
shipID = tree.AppendItem(root, name)
for fit in fits:
fitId = tree.AppendItem(shipID, fit['name'])
tree.SetPyData(fitId, json.dumps(fit))
tree.SortChildren(root)
def displayFit(self, event):
selection = self.fittingsTreeCtrl.GetSelection()
fit = json.loads(self.fittingsTreeCtrl.GetPyData(selection))
        cargo_list = []
        for item in fit['items']:
            try:
                cargo = Cargo(getItem(item['type']['id']))
                cargo.amount = item['quantity']
                cargo_list.append(cargo)
            except:
                # skip items that cannot be resolved to a cargo entry
                pass
        self.parent.fitView.fitSelection = selection
        self.parent.fitView.update(cargo_list)
class FitView(Display):
DEFAULT_COLS = ["Base Icon",
"Base Name"]
def __init__(self, parent):
Display.__init__(self, parent, style=wx.LC_SINGLE_SEL)
self.fitSelection = None
|
Koforidua, the Eastern Regional capital, last Saturday, 5th January 2019, revisited the monthly sanitation exercise.
The exercise, which was started under the previous administration, had been brought to a halt due to some circumstances. Despite its challenges, the management of the Assembly decided to revisit it to curb the indiscriminate disposal of refuse.
Personnel from the Ghana Police, Immigration, Ghana Fire Service and Ghana Prisons Service were not left out, helping to enforce the bye-laws of the Assembly.
Other participants included Assembly Members, staff of the Assembly, businessmen and women, political party faithfuls, traditional leaders within the Municipality and taxi drivers.
As early as 6:30 am, all the principal streets within the Municipality were blocked off to make way for an effective sanitation exercise. With full support from the security services and Zoomlion Ghana, all shops, kiosks, containers and other businesses were also brought to a total halt until 10:30 am.
This initiative energized the citizenry, particularly the teeming youth of the New Juaben South Municipality, to actively support the worthy cause.
Addressing the media, the Municipal Chief Executive, Isaac Appaw-Gyasi, commended the traditional chiefs and people of Juaben for their enthusiastic support during the exercise. He said Koforidua and its environs are engulfed in filth, which, to him, does not befit its municipal status.
As a result, management and the Assembly Members, at their recent meetings, gave approval to revisit and implement the sanitation exercise.
He further explained that the Assembly cannot operate and implement any meaningful development without seeking its mandate from the Assembly Members, hence the approval.
“This initiative is part of the President’s vision to rid the country of filth, and as an Assembly we owe it a duty to implement any good policy that is in line with and ultimately supports the government’s agenda,” he explained.
Mr. Appaw-Gyasi emphasized that sanitation, for decades now, has become a canker on the neck of the country and it is time for Ghanaians to join hands to fight this menace.
“In fact, let me state it clearly that the Assembly alone cannot fight this battle and win; it needs the collaborative effort of all citizens, devoid of partisan colours, to fight and win. I need you all to help clean Koforidua since it is what we have now,” he appealed.
He assured the public that this exercise will not be a one-day wonder; rather, the Assembly is going to repeat it monthly, and alongside it a sanitation court is also going to be revived to punish offenders under the law. He maintained that the sanitation court will not compromise its activities for anybody until the culprit is fined or punished under the law.
Mr. Appaw-Gyasi further warned Nananom, opinion leaders and other stakeholders to do away with any conduct that seems to jeopardize the work of the sanitation court; for that matter, the court will not deal or work with any ‘human face’.
“Things have to change, individuals have to change their behaviour towards sanitation, and New Juaben South should move forward in development. Let’s work together,” he emphasized.
|
#!/usr/bin/env python3
# Copyright (c) 2008-11 Qtrac Ltd. All rights reserved.
# This program or module is free software: you can redistribute it and/or
# modify it under the terms of the GNU General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version. It is provided for educational
# purposes and is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# General Public License for more details.
def indented_list_sort(indented_list, indent=" "):
"""Returns an alphabetically sorted copy of the given list
The indented list is assumed to be a list of strings in a
hierarchy with indentation used to indicate child items.
The indent parameter specifies the characters that constitute
one level of indent.
The function copies the list, and returns it sorted in
case-insensitive alphabetical order, with child items sorted
underneath their parent items, and so on with grandchild items,
and so on recursively to any level of depth.
>>> indented_list = ["M", " MX", " MG", "D", " DA", " DF",\
" DFX", " DFK", " DFB", " DC", "K", "X", "H", " HJ",\
" HB", "A"]
>>>
>>> indented_list = indented_list_sort(indented_list, " ")
>>> indented_list[:8]
['A', 'D', ' DA', ' DC', ' DF', ' DFB', ' DFK', ' DFX']
>>> indented_list[8:]
['H', ' HB', ' HJ', 'K', 'M', ' MG', ' MX', 'X']
"""
KEY, ITEM, CHILDREN = range(3)
def add_entry(level, key, item, children):
if level == 0:
children.append((key, item, []))
else:
add_entry(level - 1, key, item, children[-1][CHILDREN])
def update_indented_list(entry):
indented_list.append(entry[ITEM])
for subentry in sorted(entry[CHILDREN]):
update_indented_list(subentry)
entries = []
for item in indented_list:
level = 0
i = 0
while item.startswith(indent, i):
i += len(indent)
level += 1
key = item.strip().lower()
add_entry(level, key, item, entries)
indented_list = []
for entry in sorted(entries):
update_indented_list(entry)
return indented_list
def indented_list_sort_local(indented_list, indent=" "):
"""
Given an indented list, i.e., a list of items with indented
subitems, sorts the items, and the subitems within each item (and so
on recursively) in case-insensitive alphabetical order.
>>> indented_list = ["M", " MX", " MG", "D", " DA", " DF", " DFX", \
" DFK", " DFB", " DC", "K", "X", "H", " HJ", " HB", "A"]
>>>
>>> indented_list = indented_list_sort_local(indented_list, " ")
>>> indented_list[:8]
['A', 'D', ' DA', ' DC', ' DF', ' DFB', ' DFK', ' DFX']
>>> indented_list[8:]
['H', ' HB', ' HJ', 'K', 'M', ' MG', ' MX', 'X']
"""
KEY, ITEM, CHILDREN = range(3)
def add_entry(key, item, children):
nonlocal level
if level == 0:
children.append((key, item, []))
else:
level -= 1
add_entry(key, item, children[-1][CHILDREN])
def update_indented_list(entry):
indented_list.append(entry[ITEM])
for subentry in sorted(entry[CHILDREN]):
update_indented_list(subentry)
entries = []
for item in indented_list:
level = 0
i = 0
while item.startswith(indent, i):
i += len(indent)
level += 1
key = item.strip().lower()
add_entry(key, item, entries)
indented_list = []
for entry in sorted(entries):
update_indented_list(entry)
return indented_list
if __name__ == "__main__":
before = ["Nonmetals",
" Hydrogen",
" Carbon",
" Nitrogen",
" Oxygen",
"Inner Transitionals",
" Lanthanides",
" Cerium",
" Europium",
" Actinides",
" Uranium",
" Curium",
" Plutonium",
"Alkali Metals",
" Lithium",
" Sodium",
" Potassium"]
result1 = indented_list_sort(before)
result2 = indented_list_sort_local(before)
after = ["Alkali Metals",
" Lithium",
" Potassium",
" Sodium",
"Inner Transitionals",
" Actinides",
" Curium",
" Plutonium",
" Uranium",
" Lanthanides",
" Cerium",
" Europium",
"Nonmetals",
" Carbon",
" Hydrogen",
" Nitrogen",
" Oxygen"]
assert result1 == result2 == after
import doctest
doctest.testmod()
|
The show-stopper – deep shell edition. Red Onyx – selected as the next new finish at the 2018 Winter NAMM Show. Rogers’ most revered wood-shell Dyna-Sonic – the Beavertail series. Featuring a 6.5″ x 14″ extended-depth, proprietary-formula shell complete with original-spec reinforcement rings. Faithfully appointed with the second-generation lug profile, chrome script logo and oval badge, and finished in fiery Red Onyx. At the heart of the drum is Rogers’ legendary floating snare rail system, which provides extraordinary sensitivity and control that has never been duplicated. Hand-built in Rogers’ USA workshop, the Dyna-Sonic No. 37-RO is another masterpiece of art and sonic excellence.
|
# +-----------------------------------------------------------------------+
# | IMAGESTION |
# | |
# | Copyright (C) 2010-Today, GNUCHILE.CL - Santiago de Chile |
# | Licensed under the GNU GPL |
# | |
# | Redistribution and use in source and binary forms, with or without |
# | modification, are permitted provided that the following conditions |
# | are met: |
# | |
# | o Redistributions of source code must retain the above copyright |
# | notice, this list of conditions and the following disclaimer. |
# | o Redistributions in binary form must reproduce the above copyright |
# | notice, this list of conditions and the following disclaimer in the |
# | documentation and/or other materials provided with the distribution.|
# | o The names of the authors may not be used to endorse or promote |
# | products derived from this software without specific prior written |
# | permission. |
# | |
# | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS |
# | "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT |
# | LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR |
# | A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT |
# | OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, |
# | SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT |
# | LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, |
# | DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY |
# | THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT |
# | (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE |
# | OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. |
# | |
# +-----------------------------------------------------------------------+
# | Author: Miguel Vargas Welch <[email protected]> |
# +-----------------------------------------------------------------------+
#import Image
from PIL import Image
import thread
from datetime import datetime
import multiprocessing as mp
import ctypes as c
import numpy as np
# pip install image
# pip install Pillow
## Supporting references:
## http://www.pythonware.com/library/pil/handbook/introduction.htm
## http://www.pythonware.com/library/pil/handbook/image.htm
## http://www.tutorialspoint.com/python/python_multithreading.htm
## http://ostermiller.org/dilate_and_erode.html
class Imagen(object):
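    """Wraps a PIL image, splits it into R/G/B channels and applies dilate/erode morphological filters over them with worker processes."""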
def __init__(self,ruta):
self.path = ruta
self.busy = 0
self.reload()
self.manager = mp.Manager()
pass
def reload(self):
self.RGB = Image.open(self.path)
self.ancho, self.alto = self.RGB.size
self.R, self.G, self.B = self.RGB.split()
pass
def dilate(self):
self.busy = 1
try:
R = [self.R, self.R.copy()]
G = [self.G, self.G.copy()]
B = [self.B, self.B.copy()]
lsArgs = [
(R, 0, 0, self.alto, self.ancho),
(G, 0, 0, self.alto, self.ancho),
(B, 0, 0, self.alto, self.ancho)
]
            # one worker process per colour channel
            processes = [mp.Process(target=self._dilate, args=args) for args in lsArgs]
for p in processes:
p.start()
# Exit the completed processes
for p in processes:
p.join()
# thread.start_new_thread( self._dilate, (R, 0, 0, self.alto, self.ancho) )
# thread.start_new_thread( self._dilate, (G, 0, 0, self.alto, self.ancho) )
# thread.start_new_thread( self._dilate, (B, 0, 0, self.alto, self.ancho) )
except:
print "Error: unable to start thread dilate"
self.busy = 0
# while self.busy > 0:
# pass
self.R = R[1]
self.G = G[1]
self.B = B[1]
print self.busy
def _dilate(self, lst, y1, x1, y2, x2):
"""
@return :
@author Miguelote
"""
id = self.busy
ancho = x2 - x1
alto = y2 - y1
if ancho > 100 and alto > 100:
difX = ancho % 2
difY = alto % 2
width = ancho // 2 if(difX > 0) else ancho / 2
height = alto // 2 if(difY > 0) else alto / 2
#print [id, '-', alto, ancho, '-', y1,x1, y2,x2]
lsArgs = [
(lst, y1, x1, y2 - height, x2 - width),
(lst, y1, x1 + width, y2 - height, x2),
(lst, y1 + height, x1, y2, x2 - width),
(lst, y1 + height, x1 + width, y2, x2)
]
            # one worker process per quadrant
            processes = [mp.Process(target=self._dilate, args=args) for args in lsArgs]
for p in processes:
p.start()
# Exit the completed processes
for p in processes:
p.join()
# try:
# thread.start_new_thread( self._dilate, (lst, y1, x1, y2-height, x2-width) )
# thread.start_new_thread( self._dilate, (lst, y1, x1+width, y2-height, x2) )
# thread.start_new_thread( self._dilate, (lst, y1+height, x1, y2, x2-width) )
# thread.start_new_thread( self._dilate, (lst, y1+height, x1+width, y2, x2) )
# except:
# print "Error: unable to start thread _dilate"
# print [id, alto, ancho]
# print [y1, x1, y2-height, x2-width]
# print [y1, x1+width, y2-height, x2]
# print [y1+height, x1, y2, x2-width]
# print [y1+height, x1+width, y2, x2]
# print self.busy
else:
img, copia = lst
self.busy = self.busy + 1
start = datetime.now()
print [id, '-' ,self.busy, '_dilate' , alto, ancho]
for y in xrange(y1,y2):
for x in xrange(x1,x2):
punto = img.getpixel((x,y))
##norte = im.getpixel((x,y-1))
##sur = im.getpixel((x,y+1))
##este = im.getpixel((x+1,y))
##oeste = im.getpixel((x-1,y))
if y>0 and punto>img.getpixel((x,y-1)):
lst[1].putpixel((x,y-1),punto)
if x>0 and punto>img.getpixel((x-1,y)):
lst[1].putpixel((x-1,y),punto)
if y<self.alto-1 and punto>img.getpixel((x,y+1)):
lst[1].putpixel((x,y+1),punto)
if x<self.ancho-1 and punto>img.getpixel((x+1,y)):
lst[1].putpixel((x+1,y),punto)
if y>0 and x>0 and punto>img.getpixel((x-1,y-1)):
lst[1].putpixel((x-1,y-1),punto)
if y<self.alto-1 and x>0 and punto>img.getpixel((x-1,y+1)):
lst[1].putpixel((x-1,y+1),punto)
if y>0 and x<self.ancho-1 and punto>img.getpixel((x+1,y-1)):
lst[1].putpixel((x+1,y-1),punto)
if y<self.alto-1 and x<self.ancho-1 and punto>img.getpixel((x+1,y+1)):
lst[1].putpixel((x+1,y+1),punto)
stop = datetime.now()
delay = stop - start
print [id, '-' ,self.busy, "fin", delay]
self.busy = self.busy -1
if self.busy == 1:
self.busy = 0
def erode(self):
self.busy = 1
        # work on (original, copy) pairs per colour channel, mirroring dilate()
        R = [self.R, self.R.copy()]
        G = [self.G, self.G.copy()]
        B = [self.B, self.B.copy()]
lsArgs = [
(R, 0, 0, self.alto, self.ancho),
(G, 0, 0, self.alto, self.ancho),
(B, 0, 0, self.alto, self.ancho)
]
        # one worker process per colour channel
        processes = [mp.Process(target=self._erode, args=args) for args in lsArgs]
for p in processes:
p.start()
# Exit the completed processes
for p in processes:
p.join()
# try:
# R = [self.R, self.R.copy()]
# G = [self.G, self.G.copy()]
# B = [self.B, self.B.copy()]
# thread.start_new_thread( self._erode, (R, 0, 0, self.alto, self.ancho) )
# thread.start_new_thread( self._erode, (G, 0, 0, self.alto, self.ancho) )
# thread.start_new_thread( self._erode, (B, 0, 0, self.alto, self.ancho) )
# except:
# print "Error: unable to start thread erode"
# self.busy = 0
#
# while self.busy > 0:
# pass
self.R = R[1]
self.G = G[1]
self.B = B[1]
print self.busy
def _erode(self, lst, y1, x1, y2, x2):
"""
@return :
@author Miguelote
"""
id = self.busy
ancho = x2 - x1
alto = y2 - y1
if ancho > 100 and alto > 100:
difX = ancho % 2
difY = alto % 2
width = ancho // 2 if(difX > 0) else ancho / 2
height = alto // 2 if(difY > 0) else alto / 2
#print [id, '-', alto, ancho, '-', y1,x1, y2,x2]
lsArgs = [
(lst, y1, x1, y2 - height, x2 - width),
(lst, y1, x1 + width, y2 - height, x2),
(lst, y1 + height, x1, y2, x2 - width),
(lst, y1 + height, x1 + width, y2, x2)
]
            # one worker process per quadrant
            processes = [mp.Process(target=self._erode, args=args) for args in lsArgs]
for p in processes:
p.start()
# Exit the completed processes
for p in processes:
p.join()
# try:
# thread.start_new_thread( self._erode, (lst, y1, x1, y2-height, x2-width) )
# thread.start_new_thread( self._erode, (lst, y1, x1+width, y2-height, x2) )
# thread.start_new_thread( self._erode, (lst, y1+height, x1, y2, x2-width) )
# thread.start_new_thread( self._erode, (lst, y1+height, x1+width, y2, x2) )
# except:
# print "Error: unable to start thread _erode"
# print [id, alto, ancho]
# print [y1, x1, y2-height, x2-width]
# print [y1, x1+width, y2-height, x2]
# print [y1+height, x1, y2, x2-width]
# print [y1+height, x1+width, y2, x2]
# print self.busy
else:
img, copia = lst
self.busy = self.busy + 1
start = datetime.now()
print [id, '-' ,self.busy, '_erode' , alto, ancho]
for y in xrange(y1,y2):
for x in xrange(x1,x2):
punto = img.getpixel((x,y))
##norte = im.getpixel((x,y-1))
##sur = im.getpixel((x,y+1))
##este = im.getpixel((x+1,y))
##oeste = im.getpixel((x-1,y))
if y>0 and punto>img.getpixel((x,y-1)):
lst[1].putpixel((x,y),img.getpixel((x,y-1)))
if x>0 and punto>img.getpixel((x-1,y)):
lst[1].putpixel((x,y),img.getpixel((x-1,y)))
if y<self.alto-1 and punto>img.getpixel((x,y+1)):
lst[1].putpixel((x,y),img.getpixel((x,y+1)))
if x<self.ancho-1 and punto>img.getpixel((x+1,y)):
lst[1].putpixel((x,y),img.getpixel((x+1,y)))
if y>0 and x>0 and punto>img.getpixel((x-1,y-1)):
lst[1].putpixel((x,y),img.getpixel((x-1,y-1)))
if y>0 and x<self.ancho-1 and punto>img.getpixel((x+1,y-1)):
lst[1].putpixel((x,y),img.getpixel((x+1,y-1)))
if y<self.alto-1 and x>0 and punto>img.getpixel((x-1,y+1)):
lst[1].putpixel((x,y),img.getpixel((x-1,y+1)))
if y<self.alto-1 and x<self.ancho-1 and punto>img.getpixel((x+1,y+1)):
lst[1].putpixel((x,y),img.getpixel((x+1,y+1)))
stop = datetime.now()
delay = stop - start
print [id, '-' ,self.busy, "fin", delay]
self.busy = self.busy -1
if self.busy == 1:
self.busy = 0
def rgb2gray(self):
"""
@return :
@author Miguelote
"""
pass
def substract(self,img):
pass
def getR(self):
"""
@return int[][] :
@author Miguelote
"""
return self.R
pass
def getG(self):
"""
@return int[][] :
@author Miguelote
"""
return self.G
pass
def getB(self):
"""
@return int[][] :
@author Miguelote
"""
return self.B
pass
def getRGB(self):
"""
@return int[][][3] :
@author Miguelote
"""
self.RGB = Image.merge("RGB", (self.R, self.G, self.B))
return self.RGB
pass
def getAlto(self):
return self.alto
pass
def getAncho(self):
return self.ancho
pass
def getPath(self):
return self.path
pass
|
Leslie Gelb, former president of the Council on Foreign Relations and a veteran of policy circles dating back to the Johnson administration, was an unlikely candidate to surprise the routine world of Washington national-security roundtable discussions. But debating Iraq with Lawrence Korb on July 20 at the Center for American Progress, he did just that. When his turn to speak came, Gelb fled the safety of the podium and stood and delivered his remarks Oprah-style, pacing around the room, wireless microphone in hand.
A key premise of the withdrawal argument had been that anti-Americanism was the prime driver of strife in Iraq. The country could best be unified in the context of a U.S. exit -- and sooner rather than later, before the cycle of violence generated a level of sectarian distrust that transcended Iraqi nationalism.
But that point now seems to have passed. Even Gelb's partition alternative is less a plan to save the country than a plan to concede that Iraq is beyond saving. So in the late summer of 2006, when the world's eyes turned to Lebanon even as the violence in Iraq reached new levels, the last opportunity to secure a decent outcome in Iraq has already passed, and nothing America can do at this point will change that.
“God willing, there will be no civil war in Iraq,” intoned Nouri al-Maliki, the third prime minister of post-Saddam Iraq, to conclude the July 25 joint press availability with President Bush. Importuning the divine is as good a plan as any at this point, for the substantive meeting itself was merely the capstone in a process that closed the door on the last best hope for the conventional policy process.
About a month before al-Maliki's trip, the famously fractious congressional Democrats finally reached a reasonable degree of unity around a plan for Iraq -- the Reed-Levin Amendment to the Department of Defense appropriations bill -- that encapsulated the long-standing liberal argument that ruling out an open-ended military commitment was the best chance to avoid a downward spiral of sectarian violence.
Unfortunately, when the final plan was released days later, it included neither amnesty nor a timetable. Congressional Democrats had launched opportunistic attacks on the former point, leading the administration to come out against amnesty as well. Bush's views prevailed over al-Maliki's initial instincts in both cases.
The opportunity thereby lost was enormous. On June 28, two days after the release of the revised, timetable-free reconciliation plan, the leaders of 11 insurgent groups announced that they were willing to halt all attacks immediately in exchange for a promise of American withdrawal within two years. But Donald Rumsfeld dismissed the offer with the repetition of long-standing administration cant that a timetable is “a signal to the enemies that all you have to do is just wait.” That a substantial bloc of America's erstwhile enemies was prepared to give up in exchange for a timetable did not, apparently, enter into the picture.
This opportunity, once lost, can probably never be regained. The level of violence in Iraq reached frighteningly large dimensions this spring with more than 100 Iraqi civilians killed per day in both April and May. This heightened domestic hatred has sharply reduced the odds that removal of foreign forces would bring about national reconciliation. A wave of retaliatory killings by Shiite militias in early July even had some Sunni leaders reversing course and calling for more American troops to be deployed.
Bush came out of his meeting with al-Maliki not with a new plan for national reconciliation, but instead with a plan to shift several thousand troops into Baghdad in hopes of securing the capital. The plan has virtually no chance of succeeding. Brookings Institution analyst Kenneth Pollack -- a fan of the strategy -- estimated in congressional testimony that it would require 100,000 to 120,000 troops in Baghdad at a time when there are only 127,000 soldiers in the whole country.
But even if sufficient forces were available, the plan's goal of combating Shiite militias and death squads in the capital seems unrealistic at this point. After all, the two largest militias -- the Badr Organization and the Mahdi Army -- are affiliated with political parties represented in al-Maliki's parliamentary coalition. While al-Maliki was en route to Washington, Abdul Aziz al-Hakim, leader of the Supreme Council for the Islamic Revolution in Iraq, the nation's largest political party and overseers of the Badr Organization, was calling, in effect, for more militias, telling The Washington Post that the answer to Iraq's security problems was the formation of neighborhood defense committees. Under the circumstances, a serious effort at a crackdown would simply leave the United States without meaningful allies in the country.
One way or another, the current trends toward sectarian violence and “soft” ethnic cleansing as Iraqis increasingly try to sort themselves into homogenous neighborhoods is almost certain to continue. The main question remaining is how long American troops will be left in the crossfire.
|
# -*- coding: utf-8 -*-
from .platform import Platform
from .keywords.postgresql_keywords import PostgreSQLKeywords
from ..table import Table
from ..column import Column
from ..identifier import Identifier
class PostgresPlatform(Platform):
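    # Maps native PostgreSQL type names to the portable type names used internally.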
INTERNAL_TYPE_MAPPING = {
"smallint": "smallint",
"int2": "smallint",
"serial": "integer",
"serial4": "integer",
"int": "integer",
"int4": "integer",
"integer": "integer",
"bigserial": "bigint",
"serial8": "bigint",
"bigint": "bigint",
"int8": "bigint",
"bool": "boolean",
"boolean": "boolean",
"text": "text",
"tsvector": "text",
"varchar": "string",
"interval": "string",
"_varchar": "string",
"char": "string",
"bpchar": "string",
"inet": "string",
"date": "date",
"datetime": "datetime",
"timestamp": "datetime",
"timestamptz": "datetimez",
"time": "time",
"timetz": "time",
"float": "float",
"float4": "float",
"float8": "float",
"double": "float",
"double precision": "float",
"real": "float",
"decimal": "decimal",
"money": "decimal",
"numeric": "decimal",
"year": "date",
"uuid": "guid",
"bytea": "blob",
"json": "json",
}
def get_list_table_columns_sql(self, table):
sql = """SELECT
a.attnum,
quote_ident(a.attname) AS field,
t.typname AS type,
format_type(a.atttypid, a.atttypmod) AS complete_type,
(SELECT t1.typname FROM pg_catalog.pg_type t1 WHERE t1.oid = t.typbasetype) AS domain_type,
(SELECT format_type(t2.typbasetype, t2.typtypmod) FROM
pg_catalog.pg_type t2 WHERE t2.typtype = 'd' AND t2.oid = a.atttypid) AS domain_complete_type,
a.attnotnull AS isnotnull,
(SELECT 't'
FROM pg_index
WHERE c.oid = pg_index.indrelid
AND pg_index.indkey[0] = a.attnum
AND pg_index.indisprimary = 't'
) AS pri,
(SELECT pg_get_expr(adbin, adrelid)
FROM pg_attrdef
WHERE c.oid = pg_attrdef.adrelid
AND pg_attrdef.adnum=a.attnum
) AS default,
(SELECT pg_description.description
FROM pg_description WHERE pg_description.objoid = c.oid AND a.attnum = pg_description.objsubid
) AS comment
FROM pg_attribute a, pg_class c, pg_type t, pg_namespace n
WHERE %s
AND a.attnum > 0
AND a.attrelid = c.oid
AND a.atttypid = t.oid
AND n.oid = c.relnamespace
ORDER BY a.attnum""" % self.get_table_where_clause(
table
)
return sql
def get_list_table_indexes_sql(self, table):
sql = """
SELECT quote_ident(relname) as relname, pg_index.indisunique, pg_index.indisprimary,
pg_index.indkey, pg_index.indrelid,
pg_get_expr(indpred, indrelid) AS where
FROM pg_class, pg_index
WHERE oid IN (
SELECT indexrelid
FROM pg_index si, pg_class sc, pg_namespace sn
WHERE %s
AND sc.oid=si.indrelid AND sc.relnamespace = sn.oid
) AND pg_index.indexrelid = oid"""
sql = sql % self.get_table_where_clause(table, "sc", "sn")
return sql
def get_list_table_foreign_keys_sql(self, table):
return (
"SELECT quote_ident(r.conname) as conname, "
"pg_catalog.pg_get_constraintdef(r.oid, true) AS condef "
"FROM pg_catalog.pg_constraint r "
"WHERE r.conrelid = "
"("
"SELECT c.oid "
"FROM pg_catalog.pg_class c, pg_catalog.pg_namespace n "
"WHERE "
+ self.get_table_where_clause(table)
+ " AND n.oid = c.relnamespace"
")"
" AND r.contype = 'f'"
)
def get_table_where_clause(self, table, class_alias="c", namespace_alias="n"):
where_clause = (
namespace_alias
+ ".nspname NOT IN ('pg_catalog', 'information_schema', 'pg_toast') AND "
)
if table.find(".") >= 0:
split = table.split(".")
schema, table = split[0], split[1]
schema = "'%s'" % schema
else:
schema = (
"ANY(string_to_array((select replace(replace(setting, '\"$user\"', user), ' ', '')"
" from pg_catalog.pg_settings where name = 'search_path'),','))"
)
where_clause += "%s.relname = '%s' AND %s.nspname = %s" % (
class_alias,
table,
namespace_alias,
schema,
)
return where_clause
def get_advanced_foreign_key_options_sql(self, foreign_key):
query = ""
if foreign_key.has_option("match"):
query += " MATCH %s" % foreign_key.get_option("match")
query += super(PostgresPlatform, self).get_advanced_foreign_key_options_sql(
foreign_key
)
deferrable = (
foreign_key.has_option("deferrable")
and foreign_key.get_option("deferrable") is not False
)
if deferrable:
query += " DEFERRABLE"
else:
query += " NOT DEFERRABLE"
query += " INITIALLY"
deferred = (
foreign_key.has_option("deferred")
and foreign_key.get_option("deferred") is not False
)
if deferred:
query += " DEFERRED"
else:
query += " IMMEDIATE"
return query
def get_alter_table_sql(self, diff):
"""
Get the ALTER TABLE SQL statement
:param diff: The table diff
:type diff: orator.dbal.table_diff.TableDiff
:rtype: list
"""
sql = []
for column_diff in diff.changed_columns.values():
if self.is_unchanged_binary_column(column_diff):
continue
old_column_name = column_diff.get_old_column_name().get_quoted_name(self)
column = column_diff.column
if any(
[
column_diff.has_changed("type"),
column_diff.has_changed("precision"),
column_diff.has_changed("scale"),
column_diff.has_changed("fixed"),
]
):
query = (
"ALTER "
+ old_column_name
+ " TYPE "
+ self.get_sql_type_declaration(column.to_dict())
)
sql.append(
"ALTER TABLE "
+ diff.get_name(self).get_quoted_name(self)
+ " "
+ query
)
if column_diff.has_changed("default") or column_diff.has_changed("type"):
if column.get_default() is None:
default_clause = " DROP DEFAULT"
else:
default_clause = " SET" + self.get_default_value_declaration_sql(
column.to_dict()
)
query = "ALTER " + old_column_name + default_clause
sql.append(
"ALTER TABLE "
+ diff.get_name(self).get_quoted_name(self)
+ " "
+ query
)
if column_diff.has_changed("notnull"):
op = "DROP"
if column.get_notnull():
op = "SET"
query = "ALTER " + old_column_name + " " + op + " NOT NULL"
sql.append(
"ALTER TABLE "
+ diff.get_name(self).get_quoted_name(self)
+ " "
+ query
)
if column_diff.has_changed("autoincrement"):
if column.get_autoincrement():
seq_name = self.get_identity_sequence_name(
diff.name, old_column_name
)
sql.append("CREATE SEQUENCE " + seq_name)
sql.append(
"SELECT setval('" + seq_name + "', "
"(SELECT MAX(" + old_column_name + ") FROM " + diff.name + "))"
)
query = (
"ALTER "
+ old_column_name
+ " SET DEFAULT nextval('"
+ seq_name
+ "')"
)
sql.append(
"ALTER TABLE "
+ diff.get_name(self).get_quoted_name(self)
+ " "
+ query
)
else:
query = "ALTER " + old_column_name + " DROP DEFAULT"
sql.append(
"ALTER TABLE "
+ diff.get_name(self).get_quoted_name(self)
+ " "
+ query
)
if column_diff.has_changed("length"):
query = (
"ALTER "
+ old_column_name
+ " TYPE "
+ self.get_sql_type_declaration(column.to_dict())
)
sql.append(
"ALTER TABLE "
+ diff.get_name(self).get_quoted_name(self)
+ " "
+ query
)
for old_column_name, column in diff.renamed_columns.items():
sql.append(
"ALTER TABLE " + diff.get_name(self).get_quoted_name(self) + " "
"RENAME COLUMN "
+ Identifier(old_column_name).get_quoted_name(self)
+ " TO "
+ column.get_quoted_name(self)
)
return sql
def is_unchanged_binary_column(self, column_diff):
column_type = column_diff.column.get_type()
if column_type not in ["blob", "binary"]:
return False
if isinstance(column_diff.from_column, Column):
from_column = column_diff.from_column
else:
from_column = None
if from_column:
from_column_type = self.INTERNAL_TYPE_MAPPING[from_column.get_type()]
if from_column_type in ["blob", "binary"]:
return False
return (
len(
[
x
for x in column_diff.changed_properties
if x not in ["type", "length", "fixed"]
]
)
== 0
)
if column_diff.has_changed("type"):
return False
return (
len(
[
x
for x in column_diff.changed_properties
if x not in ["length", "fixed"]
]
)
== 0
)
def convert_booleans(self, item):
if isinstance(item, list):
for i, value in enumerate(item):
if isinstance(value, bool):
item[i] = str(value).lower()
elif isinstance(item, bool):
item = str(item).lower()
return item
def get_boolean_type_declaration_sql(self, column):
return "BOOLEAN"
def get_integer_type_declaration_sql(self, column):
if column.get("autoincrement"):
return "SERIAL"
return "INT"
def get_bigint_type_declaration_sql(self, column):
if column.get("autoincrement"):
return "BIGSERIAL"
return "BIGINT"
def get_smallint_type_declaration_sql(self, column):
return "SMALLINT"
def get_guid_type_declaration_sql(self, column):
return "UUID"
def get_datetime_type_declaration_sql(self, column):
return "TIMESTAMP(0) WITHOUT TIME ZONE"
def get_datetimetz_type_declaration_sql(self, column):
return "TIMESTAMP(0) WITH TIME ZONE"
def get_date_type_declaration_sql(self, column):
return "DATE"
def get_time_type_declaration_sql(self, column):
return "TIME(0) WITHOUT TIME ZONE"
def get_string_type_declaration_sql(self, column):
length = column.get("length", "255")
fixed = column.get("fixed")
if fixed:
return "CHAR(%s)" % length
else:
return "VARCHAR(%s)" % length
def get_binary_type_declaration_sql(self, column):
return "BYTEA"
def get_blob_type_declaration_sql(self, column):
return "BYTEA"
def get_clob_type_declaration_sql(self, column):
return "TEXT"
def get_text_type_declaration_sql(self, column):
return "TEXT"
def get_json_type_declaration_sql(self, column):
return "JSON"
def get_decimal_type_declaration_sql(self, column):
if "precision" not in column or not column["precision"]:
column["precision"] = 10
if "scale" not in column or not column["scale"]:
            column["scale"] = 0
return "DECIMAL(%s, %s)" % (column["precision"], column["scale"])
def get_float_type_declaration_sql(self, column):
return "DOUBLE PRECISION"
def supports_foreign_key_constraints(self):
return True
def has_native_json_type(self):
return True
def _get_reserved_keywords_class(self):
return PostgreSQLKeywords
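# --- Usage sketch (illustrative only) ---
# A hedged example of generating the introspection SQL defined above. It
# assumes PostgresPlatform can be instantiated without arguments, which may
# not hold for the real orator code base.
def example_introspection_queries(table="public.users"):
    """Return the column, index and foreign-key listing SQL for ``table``."""
    platform = PostgresPlatform()
    return (
        platform.get_list_table_columns_sql(table),
        platform.get_list_table_indexes_sql(table),
        platform.get_list_table_foreign_keys_sql(table),
    )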
|
Where to buy Ulex europaeus plants & price comparison.
Photos of Ulex europaeus plants in real gardens.
You can also discover who's growing Ulex europaeus.
This advice is from our members; if you can't find your question, please try the Ulex genus page or ask our friendly community.
|
"""
sentry.models.alert
~~~~~~~~~~~~~~~~~~~
:copyright: (c) 2010-2013 by the Sentry Team, see AUTHORS for more details.
:license: BSD, see LICENSE for more details.
"""
from datetime import timedelta
from django.core.urlresolvers import reverse
from django.db import models
from django.utils import timezone
from django.utils.translation import ugettext_lazy as _
from sentry.constants import (
STATUS_RESOLVED, STATUS_UNRESOLVED, MINUTE_NORMALIZATION
)
from sentry.db.models import (
Model, GzippedDictField, BoundedPositiveIntegerField, sane_repr
)
from sentry.utils.db import has_trending
from sentry.utils.http import absolute_uri
class Alert(Model):
project = models.ForeignKey('sentry.Project')
group = models.ForeignKey('sentry.Group', null=True)
datetime = models.DateTimeField(default=timezone.now)
message = models.TextField()
data = GzippedDictField(null=True)
related_groups = models.ManyToManyField('sentry.Group', through='sentry.AlertRelatedGroup', related_name='related_alerts')
status = BoundedPositiveIntegerField(default=0, choices=(
(STATUS_UNRESOLVED, _('Unresolved')),
(STATUS_RESOLVED, _('Resolved')),
), db_index=True)
class Meta:
app_label = 'sentry'
db_table = 'sentry_alert'
__repr__ = sane_repr('project_id', 'group_id', 'datetime')
# TODO: move classmethods to manager
@classmethod
def get_recent_for_project(cls, project_id):
return cls.objects.filter(
project=project_id,
group_id__isnull=True,
datetime__gte=timezone.now() - timedelta(minutes=60),
status=STATUS_UNRESOLVED,
).order_by('-datetime')
@classmethod
def maybe_alert(cls, project_id, message, group_id=None):
from sentry.models import Group
now = timezone.now()
manager = cls.objects
# We only create an alert based on:
# - an alert for the project hasn't been created in the last 30 minutes
# - an alert for the event hasn't been created in the last 60 minutes
# TODO: there is a race condition if we're calling this function for the same project
        if manager.filter(
                project=project_id, datetime__gte=now - timedelta(minutes=30)).exists():
return
if manager.filter(
project=project_id, group=group_id,
datetime__gte=now - timedelta(minutes=60)).exists():
return
alert = manager.create(
project_id=project_id,
group_id=group_id,
datetime=now,
message=message,
)
if not group_id and has_trending():
# Capture the top 5 trending events at the time of this error
related_groups = Group.objects.get_accelerated([project_id], minutes=MINUTE_NORMALIZATION)[:5]
for group in related_groups:
AlertRelatedGroup.objects.create(
group=group,
alert=alert,
)
return alert
@property
def team(self):
return self.project.team
@property
def is_resolved(self):
return (self.status == STATUS_RESOLVED
or self.datetime < timezone.now() - timedelta(minutes=60))
def get_absolute_url(self):
return absolute_uri(reverse('sentry-alert-details', args=[
self.team.slug, self.project.slug, self.id]))
class AlertRelatedGroup(Model):
group = models.ForeignKey('sentry.Group')
alert = models.ForeignKey(Alert)
data = GzippedDictField(null=True)
class Meta:
app_label = 'sentry'
db_table = 'sentry_alertrelatedgroup'
unique_together = (('group', 'alert'),)
__repr__ = sane_repr('group_id', 'alert_id')
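# --- Usage sketch (illustrative only) ---
# A hedged example of driving the throttled alert creation above from other
# application code. The project/group objects and the ``notify`` callback are
# assumptions for illustration, not taken from real Sentry call sites.
def raise_alert_if_needed(project, message, group=None, notify=None):
    """Create an alert for ``project`` unless one was raised recently."""
    alert = Alert.maybe_alert(
        project_id=project.id,
        message=message,
        group_id=group.id if group is not None else None,
    )
    if alert is not None and notify is not None:
        # get_absolute_url() resolves to the alert detail page.
        notify(alert.get_absolute_url())
    return alert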
|
England, four miles north east of Wincanton, four miles north west of Mere, and six miles north of Gillingham. The south east of the parish borders Zeals and Stourhead in Wiltshire, and Bourton in Dorset. The village is within a mile of the A303. There are railway stations in the nearby towns of Castle Cary and Gillingham.
|
import os
BASE_DIR = os.path.dirname(os.path.dirname(__file__))
SECRET_KEY = '+k*kdcv4ut*bd99nb(ox$%j_9(1#8@_)!aa4oy2%iwsg&!tt15'
DEBUG = True
TEMPLATE_DEBUG = True
ALLOWED_HOSTS = ['localhost',
'127.0.0.1']
INSTALLED_APPS = (
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
)
MIDDLEWARE_CLASSES = (
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.auth.middleware.SessionAuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
)
ROOT_URLCONF = 'dziekanat.urls'
WSGI_APPLICATION = 'dziekanat.wsgi.application'
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql_psycopg2',
'NAME': 'ntr',
'USER': 'postgres',
'PASSWORD': 'qwerty1asd',
'HOST': 'localhost',
'PORT': '5432',
},
}
LANGUAGE_CODE = 'pl-pl'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
STATIC_URL = '/static/'
|
Last week there were many reports that Twitter had been hacked and that a database of 32 million user credentials had been listed on the dark web.
Twitter claims that this is not the case. Twitter’s Information Security Officer, Michael Coates, posted a response to the alleged breach on the company’s site and addressed it on the microblogging platform itself.
The suspected cause is users recycling passwords. In the week of “mega breaches”, hundreds of millions of social media accounts had their credentials posted to the dark web. It’s likely that hackers were able to get into certain Twitter accounts by reusing passwords exposed in breaches of other platforms. All they have to do is find email addresses that appear in more than one dump and try the accompanying passwords; more often than not, they will be able to get into multiple accounts.
|
__author__ = 'duarte'
from modules.parameters import ParameterSet, ParameterSpace, extract_nestvalid_dict
from modules.input_architect import EncodingLayer, InputSignalSet, InputNoise, InputSignal
from modules.net_architect import Network
from modules.io import set_storage_locations
from modules.signals import iterate_obj_list
from modules.analysis import single_neuron_responses
from modules.visualization import InputPlots, set_global_rcParams
import cPickle as pickle
import numpy as np
import scipy.stats as stats
import nest
"""
:param parameter_set: must be consistent with the computation
:param plot: plot results - either show them or save to file
:param display: show figures/reports
:param save: save results
:return results_dictionary:
"""
# ######################################################################################################################
# Experiment options
# ======================================================================================================================
plot = True
display = True
save = True
debug = False
online = True
# ######################################################################################################################
# Extract parameters from file and build global ParameterSet
# ======================================================================================================================
params_file = '../parameters/single_neuron_patterned_synaptic_input.py'
parameter_set = ParameterSpace(params_file)[0]
parameter_set = parameter_set.clean(termination='pars')
if not isinstance(parameter_set, ParameterSet):
if isinstance(parameter_set, basestring) or isinstance(parameter_set, dict):
parameter_set = ParameterSet(parameter_set)
else:
raise TypeError("parameter_set must be ParameterSet, string with full path to parameter file or dictionary")
# ######################################################################################################################
# Setup extra variables and parameters
# ======================================================================================================================
if plot:
set_global_rcParams(parameter_set.kernel_pars['mpl_path'])
paths = set_storage_locations(parameter_set, save)
np.random.seed(parameter_set.kernel_pars['np_seed'])
results = dict()
# ######################################################################################################################
# Set kernel and simulation parameters
# ======================================================================================================================
print('\nRunning ParameterSet {0}'.format(parameter_set.label))
nest.ResetKernel()
nest.set_verbosity('M_WARNING')
nest.SetKernelStatus(extract_nestvalid_dict(parameter_set.kernel_pars.as_dict(), param_type='kernel'))
# ######################################################################################################################
# Build network
# ======================================================================================================================
net = Network(parameter_set.net_pars)
# ######################################################################################################################
# Randomize initial variable values
# ======================================================================================================================
for idx, n in enumerate(list(iterate_obj_list(net.populations))):
if hasattr(parameter_set.net_pars, "randomize_neuron_pars"):
randomize = parameter_set.net_pars.randomize_neuron_pars[idx]
for k, v in randomize.items():
n.randomize_initial_states(k, randomization_function=v[0], **v[1])
########################################################################################################################
# Build Input Signal Sets
# ======================================================================================================================
assert hasattr(parameter_set, "input_pars")
# Current input (need to build 2 separate noise signals for the 2 input channels)
total_stimulation_time = parameter_set.kernel_pars.sim_time + parameter_set.kernel_pars.transient_t
input_noise_ch1 = InputNoise(parameter_set.input_pars.noise, stop_time=total_stimulation_time)
input_noise_ch1.generate()
input_noise_ch1.re_seed(parameter_set.kernel_pars.np_seed)
input_noise_ch2 = InputNoise(parameter_set.input_pars.noise, stop_time=total_stimulation_time)
input_noise_ch2.generate()
input_noise_ch2.re_seed(parameter_set.kernel_pars.np_seed)
if plot:
inp_plot = InputPlots(stim_obj=None, input_obj=None, noise_obj=input_noise_ch1)
inp_plot.plot_noise_component(display=display, save=paths['figures'] + "/InputNoise_CH1")
inp_plot = InputPlots(stim_obj=None, input_obj=None, noise_obj=input_noise_ch2)
inp_plot.plot_noise_component(display=display, save=paths['figures'] + "/InputNoise_CH2")
# ######################################################################################################################
# Build and connect input
# ======================================================================================================================
enc_layer_ch1 = EncodingLayer(parameter_set.encoding_ch1_pars, signal=input_noise_ch1)
enc_layer_ch1.connect(parameter_set.encoding_ch1_pars, net)
enc_layer_ch2 = EncodingLayer(parameter_set.encoding_ch2_pars, signal=input_noise_ch2)
enc_layer_ch2.connect(parameter_set.encoding_ch2_pars, net)
# ######################################################################################################################
# Connect Devices
# ======================================================================================================================
net.connect_devices()
# ######################################################################################################################
# Simulate
# ======================================================================================================================
if parameter_set.kernel_pars.transient_t:
net.simulate(parameter_set.kernel_pars.transient_t)
net.flush_records()
net.simulate(parameter_set.kernel_pars.sim_time + nest.GetKernelStatus()['resolution'])
# ######################################################################################################################
# Extract and store data
# ======================================================================================================================
net.extract_population_activity(
t_start=parameter_set.kernel_pars.transient_t, # + nest.GetKernelStatus()['resolution'],
t_stop=parameter_set.kernel_pars.sim_time + parameter_set.kernel_pars.transient_t)
net.extract_network_activity()
# ######################################################################################################################
# Analyse / plot data
# ======================================================================================================================
results = dict()
analysis_interval = [parameter_set.kernel_pars.transient_t,
parameter_set.kernel_pars.transient_t + parameter_set.kernel_pars.sim_time]
for idd, nam in enumerate(net.population_names):
results.update({nam: {}})
results[nam] = single_neuron_responses(net.populations[idd],
parameter_set, pop_idx=idd,
start=analysis_interval[0],
stop=analysis_interval[1],
plot=plot, display=display,
save=paths['figures'] + paths['label'])
if results[nam]['rate']:
print('Output Rate [{0}] = {1} spikes/s'.format(str(nam), str(results[nam]['rate'])))
# ######################################################################################################################
# Save data
# ======================================================================================================================
if save:
with open(paths['results'] + 'Results_' + parameter_set.label, 'w') as f:
pickle.dump(results, f)
parameter_set.save(paths['parameters'] + 'Parameters_' + parameter_set.label)
|
The National Reform Steering Assembly (NRSA) has presented the junta with its long-anticipated recommendations for developing post-coup Thailand. But iLaw, a watchdog NGO, has pointed out that much of the NRSA’s reform package was merely copied and pasted from other sources. Several pages of the 1,342-article report appear to have been lifted directly from previous reports written by the NRSA’s precursor, the National Reform Council (NRC).
The junta’s National Reform Steering Assembly (NRSA) has given the green light to media reform proposals which will tighten government control and surveillance over online media. On 3 July 2017, the NRSA voted 144-1 in favour of a report compiled by its Social Media Reform Subcommittee. Two members abstained.
Despite opposition from media groups, the junta is proposing a law to punish unlicensed journalists with two years in prison. On 10 April 2017, Maj Gen Pisit Pao-In, chairman of the media subcommittee of the junta’s National Reform Steering Assembly (NRSA), announced that under the new Media Bill, media workers who do not possess official licenses could face two years’ imprisonment, or a fine of 60,000 baht, or both.
Citing political ills, the Thai junta has ironically proposed a so-called political culture bill, saying it could foster a democratic political culture. On 7 March 2017, the junta-appointed Committee on National Reform, National Strategy, and Reconciliation announced 42 national reform priorities from Government House. Among these 42 reform goals, a political culture bill was proposed as a solution to Thailand’s political ills.
Two weeks ago, the whip committee of the junta’s National Reform Steering Assembly (NRSA) temporarily rejected the Protection of Media Rights and Freedom, Ethics and Professional Standards Bill, following strong opposition from 30 media organisations.
The International Federation of Journalists (IFJ) and the South East Asia Journalist Unions (SEAJU) join the National Union of Journalists of Thailand (NUJT) in denouncing the draft media regulation bill, which will further suppress the media in an already challenging environment. The IFJ and SEAJU call for the bill to be scrapped immediately.
As a model for its ongoing reconciliation efforts, the Thai junta will follow the amnesty programme for communists implemented during the Cold War. The Thai government has made political reconciliation a policy priority, to resolve chronic unrest between different political movements. Plans include a Memorandum of Understanding (MoU) to be signed by various political parties and movements in acknowledgement of a promise to build peaceful relationships with each other.
|
# Copyright 2013 IBM Corp.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from nova.compute import utils as compute_utils
from nova import db
from nova import exception
from nova import objects
from nova.objects import base
from nova.objects import fields
# TODO(berrange): Remove NovaObjectDictCompat
@base.NovaObjectRegistry.register
class Aggregate(base.NovaPersistentObject, base.NovaObject,
base.NovaObjectDictCompat):
# Version 1.0: Initial version
# Version 1.1: String attributes updated to support unicode
VERSION = '1.1'
fields = {
'id': fields.IntegerField(),
'name': fields.StringField(),
'hosts': fields.ListOfStringsField(nullable=True),
'metadata': fields.DictOfStringsField(nullable=True),
}
obj_extra_fields = ['availability_zone']
@staticmethod
def _from_db_object(context, aggregate, db_aggregate):
for key in aggregate.fields:
if key == 'metadata':
db_key = 'metadetails'
else:
db_key = key
aggregate[key] = db_aggregate[db_key]
aggregate._context = context
aggregate.obj_reset_changes()
return aggregate
def _assert_no_hosts(self, action):
if 'hosts' in self.obj_what_changed():
raise exception.ObjectActionError(
action=action,
reason='hosts updated inline')
@base.remotable_classmethod
def get_by_id(cls, context, aggregate_id):
db_aggregate = db.aggregate_get(context, aggregate_id)
return cls._from_db_object(context, cls(), db_aggregate)
@base.remotable
def create(self):
if self.obj_attr_is_set('id'):
raise exception.ObjectActionError(action='create',
reason='already created')
self._assert_no_hosts('create')
updates = self.obj_get_changes()
payload = dict(updates)
if 'metadata' in updates:
# NOTE(danms): For some reason the notification format is weird
payload['meta_data'] = payload.pop('metadata')
compute_utils.notify_about_aggregate_update(self._context,
"create.start",
payload)
metadata = updates.pop('metadata', None)
db_aggregate = db.aggregate_create(self._context, updates,
metadata=metadata)
self._from_db_object(self._context, self, db_aggregate)
payload['aggregate_id'] = self.id
compute_utils.notify_about_aggregate_update(self._context,
"create.end",
payload)
@base.remotable
def save(self):
self._assert_no_hosts('save')
updates = self.obj_get_changes()
payload = {'aggregate_id': self.id}
if 'metadata' in updates:
payload['meta_data'] = updates['metadata']
compute_utils.notify_about_aggregate_update(self._context,
"updateprop.start",
payload)
updates.pop('id', None)
db_aggregate = db.aggregate_update(self._context, self.id, updates)
compute_utils.notify_about_aggregate_update(self._context,
"updateprop.end",
payload)
self._from_db_object(self._context, self, db_aggregate)
@base.remotable
def update_metadata(self, updates):
payload = {'aggregate_id': self.id,
'meta_data': updates}
compute_utils.notify_about_aggregate_update(self._context,
"updatemetadata.start",
payload)
to_add = {}
for key, value in updates.items():
if value is None:
try:
db.aggregate_metadata_delete(self._context, self.id, key)
except exception.AggregateMetadataNotFound:
pass
try:
self.metadata.pop(key)
except KeyError:
pass
else:
to_add[key] = value
self.metadata[key] = value
db.aggregate_metadata_add(self._context, self.id, to_add)
compute_utils.notify_about_aggregate_update(self._context,
"updatemetadata.end",
payload)
self.obj_reset_changes(fields=['metadata'])
@base.remotable
def destroy(self):
db.aggregate_delete(self._context, self.id)
@base.remotable
def add_host(self, host):
db.aggregate_host_add(self._context, self.id, host)
if self.hosts is None:
self.hosts = []
self.hosts.append(host)
self.obj_reset_changes(fields=['hosts'])
@base.remotable
def delete_host(self, host):
db.aggregate_host_delete(self._context, self.id, host)
self.hosts.remove(host)
self.obj_reset_changes(fields=['hosts'])
@property
def availability_zone(self):
return self.metadata.get('availability_zone', None)
@base.NovaObjectRegistry.register
class AggregateList(base.ObjectListBase, base.NovaObject):
# Version 1.0: Initial version
# Version 1.1: Added key argument to get_by_host()
# Aggregate <= version 1.1
# Version 1.2: Added get_by_metadata_key
VERSION = '1.2'
fields = {
'objects': fields.ListOfObjectsField('Aggregate'),
}
# NOTE(danms): Aggregate was at 1.1 before we added this
obj_relationships = {
'objects': [('1.0', '1.1'), ('1.1', '1.1'), ('1.2', '1.1')],
}
@classmethod
def _filter_db_aggregates(cls, db_aggregates, hosts):
if not isinstance(hosts, set):
hosts = set(hosts)
filtered_aggregates = []
for db_aggregate in db_aggregates:
for host in db_aggregate['hosts']:
if host in hosts:
filtered_aggregates.append(db_aggregate)
break
return filtered_aggregates
@base.remotable_classmethod
def get_all(cls, context):
db_aggregates = db.aggregate_get_all(context)
return base.obj_make_list(context, cls(context), objects.Aggregate,
db_aggregates)
@base.remotable_classmethod
def get_by_host(cls, context, host, key=None):
db_aggregates = db.aggregate_get_by_host(context, host, key=key)
return base.obj_make_list(context, cls(context), objects.Aggregate,
db_aggregates)
@base.remotable_classmethod
def get_by_metadata_key(cls, context, key, hosts=None):
db_aggregates = db.aggregate_get_by_metadata_key(context, key=key)
if hosts is not None:
db_aggregates = cls._filter_db_aggregates(db_aggregates, hosts)
return base.obj_make_list(context, cls(context), objects.Aggregate,
db_aggregates)
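# --- Usage sketch (illustrative only) ---
# A hedged example of reading aggregates through the objects layer defined
# above; the request context and host name are assumptions for illustration.
def example_list_aggregates_for_host(context, host='compute-node-01'):
    """Return (name, availability zone) pairs for aggregates containing host."""
    aggregates = AggregateList.get_by_host(context, host)
    return [(agg.name, agg.availability_zone) for agg in aggregates]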
|
Are you looking for an inspiring Halloween makeup photo for Little Red Riding Hood? You will be happy to know that this wallpaper is available in our online database. With our complete resources, you can find this image, or any other kind of photo, for your ideas every day.
|
from Chordbook import transpose, durations
from Sequencer import Sequence, Note
from Synth import ClassicSynth, BassWalkSynth, PianoSynth
from random import choice
from itertools import cycle
from pyo import *
s = Server().boot()
noteCount = 1
totalCount = 0
tempo = 70
chords = cycle(
[
transpose(target='m7',key='D',octave=5),
transpose(target='7',key='G',octave=5),
transpose(target='maj7', key='C',octave=5)
]
)
chordName = cycle(['Dm7','G7','CMaj7'])
currentChord = next(chords)
currentChordName = next(chordName)
duree = durations['half']
realNotes = [Note(n, duree) for n in currentChord]
seqs = [Sequence([n],tempo) for n in realNotes]
for seq in seqs:
seq.play()
synths = [PianoSynth(seq) for seq in seqs]
for syn in synths:
syn.get_out().out()
def changeChord():
    # Rotate to the next chord in the cycle after every four pattern ticks,
    # rebuilding the sequences and synths so the new chord starts cleanly.
    global currentChord, seqs, synths
global noteCount, currentChordName, totalCount
if noteCount > 4:
print "changing chord"
noteCount = 1
currentChord = next(chords)
currentChordName = next(chordName)
newNotes = [Note(n, duree) for n in currentChord]
for seq in seqs:
if seq.isPlaying():
seq.stop()
seqs = [Sequence([n],tempo) for n in newNotes]
synths = [PianoSynth(seq) for seq in seqs]
for seq in seqs:
if not(seq.isPlaying()):
seq.play()
for syn in synths:
syn.get_out().out()
print "Current="+currentChordName+" Total count="+str(totalCount)
noteCount += 1
totalCount += 1
pat = Pattern(changeChord, time=60/(tempo / durations['quarter'] / 4))
pat.play()
s.gui(locals())
|
Brand new 4-bed dorm beside a small swimming pool, with a spectacular mountain view. Air conditioning, first-quality mattresses and mosquito nets. Located 100 m from reception along a beautiful stone path. Not suitable for disabled people or children.
|
# Copyright 2013-2014 MongoDB, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Elasticsearch implementation of the DocManager interface.
Receives documents from an OplogThread and takes the appropriate actions on
Elasticsearch.
"""
import logging
from threading import Timer
import bson.json_util
from elasticsearch import Elasticsearch, exceptions as es_exceptions
from elasticsearch.helpers import scan, streaming_bulk
from mongo_connector import errors
from mongo_connector.constants import (DEFAULT_COMMIT_INTERVAL,
DEFAULT_MAX_BULK)
from mongo_connector.util import retry_until_ok
from mongo_connector.doc_managers import DocManagerBase, exception_wrapper
from mongo_connector.doc_managers.formatters import DefaultDocumentFormatter
wrap_exceptions = exception_wrapper({
es_exceptions.ConnectionError: errors.ConnectionFailed,
es_exceptions.TransportError: errors.OperationFailed})
class DocManager(DocManagerBase):
"""Elasticsearch implementation of the DocManager interface.
Receives documents from an OplogThread and takes the appropriate actions on
Elasticsearch.
"""
def __init__(self, url, auto_commit_interval=DEFAULT_COMMIT_INTERVAL,
unique_key='_id', chunk_size=DEFAULT_MAX_BULK,
meta_index_name="mongodb_meta", meta_type="mongodb_meta",
**kwargs):
self.elastic = Elasticsearch(hosts=[url])
self.auto_commit_interval = auto_commit_interval
self.doc_type = 'string' # default type is string, change if needed
self.meta_index_name = meta_index_name
self.meta_type = meta_type
self.unique_key = unique_key
self.chunk_size = chunk_size
if self.auto_commit_interval not in [None, 0]:
self.run_auto_commit()
self._formatter = DefaultDocumentFormatter()
def stop(self):
"""Stop the auto-commit thread."""
self.auto_commit_interval = None
def apply_update(self, doc, update_spec):
if "$set" not in update_spec and "$unset" not in update_spec:
# Don't try to add ns and _ts fields back in from doc
return update_spec
return super(DocManager, self).apply_update(doc, update_spec)
@wrap_exceptions
def update(self, doc, update_spec):
"""Apply updates given in update_spec to the document whose id
matches that of doc.
"""
document = self.elastic.get(index=doc['ns'],
id=str(doc['_id']))
updated = self.apply_update(document['_source'], update_spec)
# _id is immutable in MongoDB, so won't have changed in update
updated['_id'] = document['_id']
# Add metadata fields back into updated, for the purposes of
# calling upsert(). Need to do this until these become separate
# arguments in 2.x
updated['ns'] = doc['ns']
updated['_ts'] = doc['_ts']
self.upsert(updated)
# upsert() strips metadata, so only _id + fields in _source still here
return updated
@wrap_exceptions
def upsert(self, doc):
"""Insert a document into Elasticsearch."""
doc_type = self.doc_type
index = doc.pop('ns')
# No need to duplicate '_id' in source document
doc_id = str(doc.pop("_id"))
metadata = {
"ns": index,
"_ts": doc.pop("_ts")
}
# Index the source document
self.elastic.index(index=index, doc_type=doc_type,
body=self._formatter.format_document(doc), id=doc_id,
refresh=(self.auto_commit_interval == 0))
# Index document metadata
self.elastic.index(index=self.meta_index_name, doc_type=self.meta_type,
body=bson.json_util.dumps(metadata), id=doc_id,
refresh=(self.auto_commit_interval == 0))
# Leave _id, since it's part of the original document
doc['_id'] = doc_id
@wrap_exceptions
def bulk_upsert(self, docs):
"""Insert multiple documents into Elasticsearch."""
def docs_to_upsert():
doc = None
for doc in docs:
# Remove metadata and redundant _id
index = doc.pop("ns")
doc_id = str(doc.pop("_id"))
timestamp = doc.pop("_ts")
document_action = {
"_index": index,
"_type": self.doc_type,
"_id": doc_id,
"_source": self._formatter.format_document(doc)
}
                # Keep metadata keys consistent with upsert() ("ns", "_ts").
                document_meta = {
                    "_index": self.meta_index_name,
                    "_type": self.meta_type,
                    "_id": doc_id,
                    "_source": {
                        "ns": index,
                        "_ts": timestamp
                    }
                }
yield document_action
yield document_meta
if not doc:
raise errors.EmptyDocsError(
"Cannot upsert an empty sequence of "
"documents into Elastic Search")
try:
kw = {}
if self.chunk_size > 0:
kw['chunk_size'] = self.chunk_size
responses = streaming_bulk(client=self.elastic,
actions=docs_to_upsert(),
**kw)
for ok, resp in responses:
if not ok:
logging.error(
"Could not bulk-upsert document "
"into ElasticSearch: %r" % resp)
if self.auto_commit_interval == 0:
self.commit()
except errors.EmptyDocsError:
# This can happen when mongo-connector starts up, there is no
# config file, but nothing to dump
pass
@wrap_exceptions
    def remove(self, doc):
        """Remove a document from Elasticsearch.

        The delete calls below are commented out, so this is currently a
        no-op and documents are left in the index.
        """
# self.elastic.delete(index=doc['ns'], doc_type=self.doc_type,
# id=str(doc["_id"]),
# refresh=(self.auto_commit_interval == 0))
# self.elastic.delete(index=self.meta_index_name, doc_type=self.meta_type,
# id=str(doc["_id"]),
# refresh=(self.auto_commit_interval == 0))
pass
@wrap_exceptions
def _stream_search(self, *args, **kwargs):
"""Helper method for iterating over ES search results."""
for hit in scan(self.elastic, query=kwargs.pop('body', None),
scroll='10m', **kwargs):
hit['_source']['_id'] = hit['_id']
yield hit['_source']
def search(self, start_ts, end_ts):
"""Query Elasticsearch for documents in a time range.
This method is used to find documents that may be in conflict during
a rollback event in MongoDB.
"""
return self._stream_search(
index=self.meta_index_name,
body={
"query": {
"filtered": {
"filter": {
"range": {
"_ts": {"gte": start_ts, "lte": end_ts}
}
}
}
}
})
def commit(self):
"""Refresh all Elasticsearch indexes."""
retry_until_ok(self.elastic.indices.refresh, index="")
def run_auto_commit(self):
"""Periodically commit to the Elastic server."""
self.elastic.indices.refresh()
if self.auto_commit_interval not in [None, 0]:
Timer(self.auto_commit_interval, self.run_auto_commit).start()
@wrap_exceptions
def get_last_doc(self):
"""Get the most recently modified document from Elasticsearch.
This method is used to help define a time window within which documents
may be in conflict after a MongoDB rollback.
"""
try:
result = self.elastic.search(
index=self.meta_index_name,
body={
"query": {"match_all": {}},
"sort": [{"_ts": "desc"}],
},
size=1
)["hits"]["hits"]
for r in result:
r['_source']['_id'] = r['_id']
return r['_source']
except es_exceptions.RequestError:
# no documents so ES returns 400 because of undefined _ts mapping
return None
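# --- Usage sketch (illustrative only) ---
# A hedged example of driving the DocManager above by hand, outside of an
# OplogThread. The URL, namespace and timestamp values are assumptions made
# purely for illustration.
def example_manual_upsert(url='localhost:9200'):
    """Index one document plus its metadata, then read it back."""
    dm = DocManager(url, auto_commit_interval=0)
    dm.upsert({'_id': '1', 'ns': 'test.users', '_ts': 1, 'name': 'Ada'})
    dm.commit()  # force a refresh so the document is immediately searchable
    return list(dm._stream_search(index='test.users'))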
|
Summer is a great time to move to a new home. The kids are out of school. The weather is more accommodating. And you have time to settle in before school starts again and the holidays arrive. If your plans include a new construction home in Baton Rouge’s most desirable suburbs, Alvarez Construction Company might already have the home for you! As you’re getting ready to move this summer, check out our move-in ready homes.
Located in The Sanctuary at Juban Crossing, this home features 1,826 square feet, 3 bedrooms, 2 baths. Stunning brick exterior adds curb appeal to our Papyrus floor plan that features an open concept, upgraded kitchen, and a private courtyard. $244,550. This home is eligible for 100% financing through USDA Rural Development and Alvarez Construction is offering to pay $4,000 in closing costs.
This home in the Grove at Ascension community has 1,817 square feet, 3 bedrooms, 3 baths. Step into this Briars home and experience the spaciousness of the open concept, high ceilings, and sun-filled interior. Live large and comfortably with the smart use of space. $255,550.
Coursey Cove community. 2,014 square feet, 3 bedrooms, 2 baths. This single-level Ashburne floor plan presents plenty of living space and style, indoors and out. We’ve expanded the patio and upgraded the kitchen in a home that already offers so many advantages! $272,900.
Located in our Deer Trail, Phase II community, this 2,093-square-foot, 4-bedroom, 2-bath Camelia design is the ideal family home, featuring an open living space with wood flooring, an upgraded gourmet kitchen, a breakfast nook, and a luxury master suite. $279,900.
10383 Grand Plaza Drive is in our Cella Gardens community. With 1,894 square feet, 4 bedrooms, 2 baths, this Kimble floor plan incorporates eye-catching detail in a design that provides the perfect balance of open spaces and private places. $236,150.
We have many more available homes—all brand new and covered by our builder’s warranty. Alvarez Construction adds more to your new home. We’re proud to be Louisiana’s only home builder to include healthy, smart, and quality features in all of our homes. Browse our complete selection of available move-in ready homes, floor plans, and new home communities. Then talk to us about making this summer the season of your new home and new life!
|
import os
import markdown
from sitegen.siteloader.base import FinalHtmlAction, FSDependencyObserver
from sitegen.siteloader.dependency import Action
class MarkdownObserver(FSDependencyObserver):
def notify(self, directory: str, entry: str):
is_md = entry.endswith('.md')
is_html = entry.endswith('.html')
if is_md or is_html:
name = os.path.splitext(entry)[0]
path = os.path.join(directory, entry)
name_path = os.path.join(directory, name)
sub_path_items = name_path.split(os.path.sep)[1:]
build_target_path = os.sep.join(['_build'] + sub_path_items) + '.middle'
yaml_target_path = build_target_path + '.yml'
install_target_path = os.sep.join(['_install'] + sub_path_items) + '.html'
action_class = MarkdownAction if is_md else HtmlAction
self._dependency_collector.add_site_dependency([install_target_path])
self._dependency_collector.add_dependency(install_target_path,
[build_target_path, yaml_target_path],
FinalHtmlAction)
self._dependency_collector.add_dependency(build_target_path, [path], action_class)
class _PageAction(Action):
max_deps_count = 1
def __get_input_text(self, path: str):
with open(path, 'rt') as f:
input_text = f.read()
lines = input_text.splitlines()
if lines[0] == '--':
end = lines[1:].index('--') + 1
yaml_text = '\n'.join(lines[1:end])
input_text = '\n'.join(lines[(end + 2):])
else:
yaml_text = "title: " + os.path.basename(path).rsplit('.', 1)[0]
return input_text, yaml_text
def run(self):
path, target_path, yaml_target_path = self.__get_full_paths()
if not os.path.exists(os.path.dirname(target_path)):
os.makedirs(os.path.dirname(target_path))
print("Compiling", self.target_path)
input_text, yaml_text = self.__get_input_text(path)
output_text = self._format_text(input_text)
self.__write_output_files(output_text, target_path, yaml_target_path, yaml_text)
def __get_full_paths(self):
path = os.path.join(self._site_root, self.dependencies[0])
target_path = os.path.join(self._site_root, self.target_path)
yaml_target_path = target_path + '.yml'
return path, target_path, yaml_target_path
def _format_text(self, input_text: str):
raise NotImplementedError("Cannot generate output text")
def __write_output_files(self, output_text, target_path, yaml_target_path, yaml_text):
with open(target_path, 'wt') as f:
f.write(output_text)
with open(yaml_target_path, 'wt') as f:
f.write(yaml_text)
class MarkdownAction(_PageAction):
def _format_text(self, input_text: str):
return markdown.markdown(input_text, output_format='html5')
class HtmlAction(_PageAction):
def _format_text(self, input_text: str):
return input_text
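# --- Input format note (illustrative only) ---
# _PageAction parses an optional front-matter block delimited by lines that
# contain only "--". A source file such as (content assumed for illustration):
#
#   --
#   title: About this site
#   --
#
#   # About
#   Body text in Markdown or raw HTML.
#
# yields yaml_text == "title: About this site" and input_text starting at
# "# About". Note that exactly one line after the closing "--" is skipped, so
# a blank separator line is expected there. Files without a leading "--" block
# fall back to a title derived from the file name.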
|
Fuel your commitment to a healthy lifestyle by participating in one of our fitness classes. A variety of classes are available for you to choose from, and any of our instructors can help you find one or more that fit your health needs and goals. All classes are free for Jefferson Community Health & Life Burkley Fitness Center members, but membership is not required. A non-member may participate in classes by purchasing a day pass or punch card. All classes are held at Jefferson Community Health & Life Burkley Fitness Center unless otherwise noted.
For any questions or for more information, please call 402.729.6139.
Use a chair for more than just sitting! Seated exercises focus on improving range of motion, strength and balance. Standing exercises involve stability balls and other equipment to promote better balance and hand-eye coordination.
Get lean, build strength and tone muscle using light to moderate weights with lots of repetition. This group-based barbell class challenges all of your major muscle groups by using the best weight room exercises: squats, presses, lifts and curls.
Start your day off right with this early morning, energetic circuit training class. Several dumbbell and body weight exercises are combined with cardio to keep the heart rate up, making each workout an efficient and effective calorie burner.
Strength and speed are the perfect combination to build muscle and burn fat. This 60-minute weight training class uses barbells, dumbbells and other equipment to tone and strengthen eight major muscle groups.
Amplify your cardiovascular health in this upbeat class. Various pedaling speeds and riding positions are synchronized with lively music. Seated and standing positions challenge many muscle groups, including hamstrings, glutes and core.
Often described as meditation in motion, Tai Chi promotes balance, stress reduction, inner peace and overall fitness for a healthy mind and body. Deep breathing is accompanied by a series of movements performed in a slow, focused manner.
Whether you are new to or familiar with yoga, it can be an effective workout for seniors of all experience levels. Stretching, yoga poses, and breathing are synchronized to help seniors with balance, strengthening and relaxation.
Join the fun with this water exercise class that uses kick-boards, foam dumbbells, noodles and water jugs. This low-impact, high-resistance workout promotes increased range of motion and balance without detrimental impact on your joints.
This dance fitness class uses music to create an exhilarating and effective workout for dancers and non-dancers alike. Fast and slow rhythms are combined to tone and sculpt muscles and increase range of motion.
|