import numpy as np
from numpy.random import random_sample
from skimage import img_as_float
from skimage.io import imread
from pictureframe import PictureFrame
image = img_as_float(imread('sample_data/image.png'))
depth = img_as_float(imread('sample_data/depth.png'))
# CONSTRUCTORS
# dict of array name to array
pf = PictureFrame({"image": image, "depth": depth})
# quick look at the data
print(pf)
# INDEXING
# we pass the indexing arguments down to numpy, so everything
# works as you would expect (apart from maybe broadcasting)
# slicing returns a new PictureFrame with views of original data
print(pf[30:, 80:220])
# cartesian indexing returns a new PictureFrame with copies of the data
# note the new shape
print(pf[[1, 3, 4], [40, 10, 11]])
# boolean or mask indexing works fine too
mask = np.random.random_sample(depth.shape) > 0.5
print(pf[mask])
# CONVENIENCE FUNCTIONS
# zoom function returns a new PictureFrame with resized data arrays
# order of interpolation is by default 2 for float arrays and 0 for
# integer arrays, but this can be overridden
print(pf.zoom(0.5))
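# e.g. forcing nearest-neighbour interpolation everywhere (hypothetical
# `order` keyword, assumed by analogy with scipy.ndimage.zoom; not verified
# against the pictureframe API):
# print(pf.zoom(0.5, order=0))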
# pandas/SQL-like groupby function iterates over sub-PictureFrames
# corresponding to each label value
# here we use an "external" array...
labels = imread('sample_data/labels.png')
for label, pf_group in pf.groupby(labels):
    print(label, pf_group)
# however we can add it to the PictureFrame
pf['labels'] = labels
# and group by the name of the array
for k, pf_group in pf.groupby('labels'):
    print(k, pf_group)
# browse function lets you see all array data with matplotlib
pf.browse()
# ASSIGNMENT
indices = np.array([10, 7, 3, 0, 12])
# copy some data to a new PictureFrame and alter the values
other_pf = pf[indices]
other_pf['image'] = random_sample(other_pf.image.shape)
# assignment of values between corresponding arrays handled internally
pf[indices] = other_pf
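# quick sanity check (a sketch, assuming the internal assignment described
# above copies values between same-named arrays)
assert np.allclose(pf.image[indices], other_pf.image)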
I'm planning on going to Curso Flamenco in Sanlucar this year. It will be my 1st time going. Is anyone else going?
I'm looking for any tips on travel to Spain with my guitar (I've flown domestic USA with it). And info on the classes, such as the levels, like which one to take, Beginner or Advanced.
Will you be using internal European flights? Those might be the bigger problem.
Just in case it helps: If you're arriving into and from Madrid then the train to Jerez is easy. From there a hire car to Sanlucar.
Beware - medium and long distance trains in Spain must be pre-booked.
I'm looking at travel options now. It looks like I can fly into Sevilla, then take a bus or car to Sanlucar, as the least expensive option.
Rent a car and drive down. Hotel los helechos, where you must stay, has a garage.
I am new to the forum as well and I am off to Spain this summer for a week long festival and flamenco workshop in La Puebla de Cazalla. It is put on by Antonio Andrade and Cazalla is his home town.
Antonio is friends with a very good friend of mine - Beau Bledsoe. He helps me out with my flamenco and asked me to come with him this year. It is July 6th-14th.
I assume this is the class you are going to?
Looks like a very nice workshop.
Hi mbspark - The curso that you are attending looks great too. Thanks for the link. I'll look into it next year or so.
I wish I had the time and money to travel to Spain and take guitar classes every year!
David, just some bits and pieces if you are not already fixed up.
You could hire a car, but after a long trip I wouldn't fancy that.
The airport bus will take you to the Santa Justa train station or the bus station (about 25 mins in traffic). Be aware many miss the train station stop as it is not obvious you are at the train station (jet lag does funny things to people).
The bus costs 4 Euros but they are not that frequent. You may wait 30 mins and the bus can be very crowded.
The taxi should be fixed at 22 Euros. Quite often I find someone on the plane or while waiting for luggage to share with.
All prices subject to a rise but a taxi who does it on the meter should be shot.
There is a hotel opposite Santa Justa, the Ayre hotel. Incredible comfort, and currently hotels.com (there are other agents) are offering under 70 USD a night in July.
The train from Santa Justa is just over an hour to Jerez. That is where I'd think about a car but presumably during the course you will have little need for one??? There are buses but infrequent. A taxi you might get for 25 Euros.
The bus from Sevilla to Sanlucar might only cost 10 Euros, but an hour and 15 mins on a bus is not my thing.
Hello from Portugal, do you have dates and prices for that?
I'll stay at Hotel Los Helechos in Sanlucar.
I'm planning on flying into Sevilla to get there in the afternoon. Stay in Barrio Santa Cruz for one night, see some flamenco, then hang out at La Carboneria for a couple beers. Then leave for Sanlucar on Sunday to get to the opening reception for the curso.
Enjoy it, too much for me to spend on a holiday course.
Been there twice. First time great. Second time stuck on the first floor right in front of the roundabout, so noisy as hell if you open the window. Aircon is the nasty old-fashioned one-setting-for-all-rooms kind. No per-room thermostat. It was set for mid afternoon in the Sahara. Utter sleepless hell. Will never stay there again.
Utter sleepless hell. Will never stay there again.
Shows just how tough it is to be certain of getting what one expects I guess. Things go wrong. And sounds like you got the wrong bit in spades. Sorry to hear that.
But to be fair to them they have thousands of positive ratings on TA, Booking.com and Hotels.com.
I also last month stayed at their hotel in Oviedo. Again excellent service, amazing bed, and a room I could have had a 50 guest party in.
But to anyone going there are lots of options near Santa Justa.
And info on the classes, such as the levels, like which one to take, Beginner or Advanced.
I went 5 or 6 years ago and loved it.... Unless you are a monster guitar player I would do the beginner course... I did that and he showed us loads of great stuff, some of it playable but some of it ridiculous... It's Gerardo Nuñez so it's all incredible, I still remember the sound.
If I was going again I would definitely try and spend the extra on the cante classes with Antonio Carrión... they sound amazing and the groups are smaller.
Price-wise it's €300 for 10 hours with one of the best flamenco players on the planet, a chance to hang out with him and other great players, and see shows every night for free. I think that's pretty reasonable.
Ricardo, that's all well and good for the beginners, but what is the complex stuff like?
It borders on the usable, i.e. scales and chord knowledge, up to stuff which only the flamenco gods can play. I found it useful; he did a lot of improv and stuff in different tonos last year.
Antonio's class is very useful, I got an immense benefit from it last year, getting correction and what to work on.
Well, if you put it that way, it's a very good price, almost cheap.
Anyone know anything similar but cheaper?
These courses in Córdoba are cheaper, and some amazing people are running them.
import sys
import subprocess
import time
import glob
import csv
import logging
import pandas as pd
from os import path, makedirs
from time import gmtime
from pidas.settings import PIDAS_DIR, SENSOR_LIST_FILE, DATA_FILE, CSV_HEADER
def get_sensor_list(sensor_list):
    """create a dataframe from sensor list name and position"""
    df = pd.read_csv(sensor_list)
    print(df)
    return df
def read_temp_raw(device_file):
    catdata = subprocess.Popen(['cat', device_file], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = catdata.communicate()
    out_decode = out.decode('utf-8')
    lines = out_decode.split('\n')
    return lines
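# For reference, a typical DS18B20 `w1_slave` file (standard Linux w1-therm
# output) has two lines: the first ends in YES when the CRC check passed,
# and the second carries the temperature in milli-degrees Celsius:
#
#   4b 01 4b 46 7f ff 0c 10 d8 : crc=d8 YES
#   4b 01 4b 46 7f ff 0c 10 d8 t=20687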
def read_temp(device_file=''):
    lines = read_temp_raw(device_file)
    # retry until the CRC line ends in YES, pausing briefly between reads
    while lines[0].strip()[-3:] != 'YES':
        time.sleep(0.2)
        lines = read_temp_raw(device_file)
    equals_pos = lines[1].find('t=')
    if equals_pos != -1:
        temp_string = lines[1][equals_pos + 2:]
        temp_c = float(temp_string) / 1000.0
        return str(temp_c)
    else:
        return None
def get_temp_measures():
    device_list = glob.glob('/sys/bus/w1/devices/28-*')
    try:
        sensors_df = get_sensor_list(path.join(PIDAS_DIR, SENSOR_LIST_FILE))
    except OSError as e:
        print(e)
        sys.exit(1)
    temp_measures = []
    for device in device_list:
        head, sensor_id = path.split(device)
        device_file = str(device) + '/w1_slave'
        sensor_name = sensors_df['sensor_name'][sensors_df['sensor_id'] == sensor_id].values[0]
        val = read_temp(device_file)
        timestamp = str(int(time.time()))
        measure = (sensor_id, sensor_name, val, timestamp)
        temp_measures.append(measure)
    return temp_measures
if __name__ == "__main__":
    print("Begin")
    log_path = path.join(PIDAS_DIR, 'logs')
    file_path = path.join(PIDAS_DIR, DATA_FILE)
    if not path.exists(log_path):
        makedirs(log_path)
    logging_level = logging.DEBUG
    logging.Formatter.converter = gmtime
    log_format = '%(asctime)-15s %(levelname)s:%(message)s'
    logging.basicConfig(format=log_format, datefmt='%Y/%m/%d %H:%M:%S UTC', level=logging_level,
                        handlers=[logging.FileHandler(path.join(log_path, 'log_temps_to_csv.log')),
                                  logging.StreamHandler()])
    logging.info('_____ Started _____')
    logging.info('saving in %s', file_path)
    if not path.exists(file_path):
        with open(file_path, "w") as output_file:
            writer = csv.writer(output_file)
            writer.writerow(CSV_HEADER)
    while True:
        try:
            temp_measures = get_temp_measures()
            # append the whole batch with a single file open
            with open(file_path, "a") as output_file:
                writer = csv.writer(output_file)
                for measure in temp_measures:
                    writer.writerow(measure)
        except KeyboardInterrupt:
            print(' Exiting measures')
            sys.exit()
Huge crowds marched from Victoria Square to the city’s Gay Village in Hurst Street, marking one of the first Prides of 2018.
A sense of freedom reigned in the streets as people marched.
Pride marches have radical origins as part of the Gay Liberation Movement in the 1970s that fought oppression.
While real gains have been made, LGBT+ people still face discrimination in Britain and still live in a capitalist society that represses and distorts people’s sexuality.
But there are attempts to remove politics from Pride.
In Birmingham the trade union bloc was big, lively and young, but was pushed to the back by organisers.
And in Sheffield, Pride organisers said they would not accept “any applications by political groups” because it was a “celebration march not a protest”.
They said “offensive placards” would not be allowed, but didn’t specify what this meant.
The Sheffield organisers’ decision caused outrage from LGBT+ activists online.
Many go to Pride as a celebration—but using that to push out politics is a cover for a more corporate message.
As more Pride marches approach, activists can organise for big, political blocs to make the link between Pride's radical roots and the fight for liberation today.
import sys
sys.path.append('./../nn/')
import numpy as np
from pandas import read_csv
from nn.nn.MLP import MLP, sigmoid, d_sigmoid, d_identity, identity, tanh, d_tanh, mcrmse, xeuclidian, d_xeuclidian, hamming, euclidian
from nn.nn.RBM import RBM
from nn.nn.Norms import l2, d_l2, l1, d_l1
if __name__ == '__main__':
    df = read_csv('./../data/africa-soil/training.csv')

    x = df.as_matrix(columns=df.columns[1:3595])
    x[:, -1] = (x[:, -1] == 'Topsoil') * 1.0
    x = x.astype(float)

    y = df.as_matrix(columns=df.columns[3595:])
    y = y.astype(float)

    # standardizing
    x = (x - np.repeat(x.mean(axis=0), x.shape[0]).reshape((x.shape[0], x.mean(axis=0).shape[0]), order='F')) / \
        np.sqrt(np.repeat(x.var(axis=0), x.shape[0]).reshape((x.shape[0], x.mean(axis=0).shape[0]), order='F'))
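    # NumPy broadcasting gives the same standardization far more simply
    # (a sketch, equivalent to the repeat/reshape construction above):
    #   x = (x - x.mean(axis=0)) / x.std(axis=0)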
    idx_train = list(np.random.choice(range(x.shape[0]), size=int(round(0.8 * x.shape[0]))))
    idx_cv = list(set(range(x.shape[0])) - set(idx_train))

    rbm = RBM(x.shape[1], 100,
              rng=(lambda n: np.random.normal(0, 0.001, n)),
              mode='gaus-bin')
    print(rbm)

    rbm.train(x[idx_train, :],
              cd_k=1,
              learning_rate=0.001,
              momentum_rate=0.9,
              max_iter=1000,
              batch_size=20,
              n_iter_stop_skip=10,
              goal=euclidian,
              # cv_input_data=cv_input,
              stop_threshold=0.15,
              # neural_local_gain=(0.05, 0.95, 0.01, 100),
              regularization_rate=0.1,
              # regularization_norm=l1,
              d_regularization_norm=d_l1,
              )
The official definition of gambling is “to play any game of chance for money or other stakes,” so gambling does not have to take place in a casino; betting can happen anywhere… One of the most popular sorts of betting outside a บาคาร่า casino is probably the home poker game. It seems as if everyone is always holding a game of Texas Hold 'em at their house. These games are easy to get together and are a fun way to hang out with friends. If you win, it's also a fun way to make some fast money.
# -*- encoding: utf-8 -*-
from django.test import TestCase
from mock import patch
from digest.management.commands.import_python_weekly import _get_content, \
    _get_blocks
from digest.utils import MockResponse
from digest.utils import read_fixture


class ImportPythonWeeklyBadTest(TestCase):
    def test_get_content_bad_link(self):
        content = _get_content('htt://googl.fa')
        self.assertEqual(content, '')


class ImportPythonWeeklyTest(TestCase):
    def setUp(self):
        self.url = 'http://us2.campaign-archive1.com/?u=e2e180baf855ac797ef407fc7&id=31658452eb&utm_content=buffera9dc3&utm_medium=social&utm_source=twitter.com&utm_campaign=buffer'
        test_name = 'fixture_test_import_python_weekly_test_get_blocks.txt'
        self.patcher = patch(
            'digest.management.commands.import_python_weekly.urlopen')
        self.urlopen_mock = self.patcher.start()
        self.urlopen_mock.return_value = MockResponse(read_fixture(test_name))
        # list(map(save_item, map(_apply_rules, map(_get_block_item, _get_blocks(url)))))

    def tearDown(self):
        self.patcher.stop()

    def test_get_content(self):
        content = _get_content(self.url)
        self.assertEqual(len(content), 48233)

    def test_get_blocks(self):
        blocks = _get_blocks(self.url)
        self.assertEqual(len(blocks), 28)
        return blocks
A familiar problem for most homeowners, fruit flies are commonly spotted swarming around kitchen fruit bowls and near garbage storage areas. Terro fruit fly traps lure adult fruit flies into the trap using a non-toxic, food-based liquid lure. Once the flies enter the trap, they cannot escape to continue breeding and multiplying. And better yet, gone are the days of unsightly traps: this attractive, apple-shaped trap looks right at home in any kitchen.
import six
from django.urls import reverse
from django.utils.functional import lazy
from django.utils.translation import ungettext
from oioioi.base.utils import make_navbar_badge
from oioioi.contests.utils import can_enter_contest, is_contest_basicadmin
from oioioi.questions.utils import unanswered_questions
from oioioi.questions.views import new_messages, visible_messages
from oioioi.status.registry import status_registry
def navbar_tip_processor(request):
    if not getattr(request, 'contest', None):
        return {}
    if not request.user.is_authenticated:
        return {}
    if not can_enter_contest(request):
        return {}

    def generator():
        return make_navbar_badge(**navbar_messages_generator(request))
    return {'extra_navbar_right_messages': lazy(generator, six.text_type)()}


@status_registry.register
def get_messages(request, response):
    response['messages'] = navbar_messages_generator(request)
    return response


def navbar_messages_generator(request):
    if request.contest is None:
        return {}
    is_admin = is_contest_basicadmin(request)
    vis_messages = visible_messages(request)
    if is_admin:
        messages = unanswered_questions(vis_messages)
    else:
        messages = new_messages(request, vis_messages)
    count = messages.count()
    if count:
        text = ungettext('%(count)d NEW MESSAGE', '%(count)d NEW MESSAGES', count) % {
            'count': count
        }
        if count == 1:
            m = messages.get()
            link = reverse(
                'message',
                kwargs={
                    'contest_id': request.contest.id,
                    'message_id': m.top_reference_id
                    if vis_messages.filter(id=m.top_reference_id).exists()
                    else m.id,
                },
            )
        else:
            link = reverse(
                'contest_messages', kwargs={'contest_id': request.contest.id}
            )
        return {'link': link, 'text': text, 'id': 'contest_new_messages'}
    else:
        return {'link': None, 'text': None, 'id': 'contest_new_messages'}
Fairbanks Alaska. April 22, 1996.
30 second exposure, 50 mm lens, Kodak 1000 Gold film, f/4.5, lots of aurora. Time about 1:15 ADT. Comet lower middle.
from django.conf.urls import include, url
from django.http import HttpResponse
from django.shortcuts import redirect
from django.contrib import admin
from django.contrib.auth.decorators import login_required
import six
if six.PY2:
    from django.core.urlresolvers import reverse
else:
    from django.urls import reverse

from tests.fake_webapp import (
    EXAMPLE_HTML,
    EXAMPLE_IFRAME_HTML,
    EXAMPLE_ALERT_HTML,
    EXAMPLE_TYPE_HTML,
    EXAMPLE_NO_BODY_HTML,
    EXAMPLE_POPUP_HTML,
    EXAMPLE_REDIRECT_LOCATION_HTML,
)

admin.autodiscover()


def index(request):
    return HttpResponse(EXAMPLE_HTML)


def iframed(request):
    return HttpResponse(EXAMPLE_IFRAME_HTML)


def alertd(request):
    return HttpResponse(EXAMPLE_ALERT_HTML)


def type(request):
    return HttpResponse(EXAMPLE_TYPE_HTML)


def no_body(request):
    return HttpResponse(EXAMPLE_NO_BODY_HTML)


def get_name(request):
    return HttpResponse("My name is: Master Splinter")


def get_user_agent(request):
    # Django exposes request headers via META with an HTTP_ prefix;
    # META["User-Agent"] would raise a KeyError.
    return HttpResponse(request.META["HTTP_USER_AGENT"])


def post_form(request):
    items = "\n".join("{}: {}".format(*item) for item in request.POST.items())
    body = "<html><body>{}</body></html>".format(items)
    return HttpResponse(body)


def request_headers(request):
    body = "\n".join(
        "%s: %s" % (key, value) for key, value in six.iteritems(request.META)
    )
    return HttpResponse(body)


def upload_file(request):
    if request.method == "POST":
        f = request.FILES["file"]
        buffer = []
        buffer.append("Content-type: %s" % f.content_type)
        buffer.append("File content: %s" % f.read())
        return HttpResponse("|".join(buffer))


def foo(request):
    return HttpResponse("BAR!")


def query_string(request):
    # HttpRequest has no `query_string` attribute; the raw query string
    # lives in META["QUERY_STRING"].
    if request.META.get("QUERY_STRING") == "model":
        return HttpResponse("query string is valid")
    else:
        raise Exception("500")


def popup(request):
    return HttpResponse(EXAMPLE_POPUP_HTML)


@login_required
def auth_required(request):
    return HttpResponse("Success!")


def redirected(request):
    location = "{}?{}".format(reverse("redirect_location"), "come=get&some=true")
    return redirect(location)


def redirect_location(request):
    return HttpResponse(EXAMPLE_REDIRECT_LOCATION_HTML)


urlpatterns = [
    url(r"^$", index),
    url(r"^iframe$", iframed),
    url(r"^alert$", alertd),
    url(r"^type$", type),
    url(r"^no_body$", no_body),
    url(r"^name$", get_name),
    url(r"^useragent$", get_user_agent),
    url(r"^headers$", request_headers),
    url(r"^upload$", upload_file),
    url(r"^foo$", foo),
    url(r"^query$", query_string),
    url(r"^popup$", popup),
    url(r"^authenticate$", auth_required),
    url(r"^redirected", redirected),
    url(r"^post", post_form),
    url(r"^redirect-location", redirect_location, name="redirect_location"),
]

if six.PY2:
    urlpatterns.append(url(r"^admin/", include(admin.site.urls)))
else:
    urlpatterns.append(url(r"^admin/", admin.site.urls))
The funny thing about the Bible: you can read a story all your life and still get a new message from what you read. That happened to me this week as I went through Mark 8.
There is a moment when Jesus is, very clearly, telling his disciples what his death and resurrection would look like. Peter, the fairly outspoken disciple whose words seem to spill out faster than his mind can process them, “rebukes” Jesus for the conversation. I’m not sure if the rebuke was because Jesus was speaking so bluntly, or if he was trying to encourage “positive thoughts” among the crew. Either way, he made it pretty clear that he thought Jesus was out of line.
Jesus came back with one of the harshest rebukes we see him give a follower.
Get behind me Satan!… You do not have in mind the things of God, but merely human concerns.
I think that, every time I’ve read this, I’ve been so put off balance by the “Get behind me Satan” part that I have tended to miss the real rebuke.
Whatever Peter’s concerns or desires, they were out of line with God’s plan.
I don’t think Peter was saying anything I would have thought was out of line. We WANT to avoid talk of pain. We WANT to be positive as we make our plans. I think Peter thought he was taking a stand for something good.
But then it struck me. How often do I pray in the same way? I pray for “a good day today”. I pray over my daughters that they would “have a good night’s sleep”. I am praying for my wife that she “have a great trip”. I ask God for things that are merely human concerns. How can I shift my prayers to pray for “things of God”?
Perhaps not “help her to have a good night’s sleep”, but rather “help her to shine your light at school tomorrow”.
Perhaps not “help me to have a good day and get my project done”, but rather “help me to see others as you see them. Give me the words I need to respond to a difficult situation in a way that points to You”.
Perhaps not “grant her a safe trip”, but rather “help her to learn more in a way that makes her more effective in accomplishing the tasks you have given her”.
I’m not completely sure what that looks like yet, but I do realize that I need to move my prayer life from my human concerns to the things of God.
# -*- encoding: utf-8 -*-
##############################################################################
#
# OpenERP, Open Source Management Solution
# Copyright (c) 2011 CCI Connect asbl (http://www.cciconnect.be) All Rights Reserved.
# Philmer <[email protected]>
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
##############################################################################
{
    'name': 'Search By Name Vat',
    'version': '1.0',
    'author': 'OpenERP',
    'category': 'Accounting & Finance',
    'website': 'http://www.openerp.com',
    'description': """
Modify res_partner search method
==================================
Functionality to add Search by Name and Vat in res_partner model
Vat + Name parameter added to search res_partner
Vat + Name parameter added in res_partner field selection
""",
    'depends': ['base'],
    'data': [],
    'active': False,
    'installable': True,
}
# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
Don't Run Naked Through The Office reveals the secrets of employee engagement from the employee's perspective. It has been called a must-read for anyone frustrated with their job, business leaders with a desire to increase employee engagement and productivity in challenging economic times, new college graduates heading into a competitive workplace, and others wanting to avoid missteps in the advancement of their careers.
Penned from a worker's perspective by an experienced corporate training manager, with contributions from a veteran job recruiter, Don't Run Naked Through the Office explains the four types of workplace environments, offers guidance on navigating the challenges of each, and walks readers through an effective, self-managed approach to professional development by creating a simple but effective Workplace Survival Plan that can benefit anyone in any discipline at any corporate level.
# Protocol Buffers - Google's data interchange format
# Copyright 2008 Google Inc. All rights reserved.
# https://developers.google.com/protocol-buffers/
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following disclaimer
# in the documentation and/or other materials provided with the
# distribution.
# * Neither the name of Google Inc. nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
"""Encoding related utilities."""
import re
import six
_cescape_chr_to_symbol_map = {}
_cescape_chr_to_symbol_map[9] = r'\t' # optional escape
_cescape_chr_to_symbol_map[10] = r'\n' # optional escape
_cescape_chr_to_symbol_map[13] = r'\r' # optional escape
_cescape_chr_to_symbol_map[34] = r'\"' # necessary escape
_cescape_chr_to_symbol_map[39] = r"\'" # optional escape
_cescape_chr_to_symbol_map[92] = r'\\' # necessary escape
# Lookup table for unicode
_cescape_unicode_to_str = [chr(i) for i in range(0, 256)]
for byte, string in _cescape_chr_to_symbol_map.items():
  _cescape_unicode_to_str[byte] = string

# Lookup table for non-utf8, with necessary escapes at (o >= 127 or o < 32)
_cescape_byte_to_str = ([r'\%03o' % i for i in range(0, 32)] +
                        [chr(i) for i in range(32, 127)] +
                        [r'\%03o' % i for i in range(127, 256)])
for byte, string in _cescape_chr_to_symbol_map.items():
  _cescape_byte_to_str[byte] = string
del byte, string
def CEscape(text, as_utf8):
  # type: (...) -> str
  """Escape a bytes string for use in a text protocol buffer.

  Args:
    text: A byte string to be escaped.
    as_utf8: Specifies if result may contain non-ASCII characters.
        In Python 3 this allows unescaped non-ASCII Unicode characters.
        In Python 2 the return value will be valid UTF-8 rather than only
        ASCII.
  Returns:
    Escaped string (str).
  """
  # Python's text.encode() 'string_escape' or 'unicode_escape' codecs do not
  # satisfy our needs; they encode unprintable characters using two-digit hex
  # escapes whereas our C++ unescaping function allows hex escapes to be any
  # length.  So, "\0011".encode('string_escape') ends up being "\\x011", which
  # will be decoded in C++ as a single-character string with char code 0x11.
  if six.PY3:
    text_is_unicode = isinstance(text, str)
    if as_utf8 and text_is_unicode:
      # We're already unicode, no processing beyond control char escapes.
      return text.translate(_cescape_chr_to_symbol_map)
    ord_ = ord if text_is_unicode else lambda x: x  # bytes iterate as ints.
  else:
    ord_ = ord  # PY2
  if as_utf8:
    return ''.join(_cescape_unicode_to_str[ord_(c)] for c in text)
  return ''.join(_cescape_byte_to_str[ord_(c)] for c in text)
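# Usage sketch (assumed round-trip behavior, inferred from the docstrings
# above rather than from running the library):
#   CEscape(b'\x00hi"', as_utf8=False)  -> r'\000hi\"'
#   CUnescape(r'\000hi\"')              -> b'\x00hi"'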
_CUNESCAPE_HEX = re.compile(r'(\\+)x([0-9a-fA-F])(?![0-9a-fA-F])')
def CUnescape(text):
  # type: (str) -> bytes
  """Unescape a text string with C-style escape sequences to UTF-8 bytes.

  Args:
    text: The data to parse in a str.
  Returns:
    A byte string.
  """

  def ReplaceHex(m):
    # Only replace the match if the number of leading back slashes is odd. i.e.
    # the slash itself is not escaped.
    if len(m.group(1)) & 1:
      return m.group(1) + 'x0' + m.group(2)
    return m.group(0)

  # This is required because the 'string_escape' encoding doesn't
  # allow single-digit hex escapes (like '\xf').
  result = _CUNESCAPE_HEX.sub(ReplaceHex, text)

  if six.PY2:
    return result.decode('string_escape')
  return (result.encode('utf-8')  # PY3: Make it bytes to allow decode.
          .decode('unicode_escape')
          # Make it bytes again to return the proper type.
          .encode('raw_unicode_escape'))
Community nutrition programs are of increasing interest to dietetic professionals. The Penn State dietetic internship offers many opportunities for interns to explore these programs.
The learning experiences during the four-week Cooperative Extension experience range from food demonstrations to developing educational materials for lower-income children and adults.
A two-week rotation in the supermarket setting provides interns an opportunity to broaden their knowledge of food products and marketing. During the two weeks spent at a GIANT Foods Store in the Harrisburg area, interns conduct food demonstrations and connect customers to nutrition information as they shop. Opportunities for registered dietitians in this segment are increasing and the exposure will give interns an overview of these career opportunities.
A two-week rotation centers on time in the Baird Wellness Center with the Director of Wellness and staff. One week will be spent working on projects at the farm market at Masonic Villages.
The eighteen-week medical nutrition therapy rotation will take place at Penn State Milton S. Hershey Medical Center, located in Hershey, Pennsylvania. Hershey Medical Center is a 507-bed tertiary-care teaching hospital providing specialized services in internal nutrition, oncology, renal disease, cardiology, and pediatrics to residents of central Pennsylvania. The medical center's nutrition staff includes sixteen registered dietitians, half of whom have completed advanced degrees. Interns will receive approximately thirty didactic hours of training in medical nutrition therapy at this site under the supervision of the clinical coordinator. The rotation will provide interns with practical experience in nephrology, trauma/critical care, and pediatric units. Theory and didactic experiences will be incorporated into the practical experiences through interactive learning, case studies and special projects.
Interns will spend five weeks at Masonic Villages in Elizabethtown, Pennsylvania. This continuing care retirement community is located on more than 1,400 acres of Lancaster County farmland. The facility accommodates 1,600 residents with living arrangements from apartments and cottages to skilled nursing care.
In this rotation, interns will receive a comprehensive experience while rotating through the state-of-the-art cook/chill facility, several of the dining rooms available to the residents, and various kitchens throughout the campus. Interns will learn the complexities of forecasting, ordering, and purchasing for a large health care facility. In addition, they will assist in sanitation inspections, employee training, quality control, and customer services.
A five-week school nutrition rotation emphasizes the unique aspects of food service management important to this field. Interns will be able to experience the unique challenges and opportunities in this important segment of the dietetics field working alongside Food Service Directors. Interns will also have a week dedicated to development of a Business Plan on a topic of relevance to the school district assigned.
The goal of the research rotation of the Penn State Dietetic Internship program is to provide dietetic interns with a comprehensive overview of nutrition research, emphasizing the importance of translational nutrition research and its application to public health and dietary recommendations. This rotation totals two weeks, scheduled within the clinical rotations.
The registered dietitian plays a significant role in synthesizing research outcomes and communicating this information to the general public. The research rotation is designed specifically to assist the intern with understanding the research process. Interns will participate in research studies, write a research proposal and participate in journal clubs at The Penn State Milton Hershey Medical Center and Penn State University College of Medicine. In addition, a day will be spent at the Main Campus in University Park to become familiar with various research settings.
The intent of the four-week enrichment rotation is for the intern to further develop skills in an area of dietetics that interests them. Past sites have included HealthSouth Rehabilitation, Hotel Hershey Spa, Super Bakery, San Diego Public School System, Sheetz Corporate Headquarters, Mount Nittany Medical Center (small community hospital), Centre Medical and Surgical Associates, and Blair Medical Associates.
Proposed sites must be submitted to the dietetics program director for approval.
# coding=utf8
"""
Misc utilities.
"""
from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals
from __future__ import division
from datetime import datetime
import sys
import pytz
import pytz.reference
import six
ISO_DATE_FORMAT = "%Y-%m-%dT%H:%M:%S.%f+00:00"
PYTHON2 = sys.version_info < (3,)
def decode(obj):
    """
    Decode obj to unicode if it is a byte string, trying first utf8 and then
    iso-8859-1, raising a `UnicodeDecodeError` if unable to decode a byte
    string, or returning obj unchanged if it is not a byte string.
    """
    if isinstance(obj, six.binary_type):
        try:
            obj = obj.decode("utf8")
        except UnicodeDecodeError:
            obj = obj.decode("iso-8859-1")
    return obj
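# usage sketch: utf8 is tried first, then iso-8859-1
#   decode(b'caf\xc3\xa9') -> u'café'  (valid utf8)
#   decode(b'caf\xe9')     -> u'café'  (utf8 fails; iso-8859-1 fallback)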
def encode(obj, ascii=False):
    """
    Encode the object arg as ascii (unicode-escaped) if `ascii` true or utf8.
    """
    if isinstance(obj, six.text_type):
        obj = obj.encode("unicode-escape" if ascii else "utf8")
    return obj


def needs_ascii(fh):
    """
    Answer whether to encode as ascii for the given file handle, which is based
    on whether the handle has an encoding (None under py2 and UTF-8 under py3)
    and whether the handle is associated with a tty.
    """
    if fh.encoding and fh.encoding != "UTF-8":
        return True
    return not fh.isatty()


def json_serializer(val):
    """A JSON `default` helper function for serializing datetimes."""
    return val.isoformat() if isinstance(val, datetime) else val


def parse_datetime(val):
    """
    Parse datetime string in `ISO_DATE_FORMAT` and return a datetime value.
    """
    return datetime.strptime(val, ISO_DATE_FORMAT).replace(tzinfo=pytz.utc)


def utc_to_local(dt):
    """
    Convert UTC `datetime.datetime` instance to localtime.
    Returns a datetime with `tzinfo` set to the current local timezone.
    """
    local_timezone = pytz.reference.LocalTimezone()
    dt = dt + local_timezone.utcoffset(datetime.now())
    return dt.replace(tzinfo=local_timezone)


def strip(val):
    """
    Strip val, which may be str or iterable of str.
    For str input, returns stripped string, and for iterable input,
    returns list of str values without empty str (after strip) values.
    """
    if isinstance(val, six.string_types):
        return val.strip()
    try:
        return list(filter(None, map(strip, val)))
    except TypeError:
        return val


if PYTHON2:
    def pickle_loads(data, *args, **kwargs):
        """
        Pickle.loads replacement that handles Python2/3 gotchas.
        """
        try:
            from cPickle import loads
        except ImportError:
            from pickle import loads
        return loads(data, *args, **kwargs)

else:
    def pickle_loads(data, *args, **kwargs):
        """
        Pickle.loads replacement that handles Python2/3 gotchas.
        """
        from pickle import loads
        try:
            return loads(data, *args, **kwargs)
        except UnicodeDecodeError as e:
            print(e.args)
            if PYTHON2 or not e.args[0] == "ascii":
                raise
            result = loads(data, encoding="bytes")
            # need to handle a py2-pickled dict having bytes keys, which will
            # be skipped in python3, so we convert all keys to str if needed
            if isinstance(result, dict):
                d = {}
                method = result.iteritems if PYTHON2 else result.items
                for k, v in method():
                    if isinstance(k, bytes):
                        k = k.decode("ascii")
                    d[k] = v
                if d:
                    result = d
            return result
def monkeypatch():
    """
    Monkeypatch whoosh.compat.loads to use the py2/3-safe `pickle_loads`
    defined above.
    """
    import whoosh.compat
    whoosh.compat.loads = pickle_loads
Discover our beautiful natural-shape wooden mirror, glass included. We have several models with different finishes.
Nice for lovers of rustic items.
from django.conf import settings
from django.db import models
from django.utils import timezone
from common.utils import create_hash
from common.get_lat_lon_exif_pil import get_lat_lon_backup
from common.s3_synch import upload_file_to_s3, remove_file_from_s3, get_file_from_s3
from custom_user.models import User
from gallery.models import Gallery
import PIL
import os
import threading
def upload_to(instance, filename):
    '''
    Defines a dynamic directory for files to be uploaded to
    http://stackoverflow.com/questions/6350153/getting-username-in-imagefield-upload-to-path
    '''
    directory = ''.join([settings.MEDIA_ROOT, 'galleries/', str(instance.family_id), '/', str(instance.gallery.id)])
    if not os.path.exists(directory):
        os.makedirs(directory)
    return 'galleries/%s/%s/%s' % (instance.family_id, instance.gallery.id, filename)
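# e.g. with family_id 12 and gallery id 34 (hypothetical values), the file
# lands under MEDIA_ROOT and the stored relative path would be
# 'galleries/12/34/photo.jpg'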
class Image(models.Model):
    '''
    Represents an image uploaded to a gallery
    '''
    class Meta:
        # Allows models.py to be split up across multiple files
        app_label = 'gallery'
        indexes = [
            models.Index(fields=['gallery']),
            models.Index(fields=['family']),
            models.Index(fields=['date_taken'])
        ]

    gallery = models.ForeignKey(Gallery, blank=False, null=False, on_delete=models.CASCADE)
    family = models.ForeignKey('family_tree.Family', null=False, on_delete=models.CASCADE)  # Use of model string name to prevent circular import

    original_image = models.ImageField(upload_to=upload_to, blank=True, null=False, width_field='original_image_width', height_field='original_image_height')
    original_image_height = models.IntegerField(null=True)
    original_image_width = models.IntegerField(null=True)

    thumbnail = models.ImageField(upload_to=upload_to, blank=True, null=False, width_field='thumbnail_width', height_field='thumbnail_height')
    thumbnail_height = models.IntegerField(null=True)
    thumbnail_width = models.IntegerField(null=True)

    large_thumbnail = models.ImageField(upload_to=upload_to, blank=True, null=False, width_field='large_thumbnail_width', height_field='large_thumbnail_height')
    large_thumbnail_height = models.IntegerField(null=True)
    large_thumbnail_width = models.IntegerField(null=True)

    title = models.CharField(max_length=50)
    description = models.TextField(blank=True)

    # EXIF data
    date_taken = models.DateTimeField(null=False)
    latitude = models.FloatField(blank=True, null=False, default=0)  # (0,0) is in the middle of the ocean so can set this to 0 to avoid nulls
    longitude = models.FloatField(blank=True, null=False, default=0)

    # Tracking
    creation_date = models.DateTimeField(auto_now_add=True)
    last_updated_date = models.DateTimeField(auto_now=True)
    uploaded_by = models.ForeignKey(User, blank=True, null=True, on_delete=models.SET_NULL)

    def __str__(self):  # __unicode__ on Python 2
        return self.title

    def save(self, *args, **kwargs):
        '''
        Overrides the save method
        '''
        self.family_id = self.gallery.family_id

        if self.id is None or self.id <= 0:
            new_record = True
        else:
            new_record = False

        if not self.date_taken:
            self.date_taken = timezone.now()

        # Need to call save first before making thumbnails so image path is set properly
        super(Image, self).save(*args, **kwargs)  # Call the "real" save() method.

        # Don't need to do the rest if editing existing image
        if not new_record:
            return

        im = PIL.Image.open(self._get_absolute_image_path())
        self._populate_exif_data(im)
        self.make_thumbnails(im)

        # Set last updated data on Gallery
        self.gallery.save()

        # Ensure that this has not been set to null
        if not self.date_taken:
            self.date_taken = timezone.now()

        super(Image, self).save(*args, **kwargs)  # Call the "real" save() method.

    def make_thumbnails(self, image=None):
        '''
        Creates the thumbnails for the images
        It also sets a thumbnail for the gallery if none exists
        '''
        if not self.original_image:
            return

        if not self.large_thumbnail:
            self.large_thumbnail, image = self._create_thumbnail((960, 960))

        if not self.thumbnail:
            self.thumbnail, image = self._create_thumbnail((200, 200), image)

        # Set the gallery thumbnail
        if not self.gallery.thumbnail:
            self.gallery.thumbnail = self.thumbnail
            self.gallery.thumbnail_height = self.thumbnail_height
            self.gallery.thumbnail_width = self.thumbnail_width

    def _create_thumbnail(self, size, image=None):
        '''
        Creates the thumbnails
        '''
        if not image:
            image = PIL.Image.open(self._get_absolute_image_path())

        image.thumbnail(size)

        filename = create_hash(str(self.original_image)) + '.jpg'
        path_and_filename = upload_to(self, str(filename))

        image = image.convert('RGB')
        image.save(settings.MEDIA_ROOT + str(path_and_filename), "JPEG", quality=90)

        return path_and_filename, image

    def _get_absolute_image_path(self, path=None):
        '''
        Gets the absolute image path
        '''
        if not path:
            path = self.original_image

        if settings.MEDIA_ROOT in str(path):
            image_file = str(path)
        else:
            image_file = settings.MEDIA_ROOT + str(path)

        return image_file

    def _populate_exif_data(self, image=None):
        '''
        Uses the exif data from an image to populate fields on the image model
        http://stackoverflow.com/questions/6460381/translate-exif-dms-to-dd-geolocation-with-python
        '''
        if self.latitude != 0 and self.longitude != 0:
            return

        # if not image:
        #     image = PIL.Image.open(self._get_absolute_image_path())

        # Issue with PIL GPS tag reading so using another library
        lat, lon, date_time = get_lat_lon_backup(self._get_absolute_image_path())
        self.latitude = lat
        self.longitude = lon
        self.date_taken = date_time

    def delete_local_image_files(self):
        '''
        Deletes the original image and thumbnails associated with this
        object
        '''
        try:
            os.remove(self._get_absolute_image_path(self.original_image))
        except OSError:
            pass

        try:
            os.remove(self._get_absolute_image_path(self.thumbnail))
        except OSError:
            pass

        try:
            os.remove(self._get_absolute_image_path(self.large_thumbnail))
        except OSError:
            pass

    def delete_remote_image_files(self):
        '''
        Deletes the image and thumbnails associated with this
        object on s3
        '''
        t1 = threading.Thread(target=remove_file_from_s3, args=(self.original_image,))
        t2 = threading.Thread(target=remove_file_from_s3, args=(self.thumbnail,))
        t3 = threading.Thread(target=remove_file_from_s3, args=(self.large_thumbnail,))

        t1.start()
        t2.start()
        t3.start()

        t1.join()
        t2.join()
        t3.join()

    def upload_files_to_s3(self):
        '''
        Uploads image and thumbnail files to s3
        '''
        t1 = threading.Thread(target=upload_file_to_s3, args=(self.original_image,))
        t2 = threading.Thread(target=upload_file_to_s3, args=(self.thumbnail,))
        t3 = threading.Thread(target=upload_file_to_s3, args=(self.large_thumbnail,))

        t1.start()
        t2.start()
        t3.start()

        t1.join()
        t2.join()
        t3.join()

    def rotate(self, anticlockwise_angle=90):
        '''
        Rotates the image and all thumbnails
        '''
        thumbnail = self._rotate_image(self.thumbnail, anticlockwise_angle)
        thumbnail_path_and_filename = upload_to(self, str(create_hash(str(self.original_image)) + '.jpg'))
        thumbnail.save(settings.MEDIA_ROOT + str(thumbnail_path_and_filename), "JPEG", quality=95)

        large_thumbnail = self._rotate_image(self.large_thumbnail, anticlockwise_angle)
        large_thumbnail_path_and_filename = upload_to(self, str(create_hash(str(self.original_image)) + '.jpg'))
        large_thumbnail.save(settings.MEDIA_ROOT + str(large_thumbnail_path_and_filename), "JPEG", quality=95)

        original_image = self._rotate_image(self.original_image, anticlockwise_angle)
        original_image_path_and_filename = upload_to(self, str(create_hash(str(self.original_image)) + '.jpg'))
        original_image.save(settings.MEDIA_ROOT + str(original_image_path_and_filename), "JPEG", quality=95)

        self.delete_local_image_files()
        self.delete_remote_image_files()

        self.thumbnail = thumbnail_path_and_filename
        self.large_thumbnail = large_thumbnail_path_and_filename
        self.original_image = original_image_path_and_filename

        self.save()
        self.upload_files_to_s3()
        self.delete_local_image_files()

    def _rotate_image(self, path, anticlockwise_angle=90):
        '''
        Rotates an image
        '''
        get_file_from_s3(path)

        image = PIL.Image.open(settings.MEDIA_ROOT + str(path))
        return image.rotate(anticlockwise_angle, resample=PIL.Image.BICUBIC, expand=True)
Stu Ross, The Outreach Foundation’s mission staff in Kenya, has been busy this summer with visiting volunteer church teams and church dedications. In this update, Stu shares stories about three of the PCEA churches in Kenya that were recently completed and dedicated – Nagum, Olosirkon, and Suswa.
Stu writes about the dedication of Nagum Church: “This beautiful church was dedicated today, July 16. One of the remarkable things about this church is that it is in a very remote location with no water, no stores, and no roads. But people are there, and the church was crowded. The people were so happy with their new church.”
Also from Stu, about the Olosirkon Church dedication: “Imagine the joy of the Olosirkon congregation as we processed from the old mabati (steel) church to their new stone church. It took two years to complete. The church is beautiful with a ceramic tile floor, high curved altar, stained glass window on the back wall, two side rooms, and a handicap ramp at the front entrance.”
Stu continues: “We dedicated another church deep in Maasai land, PCEA Suswa Church. The church is located in the Narok side of Maasai land, over 70 kms (~43 miles) away on a bush dirt road off the paved road. Another very remote area.”
The Outreach Foundation is grateful for your partnership with Stu Ross in East Africa.
Read more about PCEA Stone Church Construction by clicking HERE.
Read more about PCEA Mabati Church Construction by clicking HERE.
There are many more congregations in need of gathering structures for worship and community life. Outreach is seeking gifts totaling $120,000 for four stone churches at $30,000 each and $80,000 to $100,000 for a new Mabati church building.
# Copyright 2014 IBM Corp.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import logging
import os
import uuid
import fixtures
import mock
import oslo_config.fixture
from oslo_db.sqlalchemy import migration
from oslo_log import log
from six.moves import configparser
from six.moves import range
from testtools import matchers
from keystone.auth import controllers
from keystone.cmd import cli
from keystone.cmd.doctor import caching
from keystone.cmd.doctor import credential
from keystone.cmd.doctor import database as doc_database
from keystone.cmd.doctor import debug
from keystone.cmd.doctor import federation
from keystone.cmd.doctor import ldap
from keystone.cmd.doctor import security_compliance
from keystone.cmd.doctor import tokens
from keystone.cmd.doctor import tokens_fernet
from keystone.common import dependency
from keystone.common.sql import upgrades
import keystone.conf
from keystone import exception
from keystone.i18n import _
from keystone.identity.mapping_backends import mapping as identity_mapping
from keystone.tests import unit
from keystone.tests.unit import default_fixtures
from keystone.tests.unit.ksfixtures import database
from keystone.tests.unit.ksfixtures import ldapdb
CONF = keystone.conf.CONF
class CliTestCase(unit.SQLDriverOverrides, unit.TestCase):
def config_files(self):
config_files = super(CliTestCase, self).config_files()
config_files.append(unit.dirs.tests_conf('backend_sql.conf'))
return config_files
def test_token_flush(self):
self.useFixture(database.Database())
self.load_backends()
cli.TokenFlush.main()
class CliNoConfigTestCase(unit.BaseTestCase):
def setUp(self):
self.config_fixture = self.useFixture(oslo_config.fixture.Config(CONF))
self.config_fixture.register_cli_opt(cli.command_opt)
self.useFixture(fixtures.MockPatch(
'oslo_config.cfg.find_config_files', return_value=[]))
super(CliNoConfigTestCase, self).setUp()
# NOTE(crinkle): the command call doesn't have to actually work,
# that's what the other unit tests are for. So just mock it out.
class FakeConfCommand(object):
def __init__(self):
self.cmd_class = mock.Mock()
self.useFixture(fixtures.MockPatchObject(
CONF, 'command', FakeConfCommand()))
self.logging = self.useFixture(fixtures.FakeLogger(level=log.WARN))
def test_cli(self):
expected_msg = 'Config file not found, using default configs.'
cli.main(argv=['keystone-manage', 'db_sync'])
self.assertThat(self.logging.output, matchers.Contains(expected_msg))
class CliBootStrapTestCase(unit.SQLDriverOverrides, unit.TestCase):
def setUp(self):
self.useFixture(database.Database())
super(CliBootStrapTestCase, self).setUp()
def config_files(self):
self.config_fixture.register_cli_opt(cli.command_opt)
config_files = super(CliBootStrapTestCase, self).config_files()
config_files.append(unit.dirs.tests_conf('backend_sql.conf'))
return config_files
def config(self, config_files):
CONF(args=['bootstrap', '--bootstrap-password', uuid.uuid4().hex],
project='keystone',
default_config_files=config_files)
def test_bootstrap(self):
bootstrap = cli.BootStrap()
self._do_test_bootstrap(bootstrap)
def _do_test_bootstrap(self, bootstrap):
bootstrap.do_bootstrap()
project = bootstrap.resource_manager.get_project_by_name(
bootstrap.project_name,
'default')
user = bootstrap.identity_manager.get_user_by_name(
bootstrap.username,
'default')
role = bootstrap.role_manager.get_role(bootstrap.role_id)
role_list = (
bootstrap.assignment_manager.get_roles_for_user_and_project(
user['id'],
project['id']))
self.assertIs(1, len(role_list))
self.assertEqual(role_list[0], role['id'])
# NOTE(morganfainberg): Pass an empty context, it isn't used by
# `authenticate` method.
bootstrap.identity_manager.authenticate(
self.make_request(),
user['id'],
bootstrap.password)
if bootstrap.region_id:
region = bootstrap.catalog_manager.get_region(bootstrap.region_id)
self.assertEqual(self.region_id, region['id'])
if bootstrap.service_id:
svc = bootstrap.catalog_manager.get_service(bootstrap.service_id)
self.assertEqual(self.service_name, svc['name'])
self.assertEqual(set(['admin', 'public', 'internal']),
set(bootstrap.endpoints))
urls = {'public': self.public_url,
'internal': self.internal_url,
'admin': self.admin_url}
for interface, url in urls.items():
endpoint_id = bootstrap.endpoints[interface]
endpoint = bootstrap.catalog_manager.get_endpoint(endpoint_id)
self.assertEqual(self.region_id, endpoint['region_id'])
self.assertEqual(url, endpoint['url'])
self.assertEqual(svc['id'], endpoint['service_id'])
self.assertEqual(interface, endpoint['interface'])
def test_bootstrap_is_idempotent_when_password_does_not_change(self):
# NOTE(morganfainberg): Ensure we can run bootstrap with the same
# configuration multiple times without erroring.
bootstrap = cli.BootStrap()
self._do_test_bootstrap(bootstrap)
v3_token_controller = controllers.Auth()
v3_password_data = {
'identity': {
"methods": ["password"],
"password": {
"user": {
"name": bootstrap.username,
"password": bootstrap.password,
"domain": {
"id": CONF.identity.default_domain_id
}
}
}
}
}
auth_response = v3_token_controller.authenticate_for_token(
self.make_request(), v3_password_data)
token = auth_response.headers['X-Subject-Token']
self._do_test_bootstrap(bootstrap)
# build validation request
request = self.make_request(is_admin=True)
request.context_dict['subject_token_id'] = token
# Make sure the token we authenticate for is still valid.
v3_token_controller.validate_token(request)
def test_bootstrap_is_not_idempotent_when_password_does_change(self):
# NOTE(lbragstad): Ensure bootstrap isn't idempotent when run with
# different arguments or configuration values.
bootstrap = cli.BootStrap()
self._do_test_bootstrap(bootstrap)
v3_token_controller = controllers.Auth()
v3_password_data = {
'identity': {
"methods": ["password"],
"password": {
"user": {
"name": bootstrap.username,
"password": bootstrap.password,
"domain": {
"id": CONF.identity.default_domain_id
}
}
}
}
}
auth_response = v3_token_controller.authenticate_for_token(
self.make_request(), v3_password_data)
token = auth_response.headers['X-Subject-Token']
os.environ['OS_BOOTSTRAP_PASSWORD'] = uuid.uuid4().hex
self._do_test_bootstrap(bootstrap)
# build validation request
request = self.make_request(is_admin=True)
request.context_dict['subject_token_id'] = token
# Since the user account was recovered with a different password, we
# shouldn't be able to validate this token. Bootstrap should have
# persisted a revocation event because the user's password was updated.
# Since this token was obtained using the original password, it should
# now be invalid.
self.assertRaises(
exception.TokenNotFound,
v3_token_controller.validate_token,
request
)
def test_bootstrap_recovers_user(self):
bootstrap = cli.BootStrap()
self._do_test_bootstrap(bootstrap)
# Completely lock the user out.
user_id = bootstrap.identity_manager.get_user_by_name(
bootstrap.username,
'default')['id']
bootstrap.identity_manager.update_user(
user_id,
{'enabled': False,
'password': uuid.uuid4().hex})
# The second bootstrap run will recover the account.
self._do_test_bootstrap(bootstrap)
# Sanity check that the original password works again.
bootstrap.identity_manager.authenticate(
self.make_request(),
user_id,
bootstrap.password)
def test_bootstrap_creates_default_role(self):
bootstrap = cli.BootStrap()
try:
role = bootstrap.role_manager.get_role(CONF.member_role_id)
self.fail('Member Role is created and should not be.')
except exception.RoleNotFound:
pass
self._do_test_bootstrap(bootstrap)
role = bootstrap.role_manager.get_role(CONF.member_role_id)
self.assertEqual(role['name'], CONF.member_role_name)
self.assertEqual(role['id'], CONF.member_role_id)
class CliBootStrapTestCaseWithEnvironment(CliBootStrapTestCase):
def config(self, config_files):
CONF(args=['bootstrap'], project='keystone',
default_config_files=config_files)
def setUp(self):
super(CliBootStrapTestCaseWithEnvironment, self).setUp()
self.password = uuid.uuid4().hex
self.username = uuid.uuid4().hex
self.project_name = uuid.uuid4().hex
self.role_name = uuid.uuid4().hex
self.service_name = uuid.uuid4().hex
self.public_url = uuid.uuid4().hex
self.internal_url = uuid.uuid4().hex
self.admin_url = uuid.uuid4().hex
self.region_id = uuid.uuid4().hex
self.default_domain = {
'id': CONF.identity.default_domain_id,
'name': 'Default',
}
self.useFixture(
fixtures.EnvironmentVariable('OS_BOOTSTRAP_PASSWORD',
newvalue=self.password))
self.useFixture(
fixtures.EnvironmentVariable('OS_BOOTSTRAP_USERNAME',
newvalue=self.username))
self.useFixture(
fixtures.EnvironmentVariable('OS_BOOTSTRAP_PROJECT_NAME',
newvalue=self.project_name))
self.useFixture(
fixtures.EnvironmentVariable('OS_BOOTSTRAP_ROLE_NAME',
newvalue=self.role_name))
self.useFixture(
fixtures.EnvironmentVariable('OS_BOOTSTRAP_SERVICE_NAME',
newvalue=self.service_name))
self.useFixture(
fixtures.EnvironmentVariable('OS_BOOTSTRAP_PUBLIC_URL',
newvalue=self.public_url))
self.useFixture(
fixtures.EnvironmentVariable('OS_BOOTSTRAP_INTERNAL_URL',
newvalue=self.internal_url))
self.useFixture(
fixtures.EnvironmentVariable('OS_BOOTSTRAP_ADMIN_URL',
newvalue=self.admin_url))
self.useFixture(
fixtures.EnvironmentVariable('OS_BOOTSTRAP_REGION_ID',
newvalue=self.region_id))
def test_assignment_created_with_user_exists(self):
# test assignment can be created if user already exists.
bootstrap = cli.BootStrap()
bootstrap.resource_manager.create_domain(self.default_domain['id'],
self.default_domain)
user_ref = unit.new_user_ref(self.default_domain['id'],
name=self.username,
password=self.password)
bootstrap.identity_manager.create_user(user_ref)
self._do_test_bootstrap(bootstrap)
def test_assignment_created_with_project_exists(self):
# test assignment can be created if project already exists.
bootstrap = cli.BootStrap()
bootstrap.resource_manager.create_domain(self.default_domain['id'],
self.default_domain)
project_ref = unit.new_project_ref(self.default_domain['id'],
name=self.project_name)
bootstrap.resource_manager.create_project(project_ref['id'],
project_ref)
self._do_test_bootstrap(bootstrap)
def test_assignment_created_with_role_exists(self):
# test assignment can be created if role already exists.
bootstrap = cli.BootStrap()
bootstrap.resource_manager.create_domain(self.default_domain['id'],
self.default_domain)
role = unit.new_role_ref(name=self.role_name)
bootstrap.role_manager.create_role(role['id'], role)
self._do_test_bootstrap(bootstrap)
def test_assignment_created_with_region_exists(self):
# test assignment can be created if region already exists.
bootstrap = cli.BootStrap()
bootstrap.resource_manager.create_domain(self.default_domain['id'],
self.default_domain)
region = unit.new_region_ref(id=self.region_id)
bootstrap.catalog_manager.create_region(region)
self._do_test_bootstrap(bootstrap)
def test_endpoints_created_with_service_exists(self):
        # test endpoints can be created if service already exists.
bootstrap = cli.BootStrap()
bootstrap.resource_manager.create_domain(self.default_domain['id'],
self.default_domain)
service = unit.new_service_ref(name=self.service_name)
bootstrap.catalog_manager.create_service(service['id'], service)
self._do_test_bootstrap(bootstrap)
def test_endpoints_created_with_endpoint_exists(self):
        # test endpoints can be created if endpoint already exists.
bootstrap = cli.BootStrap()
bootstrap.resource_manager.create_domain(self.default_domain['id'],
self.default_domain)
service = unit.new_service_ref(name=self.service_name)
bootstrap.catalog_manager.create_service(service['id'], service)
region = unit.new_region_ref(id=self.region_id)
bootstrap.catalog_manager.create_region(region)
endpoint = unit.new_endpoint_ref(interface='public',
service_id=service['id'],
url=self.public_url,
region_id=self.region_id)
bootstrap.catalog_manager.create_endpoint(endpoint['id'], endpoint)
self._do_test_bootstrap(bootstrap)
class CliDomainConfigAllTestCase(unit.SQLDriverOverrides, unit.TestCase):
def setUp(self):
self.useFixture(database.Database())
super(CliDomainConfigAllTestCase, self).setUp()
self.load_backends()
self.config_fixture.config(
group='identity',
domain_config_dir=unit.TESTCONF + '/domain_configs_multi_ldap')
self.domain_count = 3
self.setup_initial_domains()
self.logging = self.useFixture(
fixtures.FakeLogger(level=logging.INFO))
def config_files(self):
self.config_fixture.register_cli_opt(cli.command_opt)
config_files = super(CliDomainConfigAllTestCase, self).config_files()
config_files.append(unit.dirs.tests_conf('backend_sql.conf'))
return config_files
def cleanup_domains(self):
for domain in self.domains:
if domain == 'domain_default':
# Not allowed to delete the default domain, but should at least
# delete any domain-specific config for it.
self.domain_config_api.delete_config(
CONF.identity.default_domain_id)
continue
this_domain = self.domains[domain]
this_domain['enabled'] = False
self.resource_api.update_domain(this_domain['id'], this_domain)
self.resource_api.delete_domain(this_domain['id'])
self.domains = {}
def config(self, config_files):
CONF(args=['domain_config_upload', '--all'], project='keystone',
default_config_files=config_files)
def setup_initial_domains(self):
def create_domain(domain):
return self.resource_api.create_domain(domain['id'], domain)
self.domains = {}
self.addCleanup(self.cleanup_domains)
for x in range(1, self.domain_count):
domain = 'domain%s' % x
self.domains[domain] = create_domain(
{'id': uuid.uuid4().hex, 'name': domain})
self.default_domain = unit.new_domain_ref(
description=u'The default domain',
id=CONF.identity.default_domain_id,
name=u'Default')
self.domains['domain_default'] = create_domain(self.default_domain)
def test_config_upload(self):
# The values below are the same as in the domain_configs_multi_ldap
# directory of test config_files.
default_config = {
'ldap': {'url': 'fake://memory',
'user': 'cn=Admin',
'password': 'password',
'suffix': 'cn=example,cn=com'},
'identity': {'driver': 'ldap'}
}
domain1_config = {
'ldap': {'url': 'fake://memory1',
'user': 'cn=Admin',
'password': 'password',
'suffix': 'cn=example,cn=com'},
'identity': {'driver': 'ldap',
'list_limit': '101'}
}
domain2_config = {
'ldap': {'url': 'fake://memory',
'user': 'cn=Admin',
'password': 'password',
'suffix': 'cn=myroot,cn=com',
'group_tree_dn': 'ou=UserGroups,dc=myroot,dc=org',
'user_tree_dn': 'ou=Users,dc=myroot,dc=org'},
'identity': {'driver': 'ldap'}
}
# Clear backend dependencies, since cli loads these manually
dependency.reset()
cli.DomainConfigUpload.main()
res = self.domain_config_api.get_config_with_sensitive_info(
CONF.identity.default_domain_id)
self.assertEqual(default_config, res)
res = self.domain_config_api.get_config_with_sensitive_info(
self.domains['domain1']['id'])
self.assertEqual(domain1_config, res)
res = self.domain_config_api.get_config_with_sensitive_info(
self.domains['domain2']['id'])
self.assertEqual(domain2_config, res)
class CliDomainConfigSingleDomainTestCase(CliDomainConfigAllTestCase):
def config(self, config_files):
CONF(args=['domain_config_upload', '--domain-name', 'Default'],
project='keystone', default_config_files=config_files)
def test_config_upload(self):
# The values below are the same as in the domain_configs_multi_ldap
# directory of test config_files.
default_config = {
'ldap': {'url': 'fake://memory',
'user': 'cn=Admin',
'password': 'password',
'suffix': 'cn=example,cn=com'},
'identity': {'driver': 'ldap'}
}
# Clear backend dependencies, since cli loads these manually
dependency.reset()
cli.DomainConfigUpload.main()
res = self.domain_config_api.get_config_with_sensitive_info(
CONF.identity.default_domain_id)
self.assertEqual(default_config, res)
res = self.domain_config_api.get_config_with_sensitive_info(
self.domains['domain1']['id'])
self.assertEqual({}, res)
res = self.domain_config_api.get_config_with_sensitive_info(
self.domains['domain2']['id'])
self.assertEqual({}, res)
def test_no_overwrite_config(self):
# Create a config for the default domain
default_config = {
'ldap': {'url': uuid.uuid4().hex},
'identity': {'driver': 'ldap'}
}
self.domain_config_api.create_config(
CONF.identity.default_domain_id, default_config)
# Now try and upload the settings in the configuration file for the
# default domain
dependency.reset()
with mock.patch('six.moves.builtins.print') as mock_print:
self.assertRaises(unit.UnexpectedExit, cli.DomainConfigUpload.main)
file_name = ('keystone.%s.conf' % self.default_domain['name'])
error_msg = _(
'Domain: %(domain)s already has a configuration defined - '
'ignoring file: %(file)s.') % {
'domain': self.default_domain['name'],
'file': os.path.join(CONF.identity.domain_config_dir,
file_name)}
mock_print.assert_has_calls([mock.call(error_msg)])
res = self.domain_config_api.get_config(
CONF.identity.default_domain_id)
# The initial config should not have been overwritten
self.assertEqual(default_config, res)
class CliDomainConfigNoOptionsTestCase(CliDomainConfigAllTestCase):
def config(self, config_files):
CONF(args=['domain_config_upload'],
project='keystone', default_config_files=config_files)
def test_config_upload(self):
dependency.reset()
with mock.patch('six.moves.builtins.print') as mock_print:
self.assertRaises(unit.UnexpectedExit, cli.DomainConfigUpload.main)
mock_print.assert_has_calls(
[mock.call(
_('At least one option must be provided, use either '
'--all or --domain-name'))])
class CliDomainConfigTooManyOptionsTestCase(CliDomainConfigAllTestCase):
def config(self, config_files):
CONF(args=['domain_config_upload', '--all', '--domain-name',
'Default'],
project='keystone', default_config_files=config_files)
def test_config_upload(self):
dependency.reset()
with mock.patch('six.moves.builtins.print') as mock_print:
self.assertRaises(unit.UnexpectedExit, cli.DomainConfigUpload.main)
mock_print.assert_has_calls(
[mock.call(_('The --all option cannot be used with '
'the --domain-name option'))])
class CliDomainConfigInvalidDomainTestCase(CliDomainConfigAllTestCase):
def config(self, config_files):
self.invalid_domain_name = uuid.uuid4().hex
CONF(args=['domain_config_upload', '--domain-name',
self.invalid_domain_name],
project='keystone', default_config_files=config_files)
def test_config_upload(self):
dependency.reset()
with mock.patch('six.moves.builtins.print') as mock_print:
self.assertRaises(unit.UnexpectedExit, cli.DomainConfigUpload.main)
file_name = 'keystone.%s.conf' % self.invalid_domain_name
error_msg = (_(
'Invalid domain name: %(domain)s found in config file name: '
'%(file)s - ignoring this file.') % {
'domain': self.invalid_domain_name,
'file': os.path.join(CONF.identity.domain_config_dir,
file_name)})
mock_print.assert_has_calls([mock.call(error_msg)])
class TestDomainConfigFinder(unit.BaseTestCase):
def setUp(self):
super(TestDomainConfigFinder, self).setUp()
self.logging = self.useFixture(fixtures.LoggerFixture())
@mock.patch('os.walk')
def test_finder_ignores_files(self, mock_walk):
mock_walk.return_value = [
['.', [], ['file.txt', 'keystone.conf', 'keystone.domain0.conf']],
]
domain_configs = list(cli._domain_config_finder('.'))
expected_domain_configs = [('./keystone.domain0.conf', 'domain0')]
self.assertThat(domain_configs,
matchers.Equals(expected_domain_configs))
expected_msg_template = ('Ignoring file (%s) while scanning '
'domain config directory')
self.assertThat(
self.logging.output,
matchers.Contains(expected_msg_template % 'file.txt'))
self.assertThat(
self.logging.output,
matchers.Contains(expected_msg_template % 'keystone.conf'))
class CliDBSyncTestCase(unit.BaseTestCase):
class FakeConfCommand(object):
def __init__(self, parent):
self.extension = False
self.check = parent.command_check
self.expand = parent.command_expand
self.migrate = parent.command_migrate
self.contract = parent.command_contract
self.version = None
def setUp(self):
super(CliDBSyncTestCase, self).setUp()
self.config_fixture = self.useFixture(oslo_config.fixture.Config(CONF))
self.config_fixture.register_cli_opt(cli.command_opt)
upgrades.offline_sync_database_to_version = mock.Mock()
upgrades.expand_schema = mock.Mock()
upgrades.migrate_data = mock.Mock()
upgrades.contract_schema = mock.Mock()
self.command_check = False
self.command_expand = False
self.command_migrate = False
self.command_contract = False
def _assert_correct_call(self, mocked_function):
for func in [upgrades.offline_sync_database_to_version,
upgrades.expand_schema,
upgrades.migrate_data,
upgrades.contract_schema]:
if func == mocked_function:
self.assertTrue(func.called)
else:
self.assertFalse(func.called)
def test_db_sync(self):
self.useFixture(fixtures.MockPatchObject(
CONF, 'command', self.FakeConfCommand(self)))
cli.DbSync.main()
self._assert_correct_call(
upgrades.offline_sync_database_to_version)
def test_db_sync_expand(self):
self.command_expand = True
self.useFixture(fixtures.MockPatchObject(
CONF, 'command', self.FakeConfCommand(self)))
cli.DbSync.main()
self._assert_correct_call(upgrades.expand_schema)
def test_db_sync_migrate(self):
self.command_migrate = True
self.useFixture(fixtures.MockPatchObject(
CONF, 'command', self.FakeConfCommand(self)))
cli.DbSync.main()
self._assert_correct_call(upgrades.migrate_data)
def test_db_sync_contract(self):
self.command_contract = True
self.useFixture(fixtures.MockPatchObject(
CONF, 'command', self.FakeConfCommand(self)))
cli.DbSync.main()
self._assert_correct_call(upgrades.contract_schema)
@mock.patch('keystone.cmd.cli.upgrades.get_db_version')
def test_db_sync_check_when_database_is_empty(self, mocked_get_db_version):
e = migration.exception.DbMigrationError("Invalid version")
mocked_get_db_version.side_effect = e
checker = cli.DbSync()
log_info = self.useFixture(fixtures.FakeLogger(level=log.INFO))
status = checker.check_db_sync_status()
self.assertIn("not currently under version control", log_info.output)
self.assertEqual(status, 2)
class TestMappingPopulate(unit.SQLDriverOverrides, unit.TestCase):
def setUp(self):
sqldb = self.useFixture(database.Database())
super(TestMappingPopulate, self).setUp()
self.ldapdb = self.useFixture(ldapdb.LDAPDatabase())
self.ldapdb.clear()
self.load_backends()
sqldb.recreate()
self.load_fixtures(default_fixtures)
def config_files(self):
self.config_fixture.register_cli_opt(cli.command_opt)
config_files = super(TestMappingPopulate, self).config_files()
config_files.append(unit.dirs.tests_conf('backend_ldap_sql.conf'))
return config_files
def config_overrides(self):
super(TestMappingPopulate, self).config_overrides()
self.config_fixture.config(group='identity', driver='ldap')
self.config_fixture.config(group='identity_mapping',
backward_compatible_ids=False)
def config(self, config_files):
CONF(args=['mapping_populate', '--domain-name', 'Default'],
project='keystone',
default_config_files=config_files)
def test_mapping_populate(self):
# mapping_populate should create id mappings. Test plan:
# 0. Purge mappings
# 1. Fetch user list directly via backend. It will not create any
# mappings because it bypasses identity manager
# 2. Verify that users have no public_id yet
# 3. Execute mapping_populate. It should create id mappings
# 4. For the same users verify that they have public_id now
purge_filter = {}
self.id_mapping_api.purge_mappings(purge_filter)
hints = None
users = self.identity_api.driver.list_users(hints)
for user in users:
local_entity = {
'domain_id': CONF.identity.default_domain_id,
'local_id': user['id'],
'entity_type': identity_mapping.EntityType.USER}
self.assertIsNone(self.id_mapping_api.get_public_id(local_entity))
dependency.reset() # backends are loaded again in the command handler
cli.MappingPopulate.main()
for user in users:
local_entity = {
'domain_id': CONF.identity.default_domain_id,
'local_id': user['id'],
'entity_type': identity_mapping.EntityType.USER}
self.assertIsNotNone(
self.id_mapping_api.get_public_id(local_entity))
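    # Illustrative CLI equivalent of what this test exercises (a hedged
    # sketch):
    #
    #     keystone-manage mapping_populate --domain-name Default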
def test_bad_domain_name(self):
CONF(args=['mapping_populate', '--domain-name', uuid.uuid4().hex],
project='keystone')
dependency.reset() # backends are loaded again in the command handler
# NOTE: assertEqual is used on purpose. assertFalse passes with None.
self.assertEqual(False, cli.MappingPopulate.main())
class CliDomainConfigUploadNothing(unit.BaseTestCase):
def setUp(self):
super(CliDomainConfigUploadNothing, self).setUp()
config_fixture = self.useFixture(oslo_config.fixture.Config(CONF))
config_fixture.register_cli_opt(cli.command_opt)
# NOTE(dstanek): since this is not testing any database
# functionality there is no need to go through the motions and
        # set up a test database.
def fake_load_backends(self):
self.resource_manager = mock.Mock()
self.useFixture(fixtures.MockPatchObject(
cli.DomainConfigUploadFiles, 'load_backends', fake_load_backends))
tempdir = self.useFixture(fixtures.TempDir())
config_fixture.config(group='identity', domain_config_dir=tempdir.path)
self.logging = self.useFixture(
fixtures.FakeLogger(level=logging.DEBUG))
def test_uploading_all_from_an_empty_directory(self):
CONF(args=['domain_config_upload', '--all'], project='keystone',
default_config_files=[])
cli.DomainConfigUpload.main()
expected_msg = ('No domain configs uploaded from %r' %
CONF.identity.domain_config_dir)
self.assertThat(self.logging.output,
matchers.Contains(expected_msg))
class CachingDoctorTests(unit.TestCase):
def test_symptom_caching_disabled(self):
# Symptom Detected: Caching disabled
self.config_fixture.config(group='cache', enabled=False)
self.assertTrue(caching.symptom_caching_disabled())
# No Symptom Detected: Caching is enabled
self.config_fixture.config(group='cache', enabled=True)
self.assertFalse(caching.symptom_caching_disabled())
def test_caching_symptom_caching_enabled_without_a_backend(self):
# Success Case: Caching enabled and backend configured
self.config_fixture.config(group='cache', enabled=True)
self.config_fixture.config(group='cache', backend='dogpile.cache.null')
self.assertTrue(caching.symptom_caching_enabled_without_a_backend())
# Failure Case 1: Caching disabled and backend not configured
self.config_fixture.config(group='cache', enabled=False)
self.config_fixture.config(group='cache', backend='dogpile.cache.null')
self.assertFalse(caching.symptom_caching_enabled_without_a_backend())
# Failure Case 2: Caching disabled and backend configured
self.config_fixture.config(group='cache', enabled=False)
self.config_fixture.config(group='cache',
backend='dogpile.cache.memory')
self.assertFalse(caching.symptom_caching_enabled_without_a_backend())
# Failure Case 3: Caching enabled and backend configured
self.config_fixture.config(group='cache', enabled=True)
self.config_fixture.config(group='cache',
backend='dogpile.cache.memory')
self.assertFalse(caching.symptom_caching_enabled_without_a_backend())
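# Note (hedged): symptom checks like the ones exercised above are surfaced
# to operators via the `keystone-manage doctor` command.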
class CredentialDoctorTests(unit.TestCase):
def test_credential_and_fernet_key_repositories_match(self):
# Symptom Detected: Key repository paths are not unique
directory = self.useFixture(fixtures.TempDir()).path
self.config_fixture.config(group='credential',
key_repository=directory)
self.config_fixture.config(group='fernet_tokens',
key_repository=directory)
self.assertTrue(credential.symptom_unique_key_repositories())
def test_credential_and_fernet_key_repositories_are_unique(self):
# No Symptom Detected: Key repository paths are unique
self.config_fixture.config(group='credential',
key_repository='/etc/keystone/cred-repo')
self.config_fixture.config(group='fernet_tokens',
key_repository='/etc/keystone/fernet-repo')
self.assertFalse(credential.symptom_unique_key_repositories())
@mock.patch('keystone.cmd.doctor.credential.utils')
def test_usability_of_cred_fernet_key_repo_raised(self, mock_utils):
# Symptom Detected: credential fernet key repository is world readable
self.config_fixture.config(group='credential', provider='fernet')
mock_utils.FernetUtils().validate_key_repository.return_value = False
self.assertTrue(
credential.symptom_usability_of_credential_fernet_key_repository())
@mock.patch('keystone.cmd.doctor.credential.utils')
def test_usability_of_cred_fernet_key_repo_not_raised(self, mock_utils):
# No Symptom Detected: Custom driver is used
self.config_fixture.config(group='credential', provider='my-driver')
mock_utils.FernetUtils().validate_key_repository.return_value = True
self.assertFalse(
credential.symptom_usability_of_credential_fernet_key_repository())
# No Symptom Detected: key repository is not world readable
self.config_fixture.config(group='credential', provider='fernet')
mock_utils.FernetUtils().validate_key_repository.return_value = True
self.assertFalse(
credential.symptom_usability_of_credential_fernet_key_repository())
@mock.patch('keystone.cmd.doctor.credential.utils')
def test_keys_in_credential_fernet_key_repository_raised(self, mock_utils):
# Symptom Detected: Key repo is empty
self.config_fixture.config(group='credential', provider='fernet')
mock_utils.FernetUtils().load_keys.return_value = False
self.assertTrue(
credential.symptom_keys_in_credential_fernet_key_repository())
@mock.patch('keystone.cmd.doctor.credential.utils')
def test_keys_in_credential_fernet_key_repository_not_raised(
self, mock_utils):
# No Symptom Detected: Custom driver is used
self.config_fixture.config(group='credential', provider='my-driver')
mock_utils.FernetUtils().load_keys.return_value = True
self.assertFalse(
credential.symptom_keys_in_credential_fernet_key_repository())
# No Symptom Detected: Key repo is not empty, fernet is current driver
self.config_fixture.config(group='credential', provider='fernet')
mock_utils.FernetUtils().load_keys.return_value = True
self.assertFalse(
credential.symptom_keys_in_credential_fernet_key_repository())
class DatabaseDoctorTests(unit.TestCase):
def test_symptom_is_raised_if_database_connection_is_SQLite(self):
# Symptom Detected: Database connection is sqlite
self.config_fixture.config(
group='database',
connection='sqlite:///mydb')
self.assertTrue(
doc_database.symptom_database_connection_is_not_SQLite())
# No Symptom Detected: Database connection is MySQL
self.config_fixture.config(
group='database',
connection='mysql+mysqlconnector://admin:secret@localhost/mydb')
self.assertFalse(
doc_database.symptom_database_connection_is_not_SQLite())
class DebugDoctorTests(unit.TestCase):
def test_symptom_debug_mode_is_enabled(self):
# Symptom Detected: Debug mode is enabled
self.config_fixture.config(debug=True)
self.assertTrue(debug.symptom_debug_mode_is_enabled())
# No Symptom Detected: Debug mode is disabled
self.config_fixture.config(debug=False)
self.assertFalse(debug.symptom_debug_mode_is_enabled())
class FederationDoctorTests(unit.TestCase):
def test_symptom_comma_in_SAML_public_certificate_path(self):
# Symptom Detected: There is a comma in path to public cert file
self.config_fixture.config(group='saml', certfile='file,cert.pem')
self.assertTrue(
federation.symptom_comma_in_SAML_public_certificate_path())
# No Symptom Detected: There is no comma in the path
self.config_fixture.config(group='saml', certfile='signing_cert.pem')
self.assertFalse(
federation.symptom_comma_in_SAML_public_certificate_path())
def test_symptom_comma_in_SAML_private_key_file_path(self):
# Symptom Detected: There is a comma in path to private key file
self.config_fixture.config(group='saml', keyfile='file,key.pem')
self.assertTrue(
federation.symptom_comma_in_SAML_private_key_file_path())
# No Symptom Detected: There is no comma in the path
self.config_fixture.config(group='saml', keyfile='signing_key.pem')
self.assertFalse(
federation.symptom_comma_in_SAML_private_key_file_path())
class LdapDoctorTests(unit.TestCase):
def test_user_enabled_emulation_dn_ignored_raised(self):
# Symptom when user_enabled_emulation_dn is being ignored because the
# user did not enable the user_enabled_emulation
self.config_fixture.config(group='ldap', user_enabled_emulation=False)
self.config_fixture.config(
group='ldap',
user_enabled_emulation_dn='cn=enabled_users,dc=example,dc=com')
self.assertTrue(
ldap.symptom_LDAP_user_enabled_emulation_dn_ignored())
def test_user_enabled_emulation_dn_ignored_not_raised(self):
# No symptom when configuration set properly
self.config_fixture.config(group='ldap', user_enabled_emulation=True)
self.config_fixture.config(
group='ldap',
user_enabled_emulation_dn='cn=enabled_users,dc=example,dc=com')
self.assertFalse(
ldap.symptom_LDAP_user_enabled_emulation_dn_ignored())
# No symptom when both configurations disabled
self.config_fixture.config(group='ldap', user_enabled_emulation=False)
self.config_fixture.config(group='ldap',
user_enabled_emulation_dn=None)
self.assertFalse(
ldap.symptom_LDAP_user_enabled_emulation_dn_ignored())
def test_user_enabled_emulation_use_group_config_ignored_raised(self):
# Symptom when user enabled emulation isn't enabled but group_config is
# enabled
self.config_fixture.config(group='ldap', user_enabled_emulation=False)
self.config_fixture.config(
group='ldap',
user_enabled_emulation_use_group_config=True)
self.assertTrue(
ldap.
symptom_LDAP_user_enabled_emulation_use_group_config_ignored())
def test_user_enabled_emulation_use_group_config_ignored_not_raised(self):
# No symptom when configuration deactivated
self.config_fixture.config(group='ldap', user_enabled_emulation=False)
self.config_fixture.config(
group='ldap',
user_enabled_emulation_use_group_config=False)
self.assertFalse(
ldap.
symptom_LDAP_user_enabled_emulation_use_group_config_ignored())
# No symptom when configurations set properly
self.config_fixture.config(group='ldap', user_enabled_emulation=True)
self.config_fixture.config(
group='ldap',
user_enabled_emulation_use_group_config=True)
self.assertFalse(
ldap.
symptom_LDAP_user_enabled_emulation_use_group_config_ignored())
def test_group_members_are_ids_disabled_raised(self):
# Symptom when objectclass is set to posixGroup but members_are_ids are
# not enabled
self.config_fixture.config(group='ldap',
group_objectclass='posixGroup')
self.config_fixture.config(group='ldap',
group_members_are_ids=False)
self.assertTrue(ldap.symptom_LDAP_group_members_are_ids_disabled())
def test_group_members_are_ids_disabled_not_raised(self):
# No symptom when the configurations are set properly
self.config_fixture.config(group='ldap',
group_objectclass='posixGroup')
self.config_fixture.config(group='ldap',
group_members_are_ids=True)
self.assertFalse(ldap.symptom_LDAP_group_members_are_ids_disabled())
# No symptom when configuration deactivated
self.config_fixture.config(group='ldap',
group_objectclass='groupOfNames')
self.config_fixture.config(group='ldap',
group_members_are_ids=False)
self.assertFalse(ldap.symptom_LDAP_group_members_are_ids_disabled())
@mock.patch('os.listdir')
@mock.patch('os.path.isdir')
def test_file_based_domain_specific_configs_raised(self, mocked_isdir,
mocked_listdir):
self.config_fixture.config(
group='identity',
domain_specific_drivers_enabled=True)
self.config_fixture.config(
group='identity',
domain_configurations_from_database=False)
# Symptom if there is no existing directory
mocked_isdir.return_value = False
self.assertTrue(ldap.symptom_LDAP_file_based_domain_specific_configs())
# Symptom if there is an invalid filename inside the domain directory
mocked_isdir.return_value = True
mocked_listdir.return_value = ['openstack.domains.conf']
self.assertTrue(ldap.symptom_LDAP_file_based_domain_specific_configs())
@mock.patch('os.listdir')
@mock.patch('os.path.isdir')
def test_file_based_domain_specific_configs_not_raised(self, mocked_isdir,
mocked_listdir):
# No symptom if both configurations deactivated
self.config_fixture.config(
group='identity',
domain_specific_drivers_enabled=False)
self.config_fixture.config(
group='identity',
domain_configurations_from_database=False)
self.assertFalse(
ldap.symptom_LDAP_file_based_domain_specific_configs())
# No symptom if directory exists with no invalid filenames
self.config_fixture.config(
group='identity',
domain_specific_drivers_enabled=True)
self.config_fixture.config(
group='identity',
domain_configurations_from_database=False)
mocked_isdir.return_value = True
mocked_listdir.return_value = ['keystone.domains.conf']
self.assertFalse(
ldap.symptom_LDAP_file_based_domain_specific_configs())
@mock.patch('os.listdir')
@mock.patch('os.path.isdir')
@mock.patch('keystone.cmd.doctor.ldap.configparser.ConfigParser')
def test_file_based_domain_specific_configs_formatted_correctly_raised(
self, mocked_parser, mocked_isdir, mocked_listdir):
symptom = ('symptom_LDAP_file_based_domain_specific_configs'
'_formatted_correctly')
# Symptom Detected: Ldap domain specific configuration files are not
# formatted correctly
self.config_fixture.config(
group='identity',
domain_specific_drivers_enabled=True)
self.config_fixture.config(
group='identity',
domain_configurations_from_database=False)
mocked_isdir.return_value = True
mocked_listdir.return_value = ['keystone.domains.conf']
mock_instance = mock.MagicMock()
mock_instance.read.side_effect = configparser.Error('No Section')
mocked_parser.return_value = mock_instance
self.assertTrue(getattr(ldap, symptom)())
@mock.patch('os.listdir')
@mock.patch('os.path.isdir')
def test_file_based_domain_specific_configs_formatted_correctly_not_raised(
self, mocked_isdir, mocked_listdir):
symptom = ('symptom_LDAP_file_based_domain_specific_configs'
'_formatted_correctly')
# No Symptom Detected: Domain_specific drivers is not enabled
self.config_fixture.config(
group='identity',
domain_specific_drivers_enabled=False)
self.assertFalse(getattr(ldap, symptom)())
# No Symptom Detected: Domain configuration from database is enabled
self.config_fixture.config(
group='identity',
domain_specific_drivers_enabled=True)
self.assertFalse(getattr(ldap, symptom)())
self.config_fixture.config(
group='identity',
domain_configurations_from_database=True)
self.assertFalse(getattr(ldap, symptom)())
# No Symptom Detected: The directory in domain_config_dir doesn't exist
mocked_isdir.return_value = False
self.assertFalse(getattr(ldap, symptom)())
# No Symptom Detected: domain specific drivers are enabled, domain
# configurations from database are disabled, directory exists, and no
# exceptions found.
self.config_fixture.config(
group='identity',
domain_configurations_from_database=False)
mocked_isdir.return_value = True
# An empty directory should not raise this symptom
self.assertFalse(getattr(ldap, symptom)())
# Test again with a file inside the directory
mocked_listdir.return_value = ['keystone.domains.conf']
self.assertFalse(getattr(ldap, symptom)())
class SecurityComplianceDoctorTests(unit.TestCase):
def test_minimum_password_age_greater_than_password_expires_days(self):
# Symptom Detected: Minimum password age is greater than the password
# expires days. Both values are positive integers greater than zero.
self.config_fixture.config(group='security_compliance',
minimum_password_age=2)
self.config_fixture.config(group='security_compliance',
password_expires_days=1)
self.assertTrue(
security_compliance.
symptom_minimum_password_age_greater_than_expires_days())
def test_minimum_password_age_equal_to_password_expires_days(self):
# Symptom Detected: Minimum password age is equal to the password
# expires days. Both values are positive integers greater than zero.
self.config_fixture.config(group='security_compliance',
minimum_password_age=1)
self.config_fixture.config(group='security_compliance',
password_expires_days=1)
self.assertTrue(
security_compliance.
symptom_minimum_password_age_greater_than_expires_days())
def test_minimum_password_age_less_than_password_expires_days(self):
# No Symptom Detected: Minimum password age is less than password
# expires days. Both values are positive integers greater than zero.
self.config_fixture.config(group='security_compliance',
minimum_password_age=1)
self.config_fixture.config(group='security_compliance',
password_expires_days=2)
self.assertFalse(
security_compliance.
symptom_minimum_password_age_greater_than_expires_days())
def test_minimum_password_age_and_password_expires_days_deactivated(self):
# No Symptom Detected: when minimum_password_age's default value is 0
# and password_expires_days' default value is None
self.assertFalse(
security_compliance.
symptom_minimum_password_age_greater_than_expires_days())
def test_invalid_password_regular_expression(self):
# Symptom Detected: Regular expression is invalid
self.config_fixture.config(
group='security_compliance',
password_regex='^^(??=.*\d)$')
self.assertTrue(
security_compliance.symptom_invalid_password_regular_expression())
def test_valid_password_regular_expression(self):
# No Symptom Detected: Regular expression is valid
self.config_fixture.config(
group='security_compliance',
password_regex='^(?=.*\d)(?=.*[a-zA-Z]).{7,}$')
self.assertFalse(
security_compliance.symptom_invalid_password_regular_expression())
def test_password_regular_expression_deactivated(self):
# No Symptom Detected: Regular expression deactivated to None
self.config_fixture.config(
group='security_compliance',
password_regex=None)
self.assertFalse(
security_compliance.symptom_invalid_password_regular_expression())
def test_password_regular_expression_description_not_set(self):
# Symptom Detected: Regular expression is set but description is not
self.config_fixture.config(
group='security_compliance',
password_regex='^(?=.*\d)(?=.*[a-zA-Z]).{7,}$')
self.config_fixture.config(
group='security_compliance',
password_regex_description=None)
self.assertTrue(
security_compliance.
symptom_password_regular_expression_description_not_set())
def test_password_regular_expression_description_set(self):
# No Symptom Detected: Regular expression and description are set
desc = '1 letter, 1 digit, and a minimum length of 7 is required'
self.config_fixture.config(
group='security_compliance',
password_regex='^(?=.*\d)(?=.*[a-zA-Z]).{7,}$')
self.config_fixture.config(
group='security_compliance',
password_regex_description=desc)
self.assertFalse(
security_compliance.
symptom_password_regular_expression_description_not_set())
def test_password_regular_expression_description_deactivated(self):
# No Symptom Detected: Regular expression and description are
# deactivated to None
self.config_fixture.config(
group='security_compliance', password_regex=None)
self.config_fixture.config(
group='security_compliance', password_regex_description=None)
self.assertFalse(
security_compliance.
symptom_password_regular_expression_description_not_set())
class TokensDoctorTests(unit.TestCase):
def test_unreasonable_max_token_size_raised(self):
# Symptom Detected: the max_token_size for uuid is not 32
self.config_fixture.config(group='token', provider='uuid')
self.config_fixture.config(max_token_size=33)
self.assertTrue(tokens.symptom_unreasonable_max_token_size())
# Symptom Detected: the max_token_size for fernet is greater than 255
self.config_fixture.config(group='token', provider='fernet')
self.config_fixture.config(max_token_size=256)
self.assertTrue(tokens.symptom_unreasonable_max_token_size())
def test_unreasonable_max_token_size_not_raised(self):
# No Symptom Detected: the max_token_size for uuid is 32
self.config_fixture.config(group='token', provider='uuid')
self.config_fixture.config(max_token_size=32)
self.assertFalse(tokens.symptom_unreasonable_max_token_size())
# No Symptom Detected: the max_token_size for fernet is 255 or less
self.config_fixture.config(group='token', provider='fernet')
self.config_fixture.config(max_token_size=255)
self.assertFalse(tokens.symptom_unreasonable_max_token_size())
class TokenFernetDoctorTests(unit.TestCase):
@mock.patch('keystone.cmd.doctor.tokens_fernet.utils')
def test_usability_of_Fernet_key_repository_raised(self, mock_utils):
# Symptom Detected: Fernet key repo is world readable
self.config_fixture.config(group='token', provider='fernet')
mock_utils.FernetUtils().validate_key_repository.return_value = False
self.assertTrue(
tokens_fernet.symptom_usability_of_Fernet_key_repository())
@mock.patch('keystone.cmd.doctor.tokens_fernet.utils')
def test_usability_of_Fernet_key_repository_not_raised(self, mock_utils):
# No Symptom Detected: UUID is used instead of fernet
self.config_fixture.config(group='token', provider='uuid')
mock_utils.FernetUtils().validate_key_repository.return_value = False
self.assertFalse(
tokens_fernet.symptom_usability_of_Fernet_key_repository())
# No Symptom Detected: configs set properly, key repo is not world
# readable but is user readable
self.config_fixture.config(group='token', provider='fernet')
mock_utils.FernetUtils().validate_key_repository.return_value = True
self.assertFalse(
tokens_fernet.symptom_usability_of_Fernet_key_repository())
@mock.patch('keystone.cmd.doctor.tokens_fernet.utils')
def test_keys_in_Fernet_key_repository_raised(self, mock_utils):
# Symptom Detected: Fernet key repository is empty
self.config_fixture.config(group='token', provider='fernet')
mock_utils.FernetUtils().load_keys.return_value = False
self.assertTrue(
tokens_fernet.symptom_keys_in_Fernet_key_repository())
@mock.patch('keystone.cmd.doctor.tokens_fernet.utils')
def test_keys_in_Fernet_key_repository_not_raised(self, mock_utils):
# No Symptom Detected: UUID is used instead of fernet
self.config_fixture.config(group='token', provider='uuid')
mock_utils.FernetUtils().load_keys.return_value = True
self.assertFalse(
            tokens_fernet.symptom_keys_in_Fernet_key_repository())
# No Symptom Detected: configs set properly, key repo has been
# populated with keys
self.config_fixture.config(group='token', provider='fernet')
mock_utils.FernetUtils().load_keys.return_value = True
self.assertFalse(
            tokens_fernet.symptom_keys_in_Fernet_key_repository())
The premises are available as a combined unit or separate units, subject to terms offered.
Other amenities include the Kirkgate Indoor Market and a 661-space multi-storey car park.
The subject premises occupy a highly prominent location at the Godwin Street entrance to the centre opposite Sports Direct and Deichmann, close to Wilkinsons, B&M, Perfect Home and Perfume Shop.
The accommodation is arranged at ground floor level.
""" command line options, ini-file and conftest.py processing. """
import py
import sys, os
from _pytest import hookspec # the extension point definitions
from _pytest.core import PluginManager
# pytest startup
def main(args=None, plugins=None):
""" return exit code, after performing an in-process test run.
:arg args: list of command line arguments.
:arg plugins: list of plugin objects to be auto-registered during
initialization.
"""
config = _prepareconfig(args, plugins)
exitstatus = config.hook.pytest_cmdline_main(config=config)
return exitstatus
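# Illustrative programmatic use of main() (a hedged sketch, not part of the
# original module; "tests/" is a placeholder path):
#
#     import pytest
#     exitstatus = pytest.main(["tests/", "-q"])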
class cmdline: # compatibility namespace
main = staticmethod(main)
class UsageError(Exception):
""" error in py.test usage or invocation"""
_preinit = []
default_plugins = (
"mark main terminal runner python pdb unittest capture skipping "
"tmpdir monkeypatch recwarn pastebin helpconfig nose assertion genscript "
"junitxml resultlog doctest").split()
def _preloadplugins():
assert not _preinit
_preinit.append(get_plugin_manager())
def get_plugin_manager():
if _preinit:
return _preinit.pop(0)
# subsequent calls to main will create a fresh instance
pluginmanager = PytestPluginManager()
pluginmanager.config = config = Config(pluginmanager) # XXX attr needed?
for spec in default_plugins:
pluginmanager.import_plugin(spec)
return pluginmanager
def _prepareconfig(args=None, plugins=None):
if args is None:
args = sys.argv[1:]
elif isinstance(args, py.path.local):
args = [str(args)]
elif not isinstance(args, (tuple, list)):
if not isinstance(args, str):
raise ValueError("not a string or argument list: %r" % (args,))
args = py.std.shlex.split(args)
pluginmanager = get_plugin_manager()
if plugins:
for plugin in plugins:
pluginmanager.register(plugin)
return pluginmanager.hook.pytest_cmdline_parse(
pluginmanager=pluginmanager, args=args)
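# Illustrative call forms accepted by _prepareconfig() (a hedged sketch;
# the paths are placeholders):
#
#     _prepareconfig("tests/unit -q")        # a plain string is shlex-split
#     _prepareconfig(["tests/unit", "-q"])   # a list is passed through as-is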
class PytestPluginManager(PluginManager):
def __init__(self, hookspecs=[hookspec]):
super(PytestPluginManager, self).__init__(hookspecs=hookspecs)
self.register(self)
if os.environ.get('PYTEST_DEBUG'):
err = sys.stderr
encoding = getattr(err, 'encoding', 'utf8')
try:
err = py.io.dupfile(err, encoding=encoding)
except Exception:
pass
self.trace.root.setwriter(err.write)
def pytest_configure(self, config):
config.addinivalue_line("markers",
"tryfirst: mark a hook implementation function such that the "
"plugin machinery will try to call it first/as early as possible.")
config.addinivalue_line("markers",
"trylast: mark a hook implementation function such that the "
"plugin machinery will try to call it last/as late as possible.")
class Parser:
""" Parser for command line arguments and ini-file values. """
def __init__(self, usage=None, processopt=None):
self._anonymous = OptionGroup("custom options", parser=self)
self._groups = []
self._processopt = processopt
self._usage = usage
self._inidict = {}
self._ininames = []
self.hints = []
def processoption(self, option):
if self._processopt:
if option.dest:
self._processopt(option)
def getgroup(self, name, description="", after=None):
""" get (or create) a named option Group.
:name: name of the option group.
:description: long description for --help output.
:after: name of other group, used for ordering --help output.
The returned group object has an ``addoption`` method with the same
signature as :py:func:`parser.addoption
<_pytest.config.Parser.addoption>` but will be shown in the
        respective group in the output of ``pytest --help``.
"""
for group in self._groups:
if group.name == name:
return group
group = OptionGroup(name, description, parser=self)
i = 0
for i, grp in enumerate(self._groups):
if grp.name == after:
break
self._groups.insert(i+1, group)
return group
def addoption(self, *opts, **attrs):
""" register a command line option.
:opts: option names, can be short or long options.
:attrs: same attributes which the ``add_option()`` function of the
`optparse library
<http://docs.python.org/library/optparse.html#module-optparse>`_
accepts.
After command line parsing options are available on the pytest config
object via ``config.option.NAME`` where ``NAME`` is usually set
by passing a ``dest`` attribute, for example
``addoption("--long", dest="NAME", ...)``.
"""
self._anonymous.addoption(*opts, **attrs)
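    # Illustrative usage (a hedged sketch), typically from a plugin's
    # pytest_addoption hook; the option name is a placeholder:
    #
    #     def pytest_addoption(parser):
    #         parser.addoption("--slow", action="store_true", dest="slow",
    #                          default=False, help="also run slow tests")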
def parse(self, args):
from _pytest._argcomplete import try_argcomplete
self.optparser = self._getparser()
try_argcomplete(self.optparser)
return self.optparser.parse_args([str(x) for x in args])
def _getparser(self):
from _pytest._argcomplete import filescompleter
optparser = MyOptionParser(self)
groups = self._groups + [self._anonymous]
for group in groups:
if group.options:
desc = group.description or group.name
arggroup = optparser.add_argument_group(desc)
for option in group.options:
n = option.names()
a = option.attrs()
arggroup.add_argument(*n, **a)
# bash like autocompletion for dirs (appending '/')
optparser.add_argument(FILE_OR_DIR, nargs='*'
).completer=filescompleter
return optparser
def parse_setoption(self, args, option):
parsedoption = self.parse(args)
for name, value in parsedoption.__dict__.items():
setattr(option, name, value)
return getattr(parsedoption, FILE_OR_DIR)
def parse_known_args(self, args):
optparser = self._getparser()
args = [str(x) for x in args]
return optparser.parse_known_args(args)[0]
def addini(self, name, help, type=None, default=None):
""" register an ini-file option.
:name: name of the ini-variable
:type: type of the variable, can be ``pathlist``, ``args`` or ``linelist``.
:default: default value if no ini-file option exists but is queried.
The value of ini-variables can be retrieved via a call to
:py:func:`config.getini(name) <_pytest.config.Config.getini>`.
"""
assert type in (None, "pathlist", "args", "linelist")
self._inidict[name] = (help, type, default)
self._ininames.append(name)
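    # Illustrative usage (a hedged sketch; the ini-variable name is a
    # placeholder):
    #
    #     def pytest_addoption(parser):
    #         parser.addini("extra_paths", "extra paths to scan on startup",
    #                       type="pathlist")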
class ArgumentError(Exception):
"""
Raised if an Argument instance is created with invalid or
inconsistent arguments.
"""
def __init__(self, msg, option):
self.msg = msg
self.option_id = str(option)
def __str__(self):
if self.option_id:
return "option %s: %s" % (self.option_id, self.msg)
else:
return self.msg
class Argument:
"""class that mimics the necessary behaviour of py.std.optparse.Option """
_typ_map = {
'int': int,
'string': str,
}
# enable after some grace period for plugin writers
TYPE_WARN = False
def __init__(self, *names, **attrs):
"""store parms in private vars for use in add_argument"""
self._attrs = attrs
self._short_opts = []
self._long_opts = []
self.dest = attrs.get('dest')
if self.TYPE_WARN:
try:
help = attrs['help']
if '%default' in help:
py.std.warnings.warn(
'py.test now uses argparse. "%default" should be'
' changed to "%(default)s" ',
FutureWarning,
stacklevel=3)
except KeyError:
pass
try:
typ = attrs['type']
except KeyError:
pass
else:
# this might raise a keyerror as well, don't want to catch that
if isinstance(typ, py.builtin._basestring):
if typ == 'choice':
if self.TYPE_WARN:
py.std.warnings.warn(
'type argument to addoption() is a string %r.'
' For parsearg this is optional and when supplied '
' should be a type.'
' (options: %s)' % (typ, names),
FutureWarning,
stacklevel=3)
# argparse expects a type here take it from
# the type of the first element
attrs['type'] = type(attrs['choices'][0])
else:
if self.TYPE_WARN:
py.std.warnings.warn(
'type argument to addoption() is a string %r.'
' For parsearg this should be a type.'
' (options: %s)' % (typ, names),
FutureWarning,
stacklevel=3)
attrs['type'] = Argument._typ_map[typ]
# used in test_parseopt -> test_parse_defaultgetter
self.type = attrs['type']
else:
self.type = typ
try:
# attribute existence is tested in Config._processopt
self.default = attrs['default']
except KeyError:
pass
self._set_opt_strings(names)
if not self.dest:
if self._long_opts:
self.dest = self._long_opts[0][2:].replace('-', '_')
else:
try:
self.dest = self._short_opts[0][1:]
except IndexError:
raise ArgumentError(
'need a long or short option', self)
def names(self):
return self._short_opts + self._long_opts
def attrs(self):
# update any attributes set by processopt
attrs = 'default dest help'.split()
if self.dest:
attrs.append(self.dest)
for attr in attrs:
try:
self._attrs[attr] = getattr(self, attr)
except AttributeError:
pass
if self._attrs.get('help'):
a = self._attrs['help']
a = a.replace('%default', '%(default)s')
#a = a.replace('%prog', '%(prog)s')
self._attrs['help'] = a
return self._attrs
def _set_opt_strings(self, opts):
"""directly from optparse
might not be necessary as this is passed to argparse later on"""
for opt in opts:
if len(opt) < 2:
raise ArgumentError(
"invalid option string %r: "
"must be at least two characters long" % opt, self)
elif len(opt) == 2:
if not (opt[0] == "-" and opt[1] != "-"):
raise ArgumentError(
"invalid short option string %r: "
"must be of the form -x, (x any non-dash char)" % opt,
self)
self._short_opts.append(opt)
else:
if not (opt[0:2] == "--" and opt[2] != "-"):
raise ArgumentError(
"invalid long option string %r: "
"must start with --, followed by non-dash" % opt,
self)
self._long_opts.append(opt)
def __repr__(self):
retval = 'Argument('
if self._short_opts:
retval += '_short_opts: ' + repr(self._short_opts) + ', '
if self._long_opts:
retval += '_long_opts: ' + repr(self._long_opts) + ', '
retval += 'dest: ' + repr(self.dest) + ', '
if hasattr(self, 'type'):
retval += 'type: ' + repr(self.type) + ', '
if hasattr(self, 'default'):
retval += 'default: ' + repr(self.default) + ', '
if retval[-2:] == ', ': # always long enough to test ("Argument(" )
retval = retval[:-2]
retval += ')'
return retval
class OptionGroup:
def __init__(self, name, description="", parser=None):
self.name = name
self.description = description
self.options = []
self.parser = parser
def addoption(self, *optnames, **attrs):
""" add an option to this group.
        If a shortened version of a long option is specified, it will be
        suppressed in the help. For example, addoption('--twowords',
        '--two-words') results in help showing '--two-words' only, but
        --twowords is accepted as well **and** the automatic destination
        is args.twowords.
"""
option = Argument(*optnames, **attrs)
self._addoption_instance(option, shortupper=False)
def _addoption(self, *optnames, **attrs):
option = Argument(*optnames, **attrs)
self._addoption_instance(option, shortupper=True)
def _addoption_instance(self, option, shortupper=False):
if not shortupper:
for opt in option._short_opts:
if opt[0] == '-' and opt[1].islower():
raise ValueError("lowercase shortoptions reserved")
if self.parser:
self.parser.processoption(option)
self.options.append(option)
class MyOptionParser(py.std.argparse.ArgumentParser):
def __init__(self, parser):
self._parser = parser
py.std.argparse.ArgumentParser.__init__(self, usage=parser._usage,
add_help=False, formatter_class=DropShorterLongHelpFormatter)
def format_epilog(self, formatter):
hints = self._parser.hints
if hints:
s = "\n".join(["hint: " + x for x in hints]) + "\n"
s = "\n" + s + "\n"
return s
return ""
def parse_args(self, args=None, namespace=None):
"""allow splitting of positional arguments"""
args, argv = self.parse_known_args(args, namespace)
if argv:
for arg in argv:
if arg and arg[0] == '-':
msg = py.std.argparse._('unrecognized arguments: %s')
self.error(msg % ' '.join(argv))
getattr(args, FILE_OR_DIR).extend(argv)
return args
class DropShorterLongHelpFormatter(py.std.argparse.HelpFormatter):
"""shorten help for long options that differ only in extra hyphens
- collapse **long** options that are the same except for extra hyphens
    - special action attribute map_long_option allows suppressing additional
long options
- shortcut if there are only two options and one of them is a short one
- cache result on action object as this is called at least 2 times
"""
def _format_action_invocation(self, action):
orgstr = py.std.argparse.HelpFormatter._format_action_invocation(self, action)
if orgstr and orgstr[0] != '-': # only optional arguments
return orgstr
res = getattr(action, '_formatted_action_invocation', None)
if res:
return res
options = orgstr.split(', ')
if len(options) == 2 and (len(options[0]) == 2 or len(options[1]) == 2):
# a shortcut for '-h, --help' or '--abc', '-a'
action._formatted_action_invocation = orgstr
return orgstr
return_list = []
option_map = getattr(action, 'map_long_option', {})
if option_map is None:
option_map = {}
short_long = {}
for option in options:
if len(option) == 2 or option[2] == ' ':
continue
if not option.startswith('--'):
raise ArgumentError('long optional argument without "--": [%s]'
% (option), self)
xxoption = option[2:]
if xxoption.split()[0] not in option_map:
shortened = xxoption.replace('-', '')
if shortened not in short_long or \
len(short_long[shortened]) < len(xxoption):
short_long[shortened] = xxoption
# now short_long has been filled out to the longest with dashes
# **and** we keep the right option ordering from add_argument
        for option in options:
if len(option) == 2 or option[2] == ' ':
return_list.append(option)
if option[2:] == short_long.get(option.replace('-', '')):
return_list.append(option)
action._formatted_action_invocation = ', '.join(return_list)
return action._formatted_action_invocation
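    # Illustrative effect of the formatter above (a hedged sketch): for an
    # option registered as addoption('--two-words', '--twowords'), only
    # '--two-words' appears in --help output, while both spellings remain
    # accepted on the command line.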
class Conftest(object):
""" the single place for accessing values and interacting
towards conftest modules from py.test objects.
"""
def __init__(self, onimport=None, confcutdir=None):
self._path2confmods = {}
self._onimport = onimport
self._conftestpath2mod = {}
self._confcutdir = confcutdir
def setinitial(self, args):
""" try to find a first anchor path for looking up global values
from conftests. This function is usually called _before_
argument parsing. conftest files may add command line options
and we thus have no completely safe way of determining
which parts of the arguments are actually related to options
and which are file system paths. We just try here to get
bootstrapped ...
"""
current = py.path.local()
opt = '--confcutdir'
for i in range(len(args)):
opt1 = str(args[i])
if opt1.startswith(opt):
                if opt1 == opt:
                    # the directory value is the next argument, if present
                    if len(args) > i + 1:
                        p = current.join(args[i+1], abs=True)
elif opt1.startswith(opt + "="):
p = current.join(opt1[len(opt)+1:], abs=1)
self._confcutdir = p
break
foundanchor = False
for arg in args:
if hasattr(arg, 'startswith') and arg.startswith("--"):
continue
anchor = current.join(arg, abs=1)
if exists(anchor): # we found some file object
self._try_load_conftest(anchor)
foundanchor = True
if not foundanchor:
self._try_load_conftest(current)
def _try_load_conftest(self, anchor):
self._path2confmods[None] = self.getconftestmodules(anchor)
# let's also consider test* subdirs
if anchor.check(dir=1):
for x in anchor.listdir("test*"):
if x.check(dir=1):
self.getconftestmodules(x)
def getconftestmodules(self, path):
try:
clist = self._path2confmods[path]
except KeyError:
if path is None:
raise ValueError("missing default conftest.")
clist = []
for parent in path.parts():
if self._confcutdir and self._confcutdir.relto(parent):
continue
conftestpath = parent.join("conftest.py")
if conftestpath.check(file=1):
clist.append(self.importconftest(conftestpath))
self._path2confmods[path] = clist
return clist
def rget(self, name, path=None):
mod, value = self.rget_with_confmod(name, path)
return value
def rget_with_confmod(self, name, path=None):
modules = self.getconftestmodules(path)
modules.reverse()
for mod in modules:
try:
return mod, getattr(mod, name)
except AttributeError:
continue
raise KeyError(name)
def importconftest(self, conftestpath):
assert conftestpath.check(), conftestpath
try:
return self._conftestpath2mod[conftestpath]
except KeyError:
pkgpath = conftestpath.pypkgpath()
if pkgpath is None:
_ensure_removed_sysmodule(conftestpath.purebasename)
self._conftestpath2mod[conftestpath] = mod = conftestpath.pyimport()
dirpath = conftestpath.dirpath()
if dirpath in self._path2confmods:
for path, mods in self._path2confmods.items():
if path and path.relto(dirpath) or path == dirpath:
assert mod not in mods
mods.append(mod)
self._postimport(mod)
return mod
def _postimport(self, mod):
if self._onimport:
self._onimport(mod)
return mod
def _ensure_removed_sysmodule(modname):
try:
del sys.modules[modname]
except KeyError:
pass
class CmdOptions(object):
""" holds cmdline options as attributes."""
def __init__(self, **kwargs):
self.__dict__.update(kwargs)
def __repr__(self):
return "<CmdOptions %r>" %(self.__dict__,)
FILE_OR_DIR = 'file_or_dir'
class Config(object):
""" access to configuration values, pluginmanager and plugin hooks. """
def __init__(self, pluginmanager):
#: access to command line option as attributes.
#: (deprecated), use :py:func:`getoption() <_pytest.config.Config.getoption>` instead
self.option = CmdOptions()
_a = FILE_OR_DIR
self._parser = Parser(
usage="%%(prog)s [options] [%s] [%s] [...]" % (_a, _a),
processopt=self._processopt,
)
#: a pluginmanager instance
self.pluginmanager = pluginmanager
self.trace = self.pluginmanager.trace.root.get("config")
self._conftest = Conftest(onimport=self._onimportconftest)
self.hook = self.pluginmanager.hook
self._inicache = {}
self._opt2dest = {}
self._cleanup = []
self.pluginmanager.register(self, "pytestconfig")
self.pluginmanager.set_register_callback(self._register_plugin)
self._configured = False
def _register_plugin(self, plugin, name):
call_plugin = self.pluginmanager.call_plugin
call_plugin(plugin, "pytest_addhooks",
{'pluginmanager': self.pluginmanager})
self.hook.pytest_plugin_registered(plugin=plugin,
manager=self.pluginmanager)
dic = call_plugin(plugin, "pytest_namespace", {}) or {}
if dic:
import pytest
setns(pytest, dic)
call_plugin(plugin, "pytest_addoption", {'parser': self._parser})
if self._configured:
call_plugin(plugin, "pytest_configure", {'config': self})
def do_configure(self):
assert not self._configured
self._configured = True
self.hook.pytest_configure(config=self)
def do_unconfigure(self):
assert self._configured
self._configured = False
self.hook.pytest_unconfigure(config=self)
self.pluginmanager.ensure_shutdown()
def pytest_cmdline_parse(self, pluginmanager, args):
assert self == pluginmanager.config, (self, pluginmanager.config)
self.parse(args)
return self
def pytest_unconfigure(config):
while config._cleanup:
fin = config._cleanup.pop()
fin()
def notify_exception(self, excinfo, option=None):
if option and option.fulltrace:
style = "long"
else:
style = "native"
excrepr = excinfo.getrepr(funcargs=True,
showlocals=getattr(option, 'showlocals', False),
style=style,
)
res = self.hook.pytest_internalerror(excrepr=excrepr,
excinfo=excinfo)
if not py.builtin.any(res):
for line in str(excrepr).split("\n"):
sys.stderr.write("INTERNALERROR> %s\n" %line)
sys.stderr.flush()
@classmethod
def fromdictargs(cls, option_dict, args):
""" constructor useable for subprocesses. """
pluginmanager = get_plugin_manager()
config = pluginmanager.config
config._preparse(args, addopts=False)
config.option.__dict__.update(option_dict)
for x in config.option.plugins:
config.pluginmanager.consider_pluginarg(x)
return config
def _onimportconftest(self, conftestmodule):
self.trace("loaded conftestmodule %r" %(conftestmodule,))
self.pluginmanager.consider_conftest(conftestmodule)
def _processopt(self, opt):
for name in opt._short_opts + opt._long_opts:
self._opt2dest[name] = opt.dest
if hasattr(opt, 'default') and opt.dest:
if not hasattr(self.option, opt.dest):
setattr(self.option, opt.dest, opt.default)
def _getmatchingplugins(self, fspath):
allconftests = self._conftest._conftestpath2mod.values()
plugins = [x for x in self.pluginmanager.getplugins()
if x not in allconftests]
plugins += self._conftest.getconftestmodules(fspath)
return plugins
def pytest_load_initial_conftests(self, parser, args):
self._conftest.setinitial(args)
pytest_load_initial_conftests.trylast = True
def _initini(self, args):
self.inicfg = getcfg(args, ["pytest.ini", "tox.ini", "setup.cfg"])
self._parser.addini('addopts', 'extra command line options', 'args')
self._parser.addini('minversion', 'minimally required pytest version')
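    # Illustrative ini-file consumed via _initini()/getcfg() (a hedged
    # sketch; the values are placeholders):
    #
    #     # pytest.ini (or the [pytest] section of tox.ini / setup.cfg)
    #     [pytest]
    #     addopts = -q
    #     minversion = 2.5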
def _preparse(self, args, addopts=True):
self._initini(args)
if addopts:
args[:] = self.getini("addopts") + args
self._checkversion()
self.pluginmanager.consider_preparse(args)
self.pluginmanager.consider_setuptools_entrypoints()
self.pluginmanager.consider_env()
self.hook.pytest_load_initial_conftests(early_config=self,
args=args, parser=self._parser)
def _checkversion(self):
import pytest
minver = self.inicfg.get('minversion', None)
if minver:
ver = minver.split(".")
myver = pytest.__version__.split(".")
if myver < ver:
raise pytest.UsageError(
"%s:%d: requires pytest-%s, actual pytest-%s'" %(
self.inicfg.config.path, self.inicfg.lineof('minversion'),
minver, pytest.__version__))
def parse(self, args):
# parse given cmdline arguments into this config object.
# Note that this can only be called once per testing process.
assert not hasattr(self, 'args'), (
"can only parse cmdline args at most once per Config object")
self._origargs = args
self._preparse(args)
# XXX deprecated hook:
self.hook.pytest_cmdline_preparse(config=self, args=args)
self._parser.hints.extend(self.pluginmanager._hints)
args = self._parser.parse_setoption(args, self.option)
if not args:
args.append(py.std.os.getcwd())
self.args = args
def addinivalue_line(self, name, line):
""" add a line to an ini-file option. The option must have been
        declared but might not yet be set in which case the line becomes
        the first line in its value. """
x = self.getini(name)
assert isinstance(x, list)
x.append(line) # modifies the cached list inline
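    # Illustrative usage (a hedged sketch), e.g. from a pytest_configure
    # hook; the marker text is a placeholder:
    #
    #     config.addinivalue_line(
    #         "markers", "slow: marks tests as slow to run")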
def getini(self, name):
""" return configuration value from an :ref:`ini file <inifiles>`. If the
specified name hasn't been registered through a prior
:py:func:`parser.addini <pytest.config.Parser.addini>`
call (usually from a plugin), a ValueError is raised. """
try:
return self._inicache[name]
except KeyError:
self._inicache[name] = val = self._getini(name)
return val
def _getini(self, name):
try:
description, type, default = self._parser._inidict[name]
except KeyError:
raise ValueError("unknown configuration value: %r" %(name,))
try:
value = self.inicfg[name]
except KeyError:
if default is not None:
return default
if type is None:
return ''
return []
if type == "pathlist":
dp = py.path.local(self.inicfg.config.path).dirpath()
l = []
for relpath in py.std.shlex.split(value):
l.append(dp.join(relpath, abs=True))
return l
elif type == "args":
return py.std.shlex.split(value)
elif type == "linelist":
return [t for t in map(lambda x: x.strip(), value.split("\n")) if t]
else:
assert type is None
return value
def _getconftest_pathlist(self, name, path=None):
try:
mod, relroots = self._conftest.rget_with_confmod(name, path)
except KeyError:
return None
modpath = py.path.local(mod.__file__).dirpath()
l = []
for relroot in relroots:
if not isinstance(relroot, py.path.local):
relroot = relroot.replace("/", py.path.local.sep)
relroot = modpath.join(relroot, abs=True)
l.append(relroot)
return l
def _getconftest(self, name, path=None, check=False):
if check:
self._checkconftest(name)
return self._conftest.rget(name, path)
def getoption(self, name):
""" return command line option value.
:arg name: name of the option. You may also specify
the literal ``--OPT`` option instead of the "dest" option name.
"""
name = self._opt2dest.get(name, name)
try:
return getattr(self.option, name)
except AttributeError:
raise ValueError("no option named %r" % (name,))
def getvalue(self, name, path=None):
""" return command line option value.
:arg name: name of the command line option
(deprecated) if we can't find the option also lookup
the name in a matching conftest file.
"""
try:
return getattr(self.option, name)
except AttributeError:
return self._getconftest(name, path, check=False)
def getvalueorskip(self, name, path=None):
""" (deprecated) return getvalue(name) or call
py.test.skip if no value exists. """
__tracebackhide__ = True
try:
val = self.getvalue(name, path)
if val is None:
raise KeyError(name)
return val
except KeyError:
py.test.skip("no %r value found" %(name,))
def exists(path, ignore=EnvironmentError):
try:
return path.check()
except ignore:
return False
def getcfg(args, inibasenames):
args = [x for x in args if not str(x).startswith("-")]
if not args:
args = [py.path.local()]
for arg in args:
arg = py.path.local(arg)
for base in arg.parts(reverse=True):
for inibasename in inibasenames:
p = base.join(inibasename)
if exists(p):
iniconfig = py.iniconfig.IniConfig(p)
if 'pytest' in iniconfig.sections:
return iniconfig['pytest']
return {}
def setns(obj, dic):
import pytest
for name, value in dic.items():
if isinstance(value, dict):
mod = getattr(obj, name, None)
if mod is None:
modname = "pytest.%s" % name
mod = py.std.types.ModuleType(modname)
sys.modules[modname] = mod
mod.__all__ = []
setattr(obj, name, mod)
obj.__all__.append(name)
setns(mod, value)
else:
setattr(obj, name, value)
obj.__all__.append(name)
#if obj != pytest:
# pytest.__all__.append(name)
setattr(pytest, name, value)
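
# Example (a sketch): how a plugin would register and read a typed ini value
# handled by Config._getini above. The option name "acceptance_paths" is
# hypothetical; parser.addini() and config.getini() are the real entry points.
#
# def pytest_addoption(parser):
#     parser.addini("acceptance_paths",
#                   "dirs searched for acceptance fixtures", type="pathlist")
#
# def pytest_configure(config):
#     # "pathlist" values resolve to py.path.local objects relative to
#     # the directory containing the ini file
#     for p in config.getini("acceptance_paths"):
#         print(p)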
There were some commonalities in the ideological beliefs that led to the Mahad-Chavdar Satyagrahs and the ones that ignited the French Revolution. The French Revolution brought the idea of equality centre stage and its logical fallout was the Declaration of the Rights of Man and the Citizen. The twin satyagrahs defined, loud and clear, the human rights of the untouchables. The satyagrahs, led by Babasaheb Ambedkar, not only showed the right path to the Dalits, who were then just emerging as a new social force, but they also sowed the seed for the rise of the Dalits as a political force. This movement and its failures and successes are of crucial importance and its implications need to be studied closely. Of special relevance are the revolutionary pronouncements of Ambedkar during these movements.
Ambedkar had offered a three-pronged solution to the problems of the Dalit community reeling under the scourge of untouchability. The first was to give up dirty professions or jobs; the second, to stop eating the meat of dead animals; and the third and most important, to shed the inferiority complex of being an Untouchable and develop self-respect. Taken together, the three pieces of advice constitute a potent and effective socio-psychological formula. The persons and social groups that imbibed the formula marched on the path to progress.
Ambedkar’s formula shows us how to look at untouchability, deprivation and exploitation from a fresh perspective. This is imperative too, as the uselessness of the old political ways of dealing with the issue has been proved beyond any doubt. The way principles were sacrificed on the altar of electoral expediency, and the manner in which leaders were bought and sold in the run-up to the last general elections, was depressing, to say the least. In this otherwise bleak scenario, the three pieces of advice evoke hope. Until now, all the discourses have blamed discriminatory social and political norms for the problems of the Untouchables. But what has not been explored is the deep psychological compulsion that made the Untouchables look down upon themselves. This rendered them intrinsically weak. Of course, one can blame social and political conspiracies for it, and one would not be entirely off the mark either, but Ambedkar’s formula for growth and progress is more personal and psychological than political. And it must be viewed and understood in that way.
At some point in time in the hoary past, the Buddhists were compelled to become Untouchables. Unless we study the method that put them in that position, we will not be able to comprehend the psychological process that not only made a section of society untouchable but also made them feel that they are so. Ambedkar’s research shows that the Buddhists of the past are the Dalits of the present, especially those Buddhists who used to eat the meat of dead animals.
A Buddhist tale talks about the Lord telling a monk that whatever he receives as alms should be eaten without making any choices. One day, a crow flying overhead dropped a piece of meat in the begging bowl of the monk. He appeared before the Tathagata and sought guidance. According to the tale, Buddha told him that as it was the meat of an animal already dead, consuming it would not involve any violence. With time, this became a canon and even today, in some Buddhist countries, meat shops have boards declaring that “Meat of animals which have died a natural death is sold here”. The fact that many Indian castes also consumed the meat of dead animals was one of the factors that led Ambedkar to conclude that the Untouchables of today are the Buddhists of the past. If we put the Tathagata’s order, the Brahmanical conspiracy that followed it and Ambedkar’s three-point formula alongside one another, we can understand how Untouchables are made and how they can come out of that state of mind, and this can help us learn many valuable lessons.
The Tathagata permitting the monk to eat the piece of meat dropped by the crow cost this country dear. The question before Buddha was whether the monks should be given freedom of choice as far as things received as alms were concerned. The very objective of seeking alms and surviving on them is to deny oneself all choices and thus control one’s desires. According to Buddhism, once we allow our mind to make a choice, we become slaves to greed and go down the slippery path of degradation. Buddha’s permission to the monk to consume the piece of meat should be viewed against this backdrop. If, at that time, Buddha had disallowed consumption of the piece of meat, the monks would not have turned meat-eaters but would have been granted the right to choose, thus nullifying their “sadhna”. This was not acceptable to Buddha. Hence, Buddha allowed the monk to consume the piece of meat because it did not involve violence. By implication, he allowed all monks to eat the meat of dead animals.
Subsequently, when the Hindu religion was weakened due to the challenges posed by Jainism and Buddhism, and was trying to regain its pre-eminence, it used these commands/bans in a different manner, assimilating the best of them. Hindu religion has an unmatched capacity to assimilate other traditions and faiths and in one sense, that makes it a great religion. But it also has a dark underbelly – one manifestation of which is the notion of untouchability with its horrific and painful implications.
Parallel to the attempts to rejuvenate the Hindu religion by reorganizing its Vedic knowledge systems, there was also an endeavour to absorb the Buddhist monks and Jain saints, with their bright and clean image, into Hinduism. It is a historical fact that the Buddhist mendicants and Jain sadhus enjoyed much greater popular acceptance and respect than the comfort-loving Brahmins. The lifestyle and knowledge of the monks, who led a very harsh, extremely disciplined life, were much more impressive than those of the Brahmins. That was probably why Buddhism spread across the whole of Asia within a short period of time. But Buddhism’s overemphasis on reclusion and non-violence proved to be its undoing. The conspiracies hatched by other religionists did the rest, and Buddhism was virtually wiped out of India.
When the Hindu religious leaders were drawing up the canons aimed at securing a respectable position for the Hindu sanyasis, they borrowed heavily from the Jains and Buddhists. The Hindu sanyasi also became a brahmachari and started living on alms. Meat-eating was declared a taboo and a harsh code of conduct was drafted. In contrast, in the Vedic age, Brahmins led comfortable family lives. Once the Hindu religion emerged from its bad days and became powerful once again, it foisted the lifestyle of the Buddhist and Jain saints on its sanyasis and declared the Buddhists Untouchables. To ensure that they remained economically and socially weak, menial and dirty jobs were assigned to them.
This brief explanation would suffice to understand how, sometimes, psychological techniques are more effective than social or political conspiracies. They not only give the exploiter the right to exploit but fill the exploited with a deep sense of inferiority. Untouchability, which was a political phenomenon, gradually became a social tradition.
Let us go a bit deeper into the issue. Distinguishing good from bad, superior from inferior, is a natural human instinct, which helps humans survive as biological and social beings. A profession which does not involve dirty work, or where one’s skills or knowledge are important, is considered higher and better. That is why the possessors of knowledge are considered superior in every society. This knowledge may relate to religion, the use of weapons, science, or business and commerce. Although the jobs assigned to the Shudras also involved the use of skills and knowledge, because they meant dealing with filth and dirt, they were never considered respectable. This kind of mentality is unfortunate; it should be shunned. But, then, it is an undeniable fact that this is the way the collective social psyche works.
Thus it is clear that being involved in dirty or grimy work makes one an Untouchable. Even conceding that the work which the Shudras did was dirty compared with the work of teachers, businessmen or warriors, what made the system stifling and horrendous was that this division of work was based on birth. It made generation after generation of Untouchables bear a permanent stigma and suffer from an inferiority complex. Had the division of work not been birth-based, untouchability would have been linked with the work one does, rather than with the family one was born into. It would not have been the cross that generations were forced to carry. Thus the genesis of untouchability lies in identification with dirty work, the feeling of inferiority born of it and society’s humiliating attitude towards those doing such work.
When seen in this context, it is apparent that Ambedkar’s radical advice had the potential of igniting a new revolution. When he asks the Dalits to give up dirty or menial jobs, he wants to free them from the feeling of inferiority and weakness rooted in their profession. When he urges them to give up eating the meat of dead animals, he is asking them to lead a clean, hygienic lifestyle and adopt non-violence as a creed. And when he calls upon them to cast away the feeling of inferiority, he wants them to gain self-confidence so that they can give a befitting reply to their exploiters and put an end to the chain of exploitation.
What is the takeaway for us from Ambedkar’s advice? This is a very crucial question. The lessons that we draw from history will decide our future, especially in these times when the history of India seems poised for a long-term, permanent change. As we pass through these politically transitional times, we should rethink the direction of the Dalit movement and discourse. Until now, the Dalit movement and discourse have been limited to the political and social spheres, and their impact is clearly visible. Dalits have become politically empowered and today no party – or, for that matter, all parties taken together – can dare to ignore Dalit issues. But for want of a cultural and religious revolution, this political power is taking the Dalits down the path to self-annihilation. Dalits may have become politically strong but their political power is not under their control. It is being controlled from somewhere else. That is because they have gained political power without first developing strong cultural ground.
Embedded in this feeling of power is the potential of a downfall. The situation is somewhat akin to the exploitation of the Muslim vote-bank and the global downfall of the Muslim community. Though there are many commonalities between the Muslims and the Dalits, there is much dissimilarity, too. But both have been exploited by the mainstream political parties. This was done as part of a grand conspiracy. The contours of this socio-psychological conspiracy become acutely clear in the light of Ambedkar’s three-point formula.
To begin with, Dalit identity was linked to unclean professions and this identity was given the form of a tradition, thus rendering them untouchable in perpetuity. The Muslims and the Dalits were also deprived of education. The Untouchables were socially alienated and forced to keep away from education. This affected Muslims, too, because most Muslims are converted Untouchables. After Independence, conservatism was encouraged among Muslims and they were made slaves of the Mullahs. The policy of appeasement was used to keep them away from education. The Muslims were confined to religious education and madarsas, with disastrous results. Deprivation of education was even more disastrous for the Dalits. Being Untouchables, they were forced into “unclean” professions and this, in turn, perpetuated their untouchable status. This was a vicious cycle and all attempts of the Dalits to break it were crushed by the savarnas. Muslims, most of whom are converted Dalits, kept away from education for altogether different reasons.
In other words, the Dalits have historically long been aware of the importance and need of education. This is clear from the efforts made in this direction by Mahatma Phule and other Dalit leaders. On the other hand, the Muslim indifference to education and development is born of a conscious decision. They have chosen to remain uneducated. Dalits wanted to get educated but were barred by the savarnas. The Muslims rejected modern education as part of their endeavour to retain and prove their distinct identity. In a country where Hindus have thousands of years of rich history, Muslims found themselves gripped by a strange sense of insecurity. To retain its distinct identity and protect its traditions, the community withdrew into a cocoon and became wary of modern education. They feared that modern education would kill their distinct identity and that their younger generation would adopt the ways of other religionists.
It is clear that the Dalits and Muslims have radically different attitudes towards modern education, and any movement that wants to bring them together to build a shared future would have to resolve these contradictions. These contradictions are visible in recent history.
It is high time the Dalits independently took the path of modern education and acquired knowledge of the English language. This was what Ambedkar also said. There is a need to keep away from the Dalit-Muslim coalition peddled by the leftists. This need is more acute today. The entire Dalit community should realize this. Ambedkar was never in favour of adopting the ways of the communists. He knew very well that the opportunist leftists could use the resentment among the Dalits to serve their political ends. This understanding of Ambedkar needs to be imbibed.
If the Dalits are to adopt the three-point formula of Ambedkar and emerge from untouchability, they will have to acquire education and turn modern on their own. Empowerment and awareness will lead them to a better future. Similarly, the Muslim community should distance itself from those elements within it which are against modern education and progress. Until education and modernity gain wide acceptance in the Muslim community, the Dalit movement should refrain from forging a joint front with it. In short, the Dalits and their movement have to keep away from the conscious retrogression of another community as well as from the utopia of revolutionary ideologies.
Unfortunately, the article lacks historical accuracy and depth. It is good to have pride, but do not sacrifice facts – for instance, the claim about meat stores.
The article builds on a conspiracy theory of Hindus decimating Buddhism. This could not have been possible, as Hinduism has never been controlled by a single institution (like the Bible or Quran or Pope). The chance of the entire Hindu community coming together to orchestrate a strategy to annihilate Buddhism and Jainism sounds like good fiction. Buddhism did not survive because of its nihilistic orientation. Islam was more pragmatic as a religion, and hence survived and expanded. Hinduism survived not because of abstract Vedic concepts, but because of the strong threads of ritualistic traditions that entice the masses into faith.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from zipline.data.data_portal import DataPortal
from logbook import Logger
log = Logger('DataPortalLive')
class DataPortalLive(DataPortal):
def __init__(self, broker, *args, **kwargs):
self.broker = broker
super(DataPortalLive, self).__init__(*args, **kwargs)
def get_last_traded_dt(self, asset, dt, data_frequency):
return self.broker.get_last_traded_dt(asset)
def get_spot_value(self, assets, field, dt, data_frequency):
return self.broker.get_spot_value(assets, field, dt, data_frequency)
def get_history_window(self,
assets,
end_dt,
bar_count,
frequency,
field,
data_frequency,
ffill=True):
# This method is responsible for merging the ingested historical data
# with the real-time collected data through the Broker.
# DataPortal.get_history_window() is called with ffill=False to mark
# the missing fields with NaNs. After merge on the historical and
# real-time data the missing values (NaNs) are filled based on their
# next available values in the requested time window.
#
        # Warning: setting ffill=True in DataPortal.get_history_window() call
        # results in wrong behavior: the last available value reported by
        # get_spot_value() will be used to fill the missing data - which
        # always represents the current spot price presented by Broker.
historical_bars = super(DataPortalLive, self).get_history_window(
assets, end_dt, bar_count, frequency, field, data_frequency,
ffill=False)
realtime_bars = self.broker.get_realtime_bars(
assets, frequency)
        # Broker.get_realtime_bars() returns the asset as level 0 column,
        # open, high, low, close, volume returned as level 1 columns.
        # To filter for field the levels need to be swapped
realtime_bars = realtime_bars.swaplevel(0, 1, axis=1)
ohlcv_field = 'close' if field == 'price' else field
# TODO: end_dt is ignored when historical & realtime bars are merged.
# Should not cause issues as end_dt is set to current time in live
# trading, but would be more proper if merge would make use of it.
combined_bars = historical_bars.combine_first(
realtime_bars[ohlcv_field])
if ffill and field == 'price':
# Simple forward fill is not enough here as the last ingested
# value might be outside of the requested time window. That case
# the time series starts with NaN and forward filling won't help.
# To provide values for such cases we backward fill.
# Backward fill as a second operation will have no effect if the
# forward-fill was successful.
combined_bars.fillna(method='ffill', inplace=True)
combined_bars.fillna(method='bfill', inplace=True)
return combined_bars[-bar_count:]
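
# Illustration (a sketch, not part of zipline's API): the merge-then-fill
# strategy documented in get_history_window() above, shown on a toy
# single-column pandas example instead of zipline's multi-indexed OHLCV frames.
if __name__ == '__main__':
    import numpy as np
    import pandas as pd

    idx = pd.date_range('2018-01-01 09:30', periods=5, freq='T')
    # historical bars: the tail of the window is missing (NaNs, ffill=False)
    historical = pd.Series([10.0, 10.1, np.nan, np.nan, np.nan], index=idx)
    # real-time bars from the broker cover the end of the window
    realtime = pd.Series([10.2, 10.3], index=idx[-2:])

    combined = historical.combine_first(realtime)
    # forward fill first, then backward fill for NaNs at the window start
    combined = combined.ffill().bfill()
    print(combined)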
Addiction Hope provides information, education and encouragement to those suffering from a drug or alcohol dependency, their friends and family, and substance abuse treatment providers.
The articles offered on this page have been made available to Addiction Hope by professionals in the field of substance and alcohol addiction in order to promote recovery.
The view of marijuana as simply a drug for stoners is changing as laws governing the drug become more relaxed and marijuana gains acceptance as a means of relief in the medical community. The legality, uses, scrutiny and future of marijuana are changing.
As the nation navigates the murky waters of medical marijuana, many people remain confused: why use marijuana when there are other, legal options to treat these ailments? There appears to be no easy answer – proponents of medical marijuana claim marijuana’s efficacy far surpasses that of traditional therapeutic approaches.
The topic of legalizing marijuana has sparked enormous controversy across the country, with many states examining and debating legislation supporting this issue. The recreational and medical use of marijuana have been discussed and disputed in the general public and among health professionals as well. As some states are moving towards legalizing marijuana, potential risks and benefits are being heavily weighed.
Understanding the possible implications of legalizing marijuana is extremely important. Recent research has exposed how marijuana use may predispose offspring to drug addiction. A study completed at the Icahn School of Medicine at Mount Sinai examined parental exposure to marijuana and its effects on offspring in adulthood.
In the first-ever global survey of illicit drug abuse, the Institute for Health Metrics and Evaluation at the University of Washington found marijuana to be the most widely used illegal drug worldwide, but found that the most deaths are caused by addiction to prescription painkillers, such as Vicodin and Oxycontin.
What exactly is marijuana? Marijuana is a psychoactive drug made from the dried leaves and other parts of the Cannabis plant. The dried leaves are rolled into a cigarette or smoked in a pipe, a hookah, a bong or a vaporizer. Of all illegal drugs, marijuana is the most commonly used. Learn more about marijuana, its effects, and addiction.
from __init__ import *
import json, requests
import h5py
# the star import above is expected to provide tdpy and the numpy/matplotlib
# names used below; explicit imports are added so the script also runs standalone
from numpy import arange, array, where
import matplotlib.pyplot as plt
def query(lon, lat, coordsys='gal', mode='full'):
url = 'http://argonaut.skymaps.info/gal-lb-query-light'
payload = {'mode': mode}
if coordsys.lower() in ['gal', 'g']:
payload['l'] = lon
payload['b'] = lat
elif coordsys.lower() in ['equ', 'e']:
payload['ra'] = lon
payload['dec'] = lat
else:
raise ValueError("coordsys '{0}' not understood.".format(coordsys))
headers = {'content-type': 'application/json'}
r = requests.post(url, data=json.dumps(payload), headers=headers)
try:
r.raise_for_status()
except requests.exceptions.HTTPError as e:
print('Response received from Argonaut:')
print(r.text)
raise e
return json.loads(r.text)
def plot_dust3dim():
magv = arange(10, 22)
prlxerrr = array([4., 4., 4.2, 6.0, 9.1, 14.3, 23.1, 38.8, 69.7, 138., 312., 1786.])
fig, ax = plt.subplots()
ax.plot(magv, prlxerrr)
    ax.set_ylabel(r'$\sigma_\pi$ [$\mu$as]')
ax.set_xlabel('V [mag]')
ax.set_yscale('log')
plt.show()
# paths
pathimag, pathdata = tdpy.util.retr_path('radt_6dim')
numbside = 256
maxmgang = 10.
lghp, bghp, numbpixl, apix = tdpy.util.retr_healgrid(numbside)
mpixl = where((abs(lghp) < maxmgang) & (abs(90. - bghp) < maxmgang))[0]
qresult = query(list(lghp[mpixl]), list(bghp[mpixl]))
for key in qresult.keys():
fig, ax = plt.subplots()
ax.hist(qresult[key], 100)
ax.set_title(key)
plt.savefig(pathimag + 'full.pdf')
plt.close()
qresult = query(list(lghp[mpixl]), list(bghp[mpixl]), mode='sfnd')
for key in qresult.keys():
fig, ax = plt.subplots()
ax.hist(qresult[key])
ax.set_title(key)
plt.savefig(pathimag + 'sfnd.pdf')
plt.close()
qresult = query(list(lghp[mpixl]), list(bghp[mpixl]), mode='lite')
for key in qresult.keys():
fig, ax = plt.subplots()
ax.hist(qresult[key])
ax.set_title(key)
plt.savefig(pathimag + 'lite.pdf')
plt.close()
plot_dust3dim()
Schoolcraft, Henry Rowe. The Indian in His Wigwam: or, Characteristics of the Red Race of America; From Original Notes and Manuscripts. New York: Dewitt & Davenport, 1848.
# coding=utf-8
r"""
This code was generated by
\ / _ _ _| _ _
| (_)\/(_)(_|\/| |(/_ v1.0.0
/ /
"""
from twilio.base.version import Version
from twilio.rest.preview.trusted_comms.branded_channel import BrandedChannelList
from twilio.rest.preview.trusted_comms.brands_information import BrandsInformationList
from twilio.rest.preview.trusted_comms.cps import CpsList
from twilio.rest.preview.trusted_comms.current_call import CurrentCallList
class TrustedComms(Version):
def __init__(self, domain):
"""
Initialize the TrustedComms version of Preview
:returns: TrustedComms version of Preview
:rtype: twilio.rest.preview.trusted_comms.TrustedComms.TrustedComms
"""
super(TrustedComms, self).__init__(domain)
self.version = 'TrustedComms'
self._branded_channels = None
self._brands_information = None
self._cps = None
self._current_calls = None
@property
def branded_channels(self):
"""
:rtype: twilio.rest.preview.trusted_comms.branded_channel.BrandedChannelList
"""
if self._branded_channels is None:
self._branded_channels = BrandedChannelList(self)
return self._branded_channels
@property
def brands_information(self):
"""
:rtype: twilio.rest.preview.trusted_comms.brands_information.BrandsInformationList
"""
if self._brands_information is None:
self._brands_information = BrandsInformationList(self)
return self._brands_information
@property
def cps(self):
"""
:rtype: twilio.rest.preview.trusted_comms.cps.CpsList
"""
if self._cps is None:
self._cps = CpsList(self)
return self._cps
@property
def current_calls(self):
"""
:rtype: twilio.rest.preview.trusted_comms.current_call.CurrentCallList
"""
if self._current_calls is None:
self._current_calls = CurrentCallList(self)
return self._current_calls
def __repr__(self):
"""
Provide a friendly representation
:returns: Machine friendly representation
:rtype: str
"""
return '<Twilio.Preview.TrustedComms>'
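
# Usage sketch (hypothetical credentials; not part of the generated code):
# the properties above build their list objects lazily on first access and
# cache them for reuse.
#
#     from twilio.rest import Client
#     client = Client('ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX', 'auth_token')
#     trusted_comms = client.preview.trusted_comms
#     assert trusted_comms.cps is trusted_comms.cps  # same cached instance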
It has been amazing to watch #whatisschool continue to grow and lead the way in the Education chat category, consistently drawing a crowd of over 100 educators from more than 10 countries every week! I am excited to be back this week to lead the #whatisschool chat with the amazing Bev Ladd about how to sum up an amazing year of learning in your school.
It is hard to believe the 2015-16 school year is coming to a close. This week #whatisschool focuses on celebrating your successes and looking at how best to sum up an amazing year. We will be leading an incredibly inspirational chat about the amazing things educators have done this year and are going to do to finish up their school year (and yes, we know it is not the end for all of you! Especially my friends in NZ and Australia). We hope you’ll join us as it is sure to be informative and fun.
1) What were your biggest successes this year and how did you step outside your comfort zone?
2) What would your students say was the highlight of their year? Why?
3) What is your favorite end of year activity to sum up your year?
4) How do you gather and use parent and student feedback to reflect and improve upon next year?
5) What have you trialled this year that you will be implementing next year to improve student learning?
6) Share the work that you and/or your students created this year that you are most proud of.
# First, ask the user for the time
tiempo_llegada = raw_input("Ingrese hora: ")
# Now check whether we are in normal or peak hours (or the service is closed).
# Times are zero-padded "HH:MM" strings, so lexicographic comparison matches
# chronological order.
if tiempo_llegada >= "06:30" and tiempo_llegada <= "22:30":
    # In operation
if (tiempo_llegada >= "07:30" and tiempo_llegada <= "09:00") or (tiempo_llegada >= "18:00" and tiempo_llegada <= "19:00"):
        # Peak hours
        # Long trains (4 wagons) and short trains (2 wagons) alternate every 6 minutes (in that order)
print "Se encuentra en horario punta"
hora_llegada = int(tiempo_llegada[:2])
minutos_llegada = int(tiempo_llegada[3:])
        # Determine which train arrives next and in how long.
contador_largo_vagones = 0
contador_vagones = 0
while contador_largo_vagones < minutos_llegada:
contador_largo_vagones += 6
contador_vagones += 1
if contador_vagones % 2 == 0:
print "El tren tiene 4 vagones"
else:
print "El tren tiene 2 vagones"
if minutos_llegada % 6 == 0:
print "El tren se encuentra en el anden!"
else:
print "Debe esperar "+str(6 - (minutos_llegada % 6))+" minutos"
else:
        # Normal hours
        # Long trains (4 wagons) every 12 minutes
print "Se encuentra en horario normal"
        # Compute how long until the train reaches the station.
hora_llegada = int(tiempo_llegada[:2])
minutos_llegada = int(tiempo_llegada[3:])
print "El tren tiene 4 vagones"
if minutos_llegada % 12 == 0:
print "El tren se encuentra en el anden!"
else:
print "Debe esperar "+str(12 - (minutos_llegada % 12))+" minutos"
else:
    # The service is not operating
print "Ya no se encuentran trenes en este horario"
Artist rendition shows the new veterans' center at Tony La Russa's Animal Rescue Foundation in Walnut Creek. The center will serve as a training space for ARF's Pets and Vets program, which matches veterans with rescue dogs, which the vets then train as their service dogs.
Site plan for Tony La Russa's Animal Rescue Foundation in Walnut Creek, showing the new veterans' center, additional dog runs and other improvements and additions.
Veteran Kai Barnwell, left, watches his dog Bubba being examined by Dr. Josie Noah, right, and assistants Kendall Swanson, center, and Mallory Brady, top, at ARF's mobile clinic. Vets receive free health care for their pets through ARF's Pets and Vets program.
Veteran Don Perry, left, holds his dog Rowdy's leash while talking to Dr. Josie Noah, right,. Noah runs ARF's mobile clinic, which provides free health care to the pets of veterans.
Katie, a 3-year-old pit bull-Labrador mix, is photographed at Tony La Russa's Animal Rescue Foundation in Walnut Creek. Katie is owned by U.S. Marine veteran David Fuller and is part of the Pets and Vets program at ARF.
Layla, right, and Thor, both pit bull mixes, hang out with their owners at Tony La Russa's Animal Rescue Foundation in Walnut Creek. Both were rescued and placed in the Pets and Vets program and are now companions of veterans.
Pit Bull mix Thor poses for a photo at Tony La Russa's Animal Rescue Foundation in Walnut Creek. Thor is part of the Pets and Vets program at ARF.
Pit bull mix Thor gives his owner, Johnny Delashaw, a high five.
Veterans Johnny Delashaw and his dog Thor, and Rudy DuBord and his dog Layla, right, pose for a photo.
Lady, a 5-year-old basenji mix, is photographed with her service dog vest at Heather Farm Park in Walnut Creek.
Veteran Madeline Gibson and Lady, a 5-year-old basenji mix.
Veteran Samuel Phillips pets Mackenzie, a 3-year-old pit bull terrier mix.
Mackenzie, a 3-year-old pit bull terrier mix, is being trained as a service dog.
Veteran Madeline Gibson and Lady, a 5-year-old basenji mix, spend time together at the Heather Farm dog park in Walnut Creek. Gibson and Lady have completed service dog training through the Pets and Vets program at the Tony La Russa's Animal Rescue Foundation.
Danny Kimbrell, Pets and Vets trainer, puts Laramie, a pit bull mix, through some tests during the evaluation process at Tony La Russa's Animal Rescue Foundation in Walnut Creek. Laramie is being evaluated for a possible entry into the Pets and Vets service dog program.
A friendly dog is brought in to play with Laramie, a pit bull mix, right, during her evaluation for possible entry into the Pets and Vets service dog program at Tony La Russa's Animal Rescue Foundation.
Dogs are tested for temperament before being admitted into the Pets and Vets program.
Michael Spain, of Stockton, rewards his dog Harvey with a kiss during a training session at Tony La Russa's Animal Rescue Foundation. Spain, who served in the Navy, is participating in the Pet and Vets program.
Layla rests on the floor after a busy day of training with her owner, Rudy DuBord, of Oakdale. DuBord is participating in the Pets and Vets program, which pairs veterans with emotional support animals to help them cope with PTSD, anxiety, depression and other issues when they return to civilian life.
Nicky Raulston, of Pittsburg, walks with her dog, Riley, during a training session.
Layla takes a break after some rigorous training.
Tony La Russa’s Animal Rescue Foundation is breaking ground on a new veterans’ center and expanded dog-holding facilities that will allow more vets to be paired with rescued service dogs.
The facility, to be built at ARF’s headquarters at 2890 Mitchell Dr., Walnut Creek, will serve as the first shelter-based national headquarters for a pets and vets training center, and will allow shelters from around the country to take advantage of ARF’s expertise to create their own programs.
The $18.7 million, 23,800-square-foot expansion includes almost 14,000 square feet for the new “Pets and Vets Center,” as well as a special clinic entry for veterans, and a kennel that will add 30 dog runs to ARF’s existing building and is expected to save 500 more dogs each year.
La Russa launched the Pets and Vets program seven years ago as a way to give back to veterans who have served our country. In the past few years, the program has expanded to include pairing vets who have Post Traumatic Stress Disorder with rescue dogs that the vets then train to be their service dogs.
It is a unique program that provides both the dogs and the training to vets at no cost to them.
Since the program launched, vets have shared training space with other activities going on at ARF, which is not ideal, said ARF executive director Elena Bicker. The vets often struggle in crowds and in areas that they can’t control, such as rooms with several exits and people coming and going. The new center will provide a private, secure space for participants in the program.
It also will provide important training spaces for the dogs, including an all-weather training field and simulated home and office settings, allowing the dogs to learn the behaviors required of service support animals.
Corporate donations and a pledge from the Engelstad Family Foundation have raised some of the necessary funds, but the majority of gifts are from individuals. ARF is asking the community to put it over the finish line by raising the final $1 million. For every dollar donated now, the Engelstad Family Foundation will match it with $2.
ARF’s Pets and Vets program identifies rescued dogs with the suitable temperament and intelligence to be trained as service animals. The dogs are put through a series of tests before they’re assigned to a vet and training begins. For vets who suffer from PTSD, a crippling and isolating condition that can make them afraid to leave their homes or interact with others, training the dog gives them a purpose and a reason to re-enter the world.
In addition to the new veterans’ center and dog runs, ARF will add solar panels to reduce its carbon footprint and save on energy costs. The plan also creates an endowment fund for building maintenance.
It’s been a long road for ARF, which opened in 1991 in a borrowed storefront in Concord’s Willows shopping center. The group moved several more times before raising enough money in 2003 to purchase land on Mitchell Road in Walnut Creek and build a 31,873-square-foot shelter designed to hold 112 cats and 36 dogs.
The group’s central mission of rescuing pets and finding them homes has not changed, but ARF has taken on new roles and programs, and rescued more and more animals, outgrowing its building.
To donate, go to www.arflife.org/campaign. ARF also is looking for people to foster pets during the construction, which is expected to take 18 months. Find more information on fostering, as well as in-person and online training schedules, at arflife.org/foster.
For more pets and animals coverage, follow us on Flipboard.
#!/usr/bin/env python
# -*- coding: utf-8 -*-
__author__ = 'Frederic'
import os
import sys
sys.path.append("../src")
from gehol import GeholProxy
from gehol.studentsetcalendar import StudentSetCalendar
from gehol.converters.utils import write_content_to_file
from gehol.converters.rfc5545icalwriter import convert_geholcalendar_to_ical
from pprint import pprint
DATA_DIR = "../data/student/"
DATA_FILE = "../data/student-2012/SOCO_BA3.html"
first_monday = '19/09/2011'
def make_ical_from_local_file(filename):
f = open(filename)
cal = StudentSetCalendar(f)
print cal.header_data
pprint(cal.events)
ical_content = convert_geholcalendar_to_ical(cal, first_monday)
write_content_to_file(ical_content, "%s.ics" % cal.description)
URLs = [("MA1 en sciences informatiques - Spécialisée - Multimedia", "http://164.15.72.157:8081/Reporting/Individual;Student%20Set%20Groups;id;%23SPLUS0FACD0?&template=Ann%E9e%20d%27%E9tude&weeks=1-14&days=1-6&periods=5-33&width=0&height=0"),
("BA1 en sciences de l'ingénieur, orientation ingénieur civil - Série 2B", "http://164.15.72.157:8081/Reporting/Individual;Student%20Set%20Groups;id;%23SPLUSA6299F?&template=Ann%E9e%20d%27%E9tude&weeks=1-14&days=1-6&periods=5-33&width=0&height=0"),
("BA3 en information et communication", "http://164.15.72.157:8081/Reporting/Individual;Student%20Set%20Groups;id;%23SPLUS35F074?&template=Ann%E9e%20d%27%E9tude&weeks=1-14&days=1-6&periods=5-33&width=0&height=0"),
]
def make_ical_from_url(name, url):
gehol_proxy = GeholProxy()
cal = gehol_proxy.get_studentset_calendar_from_url(url)
ical = convert_geholcalendar_to_ical(cal, first_monday)
ical_data = ical.as_string()
outfile = "%s.ics" % name
print "writing ical file : %s" % outfile
write_content_to_file(ical_data, outfile)
GROUP_IDs = ["%23SPLUS0FACD0", "%23SPLUSA6299D", "%23SPLUS35F0CB", "%23SPLUS35F0CA", "%23SPLUS4BCCBA", "%23SPLUSA6299B"]
def make_ical_from_groupid(group_id):
gehol_proxy = GeholProxy()
cal = gehol_proxy.get_studentset_calendar(group_id, "1-14")
ical = convert_geholcalendar_to_ical(cal, first_monday)
ical_data = ical.as_string()
outfile = "%s.ics" % ical.name
print "writing ical file : %s" % outfile
write_content_to_file(ical_data, outfile)
if __name__ == "__main__":
for (profile, url) in URLs:
make_ical_from_url(profile, url)
    for group_id in GROUP_IDs:
        make_ical_from_groupid(group_id)
Click on the links below to get your forms (opens in new tab). The Physicians Referral form is here as well. Any questions, please call us at (808) 674-9998. We're always glad to answer.
Learn more about our Physical Therapy & Massage Therapy Services or Contact us now for a consultation. We're ready to help you heal.
© 2019 Rebound Hawaii. All Rights Reserved.
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models, migrations
class Migration(migrations.Migration):
dependencies = [
('jobs', '0019_auto_20160824_1516'),
]
operations = [
migrations.AddField(
model_name='server',
name='crdntl_instanceip',
field=models.GenericIPAddressField(null=True),
),
migrations.AddField(
model_name='server',
name='crdntl_loginnode',
field=models.CharField(max_length=40, null=True),
),
migrations.AlterField(
model_name='server',
name='crdntl_domain',
field=models.CharField(max_length=50, null=True),
),
migrations.AlterField(
model_name='server',
name='crdntl_password',
field=models.CharField(max_length=20, null=True),
),
migrations.AlterField(
model_name='server',
name='crdntl_user',
field=models.CharField(max_length=50, null=True),
),
migrations.AlterField(
model_name='server',
name='server_cpus',
field=models.IntegerField(null=True),
),
]
Capital preservation is protecting yourself from losing money when you make a mistake. And you WILL make mistakes.
Basically, it ensures that no event, no matter how big or small, can put you out of business (so to speak). But it is NOT safe investing.
Digging yourself out of a hole is VERY hard (see my page on compound interest). Capital preservation helps you avoid digging one in the first place. That is why it is the first step on your way to better investing.
Capital is money that you use to purchase assets or new "stuff". From a personal finance perspective, think of it as your profit; the hard earned money you have after paying all the bills. Capital is your ability to buy new things.
Preservation is the act of protecting something from loss.
So capital preservation is the act of protecting your ability to buy new things.
Throughout the site, I describe using portfolio sizing and position sizing; these are ways to minimize the amount of money you risk with each trade. The topic is popular during periods of uncertainty (either in the financial markets or global economies). But minimizing risk is something you do for every investment and/or trade, regardless of what the talking heads are saying.
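To make position sizing concrete, here is a minimal sketch in Python (the 1% risk figure, account size and prices are assumptions for illustration, not recommendations):

def position_size(account_value, risk_fraction, entry_price, stop_price):
    # dollars you are willing to lose on this one trade
    dollars_at_risk = account_value * risk_fraction
    # loss per share if the stop-loss is hit
    loss_per_share = entry_price - stop_price
    return int(dollars_at_risk / loss_per_share)

# risk 1% of a $25,000 account on a stock bought at $40 with a stop at $38
shares = position_size(25000, 0.01, 40.00, 38.00)
print(shares)  # 125 shares: a $2-per-share loss costs $250, i.e. 1% of the account

In other words, the position size follows from the loss you are prepared to accept, not from how much you like the stock.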
You may have noticed that capital preservation is the first of three steps to better investing.
The reason this is the first step is simple. You need to have the ability to buy assets if you're going to invest. If you don't have any money, you can't buy any stocks, bonds, mutual funds, ETFs, etc.
If you live from paycheck to paycheck, you have no money to invest.
If you have no money to invest, you can't generate income or grow your accounts by investing.
Investing is one of the only ways to have a chance at "retirement" in the traditional sense.
If you don't save and invest for retirement, you'll quickly lose the ability to buy things once you stop working.
As soon as retirement planning moved away from defined benefit pension plans (employer sponsored) to defined contribution plans (employee sponsored), the burden of preparing for retirement shifted from employers to employees.
Where to use Capital Preservation?
You need to use it in all your accounts.
Keep in mind that some investment brokers and banks will impose maintenance fees if you do not maintain a minimum balance. This helps them offset the low commissions they charge.
I once had an account that charged $25 if my balance fell below a certain threshold! That's $300 per year! Needless to say, I preserved my capital by moving it elsewhere.
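A quick back-of-the-envelope check shows why that matters (the $2,000 balance is an assumed example just under the minimum):

monthly_fee = 25.0
annual_fee = monthly_fee * 12      # 300.0 dollars per year
balance = 2000.0                   # assumed balance just under the threshold
print(annual_fee / balance * 100)  # 15.0 -> a guaranteed 15% annual loss

No investment strategy has to overcome a handicap like that; moving the account is the easy win.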
How to use Capital Preservation?
As mentioned in the better investing page, avoiding losses at all costs is at the core of safe investing.
Losses can come in the form of day-to-day expenses, commissions, fees, taxes, interest payments, inflation, deflation, bid/ask spreads, getting fired, accidents; the list goes on and on.
#MenuTitle: Show Kerning Pairs for Space
# -*- coding: utf-8 -*-
__doc__="""
Show Kerning Pairs for this glyph in a new tab.
"""
import GlyphsApp
import traceback
thisFont = Glyphs.font
Doc = Glyphs.currentDocument
selectedLayers = thisFont.selectedLayers
selectedMaster = thisFont.selectedFontMaster
masterID = selectedMaster.id
kernDict = thisFont.kerningDict()
leftGroups = {}
rightGroups = {}
for g in thisFont.glyphs:
    if g.rightKerningGroup:
        group_name = g.rightKerningGroupId()
        # collect glyphs that can stand on the left side of a pair
        leftGroups.setdefault(group_name, []).append(g.name)
    if g.leftKerningGroup:
        group_name = g.leftKerningGroupId()
        # collect glyphs that can stand on the right side of a pair
        rightGroups.setdefault(group_name, []).append(g.name)
def nameMaker(kernGlyphOrGroup, side):
    # if this is a kerning group
    if kernGlyphOrGroup[0] == "@":
        if side == "right":
            # right glyph, left kerning group
            try:
                return rightGroups[kernGlyphOrGroup][0]
            except (KeyError, IndexError):
                pass
        elif side == "left":
            # left glyph, right kerning group
            try:
                return leftGroups[kernGlyphOrGroup][0]
            except (KeyError, IndexError):
                pass
    else:
        return thisFont.glyphForId_(kernGlyphOrGroup).name
editString = u""""""
editStringL = u""""""
editStringR = u""""""
thisGlyph = thisFont.glyphs["space"]
thisGlyphName = thisGlyph.name
rGroupName = str(thisGlyph.rightKerningGroup)
lGroupName = str(thisGlyph.leftKerningGroup)
# print "\t", rGroupName, lGroupName
kernPairListL = []
kernPairListSortedL = []
kernPairListR = []
kernPairListSortedR = []
for L in thisFont.kerning[ masterID ]:
try:
# if the this kerning-pair's left glyph matches rGroupName (right side kerning group of thisGlyph)
if rGroupName == L[7:] or rGroupName == thisFont.glyphForId_(L).name or thisFont.glyphForId_(L).name == thisGlyph.name:
# for every R counterpart to L in the kerning pairs of rGroupName
for R in thisFont.kerning[masterID][L]:
if thisFont.kerning[masterID][L][R] != 0:
kernPairListL += [nameMaker(R, "right")]
except:
# print traceback.format_exc()
pass
for R in thisFont.kerning[masterID][L]:
try:
# if the R counterpart (class glyph) of L glyph is the selectedGlyph
if lGroupName == R[7:] or lGroupName == thisFont.glyphForId_(R).name or thisFont.glyphForId_(R).name == thisGlyph.name:
if thisFont.kerning[masterID][L][R] != 0:
kernPairListR += [nameMaker(L, "left")]
except:
pass
kernPairListSortedL = [g.name for g in thisFont.glyphs if g.name in kernPairListL]
for everyGlyph in kernPairListSortedL:
editStringL += "/%s/%s/bar" % (thisGlyphName, everyGlyph)
kernPairListSortedR = [g.name for g in thisFont.glyphs if g.name in kernPairListR]
for everyGlyph in kernPairListSortedR:
editStringR += "/%s/%s/bar" % (everyGlyph, thisGlyphName)
# editString = "/bar" + editStringL + "\n\n" + editStringR + "/bar"
editString = "/bar%s\n\n/bar%s" % (editStringL, editStringR)
thisFont.newTab(editString)
We’re in the home stretch, friends!! Less than a week left to vote! Help me win the Amazing Avocado Big Hit Grand Prize ($5000!) for my yummy Avocado recipe by voting daily for me at http://www.theamazingavocado.com/bighit/vote.php?id=8. Thank you!!
Typically, my Dear You Guys letter is composed of a selection of comments that you guys have left on my blog during the last week. This week, though, Julie from DUTCHbeingME and I traded comments, so this is actually a Dear You Guys letter from her readers. She did the same for my comments on her blog!
PS–if you don’t know Julie, you’ve got to pay her a visit. I met her at BloggyBootCamp in Philly last month and we totally hit it off. Love her!
So… I kind of fell off the blogging planet for awhile there, so you may have forgotten I existed. But I didn’t forget that YOU existed. YEP YEP YEP! Live your life! It’s awesome anyway, just wish I could see your face.
Oh my goodness! I love this story!! Seriously love it. It’s going to be a hit, I just know it! It’s like the perfect Dear You Guys variation!! You have piqued my interest….So with all of the practice I’ve had the last 14 years of being a mom (duh Tracie, that is where we met!) ….why am I not perfect at it yet?? Nothing fab but still cute and funny. Ha!
And don’t worry on the vacuuming…I’m a fail too. That makes us normal. This might have choked me up a bit… maybe. There is no reason to not do that! I vacuum once a week…I used to vacuum every day when my sister and her dogs lived with us. The hair was crazy!! It sounds like it’s something you’d recommend though.
Hang on…Liz thinks someone tweets more than her? Unlikely. I couldn’t get my act together this week. I already know I’m crazy, but putting it in screen shots, well…Too much hospital time! Sometimes my craziest thoughts are on twitter and then I forget all about it. My mind is working… Sometimes I write a tweet and then think “I loved that one – so sad it’ll be gone soon!” Grrrrrr….
PS–The winner of the Nature’s Notebook custom print giveaway is Julie L (aunteegem). Congrats to Julie (this is not the same Julie from above!) and thanks to all who entered!
That is SO much fun! :) I love how you changed things up so much! I really have to work on my creativity with this! Ack!
I LOVE that you guys swapped blogs, and I REALLY love that #OutTweetTheTwit has made it’s way into this post!
It HAD to make the letter, didn’t it?
I love that we have such a fun little circle going with this meme. Thanks again for hosting!
import logging
import dbus
import telepathy
import util.misc as misc_utils
_moduleLogger = logging.getLogger(__name__)
class ContactsMixin(telepathy.server.ConnectionInterfaceContacts):
ATTRIBUTES = {
telepathy.CONNECTION : 'contact-id',
telepathy.CONNECTION_INTERFACE_SIMPLE_PRESENCE : 'presence',
telepathy.CONNECTION_INTERFACE_ALIASING : 'alias',
telepathy.CONNECTION_INTERFACE_AVATARS : 'token',
telepathy.CONNECTION_INTERFACE_CAPABILITIES : 'caps',
telepathy.CONNECTION_INTERFACE_CONTACT_CAPABILITIES : 'capabilities'
}
def __init__(self):
telepathy.server.ConnectionInterfaceContacts.__init__(self)
dbus_interface = telepathy.CONNECTION_INTERFACE_CONTACTS
self._implement_property_get(
dbus_interface,
{'ContactAttributeInterfaces' : self.get_contact_attribute_interfaces}
)
def HoldHandles(self, *args):
"""
@abstract
"""
raise NotImplementedError("Abstract function called")
# Overwrite the dbus attribute to get the sender argument
@misc_utils.log_exception(_moduleLogger)
@dbus.service.method(telepathy.CONNECTION_INTERFACE_CONTACTS, in_signature='auasb',
out_signature='a{ua{sv}}', sender_keyword='sender')
def GetContactAttributes(self, handles, interfaces, hold, sender):
#InspectHandle already checks we're connected, the handles and handle type.
supportedInterfaces = set()
for interface in interfaces:
if interface in self.ATTRIBUTES:
supportedInterfaces.add(interface)
else:
_moduleLogger.debug("Ignoring unsupported interface %s" % interface)
handle_type = telepathy.HANDLE_TYPE_CONTACT
ret = dbus.Dictionary(signature='ua{sv}')
for handle in handles:
ret[handle] = dbus.Dictionary(signature='sv')
functions = {
telepathy.CONNECTION:
lambda x: zip(x, self.InspectHandles(handle_type, x)),
telepathy.CONNECTION_INTERFACE_SIMPLE_PRESENCE:
lambda x: self.GetPresences(x).items(),
telepathy.CONNECTION_INTERFACE_ALIASING:
lambda x: self.GetAliases(x).items(),
telepathy.CONNECTION_INTERFACE_AVATARS :
lambda x: self.GetKnownAvatarTokens(x).items(),
telepathy.CONNECTION_INTERFACE_CAPABILITIES:
lambda x: self.GetCapabilities(x).items(),
telepathy.CONNECTION_INTERFACE_CONTACT_CAPABILITIES :
lambda x: self.GetContactCapabilities(x).items()
}
#Hold handles if needed
if hold:
self.HoldHandles(handle_type, handles, sender)
# Attributes from the interface org.freedesktop.Telepathy.Connection
# are always returned, and need not be requested explicitly.
supportedInterfaces.add(telepathy.CONNECTION)
for interface in supportedInterfaces:
interface_attribute = interface + '/' + self.ATTRIBUTES[interface]
results = functions[interface](handles)
for handle, value in results:
ret[int(handle)][interface_attribute] = value
return ret
def get_contact_attribute_interfaces(self):
return self.ATTRIBUTES.keys()
In Elstree Studios GL2 stage today for this year’s first live Strictly. We’ve been involved in every series of Strictly – this year is series 15!
Round One provide a data system which enables the judges to lock in their scores and signal them to the production team at the end of each dance, prior to them showing the scores on their paddles. The system automatically records the scores and generates the totals, ready for use on the scoreboard.
Our systems provide the production team with a comprehensive live data display in the production galleries, enabling them to analyse the judges’ scores and provide additional narrative to the presenters in real time.
"""
byceps.services.party.models.party
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
:Copyright: 2006-2020 Jochen Kupperschmidt
:License: Modified BSD, see LICENSE for details.
"""
from datetime import datetime
from typing import Optional
from ....database import db
from ....typing import BrandID, PartyID
from ....util.instances import ReprBuilder
from ...brand.models.brand import Brand
from ...shop.shop.transfer.models import ShopID
class Party(db.Model):
"""A party."""
__tablename__ = 'parties'
id = db.Column(db.UnicodeText, primary_key=True)
brand_id = db.Column(db.UnicodeText, db.ForeignKey('brands.id'), index=True, nullable=False)
brand = db.relationship(Brand, backref='parties')
title = db.Column(db.UnicodeText, unique=True, nullable=False)
starts_at = db.Column(db.DateTime, nullable=False)
ends_at = db.Column(db.DateTime, nullable=False)
max_ticket_quantity = db.Column(db.Integer, nullable=True)
shop_id = db.Column(db.UnicodeText, db.ForeignKey('shops.id'), index=True, nullable=True)
ticket_management_enabled = db.Column(db.Boolean, default=False, nullable=False)
seat_management_enabled = db.Column(db.Boolean, default=False, nullable=False)
archived = db.Column(db.Boolean, default=False, nullable=False)
def __init__(
self,
party_id: PartyID,
brand_id: BrandID,
title: str,
starts_at: datetime,
ends_at: datetime,
*,
max_ticket_quantity: Optional[int] = None,
shop_id: Optional[ShopID] = None,
) -> None:
self.id = party_id
self.brand_id = brand_id
self.title = title
self.starts_at = starts_at
self.ends_at = ends_at
self.max_ticket_quantity = max_ticket_quantity
self.shop_id = shop_id
@property
def is_over(self) -> bool:
"""Returns true if the party has ended."""
return self.ends_at < datetime.utcnow()
def __repr__(self) -> str:
return ReprBuilder(self) \
.add_with_lookup('id') \
.build()
LNDN DRGS (@lndn_drgs) offer up a teaser of what’s to come from their upcoming project in the form of a new nine-minute release. The track features various verses taken from “Used To It”, “Ho Convention”, “Pronounce”, “No. 1” & “Low On The 7”. Take a listen below.
G-Eazy (@g_eazy) drops off three tracks at once as he prepares to drop off his upcoming project in the fourth quarter. Listen below.
Jidenna’s (@jidenna @KendrickLamar) Kendrick Lamar assisted “Classic Man” receives the visual treatment as the two take to the streets of California with cameos from Hit-Boy, Janelle Monae & Ty Dolla Sign. Watch below.
Future (@1Future) continues his Like I Never Left documentary series as he drops off the third instalment in as many days. Dirty Sprite 2 drops on Friday.
A$AP Rocky (@asvpxrocky) joined Rod Stewart and James Corden as they toured the streets of L.A. for this edition of Carpool Karaoke. Check below as the trio perform Rocky’s A.L.L.A. track “Everyday”.
Shy Glizzy (@ShyGlizzy @QuavoStuntin @OffsetYRN), Quavo and Offset snap on “Head Huncho,” an ode of sorts to El Chapo. Stream it below!
Results (@iamResults @GhostGotBeats) uses eyebrow-raising lyrics, driven by an anthem-like hook, to serve as motivation to “get it”, then get more, on his latest effort entitled “Not A Day Go By.” Listen to the Ghost-produced track below!
# -*- coding: utf-8 -*-
"""
Created on Thu Aug 6 08:56:57 2015
@author: boland
The following script is used to generate a control file for
the programme NonLinLoc. The format is such:
Source description (multiple sources can be specified)
GTSRCE stat_name loc_type x_srce y_srce z_srce elev
Examples:
GTSRCE STA XYZ 27.25 -67.78 0.0 1.242
GTSRCE CALF LATLON 43.753 6.922 0.0 1.242
GTSRCE JOU LATLONDM 43 38.00 N 05 39.52 E 0.0 0.300
For each control file, the location types should be consistent. Elevation is
measured in km.
For Example:
GTSRCE 101 LATLON 49.9520 12.5110 0.0 0.677
GTSRCE 102 LATLON 49.9660 12.5440 0.0 0.595
GTSRCE 103 LATLON 49.9780 12.5860 0.0 0.548
GTSRCE 104 LATLON 49.9910 12.6120 0.0 0.531
GTSRCE 105 LATLON 50.0070 12.6490 0.0 0.733
"""
import sys; sys.path.append('../..')
import os
from obspy import read
from classes.dataless import Dataless
import numpy as np
# set input parameters.
# this script can take either station XML, dataless SEED metadata or MSEED
# set one import path to the type of file that you want to import for your
# metadata
in_path = 'S.BHZ.01.2014.696174.dataless'
use_dataless = True
# set output file location and name
outfolder = os.getcwd()
outfile = 'station_info'
outpath = os.path.join(outfolder, outfile)
if use_dataless:
obj_dataless = Dataless(in_path)
coords = obj_dataless.locs_from_dataless()
stats = obj_dataless.stats_from_dataless()
#combine stations IDs with location data
info = np.column_stack((stats, coords))
if os.path.exists(outpath):
# generate a searchable object
search = True
search_list = []
else:
search = False
# now construct the control file line syntax and output to outpath
with open(outpath, "a+") as f:
    if search:
        # if the path already exists, then make sure to only append new lines!
        # "a+" opens with the file pointer at EOF, so rewind before reading
        f.seek(0)
        for line in f:
            search_list.append(line)
for row in info:
line_str = 'GTSRCE %s LATLON %0.4f %0.4f 0.0 %0.3f\n'\
%(str(row[0].split('.')[-1]),
float(row[2]),
float(row[1]),
float(row[3])/1e3)
        # append only lines that aren't already present (or everything,
        # when the output file is new)
        if not search or line_str not in search_list:
            f.write(line_str)
|
Finally, a stainless steel version of the license plate bracket that was originally used on Airstreams in 1969 and later. Other versions of this bracket--like the Truck-Lite or Signal Stat brand--are chrome plated and much more likely to rust. We developed our own polished 304 stainless steel version in response to this longstanding issue.
Some other brands, like Yellowstone, started using this style in the early 1960s. It is similar to--and the best replacement for--the Yankee 331/332 and the Hollywood Accessories Model 42 from the early 1950s.
Please note that the photograph shows the license plate bracket installed properly in the assembly for Airstreams and most other vintage trailers. The slots for the license plate fasteners should be offset from the trailer body to allow clearance for the heads of the bolts you are using to affix the plate to the bracket.
Some other applications call for the light to be installed using two rubber standoffs from the vehicle. Those black standoffs are included. When standoffs are used, the bracket can be installed in the opposite orientation.
Uses a #67 bulb (included).
This license plate assembly is slightly larger than the version we sell for 1968 and earlier trailers.
Stainless light portion is 6 inches long, 2 inches wide (at widest point) and sticks out 2 inches from bracket.
License bracket is 10.75 inches long. Hole spacing is 4-7/8" between two screw holes.
Stainless components include: polished cover, backplate, screws and plate bracket. Some unseen parts are zinc plated steel. |
"""
Created on Aug 16, 2018
@author: StarlitGhost
"""
import json
import re
from typing import Optional, Tuple
from urllib.parse import urlparse
import dateutil.parser
import dateutil.tz
from bs4 import BeautifulSoup
from twisted.plugin import IPlugin
from twisted.words.protocols.irc import assembleFormattedText as colour, attributes as A
from zope.interface import implementer
from desertbot.message import IRCMessage
from desertbot.moduleinterface import IModule
from desertbot.modules.commandinterface import BotCommand
from desertbot.response import IRCResponse
@implementer(IPlugin, IModule)
class Mastodon(BotCommand):
def actions(self):
return super(Mastodon, self).actions() + [('urlfollow', 2, self.followURL)]
def triggers(self):
return ['cw']
def help(self, query):
if query and query[0].lower() in self.triggers():
return {
'cw': 'cw <toot URL> - displays the contents of toots with content warnings'
}[query[0].lower()]
else:
return ('Automatic module that fetches toots from Mastodon URLs. '
'Also {}cw to display contents of toots with content warnings'
.format(self.bot.commandChar))
def execute(self, message: IRCMessage):
# display contents of a toot with a content warning
if message.command == 'cw':
if not message.parameters:
return IRCResponse(self.help(['cw']), message.replyTo)
match = re.search(r'(?P<url>(https?://|www\.)[^\s]+)',
message.parameters,
re.IGNORECASE)
if not match:
return IRCResponse('{!r} is not a recognized URL format'.format(message.parameters), message.replyTo)
follow = self.followURL(message, url=message.parameters, showContents=True)
if not follow:
return IRCResponse("Couldn't find a toot at {!r}".format(message.parameters), message.replyTo)
toot, _ = follow
return IRCResponse(toot, message.replyTo)
    def followURL(self, _: IRCMessage, url: str, showContents: bool = False) -> Optional[Tuple[str, str]]:
# check this is actually a Mastodon instance we're looking at
hostname = urlparse(url).hostname
endpoint = 'https://{domain}/api/v1/instance'.format(domain=hostname)
endpointResponse = self.bot.moduleHandler.runActionUntilValue('fetch-url', endpoint)
if not endpointResponse:
return
try:
endpointJSON = endpointResponse.json()
except json.decoder.JSONDecodeError:
return
if 'uri' not in endpointJSON:
return
response = self.bot.moduleHandler.runActionUntilValue('fetch-url',
'{}/embed'.format(url))
if not response:
return
soup = BeautifulSoup(response.content, 'lxml')
toot = soup.find(class_='entry')
if not toot:
# presumably not a toot, ignore
return
date = toot.find(class_='dt-published')['value']
date = dateutil.parser.parse(date)
date = date.astimezone(dateutil.tz.UTC)
date = date.strftime('%Y/%m/%d %H:%M')
name = toot.find(class_='p-name')
name = self.translateEmojo(name).text.strip()
user = toot.find(class_='display-name__account').text.strip()
user = '{} ({})'.format(name, user)
content = toot.find(class_='status__content')
summary = content.find(class_='p-summary')
if summary:
summary = self.translateEmojo(summary).text.strip()
text = content.find(class_='e-content')
text = self.translateEmojo(text)
# if there's no p tag, add one wrapping everything
if not text.find_all('p'):
text_children = list(text.children)
wrapper_p = soup.new_tag('p')
text.clear()
text.append(wrapper_p)
for child in text_children:
wrapper_p.append(child)
# replace <br /> tags with a newline
for br in text.find_all("br"):
br.replace_with('\n')
# then replace consecutive <p> tags with a double newline
lines = [line.text for line in text.find_all('p')]
text = '\n\n'.join(lines)
# strip empty lines, strip leading/ending whitespace,
# and replace newlines with gray pipes
graySplitter = colour(A.normal[' ', A.fg.gray['|'], ' '])
lines = [l.strip() for l in text.splitlines() if l.strip()]
text = graySplitter.join(lines)
media = toot.find('div', {'data-component': 'MediaGallery'})
if media:
media = json.loads(media['data-props'])
media = media['media']
numMedia = len(media)
if numMedia == 1:
medType = media[0]['type']
#size = media[0]['meta']['original']['size']
description = media[0]['description']
description = ': {}'.format(description) if description else ''
media = '(attached {medType}{description})'.format(medType=medType,
description=description)
else:
media = '({} media attached)'.format(numMedia)
formatString = str(colour(A.normal[A.fg.gray['[{date}]'],
A.bold[' {user}:'],
A.fg.red[' [{summary}]'] if summary else '',
' {text}' if not summary or showContents else '',
A.fg.gray[' {media}'] if media else '']))
return formatString.format(date=date,
user=user,
summary=summary,
text=text,
media=media), ''
def translateEmojo(self, tagTree):
for img in tagTree.find_all('img', class_='emojione'):
img.replace_with(img['title'])
return tagTree
mastodon = Mastodon()
|
Call your local Regis Salon in the Merced Mall in Merced, CA at (209) 722-5550 and refresh your hairstyle with a trendy haircut or stylish color service; your Regis Salon hairstylist wants to give you a look you love.
When choosing a salon, make the best choice for your hair and visit your local hair salon at Merced Mall in Merced, where your hair stylist strives to provide a luxurious salon experience that leaves you feeling beautiful, confident and happy. Find a hair salon near you and connect with your stylist about the hairstyle you desire. |
__author__ = 'Austin'
import os,os.path
import xml.etree.ElementTree as ET
#get the path to the pmd commits
#gwtcwd = get current working directory
#print os.getcwd()
def parse(logpath):
rootdir = os.path.abspath(os.path.dirname(os.getcwd()))
#print rootdir
LoR= []
pmd_folder = logpath
    print(pmd_folder)
i = 0
completeName = os.path.join(pmd_folder, "CommitResult.txt")
with open(completeName, "w") as output:
for file in os.listdir(pmd_folder):
Result = dict()
currentfile = ""
num_viol = 0
i = i + 1
            xml_path = os.path.join(pmd_folder, "commit" + str(i) + ".xml")
            if os.path.isfile(xml_path):
                output.write("commit" + str(i) + ".xml: \n")
                f = open(xml_path)
lines = f.readlines()
for line in lines:
if '<file name=' in line:
temp = line.split("\\")
if currentfile == "":
currentfile = temp[-1][:-3]
Result[currentfile] = num_viol
else:
if currentfile not in Result:
Result[currentfile] = num_viol
else:
Result[currentfile] += num_viol
num_viol = 0
currentfile = temp[-1][:-3]
                    if '</violation>' in line:
                        num_viol = num_viol + 1
                # flush the count for the last file in the report; the loop
                # above only credits a file when the next <file name=> appears
                if currentfile != "":
                    Result[currentfile] = Result.get(currentfile, 0) + num_viol
                for key in Result.keys():
                    output.write("\t" + key + " : " + str(Result[key]) + "\n")
# print num_viol
f.close()
LoR.append(Result)
return LoR
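
# Hypothetical usage (the folder path is an assumed placeholder): point
# parse() at a directory of PMD reports named commit1.xml, commit2.xml, ...
# and inspect the per-commit {filename: violation_count} dictionaries.
#
#     results = parse(r"C:\pmd_logs")
#     for commit_result in results:
#         print(commit_result)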
|
The Health and Retirement Study (HRS) has recently added extensive new content addressing the use of Veterans Administration (VA) health services. This new content will allow us to examine a wide range of determinants of such use, including economic and attitudinal factors, and racial differences in the impact of these determinants. This project's research questions include: Is there a racial difference in the use of VA health services? What economic, health, social, or attitudinal factors influence such use? Do these factors explain the racial differences observed?
import os
import logging
from PySide.QtGui import QDialog, QVBoxLayout, QHBoxLayout, QLabel, QTabWidget, QPushButton, QCheckBox, QFrame, \
QGroupBox, QListWidgetItem, QListWidget
from PySide.QtCore import Qt
import angr
l = logging.getLogger('dialogs.load_binary')
class LoadBinary(QDialog):
def __init__(self, file_path, parent=None):
super(LoadBinary, self).__init__(parent)
# initialization
self.file_path = file_path
self.option_widgets = { }
# return values
self.cfg_args = None
self.load_options = None
self.setWindowTitle('Load a new binary')
self.main_layout = QVBoxLayout()
self._init_widgets()
self._try_loading()
self.setLayout(self.main_layout)
self.show()
@property
def filename(self):
return os.path.basename(self.file_path)
#
# Private methods
#
def _try_loading(self):
try:
proj = angr.Project(self.file_path)
deps = [ i for i in list(proj.loader._satisfied_deps)
if i not in { 'angr syscalls', 'angr externs', '##cle_tls##', self.filename }
]
dep_list = self.option_widgets['dep_list'] # type: QListWidget
for dep in deps:
dep_item = QListWidgetItem(dep)
dep_item.setData(Qt.CheckStateRole, Qt.Unchecked)
dep_list.addItem(dep_item)
except Exception:
# I guess we will have to load it as a blob?
l.warning("Preloading of the binary fails due to an exception.", exc_info=True)
def _init_widgets(self):
# filename
filename_caption = QLabel(self)
filename_caption.setText('File name:')
filename = QLabel(self)
filename.setText(self.filename)
filename_layout = QHBoxLayout()
filename_layout.addWidget(filename_caption)
filename_layout.addWidget(filename)
self.main_layout.addLayout(filename_layout)
# central tab
tab = QTabWidget()
self._init_central_tab(tab)
self.main_layout.addWidget(tab)
# buttons
ok_button = QPushButton(self)
ok_button.setText('OK')
ok_button.clicked.connect(self._on_ok_clicked)
cancel_button = QPushButton(self)
cancel_button.setText('Cancel')
cancel_button.clicked.connect(self._on_cancel_clicked)
buttons_layout = QHBoxLayout()
buttons_layout.addWidget(ok_button)
buttons_layout.addWidget(cancel_button)
self.main_layout.addLayout(buttons_layout)
def _init_central_tab(self, tab):
self._init_load_options_tab(tab)
self._init_cfg_options_tab(tab)
def _init_load_options_tab(self, tab):
# auto load libs
auto_load_libs = QCheckBox(self)
auto_load_libs.setText("Automatically load all libraries")
auto_load_libs.setChecked(False)
self.option_widgets['auto_load_libs'] = auto_load_libs
# dependencies list
dep_group = QGroupBox("Dependencies")
dep_list = QListWidget(self)
self.option_widgets['dep_list'] = dep_list
sublayout = QVBoxLayout()
sublayout.addWidget(dep_list)
dep_group.setLayout(sublayout)
layout = QVBoxLayout()
layout.addWidget(auto_load_libs)
layout.addWidget(dep_group)
layout.addStretch(0)
frame = QFrame(self)
frame.setLayout(layout)
tab.addTab(frame, "Loading Options")
def _init_cfg_options_tab(self, tab):
resolve_indirect_jumps = QCheckBox(self)
resolve_indirect_jumps.setText('Resolve indirect jumps')
resolve_indirect_jumps.setChecked(True)
self.option_widgets['resolve_indirect_jumps'] = resolve_indirect_jumps
collect_data_refs = QCheckBox(self)
collect_data_refs.setText('Collect cross-references and infer data types')
collect_data_refs.setChecked(True)
self.option_widgets['collect_data_refs'] = collect_data_refs
layout = QVBoxLayout()
layout.addWidget(resolve_indirect_jumps)
layout.addWidget(collect_data_refs)
layout.addStretch(0)
frame = QFrame(self)
frame.setLayout(layout)
tab.addTab(frame, 'CFG Options')
#
# Event handlers
#
def _on_ok_clicked(self):
force_load_libs = [ ]
skip_libs = set()
dep_list = self.option_widgets['dep_list'] # type: QListWidget
        for i in range(dep_list.count()):
item = dep_list.item(i) # type: QListWidgetItem
if item.checkState() == Qt.Checked:
force_load_libs.append(item.text())
else:
skip_libs.add(item.text())
self.load_options = {
'auto_load_libs': self.option_widgets['auto_load_libs'].isChecked(),
'force_load_libs': force_load_libs,
'skip_libs': skip_libs,
}
self.cfg_args = {
'resolve_indirect_jumps': self.option_widgets['resolve_indirect_jumps'].isChecked(),
'collect_data_references': self.option_widgets['collect_data_refs'].isChecked(),
}
self.close()
def _on_cancel_clicked(self):
self.cfg_args = None
self.close()
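
# A minimal usage sketch (assumptions: a running QApplication and a valid
# binary path). The dialog populates .load_options and .cfg_args once the
# user clicks OK, and leaves .cfg_args as None on Cancel.
#
#     dialog = LoadBinary('/bin/ls')
#     dialog.exec_()
#     if dialog.cfg_args is not None:
#         proj = angr.Project('/bin/ls', load_options=dialog.load_options)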
|
A disputed congressional seat in North Carolina could remain vacant for months after incoming Democratic House leaders in Washington on Friday declared they would not seat the apparent Republican winner because of unresolved allegations of election fraud in that race.
Even before Democrats made that fresh vow on Friday afternoon, the chaotic fight for the Ninth District’s House seat had already plunged into deeper turmoil: North Carolina’s state elections board dissolved at noon on Friday under a court order, two weeks before it was to hold a public hearing to consider evidence of the fraud allegations.
“We have entered no man’s land,” said J. Michael Bitzer, a professor of politics and history at Catawba College in Salisbury, N.C.
Friday’s political drama came more than seven weeks after Mark Harris, the Republican nominee for Congress in the Ninth District, seemed to defeat Dan McCready, the Democratic candidate, by 905 votes in November. But Mr. Harris’s apparent victory was soon overshadowed by allegations that a contractor for his campaign engaged in illegal activity to compromise the election. According to witnesses and affidavits, the contractor, L. McCrae Dowless Jr., and people working for him collected absentee ballots in violation of state law.
The accusations led the State Board of Elections and Ethics Enforcement to refuse to certify Mr. Harris as the winner and to open an investigation that has so far involved more than 100 interviews and at least 182,000 pages of records. It also gave Democrats a powerful argument to keep Mr. Harris out of office, at least for now — a pledge they made unequivocally on Friday — under the House’s constitutional authority to be “the judge of the elections, returns and qualifications of its own members.” Democrats had been signaling such a plan for weeks.
But the direction of North Carolina’s investigation into the allegations faced new obstacles on Friday as the elections board dissolved, a court-ordered consequence of a long-running battle over partisan power in North Carolina that was separate from the election fraud investigation. Even with the demise of the board, state officials said the inquiry would continue at the staff level, even if a board was not yet in place to consider evidence and reach conclusions.
“The staff will continue to investigate the Ninth District irregularities and perform all other elections-related functions,” said Patrick Gannon, a spokesman for the embattled panel.
The state board issued a new round of subpoenas in the hours before its dissolution. And in a letter on Friday, Joshua D. Malcolm, the Democrat who was chairman of the elections board, complained that Mr. Harris’s campaign had not been sufficiently responsive to a subpoena that was served weeks ago.
Mr. Malcolm said investigators had received 398 pages of campaign records — and believed there were about 140,000 other documents that could be of value to investigators but had not been made available.
“You are hereby requested to fully comply with the board’s subpoena so as to not further impact the agency’s ability to resolve the investigation,” Mr. Malcolm wrote in his letter to Mr. Harris’s campaign.
Mr. Harris has denied wrongdoing but acknowledged this month that he had directed the hiring of Mr. Dowless, a political operative from Bladen County who had previously been scrutinized by the authorities for possible election tampering. No one, including Mr. Dowless, has been charged in connection with this year’s allegations, and Mr. Dowless, who has declined to comment, rejected a request to meet with state investigators.
But plans for the January hearing, and the fate of the board, eventually ran headlong into a case that dealt with the constitutionality of the elections board’s design. On Thursday night, in a decision that stunned North Carolina Democrats and Republicans alike, a three-judge panel angrily rejected a bipartisan request to extend the life of the board temporarily.
The ruling left the board with less than 24 hours to exist, and plans for the Jan. 11 hearing uncertain. Some state officials said on Friday that the structure of recent legislation setting up the future elections board meant it could not begin operations until Jan. 31.
Gov. Roy Cooper, a Democrat, said Friday that he intended to name an interim board that would serve until the new law took effect. It was not immediately clear how quickly Mr. Cooper would act, but Republicans said that they would challenge such a move and would consider a boycott of the board.
The board, which said it received the motion at 10:15 a.m., took no action before its dissolution at noon.
But Mr. Malcolm, in his final minutes as elections board chairman, suggested the state would not hasten its inquiry. |
import logging
from django.contrib.auth import get_user_model
from django.core.exceptions import ObjectDoesNotExist
from django.db import models
from django.db.models.signals import post_save
from django.dispatch import receiver
from selvbetjening.core.events.models import Attend
logger = logging.getLogger('selvbetjening.events')
class Payment(models.Model):
class Meta:
app_label = 'events'
user = models.ForeignKey(get_user_model())
attendee = models.ForeignKey(Attend, null=True, on_delete=models.SET_NULL) # store abandoned payments
amount = models.DecimalField(max_digits=6, decimal_places=2)
signee = models.ForeignKey(get_user_model(), null=True, blank=True, related_name='signee_payment_set')
note = models.CharField(max_length=256, blank=True)
created_date = models.DateTimeField(auto_now_add=True)
@receiver(post_save, sender=Payment)
def payment_save_handler(sender, **kwargs):
instance = kwargs.get('instance')
created = kwargs.get('created')
try:
if created:
logger.info('Payment registered (%s,-) -- %s', instance.amount, instance.note,
extra={
'related_user': instance.user,
'related_attendee': instance.attendee
})
except ObjectDoesNotExist:
pass |
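
# Hypothetical usage (model fields from above; `user` and `attend` are
# assumed to exist already): creating a payment fires the post_save
# handler, which writes the "Payment registered" log line.
#
#     Payment.objects.create(user=user, attendee=attend,
#                            amount='150.00', note='entrance fee')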
Patient care is inevitably destined for an electronic future, and an enterprise video solution plays a key role in reforming how healthcare institutions manage and utilize their vast digital media assets, most of which take the shape of unstructured healthcare data today.
Why health providers need an enterprise video platform to reduce the cost of care, increase care accessibility, and deliver better quality of care.
How VIDIZMO helps health providers resolve the daunting challenge of storing, managing and utilizing unstructured digital data.
How VIDIZMO helps improve communication, training, and collaboration with hospital staff, patients, etc. |
# -*- coding: utf-8 -*-
# File: dftools.py
import multiprocessing as mp
from six.moves import range
from ..utils.concurrency import DIE
from ..utils.develop import deprecated
from .serialize import LMDBSerializer, TFRecordSerializer
__all__ = ['dump_dataflow_to_process_queue',
'dump_dataflow_to_lmdb', 'dump_dataflow_to_tfrecord']
def dump_dataflow_to_process_queue(df, size, nr_consumer):
"""
Convert a DataFlow to a :class:`multiprocessing.Queue`.
The DataFlow will only be reset in the spawned process.
Args:
df (DataFlow): the DataFlow to dump.
size (int): size of the queue
        nr_consumer (int): number of consumers of the queue.
            The producer will add this many ``DIE`` sentinels to the end of the queue.
Returns:
tuple(queue, process):
The process will take data from ``df`` and fill
the queue, once you start it. Each element in the queue is (idx,
dp). idx can be the ``DIE`` sentinel when ``df`` is exhausted.
"""
q = mp.Queue(size)
class EnqueProc(mp.Process):
def __init__(self, df, q, nr_consumer):
super(EnqueProc, self).__init__()
self.df = df
self.q = q
def run(self):
self.df.reset_state()
try:
for idx, dp in enumerate(self.df):
self.q.put((idx, dp))
finally:
for _ in range(nr_consumer):
self.q.put((DIE, None))
proc = EnqueProc(df, q, nr_consumer)
return q, proc
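
# A consumer-side sketch (assumptions: `df` is an existing DataFlow and
# `handle` is your own function): drain the queue until the DIE sentinel
# arrives, then join the producer process.
#
#     q, proc = dump_dataflow_to_process_queue(df, size=100, nr_consumer=1)
#     proc.start()
#     while True:
#         idx, dp = q.get()
#         if idx == DIE:
#             break
#         handle(dp)
#     proc.join()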
@deprecated("Use LMDBSerializer.save instead!", "2019-01-31")
def dump_dataflow_to_lmdb(df, lmdb_path, write_frequency=5000):
LMDBSerializer.save(df, lmdb_path, write_frequency)
@deprecated("Use TFRecordSerializer.save instead!", "2019-01-31")
def dump_dataflow_to_tfrecord(df, path):
TFRecordSerializer.save(df, path)
|
Why not spoil someone special with a gift voucher. They make the perfect present for any occasion and are valid for one year. Whether it's a romantic occasion, a birthday surprise, or simply to show someone you care, a gift voucher could be the ideal gift for him or her.
You can choose between a paper voucher presented in a card and envelope, which we will post to you within 3-5 days, or an e-voucher, which we will email directly to you within 24 hours of purchasing - perfect for those last-minute gifts.
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
# kashev.rocks
# Kashev Dalmia - [email protected]
from flask_script import Manager
from flask_assets import ManageAssets
from src.kashevrocks import app
from src.assets import register_assets
manager = Manager(app)
assets_env = register_assets(app)
manager.add_command("assets", ManageAssets(assets_env))
@manager.command
def liveserver(debug=True):
""" Runs a live reloading server which watches non-python code as well. """
import livereload
app.debug = debug
assets_env.debug = debug
server = livereload.Server(app.wsgi_app)
server.watch('src/')
server.serve()
@manager.command
def clean():
""" Cleans up all generated and cache files from the project. """
import shutil
import os
paths_to_clean = ['src/static/.webassets-cache',
'src/static/generated',
'debug.log']
for path in paths_to_clean:
try:
shutil.rmtree(path)
except NotADirectoryError:
os.remove(path) # It's a file, not a directory
except FileNotFoundError:
pass # They're not there, that's fine.
if __name__ == "__main__":
manager.run()
|
Elisha is the gas in our tank at 36. Her days find her constantly switching hats, from tying-up loose ends across departments to writing proposals and making sure the office is fully stocked with all the essentials to keep us going. As a natural cook and incredible singer, she brings a multitude of talents to 36 that extend well beyond being the best executive assistant an agency could ask for.
When she’s not helping everyone at 36 reach their many goals or running quality assurance for the development team, Elisha is probably either hosting friends at her lake house or getting on a plane to travel with her boyfriend. |
from __future__ import print_function, absolute_import, unicode_literals
# stdlib imports
import time
from datetime import datetime
# project imports
from FIPER.generic.subsystem import StreamDisplayer
from FIPER.generic.util import Table
from FIPER.generic.probeclient import Probe
from FIPER.host.component import Listener, Console
# noinspection PyUnusedLocal
class FleetHandler(object):
"""
Class of the main server.
Groups together the following concepts:
- Console is run in the main thread, waiting for and parsing input commands.
    - Listener is listening for incoming car connections in a separate thread.
It also coordinates the creation and validation of new car interfaces.
- CarInterface instances are stored in the .cars dictionary.
- StreamDisplayer objects can be attached to CarInterface objects and
are run in a separate thread each.
- FleetHandler itself is responsible for sending commands to CarInterfaces
and to coordinate the shutdown of the cars on this side, etc.
"""
the_one = None
def __init__(self, myIP):
self.clients = {}
self.ip = myIP
self.cars = {}
self.watchers = {}
self.since = datetime.now()
self.status = "Idle"
self.console = Console(
master_name="FIPER-Server",
status_tag=self.status,
commands_dict={
"cars": self.printout_cars,
"kill": self.kill_car,
"watch": self.watch_car,
"unwatch": self.stop_watch,
"shutdown": self.shutdown,
"status": self.report,
"message": self.message,
"probe": self.probe,
"connect": Probe.initiate,
"sweep": self.sweep
}
)
self.listener = Listener(self)
self.listener.start()
print("SERVER: online")
def mainloop(self):
self.console.mainloop()
def printout_cars(self, *args):
"""List the current car-connections"""
print("Cars online:\n{}\n".format("\n".join(self.cars)))
@staticmethod
def probe(*ips):
"""Probe the supplied ip address(es)"""
IDs = dict(Probe.probe(*ips))
        for IP, ID in IDs.items():
            print("{:<15}: {}".format(IP, ID if ID else "-"))
def message(self, ID, *msgs):
"""Just supply the car ID, and then the message to send."""
self.cars[ID].send(" ".join(msgs).encode())
@staticmethod
def sweep(*ips):
"""Probe the supplied ip addresses and print the formatted results"""
        def get_status(dID):
            # a probe that returned no ID means the host is offline
            return "offline" if dID is None else "available"
if not ips:
print("[sweep]: please specify an IP address range!")
return
IDs = dict(Probe.probe(*ips))
tab = Table(["IP", "ID", "status"],
                    [3*5, max([len(str(v)) for v in IDs.values()] or [2]), 11])  # ``or [2]``: empty-result guard
        for IP, ID in IDs.items():
            tab.add(IP, ID, get_status(ID))
print(tab.get())
def kill_car(self, ID, *args):
"""Sends a shutdown message to a remote car, then tears down the connection"""
if ID not in self.cars:
print("SERVER: no such car:", ID)
return
if ID in self.watchers:
self.stop_watch(ID)
success = self.cars[ID].teardown(sleep=1)
if success:
del self.cars[ID]
def watch_car(self, ID, *args):
"""Launches the stream display in a separate thread"""
if ID not in self.cars:
print("SERVER: no such car:", ID)
return
if ID in self.watchers:
print("SERVER: already watching", ID)
return
self.cars[ID].send(b"stream on")
time.sleep(1)
self.watchers[ID] = StreamDisplayer(self.cars[ID])
def stop_watch(self, ID, *args):
"""Tears down the StreamDisplayer and shuts down a stream"""
if ID not in self.watchers:
print("SERVER: {} is not being watched!".format(ID))
return
self.cars[ID].send(b"stream off")
self.watchers[ID].teardown(sleep=1)
del self.watchers[ID]
def shutdown(self, *args):
"""Shuts the server down, terminating all threads nicely"""
self.listener.teardown(1)
rounds = 0
        while self.cars:
            print("SERVER: Car corpse collection round {}/{}".format(rounds+1, 4))
            # iterate over a copy: kill_car() deletes entries from self.cars
            for ID in list(self.cars):
                if ID in self.watchers:
                    self.stop_watch(ID)
                self.kill_car(ID)
if rounds >= 3:
print("SERVER: cars: [{}] didn't shut down correctly"
.format(", ".join(self.cars.keys())))
break
rounds += 1
else:
print("SERVER: All cars shut down correctly!")
print("SERVER: Exiting...")
def report(self, *args):
"""
Prints a nice server status report
"""
repchain = "FIPER Server\n"
repchain += "-" * (len(repchain) - 1) + "\n"
repchain += "Up since " + self.since.strftime("%Y.%m.%d %H:%M:%S") + "\n"
repchain += "Cars online: {}\n".format(len(self.cars))
print("\n" + repchain + "\n")
    def __enter__(self):
        """Context enter method"""
        if FleetHandler.the_one is None:
            FleetHandler.the_one = self
        else:
            raise RuntimeError("Only one can remain!")
        return self
def __exit__(self, exc_type, exc_val, exc_tb):
"""Context exit method, ensures proper shutdown"""
if FleetHandler.the_one is not None:
FleetHandler.the_one.shutdown()
FleetHandler.the_one = None
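
# Hypothetical entry point (the IP address is an assumed placeholder):
# construct the handler and hand the main thread to the console loop.
#
#     if __name__ == '__main__':
#         server = FleetHandler("192.168.1.10")
#         try:
#             server.mainloop()
#         finally:
#             server.shutdown()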
|
AN OVERGROWN patch of public land has been given a modern makeover.
Pinfold Open Space (between Shambles Street and Westgate) now has a small seating area, newly planted shrubs and improved steps.
The site is named after the pinfolds of medieval times, found mostly in the North and East and built to hold animals which had been found straying.
The damaged Hidden Barnsley plaque, one of seven located around the town centre at sites of historic interest, has been replaced.
Funding for the project came from City Region Growth Point cash. |
# Address utilities
from __future__ import absolute_import
import re
from .crypto import (
sha3,
)
from .encoding import (
encode_hex,
)
from .string import (
force_text,
coerce_args_to_text,
coerce_return_to_text,
)
from .types import (
is_string,
)
from .formatting import (
add_0x_prefix,
remove_0x_prefix,
is_prefixed,
)
@coerce_args_to_text
def is_address(address):
"""
Checks if the given string is an address
"""
if not is_string(address):
return False
if not re.match(r"^(0x)?[0-9a-fA-F]{40}$", address):
return False
    elif re.match(r"^(0x)?[0-9a-f]{40}$", address) or re.match(r"^(0x)?[0-9A-F]{40}$", address):
return True
else:
return is_checksum_address(address)
@coerce_args_to_text
def is_checksum_address(address):
"""
Checks if the given string is a checksummed address
"""
if not is_string(address):
return False
checksum_address = to_checksum_address(address)
return force_text(address) == force_text(checksum_address)
@coerce_args_to_text
def is_strict_address(address):
"""
Checks if the given string is strictly an address
"""
if not is_string(address):
return False
return re.match(r"^0x[0-9a-fA-F]{40}$", address) is not None
@coerce_args_to_text
@coerce_return_to_text
def to_checksum_address(address):
"""
Makes a checksum address
"""
if not is_string(address):
return False
address = remove_0x_prefix(address.lower())
addressHash = sha3(address)
checksumAddress = "0x"
for i in range(len(address)):
if int(addressHash[i], 16) > 7:
checksumAddress += address[i].upper()
else:
checksumAddress += address[i]
return checksumAddress
@coerce_args_to_text
@coerce_return_to_text
def to_address(address):
"""
Transforms given string to valid 20 bytes-length addres with 0x prefix
"""
if is_string(address):
if len(address) == 42:
return address.lower()
elif len(address) == 40:
return add_0x_prefix(address.lower())
elif len(address) == 20:
return encode_hex(address)
elif len(address) in {66, 64}:
long_address = remove_0x_prefix(address.lower())
if is_prefixed(long_address, '000000000000000000000000'):
return add_0x_prefix(address[-40:])
raise ValueError("Unknown address format")
@coerce_args_to_text
def is_same_address(addr1, addr2):
"""
Checks if both addresses are same or not
"""
    if is_address(addr1) and is_address(addr2):
return to_checksum_address(addr1) == to_checksum_address(addr2)
else:
return to_checksum_address(to_address(addr1)) == to_checksum_address(to_address(addr2))
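
# A usage sketch of the helpers above; the mixed-case output is the EIP-55
# checksum form (this address is one of the published EIP-55 test vectors).
#
#     addr = '0xfb6916095ca1df60bb79ce92ce3ea74c37c5d359'
#     is_address(addr)           # -> True
#     to_checksum_address(addr)  # -> '0xfB6916095ca1df60bB79Ce92cE3Ea74c37c5d359'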
|
Traditionally, when you access a website/app/online service which requires you to "log in", all you need to provide is your username (or email address) and a password in order to authenticate your access.
Unfortunately, many people use the same credentials (username/password) over and over again for multiple websites/apps/online services. This means that if one of those services gets "hacked" and has a data breach and user's credentials are exposed, an attacker could potentially then access all other websites/apps/online services the user uses.
Two-factor authentication combats this, by employing a secondary means of authentication in addition to the traditional username/password combination in order to authenticate your access to the website/app/online service when you login. This means that even if your username/password were compromised, an attacker couldn't then use these on their own to gain access to your account.
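In generic terms, the flow looks something like the following minimal Python sketch (this is an illustration only, not MIDAS's actual implementation; the six-digit length and 300-second lifetime are assumptions):

import secrets, time

CODE_TTL = 300  # seconds a code remains valid

def issue_code(store, username):
    # called after the username/password check succeeds
    code = '{:06d}'.format(secrets.randbelow(10 ** 6))
    store[username] = (code, time.time() + CODE_TTL)
    return code  # this is what gets emailed to the user

def verify_code(store, username, submitted):
    code, expires = store.get(username, (None, 0.0))
    return submitted == code and time.time() < expires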
An administrator can enable/disable Two-Factor Authentication via MIDAS Admin Options → Manage MIDAS → Security.
Your MIDAS system will first need to be correctly configured to send email. You may find these settings via MIDAS Admin Options → Manage MIDAS → Email. It is strongly recommended that whenever making changes to the email settings within the software that you then send a "test" email from the system to yourself to ensure you're able to receive email from the system BEFORE enabling Two-Factor Authentication.
IMPORTANT: If you enable Two-Factor Authentication for your MIDAS system yet the software has not been correctly configured to send email, users will not be able to receive Authorization Codes via email and will be unable to log in.
Simply enter the Authorization Code from the email in the space provided and click "Login". If the code is valid, the login process will complete and you will be successfully logged in.
For Two-Factor Authentication in MIDAS to be an effective layer of additional security, you should ensure that the password you use to login to MIDAS is never the same as the password you use to access your own email account inbox. If these two passwords are currently the same, we strongly advise changing one or both.
You can change your MIDAS password at any time, once logged in, via the "Change Password" link near the top of the screen. |
from crispy_forms.bootstrap import AppendedText, PrependedText
from crispy_forms.helper import FormHelper
from crispy_forms.layout import Layout, Submit, Div, Field
from django import forms
from django.contrib.auth import authenticate
from django.contrib.auth.forms import AuthenticationForm, UserCreationForm
from django.contrib.auth.models import User
from ideahub import settings
from ideahub.utils import glyphicon, field_max_len
class LoginForm(AuthenticationForm):
"""
A form that logs a user in
"""
remember_me = forms.BooleanField(
label = 'Remember Me',
required = False,
widget = forms.CheckboxInput
)
def remember_user(self):
try:
if self.cleaned_data.get('remember_me'):
return True
except AttributeError:
pass
return False
def __init__(self, *args, **kwargs):
super(LoginForm, self).__init__(*args, **kwargs)
self.helper = FormHelper()
self.helper.form_method = 'post'
self.helper.form_action = 'login'
self.helper.label_class = 'sr-only'
self.helper.layout = Layout(
PrependedText('username', glyphicon('user'), placeholder='Username'),
PrependedText('password', glyphicon('lock'), placeholder='Password'),
Field('remember_me', css_class='checkbox-inline'),
Submit('submit', 'Login'),
)
class SignupForm(UserCreationForm):
"""
A form that creates a user, with no privileges, from the given username, email,
password, first name and last name.
"""
first_name = forms.CharField(
label = 'First Name',
max_length = field_max_len(User, 'first_name'),
required = True,
widget = forms.TextInput,
)
last_name = forms.CharField(
label = 'Last Name',
max_length = field_max_len(User, 'last_name'),
required = True,
widget = forms.TextInput,
)
email = forms.EmailField(
label = 'Email',
max_length = field_max_len(User, 'email'),
required = True,
widget = forms.EmailInput,
)
username = forms.RegexField(
label = "Username",
max_length = field_max_len(User, 'username'),
required = True,
regex = r'^[\w.@+-]+$',
help_text = "{} characters or fewer. Letters, digits and @/./+/-/_ only.".format(field_max_len(User, 'username')),
error_messages = {
'invalid': "This value may contain only letters, numbers and @/./+/-/_ characters.",
},
)
password1 = forms.CharField(
label = "Password",
min_length = 8,
required = True,
widget = forms.PasswordInput,
help_text = "8 characters minimum.",
)
password2 = forms.CharField(
label = "Repeat Password",
required = True,
widget = forms.PasswordInput,
help_text = "Enter the same password as above, for verification.",
)
class Meta:
model = User
fields = ['first_name', 'last_name', 'email', 'username']
def save(self, commit=True, auth_after_save=True):
user = super(SignupForm, self).save(commit)
if commit and auth_after_save:
user = authenticate(username=self.cleaned_data['username'], password=self.cleaned_data['password1'])
return user
def __init__(self, *args, **kwargs):
super(SignupForm, self).__init__(*args, **kwargs)
self.helper = FormHelper()
self.helper.form_class = 'form-horizontal'
self.helper.form_method = 'post'
self.helper.form_action = 'signup'
self.helper.label_class = 'col-lg-2'
self.helper.field_class = 'col-lg-10'
self.helper.layout = Layout(
Field('first_name', placeholder='First Name'),
Field('last_name', placeholder='Last Name'),
Field('email', placeholder='Email'),
Field('username', placeholder='Username'),
Field('password1', placeholder='Password'),
Field('password2', placeholder='Repeat Password'),
Submit('submit', 'Sign Up'),
) |
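
# Hypothetical view-side usage (Django request handling is assumed):
#
#     form = SignupForm(request.POST)
#     if form.is_valid():
#         user = form.save()  # saves and, by default, authenticates
#         login(request, user)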
0 County Road 924 is a $39,900 home on a 0.47 acre lot located in Sweeny, TX.
Nice piece of land. 20,473 Sq. Ft. Easy owner terms, give us a call today!
I am interested in 0 County Road 924, Sweeny, TX 77480. |
import math
import time
t1 = time.time()
prime = []
def primeSieve(n):
global prime
n = (n+1)//2
p = [True]*(n)
i = 1
prime.append(2)
while i < n:
if p[i]:
t = 2*i+1
prime.append(t)
p[i] = False
j = 2*i*i+2*i
while j < n:
p[j] = False
j += t
i += 1
return prime
primeSieve(10000)
# concatenate two numbers, e.g. seq(3, 7) -> 37
# (integer arithmetic avoids the float results math.pow would give)
def seq(a, b):
    dig = math.floor(math.log10(b) + 1)
    return a * 10 ** dig + b
def isPrime(item):
root = math.floor(math.sqrt(item))
i = 0
t = prime[i]
while t <= root:
if item%t == 0:
return False
if t < prime[-1]:
i += 1
t = prime[i]
else:
t += 2
return True
ps = [[3]]
def canadd(tset,num):
for i in tset:
if not isPrime(seq(i,num)):
return False
if not isPrime(seq(num,i)):
return False
return True
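# Search strategy: grow candidate sets greedily. Each prime under 20 seeds a
# new singleton set (a heuristic that happens to cover the known answer);
# any later prime that pairwise-concatenates to primes with every member of
# an existing set extends that set. Printing fires when a 4-element set
# gains its fifth member (Project Euler problem 60).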
def getanswer():
global ps
for j in range(3,len(prime)):
for k in ps:
if canadd(k,prime[j]):
ps.append(k+[prime[j]])
if len(k) == 4:
print(sum(ps[-1]))
return
if prime[j] < 20:
ps.append([prime[j]])
getanswer()
print("time:",time.time()-t1)
|
Melbourne punters get your engines pumping and get ready for Black Booty Lily.
Hi Gents, please take note of Lily's July tours.
Lilly will be visiting Melbourne on 23rd and 24th June.
Lily is in Sydney until Sunday 17th June.
Lily will be visiting Cairns from 8th to 11th June.
Lily is in Sydney until May 23rd. |
#!/usr/bin/env python
"""
SlipStream Client
=====
Copyright (C) 2013 SixSq Sarl (sixsq.com)
=====
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import os
import sys
import shutil
def process(filename, cmd=None):
''' Append at the end of the file (e.g. rc.local) the SlipStream bootstrap
file to trigger the execution of the node (e.g. package generation,
image creation, deployment)
Do the following:
- Open target file
- Reverse the lines
- Process empty lines, if any
- Look for exit or not empty line
- If starts with exit, prepend the bootstrap line
- If starts with none empty line (but not exit),
prepend the bootstrap script
- Copy rest of lines
- Move original file to <filename>.sav
- Replace old file
Option: The 'cmd' field can be used to customize the command that will be inserted in the file'''
bootstrap = 'mkdir -p /tmp/slipstream/reports\n'
if cmd is None:
bootstrap += os.path.join(os.sep, 'etc', 'slipstream.bootstrap.sh') \
+ ' > ' + os.path.join(os.sep, 'tmp', 'slipstream', 'reports', 'node-execution.log') \
+ ' 2>&1 &\n'
else:
bootstrap += cmd + '\n'
# Backup the file if it was not done before
originalFilename = filename + '.orig'
if not os.path.exists(originalFilename):
shutil.copyfile(filename, originalFilename)
    with open(filename) as infile:
        lines = infile.readlines()
newlines = []
gotit = False
lines.reverse()
for line in lines:
# Simply copy empty lines
if gotit:
newlines.append(line)
continue
if line.strip() == '':
newlines.append(line)
continue
if line.strip().startswith('exit'):
gotit = True
newlines.append(line)
newlines.append(bootstrap)
continue
gotit = True
newlines.append(bootstrap)
newlines.append(line)
savedfilename = filename + '.sav'
if os.path.exists(savedfilename):
os.remove(savedfilename)
shutil.move(filename, savedfilename)
    with open(filename, 'w') as newfile:
        # restore original order before writing back
        newlines.reverse()
        newfile.writelines(newlines)
    os.chmod(filename, 0o755)
if __name__ == '__main__':
if len(sys.argv) < 2 or len(sys.argv) > 3:
sys.stderr.write('Error, usage is: %s <filename> [<command-string>], got: %s\n' %
(sys.argv[0], ' '.join(sys.argv)))
sys.exit(1)
cmd = None
if len(sys.argv) == 3:
cmd = sys.argv[2]
process(sys.argv[1], cmd)
    print('Done!')
|
Transmission: Automatic Color:Red Interior Color: Gray Average Vehicle Review: (4.833 reviews) This is our first Mazda and I'd say I am very impressed so far. It's in the same class as our 2014 Kia Optima SX-T, and they share a lot of similarities - very smooth ride, attractive interior and body lines, great fuel mileage (the Mazda6 gets better mileage than the Kia), and plenty of passing power out of a 4 cylinder, though the Mazda6 is naturally aspirated and the Optima is turbocharged. I can't say which I enjoy driving more, both are a lot of fun and seem to be well put together.
Transmission: Automatic Color:Gray Interior Color: Black Average Vehicle Review: (5 reviews) This is a great looking car. I get compliments all the time. Interior is well designed with plenty of space. Does not have a lot of storage inside and it is a bit loud on the road, but I was aware of that before buying. Mazda is not the quietest car out there but it's not too bad. Having a car that you can't get tired of looking at is the best part. Car has been reliable as well, no problems, I'm at 54k.
Transmission: Automatic Color:Blue Interior Color: Black Average Vehicle Review: (4.833 reviews) This car is worth the money. It handles well, holds a lot of cargo, and is stylish. I would not recommend this car if you want to put child seats in the back, or if you are a wide-bodied person. The front seats are slightly narrow, and the back seats are a little crammed for anyone over 6 feet. I am not going to say that it is the best car in the world. However, it does all things pretty well. If you're in the market for a reasonable priced mid-size sedan with all the creature comforts and reliability of a Japanese Car, then look no farther. It won't disappoint you.
Transmission: Automatic Color:Silver Interior Color: Black Average Vehicle Review: (4.714 reviews) Very sporty looking midsize for those of us who don't want to drive the typical family four-door. Rides very smooth and engine power is more than adequate for day to day driving. I was surprised at how well the car maneuvers and corners considering it's footprint (this is a long car to park). Fuel economy on the highway is spot on and this car is made to cruise with four full size adults in it. Fuel economy in the city suffers the more you have to run through the gears in city traffic. Not bad by any means but expect closer to 21-22 mpg in the city not the advertised 25/26. My only real complaint with the car are the 19" tires. The tires Mazda puts on this car do not last long and expect to take a huge hit in the wallet to replace.
Transmission: Automatic Color:Blue Interior Color: Tan Average Vehicle Review: (4.833 reviews) I bought this as my first family car (previously had a mustang) and couldn't love it more! It's a great family car that still has the sporty look. With tons of room for all passengers! All the technology works great and makes me feel like I'm in a luxury car. I don't regret buying my Mazda for one second!
Transmission: Automatic Color:Black Interior Color: Black Average Vehicle Review: (5 reviews) There is a very good amount of space throughout the car, without feeling like a canned sardine. The car drives very smoothly and you hardly notice any potholes you may encounter. How smoothly it drives and how safe it feels are two of the biggest things my girlfriend loves about this vehicle. Granted, she loves that the passenger seat reclines all the way back, but she has said that when we do have kids, she would love to get a Mazda SUV instead of becoming a mom with a minivan or a mom with a truck that may lack room. The technology throughout the vehicle is also amazing. It is definitely a huge upgrade from the last car I have had!
Transmission: Automatic Color:Brown Interior Color: Black Average Vehicle Review: (4.833 reviews) I absolutely love my 2006 Mazda 6 S. It's the first car I have ever owned, and I've driven a lot of cars in my lifetime. Believe me when I say you definitely need to consider the Mazda6 if you're looking for a car that can tow friends and familiy in comfort and style at a pace befitting of a semi-sports car. I keep up to date with Mazda and their work and I can personally tell you that they build quality cars that last a long time. The only problem that I ever had with my Mazda is that the flexi pipe was torn to shreds when I bought it. (I knew this after inspecting the car, it was a rental/fleet car so you know the people who drove it didn't take care of it). Regardless, the car still works perfectly even after 100,000 miles.
Transmission: Automatic Color:White Interior Color: Gray Average Vehicle Review: (4.833 reviews) this car is a pleasure to own. Add to to that the price and gas milage. Interior materials are somewhat cheap feeling after getting to know the vehicle but I still have to say Mazda really made a awesome car. It's comfortable on cross-country trips and great in traffic. Good handling and acceleration. Definitely get this car with a stick! The gears are in great ratio for making the motor responsive. It feels like a sports car most of the time and the styling helps it give an air of sedan or whatever that means. It's a good car, I get compliments often. I'm happy with my purchase. I've also heard Mazda is know for making durable vehicles past 100k miles so we shall see!
Transmission: Automatic Color:Blue Interior Color: White Average Vehicle Review: (4.571 reviews) I can only speak to the Mazda6i manual (mine happens to be the 5-door hatchback). The handling is excellent, but off the line and overall acceleration is in the compact car class range with the 2.3L (specifically 8.7sec 0-60 on mine). Ok, so the main thing I needed to tell people as an owner is this. Again, I can only speak for the manual, but this car is not for freeway use!!! The 4-cylinder (manual) is rather fuel efficient around town, and on the highway up to about 60MPH, but the gearing on this car doesn't make much sense to me. At freeway speeds (75-80 where I live), the engine is racing in top gear at 3800+ RPM. Yes, this is below red-line, but I can't believe it is good for the engine when run for hours at a time. It also degrades the fuel economy to 28MPG, though the car will get 32MPG in the lower 60MPH range I mentioned. I don't know why Mazda built the car this way (might have something to do with the paltry 153 ft-lbs of torque), but had I known I would not have bought it. I have a virtually brand new car in my driveway, and would fear taking it on road trips if not for the warranty. So the gearing negates the fuel efficiency of the 4-cylinder (in the manual, at least) if you drive much on the freeway, so get the V6.
Transmission: Automatic Color:Black Interior Color: Black Average Vehicle Review: (4.571 reviews) this is a wonderful vehicle that combines an attractive style, the versatility of a wagon but without the boring look of the traditional wagon. and it has outstanding handling for a family vehicle.
Transmission: Automatic Color:White Interior Color: White Average Vehicle Review: (4.833 reviews) I bought this car right before Christmas in 2009 and I must say, other than the price of parts, the car is absolutely amazing. I actually got into a severe accident due to a person running a red light last October, and to this day I believe this car saved my life with its sophisticated protection equipment.
Transmission: Automatic Color:White Interior Color: Black Average Vehicle Review: (4.833 reviews) Overall, I am pleased with the car I selected. I have always liked Mazda's and it's been a few years since owning one. I am glad to be a Mazda owner once again.
Transmission: Automatic Color:White Interior Color: Tan Average Vehicle Review: (4.833 reviews) Great, sporty car for the money. Would prefer if it got better gas mileage but other than that, it's a great car!
Transmission: Automatic Color:Black Interior Color: Tan Average Vehicle Review: (4.833 reviews) I rented a Masda 6 to drive from Arizona home to California. I had a Volvo 80, which i sold and bought a 2007 masda 6 because I feel in love with that car! so far great car 27 mpg around town I got 36 on a long trip using the cruize control. I love my zoom zoom! the trunk is huge and I don't have to buy the expensive tires my Vovlo requried. I would buy it again!
Transmission: Automatic Color:Blue Interior Color: Average Vehicle Review: (4.833 reviews) Have always loved the luxurious features and extras I get in a Mazda. It's well-made and designed, fun and zippy to drive. Great on gas, reliable with a spacious-comfortable interior. My 1st Mazda was a 626 which I had for 11 years! My 2nd was a Millenia which was so beautiful and luxurious - too bad they stopped making them :( Now I drive the 6. Its more sporty than I hoped for but there's no denying the way it handles, the way it looks and the a/c was the cherry on top - it's rocks! I definitely find myself saying "zoom-zoom" while I'm driving.
Transmission: Automatic Color:Gray Interior Color: Tan Average Vehicle Review: (4.714 reviews) I bought my Mazda 6 a little over a month ago as an early birthday present to myself. I was looking for something that was affordable, reliable and easy to maintain (as I know close to nothing about cars). This car is all of the above. I love the sporty look of the car, it's roomy enough for me to take my nieces and nephews around comfortably and it has been fairly good on gas. My only issue with the car is that it does not have a MP3 AV input and I live and die by my IPOD, but that was not a major dealbreaker for me. All in all, I think I made a great purchase.
|
# -*- coding: utf-8 -*-
"""
Created on Fri Jul 01 15:52:10 2016
@author: aluo
"""
from __future__ import print_function
import os
import sys
import errno
import timeit
import pickle
import numpy
from matplotlib import pyplot
from generate_patches import recombine_image
import theano
import theano.tensor as T
from theano.tensor.shared_randomstreams import RandomStreams
from logistic_sgd import load_data
from utils import tile_raster_images
try:
import PIL.Image as Image
except ImportError:
import Image
def make_sure_path_exists(path):
try:
os.makedirs(path)
except OSError as exception:
if exception.errno != errno.EEXIST:
raise
class dA(object):
def __init__(
self,
numpy_rng,
theano_rng=None,
input=None,
noiseInput=None,
n_visible=32*32,
n_hidden=800,
W=None,
bhid=None,
bvis=None
):
"""
        Initialize the dA class by specifying the number of visible units (the
        dimension d of the input), the number of hidden units (the dimension
        d' of the latent or hidden space) and the corruption level. The
        constructor also receives symbolic variables for the input, weights and
        bias. Such symbolic variables are useful when, for example, the input
        is the result of some computations, or when weights are shared between
        the dA and an MLP layer. When dealing with SdAs this always happens:
        the dA on layer 2 gets as input the output of the dA on layer 1,
        and the weights of the dA are used in the second stage of training
        to construct an MLP.
:type numpy_rng: numpy.random.RandomState
        :param numpy_rng: numpy random number generator used to generate weights
:type theano_rng: theano.tensor.shared_randomstreams.RandomStreams
:param theano_rng: Theano random generator; if None is given one is
generated based on a seed drawn from `rng`
:type input: theano.tensor.TensorType
:param input: a symbolic description of the input or None for
standalone dA
:type n_visible: int
:param n_visible: number of visible units
:type n_hidden: int
:param n_hidden: number of hidden units
:type W: theano.tensor.TensorType
:param W: Theano variable pointing to a set of weights that should be
                  shared between the dA and another architecture; if dA should
be standalone set this to None
:type bhid: theano.tensor.TensorType
:param bhid: Theano variable pointing to a set of biases values (for
hidden units) that should be shared belong dA and another
architecture; if dA should be standalone set this to None
:type bvis: theano.tensor.TensorType
:param bvis: Theano variable pointing to a set of biases values (for
visible units) that should be shared belong dA and another
architecture; if dA should be standalone set this to None
"""
self.n_visible = n_visible
self.n_hidden = n_hidden
# create a Theano random generator that gives symbolic random values
if not theano_rng:
theano_rng = RandomStreams(numpy_rng.randint(2 ** 30))
# note : W' was written as `W_prime` and b' as `b_prime`
if not W:
            # W is initialized with `initial_W`, which is uniformly sampled
            # from -4*sqrt(6./(n_visible+n_hidden)) and
            # 4*sqrt(6./(n_hidden+n_visible)); the output of uniform is
            # converted using asarray to dtype
            # theano.config.floatX so that the code is runnable on GPU
initial_W = numpy.asarray(
numpy_rng.uniform(
low=-4 * numpy.sqrt(6. / (n_hidden + n_visible)),
high=4 * numpy.sqrt(6. / (n_hidden + n_visible)),
size=(n_visible, n_hidden)
),
dtype=theano.config.floatX
)
W = theano.shared(value=initial_W, name='W', borrow=True)
if not bvis:
bvis = theano.shared(
value=numpy.zeros(
n_visible,
dtype=theano.config.floatX
),
name='b_prime',
borrow=True
)
if not bhid:
bhid = theano.shared(
value=numpy.zeros(
n_hidden,
dtype=theano.config.floatX
),
name='b',
borrow=True
)
self.W = W
# b corresponds to the bias of the hidden
self.b = bhid
# b_prime corresponds to the bias of the visible
self.b_prime = bvis
# tied weights, therefore W_prime is W transpose
self.W_prime = self.W.T
self.theano_rng = theano_rng
# if no input is given, generate a variable representing the input
if input is None:
# we use a matrix because we expect a minibatch of several
# examples, each example being a row
self.x = T.dmatrix(name='input')
else:
self.x = input
if noiseInput is None:
# we use a matrix because we expect a minibatch of several
# examples, each example being a row
self.noise_x = T.dmatrix(name='noiseInput')
else:
self.noise_x = noiseInput
self.params = [self.W, self.b, self.b_prime]
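# A minimal usage sketch (hedged: the variable names below are
# illustrative, not taken from elsewhere in this file). A standalone
# dA with fresh parameters only needs a numpy RNG plus the two
# symbolic inputs:
#
#   rng = numpy.random.RandomState(123)
#   x = T.matrix('x', dtype='float32')
#   noise_x = T.matrix('noise_x', dtype='float32')
#   da = dA(numpy_rng=rng, input=x, noiseInput=noise_x,
#           n_visible=32 * 32, n_hidden=800)
#
# Passing W/bhid/bvis instead reuses previously trained parameters,
# as the __main__ block below does.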
def get_corrupted_input(self, input, corruption_level):
"""This function keeps ``1-corruption_level`` entries of the inputs the
same and zero-out randomly selected subset of size ``coruption_level``
Note : first argument of theano.rng.binomial is the shape(size) of
random numbers that it should produce
second argument is the number of trials
third argument is the probability of success of any trial
this will produce an array of 0s and 1s where 1 has a
probability of 1 - ``corruption_level`` and 0 with
``corruption_level``
The binomial function return int64 data type by
default. int64 multiplicated by the input
type(floatX) always return float64. To keep all data
in floatX when floatX is float32, we set the dtype of
the binomial to floatX. As in our case the value of
the binomial is always 0 or 1, this don't change the
result. This is needed to allow the gpu to work
correctly as it only support float32 for now.
"""
return self.theano_rng.binomial(size=input.shape, n=1,
p=1 - corruption_level,
dtype=theano.config.floatX) * input
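# Worked example (assuming corruption_level=0.3): each entry is kept
# with probability 0.7 and zeroed otherwise, so for a 1024-dimensional
# patch roughly 307 entries are dropped on average. Note that this
# method is currently unused here: get_cost_updates() consumes the
# pre-computed noisy input (self.noise_x) rather than corrupting
# self.x on the fly.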
def get_hidden_values(self, input):
""" Computes the values of the hidden layer """
return T.nnet.sigmoid(T.dot(input, self.W) + self.b)
def get_reconstructed_input(self, hidden):
"""Computes the reconstructed input given the values of the
hidden layer
"""
return T.nnet.sigmoid(T.dot(hidden, self.W_prime) + self.b_prime)
def get_denoised_patch_function(self, patch):
y = self.get_hidden_values(patch)
z = self.get_reconstructed_input(y)
return z
def get_cost_updates(self, learning_rate):
""" This function computes the cost and the updates for one trainng
step of the dA """
# tilde_x = self.get_corrupted_input(self.x, corruption_level)
tilde_x=self.noise_x
y = self.get_hidden_values(tilde_x)
z = self.get_reconstructed_input(y)
# note : we sum over the size of a datapoint; if we are using
# minibatches, L will be a vector, with one entry per
# example in minibatch
L = - T.sum(self.x * T.log(z) + (1 - self.x) * T.log(1 - z), axis=1)
# note : L is now a vector, where each element is the
# cross-entropy cost of the reconstruction of the
# corresponding example of the minibatch. We need to
# compute the average of all these to get the cost of
# the minibatch
cost = T.mean(L)
# cost = L
# square_param = numpy.multiply(self.params[0],self.params[0])
# regularization = learning_rate* 0.5 * T.mean(T.sum(T.sum(square_param,axis = 0),axis=0))
# compute the gradients of the cost of the `dA` with respect
# to its parameters
gparams = T.grad(cost, self.params)
#gparams[0] = gparams[0] + learning_rate * self.params[0] / self.params[0].size
# generate the list of updates
updates = [
(param, param - learning_rate * gparam)
for param, gparam in zip(self.params, gparams)
]
return (cost, updates)
def test_dA(Width = 32, Height = 32, hidden = 800, learning_rate=0.1, training_epochs=15,
dataset = None, noise_dataset=None,
batch_size=20, output_folder='dA_plots'):
"""
This demo is adapted from the Theano denoising autoencoder tutorial
(originally demonstrated on MNIST)
:type learning_rate: float
:param learning_rate: learning rate used for training the Denoising
AutoEncoder
:type training_epochs: int
:param training_epochs: number of epochs used for training
:type dataset: numpy.ndarray
:param dataset: the clean training patches; noise_dataset holds the
corresponding noisy patches
"""
train_set_x = theano.shared(dataset)
# compute number of minibatches for training, validation and testing
n_train_batches = train_set_x.get_value(borrow=True).shape[0] // batch_size
index = T.lscalar() # index to a [mini]batch
x = T.matrix('x', dtype='float32') # the data is presented as rasterized images
noise_x = T.matrix('noise_x', dtype='float32')
#####################################
# BUILDING THE MODEL CORRUPTION 30% #
#####################################
rng = numpy.random.RandomState(1)
theano_rng = RandomStreams(rng.randint(2 ** 30))
noise_train_set_x = theano.shared(noise_dataset)
da = dA(
numpy_rng=rng,
theano_rng=theano_rng,
input=x,
noiseInput=noise_x,
n_visible=Width * Height,
n_hidden=hidden
)
cost, updates = da.get_cost_updates(
learning_rate=learning_rate
)
train_da = theano.function(
[index],
cost,
updates=updates,
givens={
x: train_set_x[index * batch_size: (index + 1) * batch_size],
noise_x: noise_train_set_x[index * batch_size: (index + 1) * batch_size]
}
)
start_time = timeit.default_timer()
############
# TRAINING #
############
# go through training epochs
for epoch in range(training_epochs):
# go through training set
c = []
for batch_index in range(n_train_batches):
c.append(train_da(batch_index))
if epoch % 100 == 0:
print('Training epoch %d, cost ' % epoch, numpy.mean(c))
end_time = timeit.default_timer()
training_time = (end_time - start_time)
print(('The training code for file ' +
os.path.split(__file__)[1] +
' ran for %.2fm' % (training_time / 60.)), file=sys.stderr)
W_corruption = da.W
bhid_corruption = da.b
bvis_corruption = da.b_prime
results = (W_corruption,
bhid_corruption, bvis_corruption)
return results
def unpickle(file):
fo = open(file, 'rb')
d = pickle.load(fo)
fo.close()
return d
def showRGBImage(array_data, W, H):
array_data = array_data.reshape(3,W, H).transpose()
array_data = numpy.swapaxes(array_data,0,1)
pyplot.axis('off')
array_data = pyplot.imshow(array_data)
def showGrayImage(data, W, H):
data = data.reshape(W,H)
pyplot.axis('off')
pyplot.imshow(data,cmap='Greys_r')
def showEncodeImage(data, autoEncoder, W, H):
X = data
tilde_X = X
Y = autoEncoder.get_hidden_values(tilde_X)
Z = autoEncoder.get_reconstructed_input(Y)
Y = Y.eval()
Z = Z.eval()
# tilde_X = tilde_X.eval()
showGrayImage(tilde_X, W, H)
pyplot.figure()
showGrayImage(Z, W, H)
pyplot.figure()
pyplot.show()
def saveTrainedData(path,noise_W, noise_b, noise_b_p,hidden, Width, Height ):
d = {}
d["noise_W"] = {"data" : noise_W}
d["noise_b"] = {"data" : noise_b}
d["noise_b_p"] = {"data" : noise_b_p}
d["hidden"] = {"data" : hidden}
d["Width"] = {"data" : Width}
d["Height"] = {"data" : Height}
ff = open(path, "wb")
pickle.dump(d, ff, protocol=pickle.HIGHEST_PROTOCOL)
ff.close()
def loadTrainedData(path):
d = unpickle(path)
noise_W = d["noise_W"]["data"]
noise_b = d["noise_b"]["data"]
noise_b_p = d["noise_b_p"]["data"]
hidden = d["hidden"]["data"]
Width = d["Width"]["data"]
Height = d["Height"]["data"]
results =(noise_W,noise_b,noise_b_p,hidden,Width,Height)
return results
def filterImages(noise_datasets, autoEncoder):
d = noise_datasets.copy()
rgb = ('r', 'g', 'b')
x = T.matrix('x', dtype='float32')
evaluate = theano.function(
[x],
autoEncoder.get_denoised_patch_function(x)
)
for c in rgb:
imgs = numpy.array(d[c]['data'], dtype='float32')
X = imgs
Z = evaluate(X)
d[c]['data'] = Z
return d
def saveImage(image_dict, image_file_name, results_folder="./result_images"):
make_sure_path_exists(results_folder)
recombine_image(image_dict, results_folder + os.sep +image_file_name + '.png')
def loadDataset(name, source_folder = "./image_patch_data"):
make_sure_path_exists(source_folder)
dataset_path = source_folder + os.sep + name + '.dat'
datasets = unpickle(dataset_path)
patches = numpy.concatenate((datasets['r']['data'], datasets['g']['data'], datasets['b']['data']),axis=0)
patches_f = numpy.array(patches, dtype='float32')
return patches_f, datasets
def loadDatasets(reference_name_list, noisy_dataset_name_list,source_folder = "./image_patch_data"):
assert len(reference_name_list) == len(noisy_dataset_name_list)
make_sure_path_exists(source_folder)
clean_datasets = []
noisy_datasets = []
noisy_patches_f = numpy.zeros(1)
clean_patches_f = numpy.zeros(1)
for i in range(len(reference_name_list)):
ref_name = reference_name_list[i]
noise_name = noisy_dataset_name_list[i]
clean_patches_f_i, clean_dataset_i = loadDataset(ref_name, source_folder)
noisy_patches_f_i, noisy_dataset_i = loadDataset(noise_name, source_folder)
if i == 0:
clean_patches_f = clean_patches_f_i
noisy_patches_f = noisy_patches_f_i
else:
clean_patches_f = numpy.concatenate((clean_patches_f, clean_patches_f_i), axis = 0)
noisy_patches_f = numpy.concatenate((noisy_patches_f, noisy_patches_f_i), axis = 0)
clean_datasets.append(clean_dataset_i)
noisy_datasets.append(noisy_dataset_i)
patch_size = noisy_datasets[0]['patch_size']
return clean_patches_f, noisy_patches_f, clean_datasets, noisy_datasets, patch_size
if __name__ == '__main__':
dataset_base = "sponzat_0"
dataset_name = dataset_base + "_10000"
result_folder = "./result_images"
noise_dataset_samples = 5
noise_dataset_name = dataset_base +'_'+ str(noise_dataset_samples)
clean_patches_f, noisy_patches_f, clean_datasets, noisy_datasets, patch_size = loadDatasets([dataset_name], [noise_dataset_name])
Width = patch_size[0]
Height = patch_size[1]
#PARAMETERS TO PLAY WITH
hidden_fraction = 0.5
hidden = int(hidden_fraction*Width * Height)
training_epochs = 100
learning_rate =0.01
batch_size = clean_patches_f.shape[0]
parameters_string = '_dA_epochs' + str(training_epochs) +'_hidden' + str(hidden) + '_lrate' + str(learning_rate) +'_W' +str(Width)
path = 'training/trained_variables_' + noise_dataset_name + parameters_string + '.dat'
isTrained = os.path.isfile(path)
if not isTrained:
noise_W, noise_b, noise_b_p = test_dA(dataset=clean_patches_f,learning_rate=learning_rate,
training_epochs=training_epochs,hidden=hidden,
Width = Width, Height = Height,
batch_size = batch_size,
noise_dataset=noisy_patches_f)
saveTrainedData(path, noise_W, noise_b, noise_b_p,hidden, Width, Height )
else:
noise_W, noise_b, noise_b_p,hidden, Width, Height = loadTrainedData(path)
rng = numpy.random.RandomState(123)
theano_rng = RandomStreams(rng.randint(2 ** 30))
rng = numpy.random.RandomState(123)
theano_rng = RandomStreams(rng.randint(2 ** 30))
noiseDA = dA(
numpy_rng=rng,
theano_rng=theano_rng,
input=clean_patches_f,
noiseInput=noisy_patches_f,
n_visible=Width * Height,
n_hidden=hidden,
W=noise_W,
bhid=noise_b,
bvis=noise_b_p
)
denoised_datasets = filterImages(noisy_datasets,noiseDA)
saveImage(denoised_datasets, noise_dataset_name + parameters_string,
result_folder)
|
Goal: Bring the spirit of Christmas into every day!
Challenge: Select a nonprofit organization within the community, raise money to meet their needs and volunteer our time.
Our Choice: Chester County SPCA
How are we tackling the challenge?
We are making a difference by raising funds to purchase items on their “Wish List.” Founded in 1929, the CCSPCA is dedicated to ending animal suffering as well as involving the entire community in the welfare and well-being of animals. It is an open access, no-kill shelter covering both Chester and Delaware Counties.
Your turn: We challenge you to get out in the community and help a cause that means a lot to you!
If you wish to donate supplies to the CCSPCA on behalf of the ESMS Challenge, please click the link below to order much-needed items. After selecting the items you graciously wish to donate, indicate that you are supporting the ESMS Challenge within the “Choose gift options” section of checkout and your message will appear on the packing slip!
If you wish to donate funds directly to the ESMS Challenge, please email Liz at [email protected] or Ashley at [email protected].
To learn more about the CCSPCA, visit their website by clicking the link below! |
# coding=utf-8
#
# Copyright 2016 F5 Networks Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
"""BIG-IP® system failover module
REST URI
``http://localhost/mgmt/tm/shared/license``
GUI Path
``System --> License``
REST Kind
``tm:shared:licensing:*``
"""
from f5.bigip.resource import PathElement
from f5.bigip.resource import UnnamedResource
from f5.sdk_exception import UnsupportedMethod
class Licensing(PathElement):
"""BIG-IP® licensing stats and states.
Licensing objects themselves do not support any methods and are just
containers for lower level objects.
.. note::
This is an unnamed resource, so it does not have the ~Partition~Name
pattern at the end of its URI.
"""
def __init__(self, shared):
super(Licensing, self).__init__(shared)
self._meta_data['allowed_lazy_attributes'] = [
Activation,
Registration,
]
self._meta_data['attribute_registry'] = {
'tm:shared:licensing:activation:activatelicenseresponse':
Activation,
'tm:shared:licensing:registration:registrationlicenseresponse':
Registration,
}
class Activation(UnnamedResource):
"""BIG-IP® license activation status
Activation state objects only support the
:meth:`~f5.bigip.resource.Resource.load` method because they cannot be
modified via the API.
.. note::
This is an unnamed resource, so it does not have the ~Partition~Name
pattern at the end of its URI.
"""
def __init__(self, licensing):
super(Activation, self).__init__(licensing)
self._meta_data['required_load_parameters'] = set()
self._meta_data['required_json_kind'] =\
'tm:shared:licensing:activation:activatelicenseresponse'
def update(self, **kwargs):
'''Update is not supported for License Activation
:raises: UnsupportedMethod
'''
raise UnsupportedMethod(
"%s does not support the update method" % self.__class__.__name__
)
class Registration(UnnamedResource):
"""BIG-IP® license registration status
Registration state objects only support the
:meth:`~f5.bigip.resource.Resource.load` method because they cannot be
modified via the API.
.. note::
This is an unnamed resource, so it does not have the ~Partition~Name
pattern at the end of its URI.
"""
def __init__(self, licensing):
super(Registration, self).__init__(licensing)
self._meta_data['required_load_parameters'] = set()
self._meta_data['required_json_kind'] =\
'tm:shared:licensing:registration:registrationlicenseresponse'
def update(self, **kwargs):
'''Update is not supported for License Registration
:raises: UnsupportedMethod
'''
raise UnsupportedMethod(
"%s does not support the update method" % self.__class__.__name__
)
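# A hedged usage sketch (names follow the f5-common-python SDK; verify
# against your installed version before relying on them):
#
#   from f5.bigip import ManagementRoot
#
#   mgmt = ManagementRoot('bigip.example.com', 'admin', 'secret')
#   activation = mgmt.tm.shared.licensing.activation.load()
#   registration = mgmt.tm.shared.licensing.registration.load()
#   print(registration.raw)
#
# Both objects are read-only: calling update() on either raises
# UnsupportedMethod, as defined above.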
|
""" Page Object Model - Gym Management Page """
from selenium.webdriver.common.by import By
from .selectors.gym_management import TRACKED_GYMS_TABLE_ROW, REMOVE_BUTTON, \
TABLE_COLUMNS
from .selectors.listing import SEARCH_BAR
from .common import BasePage
from .listing import ListingPage
class GymManagementPage(BasePage):
"""
Page Object Model for Gym Management Page
"""
def find_tracked_gym(self, gym_name):
"""
Locate the gym in the tracked gyms list
:return: WebElement of row with Gym in it or None
"""
gyms = self.driver.find_elements(*TRACKED_GYMS_TABLE_ROW)
for gym in gyms:
if gym_name in gym.text:
return gym
return None
def remove_tracked_gym(self, gym_row):
"""
Press the Remove button in the supplied table row containing the Gym
to stop tracking it
:param gym_row: WebElement of the table row that the Gym is in
"""
columns = gym_row.find_elements(*TABLE_COLUMNS)
remove_button = columns[1].find_element(*REMOVE_BUTTON)
button_selector = (
By.CSS_SELECTOR,
'a[href="{}"]'.format(remove_button.get_attribute('href'))
)
self.click_and_verify_change(
remove_button,
button_selector,
hidden=True
)
def enter_search_term(self, term):
"""
Enter a search term into the search bar
:param term: Term to enter
"""
page = ListingPage(self.driver)
page.enter_search_term(term)
def press_suggested_gym(self, gym):
"""
Press the suggested gym in the dropdown
:param gym: Gym to find
"""
page = ListingPage(self.driver)
suggestions = page.get_search_suggestions()
option = None
for suggestion in suggestions:
if gym in suggestion.text:
option = suggestion
assert option
page.click_and_verify_change(option, SEARCH_BAR)
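# A hedged usage sketch (assumes BasePage takes the driver in its
# constructor; the URL and gym name are hypothetical):
#
#   from selenium import webdriver
#
#   driver = webdriver.Firefox()
#   driver.get('http://localhost:8000/gyms/manage')  # hypothetical URL
#   page = GymManagementPage(driver)
#   row = page.find_tracked_gym('PureGym Leeds')     # hypothetical gym
#   if row is not None:
#       page.remove_tracked_gym(row)
#   driver.quit()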
|
Florida State’s Jessie Warren, who hit .520 in the World Series, was named most outstanding player. She went 3 for 4 with a homer on Tuesday.
Florida State got more in the second when Elizabeth Mason singled and two runs scored with help from an error in right field. Warren’s RBI single later in the inning pushed the Seminoles’ lead to 5-3. |
from .http import httpstatus
class WebAPIException(Exception):
"""Base exception for the REST infrastructure
These are exceptions that can be raised by the handlers.
"""
#: HTTP code generally associated to this exception.
#: Missing any better info, default is a server error.
http_code = httpstatus.INTERNAL_SERVER_ERROR
def __init__(self, message=None, **kwargs):
"""Initializes the exception. keyword arguments will become
part of the representation as key/value pairs."""
super().__init__(message)
self.message = message
self.info = kwargs if len(kwargs) else None
class NotFound(WebAPIException):
"""Exception raised when the resource is not found.
Raise this exception in your handlers when you can't
find the resource the identifier refers to.
"""
http_code = httpstatus.NOT_FOUND
class Exists(WebAPIException):
"""Represents a case where the resource could not be created
because it already exists. This is generally raised in the
create() method if the resource has uniqueness constraints on
things other than the exposed id."""
http_code = httpstatus.CONFLICT
class BadRepresentation(WebAPIException):
"""Exception raised when the resource representation is
invalid or does not contain the appropriate keys.
Raise this exception in your handlers when the received
representation is ill-formed
"""
http_code = httpstatus.BAD_REQUEST
class BadQueryArguments(WebAPIException):
"""Exception raised when the query arguments do not conform to the
expected format.
"""
http_code = httpstatus.BAD_REQUEST
class BadRequest(WebAPIException):
"""Deprecated. Kept for compatibility. Use BadRepresentation."""
http_code = httpstatus.BAD_REQUEST
class Unable(WebAPIException):
"""Exception raised when the request cannot be performed
for whatever reason that is not dependent on the client.
"""
http_code = httpstatus.INTERNAL_SERVER_ERROR
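# A hedged usage sketch of raising these from a handler (the Resource
# base class and its retrieve() hook are illustrative, not defined in
# this module):
#
#   class StudentResource(Resource):
#       def retrieve(self, identifier):
#           student = self.db.get(identifier)  # hypothetical lookup
#           if student is None:
#               raise NotFound()
#           return student
#
# The surrounding framework is then expected to map the exception's
# http_code (here httpstatus.NOT_FOUND) onto the HTTP response.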
|
Say no to apartheid, Morgan Freeman and Jian Ghomeshi!
Video created by Canadian filmmaker John Greyson.
Please write an email to Jian Ghomeshi. Sign the petition on Change.org to Morgan Freeman. It only takes a minute, but it takes large numbers to make a change.
Leave a comment Posted in Art in support of Palestine, Solidarity with Palestine Tagged Jian Ghomeshi, morgan freeman, Say No to Apartheid.
We bought a mini-Ark!
Fundraising for the Gaza Music School – please be generous!
Music for Gaza! No more sound of bullets, bombs or drones please!
CBC coverage – hardly reaches the level of the absurd.
Media reporting on Gaza: Nous accusons.
Rebuilt Gaza Music School hopes to reach more children. |
# -*- coding: utf-8 -*-
""" Custom UI Widgets
@requires: U{B{I{gluon}} <http://web2py.com>}
@copyright: 2009-2012 (c) Sahana Software Foundation
@license: MIT
Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation
files (the "Software"), to deal in the Software without
restriction, including without limitation the rights to use,
copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following
conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE.
"""
__all__ = ["S3HiddenWidget",
"S3DateWidget",
"S3DateTimeWidget",
"S3BooleanWidget",
#"S3UploadWidget",
"S3AutocompleteWidget",
"S3LocationAutocompleteWidget",
"S3LatLonWidget",
"S3OrganisationAutocompleteWidget",
"S3OrganisationHierarchyWidget",
"S3PersonAutocompleteWidget",
"S3HumanResourceAutocompleteWidget",
"S3SiteAutocompleteWidget",
"S3LocationSelectorWidget",
"S3LocationDropdownWidget",
#"S3CheckboxesWidget",
"S3MultiSelectWidget",
"S3ACLWidget",
"CheckboxesWidgetS3",
"S3AddPersonWidget",
"S3AutocompleteOrAddWidget",
"S3AddObjectWidget",
"S3SearchAutocompleteWidget",
"S3TimeIntervalWidget",
"S3EmbedComponentWidget",
"S3KeyValueWidget",
"S3SliderWidget",
"S3InvBinWidget",
"s3_comments_widget",
"s3_richtext_widget",
"s3_checkboxes_widget",
"s3_grouped_checkboxes_widget"
]
import datetime
try:
from lxml import etree
except ImportError:
import sys
print >> sys.stderr, "ERROR: lxml module needed for XML handling"
raise
try:
import json # try stdlib (Python 2.6)
except ImportError:
try:
import simplejson as json # try external module
except:
import gluon.contrib.simplejson as json # fallback to pure-Python module
from gluon import *
# Here are dependencies listed for reference:
#from gluon import current
#from gluon.dal import Field
#from gluon.html import *
#from gluon.http import HTTP
#from gluon.validators import *
from gluon.sqlhtml import *
from gluon.storage import Storage
from s3utils import *
from s3validators import *
repr_select = lambda l: len(l.name) > 48 and "%s..." % l.name[:44] or l.name
# =============================================================================
class S3HiddenWidget(StringWidget):
"""
Standard String widget, but with a class of hide
- currently unused
"""
def __call__(self, field, value, **attributes):
default = dict(
_type = "text",
value = (value != None and str(value)) or "",
)
attr = StringWidget._attributes(field, default, **attributes)
attr["_class"] = "hide %s" % attr["_class"]
return TAG[""](
INPUT(**attr),
requires = field.requires
)
# =============================================================================
class S3DateWidget(FormWidget):
"""
Standard Date widget, but with a modified yearRange to support Birth dates
"""
def __init__(self,
format = None,
past=1440, # how many months into the past the date can be set to
future=1440 # how many months into the future the date can be set to
):
self.format = format
self.past = past
self.future = future
def __call__(self, field, value, **attributes):
# Need to convert value into ISO-format
# (widget expects ISO, but value comes in custom format)
_format = current.deployment_settings.get_L10n_date_format()
v, error = IS_DATE_IN_RANGE(format=_format)(value)
if not error:
value = v.isoformat()
if self.format:
# default: "yy-mm-dd"
format = str(self.format)
else:
format = _format.replace("%Y", "yy").replace("%y", "y").replace("%m", "mm").replace("%d", "dd").replace("%b", "M")
default = dict(
_type = "text",
value = (value != None and str(value)) or "",
)
attr = StringWidget._attributes(field, default, **attributes)
attr["_class"] = "date"
selector = str(field).replace(".", "_")
current.response.s3.jquery_ready.append(
'''$('#%(selector)s').datepicker('option',{
minDate:'-%(past)sm',
maxDate:'+%(future)sm',
yearRange:'c-100:c+100',
dateFormat:'%(format)s'})''' % \
dict(selector = selector,
past = self.past,
future = self.future,
format = format))
return TAG[""](
INPUT(**attr),
requires = field.requires
)
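# A hedged usage sketch (a web2py Field definition inside an Eden model;
# the field name is illustrative). past/future are given in months, so
# this accepts dates up to 120 years back but none in the future:
#
#   Field("date_of_birth", "date",
#         widget = S3DateWidget(past=1440, future=0))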
# =============================================================================
class S3DateTimeWidget(FormWidget):
"""
Standard DateTime widget, based on the widget above, but instead of using
jQuery datepicker we use Anytime.
"""
def __init__(self,
format = None,
past=876000, # how many hours into the past the date can be set to
future=876000 # how many hours into the future the date can be set to
):
self.format = format
self.past = past
self.future = future
def __call__(self, field, value, **attributes):
if self.format:
# default: "%Y-%m-%d %T"
format = str(self.format)
else:
format = str(current.deployment_settings.get_L10n_datetime_format())
request = current.request
s3 = current.response.s3
if isinstance(value, datetime.datetime):
value = value.strftime(format)
elif value is None:
value = ""
default = dict(_type = "text",
# Prevent default "datetime" calendar from showing up:
_class = "anytime",
value = value,
old_value = value)
attr = StringWidget._attributes(field, default, **attributes)
selector = str(field).replace(".", "_")
now = request.utcnow
offset = IS_UTC_OFFSET.get_offset_value(current.session.s3.utc_offset)
if offset:
now = now + datetime.timedelta(seconds=offset)
timedelta = datetime.timedelta
earliest = now - timedelta(hours = self.past)
latest = now + timedelta(hours = self.future)
earliest = earliest.strftime(format)
latest = latest.strftime(format)
script_dir = "/%s/static/scripts" % request.application
if s3.debug and \
"%s/anytime.js" % script_dir not in s3.scripts:
s3.scripts.append("%s/anytime.js" % script_dir)
s3.stylesheets.append("plugins/anytime.css")
elif "%s/anytimec.js" % script_dir not in s3.scripts:
s3.scripts.append("%s/anytimec.js" % script_dir)
s3.stylesheets.append("plugins/anytimec.css")
s3.jquery_ready.append(
'''$('#%(selector)s').AnyTime_picker({
askSecond:false,
firstDOW:1,
earliest:'%(earliest)s',
latest:'%(latest)s',
format:'%(format)s'
})
clear_button=$('<input type="button" value="clear"/>').click(function(e){
$('#%(selector)s').val('')
})
$('#%(selector)s').after(clear_button)''' % \
dict(selector=selector,
earliest=earliest,
latest=latest,
format=format.replace("%M", "%i")))
return TAG[""](
INPUT(**attr),
requires = field.requires
)
# =============================================================================
class S3BooleanWidget(BooleanWidget):
"""
Standard Boolean widget, with an option to hide/reveal fields conditionally.
"""
def __init__(self,
fields = [],
click_to_show = True
):
self.fields = fields
self.click_to_show = click_to_show
def __call__(self, field, value, **attributes):
response = current.response
fields = self.fields
click_to_show = self.click_to_show
default = dict(
_type="checkbox",
value=value,
)
attr = BooleanWidget._attributes(field, default, **attributes)
tablename = field.tablename
hide = ""
show = ""
for _field in fields:
fieldname = "%s_%s" % (tablename, _field)
hide += '''
$('#%s__row1').hide()
$('#%s__row').hide()
''' % (fieldname, fieldname)
show += '''
$('#%s__row1').show()
$('#%s__row').show()
''' % (fieldname, fieldname)
if fields:
checkbox = "%s_%s" % (tablename, field.name)
click_start = '''
$('#%s').click(function(){
if(this.checked){
''' % checkbox
middle = "} else {\n"
click_end = "}})"
if click_to_show:
# Hide by default
script = "%s\n%s\n%s\n%s\n%s\n%s" % (hide, click_start, show, middle, hide, click_end)
else:
# Show by default
script = "%s\n%s\n%s\n%s\n%s\n%s" % (show, click_start, hide, middle, show, click_end)
response.s3.jquery_ready.append(script)
return TAG[""](
INPUT(**attr),
requires = field.requires
)
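# A hedged usage sketch (field names are illustrative). With the
# default click_to_show=True the dependent rows start hidden and are
# revealed when the checkbox is ticked:
#
#   Field("has_shelter", "boolean",
#         widget = S3BooleanWidget(fields=["shelter_type",
#                                          "shelter_capacity"]))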
# =============================================================================
class S3UploadWidget(UploadWidget):
"""
Subclassed to not show the delete checkbox when field is mandatory
- This has now been included as standard within Web2Py from r2867
- Leaving this unused example in the codebase so that we can easily
amend this if we wish to later
"""
@staticmethod
def widget(field, value, download_url=None, **attributes):
"""
Generates an INPUT file tag.
Optionally provides an A link to the file, including a checkbox so
the file can be deleted.
All is wrapped in a DIV.
@see: :meth:`FormWidget.widget`
@param download_url: Optional URL to link to the file (default = None)
"""
default=dict(
_type="file",
)
attr = UploadWidget._attributes(field, default, **attributes)
inp = INPUT(**attr)
if download_url and value:
url = "%s/%s" % (download_url, value)
(br, image) = ("", "")
if UploadWidget.is_image(value):
br = BR()
image = IMG(_src = url, _width = UploadWidget.DEFAULT_WIDTH)
requires = attr["requires"]
if requires == [] or isinstance(requires, IS_EMPTY_OR):
inp = DIV(inp, "[",
A(UploadWidget.GENERIC_DESCRIPTION, _href = url),
"|",
INPUT(_type="checkbox",
_name=field.name + UploadWidget.ID_DELETE_SUFFIX),
UploadWidget.DELETE_FILE,
"]", br, image)
else:
inp = DIV(inp, "[",
A(UploadWidget.GENERIC_DESCRIPTION, _href = url),
"]", br, image)
return inp
# =============================================================================
class S3AutocompleteWidget(FormWidget):
"""
Renders a SELECT as an INPUT field with AJAX Autocomplete
"""
def __init__(self,
module,
resourcename,
fieldname = "name",
filter = "", # REST filter
link_filter = "",
#new_items = False, # Whether to make this a combo box
post_process = "",
delay = 450, # milliseconds
min_length = 2): # Increase this for large deployments
self.module = module
self.resourcename = resourcename
self.fieldname = fieldname
self.filter = filter
self.link_filter = link_filter
#self.new_items = new_items
self.post_process = post_process
self.delay = delay
self.min_length = min_length
# @ToDo: Refreshes all dropdowns as-necessary
self.post_process = post_process or ""
def __call__(self, field, value, **attributes):
default = dict(
_type = "text",
value = (value != None and str(value)) or "",
)
attr = StringWidget._attributes(field, default, **attributes)
# Hide the real field
attr["_class"] = attr["_class"] + " hide"
real_input = str(field).replace(".", "_")
dummy_input = "dummy_%s" % real_input
# Script defined in static/scripts/S3/S3.js
js_autocomplete = '''S3.autocomplete('%s','%s','%s','%s','%s','%s',\"%s\",%s,%s)''' % \
(self.fieldname, self.module, self.resourcename, real_input, self.filter,
self.link_filter, self.post_process, self.delay, self.min_length)
if value:
text = str(field.represent(default["value"]))
if "<" in text:
# Strip Markup
try:
markup = etree.XML(text)
text = markup.xpath(".//text()")
if text:
text = " ".join(text)
else:
text = ""
except etree.XMLSyntaxError:
pass
represent = text
else:
represent = ""
current.response.s3.jquery_ready.append(js_autocomplete)
return TAG[""](
INPUT(_id=dummy_input,
_class="string",
_value=represent),
IMG(_src="/%s/static/img/ajax-loader.gif" % \
current.request.application,
_height=32, _width=32,
_id="%s_throbber" % dummy_input,
_class="throbber hide"),
INPUT(**attr),
requires = field.requires
)
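# A hedged usage sketch (assumes an org_office table exists; names are
# illustrative). This replaces a dropdown on a foreign key with an
# AJAX autocomplete against /org/office/search.json:
#
#   Field("office_id", "reference org_office",
#         widget = S3AutocompleteWidget("org", "office", min_length=3))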
# =============================================================================
class S3LocationAutocompleteWidget(FormWidget):
"""
Renders a gis_location SELECT as an INPUT field with AJAX Autocomplete
@note: differs from the S3AutocompleteWidget:
- needs to have deployment_settings passed-in
- excludes unreliable imported records (Level 'XX')
Appropriate when the location has been previously created (as is the
case for location groups or other specialized locations that need
the location create form).
S3LocationSelectorWidget may be more appropriate for specific locations.
Currently used for selecting the region location in gis_config
and for project/location.
@todo: .represent for the returned data
@todo: Refreshes any dropdowns as-necessary (post_process)
"""
def __init__(self,
prefix="gis",
resourcename="location",
fieldname="name",
level="",
hidden = False,
post_process = "",
delay = 450, # milliseconds
min_length = 2): # Increase this for large deployments
self.prefix = prefix
self.resourcename = resourcename
self.fieldname = fieldname
self.level = level
self.hidden = hidden
self.post_process = post_process
self.delay = delay
self.min_length = min_length
def __call__(self, field, value, **attributes):
fieldname = self.fieldname
level = self.level
if level:
if isinstance(level, list):
levels = ""
counter = 0
for _level in level:
levels += _level
if counter < len(level):
levels += "|"
counter += 1
url = URL(c=self.prefix,
f=self.resourcename,
args="search.json",
vars={"filter":"~",
"field":fieldname,
"level":levels,
"simple":1,
})
else:
url = URL(c=self.prefix,
f=self.resourcename,
args="search.json",
vars={"filter":"~",
"field":fieldname,
"level":level,
"simple":1,
})
else:
url = URL(c=self.prefix,
f=self.resourcename,
args="search.json",
vars={"filter":"~",
"field":fieldname,
"simple":1,
})
# Which Levels do we have in our hierarchy & what are their Labels?
#location_hierarchy = current.deployment_settings.gis.location_hierarchy
#try:
# # Ignore the bad bulk-imported data
# del location_hierarchy["XX"]
#except:
# pass
# @ToDo: Something nicer (i.e. server-side formatting within S3LocationSearch)
name_getter = \
'''function(item){
if(item.level=="L0"){return item.name+" (%(country)s)"
}else if(item.level=="L1"){return item.name+" ("+item.L0+")"
}else if(item.level=="L2"){return item.name+" ("+item.L1+","+item.L0+")"
}else if(item.level=="L3"){return item.name+" ("+item.L2+","+item.L1+","+item.L0+")"
}else if(item.level=="L4"){return item.name+" ("+item.L3+","+item.L2+","+item.L1+","+item.L0+")"
}else{return item.name}}''' % dict(country = current.messages.COUNTRY)
return S3GenericAutocompleteTemplate(
self.post_process,
self.delay,
self.min_length,
field,
value,
attributes,
source = repr(url),
name_getter = name_getter,
)
# =============================================================================
class S3OrganisationAutocompleteWidget(FormWidget):
"""
Renders an org_organisation SELECT as an INPUT field with AJAX Autocomplete.
Differs from the S3AutocompleteWidget in that it can default to the setting in the profile.
@ToDo: Add an option to hide the widget completely when using the Org from the Profile
- i.e. prevent user overrides
"""
def __init__(self,
post_process = "",
default_from_profile = False,
new_items = False, # Whether to make this a combo box
delay = 450, # milliseconds
min_length = 2): # Increase this for large deployments
self.post_process = post_process
self.delay = delay
self.min_length = min_length
self.new_items = new_items
self.default_from_profile = default_from_profile
def __call__(self, field, value, **attributes):
def transform_value(value):
if not value and self.default_from_profile:
auth = current.session.auth
if auth and auth.user:
value = auth.user.organisation_id
return value
return S3GenericAutocompleteTemplate(
self.post_process,
self.delay,
self.min_length,
field,
value,
attributes,
transform_value = transform_value,
new_items = self.new_items,
tablename = "org_organisation",
source = repr(
URL(c="org", f="org_search",
args="search.json",
vars={"filter":"~"})
)
)
# =============================================================================
class S3OrganisationHierarchyWidget(OptionsWidget):
""" Renders an organisation_id SELECT as a menu """
_class = "widget-org-hierarchy"
def __init__(self, primary_options=None):
"""
[{"id":12, "pe_id":4, "name":"Organisation Name"}]
"""
self.primary_options = primary_options
def __call__(self, field, value, **attributes):
options = self.primary_options
name = attributes.get("_name", field.name)
if options is None:
requires = field.requires
if isinstance(requires, (list, tuple)) and \
len(requires):
requires = requires[0]
if requires is not None:
if isinstance(requires, IS_EMPTY_OR):
requires = requires.other
if hasattr(requires, "options"):
table = current.s3db.org_organisation
options = requires.options()
ids = [option[0] for option in options if option[0]]
rows = current.db(table.id.belongs(ids)).select(table.id,
table.pe_id,
table.name,
orderby=table.name)
options = []
for row in rows:
options.append(row.as_dict())
else:
raise SyntaxError, "widget cannot determine options of %s" % field
javascript_array = '''%s_options=%s''' % (name,
json.dumps(options))
s3 = current.response.s3
s3.js_global.append(javascript_array)
s3.scripts.append("/%s/static/scripts/S3/s3.orghierarchy.js" % \
current.request.application)
s3.stylesheets.append("jquery-ui/jquery.ui.menu.css")
return self.widget(field, value, **attributes)
# =============================================================================
class S3PersonAutocompleteWidget(FormWidget):
"""
Renders a pr_person SELECT as an INPUT field with AJAX Autocomplete.
Differs from the S3AutocompleteWidget in that it uses 3 name fields
To make this widget use the HR table, set the controller to "hrm"
@ToDo: Migrate to template (initial attempt failed)
"""
def __init__(self,
controller = "pr",
function = "person_search",
post_process = "",
delay = 450, # milliseconds
min_length=2): # Increase this for large deployments
self.post_process = post_process
self.delay = delay
self.min_length = min_length
self.c = controller
self.f = function
def __call__(self, field, value, **attributes):
default = dict(
_type = "text",
value = (value != None and str(value)) or "",
)
attr = StringWidget._attributes(field, default, **attributes)
# Hide the real field
attr["_class"] = "%s hide" % attr["_class"]
real_input = str(field).replace(".", "_")
dummy_input = "dummy_%s" % real_input
url = URL(c=self.c,
f=self.f,
args="search.json")
js_autocomplete = "".join((
'''var %(real_input)s={val:$('#%(dummy_input)s').val(),accept:false}
$('#%(dummy_input)s').autocomplete({
source:'%(url)s',
delay:%(delay)d,
minLength:%(min_length)d,
search:function(event,ui){
$('#%(dummy_input)s_throbber').removeClass('hide').show()
return true
},
response:function(event,ui,content){
$('#%(dummy_input)s_throbber').hide()
return content
},
focus:function(event,ui){
var name=ui.item.first
if(ui.item.middle){
name+=' '+ui.item.middle
}
if(ui.item.last){
name+=' '+ui.item.last
}
$('#%(dummy_input)s').val(name)
return false
},
select:function(event,ui){
var name=ui.item.first
if(ui.item.middle){
name+=' '+ui.item.middle
}
if(ui.item.last){
name+=' '+ui.item.last
}
$('#%(dummy_input)s').val(name)
$('#%(real_input)s').val(ui.item.id).change()
''' % dict(dummy_input = dummy_input,
url = url,
delay = self.delay,
min_length = self.min_length,
real_input = real_input),
self.post_process, '''
%(real_input)s.accept=true
return false
}
}).data('autocomplete')._renderItem=function(ul,item){
var name=item.first
if(item.middle){
name+=' '+item.middle
}
if(item.last){
name+=' '+item.last
}
return $('<li></li>').data('item.autocomplete',item).append('<a>'+name+'</a>').appendTo(ul)
}
$('#%(dummy_input)s').blur(function(){
if(!$('#%(dummy_input)s').val()){
$('#%(real_input)s').val('').change()
%(real_input)s.accept=true
}
if(!%(real_input)s.accept){
$('#%(dummy_input)s').val(%(real_input)s.val)
}else{
%(real_input)s.val=$('#%(dummy_input)s').val()
}
%(real_input)s.accept=false
})''' % dict(dummy_input = dummy_input,
real_input = real_input)))
if value:
# Provide the representation for the current/default Value
text = str(field.represent(default["value"]))
if "<" in text:
# Strip Markup
try:
markup = etree.XML(text)
text = markup.xpath(".//text()")
if text:
text = " ".join(text)
else:
text = ""
except etree.XMLSyntaxError:
pass
represent = text
else:
represent = ""
current.response.s3.jquery_ready.append(js_autocomplete)
return TAG[""](
INPUT(_id=dummy_input,
_class="string",
_value=represent),
IMG(_src="/%s/static/img/ajax-loader.gif" % \
current.request.application,
_height=32, _width=32,
_id="%s_throbber" % dummy_input,
_class="throbber hide"),
INPUT(**attr),
requires = field.requires
)
# =============================================================================
class S3HumanResourceAutocompleteWidget(FormWidget):
"""
Renders an hrm_human_resource SELECT as an INPUT field with
AJAX Autocomplete.
Differs from the S3AutocompleteWidget in that it uses:
3 name fields
Organisation
Job Role
"""
def __init__(self,
post_process = "",
delay = 450, # milliseconds
min_length=2, # Increase this for large deployments
group=None, # Filter to staff/volunteers
):
self.post_process = post_process
self.delay = delay
self.min_length = min_length
self.group = group
def __call__(self, field, value, **attributes):
request = current.request
response = current.response
default = dict(
_type = "text",
value = (value != None and str(value)) or "",
)
attr = StringWidget._attributes(field, default, **attributes)
# Hide the real field
attr["_class"] = "%s hide" % attr["_class"]
real_input = str(field).replace(".", "_")
dummy_input = "dummy_%s" % real_input
group = self.group
if group == "staff":
# Search Staff using S3HRSearch
url = URL(c="hrm",
f="person_search",
args="search.json",
vars={"group":"staff"})
elif group == "volunteer":
# Search Volunteers using S3HRSearch
url = URL(c="vol",
f="person_search",
args="search.json")
else:
# Search all HRs using S3HRSearch
url = URL(c="hrm",
f="person_search",
args="search.json")
js_autocomplete = "".join((
'''var %(real_input)s={val:$('#%(dummy_input)s').val(),accept:false}
$('#%(dummy_input)s').autocomplete({
source:'%(url)s',
delay:%(delay)d,
minLength:%(min_length)d,
search:function(event,ui){
$('#%(dummy_input)s_throbber').removeClass('hide').show()
return true
},
response:function(event,ui,content){
$('#%(dummy_input)s_throbber').hide()
return content
},
focus:function(event,ui){
var name=ui.item.first
if(ui.item.middle){
name+=' '+ui.item.middle
}
if(ui.item.last){
name+=' '+ui.item.last
}
var org=ui.item.org
var job=ui.item.job
if(org||job){
if(job){
name+=' ('+job
if(org){
name+=', '+org
}
name+=')'
}else{
name+=' ('+org+')'
}
}
$('#%(dummy_input)s').val(name)
return false
},
select:function(event,ui){
var name=ui.item.first
if(ui.item.middle){
name+=' '+ui.item.middle
}
if(ui.item.last){
name+=' '+ui.item.last
}
var org=ui.item.org
var job=ui.item.job
if(org||job){
if(job){
name+=' ('+job
if(org){
name+=', '+org
}
name+=')'
}else{
name+=' ('+org+')'
}
}
$('#%(dummy_input)s').val(name)
$('#%(real_input)s').val(ui.item.id).change()
''' % dict(dummy_input = dummy_input,
url = url,
delay = self.delay,
min_length = self.min_length,
real_input = real_input),
self.post_process, '''
%(real_input)s.accept=true
return false
}
}).data('autocomplete')._renderItem=function(ul,item){
var name=item.first
if(item.middle){
name+=' '+item.middle
}
if(item.last){
name+=' '+item.last
}
var org=item.org
var job=item.job
if(org||job){
if(job){
name+=' ('+job
if(org){
name+=', '+org
}
name+=')'
}else{
name+=' ('+org+')'
}
}
return $('<li></li>').data('item.autocomplete',item).append('<a>'+name+'</a>').appendTo(ul)
}
$('#%(dummy_input)s').blur(function(){
if(!$('#%(dummy_input)s').val()){
$('#%(real_input)s').val('').change()
%(real_input)s.accept=true
}
if(!%(real_input)s.accept){
$('#%(dummy_input)s').val(%(real_input)s.val)
}else{
%(real_input)s.val=$('#%(dummy_input)s').val()
}
%(real_input)s.accept=false
})''' % dict(dummy_input = dummy_input,
real_input = real_input)))
if value:
# Provide the representation for the current/default Value
text = str(field.represent(default["value"]))
if "<" in text:
# Strip Markup
try:
markup = etree.XML(text)
text = markup.xpath(".//text()")
if text:
text = " ".join(text)
else:
text = ""
except etree.XMLSyntaxError:
pass
represent = text
else:
represent = ""
response.s3.jquery_ready.append(js_autocomplete)
return TAG[""](
INPUT(_id=dummy_input,
_class="string",
_value=represent),
IMG(_src="/%s/static/img/ajax-loader.gif" % \
request.application,
_height=32, _width=32,
_id="%s_throbber" % dummy_input,
_class="throbber hide"),
INPUT(**attr),
requires = field.requires
)
# =============================================================================
class S3SiteAutocompleteWidget(FormWidget):
"""
Renders an org_site SELECT as an INPUT field with AJAX Autocomplete.
Differs from the S3AutocompleteWidget in that it uses name & type fields
in the represent
"""
def __init__(self,
post_process = "",
delay = 450, # milliseconds
min_length = 2):
self.auth = current.auth
self.post_process = post_process
self.delay = delay
self.min_length = min_length
def __call__(self, field, value, **attributes):
default = dict(
_type = "text",
value = (value != None and str(value)) or "",
)
attr = StringWidget._attributes(field, default, **attributes)
# Hide the real field
attr["_class"] = "%s hide" % attr["_class"]
real_input = str(field).replace(".", "_")
dummy_input = "dummy_%s" % real_input
url = URL(c="org", f="site",
args="search.json",
vars={"filter":"~",
"field":"name"})
# Provide a Lookup Table for Site Types
cases = ""
case = -1
org_site_types = current.auth.org_site_types
for instance_type in org_site_types.keys():
case = case + 1
cases += '''case '%s':
return '%s'
''' % (instance_type,
org_site_types[instance_type])
js_autocomplete = "".join(('''function s3_site_lookup(instance_type){
switch(instance_type){
%s}
}''' % cases, '''
var %(real_input)s={val:$('#%(dummy_input)s').val(),accept:false}
$('#%(dummy_input)s').autocomplete({
source:'%(url)s',
delay:%(delay)d,
minLength:%(min_length)d,
search:function(event,ui){
$('#%(dummy_input)s_throbber').removeClass('hide').show()
return true
},
response:function(event,ui,content){
$('#%(dummy_input)s_throbber').hide()
return content
},
focus:function(event,ui){
var name=''
if(ui.item.name!=null){
name+=ui.item.name
}
if(ui.item.instance_type!=''){
name+=' ('+s3_site_lookup(ui.item.instance_type)+')'
}
$('#%(dummy_input)s').val(name)
return false
},
select:function(event,ui){
var name=''
if(ui.item.name!=null){
name+=ui.item.name
}
if(ui.item.instance_type!=''){
name+=' ('+s3_site_lookup(ui.item.instance_type)+')'
}
$('#%(dummy_input)s').val(name)
$('#%(real_input)s').val(ui.item.site_id).change()
''' % dict(dummy_input=dummy_input,
real_input=real_input,
url=url,
delay=self.delay,
min_length=self.min_length),
self.post_process, '''
%(real_input)s.accept=true
return false
}
}).data('autocomplete')._renderItem=function(ul,item){
var name=''
if(item.name!=null){
name+=item.name
}
if(item.instance_type!=''){
name+=' ('+s3_site_lookup(item.instance_type)+')'
}
return $('<li></li>').data('item.autocomplete',item).append('<a>'+name+'</a>').appendTo(ul)
}
$('#%(dummy_input)s').blur(function(){
if(!$('#%(dummy_input)s').val()){
$('#%(real_input)s').val('').change()
%(real_input)s.accept=true
}
if(!%(real_input)s.accept){
$('#%(dummy_input)s').val(%(real_input)s.val)
}else{
%(real_input)s.val=$('#%(dummy_input)s').val()
}
%(real_input)s.accept=false
})''' % dict(dummy_input=dummy_input,
real_input=real_input)))
if value:
# Provide the representation for the current/default Value
text = str(field.represent(default["value"]))
if "<" in text:
# Strip Markup
try:
markup = etree.XML(text)
text = markup.xpath(".//text()")
if text:
text = " ".join(text)
else:
text = ""
except etree.XMLSyntaxError:
pass
represent = text
else:
represent = ""
current.response.s3.jquery_ready.append(js_autocomplete)
return TAG[""](
INPUT(_id=dummy_input,
_class="string",
_value=represent),
IMG(_src="/%s/static/img/ajax-loader.gif" % \
current.request.application,
_height=32, _width=32,
_id="%s_throbber" % dummy_input,
_class="throbber hide"),
INPUT(**attr),
requires = field.requires
)
# -----------------------------------------------------------------------------
def S3GenericAutocompleteTemplate(post_process,
delay,
min_length,
field,
value,
attributes,
source,
name_getter = "function(item){return item.name}",
id_getter = "function(item){return item.id}",
transform_value = lambda value: value,
new_items = False, # Allow new items
tablename = None, # Needed if new_items=True
):
"""
Renders a SELECT as an INPUT field with AJAX Autocomplete
"""
value = transform_value(value)
default = dict(
_type = "text",
value = (value != None and s3_unicode(value)) or "",
)
attr = StringWidget._attributes(field, default, **attributes)
# Hide the real field
attr["_class"] = attr["_class"] + " hide"
real_input = str(field).replace(".", "_")
dummy_input = "dummy_%s" % real_input
js_autocomplete = "".join(('''
var %(real_input)s={val:$('#%(dummy_input)s').val(),accept:false}
var get_name=%(name_getter)s
var get_id=%(id_getter)s
$('#%(dummy_input)s').autocomplete({
source:%(source)s,
delay:%(delay)d,
minLength:%(min_length)d,
search:function(event,ui){
$('#%(dummy_input)s_throbber').removeClass('hide').show()
return true
},
response:function(event,ui,content){
$('#%(dummy_input)s_throbber').hide()
return content
},
focus:function(event,ui){
$('#%(dummy_input)s').val(get_name(ui.item))
return false
},
select:function(event,ui){
var item=ui.item
$('#%(dummy_input)s').val(get_name(ui.item))
$('#%(real_input)s').val(get_id(ui.item)).change()
''' % locals(),
post_process or "",
'''
%(real_input)s.accept=true
return false
}
}).data('autocomplete')._renderItem=function(ul,item){
return $('<li></li>').data('item.autocomplete',item).append('<a>'+get_name(item)+'</a>').appendTo(ul)
}''' % locals(),
'''
$('#%(dummy_input)s').blur(function(){
$('#%(real_input)s').val($('#%(dummy_input)s').val())
})''' % locals() if new_items else
'''
$('#%(dummy_input)s').blur(function(){
if(!$('#%(dummy_input)s').val()){
$('#%(real_input)s').val('').change()
%(real_input)s.accept=true
}
if(!%(real_input)s.accept){
$('#%(dummy_input)s').val(%(real_input)s.val)
}else{
%(real_input)s.val=$('#%(dummy_input)s').val()
}
%(real_input)s.accept=false
})''' % locals()))
if value:
# Provide the representation for the current/default Value
text = s3_unicode(field.represent(default["value"]))
if "<" in text:
# Strip Markup
try:
markup = etree.XML(text)
text = markup.xpath(".//text()")
if text:
text = " ".join(text)
else:
text = ""
except etree.XMLSyntaxError:
pass
represent = text
else:
represent = ""
current.response.s3.jquery_ready.append(js_autocomplete)
return TAG[""](
INPUT(_id=dummy_input,
_class="string",
value=represent),
IMG(_src="/%s/static/img/ajax-loader.gif" % \
current.request.application,
_height=32, _width=32,
_id="%s_throbber" % dummy_input,
_class="throbber hide"),
INPUT(**attr),
requires = field.requires
)
# =============================================================================
class S3LocationDropdownWidget(FormWidget):
"""
Renders a dropdown for an Lx level of location hierarchy
"""
def __init__(self, level="L0", default=None, empty=False):
""" Set Defaults """
self.level = level
self.default = default
self.empty = empty
def __call__(self, field, value, **attributes):
level = self.level
default = self.default
empty = self.empty
s3db = current.s3db
table = s3db.gis_location
query = (table.level == level)
locations = current.db(query).select(table.name,
table.id,
cache=s3db.cache)
opts = []
for location in locations:
opts.append(OPTION(location.name, _value=location.id))
if not value and default and location.name == default:
value = location.id
locations = locations.as_dict()
attr_dropdown = OptionsWidget._attributes(field,
dict(_type = "int",
value = value))
requires = IS_IN_SET(locations)
if empty:
requires = IS_NULL_OR(requires)
attr_dropdown["requires"] = requires
attr_dropdown["represent"] = \
lambda id: locations["id"]["name"] or UNKNOWN_OPT
return TAG[""](
SELECT(*opts, **attr_dropdown),
requires=field.requires
)
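# A hedged usage sketch (illustrative): a country selector defaulting
# to Timor-Leste, with an empty choice allowed:
#
#   Field("location_id", "reference gis_location",
#         widget = S3LocationDropdownWidget(level="L0",
#                                           default="Timor-Leste",
#                                           empty=True))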
# =============================================================================
class S3LocationSelectorWidget(FormWidget):
"""
Renders a gis_location Foreign Key to allow inline display/editing of linked fields.
Designed for use for Resources which require a Specific Location, such as Sites, Persons, Assets, Incidents, etc
Not currently suitable for Resources which require a Hierarchical Location, such as Projects, Assessments, Plans, etc
- S3LocationAutocompleteWidget is more appropriate for these.
Can also be used to transparently wrap simple sites (such as project_site) using the IS_SITE_SELECTOR() validator
It uses s3.locationselector.widget.js to do all client-side functionality.
It requires the IS_LOCATION_SELECTOR() validator to process Location details upon form submission.
Create form
Active Tab: 'Create New Location'
Country Dropdown (to set the Number & Labels of Hierarchy)
Building Name (deployment_setting to hide)
Street Address (Line1/Line2?)
@ToDo: Trigger a geocoder lookup onblur
Postcode
@ToDo: Mode Strict:
Lx as dropdowns. Default label is 'Select previous to populate this dropdown' (Fixme!)
Mode not Strict (default):
L2-L5 as Autocompletes which create missing locations automatically
@ToDo: L1 as Dropdown? (Have a gis_config setting to inform whether this is populated for a given L0)
Map:
@ToDo: Inline or Popup? (Deployment Option?)
Set Map Viewport to default on best currently selected Hierarchy
@ToDo: L1+
Lat Lon
Inactive Tab: 'Select Existing Location'
Needs 2 modes:
Specific Locations only - for Sites/Incidents
@ToDo: Hierarchies ok (can specify which) - for Projects/Documents
@ToDo: Hierarchical Filters above the Search Box
Search is filtered to values shown
Filters default to any hierarchy selected on the Create tab?
Button to explicitly say 'Select this Location' which sets all the fields (inc hidden ID) & the UUID var
Tabs then change to View/Edit
Update form
Update form has uuid set server-side & hence S3.gis.uuid set client-side
Assume location is shared by other resources
Active Tab: 'View Location Details' (Fields are read-only)
Inactive Tab: 'Edit Location Details' (Fields are writable)
@ToDo: Inactive Tab: 'Move Location': Defaults to Searching for an Existing Location, with a button to 'Create New Location'
@see: http://eden.sahanafoundation.org/wiki/BluePrintGISLocationSelector
"""
def __init__(self,
hide_address=False,
site_type=None,
polygon=False):
self.hide_address = hide_address
self.site_type = site_type
self.polygon = polygon
def __call__(self, field, value, **attributes):
T = current.T
db = current.db
s3db = current.s3db
gis = current.gis
auth = current.auth
settings = current.deployment_settings
response = current.response
s3 = current.response.s3
appname = current.request.application
locations = s3db.gis_location
ctable = s3db.gis_config
requires = field.requires
# Main Input
defaults = dict(_type = "text",
value = (value != None and str(value)) or "")
attr = StringWidget._attributes(field, defaults, **attributes)
# Hide the real field
attr["_class"] = "hide"
# Is this a Site?
site = ""
if self.site_type:
# We are acting on a site_id not location_id
# Store the real variables
#site_value = value
#site_field = field
# Ensure that we have a name for the Location visible
settings.gis.building_name = True
# Set the variables to what they would be for a Location
stable = s3db[self.site_type]
field = stable.location_id
if value:
query = (stable.id == value)
record = db(query).select(stable.location_id,
limitby=(0, 1)).first()
if record:
value = record.location_id
defaults = dict(_type = "text",
value = str(value))
else:
raise HTTP(404)
else:
# Check for being a location_id on a site type
# If so, then the JS defaults the Building Name to Site Name
tablename = field.tablename
if tablename in auth.org_site_types:
site = tablename
# Full list of countries (by ID)
countries = gis.get_countries()
# Countries we should select from
_countries = settings.get_gis_countries()
if _countries:
__countries = gis.get_countries(key_type="code")
countrynames = []
append = countrynames.append
for k, v in __countries.iteritems():
if k in _countries:
append(v)
# Iterate over a copy since entries are deleted during the loop
for k, v in countries.items():
if v not in countrynames:
del countries[k]
# Read Options
config = gis.get_config()
default_L0 = Storage()
country_snippet = ""
if value:
default_L0.id = gis.get_parent_country(value)
elif config.default_location_id:
# Populate defaults with IDs & Names of ancestors at each level
defaults = gis.get_parent_per_level(defaults,
config.default_location_id,
feature=None,
ids=True,
names=True)
query = (locations.id == config.default_location_id)
default_location = db(query).select(locations.level,
locations.name).first()
if default_location.level:
# Add this one to the defaults too
defaults[default_location.level] = Storage(name = default_location.name,
id = config.default_location_id)
if "L0" in defaults:
default_L0 = defaults["L0"]
if default_L0:
id = default_L0.id
if id not in countries:
# Add the default country to the list of possibles
countries[id] = defaults["L0"].name
country_snippet = "S3.gis.country = '%s';\n" % \
gis.get_default_country(key_type="code")
elif len(countries) == 1:
default_L0.id = countries.keys()[0]
# Should we use a Map-based selector?
map_selector = settings.get_gis_map_selector()
if map_selector:
no_map = ""
else:
no_map = "S3.gis.no_map = true;\n"
# Should we display LatLon boxes?
latlon_selector = settings.get_gis_latlon_selector()
# Should we display Polygons?
polygon = self.polygon
# Navigate Away Confirm?
if settings.get_ui_navigate_away_confirm():
navigate_away_confirm = '''
S3.navigate_away_confirm=true'''
else:
navigate_away_confirm = ""
# Which tab should the widget open to by default?
# @ToDo: Act on this server-side instead of client-side
if s3.gis.tab:
tab = '''
S3.gis.tab="%s"''' % s3.gis.tab
else:
# Default to Create
tab = ""
# Which Levels do we have in our hierarchy & what are their initial Labels?
# If we have a default country or one from the value then we can lookup
# the labels we should use for that location
country = None
if default_L0:
country = default_L0.id
location_hierarchy = gis.get_location_hierarchy(location=country)
# This is all levels to start, but L0 will be dropped later.
levels = gis.hierarchy_level_keys
map_popup = ""
if value:
# Read current record
if auth.s3_has_permission("update", locations, record_id=value):
# Update mode
# - we assume this location could be shared by other resources
create = "hide" # Hide sections which are meant for create forms
update = ""
query = (locations.id == value)
this_location = db(query).select(locations.uuid,
locations.name,
locations.level,
locations.inherited,
locations.lat,
locations.lon,
locations.addr_street,
locations.addr_postcode,
locations.parent,
locations.path,
locations.wkt,
limitby=(0, 1)).first()
if this_location:
uid = this_location.uuid
level = this_location.level
defaults[level] = Storage()
defaults[level].id = value
if this_location.inherited:
lat = None
lon = None
wkt = None
else:
lat = this_location.lat
lon = this_location.lon
wkt = this_location.wkt
addr_street = this_location.addr_street or ""
#addr_street_encoded = ""
#if addr_street:
# addr_street_encoded = addr_street.replace("\r\n",
# "%0d").replace("\r",
# "%0d").replace("\n",
# "%0d")
postcode = this_location.addr_postcode
parent = this_location.parent
path = this_location.path
# Populate defaults with IDs & Names of ancestors at each level
defaults = gis.get_parent_per_level(defaults,
value,
feature=this_location,
ids=True,
names=True)
# If we have a non-specific location then not all keys will be populated.
# Populate these now:
for l in levels:
try:
defaults[l]
except:
defaults[l] = None
if level and not level == "XX":
# If within the locations hierarchy then don't populate the visible name box
represent = ""
else:
represent = this_location.name
if map_selector:
zoom = config.zoom
if zoom is None:
zoom = 1
if lat is None or lon is None:
map_lat = config.lat
map_lon = config.lon
else:
map_lat = lat
map_lon = lon
query = (locations.id == value)
row = db(query).select(locations.lat,
locations.lon,
limitby=(0, 1)).first()
if row:
feature = {"lat" : row.lat,
"lon" : row.lon }
features = [feature]
else:
features = []
map_popup = gis.show_map(
lat = map_lat,
lon = map_lon,
# Same as a single zoom on a cluster
zoom = zoom + 2,
features = features,
add_feature = True,
add_feature_active = not polygon,
add_polygon = polygon,
add_polygon_active = polygon,
toolbar = True,
collapsed = True,
search = True,
window = True,
window_hide = True,
location_selector = True
)
else:
# Bad location_id
response.error = T("Invalid Location!")
value = None
elif auth.s3_has_permission("read", locations, record_id=value):
# Read mode
# @ToDo
return ""
else:
# No Permission to read location, so don't render a row
return ""
if not value:
# No default value
# Check that we're allowed to create records
if auth.s3_has_permission("update", locations):
# Create mode
create = ""
update = "hide" # Hide sections which are meant for update forms
uuid = ""
represent = ""
level = None
lat = None
lon = None
wkt = None
addr_street = ""
#addr_street_encoded = ""
postcode = ""
if map_selector:
map_popup = gis.show_map(
add_feature = True,
add_feature_active = not polygon,
add_polygon = polygon,
add_polygon_active = polygon,
toolbar = True,
collapsed = True,
search = True,
window = True,
window_hide = True,
location_selector = True
)
else:
# No Permission to create a location, so don't render a row
return ""
# JS snippets of config
# (we only include items with data)
s3_gis_lat_lon = ""
# Components to inject into Form
divider = TR(TD(_class="subheading"),
_class="box_bottom locselect")
expand_button = DIV(_id="gis_location_expand", _class="expand")
label_row = TR(TD(expand_button, B("%s:" % field.label)),
_id="gis_location_label_row",
_class="box_top")
# Tabs to select between the modes
# @ToDo: Move Location tab
view_button = A(T("View Location Details"),
_style="cursor:pointer; cursor:hand",
_id="gis_location_view-btn")
edit_button = A(T("Edit Location Details"),
_style="cursor:pointer; cursor:hand",
_id="gis_location_edit-btn")
add_button = A(T("Create New Location"),
_style="cursor:pointer; cursor:hand",
_id="gis_location_add-btn")
search_button = A(T("Select Existing Location"),
_style="cursor:pointer; cursor:hand",
_id="gis_location_search-btn")
tabs = DIV(SPAN(add_button, _id="gis_loc_add_tab",
_class="tab_here %s" % create),
SPAN(search_button, _id="gis_loc_search_tab",
_class="tab_last %s" % create),
SPAN(view_button, _id="gis_loc_view_tab",
_class="tab_here %s" % update),
SPAN(edit_button, _id="gis_loc_edit_tab",
_class="tab_last %s" % update),
_class="tabs")
tab_rows = TR(TD(tabs), TD(),
_id="gis_location_tabs_row",
_class="locselect box_middle")
# L0 selector
SELECT_COUNTRY = T("Choose country")
level = "L0"
L0_rows = ""
if len(countries) == 1:
# Hard-coded country
id = countries.items()[0][0]
L0_rows = INPUT(value = id,
_id="gis_location_%s" % level,
_name="gis_location_%s" % level,
_class="hide box_middle")
else:
if default_L0:
attr_dropdown = OptionsWidget._attributes(field,
dict(_type = "int",
value = default_L0.id),
**attributes)
else:
attr_dropdown = OptionsWidget._attributes(field,
dict(_type = "int",
value = ""),
**attributes)
attr_dropdown["requires"] = \
IS_NULL_OR(IS_IN_SET(countries,
zero = SELECT_COUNTRY))
attr_dropdown["represent"] = \
lambda id: gis.get_country(id) or UNKNOWN_OPT
opts = [OPTION(SELECT_COUNTRY, _value="")]
if countries:
for (id, name) in countries.iteritems():
opts.append(OPTION(name, _value=id))
attr_dropdown["_id"] = "gis_location_%s" % level
## Old: Need to blank the name to prevent it from appearing in form.vars & requiring validation
#attr_dropdown["_name"] = ""
attr_dropdown["_name"] = "gis_location_%s" % level
if value:
# Update form => read-only
attr_dropdown["_disabled"] = "disabled"
try:
attr_dropdown["value"] = defaults[level].id
except:
pass
widget = SELECT(*opts, **attr_dropdown)
label = LABEL("%s:" % location_hierarchy[level])
L0_rows = DIV(TR(TD(label), TD(),
_class="locselect box_middle",
_id="gis_location_%s_label__row" % level),
TR(TD(widget), TD(),
_class="locselect box_middle",
_id="gis_location_%s__row" % level))
row = TR(INPUT(_id="gis_location_%s_search" % level,
_disabled="disabled"), TD(),
_class="hide locselect box_middle",
_id="gis_location_%s_search__row" % level)
L0_rows.append(row)
if self.site_type:
NAME_LABEL = T("Site Name")
else:
NAME_LABEL = T("Building Name")
STREET_LABEL = T("Street Address")
POSTCODE_LABEL = settings.get_ui_label_postcode()
LAT_LABEL = T("Latitude")
LON_LABEL = T("Longitude")
AUTOCOMPLETE_HELP = T("Enter some characters to bring up a list of possible matches")
NEW_HELP = T("If not found, you can have a new location created.")
def ac_help_widget(level):
try:
label = location_hierarchy[level]
except:
label = level
return DIV(_class="tooltip",
_title="%s|%s|%s" % (label, AUTOCOMPLETE_HELP, NEW_HELP))
hidden = ""
throbber = "/%s/static/img/ajax-loader.gif" % appname
Lx_rows = DIV()
if value:
# Display Read-only Fields
name_widget = INPUT(value=represent,
_id="gis_location_name",
_name="gis_location_name",
_disabled="disabled")
street_widget = TEXTAREA(value=addr_street,
_id="gis_location_street",
_class="text",
_name="gis_location_street",
_disabled="disabled")
postcode_widget = INPUT(value=postcode,
_id="gis_location_postcode",
_name="gis_location_postcode",
_disabled="disabled")
lat_widget = S3LatLonWidget("lat",
disabled=True).widget(value=lat)
lon_widget = S3LatLonWidget("lon",
switch_button=True,
disabled=True).widget(value=lon)
for level in levels:
if level == "L0":
# L0 has been handled as special case earlier
continue
elif level not in location_hierarchy:
# Skip levels not in hierarchy
continue
if defaults[level]:
id = defaults[level].id
name = defaults[level].name
else:
# Hide empty levels
hidden = "hide"
id = ""
name = ""
try:
label = LABEL("%s:" % location_hierarchy[level])
except:
label = LABEL("%s:" % level)
row = TR(TD(label), TD(),
_id="gis_location_%s_label__row" % level,
_class="%s locselect box_middle" % hidden)
Lx_rows.append(row)
widget = DIV(INPUT(value=id,
_id="gis_location_%s" % level,
_name="gis_location_%s" % level,
_class="hide"),
INPUT(value=name,
_id="gis_location_%s_ac" % level,
_disabled="disabled"),
IMG(_src=throbber,
_height=32, _width=32,
_id="gis_location_%s_throbber" % level,
_class="throbber hide"))
row = TR(TD(widget), TD(),
_id="gis_location_%s__row" % level,
_class="%s locselect box_middle" % hidden)
Lx_rows.append(row)
else:
name_widget = INPUT(_id="gis_location_name",
_name="gis_location_name")
street_widget = TEXTAREA(_id="gis_location_street",
_class="text",
_name="gis_location_street")
postcode_widget = INPUT(_id="gis_location_postcode",
_name="gis_location_postcode")
lat_widget = S3LatLonWidget("lat").widget()
lon_widget = S3LatLonWidget("lon", switch_button=True).widget()
for level in levels:
hidden = ""
if level == "L0":
# L0 has been handled as special case earlier
continue
elif level not in location_hierarchy:
# Hide unused levels
# (these can then be enabled for other regions)
hidden = "hide"
try:
label = LABEL("%s:" % location_hierarchy[level])
except:
label = LABEL("%s:" % level)
row = TR(TD(label), TD(),
_class="%s locselect box_middle" % hidden,
_id="gis_location_%s_label__row" % level)
Lx_rows.append(row)
if level in defaults and defaults[level]:
default = defaults[level]
default_id = default.id
default_name = default.name
else:
default_id = ""
default_name = ""
widget = DIV(INPUT(value=default_id,
_id="gis_location_%s" % level,
_name="gis_location_%s" % level,
_class="hide"),
INPUT(value=default_name,
_id="gis_location_%s_ac" % level,
_class="%s" % hidden),
IMG(_src=throbber,
_height=32, _width=32,
_id="gis_location_%s_throbber" % level,
_class="throbber hide"))
row = TR(TD(widget),
TD(ac_help_widget(level)),
_class="%s locselect box_middle" % hidden,
_id="gis_location_%s__row" % level)
Lx_rows.append(row)
row = TR(INPUT(_id="gis_location_%s_search" % level,
_disabled="disabled"), TD(),
_class="hide locselect box_middle",
_id="gis_location_%s_search__row" % level)
Lx_rows.append(row)
hide_address = self.hide_address
if settings.get_gis_building_name():
hidden = ""
if hide_address:
hidden = "hide"
elif value and not represent:
hidden = "hide"
name_rows = DIV(TR(LABEL("%s:" % NAME_LABEL), TD(),
_id="gis_location_name_label__row",
_class="%s locselect box_middle" % hidden),
TR(name_widget, TD(),
_id="gis_location_name__row",
_class="%s locselect box_middle" % hidden),
TR(INPUT(_id="gis_location_name_search",
_disabled="disabled"), TD(),
_id="gis_location_name_search__row",
_class="hide locselect box_middle"))
else:
name_rows = ""
hidden = ""
if hide_address:
hidden = "hide"
elif value and not addr_street:
hidden = "hide"
street_rows = DIV(TR(LABEL("%s:" % STREET_LABEL), TD(),
_id="gis_location_street_label__row",
_class="%s locselect box_middle" % hidden),
TR(street_widget, TD(),
_id="gis_location_street__row",
_class="%s locselect box_middle" % hidden),
TR(INPUT(_id="gis_location_street_search",
_disabled="disabled"), TD(),
_id="gis_location_street_search__row",
_class="hide locselect box_middle"))
if config.geocoder:
geocoder = '''
S3.gis.geocoder=true'''
else:
geocoder = ""
hidden = ""
if hide_address:
hidden = "hide"
elif value and not postcode:
hidden = "hide"
postcode_rows = DIV(TR(LABEL("%s:" % POSTCODE_LABEL), TD(),
_id="gis_location_postcode_label__row",
_class="%s locselect box_middle" % hidden),
TR(postcode_widget, TD(),
_id="gis_location_postcode__row",
_class="%s locselect box_middle" % hidden),
TR(INPUT(_id="gis_location_postcode_search",
_disabled="disabled"), TD(),
_id="gis_location_postcode_search__row",
_class="hide locselect box_middle"))
hidden = ""
no_latlon = ""
if not latlon_selector:
hidden = "hide"
no_latlon = '''S3.gis.no_latlon=true\n'''
elif value and lat is None:
hidden = "hide"
latlon_help = locations.lat.comment
# The converter button is currently disabled:
#converter_button = locations.lon.comment
converter_button = ""
latlon_rows = DIV(TR(LABEL("%s:" % LAT_LABEL), TD(),
_id="gis_location_lat_label__row",
_class="%s locselect box_middle" % hidden),
TR(TD(lat_widget), TD(latlon_help),
_id="gis_location_lat__row",
_class="%s locselect box_middle" % hidden),
TR(INPUT(_id="gis_location_lat_search",
_disabled="disabled"), TD(),
_id="gis_location_lat_search__row",
_class="hide locselect box_middle"),
TR(LABEL("%s:" % LON_LABEL), TD(),
_id="gis_location_lon_label__row",
_class="%s locselect box_middle" % hidden),
TR(TD(lon_widget), TD(converter_button),
_id="gis_location_lon__row",
_class="%s locselect box_middle" % hidden),
TR(INPUT(_id="gis_location_lon_search",
_disabled="disabled"), TD(),
_id="gis_location_lon_search__row",
_class="hide locselect box_middle"))
# Map Selector
PLACE_ON_MAP = T("Place on Map")
VIEW_ON_MAP = T("View on Map")
if map_selector:
if value:
map_button = A(VIEW_ON_MAP,
_style="cursor:pointer; cursor:hand",
_id="gis_location_map-btn",
_class="action-btn")
else:
map_button = A(PLACE_ON_MAP,
_style="cursor:pointer; cursor:hand",
_id="gis_location_map-btn",
_class="action-btn")
map_button_row = TR(map_button, TD(),
_id="gis_location_map_button_row",
_class="locselect box_middle")
else:
map_button_row = ""
# Search
widget = DIV(INPUT(_id="gis_location_search_ac"),
IMG(_src=throbber,
_height=32, _width=32,
_id="gis_location_search_throbber",
_class="throbber hide"),
_id="gis_location_search_div")
label = LABEL("%s:" % AUTOCOMPLETE_HELP)
select_button = A(T("Select This Location"),
_style="cursor:pointer; cursor:hand",
_id="gis_location_search_select-btn",
_class="hide action-btn")
search_rows = DIV(TR(label, TD(),
_id="gis_location_search_label__row",
_class="hide locselect box_middle"),
TR(TD(widget),
TD(select_button),
_id="gis_location_search__row",
_class="hide locselect box_middle"))
# @ToDo: Hierarchical Filter
Lx_search_rows = ""
# Error Messages
NAME_REQUIRED = T("Name field is required!")
COUNTRY_REQUIRED = T("Country is required!")
# Settings to be read by static/scripts/S3/s3.locationselector.widget.js
# Note: Currently we're limited to a single location selector per page
js_location_selector = '''
%s%s%s%s%s%s
S3.gis.location_id='%s'
S3.gis.site='%s'
i18n.gis_place_on_map='%s'
i18n.gis_view_on_map='%s'
i18n.gis_name_required='%s'
i18n.gis_country_required="%s"''' % (country_snippet,
geocoder,
navigate_away_confirm,
no_latlon,
no_map,
tab,
attr["_id"], # Name of the real location or site field
site,
PLACE_ON_MAP,
VIEW_ON_MAP,
NAME_REQUIRED,
COUNTRY_REQUIRED
)
s3.js_global.append(js_location_selector)
if s3.debug:
script = "s3.locationselector.widget.js"
else:
script = "s3.locationselector.widget.min.js"
s3.scripts.append("/%s/static/scripts/S3/%s" % (appname, script))
if self.polygon:
hidden = ""
if value:
# Display read-only view
wkt_widget = TEXTAREA(value = wkt,
_class="wkt-input",
_id="gis_location_wkt",
_name="gis_location_wkt",
_disabled="disabled")
if wkt:
hidden = "hide"
else:
wkt_widget = TEXTAREA(_class="wkt-input",
_id="gis_location_wkt",
_name="gis_location_wkt")
wkt_input_row = TAG[""](
TR(TD(LABEL("%s (WGS84)" % T("Polygon"))),
TD(),
_id="gis_location_wkt_label__row",
_class="box_middle %s" % hidden),
TR(
TD(wkt_widget),
TD(),
_id="gis_location_wkt__row",
_class="box_middle %s" % hidden)
)
else:
wkt_input_row = ""
# The overall layout of the components
return TAG[""](
TR(INPUT(**attr)), # Real input, which is hidden
label_row,
tab_rows,
Lx_search_rows,
search_rows,
L0_rows,
name_rows,
street_rows,
postcode_rows,
Lx_rows,
wkt_input_row,
map_button_row,
latlon_rows,
divider,
TR(map_popup, TD(), _class="box_middle"),
requires=requires
)
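# Usage sketch (added illustration, not part of the original source): the
# enclosing class name is assumed to be S3LocationSelectorWidget per the
# Sahana Eden convention, and the polygon keyword assumes the constructor
# exposes the self.polygon flag used above:
#
#   table = current.s3db.org_office
#   table.location_id.widget = S3LocationSelectorWidget(polygon=True)
#
# The widget then renders the hidden real input plus the tab row, L0
# dropdown, Lx autocompletes, street/postcode/lat-lon rows and (optionally)
# the map popup assembled above, wired together by
# s3.locationselector.widget.js.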
# =============================================================================
class S3LatLonWidget(DoubleWidget):
"""
Widget for latitude or longitude input, gives option to input in terms
of degrees, minutes and seconds
"""
def __init__(self, type, switch_button=False, disabled=False):
self._id = "gis_location_%s" % type
self._name = self._id
self.disabled = disabled
self.switch_button = switch_button
def widget(self, field=None, value=None):
T = current.T
s3 = current.response.s3
attr = dict(value=value,
_class="decimal %s" % self._class,
_id=self._id,
_name=self._name)
attr_dms = dict()
if self.disabled:
attr["_disabled"] = "disabled"
attr_dms["_disabled"] = "disabled"
dms_boxes = SPAN(
INPUT(_class="degrees", **attr_dms), "° ",
INPUT(_class="minutes", **attr_dms), "' ",
INPUT(_class="seconds", **attr_dms), "\" ",
["",
DIV(A(T("Use decimal"),
_class="action-btn gis_coord_switch_decimal"))
][self.switch_button],
_style="display: none;",
_class="gis_coord_dms"
)
decimal = SPAN(
INPUT(**attr),
["",
DIV(A(T("Use deg, min, sec"),
_class="action-btn gis_coord_switch_dms"))
][self.switch_button],
_class="gis_coord_decimal"
)
if not s3.lat_lon_i18n_appended:
s3.js_global.append('''
i18n.gis_only_numbers={degrees:'%s',minutes:'%s',seconds:'%s',decimal:'%s'}
i18n.gis_range_error={degrees:{lat:'%s',lon:'%s'},minutes:'%s',seconds:'%s',decimal:{lat:'%s',lon:'%s'}}
''' % (T("Degrees must be a number."),
T("Minutes must be a number."),
T("Seconds must be a number."),
T("Degrees must be a number."),
T("Degrees in a latitude must be between -90 to 90."),
T("Degrees in a longitude must be between -180 to 180."),
T("Minutes must be less than 60."),
T("Seconds must be less than 60."),
T("Latitude must be between -90 and 90."),
T("Longitude must be between -180 and 180.")))
s3.lat_lon_i18n_appended = True
if s3.debug and \
(not "S3/locationselector.widget.css" in s3.stylesheets):
s3.stylesheets.append("S3/locationselector.widget.css")
if field is None:
return SPAN(decimal,
dms_boxes,
_class="gis_coord_wrap")
else:
return SPAN(
decimal,
dms_boxes,
requires = field.requires,
_class="gis_coord_wrap"
)
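# Usage sketch (added illustration, not part of the original source): the
# widget is built for gis_location lat/lon pairs, one instance per
# coordinate, e.g.
#
#   lat_input = S3LatLonWidget("lat").widget(value=51.5)
#   lon_input = S3LatLonWidget("lon", switch_button=True).widget(value=-0.1)
#
# Each call renders a decimal INPUT plus hidden degree/minute/second boxes
# that the client-side script can switch between.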
# =============================================================================
class S3CheckboxesWidget(OptionsWidget):
"""
Generates a TABLE tag with <num_column> columns of INPUT
checkboxes (multiple allowed)
help_lookup_table_name_field will display tooltip help
:param lookup_table_name: string - name of the lookup table
:param lookup_field_name: string - name of the lookup field used to label options
:param multiple: boolean - allow multiple selections
:param options: list - optional -
value,text pairs for the checkboxes -
If options = None, use options from requires.options().
This argument is useful for displaying a sub-set of the requires.options()
:param num_column: int - number of columns to spread the checkboxes across
:param help_lookup_field_name: string - optional - field on the lookup table to read tooltip help text from
:param help_footer: string - optional - extra text appended to each tooltip
Currently unused
"""
def __init__(self,
lookup_table_name = None,
lookup_field_name = None,
multiple = False,
options = None,
num_column = 1,
help_lookup_field_name = None,
help_footer = None
):
self.lookup_table_name = lookup_table_name
self.lookup_field_name = lookup_field_name
self.multiple = multiple
self.options = options
self.num_column = num_column
self.help_lookup_field_name = help_lookup_field_name
self.help_footer = help_footer
# -------------------------------------------------------------------------
def widget(self,
field,
value = None
):
if current.db:
db = current.db
else:
db = field._db
lookup_table_name = self.lookup_table_name
lookup_field_name = self.lookup_field_name
if lookup_table_name and lookup_field_name:
requires = IS_NULL_OR(IS_IN_DB(db,
db[lookup_table_name].id,
"%(" + lookup_field_name + ")s",
multiple = self.multiple))
else:
requires = field.requires
options = self.options
if not options:
if hasattr(requires, "options"):
options = requires.options()
else:
raise SyntaxError, "widget cannot determine options of %s" % field
values = s3_split_multi_value(value)
attr = OptionsWidget._attributes(field, {})
num_column = self.num_column
num_row = len(options) / num_column
# Ensure division rounds up
if len(options) % num_column > 0:
num_row = num_row + 1
table = TABLE(_id = str(field).replace(".", "_"))
append = table.append
for i in range(0, num_row):
table_row = TR()
for j in range(0, num_column):
# Check that the index is still within options
index = num_row * j + i
if index < len(options):
input_options = dict(requires = attr.get("requires", None),
_value = str(options[index][0]),
value = values,
_type = "checkbox",
_name = field.name,
hideerror = True
)
tip_attr = {}
help_text = ""
if self.help_lookup_field_name:
help_text = str(P(s3_get_db_field_value(tablename = lookup_table_name,
fieldname = self.help_lookup_field_name,
look_up_value = options[index][0],
look_up_field = "id")))
if self.help_footer:
help_text = help_text + str(self.help_footer)
if help_text:
tip_attr = dict(_class = "s3_checkbox_label",
#_title = options[index][1] + "|" + help_text
_rel = help_text
)
#table_row.append(TD(A(options[index][1],**option_attr )))
table_row.append(TD(INPUT(**input_options),
SPAN(options[index][1], **tip_attr)
)
)
append(table_row)
if self.multiple:
append(TR(I("(Multiple selections allowed)")))
return table
# -------------------------------------------------------------------------
def represent(self, value):
values = [s3_get_db_field_value(tablename = self.lookup_table_name,
fieldname = self.lookup_field_name,
look_up_value = id,
look_up_field = "id")
for id in s3_split_multi_value(value) if id]
if values and None not in values:
return ", ".join(values)
else:
return None
# =============================================================================
class S3MultiSelectWidget(MultipleOptionsWidget):
"""
Standard MultipleOptionsWidget, but using the jQuery UI:
http://www.quasipartikel.at/multiselect/
static/scripts/ui.multiselect.js
"""
def __init__(self):
pass
def __call__(self, field, value, **attributes):
T = current.T
s3 = current.response.s3
selector = str(field).replace(".", "_")
s3.js_global.append('''
i18n.addAll='%s'
i18n.removeAll='%s'
i18n.itemsCount='%s'
i18n.search='%s'
''' % (T("Add all"),
T("Remove all"),
T("items selected"),
T("search")))
s3.jquery_ready.append('''
$('#%s').removeClass('list')
$('#%s').addClass('multiselect')
$('#%s').multiselect({
dividerLocation:0.5,
sortable:false
})
''' % (selector,
selector,
selector))
return TAG[""](
MultipleOptionsWidget.widget(field, value, **attributes),
requires = field.requires
)
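# Usage sketch (added illustration; the table/field names are hypothetical):
#
#   table = current.s3db.hrm_competency
#   table.skill_id.widget = S3MultiSelectWidget()
#
# The standard multi-options <select> is then progressively enhanced into
# the two-pane jQuery UI multiselect by the jquery_ready snippet above.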
# =============================================================================
class S3ACLWidget(CheckboxesWidget):
"""
Widget class for ACLs
@todo: add option dependency logic (JS)
@todo: configurable vertical/horizontal alignment
"""
@staticmethod
def widget(field, value, **attributes):
requires = field.requires
if not isinstance(requires, (list, tuple)):
requires = [requires]
if requires:
if hasattr(requires[0], "options"):
options = requires[0].options()
values = []
for k in options:
if isinstance(k, (list, tuple)):
k = k[0]
try:
flag = int(k)
if flag == 0:
if value == 0:
values.append(k)
break
else:
continue
elif value and value & flag == flag:
values.append(k)
except ValueError:
pass
value = values
#return CheckboxesWidget.widget(field, value, **attributes)
attr = OptionsWidget._attributes(field, {}, **attributes)
options = [(k, v) for k, v in options if k != ""]
opts = []
cols = attributes.get("cols", 1)
totals = len(options)
mods = totals%cols
rows = totals/cols
if mods:
rows += 1
for r_index in range(rows):
tds = []
for k, v in options[r_index*cols:(r_index+1)*cols]:
tds.append(TD(INPUT(_type="checkbox",
_name=attr.get("_name", field.name),
requires=attr.get("requires", None),
hideerror=True, _value=k,
value=(k in value)), v))
opts.append(TR(tds))
if opts:
opts[-1][0][0]["hideerror"] = False
return TABLE(*opts, **attr)
# was values = re.compile("[\w\-:]+").findall(str(value))
#values = not isinstance(value,(list,tuple)) and [value] or value
#requires = field.requires
#if not isinstance(requires, (list, tuple)):
#requires = [requires]
#if requires:
#if hasattr(requires[0], "options"):
#options = requires[0].options()
#else:
#raise SyntaxError, "widget cannot determine options of %s" \
#% field
# =============================================================================
class CheckboxesWidgetS3(OptionsWidget):
"""
S3 version of gluon.sqlhtml.CheckboxesWidget:
- supports also integer-type keys in option sets
- has an identifiable class
Used in Sync, Projects
"""
@staticmethod
def widget(field, value, **attributes):
"""
generates a TABLE tag, including INPUT checkboxes (multiple allowed)
see also: :meth:`FormWidget.widget`
"""
#values = re.compile("[\w\-:]+").findall(str(value))
values = not isinstance(value, (list, tuple)) and [value] or value
values = [str(v) for v in values]
attr = OptionsWidget._attributes(field, {}, **attributes)
attr["_class"] = "checkboxes-widget-s3"
requires = field.requires
if not isinstance(requires, (list, tuple)):
requires = [requires]
if hasattr(requires[0], "options"):
options = requires[0].options()
else:
raise SyntaxError, "widget cannot determine options of %s" \
% field
options = [(k, v) for k, v in options if k != ""]
options_help = attributes.get("options_help", {})
input_index = attributes.get("start_at", 0)
opts = []
cols = attributes.get("cols", 1)
totals = len(options)
mods = totals % cols
rows = totals / cols
if mods:
rows += 1
if totals == 0:
T = current.T
opts.append(TR(TD(SPAN(T("no options available"),
_class="no-options-available"),
INPUT(_type="hide",
_name=field.name,
_value=None))))
for r_index in range(rows):
tds = []
for k, v in options[r_index * cols:(r_index + 1) * cols]:
input_id = "id-%s-%s" % (field.name, input_index)
option_help = options_help.get(str(k), "")
if option_help:
label = LABEL(v, _for=input_id, _title=option_help)
else:
# Don't provide empty client-side popups
label = LABEL(v, _for=input_id)
tds.append(TD(INPUT(_type="checkbox",
_name=field.name,
_id=input_id,
requires=attr.get("requires", None),
hideerror=True,
_value=k,
value=(str(k) in values)),
label))
input_index += 1
opts.append(TR(tds))
if opts:
opts[-1][0][0]["hideerror"] = False
return TABLE(*opts, **attr)
# =============================================================================
class S3AddPersonWidget(FormWidget):
"""
Renders a person_id field as a Create Person form,
with an embedded Autocomplete to select existing people.
It relies on JS code in static/S3/s3.select_person.js
"""
def __init__(self,
controller = None,
select_existing = True):
# Controller to retrieve the person record
self.controller = controller
self.select_existing = select_existing
def __call__(self, field, value, **attributes):
T = current.T
request = current.request
appname = request.application
s3 = current.response.s3
formstyle = s3.crud.formstyle
# Main Input
real_input = str(field).replace(".", "_")
default = dict(_type = "text",
value = (value != None and str(value)) or "")
attr = StringWidget._attributes(field, default, **attributes)
attr["_class"] = "hide"
if self.select_existing:
_class ="box_top"
else:
_class = "hide"
if self.controller is None:
controller = request.controller
else:
controller = self.controller
# Select from registry buttons
select_row = TR(TD(A(T("Select from registry"),
_href="#",
_id="select_from_registry",
_class="action-btn"),
A(T("Remove selection"),
_href="#",
_onclick="clear_person_form();",
_id="clear_form_link",
_class="action-btn hide",
_style="padding-left:15px;"),
A(T("Edit Details"),
_href="#",
_onclick="edit_selected_person_form();",
_id="edit_selected_person_link",
_class="action-btn hide",
_style="padding-left:15px;"),
IMG(_src="/%s/static/img/ajax-loader.gif" % appname,
_height=32,
_width=32,
_id="person_load_throbber",
_class="throbber hide",
_style="padding-left:85px;"),
_class="w2p_fw"),
TD(),
_id="select_from_registry_row",
_class=_class,
_controller=controller,
_field=real_input,
_value=str(value))
# Autocomplete
select = '''select_person($('#%s').val())''' % real_input
widget = S3PersonAutocompleteWidget(post_process=select)
ac_row = TR(TD(LABEL("%s: " % T("Name"),
_class="hide",
_id="person_autocomplete_label"),
widget(field,
None,
_class="hide")),
TD(),
_id="person_autocomplete_row",
_class="box_top")
# Embedded Form
s3db = current.s3db
ptable = s3db.pr_person
ctable = s3db.pr_contact
fields = [ptable.first_name,
ptable.middle_name,
ptable.last_name,
ptable.date_of_birth,
ptable.gender]
if controller == "hrm":
emailRequired = current.deployment_settings.get_hrm_email_required()
elif controller == "vol":
fields.append(s3db.pr_person_details.occupation)
emailRequired = current.deployment_settings.get_hrm_email_required()
else:
emailRequired = False
if emailRequired:
validator = IS_EMAIL()
else:
validator = IS_NULL_OR(IS_EMAIL())
fields.extend([Field("email",
notnull=emailRequired,
requires=validator,
label=T("Email Address")),
Field("mobile_phone",
label=T("Mobile Phone Number"))])
labels, required = s3_mark_required(fields)
if required:
s3.has_required = True
form = SQLFORM.factory(table_name="pr_person",
labels=labels,
formstyle=formstyle,
upload="default/download",
separator = "",
*fields)
trs = []
for tr in form[0]:
if not tr.attributes["_id"].startswith("submit_record"):
if "_class" in tr.attributes:
tr.attributes["_class"] = "%s box_middle" % \
tr.attributes["_class"]
else:
tr.attributes["_class"] = "box_middle"
trs.append(tr)
table = DIV(*trs)
# Divider
divider = TR(TD(_class="subheading"),
TD(),
_class="box_bottom")
# JavaScript
if s3.debug:
script = "s3.select_person.js"
else:
script = "s3.select_person.min.js"
s3.scripts.append("/%s/static/scripts/S3/%s" % (appname, script))
# Overall layout of components
return TAG[""](select_row,
ac_row,
table,
divider)
# =============================================================================
class S3AutocompleteOrAddWidget(FormWidget):
"""
This widget searches for or adds an object. It contains:
- an autocomplete field which can be used to search for an existing object.
- an add widget which is used to add an object.
It fills the field with that object after successful addition
"""
def __init__(self,
autocomplete_widget,
add_widget
):
self.autocomplete_widget = autocomplete_widget
self.add_widget = add_widget
def __call__(self, field, value, **attributes):
return TAG[""](
# this does the input field
self.autocomplete_widget(field, value, **attributes),
# this can fill it if it isn't autocompleted
self.add_widget(field, value, **attributes)
)
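# Composition sketch (added illustration; the concrete sub-widgets and
# their arguments are hypothetical, although S3AutocompleteWidget is used
# with this call pattern elsewhere in this module):
#
#   widget = S3AutocompleteOrAddWidget(
#               autocomplete_widget = S3AutocompleteWidget("org",
#                                                          resourcename="organisation"),
#               add_widget = my_add_widget)
#   table.organisation_id.widget = widget
#
# Both parts render in sequence; the add widget is expected to fill the
# shared field after a successful inline create.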
# =============================================================================
class S3AddObjectWidget(FormWidget):
"""
This widget displays an inline form loaded via AJAX on demand.
In the browser:
A load request must made to this widget to enable it.
The load request must include:
- a URL for the form
after a successful submission, the response callback is handed the
response.
"""
def __init__(self,
form_url,
table_name,
dummy_field_selector,
on_show,
on_hide
):
self.form_url = form_url
self.table_name = table_name
self.dummy_field_selector = dummy_field_selector
self.on_show = on_show
self.on_hide = on_hide
def __call__(self, field, value, **attributes):
T = current.T
s3 = current.response.s3
if s3.debug:
script_name = "/%s/static/scripts/jquery.ba-resize.js"
else:
script_name = "/%s/static/scripts/jquery.ba-resize.min.js"
if script_name not in s3.scripts:
s3.scripts.append(script_name)
return TAG[""](
# @ToDo: this might be better moved to its own script.
SCRIPT('''
$(function () {
var form_field = $('#%(form_field_name)s')
var throbber = $('<div id="%(form_field_name)s_ajax_throbber" class="ajax_throbber"/>')
throbber.hide()
throbber.insertAfter(form_field)
function request_add_form() {
throbber.show()
var dummy_field = $('%(dummy_field_selector)s')
// create an element for the form
var form_iframe = document.createElement('iframe')
var $form_iframe = $(form_iframe)
$form_iframe.attr('id', '%(form_field_name)s_form_iframe')
$form_iframe.attr('frameborder', '0')
$form_iframe.attr('scrolling', 'no')
$form_iframe.attr('src', '%(form_url)s')
var initial_iframe_style = {
width: add_object_link.width(),
height: add_object_link.height()
}
$form_iframe.css(initial_iframe_style)
function close_iframe() {
$form_iframe.unload()
form_iframe.contentWindow.close()
//iframe_controls.remove()
$form_iframe.animate(
initial_iframe_style,
{
complete: function () {
$form_iframe.remove()
add_object_link.show()
%(on_hide)s
dummy_field.show()
}
}
)
}
function reload_iframe() {
form_iframe.contentWindow.location.reload(true)
}
function resize_iframe_to_fit_content() {
var form_iframe_content = $form_iframe.contents().find('body');
// do first animation smoothly
$form_iframe.animate(
{
height: form_iframe_content.outerHeight(true),
width: 500
},
{
duration: jQuery.resize.delay,
complete: function () {
// iframe's own animations should be instant, as they
// have their own smoothing (e.g. expanding error labels)
function resize_iframe_to_fit_content_immediately() {
$form_iframe.css({
height: form_iframe_content.outerHeight(true),
width:500
})
}
// if the iframe content resizes, resize the iframe
// this depends on Ben Alman's resize plugin
form_iframe_content.bind(
'resize',
resize_iframe_to_fit_content_immediately
)
// when unloading, unbind the resizer (remove poller)
$form_iframe.bind(
'unload',
function () {
form_iframe_content.unbind(
'resize',
resize_iframe_to_fit_content_immediately
)
//iframe_controls.hide()
}
)
// there may have been content changes during animation
// so resize to make sure they are shown.
form_iframe_content.resize()
//iframe_controls.show()
%(on_show)s
}
}
)
}
function iframe_loaded() {
dummy_field.hide()
resize_iframe_to_fit_content()
form_iframe.contentWindow.close_iframe = close_iframe
throbber.hide()
}
$form_iframe.bind('load', iframe_loaded)
function set_object_id() {
// the server must give the iframe the object
// id of the created object for the field
// the iframe must also close itself.
var created_object_representation = form_iframe.contentWindow.created_object_representation
if (created_object_representation) {
dummy_field.val(created_object_representation)
}
var created_object_id = form_iframe.contentWindow.created_object_id
if (created_object_id) {
form_field.val(created_object_id)
close_iframe()
}
}
$form_iframe.bind('load', set_object_id)
add_object_link.hide()
/*
var iframe_controls = $('<span class="iframe_controls" style="float:right; text-align:right;"></span>')
iframe_controls.hide()
var close_button = $('<a>%(Close)s </a>')
close_button.click(close_iframe)
var reload_button = $('<a>%(Reload)s </a>')
reload_button.click(reload_iframe)
iframe_controls.append(close_button)
iframe_controls.append(reload_button)
iframe_controls.insertBefore(add_object_link)
*/
$form_iframe.insertAfter(add_object_link)
}
var add_object_link = $('<a>%(Add)s</a>')
add_object_link.click(request_add_form)
add_object_link.insertAfter(form_field)
})''' % dict(
field_name = field.name,
form_field_name = "_".join((self.table_name, field.name)),
form_url = self.form_url,
dummy_field_selector = self.dummy_field_selector(self.table_name, field.name),
on_show = self.on_show,
on_hide = self.on_hide,
Add = T("Add..."),
Reload = T("Reload"),
Close = T("Close"),
)
)
)
# =============================================================================
class S3SearchAutocompleteWidget(FormWidget):
"""
Uses the s3Search Module
"""
def __init__(self,
tablename,
represent,
get_fieldname = "id",
):
self.get_fieldname = get_fieldname
self.tablename = tablename
self.represent = represent
def __call__(self, field, value, **attributes):
request = current.request
response = current.response
session = current.session
tablename = self.tablename
modulename, resourcename = tablename.split("_", 1)
attributes["is_autocomplete"] = True
attributes["fieldname"] = field.name
attributes["get_fieldname"] = self.get_fieldname
# Display in the simple search widget
if value:
attributes["value"] = self.represent(value)
else:
attributes["value"] = ""
r = s3_request(modulename, resourcename, args=[])
search_div = r.resource.search(r, **attributes)["form"]
hidden_input = INPUT(value = value or "",
requires = field.requires,
_id = "%s_%s" % (tablename, field.name),
_class = "hide hide_input",
_name = field.name,
)
return TAG[""](
search_div,
hidden_input
)
# =============================================================================
class S3TimeIntervalWidget(FormWidget):
"""
Simple time interval widget for the scheduler task table
"""
multipliers = (("weeks", 604800),
("days", 86400),
("hours", 3600),
("minutes", 60),
("seconds", 1))
@staticmethod
def widget(field, value, **attributes):
multipliers = S3TimeIntervalWidget.multipliers
if value is None:
value = 0
if value == 0:
multiplier = 1
else:
for m in multipliers:
multiplier = m[1]
if int(value) % multiplier == 0:
break
options = []
for i in xrange(1, len(multipliers) + 1):
title, opt = multipliers[-i]
if opt == multiplier:
option = OPTION(title, _value=opt, _selected="selected")
else:
option = OPTION(title, _value=opt)
options.append(option)
val = value / multiplier
inp = DIV(INPUT(value = val,
requires = field.requires,
_id = ("%s" % field).replace(".", "_"),
_name = field.name),
SELECT(options,
_name=("%s_multiplier" % field).replace(".", "_")))
return inp
@staticmethod
def represent(value):
multipliers = S3TimeIntervalWidget.multipliers
try:
val = int(value)
except:
val = 0
if val == 0:
multiplier = multipliers[-1]
else:
for m in multipliers:
if val % m[1] == 0:
multiplier = m
break
val = val / multiplier[1]
return "%s %s" % (val, current.T(multiplier[0]))
# =============================================================================
class S3InvBinWidget(FormWidget):
"""
Widget used by S3CRUD to offer the user matching bins where
stock items can be placed
"""
def __init__(self,
tablename,):
self.tablename = tablename
def __call__(self, field, value, **attributes):
T = current.T
request = current.request
s3db = current.s3db
tracktable = s3db.inv_track_item
stocktable = s3db.inv_inv_item
new_div = INPUT(value = value or "",
requires = field.requires,
_id = "i_%s_%s" % (self.tablename, field.name),
_name = field.name,
)
id = None
function = self.tablename[4:]
if len(request.args) > 2:
if request.args[1] == function:
id = request.args[2]
if id is None or tracktable[id] is None:
return TAG[""](
new_div
)
record = tracktable[id]
site_id = s3db.inv_recv[record.recv_id].site_id
query = (stocktable.site_id == site_id) & \
(stocktable.item_id == record.item_id) & \
(stocktable.item_source_no == record.item_source_no) & \
(stocktable.item_pack_id == record.item_pack_id) & \
(stocktable.currency == record.currency) & \
(stocktable.pack_value == record.pack_value) & \
(stocktable.expiry_date == record.expiry_date) & \
(stocktable.supply_org_id == record.supply_org_id)
rows = current.db(query).select(stocktable.bin,
stocktable.id)
if len(rows) == 0:
return TAG[""](
new_div
)
bins = []
for row in rows:
bins.append(OPTION(row.bin))
match_lbl = LABEL(T("Select an existing bin"))
match_div = SELECT(bins,
_id = "%s_%s" % (self.tablename, field.name),
_name = field.name,
)
new_lbl = LABEL(T("...or add a new bin"))
return TAG[""](
match_lbl,
match_div,
new_lbl,
new_div
)
# =============================================================================
class S3EmbedComponentWidget(FormWidget):
"""
Widget used by S3CRUD for link-table components with actuate="embed".
Uses s3.embed_component.js for client-side processing, and
S3CRUD._postprocess_embedded to receive the data.
"""
def __init__(self,
link=None,
component=None,
widget=None,
autocomplete=None,
link_filter=None,
select_existing=True):
self.link = link
self.component = component
self.widget = widget
self.autocomplete = autocomplete
self.select_existing = select_existing
self.link_filter = link_filter
self.post_process = current.s3db.get_config(link,
"post_process",
None)
def __call__(self, field, value, **attributes):
T = current.T
db = current.db
s3db = current.s3db
request = current.request
appname = request.application
s3 = current.response.s3
formstyle = s3.crud.formstyle
ltable = s3db[self.link]
ctable = s3db[self.component]
prefix, resourcename = self.component.split("_", 1)
if field.name in request.post_vars:
selected = request.post_vars[field.name]
else:
selected = None
# Main Input
real_input = str(field).replace(".", "_")
dummy = "dummy_%s" % real_input
default = dict(_type = "text",
value = (value != None and str(value)) or "")
attr = StringWidget._attributes(field, default, **attributes)
attr["_class"] = "hide"
if self.select_existing:
_class ="box_top"
else:
_class = "hide"
# Post-process selection/deselection
if self.post_process is not None:
try:
if self.autocomplete:
pp = self.post_process % real_input
else:
pp = self.post_process % dummy
except:
pp = self.post_process
else:
pp = None
clear = "clear_component_form();"
if pp is not None:
clear = "%s%s" % (clear, pp)
# Select from registry buttons
url = "/%s/%s/%s/" % (appname, prefix, resourcename)
select_row = TR(TD(A(T("Select from registry"),
_href="#",
_id="select_from_registry",
_class="action-btn"),
A(T("Remove selection"),
_href="#",
_onclick=clear,
_id="clear_form_link",
_class="action-btn hide",
_style="padding-left:15px;"),
A(T("Edit Details"),
_href="#",
_onclick="edit_selected_form();",
_id="edit_selected_link",
_class="action-btn hide",
_style="padding-left:15px;"),
IMG(_src="/%s/static/img/ajax-loader.gif" % \
appname,
_height=32,
_width=32,
_id="load_throbber",
_class="throbber hide",
_style="padding-left:85px;"),
_class="w2p_fw"),
TD(),
_id="select_from_registry_row",
_class=_class,
_controller=prefix,
_component=self.component,
_url=url,
_field=real_input,
_value=str(value))
# Autocomplete/Selector
if self.autocomplete:
ac_field = ctable[self.autocomplete]
select = "select_component($('#%s').val());" % real_input
if pp is not None:
select = "%s%s" % (pp, select)
widget = S3AutocompleteWidget(prefix,
resourcename=resourcename,
fieldname=self.autocomplete,
link_filter=self.link_filter,
post_process=select)
ac_row = TR(TD(LABEL("%s: " % ac_field.label,
_class="hide",
_id="component_autocomplete_label"),
widget(field, None, _class="hide")),
TD(),
_id="component_autocomplete_row",
_class="box_top")
else:
select = "select_component($('#%s').val());" % dummy
if pp is not None:
select = "%s%s" % (pp, select)
# @todo: add link_filter here as well
widget = OptionsWidget.widget
ac_row = TR(TD(LABEL("%s: " % field.label,
_class="hide",
_id="component_autocomplete_label"),
widget(field, None, _class="hide",
_id=dummy, _onchange=select)),
TD(INPUT(_id=real_input, _class="hide")),
_id="component_autocomplete_row",
_class="box_top")
# Embedded Form
fields = [f for f in ctable
if (f.writable or f.readable) and not f.compute]
if selected:
# Initialize validators with the correct record ID
for f in fields:
requires = f.requires or []
if not isinstance(requires, (list, tuple)):
requires = [requires]
[r.set_self_id(selected) for r in requires
if hasattr(r, "set_self_id")]
labels, required = s3_mark_required(fields)
if required:
s3.has_required = True
form = SQLFORM.factory(table_name=self.component,
labels=labels,
formstyle=formstyle,
upload="default/download",
separator = "",
*fields)
trs = []
att = "box_middle embedded"
for tr in form[0]:
if not tr.attributes["_id"].startswith("submit_record"):
if "_class" in tr.attributes:
tr.attributes["_class"] = "%s %s" % (tr.attributes["_class"], att)
else:
tr.attributes["_class"] = att
trs.append(tr)
table = DIV(*trs)
# Divider
divider = TR(TD(_class="subheading"), TD(), _class="box_bottom embedded")
# JavaScript
if s3.debug:
script = "s3.embed_component.js"
else:
script = "s3.embed_component.min.js"
s3.scripts.append("/%s/static/scripts/S3/%s" % (appname, script))
# Overall layout of components
return TAG[""](select_row,
ac_row,
table,
divider)
# =============================================================================
def s3_comments_widget(field, value):
"""
A smaller-than-normal textarea
to be used by the s3.comments() Reusable field
"""
return TEXTAREA(_name=field.name,
_id="%s_%s" % (field._tablename, field.name),
_class="comments %s" % (field.type),
value=value,
requires=field.requires)
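# Usage sketch (added illustration; the field definition is hypothetical,
# mirroring how web2py invokes widgets as widget(field, value)):
#
#   Field("comments", "text",
#         label = current.T("Comments"),
#         widget = s3_comments_widget)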
# =============================================================================
def s3_richtext_widget(field, value):
"""
A larger-than-normal textarea to be used by the CMS Post Body field
"""
s3 = current.response.s3
id = "%s_%s" % (field._tablename, field.name)
# Load the scripts
ckeditor = URL(c="static", f="ckeditor", args="ckeditor.js")
s3.scripts.append(ckeditor)
adapter = URL(c="static", f="ckeditor", args=["adapters",
"jquery.js"])
s3.scripts.append(adapter)
# Toolbar options: http://docs.cksource.com/CKEditor_3.x/Developers_Guide/Toolbar
js = '''var ck_config={toolbar:[['Format','Bold','Italic','-','NumberedList','BulletedList','-','Link','Unlink','-','Image','Table','-','PasteFromWord','-','Source','Maximize']],toolbarCanCollapse:false,removePlugins:'elementspath'}'''
s3.js_global.append(js)
js = '''$('#%s').ckeditor(ck_config)''' % id
s3.jquery_ready.append(js)
return TEXTAREA(_name=field.name,
_id=id,
_class="richtext %s" % (field.type),
value=value,
requires=field.requires)
# =============================================================================
def s3_grouped_checkboxes_widget(field,
value,
size = 20,
**attributes):
"""
Displays checkboxes for each value in the table column "field".
If there are more than "size" options, they are grouped by the
first letter of their label.
@type field: Field
@param field: Field (or Storage) object
@type value: dict
@param value: current value from the form field
@type size: int
@param size: number of input elements for each group
Used by S3SearchOptionsWidget
"""
requires = field.requires
if not isinstance(requires, (list, tuple)):
requires = [requires]
if hasattr(requires[0], "options"):
options = requires[0].options()
else:
raise SyntaxError, "widget cannot determine options of %s" \
% field
options = [(k, v) for k, v in options if k != ""]
total = len(options)
if total == 0:
T = current.T
options.append(TR(TD(SPAN(T("no options available"),
_class="no-options-available"),
INPUT(_type="hide",
_name=field.name,
_value=None))))
if total > size:
# Options are put into groups of "size"
import locale
letters = []
letters_options = {}
append = letters.append
for val, label in options:
letter = label
if letter:
letter = s3_unicode(letter).upper()[0]
if letter not in letters_options:
append(letter)
letters_options[letter] = [(val, label)]
else:
letters_options[letter].append((val, label))
widget = DIV(_class=attributes.pop("_class",
"s3-grouped-checkboxes-widget"),
_name = "%s_widget" % field.name)
input_index = 0
group_index = 0
group_options = []
from_letter = u"A"
to_letter = letters[0]
letters.sort(locale.strcoll)
lget = letters_options.get
for letter in letters:
if from_letter is None:
from_letter = letter
group_options += lget(letter, [])
count = len(group_options)
if count >= size or letter == letters[-1]:
if letter == letters[-1]:
to_letter = u"Z"
else:
to_letter = letter
# Are these options for a single letter or a range?
if to_letter != from_letter:
group_label = "%s - %s" % (from_letter, to_letter)
else:
group_label = from_letter
widget.append(DIV(group_label,
_id="%s-group-label-%s" % (field.name,
group_index),
_class="s3-grouped-checkboxes-widget-label expanded"))
group_field = field
# Can give Unicode issues:
#group_field.requires = IS_IN_SET(group_options,
# multiple=True)
letter_widget = s3_checkboxes_widget(group_field,
value,
options = group_options,
start_at_id=input_index,
**attributes)
widget.append(letter_widget)
input_index += count
group_index += 1
group_options = []
from_letter = None
else:
# not enough options to form groups
try:
widget = s3_checkboxes_widget(field, value, **attributes)
except:
# some versions of gluon/sqlhtml.py don't support non-integer keys
if s3_debug:
raise
else:
return None
return widget
# =============================================================================
def s3_checkboxes_widget(field,
value,
options = None,
cols = 1,
start_at_id = 0,
help_field = None,
**attributes):
"""
Display checkboxes for each value in the table column "field".
@type cols: int
@param cols: spread the input elements into "cols" columns
@type start_at_id: int
@param start_at_id: start input element ids at this number
@type help_field: string
@param help_field: field name string pointing to the field
containing help text for each option
"""
values = not isinstance(value, (list, tuple)) and [value] or value
values = [str(v) for v in values]
attributes["_name"] = "%s_widget" % field.name
if "_class" not in attributes:
attributes["_class"] = "s3-checkboxes-widget"
if options is None:
requires = field.requires
if not isinstance(requires, (list, tuple)):
requires = [requires]
if hasattr(requires[0], "options"):
options = requires[0].options()
else:
raise SyntaxError, "widget cannot determine options of %s" % field
help_text = Storage()
if help_field:
ftype = str(field.type)
if ftype[:9] == "reference":
ktablename = ftype[10:]
elif ftype[:14] == "list:reference":
ktablename = ftype[15:]
else:
# not a reference - no expand
# option text = field representation
ktablename = None
if ktablename is not None:
if "." in ktablename:
ktablename, pkey = ktablename.split(".", 1)
else:
pkey = None
ktable = current.s3db[ktablename]
if pkey is None:
pkey = ktable._id.name
lookup_field = help_field
if lookup_field in ktable.fields:
query = ktable[pkey].belongs([k for k, v in options])
rows = current.db(query).select(ktable[pkey],
ktable[lookup_field]
)
for row in rows:
help_text[str(row[ktable[pkey]])] = row[ktable[lookup_field]]
else:
# Error => no comments available
pass
else:
# No lookup table => no comments available
pass
options = [(k, v) for k, v in options if k != ""]
options = sorted(options, key=lambda option: option[1])
input_index = start_at_id
rows = []
count = len(options)
mods = count % cols
num_of_rows = count / cols
if mods:
num_of_rows += 1
for r in range(num_of_rows):
cells = []
for k, v in options[r * cols:(r + 1) * cols]:
input_id = "id-%s-%s" % (field.name, str(input_index))
title = help_text.get(str(k), None)
if title:
label_attr = dict(_title=title)
else:
label_attr = {}
cells.append(TD(INPUT(_type="checkbox",
_name=field.name,
_id=input_id,
hideerror=True,
_value=k,
value=(k in values)),
LABEL(v,
_for=input_id,
**label_attr)))
input_index += 1
rows.append(TR(cells))
if rows:
rows[-1][0][0]["hideerror"] = False
return TABLE(*rows, **attributes)
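# Usage sketch (added illustration; "comments" as the help_field is
# hypothetical and must exist on the referenced table): render a two-column
# checkbox grid with per-option tooltips looked up from the referenced
# records:
#
#   table.skill_id.widget = lambda field, value: \
#       s3_checkboxes_widget(field, value, cols=2, help_field="comments")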
# =============================================================================
class S3SliderWidget(FormWidget):
"""
Standard Slider Widget
@author: Daniel Klischies ([email protected])
@ToDo: The range of the slider should ideally be picked up from the Validator
@ToDo: Show the value of the slider numerically as well as simply a position
"""
def __init__(self,
minval,
maxval,
steprange,
value):
self.minval = minval
self.maxval = maxval
self.steprange = steprange
self.value = value
def __call__(self, field, value, **attributes):
response = current.response
fieldname = str(field).replace(".", "_")
sliderdiv = DIV(_id=fieldname, **attributes)
inputid = "%s_input" % fieldname
localfield = str(field).split(".")
sliderinput = INPUT(_name=localfield[1],
_id=inputid,
_class="hide",
_value=self.value)
response.s3.jquery_ready.append('''S3.slider('%s','%f','%f','%f','%f')''' % \
(fieldname,
self.minval,
self.maxval,
self.steprange,
self.value))
return TAG[""](sliderdiv, sliderinput)
# =============================================================================
class S3OptionsMatrixWidget(FormWidget):
"""
Constructs a two dimensional array/grid of checkboxes
with row and column headers.
"""
def __init__(self, rows, cols):
"""
@type rows: tuple
@param rows:
A tuple of tuples.
The nested tuples will have the row label followed by a value
for each checkbox in that row.
@type cols: tuple
@param cols:
A tuple containing the labels to use in the column headers
"""
self.rows = rows
self.cols = cols
def __call__(self, field, value, **attributes):
"""
Returns the grid/matrix of checkboxes as a web2py TABLE object and
adds references to required Javascript files.
@type field: Field
@param field:
This gets passed in when the widget is rendered or used.
@type value: list
@param value:
A list of the values matching those of the checkboxes.
@param attributes:
HTML attributes to assign to the table.
"""
if isinstance(value, (list, tuple)):
values = [str(v) for v in value]
else:
values = [str(value)]
# Create the table header
header_cells = []
for col in self.cols:
header_cells.append(TH(col, _scope="col"))
header = THEAD(TR(header_cells))
# Create the table body cells
grid_rows = []
for row in self.rows:
# Create a list to hold our table cells
# the first cell will hold the row label
row_cells = [TH(row[0], _scope="row")]
for option in row[1:]:
# This determines if the checkbox should be checked
if option in values:
checked = True
else:
checked = False
row_cells.append(TD(
INPUT(_type="checkbox",
_name=field.name,
_value=option,
value=checked
)
))
grid_rows.append(TR(row_cells))
s3 = current.response.s3
s3.scripts.append("/%s/static/scripts/S3/s3.optionsmatrix.js" % current.request.application)
# If the table has an id attribute, activate the jQuery plugin for it.
if "_id" in attributes:
s3.jquery_ready.append('''$('#{0}').s3optionsmatrix()'''.format(attributes.get("_id")))
return TABLE(header, TBODY(grid_rows), **attributes)
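# Usage sketch (added illustration; the labels and values are hypothetical):
# each nested row tuple is (row label, value, value, ...) matching the
# column headers, and checkboxes whose value appears in the current field
# value are pre-checked:
#
#   widget = S3OptionsMatrixWidget(
#                rows = (("Staff", "staff_read", "staff_write"),
#                        ("Volunteers", "vol_read", "vol_write")),
#                cols = ("Read", "Write"))
#   grid = widget(field, ["staff_read"], _id="acl_matrix")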
# =============================================================================
class S3KeyValueWidget(ListWidget):
"""
Allows for input of key-value pairs and stores them as list:string
"""
def __init__(self, key_label=None, value_label=None):
"""
Returns a widget with key-value fields
"""
self._class = "key-value-pairs"
T = current.T
self.key_label = key_label or T("Key")
self.value_label = value_label or T("Value")
def __call__(self, field, value, **attributes):
T = current.T
s3 = current.response.s3
_id = "%s_%s" % (field._tablename, field.name)
_name = field.name
_class = "text hide"
attributes["_id"] = _id
attributes["_name"] = _name
attributes["_class"] = _class
script = SCRIPT(
'''jQuery(document).ready(function(){jQuery('#%s').kv_pairs('%s','%s')})''' % \
(_id, self.key_label, self.value_label))
if not value: value = "[]"
if not isinstance(value, str):
try:
value = json.dumps(value)
except:
raise("Bad value for key-value pair field")
appname = current.request.application
jsfile = "/%s/static/scripts/S3/%s" % (appname, "s3.keyvalue.widget.js")
if jsfile not in s3.scripts:
s3.scripts.append(jsfile)
return TAG[""](
TEXTAREA(value, **attributes),
script
)
@staticmethod
def represent(value):
if isinstance(value, str):
try:
value = json.loads(value)
if isinstance(value, str):
raise ValueError("key-value JSON is wrong.")
except:
# XXX: log this!
#raise ValueError("Bad json was found as value for a key-value field: %s" % value)
return ""
rep = []
if isinstance(value, (tuple, list)):
for kv in value:
rep += ["%s: %s" % (kv["key"], kv["value"])]
return ", ".join(rep)
# END =========================================================================
|
In the centre of the town of Sherborne, with its roots dating back to the 8th Century, the School was re-founded in 1550 by King Edward VI. The original buildings have been sympathetically restored and refurbished and provide a charming and peaceful ambience that is ideal for music-making.
Accommodation houses are within the precincts of the main school courtyard or in nearby locations. The main street of Sherborne, Cheap Street, flanks the east side of the courtyard, and tempts visitors to the small ale houses, antique shops, tea rooms and designer fashion houses. |
"""
This example uses the built in template matching. The easiest way
to think of this is if you played the card matching game, the cards would
pretty much have to be identical. The template method doesn't allow much
for the scale to change, nor for rotation. This is the most basic pattern
matching SimpleCV offers. If you are looking for something more complex
you will probably want to look into img.find()
"""
print __doc__
import time
from simplecv.api import Image, Color, TemplateMatch
source = Image("templatetest.png", sample=True) # the image to search
template = Image("template.png", sample=True) # the template to search the image for
t = 5
methods = ["SQR_DIFF", "SQR_DIFF_NORM", "CCOEFF",
"CCOEFF_NORM", "CCORR", "CCORR_NORM"] # the various types of template matching available
for m in methods:
img = Image("templatetest.png", sample=True)
img.dl().text("current method: {}".format(m), (10, 20), color=Color.RED)
fs = source.find(TemplateMatch, template, threshold=t, method=m)
for match in fs:
img.dl().rectangle((match.x, match.y), (match.width, match.height), color=Color.RED)
img.apply_layers().show()
time.sleep(3)
|
These ready-to-run cars feature injection-molded bodies with separately applied metal grab irons and etched-metal roofwalks. Cars include trucks and brass wheelsets, weights and Kadee(R) #5 magnetic knuckle couplers. Covered hoppers feature etched-metal roofwalks. |
# coding: utf-8
# <img style='float: left' width="150px" src="http://bostonlightswim.org/wp/wp-content/uploads/2011/08/BLS-front_4-color.jpg">
# <br><br>
#
# ## [The Boston Light Swim](http://bostonlightswim.org/)
#
# ### Sea Surface Temperature time-series maps
# ### Load configuration
# In[8]:
import os
try:
import cPickle as pickle
except ImportError:
import pickle
run_name = '2015-08-17'
fname = os.path.join(run_name, 'config.pkl')
with open(fname, 'rb') as f:
config = pickle.load(f)
# ### Load skill_score
# In[9]:
try:
import cPickle as pickle
except ImportError:
import pickle
fname = os.path.join(run_name, 'skill_score.pkl')
with open(fname, 'rb') as f:
skill_score = pickle.load(f)
# In[10]:
import numpy as np
from glob import glob
from pandas import Panel
from utilities import nc2df
def load_ncs(run_name):
    # Format template for the netCDF file names: "<run_name>-<suffix>.nc".
    fname = '{}-{}.nc'.format
ALL_OBS_DATA = nc2df(os.path.join(run_name,
fname(run_name, 'OBS_DATA')))
index = ALL_OBS_DATA.index
dfs = dict(OBS_DATA=ALL_OBS_DATA)
for fname in glob(os.path.join(run_name, "*.nc")):
if 'OBS_DATA' in fname:
continue
else:
model = fname.split('.')[0].split('-')[-1]
df = nc2df(fname)
# FIXME: Horrible work around duplicate times.
if len(df.index.values) != len(np.unique(df.index.values)):
kw = dict(subset='index', take_last=True)
df = df.reset_index().drop_duplicates(**kw).set_index('index')
kw = dict(method='time', limit=30)
df = df.reindex(index).interpolate(**kw).ix[index]
dfs.update({model: df})
return Panel.fromDict(dfs).swapaxes(0, 2)
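# Usage sketch (the station id is hypothetical): after the swapaxes call the
# Panel is indexed as (station, time, model), e.g.
#   dfs = load_ncs('2015-08-17')
#   dfs['44013']  # DataFrame of observed + modeled SST for one station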
# In[11]:
from mpld3 import save_html
import matplotlib.pyplot as plt
from mpld3.plugins import LineLabelTooltip, connect
from utilities import make_map
bbox = config['bbox']
units = config['units']
run_name = config['run_name']
kw = dict(zoom_start=12, line=True, states=False, secoora_stations=False)
mapa = make_map(bbox, **kw)
# ### Clusters
# In[12]:
from glob import glob
from operator import itemgetter
import iris
from pandas import DataFrame, read_csv
fname = '{}-all_obs.csv'.format(run_name)
all_obs = read_csv(os.path.join(run_name, fname), index_col='name')
big_list = []
for fname in glob(os.path.join(run_name, "*.nc")):
if 'OBS_DATA' in fname:
continue
cube = iris.load_cube(fname)
model = fname.split('-')[-1].split('.')[0]
lons = cube.coord(axis='X').points
lats = cube.coord(axis='Y').points
stations = cube.coord('station name').points
models = [model]*lons.size
lista = zip(models, lons.tolist(), lats.tolist(), stations.tolist())
big_list.extend(lista)
big_list.sort(key=itemgetter(3))
df = DataFrame(big_list, columns=['name', 'lon', 'lat', 'station'])
df.set_index('station', drop=True, inplace=True)
groups = df.groupby(df.index)
for station, info in groups:
sta_name = all_obs['station'][all_obs['station'].astype(str) == station].index[0]
for lat, lon, name in zip(info.lat, info.lon, info.name):
location = lat, lon
popup = '<b>{}</b>\n{}'.format(sta_name, name)
mapa.simple_marker(location=location, popup=popup,
clustered_marker=True)
# ### Model and observations plots
# In[13]:
mean_bias = skill_score['mean_bias'].applymap('{:.2f}'.format).replace('nan', '--')
skill = skill_score['rmse'].applymap('{:.2f}'.format).replace('nan', '--')
resolution, width, height = 75, 7, 3
def make_plot():
fig, ax = plt.subplots(figsize=(width, height))
ax.set_ylabel('Sea surface Temperature ({})'.format(units))
ax.grid(True)
return fig, ax
dfs = load_ncs(run_name)
dfs = dfs.swapaxes('items', 'major').resample('30min').swapaxes('items', 'major')
for station in dfs:
sta_name = all_obs['station'][all_obs['station'].astype(str) == station].index[0]
df = dfs[station].dropna(axis=1, how='all')
if df.empty:
continue
labels = []
fig, ax = make_plot()
for col in df.columns:
serie = df[col].dropna()
lines = ax.plot(serie.index, serie, label=col,
linewidth=2.5, alpha=0.5)
if 'OBS_DATA' not in col:
text0 = col
text1 = mean_bias[sta_name][col]
text2 = skill[sta_name][col]
tooltip = '{}:\nbias {}\nskill: {}'.format
labels.append(tooltip(text0, text1, text2))
else:
labels.append('OBS_DATA')
kw = dict(loc='upper center', bbox_to_anchor=(0.5, 1.05), numpoints=1,
ncol=2, framealpha=0)
l = ax.legend(**kw)
l.set_title("") # Workaround str(None).
[connect(fig, LineLabelTooltip(line, name))
for line, name in zip(ax.lines, labels)]
html = 'station_{}.html'.format(station)
save_html(fig, '{}/{}'.format(run_name, html))
plt.close(fig)
popup = "<div align='center'> {} <br><iframe src='{}' alt='image'"
popup += "width='{}px' height='{}px' frameBorder='0'></div>"
popup = popup.format('{}'.format(sta_name), html,
(width*resolution)+75, (height*resolution)+50)
kw = dict(popup=popup, width=(width*resolution)+75)
if (df.columns == 'OBS_DATA').all():
kw.update(dict(marker_color="blue", marker_icon="ok"))
else:
kw.update(dict(marker_color="green", marker_icon="ok"))
obs = all_obs[all_obs['station'].astype(str) == station].squeeze()
mapa.simple_marker(location=[obs['lat'], obs['lon']], **kw)
# ### Map
# In[14]:
from utilities import inline_map
mapa.create_map(path=os.path.join(run_name, 'mapa.html'))
inline_map(os.path.join(run_name, 'mapa.html'))
|
Sophia Bush is furious that an obsessed fan managed to get her phone number through her co-worker.
The 35-year-old actress took to her Twitter account to rant about the alleged incident, which took place on Saturday (13.10.18), when someone phoned one of her colleagues late at night in a bid to get in touch with her.
She fumed: "Dear Humans who manage to get phone numbers for my friends, my agents, my coworkers. Don't call them hunting me down. Especially not at 11pm on a Saturday night. It's f***ing rude. Period.
Getting my -- or a # you think is mine -- phone # isn't how we become friends. Back off. (sic)"
The 'One Tree Hill' star says that knowing people are violating her "privacy" causes her to have panic attacks, and a call from a random number makes her afraid to leave the house.
She added: "What happens when my privacy or the privacy of the people closest to me is violated?" she asked. "I start having an anxiety attack. I don't feel safe going outside. Does that sound fun to you? It sure as hell isn't fun to me. Every noise in the house gets scary. Every random number on my phone feels like a threat. I literally don't give a sh*t what someone 'wants.' I don't belong to them. (sic)"
Sophia's 'One Tree Hill' co-star Hilarie Burton responded to her pal's tweet, branding anyone who "hunts" someone they admire a "psycho", and invited her to come and stay on her farm.
She wrote: "If you're a real fan? You don't "hunt" someone you admire. That's psycho.
I'm sorry babe. Bring your friends and come hide on the farm. I got attack donkeys. (sic)"
Meanwhile, this summer Sophia said she has been looking for her future husband on the subway.
The brunette beauty admitted that she and her close pal have been scouting out the eligible bachelors that ride the metro in New York City, in the hopes of finding one who will get down on bended knee for them to ask for their hand in marriage.
The 'Incredibles 2' star said: "My best friend and I are book nerds and [have been] both deeply single for so long that every day this week we've been like 'Why are there so many hot guys on the subway reading?' Yesterday, literally, I'm looking at this guy who's like so beautiful and he's reading Zadie Smith and I'm like, 'Are you my husband?' It's like the grown-up version of [the children's book] 'Are You My Mother'. You're just like...'Is it you? Is it you? Who is it?'"
Sophia was previously married to her 'One Tree Hill' co-star Chad Michael Murray for just five months in 2005, before they split; their divorce was finalised the following year.
The 'Chicago P.D.' actress recently admitted she felt pressured into marrying Chad, as she claims it isn't something she "wanted" to do.
When asked how long she had been dating Chad before they married, she said: "I don't even know. And it was not a thing I actually really wanted to do."
Sophia then claimed she went through with the marriage because she didn't want to "let everybody down". |
#
# PyDBG
# Copyright (C) 2006 Pedram Amini <[email protected]>
#
# $Id: defines.py 193 2007-04-05 13:30:01Z cameron $
#
# This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public
# License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later
# version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied
# warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with this program; if not, write to the Free
# Software Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#
#
# windows_h.py was generated with:
#
# c:\Python\Lib\site-packages\ctypes\wrap
# c:\python\python h2xml.py windows.h -o windows.xml -q -c
# c:\python\python xml2py.py windows.xml -s DEBUG_EVENT -s CONTEXT -s MEMORY_BASIC_INFORMATION -s LDT_ENTRY \
# -s PROCESS_INFORMATION -s STARTUPINFO -s SYSTEM_INFO -o windows_h.py
#
# Then the import of ctypes was changed at the top of the file to utilize my_ctypes, which adds the necessary changes
# to support the pickle-ing of our defined data structures and ctype primitives.
#
'''
@author: Pedram Amini
@license: GNU General Public License 2.0 or later
@contact: [email protected]
@organization: www.openrce.org
'''
from my_ctypes import *
from windows_h import *
###
### manually declare entities from Tlhelp32.h since i was unable to import using h2xml.py.
###
TH32CS_SNAPHEAPLIST = 0x00000001
TH32CS_SNAPPROCESS = 0x00000002
TH32CS_SNAPTHREAD = 0x00000004
TH32CS_SNAPMODULE = 0x00000008
TH32CS_INHERIT = 0x80000000
TH32CS_SNAPALL = (TH32CS_SNAPHEAPLIST | TH32CS_SNAPPROCESS | TH32CS_SNAPTHREAD | TH32CS_SNAPMODULE)
class THREADENTRY32(Structure):
_fields_ = [
('dwSize', DWORD),
('cntUsage', DWORD),
('th32ThreadID', DWORD),
('th32OwnerProcessID', DWORD),
('tpBasePri', DWORD),
('tpDeltaPri', DWORD),
('dwFlags', DWORD),
]
class PROCESSENTRY32(Structure):
_fields_ = [
('dwSize', DWORD),
('cntUsage', DWORD),
('th32ProcessID', DWORD),
('th32DefaultHeapID', DWORD),
('th32ModuleID', DWORD),
('cntThreads', DWORD),
('th32ParentProcessID', DWORD),
('pcPriClassBase', DWORD),
('dwFlags', DWORD),
('szExeFile', CHAR * 260),
]
class MODULEENTRY32(Structure):
_fields_ = [
("dwSize", DWORD),
("th32ModuleID", DWORD),
("th32ProcessID", DWORD),
("GlblcntUsage", DWORD),
("ProccntUsage", DWORD),
("modBaseAddr", DWORD),
("modBaseSize", DWORD),
("hModule", DWORD),
("szModule", CHAR * 256),
("szExePath", CHAR * 260),
]
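###
### usage sketch (hypothetical; assumes Windows, with ctypes' windll/byref/
### sizeof available via the star imports above) -- enumerate running
### processes with the Tlhelp32 structures and constants declared here:
###
# snapshot = windll.kernel32.CreateToolhelp32Snapshot(TH32CS_SNAPPROCESS, 0)
# pe32 = PROCESSENTRY32()
# pe32.dwSize = sizeof(PROCESSENTRY32)
# if windll.kernel32.Process32First(snapshot, byref(pe32)):
#     while True:
#         print "%5d %s" % (pe32.th32ProcessID, pe32.szExeFile)
#         if not windll.kernel32.Process32Next(snapshot, byref(pe32)):
#             break
# windll.kernel32.CloseHandle(snapshot)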
###
### manually declare various structures as needed.
###
class SYSDBG_MSR(Structure):
_fields_ = [
("Address", c_ulong),
("Data", c_ulonglong),
]
###
### manually declare various #define's as needed.
###
# debug event codes.
EXCEPTION_DEBUG_EVENT = 0x00000001
CREATE_THREAD_DEBUG_EVENT = 0x00000002
CREATE_PROCESS_DEBUG_EVENT = 0x00000003
EXIT_THREAD_DEBUG_EVENT = 0x00000004
EXIT_PROCESS_DEBUG_EVENT = 0x00000005
LOAD_DLL_DEBUG_EVENT = 0x00000006
UNLOAD_DLL_DEBUG_EVENT = 0x00000007
OUTPUT_DEBUG_STRING_EVENT = 0x00000008
RIP_EVENT = 0x00000009
USER_CALLBACK_DEBUG_EVENT = 0xDEADBEEF # added for callback support in debug event loop.
# debug exception codes.
EXCEPTION_ACCESS_VIOLATION = 0xC0000005
EXCEPTION_BREAKPOINT = 0x80000003
EXCEPTION_GUARD_PAGE = 0x80000001
EXCEPTION_SINGLE_STEP = 0x80000004
# hw breakpoint conditions
HW_ACCESS = 0x00000003
HW_EXECUTE = 0x00000000
HW_WRITE = 0x00000001
CONTEXT_CONTROL = 0x00010001
CONTEXT_FULL = 0x00010007
CONTEXT_DEBUG_REGISTERS = 0x00010010
CREATE_NEW_CONSOLE = 0x00000010
DBG_CONTINUE = 0x00010002
DBG_EXCEPTION_NOT_HANDLED = 0x80010001
DBG_EXCEPTION_HANDLED = 0x00010001
DEBUG_PROCESS = 0x00000001
DEBUG_ONLY_THIS_PROCESS = 0x00000002
EFLAGS_RF = 0x00010000
EFLAGS_TRAP = 0x00000100
ERROR_NO_MORE_FILES = 0x00000012
FILE_MAP_READ = 0x00000004
FORMAT_MESSAGE_ALLOCATE_BUFFER = 0x00000100
FORMAT_MESSAGE_FROM_SYSTEM = 0x00001000
INVALID_HANDLE_VALUE = 0xFFFFFFFF
MEM_COMMIT = 0x00001000
MEM_DECOMMIT = 0x00004000
MEM_IMAGE = 0x01000000
MEM_RELEASE = 0x00008000
PAGE_NOACCESS = 0x00000001
PAGE_READONLY = 0x00000002
PAGE_READWRITE = 0x00000004
PAGE_WRITECOPY = 0x00000008
PAGE_EXECUTE = 0x00000010
PAGE_EXECUTE_READ = 0x00000020
PAGE_EXECUTE_READWRITE = 0x00000040
PAGE_EXECUTE_WRITECOPY = 0x00000080
PAGE_GUARD = 0x00000100
PAGE_NOCACHE = 0x00000200
PAGE_WRITECOMBINE = 0x00000400
PROCESS_ALL_ACCESS = 0x001F0FFF
SE_PRIVILEGE_ENABLED = 0x00000002
SW_SHOW = 0x00000005
THREAD_ALL_ACCESS = 0x001F03FF
TOKEN_ADJUST_PRIVILEGES = 0x00000020
# for NtSystemDebugControl()
SysDbgReadMsr = 16
SysDbgWriteMsr = 17 |
97% of the people who have won a Nobel Prize in science in the last century are men: 18 women versus 572 male prizewinners in scientific fields. Otto Hahn received the Nobel Prize in Chemistry for the discovery of nuclear fission, work he shared with his laboratory colleague Lise Meitner. Jocelyn Bell Burnell discovered the first radio signal of a pulsar, which allowed, among other things, the first indirect evidence of gravitational waves to be obtained. Her thesis supervisor, Antony Hewish, took the award. These are just a few examples of the gender bias that exists in science and technology. This problem is not innocuous or limited to a lack of recognition; its consequences are felt not only by the professionals who work in these fields, but by society as a whole.
A study published in the journal PNAS in 2015 found a bias that causes female scientists to receive less funding than their male counterparts. Another study published in Science showed that women are less represented in fields believed to require raw brilliance, a trait less often attributed to women. These and other circumstances have led to only 25% of students enrolled in technical fields of study being women, a percentage that shrinks further among those who go on to work professionally in these fields.
The European Commission warned of this troubling situation in a report as early as 2013. In its study, the EC showed that the entry of more women into digital jobs would increase the GDP of the European Union by 9 billion euros. Among the solutions proposed were empowering women in the sector and facilitating their access to entrepreneurship programs.
For decades, only male mice have been used in tests and research studies, given the possibility that the hormonal fluctuations of females may distort the results. This has created a gap between the proportion of patients who are women and the proportion of female animals used in clinical trials, which means that much less is known about the diseases of women.
In addition, as men have been used for decades as the standard model for medicine, women have been misdiagnosed for pathologies in which they display different symptoms, such as heart disease. There have also been missed opportunities to learn more about exclusively female issues such as pregnancy, menstruation and menopause. In short, for a long time the medical treatment received by women has been worse than that received by men.
“Doing bad research costs money and lives,” says Londa Schiebinger, director of Gendered Innovations, a project that seeks to explain and raise awareness about the impact of gender bias on research. Schiebinger points out that between 1997 and 2000, ten drugs were recalled in the United States for possible fatal health effects. “Eight of these posed greater risks to women’s health than to men’s health,” the US Government Accountability Office (GAO) said in a statement.
Throughout history, women have also suffered major side effects from medications, ranging from cholesterol-lowering drugs to sedatives and tranquilizers. The cause, in many cases, is that the recommended doses had been established from clinical studies focused mostly on males of medium size, not taking into account that the average size of women is smaller, which can make the effects of the drugs stronger or longer lasting.
Another example of how gender bias affects the development of technologies and innovation is car safety belts. “In engineering, men are often considered the norm, and women (and the shortest men) are analysed a posteriori, often from the perspective of how far they deviate from the norm. As a result, many devices are adapted to women in a reactive way,” explains Gendered Innovations in its study.
Thus, the dummies used to study the impact of car accidents take men as the model. The first was developed in 1949 for the US Air Force. As a result, it took until 1996 for the creation of the first pregnant crash-test dummy, designed to study the effect of high-speed impacts on the foetus, the uterus and the placenta. The consequence of this delay in research is that “conventional belts do not properly fit pregnant women and motor accidents are the main cause of death of the foetus related to trauma to the mother,” according to this project.
The incorporation of women into management positions has a positive impact on investors' earnings in the medium and long term. A European Commission report argues that organizations that are more inclusive of women in management “achieve 35% higher profitability and 34% more return for shareholders than other comparable companies.” These considerations are part of a larger picture: the underutilization of the female labour force is a powerful economic burden.
The International Monetary Fund contends that the effective incorporation of female talent into the increasingly technological labour market would bring 5% growth for the United States, 9% for Japan, 12% for the United Arab Emirates and reach 34% in countries like Egypt. |
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import tempfile
import json
import paramiko
from Crypto.PublicKey import RSA
import novaclient.exceptions as novaexception
from heat.common import exception
from heat.openstack.common import log as logging
from heat.engine import scheduler
from heat.engine.resources import instance
from heat.engine.resources import nova_utils
from heat.db.sqlalchemy import api as db_api
from . import rackspace_resource # noqa
logger = logging.getLogger(__name__)
class CloudServer(instance.Instance):
"""Resource for Rackspace Cloud Servers."""
properties_schema = {'flavor': {'Type': 'String', 'Required': True,
'UpdateAllowed': True},
'image': {'Type': 'String', 'Required': True},
'user_data': {'Type': 'String'},
'key_name': {'Type': 'String'},
'Volumes': {'Type': 'List'},
'name': {'Type': 'String'}}
attributes_schema = {'PrivateDnsName': ('Private DNS name of the specified'
' instance.'),
'PublicDnsName': ('Public DNS name of the specified '
'instance.'),
'PrivateIp': ('Private IP address of the specified '
'instance.'),
'PublicIp': ('Public IP address of the specified '
'instance.')}
base_script = """#!/bin/bash
# Install cloud-init and heat-cfntools
%s
# Create data source for cloud-init
mkdir -p /var/lib/cloud/seed/nocloud-net
mv /tmp/userdata /var/lib/cloud/seed/nocloud-net/user-data
touch /var/lib/cloud/seed/nocloud-net/meta-data
chmod 600 /var/lib/cloud/seed/nocloud-net/*
# Run cloud-init & cfn-init
cloud-init start || cloud-init init
bash -x /var/lib/cloud/data/cfn-userdata > /root/cfn-userdata.log 2>&1 ||
exit 42
"""
# - Ubuntu 12.04: Verified working
ubuntu_script = base_script % """\
apt-get update
export DEBIAN_FRONTEND=noninteractive
apt-get install -y -o Dpkg::Options::="--force-confdef" -o \
Dpkg::Options::="--force-confold" cloud-init python-boto python-pip gcc \
python-dev
pip install heat-cfntools
cfn-create-aws-symlinks --source /usr/local/bin
"""
# - Fedora 17: Verified working
# - Fedora 18: Not working. selinux needs to be in "Permissive"
# mode for cloud-init to work. It's disabled by default in the
# Rackspace Cloud Servers image. To enable selinux, a reboot is
# required.
# - Fedora 19: Verified working
fedora_script = base_script % """\
yum install -y cloud-init python-boto python-pip gcc python-devel
pip-python install heat-cfntools
cfn-create-aws-symlinks
"""
# - Centos 6.4: Verified working
centos_script = base_script % """\
rpm -ivh http://mirror.rackspace.com/epel/6/i386/epel-release-6-8.noarch.rpm
yum install -y cloud-init python-boto python-pip gcc python-devel \
python-argparse
pip-python install heat-cfntools
"""
# - RHEL 6.4: Verified working
rhel_script = base_script % """\
rpm -ivh http://mirror.rackspace.com/epel/6/i386/epel-release-6-8.noarch.rpm
# The RPM DB stays locked for a few secs
while fuser /var/lib/rpm/*; do sleep 1; done
yum install -y cloud-init python-boto python-pip gcc python-devel \
python-argparse
pip-python install heat-cfntools
cfn-create-aws-symlinks
"""
debian_script = base_script % """\
echo "deb http://mirror.rackspace.com/debian wheezy-backports main" >> \
/etc/apt/sources.list
apt-get update
apt-get -t wheezy-backports install -y cloud-init
export DEBIAN_FRONTEND=noninteractive
apt-get install -y -o Dpkg::Options::="--force-confdef" -o \
Dpkg::Options::="--force-confold" python-pip gcc python-dev
pip install heat-cfntools
"""
# - Arch 2013.6: Not working (deps not in default package repos)
# TODO(jason): Install cloud-init & other deps from third-party repos
arch_script = base_script % """\
pacman -S --noconfirm python-pip gcc
"""
# - Gentoo 13.2: Not working (deps not in default package repos)
# TODO(jason): Install cloud-init & other deps from third-party repos
gentoo_script = base_script % """\
emerge cloud-init python-boto python-pip gcc python-devel
"""
# - OpenSUSE 12.3: Not working (deps not in default package repos)
# TODO(jason): Install cloud-init & other deps from third-party repos
opensuse_script = base_script % """\
zypper --non-interactive rm patterns-openSUSE-minimal_base-conflicts
zypper --non-interactive in cloud-init python-boto python-pip gcc python-devel
"""
# List of supported Linux distros and their corresponding config scripts
image_scripts = {'arch': None,
'centos': centos_script,
'debian': None,
'fedora': fedora_script,
'gentoo': None,
'opensuse': None,
'rhel': rhel_script,
'ubuntu': ubuntu_script}
script_error_msg = ("The %(path)s script exited with a non-zero exit "
"status. To see the error message, log into the "
"server and view %(log)s")
# Template keys supported for handle_update. Properties not
# listed here trigger an UpdateReplace
update_allowed_keys = ('Metadata', 'Properties')
def __init__(self, name, json_snippet, stack):
super(CloudServer, self).__init__(name, json_snippet, stack)
self._private_key = None
self._server = None
self._distro = None
self._public_ip = None
self._private_ip = None
self._flavor = None
self._image = None
self.rs = rackspace_resource.RackspaceResource(name,
json_snippet,
stack)
def physical_resource_name(self):
name = self.properties.get('name')
if name:
return name
return super(CloudServer, self).physical_resource_name()
def nova(self):
return self.rs.nova() # Override the Instance method
def cinder(self):
return self.rs.cinder()
@property
def server(self):
"""Get the Cloud Server object."""
if not self._server:
logger.debug("Calling nova().servers.get()")
self._server = self.nova().servers.get(self.resource_id)
return self._server
@property
def distro(self):
"""Get the Linux distribution for this server."""
if not self._distro:
logger.debug("Calling nova().images.get()")
image_data = self.nova().images.get(self.image)
self._distro = image_data.metadata['os_distro']
return self._distro
@property
def script(self):
"""Get the config script for the Cloud Server image."""
return self.image_scripts[self.distro]
@property
def flavor(self):
"""Get the flavors from the API."""
if not self._flavor:
self._flavor = nova_utils.get_flavor_id(self.nova(),
self.properties['flavor'])
return self._flavor
@property
def image(self):
if not self._image:
self._image = nova_utils.get_image_id(self.nova(),
self.properties['image'])
return self._image
@property
def private_key(self):
"""Return the private SSH key for the resource."""
if self._private_key:
return self._private_key
if self.id is not None:
private_key = db_api.resource_data_get(self, 'private_key')
if not private_key:
return None
self._private_key = private_key
return private_key
@private_key.setter
def private_key(self, private_key):
"""Save the resource's private SSH key to the database."""
self._private_key = private_key
if self.id is not None:
db_api.resource_data_set(self, 'private_key', private_key, True)
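    # Illustrative shape of self.server.addresses (hypothetical values):
    #   {"public": [{"version": 4, "addr": "203.0.113.10"},
    #               {"version": 6, "addr": "2001:db8::10"}],
    #    "private": [{"version": 4, "addr": "10.0.0.5"}]}
    # _get_ip() below scans the requested network for the first IPv4 entry.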
def _get_ip(self, ip_type):
"""Return the IP of the Cloud Server."""
if ip_type in self.server.addresses:
for ip in self.server.addresses[ip_type]:
if ip['version'] == 4:
return ip['addr']
raise exception.Error("Could not determine the %s IP of %s." %
(ip_type, self.properties['image']))
@property
def public_ip(self):
"""Return the public IP of the Cloud Server."""
if not self._public_ip:
self._public_ip = self._get_ip('public')
return self._public_ip
@property
def private_ip(self):
"""Return the private IP of the Cloud Server."""
if not self._private_ip:
self._private_ip = self._get_ip('private')
return self._private_ip
@property
def has_userdata(self):
        return bool(self.properties['user_data']) or self.metadata != {}
def validate(self):
"""Validate user parameters."""
self.flavor
self.image
# It's okay if there's no script, as long as user_data and
# metadata are empty
if not self.script and self.has_userdata:
return {'Error': "user_data/metadata are not supported for image"
" %s." % self.properties['image']}
def _run_ssh_command(self, command):
"""Run a shell command on the Cloud Server via SSH."""
with tempfile.NamedTemporaryFile() as private_key_file:
private_key_file.write(self.private_key)
private_key_file.seek(0)
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.MissingHostKeyPolicy())
ssh.connect(self.public_ip,
username="root",
key_filename=private_key_file.name)
chan = ssh.get_transport().open_session()
chan.exec_command(command)
return chan.recv_exit_status()
def _sftp_files(self, files):
"""Transfer files to the Cloud Server via SFTP."""
with tempfile.NamedTemporaryFile() as private_key_file:
private_key_file.write(self.private_key)
private_key_file.seek(0)
pkey = paramiko.RSAKey.from_private_key_file(private_key_file.name)
transport = paramiko.Transport((self.public_ip, 22))
transport.connect(hostkey=None, username="root", pkey=pkey)
sftp = paramiko.SFTPClient.from_transport(transport)
for remote_file in files:
sftp_file = sftp.open(remote_file['path'], 'w')
sftp_file.write(remote_file['data'])
sftp_file.close()
def handle_create(self):
"""Create a Rackspace Cloud Servers container.
Rackspace Cloud Servers does not have the metadata service
running, so we have to transfer the user-data file to the
server and then trigger cloud-init.
"""
# Generate SSH public/private keypair
if self._private_key is not None:
rsa = RSA.importKey(self._private_key)
else:
rsa = RSA.generate(1024)
self.private_key = rsa.exportKey()
public_keys = [rsa.publickey().exportKey('OpenSSH')]
if self.properties.get('key_name'):
key_name = self.properties['key_name']
public_keys.append(nova_utils.get_keypair(self.nova(),
key_name).public_key)
personality_files = {
"/root/.ssh/authorized_keys": '\n'.join(public_keys)}
# Create server
client = self.nova().servers
logger.debug("Calling nova().servers.create()")
server = client.create(self.physical_resource_name(),
self.image,
self.flavor,
files=personality_files)
# Save resource ID to db
self.resource_id_set(server.id)
return server, scheduler.TaskRunner(self._attach_volumes_task())
def _attach_volumes_task(self):
tasks = (scheduler.TaskRunner(self._attach_volume, volume_id, device)
for volume_id, device in self.volumes())
return scheduler.PollingTaskGroup(tasks)
def _attach_volume(self, volume_id, device):
logger.debug("Calling nova().volumes.create_server_volume()")
self.nova().volumes.create_server_volume(self.server.id,
volume_id,
device or None)
yield
volume = self.cinder().get(volume_id)
while volume.status in ('available', 'attaching'):
yield
volume.get()
if volume.status != 'in-use':
raise exception.Error(volume.status)
def _detach_volumes_task(self):
tasks = (scheduler.TaskRunner(self._detach_volume, volume_id)
for volume_id, device in self.volumes())
return scheduler.PollingTaskGroup(tasks)
def _detach_volume(self, volume_id):
volume = self.cinder().get(volume_id)
volume.detach()
yield
while volume.status in ('in-use', 'detaching'):
yield
volume.get()
if volume.status != 'available':
raise exception.Error(volume.status)
def check_create_complete(self, cookie):
"""Check if server creation is complete and handle server configs."""
if not self._check_active(cookie):
return False
if self.has_userdata:
# Create heat-script and userdata files on server
raw_userdata = self.properties['user_data'] or ''
userdata = nova_utils.build_userdata(self, raw_userdata)
files = [{'path': "/tmp/userdata", 'data': userdata},
{'path': "/root/heat-script.sh", 'data': self.script}]
self._sftp_files(files)
# Connect via SSH and run script
cmd = "bash -ex /root/heat-script.sh > /root/heat-script.log 2>&1"
exit_code = self._run_ssh_command(cmd)
if exit_code == 42:
raise exception.Error(self.script_error_msg %
{'path': "cfn-userdata",
'log': "/root/cfn-userdata.log"})
elif exit_code != 0:
raise exception.Error(self.script_error_msg %
{'path': "heat-script.sh",
'log': "/root/heat-script.log"})
return True
# TODO(jason): Make this consistent with Instance and inherit
def _delete_server(self, server):
"""Return a coroutine that deletes the Cloud Server."""
server.delete()
while True:
yield
try:
server.get()
if server.status == "DELETED":
break
elif server.status == "ERROR":
raise exception.Error("Deletion of server %s failed." %
server.name)
except novaexception.NotFound:
break
def handle_update(self, json_snippet, tmpl_diff, prop_diff):
"""Try to update a Cloud Server's parameters.
If the Cloud Server's Metadata or flavor changed, update the
Cloud Server. If any other parameters changed, re-create the
Cloud Server with the new parameters.
"""
if 'Metadata' in tmpl_diff:
self.metadata = json_snippet['Metadata']
metadata_string = json.dumps(self.metadata)
files = [{'path': "/var/cache/heat-cfntools/last_metadata",
'data': metadata_string}]
self._sftp_files(files)
command = "bash -x /var/lib/cloud/data/cfn-userdata > " + \
"/root/cfn-userdata.log 2>&1"
exit_code = self._run_ssh_command(command)
if exit_code != 0:
raise exception.Error(self.script_error_msg %
{'path': "cfn-userdata",
'log': "/root/cfn-userdata.log"})
if 'flavor' in prop_diff:
flav = json_snippet['Properties']['flavor']
new_flavor = nova_utils.get_flavor_id(self.nova(), flav)
self.server.resize(new_flavor)
resize = scheduler.TaskRunner(nova_utils.check_resize,
self.server,
flav)
resize.start()
return resize
def _resolve_attribute(self, key):
"""Return the method that provides a given template attribute."""
attribute_function = {'PublicIp': self.public_ip,
'PrivateIp': self.private_ip,
'PublicDnsName': self.public_ip,
'PrivateDnsName': self.public_ip}
if key not in attribute_function:
raise exception.InvalidTemplateAttribute(resource=self.name,
key=key)
function = attribute_function[key]
logger.info('%s._resolve_attribute(%s) == %s'
% (self.name, key, function))
return unicode(function)
# The pyrax module is required to work with the Rackspace cloud server
# provider. If it is not installed, don't register the provider.
def resource_mapping():
if rackspace_resource.PYRAX_INSTALLED:
return {'Rackspace::Cloud::Server': CloudServer}
else:
return {}
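# Usage sketch (hypothetical template snippet): once registered, the resource
# can be declared in a CFN-style template, e.g.
#
#   "Resources": {
#       "MyServer": {
#           "Type": "Rackspace::Cloud::Server",
#           "Properties": {
#               "name": "web01",
#               "flavor": "2",
#               "image": "Ubuntu 12.04",
#               "key_name": "my-keypair"
#           }
#       }
#   }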
|
A wall plate is a decorative detail worthy of a little extra attention. Make it part of the total design scheme.
Elegant and eye catching, Susan Goldstick light switch covers add style and beauty with lush colors and a hint of shimmer from metal and crystal accents.
Created to coordinate with our decorative hardware and furnishings, all our switch covers are available in multiple configurations with matching painted screws. |
# -*- coding: utf-8 -*-
##############################################################################
#
# Copyright (C) 2015 Eficent (<http://www.eficent.com/>)
# <[email protected]>
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
##############################################################################
from openerp import api, fields, models, _, exceptions
from datetime import datetime
from dateutil.relativedelta import relativedelta
_PURCHASE_ORDER_LINE_STATE = [
('none', 'No Purchase'),
('draft', 'RFQ'),
('confirmed', 'Confirmed'),
('done', 'Done'),
('cancel', 'Cancelled')
]
class PurchaseRequestLine(models.Model):
_inherit = "purchase.request.line"
@api.one
@api.depends('purchase_lines')
def _get_is_editable(self):
super(PurchaseRequestLine, self)._get_is_editable()
if self.purchase_lines:
self.is_editable = False
@api.one
def _purchased_qty(self):
purchased_qty = 0.0
for purchase_line in self.purchase_lines:
if purchase_line.state != 'cancel':
purchased_qty += purchase_line.product_qty
self.purchased_qty = purchased_qty
@api.one
@api.depends('purchase_lines.state')
def _get_purchase_state(self):
self.purchase_state = 'none'
if self.purchase_lines:
if any([po_line.state == 'done' for po_line in
self.purchase_lines]):
self.purchase_state = 'done'
elif all([po_line.state == 'cancel' for po_line in
self.purchase_lines]):
self.purchase_state = 'cancel'
elif any([po_line.state == 'confirmed' for po_line in
self.purchase_lines]):
self.purchase_state = 'confirmed'
elif all([po_line.state in ('draft', 'cancel') for po_line in
self.purchase_lines]):
self.purchase_state = 'draft'
purchased_qty = fields.Float(string='Quantity in RFQ or PO',
compute="_purchased_qty")
purchase_lines = fields.Many2many(
'purchase.order.line', 'purchase_request_purchase_order_line_rel',
'purchase_request_line_id',
'purchase_order_line_id', 'Purchase Order Lines', readonly=True)
purchase_state = fields.Selection(compute="_get_purchase_state",
string="Purchase Status",
selection=_PURCHASE_ORDER_LINE_STATE,
store=True,
default='none')
@api.one
def copy(self, default=None):
if default is None:
default = {}
default.update({
'purchase_lines': [],
})
return super(PurchaseRequestLine, self).copy(default)
@api.model
def _planned_date(self, request_line, delay=0.0):
company = request_line.company_id
date_planned = datetime.strptime(
request_line.date_required, '%Y-%m-%d') - \
relativedelta(days=company.po_lead)
if delay:
date_planned -= relativedelta(days=delay)
return date_planned and date_planned.strftime('%Y-%m-%d') \
or False
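    # Example (hypothetical values): date_required='2015-06-10' with
    # company.po_lead=2 and delay=3.0 yields a planned date of '2015-06-05'.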
@api.model
def _calc_new_qty_price(self, request_line, po_line=None, cancel=False):
uom_obj = self.env['product.uom']
qty = uom_obj._compute_qty(request_line.product_uom_id.id,
request_line.product_qty,
request_line.product_id.uom_po_id.id)
# Make sure we use the minimum quantity of the partner corresponding
# to the PO. This does not apply in case of dropshipping
supplierinfo_min_qty = 0.0
if po_line.order_id.location_id.usage != 'customer':
if po_line.product_id.seller_id.id == \
po_line.order_id.partner_id.id:
supplierinfo_min_qty = po_line.product_id.seller_qty
else:
supplierinfo_obj = self.env['product.supplierinfo']
supplierinfos = supplierinfo_obj.search(
[('name', '=', po_line.order_id.partner_id.id),
('product_tmpl_id', '=',
po_line.product_id.product_tmpl_id.id)])
if supplierinfos:
supplierinfo_min_qty = supplierinfos[0].min_qty
if supplierinfo_min_qty == 0.0:
qty += po_line.product_qty
else:
# Recompute quantity by adding existing running procurements.
for rl in po_line.purchase_request_lines:
qty += uom_obj._compute_qty(rl.product_uom_id.id,
rl.product_qty,
rl.product_id.uom_po_id.id)
qty = max(qty, supplierinfo_min_qty) if qty > 0.0 else 0.0
price = po_line.price_unit
if qty != po_line.product_qty:
pricelist_obj = self.pool['product.pricelist']
pricelist_id = po_line.order_id.partner_id.\
property_product_pricelist_purchase.id
price = pricelist_obj.price_get(
self.env.cr, self.env.uid, [pricelist_id],
request_line.product_id.id, qty,
po_line.order_id.partner_id.id,
{'uom': request_line.product_id.uom_po_id.id})[pricelist_id]
return qty, price
@api.multi
def unlink(self):
for line in self:
if line.purchase_lines:
raise exceptions.Warning(
_('You cannot delete a record that refers to purchase '
'lines!'))
return super(PurchaseRequestLine, self).unlink()
|
Had the honour of being interviewed about my work by the Scottish cashmere designer Rosie Sugden.
As fellow creatives, we discussed our passions, influences and what makes us tick.
To read the full interview click below.
Set amongst the hills of Perthshire sits Ballingtaggart Farm. This unique farm regularly hosts feast nights where previously unacquainted guests dine together. There is an air of community amongst the guests as they feast through the evening.
A sneak peek into a new, year-long project, set in the highlands of Perthshire, Scotland.
2016 has been a fantastic year, here is a cross section of the past year. These stills are from Scotland and the Balkans.
Portraits of the Scottish musician Hannah Fisher.
Bosnia and Herzegovina's capital city Sarajevo has fusion at its core; the city boasts a real variety of religion and culture. It is known for its long and rich history, from WW1 to hosting the 1984 Winter Olympics and, latterly, being at the centre of a civil war. The city still carries the scars of this conflict, both within its walls and in the hearts of the people who lived through it. Sarajevo is now rebuilding its identity and forming into something new, something beautiful.
The isle of Vis is as far as you can get from mainland Croatia. There are two main towns, Vis and Komiza, on this island known for its fishing community as well as its vineyards. The myriad little streets of these towns hold the history of the land. This relatively unspoilt Mediterranean isle has seen much in its past: it spent many years as a Yugoslavian military base, where foreign visitors were not allowed until 1989. Many of its inhabitants have long family ties with the island, going back generations.
Above the shores of Loch Tay, within the Ben Lawers National Nature Reserve, lies a hydro-electric dam. This dam, alongside others in the same scheme, generates enough power to supply about 65,500 homes.
There is a strong feeling of being part of a Bond classic when walking on and around it. The sharp contrast of creation versus created, concrete versus nature. In late spring the snow still clings to the surrounding hills, highlighting the magnificent peaks.
"Used to work in Next doing all the postal orders, took retirement and thought I would take on a wee job... didn't think I would be here 13 years on."
This past year has been filled with unexpected stories and the people they belong to, as well as some magnificent places, both at home in Scotland and further afield.
Here are some of my highlights of the last year. |
# -*- coding: utf-8 -*-
from gluon import current
from s3 import *
from s3layouts import *
try:
from .layouts import *
except ImportError:
pass
import s3menus as default
red_cross_filter = {"organisation_type.name" : "Red Cross / Red Crescent"}
# =============================================================================
class S3MainMenu(default.S3MainMenu):
""" Custom Application Main Menu """
# -------------------------------------------------------------------------
@classmethod
def menu(cls):
""" Compose Menu """
# Modules menus
main_menu = MM()(
cls.menu_modules(),
)
# Additional menus
current.menu.personal = cls.menu_personal()
current.menu.lang = cls.menu_lang()
current.menu.about = cls.menu_about()
current.menu.org = cls.menu_org()
# @todo: restore?
#current.menu.dashboard = cls.menu_dashboard()
return main_menu
# -------------------------------------------------------------------------
@classmethod
def menu_modules(cls):
""" Custom Modules Menu """
T = current.T
auth = current.auth
has_role = auth.s3_has_role
root_org = auth.root_org_name()
system_roles = current.session.s3.system_roles
ADMIN = system_roles.ADMIN
ORG_ADMIN = system_roles.ORG_ADMIN
s3db = current.s3db
s3db.inv_recv_crud_strings()
inv_recv_list = current.response.s3.crud_strings.inv_recv.title_list
use_certs = lambda i: current.deployment_settings.get_hrm_use_certificates()
def hrm(item):
return root_org != "Honduran Red Cross" or \
has_role(ORG_ADMIN)
def inv(item):
return root_org != "Honduran Red Cross" or \
has_role("hn_wh_manager") or \
has_role("hn_national_wh_manager") or \
has_role(ORG_ADMIN)
def basic_warehouse(i):
if root_org == "Honduran Red Cross" and \
not (has_role("hn_national_wh_manager") or \
has_role(ORG_ADMIN)):
# Hide menu entries which user shouldn't need access to
return False
else:
return True
def multi_warehouse(i):
if root_org == "Honduran Red Cross" and \
not (has_role("hn_national_wh_manager") or \
has_role(ORG_ADMIN)):
# Only responsible for 1 warehouse so hide menu entries which should be accessed via Tabs on their warehouse
return False
else:
return True
def vol(item):
return root_org != "Honduran Red Cross" or \
has_role(ORG_ADMIN)
return [
homepage("gis")(
),
homepage("hrm", "org", name=T("Staff"),
vars=dict(group="staff"), check=hrm)(
MM("Staff", c="hrm", f="staff", m="summary"),
MM("Teams", c="hrm", f="group"),
MM("National Societies", c="org", f="organisation",
vars = red_cross_filter),
MM("Offices", c="org", f="office"),
MM("Job Titles", c="hrm", f="job_title"),
#MM("Skill List", c="hrm", f="skill"),
MM("Training Events", c="hrm", f="training_event"),
MM("Training Courses", c="hrm", f="course"),
MM("Certificate List", c="hrm", f="certificate", check=use_certs),
),
homepage("vol", name=T("Volunteers"), check=vol)(
MM("Volunteers", c="vol", f="volunteer", m="summary"),
MM("Teams", c="vol", f="group"),
MM("Volunteer Roles", c="vol", f="job_title"),
MM("Programs", c="vol", f="programme"),
#MM("Skill List", c="vol", f="skill"),
MM("Training Events", c="vol", f="training_event"),
MM("Training Courses", c="vol", f="course"),
MM("Certificate List", c="vol", f="certificate", check=use_certs),
),
#homepage("member")(
# MM("Members", c="member", f="membership", m="summary"),
#),
homepage("inv", "supply", "req", check=inv)(
MM("Warehouses", c="inv", f="warehouse", m="summary", check=multi_warehouse),
MM(inv_recv_list, c="inv", f="recv", check=multi_warehouse),
MM("Sent Shipments", c="inv", f="send", check=multi_warehouse),
MM("Items", c="supply", f="item", check=basic_warehouse),
MM("Catalogs", c="supply", f="catalog", check=basic_warehouse),
#MM("Item Categories", c="supply", f="item_category"),
M("Suppliers", c="inv", f="supplier", check=basic_warehouse)(),
M("Facilities", c="inv", f="facility", check=basic_warehouse)(),
M("Requests", c="req", f="req")(),
#M("Commitments", f="commit")(),
),
#homepage("asset")(
# MM("Assets", c="asset", f="asset", m="summary"),
# MM("Items", c="asset", f="item", m="summary"),
#),
#homepage("survey")(
# MM("Assessment Templates", c="survey", f="template"),
# MM("Disaster Assessments", c="survey", f="series"),
#),
homepage("project")(
MM("Projects", c="project", f="project", m="summary"),
MM("Locations", c="project", f="location"),
#MM("Outreach", c="po", f="index", check=outreach),
),
#homepage("vulnerability")(
# MM("Map", c="vulnerability", f="index"),
#),
#homepage("event")(
# MM("Events", c="event", f="event"),
# MM("Incident Reports", c="event", f="incident_report"),
#),
#homepage("deploy", name="RDRT", f="mission", m="summary",
# vars={"~.status__belongs": "2"})(
# MM("Missions", c="deploy", f="mission", m="summary"),
# MM("Members", c="deploy", f="human_resource", m="summary"),
#),
]
# -------------------------------------------------------------------------
@classmethod
def menu_org(cls):
""" Custom Organisation Menu """
OM = S3OrgMenuLayout
return OM()
# -------------------------------------------------------------------------
@classmethod
def menu_lang(cls):
s3 = current.response.s3
# Language selector
menu_lang = ML("Language", right=True)
for language in s3.l10n_languages.items():
code, name = language
menu_lang(
ML(name, translate=False, lang_code=code, lang_name=name)
)
return menu_lang
# -------------------------------------------------------------------------
@classmethod
def menu_personal(cls):
""" Custom Personal Menu """
auth = current.auth
s3 = current.response.s3
settings = current.deployment_settings
if not auth.is_logged_in():
request = current.request
login_next = URL(args=request.args, vars=request.vars)
if request.controller == "default" and \
request.function == "user" and \
"_next" in request.get_vars:
login_next = request.get_vars["_next"]
self_registration = settings.get_security_self_registration()
menu_personal = MP()(
MP("Register", c="default", f="user",
m="register", check=self_registration),
MP("Login", c="default", f="user",
m="login", vars=dict(_next=login_next)),
MP("Lost Password", c="default", f="user",
m="retrieve_password"),
)
else:
s3_has_role = auth.s3_has_role
is_org_admin = lambda i: s3_has_role("ORG_ADMIN") and \
not s3_has_role("ADMIN")
menu_personal = MP()(
MP("Administration", c="admin", f="index",
check=s3_has_role("ADMIN")),
MP("Administration", c="admin", f="user",
check=is_org_admin),
MP("Profile", c="default", f="person"),
MP("Change Password", c="default", f="user",
m="change_password"),
MP("Logout", c="default", f="user",
m="logout"),
)
return menu_personal
# -------------------------------------------------------------------------
@classmethod
def menu_about(cls):
menu_about = MA(c="default")(
MA("About Us", f="about"),
MA("Contact", f="contact"),
MA("Help", f="help"),
MA("Privacy", f="privacy"),
)
return menu_about
# =============================================================================
class S3OptionsMenu(default.S3OptionsMenu):
""" Custom Controller Menus """
# -------------------------------------------------------------------------
def admin(self):
""" ADMIN menu """
# Standard Admin Menu
menu = super(S3OptionsMenu, self).admin()
# Additional Items
menu(M("Map Settings", c="gis", f="config"),
M("Content Management", c="cms", f="index"),
)
return menu
# -------------------------------------------------------------------------
def gis(self):
""" GIS / GIS Controllers """
if current.request.function == "index":
# Empty so as to leave maximum space for the Map
# - functionality accessible via the Admin menu instead
return None
else:
return super(S3OptionsMenu, self).gis()
# -------------------------------------------------------------------------
@staticmethod
def hrm():
""" HRM Human Resource Management """
has_role = current.auth.s3_has_role
s3 = current.session.s3
ADMIN = s3.system_roles.ADMIN
settings = current.deployment_settings
if "hrm" not in s3:
current.s3db.hrm_vars()
hrm_vars = s3.hrm
SECTORS = "Clusters" if settings.get_ui_label_cluster() \
else "Sectors"
manager_mode = lambda i: hrm_vars.mode is None
personal_mode = lambda i: hrm_vars.mode is not None
is_org_admin = lambda i: hrm_vars.orgs and True or \
has_role(ADMIN)
is_super_editor = lambda i: has_role("staff_super") or \
has_role("vol_super")
staff = {"group": "staff"}
use_certs = lambda i: settings.get_hrm_use_certificates()
return M()(
M("Staff", c="hrm", f=("staff", "person"), m="summary",
check=manager_mode)(
M("Create", m="create"),
M("Import", f="person", m="import",
vars=staff, p="create"),
),
M("Staff & Volunteers (Combined)",
c="hrm", f="human_resource", m="summary",
check=[manager_mode, is_super_editor]),
M("Teams", c="hrm", f="group",
check=manager_mode)(
M("Create", m="create"),
M("Search Members", f="group_membership"),
M("Import", f="group_membership", m="import"),
),
M("National Societies", c="org",
f="organisation",
vars=red_cross_filter,
check=manager_mode)(
M("Create", m="create",
vars=red_cross_filter
),
M("Import", m="import", p="create", check=is_org_admin)
),
M("Offices", c="org", f="office",
check=manager_mode)(
M("Create", m="create"),
M("Import", m="import", p="create"),
),
M("Department Catalog", c="hrm", f="department",
check=manager_mode)(
M("Create", m="create"),
),
M("Job Title Catalog", c="hrm", f="job_title",
check=manager_mode)(
M("Create", m="create"),
M("Import", m="import", p="create", check=is_org_admin),
),
#M("Skill Catalog", f="skill",
# check=manager_mode)(
# M("Create", m="create"),
# #M("Skill Provisions", f="skill_provision"),
#),
M("Training Events", c="hrm", f="training_event",
check=manager_mode)(
M("Create", m="create"),
M("Search Training Participants", f="training"),
M("Import Participant List", f="training", m="import"),
),
M("Reports", c="hrm", f="staff", m="report",
check=manager_mode)(
M("Staff Report", m="report"),
M("Expiring Staff Contracts Report",
vars=dict(expiring="1")),
M("Training Report", f="training", m="report"),
),
M("Training Course Catalog", c="hrm", f="course",
check=manager_mode)(
M("Create", m="create"),
M("Import", m="import", p="create", check=is_org_admin),
M("Course Certificates", f="course_certificate"),
),
M("Certificate Catalog", c="hrm", f="certificate",
check=[manager_mode, use_certs])(
M("Create", m="create"),
M("Import", m="import", p="create", check=is_org_admin),
#M("Skill Equivalence", f="certificate_skill"),
),
M("Organization Types", c="org", f="organisation_type",
restrict=[ADMIN],
check=manager_mode)(
M("Create", m="create"),
),
M("Office Types", c="org", f="office_type",
restrict=[ADMIN],
check=manager_mode)(
M("Create", m="create"),
),
#M("Facility Types", c="org", f="facility_type",
# restrict=[ADMIN],
# check=manager_mode)(
# M("Create", m="create"),
#),
#M("My Profile", c="hrm", f="person",
# check=personal_mode, vars=dict(access="personal")),
# This provides the link to switch to the manager mode:
M("Human Resources", c="hrm", f="index",
check=[personal_mode, is_org_admin]),
# This provides the link to switch to the personal mode:
#M("Personal Profile", c="hrm", f="person",
# check=manager_mode, vars=dict(access="personal"))
)
# -------------------------------------------------------------------------
def org(self):
""" Organisation Management """
# Same as HRM
return self.hrm()
# -------------------------------------------------------------------------
@staticmethod
def vol():
""" Volunteer Management """
auth = current.auth
has_role = auth.s3_has_role
s3 = current.session.s3
ADMIN = s3.system_roles.ADMIN
root_org = auth.root_org_name()
# Custom conditions for the check-hook, as lambdas in order
# to have them checked only immediately before rendering:
manager_mode = lambda i: s3.hrm.mode is None
personal_mode = lambda i: s3.hrm.mode is not None
is_org_admin = lambda i: s3.hrm.orgs and True or \
has_role(ADMIN)
is_super_editor = lambda i: has_role("vol_super") or \
has_role("staff_super")
settings = current.deployment_settings
use_certs = lambda i: settings.get_hrm_use_certificates()
show_programmes = lambda i: settings.get_hrm_vol_experience() == "programme"
show_tasks = lambda i: settings.has_module("project") and \
settings.get_project_mode_task()
teams = settings.get_hrm_teams()
use_teams = lambda i: teams
return M(c="vol")(
M("Volunteers", f="volunteer", m="summary",
check=[manager_mode])(
M("Create", m="create"),
M("Import", f="person", m="import",
vars={"group":"volunteer"}, p="create"),
),
M("Staff & Volunteers (Combined)",
c="vol", f="human_resource", m="summary",
check=[manager_mode, is_super_editor]),
M(teams, f="group",
check=[manager_mode, use_teams])(
M("Create", m="create"),
M("Search Members", f="group_membership"),
M("Import", f="group_membership", m="import"),
),
#M("Department Catalog", f="department",
# check=manager_mode)(
# M("Create", m="create"),
#),
M("Volunteer Role Catalog", f="job_title",
check=[manager_mode])(
M("Create", m="create"),
M("Import", m="import", p="create", check=is_org_admin),
),
#M("Skill Catalog", f="skill",
# check=[manager_mode])(
# M("Create", m="create"),
# #M("Skill Provisions", f="skill_provision"),
#),
M("Training Events", f="training_event",
check=manager_mode)(
M("Create", m="create"),
M("Search Training Participants", f="training"),
M("Import Participant List", f="training", m="import"),
),
M("Training Course Catalog", f="course",
check=manager_mode)(
M("Create", m="create"),
#M("Course Certificates", f="course_certificate"),
),
M("Certificate Catalog", f="certificate",
check=[manager_mode, use_certs])(
M("Create", m="create"),
#M("Skill Equivalence", f="certificate_skill"),
),
M("Programs", f="programme",
check=[manager_mode, show_programmes])(
M("Create", m="create"),
M("Import Hours", f="programme_hours", m="import"),
),
M("Awards", f="award",
check=[manager_mode, is_org_admin])(
M("Create", m="create"),
),
M("Reports", f="volunteer", m="report",
check=manager_mode)(
M("Volunteer Report", m="report"),
M("Hours by Role Report", f="programme_hours", m="report",
vars=Storage(rows="job_title_id",
cols="month",
fact="sum(hours)"),
check=show_programmes),
M("Hours by Program Report", f="programme_hours", m="report",
vars=Storage(rows="programme_id",
cols="month",
fact="sum(hours)"),
check=show_programmes),
M("Training Report", f="training", m="report"),
),
#M("My Profile", f="person",
# check=personal_mode, vars=dict(access="personal")),
M("My Tasks", f="task",
check=[personal_mode, show_tasks],
vars=dict(access="personal",
mine=1)),
# This provides the link to switch to the manager mode:
M("Volunteer Management", f="index",
check=[personal_mode, is_org_admin]),
# This provides the link to switch to the personal mode:
#M("Personal Profile", f="person",
# check=manager_mode, vars=dict(access="personal"))
)
# -------------------------------------------------------------------------
@staticmethod
def inv():
""" INV / Inventory """
auth = current.auth
has_role = auth.s3_has_role
system_roles = current.session.s3.system_roles
ADMIN = system_roles.ADMIN
ORG_ADMIN = system_roles.ORG_ADMIN
s3db = current.s3db
s3db.inv_recv_crud_strings()
inv_recv_list = current.response.s3.crud_strings.inv_recv.title_list
settings = current.deployment_settings
#use_adjust = lambda i: not settings.get_inv_direct_stock_edits()
root_org = auth.root_org_name()
def use_adjust(i):
if root_org in ("Australian Red Cross", "Honduran Red Cross"):
# Australian & Honduran RC use proper Logistics workflow
return True
else:
# Others use simplified version
return False
#def use_facilities(i):
# if root_org == "Honduran Red Cross":
# # Honduran RC don't use Facilities
# return False
# else:
# return True
def basic_warehouse(i):
if root_org == "Honduran Red Cross" and \
not (has_role("hn_national_wh_manager") or \
has_role(ORG_ADMIN)):
# Hide menu entries which user shouldn't need access to
return False
else:
return True
def multi_warehouse(i):
if root_org == "Honduran Red Cross" and \
not (has_role("hn_national_wh_manager") or \
has_role(ORG_ADMIN)):
# Only responsible for 1 warehouse so hide menu entries which should be accessed via Tabs on their warehouse
                # & other things that HNRC shouldn't need access to
return False
else:
return True
def use_kits(i):
if root_org == "Honduran Red Cross":
# Honduran RC use Kits
return True
else:
return False
def use_types(i):
if root_org == "Nepal Red Cross Society":
# Nepal RC use Warehouse Types
return True
else:
return False
use_commit = lambda i: settings.get_req_use_commit()
return M()(
#M("Home", f="index"),
M("Warehouses", c="inv", f="warehouse", m="summary", check=multi_warehouse)(
M("Create", m="create"),
M("Import", m="import", p="create"),
),
M("Warehouse Stock", c="inv", f="inv_item", args="summary")(
M("Search Shipped Items", f="track_item"),
M("Adjust Stock Levels", f="adj", check=use_adjust),
M("Kitting", f="kitting", check=use_kits),
M("Import", f="inv_item", m="import", p="create"),
),
M("Reports", c="inv", f="inv_item")(
M("Warehouse Stock", f="inv_item",m="report"),
M("Expiration Report", c="inv", f="track_item",
vars=dict(report="exp")),
#M("Monetization Report", c="inv", f="inv_item",
# vars=dict(report="mon")),
#M("Utilization Report", c="inv", f="track_item",
# vars=dict(report="util")),
#M("Summary of Incoming Supplies", c="inv", f="track_item",
# vars=dict(report="inc")),
# M("Summary of Releases", c="inv", f="track_item",
# vars=dict(report="rel")),
),
M(inv_recv_list, c="inv", f="recv", check=multi_warehouse)(
M("Create", m="create"),
),
M("Sent Shipments", c="inv", f="send", check=multi_warehouse)(
M("Create", m="create"),
M("Search Shipped Items", f="track_item"),
),
M("Items", c="supply", f="item", m="summary", check=basic_warehouse)(
M("Create", m="create"),
M("Import", f="catalog_item", m="import", p="create"),
),
# Catalog Items moved to be next to the Item Categories
#M("Catalog Items", c="supply", f="catalog_item")(
# M("Create", m="create"),
#),
#M("Brands", c="supply", f="brand",
# restrict=[ADMIN])(
# M("Create", m="create"),
#),
M("Catalogs", c="supply", f="catalog", check=basic_warehouse)(
M("Create", m="create"),
),
M("Item Categories", c="supply", f="item_category",
restrict=[ADMIN])(
M("Create", m="create"),
),
M("Suppliers", c="inv", f="supplier", check=basic_warehouse)(
M("Create", m="create"),
M("Import", m="import", p="create"),
),
M("Facilities", c="inv", f="facility", check=basic_warehouse)(
M("Create", m="create", t="org_facility"),
),
M("Facility Types", c="inv", f="facility_type",
restrict=[ADMIN])(
M("Create", m="create"),
),
M("Warehouse Types", c="inv", f="warehouse_type", check=use_types,
restrict=[ADMIN])(
M("Create", m="create"),
),
M("Requests", c="req", f="req")(
M("Create", m="create"),
M("Requested Items", f="req_item"),
),
M("Commitments", c="req", f="commit", check=use_commit)(
),
)
# -------------------------------------------------------------------------
def req(self):
""" Requests Management """
# Same as Inventory
return self.inv()
# -------------------------------------------------------------------------
@staticmethod
def project():
""" PROJECT / Project Tracking & Management """
root_org = current.auth.root_org_name()
def community_volunteers(i):
if root_org == "Honduran Red Cross":
return True
else:
return False
menu = M(c="project")(
M("Programs", f="programme")(
M("Create", m="create"),
),
M("Projects", f="project", m="summary")(
M("Create", m="create"),
),
M("Locations", f="location")(
# Better created from tab (otherwise Activity Type filter won't work)
#M("Create", m="create"),
M("Map", m="map"),
M("Community Contacts", f="location_contact"),
M("Community Volunteers", f="volunteer",
check=community_volunteers),
),
M("Reports", f="location", m="report")(
M("3W", f="location", m="report"),
M("Beneficiaries", f="beneficiary", m="report"),
#M("Indicators", f="indicator", m="report",
# check=indicators,
# ),
#M("Indicators over Time", f="indicator", m="timeplot",
# check=indicators,
# ),
M("Funding", f="organisation", m="report"),
),
M("Import", f="project", m="import", p="create")(
M("Import Projects", m="import", p="create"),
M("Import Project Organizations", f="organisation",
m="import", p="create"),
M("Import Project Communities", f="location",
m="import", p="create"),
),
M("Partner Organizations", f="partners")(
M("Create", m="create"),
M("Import", m="import", p="create"),
),
M("Activity Types", f="activity_type")(
M("Create", m="create"),
),
M("Beneficiary Types", f="beneficiary_type")(
M("Create", m="create"),
),
M("Demographics", f="demographic")(
M("Create", m="create"),
),
M("Hazards", f="hazard")(
M("Create", m="create"),
),
#M("Indicators", f="indicator",
# check=indicators)(
# M("Create", m="create"),
#),
M("Sectors", f="sector")(
M("Create", m="create"),
),
M("Themes", f="theme")(
M("Create", m="create"),
),
)
return menu
# END =========================================================================
|
Enjoy your stay in Clovis. Make your visit to Clovis an unforgettable experience by staying in the hotel you deserve. Still don't know where to stay? At Destinia we offer you a wide variety of hotels in Clovis, United States, so you can choose the one that's right for you, depending on your expectations and your budget. There are numerous hotels in Clovis that are ideal for a business trip, a family holiday, a weekend getaway or an outing with friends in Clovis. No matter the reason for your trip, at Destinia you'll find the greatest variety of hotels and hostels, always at the best prices, guaranteed. Are you looking for a deluxe hotel in Clovis city centre with a spa, Internet access or a pool? Perhaps a hotel that accepts pets or that has a child-care centre? We have that, too. Why not give our hotel finder a try? It's quick, simple and the most effective way to find the best lodging in this part of the United States. Whether you need a single-occupancy room, double-occupancy room or even a suite, with Destinia's search engine you'll always find the best deals, so all you'll have to worry about is packing your bags and enjoying your trip to Clovis.
In Clovis we will have clear skies; light clothing is recommended, as temperatures will be warm throughout the day. |
"""
This class models the hamburger menu object as a Page Object
The hamburger menu has a bunch of options that can be:
a) Clicked
b) Hovered over
"""
import conf.locators_conf as locators
from utils.Wrapit import Wrapit
class Hamburger_Menu_Object:
"Page Object for the hamburger menu"
#locators
menu_icon = locators.menu_icon
menu_link = locators.menu_link
menu_item = locators.menu_item
@Wrapit._exceptionHandler
    def goto_menu_link(self, my_link, expected_url_string=None):
"Navigate to a link: Hover + Click or just Click"
#Format for link: string separated by '>'
#E.g.: 'Approach > Where we start'
split_link = my_link.split('>')
hover_list = split_link[:-1]
self.click_hamburger_menu()
for element in hover_list:
self.hover(self.menu_item%element.strip())
result_flag = self.click_element(self.menu_link%split_link[-1].strip())
#Additional check to see if we went to the right page
if expected_url_string is not None:
            result_flag &= expected_url_string in self.get_current_url()
#If the action failed, close the Hamburger menu
if result_flag is False:
self.click_hamburger_menu()
return result_flag
def click_hamburger_menu(self):
"Click on the hamburger menu icon"
return self.click_element(self.menu_icon)
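# --- Usage sketch (assumption: a Base_Page mixin supplies click_element, ---
# --- hover and get_current_url; the class and URL names are illustrative) ---
#
# class Web_App_Page(Base_Page, Hamburger_Menu_Object):
#     pass
#
# page = Web_App_Page(base_url='https://example.com')
# result_flag = page.goto_menu_link('Approach > Where we start',
#                                   expected_url_string='approach')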
|
PRICED TO MOVE: $600 below NADA Retail! FUEL EFFICIENT: 40 MPG Hwy/31 MPG City! FRESH POWDER exterior and Charcoal interior, SV trim. CD Player, Bluetooth, iPod/MP3 Input.
iPod/MP3 Input, Bluetooth, CD Player, Rear Spoiler, MP3 Player, Remote Trunk Release, Keyless Entry, Child Safety Locks. This Nissan Versa SV, with FRESH POWDER exterior and Charcoal interior, features a 4-cylinder engine with 109 HP at 6,000 RPM*.
"In addition to being inexpensive to own and operate, the Versa sedan offers excellent fuel economy and a truly spacious back seat." -KBB.com. Great Gas Mileage: 40 MPG Hwy.
This Versa is priced $600 below NADA Retail. |
#
# p2p.py - Distributed bond P2P network node
#
# Distributed under the MIT/X11 software license, see the accompanying
# file COPYING or http://www.opensource.org/licenses/mit-license.php.
#
import struct
import hashlib
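# Known wire commands. NOTE: despite the name, this is a set literal, not a dict.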
messagemap = {
"version",
"verack",
"ping",
"pong",
"addr",
"getaddr",
}
class MsgNull(object):
def __init__(self):
pass
def SerializeToString(self):
return ''
def ParseFromString(self, data):
pass
def __str__(self):
return "MsgNull()"
def message_read(msg_start, f):
try:
recvbuf = f.read(4 + 12 + 4 + 4)
except IOError:
return None
# check magic
if len(recvbuf) < 4:
return None
if recvbuf[:4] != msg_start:
raise ValueError("got garbage %s" % repr(recvbuf))
# check checksum
if len(recvbuf) < 4 + 12 + 4 + 4:
return None
# remaining header fields: command, msg length, checksum
command = recvbuf[4:4+12].split("\x00", 1)[0]
msglen = struct.unpack("<i", recvbuf[4+12:4+12+4])[0]
checksum = recvbuf[4+12+4:4+12+4+4]
# read message body
try:
recvbuf += f.read(msglen)
except IOError:
return None
msg = recvbuf[4+12+4+4:4+12+4+4+msglen]
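    # Bitcoin-style double SHA-256 of the payload; the first 4 bytes must
    # match the checksum field from the header.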
th = hashlib.sha256(msg).digest()
h = hashlib.sha256(th).digest()
if checksum != h[:4]:
raise ValueError("got bad checksum %s" % repr(recvbuf))
recvbuf = recvbuf[4+12+4+4+msglen:]
return (command, msg)
def message_to_str(msg_start, command, message):
data = message.SerializeToString()
tmsg = msg_start
tmsg += command
tmsg += "\x00" * (12 - len(command))
tmsg += struct.pack("<I", len(data))
# add checksum
th = hashlib.sha256(data).digest()
h = hashlib.sha256(th).digest()
tmsg += h[:4]
tmsg += data
return tmsg
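# --- Usage sketch (assumption: illustrative only, not part of the original ---
# --- module; the 4-byte magic value below is arbitrary) ---
if __name__ == '__main__':
    import io
    MSG_START = "\xf9\xbe\xb4\xd9"  # sender and receiver must agree on the magic
    wire = message_to_str(MSG_START, "ping", MsgNull())
    command, payload = message_read(MSG_START, io.BytesIO(wire))
    assert command == "ping" and payload == ''
    print "round-tripped %r with an empty payload" % command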
|
It's Cyber Monday. It's also the first day back from a holiday break, which means instead of plowing through your to-do list, you might have found yourself slightly distracted by sales and discounts. After all, there are a lot of good deals right now, especially where the royals are concerned. Not only are Kate Middleton's favorite sneakers 30 percent off, but Meghan Markle's Aquazzura heels are also on sale (!!).
The "Casablanca" style cost $450 (originally they're $750) in the versatile black color. The Duchess of Sussex was first spotted in these designer pumps when she attended the Royal Foundation Forum in February 2018. In October, Meghan debuted the same heels from the brand, but in a beige color while on the royal tour. Clearly, this style is beloved by the Duchess if she already owns two pairs.
If you know about the "Meghan Effect," you know these heels won't be around forever. Sizes are starting to sell out, so before Cyber Monday officially ends, order yourself a pair. It's the fancy heel you'll wear to weddings, birthday dinners, and night out with the girls. Though if you're not completely ready to splurge on these pumps, Kate's $18 Superga sneakers are rather more affordable.
Meghan attended the first annual Royal Foundation Forum. She wore a Jason Wu dress with her black Aquazzura heels.
Meghan also owns the Aquazzura "Casablanca" heels in a beige color. She wore them while departing the Sydney Airport for New Zealand on October 28 during the royal tour. |
"""add discount_policy_id to price.
Revision ID: 45de268cd444
Revises: 4d7f840202d2
Create Date: 2016-03-31 15:29:51.897720
"""
# revision identifiers, used by Alembic.
revision = '45de268cd444'
down_revision = '4d7f840202d2'
from alembic import op
import sqlalchemy as sa
import sqlalchemy_utils
def upgrade():
op.add_column(
'price',
sa.Column(
'discount_policy_id',
sqlalchemy_utils.types.uuid.UUIDType(binary=False),
nullable=True,
),
)
op.create_unique_constraint(
'price_item_id_discount_policy_id_key',
'price',
['item_id', 'discount_policy_id'],
)
op.create_foreign_key(
'price_discount_policy_id_fkey',
'price',
'discount_policy',
['discount_policy_id'],
['id'],
)
op.alter_column(
'discount_policy', 'percentage', existing_type=sa.INTEGER, nullable=True
)
op.add_column(
'discount_policy', sa.Column('is_price_based', sa.Boolean(), nullable=True)
)
def downgrade():
op.drop_constraint('price_discount_policy_id_fkey', 'price', type_='foreignkey')
op.drop_constraint('price_item_id_discount_policy_id_key', 'price', type_='unique')
op.drop_column('price', 'discount_policy_id')
op.drop_column('discount_policy', 'is_price_based')
op.alter_column(
'discount_policy', 'percentage', existing_type=sa.INTEGER, nullable=False
)
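# --- Usage note (assumption: standard Alembic CLI workflow; the commands ---
# --- below are run from the directory containing alembic.ini) ---
#
#   alembic upgrade 45de268cd444    # apply this revision (or: alembic upgrade head)
#   alembic downgrade 4d7f840202d2  # roll back to the previous revision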
|
Between eight and nine o'clock that evening, Eppie and Silas were seated alone in the cottage. After the great excitement the weaver had undergone from the events of the afternoon, he had felt a longing for this quietude, and had even begged Mrs. Winthrop and Aaron, who had naturally lingered behind every one else, to leave him alone with his child. The excitement had not passed away: it had only reached that stage when the keenness of the susceptibility makes external stimulus intolerable—when there is no sense of weariness, but rather an intensity of inward life, under which sleep is an impossibility. Any one who has watched such moments in other men remembers the brightness of the eyes and the strange definiteness that comes over coarse features from that transient influence. It is as if a new fineness of ear for all spiritual voices had sent wonder-working vibrations through the heavy mortal frame—as if "beauty born of murmuring sound"d had passed into the face of the listener.
"But I know now, father," said Eppie. "If it hadn't been for you, they'd have taken me to the workhouse,h and there'd have been nobody to love me."
X [d] as if "beauty born of murmuring sound"
[h] The parish-supported workhouse for paupers, who contributed to their upkeep by some sort of work. As soon as she was old enough, perhaps as young as five or six, Eppie would have been "hired" by a farmer to work in the fields or as a domestic.
The emphasis is upon the word's root, a "wonder," a prodigy. |
import win32ui
import win32con
import win32api
import commctrl
import pythoncom
from pywin.mfc import dialog
class TLBrowserException(Exception):
"TypeLib browser internal error"
error = TLBrowserException
FRAMEDLG_STD = win32con.WS_CAPTION | win32con.WS_SYSMENU
SS_STD = win32con.WS_CHILD | win32con.WS_VISIBLE
BS_STD = SS_STD | win32con.WS_TABSTOP
ES_STD = BS_STD | win32con.WS_BORDER
LBS_STD = ES_STD | win32con.LBS_NOTIFY | win32con.LBS_NOINTEGRALHEIGHT | win32con.WS_VSCROLL
CBS_STD = ES_STD | win32con.CBS_NOINTEGRALHEIGHT | win32con.WS_VSCROLL
typekindmap = {
pythoncom.TKIND_ENUM : 'Enumeration',
pythoncom.TKIND_RECORD : 'Record',
pythoncom.TKIND_MODULE : 'Module',
pythoncom.TKIND_INTERFACE : 'Interface',
pythoncom.TKIND_DISPATCH : 'Dispatch',
pythoncom.TKIND_COCLASS : 'CoClass',
pythoncom.TKIND_ALIAS : 'Alias',
pythoncom.TKIND_UNION : 'Union'
}
TypeBrowseDialog_Parent=dialog.Dialog
class TypeBrowseDialog(TypeBrowseDialog_Parent):
"Browse a type library"
IDC_TYPELIST = 1000
IDC_MEMBERLIST = 1001
IDC_PARAMLIST = 1002
IDC_LISTVIEW = 1003
def __init__(self, typefile = None):
TypeBrowseDialog_Parent.__init__(self, self.GetTemplate())
try:
if typefile:
self.tlb = pythoncom.LoadTypeLib(typefile)
else:
self.tlb = None
except pythoncom.ole_error:
self.MessageBox("The file does not contain type information")
self.tlb = None
self.HookCommand(self.CmdTypeListbox, self.IDC_TYPELIST)
self.HookCommand(self.CmdMemberListbox, self.IDC_MEMBERLIST)
def OnAttachedObjectDeath(self):
self.tlb = None
self.typeinfo = None
self.attr = None
return TypeBrowseDialog_Parent.OnAttachedObjectDeath(self)
def _SetupMenu(self):
menu = win32ui.CreateMenu()
flags=win32con.MF_STRING|win32con.MF_ENABLED
menu.AppendMenu(flags, win32ui.ID_FILE_OPEN, "&Open...")
menu.AppendMenu(flags, win32con.IDCANCEL, "&Close")
mainMenu = win32ui.CreateMenu()
mainMenu.AppendMenu(flags|win32con.MF_POPUP, menu.GetHandle(), "&File")
self.SetMenu(mainMenu)
self.HookCommand(self.OnFileOpen,win32ui.ID_FILE_OPEN)
def OnFileOpen(self, id, code):
openFlags = win32con.OFN_OVERWRITEPROMPT | win32con.OFN_FILEMUSTEXIST
fspec = "Type Libraries (*.tlb, *.olb)|*.tlb;*.olb|OCX Files (*.ocx)|*.ocx|DLL's (*.dll)|*.dll|All Files (*.*)|*.*||"
dlg = win32ui.CreateFileDialog(1, None, None, openFlags, fspec)
if dlg.DoModal() == win32con.IDOK:
try:
self.tlb = pythoncom.LoadTypeLib(dlg.GetPathName())
except pythoncom.ole_error:
self.MessageBox("The file does not contain type information")
self.tlb = None
self._SetupTLB()
def OnInitDialog(self):
self._SetupMenu()
self.typelb = self.GetDlgItem(self.IDC_TYPELIST)
self.memberlb = self.GetDlgItem(self.IDC_MEMBERLIST)
self.paramlb = self.GetDlgItem(self.IDC_PARAMLIST)
self.listview = self.GetDlgItem(self.IDC_LISTVIEW)
# Setup the listview columns
itemDetails = (commctrl.LVCFMT_LEFT, 100, "Item", 0)
self.listview.InsertColumn(0, itemDetails)
itemDetails = (commctrl.LVCFMT_LEFT, 1024, "Details", 0)
self.listview.InsertColumn(1, itemDetails)
if self.tlb is None:
self.OnFileOpen(None,None)
else:
self._SetupTLB()
return TypeBrowseDialog_Parent.OnInitDialog(self)
def _SetupTLB(self):
self.typelb.ResetContent()
self.memberlb.ResetContent()
self.paramlb.ResetContent()
self.typeinfo = None
self.attr = None
if self.tlb is None: return
n = self.tlb.GetTypeInfoCount()
for i in range(n):
self.typelb.AddString(self.tlb.GetDocumentation(i)[0])
def _SetListviewTextItems(self, items):
self.listview.DeleteAllItems()
index = -1
for item in items:
index = self.listview.InsertItem(index+1,item[0])
data = item[1]
if data is None: data = ""
self.listview.SetItemText(index, 1, data)
def SetupAllInfoTypes(self):
infos = self._GetMainInfoTypes() + self._GetMethodInfoTypes()
self._SetListviewTextItems(infos)
def _GetMainInfoTypes(self):
pos = self.typelb.GetCurSel()
if pos<0: return []
docinfo = self.tlb.GetDocumentation(pos)
infos = [('GUID', str(self.attr[0]))]
infos.append(('Help File', docinfo[3]))
infos.append(('Help Context', str(docinfo[2])))
try:
infos.append(('Type Kind', typekindmap[self.tlb.GetTypeInfoType(pos)]))
except:
pass
info = self.tlb.GetTypeInfo(pos)
attr = info.GetTypeAttr()
infos.append(('Attributes', str(attr)))
for j in range(attr[8]):
flags = info.GetImplTypeFlags(j)
refInfo = info.GetRefTypeInfo(info.GetRefTypeOfImplType(j))
doc = refInfo.GetDocumentation(-1)
attr = refInfo.GetTypeAttr()
typeKind = attr[5]
typeFlags = attr[11]
desc = doc[0]
desc = desc + ", Flags=0x%x, typeKind=0x%x, typeFlags=0x%x" % (flags, typeKind, typeFlags)
if flags & pythoncom.IMPLTYPEFLAG_FSOURCE:
desc = desc + "(Source)"
infos.append( ('Implements', desc))
return infos
def _GetMethodInfoTypes(self):
pos = self.memberlb.GetCurSel()
if pos<0: return []
realPos, isMethod = self._GetRealMemberPos(pos)
ret = []
if isMethod:
funcDesc = self.typeinfo.GetFuncDesc(realPos)
id = funcDesc[0]
ret.append(("Func Desc", str(funcDesc)))
else:
id = self.typeinfo.GetVarDesc(realPos)[0]
docinfo = self.typeinfo.GetDocumentation(id)
ret.append(('Help String', docinfo[1]))
ret.append(('Help Context', str(docinfo[2])))
return ret
def CmdTypeListbox(self, id, code):
if code == win32con.LBN_SELCHANGE:
pos = self.typelb.GetCurSel()
if pos >= 0:
self.memberlb.ResetContent()
self.typeinfo = self.tlb.GetTypeInfo(pos)
self.attr = self.typeinfo.GetTypeAttr()
for i in range(self.attr[7]):
id = self.typeinfo.GetVarDesc(i)[0]
self.memberlb.AddString(self.typeinfo.GetNames(id)[0])
for i in range(self.attr[6]):
id = self.typeinfo.GetFuncDesc(i)[0]
self.memberlb.AddString(self.typeinfo.GetNames(id)[0])
self.SetupAllInfoTypes()
return 1
def _GetRealMemberPos(self, pos):
pos = self.memberlb.GetCurSel()
if pos >= self.attr[7]:
return pos - self.attr[7], 1
elif pos >= 0:
return pos, 0
else:
raise error("The position is not valid")
def CmdMemberListbox(self, id, code):
if code == win32con.LBN_SELCHANGE:
self.paramlb.ResetContent()
pos = self.memberlb.GetCurSel()
realPos, isMethod = self._GetRealMemberPos(pos)
if isMethod:
id = self.typeinfo.GetFuncDesc(realPos)[0]
names = self.typeinfo.GetNames(id)
for i in range(len(names)):
if i > 0:
self.paramlb.AddString(names[i])
self.SetupAllInfoTypes()
return 1
def GetTemplate(self):
"Return the template used to create this dialog"
w = 272 # Dialog width
h = 192 # Dialog height
style = FRAMEDLG_STD | win32con.WS_VISIBLE | win32con.DS_SETFONT | win32con.WS_MINIMIZEBOX
template = [['Type Library Browser', (0, 0, w, h), style, None, (8, 'Helv')], ]
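        # Control entries are [window class, text, control id, (x, y, w, h), style];
        # class atom 130 (0x82) is a Static label and 131 (0x83) a ListBox.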
template.append([130, "&Type", -1, (10, 10, 62, 9), SS_STD | win32con.SS_LEFT])
template.append([131, None, self.IDC_TYPELIST, (10, 20, 80, 80), LBS_STD])
template.append([130, "&Members", -1, (100, 10, 62, 9), SS_STD | win32con.SS_LEFT])
template.append([131, None, self.IDC_MEMBERLIST, (100, 20, 80, 80), LBS_STD])
template.append([130, "&Parameters", -1, (190, 10, 62, 9), SS_STD | win32con.SS_LEFT])
template.append([131, None, self.IDC_PARAMLIST, (190, 20, 75, 80), LBS_STD])
lvStyle = SS_STD | commctrl.LVS_REPORT | commctrl.LVS_AUTOARRANGE | commctrl.LVS_ALIGNLEFT | win32con.WS_BORDER | win32con.WS_TABSTOP
template.append(["SysListView32", "", self.IDC_LISTVIEW, (10, 110, 255, 65), lvStyle])
return template
if __name__=='__main__':
import sys
fname = None
try:
fname = sys.argv[1]
except:
pass
dlg = TypeBrowseDialog(fname)
try:
win32api.GetConsoleTitle()
dlg.DoModal()
except:
dlg.CreateWindow(win32ui.GetMainFrame())
|
Situated in Krk, 500 metres from Krk Bus Station and 500 metres from Porporela Beach, Erika features accommodation with free WiFi, air conditioning and barbecue facilities. With garden views, this accommodation provides a terrace. The apartment has 1 bedroom, a living room, and a kitchen with a dining area and a fridge. A TV is offered. Krk Harbour is 700 metres from the apartment, while Krk Cathedral is 800 metres away.
Please inform Erika in advance of your expected arrival time. You can use the Special Requests box when booking, or contact the property directly with the contact details provided in your confirmation. |
# -*- coding: utf-8 -*-
# Copyright 2020 Green Valley Belgium NV
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# @@license_version:1.7@@
import binascii
import datetime
import json
import logging
import os
import time
import urllib
import webapp2
from babel import Locale
from google.appengine.api import search, users as gusers
from google.appengine.ext import db
from google.appengine.ext.deferred import deferred
from google.appengine.ext.webapp import template
from markdown import Markdown
from mcfw.cache import cached
from mcfw.consts import MISSING
from mcfw.exceptions import HttpNotFoundException
from mcfw.restapi import rest, GenericRESTRequestHandler
from mcfw.rpc import serialize_complex_value, arguments, returns
from rogerthat.bizz.communities.communities import get_communities_by_country, get_community, get_community_countries
from rogerthat.bizz.friends import user_code_by_hash, makeFriends, ORIGIN_USER_INVITE
from rogerthat.bizz.registration import get_headers_for_consent
from rogerthat.bizz.service import SERVICE_LOCATION_INDEX, re_index_map_only
from rogerthat.bizz.session import create_session
from rogerthat.dal.app import get_app_by_id
from rogerthat.exceptions.login import AlreadyUsedUrlException, InvalidUrlException, ExpiredUrlException
from rogerthat.models import ProfilePointer, ServiceProfile
from rogerthat.pages.legal import DOC_TERMS_SERVICE, get_current_document_version, get_version_content, \
get_legal_language, LANGUAGES as LEGAL_LANGUAGES
from rogerthat.pages.login import SetPasswordHandler
from rogerthat.rpc import users
from rogerthat.rpc.service import BusinessException
from rogerthat.settings import get_server_settings
from rogerthat.templates import get_languages_from_request
from rogerthat.to import ReturnStatusTO, RETURNSTATUS_TO_SUCCESS, WarningReturnStatusTO
from rogerthat.translations import DEFAULT_LANGUAGE
from rogerthat.utils import bizz_check, try_or_defer, get_country_code_by_ipaddress
from rogerthat.utils.app import get_app_id_from_app_user
from rogerthat.utils.cookie import set_cookie
from rogerthat.utils.service import create_service_identity_user
from shop import SHOP_JINJA_ENVIRONMENT
from shop.bizz import create_customer_signup, complete_customer_signup, get_organization_types, \
update_customer_consents, get_customer_signup, validate_customer_url_data, \
get_customer_consents
from shop.business.permissions import is_admin
from shop.constants import OFFICIALLY_SUPPORTED_LANGUAGES
from shop.models import Customer
from shop.to import CompanyTO, CustomerTO, CustomerLocationTO
from shop.view import get_shop_context, get_current_http_host
from solution_server_settings import get_solution_server_settings
from solutions import translate
from solutions.common.bizz.grecaptcha import recaptcha_verify
from solutions.common.bizz.settings import get_consents_for_community
from solutions.common.integrations.cirklo.cirklo import get_whitelisted_merchant
from solutions.common.integrations.cirklo.models import CirkloMerchant, CirkloCity
from solutions.common.markdown_newtab import NewTabExtension
from solutions.common.models import SolutionServiceConsent
from solutions.common.restapi.services import do_create_service
from solutions.common.to.settings import PrivacySettingsGroupTO
class StaticFileHandler(webapp2.RequestHandler):
def get(self, filename):
cur_path = os.path.dirname(__file__)
path = os.path.join(cur_path, u'html', filename)
with open(path, 'r') as f:
self.response.write(f.read())
class GenerateQRCodesHandler(webapp2.RequestHandler):
def get(self):
current_user = gusers.get_current_user()
if not is_admin(current_user):
self.abort(403)
path = os.path.join(os.path.dirname(__file__), 'html', 'generate_qr_codes.html')
context = get_shop_context()
self.response.out.write(template.render(path, context))
class CustomerMapHandler(webapp2.RequestHandler):
def get(self, app_id):
path = os.path.join(os.path.dirname(__file__), 'html', 'customer_map.html')
settings = get_server_settings()
lang = get_languages_from_request(self.request)[0]
translations = {
'merchants': translate(lang, 'merchants'),
'merchants_with_terminal': translate(lang, 'merchants_with_terminal'),
'community_services': translate(lang, 'community_services'),
'care': translate(lang, 'care'),
'associations': translate(lang, 'associations'),
}
params = {
'maps_key': settings.googleMapsKey,
'app_id': app_id,
'translations': json.dumps(translations)
}
self.response.out.write(template.render(path, params))
@cached(2, 21600)
@returns(unicode)
@arguments(app_id=unicode)
def get_customer_locations_for_app(app_id):
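    # NOTE (assumption): @cached(2, 21600) caches the serialized result,
    # apparently for 21600 s (6 h); the search below is capped at 1000 docs.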
query_string = (u'app_ids:"%s"' % app_id)
query = search.Query(query_string=query_string,
options=search.QueryOptions(returned_fields=['service', 'name', 'location', 'description'],
limit=1000))
search_result = search.Index(name=SERVICE_LOCATION_INDEX).search(query)
customers = {customer.service_email: customer for customer in Customer.list_by_app_id(app_id)}
def map_result(service_search_result):
customer_location = CustomerLocationTO()
for field in service_search_result.fields:
if field.name == 'service':
customer = customers.get(field.value.split('/')[0])
if customer:
customer_location.address = customer.address1
customer_location.type = customer.organization_type
if customer.address2:
customer_location.address += '\n%s' % customer.address2
if customer.zip_code or customer.city:
customer_location.address += '\n'
if customer.zip_code:
customer_location.address += customer.zip_code
if customer.zip_code and customer.city:
customer_location.address += ' '
if customer.city:
customer_location.address += customer.city
else:
customer_location.type = ServiceProfile.ORGANIZATION_TYPE_PROFIT
continue
if field.name == 'name':
customer_location.name = field.value
continue
if field.name == 'location':
customer_location.lat = field.value.latitude
customer_location.lon = field.value.longitude
continue
if field.name == 'description':
customer_location.description = field.value
continue
return customer_location
return json.dumps(serialize_complex_value([map_result(r) for r in search_result.results], CustomerLocationTO, True))
class CustomerMapServicesHandler(webapp2.RequestHandler):
def get(self, app_id):
customer_locations = get_customer_locations_for_app(app_id)
self.response.write(customer_locations)
@rest('/unauthenticated/loyalty/scanned', 'get', read_only_access=True, authenticated=False)
@returns(ReturnStatusTO)
@arguments(user_email_hash=unicode, merchant_email=unicode, app_id=unicode)
def rest_loyalty_scanned(user_email_hash, merchant_email, app_id):
try:
bizz_check(user_email_hash is not MISSING, 'user_email_hash is required')
bizz_check(merchant_email is not MISSING, 'merchant_email is required')
bizz_check(app_id is not MISSING, 'app_id is required')
user_code = user_code_by_hash(binascii.unhexlify(user_email_hash))
profile_pointer = ProfilePointer.get(user_code)
if not profile_pointer:
logging.debug('No ProfilePointer found with user_code %s', user_code)
raise BusinessException('User not found')
app_user = profile_pointer.user
bizz_check(get_app_by_id(app_id), 'App not found')
bizz_check(app_id == get_app_id_from_app_user(profile_pointer.user), 'Invalid user email hash')
merchant_found = False
for customer in Customer.list_by_user_email(merchant_email):
merchant_found = True
service_user = users.User(customer.service_email)
logging.info('Received loyalty scan of %s by %s (%s)', app_user, service_user, customer.user_email)
makeFriends(service_user, app_user, None, None, ORIGIN_USER_INVITE,
notify_invitee=False,
notify_invitor=False,
allow_unsupported_apps=True)
bizz_check(merchant_found, 'Merchant not found')
except BusinessException as e:
return ReturnStatusTO.create(False, e.message)
else:
return RETURNSTATUS_TO_SUCCESS
class PublicPageHandler(webapp2.RequestHandler):
@property
def language(self):
return get_languages_from_request(self.request)[0]
def translate(self, key, **kwargs):
return translate(self.language, key, **kwargs)
def render(self, template_name, **params):
if not params.get('language'):
params['language'] = self.language
routes = ['signin', 'signup', 'reset_password', 'set_password']
for route_name in routes:
url = self.url_for(route_name)
params[route_name + '_url'] = url
template_path = 'public/%s.html' % template_name
return SHOP_JINJA_ENVIRONMENT.get_template(template_path).render(params)
def return_error(self, message, **kwargs):
translated_message = self.translate(message, **kwargs)
self.response.out.write(self.render('error', message=translated_message))
def dispatch(self):
if users.get_current_user():
return self.redirect('/')
return super(PublicPageHandler, self).dispatch()
class CustomerSigninHandler(PublicPageHandler):
def get(self, app_id=None):
self.response.write(self.render('signin'))
class CustomerSignupHandler(PublicPageHandler):
def get(self):
language = (self.request.get('language') or self.language).split('_')[0]
if language not in LEGAL_LANGUAGES:
language = DEFAULT_LANGUAGE
solution_server_settings = get_solution_server_settings()
version = get_current_document_version(DOC_TERMS_SERVICE)
legal_language = get_legal_language(language)
countries = get_community_countries()
selected_country = get_country_code_by_ipaddress(os.environ.get('HTTP_X_FORWARDED_FOR', None))
if selected_country:
communities = get_communities_by_country(selected_country)
else:
communities = []
params = {
'recaptcha_site_key': solution_server_settings.recaptcha_site_key,
'email_verified': False,
'toc_content': get_version_content(legal_language, DOC_TERMS_SERVICE, version),
'language': language.lower(),
'languages': [(code, name) for code, name in OFFICIALLY_SUPPORTED_LANGUAGES.iteritems()
if code in LEGAL_LANGUAGES],
'countries': [(country, Locale(language, country).get_territory_name()) for country in countries],
'communities': communities,
'selected_country': selected_country,
'signup_success': json.dumps(self.render('signup_success', language=language))
}
self.response.write(self.render('signup', **params))
class CustomerSignupPasswordHandler(PublicPageHandler):
def get(self):
data = self.request.get('data')
email = self.request.get('email').rstrip('.')
params = {
'email': email,
'data': data,
'language': self.language,
'error': None,
}
self.response.write(self.render('signup_setpassword', **params))
def post(self):
json_data = json.loads(self.request.body)
email = json_data.get('email')
data = json_data.get('data')
password = json_data.get('password', '')
password_confirm = json_data.get('password_confirm')
error = None
try:
signup, _ = get_customer_signup(email, data) # type: CustomerSignup, dict
except ExpiredUrlException:
error = self.translate('link_expired', action='')
except AlreadyUsedUrlException:
error = self.translate('link_is_already_used', action='')
except InvalidUrlException:
error = self.translate('invalid_url')
if len(password) < 8:
error = self.translate('password_length_error', length=8)
elif password != password_confirm:
error = self.translate('password_match_error')
if not error:
tos_version = get_current_document_version(DOC_TERMS_SERVICE)
result = do_create_service(signup.city_customer, signup.language, True, signup, password, tos_version=tos_version)
if result.success:
service_email = result.data['service_email']
deferred.defer(complete_customer_signup, email, data, service_email)
try:
# Sleep to allow datastore indexes to update
time.sleep(2)
secret, _ = create_session(users.User(signup.company_email), ignore_expiration=True, cached=False)
server_settings = get_server_settings()
set_cookie(self.response, server_settings.cookieSessionName, secret)
except:
logging.error("Failed to create session", exc_info=True)
else:
result = WarningReturnStatusTO.create(False, error)
self.response.headers['Content-Type'] = 'application/json'
self.response.write(json.dumps(result.to_dict()))
class CustomerResetPasswordHandler(PublicPageHandler):
def get(self):
self.response.out.write(self.render('reset_password'))
class CustomerSetPasswordHandler(PublicPageHandler, SetPasswordHandler):
"""Inherit PublicPageHandler first to override SetPasswordHandler return_error()"""
def get(self):
email = self.request.get('email')
data = self.request.get('data')
try:
parsed_data = self.parse_and_validate_data(email, data)
except ExpiredUrlException as e:
return self.return_error("link_expired", action=e.action)
except AlreadyUsedUrlException as e:
return self.return_error("link_is_already_used", action=e.action)
except InvalidUrlException:
return self.return_error('invalid_url')
params = {
'name': parsed_data['n'],
'email': email,
'action': parsed_data['a'],
'data': data,
}
self.response.out.write(self.render('set_password', **params))
def post(self):
super(CustomerSetPasswordHandler, self).post()
@rest('/unauthenticated/osa/customer/signup', 'post', read_only_access=True, authenticated=False)
@returns(ReturnStatusTO)
@arguments(city_customer_id=(int, long), company=CompanyTO, customer=CustomerTO, recaptcha_token=unicode,
email_consents=dict)
def customer_signup(city_customer_id, company, customer, recaptcha_token, email_consents=None):
try:
headers = get_headers_for_consent(GenericRESTRequestHandler.getCurrentRequest())
create_customer_signup(city_customer_id, company, customer, recaptcha_token,
domain=get_current_http_host(with_protocol=True), headers=headers, accept_missing=True)
        consents = email_consents or {}
context = u'User signup'
try_or_defer(update_customer_consents, customer.user_email, consents, headers, context)
return RETURNSTATUS_TO_SUCCESS
except BusinessException as e:
return ReturnStatusTO.create(False, e.message)
def parse_euvat_address_eu(address):
address = address.strip().splitlines()
zc_ci = address.pop()
zip_code, city = zc_ci.split(' ', 1)
address1 = address.pop(0) if len(address) > 0 else ''
address2 = address.pop(0) if len(address) > 0 else ''
return address1, address2, zip_code, city
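# --- Worked example (assumption: illustrative address, not from the codebase) ---
# parse_euvat_address_eu('Main Street 1\nSuite 4\n9000 Ghent')
# -> ('Main Street 1', 'Suite 4', '9000', 'Ghent')
# The last line must be "<zip> <city>"; a missing middle line yields '' for
# address2 (and for address1, if only the zip/city line is present).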
@rest('/unauthenticated/osa/signup/community-info/<community_id:[^/]+>', 'get', read_only_access=True,
authenticated=False)
@returns(dict)
@arguments(community_id=(int, long), language=unicode)
def get_customer_info(community_id, language=None):
community = get_community(community_id)
if not community:
raise HttpNotFoundException('Community not found')
if not language:
request = GenericRESTRequestHandler.getCurrentRequest()
language = get_languages_from_request(request)[0]
customer = Customer.get_by_service_email(community.main_service) # type: Customer
organization_types = dict(get_organization_types(customer, community.default_app, language))
return {
'customer': {
'id': customer.id,
},
'organization_types': organization_types
}
@rest('/unauthenticated/osa/signup/communities/<country_code:[^/]+>', 'get', read_only_access=True, authenticated=False)
@returns([dict])
@arguments(country_code=unicode)
def api_get_communities(country_code):
return [{'name': community.name, 'id': community.id} for community in get_communities_by_country(country_code)]
@rest('/unauthenticated/osa/signup/privacy-settings/<community_id:[^/]+>', 'get', read_only_access=True,
authenticated=False)
@returns([PrivacySettingsGroupTO])
@arguments(community_id=(int, long), language=unicode)
def get_privacy_settings(community_id, language=None):
if not language:
request = GenericRESTRequestHandler.getCurrentRequest()
language = get_languages_from_request(request)[0]
return get_consents_for_community(community_id, language, [])
class CustomerCirkloAcceptHandler(PublicPageHandler):
def get_url(self, customer):
url_params = urllib.urlencode({'cid': customer.id})
return '/customers/consent/cirklo?{}'.format(url_params)
def dispatch(self):
# Don't redirect to dashboard when logged in
return super(PublicPageHandler, self).dispatch()
def get(self):
customer_id = self.request.get('cid')
if customer_id:
try:
customer = Customer.get_by_id(long(customer_id))
except:
return self.return_error('invalid_url')
else:
email = self.request.get('email')
data = self.request.get('data')
try:
data = validate_customer_url_data(email, data)
except InvalidUrlException:
return self.return_error('invalid_url')
customer = db.get(data['s']) # Customer
if not customer:
return self.abort(404)
consents = get_customer_consents(customer.user_email)
should_accept = False
if SolutionServiceConsent.TYPE_CITY_CONTACT not in consents.types:
consents.types.append(SolutionServiceConsent.TYPE_CITY_CONTACT)
should_accept = True
if SolutionServiceConsent.TYPE_CIRKLO_SHARE not in consents.types:
consents.types.append(SolutionServiceConsent.TYPE_CIRKLO_SHARE)
should_accept = True
params = {
'cirklo_accept_url': self.get_url(customer),
'should_accept': should_accept
}
self.response.out.write(self.render('cirklo_accept', **params))
def post(self):
try:
customer_id = self.request.get('cid')
customer = Customer.get_by_id(long(customer_id)) # type: Customer
if not customer:
raise Exception('Customer not found')
except:
self.redirect('/')
return
consents = get_customer_consents(customer.user_email)
should_put_consents = False
if SolutionServiceConsent.TYPE_CITY_CONTACT not in consents.types:
consents.types.append(SolutionServiceConsent.TYPE_CITY_CONTACT)
should_put_consents = True
if SolutionServiceConsent.TYPE_CIRKLO_SHARE not in consents.types:
consents.types.append(SolutionServiceConsent.TYPE_CIRKLO_SHARE)
should_put_consents = True
if should_put_consents:
consents.put()
community = get_community(customer.community_id)
city_id = CirkloCity.get_by_service_email(community.main_service).city_id
service_user_email = customer.service_user.email()
cirklo_merchant_key = CirkloMerchant.create_key(service_user_email)
cirklo_merchant = cirklo_merchant_key.get() # type: CirkloMerchant
if not cirklo_merchant:
cirklo_merchant = CirkloMerchant(key=cirklo_merchant_key) # type: CirkloMerchant
cirklo_merchant.denied = False
logging.debug('Creating new cirklo merchant')
cirklo_merchant.creation_date = datetime.datetime.utcfromtimestamp(customer.creation_time)
cirklo_merchant.service_user_email = service_user_email
cirklo_merchant.customer_id = customer.id
cirklo_merchant.city_id = city_id
cirklo_merchant.data = None
cirklo_merchant.populate_from_cirklo(get_whitelisted_merchant(city_id, customer.user_email))
cirklo_merchant.put()
logging.debug('Saving cirklo merchant: %s', cirklo_merchant)
service_identity_user = create_service_identity_user(customer.service_user)
try_or_defer(re_index_map_only, service_identity_user)
else:
            logging.debug('Not saving cirklo merchant, consents: %s', consents)
self.redirect(self.get_url(customer))
class VouchersCirkloSignupHandler(PublicPageHandler):
def get(self, city_id=''):
supported_languages = ["nl", "fr"]
language = (self.request.get('language') or self.language).split('_')[0].lower()
cities = []
if city_id and city_id != 'staging':
city = CirkloCity.create_key(city_id).get()
if city:
cities = [city]
if not cities:
if city_id and city_id == 'staging':
cities = [city for city in CirkloCity.list_signup_enabled() if city.city_id.startswith('staging-')]
else:
cities = [city for city in CirkloCity.list_signup_enabled() if not city.city_id.startswith('staging-')]
solution_server_settings = get_solution_server_settings()
if language not in supported_languages:
language = supported_languages[0]
if language == 'fr':
sorted_cities = sorted(cities, key=lambda x: x.signup_names.fr)
else:
sorted_cities = sorted(cities, key=lambda x: x.signup_names.nl)
params = {
'city_id': city_id or None,
'cities': sorted_cities,
'recaptcha_site_key': solution_server_settings.recaptcha_site_key,
'language': language,
'languages': [(code, name) for code, name in OFFICIALLY_SUPPORTED_LANGUAGES.iteritems()
if code in supported_languages]
}
md = Markdown(output='html', extensions=['nl2br', NewTabExtension()])
lines = [
'#### %s' % translate(language, 'cirklo_info_title'),
'<br />',
translate(language, 'cirklo_info_text_signup'),
'',
translate(language, 'cirklo_participation_text_signup'),
]
params['privacy_settings'] = {
'cirklo': {
'label': translate(language, 'consent_cirklo_share'),
'description': md.convert('\n\n'.join(lines))
},
'city': {
'label': translate(language, 'consent_city_contact'),
'description': '<h4>%s</h4>' % translate(language, 'consent_share_with_city')
}
}
params['signup_success'] = md.convert('\n\n'.join([translate(language, 'cirklo.signup.success')]))
self.response.write(self.render('cirklo_signup', **params))
def post(self):
json_data = json.loads(self.request.body)
logging.debug(json_data)
if not recaptcha_verify(json_data['recaptcha_token']):
logging.debug('Cannot verify recaptcha response')
self.abort(400)
if not CirkloCity.create_key(json_data['city_id']).get():
logging.debug('CirkloCity was invalid')
self.abort(400)
self.response.headers['Content-Type'] = 'text/json'
whitelisted_merchant = get_whitelisted_merchant(json_data['city_id'], json_data['company']['email'])
if whitelisted_merchant:
logging.debug('email found in cirklo db')
else:
cirklo_merchant = CirkloMerchant.get_by_city_id_and_email(json_data['city_id'], json_data['company']['email'])
if cirklo_merchant:
logging.debug('email found in osa db')
whitelisted_merchant = True
if whitelisted_merchant:
return self.response.out.write(json.dumps({
'success': False,
'errormsg': translate(json_data['language'], 'cirklo.email_already_used')
}))
merchant = CirkloMerchant()
merchant.service_user_email = None
merchant.customer_id = -1
merchant.city_id = json_data['city_id']
merchant.data = {
u'company': json_data['company'],
u'language': json_data['language']
}
merchant.emails = [json_data['company']['email']]
merchant.populate_from_cirklo(None)
merchant.denied = False
merchant.put()
self.response.headers['Content-Type'] = 'text/json'
return self.response.out.write(json.dumps({'success': True, 'errormsg': None}))
|
Wrapping up another month in the forex industry, we take a look at the major events that took place in October. Changing things up a bit from previous renditions of this column, this time around we'll focus on a few headlines and analyze their importance (you can call this my October thoughts on the industry). But before we do, we'll discuss a key general trend.
Without a doubt, the most important trend taking place in the forex market is the return of volatility and volumes. Beginning in September, volumes rebounded from what had been an altogether weak 2014. When reporting volumes for our Q3 Forex Industry Report, our estimates for the retail sector's June-to-August period were the lowest of any three-month stretch in two years. Japan in particular had hit a wall, with many brokers experiencing six consecutive months of declining volumes through August. But, as we all know, that changed in September.
During October, every broker and trading venue reported stellar month-over-month growth for September. Early figures reveal that October forex trading remained active, but overall numbers are expected to be flat or slightly below September's. What is worth noting is that, unlike in the year-plus stretch from August 2013 to August 2014, the forex market is for the first time in a while witnessing follow-through activity. During the dry spell, a broker's month could be swung by one or two events, only for trading to fizzle immediately; now, high-impact days are being followed by continued volumes and volatility. The trading action reveals a return of speculation to the asset class, which bodes well for the rest of the year.
We all know that a rising tide floats all boats. Among the boats moving higher is ICAP's eFX platform EBS. In October, ICAP reported average daily volumes (ADV) on EBS above $100Bln for the first time since June 2013, totaling $117.9Bln. The figures reveal a strong rebound from the multi-year low of $68.5Bln traded this past April. So is the EBS rebound here to stay, and is their mix of providing both an ECN (EBS Market) and relationship-based liquidity (EBS Direct) on one platform a winning formula?
While providing optimism, we aren't convinced that EBS volumes will continue to grow. Despite recent talk from ICAP executives about the uptake of EBS Direct, this isn't the first time they have been bullish on their products. In 2013, similar optimism was voiced about the strengths of their ECN after they had applied new dealing changes, including, among other items, wider spreads but deeper liquidity per tick.
However, despite a proactive attempt by senior EBS staff, headed by CEO Gil Mandelzis, to explain the benefits of the offering to customers and banks around the world, the results of 2013 and 2014 proved that the message had mostly gone unheard. For the most part, EBS has remained what its users know as a venue of last resort, where liquidity can be found during periods of increased volatility. That reputation continued to hold, as September's strength coincided with a spike in volatility in EBS's two most traded currencies, the euro and the yen.
While the combination of EBS Market and EBS Direct is intriguing, the offering isn't unique, as many aggregation products exist in the market. In addition, as technology costs have declined, the advantages of accessing the interbank FX market through ECNs have diminished. As a result, before declaring that EBS has finally broken out of its multi-year decline, we are waiting to see how the platform performs during a quieter period, and whether customers will shun the product for lower-spread alternatives.
Just barely making it into October was Friday's announcement that GAIN Capital was acquiring City Index for $118Mln. The two firms have a history, as GAIN purchased US accounts from City Index when its subsidiary brand, FX Solutions, exited the country. The terms of the deal were similar to those of GAIN's acquisition of GFT: a combination of cash, convertible notes and stock.
Following a 21% spike in GAIN Capital shares, the deal is currently valued at $125Mln, or 12X City Index's trailing twelve-month EBITDA for the period ending September 30th, 2014. Compared to City's last publicly available filing, for the period ending March 31, 2013, the broker increased customer deposits by 27% to $344Mln over the past year and a half. However, profitability remained a problem despite rising revenues in 2014. As a result, for GAIN Capital the deal really rests on realizing cost-reduction synergies to boost margins from City Index's flow.
Slightly lost amid the acquisition news was GAIN's early release of its Q3 figures, which showed record revenues of $102.8Mln and net income of $14.7Mln. The figures compare to $69.7Mln in revenues in Q2. While Q2 was a depressed period of trading in the forex market, Q3 started off even worse. The 47% increase in revenues therefore shows just how profitable market-making can be for brokers when volatility returns, even if only for one month, as it did in September. As such, the potential of this deal probably rides on September's trading activity not proving an anomaly.
Returning to the deal, it's worth noting that the offer wasn't made by FXCM, which has been very public about being in talks with several parties, and which we can assume had been in discussions with City as well. Yes, GAIN and City have a history of working together, but the reason GAIN makes more sense than FXCM as the acquirer is that the vast majority of City Index customers trade on proprietary technology. City did announce two weeks ago that it was integrating its recent MT4 logins with its proprietary Advantage platform. That news makes more sense now, as it foreshadowed the broker's efforts to unify its customers.
As for FXCM, the broker has lately shown interest in companies offering generic platforms, since they provide easier integration with its servers. Recent examples are its purchases of the US accounts of IBFX, Alpari and FXDD, which were primarily MetaTrader 4 users. As such, we could be entering a period where, despite the advantages of a broker operating its own proprietary platform, the valuation benefit is minimal, if any.
Speaking of valuations, October has created a new list of bitcoin millionaires. No, I am not talking about people who unlocked old USB drives to find a forgotten stash of bitcoins, but about company founders.
During October, a record of nearly $80Mln in funding reached the sector, predominantly from venture capital funds. The funding was allocated to companies across the board, including bitcoin exchanges, wallet companies, hardware providers and payment processing companies. It all occurred despite bitcoin prices being well below their 2014 highs and spending most of the month bouncing between $300-$350.
So why should we care? The jury is still out on whether any of these firms will provide value to their investors. What is worth taking note of, however, is that these deals are funding the future infrastructure of an 'all things digital' world of finance. The dot-com bubble of the late 1990s brought all sorts of busts like Webvan, TheGlobe.com, and Pets.com, but emerging from it were Amazon, eBay, Google and others that have changed the way we shop and use the internet.
A similar trend appears to be taking place here: we may not feel the impact of bitcoin and other digital currencies on the financial industry now, but in ten years we will be able to look back at the seeds planted this year and see how they have grown into key ingredients of how currencies are used.
Ron, when you say “City”, I’m guessing you mean Citi, right?
Ah, too many cities.. I was reading another article on broker profitability and got this one mixed up. My bad. |
# coding=utf-8
# Copyright 2017 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
from __future__ import absolute_import, division, print_function, unicode_literals
import logging
from builtins import object
from copy import copy
logger = logging.getLogger(__name__)
class WrappedPEX(object):
"""Wrapper around a PEX that exposes only its run() method.
Allows us to set the PEX_PATH in the environment when running.
"""
_PEX_PATH_ENV_VAR_NAME = 'PEX_PATH'
def __init__(self, pex, extra_pex_paths=None):
"""
:param pex: The main pex we wrap.
:param extra_pex_paths: Other pexes, to "merge" in via the PEX_PATH mechanism.
"""
self._pex = pex
self._extra_pex_paths = extra_pex_paths
@property
def interpreter(self):
return self._pex._interpreter
def path(self):
return self._pex.path()
def cmdline(self, args=()):
cmdline = ' '.join(self._pex.cmdline(args))
pex_path = self._pex_path()
if pex_path:
return '{env_var_name}={pex_path} {cmdline}'.format(env_var_name=self._PEX_PATH_ENV_VAR_NAME,
pex_path=pex_path,
cmdline=cmdline)
else:
return cmdline
def run(self, *args, **kwargs):
env = copy(kwargs.pop('env', {}))
pex_path = self._pex_path()
if pex_path:
env[self._PEX_PATH_ENV_VAR_NAME] = pex_path
logger.debug('Executing WrappedPEX using: {}'.format(self.cmdline(args=tuple(*args))))
return self._pex.run(*args, env=env, **kwargs)
def _pex_path(self):
if self._extra_pex_paths:
return ':'.join(self._extra_pex_paths)
else:
return None
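# --- Usage sketch (assumption: DummyPex is a stand-in for pex.pex.PEX, which ---
# --- provides cmdline() and run(); it is not part of the original module) ---
class DummyPex(object):
  def cmdline(self, args=()):
    return ['./app.pex'] + list(args)
  def run(self, *args, **kwargs):
    return 0


def _demo():
  wrapped = WrappedPEX(DummyPex(), extra_pex_paths=['deps1.pex', 'deps2.pex'])
  # Prints: PEX_PATH=deps1.pex:deps2.pex ./app.pex --version
  print(wrapped.cmdline(['--version']))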
|
Jubilant FoodWorks today reported a more than two-fold increase in standalone net profit, at Rs 48.47 crore, for the second quarter ended September 30, helped by same store sales growth at Domino's Pizza.
Same store growth refers to the year-over-year growth in sales for restaurants that have been in operation for at least two years.
The company, which operates Domino's Pizza and Dunkin' Donuts chains in India, had reported a net profit of Rs 21.56 crore in the same period of previous fiscal.
Its total income was up 9.02 per cent to Rs 730.28 crore during the quarter under review as against Rs 669.82 crore in the year-ago period, the company said in a BSE filing.
The company said growth was driven by a strong 5.5 per cent same store growth (SSG) in Domino's Pizza and disciplined cost management.
"A combination of mid-single digit same store sales growth and disciplined cost management led to another solid performance in second quarter of this financial year. We made good progress towards our goals during the quarter in both Domino's Pizza and Dunkin' Donuts," JFL Chairman Shyam S Bhartia and Co Chairman Hari S Bhartia said in a statement.
Total expenses during the period stood at Rs 657.01 crore, compared to Rs 637.87 crore in the July-September quarter of the previous fiscal.
At present, the company operates 1,126 Domino's Pizza and 50 Dunkin' Donuts outlets.
During the September quarter, the company opened one new Domino's Pizza outlet and shut another, while it closed five Dunkin' Donuts outlets and opened two.
Shares of JFL were trading at Rs 1,640 per scrip, up 2.28 per cent on BSE. |