class OnlineAddOn:
    def __init__(self, application, identifier, versions):
        self.__application = application
        self.__identifier = identifier
        # Newest version first: sort descending by the version attribute.
        self.__versions = tuple(sorted(versions, key=lambda ver: ver.version, reverse=True))
        for version in self.__versions:
            version._set_addon(self)

    @property
    def identifier(self):
        return self.__identifier

    @property
    def name(self):
        return self.__versions[0].name

    @property
    def author(self):
        return self.__versions[0].author

    @property
    def homepage(self):
        return self.__versions[0].homepage

    @property
    def license(self):
        return self.__versions[0].license

    @property
    def icon(self):
        return self.__versions[0].icon

    @property
    def description(self):
        return self.__versions[0].description

    @property
    def requirements(self):
        yield from self.__versions[0].requirements

    @property
    def provisions(self):
        yield from self.__versions[0].provisions

    @property
    def versions(self):
        yield from self.__versions

    @property
    def latest_version(self):
        return self.__versions[0]

    @property
    def local_addon(self):
        return self.__application.addons.local.get_addon(self.__identifier)
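Because every metadata property delegates to self.__versions[0], the newest version (after the reverse sort on version) acts as the add-on's source of truth. A minimal sketch of that behaviour, using hypothetical stand-in version objects (the real application and version classes are not shown in this file):

class FakeVersion:
    """Hypothetical stub exposing only what OnlineAddOn touches."""
    def __init__(self, version, name):
        self.version = version  # used as the sort key
        self.name = name
        self.author = self.homepage = self.license = None
        self.icon = self.description = None
        self.requirements = []
        self.provisions = []

    def _set_addon(self, addon):  # back-reference set by OnlineAddOn
        self.addon = addon

addon = OnlineAddOn(application=None, identifier="example.addon",
                    versions=[FakeVersion((1, 0), "Old"), FakeVersion((2, 0), "New")])
print(addon.name)  # "New" -- metadata always comes from the latest version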
and that they are sticking to the 120 dB figure, so I hope that they will reconsider the matter—just as they reconsidered issues earlier tonight. As an example, the noise of a jet taking off is around 100 dB and the average human pain threshold is 110 dB, so this matter certainly needs to be looked at. Declares that the petitioners believe that the use of fireworks is increasing in terms of frequency and that the resultant nuisance of noise and perceived danger from explosions are growing with the ever-increasing size and power of fireworks available within the UK; further that fireworks can cause severe distress to people suffering from PTSD or other mental health issues and to animals. I rise to present a petition on behalf of the constituents of Glasgow South West, the most sophisticated electorate in these islands. The roll-out of universal credit is being felt by the constituents of Glasgow South West, and the Glasgow South West food bank reports an increase in food bank usage of 23% since 19 September, when universal credit arrived in the Govan jobcentre. Declares that the proposed roll-out of Universal Credit in the city of Glasgow will have a devastating impact on communities across the city and will lead to increased food bank usage and financial misery for some of the most vulnerable people in Glasgow. The petitioners therefore request that the House of Commons urges the Department for Work and Pensions to halt the roll-out of Universal Credit in Glasgow and fix it without delay.
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.

import os
import re

LICENSE = """\
/* This Source Code Form is subject to the terms of the Mozilla Public
 * License, v. 2.0. If a copy of the MPL was not distributed with this
 * file, You can obtain one at http://mozilla.org/MPL/2.0/. */

/* automatically generated by check_bindings.py. */

"""

BINDINGS_PATH = os.path.join("..", "gecko_bindings")
INPUT_FILE = os.path.join(BINDINGS_PATH, "bindings.rs")
OUTPUT_FILE = os.path.join(BINDINGS_PATH, "check_bindings.rs")

TEMPLATE = """\
    [ Servo_{name}, bindings::Servo_{name} ];
"""

with open(INPUT_FILE, "r") as bindings, open(OUTPUT_FILE, "w+") as tests:
    tests.write(LICENSE)
    tests.write("fn assert_types() {\n")

    # Raw string avoids the invalid escape sequence warning for \s.
    pattern = re.compile(r"fn\s*Servo_([_a-zA-Z0-9]+)\s*\(")

    for line in bindings:
        match = pattern.search(line)
        if match:
            tests.write(TEMPLATE.format(name=match.group(1)))

    tests.write("}\n")
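As a quick sanity check of the regex, here is what it extracts from a declaration in the style bindings.rs uses (the input line below is hypothetical, not taken from the real bindings.rs):

import re

pattern = re.compile(r"fn\s*Servo_([_a-zA-Z0-9]+)\s*\(")

# Hypothetical declaration line:
line = 'pub extern "C" fn Servo_StyleSheet_Empty(mode: u32) -> u32;'
match = pattern.search(line)
assert match and match.group(1) == "StyleSheet_Empty"
# The generated test line would then read:
#     [ Servo_StyleSheet_Empty, bindings::Servo_StyleSheet_Empty ];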
Louis Vuitton’s new Creative Director, the Ghanaian-born Virgil Abloh, prepared the debut of his first menswear collection with probably the longest and most deliberately complex teasing campaign in luxury fashion. Abloh arrived with a mandate to bring an immediate boost to Louis Vuitton’s menswear. He is himself a genius when it comes to branding, especially for athleisure and for positioning the blend of sportswear and casual as luxury; this has brought incredible profits to all of his projects and gathered him an immense global following, some even called Abloh worshippers. I strongly believe that once he was actually faced with the huge budgets of Louis Vuitton (all his other projects / brands had a fraction of this), he found himself in the middle of an actual re-branding of Louis Vuitton, not just Louis Vuitton Men. He opened his teasing with a seeming ‘mockery’ party of LV logo-mania in a warehouse / garage style venue in New York, which turned out to be the showroom of cult accessories brand Chrome Hearts, which in turn benefited from free-of-charge indirect awareness by being associated with a brand of the calibre of LV. The set-up also featured an abundance of Baccarat vases and glasses (all social media posts tagged Baccarat) and an ‘artisanal’ makeshift cardboard table setting entirely covered with the LV monogram. The star of the evening could not have been better chosen: Martha Stewart. The worldwide outdoor campaign of over half a million, with oversized statues and holograms, followed, and the teasing culminated with the actual fashion show, which was predominantly a tribute to Michael Jackson, with overt and obvious references to the late singer’s style and music, including on the event invitations. But it was not until the second part of the actual advertising campaign (the first LV Men’s ad campaign of this magnitude, delivered and scheduled in 3 parts) that some of the media (mostly online outlets, because they get no ad revenue from LV) began to see that the references to Michael Jackson were predominant. Within 3 days, ironically but not surprisingly, a UK TV station aired the documentary recalling the allegations of Michael Jackson’s repeated abuse of children, including sexual abuse. Our reaction, for instance, was very prompt, demanding much more than a simple email statement from Abloh and LV; however, our call for a ‘boycott’ was about raising awareness, not about inciting our readers to spread a simple ‘Boycott LV’ or take to social media. Here is our article. With its army of staff at all levels, including marketing and media, LV proceeded immediately to remove all references, including a very fast removal of photos from Google as well as a complete fresh start on its very powerful social media channels, seemingly pretending there had been no teasing campaign and no fashion show almost entirely an ode to Michael Jackson, and showing and promoting less of LV Men’s, the only images being those of some children playing with LV items, probably a backup fourth campaign part. No boycott, no calls for resignations, no debates and, more importantly, no celebrities daring to step in and say a word. Then came Gucci’s turn, with what seemed to be a major scandal, called the ‘blackface’, for featuring an offensive design that was overtly discriminatory, not only in its fashion show but also in social media imagery. Gucci’s reaction, unlike LV’s, was much faster and much more comprehensive. Gucci was also more pragmatic, understanding very sensibly that it needed to put a price on its mistake.
And indeed, Gucci pledged to spend 10 million on a series of initiatives, one of them being Gucci Changemakers, an entity to independently safeguard inclusivity and fight discrimination, including a full-time team for the organisation. Prada did exactly the same in response to criticism that it used racist imagery in a series of figurines, bag charms, and window displays last year. The Italian luxury label enlisted Selma and A Wrinkle in Time director Ava DuVernay to co-chair a new in-house group called the Diversity and Inclusion Advisory Council, to “elevate voices of color within the company and the fashion industry at-large.” Alongside artist and activist Theaster Gates, DuVernay’s new duties will include helping to create internships for designers and students of color. While financial power and influence in different forms, including staggering reach across their social media with millions of followers, obviously played a crucial role in ‘shelving’ these scandals, the solution has not been that simple for Dolce & Gabbana, which has never managed to recover from the boycott that exploded after it showed an apparently ‘offending’ image of a Chinese woman eating noodles in the teaser for its Shanghai fashion show. The backlash from the public, including major Asian (not just Chinese) celebrities, was so virulent that the duo had their show cancelled by the municipality and left China in less than two days. D&G sales plummeted, and the boycott remains in place among many wholesalers (buyers) and Chinese customers, not only in China but across the world. D&G chose the only option available, which was to go silent (the house was never known for being discreet and silent, the duo running it like two dictators). There has never been an officially appointed CEO of the house, and no one is even entitled to speak on its behalf. While fashion depends on public exposure in all its forms, no matter the team behind it, other luxury sectors, such as luxury hotels, have a much more complex measurement of success: beyond their facilities, architecture and interior design (no matter how lavish or luxurious), hotels earn their reputation over a longer time span by achieving and delivering consistent customer service. The people ‘behind’ are very visible, and they interact with guests at many levels. Contrast this with fashion, where Gucci is now thriving because its Creative Director is delivering relevant and surprising designs considered the most on-trend, while at the opposite end another giant, Ferragamo, despite its incredible craftsmanship and heritage, has been losing ground very fast because it can no longer find a relevant positioning. For hotels, the expectations of guests are much more complex, and unlike fashion, the luxury hotel industry has a global boycotting machine accessible to any hotel guest, whether a regular mortal or royalty. The respective guest can take to Tripadvisor even during his or her stay (not necessarily after check-out) and post 4 lines concluding that the hotel is terrible and giving it a specific score. That score is incredibly important, because the ongoing scoring places hotels in a ranking on Tripadvisor. Why a boycott? Because hotels can do very little about it, and Tripadvisor almost never takes down such an entirely negative comment, arguing that it may encourage shyer guests to speak out.
What is even more outrageous is that after 3 or 4 entirely negative comments, a top luxury hotel can go from second to eighth rank overnight; and since Tripadvisor is part of a holding that also includes one of the world’s biggest travel e-commerce websites, the holding can only, sensibly, take advantage. For years, hotels have been unsuccessfully trying to persuade Tripadvisor to implement a system for verifying whether the respective guest actually stayed at the hotel, because, guess what, over 30% of reviews are fake and fabricated. Anyone can, this very instant, create a Tripadvisor account in minutes and place a review of hotel X without ever having been to the city the hotel is in. But probably the two most important differences, when it comes to scandal, between luxury hotels and luxury fashion or other sectors are, first, that even if a hotel has, let’s say, half a million followers on social media, it cannot resort to them and start ‘complaining’ that this or that review is fake; and secondly, that while fashion brands actually pay celebrities to attend their fashion shows (beyond featuring them in campaigns), hotels cannot operationally afford to pay such celebrities, because of their much higher costs of operation. The most they can do is provide a discount, or agree a barter deal providing free-of-charge accommodation in exchange for exposure by the respective celebrity on their social media channels. But would George Clooney be comfortable with the media being made aware that the two host hotels of his wedding provided free accommodation and meals in exchange for some ‘innocent’ photo ops in a recognisable spot of the hotel? What about nationalism or hatred? To my knowledge, the only major case in recent history was the immediate sacking of the incredibly talented fashion designer John Galliano from Dior for making antisemitic remarks. The reaction of the house was immediate, and it did not take into account that Galliano had actually created his own codes at Dior while maintaining the DNA. Since his abrupt dismissal, Dior has never been back in the spotlight, not even at a fraction of its presence under Galliano, who continuously generated that WOW factor, that courageous stance that haute couture is also about. However, Galliano was never banned from Israeli-owned businesses and, to my knowledge, not even from Israel. He was never further ‘punished’ by Israel in any way. Should he choose to have coffee tomorrow in the newly reopened iconic Lutetia Hotel in Paris, will he be banned or denied entry? The hotel is overtly known for being Israeli-owned. As for LV, are its staff, or even Virgil Abloh, ‘banned’ from attending children’s events, or sanctioned by advertising supervisory authorities for the misuse of underage children in public advertising campaigns? Should Louis Vuitton run a campaign just to promote the fact that its Creative Director is African? Are Prada or Gucci employees or top management banned from black-owned businesses, for instance a luxury hotel owned by a black person? Do their initiatives to spend millions and set up diversity / inclusion organisations work as censorship bodies, supervising each and every product to make sure it will not contain the smallest discriminatory offence? All of these questions only give way to insane answers. Fashion is art, and it is supposed to create trends and focus on quality and positioning. Fashion is also about surprise and even about controversy: Christian Dior, Yves Saint Laurent and Balmain were all criticised in their times.
As for what brands can and should do: at a business level, they should ensure coherence. Louis Vuitton, for instance, should take a closer look at its business choice of running two completely different brands under its two Creative Directors (LV men’s has nothing in common with LV women’s except the logo). D&G should continue to learn how to stay silent, employ some other REAL voices, and appeal to top, experienced luxury professionals to address the issues emerging from a business decision. Returning to hotels: instead of abusing their status, celebrities should take to Tripadvisor and invite others to ‘boycott’ hotels there. It is a huge platform with a global monopoly, with a much higher audience and traffic than their social media accounts, or even CNN. The level of engagement is also ideal, and followers of their boycotts can hide their identity and speak out. They can complain about the owners of the respective hotel and about the laws in the country of origin of the investor / owner. Unfortunately for them, this will not bring down the likes of the Dorchester Collection, which has not only achieved the highest luxury hospitality standards but has also demonstrated, thanks to continued investment in the best staff, that its service has been consistent for years. Also, their service is not discriminatory! A hotel guest at the Dorchester Collection will never be judged for the attire he or she wears at check-in, or be treated better for staying in a suite; most importantly, a non-celebrity guest will not be treated in a lesser way or given less attention than a celebrity. With this, they have gained the trust of their guests and patrons, and this is the ultimate achievement. There are very few luxury hotel chains at the very top globally, and the Dorchester Collection is not only one of them but is also recognised as a leader: a leader in service, innovation and human resources management. The guests of the Ritz in Paris would not base their decision on whether to return to the hotel on some celebrity suddenly calling for a boycott because it is owned by an Egyptian and Egypt, as a country, has certain legislation; by logic that absurd, some will claim that Chanel will die without Lagerfeld (the Ritz Paris has the only Chanel Spa in the world). Sadly, all of this hatred brings more hatred and negativity, directly targeting the people of the Dorchester Collection. They are the heart and soul of each hotel, and they find themselves in the firing line for doing nothing but their best in their jobs.
"""Input plugin for www.betaseries.com""" from __future__ import unicode_literals, division, absolute_import from hashlib import md5 import logging from flexget import plugin from flexget.entry import Entry from flexget.event import event from flexget.utils import requests from flexget.utils.cached_input import cached log = logging.getLogger('betaseries_list') API_URL_PREFIX = 'http://api.betaseries.com/' class BetaSeriesList(object): """ Emits an entry for each serie followed by one or more BetaSeries account. See http://www.betaseries.com/ Configuration examples: # will get all series followed by the account identified by your_user_name betaseries_list: username: your_user_name password: your_password api_key: your_api_key # will get all series followed by the account identified by some_other_guy betaseries_list: username: your_user_name password: your_password api_key: your_api_key members: - some_other_guy # will get all series followed by the accounts identified by guy1 and guy2 betaseries_list: username: your_user_name password: your_password api_key: your_api_key members: - guy1 - guy2 Api key can be requested at http://www.betaseries.com/api. This plugin is meant to work with the import_series plugin as follow: import_series: from: betaseries_list: username: xxxxx password: xxxxx api_key: xxxxx """ schema = { 'type': 'object', 'properties': { 'username': {'type': 'string'}, 'password': {'type': 'string'}, 'api_key': {'type': 'string'}, 'members': { 'type': 'array', 'items': { "title": 'member name', "type": "string" } } }, 'required': ['username', 'password', 'api_key'], 'additionalProperties': False } @cached('betaseries_list', persist='2 hours') def on_task_input(self, task, config): username = config['username'] password = config['password'] api_key = config['api_key'] members = config.get('members', [username]) titles = set() try: user_token = create_token(api_key, username, password) for member in members: titles.update(query_series(api_key, user_token, member)) except (requests.RequestException, AssertionError) as err: log.critical('Failed to get series at BetaSeries.com: %s' % err.message, exc_info=err) log.verbose("series: " + ", ".join(titles)) entries = [] for t in titles: e = Entry() e['title'] = t entries.append(e) return entries def create_token(api_key, login, password): """ login in and request an new API token. http://www.betaseries.com/wiki/Documentation#cat-members :param string api_key: Api key requested at http://www.betaseries.com/api :param string login: Login name :param string password: Password :return: User token """ r = requests.post(API_URL_PREFIX + 'members/auth', params={ 'login': login, 'password': md5(password).hexdigest() }, headers={ 'Accept': 'application/json', 'X-BetaSeries-Version': '2.1', 'X-BetaSeries-Key': api_key, }) assert r.status_code == 200, "Bad HTTP status code: %s" % r.status_code j = r.json() error_list = j['errors'] for err in error_list: log.error(str(err)) if not error_list: return j['token'] def query_member_id(api_key, user_token, login_name): """ Get the member id of a member identified by its login name. 
:param string api_key: Api key requested at http://www.betaseries.com/api :param string user_token: obtained with a call to create_token() :param string login_name: The login name of the member :return: Id of the member identified by its login name or `None` if not found """ r = requests.get(API_URL_PREFIX + 'members/search', params={ 'login': login_name }, headers={ 'Accept': 'application/json', 'X-BetaSeries-Version': '2.1', 'X-BetaSeries-Key': api_key, 'X-BetaSeries-Token': user_token, }) assert r.status_code == 200, "Bad HTTP status code: %s" % r.status_code j = r.json() error_list = j['errors'] for err in error_list: log.error(str(err)) found_id = None if not error_list: for candidate in j['users']: if candidate['login'] == login_name: found_id = candidate['id'] break return found_id def query_series(api_key, user_token, member_name=None): """ Get the list of series followed by the authenticated user :param string api_key: Api key requested at http://www.betaseries.com/api :param string user_token: Obtained with a call to create_token() :param string member_name: [optional] A member name to get the list of series from. If None, will query the member for whom the user_token was for :return: List of serie titles or empty list """ params = {} if member_name: member_id = query_member_id(api_key, user_token, member_name) if member_id: params = {'id': member_id} else: log.error("member %r not found" % member_name) return [] r = requests.get(API_URL_PREFIX + 'members/infos', params=params, headers={ 'Accept': 'application/json', 'X-BetaSeries-Version': '2.1', 'X-BetaSeries-Key': api_key, 'X-BetaSeries-Token': user_token, }) assert r.status_code == 200, "Bad HTTP status code: %s" % r.status_code j = r.json() error_list = j['errors'] for err in error_list: log.error(str(err)) if not error_list: return [x['title'] for x in j['member']['shows'] if x['user']['archived'] is False] else: return [] @event('plugin.register') def register_plugin(): plugin.register(BetaSeriesList, 'betaseries_list', api_ver=2)
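Outside of FlexGet, the module-level helpers can be exercised directly; a minimal sketch with placeholder credentials (this still requires FlexGet installed, since the helpers rely on flexget.utils.requests):

# Hypothetical direct use of the helpers; credentials are placeholders.
api_key = 'your_api_key'
token = create_token(api_key, 'your_user_name', 'your_password')
if token:
    for title in query_series(api_key, token, 'some_other_guy'):
        print(title)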
"Quality 1st, Honesty as base, Sincere assistance and mutual profit" is our idea, in order to create consistently and pursue the excellence for Simple Trivet Mat , Stovetop Moka Pot , Sieve , plus the right solution. We always work as a tangible team to ensure that we can provide you with the best quality and the best price for Simple Trivet Mat , Stovetop Moka Pot , Sieve , Our staffs are rich in experience and trained strictly with qualified knowledge with energy and always respect their customers as the No. 1 and promise to do their best to offer the effective and particular person service for customers. The Company pays attention to maintaining and developing the long-term cooperation relationship with the customers. We promise as your ideal partner we'll develop a bright future and enjoy the satisfying fruit together with you with persisting zeal endless energy and forward spirit.
import fs._fs as c
from fs.particles import Particles


def init(nc, boxsize, a, ps, seed, kind):
    """Generate 2LPT displacements and particle positions.

    This function generates a random Gaussian initial condition and creates
    a grid of particles with 2LPT displacements. The velocities are 0.

    Args:
        nc (int): Number of particles per dimension; number of particles np = nc**3.
        boxsize (float): length of the periodic box on a side [1/h Mpc].
        a (float): scale factor at which the positions are computed.
        ps (PowerSpectrum): Linear power spectrum extrapolated to a=1.
        seed (int): random seed for the random Gaussian initial density field.
        kind (str): kind of xv, 'zeldovich', '2lpt', or 'cola'; cola sets v=0.

    Returns:
        An instance of class Particles.
    """
    return Particles(_particles=c._lpt(nc, boxsize, a, seed, ps._ps, kind.lower()))


def set_offset(offset):
    """Set offset with respect to grid points.

    x = (ix + offset)*dx, where ix is an integer, dx = boxsize/nc.

    Args:
        offset (float): offset (0 <= offset < 1)
    """
    c._set_offset(offset)


def set_zeldovich_force(particles, a):
    """Set Zel'dovich (1LPT) force to particles.force.

    Args:
        a (float): scale factor of the force
    """
    c._set_zeldovich_force(particles._particles, a)
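A minimal usage sketch, assuming the package exposes a PowerSpectrum loader (the fs.PowerSpectrum name and the filename here are assumptions for illustration, not confirmed by this module):

import fs
import fs.lpt

# Hypothetical: load a linear power spectrum extrapolated to a=1.
ps = fs.PowerSpectrum("linear_matterpower.dat")

# 64^3 particles in a 64 Mpc/h box, 2LPT positions at scale factor a=0.1.
particles = fs.lpt.init(nc=64, boxsize=64.0, a=0.1, ps=ps, seed=42, kind="2lpt")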
Note: Poster made by the organizers, not the participating venues, who all have much nicer artwork! Updated poster coming soon. Craft brew venues around China will team up November 4 to support two good causes. Boxing Cat in Shanghai, WE Brewery in Tianjin and Arrow Factory, Dirty Duck, Great Leap and Slow Boat in Beijing will all participate in a Maovember fundraiser to support Good Works and Library Project, with more venues expected to join the campaign. The event is a riff on “beers for books” events held worldwide, where a set amount of money from each beer sold is donated to a charity like Library Project or Room to Read. Except this time the event involves multiple venues and multiple cities and is essentially Super Beers for Books. The concept is simple: each venue donates rmb10 or more per beer sold on November 4. Venues also have the option of doing a fun activity to raise a bit more money, like our (at times hilarious) darts challenge at this year’s opening party, or mini beer pong, or liar’s dice, or a raffle, or a beer nog session (hmm, maybe skip that last one). That’s optional. The key is to raise money for charity from people drinking nice delicious beer. We’ll have more details by the end of the week. And if you are a craft beer venue and want to get involved, give us a shout at nihao (at) maovember.com. September library donations have begun!
import unittest
import smtplib

import minimock
from minimock import assert_same_trace, mock, Mock, TraceTracker
from StringIO import StringIO

import osc.core

import obswatch


class TestObswatch(unittest.TestCase):

    # nose knows about setUpClass, but python 2.6's unittest doesn't
    # @classmethod
    # def setUpClass(cls):
    def setUp(self):
        repo = Mock('repo')
        repo.name = 'standard'
        repo.arch = 'x86_64'

        package = Mock('package')
        package.project = 'openSUSE:11.3'
        package.name = 'osc'

        self.build = obswatch.Build(package=package, repo=repo,
                                    interested={'geeko': '[email protected]'})
        self.package = package
        self.tt = TraceTracker()
        obswatch.SLEEP_TIME = 0

    def tearDown(self):
        minimock.restore()
        self.tt.clear()

    def test_get_latest_packages(self):
        mock('obswatch.http_GET', tracker=self.tt,
             returns=StringIO('''<?xml version="1.0" encoding="UTF-8"?>
<latest_added>
  <package created="2010-09-09T14:03:06+02:00" name="antivir" project="home:varkoly:branches:openSUSE:11.3:NonFree"/>
  <project created="2010-09-09T14:03:05+02:00" name="home:varkoly:branches:openSUSE:11.3:NonFree"/>
  <package created="2010-09-09T13:50:37+02:00" name="test9" project="home:enzokiel:test"/>
  <package created="2010-09-09T13:12:54+02:00" name="kernel-bfs-source" project="home:jingtw"/>
  <package created="2010-09-09T13:12:08+02:00" name="getdata" project="home:christiantrippe:branches:KDE:Distro:Factory"/>
  <package created="2010-09-09T13:05:13+02:00" name="perl-String-CRC32" project="home:seife:byd"/>
  <package created="2010-09-09T13:05:04+02:00" name="autogen" project="home:psmt:branches:Base:System"/>
</latest_added>'''))

        result = obswatch.get_latest_packages(7)

        assert_same_trace(self.tt, """Called obswatch.http_GET(
    '%sstatistics/latest_updated?limit=7')""" % obswatch.APIURL)

        for p in result:
            self.assertTrue(isinstance(p, obswatch.Package))
        self.assertEqual(result[0].name, 'antivir')
        self.assertEqual(len(result), 6)  # second one is a project

    def test_get_user_email(self):
        mock('obswatch.http_GET', tracker=self.tt,
             returns=StringIO('''<person>
  <login>Geeko</login>
  <email>[email protected]</email>
  <realname>Geeko Chameleon</realname>
  <watchlist/>
</person>'''))

        result = obswatch.get_user_email('Geeko')

        assert_same_trace(self.tt, """Called obswatch.http_GET(
    '%sperson/Geeko')""" % obswatch.APIURL)

        self.assertEqual(result, '[email protected]')

    def test_users_from_url(self):
        mock('obswatch.http_GET', tracker=self.tt,
             returns=StringIO('''<?xml version="1.0" encoding="UTF-8"?>
<project name="superkde" created="2005-01-01T00:00:02+01:00" updated="2007-01-19T10:44:45+01:00">
  <title>SuperKDE</title>
  <description>SuperKDE is a heavily tuned version of KDE.</description>
  <link project="openSUSE:11.2:Update" />
  <link project="openSUSE:11.2" />
  <person role="maintainer" userid="Geeko"/>
  <person role="maintainer" userid="BrownGeeko"/>
  <group role="reviewer" groupid="release_team"/>
  <build>
    <disable />
  </build>
  <repository name="kde4:factory" rebuild="transitive">
    <path project="kde4" repository="factory"/>
    <arch>i386</arch>
    <arch>x86_64</arch>
  </repository>
</project>'''))
        mock('obswatch.get_user_email', returns='[email protected]')

        result = obswatch.get_users_from_url('%ssource/superkde/_meta' % obswatch.APIURL)

        assert_same_trace(self.tt, """Called obswatch.http_GET(
    '%ssource/superkde/_meta')""" % obswatch.APIURL)

        self.assertEqual(len(result), 2)
        self.assertEqual(result['Geeko'], '[email protected]')
        self.assertEqual(result['BrownGeeko'], '[email protected]')

    def test_get_builds(self):
        mock('osc.core.http_GET', tracker=self.tt,
             returns=StringIO('''<?xml version="1.0" encoding="UTF-8"?>
<project name="superkde" created="2005-01-01T00:00:02+01:00" updated="2007-01-19T10:44:45+01:00">
  <title>SuperKDE</title>
  <description>SuperKDE is a heavily tuned version of KDE.</description>
  <link project="openSUSE:11.2:Update" />
  <link project="openSUSE:11.2" />
  <person role="maintainer" userid="ernie"/>
  <group role="reviewer" groupid="release_team"/>
  <build>
    <disable />
  </build>
  <useforbuild>
    <disable />
  </useforbuild>
  <repository name="kde4:factory" rebuild="transitive">
    <path project="kde4" repository="factory"/>
    <arch>i386</arch>
    <arch>x86_64</arch>
  </repository>
  <repository name="suselinux-9.3">
    <path project="suselinux-9.3" repository="standard"/>
    <arch>i386</arch>
  </repository>
  <repository name="gnomespecial" rebuild="local">
    <path project="gnome3" repository="suselinux-9.3"/>
    <path project="suselinux-9.3" repository="standard"/>
    <arch>i386</arch>
  </repository>
</project>'''))
        # source/superkde/_meta gets called by osc.core.get_repos_of_project
        mock('obswatch.get_interested', returns={'Geeko': '[email protected]'})

        superkde = Mock('package')
        superkde.name = 'superkde'
        superkde.project = 'superkde'
        superkde.created = '2007-01-19T10:44:45+01:00'

        result = obswatch.get_builds(superkde)

        assert_same_trace(self.tt, """Called osc.core.http_GET(
    '%ssource/superkde/_meta')""" % obswatch.APIURL)

    def test_build_get_remote_status(self):
        mock('obswatch.http_GET', tracker=self.tt,
             returns=StringIO('''<status package="osc" code="disabled">
  <details></details>
</status>'''))

        code = self.build.get_remote_status()

        assert_same_trace(self.tt, """Called obswatch.http_GET(
    '%sbuild/openSUSE:11.3/standard/x86_64/osc/_status')""" % obswatch.APIURL)

        self.assertEqual(code, 'disabled')

    def test_process_same_status(self):
        self.build.get_remote_status = lambda: self.build.status
        result = obswatch.process_build(self.build)
        self.assertTrue(result)

    def test_process_intermediate(self):
        self.build.get_remote_status = lambda: 'building'
        result = obswatch.process_build(self.build)
        self.assertTrue(result)
        self.assertEqual(self.build.status, 'building')

    def test_process_other(self):
        self.build.get_remote_status = lambda: 'excluded'
        result = obswatch.process_build(self.build)
        self.assertFalse(result)

    def test_process_unknown(self):
        self.build.get_remote_status = lambda: 'infundibulated'
        self.assertRaises(Exception, obswatch.process_build, self.build)

    def test_process_final_not_succeeded(self):
        self.build.get_remote_status = lambda: 'failed'
        result = obswatch.process_build(self.build)
        self.assertFalse(result)

    def test_final_succeeded(self):
        self.build.get_remote_status = lambda: 'succeeded'
        mock('obswatch.Build.get_binaries', returns={'foo': 'bar'})
        mock('obswatch.send_email', tracker=self.tt)

        result = obswatch.process_build(self.build)
        self.assertFalse(result)

        expected_output = """Called obswatch.send_email(
    'geeko',
    '[email protected]',
    <obswatch.Build instance at ...>,
    {'foo': 'bar'})"""
        assert_same_trace(self.tt, expected_output)

    def test_interested(self):
        mock('obswatch.get_users_from_url', returns_func=lambda url: {url: url})
        result = obswatch.get_interested(self.package)
        # both the project and package page should be checked for users
        self.assertEqual(result, {
            'https://api.opensuse.org/source/openSUSE:11.3/_meta':
                'https://api.opensuse.org/source/openSUSE:11.3/_meta',
            'https://api.opensuse.org/source/openSUSE:11.3/osc/_meta':
                'https://api.opensuse.org/source/openSUSE:11.3/osc/_meta'})

    def test_send_email(self):
        mock('smtplib.SMTP',
             returns=Mock('smtp_connection', tracker=self.tt),
             tracker=self.tt)

        obswatch.send_email('geeko', '[email protected]', 'yourpackage',
                            {'rpm1': 'http://opensuse.org/rpm1',
                             'rpm2': 'http://opensuse.org/rpm2'})

        expected_output = """Called smtplib.SMTP('localhost')
Called smtp_connection.sendmail(
    '[email protected]',
    ['[email protected]'],
    'To: [email protected]\\nFrom: [email protected]\\nSubject: (osc) build succeeded: yourpackage\\n\\nThe package yourpackage has finished building and can now be downloaded from:\\nrpm1 - http://opensuse.org/rpm1\\nrpm2 - http://opensuse.org/rpm2')
Called smtp_connection.quit()
"""
        assert_same_trace(self.tt, expected_output)


if __name__ == '__main__':
    unittest.main()
It&apos;s only a walk downstairs to our onsite Bistro, serving up Starbucks specialty drinks, healthy breakfast and dinner menus, and an evening cocktail bar. Head east on State Hwy 388 for 8.2 miles. Turn right at FL-77 S/State Hwy 388, go 9.6 miles. Turn left at E 24th Plaza and travel for 0.2 miles. Take the 2nd right onto N Palo Alto Ave, then take the 1st left onto E 23rd Plaza.
#!/usr/bin/python
import threading
import time
import glob

import MacroParser
import Telldus


class MacroCollection(object):
    """This is a collection of macros defined in xml format that are read
    from a list of folders.

    Attribute:
    folder_list -- The list with the paths from where to collect the macros
    """
    def __init__(self, folder_list):
        """Initializes the MacroCollection object."""
        self.PotentialMacroFiles = []
        self.collection = {}

        for folder in folder_list:  # Yes, for every folder in the list...
            folder = folder.strip()

            # TODO: if this is to be used on, let's say, an inferior
            # operating system, this has to be adapted...
            # (strings are immutable; build the glob pattern with +=,
            # list.append() would crash here)
            if folder[-1] != '/':
                folder += '/*.xml'
            else:
                folder += '*.xml'

            # add the macro files to our list of files...
            self.PotentialMacroFiles.extend(glob.glob(folder))

        # OK, go through the potential macro files and make macro objects
        # out of them.
        for PotentialMacroFile in self.PotentialMacroFiles:
            macro = None
            try:
                macro = xml_macro(PotentialMacroFile)
            except:
                pass
            else:
                macro.name  # note: collection population looks unfinished here


class wakeup_macro(threading.Thread):
    """This is a macro that can be started and be self maintained. It feeds
    the command queue in the controller with commands until the macro
    completes.

    Attribute:
    unit_id -- The specific dimmer that should be used for this wakeup
               sequence.
    """
    def __init__(self, unit_id):
        """Initialize the wakeup macro."""
        threading.Thread.__init__(self)
        self.unit_id = unit_id

    def run(self):
        """This is where the magic happens!"""
        dim_value = 0
        print time.asctime()
        print "wakeup macro for unit %d started" % self.unit_id
        while dim_value < 255:
            if dim_value < 10:
                dim_value += 1
            elif dim_value < 20:
                dim_value += 2
            else:
                dim_value += 5
            if dim_value > 255:
                dim_value = 255

            # Create the command!
            cmd = Telldus.Controller.TelldusCommand(
                "dim::%d:%d:" % (self.unit_id, dim_value), False)

            # enqueue a command that sets the level on a lamp, this is later
            # received by the Telldus Controller
            Telldus.Controller.CommandQueue.put(cmd)

            # Sleep for a while so we don't ramp up the lux level too quickly
            time.sleep(10)

        print time.asctime()
        print "wakeup macro for unit %d completed" % self.unit_id


class xml_macro(threading.Thread):
    """This is an xml macro that reads a file and then executes it.

    Attribute:
    macro_file_path -- The path to the xml file defining the macro.
    """
    def __init__(self, macro_file_path):
        """Initialize the xml macro."""
        threading.Thread.__init__(self)
        # print os.getcwd()
        self.macroObjects = MacroParser.Macro(macro_file_path)

    def run(self):
        """This is where the magic happens!"""
        self.macroObjects.execute()
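A minimal usage sketch of the wakeup ramp, assuming a running Telldus.Controller is consuming the command queue (the unit id 3 below is a hypothetical example):

# Start a self-maintained wakeup ramp on dimmer unit 3 (hypothetical id).
macro = wakeup_macro(unit_id=3)
macro.start()  # run() ramps dim_value from 0 to 255, one command every 10 s
macro.join()   # optional: block until the ramp completes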
Fulham boss Slavisa Jokanovic has challenged his players to be brave enough to pile the pressure on promotion rivals Cardiff as the season approaches its climax. The third-placed Whites head into Friday night’s clash with relegated Sunderland knowing victory will take them into second, for a few hours at least, before the Bluebirds travel to Hull the following day in the penultimate round of fixtures. Jokanovic is acutely aware that even victory over the Black Cats and at Birmingham on the final day of the campaign might not be enough to secure automatic promotion, but is looking to add two more victories to an unbeaten run of 22 games to keep the dream alive. He told the club’s website: “We have positive pressure before we play against Sunderland. It’s important we don’t make mistakes, but the situation is completely under Cardiff’s control. If they win these two games, we cannot do anything. “In front of us are two ways of being successful: one is short [automatic promotion], another is longer [the play-offs]. We must be ready for both ways. “If you don’t finish second – which is more or less normal for us because we’ve never been in this position during this year – you must be brave, do your job, and see what happens.” Sunderland travel south already facing life in League One next season following their second successive relegation. With Fulham and champions Wolves to come in their final two games, things could get worse before they get better, but manager Chris Coleman insists they will not be throwing in the towel. Former Fulham man Coleman said at his pre-match press conference: “The season is 46 games. I’m not complaining to anybody else because we have been relegated. It’s our fault, it’s our problem. “We’ll try to take care of our own business, and if anybody is promoted or relegated it’s because it’s on them and it’s across the season.”
#!/usr/bin/python
# -*- coding: utf-8 -*-

"""
Python-nvd3 is a Python wrapper for NVD3 graph library.
NVD3 is an attempt to build re-usable charts and chart components
for d3.js without taking away the power that d3.js gives you.

Project location : https://github.com/areski/python-nvd3
"""

from .NVD3Chart import NVD3Chart, TemplateMixin


class cumulativeLineChart(TemplateMixin, NVD3Chart):
    """
    A cumulative line chart is used when you have one important grouping
    representing an ordered set of data and one value to show, summed over time.

    Python example::

        from nvd3 import cumulativeLineChart
        chart = cumulativeLineChart(name='cumulativeLineChart', x_is_date=True)
        xdata = [1365026400000000, 1365026500000000, 1365026600000000]
        ydata = [6, 5, 1]
        y2data = [36, 55, 11]

        extra_serie = {"tooltip": {"y_start": "There are ", "y_end": " calls"}}
        chart.add_serie(name="Serie 1", y=ydata, x=xdata, extra=extra_serie)

        extra_serie = {"tooltip": {"y_start": "", "y_end": " mins"}}
        chart.add_serie(name="Serie 2", y=y2data, x=xdata, extra=extra_serie)
        chart.buildhtml()

    Javascript generated:

    .. raw:: html

        <div id="cumulativeLineChart"><svg style="height:450px; width:100%"></svg></div>
        <script>
            data_cumulativeLineChart=[{"values": [{"y": 6, "x": 1365026400000000}, {"y": 5, "x": 1365026500000000}, {"y": 1, "x": 1365026600000000}], "key": "Serie 1", "yAxis": "1"}, {"values": [{"y": 36, "x": 1365026400000000}, {"y": 55, "x": 1365026500000000}, {"y": 11, "x": 1365026600000000}], "key": "Serie 2", "yAxis": "1"}];

            nv.addGraph(function() {
                var chart = nv.models.cumulativeLineChart();
                chart.margin({top: 30, right: 60, bottom: 20, left: 60});
                var datum = data_cumulativeLineChart;
                chart.xAxis
                    .tickFormat(function(d) {
                        return d3.time.format('%d %b %Y')(new Date(parseInt(d)))
                    });
                chart.yAxis
                    .tickFormat(d3.format(',.1%'));
                chart.tooltipContent(function(key, y, e, graph) {
                    var x = d3.time.format("%d %b %Y")(new Date(parseInt(graph.point.x)));
                    var y = String(graph.point.y);
                    if(key == 'Serie 1'){
                        var y = 'There are ' + String(e) + ' calls';
                    }
                    if(key == 'Serie 2'){
                        var y = String(e) + ' mins';
                    }
                    tooltip_str = '<center><b>'+key+'</b></center>' + y + ' on ' + x;
                    return tooltip_str;
                });
                chart.showLegend(true);
                d3.select('#cumulativeLineChart svg')
                    .datum(datum)
                    .transition().duration(500)
                    .attr('height', 450)
                    .call(chart);
            });
        </script>
    """

    CHART_FILENAME = "./cumulativelinechart.html"
    template_chart_nvd3 = NVD3Chart.template_environment.get_template(CHART_FILENAME)

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.model = 'cumulativeLineChart'

        height = kwargs.get('height', 450)
        width = kwargs.get('width', None)

        if kwargs.get('x_is_date', False):
            self.set_date_flag(True)
            self.create_x_axis('xAxis',
                               format=kwargs.get('x_axis_format', '%d %b %Y'),
                               date=True)
            self.set_custom_tooltip_flag(True)
        else:
            self.create_x_axis('xAxis', format=kwargs.get('x_axis_format', '.2f'))

        self.create_y_axis('yAxis', format=kwargs.get('y_axis_format', '.1%'))

        self.set_graph_height(height)
        if width:
            self.set_graph_width(width)
Source: Journal of Coatings Technology. Vol. 71, no. 895 (Aug. 1999): p. 97-102: ill. Alkyd-, oil-modified-latex-, and latex-based finishes were applied to severely weathered western redcedar and redwood boards that did not have any surface treatment to ameliorate the weathered surface prior to painting. Six finishes were evaluated annually for 11 years for cracking, flaking, erosion, mildew growth, discoloration, and general appearance. Low-solids-content latex finishes that contained about 10% raw linseed oil and 11% acrylic resin (i.e., the oil-modified latex finishes) performed better on badly weathered wood than did the alkyd and the other latex finish, even after 11 years. Latex finishes that contained raw linseed oil probably stabilized the weathered surface and plasticized the finish. The stabilization of the wood surface and the flexibility of the finish throughout its service life are the important factors in finish performance on these weathered substrates. Williams, R. Sam; Sotos, Peter; Feist, William. 1999. Evaluation of several finishes on severely weathered wood. Journal of Coatings Technology. Vol. 71, no. 895 (Aug. 1999): p. 97-102: ill.
#MenuTitle: New Tab with Small Paths
# -*- coding: utf-8 -*-
__doc__="""
Finds small paths (smaller than a user-definable threshold) in glyphs and opens a new tab with affected glyphs.
"""

import vanilla
import GlyphsApp
# Note: Glyphs, Message and NSUserDefaults are provided by the Glyphs
# macro environment.

class FindSmallPaths(object):
    def __init__(self):
        # Window 'self.w':
        windowWidth = 250
        windowHeight = 190
        windowWidthResize = 300   # user can resize width by this value
        windowHeightResize = 0    # user can resize height by this value
        self.w = vanilla.FloatingWindow(
            (windowWidth, windowHeight),  # default window size
            "Find Small Paths",           # window title
            minSize=(windowWidth, windowHeight),  # minimum size (for resizing)
            maxSize=(windowWidth + windowWidthResize, windowHeight + windowHeightResize),  # maximum size (for resizing)
            autosaveName="com.mekkablue.FindSmallPaths.mainwindow"  # stores last window position and size
        )

        # UI elements:
        self.w.text_1 = vanilla.TextBox((15, 10, -15, 30), "Open a new tab with glyphs that contain paths with an area smaller than:", sizeStyle='small')
        self.w.minArea = vanilla.TextBox((15, 42, -15, 15 + 3), "1000 square units", sizeStyle='small', alignment="center")
        self.w.sliderMin = vanilla.EditText((15, 60 - 1, 50, 19), "10", sizeStyle='small', callback=self.SliderUpdate)
        self.w.sliderMax = vanilla.EditText((-15 - 50, 60 - 1, -15, 19), "10000", sizeStyle='small', callback=self.SliderUpdate)
        self.w.areaSlider = vanilla.Slider((15 + 50 + 10, 60, -15 - 50 - 10, 19), value=0.1, minValue=0.0, maxValue=1.0, sizeStyle='small', callback=self.SliderUpdate)
        self.w.deleteThemRightAway = vanilla.CheckBox((15, 80 + 10, -15, 20), "Delete Small Paths Right Away", value=False, callback=self.CheckBoxUpdate, sizeStyle='small')
        self.w.afterOverlapRemoval = vanilla.CheckBox((15, 100 + 10, -15, 20), "After Decomposition and Overlap Removal (slower)", value=False, callback=self.CheckBoxUpdate, sizeStyle='small')

        # Run Button:
        self.w.runButton = vanilla.Button((-120, -20 - 15, -15, -15), "Open Tab", sizeStyle='regular', callback=self.FindSmallPathsMain)
        self.w.setDefaultButton(self.w.runButton)

        # Load Settings:
        if not self.LoadPreferences():
            print "Note: 'Find Small Paths' could not load preferences. Will resort to defaults"

        self.CheckBoxUpdate(None)
        self.SliderUpdate(None)

        # Open window and focus on it:
        self.w.open()
        self.w.makeKey()

    def SavePreferences(self, sender):
        try:
            Glyphs.defaults["com.mekkablue.FindSmallPaths.sliderMin"] = self.w.sliderMin.get()
            Glyphs.defaults["com.mekkablue.FindSmallPaths.sliderMax"] = self.w.sliderMax.get()
            Glyphs.defaults["com.mekkablue.FindSmallPaths.areaSlider"] = float(self.w.areaSlider.get())
            Glyphs.defaults["com.mekkablue.FindSmallPaths.deleteThemRightAway"] = int(self.w.deleteThemRightAway.get())
            Glyphs.defaults["com.mekkablue.FindSmallPaths.afterOverlapRemoval"] = int(self.w.afterOverlapRemoval.get())
        except Exception as e:
            print e
            return False
        return True

    def LoadPreferences(self):
        try:
            NSUserDefaults.standardUserDefaults().registerDefaults_({
                "com.mekkablue.FindSmallPaths.sliderMin": "10",
                "com.mekkablue.FindSmallPaths.sliderMax": "100000",
                "com.mekkablue.FindSmallPaths.areaSlider": 0.1,
                "com.mekkablue.FindSmallPaths.deleteThemRightAway": 0,
                "com.mekkablue.FindSmallPaths.afterOverlapRemoval": 0
            })
            self.w.sliderMin.set(Glyphs.defaults["com.mekkablue.FindSmallPaths.sliderMin"])
            self.w.sliderMax.set(Glyphs.defaults["com.mekkablue.FindSmallPaths.sliderMax"])
            self.w.areaSlider.set(float(Glyphs.defaults["com.mekkablue.FindSmallPaths.areaSlider"]))
            self.w.deleteThemRightAway.set(bool(Glyphs.defaults["com.mekkablue.FindSmallPaths.deleteThemRightAway"]))
            self.w.afterOverlapRemoval.set(bool(Glyphs.defaults["com.mekkablue.FindSmallPaths.afterOverlapRemoval"]))
        except Exception as e:
            print e
            return False
        return True

    def CheckBoxUpdate(self, sender):
        try:
            # mutually exclusive check boxes:
            theOne = self.w.afterOverlapRemoval
            theOther = self.w.deleteThemRightAway
            theOther.enable(not bool(theOne.get()))
            theOne.enable(not bool(theOther.get()))

            # Hack as long as vanilla.CheckBox.getNSButton is not implemented:
            if theOne.get():
                theOther.set(False)
            if theOther.get():
                theOne.set(False)

            # save prefs:
            if not self.SavePreferences(self):
                print "Note: 'Find Small Paths' could not write preferences."

            return True
        except Exception as e:
            print e
            return False

    def SliderUpdate(self, sender):
        try:
            minArea = self.CurrentMinArea()
            if not sender == self.w.areaSlider:
                if not self.SavePreferences(self):
                    print "Note: 'Find Small Paths' could not write preferences."
            return True
        except:
            return False

    def CurrentMinArea(self):
        minimum = float(self.w.sliderMin.get())
        maximum = float(self.w.sliderMax.get())

        # check for integrity of min and max values:
        if minimum < 1.0:
            minimum = 1.0
            self.w.sliderMin.set("%i" % minimum)
        if maximum < minimum:
            maximum = minimum + 10.0
            self.w.sliderMax.set("%i" % maximum)

        sliderPos = float(self.w.areaSlider.get())
        minArea = minimum + sliderPos * (maximum - minimum)
        self.w.minArea.set("%i square units" % minArea)
        return minArea

    def FindSmallPathsMain(self, sender):
        try:
            minArea = self.CurrentMinArea()
            smallPathsShouldBeDeleted = self.w.deleteThemRightAway.get()
            overlapsShouldBeRemovedFirst = self.w.afterOverlapRemoval.get()

            glyphsWithSmallPaths = []
            thisFont = Glyphs.font  # frontmost font
            for thisGlyph in thisFont.glyphs:
                thisGlyph.beginUndo()  # begin undo grouping
                for thisLayer in thisGlyph.layers:
                    if thisLayer.paths:
                        if overlapsShouldBeRemovedFirst:
                            checkLayer = thisLayer.copyDecomposedLayer()
                            checkLayer.removeOverlap()
                            for thisPath in checkLayer.paths:
                                if thisPath.area() < minArea:
                                    glyphsWithSmallPaths.append(thisGlyph.name)
                        else:
                            # iterate in reverse so deletion does not shift indexes:
                            for i in range(len(thisLayer.paths))[::-1]:
                                thisPath = thisLayer.paths[i]
                                if thisPath.area() < minArea:
                                    glyphsWithSmallPaths.append(thisGlyph.name)
                                    if smallPathsShouldBeDeleted:
                                        print "deleting", thisPath
                                        del thisLayer.paths[i]
                thisGlyph.endUndo()  # end undo grouping

            if glyphsWithSmallPaths:
                tabString = "/" + "/".join(set(glyphsWithSmallPaths))
                thisFont.newTab(tabString)
            else:
                Message("No Small Paths Found", "No glyphs with paths smaller than %i square units found in the frontmost font." % minArea, OKButton="Cool")

            # listOfSelectedLayers = thisFont.selectedLayers  # active layers of currently selected glyphs
            # for thisLayer in listOfSelectedLayers:  # loop through layers
            #     thisGlyph = thisLayer.parent
            #     print thisGlyph.name, thisLayer.name
            #     # output all node coordinates:
            #     for thisPath in thisLayer.paths:
            #         for thisNode in thisLayer.nodes:
            #             print "-- %.1f %.1f" % (thisNode.x, thisNode.y)

            if not self.SavePreferences(self):
                print "Note: 'Find Small Paths' could not write preferences."

            self.w.close()  # delete if you want window to stay open
        except Exception as e:
            # brings macro window to front and reports error:
            Glyphs.showMacroWindow()
            print "Find Small Paths Error: %s" % e

FindSmallPaths()
CHARGED is a local art exhibition on display at the Work Release Gallery at Commune NFK. The exhibit displays the work of ten artists, all of it fashioned from neon lights, and is also considered the 3rd Annual NEON Exhibition. The exhibition is curated by Clay McGlamory, with Assistant Curator Cate Currier, and will be on display from June 23rd to July 23rd. Here are a few pictures from the 2016 Norfolk NEON Festival. The festival showcases the publicly displayed artwork around the Norfolk Art District. Enjoy.
import renderdoc as rd
from typing import List
import rdtest


class D3D11_Shader_ISA(rdtest.TestCase):
    demos_test_name = 'D3D11_Shader_ISA'

    def check_capture(self):
        action = self.find_action("GPU=")

        self.check(action is not None)

        is_amd = 'AMD' in action.customName

        self.controller.SetFrameEvent(action.next.eventId, False)

        pipe: rd.PipeState = self.controller.GetPipelineState()

        refl: rd.ShaderReflection = pipe.GetShaderReflection(rd.ShaderStage.Vertex)

        isas: List[str] = self.controller.GetDisassemblyTargets(True)

        if isas == []:
            raise rdtest.TestFailureException("Expected some disassembly targets, got none!")

        # Generic testing can't do much, we just ensure that we can successfully
        # get a non-empty disassembly string
        for isa in isas:
            disasm: str = self.controller.DisassembleShader(pipe.GetGraphicsPipelineObject(), refl, isa)

            if len(disasm) < 32:
                raise rdtest.TestFailureException(
                    "Disassembly for target '{}' is degenerate: {}".format(isa, disasm))

        rdtest.log.success("All disassembly targets successfully fetched and seem reasonable")

        # We make this a hard failure. Users can fix this by installing the plugins, and we don't want automated
        # overnight tests to suddenly stop checking
        if 'AMDIL' not in isas:
            raise rdtest.TestFailureException(
                "AMDIL is not an available disassembly target. Are you missing plugins?")

        disasm: str = self.controller.DisassembleShader(pipe.GetGraphicsPipelineObject(), refl, 'AMDIL')

        expected = [
            'il_vs',
            'dcl_output_position',
            'end',
        ]

        for fragment in expected:
            if not fragment in disasm:
                raise rdtest.TestFailureException(
                    "AMDIL ISA doesn't contain '{}' as expected: {}".format(fragment, disasm))

        if 'RDNA (Navi 10)' not in isas:
            raise rdtest.TestFailureException(
                "RDNA (Navi 10) is not an available disassembly target. Are you missing plugins?")

        disasm: str = self.controller.DisassembleShader(pipe.GetGraphicsPipelineObject(), refl, 'RDNA (Navi 10)')

        expected = [
            'asic(GFX10)',
            'vgpr_count',
            'wave_size',
            's_endpgm',
        ]

        for fragment in expected:
            if not fragment in disasm:
                raise rdtest.TestFailureException(
                    "RDNA ISA doesn't contain '{}' as expected: {}".format(fragment, disasm))

        rdtest.log.success("AMD disassembly is as expected")

        # D3D11 doesn't have live driver disassembly, can't test it
        if not is_amd:
            rdtest.log.print("Not testing live driver disassembly outside AMD")
        else:
            rdtest.log.print("No live driver disassembly to test on D3D11")
In March of 2006, German game developer 4Head Studios announced it was working with graphics producer 3D-IO to create a videogame based on the epic tale of Beowulf. Fast forward to late 2007 and the release of Paramount Pictures' Beowulf film, accompanied by Ubisoft's exclusive tie-in game of the same name, and the Beowulf-themed videogame arena starts to look a bit crowded. So today 4Head (now part of DTP Entertainment) announced that its Beowulf project is no more. The company sold its trademarks, web domains and other assets related to the game to Paramount. "With the announcement of Ubisoft's official game based on the movie and the conceivable competitive situation, we were seeing publishers unwilling to support our game," said the game's Executive Producer, Gustaf Stechmann. "We thus lacked the resources needed to drive the project's development forward. Luckily, we had the older rights to the use of the name. The buy-out deal with Paramount was therefore the logical exit strategy." Ubisoft's Beowulf game was released Nov. 13 for PC, PlayStation 3 and Xbox 360. A PSP version comes out Dec. 4.
# Copyright 2016 Peter Dahlberg
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import os
import logging

from apscheduler.triggers.cron import CronTrigger
from apscheduler.triggers.date import DateTrigger

if __name__ == "__main__":
    raise SystemExit("Not meant to be run directly!")


def _rsync_cmd(dest):
    cmd = ("rsync --delete-delay --recursive --times --stats --delay-updates "
           "'{output}/' '{dest}'")
    return cmd.format(dest=dest, output="{output}")


# configure the logger
logging.basicConfig(level=logging.DEBUG, format='%(asctime)s %(message)s')

# make sure git does not block giving pw prompts, git 2.3+ only
os.environ["GIT_TERMINAL_PROMPT"] = "0"
os.environ["GIT_ASKPASS"] = "echo"  # also to avoid interactiveness
os.environ["GIT_EDITOR"] = "true"   # also to avoid interactiveness
os.environ["GIT_PAGER"] = "cat"     # also to avoid interactiveness
# avoid system config, we want default behaviour
os.environ["GIT_CONFIG_NOSYSTEM"] = "yes"

# needs to be a byte like object
GITHUB_SECRET = b"changetosomethingrandomlong"

RUNNERS = {
    # unique name of the runner, avoid spaces and other obscure characters
    "website_master": {
        # directory where building takes place, will be created if not there
        # multiple runners may point to the same one
        "working_directory": "/tmp/test",

        # upstream url of the repository which contains the website
        # use https://git::@github.com... to avoid pw prompts and instead fail
        # (e.g. if github erroneously gives 401 temporarily, git would block)
        # os.environ["GIT_TERMINAL_PROMPT"] = "0" does the same but git 2.3+ only
        "clone_url": "https://git::@github.com/IEEE-SB-Passau/pelican-ieee-passau.git",

        # branch which will be built
        "git_branch": "master",

        # command which installs the generated directory tree to its final
        # destination (the wwwroot) e.g. rsync. {output} will be replaced by
        # the path to the generator output
        "final_install_command": _rsync_cmd("/tmp/testroot"),

        # command which builds the website
        # important: specify {output} as output path of the generator
        # if you use tox you may use {toxresult} as the path to the result.json
        "build_command": ('tox -e pelican --result-json "{toxresult}" '
                          '--recreate -- -d --output "{output}"'),

        # will be added to env when running build_command
        "build_env": {"PELICAN_SITEURL": "//apu:800"}
    }
}

# define cronjobs as a sequence of (runner, trigger) pairs, for cron triggers see
# http://apscheduler.readthedocs.io/en/latest/modules/triggers/cron.html
SCHEDULED_BUILD_JOBS = [
    ("website_master", CronTrigger(minute="*/30")),
    ("website_master", DateTrigger())  # once at start
]


# user, pass for /status/... subpages, if not set or None no auth is done
def STATUS_AUTH_BASIC_FN(user, passw):
    return user == "powerpoint" and passw == "karaoke"
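For illustration, the {output} placeholder in final_install_command is left in place by _rsync_cmd and presumably filled in later by the runner; a minimal sketch of that two-stage expansion (the /tmp paths are just the example values from the config above):

cmd_template = _rsync_cmd("/tmp/testroot")
print(cmd_template)
# rsync --delete-delay --recursive --times --stats --delay-updates '{output}/' '/tmp/testroot'

print(cmd_template.format(output="/tmp/test/output"))
# rsync --delete-delay --recursive --times --stats --delay-updates '/tmp/test/output/' '/tmp/testroot'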
CEU Central - Nursing CEU Courses Home - Get Re-certified Online! Only $17.99 for unlimited Nursing CEUs for 1 full year! We are an approved provider of continuing education dedicated to providing you with the most affordable, hassle-free and fast learning experience to earn your required continuing education. We present courses focusing on the latest trends and updates in your profession, based on the newest research and industry standards. Only $17.99 for a full year of unlimited CEUs from our entire library of courses in your chosen field!
import boto3 import gzip import json import logging import os import re from tools import Arn from setup_logger import create_logger from aws_session_manager import AWS_Session from botocore.exceptions import ProfileNotFound logger = create_logger(name="denied_notification.py") try: SESSION = boto3.session.Session(profile_name='training', region_name='us-east-1') except ProfileNotFound as pnf: SESSION = boto3.session.Session() def lambda_handler(event, context): # global SESSION # SESSION = SESSION.get_session() topic_arn = os.environ.get('sns_arn', 'arn:aws:sns:us-east-1:281782457076:security_fairy_topic') dynamodb_table = os.environ.get('dynamodb_table', 'arn:aws:dynamodb:us-east-1:281782457076:table/security_fairy_dynamodb_table') # Extract Bucket and Key from an SNS notification # message = json.loads(event['Records'][0]['Sns']['Message']) # bucket = message['s3Bucket'] # key = message['s3ObjectKey'][0] # Extracted Bucket and Key from S3 event notification bucket = event['Records'][0]['s3']['bucket']['name'] key = event['Records'][0]['s3']['object']['key'] # where to save the downloaded file file_path = '/tmp/cloudtraillogfile.gz' # downloads file to above path boto3.client('s3').download_file(bucket, key, file_path) # opens gz file for reading gzfile = gzip.open(file_path, 'r') # loads contents of the Records key into variable (our actual cloudtrail log entries!) records = json.loads(gzfile.readlines()[0])['Records'] access_denied_records = check_records_for_error_code(records) security_fairy_access_denied_records = get_security_fairy_audited_entities(access_denied_records) write_denied_actions_to_dynamodb(security_fairy_access_denied_records, dynamodb_table) send_access_denied_notifications(access_denied_records, topic_arn) def check_records_for_error_code(records, error_codes = ['AccessDenied', 'AccessDeniedException','Client.UnauthorizedOperation']): matched_error_records = [] for record in records: if record.get('errorCode', None) in error_codes: logger.debug(record) extracted_information = {} arn = Arn(record['userIdentity'].get('arn', None)) role_name = arn.get_entity_name() service_name = arn.get_service() extracted_information['arn'] = arn.get_full_arn() extracted_information['error_code'] = record['errorCode'] extracted_information['denied_action'] = service_name + ':' + record['eventName'] if not extracted_information in matched_error_records: logger.info('extracted_information doesn\'t already exist in list of access denieds') matched_error_records.append(extracted_information) logger.debug(matched_error_records) return matched_error_records def send_access_denied_notifications(access_denied_records, topic_arn): if access_denied_records: response = boto3.client('sns', region_name = 'us-east-1')\ .publish( TopicArn=topic_arn, Message=json.dumps(access_denied_records), Subject='Automated AWS Notification - Access Denied') def write_denied_actions_to_dynamodb(access_denied_records, dynamodb_table): #take in the below: # [{"error_code": "AccessDenied", "arn": "arn:aws:sts::281782457076:assumed-role/serverless_api_gateway_step_functions/BackplaneAssumeRoleSession", "denied_action": "states:StartExecution"}, {"error_code": "AccessDenied", "arn": "arn:aws:sts::281782457076:assumed-role/serverless_api_gateway_step_functions/BackplaneAssumeRoleSession", "denied_action": "states:StartExecution"}] # read the dynamodb_table, if the action already exists, do nothing dynamodb_client = SESSION.client('dynamodb') for record in access_denied_records: entity_arn = record['arn'] execution_id, 
existing_denied_actions = get_existing_denied_actions(entity_arn, dynamodb_table) updated_denied_actions = existing_denied_actions if not record['denied_action'] in existing_denied_actions: updated_denied_actions.append(record['denied_action']) dynamodb_client.update_item(TableName=dynamodb_table, Key={ "execution_id": { "S": execution_id } }, AttributeUpdates={ "denied_actions": { "Value":{"SS": updated_denied_actions} } }) def get_security_fairy_audited_entities(access_denied_records): audited_entities = [] for record in access_denied_records: entity = Arn(record['arn']) entity.convert_assumed_role_to_role() entity_arn = entity.get_full_arn() logger.debug(entity_arn) if entity.is_role() and is_access_denied_security_fairy_audited_role(entity_arn): logger.debug('Adding access_denied_record to list') record['arn'] = entity_arn audited_entities.append(record) logger.info(audited_entities) return audited_entities def get_existing_denied_actions(entity_arn, dynamodb_table): dynamodb_client = SESSION.client('dynamodb') response = dynamodb_client.scan( TableName=dynamodb_table, IndexName='entity_arn', AttributesToGet=[ 'execution_id', 'entity_arn', 'denied_actions' ], ScanFilter={ 'entity_arn': { 'AttributeValueList': [ { 'S': entity_arn } ], 'ComparisonOperator': 'EQ' } } )['Items'][0] existing_denied_actions = [] if response.get('denied_actions') is None else response['denied_actions']['SS'] execution_id = response['execution_id']['S'] logger.info(existing_denied_actions) return execution_id, existing_denied_actions def is_access_denied_security_fairy_audited_role(role_arn): iam_client = SESSION.client('iam') #Consumes an role arn and examines its attached policies to see #if they were created by security-fairy role = Arn(role_arn) role_name = role.get_entity_name() logger.info(role_name) attached_policies = iam_client.list_attached_role_policies(RoleName=role_name) # Examines all attached policies and search for an attached policy with the # following format: *_security_fairy_revised_policy # (see security_fairy_revised_policy_approve.py line 58) logger.debug("Policies attached to {}:".format(role.get_full_arn())) for policy in attached_policies['AttachedPolicies']: logger.info(policy['PolicyName']) if '-security-fairy-revised-policy' in policy['PolicyName']: return True return False if __name__ == '__main__': # arn = 'arn:aws:iam::281782457076:role/1s_tear_down_role' # logging.info(is_access_denied_security_fairy_audited_role(arn)) access_denied_records = [{"error_code": "AccessDenied", "arn": "arn:aws:sts::281782457076:assumed-role/serverless_api_gateway_step_functions/BackplaneAssumeRoleSession", "denied_action": "states:StartExecution"}, {"error_code": "AccessDenied", "arn": "arn:aws:sts::281782457076:assumed-role/1s_tear_down_role/potato", "denied_action": "route53:CreateHostedZone"}, {"error_code": "AccessDenied", "arn": "arn:aws:iam::281782457076:user/[email protected]", "denied_action": "codebuild:StartBuild"}, {"error_code": "AccessDenied", "arn": "arn:aws:iam::281782457076:user/[email protected]", "denied_action": "codebuild:StartBuild"}, {"error_code": "AccessDenied", "arn": "arn:aws:iam::281782457076:user/[email protected]", "denied_action": "codebuild:StartBuild"}, {"error_code": "AccessDenied", "arn": "arn:aws:iam::281782457076:user/[email protected]", "denied_action": "codebuild:StartBuild"}, {"error_code": "AccessDenied", "arn": "arn:aws:iam::281782457076:role/1s_tear_down_role", "denied_action": "codebuild:StartBuild"}] # dynamodb_table = 'security_fairy_dynamodb_table' # 
existing_denied_actions('arn:aws:iam::281782457076:role/1s_tear_down_role', dynamodb_table) security_fairy_access_denied_records = get_security_fairy_audited_entities(access_denied_records) write_denied_actions_to_dynamodb(security_fairy_access_denied_records,'security_fairy_dynamodb_table') # if __name__ == '__main__': # EVENT = { # "Records": [ # { # "eventVersion": "2.0", # "eventTime": "2017-08-23T17:27:20.482Z", # "requestParameters": { # "sourceIPAddress": "184.72.102.183" # }, # "s3": { # "configurationId": "log_posted", # "object": { # "eTag": "f88cc0ba387febb9d1922bcf3624e249", # "sequencer": "00599DBAF77B4804AE", # "key": "AWSLogs/281782457076/CloudTrail/us-east-1/2017/08/23/281782457076_CloudTrail_us-east-1_20170823T1725Z_Nobz9PDTfkS2itSG.json.gz", # "size": 4342 # }, # "bucket": { # "arn": "arn:aws:s3:::1strategy-training-traillogs", # "name": "1strategy-training-traillogs", # "ownerIdentity": { # "principalId": "A3F4AZ9K861LVS" # } # }, # "s3SchemaVersion": "1.0" # }, # "responseElements": { # "x-amz-id-2": "qakr7pYcVWfsXM/BEncmZ/zQVPQnIAyN5ggRIF+9/+5JhAhhmMDZDJunlhhFowOKzGF9mNtF1Ys=", # "x-amz-request-id": "5A68EDF6D1F0C933" # }, # "awsRegion": "us-west-2", # "eventName": "ObjectCreated:Put", # "userIdentity": { # "principalId": "AWS:AROAI6ZMWVXR3IZ6MKNSW:i-0c91c32104e81c79d" # }, # "eventSource": "aws:s3" # } # ] # } # lambda_handler(EVENT, {})
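For local experimentation, the record filter above can be exercised without any AWS calls. A minimal sketch, assuming the module's tools.Arn helper and logger are in scope, and assuming Arn.get_service() returns 'states' for the sample ARN; the record below is a made-up, heavily trimmed CloudTrail entry (real entries carry many more fields):

# Minimal sketch exercising check_records_for_error_code locally.
sample_records = [
    {
        'errorCode': 'AccessDenied',
        'eventName': 'StartExecution',
        'userIdentity': {
            'arn': 'arn:aws:sts::123456789012:assumed-role/example_role/example_session'
        }
    },
    {
        # No errorCode key, so this record should be ignored by the filter.
        'eventName': 'PutObject',
        'userIdentity': {'arn': 'arn:aws:iam::123456789012:user/example_user'}
    }
]

denied = check_records_for_error_code(sample_records)
# Expect a single entry with keys 'arn', 'error_code', and 'denied_action',
# e.g. denied_action == 'states:StartExecution' if Arn.get_service()
# returns 'states' for the sample ARN above.
print(denied)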
News Americas, NEW YORK, NY, Fri. Feb. 1, 2019: This Sunday, it's Super Bowl LIII, another Super Bowl that will feature the New England Patriots. This time the team many love to hate takes on the Los Angeles Rams at the Mercedes-Benz Stadium in Atlanta from 6:30 p.m. EST. But for Caribbean fans of the sport, two sons with roots in the region are set to feature prominently in the football spectacular. Like the Patriots or not, they are the only team in this year's Super Bowl with Caribbean-roots players. The two are Sony Michel and Patrick Chung. All bets are on Haitian-roots rookie Sony Michel, who played a big part in getting the Pats to the big game. The 5-foot-11, 215-pound rookie is considered one of the NFL's most bruising inside rushers and a budding New England Patriots star who gained most of his team-leading 931 yards this season on between-the-tackles runs. Michel, #26, a 23-year-old alumnus of the University of Georgia, is only in the first year of a four-year, $9.62-million rookie contract. The New England Patriots drafted Michel with the 31st overall pick in the 2018 NFL draft. Chung is expected to make his fifth appearance in a Super Bowl, including his third consecutive Super Bowl. Chung's mother, Sophia George-Chung, is a Jamaican reggae artist who was popular in the 1980s. His father, Ronald Chung, was a music producer and her manager.
# MySQL Connector/Python - MySQL driver written in Python. # Copyright (c) 2009, 2013, Oracle and/or its affiliates. All rights reserved. # MySQL Connector/Python is licensed under the terms of the GPLv2 # <http://www.gnu.org/licenses/old-licenses/gpl-2.0.html>, like most # MySQL Connectors. There are special exceptions to the terms and # conditions of the GPLv2 as it is applied to this software, see the # FOSS License Exception # <http://www.mysql.com/about/legal/licensing/foss-exception.html>. # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA """ This module implements some constructors and singletons as required by the DB API v2.0 (PEP-249). """ # Python Db API v2 apilevel = '2.0' threadsafety = 1 paramstyle = 'pyformat' import time import datetime from mysql.connector import constants class _DBAPITypeObject: def __init__(self, *values): self.values = values def __cmp__(self, other): if other in self.values: return 0 if other < self.values: return 1 else: return -1 Date = datetime.date Time = datetime.time Timestamp = datetime.datetime def DateFromTicks(ticks): return Date(*time.localtime(ticks)[:3]) def TimeFromTicks(ticks): return Time(*time.localtime(ticks)[3:6]) def TimestampFromTicks(ticks): return Timestamp(*time.localtime(ticks)[:6]) Binary = str STRING = _DBAPITypeObject(constants.FieldType.get_string_types()) BINARY = _DBAPITypeObject(constants.FieldType.get_binary_types()) NUMBER = _DBAPITypeObject(constants.FieldType.get_number_types()) DATETIME = _DBAPITypeObject(constants.FieldType.get_timestamp_types()) ROWID = _DBAPITypeObject()
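To illustrate how the PEP-249 pieces above fit together, here is a minimal usage sketch, assuming the definitions above are in scope; FieldType.VARCHAR is used for illustration, and the exact grouping of type codes depends on mysql.connector.constants.FieldType:

import time

from mysql.connector import constants

# Build DB API date/time objects from a single ticks value.
ticks = time.time()
print(DateFromTicks(ticks))       # datetime.date for the local date
print(TimeFromTicks(ticks))       # datetime.time for the local time
print(TimestampFromTicks(ticks))  # datetime.datetime for the local time

# The _DBAPITypeObject singletons compare equal to every member of their
# type group via __cmp__, so a column's type code can be classified:
type_code = constants.FieldType.VARCHAR
if type_code == STRING:
    print('column holds string data')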
Find here the DCH Regional Medical Center corporate office address, contact details, phone number, email, and headquarters information, and read customer reviews and complaints. We have given complete information below so that you don't have to search for DCH Regional Medical Center headquarters contact details anywhere else. See the table below for contact details. We hope that you are now able to contact the corporate office of DCH Regional Medical Center. You just need to contact them via phone, website, or social media, and they will help you resolve your issue. The DCH Regional Medical Center customer service team is one of the best and responds to every customer's concern quickly and efficiently. The DCH Regional Medical Center headquarters phone number can be called during business hours, and waiting times are very short. Please check the website for more contact details. Comment your complaints or review below and please rate this company.
# -*- coding: utf-8 -*-
# $Id: wuihlpgraphmatplotlib.py 56295 2015-06-09 14:29:55Z vboxsync $

"""
Test Manager Web-UI - Graph Helpers - Implemented using matplotlib.
"""

__copyright__ = \
"""
Copyright (C) 2012-2015 Oracle Corporation

This file is part of VirtualBox Open Source Edition (OSE), as
available from http://www.virtualbox.org. This file is free software;
you can redistribute it and/or modify it under the terms of the GNU
General Public License (GPL) as published by the Free Software
Foundation, in version 2 as it comes in the "COPYING" file of the
VirtualBox OSE distribution. VirtualBox OSE is distributed in the
hope that it will be useful, but WITHOUT ANY WARRANTY of any kind.

The contents of this file may alternatively be used under the terms
of the Common Development and Distribution License Version 1.0
(CDDL) only, as it comes in the "COPYING.CDDL" file of the
VirtualBox OSE distribution, in which case the provisions of the
CDDL are applicable instead of those of the GPL.

You may elect to license modified versions of this file under the
terms and conditions of either the GPL or the CDDL or both.
"""
__version__ = "$Revision: 56295 $"

# Standard Python Import and extensions installed on the system.
import re;
import StringIO;

import matplotlib;                              # pylint: disable=F0401
matplotlib.use('Agg');                          # Force backend.
import matplotlib.pyplot;                       # pylint: disable=F0401
from numpy import arange as numpy_arange;       # pylint: disable=E0611

# Validation Kit imports.
from testmanager.webui.wuihlpgraphbase import WuiHlpGraphBase;


class WuiHlpGraphMatplotlibBase(WuiHlpGraphBase):
    """ Base class for the matplotlib graphs. """

    def __init__(self, sId, oData, oDisp):
        WuiHlpGraphBase.__init__(self, sId, oData, oDisp);
        self._fXkcdStyle = True;

    def setXkcdStyle(self, fEnabled = True):
        """ Enables xkcd style graphs for implementations that support it. """
        self._fXkcdStyle = fEnabled;
        return True;

    def _createFigure(self):
        """
        Wrapper around matplotlib.pyplot.figure that feeds the figure the
        basic graph configuration.
        """
        if self._fXkcdStyle and matplotlib.__version__ > '1.2.9':
            matplotlib.pyplot.xkcd();           # pylint: disable=E1101
        matplotlib.rcParams.update({'font.size': self._cPtFont});

        oFigure = matplotlib.pyplot.figure(figsize = (float(self._cxGraph) / self._cDpiGraph,
                                                      float(self._cyGraph) / self._cDpiGraph),
                                           dpi = self._cDpiGraph);
        return oFigure;

    def _produceSvg(self, oFigure, fTightLayout = True):
        """ Creates an SVG string from the given figure. """
        oOutput = StringIO.StringIO();
        if fTightLayout:
            oFigure.tight_layout();
        oFigure.savefig(oOutput, format = 'svg');

        if self._oDisp and self._oDisp.isBrowserGecko('20100101'):
            # This browser will stretch images to fit if no size or width is given.
            sSubstitute = r'\1 \3 preserveAspectRatio="xMidYMin meet"';
        else:
            # Chrome and IE like to have the sizes as well as the viewBox.
            sSubstitute = r'\1 \3 preserveAspectRatio="xMidYMin meet" \2 \4';
        return re.sub(r'(<svg) (height="\d+pt") (version="\d+.\d+" viewBox="\d+ \d+ \d+ \d+") (width="\d+pt")',
                      sSubstitute,
                      oOutput.getvalue().decode('utf8'),
                      count = 1);


class WuiHlpBarGraph(WuiHlpGraphMatplotlibBase):
    """ Bar graph. """

    def __init__(self, sId, oData, oDisp = None):
        WuiHlpGraphMatplotlibBase.__init__(self, sId, oData, oDisp);
        self.fpMax = None;
        self.fpMin = 0;
        self.cxBarWidth = None;

    def setRangeMax(self, fpMax):
        """ Sets the max range."""
        self.fpMax = float(fpMax);
        return None;

    def renderGraph(self): # pylint: disable=R0914
        aoTable = self._oData.aoTable;

        #
        # Extract/structure the required data.
# aoSeries = list(); for j in range(len(aoTable[1].aoValues)): aoSeries.append(list()); asNames = list(); oXRange = numpy_arange(self._oData.getGroupCount()); fpMin = self.fpMin; fpMax = self.fpMax; if self.fpMax is None: fpMax = float(aoTable[1].aoValues[0]); for i in range(1, len(aoTable)): asNames.append(aoTable[i].sName); for j in range(len(aoTable[i].aoValues)): fpValue = float(aoTable[i].aoValues[j]); aoSeries[j].append(fpValue); if fpValue < fpMin: fpMin = fpValue; if fpValue > fpMax: fpMax = fpValue; fpMid = fpMin + (fpMax - fpMin) / 2.0; if self.cxBarWidth is None: self.cxBarWidth = 1.0 / (len(aoTable[0].asValues) + 1.1); # Render the PNG. oFigure = self._createFigure(); oSubPlot = oFigure.add_subplot(1, 1, 1); aoBars = list(); for i in range(len(aoSeries)): sColor = self.calcSeriesColor(i); aoBars.append(oSubPlot.bar(oXRange + self.cxBarWidth * i, aoSeries[i], self.cxBarWidth, color = sColor, align = 'edge')); #oSubPlot.set_title('Title') #oSubPlot.set_xlabel('X-axis') #oSubPlot.set_xticks(oXRange + self.cxBarWidth); oSubPlot.set_xticks(oXRange); oLegend = oSubPlot.legend(aoTable[0].asValues, loc = 'best', fancybox = True); oLegend.get_frame().set_alpha(0.5); oSubPlot.set_xticklabels(asNames, ha = "left"); #oSubPlot.set_ylabel('Y-axis') oSubPlot.set_yticks(numpy_arange(fpMin, fpMax + (fpMax - fpMin) / 10 * 0, fpMax / 10)); oSubPlot.grid(True); fpPadding = (fpMax - fpMin) * 0.02; for i in range(len(aoBars)): aoRects = aoBars[i] for j in range(len(aoRects)): oRect = aoRects[j]; fpValue = float(aoTable[j + 1].aoValues[i]); if fpValue <= fpMid: oSubPlot.text(oRect.get_x() + oRect.get_width() / 2.0, oRect.get_height() + fpPadding, aoTable[j + 1].asValues[i], ha = 'center', va = 'bottom', rotation = 'vertical', alpha = 0.6, fontsize = 'small'); else: oSubPlot.text(oRect.get_x() + oRect.get_width() / 2.0, oRect.get_height() - fpPadding, aoTable[j + 1].asValues[i], ha = 'center', va = 'top', rotation = 'vertical', alpha = 0.6, fontsize = 'small'); return self._produceSvg(oFigure); class WuiHlpLineGraph(WuiHlpGraphMatplotlibBase): """ Line graph. """ def __init__(self, sId, oData, oDisp = None, fErrorBarY = False): # oData must be a WuiHlpGraphDataTableEx like object. WuiHlpGraphMatplotlibBase.__init__(self, sId, oData, oDisp); self._cMaxErrorBars = 12; self._fErrorBarY = fErrorBarY; def setErrorBarY(self, fEnable): """ Enables or Disables error bars, making this work like a line graph. 
""" self._fErrorBarY = fEnable; return True; def renderGraph(self): # pylint: disable=R0914 aoSeries = self._oData.aoSeries; oFigure = self._createFigure(); oSubPlot = oFigure.add_subplot(1, 1, 1); if self._oData.sYUnit is not None: oSubPlot.set_ylabel(self._oData.sYUnit); if self._oData.sXUnit is not None: oSubPlot.set_xlabel(self._oData.sXUnit); cSeriesNames = 0; cYMin = 1000; cYMax = 0; for iSeries, oSeries in enumerate(aoSeries): sColor = self.calcSeriesColor(iSeries); cYMin = min(cYMin, min(oSeries.aoYValues)); cYMax = max(cYMax, max(oSeries.aoYValues)); if not self._fErrorBarY: oSubPlot.errorbar(oSeries.aoXValues, oSeries.aoYValues, color = sColor); elif len(oSeries.aoXValues) > self._cMaxErrorBars: if matplotlib.__version__ < '1.3.0': oSubPlot.errorbar(oSeries.aoXValues, oSeries.aoYValues, color = sColor); else: oSubPlot.errorbar(oSeries.aoXValues, oSeries.aoYValues, yerr = [oSeries.aoYErrorBarBelow, oSeries.aoYErrorBarAbove], errorevery = len(oSeries.aoXValues) / self._cMaxErrorBars, color = sColor ); else: oSubPlot.errorbar(oSeries.aoXValues, oSeries.aoYValues, yerr = [oSeries.aoYErrorBarBelow, oSeries.aoYErrorBarAbove], color = sColor); cSeriesNames += oSeries.sName is not None; if cYMin != 0 or cYMax != 0: oSubPlot.set_ylim(bottom = 0); if cSeriesNames > 0: oLegend = oSubPlot.legend([oSeries.sName for oSeries in aoSeries], loc = 'best', fancybox = True); oLegend.get_frame().set_alpha(0.5); if self._sTitle is not None: oSubPlot.set_title(self._sTitle); if self._cxGraph >= 256: oSubPlot.minorticks_on(); oSubPlot.grid(True, 'major', axis = 'both'); oSubPlot.grid(True, 'both', axis = 'x'); if True: # oSubPlot.axis('off'); #oSubPlot.grid(True, 'major', axis = 'none'); #oSubPlot.grid(True, 'both', axis = 'none'); matplotlib.pyplot.setp(oSubPlot, xticks = [], yticks = []); return self._produceSvg(oFigure); class WuiHlpLineGraphErrorbarY(WuiHlpLineGraph): """ Line graph with an errorbar for the Y axis. """ def __init__(self, sId, oData, oDisp = None): WuiHlpLineGraph.__init__(self, sId, oData, fErrorBarY = True); class WuiHlpMiniSuccessRateGraph(WuiHlpGraphMatplotlibBase): """ Mini rate graph. """ def __init__(self, sId, oData, oDisp = None): """ oData must be a WuiHlpGraphDataTableEx like object, but only aoSeries, aoSeries[].aoXValues, and aoSeries[].aoYValues will be used. The values are expected to be a percentage, i.e. values between 0 and 100. """ WuiHlpGraphMatplotlibBase.__init__(self, sId, oData, oDisp); self.setFontSize(6); def renderGraph(self): # pylint: disable=R0914 assert len(self._oData.aoSeries) == 1; oSeries = self._oData.aoSeries[0]; # hacking #self.setWidth(512); #self.setHeight(128); # end oFigure = self._createFigure(); from mpl_toolkits.axes_grid.axislines import SubplotZero; oAxis = SubplotZero(oFigure, 111); oFigure.add_subplot(oAxis); # Disable all the normal axis. oAxis.axis['right'].set_visible(False) oAxis.axis['top'].set_visible(False) oAxis.axis['bottom'].set_visible(False) oAxis.axis['left'].set_visible(False) # Use the zero axis instead. 
oAxis.axis['yzero'].set_axisline_style('-|>'); oAxis.axis['yzero'].set_visible(True); oAxis.axis['xzero'].set_axisline_style('-|>'); oAxis.axis['xzero'].set_visible(True); if oSeries.aoYValues[-1] == 100: sColor = 'green'; elif oSeries.aoYValues[-1] > 75: sColor = 'yellow'; else: sColor = 'red'; oAxis.plot(oSeries.aoXValues, oSeries.aoYValues, '.-', color = sColor, linewidth = 3); oAxis.fill_between(oSeries.aoXValues, oSeries.aoYValues, facecolor = sColor, alpha = 0.5) oAxis.set_xlim(left = -0.01); oAxis.set_xticklabels([]); oAxis.set_xmargin(1); oAxis.set_ylim(bottom = 0, top = 100); oAxis.set_yticks([0, 50, 100]); oAxis.set_ylabel('%'); #oAxis.set_yticklabels([]); oAxis.set_yticklabels(['', '%', '']); return self._produceSvg(oFigure, False);
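The browser-specific SVG rewrite in _produceSvg is easy to check in isolation. Below is a minimal standalone sketch of the Gecko branch of the substitution; the SVG header string is made up for illustration, of the kind matplotlib emits:

import re

# Hypothetical SVG header; the Gecko branch drops the width/height
# attributes so the image scales, keeping only the viewBox.
sSvg = '<svg height="216pt" version="1.1" viewBox="0 0 432 216" width="432pt">'
sGecko = re.sub(r'(<svg) (height="\d+pt") (version="\d+.\d+" viewBox="\d+ \d+ \d+ \d+") (width="\d+pt")',
                r'\1 \3 preserveAspectRatio="xMidYMin meet"',
                sSvg, count=1)
print(sGecko)
# <svg version="1.1" viewBox="0 0 432 216" preserveAspectRatio="xMidYMin meet">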
Most people fall in love with modern interior design due to the simplicity it brings to a place. Modern interior design is part of the minimalist style. A proper modern design should have three main elements: it should look spacious, get enough natural light, and have good air circulation. The following are some tips to adorn your home with the three components mentioned above.
– Choose an open floor plan.
– Select soft neutral colors for painting the walls.
– Invest in multi-function furniture and good storage solutions.
– Pick out furniture pieces made from finishing materials like wood, stone, stainless steel, or chrome.
The vital concept behind giving your home a modern interior design is simple: just choose the unique items you like without cluttering the area.
import sublime
import sublime_plugin


class HostsFileViewListener(sublime_plugin.ViewEventListener):

    SYNTAX = 'hosts.sublime-syntax'

    @classmethod
    def is_applicable(cls, settings):
        # Only attach to views using the hosts-file syntax.
        try:
            return (settings and
                    settings.get('syntax', '').lower().endswith(cls.SYNTAX))
        except Exception:
            return False

    def on_hover(self, point, hover_zone):
        # Only react when the mouse hovers over a Punycode-encoded hostname.
        if (hover_zone != sublime.HOVER_TEXT or
                not self.view.match_selector(point, 'meta.hostname meta.punycode')):
            return

        # Find the hostname region under the cursor and decode it via IDNA.
        expression_region = next(
            r for r in self.view.find_by_selector('meta.hostname')
            if r.contains(point))
        hostname = self.view.substr(expression_region)
        try:
            hover_text = str.encode(hostname).decode('idna')
        except Exception:
            hover_text = 'Could not parse Punycode expression'

        html = '''
            <body id="render-punycode">
                <div>{}</div>
            </body>
        '''.format(hover_text)

        self.view.show_popup(html,
                             sublime.HIDE_ON_MOUSE_MOVE_AWAY,
                             location=point,
                             max_width=512,
                             max_height=60)
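The hover text relies on Python's built-in IDNA codec, which can be tried outside Sublime; for example:

# Decoding a Punycode hostname with the stdlib 'idna' codec:
print('xn--bcher-kva.example'.encode().decode('idna'))  # -> bücher.example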
Pan spread is manually set to 52. Low-pass filter cut-off frequency is set to 177.83Hz (54) and is modulated by Note Vel, LFO 2, VCF Env. LFO 2 direct modulation amount for low-pass filter cut-off frequency is 48. VCF Env direct modulation amount for low-pass filter cut-off frequency is 97. Note Vel direct modulation amount for low-pass filter envelope amount is 191. Note Num direct modulation amount for low-pass filter envelope amount is 124. Filter resonance is statically set to 27% (70). Effect 1 level is off. Effect 2 level is manually set to 42. Effect 3 level is off. Effect 4 level is manually set to 38. Controls Noise Gate Param 2, FX 2 Level. Controls VCF Cut-off Frequency, UNKNOWN DESTINATION.
import datetime import re from billy.scrape.bills import BillScraper, Bill from billy.scrape.votes import Vote import lxml.html import scrapelib _action_re = ( ('Introduced', 'bill:introduced'), ('(Forwarded|Delivered) to Governor', 'governor:received'), ('Amendment (?:.*)Offered', 'amendment:introduced'), ('Substitute (?:.*)Offered', 'amendment:introduced'), ('Amendment (?:.*)adopted', 'amendment:passed'), ('Amendment lost', 'amendment:failed'), ('Read for the first time and referred to', ['bill:reading:1', 'committee:referred']), ('(r|R)eferred to', 'committee:referred'), ('Read for the second time', 'bill:reading:2'), ('(S|s)ubstitute adopted', 'bill:substituted'), ('(m|M)otion to Adopt (?:.*)adopted', 'amendment:passed'), ('(m|M)otion to (t|T)able (?:.*)adopted', 'amendment:passed'), ('(m|M)otion to Adopt (?:.*)lost', 'amendment:failed'), ('(m|M)otion to Read a Third Time and Pass adopted', 'bill:passed'), ('(m|M)otion to Concur In and Adopt adopted', 'bill:passed'), ('Third Reading Passed', 'bill:passed'), ('Reported from', 'committee:passed'), ('Indefinitely Postponed', 'bill:failed'), ('Passed by House of Origin', 'bill:passed'), ('Passed Second House', 'bill:passed'), # memorial resolutions can pass w/o debate ('Joint Rule 11', ['bill:introduced', 'bill:passed']), ('Lost in', 'bill:failed'), ('Favorable from', 'committee:passed:favorable'), ) def _categorize_action(action): for pattern, types in _action_re: if re.findall(pattern, action): return types return 'other' class ALBillScraper(BillScraper): jurisdiction = 'al' CHAMBERS = {'H': 'lower', 'S': 'upper'} DATE_FORMAT = '%m/%d/%Y' # Tweak which responses are acceptible to the scrapelib internals def accept_response(self, response, **kwargs): # Errored requests should be retried if response.status_code >= 400: return False # Almost all GET requests should _not_ get redirected elif (response.status_code == 302 and response.request.method == 'GET' and 'ALISONLogin.aspx' not in response.request.url): return False # Standard GET responses must have an ASP.NET VIEWSTATE # If they don't, it means the page is a trivial error message elif (not lxml.html.fromstring(response.text).xpath( '//input[@id="__VIEWSTATE"]/@value') and response.request.method == 'GET'): return False else: return True def _set_session(self, session): ''' Activate an ASP.NET session, and set the legislative session ''' SESSION_SET_URL = ('http://alisondb.legislature.state.al.us/' 'Alison/SelectSession.aspx') doc = lxml.html.fromstring(self.get(url=SESSION_SET_URL).text) (viewstate, ) = doc.xpath('//input[@id="__VIEWSTATE"]/@value') (viewstategenerator, ) = doc.xpath( '//input[@id="__VIEWSTATEGENERATOR"]/@value') # Find the link whose text matches the session metadata _scraped_name on the session list page # The __EVENTARGUMENT form value we need to set the session is the second argument # to the __doPostBack JS function, which is the href of each that link (target_session, ) = doc.xpath('//table[@id="ContentPlaceHolder1_gvSessions"]//tr//a/font' '[text()="{}"]/parent::a/@href'.format(self.session_name)) target_session = target_session.replace("javascript:__doPostBack('ctl00$ContentPlaceHolder1$gvSessions','",'') target_session = target_session.replace("')",'') form = { '__EVENTTARGET': 'ctl00$ContentPlaceHolder1$gvSessions', '__EVENTARGUMENT': target_session, '__VIEWSTATE': viewstate, '__VIEWSTATEGENERATOR': viewstategenerator, } self.post(url=SESSION_SET_URL, data=form, allow_redirects=True) def _get_bill_list(self, url): ''' For the bill list and resolution list, 
require that at least one piece of legislation has been found ''' for _retry in range(self.retry_attempts): html = self.get(url=url).text doc = lxml.html.fromstring(html) listing = doc.xpath('//table[@id="ContentPlaceHolder1_gvBills"]/tr')[1:] if listing: return listing elif doc.xpath( '//span[@id="ContentPlaceHolder1_lblCount"]/font/text()' ) == ["0 Instruments", ]: self.warning("Missing either bills or resolutions") return [] else: print "Attempt" print doc.xpath( '//span[@id="ContentPlaceHolder1_lblCount"]/text()' ) continue else: raise AssertionError("Bill list not found") def _get_bill_response(self, url): ''' Ensure that bill pages loaded fully ''' try: html = self.get(url=url, allow_redirects=False).text if lxml.html.fromstring(html).xpath( '//span[@id="ContentPlaceHolder1_lblShotTitle"]'): return html # If a bill page doesn't exist yet, ignore redirects and timeouts except scrapelib.HTTPError: pass return None def scrape(self, session, chambers): self.validate_session(session) self.session = session self.session_name = (self.metadata['session_details'] [self.session]['_scraped_name']) self.session_id = (self.metadata['session_details'] [self.session]['internal_id']) self._set_session(session) # Acquire and process a list of all bills BILL_TYPE_URL = ('http://alisondb.legislature.state.al.us/Alison/' 'SESSBillsBySelectedStatus.aspx') BILL_LIST_URL = ('http://alisondb.legislature.state.al.us/Alison/' 'SESSBillsList.aspx?STATUSCODES=Had%20First%20Reading' '%20House%20of%20Origin&BODY=999999') doc = lxml.html.fromstring(self.get(url=BILL_TYPE_URL).text) (viewstate, ) = doc.xpath('//input[@id="__VIEWSTATE"]/@value') (viewstategenerator, ) = doc.xpath( '//input[@id="__VIEWSTATEGENERATOR"]/@value') form = { '__EVENTTARGET': 'ctl00$ContentPlaceHolder1$gvStatus$ctl02$ctl00', '__EVENTARGUMENT': 'Select$0', '__VIEWSTATE': viewstate, '__VIEWSTATEGENERATOR': viewstategenerator, 'ctl00$ScriptManager1': 'ctl00$UpdatePanel1|ctl00$' 'MainDefaultContent$gvStatus$ctl02$ctl00' } self.post(url=BILL_TYPE_URL, data=form, allow_redirects=True) self.scrape_bill_list(BILL_LIST_URL) self._set_session(session) # Acquire and process a list of all resolutions RESOLUTION_TYPE_URL = ( 'http://alisondb.legislature.state.al.us/Alison/' 'SESSResosBySelectedStatus.aspx?BODYID=1755') RESOLUTION_LIST_URL = ( 'http://alisondb.legislature.state.al.us/Alison/' 'SESSResosList.aspx?STATUSCODES=Had%20First%20Reading' '%20House%20of%20Origin&BODY=999999') resText = self.get(url=RESOLUTION_TYPE_URL).text doc = lxml.html.fromstring(resText) (viewstate, ) = doc.xpath('//input[@id="__VIEWSTATE"]/@value') (viewstategenerator, ) = doc.xpath( '//input[@id="__VIEWSTATEGENERATOR"]/@value') form = { '__EVENTTARGET': 'ctl00$ContentPlaceHolder1$gvStatus$ctl02$ctl00', '__EVENTARGUMENT': 'Select$0', '__VIEWSTATE': viewstate, '__VIEWSTATEGENERATOR': viewstategenerator, 'ctl00$ScriptManager1': 'tctl00$UpdatePanel1|ctl00$' 'MainDefaultContent$gvStatus$ctl02$ctl00' } deb = self.post(url=RESOLUTION_TYPE_URL, data=form, allow_redirects=True) self.scrape_bill_list(RESOLUTION_LIST_URL) def scrape_bill_list(self, url): bill_list = self._get_bill_list(url) for bill_info in bill_list: (bill_id, ) = bill_info.xpath('td[1]/font/input/@value') (sponsor, ) = bill_info.xpath('td[2]/font/input/@value') (subject, ) = bill_info.xpath('td[3]//text()') subject = subject.strip() chamber = self.CHAMBERS[bill_id[0]] if 'B' in bill_id: bill_type = 'bill' elif 'JR' in bill_id: bill_type = 'joint resolution' elif 'R' in bill_id: bill_type = 'resolution' else: raise 
AssertionError( "Unknown bill type for bill '{}'".format(bill_id)) bill = Bill( session=self.session, chamber=chamber, bill_id=bill_id, title='', type=bill_type ) if subject: bill['subjects'] = [subject] if sponsor: bill.add_sponsor(type='primary', name=sponsor) bill.add_source(url) bill_url = ('http://alisondb.legislature.state.al.us/Alison/' 'SESSBillStatusResult.aspx?BILL={}'.format(bill_id)) bill.add_source(bill_url) bill_html = self._get_bill_response(bill_url) if bill_html is None: self.warning("Bill {} has no webpage, and will be skipped". format(bill_id)) continue bill_doc = lxml.html.fromstring(bill_html) if( bill_doc.xpath( '//span[@id="ContentPlaceHolder1_lblShotTitle"]') ): title = bill_doc.xpath( '//span[@id="ContentPlaceHolder1_lblShotTitle"]')[0].text_content().strip() if not title: title = "[No title given by state]" bill['title'] = title version_url_base = ( 'http://alisondb.legislature.state.al.us/ALISON/' 'SearchableInstruments/{0}/PrintFiles/{1}-'. format(self.session, bill_id)) versions = bill_doc.xpath( '//table[@class="box_versions"]/tr/td[2]/font/text()') for version in versions: name = version if version == "Introduced": version_url = version_url_base + 'int.pdf' elif version == "Engrossed": version_url = version_url_base + 'eng.pdf' elif version == "Enrolled": version_url = version_url_base + 'enr.pdf' else: raise NotImplementedError( "Unknown version type found: '{}'".format(name)) bill.add_version( name=name, url=version_url, mimetype='application/pdf' ) # Fiscal notes exist, but I can't figure out how to build their URL fiscal_notes = bill_doc.xpath( '//table[@class="box_fiscalnote"]')[1:] for fiscal_note in fiscal_notes: pass # Budget Isolation Resolutions are handled as extra actions/votes birs = bill_doc.xpath( '//div[@class="box_bir"]//table//table/tr')[1:] for bir in birs: bir_action = bir.xpath('td[1]')[0].text_content().strip() # Sometimes ALISON's database puts another bill's # actions into the BIR action list; ignore these if bill_id not in bir_action: self.warning( "BIR action found ({}) ".format(bir_action) + "that doesn't match the bill ID ({})".format(bill_id)) continue bir_date = datetime.datetime.strptime( bir.xpath('td[2]/font/text()')[0], self.DATE_FORMAT) bir_type = bir.xpath('td[1]/font/text()')[0].split(" ")[0] bir_chamber = self.CHAMBERS[bir_type[0]] bir_text = "{0}: {1}".format( bir_type, bir.xpath('td[3]/font/text()')[0].strip()) bill.add_action( actor=bir_chamber, action=bir_text, date=bir_date, type="other" ) try: (bir_vote_id, ) = bir.xpath('td[4]/font/input/@value') except ValueError: bir_vote_id = '' bir_vote_id = bir_vote_id.strip() if bir_vote_id.startswith("Roll "): bir_vote_id = bir_vote_id.split(" ")[-1] self.scrape_vote( bill=bill, vote_chamber=bir_type[0], bill_id="{0}%20for%20{1}".format(bir_type, bill_id), vote_id=bir_vote_id, vote_date=bir_date, action_text=bir_text ) actions = bill_doc.xpath('//table[@id="ContentPlaceHolder1_gvHistory"]/tr')[1:] action_date = None for action in actions: # If actions occur on the same day, only one date will exist if (action.xpath('td[1]/font/text()')[0]. 
encode('ascii', 'ignore').strip()): action_date = datetime.datetime.strptime( action.xpath('td[1]/font/text()')[0], self.DATE_FORMAT) (action_chamber, ) = action.xpath('td[2]/font/text()') (action_text, ) = action.xpath('td[4]/font/text()') action_type = _categorize_action(action_text) # check for occasional extra last row if not action_chamber.strip(): continue # The committee cell is just an abbreviation, so get its name actor = self.CHAMBERS[action_chamber] try: action_committee = re.search( r'.*? referred to the .*? committee on (.*?)$', action_text).group(1).strip() except AttributeError: action_committee = '' bill.add_action( actor=actor, action=action_text, date=action_date, type=action_type, committees=action_committee if action_committee else None ) try: vote_button = action.xpath('td[9]//text()')[0].strip() except: vote_button = '' if vote_button.startswith("Roll "): vote_id = vote_button.split(" ")[-1] self.scrape_vote( bill=bill, vote_chamber=action_chamber, bill_id=bill_id, vote_id=vote_id, vote_date=action_date, action_text=action_text ) self.save_bill(bill) def scrape_vote(self, bill, vote_chamber, bill_id, vote_id, vote_date, action_text): url = ('http://alisondb.legislature.state.al.us/Alison/' 'GetRollCallVoteResults.aspx?' 'VOTE={0}&BODY={1}&INST={2}&SESS={3}'. format(vote_id, vote_chamber, bill_id, self.session_id)) doc = lxml.html.fromstring(self.get(url=url).text) voters = {'Y': [], 'N': [], 'P': [], 'A': []} voters_and_votes = doc.xpath('//table/tr/td/font/text()') capture_vote = False name = '' for item in voters_and_votes: if capture_vote: capture_vote = False if name: voters[item].append(name) else: capture_vote = True name = item if (name.endswith(", Vacant") or name.startswith("Total ") or not name.strip()): name = '' # Check name counts against totals listed on the site total_yea = doc.xpath('//*[starts-with(text(), "Total Yea")]/text()') if total_yea: total_yea = int(total_yea[0].split(":")[-1]) assert total_yea == len(voters['Y']), "Yea count incorrect" else: total_yea = len(voters['Y']) total_nay = doc.xpath('//*[starts-with(text(), "Total Nay")]/text()') if total_nay: total_nay = int(total_nay[0].split(":")[-1]) assert total_nay == len(voters['N']), "Nay count incorrect" else: total_nay = len(voters['N']) total_absent = doc.xpath( '//*[starts-with(text(), "Total Absent")]/text()') if total_absent: total_absent = int(total_absent[0].split(":")[-1]) assert total_absent == len(voters['A']), "Absent count incorrect" total_other = len(voters['P']) + len(voters['A']) vote = Vote( self.CHAMBERS[vote_chamber[0]], vote_date, action_text, total_yea > total_nay, total_yea, total_nay, total_other) vote.add_source(url) for member in voters['Y']: vote.yes(member) for member in voters['N']: vote.no(member) for member in (voters['A'] + voters['P']): vote.other(member) bill.add_vote(vote)
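The _categorize_action helper at the top of the scraper can be sanity-checked on its own; a minimal sketch with made-up action strings:

# Quick self-check of the action classifier defined above. Patterns are
# tried in order, so the more specific first-reading rule wins over the
# generic 'referred to' rule.
assert _categorize_action('Read for the first time and referred to the Committee on Rules') == \
    ['bill:reading:1', 'committee:referred']
assert _categorize_action('Indefinitely Postponed') == 'bill:failed'
assert _categorize_action('Some unrecognized action') == 'other'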
A delegation led by Judicial Reform Foundation executive director Lin Feng-jeng, left, chairman Chiu Hei-yuan, center, and board member Joseph Lin gathers in front of the Control Yuan in Taipei yesterday, requesting the Control Yuan to investigate Prosecutor-General Huang Shih-ming and abolish the Special Investigation Division.
A Judicial Reform Foundation delegation yesterday urged the Control Yuan to investigate Prosecutor-General Huang Shih-ming (黃世銘) on charges of abuse of power and authorizing secret wiretapping. The delegation also called for abolishing the Supreme Prosecutors' Office Special Investigation Division (SID).
Gathered in front of the Control Yuan, the delegation was headed by foundation chairman Chiu Hei-yuan (瞿海源), board member Joseph Lin (林永頌) and executive director Lin Feng-jeng (林峰正).
"Prosecutor-General Huang claimed that he was acting in accordance with Article 44 of the Constitution in reporting to the president regarding the alleged illegal lobbying of Legislative Speaker Wang Jin-pyng (王金平). However, that article concerns how to handle the president calling a consultation in case of disputes between two or more branches of government," Chiu said.
"However, how can Huang represent the Executive Yuan? His action is a serious violation of the Constitution. We demand that the Control Yuan investigate this thoroughly," Chiu said.
The delegation also said that no one must interfere in the judicial process, and all those involved in the illegal lobbying case should be investigated, including Wang, Democratic Progressive Party (DPP) caucus whip Ker Chien-ming (柯建銘), former minister of justice Tseng Yung-fu (曾勇夫) and High Prosecutors' Office Head Prosecutor Chen Shou-huang (陳守煌).
"According to the principle of legislative self-discipline, the investigation into the conduct of Wang and Ker should be done by the Legislative Yuan's Discipline Committee," Lin said. He said that since both men are members of political parties, both parties should investigate their conduct, under the principle of responsible politics.
"If they are found guilty, then they must bear all political responsibility, executive power responsibility and even criminal liability. There is no room for ambiguity on this matter," Lin said.
The delegation's request and statement were accepted by Control Yuan member Yu Teng-fang (余騰芳). In a press release, the foundation called for major reforms in six areas of the judicial process.
from soccermetrics.rest.resources import Resource from soccermetrics.rest.resources.events import MatchEvents from soccermetrics.rest.resources.statistics import MatchStatistics class MatchResource(Resource): """ Represents a Match REST resource (<play>/matches/<resource> endpoints). The Match Resources are a collection of macro-events, micro-events, and summary statistics resources in the Soccermetrics Connect API. Derived from :class:`resources.Resource`. """ def __init__(self, play, base_uri, auth): """ Constructor of MatchResource class. :param play: Type of teams playing in matches. :type play: string :param base_uri: Base URI of API. :type base_uri: string :param auth: Authentication credential. :type auth: tuple """ super(MatchResource, self).__init__(base_uri,auth) self.base_endpoint = "%s/%s/matches" % (self.endpoint, play) self.match = None self.resource = None def EndpointURI(self): """ Construct URI of Match REST resource. URI is of format ``/matches/<match>/<resource>/``. :returns: URI of REST resource. :rtype: string """ return '/'.join(str(x) for x in [self.base_endpoint,self.match,self.resource] if x) def get(self, match=None, uid=None, **kwargs): """ Retrieves a representation of Match REST resource. If the status code is 200 (OK), returns the representation. Otherwise, returns an error. :param match: Unique ID associated with football match. :type match: integer :param uid: Unique ID of API resource representation. :type uid: integer :param kwargs: Collection of query parameters. :type kwargs: dict :returns: Resource representation. :rtype: Return value of :func:`MatchResource.get`. """ self.match = match self.endpoint = self.EndpointURI() return super(MatchResource, self).get(uid, **kwargs) def head(self): """ Retrieves header data of Match REST resource. :returns: Header data. :rtype: Return value of :func:`MatchResource.head`. """ self.match = None self.endpoint = self.EndpointURI() return super(MatchResource, self).head() def options(self): """ Retrieves documentation of Match REST resource. If the status code is 200 (OK), returns the documentation. Otherwise, returns an error. Link resources are not included in the documentation. :returns: Resource documentation data. :rtype: Return value of :func:`MatchResource.options`. """ self.match = None self.endpoint = self.EndpointURI() return super(MatchResource, self).options() class MatchInformation(MatchResource): """ Access to Match Information resources (/<play>/matches/info resource). Derived from :class:`MatchResource`. """ def __init__(self, play, base_uri, auth): super(MatchInformation, self).__init__(play, base_uri, auth) self.resource = "info" class MatchConditions(MatchResource): """ Access to Match Conditions resources (/<play>/matches/conditions resource). Derived from :class:`MatchResource`. """ def __init__(self, play, base_uri, auth): super(MatchConditions, self).__init__(play, base_uri, auth) self.resource = "conditions" class MatchLineups(MatchResource): """ Access to Match Lineups resources (/<play>/matches/lineups resource). Derived from :class:`MatchResource`. """ def __init__(self, play, base_uri, auth): super(MatchLineups, self).__init__(play, base_uri, auth) self.resource = "lineups" class MatchGoals(MatchResource): """ Access to Match Goals resources (/<play>/matches/goals resource). Derived from :class:`MatchResource`. 
""" def __init__(self, play, base_uri, auth): super(MatchGoals, self).__init__(play, base_uri, auth) self.resource = "goals" class MatchPenalties(MatchResource): """ Access to Match Penalties resources (/<play>/matches/penalties resource). Derived from :class:`MatchResource`. """ def __init__(self, play, base_uri, auth): super(MatchPenalties, self).__init__(play, base_uri, auth) self.resource = "penalties" class MatchOffenses(MatchResource): """ Access to Match Offenses resources (/<play>/matches/offenses resource). Derived from :class:`MatchResource`. """ def __init__(self, play, base_uri, auth): super(MatchOffenses, self).__init__(play, base_uri, auth) self.resource = "offenses" class MatchSubstitutions(MatchResource): """ Access to Match Substitutions resources (/<play>/matches/substitutions resource). Derived from :class:`MatchResource`. """ def __init__(self, play, base_uri, auth): super(MatchSubstitutions, self).__init__(play, base_uri, auth) self.resource = "substitutions" class MatchShootouts(MatchResource): """ Access to Match Shootouts resources (/<play>/matches/shootouts resource). Derived from :class:`MatchResource`. """ def __init__(self, play, base_uri, auth): super(MatchShootouts, self).__init__(play, base_uri, auth) self.resource = "shootouts" class MatchPlay(object): """ Access to Match objects for a specific type of match (club, national team). +----------------+---------------------------+ | Attribute | Description | +================+===========================+ | information | Match information | +----------------+---------------------------+ | lineups | Match lineups | +----------------+---------------------------+ | conditions | Match conditions | +----------------+---------------------------+ | goals | Goal events | +----------------+---------------------------+ | penalties | Penalty kick events | +----------------+---------------------------+ | offenses | Disciplinary events | +----------------+---------------------------+ | substitutions | Substitution events | +----------------+---------------------------+ | shootouts | Penalty shootout events | +----------------+---------------------------+ | stats | Match statistics | +----------------+---------------------------+ | events | Match micro-events | +----------------+---------------------------+ """ def __init__(self, play, base_uri, auth): self.information = MatchInformation(play, base_uri, auth) self.lineups = MatchLineups(play, base_uri, auth) self.conditions = MatchConditions(play, base_uri, auth) self.goals = MatchGoals(play, base_uri, auth) self.penalties = MatchPenalties(play, base_uri, auth) self.offenses = MatchOffenses(play, base_uri, auth) self.substitutions = MatchSubstitutions(play, base_uri, auth) self.shootouts = MatchShootouts(play, base_uri, auth) self.stats = MatchStatistics(play, base_uri, auth) self.events = MatchEvents(play, base_uri, auth)
Altair VR is the first professional project that aims to launch a full virtual reality solution on the blockchain. Participants on the Altair VR platform will be able not just to take part in unique virtual reality experiences, but to create their own VR scenarios and share them with others. With both VR and blockchain set to dominate the next decade, Altair is in a unique position to be a pioneer by combining these two fields.
How Will Altair VR Work?
Participants on the platform will purchase special virtual reality headsets, covering their natural vision with a full virtual interface. The user then chooses a virtual reality scenario and interacts with it. The scope of virtual reality scenarios on Altair VR seems to be limitless; creators can build any scenario imaginable, ranging from climbing a mountain or visiting a neutron star right through to more practical educational applications, where you can sit in a live virtual classroom learning your favorite topic, or virtually fix a car or fighter jet with your own hands to gain "hands-on" experience at a task.
The Altair VR team aren't just ivory tower visionaries either. The founder has had success creating a large number of planetariums (educational theatres with giant screens) throughout Russia, and runs one of the leading planetarium service companies in the country. The motivation behind Altair VR is to create a virtual education platform that makes learning easier, more fun, and, most importantly, more efficient. While this is the primary motivation behind the platform, the actual applications could be almost anything.
The Altair VR team has actual experience launching various planetariums in Russia.
What I like about this ICO is that the team is transparent, and they have real business experience related to their project. Many ICOs have teams that seem too afraid of the spotlight, perhaps not confident enough in their project. That is not the case with Altair VR. As can be seen from their team video, they are transparent and are putting themselves out there in marketing their project and what it aims to solve. This automatically adds a tremendous amount of legitimacy to the ICO in my eyes.
More than just words alone, Altair VR plans to actively incorporate blockchain into its virtual reality vision. On the platform, community members will be able to create their own virtual reality scenarios and upload them to the blockchain for others to experience, whereby other users can "vote" on them decentrally. This also gives large, established companies an opportunity to launch comprehensive VR scenarios on the blockchain, which could lead to very interesting things. Perhaps universities may one day even launch their own educational classrooms if the platform takes off. The idea of a virtual Coursera honestly has me excited, but I might be jumping ahead a bit here.
The Altair VR team also seems to respect the underlying principles of blockchain technology, and is not going to be too greedy with its commissions. In fact, all purchases of VR scenarios will incur only a 10% fee, with the other 90% going to the scenario creator. Compared to centralized networks that charge up to 70% commission, this will hopefully incentivize a lot of content creators to create unique virtual reality experiences.
The platform will use the ALT token as its currency, the same token sold during the ICO. All scenario purchases and payments to content creators will be in the form of ALT, which gives the token obvious utility.
In addition to payment, users will need a certain level of ALT tokens in their account to gain access to special benefits, such as the unlocking of various scenario types, providing another avenue for ALT demand. ALT token holders will also act as gatekeepers to the community, giving them voting power on the platform and other benefits.
The Altair VR platform will also undergo a token-burning process, whereby 1% of the daily turnover of tokens will be burnt until an ideal supply of 100,000,000 ALT remains from the initial creation supply of 1 billion. All team tokens have one-year vesting restrictions.
40% of funding will be spent on marketing, 39% on R&D, and 1% on charity. The rest will go to the team, bounties, and legal/administrative expenses. The private pre-sale is active at the time of publication with a 50 ETH minimum investment; the main public ITO will begin June 15th with a fair bonus structure, beginning at 15% and ending at 0%. The platform launch is targeted for Q3 2018, after the ICO.
There is not much that I can critique about this ICO. The team is verifiable and real, they have real-world business experience related to their project, and they have a clear roadmap. I wouldn't be surprised to see Altair become a household virtual reality name in the not-too-distant future. As with any ICO, however, you should not invest more than you can afford to lose, since any investment carries inherent risks that are not obvious. Will the team deploy their funding correctly? Will they stay engaged with the community post-ICO? I think there's a high chance they will, given their past experience and enthusiasm for the project; however, no one knows these things for certain.
All in all, if you believe in virtual reality and would like to see a VR solution incorporated into the blockchain, then I'd definitely say Altair VR is poised to take the dominant role in this space, and it has my recommendation.
The team behind Altair VR has actual business experience launching a successful company in Russia that operates a large number of planetariums. This puts them ahead of most ICOs. Their vision, to provide the first virtual reality platform realized on the blockchain, is a unique one with potentially huge impact. As with all investments, risk is inherent; however, I'll personally be following this project, as it has me genuinely excited.
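The burn schedule described above is easy to sanity-check numerically; a back-of-the-envelope sketch, where the daily turnover rate is a made-up assumption:

# 1% of daily token turnover is burnt until the supply falls from
# 1 billion ALT to the 100,000,000 ALT floor described in the article.
supply = 1000000000.0          # initial ALT supply
floor = 100000000.0            # burn floor
daily_turnover_rate = 0.05     # assumption: 5% of the supply trades each day

days = 0
while supply > floor:
    daily_turnover = supply * daily_turnover_rate
    supply -= 0.01 * daily_turnover   # burn 1% of the day's turnover
    days += 1

print('%d days (~%.1f years) to reach the floor under these assumptions'
      % (days, days / 365.0))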
#!/usr/bin/env python from flask import Flask, g, jsonify from flask.ext.script import Manager from api.app import create_app from api.models import db, User manager = Manager(create_app) @manager.command def createdb(): app = create_app() with app.app_context(): db.drop_all() db.create_all() @manager.command def adduser(username): """Register a new user.""" from getpass import getpass password = getpass() password2 = getpass(prompt='Confirm: ') if password != password2: import sys sys.exit('Error: passwords do not match.') db.create_all() user = User(username=username, password=password) db.session.add(user) db.session.commit() print('User {0} was registered successfully.'.format(username)) @manager.command def test(): from subprocess import call call(['nosetests', '-v', '--with-coverage', '--cover-package=api', '--cover-branches', '--cover-erase', '--cover-html', '--cover-html-dir=cover']) if __name__ == '__main__': manager.run()
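For reference, the management commands defined above are invoked through Flask-Script from the shell, assuming the file is saved as manage.py:

python manage.py createdb
python manage.py adduser alice    # prompts twice for the password
python manage.py test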
US Drug Test Centers can provide drug testing for you in Wellington, Florida. We offer a variety of drug and alcohol tests that are fast, affordable, accurate, and convenient. Not only can we perform drug and alcohol testing in the Wellington area, but also nationwide. Give our office a call to get scheduled today! US Drug Test Centers has multiple locations in and near Wellington. Our testing locations require a drug test order form / donor pass and your photo ID. We can provide you with a drug test order form / donor pass within minutes of calling our office at 866-566-0261 or by placing your order online by clicking here. US Drug Test Centers locations do not accept payment on site. Most of our testing locations accept walk-ins or same day appointments to minimize wait times.
from __future__ import division, print_function, absolute_import

import copy
import math

import numpy as np

from .distrib import ProbabilityDistribution


class Experience(object):
    """Experience base class.

    Representation of an experience occurring from acting in the environment.

    Parameters
    ----------
    state : State
        The representation of the current state.
    action : Action
        The executed action.
    next_state : State
        The representation of the state following from acting with `action`
        in state `state`.
    reward : int or float
        The reward awarded by the environment for the state-action pair.

    Attributes
    ----------
    state : State
        The experienced state.
    action : Action
        The experienced action.
    next_state : State
        The experienced next state.
    reward : float
        The experienced reward.

    """
    __slots__ = ('state', 'action', 'next_state', 'reward')

    def __init__(self, state, action, next_state, reward=None):
        self.state = state
        self.action = action
        self.next_state = next_state
        self.reward = reward

    def __str__(self):
        # Include the reward in the representation only when it is set.
        s = "state={0} act={1} next_state={2}".format(self.state, self.action, self.next_state) if self.reward is None else \
            "state={0} act={1} reward={2:.2f} next_state={3}".format(
                self.state, self.action, self.reward, self.next_state)
        return s


class RewardFunction(object):
    """The reward function.

    The reward function is responsible for calculating the proper value
    of the reward. Callback functions can be specified for custom
    calculation of the reward value.

    Attributes
    ----------
    cb_get : callable
        Callback function to retrieve the reward value.
    cb_set : callable
        Callback function to set the reward value.
    reward : float
        The reward value.
    bonus : float
        The bonus added to the reward.
    rmax : float
        The maximum possible reward.
    activate_bonus : bool
        Flag activating/deactivating the bonus.

    Notes
    -----
    To ensure that the correct value of the reward is being accessed,
    the user should not access the class variables directly but instead
    use the methods :meth:`set` and :meth:`get` to set and get the reward
    respectively.

    Examples
    --------
    >>> RewardFunction.cb_get = staticmethod(lambda r, s: np.dot(s, RewardFunction.reward))

    In this case the reward value is calculated by taking the dot product
    of the stored reward and a passed-in value.

    >>> RewardFunction.reward = [0.1, 0.9, 1.0, 0.0]

    This sets the reward for all instances of the reward function.

    >>> reward_func = RewardFunction()
    >>> print(reward_func.get([0.9, 0.5, 0.0, 1.0]))
    0.54

    This calculates the reward `r` according to the previously defined
    callback function.

    """
    __slots__ = ('_bonus',)

    # Class-level configuration shared by all instances; these must not
    # appear in __slots__, since a name cannot be both a slot and a
    # class variable.
    cb_get = None
    cb_set = None
    reward = 0.0
    rmax = 0.0
    activate_bonus = False

    @property
    def bonus(self):
        """The bonus added to the reward to encourage exploration.

        Returns
        -------
        float :
            The bonus added to the reward.

        """
        return self._bonus

    @bonus.setter
    def bonus(self, value):
        self._bonus = value

    def __init__(self):
        self._bonus = 0.0
        """:type: float"""

    def __getstate__(self):
        return {
            'reward': self.reward,
            'rmax': self.rmax,
            'bonus': self.bonus,
            'activate_bonus': self.activate_bonus
        }

    def __setstate__(self, d):
        for name, value in d.iteritems():
            if not name == 'bonus':
                setattr(type(self), name, value)
            else:
                setattr(self, name, value)

    def set(self, value, *args, **kwargs):
        """Set the reward value.

        If :meth:`cb_set` is set, the callback will be called
        to set the value.

        Parameters
        ----------
        args : tuple
            Positional arguments passed to the callback.
        kwargs : dict
            Non-positional arguments passed to the callback.

        """
        if self.cb_set is not None:
            type(self).reward = self.cb_set(*args, **kwargs)
            return
        type(self).reward = value

    def get(self, *args, **kwargs):
        """Retrieve the reward value.

        If :meth:`cb_get` is set, the callback will be called
        to retrieve the value.

        Parameters
        ----------
        args : tuple
            Positional arguments passed to the callback.
        kwargs : dict
            Non-positional arguments passed to the callback.

        Returns
        -------
        float :
            The (calculated) reward value.

        """
        reward = self.reward
        if self.cb_get is not None:
            reward = self.cb_get(self.reward, *args, **kwargs)

        if self.activate_bonus:
            reward = max(self.reward + self.bonus, self.rmax)
        return reward


class StateActionInfo(object):
    """The models interface.

    Contains all relevant information predicted by a model for a
    given state-action pair. This includes the (predicted) reward and
    transition probabilities to possible next states.

    Attributes
    ----------
    transition_proba : ProbabilityDistribution
        The transition probability distribution.
    reward_func : RewardFunction
        The reward function.
    visits : int
        The number of times the state-action pair has been visited.
    known : bool
        Flag indicating whether a reward value is known or not.

    """
    __slots__ = ('transition_proba', 'reward_func', 'visits', 'known')

    def __init__(self):
        self.transition_proba = ProbabilityDistribution()
        self.reward_func = RewardFunction()
        self.visits = 0
        self.known = False

    def __getstate__(self):
        data = {}
        for name in self.__slots__:
            data[name] = getattr(self, name)
        return data

    def __setstate__(self, d):
        for name, value in d.iteritems():
            setattr(self, name, value)


class StateData(object):
    """State information interface.

    Information about the state can be accessed here.

    Parameters
    ----------
    state_id : int
        The unique id of the state.
    actions : list[Action]
        List of actions that can be taken in this state.

    Attributes
    ----------
    id : int
        The unique id of the state.
    models : dict
        The reward and transition models for each action.
    q : dict
        The q-table, containing a q-value for each action.
    steps_away : int
        The number of steps the state is away from its closest neighbor.

    """
    __slots__ = ('id', 'models', 'q', 'steps_away')

    def __init__(self, state_id, actions):
        self.id = state_id
        """:type: int"""
        self.models = {a: StateActionInfo() for a in actions}
        """:type: dict[Action, StateActionInfo]"""
        # Randomizing the initial q-values impedes performance
        # self.q = {a: ((0.01 - 0.0) * np.random.random() + 0.0) for a in actions}
        self.q = {a: 0.0 for a in actions}
        """:type: dict[Action, float]"""
        self.steps_away = 100000
        """:type: int"""

    def __getstate__(self):
        data = {}
        for name in self.__slots__:
            data[name] = getattr(self, name)
        return data

    def __setstate__(self, d):
        for name, value in d.iteritems():
            setattr(self, name, value)


class MDPPrimitive(object):
    """A Markov decision process primitive.

    The base class for :class:`State` and :class:`Action`. Primitives
    are represented by a list of features. They optionally can have a
    `name`.

    Parameters
    ----------
    features : array_like, shape (`nfeatures`,)
        List of features, where `nfeatures` is the number of features
        identifying the primitive.
    name : str, optional
        The name of the primitive. Default is "".

    Attributes
    ----------
    name
    dtype : {DTYPE_FLOAT, DTYPE_INT, DTYPE_OBJECT}
        The type of the features.
    nfeatures : int
        The number of features.
    discretized : bool
        Flag indicating whether the features are discretized or not.
    min_features : list
        The minimum value for each feature.
    max_features : list
        The maximum value for each feature.
    states_per_dim : list
        The number of states per dimension.
    description : dict
        A description of the features.

    Raises
    ------
    ValueError
        If the feature array is not one-dimensional.

    Notes
    -----
    Use the `description` to encode action information. The information
    should contain the list of all available feature combinations and the
    name of each feature.

    :Examples:

        A description of an action with three possible discrete actions:

        ::

            {
                "out": {"value": [-0.004]},
                "in": {"value": [0.004]},
                "kick": {"value": [-1.0]}
            }

        A description of an action with one possible continuous action
        with name `move`; a value of `*` allows the action to be found for
        every feature array. Additional information encoding the feature
        name together with its index into the feature array is given for
        each higher-level element of the feature array:

        ::

            {
                "move": {
                    "value": "*",
                    "descr": {
                        "LArm": {"dx": 0, "dy": 1, "dz": 2},
                        "RArm": {"dx": 3, "dy": 4, "dz": 5},
                        "LLeg": {"dx": 6, "dy": 7, "dz": 8},
                        "RLeg": {"dx": 9, "dy": 10, "dz": 11},
                        "Torso": {"dx": 12, "dy": 13, "dz": 14}
                    }
                }
            }

        Similarly, a continuous state can be encoded as follows, which
        identifies the name of each feature together with its index into
        the feature array:

        ::

            {
                "LArm": {"x": 0, "y": 1, "z": 2},
                "RArm": {"x": 3, "y": 4, "z": 5},
                "LLeg": {"x": 6, "y": 7, "z": 8},
                "RLeg": {"x": 9, "y": 10, "z": 11},
                "Torso": {"x": 12, "y": 13, "z": 14}
            }

        A discrete state can be encoded by identifying the position of
        each feature:

        ::

            {
                "image x-position": 0,
                "displacement (mm)": 1
            }

        Alternatively, the features can be identified by a list of features,
        giving the positional description:

        ::

            ["image x-position", "displacement (mm)"]

    Rather than setting the attributes directly, use the methods
    :meth:`set_nfeatures`, :meth:`set_dtype`, :meth:`set_description`,
    :meth:`set_discretized`, :meth:`set_minmax_features`, and
    :meth:`set_states_per_dim` in order to enforce type checking.

    """
    __slots__ = ('_features', '_name', 'ix')

    DTYPE_OBJECT = np.object
    DTYPE_FLOAT = np.float64
    DTYPE_INT = np.int32

    # Class-level feature metadata shared by all instances; these must
    # not appear in __slots__, since a name cannot be both a slot and a
    # class variable.
    dtype = DTYPE_FLOAT
    nfeatures = None
    description = None
    discretized = False
    min_features = None
    max_features = None
    states_per_dim = None

    @property
    def name(self):
        """The name of the MDP primitive.

        Returns
        -------
        str :
            The name of the primitive.

        """
        return self._name

    @classmethod
    def set_nfeatures(cls, n):
        """Set the number of features.

        Parameters
        ----------
        n : int
            The number of features.

        Raises
        ------
        ValueError
            If `n` is not of type integer.

        """
        if not isinstance(n, int):
            raise ValueError("Attribute 'nfeatures' must be of <type 'int'>, got %s" % str(type(n)))
        cls.nfeatures = n

    @classmethod
    def set_dtype(cls, value=DTYPE_FLOAT):
        """Set the feature's data type.

        Parameters
        ----------
        value : {DTYPE_FLOAT, DTYPE_INT, DTYPE_OBJECT}
            The data type.

        Raises
        ------
        ValueError
            If the data type is not one of the allowed types.

        """
        if value not in [np.float64, np.int32, np.object]:
            raise ValueError("Attribute 'dtype' must be one of the allowed types, got %s" % str(type(value)))
        cls.dtype = value

    @classmethod
    def set_description(cls, descr):
        """Set the feature description.

        This extracts the number of features from the description and
        checks that it matches `nfeatures`. If `nfeatures` is None,
        `nfeatures` is set to the extracted value.

        Parameters
        ----------
        descr : dict
            The feature description.

        Raises
        ------
        ValueError
            If the number of features extracted from the description does
            not match `nfeatures` or if `name` isn't of type string.

        Notes
        -----
        Use the `description` to encode action information. The information
        should contain the list of all available feature combinations and the
        name of each feature.

        Examples
        --------
        A description of an action with three possible discrete actions:

        ::

            {
                "out": {"value": [-0.004]},
                "in": {"value": [0.004]},
                "kick": {"value": [-1.0]}
            }

        A description of an action with one possible continuous action
        with name `move`; a value of `*` allows the action to be found for
        every feature array. Additional information encoding the feature
        name together with its index into the feature array is given for
        each higher-level element of the feature array:

        ::

            {
                "move": {
                    "value": "*",
                    "descr": {
                        "LArm": {"dx": 0, "dy": 1, "dz": 2},
                        "RArm": {"dx": 3, "dy": 4, "dz": 5},
                        "LLeg": {"dx": 6, "dy": 7, "dz": 8},
                        "RLeg": {"dx": 9, "dy": 10, "dz": 11},
                        "Torso": {"dx": 12, "dy": 13, "dz": 14}
                    }
                }
            }

        Similarly, a continuous state can be encoded as follows, which
        identifies the name of each feature together with its index into
        the feature array:

        ::

            {
                "LArm": {"x": 0, "y": 1, "z": 2},
                "RArm": {"x": 3, "y": 4, "z": 5},
                "LLeg": {"x": 6, "y": 7, "z": 8},
                "RLeg": {"x": 9, "y": 10, "z": 11},
                "Torso": {"x": 12, "y": 13, "z": 14}
            }

        A discrete state can be encoded by identifying the position of
        each feature:

        ::

            "descr": {
                "image x-position": 0,
                "displacement (mm)": 1
            }

        Alternatively, the features can be identified by a list of features,
        giving the positional description:

        ::

            ["image x-position", "displacement (mm)"]

        """
        nfeatures = None

        if isinstance(descr, dict):
            config = descr.itervalues().next()
            if 'descr' in config:
                nfeatures = sum(len(v) for v in config['descr'].itervalues())
                if cls.nfeatures is not None and not cls.nfeatures == nfeatures:
                    raise ValueError("Dimension mismatch: array described by 'descr' is a vector of length %d,"
                                     " but attribute cls.nfeatures = %d" % (nfeatures, cls.nfeatures))
            elif 'value' in config and not config['value'] == '*':
                nfeatures = len(config['value'])
                if cls.nfeatures is not None and not cls.nfeatures == nfeatures:
                    raise ValueError("Dimension mismatch: array described by 'value' is a vector of length %d,"
                                     " but attribute cls.nfeatures = %d" % (nfeatures, cls.nfeatures))
            else:
                nfeatures = sum(len(v) for v in descr.itervalues())
                if cls.nfeatures is not None and not cls.nfeatures == nfeatures:
                    raise ValueError("Dimension mismatch: 'descr' is a vector of length %d,"
                                     " but attribute cls.nfeatures = %d" % (nfeatures, cls.nfeatures))
        elif isinstance(descr, list):
            nfeatures = len(descr)
            if cls.nfeatures is not None and not cls.nfeatures == nfeatures:
                raise ValueError("Dimension mismatch: 'descr' is a vector of length %d,"
                                 " but attribute cls.nfeatures = %d" % (nfeatures, cls.nfeatures))

        if cls.nfeatures is None:
            cls.nfeatures = nfeatures
        cls.description = descr

    @classmethod
    def set_discretized(cls, val=False):
        """Sets the `discretized` flag.

        Parameters
        ----------
        val : bool
            Flag identifying whether the features are discretized or not.
            Default is False.

        Raises
        ------
        ValueError
            If `val` is not boolean type.

        """
        if not isinstance(val, bool):
            raise ValueError("Attribute 'discretized' must be of <type 'bool'>, got %s" % str(type(val)))
        cls.discretized = val

    @classmethod
    def set_minmax_features(cls, _min, _max):
        """Sets the minimum and maximum value for each feature.
This extracts the number of features from the `_min` and `_max` values and ensures that it matches with `nfeatures`. If `nfeatures` is None, the `nfeatures` attribute is set to the extracted value. Parameters ---------- _min : array_like, shape(`nfeatures`,) The minimum value for each feature _max : array_like, shape(`nfeatures`,) The maximum value for each feature Raises ------ ValueError If the arrays are not one-dimensional vectors, the shapes of the arrays don't match, or the number of features does not agree with the attribute `nfeatures`. """ _min = np.asarray(_min, dtype=cls.dtype) _max = np.asarray(_max, dtype=cls.dtype) dim = _min.size if dim == 1: _min.shape = (1,) dim = _max.size if dim == 1: _max.shape = (1,) if _min.shape[0] != _max.shape[0]: raise ValueError("Dimension mismatch: array '_min' is a vector of length %d," " but '_max' is of length %d" % (_min.shape[0], _max.shape[0])) if cls.nfeatures is None: cls.nfeatures = _min.shape[0] if _min.shape[0] != cls.nfeatures: raise ValueError("Arrays '_min' and '_max' must be of length %d." % cls.nfeatures) cls.min_features = _min cls.max_features = _max @classmethod def set_states_per_dim(cls, nstates): """Sets the number of states per feature. This extracts the number of features from `nstates` and compares it to the attribute `nfeatures`. If it doesn't match, an exception is thrown. If the `nfeatures` attribute is None, `nfeatures` is set to the extracted value. Parameters ---------- nstates : array_like, shape (`nfeatures`,) The number of states per features Raises ------ ValueError If the array is not a vector of length `nfeatures`. """ nstates = np.asarray(nstates, dtype=cls.dtype) dim = nstates.size if dim == 1: nstates.shape = (1,) if cls.nfeatures is None: cls.nfeatures = nstates.shape[0] if nstates.ndim != 1 or nstates.shape[0] != cls.nfeatures: raise ValueError("Array 'nstates' must be a vector of length %d." 
                             % cls.nfeatures)
        cls.states_per_dim = nstates

    def __init__(self, features, name=None):
        if type(self).dtype is None:
            type(self).dtype = MDPPrimitive.DTYPE_FLOAT

        self._features = np.asarray(features)
        if self._features.ndim != 1:
            raise ValueError("Array 'features' must be one-dimensional,"
                             " but features.ndim = %d" % self._features.ndim)

        self._name = name if name is not None else ""
        if not isinstance(self._name, basestring):
            raise ValueError("'name' must be a string, but got %s" % str(type(self._name)))

        if type(self).nfeatures is None:
            type(self).nfeatures = self._features.shape[0]
        elif not self._features.shape[0] == type(self).nfeatures:
            raise ValueError("Dimension mismatch: array 'features' is a vector of length %d, but"
                             " attribute cls.nfeatures = %d" % (self._features.shape[0],
                                                                type(self).nfeatures))

        # `states_per_dim` is an ndarray; testing it for truth directly is
        # ambiguous, so compare against None explicitly.
        if type(self).discretized and type(self).states_per_dim is not None:
            self.discretize()

    # noinspection PyUnusedLocal
    def __get__(self, instance, owner):
        return self._features

    def __getitem__(self, index):
        checker = np.vectorize(lambda x: isinstance(x, slice))
        # An index equal to len(self) is already out of range, hence `>=`.
        if index >= len(self) and not np.any(checker(index)):
            raise IndexError("Index out of range")
        return self._features[index]

    def __setitem__(self, index, value):
        if index >= len(self):
            raise IndexError("Assignment index out of range")
        self._features[index] = value

    def __len__(self):
        return len(self._features)

    def __contains__(self, item):
        return item in self._features

    def __hash__(self):
        return hash(tuple(self._features)) if self._features is not None else None

    def __eq__(self, other):
        return np.array_equal(other.get(), self._features)

    def __sub__(self, other):
        return self._features - other

    def __mul__(self, other):
        return self._features * other

    def __imul__(self, other):
        self._features *= other
        return self

    def __iter__(self):
        self.ix = 0
        return self

    def __str__(self):
        features = np.array_str(self.encode())
        return ("'" + self._name + "':\t" + features) if self._name else features

    def __repr__(self):
        features = np.array_str(self.encode())
        return ("'" + self._name + "':\t" + features) if self._name else features

    def next(self):
        if self.ix == len(self):
            raise StopIteration
        item = self._features[self.ix]
        self.ix += 1
        return item

    def __deepcopy__(self, memo):
        # The memo-dict protocol belongs to __deepcopy__; copy.copy calls
        # __copy__ without arguments, so the original signature was wrong.
        cls = self.__class__
        result = cls.__new__(cls)
        memo[id(self)] = result
        for k in self.__slots__:
            try:
                setattr(result, k, copy.copy(getattr(self, k)))
            except AttributeError:
                pass
        return result

    def __getstate__(self):
        data = {}
        for name in self.__slots__:
            if not name == 'ix':
                data[name] = getattr(self, name)
        return data

    def __setstate__(self, d):
        for name, value in d.iteritems():
            if name not in ['nfeatures', 'dtype', 'description', 'discretized', 'min_features',
                            'max_features', 'states_per_dim']:
                setattr(self, name, value)
        type(self).nfeatures = self._features.shape[0]

    def get(self):
        """Return the feature array.

        Returns
        -------
        ndarray :
            The feature array.

        """
        return self._features

    def tolist(self):
        """Returns the feature array as a list.

        Returns
        -------
        list :
            The features list.

        """
        return self._features.tolist()

    def set(self, features):
        """Sets the feature array to the given array.

        Parameters
        ----------
        features : array_like, shape (`nfeatures`,)
            The new feature values.

        """
        features = np.asarray(features, dtype=type(self).dtype)
        if features.ndim != 1 or features.shape[0] != type(self).nfeatures:
            raise ValueError("Array 'features' must be a vector of length %d." % type(self).nfeatures)

        self._features = np.asarray(features)

    def discretize(self):
        """Discretizes the state.
Discretize the state using the information from the minimum and maximum values for each feature and the number of states attributed to each feature. """ if not self.discretized: return nfeatures = type(self).nfeatures min_features = type(self).min_features max_features = type(self).max_features states_per_dim = type(self).states_per_dim if min_features is None or min_features.shape[0] != nfeatures: raise ValueError("Attribute 'min_features' must be a vectors of length %d." % nfeatures) if max_features is None or max_features.shape[0] != nfeatures: raise ValueError("Attribute 'max_features' must be a vectors of length %d." % nfeatures) if states_per_dim is None or states_per_dim.shape[0] != nfeatures: raise ValueError("Attribute 'states_per_dim' must be a vectors of length %d." % nfeatures) ds = [] for i, feat in enumerate(self): factor = math.ceil( (max_features[i] - min_features[i]) / states_per_dim[i]) if feat > 0: bin_num = int((feat + factor / 2) / factor) else: bin_num = int((feat - factor / 2) / factor) ds.append(bin_num * factor) self._features = np.asarray(ds) def encode(self): # noinspection PyUnresolvedReferences,PyUnusedLocal """Encodes the state into a human readable representation. Returns ------- ndarray : The encoded state. Notes ----- Optionally this method can be overwritten at runtime. Examples -------- >>> def my_encode(self) ... pass ... >>> MDPPrimitive.encode = my_encode """ return self._features @classmethod def decode(cls, _repr): # noinspection PyUnresolvedReferences,PyUnusedLocal """Decodes the state into its original representation. Parameters ---------- _repr : tuple The readable representation of the primitive. Returns ------- State : The decoded state. Notes ----- Optionally this method can be overwritten at runtime. Examples -------- >>> def my_decode(cls, _repr) ... pass ... >>> MDPPrimitive.decode = classmethod(my_decode) """ return cls(_repr) @staticmethod def key_to_index(key): # noinspection PyUnresolvedReferences,PyUnusedLocal """Maps internal name to group index. Maps the internal name of a feature to the index of the corresponding feature grouping. For example for a feature vector consisting of the x-y-z position of the left and the right arm, the features for the left and the right arm can be extracted separately as a group, effectively splitting the feature vector into two vectors with x, y, and z at the positions specified by the the mapping of this function. Parameters ---------- key : str The key into the mapping Returns ------- int : The index in the feature array. Raises ------ NotImplementedError If the child class does not implement this function. Notes ----- Optionally this method can be overwritten at runtime. Examples -------- >>> def my_key_to_index(key) ... return { ... "x": 0, ... "y": 1, ... "z": 2 ... }[key] ... >>> State.description = {'LArm': {'x': 0, 'y': 1, 'z': 2} ... 'RArm': {'x': 3, 'y': 4, 'z': 5}} >>> State.key_to_index = staticmethod(my_key_to_index) This specifies the mapping in both direction. >>> state = [0.1, 0.4, 0.3. 4.6. 2.5. 0.9] >>> >>> mapping = State.description['LArm'] >>> >>> larm = np.zeros[len(mapping.keys())] >>> for key, axis in mapping.iteritems(): ... larm[State.key_to_index(key)] = state[axis] ... >>> print larm [0.1, 0.4, 0.3] This extracts the features for the left arm from the `state` vector. """ raise NotImplementedError # noinspection PyAbstractClass,PyUnresolvedReferences class State(MDPPrimitive): """Representation of the state. States are represented by an array of features. 
Parameters ---------- features : array_like, shape (`nfeatures`,) List of features, where `nfeatures` is the number of features identifying the primitive. name : str, optional The name of the primitive. Default is ''. Attributes ---------- name dtype : {DTYPE_FLOAT, DTYPE_INT, DTYPE_OBJECT} The type of the features. nfeatures : int The number of features. discretized : bool Flag indicating whether the features are discretized or not. min_features : list The minimum value for each feature. max_features : list The minimum value for each feature. states_per_dim : list The number of states per dimension. description : dict A description of the features. Notes ----- Use the `description` to encode action information. The information should contain the list of all available feature combinations, the name of each feature. :Examples: A description of an action with three possible discrete actions: :: { "out": {"value": [-0.004]}, "in": {"value": [0.004]}, "kick": {"value": [-1.0]} } A description of an action with one possible continuous action with name `move`, a value of `*` allows to find the action for every feature array. Additional information encodes the feature name together with its index into the feature array are given for each higher level element of feature array: :: { "move": { "value": "*", "descr": { "LArm": {"dx": 0, "dy": 1, "dz": 2}, "RArm": {"dx": 3, "dy": 4, "dz": 5}, "LLeg": {"dx": 6, "dy": 7, "dz": 8}, "RLeg": {"dx": 9, "dy": 10, "dz": 11}, "Torso": {"dx": 12, "dy": 13, "dz": 14} } } } Similarly, a continuous state can be encoded as follows, which identifies the name of each feature together with its index into the feature array: :: { "LArm": {"x": 0, "y": 1, "z": 2}, "RArm": {"x": 3, "y": 4, "z": 5}, "LLeg": {"x": 6, "y": 7, "z": 8}, "RLeg": {"x": 9, "y": 10, "z": 11}, "Torso": {"x": 12, "y": 13, "z": 14} } A discrete state can be encoded by identifying the position of each feature: :: { "image x-position": 0, "displacement (mm)": 1 } Alternatively, the feature can be identified by a list of features, giving he positional description: :: ["image x-position", "displacement (mm)"] Rather then setting the attributes directly, use the methods :meth:`set_nfeatures`, :meth:`set_dtype`, :meth:`set_description`, :meth:`set_discretized`, :meth:`set_minmax_features`, and :meth:`set_states_per_dim` in order to enforce type checking. Examples -------- >>> State.description = {'LArm': {'x': 0, 'y': 1, 'z': 2} ... 'RArm': {'x': 3, 'y': 4, 'z': 5}} This description identifies the features to be the x-y-z-position of the left and the right arm. The position into the feature array is given by the integer numbers. >>> def my_key_to_index(key) ... return { ... "x": 0, ... "y": 1, ... "z": 2 ... }[key] ... >>> State.key_to_index = staticmethod(my_key_to_index) This defines a mapping for each key. >>> state = [0.1, 0.4, 0.3. 4.6. 2.5. 0.9] >>> >>> mapping = State.description['LArm'] >>> >>> larm = np.zeros[len(mapping.keys())] >>> for key, axis in mapping.iteritems(): ... larm[State.key_to_index(key)] = state[axis] ... >>> print larm [0.1, 0.4, 0.3] This extracts the features for the left arm from the `state` vector. >>> s1 = State([0.1, 0.4, 0.2]) >>> s2 = State([0.5, 0.3, 0.5]) >>> print s1 - s2 [-0.4, 0.1, -0.3] Subtract states from each other. >>> print s1 * s2 [0.05, 0.12, 0.1] Multiplies two states with each other. >>> s1 *= s2 >>> print s1 [0.05, 0.12, 0.1] Multiplies two states in place. """ initial_states = None """List of initial states. 
:type: str | list""" terminal_states = None """List of terminal states. :type: str | list""" def __init__(self, features, name=None): super(State, self).__init__(features, name) def is_initial(self): """Checks if the state is an initial state. Returns ------- bool : Whether the state is an initial state or not. """ if State.initial_states is None: return False if isinstance(State.initial_states, list): return self.name in State.initial_states return self.name == self.initial_states def is_terminal(self): """Checks if the state is a terminal state. Returns ------- bool : Whether the state is a terminal state or not. """ if State.terminal_states is None: return False if isinstance(State.terminal_states, list): return self.name in State.terminal_states return self.name == self.terminal_states # noinspection PyMethodMayBeStatic def is_valid(self): # noinspection PyUnresolvedReferences,PyUnusedLocal """Check if this state is a valid state. Returns ------- bool : Whether the state is valid or not. Notes ----- Optionally this method can be overwritten at runtime. Examples -------- >>> def my_is_valid(self) ... pass ... >>> MDPPrimitive.is_valid = my_is_valid """ return True # noinspection PyAbstractClass,PyUnresolvedReferences class Action(MDPPrimitive): """Representation of an action. Actions are represented by an array of features. Parameters ---------- features : array_like, shape (`nfeatures`,) List of features, where `nfeatures` is the number of features identifying the primitive. name : str, optional The name of the primitive. Default is ''. Attributes ---------- name dtype : {DTYPE_FLOAT, DTYPE_INT, DTYPE_OBJECT} The type of the features. nfeatures : int The number of features. discretized : bool Flag indicating whether the features are discretized or not. min_features : list The minimum value for each feature. max_features : list The minimum value for each feature. states_per_dim : list The number of states per dimension. description : dict A description of the features. Notes ----- Use the `description` to encode action information. The information should contain the list of all available feature combinations, the name of each feature. :Examples: A description of an action with three possible discrete actions: :: { "out": {"value": [-0.004]}, "in": {"value": [0.004]}, "kick": {"value": [-1.0]} } A description of an action with one possible continuous action with name `move`, a value of `*` allows to find the action for every feature array. 
    Additional information encoding the feature name together with its index
    into the feature array is given for each higher level element of the
    feature array:

    ::

        {
            "move": {
                "value": "*",
                "descr": {
                    "LArm": {"dx": 0, "dy": 1, "dz": 2},
                    "RArm": {"dx": 3, "dy": 4, "dz": 5},
                    "LLeg": {"dx": 6, "dy": 7, "dz": 8},
                    "RLeg": {"dx": 9, "dy": 10, "dz": 11},
                    "Torso": {"dx": 12, "dy": 13, "dz": 14}
                }
            }
        }

    Similarly, a continuous state can be encoded as follows, which identifies
    the name of each feature together with its index into the feature array:

    ::

        {
            "LArm": {"x": 0, "y": 1, "z": 2},
            "RArm": {"x": 3, "y": 4, "z": 5},
            "LLeg": {"x": 6, "y": 7, "z": 8},
            "RLeg": {"x": 9, "y": 10, "z": 11},
            "Torso": {"x": 12, "y": 13, "z": 14}
        }

    A discrete state can be encoded by identifying the position of each
    feature:

    ::

        {
            "image x-position": 0,
            "displacement (mm)": 1
        }

    Alternatively, the feature can be identified by a list of features,
    giving the positional description:

    ::

        ["image x-position", "displacement (mm)"]

    Rather than setting the attributes directly, use the methods
    :meth:`set_nfeatures`, :meth:`set_dtype`, :meth:`set_description`,
    :meth:`set_discretized`, :meth:`set_minmax_features`, and
    :meth:`set_states_per_dim` in order to enforce type checking.

    Examples
    --------
    >>> Action.set_description({'LArm': {'dx': 0, 'dy': 1, 'dz': 2},
    ...                         'RArm': {'dx': 3, 'dy': 4, 'dz': 5}})

    This description identifies the features to be the delta x-y-z-position
    of the left and the right arm. The position into the feature array is
    given by the integer numbers.

    >>> def my_key_to_index(key):
    ...     return {
    ...         "dx": 0,
    ...         "dy": 1,
    ...         "dz": 2
    ...     }[key]
    ...
    >>> Action.key_to_index = staticmethod(my_key_to_index)

    This defines a mapping for each key.

    >>> action = [0.1, 0.4, 0.3, 4.6, 2.5, 0.9]
    >>>
    >>> mapping = Action.description['LArm']
    >>>
    >>> larm = np.zeros(len(mapping.keys()))
    >>> for key, axis in mapping.iteritems():
    ...     larm[Action.key_to_index(key)] = action[axis]
    ...
    >>> print(larm)
    [0.1, 0.4, 0.3]

    This extracts the features for the left arm from the `action` vector.

    >>> a1 = Action([0.1, 0.4, 0.2])
    >>> a2 = Action([0.5, 0.3, 0.5])
    >>> print(a1 - a2)
    [-0.4, 0.1, -0.3]

    Subtract actions from each other.

    >>> print(a1 * a2)
    [0.05, 0.12, 0.1]

    Multiplies two actions with each other.

    >>> a1 *= a2
    >>> print(a1)
    [0.05, 0.12, 0.1]

    Multiplies two actions in place.

    """
    def __init__(self, features, name=None):
        super(Action, self).__init__(features, name)

        self._name = name if name is not None else Action.get_name(self._features)

    @classmethod
    def get_name(cls, features):
        """Retrieves the name of the action.

        Retrieve the name of the action using the action's description. In
        the case that all features are zero the action is considered a
        `no-op` action.

        Parameters
        ----------
        features : ndarray
            A feature array.

        Returns
        -------
        str :
            The name of the action.

        """
        features = np.asarray(features, dtype=cls.dtype)

        if cls.description is not None:
            for e, config in cls.description.iteritems():
                if config["value"] == "*":
                    return e

                value = np.asarray(config["value"], dtype=cls.dtype)
                if value.shape != features.shape:
                    # The original code built this ValueError without raising
                    # it; raise so dimension errors do not pass silently.
                    raise ValueError("Dimension mismatch: array 'config['value']' is a vector of length %d,"
                                     " but 'features' is a vector of length %d." % (value.shape[0],
                                                                                    features.shape[0]))
                # Element-wise `==` on arrays is ambiguous in a boolean
                # context; compare with np.array_equal instead.
                if np.array_equal(value, features):
                    return e

        if not features.any():
            return "no-op"
        return ""

    @classmethod
    def get_noop_action(cls):
        """Creates a `no-op` action.

        A `no-op` action does not have any effect.

        Returns
        -------
        Action :
            A `no-op` action.
""" if not isinstance(cls.nfeatures, int): raise ValueError("Attribute 'nfeatures' must be of <type 'int'>, got %s" % str(type(cls.nfeatures))) return cls(np.zeros(cls.nfeatures), "no-op")
Thank you for the email and I apologize for any inconvenience or delay. I have reviewed your account information and can confirm that your refund has been successfully issued. Typically the processing time is no more than 3-5 business days (weekends and holidays don't count!), but if you have not received your refund after 5 business days I would recommend contacting your bank directly. Our company has its head office in Plymouth, Minnesota, as well as an international office in Helsinki, Finland, and we encourage you to reach out to us directly at any time. Please feel free to reach back out and let me know if there's anything else you need. I'm always happy to help.
If I bid and don't win do I not pay
how do I deactivate deal dash
How do I deactivate dealdash
I WOULD LIKE TO BID ON A DESK TOP COMPUTOR HOW DOES THAT WORK
I WOULD LIKE TO BID ON A DESK TOP COMPUTOR HOW DO IGO ABOUT THAT
order A19955715 Please advise status when ship. Thank you
How can I write a review and get free bids
Under handed play in progress... Owner of DealDash must have learned how to cheat at poker from under the table?
What happens if I bid more credits than I have
how do I know what my balance is and how much I have spent?
# coding: utf-8

from flask import Flask
# The flask.ext.* import namespace is deprecated; import the extension
# package directly.
from flask_cors import CORS
import requests
import json

app = Flask(__name__)
app.config['SPARQL_ENDPOINT'] = 'http://localhost:18890/sparql'
cors = CORS(app)

QUERY = """
CONSTRUCT {{
  <http://d-nb.info/gnd/{gnd}> <http://xmlns.com/foaf/0.1/depiction> ?c .
  <http://d-nb.info/gnd/{gnd}> <http://example.org/kg#bornIn> ?e .
  <http://d-nb.info/gnd/{gnd}> <http://example.org/kg#diedIn> ?g .
  <http://d-nb.info/gnd/{gnd}> <http://example.org/kg#born> ?h .
  <http://d-nb.info/gnd/{gnd}> <http://example.org/kg#died> ?i .
  <http://example.org/kg#diedIn> <http://www.w3.org/2000/01/rdf-schema#label> "gestorben in"@de .
  <http://example.org/kg#diedIn> <http://www.w3.org/2000/01/rdf-schema#label> "died in"@en .
  <http://d-nb.info/gnd/{gnd}> <http://www.w3.org/2000/01/rdf-schema#label> ?j .
  <http://d-nb.info/gnd/{gnd}> <http://example.org/kg#cityCluster> ?k .
  ?k <http://www.w3.org/2000/01/rdf-schema#label> ?klabel .
  ?k <http://xmlns.com/foaf/0.1/depiction> ?k_dbp .
  <http://d-nb.info/gnd/{gnd}> <http://example.org/kg#profession> ?l .
  ?l <http://www.w3.org/2000/01/rdf-schema#label> ?l_label .
  <http://d-nb.info/gnd/{gnd}> <http://www.w3.org/2000/01/rdf-schema#abstract> ?comment .
}}
WHERE {{
  GRAPH <http://d-nb.info/gnd/> {{
    OPTIONAL {{
      <http://d-nb.info/gnd/{gnd}> <http://d-nb.info/standards/elementset/gnd#placeOfBirth> ?d .
      ?d <http://d-nb.info/standards/elementset/gnd#preferredNameForThePlaceOrGeographicName> ?e .
    }}
    OPTIONAL {{
      <http://d-nb.info/gnd/{gnd}> <http://d-nb.info/standards/elementset/gnd#placeOfDeath> ?f .
      ?f <http://d-nb.info/standards/elementset/gnd#preferredNameForThePlaceOrGeographicName> ?g .
    }}
    OPTIONAL {{
      <http://d-nb.info/gnd/{gnd}> <http://d-nb.info/standards/elementset/gnd#dateOfBirth> ?h .
    }}
    OPTIONAL {{
      <http://d-nb.info/gnd/{gnd}> <http://d-nb.info/standards/elementset/gnd#dateOfDeath> ?i .
    }}
    OPTIONAL {{
      <http://d-nb.info/gnd/{gnd}> <http://d-nb.info/standards/elementset/gnd#preferredNameForThePerson> ?j .
    }}
    OPTIONAL {{
      <http://d-nb.info/gnd/{gnd}> <http://d-nb.info/standards/elementset/gnd#professionOrOccupation> ?l .
      OPTIONAL {{
        ?l <http://d-nb.info/standards/elementset/gnd#preferredNameForTheSubjectHeading> ?l_label .
      }}
    }}
    {{
      SELECT ?k ?klabel ?kpic ?k_dbp
      WHERE {{
        OPTIONAL {{
          <http://d-nb.info/gnd/{gnd}> <http://d-nb.info/standards/elementset/gnd#placeOfBirth> ?d .
          <http://d-nb.info/gnd/{gnd}> <http://d-nb.info/standards/elementset/gnd#professionOrOccupation> ?l .
          ?k <http://d-nb.info/standards/elementset/gnd#placeOfBirth> ?d .
          ?k <http://d-nb.info/standards/elementset/gnd#preferredNameForThePerson> ?klabel .
          ?k <http://d-nb.info/standards/elementset/gnd#professionOrOccupation> ?l .
          # Getting the picture
          # This will blow up the query too much
          # OPTIONAL {{
          #  GRAPH <http://d-nb.info/gnd/> {{
          #   ?k <http://www.w3.org/2002/07/owl#sameAs> ?k_dbp .
          #   FILTER(regex(?k_dbp, 'dbpedia'))
          #  }}
          #
          #  GRAPH <http://dbpedia.org/resource/> {{
          #   ?k_dbp <http://xmlns.com/foaf/0.1/depiction> ?kpic .
          #  }}
          # }}
        }}
      }}
      LIMIT 6
    }}
  }}
  OPTIONAL {{
    GRAPH <http://d-nb.info/gnd/> {{
      <http://d-nb.info/gnd/{gnd}> <http://www.w3.org/2002/07/owl#sameAs> ?b .
    }}
    GRAPH <http://dbpedia.org/resource/> {{
      ?b <http://xmlns.com/foaf/0.1/depiction> ?c .
      OPTIONAL {{
        ?b_german <http://www.w3.org/2002/07/owl#sameAs> ?b .
        ?b_german <http://www.w3.org/2000/01/rdf-schema#comment> ?comment .
      }}
    }}
  }}
}}
"""


@app.route("/")
def hello():
    return "Hello World!"


@app.route("/gnd/<gnd>")
def q(gnd):
    r = requests.get(app.config['SPARQL_ENDPOINT'],
                     headers={'accept': 'application/json'},
                     params={'query': QUERY.format(gnd=gnd)})
    #j = json.loads(r.text)
    #print("%s" % j)
    #return "<pre>%s</pre>" % r.text
    return r.text


if __name__ == "__main__":
    app.run(host="0.0.0.0", debug=True)
Dynamic interiors within a Daniel Libeskind designed office complex.

William Fry, one of Ireland’s leading law firms, relocated to their new six-storey headquarters building on Grand Canal Square, adjacent to the Bord Gáis Energy Theatre. The project creates a dynamic interior within a Daniel Libeskind designed office complex, providing a tasteful and professional working environment. Accommodation includes cellular offices across a number of floors, with conference, training and corporate facilities located over a showpiece level. The design incorporates roof gardens, a gallery space, client areas, a restaurant, a gym and other staff welfare facilities.
# Subprocess Task

from . import task
import subprocess
from concurrent.futures import TimeoutError
from ..tasks import Mask
import time
from skitai import was


class Task (task.Task):
    def __init__ (self, cmd, timeout):
        self._timeout = timeout
        self._name = cmd
        self._started = time.time ()
        self._was = None
        self._fulfilled = None
        self._mask = None
        self.proc = subprocess.Popen (cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell = True)

    @property
    def lines (self):
        for line in iter (self.proc.stdout.readline, b''):
            yield line

    def _polling (self):
        mask = self._create_mask (self._timeout)
        self._late_respond (mask)

    def then (self, func):
        self._fulfilled = func
        self._was = self._get_was ()
        was.Thread (self._polling)

    def kill (self):
        self.proc.kill ()

    def terminate (self):
        self.proc.terminate ()

    def _create_mask (self, timeout):
        self._timeout = timeout
        if self._mask:
            return self._mask

        data, expt = None, None
        try:
            data, err = self.proc.communicate (timeout = timeout)
        except subprocess.TimeoutExpired:
            # Instantiate the exception so the mask carries an exception
            # object, matching the SystemError instance created below.
            expt = TimeoutError ()
            self.proc.terminate ()
            self.proc.wait ()
        else:
            if self.proc.returncode:
                if isinstance (err, bytes):
                    err = err.decode ()
                expt = SystemError ('code:{} {}'.format (self.proc.returncode, err))
        self._mask = Mask (data, expt)
        return self._mask
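The heart of _create_mask is the standard-library communicate-with-timeout idiom; a standalone sketch of just that pattern (no skitai required), with an arbitrary one-second timeout:

import subprocess

proc = subprocess.Popen ("sleep 10", stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE, shell=True)
try:
    data, err = proc.communicate (timeout=1)
except subprocess.TimeoutExpired:
    proc.terminate ()   # ask the child to exit
    proc.wait ()        # reap it so no zombie process is left behind
    print ("timed out")
else:
    print (proc.returncode, data)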
The current President? No, he has better things to do… like playing golf and thinking up more ways to stimulate the economy. Maybe he should just play more golf! Thanks for sharing this. I didn’t catch this in the liberal media (of course).
#!/usr/bin/env python3

# This was created in response to this program by the ModuloProject which
# aims to make a command line app launcher. The method they were using to
# get data from launchers seemed sub-optimal so I reworked it to be a bit
# more efficient. Their original version is found at: http://git.io/vWJcm

from os import listdir, path
from subprocess import call


def clear():
    call(["clear"])


home = path.expanduser("~")
locations = [
    "/usr/share/applications/",
    "/usr/share/applications/kde4/",
    "/usr/local/share/applications/",
    "/usr/local/share/applications/kde4/",
    home + "/.local/share/applications/",
    home + "/.local/share/applications/kde4/",
]

name_list = []
exec_list = []
file_list = []

for location in locations:
    if path.isdir(location):
        for filename in listdir(location):
            if ".desktop" in filename:
                with open(location + filename, "r") as file:
                    name, command = None, None
                    for line in file:
                        if ("Name=" in line) and (name is None):
                            name = line.split("=")[1].strip()
                        elif ("Exec=" in line) and (command is None):
                            command = line.split("=")[1].strip()
                        else:
                            # Not interested in other lines
                            pass
                if (name is not None) and (command is not None):
                    # Only use complete launchers, and append to all three
                    # lists together so they stay index-aligned (appending
                    # eagerly let a Name without an Exec skew the zip below)
                    name_list.append(name)
                    exec_list.append(command)
                    file_list.append(location + filename)
                else:
                    # Not interested in non-launchers
                    pass
    else:
        # Not all locations exist
        pass

# Makes mega-list and then sorts list alphabetically by app name
data = sorted(zip(name_list, exec_list, file_list), key=lambda x: x[0].lower())

clear()
for line in data:
    print("Name:", line[0])
    print("Exec:", line[1], "\n")

choice, i = input("What do you want to run? ").lower().strip(), 0
if choice not in [item.lower() for item in name_list]:
    # Case of zero options
    print("'" + choice + "' is not an option!")
    exit(0)

options, toprint = [], []
for line in data:
    if line[0].lower() == choice:
        toprint.append("Code: " + str(i))
        toprint.append("Name: " + line[0])
        toprint.append("Exec: " + line[1] + "\n")
        options.append(line[1])
        i += 1

if i != 1:
    # Case of multiple options
    clear()
    for line in toprint:
        print(line)
    toexec = input("Which code? ")
    try:
        # Makes sure chosen option exists; the last valid code is
        # len(options) - 1, so the upper bound must be exclusive
        if 0 <= int(toexec) < len(options):
            cmd = options[int(toexec)].split(" ")
        else:
            raise ValueError
    except ValueError:
        print("'" + toexec + "' is not an option!")
        exit(0)
else:
    # Case of exactly one option
    cmd = options[0].split(" ")

clear()
try:
    call(cmd)
except FileNotFoundError:
    print("ERROR: command not found!")
    exit(1)
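One caveat with running Exec= values directly: desktop entries may contain field codes such as %f or %U, which are placeholders rather than real arguments. A small sketch of stripping them before the call (the Exec value shown is hypothetical):

cmd = "firefox %u".split(" ")                         # e.g. parsed from Exec=
cmd = [arg for arg in cmd if not arg.startswith("%")]
call(cmd)                                             # runs ["firefox"]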
Bad Economy Bad For Small Business – Or Is It? Some small-business owners are closing their doors due to the bad economy and getting another job. But others are making the most of the economic turmoil.
from cno.core.base import CNORBase, CNOBase
from cno import cnodata
from easydev import TempFile

# To test some of the base functions, need to use something else such as cnorbool


def test_cnobase():
    c = CNOBase(cnodata('PKN-ToyMMB.sif'), cnodata("MD-ToyMMB.csv"))
    c.pknmodel
    c.midas
    c.data
    c.preprocessing()
    c.plot_pknmodel()
    assert c._reac_cnor2cno(['A+B=C']) == ['A^B=C']

    c.plot_midas()
    c.plot_midas(xkcd=True)

    c.config
    fh = TempFile()
    c.save_config_file(fh.name)
    c = CNOBase(cnodata('PKN-ToyMMB.sif'), cnodata("MD-ToyMMB.csv"),
                config=fh.name)

    # A bare try/except wrapped around an `assert False` swallows the very
    # AssertionError it is supposed to raise, so the test always passed.
    # Use try/else so only a failure inside the call counts as expected.
    try:
        c.create_report()
    except Exception:
        pass
    else:
        raise AssertionError("create_report() was expected to fail here")

    try:
        c.create_report_images()
    except Exception:
        pass
    else:
        raise AssertionError("create_report_images() was expected to fail here")


from cno.boolean.cnorbool import CNORbool


def test_cnobase_with_cnorbool():
    c = CNORbool(cnodata("PKN-ToyMMB.sif"),
                 cnodata("MD-ToyMMB.csv"), verbose=True)
    c.verboseR = True
    c.verboseR = False
    c.verbose = False
    c.optimise(maxgens=5, popsize=10)
    c.plot_fitness()
    c.plot_model()
    c.plot_optimised_model()

    c.plot_mapback_model()

    c._create_report_header()
    c.onweb()
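If the project uses pytest (an assumption; the tests above are plain unittest-style functions), the expected-failure checks can be written more idiomatically with pytest.raises:

import pytest

def test_create_report_fails_without_analysis():
    c = CNOBase(cnodata('PKN-ToyMMB.sif'), cnodata("MD-ToyMMB.csv"))
    with pytest.raises(Exception):
        c.create_report()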
Well, the free ads are over, but the cost is still quite low. The two ad sizes - medium and large - both come with guest post spots, and the ads run for 30 days. Have a gander at the Sponsor Page to apply.

On another note, we finally have a new mattress. Our old one came with the bed, and wasn't great quality because we couldn't afford a better one. Springs have been sticking in my ribs for a year and a half now, and it's gotten worse and worse. So finally having a mattress that won't dictate where I can and can't lie is wonderful. I might stop accidentally (yes...accidentally...) stealing Seeg's side in the night now! It's a bit higher than the old one, so for me with my little legs it's really very noticeable. I stood on it to get the duvet cover straightened and almost fell off of it rather than climb.

I've never been so productive, too! When we took the mattress off the amount of dust covering all the junk beneath it was ridiculous. I'm also trying to move beyond my hoarding tendencies, and get rid of stuff I have no use for anymore, and today was the best time to start.

I might start trying to flog off my hardly touched make up on you all. I've never worn much to begin with, I'm an eyeliner girl - and only eyeliner. I don't bother with foundation, lipgloss, mascara or any of that. My sister is my complete opposite, so while I watched her apply her make up the other day, sure I might have been judging to begin with, but she's very chavvy, so she was properly shovelling it on anyway. It did not look good, and I tried hard to keep that opinion from my face. I still buy the occasional eyeshadow, the last was from etsy, but otherwise I never really touch the stuff. Nail polish, too. I have loads of it, but never wear it because I've never liked how it looks on short nails, but once my nails get long enough, after applying the polish, my nails start to splinter and flake. So I leave them be.

So yeah, if anyone is interested in not quite second hand make up, keep your eyes open here. I need to get rid of it, it's cluttering up my shelves. When I'm done, all I'll have left is a pencil and liquid eyeliner. And a mirror.

Oh my gosh I need a new mattress SO BAD. Heck, I want a new, bigger bed so I have room for me, the boyfriend AND our dogs.
#!/usr/bin/env python import XenAPI import provision import sanitychecklib #log in session=sanitychecklib.getsession() sx=session.xenapi #find the template for Debian Etch vms = sx.VM.get_all() print "Server", sanitychecklib.server, "has ", len(vms), "VMs", etch_template_list = [x for x in vms if (('Etch' in sx.VM.get_name_label(x)) and (sx.VM.get_is_a_template(x)))] print "including", len(etch_template_list), "template for 'Etch'" etch_template=etch_template_list[0] print "We pick the first template: " print "name: ", sx.VM.get_name_label(etch_template) print "description:", sx.VM.get_name_description(etch_template) #Make a copy of the template print "Cloning..." clone=sx.VM.clone(etch_template, sanitychecklib.test_vm_name) #find out where to put the new machine's disks by getting the first pool (I don't think there can be more than one) #and using its default storage repository pool_list=sx.pool.get_all() if len(pool_list)==1: print "There's only one pool" else: print "There are", len(pool_list), "pools" print "We pick the first one:" first_pool=pool_list[0] print "name:", sx.pool.get_name_label(first_pool) print "description: ", sx.pool.get_name_description(first_pool) default_SR=sx.pool.get_default_SR(first_pool) print "The default SR is: " print "Name:", sx.SR.get_name_label(default_SR) print "Description:", sx.SR.get_name_description(default_SR) #set the new copy to have its disks in the default SR #this is a debian template specific hack which allows us to create Debian VMs easily spec=provision.getProvisionSpec(session, clone) spec.setSR(sx.SR.get_uuid(default_SR)) provision.setProvisionSpec(session, clone, spec) #now 'provision' it, which causes the disks to actually be created. print "provisioning...." sx.VM.provision(clone) print "provisioned" #now find out which network to attach the new machine to #by finding out what the pool master host is connected to. pool_master=sx.pool.get_master(first_pool) master_PIFs=sx.host.get_PIFs(pool_master) primary_PIF=master_PIFs[0] master_network=sx.PIF.get_network(primary_PIF) #attach new VM to default SR and master network print "Creating VIF..." new_vif = { 'device': '0', 'network': master_network, 'VM': clone, 'MAC': "", 'MTU': "1500", "qos_algorithm_type": "", "qos_algorithm_params": {}, "other_config": {} } sx.VIF.create(new_vif) #Another Debian template specific hack. If 'noninteractive' is passed on the kernel command line, #the new machine will boot without asking for its root and VNC passwords to be set, and just use 'xensource'. print "Adding noninteractive to the kernel commandline" print "This is a hack in the template to enable the root account to be created with password 'xensource'" sx.VM.set_PV_args(clone, "noninteractive") #Should be all set now. Fire up the machine. print "booting..." sx.VM.start(clone, False, True) #log out print "logging out" session.logout()
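An optional follow-up: rather than assuming the VM boots instantly, poll its power state via the standard XenAPI call VM.get_power_state; the 60-second budget here is an arbitrary choice:

import time

for _ in range(60):
    if sx.VM.get_power_state(clone) == "Running":
        print "VM is up"
        break
    time.sleep(1)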
Accounts Assistant Vacancy in Dubai

Accounts Assistant
General
Location: Abu Dhabi
Occupancy: full time
To join: immediately

Description
Experienced Accounts Assistant with VAT knowledge required urgently. Any nationality. On visit or transferable residence visa.

Requirements
Experience: 3 years minimum
Experience in UAE is an advantage
Pakistani preferred

Provisions
Salary: AED 5,500
Transport provided
Medical insurance provided

Apply directly (recommended): Interested candidates can send their CV.
import numpy as np from oceansar import constants as const class RCSKodis(): """ Specular model (R.D. Kodis '66) Physical optics model as described in R.D. Kodis (1966) paper 'A Note on the Theory of Scattering from an Irregular Surface'. E.M. solved using Stationary Phase Method. .. note:: G. Valenzuela suggested that reflection coefficient (R) may be replaced by effective refl. coef.! .. note:: OASIS uses only range dependent incidence angle, so it is given on class init. :param inc: Incidence angle matrix :param k0: Radar wave number :param dx: Range resolution :param dy: Azimuth resolution """ def __init__(self, inc, k0, dx, dy): self.dx = dx self.dy = dy self.k0 = k0 self.sin_inc = np.sin(inc) self.cos_inc = np.cos(inc) self.tan_inc = np.tan(inc) self.R = (const.epsilon_sw - np.sqrt(const.epsilon_sw))/(const.epsilon_sw + np.sqrt(const.epsilon_sw)) def field(self, az_angle, sr, diffx, diffy, diffxx, diffyy, diffxy): # Avoid repeating calculations cos_az = np.cos(az_angle) sin_az = np.sin(az_angle) J = diffxx*diffyy - diffxy**2 J = np.where(J == 0., np.nan, J) J_abs = np.abs(J) delta_x = (1./J_abs)*(diffxy*(diffy - self.tan_inc*sin_az) - diffyy*(diffx - self.tan_inc*cos_az)) delta_y = (1./J_abs)*(diffxy*(diffx - self.tan_inc*cos_az) - diffxx*(diffy - self.tan_inc*sin_az)) epsilon = np.where(J > 0., np.sign(diffxx), 1j) # New slant range due to deltas hdx = self.dx/2 hdy = self.dy/2 E = np.zeros(delta_x.shape, dtype=np.complex) sps = np.where(((-hdx < delta_x) & (delta_x < hdx)) & ((-hdy < delta_y) & (delta_y < hdy))) if sps[0].size > 0: delta_z = delta_x[sps] * diffx[sps] + delta_y[sps] * diffy[sps] sr_p2 = (sr[sps] + (self.sin_inc[0,sps[1]] * cos_az[sps] * delta_x[sps] + self.sin_inc[0,sps[1]] * sin_az[sps] * delta_y[sps] - self.cos_inc[0,sps[1]] * delta_z)) E[sps] = ((0.5*self.R*epsilon[sps]) * ((diffx[sps]**2. + diffy[sps]**2. + 1.)) * np.exp(-1j*2.*self.k0*sr_p2) / np.sqrt(J_abs[sps])) # field = np.where(((-hdx < delta_x) & (delta_x < hdx)) & ((-hdy < delta_y) & (delta_y < hdy)), # (0.5*self.R*epsilon)*((diffx**2. + diffy**2. + 1.)) * np.exp(-1j*2.*self.k0*np.sqrt(sr_p2)) / np.sqrt(J_abs), # 0.) return E def candidates(self, az_angle, diffx, diffy, diffxx, diffyy, diffxy): # Avoid repeating calculations cos_az = np.cos(az_angle) sin_az = np.sin(az_angle) J = diffxx*diffyy - diffxy**2. J = np.where(J == 0., np.nan, np.abs(J)) delta_x = (1./J)*(diffxy*(diffy - self.sin_inc*sin_az) - diffyy*(diffx - self.sin_inc*cos_az)) delta_y = (1./J)*(diffxy*(diffx - self.sin_inc*cos_az) - diffxx*(diffy - self.sin_inc*sin_az)) candidates = np.where(((0. < delta_x) & (delta_x < self.dx)) & ((0. < delta_y) & (delta_y < self.dy)), 1., 0.) return candidates
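A shape-level sketch of driving the model with synthetic inputs (arbitrary numbers, and an older NumPy where np.complex is still valid); it only demonstrates the expected array interface, not a physically meaningful scene:

import numpy as np

inc = np.deg2rad(np.full((4, 4), 35.0))          # incidence-angle grid
model = RCSKodis(inc, k0=2 * np.pi / 0.056, dx=1.0, dy=1.0)

zeros = np.zeros((4, 4))
curv = np.full((4, 4), 0.01)                     # gentle surface curvature
E = model.field(zeros, np.full((4, 4), 850e3),   # azimuth angle, slant range
                zeros, zeros,                    # surface slopes
                curv, curv, zeros)               # second derivatives
print(E.shape, np.iscomplexobj(E))               # (4, 4) True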
Inspiration for this little bit of wisdom: another set-to about the God of Judaism and the God of Islam — or depictions, expressions, instructions, etc. seemingly attached to the construction or perception (in mind, of course) of each. And God must have put us here to prove how right we are about God. Something like all of that plus the bloody historical baggage that has been dragged through time with that kind of thinking. Ethnolinguistic cultures on earth: about 7,000 (fewer, actually). There are many ways of looking up at the stars and experiencing or interpreting the divine. In the Torah, God hears Ishmael’s cries too. The more true challenge of the present lies in handling the habits attending medieval politics and worldviews that better account for turning something we cannot know — God, as we think of God, is greater than our observational capabilities — into something we think we know. Judaism represents the Tribal Way of the Hebrews, an old People with a calendar to prove it and a distinct trail through time planted on or in the earth in built space or artifacts. Rather than mosey on to other civilizational uptake, adaptation, and competition, it might prove healthier to visit the present, take a deep breath, and have both a broad and long view backward, the better perhaps for considering and taking the next step forward. Having given up the burning of witches and largely ejected “contra-lateral amputation” as inhumane (although there are some barbaric, malignant, primitive, and sadistic holdouts), we might do better than eternally trying to erase one another. Call it a plea for peacefulness momentarily enshrined in a blog post . . . also a plea to perhaps stand together for a broad and great gaze backward from this extraordinary plateau in human communicating and social interaction — and then: think fresh; think forward; think next. The “New Nationalists” — there’s a term that’s getting around — are old feudal reactionaries. Each — start with Erdogan in his White Palace — means to live as if in a castle socially surrounded by nobility and attended by peons and slaves. BackChannels hopes they will one day find themselves left floating in their own blood-dimmed and greed-soaked clouds and the better world, broadly inclusive, culturally self-sustaining (we should be concerned with keeping and growing our 7,000 or so living language cultures), earth conscious, still in awe of the universe, and beautifully interwoven will look back on them with a shudder. The prompt: an account of a conversation in which Deist prophets were characterized as representing human perfection as provided by God Almighty Himself. Judaism separates God and His powers from man absolutely. The theme is recurring in the Torah, starting, perhaps, with the masking off of the Tree of Life from Adam and Eve and moves on to Pharaoh in his political hubris and Moses with his stuttering but divinely guided diplomacy, as it were, and on to the “Binding of Isaac” where God sets out to “prove Abraham” but whether for blind obedience or the possession of conscience (should not Abraham have spoken “truth to power”) we are left to argue. The Roman and Arab uptake of Judaic lore bends content toward the evolving cultural behaviors of each and semi-idolatry, however dressed, becomes part of our bloody medieval merry-go-round (perhaps the idea should be labeled “horror-go-round”) in the conflation of religion with political power. Whether divinely imparted or wisely composed, the First Commandment would seem most well chosen. 
There must be differences between being revered or thought holy and being regarded as God’s perfect expression of humanity. I think God forbid Jews that option at the very start of the instructions, so as to eliminate too great an idolatry for either any one leader and the leader’s followers. The prompt: in relation to anti-Semitism, a statement suggesting Muslims, in general, know little about Islam, and anti-Semitic sentiment in Islam relates directly to the conflict in the middle east. Religious identification serves as a powerful discriminator and tool for the purposes of leadership in the medieval world, and for some portion of the “Ummah”, small or large, the bond of Muslim identification against the Jew serves to sustain anti-Semitic thought wherever that may be promoted to serve the interests of leaders or spoilers for power. While the notion that in some general way Muslims know nothing about Islam would seem to confront multiple cultural histories of written clerical thought largely expounding on Qur’an, Hadith, and Sunnah, much of which may indeed have been self-serving — or serving the subscriptions of noted institutions or scholars. In the medieval world, much of power would be suspended between the church (mosque, or temple), then seats of education, and political authority whose legitimacy rested on convincing clerical validation (predicated on beliefs installed). The “Educated Modern” has a much widened field of view as regards phenomenology associated with psychology, religion, and spirituality. We’re aware of the world’s approximately 7,000 living languages and associated ethnolinguistic cultures; we’re aware of the histories of religions “across the campus”; perhaps most importantly, we’re aware of our own foibles. 🙂 The modern may face some puzzles as regards the perpetuation of the medieval suite of opposed and similar philosophies, but first on the agenda should be the question of how to bring forward those living in the medieval world. One may practically skip church with that kind of Sunday morning sermonette. However,the world’s educated and dedicated or leisured may support their “team”, the same have also access to a sophisticated awareness of their rivals plus awareness and knowledge sufficient to sink everyone’s ship: the good must ask wherever there is conflict what the fighting is about. BackChannels simple answer to that question: power. And there follows a question: medieval or modern? “Medieval v Modern” has been chatted up quite on this blog, so I’ll spare ye another ramble. I would accept the validity of the interest in religion, not the laws, policies, or practices promoted in the interest of its related fascism. Even Pakistan has differentiated itself from Islam in its “realpolitik” law. As regards reform — i.e., an end to Islamic supremacism, supersession, bigotry regarding others, barbaric advisement — YES! Intellectualized Islam — e.g., Qanta Ahmed and others who bring a modern sensibility to Qur’anic ambiguity and move away from the supremacist mark, much preferring “no compulsion”. Heretical Islam — e.g., M. Zuhdi Jasser and others in the Islamic Reform Movement who prefer contemplation and worship to “political Islam” and related militancy. Renewal Through Reinterpreting Translation (goodbye Pickthall) and narrowed focus on the Qur’an — the claimed “word of God” — so as to diminish the merely mortal factor in the receiving of the Qur’an. If any should think up other Islamic Reform options, let me know. 
There is a greater and more challenging anachronism in the persistence of medieval worldviews about God and about power in a modern day that requires, for greater wealth and security and the wider distribution of both plus justice, a greater cooperation and integrity in global social relationships.

Apparently, if one is not close to a monk or priest (or perhaps a recluse with a library), one may be in danger of trusting an untrustworthy friend. Note: one might ask whether caliphs, kings, and emperors are not inherently arrogant in their assumptions of power over all others, and therefore particularly sensitive to arrogance in those whom they would subjugate.

Compact between shaman and chief and cleric and king spans the ages but may not be a permanent feature in humanity’s intellectual and political evolution. That may be something to think about in the experience of language, both in political rhetoric and in scripture (no matter to whom the words belong), and that of power as dominion over others.

The region of the Qur’an cited, 5:82 and 5:83, presents in English through several well-remarked translations — a standard four being Asad, Malik, Pickthall, and Yusuf Ali — and the conveyances of none would seem as sweet as the statement quoted as the prompt. One thought attending the description of “men devoted to learning” and “who have renounced the world and are not arrogant” is that such men would seem less than challenging to martial or political power and therefore dismissible by any speaker intent on monopolizing and wielding such power.

If thou wouldst be apostle, caliph, king, or emperor, wouldst thou not note the sweetness of the complete and grateful surrender of thine greatest potential resistance? Given that question and thought, one might appreciate attempts at transitional revisionism.

Defiance! The word is not a bad one when it comes to religion. Because it is on the heels of many an act of defiance that religion has become healthier, stronger, more tolerant, and certainly more enjoyable. Martin Luther hammered his edict into a wooden door, and the empire of the Catholic Church was shaken. Colonists fled Europe. Their defiance against the belief that the state had any right to meddle in the private worship of the citizen proved a powerful motivation to escape. Resistance against government constraint of private acts of worship caused them to load onto their wooden ships and set sail. The Mayflower Compact sprang to life at Plymouth Rock, and the giant-hearted turned their faces into the harsh wind. Some shivered and died from the cold, while others starved to death in Jamestown. But the strength in their bones carried fires of conviction into the marrow of their future generations. Defiance. America remains a powerful societal example today because of acts of religious defiance.

Wasatia is a movement that advocates achieving peace and prosperity through the promotion of a culture of moderation that would walk away from the current climate of religious and political extremism that escalates fear and violence. Wasatia claims the centrist position—that balance, between passion and hate, between amity and enmity, between deep despair and false hope, would lead the Middle East out of its chronic conflict and despair. The Wasatia name derives from the term wasatan, which appears in verse 143 of al-Baqarah Surah in the Holy Quran. The term wasatia in Arabic means center and middle.
In the Holy Quran it means “justice, moderation, balance and temperance.” The word wasat appears in verse 143 of the second chapter, which is 286 verses long, so it appears exactly in the middle. The verse says: “And We have created you a middle ground (moderate) nation” or “a centrist ummah [community].” The passage demonstrates that the need to be moderate and temperate is a central message within Islam. Wasatia addresses all aspects of life: the way you eat, the way you dress, the way you spend money. Those of us in the movement interpret this to indicate justice, balance, moderation, middle ground, centrism, and temperance. In studying other faiths, particularly Judaism and Christianity, it becomes clear that they too uphold the same values, thus offering fertile ground for inter-faith understanding and peaceful co-existence. But it’s not merely moderation as a religious principle that should replace the radicalizing rhetoric of militant extremists. It is at its core a deeply human principle, a willingness to see those on the other side of the conflict not as “the enemy” but as fellow human beings, shaped by different histories but all looking towards the day when they can live in peace and security. This belief may seem an incongruous attitude, coming as it does from someone who, as a Palestinian university student in the humiliating aftermath of the 1967 Arab-Israeli War, espoused guerilla warfare as the only possible way to achieve justice for his people. But then I left to pursue post-graduate studies, first in England and then the United States. It was an enlightening experience. Viewing the situation from a distance and with new knowledge, I came to reject any notion of violence as an answer to the problem. Later personal experiences strengthened my belief that at a human level, where bigotry and hatred are replaced by moderation, empathy, and understanding, there exists a common desire for peaceful accommodation. In late 2006, during the month of Ramadan, I observed from the balcony of my house, which overlooked the Dahiet al-Barid/Ram Checkpoint in East Jerusalem, a situation that had the potential to escalate into violent confrontation. Hundreds of Palestinians from the West Bank were trying to pass into Jerusalem to pray in al-Haram al-Sharif and al-Aqsa Mosque. The Israeli soldiers pushed them back and threw tear gas grenades at them, but to no avail. I was waiting for gunfire to erupt when quite quickly the volatile standoff appeared to have been defused. I soon discovered that the leading officer had agreed to a compromise. Buses were arranged to take the Palestinians, who agreed to hand over their ID cards, into Jerusalem to pray. Afterwards the buses brought them back to the checkpoint where their cards were returned. It struck me as very significant that these Palestinians, religious though they clearly were, favored a negotiated solution. Had they been extremists, they would have escalated the event in the hope of precipitating a violent clash that could then be used to further their narrative of a demonic Israeli enemy. On their part, the Israelis recognized the Muslim faithful for what they were, religious yet moderate people. This in turn prompted me to ask myself who represents such religious moderates in Palestine and, as a response, to found Al Wasatia. I believe that part of the religious animosity problem is related to ignorance—both about our own religion and that of the ‘other’. 
Religion has played a big role in agitating the conflict to date, and I believe it is time that religion becomes a catalyst in resolving it. Many Muslims don’t know very much about Judaism or Christianity, and what many of them know about Islam is distorted. Interfaith dialogue helps to dispel stereotypical images, myths, and misperceptions. In any conflict, religious peace is a prerequisite for a sustainable political peace. Achieving our goals will take time, probably a long time, because it involves overcoming the malevolent influence of the religious militants, their distorted interpretation of the Qur’an, and the deeply ingrained attitudes and prejudices thus engendered, particularly among the poor, young, and uneducated. But it’s no good standing by and doing nothing—not when we are confident that our message of moderation is the key to a much brighter future for all sides. Mohammed S. Dajani Daoudi, the founder of the Wasatia Movement of Moderate Islam, is also the inaugural Weston Fellow at The Washington Institute. He previously worked as a professor of political science at al-Quds University in Jerusalem and served as a visiting fellow at the Institute in 2012.
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# This file is subject to the terms and conditions defined in
# file 'LICENSE.md', which is part of this source code package.
#

class PetSetStatus(object):
    """
    http://kubernetes.io/docs/api-reference/apps/v1alpha1/definitions/#_v1alpha1_petsetstatus
    """

    def __init__(self, model=None):
        super(PetSetStatus, self).__init__()
        self._observed_generation = None
        self._replicas = None
        if model is not None:
            self._build_with_model(model)

    def _build_with_model(self, model=None):
        if "observedGeneration" in model:
            self.observed_generation = model["observedGeneration"]
        if "replicas" in model:
            self.replicas = model["replicas"]

    # ------------------------------------------------------------------------------------- observedGeneration

    @property
    def observed_generation(self):
        return self._observed_generation

    @observed_generation.setter
    def observed_generation(self, og=None):
        if not isinstance(og, int):
            raise SyntaxError("PetSetStatus: observed_generation: [ {} ] is not an int.".format(og))
        self._observed_generation = og

    # ------------------------------------------------------------------------------------- replicas

    @property
    def replicas(self):
        return self._replicas

    @replicas.setter
    def replicas(self, r=None):
        if not isinstance(r, int):
            raise SyntaxError("PetSetStatus: replicas: [ {} ] is not an int.".format(r))
        self._replicas = r

    # ------------------------------------------------------------------------------------- serialize

    def serialize(self):
        data = {}
        if self.observed_generation is not None:
            data["observedGeneration"] = self.observed_generation
        if self.replicas is not None:
            data["replicas"] = self.replicas
        return data
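A minimal usage sketch for the status object above; hedged, since it only exercises the methods defined in this file (the model dict shape mirrors the v1alpha1 API fields):

# Hypothetical round-trip: build a PetSetStatus from an API-style dict,
# then serialize it back. Only fields that were set are emitted.
status = PetSetStatus(model={"observedGeneration": 3, "replicas": 2})
assert status.observed_generation == 3
assert status.replicas == 2
print(status.serialize())  # -> {'observedGeneration': 3, 'replicas': 2}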
Flemens v. TransAm Trucking, Inc. Eric Lavon Friend and TransAm Trucking, Inc. March 19, 2019 Filing 1 NOTICE OF REMOVAL by TransAm Trucking, Inc. from 62nd, case number CV43951. (Filing fee $400, receipt number 0540-7185460), filed by TransAm Trucking, Inc. (Allred, William) (Additional attachment(s) added on 3/19/2019: #1 Civil Cover Sheet) (rpc, ).
""" """ import asyncio import unittest import uuid from aiogremlin import (submit, GremlinConnector, GremlinClient, GremlinClientSession) class SubmitTest(unittest.TestCase): def setUp(self): self.loop = asyncio.new_event_loop() asyncio.set_event_loop(None) def tearDown(self): self.loop.close() def test_submit(self): @asyncio.coroutine def go(): resp = yield from submit("4 + 4", bindings={"x": 4}, loop=self.loop) results = yield from resp.get() return results results = self.loop.run_until_complete(go()) self.assertEqual(results[0].data[0], 8) def test_rebinding(self): execute = submit("graph2.addVertex()", loop=self.loop) try: self.loop.run_until_complete(execute.get()) error = False except: error = True self.assertTrue(error) @asyncio.coroutine def go(): result = yield from submit( "graph2.addVertex()", rebindings={"graph2": "graph"}, loop=self.loop) resp = yield from result.get() self.assertEqual(len(resp), 1) self.loop.run_until_complete(go()) class GremlinClientTest(unittest.TestCase): def setUp(self): self.loop = asyncio.new_event_loop() asyncio.set_event_loop(None) self.gc = GremlinClient(url="ws://localhost:8182/", loop=self.loop) def tearDown(self): self.loop.run_until_complete(self.gc.close()) self.loop.close() def test_connection(self): @asyncio.coroutine def go(): ws = yield from self.gc._connector.ws_connect(self.gc.url) self.assertFalse(ws.closed) yield from ws.close() self.loop.run_until_complete(go()) def test_execute(self): @asyncio.coroutine def go(): resp = yield from self.gc.execute("x + x", bindings={"x": 4}) return resp results = self.loop.run_until_complete(go()) self.assertEqual(results[0].data[0], 8) def test_sub_waitfor(self): sub1 = self.gc.execute("x + x", bindings={"x": 1}) sub2 = self.gc.execute("x + x", bindings={"x": 2}) sub3 = self.gc.execute("x + x", bindings={"x": 4}) coro = asyncio.gather(*[asyncio.async(sub1, loop=self.loop), asyncio.async(sub2, loop=self.loop), asyncio.async(sub3, loop=self.loop)], loop=self.loop) # Here I am looking for resource warnings. 
        results = self.loop.run_until_complete(coro)
        self.assertIsNotNone(results)

    def test_resp_stream(self):

        @asyncio.coroutine
        def stream_coro():
            results = []
            resp = yield from self.gc.submit("x + x", bindings={"x": 4})
            while True:
                f = yield from resp.stream.read()
                if f is None:
                    break
                results.append(f)
            self.assertEqual(results[0].data[0], 8)

        self.loop.run_until_complete(stream_coro())

    def test_execute_error(self):
        execute = self.gc.execute("x + x g.asdfas", bindings={"x": 4})
        try:
            self.loop.run_until_complete(execute)
            error = False
        except Exception:
            error = True
        self.assertTrue(error)

    def test_rebinding(self):
        execute = self.gc.execute("graph2.addVertex()")
        try:
            self.loop.run_until_complete(execute)
            error = False
        except Exception:
            error = True
        self.assertTrue(error)

        @asyncio.coroutine
        def go():
            result = yield from self.gc.execute(
                "graph2.addVertex()", rebindings={"graph2": "graph"})
            self.assertEqual(len(result), 1)

        self.loop.run_until_complete(go())


class GremlinClientSessionTest(unittest.TestCase):

    def setUp(self):
        self.loop = asyncio.new_event_loop()
        asyncio.set_event_loop(None)
        self.gc = GremlinClientSession(url="ws://localhost:8182/",
                                       loop=self.loop)
        self.script1 = """graph = TinkerFactory.createModern()
g = graph.traversal(standard())"""
        self.script2 = "g.V().has('name','marko').out('knows').values('name')"

    def tearDown(self):
        self.loop.run_until_complete(self.gc.close())
        self.loop.close()

    def test_session(self):

        @asyncio.coroutine
        def go():
            yield from self.gc.execute(self.script1)
            results = yield from self.gc.execute(self.script2)
            return results

        results = self.loop.run_until_complete(go())
        self.assertEqual(len(results[0].data), 2)

    def test_session_reset(self):

        @asyncio.coroutine
        def go():
            yield from self.gc.execute(self.script1)
            self.gc.reset_session()
            results = yield from self.gc.execute(self.script2)
            return results

        results = self.loop.run_until_complete(go())
        self.assertIsNone(results[0].data)

    def test_session_manual_reset(self):

        @asyncio.coroutine
        def go():
            yield from self.gc.execute(self.script1)
            new_sess = str(uuid.uuid4())
            sess = self.gc.reset_session(session=new_sess)
            self.assertEqual(sess, new_sess)
            self.assertEqual(self.gc.session, new_sess)
            results = yield from self.gc.execute(self.script2)
            return results

        results = self.loop.run_until_complete(go())
        self.assertIsNone(results[0].data)

    def test_session_set(self):

        @asyncio.coroutine
        def go():
            yield from self.gc.execute(self.script1)
            new_sess = str(uuid.uuid4())
            self.gc.session = new_sess
            self.assertEqual(self.gc.session, new_sess)
            results = yield from self.gc.execute(self.script2)
            return results

        results = self.loop.run_until_complete(go())
        self.assertIsNone(results[0].data)

    def test_resp_session(self):

        @asyncio.coroutine
        def go():
            session = str(uuid.uuid4())
            self.gc.session = session
            resp = yield from self.gc.submit("x + x", bindings={"x": 4})
            while True:
                f = yield from resp.stream.read()
                if f is None:
                    break
            self.assertEqual(resp.session, session)

        self.loop.run_until_complete(go())


class ContextMngrTest(unittest.TestCase):

    def setUp(self):
        self.loop = asyncio.new_event_loop()
        asyncio.set_event_loop(None)
        self.connector = GremlinConnector(loop=self.loop)

    def tearDown(self):
        self.loop.run_until_complete(self.connector.close())
        self.loop.close()

    # def test_connection_manager(self):
    #     results = []
    #
    #     @asyncio.coroutine
    #     def go():
    #         with (yield from self.connector) as conn:
    #             client = SimpleGremlinClient(conn, loop=self.loop)
    #             resp = yield from client.submit("1 + 1")
    #             while True:
    #                 mssg = yield from resp.stream.read()
    #                 if mssg is None:
    #                     break
    #                 results.append(mssg)
    #     self.loop.run_until_complete(go())


if __name__ == "__main__":
    unittest.main()
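For orientation, a minimal standalone sketch of the submit API exercised by the tests above; it assumes a Gremlin Server listening on the same ws://localhost:8182/ endpoint the tests use:

import asyncio

from aiogremlin import submit


@asyncio.coroutine
def run(loop):
    # Submit a parameterized Gremlin script and aggregate the results.
    resp = yield from submit("x + x", bindings={"x": 4}, loop=loop)
    results = yield from resp.get()
    return results


loop = asyncio.get_event_loop()
results = loop.run_until_complete(run(loop))
print(results[0].data[0])  # -> 8, matching the assertion in SubmitTest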
Today's clue from the New York Times crossword puzzle is: Not mine alone. First, let's look and see if we can find any hints in the New York Times crossword puzzle. Then we will gather any relevant information we need in order to find the correct answer to the clue "Not mine alone" that has been given in the New York Times crossword puzzle. Finally, we will list any possible answers below for the clue "Not mine alone".
"""View of QuotaWindowView.""" from gi.repository import Gtk from lib.mvc.bases import WindowViewBase from lib.exception_feedback import add_default_exception_handling class QuotaWindowView(Gtk.Window, WindowViewBase): """View of QuotaWindowView.""" def __init__(self, app, model): """Ctor of QuotaWindowView.""" Gtk.Window.__init__(self) WindowViewBase.__init__(self, app, model) self.on_open = None self.on_close = None @add_default_exception_handling('Failed to initialize Quota Window') def initialize(self): """Create the actual view with all widgets.""" self.connect("delete-event", self.cb_close) # create tree view sorted_model = Gtk.TreeModelSort(model=self.model.create_model()) sorted_model.set_sort_column_id(1, Gtk.SortType.ASCENDING) self.tree_view = Gtk.TreeView(model=sorted_model) self.create_columns(self.tree_view) # create a grid and attach the treeview to it self.grid = Gtk.Grid() self.grid.attach(self.tree_view, 0, 0, 1, 1) # attach grid to window self.add(self.grid) @add_default_exception_handling('Failed to open Quota Window') def cb_show(self, w, data): """On show.""" self.set_icon_from_file(self.getIcon()) if self.on_open is not None: self.on_open() sorted_model = Gtk.TreeModelSort(model=self.model.create_model()) sorted_model.set_sort_column_id(1, Gtk.SortType.ASCENDING) self.tree_view.set_model(model=sorted_model) self.show_all() return True @add_default_exception_handling('Failed to close Quota Window') def cb_close(self, w, data): """"On window close.""" if self.on_close is not None: self.on_close() self.hide() return True @add_default_exception_handling('Failed to update Quota Window') def on_update(self): """On update.""" self.tree_view.set_model(self.model.create_model()) @add_default_exception_handling() def register_on_open(self, func): """Register on open event.""" self.on_open = func @add_default_exception_handling() def register_on_close(self, func): """Register on close event.""" self.on_close = func @add_default_exception_handling('Failed to display storage information') def create_columns(self, tree_view): """Create the columns of the TreeView.""" rendererText = Gtk.CellRendererText() column = Gtk.TreeViewColumn("File", rendererText, text=0) column.set_sort_column_id(0) tree_view.append_column(column) rendererText = Gtk.CellRendererText() column = Gtk.TreeViewColumn("Size [MB]", rendererText, text=1) column.set_sort_column_id(1) column.set_cell_data_func( rendererText, lambda col, cell, model, iter, unused: cell.set_property( "text", '{0:.2f}'.format(model.get(iter, 1)[0]))) tree_view.append_column(column)
With WWDC about to kick off on Monday at Moscone West in San Francisco, Apple has updated the recently released WWDC app to tweak it before the festivities begin. WWDC for iOS is now updated to version 1.0.1, which includes a feature that will allow videos streaming over AirPlay to continue after an interruption. Whether you receive a call or your phone just locks, the video will resume from where you left off. Favorites in Events are now scheduled until 11pm, and a bug fix was included to take care of a problem that shut video volume off automatically when you lowered the ringer volume on your phone. The update also eliminates duplicate headers in the video list, prevents the navigation bar in Events from disappearing and corrects the problem that led to blurry text in session details. Version information has been added to improve bug reports. The app is available for free from the Apple App Store. Like the app provided every year by Google for Google I/O, the WWDC app is made for both those attending and those watching at home. Streaming video allows registered Apple developers to view key sessions at home, while those attending will be helped out by maps of the Moscone Center and other features.
""" This module contains some functions to evaluate the stock prices predictor. """ import sys import numpy as np from pandas.errors import UnsortedIndexError from sklearn.metrics import r2_score import pandas as pd import predictor.feature_extraction as fe import datetime as dt def mre(y_true, y_pred): """ MRE metrics function. The values of assets should never be zero so, as zero labels cause problems, they are not considered. """ y_true_filtered = y_true[y_true != 0] y_pred_filtered = y_pred[y_true != 0] return np.mean(np.abs((y_pred_filtered - y_true_filtered) / y_true_filtered)) def get_metrics(y_true_df, y_pred_df): """ Calculates the MRE and R^2 score, on a per-symbol basis. It receives matrices of results, in which the rows represent time and the columns represent symbols. :param y_true_df: The labels for each symbol at each moment in time. :param y_pred_df: The predicted labels for each symbol at each moment in time. :returns r2_scores: Numpy array with the R^2 score for each symbol :returns mre_scores: Numpy array with the MRE score for each symbol :returns tickers: Array that contains the ticker symbols. """ tickers = y_true_df.index.levels[1] r2_scores = [] mre_scores = [] for ticker in tickers: try: y_true = y_true_df.loc[(slice(None), ticker), :] y_pred = y_pred_df.loc[(slice(None), ticker), :] except UnsortedIndexError: y_true = y_true_df.sort_index().loc[(slice(None), ticker), :] y_pred = y_pred_df.sort_index().loc[(slice(None), ticker), :] r2_scores.append(r2_score(y_true, y_pred)) mre_scores.append(mre(y_true.values, y_pred.values)) return np.array(r2_scores), np.array(mre_scores), tickers def get_metrics_df(y_true_df, y_pred_df): """ Wrapper around get_metrics that returns dataframes instead of Numpy arrays. """ r2_scores, mre_scores, tickers = get_metrics(y_true_df, y_pred_df) return pd.DataFrame(np.array([r2_scores, mre_scores]).T, index=tickers, columns=['r2', 'mre']) def get_metrics_in_time(y_true_df, y_pred_df, shift): """ Calculates the MRE and R^2 score, on a per-time basis. It receives matrices of results, in which the rows represent time and the columns represent symbols. :param y_true_df: The labels for each symbol at each moment in time. :param y_pred_df: The predicted labels for each symbol at each moment in time. :return: The mean MRE and R^2 score for each time point, and an array of the corresponding dates. """ dates = y_true_df.index.get_level_values(0).unique() r2_scores = [] mre_scores = [] for date in dates: try: y_true = y_true_df.loc[(date, slice(None)), :] y_pred = y_pred_df.loc[(date, slice(None)), :] except UnsortedIndexError: y_true = y_true_df.sort_index().loc[(date, slice(None)), :] y_pred = y_pred_df.sort_index().loc[(date, slice(None)), :] r2_scores.append(r2_score(y_true, y_pred)) mre_scores.append(mre(y_true.values, y_pred.values)) return np.array(r2_scores), np.array(mre_scores), dates + dt.timedelta(shift) def reshape_by_symbol(y): """ Deprecated helper function. Was not used in the final implementation.""" grouped_df = y.reset_index() \ .groupby('level_0') \ .apply(lambda x: x.reset_index(drop=True)) \ .drop('level_0', axis=1) grouped_df.index = grouped_df.index.droplevel(level=1) grouped_df.rename(columns={'level_1': 'ticker'}, inplace=True) reshaped_df = grouped_df.set_index('ticker', append=True).unstack() reshaped_df.columns = reshaped_df.columns.droplevel(level=0) reshaped_df.index.name = 'date' return reshaped_df def run_single_val(x, y, ahead_days, estimator): """ Runs a single training and validation. 
    :param x: A dataframe of samples. The columns represent the base days. The rows always contain the dates of
        the initial day in the base period. Additionally, the dataframe may be multiindexed with information about
        from which symbol each sample comes from. The symbol information is not used for the training, but may be
        useful to get some insights into the validation process.
    :param y: The labels of each sample. It corresponds to the (standardized) value of a ticker, some days ahead.
    :param ahead_days: Number of days ahead that the labels are from the last base day.
    :param estimator: A predictor object for the labels. It follows the scikit-learn interface, but keeps the
        dataframe information.
    :returns y_train_true_df: Labels for the training set. Rows contain dates, columns contain symbols.
    :returns y_train_pred_df: Predictions for the training set. Rows contain dates, columns contain symbols.
    :returns y_val_true_df: Labels for the validation set. Rows contain dates, columns contain symbols.
    :returns y_val_pred_df: Predictions for the validation set. Rows contain dates, columns contain symbols.
    """
    multiindex = x.index.nlevels > 1
    x_y = pd.concat([x, y], axis=1)
    x_y_sorted = x_y.sort_index()
    if multiindex:
        x_y_train = x_y_sorted.loc[:fe.add_market_days(x_y_sorted.index.levels[0][-1], -ahead_days)]
        x_y_val = x_y_sorted.loc[x_y_sorted.index.levels[0][-1]:]
    else:
        x_y_train = x_y_sorted.loc[:fe.add_market_days(x_y_sorted.index[-1], -ahead_days)]
        x_y_val = x_y_sorted.loc[x_y_sorted.index[-1]:]

    x_train = x_y_train.iloc[:, :-1]
    x_val = x_y_val.iloc[:, :-1]
    y_train_true = x_y_train.iloc[:, -1]
    y_val_true = x_y_val.iloc[:, -1]

    estimator.fit(x_train, y_train_true)
    y_train_pred = estimator.predict(x_train)
    y_val_pred = estimator.predict(x_val)

    y_train_true_df = pd.DataFrame(y_train_true)
    y_train_pred_df = pd.DataFrame(y_train_pred)
    y_val_true_df = pd.DataFrame(y_val_true)
    y_val_pred_df = pd.DataFrame(y_val_pred)

    # Just to make it look prettier
    y_train_pred_df.columns = y_train_true_df.columns
    y_val_pred_df.columns = y_val_true_df.columns

    return y_train_true_df, \
        y_train_pred_df, \
        y_val_true_df, \
        y_val_pred_df


def roll_evaluate(x, y, train_days, step_eval_days, ahead_days, predictor, verbose=False):
    """
    Warning: The final date of the period should be no larger than the final date of the SPY_DF

    This function applies run_single_val many times, in a rolling evaluation fashion.

    :param x: A dataframe of samples. Normally it will span a period larger than the training period. The columns
        represent the base days. The rows always contain the dates of the initial day in the base period.
        Additionally, the dataframe may be multiindexed with information about from which symbol each sample comes
        from. The symbol information is not used for the training, but may be useful to get some insights into the
        validation process.
    :param y: The labels of each sample. It corresponds to the (standardized) value of a ticker, some days ahead.
    :param train_days: The number of training days for each train-validation run.
    :param step_eval_days: The number of days to move the training and validation sets on each cycle.
    :param ahead_days: Number of days ahead that the labels are from the last base day.
    :param predictor: A predictor object for the labels. It follows the scikit-learn interface, but keeps the
        dataframe information.
    :param verbose: If true it shows some messages and progress reports.
    :returns r2_train_metrics: A numpy array with the mean and standard deviation of the R^2 metrics for each date
        of evaluation. 
The mean and std are taken on the symbols dimension. :returns mre_train_metrics: A numpy array with the mean and standard deviation of the MRE metrics for each date of evaluation. The mean and std are taken on the symbols dimension. :returns y_val_true_df: Labels for the validation set. Rows contain dates, columns contain symbols. :returns y_val_pred_df: Predictions for the validation set. Rows contain dates, columns contain symbols. :returns mean_dates: The mean date of the training period. It is useful to plot the training metrics in time. """ # calculate start and end date # sort by date x_y_sorted = pd.concat([x, y], axis=1).sort_index() start_date = x_y_sorted.index.levels[0][0] end_date = fe.add_market_days(start_date, train_days) final_date = x_y_sorted.index.levels[0][-1] # loop: run_single_val(x,y, ahead_days, estimator) mean_dates = [] r2_train_means = [] r2_train_stds = [] mre_train_means = [] mre_train_stds = [] y_val_true_df = pd.DataFrame() y_val_pred_df = pd.DataFrame() num_training_sets = (252 / 365) * ( x.index.levels[0].max() - fe.add_market_days(x.index.levels[0].min(), train_days)).days // step_eval_days set_index = 0 if verbose: print('Evaluating approximately %i training/evaluation pairs' % num_training_sets) while end_date < final_date: x_temp = x_y_sorted.loc[start_date:end_date].iloc[:, :-1] y_temp = x_y_sorted.loc[start_date:end_date].iloc[:, -1] x_temp.index = x_temp.index.remove_unused_levels() y_temp.index = y_temp.index.remove_unused_levels() y_train_true, y_train_pred, y_val_true, y_val_pred = run_single_val(x_temp, y_temp, ahead_days, predictor) # Register the mean date of the period for later use mean_dates.append(start_date + ((end_date - start_date) / 2)) # Calculate R^2 and MRE for training and append r2_scores, mre_scores, tickers = get_metrics(y_train_true, y_train_pred) r2_train_means.append(np.mean(r2_scores)) r2_train_stds.append(np.std(r2_scores)) mre_train_means.append(np.mean(mre_scores)) mre_train_stds.append(np.std(mre_scores)) # Append validation results y_val_true_df = y_val_true_df.append(y_val_true) y_val_pred_df = y_val_pred_df.append(y_val_pred) # Update the dates start_date = fe.add_market_days(start_date, step_eval_days) end_date = fe.add_market_days(end_date, step_eval_days) set_index += 1 if verbose: sys.stdout.write('\rApproximately %2.1f percent complete. ' % (100.0 * set_index / num_training_sets)) sys.stdout.flush() return np.array([r2_train_means, r2_train_stds]).T, \ np.array([mre_train_means, mre_train_stds]).T, \ y_val_true_df, \ y_val_pred_df, \ np.array(mean_dates)
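To make the rolling evaluation concrete, here is a hedged usage sketch. The x and y frames (with a (date, ticker) MultiIndex) and the parameter values are illustrative assumptions; note the docstrings require a predictor that keeps the dataframe index, so a plain scikit-learn regressor is wrapped:

import pandas as pd
from sklearn.linear_model import LinearRegression


class DataFramePredictor(object):
    """Thin wrapper that preserves the dataframe index, as roll_evaluate expects."""

    def __init__(self, estimator):
        self.estimator = estimator

    def fit(self, x, y):
        self.estimator.fit(x.values, y.values)
        return self

    def predict(self, x):
        # Returning a Series indexed like the input keeps the per-symbol
        # slicing in get_metrics working on the prediction frames.
        return pd.Series(self.estimator.predict(x.values), index=x.index)


predictor = DataFramePredictor(LinearRegression())
r2_train, mre_train, y_val_true, y_val_pred, mean_dates = roll_evaluate(
    x, y,
    train_days=504,      # ~2 market years of training data (assumption)
    step_eval_days=63,   # roll forward ~1 quarter per cycle (assumption)
    ahead_days=28,       # labels are 28 market days ahead (assumption)
    predictor=predictor,
    verbose=True)

# Per-symbol validation metrics as a dataframe.
print(get_metrics_df(y_val_true, y_val_pred))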
Contact lenses are more than just a fashion trend. They are an effective alternative to corrective eyeglasses. According to a study by Contact Lens Spectrum, nearly 100 million people worldwide wear contact lenses. But despite the growing number of wearers, many people are still not well informed about correct usage and proper habits for contact lenses. Wearing contact lenses without proper care may put your eyes at risk of developing complications and infections. Furthermore, improper cleaning and bad hygiene can lead to a range of problems, including eye irritation and infection. You may be one of those people who prefer wearing their contacts when swimming, or who wear them even while they sleep. Will this affect your contact lenses and worsen your vision? Here are some common questions contact lens users may have in mind, along with the do’s and don’ts of proper contact lens care. Can I use decorative contact lenses even without a doctor’s prescription? Whether decorative or prescriptive, contact lenses must be assessed by an eye doctor before use. Since the eye’s surface has unique properties for each person, lenses have to be doctor-approved before wearing. Can I wear my contact lenses for more than 10 hours? You may wear your contact lenses for 8-10 hours a day, but it is advisable to change to your glasses from time to time to allow your eyes’ oxygen intake to get back to normal. It is not advisable to sleep with your contact lenses on, especially if they aren’t designed to be slept in. Can I still wear my contact lenses if they have a little scratch? Damaged contact lenses are no longer safe to wear, as they may scratch your cornea, and that tiny tear can collect bacteria that bring infections and other problems to your eye. Can I wear my contact lenses when swimming or even taking a bath? It is highly recommended that you don’t, because this may wash the lenses out of your eyes. In addition, water can carry microscopic organisms that may lead to an eye infection, so water should not come in contact with the lenses. Can I clean my contact lenses using water? You can clean contact lens cases with tap water. But for the lenses, always use the eye solutions prescribed by your eye doctor. Can I store my contact lenses in any container? Only use the container provided by your eye doctor. Also, clean the case daily using warm tap water with saline after each use and let it dry with the covers off. How often should I clean my contact lenses? As often as you can. This prevents the buildup of hardened protein deposits and oily films that are difficult to remove. Can I use other solutions aside from what’s prescribed by the doctor? Don’t use different brands of solutions simultaneously. But if necessary, introduce them one at a time to avoid complications. Can I borrow other people’s contact lenses? Contact lenses are for personal use only and not for sharing. Can I use contact lenses past their expiration date? Each contact lens is marked with its expiration date on its packaging. Observing the expiration date is an important part of wearing contact lenses safely, since expired lenses may harbor bacteria, fungi, and amoebae that can cause serious eye infections and eventually lead to blindness. What if my eyes feel dry? During lens wear, ‘comfort drops’ may be used to lubricate dry eyes, but they must be specially formulated for your type of lens. Never use regular eye drops because they contain the wrong type of preservative chemicals. 
See your doctor regularly, and not only at times you feel like it. Sometimes there are contact lens-related issues that are spotted during a routine examination, even before you feel or see anything wrong with your eyes, so it’s best to be vigilant with eye care. The use of contact lenses has become about more than just correcting eye problems; it is also a way of decorating the eyes with the many vibrant colors you can choose from. Wearing contact lenses is not a bad idea, as long as you take note of the care tips mentioned and don’t trade possible eye problems in the future for style and fashion.
#!/usr/bin/env python # -*- coding: utf-8 -*- # freeseer - vga/presentation capture software # # Copyright (C) 2011, 2013 Free and Open Source Software Learning Centre # http://fosslc.org # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program. If not, see <http://www.gnu.org/licenses/>. # For support, questions, suggestions or any other inquiries, visit: # http://wiki.github.com/Freeseer/freeseer/ import unittest from freeseer.framework.config.persist import JSONConfigStorage from freeseer.tests.framework.config.persist import ConfigStorageTest initial_config = '''\ { "this_section": { "option1": "othello", "option2": "0" } }\ ''' after_config = '''\ { "this_section": { "option1": "something_new", "option2": "10" } }\ ''' class TestJSONConfigStorage(ConfigStorageTest, unittest.TestCase): """Tests that JSONConfigStorage works with a generic Config subclass.""" CONFIG_STORAGE_CLASS = JSONConfigStorage INITIAL_LOAD_CONFIG = initial_config AFTER_STORE_CONFIG = after_config
The Center for Advanced Dental Education has raised $10,870 from 13 gifts! At the Center for Advanced Dental Education, we prepare dentists to become astute and multidisciplinary clinicians. Our clinics provide services and advanced care for both adults and children at a lower cost than traditional prices. CADE also extends its resources to the community to offer dental education and dental patient care to those in need. CADE has a proud tradition that strives to balance progressive excellence with the highest standards and the compassionate service of our Catholic, Jesuit heritage. As we celebrate 200 years of Saint Louis University today, we invite you to honor the history and ensure the legacy of dental education at SLU by making a gift to the Center for Advanced Dental Education. Once $1,000 in gifts has been made to the center today, Executive Director Dr. John Hatton has pledged to give an additional $2,500.
# # This file is protected by Copyright. Please refer to the COPYRIGHT file # distributed with this source distribution. # # This file is part of REDHAWK core. # # REDHAWK core is free software: you can redistribute it and/or modify it under # the terms of the GNU Lesser General Public License as published by the Free # Software Foundation, either version 3 of the License, or (at your option) any # later version. # # REDHAWK core is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS # FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more # details. # # You should have received a copy of the GNU Lesser General Public License # along with this program. If not, see http://www.gnu.org/licenses/. # import jinja2 from redhawk.codegen.jinja.loader import CodegenLoader from redhawk.codegen.jinja.common import ShellTemplate, AutomakeTemplate, AutoconfTemplate from redhawk.codegen.jinja.cpp import CppCodeGenerator, CppTemplate from mapping import ServiceMapper if not '__package__' in locals(): # Python 2.4 compatibility __package__ = __name__.rsplit('.', 1)[0] loader = CodegenLoader(__package__, {'common': 'redhawk.codegen.jinja.common'}) class ServiceGenerator(CppCodeGenerator): def loader(self, component): return loader def componentMapper(self): return ServiceMapper() def propertyMapper(self): return None def portMapper(self): return None def templates(self, component): templates = [ CppTemplate('main.cpp'), CppTemplate('service.cpp', component['userclass']['file'], userfile=True), CppTemplate('service.h', component['userclass']['header'], userfile=True), CppTemplate('service_base.cpp', component['baseclass']['file']), CppTemplate('service_base.h', component['baseclass']['header']), AutomakeTemplate('Makefile.am'), AutomakeTemplate('Makefile.am.ide', userfile=True), AutoconfTemplate('configure.ac'), ShellTemplate('build.sh'), ShellTemplate('common/reconf') ] return templates
In Korea, it is legal to secretly record a conversation if you yourself are a party to it. Otherwise, it’s illegal (and a crime). The relevant law is the Protection of Communications Secrets Act (통신비밀보호법). In particular, let’s take a look at Articles 3, 4, 14, and 17. 5. Monitoring radio waves for the elimination of interference, etc.: Where radio waves are monitored in order to maintain order in radio waves by, for example, eliminating interference under Articles 49 through 51 of the Radio Waves Act. (2) Any censorship of mail or any wiretapping of telecommunications (hereinafter referred to as “communication-restricting measures”) shall be used as a supplementary means of facilitating a criminal investigation or ensuring national security, and efforts shall be made to minimize the violation of people’s communication secrets. (3) No person shall provide or be provided with a serial number of any terminal apparatus: Provided, That this shall not apply where the enterprise for manufacturing the terminal apparatus of mobile telephone or the mobile communications business operator provides or is provided with a serial number for a performance of lawful business, such as the opening of a service for terminal apparatus, repairs, etc. Mail or its contents obtained through illegal inspection and the contents of communication acquired or recorded through illegal wiretapping in violation of Article 3 shall not be admitted as evidence in a trial or disciplinary procedure. (1) No person shall record a conversation between others that is not open to the public or listen to it through the employment of electronic or mechanical devices. (2) The provisions of Articles 4 through 8, 9 (1) (former part) and (3), 9-2, 11 (1), (3) and (4) and 12 shall apply to recording or listening as referred to in paragraph (1). 6. A person who has been provided with the communication confirmation data or provided such data in violation of the provisions of Article 13 (4). 4. A person who has failed to report the current status of the provision of communication confirmation data, etc. to the Minister of Science, Information and Communications Technology (ICT) and Future Planning or to keep relevant materials in violation of the provisions of Article 13 (7). As you can see, the law only prohibits the secret recording (and wiretapping, etc.) of conversations “between others.” This essentially means that if you are a party to a conversation, you may lawfully record that conversation even without the consent of the other party or parties. This was affirmed by the Supreme Court of Korea. Basically, the Court ruled that it is lawful to secretly record a conversation even among three people if you yourself are a party to it. So, the bottom line is whether or not you were a party to the recorded conversation. Such secretly recorded conversations can even be used as evidence in court. Here are some situations in which such a recording might prove useful: 1) Employer-Employee: Making a complaint, giving a formal warning, being promised something/payment, etc. 2) Doctor-Patient: If the patient is being explained a particular surgery he/she will undergo, etc. 3) Lender-Lendee: Acknowledgement of a loan, promises to pay back, etc. 4) Landlord-Tenant: Making a complaint, putting the other party on notice, etc. 5) Criminal-Victim: When being harassed, threatened, blackmailed, etc.
# -*- encoding: utf-8 -*- ########################################################################### # Module Writen to OpenERP, Open Source Management Solution # # Copyright (c) 2012 Vauxoo - http://www.vauxoo.com/ # All Rights Reserved. # info Vauxoo ([email protected]) ############################################################################ # Coded by: Fernando Irene Garcia ([email protected]) ############################################################################ # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as # published by the Free Software Foundation, either version 3 of the # License, or (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see <http://www.gnu.org/licenses/>. # ############################################################################## from openerp.osv import fields, osv from openerp.tools.translate import _ from openerp import pooler, tools class account_invoice(osv.Model): _inherit = 'account.invoice' def _get_address_issued_invoice(self, cr, uid, ids, name, args, context=None): if context is None: context = {} res = {} journal_obj = self.pool.get('account.journal') for id_ in ids: data = self.browse(cr, uid, id_, context=context) journal_id = data.journal_id.id data_journal = journal_obj.browse( cr, uid, journal_id, context=context) a = data_journal.address_invoice_company_id and \ data_journal.address_invoice_company_id.id or False b = data_journal.company2_id and \ data_journal.company2_id.address_invoice_parent_company_id and \ data_journal.company2_id.address_invoice_parent_company_id.id or False c = data.company_id and \ data.company_id.address_invoice_parent_company_id and \ data.company_id.address_invoice_parent_company_id.id or False address_invoice = a or b or c or False res[data.id] = address_invoice return res def _get_company_emitter_invoice(self, cr, uid, ids, name, args, context=None): if context is None: context = {} res = {} journal_obj = self.pool.get('account.journal') for id_ in ids: data = self.browse(cr, uid, id_, context=context) journal_id = data.journal_id.id data_journal = journal_obj.browse( cr, uid, journal_id, context=context) company_invoice = data_journal.company2_id and \ data_journal.company2_id.id or data.company_id and \ data.company_id.id or False res[data.id] = company_invoice return res _columns = { 'address_issued_id': fields.function(_get_address_issued_invoice, type="many2one", relation='res.partner', string='Address Issued \ Invoice', help='This address will be used as address that issued \ for electronic invoice'), 'company_emitter_id': fields.function(_get_company_emitter_invoice, type="many2one", relation='res.company', string='Company Emitter \ Invoice', help='This company will be used as emitter company in \ the electronic invoice') } def onchange_journal_id(self, cr, uid, ids, journal_id=False, context=None): if context is None: context = {} result = super(account_invoice, self).onchange_journal_id( cr, uid, ids, journal_id, context=context) address_id = journal_id and self.pool.get('account.journal').browse( cr, uid, journal_id, context=context) or False if address_id and 
                address_id.address_invoice_company_id:
            result['value'].update(
                {'address_invoice_company_id':
                 address_id.address_invoice_company_id.id})
        if address_id and address_id.company2_id:
            result['value'].update(
                {'company2_id': address_id.company2_id.id})
        return result
Mornin’, Colorado! Plenty of news outlets are serving up Mueller Time today (Vox has an explainer of the Attorney General’s investigation summary, if you weren’t chugging the story over the weekend), but we’ve got a wonderfully zesty Sunriser — with notes of creativity and revenge — on tap for you this morning. But, first, I want to tell you about an awesome new offering from The Colorado Sun. Or, specifically, I should say it’s from outdoors writer extraordinaire Jason Blevins. The Outsider is Jason’s insightful and playful newsletter covering everything alfresco — from the recreation industry to outdoor culture to riffs on his favorite ski movie. Read the first newsletter here. We’ll have another freebie later this week. After that, you’ll need to be a Newsletters+ member to keep getting this weekly nugget of elevated goodness. Not a member yet? Join up here. Already a Newsletters+ member? Head here or go to “Manage Newsletters” in your account to opt in for free. OK, business matters over. Let’s land this McTwist, shall we? Committees are forming and fundraising messages are going out, and that means it’s looking increasingly likely that recall season will soon be upon us. Critics of the state’s Democratic leadership point to the national popular vote, “red flag”, and oil and gas bills as reasons to recall the governor or state legislators. A lawmaker who could find herself in the crosshairs, though, says the threat of recalls is an intimidation tactic. >> Read John Frank’s explanation of the recall movement, including why Colorado’s recent history gives the recall threats extra oomph. The head of Denver’s Museum of Contemporary Art is stepping down. Whatever comes next — and he’s not saying — is sure to be unexpected. As the director and “chief animator” of MCA Denver, Adam Lerner didn’t just zig or zag — he created whole new moves for a small museum looking to make big statements. He dared to risk turning off funders. He invited in teens to make the museum an unlikely hangout. He embraced the art of Mark Mothersbaugh of Devo, and he made space for a tattoo artist to ink people in a gallery. Now that Lerner is leaving, a nationwide community of creatives is watching to see what he does next. >> Read Joanne Ostrow’s wonderfully detailed profile of Lerner, complete with hilarious quotes from the man himself. Sun contributor Sandra Fish explains a bill that would require more frequent disclosures by lobbyists in Colorado, following up on an analysis she did about how difficult it is to track what all those lobbying dollars at the state Capitol are paying for. Jason Blevins reports that two months after the sheriff in San Miguel County warned skiers and snowboarders about the risks of ducking rope lines to get into the backcountry, a group of snowboarders did just that and triggered an avalanche that killed a Telluride man, according to the final report on the slide. Meanwhile, the sheriff’s investigation continues. More and more Colorado schools are putting in washers and dryers and allowing students to use them during the school day, hoping to create a homey environment that keeps kids in the classroom rather than staying home out of embarrassment over dirty clothes. Via our friends at Chalkbeat. More cartoons from The Colorado Sun opinion page. George Brauchler, the district attorney for Arapahoe, Douglas, Elbert and Lincoln counties, explains why he thinks Colorado needs a red flag gun law — but not the one being debated at the Capitol this session. 
Staying at the Capitol, University of Denver professor Jennifer C. Greenfield argues that the bill on paid family leave isn’t an entitlement; it’s an important pro-family insurance policy. Mario Nicolais looks at the strikes by Denver teachers and King Soopers employees and asks what it means if unions are regaining their mojo. An education-funding twofer: Philip DiStefano, the longtime chancellor at the University of Colorado, writes that federal funding for university research — which is on the chopping block in President Donald Trump’s latest proposed budget — is crucial to innovative public-private partnerships such as SpaceX. Daniel Baer, the former head of Colorado’s Department of Higher Education, says the state’s proposed increase in funding for colleges and universities is welcome news, but the state also needs to make sure higher ed is accessible to all. The managers of Patagonia’s stores in Denver and Boulder have a piece urging politicians to listen more to calls for public-lands access and protection. If you’re an Apple News user, good news! You can read The Colorado Sun in the app. On your phone, just head to coloradosun.com/applenews or search for “The Colorado Sun” and add us to your favorite sources. The Thing: The Colorado Historic Newspapers Collection. Like a time machine that can instantly transport you to Colorado’s earliest years, the collection is one of the best places to read the first drafts of our state’s history. And, even better, it’s free and online at www.coloradohistoricnewspapers.org. The collection is a service of the Colorado State Library, which has scanned more than 1.4 million newspaper pages from more than 335 publications — starting before Colorado was even a state. That means you can read how the Denver Daily Times, on the day after Colorado gained statehood in 1876, immediately began touting us as a swing state — “It is an extraordinary situation that a people should be called upon so early to decide, by their own votes, matters of so great interest.” Or, less amusing, how the Rocky Mountain News covered the Sand Creek Massacre — “The Savages Dispersed!” declared the headline. If you have a research project that you’re working on or you just have a curious itch to scratch, it may become your new favorite website. REMINDER: If you have something that you just can’t stop raving about that you’d like to share, send us an email at [email protected] and you could be published in a future Sunriser! Hey, congratulations on making it to the bottom of a loooong Sunriser! Add that to your growing list of accomplishments for the week and don’t forget to tell your friends, family, co-workers and co-conspirators about The Sun. Thanks for reading!
# -*- coding: utf-8 -*-
from __future__ import unicode_literals

from settings import FASTIR_ROOT
import os
import sys
import inspect
import importlib
import pkgutil


def _list_packages():
    directories = []
    lib_dir = FASTIR_ROOT
    if lib_dir.endswith('.zip'):
        lib_dir = lib_dir[0:-4]
    for root, dirnames, filenames in os.walk(lib_dir):
        directories = dirnames
        break
    return directories


def _iter_modules(packages):
    for p in packages:
        imports = []
        try:
            for path_import in __import__(p).__path__:
                imports.append(path_import.replace('.zip', ''))
        except ImportError:
            pass

        # Workaround to detect imports when used as a binary.
        # The issue comes from FrozenImporter in pyinstaller.
        # Snippet from https://github.com/webcomics/dosage/blob/master/dosagelib/loader.py
        if getattr(sys, 'frozen', False):
            modules = []
            importers = map(pkgutil.get_importer, imports)
            toc = set()
            for i in importers:
                if hasattr(i, 'toc'):
                    toc |= i.toc
            for elm in toc:
                modules.append(elm)
            for module in modules:
                if 'psutil' not in module and not module.endswith('ext') and module.startswith(p):
                    yield importlib.import_module(module)
        # Normal behavior.
        else:
            for importer, modname, ispkg in pkgutil.iter_modules(imports):
                # quick fix for winXP
                if 'psutil' not in p and not modname.endswith('ext'):
                    yield importlib.import_module(p + '.' + modname)


def load_classes(module, os_name, release):
    for name, class_to_load in inspect.getmembers(module, inspect.isclass):
        if name.find(os_name + 'All') != -1:
            yield class_to_load
        elif name.find(os_name + release) != -1:
            yield class_to_load


def load_modules(filters, output_dir):
    directories = _list_packages()
    __filter_packages(filters, directories, output_dir)
    return _iter_modules(directories)


def list_packages(filters, os_name, release):
    """List available and activated packages"""
    result = {}
    packages = _list_packages()
    copy = packages[:]
    for p in copy:
        if p.find('.') == 0:
            packages.remove(p)
    activated_packages = list(packages)
    activated_packages = __filter_packages(filters, activated_packages, '')
    for module in _iter_modules(activated_packages):
        classes = load_classes(module, os_name, release)
        for cl in classes:
            activated = False
            if module.__package__ in activated_packages:
                activated = True
            result[module.__package__] = activated
            break
    return result


def __filter_packages(modules, directories, output_dir):
    # Remove 'dump' and 'filecatcher' if they are not explicitly specified
    for m in ['dump', 'filecatcher']:
        if m in directories and m not in modules:
            directories.remove(m)

    # Remove everything that is not a valid CE package
    copy = directories[:]
    for d in copy:
        if d.find('.') == 0 or d.startswith('_') or d == output_dir:
            directories.remove(d)

    # Remove everything not specified in modules, unless 'fast' is specified
    if 'fast' not in modules:
        copy = directories[:]
        for d in copy:
            if d not in modules:
                directories.remove(d)

    # If dump is specified, put it in first position
    if 'dump' in directories:
        directories.remove('dump')
        directories.insert(0, 'dump')

    return directories
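A hedged sketch of how these helpers compose; the filter list, OS name, and release are illustrative assumptions, and the package layout is whatever FASTIR_ROOT provides:

# Hypothetical driver: discover the activated packages, import their
# modules, and collect the classes matching the current OS and release.
filters = ['dump', 'memory']   # packages explicitly requested (assumption)
output_dir = 'output'          # excluded from package discovery

for module in load_modules(filters, output_dir):
    for cls in load_classes(module, 'Windows', '7'):
        print('loaded:', module.__name__, cls.__name__)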
The Great Rotation Or “Greater Fuel Theory”? One of the rationales we’ve heard from time to time for market bullishness since we were babes-in-the-wood investors is the “mountain of money” argument. “Cash on the sidelines”, as it’s also known. We’ve heard it at major market peaks and major market troughs. It’s convenient, it sounds good and every once in a while may even be true. But there are so many economic dynamics affecting cash holdings and how invested positions are analyzed that no one macro argument dominates one way or the other. A new chant for 2013 on Wall Street is what has been called “The Great Rotation”. The argument runs that money is finally leaving the bubble that is bonds and is finding, and will continue to find, its way increasingly into equities. Now before venturing even one step further into this discussion, let us be clear. Starting from where we stand today and looking over the truly longer term (say 5-10 years), the mathematics of bonds leave a good deal to be desired. Personally, we think bond “mathematics” border on scary looking over the long term, especially set against ongoing Fed balance sheet and US Federal debt expansion. Alternatively, the mathematics of equities argue more positively for stock investment on a relative basis for anyone with even a modicum of time, patience, and a tolerance for short term price risk. Having said this, we want to look at a few metrics that suggest it may be a bit too early to argue the stampede is on in terms of this so-called “great rotation” from bonds to stocks that has gone mainstream as of late. We’ll get there eventually, and who knows, maybe in the not too distant future. But the starting gun has not gone off yet, despite what you might hear on the “news”. To set the stage: since 2009, equity mutual funds have been under consistent selling pressure. The common thinking is that the public has been liquidating their equity mutual fund holdings as stock prices have moved higher. In like manner, investment inflows into US bond mutual funds have registered record numbers, giving rise to the thinking that the general public moved to bond investments en masse after experiencing two gut-wrenching 50% stock bear markets in less than a decade’s time from 2000-2009. The reason for the “new new thing” mantra? Well, inflows to US equity funds so far in 2013 have been of a magnitude not seen in years. Let’s go to the metrics. Investment flows into equity mutual funds so far this year have seen a dramatic turnaround relative to comparable year-to-date inflows from 2012. But what seems left out of the discussion at present is the like-period YTD flows into bond funds. The following chart documents YTD flows for both 2013 and 2012 into US equity funds and ETFs (Exchange Traded Funds) on a combined basis, as well as collective investment flows to US bond funds and ETFs. If we consider flows into US equity mutual funds and equity ETFs on a YTD basis, we’re looking at a 135% increase in 2013 numbers relative to last year. Impressive, right? Yet when looking at the combined US bond mutual fund and US bond ETF flows, the year over year increase we’re looking at totals a paltry 2%. Put differently, YTD 2013 flows into US bond funds and ETFs are 61% larger than flows into US equity funds and ETFs as of mid-February. So the question becomes clear as per the data – where is the great rotation from bonds to stocks? 
Yes, equity funds are receiving greater investment inflows year over year in absolute terms, but flows into US bond funds YTD remain larger than into equities, and haven’t subsided. An inconvenient fact left out of a lot of recent “analysis”? It sure appears so. Again, the current “mathematics” of bond investment doesn’t make a lot of sense when looking out over any reasonable time frame. But we need to remember that, in the very near term, the flows we are seeing into equity mutual funds today are in large part driven by the tax anticipation behavior we saw in Q4 of last year, and specifically in December. In very interesting behavior, YTD 2013 flows into US equity ETFs are actually down substantially relative to the 2012 YTD experience. Odd, in that US equity ETFs have been attracting positive flows at the expense of US equity mutual funds for years now. Why would this switch be happening? Again, it may have roots in Q4 2012 and December ‘12 events. Remember that in Q4 2012, and December specifically, we experienced a massive anomaly in accelerated dividend payments by many corporations. From Las Vegas Sands to Costco and well beyond, billions were paid out to equity investors in “special” common stock dividends, and of course a portion of this clearly floated through equity mutual funds, winding up as money market fund cash in mutual fund accounts. As we move into 2013, are we simply seeing this money return to its rightful home in equities and equity funds, exactly where it was invested a few months ago? We think so. Combine this with 2012 end-of-year bonuses, earned income accelerated into 2012 to beat the taxman, early-year qualified plan contributions, etc., and all of a sudden the newfound resurgence of money flows to equity funds looks a lot more explainable. Moreover, the very numbers themselves do not show bond fund/ETF liquidations – quite the opposite. So, this great rotation from bonds to stocks may come to be, but it has not started yet. Tangentially related to current period “analysis” and this great rotation thinking (which we would suggest has been less than thoughtful or thorough) is commentary regarding the positioning of large institutional pools of capital. It was a month back that we saw an article in Barron’s proclaiming large pension funds to be only 34% invested in stocks. We saw a similar number thrown out by a fund manager in Barron’s again just a few weeks ago. Of course the implication is that roughly two-thirds of institutional pension assets are invested in bonds. Nothing could be further from the truth, but it sounds good in printed media and on TV, right? Just where are these numbers coming from? Luckily, you can find them quite conveniently in the quarterly Fed Flow of Funds report published on the Fed’s own website. So let’s have a look at the raw data and see if perhaps there are a few “unanswered” questions these analysts have forgotten to include in their commentaries. Below is the current asset class breakdown of US Private (think corporate) Pension Fund investments. As you can see, right there is that 34.2% allocation to stocks, exactly as these commentators have described. Private pension fund bond holdings are broken out in detail under “credit market instruments”. The numbers are further broken down explicitly among Treasuries, corporate bonds, Mortgage Backed Securities and Government agency bond holdings. This apparently low allocation to equities actually has been consistent for over a decade now. It did not all of a sudden drop post 2008. 
The real issue is that the Fed Flow of Funds report does not delineate just what is inside the “mutual funds” categorization. The superficial analysis you have been treated to as of late implies/assumes it’s in bonds, just waiting to “rotate” into stocks. Let’s remember that private pension funds have been much more aggressive than their public pension fund counterparts over the decades as they shifted allocations to “alternative investments” (e.g. private equity, commercial real estate, venture capital, etc.). Moreover, private pension funds pay well below bond mutual fund fees to have their individual bond allocations managed, so they wouldn’t have a large investment allocation to bond mutual funds. The quoted analysts touting this great rotation thesis have been using selective statistics. While not broken out, private pension funds have been and continue to be heavily invested in private equity “funds”, hedge “funds”, commodity “funds” and institutional commercial real estate “funds”, among a number of other alternative asset class categories. In large part, is this what we are looking at in this category of pension fund mutual fund holdings? Of course it is. Are these really the type of assets just waiting to “rotate into stocks”? Of course not. In many senses they are already there. So it’s obvious where the superficial analysis of the moment breaks down, despite that “rotation” story and those institutional asset allocation stats appearing in headlines and sound bites. As Stephen Colbert would suggest, it’s analytical truthiness. It sounds like the truth. It could be the truth. But… it’s not the truth. Before wrapping up, let’s look at Public Pension fund (think States and municipalities) asset allocations. You will not see the aforementioned analysts quoting these numbers. Why? Because they show the Public Pension funds are already 61% invested in equities. That also does not support the great rotation thesis. The numbers tell the story. And of course public pension fund assets are also exposed to alternative investments such as private equity, hedge fund, commercial real estate, commodity, etc., just to a lesser extent than their private pension fund brethren. The public and private pension assets in the US “rotated” a long time ago. So there you have it – just a few facts to contemplate amidst the cacophony of daily “noise” that is often financial market commentary. As always, we have to look under the hood in financial market and economic analysis. In summary, as our President would say, let me be clear. Current bond investment mathematics border on scary if you buy today and look long term. Equities have a relative advantage in a world where central bankers have eliminated the return component from principal-safe investments. Central bankers have flooded the global system with unprecedented conjured liquidity that could easily spark an equity “melt up”. We remember an analyst we respect saying a few years back that the public will not come back to equities until they make a new high. We concur. The public always chases the inflating asset… until it stops inflating, of course. Remember the old Wall Street adage – never confuse brains with a bull market. And here’s a new one for you – never confuse asset class rotation stories with central bank asset reflation (one of the Fed’s primary goals of quantitative easing is to get stock and real estate prices higher). Can we characterize our current circumstances as “The Greater Fuel Theory”? Time will tell.
from collections import OrderedDict

import wx
from pubsub import pub

from classes.ui import UIManager
from classes.ui import UIControllerObject
from classes.ui import UIViewObject
from app.pubsub import AUTO_TOPIC
from app.app_utils import GripyIcon

"""
Add(self, item, int proportion=0, int flag=0, int border=0,
    PyObject userData=None) -> wx.SizerItem

Appends a child item to the sizer.
"""
item_sizer_keys = ['proportion', 'flag', 'border', 'userData']

# """ __init__(self, Window parent, int id=-1, String label=EmptyString,
# Point pos=DefaultPosition, Size size=DefaultSize, long style=0,
# String name=StaticBoxNameStr) -> StaticBox """
static_box_keys = ['id', 'label', 'pos', 'size', 'style', 'name']

# """ __init__(self, Window parent, int id=-1, Point pos=DefaultPosition,
# Size size=DefaultSize, long style=wxTAB_TRAVERSAL|wxNO_BORDER,
# String name=PanelNameStr) -> Panel """
panel_keys = ['id', 'pos', 'size', 'style', 'name']

staticboxsizer_keys = ['orient']
boxsizer_keys = ['orient']
gridsizer_keys = ['rows', 'cols', 'vgap', 'hgap']
flexgridsizer_keys = ['rows', 'cols', 'vgap', 'hgap']
gridbagsizer_keys = ['vgap', 'hgap']

wx_statictext_keys = ['label']
wx_spinctrl_keys = ['id', 'value', 'pos', 'size', 'style', 'min', 'max',
                    'initial', 'name']
wx_textctrl_keys = ['id', 'value', 'pos', 'size', 'style', 'validator',
                    'name']
wx_choice_keys = ['id', 'value', 'pos', 'size', 'choices', 'style',
                  'validator', 'name']
wx_listbox_keys = ['id', 'value', 'pos', 'size', 'choices', 'style',
                   'validator', 'name']
wx_filepickerctrl_keys = ['id', 'path', 'message', 'wildcard', 'pos', 'size',
                          'style', 'validator', 'name']
wx_checkbox_keys = ['id', 'label', 'value', 'pos', 'size', 'style',
                    'validator', 'name']
wx_radiobutton_keys = ['id', 'label', 'pos', 'size', 'style', 'validator',
                       'name']

registered_widgets = {
    wx.StaticText: wx_statictext_keys,
    wx.SpinCtrl: wx_spinctrl_keys,
    wx.TextCtrl: wx_textctrl_keys,
    wx.Choice: wx_choice_keys,
    wx.ListBox: wx_listbox_keys,
    wx.FilePickerCtrl: wx_filepickerctrl_keys,
    wx.CheckBox: wx_checkbox_keys,
    wx.RadioButton: wx_radiobutton_keys
}

widget_special_keys = ['initial', 'widget_name', 'options', 'controller_uid']


def get_control_keys(control_class):
    """Return the accepted constructor keys for a registered widget class."""
    if control_class in registered_widgets:
        return registered_widgets.get(control_class)
    raise Exception('Unregistered class')


def pop_registers(keys, kwargs):
    """Split kwargs: entries named in keys are popped into a new dict."""
    ret = {}
    for key in keys:
        if kwargs.get(key) is not None:
            ret[key] = kwargs.pop(key)
    # for key in special_keys:
    #     if kwargs.get(key) is not None:
    #         ret[key] = kwargs.pop(key)
    return ret, kwargs


def pop_widget_registers(keys, kwargs):
    """Split kwargs into widget-constructor, special and remaining dicts."""
    # print('pop_widget_registers:', keys, kwargs)
    ctrl_dict = {}
    special_dict = {}
    for key in keys:
        if key in kwargs.keys():
            ctrl_dict[key] = kwargs.pop(key)
    for key in widget_special_keys:
        if key in kwargs.keys():
            special_dict[key] = kwargs.pop(key)
    return ctrl_dict, special_dict, kwargs


# TODO: Is it a GripyObject?
class EncapsulatedControl(object):
    """Wraps a wx control and publishes its value changes over pubsub."""

    def __init__(self, *args, **kwargs):
        self._trigger_func = None
        self._trigger_kwargs_keys = None
        parent = args[0]
        if not self._control_class in registered_widgets.keys():
            raise Exception('Unregistered class')
        special_kw = args[1]
        self.name = special_kw.get('widget_name')
        initial = special_kw.get('initial')
        options = special_kw.get('options', {})
        self._controller_uid = special_kw.get('controller_uid')
        self.control = self._control_class(parent, **kwargs)
        try:
            if options:
                self.set_options(options)
        except Exception as e:
            raise
        if initial is not None:
            self.set_value(initial)
        self.old_value = None

    def get_topic(self):
        UIM = UIManager()
        dialog = UIM.get(self._controller_uid)
        return self.name + '_widget_changed@' + dialog.view.get_topic()

    def set_trigger(self, func, *args):
        if not callable(func):
            raise Exception('A callable must be supplied.')
        self._trigger_func = func
        self._trigger_kwargs_keys = list(args)
        pub.subscribe(self.check_change, self.get_topic())

    def unset_trigger(self):
        if not callable(self._trigger_func):
            return None
        pub.unsubscribe(self.check_change, self.get_topic())
        func = self._trigger_func
        self._trigger_func = None
        keys = self._trigger_kwargs_keys
        self._trigger_kwargs_keys = None
        return func, keys

    def check_change(self, name, old_value, new_value):
        if not callable(self._trigger_func):
            return
        kwargs = {}
        if self._trigger_kwargs_keys:
            UIM = UIManager()
            dialog = UIM.get(self._controller_uid)
            for enc_ctrl_name in self._trigger_kwargs_keys:
                enc_control = dialog.view.get_object(enc_ctrl_name)
                try:
                    kwargs[enc_control.name] = enc_control.get_value()
                except:
                    raise
        self._trigger_func(name, old_value, new_value, **kwargs)

    def on_change(self, event):
        new_value = self.get_value()
        pub.sendMessage(self.get_topic(),
                        name=self.name,
                        old_value=self.old_value,
                        new_value=new_value
                        )
        self.old_value = new_value

    def set_options(self, options_dict=None):
        raise NotImplementedError()

    def set_value(self, value):
        raise NotImplementedError()

    def get_value(self):
        raise NotImplementedError()


class EncapsulatedChoice(EncapsulatedControl):
    _control_class = wx.Choice

    def __init__(self, *args, **kwargs):
        super(EncapsulatedChoice, self).__init__(*args, **kwargs)
        self.control.Bind(wx.EVT_CHOICE, self.on_change)

    def set_options(self, options_dict=None):
        self.control.Clear()
        self._map = options_dict
        if self._map is not None:
            if not isinstance(self._map, OrderedDict):
                self._map = OrderedDict(self._map)
            self.control.AppendItems(list(self._map.keys()))

    def set_value(self, value, event=False):
        if value is None:
            return
        if not isinstance(value, int):
            if not value in self._map.keys():
                raise Exception('')
            # dict views have no index() on Python 3; make a list first
            value = list(self._map.keys()).index(value)
        self.control.SetSelection(value)
        if event:
            self.on_change(None)

    def get_value(self):
        if not self._map:
            return None
        if self.control.GetSelection() == -1:
            return None
        return self._map[self.control.GetString(self.control.GetSelection())]

    def show(self):
        return self.control.Show()

    def hide(self):
        return self.control.Hide()

    def destroy(self):
        return self.control.Destroy()


class EncapsulatedRadioButton(EncapsulatedControl):
    _control_class = wx.RadioButton

    def __init__(self, *args, **kwargs):
        super(EncapsulatedRadioButton, self).__init__(*args, **kwargs)
        self.control.Bind(wx.EVT_RADIOBUTTON, self.on_change)

    def set_value(self, value):
        self.control.SetValue(value)

    def get_value(self):
        return self.control.GetValue()


class EncapsulatedCheckBox(EncapsulatedControl):
    _control_class = wx.CheckBox

    def __init__(self, *args, **kwargs):
        super(EncapsulatedCheckBox, self).__init__(*args, **kwargs)
        self.control.Bind(wx.EVT_CHECKBOX, self.on_change)

    def set_value(self, value):
        self.control.SetValue(value)

    def get_value(self):
        return self.control.GetValue()


class EncapsulatedTextCtrl(EncapsulatedControl):
    _control_class = wx.TextCtrl

    def __init__(self, *args, **kwargs):
        super(EncapsulatedTextCtrl, self).__init__(*args, **kwargs)
        self.control.Bind(wx.EVT_TEXT, self.on_change)

    def set_value(self, value):
        if value is None:
            self.control.SetValue(wx.EmptyString)
        else:
            self.control.SetValue(str(value))

    def get_value(self):
        return self.control.GetValue().strip()

    def disable(self):
        return self.control.Disable()

    def enable(self):
        return self.control.Enable()

    def hide(self):
        return self.control.Hide()

    def show(self):
        return self.control.Show()

    def destroy(self):
        return self.control.Destroy()


class EncapsulatedFilePickerCtrl(EncapsulatedControl):
    _control_class = wx.FilePickerCtrl

    def __init__(self, *args, **kwargs):
        try:
            super(EncapsulatedFilePickerCtrl, self).__init__(*args, **kwargs)
        except Exception as e:
            print(e)

    def set_value(self, value):
        self.control.SetPath(value)

    def get_value(self):
        return self.control.GetPath()


class EncapsulatedSpinCtrl(EncapsulatedControl):
    _control_class = wx.SpinCtrl

    def __init__(self, *args, **kwargs):
        super(EncapsulatedSpinCtrl, self).__init__(*args, **kwargs)
        self.control.Bind(wx.EVT_SPINCTRL, self.on_change)

    def set_value(self, value):
        if value is not None:
            # print('spin =', value, type(value))
            self.control.SetValue(value)

    def get_value(self):
        # print('spin:', self.control.GetValue())
        return self.control.GetValue()


class EncapsulatedStaticText(EncapsulatedControl):
    _control_class = wx.StaticText

    def __init__(self, *args, **kwargs):
        super(EncapsulatedStaticText, self).__init__(*args, **kwargs)

    def set_value(self, value):
        if value is not None:
            self.control.SetLabel(str(value))

    def get_value(self):
        return self.control.GetLabel()

    def hide(self):
        return self.control.Hide()

    def show(self):
        return self.control.Show()

    def destroy(self):
        return self.control.Destroy()


class EncapsulatedListBox(EncapsulatedControl):
    _control_class = wx.ListBox

    def __init__(self, *args, **kwargs):
        super(EncapsulatedListBox, self).__init__(*args, **kwargs)
        self.control.Bind(wx.EVT_LISTBOX, self.on_change)

    def set_value(self, value, event=True):
        self.control.Clear()
        if not value:
            self._map = None
        else:
            self._map = value
            self.control.AppendItems(list(self._map.keys()))
        # To force on_change
        if event:
            self.on_change(None)

    def get_value(self):
        if not self._map:
            return None
        if not self.control.GetSelections():
            return None
        return [self._map.get(self.control.GetString(sel))
                for sel in self.control.GetSelections()]


class PanelContainer(wx.Panel):
    """A wx.Panel that owns a sizer chosen via the 'sizer_class' kwarg."""

    def __init__(self, *args, **kwargs):
        # print('\nPanelContainer:', args, kwargs)
        if not kwargs.get('sizer_class'):
            raise Exception()
        sizer_class = kwargs.pop('sizer_class')
        panel_kw, sizer_kw = pop_registers(panel_keys, kwargs)
        wx.Panel.__init__(self, args[0], **panel_kw)
        try:
            sizer = sizer_class(**sizer_kw)
            self.SetSizer(sizer)
        except:
            raise


class BoxSizerContainer(PanelContainer):

    def __init__(self, *args, **kwargs):
        if not kwargs:
            kwargs = {
                'sizer_class': wx.BoxSizer,
                'orient': wx.VERTICAL
            }
        else:
            kwargs['sizer_class'] = wx.BoxSizer
            if not kwargs.get('orient'):
                kwargs['orient'] = wx.VERTICAL
            elif kwargs.get('orient') not in [wx.HORIZONTAL, wx.VERTICAL]:
                raise Exception()
        super().__init__(*args, **kwargs)


class GridSizerContainer(PanelContainer):

    def __init__(self, *args, **kwargs):
        if not kwargs:
            kwargs = {'sizer_class': wx.GridSizer}
        else:
            kwargs['sizer_class'] = wx.GridSizer
        super().__init__(*args, **kwargs)


class GridBagSizerContainer(PanelContainer):

    def __init__(self, *args, **kwargs):
        if not kwargs:
            kwargs = {'sizer_class': wx.GridBagSizer}
        else:
            kwargs['sizer_class'] = wx.GridBagSizer
        super().__init__(*args, **kwargs)


class FlexGridSizerContainer(PanelContainer):

    def __init__(self, *args, **kwargs):
        if not kwargs:
            kwargs = {'sizer_class': wx.FlexGridSizer}
        else:
            kwargs['sizer_class'] = wx.FlexGridSizer
        super().__init__(*args, **kwargs)


class WarpSizerContainer(PanelContainer):
    """Container for a wx.WrapSizer. The original 'Warp' spelling is kept in
    the class name and type string for compatibility, but wx itself has no
    WarpSizer class, so the real wx.WrapSizer is used here."""

    def __init__(self, *args, **kwargs):
        if not kwargs:
            kwargs = {
                'sizer_class': wx.WrapSizer,
                'orient': wx.VERTICAL
            }
        else:
            kwargs['sizer_class'] = wx.WrapSizer
            if not kwargs.get('orient'):
                kwargs['orient'] = wx.VERTICAL
            elif kwargs.get('orient') not in [wx.HORIZONTAL, wx.VERTICAL]:
                raise Exception()
        super().__init__(*args, **kwargs)


class StaticBoxContainer(wx.StaticBox):

    def __init__(self, *args, **kwargs):
        sbkw, kwargs = pop_registers(static_box_keys, kwargs)
        wx.StaticBox.__init__(self, args[0], **sbkw)
        if kwargs.get('orient') is None:
            orient = wx.VERTICAL
        else:
            orient = kwargs.pop('orient')
        self._sizer = wx.StaticBoxSizer(self, orient)

    def GetSizer(self):
        return self._sizer


###############################################################################
###############################################################################


class TopLevelController(UIControllerObject):
    tid = 'toplevel_controller'

    _ATTRIBUTES = OrderedDict()
    _ATTRIBUTES['title'] = {
        'default_value': wx.EmptyString,
        'type': str
    }
    # TODO: Use icon from App parameters
    _ATTRIBUTES['icon'] = {
        'default_value': 'basic/icons/logo-transp.ico',
        'type': str
    }
    _ATTRIBUTES['style'] = {
        'default_value': wx.DEFAULT_FRAME_STYLE,
        'type': int
    }
    _ATTRIBUTES['maximized'] = {
        'default_value': False,
        'type': bool
    }
    _ATTRIBUTES['size'] = {
        'default_value': wx.Size(800, 600),
        'type': wx.Size
    }
    _ATTRIBUTES['pos'] = {
        'default_value': wx.Point(50, 50),
        'type': wx.Point
    }

    def __init__(self, **state):
        super().__init__(**state)


class TopLevel(UIViewObject):
    tid = 'toplevel'

    def __init__(self, controller_uid):
        UIViewObject.__init__(self, controller_uid)
        # Registry of named EncapsulatedControls, used by register(),
        # get_results() and get_object() below.
        self._objects = OrderedDict()
        UIM = UIManager()
        controller = UIM.get(self._controller_uid)
        # MainWindow subscribing MainWindowController PubSub messages
        controller.subscribe(self._set_maximized, 'change.maximized')
        controller.subscribe(self._set_size, 'change.size')
        controller.subscribe(self._set_position, 'change.pos')
        controller.subscribe(self._set_title, 'change.title')
        # TODO: try to remove _flag using new GripyObject style
        # little hack - on_size
        # self._flag = False

    def on_maximize(self, event):
        UIM = UIManager()
        controller = UIM.get(self._controller_uid)
        controller.set_value_from_event('maximized', self.IsMaximized())

    def on_move(self, event):
        UIM = UIManager()
        controller = UIM.get(self._controller_uid)
        controller.set_value_from_event('pos', self.GetPosition())

    def on_size(self, event):
        UIM = UIManager()
        controller = UIM.get(self._controller_uid)
        controller.set_value_from_event('size', event.GetSize())
        controller.set_value_from_event('maximized', self.IsMaximized())
        event.Skip()

    def _set_maximized(self, new_value, old_value):
        self.Unbind(wx.EVT_MAXIMIZE, handler=self.on_maximize)
        self.Maximize(new_value)
        self.Bind(wx.EVT_MAXIMIZE, self.on_maximize)

    def _set_size(self, new_value, old_value):
        self.Unbind(wx.EVT_SIZE, handler=self.on_size)
        self.SetSize(new_value)
        self.Bind(wx.EVT_SIZE, self.on_size)

    def _set_position(self, new_value, old_value):
        self.Unbind(wx.EVT_MOVE, handler=self.on_move)
        self.SetPosition(new_value)
        self.Bind(wx.EVT_MOVE, self.on_move)

    def _set_title(self, new_value, old_value):
        self.SetTitle(new_value)

    # Containers
    def AddCreateContainer(self, container_type_name, *args, **kwargs):
        try:
            item_sizer_kw, kwargs = pop_registers(item_sizer_keys, kwargs)
            container = self.CreateContainer(container_type_name,
                                             *args, **kwargs)
            self.AddContainer(container, *args, **item_sizer_kw)
            return container
        except:
            raise

    def CreateContainer(self, container_type_name, *args, **kwargs):
        try:
            if container_type_name == 'BoxSizer':
                container_class = BoxSizerContainer
            elif container_type_name == 'GridSizer':
                container_class = GridSizerContainer
            elif container_type_name == 'FlexGridSizer':
                container_class = FlexGridSizerContainer
            elif container_type_name == 'GridBagSizer':
                container_class = GridBagSizerContainer
            elif container_type_name == 'StaticBox':
                container_class = StaticBoxContainer
            elif container_type_name == 'WarpSizer':
                container_class = WarpSizerContainer
            else:
                raise Exception('Unregistered container.')
            if not args:
                parent = self.mainpanel
            else:
                parent = args[0]
            container = container_class(parent, **kwargs)
            return container
        except:
            raise

    def AddContainer(self, container, *args, **kwargs):
        for key in kwargs.keys():
            if key not in item_sizer_keys:
                msg = 'Invalid container key. [key=\"{}\"]'.format(key)
                raise Exception(msg)
        if not args:
            parent = self.mainpanel
        else:
            parent = args[0]
        # container.Show()
        if container.__class__ == StaticBoxContainer:
            parent.GetSizer().Add(container.GetSizer(), **kwargs)
        else:
            parent.GetSizer().Add(container, **kwargs)
        parent.GetSizer().Layout()

    def DetachContainer(self, container):
        ctn_sizer = container.GetSizer()
        parent = container.GetParent()
        if container.__class__ == StaticBoxContainer:
            result = parent.GetSizer().Detach(ctn_sizer)
        else:
            result = parent.GetSizer().Detach(container)
        container.Show(False)
        return result

    # Controllers
    def _get_button(self, button_id):
        UIM = UIManager()
        controller = UIM.get(self._controller_uid)
        if button_id & controller.flags:
            return self.FindWindow(button_id)
        return None

    def enable_button(self, button_id, enable=True):
        btn = self._get_button(button_id)
        btn.Enable(enable)

    def register(self, enc_control):
        if enc_control.name:
            self._objects[enc_control.name] = enc_control

    def CreateControl(self, enc_class, container, **kwargs):
        # Create and Add a new control.
        try:
            keys = get_control_keys(enc_class._control_class)
            controlkw, specialkw, kwargs = pop_widget_registers(keys, kwargs)
            specialkw['controller_uid'] = self._controller_uid
            enc_control = enc_class(container, specialkw, **controlkw)
            self.register(enc_control)
            container.GetSizer().Add(enc_control.control, **kwargs)
            container.GetSizer().Layout()
        except:
            raise

    def AddChoice(self, *args, **kwargs):
        self.CreateControl(EncapsulatedChoice, args[0], **kwargs)

    def AddRadioButton(self, *args, **kwargs):
        self.CreateControl(EncapsulatedRadioButton, args[0], **kwargs)

    def AddCheckBox(self, *args, **kwargs):
        self.CreateControl(EncapsulatedCheckBox, args[0], **kwargs)

    def AddTextCtrl(self, *args, **kwargs):
        self.CreateControl(EncapsulatedTextCtrl, args[0], **kwargs)

    def AddFilePickerCtrl(self, *args, **kwargs):
        self.CreateControl(EncapsulatedFilePickerCtrl, args[0], **kwargs)

    def AddSpinCtrl(self, *args, **kwargs):
        self.CreateControl(EncapsulatedSpinCtrl, args[0], **kwargs)

    def AddStaticText(self, *args, **kwargs):
        self.CreateControl(EncapsulatedStaticText, args[0], **kwargs)

    def AddListBox(self, *args, **kwargs):
        self.CreateControl(EncapsulatedListBox, args[0], **kwargs)

    def get_results(self):
        ret = {}
        for name, widget in self._objects.items():
            ret[name] = widget.get_value()
        return ret

    def get_object(self, name):
        return self._objects.get(name)
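A minimal usage sketch of the dialog-builder API above (hypothetical: it assumes it runs inside this module, with a concrete TopLevel subclass whose mainpanel has already been wired up by the surrounding framework, which is not shown in this excerpt):

# Hypothetical usage of the builder methods defined on TopLevel above;
# 'view' stands for an instantiated TopLevel subclass from the framework.
def build_form(view):
    box = view.AddCreateContainer('BoxSizer', orient=wx.VERTICAL)
    view.AddStaticText(box, label='Well name:')
    view.AddTextCtrl(box, widget_name='well_name', initial='MyWell-1')
    view.AddChoice(box, widget_name='datum',
                   options={'Kelly Bushing': 'KB', 'Sea Level': 'SL'})
    # Later, e.g. on OK: a dict {widget_name: current value} for every
    # control that was registered with a widget_name.
    return view.get_results()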
We appreciate your views and your feedback. If you would like to leave us a review on a recent project, provide us with feedback or simply read what our customers have to say about us, please follow the link below – we'd be happy to hear from you.

WHAT IS DUST FREE SANDING? Exactly that! We have three different sized Festool sanders that connect to a dust extraction unit. They can be used on ceilings, walls and woodwork. Our dust free sanders are approximately 95% DUST FREE. Long gone are the days of dust covering your kitchen if we are completing a lot of sanding in your lounge! It gives a faster and far superior finish to conventional sanding. The health benefits are immense: we empty our extraction units on a daily basis and are amazed at how much dust they contain. If we were conventionally hand sanding, this dust would be inside your property, our lungs and yours.

While airless spraying is not suitable for all jobs due to furniture, carpets, fixtures and fittings, it is ideal for larger, spacious areas. A perfect option for new build properties, as application is up to 70% faster than traditional brush or roller! This application method provides a flawless finish to ceilings, walls and all woodwork. All surfaces are protected by masking papers and films. Ideal for ornate coving and difficult-to-paint areas.
#!/usr/bin/python
# -*- coding: utf-8 -*-

from lib.meos import MEoS
from lib import unidades


class mXylene(MEoS):
    """Multiparameter equation of state for m-xylene"""
    name = "m-xylene"
    CASNumber = "108-38-3"
    formula = "C8H10"
    synonym = "1,3-dimethylbenzene"
    rhoc = unidades.Density(282.929725)
    Tc = unidades.Temperature(616.89)
    Pc = unidades.Pressure(3534.6, "kPa")
    M = 106.165  # g/mol
    Tt = unidades.Temperature(225.3)
    Tb = unidades.Temperature(412.214)
    f_acent = 0.326
    momentoDipolar = unidades.DipoleMoment(0.3, "Debye")
    id = 43

    Fi1 = {"ao_log": [1, 1.169909],
           "pow": [0, 1],
           "ao_pow": [12.652887, -0.45975624],
           "ao_exp": [4.44312, 2.862794, 24.83298, 16.26077],
           "titao": [160/Tc, 190/Tc, 1333/Tc, 3496/Tc]}

    helmholtz1 = {
        "__type__": "Helmholtz",
        "__name__": "Helmholtz equation of state for m-xylene of Zhou et al. (2012).",
        "__doi__": {"autor": "Zhou, Y., Lemmon, E.W., and Wu, J.",
                    "title": "Thermodynamic Properties of o-Xylene, m-Xylene, "
                             "p-Xylene, and Ethylbenzene",
                    "ref": "J. Phys. Chem. Ref. Data 41, 023103 (2012).",
                    "doi": "10.1063/1.3703506"},

        "R": 8.314472,
        "cp": Fi1,
        "ref": "OTO",

        "Tmin": Tt, "Tmax": 700.0, "Pmax": 200000.0, "rhomax": 8.677,
        "Pmin": 0.003123, "rhomin": 8.677,

        "nr1": [0.000012791017, 0.041063111, 1.505996, -2.3095875, -0.46969,
                0.171031],
        "d1": [8, 4, 1, 1, 2, 3],
        "t1": [1.0, 0.91, 0.231, 0.772, 1.205, 0.323],

        "nr2": [-1.001728, -0.3945766, 0.6970578, -0.3002876, -0.024311],
        "d2": [1, 3, 2, 2, 7],
        "t2": [2.7, 3.11, 0.768, 4.1, 0.818],
        "c2": [2, 2, 1, 2, 1],
        "gamma2": [1]*5,

        "nr3": [0.815488, -0.330647, -0.123393, -0.54661],
        "d3": [1, 1, 3, 3],
        "t3": [2.0, 2.9, 3.83, 0.5],
        "alfa3": [1.0244, 1.3788, 0.9806, 6.3563],
        "beta3": [1.66, 1.9354, 1.0323, 78],
        "gamma3": [1.1013, 0.6515, 0.4975, 1.26],
        "epsilon3": [0.713, 0.9169, 0.6897, 0.7245]}

    eq = helmholtz1,

    _surface = {"sigma": [0.0661], "exp": [1.29]}
    _vapor_Pressure = {
        "eq": 5,
        "ao": [-7.5635, 1.2857, -3.2346, -1.9018],
        "exp": [1.0, 1.5, 3.1, 5.6]}
    _liquid_Density = {
        "eq": 1,
        "ao": [0.43346, 3.8716, -3.0144, 1.619],
        "exp": [0.16, 0.6, 1.0, 1.5]}
    _vapor_Density = {
        "eq": 3,
        "ao": [-1.1597, -6.0358, -16.712, -45.482, -98.418],
        "exp": [0.26, 0.78, 2.6, 5.7, 11.7]}
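As a quick sanity check on those ancillary coefficients, here is a small standalone sketch (our own, not part of the class above) that evaluates the vapor-pressure correlation directly, assuming the usual Wagner-type form for the `"eq": 5` ancillary equation, ln(Pv/Pc) = (Tc/T) * sum(a_i * (1 - T/Tc)**e_i). Evaluated at the listed normal boiling point it should return roughly one atmosphere:

from math import exp

# Assumed Wagner-type form for the "eq": 5 vapor-pressure ancillary above.
Tc, Pc_kPa = 616.89, 3534.6
ao = [-7.5635, 1.2857, -3.2346, -1.9018]
ex = [1.0, 1.5, 3.1, 5.6]

def pv_kpa(T):
    """Saturation pressure in kPa from the ancillary coefficients."""
    theta = 1 - T / Tc
    s = sum(a * theta ** e for a, e in zip(ao, ex))
    return Pc_kPa * exp(Tc / T * s)

print(pv_kpa(412.214))  # ~101.3 kPa at Tb, i.e. one atmosphere, as expected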
With the super cold weather today, I’m craving a cup of hot chocolate, and what better thing to go with hot chocolate than some yummy macaroons. So I start looking for macaroon recipes and, lo and behold, I find the cutest macaroon making machine. Not only do you get to make the traditional circular ones, but you can make animal shaped macaroons. Available online at Strapaya World for about US$30, but I’m going to pop over to CitySuper to see if they have it tonight.

So, what's the latest on CitySuper having or not having the macaroon machine?

Yes please do update us on whether they sell them in Citysuper!

awwwwww..... but it would be one hell of a good xmas gift!
from __future__ import absolute_import, print_function

from django.core.urlresolvers import reverse

from sentry.models import (
    Activity, Group, GroupAssignee, GroupBookmark, GroupSeen, GroupStatus,
    GroupTagValue, Release
)
from sentry.testutils import APITestCase


class GroupDetailsTest(APITestCase):
    def test_simple(self):
        self.login_as(user=self.user)

        group = self.create_group()

        url = reverse('sentry-api-0-group-details', kwargs={
            'group_id': group.id,
        })
        response = self.client.get(url, format='json')

        assert response.status_code == 200, response.content
        assert response.data['id'] == str(group.id)
        assert response.data['firstRelease'] is None

    def test_with_first_release(self):
        self.login_as(user=self.user)

        group = self.create_group()
        release = Release.objects.create(
            project=group.project,
            version='1.0',
        )
        GroupTagValue.objects.create(
            group=group,
            project=group.project,
            key='sentry:release',
            value=release.version,
        )

        url = reverse('sentry-api-0-group-details', kwargs={
            'group_id': group.id,
        })
        response = self.client.get(url, format='json')

        assert response.status_code == 200, response.content
        assert response.data['id'] == str(group.id)
        assert response.data['firstRelease']['version'] == release.version


class GroupUpdateTest(APITestCase):
    def test_resolve(self):
        self.login_as(user=self.user)

        group = self.create_group()

        url = reverse('sentry-api-0-group-details', kwargs={
            'group_id': group.id,
        })
        response = self.client.put(url, data={
            'status': 'resolved',
        }, format='json')
        assert response.status_code == 200, response.content

        group = Group.objects.get(
            id=group.id,
            project=group.project.id,
        )
        assert group.status == GroupStatus.RESOLVED

    def test_bookmark(self):
        self.login_as(user=self.user)

        group = self.create_group()

        url = reverse('sentry-api-0-group-details', kwargs={
            'group_id': group.id
        })
        response = self.client.put(url, data={
            'isBookmarked': '1',
        }, format='json')

        assert response.status_code == 200, response.content

        # ensure we've created the bookmark
        assert GroupBookmark.objects.filter(
            group=group, user=self.user).exists()

    def test_assign(self):
        self.login_as(user=self.user)

        group = self.create_group()

        url = reverse('sentry-api-0-group-details', kwargs={
            'group_id': group.id
        })
        response = self.client.put(url, data={
            'assignedTo': self.user.username,
        }, format='json')

        assert response.status_code == 200, response.content

        assert GroupAssignee.objects.filter(
            group=group, user=self.user
        ).exists()

        assert Activity.objects.filter(
            group=group, user=self.user, type=Activity.ASSIGNED,
        ).count() == 1

        response = self.client.put(url, format='json')

        assert response.status_code == 200, response.content

        assert GroupAssignee.objects.filter(
            group=group, user=self.user
        ).exists()

        response = self.client.put(url, data={
            'assignedTo': '',
        }, format='json')

        assert response.status_code == 200, response.content

        assert not GroupAssignee.objects.filter(
            group=group, user=self.user
        ).exists()

    def test_mark_seen(self):
        self.login_as(user=self.user)

        group = self.create_group()

        url = reverse('sentry-api-0-group-details', kwargs={
            'group_id': group.id
        })
        response = self.client.put(url, data={
            'hasSeen': '1',
        }, format='json')

        assert response.status_code == 200, response.content

        assert GroupSeen.objects.filter(
            group=group, user=self.user).exists()

        response = self.client.put(url, data={
            'hasSeen': '0',
        }, format='json')

        assert response.status_code == 200, response.content

        assert not GroupSeen.objects.filter(
            group=group, user=self.user).exists()

    def test_mark_seen_as_non_member(self):
        user = self.create_user('foo@example.com', is_superuser=True)
        self.login_as(user=user)

        group = self.create_group()

        url = reverse('sentry-api-0-group-details', kwargs={
            'group_id': group.id
        })
        response = self.client.put(url, data={
            'hasSeen': '1',
        }, format='json')

        assert response.status_code == 200, response.content

        assert not GroupSeen.objects.filter(
            group=group, user=self.user).exists()


class GroupDeleteTest(APITestCase):
    def test_delete(self):
        self.login_as(user=self.user)

        group = self.create_group()

        url = reverse('sentry-api-0-group-details', kwargs={
            'group_id': group.id
        })
        with self.tasks():
            response = self.client.delete(url, format='json')
        assert response.status_code == 202, response.content

        group = Group.objects.filter(id=group.id).exists()
        assert not group
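For context, a rough sketch of how an external client would hit the endpoint these tests exercise (hypothetical throughout: the base URL, the era's /api/0/groups/<id>/ route, the group id, and the auth header are assumptions, not taken from the test file):

import requests

# Hypothetical client call mirroring GroupUpdateTest.test_resolve above.
BASE = 'https://sentry.example.com'             # assumed deployment URL
headers = {'Authorization': 'Bearer <token>'}   # assumed auth scheme

resp = requests.put(
    '{}/api/0/groups/{}/'.format(BASE, 12345),  # group id is illustrative
    json={'status': 'resolved'},
    headers=headers,
)
assert resp.status_code == 200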
The personal representative or trustee of a decedent’s estate is required to review, assess, and resolve any claims that creditors have against the estate, as well as collect any debts owed to the decedent. If there are disputes regarding liability, specific amounts owed, availability of liability insurance, or counterclaims, settling the estate can become complicated and delayed. The team at Barron, Rosenberg, Mayoras & Mayoras helps clients resolve matters of debt in an estate. Call (248) 494-4577 now for legal assistance with Troy estate and probate matters.

For advice on how to resolve debtor-creditor disputes in Michigan probate courts, look no further than BRMM’s commercial litigation attorneys. Probate court jurisdiction has broadened since Michigan passed its probate code in 2000, meaning most debtor-creditor disputes can be litigated in either probate court or circuit court. Our team can help clients determine which forum would be more advantageous for their particular case.

Personal representatives owe a fiduciary duty to heirs and devisees to protect the estate by liquidating its claims for the greatest value practicable under the circumstances and by making sure that the claims paid reflect genuine obligations in correct or properly compromised amounts. If you need advice about third-party claims against or in favor of a Michigan probate estate, contact BRMM today. Call (248) 494-4577 now for a free consultation with BRMM.
from math import sqrt, log

import numpy as np
from scipy import stats
from matplotlib import pyplot


def here_cca(y, x):
    """Bartlett chi-square statistic for the maximum canonical correlation
    between a response matrix y (N x m) and a single regressor x (N,)."""
    N = y.shape[0]
    X, Y = np.matrix(x.T).T, np.matrix(y)
    Z = np.matrix(np.ones(N)).T
    Rz = np.eye(N) - Z * np.linalg.inv(Z.T * Z) * Z.T
    XStar = Rz * X  # regressor and responses with the nuisance
    YStar = Rz * Y  # (intercept) effect removed
    p, r = 1.0, 1.0  # nContrasts, nNuisanceFactors
    m = N - p - r
    H = YStar.T * XStar * np.linalg.inv(XStar.T * XStar) * XStar.T * YStar / p
    W = YStar.T * (np.eye(N)
                   - XStar * np.linalg.inv(XStar.T * XStar) * XStar.T) * YStar / m
    # estimate maximum canonical correlation:
    F = np.linalg.inv(W) * H
    ff = np.linalg.eigvals(F)
    fmax = float(np.real(ff.max()))
    r2max = fmax * p / (m + fmax * p)
    rmax = sqrt(r2max)
    # compute test statistic (Bartlett's approximation):
    #   x2 = -(N - 1 - (m + 2)/2) * ln(1 - rmax**2), m = number of responses
    p, m = float(N), float(y.shape[1])
    x2 = -(p - 1 - 0.5 * (m + 2)) * log(1 - rmax ** 2)
    return x2


# (0) Set parameters:
np.random.seed(0)
nResponses = 20
nComponents = 3
nIterations = 1000
W0 = np.eye(nComponents)
# derived parameters:
df = nComponents
x = np.linspace(0, 1, nResponses)  # independent variable

# (1) Generate Gaussian data and compute test statistic:
X2 = []
for i in range(nIterations):
    y = np.random.multivariate_normal(np.zeros(nComponents), W0, nResponses)
    chi2 = here_cca(y, x)
    X2.append(chi2)
X2 = np.asarray(X2)

# (2) Survival functions:
heights = np.linspace(3, 12, 21)
sf = np.array([(X2 > h).mean() for h in heights])
sfE = stats.chi2.sf(heights, df)

# (3) Plot results:
pyplot.close('all')
ax = pyplot.axes()
ax.plot(heights, sf, 'o', label='Simulated')
ax.plot(heights, sfE, '-', label='Theoretical')
ax.set_xlabel('$u$', size=20)
ax.set_ylabel(r'$P (\chi^2 > u)$', size=20)
ax.legend()
ax.set_title("CCA validation (0D)", size=20)
pyplot.show()
Recently, I held the “Hunger Games” at the library. I was incredibly excited to host a party for the book series that I loved, and the kids were super amped as well. We decided to hold it the day before the movie was to be released, May 22. Most of the kids that showed up to compete in the games were planning on attending the midnight premiere and had read the entire series.

A few months in advance, I visited the middle school to promote the event. I brought along a tri-fold board I had decorated and handed out lottery tickets I created. I do one of these promotional visits 2-3 times a year (spring, summer, fall). The lottery tickets trickled in slowly over the months, and I had a few kids pick up tickets they lost. All in all, I’m glad I took the board there a few months in advance. By allowing the kids to plan around the event, I ended up having more kids attend!

I held the event from 3:30-5:30, and I wish I had had 15-30 more minutes. I had so much planned, and getting a little behind caused some stress in the end. The kids still had a blast, but I felt like I was rushing through things and we still went over.

The kids were instructed to make a token to represent each of their Districts, which they would be allowed to wear in the competition, and were given 20-25 minutes to create it. It was really neat to see all of the creativity of the kids. The boys from District 4 made fish-inspired bracelets, the girls from District 5 made little lightning bolt pins, and District 2 made rings with sequins as gemstones. Their explanation was “Well, we’re miners, so we would probably find gems while we worked,” which I thought was brilliant.

Each team was asked 5 questions (a possible 14 pts), but they wanted to continue with all the trivia even after the fact. It was here that I spent too much time and ended up rushing. The kids were just having so much fun with the trivia that I didn’t want to blow through it.

District 2 using some engineering.

The building challenge proved to be easy for some, but others struggled with the construction of their structure. One of the Districts gave up halfway through because they broke their tower (on purpose, I believe). District 2 was actually made up of two girls who compete in a science/math/engineering challenge each year (Science Olympiad), and they were folding paper into triangles in order to provide more support for their overall structure! The participants all did a wonderful job and I was so proud of all their hard work. We listened to music while they worked, and a few of them were dancing around while building; it was a lot of fun.

District 4 trying to figure out what will work best.

After they all tried to put as many mockingjay eggs (plastic eggs) on their structure as possible, we measured for height and totaled each one. The winning team had the shortest structure, but it held the most eggs. There was a bit of a disagreement about whether the winning structure should count, since a lot of them thought you HAD to build a tower. If you do something similar, I would suggest setting building parameters.

District 3 working on building the tallest structure.

After they built their structures, they needed to fend off some Tracker Jackers that were interested in their food supply. I purchased “throwing knives” (suction cup throwers) and used a Tic-Tac-Throw bean bag game we had in our basement. Instead of X’s and O’s, I taped on Tracker Jackers (X’s) and Nightlock Berries (O’s). The blank side had an image of a silver parachute.
The object was to get as many parachutes as possible in 10 throws. If you had more than 4 nightlock berries turned, you died (poisoned) and received zero points. For each Tracker Jacker you turned, you lost one point. The kids had a blast trying to get all the parachutes turned! When all the Districts had gone, they got to try their hand at throwing knives. They aimed for a red plastic target and were awarded 10 points for every throw that stuck for a few seconds.

Don't ask me why the photo is all screwy; I don't know why I took it like this.

The Districts were pretty banged up and bruised after foraging for eggs, fighting Tracker Jackers and competing against each other. They were sent to the triage unit, where they had to bandage up their teammates. Each team received two rolls of toilet paper, with which they had to heal their partner.

Wrapping up a hurt comrade.

The first team to heal all their partner’s injuries was awarded 15 points. Some of the kids really got into it, drawing stitches and blood stains on the toilet paper bandages. Finally, their last challenge required a good amount of teamwork. For the “Team Challenge” they had to reassemble a picture of the mockingjay symbol. To me, this was the most difficult challenge, as I had printed the images in black and white. The kids soldiered through, however, and District 2 finished with a HUGE lead.

For the refreshments, I provided Sleep Syrup (red punch), Mellark Bakery Bread (cinnamon raisin bread), Capitol Cupcakes (chocolate cupcakes w/ orange icing), Cornucopia Feast (a cornucopia filled with candy) and Thirst Quenchers (bottled water). Each participant was able to take home a “Silver Parachute,” which was actually a gold lamp shade (I told them they were limited edition Capitol parachutes) with a goody bag attached. Inside the goody bag, they received a flare (glow stick), energy bites (Jolly Ranchers), a Hob Voucher (free book coupon), a handful of misc. candy and a Hunger Games bookmark. I also had a table full of withdrawn YA books that I labeled “The Hob Book Exchange.” Each kid was able to take a free book with them as well.

The goody bags each participant was able to take home!

I thought the program went fabulously, and the kids were all excited the next day at school. My mom told me that they were recounting to her the different challenges and how well they had done. I would call that a success!
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
"""Test convert op layout pass"""
import tvm
from tvm import te

from tvm import relay
from tvm.relay.op import register_alter_op_layout
from tvm.relay import transform, analysis


def run_opt_pass(expr, passes):
    passes = passes if isinstance(passes, list) else [passes]
    mod = tvm.IRModule.from_expr(expr)
    seq = tvm.transform.Sequential(passes)
    with tvm.transform.PassContext(opt_level=3):
        mod = seq(mod)
    entry = mod["main"]
    return entry if isinstance(expr, relay.Function) else entry.body


def test_no_convert_layout():
    def before():
        x = relay.var("x", shape=(1, 64, 56, 56))
        weight = relay.var("weight", shape=(64, 64, 3, 3))
        y = relay.nn.conv2d(x, weight, channels=64, kernel_size=(3, 3), padding=(1, 1))
        y = relay.nn.relu(y)
        y = relay.Function([x, weight], y)
        return y

    def expected():
        return before()

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": ["NCHW", "default"]}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_qnn_binary_no_convert_layout():
    def before():
        x = relay.var("x", shape=(2, 2))
        y = relay.var("y", shape=(1, 2))
        return relay.Function(
            [x, y],
            relay.qnn.op.add(
                x,
                y,
                lhs_scale=relay.const(0.0156863, "float32"),
                lhs_zero_point=relay.const(127, "int32"),
                rhs_scale=relay.const(0.0117647, "float32"),
                rhs_zero_point=relay.const(85, "int32"),
                output_scale=relay.const(0.0235294, "float32"),
                output_zero_point=relay.const(128, "int32"),
            ),
        )

    def expected():
        return before()

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_conv_convert_layout():
    def before():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight = relay.var("weight", shape=(3, 3, 64, 64))
        y = relay.nn.conv2d(
            x,
            weight,
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y = relay.nn.relu(y)
        y = relay.Function([x, weight], y)
        return y

    def expected():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight = relay.var("weight", shape=(3, 3, 64, 64))
        x = relay.layout_transform(x, "NHWC", "NCHW")
        weight = relay.layout_transform(weight, "HWIO", "OIHW")
        y = relay.nn.conv2d(x, weight, channels=64, kernel_size=(3, 3), padding=(1, 1))
        y = relay.nn.relu(y)
        y = relay.layout_transform(y, "NCHW", "NHWC")
        y = relay.Function(relay.analysis.free_vars(y), y)
        return y

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": ["NCHW", "default"]}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_conv_nhwc_convert_layout():
    def before():
        x = relay.var("x", shape=(1, 64, 56, 56))
        weight = relay.var("weight", shape=(64, 64, 3, 3))
        y = relay.nn.conv2d(
            x,
            weight,
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NCHW",
            kernel_layout="OIHW",
        )
        y = relay.nn.relu(y)
        y = relay.Function([x, weight], y)
        return y

    def expected():
        x = relay.var("x", shape=(1, 64, 56, 56))
        weight = relay.var("weight", shape=(64, 64, 3, 3))
        x = relay.layout_transform(x, "NCHW", "NHWC")
        weight = relay.layout_transform(weight, "OIHW", "HWIO")
        y = relay.nn.conv2d(
            x,
            weight,
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y = relay.nn.relu(y)
        y = relay.layout_transform(y, "NHWC", "NCHW")
        y = relay.Function(relay.analysis.free_vars(y), y)
        return y

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": ["NHWC", "default"]}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_conv_transpose_convert_layout():
    def before():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight = relay.var("weight", shape=(3, 3, 64, 64))
        y = relay.nn.conv2d_transpose(
            x,
            weight,
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y = relay.nn.relu(y)
        y = relay.Function([x, weight], y)
        return y

    def expected():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight = relay.var("weight", shape=(3, 3, 64, 64))
        x = relay.layout_transform(x, "NHWC", "NCHW")
        weight = relay.layout_transform(weight, "HWIO", "OIHW")
        y = relay.nn.conv2d_transpose(x, weight, channels=64, kernel_size=(3, 3), padding=(1, 1))
        y = relay.nn.relu(y)
        y = relay.layout_transform(y, "NCHW", "NHWC")
        y = relay.Function(relay.analysis.free_vars(y), y)
        return y

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d_transpose": ["NCHW", "OIHW"]}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_conv_bias_pool_convert_layout():
    def before():
        x = relay.var("x", shape=(1, 56, 56, 64))
        bias = relay.var("bias", shape=(64,))
        weight = relay.var("weight", shape=(3, 3, 64, 64))
        y = relay.nn.conv2d(
            x,
            weight,
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y = relay.nn.bias_add(y, bias, axis=3)
        # a useless tuple, which will be eliminated
        y = relay.Tuple([y])[0]
        y = relay.nn.relu(y)
        y = relay.nn.max_pool2d(y, pool_size=(2, 2), layout="NHWC")
        y = relay.cast(y, "int32")
        y = relay.nn.batch_flatten(y)
        y = relay.Function(analysis.free_vars(y), y)
        return y

    def expected():
        x = relay.var("x", shape=(1, 56, 56, 64))
        bias = relay.var("bias", shape=(64,))
        weight = relay.var("weight", shape=(3, 3, 64, 64))
        x = relay.layout_transform(x, "NHWC", "NCHW")
        weight = relay.layout_transform(weight, "HWIO", "OIHW")
        y = relay.nn.conv2d(x, weight, channels=64, kernel_size=(3, 3), padding=(1, 1))

        bias = relay.expand_dims(bias, axis=0, num_newaxis=3)
        bias = relay.layout_transform(bias, "NHWC", "NCHW")
        y = relay.add(y, bias)
        # a useless tuple, which will be eliminated
        y = relay.Tuple([y])[0]
        y = relay.nn.relu(y)
        y = relay.nn.max_pool2d(y, pool_size=(2, 2))
        y = relay.cast(y, "int32")
        y = relay.layout_transform(y, "NCHW", "NHWC")
        y = relay.nn.batch_flatten(y)
        y = relay.Function(analysis.free_vars(y), y)
        return y

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": ["NCHW", "default"]}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_conv_concat_convert_layout():
    def before():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight1 = relay.var("weight1", shape=(3, 3, 64, 64))
        weight2 = relay.var("weight2", shape=(3, 3, 64, 64))
        y = relay.nn.conv2d(
            x,
            weight1,
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y1 = relay.nn.conv2d(
            y,
            weight2,
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        ret = relay.concatenate([y, y1], axis=3)
        y = relay.Function(analysis.free_vars(ret), ret)
        return y

    def expected():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight1 = relay.var("weight1", shape=(3, 3, 64, 64))
        weight2 = relay.var("weight2", shape=(3, 3, 64, 64))
        weight1 = relay.layout_transform(weight1, "HWIO", "OIHW")
        weight2 = relay.layout_transform(weight2, "HWIO", "OIHW")
        y = relay.layout_transform(x, "NHWC", "NCHW")
        y = relay.nn.conv2d(y, weight1, channels=64, kernel_size=(3, 3), padding=(1, 1))
        y1 = relay.nn.conv2d(y, weight2, channels=64, kernel_size=(3, 3), padding=(1, 1))
        ret = relay.concatenate([y, y1], axis=1)
        ret = relay.layout_transform(ret, "NCHW", "NHWC")
        y = relay.Function(analysis.free_vars(ret), ret)
        return y

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": ["NCHW", "default"]}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_deformable_conv_bias_pool_convert_layout():
    def before(N, CI, H, W, CO, KH, KW, layout):
        if layout == "NCHW":
            data_shape = (N, CI, H, W)
            weight_shape = (CO, CI, KH, KW)
            kernel_layout = "OIHW"
        else:
            data_shape = (N, H, W, CI)
            weight_shape = (KH, KW, CI, CO)
            kernel_layout = "HWIO"
        bias_shape = (CO,)

        data = relay.var("data", shape=data_shape, dtype="float32")
        offset = relay.var("offset")
        weight = relay.var("weight", shape=weight_shape, dtype="float32")
        bias = relay.var("bias", shape=bias_shape, dtype="float32")

        y = relay.nn.deformable_conv2d(
            data,
            offset,
            weight,
            kernel_size=(KH, KW),
            channels=CO,
            data_layout=layout,
            kernel_layout=kernel_layout,
        )
        y = relay.nn.bias_add(y, bias, axis=-1 if layout == "NHWC" else 1)
        y = relay.nn.relu(y)
        y = relay.nn.max_pool2d(y, pool_size=(2, 2), layout=layout)
        y = relay.cast(y, "int32")
        y = relay.nn.batch_flatten(y)
        y = relay.Function(analysis.free_vars(y), y)
        return y

    def expected(N, CI, H, W, CO, KH, KW, OH, OW, src_layout, dst_layout):
        layout_map = {"src": {}, "dst": {}}
        if src_layout == "NCHW":
            nchw = layout_map["src"]
            nhwc = layout_map["dst"]
        else:
            nchw = layout_map["dst"]
            nhwc = layout_map["src"]

        nchw["data_layout"] = "NCHW"
        nchw["data_shape"] = (N, CI, H, W)
        nchw["offset_shape"] = (N, KH * KW * 2, OH, OW)
        nchw["weight_shape"] = (CO, CI, KH, KW)
        nchw["kernel_layout"] = "OIHW"

        nhwc["data_layout"] = "NHWC"
        nhwc["data_shape"] = (N, H, W, CI)
        nhwc["offset_shape"] = (N, OH, OW, KH * KW * 2)
        nhwc["weight_shape"] = (KH, KW, CI, CO)
        nhwc["kernel_layout"] = "HWIO"

        bias_shape = (CO,)

        data = relay.var("data", shape=layout_map["src"]["data_shape"], dtype="float32")
        offset = relay.var("offset", shape=layout_map["src"]["offset_shape"], dtype="float32")
        weight = relay.var("weight", shape=layout_map["src"]["weight_shape"], dtype="float32")
        bias = relay.var("bias", shape=bias_shape, dtype="float32")

        data = relay.layout_transform(
            data, layout_map["src"]["data_layout"], layout_map["dst"]["data_layout"]
        )
        offset = relay.layout_transform(
            offset, layout_map["src"]["data_layout"], layout_map["dst"]["data_layout"]
        )
        weight = relay.layout_transform(
            weight, layout_map["src"]["kernel_layout"], layout_map["dst"]["kernel_layout"]
        )
        y = relay.nn.deformable_conv2d(
            data,
            offset,
            weight,
            kernel_size=(KH, KW),
            channels=CO,
            data_layout=layout_map["dst"]["data_layout"],
            kernel_layout=layout_map["dst"]["kernel_layout"],
        )
        if layout_map["src"]["data_layout"] == "NHWC":
            bias = relay.expand_dims(bias, axis=0, num_newaxis=3)
        else:
            bias = relay.expand_dims(bias, axis=1, num_newaxis=2)
            bias = relay.expand_dims(bias, axis=0)
        bias = relay.layout_transform(
            bias, layout_map["src"]["data_layout"], layout_map["dst"]["data_layout"]
        )
        y = relay.add(y, bias)
        y = relay.nn.relu(y)
        y = relay.nn.max_pool2d(y, pool_size=(2, 2), layout=layout_map["dst"]["data_layout"])
        y = relay.cast(y, "int32")
        y = relay.layout_transform(
            y, layout_map["dst"]["data_layout"], layout_map["src"]["data_layout"]
        )
        y = relay.nn.batch_flatten(y)
        y = relay.Function(analysis.free_vars(y), y)
        return y

    # NHWC -> NCHW
    a = before(1, 3, 224, 224, 32, 3, 3, "NHWC")
    a = run_opt_pass(a, transform.ConvertLayout({"nn.deformable_conv2d": ["NCHW", "default"]}))
    b = run_opt_pass(
        expected(1, 3, 224, 224, 32, 3, 3, 222, 222, "NHWC", "NCHW"), transform.InferType()
    )
    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)

    # NCHW -> NHWC
    a = before(1, 3, 224, 224, 32, 3, 3, "NCHW")
    a = run_opt_pass(a, transform.ConvertLayout({"nn.deformable_conv2d": ["NHWC", "default"]}))
    b = run_opt_pass(
        expected(1, 3, 224, 224, 32, 3, 3, 222, 222, "NCHW", "NHWC"), transform.InferType()
    )
    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_dual_path_convert_layout():
    def before():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight1 = relay.var("weight1", shape=(3, 3, 64, 32))
        weight2 = relay.var("weight2", shape=(3, 3, 32, 32))
        y = relay.nn.conv2d(
            x,
            weight1,
            channels=32,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y = relay.nn.relu(y)
        y1 = relay.nn.conv2d(
            y,
            weight2,
            channels=32,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y1 = relay.nn.relu(y1)
        y2 = relay.nn.batch_flatten(y)
        ret = relay.Tuple([y1, y2])
        y = relay.Function(analysis.free_vars(ret), ret)
        return y

    def expected():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight1 = relay.var("weight1", shape=(3, 3, 64, 32))
        weight2 = relay.var("weight2", shape=(3, 3, 32, 32))
        weight1 = relay.layout_transform(weight1, "HWIO", "OIHW")
        weight2 = relay.layout_transform(weight2, "HWIO", "OIHW")
        y = relay.layout_transform(x, "NHWC", "NCHW")
        y = relay.nn.conv2d(y, weight1, channels=32, kernel_size=(3, 3), padding=(1, 1))
        y = relay.nn.relu(y)
        y1 = relay.nn.conv2d(y, weight2, channels=32, kernel_size=(3, 3), padding=(1, 1))
        y1 = relay.nn.relu(y1)
        y1 = relay.layout_transform(y1, "NCHW", "NHWC")
        y2 = relay.layout_transform(y, "NCHW", "NHWC")
        y2 = relay.nn.batch_flatten(y2)
        ret = relay.Tuple([y1, y2])
        y = relay.Function(analysis.free_vars(ret), ret)
        return y

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": ["NCHW", "default"]}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_bn_convert_layout():
    def before():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight1 = relay.var("weight1", shape=(3, 3, 64, 32))
        y = relay.nn.conv2d(
            x,
            weight1,
            channels=32,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        gamma = relay.var("gamma")
        beta = relay.var("beta")
        mean = relay.var("mean")
        variance = relay.var("variance")
        y, _, _ = relay.nn.batch_norm(y, gamma, beta, mean, variance, axis=3)
        return relay.Function(analysis.free_vars(y), y)

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": ["NCHW", "default"]}))

    # Check that there is only 1 NHWC to NCHW transform.
    has_lt = list()
    find_op = lambda x: has_lt.append(
        isinstance(x, tvm.relay.expr.Call)
        and x.op.name == "layout_transform"
        and x.attrs.src_layout == "NCHW"
        and x.attrs.dst_layout == "NHWC"
    )
    relay.analysis.post_order_visit(a, find_op)
    has_lt = list(filter(lambda x: x, has_lt))
    assert len(has_lt) == 1


def test_slice_like_convert_layout():
    def verify_slice_like(after, expected_axes):
        # Verify if the slice_like after the convert layout has the expected axes.
        has_expected = list()
        checker = lambda x: has_expected.append(
            isinstance(x, tvm.relay.expr.Call)
            and x.op.name == "slice_like"
            and str(x.attrs.axes) == str(expected_axes)
        )
        relay.analysis.post_order_visit(after, checker)
        assert any(has_expected)

    def func_nhwc():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight1 = relay.var("weight1", shape=(3, 3, 64, 32))
        y = relay.nn.conv2d(
            x,
            weight1,
            channels=32,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        out = relay.slice_like(y, y, axes=[1, 2])
        return relay.Function(analysis.free_vars(out), out)

    after = run_opt_pass(func_nhwc(), transform.ConvertLayout({"nn.conv2d": ["NCHW", "default"]}))
    verify_slice_like(after, [2, 3])

    def func_nchw():
        x = relay.var("x", shape=(1, 64, 56, 56))
        weight1 = relay.var("weight1", shape=(32, 64, 3, 3))
        y = relay.nn.conv2d(
            x,
            weight1,
            channels=32,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NCHW",
            kernel_layout="OIHW",
        )
        out = relay.slice_like(y, y, axes=[2, 3])
        return relay.Function(analysis.free_vars(out), out)

    after = run_opt_pass(func_nchw(), transform.ConvertLayout({"nn.conv2d": ["NHWC", "default"]}))
    verify_slice_like(after, [1, 2])

    def func_vars():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight1 = relay.var("weight1", shape=(3, 3, 64, 32))
        y = relay.nn.conv2d(
            x,
            weight1,
            channels=32,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        # z has no layout information so convert layout won't happen.
        z = relay.var("y", shape=(1, 56, 56, 32))
        out = relay.slice_like(y, z, axes=[1, 2])
        return relay.Function(analysis.free_vars(out), out)

    after = run_opt_pass(func_vars(), transform.ConvertLayout({"nn.conv2d": ["NCHW", "default"]}))
    verify_slice_like(after, [1, 2])


def test_transpose_convert_layout():
    def verify_transpose(after, expected_axes, expected_transform_cnt):
        # Verify if the transpose after the convert layout has the expected axes.
        has_expected = list()
        checker = lambda x: has_expected.append(
            isinstance(x, tvm.relay.expr.Call)
            and x.op.name == "transpose"
            and str(x.attrs.axes) == str(expected_axes)
        )
        relay.analysis.post_order_visit(after, checker)
        assert any(has_expected), after

        is_transform = list()
        checker = lambda x: is_transform.append(
            1 if isinstance(x, tvm.relay.expr.Call) and x.op.name == "layout_transform" else 0
        )
        relay.analysis.post_order_visit(after, checker)
        assert (
            sum(is_transform) == expected_transform_cnt
        ), "Expected %s layout_transform, but get\n%s" % (expected_transform_cnt, after)

    def nhwc_to_nchw():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight1 = relay.var("weight1", shape=(3, 3, 64, 32))
        y = relay.nn.conv2d(
            x,
            weight1,
            channels=32,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        z = relay.var("z", shape=(56, 56, 32))
        out = relay.add(y, z)
        out = relay.transpose(out, axes=[0, 3, 1, 2])
        out = relay.nn.batch_flatten(out)
        func = relay.Function(analysis.free_vars(out), out)
        return run_opt_pass(func, transform.ConvertLayout({"nn.conv2d": ["NCHW", "default"]}))

    verify_transpose(nhwc_to_nchw(), [0, 1, 2, 3], 3)

    def nchw_to_nhwc():
        x = relay.var("x", shape=(1, 64, 56, 56))
        weight1 = relay.var("weight1", shape=(32, 64, 3, 3))
        y = relay.nn.conv2d(
            x,
            weight1,
            channels=32,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NCHW",
            kernel_layout="OIHW",
        )
        z = relay.var("z", shape=(32, 56, 56))
        out = relay.add(y, z)
        out = relay.transpose(out, axes=[0, 2, -1, 1])  # Also test a negative axis.
        out = relay.nn.batch_flatten(out)
        func = relay.Function(analysis.free_vars(out), out)
        return run_opt_pass(func, transform.ConvertLayout({"nn.conv2d": ["NHWC", "default"]}))

    verify_transpose(nchw_to_nhwc(), [0, 1, 2, 3], 3)

    def default_axes():
        x = relay.var("x", shape=(1, 64, 56, 56))
        weight1 = relay.var("weight1", shape=(32, 64, 3, 3))
        y = relay.nn.conv2d(
            x,
            weight1,
            channels=32,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NCHW",
            kernel_layout="OIHW",
        )
        z = relay.var("z", shape=(32, 56, 56))
        out = relay.add(y, z)
        out = relay.transpose(out)  # No axes provided, will use the reversed axes.
        func = relay.Function(analysis.free_vars(out), out)
        return run_opt_pass(func, transform.ConvertLayout({"nn.conv2d": ["NHWC", "default"]}))

    verify_transpose(default_axes(), [2, 1, 3, 0], 3)


def test_resnet_convert_layout():
    def before():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight1 = relay.var("weight1", shape=(3, 3, 64, 32))
        weight2 = relay.var("weight2", shape=(1, 1, 64, 32))
        y = relay.nn.conv2d(
            x,
            weight1,
            channels=32,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y = relay.nn.relu(y)
        y2 = relay.nn.conv2d(
            x, weight2, channels=32, kernel_size=(1, 1), data_layout="NHWC", kernel_layout="HWIO"
        )
        y2 = relay.nn.relu(y2)
        y = y + y2
        y = relay.nn.global_max_pool2d(y, layout="NHWC")
        return relay.Function(analysis.free_vars(y), y)

    def expected():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight1 = relay.var("weight1", shape=(3, 3, 64, 32))
        weight2 = relay.var("weight2", shape=(1, 1, 64, 32))
        weight1 = relay.layout_transform(weight1, "HWIO", "OIHW")
        weight2 = relay.layout_transform(weight2, "HWIO", "OIHW")
        x = relay.layout_transform(x, "NHWC", "NCHW")
        y = relay.nn.conv2d(x, weight1, channels=32, kernel_size=(3, 3), padding=(1, 1))
        y = relay.nn.relu(y)
        y2 = relay.nn.conv2d(x, weight2, channels=32, kernel_size=(1, 1))
        y2 = relay.nn.relu(y2)
        y = y + y2
        y = relay.nn.global_max_pool2d(y)
        y = relay.layout_transform(y, "NCHW", "NHWC")
        return relay.Function(analysis.free_vars(y), y)

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": ["NCHW", "default"]}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_scalar_convert_layout():
    def before():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight = relay.var("weight", shape=(3, 3, 64, 64))
        y = relay.nn.conv2d(
            x,
            weight,
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y = relay.add(y, relay.const(1, "float32"))
        y = relay.Function(analysis.free_vars(y), y)
        return y

    def expected():
        x = relay.var("x", shape=(1, 56, 56, 64))
        w = relay.var("weight", shape=(3, 3, 64, 64))
        x = relay.layout_transform(x, "NHWC", "NCHW")
        w = relay.layout_transform(w, "HWIO", "OIHW")
        y = relay.nn.conv2d(x, w, channels=64, kernel_size=(3, 3), padding=(1, 1))
        y = relay.add(y, relay.const(1.0, "float32"))
        y = relay.layout_transform(y, "NCHW", "NHWC")
        y = relay.Function(analysis.free_vars(y), y)
        return y

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": ["NCHW", "default"]}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_conv_bn_convert_layout():
    """Check that layout transforms are propagated through bn."""

    def before():
        x = relay.var("x", shape=(1, 56, 56, 64))
        weight = relay.var("weight", shape=(3, 3, 64, 64))
        y = relay.nn.conv2d(
            x,
            weight,
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        dtype = "float32"
        beta = relay.var("beta", relay.TensorType((64,), dtype))
        gamma = relay.var("gamma", relay.TensorType((64,), dtype))
        moving_mean = relay.var("moving_mean", relay.TensorType((64,), dtype))
        moving_var = relay.var("moving_var", relay.TensorType((64,), dtype))
        y = relay.nn.batch_norm(y, gamma, beta, moving_mean, moving_var, axis=3)
        y = relay.nn.relu(y[0])
        y = relay.Function(analysis.free_vars(y), y)
        return y

    def expected():
        x = relay.var("x", shape=(1, 56, 56, 64))
        w = relay.var("weight", shape=(3, 3, 64, 64))
        x = relay.layout_transform(x, "NHWC", "NCHW")
        w = relay.layout_transform(w, "HWIO", "OIHW")
        y = relay.nn.conv2d(x, w, channels=64, kernel_size=(3, 3), padding=(1, 1))
        dtype = "float32"
        beta = relay.var("beta", relay.TensorType((64,), dtype))
        gamma = relay.var("gamma", relay.TensorType((64,), dtype))
        moving_mean = relay.var("moving_mean", relay.TensorType((64,), dtype))
        moving_var = relay.var("moving_var", relay.TensorType((64,), dtype))
        y = relay.nn.batch_norm(y, gamma, beta, moving_mean, moving_var, axis=1)
        y = relay.nn.relu(y[0])
        y = relay.layout_transform(y, "NCHW", "NHWC")
        y = relay.Function(analysis.free_vars(y), y)
        return y

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": ["NCHW", "default"]}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_qnn_conv_requantize_convert_layout():
    def before():
        x = relay.var("x", shape=(1, 56, 56, 64), dtype="int8")
        weight = relay.var("weight", shape=(3, 3, 64, 64), dtype="int8")
        y = relay.qnn.op.conv2d(
            x,
            weight,
            relay.const(1, "int32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "float32"),
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y = relay.qnn.op.requantize(
            y,
            relay.const(1, "float32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "int32"),
            out_dtype="int32",
        )
        y = relay.nn.relu(y)
        y = relay.Function([x, weight], y)
        return y

    def expected():
        x = relay.var("x", shape=(1, 56, 56, 64), dtype="int8")
        weight = relay.var("weight", shape=(3, 3, 64, 64), dtype="int8")
        x = relay.layout_transform(x, "NHWC", "NCHW")
        weight = relay.layout_transform(weight, "HWIO", "OIHW")
        y = relay.qnn.op.conv2d(
            x,
            weight,
            relay.const(1, "int32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "float32"),
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
        )
        y = relay.qnn.op.requantize(
            y,
            relay.const(1, "float32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "int32"),
            axis=1,
            out_dtype="int32",
        )
        y = relay.nn.relu(y)
        y = relay.layout_transform(y, "NCHW", "NHWC")
        y = relay.Function(relay.analysis.free_vars(y), y)
        return y

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"qnn.conv2d": ["NCHW", "default"]}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_qnn_conv_concat_convert_layout():
    def before():
        x = relay.var("x", shape=(1, 56, 56, 64), dtype="int8")
        weight1 = relay.var("weight1", shape=(3, 3, 64, 64), dtype="int8")
        weight2 = relay.var("weight2", shape=(3, 3, 64, 64), dtype="int8")
        y = relay.qnn.op.conv2d(
            x,
            weight1,
            relay.const(1, "int32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "float32"),
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y1 = relay.qnn.op.conv2d(
            y,
            weight2,
            relay.const(1, "int32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "float32"),
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y = relay.cast(y, "int8")
        y1 = relay.cast(y, "int8")
        ret = relay.qnn.op.concatenate(
            [y, y1],
            [relay.const(1, "float32"), relay.const(1, "float32")],
            [relay.const(1, "int32"), relay.const(1, "int32")],
            relay.const(1, "float32"),
            relay.const(1, "int32"),
            axis=3,
        )
        y = relay.Function(analysis.free_vars(ret), ret)
        return y

    def expected():
        x = relay.var("x", shape=(1, 56, 56, 64), dtype="int8")
        weight1 = relay.var("weight1", shape=(3, 3, 64, 64), dtype="int8")
        weight2 = relay.var("weight2", shape=(3, 3, 64, 64), dtype="int8")
        weight1 = relay.layout_transform(weight1, "HWIO", "OIHW")
        weight2 = relay.layout_transform(weight2, "HWIO", "OIHW")
        y = relay.layout_transform(x, "NHWC", "NCHW")
        y = relay.qnn.op.conv2d(
            y,
            weight1,
            relay.const(1, "int32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "float32"),
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
        )
        y1 = relay.qnn.op.conv2d(
            y,
            weight2,
            relay.const(1, "int32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "float32"),
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
        )
        y = relay.cast(y, "int8")
        y1 = relay.cast(y, "int8")
        ret = relay.qnn.op.concatenate(
            [y, y1],
            [relay.const(1, "float32"), relay.const(1, "float32")],
            [relay.const(1, "int32"), relay.const(1, "int32")],
            relay.const(1, "float32"),
            relay.const(1, "int32"),
            axis=1,
        )
        ret = relay.layout_transform(ret, "NCHW", "NHWC")
        y = relay.Function(analysis.free_vars(ret), ret)
        return y

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"qnn.conv2d": ["NCHW", "default"]}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_qnn_conv_add_convert_layout():
    def before():
        x = relay.var("x", shape=(1, 56, 56, 64), dtype="int8")
        weight1 = relay.var("weight1", shape=(3, 3, 64, 64), dtype="int8")
        weight2 = relay.var("weight2", shape=(3, 3, 64, 64), dtype="int8")
        y = relay.qnn.op.conv2d(
            x,
            weight1,
            relay.const(1, "int32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "float32"),
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y1 = relay.qnn.op.conv2d(
            y,
            weight2,
            relay.const(1, "int32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "float32"),
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y = relay.cast(y, "int8")
        y1 = relay.cast(y, "int8")
        ret = relay.qnn.op.add(
            y,
            y1,
            relay.const(1, "float32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "int32"),
        )
        y = relay.Function(analysis.free_vars(ret), ret)
        return y

    def expected():
        x = relay.var("x", shape=(1, 56, 56, 64), dtype="int8")
        weight1 = relay.var("weight1", shape=(3, 3, 64, 64), dtype="int8")
        weight2 = relay.var("weight2", shape=(3, 3, 64, 64), dtype="int8")
        weight1 = relay.layout_transform(weight1, "HWIO", "OIHW")
        weight2 = relay.layout_transform(weight2, "HWIO", "OIHW")
        y = relay.layout_transform(x, "NHWC", "NCHW")
        y = relay.qnn.op.conv2d(
            y,
            weight1,
            relay.const(1, "int32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "float32"),
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
        )
        y1 = relay.qnn.op.conv2d(
            y,
            weight2,
            relay.const(1, "int32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "float32"),
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
        )
        y = relay.cast(y, "int8")
        y1 = relay.cast(y, "int8")
        ret = relay.qnn.op.add(
            y,
            y1,
            relay.const(1, "float32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "int32"),
        )
        ret = relay.layout_transform(ret, "NCHW", "NHWC")
        y = relay.Function(analysis.free_vars(ret), ret)
        return y

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"qnn.conv2d": ["NCHW", "default"]}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_qnn_conv_nhwc_convert_layout():
    def before():
        x = relay.var("x", shape=(1, 64, 56, 56), dtype="int8")
        weight = relay.var("weight", shape=(64, 64, 3, 3), dtype="int8")
        y = relay.qnn.op.conv2d(
            x,
            weight,
            relay.const(1, "int32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "float32"),
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NCHW",
            kernel_layout="OIHW",
        )
        y = relay.nn.relu(y)
        y = relay.Function([x, weight], y)
        return y

    def expected():
        x = relay.var("x", shape=(1, 64, 56, 56), dtype="int8")
        weight = relay.var("weight", shape=(64, 64, 3, 3), dtype="int8")
        x = relay.layout_transform(x, "NCHW", "NHWC")
        weight = relay.layout_transform(weight, "OIHW", "HWIO")
        y = relay.qnn.op.conv2d(
            x,
            weight,
            relay.const(1, "int32"),
            relay.const(1, "int32"),
            relay.const(1, "float32"),
            relay.const(1, "float32"),
            channels=64,
            kernel_size=(3, 3),
            padding=(1, 1),
            data_layout="NHWC",
            kernel_layout="HWIO",
        )
        y = relay.nn.relu(y)
        y = relay.layout_transform(y, "NHWC", "NCHW")
        y = relay.Function(relay.analysis.free_vars(y), y)
        return y

    a = before()
    a = run_opt_pass(a, transform.ConvertLayout({"qnn.conv2d": ["NHWC", "default"]}))
    b = run_opt_pass(expected(), transform.InferType())

    assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a)


def test_conv_convert_kernel_layout():
    """
    Check that convolution kernel layout is correctly transformed.
""" def before(): x = relay.var("x", shape=(1, 56, 56, 64)) weight = relay.var("weight", shape=(3, 3, 64, 64)) y = relay.nn.conv2d( x, weight, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NHWC", kernel_layout="HWIO", ) y = relay.Function(analysis.free_vars(y), y) return y def expected(): x = relay.var("x", shape=(1, 56, 56, 64)) w = relay.var("weight", shape=(3, 3, 64, 64)) w = relay.layout_transform(w, "HWIO", "OHWI") y = relay.nn.conv2d( x, w, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NHWC", kernel_layout="OHWI", ) y = relay.Function(analysis.free_vars(y), y) return y a = before() a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": ["NHWC", "OHWI"]})) b = run_opt_pass(expected(), transform.InferType()) assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a) def test_conv_roi_align_convert_layout(): def before(): x = relay.var("x", shape=(1, 64, 56, 56)) weight1 = relay.var("weight1", shape=(64, 64, 3, 3)) y = relay.nn.conv2d( x, weight1, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NCHW", kernel_layout="OIHW", ) rois = relay.var("rois", shape=(32, 5)) y = relay.vision.roi_align( y, rois, pooled_size=(14, 14), spatial_scale=0.0625, sample_ratio=2, layout="NCHW" ) y = relay.Function(analysis.free_vars(y), y) return y def expected(): x = relay.var("x", shape=(1, 64, 56, 56)) weight1 = relay.var("weight1", shape=(64, 64, 3, 3)) x = relay.layout_transform(x, "NCHW", "NHWC") weight1 = relay.layout_transform(weight1, "OIHW", "HWIO") y = relay.nn.conv2d( x, weight1, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NHWC", kernel_layout="HWIO", ) rois = relay.var("rois", shape=(32, 5)) y = relay.vision.roi_align( y, rois, pooled_size=(14, 14), spatial_scale=0.0625, sample_ratio=2, layout="NHWC" ) ret = relay.layout_transform(y, "NHWC", "NCHW") y = relay.Function(analysis.free_vars(ret), ret) return y a = before() desired_layouts = { "nn.conv2d": ["NHWC", "HWIO"], "vision.roi_align": ["NHWC", "default"], } a = run_opt_pass(a, transform.ConvertLayout(desired_layouts)) b = run_opt_pass(expected(), transform.InferType()) assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a) def test_conv_strided_slice_convert_layout(): def before(): x = relay.var("x", shape=(1, 64, 56, 56)) weight = relay.var("weight", shape=(64, 64, 3, 3)) y = relay.nn.conv2d( x, weight, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NCHW", kernel_layout="OIHW", ) y = relay.nn.relu(y) y = relay.strided_slice(y, begin=[0, 1], end=[1, -1, 10], strides=[1, 1, 2, 1]) y = relay.Function([x, weight], y) return y def expected(): x = relay.var("x", shape=(1, 64, 56, 56)) weight = relay.var("weight", shape=(64, 64, 3, 3)) x = relay.layout_transform(x, "NCHW", "NHWC") weight = relay.layout_transform(weight, "OIHW", "HWIO") y = relay.nn.conv2d( x, weight, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NHWC", kernel_layout="HWIO", ) y = relay.nn.relu(y) y = relay.strided_slice(y, begin=[0, 0, 0, 1], end=[1, 10, 56, -1], strides=[1, 2, 1, 1]) y = relay.layout_transform(y, "NHWC", "NCHW") y = relay.Function(relay.analysis.free_vars(y), y) return y a = before() a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": ["NHWC", "default"]})) b = run_opt_pass(expected(), transform.InferType()) assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a) def test_conv_roi_pool_convert_layout(): def before(): x = relay.var("x", shape=(1, 64, 56, 56)) weight1 = relay.var("weight1", shape=(64, 64, 3, 3)) y = relay.nn.conv2d( x, weight1, 
channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NCHW", kernel_layout="OIHW", ) rois = relay.var("rois", shape=(32, 5)) y = relay.vision.roi_pool( y, rois, pooled_size=(14, 14), spatial_scale=0.0625, layout="NCHW" ) y = relay.Function(analysis.free_vars(y), y) return y def expected(): x = relay.var("x", shape=(1, 64, 56, 56)) weight1 = relay.var("weight1", shape=(64, 64, 3, 3)) x = relay.layout_transform(x, "NCHW", "NHWC") weight1 = relay.layout_transform(weight1, "OIHW", "HWIO") y = relay.nn.conv2d( x, weight1, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NHWC", kernel_layout="HWIO", ) rois = relay.var("rois", shape=(32, 5)) y = relay.vision.roi_pool( y, rois, pooled_size=(14, 14), spatial_scale=0.0625, layout="NHWC" ) ret = relay.layout_transform(y, "NHWC", "NCHW") y = relay.Function(analysis.free_vars(ret), ret) return y a = before() desired_layouts = { "nn.conv2d": ["NHWC", "HWIO"], "vision.roi_pool": ["NHWC", "default"], } a = run_opt_pass(a, transform.ConvertLayout(desired_layouts)) b = run_opt_pass(expected(), transform.InferType()) assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a) def test_default_keyword(): """ Check that the default keyword selects correct TVM default layout. """ def before(): x = relay.var("x", shape=(1, 64, 56, 56)) weight = relay.var("weight", shape=(64, 3, 3, 64)) y = relay.nn.conv2d( x, weight, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NCHW", kernel_layout="OHWI", ) y = relay.Function(analysis.free_vars(y), y) return y def expected(): x = relay.var("x", shape=(1, 64, 56, 56)) w = relay.var("weight", shape=(64, 3, 3, 64)) w = relay.layout_transform(w, "OHWI", "OIHW") y = relay.nn.conv2d( x, w, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NCHW", kernel_layout="OIHW", ) y = relay.Function(analysis.free_vars(y), y) return y a = before() a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": ["NCHW", "default"]})) b = run_opt_pass(expected(), transform.InferType()) assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a) def test_different_ops_convert_layout(): """Check convert layout correctly supports converting the layout of different ops in the same graph. 
""" def before(): x = relay.var("x", shape=(1, 64, 56, 56)) weight1 = relay.var("weight1", shape=(64, 3, 3, 64)) weight2 = relay.var("weight2", shape=(64, 3, 3, 64), dtype="int8") weight3 = relay.var("weight3", shape=(64, 3, 3, 64)) out = relay.nn.conv2d( x, weight1, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NCHW", kernel_layout="OHWI", ) out = relay.cast(out, "int8") out = relay.qnn.op.conv2d( out, weight2, relay.const(1, "int32"), relay.const(1, "int32"), relay.const(1, "float32"), relay.const(1, "float32"), channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NCHW", kernel_layout="OHWI", ) out = relay.cast(out, "float32") out = relay.nn.conv2d_transpose( out, weight3, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NCHW", kernel_layout="OHWI", ) out = relay.Function(analysis.free_vars(out), out) return out def expected(): x = relay.var("x", shape=(1, 64, 56, 56)) weight1 = relay.var("weight1", shape=(64, 3, 3, 64)) weight2 = relay.var("weight2", shape=(64, 3, 3, 64), dtype="int8") weight3 = relay.var("weight3", shape=(64, 3, 3, 64)) x = relay.layout_transform(x, "NCHW", "NHWC") weight1 = relay.layout_transform(weight1, "OHWI", "HWIO") out = relay.nn.conv2d( x, weight1, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NHWC", kernel_layout="HWIO", ) out = relay.cast(out, "int8") out = relay.layout_transform(out, "NHWC", "NCHW") weight2 = relay.layout_transform(weight2, "OHWI", "OIHW") out = relay.qnn.op.conv2d( out, weight2, relay.const(1, "int32"), relay.const(1, "int32"), relay.const(1, "float32"), relay.const(1, "float32"), channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NCHW", kernel_layout="OIHW", ) out = relay.cast(out, "float32") out = relay.layout_transform(out, "NCHW", "NHWC") weight3 = relay.layout_transform(weight3, "OHWI", "HWIO") out = relay.nn.conv2d_transpose( out, weight3, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NHWC", kernel_layout="HWIO", ) out = relay.layout_transform(out, "NHWC", "NCHW") out = relay.Function(analysis.free_vars(out), out) return out a = before() desired_layouts = { "nn.conv2d": ["NHWC", "HWIO"], "qnn.conv2d": ["NCHW", "OIHW"], "nn.conv2d_transpose": ["NHWC", "HWIO"], } a = run_opt_pass(a, transform.ConvertLayout(desired_layouts)) b = run_opt_pass(expected(), transform.InferType()) assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a) def test_no_desired_layout(): def before(): x = relay.var("x", shape=(1, 64, 56, 56)) weight1 = relay.var("weight1", shape=(64, 64, 3, 3)) y = relay.nn.conv2d( x, weight1, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NCHW", kernel_layout="OIHW", ) rois = relay.var("rois", shape=(32, 5)) y = relay.vision.roi_align( y, rois, pooled_size=(14, 14), spatial_scale=0.0625, sample_ratio=2, layout="NCHW" ) y = relay.Function(analysis.free_vars(y), y) return y def expected(): x = relay.var("x", shape=(1, 64, 56, 56)) weight1 = relay.var("weight1", shape=(64, 64, 3, 3)) x = relay.layout_transform(x, "NCHW", "NHWC") weight1 = relay.layout_transform(weight1, "OIHW", "HWIO") y = relay.nn.conv2d( x, weight1, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NHWC", kernel_layout="HWIO", ) y = relay.layout_transform(y, "NHWC", "NCHW") rois = relay.var("rois", shape=(32, 5)) y = relay.vision.roi_align( y, rois, pooled_size=(14, 14), spatial_scale=0.0625, sample_ratio=2, layout="NCHW" ) y = relay.Function(analysis.free_vars(y), y) return y a = before() a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": 
["NHWC", "HWIO"]})) b = run_opt_pass(expected(), transform.InferType()) assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a) def test_convert_with_config(): def before(): x = relay.var("x", shape=(1, 56, 56, 64)) weight = relay.var("weight", shape=(3, 3, 64, 64)) y = relay.nn.conv2d( x, weight, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NHWC", kernel_layout="HWIO", ) y = relay.nn.relu(y) weight2 = relay.var("weight2", shape=(3, 3, 64, 64)) y2 = relay.nn.conv2d( y, weight2, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NHWC", kernel_layout="HWIO", ) y2 = relay.nn.relu(y2) out = relay.Function([x, weight, weight2], y2) return out def expected(): x = relay.var("x", shape=(1, 56, 56, 64)) weight = relay.var("weight", shape=(3, 3, 64, 64)) weight2 = relay.var("weight2", shape=(3, 3, 64, 64)) weight2 = relay.layout_transform(weight2, "HWIO", "HWOI") y = relay.nn.conv2d( x, weight, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="NHWC", kernel_layout="HWIO", ) y = relay.nn.relu(y) y = relay.layout_transform(y, "NHWC", "HWNC") y2 = relay.nn.conv2d( y, weight2, channels=64, kernel_size=(3, 3), padding=(1, 1), data_layout="HWNC", kernel_layout="HWOI", ) y2 = relay.nn.relu(y2) y2 = relay.layout_transform(y2, "HWNC", "NHWC") output = relay.Function(relay.analysis.free_vars(y2), y2) return output a = before() layout_config = relay.transform.LayoutConfig(skip_layers=[0]) with layout_config: a = run_opt_pass(a, transform.ConvertLayout({"nn.conv2d": ["HWNC", "default"]})) b = run_opt_pass(expected(), transform.InferType()) assert tvm.ir.structural_equal(a, b), "Actual = \n" + str(a) if __name__ == "__main__": test_qnn_binary_no_convert_layout() test_no_convert_layout() test_conv_convert_layout() test_conv_nhwc_convert_layout() test_conv_bias_pool_convert_layout() test_conv_concat_convert_layout() test_dual_path_convert_layout() test_bn_convert_layout() test_slice_like_convert_layout() test_transpose_convert_layout() test_resnet_convert_layout() test_scalar_convert_layout() test_conv_bn_convert_layout() test_qnn_conv_requantize_convert_layout() test_qnn_conv_concat_convert_layout() test_qnn_conv_add_convert_layout() test_qnn_conv_nhwc_convert_layout() test_conv_convert_kernel_layout() test_conv_transpose_convert_layout() test_conv_roi_align_convert_layout() test_conv_roi_pool_convert_layout() test_conv_strided_slice_convert_layout() test_deformable_conv_bias_pool_convert_layout() test_default_keyword() test_different_ops_convert_layout() test_no_desired_layout() test_convert_with_config()
The highlight of the interview, however, is the revelation that Obsidian's secret RPG project, rumored to be in development since the beginning of 2014, is in fact a reincarnation of Stormlands. Although Feargus describes it as "very different from what it was", it still retains some ideas from the original game. It's going to be officially announced sometime within the next few months.
# Licensed to the Apache Software Foundation (ASF) under one # or more contributor license agreements. See the NOTICE file # distributed with this work for additional information # regarding copyright ownership. The ASF licenses this file # to you under the Apache License, Version 2.0 (the # "License"); you may not use this file except in compliance # with the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, # software distributed under the License is distributed on an # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. """ P1 tests for Scaling up Vm """ #Import Local Modules import marvin from marvin.cloudstackTestCase import * from marvin.cloudstackAPI import * from marvin.integration.lib.utils import * from marvin.integration.lib.base import * from marvin.integration.lib.common import * from nose.plugins.attrib import attr _multiprocess_shared_ = True class Services: """Test VM Life Cycle Services """ def __init__(self): self.services = { "account": { "email": "[email protected]", "firstname": "Test", "lastname": "User", "username": "test", # Random characters are appended in create account to # ensure unique username generated each time "password": "password", }, "small": # Create a small virtual machine instance with disk offering { "displayname": "testserver", "username": "root", # VM creds for SSH "password": "password", "ssh_port": 22, "hypervisor": 'XenServer', "privateport": 22, "publicport": 22, "protocol": 'TCP', }, "service_offerings": { "small": { # Small service offering ID to for change VM # service offering from medium to small "name": "SmallInstance", "displaytext": "SmallInstance", "cpunumber": 1, "cpuspeed": 100, "memory": 256, }, "big": { # Big service offering ID to for change VM "name": "BigInstance", "displaytext": "BigInstance", "cpunumber": 1, "cpuspeed": 100, "memory": 512, } }, #Change this "template": { "displaytext": "xs", "name": "xs", "passwordenabled": False, }, "sleep": 60, "timeout": 10, #Migrate VM to hostid "ostype": 'CentOS 5.3 (64-bit)', # CentOS 5.3 (64-bit) } class TestScaleVm(cloudstackTestCase): @classmethod def setUpClass(cls): cls.api_client = super(TestScaleVm, cls).getClsTestClient().getApiClient() cls.services = Services().services # Get Zone, Domain and templates domain = get_domain(cls.api_client, cls.services) zone = get_zone(cls.api_client, cls.services) cls.services['mode'] = zone.networktype template = get_template( cls.api_client, zone.id, cls.services["ostype"] ) # Set Zones and disk offerings ?? cls.services["small"]["zoneid"] = zone.id cls.services["small"]["template"] = template.id # Create account, service offerings, vm. 
cls.account = Account.create( cls.api_client, cls.services["account"], domainid=domain.id ) cls.small_offering = ServiceOffering.create( cls.api_client, cls.services["service_offerings"]["small"] ) cls.big_offering = ServiceOffering.create( cls.api_client, cls.services["service_offerings"]["big"] ) #create a virtual machine cls.virtual_machine = VirtualMachine.create( cls.api_client, cls.services["small"], accountid=cls.account.name, domainid=cls.account.domainid, serviceofferingid=cls.small_offering.id, mode=cls.services["mode"] ) cls._cleanup = [ cls.small_offering, cls.account ] @classmethod def tearDownClass(cls): cls.api_client = super(TestScaleVm, cls).getClsTestClient().getApiClient() cleanup_resources(cls.api_client, cls._cleanup) return def setUp(self): self.apiclient = self.testClient.getApiClient() self.dbclient = self.testClient.getDbConnection() self.cleanup = [] def tearDown(self): #Clean up, terminate the created ISOs cleanup_resources(self.apiclient, self.cleanup) return @attr(hypervisor="xenserver") @attr(tags=["advanced", "basic"]) def test_01_scale_vm(self): """Test scale virtual machine """ # Validate the following # Scale up the vm and see if it scales to the new svc offering and is finally in running state self.debug("Scaling VM-ID: %s to service offering: %s and state %s" % ( self.virtual_machine.id, self.big_offering.id, self.virtual_machine.state )) cmd = scaleVirtualMachine.scaleVirtualMachineCmd() cmd.serviceofferingid = self.big_offering.id cmd.id = self.virtual_machine.id self.apiclient.scaleVirtualMachine(cmd) list_vm_response = VirtualMachine.list( self.apiclient, id=self.virtual_machine.id ) self.assertEqual( isinstance(list_vm_response, list), True, "Check list response returns a valid list" ) self.assertNotEqual( list_vm_response, None, "Check virtual machine is listVirtualMachines" ) vm_response = list_vm_response[0] self.assertEqual( vm_response.id, self.virtual_machine.id, "Check virtual machine ID of scaled VM" ) # VirtualMachine should be updated to tell cloudstack it has PV tools # available and successfully scaled. We will only mock that behaviour # here but it is not expected in production since the VM scaling is not # guaranteed until tools are installed, vm rebooted self.virtual_machine.update(self.apiclient, isdynamicallyscalable='true') self.debug("Scaling VM-ID: %s from service offering: %s to new service offering %s and the response says %s" % ( self.virtual_machine.id, self.virtual_machine.serviceofferingid, self.big_offering.id, vm_response.serviceofferingid )) self.assertEqual( vm_response.serviceofferingid, self.big_offering.id, "Check service offering of the VM" ) self.assertEqual( vm_response.state, 'Running', "Check the state of VM" ) return
Tennis: Serena, Sloane, someone else? Who is going to rule the WTA this clay season? After a Miami Open win last month, Sloane Stephens is looking to have a big clay-court season. Like spring itself, the clay-court season evolves slowly, starting with near-wintry conditions. It will peak with flowers in full bloom under warm and sunny skies in Paris at the French Open. Right now, multiple questions are sprouting about what the WTA has in store during the clay segment. 1. Will the WTA free-for-all continue? This year's Australian Open appeared to restore the order that had crumbled after Serena Williams left the tour as an expectant mother in February 2017. The stars dominated in Melbourne, as the singles final featured the two top seeds. But after the first Grand Slam, the volatility of 2017 crept back into the game, peaking at the end of the hard-court season as Lesya Tsurenko won Acapulco, Naomi Osaka took the big title at Indian Wells and Sloane Stephens burst out of a slump to win the Miami Open. While some promising new players are emerging -- Osaka, Daria Kasatkina, Danielle Collins and CiCi Bellis among them -- the established names are likely to take control. Garbine Muguruza, Petra Kvitova, Sloane Stephens, Victoria Azarenka and Angelique Kerber all seem to be rounding back into form, and all of them are Grand Slam singles champions. We could see a surprise finalist here, a newcomer in the semis there. But clay calls for patience, consistency and experience -- the qualities of proven champions. 2. Will the Williams sisters be a factor? This question invokes both sisters because at this point we don't even know where and when new mother Serena will play again. But you can't ever count her out when she decides to compete, so we'll leave the question hanging. As for Venus, she probably will face a struggle to leave a significant mark on the Euroclay. She's a tepid 7-4 this year on all surfaces, with no finals. On clay, Venus hasn't been in a singles final since the Madrid Open in 2010. She was the No. 3 seed then but was upset in the title match by France's No. 24 Aravane Rezai. Now 37, Venus has been in one French Open final, but it was a mind-boggling 16 years ago. She hasn't been as far as the quarterfinals since 2006. She's also been cutting back her already limited schedule, leaving herself little time to groove her clay game or play her way into contention at big events. Venus had a remarkable year in 2017, but if she's going to recapture the form that lifted her as high as No. 5 last fall, she will most likely do it during the summer grass and hard-court segments. 3. Will Simona Halep finally close the deal in Paris? The diminutive, indefatigable Romanian has suffered some crushing defeats in big moments, but none as dispiriting as her 2017 French Open loss to Grand Slam finals rookie Jelena Ostapenko. Halep has plenty of motivation and experience, having lost a very close final in the same stadium in 2014 to Maria Sharapova. Yet Halep couldn't hold her lead of a set and a break against her 20-year-old opponent in June. Halep went on to have a great year, finishing No. 1. Nobody faulted her for losing this year's Australian Open final to fellow WTA Grand Slam bridesmaid Caroline Wozniacki. Physically, Halep was a wreck, but she still managed to play an excellent match. While she's won just one minor title so far this year, Halep has compiled an outstanding 19-3 record, good enough to keep her No. 1 -- and to whet her appetite for more. She's positioned to have an excellent clay season, as long as the oft-injured 26-year-old remains healthy. 4. Who's likely to be the most pleasant surprise on clay? Out with injury all of last spring, Stephens said in Miami that she sorely missed the clay season of 2017. "I'm really looking forward to it [this year]," she said. "I'm super excited to get back out there. Red clay is my favorite." Stephens seems to be turning into a player who comes up big at critical moments. She's a Grand Slam champion now. Her cat-and-mouse tactics, clever counter-punching and excellent feel for the ball are tools that have carried her to the fourth round at Roland Garros twice in the past. A steady diet of red clay will do wonders for her fitness and confidence. Also keep your eyes on Elina Svitolina, who's ranked No. 4 but flying under the radar as far as hype and expectations go. Just 23, she's a year younger than former French Open and Wimbledon champion Garbine Muguruza. While Svitolina hasn't won the big one, she has an impregnable baseline game, keeps improving slowly and has been super consistent. She could put together a great run on clay. 5. Will Ostapenko successfully defend her French Open title? Very few first-time Grand Slam champions have successfully defended those titles. Azarenka did it with back-to-back wins in Australia in 2012-13, but she's the outlier. So the odds are stacked against Ostapenko, who's just 20 years old and still adjusting to her newfound status -- and the pressures and distractions it brings. But Ostapenko hasn't played badly. Her ranking was No. 47 when she won the French Open and No. 7 by the end of the year. Now she's up to No. 5. Ostapenko hasn't lost a first-round match since the first week of the year, at Sydney. More promising, she made the final at the Miami Open. She's such a bold and powerful ball-striker that if she's fit, she could dominate on clay, perhaps even emulating Azarenka with a successful defense of her first major title.
#!/usr/bin/env python # # Copyright 2017-present MongoDB, Inc. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """IDL for functions that take flexible options as a bson_t. Defines the options accepted by functions that receive a const bson_t *opts, for example mongoc_collection_find_with_opts, mongoc_collection_insert_one, and many others. Generates struct types, options parsing code, and RST documentation. Written for Python 2.6+, requires Jinja 2 for templating. """ from collections import OrderedDict from os.path import basename, dirname, join as joinpath, normpath import re from jinja2 import Environment, FileSystemLoader # Please "pip install jinja2". this_dir = dirname(__file__) template_dir = joinpath(this_dir, 'opts_templates') src_dir = normpath(joinpath(this_dir, '../src/libmongoc/src/mongoc')) doc_includes = normpath(joinpath(this_dir, '../src/libmongoc/doc/includes')) def flatten(items): for item in items: if isinstance(item, list): # "yield from". for subitem in flatten(item): yield subitem else: yield item class Struct(OrderedDict): def __init__(self, items, opts_name='opts', generate_rst=True, generate_code=True, allow_extra=True, **defaults): """Define an options struct. - items: List of pairs: (optionName, info) - opts_name: Name of the const bson_t *opts parameter - allow_extra: Whether to allow unrecognized options - defaults: Initial values for options """ OrderedDict.__init__(self, list(flatten(items))) self.is_shared = False self.opts_name = opts_name self.generate_rst = generate_rst self.generate_code = generate_code self.allow_extra = allow_extra self.defaults = defaults def default(self, item, fallback): return self.defaults.get(item, fallback) class Shared(Struct): def __init__(self, items, **defaults): """Define a struct that is shared by others.""" super(Shared, self).__init__(items, **defaults) self.is_shared = True self.generate_rst = False read_concern_help = 'Construct a :symbol:`mongoc_read_concern_t` and use :symbol:`mongoc_read_concern_append` to add the read concern to ``opts``. See the example code for :symbol:`mongoc_client_read_command_with_opts`. Read concern requires MongoDB 3.2 or later, otherwise an error is returned.' read_concern_document_option = ('readConcern', { 'type': 'document', 'help': read_concern_help }) read_concern_option = ('readConcern', { 'type': 'mongoc_read_concern_t *', 'help': read_concern_help, 'convert': '_mongoc_convert_read_concern' }) write_concern_option = [ ('writeConcern', { 'type': 'mongoc_write_concern_t *', 'convert': '_mongoc_convert_write_concern', 'help': 'Construct a :symbol:`mongoc_write_concern_t` and use :symbol:`mongoc_write_concern_append` to add the write concern to ``opts``. See the example code for :symbol:`mongoc_client_write_command_with_opts`.' 
}), ('write_concern_owned', { 'type': 'bool', 'internal': True, }) ] session_option = ('sessionId', { 'type': 'mongoc_client_session_t *', 'convert': '_mongoc_convert_session_id', 'field': 'client_session', 'help': 'First, construct a :symbol:`mongoc_client_session_t` with :symbol:`mongoc_client_start_session`. You can begin a transaction with :symbol:`mongoc_client_session_start_transaction`, optionally with a :symbol:`mongoc_transaction_opt_t` that overrides the options inherited from |opts-source|, and use :symbol:`mongoc_client_session_append` to add the session to ``opts``. See the example code for :symbol:`mongoc_client_session_t`.' }) ordered_option = ('ordered', { 'type': 'bool', 'help': 'set to ``false`` to attempt to insert all documents, continuing after errors.' }) validate_option = ('validate', { 'type': 'bson_validate_flags_t', 'convert': '_mongoc_convert_validate_flags', 'help': 'Construct a bitwise-or of all desired :symbol:`bson_validate_flags_t <bson_validate_with_error>`. Set to ``false`` to skip client-side validation of the provided BSON documents.' }) collation_option = ('collation', { 'type': 'document', 'help': 'Configure textual comparisons. See :ref:`Setting Collation Order <setting_collation_order>`, and `the MongoDB Manual entry on Collation <https://docs.mongodb.com/manual/reference/collation/>`_. Collation requires MongoDB 3.2 or later, otherwise an error is returned.' }) array_filters_option = ('arrayFilters', { 'type': 'array', 'help': 'An array of filters specifying to which array elements an update should apply.', }) upsert_option = ('upsert', { 'type': 'bool', 'help': 'When true, creates a new document if no document matches the query.' }) bypass_option = ('bypassDocumentValidation', { 'type': 'bool', 'field': 'bypass', 'help': 'Set to ``true`` to skip server-side schema validation of the provided BSON documents.' }) server_option = ('serverId', { 'type': 'uint32_t', 'convert': '_mongoc_convert_server_id', 'help': 'To target a specific server, include an int32 "serverId" field. Obtain the id by calling :symbol:`mongoc_client_select_server`, then :symbol:`mongoc_server_description_id` on its return value.' }) hint_option = ('hint', { 'type': 'bson_value_t', 'convert': '_mongoc_convert_hint', 'help': 'A document or string that specifies the index to use to support the query predicate.' 
}) opts_structs = OrderedDict([ ('mongoc_crud_opts_t', Shared([ write_concern_option, session_option, validate_option, ])), ('mongoc_update_opts_t', Shared([ ('crud', {'type': 'mongoc_crud_opts_t'}), bypass_option, collation_option, hint_option, upsert_option, ])), ('mongoc_insert_one_opts_t', Struct([ ('crud', {'type': 'mongoc_crud_opts_t'}), bypass_option ], validate='_mongoc_default_insert_vflags')), ('mongoc_insert_many_opts_t', Struct([ ('crud', {'type': 'mongoc_crud_opts_t'}), ordered_option, bypass_option, ], validate='_mongoc_default_insert_vflags', ordered='true')), ('mongoc_delete_one_opts_t', Struct([ ('crud', {'type': 'mongoc_crud_opts_t'}), collation_option, ])), ('mongoc_delete_many_opts_t', Struct([ ('crud', {'type': 'mongoc_crud_opts_t'}), collation_option, ])), ('mongoc_update_one_opts_t', Struct([ ('update', {'type': 'mongoc_update_opts_t'}), array_filters_option, ], validate='_mongoc_default_update_vflags')), ('mongoc_update_many_opts_t', Struct([ ('update', {'type': 'mongoc_update_opts_t'}), array_filters_option, ], validate='_mongoc_default_update_vflags')), ('mongoc_replace_one_opts_t', Struct([ ('update', {'type': 'mongoc_update_opts_t'}), ], validate='_mongoc_default_replace_vflags')), ('mongoc_bulk_opts_t', Struct([ write_concern_option, ordered_option, session_option, ], allow_extra=False, ordered='true')), ('mongoc_bulk_insert_opts_t', Struct([ validate_option, ], validate='_mongoc_default_insert_vflags', allow_extra=False)), ('mongoc_bulk_update_opts_t', Shared([ validate_option, collation_option, hint_option, ('upsert', { 'type': 'bool', 'help': 'If true, insert a document if none match ``selector``.' }), ('multi', {'type': 'bool', 'hidden': True}) ])), ('mongoc_bulk_update_one_opts_t', Struct( [ ('update', {'type': 'mongoc_bulk_update_opts_t'}), array_filters_option, ], multi='false', validate='_mongoc_default_update_vflags', allow_extra=False)), ('mongoc_bulk_update_many_opts_t', Struct( [ ('update', {'type': 'mongoc_bulk_update_opts_t'}), array_filters_option, ], multi='true', validate='_mongoc_default_update_vflags', allow_extra=False)), ('mongoc_bulk_replace_one_opts_t', Struct( [('update', {'type': 'mongoc_bulk_update_opts_t'})], multi='false', validate='_mongoc_default_replace_vflags', allow_extra=False)), ('mongoc_bulk_remove_opts_t', Shared([ collation_option, ('limit', {'type': 'int32_t', 'hidden': True}) ])), ('mongoc_bulk_remove_one_opts_t', Struct([ ('remove', {'type': 'mongoc_bulk_remove_opts_t'}), ], limit=1, allow_extra=False)), ('mongoc_bulk_remove_many_opts_t', Struct([ ('remove', {'type': 'mongoc_bulk_remove_opts_t'}), ], limit=0, allow_extra=False)), ('mongoc_change_stream_opts_t', Struct([ ('batchSize', {'type': 'int32_t', 'help': 'An ``int32`` representing number of documents requested to be returned on each call to :symbol:`mongoc_change_stream_next`'}), ('resumeAfter', {'type': 'document', 'help': 'A ``Document`` representing the logical starting point of the change stream. The ``_id`` field of any change received from a change stream can be used here. This option is mutually exclusive with ``startAfter`` and ``startAtOperationTime``.'}), ('startAfter', {'type': 'document', 'help': 'A ``Document`` representing the logical starting point of the change stream. Unlike ``resumeAfter``, this can resume notifications after an "invalidate" event. The ``_id`` field of any change received from a change stream can be used here. 
This option is mutually exclusive with ``resumeAfter`` and ``startAtOperationTime``.'}), ('startAtOperationTime', {'type': 'timestamp', 'help': 'A ``Timestamp``. The change stream only provides changes that occurred at or after the specified timestamp. Any command run against the server will return an operation time that can be used here. This option is mutually exclusive with ``resumeAfter`` and ``startAfter``.'}), ('maxAwaitTimeMS', {'type': 'int64_t', 'convert': '_mongoc_convert_int64_positive', 'help': 'An ``int64`` representing the maximum amount of time a call to :symbol:`mongoc_change_stream_next` will block waiting for data'}), ('fullDocument', {'type': 'utf8', 'help': 'A UTF-8 string. Set this option to "updateLookup" to direct the change stream cursor to lookup the most current majority-committed version of the document associated to an update change stream event.'}), ], fullDocument="default")), ('mongoc_create_index_opts_t', Struct([ write_concern_option, session_option, ], opts_name='command_opts')), ('mongoc_read_write_opts_t', Struct([ read_concern_document_option, write_concern_option, session_option, collation_option, server_option, ])), # Only for documentation - we use mongoc_read_write_opts_t for real parsing. ('mongoc_read_opts_t', Struct([ read_concern_document_option, session_option, collation_option, server_option, ], generate_code=False)), ('mongoc_write_opts_t', Struct([ write_concern_option, session_option, collation_option, server_option, ], generate_code=False)), ('mongoc_gridfs_bucket_opts_t', Struct([ ('bucketName', {'type': 'utf8', 'help': 'A UTF-8 string used as the prefix to the GridFS "chunks" and "files" collections. Defaults to "fs". The bucket name, together with the database and suffix collections must not exceed 120 characters. See the manual for `the max namespace length <https://docs.mongodb.com/manual/reference/limits/#Namespace-Length>`_.'}), ('chunkSizeBytes', {'type': 'int32_t', 'convert': '_mongoc_convert_int32_positive', 'help': 'An ``int32`` representing the chunk size. Defaults to 255KB.'}), write_concern_option, read_concern_option ], bucketName="fs", chunkSizeBytes=(255 * 1024))), ('mongoc_gridfs_bucket_upload_opts_t', Struct([ ('chunkSizeBytes', {'type': 'int32_t', 'convert': '_mongoc_convert_int32_positive', 'help': 'An ``int32`` chunk size to use for this file. Overrides the ``chunkSizeBytes`` set on ``bucket``.'}), ('metadata', {'type': 'document', 'help': 'A :symbol:`bson_t` representing metadata to include with the file.'}) ])), ('mongoc_aggregate_opts_t', Struct([ read_concern_option, write_concern_option, session_option, bypass_option, collation_option, server_option, ('batchSize', {'type': 'int32_t', 'help': 'An ``int32`` representing number of documents requested to be returned on each call to :symbol:`mongoc_cursor_next`', 'check_set': True}) ])) ]) header_comment = """/************************************************** * * Generated by build/%s. * * DO NOT EDIT THIS FILE. * *************************************************/ /* clang-format off */""" % basename(__file__) def paths(struct): """Sequence of path, option name, option info.""" for option_name, info in struct.items(): the_type = info['type'] the_field = info.get('field', option_name) if the_type in opts_structs: # E.g., the type is mongoc_crud_opts_t. Recurse. 
sub_struct = opts_structs[the_type] for path, sub_option_name, sub_info in paths(sub_struct): yield ('%s.%s' % (the_field, path), sub_option_name, sub_info) else: yield the_field, option_name, info def path_to(the_type, the_field): """Like "mongoc_update_one_opts->update.crud.write_concern_owned".""" for path, name, info in paths(opts_structs[the_type]): if name == the_field: return path raise ValueError( "No field '%s' in '%s'" % (the_field, the_type)) env = Environment(loader=FileSystemLoader(template_dir), trim_blocks=True, extensions=['jinja2.ext.loopcontrols']) files = ["mongoc-opts-private.h", "mongoc-opts.c"] for file_name in files: print(file_name) with open(joinpath(src_dir, file_name), 'w+') as f: t = env.get_template(file_name + ".template") f.write(t.render(globals())) f.write('\n') def document_opts(struct, f): for option_name, info in struct.items(): if info.get('internal') or info.get('hidden'): continue the_type = info['type'] if the_type in opts_structs: # E.g., the type is mongoc_crud_opts_t. Recurse. document_opts(opts_structs[the_type], f) continue assert 'help' in info, "No 'help' for '%s'" % option_name f.write("* ``{option_name}``: {info[help]}\n".format(**locals())) for struct_name, struct in opts_structs.items(): if not struct.generate_rst: continue name = re.sub(r'mongoc_(\w+)_t', r'\1', struct_name).replace('_', '-') file_name = name + '.txt' print(file_name) f = open(joinpath(doc_includes, file_name), 'w') f.write( "``%s`` may be NULL or a BSON document with additional" " command options:\n\n" % struct.opts_name) document_opts(struct, f) f.close()
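As an illustration of what the two helpers above produce (this is not part of the generator's output), paths() flattens a nested struct definition into dotted field paths, and path_to() resolves where a named option is stored:

# Illustrative only: flatten mongoc_insert_one_opts_t as paths() sees it.
for path, name, info in paths(opts_structs['mongoc_insert_one_opts_t']):
    print(path, '<-', name)
# crud.writeConcern        <- writeConcern
# crud.write_concern_owned <- write_concern_owned
# crud.client_session      <- sessionId
# crud.validate            <- validate
# bypass                   <- bypassDocumentValidation

# The example from path_to()'s own docstring:
print(path_to('mongoc_update_one_opts_t', 'write_concern_owned'))
# update.crud.write_concern_owned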
The more recently Aussies bought their home, the more worried they are about where the housing market is headed next, a new survey by ME Bank showed. ME Bank's Property Sentiment Survey revealed that 70% of those who purchased their homes over the past year are worried about home values falling, while 60% have concerns about losing money on their properties. Around 57% of these buyers are also fretting that they owe more than their properties are worth, while 46% regret what they paid for their homes. Overall, home buyers feel that the ongoing price falls would affect their consumer spending — in fact, almost half of all respondents said the housing downturn has made them feel less wealthy. This led to 73% saying that they would be more careful with their money in the future. Despite the decline in home prices, first home buyers are still worried about housing being too expensive, with 77% saying they're 'worried that housing is increasingly out of reach'. Furthermore, a majority of Australians agreed that housing affordability is at its worst. However, ME head of home loans Andrew Bartolo said buyers need not worry about the short-term fate of home prices if they are planning to live in their properties for a longer period of time. "The Australian property market has seen seven price decline-and-recovery cycles in Sydney since 1984, and all have seen prices recover, most within four years," Bartolo said. For home buyers who are planning to borrow, Bartolo said ensuring strong savings growth and a sufficient deposit is a must to avoid affecting their overall spending.
#!/usr/bin/env python import argparse from collections import defaultdict import datetime import json import os from freckle_client.client import FreckleClientV2 from api_client import TogglClientApi FRECKLE_PROJECTS = None def get_freckle_projects(): global FRECKLE_PROJECTS if FRECKLE_PROJECTS is None: FRECKLE_PROJECTS = [(p['id'], p['name']) for p in freckle.fetch_json('projects')] return FRECKLE_PROJECTS def prompt_project_mapping(project_id, project_name): # Fetch all Freckle projects freckle_projects = get_freckle_projects() print 'Select Project in Freckle which corresponds to \'{} ({})\' from Toggl'.format(project_name, project_id) print for i, (id_, name) in enumerate(freckle_projects, 1): print "{:2} {}".format(i, name) print print ' 0: - Skip this project -' print selected = raw_input('>> ') if selected == '0': return None print "Selected '{}'".format(freckle_projects[int(selected)-1][1]) return freckle_projects[int(selected)-1][0] def create_freckle_entry(date, project_id, description, minutes): data = { 'date': date, 'project_id': project_id, 'description': u'#toggl {}'.format(description), 'minutes': minutes, } return freckle.fetch_json('entries', 'POST', post_args=data) def run(start_date, end_date): collected_entries = defaultdict(int) # 1. Fetch all time entries from Toggl time_entries = toggl.query('/time_entries', {'start_date': start_date.isoformat()+'+00:00', 'end_date': end_date.isoformat()+'+00:00'}) if time_entries.status_code != 200: print time_entries.content print time_entries.url return for entry in time_entries.json(): # Projectless entries are skipped if 'pid' not in entry: continue # Determine target project if str(entry['pid']) not in PROJECT_MAP: # Fetch project info project_info = toggl.query('/projects/{}'.format(entry['pid'])).json()['data'] project_id, project_name = project_info['id'], project_info['name'] freckle_project_id = prompt_project_mapping(project_id, project_name) PROJECT_MAP[str(project_id)] = freckle_project_id if PROJECT_MAP[str(entry['pid'])] is None: continue # Construct request to send to Freckle: collected_entries[( entry['start'].split('T')[0], PROJECT_MAP[str(entry['pid'])], entry.get('description', '')) ] += entry['duration'] # Create the "toggl" tag print "Creating toggl tag: {}".format(freckle.fetch_json('tags', 'POST', post_args={'names': ['toggl']})) # 5. 
Create time entries in Freckle print "Creating Freckle entries:" for ((date, project_id, description), seconds) in sorted(collected_entries.items()): minutes = seconds / 60 response = create_freckle_entry(date, project_id, description, minutes) print u"{date} {project[name]:30} {minutes:-3} {description}".format(**response) def valid_date(s): try: return datetime.datetime.strptime(s, "%Y-%m-%d") except ValueError: msg = "Not a valid date: '{0}'.".format(s) raise argparse.ArgumentTypeError(msg) def load_config(): filename = os.path.expanduser('~/.froggle') if os.path.exists(filename): print "Loading tokens from config" with open(filename, 'r') as f: return json.load(f) return {} def save_config(config): filename = os.path.expanduser('~/.froggle') with open(filename, 'w') as f: return json.dump(config, f, indent=4) def start_of_today(): return datetime.datetime.utcnow().replace(hour=0, minute=0, second=0, microsecond=0) if __name__ == '__main__': parser = argparse.ArgumentParser('Copy time entries from Toggl to Freckle') parser.add_argument('--start-date', type=valid_date, default=start_of_today() - datetime.timedelta(days=1, microseconds=1)) a = parser.add_argument('--end-date', type=valid_date, default=start_of_today() - datetime.timedelta(microseconds=1), required=False) freckle_token_arg = parser.add_argument('--freckle-token') toggl_token_arg = parser.add_argument('--toggl-token') options = parser.parse_args() config = load_config() if not options.freckle_token or not options.toggl_token else {} if (not config or not config.get('freckle_token')) and not options.freckle_token: raise argparse.ArgumentError(freckle_token_arg, "No Freckle token provided") if options.freckle_token: config['freckle_token'] = options.freckle_token if (not config or not config.get('toggl_token')) and not options.toggl_token: raise argparse.ArgumentError(toggl_token_arg, "No Toggl token provided") if options.toggl_token: config['toggl_token'] = options.toggl_token global freckle, toggl, PROJECT_MAP toggl = TogglClientApi({'token': config['toggl_token'], 'user-agent': 'Froggle'}) freckle = FreckleClientV2(config['freckle_token']) PROJECT_MAP = config.get('project_map', {}) if options.end_date < options.start_date: raise argparse.ArgumentError(a, "Start date should not come after end date") run(options.start_date, options.end_date) config['project_map'] = PROJECT_MAP save_config(config)
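For reference, a first run of the script above might look like the following sketch (shown as comments; the token values are placeholders, not real credentials):

# First run: supply both API tokens on the command line. They are cached
# in ~/.froggle, together with the Toggl-to-Freckle project map built up
# interactively by prompt_project_mapping(), so later runs can omit them.
#
#   python froggle.py --toggl-token <TOGGL_TOKEN> \
#                     --freckle-token <FRECKLE_TOKEN> \
#                     --start-date 2017-01-01 --end-date 2017-01-31
#
# Entries are aggregated per (date, project, description); durations are
# summed in seconds, then floored to whole minutes by the integer
# division in run() before being posted to Freckle.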
The reading curriculum that we use is called Schoolwide. There are five thematic units embedded within the curriculum that address all of the required fifth grade standards. Expect your child to be able to identify different genres and sub-genres of various texts, and to use a variety of strategies to read different types of texts. In fifth grade, we will also read like writers in an effort to further develop writing styles and skills. It is very important to always have a book ready to go. We will be doing a lot of reading this year, and it is expected that students will be able to read independently for up to thirty minutes. Students will also be held accountable for their reading by completing reading logs and conferencing with me.
""" these are utilities to parse and transform SQL statements """ import re import sys def get_select_cols_and_rest(query): """ Separate the a list of selected columns from the rest of the query Returns: 1. a list of the selected columns 2. a string of the rest of the query after the SELECT """ from_loc = query.lower().find("from") raw_select_clause = query[0:from_loc].rstrip() rest_of_query = query[from_loc:len(query)] # remove the SELECT keyword from the query select_pattern = re.compile("select", re.IGNORECASE) raw_select_clause = select_pattern.sub('', raw_select_clause) # now create and iterate through a list of of the SELECT'ed columns selected_columns = raw_select_clause.split(',') selected_columns = [c.strip() for c in selected_columns] return selected_columns, rest_of_query def get_query_parts(query): """ Extract the where clause of this CQL query. """ select_loc = query.lower().find('select') from_loc = query.lower().find('from') if from_loc == -1: sys.exit("ERROR: query must contain FROM <table>") from_end = len(query) where_loc = query.lower().find("where") if where_loc > -1: from_end = where_loc where_end = len(query) for keyword in ["order by", "limit", "allow_filtering"]: stop = query.find(keyword) if stop > -1: from_end = min(stop, from_end) where_end = min(stop, where_end) where_clause = "" rest = "" from_table = query[from_loc + 4: from_end].strip() select_clause = query[select_loc:from_loc] # remove the SELECT keyword from the query select_pattern = re.compile("select", re.IGNORECASE) select_clause = select_pattern.sub('', select_clause) # now create and iterate through a list of of the SELECT'ed columns selected_columns = select_clause.split(',') selected_columns = [c.strip() for c in selected_columns] if where_loc > -1: where_clause = query[where_loc + 5: where_end].strip() if where_end < len(query): rest = query[where_end:].strip() return selected_columns, from_table, where_clause, rest def ensure_columns(query, cols): """ if a query is missing any of these list of columns, add them and return the new query string """ sel_cols, rest = get_select_cols_and_rest(query) sel_cols = [x.lower() for x in sel_cols] for c in cols: c = c.lower() if c not in sel_cols: sel_cols += [c] sel_string = ", ".join(sel_cols) return "select {sel_string} {rest}".format(**locals())
Yes, this is possible and done often. In fact, I am in the process of doing it at this time on another computer. You can use the Sound Recorder program that comes with Windows XP. To record your cassette audio on your computer, connect your cassette player to your computer's microphone (or "line-in") port and then set your computer to record only the line-in audio. This will prevent your computer from recording external audio (e.g., background noise) while creating a clean, high-quality recording of your cassette. Download Audacity (free) or a similar program and use it to capture the cassette tape contents and convert them to a digital file of your choice; directions are available for both Windows PCs and Apple Macintosh machines. Cassettes come in various lengths, the most typical being 60 and 90 minutes. C-120 tapes use an extremely thin and therefore fragile tape and should be handled with the utmost care. Three-hour cassettes (C-180) are rare, but occasionally seen. If your video card has audio/video input jacks, you can hook your tape player up to your PC using audio/video plugs; depending on how new your version of Windows XP is, you should also have a program called Windows Movie Maker. I was hoping to connect my old cassette player to my new Dell. I am wanting to take the cassette tapes and burn a CD copy of them, since the cassette player is the least used electronic device in my house. If you want to convert a cassette tape to MP3, you can use Jet Audio as the recorder: plug the line out from your tape deck into the line-in (microphone) jack on your PC, then record with Jet Audio on the "Recording" tab. 8mm video refers to various types of videocassette tapes, including the Video8, Hi8, Digital8 and MiniDV formats. These tapes are typically digitized into digital video files so that they can be stored, played and archived on the computer.
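If you prefer a scriptable route to the programs mentioned above, below is a minimal sketch in Python, assuming the third-party sounddevice and soundfile packages are installed (pip install sounddevice soundfile); the file name and recording length are placeholders.

import sounddevice as sd
import soundfile as sf

SAMPLE_RATE = 44100   # CD-quality sampling rate
SECONDS = 45 * 60     # one side of a C-90 cassette (placeholder)

# Record from the default input device; set your OS default input to the
# line-in jack first (sd.query_devices() lists what is available).
recording = sd.rec(int(SECONDS * SAMPLE_RATE), samplerate=SAMPLE_RATE, channels=2)
sd.wait()  # block until the recording finishes

# Save a WAV file; convert it to MP3 afterwards with Audacity or similar.
sf.write("cassette_side_a.wav", recording, SAMPLE_RATE)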
# std lib
import os
import tempfile
import platform
from subprocess import Popen, PIPE

# mapnik
import mapnik


def call(cmd, fail=False):
    try:
        response = Popen(cmd.split(' '), stdin=PIPE, stdout=PIPE, stderr=PIPE)
        cm = response.communicate()
        return cm[0]
    except Exception as e:
        if fail:
            raise SystemExit(e)
        else:
            return None


def open_image(filename, app=None):
    """Open the rendered image in the platform's default (or given) viewer."""
    if os.name == 'nt':
        if app:
            raise SystemExit('Overriding default image viewer not supported on Win32')
        call('start %s' % filename.replace('/', '\\'))
    elif platform.uname()[0] == 'Linux':
        if app:
            call('%s %s' % (app, filename))
        else:
            # fall back through a chain of common viewers
            try:
                cmd = 'xdg-open %s' % filename
                Popen(cmd.split(' '))
            except OSError:
                try:
                    cmd = 'gthumb %s' % filename
                    Popen(cmd.split(' '))
                except OSError:
                    cmd = 'display %s' % filename
                    Popen(cmd.split(' '))
    elif platform.uname()[0] == 'Darwin':
        if app:
            call('open %s -a %s' % (filename, app))
        else:
            call('open %s' % filename)


def get_default_style(geometry_type):
    """
    Ultra simple default style for quick setup or debugging.
    """
    style, rule = mapnik.Style(), mapnik.Rule()
    gtype = geometry_type.lower()
    if 'poly' in gtype:
        rule.symbols.append(mapnik.PolygonSymbolizer(mapnik.Color('steelblue')))
        rule.symbols.append(mapnik.LineSymbolizer(mapnik.Color('steelblue'), .5))
    elif 'line' in gtype:
        rule.symbols.append(mapnik.LineSymbolizer(mapnik.Color('steelblue'), 1.5))
    else:
        point = mapnik.PointSymbolizer()
        point.allow_overlap = True
        rule.symbols.append(point)
    style.rules.append(rule)
    return style


def show(lyr, sty, width=400, height=300, filename=None, app=None):
    """Render the layer with the given style to a PNG and open it."""
    m = mapnik.Map(width, height, lyr.srs)
    m.background = mapnik.Color('transparent')
    lyr.styles.append('style')
    m.append_style('style', sty)
    m.layers.append(lyr)
    m.zoom_all()
    if not filename:
        (handle, filename) = tempfile.mkstemp('.png', 'django-map-')
        os.close(handle)
    mapnik.render_to_file(m, str(filename))
    open_image(str(filename))
    return m
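A hypothetical usage of the helpers above, previewing a shapefile with the debug style (the shapefile path and layer name are placeholders):

lyr = mapnik.Layer('preview')
lyr.datasource = mapnik.Shapefile(file='/tmp/example.shp')  # placeholder path
lyr.srs = '+proj=longlat +datum=WGS84'
sty = get_default_style('polygon')
show(lyr, sty, width=600, height=400)  # renders a PNG and opens the viewer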
Welcome to the Airline portal! Our site's page 875 presents photo materials, collages, other materials and screenshots for the themes: "cheap greece airline tickets flights", "australia sydney hotels flights airline tickets", "cheap air flights discounted airline tickets bamako", "airline tickets from charlotte" and "capital one airline tickets" for "cheap air flights airline flight ticket anapa", the dirt cheap greece airline tickets. You can find more offers and cheap products, photos and some information. The collection of photos, screenshots and other information; find more for the themes: "airline tickets go american airlines luzon island", "airline tickets search cheap airfares livingstone", "discount airfare really cheap airline tickets fl", "cheap airline tickets singapore las vegas", "cheap non-stop airline tickets". The airplanes and other photos are on page 258: "airline tickets go american airlines batman"; "american airline panama tickets"; "cheap airline tickets from dallas". Screenshots 977: cheap airline tickets singapore las vegas; cheap tickets really cheap airline ticket lugano; cheapest airlines ticket; cheap air flights discounted airline tickets genua; discount airfare airline tickets paris dirt cheap tickets student airline fares komatsu. Look at more photos about: "airlines discount united airlines tickets", "airline tickets go american airlines narvik", "business class airfare discount airline ticket karlsruhe", "british airways dirt cheap airline tickets", "cheap air flights continental airlines tickets juliaca"! Best and hidden photos from page 1347 about the themes: cheep spain airline ticket fares, asiana airline ticket and airline tickets to buffalo ny from nc, airline tickets europe to us or also central america airline tickets. Reference to other screenshots and photos of our site for the themes: "cheap airline tickets alice springs", "airline tickets go american airlines port vila", "cost of changing ticket american airlines", "cheap airline ticket latin america" and "cheap of airline tickets". Look at photos 278 and screenshots 2219 for my album: around the world airline ticket, cheap discounte airline ticket and airline tickets economy airline tickets european. Fans are welcome! Read more screenshots and photos 406 from the themes: cheap airfares really cheap airline ticket zhengzhou, cheap air flights continental airlines tickets gibraltar - cheap airline ticket web sites. New photos on page 402 from the themes: deep united discount airline tickets italy, cheap air flights discounted airline tickets livingstone and also cheep airline tickets fares spain. Look here: new extreme galleries! On pictures 2500 and 1305 from the british airways dirt cheap airline tickets cheap airline ticket europe. airline tickets go american airlines karachi, best way to find airline tickets, cheep airline tickets cheep tickets, cheap business class airline tickets to india, cheap united airlines airfare tickets, bid indian airline ticket! And also you can find more about the contact sheduled airline ticket office plus cheap airline tickets southampton. airline tickets go american airlines pisa g, discount airline ticket cheap discount airline hanoi, cheap flight really cheap airline ticket jodhpur, cheapest possible airline tickets, cheap flights cheap airline tickets last, dirt cheap tickets northwest airline lugano! ATTENTION!
Our exclusive photo materials on the themes "cheap airline tickets flights sierra leone" and "Airline Tickets Economy Airticket Pescara Liberi" from our pictures are copyrighted! Scroll up and look at page 53839 for the themes airline tickets discounted panama, airline tickets for a coach, cheap air flights cheap airline tickets zhuhai and chap airline ticketz, berlin de cheap airline tickets to hawaii, or visit the site about airline tickets for singapore europe and cheap airline tickets flights kiruna!
# Opus/UrbanSim urban simulation software. # Copyright (C) 2005-2009 University of Washington # See opus_core/LICENSE import win32com.client as com def load_version_file(visum_dir, version_filename, visum_version_number=10): #Start up Visum COM server - requires win32com library try: Visum = com.Dispatch("visum.visum." + str(visum_version_number)) #latest version of VISUM registered as COM server except Exception: error_msg = "Starting Visum COM Server Failed" raise StandardError(error_msg) #Set directories try: Visum.SetPath(2, visum_dir) #version file Visum.SetPath(3, visum_dir) #od matrix file Visum.SetPath(4, visum_dir) #skim matrix file Visum.SetPath(12,visum_dir) #procedure file except Exception: error_msg = "Setting Visum Directories failed" raise StandardError(error_msg) #Load version file try: Visum.LoadVersion(version_filename) except Exception: error_msg = "Loading Visum version file failed" raise StandardError(error_msg) #Return Visum object return Visum
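A hedged usage sketch for load_version_file; the directory and version file name are placeholders, and a licensed VISUM installation registered as a COM server is assumed (Windows only):

# Hypothetical usage of load_version_file; paths are placeholders.
visum = load_version_file(r"C:\models\seattle", "seattle.ver",
                          visum_version_number=10)
# The returned object exposes the VISUM COM automation API, e.g.
# (assumption) running the procedures configured in the version file:
# visum.Procedures.Execute()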
Oman Air has scooped two key awards at the TravelPlus Airline Amenity Bag Awards 2011, which were held last night in central London. The national carrier of the Sultanate of Oman won Gold in both the Best First Class Female Amenity Bag and the Best Business Class Unisex Amenity Bag categories, and also received Bronze in the Best First Class Male Amenity Bag category. This year’s awards, which are designed to recognise those airlines that go the extra mile with their on-board amenity kits, attracted a record number of entries. Airlines, manufacturers, suppliers and designers of airline amenities competed for the top awards, which were announced at a glamorous reception at the Chesterfield Mayfair Hotel. This latest success follows a period in which Oman Air has won a number of key industry awards in recognition of its many recent changes, including the introduction of new Airbus A330 and Embraer 175 aircraft to its fleet, the inauguration of a range of exciting new destinations, the unveiling of spacious and luxurious aircraft interiors and the launch of state-of-the-art connectivity and in-flight entertainment systems. The awards include ‘Best Business Class Seat in the World’ and ‘Service Excellence, Middle East’ at the World Airline Awards, ‘Best Luxury Airline, Middle East’ in the Business Destinations Awards, ‘Most Promising Newcomer’ at Malaysia’s KLIA Awards and ‘Technological Innovation of the Year’ at the Aviation Business Awards. In addition, Oman Air has received Gold and Bronze in the Global Vision Awards 2010, Silver in the Cellars in the Sky Awards, Silver in the Travel Industry Club’s awards in Cologne, Germany, and was named ‘Airline of the Year’ at France’s Lauriers d’Or du Voyage d’Affaires, Top Resa 2011. Mr Wayne Pearce, Chief Executive Officer, Oman Air, told TravelPlus: “Oman Air is delighted to be recognised for the second year running in the TravelPlus Airline Amenity Bag Awards, especially as competition continues to be tough this year. We pride ourselves on offering the very best on-board services and our amenity bags reflect this. They contain everything the premium traveller needs to ensure they arrive at their destination relaxed, refreshed and revitalised. “They include a range of luxurious skin care products from the world-renowned Omani perfumery, Amouage, together with a number of essential toiletries and personal grooming items. Everyone at Oman Air is pleased that they have met with our passengers’ approval.”
from __future__ import absolute_import

import subprocess
import time

from grokcore.component import context, baseclass
from twisted.internet import defer
from zope import schema
from zope.component import provideSubscriptionAdapter
from zope.interface import Interface, implements

from opennode.oms.model.model.actions import ActionsContainerExtension, Action, action
from opennode.oms.model.model.base import Container, ReadonlyContainer
from opennode.oms.endpoint.ssh.terminal import RESET_COLOR
from opennode.oms.endpoint.webterm.ssh import ssh_connect_interactive_shell
from opennode.oms.security.directives import permissions
from opennode.oms.zodb import db


class IConsole(Interface):
    """Console node."""


class ITextualConsole(Interface):
    """Textual console."""


class IGraphicalConsole(Interface):
    """Graphical console."""


class ITtyConsole(IConsole):
    pty = schema.TextLine(title=u"pty")


class ISshConsole(IConsole):
    user = schema.TextLine(title=u"user")
    hostname = schema.TextLine(title=u"hostname")
    port = schema.Int(title=u"port")


class IOpenVzConsole(IConsole):
    cid = schema.Int(title=u"cid")


class IVncConsole(IConsole):
    hostname = schema.TextLine(title=u"hostname")
    port = schema.Int(title=u"port")
    ws_url = schema.Int(title=u"ws_url", required=False, readonly=True)


class TtyConsole(ReadonlyContainer):
    implements(ITtyConsole, ITextualConsole)
    permissions(dict(pty=('read', 'modify')))

    def __init__(self, name, pty):
        self.inherit_permissions = True
        self.__name__ = name
        self.pty = pty


class SshConsole(ReadonlyContainer):
    implements(ISshConsole, ITextualConsole)
    permissions(dict(user=('read', 'modify'),
                     hostname=('read', 'modify'),
                     port=('read', 'modify'),
                     ))

    def __init__(self, name, user, hostname, port):
        self.inherit_permissions = True
        self.__name__ = name
        self.user = user
        self.hostname = hostname
        self.port = port


class OpenVzConsole(ReadonlyContainer):
    implements(IOpenVzConsole, ITextualConsole)
    permissions(dict(cid=('read', 'modify')))

    def __init__(self, name, cid):
        self.inherit_permissions = True
        self.__name__ = name
        self.cid = cid


class VncConsole(ReadonlyContainer):
    implements(IVncConsole, IGraphicalConsole)
    permissions(dict(hostname=('read', 'modify'),
                     port=('read', 'modify'),
                     ))

    proxy_processes = {}

    def __init__(self, hostname, port):
        self.inherit_permissions = True
        self.__name__ = 'vnc'
        self.hostname = hostname
        self.port = port

        self._ensure_proxy()

    def _ensure_proxy(self):
        if self.hostname in self.proxy_processes:
            # check if the proxy process has a matching vnc port,
            # otherwise kill it
            if self.proxy_processes[self.hostname].port != self.port:
                self.proxy_processes[self.hostname].kill()
                del self.proxy_processes[self.hostname]

        if self.hostname not in self.proxy_processes:
            self.proxy_processes[self.hostname] = VncProxyProcess(self.hostname, self.port)

    @property
    def ws_url(self):
        self._ensure_proxy()
        proxy_port = self.proxy_processes[self.hostname].proxy_port
        return 'ws://%s:%s/' % (self.hostname, proxy_port)


class VncProxyProcess(object):
    def __init__(self, hostname, port):
        self.port = port
        self.proxy_port = port + 1000
        self.process = subprocess.Popen(['bin/wsproxy', str(self.proxy_port),
                                         '%s:%s' % (hostname, self.port)])

    def kill(self):
        self.process.terminate()
        time.sleep(0.5)
        self.process.kill()


class Consoles(Container):
    __name__ = 'consoles'
    __contains__ = IConsole
    inherit_permissions = True


class AttachAction(Action):
    """Attach to textual console"""
    baseclass()

    action('attach')

    @defer.inlineCallbacks
    def execute(self, cmd, args):
        self.closed = False
        self.protocol = cmd.protocol
        self.transport = self
        size = (cmd.protocol.width, cmd.protocol.height)

        yield self._do_connection(size)

        self.deferred = defer.Deferred()
        yield self.deferred

    def write(self, data):
        if not self.closed:
            self.protocol.terminal.write(data)

    def loseConnection(self):
        self.closed = True
        self.protocol.terminal.resetPrivateModes('1')
        self.protocol.terminal.write(RESET_COLOR)
        self.deferred.callback(None)

    def _set_channel(self, channel):
        loseConnection = self.loseConnection

        class SshSubProtocol(object):
            def __init__(self, parent):
                self.parent = parent
                self.buffer = []

            def dataReceived(self, data):
                for ch in data:
                    if ch == '\x1d':
                        # TODO: really close the ssh connection
                        loseConnection()
                channel.write(data)

        self.protocol.sub_protocol = SshSubProtocol(self.protocol)


class SshAttachAction(AttachAction):
    context(ISshConsole)

    @db.ro_transact
    def _do_connection(self, size):
        self.write("Attaching to %s@%s. Use ^] to force exit.\n" %
                   (self.context.user.encode('utf-8'),
                    self.context.hostname.encode('utf-8')))

        ssh_connect_interactive_shell(self.context.user, self.context.hostname,
                                      self.context.port, self.transport,
                                      self._set_channel, size)


class HypervisorSshAttachAction(AttachAction):
    """For consoles that are attached by running a command on the hypervisor host."""
    baseclass()

    @db.ro_transact
    def _do_connection(self, size):
        self.write("Attaching to %s. Use ^] to force exit.\n" % self.name)

        phy = self.context.__parent__.__parent__.__parent__.__parent__
        ssh_connect_interactive_shell('root', phy.hostname, 22, self.transport,
                                      self._set_channel, size, self.command)


class TtyAttachAction(HypervisorSshAttachAction):
    context(ITtyConsole)

    @property
    def command(self):
        return 'screen -xRR %s %s' % (self.context.pty.replace('/', ''), self.context.pty)

    @property
    def name(self):
        return self.context.pty.encode('utf-8')


class OpenvzAttachAction(HypervisorSshAttachAction):
    context(IOpenVzConsole)

    @property
    def command(self):
        return 'vzctl enter %s' % (self.context.cid)

    @property
    def name(self):
        return self.context.cid


provideSubscriptionAdapter(ActionsContainerExtension, adapts=(IConsole, ))
Fresh facts have emerged about the identity of Ghanaian actress Yvonne Nelson’s baby daddy and the sort of relationship existing between them. Recall that over the weekend, Yvonne flaunted her baby bump on the cover of the latest issue of Wow magazine and also disclosed in the interview that the father of her daughter was a very supportive and amazing British man who loved kids. The identity of Yvonne’s alleged baby daddy was then partially revealed on social media as a photographer named Jamie, and photos without a credited source were shared. Now, Happenings.com.ng can authoritatively serve you the Instagram handle of the man Yvonne Nelson apparently preferred over her former fiancé, whom she left for him. Apart from having several pictures of her on his page, Jamie also writes wonderful things about her but, surprisingly, in one of the posts he referred to her as a friend.
#! /usr/bin/env python
# -*- coding: utf-8 -*-

# The MIT License (MIT)
#
# Copyright (c) 2013 Bartosz Janda
#
# Permission is hereby granted, free of charge, to any person obtaining a copy of
# this software and associated documentation files (the "Software"), to deal in
# the Software without restriction, including without limitation the rights to
# use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
# the Software, and to permit persons to whom the Software is furnished to do so,
# subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
# FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
# COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
# IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
# CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

from .. import helpers
from ..common import SummaryBase
from ..Foundation import NSObject
from ..CoreGraphics import CGRect


class UIScreenSyntheticProvider(NSObject.NSObjectSyntheticProvider):
    """
    Class representing UIScreen.
    """
    def __init__(self, value_obj, internal_dict):
        super(UIScreenSyntheticProvider, self).__init__(value_obj, internal_dict)
        self.type_name = "UIScreen"

        self.register_child_value("bounds", ivar_name="_bounds",
                                  provider_class=CGRect.CGRectSyntheticProvider,
                                  summary_function=self.get_bounds_summary)
        self.register_child_value("scale", ivar_name="_scale",
                                  primitive_value_function=SummaryBase.get_float_value,
                                  summary_function=self.get_scale_summary)
        self.register_child_value("horizontal_scale", ivar_name="_horizontalScale",
                                  primitive_value_function=SummaryBase.get_float_value,
                                  summary_function=self.get_horizontal_scale_summary)
        self.register_child_value("interface_idiom", ivar_name="_userInterfaceIdiom",
                                  primitive_value_function=self.get_interface_idiom_value,
                                  summary_function=self.get_interface_idiom_summary)

    @staticmethod
    def get_bounds_summary(provider):
        return "size=({}, {})".format(SummaryBase.formatted_float(provider.size_provider.width_value),
                                      SummaryBase.formatted_float(provider.size_provider.height_value))

    @staticmethod
    def get_scale_summary(value):
        return "scale={}".format(SummaryBase.formatted_float(value))

    @staticmethod
    def get_horizontal_scale_summary(value):
        # formatted_float already returns a string, so a plain "{}" is used;
        # a float format code like {:.0f} would raise ValueError on a str.
        return "hScale={}".format(SummaryBase.formatted_float(value))

    @staticmethod
    def get_interface_idiom_value(value):
        interface_idiom_value = value.GetValueAsSigned()
        interface_idiom_name = "Unknown"
        if interface_idiom_value == 0:
            interface_idiom_name = "Phone"
        elif interface_idiom_value == 1:
            interface_idiom_name = "Pad"
        return interface_idiom_name

    @staticmethod
    def get_interface_idiom_summary(value):
        return "idiom={}".format(value)

    def summaries_parts(self):
        return [self.bounds_summary,
                self.scale_summary,
                self.interface_idiom_summary]


def summary_provider(value_obj, internal_dict):
    return helpers.generic_summary_provider(value_obj, internal_dict, UIScreenSyntheticProvider)
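For this provider to take effect, lldb has to be told to route UIScreen values through summary_provider. A hedged sketch of one way to register it; the module path lldb_additions.UIKit.UIScreen is a placeholder for wherever this file is actually importable from:

# Hypothetical registration script for lldb; the module path below is a
# placeholder and must match where this file lives in your package.
import lldb

def __lldb_init_module(debugger, internal_dict):
    # route every UIScreen value through the summary function above
    debugger.HandleCommand(
        'type summary add UIScreen '
        '--python-function lldb_additions.UIKit.UIScreen.summary_provider')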
Debt Generation - In 2008, the youth vote was overwhelmingly for Obama and helped elect him to the White House. Now, three years into his presidency and one year from the most important election of our lives, college students are thinking twice before voting for Obama again.
import sys import re def rotate(i,d,s): if d == 'right': i = -i return s[i:] + s[:i] def rotate_pos_table(n): table = [-1]*n for i in range(n): j = i + 1 if i >= 4: j += 1 if j >= n: j -= n if i + j >= n: table[i+j-n] = j else: table[i + j] = j return table def scramble(start, data): steps = data.split('\n') for s in steps: swap_pos_cmd = re.match(r'swap\ position\ (\d+)\ with\ position\ (\d+)', s) swap_let_cmd = re.match(r'swap\ letter\ (\S)\ with\ letter\ (\S)', s) reverse_cmd = re.match(r'reverse\ positions\ (\d+)\ through\ (\d+)', s) rotate_cmd = re.match(r'rotate\ (.+?)\ (\d+)\ step[s]*', s) rotate_p_cmd = re.match(r'rotate\ based\ on\ position\ of\ letter\ (\S)', s) move_pos_cmd = re.match(r'move\ position\ (\d+)\ to\ position\ (\d+)', s) if swap_pos_cmd is not None: x = start[int(swap_pos_cmd.group(1))] y = start[int(swap_pos_cmd.group(2))] start = re.sub('#', y, re.sub(y, x, re.sub(x, '#', start))) elif swap_let_cmd is not None: x = swap_let_cmd.group(1) y = swap_let_cmd.group(2) start = re.sub('#', y, re.sub(y, x, re.sub(x, '#', start))) elif reverse_cmd is not None: x = int(reverse_cmd.group(1)) y = int(reverse_cmd.group(2)) middle = start[y:x-1:-1] if x == 0: middle = start[y:x:-1] + start[x] start = start[:x] + middle + start[y+1:] elif rotate_cmd is not None: x = int(rotate_cmd.group(2)) start = rotate(x, rotate_cmd.group(1), start) elif rotate_p_cmd is not None: c = rotate_p_cmd.group(1) x = start.find(c) + 1 if x >= 5: x += 1 x = x % len(start) start = rotate(x, 'right', start) elif move_pos_cmd is not None: x = int(move_pos_cmd.group(1)) y = int(move_pos_cmd.group(2)) if x < y: start = start[:x] + start[x+1:y+1] + start[x] + start[y+1:] elif x > y: start = start[:y] + start[x] + start[y:x] + start[x+1:] else: print("Invalid step:", s) return start def unscramble(start, data): steps = data.split('\n') for s in steps[::-1]: swap_pos_cmd = re.match(r'swap\ position\ (\d+)\ with\ position\ (\d+)', s) swap_let_cmd = re.match(r'swap\ letter\ (\S)\ with\ letter\ (\S)', s) reverse_cmd = re.match(r'reverse\ positions\ (\d+)\ through\ (\d+)', s) rotate_cmd = re.match(r'rotate\ (.+?)\ (\d+)\ step[s]*', s) rotate_p_cmd = re.match(r'rotate\ based\ on\ position\ of\ letter\ (\S)', s) move_pos_cmd = re.match(r'move\ position\ (\d+)\ to\ position\ (\d+)', s) if swap_pos_cmd is not None: x = start[int(swap_pos_cmd.group(1))] y = start[int(swap_pos_cmd.group(2))] start = re.sub('#', y, re.sub(y, x, re.sub(x, '#', start))) elif swap_let_cmd is not None: x = swap_let_cmd.group(1) y = swap_let_cmd.group(2) start = re.sub('#', y, re.sub(y, x, re.sub(x, '#', start))) elif reverse_cmd is not None: x = int(reverse_cmd.group(1)) y = int(reverse_cmd.group(2)) middle = start[y:x-1:-1] if x == 0: middle = start[y:x:-1] + start[x] start = start[:x] + middle + start[y+1:] elif rotate_cmd is not None: x = int(rotate_cmd.group(2)) if rotate_cmd.group(1) == 'left': x = -x start = start[x:] + start[:x] elif rotate_p_cmd is not None: c = rotate_p_cmd.group(1) x = start.find(c) r = rotate_pos_table(len(start))[x] start = rotate(r, 'left', start) elif move_pos_cmd is not None: x = int(move_pos_cmd.group(2)) y = int(move_pos_cmd.group(1)) if x < y: start = start[:x] + start[x+1:y+1] + start[x] + start[y+1:] elif x > y: start = start[:y] + start[x] + start[y:x] + start[x+1:] else: print("Invalid step:", s) return start def main(): if (len(sys.argv) < 4): print("Usage python3", sys.argv[0], "<input> enc|dec <start>") exit(1) with open(sys.argv[1], 'r') as input: data = input.read() start = sys.argv[3] if 
sys.argv[2] == 'enc': print("Result:", scramble(start, data)) elif sys.argv[2] == 'dec': print("Result:", unscramble(start, data)) if __name__ == '__main__': main()
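A hedged usage sketch for the script above; the script and input file names are placeholders, and the input is assumed to follow the Advent of Code 2016 Day 21 instruction format. On the five-letter worked example from that puzzle, the forward and reverse passes round-trip as expected:

# Hypothetical invocation; "scramble.py" is whatever this file is saved as,
# and example.txt holds the sample instructions from AoC 2016 Day 21.
#
#   $ python3 scramble.py example.txt enc abcde
#   Result: decab
#
#   $ python3 scramble.py example.txt dec decab
#   Result: abcde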
This bike is in fantastic condition. It is a really fun bike to ride, with lots of low-end power and a huge power band. This is the perfect bike for riding with your mates: it makes all the right noise alongside your Harley mates, rides as hard as your Japanese-bike mates, and looks better than both. The bike comes with Staintune pipes and Venture bag. Powering the 2005 Ducati SuperSport is a four-stroke air-cooled L-twin Desmodromic engine, with two intake valves on each of its two cylinders and fueling provided by a 45-millimetre (mm) Marelli injector. The engine on the SuperSport 800 model displaces 803 cubic centimetres, and generates 77 horsepower at 8,250 revolutions per minute (RPM) and 54 pound-feet (lb-ft) of torque at 6,500 RPM. The engine on the SuperSport 1000 DS has a displacement of 992 cubic centimetres, and produces 95 horsepower at 7,750 RPM and 70 lb-ft of torque at 5,750 RPM. Each engine on the 2005 Ducati SuperSport is mated to a six-speed manual transmission. The gas tank on each bike can hold up to 4.2 gallons (16 litres) of fuel, with a one-gallon (4-litre) reserve. Ducati recommends premium fuel for the engine’s optimal performance. The 2005 Ducati SuperSport relies on Brembo® disc brakes that consist of a front 12.6-inch (320-mm) dual disc and a rear 9.7-inch (245-mm) single disc. The suspension on the bike consists of a 1.7-inch (43-mm) Showa® inverted fork at the front and an Ohlins® twin-sided aluminum swing arm with rear shock absorber at the back. Ducati installed large, 17-inch aluminum wheels on the 2005 SuperSport. Available as a red or yellow bike, the 2005 Ducati SuperSport uses a solid steel frame. Ducati designed it as a full-fairing motorcycle, with front and rear fenders, exterior covering, and a rear spoiler to complement its sport orientation. Standard digital instrumentation on the bike includes a clock, speedometer, tachometer, trip odometer, service reminder indicator, and fuel level and temperature warning lights. Storage on the 2005 Ducati SuperSport consists of under-seat and lockable compartments. An engine immobilizer serves as the bike’s security system, and a tinted windshield is in place for protection against rushing wind and flying debris. A halogen headlight and a pair of rearview mirrors are included for greater visibility when riding. Each 2005 Ducati SuperSport bike measures 80 inches (2,033 mm) in length, 30.7 inches (780 mm) in width, and 43.7 inches (1,110 mm) in height. However, the SuperSport 1000 DS model, at 396 lbs. (179.4 kilograms), is heavier than the SuperSport 800 model, which has a dry weight of 386 lbs. (175 kg). For SuperSport shoppers, it is always advisable to go with one of the later versions. The 2005 Ducati SuperSport is one of them. With increased engine power, it represents a marginal improvement over the previous model year as one of the more refined entries of the now-defunct nameplate.
# -*- coding: utf-8 -*-
from urlparse import urlparse
import os.path

from django.core.urlresolvers import resolve, Resolver404, reverse
from django.db import models
from django.utils.translation import ugettext_lazy as _
from django.core.validators import RegexValidator

from mptt.models import MPTTModel, TreeForeignKey, TreeManager


class PageManager(TreeManager):
    """ Page custom manager """

    def get_active_page(self, uri):
        """ Get current active page from request """
        try:
            page = self.get(url=unicode(uri), published=True)
        except Page.DoesNotExist:
            parsed = urlparse(uri)
            try:
                page = self.get(url=unicode(parsed.path), published=True)
            except Page.DoesNotExist:
                try:
                    page = self.get(url__startswith=unicode(parsed.path), published=True)
                except Page.DoesNotExist:
                    # try to find related page
                    try:
                        view_func = resolve(parsed.path).func
                        if hasattr(view_func, 'related_page_url'):
                            page = self.get_active_page(getattr(view_func, 'related_page_url'))
                        else:
                            raise
                    except Resolver404:
                        raise
        return page


class File(models.Model):
    """ File attached to a page """
    name = models.CharField(_(u'name'), max_length=140, null=True, blank=True)
    file = models.FileField(_(u'file'), upload_to='staticpages')
    page = models.ForeignKey('Page', related_name='files')

    def __unicode__(self):
        return self.name

    @property
    def extension(self):
        return os.path.splitext(self.file.path)[1]


class PageImage(models.Model):
    """ Image in page """
    title = models.CharField(_(u'title'), max_length=140, null=True, blank=True)
    image = models.FileField(_(u'image'), upload_to='staticpages')
    page = models.ForeignKey('Page', related_name='images')

    def __unicode__(self):
        return self.title

    class Meta:
        verbose_name = _(u'image')
        verbose_name_plural = _(u'images')


class Page(MPTTModel):
    """Page"""
    name = models.CharField(_(u'name in the menu'), max_length=70)
    parent = TreeForeignKey('self', verbose_name=_(u'parent page'), null=True, blank=True,
                            related_name='children')
    title = models.CharField(_(u'title tag'), max_length=255)
    url = models.CharField(_(u'url'), max_length=255,
                           help_text=_(u'Page full path: /news/ or /'), unique=True,
                           validators=[
                               # anchored at both ends so the path must also end
                               # with a slash, matching the validation message
                               RegexValidator(r'^/([a-zA-Z0-9_-]+/)*$',
                                              message=_(u'Must start and end with slash'))])
    text = models.TextField(_(u'text'), blank=True, null=True,
                            help_text=_(u'HTML content of the page if it is static'))
    template = models.CharField(_(u'template'),
                                choices=[('default', _(u'Default')), ('wholesale', _(u'Wholesale'))],
                                default='default', max_length=10)
    published = models.BooleanField(_(u'published'),
                                    help_text=_(u'The page will not be shown at the site until it is published'),
                                    db_index=True, default=False)
    meta_keywords = models.TextField(_(u'meta keywords'), blank=True, null=True)
    meta_description = models.TextField(_(u'meta description'), blank=True, null=True)
    position = models.PositiveSmallIntegerField("Position", null=False, default=0)

    objects = PageManager()

    class MPTTMeta:
        order_insertion_by = ['position']

    def get_absolute_url(self):
        return self.url

    def __unicode__(self):
        return self.name

    class Meta:
        verbose_name = _(u'page')
        verbose_name_plural = _(u'pages')
        ordering = ['position']
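A hedged sketch of how a view might use PageManager.get_active_page; the app label "staticpages" and the template naming scheme are assumptions, not part of the original module:

# Hypothetical view-layer usage; app label and template path are assumed.
from django.http import Http404
from django.shortcuts import render

from staticpages.models import Page


def flatpage_view(request):
    try:
        page = Page.objects.get_active_page(request.path)
    except Page.DoesNotExist:
        raise Http404
    # pick the template configured on the page ('default' or 'wholesale')
    return render(request, 'staticpages/%s.html' % page.template, {'page': page})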
The Social Security Disability Insurance program was signed into law 60 years ago. Read our blog series to learn more about how the program has evolved over the years to provide benefits and financial protection for millions of people. A disability could happen at any moment in our lives. Discover how Social Security's disability program has played a substantial role in supporting individuals with disabilities. Did you know that fifty-six million Americans live with disabilities? Disability is something many read or hear about happening to others, but no one thinks will happen to them. Help us spread awareness about the program and the people who rely on it. Social Security has a number of useful resources for people experiencing disabilities: Apply online for benefits, planners, publications, and much more. Want to be notified when we update this site? Subscribe now and we'll send you an email when we add new content.
import locale
import sys
import os
import os.path
import subprocess

if sys.platform == 'win32':
    import ctypes


def safe_unicode(s):
    '''Creates unicode object from string s.
    It tries to decode string as UTF-8, falls back to the current locale
    or ISO-8859-1 if both decode attempts fail'''
    if type(s) == unicode:
        return s
    elif isinstance(s, Exception):
        s = str(s)

    try:
        return s.decode('UTF-8')
    except UnicodeDecodeError:
        pass

    try:
        lang, encoding = locale.getdefaultlocale()
    except ValueError:
        lang, encoding = 'C', 'UTF-8'

    if encoding != 'UTF-8':
        try:
            return s.decode(encoding)
        except UnicodeDecodeError:
            pass

    return s.decode('ISO-8859-1')


def utf8_str(s):
    s = safe_unicode(s)
    return s.encode('UTF-8')


def invert_hash(h):
    ih = {}
    for key, value in h.iteritems():
        if value not in ih:
            ih[value] = []
        ih[value].append(key)
    return ih


def find_binary(locations):
    searchpath_sep = ';' if sys.platform == 'win32' else ':'
    searchpaths = os.environ['PATH'].split(searchpath_sep)

    for location in locations:
        if '{PATH}' in location:
            for searchpath in searchpaths:
                s = location.replace('{PATH}', searchpath)
                if os.path.isfile(s) and os.access(s, os.X_OK):
                    yield s
        elif os.path.isfile(location) and os.access(location, os.X_OK):
            yield location


def is_binary_file(file):
    # Returns True if the file cannot be decoded as UTF-8
    # and > 20% of the file is binary characters

    # Read file
    try:
        f = open(file)
        buf = f.read()
        f.close()
    except (IOError, OSError):
        # open() raises IOError in Python 2; catch both to be safe
        return False

    # Decode as UTF-8
    try:
        unicode(buf, 'utf-8')
        return False
    except UnicodeDecodeError:
        pass

    # Count control characters other than CR and LF. (The original compared
    # ord(c) against the strings '\r'/'\n', which is always unequal.)
    threshold = len(buf) / 5
    binary_chars = 0
    for c in buf:
        oc = ord(c)
        if oc > 0x7f or (oc < 0x1f and c != '\r' and c != '\n'):
            binary_chars += 1
        if binary_chars > threshold:
            return True

    return False


PROCESS_TERMINATE = 1


def kill_subprocess(process):
    if sys.platform == 'win32':
        handle = ctypes.windll.kernel32.OpenProcess(PROCESS_TERMINATE, False, process.pid)
        ctypes.windll.kernel32.TerminateProcess(handle, -1)
        ctypes.windll.kernel32.CloseHandle(handle)
    else:
        os.kill(process.pid, 9)


CREATE_NO_WINDOW = 0x08000000


def Popen(cmd, **args):
    # Create a subprocess that does not open a new console window
    if sys.platform == 'win32':
        process = subprocess.Popen(cmd, creationflags=CREATE_NO_WINDOW, **args)
    else:
        process = subprocess.Popen(cmd, **args)

    # Emulate kill() for Python 2.5
    if 'kill' not in dir(process):
        process.kill = lambda: kill_subprocess(process)

    return process
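A short sketch of how find_binary and the console-less Popen wrapper compose; the candidate paths are illustrative and depend on the local installation (Python 2, matching the module above):

# Example composition of the helpers above. '{PATH}' entries are expanded
# against every directory on the PATH by find_binary; the locations listed
# here are illustrative.
for git in find_binary(['{PATH}/git', '{PATH}/git.exe', '/usr/local/bin/git']):
    p = Popen([git, '--version'], stdout=subprocess.PIPE)
    print p.communicate()[0].strip()
    break  # use the first match only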
These three forces should propel Canada’s economy to solid growth in 2019, but a fourth factor may drag it down. Watch the analysis by Scotiabank’s Chief Economist, Jean-François Perrault. Download the Global Outlook report from Scotiabank Economics.
import os import subprocess import sys VERSION = "5.0.0.dev" PATHOD = "pathod " + VERSION MITMPROXY = "mitmproxy " + VERSION # Serialization format version. This is displayed nowhere, it just needs to be incremented by one # for each change in the file format. FLOW_FORMAT_VERSION = 7 def get_dev_version() -> str: """ Return a detailed version string, sourced either from VERSION or obtained dynamically using git. """ mitmproxy_version = VERSION here = os.path.abspath(os.path.join(os.path.dirname(__file__), "..")) try: git_describe = subprocess.check_output( ['git', 'describe', '--long'], stderr=subprocess.STDOUT, cwd=here, ) last_tag, tag_dist, commit = git_describe.decode().strip().rsplit("-", 2) commit = commit.lstrip("g")[:7] tag_dist = int(tag_dist) except Exception: pass else: # Add commit info for non-tagged releases if tag_dist > 0: mitmproxy_version += f" (+{tag_dist}, commit {commit})" # PyInstaller build indicator, if using precompiled binary if getattr(sys, 'frozen', False): mitmproxy_version += " binary" return mitmproxy_version if __name__ == "__main__": # pragma: no cover print(VERSION)
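For illustration, the three shapes the resulting version string can take; the tag distance and commit hash below are invented examples, not real release data:

# Illustrative return values of get_dev_version(); values are made up.
#
#   "5.0.0.dev"                                 # no git metadata available
#   "5.0.0.dev (+14, commit 0a1b2c3)"           # git checkout past the last tag
#   "5.0.0.dev (+14, commit 0a1b2c3) binary"    # frozen PyInstaller build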
NSW Farmers collaborates with leading innovation bodies to deliver projects that are future-focussed and seek to fill gaps in mainstream R&D and extension. Working with our supporters in a community of practice, our aim is to bring together leading edge researchers and technologists with farmers and other practitioners to develop, test and communicate solutions. The AgInnovators' web portal and social media channels provide a platform to share results and extension resources generated by the projects. Subscribe to AgInnovators for project updates and opportunities to get involved. The Solar Powered Pumping Initiative was funded by the NSW Office of Environment and Heritage (OEH) and aimed to promote the use of solar-powered pumping in agriculture across domestic, stock and high-volume irrigation applications. CISCO, NSW Farmers, NSW Department of Primary Industry, CSIRO, and the University of NSW have partnered to establish the Innovation Central laboratory for developing machine-to-machine digital solutions. The Direct Beef Export to China project investigated the potential for cattle producers to export premium beef products to China using advanced paddock-to-plate marketing and ecommerce platforms. Partner Paddocks is an AgInnovators' initiative aimed at helping producers and other food and fibre industry firms to get involved in research and development collaborations.
""" Django settings for polyamide project. Generated by 'django-admin startproject' using Django 1.8. For more information on this file, see https://docs.djangoproject.com/en/1.8/topics/settings/ For the full list of settings and their values, see https://docs.djangoproject.com/en/1.8/ref/settings/ """ # Build paths inside the project like this: os.path.join(BASE_DIR, ...) import os BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__))) # Quick-start development settings - unsuitable for production # See https://docs.djangoproject.com/en/1.8/howto/deployment/checklist/ # SECURITY WARNING: keep the secret key used in production secret! SECRET_KEY = '+md7#^%e3(!o03w1l(hyqk-@_s*p9+xxq038m%t_*$=w(^_qnz' # SECURITY WARNING: don't run with debug turned on in production! DEBUG = True ALLOWED_HOSTS = [] # Application definition INSTALLED_APPS = ( 'django.contrib.admin', 'django.contrib.auth', 'django.contrib.contenttypes', 'django.contrib.sessions', 'django.contrib.messages', 'django.contrib.staticfiles', 'pagedown', #adds the markdown editor from SO 'listofjobs', ) MIDDLEWARE_CLASSES = ( 'django.contrib.sessions.middleware.SessionMiddleware', 'django.middleware.common.CommonMiddleware', 'django.middleware.csrf.CsrfViewMiddleware', 'django.contrib.auth.middleware.AuthenticationMiddleware', 'django.contrib.auth.middleware.SessionAuthenticationMiddleware', 'django.contrib.messages.middleware.MessageMiddleware', 'django.middleware.clickjacking.XFrameOptionsMiddleware', 'django.middleware.security.SecurityMiddleware', ) ROOT_URLCONF = 'polyamide.urls' TEMPLATES = [ { 'BACKEND': 'django.template.backends.django.DjangoTemplates', 'DIRS': [], 'APP_DIRS': True, 'OPTIONS': { 'context_processors': [ 'django.template.context_processors.debug', 'django.template.context_processors.request', 'django.contrib.auth.context_processors.auth', 'django.contrib.messages.context_processors.messages', ], }, }, ] WSGI_APPLICATION = 'polyamide.wsgi.application' # Database # https://docs.djangoproject.com/en/1.8/ref/settings/#databases DATABASES = { 'default': { 'ENGINE': 'django.db.backends.sqlite3', 'NAME': os.path.join(BASE_DIR, 'db.sqlite3'), } } # Internationalization # https://docs.djangoproject.com/en/1.8/topics/i18n/ LANGUAGE_CODE = 'en-us' TIME_ZONE = 'UTC' USE_I18N = True USE_L10N = True USE_TZ = True # Static files (CSS, JavaScript, Images) # https://docs.djangoproject.com/en/1.8/howto/static-files/ STATIC_URL = '/static/'
Project “Rainbows Over Sochi” starts *now*. One ticket costs 5€. You can buy more than one; your chances of winning will then be greater. Please donate the money to an LGBT charity and send proof in the form of emails / screenshots / whatever to [email protected]. The raffle closes as soon as the Closing Ceremony is over, and I will randomly draw the winning ticket ASAP. If you don’t have a favorite charity, my friend Lara has listed a few on her blog. Awesome idea! How about you propose some well-suited LGBT-themed charities, though?