repo_name | path | copies | size | content | license
---|---|---|---|---|---
psathyrella/partis-deprecated
|
packages/ighutil/python/versioneer.py
|
6
|
27828
|
#! /usr/bin/python
"""versioneer.py
(like a rocketeer, but for versions)
* https://github.com/warner/python-versioneer
* Brian Warner
* License: Public Domain
* Version: 0.8+
This file helps distutils-based projects manage their version number by just
creating version-control tags.
For developers who work from a VCS-generated tree (e.g. 'git clone' etc),
each 'setup.py version', 'setup.py build', 'setup.py sdist' will compute a
version number by asking your version-control tool about the current
checkout. The version number will be written into a generated _version.py
file of your choosing, where it can be included by your __init__.py
For users who work from a VCS-generated tarball (e.g. 'git archive'), it will
compute a version number by looking at the name of the directory created when
the tarball is unpacked. This conventionally includes both the name of the
project and a version number.
For users who work from a tarball built by 'setup.py sdist', it will get a
version number from a previously-generated _version.py file.
As a result, loading code directly from the source tree will not result in a
real version. If you want real versions from VCS trees (where you frequently
update from the upstream repository, or do new development), you will need to
do a 'setup.py version' after each update, and load code from the build/
directory.
You need to provide this code with a few configuration values:
versionfile_source:
A project-relative pathname into which the generated version strings
should be written. This is usually a _version.py next to your project's
main __init__.py file. If your project uses src/myproject/__init__.py,
this should be 'src/myproject/_version.py'. This file should be checked
in to your VCS as usual: the copy created below by 'setup.py
update_files' will include code that parses expanded VCS keywords in
generated tarballs. The 'build' and 'sdist' commands will replace it with
a copy that has just the calculated version string.
versionfile_build:
Like versionfile_source, but relative to the build directory instead of
the source directory. These will differ when your setup.py uses
'package_dir='. If you have package_dir={'myproject': 'src/myproject'},
then you will probably have versionfile_build='myproject/_version.py' and
versionfile_source='src/myproject/_version.py'.
tag_prefix: a string, like 'PROJECTNAME-', which appears at the start of all
VCS tags. If your tags look like 'myproject-1.2.0', then you
should use tag_prefix='myproject-'. If you use unprefixed tags
like '1.2.0', this should be an empty string.
parentdir_prefix: a string, frequently the same as tag_prefix, which
appears at the start of all unpacked tarball filenames. If
your tarball unpacks into 'myproject-1.2.0', this should
be 'myproject-'.
To use it:
1: include this file in the top level of your project
2: make the following changes to the top of your setup.py:
import versioneer
versioneer.versionfile_source = 'src/myproject/_version.py'
versioneer.versionfile_build = 'myproject/_version.py'
versioneer.tag_prefix = '' # tags are like 1.2.0
versioneer.parentdir_prefix = 'myproject-' # dirname like 'myproject-1.2.0'
3: add the following arguments to the setup() call in your setup.py:
version=versioneer.get_version(),
cmdclass=versioneer.get_cmdclass(),
4: run 'setup.py update_files', which will create _version.py, and will
modify your __init__.py to define __version__ (by calling a function
from _version.py)
5: modify your MANIFEST.in to include versioneer.py
6: add both versioneer.py and the generated _version.py to your VCS
"""
import os, sys, re
from distutils.core import Command
from distutils.command.sdist import sdist as _sdist
from distutils.command.build import build as _build
versionfile_source = None
versionfile_build = None
tag_prefix = None
parentdir_prefix = None
VCS = "git"
IN_LONG_VERSION_PY = False
LONG_VERSION_PY = '''
IN_LONG_VERSION_PY = True
# This file helps to compute a version number in source trees obtained from
# git-archive tarball (such as those provided by github's download-from-tag
# feature). Distribution tarballs (built by setup.py sdist) and build
# directories (produced by setup.py build) will contain a much shorter file
# that just contains the computed version number.
# This file is released into the public domain. Generated by
# versioneer-0.8+ (https://github.com/warner/python-versioneer)
# these strings will be replaced by git during git-archive
git_refnames = "%(DOLLAR)sFormat:%%d%(DOLLAR)s"
git_full = "%(DOLLAR)sFormat:%%H%(DOLLAR)s"
import subprocess
import sys
def run_command(args, cwd=None, verbose=False, hide_stderr=False):
try:
# remember shell=False, so use git.cmd on windows, not just git
p = subprocess.Popen(args, cwd=cwd, stdout=subprocess.PIPE,
stderr=(subprocess.PIPE if hide_stderr else None))
except EnvironmentError:
e = sys.exc_info()[1]
if verbose:
print("unable to run %%s" %% args[0])
print(e)
return None
stdout = p.communicate()[0].strip()
if sys.version >= '3':
stdout = stdout.decode()
if p.returncode != 0:
if verbose:
print("unable to run %%s (error)" %% args[0])
return None
return stdout
import sys
import re
import os.path
def get_expanded_variables(versionfile_source):
# the code embedded in _version.py can just fetch the value of these
# variables. When used from setup.py, we don't want to import
# _version.py, so we do it with a regexp instead. This function is not
# used from _version.py.
variables = {}
try:
f = open(versionfile_source,"r")
for line in f.readlines():
if line.strip().startswith("git_refnames ="):
mo = re.search(r'=\s*"(.*)"', line)
if mo:
variables["refnames"] = mo.group(1)
if line.strip().startswith("git_full ="):
mo = re.search(r'=\s*"(.*)"', line)
if mo:
variables["full"] = mo.group(1)
f.close()
except EnvironmentError:
pass
return variables
def versions_from_expanded_variables(variables, tag_prefix, verbose=False):
refnames = variables["refnames"].strip()
if refnames.startswith("$Format"):
if verbose:
print("variables are unexpanded, not using")
return {} # unexpanded, so not in an unpacked git-archive tarball
refs = set([r.strip() for r in refnames.strip("()").split(",")])
# starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of
# just "foo-1.0". If we see a "tag: " prefix, prefer those.
TAG = "tag: "
tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)])
if not tags:
# Either we're using git < 1.8.3, or there really are no tags. We use
# a heuristic: assume all version tags have a digit. The old git %%d
# expansion behaves like git log --decorate=short and strips out the
# refs/heads/ and refs/tags/ prefixes that would let us distinguish
# between branches and tags. By ignoring refnames without digits, we
# filter out many common branch names like "release" and
# "stabilization", as well as "HEAD" and "master".
tags = set([r for r in refs if re.search(r'\d', r)])
if verbose:
print("discarding '%%s', no digits" %% ",".join(refs-tags))
if verbose:
print("likely tags: %%s" %% ",".join(sorted(tags)))
for ref in sorted(tags):
# sorting will prefer e.g. "2.0" over "2.0rc1"
if ref.startswith(tag_prefix):
r = ref[len(tag_prefix):]
if verbose:
print("picking %%s" %% r)
return { "version": r,
"full": variables["full"].strip() }
# no suitable tags, so we use the full revision id
if verbose:
print("no suitable tags, using full revision id")
return { "version": variables["full"].strip(),
"full": variables["full"].strip() }
def versions_from_vcs(tag_prefix, versionfile_source, verbose=False):
# this runs 'git' from the root of the source tree. That either means
# someone ran a setup.py command (and this code is in versioneer.py, so
# IN_LONG_VERSION_PY=False, thus the containing directory is the root of
# the source tree), or someone ran a project-specific entry point (and
# this code is in _version.py, so IN_LONG_VERSION_PY=True, thus the
# containing directory is somewhere deeper in the source tree). This only
# gets called if the git-archive 'subst' variables were *not* expanded,
# and _version.py hasn't already been rewritten with a short version
# string, meaning we're inside a checked out source tree.
try:
here = os.path.abspath(__file__)
except NameError:
# some py2exe/bbfreeze/non-CPython implementations don't do __file__
return {} # not always correct
GIT = "git"
if sys.platform == "win32":
GIT = "git.cmd"
# versionfile_source is the relative path from the top of the source tree
# (where the .git directory might live) to this file. Invert this to find
# the root from __file__.
root = here
if IN_LONG_VERSION_PY:
for i in range(len(versionfile_source.split("/"))):
root = os.path.dirname(root)
else:
toplevel = run_command([GIT, "rev-parse", "--show-toplevel"],
hide_stderr=True)
root = (toplevel.strip() if toplevel else os.path.dirname(here))
if not os.path.exists(os.path.join(root, ".git")):
if verbose:
print("no .git in %%s" %% root)
return {}
stdout = run_command([GIT, "describe", "--tags", "--dirty", "--always"],
cwd=root)
if stdout is None:
return {}
if not stdout.startswith(tag_prefix):
if verbose:
print("tag '%%s' doesn't start with prefix '%%s'" %% (stdout, tag_prefix))
return {}
tag = stdout[len(tag_prefix):]
stdout = run_command([GIT, "rev-parse", "HEAD"], cwd=root)
if stdout is None:
return {}
full = stdout.strip()
if tag.endswith("-dirty"):
full += "-dirty"
return {"version": tag, "full": full}
def versions_from_parentdir(parentdir_prefix, versionfile_source, verbose=False):
if IN_LONG_VERSION_PY:
# We're running from _version.py. If it's from a source tree
# (execute-in-place), we can work upwards to find the root of the
# tree, and then check the parent directory for a version string. If
# it's in an installed application, there's no hope.
try:
here = os.path.abspath(__file__)
except NameError:
# py2exe/bbfreeze/non-CPython don't have __file__
return {} # without __file__, we have no hope
# versionfile_source is the relative path from the top of the source
# tree to _version.py. Invert this to find the root from __file__.
root = here
for i in range(len(versionfile_source.split("/"))):
root = os.path.dirname(root)
else:
# we're running from versioneer.py, which means we're running from
# the setup.py in a source tree. sys.argv[0] is setup.py in the root.
here = os.path.abspath(sys.argv[0])
root = os.path.dirname(here)
# Source tarballs conventionally unpack into a directory that includes
# both the project name and a version string.
dirname = os.path.basename(root)
if not dirname.startswith(parentdir_prefix):
if verbose:
print("guessing rootdir is '%%s', but '%%s' doesn't start with prefix '%%s'" %%
(root, dirname, parentdir_prefix))
return None
return {"version": dirname[len(parentdir_prefix):], "full": ""}
tag_prefix = "%(TAG_PREFIX)s"
parentdir_prefix = "%(PARENTDIR_PREFIX)s"
versionfile_source = "%(VERSIONFILE_SOURCE)s"
def get_versions(default={"version": "unknown", "full": ""}, verbose=False):
variables = { "refnames": git_refnames, "full": git_full }
ver = versions_from_expanded_variables(variables, tag_prefix, verbose)
if not ver:
ver = versions_from_vcs(tag_prefix, versionfile_source, verbose)
if not ver:
ver = versions_from_parentdir(parentdir_prefix, versionfile_source,
verbose)
if not ver:
ver = default
return ver
'''
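# Note on the template above: cmd_update_files (below) renders it with
# old-style %-formatting, which is why literal percent signs inside it are
# doubled. A minimal sketch of that substitution, with placeholder values:
#
#   rendered = LONG_VERSION_PY % {"DOLLAR": "$",
#                                 "TAG_PREFIX": "myproject-",
#                                 "PARENTDIR_PREFIX": "myproject-",
#                                 "VERSIONFILE_SOURCE": "src/myproject/_version.py"}
#
# "%(DOLLAR)s" keeps the "$Format:...$" markers literal so that git-archive
# can expand them later.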
import subprocess
import sys
def run_command(args, cwd=None, verbose=False, hide_stderr=False):
try:
# remember shell=False, so use git.cmd on windows, not just git
p = subprocess.Popen(args, cwd=cwd, stdout=subprocess.PIPE,
stderr=(subprocess.PIPE if hide_stderr else None))
except EnvironmentError:
e = sys.exc_info()[1]
if verbose:
print("unable to run %s" % args[0])
print(e)
return None
stdout = p.communicate()[0].strip()
if sys.version >= '3':
stdout = stdout.decode()
if p.returncode != 0:
if verbose:
print("unable to run %s (error)" % args[0])
return None
return stdout
import sys
import re
import os.path
def get_expanded_variables(versionfile_source):
# the code embedded in _version.py can just fetch the value of these
# variables. When used from setup.py, we don't want to import
# _version.py, so we do it with a regexp instead. This function is not
# used from _version.py.
variables = {}
try:
f = open(versionfile_source,"r")
for line in f.readlines():
if line.strip().startswith("git_refnames ="):
mo = re.search(r'=\s*"(.*)"', line)
if mo:
variables["refnames"] = mo.group(1)
if line.strip().startswith("git_full ="):
mo = re.search(r'=\s*"(.*)"', line)
if mo:
variables["full"] = mo.group(1)
f.close()
except EnvironmentError:
pass
return variables
def versions_from_expanded_variables(variables, tag_prefix, verbose=False):
refnames = variables["refnames"].strip()
if refnames.startswith("$Format"):
if verbose:
print("variables are unexpanded, not using")
return {} # unexpanded, so not in an unpacked git-archive tarball
refs = set([r.strip() for r in refnames.strip("()").split(",")])
# starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of
# just "foo-1.0". If we see a "tag: " prefix, prefer those.
TAG = "tag: "
tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)])
if not tags:
# Either we're using git < 1.8.3, or there really are no tags. We use
# a heuristic: assume all version tags have a digit. The old git %d
# expansion behaves like git log --decorate=short and strips out the
# refs/heads/ and refs/tags/ prefixes that would let us distinguish
# between branches and tags. By ignoring refnames without digits, we
# filter out many common branch names like "release" and
# "stabilization", as well as "HEAD" and "master".
tags = set([r for r in refs if re.search(r'\d', r)])
if verbose:
print("discarding '%s', no digits" % ",".join(refs-tags))
if verbose:
print("likely tags: %s" % ",".join(sorted(tags)))
for ref in sorted(tags):
# sorting will prefer e.g. "2.0" over "2.0rc1"
if ref.startswith(tag_prefix):
r = ref[len(tag_prefix):]
if verbose:
print("picking %s" % r)
return { "version": r,
"full": variables["full"].strip() }
# no suitable tags, so we use the full revision id
if verbose:
print("no suitable tags, using full revision id")
return { "version": variables["full"].strip(),
"full": variables["full"].strip() }
def versions_from_vcs(tag_prefix, versionfile_source, verbose=False):
# this runs 'git' from the root of the source tree. That either means
# someone ran a setup.py command (and this code is in versioneer.py, so
# IN_LONG_VERSION_PY=False, thus the containing directory is the root of
# the source tree), or someone ran a project-specific entry point (and
# this code is in _version.py, so IN_LONG_VERSION_PY=True, thus the
# containing directory is somewhere deeper in the source tree). This only
# gets called if the git-archive 'subst' variables were *not* expanded,
# and _version.py hasn't already been rewritten with a short version
# string, meaning we're inside a checked out source tree.
try:
here = os.path.abspath(__file__)
except NameError:
# some py2exe/bbfreeze/non-CPython implementations don't do __file__
return {} # not always correct
GIT = "git"
if sys.platform == "win32":
GIT = "git.cmd"
# versionfile_source is the relative path from the top of the source tree
# (where the .git directory might live) to this file. Invert this to find
# the root from __file__.
root = here
if IN_LONG_VERSION_PY:
for i in range(len(versionfile_source.split("/"))):
root = os.path.dirname(root)
else:
toplevel = run_command([GIT, "rev-parse", "--show-toplevel"],
hide_stderr=True)
root = (toplevel.strip() if toplevel else os.path.dirname(here))
if not os.path.exists(os.path.join(root, ".git")):
if verbose:
print("no .git in %s" % root)
return {}
stdout = run_command([GIT, "describe", "--tags", "--dirty", "--always"],
cwd=root)
if stdout is None:
return {}
if not stdout.startswith(tag_prefix):
if verbose:
print("tag '%s' doesn't start with prefix '%s'" % (stdout, tag_prefix))
return {}
tag = stdout[len(tag_prefix):]
stdout = run_command([GIT, "rev-parse", "HEAD"], cwd=root)
if stdout is None:
return {}
full = stdout.strip()
if tag.endswith("-dirty"):
full += "-dirty"
return {"version": tag, "full": full}
def versions_from_parentdir(parentdir_prefix, versionfile_source, verbose=False):
if IN_LONG_VERSION_PY:
# We're running from _version.py. If it's from a source tree
# (execute-in-place), we can work upwards to find the root of the
# tree, and then check the parent directory for a version string. If
# it's in an installed application, there's no hope.
try:
here = os.path.abspath(__file__)
except NameError:
# py2exe/bbfreeze/non-CPython don't have __file__
return {} # without __file__, we have no hope
# versionfile_source is the relative path from the top of the source
# tree to _version.py. Invert this to find the root from __file__.
root = here
for i in range(len(versionfile_source.split("/"))):
root = os.path.dirname(root)
else:
# we're running from versioneer.py, which means we're running from
# the setup.py in a source tree. sys.argv[0] is setup.py in the root.
here = os.path.abspath(sys.argv[0])
root = os.path.dirname(here)
# Source tarballs conventionally unpack into a directory that includes
# both the project name and a version string.
dirname = os.path.basename(root)
if not dirname.startswith(parentdir_prefix):
if verbose:
print("guessing rootdir is '%s', but '%s' doesn't start with prefix '%s'" %
(root, dirname, parentdir_prefix))
return None
return {"version": dirname[len(parentdir_prefix):], "full": ""}
import os.path
import sys
# os.path.relpath only appeared in Python-2.6. Define it here for 2.5.
def os_path_relpath(path, start=os.path.curdir):
"""Return a relative version of a path"""
if not path:
raise ValueError("no path specified")
start_list = [x for x in os.path.abspath(start).split(os.path.sep) if x]
path_list = [x for x in os.path.abspath(path).split(os.path.sep) if x]
# Work out how much of the filepath is shared by start and path.
i = len(os.path.commonprefix([start_list, path_list]))
rel_list = [os.path.pardir] * (len(start_list)-i) + path_list[i:]
if not rel_list:
return os.path.curdir
return os.path.join(*rel_list)
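# For illustration, the backport mirrors os.path.relpath on POSIX-style paths
# (example paths are hypothetical):
#   os_path_relpath("/a/b/c", "/a")  -> "b/c"
#   os_path_relpath("/a", "/a/b/c")  -> "../.."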
def do_vcs_install(versionfile_source, ipy):
GIT = "git"
if sys.platform == "win32":
GIT = "git.cmd"
files = [versionfile_source, ipy]
try:
me = __file__
if me.endswith(".pyc") or me.endswith(".pyo"):
me = os.path.splitext(me)[0] + ".py"
versioneer_file = os_path_relpath(me)
except NameError:
versioneer_file = "versioneer.py"
files.append(versioneer_file)
present = False
try:
f = open(".gitattributes", "r")
for line in f.readlines():
if line.strip().startswith(versionfile_source):
if "export-subst" in line.strip().split()[1:]:
present = True
f.close()
except EnvironmentError:
pass
if not present:
f = open(".gitattributes", "a+")
f.write("%s export-subst\n" % versionfile_source)
f.close()
files.append(".gitattributes")
run_command([GIT, "add", "--"] + files)
SHORT_VERSION_PY = """
# This file was generated by 'versioneer.py' (0.8+) from
# revision-control system data, or from the parent directory name of an
# unpacked source archive. Distribution tarballs contain a pre-generated copy
# of this file.
version_version = '%(version)s'
version_full = '%(full)s'
def get_versions(default={}, verbose=False):
return {'version': version_version, 'full': version_full}
"""
DEFAULT = {"version": "unknown", "full": "unknown"}
def versions_from_file(filename):
versions = {}
try:
f = open(filename)
except EnvironmentError:
return versions
for line in f.readlines():
mo = re.match("version_version = '([^']+)'", line)
if mo:
versions["version"] = mo.group(1)
mo = re.match("version_full = '([^']+)'", line)
if mo:
versions["full"] = mo.group(1)
f.close()
return versions
def write_to_version_file(filename, versions):
f = open(filename, "w")
f.write(SHORT_VERSION_PY % versions)
f.close()
print("set %s to '%s'" % (filename, versions["version"]))
def get_best_versions(versionfile, tag_prefix, parentdir_prefix,
default=DEFAULT, verbose=False):
# returns dict with two keys: 'version' and 'full'
#
# extract version from first of _version.py, 'git describe', parentdir.
# This is meant to work for developers using a source checkout, for users
# of a tarball created by 'setup.py sdist', and for users of a
# tarball/zipball created by 'git archive' or github's download-from-tag
# feature.
variables = get_expanded_variables(versionfile_source)
if variables:
ver = versions_from_expanded_variables(variables, tag_prefix)
if ver:
if verbose: print("got version from expanded variable %s" % ver)
return ver
ver = versions_from_file(versionfile)
if ver:
if verbose: print("got version from file %s %s" % (versionfile, ver))
return ver
ver = versions_from_vcs(tag_prefix, versionfile_source, verbose)
if ver:
if verbose: print("got version from git %s" % ver)
return ver
ver = versions_from_parentdir(parentdir_prefix, versionfile_source, verbose)
if ver:
if verbose: print("got version from parentdir %s" % ver)
return ver
if verbose: print("got version from default %s" % ver)
return default
def get_versions(default=DEFAULT, verbose=False):
assert versionfile_source is not None, "please set versioneer.versionfile_source"
assert tag_prefix is not None, "please set versioneer.tag_prefix"
assert parentdir_prefix is not None, "please set versioneer.parentdir_prefix"
return get_best_versions(versionfile_source, tag_prefix, parentdir_prefix,
default=default, verbose=verbose)
def get_version(verbose=False):
return get_versions(verbose=verbose)["version"]
class cmd_version(Command):
description = "report generated version string"
user_options = []
boolean_options = []
def initialize_options(self):
pass
def finalize_options(self):
pass
def run(self):
ver = get_version(verbose=True)
print("Version is currently: %s" % ver)
class cmd_build(_build):
def run(self):
versions = get_versions(verbose=True)
_build.run(self)
# now locate _version.py in the new build/ directory and replace it
# with an updated value
target_versionfile = os.path.join(self.build_lib, versionfile_build)
print("UPDATING %s" % target_versionfile)
os.unlink(target_versionfile)
f = open(target_versionfile, "w")
f.write(SHORT_VERSION_PY % versions)
f.close()
class cmd_sdist(_sdist):
def run(self):
versions = get_versions(verbose=True)
self._versioneer_generated_versions = versions
# unless we update this, the command will keep using the old version
self.distribution.metadata.version = versions["version"]
return _sdist.run(self)
def make_release_tree(self, base_dir, files):
_sdist.make_release_tree(self, base_dir, files)
# now locate _version.py in the new base_dir directory (remembering
# that it may be a hardlink) and replace it with an updated value
target_versionfile = os.path.join(base_dir, versionfile_source)
print("UPDATING %s" % target_versionfile)
os.unlink(target_versionfile)
f = open(target_versionfile, "w")
f.write(SHORT_VERSION_PY % self._versioneer_generated_versions)
f.close()
INIT_PY_SNIPPET = """
from ._version import get_versions
__version__ = get_versions()['version']
del get_versions
"""
class cmd_update_files(Command):
description = "modify __init__.py and create _version.py"
user_options = []
boolean_options = []
def initialize_options(self):
pass
def finalize_options(self):
pass
def run(self):
ipy = os.path.join(os.path.dirname(versionfile_source), "__init__.py")
print(" creating %s" % versionfile_source)
f = open(versionfile_source, "w")
f.write(LONG_VERSION_PY % {"DOLLAR": "$",
"TAG_PREFIX": tag_prefix,
"PARENTDIR_PREFIX": parentdir_prefix,
"VERSIONFILE_SOURCE": versionfile_source,
})
f.close()
try:
old = open(ipy, "r").read()
except EnvironmentError:
old = ""
if INIT_PY_SNIPPET not in old:
print(" appending to %s" % ipy)
f = open(ipy, "a")
f.write(INIT_PY_SNIPPET)
f.close()
else:
print(" %s unmodified" % ipy)
do_vcs_install(versionfile_source, ipy)
def get_cmdclass():
return {'version': cmd_version,
'update_files': cmd_update_files,
'build': cmd_build,
'sdist': cmd_sdist,
}
|
gpl-3.0
|
manipopopo/tensorflow
|
tensorflow/python/saved_model/tag_constants.py
|
20
|
1516
|
# Copyright 2016 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Common tags used for graphs in SavedModel.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from tensorflow.python.util.tf_export import tf_export
# Tag for the `serving` graph.
SERVING = "serve"
tf_export("saved_model.tag_constants.SERVING").export_constant(
__name__, "SERVING")
# Tag for the `training` graph.
TRAINING = "train"
tf_export("saved_model.tag_constants.TRAINING").export_constant(
__name__, "TRAINING")
# Tag for the `eval` graph. Not exported while the export logic is in contrib.
EVAL = "eval"
# Tag for the `gpu` graph.
GPU = "gpu"
tf_export("saved_model.tag_constants.GPU").export_constant(__name__, "GPU")
# Tag for the `tpu` graph.
TPU = "tpu"
tf_export("saved_model.tag_constants.TPU").export_constant(__name__, "TPU")
|
apache-2.0
|
javilonas/Thoth-SM-N9005-Sammy
|
scripts/tracing/draw_functrace.py
|
14676
|
3560
|
#!/usr/bin/python
"""
Copyright 2008 (c) Frederic Weisbecker <[email protected]>
Licensed under the terms of the GNU GPL License version 2
This script parses a trace provided by the function tracer in
kernel/trace/trace_functions.c
The resulting trace is processed into a tree to produce a more human-readable
view of the call stack by drawing a textual but hierarchical tree of
calls. Only the functions' names and the call times are provided.
Usage:
Be sure that you have CONFIG_FUNCTION_TRACER
# mount -t debugfs nodev /sys/kernel/debug
# echo function > /sys/kernel/debug/tracing/current_tracer
$ cat /sys/kernel/debug/tracing/trace_pipe > ~/raw_trace_func
Wait a little while, but not too long; the script is a bit slow.
Break the pipe (Ctrl + Z)
$ scripts/draw_functrace.py < raw_trace_func > draw_functrace
Then you have your drawn trace in draw_functrace
"""
import sys, re
class CallTree:
""" This class provides a tree representation of the functions
call stack. If a function has no parent in the kernel (interrupt,
syscall, kernel thread...) then it is attached to a virtual parent
called ROOT.
"""
ROOT = None
def __init__(self, func, time = None, parent = None):
self._func = func
self._time = time
if parent is None:
self._parent = CallTree.ROOT
else:
self._parent = parent
self._children = []
def calls(self, func, calltime):
""" If a function calls another one, call this method to insert it
into the tree at the appropriate place.
@return: A reference to the newly created child node.
"""
child = CallTree(func, calltime, self)
self._children.append(child)
return child
def getParent(self, func):
""" Retrieve the last parent of the current node that
has the name given by func. If this function is not
found among the ancestors, then create it as a new child of root.
@return: A reference to the parent.
"""
tree = self
while tree != CallTree.ROOT and tree._func != func:
tree = tree._parent
if tree == CallTree.ROOT:
child = CallTree.ROOT.calls(func, None)
return child
return tree
def __repr__(self):
return self.__toString("", True)
def __toString(self, branch, lastChild):
if self._time is not None:
s = "%s----%s (%s)\n" % (branch, self._func, self._time)
else:
s = "%s----%s\n" % (branch, self._func)
i = 0
if lastChild:
branch = branch[:-1] + " "
while i < len(self._children):
if i != len(self._children) - 1:
s += "%s" % self._children[i].__toString(branch +\
" |", False)
else:
s += "%s" % self._children[i].__toString(branch +\
" |", True)
i += 1
return s
class BrokenLineException(Exception):
"""If the last line is not complete because of the pipe breakage,
we want to stop the processing and ignore this line.
"""
pass
class CommentLineException(Exception):
""" If the line is a comment (as in the beginning of the trace file),
just ignore it.
"""
pass
def parseLine(line):
line = line.strip()
if line.startswith("#"):
raise CommentLineException
m = re.match("[^]]+?\\] +([0-9.]+): (\\w+) <-(\\w+)", line)
if m is None:
raise BrokenLineException
return (m.group(1), m.group(2), m.group(3))
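# For illustration, a raw trace line such as this hypothetical sample:
#   "bash-1337 [002] 6924.812398: do_sys_open <-sys_open"
# parses to ("6924.812398", "do_sys_open", "sys_open"); the callee/caller pair
# comes from the "callee <-caller" format emitted by the function tracer.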
def main():
CallTree.ROOT = CallTree("Root (Nowhere)", None, None)
tree = CallTree.ROOT
for line in sys.stdin:
try:
calltime, callee, caller = parseLine(line)
except BrokenLineException:
break
except CommentLineException:
continue
tree = tree.getParent(caller)
tree = tree.calls(callee, calltime)
print CallTree.ROOT
if __name__ == "__main__":
main()
|
gpl-2.0
|
AlexanderYAPPO/stackalytics
|
tests/unit/test_default_data_processor.py
|
4
|
4084
|
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import copy
import mock
import testtools
from stackalytics.processor import default_data_processor
from stackalytics.processor import normalizer
from tests.unit import test_data
class TestDefaultDataProcessor(testtools.TestCase):
def setUp(self):
super(TestDefaultDataProcessor, self).setUp()
self.get_users = mock.Mock(return_value=[
test_data.USERS,
])
normalized_data = copy.deepcopy(test_data.DEFAULT_DATA)
normalizer.normalize_default_data(normalized_data)
def tearDown(self):
super(TestDefaultDataProcessor, self).tearDown()
def test_normalizer(self):
data = copy.deepcopy(test_data.DEFAULT_DATA)
normalizer.normalize_default_data(data)
self.assertIn('releases', data['repos'][0])
self.assertEqual([], data['repos'][0]['releases'],
message='Empty list of releases expected')
self.assertEqual(0, data['users'][0]['companies'][-1]['end_date'],
message='The last company end date should be 0')
self.assertIn('user_id', data['users'][0])
self.assertEqual(test_data.USERS[0]['launchpad_id'],
data['users'][0]['user_id'],
message='User id should be set')
# verify that *independent company is added automatically
self.assertEqual(3, len(data['users'][1]['companies']))
self.assertEqual(0, data['users'][1]['companies'][-1]['end_date'],
message='The last company end date should be 0')
def test_update_project_list(self):
with mock.patch('stackalytics.processor.default_data_processor.'
'_retrieve_project_list_from_gerrit') as retriever:
retriever.return_value = [
{'module': 'nova',
'uri': 'git://git.openstack.org/openstack/nova',
'organization': 'openstack'},
{'module': 'qa', 'uri': 'git://git.openstack.org/openstack/qa',
'organization': 'openstack'},
]
dd = {
'repos': [
{'module': 'qa',
'uri': 'git://git.openstack.org/openstack/qa',
'organization': 'openstack'},
{'module': 'tux',
'uri': 'git://git.openstack.org/stackforge/tux',
'organization': 'stackforge'},
],
'project_sources': [{'organization': 'openstack',
'uri': 'gerrit://'}],
'module_groups': [],
}
default_data_processor._update_project_list(dd)
self.assertEqual(3, len(dd['repos']))
self.assertIn('qa', set([r['module'] for r in dd['repos']]))
self.assertIn('nova', set([r['module'] for r in dd['repos']]))
self.assertIn('tux', set([r['module'] for r in dd['repos']]))
self.assertEqual(2, len(dd['module_groups']))
self.assertIn({'id': 'openstack',
'module_group_name': 'openstack',
'modules': ['qa', 'nova'],
'tag': 'organization'}, dd['module_groups'])
self.assertIn({'id': 'stackforge',
'module_group_name': 'stackforge',
'modules': ['tux'],
'tag': 'organization'}, dd['module_groups'])
|
apache-2.0
|
zanderle/django
|
django/utils/baseconv.py
|
650
|
2982
|
# Copyright (c) 2010 Guilherme Gondim. All rights reserved.
# Copyright (c) 2009 Simon Willison. All rights reserved.
# Copyright (c) 2002 Drew Perttula. All rights reserved.
#
# License:
# Python Software Foundation License version 2
#
# See the file "LICENSE" for terms & conditions for usage, and a DISCLAIMER OF
# ALL WARRANTIES.
#
# This Baseconv distribution contains no GNU General Public Licensed (GPLed)
# code so it may be used in proprietary projects just like prior ``baseconv``
# distributions.
#
# All trademarks referenced herein are property of their respective holders.
#
"""
Convert numbers from base 10 integers to base X strings and back again.
Sample usage::
>>> base20 = BaseConverter('0123456789abcdefghij')
>>> base20.encode(1234)
'31e'
>>> base20.decode('31e')
1234
>>> base20.encode(-1234)
'-31e'
>>> base20.decode('-31e')
-1234
>>> base11 = BaseConverter('0123456789-', sign='$')
>>> base11.encode(-1234)
'$-22'
>>> base11.decode('$-22')
-1234
"""
BASE2_ALPHABET = '01'
BASE16_ALPHABET = '0123456789ABCDEF'
BASE56_ALPHABET = '23456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnpqrstuvwxyz'
BASE36_ALPHABET = '0123456789abcdefghijklmnopqrstuvwxyz'
BASE62_ALPHABET = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz'
BASE64_ALPHABET = BASE62_ALPHABET + '-_'
class BaseConverter(object):
decimal_digits = '0123456789'
def __init__(self, digits, sign='-'):
self.sign = sign
self.digits = digits
if sign in self.digits:
raise ValueError('Sign character found in converter base digits.')
def __repr__(self):
return "<BaseConverter: base%s (%s)>" % (len(self.digits), self.digits)
def encode(self, i):
neg, value = self.convert(i, self.decimal_digits, self.digits, '-')
if neg:
return self.sign + value
return value
def decode(self, s):
neg, value = self.convert(s, self.digits, self.decimal_digits, self.sign)
if neg:
value = '-' + value
return int(value)
def convert(self, number, from_digits, to_digits, sign):
if str(number)[0] == sign:
number = str(number)[1:]
neg = 1
else:
neg = 0
# make an integer out of the number
x = 0
for digit in str(number):
x = x * len(from_digits) + from_digits.index(digit)
# create the result in base 'len(to_digits)'
if x == 0:
res = to_digits[0]
else:
res = ''
while x > 0:
digit = x % len(to_digits)
res = to_digits[digit] + res
x = int(x // len(to_digits))
return neg, res
base2 = BaseConverter(BASE2_ALPHABET)
base16 = BaseConverter(BASE16_ALPHABET)
base36 = BaseConverter(BASE36_ALPHABET)
base56 = BaseConverter(BASE56_ALPHABET)
base62 = BaseConverter(BASE62_ALPHABET)
base64 = BaseConverter(BASE64_ALPHABET, sign='$')
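# A couple of worked examples with the converters above (values computed by
# hand from the alphabets, so treat them as illustrative):
#   base62.encode(123456789)  -> '8M0kX'
#   base62.decode('8M0kX')    -> 123456789
#   base64.encode(-1234)      -> '$JI'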
|
bsd-3-clause
|
jfemiani/facade-segmentation
|
pyfacades/util/process_strip.py
|
1
|
3952
|
import numpy as np
import skimage
from skimage.transform import rescale
def split_tiles(image, shape, overlap=16):
""" Rescale and split the input images to get several overlapping images of a given shape.
*** The inpput must be CHANNELS FIRST ***
The input image is rescaled so that height matches the output height.
It is split into possibly overlapping tiles, each sized to match the output shape
"""
# image_channels = image.shape[0]
image_height = image.shape[-2]
# image_width = image.shape[-1]
output_height = shape[-2]
output_width = shape[-1]
# Rescale to match vertical size
scale = output_height / float(image_height)
scaled_image = rescale(image.transpose(1, 2, 0), (scale, scale), order=0, preserve_range=True).transpose(2, 0, 1)
scaled_width = scaled_image.shape[-1]
if scaled_width < output_width:
padding = output_width - scaled_width
if len(scaled_image.shape) == 3:
scaled_image = np.pad(scaled_image, ((0, 0), (0, 0), (padding / 2, padding - padding / 2)), mode='constant')
else:
scaled_image = np.pad(scaled_image, ((0, 0), (padding / 2, padding - padding / 2)), mode='constant')
# Since the input is not a multiple of the output width, we will evenly divide the image
# to produce overlapping tiles. Work it out.
# -- The last tile always fits, and does not overlap with the _next_ tile (there is none)
# -- The remaining tiles each overlap with the following tile. The width of uncovered portion
# evenly divides the rest of the strip
# -- I need an integer number of tiles to cover the remaining strip (so I use a ceil)
num_tiles = 1 + int(np.ceil(max(0, (scaled_width - output_width)) / float(output_width - overlap)))
for x in np.linspace(0, scaled_width - output_width, num_tiles):
yield scaled_image[:, :, int(x):int(x) + output_width]
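# For illustration (shapes are hypothetical): a channels-first 3x100x600 image
# split with shape=(3, 100, 512) and the default overlap=16 keeps scale=1 and
# yields 1 + ceil((600 - 512) / (512 - 16)) = 2 overlapping 3x100x512 tiles:
#   tiles = list(split_tiles(image, (3, 100, 512)))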
def combine_tiles(images, output_shape):
""" Combine a set of overlapping images to match the output shape
This routine is the inverse of split_tiles.
The 'images' parameter is a sequence of identically-shaped images.
They are evenly spaced in the output_shape.
Overlapping regions are averaged (arithmetic mean).
"""
# You may need to reshape in order to curry extra dimensions into the 'channels'
tn, tc, th, tw = images.shape
oc, oh, ow = output_shape
# The number of channels should match. In my case I have 5D inputs so I slice...
assert tc == oc
if oh != th:
s = float(th) / oh
# The scaled-down image, after tiles are combined, must still be at least one pixel wide (!)
result = combine_tiles(images, (tc, th, max(1, int(ow * s))))
# noinspection PyTypeChecker
result = result.transpose(1, 2, 0) # Change to channels-last for skimage
result = skimage.transform.resize(result, (oh, ow, oc), preserve_range=True)
result = result.transpose(2, 0, 1) # Back to channels-first
return result
assert oh == th
if ow < tw:
x = (tw - ow) / 2
result = combine_tiles(images, (oc, oh, tw))
result = result[:, :, x:x + ow]
return result
assert ow >= tw
# noinspection PyUnresolvedReferences
tx = (np.linspace(0, ow - tw, tn)).astype(int)
output = np.zeros(output_shape)
counts = np.zeros((oh, ow))
for i, x in enumerate(tx):
x = int(x)
weight = np.ones((th, tw))
if i > 0:
left_padding = tx[i - 1] + tw - x
weight[:, :left_padding] *= np.linspace(0, 1, left_padding)
if i < len(tx) - 1:
right_padding = x + tw - tx[i + 1]
weight[:, -right_padding:] *= np.linspace(1, 0, right_padding)
output[:, :, x:x + tw] += images[i] * weight
counts[:, x:x + tw] += weight
output /= counts
output = output.astype(images.dtype)
return output
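# A round trip under the same hypothetical shapes as the note above:
#   tiles = np.array(list(split_tiles(image, (3, 100, 512))))
#   restored = combine_tiles(tiles, image.shape)
# Overlapped columns are blended by the linear cross-fade weights built in the
# loop above, then normalized by the per-pixel counts.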
|
mit
|
bazz-erp/erpnext
|
erpnext/setup/doctype/authorization_control/authorization_control.py
|
29
|
9897
|
# Copyright (c) 2015, Frappe Technologies Pvt. Ltd. and Contributors
# License: GNU General Public License v3. See license.txt
from __future__ import unicode_literals
import frappe
from frappe.utils import cstr, flt, has_common, comma_or
from frappe import session, _
from erpnext.utilities.transaction_base import TransactionBase
class AuthorizationControl(TransactionBase):
def get_appr_user_role(self, det, doctype_name, total, based_on, condition, item, company):
amt_list, appr_users, appr_roles = [], [], []
users, roles = '',''
if det:
for x in det:
amt_list.append(flt(x[0]))
max_amount = max(amt_list)
app_dtl = frappe.db.sql("""select approving_user, approving_role from `tabAuthorization Rule`
where transaction = %s and (value = %s or value > %s)
and docstatus != 2 and based_on = %s and company = %s %s""" %
('%s', '%s', '%s', '%s', '%s', condition),
(doctype_name, flt(max_amount), total, based_on, company))
if not app_dtl:
app_dtl = frappe.db.sql("""select approving_user, approving_role from `tabAuthorization Rule`
where transaction = %s and (value = %s or value > %s) and docstatus != 2
and based_on = %s and ifnull(company,'') = '' %s""" %
('%s', '%s', '%s', '%s', condition), (doctype_name, flt(max_amount), total, based_on))
for d in app_dtl:
if(d[0]): appr_users.append(d[0])
if(d[1]): appr_roles.append(d[1])
if not has_common(appr_roles, frappe.get_roles()) and not has_common(appr_users, [session['user']]):
frappe.msgprint(_("Not authroized since {0} exceeds limits").format(_(based_on)))
frappe.throw(_("Can be approved by {0}").format(comma_or(appr_roles + appr_users)))
def validate_auth_rule(self, doctype_name, total, based_on, cond, company, item = ''):
chk = 1
add_cond1,add_cond2 = '',''
if based_on == 'Itemwise Discount':
add_cond1 += " and master_name = '"+cstr(item).replace("'", "\\'")+"'"
itemwise_exists = frappe.db.sql("""select value from `tabAuthorization Rule`
where transaction = %s and value <= %s
and based_on = %s and company = %s and docstatus != 2 %s %s""" %
('%s', '%s', '%s', '%s', cond, add_cond1), (doctype_name, total, based_on, company))
if not itemwise_exists:
itemwise_exists = frappe.db.sql("""select value from `tabAuthorization Rule`
where transaction = %s and value <= %s and based_on = %s
and ifnull(company,'') = '' and docstatus != 2 %s %s""" %
('%s', '%s', '%s', cond, add_cond1), (doctype_name, total, based_on))
if itemwise_exists:
self.get_appr_user_role(itemwise_exists, doctype_name, total, based_on, cond+add_cond1, item,company)
chk = 0
if chk == 1:
if based_on == 'Itemwise Discount':
add_cond2 += " and ifnull(master_name,'') = ''"
appr = frappe.db.sql("""select value from `tabAuthorization Rule`
where transaction = %s and value <= %s and based_on = %s
and company = %s and docstatus != 2 %s %s""" %
('%s', '%s', '%s', '%s', cond, add_cond2), (doctype_name, total, based_on, company))
if not appr:
appr = frappe.db.sql("""select value from `tabAuthorization Rule`
where transaction = %s and value <= %s and based_on = %s
and ifnull(company,'') = '' and docstatus != 2 %s %s""" %
('%s', '%s', '%s', cond, add_cond2), (doctype_name, total, based_on))
self.get_appr_user_role(appr, doctype_name, total, based_on, cond+add_cond2, item, company)
def bifurcate_based_on_type(self, doctype_name, total, av_dis, based_on, doc_obj, val, company):
add_cond = ''
auth_value = av_dis
if val == 1: add_cond += " and system_user = '"+session['user'].replace("'", "\\'")+"'"
elif val == 2: add_cond += " and system_role IN %s" % ("('"+"','".join(frappe.get_roles())+"')")
else: add_cond += " and ifnull(system_user,'') = '' and ifnull(system_role,'') = ''"
if based_on == 'Grand Total': auth_value = total
elif based_on == 'Customerwise Discount':
if doc_obj:
if doc_obj.doctype == 'Sales Invoice': customer = doc_obj.customer
else: customer = doc_obj.customer_name
add_cond = " and master_name = '"+cstr(customer).replace("'", "\\'")+"'"
if based_on == 'Itemwise Discount':
if doc_obj:
for t in doc_obj.get("items"):
self.validate_auth_rule(doctype_name, t.discount_percentage, based_on, add_cond, company,t.item_code )
else:
self.validate_auth_rule(doctype_name, auth_value, based_on, add_cond, company)
def validate_approving_authority(self, doctype_name,company, total, doc_obj = ''):
if not frappe.db.count("Authorization Rule"):
return
av_dis = 0
if doc_obj:
price_list_rate, base_rate = 0, 0
for d in doc_obj.get("items"):
if d.base_rate:
price_list_rate += flt(d.base_price_list_rate) or flt(d.base_rate)
base_rate += flt(d.base_rate)
if doc_obj.get("discount_amount"):
base_rate -= flt(doc_obj.discount_amount)
if price_list_rate: av_dis = 100 - flt(base_rate * 100 / price_list_rate)
final_based_on = ['Grand Total','Average Discount','Customerwise Discount','Itemwise Discount']
# Check for authorization set for individual user
based_on = [x[0] for x in frappe.db.sql("""select distinct based_on from `tabAuthorization Rule`
where transaction = %s and system_user = %s
and (company = %s or ifnull(company,'')='') and docstatus != 2""",
(doctype_name, session['user'], company))]
for d in based_on:
self.bifurcate_based_on_type(doctype_name, total, av_dis, d, doc_obj, 1, company)
# Remove user specific rules from global authorization rules
for r in based_on:
if r in final_based_on and r != 'Itemwise Discount': final_based_on.remove(r)
# Check for authorization set on particular roles
based_on = [x[0] for x in frappe.db.sql("""select based_on
from `tabAuthorization Rule`
where transaction = %s and system_role IN (%s) and based_on IN (%s)
and (company = %s or ifnull(company,'')='')
and docstatus != 2
""" % ('%s', "'"+"','".join(frappe.get_roles())+"'", "'"+"','".join(final_based_on)+"'", '%s'), (doctype_name, company))]
for d in based_on:
self.bifurcate_based_on_type(doctype_name, total, av_dis, d, doc_obj, 2, company)
# Remove role specific rules from global authorization rules
for r in based_on:
if r in final_based_on and r != 'Itemwise Discount': final_based_on.remove(r)
# Check for global authorization
for g in final_based_on:
self.bifurcate_based_on_type(doctype_name, total, av_dis, g, doc_obj, 0, company)
def get_value_based_rule(self,doctype_name,employee,total_claimed_amount,company):
val_lst =[]
val = frappe.db.sql("""select value from `tabAuthorization Rule`
where transaction=%s and (to_emp=%s or
to_designation IN (select designation from `tabEmployee` where name=%s))
and ifnull(value,0)< %s and company = %s and docstatus!=2""",
(doctype_name,employee,employee,total_claimed_amount,company))
if not val:
val = frappe.db.sql("""select value from `tabAuthorization Rule`
where transaction=%s and (to_emp=%s or
to_designation IN (select designation from `tabEmployee` where name=%s))
and ifnull(value,0)< %s and ifnull(company,'') = '' and docstatus!=2""",
(doctype_name, employee, employee, total_claimed_amount))
if val:
val_lst = [y[0] for y in val]
else:
val_lst.append(0)
max_val = max(val_lst)
rule = frappe.db.sql("""select name, to_emp, to_designation, approving_role, approving_user
from `tabAuthorization Rule`
where transaction=%s and company = %s
and (to_emp=%s or to_designation IN (select designation from `tabEmployee` where name=%s))
and ifnull(value,0)= %s and docstatus!=2""",
(doctype_name,company,employee,employee,flt(max_val)), as_dict=1)
if not rule:
rule = frappe.db.sql("""select name, to_emp, to_designation, approving_role, approving_user
from `tabAuthorization Rule`
where transaction=%s and ifnull(company,'') = ''
and (to_emp=%s or to_designation IN (select designation from `tabEmployee` where name=%s))
and ifnull(value,0)= %s and docstatus!=2""",
(doctype_name,employee,employee,flt(max_val)), as_dict=1)
return rule
# related to payroll module only
def get_approver_name(self, doctype_name, total, doc_obj=''):
app_user=[]
app_specific_user =[]
rule ={}
if doc_obj:
if doctype_name == 'Expense Claim':
rule = self.get_value_based_rule(doctype_name, doc_obj.employee,
doc_obj.total_claimed_amount, doc_obj.company)
elif doctype_name == 'Appraisal':
rule = frappe.db.sql("""select name, to_emp, to_designation, approving_role, approving_user
from `tabAuthorization Rule` where transaction=%s
and (to_emp=%s or to_designation IN (select designation from `tabEmployee` where name=%s))
and company = %s and docstatus!=2""",
(doctype_name,doc_obj.employee, doc_obj.employee, doc_obj.company),as_dict=1)
if not rule:
rule = frappe.db.sql("""select name, to_emp, to_designation, approving_role, approving_user
from `tabAuthorization Rule`
where transaction=%s and (to_emp=%s or
to_designation IN (select designation from `tabEmployee` where name=%s))
and ifnull(company,'') = '' and docstatus!=2""",
(doctype_name,doc_obj.employee, doc_obj.employee), as_dict=1)
if rule:
for m in rule:
if m['to_emp'] or m['to_designation']:
if m['approving_user']:
app_specific_user.append(m['approving_user'])
elif m['approving_role']:
user_lst = [z[0] for z in frappe.db.sql("""select distinct t1.name
from `tabUser` t1, `tabHas Role` t2 where t2.role=%s
and t2.parent=t1.name and t1.name !='Administrator'
and t1.name != 'Guest' and t1.docstatus !=2""", m['approving_role'])]
for x in user_lst:
if not x in app_user:
app_user.append(x)
if len(app_specific_user) >0:
return app_specific_user
else:
return app_user
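# A usage sketch (the document and amounts are hypothetical): a submitting
# controller would call
#   AuthorizationControl().validate_approving_authority('Sales Invoice',
#       company, grand_total, doc)
# which raises via frappe.throw() when the configured limits are exceeded for
# the current user and roles.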
|
gpl-3.0
|
FrankBian/kuma
|
vendor/packages/sqlalchemy/test/aaa_profiling/test_orm.py
|
7
|
3012
|
from sqlalchemy.test.testing import eq_, assert_raises, \
assert_raises_message
from sqlalchemy import exc as sa_exc, util, Integer, String, ForeignKey
from sqlalchemy.orm import exc as orm_exc, mapper, relationship, \
sessionmaker
from sqlalchemy.test import testing, profiling
from test.orm import _base
from sqlalchemy.test.schema import Table, Column
class MergeTest(_base.MappedTest):
@classmethod
def define_tables(cls, metadata):
parent = Table('parent', metadata, Column('id', Integer,
primary_key=True,
test_needs_autoincrement=True), Column('data',
String(20)))
child = Table('child', metadata, Column('id', Integer,
primary_key=True, test_needs_autoincrement=True),
Column('data', String(20)), Column('parent_id',
Integer, ForeignKey('parent.id'), nullable=False))
@classmethod
def setup_classes(cls):
class Parent(_base.BasicEntity):
pass
class Child(_base.BasicEntity):
pass
@classmethod
@testing.resolve_artifact_names
def setup_mappers(cls):
mapper(Parent, parent, properties={'children'
: relationship(Child, backref='parent')})
mapper(Child, child)
@classmethod
@testing.resolve_artifact_names
def insert_data(cls):
parent.insert().execute({'id': 1, 'data': 'p1'})
child.insert().execute({'id': 1, 'data': 'p1c1', 'parent_id'
: 1})
@testing.resolve_artifact_names
def test_merge_no_load(self):
sess = sessionmaker()()
sess2 = sessionmaker()()
p1 = sess.query(Parent).get(1)
p1.children
# down from 185 on this; this is a small slice of a usually
# bigger operation, so use a small variance
@profiling.function_call_count(95, variance=0.001,
versions={'2.4': 67, '3': 96})
def go():
return sess2.merge(p1, load=False)
p2 = go()
# third call, merge object already present. almost no calls.
@profiling.function_call_count(12, variance=0.001,
versions={'2.4': 8, '3': 13})
def go():
return sess2.merge(p2, load=False)
p3 = go()
@testing.only_on('sqlite', 'Call counts tailored to pysqlite')
@testing.resolve_artifact_names
def test_merge_load(self):
sess = sessionmaker()()
sess2 = sessionmaker()()
p1 = sess.query(Parent).get(1)
p1.children
# preloading of collection took this down from 1728 to 1192
# using sqlite3 the C extension took it back up to approx. 1257
# (py2.6)
@profiling.function_call_count(1257, versions={'2.4': 807})
def go():
p2 = sess2.merge(p1)
go()
# one more time, count the SQL
sess2 = sessionmaker()()
self.assert_sql_count(testing.db, go, 2)
|
mpl-2.0
|
ddRPB/rpb-server
|
services/OCStudyEventDefinitionWsService.py
|
1
|
6030
|
#### ## ## ######## ####### ######## ######## ######
## ### ### ## ## ## ## ## ## ## ## ##
## #### #### ## ## ## ## ## ## ## ##
## ## ### ## ######## ## ## ######## ## ######
## ## ## ## ## ## ## ## ## ##
## ## ## ## ## ## ## ## ## ## ##
#### ## ## ## ####### ## ## ## ######
# Logging
import logging
import logging.config
# SOAP
import pysimplesoap.client
from pysimplesoap.client import SoapClient
from pysimplesoap.simplexml import SimpleXMLElement
from pysimplesoap.transport import get_http_wrapper, set_http_wrapper
# Domain
from domain.StudyEventDefinition import StudyEventDefinition
from domain.EventDefinitionCrf import EventDefinitionCrf
from domain.Crf import Crf
from domain.CrfVersion import CrfVersion
#----------------------------------------------------------------------
#------------------------------ Constants -----------------------------
STUDYNAMESPACE = "http://openclinica.org/ws/studyEventDefinition/v1"
STUDYACTION = "http://openclinica.org/ws/studyEventDefinition/v1"
###### ######## ######## ## ## #### ###### ########
## ## ## ## ## ## ## ## ## ## ##
## ## ## ## ## ## ## ## ##
###### ###### ######## ## ## ## ## ######
## ## ## ## ## ## ## ## ##
## ## ## ## ## ## ## ## ## ## ##
###### ######## ## ## ### #### ###### ########
class OCStudyEventDefinitionWsService():
"""SOAP web services to OpenClinica
"""
def __init__(self, studyLocation, proxyStr, proxyUsr, proxyPass, isTrace):
"""Default Constructor
"""
self._logger = logging.getLogger(__name__)
logging.config.fileConfig("logging.ini", disable_existing_loggers=False)
proxies = None
if proxyStr:
proxies = pysimplesoap.client.parse_proxy(proxyStr)
self._logger.info("OC EventDef SOAP services with proxies: " + str(proxies))
self._logger.info("OC EventDef SOAP services with auth: " + str(proxyUsr))
if proxies:
self.client = SoapClient(location=studyLocation,
namespace=STUDYNAMESPACE,
action=STUDYACTION,
soap_ns='soapenv',
ns="v1",
trace=isTrace,
proxy=proxies,
username=proxyUsr,
password=proxyPass)
else:
self.client = SoapClient(location=studyLocation,
namespace=STUDYNAMESPACE,
action=STUDYACTION,
soap_ns='soapenv',
ns="v1",
trace=isTrace,
username=proxyUsr,
password=proxyPass)
## ## ######## ######## ## ## ####### ######## ######
### ### ## ## ## ## ## ## ## ## ## ##
#### #### ## ## ## ## ## ## ## ## ##
## ### ## ###### ## ######### ## ## ## ## ######
## ## ## ## ## ## ## ## ## ## ##
## ## ## ## ## ## ## ## ## ## ## ##
## ## ######## ## ## ## ####### ######## ######
def wsse(self, userName, passwordHash):
"""
"""
self.client['wsse:Security'] = {
'wsse:UsernameToken': {
'wsse:Username': userName,
'wsse:Password': passwordHash,
}
}
def listAllByStudy(self, study):
"""
"""
params = SimpleXMLElement("""<?xml version="1.0" encoding="UTF-8"?>
<listAllRequest>
<v1:studyEventDefinitionListAll xmlns:v1="http://openclinica.org/ws/studyEventDefinition/v1">
<bean:studyRef xmlns:bean="http://openclinica.org/ws/beans">
<bean:identifier>""" + study.identifier() + """</bean:identifier>
</bean:studyRef>
</v1:studyEventDefinitionListAll>
</listAllRequest>""")
response = self.client.call('listAllRequest', params)
studyEventDefinitions = []
for studyEventDefinition in response.studyEventDefinitions.children():
oid = str(studyEventDefinition.oid)
name = str(studyEventDefinition.name)
eventDefinitionCrfs = []
for eventDefinitionCrf in studyEventDefinition.eventDefinitionCrfs.children():
required = str(eventDefinitionCrf.required)
doubleDataEntry = str(eventDefinitionCrf.doubleDataEntry)
passwordRequired = str(eventDefinitionCrf.passwordRequired)
hideCrf = str(eventDefinitionCrf.hideCrf)
sourceDataVerificaiton = str(eventDefinitionCrf.sourceDataVerificaiton)
crfOid = str(eventDefinitionCrf.crf.oid)
crfName = str(eventDefinitionCrf.crf.name)
obtainedCrf = Crf(crfOid, crfName)
defaultCrfVersionOid = str(eventDefinitionCrf.defaultCrfVersion.oid)
defaultCrfVersionName = str(eventDefinitionCrf.defaultCrfVersion.name)
obtainedDefaultCrfVersion = CrfVersion(defaultCrfVersionOid, defaultCrfVersionName)
obtainedEventDefinitionCrf = EventDefinitionCrf(required,
doubleDataEntry,
passwordRequired,
hideCrf,
sourceDataVerificaiton,
obtainedCrf,
obtainedDefaultCrfVersion)
eventDefinitionCrfs.append(obtainedEventDefinitionCrf)
obtainedStudyEventDefinition = StudyEventDefinition(oid, name, eventDefinitionCrfs)
studyEventDefinitions.append(obtainedStudyEventDefinition)
return studyEventDefinitions
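# A minimal usage sketch (endpoint URL and credentials are hypothetical):
#   svc = OCStudyEventDefinitionWsService(
#       "https://oc.example.org/ws/studyEventDefinition/v1",
#       proxyStr=None, proxyUsr=None, proxyPass=None, isTrace=False)
#   svc.wsse("rpb-user", passwordHash)  # passwordHash per OpenClinica WSSE
#   eventDefinitions = svc.listAllByStudy(study)  # study must offer identifier()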
|
gpl-3.0
|
wankunde/cloudera_hadoop
|
dev-support/relnotes.py
|
62
|
7865
|
#!/usr/bin/python
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import re
import sys
from optparse import OptionParser
import httplib
import urllib
import cgi
try:
import json
except ImportError:
import simplejson as json
namePattern = re.compile(r' \([0-9]+\)')
def clean(str):
return quoteHtml(re.sub(namePattern, "", str))
def formatComponents(str):
str = re.sub(namePattern, '', str).replace("'", "")
if str != "":
ret = "(" + str + ")"
else:
ret = ""
return quoteHtml(ret)
def quoteHtml(str):
return cgi.escape(str).encode('ascii', 'xmlcharrefreplace')
def mstr(obj):
if (obj == None):
return ""
return unicode(obj)
class Version:
"""Represents a version number"""
def __init__(self, data):
self.mod = False
self.data = data
found = re.match('^((\d+)(\.\d+)*).*$', data)
if (found):
self.parts = [ int(p) for p in found.group(1).split('.') ]
else:
self.parts = []
# backfill version with zeroes if missing parts
self.parts.extend((0,) * (3 - len(self.parts)))
def decBugFix(self):
self.mod = True
self.parts[2] -= 1
return self
def __str__(self):
if (self.mod):
return '.'.join([ str(p) for p in self.parts ])
return self.data
def __cmp__(self, other):
return cmp(self.parts, other.parts)
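# For illustration: Version("2.0.3-alpha") parses to parts [2, 0, 3], and
# decBugFix() renders it as "2.0.2" (the mod flag switches __str__ to the
# recomputed parts instead of the raw data string).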
class Jira:
"""A single JIRA"""
def __init__(self, data, parent):
self.key = data['key']
self.fields = data['fields']
self.parent = parent
self.notes = None
def getId(self):
return mstr(self.key)
def getDescription(self):
return mstr(self.fields['description'])
def getReleaseNote(self):
if (self.notes == None):
field = self.parent.fieldIdMap['Release Note']
if (self.fields.has_key(field)):
self.notes=mstr(self.fields[field])
else:
self.notes=self.getDescription()
return self.notes
def getPriority(self):
ret = ""
pri = self.fields['priority']
if(pri != None):
ret = pri['name']
return mstr(ret)
def getAssignee(self):
ret = ""
mid = self.fields['assignee']
if(mid != None):
ret = mid['displayName']
return mstr(ret)
def getComponents(self):
return " , ".join([ comp['name'] for comp in self.fields['components'] ])
def getSummary(self):
return self.fields['summary']
def getType(self):
ret = ""
mid = self.fields['issuetype']
if(mid != None):
ret = mid['name']
return mstr(ret)
def getReporter(self):
ret = ""
mid = self.fields['reporter']
if(mid != None):
ret = mid['displayName']
return mstr(ret)
def getProject(self):
ret = ""
mid = self.fields['project']
if(mid != None):
ret = mid['key']
return mstr(ret)
class JiraIter:
"""An Iterator of JIRAs"""
def __init__(self, versions):
self.versions = versions
resp = urllib.urlopen("https://issues.apache.org/jira/rest/api/2/field")
data = json.loads(resp.read())
self.fieldIdMap = {}
for part in data:
self.fieldIdMap[part['name']] = part['id']
self.jiras = []
at=0
end=1
count=100
while (at < end):
params = urllib.urlencode({'jql': "project in (HADOOP,HDFS,MAPREDUCE,YARN) and fixVersion in ('"+"' , '".join(versions)+"') and resolution = Fixed", 'startAt':at, 'maxResults':count})
resp = urllib.urlopen("https://issues.apache.org/jira/rest/api/2/search?%s"%params)
data = json.loads(resp.read())
if (data.has_key('errorMessages')):
raise Exception(data['errorMessages'])
at = data['startAt'] + data['maxResults']
end = data['total']
self.jiras.extend(data['issues'])
self.iter = self.jiras.__iter__()
def __iter__(self):
return self
def next(self):
data = self.iter.next()
j = Jira(data, self)
return j
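# Hypothetical usage sketch: JiraIter pages through the JIRA REST search API
# 100 issues at a time until startAt + maxResults reaches the reported total.
#   for jira in JiraIter(["2.7.1"]):
#     print jira.getId(), jira.getSummary()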
class Outputs:
"""Several different files to output to at the same time"""
def __init__(self, base_file_name, file_name_pattern, keys, params={}):
self.params = params
self.base = open(base_file_name%params, 'w')
self.others = {}
for key in keys:
both = dict(params)
both['key'] = key
self.others[key] = open(file_name_pattern%both, 'w')
def writeAll(self, pattern):
both = dict(self.params)
both['key'] = ''
self.base.write(pattern%both)
for key in self.others.keys():
both = dict(self.params)
both['key'] = key
self.others[key].write(pattern%both)
def writeKeyRaw(self, key, str):
self.base.write(str)
if (self.others.has_key(key)):
self.others[key].write(str)
def close(self):
self.base.close()
for fd in self.others.values():
fd.close()
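# Hypothetical usage sketch: writeAll() fans a pattern out to the base file
# and every per-key file; writeKeyRaw() targets the base plus one key's file.
#   out = Outputs("notes.%(ver)s.html", "notes.%(key)s.%(ver)s.html",
#                 ["HADOOP"], {"ver": "2.7.1"})
#   out.writeAll("<h1>%(key)s %(ver)s</h1>\n")
#   out.writeKeyRaw("HADOOP", "<p>base + HADOOP files only</p>\n")
#   out.close()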
def main():
parser = OptionParser(usage="usage: %prog [options] [USER-ignored] [PASSWORD-ignored] [VERSION]")
parser.add_option("-v", "--version", dest="versions",
action="append", type="string",
help="versions in JIRA to include in releasenotes", metavar="VERSION")
parser.add_option("--previousVer", dest="previousVer",
action="store", type="string",
help="previous version to include in releasenotes", metavar="VERSION")
(options, args) = parser.parse_args()
if (options.versions == None):
options.versions = []
if (len(args) > 2):
options.versions.append(args[2])
if (len(options.versions) <= 0):
parser.error("At least one version needs to be supplied")
  versions = [ Version(v) for v in options.versions ]
  versions.sort()
maxVersion = str(versions[-1])
if(options.previousVer == None):
options.previousVer = str(versions[0].decBugFix())
print >> sys.stderr, "WARNING: no previousVersion given, guessing it is "+options.previousVer
list = JiraIter(options.versions)
version = maxVersion
outputs = Outputs("releasenotes.%(ver)s.html",
"releasenotes.%(key)s.%(ver)s.html",
["HADOOP","HDFS","MAPREDUCE","YARN"], {"ver":maxVersion, "previousVer":options.previousVer})
head = '<META http-equiv="Content-Type" content="text/html; charset=UTF-8">\n' \
'<title>Hadoop %(key)s %(ver)s Release Notes</title>\n' \
'<STYLE type="text/css">\n' \
' H1 {font-family: sans-serif}\n' \
' H2 {font-family: sans-serif; margin-left: 7mm}\n' \
' TABLE {margin-left: 7mm}\n' \
'</STYLE>\n' \
'</head>\n' \
'<body>\n' \
'<h1>Hadoop %(key)s %(ver)s Release Notes</h1>\n' \
'These release notes include new developer and user-facing incompatibilities, features, and major improvements. \n' \
'<a name="changes"/>\n' \
'<h2>Changes since Hadoop %(previousVer)s</h2>\n' \
'<ul>\n'
outputs.writeAll(head)
for jira in list:
line = '<li> <a href="https://issues.apache.org/jira/browse/%s">%s</a>.\n' \
' %s %s reported by %s and fixed by %s %s<br>\n' \
' <b>%s</b><br>\n' \
' <blockquote>%s</blockquote></li>\n' \
% (quoteHtml(jira.getId()), quoteHtml(jira.getId()), clean(jira.getPriority()), clean(jira.getType()).lower(),
quoteHtml(jira.getReporter()), quoteHtml(jira.getAssignee()), formatComponents(jira.getComponents()),
quoteHtml(jira.getSummary()), quoteHtml(jira.getReleaseNote()))
outputs.writeKeyRaw(jira.getProject(), line)
outputs.writeAll("</ul>\n</body></html>\n")
outputs.close()
if __name__ == "__main__":
main()
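# Hypothetical CLI sketch (flags as defined above; the version is illustrative):
#   python relnotes.py -v 2.7.1
# writes releasenotes.2.7.1.html plus one file per project key, e.g.
# releasenotes.HADOOP.2.7.1.html, guessing previousVer when not supplied.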
|
apache-2.0
|
danux/danjd
|
cv/models.py
|
1
|
1371
|
# ~*~ coding: utf-8 ~*~
from django.core.validators import MaxValueValidator, MinValueValidator
from django.db import models
class Skill(models.Model):
"""
A skill has a name and a skill level.
"""
name = models.CharField(max_length=30)
skill_level = models.IntegerField(validators=[MinValueValidator(1), MaxValueValidator(100)])
date_created = models.DateTimeField(auto_now_add=True)
date_modified = models.DateTimeField(auto_now=True)
class Meta(object):
ordering = ['-skill_level']
class Job(models.Model):
"""
A skill has a name and a skill level.
"""
title = models.CharField(max_length=150)
start_date = models.DateField()
end_date = models.DateField(blank=True, null=True)
description = models.TextField()
date_created = models.DateTimeField(auto_now_add=True)
date_modified = models.DateTimeField(auto_now=True)
class Meta(object):
ordering = ['-start_date']
class Project(models.Model):
"""
A skill has a name and a skill level.
"""
name = models.CharField(max_length=150)
description = models.TextField()
github_url = models.URLField()
order = models.IntegerField()
date_created = models.DateTimeField(auto_now_add=True)
date_modified = models.DateTimeField(auto_now=True)
class Meta(object):
ordering = ['order']
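# A minimal sketch (hypothetical, not part of the original module) showing how
# the skill_level validators fire on full_clean():
#   from django.core.exceptions import ValidationError
#   s = Skill(name="Python", skill_level=150)
#   try:
#       s.full_clean()
#   except ValidationError:
#       pass  # skill_level must be within [1, 100]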
|
mit
|
jspricke/python-remind
|
setup.py
|
1
|
1133
|
from setuptools import setup
setup(name='remind',
version='0.17.0',
description='Remind Python library',
long_description=open('README.rst').read(),
author='Jochen Sprickerhof',
author_email='[email protected]',
license='GPLv3+',
url='https://github.com/jspricke/python-remind',
keywords=['Remind'],
classifiers=[
'Programming Language :: Python',
'Development Status :: 4 - Beta',
'Environment :: Console',
'License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)',
'Topic :: Office/Business :: Scheduling',
'Topic :: Software Development :: Libraries :: Python Modules',
],
setup_requires=['nose>=1.3', 'coverage'],
install_requires=['python-dateutil', 'pytz', 'tzlocal', 'vobject'],
py_modules=['remind', 'ics_compare'],
entry_points={
'console_scripts': [
'rem2ics = remind:rem2ics',
'ics2rem = remind:ics2rem',
'icscomp = ics_compare:main',
]
},
test_suite='nose.collector',)
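# Hypothetical install sketch: `pip install .` generates the console scripts
# declared in entry_points above, so e.g. `rem2ics` and `ics2rem` land on PATH.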
|
gpl-3.0
|
Ballz0fSteel/Umeko
|
lib/youtube_dl/extractor/ketnet.py
|
33
|
2589
|
from __future__ import unicode_literals
from .common import InfoExtractor
class KetnetIE(InfoExtractor):
_VALID_URL = r'https?://(?:www\.)?ketnet\.be/(?:[^/]+/)*(?P<id>[^/?#&]+)'
_TESTS = [{
'url': 'https://www.ketnet.be/kijken/zomerse-filmpjes',
'md5': 'd907f7b1814ef0fa285c0475d9994ed7',
'info_dict': {
'id': 'zomerse-filmpjes',
'ext': 'mp4',
'title': 'Gluur mee op de filmset en op Pennenzakkenrock',
'description': 'Gluur mee met Ghost Rockers op de filmset',
'thumbnail': r're:^https?://.*\.jpg$',
}
}, {
'url': 'https://www.ketnet.be/kijken/karrewiet/uitzending-8-september-2016',
'only_matching': True,
}, {
'url': 'https://www.ketnet.be/achter-de-schermen/sien-repeteert-voor-stars-for-life',
'only_matching': True,
}, {
# mzsource, geo restricted to Belgium
'url': 'https://www.ketnet.be/kijken/nachtwacht/de-bermadoe',
'only_matching': True,
}]
def _real_extract(self, url):
video_id = self._match_id(url)
webpage = self._download_webpage(url, video_id)
config = self._parse_json(
self._search_regex(
r'(?s)playerConfig\s*=\s*({.+?})\s*;', webpage,
'player config'),
video_id)
title = config['title']
formats = []
for source_key in ('', 'mz'):
source = config.get('%ssource' % source_key)
if not isinstance(source, dict):
continue
for format_id, format_url in source.items():
if format_id == 'hls':
formats.extend(self._extract_m3u8_formats(
format_url, video_id, 'mp4',
entry_protocol='m3u8_native', m3u8_id=format_id,
fatal=False))
elif format_id == 'hds':
formats.extend(self._extract_f4m_formats(
format_url, video_id, f4m_id=format_id, fatal=False))
else:
formats.append({
'url': format_url,
'format_id': format_id,
})
self._sort_formats(formats)
return {
'id': video_id,
'title': title,
'description': config.get('description'),
'thumbnail': config.get('image'),
'series': config.get('program'),
'episode': config.get('episode'),
'formats': formats,
}
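# Hypothetical invocation sketch via youtube-dl's public API (options are an
# assumption; any URL matching _VALID_URL above routes to this extractor):
#   from youtube_dl import YoutubeDL
#   with YoutubeDL({'skip_download': True}) as ydl:
#       info = ydl.extract_info('https://www.ketnet.be/kijken/zomerse-filmpjes')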
|
gpl-3.0
|
Lyleo/nupic
|
tests/integration/nupic/opf/opf_checkpoint_test/opf_checkpoint_test.py
|
17
|
16469
|
#!/usr/bin/env python
# ----------------------------------------------------------------------
# Numenta Platform for Intelligent Computing (NuPIC)
# Copyright (C) 2013, Numenta, Inc. Unless you have an agreement
# with Numenta, Inc., for a separate license for this software code, the
# following terms and conditions apply:
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License version 3 as
# published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see http://www.gnu.org/licenses.
#
# http://numenta.org/licenses/
# ----------------------------------------------------------------------
import csv
import os
import shutil
from nupic.data.file_record_stream import FileRecordStream
from nupic.frameworks.opf.experiment_runner import runExperiment
from nupic.support import initLogging
from nupic.support.unittesthelpers.testcasebase import (
unittest, TestCaseBase as HelperTestCaseBase)
_EXPERIMENT_BASE = os.path.join(os.path.abspath(
os.path.dirname(__file__)), "experiments")
class MyTestCaseBase(HelperTestCaseBase):
def shortDescription(self):
""" Override to force unittest framework to use test method names instead
of docstrings in the report.
"""
return None
@staticmethod
def getOpfNonTemporalPredictionFilepath(experimentDir, taskLabel):
path = os.path.join(experimentDir,
"inference",
"%s.nontemporal.predictionLog.csv" % taskLabel)
return os.path.abspath(path)
@staticmethod
def getOpfTemporalPredictionFilepath(experimentDir, taskLabel):
path = os.path.join(experimentDir,
"inference",
"%s.temporal.predictionLog.csv" % taskLabel)
return os.path.abspath(path)
def compareOPFPredictionFiles(self, path1, path2, temporal,
maxMismatches=None):
""" Compare temporal or non-temporal predictions for the given experiment
that just finished executing
experimentName: e.g., "gym"; this string will be used to form
a directory path to the experiments.
maxMismatches: Maximum number of row mismatches to report before
terminating the comparison; None means: report all
mismatches
Returns: True if equal; False if different
"""
experimentLabel = "%s prediction comparison" % \
("Temporal" if temporal else "Non-Temporal")
print "%s: Performing comparison of OPF prediction CSV files %r and %r" % (
experimentLabel, path1, path2)
# Open CSV readers
#
self.assertTrue(
os.path.isfile(path1),
msg="OPF prediction file path1 %s doesn't exist or is not a file" % (
path1))
(opf1CsvReader, opf1FieldNames) = self._openOpfPredictionCsvFile(path1)
self.assertTrue(
os.path.isfile(path2),
msg="OPF prediction file path2 %s doesn't exist or is not a file" % (
path2))
(opf2CsvReader, opf2FieldNames) = self._openOpfPredictionCsvFile(path2)
self.assertEqual(len(opf1FieldNames), len(opf2FieldNames),
("%s: Mismatch in number of prediction columns: "
"opf1: %s, opf2: %s") % (
experimentLabel, len(opf1FieldNames),
len(opf2FieldNames)))
self.assertEqual(opf1FieldNames, opf2FieldNames)
# Each data row is assumed to be arranged as follows:
#
# reset, actual-field1, prediction-field1, actual-field2,
# prediction-field2, etc.
#
# Presently, we only compare the predicted values that need to match.
opf1EOF = False
opf2EOF = False
opf1CurrentDataRowIndex = -1
opf2CurrentDataRowIndex = -1
if temporal:
# Skip the first data rows for temporal tests, since they don't contain
# prediction values.
_skipOpf1Row = opf1CsvReader.next()
opf1CurrentDataRowIndex += 1
_skipOpf2Row = opf2CsvReader.next()
opf2CurrentDataRowIndex += 1
fieldsIndexesToCompare = tuple(xrange(2, len(opf1FieldNames), 2))
self.assertGreater(len(fieldsIndexesToCompare), 0)
print ("%s: Comparing fields at indexes: %s; "
"opf1Labels: %s; opf2Labels: %s") % (
experimentLabel,
fieldsIndexesToCompare,
[opf1FieldNames[i] for i in fieldsIndexesToCompare],
[opf2FieldNames[i] for i in fieldsIndexesToCompare])
for i in fieldsIndexesToCompare:
self.assertTrue(opf1FieldNames[i].endswith("predicted"),
msg="%r doesn't end with 'predicted'" % opf1FieldNames[i])
self.assertTrue(opf2FieldNames[i].endswith("predicted"),
msg="%r doesn't end with 'predicted'" % opf2FieldNames[i])
mismatchCount = 0
while True:
try:
opf1Row = opf1CsvReader.next()
except StopIteration:
opf1EOF = True
else:
opf1CurrentDataRowIndex += 1
try:
opf2Row = opf2CsvReader.next()
except StopIteration:
opf2EOF = True
else:
opf2CurrentDataRowIndex += 1
if opf1EOF != opf2EOF:
print ("%s: ERROR: Data row counts mismatch: "
"opf1EOF: %s, opf1CurrentDataRowIndex: %s; "
"opf2EOF: %s, opf2CurrentDataRowIndex: %s") % (
experimentLabel,
opf1EOF, opf1CurrentDataRowIndex,
opf2EOF, opf2CurrentDataRowIndex)
return False
if opf1EOF and opf2EOF:
# Done with both prediction datasets
break
# Compare the rows
self.assertEqual(len(opf1Row), len(opf2Row))
for i in fieldsIndexesToCompare:
opf1FloatValue = float(opf1Row[i])
opf2FloatValue = float(opf2Row[i])
if opf1FloatValue != opf2FloatValue:
mismatchCount += 1
print ("%s: ERROR: mismatch in "
"prediction values: dataRowIndex: %s, fieldIndex: %s (%r); "
"opf1FieldValue: <%s>, opf2FieldValue: <%s>; "
"opf1FieldValueAsFloat: %s, opf2FieldValueAsFloat: %s; "
"opf1Row: %s, opf2Row: %s") % (
experimentLabel,
opf1CurrentDataRowIndex,
i,
opf1FieldNames[i],
opf1Row[i],
opf2Row[i],
opf1FloatValue,
opf2FloatValue,
opf1Row,
opf2Row)
# Stop comparison if we exceeded the allowed number of mismatches
if maxMismatches is not None and mismatchCount >= maxMismatches:
break
if mismatchCount != 0:
print "%s: ERROR: there were %s mismatches between %r and %r" % (
experimentLabel, mismatchCount, path1, path2)
return False
# A difference here would indicate a logic error in this method
self.assertEqual(opf1CurrentDataRowIndex, opf2CurrentDataRowIndex)
print ("%s: Comparison of predictions "
"completed: OK; number of prediction rows examined: %s; "
"path1: %r; path2: %r") % \
(experimentLabel,
opf1CurrentDataRowIndex + 1,
path1,
path2)
return True
def _openOpfPredictionCsvFile(self, filepath):
""" Open an OPF prediction CSV file and advance it to the first data row
Returns: the tuple (csvReader, fieldNames), where 'csvReader' is the
csv reader object, and 'fieldNames' is a sequence of field
names.
"""
# Open the OPF prediction file
csvReader = self._openCsvFile(filepath)
# Advance it past the three NUPIC header lines
names = csvReader.next()
_types = csvReader.next()
_specials = csvReader.next()
return (csvReader, names)
@staticmethod
def _openCsvFile(filepath):
# We'll be operating on csvs with arbitrarily long fields
size = 2**27
csv.field_size_limit(size)
rawFileObj = open(filepath, 'rU')
csvReader = csv.reader(rawFileObj, dialect='excel')
return csvReader
def _testSamePredictions(self, experiment, predSteps, checkpointAt,
predictionsFilename, additionalFields=None):
""" Test that we get the same predictions out from the following two
scenarios:
a_plus_b: Run the network for 'a' iterations followed by 'b' iterations
a, followed by b: Run the network for 'a' iterations, save it, load it
back in, then run for 'b' iterations.
Parameters:
-----------------------------------------------------------------------
experiment: base directory of the experiment. This directory should
contain the following:
base.py
a_plus_b/description.py
a/description.py
b/description.py
The sub-directory description files should import the
base.py and only change the first and last record used
from the data file.
predSteps: Number of steps ahead predictions are for
checkpointAt: Number of iterations that 'a' runs for.
IMPORTANT: This must match the number of records that
a/description.py runs for - it is NOT dynamically stuffed into
the a/description.py.
predictionsFilename: The name of the predictions file that the OPF
generates for this experiment (for example
                         'DefaultTask.NontemporalMultiStep.predictionLog.csv')
"""
# Get the 3 sub-experiment directories
aPlusBExpDir = os.path.join(_EXPERIMENT_BASE, experiment, "a_plus_b")
aExpDir = os.path.join(_EXPERIMENT_BASE, experiment, "a")
bExpDir = os.path.join(_EXPERIMENT_BASE, experiment, "b")
# Run a+b
_aPlusBExp = runExperiment(args=[aPlusBExpDir])
# Run a, the copy the saved checkpoint into the b directory
_aExp = runExperiment(args=[aExpDir])
if os.path.exists(os.path.join(bExpDir, 'savedmodels')):
shutil.rmtree(os.path.join(bExpDir, 'savedmodels'))
shutil.copytree(src=os.path.join(aExpDir, 'savedmodels'),
dst=os.path.join(bExpDir, 'savedmodels'))
_bExp = runExperiment(args=[bExpDir, '--load=DefaultTask'])
# Now, compare the predictions at the end of a+b to those in b.
aPlusBPred = FileRecordStream(os.path.join(aPlusBExpDir, 'inference',
predictionsFilename))
bPred = FileRecordStream(os.path.join(bExpDir, 'inference',
predictionsFilename))
colNames = [x[0] for x in aPlusBPred.getFields()]
actValueColIdx = colNames.index('multiStepPredictions.actual')
predValueColIdx = colNames.index('multiStepPredictions.%d' % (predSteps))
# Skip past the 'a' records in aPlusB
for i in range(checkpointAt):
aPlusBPred.next()
# Now, read through the records that don't have predictions yet
for i in range(predSteps):
aPlusBPred.next()
bPred.next()
# Now, compare predictions in the two files
rowIdx = checkpointAt + predSteps + 4 - 1
epsilon = 0.0001
while True:
rowIdx += 1
try:
rowAPB = aPlusBPred.next()
rowB = bPred.next()
# Compare actuals
self.assertEqual(rowAPB[actValueColIdx], rowB[actValueColIdx],
"Mismatch in actual values: row %d of a+b has %s and row %d of "
"b has %s" % (rowIdx, rowAPB[actValueColIdx], rowIdx-checkpointAt,
rowB[actValueColIdx]))
# Compare predictions, within nearest epsilon
predAPB = eval(rowAPB[predValueColIdx])
predB = eval(rowB[predValueColIdx])
# Sort with highest probabilities first
predAPB = [(a, b) for b, a in predAPB.items()]
predB = [(a, b) for b, a in predB.items()]
predAPB.sort(reverse=True)
predB.sort(reverse=True)
if additionalFields is not None:
for additionalField in additionalFields:
fieldIdx = colNames.index(additionalField)
self.assertEqual(rowAPB[fieldIdx], rowB[fieldIdx],
"Mismatch in field \'%s\' values: row %d of a+b has value: (%s)\n"
" and row %d of b has value: %s" % \
(additionalField, rowIdx, rowAPB[fieldIdx],
rowIdx-checkpointAt, rowB[fieldIdx]))
self.assertEqual(len(predAPB), len(predB),
"Mismatch in predicted values: row %d of a+b has %d predictions: "
"\n (%s) and row %d of b has %d predictions:\n (%s)" % \
(rowIdx, len(predAPB), predAPB, rowIdx-checkpointAt, len(predB),
predB))
for i in range(len(predAPB)):
(aProb, aValue) = predAPB[i]
(bProb, bValue) = predB[i]
self.assertLess(abs(aValue-bValue), epsilon,
"Mismatch in predicted values: row %d of a+b predicts value %s "
"and row %d of b predicts %s" % (rowIdx, aValue,
rowIdx-checkpointAt, bValue))
self.assertLess(abs(aProb-bProb), epsilon,
"Mismatch in probabilities: row %d of a+b predicts %s with "
"probability %s and row %d of b predicts %s with probability %s" \
% (rowIdx, aValue, aProb, rowIdx-checkpointAt, bValue, bProb))
except StopIteration:
break
print "Predictions match!"
@staticmethod
def _testBackwardsCompatibility(experiment, checkpointName):
""" Test that we can load in a checkpoint saved by an earlier version of
the OPF.
Parameters:
-----------------------------------------------------------------------
experiment: Directory of the experiment.
checkpointName: which checkpoint to verify
"""
# Get the experiment directories
expDir = os.path.join(_EXPERIMENT_BASE, experiment)
# Copy the pertinent checkpoint
if os.path.exists(os.path.join(expDir, 'savedmodels')):
shutil.rmtree(os.path.join(expDir, 'savedmodels'))
shutil.copytree(src=os.path.join(expDir, checkpointName),
dst=os.path.join(expDir, 'savedmodels'))
# Run it from the checkpoint
_aPlusBExp = runExperiment(args=[expDir, '--load=DefaultTask',
'--noCheckpoint'])
class PositiveTests(MyTestCaseBase):
def test_NonTemporalMultiStep(self):
""" Test that we get the same predictions out of a model that was
saved and reloaded from a checkpoint as we do from one that runs
continuously.
"""
self._testSamePredictions(
experiment="non_temporal_multi_step", predSteps=24, checkpointAt=250,
predictionsFilename=
"DefaultTask.NontemporalMultiStep.predictionLog.csv")
@unittest.skip("Currently Fails: NUP-1864")
def test_TemporalMultiStep(self):
""" Test that we get the same predictions out of a model that was
saved and reloaded from a checkpoint as we do from one that runs
continuously.
"""
self._testSamePredictions(experiment="temporal_multi_step", predSteps=24,
checkpointAt=250,
predictionsFilename='DefaultTask.TemporalMultiStep.predictionLog.csv')
@unittest.skip("Currently Fails: NUP-1864")
def test_TemporalAnomaly(self):
""" Test that we get the same predictions out of a model that was
saved and reloaded from a checkpoint as we do from one that runs
continuously.
"""
self._testSamePredictions(experiment="temporal_anomaly", predSteps=1,
checkpointAt=250,
predictionsFilename='DefaultTask.TemporalAnomaly.predictionLog.csv',
additionalFields=['anomalyScore'])
def test_BackwardsCompatibility(self):
""" Test that we can load in a checkpoint saved by an earlier version of
the OPF.
"""
self._testBackwardsCompatibility(
os.path.join('backwards_compatibility', 'a'),
'savedmodels_2012-10-05')
if __name__ == "__main__":
initLogging(verbose=True)
unittest.main()
|
gpl-3.0
|
rockstor/rockstor-core
|
src/rockstor/storageadmin/models/network_interface.py
|
2
|
9858
|
"""
Copyright (c) 2012-2020 RockStor, Inc. <http://rockstor.com>
This file is part of RockStor.
RockStor is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published
by the Free Software Foundation; either version 2 of the License,
or (at your option) any later version.
RockStor is distributed in the hope that it will be useful, but
WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
"""
import json
from django.db import models
# This is the key abstraction for network configuration that is user
# configurable in Rockstor. Users can add, delete or modify connections, which
# results in CRUD ops on this model and also on other models linked to this
# one, such as NetworkInterface, EthernetConnection etc.
class NetworkConnection(models.Model):
# Wired connection 1, Team-team0 etc..
name = models.CharField(max_length=256, null=True)
# uuid generated by NM
uuid = models.CharField(max_length=256, unique=True)
# active (== GENERAL.STATE: activated in nmcli), could also be activating
# or blank(assumed inactive) -- subtle distinction compared to state of
# NetworkInterface
state = models.CharField(max_length=64, null=True)
# whether or not to automatically connect when underlying resources are
# available.
autoconnect = models.BooleanField(default=True)
# manual or dhcp
ipv4_method = models.CharField(max_length=64, null=True)
# comma separated strings of ip/nm_bits. typically just one ip/nm. eg:
# 192.168.1.5/24
ipv4_addresses = models.CharField(max_length=1024, null=True)
# there can only be one ipv4 gateway. eg: 192.168.1.1
ipv4_gw = models.CharField(max_length=64, null=True)
# comma separated strings of one or more dns addresses. eg: "8.8.8.8
# 8.8.4.4"
ipv4_dns = models.CharField(max_length=256, null=True)
# comma separated strings of one or more dns search domains. eg:
# rockstor.com
ipv4_dns_search = models.CharField(max_length=256, null=True)
# not clear yet on ipv6 stuff.
ipv6_method = models.CharField(max_length=1024, null=True)
ipv6_addresses = models.CharField(max_length=1024, null=True)
ipv6_gw = models.CharField(max_length=64, null=True)
ipv6_dns = models.CharField(max_length=256, null=True)
ipv6_dns_search = models.CharField(max_length=256, null=True)
# slave connections have a master. eg: team
master = models.ForeignKey("NetworkConnection", null=True)
@property
def ipaddr(self):
if self.ipv4_addresses is None:
return None
return self.ipv4_addresses.split(",")[0].split("/")[0]
@property
def mtu(self):
if self.ethernetconnection_set.count() > 0:
eco = self.ethernetconnection_set.first()
try:
return int(eco.mtu)
except ValueError:
pass
return 1500
@property
def ctype(self):
if self.ethernetconnection_set.count() > 0:
return "ethernet"
if self.teamconnection_set.count() > 0:
return "team"
if self.bondconnection_set.count() > 0:
return "bond"
if self.bridgeconnection_set.count() > 0:
return "bridge"
return None
@property
def team_profile(self):
        profile = None
        try:
            tco = self.teamconnection_set.first()
            config_d = json.loads(tco.config)
            profile = config_d["runner"]["name"]
        except (AttributeError, KeyError, TypeError, ValueError):
            # no team connection, or config missing/unparsable
            pass
        return profile
@property
def bond_profile(self):
        profile = None
        try:
            bco = self.bondconnection_set.first()
            config_d = json.loads(bco.config)
            profile = config_d["mode"]
        except (AttributeError, KeyError, TypeError, ValueError):
            # no bond connection, or config missing/unparsable
            pass
        return profile
@property
def docker_name(self):
dname = None
if self.bridgeconnection_set.count() > 0:
brco = self.bridgeconnection_set.first()
dname = brco.docker_name
return dname
@property
def user_dnet(self):
"""
Returns True if the docker network is a rocknet (defined by the user).
Used by rockons.js to list available rocknets available for connection.
:return: Boolean
"""
user_dnet = None
if self.bridgeconnection_set.count() > 0:
brco = self.bridgeconnection_set.first()
user_dnet = brco.usercon
if user_dnet:
user_dnet = True
return user_dnet
@property
def docker_options(self):
"""
Gather all connection's settings in a dict to be displayed in the UI connection form
needed to edit an existing docker network connection.
:return:
"""
docker_options = {}
if self.bridgeconnection_set.count() > 0:
brco = self.bridgeconnection_set.first()
connected_containers = []
            # iterate the related container networks once instead of
            # re-filtering the queryset for every index
            dcn_set = brco.dcontainernetwork_set.filter(
                connection=brco.id
            ).order_by("id")
            for dcn in dcn_set:
                connected_containers.append(
                    "{} ({})".format(dcn.container_name, dcn.container.rockon.name)
                )
docker_options["aux_address"] = brco.aux_address
docker_options["dgateway"] = brco.dgateway
docker_options["host_binding"] = brco.host_binding
docker_options["icc"] = brco.icc
docker_options["internal"] = brco.internal
docker_options["ip_masquerade"] = brco.ip_masquerade
docker_options["ip_range"] = brco.ip_range
docker_options["subnet"] = brco.subnet
docker_options["containers"] = connected_containers
return docker_options
class Meta:
app_label = "storageadmin"
# Network interfaces/devices are auto detected from the system via "nmcli d
# show". They are not "directly" user configurable, but their attributes are
# refreshed in two ways: 1. when the user configures a NetworkConnection and in
# turn a NetworkInterface is changed, eg: state. 2. when changes at the system
# level are picked up.
class NetworkDevice(models.Model):
# enp0s3, lo etc..
name = models.CharField(max_length=256, unique=True)
# ethernet, infiniband etc..
dtype = models.CharField(max_length=100, null=True)
mac = models.CharField(max_length=100, null=True)
connection = models.ForeignKey(
NetworkConnection, null=True, on_delete=models.SET_NULL
)
# active (== GENERAL.STATE: activated in nmcli), could also be activating
# or blank(assumed inactive)
state = models.CharField(max_length=64, null=True)
mtu = models.CharField(max_length=64, null=True)
@property
def cname(self):
if self.connection is None:
return None
return self.connection.name
@property
def dev_name(self):
"""
Return the user-friendly docker_name as device name for bridge connections
to be displayed in the network widget on the dashboard.
:return:
"""
if (self.dtype == "bridge") and (self.connection is not None):
return self.connection.docker_name
return self.name
class Meta:
app_label = "storageadmin"
# This is the most common of connection types that uses NetworkInterface of
# dtype=ethernet
class EthernetConnection(models.Model):
connection = models.ForeignKey(NetworkConnection, null=True)
mac = models.CharField(max_length=64, null=True)
cloned_mac = models.CharField(max_length=64, null=True)
mtu = models.CharField(max_length=64, null=True)
class Meta:
app_label = "storageadmin"
class TeamConnection(models.Model):
connection = models.ForeignKey(NetworkConnection, null=True)
# eg: Team1
name = models.CharField(max_length=64, null=True)
# json config.
config = models.CharField(max_length=2048, null=True)
class Meta:
app_label = "storageadmin"
class BondConnection(models.Model):
connection = models.ForeignKey(NetworkConnection, null=True)
name = models.CharField(max_length=64, null=True)
# at the NM level it's not json like in team config, but we could convert
# it for consistency.
config = models.CharField(max_length=2048, null=True)
class Meta:
app_label = "storageadmin"
class BridgeConnection(models.Model):
connection = models.ForeignKey(NetworkConnection, null=True)
docker_name = models.CharField(max_length=64, null=True)
usercon = models.BooleanField(default=False)
aux_address = models.CharField(max_length=2048, null=True)
dgateway = models.CharField(max_length=64, null=True)
host_binding = models.CharField(max_length=64, null=True)
icc = models.BooleanField(default=False)
internal = models.BooleanField(default=False)
ip_masquerade = models.BooleanField(default=False)
ip_range = models.CharField(max_length=64, null=True)
subnet = models.CharField(max_length=64, null=True)
class Meta:
app_label = "storageadmin"
|
gpl-3.0
|
soarpenguin/ansible
|
lib/ansible/module_utils/cnos.py
|
2
|
133164
|
# This code is part of Ansible, but is an independent component.
# This particular file snippet, and this file snippet only, is BSD licensed.
# Modules you write using this snippet, which is embedded dynamically by
# Ansible still belong to the author of the module, and may assign their own
# license to the complete work.
#
# Copyright (C) 2017 Lenovo, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions and the following disclaimer in the documentation
# and/or other materials provided with the distribution.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
# Contains utility methods
# Lenovo Networking
import time
import socket
import re
try:
import cnos_errorcodes
import cnos_devicerules
HAS_LIB = True
except ImportError:
HAS_LIB = False
from distutils.cmd import Command
def interfaceConfig(
obj, deviceType, prompt, timeout, interfaceArg1,
interfaceArg2, interfaceArg3, interfaceArg4, interfaceArg5,
interfaceArg6, interfaceArg7, interfaceArg8, interfaceArg9):
retVal = ""
command = "interface "
newPrompt = prompt
if(interfaceArg1 == "port-aggregation"):
command = command + " " + interfaceArg1 + " " + interfaceArg2 + "\n"
# debugOutput(command)
value = checkSanityofVariable(
deviceType, "portchannel_interface_value", interfaceArg2)
if(value == "ok"):
newPrompt = "(config-if)#"
retVal = retVal + \
waitForDeviceResponse(command, newPrompt, timeout, obj)
else:
value = checkSanityofVariable(
deviceType, "portchannel_interface_range", interfaceArg2)
if(value == "ok"):
newPrompt = "(config-if-range)#"
retVal = retVal + \
waitForDeviceResponse(command, newPrompt, timeout, obj)
else:
value = checkSanityofVariable(
deviceType, "portchannel_interface_string", interfaceArg2)
if(value == "ok"):
newPrompt = "(config-if-range)#"
retVal = retVal + \
waitForDeviceResponse(command, newPrompt, timeout, obj)
else:
retVal = "Error-102"
return retVal
retVal = retVal + interfaceLevel2Config(
obj, deviceType, newPrompt, timeout, interfaceArg3, interfaceArg4,
interfaceArg5, interfaceArg6, interfaceArg7, interfaceArg8,
interfaceArg9)
elif(interfaceArg1 == "ethernet"):
# command = command + interfaceArg1 + " 1/"
value = checkSanityofVariable(
deviceType, "ethernet_interface_value", interfaceArg2)
if(value == "ok"):
newPrompt = "(config-if)#"
command = command + interfaceArg1 + " 1/" + interfaceArg2 + " \n"
retVal = retVal + \
waitForDeviceResponse(command, newPrompt, timeout, obj)
else:
value = checkSanityofVariable(
deviceType, "ethernet_interface_range", interfaceArg2)
if(value == "ok"):
command = command + \
interfaceArg1 + " 1/" + interfaceArg2 + " \n"
newPrompt = "(config-if-range)#"
retVal = retVal + \
waitForDeviceResponse(command, newPrompt, timeout, obj)
else:
value = checkSanityofVariable(
deviceType, "ethernet_interface_string", interfaceArg2)
if(value == "ok"):
command = command + \
interfaceArg1 + " " + interfaceArg2 + "\n"
newPrompt = "(config-if-range)#"
retVal = retVal + \
waitForDeviceResponse(command, newPrompt, timeout, obj)
else:
retVal = "Error-102"
return retVal
retVal = retVal + interfaceLevel2Config(
obj, deviceType, newPrompt, timeout, interfaceArg3, interfaceArg4,
interfaceArg5, interfaceArg6, interfaceArg7, interfaceArg8,
interfaceArg9)
elif(interfaceArg1 == "loopback"):
value = checkSanityofVariable(
deviceType, "loopback_interface_value", interfaceArg2)
if(value == "ok"):
newPrompt = "(config-if)#"
command = command + interfaceArg1 + " " + interfaceArg2 + "\n"
retVal = retVal + \
waitForDeviceResponse(command, newPrompt, timeout, obj)
else:
retVal = "Error-102"
return retVal
retVal = retVal + interfaceLevel2Config(
obj, deviceType, newPrompt, timeout, interfaceArg3, interfaceArg4,
interfaceArg5, interfaceArg6, interfaceArg7, interfaceArg8,
interfaceArg9)
elif(interfaceArg1 == "mgmt"):
value = checkSanityofVariable(
deviceType, "mgmt_interface_value", interfaceArg2)
if(value == "ok"):
newPrompt = "(config-if)#"
command = command + interfaceArg1 + " " + interfaceArg2 + "\n"
retVal = retVal + \
waitForDeviceResponse(command, newPrompt, timeout, obj)
else:
retVal = "Error-102"
return retVal
retVal = retVal + interfaceLevel2Config(
obj, deviceType, newPrompt, timeout, interfaceArg3, interfaceArg4,
interfaceArg5, interfaceArg6, interfaceArg7, interfaceArg8,
interfaceArg9)
elif(interfaceArg1 == "vlan"):
value = checkSanityofVariable(
deviceType, "vlan_interface_value", interfaceArg2)
if(value == "ok"):
newPrompt = "(config-if)#"
command = command + interfaceArg1 + " " + interfaceArg2 + "\n"
retVal = retVal + \
waitForDeviceResponse(command, newPrompt, timeout, obj)
else:
retVal = "Error-102"
return retVal
retVal = retVal + interfaceLevel2Config(
obj, deviceType, newPrompt, timeout, interfaceArg3, interfaceArg4,
interfaceArg5, interfaceArg6, interfaceArg7, interfaceArg8,
interfaceArg9)
else:
retVal = "Error-102"
return retVal
# EOM
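# Hypothetical call sketch (the device handle, deviceType, prompt and timeout
# values are assumptions; argument order follows the signature above):
#   out = interfaceConfig(conn, "g8272_cnos", "(config)#", 120,
#                         "ethernet", "33", "mtu", "1500",
#                         None, None, None, None, None)
# builds "interface ethernet 1/33", waits for the "(config-if)#" prompt, then
# applies "mtu 1500" through interfaceLevel2Config below.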
def interfaceLevel2Config(
obj, deviceType, prompt, timeout, interfaceL2Arg1, interfaceL2Arg2,
interfaceL2Arg3, interfaceL2Arg4, interfaceL2Arg5, interfaceL2Arg6,
interfaceL2Arg7):
retVal = ""
command = ""
if(interfaceL2Arg1 == "aggregation-group"):
# debugOutput("aggregation-group")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "aggregation_group_no", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2 + " mode "
value = checkSanityofVariable(
deviceType, "aggregation_group_mode", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3
else:
retVal = "Error-200"
return retVal
else:
retVal = "Error-201"
return retVal
elif (interfaceL2Arg1 == "bfd"):
# debugOutput("bfd")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "bfd_options", interfaceL2Arg2)
if(value == "ok"):
if(interfaceL2Arg2 == "echo"):
command = command + interfaceL2Arg2
elif(interfaceL2Arg2 == "interval"):
command = command + interfaceL2Arg2 + " "
value = checkSanityofVariable(
deviceType, "bfd_interval", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3
value = checkSanityofVariable(
deviceType, "bfd_minrx", interfaceL2Arg4)
if(value == "ok"):
command = command + " minrx " + interfaceL2Arg4
value = checkSanityofVariable(
deviceType, "bfd_ multiplier", interfaceL2Arg5)
if(value == "ok"):
command = command + " multiplier " + \
interfaceL2Arg5
else:
retVal = "Error-236"
return retVal
else:
retVal = "Error-235"
return retVal
else:
retVal = "Error-234"
return retVal
elif(interfaceL2Arg2 == "authentication"):
command = command + interfaceL2Arg2 + " "
value = checkSanityofVariable(
deviceType, "bfd_auth_options", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3 + " "
if((interfaceL2Arg3 == "keyed-md5") or
(interfaceL2Arg3 == "keyed-sha1") or
(interfaceL2Arg3 == "meticulous-keyed-md5") or
(interfaceL2Arg3 == "meticulous-keyed-sha1") or
(interfaceL2Arg3 == "simple")):
value = checkSanityofVariable(
deviceType, "bfd_key_options", interfaceL2Arg4)
if(value == "ok"):
command = command + interfaceL2Arg4 + " "
if(interfaceL2Arg4 == "key-chain"):
value = checkSanityofVariable(
deviceType, "bfd_key_chain",
interfaceL2Arg5)
if(value == "ok"):
command = command + interfaceL2Arg5
else:
retVal = "Error-237"
return retVal
elif(interfaceL2Arg4 == "key-id"):
value = checkSanityofVariable(
deviceType, "bfd_key_id", interfaceL2Arg5)
if(value == "ok"):
command = command + interfaceL2Arg5
command = command + " key "
value = checkSanityofVariable(
deviceType, "bfd_key_name",
interfaceL2Arg6)
if(value == "ok"):
command = command + interfaceL2Arg6
else:
retVal = "Error-238"
return retVal
else:
retVal = "Error-239"
return retVal
else:
retVal = "Error-240"
return retVal
else:
retVal = "Error-241"
return retVal
elif(interfaceL2Arg2 == "ipv4" or interfaceL2Arg2 == "ipv6"):
command = command + interfaceL2Arg2 + " "
value = checkSanityofVariable(
deviceType, "bfd_ipv4_options", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3 + " "
if(interfaceL2Arg3 == "authentication"):
value = checkSanityofVariable(
deviceType, "bfd_auth_options", interfaceL2Arg4)
if(value == "ok"):
command = command + interfaceL2Arg4 + " "
if((interfaceL2Arg4 == "keyed-md5") or
(interfaceL2Arg4 == "keyed-sha1") or
(interfaceL2Arg4 == "meticulous-keyed-md5") or
(interfaceL2Arg4 == "meticulous-keyed-sha1") or
(interfaceL2Arg4 == "simple")):
value = checkSanityofVariable(
deviceType, "bfd_key_options",
interfaceL2Arg5)
if(value == "ok"):
command = command + interfaceL2Arg5 + " "
if(interfaceL2Arg5 == "key-chain"):
value = checkSanityofVariable(
deviceType, "bfd_key_chain",
interfaceL2Arg6)
if(value == "ok"):
command = command + interfaceL2Arg6
else:
retVal = "Error-237"
return retVal
elif(interfaceL2Arg5 == "key-id"):
value = checkSanityofVariable(
deviceType, "bfd_key_id",
interfaceL2Arg6)
if(value == "ok"):
command = command + \
interfaceL2Arg6 + " key "
value = checkSanityofVariable(
deviceType, "bfd_key_name",
interfaceL2Arg7)
if(value == "ok"):
command = command + \
interfaceL2Arg7
else:
retVal = "Error-238"
return retVal
else:
retVal = "Error-239"
return retVal
else:
retVal = "Error-240"
return retVal
else:
retVal = "Error-240"
return retVal
else:
retVal = "Error-241"
return retVal
elif(interfaceL2Arg3 == "echo"):
command = command + interfaceL2Arg3
elif(interfaceL2Arg3 == "interval"):
command = command + interfaceL2Arg3 + " "
value = checkSanityofVariable(
deviceType, "bfd_interval", interfaceL2Arg4)
if(value == "ok"):
command = command + interfaceL2Arg4
value = checkSanityofVariable(
deviceType, "bfd_minrx", interfaceL2Arg5)
if(value == "ok"):
command = command + " minrx " + interfaceL2Arg5
value = checkSanityofVariable(
deviceType, "bfd_ multiplier",
interfaceL2Arg6)
if(value == "ok"):
command = command + " multiplier " + \
interfaceL2Arg6
else:
retVal = "Error-236"
return retVal
else:
retVal = "Error-235"
return retVal
else:
retVal = "Error-234"
return retVal
else:
                    command = command  # interfaceL2Arg3 is None: nothing to append
elif(interfaceL2Arg2 == "neighbor"):
command = command + interfaceL2Arg2 + " src-ip "
value = checkSanityofVariable(
deviceType, "bfd_neighbor_ip", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3 + " dest-ip "
value = checkSanityofVariable(
deviceType, "bfd_neighbor_ip", interfaceL2Arg4)
if(value == "ok"):
command = command + interfaceL2Arg4 + " "
if(interfaceL2Arg5 is not None):
value = checkSanityofVariable(
deviceType, "bfd_neighbor_options",
interfaceL2Arg5)
if(value == "ok"):
command = command + interfaceL2Arg5 + " "
if(interfaceL2Arg6 is not None):
if((interfaceL2Arg6 == "admin-down") or
(interfaceL2Arg6 ==
"non-persistent")):
command = command + \
interfaceL2Arg6 + " "
if((interfaceL2Arg7 is not None) and
(interfaceL2Arg7 ==
"admin-down")):
command = command + interfaceL2Arg7
else:
retVal = "Error-277"
return retVal
else:
retVal = "Error-277"
return retVal
                    # No else branches here: interfaceL2Arg5, 6 and 7 are optional.
else:
retVal = "Error-242"
return retVal
else:
retVal = "Error-243"
return retVal
else:
retVal = "Error-205"
return retVal
else:
retVal = "Error-205"
return retVal
elif (interfaceL2Arg1 == "bridge-port"):
# debugOutput("bridge-port")
command = interfaceL2Arg1 + " "
if(interfaceL2Arg2 is None):
command = command
elif(interfaceL2Arg2 == "access"):
command = command + interfaceL2Arg2 + " vlan "
value = checkSanityofVariable(
deviceType, "bfd_access_vlan", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3
else:
retVal = "Error-202"
return retVal
elif(interfaceL2Arg2 == "mode"):
command = command + interfaceL2Arg2 + " "
value = checkSanityofVariable(
deviceType, "bfd_bridgeport_mode", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3
else:
retVal = "Error-203"
return retVal
elif(interfaceL2Arg2 == "trunk"):
command = command + interfaceL2Arg2 + " "
value = checkSanityofVariable(
deviceType, "trunk_options", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3 + " "
if((interfaceL2Arg3 == "allowed") or
(interfaceL2Arg3 == "native")):
command = command + "vlan " # Only permiting one vlan id
if(interfaceL2Arg4 == "all" or interfaceL2Arg4 == "none"):
command = command + interfaceL2Arg4
elif(interfaceL2Arg4 == "add" or
interfaceL2Arg4 == "remove" or
interfaceL2Arg4 == "none"):
command = command + interfaceL2Arg4 + " "
value = checkSanityofVariable(
deviceType, "bfd_access_vlan", interfaceL2Arg5)
if(value == "ok"):
command = command + interfaceL2Arg5
else:
retVal = "Error-202"
return retVal
else:
value = checkSanityofVariable(
deviceType, "bfd_access_vlan", interfaceL2Arg4)
if(value == "ok"):
command = command + interfaceL2Arg4
else:
retVal = "Error-202"
return retVal
else:
retVal = "Error-204"
return retVal
else:
retVal = "Error-204"
return retVal
else:
retVal = "Error-205"
return retVal
elif (interfaceL2Arg1 == "description"):
# debugOutput("description")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "portCh_description", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2
else:
retVal = "Error-206"
return retVal
elif (interfaceL2Arg1 == "duplex"):
# debugOutput("duplex")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "duplex_option", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2
else:
retVal = "Error-207"
return retVal
elif (interfaceL2Arg1 == "flowcontrol"):
# debugOutput("flowcontrol")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "flowcontrol_options", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2 + " "
if(interfaceL2Arg3 == "on" or interfaceL2Arg3 == "off"):
command = command + interfaceL2Arg3
else:
retVal = "Error-208"
return retVal
else:
retVal = "Error-209"
return retVal
elif (interfaceL2Arg1 == "ip"):
# debugOutput("ip")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "portchannel_ip_options", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2 + " "
if(interfaceL2Arg2 == "access-group"):
value = checkSanityofVariable(
deviceType, "accessgroup_name", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3 + " "
if(interfaceL2Arg4 == "in" or interfaceL2Arg4 == "out"):
command = command + interfaceL2Arg4
else:
retVal = "Error-245"
return retVal
else:
retVal = "Error-246"
return retVal
elif(interfaceL2Arg2 == "address"):
if(interfaceL2Arg3 == "dhcp"):
command = command + interfaceL2Arg3
elif(interfaceL2Arg3 is not None):
value = checkSanityofVariable(
deviceType, "portchannel_ipv4", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3 + " "
value = checkSanityofVariable(
deviceType, "portchannel_ipv4", interfaceL2Arg4)
if(value == "ok"):
command = command + interfaceL2Arg4 + " "
if(interfaceL2Arg5 == "secondary"):
command = command + interfaceL2Arg5
                            elif(interfaceL2Arg5 is None):
                                command = command  # nothing more to append
else:
retVal = "Error-278"
return retVal
else:
retVal = "Error-279"
return retVal
else:
value = checkSanityofVariable(
deviceType, "portchannel_ipv4_mask",
interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3 + " "
if(interfaceL2Arg4 == "secondary"):
command = command + interfaceL2Arg4
                        elif(interfaceL2Arg4 is None):
                            command = command  # nothing more to append
else:
retVal = "Error-278"
return retVal
else:
retVal = "Error-279"
return retVal
elif(interfaceL2Arg2 == "arp"):
value = checkSanityofVariable(
deviceType, "arp_ipaddress", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3 + " "
value = checkSanityofVariable(
deviceType, "arp_macaddress", interfaceL2Arg4)
if(value == "ok"):
command = command + interfaceL2Arg4 + " "
else:
retVal = "Error-247"
return retVal
elif(interfaceL2Arg3 == "timeout"):
command = command + interfaceL2Arg3 + " "
value = checkSanityofVariable(
deviceType, "arp_timeout_value", interfaceL2Arg4)
if(value == "ok"):
command = command + interfaceL2Arg4 + " "
else:
retVal = "Error-248"
return retVal
else:
retVal = "Error-249"
return retVal
elif(interfaceL2Arg2 == "dhcp"):
if(interfaceL2Arg3 == "client"):
command = command + interfaceL2Arg3 + " "
if(interfaceL2Arg4 == "class-id"):
command = command + interfaceL2Arg3 + " "
if(interfaceL2Arg4 is not None):
command = command + interfaceL2Arg4
elif(interfaceL2Arg4 == "request"):
command = command + interfaceL2Arg4 + " "
if(interfaceL2Arg5 == "bootfile-name" or
interfaceL2Arg5 == "host-name" or
interfaceL2Arg5 == "log-server" or
interfaceL2Arg5 == "tftp-server-name"):
command = command + interfaceL2Arg5 + " "
else:
retVal = "Error-250"
return retVal
else:
retVal = "Error-251"
return retVal
elif(interfaceL2Arg3 == "relay"):
command = command + interfaceL2Arg3 + " address "
value = checkSanityofVariable(
deviceType, "relay_ipaddress", interfaceL2Arg4)
if(value == "ok"):
command = command + interfaceL2Arg4
else:
retVal = "Error-252"
return retVal
else:
retVal = "Error-253"
return retVal
elif(interfaceL2Arg2 == "ospf"):
value = checkSanityofVariable(
deviceType, "ip_ospf_options", interfaceL2Arg3)
if(value == "ok"):
retVal = "Error-102"
return retVal
else:
retVal = "Error-254"
return retVal
elif(interfaceL2Arg2 == "port"):
command = command + "access-group "
value = checkSanityofVariable(
deviceType, "accessgroup_name", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3 + " in"
else:
retVal = "Error-246"
return retVal
elif(interfaceL2Arg2 == "port-unreachable"):
command = command + interfaceL2Arg2
elif(interfaceL2Arg2 == "redirects"):
command = command + interfaceL2Arg2
elif(interfaceL2Arg2 == "router"):
command = command + interfaceL2Arg2 + " 0 "
if(interfaceL2Arg3 == "area" or
interfaceL2Arg3 == "multi-area"):
command = command + interfaceL2Arg3
value = checkSanityofVariable(
deviceType, "ospf_id_decimal_value", interfaceL2Arg4)
if(value == "ok"):
command = command + interfaceL2Arg4
else:
value = checkSanityofVariable(
deviceType, "ospf_id_ipaddres_value",
interfaceL2Arg4)
if(value == "ok"):
command = command + interfaceL2Arg4
else:
retVal = "Error-255"
return retVal
else:
retVal = "Error-256"
return retVal
elif(interfaceL2Arg2 == "unreachables"):
command = command + interfaceL2Arg2
else:
retVal = "Error-244"
return retVal
else:
retVal = "Error-244"
return retVal
elif (interfaceL2Arg1 == "ipv6"):
# debugOutput("ipv6")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "portchannel_ipv6_options", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2 + " "
if(interfaceL2Arg2 == "address"):
if(interfaceL2Arg3 == "dhcp"):
command = command + interfaceL2Arg3
else:
value = checkSanityofVariable(
deviceType, "portchannel_ipv6_address",
interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3 + " "
if(interfaceL2Arg4 == "anycast" or
interfaceL2Arg4 == "secondary" or
interfaceL2Arg4 is None):
command = command + interfaceL2Arg4
else:
retVal = "Error-276"
return retVal
else:
retVal = "Error-275"
return retVal
elif(interfaceL2Arg2 == "dhcp"):
value = checkSanityofVariable(
deviceType, "portchannel_ipv6_dhcp", interfaceL2Arg3)
if(value == "ok"):
command = command + "relay address " + interfaceL2Arg3
if(interfaceL2Arg4 is not None):
if(interfaceL2Arg4 == "ethernet"):
value = checkSanityofVariable(
deviceType, "portchannel_ipv6_dhcp_ethernet",
interfaceL2Arg4)
if(value == "ok"):
command = command + " interface ethernet " + \
interfaceL2Arg4
else:
retVal = "Error-271"
return retVal
elif(interfaceL2Arg4 == "vlan"):
value = checkSanityofVariable(
deviceType, "portchannel_ipv6_dhcp_vlan",
interfaceL2Arg4)
if(value == "ok"):
command = command + " interface vlan " + \
interfaceL2Arg4
else:
retVal = "Error-272"
return retVal
else:
retVal = "Error-270"
return retVal
else:
retVal = "Error-269"
return retVal
elif(interfaceL2Arg2 == "link-local"):
value = checkSanityofVariable(
deviceType, "portchannel_ipv6_linklocal", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3
else:
retVal = "Error-273"
return retVal
elif(interfaceL2Arg2 == "nd"):
retVal = "Error-102"
return retVal
elif(interfaceL2Arg2 == "neighbor"):
value = checkSanityofVariable(
deviceType, "portchannel_ipv6_neighbor_address",
interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3 + " "
value = checkSanityofVariable(
deviceType, "portchannel_ipv6_neighbor_mac",
interfaceL2Arg4)
if(value == "ok"):
command = command + interfaceL2Arg4
else:
retVal = "Error-267"
return retVal
else:
retVal = "Error-268"
return retVal
else:
retVal = "Error-266"
return retVal
else:
retVal = "Error-102"
return retVal
elif (interfaceL2Arg1 == "lacp"):
# debugOutput("lacp")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "lacp_options", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2 + " "
if(interfaceL2Arg2 == "port-priority"):
value = checkSanityofVariable(
deviceType, "port_priority", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3
else:
retVal = "Error-210"
return retVal
elif(interfaceL2Arg2 == "suspend-individual"):
command = command + interfaceL2Arg3
elif(interfaceL2Arg2 == "timeout"):
command = command + interfaceL2Arg2 + " "
if(interfaceL2Arg3 == "long" or interfaceL2Arg3 == "short"):
command = command + interfaceL2Arg3
else:
retVal = "Error-211"
return retVal
else:
retVal = "Error-212"
return retVal
else:
retVal = "Error-212"
return retVal
elif (interfaceL2Arg1 == "lldp"):
# debugOutput("lldp")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "lldp_options", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2 + " "
if(interfaceL2Arg2 == "receive" or
interfaceL2Arg2 == "trap-notification" or
interfaceL2Arg2 == "transmit"):
command = command
elif(interfaceL2Arg2 == "tlv-select"):
value = checkSanityofVariable(
deviceType, "lldp_tlv_options", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3
else:
retVal = "Error-213"
return retVal
else:
retVal = "Error-214"
return retVal
else:
retVal = "Error-214"
return retVal
elif (interfaceL2Arg1 == "load-interval"):
# debugOutput("load-interval")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "load_interval_delay", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2
else:
if(interfaceL2Arg2 == "counter"):
command = command + interfaceL2Arg2 + " "
value = checkSanityofVariable(
deviceType, "load_interval_counter", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3 + " "
value = checkSanityofVariable(
deviceType, "load_interval_delay", interfaceL2Arg4)
if(value == "ok"):
command = command + interfaceL2Arg4
else:
retVal = "Error-215"
return retVal
else:
retVal = "Error-216"
return retVal
else:
retVal = "Error-217"
return retVal
elif (interfaceL2Arg1 == "mac"):
# debugOutput("mac")
command = interfaceL2Arg1 + " port access-group "
value = checkSanityofVariable(
deviceType, "mac_accessgroup_name", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2
else:
retVal = "Error-218"
return retVal
elif (interfaceL2Arg1 == "mac-address"):
# debugOutput("mac-address")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "mac_address", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2
else:
retVal = "Error-219"
return retVal
elif (interfaceL2Arg1 == "mac-learn"):
# debugOutput("mac-learn")
command = interfaceL2Arg1 + " disable"
elif (interfaceL2Arg1 == "microburst-detection"):
# debugOutput("microburst-detection")
command = interfaceL2Arg1 + " enable threshold "
value = checkSanityofVariable(
deviceType, "microburst_threshold", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2
else:
retVal = "Error-220"
return retVal
elif (interfaceL2Arg1 == "mtu"):
# debugOutput("mtu")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(deviceType, "mtu_value", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2
else:
retVal = "Error-221"
return retVal
elif (interfaceL2Arg1 == "service"):
# debugOutput("service")
command = interfaceL2Arg1 + " instance "
value = checkSanityofVariable(
deviceType, "service_instance", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2
else:
retVal = "Error-222"
return retVal
elif (interfaceL2Arg1 == "service-policy"):
# debugOutput("service-policy")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "service_policy_options", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2 + " "
if(interfaceL2Arg2 == "input" or interfaceL2Arg2 == "output"):
value = checkSanityofVariable(
deviceType, "service_policy_name", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3
else:
retVal = "Error-223"
return retVal
elif(interfaceL2Arg2 == "copp-system-policy"):
command = command + "class all"
elif(interfaceL2Arg2 == "type" and interfaceL2Arg3 == "qos"):
command = command + interfaceL2Arg3 + " "
if(interfaceL2Arg4 == "input" or interfaceL2Arg4 == "output"):
value = checkSanityofVariable(
deviceType, "service_policy_name", interfaceL2Arg5)
if(value == "ok"):
command = command + interfaceL2Arg5
else:
retVal = "Error-223"
return retVal
elif(interfaceL2Arg2 == "type" and interfaceL2Arg3 == "queuing"):
command = command + interfaceL2Arg3 + " "
if(interfaceL2Arg4 == "input" or interfaceL2Arg4 == "output"):
value = checkSanityofVariable(
deviceType, "service_policy_name", interfaceL2Arg5)
if(value == "ok"):
command = command + interfaceL2Arg5
else:
retVal = "Error-223"
return retVal
else:
retVal = "Error-224"
return retVal
elif (interfaceL2Arg1 == "shutdown"):
# debugOutput("shutdown")
command = interfaceL2Arg1
elif (interfaceL2Arg1 == "no shutdown"):
# debugOutput("no shutdown")
command = interfaceL2Arg1
elif (interfaceL2Arg1 == "snmp"):
# debugOutput("snmp")
command = interfaceL2Arg1 + " trap link-status "
elif (interfaceL2Arg1 == "spanning-tree"):
# debugOutput("spanning-tree")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "spanning_tree_options", interfaceL2Arg2)
if(value == "ok"):
if(interfaceL2Arg2 == "bpdufilter"):
command = command + interfaceL2Arg2 + " "
if(interfaceL2Arg3 == "enable" or
interfaceL2Arg3 == "disable"):
command = command + interfaceL2Arg3
else:
retVal = "Error-257"
return retVal
elif(interfaceL2Arg2 == "bpduguard"):
command = command + interfaceL2Arg2 + " "
if(interfaceL2Arg3 == "enable" or
interfaceL2Arg3 == "disable"):
command = command + interfaceL2Arg3
else:
retVal = "Error-258"
return retVal
elif(interfaceL2Arg2 == "cost"):
command = command + interfaceL2Arg2 + " "
value = checkSanityofVariable(
deviceType, "spanning_tree_cost", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3
elif(interfaceL2Arg3 == "auto"):
command = command + interfaceL2Arg3
else:
retVal = "Error-259"
return retVal
elif(interfaceL2Arg2 == "disable" or interfaceL2Arg2 == "enable"):
command = command + interfaceL2Arg2 + " "
elif(interfaceL2Arg2 == "guard"):
command = command + interfaceL2Arg2 + " "
if(interfaceL2Arg3 == "loop" or interfaceL2Arg3 == "root"):
command = command + interfaceL2Arg3
else:
retVal = "Error-260"
return retVal
elif(interfaceL2Arg2 == "link-type"):
command = command + interfaceL2Arg2 + " "
if(interfaceL2Arg3 == "auto" or
interfaceL2Arg3 == "point-to-point" or
interfaceL2Arg3 == "shared"):
command = command + interfaceL2Arg3
else:
retVal = "Error-261"
return retVal
elif(interfaceL2Arg2 == "mst"):
command = command + interfaceL2Arg2 + " "
value = checkSanityofVariable(
deviceType, "spanning_tree_interfacerange",
interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3 + " "
if(interfaceL2Arg4 == "cost"):
command = command + interfaceL2Arg4 + " "
value = checkSanityofVariable(
deviceType, "spanning_tree_cost", interfaceL2Arg5)
if(value == "ok"):
command = command + interfaceL2Arg5
elif(interfaceL2Arg5 == "auto"):
command = command + interfaceL2Arg5
else:
retVal = "Error-259"
return retVal
elif(interfaceL2Arg4 == "port-priority"):
command = command + interfaceL2Arg4 + " "
value = checkSanityofVariable(
deviceType, "spanning_tree_portpriority",
interfaceL2Arg5)
if(value == "ok"):
command = command + interfaceL2Arg5
else:
retVal = "Error-259"
return retVal
else:
retVal = "Error-259"
return retVal
else:
retVal = "Error-263"
return retVal
elif(interfaceL2Arg2 == "port"):
command = command + interfaceL2Arg2 + " type edge"
elif(interfaceL2Arg2 == "port-priority"):
command = command + interfaceL2Arg2 + " "
value = checkSanityofVariable(
deviceType, "spanning_tree_portpriority", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3
else:
retVal = "Error-264"
return retVal
elif(interfaceL2Arg2 == "vlan"):
command = command + interfaceL2Arg2 + " "
value = checkSanityofVariable(
deviceType, "vlan_id_range", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3
if(interfaceL2Arg4 == "cost"):
command = command + interfaceL2Arg4 + " "
value = checkSanityofVariable(
deviceType, "spanning_tree_cost", interfaceL2Arg5)
if(value == "ok"):
command = command + interfaceL2Arg5
elif(interfaceL2Arg5 == "auto"):
command = command + interfaceL2Arg5
else:
retVal = "Error-263"
return retVal
elif(interfaceL2Arg4 == "port-priority"):
command = command + interfaceL2Arg4 + " "
value = checkSanityofVariable(
deviceType, "spanning_tree_portpriority",
interfaceL2Arg5)
if(value == "ok"):
command = command + interfaceL2Arg5
else:
retVal = "Error-264"
return retVal
else:
retVal = "Error-264"
return retVal
else:
retVal = "Error-134"
return retVal
else:
retVal = "Error-263"
return retVal
elif (interfaceL2Arg1 == "speed"):
# debugOutput("speed")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "interface_speed", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2
else:
retVal = "Error-225"
return retVal
elif (interfaceL2Arg1 == "storm-control"):
# debugOutput("storm-control")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(
deviceType, "stormcontrol_options", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2 + " level "
value = checkSanityofVariable(
deviceType, "stormcontrol_level", interfaceL2Arg3)
if(value == "ok"):
command = command + interfaceL2Arg3
else:
retVal = "Error-226"
return retVal
else:
retVal = "Error-227"
return retVal
elif (interfaceL2Arg1 == "vlan"):
# debugOutput("vlan")
command = interfaceL2Arg1 + " dot1q tag native "
value = checkSanityofVariable(
deviceType, "portchannel_dot1q_tag", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2
if(interfaceL2Arg2 == "egress-only"):
command = command + " enable"
else:
retVal = "Error-228"
return retVal
elif (interfaceL2Arg1 == "vrrp"):
# debugOutput("vrrp")
command = interfaceL2Arg1 + " "
value = checkSanityofVariable(deviceType, "vrrp_id", interfaceL2Arg2)
if(value == "ok"):
command = command + interfaceL2Arg2 + " "
if(interfaceL2Arg3 == "ipv6"):
command = command + interfaceL2Arg3 + " "
elif(interfaceL2Arg3 is None):
command = command + ""
else:
retVal = "Error-229"
return retVal
else:
retVal = "Error-230"
return retVal
else:
retVal = "Error-233"
return retVal
command = command + "\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, prompt, timeout, obj)
# Come back to config mode
if((prompt == "(config-if)#") or (prompt == "(config-if-range)#")):
command = "exit \n"
# debugOutput(command)
retVal = retVal + \
waitForDeviceResponse(command, "(config)#", timeout, obj)
return retVal
# EOM
def portChannelConfig(
obj, deviceType, prompt, timeout, portChArg1, portChArg2, portChArg3,
portChArg4, portChArg5, portChArg6, portChArg7):
retVal = ""
command = ""
if(portChArg1 == "port-aggregation" and prompt == "(config)#"):
command = command + portChArg1 + " load-balance ethernet "
if(portChArg2 == "destination-ip" or
portChArg2 == "destination-mac" or
portChArg2 == "destination-port" or
portChArg2 == "source-dest-ip" or
portChArg2 == "source-dest-mac" or
portChArg2 == "source-dest-port" or
portChArg2 == "source-interface" or
portChArg2 == "source-ip" or
portChArg2 == "source-mac" or
portChArg2 == "source-port"):
# debugOutput(portChArg2)
command = command + portChArg2 + " "
if(portChArg3 is None):
command = command + ""
elif(portChArg3 == "source-interface"):
command = command + portChArg3
else:
retVal = "Error-231"
return retVal
        else:
            retVal = "Error-232"
            return retVal
        # Send the assembled load-balance command to the device.
        command = command + "\n"
        # debugOutput(command)
        retVal = retVal + waitForDeviceResponse(command, prompt, timeout, obj)
    return retVal
# EOM
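# Illustrative usage sketch for portChannelConfig, kept as a comment so
# that importing this module stays side-effect free. "obj" (a connected
# SSH channel) and the "g8272_cnos" deviceType are assumptions made for
# the example, not values mandated by this file:
#
#   output = portChannelConfig(obj, "g8272_cnos", "(config)#", 10,
#                              "port-aggregation", "source-dest-ip",
#                              None, None, None, None, None)
#
# With these arguments the helper assembles and sends
# "port-aggregation load-balance ethernet source-dest-ip".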
def routerConfig(
obj, deviceType, prompt, timeout, protocol, asNum, routerArg1,
routerArg2, routerArg3, routerArg4, routerArg5, routerArg6, routerArg7,
routerArg8):
retVal = ""
if(protocol == "bgp"):
# bgp config command happens here.
command = "routing-protocol bgp "
value = checkSanityofVariable(deviceType, "bgp_as_number", asNum)
if(value == "ok"):
# BGP command happens here. It creates if not present
command = command + asNum + "\n"
# debugOutput(command)
retVal = waitForDeviceResponse(
command, "(config-router)#", timeout, obj)
retVal = retVal + bgpConfig(
obj, deviceType, "(config-router)#", timeout, routerArg1,
routerArg2, routerArg3, routerArg4, routerArg5, routerArg6,
routerArg7, routerArg8)
else:
retVal = "Error-176"
elif(protocol == "ospf"):
retVal = "Command Value is Not supported as of now"
else:
retVal = "Error-177"
return retVal
# EOM
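# Illustrative flow for routerConfig (comment-only sketch; "obj" and the
# deviceType value are assumptions). A BGP call first enters the router
# context and then delegates the remaining arguments to bgpConfig:
#
#   output = routerConfig(obj, "g8272_cnos", "(config)#", 10,
#                         "bgp", "33", "router-id", "1.2.3.4",
#                         None, None, None, None, None, None)
#
# This sends "routing-protocol bgp 33", waits for "(config-router)#",
# and lets bgpConfig emit "router-id 1.2.3.4" in that context.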
def bgpNeighborAFConfig(
obj, deviceType, prompt, timeout, bgpNeighborAFArg1, bgpNeighborAFArg2,
bgpNeighborAFArg3):
retVal = ""
command = ""
if(bgpNeighborAFArg1 == "allowas-in"):
command = command + bgpNeighborAFArg1 + " "
if(bgpNeighborAFArg2 is not None):
value = checkSanityofVariable(
deviceType, "bgp_neighbor_af_occurances", bgpNeighborAFArg2)
if(value == "ok"):
command = command + bgpNeighborAFArg2
else:
retVal = "Error-325"
return retVal
else:
command = command
elif(bgpNeighborAFArg1 == "default-originate"):
command = command + bgpNeighborAFArg1 + " "
if(bgpNeighborAFArg2 is not None and bgpNeighborAFArg2 == "route-map"):
command = command + bgpNeighborAFArg2 + " "
            value = checkSanityofVariable(
                deviceType, "bgp_neighbor_af_routemap", bgpNeighborAFArg3)
if(value == "ok"):
command = command + bgpNeighborAFArg3
else:
retVal = "Error-324"
return retVal
elif(bgpNeighborAFArg1 == "filter-list"):
command = command + bgpNeighborAFArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_af_filtername", bgpNeighborAFArg2)
if(value == "ok"):
command = command + bgpNeighborAFArg2 + " "
if(bgpNeighborAFArg3 == "in" or bgpNeighborAFArg3 == "out"):
command = command + bgpNeighborAFArg3
else:
retVal = "Error-323"
return retVal
else:
retVal = "Error-322"
return retVal
elif(bgpNeighborAFArg1 == "maximum-prefix"):
command = command + bgpNeighborAFArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_af_maxprefix", bgpNeighborAFArg2)
if(value == "ok"):
command = command + bgpNeighborAFArg2 + " "
if(bgpNeighborAFArg3 is not None):
command = command + bgpNeighborAFArg3
else:
command = command
else:
retVal = "Error-326"
return retVal
elif(bgpNeighborAFArg1 == "next-hop-self"):
command = command + bgpNeighborAFArg1
elif(bgpNeighborAFArg1 == "prefix-list"):
command = command + bgpNeighborAFArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_af_prefixname", bgpNeighborAFArg2)
if(value == "ok"):
command = command + bgpNeighborAFArg2 + " "
if(bgpNeighborAFArg3 == "in" or bgpNeighborAFArg3 == "out"):
command = command + bgpNeighborAFArg3
else:
retVal = "Error-321"
return retVal
else:
retVal = "Error-320"
return retVal
elif(bgpNeighborAFArg1 == "route-map"):
command = command + bgpNeighborAFArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_af_routemap", bgpNeighborAFArg2)
if(value == "ok"):
command = command + bgpNeighborAFArg2
else:
retVal = "Error-319"
return retVal
elif(bgpNeighborAFArg1 == "route-reflector-client"):
command = command + bgpNeighborAFArg1
elif(bgpNeighborAFArg1 == "send-community"):
command = command + bgpNeighborAFArg1 + " "
if(bgpNeighborAFArg2 is not None and bgpNeighborAFArg2 == "extended"):
command = command + bgpNeighborAFArg2
else:
command = command
elif(bgpNeighborAFArg1 == "soft-reconfiguration"):
command = command + bgpNeighborAFArg1 + " inbound"
elif(bgpNeighborAFArg1 == "unsuppress-map"):
command = command + bgpNeighborAFArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_af_routemap", bgpNeighborAFArg2)
if(value == "ok"):
command = command + bgpNeighborAFArg2
else:
retVal = "Error-318"
return retVal
else:
retVal = "Error-317"
return retVal
command = command + "\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, prompt, timeout, obj)
command = "exit \n"
retVal = retVal + \
waitForDeviceResponse(
command, "(config-router-neighbor)#", timeout, obj)
return retVal
# EOM
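# Example of a command line this helper produces (comment-only sketch):
# with bgpNeighborAFArg1="filter-list", Arg2="FLIST1" and Arg3="in", the
# name "FLIST1" (an invented placeholder) is validated against the
# "bgp_neighbor_af_filtername" rule and "filter-list FLIST1 in" is sent
# at the neighbor address-family prompt, followed by "exit".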
def bgpNeighborConfig(
obj, deviceType, prompt, timeout, bgpNeighborArg1, bgpNeighborArg2,
bgpNeighborArg3, bgpNeighborArg4, bgpNeighborArg5):
retVal = ""
command = ""
if(bgpNeighborArg1 == "address-family"):
command = command + bgpNeighborArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_address_family", bgpNeighborArg2)
if(value == "ok"):
command = command + bgpNeighborArg2 + " unicast \n"
# debugOutput(command)
retVal = waitForDeviceResponse(
command, "(config-router-neighbor-af)#", timeout, obj)
retVal = retVal + bgpNeighborAFConfig(
obj, deviceType, "(config-router-neighbor-af)#", timeout,
bgpNeighborArg3, bgpNeighborArg4, bgpNeighborArg5)
return retVal
else:
retVal = "Error-316"
return retVal
elif(bgpNeighborArg1 == "advertisement-interval"):
command = command + bgpNeighborArg1
elif(bgpNeighborArg1 == "bfd"):
command = command + bgpNeighborArg1 + " "
        # "multihop" is the BFD keyword accepted by the CLI.
        if(bgpNeighborArg2 is not None and bgpNeighborArg2 == "multihop"):
            command = command + bgpNeighborArg2
else:
command = command
elif(bgpNeighborArg1 == "connection-retry-time"):
command = command + bgpNeighborArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_connection_retrytime", bgpNeighborArg2)
if(value == "ok"):
command = command + bgpNeighborArg2
else:
retVal = "Error-315"
return retVal
elif(bgpNeighborArg1 == "description"):
command = command + bgpNeighborArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_description", bgpNeighborArg2)
if(value == "ok"):
command = command + bgpNeighborArg2
else:
retVal = "Error-314"
return retVal
elif(bgpNeighborArg1 == "disallow-infinite-holdtime"):
command = command + bgpNeighborArg1
elif(bgpNeighborArg1 == "dont-capability-negotiate"):
command = command + bgpNeighborArg1
elif(bgpNeighborArg1 == "dynamic-capability"):
command = command + bgpNeighborArg1
elif(bgpNeighborArg1 == "ebgp-multihop"):
command = command + bgpNeighborArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_maxhopcount", bgpNeighborArg2)
if(value == "ok"):
command = command + bgpNeighborArg2
else:
retVal = "Error-313"
return retVal
elif(bgpNeighborArg1 == "interface"):
command = command + bgpNeighborArg1 + " "
# TBD
elif(bgpNeighborArg1 == "local-as"):
command = command + bgpNeighborArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_local_as", bgpNeighborArg2)
if(value == "ok"):
command = command + bgpNeighborArg2 + " "
if(bgpNeighborArg3 is not None and
bgpNeighborArg3 == "no-prepend"):
command = command + bgpNeighborArg3 + " "
if(bgpNeighborArg4 is not None and
bgpNeighborArg4 == "replace-as"):
command = command + bgpNeighborArg4 + " "
if(bgpNeighborArg5 is not None and
bgpNeighborArg5 == "dual-as"):
command = command + bgpNeighborArg5
else:
command = command
else:
command = command
else:
command = command
else:
retVal = "Error-312"
return retVal
elif(bgpNeighborArg1 == "maximum-peers"):
command = command + bgpNeighborArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_maxpeers", bgpNeighborArg2)
if(value == "ok"):
command = command + bgpNeighborArg2
else:
retVal = "Error-311"
return retVal
elif(bgpNeighborArg1 == "password"):
command = command + bgpNeighborArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_password", bgpNeighborArg2)
if(value == "ok"):
command = command + bgpNeighborArg2
else:
retVal = "Error-310"
return retVal
elif(bgpNeighborArg1 == "remove-private-AS"):
command = command + bgpNeighborArg1
elif(bgpNeighborArg1 == "timers"):
command = command + bgpNeighborArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_timers_Keepalive", bgpNeighborArg2)
if(value == "ok"):
command = command + bgpNeighborArg2 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_timers_holdtime", bgpNeighborArg3)
if(value == "ok"):
command = command + bgpNeighborArg3
else:
retVal = "Error-309"
return retVal
else:
retVal = "Error-308"
return retVal
elif(bgpNeighborArg1 == "transport"):
command = command + bgpNeighborArg1 + " connection-mode passive "
elif(bgpNeighborArg1 == "ttl-security"):
command = command + bgpNeighborArg1 + " hops "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_ttl_hops", bgpNeighborArg2)
if(value == "ok"):
command = command + bgpNeighborArg2
else:
retVal = "Error-307"
return retVal
elif(bgpNeighborArg1 == "update-source"):
command = command + bgpNeighborArg1 + " "
if(bgpNeighborArg2 is not None):
value = checkSanityofVariable(
deviceType, "bgp_neighbor_update_options", bgpNeighborArg2)
if(value == "ok"):
command = command + bgpNeighborArg2 + " "
if(bgpNeighborArg2 == "ethernet"):
value = checkSanityofVariable(
deviceType, "bgp_neighbor_update_ethernet",
bgpNeighborArg3)
if(value == "ok"):
command = command + bgpNeighborArg3
else:
retVal = "Error-304"
return retVal
elif(bgpNeighborArg2 == "loopback"):
value = checkSanityofVariable(
deviceType, "bgp_neighbor_update_loopback",
bgpNeighborArg3)
if(value == "ok"):
command = command + bgpNeighborArg3
else:
retVal = "Error-305"
return retVal
else:
value = checkSanityofVariable(
deviceType, "bgp_neighbor_update_vlan",
bgpNeighborArg3)
if(value == "ok"):
command = command + bgpNeighborArg3
else:
retVal = "Error-306"
return retVal
else:
command = command + bgpNeighborArg2
else:
retVal = "Error-303"
return retVal
elif(bgpNeighborArg1 == "weight"):
command = command + bgpNeighborArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_neighbor_weight", bgpNeighborArg2)
if(value == "ok"):
command = command + bgpNeighborArg2
else:
retVal = "Error-302"
return retVal
else:
retVal = "Error-301"
return retVal
command = command + "\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, prompt, timeout, obj)
command = "exit \n"
retVal = retVal + \
waitForDeviceResponse(command, "(config-router)#", timeout, obj)
return retVal
# EOM
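# Comment-only sketch of the address-family dispatch path above: with
# bgpNeighborArg1="address-family" and bgpNeighborArg2="ipv4", the helper
# sends "address-family ipv4 unicast", waits for
# "(config-router-neighbor-af)#", and hands Arg3..Arg5 to
# bgpNeighborAFConfig; every other Arg1 value builds a single command
# line and finishes with "exit" back at "(config-router)#".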
def bgpAFConfig(
obj, deviceType, prompt, timeout, bgpAFArg1, bgpAFArg2, bgpAFArg3,
bgpAFArg4, bgpAFArg5, bgpAFArg6):
retVal = ""
command = ""
if(bgpAFArg1 == "aggregate-address"):
command = command + bgpAFArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_aggregate_prefix", bgpAFArg2)
if(value == "ok"):
            command = command + bgpAFArg2 + " "
            # The as-set / summary-only keywords arrive in the argument
            # slots after the aggregate prefix held by bgpAFArg2.
            if(bgpAFArg3 is None):
                command = command
            elif(bgpAFArg3 == "as-set" or bgpAFArg3 == "summary-only"):
                command = command + bgpAFArg3 + " "
                if(bgpAFArg4 is None):
                    command = command
                elif(bgpAFArg3 == "as-set"):
                    command = command + "summary-only"
                else:
                    command = command + "as-set"
else:
retVal = "Error-297"
return retVal
else:
retVal = "Error-296"
return retVal
elif(bgpAFArg1 == "client-to-client"):
command = command + bgpAFArg1 + " reflection "
elif(bgpAFArg1 == "dampening"):
command = command + bgpAFArg1 + " "
if(bgpAFArg2 == "route-map"):
command = command + bgpAFArg2 + " "
value = checkSanityofVariable(
deviceType, "addrfamily_routemap_name", bgpAFArg3)
if(value == "ok"):
command = command + bgpAFArg3
else:
retVal = "Error-196"
return retVal
elif(bgpAFArg2 is not None):
value = checkSanityofVariable(
deviceType, "reachability_half_life", bgpAFArg2)
if(value == "ok"):
command = command + bgpAFArg2 + " "
if(bgpAFArg3 is not None):
value1 = checkSanityofVariable(
deviceType, "start_reuse_route_value", bgpAFArg3)
value2 = checkSanityofVariable(
deviceType, "start_suppress_route_value", bgpAFArg4)
value3 = checkSanityofVariable(
deviceType, "max_duration_to_suppress_route",
bgpAFArg5)
if(value1 == "ok" and value2 == "ok" and value3 == "ok"):
command = command + bgpAFArg3 + " " + bgpAFArg4 + \
" " + bgpAFArg5 + " "
if(bgpAFArg6 is not None):
value = checkSanityofVariable(
deviceType,
"unreachability_halftime_for_penalty",
bgpAFArg6)
if(value == "ok"):
command = command + bgpAFArg6
else:
retVal = "Error-295"
return retVal
else:
command = command
else:
retVal = "Error-294"
return retVal
elif(bgpAFArg1 == "distance"):
command = command + bgpAFArg1 + " "
value = checkSanityofVariable(
deviceType, "distance_external_AS", bgpAFArg2)
if(value == "ok"):
command = command + bgpAFArg2 + " "
value = checkSanityofVariable(
deviceType, "distance_internal_AS", bgpAFArg3)
if(value == "ok"):
command = command + bgpAFArg3 + " "
value = checkSanityofVariable(
deviceType, "distance_local_routes", bgpAFArg4)
if(value == "ok"):
command = command + bgpAFArg4
else:
retVal = "Error-291"
return retVal
else:
retVal = "Error-292"
return retVal
else:
retVal = "Error-293"
return retVal
elif(bgpAFArg1 == "maximum-paths"):
command = command + bgpAFArg1 + " "
value = checkSanityofVariable(deviceType, "maxpath_option", bgpAFArg2)
if(value == "ok"):
command = command + bgpAFArg2 + " "
value = checkSanityofVariable(
deviceType, "maxpath_numbers", bgpAFArg3)
if(value == "ok"):
command = command + bgpAFArg3
else:
retVal = "Error-199"
return retVal
else:
retVal = "Error-290"
return retVal
elif(bgpAFArg1 == "network"):
command = command + bgpAFArg1 + " "
if(bgpAFArg2 == "synchronization"):
command = command + bgpAFArg2
else:
value = checkSanityofVariable(
deviceType, "network_ip_prefix_with_mask", bgpAFArg2)
if(value == "ok"):
command = command + bgpAFArg2 + " "
if(bgpAFArg3 is not None and bgpAFArg3 == "backdoor"):
command = command + bgpAFArg3
elif(bgpAFArg3 is not None and bgpAFArg3 == "route-map"):
                command = command + bgpAFArg3 + " "
value = checkSanityofVariable(
deviceType, "addrfamily_routemap_name", bgpAFArg4)
if(value == "ok"):
command = command + bgpAFArg4 + " "
if(bgpAFArg5 is not None and bgpAFArg5 == "backdoor"):
command = command + bgpAFArg5
else:
retVal = "Error-298"
return retVal
else:
retVal = "Error-196"
return retVal
else:
command = command
else:
value = checkSanityofVariable(
deviceType, "network_ip_prefix_value", bgpAFArg2)
if(value == "ok"):
command = command + bgpAFArg2 + " "
if(bgpAFArg3 is not None and bgpAFArg3 == "backdoor"):
command = command + bgpAFArg3
elif(bgpAFArg3 is not None and bgpAFArg3 == "route-map"):
                command = command + bgpAFArg3 + " "
value = checkSanityofVariable(
deviceType, "addrfamily_routemap_name", bgpAFArg4)
if(value == "ok"):
command = command + bgpAFArg4 + " "
if(bgpAFArg5 is not None and
bgpAFArg5 == "backdoor"):
command = command + bgpAFArg5
else:
retVal = "Error-298"
return retVal
else:
retVal = "Error-196"
return retVal
elif(bgpAFArg3 is not None and bgpAFArg3 == "mask"):
                command = command + bgpAFArg3 + " "
value = checkSanityofVariable(
deviceType, "network_ip_prefix_mask", bgpAFArg4)
if(value == "ok"):
command = command + bgpAFArg4 + " "
else:
retVal = "Error-299"
return retVal
else:
command = command
else:
retVal = "Error-300"
return retVal
elif(bgpAFArg1 == "nexthop"):
command = command + bgpAFArg1 + " trigger-delay critical "
value = checkSanityofVariable(
deviceType, "nexthop_crtitical_delay", bgpAFArg2)
if(value == "ok"):
command = command + bgpAFArg2 + " "
value = checkSanityofVariable(
deviceType, "nexthop_noncrtitical_delay", bgpAFArg3)
if(value == "ok"):
command = command + bgpAFArg3 + " "
else:
retVal = "Error-198"
return retVal
else:
retVal = "Error-197"
return retVal
elif(bgpAFArg1 == "redistribute"):
command = command + bgpAFArg1 + " "
value = checkSanityofVariable(
deviceType, "addrfamily_redistribute_option", bgpAFArg2)
if(value == "ok"):
command = command + bgpAFArg2 + " "
if(bgpAFArg2 is not None):
command = command + "route-map "
value = checkSanityofVariable(
deviceType, "addrfamily_routemap_name", bgpAFArg3)
if(value == "ok"):
command = command + bgpAFArg3
else:
retVal = "Error-196"
return retVal
else:
retVal = "Error-195"
return retVal
elif(bgpAFArg1 == "save" or bgpAFArg1 == "synchronization"):
command = command + bgpAFArg1
else:
retVal = "Error-194"
return retVal
command = command + "\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, prompt, timeout, obj)
command = "exit \n"
retVal = retVal + \
waitForDeviceResponse(command, "(config-router)#", timeout, obj)
return retVal
# EOM
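# Comment-only sketch of the numeric dampening form accepted above:
# bgpAFArg2=<reachability half-life>, bgpAFArg3=<start reuse route>,
# bgpAFArg4=<start suppress route>, bgpAFArg5=<max suppress duration> and
# optionally bgpAFArg6=<unreachability half-life>, which yields
# "dampening <half-life> <reuse> <suppress> <duration> [<unreach>]".
# Passing bgpAFArg2="route-map" with a map name in bgpAFArg3 selects the
# route-map form instead.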
def bgpConfig(
obj, deviceType, prompt, timeout, bgpArg1, bgpArg2, bgpArg3, bgpArg4,
bgpAgr5, bgpArg6, bgpArg7, bgpArg8):
retVal = ""
command = ""
if(bgpArg1 == "address-family"):
# debugOutput(bgpArg1)
command = command + bgpArg1 + " "
value = checkSanityofVariable(
deviceType, "bgp_address_family", bgpArg2)
if(value == "ok"):
command = command + bgpArg2 + " " + "unicast \n"
            # debugOutput(command)
retVal = waitForDeviceResponse(
command, "(config-router-af)#", timeout, obj)
retVal = retVal + bgpAFConfig(
obj, deviceType, "(config-router-af)#", timeout,
bgpArg3, bgpArg4, bgpAgr5, bgpArg6, bgpArg7, bgpArg8)
return retVal
else:
retVal = "Error-178"
return retVal
elif(bgpArg1 == "bestpath"):
# debugOutput(bgpArg1)
command = command + bgpArg1 + " "
if(bgpArg2 == "always-compare-med"):
# debugOutput(bgpArg2)
command = command + bgpArg2
elif(bgpArg2 == "compare-confed-aspath"):
# debugOutput(bgpArg2)
command = command + bgpArg2
elif(bgpArg2 == "compare-routerid"):
# debugOutput(bgpArg2)
command = command + bgpArg2
elif(bgpArg2 == "dont-compare-originator-id"):
# debugOutput(bgpArg2)
command = command + bgpArg2
elif(bgpArg2 == "tie-break-on-age"):
# debugOutput(bgpArg2)
command = command + bgpArg2
elif(bgpArg2 == "as-path"):
# debugOutput(bgpArg2)
command = command + bgpArg2 + " "
if(bgpArg3 == "ignore" or bgpArg3 == "multipath-relax"):
command = command + bgpArg3
else:
retVal = "Error-179"
return retVal
elif(bgpArg2 == "med"):
# debugOutput(bgpArg2)
command = command + bgpArg2 + " "
if(bgpArg3 == "confed" or
bgpArg3 == "missing-as-worst" or
bgpArg3 == "non-deterministic" or
bgpArg3 == "remove-recv-med" or
bgpArg3 == "remove-send-med"):
command = command + bgpArg3
else:
retVal = "Error-180"
return retVal
else:
retVal = "Error-181"
return retVal
elif(bgpArg1 == "bgp"):
# debugOutput(bgpArg1)
command = command + bgpArg1 + " as-local-count "
value = checkSanityofVariable(
deviceType, "bgp_bgp_local_count", bgpArg2)
if(value == "ok"):
command = command + bgpArg2
else:
retVal = "Error-182"
return retVal
elif(bgpArg1 == "cluster-id"):
# debugOutput(bgpArg1)
command = command + bgpArg1 + " "
value = checkSanityofVariable(deviceType, "cluster_id_as_ip", bgpArg2)
if(value == "ok"):
command = command + bgpArg2
else:
value = checkSanityofVariable(
deviceType, "cluster_id_as_number", bgpArg2)
if(value == "ok"):
command = command + bgpArg2
else:
retVal = "Error-183"
return retVal
elif(bgpArg1 == "confederation"):
# debugOutput(bgpArg1)
command = command + bgpArg1 + " "
if(bgpArg2 == "identifier"):
value = checkSanityofVariable(
deviceType, "confederation_identifier", bgpArg3)
if(value == "ok"):
command = command + " " + bgpArg2 + " " + bgpArg3
else:
retVal = "Error-184"
return retVal
elif(bgpArg2 == "peers"):
value = checkSanityofVariable(
deviceType, "confederation_peers_as", bgpArg3)
if(value == "ok"):
command = command + " " + bgpArg2 + " " + bgpArg3
else:
retVal = "Error-185"
return retVal
else:
retVal = "Error-186"
return retVal
elif(bgpArg1 == "enforce-first-as"):
# debugOutput(bgpArg1)
command = command + bgpArg1
elif(bgpArg1 == "fast-external-failover"):
# debugOutput(bgpArg1)
command = command + bgpArg1
elif(bgpArg1 == "graceful-restart"):
# debugOutput(bgpArg1)
command = command + bgpArg1 + " stalepath-time "
value = checkSanityofVariable(
deviceType, "stalepath_delay_value", bgpArg2)
if(value == "ok"):
command = command + bgpArg2
else:
retVal = "Error-187"
return retVal
elif(bgpArg1 == "graceful-restart-helper"):
# debugOutput(bgpArg1)
command = command + bgpArg1
elif(bgpArg1 == "log-neighbor-changes"):
# debugOutput(bgpArg1)
command = command + bgpArg1
elif(bgpArg1 == "maxas-limit"):
# debugOutput(bgpArg1)
command = command + bgpArg1 + " "
value = checkSanityofVariable(deviceType, "maxas_limit_as", bgpArg2)
if(value == "ok"):
command = command + bgpArg2
else:
retVal = "Error-188"
return retVal
elif(bgpArg1 == "neighbor"):
# debugOutput(bgpArg1)
command = command + bgpArg1 + " "
        value = checkSanityofVariable(
            deviceType, "neighbor_ipaddress", bgpArg2)
        if(value == "ok"):
command = command + bgpArg2
if(bgpArg3 is not None):
command = command + " remote-as "
value = checkSanityofVariable(
deviceType, "neighbor_as", bgpArg3)
if(value == "ok"):
command = command + bgpArg3 + "\n"
# debugOutput(command)
retVal = waitForDeviceResponse(
command, "(config-router-neighbor)#", timeout, obj)
retVal = retVal + bgpNeighborConfig(
obj, deviceType, "(config-router-neighbor)#",
timeout, bgpArg4, bgpAgr5, bgpArg6, bgpArg7, bgpArg8)
return retVal
                else:
                    retVal = "Error-189"
                    return retVal
        else:
            # Invalid neighbor IP address.
            retVal = "Error-102"
            return retVal
elif(bgpArg1 == "router-id"):
# debugOutput(bgpArg1)
command = command + bgpArg1 + " "
value = checkSanityofVariable(deviceType, "router_id", bgpArg2)
if(value == "ok"):
command = command + bgpArg2
else:
retVal = "Error-190"
return retVal
elif(bgpArg1 == "shutdown"):
# debugOutput(bgpArg1)
command = command + bgpArg1
elif(bgpArg1 == "synchronization"):
# debugOutput(bgpArg1)
command = command + bgpArg1
elif(bgpArg1 == "timers"):
# debugOutput(bgpArg1)
command = command + bgpArg1 + " bgp "
value = checkSanityofVariable(
deviceType, "bgp_keepalive_interval", bgpArg2)
if(value == "ok"):
command = command + bgpArg2
else:
retVal = "Error-191"
return retVal
if(bgpArg3 is not None):
value = checkSanityofVariable(deviceType, "bgp_holdtime", bgpArg3)
if(value == "ok"):
command = command + " " + bgpArg3
else:
retVal = "Error-192"
return retVal
else:
retVal = "Error-192"
return retVal
elif(bgpArg1 == "vrf"):
# debugOutput(bgpArg1)
command = command + bgpArg1 + " default"
else:
# debugOutput(bgpArg1)
retVal = "Error-192"
return retVal
command = command + "\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, prompt, timeout, obj)
# Come back to config mode
command = "exit \n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "(config)#", timeout, obj)
return retVal
# EOM
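# Comment-only sketch of the neighbor flow (the address and AS number are
# invented placeholders): bgpArg1="neighbor", bgpArg2="10.10.10.10",
# bgpArg3="13" sends "neighbor 10.10.10.10 remote-as 13", waits for
# "(config-router-neighbor)#", and forwards bgpArg4..bgpArg8 to
# bgpNeighborConfig for the neighbor sub-mode options.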
def vlanConfig(
obj, deviceType, prompt, timeout, vlanArg1, vlanArg2, vlanArg3,
vlanArg4, vlanArg5):
retVal = ""
# vlan config command happens here.
command = "vlan "
if(vlanArg1 == "access-map"):
# debugOutput("access-map ")
command = command + vlanArg1 + " "
value = checkSanityofVariable(
deviceType, "vlan_access_map_name", vlanArg2)
if(value == "ok"):
command = command + vlanArg2 + " \n"
# debugOutput(command)
retVal = waitForDeviceResponse(
command, "(config-access-map)#", timeout, obj)
retVal = retVal + vlanAccessMapConfig(
obj, deviceType, "(config-access-map)#", timeout, vlanArg3,
vlanArg4, vlanArg5)
return retVal
else:
retVal = "Error-130"
return retVal
elif(vlanArg1 == "dot1q"):
# debugOutput("dot1q")
command = command + vlanArg1 + " tag native "
if(vlanArg2 is not None):
value = checkSanityofVariable(
deviceType, "vlan_dot1q_tag", vlanArg2)
if(value == "ok"):
command = command + vlanArg2
else:
retVal = "Error-131"
return retVal
elif(vlanArg1 == "filter"):
# debugOutput( "filter")
command = command + vlanArg1 + " "
if(vlanArg2 is not None):
value = checkSanityofVariable(
deviceType, "vlan_filter_name", vlanArg2)
if(value == "ok"):
command = command + vlanArg2 + " vlan-list "
value = checkSanityofVariable(deviceType, "vlan_id", vlanArg3)
if(value == "ok"):
command = command + vlanArg3
else:
value = checkSanityofVariable(
deviceType, "vlan_id_range", vlanArg3)
if(value == "ok"):
command = command + vlanArg3
else:
retVal = "ERROR-133"
return retVal
else:
retVal = "Error-132"
return retVal
else:
value = checkSanityofVariable(deviceType, "vlan_id", vlanArg1)
if(value == "ok"):
retVal = createVlan(obj, deviceType, "(config-vlan)#",
timeout, vlanArg1, vlanArg2, vlanArg3,
vlanArg4, vlanArg5)
return retVal
else:
value = checkSanityofVariable(
deviceType, "vlan_id_range", vlanArg1)
if(value == "ok"):
retVal = createVlan(obj, deviceType, "(config-vlan)#",
timeout, vlanArg1, vlanArg2, vlanArg3,
vlanArg4, vlanArg5)
return retVal
retVal = "Error-133"
return retVal
retVal = "Error-134"
return retVal
# debugOutput(command)
command = command + "\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, prompt, timeout, obj)
return retVal
# EOM
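# Comment-only usage sketch ("obj" and the deviceType are assumptions):
#
#   output = vlanConfig(obj, "g8272_cnos", "(config)#", 10,
#                       "13", "name", "Anil", None, None)
#
# Because "13" matches the vlan_id rule, the call is delegated to
# createVlan, which enters "vlan 13" and applies "name Anil" only when
# that name is not already assigned. The name is an example value.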
def vlanAccessMapConfig(
obj, deviceType, prompt, timeout, vlanArg3, vlanArg4, vlanArg5):
retVal = ""
command = ""
if(vlanArg3 == "action"):
command = command + vlanArg3 + " "
value = checkSanityofVariable(
deviceType, "vlan_accessmap_action", vlanArg4)
if(value == "ok"):
command = command + vlanArg4
else:
retVal = "Error-135"
return retVal
elif(vlanArg3 == "match"):
command = command + vlanArg3 + " "
if(vlanArg4 == "ip" or vlanArg4 == "mac"):
command = command + vlanArg4 + " address "
value = checkSanityofVariable(
deviceType, "vlan_access_map_name", vlanArg5)
if(value == "ok"):
command = command + vlanArg5
else:
retVal = "Error-136"
return retVal
else:
retVal = "Error-137"
return retVal
elif(vlanArg3 == "statistics"):
command = vlanArg3 + " per-entry"
else:
retVal = "Error-138"
return retVal
command = command + "\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, prompt, timeout, obj)
return retVal
# EOM
def checkVlanNameNotAssigned(
obj, deviceType, prompt, timeout, vlanId, vlanName):
retVal = "ok"
command = "display vlan id " + vlanId + " \n"
retVal = waitForDeviceResponse(command, prompt, timeout, obj)
if(retVal.find(vlanName) != -1):
return "Nok"
return retVal
# EOM
# Utility Method to create vlan
def createVlan(
obj, deviceType, prompt, timeout, vlanArg1, vlanArg2, vlanArg3,
vlanArg4, vlanArg5):
# vlan config command happens here. It creates if not present
command = "vlan " + vlanArg1 + "\n"
# debugOutput(command)
retVal = waitForDeviceResponse(command, prompt, timeout, obj)
command = ""
if(vlanArg2 == "name"):
# debugOutput("name")
command = vlanArg2 + " "
value = checkSanityofVariable(deviceType, "vlan_name", vlanArg3)
if(value == "ok"):
value = checkVlanNameNotAssigned(obj, deviceType, prompt, timeout,
vlanArg1, vlanArg3)
if(value == "ok"):
command = command + vlanArg3
else:
command = "\n"
else:
retVal = "Error-139"
return retVal
elif (vlanArg2 == "flood"):
# debugOutput("flood")
command = vlanArg2 + " "
value = checkSanityofVariable(deviceType, "vlan_flood", vlanArg3)
if(value == "ok"):
command = command + vlanArg3
else:
retVal = "Error-140"
return retVal
elif(vlanArg2 == "state"):
# debugOutput("state")
command = vlanArg2 + " "
value = checkSanityofVariable(deviceType, "vlan_state", vlanArg3)
if(value == "ok"):
command = command + vlanArg3
else:
retVal = "Error-141"
return retVal
elif(vlanArg2 == "ip"):
# debugOutput("ip")
command = vlanArg2 + " igmp snooping "
# debugOutput("vlanArg3")
if(vlanArg3 is None or vlanArg3 == ""):
# debugOutput("None or empty")
command = command
elif(vlanArg3 == "fast-leave"):
# debugOutput("fast-leave")
command = command + vlanArg3
elif (vlanArg3 == "last-member-query-interval"):
# debugOutput("last-member-query-interval")
command = command + vlanArg3 + " "
value = checkSanityofVariable(
deviceType, "vlan_last_member_query_interval", vlanArg4)
if(value == "ok"):
command = command + vlanArg4
else:
retVal = "Error-142"
return retVal
elif (vlanArg3 == "querier"):
# debugOutput("querier")
command = command + vlanArg3 + " "
value = checkSanityofVariable(deviceType, "vlan_querier", vlanArg4)
if(value == "ok"):
command = command + vlanArg4
else:
retVal = "Error-143"
return retVal
elif (vlanArg3 == "querier-timeout"):
# debugOutput("querier-timeout")
command = command + vlanArg3 + " "
value = checkSanityofVariable(
deviceType, "vlan_querier_timeout", vlanArg4)
if(value == "ok"):
command = command + vlanArg4
else:
retVal = "Error-144"
return retVal
elif (vlanArg3 == "query-interval"):
# debugOutput("query-interval")
command = command + vlanArg3 + " "
value = checkSanityofVariable(
deviceType, "vlan_query_interval", vlanArg4)
if(value == "ok"):
command = command + vlanArg4
else:
retVal = "Error-145"
return retVal
elif (vlanArg3 == "query-max-response-time"):
# debugOutput("query-max-response-time")
command = command + vlanArg3 + " "
value = checkSanityofVariable(
deviceType, "vlan_query_max_response_time", vlanArg4)
if(value == "ok"):
command = command + vlanArg4
else:
retVal = "Error-146"
return retVal
elif (vlanArg3 == "report-suppression"):
# debugOutput("report-suppression")
command = command + vlanArg3
elif (vlanArg3 == "robustness-variable"):
# debugOutput("robustness-variable")
command = command + vlanArg3 + " "
value = checkSanityofVariable(
deviceType, "vlan_robustness_variable", vlanArg4)
if(value == "ok"):
command = command + vlanArg4
else:
retVal = "Error-147"
return retVal
elif (vlanArg3 == "startup-query-count"):
# debugOutput("startup-query-count")
command = command + vlanArg3 + " "
value = checkSanityofVariable(
deviceType, "vlan_startup_query_count", vlanArg4)
if(value == "ok"):
command = command + vlanArg4
else:
retVal = "Error-148"
return retVal
elif (vlanArg3 == "startup-query-interval"):
# debugOutput("startup-query-interval")
command = command + vlanArg3 + " "
value = checkSanityofVariable(
deviceType, "vlan_startup_query_interval", vlanArg4)
if(value == "ok"):
command = command + vlanArg4
else:
retVal = "Error-149"
return retVal
elif (vlanArg3 == "static-group"):
# debugOutput("static-group")
# command = command + vlanArg3 + " "
# value = checkSanityofVariable(deviceType, variableId, vlanArg4)
# if(value == "ok"):
# command = command + vlanArg4
# else :
retVal = "Error-102"
return retVal
elif (vlanArg3 == "version"):
# debugOutput("version")
command = command + vlanArg3 + " "
value = checkSanityofVariable(
deviceType, "vlan_snooping_version", vlanArg4)
if(value == "ok"):
command = command + vlanArg4
else:
retVal = "Error-150"
return retVal
elif (vlanArg3 == "mrouter"):
# debugOutput("mrouter")
command = command + vlanArg3 + " interface "
if(vlanArg4 == "ethernet"):
command = command + vlanArg4 + " "
value = checkSanityofVariable(
deviceType, "vlan_ethernet_interface", vlanArg5)
if(value == "ok"):
command = command + vlanArg5
else:
retVal = "Error-151"
return retVal
elif(vlanArg4 == "port-aggregation"):
command = command + vlanArg4 + " "
value = checkSanityofVariable(
deviceType, "vlan_portagg_number", vlanArg5)
if(value == "ok"):
command = command + vlanArg5
else:
retVal = "Error-152"
return retVal
else:
retVal = "Error-153"
return retVal
else:
command = command + vlanArg3
else:
retVal = "Error-154"
return retVal
command = command + "\n"
# debugOutput(command)
retVal = retVal + "\n" + \
waitForDeviceResponse(command, prompt, timeout, obj)
# Come back to config mode
command = "exit \n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "(config)#", timeout, obj)
return retVal
# EOM
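# Example of an IGMP snooping line built by createVlan (comment-only;
# the querier address is an invented placeholder): vlanArg2="ip",
# vlanArg3="querier", vlanArg4="10.10.10.1" validates the address against
# the "vlan_querier" rule and emits "ip igmp snooping querier 10.10.10.1"
# inside the vlan context, then "exit" back to "(config)#".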
def vlagConfig(
obj, deviceType, prompt, timeout, vlagArg1, vlagArg2, vlagArg3,
vlagArg4):
retVal = ""
# vlag config command happens here.
command = "vlag "
if(vlagArg1 == "enable"):
# debugOutput("enable")
command = command + vlagArg1 + " "
elif(vlagArg1 == "auto-recovery"):
# debugOutput("auto-recovery")
command = command + vlagArg1 + " "
value = checkSanityofVariable(
deviceType, "vlag_auto_recovery", vlagArg2)
if(value == "ok"):
command = command + vlagArg2
else:
retVal = "Error-160"
return retVal
elif(vlagArg1 == "config-consistency"):
# debugOutput("config-consistency")
command = command + vlagArg1 + " "
value = checkSanityofVariable(
deviceType, "vlag_config_consistency", vlagArg2)
if(value == "ok"):
command = command + vlagArg2
else:
retVal = "Error-161"
return retVal
elif(vlagArg1 == "isl"):
# debugOutput("isl")
command = command + vlagArg1 + " port-aggregation "
value = checkSanityofVariable(
deviceType, "vlag_port_aggregation", vlagArg2)
if(value == "ok"):
command = command + vlagArg2
else:
retVal = "Error-162"
return retVal
elif(vlagArg1 == "mac-address-table"):
# debugOutput("mac-address-table")
command = command + vlagArg1 + " refresh"
elif(vlagArg1 == "peer-gateway"):
# debugOutput("peer-gateway")
command = command + vlagArg1 + " "
elif(vlagArg1 == "priority"):
# debugOutput("priority")
command = command + vlagArg1 + " "
value = checkSanityofVariable(deviceType, "vlag_priority", vlagArg2)
if(value == "ok"):
command = command + vlagArg2
else:
retVal = "Error-163"
return retVal
elif(vlagArg1 == "startup-delay"):
# debugOutput("startup-delay")
command = command + vlagArg1 + " "
value = checkSanityofVariable(
deviceType, "vlag_startup_delay", vlagArg2)
if(value == "ok"):
command = command + vlagArg2
else:
retVal = "Error-164"
return retVal
elif(vlagArg1 == "tier-id"):
# debugOutput("tier-id")
command = command + vlagArg1 + " "
value = checkSanityofVariable(deviceType, "vlag_tier_id", vlagArg2)
if(value == "ok"):
command = command + vlagArg2
else:
retVal = "Error-165"
return retVal
elif(vlagArg1 == "vrrp"):
# debugOutput("vrrp")
command = command + vlagArg1 + " active"
elif(vlagArg1 == "instance"):
# debugOutput("instance")
command = command + vlagArg1 + " "
value = checkSanityofVariable(deviceType, "vlag_instance", vlagArg2)
if(value == "ok"):
command = command + vlagArg2
if(vlagArg3 is not None):
command = command + " port-aggregation "
value = checkSanityofVariable(
deviceType, "vlag_port_aggregation", vlagArg3)
if(value == "ok"):
command = command + vlagArg3
else:
retVal = "Error-162"
return retVal
else:
command = command + " enable "
else:
retVal = "Error-166"
return retVal
elif(vlagArg1 == "hlthchk"):
# debugOutput("hlthchk")
command = command + vlagArg1 + " "
value = checkSanityofVariable(
deviceType, "vlag_hlthchk_options", vlagArg2)
if(value == "ok"):
if(vlagArg2 == "keepalive-attempts"):
value = checkSanityofVariable(
deviceType, "vlag_keepalive_attempts", vlagArg3)
if(value == "ok"):
command = command + vlagArg2 + " " + vlagArg3
else:
retVal = "Error-167"
return retVal
elif(vlagArg2 == "keepalive-interval"):
value = checkSanityofVariable(
deviceType, "vlag_keepalive_interval", vlagArg3)
if(value == "ok"):
command = command + vlagArg2 + " " + vlagArg3
else:
retVal = "Error-168"
return retVal
elif(vlagArg2 == "retry-interval"):
value = checkSanityofVariable(
deviceType, "vlag_retry_interval", vlagArg3)
if(value == "ok"):
command = command + vlagArg2 + " " + vlagArg3
else:
retVal = "Error-169"
return retVal
elif(vlagArg2 == "peer-ip"):
            # Note: only the IPv4 form of peer-ip is handled here.
value = checkSanityofVariable(
deviceType, "vlag_peerip", vlagArg3)
if(value == "ok"):
command = command + vlagArg2 + " " + vlagArg3
if(vlagArg4 is not None):
value = checkSanityofVariable(
deviceType, "vlag_peerip_vrf", vlagArg4)
if(value == "ok"):
command = command + " vrf " + vlagArg4
else:
retVal = "Error-170"
return retVal
else:
retVal = "Error-171"
return retVal
else:
retVal = "Error-172"
return retVal
# debugOutput(command)
command = command + "\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "(config)#", timeout, obj)
return retVal
# EOM
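# Comment-only usage sketch for the health-check branch ("obj", the
# deviceType and the peer address are assumptions):
#
#   output = vlagConfig(obj, "g8272_cnos", "(config)#", 10,
#                       "hlthchk", "peer-ip", "10.10.10.10", None)
#
# This validates the address against the "vlag_peerip" rule and sends
# "vlag hlthchk peer-ip 10.10.10.10"; a vrf name passed in vlagArg4
# appends " vrf <name>".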
# Utility Method to back up the startup config
# This method supports only TFTP or FTP
# Tuning of timeout parameter is pending
def doStartupConfigBackUp(
protocol, timeout, confServerIp, confPath, confServerUser,
confServerPwd, obj):
# server = "10.241.105.214"
server = confServerIp
# username = "pbhosale"
username = confServerUser
# password = "Lab4man1"
password = confServerPwd
path = "cnos_config"
if((confPath is not None) & (confPath != "")):
path = confPath
retVal = ""
# config backup command happens here
if(protocol == "ftp"):
command = "cp startup-config " + protocol + " " + protocol + "://" + \
username + "@" + server + "/" + path + " vrf management\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
# debugOutput(command)
        retVal = retVal + waitForDeviceResponse(
            password + "\n", "#", timeout, obj)
elif(protocol == "tftp"):
command = "cp startup-config " + protocol + " " + protocol + \
"://" + server + "/" + path + " vrf management\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "#", 3, obj)
else:
return "Error-110"
return retVal
# EOM
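# For reference, the exact copy command assembled above for TFTP is
# (server and path are placeholder values):
#
#   cp startup-config tftp tftp://10.241.106.118/cnos_config vrf management
#
# The FTP form additionally embeds "<user>@" before the server address
# and then answers the "Password:" prompt with confServerPwd.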
# Utility Method to back up the startup config
# This method supports only SCP or SFTP
# Tuning of timeout parameter is pending
def doSecureStartupConfigBackUp(
protocol, timeout, confServerIp, confPath, confServerUser,
confServerPwd, obj):
# server = "10.241.105.214"
server = confServerIp
# username = "pbhosale"
username = confServerUser
# password = "Lab4man1"
password = confServerPwd
path = "cnos_config"
if((confPath is not None) and (confPath != "")):
path = confPath
retVal = ""
# config backup command happens here
command = "cp startup-config " + protocol + " " + protocol + "://" + \
username + "@" + server + "/" + path + " vrf management\n"
# debugOutput(command)
response = waitForDeviceResponse(command, "(yes/no)", 3, obj)
if(response.lower().find("error-101")):
command = password + "\n"
retVal = retVal + waitForDeviceResponse(command, "#", timeout, obj)
return retVal
retVal = retVal + response
if(protocol == "scp"):
command = "yes \n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "timeout:", 3, obj)
command = "0\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
elif(protocol == "sftp"):
command = "yes \n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
else:
return "Error-110"
# Password entry happens here
# debugOutput(command)
command = password + "\n"
retVal = retVal + waitForDeviceResponse(command, "#", timeout, obj)
return retVal
# EOM
# Utility Method to restore the startup config
# This method supports only TFTP or FTP
# Tuning of timeout parameter is pending
def doStartUpConfigRollback(
protocol, timeout, confServerIp, confPath, confServerUser,
confServerPwd, obj):
# server = "10.241.105.214"
server = confServerIp
# username = "pbhosale"
username = confServerUser
# password = "Lab4man1"
password = confServerPwd
path = "cnos_config"
if((confPath is not None) & (confPath != "")):
path = confPath
retVal = ""
# config backup command happens here
if(protocol == "ftp"):
command = "cp " + protocol + " " + protocol + "://" + username + \
"@" + server + "/" + path + " startup-config vrf management\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "[n]", timeout, obj)
command = "y\n"
retVal = retVal + waitForDeviceResponse(password, "#", timeout, obj)
elif(protocol == "tftp"):
command = "cp " + protocol + " " + protocol + "://" + \
server + "/" + path + " startup-config vrf management\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "[n]", timeout, obj)
command = "y\n"
retVal = retVal + waitForDeviceResponse(command, "#", timeout, obj)
else:
return "Error-110"
return retVal
# EOM
# Utility Method to restore the startup config
# This method supports only SCP or SFTP
# Tuning of timeout parameter is pending
def doSecureStartUpConfigRollback(
protocol, timeout, confServerIp, confPath, confServerUser,
confServerPwd, obj):
# server = "10.241.105.214"
server = confServerIp
# username = "pbhosale"
username = confServerUser
# password = "Lab4man1"
password = confServerPwd
path = "cnos_config"
if((confPath is not None) and (confPath != "")):
path = confPath
retVal = ""
# config backup command happens here
# cp sftp sftp://[email protected]/cnos_config/running_config.conf
# startup-config vrf management
command = "cp " + protocol + " " + protocol + "://" + username + \
"@" + server + "/" + path + " startup-config vrf management \n"
# debugOutput(command)
response = waitForDeviceResponse(command, "(yes/no)", 3, obj)
if(response.lower().find("error-101")):
command = password + "\n"
retVal = retVal + waitForDeviceResponse(command, "[n]", timeout, obj)
command = "y\n"
retVal = retVal + waitForDeviceResponse(command, "#", timeout, obj)
return retVal
retVal = retVal + response
if(protocol == "scp"):
command = "yes \n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "timeout:", 3, obj)
command = "0\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
elif(protocol == "sftp"):
command = "yes \n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
else:
return "Error-110"
# Password entry happens here
# debugOutput(command)
command = password + "\n"
retVal = retVal + waitForDeviceResponse(command, "[n]", timeout, obj)
command = "y\n"
retVal = retVal + waitForDeviceResponse(command, "#", timeout, obj)
return retVal
# EOM
# Utility Method to back up the Running config
# This method supports only TFTP or FTP
# Tuning of timeout parameter is pending
def doRunningConfigBackUp(
protocol, timeout, confServerIp, confPath, confServerUser,
confServerPwd, obj):
# server = "10.241.105.214"
server = confServerIp
# username = "pbhosale"
username = confServerUser
# password = "Lab4man1"
password = confServerPwd
path = "cnos_config"
if((confPath is not None) & (confPath != "")):
path = confPath
retVal = ""
# config backup command happens here
if(protocol == "ftp"):
command = "cp running-config " + protocol + " " + protocol + "://" + \
username + "@" + server + "/" + path + " vrf management\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
# debugOutput(command)
        retVal = retVal + waitForDeviceResponse(
            password + "\n", "#", timeout, obj)
elif(protocol == "tftp"):
command = "cp running-config " + protocol + " " + protocol + \
"://" + server + "/" + path + " vrf management\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "#", 3, obj)
else:
return "Error-110"
return retVal
# EOM
# Utility Method to back up the running config
# This method supports only SCP or SFTP
# Tuning of timeout parameter is pending
def doSecureRunningConfigBackUp(
protocol, timeout, confServerIp, confPath, confServerUser,
confServerPwd, obj):
# server = "10.241.105.214"
server = confServerIp
# username = "pbhosale"
username = confServerUser
# password = "Lab4man1"
password = confServerPwd
path = "cnos_config"
if((confPath is not None) and (confPath != "")):
path = confPath
retVal = ""
# config backup command happens here
command = "cp running-config " + protocol + " " + protocol + "://" + \
username + "@" + server + "/" + path + " vrf management\n"
# debugOutput(command)
response = waitForDeviceResponse(command, "(yes/no)", 3, obj)
if(response.lower().find("error-101")):
command = password + "\n"
retVal = retVal + waitForDeviceResponse(command, "#", timeout, obj)
return retVal
retVal = retVal + response
if(protocol == "scp"):
command = "yes \n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "timeout:", 3, obj)
command = "0\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
elif(protocol == "sftp"):
command = "yes \n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
else:
return "Error-110"
# Password entry happens here
# debugOutput(command)
command = password + "\n"
retVal = retVal + waitForDeviceResponse(command, "#", timeout, obj)
return retVal
# EOM
# Utility Method to restore the Running config
# This method supports only TFTP or FTP
# Tuning of timeout parameter is pending
def doRunningConfigRollback(
protocol, timeout, confServerIp, confPath, confServerUser,
confServerPwd, obj):
# server = "10.241.105.214"
server = confServerIp
# username = "pbhosale"
username = confServerUser
# password = "Lab4man1"
password = confServerPwd
path = "cnos_config"
if((confPath is not None) & (confPath != "")):
path = confPath
retVal = ""
# config backup command happens here
if(protocol == "ftp"):
command = "cp " + protocol + " " + protocol + "://" + username + \
"@" + server + "/" + path + " running-config vrf management\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
# debugOutput(command)
        retVal = retVal + waitForDeviceResponse(
            password + "\n", "#", timeout, obj)
elif(protocol == "tftp"):
command = "cp " + protocol + " " + protocol + "://" + \
server + "/" + path + " running-config vrf management\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "#", timeout, obj)
else:
return "Error-110"
return retVal
# EOM
# Utility Method to restore the running config
# This method supports only SCP or SFTP
# Tuning of timeout parameter is pending
def doSecureRunningConfigRollback(
protocol, timeout, confServerIp, confPath, confServerUser,
confServerPwd, obj):
# server = "10.241.105.214"
server = confServerIp
# username = "pbhosale"
username = confServerUser
# password = "Lab4man1"
password = confServerPwd
path = "cnos_config"
if((confPath is not None) and (confPath != "")):
path = confPath
retVal = ""
# config backup command happens here
# cp sftp sftp://[email protected]/cnos_config/running_config.conf
# running-config vrf management
command = "cp " + protocol + " " + protocol + "://" + username + \
"@" + server + "/" + path + " running-config vrf management \n"
# debugOutput(command)
response = waitForDeviceResponse(command, "(yes/no)", 3, obj)
if(response.lower().find("error-101")):
command = password + "\n"
retVal = retVal + waitForDeviceResponse(command, "#", timeout, obj)
return retVal
retVal = retVal + response
if(protocol == "scp"):
command = "yes \n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "timeout:", 3, obj)
command = "0\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
elif(protocol == "sftp"):
command = "yes \n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
else:
return "Error-110"
# Password entry happens here
# debugOutput(command)
command = password + "\n"
retVal = retVal + waitForDeviceResponse(command, "#", timeout, obj)
return retVal
# EOM
# Utility Method to download an image from FTP/TFTP server to device.
# This method supports only FTP or TFTP
# Tuning of timeout parameter is pending
def doImageTransfer(
protocol, timeout, imgServerIp, imgPath, imgType, imgServerUser,
imgServerPwd, obj):
# server = "10.241.106.118"
server = imgServerIp
# username = "root"
username = imgServerUser
# password = "root123"
password = imgServerPwd
type = "os"
if(imgType is not None):
type = imgType.lower()
path = "cnos_images"
if((imgPath is not None) and (imgPath != "")):
path = imgPath
retVal = ""
# Image transfer command happens here
if(protocol == "ftp"):
command = "cp " + protocol + " " + protocol + "://" + username + \
"@" + server + "/" + path + " system-image " + type + \
" vrf management\n"
elif(protocol == "tftp"):
command = "cp " + protocol + " " + protocol + "://" + server + \
"/" + path + " system-image " + type + " vrf management\n"
else:
return "Error-110"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "[n]", 3, obj)
# Confirmation command happens here
command = "y\n"
# debugOutput(command)
# retVal = retVal+ waitForDeviceResponse(command, "(yes/no)?", 3, obj)
# command = "Yes \n"
# debugOutput(command)
if(protocol == "ftp"):
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
# Password entry happens here Only for FTP
command = password + " \n"
# debugOutput(command)
# Change to standby image y
retVal = retVal + waitForDeviceResponse(command, "[n]", timeout, obj)
command = "y\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "#", timeout, obj)
return retVal
# EOM
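# For reference, the TFTP image command assembled above looks like this
# (placeholder server and path, default image type "os"):
#
#   cp tftp tftp://10.241.106.118/cnos_images system-image os vrf management
#
# The device then asks for confirmation ("[n]") and, for FTP, also for
# the server password before offering to switch the standby image.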
# Utility Method to download an image from SFTP/SCP server to device.
# This method supports only SCP or SFTP
# Tuning of timeout parameter is pending
def doSecureImageTransfer(
protocol, timeout, imgServerIp, imgPath, imgType, imgServerUser,
imgServerPwd, obj):
# server = "10.241.105.214"
server = imgServerIp
# username = "pbhosale"
username = imgServerUser
# password = "Lab4man1"
password = imgServerPwd
type = "scp"
if(imgType is not None):
type = imgType.lower()
path = "cnos_images"
if((imgPath is not None) and(imgPath != "")):
path = imgPath
retVal = ""
# Image transfer command happens here
command = "cp " + protocol + " " + protocol + "://" + username + "@" + \
server + "/" + path + " system-image " + type + " vrf management \n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "[n]", 3, obj)
# Confirmation command happens here
if(protocol == "scp"):
command = "y\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "(yes/no)?", 3, obj)
command = "Yes\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "timeout:", 3, obj)
command = "0\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
elif(protocol == "sftp"):
command = "y\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "(yes/no)?", 3, obj)
command = "Yes\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "Password:", 3, obj)
else:
return "Error-110"
# Password entry happens here
command = password + "\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "[n]", timeout, obj)
# Change to standby image y
command = "y\n"
# debugOutput(command)
retVal = retVal + waitForDeviceResponse(command, "#", timeout, obj)
return retVal
# EOM
# Method to enter enable mode on the device
#
def enterEnableModeForDevice(enablePassword, timeout, obj):
command = "enable\n"
pwdPrompt = "password:"
# debugOutput(enablePassword)
# debugOutput('\n')
obj.settimeout(int(timeout))
# Executing enable
obj.send(command)
flag = False
retVal = ""
count = 5
while not flag:
        # If wait time is exceeded.
if(count == 0):
flag = True
else:
count = count - 1
# A delay of one second
time.sleep(1)
try:
buffByte = obj.recv(9999)
buff = buffByte.decode()
retVal = retVal + buff
# debugOutput(buff)
gotit = buff.find(pwdPrompt)
if(gotit != -1):
time.sleep(1)
if(enablePassword is None or enablePassword == ""):
return "\n Error-106"
obj.send(enablePassword)
obj.send("\r")
obj.send("\n")
time.sleep(1)
innerBuffByte = obj.recv(9999)
innerBuff = innerBuffByte.decode()
retVal = retVal + innerBuff
# debugOutput(innerBuff)
innerGotit = innerBuff.find("#")
if(innerGotit != -1):
return retVal
else:
gotit = buff.find("#")
if(gotit != -1):
return retVal
            except Exception:
                retVal = retVal + "\n Error-101"
                flag = True
if(retVal == ""):
retVal = "\n Error-101"
return retVal
# EOM
# Method for device response wait for a time delay
#
def waitForDeviceResponse(command, prompt, timeout, obj):
obj.settimeout(int(timeout))
obj.send(command)
flag = False
retVal = ""
while not flag:
time.sleep(1)
try:
buffByte = obj.recv(9999)
buff = buffByte.decode()
retVal = retVal + buff
# debugOutput(retVal)
gotit = buff.find(prompt)
if(gotit != -1):
flag = True
        except Exception:
            # A timeout at the yes/no confirmation prompt is expected and not
            # an error; anything else is reported as Error-101.
            if prompt != "(yes/no)?":
                retVal = retVal + "\n Error-101"
            flag = True
return retVal
# EOM
def checkOutputForError(output):
retVal = ""
index = output.lower().find("error")
startIndex = index + 6
if(index == -1):
index = output.lower().find("invalid")
startIndex = index + 8
if(index == -1):
index = output.lower().find("incorrect")
            # skip len("incorrect") plus the '-' separator
            startIndex = index + 10
if(index == -1):
index = output.lower().find("failure")
startIndex = index + 8
if(index == -1):
return None
endIndex = startIndex + 3
errorCode = output[startIndex:endIndex]
result = errorCode.isdigit()
if(result is not True):
return "Device returned an Error. Please check Results for more \
information"
errorFile = "dictionary/ErrorCodes.lvo"
    try:
        # Look the code up in the on-disk dictionary; 'with' ensures the
        # file is closed on every path.
        with open(errorFile, 'r') as f:
            for line in f:
                if('=' in line):
                    data = line.split('=')
                    if(data[0].strip() == errorCode):
                        errorString = data[1].strip()
                        return errorString
except Exception:
errorString = cnos_errorcodes.getErrorString(errorCode)
errorString = errorString.strip()
return errorString
return "Error Code Not Found"
# EOM
def checkSanityofVariable(deviceType, variableId, variableValue):
retVal = ""
ruleFile = "dictionary/" + deviceType + "_rules.lvo"
ruleString = getRuleStringForVariable(deviceType, ruleFile, variableId)
retVal = validateValueAgainstRule(ruleString, variableValue)
return retVal
# EOM
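# Illustrative usage sketch (added for clarity; not part of the original
# module): validating a user-supplied value against the device rule file
# before building a CLI command. 'g8272_cnos' and 'vlan_id' are hypothetical
# deviceType/variableId values; the real identifiers live in the rules file.
def _exampleSanityCheck():
    status = checkSanityofVariable('g8272_cnos', 'vlan_id', '100')
    if(status != "ok"):
        print("validation failed: " + str(status))
# EOM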
def getRuleStringForVariable(deviceType, ruleFile, variableId):
retVal = ""
try:
with open(ruleFile, 'r') as f:
for line in f:
# debugOutput(line)
if(':' in line):
data = line.split(':')
# debugOutput(data[0])
if(data[0].strip() == variableId):
retVal = line
except Exception:
ruleString = cnos_devicerules.getRuleString(deviceType, variableId)
retVal = ruleString.strip()
return retVal
# EOM
def validateValueAgainstRule(ruleString, variableValue):
retVal = ""
if(ruleString == ""):
return 1
rules = ruleString.split(':')
variableType = rules[1].strip()
varRange = rules[2].strip()
if(variableType == "INTEGER"):
result = checkInteger(variableValue)
if(result is True):
return "ok"
else:
return "Error-111"
elif(variableType == "FLOAT"):
result = checkFloat(variableValue)
if(result is True):
return "ok"
else:
return "Error-112"
elif(variableType == "INTEGER_VALUE"):
int_range = varRange.split('-')
r = range(int(int_range[0].strip()), int(int_range[1].strip()))
if(checkInteger(variableValue) is not True):
return "Error-111"
result = int(variableValue) in r
if(result is True):
return "ok"
else:
return "Error-113"
elif(variableType == "INTEGER_VALUE_RANGE"):
int_range = varRange.split('-')
varLower = int_range[0].strip()
varHigher = int_range[1].strip()
r = range(int(varLower), int(varHigher))
val_range = variableValue.split('-')
try:
valLower = val_range[0].strip()
valHigher = val_range[1].strip()
except Exception:
return "Error-113"
if((checkInteger(valLower) is not True) or
(checkInteger(valHigher) is not True)):
# debugOutput("Error-114")
return "Error-114"
result = (int(valLower) in r) and (int(valHigher)in r) \
and (int(valLower) < int(valHigher))
if(result is True):
return "ok"
else:
# debugOutput("Error-113")
return "Error-113"
elif(variableType == "INTEGER_OPTIONS"):
int_options = varRange.split(',')
if(checkInteger(variableValue) is not True):
return "Error-111"
for opt in int_options:
if(opt.strip() is variableValue):
result = True
break
if(result is True):
return "ok"
else:
return "Error-115"
elif(variableType == "LONG"):
result = checkInteger(variableValue)
if(result is True):
return "ok"
else:
return "Error-116"
elif(variableType == "LONG_VALUE"):
long_range = varRange.split('-')
r = range(int(long_range[0].strip()), int(long_range[1].strip()))
if(checkInteger(variableValue) is not True):
# debugOutput(variableValue)
return "Error-116"
result = int(variableValue) in r
if(result is True):
return "ok"
else:
return "Error-113"
elif(variableType == "LONG_VALUE_RANGE"):
long_range = varRange.split('-')
r = range(int(long_range[0].strip()), int(long_range[1].strip()))
val_range = variableValue.split('-')
if((checkInteger(val_range[0]) is not True) or
(checkInteger(val_range[1]) is not True)):
return "Error-117"
        result = (int(val_range[0]) in r) and (int(val_range[1]) in r) \
            and (int(val_range[0]) < int(val_range[1]))
if(result is True):
return "ok"
else:
return "Error-113"
elif(variableType == "LONG_OPTIONS"):
long_options = varRange.split(',')
if(checkInteger(variableValue) is not True):
return "Error-116"
        result = False
        for opt in long_options:
if(opt.strip() == variableValue):
result = True
break
if(result is True):
return "ok"
else:
return "Error-115"
elif(variableType == "TEXT"):
if(variableValue == ""):
return "Error-118"
        if(isinstance(variableValue, str)):
return "ok"
else:
return "Error-119"
elif(variableType == "NO_VALIDATION"):
if(variableValue == ""):
return "Error-118"
else:
return "ok"
elif(variableType == "TEXT_OR_EMPTY"):
if(variableValue is None or variableValue == ""):
return "ok"
        if(isinstance(variableValue, str)):
return "ok"
else:
return "Error-119"
elif(variableType == "MATCH_TEXT"):
if(variableValue == ""):
return "Error-118"
if(isinstance(variableValue, str)):
if(varRange == variableValue):
return "ok"
else:
return "Error-120"
else:
return "Error-119"
elif(variableType == "MATCH_TEXT_OR_EMPTY"):
if(variableValue is None or variableValue == ""):
return "ok"
if(isinstance(variableValue, str)):
if(varRange == variableValue):
return "ok"
else:
return "Error-120"
else:
return "Error-119"
elif(variableType == "TEXT_OPTIONS"):
str_options = varRange.split(',')
if(isinstance(variableValue, str) is not True):
return "Error-119"
result = False
for opt in str_options:
if(opt.strip() == variableValue):
result = True
break
if(result is True):
return "ok"
else:
return "Error-115"
elif(variableType == "TEXT_OPTIONS_OR_EMPTY"):
if(variableValue is None or variableValue == ""):
return "ok"
str_options = varRange.split(',')
if(isinstance(variableValue, str) is not True):
return "Error-119"
        result = False
        for opt in str_options:
if(opt.strip() == variableValue):
result = True
break
if(result is True):
return "ok"
else:
return "Error-115"
elif(variableType == "IPV4Address"):
try:
socket.inet_pton(socket.AF_INET, variableValue)
result = True
        except socket.error:
result = False
if(result is True):
return "ok"
else:
return "Error-121"
elif(variableType == "IPV4AddressWithMask"):
if(variableValue is None or variableValue == ""):
return "Error-119"
        str_options = variableValue.split('/')
        if(len(str_options) != 2):
            return "Error-121"
        ipaddr = str_options[0]
        mask = str_options[1]
try:
socket.inet_pton(socket.AF_INET, ipaddr)
if(checkInteger(mask) is True):
result = True
else:
result = False
        except socket.error:
result = False
if(result is True):
return "ok"
else:
return "Error-121"
elif(variableType == "IPV6Address"):
try:
socket.inet_pton(socket.AF_INET6, variableValue)
result = True
        except socket.error:
result = False
if(result is True):
return "ok"
else:
return "Error-122"
return retVal
# EOM
def disablePaging(remote_conn):
remote_conn.send("terminal length 0\n")
time.sleep(1)
# Clear the buffer on the screen
outputByte = remote_conn.recv(1000)
output = outputByte.decode()
return output
# EOM
def checkInteger(s):
try:
int(s)
return True
except ValueError:
return False
# EOM
def checkFloat(s):
try:
float(s)
return True
except ValueError:
return False
# EOM
def debugOutput(command):
    # Append the command trace to a local debug file; text mode converts
    # \n to os.linesep on write, and 'with' closes the file deterministically.
    with open('debugOutput.txt', 'a') as f:
        f.write(str(command))
# EOM
|
gpl-3.0
|
mith1979/ansible_automation
|
applied_python/applied_python/lib/python2.7/site-packages/pyasn1/type/char.py
|
172
|
2043
|
# ASN.1 "character string" types
from pyasn1.type import univ, tag
class NumericString(univ.OctetString):
tagSet = univ.OctetString.tagSet.tagImplicitly(
tag.Tag(tag.tagClassUniversal, tag.tagFormatSimple, 18)
)
class PrintableString(univ.OctetString):
tagSet = univ.OctetString.tagSet.tagImplicitly(
tag.Tag(tag.tagClassUniversal, tag.tagFormatSimple, 19)
)
class TeletexString(univ.OctetString):
tagSet = univ.OctetString.tagSet.tagImplicitly(
tag.Tag(tag.tagClassUniversal, tag.tagFormatSimple, 20)
)
class T61String(TeletexString): pass
class VideotexString(univ.OctetString):
tagSet = univ.OctetString.tagSet.tagImplicitly(
tag.Tag(tag.tagClassUniversal, tag.tagFormatSimple, 21)
)
class IA5String(univ.OctetString):
tagSet = univ.OctetString.tagSet.tagImplicitly(
tag.Tag(tag.tagClassUniversal, tag.tagFormatSimple, 22)
)
class GraphicString(univ.OctetString):
tagSet = univ.OctetString.tagSet.tagImplicitly(
tag.Tag(tag.tagClassUniversal, tag.tagFormatSimple, 25)
)
class VisibleString(univ.OctetString):
tagSet = univ.OctetString.tagSet.tagImplicitly(
tag.Tag(tag.tagClassUniversal, tag.tagFormatSimple, 26)
)
class ISO646String(VisibleString): pass
class GeneralString(univ.OctetString):
tagSet = univ.OctetString.tagSet.tagImplicitly(
tag.Tag(tag.tagClassUniversal, tag.tagFormatSimple, 27)
)
class UniversalString(univ.OctetString):
tagSet = univ.OctetString.tagSet.tagImplicitly(
tag.Tag(tag.tagClassUniversal, tag.tagFormatSimple, 28)
)
encoding = "utf-32-be"
class BMPString(univ.OctetString):
tagSet = univ.OctetString.tagSet.tagImplicitly(
tag.Tag(tag.tagClassUniversal, tag.tagFormatSimple, 30)
)
encoding = "utf-16-be"
class UTF8String(univ.OctetString):
tagSet = univ.OctetString.tagSet.tagImplicitly(
tag.Tag(tag.tagClassUniversal, tag.tagFormatSimple, 12)
)
encoding = "utf-8"
|
apache-2.0
|
yamada-h/ryu
|
ryu/lib/of_config/base.py
|
7
|
4331
|
# Copyright (C) 2013 Nippon Telegraph and Telephone Corporation.
# Copyright (C) 2013 YAMAMOTO Takashi <yamamoto at valinux co jp>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# convenient classes to manipulate OF-Config XML
# in a little more pythonic way.
# currently assuming OF-Config 1.1.1.
from ryu.lib import stringify
from lxml import objectify
import lxml.etree as ET
_ns_of111 = 'urn:onf:of111:config:yang'
_ns_netconf = 'urn:ietf:params:xml:ns:netconf:base:1.0'
_nsmap = {
'of111': _ns_of111,
'nc': _ns_netconf,
}
def _pythonify(name):
return name.replace('-', '_')
class _e(object):
def __init__(self, name, is_list):
self.name = name
self.cls = None
self.is_list = is_list
# complexType
class _ct(_e):
def __init__(self, name, cls, is_list):
super(_ct, self).__init__(name, is_list)
self.cls = cls
class _Base(stringify.StringifyMixin):
_M = objectify.ElementMaker(annotate=False,
namespace=_ns_of111,
nsmap=_nsmap)
def __init__(self, **kwargs):
for e in self._ELEMENTS:
k = _pythonify(e.name)
try:
v = kwargs.pop(k)
                assert e.name not in kwargs
except KeyError:
k = e.name
try:
v = kwargs.pop(k)
except KeyError:
if e.is_list:
v = []
else:
v = None
setattr(self, k, v)
if kwargs:
raise TypeError('unknown kwargs %s' % kwargs)
def to_et(self, tag):
def convert(v):
if isinstance(v, _Base):
return v.to_et(e.name)
elif isinstance(v, objectify.ObjectifiedElement):
assert ET.QName(v.tag).localname == itag
return v
return self._M(itag, v)
args = []
for e in self._ELEMENTS:
itag = e.name
k = _pythonify(itag)
v = getattr(self, k)
if v is None:
continue
if isinstance(v, list):
assert e.is_list
ele = map(convert, v)
else:
assert not e.is_list
ele = [convert(v)]
args.extend(ele)
return self._M(tag, *args)
def to_xml(self, tag):
e = self.to_et(tag)
return ET.tostring(e, pretty_print=True)
@classmethod
def from_xml(cls, xmlstring):
et = objectify.fromstring(xmlstring)
return cls.from_et(et)
@classmethod
def from_et(cls, et):
def convert(v):
            if e.cls is not None:
return e.cls.from_et(v)
return v
kwargs = {}
for e in cls._ELEMENTS:
try:
v = et[e.name]
except AttributeError:
continue
assert isinstance(v, objectify.ObjectifiedElement)
if len(v) == 1:
v = convert(v)
if e.is_list:
v = [v]
else:
assert e.is_list
v = map(convert, v)
k = _pythonify(e.name)
            assert k not in kwargs
kwargs[k] = v
return cls(**kwargs)
def __getattribute__(self, k):
return stringify.StringifyMixin.__getattribute__(self, _pythonify(k))
def __setattr__(self, k, v):
stringify.StringifyMixin.__setattr__(self, _pythonify(k), v)
class _Unimpl(_Base):
_ELEMENTS = [
_e('raw_et', is_list=False),
]
def to_et(self, tag):
assert self.raw_et.tag == tag
return self.raw_et
@classmethod
def from_et(cls, et):
return cls(raw_et=et)
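# Illustrative sketch (added; not part of the original module): a minimal
# concrete subclass showing the _ELEMENTS convention and an XML round-trip.
# 'example-port', 'name' and 'number' are made-up element names, not taken
# from the OF-Config schema.
class _ExamplePort(_Base):
    _ELEMENTS = [
        _e('name', is_list=False),
        _e('number', is_list=False),
    ]

if __name__ == '__main__':
    p = _ExamplePort(name='port1', number=1)
    xml = p.to_xml('example-port')
    print(xml)
    print(_ExamplePort.from_xml(xml))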
|
apache-2.0
|
kangkot/arangodb
|
3rdParty/V8-4.3.61/third_party/python_26/Lib/test/test_gdbm.py
|
58
|
2501
|
import gdbm
import unittest
import os
from test.test_support import verbose, TESTFN, run_unittest, unlink
filename = TESTFN
class TestGdbm(unittest.TestCase):
def setUp(self):
self.g = None
def tearDown(self):
if self.g is not None:
self.g.close()
unlink(filename)
def test_key_methods(self):
self.g = gdbm.open(filename, 'c')
self.assertEqual(self.g.keys(), [])
self.g['a'] = 'b'
self.g['12345678910'] = '019237410982340912840198242'
key_set = set(self.g.keys())
self.assertEqual(key_set, frozenset(['a', '12345678910']))
self.assert_(self.g.has_key('a'))
key = self.g.firstkey()
while key:
self.assert_(key in key_set)
key_set.remove(key)
key = self.g.nextkey(key)
self.assertRaises(KeyError, lambda: self.g['xxx'])
def test_error_conditions(self):
# Try to open a non-existent database.
unlink(filename)
self.assertRaises(gdbm.error, gdbm.open, filename, 'r')
# Try to access a closed database.
self.g = gdbm.open(filename, 'c')
self.g.close()
self.assertRaises(gdbm.error, lambda: self.g['a'])
        # Try passing an invalid open flag.
self.assertRaises(gdbm.error, lambda: gdbm.open(filename, 'rx').close())
def test_flags(self):
# Test the flag parameter open() by trying all supported flag modes.
all = set(gdbm.open_flags)
# Test standard flags (presumably "crwn").
modes = all - set('fsu')
for mode in modes:
self.g = gdbm.open(filename, mode)
self.g.close()
# Test additional flags (presumably "fsu").
flags = all - set('crwn')
for mode in modes:
for flag in flags:
self.g = gdbm.open(filename, mode + flag)
self.g.close()
def test_reorganize(self):
self.g = gdbm.open(filename, 'c')
size0 = os.path.getsize(filename)
self.g['x'] = 'x' * 10000
size1 = os.path.getsize(filename)
self.assert_(size0 < size1)
del self.g['x']
# 'size' is supposed to be the same even after deleting an entry.
self.assertEqual(os.path.getsize(filename), size1)
self.g.reorganize()
size2 = os.path.getsize(filename)
self.assert_(size1 > size2 >= size0)
def test_main():
run_unittest(TestGdbm)
if __name__ == '__main__':
test_main()
|
apache-2.0
|
paulballesty/zxcvbn
|
data-scripts/build_keyboard_adjacency_graphs.py
|
9
|
4051
|
#!/usr/bin/python
import sys
import simplejson
def usage():
return '''
constructs adjacency_graphs.coffee from QWERTY, DVORAK and keypad layouts
usage:
%s adjacency_graphs.coffee
''' % sys.argv[0]
qwerty = r'''
`~ 1! 2@ 3# 4$ 5% 6^ 7& 8* 9( 0) -_ =+
qQ wW eE rR tT yY uU iI oO pP [{ ]} \|
aA sS dD fF gG hH jJ kK lL ;: '"
zZ xX cC vV bB nN mM ,< .> /?
'''
dvorak = r'''
`~ 1! 2@ 3# 4$ 5% 6^ 7& 8* 9( 0) [{ ]}
'" ,< .> pP yY fF gG cC rR lL /? =+ \|
aA oO eE uU iI dD hH tT nN sS -_
;: qQ jJ kK xX bB mM wW vV zZ
'''
keypad = r'''
/ * -
7 8 9 +
4 5 6
1 2 3
0 .
'''
mac_keypad = r'''
= / *
7 8 9 -
4 5 6 +
1 2 3
0 .
'''
def get_slanted_adjacent_coords(x, y):
'''
returns the six adjacent coordinates on a standard keyboard, where each row is slanted to the
right from the last. adjacencies are clockwise, starting with key to the left, then two keys
above, then right key, then two keys below. (that is, only near-diagonal keys are adjacent,
so g's coordinate is adjacent to those of t,y,b,v, but not those of r,u,n,c.)
'''
return [(x-1, y), (x, y-1), (x+1, y-1), (x+1, y), (x, y+1), (x-1, y+1)]
def get_aligned_adjacent_coords(x, y):
'''
returns the nine clockwise adjacent coordinates on a keypad, where each row is vert aligned.
'''
return [(x-1, y), (x-1, y-1), (x, y-1), (x+1, y-1), (x+1, y), (x+1, y+1), (x, y+1), (x-1, y+1)]
def build_graph(layout_str, slanted):
'''
builds an adjacency graph as a dictionary: {character: [adjacent_characters]}.
adjacent characters occur in a clockwise order.
for example:
* on qwerty layout, 'g' maps to ['fF', 'tT', 'yY', 'hH', 'bB', 'vV']
* on keypad layout, '7' maps to [None, None, None, '=', '8', '5', '4', None]
'''
position_table = {} # maps from tuple (x,y) -> characters at that position.
tokens = layout_str.split()
token_size = len(tokens[0])
x_unit = token_size + 1 # x position unit len is token len plus 1 for the following whitespace.
adjacency_func = get_slanted_adjacent_coords if slanted else get_aligned_adjacent_coords
assert all(len(token) == token_size for token in tokens), 'token len mismatch:\n ' + layout_str
for y, line in enumerate(layout_str.split('\n')):
# the way I illustrated keys above, each qwerty row is indented one space in from the last
slant = y - 1 if slanted else 0
for token in line.split():
x, remainder = divmod(line.index(token) - slant, x_unit)
assert remainder == 0, 'unexpected x offset for %s in:\n%s' % (token, layout_str)
position_table[(x,y)] = token
adjacency_graph = {}
for (x,y), chars in position_table.iteritems():
for char in chars:
adjacency_graph[char] = []
for coord in adjacency_func(x, y):
# position in the list indicates direction
# (for qwerty, 0 is left, 1 is top, 2 is top right, ...)
# for edge chars like 1 or m, insert None as a placeholder when needed
# so that each character in the graph has a same-length adjacency list.
adjacency_graph[char].append(position_table.get(coord, None))
return adjacency_graph
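# Illustrative check (added; not part of the original script): build the
# qwerty graph and inspect a single entry. Defined but not called by the
# generation step below; per build_graph's docstring, 'g' is expected to
# map to ['fF', 'tT', 'yY', 'hH', 'bB', 'vV'].
def _demo_qwerty_g():
    graph = build_graph(qwerty, True)
    print graph['g']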
if __name__ == '__main__':
if len(sys.argv) != 2:
print usage()
sys.exit(0)
with open(sys.argv[1], 'w') as f:
f.write('# generated by scripts/build_keyboard_adjacency_graphs.py\n')
f.write('adjacency_graphs = \n ')
lines = []
for graph_name, args in [('qwerty', (qwerty, True)),
('dvorak', (dvorak, True)),
('keypad', (keypad, False)),
('mac_keypad', (mac_keypad, False))]:
graph = build_graph(*args)
lines.append('%s: %s' % (graph_name, simplejson.dumps(graph, sort_keys=True)))
f.write('\n '.join(lines))
f.write('\n\n')
f.write('module.exports = adjacency_graphs\n')
sys.exit(0)
|
mit
|
rghe/ansible
|
contrib/inventory/nagios_ndo.py
|
42
|
3808
|
#!/usr/bin/env python
# (c) 2014, Jonathan Lestrelin <[email protected]>
#
# This file is part of Ansible,
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
"""
Nagios NDO external inventory script.
========================================
Returns hosts and hostgroups from Nagios NDO.
Configuration is read from `nagios_ndo.ini`.
"""
import os
import argparse
import sys
try:
import configparser
except ImportError:
import ConfigParser
configparser = ConfigParser
import json
try:
from sqlalchemy import text
from sqlalchemy.engine import create_engine
except ImportError:
sys.exit("Error: SQLAlchemy is needed. Try something like: pip install sqlalchemy")
class NagiosNDOInventory(object):
def read_settings(self):
config = configparser.SafeConfigParser()
config.read(os.path.dirname(os.path.realpath(__file__)) + '/nagios_ndo.ini')
if config.has_option('ndo', 'database_uri'):
self.ndo_database_uri = config.get('ndo', 'database_uri')
def read_cli(self):
parser = argparse.ArgumentParser()
parser.add_argument('--host', nargs=1)
parser.add_argument('--list', action='store_true')
self.options = parser.parse_args()
def get_hosts(self):
engine = create_engine(self.ndo_database_uri)
connection = engine.connect()
select_hosts = text("SELECT display_name \
FROM nagios_hosts")
select_hostgroups = text("SELECT alias \
FROM nagios_hostgroups")
select_hostgroup_hosts = text("SELECT h.display_name \
FROM nagios_hostgroup_members hgm, nagios_hosts h, nagios_hostgroups hg \
WHERE hgm.hostgroup_id = hg.hostgroup_id \
AND hgm.host_object_id = h.host_object_id \
AND hg.alias =:hostgroup_alias")
hosts = connection.execute(select_hosts)
self.result['all']['hosts'] = [host['display_name'] for host in hosts]
for hostgroup in connection.execute(select_hostgroups):
hostgroup_alias = hostgroup['alias']
self.result[hostgroup_alias] = {}
hosts = connection.execute(select_hostgroup_hosts, hostgroup_alias=hostgroup_alias)
self.result[hostgroup_alias]['hosts'] = [host['display_name'] for host in hosts]
def __init__(self):
self.defaultgroup = 'group_all'
self.ndo_database_uri = None
self.options = None
self.read_settings()
self.read_cli()
self.result = {}
self.result['all'] = {}
self.result['all']['hosts'] = []
self.result['_meta'] = {}
self.result['_meta']['hostvars'] = {}
if self.ndo_database_uri:
self.get_hosts()
if self.options.host:
print(json.dumps({}))
elif self.options.list:
print(json.dumps(self.result))
else:
sys.exit("usage: --list or --host HOSTNAME")
else:
sys.exit("Error: Database configuration is missing. See nagios_ndo.ini.")
NagiosNDOInventory()
|
gpl-3.0
|
balachandrana/pythonista_stash_utilities
|
catbinascii.py
|
1
|
2770
|
""" cat bin asccii
Examples:
"""
"""
todo:
__doc__
test examples
"""
import argparse
import fileinput
import os
import re
import sys
import binascii
def main(args):
ap = argparse.ArgumentParser(description=__doc__)
ap.add_argument('files', nargs='*', help='files to be processed')
ap.add_argument('-u', '--catunbinascii', action='store_true',
help='convert binascii to binary file')
ap.add_argument('-b', '--buffersize', action='store',
help='buffer size')
ns = ap.parse_args(args)
if not ns.buffersize:
ns.buffersize = 32
else:
ns.buffersize = int(ns.buffersize)
files = None
if not ns.catunbinascii:
try:
files = [f for f in ns.files if not os.path.isdir(f)]
for f in files:
fp = open(f, "rb")
buf = fp.read(ns.buffersize)
while buf:
print binascii.hexlify(buf)
buf = fp.read(ns.buffersize)
fp.close()
except IOError as err:
sys.stderr.write("catbinascii: {}: {!s}".format(
type(err).__name__, err))
else:
try:
if ns.files:
if len(ns.files) == 1:
fps = sys.stdin
if not os.path.isdir(ns.files[0]):
fpd = open(files[1], "wb")
else:
sys.stderr.write("%s destination file is a directory\n"
% ns.files[0])
sys.exit(0)
elif len(ns.files) == 2:
if not os.path.isdir(ns.files[0]):
fps = open(ns.files[0])
else:
sys.stderr.write(
"%s source file is a directory\n" % ns.files[0])
sys.exit(0)
if not os.path.isdir(ns.files[1]):
fpd = open(ns.files[1], "wb")
else:
sys.stderr.write("%s destination file is a directory\n"
% ns.files[1])
sys.exit(0)
else:
sys.stderr.write("too many files specified\n")
sys.exit(0)
line = fps.readline()
while line:
fpd.write(binascii.unhexlify(line.strip()))
line = fps.readline()
fps.close()
fpd.close()
except IOError as err:
sys.stderr.write("catbinascii: {}: {!s}".format(
type(err).__name__, err))
if __name__ == "__main__":
main(sys.argv[1:])
|
mit
|
devs1991/test_edx_docmode
|
lms/djangoapps/shoppingcart/utils.py
|
103
|
2782
|
"""
Utility methods for the Shopping Cart app
"""
from django.conf import settings
from microsite_configuration import microsite
from pdfminer.pdfparser import PDFParser
from pdfminer.pdfdocument import PDFDocument
from pdfminer.pdfinterp import PDFResourceManager, PDFPageInterpreter
from pdfminer.converter import PDFPageAggregator
from pdfminer.pdfpage import PDFPage
from pdfminer.layout import LAParams, LTTextBox, LTTextLine, LTFigure
def is_shopping_cart_enabled():
"""
    Utility method to check the various configuration settings to verify
    that shopping cart functionality has been enabled
"""
enable_paid_course_registration = microsite.get_value(
'ENABLE_PAID_COURSE_REGISTRATION',
settings.FEATURES.get('ENABLE_PAID_COURSE_REGISTRATION')
)
enable_shopping_cart = microsite.get_value(
'ENABLE_SHOPPING_CART',
settings.FEATURES.get('ENABLE_SHOPPING_CART')
)
return enable_paid_course_registration and enable_shopping_cart
def parse_pages(pdf_buffer, password):
"""
    With a PDF buffer object, get the pages, parse each one, and return the entire pdf text
"""
# Create a PDF parser object associated with the file object.
parser = PDFParser(pdf_buffer)
# Create a PDF document object that stores the document structure.
# Supply the password for initialization.
document = PDFDocument(parser, password)
resource_manager = PDFResourceManager()
la_params = LAParams()
device = PDFPageAggregator(resource_manager, laparams=la_params)
interpreter = PDFPageInterpreter(resource_manager, device)
text_content = [] # a list of strings, each representing text collected from each page of the doc
for page in PDFPage.create_pages(document):
interpreter.process_page(page)
# receive the LTPage object for this page
layout = device.get_result()
# layout is an LTPage object which may contain
# child objects like LTTextBox, LTFigure, LTImage, etc.
text_content.append(parse_lt_objects(layout._objs)) # pylint: disable=protected-access
return text_content
def parse_lt_objects(lt_objects):
"""
Iterate through the list of LT* objects and capture the text data contained in each object
"""
text_content = []
for lt_object in lt_objects:
if isinstance(lt_object, LTTextBox) or isinstance(lt_object, LTTextLine):
# text
text_content.append(lt_object.get_text().encode('utf-8'))
elif isinstance(lt_object, LTFigure):
# LTFigure objects are containers for other LT* objects, so recurse through the children
text_content.append(parse_lt_objects(lt_object._objs)) # pylint: disable=protected-access
return '\n'.join(text_content)
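# Illustrative usage sketch (added; not part of the original module):
# extracting the text of every page from a PDF on disk. 'receipt.pdf' is a
# placeholder path; pass the document's password, or '' if it has none.
if __name__ == '__main__':
    with open('receipt.pdf', 'rb') as pdf_file:
        for page_text in parse_pages(pdf_file, password=''):
            print(page_text)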
|
agpl-3.0
|
thomazs/geraldo
|
site/newsite/django_1_0/django/contrib/markup/templatetags/markup.py
|
38
|
3646
|
"""
Set of "markup" template filters for Django. These filters transform plain text
markup syntaxes to HTML; currently there is support for:
* Textile, which requires the PyTextile library available at
http://dealmeida.net/projects/textile/
* Markdown, which requires the Python-markdown library from
http://www.freewisdom.org/projects/python-markdown
* ReStructuredText, which requires docutils from http://docutils.sf.net/
In each case, if the required library is not installed, the filter will
silently fail and return the un-marked-up text.
"""
from django import template
from django.conf import settings
from django.utils.encoding import smart_str, force_unicode
from django.utils.safestring import mark_safe
register = template.Library()
def textile(value):
try:
import textile
except ImportError:
if settings.DEBUG:
raise template.TemplateSyntaxError, "Error in {% textile %} filter: The Python textile library isn't installed."
return force_unicode(value)
else:
return mark_safe(force_unicode(textile.textile(smart_str(value), encoding='utf-8', output='utf-8')))
textile.is_safe = True
def markdown(value, arg=''):
"""
Runs Markdown over a given value, optionally using various
extensions python-markdown supports.
Syntax::
{{ value|markdown:"extension1_name,extension2_name..." }}
To enable safe mode, which strips raw HTML and only returns HTML
generated by actual Markdown syntax, pass "safe" as the first
extension in the list.
If the version of Markdown in use does not support extensions,
they will be silently ignored.
"""
try:
import markdown
except ImportError:
if settings.DEBUG:
raise template.TemplateSyntaxError, "Error in {% markdown %} filter: The Python markdown library isn't installed."
return force_unicode(value)
else:
# markdown.version was first added in 1.6b. The only version of markdown
# to fully support extensions before 1.6b was the shortlived 1.6a.
if hasattr(markdown, 'version'):
extensions = [e for e in arg.split(",") if e]
if len(extensions) > 0 and extensions[0] == "safe":
extensions = extensions[1:]
safe_mode = True
else:
safe_mode = False
# Unicode support only in markdown v1.7 or above. Version_info
# exist only in markdown v1.6.2rc-2 or above.
if getattr(markdown, "version_info", None) < (1,7):
return mark_safe(force_unicode(markdown.markdown(smart_str(value), extensions, safe_mode=safe_mode)))
else:
return mark_safe(markdown.markdown(force_unicode(value), extensions, safe_mode=safe_mode))
else:
return mark_safe(force_unicode(markdown.markdown(smart_str(value))))
markdown.is_safe = True
def restructuredtext(value):
try:
from docutils.core import publish_parts
except ImportError:
if settings.DEBUG:
raise template.TemplateSyntaxError, "Error in {% restructuredtext %} filter: The Python docutils library isn't installed."
return force_unicode(value)
else:
docutils_settings = getattr(settings, "RESTRUCTUREDTEXT_FILTER_SETTINGS", {})
parts = publish_parts(source=smart_str(value), writer_name="html4css1", settings_overrides=docutils_settings)
return mark_safe(force_unicode(parts["fragment"]))
restructuredtext.is_safe = True
register.filter(textile)
register.filter(markdown)
register.filter(restructuredtext)
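# Illustrative template usage (added; not part of the original module):
#
#   {% load markup %}
#   {{ object.content|textile }}
#   {{ object.content|markdown:"safe" }}
#   {{ object.content|restructuredtext }}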
|
lgpl-3.0
|
geomagpy/MARTAS
|
libmqtt/lorawanserver.py
|
1
|
7323
|
from __future__ import print_function
from __future__ import absolute_import
# ###################################################################
# Import packages
# ###################################################################
from magpy.stream import DataStream, KEYLIST, NUMKEYLIST, subtractStreams
import struct
from datetime import datetime
import json
import base64
import binascii
def datetime2array(t):
return [t.year,t.month,t.day,t.hour,t.minute,t.second,t.microsecond]
## LORA-ZAMG - protocol
##
class lorawanserver(object):
"""
application/3/node/0018b2200000034a/rx {"applicationID":"3","applicationName":"Temperature-and-Humidity","deviceName":"TITEC-Multisensor","devEUI":"0018b2200000034a","rxInfo":[{"gatewayID":"00800000a0001285","name":"MTCDT_AEPGW2","rssi":-49,"loRaSNR":7.2,"location":{"latitude":48.248422399999995,"longitude":16.3520512,"altitude":0}}],"txInfo":{"frequency":868500000,"dr":5},"adr":true,"fCnt":457,"fPort":1,"data":"QgASEzQVIg/HVA=="}
content suggestion: appeui, deveui, sensorname, locationname, sensor model -> rest beelike
topic suggestions:
headline/station/sensor
ideally, headline is a unique format identifier
e.g.
loraz/schwarzenbergplatz/adeunis or
why: this way stations can be queried fairly systematically
mobile sensors without a fixed location:
loraz/mobile/adeunis
"""
def __init__(self):
"""
"""
print (" -> Initializing loraWAN server routines ...")
#self.payload = payload
#self.topic = topic
self.topicidentifier = {'startswith':'application','endswith':'rx'}
self.datakeytranslator = {'tl':['t1','degC'], 'rf':['var1','per'], 'corr':['var5','none']}
self.identifier = {}
self.headdict = {}
self.headstream = {}
def GetPayload(self, payload, topic):
loradict = json.loads(payload)
# convert loradict to headdict (header) and data_bin
newpayload, sensorid, headline, header = self.loradict2datastruct(loradict)
return newpayload, sensorid, headline, header, self.identifier
def b2v7(self,b1,b2,div):
val = ((b2 << 8) + b1)/ float(div)
return val
def b2v(self,b1,b2,b3,off):
v = ((b1 << 8) + b2 << 8) + b3
val = (v/100000. *6.25) - off
return val
def loradict2datastruct(self, loradict):
datakeytranslator = {'tl':['t1','degC'], 'rf':['var1','per'], 'corr':['var5','none'], 'bat':['var4','per']}
rxdict = loradict.get('rxInfo')[0]
locdict = rxdict.get('location')
header = {}
header['SensorName'] = loradict.get('deviceName','LORA')
header['SensorDescription'] = loradict.get('applicationName','not specified')
header['SensorSerialNum'] = loradict.get('devEUI','')
header['SensorGroup'] = loradict.get('deviceName','LORA')
sensorid = header['SensorName'][:5].replace('-','') + '_' + header['SensorSerialNum'] + '_0001'
header['SensorID'] = sensorid
header['StationID'] = rxdict.get('gatewayID','undefined')
header['StationName'] = rxdict.get('name','undefined')
header['StationLongitude'] = locdict.get('longitude','')
header['StationLatitude'] = locdict.get('latitude','')
if not locdict.get('longitude','') == '':
header['StationLocationReference'] = 'WGS84, EPSG: 4326'
if locdict.get('altitude','') in ['',0,'0']:
alt = ''
else:
alt = locdict.get('altitude')
header['StationElevation'] = alt
if not alt == '':
header['StationElevationRef'] = 'm NN'
datacode = loradict.get('data')
# convert to something like datadict = {"tl":21.75,"rf":36.9}
barray = bytearray(base64.b64decode(datacode))
print ("Device:", loradict.get('deviceName'))
print ("Length Bytearray:", len(barray))
if len(barray) == 10:
temp = self.b2v(barray[3],barray[4],barray[5],55)
rf = self.b2v(barray[7],barray[8],barray[9],25)
datadict = {"tl":temp, "rf":rf}
elif len(barray) == 7:
print ("Found Bytearray 7 with code", datacode)
temp = self.b2v7(barray[1],barray[2],100)
rf = self.b2v7(barray[3],barray[4],100)
bat = self.b2v7(barray[5],barray[6],1)
datadict = {"tl":temp, "rf":rf, "bat":bat}
else:
print ("Found Bytearray of length {} with code", len(barray), datacode)
print ("Payload looks like", loradict)
temp = 999.0
rf = -10.0
datadict = {"tl":temp, "rf":rf}
keylist, elemlist, unitlist, multilist = [],[],[],[]
if not loradict.get('DateTime','') == '':
time = datetime.strptime(loradict.get('DateTime'),"%Y-%m-%dT%H:%M:%S.%fZ")
elif not loradict.get('DatumSec','') == '':
time = datetime.strptime(loradict.get('DatumSec'),"%Y-%m-%dT%H:%M:%S.%fZ")
else:
time = datetime.utcnow()
datalst = datetime2array(time)
packstr = '6hL'
for elem in datadict:
if elem in datakeytranslator:
key = datakeytranslator[elem][0]
unit = datakeytranslator[elem][1]
keylist.append(key)
elemlist.append(elem)
unitlist.append(unit)
multilist.append(1000)
packstr += "l"
datalst.append(int(datadict[elem]*1000))
#print (elem, datadict[elem])
datalst = [str(elem) for elem in datalst]
        dataline = ','.join(datalst)
#print ("DATA", dataline)
self.identifier[sensorid+':packingcode'] = packstr
self.identifier[sensorid+':keylist'] = keylist
self.identifier[sensorid+':elemlist'] = elemlist
self.identifier[sensorid+':unitlist'] = unitlist
self.identifier[sensorid+':multilist'] = multilist
def identifier2line(dic, sensorid):
p1 = dic.get(sensorid+':packingcode')
p2 = dic.get(sensorid+':keylist')
p3 = dic.get(sensorid+':elemlist')
p4 = dic.get(sensorid+':unitlist')
p5 = dic.get(sensorid+':multilist')
p5 = [str(elem) for elem in p5]
size = struct.calcsize(p1)
line = "# MagPyBin {} [{}] [{}] [{}] [{}] {} {}".format(sensorid,','.join(p2),','.join(p3),','.join(p4),','.join(p5),p1,size)
return line
headline = identifier2line(self.identifier, sensorid)
#self.headstream[sensorid] = create_head_dict(self.headdict[sensorid],sensorid)
#self.headstream[sensorid] = merge_two_dicts(self.headstream[sensorid], header)
#print ("HEAD1", headdict[sensorid])
#print ("HEAD2", headstream[sensorid])
print ("success")
return dataline, sensorid, headline, header
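# Illustrative decode sketch (added; not part of the original module),
# exercising the 7-byte payload branch of loradict2datastruct above. The
# base64 string is a made-up example, not captured traffic.
if __name__ == '__main__':
    srv = lorawanserver()
    demo = bytearray(base64.b64decode('QgASEzQVIg=='))  # 7 bytes
    print ("temperature:", srv.b2v7(demo[1], demo[2], 100))
    print ("humidity:", srv.b2v7(demo[3], demo[4], 100))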
|
gpl-3.0
|
kobotoolbox/kpi
|
kpi/tests/api/v1/test_api_permissions.py
|
1
|
5983
|
# coding: utf-8
from django.contrib.auth.models import User, Permission
from django.urls import reverse
from django.utils import timezone
from rest_framework import status
from kpi.constants import ASSET_TYPE_COLLECTION
from kpi.models import Asset, ObjectPermission
from kpi.models.object_permission import get_anonymous_user
# importing module instead of the class, avoid running the tests twice
from kpi.tests.api.v2 import test_api_permissions
from kpi.tests.kpi_test_case import KpiTestCase
class ApiAnonymousPermissionsTestCase(test_api_permissions.ApiAnonymousPermissionsTestCase):
URL_NAMESPACE = None
class ApiPermissionsPublicAssetTestCase(test_api_permissions.ApiPermissionsPublicAssetTestCase):
URL_NAMESPACE = None
class ApiPermissionsTestCase(test_api_permissions.ApiPermissionsTestCase):
URL_NAMESPACE = None
class ApiAssignedPermissionsTestCase(KpiTestCase):
"""
An obnoxiously large amount of code to test that the endpoint for listing
assigned permissions complies with the following rules:
* Superusers see it all (thank goodness for pagination)
* Anonymous users see nothing
* Regular users see everything that concerns them, namely all
their own permissions and all the owners' permissions for all objects
to which they have been assigned any permission
See also `kpi.filters.KpiAssignedObjectPermissionsFilter`
"""
def setUp(self):
super().setUp()
self.anon = get_anonymous_user()
self.super = User.objects.get(username='admin')
self.super_password = 'pass'
self.someuser = User.objects.get(username='someuser')
self.someuser_password = 'someuser'
self.anotheruser = User.objects.get(username='anotheruser')
self.anotheruser_password = 'anotheruser'
def create_object_with_specific_pk(model, pk, **kwargs):
obj = model()
obj.pk = pk
for k, v in kwargs.items():
setattr(obj, k, v)
obj.save()
return obj
self.collection = Asset.objects.create(
asset_type=ASSET_TYPE_COLLECTION, owner=self.someuser
)
self.asset = Asset.objects.create(owner=self.someuser)
def test_anon_cannot_list_permissions(self):
self.asset.assign_perm(self.anon, 'view_asset')
self.assertTrue(self.anon.has_perm('view_asset', self.asset))
url = reverse('objectpermission-list')
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertListEqual(response.data['results'], [])
self.asset.remove_perm(self.anon, 'view_asset')
self.assertFalse(self.anon.has_perm('view_asset', self.asset))
def test_user_sees_relevant_permissions_on_assigned_objects(self):
# A user with explicitly-assigned permissions should see their
# own permissions and the owner's permissions, but not permissions
# assigned to other users
self.asset.assign_perm(self.anotheruser, 'view_asset')
self.assertTrue(self.anotheruser.has_perm('view_asset', self.asset))
irrelevant_user = User.objects.create(username='mindyourown')
self.asset.assign_perm(irrelevant_user, 'view_asset')
self.client.login(username=self.anotheruser.username,
password=self.anotheruser_password)
url = reverse('objectpermission-list')
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
returned_uids = [r['uid'] for r in response.data['results']]
all_obj_perms = self.asset.permissions.all()
relevant_obj_perms = all_obj_perms.filter(
user__in=(self.asset.owner, self.anotheruser),
permission__codename__in=self.asset.ASSIGNABLE_PERMISSIONS_BY_TYPE[
self.asset.asset_type
],
)
self.assertListEqual(
sorted(returned_uids),
sorted(relevant_obj_perms.values_list('uid', flat=True)),
)
self.asset.remove_perm(self.anotheruser, 'view_asset')
self.assertFalse(self.anotheruser.has_perm('view_asset', self.asset))
def test_user_cannot_see_permissions_on_unassigned_objects(self):
self.asset.assign_perm(self.anotheruser, 'view_asset')
self.assertTrue(self.anotheruser.has_perm('view_asset', self.asset))
self.client.login(username=self.anotheruser.username,
password=self.anotheruser_password)
url = reverse('objectpermission-list')
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
returned_uids = [r['uid'] for r in response.data['results']]
other_obj_perms = self.collection.permissions.all()
self.assertFalse(
set(returned_uids).intersection(
other_obj_perms.values_list('uid', flat=True)
)
)
self.asset.remove_perm(self.anotheruser, 'view_asset')
self.assertFalse(self.anotheruser.has_perm('view_asset', self.asset))
def test_superuser_sees_all_permissions(self):
self.asset.assign_perm(self.anotheruser, 'view_asset')
self.assertTrue(self.anotheruser.has_perm('view_asset', self.asset))
self.client.login(username=self.super.username,
password=self.super_password)
url = reverse('objectpermission-list')
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
returned_uids = [r['uid'] for r in response.data['results']]
self.assertListEqual(
sorted(returned_uids),
sorted(ObjectPermission.objects.values_list('uid', flat=True))
)
self.asset.remove_perm(self.anotheruser, 'view_asset')
self.assertFalse(self.anotheruser.has_perm('view_asset', self.asset))
|
agpl-3.0
|
deepmind/dm_control
|
dm_control/viewer/user_input_test.py
|
1
|
6472
|
# Copyright 2018 The dm_control Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================
"""Tests for the user_input module."""
from absl.testing import absltest
from dm_control.viewer import user_input
import mock
class InputMapTests(absltest.TestCase):
def setUp(self):
super().setUp()
self.mouse = mock.MagicMock()
self.keyboard = mock.MagicMock()
self.input_map = user_input.InputMap(self.mouse, self.keyboard)
self.callback = mock.MagicMock()
def test_clearing_bindings(self):
self.input_map._active_exclusive = 1
self.input_map._action_callbacks = {1: 2}
self.input_map._double_click_callbacks = {3: 4}
self.input_map._plane_callback = [5]
self.input_map._z_axis_callback = [6]
self.input_map.clear_bindings()
self.assertEmpty(self.input_map._action_callbacks)
self.assertEmpty(self.input_map._double_click_callbacks)
self.assertEmpty(self.input_map._plane_callback)
self.assertEmpty(self.input_map._z_axis_callback)
self.assertEqual(
user_input._NO_EXCLUSIVE_KEY, self.input_map._active_exclusive)
def test_binding(self):
self.input_map.bind(self.callback, user_input.KEY_UP)
expected_dict = {
(user_input.KEY_UP, user_input.MOD_NONE): (False, self.callback)}
self.assertDictEqual(expected_dict, self.input_map._action_callbacks)
def test_binding_exclusive(self):
self.input_map.bind(self.callback, user_input.Exclusive(user_input.KEY_UP))
expected_dict = {
(user_input.KEY_UP, user_input.MOD_NONE): (True, self.callback)}
self.assertDictEqual(expected_dict, self.input_map._action_callbacks)
def test_binding_and_invoking_ranges_of_actions(self):
self.input_map.bind(self.callback, user_input.Range(
[user_input.KEY_UP, (user_input.KEY_UP, user_input.MOD_ALT)]))
self.input_map._handle_key(
user_input.KEY_UP, user_input.PRESS, user_input.MOD_NONE)
self.callback.assert_called_once_with(0)
self.callback.reset_mock()
self.input_map._handle_key(
user_input.KEY_UP, user_input.PRESS, user_input.MOD_ALT)
self.callback.assert_called_once_with(1)
def test_binding_planar_action(self):
self.input_map.bind_plane(self.callback)
self.assertLen(self.input_map._plane_callback, 1)
self.assertEqual(self.callback, self.input_map._plane_callback[0])
def test_binding_z_axis_action(self):
self.input_map.bind_z_axis(self.callback)
self.assertLen(self.input_map._z_axis_callback, 1)
self.assertEqual(self.callback, self.input_map._z_axis_callback[0])
def test_invoking_regular_action_in_response_to_click(self):
self.input_map._action_callbacks = {(1, 2): (False, self.callback)}
self.input_map._handle_key(1, user_input.PRESS, 2)
self.callback.assert_called_once()
self.callback.reset_mock()
self.input_map._handle_key(1, user_input.RELEASE, 2)
self.assertEqual(0, self.callback.call_count)
def test_invoking_exclusive_action_in_response_to_click(self):
self.input_map._action_callbacks = {(1, 2): (True, self.callback)}
self.input_map._handle_key(1, user_input.PRESS, 2)
self.callback.assert_called_once_with(True)
self.callback.reset_mock()
self.input_map._handle_key(1, user_input.RELEASE, 2)
self.callback.assert_called_once_with(False)
def test_exclusive_action_blocks_other_actions_until_its_finished(self):
self.input_map._action_callbacks = {
(1, 2): (True, self.callback), (3, 4): (False, self.callback)}
self.input_map._handle_key(1, user_input.PRESS, 2)
self.callback.assert_called_once_with(True)
self.callback.reset_mock()
# Attempting to start other actions (PRESS) or end them (RELEASE)
# amounts to nothing.
self.input_map._handle_key(3, user_input.PRESS, 4)
self.assertEqual(0, self.callback.call_count)
self.input_map._handle_key(3, user_input.RELEASE, 4)
self.assertEqual(0, self.callback.call_count)
# Even attempting to start the same action for the 2nd time fails.
self.input_map._handle_key(1, user_input.PRESS, 2)
self.assertEqual(0, self.callback.call_count)
# Only finishing the action frees up the resources.
self.input_map._handle_key(1, user_input.RELEASE, 2)
self.callback.assert_called_once_with(False)
self.callback.reset_mock()
# Now we can start a new action.
self.input_map._handle_key(3, user_input.PRESS, 4)
self.callback.assert_called_once()
def test_modifiers_required_only_for_exclusive_action_start(self):
activation_modifiers = 2
no_modifiers = 0
self.input_map._action_callbacks = {
(1, activation_modifiers): (True, self.callback)}
self.input_map._handle_key(1, user_input.PRESS, activation_modifiers)
self.callback.assert_called_once_with(True)
self.callback.reset_mock()
self.input_map._handle_key(1, user_input.RELEASE, no_modifiers)
self.callback.assert_called_once_with(False)
def test_invoking_regular_action_in_response_to_double_click(self):
self.input_map._double_click_callbacks = {(1, 2): self.callback}
self.input_map._handle_double_click(1, 2)
self.callback.assert_called_once()
def test_exclusive_actions_dont_respond_to_double_clicks(self):
self.input_map._action_callbacks = {(1, 2): (True, self.callback)}
self.input_map._handle_double_click(1, 2)
self.assertEqual(0, self.callback.call_count)
def test_mouse_move(self):
position = [1, 2]
translation = [3, 4]
self.input_map._plane_callback = [self.callback]
self.input_map._handle_mouse_move(position, translation)
self.callback.assert_called_once_with(position, translation)
def test_mouse_scroll(self):
value = 5
self.input_map._z_axis_callback = [self.callback]
self.input_map._handle_mouse_scroll(value)
self.callback.assert_called_once_with(value)
if __name__ == '__main__':
absltest.main()
|
apache-2.0
|
Nepherhotep/django
|
tests/postgres_tests/test_ranges.py
|
98
|
24582
|
import datetime
import json
import unittest
from django import forms
from django.core import exceptions, serializers
from django.db import connection
from django.db.models import F
from django.test import TestCase, override_settings
from django.utils import timezone
from . import PostgreSQLTestCase
from .models import RangeLookupsModel, RangesModel
try:
from psycopg2.extras import DateRange, DateTimeTZRange, NumericRange
from django.contrib.postgres import fields as pg_fields, forms as pg_forms
from django.contrib.postgres.validators import (
RangeMaxValueValidator, RangeMinValueValidator,
)
except ImportError:
pass
def skipUnlessPG92(test):
try:
PG_VERSION = connection.pg_version
except AttributeError:
PG_VERSION = 0
if PG_VERSION < 90200:
return unittest.skip('PostgreSQL >= 9.2 required')(test)
return test
@skipUnlessPG92
class TestSaveLoad(TestCase):
def test_all_fields(self):
now = timezone.now()
instance = RangesModel(
ints=NumericRange(0, 10),
bigints=NumericRange(10, 20),
floats=NumericRange(20, 30),
timestamps=DateTimeTZRange(now - datetime.timedelta(hours=1), now),
dates=DateRange(now.date() - datetime.timedelta(days=1), now.date()),
)
instance.save()
loaded = RangesModel.objects.get()
self.assertEqual(instance.ints, loaded.ints)
self.assertEqual(instance.bigints, loaded.bigints)
self.assertEqual(instance.floats, loaded.floats)
self.assertEqual(instance.timestamps, loaded.timestamps)
self.assertEqual(instance.dates, loaded.dates)
def test_range_object(self):
r = NumericRange(0, 10)
instance = RangesModel(ints=r)
instance.save()
loaded = RangesModel.objects.get()
self.assertEqual(r, loaded.ints)
def test_tuple(self):
instance = RangesModel(ints=(0, 10))
instance.save()
loaded = RangesModel.objects.get()
self.assertEqual(NumericRange(0, 10), loaded.ints)
def test_range_object_boundaries(self):
r = NumericRange(0, 10, '[]')
instance = RangesModel(floats=r)
instance.save()
loaded = RangesModel.objects.get()
self.assertEqual(r, loaded.floats)
self.assertTrue(10 in loaded.floats)
def test_unbounded(self):
r = NumericRange(None, None, '()')
instance = RangesModel(floats=r)
instance.save()
loaded = RangesModel.objects.get()
self.assertEqual(r, loaded.floats)
def test_empty(self):
r = NumericRange(empty=True)
instance = RangesModel(ints=r)
instance.save()
loaded = RangesModel.objects.get()
self.assertEqual(r, loaded.ints)
def test_null(self):
instance = RangesModel(ints=None)
instance.save()
loaded = RangesModel.objects.get()
self.assertIsNone(loaded.ints)
@skipUnlessPG92
class TestQuerying(TestCase):
@classmethod
def setUpTestData(cls):
cls.objs = [
RangesModel.objects.create(ints=NumericRange(0, 10)),
RangesModel.objects.create(ints=NumericRange(5, 15)),
RangesModel.objects.create(ints=NumericRange(None, 0)),
RangesModel.objects.create(ints=NumericRange(empty=True)),
RangesModel.objects.create(ints=None),
]
def test_exact(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__exact=NumericRange(0, 10)),
[self.objs[0]],
)
def test_isnull(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__isnull=True),
[self.objs[4]],
)
def test_isempty(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__isempty=True),
[self.objs[3]],
)
def test_contains(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__contains=8),
[self.objs[0], self.objs[1]],
)
def test_contains_range(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__contains=NumericRange(3, 8)),
[self.objs[0]],
)
def test_contained_by(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__contained_by=NumericRange(0, 20)),
[self.objs[0], self.objs[1], self.objs[3]],
)
def test_overlap(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__overlap=NumericRange(3, 8)),
[self.objs[0], self.objs[1]],
)
def test_fully_lt(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__fully_lt=NumericRange(5, 10)),
[self.objs[2]],
)
def test_fully_gt(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__fully_gt=NumericRange(5, 10)),
[],
)
def test_not_lt(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__not_lt=NumericRange(5, 10)),
[self.objs[1]],
)
def test_not_gt(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__not_gt=NumericRange(5, 10)),
[self.objs[0], self.objs[2]],
)
def test_adjacent_to(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__adjacent_to=NumericRange(0, 5)),
[self.objs[1], self.objs[2]],
)
def test_startswith(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__startswith=0),
[self.objs[0]],
)
def test_endswith(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__endswith=0),
[self.objs[2]],
)
def test_startswith_chaining(self):
self.assertSequenceEqual(
RangesModel.objects.filter(ints__startswith__gte=0),
[self.objs[0], self.objs[1]],
)
@skipUnlessPG92
class TestQueryingWithRanges(TestCase):
def test_date_range(self):
objs = [
RangeLookupsModel.objects.create(date='2015-01-01'),
RangeLookupsModel.objects.create(date='2015-05-05'),
]
self.assertSequenceEqual(
RangeLookupsModel.objects.filter(date__contained_by=DateRange('2015-01-01', '2015-05-04')),
[objs[0]],
)
def test_date_range_datetime_field(self):
objs = [
RangeLookupsModel.objects.create(timestamp='2015-01-01'),
RangeLookupsModel.objects.create(timestamp='2015-05-05'),
]
self.assertSequenceEqual(
RangeLookupsModel.objects.filter(timestamp__date__contained_by=DateRange('2015-01-01', '2015-05-04')),
[objs[0]],
)
def test_datetime_range(self):
objs = [
RangeLookupsModel.objects.create(timestamp='2015-01-01T09:00:00'),
RangeLookupsModel.objects.create(timestamp='2015-05-05T17:00:00'),
]
self.assertSequenceEqual(
RangeLookupsModel.objects.filter(
timestamp__contained_by=DateTimeTZRange('2015-01-01T09:00', '2015-05-04T23:55')
),
[objs[0]],
)
def test_integer_range(self):
objs = [
RangeLookupsModel.objects.create(integer=5),
RangeLookupsModel.objects.create(integer=99),
RangeLookupsModel.objects.create(integer=-1),
]
self.assertSequenceEqual(
RangeLookupsModel.objects.filter(integer__contained_by=NumericRange(1, 98)),
[objs[0]]
)
def test_biginteger_range(self):
objs = [
RangeLookupsModel.objects.create(big_integer=5),
RangeLookupsModel.objects.create(big_integer=99),
RangeLookupsModel.objects.create(big_integer=-1),
]
self.assertSequenceEqual(
RangeLookupsModel.objects.filter(big_integer__contained_by=NumericRange(1, 98)),
[objs[0]]
)
def test_float_range(self):
objs = [
RangeLookupsModel.objects.create(float=5),
RangeLookupsModel.objects.create(float=99),
RangeLookupsModel.objects.create(float=-1),
]
self.assertSequenceEqual(
RangeLookupsModel.objects.filter(float__contained_by=NumericRange(1, 98)),
[objs[0]]
)
def test_f_ranges(self):
parent = RangesModel.objects.create(floats=NumericRange(0, 10))
objs = [
RangeLookupsModel.objects.create(float=5, parent=parent),
RangeLookupsModel.objects.create(float=99, parent=parent),
]
self.assertSequenceEqual(
RangeLookupsModel.objects.filter(float__contained_by=F('parent__floats')),
[objs[0]]
)
def test_exclude(self):
objs = [
RangeLookupsModel.objects.create(float=5),
RangeLookupsModel.objects.create(float=99),
RangeLookupsModel.objects.create(float=-1),
]
self.assertSequenceEqual(
RangeLookupsModel.objects.exclude(float__contained_by=NumericRange(0, 100)),
[objs[2]]
)
@skipUnlessPG92
class TestSerialization(TestCase):
test_data = (
'[{"fields": {"ints": "{\\"upper\\": \\"10\\", \\"lower\\": \\"0\\", '
'\\"bounds\\": \\"[)\\"}", "floats": "{\\"empty\\": true}", '
'"bigints": null, "timestamps": "{\\"upper\\": \\"2014-02-02T12:12:12+00:00\\", '
'\\"lower\\": \\"2014-01-01T00:00:00+00:00\\", \\"bounds\\": \\"[)\\"}", '
'"dates": "{\\"upper\\": \\"2014-02-02\\", \\"lower\\": \\"2014-01-01\\", \\"bounds\\": \\"[)\\"}" }, '
'"model": "postgres_tests.rangesmodel", "pk": null}]'
)
lower_date = datetime.date(2014, 1, 1)
upper_date = datetime.date(2014, 2, 2)
lower_dt = datetime.datetime(2014, 1, 1, 0, 0, 0, tzinfo=timezone.utc)
upper_dt = datetime.datetime(2014, 2, 2, 12, 12, 12, tzinfo=timezone.utc)
def test_dumping(self):
instance = RangesModel(ints=NumericRange(0, 10), floats=NumericRange(empty=True),
timestamps=DateTimeTZRange(self.lower_dt, self.upper_dt),
dates=DateRange(self.lower_date, self.upper_date))
data = serializers.serialize('json', [instance])
dumped = json.loads(data)
for field in ('ints', 'dates', 'timestamps'):
dumped[0]['fields'][field] = json.loads(dumped[0]['fields'][field])
check = json.loads(self.test_data)
for field in ('ints', 'dates', 'timestamps'):
check[0]['fields'][field] = json.loads(check[0]['fields'][field])
self.assertEqual(dumped, check)
def test_loading(self):
instance = list(serializers.deserialize('json', self.test_data))[0].object
self.assertEqual(instance.ints, NumericRange(0, 10))
self.assertEqual(instance.floats, NumericRange(empty=True))
self.assertEqual(instance.bigints, None)
class TestValidators(PostgreSQLTestCase):
def test_max(self):
validator = RangeMaxValueValidator(5)
validator(NumericRange(0, 5))
with self.assertRaises(exceptions.ValidationError) as cm:
validator(NumericRange(0, 10))
self.assertEqual(cm.exception.messages[0], 'Ensure that this range is completely less than or equal to 5.')
self.assertEqual(cm.exception.code, 'max_value')
def test_min(self):
validator = RangeMinValueValidator(5)
validator(NumericRange(10, 15))
with self.assertRaises(exceptions.ValidationError) as cm:
validator(NumericRange(0, 10))
self.assertEqual(cm.exception.messages[0], 'Ensure that this range is completely greater than or equal to 5.')
self.assertEqual(cm.exception.code, 'min_value')
class TestFormField(PostgreSQLTestCase):
def test_valid_integer(self):
field = pg_forms.IntegerRangeField()
value = field.clean(['1', '2'])
self.assertEqual(value, NumericRange(1, 2))
def test_valid_floats(self):
field = pg_forms.FloatRangeField()
value = field.clean(['1.12345', '2.001'])
self.assertEqual(value, NumericRange(1.12345, 2.001))
def test_valid_timestamps(self):
field = pg_forms.DateTimeRangeField()
value = field.clean(['01/01/2014 00:00:00', '02/02/2014 12:12:12'])
lower = datetime.datetime(2014, 1, 1, 0, 0, 0)
upper = datetime.datetime(2014, 2, 2, 12, 12, 12)
self.assertEqual(value, DateTimeTZRange(lower, upper))
def test_valid_dates(self):
field = pg_forms.DateRangeField()
value = field.clean(['01/01/2014', '02/02/2014'])
lower = datetime.date(2014, 1, 1)
upper = datetime.date(2014, 2, 2)
self.assertEqual(value, DateRange(lower, upper))
def test_using_split_datetime_widget(self):
class SplitDateTimeRangeField(pg_forms.DateTimeRangeField):
base_field = forms.SplitDateTimeField
class SplitForm(forms.Form):
field = SplitDateTimeRangeField()
form = SplitForm()
self.assertHTMLEqual(str(form), '''
<tr>
<th>
<label for="id_field_0">Field:</label>
</th>
<td>
<input id="id_field_0_0" name="field_0_0" type="text" />
<input id="id_field_0_1" name="field_0_1" type="text" />
<input id="id_field_1_0" name="field_1_0" type="text" />
<input id="id_field_1_1" name="field_1_1" type="text" />
</td>
</tr>
''')
form = SplitForm({
'field_0_0': '01/01/2014',
'field_0_1': '00:00:00',
'field_1_0': '02/02/2014',
'field_1_1': '12:12:12',
})
self.assertTrue(form.is_valid())
lower = datetime.datetime(2014, 1, 1, 0, 0, 0)
upper = datetime.datetime(2014, 2, 2, 12, 12, 12)
self.assertEqual(form.cleaned_data['field'], DateTimeTZRange(lower, upper))
def test_none(self):
field = pg_forms.IntegerRangeField(required=False)
value = field.clean(['', ''])
self.assertEqual(value, None)
def test_rendering(self):
class RangeForm(forms.Form):
ints = pg_forms.IntegerRangeField()
self.assertHTMLEqual(str(RangeForm()), '''
<tr>
<th><label for="id_ints_0">Ints:</label></th>
<td>
<input id="id_ints_0" name="ints_0" type="number" />
<input id="id_ints_1" name="ints_1" type="number" />
</td>
</tr>
''')
def test_integer_lower_bound_higher(self):
field = pg_forms.IntegerRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['10', '2'])
self.assertEqual(cm.exception.messages[0], 'The start of the range must not exceed the end of the range.')
self.assertEqual(cm.exception.code, 'bound_ordering')
def test_integer_open(self):
field = pg_forms.IntegerRangeField()
value = field.clean(['', '0'])
self.assertEqual(value, NumericRange(None, 0))
def test_integer_incorrect_data_type(self):
field = pg_forms.IntegerRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean('1')
self.assertEqual(cm.exception.messages[0], 'Enter two whole numbers.')
self.assertEqual(cm.exception.code, 'invalid')
def test_integer_invalid_lower(self):
field = pg_forms.IntegerRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['a', '2'])
self.assertEqual(cm.exception.messages[0], 'Enter a whole number.')
def test_integer_invalid_upper(self):
field = pg_forms.IntegerRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['1', 'b'])
self.assertEqual(cm.exception.messages[0], 'Enter a whole number.')
def test_integer_required(self):
field = pg_forms.IntegerRangeField(required=True)
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['', ''])
self.assertEqual(cm.exception.messages[0], 'This field is required.')
value = field.clean([1, ''])
self.assertEqual(value, NumericRange(1, None))
def test_float_lower_bound_higher(self):
field = pg_forms.FloatRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['1.8', '1.6'])
self.assertEqual(cm.exception.messages[0], 'The start of the range must not exceed the end of the range.')
self.assertEqual(cm.exception.code, 'bound_ordering')
def test_float_open(self):
field = pg_forms.FloatRangeField()
value = field.clean(['', '3.1415926'])
self.assertEqual(value, NumericRange(None, 3.1415926))
def test_float_incorrect_data_type(self):
field = pg_forms.FloatRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean('1.6')
self.assertEqual(cm.exception.messages[0], 'Enter two numbers.')
self.assertEqual(cm.exception.code, 'invalid')
def test_float_invalid_lower(self):
field = pg_forms.FloatRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['a', '3.1415926'])
self.assertEqual(cm.exception.messages[0], 'Enter a number.')
def test_float_invalid_upper(self):
field = pg_forms.FloatRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['1.61803399', 'b'])
self.assertEqual(cm.exception.messages[0], 'Enter a number.')
def test_float_required(self):
field = pg_forms.FloatRangeField(required=True)
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['', ''])
self.assertEqual(cm.exception.messages[0], 'This field is required.')
value = field.clean(['1.61803399', ''])
self.assertEqual(value, NumericRange(1.61803399, None))
def test_date_lower_bound_higher(self):
field = pg_forms.DateRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['2013-04-09', '1976-04-16'])
self.assertEqual(cm.exception.messages[0], 'The start of the range must not exceed the end of the range.')
self.assertEqual(cm.exception.code, 'bound_ordering')
def test_date_open(self):
field = pg_forms.DateRangeField()
value = field.clean(['', '2013-04-09'])
self.assertEqual(value, DateRange(None, datetime.date(2013, 4, 9)))
def test_date_incorrect_data_type(self):
field = pg_forms.DateRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean('1')
self.assertEqual(cm.exception.messages[0], 'Enter two valid dates.')
self.assertEqual(cm.exception.code, 'invalid')
def test_date_invalid_lower(self):
field = pg_forms.DateRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['a', '2013-04-09'])
self.assertEqual(cm.exception.messages[0], 'Enter a valid date.')
def test_date_invalid_upper(self):
field = pg_forms.DateRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['2013-04-09', 'b'])
self.assertEqual(cm.exception.messages[0], 'Enter a valid date.')
def test_date_required(self):
field = pg_forms.DateRangeField(required=True)
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['', ''])
self.assertEqual(cm.exception.messages[0], 'This field is required.')
value = field.clean(['1976-04-16', ''])
self.assertEqual(value, DateRange(datetime.date(1976, 4, 16), None))
def test_datetime_lower_bound_higher(self):
field = pg_forms.DateTimeRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['2006-10-25 14:59', '2006-10-25 14:58'])
self.assertEqual(cm.exception.messages[0], 'The start of the range must not exceed the end of the range.')
self.assertEqual(cm.exception.code, 'bound_ordering')
def test_datetime_open(self):
field = pg_forms.DateTimeRangeField()
value = field.clean(['', '2013-04-09 11:45'])
self.assertEqual(value, DateTimeTZRange(None, datetime.datetime(2013, 4, 9, 11, 45)))
def test_datetime_incorrect_data_type(self):
field = pg_forms.DateTimeRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean('2013-04-09 11:45')
self.assertEqual(cm.exception.messages[0], 'Enter two valid date/times.')
self.assertEqual(cm.exception.code, 'invalid')
def test_datetime_invalid_lower(self):
field = pg_forms.DateTimeRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['45', '2013-04-09 11:45'])
self.assertEqual(cm.exception.messages[0], 'Enter a valid date/time.')
def test_datetime_invalid_upper(self):
field = pg_forms.DateTimeRangeField()
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['2013-04-09 11:45', 'sweet pickles'])
self.assertEqual(cm.exception.messages[0], 'Enter a valid date/time.')
def test_datetime_required(self):
field = pg_forms.DateTimeRangeField(required=True)
with self.assertRaises(exceptions.ValidationError) as cm:
field.clean(['', ''])
self.assertEqual(cm.exception.messages[0], 'This field is required.')
value = field.clean(['2013-04-09 11:45', ''])
self.assertEqual(value, DateTimeTZRange(datetime.datetime(2013, 4, 9, 11, 45), None))
@override_settings(USE_TZ=True, TIME_ZONE='Africa/Johannesburg')
def test_datetime_prepare_value(self):
field = pg_forms.DateTimeRangeField()
value = field.prepare_value(
DateTimeTZRange(datetime.datetime(2015, 5, 22, 16, 6, 33, tzinfo=timezone.utc), None)
)
self.assertEqual(value, [datetime.datetime(2015, 5, 22, 18, 6, 33), None])
def test_model_field_formfield_integer(self):
model_field = pg_fields.IntegerRangeField()
form_field = model_field.formfield()
self.assertIsInstance(form_field, pg_forms.IntegerRangeField)
def test_model_field_formfield_biginteger(self):
model_field = pg_fields.BigIntegerRangeField()
form_field = model_field.formfield()
self.assertIsInstance(form_field, pg_forms.IntegerRangeField)
def test_model_field_formfield_float(self):
model_field = pg_fields.FloatRangeField()
form_field = model_field.formfield()
self.assertIsInstance(form_field, pg_forms.FloatRangeField)
def test_model_field_formfield_date(self):
model_field = pg_fields.DateRangeField()
form_field = model_field.formfield()
self.assertIsInstance(form_field, pg_forms.DateRangeField)
def test_model_field_formfield_datetime(self):
model_field = pg_fields.DateTimeRangeField()
form_field = model_field.formfield()
self.assertIsInstance(form_field, pg_forms.DateTimeRangeField)
class TestWidget(PostgreSQLTestCase):
def test_range_widget(self):
f = pg_forms.ranges.DateTimeRangeField()
self.assertHTMLEqual(
f.widget.render('datetimerange', ''),
'<input type="text" name="datetimerange_0" /><input type="text" name="datetimerange_1" />'
)
self.assertHTMLEqual(
f.widget.render('datetimerange', None),
'<input type="text" name="datetimerange_0" /><input type="text" name="datetimerange_1" />'
)
dt_range = DateTimeTZRange(
datetime.datetime(2006, 1, 10, 7, 30),
datetime.datetime(2006, 2, 12, 9, 50)
)
self.assertHTMLEqual(
f.widget.render('datetimerange', dt_range),
'<input type="text" name="datetimerange_0" value="2006-01-10 07:30:00" />'
'<input type="text" name="datetimerange_1" value="2006-02-12 09:50:00" />'
)
|
bsd-3-clause
|
Delgan/w2ui
|
server/python/bottle/bottle.py
|
28
|
143552
|
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Bottle is a fast and simple micro-framework for small web applications. It
offers request dispatching (Routes) with URL parameter support, templates,
a built-in HTTP Server and adapters for many third party WSGI/HTTP-server and
template engines - all in a single file and with no dependencies other than the
Python Standard Library.
Homepage and documentation: http://bottlepy.org/
Copyright (c) 2014, Marcel Hellkamp.
License: MIT (see LICENSE for details)
"""
from __future__ import with_statement
__author__ = 'Marcel Hellkamp'
__version__ = '0.13-dev'
__license__ = 'MIT'
# The gevent and eventlet server adapters need to patch some modules before
# they are imported. This is why we parse the commandline parameters here but
# handle them later
if __name__ == '__main__':
from optparse import OptionParser
_cmd_parser = OptionParser(usage="usage: %prog [options] package.module:app")
_opt = _cmd_parser.add_option
_opt("--version", action="store_true", help="show version number.")
_opt("-b", "--bind", metavar="ADDRESS", help="bind socket to ADDRESS.")
_opt("-s", "--server", default='wsgiref', help="use SERVER as backend.")
_opt("-p", "--plugin", action="append", help="install additional plugin/s.")
_opt("--debug", action="store_true", help="start server in debug mode.")
_opt("--reload", action="store_true", help="auto-reload on file changes.")
_cmd_options, _cmd_args = _cmd_parser.parse_args()
if _cmd_options.server:
if _cmd_options.server.startswith('gevent'):
import gevent.monkey; gevent.monkey.patch_all()
elif _cmd_options.server.startswith('eventlet'):
import eventlet; eventlet.monkey_patch()
import base64, cgi, email.utils, functools, hmac, imp, itertools, mimetypes,\
os, re, subprocess, sys, tempfile, threading, time, warnings
from datetime import date as datedate, datetime, timedelta
from tempfile import TemporaryFile
from traceback import format_exc, print_exc
from inspect import getargspec
from unicodedata import normalize
try: from simplejson import dumps as json_dumps, loads as json_lds
except ImportError: # pragma: no cover
try: from json import dumps as json_dumps, loads as json_lds
except ImportError:
try: from django.utils.simplejson import dumps as json_dumps, loads as json_lds
except ImportError:
def json_dumps(data):
raise ImportError("JSON support requires Python 2.6 or simplejson.")
json_lds = json_dumps
# We now try to fix 2.5/2.6/3.1/3.2 incompatibilities.
# It ain't pretty but it works... Sorry for the mess.
py = sys.version_info
py3k = py >= (3, 0, 0)
py25 = py < (2, 6, 0)
py31 = (3, 1, 0) <= py < (3, 2, 0)
# Workaround for exception-binding syntax: "except E, e" is gone in py3k and "except E as e" is missing in py2.5.
def _e(): return sys.exc_info()[1]
# Workaround for the "print is a keyword/function" Python 2/3 dilemma
# and a fallback for mod_wsgi (which restricts stdout/err attribute access)
try:
_stdout, _stderr = sys.stdout.write, sys.stderr.write
except IOError:
_stdout = lambda x: sys.stdout.write(x)
_stderr = lambda x: sys.stderr.write(x)
# Lots of stdlib and builtin differences.
if py3k:
import http.client as httplib
import _thread as thread
from urllib.parse import urljoin, SplitResult as UrlSplitResult
from urllib.parse import urlencode, quote as urlquote, unquote as urlunquote
urlunquote = functools.partial(urlunquote, encoding='latin1')
from http.cookies import SimpleCookie
from collections import MutableMapping as DictMixin
import pickle
from io import BytesIO
from configparser import ConfigParser
basestring = str
unicode = str
json_loads = lambda s: json_lds(touni(s))
callable = lambda x: hasattr(x, '__call__')
imap = map
def _raise(*a): raise a[0](a[1]).with_traceback(a[2])
else: # 2.x
import httplib
import thread
from urlparse import urljoin, SplitResult as UrlSplitResult
from urllib import urlencode, quote as urlquote, unquote as urlunquote
from Cookie import SimpleCookie
from itertools import imap
import cPickle as pickle
from StringIO import StringIO as BytesIO
from ConfigParser import SafeConfigParser as ConfigParser
if py25:
msg = "Python 2.5 support may be dropped in future versions of Bottle."
warnings.warn(msg, DeprecationWarning)
from UserDict import DictMixin
def next(it): return it.next()
bytes = str
else: # 2.6, 2.7
from collections import MutableMapping as DictMixin
unicode = unicode
json_loads = json_lds
eval(compile('def _raise(*a): raise a[0], a[1], a[2]', '<py3fix>', 'exec'))
# Some helpers for string/byte handling
def tob(s, enc='utf8'):
return s.encode(enc) if isinstance(s, unicode) else bytes(s)
def touni(s, enc='utf8', err='strict'):
if isinstance(s, bytes):
return s.decode(enc, err)
else:
return unicode(s or ("" if s is None else s))
tonat = touni if py3k else tob
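# A minimal usage sketch of the helpers above (example values assumed):
# tob() always yields bytes, touni() always yields this module's `unicode`
# type (which is str on Python 3), and None is normalized to ''.
def _string_helpers_demo():
    s = touni(tob('hello'))
    assert isinstance(tob(s), bytes) and isinstance(s, unicode)
    assert touni(None) == ''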
# 3.2 fixes cgi.FieldStorage to accept bytes (which makes a lot of sense).
# 3.1 needs a workaround.
if py31:
from io import TextIOWrapper
class NCTextIOWrapper(TextIOWrapper):
def close(self): pass # Keep wrapped buffer open.
# A bug in functools causes it to break if the wrapper is an instance method
def update_wrapper(wrapper, wrapped, *a, **ka):
try:
functools.update_wrapper(wrapper, wrapped, *a, **ka)
except AttributeError:
pass
# These helpers are used at module level and need to be defined first.
# And yes, I know PEP-8, but sometimes a lower-case classname makes more sense.
def depr(message, strict=False):
warnings.warn(message, DeprecationWarning, stacklevel=3)
def makelist(data): # This is just too handy
if isinstance(data, (tuple, list, set, dict)):
return list(data)
elif data:
return [data]
else:
return []
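# A minimal sketch of makelist() semantics (values assumed for illustration):
def _makelist_demo():
    assert makelist(('GET', 'POST')) == ['GET', 'POST']  # containers are copied
    assert makelist('GET') == ['GET']                    # scalars are wrapped
    assert makelist(None) == []                          # falsy values vanish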
class DictProperty(object):
""" Property that maps to a key in a local dict-like attribute. """
def __init__(self, attr, key=None, read_only=False):
self.attr, self.key, self.read_only = attr, key, read_only
def __call__(self, func):
functools.update_wrapper(self, func, updated=[])
self.getter, self.key = func, self.key or func.__name__
return self
def __get__(self, obj, cls):
if obj is None: return self
key, storage = self.key, getattr(obj, self.attr)
if key not in storage: storage[key] = self.getter(obj)
return storage[key]
def __set__(self, obj, value):
if self.read_only: raise AttributeError("Read-Only property.")
getattr(obj, self.attr)[self.key] = value
def __delete__(self, obj):
if self.read_only: raise AttributeError("Read-Only property.")
del getattr(obj, self.attr)[self.key]
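# A minimal sketch of DictProperty: the decorated getter computes a value on
# first access and stores it under `key` inside the named dict attribute.
# The Box class below is illustrative, not part of bottle's API.
def _dict_property_demo():
    class Box(object):
        def __init__(self):
            self.storage = {}
        @DictProperty('storage', 'answer')
        def answer(self):
            return 42
    box = Box()
    assert box.answer == 42 and box.storage['answer'] == 42
    box.answer = 7                 # writable because read_only=False
    assert box.answer == 7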
class cached_property(object):
""" A property that is only computed once per instance and then replaces
itself with an ordinary attribute. Deleting the attribute resets the
property. """
def __init__(self, func):
self.__doc__ = getattr(func, '__doc__')
self.func = func
def __get__(self, obj, cls):
if obj is None: return self
value = obj.__dict__[self.func.__name__] = self.func(obj)
return value
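# A minimal sketch of cached_property: the wrapped method runs once per
# instance, then its result shadows the property via the instance __dict__.
def _cached_property_demo():
    calls = []
    class Circle(object):
        def __init__(self, r):
            self.r = r
        @cached_property
        def area(self):
            calls.append(1)
            return 3.14159 * self.r * self.r
    c = Circle(2)
    assert c.area == c.area and len(calls) == 1   # computed only once
    del c.area                                    # deleting resets the cache
    assert c.area and len(calls) == 2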
class lazy_attribute(object):
""" A property that caches itself to the class object. """
def __init__(self, func):
functools.update_wrapper(self, func, updated=[])
self.getter = func
def __get__(self, obj, cls):
value = self.getter(cls)
setattr(cls, self.__name__, value)
return value
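# A minimal sketch of lazy_attribute: the getter receives the class itself and
# its result replaces the descriptor on the class, so it runs once per class.
def _lazy_attribute_demo():
    class Methods(object):
        @lazy_attribute
        def verbs(cls):
            return ['GET', 'POST', 'PUT', 'DELETE']
    assert Methods.verbs is Methods.verbs   # second access is a plain attribute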
###############################################################################
# Exceptions and Events ########################################################
###############################################################################
class BottleException(Exception):
""" A base class for exceptions used by bottle. """
pass
###############################################################################
# Routing ######################################################################
###############################################################################
class RouteError(BottleException):
""" This is a base class for all routing related exceptions """
class RouteReset(BottleException):
""" If raised by a plugin or request handler, the route is reset and all
plugins are re-applied. """
class RouterUnknownModeError(RouteError): pass
class RouteSyntaxError(RouteError):
""" The route parser found something not supported by this router. """
class RouteBuildError(RouteError):
""" The route could not be built. """
def _re_flatten(p):
""" Turn all capturing groups in a regular expression pattern into
non-capturing groups. """
if '(' not in p:
return p
return re.sub(r'(\\*)(\(\?P<[^>]+>|\((?!\?))',
lambda m: m.group(0) if len(m.group(1)) % 2 else m.group(1) + '(?:', p)
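# A minimal sketch of _re_flatten (example patterns assumed): capturing groups
# become non-capturing, escaped parentheses are left untouched.
def _re_flatten_demo():
    assert _re_flatten(r'/user/(\d+)') == r'/user/(?:\d+)'
    assert _re_flatten(r'(?P<id>\d+)/x') == r'(?:\d+)/x'
    assert _re_flatten(r'literal \(paren\)') == r'literal \(paren\)'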
class Router(object):
""" A Router is an ordered collection of route->target pairs. It is used to
efficiently match WSGI requests against a number of routes and return
the first target that satisfies the request. The target may be anything,
usually a string, ID or callable object. A route consists of a path-rule
        and an HTTP method.
The path-rule is either a static path (e.g. `/contact`) or a dynamic
path that contains wildcards (e.g. `/wiki/<page>`). The wildcard syntax
and details on the matching order are described in docs:`routing`.
"""
default_pattern = '[^/]+'
default_filter = 're'
#: The current CPython regexp implementation does not allow more
#: than 99 matching groups per regular expression.
_MAX_GROUPS_PER_PATTERN = 99
def __init__(self, strict=False):
self.rules = [] # All rules in order
self._groups = {} # index of regexes to find them in dyna_routes
self.builder = {} # Data structure for the url builder
self.static = {} # Search structure for static routes
self.dyna_routes = {}
self.dyna_regexes = {} # Search structure for dynamic routes
#: If true, static routes are no longer checked first.
self.strict_order = strict
self.filters = {
're': lambda conf:
(_re_flatten(conf or self.default_pattern), None, None),
'int': lambda conf: (r'-?\d+', int, lambda x: str(int(x))),
'float': lambda conf: (r'-?[\d.]+', float, lambda x: str(float(x))),
'path': lambda conf: (r'.+?', None, None)}
def add_filter(self, name, func):
""" Add a filter. The provided function is called with the configuration
string as parameter and must return a (regexp, to_python, to_url) tuple.
The first element is a string, the last two are callables or None. """
self.filters[name] = func
rule_syntax = re.compile('(\\\\*)'
'(?:(?::([a-zA-Z_][a-zA-Z_0-9]*)?()(?:#(.*?)#)?)'
'|(?:<([a-zA-Z_][a-zA-Z_0-9]*)?(?::([a-zA-Z_]*)'
'(?::((?:\\\\.|[^\\\\>]+)+)?)?)?>))')
def _itertokens(self, rule):
offset, prefix = 0, ''
for match in self.rule_syntax.finditer(rule):
prefix += rule[offset:match.start()]
g = match.groups()
if len(g[0])%2: # Escaped wildcard
prefix += match.group(0)[len(g[0]):]
offset = match.end()
continue
if prefix:
yield prefix, None, None
name, filtr, conf = g[4:7] if g[2] is None else g[1:4]
yield name, filtr or 'default', conf or None
offset, prefix = match.end(), ''
if offset <= len(rule) or prefix:
yield prefix+rule[offset:], None, None
def add(self, rule, method, target, name=None):
""" Add a new rule or replace the target for an existing rule. """
anons = 0 # Number of anonymous wildcards found
keys = [] # Names of keys
pattern = '' # Regular expression pattern with named groups
filters = [] # Lists of wildcard input filters
builder = [] # Data structure for the URL builder
is_static = True
for key, mode, conf in self._itertokens(rule):
if mode:
is_static = False
if mode == 'default': mode = self.default_filter
mask, in_filter, out_filter = self.filters[mode](conf)
if not key:
pattern += '(?:%s)' % mask
key = 'anon%d' % anons
anons += 1
else:
pattern += '(?P<%s>%s)' % (key, mask)
keys.append(key)
if in_filter: filters.append((key, in_filter))
builder.append((key, out_filter or str))
elif key:
pattern += re.escape(key)
builder.append((None, key))
self.builder[rule] = builder
if name: self.builder[name] = builder
if is_static and not self.strict_order:
self.static.setdefault(method, {})
self.static[method][self.build(rule)] = (target, None)
return
try:
re_pattern = re.compile('^(%s)$' % pattern)
re_match = re_pattern.match
except re.error:
raise RouteSyntaxError("Could not add Route: %s (%s)" % (rule, _e()))
if filters:
def getargs(path):
url_args = re_match(path).groupdict()
for name, wildcard_filter in filters:
try:
url_args[name] = wildcard_filter(url_args[name])
except ValueError:
raise HTTPError(400, 'Path has wrong format.')
return url_args
elif re_pattern.groupindex:
def getargs(path):
return re_match(path).groupdict()
else:
getargs = None
flatpat = _re_flatten(pattern)
whole_rule = (rule, flatpat, target, getargs)
if (flatpat, method) in self._groups:
if DEBUG:
msg = 'Route <%s %s> overwrites a previously defined route'
warnings.warn(msg % (method, rule), RuntimeWarning)
self.dyna_routes[method][self._groups[flatpat, method]] = whole_rule
else:
self.dyna_routes.setdefault(method, []).append(whole_rule)
self._groups[flatpat, method] = len(self.dyna_routes[method]) - 1
self._compile(method)
def _compile(self, method):
all_rules = self.dyna_routes[method]
comborules = self.dyna_regexes[method] = []
maxgroups = self._MAX_GROUPS_PER_PATTERN
for x in range(0, len(all_rules), maxgroups):
some = all_rules[x:x+maxgroups]
combined = (flatpat for (_, flatpat, _, _) in some)
combined = '|'.join('(^%s$)' % flatpat for flatpat in combined)
combined = re.compile(combined).match
rules = [(target, getargs) for (_, _, target, getargs) in some]
comborules.append((combined, rules))
def build(self, _name, *anons, **query):
""" Build an URL by filling the wildcards in a rule. """
builder = self.builder.get(_name)
if not builder: raise RouteBuildError("No route with that name.", _name)
try:
for i, value in enumerate(anons): query['anon%d'%i] = value
url = ''.join([f(query.pop(n)) if n else f for (n,f) in builder])
return url if not query else url+'?'+urlencode(query)
except KeyError:
raise RouteBuildError('Missing URL argument: %r' % _e().args[0])
def match(self, environ):
""" Return a (target, url_args) tuple or raise HTTPError(400/404/405). """
verb = environ['REQUEST_METHOD'].upper()
path = environ['PATH_INFO'] or '/'
if verb == 'HEAD':
methods = ['PROXY', verb, 'GET', 'ANY']
else:
methods = ['PROXY', verb, 'ANY']
for method in methods:
if method in self.static and path in self.static[method]:
target, getargs = self.static[method][path]
return target, getargs(path) if getargs else {}
elif method in self.dyna_regexes:
for combined, rules in self.dyna_regexes[method]:
match = combined(path)
if match:
target, getargs = rules[match.lastindex - 1]
return target, getargs(path) if getargs else {}
# No matching route found. Collect alternative methods for 405 response
allowed = set([])
nocheck = set(methods)
for method in set(self.static) - nocheck:
if path in self.static[method]:
allowed.add(verb)
for method in set(self.dyna_regexes) - allowed - nocheck:
for combined, rules in self.dyna_regexes[method]:
match = combined(path)
if match:
allowed.add(method)
if allowed:
allow_header = ",".join(sorted(allowed))
raise HTTPError(405, "Method not allowed.", Allow=allow_header)
# No matching route and no alternative method found. We give up
raise HTTPError(404, "Not found: " + repr(path))
class Route(object):
""" This class wraps a route callback along with route specific metadata and
configuration and applies Plugins on demand. It is also responsible for
        turning a URL path rule into a regular expression usable by the Router.
"""
def __init__(self, app, rule, method, callback, name=None,
plugins=None, skiplist=None, **config):
#: The application this route is installed to.
self.app = app
#: The path-rule string (e.g. ``/wiki/<page>``).
self.rule = rule
#: The HTTP method as a string (e.g. ``GET``).
self.method = method
#: The original callback with no plugins applied. Useful for introspection.
self.callback = callback
#: The name of the route (if specified) or ``None``.
self.name = name or None
#: A list of route-specific plugins (see :meth:`Bottle.route`).
self.plugins = plugins or []
#: A list of plugins to not apply to this route (see :meth:`Bottle.route`).
self.skiplist = skiplist or []
#: Additional keyword arguments passed to the :meth:`Bottle.route`
#: decorator are stored in this dictionary. Used for route-specific
#: plugin configuration and meta-data.
self.config = ConfigDict().load_dict(config)
@cached_property
def call(self):
""" The route callback with all plugins applied. This property is
created on demand and then cached to speed up subsequent requests."""
return self._make_callback()
def reset(self):
""" Forget any cached values. The next time :attr:`call` is accessed,
all plugins are re-applied. """
self.__dict__.pop('call', None)
def prepare(self):
""" Do all on-demand work immediately (useful for debugging)."""
self.call
def all_plugins(self):
""" Yield all Plugins affecting this route. """
unique = set()
for p in reversed(self.app.plugins + self.plugins):
if True in self.skiplist: break
name = getattr(p, 'name', False)
if name and (name in self.skiplist or name in unique): continue
if p in self.skiplist or type(p) in self.skiplist: continue
if name: unique.add(name)
yield p
def _make_callback(self):
callback = self.callback
for plugin in self.all_plugins():
try:
if hasattr(plugin, 'apply'):
callback = plugin.apply(callback, self)
else:
callback = plugin(callback)
except RouteReset: # Try again with changed configuration.
return self._make_callback()
if not callback is self.callback:
update_wrapper(callback, self.callback)
return callback
def get_undecorated_callback(self):
""" Return the callback. If the callback is a decorated function, try to
recover the original function. """
func = self.callback
func = getattr(func, '__func__' if py3k else 'im_func', func)
closure_attr = '__closure__' if py3k else 'func_closure'
while hasattr(func, closure_attr) and getattr(func, closure_attr):
func = getattr(func, closure_attr)[0].cell_contents
return func
def get_callback_args(self):
""" Return a list of argument names the callback (most likely) accepts
as keyword arguments. If the callback is a decorated function, try
to recover the original function before inspection. """
return getargspec(self.get_undecorated_callback())[0]
def get_config(self, key, default=None):
""" Lookup a config field and return its value, first checking the
route.config, then route.app.config."""
        for conf in (self.config, self.app.config):
if key in conf: return conf[key]
return default
def __repr__(self):
cb = self.get_undecorated_callback()
return '<%s %r %r>' % (self.method, self.rule, cb)
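# A minimal introspection sketch for Route (the app, rule and callback are
# assumed): the method, rule and callback signature stay reachable.
def _route_introspection_demo():
    app = Bottle()   # Bottle is defined below; resolved when this is called
    def hello(name):
        return 'Hello %s' % name
    route = Route(app, '/hello/<name>', 'GET', hello)
    assert route.get_callback_args() == ['name']
    assert repr(route).startswith('<GET')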
###############################################################################
# Application Object ###########################################################
###############################################################################
class Bottle(object):
""" Each Bottle object represents a single, distinct web application and
consists of routes, callbacks, plugins, resources and configuration.
Instances are callable WSGI applications.
:param catchall: If true (default), handle all exceptions. Turn off to
let debugging middleware handle exceptions.
"""
def __init__(self, catchall=True, autojson=True):
#: A :class:`ConfigDict` for app specific configuration.
self.config = ConfigDict()
self.config._on_change = functools.partial(self.trigger_hook, 'config')
self.config.meta_set('autojson', 'validate', bool)
self.config.meta_set('catchall', 'validate', bool)
self.config['catchall'] = catchall
self.config['autojson'] = autojson
#: A :class:`ResourceManager` for application files
self.resources = ResourceManager()
self.routes = [] # List of installed :class:`Route` instances.
self.router = Router() # Maps requests to :class:`Route` instances.
self.error_handler = {}
# Core plugins
self.plugins = [] # List of installed plugins.
if self.config['autojson']:
self.install(JSONPlugin())
self.install(TemplatePlugin())
#: If true, most exceptions are caught and returned as :exc:`HTTPError`
catchall = DictProperty('config', 'catchall')
__hook_names = 'before_request', 'after_request', 'app_reset', 'config'
__hook_reversed = 'after_request'
@cached_property
def _hooks(self):
return dict((name, []) for name in self.__hook_names)
def add_hook(self, name, func):
""" Attach a callback to a hook. Three hooks are currently implemented:
before_request
Executed once before each request. The request context is
available, but no routing has happened yet.
after_request
Executed once after each request regardless of its outcome.
app_reset
Called whenever :meth:`Bottle.reset` is called.
"""
if name in self.__hook_reversed:
self._hooks[name].insert(0, func)
else:
self._hooks[name].append(func)
def remove_hook(self, name, func):
""" Remove a callback from a hook. """
if name in self._hooks and func in self._hooks[name]:
self._hooks[name].remove(func)
return True
def trigger_hook(self, __name, *args, **kwargs):
""" Trigger a hook and return a list of results. """
return [hook(*args, **kwargs) for hook in self._hooks[__name][:]]
def hook(self, name):
""" Return a decorator that attaches a callback to a hook. See
:meth:`add_hook` for details."""
def decorator(func):
self.add_hook(name, func)
return func
return decorator
def mount(self, prefix, app, **options):
""" Mount an application (:class:`Bottle` or plain WSGI) to a specific
URL prefix. Example::
root_app.mount('/admin/', admin_app)
:param prefix: path prefix or `mount-point`. If it ends in a slash,
that slash is mandatory.
:param app: an instance of :class:`Bottle` or a WSGI application.
All other parameters are passed to the underlying :meth:`route` call.
"""
segments = [p for p in prefix.split('/') if p]
if not segments: raise ValueError('Empty path prefix.')
path_depth = len(segments)
def mountpoint_wrapper():
try:
request.path_shift(path_depth)
rs = HTTPResponse([])
def start_response(status, headerlist, exc_info=None):
if exc_info:
_raise(*exc_info)
rs.status = status
for name, value in headerlist: rs.add_header(name, value)
return rs.body.append
body = app(request.environ, start_response)
if body and rs.body: body = itertools.chain(rs.body, body)
rs.body = body or rs.body
return rs
finally:
request.path_shift(-path_depth)
options.setdefault('skip', True)
options.setdefault('method', 'PROXY')
options.setdefault('mountpoint', {'prefix': prefix, 'target': app})
options['callback'] = mountpoint_wrapper
self.route('/%s/<:re:.*>' % '/'.join(segments), **options)
if not prefix.endswith('/'):
self.route('/' + '/'.join(segments), **options)
def merge(self, routes):
""" Merge the routes of another :class:`Bottle` application or a list of
:class:`Route` objects into this application. The routes keep their
'owner', meaning that the :data:`Route.app` attribute is not
changed. """
if isinstance(routes, Bottle):
routes = routes.routes
for route in routes:
self.add_route(route)
def install(self, plugin):
""" Add a plugin to the list of plugins and prepare it for being
applied to all routes of this application. A plugin may be a simple
decorator or an object that implements the :class:`Plugin` API.
"""
if hasattr(plugin, 'setup'): plugin.setup(self)
if not callable(plugin) and not hasattr(plugin, 'apply'):
raise TypeError("Plugins must be callable or implement .apply()")
self.plugins.append(plugin)
self.reset()
return plugin
def uninstall(self, plugin):
""" Uninstall plugins. Pass an instance to remove a specific plugin, a type
object to remove all plugins that match that type, a string to remove
all plugins with a matching ``name`` attribute or ``True`` to remove all
plugins. Return the list of removed plugins. """
removed, remove = [], plugin
for i, plugin in list(enumerate(self.plugins))[::-1]:
if remove is True or remove is plugin or remove is type(plugin) \
or getattr(plugin, 'name', True) == remove:
removed.append(plugin)
del self.plugins[i]
if hasattr(plugin, 'close'): plugin.close()
if removed: self.reset()
return removed
def reset(self, route=None):
""" Reset all routes (force plugins to be re-applied) and clear all
caches. If an ID or route object is given, only that specific route
is affected. """
if route is None: routes = self.routes
elif isinstance(route, Route): routes = [route]
else: routes = [self.routes[route]]
for route in routes: route.reset()
if DEBUG:
for route in routes: route.prepare()
self.trigger_hook('app_reset')
def close(self):
""" Close the application and all installed plugins. """
for plugin in self.plugins:
if hasattr(plugin, 'close'): plugin.close()
def run(self, **kwargs):
""" Calls :func:`run` with the same parameters. """
run(self, **kwargs)
def match(self, environ):
""" Search for a matching route and return a (:class:`Route` , urlargs)
tuple. The second value is a dictionary with parameters extracted
from the URL. Raise :exc:`HTTPError` (404/405) on a non-match."""
return self.router.match(environ)
def get_url(self, routename, **kargs):
""" Return a string that matches a named route """
scriptname = request.environ.get('SCRIPT_NAME', '').strip('/') + '/'
location = self.router.build(routename, **kargs).lstrip('/')
return urljoin(urljoin('/', scriptname), location)
def add_route(self, route):
""" Add a route object, but do not change the :data:`Route.app`
attribute."""
self.routes.append(route)
self.router.add(route.rule, route.method, route, name=route.name)
if DEBUG: route.prepare()
def route(self, path=None, method='GET', callback=None, name=None,
apply=None, skip=None, **config):
""" A decorator to bind a function to a request URL. Example::
@app.route('/hello/<name>')
def hello(name):
return 'Hello %s' % name
            The ``<name>`` part is a wildcard. See :class:`Router` for syntax
details.
:param path: Request path or a list of paths to listen to. If no
path is specified, it is automatically generated from the
signature of the function.
:param method: HTTP method (`GET`, `POST`, `PUT`, ...) or a list of
methods to listen to. (default: `GET`)
:param callback: An optional shortcut to avoid the decorator
syntax. ``route(..., callback=func)`` equals ``route(...)(func)``
:param name: The name for this route. (default: None)
:param apply: A decorator or plugin or a list of plugins. These are
applied to the route callback in addition to installed plugins.
:param skip: A list of plugins, plugin classes or names. Matching
plugins are not installed to this route. ``True`` skips all.
Any additional keyword arguments are stored as route-specific
configuration and passed to plugins (see :meth:`Plugin.apply`).
"""
if callable(path): path, callback = None, path
plugins = makelist(apply)
skiplist = makelist(skip)
def decorator(callback):
if isinstance(callback, basestring): callback = load(callback)
for rule in makelist(path) or yieldroutes(callback):
for verb in makelist(method):
verb = verb.upper()
route = Route(self, rule, verb, callback, name=name,
plugins=plugins, skiplist=skiplist, **config)
self.add_route(route)
return callback
return decorator(callback) if callback else decorator
def get(self, path=None, method='GET', **options):
""" Equals :meth:`route`. """
return self.route(path, method, **options)
def post(self, path=None, method='POST', **options):
""" Equals :meth:`route` with a ``POST`` method parameter. """
return self.route(path, method, **options)
def put(self, path=None, method='PUT', **options):
""" Equals :meth:`route` with a ``PUT`` method parameter. """
return self.route(path, method, **options)
def delete(self, path=None, method='DELETE', **options):
""" Equals :meth:`route` with a ``DELETE`` method parameter. """
return self.route(path, method, **options)
def patch(self, path=None, method='PATCH', **options):
""" Equals :meth:`route` with a ``PATCH`` method parameter. """
return self.route(path, method, **options)
def error(self, code=500):
""" Decorator: Register an output handler for a HTTP error code"""
def wrapper(handler):
self.error_handler[int(code)] = handler
return handler
return wrapper
def default_error_handler(self, res):
return tob(template(ERROR_PAGE_TEMPLATE, e=res))
def _handle(self, environ):
path = environ['bottle.raw_path'] = environ['PATH_INFO']
if py3k:
try:
environ['PATH_INFO'] = path.encode('latin1').decode('utf8')
except UnicodeError:
return HTTPError(400, 'Invalid path string. Expected UTF-8')
try:
environ['bottle.app'] = self
request.bind(environ)
response.bind()
try:
self.trigger_hook('before_request')
route, args = self.router.match(environ)
environ['route.handle'] = route
environ['bottle.route'] = route
environ['route.url_args'] = args
return route.call(**args)
finally:
self.trigger_hook('after_request')
except HTTPResponse:
return _e()
except RouteReset:
route.reset()
return self._handle(environ)
except (KeyboardInterrupt, SystemExit, MemoryError):
raise
except Exception:
if not self.catchall: raise
stacktrace = format_exc()
environ['wsgi.errors'].write(stacktrace)
return HTTPError(500, "Internal Server Error", _e(), stacktrace)
def _cast(self, out, peek=None):
""" Try to convert the parameter into something WSGI compatible and set
correct HTTP headers when possible.
Support: False, str, unicode, dict, HTTPResponse, HTTPError, file-like,
iterable of strings and iterable of unicodes
"""
# Empty output is done here
if not out:
if 'Content-Length' not in response:
response['Content-Length'] = 0
return []
# Join lists of byte or unicode strings. Mixed lists are NOT supported
if isinstance(out, (tuple, list))\
and isinstance(out[0], (bytes, unicode)):
out = out[0][0:0].join(out) # b'abc'[0:0] -> b''
# Encode unicode strings
if isinstance(out, unicode):
out = out.encode(response.charset)
# Byte Strings are just returned
if isinstance(out, bytes):
if 'Content-Length' not in response:
response['Content-Length'] = len(out)
return [out]
# HTTPError or HTTPException (recursive, because they may wrap anything)
# TODO: Handle these explicitly in handle() or make them iterable.
if isinstance(out, HTTPError):
out.apply(response)
out = self.error_handler.get(out.status_code, self.default_error_handler)(out)
return self._cast(out)
if isinstance(out, HTTPResponse):
out.apply(response)
return self._cast(out.body)
# File-like objects.
if hasattr(out, 'read'):
if 'wsgi.file_wrapper' in request.environ:
return request.environ['wsgi.file_wrapper'](out)
elif hasattr(out, 'close') or not hasattr(out, '__iter__'):
return WSGIFileWrapper(out)
# Handle Iterables. We peek into them to detect their inner type.
try:
iout = iter(out)
first = next(iout)
while not first:
first = next(iout)
except StopIteration:
return self._cast('')
except HTTPResponse:
first = _e()
except (KeyboardInterrupt, SystemExit, MemoryError):
raise
except:
if not self.catchall: raise
first = HTTPError(500, 'Unhandled exception', _e(), format_exc())
# These are the inner types allowed in iterator or generator objects.
if isinstance(first, HTTPResponse):
return self._cast(first)
elif isinstance(first, bytes):
new_iter = itertools.chain([first], iout)
elif isinstance(first, unicode):
encoder = lambda x: x.encode(response.charset)
new_iter = imap(encoder, itertools.chain([first], iout))
else:
msg = 'Unsupported response type: %s' % type(first)
return self._cast(HTTPError(500, msg))
if hasattr(out, 'close'):
new_iter = _closeiter(new_iter, out.close)
return new_iter
def wsgi(self, environ, start_response):
""" The bottle WSGI-interface. """
try:
out = self._cast(self._handle(environ))
# rfc2616 section 4.3
if response._status_code in (100, 101, 204, 304)\
or environ['REQUEST_METHOD'] == 'HEAD':
if hasattr(out, 'close'): out.close()
out = []
start_response(response._status_line, response.headerlist)
return out
except (KeyboardInterrupt, SystemExit, MemoryError):
raise
except:
if not self.catchall: raise
err = '<h1>Critical error while processing request: %s</h1>' \
% html_escape(environ.get('PATH_INFO', '/'))
if DEBUG:
err += '<h2>Error:</h2>\n<pre>\n%s\n</pre>\n' \
'<h2>Traceback:</h2>\n<pre>\n%s\n</pre>\n' \
% (html_escape(repr(_e())), html_escape(format_exc()))
environ['wsgi.errors'].write(err)
headers = [('Content-Type', 'text/html; charset=UTF-8')]
start_response('500 INTERNAL SERVER ERROR', headers, sys.exc_info())
return [tob(err)]
def __call__(self, environ, start_response):
""" Each instance of :class:'Bottle' is a WSGI application. """
return self.wsgi(environ, start_response)
def __enter__(self):
""" Use this application as default for all module-level shortcuts. """
default_app.push(self)
return self
def __exit__(self, exc_type, exc_value, traceback):
default_app.pop()
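# A minimal application sketch tying the pieces above together (paths, names
# and handlers are illustrative, not part of bottle itself):
def _bottle_app_demo():
    app = Bottle()
    @app.hook('before_request')
    def audit():
        pass                        # runs once per request, before routing
    @app.route('/hello/<name>')
    def hello(name):
        return 'Hello %s' % name
    @app.error(404)
    def not_found(err):
        return 'Nothing here: %s' % err.status_code
    return app                      # serve with run(app, host=..., port=...)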
###############################################################################
# HTTP and WSGI Tools ##########################################################
###############################################################################
class BaseRequest(object):
""" A wrapper for WSGI environment dictionaries that adds a lot of
convenient access methods and properties. Most of them are read-only.
Adding new attributes to a request actually adds them to the environ
dictionary (as 'bottle.request.ext.<name>'). This is the recommended
way to store and access request-specific data.
"""
__slots__ = ('environ', )
#: Maximum size of memory buffer for :attr:`body` in bytes.
MEMFILE_MAX = 102400
def __init__(self, environ=None):
""" Wrap a WSGI environ dictionary. """
#: The wrapped WSGI environ dictionary. This is the only real attribute.
#: All other attributes actually are read-only properties.
self.environ = {} if environ is None else environ
self.environ['bottle.request'] = self
@DictProperty('environ', 'bottle.app', read_only=True)
def app(self):
""" Bottle application handling this request. """
raise RuntimeError('This request is not connected to an application.')
@DictProperty('environ', 'bottle.route', read_only=True)
def route(self):
""" The bottle :class:`Route` object that matches this request. """
raise RuntimeError('This request is not connected to a route.')
@DictProperty('environ', 'route.url_args', read_only=True)
def url_args(self):
""" The arguments extracted from the URL. """
raise RuntimeError('This request is not connected to a route.')
@property
def path(self):
""" The value of ``PATH_INFO`` with exactly one prefixed slash (to fix
broken clients and avoid the "empty path" edge case). """
return '/' + self.environ.get('PATH_INFO','').lstrip('/')
@property
def method(self):
""" The ``REQUEST_METHOD`` value as an uppercase string. """
return self.environ.get('REQUEST_METHOD', 'GET').upper()
@DictProperty('environ', 'bottle.request.headers', read_only=True)
def headers(self):
""" A :class:`WSGIHeaderDict` that provides case-insensitive access to
HTTP request headers. """
return WSGIHeaderDict(self.environ)
def get_header(self, name, default=None):
""" Return the value of a request header, or a given default value. """
return self.headers.get(name, default)
@DictProperty('environ', 'bottle.request.cookies', read_only=True)
def cookies(self):
""" Cookies parsed into a :class:`FormsDict`. Signed cookies are NOT
decoded. Use :meth:`get_cookie` if you expect signed cookies. """
cookies = SimpleCookie(self.environ.get('HTTP_COOKIE','')).values()
return FormsDict((c.key, c.value) for c in cookies)
def get_cookie(self, key, default=None, secret=None):
""" Return the content of a cookie. To read a `Signed Cookie`, the
`secret` must match the one used to create the cookie (see
:meth:`BaseResponse.set_cookie`). If anything goes wrong (missing
cookie or wrong signature), return a default value. """
value = self.cookies.get(key)
if secret and value:
dec = cookie_decode(value, secret) # (key, value) tuple or None
return dec[1] if dec and dec[0] == key else default
return value or default
@DictProperty('environ', 'bottle.request.query', read_only=True)
def query(self):
""" The :attr:`query_string` parsed into a :class:`FormsDict`. These
values are sometimes called "URL arguments" or "GET parameters", but
            should not be confused with the "URL wildcards" provided by the
:class:`Router`. """
get = self.environ['bottle.get'] = FormsDict()
pairs = _parse_qsl(self.environ.get('QUERY_STRING', ''))
for key, value in pairs:
get[key] = value
return get
@DictProperty('environ', 'bottle.request.forms', read_only=True)
def forms(self):
""" Form values parsed from an `url-encoded` or `multipart/form-data`
encoded POST or PUT request body. The result is returned as a
:class:`FormsDict`. All keys and values are strings. File uploads
are stored separately in :attr:`files`. """
forms = FormsDict()
for name, item in self.POST.allitems():
if not isinstance(item, FileUpload):
forms[name] = item
return forms
@DictProperty('environ', 'bottle.request.params', read_only=True)
def params(self):
""" A :class:`FormsDict` with the combined values of :attr:`query` and
:attr:`forms`. File uploads are stored in :attr:`files`. """
params = FormsDict()
for key, value in self.query.allitems():
params[key] = value
for key, value in self.forms.allitems():
params[key] = value
return params
@DictProperty('environ', 'bottle.request.files', read_only=True)
def files(self):
""" File uploads parsed from `multipart/form-data` encoded POST or PUT
request body. The values are instances of :class:`FileUpload`.
"""
files = FormsDict()
for name, item in self.POST.allitems():
if isinstance(item, FileUpload):
files[name] = item
return files
@DictProperty('environ', 'bottle.request.json', read_only=True)
def json(self):
""" If the ``Content-Type`` header is ``application/json``, this
property holds the parsed content of the request body. Only requests
smaller than :attr:`MEMFILE_MAX` are processed to avoid memory
exhaustion. """
ctype = self.environ.get('CONTENT_TYPE', '').lower().split(';')[0]
if ctype == 'application/json':
b = self._get_body_string()
if not b:
return None
return json_loads(b)
return None
def _iter_body(self, read, bufsize):
maxread = max(0, self.content_length)
while maxread:
part = read(min(maxread, bufsize))
if not part: break
yield part
maxread -= len(part)
@staticmethod
def _iter_chunked(read, bufsize):
err = HTTPError(400, 'Error while parsing chunked transfer body.')
rn, sem, bs = tob('\r\n'), tob(';'), tob('')
while True:
header = read(1)
while header[-2:] != rn:
c = read(1)
header += c
if not c: raise err
if len(header) > bufsize: raise err
size, _, _ = header.partition(sem)
try:
maxread = int(tonat(size.strip()), 16)
except ValueError:
raise err
if maxread == 0: break
buff = bs
while maxread > 0:
if not buff:
buff = read(min(maxread, bufsize))
part, buff = buff[:maxread], buff[maxread:]
if not part: raise err
yield part
maxread -= len(part)
if read(2) != rn:
raise err
@DictProperty('environ', 'bottle.request.body', read_only=True)
def _body(self):
body_iter = self._iter_chunked if self.chunked else self._iter_body
read_func = self.environ['wsgi.input'].read
body, body_size, is_temp_file = BytesIO(), 0, False
for part in body_iter(read_func, self.MEMFILE_MAX):
body.write(part)
body_size += len(part)
if not is_temp_file and body_size > self.MEMFILE_MAX:
body, tmp = TemporaryFile(mode='w+b'), body
body.write(tmp.getvalue())
del tmp
is_temp_file = True
self.environ['wsgi.input'] = body
body.seek(0)
return body
def _get_body_string(self):
""" read body until content-length or MEMFILE_MAX into a string. Raise
HTTPError(413) on requests that are to large. """
clen = self.content_length
if clen > self.MEMFILE_MAX:
raise HTTPError(413, 'Request too large')
if clen < 0: clen = self.MEMFILE_MAX + 1
data = self.body.read(clen)
if len(data) > self.MEMFILE_MAX: # Fail fast
raise HTTPError(413, 'Request too large')
return data
@property
def body(self):
""" The HTTP request body as a seek-able file-like object. Depending on
:attr:`MEMFILE_MAX`, this is either a temporary file or a
:class:`io.BytesIO` instance. Accessing this property for the first
time reads and replaces the ``wsgi.input`` environ variable.
Subsequent accesses just do a `seek(0)` on the file object. """
self._body.seek(0)
return self._body
@property
def chunked(self):
""" True if Chunked transfer encoding was. """
return 'chunked' in self.environ.get('HTTP_TRANSFER_ENCODING', '').lower()
#: An alias for :attr:`query`.
GET = query
@DictProperty('environ', 'bottle.request.post', read_only=True)
def POST(self):
""" The values of :attr:`forms` and :attr:`files` combined into a single
:class:`FormsDict`. Values are either strings (form values) or
instances of :class:`cgi.FieldStorage` (file uploads).
"""
post = FormsDict()
# We default to application/x-www-form-urlencoded for everything that
# is not multipart and take the fast path (also: 3.1 workaround)
if not self.content_type.startswith('multipart/'):
pairs = _parse_qsl(tonat(self._get_body_string(), 'latin1'))
for key, value in pairs:
post[key] = value
return post
safe_env = {'QUERY_STRING':''} # Build a safe environment for cgi
for key in ('REQUEST_METHOD', 'CONTENT_TYPE', 'CONTENT_LENGTH'):
if key in self.environ: safe_env[key] = self.environ[key]
args = dict(fp=self.body, environ=safe_env, keep_blank_values=True)
if py31:
args['fp'] = NCTextIOWrapper(args['fp'], encoding='utf8',
newline='\n')
elif py3k:
args['encoding'] = 'utf8'
data = cgi.FieldStorage(**args)
self['_cgi.FieldStorage'] = data #http://bugs.python.org/issue18394#msg207958
data = data.list or []
for item in data:
if item.filename:
post[item.name] = FileUpload(item.file, item.name,
item.filename, item.headers)
else:
post[item.name] = item.value
return post
@property
def url(self):
""" The full request URI including hostname and scheme. If your app
lives behind a reverse proxy or load balancer and you get confusing
results, make sure that the ``X-Forwarded-Host`` header is set
correctly. """
return self.urlparts.geturl()
@DictProperty('environ', 'bottle.request.urlparts', read_only=True)
def urlparts(self):
""" The :attr:`url` string as an :class:`urlparse.SplitResult` tuple.
The tuple contains (scheme, host, path, query_string and fragment),
but the fragment is always empty because it is not visible to the
server. """
env = self.environ
http = env.get('HTTP_X_FORWARDED_PROTO') or env.get('wsgi.url_scheme', 'http')
host = env.get('HTTP_X_FORWARDED_HOST') or env.get('HTTP_HOST')
if not host:
# HTTP 1.1 requires a Host-header. This is for HTTP/1.0 clients.
host = env.get('SERVER_NAME', '127.0.0.1')
port = env.get('SERVER_PORT')
if port and port != ('80' if http == 'http' else '443'):
host += ':' + port
path = urlquote(self.fullpath)
return UrlSplitResult(http, host, path, env.get('QUERY_STRING'), '')
@property
def fullpath(self):
""" Request path including :attr:`script_name` (if present). """
return urljoin(self.script_name, self.path.lstrip('/'))
@property
def query_string(self):
""" The raw :attr:`query` part of the URL (everything in between ``?``
and ``#``) as a string. """
return self.environ.get('QUERY_STRING', '')
@property
def script_name(self):
""" The initial portion of the URL's `path` that was removed by a higher
level (server or routing middleware) before the application was
            called. This script path is returned with leading and trailing
slashes. """
script_name = self.environ.get('SCRIPT_NAME', '').strip('/')
return '/' + script_name + '/' if script_name else '/'
def path_shift(self, shift=1):
""" Shift path segments from :attr:`path` to :attr:`script_name` and
vice versa.
:param shift: The number of path segments to shift. May be negative
to change the shift direction. (default: 1)
"""
script = self.environ.get('SCRIPT_NAME','/')
self['SCRIPT_NAME'], self['PATH_INFO'] = path_shift(script, self.path, shift)
@property
def content_length(self):
""" The request body length as an integer. The client is responsible to
set this header. Otherwise, the real length of the body is unknown
and -1 is returned. In this case, :attr:`body` will be empty. """
return int(self.environ.get('CONTENT_LENGTH') or -1)
@property
def content_type(self):
""" The Content-Type header as a lowercase-string (default: empty). """
return self.environ.get('CONTENT_TYPE', '').lower()
@property
def is_xhr(self):
""" True if the request was triggered by a XMLHttpRequest. This only
works with JavaScript libraries that support the `X-Requested-With`
header (most of the popular libraries do). """
requested_with = self.environ.get('HTTP_X_REQUESTED_WITH','')
return requested_with.lower() == 'xmlhttprequest'
@property
def is_ajax(self):
""" Alias for :attr:`is_xhr`. "Ajax" is not the right term. """
return self.is_xhr
@property
def auth(self):
""" HTTP authentication data as a (user, password) tuple. This
implementation currently supports basic (not digest) authentication
only. If the authentication happened at a higher level (e.g. in the
front web-server or a middleware), the password field is None, but
the user field is looked up from the ``REMOTE_USER`` environ
variable. On any errors, None is returned. """
basic = parse_auth(self.environ.get('HTTP_AUTHORIZATION',''))
if basic: return basic
ruser = self.environ.get('REMOTE_USER')
if ruser: return (ruser, None)
return None
@property
def remote_route(self):
""" A list of all IPs that were involved in this request, starting with
            the client IP and followed by zero or more proxies. This only works
            if all proxies support the ``X-Forwarded-For`` header. Note that
            this information can be forged by malicious clients. """
proxy = self.environ.get('HTTP_X_FORWARDED_FOR')
if proxy: return [ip.strip() for ip in proxy.split(',')]
remote = self.environ.get('REMOTE_ADDR')
return [remote] if remote else []
@property
def remote_addr(self):
""" The client IP as a string. Note that this information can be forged
by malicious clients. """
route = self.remote_route
return route[0] if route else None
def copy(self):
""" Return a new :class:`Request` with a shallow :attr:`environ` copy. """
return Request(self.environ.copy())
def get(self, value, default=None): return self.environ.get(value, default)
def __getitem__(self, key): return self.environ[key]
def __delitem__(self, key): self[key] = ""; del(self.environ[key])
def __iter__(self): return iter(self.environ)
def __len__(self): return len(self.environ)
def keys(self): return self.environ.keys()
def __setitem__(self, key, value):
""" Change an environ value and clear all caches that depend on it. """
if self.environ.get('bottle.request.readonly'):
raise KeyError('The environ dictionary is read-only.')
self.environ[key] = value
todelete = ()
if key == 'wsgi.input':
todelete = ('body', 'forms', 'files', 'params', 'post', 'json')
elif key == 'QUERY_STRING':
todelete = ('query', 'params')
elif key.startswith('HTTP_'):
todelete = ('headers', 'cookies')
for key in todelete:
self.environ.pop('bottle.request.'+key, None)
def __repr__(self):
return '<%s: %s %s>' % (self.__class__.__name__, self.method, self.url)
def __getattr__(self, name):
""" Search in self.environ for additional user defined attributes. """
try:
var = self.environ['bottle.request.ext.%s'%name]
return var.__get__(self) if hasattr(var, '__get__') else var
except KeyError:
raise AttributeError('Attribute %r not defined.' % name)
def __setattr__(self, name, value):
if name == 'environ': return object.__setattr__(self, name, value)
self.environ['bottle.request.ext.%s'%name] = value
def _hkey(s):
return s.title().replace('_','-')
class HeaderProperty(object):
def __init__(self, name, reader=None, writer=str, default=''):
self.name, self.default = name, default
self.reader, self.writer = reader, writer
self.__doc__ = 'Current value of the %r header.' % name.title()
def __get__(self, obj, _):
if obj is None: return self
value = obj.headers.get(self.name, self.default)
return self.reader(value) if self.reader else value
def __set__(self, obj, value):
obj.headers[self.name] = self.writer(value)
def __delete__(self, obj):
del obj.headers[self.name]
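# Editor's sketch (not part of bottle): HeaderProperty is a descriptor that
# gives typed, attribute-style access to a single header on any object with
# a dict-like `headers` attribute. `Thing` here is hypothetical.
#
#     class Thing(object):
#         retries = HeaderProperty('X-Retries', reader=int, default=0)
#         def __init__(self):
#             self.headers = {}
#
#     t = Thing()
#     t.retries = 3              # stores headers['X-Retries'] = '3'
#     assert t.retries == 3      # reader=int parses the raw value back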
class BaseResponse(object):
""" Storage class for a response body as well as headers and cookies.
This class supports dict-like, case-insensitive item access to
headers, but is NOT a dict. Most notably, iterating over a response
yields parts of the body and not the headers.
:param body: The response body as one of the supported types.
:param status: Either an HTTP status code (e.g. 200) or a status line
including the reason phrase (e.g. '200 OK').
:param headers: A dictionary or a list of name-value pairs.
Additional keyword arguments are added to the list of headers.
Underscores in the header name are replaced with dashes.
"""
default_status = 200
default_content_type = 'text/html; charset=UTF-8'
# Header blacklist for specific response codes
# (rfc2616 section 10.2.3 and 10.3.5)
bad_headers = {
204: set(('Content-Type',)),
304: set(('Allow', 'Content-Encoding', 'Content-Language',
'Content-Length', 'Content-Range', 'Content-Type',
'Content-Md5', 'Last-Modified'))}
def __init__(self, body='', status=None, headers=None, **more_headers):
self._cookies = None
self._headers = {}
self.body = body
self.status = status or self.default_status
if headers:
if isinstance(headers, dict):
headers = headers.items()
for name, value in headers:
self.add_header(name, value)
if more_headers:
for name, value in more_headers.items():
self.add_header(name, value)
def copy(self, cls=None):
""" Returns a copy of self. """
cls = cls or BaseResponse
assert issubclass(cls, BaseResponse)
copy = cls()
copy.status = self.status
copy._headers = dict((k, v[:]) for (k, v) in self._headers.items())
if self._cookies:
copy._cookies = SimpleCookie()
copy._cookies.load(self._cookies.output())
return copy
def __iter__(self):
return iter(self.body)
def close(self):
if hasattr(self.body, 'close'):
self.body.close()
@property
def status_line(self):
""" The HTTP status line as a string (e.g. ``404 Not Found``)."""
return self._status_line
@property
def status_code(self):
""" The HTTP status code as an integer (e.g. 404)."""
return self._status_code
def _set_status(self, status):
if isinstance(status, int):
code, status = status, _HTTP_STATUS_LINES.get(status)
elif ' ' in status:
status = status.strip()
code = int(status.split()[0])
else:
raise ValueError('String status line without a reason phrase.')
if not 100 <= code <= 999: raise ValueError('Status code out of range.')
self._status_code = code
self._status_line = str(status or ('%d Unknown' % code))
def _get_status(self):
return self._status_line
status = property(_get_status, _set_status, None,
''' A writeable property to change the HTTP response status. It accepts
either a numeric code (100-999) or a string with a custom reason
phrase (e.g. "404 Brain not found"). Both :data:`status_line` and
:data:`status_code` are updated accordingly. The return value is
always a status string. ''')
del _get_status, _set_status
@property
def headers(self):
""" An instance of :class:`HeaderDict`, a case-insensitive dict-like
view on the response headers. """
hdict = HeaderDict()
hdict.dict = self._headers
return hdict
def __contains__(self, name): return _hkey(name) in self._headers
def __delitem__(self, name): del self._headers[_hkey(name)]
def __getitem__(self, name): return self._headers[_hkey(name)][-1]
def __setitem__(self, name, value): self._headers[_hkey(name)] = [str(value)]
def get_header(self, name, default=None):
""" Return the value of a previously defined header. If there is no
header with that name, return a default value. """
return self._headers.get(_hkey(name), [default])[-1]
def set_header(self, name, value):
""" Create a new response header, replacing any previously defined
headers with the same name. """
self._headers[_hkey(name)] = [value if isinstance(value, unicode) else str(value)]
def add_header(self, name, value):
""" Add an additional response header, not removing duplicates. """
self._headers.setdefault(_hkey(name), []).append(str(value))
def iter_headers(self):
""" Yield (header, value) tuples, skipping headers that are not
allowed with the current response status code. """
return self.headerlist
@property
def headerlist(self):
""" WSGI conform list of (header, value) tuples. """
out = []
headers = list(self._headers.items())
if 'Content-Type' not in self._headers:
headers.append(('Content-Type', [self.default_content_type]))
if self._status_code in self.bad_headers:
bad_headers = self.bad_headers[self._status_code]
headers = [h for h in headers if h[0] not in bad_headers]
out += [(name, val) for (name, vals) in headers for val in vals]
if self._cookies:
for c in self._cookies.values():
out.append(('Set-Cookie', c.OutputString()))
if py3k:
out = [
(k, v.encode('utf8').decode('latin1')
if isinstance(v, unicode) else v) for (k, v) in out]
return out
content_type = HeaderProperty('Content-Type')
content_length = HeaderProperty('Content-Length', reader=int)
expires = HeaderProperty('Expires',
reader=lambda x: datetime.utcfromtimestamp(parse_date(x)),
writer=lambda x: http_date(x))
@property
def charset(self, default='UTF-8'):
""" Return the charset specified in the content-type header (default: utf8). """
if 'charset=' in self.content_type:
return self.content_type.split('charset=')[-1].split(';')[0].strip()
return default
def set_cookie(self, name, value, secret=None, **options):
""" Create a new cookie or replace an old one. If the `secret` parameter is
set, create a `Signed Cookie` (described below).
:param name: the name of the cookie.
:param value: the value of the cookie.
:param secret: a signature key required for signed cookies.
Additionally, this method accepts all RFC 2109 attributes that are
supported by :class:`cookie.Morsel`, including:
:param max_age: maximum age in seconds. (default: None)
:param expires: a datetime object or UNIX timestamp. (default: None)
:param domain: the domain that is allowed to read the cookie.
(default: current domain)
:param path: limits the cookie to a given path (default: current path)
:param secure: limit the cookie to HTTPS connections (default: off).
:param httponly: prevents client-side JavaScript from reading this cookie
(default: off, requires Python 2.6 or newer).
If neither `expires` nor `max_age` is set (default), the cookie will
expire at the end of the browser session (as soon as the browser
window is closed).
Signed cookies may store any pickle-able object and are
cryptographically signed to prevent manipulation. Keep in mind that
cookies are limited to 4kb in most browsers.
Warning: Signed cookies are not encrypted (the client can still see
the content) and not copy-protected (the client can restore an old
cookie). The main intention is to make pickling and unpickling
safe, not to store secret information on the client side.
"""
if not self._cookies:
self._cookies = SimpleCookie()
if secret:
value = touni(cookie_encode((name, value), secret))
elif not isinstance(value, basestring):
raise TypeError('Secret key missing for non-string Cookie.')
if len(value) > 4096: raise ValueError('Cookie value too long.')
self._cookies[name] = value
for key, value in options.items():
if key == 'max_age':
if isinstance(value, timedelta):
value = value.seconds + value.days * 24 * 3600
if key == 'expires':
if isinstance(value, (datedate, datetime)):
value = value.timetuple()
elif isinstance(value, (int, float)):
value = time.gmtime(value)
value = time.strftime("%a, %d %b %Y %H:%M:%S GMT", value)
self._cookies[name][key.replace('_', '-')] = value
def delete_cookie(self, key, **kwargs):
""" Delete a cookie. Be sure to use the same `domain` and `path`
settings as used to create the cookie. """
kwargs['max_age'] = -1
kwargs['expires'] = 0
self.set_cookie(key, '', **kwargs)
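# Usage sketch (editor's addition): creating and deleting cookies on a
# response. The `secret` variant pickles and HMAC-signs the value via
# cookie_encode(); reading it back later requires the same secret.
#
#     resp = BaseResponse()
#     resp.set_cookie('lang', 'en', max_age=3600, path='/')
#     resp.set_cookie('account', ('alice', 42), secret='s3cr3t')
#     resp.delete_cookie('lang', path='/')  # same path as on creation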
def __repr__(self):
out = ''
for name, value in self.headerlist:
out += '%s: %s\n' % (name.title(), value.strip())
return out
def _local_property():
ls = threading.local()
def fget(_):
try: return ls.var
except AttributeError:
raise RuntimeError("Request context not initialized.")
def fset(_, value): ls.var = value
def fdel(_): del ls.var
return property(fget, fset, fdel, 'Thread-local property')
class LocalRequest(BaseRequest):
""" A thread-local subclass of :class:`BaseRequest` with a different
set of attributes for each thread. There is usually only one global
instance of this class (:data:`request`). If accessed during a
request/response cycle, this instance always refers to the *current*
request (even on a multithreaded server). """
bind = BaseRequest.__init__
environ = _local_property()
class LocalResponse(BaseResponse):
""" A thread-local subclass of :class:`BaseResponse` with a different
set of attributes for each thread. There is usually only one global
instance of this class (:data:`response`). Its attributes are used
to build the HTTP response at the end of the request/response cycle.
"""
bind = BaseResponse.__init__
_status_line = _local_property()
_status_code = _local_property()
_cookies = _local_property()
_headers = _local_property()
body = _local_property()
Request = BaseRequest
Response = BaseResponse
class HTTPResponse(Response, BottleException):
def __init__(self, body='', status=None, headers=None, **more_headers):
super(HTTPResponse, self).__init__(body, status, headers, **more_headers)
def apply(self, other):
other._status_code = self._status_code
other._status_line = self._status_line
other._headers = self._headers
other._cookies = self._cookies
other.body = self.body
class HTTPError(HTTPResponse):
default_status = 500
def __init__(self, status=None, body=None, exception=None, traceback=None,
**options):
self.exception = exception
self.traceback = traceback
super(HTTPError, self).__init__(body, status, **options)
###############################################################################
# Plugins ######################################################################
###############################################################################
class PluginError(BottleException): pass
class JSONPlugin(object):
name = 'json'
api = 2
def __init__(self, json_dumps=json_dumps):
self.json_dumps = json_dumps
def apply(self, callback, _):
dumps = self.json_dumps
if not dumps: return callback
def wrapper(*a, **ka):
try:
rv = callback(*a, **ka)
except HTTPError:
rv = _e()
if isinstance(rv, dict):
#Attempt to serialize, raises exception on failure
json_response = dumps(rv)
#Set content type only if serialization successful
response.content_type = 'application/json'
return json_response
elif isinstance(rv, HTTPResponse) and isinstance(rv.body, dict):
rv.body = dumps(rv.body)
rv.content_type = 'application/json'
return rv
return wrapper
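# Illustrative sketch (assumes a default Bottle app, where JSONPlugin is
# installed automatically): a route returning a dict is serialized to JSON
# and the Content-Type header is set to application/json.
#
#     @route('/status')
#     def status():
#         return {'ok': True, 'version': 2}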
class TemplatePlugin(object):
""" This plugin applies the :func:`view` decorator to all routes with a
`template` config parameter. If the parameter is a tuple, the second
element must be a dict with additional options (e.g. `template_engine`)
or default variables for the template. """
name = 'template'
api = 2
def apply(self, callback, route):
conf = route.config.get('template')
if isinstance(conf, (tuple, list)) and len(conf) == 2:
return view(conf[0], **conf[1])(callback)
elif isinstance(conf, str):
return view(conf)(callback)
else:
return callback
#: Not a plugin, but part of the plugin API. TODO: Find a better place.
class _ImportRedirect(object):
def __init__(self, name, impmask):
""" Create a virtual package that redirects imports (see PEP 302). """
self.name = name
self.impmask = impmask
self.module = sys.modules.setdefault(name, imp.new_module(name))
self.module.__dict__.update({'__file__': __file__, '__path__': [],
'__all__': [], '__loader__': self})
sys.meta_path.append(self)
def find_module(self, fullname, path=None):
if '.' not in fullname: return
packname = fullname.rsplit('.', 1)[0]
if packname != self.name: return
return self
def load_module(self, fullname):
if fullname in sys.modules: return sys.modules[fullname]
modname = fullname.rsplit('.', 1)[1]
realname = self.impmask % modname
__import__(realname)
module = sys.modules[fullname] = sys.modules[realname]
setattr(self.module, modname, module)
module.__loader__ = self
return module
###############################################################################
# Common Utilities #############################################################
###############################################################################
class MultiDict(DictMixin):
""" This dict stores multiple values per key, but behaves exactly like a
normal dict in that it returns only the newest value for any given key.
There are special methods available to access the full list of values.
"""
def __init__(self, *a, **k):
self.dict = dict((k, [v]) for (k, v) in dict(*a, **k).items())
def __len__(self): return len(self.dict)
def __iter__(self): return iter(self.dict)
def __contains__(self, key): return key in self.dict
def __delitem__(self, key): del self.dict[key]
def __getitem__(self, key): return self.dict[key][-1]
def __setitem__(self, key, value): self.append(key, value)
def keys(self): return self.dict.keys()
if py3k:
def values(self): return (v[-1] for v in self.dict.values())
def items(self): return ((k, v[-1]) for k, v in self.dict.items())
def allitems(self):
return ((k, v) for k, vl in self.dict.items() for v in vl)
iterkeys = keys
itervalues = values
iteritems = items
iterallitems = allitems
else:
def values(self): return [v[-1] for v in self.dict.values()]
def items(self): return [(k, v[-1]) for k, v in self.dict.items()]
def iterkeys(self): return self.dict.iterkeys()
def itervalues(self): return (v[-1] for v in self.dict.itervalues())
def iteritems(self):
return ((k, v[-1]) for k, v in self.dict.iteritems())
def iterallitems(self):
return ((k, v) for k, vl in self.dict.iteritems() for v in vl)
def allitems(self):
return [(k, v) for k, vl in self.dict.iteritems() for v in vl]
def get(self, key, default=None, index=-1, type=None):
""" Return the most recent value for a key.
:param default: The default value to be returned if the key is not
present or the type conversion fails.
:param index: An index for the list of available values.
:param type: If defined, this callable is used to cast the value
into a specific type. Exceptions are suppressed and result in
the default value being returned.
"""
try:
val = self.dict[key][index]
return type(val) if type else val
except Exception:
pass
return default
def append(self, key, value):
""" Add a new value to the list of values for this key. """
self.dict.setdefault(key, []).append(value)
def replace(self, key, value):
""" Replace the list of values with a single value. """
self.dict[key] = [value]
def getall(self, key):
""" Return a (possibly empty) list of values for a key. """
return self.dict.get(key) or []
#: Aliases for WTForms to mimic other multi-dict APIs (Django)
getone = get
getlist = getall
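# Quick demonstration (editor's sketch) of MultiDict semantics: item access
# returns the newest value, while getall()/allitems() expose every value
# stored under a key.
#
#     md = MultiDict(a=1)
#     md['a'] = 2                       # appends, does not replace
#     assert md['a'] == 2
#     assert md.getall('a') == [1, 2]
#     assert md.get('a', type=str) == '2'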
class FormsDict(MultiDict):
""" This :class:`MultiDict` subclass is used to store request form data.
In addition to the normal dict-like item access methods (which return
unmodified data as native strings), this container also supports
attribute-like access to its values. Attributes are automatically de-
or recoded to match :attr:`input_encoding` (default: 'utf8'). Missing
attributes default to an empty string. """
#: Encoding used for attribute values.
input_encoding = 'utf8'
#: If true (default), unicode strings are first encoded with `latin1`
#: and then decoded to match :attr:`input_encoding`.
recode_unicode = True
def _fix(self, s, encoding=None):
if isinstance(s, unicode) and self.recode_unicode: # Python 3 WSGI
return s.encode('latin1').decode(encoding or self.input_encoding)
elif isinstance(s, bytes): # Python 2 WSGI
return s.decode(encoding or self.input_encoding)
else:
return s
def decode(self, encoding=None):
""" Returns a copy with all keys and values de- or recoded to match
:attr:`input_encoding`. Some libraries (e.g. WTForms) want a
unicode dictionary. """
copy = FormsDict()
enc = copy.input_encoding = encoding or self.input_encoding
copy.recode_unicode = False
for key, value in self.allitems():
copy.append(self._fix(key, enc), self._fix(value, enc))
return copy
def getunicode(self, name, default=None, encoding=None):
""" Return the value as a unicode string, or the default. """
try:
return self._fix(self[name], encoding)
except (UnicodeError, KeyError):
return default
def __getattr__(self, name, default=unicode()):
# Without this guard, pickle generates a cryptic TypeError:
if name.startswith('__') and name.endswith('__'):
return super(FormsDict, self).__getattr__(name)
return self.getunicode(name, default=default)
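# Sketch of FormsDict attribute access (editor's addition): attributes are
# decoded to unicode and missing keys fall back to an empty string, which
# keeps form-handling code free of KeyError guards.
#
#     form = FormsDict(name='Alice')
#     assert form.name == 'Alice'
#     assert form.missing == ''             # no KeyError, empty default
#     assert form.get('missing') is None    # plain dict access is unchanged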
class HeaderDict(MultiDict):
""" A case-insensitive version of :class:`MultiDict` that defaults to
replace the old value instead of appending it. """
def __init__(self, *a, **ka):
self.dict = {}
if a or ka: self.update(*a, **ka)
def __contains__(self, key): return _hkey(key) in self.dict
def __delitem__(self, key): del self.dict[_hkey(key)]
def __getitem__(self, key): return self.dict[_hkey(key)][-1]
def __setitem__(self, key, value): self.dict[_hkey(key)] = [str(value)]
def append(self, key, value):
self.dict.setdefault(_hkey(key), []).append(str(value))
def replace(self, key, value): self.dict[_hkey(key)] = [str(value)]
def getall(self, key): return self.dict.get(_hkey(key)) or []
def get(self, key, default=None, index=-1):
return MultiDict.get(self, _hkey(key), default, index)
def filter(self, names):
for name in [_hkey(n) for n in names]:
if name in self.dict:
del self.dict[name]
class WSGIHeaderDict(DictMixin):
""" This dict-like class wraps a WSGI environ dict and provides convenient
access to HTTP_* fields. Keys and values are native strings
(2.x bytes or 3.x unicode) and keys are case-insensitive. If the WSGI
environment contains non-native string values, these are de- or encoded
using a lossless 'latin1' character set.
The API will remain stable even on changes to the relevant PEPs.
Currently PEP 333, 444 and 3333 are supported. (PEP 444 is the only one
that uses non-native strings.)
"""
#: List of keys that do not have a ``HTTP_`` prefix.
cgikeys = ('CONTENT_TYPE', 'CONTENT_LENGTH')
def __init__(self, environ):
self.environ = environ
def _ekey(self, key):
""" Translate header field name to CGI/WSGI environ key. """
key = key.replace('-','_').upper()
if key in self.cgikeys:
return key
return 'HTTP_' + key
def raw(self, key, default=None):
""" Return the header value as is (may be bytes or unicode). """
return self.environ.get(self._ekey(key), default)
def __getitem__(self, key):
val = self.environ[self._ekey(key)]
if py3k:
if isinstance(val, unicode):
val = val.encode('latin1').decode('utf8')
else:
val = val.decode('utf8')
return val
def __setitem__(self, key, value):
raise TypeError("%s is read-only." % self.__class__)
def __delitem__(self, key):
raise TypeError("%s is read-only." % self.__class__)
def __iter__(self):
for key in self.environ:
if key[:5] == 'HTTP_':
yield _hkey(key[5:])
elif key in self.cgikeys:
yield _hkey(key)
def keys(self): return [x for x in self]
def __len__(self): return len(self.keys())
def __contains__(self, key): return self._ekey(key) in self.environ
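# Editor's sketch: WSGIHeaderDict provides case-insensitive, header-style
# access to a raw WSGI environ without copying it.
#
#     environ = {'HTTP_USER_AGENT': 'curl/7.0', 'CONTENT_TYPE': 'text/plain'}
#     headers = WSGIHeaderDict(environ)
#     assert headers['user-agent'] == 'curl/7.0'
#     assert headers['Content-Type'] == 'text/plain'
#     assert sorted(headers.keys()) == ['Content-Type', 'User-Agent']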
class ConfigDict(dict):
""" A dict-like configuration storage with additional support for
namespaces, validators, meta-data, on_change listeners and more.
"""
__slots__ = ('_meta', '_on_change')
def __init__(self):
self._meta = {}
self._on_change = lambda name, value: None
def load_config(self, filename):
""" Load values from an ``*.ini`` style config file.
If the config file contains sections, their names are used as
namespaces for the values within. The two special sections
``DEFAULT`` and ``bottle`` refer to the root namespace (no prefix).
"""
conf = ConfigParser()
conf.read(filename)
for section in conf.sections():
for key, value in conf.items(section):
if section not in ('DEFAULT', 'bottle'):
key = section + '.' + key
self[key] = value
return self
def load_dict(self, source, namespace=''):
""" Load values from a dictionary structure. Nesting can be used to
represent namespaces.
>>> c = ConfigDict()
>>> c.load_dict({'some': {'namespace': {'key': 'value'} } })
{'some.namespace.key': 'value'}
"""
for key, value in source.items():
if isinstance(key, str):
nskey = (namespace + '.' + key).strip('.')
if isinstance(value, dict):
self.load_dict(value, namespace=nskey)
else:
self[nskey] = value
else:
raise TypeError('Key has type %r (not a string)' % type(key))
return self
def update(self, *a, **ka):
""" If the first parameter is a string, all keys are prefixed with this
namespace. Apart from that it works just as the usual dict.update().
Example: ``update('some.namespace', key='value')`` """
prefix = ''
if a and isinstance(a[0], str):
prefix = a[0].strip('.') + '.'
a = a[1:]
for key, value in dict(*a, **ka).items():
self[prefix+key] = value
def setdefault(self, key, value):
if key not in self:
self[key] = value
return self[key]
def __setitem__(self, key, value):
if not isinstance(key, str):
raise TypeError('Key has type %r (not a string)' % type(key))
value = self.meta_get(key, 'filter', lambda x: x)(value)
if key in self and self[key] is value:
return
self._on_change(key, value)
dict.__setitem__(self, key, value)
def __delitem__(self, key):
self._on_change(key, None)
dict.__delitem__(self, key)
def meta_get(self, key, metafield, default=None):
""" Return the value of a meta field for a key. """
return self._meta.get(key, {}).get(metafield, default)
def meta_set(self, key, metafield, value):
""" Set the meta field for a key to a new value. This triggers the
on-change handler for existing keys. """
self._meta.setdefault(key, {})[metafield] = value
if key in self:
self[key] = self[key]
def meta_list(self, key):
""" Return an iterable of meta field names defined for a key. """
return self._meta.get(key, {}).keys()
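# Editor's sketch of ConfigDict namespacing and meta data: nested dicts are
# flattened into dotted keys, and a 'filter' meta field coerces values on
# assignment (and is re-applied to existing keys by meta_set).
#
#     c = ConfigDict()
#     c.load_dict({'db': {'port': '5432'}})
#     assert c['db.port'] == '5432'
#     c.meta_set('db.port', 'filter', int)
#     assert c['db.port'] == 5432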
class AppStack(list):
""" A stack-like list. Calling it returns the head of the stack. """
def __call__(self):
""" Return the current default application. """
return self[-1]
def push(self, value=None):
""" Add a new :class:`Bottle` instance to the stack """
if not isinstance(value, Bottle):
value = Bottle()
self.append(value)
return value
class WSGIFileWrapper(object):
def __init__(self, fp, buffer_size=1024*64):
self.fp, self.buffer_size = fp, buffer_size
for attr in ('fileno', 'close', 'read', 'readlines', 'tell', 'seek'):
if hasattr(fp, attr): setattr(self, attr, getattr(fp, attr))
def __iter__(self):
buff, read = self.buffer_size, self.read
while True:
part = read(buff)
if not part: return
yield part
class _closeiter(object):
""" This only exists to be able to attach a .close method to iterators that
do not support attribute assignment (most of itertools). """
def __init__(self, iterator, close=None):
self.iterator = iterator
self.close_callbacks = makelist(close)
def __iter__(self):
return iter(self.iterator)
def close(self):
for func in self.close_callbacks:
func()
class ResourceManager(object):
""" This class manages a list of search paths and helps to find and open
application-bound resources (files).
:param base: default value for :meth:`add_path` calls.
:param opener: callable used to open resources.
:param cachemode: controls which lookups are cached. One of 'all',
'found' or 'none'.
"""
def __init__(self, base='./', opener=open, cachemode='all'):
self.opener = opener
self.base = base
self.cachemode = cachemode
#: A list of search paths. See :meth:`add_path` for details.
self.path = []
#: A cache for resolved paths. ``res.cache.clear()`` clears the cache.
self.cache = {}
def add_path(self, path, base=None, index=None, create=False):
""" Add a new path to the list of search paths. Return False if the
path does not exist.
:param path: The new search path. Relative paths are turned into
an absolute and normalized form. If the path looks like a file
(not ending in `/`), the filename is stripped off.
:param base: Path used to absolutize relative search paths.
Defaults to :attr:`base` which defaults to ``os.getcwd()``.
:param index: Position within the list of search paths. Defaults
to last index (appends to the list).
The `base` parameter makes it easy to reference files installed
along with a Python module or package::
res.add_path('./resources/', __file__)
"""
base = os.path.abspath(os.path.dirname(base or self.base))
path = os.path.abspath(os.path.join(base, os.path.dirname(path)))
path += os.sep
if path in self.path:
self.path.remove(path)
if create and not os.path.isdir(path):
os.makedirs(path)
if index is None:
self.path.append(path)
else:
self.path.insert(index, path)
self.cache.clear()
return os.path.exists(path)
def __iter__(self):
""" Iterate over all existing files in all registered paths. """
search = self.path[:]
while search:
path = search.pop()
if not os.path.isdir(path): continue
for name in os.listdir(path):
full = os.path.join(path, name)
if os.path.isdir(full): search.append(full)
else: yield full
def lookup(self, name):
""" Search for a resource and return an absolute file path, or `None`.
The :attr:`path` list is searched in order. The first match is
returned. Symlinks are followed. The result is cached to speed up
future lookups. """
if name not in self.cache or DEBUG:
for path in self.path:
fpath = os.path.join(path, name)
if os.path.isfile(fpath):
if self.cachemode in ('all', 'found'):
self.cache[name] = fpath
return fpath
if self.cachemode == 'all':
self.cache[name] = None
return self.cache[name]
def open(self, name, mode='r', *args, **kwargs):
""" Find a resource and return a file object, or raise IOError. """
fname = self.lookup(name)
if not fname: raise IOError("Resource %r not found." % name)
return self.opener(fname, mode=mode, *args, **kwargs)
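# Usage sketch for ResourceManager (editor's addition; the paths are
# hypothetical): register search paths relative to a module file, then look
# up or open resources by name.
#
#     res = ResourceManager()
#     res.add_path('./data/', base=__file__)
#     found = res.lookup('defaults.json')   # absolute path or None
#     if found:
#         with res.open('defaults.json') as fp:
#             raw = fp.read()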
class FileUpload(object):
def __init__(self, fileobj, name, filename, headers=None):
""" Wrapper for file uploads. """
#: Open file(-like) object (BytesIO buffer or temporary file)
self.file = fileobj
#: Name of the upload form field
self.name = name
#: Raw filename as sent by the client (may contain unsafe characters)
self.raw_filename = filename
#: A :class:`HeaderDict` with additional headers (e.g. content-type)
self.headers = HeaderDict(headers) if headers else HeaderDict()
content_type = HeaderProperty('Content-Type')
content_length = HeaderProperty('Content-Length', reader=int, default=-1)
@cached_property
def filename(self):
""" Name of the file on the client file system, but normalized to ensure
file system compatibility. An empty filename is returned as 'empty'.
Only ASCII letters, digits, dashes, underscores and dots are
allowed in the final filename. Accents are removed, if possible.
Whitespace is replaced by a single dash. Leading or trailing dots
or dashes are removed. The filename is limited to 255 characters.
"""
fname = self.raw_filename
if not isinstance(fname, unicode):
fname = fname.decode('utf8', 'ignore')
fname = normalize('NFKD', fname).encode('ASCII', 'ignore').decode('ASCII')
fname = os.path.basename(fname.replace('\\', os.path.sep))
fname = re.sub(r'[^a-zA-Z0-9-_.\s]', '', fname).strip()
fname = re.sub(r'[-\s]+', '-', fname).strip('.-')
return fname[:255] or 'empty'
def _copy_file(self, fp, chunk_size=2**16):
read, write, offset = self.file.read, fp.write, self.file.tell()
while 1:
buf = read(chunk_size)
if not buf: break
write(buf)
self.file.seek(offset)
def save(self, destination, overwrite=False, chunk_size=2**16):
""" Save file to disk or copy its content to an open file(-like) object.
If *destination* is a directory, :attr:`filename` is added to the
path. Existing files are not overwritten by default (IOError).
:param destination: File path, directory or file(-like) object.
:param overwrite: If True, replace existing files. (default: False)
:param chunk_size: Bytes to read at a time. (default: 64 KB)
"""
if isinstance(destination, basestring): # Except file-likes here
if os.path.isdir(destination):
destination = os.path.join(destination, self.filename)
if not overwrite and os.path.exists(destination):
raise IOError('File exists.')
with open(destination, 'wb') as fp:
self._copy_file(fp, chunk_size)
else:
self._copy_file(destination, chunk_size)
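# Sketch of a file-upload handler (editor's addition; the route and target
# directory are hypothetical). request.files yields FileUpload instances;
# .filename is the sanitized name and .save() refuses to overwrite existing
# files unless overwrite=True is passed.
#
#     @route('/upload', method='POST')
#     def do_upload():
#         upload = request.files.get('data')
#         if upload is None:
#             abort(400, 'Missing file field "data".')
#         upload.save('/tmp/uploads')   # directory: filename is appended
#         return 'Saved as %s' % upload.filename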
###############################################################################
# Application Helper ###########################################################
###############################################################################
def abort(code=500, text='Unknown Error.'):
""" Aborts execution and causes a HTTP error. """
raise HTTPError(code, text)
def redirect(url, code=None):
""" Aborts execution and causes a 303 or 302 redirect, depending on
the HTTP protocol version. """
if not code:
code = 303 if request.get('SERVER_PROTOCOL') == "HTTP/1.1" else 302
res = response.copy(cls=HTTPResponse)
res.status = code
res.body = ""
res.set_header('Location', urljoin(request.url, url))
raise res
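# Minimal sketch (editor's addition) of abort() and redirect() inside
# handlers; both raise, so nothing after them runs.
#
#     @route('/old')
#     def old():
#         redirect('/new')              # 303 on HTTP/1.1, else 302
#
#     @route('/admin')
#     def admin():
#         abort(401, 'Sorry, access denied.')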
def _file_iter_range(fp, offset, bytes, maxread=1024*1024):
""" Yield chunks from a range in a file. No chunk is bigger than maxread."""
fp.seek(offset)
while bytes > 0:
part = fp.read(min(bytes, maxread))
if not part: break
bytes -= len(part)
yield part
def static_file(filename, root, mimetype='auto', download=False, charset='UTF-8'):
""" Open a file in a safe way and return :exc:`HTTPResponse` with status
code 200, 304, 403 or 404. The ``Content-Type``, ``Content-Encoding``,
``Content-Length`` and ``Last-Modified`` headers are set if possible.
Special support for ``If-Modified-Since``, ``Range`` and ``HEAD``
requests.
:param filename: Name or path of the file to send.
:param root: Root path for file lookups. Should be an absolute directory
path.
:param mimetype: Defines the content-type header (default: guess from
file extension)
:param download: If True, ask the browser to open a `Save as...` dialog
instead of opening the file with the associated program. You can
specify a custom filename as a string. If not specified, the
original filename is used (default: False).
:param charset: The charset to use for files with a ``text/*``
mime-type. (default: UTF-8)
"""
root = os.path.abspath(root) + os.sep
filename = os.path.abspath(os.path.join(root, filename.strip('/\\')))
headers = dict()
if not filename.startswith(root):
return HTTPError(403, "Access denied.")
if not os.path.exists(filename) or not os.path.isfile(filename):
return HTTPError(404, "File does not exist.")
if not os.access(filename, os.R_OK):
return HTTPError(403, "You do not have permission to access this file.")
if mimetype == 'auto':
mimetype, encoding = mimetypes.guess_type(filename)
if encoding: headers['Content-Encoding'] = encoding
if mimetype:
if mimetype[:5] == 'text/' and charset and 'charset' not in mimetype:
mimetype += '; charset=%s' % charset
headers['Content-Type'] = mimetype
if download:
download = os.path.basename(filename if download == True else download)
headers['Content-Disposition'] = 'attachment; filename="%s"' % download
stats = os.stat(filename)
headers['Content-Length'] = clen = stats.st_size
lm = time.strftime("%a, %d %b %Y %H:%M:%S GMT", time.gmtime(stats.st_mtime))
headers['Last-Modified'] = lm
ims = request.environ.get('HTTP_IF_MODIFIED_SINCE')
if ims:
ims = parse_date(ims.split(";")[0].strip())
if ims is not None and ims >= int(stats.st_mtime):
headers['Date'] = time.strftime("%a, %d %b %Y %H:%M:%S GMT", time.gmtime())
return HTTPResponse(status=304, **headers)
body = '' if request.method == 'HEAD' else open(filename, 'rb')
headers["Accept-Ranges"] = "bytes"
ranges = request.environ.get('HTTP_RANGE')
if ranges:
    ranges = list(parse_range_header(ranges, clen))
if not ranges:
return HTTPError(416, "Requested Range Not Satisfiable")
offset, end = ranges[0]
headers["Content-Range"] = "bytes %d-%d/%d" % (offset, end-1, clen)
headers["Content-Length"] = str(end-offset)
if body: body = _file_iter_range(body, offset, end-offset)
return HTTPResponse(body, status=206, **headers)
return HTTPResponse(body, **headers)
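# Typical static_file usage (editor's sketch; the root path is hypothetical).
# The wildcard route passes the requested path along, and static_file takes
# care of Content-Type guessing, Range requests and If-Modified-Since.
#
#     @route('/static/<filepath:path>')
#     def serve_static(filepath):
#         return static_file(filepath, root='/var/www/static')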
###############################################################################
# HTTP Utilities and MISC (TODO) ###############################################
###############################################################################
def debug(mode=True):
""" Change the debug level.
There is only one debug level supported at the moment."""
global DEBUG
if mode: warnings.simplefilter('default')
DEBUG = bool(mode)
def http_date(value):
if isinstance(value, (datedate, datetime)):
value = value.utctimetuple()
elif isinstance(value, (int, float)):
value = time.gmtime(value)
if not isinstance(value, basestring):
value = time.strftime("%a, %d %b %Y %H:%M:%S GMT", value)
return value
def parse_date(ims):
""" Parse rfc1123, rfc850 and asctime timestamps and return UTC epoch. """
try:
ts = email.utils.parsedate_tz(ims)
return time.mktime(ts[:8] + (0,)) - (ts[9] or 0) - time.timezone
except (TypeError, ValueError, IndexError, OverflowError):
return None
def parse_auth(header):
""" Parse rfc2617 HTTP authentication header string (basic) and return (user,pass) tuple or None"""
try:
method, data = header.split(None, 1)
if method.lower() == 'basic':
user, pwd = touni(base64.b64decode(tob(data))).split(':',1)
return user, pwd
except (KeyError, ValueError):
return None
def parse_range_header(header, maxlen=0):
""" Yield (start, end) ranges parsed from a HTTP Range header. Skip
unsatisfiable ranges. The end index is non-inclusive."""
if not header or header[:6] != 'bytes=': return
ranges = [r.split('-', 1) for r in header[6:].split(',') if '-' in r]
for start, end in ranges:
try:
if not start: # bytes=-100 -> last 100 bytes
start, end = max(0, maxlen-int(end)), maxlen
elif not end: # bytes=100- -> all but the first 100 bytes
start, end = int(start), maxlen
else: # bytes=100-200 -> bytes 100-200 (inclusive)
start, end = int(start), min(int(end)+1, maxlen)
if 0 <= start < end <= maxlen:
yield start, end
except ValueError:
pass
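# Doctest-style sketch: parsing a Range header for a 1000-byte resource. End
# indices are non-inclusive and unsatisfiable ranges are skipped.
#
#     >>> list(parse_range_header('bytes=0-99,500-,-100', 1000))
#     [(0, 100), (500, 1000), (900, 1000)]
#     >>> list(parse_range_header('bytes=5000-', 1000))
#     []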
def _parse_qsl(qs):
r = []
for pair in qs.replace(';','&').split('&'):
if not pair: continue
nv = pair.split('=', 1)
if len(nv) != 2: nv.append('')
key = urlunquote(nv[0].replace('+', ' '))
value = urlunquote(nv[1].replace('+', ' '))
r.append((key, value))
return r
def _lscmp(a, b):
""" Compares two strings in a cryptographically safe way:
Runtime is not affected by length of common prefix. """
return not sum(0 if x==y else 1 for x, y in zip(a, b)) and len(a) == len(b)
def cookie_encode(data, key):
""" Encode and sign a pickle-able object. Return a (byte) string """
msg = base64.b64encode(pickle.dumps(data, -1))
sig = base64.b64encode(hmac.new(tob(key), msg).digest())
return tob('!') + sig + tob('?') + msg
def cookie_decode(data, key):
""" Verify and decode an encoded string. Return an object or None."""
data = tob(data)
if cookie_is_encoded(data):
sig, msg = data.split(tob('?'), 1)
if _lscmp(sig[1:], base64.b64encode(hmac.new(tob(key), msg).digest())):
return pickle.loads(base64.b64decode(msg))
return None
def cookie_is_encoded(data):
""" Return True if the argument looks like a encoded cookie."""
return bool(data.startswith(tob('!')) and tob('?') in data)
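# Round-trip sketch for the signed-cookie helpers: cookie_encode() pickles,
# signs and base64-encodes; cookie_decode() returns None when the signature
# does not verify.
#
#     blob = cookie_encode(('user', 42), key='s3cr3t')
#     assert cookie_is_encoded(blob)
#     assert cookie_decode(blob, 's3cr3t') == ('user', 42)
#     assert cookie_decode(blob, 'wrong-key') is None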
def html_escape(string):
""" Escape HTML special characters ``&<>`` and quotes ``'"``. """
return string.replace('&','&amp;').replace('<','&lt;').replace('>','&gt;')\
             .replace('"','&quot;').replace("'",'&#039;')
def html_quote(string):
""" Escape and quote a string to be used as an HTTP attribute."""
return '"%s"' % html_escape(string).replace('\n',' ')\
.replace('\r',' ').replace('\t','	')
def yieldroutes(func):
""" Return a generator for routes that match the signature (name, args)
of the func parameter. This may yield more than one route if the function
takes optional keyword arguments. The output is best described by example::
a() -> '/a'
b(x, y) -> '/b/<x>/<y>'
c(x, y=5) -> '/c/<x>' and '/c/<x>/<y>'
d(x=5, y=6) -> '/d' and '/d/<x>' and '/d/<x>/<y>'
"""
path = '/' + func.__name__.replace('__','/').lstrip('/')
spec = getargspec(func)
argc = len(spec[0]) - len(spec[3] or [])
path += ('/<%s>' * argc) % tuple(spec[0][:argc])
yield path
for arg in spec[0][argc:]:
path += '/<%s>' % arg
yield path
def path_shift(script_name, path_info, shift=1):
""" Shift path fragments from PATH_INFO to SCRIPT_NAME and vice versa.
:return: The modified paths.
:param script_name: The SCRIPT_NAME path.
:param path_info: The PATH_INFO path.
:param shift: The number of path fragments to shift. May be negative to
change the shift direction. (default: 1)
"""
if shift == 0: return script_name, path_info
pathlist = path_info.strip('/').split('/')
scriptlist = script_name.strip('/').split('/')
if pathlist and pathlist[0] == '': pathlist = []
if scriptlist and scriptlist[0] == '': scriptlist = []
if 0 < shift <= len(pathlist):
moved = pathlist[:shift]
scriptlist = scriptlist + moved
pathlist = pathlist[shift:]
elif 0 > shift >= -len(scriptlist):
moved = scriptlist[shift:]
pathlist = moved + pathlist
scriptlist = scriptlist[:shift]
else:
empty = 'SCRIPT_NAME' if shift < 0 else 'PATH_INFO'
raise AssertionError("Cannot shift. Nothing left from %s" % empty)
new_script_name = '/' + '/'.join(scriptlist)
new_path_info = '/' + '/'.join(pathlist)
if path_info.endswith('/') and pathlist: new_path_info += '/'
return new_script_name, new_path_info
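# Worked example (editor's addition) of path_shift() in both directions:
#
#     >>> path_shift('/a', '/b/c', shift=1)
#     ('/a/b', '/c')
#     >>> path_shift('/a/b', '/c', shift=-1)
#     ('/a', '/b/c')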
def auth_basic(check, realm="private", text="Access denied"):
""" Callback decorator to require HTTP auth (basic).
TODO: Add route(check_auth=...) parameter. """
def decorator(func):
@functools.wraps(func)
def wrapper(*a, **ka):
user, password = request.auth or (None, None)
if user is None or not check(user, password):
err = HTTPError(401, text)
err.add_header('WWW-Authenticate', 'Basic realm="%s"' % realm)
return err
return func(*a, **ka)
return wrapper
return decorator
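# Hypothetical usage of auth_basic (editor's sketch): the check callback
# receives the credentials from the Authorization header and returns True
# to grant access.
#
#     def check(user, password):
#         return user == 'admin' and password == 'hunter2'
#
#     @route('/secret')
#     @auth_basic(check, realm='admin area')
#     def secret():
#         return 'hello, admin'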
# Shortcuts for common Bottle methods.
# They all refer to the current default application.
def make_default_app_wrapper(name):
""" Return a callable that relays calls to the current default app. """
@functools.wraps(getattr(Bottle, name))
def wrapper(*a, **ka):
return getattr(app(), name)(*a, **ka)
return wrapper
route = make_default_app_wrapper('route')
get = make_default_app_wrapper('get')
post = make_default_app_wrapper('post')
put = make_default_app_wrapper('put')
delete = make_default_app_wrapper('delete')
patch = make_default_app_wrapper('patch')
error = make_default_app_wrapper('error')
mount = make_default_app_wrapper('mount')
hook = make_default_app_wrapper('hook')
install = make_default_app_wrapper('install')
uninstall = make_default_app_wrapper('uninstall')
url = make_default_app_wrapper('get_url')
###############################################################################
# Server Adapter ###############################################################
###############################################################################
class ServerAdapter(object):
quiet = False
def __init__(self, host='127.0.0.1', port=8080, **options):
self.options = options
self.host = host
self.port = int(port)
def run(self, handler): # pragma: no cover
pass
def __repr__(self):
args = ', '.join(['%s=%s'%(k,repr(v)) for k, v in self.options.items()])
return "%s(%s)" % (self.__class__.__name__, args)
class CGIServer(ServerAdapter):
quiet = True
def run(self, handler): # pragma: no cover
from wsgiref.handlers import CGIHandler
def fixed_environ(environ, start_response):
environ.setdefault('PATH_INFO', '')
return handler(environ, start_response)
CGIHandler().run(fixed_environ)
class FlupFCGIServer(ServerAdapter):
def run(self, handler): # pragma: no cover
import flup.server.fcgi
self.options.setdefault('bindAddress', (self.host, self.port))
flup.server.fcgi.WSGIServer(handler, **self.options).run()
class WSGIRefServer(ServerAdapter):
def run(self, app): # pragma: no cover
from wsgiref.simple_server import make_server
from wsgiref.simple_server import WSGIRequestHandler, WSGIServer
import socket
class FixedHandler(WSGIRequestHandler):
def address_string(self): # Prevent reverse DNS lookups please.
return self.client_address[0]
def log_request(*args, **kw):
if not self.quiet:
return WSGIRequestHandler.log_request(*args, **kw)
handler_cls = self.options.get('handler_class', FixedHandler)
server_cls = self.options.get('server_class', WSGIServer)
if ':' in self.host: # Fix wsgiref for IPv6 addresses.
if getattr(server_cls, 'address_family') == socket.AF_INET:
class server_cls(server_cls):
address_family = socket.AF_INET6
self.srv = make_server(self.host, self.port, app, server_cls, handler_cls)
self.port = self.srv.server_port # update to the actual port (0 means random)
try:
self.srv.serve_forever()
except KeyboardInterrupt:
self.srv.server_close() # Prevent ResourceWarning: unclosed socket
raise
class CherryPyServer(ServerAdapter):
def run(self, handler): # pragma: no cover
from cherrypy import wsgiserver
self.options['bind_addr'] = (self.host, self.port)
self.options['wsgi_app'] = handler
certfile = self.options.get('certfile')
if certfile:
del self.options['certfile']
keyfile = self.options.get('keyfile')
if keyfile:
del self.options['keyfile']
server = wsgiserver.CherryPyWSGIServer(**self.options)
if certfile:
server.ssl_certificate = certfile
if keyfile:
server.ssl_private_key = keyfile
try:
server.start()
finally:
server.stop()
class WaitressServer(ServerAdapter):
def run(self, handler):
from waitress import serve
serve(handler, host=self.host, port=self.port, _quiet=self.quiet)
class PasteServer(ServerAdapter):
def run(self, handler): # pragma: no cover
from paste import httpserver
from paste.translogger import TransLogger
handler = TransLogger(handler, setup_console_handler=(not self.quiet))
httpserver.serve(handler, host=self.host, port=str(self.port),
**self.options)
class MeinheldServer(ServerAdapter):
def run(self, handler):
from meinheld import server
server.listen((self.host, self.port))
server.run(handler)
class FapwsServer(ServerAdapter):
""" Extremely fast webserver using libev. See http://www.fapws.org/ """
def run(self, handler): # pragma: no cover
import fapws._evwsgi as evwsgi
from fapws import base, config
port = self.port
if float(config.SERVER_IDENT[-2:]) > 0.4:
# fapws3 silently changed its API in 0.5
port = str(port)
evwsgi.start(self.host, port)
# fapws3 never releases the GIL. Complain upstream. I tried. No luck.
if 'BOTTLE_CHILD' in os.environ and not self.quiet:
_stderr("WARNING: Auto-reloading does not work with Fapws3.\n")
_stderr(" (Fapws3 breaks python thread support)\n")
evwsgi.set_base_module(base)
def app(environ, start_response):
environ['wsgi.multiprocess'] = False
return handler(environ, start_response)
evwsgi.wsgi_cb(('', app))
evwsgi.run()
class TornadoServer(ServerAdapter):
""" The super hyped asynchronous server by facebook. Untested. """
def run(self, handler): # pragma: no cover
import tornado.wsgi, tornado.httpserver, tornado.ioloop
container = tornado.wsgi.WSGIContainer(handler)
server = tornado.httpserver.HTTPServer(container)
server.listen(port=self.port,address=self.host)
tornado.ioloop.IOLoop.instance().start()
class AppEngineServer(ServerAdapter):
""" Adapter for Google App Engine. """
quiet = True
def run(self, handler):
from google.appengine.ext.webapp import util
# A main() function in the handler script enables 'App Caching'.
# Let's make sure it is there. This _really_ improves performance.
module = sys.modules.get('__main__')
if module and not hasattr(module, 'main'):
module.main = lambda: util.run_wsgi_app(handler)
util.run_wsgi_app(handler)
class TwistedServer(ServerAdapter):
""" Untested. """
def run(self, handler):
from twisted.web import server, wsgi
from twisted.python.threadpool import ThreadPool
from twisted.internet import reactor
thread_pool = ThreadPool()
thread_pool.start()
reactor.addSystemEventTrigger('after', 'shutdown', thread_pool.stop)
factory = server.Site(wsgi.WSGIResource(reactor, thread_pool, handler))
reactor.listenTCP(self.port, factory, interface=self.host)
if not reactor.running:
reactor.run()
class DieselServer(ServerAdapter):
""" Untested. """
def run(self, handler):
from diesel.protocols.wsgi import WSGIApplication
app = WSGIApplication(handler, port=self.port)
app.run()
class GeventServer(ServerAdapter):
""" Untested. Options:
* `fast` (default: False) uses libevent's http server, but has some
issues: No streaming, no pipelining, no SSL.
* See gevent.wsgi.WSGIServer() documentation for more options.
"""
def run(self, handler):
from gevent import wsgi, pywsgi, local
if not isinstance(threading.local(), local.local):
msg = "Bottle requires gevent.monkey.patch_all() (before import)"
raise RuntimeError(msg)
if not self.options.pop('fast', None): wsgi = pywsgi
self.options['log'] = None if self.quiet else 'default'
address = (self.host, self.port)
server = wsgi.WSGIServer(address, handler, **self.options)
if 'BOTTLE_CHILD' in os.environ:
import signal
signal.signal(signal.SIGINT, lambda s, f: server.stop())
server.serve_forever()
class GeventSocketIOServer(ServerAdapter):
def run(self,handler):
from socketio import server
address = (self.host, self.port)
server.SocketIOServer(address, handler, **self.options).serve_forever()
class GunicornServer(ServerAdapter):
""" Untested. See http://gunicorn.org/configure.html for options. """
def run(self, handler):
from gunicorn.app.base import Application
config = {'bind': "%s:%d" % (self.host, int(self.port))}
config.update(self.options)
class GunicornApplication(Application):
def init(self, parser, opts, args):
return config
def load(self):
return handler
GunicornApplication().run()
class EventletServer(ServerAdapter):
""" Untested. Options:
* `backlog` adjust the eventlet backlog parameter which is the maximum
number of queued connections. Should be at least 1; the maximum
value is system-dependent.
* `family`: (default is 2) socket family, optional. See socket
documentation for available families.
"""
def run(self, handler):
from eventlet import wsgi, listen, patcher
if not patcher.is_monkey_patched(os):
msg = "Bottle requires eventlet.monkey_patch() (before import)"
raise RuntimeError(msg)
socket_args = {}
for arg in ('backlog', 'family'):
try:
socket_args[arg] = self.options.pop(arg)
except KeyError:
pass
address = (self.host, self.port)
try:
wsgi.server(listen(address, **socket_args), handler,
log_output=(not self.quiet))
except TypeError:
# Fallback, if we have old version of eventlet
wsgi.server(listen(address), handler)
class RocketServer(ServerAdapter):
""" Untested. """
def run(self, handler):
from rocket import Rocket
server = Rocket((self.host, self.port), 'wsgi', { 'wsgi_app' : handler })
server.start()
class BjoernServer(ServerAdapter):
""" Fast server written in C: https://github.com/jonashaag/bjoern """
def run(self, handler):
from bjoern import run
run(handler, self.host, self.port)
class AutoServer(ServerAdapter):
""" Untested. """
adapters = [WaitressServer, PasteServer, TwistedServer, CherryPyServer, WSGIRefServer]
def run(self, handler):
for sa in self.adapters:
try:
return sa(self.host, self.port, **self.options).run(handler)
except ImportError:
pass
server_names = {
'cgi': CGIServer,
'flup': FlupFCGIServer,
'wsgiref': WSGIRefServer,
'waitress': WaitressServer,
'cherrypy': CherryPyServer,
'paste': PasteServer,
'fapws3': FapwsServer,
'tornado': TornadoServer,
'gae': AppEngineServer,
'twisted': TwistedServer,
'diesel': DieselServer,
'meinheld': MeinheldServer,
'gunicorn': GunicornServer,
'eventlet': EventletServer,
'gevent': GeventServer,
'geventSocketIO':GeventSocketIOServer,
'rocket': RocketServer,
'bjoern' : BjoernServer,
'auto': AutoServer,
}
###############################################################################
# Application Control ##########################################################
###############################################################################
def load(target, **namespace):
""" Import a module or fetch an object from a module.
* ``package.module`` returns `module` as a module object.
* ``pack.mod:name`` returns the module variable `name` from `pack.mod`.
* ``pack.mod:func()`` calls `pack.mod.func()` and returns the result.
The last form accepts not only function calls, but any type of
expression. Keyword arguments passed to this function are available as
local variables. Example: ``load('re:compile(x)', x='[a-z]')``
"""
module, target = target.split(":", 1) if ':' in target else (target, None)
if module not in sys.modules: __import__(module)
if not target: return sys.modules[module]
if target.isalnum(): return getattr(sys.modules[module], target)
package_name = module.split('.')[0]
namespace[package_name] = sys.modules[package_name]
return eval('%s.%s' % (module, target), namespace)
def load_app(target):
""" Load a bottle application from a module and make sure that the import
does not affect the current default application, but returns a separate
application object. See :func:`load` for the target parameter. """
global NORUN; NORUN, nr_old = True, NORUN
tmp = default_app.push() # Create a new "default application"
try:
rv = load(target) # Import the target module
return rv if callable(rv) else tmp
finally:
default_app.remove(tmp) # Remove the temporary added default application
NORUN = nr_old
_debug = debug
def run(app=None, server='wsgiref', host='127.0.0.1', port=8080,
interval=1, reloader=False, quiet=False, plugins=None,
debug=None, **kargs):
""" Start a server instance. This method blocks until the server terminates.
:param app: WSGI application or target string supported by
:func:`load_app`. (default: :func:`default_app`)
:param server: Server adapter to use. See :data:`server_names` keys
for valid names or pass a :class:`ServerAdapter` subclass.
(default: `wsgiref`)
:param host: Server address to bind to. Pass ``0.0.0.0`` to listen on
all interfaces including the external one. (default: 127.0.0.1)
:param port: Server port to bind to. Values below 1024 require root
privileges. (default: 8080)
:param reloader: Start auto-reloading server? (default: False)
:param interval: Auto-reloader interval in seconds (default: 1)
:param quiet: Suppress output to stdout and stderr? (default: False)
:param kargs: Additional options passed to the server adapter.
"""
if NORUN: return
if reloader and not os.environ.get('BOTTLE_CHILD'):
lockfile = None
try:
fd, lockfile = tempfile.mkstemp(prefix='bottle.', suffix='.lock')
os.close(fd) # We only need this file to exist. We never write to it
while os.path.exists(lockfile):
args = [sys.executable] + sys.argv
environ = os.environ.copy()
environ['BOTTLE_CHILD'] = 'true'
environ['BOTTLE_LOCKFILE'] = lockfile
p = subprocess.Popen(args, env=environ)
while p.poll() is None: # Busy wait...
os.utime(lockfile, None) # I am alive!
time.sleep(interval)
if p.poll() != 3:
if os.path.exists(lockfile): os.unlink(lockfile)
sys.exit(p.poll())
except KeyboardInterrupt:
pass
finally:
if os.path.exists(lockfile):
os.unlink(lockfile)
return
try:
if debug is not None: _debug(debug)
app = app or default_app()
if isinstance(app, basestring):
app = load_app(app)
if not callable(app):
raise ValueError("Application is not callable: %r" % app)
for plugin in plugins or []:
if isinstance(plugin, basestring):
plugin = load(plugin)
app.install(plugin)
if server in server_names:
server = server_names.get(server)
if isinstance(server, basestring):
server = load(server)
if isinstance(server, type):
server = server(host=host, port=port, **kargs)
if not isinstance(server, ServerAdapter):
raise ValueError("Unknown or unsupported server: %r" % server)
server.quiet = server.quiet or quiet
if not server.quiet:
_stderr("Bottle v%s server starting up (using %s)...\n" % (__version__, repr(server)))
_stderr("Listening on http://%s:%d/\n" % (server.host, server.port))
_stderr("Hit Ctrl-C to quit.\n\n")
if reloader:
lockfile = os.environ.get('BOTTLE_LOCKFILE')
bgcheck = FileCheckerThread(lockfile, interval)
with bgcheck:
server.run(app)
if bgcheck.status == 'reload':
sys.exit(3)
else:
server.run(app)
except KeyboardInterrupt:
pass
except (SystemExit, MemoryError):
raise
except:
if not reloader: raise
if not getattr(server, 'quiet', quiet):
print_exc()
time.sleep(interval)
sys.exit(3)
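# Minimal launch sketch (editor's addition): serve the default app with the
# built-in wsgiref adapter, enabling the auto-reloader for development.
#
#     if __name__ == '__main__':
#         run(host='localhost', port=8080, server='wsgiref',
#             reloader=True, debug=True)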
class FileCheckerThread(threading.Thread):
""" Interrupt main-thread as soon as a changed module file is detected,
the lockfile gets deleted or gets to old. """
def __init__(self, lockfile, interval):
threading.Thread.__init__(self)
self.daemon = True
self.lockfile, self.interval = lockfile, interval
#: Is one of 'reload', 'error' or 'exit'
self.status = None
def run(self):
exists = os.path.exists
mtime = lambda p: os.stat(p).st_mtime
files = dict()
for module in list(sys.modules.values()):
path = getattr(module, '__file__', '')
if path[-4:] in ('.pyo', '.pyc'): path = path[:-1]
if path and exists(path): files[path] = mtime(path)
while not self.status:
if not exists(self.lockfile)\
or mtime(self.lockfile) < time.time() - self.interval - 5:
self.status = 'error'
thread.interrupt_main()
for path, lmtime in list(files.items()):
if not exists(path) or mtime(path) > lmtime:
self.status = 'reload'
thread.interrupt_main()
break
time.sleep(self.interval)
def __enter__(self):
self.start()
def __exit__(self, exc_type, *_):
if not self.status: self.status = 'exit' # silent exit
self.join()
return exc_type is not None and issubclass(exc_type, KeyboardInterrupt)
###############################################################################
# Template Adapters ############################################################
###############################################################################
class TemplateError(HTTPError):
def __init__(self, message):
HTTPError.__init__(self, 500, message)
class BaseTemplate(object):
""" Base class and minimal API for template adapters """
extensions = ['tpl','html','thtml','stpl']
settings = {} #used in prepare()
defaults = {} #used in render()
def __init__(self, source=None, name=None, lookup=None, encoding='utf8', **settings):
""" Create a new template.
If the source parameter (str or buffer) is missing, the name argument
is used to guess a template filename. Subclasses can assume that
self.source and/or self.filename are set. Both are strings.
The lookup, encoding and settings parameters are stored as instance
variables.
The lookup parameter stores a list containing directory paths.
The encoding parameter should be used to decode byte strings or files.
The settings parameter contains a dict for engine-specific settings.
"""
self.name = name
self.source = source.read() if hasattr(source, 'read') else source
self.filename = source.filename if hasattr(source, 'filename') else None
self.lookup = [os.path.abspath(x) for x in lookup] if lookup else []
self.encoding = encoding
self.settings = self.settings.copy() # Copy from class variable
self.settings.update(settings) # Apply
if not self.source and self.name:
self.filename = self.search(self.name, self.lookup)
if not self.filename:
raise TemplateError('Template %s not found.' % repr(name))
if not self.source and not self.filename:
raise TemplateError('No template specified.')
self.prepare(**self.settings)
@classmethod
def search(cls, name, lookup=None):
""" Search name in all directories specified in lookup.
First without, then with common extensions. Return first hit. """
if not lookup:
depr('The template lookup path list should not be empty.', True) #0.12
lookup = ['.']
if os.path.isabs(name) and os.path.isfile(name):
depr('Absolute template path names are deprecated.', True) #0.12
return os.path.abspath(name)
for spath in lookup:
spath = os.path.abspath(spath) + os.sep
fname = os.path.abspath(os.path.join(spath, name))
if not fname.startswith(spath): continue
if os.path.isfile(fname): return fname
for ext in cls.extensions:
if os.path.isfile('%s.%s' % (fname, ext)):
return '%s.%s' % (fname, ext)
@classmethod
def global_config(cls, key, *args):
""" This reads or sets the global settings stored in class.settings. """
if args:
cls.settings = cls.settings.copy() # Make settings local to class
cls.settings[key] = args[0]
else:
return cls.settings[key]
def prepare(self, **options):
""" Run preparations (parsing, caching, ...).
It should be possible to call this again to refresh a template or to
update settings.
"""
raise NotImplementedError
def render(self, *args, **kwargs):
""" Render the template with the specified local variables and return
a single byte or unicode string. If it is a byte string, the encoding
must match self.encoding. This method must be thread-safe!
Local variables may be provided in dictionaries (args)
or directly, as keywords (kwargs).
"""
raise NotImplementedError
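# Hedged sketch (not part of bottle): a minimal custom adapter only needs to
# implement prepare() and render() on top of BaseTemplate. The class name is
# hypothetical.
#
#   class UpperTemplate(BaseTemplate):
#       def prepare(self, **options):
#           self.tpl = self.source or open(self.filename).read()
#       def render(self, *args, **kwargs):
#           return self.tpl.upper()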
class MakoTemplate(BaseTemplate):
def prepare(self, **options):
from mako.template import Template
from mako.lookup import TemplateLookup
options.update({'input_encoding':self.encoding})
options.setdefault('format_exceptions', bool(DEBUG))
lookup = TemplateLookup(directories=self.lookup, **options)
if self.source:
self.tpl = Template(self.source, lookup=lookup, **options)
else:
self.tpl = Template(uri=self.name, filename=self.filename, lookup=lookup, **options)
def render(self, *args, **kwargs):
for dictarg in args: kwargs.update(dictarg)
_defaults = self.defaults.copy()
_defaults.update(kwargs)
return self.tpl.render(**_defaults)
class CheetahTemplate(BaseTemplate):
def prepare(self, **options):
from Cheetah.Template import Template
self.context = threading.local()
self.context.vars = {}
options['searchList'] = [self.context.vars]
if self.source:
self.tpl = Template(source=self.source, **options)
else:
self.tpl = Template(file=self.filename, **options)
def render(self, *args, **kwargs):
for dictarg in args: kwargs.update(dictarg)
self.context.vars.update(self.defaults)
self.context.vars.update(kwargs)
out = str(self.tpl)
self.context.vars.clear()
return out
class Jinja2Template(BaseTemplate):
def prepare(self, filters=None, tests=None, globals={}, **kwargs):
from jinja2 import Environment, FunctionLoader
self.env = Environment(loader=FunctionLoader(self.loader), **kwargs)
if filters: self.env.filters.update(filters)
if tests: self.env.tests.update(tests)
if globals: self.env.globals.update(globals)
if self.source:
self.tpl = self.env.from_string(self.source)
else:
self.tpl = self.env.get_template(self.filename)
def render(self, *args, **kwargs):
for dictarg in args: kwargs.update(dictarg)
_defaults = self.defaults.copy()
_defaults.update(kwargs)
return self.tpl.render(**_defaults)
def loader(self, name):
fname = self.search(name, self.lookup)
if not fname: return
with open(fname, "rb") as f:
return f.read().decode(self.encoding)
class SimpleTemplate(BaseTemplate):
def prepare(self, escape_func=html_escape, noescape=False, syntax=None, **ka):
self.cache = {}
enc = self.encoding
self._str = lambda x: touni(x, enc)
self._escape = lambda x: escape_func(touni(x, enc))
self.syntax = syntax
if noescape:
self._str, self._escape = self._escape, self._str
@cached_property
def co(self):
return compile(self.code, self.filename or '<string>', 'exec')
@cached_property
def code(self):
source = self.source
if not source:
with open(self.filename, 'rb') as f:
source = f.read()
try:
source, encoding = touni(source), 'utf8'
except UnicodeError:
depr('Template encodings other than utf8 are no longer supported.') #0.11
source, encoding = touni(source, 'latin1'), 'latin1'
parser = StplParser(source, encoding=encoding, syntax=self.syntax)
code = parser.translate()
self.encoding = parser.encoding
return code
def _rebase(self, _env, _name=None, **kwargs):
_env['_rebase'] = (_name, kwargs)
def _include(self, _env, _name=None, **kwargs):
env = _env.copy()
env.update(kwargs)
if _name not in self.cache:
self.cache[_name] = self.__class__(name=_name, lookup=self.lookup)
return self.cache[_name].execute(env['_stdout'], env)
def execute(self, _stdout, kwargs):
env = self.defaults.copy()
env.update(kwargs)
env.update({'_stdout': _stdout, '_printlist': _stdout.extend,
'include': functools.partial(self._include, env),
'rebase': functools.partial(self._rebase, env), '_rebase': None,
'_str': self._str, '_escape': self._escape, 'get': env.get,
'setdefault': env.setdefault, 'defined': env.__contains__ })
eval(self.co, env)
if env.get('_rebase'):
subtpl, rargs = env.pop('_rebase')
rargs['base'] = ''.join(_stdout) #copy stdout
del _stdout[:] # clear stdout
return self._include(env, subtpl, **rargs)
return env
def render(self, *args, **kwargs):
""" Render the template using keyword arguments as local variables. """
env = {}; stdout = []
for dictarg in args: env.update(dictarg)
env.update(kwargs)
self.execute(stdout, env)
return ''.join(stdout)
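# Hedged usage sketch (illustrative values):
#
#   SimpleTemplate('Hello {{name}}!').render(name='World')  # -> 'Hello World!'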
class StplSyntaxError(TemplateError): pass
class StplParser(object):
""" Parser for stpl templates. """
_re_cache = {} #: Cache for compiled re patterns
# This huge pile of voodoo magic splits python code into 8 different tokens.
# 1: All kinds of python strings (trust me, it works)
_re_tok = '((?m)[urbURB]?(?:\'\'(?!\')|""(?!")|\'{6}|"{6}' \
'|\'(?:[^\\\\\']|\\\\.)+?\'|"(?:[^\\\\"]|\\\\.)+?"' \
'|\'{3}(?:[^\\\\]|\\\\.|\\n)+?\'{3}' \
'|"{3}(?:[^\\\\]|\\\\.|\\n)+?"{3}))'
_re_inl = _re_tok.replace('|\\n','') # We re-use this string pattern later
# 2: Comments (until end of line, but not the newline itself)
_re_tok += '|(#.*)'
# 3,4: Keywords that start or continue a python block (only start of line)
_re_tok += '|^([ \\t]*(?:if|for|while|with|try|def|class)\\b)' \
'|^([ \\t]*(?:elif|else|except|finally)\\b)'
# 5: Our special 'end' keyword (but only if it stands alone)
_re_tok += '|((?:^|;)[ \\t]*end[ \\t]*(?=(?:%(block_close)s[ \\t]*)?\\r?$|;|#))'
# 6: A customizable end-of-code-block template token (only end of line)
_re_tok += '|(%(block_close)s[ \\t]*(?=$))'
# 7: And finally, a single newline. The 8th token is 'everything else'
_re_tok += '|(\\r?\\n)'
# Match the start tokens of code areas in a template
_re_split = '(?m)^[ \t]*(\\\\?)((%(line_start)s)|(%(block_start)s))'
# Match inline statements (may contain python strings)
_re_inl = '%%(inline_start)s((?:%s|[^\'"\n]+?)*?)%%(inline_end)s' % _re_inl
default_syntax = '<% %> % {{ }}'
def __init__(self, source, syntax=None, encoding='utf8'):
self.source, self.encoding = touni(source, encoding), encoding
self.set_syntax(syntax or self.default_syntax)
self.code_buffer, self.text_buffer = [], []
self.lineno, self.offset = 1, 0
self.indent, self.indent_mod = 0, 0
def get_syntax(self):
""" Tokens as a space separated string (default: <% %> % {{ }}) """
return self._syntax
def set_syntax(self, syntax):
self._syntax = syntax
self._tokens = syntax.split()
if syntax not in self._re_cache:
names = 'block_start block_close line_start inline_start inline_end'
etokens = map(re.escape, self._tokens)
pattern_vars = dict(zip(names.split(), etokens))
patterns = (self._re_split, self._re_tok, self._re_inl)
patterns = [re.compile(p%pattern_vars) for p in patterns]
self._re_cache[syntax] = patterns
self.re_split, self.re_tok, self.re_inl = self._re_cache[syntax]
syntax = property(get_syntax, set_syntax)
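# Hedged illustration (not part of bottle): the five syntax tokens map to
# block_start, block_close, line_start, inline_start and inline_end, and
# translate() emits python code that feeds _printlist(). Output shown roughly:
#
#   p = StplParser('Hello {{name}}!')
#   p.syntax        # -> '<% %> % {{ }}'
#   p.translate()   # roughly: "_printlist(('Hello ', _escape(name), '!',))\n"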
def translate(self):
if self.offset: raise RuntimeError('Parser is a one-time instance.')
while True:
m = self.re_split.search(self.source[self.offset:])
if m:
text = self.source[self.offset:self.offset+m.start()]
self.text_buffer.append(text)
offs = self.offset
self.offset += m.end()
if m.group(1): # Escape syntax
line, sep, _ = self.source[self.offset:].partition('\n')
self.text_buffer.append(self.source[offs+m.start():offs+m.start(1)]+m.group(2)+line+sep)
self.offset += len(line+sep)
continue
self.flush_text()
self.read_code(multiline=bool(m.group(4)))
else: break
self.text_buffer.append(self.source[self.offset:])
self.flush_text()
return ''.join(self.code_buffer)
def read_code(self, multiline):
code_line, comment = '', ''
while True:
m = self.re_tok.search(self.source[self.offset:])
if not m:
code_line += self.source[self.offset:]
self.offset = len(self.source)
self.write_code(code_line.strip(), comment)
return
code_line += self.source[self.offset:self.offset+m.start()]
self.offset += m.end()
_str, _com, _blk1, _blk2, _end, _cend, _nl = m.groups()
if code_line and (_blk1 or _blk2): # a if b else c
code_line += _blk1 or _blk2
continue
if _str: # Python string
code_line += _str
elif _com: # Python comment (up to EOL)
comment = _com
if multiline and _com.strip().endswith(self._tokens[1]):
multiline = False # Allow end-of-block in comments
elif _blk1: # Start-block keyword (if/for/while/def/try/...)
code_line, self.indent_mod = _blk1, -1
self.indent += 1
elif _blk2: # Continue-block keyword (else/elif/except/...)
code_line, self.indent_mod = _blk2, -1
elif _end: # The non-standard 'end'-keyword (ends a block)
self.indent -= 1
elif _cend: # The end-code-block template token (usually '%>')
if multiline: multiline = False
else: code_line += _cend
else: # \n
self.write_code(code_line.strip(), comment)
self.lineno += 1
code_line, comment, self.indent_mod = '', '', 0
if not multiline:
break
def flush_text(self):
text = ''.join(self.text_buffer)
del self.text_buffer[:]
if not text: return
parts, pos, nl = [], 0, '\\\n'+' '*self.indent
for m in self.re_inl.finditer(text):
prefix, pos = text[pos:m.start()], m.end()
if prefix:
parts.append(nl.join(map(repr, prefix.splitlines(True))))
if prefix.endswith('\n'): parts[-1] += nl
parts.append(self.process_inline(m.group(1).strip()))
if pos < len(text):
prefix = text[pos:]
lines = prefix.splitlines(True)
if lines[-1].endswith('\\\\\n'): lines[-1] = lines[-1][:-3]
elif lines[-1].endswith('\\\\\r\n'): lines[-1] = lines[-1][:-4]
parts.append(nl.join(map(repr, lines)))
code = '_printlist((%s,))' % ', '.join(parts)
self.lineno += code.count('\n')+1
self.write_code(code)
@staticmethod
def process_inline(chunk):
if chunk[0] == '!': return '_str(%s)' % chunk[1:]
return '_escape(%s)' % chunk
def write_code(self, line, comment=''):
code = ' ' * (self.indent+self.indent_mod)
code += line.lstrip() + comment + '\n'
self.code_buffer.append(code)
def template(*args, **kwargs):
"""
Get a rendered template as a string iterator.
You can use a name, a filename or a template string as first parameter.
Template rendering arguments can be passed as dictionaries
or directly (as keyword arguments).
"""
tpl = args[0] if args else None
adapter = kwargs.pop('template_adapter', SimpleTemplate)
lookup = kwargs.pop('template_lookup', TEMPLATE_PATH)
tplid = (id(lookup), tpl)
if tplid not in TEMPLATES or DEBUG:
settings = kwargs.pop('template_settings', {})
if isinstance(tpl, adapter):
TEMPLATES[tplid] = tpl
if settings: TEMPLATES[tplid].prepare(**settings)
elif "\n" in tpl or "{" in tpl or "%" in tpl or '$' in tpl:
TEMPLATES[tplid] = adapter(source=tpl, lookup=lookup, **settings)
else:
TEMPLATES[tplid] = adapter(name=tpl, lookup=lookup, **settings)
if not TEMPLATES[tplid]:
abort(500, 'Template (%s) not found' % tpl)
for dictarg in args[1:]: kwargs.update(dictarg)
return TEMPLATES[tplid].render(kwargs)
mako_template = functools.partial(template, template_adapter=MakoTemplate)
cheetah_template = functools.partial(template, template_adapter=CheetahTemplate)
jinja2_template = functools.partial(template, template_adapter=Jinja2Template)
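# Hedged usage sketch (template names are illustrative):
#
#   template('Hello {{name}}!', name='World')  # render an inline source string
#   template('index', title='Home')            # look up index.tpl on TEMPLATE_PATH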
def view(tpl_name, **defaults):
""" Decorator: renders a template for a handler.
The handler can control its behavior like this:
- return a dict of template vars to fill out the template
- return something other than a dict, and the view decorator will not
process the template but will return the handler result as is.
This includes returning a HTTPResponse(dict) to get,
for instance, JSON with autojson or other castfilters.
"""
def decorator(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
result = func(*args, **kwargs)
if isinstance(result, (dict, DictMixin)):
tplvars = defaults.copy()
tplvars.update(result)
return template(tpl_name, **tplvars)
elif result is None:
return template(tpl_name, defaults)
return result
return wrapper
return decorator
mako_view = functools.partial(view, template_adapter=MakoTemplate)
cheetah_view = functools.partial(view, template_adapter=CheetahTemplate)
jinja2_view = functools.partial(view, template_adapter=Jinja2Template)
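# Hedged usage sketch (handler and template names are illustrative):
#
#   @view('hello_template')
#   def hello():
#       return dict(name='World')  # dict return value fills the template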
###############################################################################
# Constants and Globals ########################################################
###############################################################################
TEMPLATE_PATH = ['./', './views/']
TEMPLATES = {}
DEBUG = False
NORUN = False # If set, run() does nothing. Used by load_app()
#: A dict to map HTTP status codes (e.g. 404) to phrases (e.g. 'Not Found')
HTTP_CODES = httplib.responses
HTTP_CODES[418] = "I'm a teapot" # RFC 2324
HTTP_CODES[428] = "Precondition Required"
HTTP_CODES[429] = "Too Many Requests"
HTTP_CODES[431] = "Request Header Fields Too Large"
HTTP_CODES[511] = "Network Authentication Required"
_HTTP_STATUS_LINES = dict((k, '%d %s'%(k,v)) for (k,v) in HTTP_CODES.items())
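# e.g. _HTTP_STATUS_LINES[404] == '404 Not Found' (illustrative lookup)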
#: The default template used for error pages. Override with @error()
ERROR_PAGE_TEMPLATE = """
%%try:
%%from %s import DEBUG, request
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html>
<head>
<title>Error: {{e.status}}</title>
<style type="text/css">
html {background-color: #eee; font-family: sans-serif;}
body {background-color: #fff; border: 1px solid #ddd;
padding: 15px; margin: 15px;}
pre {background-color: #eee; border: 1px solid #ddd; padding: 5px;}
</style>
</head>
<body>
<h1>Error: {{e.status}}</h1>
<p>Sorry, the requested URL <tt>{{repr(request.url)}}</tt>
caused an error:</p>
<pre>{{e.body}}</pre>
%%if DEBUG and e.exception:
<h2>Exception:</h2>
<pre>{{repr(e.exception)}}</pre>
%%end
%%if DEBUG and e.traceback:
<h2>Traceback:</h2>
<pre>{{e.traceback}}</pre>
%%end
</body>
</html>
%%except ImportError:
<b>ImportError:</b> Could not generate the error page. Please add bottle to
the import path.
%%end
""" % __name__
#: A thread-safe instance of :class:`LocalRequest`. If accessed from within a
#: request callback, this instance always refers to the *current* request
#: (even on a multithreaded server).
request = LocalRequest()
#: A thread-safe instance of :class:`LocalResponse`. It is used to change the
#: HTTP response for the *current* request.
response = LocalResponse()
#: A thread-safe namespace. Not used by Bottle.
local = threading.local()
# Initialize app stack (create first empty Bottle app)
# BC: 0.6.4 and needed for run()
app = default_app = AppStack()
app.push()
#: A virtual package that redirects import statements.
#: Example: ``import bottle.ext.sqlite`` actually imports `bottle_sqlite`.
ext = _ImportRedirect('bottle.ext' if __name__ == '__main__' else __name__+".ext", 'bottle_%s').module
if __name__ == '__main__':
opt, args, parser = _cmd_options, _cmd_args, _cmd_parser
if opt.version:
_stdout('Bottle %s\n'%__version__)
sys.exit(0)
if not args:
parser.print_help()
_stderr('\nError: No application entry point specified.\n')
sys.exit(1)
sys.path.insert(0, '.')
sys.modules.setdefault('bottle', sys.modules['__main__'])
host, port = (opt.bind or 'localhost'), 8080
if ':' in host and host.rfind(']') < host.rfind(':'):
host, port = host.rsplit(':', 1)
host = host.strip('[]')
run(args[0], host=host, port=int(port), server=opt.server,
reloader=opt.reload, plugins=opt.plugin, debug=opt.debug)
# THE END
|
mit
|
with-git/tensorflow
|
tensorflow/contrib/grid_rnn/python/ops/grid_rnn_cell.py
|
76
|
23199
|
# Copyright 2016 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Module for constructing GridRNN cells"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from collections import namedtuple
import functools
from tensorflow.python.ops import array_ops
from tensorflow.python.ops import math_ops
from tensorflow.python.ops import nn
from tensorflow.python.ops import variable_scope as vs
from tensorflow.python.platform import tf_logging as logging
from tensorflow.contrib import layers
from tensorflow.contrib import rnn
class GridRNNCell(rnn.RNNCell):
"""Grid recurrent cell.
This implementation is based on:
http://arxiv.org/pdf/1507.01526v3.pdf
This is the generic implementation of GridRNN. Users can specify an
arbitrary number of dimensions,
set some of them to be priority (section 3.2), non-recurrent (section 3.3)
and input/output dimensions (section 3.4).
Weight sharing can also be specified using the `tied` parameter.
Type of recurrent units can be specified via `cell_fn`.
"""
def __init__(self,
num_units,
num_dims=1,
input_dims=None,
output_dims=None,
priority_dims=None,
non_recurrent_dims=None,
tied=False,
cell_fn=None,
non_recurrent_fn=None,
state_is_tuple=True,
output_is_tuple=True):
"""Initialize the parameters of a Grid RNN cell
Args:
num_units: int, The number of units in all dimensions of this GridRNN cell
num_dims: int, Number of dimensions of this grid.
input_dims: int or list, List of dimensions which will receive input data.
output_dims: int or list, List of dimensions from which the output will be
recorded.
priority_dims: int or list, List of dimensions to be considered as
priority dimensions.
If None, no dimension is prioritized.
non_recurrent_dims: int or list, List of dimensions that are not
recurrent.
The transfer function for non-recurrent dimensions is specified
via `non_recurrent_fn`, which
defaults to `tensorflow.nn.relu`.
tied: bool, Whether to share the weights among the dimensions of this
GridRNN cell.
If there are non-recurrent dimensions in the grid, weights are
shared between each group of recurrent and non-recurrent
dimensions.
cell_fn: function, a function which returns the recurrent cell object.
Has to be in the following signature:
```
def cell_func(num_units):
# ...
```
and returns an object of type `RNNCell`. If None, LSTMCell with
default parameters will be used.
Note that if you use a custom RNNCell (with `cell_fn`), it is your
responsibility to make sure the inner cell uses `state_is_tuple=True`.
non_recurrent_fn: a tensorflow Op that will be the transfer function of
the non-recurrent dimensions
state_is_tuple: If True, accepted and returned states are tuples of the
states of the recurrent dimensions. If False, they are concatenated
along the column axis. The latter behavior will soon be deprecated.
Note that if you use a custom RNNCell (with `cell_fn`), it is your
responsibility to make sure the inner cell uses `state_is_tuple=True`.
output_is_tuple: If True, the output is a tuple of the outputs of the
recurrent dimensions. If False, they are concatenated along the
column axis. The latter behavior will soon be deprecated.
Raises:
TypeError: if cell_fn does not return an RNNCell instance.
"""
if not state_is_tuple:
logging.warning('%s: Using a concatenated state is slower and will '
'soon be deprecated. Use state_is_tuple=True.', self)
if not output_is_tuple:
logging.warning('%s: Using a concatenated output is slower and will '
'soon be deprecated. Use output_is_tuple=True.', self)
if num_dims < 1:
raise ValueError('dims must be >= 1: {}'.format(num_dims))
self._config = _parse_rnn_config(num_dims, input_dims, output_dims,
priority_dims, non_recurrent_dims,
non_recurrent_fn or nn.relu, tied,
num_units)
self._state_is_tuple = state_is_tuple
self._output_is_tuple = output_is_tuple
if cell_fn is None:
my_cell_fn = functools.partial(
rnn.LSTMCell, num_units=num_units, state_is_tuple=state_is_tuple)
else:
my_cell_fn = lambda: cell_fn(num_units)
if tied:
self._cells = [my_cell_fn()] * num_dims
else:
self._cells = [my_cell_fn() for _ in range(num_dims)]
if not isinstance(self._cells[0], rnn.RNNCell):
raise TypeError('cell_fn must return an RNNCell instance, saw: %s' %
type(self._cells[0]))
if self._output_is_tuple:
self._output_size = tuple(self._cells[0].output_size
for _ in self._config.outputs)
else:
self._output_size = self._cells[0].output_size * len(self._config.outputs)
if self._state_is_tuple:
self._state_size = tuple(self._cells[0].state_size
for _ in self._config.recurrents)
else:
self._state_size = self._cell_state_size() * len(self._config.recurrents)
@property
def output_size(self):
return self._output_size
@property
def state_size(self):
return self._state_size
def __call__(self, inputs, state, scope=None):
"""Run one step of GridRNN.
Args:
inputs: input Tensor, 2D, batch x input_size. Or None
state: state Tensor, 2D, batch x state_size. Note that state_size =
cell_state_size * recurrent_dims
scope: VariableScope for the created subgraph; defaults to "GridRNNCell".
Returns:
A tuple containing:
- A 2D, batch x output_size, Tensor representing the output of the cell
after reading "inputs" when previous state was "state".
- A 2D, batch x state_size, Tensor representing the new state of the cell
after reading "inputs" when previous state was "state".
"""
conf = self._config
dtype = inputs.dtype
c_prev, m_prev, cell_output_size = self._extract_states(state)
new_output = [None] * conf.num_dims
new_state = [None] * conf.num_dims
with vs.variable_scope(scope or type(self).__name__): # GridRNNCell
# project input, populate c_prev and m_prev
self._project_input(inputs, c_prev, m_prev, cell_output_size > 0)
# propagate along dimensions, first for non-priority dimensions
# then priority dimensions
_propagate(conf.non_priority, conf, self._cells, c_prev, m_prev,
new_output, new_state, True)
_propagate(conf.priority, conf, self._cells,
c_prev, m_prev, new_output, new_state, False)
# collect outputs and states
output_tensors = [new_output[i] for i in self._config.outputs]
if self._output_is_tuple:
output = tuple(output_tensors)
else:
if output_tensors:
output = array_ops.concat(output_tensors, 1)
else:
output = array_ops.zeros([0, 0], dtype)
if self._state_is_tuple:
states = tuple(new_state[i] for i in self._config.recurrents)
else:
# concat each state first, then flatten the whole thing
state_tensors = [
x for i in self._config.recurrents for x in new_state[i]
]
if state_tensors:
states = array_ops.concat(state_tensors, 1)
else:
states = array_ops.zeros([0, 0], dtype)
return output, states
def _extract_states(self, state):
"""Extract the cell and previous output tensors from the given state.
Args:
state: The RNN state.
Returns:
Tuple of the cell value, previous output, and cell_output_size.
Raises:
ValueError: If len(self._config.recurrents) != len(state).
"""
conf = self._config
# c_prev is `m` (cell value), and
# m_prev is `h` (previous output) in the paper.
# Keeping c and m here for consistency with the codebase
c_prev = [None] * conf.num_dims
m_prev = [None] * conf.num_dims
# for LSTM : state = memory cell + output, hence cell_output_size > 0
# for GRU/RNN: state = output (whose size is equal to _num_units),
# hence cell_output_size = 0
total_cell_state_size = self._cell_state_size()
cell_output_size = total_cell_state_size - conf.num_units
if self._state_is_tuple:
if len(conf.recurrents) != len(state):
raise ValueError('Expected state as a tuple of {} '
'elements'.format(len(conf.recurrents)))
for recurrent_dim, recurrent_state in zip(conf.recurrents, state):
if cell_output_size > 0:
c_prev[recurrent_dim], m_prev[recurrent_dim] = recurrent_state
else:
m_prev[recurrent_dim] = recurrent_state
else:
for recurrent_dim, start_idx in zip(conf.recurrents,
range(0, self.state_size,
total_cell_state_size)):
if cell_output_size > 0:
c_prev[recurrent_dim] = array_ops.slice(state, [0, start_idx],
[-1, conf.num_units])
m_prev[recurrent_dim] = array_ops.slice(
state, [0, start_idx + conf.num_units], [-1, cell_output_size])
else:
m_prev[recurrent_dim] = array_ops.slice(state, [0, start_idx],
[-1, conf.num_units])
return c_prev, m_prev, cell_output_size
def _project_input(self, inputs, c_prev, m_prev, with_c):
"""Fills in c_prev and m_prev with projected input, for input dimensions.
Args:
inputs: inputs tensor
c_prev: cell value
m_prev: previous output
with_c: boolean; whether to include project_c.
Raises:
ValueError: if len(self._config.input) != len(inputs)
"""
conf = self._config
if (inputs is not None and inputs.get_shape().with_rank(2)[1].value > 0 and
conf.inputs):
if isinstance(inputs, tuple):
if len(conf.inputs) != len(inputs):
raise ValueError('Expect inputs as a tuple of {} '
'tensors'.format(len(conf.inputs)))
input_splits = inputs
else:
input_splits = array_ops.split(
value=inputs, num_or_size_splits=len(conf.inputs), axis=1)
input_sz = input_splits[0].get_shape().with_rank(2)[1].value
for i, j in enumerate(conf.inputs):
input_project_m = vs.get_variable(
'project_m_{}'.format(j), [input_sz, conf.num_units],
dtype=inputs.dtype)
m_prev[j] = math_ops.matmul(input_splits[i], input_project_m)
if with_c:
input_project_c = vs.get_variable(
'project_c_{}'.format(j), [input_sz, conf.num_units],
dtype=inputs.dtype)
c_prev[j] = math_ops.matmul(input_splits[i], input_project_c)
def _cell_state_size(self):
"""Total size of the state of the inner cell used in this grid.
Returns:
Total size of the state of the inner cell.
"""
state_sizes = self._cells[0].state_size
if isinstance(state_sizes, tuple):
return sum(state_sizes)
return state_sizes
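# Hedged usage sketch (TF1 graph mode, assuming `import tensorflow as tf`;
# shapes and names are illustrative):
#
#   cell = GridRNNCell(num_units=8, num_dims=2, input_dims=0, output_dims=0)
#   x = tf.placeholder(tf.float32, [4, 16])  # batch=4, input_size=16
#   output, state = cell(x, cell.zero_state(4, tf.float32))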
"""Specialized cells, for convenience
"""
class Grid1BasicRNNCell(GridRNNCell):
"""1D BasicRNN cell"""
def __init__(self, num_units, state_is_tuple=True, output_is_tuple=True):
super(Grid1BasicRNNCell, self).__init__(
num_units=num_units,
num_dims=1,
input_dims=0,
output_dims=0,
priority_dims=0,
tied=False,
cell_fn=lambda n: rnn.BasicRNNCell(num_units=n),
state_is_tuple=state_is_tuple,
output_is_tuple=output_is_tuple)
class Grid2BasicRNNCell(GridRNNCell):
"""2D BasicRNN cell
This creates a 2D cell which receives input and gives output in the first
dimension.
The first dimension can optionally be non-recurrent if `non_recurrent_fn` is
specified.
"""
def __init__(self,
num_units,
tied=False,
non_recurrent_fn=None,
state_is_tuple=True,
output_is_tuple=True):
super(Grid2BasicRNNCell, self).__init__(
num_units=num_units,
num_dims=2,
input_dims=0,
output_dims=0,
priority_dims=0,
tied=tied,
non_recurrent_dims=None if non_recurrent_fn is None else 0,
cell_fn=lambda n: rnn.BasicRNNCell(num_units=n),
non_recurrent_fn=non_recurrent_fn,
state_is_tuple=state_is_tuple,
output_is_tuple=output_is_tuple)
class Grid1BasicLSTMCell(GridRNNCell):
"""1D BasicLSTM cell."""
def __init__(self,
num_units,
forget_bias=1,
state_is_tuple=True,
output_is_tuple=True):
def cell_fn(n):
return rnn.BasicLSTMCell(num_units=n, forget_bias=forget_bias)
super(Grid1BasicLSTMCell, self).__init__(
num_units=num_units,
num_dims=1,
input_dims=0,
output_dims=0,
priority_dims=0,
tied=False,
cell_fn=cell_fn,
state_is_tuple=state_is_tuple,
output_is_tuple=output_is_tuple)
class Grid2BasicLSTMCell(GridRNNCell):
"""2D BasicLSTM cell.
This creates a 2D cell which receives input and gives output in the first
dimension.
The first dimension can optionally be non-recurrent if `non_recurrent_fn` is
specified.
"""
def __init__(self,
num_units,
tied=False,
non_recurrent_fn=None,
forget_bias=1,
state_is_tuple=True,
output_is_tuple=True):
def cell_fn(n):
return rnn.BasicLSTMCell(num_units=n, forget_bias=forget_bias)
super(Grid2BasicLSTMCell, self).__init__(
num_units=num_units,
num_dims=2,
input_dims=0,
output_dims=0,
priority_dims=0,
tied=tied,
non_recurrent_dims=None if non_recurrent_fn is None else 0,
cell_fn=cell_fn,
non_recurrent_fn=non_recurrent_fn,
state_is_tuple=state_is_tuple,
output_is_tuple=output_is_tuple)
class Grid1LSTMCell(GridRNNCell):
"""1D LSTM cell.
This is different from Grid1BasicLSTMCell because it gives the option to
specify the forget bias and to enable peepholes.
"""
def __init__(self,
num_units,
use_peepholes=False,
forget_bias=1.0,
state_is_tuple=True,
output_is_tuple=True):
def cell_fn(n):
return rnn.LSTMCell(
num_units=n, forget_bias=forget_bias, use_peepholes=use_peepholes)
super(Grid1LSTMCell, self).__init__(
num_units=num_units,
num_dims=1,
input_dims=0,
output_dims=0,
priority_dims=0,
cell_fn=cell_fn,
state_is_tuple=state_is_tuple,
output_is_tuple=output_is_tuple)
class Grid2LSTMCell(GridRNNCell):
"""2D LSTM cell.
This creates a 2D cell which receives input and gives output in the first
dimension.
The first dimension can optionally be non-recurrent if `non_recurrent_fn` is
specified.
"""
def __init__(self,
num_units,
tied=False,
non_recurrent_fn=None,
use_peepholes=False,
forget_bias=1.0,
state_is_tuple=True,
output_is_tuple=True):
def cell_fn(n):
return rnn.LSTMCell(
num_units=n, forget_bias=forget_bias, use_peepholes=use_peepholes)
super(Grid2LSTMCell, self).__init__(
num_units=num_units,
num_dims=2,
input_dims=0,
output_dims=0,
priority_dims=0,
tied=tied,
non_recurrent_dims=None if non_recurrent_fn is None else 0,
cell_fn=cell_fn,
non_recurrent_fn=non_recurrent_fn,
state_is_tuple=state_is_tuple,
output_is_tuple=output_is_tuple)
class Grid3LSTMCell(GridRNNCell):
"""3D BasicLSTM cell.
This creates a 2D cell which receives input and gives output in the first
dimension.
The first dimension can optionally be non-recurrent if `non_recurrent_fn` is
specified.
The second and third dimensions are LSTM.
"""
def __init__(self,
num_units,
tied=False,
non_recurrent_fn=None,
use_peepholes=False,
forget_bias=1.0,
state_is_tuple=True,
output_is_tuple=True):
def cell_fn(n):
return rnn.LSTMCell(
num_units=n, forget_bias=forget_bias, use_peepholes=use_peepholes)
super(Grid3LSTMCell, self).__init__(
num_units=num_units,
num_dims=3,
input_dims=0,
output_dims=0,
priority_dims=0,
tied=tied,
non_recurrent_dims=None if non_recurrent_fn is None else 0,
cell_fn=cell_fn,
non_recurrent_fn=non_recurrent_fn,
state_is_tuple=state_is_tuple,
output_is_tuple=output_is_tuple)
class Grid2GRUCell(GridRNNCell):
"""2D LSTM cell.
This creates a 2D cell which receives input and gives output in the first
dimension.
The first dimension can optionally be non-recurrent if `non_recurrent_fn` is
specified.
"""
def __init__(self,
num_units,
tied=False,
non_recurrent_fn=None,
state_is_tuple=True,
output_is_tuple=True):
super(Grid2GRUCell, self).__init__(
num_units=num_units,
num_dims=2,
input_dims=0,
output_dims=0,
priority_dims=0,
tied=tied,
non_recurrent_dims=None if non_recurrent_fn is None else 0,
cell_fn=lambda n: rnn.GRUCell(num_units=n),
non_recurrent_fn=non_recurrent_fn,
state_is_tuple=state_is_tuple,
output_is_tuple=output_is_tuple)
# Helpers
_GridRNNDimension = namedtuple('_GridRNNDimension', [
'idx', 'is_input', 'is_output', 'is_priority', 'non_recurrent_fn'
])
_GridRNNConfig = namedtuple('_GridRNNConfig',
['num_dims', 'dims', 'inputs', 'outputs',
'recurrents', 'priority', 'non_priority', 'tied',
'num_units'])
def _parse_rnn_config(num_dims, ls_input_dims, ls_output_dims, ls_priority_dims,
ls_non_recurrent_dims, non_recurrent_fn, tied, num_units):
def check_dim_list(ls):
if ls is None:
ls = []
if not isinstance(ls, (list, tuple)):
ls = [ls]
ls = sorted(set(ls))
if any(_ < 0 or _ >= num_dims for _ in ls):
raise ValueError('Invalid dims: {}. Must be in [0, {})'.format(ls,
num_dims))
return ls
input_dims = check_dim_list(ls_input_dims)
output_dims = check_dim_list(ls_output_dims)
priority_dims = check_dim_list(ls_priority_dims)
non_recurrent_dims = check_dim_list(ls_non_recurrent_dims)
rnn_dims = []
for i in range(num_dims):
rnn_dims.append(
_GridRNNDimension(
idx=i,
is_input=(i in input_dims),
is_output=(i in output_dims),
is_priority=(i in priority_dims),
non_recurrent_fn=non_recurrent_fn
if i in non_recurrent_dims else None))
return _GridRNNConfig(
num_dims=num_dims,
dims=rnn_dims,
inputs=input_dims,
outputs=output_dims,
recurrents=[x for x in range(num_dims) if x not in non_recurrent_dims],
priority=priority_dims,
non_priority=[x for x in range(num_dims) if x not in priority_dims],
tied=tied,
num_units=num_units)
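# Illustrative result, derivable from the checks above (values shown inline):
#
#   conf = _parse_rnn_config(2, 0, 0, None, None, nn.relu, False, 8)
#   conf.inputs      # -> [0]
#   conf.outputs     # -> [0]
#   conf.recurrents  # -> [0, 1]  (no non-recurrent dims were given)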
def _propagate(dim_indices, conf, cells, c_prev, m_prev, new_output, new_state,
first_call):
"""Propagates through all the cells in dim_indices dimensions.
"""
if len(dim_indices) == 0:
return
# Because of the way RNNCells are implemented, we take the last dimension
# (H_{N-1}) out and feed it as the state of the RNN cell
# (in `last_dim_output`).
# The inputs of the cell (H_0 to H_{N-2}) are concatenated into `cell_inputs`
if conf.num_dims > 1:
ls_cell_inputs = [None] * (conf.num_dims - 1)
for d in conf.dims[:-1]:
if new_output[d.idx] is None:
ls_cell_inputs[d.idx] = m_prev[d.idx]
else:
ls_cell_inputs[d.idx] = new_output[d.idx]
cell_inputs = array_ops.concat(ls_cell_inputs, 1)
else:
cell_inputs = array_ops.zeros([m_prev[0].get_shape().as_list()[0], 0],
m_prev[0].dtype)
last_dim_output = (new_output[-1]
if new_output[-1] is not None else m_prev[-1])
for i in dim_indices:
d = conf.dims[i]
if d.non_recurrent_fn:
if conf.num_dims > 1:
linear_args = array_ops.concat([cell_inputs, last_dim_output], 1)
else:
linear_args = last_dim_output
with vs.variable_scope('non_recurrent' if conf.tied else
'non_recurrent/cell_{}'.format(i)):
if conf.tied and not (first_call and i == dim_indices[0]):
vs.get_variable_scope().reuse_variables()
new_output[d.idx] = layers.fully_connected(
linear_args,
num_outputs=conf.num_units,
activation_fn=d.non_recurrent_fn,
weights_initializer=(vs.get_variable_scope().initializer or
layers.initializers.xavier_initializer),
weights_regularizer=vs.get_variable_scope().regularizer)
else:
if c_prev[i] is not None:
cell_state = (c_prev[i], last_dim_output)
else:
# for GRU/RNN, the state is just the previous output
cell_state = last_dim_output
with vs.variable_scope('recurrent' if conf.tied else
'recurrent/cell_{}'.format(i)):
if conf.tied and not (first_call and i == dim_indices[0]):
vs.get_variable_scope().reuse_variables()
cell = cells[i]
new_output[d.idx], new_state[d.idx] = cell(cell_inputs, cell_state)
|
apache-2.0
|
albertz/music-player
|
mac/pyobjc-core/PyObjCTest/test_object_property.py
|
2
|
19504
|
from __future__ import unicode_literals
from PyObjCTools.TestSupport import *
import objc
import copy
from PyObjCTest.fnd import *
objc.registerMetaDataForSelector(
b"NSObject", b"validateValue:forKey:error:",
dict(
arguments={
2: dict(type_modifier=objc._C_INOUT),
4: dict(type_modifier=objc._C_OUT),
},
))
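# With the INOUT/OUT metadata registered above, PyObjC returns the
# pass-by-reference arguments as extra Python return values. Hedged sketch,
# mirroring the validateValue tests below (obj is hypothetical):
#
#   ok, value, error = obj.validateValue_forKey_error_(1, 'p1', None)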
class OCCopy (NSObject):
def copy(self):
return self.copyWithZone_(None)
def copyWithZone_(self, zone):
v = OCCopy.allocWithZone_(zone).init()
return v
class OCObserve (NSObject):
def init(self):
self = super(OCObserve, self).init()
self.values = []
self.registrations = []
return self
@property
def seen(self):
return { v[1]: v[2]['new'] for v in self.values }
def register(self, object, keypath):
object.addObserver_forKeyPath_options_context_(
self, keypath, 0x3, None)
self.registrations.append((object, keypath))
def unregister(self, object, keypath):
object.removeObserver_forKeyPath_(self, keypath)
def observeValueForKeyPath_ofObject_change_context_(
self, keypath, object, change, context):
# We don't get to keep the 'change' dictionary, so make
# a copy (it gets reused in future calls)
new_change = {}
for k in change:
v = change[k]
if isinstance(v, (list, tuple, set)):
v = copy.copy(v)
new_change[k] = v
self.values.append((object, keypath, new_change))
def __enter__(self):
return self
def __exit__(self, type, value, traceback):
for o, k in self.registrations:
self.unregister(o, k)
self.registrations = []
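# Hedged usage sketch: OCObserve doubles as a KVO observer and a context
# manager that unregisters on exit (obj is any KVO-compliant object):
#
#   observer = OCObserve.alloc().init()
#   with observer:
#       observer.register(obj, 'p1')
#       obj.p1 = 'a'
#   observer.seen  # -> {'p1': 'a'}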
class TestObjectProperty (TestCase):
def testCreation(self):
class OCTestObjectProperty1 (NSObject):
p1 = objc.object_property()
p2 = objc.object_property(copy=True)
p3 = objc.object_property(read_only=True)
p4 = objc.object_property(ivar='myp4')
p5 = objc.object_property(typestr=objc._C_INT)
p6 = objc.object_property(typestr=objc._C_DBL)
o = OCTestObjectProperty1.alloc().init()
self.assertTrue(o.respondsToSelector(b'p1'))
self.assertTrue(o.respondsToSelector(b'setP1:'))
v = OCCopy.alloc().init()
o.p1 = v
self.assertIs(o.p1, v)
self.assertIs(o._p1, v)
self.assertTrue(o.respondsToSelector(b'p2'))
self.assertTrue(o.respondsToSelector(b'setP2:'))
o.p2 = v
self.assertIsInstance(o.p2, OCCopy)
self.assertIsNot(o.p2, v)
self.assertIsNot(o._p2, v)
self.assertTrue(o.respondsToSelector(b'p3'))
self.assertFalse(o.respondsToSelector(b'setP3:'))
o._p3 = v
self.assertIs(o.p3, v)
self.assertTrue(o.respondsToSelector(b'p4'))
self.assertTrue(o.respondsToSelector(b'setP4:'))
o.p4 = v
self.assertIs(o.p4, v)
self.assertIs(o.myp4, v)
self.assertTrue(o.respondsToSelector(b'p5'))
self.assertTrue(o.respondsToSelector(b'setP5:'))
self.assertTrue(o.respondsToSelector(b'p6'))
self.assertTrue(o.respondsToSelector(b'setP6:'))
s = o.methodSignatureForSelector_(b'p5')
self.assertEqual(s.methodReturnType(), objc._C_INT)
s = o.methodSignatureForSelector_(b'p6')
self.assertEqual(s.methodReturnType(), objc._C_DBL)
def testDepends(self):
class OCTestObjectProperty2 (NSObject):
p1 = objc.object_property()
p2 = objc.object_property()
p3 = objc.object_property(read_only=True, depends_on=['p1', 'p2'])
@p3.getter
def p3(self):
return (self.p1 or '', self.p2 or '')
class OCTestObjectProperty2b (OCTestObjectProperty2):
p4 = objc.object_property()
@OCTestObjectProperty2.p3.getter
def p3(self):
return (self.p4 or '', self.p2 or '', self.p1 or '')
p3.depends_on('p4')
p5 = objc.object_property(read_only=True)
@p5.getter
def p5(self):
return "-%s-"%(self.p4,)
p5.depends_on('p4')
observer1 = OCObserve.alloc().init()
observer2 = OCObserve.alloc().init()
object1 = OCTestObjectProperty2.alloc().init()
object2 = OCTestObjectProperty2b.alloc().init()
v = type(object1).keyPathsForValuesAffectingP3()
self.assertIsInstance(v, objc.lookUpClass('NSSet'))
self.assertEqual(v, {'p1', 'p2'})
v = type(object2).keyPathsForValuesAffectingP3()
self.assertIsInstance(v, objc.lookUpClass('NSSet'))
self.assertEqual(v, {'p1', 'p2', 'p4'})
self.assertTrue(object1.respondsToSelector('p1'))
self.assertTrue(object1.respondsToSelector('setP1:'))
self.assertTrue(object1.respondsToSelector('p2'))
self.assertTrue(object1.respondsToSelector('setP2:'))
self.assertTrue(object1.respondsToSelector('p3'))
self.assertFalse(object1.respondsToSelector('setP3:'))
self.assertTrue(object2.respondsToSelector('p1'))
self.assertTrue(object2.respondsToSelector('setP1:'))
self.assertTrue(object2.respondsToSelector('p2'))
self.assertTrue(object2.respondsToSelector('setP2:'))
self.assertTrue(object2.respondsToSelector('p3'))
self.assertFalse(object2.respondsToSelector('setP3:'))
self.assertTrue(object2.respondsToSelector('p4'))
self.assertTrue(object2.respondsToSelector('setP4:'))
observer1.register(object1, 'p1')
observer1.register(object1, 'p2')
observer1.register(object1, 'p3')
observer2.register(object2, 'p1')
observer2.register(object2, 'p2')
observer2.register(object2, 'p3')
observer2.register(object2, 'p4')
observer2.register(object2, 'p5')
try:
self.assertEqual(observer1.values, [])
self.assertEqual(observer2.values, [])
object1.p1 = "a"
object1.p2 = "b"
self.assertEqual(object1.p3, ("a", "b"))
self.assertEqual(object1.pyobjc_instanceMethods.p3(), ("a", "b"))
object2.p1 = "a"
object2.p2 = "b"
object2.p4 = "c"
self.assertEqual(object2.p3, ("c", "b", "a"))
self.assertEqual(object2.pyobjc_instanceMethods.p3(), ("c", "b", "a"))
self.assertEqual(object2.pyobjc_instanceMethods.p4(), "c")
#seen = { v[1]: v[2]['new'] for v in observer1.values }
self.assertEqual(observer1.seen,
{'p1': 'a', 'p2': 'b', 'p3': ('a', 'b') })
#seen = { v[1]: v[2]['new'] for v in observer2.values }
self.assertEqual(observer2.seen,
{'p1': 'a', 'p2': 'b', 'p3': ('c', 'b', 'a'), 'p4': 'c', 'p5': '-c-' })
finally:
observer1.unregister(object1, 'p1')
observer1.unregister(object1, 'p2')
observer1.unregister(object1, 'p3')
observer2.unregister(object2, 'p1')
observer2.unregister(object2, 'p2')
observer2.unregister(object2, 'p3')
observer2.unregister(object2, 'p4')
observer2.unregister(object2, 'p5')
def testDepends2(self):
class OCTestObjectProperty2B (NSObject):
p1 = objc.object_property()
@p1.getter
def p1(self):
return self._p1
@p1.setter
def p1(self, v):
self._p1 = v
p2 = objc.object_property()
@p2.getter
def p2(self):
return self._p2
@p2.setter
def p2(self, v):
self._p2 = v
p3 = objc.object_property(read_only=True, depends_on=['p1', 'p2'])
@p3.getter
def p3(self):
return (self.p1 or '', self.p2 or '')
class OCTestObjectProperty2Bb (OCTestObjectProperty2B):
p4 = objc.object_property()
@OCTestObjectProperty2B.p1.getter
def p1(self):
return self._p1
@OCTestObjectProperty2B.p3.getter
def p3(self):
return (self.p4 or '', self.p2 or '', self.p1 or '')
p3.depends_on('p4')
observer1 = OCObserve.alloc().init()
observer2 = OCObserve.alloc().init()
object1 = OCTestObjectProperty2B.alloc().init()
object2 = OCTestObjectProperty2Bb.alloc().init()
v = type(object1).keyPathsForValuesAffectingP3()
self.assertIsInstance(v, objc.lookUpClass('NSSet'))
self.assertEqual(v, {'p1', 'p2'})
v = type(object2).keyPathsForValuesAffectingP3()
self.assertIsInstance(v, objc.lookUpClass('NSSet'))
self.assertEqual(v, {'p1', 'p2', 'p4'})
self.assertTrue(object1.respondsToSelector('p1'))
self.assertTrue(object1.respondsToSelector('setP1:'))
self.assertTrue(object1.respondsToSelector('p2'))
self.assertTrue(object1.respondsToSelector('setP2:'))
self.assertTrue(object1.respondsToSelector('p3'))
self.assertFalse(object1.respondsToSelector('setP3:'))
self.assertTrue(object2.respondsToSelector('p1'))
self.assertTrue(object2.respondsToSelector('setP1:'))
self.assertTrue(object2.respondsToSelector('p2'))
self.assertTrue(object2.respondsToSelector('setP2:'))
self.assertTrue(object2.respondsToSelector('p3'))
self.assertFalse(object2.respondsToSelector('setP3:'))
self.assertTrue(object2.respondsToSelector('p4'))
self.assertTrue(object2.respondsToSelector('setP4:'))
observer1.register(object1, 'p1')
observer1.register(object1, 'p2')
observer1.register(object1, 'p3')
observer2.register(object2, 'p1')
observer2.register(object2, 'p2')
observer2.register(object2, 'p3')
observer2.register(object2, 'p4')
try:
self.assertEqual(observer1.values, [])
self.assertEqual(observer2.values, [])
object1.p1 = "a"
object1.p2 = "b"
self.assertEqual(object1.p3, ("a", "b"))
self.assertEqual(object1.pyobjc_instanceMethods.p3(), ("a", "b"))
object2.p1 = "a"
object2.p2 = "b"
object2.p4 = "c"
self.assertEqual(object2.p3, ("c", "b", "a"))
self.assertEqual(object2.pyobjc_instanceMethods.p3(), ("c", "b", "a"))
self.assertEqual(object2.pyobjc_instanceMethods.p4(), "c")
#seen = { v[1]: v[2]['new'] for v in observer1.values }
self.assertEqual(observer1.seen,
{'p1': 'a', 'p2': 'b', 'p3': ('a', 'b') })
#seen = { v[1]: v[2]['new'] for v in observer2.values }
self.assertEqual(observer2.seen,
{'p1': 'a', 'p2': 'b', 'p3': ('c', 'b', 'a'), 'p4': 'c' })
finally:
observer1.unregister(object1, 'p1')
observer1.unregister(object1, 'p2')
observer1.unregister(object1, 'p3')
observer2.unregister(object2, 'p1')
observer2.unregister(object2, 'p2')
observer2.unregister(object2, 'p3')
observer2.unregister(object2, 'p4')
def testMethods(self):
l = []
class OCTestObjectProperty4 (NSObject):
p1 = objc.object_property()
@p1.getter
def p1(self):
l.append(('get',))
return self._p1 + '!'
@p1.setter
def p1(self, v):
l.append(('set', v))
self._p1 = v + '?'
@p1.validate
def p1(self, value, error):
if value == 1:
return (True, value, None)
else:
return (False, 2, "snake")
class OCTestObjectProperty4b (OCTestObjectProperty4):
@OCTestObjectProperty4.p1.validate
def p1(self, value, error):
if value == 2:
return (True, value, None)
else:
return (False, 2, "monty")
o = OCTestObjectProperty4.alloc().init()
o.p1 = 'f'
self.assertEqual(o.p1, 'f?!')
self.assertEqual(o._p1, 'f?')
self.assertEqual(l, [('set', 'f'), ('get',)])
ok, value, error = o.validateValue_forKey_error_(
1, 'p1', None)
self.assertTrue(ok)
self.assertEqual(value, 1)
self.assertEqual(error, None)
ok, value, error = o.validateValue_forKey_error_(
9, 'p1', None)
self.assertFalse(ok)
self.assertEqual(value, 2)
self.assertEqual(error, "snake")
l = []
o = OCTestObjectProperty4b.alloc().init()
o.p1 = 'f'
self.assertEqual(o.p1, 'f?!')
self.assertEqual(o._p1, 'f?')
self.assertEqual(l, [('set', 'f'), ('get',)])
ok, value, error = o.validateValue_forKey_error_(
2, 'p1', None)
self.assertTrue(ok)
self.assertEqual(value, 2)
self.assertEqual(error, None)
ok, value, error = o.validateValue_forKey_error_(
9, 'p1', None)
self.assertFalse(ok)
self.assertEqual(value, 2)
self.assertEqual(error, "monty")
def testNative(self):
l = []
class OCTestObjectProperty7 (NSObject):
p1 = objc.object_property()
@p1.getter
def p1(self):
l.append('get')
return self._p1
@p1.setter
def p1(self, value):
l.append('set')
self._p1 = value
o = OCTestObjectProperty7.alloc().init()
o.setValue_forKey_(42, 'p1')
self.assertEqual(o._p1, 42)
o._p1 = "monkey"
v = o.valueForKey_('p1')
self.assertEqual(v, "monkey")
self.assertEqual(l, ["set", "get"])
def testDynamic(self):
class OCTestObjectProperty8 (NSObject):
p1 = objc.object_property(dynamic=True)
p2 = objc.object_property(dynamic=True, typestr=objc._C_NSBOOL)
self.assertFalse(OCTestObjectProperty8.instancesRespondToSelector_(b"p1"))
self.assertFalse(OCTestObjectProperty8.instancesRespondToSelector_(b"setP1:"))
self.assertFalse(OCTestObjectProperty8.instancesRespondToSelector_(b"isP2"))
self.assertFalse(OCTestObjectProperty8.instancesRespondToSelector_(b"setP2:"))
v = [42]
def getter(self):
return v[0]
def setter(self, value):
v[0] = value
OCTestObjectProperty8.p1 = getter
OCTestObjectProperty8.setP1_ = setter
v2 = [False]
def getter2(self):
return v2[0]
def setter2(self, value):
v2[0] = bool(value)
OCTestObjectProperty8.isP2 = getter2
OCTestObjectProperty8.setP2_ = setter2
self.assertTrue(OCTestObjectProperty8.instancesRespondToSelector_(b"p1"))
self.assertTrue(OCTestObjectProperty8.instancesRespondToSelector_(b"setP1:"))
self.assertTrue(OCTestObjectProperty8.instancesRespondToSelector_(b"isP2"))
self.assertTrue(OCTestObjectProperty8.instancesRespondToSelector_(b"setP2:"))
o = OCTestObjectProperty8.alloc().init()
self.assertIsInstance(OCTestObjectProperty8.p1, objc.object_property)
self.assertIsInstance(OCTestObjectProperty8.p2, objc.object_property)
self.assertEqual(o.p1, 42)
self.assertEqual(o.p2, False)
o.p1 = 99
o.p2 = True
self.assertEqual(o.p1, 99)
self.assertEqual(v[0], 99)
self.assertEqual(o.p2, True)
self.assertEqual(v2[0], True)
def testReadOnly(self):
class OCTestObjectProperty3 (NSObject):
p1 = objc.object_property(read_only=True)
o = OCTestObjectProperty3.alloc().init()
self.assertRaises(ValueError, setattr, o, 'p1', 42)
def testSubclass(self):
class OCTestObjectProperty5 (NSObject):
p1 = objc.object_property(read_only=True)
p2 = objc.object_property()
p3 = objc.object_property(read_only=True, typestr=objc._C_NSBOOL)
class OCTestObjectProperty6 (OCTestObjectProperty5):
@OCTestObjectProperty5.p1.setter
def p1(self, value):
self._p1 = value
@OCTestObjectProperty5.p2.setter
def p2(self, value):
self._p2 = value * 2
@OCTestObjectProperty5.p3.getter
def p3(self):
return not super(OCTestObjectProperty6, self).p3
base = OCTestObjectProperty5.alloc().init()
self.assertRaises(ValueError, setattr, base, 'p1', 1)
self.assertRaises(ValueError, setattr, base, 'p3', 1)
base.p2 = 'b'
self.assertEqual(base.p2, 'b')
sub = OCTestObjectProperty6.alloc().init()
sub.p1 = 1
sub.p2 = 'a'
sub._p3 = False
self.assertEqual(sub.p1, 1)
self.assertEqual(sub.p2, 'aa')
self.assertEqual(sub.p3, True)
self.assertTrue(base.respondsToSelector_(b'p2'))
self.assertFalse(base.respondsToSelector_(b'setP1:'))
self.assertTrue(base.respondsToSelector_(b'isP3'))
self.assertFalse(base.respondsToSelector_(b'p3'))
self.assertTrue(sub.respondsToSelector_(b'p2'))
self.assertTrue(sub.respondsToSelector_(b'setP1:'))
self.assertTrue(sub.respondsToSelector_(b'isP3'))
self.assertFalse(sub.respondsToSelector_(b'p3'))
try:
del sub.p3
except TypeError:
pass
else:
self.fail("Deleting an object_property shouldn't be possible")
def testDefaultSetterWithoutIvar(self):
try:
class OCTestObjectProperty7 (NSObject):
p1 = objc.object_property(ivar=objc.NULL)
except ValueError:
pass
else:
self.fail("ValueError not raised")
try:
class OCTestObjectProperty8 (NSObject):
p1 = objc.object_property(ivar=objc.NULL, read_only=True)
except ValueError:
pass
else:
self.fail("ValueError not raised")
try:
class OCTestObjectProperty9 (NSObject):
p1 = objc.object_property(read_only=True)
@p1.setter
def p1(self, v):
pass
except ValueError:
pass
else:
self.fail("ValueError not raised")
try:
class OCTestObjectProperty9 (NSObject):
p1 = objc.object_property(read_only=True)
@p1.validate
def p1(self, v):
pass
except ValueError:
pass
else:
self.fail("ValueError not raised")
class TestBoolProperty (TestCase):
def testDefault(self):
class OCTestBoolProperty1 (NSObject):
p1 = objc.bool_property()
o = OCTestBoolProperty1.alloc().init()
self.assertEqual(o.p1, False)
o.p1 = [1, 2]
self.assertEqual(o.p1, True)
if __name__ == "__main__":
main()
|
bsd-2-clause
|
flwh/KK_mt6589_iq451
|
prebuilts/python/linux-x86/2.7.5/lib/python2.7/distutils/tests/test_cmd.py
|
87
|
3901
|
"""Tests for distutils.cmd."""
import unittest
import os
from test.test_support import captured_stdout, run_unittest
from distutils.cmd import Command
from distutils.dist import Distribution
from distutils.errors import DistutilsOptionError
from distutils import debug
class MyCmd(Command):
def initialize_options(self):
pass
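# Hedged sketch of the pattern under test (attribute names are illustrative):
#
#   cmd = MyCmd(Distribution())
#   cmd.opts = 'a,b'
#   cmd.ensure_string_list('opts')  # -> cmd.opts == ['a', 'b']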
class CommandTestCase(unittest.TestCase):
def setUp(self):
dist = Distribution()
self.cmd = MyCmd(dist)
def test_ensure_string_list_invalid(self):
# Renamed: a second test_ensure_string_list is defined further down and
# would otherwise shadow this test, so it would never run.
cmd = self.cmd
cmd.not_string_list = ['one', 2, 'three']
cmd.yes_string_list = ['one', 'two', 'three']
cmd.not_string_list2 = object()
cmd.yes_string_list2 = 'ok'
cmd.ensure_string_list('yes_string_list')
cmd.ensure_string_list('yes_string_list2')
self.assertRaises(DistutilsOptionError,
cmd.ensure_string_list, 'not_string_list')
self.assertRaises(DistutilsOptionError,
cmd.ensure_string_list, 'not_string_list2')
def test_make_file(self):
cmd = self.cmd
# making sure it raises when infiles is not a string or a list/tuple
self.assertRaises(TypeError, cmd.make_file,
infiles=1, outfile='', func='func', args=())
# making sure execute gets called properly
def _execute(func, args, exec_msg, level):
self.assertEqual(exec_msg, 'generating out from in')
cmd.force = True
cmd.execute = _execute
cmd.make_file(infiles='in', outfile='out', func='func', args=())
def test_dump_options(self):
msgs = []
def _announce(msg, level):
msgs.append(msg)
cmd = self.cmd
cmd.announce = _announce
cmd.option1 = 1
cmd.option2 = 1
cmd.user_options = [('option1', '', ''), ('option2', '', '')]
cmd.dump_options()
wanted = ["command options for 'MyCmd':", ' option1 = 1',
' option2 = 1']
self.assertEqual(msgs, wanted)
def test_ensure_string(self):
cmd = self.cmd
cmd.option1 = 'ok'
cmd.ensure_string('option1')
cmd.option2 = None
cmd.ensure_string('option2', 'xxx')
self.assertTrue(hasattr(cmd, 'option2'))
cmd.option3 = 1
self.assertRaises(DistutilsOptionError, cmd.ensure_string, 'option3')
def test_ensure_string_list(self):
cmd = self.cmd
cmd.option1 = 'ok,dok'
cmd.ensure_string_list('option1')
self.assertEqual(cmd.option1, ['ok', 'dok'])
cmd.option2 = ['xxx', 'www']
cmd.ensure_string_list('option2')
cmd.option3 = ['ok', 2]
self.assertRaises(DistutilsOptionError, cmd.ensure_string_list,
'option3')
def test_ensure_filename(self):
cmd = self.cmd
cmd.option1 = __file__
cmd.ensure_filename('option1')
cmd.option2 = 'xxx'
self.assertRaises(DistutilsOptionError, cmd.ensure_filename, 'option2')
def test_ensure_dirname(self):
cmd = self.cmd
cmd.option1 = os.path.dirname(__file__) or os.curdir
cmd.ensure_dirname('option1')
cmd.option2 = 'xxx'
self.assertRaises(DistutilsOptionError, cmd.ensure_dirname, 'option2')
def test_debug_print(self):
cmd = self.cmd
with captured_stdout() as stdout:
cmd.debug_print('xxx')
stdout.seek(0)
self.assertEqual(stdout.read(), '')
debug.DEBUG = True
try:
with captured_stdout() as stdout:
cmd.debug_print('xxx')
stdout.seek(0)
self.assertEqual(stdout.read(), 'xxx\n')
finally:
debug.DEBUG = False
def test_suite():
return unittest.makeSuite(CommandTestCase)
if __name__ == '__main__':
run_unittest(test_suite())
|
gpl-2.0
|
tombstone/models
|
research/slim/nets/mobilenet/mobilenet_v3_test.py
|
3
|
5913
|
# Copyright 2019 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Tests for google3.third_party.tensorflow_models.slim.nets.mobilenet.mobilenet_v3."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import tensorflow.compat.v1 as tf
from nets.mobilenet import mobilenet_v3
from google3.testing.pybase import parameterized
class MobilenetV3Test(tf.test.TestCase, parameterized.TestCase):
# pylint: disable = g-unreachable-test-method
def assertVariablesHaveNormalizerFn(self, use_groupnorm):
global_variables = [v.name for v in tf.global_variables()]
has_batch_norm = False
has_group_norm = False
for global_variable in global_variables:
if 'BatchNorm' in global_variable:
has_batch_norm = True
if 'GroupNorm' in global_variable:
has_group_norm = True
if use_groupnorm:
self.assertFalse(has_batch_norm)
self.assertTrue(has_group_norm)
else:
self.assertTrue(has_batch_norm)
self.assertFalse(has_group_norm)
@parameterized.named_parameters(('without_groupnorm', False),
('with_groupnorm', True))
def testMobilenetV3Large(self, use_groupnorm):
logits, endpoints = mobilenet_v3.mobilenet(
tf.placeholder(tf.float32, (1, 224, 224, 3)),
use_groupnorm=use_groupnorm)
self.assertEqual(endpoints['layer_19'].shape, [1, 1, 1, 1280])
self.assertEqual(logits.shape, [1, 1001])
self.assertVariablesHaveNormalizerFn(use_groupnorm)
@parameterized.named_parameters(('without_groupnorm', False),
('with_groupnorm', True))
def testMobilenetV3Small(self, use_groupnorm):
_, endpoints = mobilenet_v3.mobilenet(
tf.placeholder(tf.float32, (1, 224, 224, 3)),
conv_defs=mobilenet_v3.V3_SMALL,
use_groupnorm=use_groupnorm)
self.assertEqual(endpoints['layer_15'].shape, [1, 1, 1, 1024])
self.assertVariablesHaveNormalizerFn(use_groupnorm)
@parameterized.named_parameters(('without_groupnorm', False),
('with_groupnorm', True))
def testMobilenetEdgeTpu(self, use_groupnorm):
_, endpoints = mobilenet_v3.edge_tpu(
tf.placeholder(tf.float32, (1, 224, 224, 3)),
use_groupnorm=use_groupnorm)
self.assertIn('Inference mode is created by default',
mobilenet_v3.edge_tpu.__doc__)
self.assertEqual(endpoints['layer_24'].shape, [1, 7, 7, 1280])
self.assertStartsWith(
endpoints['layer_24'].name, 'MobilenetEdgeTPU')
self.assertVariablesHaveNormalizerFn(use_groupnorm)
def testMobilenetEdgeTpuChangeScope(self):
_, endpoints = mobilenet_v3.edge_tpu(
tf.placeholder(tf.float32, (1, 224, 224, 3)), scope='Scope')
self.assertStartsWith(
endpoints['layer_24'].name, 'Scope')
@parameterized.named_parameters(('without_groupnorm', False),
('with_groupnorm', True))
def testMobilenetV3BaseOnly(self, use_groupnorm):
result, endpoints = mobilenet_v3.mobilenet(
tf.placeholder(tf.float32, (1, 224, 224, 3)),
conv_defs=mobilenet_v3.V3_LARGE,
use_groupnorm=use_groupnorm,
base_only=True,
final_endpoint='layer_17')
# Get the latest layer before average pool.
self.assertEqual(endpoints['layer_17'].shape, [1, 7, 7, 960])
self.assertEqual(result, endpoints['layer_17'])
self.assertVariablesHaveNormalizerFn(use_groupnorm)
def testMobilenetV3BaseOnly_VariableInput(self):
result, endpoints = mobilenet_v3.mobilenet(
tf.placeholder(tf.float32, (None, None, None, 3)),
conv_defs=mobilenet_v3.V3_LARGE,
base_only=True,
final_endpoint='layer_17')
# Get the latest layer before average pool.
self.assertEqual(endpoints['layer_17'].shape.as_list(),
[None, None, None, 960])
self.assertEqual(result, endpoints['layer_17'])
# Use reduce mean for pooling and check for operation 'ReduceMean' in graph
@parameterized.named_parameters(('without_groupnorm', False),
('with_groupnorm', True))
def testMobilenetV3WithReduceMean(self, use_groupnorm):
_, _ = mobilenet_v3.mobilenet(
tf.placeholder(tf.float32, (1, 224, 224, 3)),
conv_defs=mobilenet_v3.V3_SMALL,
use_groupnorm=use_groupnorm,
use_reduce_mean_for_pooling=True)
g = tf.get_default_graph()
reduce_mean = [v for v in g.get_operations() if 'ReduceMean' in v.name]
self.assertNotEmpty(reduce_mean)
self.assertVariablesHaveNormalizerFn(use_groupnorm)
@parameterized.named_parameters(('without_groupnorm', False),
('with_groupnorm', True))
def testMobilenetV3WithOutReduceMean(self, use_groupnorm):
_, _ = mobilenet_v3.mobilenet(
tf.placeholder(tf.float32, (1, 224, 224, 3)),
conv_defs=mobilenet_v3.V3_SMALL,
use_groupnorm=use_groupnorm,
use_reduce_mean_for_pooling=False)
g = tf.get_default_graph()
reduce_mean = [v for v in g.get_operations() if 'ReduceMean' in v.name]
self.assertEmpty(reduce_mean)
self.assertVariablesHaveNormalizerFn(use_groupnorm)
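  # A minimal usage sketch (illustrative; mirrors testMobilenetV3Large above):
  #   images = tf.placeholder(tf.float32, (1, 224, 224, 3))
  #   logits, endpoints = mobilenet_v3.mobilenet(images)  # V3-large by default
  #   endpoints['layer_19'] has shape [1, 1, 1, 1280]; logits has shape [1, 1001]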
if __name__ == '__main__':
tf.test.main()
|
apache-2.0
|
pv/scikit-learn
|
sklearn/tree/tests/test_export.py
|
76
|
9318
|
"""
Testing for export functions of decision trees (sklearn.tree.export).
"""
from numpy.testing import assert_equal
from nose.tools import assert_raises
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor
from sklearn.tree import export_graphviz
from sklearn.externals.six import StringIO
# toy sample
X = [[-2, -1], [-1, -1], [-1, -2], [1, 1], [1, 2], [2, 1]]
y = [-1, -1, -1, 1, 1, 1]
y2 = [[-1, 1], [-1, 2], [-1, 3], [1, 1], [1, 2], [1, 3]]
w = [1, 1, 1, .5, .5, .5]
def test_graphviz_toy():
# Check correctness of export_graphviz
clf = DecisionTreeClassifier(max_depth=3,
min_samples_split=1,
criterion="gini",
random_state=2)
clf.fit(X, y)
# Test export code
out = StringIO()
export_graphviz(clf, out_file=out)
contents1 = out.getvalue()
contents2 = 'digraph Tree {\n' \
'node [shape=box] ;\n' \
'0 [label="X[0] <= 0.0\\ngini = 0.5\\nsamples = 6\\n' \
'value = [3, 3]"] ;\n' \
'1 [label="gini = 0.0\\nsamples = 3\\nvalue = [3, 0]"] ;\n' \
'0 -> 1 [labeldistance=2.5, labelangle=45, ' \
'headlabel="True"] ;\n' \
'2 [label="gini = 0.0\\nsamples = 3\\nvalue = [0, 3]"] ;\n' \
'0 -> 2 [labeldistance=2.5, labelangle=-45, ' \
'headlabel="False"] ;\n' \
'}'
assert_equal(contents1, contents2)
# Test with feature_names
out = StringIO()
export_graphviz(clf, out_file=out, feature_names=["feature0", "feature1"])
contents1 = out.getvalue()
contents2 = 'digraph Tree {\n' \
'node [shape=box] ;\n' \
'0 [label="feature0 <= 0.0\\ngini = 0.5\\nsamples = 6\\n' \
'value = [3, 3]"] ;\n' \
'1 [label="gini = 0.0\\nsamples = 3\\nvalue = [3, 0]"] ;\n' \
'0 -> 1 [labeldistance=2.5, labelangle=45, ' \
'headlabel="True"] ;\n' \
'2 [label="gini = 0.0\\nsamples = 3\\nvalue = [0, 3]"] ;\n' \
'0 -> 2 [labeldistance=2.5, labelangle=-45, ' \
'headlabel="False"] ;\n' \
'}'
assert_equal(contents1, contents2)
# Test with class_names
out = StringIO()
export_graphviz(clf, out_file=out, class_names=["yes", "no"])
contents1 = out.getvalue()
contents2 = 'digraph Tree {\n' \
'node [shape=box] ;\n' \
'0 [label="X[0] <= 0.0\\ngini = 0.5\\nsamples = 6\\n' \
'value = [3, 3]\\nclass = yes"] ;\n' \
'1 [label="gini = 0.0\\nsamples = 3\\nvalue = [3, 0]\\n' \
'class = yes"] ;\n' \
'0 -> 1 [labeldistance=2.5, labelangle=45, ' \
'headlabel="True"] ;\n' \
'2 [label="gini = 0.0\\nsamples = 3\\nvalue = [0, 3]\\n' \
'class = no"] ;\n' \
'0 -> 2 [labeldistance=2.5, labelangle=-45, ' \
'headlabel="False"] ;\n' \
'}'
assert_equal(contents1, contents2)
# Test plot_options
out = StringIO()
export_graphviz(clf, out_file=out, filled=True, impurity=False,
proportion=True, special_characters=True, rounded=True)
contents1 = out.getvalue()
contents2 = 'digraph Tree {\n' \
'node [shape=box, style="filled, rounded", color="black", ' \
'fontname=helvetica] ;\n' \
'edge [fontname=helvetica] ;\n' \
'0 [label=<X<SUB>0</SUB> ≤ 0.0<br/>samples = 100.0%<br/>' \
'value = [0.5, 0.5]>, fillcolor="#e5813900"] ;\n' \
'1 [label=<samples = 50.0%<br/>value = [1.0, 0.0]>, ' \
'fillcolor="#e58139ff"] ;\n' \
'0 -> 1 [labeldistance=2.5, labelangle=45, ' \
'headlabel="True"] ;\n' \
'2 [label=<samples = 50.0%<br/>value = [0.0, 1.0]>, ' \
'fillcolor="#399de5ff"] ;\n' \
'0 -> 2 [labeldistance=2.5, labelangle=-45, ' \
'headlabel="False"] ;\n' \
'}'
assert_equal(contents1, contents2)
# Test max_depth
out = StringIO()
export_graphviz(clf, out_file=out, max_depth=0, class_names=True)
contents1 = out.getvalue()
contents2 = 'digraph Tree {\n' \
'node [shape=box] ;\n' \
'0 [label="X[0] <= 0.0\\ngini = 0.5\\nsamples = 6\\n' \
'value = [3, 3]\\nclass = y[0]"] ;\n' \
'1 [label="(...)"] ;\n' \
'0 -> 1 ;\n' \
'2 [label="(...)"] ;\n' \
'0 -> 2 ;\n' \
'}'
assert_equal(contents1, contents2)
# Test max_depth with plot_options
out = StringIO()
export_graphviz(clf, out_file=out, max_depth=0, filled=True,
node_ids=True)
contents1 = out.getvalue()
contents2 = 'digraph Tree {\n' \
'node [shape=box, style="filled", color="black"] ;\n' \
'0 [label="node #0\\nX[0] <= 0.0\\ngini = 0.5\\n' \
'samples = 6\\nvalue = [3, 3]", fillcolor="#e5813900"] ;\n' \
'1 [label="(...)", fillcolor="#C0C0C0"] ;\n' \
'0 -> 1 ;\n' \
'2 [label="(...)", fillcolor="#C0C0C0"] ;\n' \
'0 -> 2 ;\n' \
'}'
assert_equal(contents1, contents2)
# Test multi-output with weighted samples
clf = DecisionTreeClassifier(max_depth=2,
min_samples_split=1,
criterion="gini",
random_state=2)
clf = clf.fit(X, y2, sample_weight=w)
out = StringIO()
export_graphviz(clf, out_file=out, filled=True, impurity=False)
contents1 = out.getvalue()
contents2 = 'digraph Tree {\n' \
'node [shape=box, style="filled", color="black"] ;\n' \
'0 [label="X[0] <= 0.0\\nsamples = 6\\n' \
'value = [[3.0, 1.5, 0.0]\\n' \
'[1.5, 1.5, 1.5]]", fillcolor="#e5813900"] ;\n' \
'1 [label="X[1] <= -1.5\\nsamples = 3\\n' \
'value = [[3, 0, 0]\\n[1, 1, 1]]", ' \
'fillcolor="#e5813965"] ;\n' \
'0 -> 1 [labeldistance=2.5, labelangle=45, ' \
'headlabel="True"] ;\n' \
'2 [label="samples = 1\\nvalue = [[1, 0, 0]\\n' \
'[0, 0, 1]]", fillcolor="#e58139ff"] ;\n' \
'1 -> 2 ;\n' \
'3 [label="samples = 2\\nvalue = [[2, 0, 0]\\n' \
'[1, 1, 0]]", fillcolor="#e581398c"] ;\n' \
'1 -> 3 ;\n' \
'4 [label="X[0] <= 1.5\\nsamples = 3\\n' \
'value = [[0.0, 1.5, 0.0]\\n[0.5, 0.5, 0.5]]", ' \
'fillcolor="#e5813965"] ;\n' \
'0 -> 4 [labeldistance=2.5, labelangle=-45, ' \
'headlabel="False"] ;\n' \
'5 [label="samples = 2\\nvalue = [[0.0, 1.0, 0.0]\\n' \
'[0.5, 0.5, 0.0]]", fillcolor="#e581398c"] ;\n' \
'4 -> 5 ;\n' \
'6 [label="samples = 1\\nvalue = [[0.0, 0.5, 0.0]\\n' \
'[0.0, 0.0, 0.5]]", fillcolor="#e58139ff"] ;\n' \
'4 -> 6 ;\n' \
'}'
assert_equal(contents1, contents2)
# Test regression output with plot_options
clf = DecisionTreeRegressor(max_depth=3,
min_samples_split=1,
criterion="mse",
random_state=2)
clf.fit(X, y)
out = StringIO()
export_graphviz(clf, out_file=out, filled=True, leaves_parallel=True,
rotate=True, rounded=True)
contents1 = out.getvalue()
contents2 = 'digraph Tree {\n' \
'node [shape=box, style="filled, rounded", color="black", ' \
'fontname=helvetica] ;\n' \
'graph [ranksep=equally, splines=polyline] ;\n' \
'edge [fontname=helvetica] ;\n' \
'rankdir=LR ;\n' \
'0 [label="X[0] <= 0.0\\nmse = 1.0\\nsamples = 6\\n' \
'value = 0.0", fillcolor="#e581397f"] ;\n' \
'1 [label="mse = 0.0\\nsamples = 3\\nvalue = -1.0", ' \
'fillcolor="#e5813900"] ;\n' \
'0 -> 1 [labeldistance=2.5, labelangle=-45, ' \
'headlabel="True"] ;\n' \
'2 [label="mse = 0.0\\nsamples = 3\\nvalue = 1.0", ' \
'fillcolor="#e58139ff"] ;\n' \
'0 -> 2 [labeldistance=2.5, labelangle=45, ' \
'headlabel="False"] ;\n' \
'{rank=same ; 0} ;\n' \
'{rank=same ; 1; 2} ;\n' \
'}'
assert_equal(contents1, contents2)
def test_graphviz_errors():
# Check for errors of export_graphviz
clf = DecisionTreeClassifier(max_depth=3, min_samples_split=1)
clf.fit(X, y)
# Check feature_names error
out = StringIO()
assert_raises(IndexError, export_graphviz, clf, out, feature_names=[])
# Check class_names error
out = StringIO()
assert_raises(IndexError, export_graphviz, clf, out, class_names=[])
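# A minimal usage sketch (illustrative; rendering assumes the Graphviz `dot`
# binary is installed):
#   out = StringIO()
#   export_graphviz(clf, out_file=out)
#   open('tree.dot', 'w').write(out.getvalue())
# then, from a shell: dot -Tpng tree.dot -o tree.png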
|
bsd-3-clause
|
igel-kun/pyload
|
module/plugins/hoster/XDCC.py
|
1
|
3550
|
# -*- coding: utf-8 -*-
import os
import re
from module.plugins.internal.Hoster import Hoster
from module.network.XDCCRequest import XDCCRequest
from module.plugins.internal.misc import parse_name, safejoin
class XDCC(Hoster):
__name__ = "XDCC"
__type__ = "hoster"
__version__ = "99"
__status__ = "testing"
__pattern__ = r'(?:xdcc|irc)://(?P<SERVER>.*?)/#?(?P<CHAN>.*?)/(?P<BOT>.*?)/#?(?P<PACK>\d+)/?'
# mimic IRSSI v0.8.6 by default
__config__ = [("nick", "str", "Nickname", "pyload" ),
("passowrd", "str", "Password for the nickname", "" ),
("realname", "str", "Realname", "really pyload" ),
("ctcp_version", "str", "Version string to send on CTCP VERSION requests", "irssi v0.8.6 - running on FreeBSD i686"),
("passive_port", "str", "Local port to open for passive DCC - 0 for auto select, X-Y for a range", 0)]
__description__ = """Download from IRC XDCC bot"""
__license__ = "GPLv3"
__authors__ = [("jeix", "[email protected]" ),
("GammaC0de", "nitzo2001[AT]yahoo[DOT]com"),
("igel", "")]
# NETWORK rules are commands to send to the server on connection, depending on the server name
NETWORK_RULES = [(r'abjects', ['JOIN #mg-chat']), (r'abandoned-irc', ['JOIN #zw-chat'])]
# PRIVMSG rules are rules to turn private messages from anyone whose name matches rule[0] into commands using re.sub(rule[1], rule[2])
PRIVMSG_RULES = [(r"(?:@staff|Zombies)", r".*you must /?join .*?(#[^ ]*) .*to download.*", r"JOIN \1")]
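    # e.g. an illustrative notice from 'Zombies' such as
    #   "you must join #somechan first to download packs"
    # is rewritten by the rule above into the command "JOIN #somechan"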
# ERROR patterns are patterns that, when received as a private notice, cause the download to fail
ERROR_PATTERN = r"(invalid pack|try again)"
def setup(self):
# TODO: find a way to do multiDL for different servers
self.multiDL = False
def parse_server(self, server):
temp = server.split(':')
server = temp[0]
if len(temp) == 2:
try:
port = int(temp[1])
except ValueError:
self.fail(_("Error: Erroneous port: %s." % temp[1]))
return (server, port)
elif len(temp) == 1:
return (server, 6667)
else:
self.fail(_("Invalid hostname for IRC Server: %s") % server)
def setup_base(self):
# check for duplicates before get_info() overwrites our perfectly good pyfile.name from a previous attempt with the silly "url"
self.check_duplicates()
def process(self, pyfile):
dl_basename = parse_name(pyfile.name)
dl_folder = self.pyload.config.get('general', 'download_folder')
dl_dirname = safejoin(dl_folder, pyfile.package().folder)
dl_filename = safejoin(dl_dirname, dl_basename)
try:
server, chan, bot, pack = re.match(self.__pattern__, pyfile.url).groups()
nick = self.config.get('nick')
password = self.config.get('password')
realname = self.config.get('realname')
# split the port from the server
            server, port = self.parse_server(server)
except Exception:
self.fail(_("malformed XDCC URI: %s - expected xdcc://server[:port]/chan/bot/pack" % pyfile.url))
self.req = XDCCRequest(self, pyfile)
self.req.download(server, port, chan, bot, pack, dl_filename, True, nick, password, realname)
|
gpl-3.0
|
nyuwireless/ns3-mmwave
|
.waf-1.8.12-f00e5b53f6bbeab1384a38c9cc5d51f7/waflib/Tools/dmd.py
|
21
|
1487
|
#! /usr/bin/env python
# encoding: utf-8
# WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file
import sys
from waflib.Tools import ar,d
from waflib.Configure import conf
@conf
def find_dmd(conf):
conf.find_program(['dmd','dmd2','ldc'],var='D')
out=conf.cmd_and_log(conf.env.D+['--help'])
if out.find("D Compiler v")==-1:
out=conf.cmd_and_log(conf.env.D+['-version'])
if out.find("based on DMD v1.")==-1:
conf.fatal("detected compiler is not dmd/ldc")
@conf
def common_flags_ldc(conf):
v=conf.env
v['DFLAGS']=['-d-version=Posix']
v['LINKFLAGS']=[]
v['DFLAGS_dshlib']=['-relocation-model=pic']
@conf
def common_flags_dmd(conf):
v=conf.env
v['D_SRC_F']=['-c']
v['D_TGT_F']='-of%s'
v['D_LINKER']=v['D']
v['DLNK_SRC_F']=''
v['DLNK_TGT_F']='-of%s'
v['DINC_ST']='-I%s'
v['DSHLIB_MARKER']=v['DSTLIB_MARKER']=''
v['DSTLIB_ST']=v['DSHLIB_ST']='-L-l%s'
v['DSTLIBPATH_ST']=v['DLIBPATH_ST']='-L-L%s'
v['LINKFLAGS_dprogram']=['-quiet']
v['DFLAGS_dshlib']=['-fPIC']
v['LINKFLAGS_dshlib']=['-L-shared']
v['DHEADER_ext']='.di'
v.DFLAGS_d_with_header=['-H','-Hf']
v['D_HDR_F']='%s'
def configure(conf):
conf.find_dmd()
if sys.platform=='win32':
out=conf.cmd_and_log(conf.env.D+['--help'])
if out.find("D Compiler v2.")>-1:
conf.fatal('dmd2 on Windows is not supported, use gdc or ldc2 instead')
conf.load('ar')
conf.load('d')
conf.common_flags_dmd()
conf.d_platform_flags()
if str(conf.env.D).find('ldc')>-1:
conf.common_flags_ldc()
|
gpl-2.0
|
OrlandoSoto/retirement
|
retirement_api/urls.py
|
1
|
1116
|
from django.conf.urls import patterns, include, url
from django.contrib.staticfiles.urls import staticfiles_urlpatterns
from django.contrib import admin
from django.conf import settings
admin.autodiscover()
urlpatterns = patterns('',
# url(r'^retirement-api/admin/', include(admin.site.urls)),
url(r'^retirement-api/estimator/$', 'retirement_api.views.estimator', name='estimator'),
url(r'^retirement-api/estimator/(?P<dob>[^/]+)/(?P<income>\d+)/$', 'retirement_api.views.estimator', name='estimator'),
url(r'^retirement/retirement-api/estimator/(?P<dob>[^/]+)/(?P<income>\d+)/$', 'retirement_api.views.estimator', name='estimator'),
url(r'^retirement-api/get-retirement-age/(?P<birth_year>\d+)/$', 'retirement_api.views.get_full_retirement_age', name='get_full_retirement_age'),
url(r'^claiming-social-security/$', 'retirement_api.views.claiming', name='claiming'),
url(r'^claiming-social-security/es/$', 'retirement_api.views.claiming', {'es': True}),
    url(r'^retirement/static/(?P<path>.*)$', 'django.contrib.staticfiles.views.serve')
)
urlpatterns += staticfiles_urlpatterns()
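# Illustrative routing example (values hypothetical):
#   GET /retirement-api/estimator/1955-05-05/40000/ resolves to
#   retirement_api.views.estimator(request, dob='1955-05-05', income='40000')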
|
cc0-1.0
|
mesosphere/aws-cfn-resource-bridge
|
aws/cfn/bridge/vendored/botocore/vendored/requests/packages/charade/chardistribution.py
|
2755
|
9226
|
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
# Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301 USA
######################### END LICENSE BLOCK #########################
from .euctwfreq import (EUCTWCharToFreqOrder, EUCTW_TABLE_SIZE,
EUCTW_TYPICAL_DISTRIBUTION_RATIO)
from .euckrfreq import (EUCKRCharToFreqOrder, EUCKR_TABLE_SIZE,
EUCKR_TYPICAL_DISTRIBUTION_RATIO)
from .gb2312freq import (GB2312CharToFreqOrder, GB2312_TABLE_SIZE,
GB2312_TYPICAL_DISTRIBUTION_RATIO)
from .big5freq import (Big5CharToFreqOrder, BIG5_TABLE_SIZE,
BIG5_TYPICAL_DISTRIBUTION_RATIO)
from .jisfreq import (JISCharToFreqOrder, JIS_TABLE_SIZE,
JIS_TYPICAL_DISTRIBUTION_RATIO)
from .compat import wrap_ord
ENOUGH_DATA_THRESHOLD = 1024
SURE_YES = 0.99
SURE_NO = 0.01
MINIMUM_DATA_THRESHOLD = 3
class CharDistributionAnalysis:
def __init__(self):
# Mapping table to get frequency order from char order (get from
# GetOrder())
self._mCharToFreqOrder = None
self._mTableSize = None # Size of above table
# This is a constant value which varies from language to language,
# used in calculating confidence. See
# http://www.mozilla.org/projects/intl/UniversalCharsetDetection.html
# for further detail.
self._mTypicalDistributionRatio = None
self.reset()
def reset(self):
"""reset analyser, clear any state"""
# If this flag is set to True, detection is done and conclusion has
# been made
self._mDone = False
self._mTotalChars = 0 # Total characters encountered
# The number of characters whose frequency order is less than 512
self._mFreqChars = 0
def feed(self, aBuf, aCharLen):
"""feed a character with known length"""
if aCharLen == 2:
            # we only care about 2-byte characters in our distribution analysis
order = self.get_order(aBuf)
else:
order = -1
if order >= 0:
self._mTotalChars += 1
# order is valid
if order < self._mTableSize:
if 512 > self._mCharToFreqOrder[order]:
self._mFreqChars += 1
def get_confidence(self):
"""return confidence based on existing data"""
        # if we didn't receive any characters in our consideration range,
        # return a negative answer
if self._mTotalChars <= 0 or self._mFreqChars <= MINIMUM_DATA_THRESHOLD:
return SURE_NO
if self._mTotalChars != self._mFreqChars:
r = (self._mFreqChars / ((self._mTotalChars - self._mFreqChars)
* self._mTypicalDistributionRatio))
if r < SURE_YES:
return r
# normalize confidence (we don't want to be 100% sure)
return SURE_YES
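    # Illustrative numbers: with 1000 total chars, 800 of them frequent, and a
    # typical distribution ratio of 6.0, r = 800 / (200 * 6.0) ~= 0.67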
def got_enough_data(self):
        # It is not necessary to receive all data to draw a conclusion.
        # For charset detection, a certain amount of data is enough.
return self._mTotalChars > ENOUGH_DATA_THRESHOLD
def get_order(self, aBuf):
# We do not handle characters based on the original encoding string,
# but convert this encoding string to a number, here called order.
# This allows multiple encodings of a language to share one frequency
# table.
return -1
class EUCTWDistributionAnalysis(CharDistributionAnalysis):
def __init__(self):
CharDistributionAnalysis.__init__(self)
self._mCharToFreqOrder = EUCTWCharToFreqOrder
self._mTableSize = EUCTW_TABLE_SIZE
self._mTypicalDistributionRatio = EUCTW_TYPICAL_DISTRIBUTION_RATIO
def get_order(self, aBuf):
# for euc-TW encoding, we are interested
# first byte range: 0xc4 -- 0xfe
# second byte range: 0xa1 -- 0xfe
# no validation needed here. State machine has done that
first_char = wrap_ord(aBuf[0])
if first_char >= 0xC4:
return 94 * (first_char - 0xC4) + wrap_ord(aBuf[1]) - 0xA1
else:
return -1
class EUCKRDistributionAnalysis(CharDistributionAnalysis):
def __init__(self):
CharDistributionAnalysis.__init__(self)
self._mCharToFreqOrder = EUCKRCharToFreqOrder
self._mTableSize = EUCKR_TABLE_SIZE
self._mTypicalDistributionRatio = EUCKR_TYPICAL_DISTRIBUTION_RATIO
def get_order(self, aBuf):
# for euc-KR encoding, we are interested
# first byte range: 0xb0 -- 0xfe
# second byte range: 0xa1 -- 0xfe
# no validation needed here. State machine has done that
first_char = wrap_ord(aBuf[0])
if first_char >= 0xB0:
return 94 * (first_char - 0xB0) + wrap_ord(aBuf[1]) - 0xA1
else:
return -1
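    # e.g. the EUC-KR byte pair 0xB0 0xA1 maps to order 94 * 0 + 0 = 0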
class GB2312DistributionAnalysis(CharDistributionAnalysis):
def __init__(self):
CharDistributionAnalysis.__init__(self)
self._mCharToFreqOrder = GB2312CharToFreqOrder
self._mTableSize = GB2312_TABLE_SIZE
self._mTypicalDistributionRatio = GB2312_TYPICAL_DISTRIBUTION_RATIO
def get_order(self, aBuf):
# for GB2312 encoding, we are interested
# first byte range: 0xb0 -- 0xfe
# second byte range: 0xa1 -- 0xfe
# no validation needed here. State machine has done that
first_char, second_char = wrap_ord(aBuf[0]), wrap_ord(aBuf[1])
if (first_char >= 0xB0) and (second_char >= 0xA1):
return 94 * (first_char - 0xB0) + second_char - 0xA1
else:
return -1
class Big5DistributionAnalysis(CharDistributionAnalysis):
def __init__(self):
CharDistributionAnalysis.__init__(self)
self._mCharToFreqOrder = Big5CharToFreqOrder
self._mTableSize = BIG5_TABLE_SIZE
self._mTypicalDistributionRatio = BIG5_TYPICAL_DISTRIBUTION_RATIO
def get_order(self, aBuf):
# for big5 encoding, we are interested
# first byte range: 0xa4 -- 0xfe
# second byte range: 0x40 -- 0x7e , 0xa1 -- 0xfe
# no validation needed here. State machine has done that
first_char, second_char = wrap_ord(aBuf[0]), wrap_ord(aBuf[1])
if first_char >= 0xA4:
if second_char >= 0xA1:
return 157 * (first_char - 0xA4) + second_char - 0xA1 + 63
else:
return 157 * (first_char - 0xA4) + second_char - 0x40
else:
return -1
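    # e.g. the Big5 byte pair 0xA4 0x40 maps to order 157 * 0 + 0 = 0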
class SJISDistributionAnalysis(CharDistributionAnalysis):
def __init__(self):
CharDistributionAnalysis.__init__(self)
self._mCharToFreqOrder = JISCharToFreqOrder
self._mTableSize = JIS_TABLE_SIZE
self._mTypicalDistributionRatio = JIS_TYPICAL_DISTRIBUTION_RATIO
def get_order(self, aBuf):
# for sjis encoding, we are interested
# first byte range: 0x81 -- 0x9f , 0xe0 -- 0xfe
        # second byte range: 0x40 -- 0x7e, 0x81 -- 0xfe
# no validation needed here. State machine has done that
first_char, second_char = wrap_ord(aBuf[0]), wrap_ord(aBuf[1])
if (first_char >= 0x81) and (first_char <= 0x9F):
order = 188 * (first_char - 0x81)
elif (first_char >= 0xE0) and (first_char <= 0xEF):
order = 188 * (first_char - 0xE0 + 31)
else:
return -1
order = order + second_char - 0x40
if second_char > 0x7F:
order = -1
return order
class EUCJPDistributionAnalysis(CharDistributionAnalysis):
def __init__(self):
CharDistributionAnalysis.__init__(self)
self._mCharToFreqOrder = JISCharToFreqOrder
self._mTableSize = JIS_TABLE_SIZE
self._mTypicalDistributionRatio = JIS_TYPICAL_DISTRIBUTION_RATIO
def get_order(self, aBuf):
# for euc-JP encoding, we are interested
# first byte range: 0xa0 -- 0xfe
# second byte range: 0xa1 -- 0xfe
# no validation needed here. State machine has done that
char = wrap_ord(aBuf[0])
if char >= 0xA0:
            return 94 * (char - 0xA1) + wrap_ord(aBuf[1]) - 0xA1
else:
return -1
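# A minimal usage sketch (illustrative; `two_byte_chars` is a hypothetical
# iterable of 2-byte strings):
#   analyser = EUCJPDistributionAnalysis()
#   for aBuf in two_byte_chars:
#       analyser.feed(aBuf, 2)
#   if analyser.got_enough_data():
#       confidence = analyser.get_confidence()  # between SURE_NO and SURE_YES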
|
apache-2.0
|
s20121035/rk3288_android5.1_repo
|
frameworks/base/tools/layoutlib/rename_font/build_font_single.py
|
1
|
6688
|
#!/usr/bin/env python
# Copyright (C) 2014 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the 'License');
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an 'AS IS' BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
Rename the PS name of the input font.
OpenType fonts (*.otf) are not currently supported. They are copied to the destination without renaming.
XML files are also copied in case they are passed there by mistake.
Usage: build_font_single.py /path/to/input_font.ttf /path/to/output_font.ttf
"""
import glob
import os
import re
import shutil
import sys
import xml.etree.ElementTree as etree
# Prevent .pyc files from being created.
sys.dont_write_bytecode = True
# fontTools is available at platform/external/fonttools
from fontTools import ttx
class FontInfo(object):
family = None
style = None
version = None
ends_in_regular = False
fullname = None
class InvalidFontException(Exception):
pass
# A constant to copy the font without modifying it. This is useful when running
# locally, as it speeds up the SDK build.
COPY_ONLY = False
# These constants represent the value of nameID parameter in the namerecord for
# different information.
# see http://scripts.sil.org/cms/scripts/page.php?item_id=IWS-Chapter08#3054f18b
NAMEID_FAMILY = 1
NAMEID_STYLE = 2
NAMEID_FULLNAME = 4
NAMEID_VERSION = 5
# A list of extensions to process.
EXTENSIONS = ['.ttf', '.otf', '.xml']
def main(argv):
if len(argv) < 2:
print 'Incorrect usage: ' + str(argv)
sys.exit('Usage: build_font_single.py /path/to/input/font.ttf /path/to/out/font.ttf')
dest_path = argv[-1]
input_path = argv[0]
extension = os.path.splitext(input_path)[1].lower()
if extension in EXTENSIONS:
if not COPY_ONLY and extension == '.ttf':
convert_font(input_path, dest_path)
return
shutil.copy(input_path, dest_path)
def convert_font(input_path, dest_path):
filename = os.path.basename(input_path)
print 'Converting font: ' + filename
# the path to the output file. The file name is the fontfilename.ttx
ttx_path = dest_path[:-1] + 'x'
try:
# run ttx to generate an xml file in the output folder which represents all
# its info
ttx_args = ['-q', '-o', ttx_path, input_path]
ttx.main(ttx_args)
# now parse the xml file to change its PS name.
tree = etree.parse(ttx_path)
root = tree.getroot()
for name in root.iter('name'):
update_tag(name, get_font_info(name))
tree.write(ttx_path, xml_declaration=True, encoding='utf-8')
    # generate the updated font now.
ttx_args = ['-q', '-o', dest_path, ttx_path]
ttx.main(ttx_args)
except InvalidFontException:
# In case of invalid fonts, we exit.
print filename + ' is not a valid font'
raise
except Exception as e:
print 'Error converting font: ' + filename
print e
# Some fonts are too big to be handled by the ttx library.
# Just copy paste them.
shutil.copy(input_path, dest_path)
try:
    # delete the temp ttx file if it exists.
os.remove(ttx_path)
except OSError:
pass
def get_font_info(tag):
""" Returns a list of FontInfo representing the various sets of namerecords
found in the name table of the font. """
fonts = []
font = None
last_name_id = sys.maxint
for namerecord in tag.iter('namerecord'):
if 'nameID' in namerecord.attrib:
name_id = int(namerecord.attrib['nameID'])
# A new font should be created for each platform, encoding and language
# id. But, since the nameIDs are sorted, we use the easy approach of
# creating a new one when the nameIDs reset.
if name_id <= last_name_id and font is not None:
fonts.append(font)
font = None
last_name_id = name_id
if font is None:
font = FontInfo()
if name_id == NAMEID_FAMILY:
font.family = namerecord.text.strip()
if name_id == NAMEID_STYLE:
font.style = namerecord.text.strip()
if name_id == NAMEID_FULLNAME:
font.ends_in_regular = ends_in_regular(namerecord.text)
font.fullname = namerecord.text.strip()
if name_id == NAMEID_VERSION:
font.version = get_version(namerecord.text)
if font is not None:
fonts.append(font)
return fonts
def update_tag(tag, fonts):
last_name_id = sys.maxint
  fonts_iterator = iter(fonts)
font = None
for namerecord in tag.iter('namerecord'):
if 'nameID' in namerecord.attrib:
name_id = int(namerecord.attrib['nameID'])
if name_id <= last_name_id:
font = fonts_iterator.next()
font = update_font_name(font)
last_name_id = name_id
if name_id == NAMEID_FAMILY:
namerecord.text = font.family
if name_id == NAMEID_FULLNAME:
namerecord.text = font.fullname
def update_font_name(font):
""" Compute the new font family name and font fullname. If the font has a
valid version, it's sanitized and appended to the font family name. The
font fullname is then created by joining the new family name and the
style. If the style is 'Regular', it is appended only if the original font
had it. """
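  # Illustrative: family='Roboto', style='Bold', version='201' (sanitized from
  # 'Version 2.01') -> family='Roboto201', fullname='Roboto201 Bold'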
if font.family is None or font.style is None:
raise InvalidFontException('Font doesn\'t have proper family name or style')
if font.version is not None:
new_family = font.family + font.version
else:
new_family = font.family
  if font.style == 'Regular' and not font.ends_in_regular:
font.fullname = new_family
else:
font.fullname = new_family + ' ' + font.style
font.family = new_family
return font
def ends_in_regular(string):
""" According to the specification, the font fullname should not end in
'Regular' for plain fonts. However, some fonts don't obey this rule. We
keep the style info, to minimize the diff. """
string = string.strip().split()[-1]
  return string == 'Regular'
def get_version(string):
# The string must begin with 'Version n.nn '
# to extract n.nn, we return the second entry in the split strings.
string = string.strip()
if not string.startswith('Version '):
raise InvalidFontException('mal-formed font version')
return sanitize(string.split()[1])
def sanitize(string):
return re.sub(r'[^\w-]+', '', string)
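# Illustrative: get_version('Version 2.01') -> sanitize('2.01') -> '201'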
if __name__ == '__main__':
main(sys.argv[1:])
|
gpl-3.0
|
arcivanov/pybuilder
|
src/integrationtest/python/smoke_setup_tests.py
|
3
|
1045
|
# -*- coding: utf-8 -*-
#
# This file is part of PyBuilder
#
# Copyright 2011-2020 PyBuilder Team
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import unittest
from smoke_itest_support import SmokeIntegrationTestSupport
class SetupSmokeTest(SmokeIntegrationTestSupport):
PROJECT_FILES = list(SmokeIntegrationTestSupport.PROJECT_FILES) + ["setup.py"]
def test_smoke_setup_install(self):
self.smoke_test_module("pip", "-vvvvvvvvvvvvvv", "install", ".")
if __name__ == "__main__":
unittest.main()
|
apache-2.0
|
neurodebian/htcondor
|
src/condor_contrib/condor_pigeon/src/condor_pigeon_client/skype_linux_tools/Skype4Py/Languages/pl.py
|
10
|
7899
|
apiAttachAvailable = u'API jest dost\u0119pne'
apiAttachNotAvailable = u'Niedost\u0119pny'
apiAttachPendingAuthorization = u'Autoryzacja w toku'
apiAttachRefused = u'Odmowa'
apiAttachSuccess = u'Sukces'
apiAttachUnknown = u'Nieznany'
budDeletedFriend = u'Usuni\u0119ty z listy znajomych'
budFriend = u'Znajomy'
budNeverBeenFriend = u'Nigdy nie by\u0142 na li\u015bcie znajomych'
budPendingAuthorization = u'Autoryzacja w toku'
budUnknown = u'Nieznany'
cfrBlockedByRecipient = u'Po\u0142\u0105czenie zablokowane przez odbiorc\u0119'
cfrMiscError = u'B\u0142\u0105d'
cfrNoCommonCodec = u'Brak podstawowego kodeka'
cfrNoProxyFound = u'Nie odnaleziono serwera proksy'
cfrNotAuthorizedByRecipient = u'Ten u\u017cytkownik nie ma autoryzacji odbiorcy'
cfrRecipientNotFriend = u'Odbiorca nie jest znajomym'
cfrRemoteDeviceError = u'Problem ze zdalnym urz\u0105dzeniem d\u017awi\u0119kowym'
cfrSessionTerminated = u'Sesja zako\u0144czona'
cfrSoundIOError = u'B\u0142\u0105d d\u017awi\u0119ku przychodz\u0105cego lub wychodz\u0105cego'
cfrSoundRecordingError = u'B\u0142\u0105d nagrywania d\u017awi\u0119ku'
cfrUnknown = u'Nieznany'
cfrUserDoesNotExist = u'Taki u\u017cytkownik lub numer telefonu nie istnieje'
cfrUserIsOffline = u'Ona lub On jest niedost\u0119pny'
chsAllCalls = u'Wszystkie'
chsDialog = u'Dialog'
chsIncomingCalls = u'Rozmowy przychodz\u0105ce'
chsLegacyDialog = u'Dialog przestarza\u0142y'
chsMissedCalls = u'Nie odebrane'
chsMultiNeedAccept = u'Zaakceptuj wielu uczestnik\xf3w'
chsMultiSubscribed = u'Wielu subskrybowanych'
chsOutgoingCalls = u'Rozmowy wychodz\u0105ce'
chsUnknown = u'Nieznany'
chsUnsubscribed = u'Nie jest abonentem'
clsBusy = u'Zaj\u0119te'
clsCancelled = u'Anulowane'
clsEarlyMedia = u'Odtwarzanie wczesnych medi\xf3w (Early Media)'
clsFailed = u'Niestety, nieudane po\u0142\u0105czenie!'
clsFinished = u'Zako\u0144czono'
clsInProgress = u'Rozmowa w toku'
clsLocalHold = u'Zawieszona przez u\u017cytkownika'
clsMissed = u'Nieodebrana rozmowa'
clsOnHold = u'Zawieszona'
clsRefused = u'Odmowa'
clsRemoteHold = u'Zawieszona przez odbiorc\u0119'
clsRinging = u'Dzwoni'
clsRouting = u'Trasowanie'
clsTransferred = u'Przekazana'
clsTransferring = u'Przekazywanie'
clsUnknown = u'Nieznany'
clsUnplaced = u'Nigdy nie \u0142\u0105czono'
clsVoicemailBufferingGreeting = u'Pozdrowienia podczas buforowania'
clsVoicemailCancelled = u'Poczta g\u0142osowa anulowana'
clsVoicemailFailed = u'B\u0142\u0105d poczty g\u0142osowej'
clsVoicemailPlayingGreeting = u'Odtwarzanie pozdrowienia'
clsVoicemailRecording = u'Nagrywanie poczty g\u0142osowej'
clsVoicemailSent = u'Poczta g\u0142osowa wys\u0142ana'
clsVoicemailUploading = u'Wysy\u0142anie poczty g\u0142osowej'
cltIncomingP2P = u'Rozmowa przychodz\u0105ca peer-to-peer'
cltIncomingPSTN = u'Rozmowa przychodz\u0105ca'
cltOutgoingP2P = u'Rozmowa wychodz\u0105ca peer-to-peer'
cltOutgoingPSTN = u'Rozmowa wychodz\u0105ca'
cltUnknown = u'Nieznany'
cmeAddedMembers = u'Cz\u0142onkowie dodani'
cmeCreatedChatWith = u'Rozpocz\u0119ty czat z'
cmeEmoted = u'Emoted'
cmeLeft = u'Opusci\u0142'
cmeSaid = u'Powiedzia\u0142'
cmeSawMembers = u'Zobaczy\u0142e\u015b cz\u0142onk\xf3w'
cmeSetTopic = u'Ustaw temat'
cmeUnknown = u'Nieznany'
cmsRead = u'Przeczyta\u0142'
cmsReceived = u'Otrzyma\u0142'
cmsSending = u'Wysy\u0142am...'
cmsSent = u'Wys\u0142any'
cmsUnknown = u'Nieznany'
conConnecting = u'\u0141\u0105czenie'
conOffline = u'Niepod\u0142\u0105czony'
conOnline = u'Dost\u0119pny'
conPausing = u'Wstrzymane'
conUnknown = u'Nieznany'
cusAway = u'Zaraz wracam'
cusDoNotDisturb = u'Nie przeszkadza\u0107'
cusInvisible = u'Niewidoczny'
cusLoggedOut = u'Niepod\u0142\u0105czony'
cusNotAvailable = u'Niedost\u0119pny'
cusOffline = u'Niepod\u0142\u0105czony'
cusOnline = u'Dost\u0119pny'
cusSkypeMe = u"Tryb 'Skype Me'"
cusUnknown = u'Nieznany'
cvsBothEnabled = u'Wy\u015blij i odbierz wideo'
cvsNone = u'Bez wideo'
cvsReceiveEnabled = u'Odbierz wideo'
cvsSendEnabled = u'Wy\u015blij wideo'
cvsUnknown = u'Nieznany'
grpAllFriends = u'Wszyscy znajomi'
grpAllUsers = u'Wszyscy u\u017cytkownicy'
grpCustomGroup = u'Niestandardowe'
grpOnlineFriends = u'Znajomi w sieci'
grpPendingAuthorizationFriends = u'Autoryzacja w toku'
grpProposedSharedGroup = u'Propozycja grupy wsp\xf3\u0142dzielonej'
grpRecentlyContactedUsers = u'Ostatnie kontakty'
grpSharedGroup = u'Wsp\xf3\u0142dzielona grupa'
grpSkypeFriends = u'Znajomi ze Skype'
grpSkypeOutFriends = u'Znajomi ze SkypeOut'
grpUngroupedFriends = u'Znajomi spoza grupy'
grpUnknown = u'Nieznany'
grpUsersAuthorizedByMe = u'Moja autoryzacja'
grpUsersBlockedByMe = u'Moja blokada'
grpUsersWaitingMyAuthorization = u'Pro\u015bba o autoryzacj\u0119'
leaAddDeclined = u'Dodawanie odrzucone'
leaAddedNotAuthorized = u'Osoba dodawana musi by\u0107 autoryzowana'
leaAdderNotFriend = u'Osoba dodaj\u0105ca musi by\u0107 znajomym'
leaUnknown = u'Nieznany'
leaUnsubscribe = u'Nie jest abonentem'
leaUserIncapable = u'U\u017cytkownik nie mo\u017ce rozmawia\u0107'
leaUserNotFound = u'U\u017cytkownik nie zosta\u0142 znaleziony'
olsAway = u'Zaraz wracam'
olsDoNotDisturb = u'Nie przeszkadza\u0107'
olsNotAvailable = u'Niedost\u0119pny'
olsOffline = u'Niepod\u0142\u0105czony'
olsOnline = u'Dost\u0119pny'
olsSkypeMe = u"Tryb 'Skype Me'"
olsSkypeOut = u'SkypeOut'
olsUnknown = u'Nieznany'
smsMessageStatusComposing = u'Tworzenie'
smsMessageStatusDelivered = u'Dor\u0119czona'
smsMessageStatusFailed = u'Nieudane'
smsMessageStatusRead = u'Przeczytana'
smsMessageStatusReceived = u'Otrzymany'
smsMessageStatusSendingToServer = u'Wysy\u0142anie do serwera'
smsMessageStatusSentToServer = u'Wys\u0142ana do serwera'
smsMessageStatusSomeTargetsFailed = u'Niekt\xf3re numery nieudane'
smsMessageStatusUnknown = u'Nieznany'
smsMessageTypeCCRequest = u'Pro\u015bba o kod potwierdzaj\u0105cy'
smsMessageTypeCCSubmit = u'Wys\u0142anie kodu potwierdzaj\u0105cego'
smsMessageTypeIncoming = u'Przychodz\u0105ca'
smsMessageTypeOutgoing = u'Wychodz\u0105ca'
smsMessageTypeUnknown = u'Nieznany'
smsTargetStatusAcceptable = u'Akceptowalny'
smsTargetStatusAnalyzing = u'Analiza'
smsTargetStatusDeliveryFailed = u'Nieudane'
smsTargetStatusDeliveryPending = u'Oczekuje'
smsTargetStatusDeliverySuccessful = u'Dor\u0119czona'
smsTargetStatusNotRoutable = u'Brak trasy'
smsTargetStatusUndefined = u'Niezdefiniowana'
smsTargetStatusUnknown = u'Nieznany'
usexFemale = u'Kobieta'
usexMale = u'M\u0119\u017cczyzna'
usexUnknown = u'Nieznany'
vmrConnectError = u'B\u0142\u0105d po\u0142\u0105czenia'
vmrFileReadError = u'B\u0142\u0105d odczytu pliku'
vmrFileWriteError = u'B\u0142\u0105d zapisu pliku'
vmrMiscError = u'B\u0142\u0105d'
vmrNoError = u'Bez b\u0142\u0119du'
vmrNoPrivilege = u'Brak uprawnie\u0144 Voicemail'
vmrNoVoicemail = u'Taka poczta g\u0142osowa nie istnieje'
vmrPlaybackError = u'B\u0142\u0105d odtwarzania'
vmrRecordingError = u'B\u0142\u0105d nagrywania'
vmrUnknown = u'Nieznany'
vmsBlank = u'Pusty'
vmsBuffering = u'Buforowanie'
vmsDeleting = u'Usuwanie'
vmsDownloading = u'Trwa pobieranie'
vmsFailed = u'Nie powiod\u0142o si\u0119'
vmsNotDownloaded = u'Niepobrany'
vmsPlayed = u'Odtworzony'
vmsPlaying = u'Odtwarzanie'
vmsRecorded = u'Nagrany'
vmsRecording = u'Nagrywanie poczty g\u0142osowej'
vmsUnknown = u'Nieznany'
vmsUnplayed = u'Nieodtworzony'
vmsUploaded = u'Przekazany'
vmsUploading = u'Przekazywanie'
vmtCustomGreeting = u'Pozdrowienia niestandardowe'
vmtDefaultGreeting = u'Pozdrowienia domy\u015blne'
vmtIncoming = u'przysy\u0142ana jest wiadomo\u015b\u0107 g\u0142osowa'
vmtOutgoing = u'Wychodz\u0105ca'
vmtUnknown = u'Nieznany'
vssAvailable = u'Dost\u0119pny'
vssNotAvailable = u'Niedost\u0119pny'
vssPaused = u'Wstrzymane'
vssRejected = u'Odrzucona'
vssRunning = u'Trwaj\u0105ca'
vssStarting = u'Rozpocz\u0119cie'
vssStopping = u'Zatrzymanie'
vssUnknown = u'Nieznany'
|
apache-2.0
|
miptliot/edx-platform
|
common/djangoapps/terrain/stubs/catalog.py
|
19
|
1690
|
"""
Stub implementation of catalog service for acceptance tests
"""
# pylint: disable=invalid-name, missing-docstring
import re
import urlparse
from .http import StubHttpRequestHandler, StubHttpService
class StubCatalogServiceHandler(StubHttpRequestHandler):
def do_GET(self):
pattern_handlers = {
r'/api/v1/programs/$': self.program_list,
r'/api/v1/programs/([0-9a-f-]+)/$': self.program_detail,
r'/api/v1/program_types/$': self.program_types,
}
if self.match_pattern(pattern_handlers):
return
self.send_response(404, content='404 Not Found')
def match_pattern(self, pattern_handlers):
"""
Find the correct handler method given the path info from the HTTP request.
"""
path = urlparse.urlparse(self.path).path
for pattern, handler in pattern_handlers.items():
match = re.match(pattern, path)
if match:
handler(*match.groups())
                return True
        return False
def program_list(self):
"""Stub the catalog's program list endpoint."""
programs = self.server.config.get('catalog.programs', [])
self.send_json_response(programs)
def program_detail(self, program_uuid):
"""Stub the catalog's program detail endpoint."""
program = self.server.config.get('catalog.programs.' + program_uuid)
self.send_json_response(program)
def program_types(self):
program_types = self.server.config.get('catalog.programs_types', [])
self.send_json_response(program_types)
class StubCatalogService(StubHttpService):
HANDLER_CLASS = StubCatalogServiceHandler
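# A minimal usage sketch (illustrative): a test registers fixtures under the
# config keys read above -- 'catalog.programs', 'catalog.programs.<uuid>' and
# 'catalog.programs_types' -- then issues GET /api/v1/programs/ (etc.) against
# the stub and asserts on the JSON responses.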
|
agpl-3.0
|
firebase/grpc-SwiftPM
|
src/python/grpcio/grpc_core_dependencies.py
|
1
|
48612
|
# Copyright 2015 gRPC authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# AUTO-GENERATED FROM `$REPO_ROOT/templates/src/python/grpcio/grpc_core_dependencies.py.template`!!!
CORE_SOURCE_FILES = [
'src/core/ext/filters/census/grpc_context.cc',
'src/core/ext/filters/client_channel/backend_metric.cc',
'src/core/ext/filters/client_channel/backup_poller.cc',
'src/core/ext/filters/client_channel/channel_connectivity.cc',
'src/core/ext/filters/client_channel/client_channel.cc',
'src/core/ext/filters/client_channel/client_channel_channelz.cc',
'src/core/ext/filters/client_channel/client_channel_factory.cc',
'src/core/ext/filters/client_channel/client_channel_plugin.cc',
'src/core/ext/filters/client_channel/global_subchannel_pool.cc',
'src/core/ext/filters/client_channel/health/health_check_client.cc',
'src/core/ext/filters/client_channel/http_connect_handshaker.cc',
'src/core/ext/filters/client_channel/http_proxy.cc',
'src/core/ext/filters/client_channel/lb_policy.cc',
'src/core/ext/filters/client_channel/lb_policy/child_policy_handler.cc',
'src/core/ext/filters/client_channel/lb_policy/grpclb/client_load_reporting_filter.cc',
'src/core/ext/filters/client_channel/lb_policy/grpclb/grpclb.cc',
'src/core/ext/filters/client_channel/lb_policy/grpclb/grpclb_channel_secure.cc',
'src/core/ext/filters/client_channel/lb_policy/grpclb/grpclb_client_stats.cc',
'src/core/ext/filters/client_channel/lb_policy/grpclb/load_balancer_api.cc',
'src/core/ext/filters/client_channel/lb_policy/pick_first/pick_first.cc',
'src/core/ext/filters/client_channel/lb_policy/round_robin/round_robin.cc',
'src/core/ext/filters/client_channel/lb_policy/xds/cds.cc',
'src/core/ext/filters/client_channel/lb_policy/xds/xds.cc',
'src/core/ext/filters/client_channel/lb_policy_registry.cc',
'src/core/ext/filters/client_channel/local_subchannel_pool.cc',
'src/core/ext/filters/client_channel/parse_address.cc',
'src/core/ext/filters/client_channel/proxy_mapper_registry.cc',
'src/core/ext/filters/client_channel/resolver.cc',
'src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc',
'src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_ev_driver.cc',
'src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_ev_driver_libuv.cc',
'src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_ev_driver_posix.cc',
'src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_ev_driver_windows.cc',
'src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc',
'src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper_fallback.cc',
'src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper_libuv.cc',
'src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper_posix.cc',
'src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper_windows.cc',
'src/core/ext/filters/client_channel/resolver/dns/dns_resolver_selection.cc',
'src/core/ext/filters/client_channel/resolver/dns/native/dns_resolver.cc',
'src/core/ext/filters/client_channel/resolver/fake/fake_resolver.cc',
'src/core/ext/filters/client_channel/resolver/sockaddr/sockaddr_resolver.cc',
'src/core/ext/filters/client_channel/resolver/xds/xds_resolver.cc',
'src/core/ext/filters/client_channel/resolver_registry.cc',
'src/core/ext/filters/client_channel/resolver_result_parsing.cc',
'src/core/ext/filters/client_channel/resolving_lb_policy.cc',
'src/core/ext/filters/client_channel/retry_throttle.cc',
'src/core/ext/filters/client_channel/server_address.cc',
'src/core/ext/filters/client_channel/service_config.cc',
'src/core/ext/filters/client_channel/subchannel.cc',
'src/core/ext/filters/client_channel/subchannel_pool_interface.cc',
'src/core/ext/filters/client_channel/xds/xds_api.cc',
'src/core/ext/filters/client_channel/xds/xds_bootstrap.cc',
'src/core/ext/filters/client_channel/xds/xds_channel_secure.cc',
'src/core/ext/filters/client_channel/xds/xds_client.cc',
'src/core/ext/filters/client_channel/xds/xds_client_stats.cc',
'src/core/ext/filters/client_idle/client_idle_filter.cc',
'src/core/ext/filters/deadline/deadline_filter.cc',
'src/core/ext/filters/http/client/http_client_filter.cc',
'src/core/ext/filters/http/client_authority_filter.cc',
'src/core/ext/filters/http/http_filters_plugin.cc',
'src/core/ext/filters/http/message_compress/message_compress_filter.cc',
'src/core/ext/filters/http/server/http_server_filter.cc',
'src/core/ext/filters/max_age/max_age_filter.cc',
'src/core/ext/filters/message_size/message_size_filter.cc',
'src/core/ext/filters/workarounds/workaround_cronet_compression_filter.cc',
'src/core/ext/filters/workarounds/workaround_utils.cc',
'src/core/ext/transport/chttp2/alpn/alpn.cc',
'src/core/ext/transport/chttp2/client/authority.cc',
'src/core/ext/transport/chttp2/client/chttp2_connector.cc',
'src/core/ext/transport/chttp2/client/insecure/channel_create.cc',
'src/core/ext/transport/chttp2/client/insecure/channel_create_posix.cc',
'src/core/ext/transport/chttp2/client/secure/secure_channel_create.cc',
'src/core/ext/transport/chttp2/server/chttp2_server.cc',
'src/core/ext/transport/chttp2/server/insecure/server_chttp2.cc',
'src/core/ext/transport/chttp2/server/insecure/server_chttp2_posix.cc',
'src/core/ext/transport/chttp2/server/secure/server_secure_chttp2.cc',
'src/core/ext/transport/chttp2/transport/bin_decoder.cc',
'src/core/ext/transport/chttp2/transport/bin_encoder.cc',
'src/core/ext/transport/chttp2/transport/chttp2_plugin.cc',
'src/core/ext/transport/chttp2/transport/chttp2_transport.cc',
'src/core/ext/transport/chttp2/transport/context_list.cc',
'src/core/ext/transport/chttp2/transport/flow_control.cc',
'src/core/ext/transport/chttp2/transport/frame_data.cc',
'src/core/ext/transport/chttp2/transport/frame_goaway.cc',
'src/core/ext/transport/chttp2/transport/frame_ping.cc',
'src/core/ext/transport/chttp2/transport/frame_rst_stream.cc',
'src/core/ext/transport/chttp2/transport/frame_settings.cc',
'src/core/ext/transport/chttp2/transport/frame_window_update.cc',
'src/core/ext/transport/chttp2/transport/hpack_encoder.cc',
'src/core/ext/transport/chttp2/transport/hpack_parser.cc',
'src/core/ext/transport/chttp2/transport/hpack_table.cc',
'src/core/ext/transport/chttp2/transport/http2_settings.cc',
'src/core/ext/transport/chttp2/transport/huffsyms.cc',
'src/core/ext/transport/chttp2/transport/incoming_metadata.cc',
'src/core/ext/transport/chttp2/transport/parsing.cc',
'src/core/ext/transport/chttp2/transport/stream_lists.cc',
'src/core/ext/transport/chttp2/transport/stream_map.cc',
'src/core/ext/transport/chttp2/transport/varint.cc',
'src/core/ext/transport/chttp2/transport/writing.cc',
'src/core/ext/transport/inproc/inproc_plugin.cc',
'src/core/ext/transport/inproc/inproc_transport.cc',
'src/core/ext/upb-generated/envoy/annotations/deprecation.upb.c',
'src/core/ext/upb-generated/envoy/annotations/resource.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/auth/cert.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/cds.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/cluster.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/cluster/circuit_breaker.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/cluster/filter.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/cluster/outlier_detection.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/core/address.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/core/base.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/core/config_source.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/core/grpc_service.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/core/health_check.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/core/http_uri.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/core/protocol.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/discovery.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/eds.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/endpoint.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/endpoint/endpoint.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/endpoint/endpoint_components.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/endpoint/load_report.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/lds.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/listener.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/listener/listener.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/listener/listener_components.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/listener/udp_listener_config.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/rds.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/route.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/route/route.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/route/route_components.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/scoped_route.upb.c',
'src/core/ext/upb-generated/envoy/api/v2/srds.upb.c',
'src/core/ext/upb-generated/envoy/config/filter/accesslog/v2/accesslog.upb.c',
'src/core/ext/upb-generated/envoy/config/filter/network/http_connection_manager/v2/http_connection_manager.upb.c',
'src/core/ext/upb-generated/envoy/config/listener/v2/api_listener.upb.c',
'src/core/ext/upb-generated/envoy/service/discovery/v2/ads.upb.c',
'src/core/ext/upb-generated/envoy/service/load_stats/v2/lrs.upb.c',
'src/core/ext/upb-generated/envoy/type/http.upb.c',
'src/core/ext/upb-generated/envoy/type/matcher/regex.upb.c',
'src/core/ext/upb-generated/envoy/type/matcher/string.upb.c',
'src/core/ext/upb-generated/envoy/type/metadata/v2/metadata.upb.c',
'src/core/ext/upb-generated/envoy/type/percent.upb.c',
'src/core/ext/upb-generated/envoy/type/range.upb.c',
'src/core/ext/upb-generated/envoy/type/semantic_version.upb.c',
'src/core/ext/upb-generated/envoy/type/tracing/v2/custom_tag.upb.c',
'src/core/ext/upb-generated/gogoproto/gogo.upb.c',
'src/core/ext/upb-generated/google/api/annotations.upb.c',
'src/core/ext/upb-generated/google/api/http.upb.c',
'src/core/ext/upb-generated/google/protobuf/any.upb.c',
'src/core/ext/upb-generated/google/protobuf/descriptor.upb.c',
'src/core/ext/upb-generated/google/protobuf/duration.upb.c',
'src/core/ext/upb-generated/google/protobuf/empty.upb.c',
'src/core/ext/upb-generated/google/protobuf/struct.upb.c',
'src/core/ext/upb-generated/google/protobuf/timestamp.upb.c',
'src/core/ext/upb-generated/google/protobuf/wrappers.upb.c',
'src/core/ext/upb-generated/google/rpc/status.upb.c',
'src/core/ext/upb-generated/src/proto/grpc/gcp/altscontext.upb.c',
'src/core/ext/upb-generated/src/proto/grpc/gcp/handshaker.upb.c',
'src/core/ext/upb-generated/src/proto/grpc/gcp/transport_security_common.upb.c',
'src/core/ext/upb-generated/src/proto/grpc/health/v1/health.upb.c',
'src/core/ext/upb-generated/src/proto/grpc/lb/v1/load_balancer.upb.c',
'src/core/ext/upb-generated/udpa/annotations/migrate.upb.c',
'src/core/ext/upb-generated/udpa/annotations/sensitive.upb.c',
'src/core/ext/upb-generated/udpa/data/orca/v1/orca_load_report.upb.c',
'src/core/ext/upb-generated/validate/validate.upb.c',
'src/core/lib/avl/avl.cc',
'src/core/lib/backoff/backoff.cc',
'src/core/lib/channel/channel_args.cc',
'src/core/lib/channel/channel_stack.cc',
'src/core/lib/channel/channel_stack_builder.cc',
'src/core/lib/channel/channel_trace.cc',
'src/core/lib/channel/channelz.cc',
'src/core/lib/channel/channelz_registry.cc',
'src/core/lib/channel/connected_channel.cc',
'src/core/lib/channel/handshaker.cc',
'src/core/lib/channel/handshaker_registry.cc',
'src/core/lib/channel/status_util.cc',
'src/core/lib/compression/compression.cc',
'src/core/lib/compression/compression_args.cc',
'src/core/lib/compression/compression_internal.cc',
'src/core/lib/compression/message_compress.cc',
'src/core/lib/compression/stream_compression.cc',
'src/core/lib/compression/stream_compression_gzip.cc',
'src/core/lib/compression/stream_compression_identity.cc',
'src/core/lib/debug/stats.cc',
'src/core/lib/debug/stats_data.cc',
'src/core/lib/debug/trace.cc',
'src/core/lib/gpr/alloc.cc',
'src/core/lib/gpr/atm.cc',
'src/core/lib/gpr/cpu_iphone.cc',
'src/core/lib/gpr/cpu_linux.cc',
'src/core/lib/gpr/cpu_posix.cc',
'src/core/lib/gpr/cpu_windows.cc',
'src/core/lib/gpr/env_linux.cc',
'src/core/lib/gpr/env_posix.cc',
'src/core/lib/gpr/env_windows.cc',
'src/core/lib/gpr/log.cc',
'src/core/lib/gpr/log_android.cc',
'src/core/lib/gpr/log_linux.cc',
'src/core/lib/gpr/log_posix.cc',
'src/core/lib/gpr/log_windows.cc',
'src/core/lib/gpr/murmur_hash.cc',
'src/core/lib/gpr/string.cc',
'src/core/lib/gpr/string_posix.cc',
'src/core/lib/gpr/string_util_windows.cc',
'src/core/lib/gpr/string_windows.cc',
'src/core/lib/gpr/sync.cc',
'src/core/lib/gpr/sync_abseil.cc',
'src/core/lib/gpr/sync_posix.cc',
'src/core/lib/gpr/sync_windows.cc',
'src/core/lib/gpr/time.cc',
'src/core/lib/gpr/time_posix.cc',
'src/core/lib/gpr/time_precise.cc',
'src/core/lib/gpr/time_windows.cc',
'src/core/lib/gpr/tls_pthread.cc',
'src/core/lib/gpr/tmpfile_msys.cc',
'src/core/lib/gpr/tmpfile_posix.cc',
'src/core/lib/gpr/tmpfile_windows.cc',
'src/core/lib/gpr/wrap_memcpy.cc',
'src/core/lib/gprpp/arena.cc',
'src/core/lib/gprpp/fork.cc',
'src/core/lib/gprpp/global_config_env.cc',
'src/core/lib/gprpp/host_port.cc',
'src/core/lib/gprpp/mpscq.cc',
'src/core/lib/gprpp/thd_posix.cc',
'src/core/lib/gprpp/thd_windows.cc',
'src/core/lib/http/format_request.cc',
'src/core/lib/http/httpcli.cc',
'src/core/lib/http/httpcli_security_connector.cc',
'src/core/lib/http/parser.cc',
'src/core/lib/iomgr/buffer_list.cc',
'src/core/lib/iomgr/call_combiner.cc',
'src/core/lib/iomgr/cfstream_handle.cc',
'src/core/lib/iomgr/combiner.cc',
'src/core/lib/iomgr/endpoint.cc',
'src/core/lib/iomgr/endpoint_cfstream.cc',
'src/core/lib/iomgr/endpoint_pair_posix.cc',
'src/core/lib/iomgr/endpoint_pair_uv.cc',
'src/core/lib/iomgr/endpoint_pair_windows.cc',
'src/core/lib/iomgr/error.cc',
'src/core/lib/iomgr/error_cfstream.cc',
'src/core/lib/iomgr/ev_epoll1_linux.cc',
'src/core/lib/iomgr/ev_epollex_linux.cc',
'src/core/lib/iomgr/ev_poll_posix.cc',
'src/core/lib/iomgr/ev_posix.cc',
'src/core/lib/iomgr/ev_windows.cc',
'src/core/lib/iomgr/exec_ctx.cc',
'src/core/lib/iomgr/executor.cc',
'src/core/lib/iomgr/executor/mpmcqueue.cc',
'src/core/lib/iomgr/executor/threadpool.cc',
'src/core/lib/iomgr/fork_posix.cc',
'src/core/lib/iomgr/fork_windows.cc',
'src/core/lib/iomgr/gethostname_fallback.cc',
'src/core/lib/iomgr/gethostname_host_name_max.cc',
'src/core/lib/iomgr/gethostname_sysconf.cc',
'src/core/lib/iomgr/grpc_if_nametoindex_posix.cc',
'src/core/lib/iomgr/grpc_if_nametoindex_unsupported.cc',
'src/core/lib/iomgr/internal_errqueue.cc',
'src/core/lib/iomgr/iocp_windows.cc',
'src/core/lib/iomgr/iomgr.cc',
'src/core/lib/iomgr/iomgr_custom.cc',
'src/core/lib/iomgr/iomgr_internal.cc',
'src/core/lib/iomgr/iomgr_posix.cc',
'src/core/lib/iomgr/iomgr_posix_cfstream.cc',
'src/core/lib/iomgr/iomgr_uv.cc',
'src/core/lib/iomgr/iomgr_windows.cc',
'src/core/lib/iomgr/is_epollexclusive_available.cc',
'src/core/lib/iomgr/load_file.cc',
'src/core/lib/iomgr/lockfree_event.cc',
'src/core/lib/iomgr/poller/eventmanager_libuv.cc',
'src/core/lib/iomgr/polling_entity.cc',
'src/core/lib/iomgr/pollset.cc',
'src/core/lib/iomgr/pollset_custom.cc',
'src/core/lib/iomgr/pollset_set.cc',
'src/core/lib/iomgr/pollset_set_custom.cc',
'src/core/lib/iomgr/pollset_set_windows.cc',
'src/core/lib/iomgr/pollset_uv.cc',
'src/core/lib/iomgr/pollset_windows.cc',
'src/core/lib/iomgr/resolve_address.cc',
'src/core/lib/iomgr/resolve_address_custom.cc',
'src/core/lib/iomgr/resolve_address_posix.cc',
'src/core/lib/iomgr/resolve_address_windows.cc',
'src/core/lib/iomgr/resource_quota.cc',
'src/core/lib/iomgr/sockaddr_utils.cc',
'src/core/lib/iomgr/socket_factory_posix.cc',
'src/core/lib/iomgr/socket_mutator.cc',
'src/core/lib/iomgr/socket_utils_common_posix.cc',
'src/core/lib/iomgr/socket_utils_linux.cc',
'src/core/lib/iomgr/socket_utils_posix.cc',
'src/core/lib/iomgr/socket_utils_uv.cc',
'src/core/lib/iomgr/socket_utils_windows.cc',
'src/core/lib/iomgr/socket_windows.cc',
'src/core/lib/iomgr/tcp_client.cc',
'src/core/lib/iomgr/tcp_client_cfstream.cc',
'src/core/lib/iomgr/tcp_client_custom.cc',
'src/core/lib/iomgr/tcp_client_posix.cc',
'src/core/lib/iomgr/tcp_client_windows.cc',
'src/core/lib/iomgr/tcp_custom.cc',
'src/core/lib/iomgr/tcp_posix.cc',
'src/core/lib/iomgr/tcp_server.cc',
'src/core/lib/iomgr/tcp_server_custom.cc',
'src/core/lib/iomgr/tcp_server_posix.cc',
'src/core/lib/iomgr/tcp_server_utils_posix_common.cc',
'src/core/lib/iomgr/tcp_server_utils_posix_ifaddrs.cc',
'src/core/lib/iomgr/tcp_server_utils_posix_noifaddrs.cc',
'src/core/lib/iomgr/tcp_server_windows.cc',
'src/core/lib/iomgr/tcp_uv.cc',
'src/core/lib/iomgr/tcp_windows.cc',
'src/core/lib/iomgr/time_averaged_stats.cc',
'src/core/lib/iomgr/timer.cc',
'src/core/lib/iomgr/timer_custom.cc',
'src/core/lib/iomgr/timer_generic.cc',
'src/core/lib/iomgr/timer_heap.cc',
'src/core/lib/iomgr/timer_manager.cc',
'src/core/lib/iomgr/timer_uv.cc',
'src/core/lib/iomgr/udp_server.cc',
'src/core/lib/iomgr/unix_sockets_posix.cc',
'src/core/lib/iomgr/unix_sockets_posix_noop.cc',
'src/core/lib/iomgr/wakeup_fd_eventfd.cc',
'src/core/lib/iomgr/wakeup_fd_nospecial.cc',
'src/core/lib/iomgr/wakeup_fd_pipe.cc',
'src/core/lib/iomgr/wakeup_fd_posix.cc',
'src/core/lib/iomgr/work_serializer.cc',
'src/core/lib/json/json_reader.cc',
'src/core/lib/json/json_writer.cc',
'src/core/lib/profiling/basic_timers.cc',
'src/core/lib/profiling/stap_timers.cc',
'src/core/lib/security/context/security_context.cc',
'src/core/lib/security/credentials/alts/alts_credentials.cc',
'src/core/lib/security/credentials/alts/check_gcp_environment.cc',
'src/core/lib/security/credentials/alts/check_gcp_environment_linux.cc',
'src/core/lib/security/credentials/alts/check_gcp_environment_no_op.cc',
'src/core/lib/security/credentials/alts/check_gcp_environment_windows.cc',
'src/core/lib/security/credentials/alts/grpc_alts_credentials_client_options.cc',
'src/core/lib/security/credentials/alts/grpc_alts_credentials_options.cc',
'src/core/lib/security/credentials/alts/grpc_alts_credentials_server_options.cc',
'src/core/lib/security/credentials/composite/composite_credentials.cc',
'src/core/lib/security/credentials/credentials.cc',
'src/core/lib/security/credentials/credentials_metadata.cc',
'src/core/lib/security/credentials/fake/fake_credentials.cc',
'src/core/lib/security/credentials/google_default/credentials_generic.cc',
'src/core/lib/security/credentials/google_default/google_default_credentials.cc',
'src/core/lib/security/credentials/iam/iam_credentials.cc',
'src/core/lib/security/credentials/jwt/json_token.cc',
'src/core/lib/security/credentials/jwt/jwt_credentials.cc',
'src/core/lib/security/credentials/jwt/jwt_verifier.cc',
'src/core/lib/security/credentials/local/local_credentials.cc',
'src/core/lib/security/credentials/oauth2/oauth2_credentials.cc',
'src/core/lib/security/credentials/plugin/plugin_credentials.cc',
'src/core/lib/security/credentials/ssl/ssl_credentials.cc',
'src/core/lib/security/credentials/tls/grpc_tls_credentials_options.cc',
'src/core/lib/security/credentials/tls/tls_credentials.cc',
'src/core/lib/security/security_connector/alts/alts_security_connector.cc',
'src/core/lib/security/security_connector/fake/fake_security_connector.cc',
'src/core/lib/security/security_connector/load_system_roots_fallback.cc',
'src/core/lib/security/security_connector/load_system_roots_linux.cc',
'src/core/lib/security/security_connector/local/local_security_connector.cc',
'src/core/lib/security/security_connector/security_connector.cc',
'src/core/lib/security/security_connector/ssl/ssl_security_connector.cc',
'src/core/lib/security/security_connector/ssl_utils.cc',
'src/core/lib/security/security_connector/ssl_utils_config.cc',
'src/core/lib/security/security_connector/tls/tls_security_connector.cc',
'src/core/lib/security/transport/client_auth_filter.cc',
'src/core/lib/security/transport/secure_endpoint.cc',
'src/core/lib/security/transport/security_handshaker.cc',
'src/core/lib/security/transport/server_auth_filter.cc',
'src/core/lib/security/transport/target_authority_table.cc',
'src/core/lib/security/transport/tsi_error.cc',
'src/core/lib/security/util/json_util.cc',
'src/core/lib/slice/b64.cc',
'src/core/lib/slice/percent_encoding.cc',
'src/core/lib/slice/slice.cc',
'src/core/lib/slice/slice_buffer.cc',
'src/core/lib/slice/slice_intern.cc',
'src/core/lib/slice/slice_string_helpers.cc',
'src/core/lib/surface/api_trace.cc',
'src/core/lib/surface/byte_buffer.cc',
'src/core/lib/surface/byte_buffer_reader.cc',
'src/core/lib/surface/call.cc',
'src/core/lib/surface/call_details.cc',
'src/core/lib/surface/call_log_batch.cc',
'src/core/lib/surface/channel.cc',
'src/core/lib/surface/channel_init.cc',
'src/core/lib/surface/channel_ping.cc',
'src/core/lib/surface/channel_stack_type.cc',
'src/core/lib/surface/completion_queue.cc',
'src/core/lib/surface/completion_queue_factory.cc',
'src/core/lib/surface/event_string.cc',
'src/core/lib/surface/init.cc',
'src/core/lib/surface/init_secure.cc',
'src/core/lib/surface/lame_client.cc',
'src/core/lib/surface/metadata_array.cc',
'src/core/lib/surface/server.cc',
'src/core/lib/surface/validate_metadata.cc',
'src/core/lib/surface/version.cc',
'src/core/lib/transport/bdp_estimator.cc',
'src/core/lib/transport/byte_stream.cc',
'src/core/lib/transport/connectivity_state.cc',
'src/core/lib/transport/error_utils.cc',
'src/core/lib/transport/metadata.cc',
'src/core/lib/transport/metadata_batch.cc',
'src/core/lib/transport/pid_controller.cc',
'src/core/lib/transport/static_metadata.cc',
'src/core/lib/transport/status_conversion.cc',
'src/core/lib/transport/status_metadata.cc',
'src/core/lib/transport/timeout_encoding.cc',
'src/core/lib/transport/transport.cc',
'src/core/lib/transport/transport_op_string.cc',
'src/core/lib/uri/uri_parser.cc',
'src/core/plugin_registry/grpc_plugin_registry.cc',
'src/core/tsi/alts/crypt/aes_gcm.cc',
'src/core/tsi/alts/crypt/gsec.cc',
'src/core/tsi/alts/frame_protector/alts_counter.cc',
'src/core/tsi/alts/frame_protector/alts_crypter.cc',
'src/core/tsi/alts/frame_protector/alts_frame_protector.cc',
'src/core/tsi/alts/frame_protector/alts_record_protocol_crypter_common.cc',
'src/core/tsi/alts/frame_protector/alts_seal_privacy_integrity_crypter.cc',
'src/core/tsi/alts/frame_protector/alts_unseal_privacy_integrity_crypter.cc',
'src/core/tsi/alts/frame_protector/frame_handler.cc',
'src/core/tsi/alts/handshaker/alts_handshaker_client.cc',
'src/core/tsi/alts/handshaker/alts_shared_resource.cc',
'src/core/tsi/alts/handshaker/alts_tsi_handshaker.cc',
'src/core/tsi/alts/handshaker/alts_tsi_utils.cc',
'src/core/tsi/alts/handshaker/transport_security_common_api.cc',
'src/core/tsi/alts/zero_copy_frame_protector/alts_grpc_integrity_only_record_protocol.cc',
'src/core/tsi/alts/zero_copy_frame_protector/alts_grpc_privacy_integrity_record_protocol.cc',
'src/core/tsi/alts/zero_copy_frame_protector/alts_grpc_record_protocol_common.cc',
'src/core/tsi/alts/zero_copy_frame_protector/alts_iovec_record_protocol.cc',
'src/core/tsi/alts/zero_copy_frame_protector/alts_zero_copy_grpc_protector.cc',
'src/core/tsi/fake_transport_security.cc',
'src/core/tsi/local_transport_security.cc',
'src/core/tsi/ssl/session_cache/ssl_session_boringssl.cc',
'src/core/tsi/ssl/session_cache/ssl_session_cache.cc',
'src/core/tsi/ssl/session_cache/ssl_session_openssl.cc',
'src/core/tsi/ssl_transport_security.cc',
'src/core/tsi/transport_security.cc',
'src/core/tsi/transport_security_grpc.cc',
'third_party/abseil-cpp/absl/base/dynamic_annotations.cc',
'third_party/abseil-cpp/absl/base/internal/cycleclock.cc',
'third_party/abseil-cpp/absl/base/internal/raw_logging.cc',
'third_party/abseil-cpp/absl/base/internal/spinlock.cc',
'third_party/abseil-cpp/absl/base/internal/spinlock_wait.cc',
'third_party/abseil-cpp/absl/base/internal/sysinfo.cc',
'third_party/abseil-cpp/absl/base/internal/thread_identity.cc',
'third_party/abseil-cpp/absl/base/internal/throw_delegate.cc',
'third_party/abseil-cpp/absl/base/internal/unscaledcycleclock.cc',
'third_party/abseil-cpp/absl/base/log_severity.cc',
'third_party/abseil-cpp/absl/numeric/int128.cc',
'third_party/abseil-cpp/absl/strings/ascii.cc',
'third_party/abseil-cpp/absl/strings/charconv.cc',
'third_party/abseil-cpp/absl/strings/escaping.cc',
'third_party/abseil-cpp/absl/strings/internal/charconv_bigint.cc',
'third_party/abseil-cpp/absl/strings/internal/charconv_parse.cc',
'third_party/abseil-cpp/absl/strings/internal/escaping.cc',
'third_party/abseil-cpp/absl/strings/internal/memutil.cc',
'third_party/abseil-cpp/absl/strings/internal/ostringstream.cc',
'third_party/abseil-cpp/absl/strings/internal/str_format/arg.cc',
'third_party/abseil-cpp/absl/strings/internal/str_format/bind.cc',
'third_party/abseil-cpp/absl/strings/internal/str_format/extension.cc',
'third_party/abseil-cpp/absl/strings/internal/str_format/float_conversion.cc',
'third_party/abseil-cpp/absl/strings/internal/str_format/output.cc',
'third_party/abseil-cpp/absl/strings/internal/str_format/parser.cc',
'third_party/abseil-cpp/absl/strings/internal/utf8.cc',
'third_party/abseil-cpp/absl/strings/match.cc',
'third_party/abseil-cpp/absl/strings/numbers.cc',
'third_party/abseil-cpp/absl/strings/str_cat.cc',
'third_party/abseil-cpp/absl/strings/str_replace.cc',
'third_party/abseil-cpp/absl/strings/str_split.cc',
'third_party/abseil-cpp/absl/strings/string_view.cc',
'third_party/abseil-cpp/absl/strings/substitute.cc',
'third_party/abseil-cpp/absl/types/bad_optional_access.cc',
'third_party/address_sorting/address_sorting.c',
'third_party/address_sorting/address_sorting_posix.c',
'third_party/address_sorting/address_sorting_windows.c',
'third_party/boringssl-with-bazel/err_data.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_bitstr.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_bool.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_d2i_fp.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_dup.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_enum.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_gentm.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_i2d_fp.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_int.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_mbstr.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_object.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_octet.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_print.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_strnid.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_time.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_type.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_utctm.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/a_utf8.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/asn1_lib.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/asn1_par.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/asn_pack.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/f_enum.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/f_int.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/f_string.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/tasn_dec.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/tasn_enc.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/tasn_fre.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/tasn_new.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/tasn_typ.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/tasn_utl.c',
'third_party/boringssl-with-bazel/src/crypto/asn1/time_support.c',
'third_party/boringssl-with-bazel/src/crypto/base64/base64.c',
'third_party/boringssl-with-bazel/src/crypto/bio/bio.c',
'third_party/boringssl-with-bazel/src/crypto/bio/bio_mem.c',
'third_party/boringssl-with-bazel/src/crypto/bio/connect.c',
'third_party/boringssl-with-bazel/src/crypto/bio/fd.c',
'third_party/boringssl-with-bazel/src/crypto/bio/file.c',
'third_party/boringssl-with-bazel/src/crypto/bio/hexdump.c',
'third_party/boringssl-with-bazel/src/crypto/bio/pair.c',
'third_party/boringssl-with-bazel/src/crypto/bio/printf.c',
'third_party/boringssl-with-bazel/src/crypto/bio/socket.c',
'third_party/boringssl-with-bazel/src/crypto/bio/socket_helper.c',
'third_party/boringssl-with-bazel/src/crypto/bn_extra/bn_asn1.c',
'third_party/boringssl-with-bazel/src/crypto/bn_extra/convert.c',
'third_party/boringssl-with-bazel/src/crypto/buf/buf.c',
'third_party/boringssl-with-bazel/src/crypto/bytestring/asn1_compat.c',
'third_party/boringssl-with-bazel/src/crypto/bytestring/ber.c',
'third_party/boringssl-with-bazel/src/crypto/bytestring/cbb.c',
'third_party/boringssl-with-bazel/src/crypto/bytestring/cbs.c',
'third_party/boringssl-with-bazel/src/crypto/bytestring/unicode.c',
'third_party/boringssl-with-bazel/src/crypto/chacha/chacha.c',
'third_party/boringssl-with-bazel/src/crypto/cipher_extra/cipher_extra.c',
'third_party/boringssl-with-bazel/src/crypto/cipher_extra/derive_key.c',
'third_party/boringssl-with-bazel/src/crypto/cipher_extra/e_aesccm.c',
'third_party/boringssl-with-bazel/src/crypto/cipher_extra/e_aesctrhmac.c',
'third_party/boringssl-with-bazel/src/crypto/cipher_extra/e_aesgcmsiv.c',
'third_party/boringssl-with-bazel/src/crypto/cipher_extra/e_chacha20poly1305.c',
'third_party/boringssl-with-bazel/src/crypto/cipher_extra/e_null.c',
'third_party/boringssl-with-bazel/src/crypto/cipher_extra/e_rc2.c',
'third_party/boringssl-with-bazel/src/crypto/cipher_extra/e_rc4.c',
'third_party/boringssl-with-bazel/src/crypto/cipher_extra/e_tls.c',
'third_party/boringssl-with-bazel/src/crypto/cipher_extra/tls_cbc.c',
'third_party/boringssl-with-bazel/src/crypto/cmac/cmac.c',
'third_party/boringssl-with-bazel/src/crypto/conf/conf.c',
'third_party/boringssl-with-bazel/src/crypto/cpu-aarch64-fuchsia.c',
'third_party/boringssl-with-bazel/src/crypto/cpu-aarch64-linux.c',
'third_party/boringssl-with-bazel/src/crypto/cpu-arm-linux.c',
'third_party/boringssl-with-bazel/src/crypto/cpu-arm.c',
'third_party/boringssl-with-bazel/src/crypto/cpu-intel.c',
'third_party/boringssl-with-bazel/src/crypto/cpu-ppc64le.c',
'third_party/boringssl-with-bazel/src/crypto/crypto.c',
'third_party/boringssl-with-bazel/src/crypto/curve25519/spake25519.c',
'third_party/boringssl-with-bazel/src/crypto/dh/check.c',
'third_party/boringssl-with-bazel/src/crypto/dh/dh.c',
'third_party/boringssl-with-bazel/src/crypto/dh/dh_asn1.c',
'third_party/boringssl-with-bazel/src/crypto/dh/params.c',
'third_party/boringssl-with-bazel/src/crypto/digest_extra/digest_extra.c',
'third_party/boringssl-with-bazel/src/crypto/dsa/dsa.c',
'third_party/boringssl-with-bazel/src/crypto/dsa/dsa_asn1.c',
'third_party/boringssl-with-bazel/src/crypto/ec_extra/ec_asn1.c',
'third_party/boringssl-with-bazel/src/crypto/ec_extra/ec_derive.c',
'third_party/boringssl-with-bazel/src/crypto/ecdh_extra/ecdh_extra.c',
'third_party/boringssl-with-bazel/src/crypto/ecdsa_extra/ecdsa_asn1.c',
'third_party/boringssl-with-bazel/src/crypto/engine/engine.c',
'third_party/boringssl-with-bazel/src/crypto/err/err.c',
'third_party/boringssl-with-bazel/src/crypto/evp/digestsign.c',
'third_party/boringssl-with-bazel/src/crypto/evp/evp.c',
'third_party/boringssl-with-bazel/src/crypto/evp/evp_asn1.c',
'third_party/boringssl-with-bazel/src/crypto/evp/evp_ctx.c',
'third_party/boringssl-with-bazel/src/crypto/evp/p_dsa_asn1.c',
'third_party/boringssl-with-bazel/src/crypto/evp/p_ec.c',
'third_party/boringssl-with-bazel/src/crypto/evp/p_ec_asn1.c',
'third_party/boringssl-with-bazel/src/crypto/evp/p_ed25519.c',
'third_party/boringssl-with-bazel/src/crypto/evp/p_ed25519_asn1.c',
'third_party/boringssl-with-bazel/src/crypto/evp/p_rsa.c',
'third_party/boringssl-with-bazel/src/crypto/evp/p_rsa_asn1.c',
'third_party/boringssl-with-bazel/src/crypto/evp/p_x25519.c',
'third_party/boringssl-with-bazel/src/crypto/evp/p_x25519_asn1.c',
'third_party/boringssl-with-bazel/src/crypto/evp/pbkdf.c',
'third_party/boringssl-with-bazel/src/crypto/evp/print.c',
'third_party/boringssl-with-bazel/src/crypto/evp/scrypt.c',
'third_party/boringssl-with-bazel/src/crypto/evp/sign.c',
'third_party/boringssl-with-bazel/src/crypto/ex_data.c',
'third_party/boringssl-with-bazel/src/crypto/fipsmodule/bcm.c',
'third_party/boringssl-with-bazel/src/crypto/fipsmodule/fips_shared_support.c',
'third_party/boringssl-with-bazel/src/crypto/fipsmodule/is_fips.c',
'third_party/boringssl-with-bazel/src/crypto/hkdf/hkdf.c',
'third_party/boringssl-with-bazel/src/crypto/hrss/hrss.c',
'third_party/boringssl-with-bazel/src/crypto/lhash/lhash.c',
'third_party/boringssl-with-bazel/src/crypto/mem.c',
'third_party/boringssl-with-bazel/src/crypto/obj/obj.c',
'third_party/boringssl-with-bazel/src/crypto/obj/obj_xref.c',
'third_party/boringssl-with-bazel/src/crypto/pem/pem_all.c',
'third_party/boringssl-with-bazel/src/crypto/pem/pem_info.c',
'third_party/boringssl-with-bazel/src/crypto/pem/pem_lib.c',
'third_party/boringssl-with-bazel/src/crypto/pem/pem_oth.c',
'third_party/boringssl-with-bazel/src/crypto/pem/pem_pk8.c',
'third_party/boringssl-with-bazel/src/crypto/pem/pem_pkey.c',
'third_party/boringssl-with-bazel/src/crypto/pem/pem_x509.c',
'third_party/boringssl-with-bazel/src/crypto/pem/pem_xaux.c',
'third_party/boringssl-with-bazel/src/crypto/pkcs7/pkcs7.c',
'third_party/boringssl-with-bazel/src/crypto/pkcs7/pkcs7_x509.c',
'third_party/boringssl-with-bazel/src/crypto/pkcs8/p5_pbev2.c',
'third_party/boringssl-with-bazel/src/crypto/pkcs8/pkcs8.c',
'third_party/boringssl-with-bazel/src/crypto/pkcs8/pkcs8_x509.c',
'third_party/boringssl-with-bazel/src/crypto/poly1305/poly1305.c',
'third_party/boringssl-with-bazel/src/crypto/poly1305/poly1305_arm.c',
'third_party/boringssl-with-bazel/src/crypto/poly1305/poly1305_vec.c',
'third_party/boringssl-with-bazel/src/crypto/pool/pool.c',
'third_party/boringssl-with-bazel/src/crypto/rand_extra/deterministic.c',
'third_party/boringssl-with-bazel/src/crypto/rand_extra/forkunsafe.c',
'third_party/boringssl-with-bazel/src/crypto/rand_extra/fuchsia.c',
'third_party/boringssl-with-bazel/src/crypto/rand_extra/rand_extra.c',
'third_party/boringssl-with-bazel/src/crypto/rand_extra/windows.c',
'third_party/boringssl-with-bazel/src/crypto/rc4/rc4.c',
'third_party/boringssl-with-bazel/src/crypto/refcount_c11.c',
'third_party/boringssl-with-bazel/src/crypto/refcount_lock.c',
'third_party/boringssl-with-bazel/src/crypto/rsa_extra/rsa_asn1.c',
'third_party/boringssl-with-bazel/src/crypto/rsa_extra/rsa_print.c',
'third_party/boringssl-with-bazel/src/crypto/siphash/siphash.c',
'third_party/boringssl-with-bazel/src/crypto/stack/stack.c',
'third_party/boringssl-with-bazel/src/crypto/thread.c',
'third_party/boringssl-with-bazel/src/crypto/thread_none.c',
'third_party/boringssl-with-bazel/src/crypto/thread_pthread.c',
'third_party/boringssl-with-bazel/src/crypto/thread_win.c',
'third_party/boringssl-with-bazel/src/crypto/x509/a_digest.c',
'third_party/boringssl-with-bazel/src/crypto/x509/a_sign.c',
'third_party/boringssl-with-bazel/src/crypto/x509/a_strex.c',
'third_party/boringssl-with-bazel/src/crypto/x509/a_verify.c',
'third_party/boringssl-with-bazel/src/crypto/x509/algorithm.c',
'third_party/boringssl-with-bazel/src/crypto/x509/asn1_gen.c',
'third_party/boringssl-with-bazel/src/crypto/x509/by_dir.c',
'third_party/boringssl-with-bazel/src/crypto/x509/by_file.c',
'third_party/boringssl-with-bazel/src/crypto/x509/i2d_pr.c',
'third_party/boringssl-with-bazel/src/crypto/x509/rsa_pss.c',
'third_party/boringssl-with-bazel/src/crypto/x509/t_crl.c',
'third_party/boringssl-with-bazel/src/crypto/x509/t_req.c',
'third_party/boringssl-with-bazel/src/crypto/x509/t_x509.c',
'third_party/boringssl-with-bazel/src/crypto/x509/t_x509a.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_att.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_cmp.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_d2.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_def.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_ext.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_lu.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_obj.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_r2x.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_req.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_set.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_trs.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_txt.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_v3.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_vfy.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509_vpm.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509cset.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509name.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509rset.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x509spki.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_algor.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_all.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_attrib.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_crl.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_exten.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_info.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_name.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_pkey.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_pubkey.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_req.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_sig.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_spki.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_val.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_x509.c',
'third_party/boringssl-with-bazel/src/crypto/x509/x_x509a.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/pcy_cache.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/pcy_data.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/pcy_lib.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/pcy_map.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/pcy_node.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/pcy_tree.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_akey.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_akeya.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_alt.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_bcons.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_bitst.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_conf.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_cpols.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_crld.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_enum.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_extku.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_genn.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_ia5.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_info.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_int.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_lib.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_ncons.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_ocsp.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_pci.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_pcia.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_pcons.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_pku.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_pmaps.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_prn.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_purp.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_skey.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_sxnet.c',
'third_party/boringssl-with-bazel/src/crypto/x509v3/v3_utl.c',
'third_party/boringssl-with-bazel/src/ssl/bio_ssl.cc',
'third_party/boringssl-with-bazel/src/ssl/d1_both.cc',
'third_party/boringssl-with-bazel/src/ssl/d1_lib.cc',
'third_party/boringssl-with-bazel/src/ssl/d1_pkt.cc',
'third_party/boringssl-with-bazel/src/ssl/d1_srtp.cc',
'third_party/boringssl-with-bazel/src/ssl/dtls_method.cc',
'third_party/boringssl-with-bazel/src/ssl/dtls_record.cc',
'third_party/boringssl-with-bazel/src/ssl/handoff.cc',
'third_party/boringssl-with-bazel/src/ssl/handshake.cc',
'third_party/boringssl-with-bazel/src/ssl/handshake_client.cc',
'third_party/boringssl-with-bazel/src/ssl/handshake_server.cc',
'third_party/boringssl-with-bazel/src/ssl/s3_both.cc',
'third_party/boringssl-with-bazel/src/ssl/s3_lib.cc',
'third_party/boringssl-with-bazel/src/ssl/s3_pkt.cc',
'third_party/boringssl-with-bazel/src/ssl/ssl_aead_ctx.cc',
'third_party/boringssl-with-bazel/src/ssl/ssl_asn1.cc',
'third_party/boringssl-with-bazel/src/ssl/ssl_buffer.cc',
'third_party/boringssl-with-bazel/src/ssl/ssl_cert.cc',
'third_party/boringssl-with-bazel/src/ssl/ssl_cipher.cc',
'third_party/boringssl-with-bazel/src/ssl/ssl_file.cc',
'third_party/boringssl-with-bazel/src/ssl/ssl_key_share.cc',
'third_party/boringssl-with-bazel/src/ssl/ssl_lib.cc',
'third_party/boringssl-with-bazel/src/ssl/ssl_privkey.cc',
'third_party/boringssl-with-bazel/src/ssl/ssl_session.cc',
'third_party/boringssl-with-bazel/src/ssl/ssl_stat.cc',
'third_party/boringssl-with-bazel/src/ssl/ssl_transcript.cc',
'third_party/boringssl-with-bazel/src/ssl/ssl_versions.cc',
'third_party/boringssl-with-bazel/src/ssl/ssl_x509.cc',
'third_party/boringssl-with-bazel/src/ssl/t1_enc.cc',
'third_party/boringssl-with-bazel/src/ssl/t1_lib.cc',
'third_party/boringssl-with-bazel/src/ssl/tls13_both.cc',
'third_party/boringssl-with-bazel/src/ssl/tls13_client.cc',
'third_party/boringssl-with-bazel/src/ssl/tls13_enc.cc',
'third_party/boringssl-with-bazel/src/ssl/tls13_server.cc',
'third_party/boringssl-with-bazel/src/ssl/tls_method.cc',
'third_party/boringssl-with-bazel/src/ssl/tls_record.cc',
'third_party/boringssl-with-bazel/src/third_party/fiat/curve25519.c',
'third_party/cares/cares/ares__close_sockets.c',
'third_party/cares/cares/ares__get_hostent.c',
'third_party/cares/cares/ares__read_line.c',
'third_party/cares/cares/ares__timeval.c',
'third_party/cares/cares/ares_cancel.c',
'third_party/cares/cares/ares_create_query.c',
'third_party/cares/cares/ares_data.c',
'third_party/cares/cares/ares_destroy.c',
'third_party/cares/cares/ares_expand_name.c',
'third_party/cares/cares/ares_expand_string.c',
'third_party/cares/cares/ares_fds.c',
'third_party/cares/cares/ares_free_hostent.c',
'third_party/cares/cares/ares_free_string.c',
'third_party/cares/cares/ares_getenv.c',
'third_party/cares/cares/ares_gethostbyaddr.c',
'third_party/cares/cares/ares_gethostbyname.c',
'third_party/cares/cares/ares_getnameinfo.c',
'third_party/cares/cares/ares_getopt.c',
'third_party/cares/cares/ares_getsock.c',
'third_party/cares/cares/ares_init.c',
'third_party/cares/cares/ares_library_init.c',
'third_party/cares/cares/ares_llist.c',
'third_party/cares/cares/ares_mkquery.c',
'third_party/cares/cares/ares_nowarn.c',
'third_party/cares/cares/ares_options.c',
'third_party/cares/cares/ares_parse_a_reply.c',
'third_party/cares/cares/ares_parse_aaaa_reply.c',
'third_party/cares/cares/ares_parse_mx_reply.c',
'third_party/cares/cares/ares_parse_naptr_reply.c',
'third_party/cares/cares/ares_parse_ns_reply.c',
'third_party/cares/cares/ares_parse_ptr_reply.c',
'third_party/cares/cares/ares_parse_soa_reply.c',
'third_party/cares/cares/ares_parse_srv_reply.c',
'third_party/cares/cares/ares_parse_txt_reply.c',
'third_party/cares/cares/ares_platform.c',
'third_party/cares/cares/ares_process.c',
'third_party/cares/cares/ares_query.c',
'third_party/cares/cares/ares_search.c',
'third_party/cares/cares/ares_send.c',
'third_party/cares/cares/ares_strcasecmp.c',
'third_party/cares/cares/ares_strdup.c',
'third_party/cares/cares/ares_strerror.c',
'third_party/cares/cares/ares_strsplit.c',
'third_party/cares/cares/ares_timeout.c',
'third_party/cares/cares/ares_version.c',
'third_party/cares/cares/ares_writev.c',
'third_party/cares/cares/bitncmp.c',
'third_party/cares/cares/inet_net_pton.c',
'third_party/cares/cares/inet_ntop.c',
'third_party/cares/cares/windows_port.c',
'third_party/upb/upb/decode.c',
'third_party/upb/upb/encode.c',
'third_party/upb/upb/msg.c',
'third_party/upb/upb/port.c',
'third_party/upb/upb/table.c',
'third_party/upb/upb/upb.c',
'third_party/zlib/adler32.c',
'third_party/zlib/compress.c',
'third_party/zlib/crc32.c',
'third_party/zlib/deflate.c',
'third_party/zlib/gzclose.c',
'third_party/zlib/gzlib.c',
'third_party/zlib/gzread.c',
'third_party/zlib/gzwrite.c',
'third_party/zlib/infback.c',
'third_party/zlib/inffast.c',
'third_party/zlib/inflate.c',
'third_party/zlib/inftrees.c',
'third_party/zlib/trees.c',
'third_party/zlib/uncompr.c',
'third_party/zlib/zutil.c',
]
|
apache-2.0
|
Yukarumya/Yukarum-Redfoxes
|
python/mozbuild/mozbuild/frontend/mach_commands.py
|
2
|
7822
|
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
from __future__ import absolute_import, print_function, unicode_literals
from collections import defaultdict
import os
from mach.decorators import (
CommandArgument,
CommandProvider,
Command,
SubCommand,
)
from mozbuild.base import MachCommandBase
import mozpack.path as mozpath
class InvalidPathException(Exception):
"""Represents an error due to an invalid path."""
@CommandProvider
class MozbuildFileCommands(MachCommandBase):
@Command('mozbuild-reference', category='build-dev',
description='View reference documentation on mozbuild files.')
@CommandArgument('symbol', default=None, nargs='*',
help='Symbol to view help on. If not specified, all will be shown.')
@CommandArgument('--name-only', '-n', default=False, action='store_true',
help='Print symbol names only.')
def reference(self, symbol, name_only=False):
# mozbuild.sphinx imports some Sphinx modules, so we need to be sure
# the optional Sphinx package is installed.
self._activate_virtualenv()
self.virtualenv_manager.install_pip_package('Sphinx==1.1.3')
from mozbuild.sphinx import (
format_module,
function_reference,
special_reference,
variable_reference,
)
import mozbuild.frontend.context as m
if name_only:
for s in sorted(m.VARIABLES.keys()):
print(s)
for s in sorted(m.FUNCTIONS.keys()):
print(s)
for s in sorted(m.SPECIAL_VARIABLES.keys()):
print(s)
return 0
if len(symbol):
for s in symbol:
if s in m.VARIABLES:
for line in variable_reference(s, *m.VARIABLES[s]):
print(line)
continue
elif s in m.FUNCTIONS:
for line in function_reference(s, *m.FUNCTIONS[s]):
print(line)
continue
elif s in m.SPECIAL_VARIABLES:
for line in special_reference(s, *m.SPECIAL_VARIABLES[s]):
print(line)
continue
print('Could not find symbol: %s' % s)
return 1
return 0
for line in format_module(m):
print(line)
return 0
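    # Example invocation (illustrative): `./mach mozbuild-reference --name-only`
    # lists every documented symbol, while `./mach mozbuild-reference DIRS`
    # prints the reference docs for that single variable.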
@Command('file-info', category='build-dev',
description='Query for metadata about files.')
def file_info(self):
"""Show files metadata derived from moz.build files.
moz.build files contain "Files" sub-contexts for declaring metadata
against file patterns. This command suite is used to query that data.
"""
@SubCommand('file-info', 'bugzilla-component',
'Show Bugzilla component info for files listed.')
@CommandArgument('-r', '--rev',
help='Version control revision to look up info from')
@CommandArgument('paths', nargs='+',
help='Paths whose data to query')
def file_info_bugzilla(self, paths, rev=None):
"""Show Bugzilla component for a set of files.
Given a requested set of files (which can be specified using
wildcards), print the Bugzilla component for each file.
"""
components = defaultdict(set)
try:
for p, m in self._get_files_info(paths, rev=rev).items():
components[m.get('BUG_COMPONENT')].add(p)
except InvalidPathException as e:
print(e.message)
return 1
for component, files in sorted(components.items(), key=lambda x: (x is None, x)):
print('%s :: %s' % (component.product, component.component) if component else 'UNKNOWN')
for f in sorted(files):
print(' %s' % f)
@SubCommand('file-info', 'missing-bugzilla',
'Show files missing Bugzilla component info')
@CommandArgument('-r', '--rev',
help='Version control revision to look up info from')
@CommandArgument('paths', nargs='+',
help='Paths whose data to query')
def file_info_missing_bugzilla(self, paths, rev=None):
try:
for p, m in sorted(self._get_files_info(paths, rev=rev).items()):
if 'BUG_COMPONENT' not in m:
print(p)
except InvalidPathException as e:
print(e.message)
return 1
@SubCommand('file-info', 'dep-tests',
'Show test files marked as dependencies of these source files.')
@CommandArgument('-r', '--rev',
help='Version control revision to look up info from')
@CommandArgument('paths', nargs='+',
help='Paths whose data to query')
def file_info_test_deps(self, paths, rev=None):
try:
for p, m in self._get_files_info(paths, rev=rev).items():
print('%s:' % mozpath.relpath(p, self.topsrcdir))
if m.test_files:
print('\tTest file patterns:')
for p in m.test_files:
print('\t\t%s' % p)
if m.test_tags:
print('\tRelevant tags:')
for p in m.test_tags:
print('\t\t%s' % p)
if m.test_flavors:
print('\tRelevant flavors:')
for p in m.test_flavors:
print('\t\t%s' % p)
except InvalidPathException as e:
print(e.message)
return 1
def _get_reader(self, finder):
from mozbuild.frontend.reader import (
BuildReader,
EmptyConfig,
)
config = EmptyConfig(self.topsrcdir)
return BuildReader(config, finder=finder)
def _get_files_info(self, paths, rev=None):
from mozbuild.frontend.reader import default_finder
from mozpack.files import FileFinder, MercurialRevisionFinder
# Normalize to relative from topsrcdir.
relpaths = []
for p in paths:
a = mozpath.abspath(p)
if not mozpath.basedir(a, [self.topsrcdir]):
raise InvalidPathException('path is outside topsrcdir: %s' % p)
relpaths.append(mozpath.relpath(a, self.topsrcdir))
repo = None
if rev:
hg_path = os.path.join(self.topsrcdir, '.hg')
if not os.path.exists(hg_path):
raise InvalidPathException('a Mercurial repo is required '
'when specifying a revision')
repo = self.topsrcdir
# We need two finders because the reader's finder operates on
# absolute paths.
finder = FileFinder(self.topsrcdir, find_executables=False)
if repo:
reader_finder = MercurialRevisionFinder(repo, rev=rev,
recognize_repo_paths=True)
else:
reader_finder = default_finder
# Expand wildcards.
allpaths = []
for p in relpaths:
if '*' not in p:
if p not in allpaths:
allpaths.append(p)
continue
if repo:
raise InvalidPathException('cannot use wildcard in version control mode')
for path, f in finder.find(p):
if path not in allpaths:
allpaths.append(path)
reader = self._get_reader(finder=reader_finder)
return reader.files_info(allpaths)
|
mpl-2.0
|
probml/pyprobml
|
scripts/svi_gmm_tfp_scratch.py
|
1
|
7626
|
# SVI for a GMM
# Modified from
# https://github.com/brendanhasz/svi-gaussian-mixture-model/blob/master/BayesianGaussianMixtureModel.ipynb
#pip install tf-nightly
#pip install --upgrade tfp-nightly -q
# Imports
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
import tensorflow as tf
import tensorflow_probability as tfp
tfd = tfp.distributions
from time import time
# Plot settings
#%config InlineBackend.figure_format = 'svg'
# Random seed
np.random.seed(12345)
tf.random.set_seed(12345)
# Generate some data
N = 3000
X = np.random.randn(N, 2).astype('float32')
X[:1000, :] += [2, 0]
X[1000:2000, :] -= [2, 4]
X[2000:, :] += [-2, 4]
# Plot the data
plt.plot(X[:, 0], X[:, 1], '.')
plt.axis('equal')
plt.show()
# Make a TensorFlow Dataset from that data
batch_size = 500
dataset = tf.data.Dataset.from_tensor_slices(
(X)).shuffle(10000).batch(batch_size)
class GaussianMixtureModel(tf.keras.Model):
"""A Bayesian Gaussian mixture model.
Assumes Gaussians' variances in each dimension are independent.
Parameters
----------
Nc : int > 0
Number of mixture components.
Nd : int > 0
Number of dimensions.
"""
def __init__(self, Nc, Nd):
# Initialize
super(GaussianMixtureModel, self).__init__()
self.Nc = Nc
self.Nd = Nd
# Variational distribution variables for means
self.locs = tf.Variable(tf.random.normal((Nc, Nd)))
self.scales = tf.Variable(tf.pow(tf.random.gamma((Nc, Nd), 5, 5), -0.5))
# Variational distribution variables for standard deviations
self.alpha = tf.Variable(tf.random.uniform((Nc, Nd), 4., 6.))
self.beta = tf.Variable(tf.random.uniform((Nc, Nd), 4., 6.))
# Variational distribution variables for component weights
self.counts = tf.Variable(2*tf.ones((Nc,)))
# Prior distributions for the means
self.mu_prior = tfd.Normal(tf.zeros((Nc, Nd)), tf.ones((Nc, Nd)))
# Prior distributions for the standard deviations
self.sigma_prior = tfd.Gamma(5*tf.ones((Nc, Nd)), 5*tf.ones((Nc, Nd)))
# Prior distributions for the component weights
self.theta_prior = tfd.Dirichlet(2*tf.ones((Nc,)))
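        # In summary, the variational family is mean-field:
        # q(mu) = Normal(locs, scales), q(precision) = Gamma(alpha, beta)
        # (samples are mapped to standard deviations via precision**-0.5 below),
        # and q(theta) = Dirichlet(counts).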
def call(self, x, sampling=True, independent=True):
"""Compute losses given a batch of data.
Parameters
----------
x : tf.Tensor
A batch of data
sampling : bool
Whether to sample from the variational posterior
distributions (if True, the default), or just use the
mean of the variational distributions (if False).
Returns
-------
log_likelihoods : tf.Tensor
Log likelihood for each sample
kl_sum : tf.Tensor
Sum of the KL divergences between the variational
distributions and their priors
"""
# The variational distributions
mu = tfd.Normal(self.locs, self.scales)
sigma = tfd.Gamma(self.alpha, self.beta)
theta = tfd.Dirichlet(self.counts)
# Sample from the variational distributions
if sampling:
Nb = x.shape[0] #number of samples in the batch
mu_sample = mu.sample(Nb)
sigma_sample = tf.pow(sigma.sample(Nb), -0.5)
theta_sample = theta.sample(Nb)
else:
mu_sample = tf.reshape(mu.mean(), (1, self.Nc, self.Nd))
sigma_sample = tf.pow(tf.reshape(sigma.mean(), (1, self.Nc, self.Nd)), -0.5)
theta_sample = tf.reshape(theta.mean(), (1, self.Nc))
# The mixture density
density = tfd.Mixture(
cat=tfd.Categorical(probs=theta_sample),
components=[
tfd.MultivariateNormalDiag(loc=mu_sample[:, i, :],
scale_diag=sigma_sample[:, i, :])
for i in range(self.Nc)])
# Compute the mean log likelihood
log_likelihoods = density.log_prob(x)
# Compute the KL divergence sum
mu_div = tf.reduce_sum(tfd.kl_divergence(mu, self.mu_prior))
sigma_div = tf.reduce_sum(tfd.kl_divergence(sigma, self.sigma_prior))
theta_div = tf.reduce_sum(tfd.kl_divergence(theta, self.theta_prior))
kl_sum = mu_div + sigma_div + theta_div
# Return both losses
return log_likelihoods, kl_sum
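    # Note on how these two terms are combined below: with N total points and
    # batches of size B, the per-batch objective is
    #   loss = kl_sum / N - (1/B) * sum_i log p(x_i)
    # i.e. the single KL penalty is spread evenly across all N points while
    # the likelihood term is averaged over the batch (a standard minibatch ELBO).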
    def fit(self, data, nepochs):
        """Fit the variational parameters by maximizing the ELBO over data."""
        optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)
        @tf.function
        def train_step(batch):
            with tf.GradientTape() as tape:
                log_likelihoods, kl_sum = self(batch)
                elbo_loss = kl_sum/N - tf.reduce_mean(log_likelihoods)
            gradients = tape.gradient(elbo_loss, self.trainable_variables)
            optimizer.apply_gradients(zip(gradients, self.trainable_variables))
        for epoch in range(nepochs):
            for batch in data:
                train_step(batch)
def gmm_fit(model, data, nepochs):
    """Module-level equivalent of GaussianMixtureModel.fit."""
    optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)
    @tf.function
    def train_step(batch):
        with tf.GradientTape() as tape:
            log_likelihoods, kl_sum = model(batch)
            elbo_loss = kl_sum/N - tf.reduce_mean(log_likelihoods)
        gradients = tape.gradient(elbo_loss, model.trainable_variables)
        optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    for epoch in range(nepochs):
        for batch in data:
            train_step(batch)
nepochs = 1000
nmix = 3
ndim = 2
model = GaussianMixtureModel(nmix, ndim)
### Fitting
time_start = time()
method = 3
if method == 1:
model.fit(dataset, nepochs)
if method == 2:
gmm_fit(model, dataset, nepochs)
if method == 3:
# Relies on 'model' and 'optimizer' being in scope = yuk!
model = GaussianMixtureModel(nmix, ndim)
    optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)
@tf.function
def train_step(data):
with tf.GradientTape() as tape:
log_likelihoods, kl_sum = model(data)
elbo_loss = kl_sum/N - tf.reduce_mean(log_likelihoods)
gradients = tape.gradient(elbo_loss, model.trainable_variables)
optimizer.apply_gradients(zip(gradients, model.trainable_variables))
for epoch in range(nepochs):
for data in dataset:
train_step(data)
elapsed_time = (time() - time_start)
print('method {}'.format(method))
print(elapsed_time)
### Evaluation
# Compute log likelihood at each point on a grid
Np = 100 #number of grid points
Xp, Yp = np.meshgrid(np.linspace(-6, 6, Np), np.linspace(-6, 6, Np))
Pp = np.column_stack([Xp.flatten(), Yp.flatten()]).astype('float32')
Z, _ = model(Pp, sampling=False)
Z = np.reshape(Z, (Np, Np))
# Show the fit mixture density
plt.figure()
plt.imshow(np.exp(Z),
extent=(-6, 6, -6, 6),
origin='lower')
cbar = plt.colorbar()
cbar.ax.set_ylabel('Likelihood')
# Inspect the fitted variational parameters
print(model.locs)
print(model.trainable_variables)
# Sample from the std deviation variational posterior
stds = tf.pow(tfd.Gamma(model.alpha, model.beta).sample(10000), -0.5)
# Plot the samples
plt.figure()
sns.distplot(stds[:, 0, 0])
# Sample from the mean variational posterior
means = tfd.Normal(model.locs, model.scales).sample(10000)
# Plot the mean samples for a single component
plt.figure()
sns.kdeplot(means[:, 0, 0].numpy(),
means[:, 0, 1].numpy(),
n_levels=10)
|
mit
|
blacklin/kbengine
|
kbe/res/scripts/common/Lib/concurrent/futures/_base.py
|
88
|
19638
|
# Copyright 2009 Brian Quinlan. All Rights Reserved.
# Licensed to PSF under a Contributor Agreement.
__author__ = 'Brian Quinlan ([email protected])'
import collections
import logging
import threading
import time
FIRST_COMPLETED = 'FIRST_COMPLETED'
FIRST_EXCEPTION = 'FIRST_EXCEPTION'
ALL_COMPLETED = 'ALL_COMPLETED'
_AS_COMPLETED = '_AS_COMPLETED'
# Possible future states (for internal use by the futures package).
PENDING = 'PENDING'
RUNNING = 'RUNNING'
# The future was cancelled by the user...
CANCELLED = 'CANCELLED'
# ...and _Waiter.add_cancelled() was called by a worker.
CANCELLED_AND_NOTIFIED = 'CANCELLED_AND_NOTIFIED'
FINISHED = 'FINISHED'
_FUTURE_STATES = [
PENDING,
RUNNING,
CANCELLED,
CANCELLED_AND_NOTIFIED,
FINISHED
]
_STATE_TO_DESCRIPTION_MAP = {
PENDING: "pending",
RUNNING: "running",
CANCELLED: "cancelled",
CANCELLED_AND_NOTIFIED: "cancelled",
FINISHED: "finished"
}
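# Typical lifecycle: PENDING -> RUNNING -> FINISHED, or PENDING -> CANCELLED
# -> CANCELLED_AND_NOTIFIED once the waiters have been notified.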
# Logger for internal use by the futures package.
LOGGER = logging.getLogger("concurrent.futures")
class Error(Exception):
"""Base class for all future-related exceptions."""
pass
class CancelledError(Error):
"""The Future was cancelled."""
pass
class TimeoutError(Error):
"""The operation exceeded the given deadline."""
pass
class _Waiter(object):
"""Provides the event that wait() and as_completed() block on."""
def __init__(self):
self.event = threading.Event()
self.finished_futures = []
def add_result(self, future):
self.finished_futures.append(future)
def add_exception(self, future):
self.finished_futures.append(future)
def add_cancelled(self, future):
self.finished_futures.append(future)
class _AsCompletedWaiter(_Waiter):
"""Used by as_completed()."""
def __init__(self):
super(_AsCompletedWaiter, self).__init__()
self.lock = threading.Lock()
def add_result(self, future):
with self.lock:
super(_AsCompletedWaiter, self).add_result(future)
self.event.set()
def add_exception(self, future):
with self.lock:
super(_AsCompletedWaiter, self).add_exception(future)
self.event.set()
def add_cancelled(self, future):
with self.lock:
super(_AsCompletedWaiter, self).add_cancelled(future)
self.event.set()
class _FirstCompletedWaiter(_Waiter):
"""Used by wait(return_when=FIRST_COMPLETED)."""
def add_result(self, future):
super().add_result(future)
self.event.set()
def add_exception(self, future):
super().add_exception(future)
self.event.set()
def add_cancelled(self, future):
super().add_cancelled(future)
self.event.set()
class _AllCompletedWaiter(_Waiter):
"""Used by wait(return_when=FIRST_EXCEPTION and ALL_COMPLETED)."""
def __init__(self, num_pending_calls, stop_on_exception):
self.num_pending_calls = num_pending_calls
self.stop_on_exception = stop_on_exception
self.lock = threading.Lock()
super().__init__()
def _decrement_pending_calls(self):
with self.lock:
self.num_pending_calls -= 1
if not self.num_pending_calls:
self.event.set()
def add_result(self, future):
super().add_result(future)
self._decrement_pending_calls()
def add_exception(self, future):
super().add_exception(future)
if self.stop_on_exception:
self.event.set()
else:
self._decrement_pending_calls()
def add_cancelled(self, future):
super().add_cancelled(future)
self._decrement_pending_calls()
class _AcquireFutures(object):
"""A context manager that does an ordered acquire of Future conditions."""
def __init__(self, futures):
self.futures = sorted(futures, key=id)
def __enter__(self):
for future in self.futures:
future._condition.acquire()
def __exit__(self, *args):
for future in self.futures:
future._condition.release()
def _create_and_install_waiters(fs, return_when):
if return_when == _AS_COMPLETED:
waiter = _AsCompletedWaiter()
elif return_when == FIRST_COMPLETED:
waiter = _FirstCompletedWaiter()
else:
pending_count = sum(
f._state not in [CANCELLED_AND_NOTIFIED, FINISHED] for f in fs)
if return_when == FIRST_EXCEPTION:
waiter = _AllCompletedWaiter(pending_count, stop_on_exception=True)
elif return_when == ALL_COMPLETED:
waiter = _AllCompletedWaiter(pending_count, stop_on_exception=False)
else:
raise ValueError("Invalid return condition: %r" % return_when)
for f in fs:
f._waiters.append(waiter)
return waiter
def as_completed(fs, timeout=None):
"""An iterator over the given futures that yields each as it completes.
Args:
fs: The sequence of Futures (possibly created by different Executors) to
iterate over.
timeout: The maximum number of seconds to wait. If None, then there
is no limit on the wait time.
Returns:
An iterator that yields the given Futures as they complete (finished or
cancelled). If any given Futures are duplicated, they will be returned
once.
Raises:
TimeoutError: If the entire result iterator could not be generated
before the given timeout.
"""
if timeout is not None:
end_time = timeout + time.time()
fs = set(fs)
with _AcquireFutures(fs):
finished = set(
f for f in fs
if f._state in [CANCELLED_AND_NOTIFIED, FINISHED])
pending = fs - finished
waiter = _create_and_install_waiters(fs, _AS_COMPLETED)
try:
yield from finished
while pending:
if timeout is None:
wait_timeout = None
else:
wait_timeout = end_time - time.time()
if wait_timeout < 0:
raise TimeoutError(
'%d (of %d) futures unfinished' % (
len(pending), len(fs)))
waiter.event.wait(wait_timeout)
with waiter.lock:
finished = waiter.finished_futures
waiter.finished_futures = []
waiter.event.clear()
for future in finished:
yield future
pending.remove(future)
finally:
for f in fs:
with f._condition:
f._waiters.remove(waiter)
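# Illustrative usage (not part of the original module): given futures from
# any executor,
#
#   for f in as_completed(futures, timeout=60):
#       print(f.result())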
DoneAndNotDoneFutures = collections.namedtuple(
'DoneAndNotDoneFutures', 'done not_done')
def wait(fs, timeout=None, return_when=ALL_COMPLETED):
"""Wait for the futures in the given sequence to complete.
Args:
fs: The sequence of Futures (possibly created by different Executors) to
wait upon.
timeout: The maximum number of seconds to wait. If None, then there
is no limit on the wait time.
return_when: Indicates when this function should return. The options
are:
FIRST_COMPLETED - Return when any future finishes or is
cancelled.
FIRST_EXCEPTION - Return when any future finishes by raising an
exception. If no future raises an exception
then it is equivalent to ALL_COMPLETED.
ALL_COMPLETED - Return when all futures finish or are cancelled.
Returns:
A named 2-tuple of sets. The first set, named 'done', contains the
futures that completed (is finished or cancelled) before the wait
completed. The second set, named 'not_done', contains uncompleted
futures.
"""
with _AcquireFutures(fs):
done = set(f for f in fs
if f._state in [CANCELLED_AND_NOTIFIED, FINISHED])
not_done = set(fs) - done
if (return_when == FIRST_COMPLETED) and done:
return DoneAndNotDoneFutures(done, not_done)
elif (return_when == FIRST_EXCEPTION) and done:
if any(f for f in done
if not f.cancelled() and f.exception() is not None):
return DoneAndNotDoneFutures(done, not_done)
if len(done) == len(fs):
return DoneAndNotDoneFutures(done, not_done)
waiter = _create_and_install_waiters(fs, return_when)
waiter.event.wait(timeout)
for f in fs:
with f._condition:
f._waiters.remove(waiter)
done.update(waiter.finished_futures)
return DoneAndNotDoneFutures(done, set(fs) - done)
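# Illustrative usage (not part of the original module):
#
#   done, not_done = wait(futures, timeout=5.0, return_when=FIRST_COMPLETED)
#   for f in done:
#       f.result()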
class Future(object):
"""Represents the result of an asynchronous computation."""
def __init__(self):
"""Initializes the future. Should not be called by clients."""
self._condition = threading.Condition()
self._state = PENDING
self._result = None
self._exception = None
self._waiters = []
self._done_callbacks = []
def _invoke_callbacks(self):
for callback in self._done_callbacks:
try:
callback(self)
except Exception:
LOGGER.exception('exception calling callback for %r', self)
def __repr__(self):
with self._condition:
if self._state == FINISHED:
if self._exception:
return '<Future at %s state=%s raised %s>' % (
hex(id(self)),
_STATE_TO_DESCRIPTION_MAP[self._state],
self._exception.__class__.__name__)
else:
return '<Future at %s state=%s returned %s>' % (
hex(id(self)),
_STATE_TO_DESCRIPTION_MAP[self._state],
self._result.__class__.__name__)
return '<Future at %s state=%s>' % (
hex(id(self)),
_STATE_TO_DESCRIPTION_MAP[self._state])
def cancel(self):
"""Cancel the future if possible.
Returns True if the future was cancelled, False otherwise. A future
cannot be cancelled if it is running or has already completed.
"""
with self._condition:
if self._state in [RUNNING, FINISHED]:
return False
if self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]:
return True
self._state = CANCELLED
self._condition.notify_all()
self._invoke_callbacks()
return True
def cancelled(self):
"""Return True if the future was cancelled."""
with self._condition:
return self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]
def running(self):
"""Return True if the future is currently executing."""
with self._condition:
return self._state == RUNNING
def done(self):
"""Return True of the future was cancelled or finished executing."""
with self._condition:
return self._state in [CANCELLED, CANCELLED_AND_NOTIFIED, FINISHED]
def __get_result(self):
if self._exception:
raise self._exception
else:
return self._result
def add_done_callback(self, fn):
"""Attaches a callable that will be called when the future finishes.
Args:
fn: A callable that will be called with this future as its only
argument when the future completes or is cancelled. The callable
will always be called by a thread in the same process in which
it was added. If the future has already completed or been
cancelled then the callable will be called immediately. These
callables are called in the order that they were added.
"""
with self._condition:
if self._state not in [CANCELLED, CANCELLED_AND_NOTIFIED, FINISHED]:
self._done_callbacks.append(fn)
return
fn(self)
def result(self, timeout=None):
"""Return the result of the call that the future represents.
Args:
timeout: The number of seconds to wait for the result if the future
isn't done. If None, then there is no limit on the wait time.
Returns:
The result of the call that the future represents.
Raises:
CancelledError: If the future was cancelled.
TimeoutError: If the future didn't finish executing before the given
timeout.
Exception: If the call raised then that exception will be raised.
"""
with self._condition:
if self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]:
raise CancelledError()
elif self._state == FINISHED:
return self.__get_result()
self._condition.wait(timeout)
if self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]:
raise CancelledError()
elif self._state == FINISHED:
return self.__get_result()
else:
raise TimeoutError()
def exception(self, timeout=None):
"""Return the exception raised by the call that the future represents.
Args:
timeout: The number of seconds to wait for the exception if the
future isn't done. If None, then there is no limit on the wait
time.
Returns:
The exception raised by the call that the future represents or None
if the call completed without raising.
Raises:
CancelledError: If the future was cancelled.
TimeoutError: If the future didn't finish executing before the given
timeout.
"""
with self._condition:
if self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]:
raise CancelledError()
elif self._state == FINISHED:
return self._exception
self._condition.wait(timeout)
if self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]:
raise CancelledError()
elif self._state == FINISHED:
return self._exception
else:
raise TimeoutError()
# The following methods should only be used by Executors and in tests.
def set_running_or_notify_cancel(self):
"""Mark the future as running or process any cancel notifications.
Should only be used by Executor implementations and unit tests.
If the future has been cancelled (cancel() was called and returned
        True) then any threads waiting on the future completing (through calls
to as_completed() or wait()) are notified and False is returned.
If the future was not cancelled then it is put in the running state
(future calls to running() will return True) and True is returned.
This method should be called by Executor implementations before
executing the work associated with this future. If this method returns
False then the work should not be executed.
Returns:
False if the Future was cancelled, True otherwise.
Raises:
RuntimeError: if this method was already called or if set_result()
or set_exception() was called.
"""
with self._condition:
if self._state == CANCELLED:
self._state = CANCELLED_AND_NOTIFIED
for waiter in self._waiters:
waiter.add_cancelled(self)
# self._condition.notify_all() is not necessary because
# self.cancel() triggers a notification.
return False
elif self._state == PENDING:
self._state = RUNNING
return True
else:
LOGGER.critical('Future %s in unexpected state: %s',
id(self),
self._state)
raise RuntimeError('Future in unexpected state')
def set_result(self, result):
"""Sets the return value of work associated with the future.
Should only be used by Executor implementations and unit tests.
"""
with self._condition:
self._result = result
self._state = FINISHED
for waiter in self._waiters:
waiter.add_result(self)
self._condition.notify_all()
self._invoke_callbacks()
def set_exception(self, exception):
"""Sets the result of the future as being the given exception.
Should only be used by Executor implementations and unit tests.
"""
with self._condition:
self._exception = exception
self._state = FINISHED
for waiter in self._waiters:
waiter.add_exception(self)
self._condition.notify_all()
self._invoke_callbacks()
class Executor(object):
"""This is an abstract base class for concrete asynchronous executors."""
def submit(self, fn, *args, **kwargs):
"""Submits a callable to be executed with the given arguments.
Schedules the callable to be executed as fn(*args, **kwargs) and returns
a Future instance representing the execution of the callable.
Returns:
A Future representing the given call.
"""
raise NotImplementedError()
def map(self, fn, *iterables, timeout=None):
"""Returns a iterator equivalent to map(fn, iter).
Args:
fn: A callable that will take as many arguments as there are
passed iterables.
timeout: The maximum number of seconds to wait. If None, then there
is no limit on the wait time.
Returns:
            An iterator equivalent to: map(fn, *iterables) but the calls may
be evaluated out-of-order.
Raises:
TimeoutError: If the entire result iterator could not be generated
before the given timeout.
Exception: If fn(*args) raises for any values.
"""
if timeout is not None:
end_time = timeout + time.time()
fs = [self.submit(fn, *args) for args in zip(*iterables)]
# Yield must be hidden in closure so that the futures are submitted
# before the first iterator value is required.
def result_iterator():
try:
for future in fs:
if timeout is None:
yield future.result()
else:
yield future.result(end_time - time.time())
finally:
for future in fs:
future.cancel()
return result_iterator()
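    # Illustrative usage (assuming a concrete executor from this package,
    # e.g. ThreadPoolExecutor):
    #
    #   with ThreadPoolExecutor(max_workers=2) as ex:
    #       for result in ex.map(pow, [2, 3, 4], [5, 6, 7]):
    #           print(result)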
def shutdown(self, wait=True):
"""Clean-up the resources associated with the Executor.
It is safe to call this method several times. Otherwise, no other
methods can be called after this one.
Args:
wait: If True then shutdown will not return until all running
futures have finished executing and the resources used by the
executor have been reclaimed.
"""
pass
def __enter__(self):
return self
def __exit__(self, exc_type, exc_val, exc_tb):
self.shutdown(wait=True)
return False
|
lgpl-3.0
|
igoralmeida/tahoe-lafs
|
src/allmydata/test/test_deepcheck.py
|
6
|
60303
|
import os, simplejson, urllib
from cStringIO import StringIO
from twisted.trial import unittest
from twisted.internet import defer
from twisted.internet import threads # CLI tests use deferToThread
from allmydata.immutable import upload
from allmydata.mutable.common import UnrecoverableFileError
from allmydata.mutable.publish import MutableData
from allmydata.util import idlib
from allmydata.util import base32
from allmydata.scripts import runner
from allmydata.interfaces import ICheckResults, ICheckAndRepairResults, \
IDeepCheckResults, IDeepCheckAndRepairResults
from allmydata.monitor import Monitor, OperationCancelledError
from allmydata.uri import LiteralFileURI
from twisted.web.client import getPage
from allmydata.test.common import ErrorMixin, _corrupt_mutable_share_data, \
ShouldFailMixin
from allmydata.test.common_util import StallMixin
from allmydata.test.no_network import GridTestMixin
timeout = 2400 # One of these took 1046.091s on Zandr's ARM box.
class MutableChecker(GridTestMixin, unittest.TestCase, ErrorMixin):
def _run_cli(self, argv):
stdout, stderr = StringIO(), StringIO()
# this can only do synchronous operations
assert argv[0] == "debug"
runner.runner(argv, run_by_human=False, stdout=stdout, stderr=stderr)
return stdout.getvalue()
def test_good(self):
self.basedir = "deepcheck/MutableChecker/good"
self.set_up_grid()
CONTENTS = "a little bit of data"
CONTENTS_uploadable = MutableData(CONTENTS)
d = self.g.clients[0].create_mutable_file(CONTENTS_uploadable)
def _created(node):
self.node = node
self.fileurl = "uri/" + urllib.quote(node.get_uri())
d.addCallback(_created)
# now make sure the webapi verifier sees no problems
d.addCallback(lambda ign: self.GET(self.fileurl+"?t=check&verify=true",
method="POST"))
def _got_results(out):
self.failUnless("<span>Healthy : Healthy</span>" in out, out)
self.failUnless("Recoverable Versions: 10*seq1-" in out, out)
self.failIf("Not Healthy!" in out, out)
self.failIf("Unhealthy" in out, out)
self.failIf("Corrupt Shares" in out, out)
d.addCallback(_got_results)
d.addErrback(self.explain_web_error)
return d
def test_corrupt(self):
self.basedir = "deepcheck/MutableChecker/corrupt"
self.set_up_grid()
CONTENTS = "a little bit of data"
CONTENTS_uploadable = MutableData(CONTENTS)
d = self.g.clients[0].create_mutable_file(CONTENTS_uploadable)
def _stash_and_corrupt(node):
self.node = node
self.fileurl = "uri/" + urllib.quote(node.get_uri())
self.corrupt_shares_numbered(node.get_uri(), [0],
_corrupt_mutable_share_data)
d.addCallback(_stash_and_corrupt)
# now make sure the webapi verifier notices it
d.addCallback(lambda ign: self.GET(self.fileurl+"?t=check&verify=true",
method="POST"))
def _got_results(out):
self.failUnless("Not Healthy!" in out, out)
self.failUnless("Unhealthy: best version has only 9 shares (encoding is 3-of-10)" in out, out)
self.failUnless("Corrupt Shares:" in out, out)
d.addCallback(_got_results)
# now make sure the webapi repairer can fix it
d.addCallback(lambda ign:
self.GET(self.fileurl+"?t=check&verify=true&repair=true",
method="POST"))
def _got_repair_results(out):
self.failUnless("<div>Repair successful</div>" in out, out)
d.addCallback(_got_repair_results)
d.addCallback(lambda ign: self.GET(self.fileurl+"?t=check&verify=true",
method="POST"))
def _got_postrepair_results(out):
self.failIf("Not Healthy!" in out, out)
self.failUnless("Recoverable Versions: 10*seq" in out, out)
d.addCallback(_got_postrepair_results)
d.addErrback(self.explain_web_error)
return d
def test_delete_share(self):
self.basedir = "deepcheck/MutableChecker/delete_share"
self.set_up_grid()
CONTENTS = "a little bit of data"
CONTENTS_uploadable = MutableData(CONTENTS)
d = self.g.clients[0].create_mutable_file(CONTENTS_uploadable)
def _stash_and_delete(node):
self.node = node
self.fileurl = "uri/" + urllib.quote(node.get_uri())
self.delete_shares_numbered(node.get_uri(), [0])
d.addCallback(_stash_and_delete)
# now make sure the webapi checker notices it
d.addCallback(lambda ign: self.GET(self.fileurl+"?t=check&verify=false",
method="POST"))
def _got_results(out):
self.failUnless("Not Healthy!" in out, out)
self.failUnless("Unhealthy: best version has only 9 shares (encoding is 3-of-10)" in out, out)
self.failIf("Corrupt Shares" in out, out)
d.addCallback(_got_results)
# now make sure the webapi repairer can fix it
d.addCallback(lambda ign:
self.GET(self.fileurl+"?t=check&verify=false&repair=true",
method="POST"))
def _got_repair_results(out):
self.failUnless("Repair successful" in out)
d.addCallback(_got_repair_results)
d.addCallback(lambda ign: self.GET(self.fileurl+"?t=check&verify=false",
method="POST"))
def _got_postrepair_results(out):
self.failIf("Not Healthy!" in out, out)
self.failUnless("Recoverable Versions: 10*seq" in out)
d.addCallback(_got_postrepair_results)
d.addErrback(self.explain_web_error)
return d
class DeepCheckBase(GridTestMixin, ErrorMixin, StallMixin, ShouldFailMixin):
def web_json(self, n, **kwargs):
kwargs["output"] = "json"
d = self.web(n, "POST", **kwargs)
d.addCallback(self.decode_json)
return d
def decode_json(self, (s,url)):
try:
data = simplejson.loads(s)
except ValueError:
self.fail("%s: not JSON: '%s'" % (url, s))
return data
def parse_streamed_json(self, s):
for unit in s.split("\n"):
if not unit:
# stream should end with a newline, so split returns ""
continue
try:
yield simplejson.loads(unit)
except ValueError, le:
le.args = tuple(le.args + (unit,))
raise
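    # The stream parsed above is newline-delimited JSON, one object per line,
    # e.g. (hypothetical values):
    #   {"type": "file", "cap": "URI:CHK:...", "verifycap": "..."}
    #   {"type": "stats", "stats": {...}}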
def web(self, n, method="GET", **kwargs):
# returns (data, url)
url = (self.client_baseurls[0] + "uri/%s" % urllib.quote(n.get_uri())
+ "?" + "&".join(["%s=%s" % (k,v) for (k,v) in kwargs.items()]))
d = getPage(url, method=method)
d.addCallback(lambda data: (data,url))
return d
def wait_for_operation(self, ignored, ophandle):
url = self.client_baseurls[0] + "operations/" + ophandle
url += "?t=status&output=JSON"
d = getPage(url)
def _got(res):
try:
data = simplejson.loads(res)
except ValueError:
self.fail("%s: not JSON: '%s'" % (url, res))
if not data["finished"]:
d = self.stall(delay=1.0)
d.addCallback(self.wait_for_operation, ophandle)
return d
return data
d.addCallback(_got)
return d
def get_operation_results(self, ignored, ophandle, output=None):
url = self.client_baseurls[0] + "operations/" + ophandle
url += "?t=status"
if output:
url += "&output=" + output
d = getPage(url)
def _got(res):
if output and output.lower() == "json":
try:
return simplejson.loads(res)
except ValueError:
self.fail("%s: not JSON: '%s'" % (url, res))
return res
d.addCallback(_got)
return d
def slow_web(self, n, output=None, **kwargs):
# use ophandle=
handle = base32.b2a(os.urandom(4))
d = self.web(n, "POST", ophandle=handle, **kwargs)
d.addCallback(self.wait_for_operation, handle)
d.addCallback(self.get_operation_results, handle, output=output)
return d
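    # slow_web() follows the webapi's long-running-operation pattern: POST the
    # request with an ophandle=HANDLE argument, then poll
    # operations/HANDLE?t=status until the returned JSON has finished=true,
    # and finally fetch the results from the same operations/ URL.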
class DeepCheckWebGood(DeepCheckBase, unittest.TestCase):
# construct a small directory tree (with one dir, one immutable file, one
# mutable file, two LIT files, one DIR2:LIT empty dir, one DIR2:LIT tiny
# dir, and a loop), and then check/examine it in various ways.
def set_up_tree(self):
# 2.9s
c0 = self.g.clients[0]
d = c0.create_dirnode()
def _created_root(n):
self.root = n
self.root_uri = n.get_uri()
d.addCallback(_created_root)
d.addCallback(lambda ign:
c0.create_mutable_file(MutableData("mutable file contents")))
d.addCallback(lambda n: self.root.set_node(u"mutable", n))
def _created_mutable(n):
self.mutable = n
self.mutable_uri = n.get_uri()
d.addCallback(_created_mutable)
large = upload.Data("Lots of data\n" * 1000, None)
d.addCallback(lambda ign: self.root.add_file(u"large", large))
def _created_large(n):
self.large = n
self.large_uri = n.get_uri()
d.addCallback(_created_large)
small = upload.Data("Small enough for a LIT", None)
d.addCallback(lambda ign: self.root.add_file(u"small", small))
def _created_small(n):
self.small = n
self.small_uri = n.get_uri()
d.addCallback(_created_small)
small2 = upload.Data("Small enough for a LIT too", None)
d.addCallback(lambda ign: self.root.add_file(u"small2", small2))
def _created_small2(n):
self.small2 = n
self.small2_uri = n.get_uri()
d.addCallback(_created_small2)
empty_litdir_uri = "URI:DIR2-LIT:"
tiny_litdir_uri = "URI:DIR2-LIT:gqytunj2onug64tufqzdcosvkjetutcjkq5gw4tvm5vwszdgnz5hgyzufqydulbshj5x2lbm" # contains one child which is itself also LIT
d.addCallback(lambda ign: self.root._create_and_validate_node(None, empty_litdir_uri, name=u"test_deepcheck empty_lit_dir"))
def _created_empty_lit_dir(n):
self.empty_lit_dir = n
self.empty_lit_dir_uri = n.get_uri()
self.root.set_node(u"empty_lit_dir", n)
d.addCallback(_created_empty_lit_dir)
d.addCallback(lambda ign: self.root._create_and_validate_node(None, tiny_litdir_uri, name=u"test_deepcheck tiny_lit_dir"))
def _created_tiny_lit_dir(n):
self.tiny_lit_dir = n
self.tiny_lit_dir_uri = n.get_uri()
self.root.set_node(u"tiny_lit_dir", n)
d.addCallback(_created_tiny_lit_dir)
d.addCallback(lambda ign: self.root.set_node(u"loop", self.root))
return d
def check_is_healthy(self, cr, n, where, incomplete=False):
self.failUnless(ICheckResults.providedBy(cr), where)
self.failUnless(cr.is_healthy(), where)
self.failUnlessEqual(cr.get_storage_index(), n.get_storage_index(),
where)
self.failUnlessEqual(cr.get_storage_index_string(),
base32.b2a(n.get_storage_index()), where)
num_servers = len(self.g.all_servers)
self.failUnlessEqual(num_servers, 10, where)
self.failUnlessEqual(cr.get_happiness(), num_servers, where)
self.failUnlessEqual(cr.get_share_counter_good(), num_servers, where)
self.failUnlessEqual(cr.get_encoding_needed(), 3, where)
self.failUnlessEqual(cr.get_encoding_expected(), num_servers, where)
if not incomplete:
self.failUnlessEqual(cr.get_host_counter_good_shares(),
num_servers, where)
self.failUnlessEqual(cr.get_corrupt_shares(), [], where)
if not incomplete:
self.failUnlessEqual(sorted([s.get_serverid()
for s in cr.get_servers_responding()]),
sorted(self.g.get_all_serverids()),
where)
all_serverids = set()
for (shareid, servers) in cr.get_sharemap().items():
all_serverids.update([s.get_serverid() for s in servers])
self.failUnlessEqual(sorted(all_serverids),
sorted(self.g.get_all_serverids()),
where)
self.failUnlessEqual(cr.get_share_counter_wrong(), 0, where)
self.failUnlessEqual(cr.get_version_counter_recoverable(), 1, where)
self.failUnlessEqual(cr.get_version_counter_unrecoverable(), 0, where)
def check_and_repair_is_healthy(self, cr, n, where, incomplete=False):
self.failUnless(ICheckAndRepairResults.providedBy(cr), (where, cr))
self.failUnless(cr.get_pre_repair_results().is_healthy(), where)
self.check_is_healthy(cr.get_pre_repair_results(), n, where, incomplete)
self.failUnless(cr.get_post_repair_results().is_healthy(), where)
self.check_is_healthy(cr.get_post_repair_results(), n, where, incomplete)
self.failIf(cr.get_repair_attempted(), where)
def deep_check_is_healthy(self, cr, num_healthy, where):
self.failUnless(IDeepCheckResults.providedBy(cr))
self.failUnlessEqual(cr.get_counters()["count-objects-healthy"],
num_healthy, where)
def deep_check_and_repair_is_healthy(self, cr, num_healthy, where):
self.failUnless(IDeepCheckAndRepairResults.providedBy(cr), where)
c = cr.get_counters()
self.failUnlessEqual(c["count-objects-healthy-pre-repair"],
num_healthy, where)
self.failUnlessEqual(c["count-objects-healthy-post-repair"],
num_healthy, where)
self.failUnlessEqual(c["count-repairs-attempted"], 0, where)
def test_good(self):
self.basedir = "deepcheck/DeepCheckWebGood/good"
self.set_up_grid()
d = self.set_up_tree()
d.addCallback(self.do_stats)
d.addCallback(self.do_web_stream_manifest)
d.addCallback(self.do_web_stream_check)
d.addCallback(self.do_test_check_good)
d.addCallback(self.do_test_web_good)
d.addCallback(self.do_test_cli_good)
d.addErrback(self.explain_web_error)
d.addErrback(self.explain_error)
return d
def do_stats(self, ignored):
d = defer.succeed(None)
d.addCallback(lambda ign: self.root.start_deep_stats().when_done())
d.addCallback(self.check_stats_good)
return d
def check_stats_good(self, s):
self.failUnlessEqual(s["count-directories"], 3)
self.failUnlessEqual(s["count-files"], 5)
self.failUnlessEqual(s["count-immutable-files"], 1)
self.failUnlessEqual(s["count-literal-files"], 3)
self.failUnlessEqual(s["count-mutable-files"], 1)
# don't check directories: their size will vary
# s["largest-directory"]
# s["size-directories"]
self.failUnlessEqual(s["largest-directory-children"], 7)
self.failUnlessEqual(s["largest-immutable-file"], 13000)
# to re-use this function for both the local
# dirnode.start_deep_stats() and the webapi t=start-deep-stats, we
# coerce the result into a list of tuples. dirnode.start_deep_stats()
        # returns a list of tuples, but JSON only knows about lists, so
# t=start-deep-stats returns a list of lists.
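        # e.g. the webapi yields [[4, 10, 1], ...] where the local API yields
        # [(4, 10, 1), ...]; the tuple() coercion below accepts both.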
histogram = [tuple(stuff) for stuff in s["size-files-histogram"]]
self.failUnlessEqual(histogram, [(4, 10, 1), (11, 31, 2),
(10001, 31622, 1),
])
self.failUnlessEqual(s["size-immutable-files"], 13000)
self.failUnlessEqual(s["size-literal-files"], 56)
def do_web_stream_manifest(self, ignored):
d = self.web(self.root, method="POST", t="stream-manifest")
d.addCallback(lambda (output,url):
self._check_streamed_manifest(output))
return d
def _check_streamed_manifest(self, output):
units = list(self.parse_streamed_json(output))
files = [u for u in units if u["type"] in ("file", "directory")]
assert units[-1]["type"] == "stats"
stats = units[-1]["stats"]
self.failUnlessEqual(len(files), 8)
# [root,mutable,large] are distributed, [small,small2,empty_litdir,tiny_litdir] are not
self.failUnlessEqual(len([f for f in files
if f["verifycap"] != ""]), 3)
self.failUnlessEqual(len([f for f in files
if f["verifycap"] == ""]), 5)
self.failUnlessEqual(len([f for f in files
if f["repaircap"] != ""]), 3)
self.failUnlessEqual(len([f for f in files
if f["repaircap"] == ""]), 5)
self.failUnlessEqual(len([f for f in files
if f["storage-index"] != ""]), 3)
self.failUnlessEqual(len([f for f in files
if f["storage-index"] == ""]), 5)
# make sure that a mutable file has filecap==repaircap!=verifycap
mutable = [f for f in files
if f["cap"] is not None
and f["cap"].startswith("URI:SSK:")][0]
self.failUnlessEqual(mutable["cap"], self.mutable_uri)
self.failIfEqual(mutable["cap"], mutable["verifycap"])
self.failUnlessEqual(mutable["cap"], mutable["repaircap"])
# for immutable file, verifycap==repaircap!=filecap
large = [f for f in files
if f["cap"] is not None
and f["cap"].startswith("URI:CHK:")][0]
self.failUnlessEqual(large["cap"], self.large_uri)
self.failIfEqual(large["cap"], large["verifycap"])
self.failUnlessEqual(large["verifycap"], large["repaircap"])
self.check_stats_good(stats)
def do_web_stream_check(self, ignored):
# TODO
return
d = self.web(self.root, t="stream-deep-check")
def _check(res):
units = list(self.parse_streamed_json(res))
#files = [u for u in units if u["type"] in ("file", "directory")]
assert units[-1]["type"] == "stats"
#stats = units[-1]["stats"]
# ...
d.addCallback(_check)
return d
def do_test_check_good(self, ignored):
d = defer.succeed(None)
# check the individual items
d.addCallback(lambda ign: self.root.check(Monitor()))
d.addCallback(self.check_is_healthy, self.root, "root")
d.addCallback(lambda ign: self.mutable.check(Monitor()))
d.addCallback(self.check_is_healthy, self.mutable, "mutable")
d.addCallback(lambda ign: self.large.check(Monitor()))
d.addCallback(self.check_is_healthy, self.large, "large")
d.addCallback(lambda ign: self.small.check(Monitor()))
d.addCallback(self.failUnlessEqual, None, "small")
d.addCallback(lambda ign: self.small2.check(Monitor()))
d.addCallback(self.failUnlessEqual, None, "small2")
d.addCallback(lambda ign: self.empty_lit_dir.check(Monitor()))
d.addCallback(self.failUnlessEqual, None, "empty_lit_dir")
d.addCallback(lambda ign: self.tiny_lit_dir.check(Monitor()))
d.addCallback(self.failUnlessEqual, None, "tiny_lit_dir")
# and again with verify=True
d.addCallback(lambda ign: self.root.check(Monitor(), verify=True))
d.addCallback(self.check_is_healthy, self.root, "root")
d.addCallback(lambda ign: self.mutable.check(Monitor(), verify=True))
d.addCallback(self.check_is_healthy, self.mutable, "mutable")
d.addCallback(lambda ign: self.large.check(Monitor(), verify=True))
d.addCallback(self.check_is_healthy, self.large, "large", incomplete=True)
d.addCallback(lambda ign: self.small.check(Monitor(), verify=True))
d.addCallback(self.failUnlessEqual, None, "small")
d.addCallback(lambda ign: self.small2.check(Monitor(), verify=True))
d.addCallback(self.failUnlessEqual, None, "small2")
d.addCallback(lambda ign: self.empty_lit_dir.check(Monitor(), verify=True))
d.addCallback(self.failUnlessEqual, None, "empty_lit_dir")
d.addCallback(lambda ign: self.tiny_lit_dir.check(Monitor(), verify=True))
d.addCallback(self.failUnlessEqual, None, "tiny_lit_dir")
# and check_and_repair(), which should be a nop
d.addCallback(lambda ign: self.root.check_and_repair(Monitor()))
d.addCallback(self.check_and_repair_is_healthy, self.root, "root")
d.addCallback(lambda ign: self.mutable.check_and_repair(Monitor()))
d.addCallback(self.check_and_repair_is_healthy, self.mutable, "mutable")
d.addCallback(lambda ign: self.large.check_and_repair(Monitor()))
d.addCallback(self.check_and_repair_is_healthy, self.large, "large")
d.addCallback(lambda ign: self.small.check_and_repair(Monitor()))
d.addCallback(self.failUnlessEqual, None, "small")
d.addCallback(lambda ign: self.small2.check_and_repair(Monitor()))
d.addCallback(self.failUnlessEqual, None, "small2")
d.addCallback(lambda ign: self.empty_lit_dir.check_and_repair(Monitor()))
d.addCallback(self.failUnlessEqual, None, "empty_lit_dir")
d.addCallback(lambda ign: self.tiny_lit_dir.check_and_repair(Monitor()))
# check_and_repair(verify=True)
d.addCallback(lambda ign: self.root.check_and_repair(Monitor(), verify=True))
d.addCallback(self.check_and_repair_is_healthy, self.root, "root")
d.addCallback(lambda ign: self.mutable.check_and_repair(Monitor(), verify=True))
d.addCallback(self.check_and_repair_is_healthy, self.mutable, "mutable")
d.addCallback(lambda ign: self.large.check_and_repair(Monitor(), verify=True))
d.addCallback(self.check_and_repair_is_healthy, self.large, "large", incomplete=True)
d.addCallback(lambda ign: self.small.check_and_repair(Monitor(), verify=True))
d.addCallback(self.failUnlessEqual, None, "small")
d.addCallback(lambda ign: self.small2.check_and_repair(Monitor(), verify=True))
d.addCallback(self.failUnlessEqual, None, "small2")
d.addCallback(self.failUnlessEqual, None, "small2")
d.addCallback(lambda ign: self.empty_lit_dir.check_and_repair(Monitor(), verify=True))
d.addCallback(self.failUnlessEqual, None, "empty_lit_dir")
d.addCallback(lambda ign: self.tiny_lit_dir.check_and_repair(Monitor(), verify=True))
# now deep-check the root, with various verify= and repair= options
d.addCallback(lambda ign:
self.root.start_deep_check().when_done())
d.addCallback(self.deep_check_is_healthy, 3, "root")
d.addCallback(lambda ign:
self.root.start_deep_check(verify=True).when_done())
d.addCallback(self.deep_check_is_healthy, 3, "root")
d.addCallback(lambda ign:
self.root.start_deep_check_and_repair().when_done())
d.addCallback(self.deep_check_and_repair_is_healthy, 3, "root")
d.addCallback(lambda ign:
self.root.start_deep_check_and_repair(verify=True).when_done())
d.addCallback(self.deep_check_and_repair_is_healthy, 3, "root")
# and finally, start a deep-check, but then cancel it.
d.addCallback(lambda ign: self.root.start_deep_check())
def _checking(monitor):
monitor.cancel()
d = monitor.when_done()
# this should fire as soon as the next dirnode.list finishes.
# TODO: add a counter to measure how many list() calls are made,
# assert that no more than one gets to run before the cancel()
# takes effect.
def _finished_normally(res):
self.fail("this was supposed to fail, not finish normally")
def _cancelled(f):
f.trap(OperationCancelledError)
d.addCallbacks(_finished_normally, _cancelled)
return d
d.addCallback(_checking)
return d
def json_check_is_healthy(self, data, n, where, incomplete=False):
self.failUnlessEqual(data["storage-index"],
base32.b2a(n.get_storage_index()), where)
self.failUnless("summary" in data, (where, data))
self.failUnlessEqual(data["summary"].lower(), "healthy",
"%s: '%s'" % (where, data["summary"]))
r = data["results"]
self.failUnlessEqual(r["healthy"], True, where)
num_servers = len(self.g.all_servers)
self.failUnlessEqual(num_servers, 10)
self.failIfIn("needs-rebalancing", r)
self.failUnlessEqual(r["count-happiness"], num_servers, where)
self.failUnlessEqual(r["count-shares-good"], num_servers, where)
self.failUnlessEqual(r["count-shares-needed"], 3, where)
self.failUnlessEqual(r["count-shares-expected"], num_servers, where)
if not incomplete:
self.failUnlessEqual(r["count-good-share-hosts"], num_servers,
where)
self.failUnlessEqual(r["count-corrupt-shares"], 0, where)
self.failUnlessEqual(r["list-corrupt-shares"], [], where)
if not incomplete:
self.failUnlessEqual(sorted(r["servers-responding"]),
sorted([idlib.nodeid_b2a(sid)
for sid in self.g.get_all_serverids()]),
where)
self.failUnless("sharemap" in r, where)
all_serverids = set()
for (shareid, serverids_s) in r["sharemap"].items():
all_serverids.update(serverids_s)
self.failUnlessEqual(sorted(all_serverids),
sorted([idlib.nodeid_b2a(sid)
for sid in self.g.get_all_serverids()]),
where)
self.failUnlessEqual(r["count-wrong-shares"], 0, where)
self.failUnlessEqual(r["count-recoverable-versions"], 1, where)
self.failUnlessEqual(r["count-unrecoverable-versions"], 0, where)
def json_check_and_repair_is_healthy(self, data, n, where, incomplete=False):
self.failUnlessEqual(data["storage-index"],
base32.b2a(n.get_storage_index()), where)
self.failUnlessEqual(data["repair-attempted"], False, where)
self.json_check_is_healthy(data["pre-repair-results"],
n, where, incomplete)
self.json_check_is_healthy(data["post-repair-results"],
n, where, incomplete)
def json_full_deepcheck_is_healthy(self, data, n, where):
self.failUnlessEqual(data["root-storage-index"],
base32.b2a(n.get_storage_index()), where)
self.failUnlessEqual(data["count-objects-checked"], 3, where)
self.failUnlessEqual(data["count-objects-healthy"], 3, where)
self.failUnlessEqual(data["count-objects-unhealthy"], 0, where)
self.failUnlessEqual(data["count-corrupt-shares"], 0, where)
self.failUnlessEqual(data["list-corrupt-shares"], [], where)
self.failUnlessEqual(data["list-unhealthy-files"], [], where)
self.json_check_stats_good(data["stats"], where)
def json_full_deepcheck_and_repair_is_healthy(self, data, n, where):
self.failUnlessEqual(data["root-storage-index"],
base32.b2a(n.get_storage_index()), where)
self.failUnlessEqual(data["count-objects-checked"], 3, where)
self.failUnlessEqual(data["count-objects-healthy-pre-repair"], 3, where)
self.failUnlessEqual(data["count-objects-unhealthy-pre-repair"], 0, where)
self.failUnlessEqual(data["count-corrupt-shares-pre-repair"], 0, where)
self.failUnlessEqual(data["count-objects-healthy-post-repair"], 3, where)
self.failUnlessEqual(data["count-objects-unhealthy-post-repair"], 0, where)
self.failUnlessEqual(data["count-corrupt-shares-post-repair"], 0, where)
self.failUnlessEqual(data["list-corrupt-shares"], [], where)
self.failUnlessEqual(data["list-remaining-corrupt-shares"], [], where)
self.failUnlessEqual(data["list-unhealthy-files"], [], where)
self.failUnlessEqual(data["count-repairs-attempted"], 0, where)
self.failUnlessEqual(data["count-repairs-successful"], 0, where)
self.failUnlessEqual(data["count-repairs-unsuccessful"], 0, where)
def json_check_lit(self, data, n, where):
self.failUnlessEqual(data["storage-index"], "", where)
self.failUnlessEqual(data["results"]["healthy"], True, where)
def json_check_stats_good(self, data, where):
self.check_stats_good(data)
def do_test_web_good(self, ignored):
d = defer.succeed(None)
# stats
d.addCallback(lambda ign:
self.slow_web(self.root,
t="start-deep-stats", output="json"))
d.addCallback(self.json_check_stats_good, "deep-stats")
# check, no verify
d.addCallback(lambda ign: self.web_json(self.root, t="check"))
d.addCallback(self.json_check_is_healthy, self.root, "root")
d.addCallback(lambda ign: self.web_json(self.mutable, t="check"))
d.addCallback(self.json_check_is_healthy, self.mutable, "mutable")
d.addCallback(lambda ign: self.web_json(self.large, t="check"))
d.addCallback(self.json_check_is_healthy, self.large, "large")
d.addCallback(lambda ign: self.web_json(self.small, t="check"))
d.addCallback(self.json_check_lit, self.small, "small")
d.addCallback(lambda ign: self.web_json(self.small2, t="check"))
d.addCallback(self.json_check_lit, self.small2, "small2")
d.addCallback(lambda ign: self.web_json(self.empty_lit_dir, t="check"))
d.addCallback(self.json_check_lit, self.empty_lit_dir, "empty_lit_dir")
d.addCallback(lambda ign: self.web_json(self.tiny_lit_dir, t="check"))
d.addCallback(self.json_check_lit, self.tiny_lit_dir, "tiny_lit_dir")
# check and verify
d.addCallback(lambda ign:
self.web_json(self.root, t="check", verify="true"))
d.addCallback(self.json_check_is_healthy, self.root, "root+v")
d.addCallback(lambda ign:
self.web_json(self.mutable, t="check", verify="true"))
d.addCallback(self.json_check_is_healthy, self.mutable, "mutable+v")
d.addCallback(lambda ign:
self.web_json(self.large, t="check", verify="true"))
d.addCallback(self.json_check_is_healthy, self.large, "large+v",
incomplete=True)
d.addCallback(lambda ign:
self.web_json(self.small, t="check", verify="true"))
d.addCallback(self.json_check_lit, self.small, "small+v")
d.addCallback(lambda ign:
self.web_json(self.small2, t="check", verify="true"))
d.addCallback(self.json_check_lit, self.small2, "small2+v")
d.addCallback(lambda ign: self.web_json(self.empty_lit_dir, t="check", verify="true"))
d.addCallback(self.json_check_lit, self.empty_lit_dir, "empty_lit_dir+v")
d.addCallback(lambda ign: self.web_json(self.tiny_lit_dir, t="check", verify="true"))
d.addCallback(self.json_check_lit, self.tiny_lit_dir, "tiny_lit_dir+v")
# check and repair, no verify
d.addCallback(lambda ign:
self.web_json(self.root, t="check", repair="true"))
d.addCallback(self.json_check_and_repair_is_healthy, self.root, "root+r")
d.addCallback(lambda ign:
self.web_json(self.mutable, t="check", repair="true"))
d.addCallback(self.json_check_and_repair_is_healthy, self.mutable, "mutable+r")
d.addCallback(lambda ign:
self.web_json(self.large, t="check", repair="true"))
d.addCallback(self.json_check_and_repair_is_healthy, self.large, "large+r")
d.addCallback(lambda ign:
self.web_json(self.small, t="check", repair="true"))
d.addCallback(self.json_check_lit, self.small, "small+r")
d.addCallback(lambda ign:
self.web_json(self.small2, t="check", repair="true"))
d.addCallback(self.json_check_lit, self.small2, "small2+r")
d.addCallback(lambda ign: self.web_json(self.empty_lit_dir, t="check", repair="true"))
d.addCallback(self.json_check_lit, self.empty_lit_dir, "empty_lit_dir+r")
d.addCallback(lambda ign: self.web_json(self.tiny_lit_dir, t="check", repair="true"))
d.addCallback(self.json_check_lit, self.tiny_lit_dir, "tiny_lit_dir+r")
# check+verify+repair
d.addCallback(lambda ign:
self.web_json(self.root, t="check", repair="true", verify="true"))
d.addCallback(self.json_check_and_repair_is_healthy, self.root, "root+vr")
d.addCallback(lambda ign:
self.web_json(self.mutable, t="check", repair="true", verify="true"))
d.addCallback(self.json_check_and_repair_is_healthy, self.mutable, "mutable+vr")
d.addCallback(lambda ign:
self.web_json(self.large, t="check", repair="true", verify="true"))
d.addCallback(self.json_check_and_repair_is_healthy, self.large, "large+vr", incomplete=True)
d.addCallback(lambda ign:
self.web_json(self.small, t="check", repair="true", verify="true"))
d.addCallback(self.json_check_lit, self.small, "small+vr")
d.addCallback(lambda ign:
self.web_json(self.small2, t="check", repair="true", verify="true"))
d.addCallback(self.json_check_lit, self.small2, "small2+vr")
        d.addCallback(lambda ign: self.web_json(self.empty_lit_dir, t="check", repair="true", verify="true"))
d.addCallback(self.json_check_lit, self.empty_lit_dir, "empty_lit_dir+vr")
        d.addCallback(lambda ign: self.web_json(self.tiny_lit_dir, t="check", repair="true", verify="true"))
d.addCallback(self.json_check_lit, self.tiny_lit_dir, "tiny_lit_dir+vr")
# now run a deep-check, with various verify= and repair= flags
d.addCallback(lambda ign:
self.slow_web(self.root, t="start-deep-check", output="json"))
d.addCallback(self.json_full_deepcheck_is_healthy, self.root, "root+d")
d.addCallback(lambda ign:
self.slow_web(self.root, t="start-deep-check", verify="true",
output="json"))
d.addCallback(self.json_full_deepcheck_is_healthy, self.root, "root+dv")
d.addCallback(lambda ign:
self.slow_web(self.root, t="start-deep-check", repair="true",
output="json"))
d.addCallback(self.json_full_deepcheck_and_repair_is_healthy, self.root, "root+dr")
d.addCallback(lambda ign:
self.slow_web(self.root, t="start-deep-check", verify="true", repair="true", output="json"))
d.addCallback(self.json_full_deepcheck_and_repair_is_healthy, self.root, "root+dvr")
# now look at t=info
d.addCallback(lambda ign: self.web(self.root, t="info"))
# TODO: examine the output
d.addCallback(lambda ign: self.web(self.mutable, t="info"))
d.addCallback(lambda ign: self.web(self.large, t="info"))
d.addCallback(lambda ign: self.web(self.small, t="info"))
d.addCallback(lambda ign: self.web(self.small2, t="info"))
d.addCallback(lambda ign: self.web(self.empty_lit_dir, t="info"))
d.addCallback(lambda ign: self.web(self.tiny_lit_dir, t="info"))
return d
def _run_cli(self, argv, stdin=""):
#print "CLI:", argv
stdout, stderr = StringIO(), StringIO()
d = threads.deferToThread(runner.runner, argv, run_by_human=False,
stdin=StringIO(stdin),
stdout=stdout, stderr=stderr)
def _done(res):
return stdout.getvalue(), stderr.getvalue()
d.addCallback(_done)
return d
def do_test_cli_good(self, ignored):
d = defer.succeed(None)
d.addCallback(lambda ign: self.do_cli_manifest_stream1())
d.addCallback(lambda ign: self.do_cli_manifest_stream2())
d.addCallback(lambda ign: self.do_cli_manifest_stream3())
d.addCallback(lambda ign: self.do_cli_manifest_stream4())
d.addCallback(lambda ign: self.do_cli_manifest_stream5())
d.addCallback(lambda ign: self.do_cli_stats1())
d.addCallback(lambda ign: self.do_cli_stats2())
return d
def _check_manifest_storage_index(self, out):
lines = [l for l in out.split("\n") if l]
self.failUnlessEqual(len(lines), 3)
self.failUnless(base32.b2a(self.root.get_storage_index()) in lines)
self.failUnless(base32.b2a(self.mutable.get_storage_index()) in lines)
self.failUnless(base32.b2a(self.large.get_storage_index()) in lines)
def do_cli_manifest_stream1(self):
basedir = self.get_clientdir(0)
d = self._run_cli(["--node-directory", basedir,
"manifest",
self.root_uri])
def _check((out,err)):
self.failUnlessEqual(err, "")
lines = [l for l in out.split("\n") if l]
self.failUnlessEqual(len(lines), 8)
caps = {}
for l in lines:
try:
cap, path = l.split(None, 1)
except ValueError:
cap = l.strip()
path = ""
caps[cap] = path
self.failUnless(self.root.get_uri() in caps)
self.failUnlessEqual(caps[self.root.get_uri()], "")
self.failUnlessEqual(caps[self.mutable.get_uri()], "mutable")
self.failUnlessEqual(caps[self.large.get_uri()], "large")
self.failUnlessEqual(caps[self.small.get_uri()], "small")
self.failUnlessEqual(caps[self.small2.get_uri()], "small2")
self.failUnlessEqual(caps[self.empty_lit_dir.get_uri()], "empty_lit_dir")
self.failUnlessEqual(caps[self.tiny_lit_dir.get_uri()], "tiny_lit_dir")
d.addCallback(_check)
return d
def do_cli_manifest_stream2(self):
basedir = self.get_clientdir(0)
d = self._run_cli(["--node-directory", basedir,
"manifest",
"--raw",
self.root_uri])
def _check((out,err)):
self.failUnlessEqual(err, "")
# this should be the same as the POST t=stream-manifest output
self._check_streamed_manifest(out)
d.addCallback(_check)
return d
def do_cli_manifest_stream3(self):
basedir = self.get_clientdir(0)
d = self._run_cli(["--node-directory", basedir,
"manifest",
"--storage-index",
self.root_uri])
def _check((out,err)):
self.failUnlessEqual(err, "")
self._check_manifest_storage_index(out)
d.addCallback(_check)
return d
def do_cli_manifest_stream4(self):
basedir = self.get_clientdir(0)
d = self._run_cli(["--node-directory", basedir,
"manifest",
"--verify-cap",
self.root_uri])
def _check((out,err)):
self.failUnlessEqual(err, "")
lines = [l for l in out.split("\n") if l]
self.failUnlessEqual(len(lines), 3)
self.failUnless(self.root.get_verify_cap().to_string() in lines)
self.failUnless(self.mutable.get_verify_cap().to_string() in lines)
self.failUnless(self.large.get_verify_cap().to_string() in lines)
d.addCallback(_check)
return d
def do_cli_manifest_stream5(self):
basedir = self.get_clientdir(0)
d = self._run_cli(["--node-directory", basedir,
"manifest",
"--repair-cap",
self.root_uri])
def _check((out,err)):
self.failUnlessEqual(err, "")
lines = [l for l in out.split("\n") if l]
self.failUnlessEqual(len(lines), 3)
self.failUnless(self.root.get_repair_cap().to_string() in lines)
self.failUnless(self.mutable.get_repair_cap().to_string() in lines)
self.failUnless(self.large.get_repair_cap().to_string() in lines)
d.addCallback(_check)
return d
def do_cli_stats1(self):
basedir = self.get_clientdir(0)
d = self._run_cli(["--node-directory", basedir,
"stats",
self.root_uri])
def _check3((out,err)):
lines = [l.strip() for l in out.split("\n") if l]
self.failUnless("count-immutable-files: 1" in lines)
self.failUnless("count-mutable-files: 1" in lines)
self.failUnless("count-literal-files: 3" in lines)
self.failUnless("count-files: 5" in lines)
self.failUnless("count-directories: 3" in lines)
self.failUnless("size-immutable-files: 13000 (13.00 kB, 12.70 kiB)" in lines, lines)
self.failUnless("size-literal-files: 56" in lines, lines)
self.failUnless(" 4-10 : 1 (10 B, 10 B)".strip() in lines, lines)
self.failUnless(" 11-31 : 2 (31 B, 31 B)".strip() in lines, lines)
self.failUnless("10001-31622 : 1 (31.62 kB, 30.88 kiB)".strip() in lines, lines)
d.addCallback(_check3)
return d
def do_cli_stats2(self):
basedir = self.get_clientdir(0)
d = self._run_cli(["--node-directory", basedir,
"stats",
"--raw",
self.root_uri])
def _check4((out,err)):
data = simplejson.loads(out)
self.failUnlessEqual(data["count-immutable-files"], 1)
self.failUnlessEqual(data["count-immutable-files"], 1)
self.failUnlessEqual(data["count-mutable-files"], 1)
self.failUnlessEqual(data["count-literal-files"], 3)
self.failUnlessEqual(data["count-files"], 5)
self.failUnlessEqual(data["count-directories"], 3)
self.failUnlessEqual(data["size-immutable-files"], 13000)
self.failUnlessEqual(data["size-literal-files"], 56)
self.failUnless([4,10,1] in data["size-files-histogram"])
self.failUnless([11,31,2] in data["size-files-histogram"])
self.failUnless([10001,31622,1] in data["size-files-histogram"])
d.addCallback(_check4)
return d
class DeepCheckWebBad(DeepCheckBase, unittest.TestCase):
def test_bad(self):
self.basedir = "deepcheck/DeepCheckWebBad/bad"
self.set_up_grid()
d = self.set_up_damaged_tree()
d.addCallback(self.do_check)
d.addCallback(self.do_deepcheck)
d.addCallback(self.do_deepcheck_broken)
d.addCallback(self.do_test_web_bad)
d.addErrback(self.explain_web_error)
d.addErrback(self.explain_error)
return d
def set_up_damaged_tree(self):
# 6.4s
# root
# mutable-good
# mutable-missing-shares
# mutable-corrupt-shares
# mutable-unrecoverable
# large-good
# large-missing-shares
# large-corrupt-shares
# large-unrecoverable
# broken
# large1-good
# subdir-good
# large2-good
# subdir-unrecoverable
# large3-good
self.nodes = {}
c0 = self.g.clients[0]
d = c0.create_dirnode()
def _created_root(n):
self.root = n
self.root_uri = n.get_uri()
d.addCallback(_created_root)
d.addCallback(self.create_mangled, "mutable-good")
d.addCallback(self.create_mangled, "mutable-missing-shares")
d.addCallback(self.create_mangled, "mutable-corrupt-shares")
d.addCallback(self.create_mangled, "mutable-unrecoverable")
d.addCallback(self.create_mangled, "large-good")
d.addCallback(self.create_mangled, "large-missing-shares")
d.addCallback(self.create_mangled, "large-corrupt-shares")
d.addCallback(self.create_mangled, "large-unrecoverable")
d.addCallback(lambda ignored: c0.create_dirnode())
d.addCallback(self._stash_node, "broken")
large1 = upload.Data("Lots of data\n" * 1000 + "large1" + "\n", None)
d.addCallback(lambda ignored:
self.nodes["broken"].add_file(u"large1", large1))
d.addCallback(lambda ignored:
self.nodes["broken"].create_subdirectory(u"subdir-good"))
large2 = upload.Data("Lots of data\n" * 1000 + "large2" + "\n", None)
d.addCallback(lambda subdir: subdir.add_file(u"large2-good", large2))
d.addCallback(lambda ignored:
self.nodes["broken"].create_subdirectory(u"subdir-unrecoverable"))
d.addCallback(self._stash_node, "subdir-unrecoverable")
large3 = upload.Data("Lots of data\n" * 1000 + "large3" + "\n", None)
d.addCallback(lambda subdir: subdir.add_file(u"large3-good", large3))
d.addCallback(lambda ignored:
self._delete_most_shares(self.nodes["broken"]))
return d
def _stash_node(self, node, name):
self.nodes[name] = node
return node
def create_mangled(self, ignored, name):
nodetype, mangletype = name.split("-", 1)
if nodetype == "mutable":
mutable_uploadable = MutableData("mutable file contents")
d = self.g.clients[0].create_mutable_file(mutable_uploadable)
d.addCallback(lambda n: self.root.set_node(unicode(name), n))
elif nodetype == "large":
large = upload.Data("Lots of data\n" * 1000 + name + "\n", None)
d = self.root.add_file(unicode(name), large)
elif nodetype == "small":
small = upload.Data("Small enough for a LIT", None)
d = self.root.add_file(unicode(name), small)
d.addCallback(self._stash_node, name)
if mangletype == "good":
pass
elif mangletype == "missing-shares":
d.addCallback(self._delete_some_shares)
elif mangletype == "corrupt-shares":
d.addCallback(self._corrupt_some_shares)
else:
assert mangletype == "unrecoverable"
d.addCallback(self._delete_most_shares)
return d
def _run_cli(self, argv):
stdout, stderr = StringIO(), StringIO()
# this can only do synchronous operations
assert argv[0] == "debug"
runner.runner(argv, run_by_human=False, stdout=stdout, stderr=stderr)
return stdout.getvalue()
def _delete_some_shares(self, node):
self.delete_shares_numbered(node.get_uri(), [0,1])
def _corrupt_some_shares(self, node):
for (shnum, serverid, sharefile) in self.find_uri_shares(node.get_uri()):
if shnum in (0,1):
self._run_cli(["debug", "corrupt-share", sharefile])
def _delete_most_shares(self, node):
self.delete_shares_numbered(node.get_uri(), range(1,10))
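    # Deleting shares 1-9 leaves a single share; with 3-of-10 encoding that
    # is below the k=3 recovery threshold, so the file becomes unrecoverable.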
def check_is_healthy(self, cr, where):
try:
self.failUnless(ICheckResults.providedBy(cr), (cr, type(cr), where))
self.failUnless(cr.is_healthy(), (cr.get_report(), cr.is_healthy(), cr.get_summary(), where))
self.failUnless(cr.is_recoverable(), where)
self.failUnlessEqual(cr.get_version_counter_recoverable(), 1, where)
self.failUnlessEqual(cr.get_version_counter_unrecoverable(), 0, where)
return cr
except Exception, le:
le.args = tuple(le.args + (where,))
raise
def check_is_missing_shares(self, cr, where):
self.failUnless(ICheckResults.providedBy(cr), where)
self.failIf(cr.is_healthy(), where)
self.failUnless(cr.is_recoverable(), where)
self.failUnlessEqual(cr.get_version_counter_recoverable(), 1, where)
self.failUnlessEqual(cr.get_version_counter_unrecoverable(), 0, where)
return cr
def check_has_corrupt_shares(self, cr, where):
# by "corrupt-shares" we mean the file is still recoverable
self.failUnless(ICheckResults.providedBy(cr), where)
self.failIf(cr.is_healthy(), (where, cr))
self.failUnless(cr.is_recoverable(), where)
self.failUnless(cr.get_share_counter_good() < 10, where)
self.failUnless(cr.get_corrupt_shares(), where)
return cr
def check_is_unrecoverable(self, cr, where):
self.failUnless(ICheckResults.providedBy(cr), where)
self.failIf(cr.is_healthy(), where)
self.failIf(cr.is_recoverable(), where)
self.failUnless(cr.get_share_counter_good() < cr.get_encoding_needed(),
(cr.get_share_counter_good(), cr.get_encoding_needed(),
where))
self.failUnlessEqual(cr.get_version_counter_recoverable(), 0, where)
self.failUnlessEqual(cr.get_version_counter_unrecoverable(), 1, where)
return cr
def do_check(self, ignored):
d = defer.succeed(None)
# check the individual items, without verification. This will not
# detect corrupt shares.
def _check(which, checker):
d = self.nodes[which].check(Monitor())
d.addCallback(checker, which + "--check")
return d
d.addCallback(lambda ign: _check("mutable-good", self.check_is_healthy))
d.addCallback(lambda ign: _check("mutable-missing-shares",
self.check_is_missing_shares))
d.addCallback(lambda ign: _check("mutable-corrupt-shares",
self.check_is_healthy))
d.addCallback(lambda ign: _check("mutable-unrecoverable",
self.check_is_unrecoverable))
d.addCallback(lambda ign: _check("large-good", self.check_is_healthy))
d.addCallback(lambda ign: _check("large-missing-shares",
self.check_is_missing_shares))
d.addCallback(lambda ign: _check("large-corrupt-shares",
self.check_is_healthy))
d.addCallback(lambda ign: _check("large-unrecoverable",
self.check_is_unrecoverable))
# and again with verify=True, which *does* detect corrupt shares.
def _checkv(which, checker):
d = self.nodes[which].check(Monitor(), verify=True)
d.addCallback(checker, which + "--check-and-verify")
return d
d.addCallback(lambda ign: _checkv("mutable-good", self.check_is_healthy))
d.addCallback(lambda ign: _checkv("mutable-missing-shares",
self.check_is_missing_shares))
d.addCallback(lambda ign: _checkv("mutable-corrupt-shares",
self.check_has_corrupt_shares))
d.addCallback(lambda ign: _checkv("mutable-unrecoverable",
self.check_is_unrecoverable))
d.addCallback(lambda ign: _checkv("large-good", self.check_is_healthy))
d.addCallback(lambda ign: _checkv("large-missing-shares", self.check_is_missing_shares))
d.addCallback(lambda ign: _checkv("large-corrupt-shares", self.check_has_corrupt_shares))
d.addCallback(lambda ign: _checkv("large-unrecoverable",
self.check_is_unrecoverable))
return d
def do_deepcheck(self, ignored):
d = defer.succeed(None)
# now deep-check the root, with various verify= and repair= options
d.addCallback(lambda ign:
self.root.start_deep_check().when_done())
def _check1(cr):
self.failUnless(IDeepCheckResults.providedBy(cr))
c = cr.get_counters()
self.failUnlessEqual(c["count-objects-checked"], 9)
self.failUnlessEqual(c["count-objects-healthy"], 5)
self.failUnlessEqual(c["count-objects-unhealthy"], 4)
self.failUnlessEqual(c["count-objects-unrecoverable"], 2)
d.addCallback(_check1)
d.addCallback(lambda ign:
self.root.start_deep_check(verify=True).when_done())
def _check2(cr):
self.failUnless(IDeepCheckResults.providedBy(cr))
c = cr.get_counters()
self.failUnlessEqual(c["count-objects-checked"], 9)
self.failUnlessEqual(c["count-objects-healthy"], 3)
self.failUnlessEqual(c["count-objects-unhealthy"], 6)
self.failUnlessEqual(c["count-objects-healthy"], 3) # root, mutable good, large good
self.failUnlessEqual(c["count-objects-unrecoverable"], 2) # mutable unrecoverable, large unrecoverable
d.addCallback(_check2)
return d
def do_deepcheck_broken(self, ignored):
# deep-check on the broken directory should fail, because of the
# untraversable subdir
def _do_deep_check():
return self.nodes["broken"].start_deep_check().when_done()
d = self.shouldFail(UnrecoverableFileError, "do_deep_check",
"no recoverable versions",
_do_deep_check)
return d
def json_is_healthy(self, data, where):
r = data["results"]
self.failUnless(r["healthy"], where)
self.failUnless(r["recoverable"], where)
self.failUnlessEqual(r["count-recoverable-versions"], 1, where)
self.failUnlessEqual(r["count-unrecoverable-versions"], 0, where)
def json_is_missing_shares(self, data, where):
r = data["results"]
self.failIf(r["healthy"], where)
self.failUnless(r["recoverable"], where)
self.failUnlessEqual(r["count-recoverable-versions"], 1, where)
self.failUnlessEqual(r["count-unrecoverable-versions"], 0, where)
def json_has_corrupt_shares(self, data, where):
# by "corrupt-shares" we mean the file is still recoverable
r = data["results"]
self.failIf(r["healthy"], where)
self.failUnless(r["recoverable"], where)
self.failUnless(r["count-shares-good"] < 10, where)
self.failUnless(r["count-corrupt-shares"], where)
self.failUnless(r["list-corrupt-shares"], where)
def json_is_unrecoverable(self, data, where):
r = data["results"]
self.failIf(r["healthy"], where)
self.failIf(r["recoverable"], where)
self.failUnless(r["count-shares-good"] < r["count-shares-needed"],
where)
self.failUnlessEqual(r["count-recoverable-versions"], 0, where)
self.failUnlessEqual(r["count-unrecoverable-versions"], 1, where)
def do_test_web_bad(self, ignored):
d = defer.succeed(None)
# check, no verify
def _check(which, checker):
d = self.web_json(self.nodes[which], t="check")
d.addCallback(checker, which + "--webcheck")
return d
d.addCallback(lambda ign: _check("mutable-good",
self.json_is_healthy))
d.addCallback(lambda ign: _check("mutable-missing-shares",
self.json_is_missing_shares))
d.addCallback(lambda ign: _check("mutable-corrupt-shares",
self.json_is_healthy))
d.addCallback(lambda ign: _check("mutable-unrecoverable",
self.json_is_unrecoverable))
d.addCallback(lambda ign: _check("large-good",
self.json_is_healthy))
d.addCallback(lambda ign: _check("large-missing-shares",
self.json_is_missing_shares))
d.addCallback(lambda ign: _check("large-corrupt-shares",
self.json_is_healthy))
d.addCallback(lambda ign: _check("large-unrecoverable",
self.json_is_unrecoverable))
# check and verify
def _checkv(which, checker):
d = self.web_json(self.nodes[which], t="check", verify="true")
d.addCallback(checker, which + "--webcheck-and-verify")
return d
d.addCallback(lambda ign: _checkv("mutable-good",
self.json_is_healthy))
d.addCallback(lambda ign: _checkv("mutable-missing-shares",
self.json_is_missing_shares))
d.addCallback(lambda ign: _checkv("mutable-corrupt-shares",
self.json_has_corrupt_shares))
d.addCallback(lambda ign: _checkv("mutable-unrecoverable",
self.json_is_unrecoverable))
d.addCallback(lambda ign: _checkv("large-good",
self.json_is_healthy))
d.addCallback(lambda ign: _checkv("large-missing-shares", self.json_is_missing_shares))
d.addCallback(lambda ign: _checkv("large-corrupt-shares", self.json_has_corrupt_shares))
d.addCallback(lambda ign: _checkv("large-unrecoverable",
self.json_is_unrecoverable))
return d
class Large(DeepCheckBase, unittest.TestCase):
def test_lots_of_lits(self):
self.basedir = "deepcheck/Large/lots_of_lits"
self.set_up_grid()
# create the following directory structure:
# root/
# subdir/
# 000-large (CHK)
# 001-small (LIT)
# 002-small
# ...
# 399-small
# then do a deepcheck and make sure it doesn't cause a
# Deferred-tail-recursion stack overflow
COUNT = 400
c0 = self.g.clients[0]
d = c0.create_dirnode()
self.stash = {}
def _created_root(n):
self.root = n
return n
d.addCallback(_created_root)
d.addCallback(lambda root: root.create_subdirectory(u"subdir"))
def _add_children(subdir_node):
self.subdir_node = subdir_node
kids = {}
for i in range(1, COUNT):
litcap = LiteralFileURI("%03d-data" % i).to_string()
kids[u"%03d-small" % i] = (litcap, litcap)
return subdir_node.set_children(kids)
d.addCallback(_add_children)
up = upload.Data("large enough for CHK" * 100, "")
d.addCallback(lambda ign: self.subdir_node.add_file(u"0000-large", up))
def _start_deepcheck(ignored):
return self.web(self.root, method="POST", t="stream-deep-check")
d.addCallback(_start_deepcheck)
def _check( (output, url) ):
units = list(self.parse_streamed_json(output))
self.failUnlessEqual(len(units), 2+COUNT+1)
d.addCallback(_check)
return d
|
gpl-2.0
|
z-jason/anki
|
thirdparty/BeautifulSoup.py
|
20
|
79554
|
"""Beautiful Soup
Elixir and Tonic
"The Screen-Scraper's Friend"
http://www.crummy.com/software/BeautifulSoup/
Beautiful Soup parses a (possibly invalid) XML or HTML document into a
tree representation. It provides methods and Pythonic idioms that make
it easy to navigate, search, and modify the tree.
A well-formed XML/HTML document yields a well-formed data
structure. An ill-formed XML/HTML document yields a correspondingly
ill-formed data structure. If your document is only locally
well-formed, you can use this library to find and process the
well-formed part of it.
Beautiful Soup works with Python 2.2 and up. It has no external
dependencies, but you'll have more success at converting data to UTF-8
if you also install these three packages:
* chardet, for auto-detecting character encodings
http://chardet.feedparser.org/
* cjkcodecs and iconv_codec, which add more encodings to the ones supported
by stock Python.
http://cjkpython.i18n.org/
Beautiful Soup defines classes for two main parsing strategies:
* BeautifulStoneSoup, for parsing XML, SGML, or your domain-specific
language that kind of looks like XML.
* BeautifulSoup, for parsing run-of-the-mill HTML code, be it valid
or invalid. This class has web browser-like heuristics for
obtaining a sensible parse tree in the face of common HTML errors.
Beautiful Soup also defines a class (UnicodeDammit) for autodetecting
the encoding of an HTML or XML document, and converting it to
Unicode. Much of this code is taken from Mark Pilgrim's Universal Feed Parser.
For more than you ever wanted to know about Beautiful Soup, see the
documentation:
http://www.crummy.com/software/BeautifulSoup/documentation.html
Here, have some legalese:
Copyright (c) 2004-2010, Leonard Richardson
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above
copyright notice, this list of conditions and the following
disclaimer in the documentation and/or other materials provided
with the distribution.
* Neither the name of the the Beautiful Soup Consortium and All
Night Kosher Bakery nor the names of its contributors may be
used to endorse or promote products derived from this software
without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE, DAMMIT.
"""
from __future__ import generators
__author__ = "Leonard Richardson ([email protected])"
__version__ = "3.2.1"
__copyright__ = "Copyright (c) 2004-2012 Leonard Richardson"
__license__ = "New-style BSD"
from sgmllib import SGMLParser, SGMLParseError
import codecs
import markupbase
import re
import sgmllib
try:
from htmlentitydefs import name2codepoint
except ImportError:
name2codepoint = {}
try:
set
except NameError:
from sets import Set as set
#These hacks make Beautiful Soup able to parse XML with namespaces
sgmllib.tagfind = re.compile('[a-zA-Z][-_.:a-zA-Z0-9]*')
markupbase._declname_match = re.compile(r'[a-zA-Z][-_.:a-zA-Z0-9]*\s*').match
DEFAULT_OUTPUT_ENCODING = "utf-8"
def _match_css_class(str):
"""Build a RE to match the given CSS class."""
return re.compile(r"(^|.*\s)%s($|\s)" % str)
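# For example, _match_css_class("foo").match("bar foo") succeeds, while
# _match_css_class("foo").match("foobar") does not, since the class name must
# be delimited by whitespace or the start/end of the attribute value.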
# First, the classes that represent markup elements.
class PageElement(object):
"""Contains the navigational information for some part of the page
(either a tag or a piece of text)"""
def _invert(h):
"Cheap function to invert a hash."
i = {}
for k,v in h.items():
i[v] = k
return i
XML_ENTITIES_TO_SPECIAL_CHARS = { "apos" : "'",
"quot" : '"',
"amp" : "&",
"lt" : "<",
"gt" : ">" }
XML_SPECIAL_CHARS_TO_ENTITIES = _invert(XML_ENTITIES_TO_SPECIAL_CHARS)
def setup(self, parent=None, previous=None):
"""Sets up the initial relations between this element and
other elements."""
self.parent = parent
self.previous = previous
self.next = None
self.previousSibling = None
self.nextSibling = None
if self.parent and self.parent.contents:
self.previousSibling = self.parent.contents[-1]
self.previousSibling.nextSibling = self
def replaceWith(self, replaceWith):
oldParent = self.parent
myIndex = self.parent.index(self)
if hasattr(replaceWith, "parent")\
and replaceWith.parent is self.parent:
# We're replacing this element with one of its siblings.
index = replaceWith.parent.index(replaceWith)
if index and index < myIndex:
# Furthermore, it comes before this element. That
# means that when we extract it, the index of this
# element will change.
myIndex = myIndex - 1
self.extract()
oldParent.insert(myIndex, replaceWith)
def replaceWithChildren(self):
myParent = self.parent
myIndex = self.parent.index(self)
self.extract()
reversedChildren = list(self.contents)
reversedChildren.reverse()
for child in reversedChildren:
myParent.insert(myIndex, child)
def extract(self):
"""Destructively rips this element out of the tree."""
if self.parent:
try:
del self.parent.contents[self.parent.index(self)]
except ValueError:
pass
#Find the two elements that would be next to each other if
#this element (and any children) hadn't been parsed. Connect
#the two.
lastChild = self._lastRecursiveChild()
nextElement = lastChild.next
if self.previous:
self.previous.next = nextElement
if nextElement:
nextElement.previous = self.previous
self.previous = None
lastChild.next = None
self.parent = None
if self.previousSibling:
self.previousSibling.nextSibling = self.nextSibling
if self.nextSibling:
self.nextSibling.previousSibling = self.previousSibling
self.previousSibling = self.nextSibling = None
return self
def _lastRecursiveChild(self):
"Finds the last element beneath this object to be parsed."
lastChild = self
while hasattr(lastChild, 'contents') and lastChild.contents:
lastChild = lastChild.contents[-1]
return lastChild
def insert(self, position, newChild):
if isinstance(newChild, basestring) \
and not isinstance(newChild, NavigableString):
newChild = NavigableString(newChild)
position = min(position, len(self.contents))
if hasattr(newChild, 'parent') and newChild.parent is not None:
# We're 'inserting' an element that's already one
# of this object's children.
if newChild.parent is self:
index = self.index(newChild)
if index > position:
# Furthermore we're moving it further down the
# list of this object's children. That means that
# when we extract this element, our target index
# will jump down one.
position = position - 1
newChild.extract()
newChild.parent = self
previousChild = None
if position == 0:
newChild.previousSibling = None
newChild.previous = self
else:
previousChild = self.contents[position-1]
newChild.previousSibling = previousChild
newChild.previousSibling.nextSibling = newChild
newChild.previous = previousChild._lastRecursiveChild()
if newChild.previous:
newChild.previous.next = newChild
newChildsLastElement = newChild._lastRecursiveChild()
if position >= len(self.contents):
newChild.nextSibling = None
parent = self
parentsNextSibling = None
while not parentsNextSibling:
parentsNextSibling = parent.nextSibling
parent = parent.parent
if not parent: # This is the last element in the document.
break
if parentsNextSibling:
newChildsLastElement.next = parentsNextSibling
else:
newChildsLastElement.next = None
else:
nextChild = self.contents[position]
newChild.nextSibling = nextChild
if newChild.nextSibling:
newChild.nextSibling.previousSibling = newChild
newChildsLastElement.next = nextChild
if newChildsLastElement.next:
newChildsLastElement.next.previous = newChildsLastElement
self.contents.insert(position, newChild)
def append(self, tag):
"""Appends the given tag to the contents of this tag."""
self.insert(len(self.contents), tag)
def findNext(self, name=None, attrs={}, text=None, **kwargs):
"""Returns the first item that matches the given criteria and
appears after this Tag in the document."""
return self._findOne(self.findAllNext, name, attrs, text, **kwargs)
def findAllNext(self, name=None, attrs={}, text=None, limit=None,
**kwargs):
"""Returns all items that match the given criteria and appear
after this Tag in the document."""
return self._findAll(name, attrs, text, limit, self.nextGenerator,
**kwargs)
def findNextSibling(self, name=None, attrs={}, text=None, **kwargs):
"""Returns the closest sibling to this Tag that matches the
given criteria and appears after this Tag in the document."""
return self._findOne(self.findNextSiblings, name, attrs, text,
**kwargs)
def findNextSiblings(self, name=None, attrs={}, text=None, limit=None,
**kwargs):
"""Returns the siblings of this Tag that match the given
criteria and appear after this Tag in the document."""
return self._findAll(name, attrs, text, limit,
self.nextSiblingGenerator, **kwargs)
fetchNextSiblings = findNextSiblings # Compatibility with pre-3.x
def findPrevious(self, name=None, attrs={}, text=None, **kwargs):
"""Returns the first item that matches the given criteria and
appears before this Tag in the document."""
return self._findOne(self.findAllPrevious, name, attrs, text, **kwargs)
def findAllPrevious(self, name=None, attrs={}, text=None, limit=None,
**kwargs):
"""Returns all items that match the given criteria and appear
before this Tag in the document."""
return self._findAll(name, attrs, text, limit, self.previousGenerator,
**kwargs)
fetchPrevious = findAllPrevious # Compatibility with pre-3.x
def findPreviousSibling(self, name=None, attrs={}, text=None, **kwargs):
"""Returns the closest sibling to this Tag that matches the
given criteria and appears before this Tag in the document."""
return self._findOne(self.findPreviousSiblings, name, attrs, text,
**kwargs)
def findPreviousSiblings(self, name=None, attrs={}, text=None,
limit=None, **kwargs):
"""Returns the siblings of this Tag that match the given
criteria and appear before this Tag in the document."""
return self._findAll(name, attrs, text, limit,
self.previousSiblingGenerator, **kwargs)
fetchPreviousSiblings = findPreviousSiblings # Compatibility with pre-3.x
def findParent(self, name=None, attrs={}, **kwargs):
"""Returns the closest parent of this Tag that matches the given
criteria."""
# NOTE: We can't use _findOne because findParents takes a different
# set of arguments.
r = None
l = self.findParents(name, attrs, 1)
if l:
r = l[0]
return r
def findParents(self, name=None, attrs={}, limit=None, **kwargs):
"""Returns the parents of this Tag that match the given
criteria."""
return self._findAll(name, attrs, None, limit, self.parentGenerator,
**kwargs)
fetchParents = findParents # Compatibility with pre-3.x
#These methods do the real heavy lifting.
def _findOne(self, method, name, attrs, text, **kwargs):
r = None
l = method(name, attrs, text, 1, **kwargs)
if l:
r = l[0]
return r
def _findAll(self, name, attrs, text, limit, generator, **kwargs):
"Iterates over a generator looking for things that match."
if isinstance(name, SoupStrainer):
strainer = name
# (Possibly) special case some findAll*(...) searches
elif text is None and not limit and not attrs and not kwargs:
# findAll*(True)
if name is True:
return [element for element in generator()
if isinstance(element, Tag)]
# findAll*('tag-name')
elif isinstance(name, basestring):
return [element for element in generator()
if isinstance(element, Tag) and
element.name == name]
else:
strainer = SoupStrainer(name, attrs, text, **kwargs)
# Build a SoupStrainer
else:
strainer = SoupStrainer(name, attrs, text, **kwargs)
results = ResultSet(strainer)
g = generator()
while True:
try:
i = g.next()
except StopIteration:
break
if i:
found = strainer.search(i)
if found:
results.append(found)
if limit and len(results) >= limit:
break
return results
#These Generators can be used to navigate starting from both
#NavigableStrings and Tags.
def nextGenerator(self):
i = self
while i is not None:
i = i.next
yield i
def nextSiblingGenerator(self):
i = self
while i is not None:
i = i.nextSibling
yield i
def previousGenerator(self):
i = self
while i is not None:
i = i.previous
yield i
def previousSiblingGenerator(self):
i = self
while i is not None:
i = i.previousSibling
yield i
def parentGenerator(self):
i = self
while i is not None:
i = i.parent
yield i
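    # A sketch (added for illustration) of driving these generators directly.
    # Each generator advances *before* yielding, so the last item yielded is
    # always None once the chain runs out (callers such as _findAll guard
    # against this with "if i:"). Assuming a parsed `soup`:
    #
    #   for ancestor in soup.find('td').parentGenerator():
    #       if ancestor is not None:
    #           print ancestor.name     # tr, table, ..., [document]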
# Utility methods
def substituteEncoding(self, str, encoding=None):
encoding = encoding or "utf-8"
return str.replace("%SOUP-ENCODING%", encoding)
def toEncoding(self, s, encoding=None):
"""Encodes an object to a string in some encoding, or to Unicode.
."""
if isinstance(s, unicode):
if encoding:
s = s.encode(encoding)
elif isinstance(s, str):
if encoding:
s = s.encode(encoding)
else:
s = unicode(s)
else:
if encoding:
s = self.toEncoding(str(s), encoding)
else:
s = unicode(s)
return s
BARE_AMPERSAND_OR_BRACKET = re.compile("([<>]|"
+ "&(?!#\d+;|#x[0-9a-fA-F]+;|\w+;)"
+ ")")
def _sub_entity(self, x):
"""Used with a regular expression to substitute the
appropriate XML entity for an XML special character."""
return "&" + self.XML_SPECIAL_CHARS_TO_ENTITIES[x.group(0)[0]] + ";"
class NavigableString(unicode, PageElement):
def __new__(cls, value):
"""Create a new NavigableString.
When unpickling a NavigableString, this method is called with
the string in DEFAULT_OUTPUT_ENCODING. That encoding needs to be
passed in to the superclass's __new__ or the superclass won't know
how to handle non-ASCII characters.
"""
if isinstance(value, unicode):
return unicode.__new__(cls, value)
return unicode.__new__(cls, value, DEFAULT_OUTPUT_ENCODING)
def __getnewargs__(self):
return (NavigableString.__str__(self),)
def __getattr__(self, attr):
"""text.string gives you text. This is for backwards
compatibility for Navigable*String, but for CData* it lets you
get the string without the CData wrapper."""
if attr == 'string':
return self
else:
raise AttributeError, "'%s' object has no attribute '%s'" % (self.__class__.__name__, attr)
def __unicode__(self):
return str(self).decode(DEFAULT_OUTPUT_ENCODING)
def __str__(self, encoding=DEFAULT_OUTPUT_ENCODING):
# Substitute outgoing XML entities.
data = self.BARE_AMPERSAND_OR_BRACKET.sub(self._sub_entity, self)
if encoding:
return data.encode(encoding)
else:
return data
class CData(NavigableString):
def __str__(self, encoding=DEFAULT_OUTPUT_ENCODING):
return "<![CDATA[%s]]>" % NavigableString.__str__(self, encoding)
class ProcessingInstruction(NavigableString):
def __str__(self, encoding=DEFAULT_OUTPUT_ENCODING):
output = self
if "%SOUP-ENCODING%" in output:
output = self.substituteEncoding(output, encoding)
return "<?%s?>" % self.toEncoding(output, encoding)
class Comment(NavigableString):
def __str__(self, encoding=DEFAULT_OUTPUT_ENCODING):
return "<!--%s-->" % NavigableString.__str__(self, encoding)
class Declaration(NavigableString):
def __str__(self, encoding=DEFAULT_OUTPUT_ENCODING):
return "<!%s>" % NavigableString.__str__(self, encoding)
class Tag(PageElement):
"""Represents a found HTML tag with its attributes and contents."""
def _convertEntities(self, match):
"""Used in a call to re.sub to replace HTML, XML, and numeric
entities with the appropriate Unicode characters. If HTML
entities are being converted, any unrecognized entities are
escaped."""
x = match.group(1)
if self.convertHTMLEntities and x in name2codepoint:
return unichr(name2codepoint[x])
elif x in self.XML_ENTITIES_TO_SPECIAL_CHARS:
if self.convertXMLEntities:
return self.XML_ENTITIES_TO_SPECIAL_CHARS[x]
else:
                return u'&amp;%s;' % x
elif len(x) > 0 and x[0] == '#':
# Handle numeric entities
if len(x) > 1 and x[1] == 'x':
return unichr(int(x[2:], 16))
else:
return unichr(int(x[1:]))
elif self.escapeUnrecognizedEntities:
            return u'&amp;%s;' % x
else:
return u'&%s;' % x
def __init__(self, parser, name, attrs=None, parent=None,
previous=None):
"Basic constructor."
# We don't actually store the parser object: that lets extracted
# chunks be garbage-collected
self.parserClass = parser.__class__
self.isSelfClosing = parser.isSelfClosingTag(name)
self.name = name
if attrs is None:
attrs = []
elif isinstance(attrs, dict):
attrs = attrs.items()
self.attrs = attrs
self.contents = []
self.setup(parent, previous)
self.hidden = False
self.containsSubstitutions = False
self.convertHTMLEntities = parser.convertHTMLEntities
self.convertXMLEntities = parser.convertXMLEntities
self.escapeUnrecognizedEntities = parser.escapeUnrecognizedEntities
# Convert any HTML, XML, or numeric entities in the attribute values.
convert = lambda(k, val): (k,
re.sub("&(#\d+|#x[0-9a-fA-F]+|\w+);",
self._convertEntities,
val))
self.attrs = map(convert, self.attrs)
def getString(self):
if (len(self.contents) == 1
and isinstance(self.contents[0], NavigableString)):
return self.contents[0]
def setString(self, string):
"""Replace the contents of the tag with a string"""
self.clear()
self.append(string)
string = property(getString, setString)
def getText(self, separator=u""):
if not len(self.contents):
return u""
stopNode = self._lastRecursiveChild().next
strings = []
current = self.contents[0]
while current is not stopNode:
if isinstance(current, NavigableString):
strings.append(current.strip())
current = current.next
return separator.join(strings)
text = property(getText)
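    # Illustrative sketch (added; not in the original source) of how the
    # `string` and `text` properties differ:
    #
    #   soup = BeautifulSoup('<p>Hello <b>world</b></p>')
    #   soup.b.string   # u'world' -- exactly one NavigableString child
    #   soup.p.string   # None    -- <p> has more than one child
    #   soup.p.text     # u'Helloworld' -- every string, stripped and joined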
def get(self, key, default=None):
"""Returns the value of the 'key' attribute for the tag, or
the value given for 'default' if it doesn't have that
attribute."""
return self._getAttrMap().get(key, default)
def clear(self):
"""Extract all children."""
for child in self.contents[:]:
child.extract()
def index(self, element):
for i, child in enumerate(self.contents):
if child is element:
return i
raise ValueError("Tag.index: element not in tag")
def has_key(self, key):
return self._getAttrMap().has_key(key)
def __getitem__(self, key):
"""tag[key] returns the value of the 'key' attribute for the tag,
and throws an exception if it's not there."""
return self._getAttrMap()[key]
def __iter__(self):
"Iterating over a tag iterates over its contents."
return iter(self.contents)
def __len__(self):
"The length of a tag is the length of its list of contents."
return len(self.contents)
def __contains__(self, x):
return x in self.contents
def __nonzero__(self):
"A tag is non-None even if it has no contents."
return True
def __setitem__(self, key, value):
"""Setting tag[key] sets the value of the 'key' attribute for the
tag."""
self._getAttrMap()
self.attrMap[key] = value
found = False
for i in range(0, len(self.attrs)):
if self.attrs[i][0] == key:
self.attrs[i] = (key, value)
found = True
if not found:
self.attrs.append((key, value))
self._getAttrMap()[key] = value
def __delitem__(self, key):
"Deleting tag[key] deletes all 'key' attributes for the tag."
for item in self.attrs:
if item[0] == key:
self.attrs.remove(item)
#We don't break because bad HTML can define the same
#attribute multiple times.
self._getAttrMap()
if self.attrMap.has_key(key):
del self.attrMap[key]
def __call__(self, *args, **kwargs):
"""Calling a tag like a function is the same as calling its
        findAll() method. E.g. tag('a') returns a list of all the <a> tags
found within this tag."""
return apply(self.findAll, args, kwargs)
def __getattr__(self, tag):
#print "Getattr %s.%s" % (self.__class__, tag)
if len(tag) > 3 and tag.rfind('Tag') == len(tag)-3:
return self.find(tag[:-3])
elif tag.find('__') != 0:
return self.find(tag)
raise AttributeError, "'%s' object has no attribute '%s'" % (self.__class__, tag)
def __eq__(self, other):
"""Returns true iff this tag has the same name, the same attributes,
and the same contents (recursively) as the given tag.
NOTE: right now this will return false if two tags have the
same attributes in a different order. Should this be fixed?"""
if other is self:
return True
if not hasattr(other, 'name') or not hasattr(other, 'attrs') or not hasattr(other, 'contents') or self.name != other.name or self.attrs != other.attrs or len(self) != len(other):
return False
for i in range(0, len(self.contents)):
if self.contents[i] != other.contents[i]:
return False
return True
def __ne__(self, other):
"""Returns true iff this tag is not identical to the other tag,
as defined in __eq__."""
return not self == other
def __repr__(self, encoding=DEFAULT_OUTPUT_ENCODING):
"""Renders this tag as a string."""
return self.__str__(encoding)
def __unicode__(self):
return self.__str__(None)
def __str__(self, encoding=DEFAULT_OUTPUT_ENCODING,
prettyPrint=False, indentLevel=0):
"""Returns a string or Unicode representation of this tag and
its contents. To get Unicode, pass None for encoding.
NOTE: since Python's HTML parser consumes whitespace, this
method is not certain to reproduce the whitespace present in
the original string."""
encodedName = self.toEncoding(self.name, encoding)
attrs = []
if self.attrs:
for key, val in self.attrs:
fmt = '%s="%s"'
if isinstance(val, basestring):
if self.containsSubstitutions and '%SOUP-ENCODING%' in val:
val = self.substituteEncoding(val, encoding)
# The attribute value either:
#
# * Contains no embedded double quotes or single quotes.
# No problem: we enclose it in double quotes.
# * Contains embedded single quotes. No problem:
# double quotes work here too.
# * Contains embedded double quotes. No problem:
# we enclose it in single quotes.
# * Embeds both single _and_ double quotes. This
# can't happen naturally, but it can happen if
# you modify an attribute value after parsing
# the document. Now we have a bit of a
# problem. We solve it by enclosing the
# attribute in single quotes, and escaping any
# embedded single quotes to XML entities.
if '"' in val:
fmt = "%s='%s'"
if "'" in val:
# TODO: replace with apos when
# appropriate.
val = val.replace("'", "&squot;")
# Now we're okay w/r/t quotes. But the attribute
# value might also contain angle brackets, or
# ampersands that aren't part of entities. We need
# to escape those to XML entities too.
val = self.BARE_AMPERSAND_OR_BRACKET.sub(self._sub_entity, val)
attrs.append(fmt % (self.toEncoding(key, encoding),
self.toEncoding(val, encoding)))
close = ''
closeTag = ''
if self.isSelfClosing:
close = ' /'
else:
closeTag = '</%s>' % encodedName
indentTag, indentContents = 0, 0
if prettyPrint:
indentTag = indentLevel
space = (' ' * (indentTag-1))
indentContents = indentTag + 1
contents = self.renderContents(encoding, prettyPrint, indentContents)
if self.hidden:
s = contents
else:
s = []
attributeString = ''
if attrs:
attributeString = ' ' + ' '.join(attrs)
if prettyPrint:
s.append(space)
s.append('<%s%s%s>' % (encodedName, attributeString, close))
if prettyPrint:
s.append("\n")
s.append(contents)
if prettyPrint and contents and contents[-1] != "\n":
s.append("\n")
if prettyPrint and closeTag:
s.append(space)
s.append(closeTag)
if prettyPrint and closeTag and self.nextSibling:
s.append("\n")
s = ''.join(s)
return s
def decompose(self):
"""Recursively destroys the contents of this tree."""
self.extract()
if len(self.contents) == 0:
return
current = self.contents[0]
while current is not None:
next = current.next
if isinstance(current, Tag):
del current.contents[:]
current.parent = None
current.previous = None
current.previousSibling = None
current.next = None
current.nextSibling = None
current = next
def prettify(self, encoding=DEFAULT_OUTPUT_ENCODING):
return self.__str__(encoding, True)
def renderContents(self, encoding=DEFAULT_OUTPUT_ENCODING,
prettyPrint=False, indentLevel=0):
"""Renders the contents of this tag as a string in the given
        encoding. If encoding is None, returns a Unicode string."""
s=[]
for c in self:
text = None
if isinstance(c, NavigableString):
text = c.__str__(encoding)
elif isinstance(c, Tag):
s.append(c.__str__(encoding, prettyPrint, indentLevel))
if text and prettyPrint:
text = text.strip()
if text:
if prettyPrint:
s.append(" " * (indentLevel-1))
s.append(text)
if prettyPrint:
s.append("\n")
return ''.join(s)
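    # Hedged usage sketch (added for illustration): __str__ gives compact
    # markup (UTF-8 by default, Unicode if encoding is None), while
    # prettify() re-indents one element per line:
    #
    #   soup = BeautifulSoup('<p>one<br/>two</p>')
    #   str(soup.p)              # '<p>one<br />two</p>'
    #   unicode(soup.p)          # same markup, as a Unicode string
    #   print soup.p.prettify()  # one tag per line, indented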
#Soup methods
def find(self, name=None, attrs={}, recursive=True, text=None,
**kwargs):
"""Return only the first child of this Tag matching the given
criteria."""
r = None
l = self.findAll(name, attrs, recursive, text, 1, **kwargs)
if l:
r = l[0]
return r
findChild = find
def findAll(self, name=None, attrs={}, recursive=True, text=None,
limit=None, **kwargs):
"""Extracts a list of Tag objects that match the given
criteria. You can specify the name of the Tag and any
attributes you want the Tag to have.
The value of a key-value pair in the 'attrs' map can be a
string, a list of strings, a regular expression object, or a
callable that takes a string and returns whether or not the
string matches for some custom definition of 'matches'. The
same is true of the tag name."""
generator = self.recursiveChildGenerator
if not recursive:
generator = self.childGenerator
return self._findAll(name, attrs, text, limit, generator, **kwargs)
findChildren = findAll
# Pre-3.x compatibility methods
first = find
fetch = findAll
def fetchText(self, text=None, recursive=True, limit=None):
return self.findAll(text=text, recursive=recursive, limit=limit)
def firstText(self, text=None, recursive=True):
return self.find(text=text, recursive=recursive)
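    # Illustrative sketch (added): the name, attrs, and keyword criteria
    # accept strings, lists, regular expressions, and callables alike:
    #
    #   import re
    #   soup.findAll('a')                             # every <a> tag
    #   soup.findAll('a', href=re.compile('^http'))   # attribute regexp
    #   soup.findAll(['td', 'th'])                    # name from a list
    #   soup.findAll(lambda tag: len(tag.attrs) == 2) # custom predicate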
#Private methods
def _getAttrMap(self):
"""Initializes a map representation of this tag's attributes,
if not already initialized."""
if not getattr(self, 'attrMap'):
self.attrMap = {}
for (key, value) in self.attrs:
self.attrMap[key] = value
return self.attrMap
#Generator methods
def childGenerator(self):
# Just use the iterator from the contents
return iter(self.contents)
def recursiveChildGenerator(self):
if not len(self.contents):
raise StopIteration
stopNode = self._lastRecursiveChild().next
current = self.contents[0]
while current is not stopNode:
yield current
current = current.next
# Next, a couple classes to represent queries and their results.
class SoupStrainer:
"""Encapsulates a number of ways of matching a markup element (tag or
text)."""
def __init__(self, name=None, attrs={}, text=None, **kwargs):
self.name = name
if isinstance(attrs, basestring):
kwargs['class'] = _match_css_class(attrs)
attrs = None
if kwargs:
if attrs:
attrs = attrs.copy()
attrs.update(kwargs)
else:
attrs = kwargs
self.attrs = attrs
self.text = text
def __str__(self):
if self.text:
return self.text
else:
return "%s|%s" % (self.name, self.attrs)
def searchTag(self, markupName=None, markupAttrs={}):
found = None
markup = None
if isinstance(markupName, Tag):
markup = markupName
markupAttrs = markup
callFunctionWithTagData = callable(self.name) \
and not isinstance(markupName, Tag)
if (not self.name) \
or callFunctionWithTagData \
or (markup and self._matches(markup, self.name)) \
or (not markup and self._matches(markupName, self.name)):
if callFunctionWithTagData:
match = self.name(markupName, markupAttrs)
else:
match = True
markupAttrMap = None
for attr, matchAgainst in self.attrs.items():
if not markupAttrMap:
if hasattr(markupAttrs, 'get'):
markupAttrMap = markupAttrs
else:
markupAttrMap = {}
for k,v in markupAttrs:
markupAttrMap[k] = v
attrValue = markupAttrMap.get(attr)
if not self._matches(attrValue, matchAgainst):
match = False
break
if match:
if markup:
found = markup
else:
found = markupName
return found
def search(self, markup):
#print 'looking for %s in %s' % (self, markup)
found = None
# If given a list of items, scan it for a text element that
# matches.
if hasattr(markup, "__iter__") \
and not isinstance(markup, Tag):
for element in markup:
if isinstance(element, NavigableString) \
and self.search(element):
found = element
break
# If it's a Tag, make sure its name or attributes match.
# Don't bother with Tags if we're searching for text.
elif isinstance(markup, Tag):
if not self.text:
found = self.searchTag(markup)
# If it's text, make sure the text matches.
elif isinstance(markup, NavigableString) or \
isinstance(markup, basestring):
if self._matches(markup, self.text):
found = markup
else:
raise Exception, "I don't know how to match against a %s" \
% markup.__class__
return found
def _matches(self, markup, matchAgainst):
#print "Matching %s against %s" % (markup, matchAgainst)
result = False
if matchAgainst is True:
result = markup is not None
elif callable(matchAgainst):
result = matchAgainst(markup)
else:
#Custom match methods take the tag as an argument, but all
#other ways of matching match the tag name as a string.
if isinstance(markup, Tag):
markup = markup.name
if markup and not isinstance(markup, basestring):
markup = unicode(markup)
#Now we know that chunk is either a string, or None.
if hasattr(matchAgainst, 'match'):
# It's a regexp object.
result = markup and matchAgainst.search(markup)
elif hasattr(matchAgainst, '__iter__'): # list-like
result = markup in matchAgainst
elif hasattr(matchAgainst, 'items'):
result = markup.has_key(matchAgainst)
elif matchAgainst and isinstance(markup, basestring):
if isinstance(markup, unicode):
matchAgainst = unicode(matchAgainst)
else:
matchAgainst = str(matchAgainst)
if not result:
result = matchAgainst == markup
return result
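# Illustrative sketch (added): a SoupStrainer packages the same matching
# criteria the find*/findAll methods accept, and can also be handed to the
# parser as parseOnlyThese so only matching elements are built at all.
# `markup` below is a placeholder for your document:
#
#   import re
#   from BeautifulSoup import BeautifulSoup, SoupStrainer
#   links = SoupStrainer('a', href=re.compile('python'))
#   soup = BeautifulSoup(markup, parseOnlyThese=links)
#   soup.findAll(links)                 # a strainer can stand in for 'name'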
class ResultSet(list):
"""A ResultSet is just a list that keeps track of the SoupStrainer
that created it."""
def __init__(self, source):
list.__init__([])
self.source = source
# Now, some helper functions.
def buildTagMap(default, *args):
"""Turns a list of maps, lists, or scalars into a single map.
Used to build the SELF_CLOSING_TAGS, NESTABLE_TAGS, and
NESTING_RESET_TAGS maps out of lists and partial maps."""
built = {}
for portion in args:
if hasattr(portion, 'items'):
#It's a map. Merge it.
for k,v in portion.items():
built[k] = v
elif hasattr(portion, '__iter__'): # is a list
#It's a list. Map each item to the default.
for k in portion:
built[k] = default
else:
#It's a scalar. Map it to the default.
built[portion] = default
return built
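# A worked example (added for illustration) of the merge rules above:
#
#   buildTagMap([], {'li': ['ul', 'ol']}, ('span', 'font'), 'center')
#   # => {'li': ['ul', 'ol'], 'span': [], 'font': [], 'center': []}
#
# (In Python 2 a plain string has no __iter__, so 'center' falls through
# to the scalar branch rather than being iterated character by character.)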
# Now, the parser classes.
class BeautifulStoneSoup(Tag, SGMLParser):
"""This class contains the basic parser and search code. It defines
a parser that knows nothing about tag behavior except for the
following:
You can't close a tag without closing all the tags it encloses.
That is, "<foo><bar></foo>" actually means
"<foo><bar></bar></foo>".
[Another possible explanation is "<foo><bar /></foo>", but since
this class defines no SELF_CLOSING_TAGS, it will never use that
explanation.]
This class is useful for parsing XML or made-up markup languages,
or when BeautifulSoup makes an assumption counter to what you were
expecting."""
SELF_CLOSING_TAGS = {}
NESTABLE_TAGS = {}
RESET_NESTING_TAGS = {}
QUOTE_TAGS = {}
PRESERVE_WHITESPACE_TAGS = []
MARKUP_MASSAGE = [(re.compile('(<[^<>]*)/>'),
lambda x: x.group(1) + ' />'),
(re.compile('<!\s+([^<>]*)>'),
lambda x: '<!' + x.group(1) + '>')
]
ROOT_TAG_NAME = u'[document]'
HTML_ENTITIES = "html"
XML_ENTITIES = "xml"
XHTML_ENTITIES = "xhtml"
# TODO: This only exists for backwards-compatibility
ALL_ENTITIES = XHTML_ENTITIES
# Used when determining whether a text node is all whitespace and
# can be replaced with a single space. A text node that contains
# fancy Unicode spaces (usually non-breaking) should be left
# alone.
STRIP_ASCII_SPACES = { 9: None, 10: None, 12: None, 13: None, 32: None, }
def __init__(self, markup="", parseOnlyThese=None, fromEncoding=None,
markupMassage=True, smartQuotesTo=XML_ENTITIES,
convertEntities=None, selfClosingTags=None, isHTML=False):
"""The Soup object is initialized as the 'root tag', and the
provided markup (which can be a string or a file-like object)
is fed into the underlying parser.
sgmllib will process most bad HTML, and the BeautifulSoup
class has some tricks for dealing with some HTML that kills
sgmllib, but Beautiful Soup can nonetheless choke or lose data
if your data uses self-closing tags or declarations
incorrectly.
By default, Beautiful Soup uses regexes to sanitize input,
avoiding the vast majority of these problems. If the problems
don't apply to you, pass in False for markupMassage, and
you'll get better performance.
The default parser massage techniques fix the two most common
instances of invalid HTML that choke sgmllib:
<br/> (No space between name of closing tag and tag close)
<! --Comment--> (Extraneous whitespace in declaration)
You can pass in a custom list of (RE object, replace method)
tuples to get Beautiful Soup to scrub your input the way you
want."""
self.parseOnlyThese = parseOnlyThese
self.fromEncoding = fromEncoding
self.smartQuotesTo = smartQuotesTo
self.convertEntities = convertEntities
# Set the rules for how we'll deal with the entities we
# encounter
if self.convertEntities:
# It doesn't make sense to convert encoded characters to
# entities even while you're converting entities to Unicode.
# Just convert it all to Unicode.
self.smartQuotesTo = None
if convertEntities == self.HTML_ENTITIES:
self.convertXMLEntities = False
self.convertHTMLEntities = True
self.escapeUnrecognizedEntities = True
elif convertEntities == self.XHTML_ENTITIES:
self.convertXMLEntities = True
self.convertHTMLEntities = True
self.escapeUnrecognizedEntities = False
elif convertEntities == self.XML_ENTITIES:
self.convertXMLEntities = True
self.convertHTMLEntities = False
self.escapeUnrecognizedEntities = False
else:
self.convertXMLEntities = False
self.convertHTMLEntities = False
self.escapeUnrecognizedEntities = False
self.instanceSelfClosingTags = buildTagMap(None, selfClosingTags)
SGMLParser.__init__(self)
if hasattr(markup, 'read'): # It's a file-type object.
markup = markup.read()
self.markup = markup
self.markupMassage = markupMassage
try:
self._feed(isHTML=isHTML)
except StopParsing:
pass
self.markup = None # The markup can now be GCed
def convert_charref(self, name):
"""This method fixes a bug in Python's SGMLParser."""
try:
n = int(name)
except ValueError:
return
        if not 0 <= n <= 127: # ASCII ends at 127, not 255
return
return self.convert_codepoint(n)
def _feed(self, inDocumentEncoding=None, isHTML=False):
# Convert the document to Unicode.
markup = self.markup
if isinstance(markup, unicode):
if not hasattr(self, 'originalEncoding'):
self.originalEncoding = None
else:
dammit = UnicodeDammit\
(markup, [self.fromEncoding, inDocumentEncoding],
smartQuotesTo=self.smartQuotesTo, isHTML=isHTML)
markup = dammit.unicode
self.originalEncoding = dammit.originalEncoding
self.declaredHTMLEncoding = dammit.declaredHTMLEncoding
if markup:
if self.markupMassage:
if not hasattr(self.markupMassage, "__iter__"):
self.markupMassage = self.MARKUP_MASSAGE
for fix, m in self.markupMassage:
markup = fix.sub(m, markup)
# TODO: We get rid of markupMassage so that the
# soup object can be deepcopied later on. Some
# Python installations can't copy regexes. If anyone
# was relying on the existence of markupMassage, this
# might cause problems.
del(self.markupMassage)
self.reset()
SGMLParser.feed(self, markup)
# Close out any unfinished strings and close all the open tags.
self.endData()
while self.currentTag.name != self.ROOT_TAG_NAME:
self.popTag()
def __getattr__(self, methodName):
"""This method routes method call requests to either the SGMLParser
superclass or the Tag superclass, depending on the method name."""
#print "__getattr__ called on %s.%s" % (self.__class__, methodName)
if methodName.startswith('start_') or methodName.startswith('end_') \
or methodName.startswith('do_'):
return SGMLParser.__getattr__(self, methodName)
elif not methodName.startswith('__'):
return Tag.__getattr__(self, methodName)
else:
raise AttributeError
def isSelfClosingTag(self, name):
"""Returns true iff the given string is the name of a
self-closing tag according to this parser."""
return self.SELF_CLOSING_TAGS.has_key(name) \
or self.instanceSelfClosingTags.has_key(name)
def reset(self):
Tag.__init__(self, self, self.ROOT_TAG_NAME)
self.hidden = 1
SGMLParser.reset(self)
self.currentData = []
self.currentTag = None
self.tagStack = []
self.quoteStack = []
self.pushTag(self)
def popTag(self):
tag = self.tagStack.pop()
#print "Pop", tag.name
if self.tagStack:
self.currentTag = self.tagStack[-1]
return self.currentTag
def pushTag(self, tag):
#print "Push", tag.name
if self.currentTag:
self.currentTag.contents.append(tag)
self.tagStack.append(tag)
self.currentTag = self.tagStack[-1]
def endData(self, containerClass=NavigableString):
if self.currentData:
currentData = u''.join(self.currentData)
if (currentData.translate(self.STRIP_ASCII_SPACES) == '' and
not set([tag.name for tag in self.tagStack]).intersection(
self.PRESERVE_WHITESPACE_TAGS)):
if '\n' in currentData:
currentData = '\n'
else:
currentData = ' '
self.currentData = []
if self.parseOnlyThese and len(self.tagStack) <= 1 and \
(not self.parseOnlyThese.text or \
not self.parseOnlyThese.search(currentData)):
return
o = containerClass(currentData)
o.setup(self.currentTag, self.previous)
if self.previous:
self.previous.next = o
self.previous = o
self.currentTag.contents.append(o)
def _popToTag(self, name, inclusivePop=True):
"""Pops the tag stack up to and including the most recent
instance of the given tag. If inclusivePop is false, pops the tag
        stack up to but *not* including the most recent instance of
the given tag."""
#print "Popping to %s" % name
if name == self.ROOT_TAG_NAME:
return
numPops = 0
mostRecentTag = None
for i in range(len(self.tagStack)-1, 0, -1):
if name == self.tagStack[i].name:
numPops = len(self.tagStack)-i
break
if not inclusivePop:
numPops = numPops - 1
for i in range(0, numPops):
mostRecentTag = self.popTag()
return mostRecentTag
def _smartPop(self, name):
"""We need to pop up to the previous tag of this type, unless
one of this tag's nesting reset triggers comes between this
tag and the previous tag of this type, OR unless this tag is a
generic nesting trigger and another generic nesting trigger
comes between this tag and the previous tag of this type.
Examples:
<p>Foo<b>Bar *<p>* should pop to 'p', not 'b'.
<p>Foo<table>Bar *<p>* should pop to 'table', not 'p'.
<p>Foo<table><tr>Bar *<p>* should pop to 'tr', not 'p'.
<li><ul><li> *<li>* should pop to 'ul', not the first 'li'.
<tr><table><tr> *<tr>* should pop to 'table', not the first 'tr'
<td><tr><td> *<td>* should pop to 'tr', not the first 'td'
"""
nestingResetTriggers = self.NESTABLE_TAGS.get(name)
isNestable = nestingResetTriggers != None
isResetNesting = self.RESET_NESTING_TAGS.has_key(name)
popTo = None
inclusive = True
for i in range(len(self.tagStack)-1, 0, -1):
p = self.tagStack[i]
if (not p or p.name == name) and not isNestable:
#Non-nestable tags get popped to the top or to their
                #last occurrence.
popTo = name
break
if (nestingResetTriggers is not None
and p.name in nestingResetTriggers) \
or (nestingResetTriggers is None and isResetNesting
and self.RESET_NESTING_TAGS.has_key(p.name)):
#If we encounter one of the nesting reset triggers
#peculiar to this tag, or we encounter another tag
#that causes nesting to reset, pop up to but not
#including that tag.
popTo = p.name
inclusive = False
break
p = p.parent
if popTo:
self._popToTag(popTo, inclusive)
def unknown_starttag(self, name, attrs, selfClosing=0):
#print "Start tag %s: %s" % (name, attrs)
if self.quoteStack:
#This is not a real tag.
#print "<%s> is not real!" % name
attrs = ''.join([' %s="%s"' % (x, y) for x, y in attrs])
self.handle_data('<%s%s>' % (name, attrs))
return
self.endData()
if not self.isSelfClosingTag(name) and not selfClosing:
self._smartPop(name)
if self.parseOnlyThese and len(self.tagStack) <= 1 \
and (self.parseOnlyThese.text or not self.parseOnlyThese.searchTag(name, attrs)):
return
tag = Tag(self, name, attrs, self.currentTag, self.previous)
if self.previous:
self.previous.next = tag
self.previous = tag
self.pushTag(tag)
if selfClosing or self.isSelfClosingTag(name):
self.popTag()
if name in self.QUOTE_TAGS:
#print "Beginning quote (%s)" % name
self.quoteStack.append(name)
self.literal = 1
return tag
def unknown_endtag(self, name):
#print "End tag %s" % name
if self.quoteStack and self.quoteStack[-1] != name:
#This is not a real end tag.
#print "</%s> is not real!" % name
self.handle_data('</%s>' % name)
return
self.endData()
self._popToTag(name)
if self.quoteStack and self.quoteStack[-1] == name:
self.quoteStack.pop()
self.literal = (len(self.quoteStack) > 0)
def handle_data(self, data):
self.currentData.append(data)
def _toStringSubclass(self, text, subclass):
"""Adds a certain piece of text to the tree as a NavigableString
subclass."""
self.endData()
self.handle_data(text)
self.endData(subclass)
def handle_pi(self, text):
"""Handle a processing instruction as a ProcessingInstruction
object, possibly one with a %SOUP-ENCODING% slot into which an
encoding will be plugged later."""
if text[:3] == "xml":
text = u"xml version='1.0' encoding='%SOUP-ENCODING%'"
self._toStringSubclass(text, ProcessingInstruction)
def handle_comment(self, text):
"Handle comments as Comment objects."
self._toStringSubclass(text, Comment)
def handle_charref(self, ref):
"Handle character references as data."
if self.convertEntities:
data = unichr(int(ref))
else:
data = '&#%s;' % ref
self.handle_data(data)
def handle_entityref(self, ref):
"""Handle entity references as data, possibly converting known
HTML and/or XML entity references to the corresponding Unicode
characters."""
data = None
if self.convertHTMLEntities:
try:
data = unichr(name2codepoint[ref])
except KeyError:
pass
if not data and self.convertXMLEntities:
data = self.XML_ENTITIES_TO_SPECIAL_CHARS.get(ref)
if not data and self.convertHTMLEntities and \
not self.XML_ENTITIES_TO_SPECIAL_CHARS.get(ref):
# TODO: We've got a problem here. We're told this is
# an entity reference, but it's not an XML entity
# reference or an HTML entity reference. Nonetheless,
# the logical thing to do is to pass it through as an
# unrecognized entity reference.
#
# Except: when the input is "&carol;" this function
# will be called with input "carol". When the input is
# "AT&T", this function will be called with input
# "T". We have no way of knowing whether a semicolon
# was present originally, so we don't know whether
# this is an unknown entity or just a misplaced
# ampersand.
#
# The more common case is a misplaced ampersand, so I
# escape the ampersand and omit the trailing semicolon.
data = "&%s" % ref
if not data:
# This case is different from the one above, because we
# haven't already gone through a supposedly comprehensive
# mapping of entities to Unicode characters. We might not
# have gone through any mapping at all. So the chances are
# very high that this is a real entity, and not a
# misplaced ampersand.
data = "&%s;" % ref
self.handle_data(data)
def handle_decl(self, data):
"Handle DOCTYPEs and the like as Declaration objects."
self._toStringSubclass(data, Declaration)
def parse_declaration(self, i):
"""Treat a bogus SGML declaration as raw data. Treat a CDATA
declaration as a CData object."""
j = None
if self.rawdata[i:i+9] == '<![CDATA[':
k = self.rawdata.find(']]>', i)
if k == -1:
k = len(self.rawdata)
data = self.rawdata[i+9:k]
j = k+3
self._toStringSubclass(data, CData)
else:
try:
j = SGMLParser.parse_declaration(self, i)
except SGMLParseError:
toHandle = self.rawdata[i:]
self.handle_data(toHandle)
j = i + len(toHandle)
return j
class BeautifulSoup(BeautifulStoneSoup):
"""This parser knows the following facts about HTML:
* Some tags have no closing tag and should be interpreted as being
closed as soon as they are encountered.
    * The text inside some tags (i.e. 'script') may contain tags which
are not really part of the document and which should be parsed
as text, not tags. If you want to parse the text as tags, you can
always fetch it and parse it explicitly.
* Tag nesting rules:
      Most tags can't be nested at all. For instance, the occurrence of
a <p> tag should implicitly close the previous <p> tag.
<p>Para1<p>Para2
should be transformed into:
<p>Para1</p><p>Para2
      Some tags can be nested arbitrarily. For instance, the occurrence
of a <blockquote> tag should _not_ implicitly close the previous
<blockquote> tag.
Alice said: <blockquote>Bob said: <blockquote>Blah
should NOT be transformed into:
Alice said: <blockquote>Bob said: </blockquote><blockquote>Blah
Some tags can be nested, but the nesting is reset by the
interposition of other tags. For instance, a <tr> tag should
implicitly close the previous <tr> tag within the same <table>,
but not close a <tr> tag in another table.
<table><tr>Blah<tr>Blah
should be transformed into:
<table><tr>Blah</tr><tr>Blah
but,
<tr>Blah<table><tr>Blah
should NOT be transformed into
<tr>Blah<table></tr><tr>Blah
Differing assumptions about tag nesting rules are a major source
of problems with the BeautifulSoup class. If BeautifulSoup is not
treating as nestable a tag your page author treats as nestable,
try ICantBelieveItsBeautifulSoup, MinimalSoup, or
BeautifulStoneSoup before writing your own subclass."""
def __init__(self, *args, **kwargs):
if not kwargs.has_key('smartQuotesTo'):
kwargs['smartQuotesTo'] = self.HTML_ENTITIES
kwargs['isHTML'] = True
BeautifulStoneSoup.__init__(self, *args, **kwargs)
SELF_CLOSING_TAGS = buildTagMap(None,
('br' , 'hr', 'input', 'img', 'meta',
'spacer', 'link', 'frame', 'base', 'col'))
PRESERVE_WHITESPACE_TAGS = set(['pre', 'textarea'])
QUOTE_TAGS = {'script' : None, 'textarea' : None}
#According to the HTML standard, each of these inline tags can
#contain another tag of the same type. Furthermore, it's common
#to actually use these tags this way.
NESTABLE_INLINE_TAGS = ('span', 'font', 'q', 'object', 'bdo', 'sub', 'sup',
'center')
#According to the HTML standard, these block tags can contain
#another tag of the same type. Furthermore, it's common
#to actually use these tags this way.
NESTABLE_BLOCK_TAGS = ('blockquote', 'div', 'fieldset', 'ins', 'del')
#Lists can contain other lists, but there are restrictions.
NESTABLE_LIST_TAGS = { 'ol' : [],
'ul' : [],
'li' : ['ul', 'ol'],
'dl' : [],
'dd' : ['dl'],
'dt' : ['dl'] }
#Tables can contain other tables, but there are restrictions.
NESTABLE_TABLE_TAGS = {'table' : [],
'tr' : ['table', 'tbody', 'tfoot', 'thead'],
'td' : ['tr'],
'th' : ['tr'],
'thead' : ['table'],
'tbody' : ['table'],
'tfoot' : ['table'],
}
NON_NESTABLE_BLOCK_TAGS = ('address', 'form', 'p', 'pre')
#If one of these tags is encountered, all tags up to the next tag of
#this type are popped.
RESET_NESTING_TAGS = buildTagMap(None, NESTABLE_BLOCK_TAGS, 'noscript',
NON_NESTABLE_BLOCK_TAGS,
NESTABLE_LIST_TAGS,
NESTABLE_TABLE_TAGS)
NESTABLE_TAGS = buildTagMap([], NESTABLE_INLINE_TAGS, NESTABLE_BLOCK_TAGS,
NESTABLE_LIST_TAGS, NESTABLE_TABLE_TAGS)
# Used to detect the charset in a META tag; see start_meta
CHARSET_RE = re.compile("((^|;)\s*charset=)([^;]*)", re.M)
def start_meta(self, attrs):
"""Beautiful Soup can detect a charset included in a META tag,
try to convert the document to that charset, and re-parse the
document from the beginning."""
httpEquiv = None
contentType = None
contentTypeIndex = None
tagNeedsEncodingSubstitution = False
for i in range(0, len(attrs)):
key, value = attrs[i]
key = key.lower()
if key == 'http-equiv':
httpEquiv = value
elif key == 'content':
contentType = value
contentTypeIndex = i
if httpEquiv and contentType: # It's an interesting meta tag.
match = self.CHARSET_RE.search(contentType)
if match:
if (self.declaredHTMLEncoding is not None or
self.originalEncoding == self.fromEncoding):
# An HTML encoding was sniffed while converting
# the document to Unicode, or an HTML encoding was
# sniffed during a previous pass through the
# document, or an encoding was specified
# explicitly and it worked. Rewrite the meta tag.
def rewrite(match):
return match.group(1) + "%SOUP-ENCODING%"
newAttr = self.CHARSET_RE.sub(rewrite, contentType)
attrs[contentTypeIndex] = (attrs[contentTypeIndex][0],
newAttr)
tagNeedsEncodingSubstitution = True
else:
# This is our first pass through the document.
# Go through it again with the encoding information.
newCharset = match.group(3)
if newCharset and newCharset != self.originalEncoding:
self.declaredHTMLEncoding = newCharset
self._feed(self.declaredHTMLEncoding)
raise StopParsing
pass
tag = self.unknown_starttag("meta", attrs)
if tag and tagNeedsEncodingSubstitution:
tag.containsSubstitutions = True
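    # Worked example (added for illustration) of what CHARSET_RE extracts
    # from a typical http-equiv meta tag's content attribute:
    #
    #   >>> BeautifulSoup.CHARSET_RE.search('text/html; charset=utf-8').group(3)
    #   'utf-8'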
class StopParsing(Exception):
pass
class ICantBelieveItsBeautifulSoup(BeautifulSoup):
"""The BeautifulSoup class is oriented towards skipping over
common HTML errors like unclosed tags. However, sometimes it makes
errors of its own. For instance, consider this fragment:
<b>Foo<b>Bar</b></b>
This is perfectly valid (if bizarre) HTML. However, the
BeautifulSoup class will implicitly close the first b tag when it
encounters the second 'b'. It will think the author wrote
"<b>Foo<b>Bar", and didn't close the first 'b' tag, because
there's no real-world reason to bold something that's already
bold. When it encounters '</b></b>' it will close two more 'b'
tags, for a grand total of three tags closed instead of two. This
can throw off the rest of your document structure. The same is
true of a number of other tags, listed below.
It's much more common for someone to forget to close a 'b' tag
than to actually use nested 'b' tags, and the BeautifulSoup class
    handles the common case. This class handles the not-so-common
case: where you can't believe someone wrote what they did, but
it's valid HTML and BeautifulSoup screwed up by assuming it
wouldn't be."""
    I_CANT_BELIEVE_THEYRE_NESTABLE_INLINE_TAGS = \
     ('em', 'big', 'i', 'small', 'tt', 'abbr', 'acronym', 'strong',
      'cite', 'code', 'dfn', 'kbd', 'samp', 'var', 'b')
I_CANT_BELIEVE_THEYRE_NESTABLE_BLOCK_TAGS = ('noscript',)
NESTABLE_TAGS = buildTagMap([], BeautifulSoup.NESTABLE_TAGS,
I_CANT_BELIEVE_THEYRE_NESTABLE_BLOCK_TAGS,
I_CANT_BELIEVE_THEYRE_NESTABLE_INLINE_TAGS)
class MinimalSoup(BeautifulSoup):
"""The MinimalSoup class is for parsing HTML that contains
pathologically bad markup. It makes no assumptions about tag
nesting, but it does know which tags are self-closing, that
<script> tags contain Javascript and should not be parsed, that
META tags may contain encoding information, and so on.
This also makes it better for subclassing than BeautifulStoneSoup
or BeautifulSoup."""
RESET_NESTING_TAGS = buildTagMap('noscript')
NESTABLE_TAGS = {}
class BeautifulSOAP(BeautifulStoneSoup):
"""This class will push a tag with only a single string child into
the tag's parent as an attribute. The attribute's name is the tag
name, and the value is the string child. An example should give
the flavor of the change:
<foo><bar>baz</bar></foo>
=>
<foo bar="baz"><bar>baz</bar></foo>
You can then access fooTag['bar'] instead of fooTag.barTag.string.
This is, of course, useful for scraping structures that tend to
use subelements instead of attributes, such as SOAP messages. Note
that it modifies its input, so don't print the modified version
out.
I'm not sure how many people really want to use this class; let me
know if you do. Mainly I like the name."""
def popTag(self):
if len(self.tagStack) > 1:
tag = self.tagStack[-1]
parent = self.tagStack[-2]
parent._getAttrMap()
if (isinstance(tag, Tag) and len(tag.contents) == 1 and
isinstance(tag.contents[0], NavigableString) and
not parent.attrMap.has_key(tag.name)):
parent[tag.name] = tag.contents[0]
BeautifulStoneSoup.popTag(self)
#Enterprise class names! It has come to our attention that some people
#think the names of the Beautiful Soup parser classes are too silly
#and "unprofessional" for use in enterprise screen-scraping. We feel
#your pain! For such-minded folk, the Beautiful Soup Consortium And
#All-Night Kosher Bakery recommends renaming this file to
#"RobustParser.py" (or, in cases of extreme enterprisiness,
#"RobustParserBeanInterface.class") and using the following
#enterprise-friendly class aliases:
class RobustXMLParser(BeautifulStoneSoup):
pass
class RobustHTMLParser(BeautifulSoup):
pass
class RobustWackAssHTMLParser(ICantBelieveItsBeautifulSoup):
pass
class RobustInsanelyWackAssHTMLParser(MinimalSoup):
pass
class SimplifyingSOAPParser(BeautifulSOAP):
pass
######################################################
#
# Bonus library: Unicode, Dammit
#
# This class forces XML data into a standard format (usually to UTF-8
# or Unicode). It is heavily based on code from Mark Pilgrim's
# Universal Feed Parser. It does not rewrite the XML or HTML to
# reflect a new encoding: that happens in BeautifulStoneSoup.handle_pi
# (XML) and BeautifulSoup.start_meta (HTML).
# Autodetects character encodings.
# Download from http://chardet.feedparser.org/
try:
import chardet
# import chardet.constants
# chardet.constants._debug = 1
except ImportError:
chardet = None
# cjkcodecs and iconv_codec make Python know about more character encodings.
# Both are available from http://cjkpython.i18n.org/
# They're built in if you use Python 2.4.
try:
import cjkcodecs.aliases
except ImportError:
pass
try:
import iconv_codec
except ImportError:
pass
class UnicodeDammit:
"""A class for detecting the encoding of a *ML document and
converting it to a Unicode string. If the source encoding is
windows-1252, can replace MS smart quotes with their HTML or XML
equivalents."""
# This dictionary maps commonly seen values for "charset" in HTML
# meta tags to the corresponding Python codec names. It only covers
# values that aren't in Python's aliases and can't be determined
# by the heuristics in find_codec.
CHARSET_ALIASES = { "macintosh" : "mac-roman",
"x-sjis" : "shift-jis" }
def __init__(self, markup, overrideEncodings=[],
smartQuotesTo='xml', isHTML=False):
self.declaredHTMLEncoding = None
self.markup, documentEncoding, sniffedEncoding = \
self._detectEncoding(markup, isHTML)
self.smartQuotesTo = smartQuotesTo
self.triedEncodings = []
if markup == '' or isinstance(markup, unicode):
self.originalEncoding = None
self.unicode = unicode(markup)
return
u = None
for proposedEncoding in overrideEncodings:
u = self._convertFrom(proposedEncoding)
if u: break
if not u:
for proposedEncoding in (documentEncoding, sniffedEncoding):
u = self._convertFrom(proposedEncoding)
if u: break
# If no luck and we have auto-detection library, try that:
if not u and chardet and not isinstance(self.markup, unicode):
u = self._convertFrom(chardet.detect(self.markup)['encoding'])
# As a last resort, try utf-8 and windows-1252:
if not u:
for proposed_encoding in ("utf-8", "windows-1252"):
u = self._convertFrom(proposed_encoding)
if u: break
self.unicode = u
if not u: self.originalEncoding = None
def _subMSChar(self, orig):
"""Changes a MS smart quote character to an XML or HTML
entity."""
sub = self.MS_CHARS.get(orig)
if isinstance(sub, tuple):
if self.smartQuotesTo == 'xml':
sub = '&#x%s;' % sub[1]
else:
sub = '&%s;' % sub[0]
return sub
def _convertFrom(self, proposed):
proposed = self.find_codec(proposed)
if not proposed or proposed in self.triedEncodings:
return None
self.triedEncodings.append(proposed)
markup = self.markup
# Convert smart quotes to HTML if coming from an encoding
# that might have them.
if self.smartQuotesTo and proposed.lower() in("windows-1252",
"iso-8859-1",
"iso-8859-2"):
markup = re.compile("([\x80-\x9f])").sub \
(lambda(x): self._subMSChar(x.group(1)),
markup)
try:
# print "Trying to convert document to %s" % proposed
u = self._toUnicode(markup, proposed)
self.markup = u
self.originalEncoding = proposed
except Exception, e:
# print "That didn't work!"
# print e
return None
#print "Correct encoding: %s" % proposed
return self.markup
def _toUnicode(self, data, encoding):
'''Given a string and its encoding, decodes the string into Unicode.
%encoding is a string recognized by encodings.aliases'''
# strip Byte Order Mark (if present)
if (len(data) >= 4) and (data[:2] == '\xfe\xff') \
and (data[2:4] != '\x00\x00'):
encoding = 'utf-16be'
data = data[2:]
elif (len(data) >= 4) and (data[:2] == '\xff\xfe') \
and (data[2:4] != '\x00\x00'):
encoding = 'utf-16le'
data = data[2:]
elif data[:3] == '\xef\xbb\xbf':
encoding = 'utf-8'
data = data[3:]
elif data[:4] == '\x00\x00\xfe\xff':
encoding = 'utf-32be'
data = data[4:]
elif data[:4] == '\xff\xfe\x00\x00':
encoding = 'utf-32le'
data = data[4:]
newdata = unicode(data, encoding)
return newdata
def _detectEncoding(self, xml_data, isHTML=False):
"""Given a document, tries to detect its XML encoding."""
xml_encoding = sniffed_xml_encoding = None
try:
if xml_data[:4] == '\x4c\x6f\xa7\x94':
# EBCDIC
xml_data = self._ebcdic_to_ascii(xml_data)
elif xml_data[:4] == '\x00\x3c\x00\x3f':
# UTF-16BE
sniffed_xml_encoding = 'utf-16be'
xml_data = unicode(xml_data, 'utf-16be').encode('utf-8')
elif (len(xml_data) >= 4) and (xml_data[:2] == '\xfe\xff') \
and (xml_data[2:4] != '\x00\x00'):
# UTF-16BE with BOM
sniffed_xml_encoding = 'utf-16be'
xml_data = unicode(xml_data[2:], 'utf-16be').encode('utf-8')
elif xml_data[:4] == '\x3c\x00\x3f\x00':
# UTF-16LE
sniffed_xml_encoding = 'utf-16le'
xml_data = unicode(xml_data, 'utf-16le').encode('utf-8')
elif (len(xml_data) >= 4) and (xml_data[:2] == '\xff\xfe') and \
(xml_data[2:4] != '\x00\x00'):
# UTF-16LE with BOM
sniffed_xml_encoding = 'utf-16le'
xml_data = unicode(xml_data[2:], 'utf-16le').encode('utf-8')
elif xml_data[:4] == '\x00\x00\x00\x3c':
# UTF-32BE
sniffed_xml_encoding = 'utf-32be'
xml_data = unicode(xml_data, 'utf-32be').encode('utf-8')
elif xml_data[:4] == '\x3c\x00\x00\x00':
# UTF-32LE
sniffed_xml_encoding = 'utf-32le'
xml_data = unicode(xml_data, 'utf-32le').encode('utf-8')
elif xml_data[:4] == '\x00\x00\xfe\xff':
# UTF-32BE with BOM
sniffed_xml_encoding = 'utf-32be'
xml_data = unicode(xml_data[4:], 'utf-32be').encode('utf-8')
elif xml_data[:4] == '\xff\xfe\x00\x00':
# UTF-32LE with BOM
sniffed_xml_encoding = 'utf-32le'
xml_data = unicode(xml_data[4:], 'utf-32le').encode('utf-8')
elif xml_data[:3] == '\xef\xbb\xbf':
# UTF-8 with BOM
sniffed_xml_encoding = 'utf-8'
xml_data = unicode(xml_data[3:], 'utf-8').encode('utf-8')
else:
sniffed_xml_encoding = 'ascii'
pass
except:
xml_encoding_match = None
xml_encoding_match = re.compile(
'^<\?.*encoding=[\'"](.*?)[\'"].*\?>').match(xml_data)
if not xml_encoding_match and isHTML:
regexp = re.compile('<\s*meta[^>]+charset=([^>]*?)[;\'">]', re.I)
xml_encoding_match = regexp.search(xml_data)
if xml_encoding_match is not None:
xml_encoding = xml_encoding_match.groups()[0].lower()
if isHTML:
self.declaredHTMLEncoding = xml_encoding
if sniffed_xml_encoding and \
(xml_encoding in ('iso-10646-ucs-2', 'ucs-2', 'csunicode',
'iso-10646-ucs-4', 'ucs-4', 'csucs4',
'utf-16', 'utf-32', 'utf_16', 'utf_32',
'utf16', 'u16')):
xml_encoding = sniffed_xml_encoding
return xml_data, xml_encoding, sniffed_xml_encoding
def find_codec(self, charset):
return self._codec(self.CHARSET_ALIASES.get(charset, charset)) \
or (charset and self._codec(charset.replace("-", ""))) \
or (charset and self._codec(charset.replace("-", "_"))) \
or charset
def _codec(self, charset):
if not charset: return charset
codec = None
try:
codecs.lookup(charset)
codec = charset
except (LookupError, ValueError):
pass
return codec
EBCDIC_TO_ASCII_MAP = None
def _ebcdic_to_ascii(self, s):
c = self.__class__
if not c.EBCDIC_TO_ASCII_MAP:
emap = (0,1,2,3,156,9,134,127,151,141,142,11,12,13,14,15,
16,17,18,19,157,133,8,135,24,25,146,143,28,29,30,31,
128,129,130,131,132,10,23,27,136,137,138,139,140,5,6,7,
144,145,22,147,148,149,150,4,152,153,154,155,20,21,158,26,
32,160,161,162,163,164,165,166,167,168,91,46,60,40,43,33,
38,169,170,171,172,173,174,175,176,177,93,36,42,41,59,94,
45,47,178,179,180,181,182,183,184,185,124,44,37,95,62,63,
186,187,188,189,190,191,192,193,194,96,58,35,64,39,61,34,
195,97,98,99,100,101,102,103,104,105,196,197,198,199,200,
201,202,106,107,108,109,110,111,112,113,114,203,204,205,
206,207,208,209,126,115,116,117,118,119,120,121,122,210,
211,212,213,214,215,216,217,218,219,220,221,222,223,224,
225,226,227,228,229,230,231,123,65,66,67,68,69,70,71,72,
73,232,233,234,235,236,237,125,74,75,76,77,78,79,80,81,
82,238,239,240,241,242,243,92,159,83,84,85,86,87,88,89,
90,244,245,246,247,248,249,48,49,50,51,52,53,54,55,56,57,
250,251,252,253,254,255)
import string
c.EBCDIC_TO_ASCII_MAP = string.maketrans( \
''.join(map(chr, range(256))), ''.join(map(chr, emap)))
return s.translate(c.EBCDIC_TO_ASCII_MAP)
MS_CHARS = { '\x80' : ('euro', '20AC'),
'\x81' : ' ',
'\x82' : ('sbquo', '201A'),
'\x83' : ('fnof', '192'),
'\x84' : ('bdquo', '201E'),
'\x85' : ('hellip', '2026'),
'\x86' : ('dagger', '2020'),
'\x87' : ('Dagger', '2021'),
'\x88' : ('circ', '2C6'),
'\x89' : ('permil', '2030'),
'\x8A' : ('Scaron', '160'),
'\x8B' : ('lsaquo', '2039'),
'\x8C' : ('OElig', '152'),
'\x8D' : '?',
'\x8E' : ('#x17D', '17D'),
'\x8F' : '?',
'\x90' : '?',
'\x91' : ('lsquo', '2018'),
'\x92' : ('rsquo', '2019'),
'\x93' : ('ldquo', '201C'),
'\x94' : ('rdquo', '201D'),
'\x95' : ('bull', '2022'),
'\x96' : ('ndash', '2013'),
'\x97' : ('mdash', '2014'),
'\x98' : ('tilde', '2DC'),
'\x99' : ('trade', '2122'),
'\x9a' : ('scaron', '161'),
'\x9b' : ('rsaquo', '203A'),
'\x9c' : ('oelig', '153'),
'\x9d' : '?',
'\x9e' : ('#x17E', '17E'),
'\x9f' : ('Yuml', ''),}
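# Illustrative usage sketch (added; not part of the original source):
#
#   dammit = UnicodeDammit('Sacr\xe9 bleu!')
#   dammit.unicode            # u'Sacr\xe9 bleu!'
#   dammit.originalEncoding   # e.g. 'windows-1252', depending on which
#                             # encodings were tried and whether chardet
#                             # is installed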
#######################################################################
#By default, act as an HTML pretty-printer.
if __name__ == '__main__':
import sys
soup = BeautifulSoup(sys.stdin)
print soup.prettify()
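# Example invocation (added; assumes this module is saved as BeautifulSoup.py):
#
#   $ python BeautifulSoup.py < page.html
#
# reads HTML from stdin and writes the pretty-printed parse tree to stdout.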
|
agpl-3.0
|
broferek/ansible
|
test/units/modules/network/check_point/test_cp_mgmt_host_facts.py
|
19
|
2820
|
# Ansible module to manage CheckPoint Firewall (c) 2019
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
#
from __future__ import absolute_import, division, print_function
__metaclass__ = type
import pytest
from units.modules.utils import set_module_args, exit_json, fail_json, AnsibleExitJson
from ansible.module_utils import basic
from ansible.modules.network.check_point import cp_mgmt_host_facts
OBJECT = {
"from": 1,
"to": 1,
"total": 6,
"objects": [
"53de74b7-8f19-4cbe-99fc-a81ef0759bad"
]
}
SHOW_PLURAL_PAYLOAD = {
'limit': 1,
'details_level': 'uid'
}
SHOW_SINGLE_PAYLOAD = {
'name': 'object_which_is_not_exist'
}
api_call_object = 'host'
api_call_object_plural_version = 'hosts'
failure_msg = '''{u'message': u'Requested object [object_which_is_not_exist] not found', u'code': u'generic_err_object_not_found'}'''
class TestCheckpointHostFacts(object):
module = cp_mgmt_host_facts
@pytest.fixture(autouse=True)
def module_mock(self, mocker):
return mocker.patch.multiple(basic.AnsibleModule, exit_json=exit_json, fail_json=fail_json)
@pytest.fixture
def connection_mock(self, mocker):
connection_class_mock = mocker.patch('ansible.module_utils.network.checkpoint.checkpoint.Connection')
return connection_class_mock.return_value
def test_show_single_object_which_is_not_exist(self, mocker, connection_mock):
connection_mock.send_request.return_value = (404, failure_msg)
try:
result = self._run_module(SHOW_SINGLE_PAYLOAD)
except Exception as e:
result = e.args[0]
assert result['failed']
assert 'Checkpoint device returned error 404 with message ' + failure_msg == result['msg']
def test_show_few_objects(self, mocker, connection_mock):
connection_mock.send_request.return_value = (200, OBJECT)
result = self._run_module(SHOW_PLURAL_PAYLOAD)
assert not result['changed']
assert OBJECT == result['ansible_facts'][api_call_object_plural_version]
def _run_module(self, module_args):
set_module_args(module_args)
with pytest.raises(AnsibleExitJson) as ex:
self.module.main()
return ex.value.args[0]
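# Note (added): this test module is driven by pytest from an Ansible source
# checkout, e.g.:
#
#   pytest test/units/modules/network/check_point/test_cp_mgmt_host_facts.py
#
# The fixtures above stub out AnsibleModule's exit/fail handlers and the
# Check Point HTTPAPI connection, so no live management server is needed.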
|
gpl-3.0
|
acutesoftware/AIKIF
|
aikif/.z_prototype/search.py
|
1
|
1453
|
# -*- coding: utf-8 -*-
import argparse
import aikif.config as cfg
def search(search_string):
"""
main function to search using indexes
"""
print('Searching for ' + search_string)
ndxFiles = cfg.params['index_files']
numResults = 0
totLines = 0
for fname in ndxFiles:
print("Searching " + fname)
with open(fname, 'r') as f:
line_num = 0
            for line_num, line in enumerate(f, 1):  # 1-based, so the count below is right
totLines = totLines + 1
if search_string in line:
try:
print(line) # gives error with some encoding
except Exception:
print("Cant print search result")
numResults = numResults + 1
print(str(line_num) + " lines searched")
print('Found ', str(numResults), 'results in', str(totLines), 'lines over', str(len(ndxFiles)), 'index files')
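# Illustrative sketch (added): search() only needs cfg.params['index_files']
# to point at plain-text index files; the file names below are hypothetical.
#
#   import aikif.config as cfg
#   cfg.params['index_files'] = ['ndxAll.txt', 'ndxFull.txt']
#   search('database')   # prints each matching line plus summary counts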
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Search.py looks in AIKIF index files for strings')
parser.add_argument('-s', '--search', help='enter a search string, enclosed with quotes if multiple words needed')
parser.add_argument('-i', '--index', help='choose an index file to search')
args = parser.parse_args()
    search(args.search.strip())
print("REMEMBER - call this with python otherwise it doesnt run\n python search.py -s database\n")
|
gpl-3.0
|
William-Hai/volatility
|
volatility/scan.py
|
44
|
9086
|
# Volatility
# Copyright (C) 2007-2013 Volatility Foundation
#
# Derived from source in PyFlag developed by:
# Copyright 2004: Commonwealth of Australia.
# Michael Cohen <[email protected]>
# David Collett <[email protected]>
#
# This file is part of Volatility.
#
# Volatility is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# Volatility is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Volatility. If not, see <http://www.gnu.org/licenses/>.
#
# Special thanks to Michael Cohen for ideas and comments!
#
#pylint: disable-msg=C0111
"""
@author: AAron Walters
@license: GNU General Public License 2.0
@contact: [email protected]
@organization: Volatility Foundation
"""
import volatility.debug as debug
import volatility.registry as registry
import volatility.addrspace as addrspace
import volatility.constants as constants
import volatility.conf as conf
########### Following is the new implementation of the scanning
########### framework. The old framework was based on PyFlag's
########### scanning framework which is probably too complex for this.
class BaseScanner(object):
""" A more thorough scanner which checks every byte """
checks = []
def __init__(self, window_size = 8):
self.buffer = addrspace.BufferAddressSpace(conf.DummyConfig(), data = '\x00' * 1024)
self.window_size = window_size
self.constraints = []
self.error_count = 0
def check_addr(self, found):
""" This calls all our constraints on the offset found and
returns the number of contraints that matched.
We shortcut the loop as soon as its obvious that there will
not be sufficient matches to fit the criteria. This allows for
an early exit and a speed boost.
"""
cnt = 0
for check in self.constraints:
## constraints can raise for an error
try:
val = check.check(found)
except Exception:
debug.b()
val = False
if not val:
cnt = cnt + 1
if cnt > self.error_count:
return False
return True
overlap = 20
def scan(self, address_space, offset = 0, maxlen = None):
self.buffer.profile = address_space.profile
current_offset = offset
## Build our constraints from the specified ScannerCheck
## classes:
self.constraints = []
for class_name, args in self.checks:
check = registry.get_plugin_classes(ScannerCheck)[class_name](self.buffer, **args)
self.constraints.append(check)
## Which checks also have skippers?
skippers = [ c for c in self.constraints if hasattr(c, "skip") ]
for (range_start, range_size) in sorted(address_space.get_available_addresses()):
# Jump to the next available point to scan from
# self.base_offset jumps up to be at least range_start
current_offset = max(range_start, current_offset)
range_end = range_start + range_size
# If we have a maximum length, we make sure it's less than the range_end
if maxlen:
range_end = min(range_end, offset + maxlen)
while (current_offset < range_end):
# We've now got range_start <= self.base_offset < range_end
# Figure out how much data to read
l = min(constants.SCAN_BLOCKSIZE + self.overlap, range_end - current_offset)
# Populate the buffer with data
# We use zread to scan what we can because there are often invalid
# pages in the DTB
data = address_space.zread(current_offset, l)
self.buffer.assign_buffer(data, current_offset)
## Run checks throughout this block of data
i = 0
while i < l:
if self.check_addr(i + current_offset):
## yield the offset to the start of the memory
## (after the pool tag)
yield i + current_offset
## Where should we go next? By default we go 1 byte
## ahead, but if some of the checkers have skippers,
## we may actually go much farther. Checkers with
## skippers basically tell us that there is no way
## they can match anything before the skipped result,
## so there is no point in trying them on all the data
## in between. This optimization is useful to really
## speed things up. FIXME - currently skippers assume
## that the check must match, therefore we can skip
## the unmatchable region, but it's possible that a
## scanner needs to match only some checkers.
skip = 1
for s in skippers:
skip = max(skip, s.skip(data, i))
i += skip
current_offset += min(constants.SCAN_BLOCKSIZE, l)
class DiscontigScanner(BaseScanner):
def scan(self, address_space, offset = 0, maxlen = None):
debug.warning("DiscontigScanner has been deprecated, all functionality is now contained in BaseScanner")
for match in BaseScanner.scan(self, address_space, offset, maxlen):
yield match
class ScannerCheck(object):
""" A scanner check is a special class which is invoked on an AS to check for a specific condition.
The main method is def check(self, offset):
This will return True if the condition is true or False otherwise.
This class is the base class for all checks.
"""
def __init__(self, address_space, **_kwargs):
self.address_space = address_space
def object_offset(self, offset, address_space):
return offset
def check(self, _offset):
return False
## If you want to speed up the scanning define this method - it
## will be used to skip the data which is obviously not going to
## match. You will need to return the number of bytes from offset
## to skip to. We take the maximum number of bytes to guarantee
## that all checks have a chance of passing.
#def skip(self, data, offset):
# return -1
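## An illustrative sketch (not part of Volatility itself): a ScannerCheck
## subclass with a skip() method, assuming a hypothetical fixed byte
## signature. skip() jumps straight to the next possible match, or past
## the rest of the buffer if none exists:
##
## class ExampleSignatureCheck(ScannerCheck):
##     def __init__(self, address_space, signature = "MZ", **kwargs):
##         ScannerCheck.__init__(self, address_space, **kwargs)
##         self.signature = signature
##     def check(self, offset):
##         return self.address_space.read(offset, len(self.signature)) == self.signature
##     def skip(self, data, offset):
##         dindex = data.find(self.signature, offset + 1)
##         if dindex > -1:
##             return dindex - offset
##         return len(data) - offset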
class PoolScanner(BaseScanner):
def object_offset(self, found, address_space):
"""
The name of this function "object_offset" can be misleading depending
on how it's used. Even before removing the preambles (r1324), it may not
always return the offset of an object. Here are the rules:
If you subclass PoolScanner and do not override this function, it
will return the offset of _POOL_HEADER. If you do override this function,
it should be used to calculate and return the offset of your desired
object within the pool. Thus there are two different ways it can be done.
Example 1.
For an example of subclassing PoolScanner and not overriding this function,
see filescan.PoolScanFile. In this case, the plugin (filescan.FileScan)
treats the offset returned by this function as the start of _POOL_HEADER
and then works out the object from the bottom up:
for offset in PoolScanFile().scan(address_space):
pool_obj = obj.Object("_POOL_HEADER", vm = address_space,
offset = offset)
##
## Work out objects base here
##
Example 2.
For an example of subclassing PoolScanner and overriding this function,
see filescan.PoolScanProcess. In this case, the "work" described above is
done here (in the subclassed object_offset). Thus in the plugin (filescan.PSScan)
it can directly instantiate _EPROCESS from the offset we return.
for offset in PoolScanProcess().scan(address_space):
eprocess = obj.Object('_EPROCESS', vm = address_space,
native_vm = kernel_as, offset = offset)
"""
## Subtract the offset of the PoolTag member to get the start
## of _POOL_HEADER. This is done because PoolScanners search
## for the PoolTag.
return found - self.buffer.profile.get_obj_offset('_POOL_HEADER', 'PoolTag')
def scan(self, address_space, offset = 0, maxlen = None):
for i in BaseScanner.scan(self, address_space, offset, maxlen):
yield self.object_offset(i, address_space)
|
gpl-2.0
|
magdamagda/geny-chorobowe
|
geny_chorobowe/find_disease_genes/migrations/0010_auto_20160106_0951.py
|
1
|
1184
|
# -*- coding: utf-8 -*-
# Generated by Django 1.9.1 on 2016-01-06 09:51
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('find_disease_genes', '0009_auto_20160105_2120'),
]
operations = [
migrations.CreateModel(
name='MedgenConcept',
fields=[
('ConceptID', models.CharField(max_length=10, primary_key=True, serialize=False)),
('Name', models.CharField(max_length=200)),
('Def', models.CharField(max_length=300)),
('Source', models.CharField(max_length=50)),
('RelatedConcepts', models.ManyToManyField(blank=True, null=True, related_name='_medgenconcept_RelatedConcepts_+', to='find_disease_genes.MedgenConcept')),
],
),
migrations.RenameField(
model_name='clinvardisease',
old_name='SourceID',
new_name='Source',
),
migrations.AlterField(
model_name='clinvarsource',
name='SourceName',
field=models.CharField(max_length=50),
),
]
|
gpl-2.0
|
tanyunshi/python-docx
|
features/steps/table.py
|
6
|
12947
|
# encoding: utf-8
"""
Step implementations for table-related features
"""
from __future__ import (
absolute_import, division, print_function, unicode_literals
)
from behave import given, then, when
from docx import Document
from docx.enum.table import WD_TABLE_ALIGNMENT, WD_TABLE_DIRECTION
from docx.shared import Inches
from docx.table import _Column, _Columns, _Row, _Rows
from helpers import test_docx
# given ===================================================
@given('a 2 x 2 table')
def given_a_2x2_table(context):
context.table_ = Document().add_table(rows=2, cols=2)
@given('a 3x3 table having {span_state}')
def given_a_3x3_table_having_span_state(context, span_state):
table_idx = {
'only uniform cells': 0,
'a horizontal span': 1,
'a vertical span': 2,
'a combined span': 3,
}[span_state]
document = Document(test_docx('tbl-cell-access'))
context.table_ = document.tables[table_idx]
@given('a column collection having two columns')
def given_a_column_collection_having_two_columns(context):
docx_path = test_docx('blk-containing-table')
document = Document(docx_path)
context.columns = document.tables[0].columns
@given('a row collection having two rows')
def given_a_row_collection_having_two_rows(context):
docx_path = test_docx('blk-containing-table')
document = Document(docx_path)
context.rows = document.tables[0].rows
@given('a table')
def given_a_table(context):
context.table_ = Document().add_table(rows=2, cols=2)
@given('a table cell having a width of {width}')
def given_a_table_cell_having_a_width_of_width(context, width):
table_idx = {'no explicit setting': 0, '1 inch': 1, '2 inches': 2}[width]
document = Document(test_docx('tbl-props'))
table = document.tables[table_idx]
cell = table.cell(0, 0)
context.cell = cell
@given('a table column having a width of {width_desc}')
def given_a_table_having_a_width_of_width_desc(context, width_desc):
col_idx = {
'no explicit setting': 0,
'1440': 1,
}[width_desc]
docx_path = test_docx('tbl-col-props')
document = Document(docx_path)
context.column = document.tables[0].columns[col_idx]
@given('a table having {alignment} alignment')
def given_a_table_having_alignment_alignment(context, alignment):
table_idx = {
'inherited': 3,
'left': 4,
'right': 5,
'center': 6,
}[alignment]
docx_path = test_docx('tbl-props')
document = Document(docx_path)
context.table_ = document.tables[table_idx]
@given('a table having an autofit layout of {autofit}')
def given_a_table_having_an_autofit_layout_of_autofit(context, autofit):
tbl_idx = {
'no explicit setting': 0,
'autofit': 1,
'fixed': 2,
}[autofit]
document = Document(test_docx('tbl-props'))
context.table_ = document.tables[tbl_idx]
@given('a table having {style} style')
def given_a_table_having_style(context, style):
table_idx = {
'no explicit': 0,
'Table Grid': 1,
'Light Shading - Accent 1': 2,
}[style]
document = Document(test_docx('tbl-having-applied-style'))
context.document = document
context.table_ = document.tables[table_idx]
@given('a table having table direction set {setting}')
def given_a_table_having_table_direction_setting(context, setting):
table_idx = [
'to inherit',
'right-to-left',
'left-to-right'
].index(setting)
document = Document(test_docx('tbl-on-off-props'))
context.table_ = document.tables[table_idx]
@given('a table having two columns')
def given_a_table_having_two_columns(context):
docx_path = test_docx('blk-containing-table')
document = Document(docx_path)
# context.table is used internally by behave, underscore added
# to distinguish this one
context.table_ = document.tables[0]
@given('a table having two rows')
def given_a_table_having_two_rows(context):
docx_path = test_docx('blk-containing-table')
document = Document(docx_path)
context.table_ = document.tables[0]
# when =====================================================
@when('I add a 1.0 inch column to the table')
def when_I_add_a_1_inch_column_to_table(context):
context.column = context.table_.add_column(Inches(1.0))
@when('I add a row to the table')
def when_add_row_to_table(context):
table = context.table_
context.row = table.add_row()
@when('I assign {value_str} to table.alignment')
def when_I_assign_value_to_table_alignment(context, value_str):
value = {
'None': None,
'WD_TABLE_ALIGNMENT.LEFT': WD_TABLE_ALIGNMENT.LEFT,
'WD_TABLE_ALIGNMENT.RIGHT': WD_TABLE_ALIGNMENT.RIGHT,
'WD_TABLE_ALIGNMENT.CENTER': WD_TABLE_ALIGNMENT.CENTER,
}[value_str]
table = context.table_
table.alignment = value
@when('I assign {value} to table.style')
def when_apply_value_to_table_style(context, value):
table, styles = context.table_, context.document.styles
if value == 'None':
new_value = None
elif value.startswith('styles['):
new_value = styles[value.split('\'')[1]]
else:
new_value = styles[value]
table.style = new_value
@when('I assign {value} to table.table_direction')
def when_assign_value_to_table_table_direction(context, value):
new_value = (
None if value == 'None' else getattr(WD_TABLE_DIRECTION, value)
)
context.table_.table_direction = new_value
@when('I merge from cell {origin} to cell {other}')
def when_I_merge_from_cell_origin_to_cell_other(context, origin, other):
def cell(table, idx):
row, col = idx // 3, idx % 3
return table.cell(row, col)
a_idx, b_idx = int(origin) - 1, int(other) - 1
table = context.table_
a, b = cell(table, a_idx), cell(table, b_idx)
a.merge(b)
@when('I set the cell width to {width}')
def when_I_set_the_cell_width_to_width(context, width):
new_value = {'1 inch': Inches(1)}[width]
context.cell.width = new_value
@when('I set the column width to {width_emu}')
def when_I_set_the_column_width_to_width_emu(context, width_emu):
new_value = None if width_emu == 'None' else int(width_emu)
context.column.width = new_value
@when('I set the table autofit to {setting}')
def when_I_set_the_table_autofit_to_setting(context, setting):
new_value = {'autofit': True, 'fixed': False}[setting]
table = context.table_
table.autofit = new_value
# then =====================================================
@then('I can access a collection column by index')
def then_can_access_collection_column_by_index(context):
columns = context.columns
for idx in range(2):
column = columns[idx]
assert isinstance(column, _Column)
@then('I can access a collection row by index')
def then_can_access_collection_row_by_index(context):
rows = context.rows
for idx in range(2):
row = rows[idx]
assert isinstance(row, _Row)
@then('I can access the column collection of the table')
def then_can_access_column_collection_of_table(context):
table = context.table_
columns = table.columns
assert isinstance(columns, _Columns)
@then('I can access the row collection of the table')
def then_can_access_row_collection_of_table(context):
table = context.table_
rows = table.rows
assert isinstance(rows, _Rows)
@then('I can iterate over the column collection')
def then_can_iterate_over_column_collection(context):
columns = context.columns
actual_count = 0
for column in columns:
actual_count += 1
assert isinstance(column, _Column)
assert actual_count == 2
@then('I can iterate over the row collection')
def then_can_iterate_over_row_collection(context):
rows = context.rows
actual_count = 0
for row in rows:
actual_count += 1
assert isinstance(row, _Row)
assert actual_count == 2
@then('table.alignment is {value_str}')
def then_table_alignment_is_value(context, value_str):
value = {
'None': None,
'WD_TABLE_ALIGNMENT.LEFT': WD_TABLE_ALIGNMENT.LEFT,
'WD_TABLE_ALIGNMENT.RIGHT': WD_TABLE_ALIGNMENT.RIGHT,
'WD_TABLE_ALIGNMENT.CENTER': WD_TABLE_ALIGNMENT.CENTER,
}[value_str]
table = context.table_
assert table.alignment == value, 'got %s' % table.alignment
@then('table.cell({row}, {col}).text is {expected_text}')
def then_table_cell_row_col_text_is_text(context, row, col, expected_text):
table = context.table_
row_idx, col_idx = int(row), int(col)
cell_text = table.cell(row_idx, col_idx).text
assert cell_text == expected_text, 'got %s' % cell_text
@then('table.style is styles[\'{style_name}\']')
def then_table_style_is_styles_style_name(context, style_name):
table, styles = context.table_, context.document.styles
expected_style = styles[style_name]
assert table.style == expected_style, "got '%s'" % table.style
@then('table.table_direction is {value}')
def then_table_table_direction_is_value(context, value):
expected_value = (
None if value == 'None' else getattr(WD_TABLE_DIRECTION, value)
)
actual_value = context.table_.table_direction
assert actual_value == expected_value, "got '%s'" % actual_value
@then('the column cells text is {expected_text}')
def then_the_column_cells_text_is_expected_text(context, expected_text):
table = context.table_
cells_text = ' '.join(c.text for col in table.columns for c in col.cells)
assert cells_text == expected_text, 'got %s' % cells_text
@then('the length of the column collection is 2')
def then_len_of_column_collection_is_2(context):
columns = context.table_.columns
assert len(columns) == 2
@then('the length of the row collection is 2')
def then_len_of_row_collection_is_2(context):
rows = context.table_.rows
assert len(rows) == 2
@then('the new column has 2 cells')
def then_new_column_has_2_cells(context):
assert len(context.column.cells) == 2
@then('the new column is 1.0 inches wide')
def then_new_column_is_1_inches_wide(context):
assert context.column.width == Inches(1)
@then('the new row has 2 cells')
def then_new_row_has_2_cells(context):
assert len(context.row.cells) == 2
@then('the reported autofit setting is {autofit}')
def then_the_reported_autofit_setting_is_autofit(context, autofit):
expected_value = {'autofit': True, 'fixed': False}[autofit]
table = context.table_
assert table.autofit is expected_value
@then('the reported column width is {width_emu}')
def then_the_reported_column_width_is_width_emu(context, width_emu):
expected_value = None if width_emu == 'None' else int(width_emu)
assert context.column.width == expected_value, (
'got %s' % context.column.width
)
@then('the reported width of the cell is {width}')
def then_the_reported_width_of_the_cell_is_width(context, width):
expected_width = {'None': None, '1 inch': Inches(1)}[width]
actual_width = context.cell.width
assert actual_width == expected_width, (
'expected %s, got %s' % (expected_width, actual_width)
)
@then('the row cells text is {encoded_text}')
def then_the_row_cells_text_is_expected_text(context, encoded_text):
expected_text = encoded_text.replace('\\', '\n')
table = context.table_
cells_text = ' '.join(c.text for row in table.rows for c in row.cells)
assert cells_text == expected_text, 'got %s' % cells_text
@then('the table has {count} columns')
def then_table_has_count_columns(context, count):
column_count = int(count)
columns = context.table_.columns
assert len(columns) == column_count
@then('the table has {count} rows')
def then_table_has_count_rows(context, count):
row_count = int(count)
rows = context.table_.rows
assert len(rows) == row_count
@then('the width of cell {n_str} is {inches_str} inches')
def then_the_width_of_cell_n_is_x_inches(context, n_str, inches_str):
def _cell(table, idx):
row, col = idx // 3, idx % 3
return table.cell(row, col)
idx, inches = int(n_str) - 1, float(inches_str)
cell = _cell(context.table_, idx)
assert cell.width == Inches(inches), 'got %s' % cell.width.inches
@then('the width of each cell is {inches} inches')
def then_the_width_of_each_cell_is_inches(context, inches):
table = context.table_
expected_width = Inches(float(inches))
for cell in table._cells:
assert cell.width == expected_width, 'got %s' % cell.width.inches
@then('the width of each column is {inches} inches')
def then_the_width_of_each_column_is_inches(context, inches):
table = context.table_
expected_width = Inches(float(inches))
for column in table.columns:
assert column.width == expected_width, 'got %s' % column.width.inches
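# Illustrative pairing (hypothetical feature text, not taken from the repo):
# the step implementations above match Gherkin scenarios such as
#     Given a 2 x 2 table
#     When I add a row to the table
#     Then the table has 3 rows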
|
mit
|
nop33/indico
|
indico/modules/events/registration/placeholders/invitations.py
|
2
|
1675
|
# This file is part of Indico.
# Copyright (C) 2002 - 2017 European Organization for Nuclear Research (CERN).
#
# Indico is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; either version 3 of the
# License, or (at your option) any later version.
#
# Indico is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Indico; if not, see <http://www.gnu.org/licenses/>.
from __future__ import unicode_literals
from markupsafe import Markup
from indico.util.i18n import _
from indico.util.placeholders import Placeholder
from indico.web.flask.util import url_for
class FirstNamePlaceholder(Placeholder):
name = 'first_name'
description = _("First name of the person")
@classmethod
def render(cls, invitation):
return invitation.first_name
class LastNamePlaceholder(Placeholder):
name = 'last_name'
description = _("Last name of the person")
@classmethod
def render(cls, invitation):
return invitation.last_name
class InvitationLinkPlaceholder(Placeholder):
name = 'invitation_link'
description = _("Link to accept/decline the invitation")
required = True
@classmethod
def render(cls, invitation):
url = url_for('.display_regform', invitation.locator.uuid, _external=True)
return Markup('<a href="{0}">{0}</a>'.format(url))
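# Usage sketch (illustrative only): Indico's placeholder machinery expands a
# template such as "Dear {first_name} {last_name}, reply here: {invitation_link}"
# by calling the render() classmethods above with the invitation object.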
|
gpl-3.0
|
Taapat/enigma2-openpli-vuplus
|
lib/python/Components/Sources/Boolean.py
|
162
|
1264
|
from Source import Source
from Components.Element import cached
from enigma import eTimer
# a small warning:
# you can use this Boolean to express screen-private
# conditional expressions.
#
# however, if you think there is ANY chance that another
# screen could use your expression, please put your calculation
# into a separate Source providing a "boolean" property.
class Boolean(Source, object):
def __init__(self, fixed = False, function = None, destroy = None, poll = 0):
Source.__init__(self)
self.function = function
self.fixed = fixed
self.post_destroy = destroy
if poll > 0:
self.poll_timer = eTimer()
self.poll_timer.callback.append(self.poll)
self.poll_timer.start(poll)
else:
self.poll_timer = None
@cached
def getBoolean(self):
if self.function is not None:
return self.function()
else:
return self.fixed
def setBoolean(self, value):
assert self.function is None
self.fixed = value
self.poll()
boolean = property(getBoolean, setBoolean)
def poll(self):
self.changed((self.CHANGED_ALL,))
def destroy(self):
if self.poll_timer:
self.poll_timer.callback.remove(self.poll)
if self.post_destroy is not None:
self.fixed = self.post_destroy
self.poll()
Source.destroy(self)
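# Usage sketch (illustrative; assumes an enigma2 screen and a hypothetical
# is_recording() helper). Poll a condition once per second:
#
#     self["recording"] = Boolean(function=is_recording, poll=1000)
#
# Without a function, the source acts as a settable flag:
#
#     flag = Boolean(fixed=True)
#     flag.boolean = False  # setBoolean() stores the value and fires changed()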
|
gpl-2.0
|
getnikola/plugins
|
v7/link_figure/link_figure.py
|
2
|
4919
|
# -*- coding: utf-8 -*-
# Copyright © 2014 Ivan Teoh and others.
# Permission is hereby granted, free of charge, to any
# person obtaining a copy of this software and associated
# documentation files (the "Software"), to deal in the
# Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of the
# Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice
# shall be included in all copies or substantial portions of
# the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY
# KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
# WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR
# PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS
# OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR
# OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
from docutils import nodes
from docutils.parsers.rst import Directive, directives
from nikola.plugin_categories import RestExtension
class Plugin(RestExtension):
name = "link_figure"
def set_site(self, site):
self.site = site
directives.register_directive('link_figure', LinkFigure)
return super(Plugin, self).set_site(site)
CODE_URL_BASIC = (u"""<a class="{classes}"
href="{url}"
title="{description}">
{title}
</a>""")
CODE_IMAGE = (u"""<div class="link-figure-media">
<a class="link-figure-image" href="{url}" target="_blank">
<img src="{image_url}" alt="{title}" />
</a>
</div>""")
CODE_DESCRIPTION = (u"""<p class="link-figure-description">
{description}
</p>""")
CODE_AUTHOR = (u"""<p class="link-figure-author">
{author_by}{author}
</p>""")
CODE_AUTHOR_URL = (u"""<p class="link-figure-author">
{author_by}<a href="{author_url}" target="_blank">
{author}
</a></p>""")
CODE_URL = (u"""<div class="link-figure-content">
<a class="link-figure-title" href="{url}" target="_blank">{title}</a>
{description}
{author}
</div>""")
CODE = (u"""<div class="{classes}">
{image_url}
{url}
</div>""")
class LinkFigure(Directive):
""" Restructured text extension for inserting link figure
Usage:
.. link_figure:: url
:title: url title
:description: url description
:class: class name
:image_url: url image
:author: url domain or author
:author_url: author url
:author_by: author by symbol
"""
has_content = False
required_arguments = 1
optional_arguments = 0
final_argument_whitespace = True
option_spec = {
'title': directives.unchanged,
'description': directives.unchanged,
'class': directives.unchanged,
'image_url': directives.unchanged,
'author': directives.unchanged,
'author_url': directives.unchanged,
'author_by': directives.unchanged,
}
def run(self):
""" Required by the Directive interface. Create docutils nodes """
options = {
'url': self.arguments[0],
'title': self.options.get('title', ''),
'description': self.options.get('description', ''),
'classes': self.options.get('class', ''),
'image_url': self.options.get('image_url', ''),
'author': self.options.get('author', ''),
'author_url': self.options.get('author_url', ''),
'author_by': self.options.get('author_by', ''),
}
if not options['title']:
if options['url'].endswith('/'):
options['title'] = options['url'][:-1]
else:
# fall back to the full URL so links without a trailing slash still get a title
options['title'] = options['url']
options['title'] = options['title'].split('/')[-1]
options['title'] = options['title'].split('?')[0]
if not options['description']:
options['description'] = options['title']
return [nodes.raw('', CODE_URL_BASIC.format(**options), format='html')]
if options['image_url']:
options['image_url'] = CODE_IMAGE.format(**options)
if options['author_by']:
options['author_by'] = options['author_by'].strip() + ' '
if options['author'] and options['author_url']:
options['author'] = CODE_AUTHOR_URL.format(**options)
elif options['author']:
options['author'] = CODE_AUTHOR.format(**options)
if options['description']:
options['description'] = CODE_DESCRIPTION.format(**options)
options['url'] = CODE_URL.format(**options)
return [nodes.raw('', CODE.format(**options), format='html')]
def assert_has_content(self):
""" LinkFigure has no content, override check from superclass """
pass
|
mit
|
Astoriane/jCrunchyroll
|
lib/crunchy-xml-decoder/unidecode/x0cb.py
|
253
|
5012
|
data = (
'jjwaels', # 0x00
'jjwaelt', # 0x01
'jjwaelp', # 0x02
'jjwaelh', # 0x03
'jjwaem', # 0x04
'jjwaeb', # 0x05
'jjwaebs', # 0x06
'jjwaes', # 0x07
'jjwaess', # 0x08
'jjwaeng', # 0x09
'jjwaej', # 0x0a
'jjwaec', # 0x0b
'jjwaek', # 0x0c
'jjwaet', # 0x0d
'jjwaep', # 0x0e
'jjwaeh', # 0x0f
'jjoe', # 0x10
'jjoeg', # 0x11
'jjoegg', # 0x12
'jjoegs', # 0x13
'jjoen', # 0x14
'jjoenj', # 0x15
'jjoenh', # 0x16
'jjoed', # 0x17
'jjoel', # 0x18
'jjoelg', # 0x19
'jjoelm', # 0x1a
'jjoelb', # 0x1b
'jjoels', # 0x1c
'jjoelt', # 0x1d
'jjoelp', # 0x1e
'jjoelh', # 0x1f
'jjoem', # 0x20
'jjoeb', # 0x21
'jjoebs', # 0x22
'jjoes', # 0x23
'jjoess', # 0x24
'jjoeng', # 0x25
'jjoej', # 0x26
'jjoec', # 0x27
'jjoek', # 0x28
'jjoet', # 0x29
'jjoep', # 0x2a
'jjoeh', # 0x2b
'jjyo', # 0x2c
'jjyog', # 0x2d
'jjyogg', # 0x2e
'jjyogs', # 0x2f
'jjyon', # 0x30
'jjyonj', # 0x31
'jjyonh', # 0x32
'jjyod', # 0x33
'jjyol', # 0x34
'jjyolg', # 0x35
'jjyolm', # 0x36
'jjyolb', # 0x37
'jjyols', # 0x38
'jjyolt', # 0x39
'jjyolp', # 0x3a
'jjyolh', # 0x3b
'jjyom', # 0x3c
'jjyob', # 0x3d
'jjyobs', # 0x3e
'jjyos', # 0x3f
'jjyoss', # 0x40
'jjyong', # 0x41
'jjyoj', # 0x42
'jjyoc', # 0x43
'jjyok', # 0x44
'jjyot', # 0x45
'jjyop', # 0x46
'jjyoh', # 0x47
'jju', # 0x48
'jjug', # 0x49
'jjugg', # 0x4a
'jjugs', # 0x4b
'jjun', # 0x4c
'jjunj', # 0x4d
'jjunh', # 0x4e
'jjud', # 0x4f
'jjul', # 0x50
'jjulg', # 0x51
'jjulm', # 0x52
'jjulb', # 0x53
'jjuls', # 0x54
'jjult', # 0x55
'jjulp', # 0x56
'jjulh', # 0x57
'jjum', # 0x58
'jjub', # 0x59
'jjubs', # 0x5a
'jjus', # 0x5b
'jjuss', # 0x5c
'jjung', # 0x5d
'jjuj', # 0x5e
'jjuc', # 0x5f
'jjuk', # 0x60
'jjut', # 0x61
'jjup', # 0x62
'jjuh', # 0x63
'jjweo', # 0x64
'jjweog', # 0x65
'jjweogg', # 0x66
'jjweogs', # 0x67
'jjweon', # 0x68
'jjweonj', # 0x69
'jjweonh', # 0x6a
'jjweod', # 0x6b
'jjweol', # 0x6c
'jjweolg', # 0x6d
'jjweolm', # 0x6e
'jjweolb', # 0x6f
'jjweols', # 0x70
'jjweolt', # 0x71
'jjweolp', # 0x72
'jjweolh', # 0x73
'jjweom', # 0x74
'jjweob', # 0x75
'jjweobs', # 0x76
'jjweos', # 0x77
'jjweoss', # 0x78
'jjweong', # 0x79
'jjweoj', # 0x7a
'jjweoc', # 0x7b
'jjweok', # 0x7c
'jjweot', # 0x7d
'jjweop', # 0x7e
'jjweoh', # 0x7f
'jjwe', # 0x80
'jjweg', # 0x81
'jjwegg', # 0x82
'jjwegs', # 0x83
'jjwen', # 0x84
'jjwenj', # 0x85
'jjwenh', # 0x86
'jjwed', # 0x87
'jjwel', # 0x88
'jjwelg', # 0x89
'jjwelm', # 0x8a
'jjwelb', # 0x8b
'jjwels', # 0x8c
'jjwelt', # 0x8d
'jjwelp', # 0x8e
'jjwelh', # 0x8f
'jjwem', # 0x90
'jjweb', # 0x91
'jjwebs', # 0x92
'jjwes', # 0x93
'jjwess', # 0x94
'jjweng', # 0x95
'jjwej', # 0x96
'jjwec', # 0x97
'jjwek', # 0x98
'jjwet', # 0x99
'jjwep', # 0x9a
'jjweh', # 0x9b
'jjwi', # 0x9c
'jjwig', # 0x9d
'jjwigg', # 0x9e
'jjwigs', # 0x9f
'jjwin', # 0xa0
'jjwinj', # 0xa1
'jjwinh', # 0xa2
'jjwid', # 0xa3
'jjwil', # 0xa4
'jjwilg', # 0xa5
'jjwilm', # 0xa6
'jjwilb', # 0xa7
'jjwils', # 0xa8
'jjwilt', # 0xa9
'jjwilp', # 0xaa
'jjwilh', # 0xab
'jjwim', # 0xac
'jjwib', # 0xad
'jjwibs', # 0xae
'jjwis', # 0xaf
'jjwiss', # 0xb0
'jjwing', # 0xb1
'jjwij', # 0xb2
'jjwic', # 0xb3
'jjwik', # 0xb4
'jjwit', # 0xb5
'jjwip', # 0xb6
'jjwih', # 0xb7
'jjyu', # 0xb8
'jjyug', # 0xb9
'jjyugg', # 0xba
'jjyugs', # 0xbb
'jjyun', # 0xbc
'jjyunj', # 0xbd
'jjyunh', # 0xbe
'jjyud', # 0xbf
'jjyul', # 0xc0
'jjyulg', # 0xc1
'jjyulm', # 0xc2
'jjyulb', # 0xc3
'jjyuls', # 0xc4
'jjyult', # 0xc5
'jjyulp', # 0xc6
'jjyulh', # 0xc7
'jjyum', # 0xc8
'jjyub', # 0xc9
'jjyubs', # 0xca
'jjyus', # 0xcb
'jjyuss', # 0xcc
'jjyung', # 0xcd
'jjyuj', # 0xce
'jjyuc', # 0xcf
'jjyuk', # 0xd0
'jjyut', # 0xd1
'jjyup', # 0xd2
'jjyuh', # 0xd3
'jjeu', # 0xd4
'jjeug', # 0xd5
'jjeugg', # 0xd6
'jjeugs', # 0xd7
'jjeun', # 0xd8
'jjeunj', # 0xd9
'jjeunh', # 0xda
'jjeud', # 0xdb
'jjeul', # 0xdc
'jjeulg', # 0xdd
'jjeulm', # 0xde
'jjeulb', # 0xdf
'jjeuls', # 0xe0
'jjeult', # 0xe1
'jjeulp', # 0xe2
'jjeulh', # 0xe3
'jjeum', # 0xe4
'jjeub', # 0xe5
'jjeubs', # 0xe6
'jjeus', # 0xe7
'jjeuss', # 0xe8
'jjeung', # 0xe9
'jjeuj', # 0xea
'jjeuc', # 0xeb
'jjeuk', # 0xec
'jjeut', # 0xed
'jjeup', # 0xee
'jjeuh', # 0xef
'jjyi', # 0xf0
'jjyig', # 0xf1
'jjyigg', # 0xf2
'jjyigs', # 0xf3
'jjyin', # 0xf4
'jjyinj', # 0xf5
'jjyinh', # 0xf6
'jjyid', # 0xf7
'jjyil', # 0xf8
'jjyilg', # 0xf9
'jjyilm', # 0xfa
'jjyilb', # 0xfb
'jjyils', # 0xfc
'jjyilt', # 0xfd
'jjyilp', # 0xfe
'jjyilh', # 0xff
)
|
lgpl-3.0
|
abhiii5459/sympy
|
sympy/parsing/sympy_parser.py
|
43
|
30008
|
"""Transform a string with Python-like source code into SymPy expression. """
from __future__ import print_function, division
from .sympy_tokenize import \
generate_tokens, untokenize, TokenError, \
NUMBER, STRING, NAME, OP, ENDMARKER
from keyword import iskeyword
import ast
import re
import unicodedata
import sympy
from sympy.core.compatibility import exec_, StringIO
from sympy.core.basic import Basic
_re_repeated = re.compile(r"^(\d*)\.(\d*)\[(\d+)\]$")
def _token_splittable(token):
"""
Predicate for whether a token name can be split into multiple tokens.
A token is splittable if it does not contain an underscore character and
it is not the name of a Greek letter. This is used to implicitly convert
expressions like 'xyz' into 'x*y*z'.
"""
if '_' in token:
return False
else:
try:
return not unicodedata.lookup('GREEK SMALL LETTER ' + token)
except KeyError:
pass
if len(token) > 1:
return True
return False
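# For illustration (not part of the module): under this predicate,
# _token_splittable('xyz') is True, while _token_splittable('alpha')
# (a Greek letter name) and _token_splittable('x_1') (contains an
# underscore) are both False.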
def _token_callable(token, local_dict, global_dict, nextToken=None):
"""
Predicate for whether a token name represents a callable function.
Essentially wraps ``callable``, but looks up the token name in the
locals and globals.
"""
func = local_dict.get(token[1])
if not func:
func = global_dict.get(token[1])
return callable(func) and not isinstance(func, sympy.Symbol)
def _add_factorial_tokens(name, result):
if result == [] or result[-1][1] == '(':
raise TokenError()
beginning = [(NAME, name), (OP, '(')]
end = [(OP, ')')]
diff = 0
length = len(result)
for index, token in enumerate(result[::-1]):
toknum, tokval = token
i = length - index - 1
if tokval == ')':
diff += 1
elif tokval == '(':
diff -= 1
if diff == 0:
if i - 1 >= 0 and result[i - 1][0] == NAME:
return result[:i - 1] + beginning + result[i - 1:] + end
else:
return result[:i] + beginning + result[i:] + end
return result
class AppliedFunction(object):
"""
A group of tokens representing a function and its arguments.
`exponent` is for handling the shorthand sin^2, ln^2, etc.
"""
def __init__(self, function, args, exponent=None):
if exponent is None:
exponent = []
self.function = function
self.args = args
self.exponent = exponent
self.items = ['function', 'args', 'exponent']
def expand(self):
"""Return a list of tokens representing the function"""
result = []
result.append(self.function)
result.extend(self.args)
return result
def __getitem__(self, index):
return getattr(self, self.items[index])
def __repr__(self):
return "AppliedFunction(%s, %s, %s)" % (self.function, self.args,
self.exponent)
class ParenthesisGroup(list):
"""List of tokens representing an expression in parentheses."""
pass
def _flatten(result):
result2 = []
for tok in result:
if isinstance(tok, AppliedFunction):
result2.extend(tok.expand())
else:
result2.append(tok)
return result2
def _group_parentheses(recursor):
def _inner(tokens, local_dict, global_dict):
"""Group tokens between parentheses with ParenthesisGroup.
Also processes those tokens recursively.
"""
result = []
stacks = []
stacklevel = 0
for token in tokens:
if token[0] == OP:
if token[1] == '(':
stacks.append(ParenthesisGroup([]))
stacklevel += 1
elif token[1] == ')':
stacks[-1].append(token)
stack = stacks.pop()
if len(stacks) > 0:
# We don't recurse here since the upper-level stack
# would reprocess these tokens
stacks[-1].extend(stack)
else:
# Recurse here to handle nested parentheses
# Strip off the outer parentheses to avoid an infinite loop
inner = stack[1:-1]
inner = recursor(inner,
local_dict,
global_dict)
parenGroup = [stack[0]] + inner + [stack[-1]]
result.append(ParenthesisGroup(parenGroup))
stacklevel -= 1
continue
if stacklevel:
stacks[-1].append(token)
else:
result.append(token)
if stacklevel:
raise TokenError("Mismatched parentheses")
return result
return _inner
def _apply_functions(tokens, local_dict, global_dict):
"""Convert a NAME token + ParenthesisGroup into an AppliedFunction.
Note that ParenthesisGroups, if not applied to any function, are
converted back into lists of tokens.
"""
result = []
symbol = None
for tok in tokens:
if tok[0] == NAME:
symbol = tok
result.append(tok)
elif isinstance(tok, ParenthesisGroup):
if symbol and _token_callable(symbol, local_dict, global_dict):
result[-1] = AppliedFunction(symbol, tok)
symbol = None
else:
result.extend(tok)
else:
symbol = None
result.append(tok)
return result
def _implicit_multiplication(tokens, local_dict, global_dict):
"""Implicitly adds '*' tokens.
Cases:
- Two AppliedFunctions next to each other ("sin(x)cos(x)")
- AppliedFunction next to an open parenthesis ("sin x (cos x + 1)")
- A close parenthesis next to an AppliedFunction ("(x+2)sin x")
- A close parenthesis next to an open parenthesis ("(x+2)(x+3)")
- AppliedFunction next to an implicitly applied function ("sin(x)cos x")
"""
result = []
for tok, nextTok in zip(tokens, tokens[1:]):
result.append(tok)
if (isinstance(tok, AppliedFunction) and
isinstance(nextTok, AppliedFunction)):
result.append((OP, '*'))
elif (isinstance(tok, AppliedFunction) and
nextTok[0] == OP and nextTok[1] == '('):
# Applied function followed by an open parenthesis
result.append((OP, '*'))
elif (tok[0] == OP and tok[1] == ')' and
isinstance(nextTok, AppliedFunction)):
# Close parenthesis followed by an applied function
result.append((OP, '*'))
elif (tok[0] == OP and tok[1] == ')' and
nextTok[0] == NAME):
# Close parenthesis followed by an implicitly applied function
result.append((OP, '*'))
elif (tok[0] == nextTok[0] == OP
and tok[1] == ')' and nextTok[1] == '('):
# Close parenthesis followed by an open parenthesis
result.append((OP, '*'))
elif (isinstance(tok, AppliedFunction) and nextTok[0] == NAME):
# Applied function followed by implicitly applied function
result.append((OP, '*'))
elif (tok[0] == NAME and
not _token_callable(tok, local_dict, global_dict) and
nextTok[0] == OP and nextTok[1] == '('):
# Constant followed by parenthesis
result.append((OP, '*'))
elif (tok[0] == NAME and
not _token_callable(tok, local_dict, global_dict) and
nextTok[0] == NAME and
not _token_callable(nextTok, local_dict, global_dict)):
# Constant followed by constant
result.append((OP, '*'))
elif (tok[0] == NAME and
not _token_callable(tok, local_dict, global_dict) and
(isinstance(nextTok, AppliedFunction) or nextTok[0] == NAME)):
# Constant followed by (implicitly applied) function
result.append((OP, '*'))
if tokens:
result.append(tokens[-1])
return result
def _implicit_application(tokens, local_dict, global_dict):
"""Adds parentheses as needed after functions."""
result = []
appendParen = 0 # number of closing parentheses to add
skip = 0 # number of tokens to delay before adding a ')' (to
# capture **, ^, etc.)
exponentSkip = False # skipping tokens before inserting parentheses to
# work with function exponentiation
for tok, nextTok in zip(tokens, tokens[1:]):
result.append(tok)
if (tok[0] == NAME and
nextTok[0] != OP and
nextTok[0] != ENDMARKER):
if _token_callable(tok, local_dict, global_dict, nextTok):
result.append((OP, '('))
appendParen += 1
# name followed by exponent - function exponentiation
elif (tok[0] == NAME and nextTok[0] == OP and nextTok[1] == '**'):
if _token_callable(tok, local_dict, global_dict):
exponentSkip = True
elif exponentSkip:
# if the last token added was an applied function (i.e. the
# power of the function exponent) OR a multiplication (as
# implicit multiplication would have added an extraneous
# multiplication)
if (isinstance(tok, AppliedFunction)
or (tok[0] == OP and tok[1] == '*')):
# don't add anything if the next token is a multiplication
# or if there's already a parenthesis (if parenthesis, still
# stop skipping tokens)
if not (nextTok[0] == OP and nextTok[1] == '*'):
if not(nextTok[0] == OP and nextTok[1] == '('):
result.append((OP, '('))
appendParen += 1
exponentSkip = False
elif appendParen:
if nextTok[0] == OP and nextTok[1] in ('^', '**', '*'):
skip = 1
continue
if skip:
skip -= 1
continue
result.append((OP, ')'))
appendParen -= 1
if tokens:
result.append(tokens[-1])
if appendParen:
result.extend([(OP, ')')] * appendParen)
return result
def function_exponentiation(tokens, local_dict, global_dict):
"""Allows functions to be exponentiated, e.g. ``cos**2(x)``.
Examples
========
>>> from sympy.parsing.sympy_parser import (parse_expr,
... standard_transformations, function_exponentiation)
>>> transformations = standard_transformations + (function_exponentiation,)
>>> parse_expr('sin**4(x)', transformations=transformations)
sin(x)**4
"""
result = []
exponent = []
consuming_exponent = False
level = 0
for tok, nextTok in zip(tokens, tokens[1:]):
if tok[0] == NAME and nextTok[0] == OP and nextTok[1] == '**':
if _token_callable(tok, local_dict, global_dict):
consuming_exponent = True
elif consuming_exponent:
exponent.append(tok)
# only want to stop after hitting )
if tok[0] == nextTok[0] == OP and tok[1] == ')' and nextTok[1] == '(':
consuming_exponent = False
# if implicit multiplication was used, we may have )*( instead
if tok[0] == nextTok[0] == OP and tok[1] == '*' and nextTok[1] == '(':
consuming_exponent = False
del exponent[-1]
continue
elif exponent and not consuming_exponent:
if tok[0] == OP:
if tok[1] == '(':
level += 1
elif tok[1] == ')':
level -= 1
if level == 0:
result.append(tok)
result.extend(exponent)
exponent = []
continue
result.append(tok)
if tokens:
result.append(tokens[-1])
if exponent:
result.extend(exponent)
return result
def split_symbols_custom(predicate):
"""Creates a transformation that splits symbol names.
``predicate`` should return True if the symbol name is to be split.
For instance, to retain the default behavior but avoid splitting certain
symbol names, a predicate like this would work:
>>> from sympy.parsing.sympy_parser import (parse_expr, _token_splittable,
... standard_transformations, implicit_multiplication,
... split_symbols_custom)
>>> def can_split(symbol):
... if symbol not in ('list', 'of', 'unsplittable', 'names'):
... return _token_splittable(symbol)
... return False
...
>>> transformation = split_symbols_custom(can_split)
>>> parse_expr('unsplittable', transformations=standard_transformations +
... (transformation, implicit_multiplication))
unsplittable
"""
def _split_symbols(tokens, local_dict, global_dict):
result = []
split = False
split_previous = False
for tok in tokens:
if split_previous:
# throw out closing parenthesis of Symbol that was split
split_previous = False
continue
split_previous = False
if tok[0] == NAME and tok[1] == 'Symbol':
split = True
elif split and tok[0] == NAME:
symbol = tok[1][1:-1]
if predicate(symbol):
for char in symbol:
if char in local_dict or char in global_dict:
# Get rid of the call to Symbol
del result[-2:]
result.extend([(NAME, "%s" % char),
(NAME, 'Symbol'), (OP, '(')])
else:
result.extend([(NAME, "'%s'" % char), (OP, ')'),
(NAME, 'Symbol'), (OP, '(')])
# Delete the last two tokens: get rid of the extraneous
# Symbol( we just added
# Also, set split_previous=True so that we skip
# the closing parenthesis of the original Symbol
del result[-2:]
split = False
split_previous = True
continue
else:
split = False
result.append(tok)
return result
return _split_symbols
#: Splits symbol names for implicit multiplication.
#:
#: Intended to let expressions like ``xyz`` be parsed as ``x*y*z``. Does not
#: split Greek character names, so ``theta`` will *not* become
#: ``t*h*e*t*a``. Generally this should be used with
#: ``implicit_multiplication``.
split_symbols = split_symbols_custom(_token_splittable)
def implicit_multiplication(result, local_dict, global_dict):
"""Makes the multiplication operator optional in most cases.
Use this before :func:`implicit_application`, otherwise expressions like
``sin 2x`` will be parsed as ``x * sin(2)`` rather than ``sin(2*x)``.
Examples
========
>>> from sympy.parsing.sympy_parser import (parse_expr,
... standard_transformations, implicit_multiplication)
>>> transformations = standard_transformations + (implicit_multiplication,)
>>> parse_expr('3 x y', transformations=transformations)
3*x*y
"""
# These are interdependent steps, so we don't expose them separately
for step in (_group_parentheses(implicit_multiplication),
_apply_functions,
_implicit_multiplication):
result = step(result, local_dict, global_dict)
result = _flatten(result)
return result
def implicit_application(result, local_dict, global_dict):
"""Makes parentheses optional in some cases for function calls.
Use this after :func:`implicit_multiplication`, otherwise expressions
like ``sin 2x`` will be parsed as ``x * sin(2)`` rather than
``sin(2*x)``.
Examples
========
>>> from sympy.parsing.sympy_parser import (parse_expr,
... standard_transformations, implicit_application)
>>> transformations = standard_transformations + (implicit_application,)
>>> parse_expr('cot z + csc z', transformations=transformations)
cot(z) + csc(z)
"""
for step in (_group_parentheses(implicit_application),
_apply_functions,
_implicit_application,):
result = step(result, local_dict, global_dict)
result = _flatten(result)
return result
def implicit_multiplication_application(result, local_dict, global_dict):
"""Allows a slightly relaxed syntax.
- Parentheses for single-argument method calls are optional.
- Multiplication is implicit.
- Symbol names can be split (i.e. spaces are not needed between
symbols).
- Functions can be exponentiated.
Examples
========
>>> from sympy.parsing.sympy_parser import (parse_expr,
... standard_transformations, implicit_multiplication_application)
>>> parse_expr("10sin**2 x**2 + 3xyz + tan theta",
... transformations=(standard_transformations +
... (implicit_multiplication_application,)))
3*x*y*z + 10*sin(x**2)**2 + tan(theta)
"""
for step in (split_symbols, implicit_multiplication,
implicit_application, function_exponentiation):
result = step(result, local_dict, global_dict)
return result
def auto_symbol(tokens, local_dict, global_dict):
"""Inserts calls to ``Symbol`` for undefined variables."""
result = []
prevTok = (None, None)
tokens.append((None, None)) # so zip traverses all tokens
for tok, nextTok in zip(tokens, tokens[1:]):
tokNum, tokVal = tok
nextTokNum, nextTokVal = nextTok
if tokNum == NAME:
name = tokVal
if (name in ['True', 'False', 'None']
or iskeyword(name)
or name in local_dict
# Don't convert attribute access
or (prevTok[0] == OP and prevTok[1] == '.')
# Don't convert keyword arguments
or (prevTok[0] == OP and prevTok[1] in ('(', ',')
and nextTokNum == OP and nextTokVal == '=')):
result.append((NAME, name))
continue
elif name in global_dict:
obj = global_dict[name]
if isinstance(obj, (Basic, type)) or callable(obj):
result.append((NAME, name))
continue
result.extend([
(NAME, 'Symbol'),
(OP, '('),
(NAME, repr(str(name))),
(OP, ')'),
])
else:
result.append((tokNum, tokVal))
prevTok = (tokNum, tokVal)
return result
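# For illustration: given the input 'x + sin(1)', auto_symbol rewrites the
# undefined name x into Symbol('x') while leaving sin alone, since sin is
# found in global_dict and is callable.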
def lambda_notation(tokens, local_dict, global_dict):
"""Substitutes "lambda" with its Sympy equivalent Lambda().
However, the conversion doesn't take place if only "lambda"
is passed because that is a syntax error.
"""
result = []
flag = False
toknum, tokval = tokens[0]
tokLen = len(tokens)
if toknum == NAME and tokval == 'lambda':
if tokLen == 2:
result.extend(tokens)
elif tokLen > 2:
result.extend([
(NAME, 'Lambda'),
(OP, '('),
(OP, '('),
(OP, ')'),
(OP, ')'),
])
for tokNum, tokVal in tokens[1:]:
if tokNum == OP and tokVal == ':':
tokVal = ','
flag = True
if flag:
result.insert(-1, (tokNum, tokVal))
else:
result.insert(-2, (tokNum, tokVal))
else:
result.extend(tokens)
return result
def factorial_notation(tokens, local_dict, global_dict):
"""Allows standard notation for factorial."""
result = []
prevtoken = ''
for toknum, tokval in tokens:
if toknum == OP:
op = tokval
if op == '!!':
if prevtoken == '!' or prevtoken == '!!':
raise TokenError
result = _add_factorial_tokens('factorial2', result)
elif op == '!':
if prevtoken == '!' or prevtoken == '!!':
raise TokenError
result = _add_factorial_tokens('factorial', result)
else:
result.append((OP, op))
else:
result.append((toknum, tokval))
prevtoken = tokval
return result
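# For illustration: with factorial_notation in the transformation chain,
# '5!' parses as factorial(5) and '7!!' as factorial2(7).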
def convert_xor(tokens, local_dict, global_dict):
"""Treats XOR, ``^``, as exponentiation, ``**``."""
result = []
for toknum, tokval in tokens:
if toknum == OP:
if tokval == '^':
result.append((OP, '**'))
else:
result.append((toknum, tokval))
else:
result.append((toknum, tokval))
return result
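# For illustration:
#   parse_expr('2^3', transformations=standard_transformations + (convert_xor,))
# returns Integer(8), because '^' is rewritten to '**' before evaluation.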
def auto_number(tokens, local_dict, global_dict):
"""Converts numeric literals to use SymPy equivalents.
Complex numbers use ``I``; integer literals use ``Integer``, float
literals use ``Float``, and repeating decimals use ``Rational``.
"""
result = []
prevtoken = ''
for toknum, tokval in tokens:
if toknum == NUMBER:
number = tokval
postfix = []
if number.endswith('j') or number.endswith('J'):
number = number[:-1]
postfix = [(OP, '*'), (NAME, 'I')]
if '.' in number or (('e' in number or 'E' in number) and
not (number.startswith('0x') or number.startswith('0X'))):
match = _re_repeated.match(number)
if match is not None:
# Clear repeating decimals, e.g. 3.4[31] -> (3 + 4/10 + 31/990)
pre, post, repetend = match.groups()
zeros = '0'*len(post)
post, repetends = [w.lstrip('0') for w in [post, repetend]]
# or else interpreted as octal
a = pre or '0'
b, c = post or '0', '1' + zeros
d, e = repetends, ('9'*len(repetend)) + zeros
seq = [
(OP, '('),
(NAME,
'Integer'), (OP, '('), (NUMBER, a), (OP, ')'),
(OP, '+'),
(NAME, 'Rational'), (OP, '('), (
NUMBER, b), (OP, ','), (NUMBER, c), (OP, ')'),
(OP, '+'),
(NAME, 'Rational'), (OP, '('), (
NUMBER, d), (OP, ','), (NUMBER, e), (OP, ')'),
(OP, ')'),
]
else:
seq = [(NAME, 'Float'), (OP, '('),
(NUMBER, repr(str(number))), (OP, ')')]
else:
seq = [(NAME, 'Integer'), (OP, '('), (
NUMBER, number), (OP, ')')]
result.extend(seq + postfix)
else:
result.append((toknum, tokval))
return result
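# For illustration: auto_number rewrites '3' as Integer(3), '1.5' as
# Float('1.5'), '2j' as 2*I, and a repeating decimal such as '0.[3]' as
# Integer(0) + Rational(0, 1) + Rational(3, 9), i.e. 1/3.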
def rationalize(tokens, local_dict, global_dict):
"""Converts floats into ``Rational``. Run AFTER ``auto_number``."""
result = []
passed_float = False
for toknum, tokval in tokens:
if toknum == NAME:
if tokval == 'Float':
passed_float = True
tokval = 'Rational'
result.append((toknum, tokval))
elif passed_float and toknum == NUMBER:
passed_float = False
result.append((STRING, tokval))
else:
result.append((toknum, tokval))
return result
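# For illustration: run after auto_number, rationalize turns the emitted
# Float('1.5') tokens into Rational('1.5'), i.e. the exact fraction 3/2.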
#: Standard transformations for :func:`parse_expr`.
#: Inserts calls to :class:`Symbol`, :class:`Integer`, and other SymPy
#: datatypes and allows the use of standard factorial notation (e.g. ``x!``).
standard_transformations = (lambda_notation, auto_symbol, auto_number, factorial_notation)
def stringify_expr(s, local_dict, global_dict, transformations):
"""
Converts the string ``s`` to Python code, in ``local_dict``
Generally, ``parse_expr`` should be used.
"""
tokens = []
input_code = StringIO(s.strip())
for toknum, tokval, _, _, _ in generate_tokens(input_code.readline):
tokens.append((toknum, tokval))
for transform in transformations:
tokens = transform(tokens, local_dict, global_dict)
return untokenize(tokens)
def eval_expr(code, local_dict, global_dict):
"""
Evaluate Python code generated by ``stringify_expr``.
Generally, ``parse_expr`` should be used.
"""
expr = eval(
code, global_dict, local_dict) # take local objects in preference
return expr
def parse_expr(s, local_dict=None, transformations=standard_transformations,
global_dict=None, evaluate=True):
"""Converts the string ``s`` to a SymPy expression, in ``local_dict``
Parameters
==========
s : str
The string to parse.
local_dict : dict, optional
A dictionary of local variables to use when parsing.
global_dict : dict, optional
A dictionary of global variables. By default, this is initialized
with ``from sympy import *``; provide this parameter to override
this behavior (for instance, to parse ``"Q & S"``).
transformations : tuple, optional
A tuple of transformation functions used to modify the tokens of the
parsed expression before evaluation. The default transformations
convert numeric literals into their SymPy equivalents, convert
undefined variables into SymPy symbols, and allow the use of standard
mathematical factorial notation (e.g. ``x!``).
evaluate : bool, optional
When False, the order of the arguments will remain as they were in the
string and automatic simplification that would normally occur is
suppressed. (see examples)
Examples
========
>>> from sympy.parsing.sympy_parser import parse_expr
>>> parse_expr("1/2")
1/2
>>> type(_)
<class 'sympy.core.numbers.Half'>
>>> from sympy.parsing.sympy_parser import standard_transformations,\\
... implicit_multiplication_application
>>> transformations = (standard_transformations +
... (implicit_multiplication_application,))
>>> parse_expr("2x", transformations=transformations)
2*x
When evaluate=False, some automatic simplifications will not occur:
>>> parse_expr("2**3"), parse_expr("2**3", evaluate=False)
(8, 2**3)
In addition the order of the arguments will not be made canonical.
This feature allows one to tell exactly how the expression was entered:
>>> a = parse_expr('1 + x', evaluate=False)
>>> b = parse_expr('x + 1', evaluate=0)
>>> a == b
False
>>> a.args
(1, x)
>>> b.args
(x, 1)
See Also
========
stringify_expr, eval_expr, standard_transformations,
implicit_multiplication_application
"""
if local_dict is None:
local_dict = {}
if global_dict is None:
global_dict = {}
exec_('from sympy import *', global_dict)
code = stringify_expr(s, local_dict, global_dict, transformations)
if not evaluate:
code = compile(evaluateFalse(code), '<string>', 'eval')
return eval_expr(code, local_dict, global_dict)
def evaluateFalse(s):
"""
Replaces operators with the SymPy equivalent and sets evaluate=False.
"""
node = ast.parse(s)
node = EvaluateFalseTransformer().visit(node)
# node is a Module, we want an Expression
node = ast.Expression(node.body[0].value)
return ast.fix_missing_locations(node)
class EvaluateFalseTransformer(ast.NodeTransformer):
operators = {
ast.Add: 'Add',
ast.Mult: 'Mul',
ast.Pow: 'Pow',
ast.Sub: 'Add',
ast.Div: 'Mul',
ast.BitOr: 'Or',
ast.BitAnd: 'And',
ast.BitXor: 'Not',
}
def flatten(self, args, func):
result = []
for arg in args:
if isinstance(arg, ast.Call) and arg.func.id == func:
result.extend(self.flatten(arg.args, func))
else:
result.append(arg)
return result
def visit_BinOp(self, node):
if node.op.__class__ in self.operators:
sympy_class = self.operators[node.op.__class__]
right = self.visit(node.right)
if isinstance(node.op, ast.Sub):
right = ast.UnaryOp(op=ast.USub(), operand=right)
elif isinstance(node.op, ast.Div):
right = ast.Call(
func=ast.Name(id='Pow', ctx=ast.Load()),
args=[right, ast.UnaryOp(op=ast.USub(), operand=ast.Num(1))],
keywords=[ast.keyword(arg='evaluate', value=ast.Name(id='False', ctx=ast.Load()))],
starargs=None,
kwargs=None
)
new_node = ast.Call(
func=ast.Name(id=sympy_class, ctx=ast.Load()),
args=[self.visit(node.left), right],
keywords=[ast.keyword(arg='evaluate', value=ast.Name(id='False', ctx=ast.Load()))],
starargs=None,
kwargs=None
)
if sympy_class in ('Add', 'Mul'):
# Denest Add or Mul as appropriate
new_node.args = self.flatten(new_node.args, sympy_class)
return new_node
return node
|
bsd-3-clause
|
Tetchain/pycoin
|
tests/encoding_test.py
|
18
|
9641
|
#!/usr/bin/env python
import unittest
from pycoin import encoding
from pycoin.serialize import h2b
class EncodingTestCase(unittest.TestCase):
def test_to_from_long(self):
def do_test(as_int, prefix, as_rep, base):
self.assertEqual((as_int, prefix), encoding.to_long(base, encoding.byte_to_int, as_rep))
self.assertEqual(as_rep, encoding.from_long(as_int, prefix, base, lambda v:v))
do_test(10000101, 2, h2b("00009896e5"), 256)
do_test(10000101, 3, h2b("0000009896e5"), 256)
do_test(1460765565493402645157733592332121663123460211377, 1, b'\0\xff\xde\xfeOHu\xcf\x11\x9f\xc3\xd8\xf4\xa0\x9a\xe3~\xc4\xccB\xb1', 256)
def test_to_bytes_32(self):
for i in range(256):
v = encoding.to_bytes_32(i)
self.assertEqual(v, b'\0' * 31 + bytes(bytearray([i])))
for i in range(256, 512):
v = encoding.to_bytes_32(i)
self.assertEqual(v, b'\0' * 30 + bytes(bytearray([1, i&0xff])))
def test_to_from_base58(self):
def do_test(as_text, as_bin):
self.assertEqual(as_bin, encoding.a2b_base58(as_text))
self.assertEqual(as_text, encoding.b2a_base58(as_bin))
do_test("1abcdefghijkmnpqrst", b'\x00\x01\x93\\|\xf2*\xb9\xbe\x19b\xae\xe4\x8c{')
do_test("1CASrvcpMMTa4dz4DmYtAqcegCtdkhjvdn", b'\x00zr\xb6\xfac\xde6\xc4\xab\xc6\nh\xb5-\x7f3\xe3\xd7\xcd>\xc4\xba\xbd9')
do_test("1111111111111111aaaa11aa",
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00CnzQ)\x0b')
do_test("123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz",
b'\x00\x01\x11\xd3\x8e_\xc9\x07\x1f\xfc\xd2\x0bJv<\xc9\xaeO%+\xb4\xe4\x8f\xd6j\x83^%*\xda\x93\xffH\rm\xd4=\xc6*d\x11U\xa5')
def test_to_from_hashed_base58(self):
def do_test(as_text, as_bin):
self.assertEqual(as_text, encoding.b2a_hashed_base58(as_bin))
self.assertEqual(as_bin, encoding.a2b_hashed_base58(as_text))
self.assertTrue(encoding.is_hashed_base58_valid(as_text))
bogus_text = as_text[:-1] + chr(1+ord(as_text[-1]))
self.assertFalse(encoding.is_hashed_base58_valid(bogus_text))
do_test("14nr3dMd4VwNpFhFECU1A6imi", b'\x00\x01\x93\\|\xf2*\xb9\xbe\x19b\xae\xe4\x8c{')
do_test("1CASrvcpMMTa4dz4DmYtAqcegCtdkhjvdn", b'\x00zr\xb6\xfac\xde6\xc4\xab\xc6\nh\xb5-\x7f3\xe3\xd7\xcd>')
do_test("11111111111111114njGbaozZJui9o",
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00CnzQ)\x0b')
do_test("1mLRia5CbfDB9752zxvtrpnkigecaYWUSQNLJGECA8641ywusqomjhfdb6EM7bXGj1Gb",
b'\x00\x01\x11\xd3\x8e_\xc9\x07\x1f\xfc\xd2\x0bJv<\xc9\xaeO%+\xb4\xe4\x8f\xd6j\x83^%*\xda\x93\xffH\rm\xd4=\xc6*d\x11U\xa5aaaa')
def test_double_sha256(self):
def do_test(blob, expected_hash):
self.assertEqual(encoding.double_sha256(blob), expected_hash)
do_test(b"This is a test",
b'\xea\xc6I\xd41\xaa?\xc2\xd5t\x9d\x1aP!\xbb\xa7\x81.\xc8;\x8aY\xfa\x84\x0b\xffu\xc1\x7f\x8af\\')
do_test(b"The quick brown fox jumps over the lazy dogs",
b'\x8a5e\x88yz\x90\x1a\x11\x03\x17y\xd4xz\xd0E~\xb0\x82\xc5k\xd9\xb6W\x15z\xcf1\xba\xe6\xc4')
do_test(b'\x74' * 10000,
b'nMw6\xaa7<G\x18\xee\xf2\xb9E(\xfe\xd5u\x19\xa0\xbd\xc3\xa8\xf40\n\xee7,\xbe\xde\xa9\xa0')
def test_hash160(self):
def do_test(blob, expected_hash):
self.assertEqual(encoding.hash160(blob), expected_hash)
do_test(b"This is a test",
b'\x18\xac\x98\xfa*$\x12\xdd\xb7]\xe6\x04Y\xb5*\xcd\x98\xf2\xd9r')
do_test(b"The quick brown fox jumps over the lazy dogs",
b'v\xc9\xd1\xf3\xaaR&UN G_\x91\x9a\xad\xd1t\xf7\xe9\xb7')
do_test(b'\x74' * 10000,
b'\xa9a\x07\x02\x96gt\x01\xa5~\xae\r\x96\xd1MZ\x88\n,A')
def test_wif_to_from_secret_exponent(self):
def do_test(as_secret_exponent, as_wif, is_compressed):
self.assertEqual(as_wif, encoding.secret_exponent_to_wif(as_secret_exponent, compressed=is_compressed))
se, comp = encoding.wif_to_tuple_of_secret_exponent_compressed(as_wif)
self.assertEqual(se, as_secret_exponent)
self.assertEqual(comp, is_compressed)
self.assertTrue(encoding.is_valid_wif(as_wif))
WIF_LIST = [
"5HwoXVkHoRM8sL2KmNRS217n1g8mPPBomrY7yehCuXC1115WWsh",
"5J5KUK3VXP8HUefNVYPxwxVRokScZdWXpu1Tj8LfaAXMqHzMmbk",
"5JCqR8LhFLuS5yJRDiNVsus5bpkTjsqFswUoUbz8EorifYA4TwJ",
"5JLMMwdtyJgahHwTwtM2osEjPu4Jv89yvyx9E5dauTC5Vs6EjBA",
"5JTsJkw6hGTjJcaWg4KZjpcPByNA6NUhz2RUyZH3a6XSL7vAYmy",
"5JbPFaEJREEsuwDZQEJ6fmz2z3g1GcoS34tpj2vWEjroARtCMBF",
"5JiuCPXW9C22XFrc8QGdbjMgn7yrSs8A67NAUWZxuPC9ziUizQP",
"5JrR9Cphs9oB8aVeraFAXgjLaCHhd7St99qWDzDRa2XWq3RVw7d",
"5Jyw627ub7aKju8hakDhTe6zNGbYoMmcCCJqyTrtEfrsfLDreVt",
"5K7T2qR7K5MUMDmkJvCEPbUeALuPyc6LFEnBiwWLuKCEVdBp8qV",
"5KExyeiK338cxYQo36AmKYrHxRDF9rR4JHFXUR9oZxXbKue7gdL",
"5KNUvU1WkzumZs3qmG9JFWDwkVX6L6jnMKisDtoGEbrxACzxk6T",
"5KVzsHJiUxgvBBgtVS7qBTbbYZpwWM4WQNCCyNSiuFCJzYMxg8H",
"5KdWp6bvCvU4nWKwDc6N7QyFLe8ngbPETQfYir6BZtXfpsnSrGS",
]
SE_LIST = [int(c * 64, 16) for c in "123456789abcde"]
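# Each entry in SE_LIST is a 64-hex-digit repdigit (0x11...1 through 0xee...e),
# giving one deterministic secret exponent per WIF string above; the two lists
# are paired element-wise by the zip() below.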
for se, wif in zip(SE_LIST, WIF_LIST):
do_test(se, wif, is_compressed=False)
def test_public_pair_to_sec(self):
def do_test(as_public_pair, as_sec, is_compressed, as_hash160_sec, as_bitcoin_address):
self.assertEqual(encoding.sec_to_public_pair(as_sec), as_public_pair)
self.assertEqual(encoding.public_pair_to_sec(as_public_pair, compressed=is_compressed), as_sec)
self.assertEqual(encoding.is_sec_compressed(as_sec), is_compressed)
self.assertEqual(encoding.public_pair_to_hash160_sec(as_public_pair, compressed=is_compressed),
as_hash160_sec)
self.assertEqual(encoding.hash160_sec_to_bitcoin_address(as_hash160_sec), as_bitcoin_address)
self.assertEqual(encoding.public_pair_to_bitcoin_address(as_public_pair, compressed=is_compressed), as_bitcoin_address)
self.assertTrue(encoding.is_valid_bitcoin_address(as_bitcoin_address))
bad_address = as_bitcoin_address[:17] + chr(ord(as_bitcoin_address[17]) + 1) + as_bitcoin_address[18:]
self.assertFalse(encoding.is_valid_bitcoin_address(bad_address))
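# Background for the SEC test data below: a compressed SEC public key is 33
# bytes (a 02 or 03 prefix chosen by the parity of y, then the 32-byte x
# coordinate); an uncompressed key is 65 bytes (an 04 prefix, then x and y).
# The hash160 is RIPEMD-160(SHA-256(sec)), and the Bitcoin address is the
# hashed-base58 encoding of a version byte followed by that 20-byte hash.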
SEC_TEST_DATA = [
((35826991941973211494003564265461426073026284918572421206325859877044495085994,
25491041833361137486709012056693088297620945779048998614056404517283089805761),
"034f355bdcb7cc0af728ef3cceb9615d90684bb5b2ca5f859ab0f0b704075871aa",
True,
"fc7250a211deddc70ee5a2738de5f07817351cef",
"1Q1pE5vPGEEMqRcVRMbtBK842Y6Pzo6nK9"
),
((31855367722742370537280679280108010854876607759940877706949385967087672770343,
46659058944867745027460438812818578793297503278458148978085384795486842595210),
"02466d7fcae563e5cb09a0d1870bb580344804617879a14949cf22285f1bae3f27",
True,
"531260aa2a199e228c537dfa42c82bea2c7c1f4d",
"18aF6pYXKDSXjXHpidt2G6okdVdBr8zA7z"
),
((27341391395138457474971175971081207666803680341783085051101294443585438462385,
26772005640425216814694594224987412261034377630410179754457174380653265224672),
"023c72addb4fdf09af94f0c94d7fe92a386a7e70cf8a1d85916386bb2535c7b1b1",
True,
"3bc28d6d92d9073fb5e3adf481795eaf446bceed",
"16Syw4SugWs4siKbK8cuxJXM2ukh2GKpRi"
),
((35826991941973211494003564265461426073026284918572421206325859877044495085994,
25491041833361137486709012056693088297620945779048998614056404517283089805761),
"044f355bdcb7cc0af728ef3cceb9615d90684bb5b2ca5f859ab0f0b704075871aa"\
"385b6b1b8ead809ca67454d9683fcf2ba03456d6fe2c4abe2b07f0fbdbb2f1c1",
False,
"e4e517ee07984a4000cd7b00cbcb545911c541c4",
"1MsHWS1BnwMc3tLE8G35UXsS58fKipzB7a"
),
((31855367722742370537280679280108010854876607759940877706949385967087672770343,
46659058944867745027460438812818578793297503278458148978085384795486842595210),
"04466d7fcae563e5cb09a0d1870bb580344804617879a14949cf22285f1bae3f27"\
"6728176c3c6431f8eeda4538dc37c865e2784f3a9e77d044f33e407797e1278a",
False,
"b256082b934fe782adbacaafeadfca64c52a5384",
"1HFxLkPTtMZeo5mDpZn6CF9sh4h2ycknwr"
),
((27341391395138457474971175971081207666803680341783085051101294443585438462385,
26772005640425216814694594224987412261034377630410179754457174380653265224672),
"043c72addb4fdf09af94f0c94d7fe92a386a7e70cf8a1d85916386bb2535c7b1b1"\
"3b306b0fe085665d8fc1b28ae1676cd3ad6e08eaeda225fe38d0da4de55703e0",
False,
"edf6bbd7ba7aad222c2b28e6d8d5001178e3680c",
"1NhEipumt9Pug6pwTqMNRXhBG84K39Ebbi"
),
]
for public_pair, sec, compressed, hash160_sec, bitcoin_address in SEC_TEST_DATA:
do_test(public_pair, h2b(sec), compressed, h2b(hash160_sec), bitcoin_address)
if __name__ == '__main__':
unittest.main()
|
mit
|
google/glazier
|
testing/run_tests.py
|
1
|
1660
|
# Copyright 2019 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Locate *_test modules and run the tests in them."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import pkgutil
import re
import subprocess
import sys
import glazier
FAILED_RE = re.compile(r'FAILED\s*\(errors=(\d*)\)')
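# FAILED_RE extracts the error count from unittest's summary line, e.g.
# 'FAILED (errors=2)' -> group(1) == '2'. Runs that report only failures
# ('FAILED (failures=1)') do not match this regex, but are still counted
# via the nonzero return code tallied below.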
def main():
results = {'codes': {0: 0, 1: 0}, 'errors': 0}
for _, test, _ in pkgutil.walk_packages(glazier.__path__,
glazier.__name__ + '.'):
if '_test' in test:
print('**** %s ****\n' % test)
proc = subprocess.Popen(['python', '-m', test], stderr=subprocess.PIPE)
_, err = proc.communicate()
err = err.decode()
print(err)
failed = FAILED_RE.search(err)
if failed:
results['errors'] += int(failed.group(1))
results['codes'][proc.returncode] = results['codes'].setdefault(
proc.returncode, 0) + 1
print('Success: %s' % results['codes'][0])
print('Failure: %s' % results['codes'][1])
sys.exit(results['codes'][1])
if __name__ == '__main__':
main()
|
apache-2.0
|
YueLinHo/Subversion
|
subversion/tests/cmdline/svntest/verify.py
|
1
|
34138
|
#
# verify.py: routines that handle comparison and display of expected
# vs. actual output
#
# Subversion is a tool for revision control.
# See http://subversion.tigris.org for more information.
#
# ====================================================================
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
######################################################################
import re, sys
from difflib import unified_diff, ndiff
import pprint
import logging
import svntest
logger = logging.getLogger()
######################################################################
# Exception types
class SVNUnexpectedOutput(svntest.Failure):
"""Exception raised if an invocation of svn results in unexpected
output of any kind."""
pass
class SVNUnexpectedStdout(SVNUnexpectedOutput):
"""Exception raised if an invocation of svn results in unexpected
output on STDOUT."""
pass
class SVNUnexpectedStderr(SVNUnexpectedOutput):
"""Exception raised if an invocation of svn results in unexpected
output on STDERR."""
pass
class SVNExpectedStdout(SVNUnexpectedOutput):
"""Exception raised if an invocation of svn results in no output on
STDOUT when output was expected."""
pass
class SVNExpectedStderr(SVNUnexpectedOutput):
"""Exception raised if an invocation of svn results in no output on
STDERR when output was expected."""
pass
class SVNUnexpectedExitCode(SVNUnexpectedOutput):
"""Exception raised if an invocation of svn exits with a value other
than what was expected."""
pass
class SVNIncorrectDatatype(SVNUnexpectedOutput):
"""Exception raised if invalid input is passed to the
run_and_verify_* API"""
pass
class SVNDumpParseError(svntest.Failure):
"""Exception raised if parsing a dump file fails"""
pass
######################################################################
# Comparison of expected vs. actual output
def createExpectedOutput(expected, output_type, match_all=True):
"""Return EXPECTED, promoted to an ExpectedOutput instance if not
None. Raise SVNIncorrectDatatype if the data type of EXPECTED is
not handled."""
if isinstance(expected, list):
expected = ExpectedOutput(expected)
elif isinstance(expected, str):
expected = RegexOutput(expected, match_all)
elif isinstance(expected, int):
expected = RegexOutput(".*: E%d:.*" % expected, False)
elif expected is AnyOutput:
expected = AnyOutput()
elif expected is not None and not isinstance(expected, ExpectedOutput):
raise SVNIncorrectDatatype("Unexpected type for '%s' data" % output_type)
return expected
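# A minimal sketch of the promotion rules above, using only names defined in
# this module:
#
#   createExpectedOutput(['A\n', 'B\n'], 'stdout')  # -> ExpectedOutput
#   createExpectedOutput('svn: .*', 'stderr')       # -> RegexOutput
#   createExpectedOutput(155007, 'stderr')          # -> RegexOutput matching
#                                                   #    '.*: E155007:.*'
#   createExpectedOutput(AnyOutput, 'stdout')       # -> AnyOutput()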
class ExpectedOutput(object):
"""Matches an ordered list of lines.
If MATCH_ALL is True, the expected lines must match all the actual
lines, one-to-one, in the same order. If MATCH_ALL is False, the
expected lines must match a subset of the actual lines, one-to-one,
in the same order, ignoring any other actual lines among the
matching ones.
"""
def __init__(self, expected, match_all=True):
"""Initialize the expected output to EXPECTED which is a string, or
a list of strings.
See also: svntest.verify.createExpectedOutput().
"""
assert expected is not None
self.expected = expected
self.match_all = match_all
def __str__(self):
return str(self.expected)
def __cmp__(self, other):
raise TypeError("ExpectedOutput does not implement direct comparison; "
"see the 'matches()' method")
def matches(self, actual):
"""Return whether SELF matches ACTUAL (which may be a list
of newline-terminated lines, or a single string).
"""
assert actual is not None
expected = self.expected
if not isinstance(expected, list):
expected = [expected]
if not isinstance(actual, list):
actual = [actual]
if self.match_all:
return expected == actual
i_expected = 0
for actual_line in actual:
if expected[i_expected] == actual_line:
i_expected += 1
if i_expected == len(expected):
return True
return False
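# For example, with MATCH_ALL False the expected lines need only appear as an
# ordered subsequence of the actual lines:
#
#   ExpectedOutput(['b\n', 'd\n'], match_all=False) \
#       .matches(['a\n', 'b\n', 'c\n', 'd\n'])   # -> True
#   ExpectedOutput(['d\n', 'b\n'], match_all=False) \
#       .matches(['a\n', 'b\n', 'c\n', 'd\n'])   # -> False (order matters)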
def display_differences(self, message, label, actual):
"""Show the differences between the expected and ACTUAL lines. Print
MESSAGE unless it is None, the expected lines, the ACTUAL lines,
and a diff, all labeled with LABEL.
"""
display_lines(message, self.expected, actual, label, label)
display_lines_diff(self.expected, actual, label, label)
class AnyOutput(ExpectedOutput):
"""Matches any non-empty output.
"""
def __init__(self):
ExpectedOutput.__init__(self, [], False)
def matches(self, actual):
assert actual is not None
if len(actual) == 0:
# No actual output. No match.
return False
for line in actual:
# If any line has some text, then there is output, so we match.
if line:
return True
# We did not find a line with text. No match.
return False
def display_differences(self, message, label, actual):
if message:
logger.warn(message)
class RegexOutput(ExpectedOutput):
"""Matches a single regular expression.
If MATCH_ALL is true, every actual line must match the RE. If
MATCH_ALL is false, at least one actual line must match the RE. In
any case, there must be at least one line of actual output.
"""
def __init__(self, expected, match_all=True):
"EXPECTED is a regular expression string."
assert isinstance(expected, str) or isinstance(expected, bytes)
ExpectedOutput.__init__(self, expected, match_all)
self.expected_re = re.compile(expected)
def matches(self, actual):
assert actual is not None
if not isinstance(actual, list):
actual = [actual]
# If a regex was provided assume that we require some actual output.
# Fail if we don't have any.
if len(actual) == 0:
return False
if self.match_all:
return all(self.expected_re.match(line) for line in actual)
else:
return any(self.expected_re.match(line) for line in actual)
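# For example:
#
#   RegexOutput('svn: E\\d+', match_all=False) \
#       .matches(['context\n', 'svn: E155007: oops\n'])   # -> True
#   RegexOutput('svn: E\\d+') \
#       .matches(['svn: E1\n', 'other\n'])   # -> False: with match_all=True
#                                            #    every line must match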
def display_differences(self, message, label, actual):
display_lines(message, self.expected, actual, label + ' (regexp)', label)
def insert(self, index, line):
self.expected.insert(index, line)
self.expected_re = re.compile(self.expected)
class RegexListOutput(ExpectedOutput):
"""Matches an ordered list of regular expressions.
If MATCH_ALL is True, the expressions must match all the actual
lines, one-to-one, in the same order. If MATCH_ALL is False, the
expressions must match a subset of the actual lines, one-to-one, in
the same order, ignoring any other actual lines among the matching
ones.
In any case, there must be at least one line of actual output.
"""
def __init__(self, expected, match_all=True):
"EXPECTED is a list of regular expression strings."
assert isinstance(expected, list)
ExpectedOutput.__init__(self, expected, match_all)
self.expected_res = [re.compile(e) for e in expected]
def matches(self, actual):
assert actual is not None
if not isinstance(actual, list):
actual = [actual]
if self.match_all:
return (len(self.expected_res) == len(actual) and
all(e.match(a) for e, a in zip(self.expected_res, actual)))
i_expected = 0
for actual_line in actual:
if self.expected_res[i_expected].match(actual_line):
i_expected += 1
if i_expected == len(self.expected_res):
return True
return False
def display_differences(self, message, label, actual):
display_lines(message, self.expected, actual, label + ' (regexp)', label)
def insert(self, index, line):
self.expected.insert(index, line)
self.expected_res = [re.compile(e) for e in self.expected]
class UnorderedOutput(ExpectedOutput):
"""Matches an unordered list of lines.
The expected lines must match all the actual lines, one-to-one, in
any order.
"""
def __init__(self, expected):
assert isinstance(expected, list)
ExpectedOutput.__init__(self, expected)
def matches(self, actual):
if not isinstance(actual, list):
actual = [actual]
return sorted(self.expected) == sorted(actual)
def display_differences(self, message, label, actual):
display_lines(message, self.expected, actual, label + ' (unordered)', label)
display_lines_diff(self.expected, actual, label + ' (unordered)', label)
class UnorderedRegexListOutput(ExpectedOutput):
"""Matches an unordered list of regular expressions.
The expressions must match all the actual lines, one-to-one, in any
order.
Note: This can give a false negative result (no match) when there is
an actual line that matches multiple expressions and a different
actual line that matches some but not all of those same
expressions. The implementation matches each expression in turn to
the first unmatched actual line that it can match, and does not try
all the permutations when there are multiple possible matches.
"""
def __init__(self, expected):
assert isinstance(expected, list)
ExpectedOutput.__init__(self, expected)
def matches(self, actual):
assert actual is not None
if not isinstance(actual, list):
actual = [actual]
if len(self.expected) != len(actual):
return False
for e in self.expected:
expect_re = re.compile(e)
for actual_line in actual:
if expect_re.match(actual_line):
actual.remove(actual_line)
break
else:
# One of the regexes was not found
return False
return True
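# Example of the documented false negative: with expressions ['a.', 'ab'] and
# actual lines ['ab', 'ax'], the expression 'a.' greedily consumes 'ab',
# leaving only 'ax' for 'ab', which fails -- so matches() returns False even
# though the pairing 'ab'<->'ab', 'a.'<->'ax' would have succeeded.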
def display_differences(self, message, label, actual):
display_lines(message, self.expected, actual,
label + ' (regexp) (unordered)', label)
class AlternateOutput(ExpectedOutput):
"""Matches any one of a list of ExpectedOutput instances.
"""
def __init__(self, expected, match_all=True):
"EXPECTED is a list of ExpectedOutput instances."
assert isinstance(expected, list) and expected != []
assert all(isinstance(e, ExpectedOutput) for e in expected)
ExpectedOutput.__init__(self, expected)
def matches(self, actual):
assert actual is not None
for e in self.expected:
if e.matches(actual):
return True
return False
def display_differences(self, message, label, actual):
# For now, just display differences against the first alternative.
e = self.expected[0]
e.display_differences(message, label, actual)
######################################################################
# Displaying expected and actual output
def display_trees(message, label, expected, actual):
'Print two trees, expected and actual.'
if message is not None:
logger.warn(message)
if expected is not None:
logger.warn('EXPECTED %s:', label)
svntest.tree.dump_tree(expected)
if actual is not None:
logger.warn('ACTUAL %s:', label)
svntest.tree.dump_tree(actual)
def display_lines_diff(expected, actual, expected_label, actual_label):
"""Print a unified diff between EXPECTED (labeled with EXPECTED_LABEL)
and ACTUAL (labeled with ACTUAL_LABEL).
Each of EXPECTED and ACTUAL is a string or a list of strings.
"""
if not isinstance(expected, list):
expected = [expected]
if not isinstance(actual, list):
actual = [actual]
logger.warn('DIFF ' + expected_label + ':')
for x in unified_diff(expected, actual,
fromfile='EXPECTED ' + expected_label,
tofile='ACTUAL ' + actual_label):
logger.warn('| ' + x.rstrip())
def display_lines(message, expected, actual,
expected_label, actual_label=None):
"""Print MESSAGE, unless it is None, then print EXPECTED (labeled
with EXPECTED_LABEL) followed by ACTUAL (labeled with ACTUAL_LABEL).
Each of EXPECTED and ACTUAL is a string or a list of strings.
"""
if message is not None:
logger.warn(message)
if type(expected) is str:
expected = [expected]
if type(actual) is str:
actual = [actual]
if actual_label is None:
actual_label = expected_label
if expected is not None:
logger.warn('EXPECTED %s:', expected_label)
for x in expected:
logger.warn('| ' + x.rstrip())
if actual is not None:
logger.warn('ACTUAL %s:', actual_label)
for x in actual:
logger.warn('| ' + x.rstrip())
def compare_and_display_lines(message, label, expected, actual,
raisable=None):
"""Compare two sets of output lines, and print them if they differ,
preceded by MESSAGE iff not None. EXPECTED may be an instance of
ExpectedOutput (and if not, it is wrapped as such). ACTUAL may be a
list of newline-terminated lines, or a single string. RAISABLE is an
exception class, an instance of which is thrown if ACTUAL doesn't
match EXPECTED."""
if raisable is None:
raisable = svntest.main.SVNLineUnequal
### It'd be nicer to use createExpectedOutput() here, but its
### semantics don't match all current consumers of this function.
assert expected is not None
assert actual is not None
if not isinstance(expected, ExpectedOutput):
expected = ExpectedOutput(expected)
actual = svntest.main.ensure_list(actual)
if len(actual) > 0:
is_binary = not isinstance(actual[0], str)
actual = svntest.main.filter_dbg(actual, is_binary)
if not expected.matches(actual):
expected.display_differences(message, label, actual)
raise raisable
def verify_outputs(message, actual_stdout, actual_stderr,
expected_stdout, expected_stderr, all_stdout=True):
"""Compare and display expected vs. actual stderr and stdout lines:
if they don't match, print the difference (preceded by MESSAGE iff
not None) and raise an exception.
If EXPECTED_STDERR or EXPECTED_STDOUT is a string the string is
interpreted as a regular expression. For EXPECTED_STDOUT and
ACTUAL_STDOUT to match, every line in ACTUAL_STDOUT must match the
EXPECTED_STDOUT regex, unless ALL_STDOUT is false. For
EXPECTED_STDERR regexes only one line in ACTUAL_STDERR need match."""
expected_stderr = createExpectedOutput(expected_stderr, 'stderr', False)
expected_stdout = createExpectedOutput(expected_stdout, 'stdout', all_stdout)
for (actual, expected, label, raisable) in (
(actual_stderr, expected_stderr, 'STDERR', SVNExpectedStderr),
(actual_stdout, expected_stdout, 'STDOUT', SVNExpectedStdout)):
if expected is None:
continue
if isinstance(expected, RegexOutput):
raisable = svntest.main.SVNUnmatchedError
elif not isinstance(expected, AnyOutput):
raisable = svntest.main.SVNLineUnequal
compare_and_display_lines(message, label, expected, actual, raisable)
def verify_exit_code(message, actual, expected,
raisable=SVNUnexpectedExitCode):
"""Compare and display expected vs. actual exit codes:
if they don't match, print the difference (preceded by MESSAGE iff
not None) and raise an exception."""
if expected != actual:
display_lines(message, str(expected), str(actual), "Exit Code")
raise raisable
# A simple dump file parser. While sufficient for the current
# testsuite it doesn't cope with all valid dump files.
class DumpParser:
def __init__(self, lines):
self.current = 0
self.lines = lines
self.parsed = {}
def parse_line(self, regex, required=True):
m = re.match(regex, self.lines[self.current])
if not m:
if required:
raise SVNDumpParseError("expected '%s' at line %d\n%s"
"\nPrevious lines:\n%s"
% (regex, self.current,
self.lines[self.current],
''.join(self.lines[max(0,self.current - 10):self.current])))
else:
return None
self.current += 1
return m.group(1)
def parse_blank(self, required=True):
if self.lines[self.current] != b'\n': # Works on Windows
if required:
raise SVNDumpParseError("expected blank at line %d\n%s"
% (self.current, self.lines[self.current]))
else:
return False
self.current += 1
return True
def parse_header(self):
regex = b'([^:]*): (.*)$'
m = re.match(regex, self.lines[self.current])
if not m:
raise SVNDumpParseError("expected a header at line %d, but found:\n%s"
% (self.current, self.lines[self.current]))
self.current += 1
return m.groups()
def parse_headers(self):
headers = []
while self.lines[self.current] != b'\n':
key, val = self.parse_header()
headers.append((key, val))
return headers
def parse_boolean(self, header, required):
return self.parse_line(header + b': (false|true)$', required)
def parse_format(self):
return self.parse_line(b'SVN-fs-dump-format-version: ([0-9]+)$')
def parse_uuid(self):
return self.parse_line(b'UUID: ([0-9a-z-]+)$')
def parse_revision(self):
return self.parse_line(b'Revision-number: ([0-9]+)$')
def parse_prop_delta(self):
return self.parse_line(b'Prop-delta: (false|true)$', required=False)
def parse_prop_length(self, required=True):
return self.parse_line(b'Prop-content-length: ([0-9]+)$', required)
def parse_content_length(self, required=True):
return self.parse_line(b'Content-length: ([0-9]+)$', required)
def parse_path(self):
path = self.parse_line(b'Node-path: (.*)$', required=False)
return path
def parse_kind(self):
return self.parse_line(b'Node-kind: (.+)$', required=False)
def parse_action(self):
return self.parse_line(b'Node-action: ([0-9a-z-]+)$')
def parse_copyfrom_rev(self):
return self.parse_line(b'Node-copyfrom-rev: ([0-9]+)$', required=False)
def parse_copyfrom_path(self):
path = self.parse_line(b'Node-copyfrom-path: (.+)$', required=False)
if not path and self.lines[self.current] == b'Node-copyfrom-path: \n':
self.current += 1
path = b''
return path
def parse_copy_md5(self):
return self.parse_line(b'Text-copy-source-md5: ([0-9a-z]+)$', required=False)
def parse_copy_sha1(self):
return self.parse_line(b'Text-copy-source-sha1: ([0-9a-z]+)$', required=False)
def parse_text_md5(self):
return self.parse_line(b'Text-content-md5: ([0-9a-z]+)$', required=False)
def parse_text_sha1(self):
return self.parse_line(b'Text-content-sha1: ([0-9a-z]+)$', required=False)
def parse_text_delta(self):
return self.parse_line(b'Text-delta: (false|true)$', required=False)
def parse_text_delta_base_md5(self):
return self.parse_line(b'Text-delta-base-md5: ([0-9a-f]+)$', required=False)
def parse_text_delta_base_sha1(self):
return self.parse_line(b'Text-delta-base-sha1: ([0-9a-f]+)$', required=False)
def parse_text_length(self):
return self.parse_line(b'Text-content-length: ([0-9]+)$', required=False)
def get_props(self):
props = []
while not re.match(b'PROPS-END$', self.lines[self.current]):
props.append(self.lines[self.current])
self.current += 1
self.current += 1
# Split into key/value pairs to do an unordered comparison.
# This parses the serialized hash under the assumption that it is valid.
prophash = {}
curprop = [0]
while curprop[0] < len(props):
def read_key_or_value(curprop):
# klen / vlen
klen = int(props[curprop[0]].split()[1])
curprop[0] += 1
# key / value
key = b''
while len(key) != klen + 1:
key += props[curprop[0]]
curprop[0] += 1
key = key[:-1]
return key
if props[curprop[0]].startswith(b'K'):
key = read_key_or_value(curprop)
value = read_key_or_value(curprop)
elif props[curprop[0]].startswith(b'D'):
key = read_key_or_value(curprop)
value = None
else:
raise SVNDumpParseError("unexpected property line: %s" % props[curprop[0]])
prophash[key] = value
return prophash
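# The serialized hash handled above uses svn's K/V (and D for deleted) wire
# format; for instance the property block
#
#   K 7
#   svn:log
#   V 3
#   msg
#   D 10
#   svn:ignore
#   PROPS-END
#
# (each line newline-terminated) parses to
# {b'svn:log': b'msg', b'svn:ignore': None}.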
def get_content(self, length):
content = b''
while len(content) < length:
content += self.lines[self.current]
self.current += 1
if len(content) == length + 1:
content = content[:-1]
elif len(content) != length:
raise SVNDumpParseError("content length expected %d actual %d at line %d"
% (length, len(content), self.current))
return content
def parse_one_node(self):
node = {}
# optional 'kind' and required 'action' must be next
node['kind'] = self.parse_kind()
action = self.parse_action()
# read any remaining headers
headers_list = self.parse_headers()
headers = dict(headers_list)
# Content-length must be last, if present
if b'Content-length' in headers and headers_list[-1][0] != b'Content-length':
raise SVNDumpParseError("'Content-length' header is not last, "
"in header block ending at line %d"
% (self.current,))
# parse the remaining optional headers and store in specific keys in NODE
for key, header, regex in [
('copyfrom_rev', b'Node-copyfrom-rev', b'([0-9]+)$'),
('copyfrom_path', b'Node-copyfrom-path', b'(.*)$'),
('copy_md5', b'Text-copy-source-md5', b'([0-9a-z]+)$'),
('copy_sha1', b'Text-copy-source-sha1',b'([0-9a-z]+)$'),
('prop_length', b'Prop-content-length', b'([0-9]+)$'),
('text_length', b'Text-content-length', b'([0-9]+)$'),
('text_md5', b'Text-content-md5', b'([0-9a-z]+)$'),
('text_sha1', b'Text-content-sha1', b'([0-9a-z]+)$'),
('content_length', b'Content-length', b'([0-9]+)$'),
]:
if header not in headers:
node[key] = None
continue
m = re.match(regex, headers[header])
if not m:
raise SVNDumpParseError("expected '%s' at line %d\n%s"
% (regex, self.current,
self.lines[self.current]))
node[key] = m.group(1)
self.parse_blank()
if node['prop_length']:
node['props'] = self.get_props()
if node['text_length']:
node['content'] = self.get_content(int(node['text_length']))
# Hard to determine how many blank lines are 'correct' (a delete
# followed by an add that is a replace-with-copy has one fewer
# than expected, but that can't be predicted until seeing the add),
# so allow an arbitrary number.
blanks = 0
while self.current < len(self.lines) and self.parse_blank(required=False):
blanks += 1
node['blanks'] = blanks
return action, node
def parse_all_nodes(self):
nodes = {}
while True:
if self.current >= len(self.lines):
break
path = self.parse_path()
if path is None:
break
if not nodes.get(path):
nodes[path] = {}
action, node = self.parse_one_node()
if nodes[path].get(action):
raise SVNDumpParseError("duplicate action '%s' for node '%s' at line %d"
% (action, path, self.current))
nodes[path][action] = node
return nodes
def parse_one_revision(self):
revision = {}
number = self.parse_revision()
revision['prop_length'] = self.parse_prop_length()
revision['content_length'] = self.parse_content_length()
self.parse_blank()
revision['props'] = self.get_props()
self.parse_blank()
revision['nodes'] = self.parse_all_nodes()
return number, revision
def parse_all_revisions(self):
while self.current < len(self.lines):
number, revision = self.parse_one_revision()
if self.parsed.get(number):
raise SVNDumpParseError("duplicate revision %d at line %d"
% (number, self.current))
self.parsed[number] = revision
def parse(self):
self.parsed['format'] = self.parse_format()
self.parse_blank()
self.parsed['uuid'] = self.parse_uuid()
self.parse_blank()
self.parse_all_revisions()
return self.parsed
def compare_dump_files(message, label, expected, actual,
ignore_uuid=False,
expect_content_length_always=False,
ignore_empty_prop_sections=False,
ignore_number_of_blank_lines=False):
"""Parse two dump files EXPECTED and ACTUAL, both of which are lists
of lines as returned by run_and_verify_dump, and check that the same
revisions, nodes, properties, etc. are present in both dumps.
"""
parsed_expected = DumpParser(expected).parse()
parsed_actual = DumpParser(actual).parse()
if ignore_uuid:
parsed_expected['uuid'] = '<ignored>'
parsed_actual['uuid'] = '<ignored>'
for parsed in [parsed_expected, parsed_actual]:
for rev_name, rev_record in parsed.items():
#print "Found %s" % (rev_name,)
if b'nodes' in rev_record:
#print "Found %s.%s" % (rev_name, 'nodes')
for path_name, path_record in rev_record['nodes'].items():
#print "Found %s.%s.%s" % (rev_name, 'nodes', path_name)
for action_name, action_record in path_record.items():
#print "Found %s.%s.%s.%s" % (rev_name, 'nodes', path_name, action_name)
if expect_content_length_always:
if action_record.get('content_length') is None:
# print('Adding: %s.%s.%s.%s.%s' % (rev_name, 'nodes', path_name, action_name, 'content_length=0'))
action_record['content_length'] = '0'
if ignore_empty_prop_sections:
if action_record.get('prop_length') == '10':
# print('Removing: %s.%s.%s.%s.%s' % (rev_name, 'nodes', path_name, action_name, 'prop_length'))
action_record['prop_length'] = None
del action_record['props']
old_content_length = int(action_record['content_length'])
action_record['content_length'] = str(old_content_length - 10)
if ignore_number_of_blank_lines:
action_record['blanks'] = 0
if parsed_expected != parsed_actual:
print('DIFF of raw dumpfiles (including expected differences)')
print(''.join(ndiff(expected, actual)))
raise svntest.Failure('DIFF of parsed dumpfiles (ignoring expected differences)\n'
+ '\n'.join(ndiff(
pprint.pformat(parsed_expected).splitlines(),
pprint.pformat(parsed_actual).splitlines())))
##########################################################################################
## diff verifications
def is_absolute_url(target):
return (target.startswith('file://')
or target.startswith('http://')
or target.startswith('https://')
or target.startswith('svn://')
or target.startswith('svn+ssh://'))
def make_diff_header(path, old_tag, new_tag, src_label=None, dst_label=None):
"""Generate the expected diff header for file PATH, with its old and new
versions described in parentheses by OLD_TAG and NEW_TAG. SRC_LABEL and
DST_LABEL are paths or urls that are added to the diff labels if we're
diffing against the repository or diffing two arbitrary paths.
Return the header as an array of newline-terminated strings."""
if src_label:
src_label = src_label.replace('\\', '/')
if not is_absolute_url(src_label):
src_label = '.../' + src_label
src_label = '\t(' + src_label + ')'
else:
src_label = ''
if dst_label:
dst_label = dst_label.replace('\\', '/')
if not is_absolute_url(dst_label):
dst_label = '.../' + dst_label
dst_label = '\t(' + dst_label + ')'
else:
dst_label = ''
path_as_shown = path.replace('\\', '/')
return [
"Index: " + path_as_shown + "\n",
"===================================================================\n",
"--- " + path_as_shown + src_label + "\t(" + old_tag + ")\n",
"+++ " + path_as_shown + dst_label + "\t(" + new_tag + ")\n",
]
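# For instance, make_diff_header('A/mu', 'revision 1', 'working copy')
# returns (with a TAB before each parenthesized tag):
#
#   Index: A/mu
#   ===================================================================
#   --- A/mu    (revision 1)
#   +++ A/mu    (working copy)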
def make_no_diff_deleted_header(path, old_tag, new_tag):
"""Generate the expected diff header for a deleted file PATH when in
'no-diff-deleted' mode. (In that mode, no further details appear after the
header.) Return the header as an array of newline-terminated strings."""
path_as_shown = path.replace('\\', '/')
return [
"Index: " + path_as_shown + " (deleted)\n",
"===================================================================\n",
]
def make_git_diff_header(target_path, repos_relpath,
old_tag, new_tag, add=False, src_label=None,
dst_label=None, delete=False, text_changes=True,
cp=False, mv=False, copyfrom_path=None,
copyfrom_rev=None):
""" Generate the expected 'git diff' header for file TARGET_PATH.
REPOS_RELPATH is the location of the path relative to the repository root.
The old and new versions ("revision X", or "working copy") must be
specified in OLD_TAG and NEW_TAG.
SRC_LABEL and DST_LABEL are paths or urls that are added to the diff
labels if we're diffing against the repository. ADD, DELETE, CP and MV
denotes the operations performed on the file. COPYFROM_PATH is the source
of a copy or move. Return the header as an array of newline-terminated
strings."""
path_as_shown = target_path.replace('\\', '/')
if src_label:
src_label = src_label.replace('\\', '/')
src_label = '\t(.../' + src_label + ')'
else:
src_label = ''
if dst_label:
dst_label = dst_label.replace('\\', '/')
dst_label = '\t(.../' + dst_label + ')'
else:
dst_label = ''
output = [
"Index: " + path_as_shown + "\n",
"===================================================================\n"
]
if add:
output.extend([
"diff --git a/" + repos_relpath + " b/" + repos_relpath + "\n",
"new file mode 100644\n",
])
if text_changes:
output.extend([
"--- /dev/null\t(" + old_tag + ")\n",
"+++ b/" + repos_relpath + dst_label + "\t(" + new_tag + ")\n"
])
elif delete:
output.extend([
"diff --git a/" + repos_relpath + " b/" + repos_relpath + "\n",
"deleted file mode 100644\n",
])
if text_changes:
output.extend([
"--- a/" + repos_relpath + src_label + "\t(" + old_tag + ")\n",
"+++ /dev/null\t(" + new_tag + ")\n"
])
elif cp:
if copyfrom_rev:
copyfrom_rev = '@' + copyfrom_rev
else:
copyfrom_rev = ''
output.extend([
"diff --git a/" + copyfrom_path + " b/" + repos_relpath + "\n",
"copy from " + copyfrom_path + copyfrom_rev + "\n",
"copy to " + repos_relpath + "\n",
])
if text_changes:
output.extend([
"--- a/" + copyfrom_path + src_label + "\t(" + old_tag + ")\n",
"+++ b/" + repos_relpath + "\t(" + new_tag + ")\n"
])
elif mv:
output.extend([
"diff --git a/" + copyfrom_path + " b/" + path_as_shown + "\n",
"rename from " + copyfrom_path + "\n",
"rename to " + repos_relpath + "\n",
])
if text_changes:
output.extend([
"--- a/" + copyfrom_path + src_label + "\t(" + old_tag + ")\n",
"+++ b/" + repos_relpath + "\t(" + new_tag + ")\n"
])
else:
output.extend([
"diff --git a/" + repos_relpath + " b/" + repos_relpath + "\n",
"--- a/" + repos_relpath + src_label + "\t(" + old_tag + ")\n",
"+++ b/" + repos_relpath + dst_label + "\t(" + new_tag + ")\n",
])
return output
def make_diff_prop_header(path):
"""Return a property diff sub-header, as a list of newline-terminated
strings."""
return [
"\n",
"Property changes on: " + path.replace('\\', '/') + "\n",
"___________________________________________________________________\n"
]
def make_diff_prop_val(plus_minus, pval):
"Return diff for prop value PVAL, with leading PLUS_MINUS (+ or -)."
if len(pval) > 0 and pval[-1] != '\n':
return [plus_minus + pval + "\n","\\ No newline at end of property\n"]
return [plus_minus + pval]
def make_diff_prop_deleted(pname, pval):
"""Return a property diff for deletion of property PNAME, old value PVAL.
PVAL is a single string with no embedded newlines. Return the result
as a list of newline-terminated strings."""
return [
"Deleted: " + pname + "\n",
"## -1 +0,0 ##\n"
] + make_diff_prop_val("-", pval)
def make_diff_prop_added(pname, pval):
"""Return a property diff for addition of property PNAME, new value PVAL.
PVAL is a single string with no embedded newlines. Return the result
as a list of newline-terminated strings."""
return [
"Added: " + pname + "\n",
"## -0,0 +1 ##\n",
] + make_diff_prop_val("+", pval)
def make_diff_prop_modified(pname, pval1, pval2):
"""Return a property diff for modification of property PNAME, old value
PVAL1, new value PVAL2.
PVAL is a single string with no embedded newlines. A newline at the
end is significant: without it, we add an extra line saying '\ No
newline at end of property'.
Return the result as a list of newline-terminated strings.
"""
return [
"Modified: " + pname + "\n",
"## -1 +1 ##\n",
] + make_diff_prop_val("-", pval1) + make_diff_prop_val("+", pval2)
|
apache-2.0
|
lokirius/python-for-android
|
python-build/python-libs/gdata/build/lib/gdata/tlslite/utils/cipherfactory.py
|
357
|
3177
|
"""Factory functions for symmetric cryptography."""
import os
import Python_AES
import Python_RC4
import cryptomath
tripleDESPresent = False
if cryptomath.m2cryptoLoaded:
import OpenSSL_AES
import OpenSSL_RC4
import OpenSSL_TripleDES
tripleDESPresent = True
if cryptomath.cryptlibpyLoaded:
import Cryptlib_AES
import Cryptlib_RC4
import Cryptlib_TripleDES
tripleDESPresent = True
if cryptomath.pycryptoLoaded:
import PyCrypto_AES
import PyCrypto_RC4
import PyCrypto_TripleDES
tripleDESPresent = True
# **************************************************************************
# Factory Functions for AES
# **************************************************************************
def createAES(key, IV, implList=None):
"""Create a new AES object.
@type key: str
@param key: A 16, 24, or 32 byte string.
@type IV: str
@param IV: A 16 byte string
@rtype: L{tlslite.utils.AES}
@return: An AES object.
"""
if implList is None:
implList = ["cryptlib", "openssl", "pycrypto", "python"]
for impl in implList:
if impl == "cryptlib" and cryptomath.cryptlibpyLoaded:
return Cryptlib_AES.new(key, 2, IV)
elif impl == "openssl" and cryptomath.m2cryptoLoaded:
return OpenSSL_AES.new(key, 2, IV)
elif impl == "pycrypto" and cryptomath.pycryptoLoaded:
return PyCrypto_AES.new(key, 2, IV)
elif impl == "python":
return Python_AES.new(key, 2, IV)
raise NotImplementedError()
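# implList can be used to force a particular backend; e.g.
# createAES(key, IV, implList=["python"]) always returns the pure-Python
# implementation, while the default order prefers cryptlib, then
# OpenSSL/M2Crypto, then PyCrypto, before falling back to Python_AES. The
# constant mode argument 2 passed to each backend is the CBC mode selector
# (matching PyCrypto's MODE_CBC).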
def createRC4(key, IV, implList=None):
"""Create a new RC4 object.
@type key: str
@param key: A 16 to 32 byte string.
@type IV: object
@param IV: Ignored, whatever it is.
@rtype: L{tlslite.utils.RC4}
@return: An RC4 object.
"""
if implList is None:
implList = ["cryptlib", "openssl", "pycrypto", "python"]
if len(IV) != 0:
raise AssertionError()
for impl in implList:
if impl == "cryptlib" and cryptomath.cryptlibpyLoaded:
return Cryptlib_RC4.new(key)
elif impl == "openssl" and cryptomath.m2cryptoLoaded:
return OpenSSL_RC4.new(key)
elif impl == "pycrypto" and cryptomath.pycryptoLoaded:
return PyCrypto_RC4.new(key)
elif impl == "python":
return Python_RC4.new(key)
raise NotImplementedError()
#Create a new TripleDES instance
def createTripleDES(key, IV, implList=None):
"""Create a new 3DES object.
@type key: str
@param key: A 24 byte string.
@type IV: str
@param IV: An 8 byte string
@rtype: L{tlslite.utils.TripleDES}
@return: A 3DES object.
"""
if implList is None:
implList = ["cryptlib", "openssl", "pycrypto"]
for impl in implList:
if impl == "cryptlib" and cryptomath.cryptlibpyLoaded:
return Cryptlib_TripleDES.new(key, 2, IV)
elif impl == "openssl" and cryptomath.m2cryptoLoaded:
return OpenSSL_TripleDES.new(key, 2, IV)
elif impl == "pycrypto" and cryptomath.pycryptoLoaded:
return PyCrypto_TripleDES.new(key, 2, IV)
raise NotImplementedError()
|
apache-2.0
|
BT-astauder/odoo
|
openerp/tools/cache.py
|
100
|
5907
|
# -*- coding: utf-8 -*-
##############################################################################
#
# OpenERP, Open Source Management Solution
# Copyright (C) 2013 OpenERP (<http://www.openerp.com>).
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
##############################################################################
# decorator makes wrappers that have the same API as their wrapped function;
# this is important for the openerp.api.guess() that relies on signatures
from decorator import decorator
from inspect import getargspec
import lru
import logging
logger = logging.getLogger(__name__)
class ormcache(object):
""" LRU cache decorator for orm methods. """
def __init__(self, skiparg=2, size=8192, multi=None, timeout=None):
self.skiparg = skiparg
self.size = size
self.stat_miss = 0
self.stat_hit = 0
self.stat_err = 0
def __call__(self, method):
self.method = method
lookup = decorator(self.lookup, method)
lookup.clear_cache = self.clear
return lookup
def stat(self):
return "lookup-stats hit=%s miss=%s err=%s ratio=%.1f" % \
(self.stat_hit, self.stat_miss, self.stat_err,
(100*float(self.stat_hit))/(self.stat_miss+self.stat_hit))
def lru(self, model):
ormcache = model._ormcache
try:
d = ormcache[self.method]
except KeyError:
d = ormcache[self.method] = lru.LRU(self.size)
return d
def lookup(self, method, *args, **kwargs):
d = self.lru(args[0])
key = args[self.skiparg:]
try:
r = d[key]
self.stat_hit += 1
return r
except KeyError:
self.stat_miss += 1
value = d[key] = self.method(*args, **kwargs)
return value
except TypeError:
self.stat_err += 1
return self.method(*args, **kwargs)
def clear(self, model, *args):
""" Remove *args entry from the cache or all keys if *args is undefined """
d = self.lru(model)
if args:
logger.warn("ormcache.clear arguments are deprecated and ignored "
"(while clearing caches on (%s).%s)",
model._name, self.method.__name__)
d.clear()
model.pool._any_cache_cleared = True
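# A minimal usage sketch (hypothetical model method; names below are
# illustrative only). The cache lives per-method on model._ormcache and is
# keyed by the positional arguments after `skiparg`:
#
#   class res_partner(osv.osv):
#       @ormcache(skiparg=2)   # skip self and cr in the cache key
#       def _compute_something(self, cr, uid, arg):
#           ...
#
# Invalidate later with: self._compute_something.clear_cache(self)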
class ormcache_context(ormcache):
def __init__(self, skiparg=2, size=8192, accepted_keys=()):
super(ormcache_context,self).__init__(skiparg,size)
self.accepted_keys = accepted_keys
def __call__(self, method):
# remember which argument is context
args = getargspec(method)[0]
self.context_pos = args.index('context')
return super(ormcache_context, self).__call__(method)
def lookup(self, method, *args, **kwargs):
d = self.lru(args[0])
# Note. The decorator() wrapper (used in __call__ above) will resolve
# arguments, and pass them positionally to lookup(). This is why context
# is not passed through kwargs!
if self.context_pos < len(args):
context = args[self.context_pos]
else:
context = kwargs.get('context') or {}
ckey = [(k, context[k]) for k in self.accepted_keys if k in context]
# Beware: do not take the context from args!
key = args[self.skiparg:self.context_pos] + tuple(ckey)
try:
r = d[key]
self.stat_hit += 1
return r
except KeyError:
self.stat_miss += 1
value = d[key] = self.method(*args, **kwargs)
return value
except TypeError:
self.stat_err += 1
return self.method(*args, **kwargs)
class ormcache_multi(ormcache):
def __init__(self, skiparg=2, size=8192, multi=3):
assert skiparg <= multi
super(ormcache_multi, self).__init__(skiparg, size)
self.multi = multi
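# `multi` is the position of the ids list in the call: lookup() below caches
# one entry per id, keyed by (the other cached args..., id), and forwards
# only the ids that miss to the real method in a single batched call.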
def lookup(self, method, *args, **kwargs):
d = self.lru(args[0])
base_key = args[self.skiparg:self.multi] + args[self.multi+1:]
ids = args[self.multi]
result = {}
missed = []
# first take what is available in the cache
for i in ids:
key = base_key + (i,)
try:
result[i] = d[key]
self.stat_hit += 1
except Exception:
self.stat_miss += 1
missed.append(i)
if missed:
# call the method for the ids that were not in the cache
args = list(args)
args[self.multi] = missed
result.update(method(*args, **kwargs))
# store those new results back in the cache
for i in missed:
key = base_key + (i,)
d[key] = result[i]
return result
class dummy_cache(object):
""" Cache decorator replacement to actually do no caching. """
def __init__(self, *l, **kw):
pass
def __call__(self, fn):
fn.clear_cache = self.clear
return fn
def clear(self, *l, **kw):
pass
# For backward compatibility
cache = ormcache
# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
|
agpl-3.0
|
GeoscienceAustralia/Geodesy-Web-Services
|
aws/amazonia/test/unit_tests/test_stack.py
|
2
|
27223
|
from amazonia.classes.api_gateway_config import ApiGatewayMethodConfig
from amazonia.classes.api_gateway_config import ApiGatewayResponseConfig, ApiGatewayRequestConfig
from amazonia.classes.asg_config import AsgConfig
from amazonia.classes.block_devices_config import BlockDevicesConfig
from amazonia.classes.cf_distribution_config import CFDistributionConfig, CFOriginsConfig, CFCacheBehaviorConfig
from amazonia.classes.database_config import DatabaseConfig
from amazonia.classes.elb_config import ElbConfig, ElbListenersConfig
from amazonia.classes.lambda_config import LambdaConfig
from amazonia.classes.stack import Stack, DuplicateUnitNameError
from amazonia.classes.util import get_cf_friendly_name
from nose.tools import *
from troposphere import Tags, Ref
userdata = keypair = instance_type = code_deploy_service_role = vpc_cidr = public_cidr = \
minsize = maxsize = elb_health_check = nat_image_id = jump_image_id = unit_image_id = health_check_grace_period = \
health_check_type = db_instance_type = db_engine = db_port = db_hdd_size = owner_emails = \
db_backup_window = db_backup_retention = db_maintenance_window = db_storage_type = block_devices_config = \
elb_listeners_config = healthy_threshold = unhealthy_threshold = interval = timeout = sticky_app_cookie = None
availability_zones = []
home_cidrs = []
instance_port = []
loadbalancer_port = []
instance_protocol = []
loadbalancer_protocol = []
def setup_resources():
global userdata, availability_zones, keypair, instance_type, code_deploy_service_role, vpc_cidr, \
public_cidr, instance_port, loadbalancer_port, instance_protocol, loadbalancer_protocol, minsize, maxsize, \
elb_health_check, home_cidrs, nat_image_id, jump_image_id, health_check_grace_period, health_check_type, \
unit_image_id, db_instance_type, db_engine, db_port, db_hdd_size, owner_emails, \
db_backup_window, db_backup_retention, db_maintenance_window, db_storage_type, block_devices_config, \
elb_listeners_config, healthy_threshold, unhealthy_threshold, interval, timeout, sticky_app_cookie
userdata = """#cloud-config
repo_update: true
repo_upgrade: all
packages:
- httpd
runcmd:
- service httpd start
"""
availability_zones = ['ap-southeast-2a', 'ap-southeast-2b', 'ap-southeast-2c']
keypair = 'INSERT_YOUR_KEYPAIR_HERE'
nat_image_id = 'ami-53371f30'
jump_image_id = 'ami-dc361ebf'
unit_image_id = 'ami-dc361ebf'
instance_type = 't2.nano'
code_deploy_service_role = 'arn:aws:iam::1234567890124:role/CodeDeployServiceRole'
vpc_cidr = {'name': 'VPC', 'cidr': '10.0.0.0/16'}
home_cidrs = [{'name': 'GA', 'cidr': '123.123.12.34/32'}, {'name': 'home', 'cidr': '192.168.0.1/16'}]
instance_port = ['80']
loadbalancer_port = ['80']
instance_protocol = ['HTTP']
loadbalancer_protocol = ['HTTP']
minsize = 1
maxsize = 1
elb_health_check = 'HTTP:80/index.html'
healthy_threshold = 10
unhealthy_threshold = 2
interval = 300
timeout = 30
sticky_app_cookie = 'JSESSION'
public_cidr = {'name': 'PublicIp', 'cidr': '0.0.0.0/0'}
health_check_grace_period = 300
health_check_type = 'ELB'
owner_emails = ['[email protected]']
db_instance_type = 'db.m1.small'
db_engine = 'postgres'
db_port = '5432'
db_hdd_size = 5
db_backup_window = '17:00-17:30'
db_backup_retention = '4'
db_maintenance_window = 'Mon:01:00-Mon:01:30'
db_storage_type = 'gp2'
block_devices_config = [BlockDevicesConfig(
device_name='/dev/xvda',
ebs_volume_size='15',
ebs_volume_type='gp2',
ebs_encrypted=False,
ebs_snapshot_id=None,
virtual_name=False), BlockDevicesConfig(
device_name='/dev/sda2',
ebs_volume_size='',
ebs_volume_type='',
ebs_encrypted=False,
ebs_snapshot_id='',
virtual_name=True
)]
elb_listeners_config = [
ElbListenersConfig(
instance_port='80',
loadbalancer_port='80',
loadbalancer_protocol='HTTP',
instance_protocol='HTTP',
sticky_app_cookie=sticky_app_cookie
)]
@with_setup(setup_resources)
def test_stack():
""" Test stack structure
"""
stack = create_stack()
assert_equals(stack.code_deploy_service_role, code_deploy_service_role)
assert_equals(stack.keypair, keypair)
assert_equals(stack.availability_zones, availability_zones)
assert_equals(stack.vpc_cidr, vpc_cidr)
[assert_equals(stack.home_cidrs[num], home_cidrs[num]) for num in range(len(home_cidrs))]
assert_equals(stack.public_cidr, {'name': 'PublicIp', 'cidr': '0.0.0.0/0'})
assert_equals(stack.internet_gateway.title, 'Ig')
assert_is(type(stack.internet_gateway.Tags), Tags)
assert_equals(stack.gateway_attachment.title, 'IgAtch')
assert_is(type(stack.gateway_attachment.VpcId), Ref)
assert_is(type(stack.gateway_attachment.InternetGatewayId), Ref)
assert_equals(stack.public_route_table.title, 'PubRouteTable')
assert_is(type(stack.public_route_table.VpcId), Ref)
assert_is(type(stack.public_route_table.Tags), Tags)
for az in availability_zones:
assert_equals(stack.private_route_tables[az].title, get_cf_friendly_name(az) + 'PriRouteTable')
assert_is(type(stack.private_route_tables[az].VpcId), Ref)
assert_is(type(stack.private_route_tables[az].Tags), Tags)
assert_equals(stack.nat.single.SourceDestCheck, 'false')
assert_equals(stack.jump.single.SourceDestCheck, 'true')
for num in range(len(availability_zones)):
# For public subnets
public_subnet = stack.public_subnets[num]
assert_equals(public_subnet.CidrBlock, ''.join(['10.0.', str(num), '.0/24']))
# For private subnets
private_subnet = stack.private_subnets[num]
assert_equals(private_subnet.CidrBlock, ''.join(['10.0.', str(num + 100), '.0/24']))
assert_equals(len(stack.units), 7)
@with_setup(setup_resources)
def test_highly_available_nat_stack():
""" Test for nat gateway configuration"""
stack = create_stack(nat_highly_available=True)
assert_equals(stack.code_deploy_service_role, code_deploy_service_role)
assert_equals(stack.keypair, keypair)
assert_equals(stack.availability_zones, availability_zones)
assert_equals(stack.vpc_cidr, vpc_cidr)
[assert_equals(stack.home_cidrs[num], home_cidrs[num]) for num in range(len(home_cidrs))]
assert_equals(stack.public_cidr, {'name': 'PublicIp', 'cidr': '0.0.0.0/0'})
assert_equals(stack.internet_gateway.title, 'Ig')
assert_is(type(stack.internet_gateway.Tags), Tags)
assert_equals(stack.gateway_attachment.title, 'IgAtch')
assert_is(type(stack.gateway_attachment.VpcId), Ref)
assert_is(type(stack.gateway_attachment.InternetGatewayId), Ref)
assert_equals(stack.public_route_table.title, 'PubRouteTable')
assert_is(type(stack.public_route_table.VpcId), Ref)
assert_is(type(stack.public_route_table.Tags), Tags)
for az in availability_zones:
assert_equals(stack.private_route_tables[az].title, get_cf_friendly_name(az) + 'PriRouteTable')
assert_is(type(stack.private_route_tables[az].VpcId), Ref)
assert_is(type(stack.private_route_tables[az].Tags), Tags)
assert_equals(len(stack.nat_gateways), len(availability_zones))
assert_equals(stack.jump.single.SourceDestCheck, 'true')
for num in range(len(availability_zones)):
# For public subnets
public_subnet = stack.public_subnets[num]
assert_equals(public_subnet.CidrBlock, ''.join(['10.0.', str(num), '.0/24']))
# For private subnets
private_subnet = stack.private_subnets[num]
assert_equals(private_subnet.CidrBlock, ''.join(['10.0.', str(num + 100), '.0/24']))
assert_equals(len(stack.units), 7)
def test_duplicate_unit_names():
""" Test for duplicate unit names
"""
assert_raises(DuplicateUnitNameError, Stack, **{
'code_deploy_service_role': code_deploy_service_role,
'keypair': keypair,
'availability_zones': availability_zones,
'vpc_cidr': vpc_cidr,
'public_cidr': public_cidr,
'home_cidrs': home_cidrs,
'jump_image_id': jump_image_id,
'jump_instance_type': instance_type,
'nat_image_id': nat_image_id,
'nat_instance_type': instance_type,
'public_hosted_zone_name': None,
'private_hosted_zone_name': 'private.lan.',
'iam_instance_profile_arn': None,
'owner_emails': owner_emails,
'nat_highly_available': False,
'ec2_scheduled_shutdown': False,
'autoscaling_units': [{'unit_title': 'app1',
'asg_config': AsgConfig(
minsize=minsize,
maxsize=maxsize,
image_id=unit_image_id,
instance_type=instance_type,
health_check_grace_period=health_check_grace_period,
health_check_type=health_check_type,
userdata=userdata,
iam_instance_profile_arn=None,
block_devices_config=block_devices_config,
simple_scaling_policy_config=None,
ec2_scheduled_shutdown=None
),
'elb_config': ElbConfig(
elb_listeners_config=elb_listeners_config,
elb_health_check=elb_health_check,
elb_log_bucket=None,
public_unit=True,
ssl_certificate_id=None,
healthy_threshold=healthy_threshold,
unhealthy_threshold=unhealthy_threshold,
interval=interval,
timeout=timeout
),
'dependencies': [],
},
{'unit_title': 'app1',
'elb_config': ElbConfig(
elb_listeners_config=elb_listeners_config,
elb_health_check=elb_health_check,
elb_log_bucket=None,
public_unit=True,
ssl_certificate_id=None,
healthy_threshold=healthy_threshold,
unhealthy_threshold=unhealthy_threshold,
interval=interval,
timeout=timeout
),
'asg_config': AsgConfig(
minsize=minsize,
maxsize=maxsize,
image_id=unit_image_id,
instance_type=instance_type,
health_check_grace_period=health_check_grace_period,
health_check_type=health_check_type,
userdata=userdata,
iam_instance_profile_arn=None,
block_devices_config=None,
simple_scaling_policy_config=None,
ec2_scheduled_shutdown=None
),
'dependencies': [],
}],
'database_units': [],
'zd_autoscaling_units': [],
'cf_distribution_units': [],
'api_gateway_units': [],
'lambda_units': []
})
def create_stack(nat_highly_available=False):
"""
Helper function to create a stack with default values
:return new stack
"""
global userdata, availability_zones, keypair, instance_type, code_deploy_service_role, vpc_cidr, \
public_cidr, instance_port, loadbalancer_port, instance_protocol, loadbalancer_protocol, minsize, maxsize, \
elb_health_check, home_cidrs, nat_image_id, jump_image_id, health_check_grace_period, health_check_type, \
unit_image_id, db_instance_type, db_engine, db_port, owner_emails, db_backup_window, \
db_backup_retention, db_maintenance_window, db_storage_type, block_devices_config, healthy_threshold, \
unhealthy_threshold, interval, timeout, elb_listeners_config, sticky_app_cookie
stack = Stack(
code_deploy_service_role=code_deploy_service_role,
keypair=keypair,
availability_zones=availability_zones,
vpc_cidr=vpc_cidr,
public_cidr=public_cidr,
home_cidrs=home_cidrs,
jump_image_id=jump_image_id,
jump_instance_type=instance_type,
nat_image_id=nat_image_id,
nat_instance_type=instance_type,
public_hosted_zone_name=None,
private_hosted_zone_name='private.lan.',
iam_instance_profile_arn=None,
owner_emails=owner_emails,
nat_highly_available=nat_highly_available,
ec2_scheduled_shutdown=False,
zd_autoscaling_units=[{'unit_title': 'zdapp1',
'elb_config': ElbConfig(
elb_listeners_config=elb_listeners_config,
elb_health_check=elb_health_check,
elb_log_bucket=None,
public_unit=True,
ssl_certificate_id=None,
healthy_threshold=healthy_threshold,
unhealthy_threshold=unhealthy_threshold,
interval=interval,
timeout=timeout
),
'blue_asg_config': AsgConfig(
minsize=minsize,
maxsize=maxsize,
image_id=unit_image_id,
instance_type=instance_type,
health_check_grace_period=health_check_grace_period,
health_check_type=health_check_type,
userdata=userdata,
iam_instance_profile_arn=None,
block_devices_config=block_devices_config,
simple_scaling_policy_config=None,
ec2_scheduled_shutdown=None
),
'green_asg_config': AsgConfig(
minsize=minsize,
maxsize=maxsize,
image_id=unit_image_id,
instance_type=instance_type,
health_check_grace_period=health_check_grace_period,
health_check_type=health_check_type,
userdata=userdata,
iam_instance_profile_arn=None,
block_devices_config=block_devices_config,
simple_scaling_policy_config=None,
ec2_scheduled_shutdown=None
),
'dependencies': ['app2:5432', 'db1:80'],
}],
autoscaling_units=[{'unit_title': 'app1',
'elb_config': ElbConfig(
elb_listeners_config=elb_listeners_config,
elb_health_check=elb_health_check,
elb_log_bucket=None,
public_unit=True,
ssl_certificate_id=None,
healthy_threshold=healthy_threshold,
unhealthy_threshold=unhealthy_threshold,
interval=interval,
timeout=timeout
),
'asg_config': AsgConfig(
minsize=minsize,
maxsize=maxsize,
image_id=unit_image_id,
instance_type=instance_type,
health_check_grace_period=health_check_grace_period,
health_check_type=health_check_type,
userdata=userdata,
iam_instance_profile_arn=None,
block_devices_config=block_devices_config,
simple_scaling_policy_config=None,
ec2_scheduled_shutdown=None
),
'dependencies': ['app2:80', 'db1:5432'],
},
{'unit_title': 'app2',
'elb_config': ElbConfig(
elb_listeners_config=elb_listeners_config,
elb_health_check=elb_health_check,
elb_log_bucket=None,
public_unit=True,
ssl_certificate_id=None,
healthy_threshold=healthy_threshold,
unhealthy_threshold=unhealthy_threshold,
interval=interval,
timeout=timeout
),
'asg_config': AsgConfig(
minsize=minsize,
maxsize=maxsize,
image_id=unit_image_id,
instance_type=instance_type,
health_check_grace_period=health_check_grace_period,
health_check_type=health_check_type,
userdata=userdata,
iam_instance_profile_arn=None,
block_devices_config=block_devices_config,
simple_scaling_policy_config=None,
ec2_scheduled_shutdown=None
),
'dependencies': []
}],
database_units=[{'unit_title': 'db1',
'database_config': DatabaseConfig(
db_instance_type=db_instance_type,
db_engine=db_engine,
db_port=db_port,
db_hdd_size=db_hdd_size,
db_snapshot_id=None,
db_name='MyDb',
db_backup_window=db_backup_window,
db_backup_retention=db_backup_retention,
db_maintenance_window=db_maintenance_window,
db_storage_type=db_storage_type
)
}
],
cf_distribution_units=[{'unit_title': 'cfdist1',
'cf_origins_config': [
CFOriginsConfig(
domain_name='amazonia-elb-bucket.s3.amazonaws.com',
origin_id='S3-amazonia-elb-bucket',
origin_path='',
custom_headers={
'Origin': 'http://www.domain.com',
'Accept': 'True'
},
origin_policy={
'is_s3': True,
'origin_access_identity': 'originaccessid1'
}
),
CFOriginsConfig(
domain_name='app1',
origin_id='www-elb',
origin_path='/path',
custom_headers={},
origin_policy={
'is_s3': False,
'origin_protocol_policy': 'https-only',
'http_port': 80,
'https_port': 443,
'origin_ssl_protocols': ['TLSv1', 'TLSv1.1', 'TLSv1.2'],
}
),
CFOriginsConfig(
domain_name='validYamlTestAPIGW',
origin_id='www-elb2',
origin_path='/path',
custom_headers={},
origin_policy={
'is_s3': False,
'origin_protocol_policy': 'https-only',
'http_port': 80,
'https_port': 443,
'origin_ssl_protocols': ['TLSv1', 'TLSv1.1', 'TLSv1.2'],
}
)
],
'cf_distribution_config': CFDistributionConfig(
aliases=['www.test-stack.gadevs.ga', 'test-stack.gadevs.ga'],
comment='SysTestCFDistribution',
default_root_object='index.html',
enabled=True,
price_class='PriceClass_All',
error_page_path='index.html',
acm_cert_arn='arn.acm.certificate',
minimum_protocol_version='TLSv1',
ssl_support_method='sni-only'
),
'cf_cache_behavior_config': [
CFCacheBehaviorConfig(
is_default=True,
path_pattern='/index.html',
allowed_methods=['GET', 'HEAD'],
cached_methods=['GET', 'HEAD'],
target_origin_id='S3-bucket-id',
forward_cookies='all',
forwarded_headers=['Accept', 'Set-Cookie'],
viewer_protocol_policy='allow-all',
min_ttl=0,
default_ttl=0,
max_ttl=0,
trusted_signers=['self'],
query_string='False'
),
CFCacheBehaviorConfig(
is_default=False,
path_pattern='/login.js',
allowed_methods=['GET', 'POST', 'HEAD', 'DELETE', 'OPTIONS', 'PATCH', 'PUT'],
cached_methods=['GET', 'HEAD'],
target_origin_id='www-origin',
forward_cookies='all',
forwarded_headers=['Accept', 'Set-Cookie'],
viewer_protocol_policy='https-only',
min_ttl=0,
default_ttl=0,
max_ttl=0,
trusted_signers=['self'],
query_string='True'
)
]
}],
api_gateway_units=[{'unit_title': 'validYamlTestAPIGW',
'method_config': [
ApiGatewayMethodConfig(
method_name='login',
lambda_unit='validYamlTestLambda',
httpmethod='POST',
authorizationtype='NONE',
request_config=ApiGatewayRequestConfig(
templates={'application/json': ''},
parameters={'somemapping': 'somefield'}
),
response_config=[
ApiGatewayResponseConfig(
templates={'application/json': ''},
parameters={'somemapping': 'somefield'},
statuscode='200',
models={'application/json': 'Empty'},
selectionpattern=''
)]
)
]
}],
lambda_units=[{'unit_title': 'validYamlTestLambda',
'dependencies': ['db1:5432'],
'lambda_config': LambdaConfig(
lambda_s3_bucket='bucket_name',
lambda_s3_key='key_name',
lambda_description='blah',
lambda_function_name='my_function',
lambda_handler='main',
lambda_memory_size=128,
lambda_role_arn='test_arn',
lambda_runtime='python2.7',
lambda_timeout=1,
lambda_schedule='cron(0/5 * * * ? *)'
)
}
]
)
return stack
|
bsd-3-clause
|
joopert/home-assistant
|
tests/components/fan/test_init.py
|
4
|
1116
|
"""Tests for fan platforms."""
import unittest
from homeassistant.components.fan import FanEntity
import pytest
class BaseFan(FanEntity):
"""Implementation of the abstract FanEntity."""
def __init__(self):
"""Initialize the fan."""
pass
class TestFanEntity(unittest.TestCase):
"""Test coverage for base fan entity class."""
def setUp(self):
"""Set up test data."""
self.fan = BaseFan()
def tearDown(self):
"""Tear down unit test data."""
self.fan = None
def test_fanentity(self):
"""Test fan entity methods."""
assert "off" == self.fan.state
assert 0 == len(self.fan.speed_list)
assert 0 == self.fan.supported_features
assert {"speed_list": []} == self.fan.state_attributes
# oscillate is optional on the base entity, so this call must not raise
self.fan.oscillate(True)
with pytest.raises(NotImplementedError):
self.fan.set_speed("slow")
with pytest.raises(NotImplementedError):
self.fan.turn_on()
with pytest.raises(NotImplementedError):
self.fan.turn_off()
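# A minimal sketch (not part of the original tests, names illustrative) of a
# concrete fan satisfying the interface exercised above: overriding set_speed
# removes the NotImplementedError asserted in test_fanentity.
class DemoFan(FanEntity):
    """Fan that records the requested speed instead of raising."""
    def __init__(self):
        """Initialize the demo fan."""
        self._speed = None
    def set_speed(self, speed: str) -> None:
        """Store the requested speed."""
        self._speed = speed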
|
apache-2.0
|
cpina/science-cruise-data-management
|
ScienceCruiseDataManagement/main/management/commands/importprojects.py
|
1
|
1792
|
from django.core.management.base import BaseCommand, CommandError
from main.models import Project, Person
import csv
# This file is part of https://github.com/cpina/science-cruise-data-management
#
# This project was programmed in a hurry, without any prior Django experience,
# while circumnavigating the Antarctic on the ACE expedition: there was no
# proper Internet access, 150 scientists were using the system, and other data
# management and system administration tasks had to be done on the same cruise.
#
# Sadly there are no unit tests, and the refactoring the code really needs
# could not be done during the cruise.
#
# Carles Pina ([email protected]) and Jen Thomas ([email protected]), 2016-2017.
class Command(BaseCommand):
help = 'Adds data to the project table'
def add_arguments(self, parser):
parser.add_argument('filename', type=str)
def handle(self, *args, **options):
print(options['filename'])
self.import_data_from_csv(options['filename'])
def import_data_from_csv(self, filename):
with open(filename) as csvfile:
reader = csv.DictReader(csvfile)
for row in reader:
print(row)
project = Project()
project.number = row['project_number']
project.title = row['project_title']
project.alternative_title = row['project_alternative_title']
project.abstract = row['abstract']
if row['name_first'] != '':
print("{}-{}".format(row['name_first'],row['name_last']))
person = Person.objects.filter(name_first=row['name_first']).filter(name_last=row['name_last'])[0]
project.principal_investigator =person
project.save()
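# Illustrative input (hypothetical file): the importer expects a CSV whose
# header row provides at least these columns:
#
#   project_number,project_title,project_alternative_title,abstract,name_first,name_last
#
# When name_first is empty, no principal investigator is attached.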
|
mit
|
havt/odoo
|
addons/account/wizard/account_use_model.py
|
341
|
3361
|
# -*- coding: utf-8 -*-
##############################################################################
#
# OpenERP, Open Source Management Solution
# Copyright (C) 2004-2010 Tiny SPRL (<http://tiny.be>).
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
##############################################################################
import time
from openerp.osv import fields, osv
from openerp.tools.translate import _
class account_use_model(osv.osv_memory):
_name = 'account.use.model'
_description = 'Use model'
_columns = {
'model': fields.many2many('account.model', 'account_use_model_relation', 'account_id', 'model_id', 'Account Model'),
}
def view_init(self, cr, uid, fields_list, context=None):
account_model_obj = self.pool.get('account.model')
if context is None:
context = {}
if context.get('active_ids',False):
data_model = account_model_obj.browse(cr, uid, context['active_ids'])
for model in data_model:
for line in model.lines_id:
if line.date_maturity == 'partner':
if not line.partner_id:
raise osv.except_osv(_('Error!'), _("Maturity date of entry line generated by model line '%s' is based on partner payment term!"\
"\nPlease define partner on it!")%line.name)
def create_entries(self, cr, uid, ids, context=None):
account_model_obj = self.pool.get('account.model')
mod_obj = self.pool.get('ir.model.data')
if context is None:
context = {}
data = self.read(cr, uid, ids, context=context)[0]
record_id = context and context.get('model_line', False) or False
if record_id:
model_ids = data['model']
else:
model_ids = context['active_ids']
move_ids = account_model_obj.generate(cr, uid, model_ids, context=context)
context = dict(context, move_ids=move_ids)
model_data_ids = mod_obj.search(cr, uid,[('model','=','ir.ui.view'),('name','=','view_move_form')], context=context)
resource_id = mod_obj.read(cr, uid, model_data_ids, fields=['res_id'], context=context)[0]['res_id']
return {
'domain': "[('id','in', ["+','.join(map(str,context['move_ids']))+"])]",
'name': 'Entries',
'view_type': 'form',
'view_mode': 'tree,form',
'res_model': 'account.move',
'views': [(False,'tree'),(resource_id,'form')],
'type': 'ir.actions.act_window',
}
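# Illustrative note: with move_ids == [7, 8], the action above opens the
# generated entries with domain "[('id','in', [7,8])]".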
# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
|
agpl-3.0
|
dannyboi104/SickRage
|
lib/mako/ast.py
|
60
|
6702
|
# mako/ast.py
# Copyright (C) 2006-2015 the Mako authors and contributors <see AUTHORS file>
#
# This module is part of Mako and is released under
# the MIT License: http://www.opensource.org/licenses/mit-license.php
"""utilities for analyzing expressions and blocks of Python
code, as well as generating Python from AST nodes"""
from mako import exceptions, pyparser, compat
import re
class PythonCode(object):
"""represents information about a string containing Python code"""
def __init__(self, code, **exception_kwargs):
self.code = code
# represents all identifiers which are assigned to at some point in
# the code
self.declared_identifiers = set()
# represents all identifiers which are referenced before their
# assignment, if any
self.undeclared_identifiers = set()
# note that an identifier can be in both the undeclared and declared
# lists.
# using AST to parse instead of using code.co_varnames,
# code.co_names has several advantages:
# - we can locate an identifier as "undeclared" even if
# it's declared later in the same block of code
# - AST is less likely to break with version changes
# (for example, the behavior of co_names changed a little bit
# in python version 2.5)
if isinstance(code, compat.string_types):
expr = pyparser.parse(code.lstrip(), "exec", **exception_kwargs)
else:
expr = code
f = pyparser.FindIdentifiers(self, **exception_kwargs)
f.visit(expr)
class ArgumentList(object):
"""parses a fragment of code as a comma-separated list of expressions"""
def __init__(self, code, **exception_kwargs):
self.codeargs = []
self.args = []
self.declared_identifiers = set()
self.undeclared_identifiers = set()
if isinstance(code, compat.string_types):
if re.match(r"\S", code) and not re.match(r",\s*$", code):
# if there's text and no trailing comma, ensure it's parsed
# as a tuple by adding a trailing comma
code += ","
expr = pyparser.parse(code, "exec", **exception_kwargs)
else:
expr = code
f = pyparser.FindTuple(self, PythonCode, **exception_kwargs)
f.visit(expr)
class PythonFragment(PythonCode):
"""extends PythonCode to provide identifier lookups in partial control
statements
e.g.
for x in 5:
elif y==9:
except (MyException, e):
etc.
"""
def __init__(self, code, **exception_kwargs):
m = re.match(r'^(\w+)(?:\s+(.*?))?:\s*(#|$)', code.strip(), re.S)
if not m:
raise exceptions.CompileException(
"Fragment '%s' is not a partial control statement" %
code, **exception_kwargs)
if m.group(3):
code = code[:m.start(3)]
(keyword, expr) = m.group(1,2)
if keyword in ['for','if', 'while']:
code = code + "pass"
elif keyword == 'try':
code = code + "pass\nexcept:pass"
elif keyword == 'elif' or keyword == 'else':
code = "if False:pass\n" + code + "pass"
elif keyword == 'except':
code = "try:pass\n" + code + "pass"
elif keyword == 'with':
code = code + "pass"
else:
raise exceptions.CompileException(
"Unsupported control keyword: '%s'" %
keyword, **exception_kwargs)
super(PythonFragment, self).__init__(code, **exception_kwargs)
class FunctionDecl(object):
"""function declaration"""
def __init__(self, code, allow_kwargs=True, **exception_kwargs):
self.code = code
expr = pyparser.parse(code, "exec", **exception_kwargs)
f = pyparser.ParseFunc(self, **exception_kwargs)
f.visit(expr)
if not hasattr(self, 'funcname'):
raise exceptions.CompileException(
"Code '%s' is not a function declaration" % code,
**exception_kwargs)
if not allow_kwargs and self.kwargs:
raise exceptions.CompileException(
"'**%s' keyword argument not allowed here" %
self.kwargnames[-1], **exception_kwargs)
def get_argument_expressions(self, as_call=False):
"""Return the argument declarations of this FunctionDecl as a printable
list.
By default the return value is appropriate for writing in a ``def``;
set `as_call` to true to build arguments to be passed to the function
instead (assuming locals with the same names as the arguments exist).
"""
namedecls = []
# Build in reverse order, since defaults and slurpy args come last
argnames = self.argnames[::-1]
kwargnames = self.kwargnames[::-1]
defaults = self.defaults[::-1]
kwdefaults = self.kwdefaults[::-1]
# Named arguments
if self.kwargs:
namedecls.append("**" + kwargnames.pop(0))
for name in kwargnames:
# Keyword-only arguments must always be used by name, so even if
# this is a call, print out `foo=foo`
if as_call:
namedecls.append("%s=%s" % (name, name))
elif kwdefaults:
default = kwdefaults.pop(0)
if default is None:
# The AST always gives kwargs a default, since you can do
# `def foo(*, a=1, b, c=3)`
namedecls.append(name)
else:
namedecls.append("%s=%s" % (
name, pyparser.ExpressionGenerator(default).value()))
else:
namedecls.append(name)
# Positional arguments
if self.varargs:
namedecls.append("*" + argnames.pop(0))
for name in argnames:
if as_call or not defaults:
namedecls.append(name)
else:
default = defaults.pop(0)
namedecls.append("%s=%s" % (
name, pyparser.ExpressionGenerator(default).value()))
namedecls.reverse()
return namedecls
@property
def allargnames(self):
return tuple(self.argnames) + tuple(self.kwargnames)
class FunctionArgs(FunctionDecl):
"""the argument portion of a function declaration"""
def __init__(self, code, **kwargs):
super(FunctionArgs, self).__init__("def ANON(%s):pass" % code,
**kwargs)
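# A minimal usage sketch (not part of the original module), runnable as a
# script: FunctionDecl renders an argument list both for a ``def`` and,
# with as_call=True, for a call site.
if __name__ == '__main__':
    decl = FunctionDecl("def render(context, x, y=5, **pageargs): pass")
    print(decl.get_argument_expressions())
    # -> ['context', 'x', 'y=5', '**pageargs']
    print(decl.get_argument_expressions(as_call=True))
    # -> ['context', 'x', 'y', '**pageargs']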
|
gpl-3.0
|
joopert/home-assistant
|
tests/components/input_datetime/test_reproduce_state.py
|
5
|
2723
|
"""Test reproduce state for Input datetime."""
from homeassistant.core import State
from tests.common import async_mock_service
async def test_reproducing_states(hass, caplog):
"""Test reproducing Input datetime states."""
hass.states.async_set(
"input_datetime.entity_datetime",
"2010-10-10 01:20:00",
{"has_date": True, "has_time": True},
)
hass.states.async_set(
"input_datetime.entity_time", "01:20:00", {"has_date": False, "has_time": True}
)
hass.states.async_set(
"input_datetime.entity_date",
"2010-10-10",
{"has_date": True, "has_time": False},
)
datetime_calls = async_mock_service(hass, "input_datetime", "set_datetime")
# These calls should do nothing as entities already in desired state
await hass.helpers.state.async_reproduce_state(
[
State("input_datetime.entity_datetime", "2010-10-10 01:20:00"),
State("input_datetime.entity_time", "01:20:00"),
State("input_datetime.entity_date", "2010-10-10"),
],
blocking=True,
)
assert len(datetime_calls) == 0
# Test invalid state is handled
await hass.helpers.state.async_reproduce_state(
[
State("input_datetime.entity_datetime", "not_supported"),
State("input_datetime.entity_datetime", "not-valid-date"),
State("input_datetime.entity_datetime", "not:valid:time"),
State("input_datetime.entity_datetime", "1234-56-78 90:12:34"),
],
blocking=True,
)
assert "not_supported" in caplog.text
assert "not-valid-date" in caplog.text
assert "not:valid:time" in caplog.text
assert "1234-56-78 90:12:34" in caplog.text
assert len(datetime_calls) == 0
# Make sure correct services are called
await hass.helpers.state.async_reproduce_state(
[
State("input_datetime.entity_datetime", "2011-10-10 02:20:00"),
State("input_datetime.entity_time", "02:20:00"),
State("input_datetime.entity_date", "2011-10-10"),
# Should not raise
State("input_datetime.non_existing", "2010-10-10 01:20:00"),
],
blocking=True,
)
valid_calls = [
{
"entity_id": "input_datetime.entity_datetime",
"datetime": "2011-10-10 02:20:00",
},
{"entity_id": "input_datetime.entity_time", "time": "02:20:00"},
{"entity_id": "input_datetime.entity_date", "date": "2011-10-10"},
]
assert len(datetime_calls) == 3
for call in datetime_calls:
assert call.domain == "input_datetime"
assert call.data in valid_calls
valid_calls.remove(call.data)
|
apache-2.0
|
0jpq0/kbengine
|
kbe/res/scripts/common/Lib/idlelib/OutputWindow.py
|
88
|
4394
|
from tkinter import *
from idlelib.EditorWindow import EditorWindow
import re
import tkinter.messagebox as tkMessageBox
from idlelib import IOBinding
class OutputWindow(EditorWindow):
"""An editor window that can serve as an output file.
Also the future base class for the Python shell window.
This class has no input facilities.
"""
def __init__(self, *args):
EditorWindow.__init__(self, *args)
self.text.bind("<<goto-file-line>>", self.goto_file_line)
# Customize EditorWindow
def ispythonsource(self, filename):
# No colorization needed
return 0
def short_title(self):
return "Output"
def maybesave(self):
# Override base class method -- don't ask any questions
if self.get_saved():
return "yes"
else:
return "no"
# Act as output file
def write(self, s, tags=(), mark="insert"):
if isinstance(s, bytes):
s = s.decode(IOBinding.encoding, "replace")
self.text.insert(mark, s, tags)
self.text.see(mark)
self.text.update()
return len(s)
def writelines(self, lines):
for line in lines:
self.write(line)
def flush(self):
pass
# Our own right-button menu
rmenu_specs = [
("Cut", "<<cut>>", "rmenu_check_cut"),
("Copy", "<<copy>>", "rmenu_check_copy"),
("Paste", "<<paste>>", "rmenu_check_paste"),
(None, None, None),
("Go to file/line", "<<goto-file-line>>", None),
]
file_line_pats = [
# order of patterns matters
r'file "([^"]*)", line (\d+)',
r'([^\s]+)\((\d+)\)',
r'^(\s*\S.*?):\s*(\d+):', # Win filename, maybe starting with spaces
r'([^\s]+):\s*(\d+):', # filename or path, ltrim
r'^\s*(\S.*?):\s*(\d+):', # Win abs path with embedded spaces, ltrim
]
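# Example (illustrative): a traceback line such as
#     File "script.py", line 12
# is matched case-insensitively by the first pattern above, yielding
# the groups ('script.py', '12').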
file_line_progs = None
def goto_file_line(self, event=None):
if self.file_line_progs is None:
l = []
for pat in self.file_line_pats:
l.append(re.compile(pat, re.IGNORECASE))
self.file_line_progs = l
# x, y = self.event.x, self.event.y
# self.text.mark_set("insert", "@%d,%d" % (x, y))
line = self.text.get("insert linestart", "insert lineend")
result = self._file_line_helper(line)
if not result:
# Try the previous line. This is handy e.g. in tracebacks,
# where you tend to right-click on the displayed source line
line = self.text.get("insert -1line linestart",
"insert -1line lineend")
result = self._file_line_helper(line)
if not result:
tkMessageBox.showerror(
"No special line",
"The line you point at doesn't look like "
"a valid file name followed by a line number.",
master=self.text)
return
filename, lineno = result
edit = self.flist.open(filename)
edit.gotoline(lineno)
def _file_line_helper(self, line):
for prog in self.file_line_progs:
match = prog.search(line)
if match:
filename, lineno = match.group(1, 2)
try:
f = open(filename, "r")
f.close()
break
except OSError:
continue
else:
return None
try:
return filename, int(lineno)
except TypeError:
return None
# These classes are currently not used but might come in handy
class OnDemandOutputWindow:
tagdefs = {
# XXX Should use IdlePrefs.ColorPrefs
"stdout": {"foreground": "blue"},
"stderr": {"foreground": "#007700"},
}
def __init__(self, flist):
self.flist = flist
self.owin = None
def write(self, s, tags, mark):
if not self.owin:
self.setup()
self.owin.write(s, tags, mark)
def setup(self):
self.owin = owin = OutputWindow(self.flist)
text = owin.text
for tag, cnf in self.tagdefs.items():
if cnf:
text.tag_configure(tag, **cnf)
text.tag_raise('sel')
self.write = self.owin.write
|
lgpl-3.0
|
mageec/mageec
|
script/generate-builds.py
|
1
|
12030
|
#!/usr/bin/env python3
import argparse
import os
import random
import sys
import mageec
gcc_flags = [
#'-faggressive-loop-optimizations', # Not supported in 4.5
'-falign-functions',
'-falign-jumps',
'-falign-labels',
'-falign-loops',
'-fbranch-count-reg',
'-fbranch-target-load-optimize',
'-fbranch-target-load-optimize2',
'-fbtr-bb-exclusive',
'-fcaller-saves',
#'-fcombine-stack-adjustments', # Not supported in 4.5
#'-fcommon', # affects semantics, unlikely to affect performance
#'-fcompare-elim', # Not supported in 4.5
'-fconserve-stack',
'-fcprop-registers',
'-fcrossjumping',
'-fcse-follow-jumps',
#'-fdata-sections', # affects semantics unlikely to affect performance
'-fdce',
'-fdefer-pop',
'-fdelete-null-pointer-checks',
#'-fdevirtualize', # Not supported in 4.5
'-fdse',
'-fearly-inlining',
'-fexpensive-optimizations',
'-fforward-propagate',
'-fgcse',
'-fgcse-after-reload',
'-fgcse-las',
'-fgcse-lm',
'-fgcse-sm',
'-fguess-branch-probability',
#'-fhoist-adjacent-loads', # Not supported in 4.5
'-fif-conversion',
'-fif-conversion2',
'-finline',
#'-finline-atomics', # Not supported in 4.5
'-finline-functions',
'-finline-functions-called-once',
'-finline-small-functions',
'-fipa-cp',
'-fipa-cp-clone',
#'-fipa-profile', # Not supported in 4.5
'-fipa-pta',
'-fipa-pure-const',
'-fipa-reference',
'-fipa-sra',
#'-fira-hoist-pressure', # Not supported in 4.5
'-fivopts',
'-fmerge-constants',
'-fmodulo-sched',
'-fmove-loop-invariants',
'-fomit-frame-pointer',
'-foptimize-sibling-calls',
#'-foptimize-strlen', # Not supported in 4.5
'-fpeephole',
'-fpeephole2',
'-fpredictive-commoning',
'-fprefetch-loop-arrays',
'-fregmove',
'-frename-registers',
'-freorder-blocks',
'-freorder-functions',
'-frerun-cse-after-loop',
'-freschedule-modulo-scheduled-loops',
'-fsched-critical-path-heuristic',
'-fsched-dep-count-heuristic',
'-fsched-group-heuristic',
'-fsched-interblock',
'-fsched-last-insn-heuristic',
'-fsched-pressure',
'-fsched-rank-heuristic',
'-fsched-spec',
'-fsched-spec-insn-heuristic',
'-fsched-spec-load',
'-fsched-stalled-insns',
'-fsched-stalled-insns-dep',
'-fschedule-insns',
'-fschedule-insns2',
#'-fsection-anchors', # may conflict with other flags
'-fsel-sched-pipelining',
'-fsel-sched-pipelining-outer-loops',
'-fsel-sched-reschedule-pipelined',
'-fselective-scheduling',
'-fselective-scheduling2',
#'-fshrink-wrap', # Not supported in 4.5
'-fsplit-ivs-in-unroller',
'-fsplit-wide-types',
#'-fstrict-aliasing', # affects semantics
'-fthread-jumps',
'-ftoplevel-reorder',
#'-ftree-bit-ccp', # Not supported in 4.5
'-ftree-builtin-call-dce',
'-ftree-ccp',
'-ftree-ch',
#'-ftree-coalesce-inlined-vars', # No equivalent -fno for this flag
#'-ftree-coalesce-vars', # Not supported in 4.5
'-ftree-copy-prop',
'-ftree-copyrename',
'-ftree-cselim',
'-ftree-dce',
'-ftree-dominator-opts',
'-ftree-dse',
'-ftree-forwprop',
'-ftree-fre',
#'-ftree-loop-distribute-patterns', # Not supported in 4.5
'-ftree-loop-distribution',
#'-ftree-loop-if-convert', # Not supported in 4.5
'-ftree-loop-im',
'-ftree-loop-ivcanon',
'-ftree-loop-optimize',
#'-ftree-partial-pre', # Not supported in 4.5
'-ftree-phiprop',
'-ftree-pre',
'-ftree-pta',
'-ftree-reassoc',
'-ftree-scev-cprop',
'-ftree-sink',
'-ftree-slp-vectorize',
#'-ftree-slsr', # Not supported in 4.5
'-ftree-sra',
'-ftree-switch-conversion',
#'-ftree-tail-merge', # Not supported in 4.5
'-ftree-ter',
'-ftree-vect-loop-version',
'-ftree-vectorize',
'-ftree-vrp',
'-funroll-all-loops',
'-funroll-loops',
'-funswitch-loops',
'-fvariable-expansion-in-unroller',
'-fvect-cost-model',
'-fweb'
]
# Make generic to the type of choice which needs to be made
def generate_configs(flags, num_configs, generator):
configs = []
if generator == 'random':
for i in range(0, num_configs):
num_enabled = random.randint(0, len(flags))
flag_seq = random.sample(flags, num_enabled)
configs.append(' '.join(flag_seq))
else:
assert False, 'Unsupported configuration generator'
return configs
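# Usage sketch (illustrative values, not part of the original script):
#   generate_configs(['-fdce', '-fgcse'], 2, 'random')
# could return e.g. ['', '-fdce -fgcse'] -- each configuration is a
# space-joined random subset (possibly empty) of the candidate flags.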
def generate_configurations(src_dir, build_dir, install_dir, build_system,
cc, cxx, fort, flags, jobs, database_path,
features_path, num_configs, generator, debug):
assert(os.path.exists(src_dir) and os.path.isabs(src_dir))
assert(os.path.exists(build_dir) and os.path.isabs(build_dir))
assert(os.path.exists(install_dir) and os.path.isabs(install_dir))
assert(os.path.exists(database_path))
assert(os.path.exists(features_path))
assert(mageec.is_command_on_path(cc))
assert(mageec.is_command_on_path(cxx))
assert(mageec.is_command_on_path(fort))
assert(num_configs > 0)
assert(jobs > 0)
configs = generate_configs(gcc_flags, num_configs, generator)
run_id = 0
for config in configs:
run_build_dir = os.path.join(build_dir, 'run-' + str(run_id))
run_install_dir = os.path.join(install_dir, 'run-' + str(run_id))
if not os.path.exists(run_build_dir):
os.makedirs(run_build_dir)
if not os.path.exists(run_install_dir):
os.makedirs(run_install_dir)
run_id += 1
print ('-- Building configuration:\n'
' Configuration: \'' + config + '\'')
compilations_path = os.path.join(run_install_dir, 'compilations.csv')
cc_wrapper = 'mageec-' + cc
cxx_wrapper = 'mageec-' + cxx
fort_wrapper = 'mageec-' + fort
assert(mageec.is_command_on_path(cc_wrapper))
assert(mageec.is_command_on_path(cxx_wrapper))
assert(mageec.is_command_on_path(fort_wrapper))
wrapper_flags = ""
if debug:
wrapper_flags += ' -fmageec-debug'
wrapper_flags += ' -fmageec-mode=gather'
wrapper_flags += ' -fmageec-database=' + database_path
wrapper_flags += ' -fmageec-features=' + features_path
wrapper_flags += ' -fmageec-out=' + compilations_path
new_flags = wrapper_flags + ' ' + flags + ' ' + config
res = mageec.build(src_dir=src_dir,
build_dir=run_build_dir,
install_dir=run_install_dir,
build_system=build_system,
cc=cc_wrapper,
cxx=cxx_wrapper,
fort=fort_wrapper,
flags=new_flags)
# just ignore failed builds
if not res:
print ('-- Build failed. Continuing regardless')
return True
def main():
parser = argparse.ArgumentParser(
description='Generate and build multiple versions of a source project')
# required arguments
parser.add_argument('--src-dir', nargs=1, required=True,
help='Directory containing the source to build')
parser.add_argument('--build-dir', nargs=1, required=True,
help='Build directory')
parser.add_argument('--install-dir', nargs=1, required=True,
help='Install directory')
parser.add_argument('--cc', nargs=1, required=True,
help='Command to use to compile C source')
parser.add_argument('--cxx', nargs=1, required=True,
help='Command to use to compile C++ source')
parser.add_argument('--fort', nargs=1, required=True,
help='Command to use to compile Fortran source')
parser.add_argument('--database', nargs=1, required=True,
help='mageec database to store generated compilations in')
parser.add_argument('--features', nargs=1, required=True,
help='File containing extracted features for the source being built')
parser.add_argument('--num-configs', nargs=1, required=True,
help='Number of configurations of the source to generate')
parser.add_argument('--generator', nargs=1, required=True,
help='Generator to use to generate configurations')
# optional arguments
parser.add_argument('--debug', action='store_true', required=False,
help='Enable debug when doing feature extraction')
parser.add_argument('--build-system', nargs=1, required=False,
help='Build system to be used to build the source. May be \'cmake\', '
'\'configure\', or a script to be used to build the source')
parser.add_argument('--flags', nargs=1, required=False,
help='Common arguments to be used when building')
parser.add_argument('--jobs', nargs=1, required=False,
help='Number of jobs to run when building')
parser.set_defaults(debug=False,
build_system=[None],
flags=[''],
jobs=[1])
args = parser.parse_args(sys.argv[1:])
src_dir = os.path.abspath(args.src_dir[0])
build_dir = os.path.abspath(args.build_dir[0])
install_dir = os.path.abspath(args.install_dir[0])
cc = args.cc[0]
cxx = args.cxx[0]
fort = args.fort[0]
database_path = os.path.abspath(args.database[0])
features_path = os.path.abspath(args.features[0])
num_configs = int(args.num_configs[0])
generator = args.generator[0]
if not os.path.exists(src_dir):
print ('-- Source directory \'' + src_dir + '\' does not exist')
return -1
if not os.path.exists(build_dir):
os.makedirs(build_dir)
if not os.path.exists(install_dir):
os.makedirs(install_dir)
if not os.path.exists(database_path):
print ('-- Database \'' + database_path + '\' does not exist')
return -1
if not os.path.exists(features_path):
print ('-- Features file \'' + features_path + '\' does not exist')
return -1
if not mageec.is_command_on_path(cc):
print ('-- Compiler \'' + cc + '\' is not on the path')
return -1
if not mageec.is_command_on_path(cxx):
print ('-- Compiler \'' + cxx + '\' is not on the path')
return -1
if not mageec.is_command_on_path(fort):
print ('-- Compiler \'' + fort + '\' is not on the path')
return -1
if num_configs <= 0:
print ('-- Cannot generate a negative or zero number of configurations')
return -1
debug = args.debug
build_system = args.build_system[0]
flags = args.flags[0]
jobs = int(args.jobs[0])
if jobs < 1:
print ('-- Number of jobs must be a positive integer')
return -1
res = generate_configurations(src_dir=src_dir,
build_dir=build_dir,
install_dir=install_dir,
build_system=build_system,
cc=cc,
cxx=cxx,
fort=fort,
flags=flags,
jobs=jobs,
database_path=database_path,
features_path=features_path,
num_configs=num_configs,
generator=generator,
debug=debug)
if not res:
return -1
return 0
if __name__ == '__main__':
main()
|
gpl-3.0
|
cstipkovic/spidermonkey-research
|
testing/talos/talos/cmanager_mac.py
|
2
|
2801
|
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
"""CounterManager for Mac OSX"""
import subprocess
from cmanager import CounterManager
import sys
def GetProcessData(pid):
"""Runs a ps on the process identified by pid and returns the output line
as a list (pid, vsz, rss)
"""
command = ['ps -o pid,vsize,rss -p'+str(pid)]
try:
handle = subprocess.Popen(command, stdout=subprocess.PIPE,
universal_newlines=True, shell=True)
handle.wait()
data = handle.stdout.readlines()
except:
print("Unexpected error executing '%s': %s", (command, sys.exc_info()))
raise
# First line is the header; the output should look like:
# PID VSZ RSS
# 3210 75964 920
line = data[1]
line = line.split()
if line[0] == str(pid):
return line
def GetPrivateBytes(pid):
"""Calculate the amount of private, writeable memory allocated to a
process.
"""
psData = GetProcessData(pid)
return int(psData[1]) * 1024 # convert to bytes
def GetResidentSize(pid):
"""Retrieve the current resident memory for a given process"""
psData = GetProcessData(pid)
return int(psData[2]) * 1024 # convert to bytes
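# Usage sketch (illustrative pid): both helpers scale the kilobyte columns
# reported by ps up to bytes, e.g.
#   GetPrivateBytes(3210)   # vsize column * 1024
#   GetResidentSize(3210)   # rss column * 1024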
class MacCounterManager(CounterManager):
"""This class manages the monitoring of a process with any number of
counters.
A counter can be any function that takes an argument of one pid and
returns a piece of data about that process.
Some examples are: CalcCPUTime, GetResidentSize, and GetPrivateBytes
"""
counterDict = {"Private Bytes": GetPrivateBytes,
"RSS": GetResidentSize}
def __init__(self, process_name, process, counters):
"""Args:
counters: A list of counters to monitor. Any counters whose name
does not match a key in 'counterDict' will be ignored.
"""
CounterManager.__init__(self)
# the last process is the useful one
self.pid = process.pid
self._loadCounters()
self.registerCounters(counters)
def getCounterValue(self, counterName):
"""Returns the last value of the counter 'counterName'"""
if counterName not in self.registeredCounters:
print("Warning: attempting to collect counter %s and it is not"
" registered" % counterName)
return
try:
return self.registeredCounters[counterName][0](self.pid)
except Exception as e:
print("Error in collecting counter: %s, pid: %s, exception: %s"
% (counterName, self.pid, e))
|
mpl-2.0
|
thaim/ansible
|
lib/ansible/plugins/action/cnos.py
|
38
|
3535
|
# (C) 2017 Red Hat Inc.
# Copyright (C) 2017 Lenovo.
#
# GNU General Public License v3.0+
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
#
# Contains Action Plugin methods for CNOS Config Module
# Lenovo Networking
#
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import sys
import copy
from ansible import constants as C
from ansible.plugins.action.network import ActionModule as ActionNetworkModule
from ansible.module_utils.network.cnos.cnos import cnos_provider_spec
from ansible.module_utils.network.common.utils import load_provider
from ansible.module_utils.connection import Connection
from ansible.module_utils._text import to_text
from ansible.utils.display import Display
display = Display()
class ActionModule(ActionNetworkModule):
def run(self, tmp=None, task_vars=None):
del tmp # tmp no longer has any effect
self._config_module = True if self._task.action == 'cnos_config' else False
socket_path = None
if self._play_context.connection == 'local':
provider = load_provider(cnos_provider_spec, self._task.args)
pc = copy.deepcopy(self._play_context)
pc.connection = 'network_cli'
pc.network_os = 'cnos'
pc.remote_addr = provider['host'] or self._play_context.remote_addr
pc.port = provider['port'] or self._play_context.port or 22
pc.remote_user = provider['username'] or self._play_context.connection_user
pc.password = provider['password'] or self._play_context.password
pc.private_key_file = provider['ssh_keyfile'] or self._play_context.private_key_file
command_timeout = int(provider['timeout'] or C.PERSISTENT_COMMAND_TIMEOUT)
pc.become = provider['authorize'] or True
pc.become_pass = provider['auth_pass']
pc.become_method = 'enable'
display.vvv('using connection plugin %s (was local)' % pc.connection, pc.remote_addr)
connection = self._shared_loader_obj.connection_loader.get('persistent', pc, sys.stdin)
connection.set_options(direct={'persistent_command_timeout': command_timeout})
socket_path = connection.run()
display.vvvv('socket_path: %s' % socket_path, pc.remote_addr)
if not socket_path:
return {'failed': True,
'msg': 'unable to open shell. Please see: ' +
'https://docs.ansible.com/ansible/network_debug_troubleshooting.html#unable-to-open-shell'}
task_vars['ansible_socket'] = socket_path
# make sure we are in the right cli context which should be
# enable mode and not config mode or exec mode
if socket_path is None:
socket_path = self._connection.socket_path
conn = Connection(socket_path)
out = conn.get_prompt()
if to_text(out, errors='surrogate_then_replace').strip().endswith(')#'):
display.vvvv('In Config mode, sending exit to device', self._play_context.remote_addr)
conn.send_command('exit')
else:
conn.send_command('enable')
result = super(ActionModule, self).run(task_vars=task_vars)
return result
|
mit
|
jamesblunt/edx-platform
|
lms/djangoapps/class_dashboard/tests/test_views.py
|
133
|
4061
|
"""
Tests for class dashboard (Metrics tab in instructor dashboard)
"""
import json
from django.test.client import RequestFactory
from mock import patch
from nose.plugins.attrib import attr
from xmodule.modulestore.tests.factories import CourseFactory
from xmodule.modulestore.tests.django_utils import ModuleStoreTestCase
from class_dashboard import views
from student.tests.factories import AdminFactory
@attr('shard_1')
class TestViews(ModuleStoreTestCase):
"""
Tests related to class_dashboard/views.py
"""
def setUp(self):
super(TestViews, self).setUp()
self.request_factory = RequestFactory()
self.request = self.request_factory.get('')
self.request.user = None
self.simple_data = {'error': 'error'}
@patch('class_dashboard.views.has_instructor_access_for_class')
def test_all_problem_grade_distribution_has_access(self, has_access):
"""
Test returns proper value when have proper access
"""
has_access.return_value = True
response = views.all_problem_grade_distribution(self.request, 'test/test/test')
self.assertEqual(json.dumps(self.simple_data), response.content)
@patch('class_dashboard.views.has_instructor_access_for_class')
def test_all_problem_grade_distribution_no_access(self, has_access):
"""
Test for no access
"""
has_access.return_value = False
response = views.all_problem_grade_distribution(self.request, 'test/test/test')
self.assertEqual("{\"error\": \"Access Denied: User does not have access to this course\'s data\"}", response.content)
@patch('class_dashboard.views.has_instructor_access_for_class')
def test_all_sequential_open_distribution_has_access(self, has_access):
"""
Test returns proper value when have proper access
"""
has_access.return_value = True
response = views.all_sequential_open_distrib(self.request, 'test/test/test')
self.assertEqual(json.dumps(self.simple_data), response.content)
@patch('class_dashboard.views.has_instructor_access_for_class')
def test_all_sequential_open_distribution_no_access(self, has_access):
"""
Test for no access
"""
has_access.return_value = False
response = views.all_sequential_open_distrib(self.request, 'test/test/test')
self.assertEqual("{\"error\": \"Access Denied: User does not have access to this course\'s data\"}", response.content)
@patch('class_dashboard.views.has_instructor_access_for_class')
def test_section_problem_grade_distribution_has_access(self, has_access):
"""
Test returns proper value when have proper access
"""
has_access.return_value = True
response = views.section_problem_grade_distrib(self.request, 'test/test/test', '1')
self.assertEqual(json.dumps(self.simple_data), response.content)
@patch('class_dashboard.views.has_instructor_access_for_class')
def test_section_problem_grade_distribution_no_access(self, has_access):
"""
Test for no access
"""
has_access.return_value = False
response = views.section_problem_grade_distrib(self.request, 'test/test/test', '1')
self.assertEqual("{\"error\": \"Access Denied: User does not have access to this course\'s data\"}", response.content)
def test_sending_deprecated_id(self):
course = CourseFactory.create()
instructor = AdminFactory.create()
self.request.user = instructor
response = views.all_sequential_open_distrib(self.request, course.id.to_deprecated_string())
self.assertEqual('[]', response.content)
response = views.all_problem_grade_distribution(self.request, course.id.to_deprecated_string())
self.assertEqual('[]', response.content)
response = views.section_problem_grade_distrib(self.request, course.id.to_deprecated_string(), 'no section')
self.assertEqual('{"error": "error"}', response.content)
|
agpl-3.0
|
deroneriksson/incubator-systemml
|
src/main/python/systemml/random/sampling.py
|
7
|
5336
|
# -------------------------------------------------------------
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# -------------------------------------------------------------
__all__ = ['normal', 'uniform', 'poisson']
from ..defmatrix import *
# Special object used internally as a placeholder that will be replaced by the output ID.
# This lets constructSamplingNode build dml that references its own output.
OUTPUT_ID = '$$OutputID$$'
def constructSamplingNode(inputs, dml):
"""
Convenient utility to create an intermediate node of the AST.
Parameters
----------
inputs = list of input matrix objects and/or DMLOp
dml = list of DML strings (which will eventually be joined before execution). To reference the output ID, use the OUTPUT_ID placeholder
"""
dmlOp = DMLOp(inputs)
out = matrix(None, op=dmlOp)
dmlOp.dml = [out.ID if x == OUTPUT_ID else x for x in dml]
return out
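# Illustrative example (hypothetical IDs): for a dml list such as
#   [OUTPUT_ID, ' = random.uniform(', '3', ',', '3', ',', '1.0', ',', '0', ',', '1', ')\n']
# the placeholder is rewritten to the generated output matrix ID, yielding
# DML along the lines of "mVar1 = random.uniform(3,3,1.0,0,1)".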
INPUTS = []
def asStr(arg):
    """
    Internal use only: Convenient utility to update INPUTS and return the appropriate string value
    """
    # Without this declaration the assignment below raises UnboundLocalError
    # instead of recording the input on the module-level INPUTS list.
    global INPUTS
    if isinstance(arg, matrix):
        INPUTS = INPUTS + [arg]
        return arg.ID
    else:
        return str(arg)
def normal(loc=0.0, scale=1.0, size=(1, 1), sparsity=1.0):
"""
Draw random samples from a normal (Gaussian) distribution.
Parameters
----------
loc: Mean ("centre") of the distribution.
scale: Standard deviation (spread or "width") of the distribution.
size: Output shape (only tuple of length 2, i.e. (m, n), supported).
sparsity: Sparsity (between 0.0 and 1.0).
Examples
--------
>>> import systemml as sml
>>> import numpy as np
>>> sml.setSparkContext(sc)
>>> from systemml import random
>>> m1 = sml.random.normal(loc=3, scale=2, size=(3,3))
>>> m1.toNumPy()
array([[ 3.48857226, 6.17261819, 2.51167259],
[ 3.60506708, -1.90266305, 3.97601633],
[ 3.62245706, 5.9430881 , 2.53070413]])
"""
if len(size) != 2:
raise TypeError('Incorrect type for size. Expected tuple of length 2')
global INPUTS
INPUTS = []
rows = asStr(size[0])
cols = asStr(size[1])
loc = asStr(loc)
scale = asStr(scale)
sparsity = asStr(sparsity)
# loc + scale*standard normal
return constructSamplingNode(INPUTS, [
OUTPUT_ID, ' = ', loc, ' + ', scale, ' * random.normal(', rows, ',', cols, ',', sparsity, ')\n'])
def uniform(low=0.0, high=1.0, size=(1, 1), sparsity=1.0):
"""
Draw samples from a uniform distribution.
Parameters
----------
low: Lower boundary of the output interval.
high: Upper boundary of the output interval.
size: Output shape (only tuple of length 2, i.e. (m, n), supported).
sparsity: Sparsity (between 0.0 and 1.0).
Examples
--------
>>> import systemml as sml
>>> import numpy as np
>>> sml.setSparkContext(sc)
>>> from systemml import random
>>> m1 = sml.random.uniform(size=(3,3))
>>> m1.toNumPy()
array([[ 0.54511396, 0.11937437, 0.72975775],
[ 0.14135946, 0.01944448, 0.52544478],
[ 0.67582422, 0.87068849, 0.02766852]])
"""
if len(size) != 2:
raise TypeError('Incorrect type for size. Expected tuple of length 2')
global INPUTS
INPUTS = []
rows = asStr(size[0])
cols = asStr(size[1])
low = asStr(low)
high = asStr(high)
sparsity = asStr(sparsity)
return constructSamplingNode(INPUTS, [
OUTPUT_ID, ' = random.uniform(', rows, ',', cols, ',', sparsity, ',', low, ',', high, ')\n'])
def poisson(lam=1.0, size=(1, 1), sparsity=1.0):
"""
Draw samples from a Poisson distribution.
Parameters
----------
lam: Expectation of interval, should be > 0.
size: Output shape (only tuple of length 2, i.e. (m, n), supported).
sparsity: Sparsity (between 0.0 and 1.0).
Examples
--------
>>> import systemml as sml
>>> import numpy as np
>>> sml.setSparkContext(sc)
>>> from systemml import random
>>> m1 = sml.random.poisson(lam=1, size=(3,3))
>>> m1.toNumPy()
array([[ 1., 0., 2.],
[ 1., 0., 0.],
[ 0., 0., 0.]])
"""
if len(size) != 2:
raise TypeError('Incorrect type for size. Expected tuple of length 2')
global INPUTS
INPUTS = []
rows = asStr(size[0])
cols = asStr(size[1])
lam = asStr(lam)
sparsity = asStr(sparsity)
return constructSamplingNode(INPUTS, [
OUTPUT_ID, ' = random.poisson(', rows, ',', cols, ',', sparsity, ',', lam, ')\n'])
|
apache-2.0
|
amarant/servo
|
tests/wpt/web-platform-tests/tools/py/py/_path/cacheutil.py
|
278
|
3333
|
"""
This module contains multithread-safe cache implementations.
All Caches have
getorbuild(key, builder)
delentry(key)
methods and allow configuration when instantiating the cache class.
"""
from time import time as gettime
class BasicCache(object):
def __init__(self, maxentries=128):
self.maxentries = maxentries
self.prunenum = int(maxentries - maxentries/8)
self._dict = {}
def clear(self):
self._dict.clear()
def _getentry(self, key):
return self._dict[key]
def _putentry(self, key, entry):
self._prunelowestweight()
self._dict[key] = entry
def delentry(self, key, raising=False):
try:
del self._dict[key]
except KeyError:
if raising:
raise
def getorbuild(self, key, builder):
try:
entry = self._getentry(key)
except KeyError:
entry = self._build(key, builder)
self._putentry(key, entry)
return entry.value
def _prunelowestweight(self):
""" prune out entries with lowest weight. """
numentries = len(self._dict)
if numentries >= self.maxentries:
# evict according to entry's weight
items = [(entry.weight, key)
for key, entry in self._dict.items()]
items.sort()
index = numentries - self.prunenum
if index > 0:
for weight, key in items[:index]:
# in MT situations the element might be gone
self.delentry(key, raising=False)
class BuildcostAccessCache(BasicCache):
""" A BuildTime/Access-counting cache implementation.
the weight of a value is computed as the product of
num-accesses-of-a-value * time-to-build-the-value
The values with the least such weights are evicted
if the cache maxentries threshold is exceeded.
For implementation flexibility more than one object
might be evicted at a time.
"""
# time function to use for measuring build-times
def _build(self, key, builder):
start = gettime()
val = builder()
end = gettime()
return WeightedCountingEntry(val, end-start)
class WeightedCountingEntry(object):
def __init__(self, value, oneweight):
self._value = value
self.weight = self._oneweight = oneweight
def value(self):
self.weight += self._oneweight
return self._value
value = property(value)
class AgingCache(BasicCache):
""" This cache prunes out cache entries that are too old.
"""
def __init__(self, maxentries=128, maxseconds=10.0):
super(AgingCache, self).__init__(maxentries)
self.maxseconds = maxseconds
def _getentry(self, key):
entry = self._dict[key]
if entry.isexpired():
self.delentry(key)
raise KeyError(key)
return entry
def _build(self, key, builder):
val = builder()
entry = AgingEntry(val, gettime() + self.maxseconds)
return entry
class AgingEntry(object):
def __init__(self, value, expirationtime):
self.value = value
self.weight = expirationtime
def isexpired(self):
t = gettime()
return t >= self.weight
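# A minimal usage sketch (not part of the original module): getorbuild()
# only invokes the builder on a cache miss, so the second lookup below is
# served from the cache without calling build() again.
if __name__ == '__main__':
    def build():
        print("building")
        return 42
    cache = BuildcostAccessCache(maxentries=8)
    assert cache.getorbuild("answer", build) == 42  # prints "building"
    assert cache.getorbuild("answer", build) == 42  # cached, no print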
|
mpl-2.0
|
SoftwareExperiment4/SungkyunWiki
|
wiki/plugins/haystack/search_indexes.py
|
16
|
1113
|
from __future__ import absolute_import
from __future__ import unicode_literals
from haystack import indexes
from wiki import models
class ArticleIndex(indexes.SearchIndex, indexes.Indexable):
text = indexes.CharField(document=True, use_template=True)
created = indexes.DateTimeField(model_attr='created')
modified = indexes.DateTimeField(model_attr='modified')
# A default is set because indexing otherwise fails with whoosh; see:
# http://stackoverflow.com/questions/11995367/how-do-i-use-a-boolean-field-in-django-haystack-search-query
# https://github.com/toastdriven/django-haystack/issues/382
other_read = indexes.BooleanField(model_attr='other_read', default=False)
group_read = indexes.BooleanField(model_attr='group_read', default=False)
owner_id = indexes.IntegerField(model_attr='owner__id', null=True)
group_id = indexes.IntegerField(model_attr='group__id', null=True)
def get_model(self):
return models.Article
def index_queryset(self, using=None):
"""Used when the entire index for model is updated."""
return self.get_model().objects.all()
|
gpl-3.0
|
michaelneuder/image_quality_analysis
|
bin/nets/old/conv_net_SSIM.py
|
1
|
6635
|
#!/usr/bin/env python3
import os
os.environ['TF_CPP_MIN_LOG_LEVEL']='2'
import numpy as np
np.set_printoptions(threshold=np.nan)
import tensorflow as tf
import time
from PIL import Image as im
def convolve_inner_layers(x, W, b):
y = tf.nn.conv2d(x, W, strides = [1,1,1,1], padding='SAME')
y = tf.nn.bias_add(y, b)
return tf.nn.tanh(y)
def convolve_ouput_layer(x, W, b):
y = tf.nn.conv2d(x, W, strides=[1,1,1,1], padding='SAME')
y = tf.nn.bias_add(y, b)
return y
def conv_net(x, W, b):
conv1 = convolve_inner_layers(x, W['weights1'], b['bias1'])
conv2 = convolve_inner_layers(conv1, W['weights2'], b['bias2'])
conv3 = convolve_inner_layers(conv2, W['weights3'], b['bias3'])
output = convolve_ouput_layer(conv3, W['weights_out'], b['bias_out'])
return output
def get_epoch(x, y, n):
input_size = x.shape[0]
number_batches = int(input_size / n)
extra_examples = input_size % n
batches = {}
batch_indices = np.arange(input_size)
np.random.shuffle(batch_indices)
for i in range(number_batches):
temp_indices = batch_indices[n*i:n*(i+1)]
temp_x = []
temp_y = []
for j in temp_indices:
temp_x.append(x[j])
temp_y.append(y[j])
batches[i] = [np.asarray(temp_x), np.asarray(temp_y)]
if extra_examples != 0:
extra_indices = batch_indices[input_size-extra_examples:input_size]
temp_x = []
temp_y = []
for k in extra_indices:
temp_x.append(x[k])
temp_y.append(y[k])
batches[i+1] = [np.asarray(temp_x), np.asarray(temp_y)]
return batches
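# Usage sketch (illustrative sizes): with 120 training examples and n == 50,
# get_epoch() returns {0: ..., 1: ..., 2: ...} -- two shuffled batches of 50
# plus a remainder batch of 20 -- where each value is [x_batch, y_batch].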
def main():
# a bit of ascii fun
print(' _ _ _ ')
print(' ___ ___ _ ____ _____ | |_ _| |_(_) ___ _ __ ')
print(' / __/ _ \| \'_ \ \ / / _ \| | | | | __| |/ _ \| \'_ \ ')
print(' | (_| (_) | | | \ V / (_) | | |_| | |_| | (_) | | | |')
print(' \___\___/|_| |_|\_/ \___/|_|\__,_|\__|_|\___/|_| |_|')
print('=======================================================')
print("initializing variables ...")
filter_dim = 11
weights = {
'weights1': tf.Variable((1/(filter_dim*filter_dim*2))*tf.random_normal([filter_dim,filter_dim,2,30])),
'weights2': tf.Variable((1/(30*filter_dim*filter_dim))*tf.random_normal([filter_dim,filter_dim,30,20])),
'weights3': tf.Variable((1/(20*filter_dim*filter_dim))*tf.random_normal([filter_dim,filter_dim,20,10])),
'weights_out': tf.Variable((1/(10*filter_dim*filter_dim))*tf.random_normal([filter_dim,filter_dim,10,1]))
}
biases = {
'bias1': tf.Variable((1/(filter_dim*filter_dim*2))*tf.random_normal([30])),
'bias2': tf.Variable((1/(30*filter_dim*filter_dim))*tf.random_normal([20])),
'bias3': tf.Variable((1/(20*filter_dim*filter_dim))*tf.random_normal([10])),
'bias_out': tf.Variable((1/(10*filter_dim*filter_dim))*tf.random_normal([1]))
}
# tf Graph input
x = tf.placeholder(tf.float32, [None, 96, 96, 2])
y = tf.placeholder(tf.float32, [None, 96, 96, 1])
# data
print("loading data ...")
original_images_train = np.loadtxt('../../../data/sample_data/orig_500.txt')
reconstructed_images_train = np.loadtxt('../../../data/sample_data/recon_500.txt')
comparison_images_train = np.loadtxt('../../../data/sample_data/comp_500.txt')
original_images_test = np.loadtxt('../../../data/sample_data/orig_140.txt')
reconstructed_images_test = np.loadtxt('../../../data/sample_data/recon_140.txt')
comparison_images_test = np.loadtxt('../../../data/sample_data/comp_140.txt')
# get size of training and testing set
train_size = original_images_train.shape[0]
test_size = original_images_test.shape[0]
# reshape the result data to (num pics, 96, 96, 1)
comparison_images_train = np.reshape(comparison_images_train, [train_size, 96, 96, 1])
comparison_images_test = np.reshape(comparison_images_test, [test_size, 96, 96, 1])
# zipping data
combined_data_train = np.reshape(np.dstack((original_images_train, reconstructed_images_train)), [train_size,96,96,2])
combined_data_test = np.reshape(np.dstack((original_images_test, reconstructed_images_test)), [test_size,96,96,2])
#### temporary edit --- don't forget to remove
for i in range(96,192):
print(original_images_train[0][i], reconstructed_images_train[0][i], combined_data_train[0][1][i-96])
exit()
# parameters
learning_rate = .0001
epochs = 100
# model
prediction = conv_net(x, weights, biases)
# saving state
saver = tf.train.Saver()
# loss and optimization
cost = tf.reduce_mean(tf.square(tf.subtract(prediction, y)))
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)
# session
init = tf.global_variables_initializer()
with tf.Session() as sess:
sess.run(init)
global_step = 0
epoch_count = 0
start_time = time.time()
print("starting training ... ")
while epoch_count < epochs:
epoch_time = time.time()
print('-------------------------------------------------------')
print('beginning epoch {} ...'.format(epoch_count))
epoch = get_epoch(combined_data_train, comparison_images_train, 50)
for i in epoch:
x_data_train, y_data_train = np.asarray(epoch[i][0]), np.asarray(epoch[i][1])
sess.run(optimizer, feed_dict={x : x_data_train, y : y_data_train})
loss = sess.run(cost, feed_dict={x : x_data_train, y : y_data_train})
print(" - training global_step {}. current error: {}. ".format(global_step, loss))
global_step+=1
print('epoch {} completed in {} seconds. current error = {}'.format(epoch_count, time.time()-epoch_time, loss))
print('-------------------------------------------------------')
epoch_count+=1
print('optimization finished!')
prediction = np.asarray(sess.run(prediction, feed_dict={x : [combined_data_train[0]]}))
target = np.asarray([comparison_images_test[0]])
print(prediction.shape, target.shape)
with open('post_training.csv', mode = 'w') as write_file:
write_file.write('target, prediction\n')
for i in range(96):
for j in range(96):
write_file.write(str(float(target[0][i][j][0])) + ', ' + str(float(prediction[0][i][j][0])) + '\n')
write_file.close()
if __name__ == '__main__':
main()
|
mit
|
derDavidT/sympy
|
sympy/calculus/euler.py
|
54
|
3152
|
from sympy import Function, sympify, diff, Eq, S, Symbol, Derivative
from sympy.core.compatibility import (
combinations_with_replacement, iterable, range)
def euler_equations(L, funcs=(), vars=()):
r"""
Find the Euler-Lagrange equations [1]_ for a given Lagrangian.
Parameters
==========
L : Expr
The Lagrangian that should be a function of the functions listed
in the second argument and their derivatives.
For example, in the case of two functions `f(x,y)`, `g(x,y)` and
two independent variables `x`, `y` the Lagrangian would have the form:
.. math:: L\left(f(x,y),g(x,y),\frac{\partial f(x,y)}{\partial x},
\frac{\partial f(x,y)}{\partial y},
\frac{\partial g(x,y)}{\partial x},
\frac{\partial g(x,y)}{\partial y},x,y\right)
In many cases it is not necessary to provide anything, except the
Lagrangian, it will be auto-detected (and an error raised if this
couldn't be done).
funcs : Function or an iterable of Functions
The functions that the Lagrangian depends on. The Euler equations
are differential equations for each of these functions.
vars : Symbol or an iterable of Symbols
The Symbols that are the independent variables of the functions.
Returns
=======
eqns : list of Eq
The list of differential equations, one for each function.
Examples
========
>>> from sympy import Symbol, Function
>>> from sympy.calculus.euler import euler_equations
>>> x = Function('x')
>>> t = Symbol('t')
>>> L = (x(t).diff(t))**2/2 - x(t)**2/2
>>> euler_equations(L, x(t), t)
[Eq(-x(t) - Derivative(x(t), t, t), 0)]
>>> u = Function('u')
>>> x = Symbol('x')
>>> L = (u(t, x).diff(t))**2/2 - (u(t, x).diff(x))**2/2
>>> euler_equations(L, u(t, x), [t, x])
[Eq(-Derivative(u(t, x), t, t) + Derivative(u(t, x), x, x), 0)]
References
==========
.. [1] http://en.wikipedia.org/wiki/Euler%E2%80%93Lagrange_equation
"""
funcs = tuple(funcs) if iterable(funcs) else (funcs,)
if not funcs:
funcs = tuple(L.atoms(Function))
else:
for f in funcs:
if not isinstance(f, Function):
raise TypeError('Function expected, got: %s' % f)
vars = tuple(vars) if iterable(vars) else (vars,)
if not vars:
vars = funcs[0].args
else:
vars = tuple(sympify(var) for var in vars)
if not all(isinstance(v, Symbol) for v in vars):
raise TypeError('Variables are not symbols, got %s' % vars)
for f in funcs:
if not vars == f.args:
raise ValueError("Variables %s don't match args: %s" % (vars, f))
order = max(len(d.variables) for d in L.atoms(Derivative)
if d.expr in funcs)
eqns = []
for f in funcs:
eq = diff(L, f)
for i in range(1, order + 1):
for p in combinations_with_replacement(vars, i):
eq = eq + S.NegativeOne**i*diff(L, diff(f, *p), *p)
eqns.append(Eq(eq))
return eqns
|
bsd-3-clause
|
KyoungRan/Django_React_ex
|
Django_React_Workshop-mbrochh/django/myvenv/lib/python3.4/site-packages/django/contrib/gis/db/backends/spatialite/operations.py
|
28
|
10604
|
"""
SQL functions reference lists:
http://www.gaia-gis.it/spatialite-3.0.0-BETA/spatialite-sql-3.0.0.html
https://web.archive.org/web/20130407175746/http://www.gaia-gis.it/gaia-sins/spatialite-sql-4.0.0.html
http://www.gaia-gis.it/gaia-sins/spatialite-sql-4.2.1.html
"""
import re
import sys
from django.contrib.gis.db.backends.base.operations import \
BaseSpatialOperations
from django.contrib.gis.db.backends.spatialite.adapter import SpatiaLiteAdapter
from django.contrib.gis.db.backends.utils import SpatialOperator
from django.contrib.gis.db.models import aggregates
from django.contrib.gis.geometry.backend import Geometry
from django.contrib.gis.measure import Distance
from django.core.exceptions import ImproperlyConfigured
from django.db.backends.sqlite3.operations import DatabaseOperations
from django.utils import six
from django.utils.functional import cached_property
class SpatiaLiteOperations(BaseSpatialOperations, DatabaseOperations):
name = 'spatialite'
spatialite = True
version_regex = re.compile(r'^(?P<major>\d)\.(?P<minor1>\d)\.(?P<minor2>\d+)')
Adapter = SpatiaLiteAdapter
area = 'Area'
centroid = 'Centroid'
collect = 'Collect'
contained = 'MbrWithin'
difference = 'Difference'
distance = 'Distance'
envelope = 'Envelope'
extent = 'Extent'
geojson = 'AsGeoJSON'
gml = 'AsGML'
intersection = 'Intersection'
kml = 'AsKML'
length = 'GLength' # OpenGis defines Length, but this conflicts with an SQLite reserved keyword
makeline = 'MakeLine'
num_geom = 'NumGeometries'
num_points = 'NumPoints'
point_on_surface = 'PointOnSurface'
scale = 'ScaleCoords'
svg = 'AsSVG'
sym_difference = 'SymDifference'
transform = 'Transform'
translate = 'ShiftCoords'
union = 'GUnion' # OpenGIS defines Union, but this conflicts with an SQLite reserved keyword
unionagg = 'GUnion'
from_text = 'GeomFromText'
from_wkb = 'GeomFromWKB'
select = 'AsText(%s)'
gis_operators = {
'equals': SpatialOperator(func='Equals'),
'disjoint': SpatialOperator(func='Disjoint'),
'touches': SpatialOperator(func='Touches'),
'crosses': SpatialOperator(func='Crosses'),
'within': SpatialOperator(func='Within'),
'overlaps': SpatialOperator(func='Overlaps'),
'contains': SpatialOperator(func='Contains'),
'intersects': SpatialOperator(func='Intersects'),
'relate': SpatialOperator(func='Relate'),
# Returns true if B's bounding box completely contains A's bounding box.
'contained': SpatialOperator(func='MbrWithin'),
# Returns true if A's bounding box completely contains B's bounding box.
'bbcontains': SpatialOperator(func='MbrContains'),
# Returns true if A's bounding box overlaps B's bounding box.
'bboverlaps': SpatialOperator(func='MbrOverlaps'),
# These are implemented here as synonyms for Equals
'same_as': SpatialOperator(func='Equals'),
'exact': SpatialOperator(func='Equals'),
'distance_gt': SpatialOperator(func='Distance', op='>'),
'distance_gte': SpatialOperator(func='Distance', op='>='),
'distance_lt': SpatialOperator(func='Distance', op='<'),
'distance_lte': SpatialOperator(func='Distance', op='<='),
}
disallowed_aggregates = (aggregates.Extent3D,)
@cached_property
def function_names(self):
return {
'Length': 'ST_Length',
'Reverse': 'ST_Reverse',
'Scale': 'ScaleCoords',
'Translate': 'ST_Translate' if self.spatial_version >= (3, 1, 0) else 'ShiftCoords',
'Union': 'ST_Union',
}
@cached_property
def unsupported_functions(self):
unsupported = {'BoundingCircle', 'ForceRHR', 'IsValid', 'MakeValid', 'MemSize'}
if self.spatial_version < (3, 1, 0):
unsupported.add('SnapToGrid')
if self.spatial_version < (4, 0, 0):
unsupported.update({'Perimeter', 'Reverse'})
elif not self.lwgeom_version():
unsupported.add('GeoHash')
return unsupported
@cached_property
def spatial_version(self):
"""Determine the version of the SpatiaLite library."""
try:
version = self.spatialite_version_tuple()[1:]
except Exception as msg:
new_msg = (
'Cannot determine the SpatiaLite version for the "%s" '
'database (error was "%s"). Was the SpatiaLite initialization '
'SQL loaded on this database?') % (self.connection.settings_dict['NAME'], msg)
six.reraise(ImproperlyConfigured, ImproperlyConfigured(new_msg), sys.exc_info()[2])
if version < (3, 0, 0):
raise ImproperlyConfigured('GeoDjango only supports SpatiaLite versions 3.0.0 and above.')
return version
def convert_extent(self, box, srid):
"""
Convert the polygon data received from SpatiaLite to min/max values.
"""
if box is None:
return None
shell = Geometry(box, srid).shell
xmin, ymin = shell[0][:2]
xmax, ymax = shell[2][:2]
return (xmin, ymin, xmax, ymax)
def convert_geom(self, wkt, geo_field):
"""
Converts geometry WKT returned from a SpatiaLite aggregate.
"""
if wkt:
return Geometry(wkt, geo_field.srid)
else:
return None
def geo_db_type(self, f):
"""
Returns None because geometry columns are added via the
`AddGeometryColumn` stored procedure on SpatiaLite.
"""
return None
def get_distance(self, f, value, lookup_type, **kwargs):
"""
Returns the distance parameters for the given geometry field,
lookup value, and lookup type. SpatiaLite only supports regular
cartesian-based queries (no spheroid/sphere calculations for point
geometries like PostGIS).
"""
if not value:
return []
value = value[0]
if isinstance(value, Distance):
if f.geodetic(self.connection):
raise ValueError('SpatiaLite does not support distance queries on '
'geometry fields with a geodetic coordinate system '
'using Distance objects; use a numeric value of your '
'distance in degrees instead.')
else:
dist_param = getattr(value, Distance.unit_attname(f.units_name(self.connection)))
else:
dist_param = value
return [dist_param]
def get_geom_placeholder(self, f, value, compiler):
"""
Provides a proper substitution value for Geometries that are not in the
SRID of the field. Specifically, this routine will substitute in the
Transform() and GeomFromText() function call(s).
"""
def transform_value(value, srid):
return not (value is None or value.srid == srid)
if hasattr(value, 'as_sql'):
if transform_value(value, f.srid):
placeholder = '%s(%%s, %s)' % (self.transform, f.srid)
else:
placeholder = '%s'
# No geometry value used for F expression, substitute in
# the column name instead.
sql, _ = compiler.compile(value)
return placeholder % sql
else:
if transform_value(value, f.srid):
# Adding Transform() to the SQL placeholder.
return '%s(%s(%%s,%s), %s)' % (self.transform, self.from_text, value.srid, f.srid)
else:
return '%s(%%s,%s)' % (self.from_text, f.srid)
def _get_spatialite_func(self, func):
"""
Helper routine for calling SpatiaLite functions and returning
their result.
Any error occurring in this method should be handled by the caller.
"""
cursor = self.connection._cursor()
try:
cursor.execute('SELECT %s' % func)
row = cursor.fetchone()
finally:
cursor.close()
return row[0]
def geos_version(self):
"Returns the version of GEOS used by SpatiaLite as a string."
return self._get_spatialite_func('geos_version()')
def proj4_version(self):
"Returns the version of the PROJ.4 library used by SpatiaLite."
return self._get_spatialite_func('proj4_version()')
def lwgeom_version(self):
"""Return the version of LWGEOM library used by SpatiaLite."""
return self._get_spatialite_func('lwgeom_version()')
def spatialite_version(self):
"Returns the SpatiaLite library version as a string."
return self._get_spatialite_func('spatialite_version()')
def spatialite_version_tuple(self):
"""
Returns the SpatiaLite version as a tuple (version string, major,
minor, subminor).
"""
version = self.spatialite_version()
m = self.version_regex.match(version)
if m:
major = int(m.group('major'))
minor1 = int(m.group('minor1'))
minor2 = int(m.group('minor2'))
else:
raise Exception('Could not parse SpatiaLite version string: %s' % version)
return (version, major, minor1, minor2)
def spatial_aggregate_name(self, agg_name):
"""
Returns the spatial aggregate SQL template and function for the
given Aggregate instance.
"""
agg_name = 'unionagg' if agg_name.lower() == 'union' else agg_name.lower()
return getattr(self, agg_name)
# Routines for getting the OGC-compliant models.
def geometry_columns(self):
from django.contrib.gis.db.backends.spatialite.models import SpatialiteGeometryColumns
return SpatialiteGeometryColumns
def spatial_ref_sys(self):
from django.contrib.gis.db.backends.spatialite.models import SpatialiteSpatialRefSys
return SpatialiteSpatialRefSys
def get_db_converters(self, expression):
converters = super(SpatiaLiteOperations, self).get_db_converters(expression)
if hasattr(expression.output_field, 'geom_type'):
converters.append(self.convert_geometry)
return converters
def convert_geometry(self, value, expression, connection, context):
if value:
value = Geometry(value)
if 'transformed_srid' in context:
value.srid = context['transformed_srid']
return value
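# A standalone sketch (no database required) of the parsing performed by
# spatialite_version_tuple() above; '4.3.0' is a hypothetical example
# version string.
if __name__ == '__main__':
    demo = re.compile(r'^(?P<major>\d)\.(?P<minor1>\d)\.(?P<minor2>\d+)').match('4.3.0')
    print((int(demo.group('major')), int(demo.group('minor1')), int(demo.group('minor2'))))
    # expected: (4, 3, 0)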
|
mit
|
skiselev/upm
|
examples/python/relay.py
|
7
|
1896
|
from __future__ import print_function
# Author: Sarah Knepper <[email protected]>
# Copyright (c) 2015 Intel Corporation.
#
# Permission is hereby granted, free of charge, to any person obtaining
# a copy of this software and associated documentation files (the
# "Software"), to deal in the Software without restriction, including
# without limitation the rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of the Software, and to
# permit persons to whom the Software is furnished to do so, subject to
# the following conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
# LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
# OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
# WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
import time
from upm import pyupm_grove as grove
def main():
# Create the relay switch object using GPIO pin 0
relay = grove.Relay(0)
# Close and then open the relay switch 3 times,
# waiting one second each time. The LED on the relay switch
# will light up when the switch is on (closed).
# The switch will also make a noise between transitions.
for i in range(0, 3):
relay.on()
if relay.isOn():
print(relay.name(), 'is on')
time.sleep(1)
relay.off()
if relay.isOff():
print(relay.name(), 'is off')
time.sleep(1)
# Delete the relay switch object
del relay
if __name__ == '__main__':
main()
|
mit
|
richardcs/ansible
|
lib/ansible/plugins/lookup/mongodb.py
|
84
|
8872
|
# (c) 2016, Marcos Diez <[email protected]>
# https://github.com/marcosdiez/
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
from __future__ import (absolute_import, division, print_function)
from ansible.module_utils.six import string_types, integer_types
__metaclass__ = type
DOCUMENTATION = '''
author: 'Marcos Diez <marcos (at) unitron.com.br>'
lookup: mongodb
version_added: "2.3"
short_description: lookup info from MongoDB
description:
- 'The ``MongoDB`` lookup runs the *find()* command on a given *collection* on a given *MongoDB* server.'
- 'The result is a list of JSON-serializable dicts, so slightly different from what PyMongo returns. In particular, *timestamps* are converted to epoch integers.'
options:
connection_string:
description:
- Can be any valid MongoDB connection string, supporting authentication, replicasets, etc.
- "More info at U(https://docs.mongodb.org/manual/reference/connection-string/)"
default: "mongodb://localhost/"
database:
description:
- Name of the database on which the query will be made
required: True
collection:
description:
- Name of the collection on which the query will be made
required: True
filter:
description:
- Criteria for filtering the output
type: 'dict'
default: '{}'
projection:
description:
- Fields you want returned
type: dict
default: "{}"
skip:
description:
- How many results should be skipped
type: integer
limit:
description:
- How many results should be shown
type: integer
sort:
description:
- Sorting rules. Please note that the constants are replaced by strings.
type: list
default: "[]"
notes:
- "Please check https://api.mongodb.org/python/current/api/pymongo/collection.html?highlight=find#pymongo.collection.Collection.find for more details."
requirements:
- pymongo >= 2.4 (python library)
'''
EXAMPLES = '''
- hosts: all
gather_facts: false
vars:
mongodb_parameters:
#mandatory parameters
database: 'local'
#optional
collection: "startup_log"
connection_string: "mongodb://localhost/"
extra_connection_parameters: { "ssl" : True , "ssl_certfile": "/etc/self_signed_certificate.pem" }
#optional query parameters, we accept any parameter from the normal mongodb query.
filter: { "hostname": "batman" }
projection: { "pid": True , "_id" : False , "hostname" : True }
skip: 0
limit: 1
sort: [ [ "startTime" , "ASCENDING" ] , [ "age", "DESCENDING" ] ]
tasks:
- debug: msg="Mongo has already started with the following PID [{{ item.pid }}]"
with_mongodb: "{{mongodb_parameters}}"
'''
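# For reference, the lookup above is roughly equivalent to this direct
# PyMongo sketch (a hedged illustration, deliberately commented out so
# the plugin imports cleanly; assumes pymongo >= 3 and a local mongod,
# and the filter value is a hypothetical example):
#
# from pymongo import MongoClient
# client = MongoClient('mongodb://localhost/')
# for doc in client['local']['startup_log'].find(
#         filter={'hostname': 'batman'},
#         projection={'pid': True, '_id': False, 'hostname': True},
#         skip=0, limit=1):
#     print(doc)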
import datetime
try:
from pymongo import ASCENDING, DESCENDING
from pymongo.errors import ConnectionFailure
from pymongo import MongoClient
except ImportError:
try: # for older PyMongo 2.2
from pymongo import Connection as MongoClient
except ImportError:
pymongo_found = False
else:
pymongo_found = True
else:
pymongo_found = True
from ansible.errors import AnsibleError
from ansible.plugins.lookup import LookupBase
class LookupModule(LookupBase):
def _fix_sort_parameter(self, sort_parameter):
if sort_parameter is None:
return sort_parameter
if not isinstance(sort_parameter, list):
raise AnsibleError(u"Error. Sort parameters must be a list, not [ {0} ]".format(sort_parameter))
for item in sort_parameter:
self._convert_sort_string_to_constant(item)
return sort_parameter
def _convert_sort_string_to_constant(self, item):
original_sort_order = item[1]
sort_order = original_sort_order.upper()
if sort_order == u"ASCENDING":
item[1] = ASCENDING
elif sort_order == u"DESCENDING":
item[1] = DESCENDING
# else the user knows what s/he is doing and we won't predict. PyMongo will return an error if necessary
def convert_mongo_result_to_valid_json(self, result):
if result is None:
return result
if isinstance(result, integer_types + (float, bool)):
return result
if isinstance(result, string_types):
return result
elif isinstance(result, list):
new_list = []
for elem in result:
new_list.append(self.convert_mongo_result_to_valid_json(elem))
return new_list
elif isinstance(result, dict):
new_dict = {}
for key in result.keys():
value = result[key] # python2 and 3 compatible....
new_dict[key] = self.convert_mongo_result_to_valid_json(value)
return new_dict
elif isinstance(result, datetime.datetime):
# epoch
return (result - datetime.datetime(1970, 1, 1)).total_seconds()
else:
# failsafe
return u"{0}".format(result)
def run(self, terms, variables, **kwargs):
ret = []
for term in terms:
u'''
Makes a MongoDB query and returns the output as a valid list of json.
Timestamps are converted to epoch integers/longs.
Here is a sample playbook that uses it:
-------------------------------------------------------------------------------
- hosts: all
gather_facts: false
vars:
mongodb_parameters:
#optional parameter, default = "mongodb://localhost/"
# connection_string: "mongodb://localhost/"
#mandatory parameters
database: 'local'
collection: "startup_log"
#optional query parameters
#we accept any parameter from the normal mongodb query.
# the official documentation is here
# https://api.mongodb.org/python/current/api/pymongo/collection.html?highlight=find#pymongo.collection.Collection.find
# filter: { "hostname": "batman" }
# projection: { "pid": True , "_id" : False , "hostname" : True }
# skip: 0
# limit: 1
# sort: [ [ "startTime" , "ASCENDING" ] , [ "age", "DESCENDING" ] ]
# extra_connection_parameters = { }
# dictionary with extra parameters like ssl, ssl_keyfile, maxPoolSize etc...
# the full list is available here. It varies from PyMongo version
# https://api.mongodb.org/python/current/api/pymongo/mongo_client.html#pymongo.mongo_client.MongoClient
tasks:
- debug: msg="Mongo has already started with the following PID [{{ item.pid }}] - full_data {{ item }} "
with_items:
- "{{ lookup('mongodb', mongodb_parameters) }}"
-------------------------------------------------------------------------------
'''
connection_string = term.get(u'connection_string', u"mongodb://localhost")
database = term[u"database"]
collection = term[u'collection']
extra_connection_parameters = term.get(u'extra_connection_parameters', {})
if u"extra_connection_parameters" in term:
del term[u"extra_connection_parameters"]
if u"connection_string" in term:
del term[u"connection_string"]
del term[u"database"]
del term[u"collection"]
if u"sort" in term:
term[u"sort"] = self._fix_sort_parameter(term[u"sort"])
# all other parameters are sent to mongo, so we are future and past proof
try:
client = MongoClient(connection_string, **extra_connection_parameters)
results = client[database][collection].find(**term)
for result in results:
result = self.convert_mongo_result_to_valid_json(result)
ret.append(result)
except ConnectionFailure as e:
raise AnsibleError(u'unable to connect to database: %s' % str(e))
return ret
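# A standalone sketch of the timestamp handling in
# convert_mongo_result_to_valid_json() above: BSON datetimes become
# seconds since the Unix epoch (the sample date is a hypothetical
# example; the guard keeps plugin imports side-effect free).
if __name__ == '__main__':
    epoch = (datetime.datetime(2016, 1, 1) - datetime.datetime(1970, 1, 1)).total_seconds()
    print(epoch)  # expected: 1451606400.0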
|
gpl-3.0
|
pablorecio/resistencia-1812
|
resistencia/gui/round_results.py
|
1
|
6387
|
# -*- coding: utf-8 -*-
###############################################################################
# This file is part of Resistencia Cadiz 1812. #
# #
# This program is free software: you can redistribute it and/or modify #
# it under the terms of the GNU General Public License as published by #
# the Free Software Foundation, either version 3 of the License, or #
# any later version. #
# #
# This program is distributed in the hope that it will be useful, #
# but WITHOUT ANY WARRANTY; without even the implied warranty of #
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the #
# GNU General Public License for more details. #
# #
# You should have received a copy of the GNU General Public License #
# along with this program. If not, see <http://www.gnu.org/licenses/>. #
# #
# Copyright (C) 2010, Pablo Recio Quijano, <[email protected]> #
###############################################################################
import gtk
from resistencia import xdg
from resistencia.nls import gettext as _
def _draw_string(string, color):
return '<span foreground="' + color + '"><b>' + string + '</b></span>'
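# A tiny illustration of the Pango markup built by _draw_string() above
# (team name and colour are hypothetical examples):
assert _draw_string('Team A', '#0C0C9D') == \
    '<span foreground="#0C0C9D"><b>Team A</b></span>'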
class roundResults:
def add_column(self, list_view, title, columnId):
column = gtk.TreeViewColumn(title, gtk.CellRendererText(), markup=columnId)
list_view.append_column(column)
def fill_classification(self):
i = 1
top = int(len(self.classifications) / 2)
for e in self.classifications:
if e[0] == 'aux_ghost_team':
top = top - 1
color = '#0C0C9D'
print self.classifications
for e in self.classifications:
name = e[0]
if not name == 'aux_ghost_team':
if self.show_top_teams and (i - 1) < top:
name = _draw_string(name, color)
self.list_store_classifications.append((i, name, e[1]))
i = i + 1
def fill_results(self):
for e in self.results:
teamA = e[0][0].replace('aux_ghost_team', _('Rests'))
teamB = e[0][1].replace('aux_ghost_team', _('Rests'))
win_color = '#0C0C9D'
draw_color = '#5DEA5D'
if e[1] == 1:
teamA = _draw_string(teamA, win_color)
elif e[1] == -1:
teamB = _draw_string(teamB, win_color)
else: #draw
teamA = _draw_string(teamA, draw_color)
teamB = _draw_string(teamB, draw_color)
self.list_store_results.append((teamA, teamB))
def __init__(self, classification, results, round, rounds,
show_classifications=True, show_top_teams=False): #add parent
builder = gtk.Builder()
builder.add_from_file(xdg.get_data_path('glade/results.glade'))
self.classifications = classification
self.results = results
self.round = round
self.rounds = rounds
self.show_top_teams = show_top_teams
self.result_dialog = builder.get_object('dlg_results')
title = self.result_dialog.get_title() + ' ' + str(round) + '/' + str(rounds)
self.result_dialog.set_title(title)
self.confirmation_dialog = builder.get_object('dlg_confirmation_close')
self.confirmation_dialog.connect('response', lambda d, r: d.hide())
self.confirmation_dialog.set_transient_for(self.result_dialog)
self.finalround_dialog = builder.get_object('dlg_finalround')
self.finalround_dialog.connect('response', lambda d, r: d.hide())
self.finalround_dialog.set_transient_for(self.result_dialog)
self.list_view_classifications = builder.get_object('treeview_classification')
self.list_view_results = builder.get_object('treeview_results')
if show_classifications:
self.cPosition = 0
self.cTeamName = 1
self.cPuntuations = 2
self.sPosition = 'Pos'
self.sTeamName = _('Team name')
self.sPuntuations = 'Punt'
self.add_column(self.list_view_classifications,
self.sPosition, self.cPosition)
self.add_column(self.list_view_classifications,
self.sTeamName, self.cTeamName)
self.add_column(self.list_view_classifications,
self.sPuntuations, self.cPuntuations)
self.list_store_classifications = builder.get_object('list_classification')
if show_classifications:
self.fill_classification()
else:
builder.get_object('hbox1').remove(builder.get_object('frame_classifications'))
self.cTeamA = 0
self.cTeamB = 1
self.sTeamA = _('Team A')
self.sTeamB = _('Team B')
self.add_column(self.list_view_results, self.sTeamA, self.cTeamA)
self.add_column(self.list_view_results, self.sTeamB, self.cTeamB)
self.list_store_results = builder.get_object('list_results')
self.fill_results()
self.end_contest = False
builder.connect_signals(self)
def on_dlg_results_show(self, data=None):
print 'on_dlg_results_show'
if self.round == self.rounds:
self.finalround_dialog.run()
def on_btn_results_cancel_clicked(self, widget):
self.confirmation_dialog.run()
def on_btn_results_next_clicked(self, widget):
self.result_dialog.hide()
def on_dlg_results_close(self, widget, data=None):
self.result_dialog.destroy()
def on_btn_confirmation_apply_clicked(self, widget, data=None):
self.confirmation_dialog.destroy()
self.result_dialog.destroy()
self.end_contest = True
def on_btn_confirmation_cancel_clicked(self, widget, data=None):
self.confirmation_dialog.hide()
self.result_dialog.run()
|
gpl-3.0
|
rysson/filmkodi
|
plugin.video.mrknow/lib/parser2.py
|
2
|
28077
|
# -*- coding: utf-8 -*-
import common
import sys, os, traceback
import time
import random
import re
import urllib
import string
from string import lower
from entities.CList import CList
from entities.CItemInfo import CItemInfo
from entities.CListItem import CListItem
from entities.CRuleItem import CRuleItem
import customReplacements as cr
import customConversions as cc
from utils import decryptionUtils as crypt
from utils import datetimeUtils as dt
from utils import rowbalance as rb
from utils.fileUtils import findInSubdirectory, getFileContent, getFileExtension
from utils.scrapingUtils import findVideoFrameLink, findContentRefreshLink, findRTMP, findJS, findPHP, getHostName, findEmbedPHPLink
from common import getHTML
import requests
def mydump(obj):
'''Return a printable representation of an object for debugging.'''
newobj=obj
if '__dict__' in dir(obj):
newobj=obj.__dict__
if ' object at ' in unicode(obj) and not newobj.has_key('__type__'):
newobj['__type__']=unicode(obj)
for attr in newobj:
newobj[attr]=mydump(newobj[attr])
return newobj
class ParsingResult(object):
class Code:
SUCCESS = 0
CFGFILE_NOT_FOUND = 1
CFGSYNTAX_INVALID = 2
WEBREQUEST_FAILED = 3
def __init__(self, code, itemsList):
self.code = code
self.list = itemsList
self.message = None
class Parser2(object):
"""
returns a list of items
"""
def parse(self, lItem):
url = lItem['url']
cfg = lItem['cfg']
ext = getFileExtension(url)
successfullyScraped = True
tmpList = None
if lItem['catcher']:
catcher = lItem['catcher']
cfg = os.path.join(common.Paths.catchersDir, '__' + catcher + '.cfg')
tmpList = self.__loadLocal(cfg, lItem)
if tmpList and len(tmpList.rules) > 0:
successfullyScraped = self.__loadRemote(tmpList, lItem)
else:
if ext == 'cfg':
tmpList = self.__loadLocal(url, lItem)
if tmpList and tmpList.start != '' and len(tmpList.rules) > 0:
lItem['url'] = tmpList.start
successfullyScraped = self.__loadRemote(tmpList, lItem)
elif cfg:
tmpList = self.__loadLocal(cfg, lItem)
if tmpList and len(tmpList.rules) > 0:
successfullyScraped = self.__loadRemote(tmpList, lItem)
# autoselect
if tmpList and tmpList.skill.find('autoselect') != -1 and len(tmpList.items) == 1:
m = tmpList.items[0]
m_type = m['type']
if m_type == 'rss':
common.log('Autoselect - ' + m['title'])
lItem = m
tmpList = self.parse(lItem).list
if not tmpList:
return ParsingResult(ParsingResult.Code.CFGSYNTAX_INVALID, None)
if tmpList and successfullyScraped == False:
return ParsingResult(ParsingResult.Code.WEBREQUEST_FAILED, tmpList)
# Remove duplicates
if tmpList.skill.find('allowDuplicates') == -1:
urls = []
for i in range(len(tmpList.items)-1,-1,-1):
item = tmpList.items[i]
tmpUrl = item['url']
tmpCfg = item['cfg']
if not tmpCfg:
tmpCfg = ''
if (tmpUrl + '|' + tmpCfg) not in urls:
urls.append(tmpUrl + '|' + tmpCfg)
else:
tmpList.items.remove(item)
return ParsingResult(ParsingResult.Code.SUCCESS, tmpList)
"""
loads cfg, creates list and sets up rules for scraping
"""
def __loadLocal(self, filename, lItem = None):
params = []
#get Parameters
if filename.find('@') != -1:
params = filename.split('@')
filename = params.pop(0)
# get cfg file
cfg = filename
if not os.path.exists(cfg):
cfg = os.path.join(common.Paths.modulesDir, filename)
if not os.path.exists(cfg):
tmpPath = os.path.dirname(os.path.join(common.Paths.modulesDir, lItem["definedIn"]))
cfg = os.path.join(tmpPath ,filename)
if not os.path.exists(cfg):
srchFilename = filename
if filename.find('/') > -1:
srchFilename = srchFilename.split('/')[1]
try:
cfg = findInSubdirectory(srchFilename, common.Paths.modulesDir)
except:
try:
cfg = findInSubdirectory(srchFilename, common.Paths.favouritesFolder)
except:
try:
cfg = findInSubdirectory(srchFilename, common.Paths.customModulesDir)
except:
common.log('File not found: ' + srchFilename)
return None
#load file and apply parameters
data = getFileContent(cfg)
data = cr.CustomReplacements().replace(os.path.dirname(cfg), data, lItem, params)
#log
msg = 'Local file ' + filename + ' opened'
if len(params) > 0:
msg += ' with Parameter(s): '
msg += ",".join(params)
common.log(msg)
outputList = self.__parseCfg(filename, data, lItem)
return outputList
"""
scrape items according to rules and add them to the list
"""
def __loadRemote(self, inputList, lItem):
try:
inputList.curr_url = lItem['url']
count = 0
i = 1
maxits = 2 # 1 optimistic + 1 demystified
ignoreCache = False
demystify = False
back = ''
startUrl = inputList.curr_url
#print inputList, lItem
while count == 0 and i <= maxits:
if i > 1:
ignoreCache = True
demystify = True
# Trivial: url is from known streamer
if back:
lItem['referer'] = back
items = self.__parseHtml(inputList.curr_url, '"' + inputList.curr_url + '"', inputList.rules, inputList.skill, inputList.cfg, lItem)
count = len(items)
# try to find items in html source code
if count == 0:
referer = ''
if lItem['referer']:
referer = lItem['referer']
data = common.getHTML(inputList.curr_url, None, referer, False, False, ignoreCache, demystify)
if data == '':
return False
msg = 'Remote URL ' + inputList.curr_url + ' opened'
if demystify:
msg += ' (demystified)'
common.log(msg)
if inputList.section != '':
section = inputList.section
data = self.__getSection(data, section)
if lItem['section']:
section = lItem['section']
data = self.__getSection(data, section)
print("-----------",inputList.curr_url, inputList.skill, inputList.cfg, lItem)
items = self.__parseHtml(inputList.curr_url, data, inputList.rules, inputList.skill, inputList.cfg, lItem)
count = len(items)
common.log(' -> ' + str(count) + ' item(s) found')
# find rtmp stream
#common.log('Find rtmp stream')
if count == 0:
item = self.__findRTMP(data, startUrl, lItem)
if item:
items = []
items.append(item)
count = 1
# find embedding javascripts
#common.log('Find embedding javascripts')
if count == 0:
item = findJS(data)
if item:
firstJS = item[0]
streamId = firstJS[0]
jsUrl = firstJS[1]
if not jsUrl.startswith('http://'):
jsUrl = urllib.basejoin(startUrl,jsUrl)
streamerName = getHostName(jsUrl)
jsSource = getHTML(jsUrl, None, startUrl)
phpUrl = findPHP(jsSource, streamId)
if phpUrl:
data = getHTML(phpUrl, None, startUrl)
item = self.__findRTMP(data, phpUrl, lItem)
if item:
if streamerName:
item['title'] = item['title'].replace('RTMP', streamerName)
items = []
items.append(item)
count = 1
else:
red = phpUrl
try:
if (not red.startswith('http')): red='http:'+red
except: pass
common.log(' -> Redirect: ' + red)
if back == red:
break
back = inputList.curr_url
inputList.curr_url = red
common.log(str(len(inputList.items)) + ' items ' + inputList.cfg + ' -> ' + red)
startUrl = red
continue
# find redirects
#common.log('find redirects')
if count == 0:
red = self.__findRedirect(startUrl, inputList.curr_url)
if startUrl == red:
common.log(' -> No redirect found')
else:
try:
if (not red.startswith('http')): red = 'http:' + red
except:
pass
common.log(' -> Redirect: ' + red)
if back == red:
break
back = inputList.curr_url
inputList.curr_url = red
common.log(str(len(inputList.items)) + ' items ' + inputList.cfg + ' -> ' + red)
startUrl = red
i = 0
i += 1
if count != 0:
inputList.items = inputList.items + items
except:
traceback.print_exc(file = sys.stdout)
return False
return True
def __findRTMP(self, data, pageUrl, lItem):
rtmp = findRTMP(pageUrl, data)
if rtmp:
item = CListItem()
item['title'] = 'RTMP* - ' + rtmp[1]
item['type'] = 'video'
item['url'] = rtmp[0] + ' playPath=' + rtmp[1] + ' swfUrl=' + rtmp[2] +' swfVfy=1 live=true pageUrl=' + pageUrl
item.merge(lItem)
return item
return None
def __getSection(self, data, section):
p = re.compile(section, re.IGNORECASE + re.DOTALL + re.UNICODE)
m = p.search(data)
if m:
return m.group(0)
else:
common.log(' -> Section could not be found:' + section)
return data
def __findRedirect(self, page, referer='', demystify=False):
data = common.getHTML(page, None, referer=referer, xml=False, mobile=False, demystify=demystify)
if findContentRefreshLink(page, data):
return findContentRefreshLink(page, data)
elif findVideoFrameLink(page, data):
return findVideoFrameLink(page, data)
elif findEmbedPHPLink(data):
return findEmbedPHPLink(data)
if not demystify:
return self.__findRedirect(page, referer, True)
return page
def __parseCfg(self, cfgFile, data, lItem):
tmpList = CList()
data = data.replace('\r\n', '\n').split('\n')
items = []
tmp = None
hasOwnCfg = False
for m in data:
if m and m[0] != '#':
index = m.find('=')
if index != -1:
key = lower(m[:index]).strip()
value = m[index+1:]
index = value.find('|')
if value[:index] == 'sports.devil.locale':
value = common.translate(int(value[index+1:]))
elif value[:index] == 'sports.devil.image':
value = os.path.join(common.Paths.imgDir, value[index+1:])
if key == 'start':
tmpList.start = value
elif key == 'section':
tmpList.section = value
elif key == 'sort':
tmpList.sort = value
elif key == 'skill':
tmpList.skill = value
elif key == 'catcher':
tmpList.catcher = value
elif key == 'item_infos':
rule_tmp = CRuleItem()
hasOwnCfg = False
rule_tmp.infos = value
elif key == 'item_order':
rule_tmp.order = value
elif key == 'item_skill':
rule_tmp.skill = value
elif key == 'item_curr':
rule_tmp.curr = value
elif key == 'item_precheck':
rule_tmp.precheck = value
elif key.startswith('item_info'):
tmpkey = key[len('item_info'):]
if tmpkey == '_name':
info_tmp = CItemInfo()
info_tmp.name = value
if value == 'cfg':
hasOwnCfg = True
elif tmpkey == '_from':
info_tmp.src = value
elif tmpkey == '':
info_tmp.rule = value
elif tmpkey == '_default':
info_tmp.default = value
elif tmpkey == '_convert':
info_tmp.convert.append(value)
elif tmpkey == '_build':
info_tmp.build = value
rule_tmp.info_list.append(info_tmp)
elif key == 'item_url_build':
rule_tmp.url_build = value
if tmpList.catcher != '':
refInf = CItemInfo()
refInf.name = 'referer'
refInf.build = value
rule_tmp.info_list.append(refInf)
if not hasOwnCfg:
refInf = CItemInfo()
refInf.name = 'catcher'
refInf.build = tmpList.catcher
rule_tmp.info_list.append(refInf)
tmpList.rules.append(rule_tmp)
# static menu items (without regex)
elif key == 'title':
tmp = CListItem()
tmp['title'] = value
if tmpList.skill.find('videoTitle') > -1:
tmp['videoTitle'] = value
elif key == 'url':
tmp['url'] = value
if lItem:
tmp.merge(lItem)
if tmpList.catcher != '':
tmp['referer'] = value
if not hasOwnCfg:
tmp['catcher'] = tmpList.catcher
tmp['definedIn'] = cfgFile
items.append(tmp)
tmp = None
elif tmp != None:
if key == 'cfg':
hasOwnCfg = True
tmp[key] = value
tmpList.items = items
tmpList.cfg = cfgFile
return tmpList
def __parseHtml(self, url, data, rules, skills, definedIn, lItem):
#common.log('_parseHtml called' + url)
items = []
for item_rule in rules:
#common.log('rule: ' + item_rule.infos)
if not hasattr(item_rule, 'precheck') or (item_rule.precheck in data):
revid = re.compile(item_rule.infos, re.IGNORECASE + re.DOTALL + re.MULTILINE + re.UNICODE)
for reinfos in revid.findall(data):
tmp = CListItem()
if lItem['referer']:
tmp['referer'] = lItem['referer']
if item_rule.order.find('|') != -1:
infos_names = item_rule.order.split('|')
infos_values = list(reinfos)
i = 0
for name in infos_names:
tmp[name] = infos_values[i]
i = i+1
else:
tmp[item_rule.order] = reinfos
for info in item_rule.info_list:
info_value = tmp[info.name]
if info_value:
if info.build.find('%s') != -1:
tmpVal = info.build % info_value
tmp[info.name] = tmpVal
continue
if info.build.find('%s') != -1:
if '+' in info.src:
tmpArr = info.src.split('+')
src = ''
for t in tmpArr:
t = t.strip()
if t.find('\'') != -1:
src = src + t.strip('\'')
else:
src = src + (tmp[t] or '')
elif '||' in info.src:
variables = info.src.split('||')
src = firstNonEmpty(tmp, variables)
else:
src = tmp[info.src]
if src and info.convert != []:
tmp['referer'] = url
src = self.__parseCommands(tmp, src, info.convert)
if isinstance(src, dict):
for dKey in src:
tmp[dKey] = src[dKey]
src = src.values()[0]
info_value = info.build % (src)
else:
info_value = info.build
tmp[info.name] = info_value
if tmp['url']:
tmp['url'] = item_rule.url_build % (tmp['url'])
else:
tmp['url'] = url
tmp.merge(lItem)
if item_rule.skill.find('append') != -1:
tmp['url'] = url + tmp['url']
if item_rule.skill.find('space') != -1:
tmp['title'] = ' %s ' % tmp['title'].strip()
if skills.find('videoTitle') > -1:
tmp['videoTitle'] = tmp['title']
tmp['definedIn'] = definedIn
items.append(tmp)
return items
def __parseCommands(self, item, src, convCommands):
#common.log('_parseCommands called')
#common.log('_parseCommands called %s | %s | %s' % (item,src, convCommands))
# helping function
def parseCommand(txt):
command = {"command": txt, "params": ""}
if txt.find("(") > -1:
command["command"] = txt[0:txt.find("(")]
command["params"] = txt[len(command["command"]) + 1:-1]
return command
for convCommand in convCommands:
pComm = parseCommand(convCommand)
command = pComm["command"]
params = pComm["params"]
if params.find('@REFERER@') != -1:  # find() returns -1 when absent, which is truthy; compare explicitly
referer = item['referer']
if not referer:
referer = ''
params = params.replace('@REFERER@', referer)
if command == 'convDate':
src = cc.convDate(params, src)
elif command =='currenturl':
print("--------------curenturl ------------------------")
src= getFileContent(os.path.join(common.Paths.cacheDir, 'lasturl'))
print("--------------curenturl ------------------------",src)
elif command =='iklub':
common.log('--------------iklub ------------------------')
common.log('src: %s' % src)
src = cc.decodeIklub(src)
common.log('src: %s' % src)
#common.log('--------------ikulb ------------------------')
elif command =='decodemrknow2':
common.log('--------------decodemrknow2 ------------------------')
#common.log('src: %s' % src)
src = cc.decodeMrknow2(src)
#common.log('src: %s' % src)
elif command =='decodemrknow3':
common.log('--------------decodemrknow3 ------------------------')
common.log('src: %s' % src)
src = cc.decodeMrknow3(src)
#common.log('src: %s' % src)
elif command == 'convTimestamp':
src = cc.convTimestamp(params, src)
elif command == 'select':
src = cc.select(params, src)
if not src:
continue
elif command == 'unicode_escape':
src = src.decode('unicode-escape')
elif command == 'replaceFromDict':
dictName = str(params.strip('\''))
path = os.path.join(common.Paths.dictsDir, dictName + '.txt')
if not (os.path.exists(path)):
common.log('Dictionary file not found: ' + path)
continue
src = cc.replaceFromDict(path, src)
elif command == 'time':
src = time.time()
elif command == 'timediff':
src = dt.timediff(src,params.strip('\''))
elif command == 'offset':
src = cc.offset(params, src)
elif command == 'getSource':
src = cc.getSource(params, src)
elif command == 'quote':
try:
src = urllib.quote(params.strip("'").replace('%s', src),'')
except:
cleanParams = params.strip("'")
cleanParams = cleanParams.replace("%s",src)
src = urllib.quote(cleanParams.encode('utf-8'),'')
elif command == 'unquote':
src = urllib.unquote(params.strip("'").replace('%s', src))
elif command == 'parseText':
src = cc.parseText(item, params, src)
elif command == 'getInfo':
src = cc.getInfo(item, params, src)
elif command == 'getXML':
src = cc.getInfo(item, params, src, xml=True)
elif command == 'getMobile':
src = cc.getInfo(item, params, src, mobile=True)
elif command == 'decodeBase64':
src = cc.decodeBase64(src)
elif command == 'decodeRawUnicode':
src = cc.decodeRawUnicode(src)
elif command == 'resolve':
src = cc.resolve(src)
elif command == 'decodeXppod':
src = cc.decodeXppod(src)
elif command == 'decodeXppodHLS':
src = cc.decodeXppod_hls(src)
elif command == 'decodeMrknow1':
src = cc.decodeMrknow1(src)
elif command == 'replace':
src = cc.replace(params, src)
elif command == 'replaceRegex':
src = cc.replaceRegex(params, src)
elif command == 'ifEmpty':
src = cc.ifEmpty(item, params, src)
elif command == 'isEqual':
src = cc.isEqual(item, params, src)
elif command == 'ifFileExists':
src = cc.ifFileExists(item, params, src)
elif command == 'ifExists':
src = cc.ifExists(item, params, src)
elif command == 'encryptJimey':
src = crypt.encryptJimey(params.strip("'").replace('%s', src))
elif command == 'gAesDec':
src = crypt.gAesDec(src,item.infos[params])
elif command == 'aesDec':
src = crypt.aesDec(src,item.infos[params])
elif command == 'getCookies':
src = cc.getCookies(params, src)
elif command == 'destreamer':
src = crypt.destreamer(params.strip("'").replace('%s', src))
elif command == 'unixTimestamp':
src = dt.getUnixTimestamp()
elif command == 'rowbalance':
src = rb.get()
elif command == 'urlMerge':
src = cc.urlMerge(params, src)
elif command == 'translate':
try:
src = common.translate(int(src))
except:
pass
elif command == 'camelcase':
src = string.capwords(string.capwords(src, '-'))
elif command == 'lowercase':
src = string.lower(src)
elif command == 'reverse':
src = src[::-1]
elif command == 'demystify':
print 'demystify'
src = crypt.doDemystify(src)
print 'after demystify',src
elif command == 'random':
paramArr = params.split(',')
minimum = int(paramArr[0])
maximum = int(paramArr[1])
src = str(random.randrange(minimum,maximum))
elif command == 'debug':
common.log('--------------debug ------------------------')
common.log('Debug from cfg file: ' + src)
elif command == 'divide':
paramArr = params.split(',')
a = paramArr[0].strip().strip("'").replace('%s', src)
a = resolveVariable(a, item)
b = paramArr[1].strip().strip("'").replace('%s', src)
b = resolveVariable(b, item)
if not a or not b:
continue
a = int(a)
b = int(b)
try:
src = str(a/b)
except:
pass
return src
def resolveVariable(varStr, item):
if varStr.startswith('@') and varStr.endswith('@'):
return item.getInfo(varStr.strip('@'))
return varStr
def firstNonEmpty(tmp, variables):
for v in variables:
vClean = v.strip()
if vClean.find("'") != -1:
vClean = vClean.strip("'")
else:
vClean = tmp.getInfo(vClean)
if vClean != '':
return vClean
return ''
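# A standalone sketch of the parseCommand() helper used inside
# __parseCommands() above: a conversion entry such as "replace('a','b')"
# splits into a command name and a raw parameter string.
if __name__ == '__main__':
    def parse_command_demo(txt):
        command = {'command': txt, 'params': ''}
        if txt.find('(') > -1:
            command['command'] = txt[0:txt.find('(')]
            command['params'] = txt[len(command['command']) + 1:-1]
        return command
    print(parse_command_demo("replace('a','b')"))
    # expected: {'command': 'replace', 'params': "'a','b'"}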
|
apache-2.0
|
samithaj/headphones
|
lib/pytz/__init__.py
|
61
|
34011
|
'''
datetime.tzinfo timezone definitions generated from the
Olson timezone database:
ftp://elsie.nci.nih.gov/pub/tz*.tar.gz
See the datetime section of the Python Library Reference for information
on how to use these modules.
'''
# The Olson database is updated several times a year.
OLSON_VERSION = '2014j'
VERSION = '2014.10' # Switching to pip compatible version numbering.
__version__ = VERSION
OLSEN_VERSION = OLSON_VERSION # Old releases had this misspelling
__all__ = [
'timezone', 'utc', 'country_timezones', 'country_names',
'AmbiguousTimeError', 'InvalidTimeError',
'NonExistentTimeError', 'UnknownTimeZoneError',
'all_timezones', 'all_timezones_set',
'common_timezones', 'common_timezones_set',
]
import sys, datetime, os.path, gettext
try:
from pkg_resources import resource_stream
except ImportError:
resource_stream = None
from pytz.exceptions import AmbiguousTimeError
from pytz.exceptions import InvalidTimeError
from pytz.exceptions import NonExistentTimeError
from pytz.exceptions import UnknownTimeZoneError
from pytz.lazy import LazyDict, LazyList, LazySet
from pytz.tzinfo import unpickler
from pytz.tzfile import build_tzinfo, _byte_string
try:
unicode
except NameError: # Python 3.x
# Python 3.x doesn't have unicode(), making writing code
# for Python 2.3 and Python 3.x a pain.
unicode = str
def ascii(s):
r"""
>>> ascii('Hello')
'Hello'
>>> ascii('\N{TRADE MARK SIGN}') #doctest: +IGNORE_EXCEPTION_DETAIL
Traceback (most recent call last):
...
UnicodeEncodeError: ...
"""
s.encode('US-ASCII') # Raise an exception if not ASCII
return s # But return the original string - not a byte string.
else: # Python 2.x
def ascii(s):
r"""
>>> ascii('Hello')
'Hello'
>>> ascii(u'Hello')
'Hello'
>>> ascii(u'\N{TRADE MARK SIGN}') #doctest: +IGNORE_EXCEPTION_DETAIL
Traceback (most recent call last):
...
UnicodeEncodeError: ...
"""
return s.encode('US-ASCII')
def open_resource(name):
"""Open a resource from the zoneinfo subdir for reading.
Uses the pkg_resources module if available and no standard file
found at the calculated location.
"""
name_parts = name.lstrip('/').split('/')
for part in name_parts:
if part == os.path.pardir or os.path.sep in part:
raise ValueError('Bad path segment: %r' % part)
filename = os.path.join(os.path.dirname(__file__),
'zoneinfo', *name_parts)
if not os.path.exists(filename) and resource_stream is not None:
# http://bugs.launchpad.net/bugs/383171 - we avoid using this
# unless absolutely necessary to help when a broken version of
# pkg_resources is installed.
return resource_stream(__name__, 'zoneinfo/' + name)
return open(filename, 'rb')
def resource_exists(name):
"""Return true if the given resource exists"""
try:
open_resource(name).close()
return True
except IOError:
return False
# Enable this when we get some translations?
# We want an i18n API that is useful to programs using Python's gettext
# module, as well as the Zope3 i18n package. Perhaps we should just provide
# the POT file and translations, and leave it up to callers to make use
# of them.
#
# t = gettext.translation(
# 'pytz', os.path.join(os.path.dirname(__file__), 'locales'),
# fallback=True
# )
# def _(timezone_name):
# """Translate a timezone name using the current locale, returning Unicode"""
# return t.ugettext(timezone_name)
_tzinfo_cache = {}
def timezone(zone):
r''' Return a datetime.tzinfo implementation for the given timezone
>>> from datetime import datetime, timedelta
>>> utc = timezone('UTC')
>>> eastern = timezone('US/Eastern')
>>> eastern.zone
'US/Eastern'
>>> timezone(unicode('US/Eastern')) is eastern
True
>>> utc_dt = datetime(2002, 10, 27, 6, 0, 0, tzinfo=utc)
>>> loc_dt = utc_dt.astimezone(eastern)
>>> fmt = '%Y-%m-%d %H:%M:%S %Z (%z)'
>>> loc_dt.strftime(fmt)
'2002-10-27 01:00:00 EST (-0500)'
>>> (loc_dt - timedelta(minutes=10)).strftime(fmt)
'2002-10-27 00:50:00 EST (-0500)'
>>> eastern.normalize(loc_dt - timedelta(minutes=10)).strftime(fmt)
'2002-10-27 01:50:00 EDT (-0400)'
>>> (loc_dt + timedelta(minutes=10)).strftime(fmt)
'2002-10-27 01:10:00 EST (-0500)'
Raises UnknownTimeZoneError if passed an unknown zone.
>>> try:
... timezone('Asia/Shangri-La')
... except UnknownTimeZoneError:
... print('Unknown')
Unknown
>>> try:
... timezone(unicode('\N{TRADE MARK SIGN}'))
... except UnknownTimeZoneError:
... print('Unknown')
Unknown
'''
if zone.upper() == 'UTC':
return utc
try:
zone = ascii(zone)
except UnicodeEncodeError:
# All valid timezones are ASCII
raise UnknownTimeZoneError(zone)
zone = _unmunge_zone(zone)
if zone not in _tzinfo_cache:
if zone in all_timezones_set:
fp = open_resource(zone)
try:
_tzinfo_cache[zone] = build_tzinfo(zone, fp)
finally:
fp.close()
else:
raise UnknownTimeZoneError(zone)
return _tzinfo_cache[zone]
def _unmunge_zone(zone):
"""Undo the time zone name munging done by older versions of pytz."""
return zone.replace('_plus_', '+').replace('_minus_', '-')
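# A tiny illustration of the unmunging above; the munged names are
# hypothetical examples of the form written by very old pytz releases.
assert _unmunge_zone('Etc/GMT_plus_5') == 'Etc/GMT+5'
assert _unmunge_zone('Etc/GMT_minus_3') == 'Etc/GMT-3'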
ZERO = datetime.timedelta(0)
HOUR = datetime.timedelta(hours=1)
class UTC(datetime.tzinfo):
"""UTC
Optimized UTC implementation. It unpickles using the single module global
instance defined beneath this class declaration.
"""
zone = "UTC"
_utcoffset = ZERO
_dst = ZERO
_tzname = zone
def fromutc(self, dt):
if dt.tzinfo is None:
return self.localize(dt)
return super(utc.__class__, self).fromutc(dt)
def utcoffset(self, dt):
return ZERO
def tzname(self, dt):
return "UTC"
def dst(self, dt):
return ZERO
def __reduce__(self):
return _UTC, ()
def localize(self, dt, is_dst=False):
'''Convert naive time to local time'''
if dt.tzinfo is not None:
raise ValueError('Not naive datetime (tzinfo is already set)')
return dt.replace(tzinfo=self)
def normalize(self, dt, is_dst=False):
'''Correct the timezone information on the given datetime'''
if dt.tzinfo is self:
return dt
if dt.tzinfo is None:
raise ValueError('Naive time - no tzinfo set')
return dt.astimezone(self)
def __repr__(self):
return "<UTC>"
def __str__(self):
return "UTC"
UTC = utc = UTC() # UTC is a singleton
def _UTC():
"""Factory function for utc unpickling.
Makes sure that unpickling a utc instance always returns the same
module global.
These examples belong in the UTC class above, but it is obscured; or in
the README.txt, but we are not depending on Python 2.4 so integrating
the README.txt examples with the unit tests is not trivial.
>>> import datetime, pickle
>>> dt = datetime.datetime(2005, 3, 1, 14, 13, 21, tzinfo=utc)
>>> naive = dt.replace(tzinfo=None)
>>> p = pickle.dumps(dt, 1)
>>> naive_p = pickle.dumps(naive, 1)
>>> len(p) - len(naive_p)
17
>>> new = pickle.loads(p)
>>> new == dt
True
>>> new is dt
False
>>> new.tzinfo is dt.tzinfo
True
>>> utc is UTC is timezone('UTC')
True
>>> utc is timezone('GMT')
False
"""
return utc
_UTC.__safe_for_unpickling__ = True
def _p(*args):
"""Factory function for unpickling pytz tzinfo instances.
Just a wrapper around tzinfo.unpickler to save a few bytes in each pickle
by shortening the path.
"""
return unpickler(*args)
_p.__safe_for_unpickling__ = True
class _CountryTimezoneDict(LazyDict):
"""Map ISO 3166 country code to a list of timezone names commonly used
in that country.
iso3166_code is the two letter code used to identify the country.
>>> def print_list(list_of_strings):
... 'We use a helper so doctests work under Python 2.3 -> 3.x'
... for s in list_of_strings:
... print(s)
>>> print_list(country_timezones['nz'])
Pacific/Auckland
Pacific/Chatham
>>> print_list(country_timezones['ch'])
Europe/Zurich
>>> print_list(country_timezones['CH'])
Europe/Zurich
>>> print_list(country_timezones[unicode('ch')])
Europe/Zurich
>>> print_list(country_timezones['XXX'])
Traceback (most recent call last):
...
KeyError: 'XXX'
Previously, this information was exposed as a function rather than a
dictionary. This is still supported::
>>> print_list(country_timezones('nz'))
Pacific/Auckland
Pacific/Chatham
"""
def __call__(self, iso3166_code):
"""Backwards compatibility."""
return self[iso3166_code]
def _fill(self):
data = {}
zone_tab = open_resource('zone.tab')
try:
for line in zone_tab:
line = line.decode('US-ASCII')
if line.startswith('#'):
continue
code, coordinates, zone = line.split(None, 4)[:3]
if zone not in all_timezones_set:
continue
try:
data[code].append(zone)
except KeyError:
data[code] = [zone]
self.data = data
finally:
zone_tab.close()
country_timezones = _CountryTimezoneDict()
class _CountryNameDict(LazyDict):
'''Dictionary providing ISO3166 code -> English name.
>>> print(country_names['au'])
Australia
'''
def _fill(self):
data = {}
zone_tab = open_resource('iso3166.tab')
try:
for line in zone_tab.readlines():
line = line.decode('US-ASCII')
if line.startswith('#'):
continue
code, name = line.split(None, 1)
data[code] = name.strip()
self.data = data
finally:
zone_tab.close()
country_names = _CountryNameDict()
# Time-zone info based solely on fixed offsets
class _FixedOffset(datetime.tzinfo):
zone = None # to match the standard pytz API
def __init__(self, minutes):
if abs(minutes) >= 1440:
raise ValueError("absolute offset is too large", minutes)
self._minutes = minutes
self._offset = datetime.timedelta(minutes=minutes)
def utcoffset(self, dt):
return self._offset
def __reduce__(self):
return FixedOffset, (self._minutes, )
def dst(self, dt):
return ZERO
def tzname(self, dt):
return None
def __repr__(self):
return 'pytz.FixedOffset(%d)' % self._minutes
def localize(self, dt, is_dst=False):
'''Convert naive time to local time'''
if dt.tzinfo is not None:
raise ValueError('Not naive datetime (tzinfo is already set)')
return dt.replace(tzinfo=self)
def normalize(self, dt, is_dst=False):
'''Correct the timezone information on the given datetime'''
if dt.tzinfo is None:
raise ValueError('Naive time - no tzinfo set')
return dt.replace(tzinfo=self)
def FixedOffset(offset, _tzinfos = {}):
"""return a fixed-offset timezone based off a number of minutes.
>>> one = FixedOffset(-330)
>>> one
pytz.FixedOffset(-330)
>>> one.utcoffset(datetime.datetime.now())
datetime.timedelta(-1, 66600)
>>> one.dst(datetime.datetime.now())
datetime.timedelta(0)
>>> two = FixedOffset(1380)
>>> two
pytz.FixedOffset(1380)
>>> two.utcoffset(datetime.datetime.now())
datetime.timedelta(0, 82800)
>>> two.dst(datetime.datetime.now())
datetime.timedelta(0)
The datetime.timedelta must be strictly between -1 and 1 day.
>>> FixedOffset(1440)
Traceback (most recent call last):
...
ValueError: ('absolute offset is too large', 1440)
>>> FixedOffset(-1440)
Traceback (most recent call last):
...
ValueError: ('absolute offset is too large', -1440)
An offset of 0 is special-cased to return UTC.
>>> FixedOffset(0) is UTC
True
There should always be only one instance of a FixedOffset per timedelta.
This should be true for multiple creation calls.
>>> FixedOffset(-330) is one
True
>>> FixedOffset(1380) is two
True
It should also be true for pickling.
>>> import pickle
>>> pickle.loads(pickle.dumps(one)) is one
True
>>> pickle.loads(pickle.dumps(two)) is two
True
"""
if offset == 0:
return UTC
info = _tzinfos.get(offset)
if info is None:
# We haven't seen this one before. We need to save it.
# Use setdefault to avoid a race condition and make sure we have
# only one
info = _tzinfos.setdefault(offset, _FixedOffset(offset))
return info
FixedOffset.__safe_for_unpickling__ = True
def _test():
import doctest, os, sys
sys.path.insert(0, os.pardir)
import pytz
return doctest.testmod(pytz)
if __name__ == '__main__':
_test()
all_timezones = \
['Africa/Abidjan',
'Africa/Accra',
'Africa/Addis_Ababa',
'Africa/Algiers',
'Africa/Asmara',
'Africa/Asmera',
'Africa/Bamako',
'Africa/Bangui',
'Africa/Banjul',
'Africa/Bissau',
'Africa/Blantyre',
'Africa/Brazzaville',
'Africa/Bujumbura',
'Africa/Cairo',
'Africa/Casablanca',
'Africa/Ceuta',
'Africa/Conakry',
'Africa/Dakar',
'Africa/Dar_es_Salaam',
'Africa/Djibouti',
'Africa/Douala',
'Africa/El_Aaiun',
'Africa/Freetown',
'Africa/Gaborone',
'Africa/Harare',
'Africa/Johannesburg',
'Africa/Juba',
'Africa/Kampala',
'Africa/Khartoum',
'Africa/Kigali',
'Africa/Kinshasa',
'Africa/Lagos',
'Africa/Libreville',
'Africa/Lome',
'Africa/Luanda',
'Africa/Lubumbashi',
'Africa/Lusaka',
'Africa/Malabo',
'Africa/Maputo',
'Africa/Maseru',
'Africa/Mbabane',
'Africa/Mogadishu',
'Africa/Monrovia',
'Africa/Nairobi',
'Africa/Ndjamena',
'Africa/Niamey',
'Africa/Nouakchott',
'Africa/Ouagadougou',
'Africa/Porto-Novo',
'Africa/Sao_Tome',
'Africa/Timbuktu',
'Africa/Tripoli',
'Africa/Tunis',
'Africa/Windhoek',
'America/Adak',
'America/Anchorage',
'America/Anguilla',
'America/Antigua',
'America/Araguaina',
'America/Argentina/Buenos_Aires',
'America/Argentina/Catamarca',
'America/Argentina/ComodRivadavia',
'America/Argentina/Cordoba',
'America/Argentina/Jujuy',
'America/Argentina/La_Rioja',
'America/Argentina/Mendoza',
'America/Argentina/Rio_Gallegos',
'America/Argentina/Salta',
'America/Argentina/San_Juan',
'America/Argentina/San_Luis',
'America/Argentina/Tucuman',
'America/Argentina/Ushuaia',
'America/Aruba',
'America/Asuncion',
'America/Atikokan',
'America/Atka',
'America/Bahia',
'America/Bahia_Banderas',
'America/Barbados',
'America/Belem',
'America/Belize',
'America/Blanc-Sablon',
'America/Boa_Vista',
'America/Bogota',
'America/Boise',
'America/Buenos_Aires',
'America/Cambridge_Bay',
'America/Campo_Grande',
'America/Cancun',
'America/Caracas',
'America/Catamarca',
'America/Cayenne',
'America/Cayman',
'America/Chicago',
'America/Chihuahua',
'America/Coral_Harbour',
'America/Cordoba',
'America/Costa_Rica',
'America/Creston',
'America/Cuiaba',
'America/Curacao',
'America/Danmarkshavn',
'America/Dawson',
'America/Dawson_Creek',
'America/Denver',
'America/Detroit',
'America/Dominica',
'America/Edmonton',
'America/Eirunepe',
'America/El_Salvador',
'America/Ensenada',
'America/Fort_Wayne',
'America/Fortaleza',
'America/Glace_Bay',
'America/Godthab',
'America/Goose_Bay',
'America/Grand_Turk',
'America/Grenada',
'America/Guadeloupe',
'America/Guatemala',
'America/Guayaquil',
'America/Guyana',
'America/Halifax',
'America/Havana',
'America/Hermosillo',
'America/Indiana/Indianapolis',
'America/Indiana/Knox',
'America/Indiana/Marengo',
'America/Indiana/Petersburg',
'America/Indiana/Tell_City',
'America/Indiana/Vevay',
'America/Indiana/Vincennes',
'America/Indiana/Winamac',
'America/Indianapolis',
'America/Inuvik',
'America/Iqaluit',
'America/Jamaica',
'America/Jujuy',
'America/Juneau',
'America/Kentucky/Louisville',
'America/Kentucky/Monticello',
'America/Knox_IN',
'America/Kralendijk',
'America/La_Paz',
'America/Lima',
'America/Los_Angeles',
'America/Louisville',
'America/Lower_Princes',
'America/Maceio',
'America/Managua',
'America/Manaus',
'America/Marigot',
'America/Martinique',
'America/Matamoros',
'America/Mazatlan',
'America/Mendoza',
'America/Menominee',
'America/Merida',
'America/Metlakatla',
'America/Mexico_City',
'America/Miquelon',
'America/Moncton',
'America/Monterrey',
'America/Montevideo',
'America/Montreal',
'America/Montserrat',
'America/Nassau',
'America/New_York',
'America/Nipigon',
'America/Nome',
'America/Noronha',
'America/North_Dakota/Beulah',
'America/North_Dakota/Center',
'America/North_Dakota/New_Salem',
'America/Ojinaga',
'America/Panama',
'America/Pangnirtung',
'America/Paramaribo',
'America/Phoenix',
'America/Port-au-Prince',
'America/Port_of_Spain',
'America/Porto_Acre',
'America/Porto_Velho',
'America/Puerto_Rico',
'America/Rainy_River',
'America/Rankin_Inlet',
'America/Recife',
'America/Regina',
'America/Resolute',
'America/Rio_Branco',
'America/Rosario',
'America/Santa_Isabel',
'America/Santarem',
'America/Santiago',
'America/Santo_Domingo',
'America/Sao_Paulo',
'America/Scoresbysund',
'America/Shiprock',
'America/Sitka',
'America/St_Barthelemy',
'America/St_Johns',
'America/St_Kitts',
'America/St_Lucia',
'America/St_Thomas',
'America/St_Vincent',
'America/Swift_Current',
'America/Tegucigalpa',
'America/Thule',
'America/Thunder_Bay',
'America/Tijuana',
'America/Toronto',
'America/Tortola',
'America/Vancouver',
'America/Virgin',
'America/Whitehorse',
'America/Winnipeg',
'America/Yakutat',
'America/Yellowknife',
'Antarctica/Casey',
'Antarctica/Davis',
'Antarctica/DumontDUrville',
'Antarctica/Macquarie',
'Antarctica/Mawson',
'Antarctica/McMurdo',
'Antarctica/Palmer',
'Antarctica/Rothera',
'Antarctica/South_Pole',
'Antarctica/Syowa',
'Antarctica/Troll',
'Antarctica/Vostok',
'Arctic/Longyearbyen',
'Asia/Aden',
'Asia/Almaty',
'Asia/Amman',
'Asia/Anadyr',
'Asia/Aqtau',
'Asia/Aqtobe',
'Asia/Ashgabat',
'Asia/Ashkhabad',
'Asia/Baghdad',
'Asia/Bahrain',
'Asia/Baku',
'Asia/Bangkok',
'Asia/Beirut',
'Asia/Bishkek',
'Asia/Brunei',
'Asia/Calcutta',
'Asia/Chita',
'Asia/Choibalsan',
'Asia/Chongqing',
'Asia/Chungking',
'Asia/Colombo',
'Asia/Dacca',
'Asia/Damascus',
'Asia/Dhaka',
'Asia/Dili',
'Asia/Dubai',
'Asia/Dushanbe',
'Asia/Gaza',
'Asia/Harbin',
'Asia/Hebron',
'Asia/Ho_Chi_Minh',
'Asia/Hong_Kong',
'Asia/Hovd',
'Asia/Irkutsk',
'Asia/Istanbul',
'Asia/Jakarta',
'Asia/Jayapura',
'Asia/Jerusalem',
'Asia/Kabul',
'Asia/Kamchatka',
'Asia/Karachi',
'Asia/Kashgar',
'Asia/Kathmandu',
'Asia/Katmandu',
'Asia/Khandyga',
'Asia/Kolkata',
'Asia/Krasnoyarsk',
'Asia/Kuala_Lumpur',
'Asia/Kuching',
'Asia/Kuwait',
'Asia/Macao',
'Asia/Macau',
'Asia/Magadan',
'Asia/Makassar',
'Asia/Manila',
'Asia/Muscat',
'Asia/Nicosia',
'Asia/Novokuznetsk',
'Asia/Novosibirsk',
'Asia/Omsk',
'Asia/Oral',
'Asia/Phnom_Penh',
'Asia/Pontianak',
'Asia/Pyongyang',
'Asia/Qatar',
'Asia/Qyzylorda',
'Asia/Rangoon',
'Asia/Riyadh',
'Asia/Saigon',
'Asia/Sakhalin',
'Asia/Samarkand',
'Asia/Seoul',
'Asia/Shanghai',
'Asia/Singapore',
'Asia/Srednekolymsk',
'Asia/Taipei',
'Asia/Tashkent',
'Asia/Tbilisi',
'Asia/Tehran',
'Asia/Tel_Aviv',
'Asia/Thimbu',
'Asia/Thimphu',
'Asia/Tokyo',
'Asia/Ujung_Pandang',
'Asia/Ulaanbaatar',
'Asia/Ulan_Bator',
'Asia/Urumqi',
'Asia/Ust-Nera',
'Asia/Vientiane',
'Asia/Vladivostok',
'Asia/Yakutsk',
'Asia/Yekaterinburg',
'Asia/Yerevan',
'Atlantic/Azores',
'Atlantic/Bermuda',
'Atlantic/Canary',
'Atlantic/Cape_Verde',
'Atlantic/Faeroe',
'Atlantic/Faroe',
'Atlantic/Jan_Mayen',
'Atlantic/Madeira',
'Atlantic/Reykjavik',
'Atlantic/South_Georgia',
'Atlantic/St_Helena',
'Atlantic/Stanley',
'Australia/ACT',
'Australia/Adelaide',
'Australia/Brisbane',
'Australia/Broken_Hill',
'Australia/Canberra',
'Australia/Currie',
'Australia/Darwin',
'Australia/Eucla',
'Australia/Hobart',
'Australia/LHI',
'Australia/Lindeman',
'Australia/Lord_Howe',
'Australia/Melbourne',
'Australia/NSW',
'Australia/North',
'Australia/Perth',
'Australia/Queensland',
'Australia/South',
'Australia/Sydney',
'Australia/Tasmania',
'Australia/Victoria',
'Australia/West',
'Australia/Yancowinna',
'Brazil/Acre',
'Brazil/DeNoronha',
'Brazil/East',
'Brazil/West',
'CET',
'CST6CDT',
'Canada/Atlantic',
'Canada/Central',
'Canada/East-Saskatchewan',
'Canada/Eastern',
'Canada/Mountain',
'Canada/Newfoundland',
'Canada/Pacific',
'Canada/Saskatchewan',
'Canada/Yukon',
'Chile/Continental',
'Chile/EasterIsland',
'Cuba',
'EET',
'EST',
'EST5EDT',
'Egypt',
'Eire',
'Etc/GMT',
'Etc/GMT+0',
'Etc/GMT+1',
'Etc/GMT+10',
'Etc/GMT+11',
'Etc/GMT+12',
'Etc/GMT+2',
'Etc/GMT+3',
'Etc/GMT+4',
'Etc/GMT+5',
'Etc/GMT+6',
'Etc/GMT+7',
'Etc/GMT+8',
'Etc/GMT+9',
'Etc/GMT-0',
'Etc/GMT-1',
'Etc/GMT-10',
'Etc/GMT-11',
'Etc/GMT-12',
'Etc/GMT-13',
'Etc/GMT-14',
'Etc/GMT-2',
'Etc/GMT-3',
'Etc/GMT-4',
'Etc/GMT-5',
'Etc/GMT-6',
'Etc/GMT-7',
'Etc/GMT-8',
'Etc/GMT-9',
'Etc/GMT0',
'Etc/Greenwich',
'Etc/UCT',
'Etc/UTC',
'Etc/Universal',
'Etc/Zulu',
'Europe/Amsterdam',
'Europe/Andorra',
'Europe/Athens',
'Europe/Belfast',
'Europe/Belgrade',
'Europe/Berlin',
'Europe/Bratislava',
'Europe/Brussels',
'Europe/Bucharest',
'Europe/Budapest',
'Europe/Busingen',
'Europe/Chisinau',
'Europe/Copenhagen',
'Europe/Dublin',
'Europe/Gibraltar',
'Europe/Guernsey',
'Europe/Helsinki',
'Europe/Isle_of_Man',
'Europe/Istanbul',
'Europe/Jersey',
'Europe/Kaliningrad',
'Europe/Kiev',
'Europe/Lisbon',
'Europe/Ljubljana',
'Europe/London',
'Europe/Luxembourg',
'Europe/Madrid',
'Europe/Malta',
'Europe/Mariehamn',
'Europe/Minsk',
'Europe/Monaco',
'Europe/Moscow',
'Europe/Nicosia',
'Europe/Oslo',
'Europe/Paris',
'Europe/Podgorica',
'Europe/Prague',
'Europe/Riga',
'Europe/Rome',
'Europe/Samara',
'Europe/San_Marino',
'Europe/Sarajevo',
'Europe/Simferopol',
'Europe/Skopje',
'Europe/Sofia',
'Europe/Stockholm',
'Europe/Tallinn',
'Europe/Tirane',
'Europe/Tiraspol',
'Europe/Uzhgorod',
'Europe/Vaduz',
'Europe/Vatican',
'Europe/Vienna',
'Europe/Vilnius',
'Europe/Volgograd',
'Europe/Warsaw',
'Europe/Zagreb',
'Europe/Zaporozhye',
'Europe/Zurich',
'GB',
'GB-Eire',
'GMT',
'GMT+0',
'GMT-0',
'GMT0',
'Greenwich',
'HST',
'Hongkong',
'Iceland',
'Indian/Antananarivo',
'Indian/Chagos',
'Indian/Christmas',
'Indian/Cocos',
'Indian/Comoro',
'Indian/Kerguelen',
'Indian/Mahe',
'Indian/Maldives',
'Indian/Mauritius',
'Indian/Mayotte',
'Indian/Reunion',
'Iran',
'Israel',
'Jamaica',
'Japan',
'Kwajalein',
'Libya',
'MET',
'MST',
'MST7MDT',
'Mexico/BajaNorte',
'Mexico/BajaSur',
'Mexico/General',
'NZ',
'NZ-CHAT',
'Navajo',
'PRC',
'PST8PDT',
'Pacific/Apia',
'Pacific/Auckland',
'Pacific/Bougainville',
'Pacific/Chatham',
'Pacific/Chuuk',
'Pacific/Easter',
'Pacific/Efate',
'Pacific/Enderbury',
'Pacific/Fakaofo',
'Pacific/Fiji',
'Pacific/Funafuti',
'Pacific/Galapagos',
'Pacific/Gambier',
'Pacific/Guadalcanal',
'Pacific/Guam',
'Pacific/Honolulu',
'Pacific/Johnston',
'Pacific/Kiritimati',
'Pacific/Kosrae',
'Pacific/Kwajalein',
'Pacific/Majuro',
'Pacific/Marquesas',
'Pacific/Midway',
'Pacific/Nauru',
'Pacific/Niue',
'Pacific/Norfolk',
'Pacific/Noumea',
'Pacific/Pago_Pago',
'Pacific/Palau',
'Pacific/Pitcairn',
'Pacific/Pohnpei',
'Pacific/Ponape',
'Pacific/Port_Moresby',
'Pacific/Rarotonga',
'Pacific/Saipan',
'Pacific/Samoa',
'Pacific/Tahiti',
'Pacific/Tarawa',
'Pacific/Tongatapu',
'Pacific/Truk',
'Pacific/Wake',
'Pacific/Wallis',
'Pacific/Yap',
'Poland',
'Portugal',
'ROC',
'ROK',
'Singapore',
'Turkey',
'UCT',
'US/Alaska',
'US/Aleutian',
'US/Arizona',
'US/Central',
'US/East-Indiana',
'US/Eastern',
'US/Hawaii',
'US/Indiana-Starke',
'US/Michigan',
'US/Mountain',
'US/Pacific',
'US/Pacific-New',
'US/Samoa',
'UTC',
'Universal',
'W-SU',
'WET',
'Zulu']
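# Keep only the zones whose data files are actually present in this
# installation's zoneinfo bundle; LazyList defers that check until first access.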
all_timezones = LazyList(
tz for tz in all_timezones if resource_exists(tz))
all_timezones_set = LazySet(all_timezones)
common_timezones = \
['Africa/Abidjan',
'Africa/Accra',
'Africa/Addis_Ababa',
'Africa/Algiers',
'Africa/Asmara',
'Africa/Bamako',
'Africa/Bangui',
'Africa/Banjul',
'Africa/Bissau',
'Africa/Blantyre',
'Africa/Brazzaville',
'Africa/Bujumbura',
'Africa/Cairo',
'Africa/Casablanca',
'Africa/Ceuta',
'Africa/Conakry',
'Africa/Dakar',
'Africa/Dar_es_Salaam',
'Africa/Djibouti',
'Africa/Douala',
'Africa/El_Aaiun',
'Africa/Freetown',
'Africa/Gaborone',
'Africa/Harare',
'Africa/Johannesburg',
'Africa/Juba',
'Africa/Kampala',
'Africa/Khartoum',
'Africa/Kigali',
'Africa/Kinshasa',
'Africa/Lagos',
'Africa/Libreville',
'Africa/Lome',
'Africa/Luanda',
'Africa/Lubumbashi',
'Africa/Lusaka',
'Africa/Malabo',
'Africa/Maputo',
'Africa/Maseru',
'Africa/Mbabane',
'Africa/Mogadishu',
'Africa/Monrovia',
'Africa/Nairobi',
'Africa/Ndjamena',
'Africa/Niamey',
'Africa/Nouakchott',
'Africa/Ouagadougou',
'Africa/Porto-Novo',
'Africa/Sao_Tome',
'Africa/Tripoli',
'Africa/Tunis',
'Africa/Windhoek',
'America/Adak',
'America/Anchorage',
'America/Anguilla',
'America/Antigua',
'America/Araguaina',
'America/Argentina/Buenos_Aires',
'America/Argentina/Catamarca',
'America/Argentina/Cordoba',
'America/Argentina/Jujuy',
'America/Argentina/La_Rioja',
'America/Argentina/Mendoza',
'America/Argentina/Rio_Gallegos',
'America/Argentina/Salta',
'America/Argentina/San_Juan',
'America/Argentina/San_Luis',
'America/Argentina/Tucuman',
'America/Argentina/Ushuaia',
'America/Aruba',
'America/Asuncion',
'America/Atikokan',
'America/Bahia',
'America/Bahia_Banderas',
'America/Barbados',
'America/Belem',
'America/Belize',
'America/Blanc-Sablon',
'America/Boa_Vista',
'America/Bogota',
'America/Boise',
'America/Cambridge_Bay',
'America/Campo_Grande',
'America/Cancun',
'America/Caracas',
'America/Cayenne',
'America/Cayman',
'America/Chicago',
'America/Chihuahua',
'America/Costa_Rica',
'America/Creston',
'America/Cuiaba',
'America/Curacao',
'America/Danmarkshavn',
'America/Dawson',
'America/Dawson_Creek',
'America/Denver',
'America/Detroit',
'America/Dominica',
'America/Edmonton',
'America/Eirunepe',
'America/El_Salvador',
'America/Fortaleza',
'America/Glace_Bay',
'America/Godthab',
'America/Goose_Bay',
'America/Grand_Turk',
'America/Grenada',
'America/Guadeloupe',
'America/Guatemala',
'America/Guayaquil',
'America/Guyana',
'America/Halifax',
'America/Havana',
'America/Hermosillo',
'America/Indiana/Indianapolis',
'America/Indiana/Knox',
'America/Indiana/Marengo',
'America/Indiana/Petersburg',
'America/Indiana/Tell_City',
'America/Indiana/Vevay',
'America/Indiana/Vincennes',
'America/Indiana/Winamac',
'America/Inuvik',
'America/Iqaluit',
'America/Jamaica',
'America/Juneau',
'America/Kentucky/Louisville',
'America/Kentucky/Monticello',
'America/Kralendijk',
'America/La_Paz',
'America/Lima',
'America/Los_Angeles',
'America/Lower_Princes',
'America/Maceio',
'America/Managua',
'America/Manaus',
'America/Marigot',
'America/Martinique',
'America/Matamoros',
'America/Mazatlan',
'America/Menominee',
'America/Merida',
'America/Metlakatla',
'America/Mexico_City',
'America/Miquelon',
'America/Moncton',
'America/Monterrey',
'America/Montevideo',
'America/Montreal',
'America/Montserrat',
'America/Nassau',
'America/New_York',
'America/Nipigon',
'America/Nome',
'America/Noronha',
'America/North_Dakota/Beulah',
'America/North_Dakota/Center',
'America/North_Dakota/New_Salem',
'America/Ojinaga',
'America/Panama',
'America/Pangnirtung',
'America/Paramaribo',
'America/Phoenix',
'America/Port-au-Prince',
'America/Port_of_Spain',
'America/Porto_Velho',
'America/Puerto_Rico',
'America/Rainy_River',
'America/Rankin_Inlet',
'America/Recife',
'America/Regina',
'America/Resolute',
'America/Rio_Branco',
'America/Santa_Isabel',
'America/Santarem',
'America/Santiago',
'America/Santo_Domingo',
'America/Sao_Paulo',
'America/Scoresbysund',
'America/Sitka',
'America/St_Barthelemy',
'America/St_Johns',
'America/St_Kitts',
'America/St_Lucia',
'America/St_Thomas',
'America/St_Vincent',
'America/Swift_Current',
'America/Tegucigalpa',
'America/Thule',
'America/Thunder_Bay',
'America/Tijuana',
'America/Toronto',
'America/Tortola',
'America/Vancouver',
'America/Whitehorse',
'America/Winnipeg',
'America/Yakutat',
'America/Yellowknife',
'Antarctica/Casey',
'Antarctica/Davis',
'Antarctica/DumontDUrville',
'Antarctica/Macquarie',
'Antarctica/Mawson',
'Antarctica/McMurdo',
'Antarctica/Palmer',
'Antarctica/Rothera',
'Antarctica/Syowa',
'Antarctica/Troll',
'Antarctica/Vostok',
'Arctic/Longyearbyen',
'Asia/Aden',
'Asia/Almaty',
'Asia/Amman',
'Asia/Anadyr',
'Asia/Aqtau',
'Asia/Aqtobe',
'Asia/Ashgabat',
'Asia/Baghdad',
'Asia/Bahrain',
'Asia/Baku',
'Asia/Bangkok',
'Asia/Beirut',
'Asia/Bishkek',
'Asia/Brunei',
'Asia/Chita',
'Asia/Choibalsan',
'Asia/Colombo',
'Asia/Damascus',
'Asia/Dhaka',
'Asia/Dili',
'Asia/Dubai',
'Asia/Dushanbe',
'Asia/Gaza',
'Asia/Hebron',
'Asia/Ho_Chi_Minh',
'Asia/Hong_Kong',
'Asia/Hovd',
'Asia/Irkutsk',
'Asia/Jakarta',
'Asia/Jayapura',
'Asia/Jerusalem',
'Asia/Kabul',
'Asia/Kamchatka',
'Asia/Karachi',
'Asia/Kathmandu',
'Asia/Khandyga',
'Asia/Kolkata',
'Asia/Krasnoyarsk',
'Asia/Kuala_Lumpur',
'Asia/Kuching',
'Asia/Kuwait',
'Asia/Macau',
'Asia/Magadan',
'Asia/Makassar',
'Asia/Manila',
'Asia/Muscat',
'Asia/Nicosia',
'Asia/Novokuznetsk',
'Asia/Novosibirsk',
'Asia/Omsk',
'Asia/Oral',
'Asia/Phnom_Penh',
'Asia/Pontianak',
'Asia/Pyongyang',
'Asia/Qatar',
'Asia/Qyzylorda',
'Asia/Rangoon',
'Asia/Riyadh',
'Asia/Sakhalin',
'Asia/Samarkand',
'Asia/Seoul',
'Asia/Shanghai',
'Asia/Singapore',
'Asia/Srednekolymsk',
'Asia/Taipei',
'Asia/Tashkent',
'Asia/Tbilisi',
'Asia/Tehran',
'Asia/Thimphu',
'Asia/Tokyo',
'Asia/Ulaanbaatar',
'Asia/Urumqi',
'Asia/Ust-Nera',
'Asia/Vientiane',
'Asia/Vladivostok',
'Asia/Yakutsk',
'Asia/Yekaterinburg',
'Asia/Yerevan',
'Atlantic/Azores',
'Atlantic/Bermuda',
'Atlantic/Canary',
'Atlantic/Cape_Verde',
'Atlantic/Faroe',
'Atlantic/Madeira',
'Atlantic/Reykjavik',
'Atlantic/South_Georgia',
'Atlantic/St_Helena',
'Atlantic/Stanley',
'Australia/Adelaide',
'Australia/Brisbane',
'Australia/Broken_Hill',
'Australia/Currie',
'Australia/Darwin',
'Australia/Eucla',
'Australia/Hobart',
'Australia/Lindeman',
'Australia/Lord_Howe',
'Australia/Melbourne',
'Australia/Perth',
'Australia/Sydney',
'Canada/Atlantic',
'Canada/Central',
'Canada/Eastern',
'Canada/Mountain',
'Canada/Newfoundland',
'Canada/Pacific',
'Europe/Amsterdam',
'Europe/Andorra',
'Europe/Athens',
'Europe/Belgrade',
'Europe/Berlin',
'Europe/Bratislava',
'Europe/Brussels',
'Europe/Bucharest',
'Europe/Budapest',
'Europe/Busingen',
'Europe/Chisinau',
'Europe/Copenhagen',
'Europe/Dublin',
'Europe/Gibraltar',
'Europe/Guernsey',
'Europe/Helsinki',
'Europe/Isle_of_Man',
'Europe/Istanbul',
'Europe/Jersey',
'Europe/Kaliningrad',
'Europe/Kiev',
'Europe/Lisbon',
'Europe/Ljubljana',
'Europe/London',
'Europe/Luxembourg',
'Europe/Madrid',
'Europe/Malta',
'Europe/Mariehamn',
'Europe/Minsk',
'Europe/Monaco',
'Europe/Moscow',
'Europe/Oslo',
'Europe/Paris',
'Europe/Podgorica',
'Europe/Prague',
'Europe/Riga',
'Europe/Rome',
'Europe/Samara',
'Europe/San_Marino',
'Europe/Sarajevo',
'Europe/Simferopol',
'Europe/Skopje',
'Europe/Sofia',
'Europe/Stockholm',
'Europe/Tallinn',
'Europe/Tirane',
'Europe/Uzhgorod',
'Europe/Vaduz',
'Europe/Vatican',
'Europe/Vienna',
'Europe/Vilnius',
'Europe/Volgograd',
'Europe/Warsaw',
'Europe/Zagreb',
'Europe/Zaporozhye',
'Europe/Zurich',
'GMT',
'Indian/Antananarivo',
'Indian/Chagos',
'Indian/Christmas',
'Indian/Cocos',
'Indian/Comoro',
'Indian/Kerguelen',
'Indian/Mahe',
'Indian/Maldives',
'Indian/Mauritius',
'Indian/Mayotte',
'Indian/Reunion',
'Pacific/Apia',
'Pacific/Auckland',
'Pacific/Bougainville',
'Pacific/Chatham',
'Pacific/Chuuk',
'Pacific/Easter',
'Pacific/Efate',
'Pacific/Enderbury',
'Pacific/Fakaofo',
'Pacific/Fiji',
'Pacific/Funafuti',
'Pacific/Galapagos',
'Pacific/Gambier',
'Pacific/Guadalcanal',
'Pacific/Guam',
'Pacific/Honolulu',
'Pacific/Johnston',
'Pacific/Kiritimati',
'Pacific/Kosrae',
'Pacific/Kwajalein',
'Pacific/Majuro',
'Pacific/Marquesas',
'Pacific/Midway',
'Pacific/Nauru',
'Pacific/Niue',
'Pacific/Norfolk',
'Pacific/Noumea',
'Pacific/Pago_Pago',
'Pacific/Palau',
'Pacific/Pitcairn',
'Pacific/Pohnpei',
'Pacific/Port_Moresby',
'Pacific/Rarotonga',
'Pacific/Saipan',
'Pacific/Tahiti',
'Pacific/Tarawa',
'Pacific/Tongatapu',
'Pacific/Wake',
'Pacific/Wallis',
'US/Alaska',
'US/Arizona',
'US/Central',
'US/Eastern',
'US/Hawaii',
'US/Mountain',
'US/Pacific',
'UTC']
common_timezones = LazyList(
tz for tz in common_timezones if tz in all_timezones)
common_timezones_set = LazySet(common_timezones)
|
gpl-3.0
|
shepdelacreme/ansible
|
lib/ansible/modules/network/aos/_aos_login.py
|
44
|
4110
|
#!/usr/bin/python
#
# (c) 2017 Apstra Inc, <[email protected]>
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
#
ANSIBLE_METADATA = {'metadata_version': '1.1',
'status': ['deprecated'],
'supported_by': 'community'}
DOCUMENTATION = '''
---
module: aos_login
author: [email protected] (@jeremyschulman)
version_added: "2.3"
short_description: Login to AOS server for session token
deprecated:
removed_in: "2.9"
why: This module does not support AOS 2.1 or later
alternative: See new modules at U(https://www.ansible.com/ansible-apstra).
description:
- Obtain the AOS server session token by providing the required
username and password credentials. Upon successful authentication,
this module will return the session-token that is required by all
subsequent AOS module usage. On success the module will automatically populate
      ansible facts with the variable I(aos_session).
      This module is not idempotent and does not support check mode.
requirements:
- "aos-pyez >= 0.6.1"
options:
server:
description:
- Address of the AOS Server on which you want to open a connection.
required: true
port:
description:
- Port number to use when connecting to the AOS server.
default: 443
user:
description:
- Login username to use when connecting to the AOS server.
default: admin
passwd:
description:
- Password to use when connecting to the AOS server.
default: admin
'''
EXAMPLES = '''
- name: Create a session with the AOS-server
aos_login:
server: "{{ inventory_hostname }}"
user: admin
passwd: admin
- name: Use the newly created session (register is not mandatory)
aos_ip_pool:
session: "{{ aos_session }}"
name: my_ip_pool
state: present
'''
RETURN = '''
aos_session:
description: Authenticated session information
returned: always
type: dict
sample: { 'url': <str>, 'headers': {...} }
'''
from ansible.module_utils.basic import AnsibleModule
from ansible.module_utils.network.aos.aos import check_aos_version
try:
from apstra.aosom.session import Session
import apstra.aosom.exc as aosExc
HAS_AOS_PYEZ = True
except ImportError:
HAS_AOS_PYEZ = False
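# The aos-pyez import is guarded so a missing dependency surfaces as a clean
# fail_json message in main() rather than an unhandled ImportError traceback.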
def aos_login(module):
mod_args = module.params
aos = Session(server=mod_args['server'], port=mod_args['port'],
user=mod_args['user'], passwd=mod_args['passwd'])
try:
aos.login()
except aosExc.LoginServerUnreachableError:
module.fail_json(
msg="AOS-server [%s] API not available/reachable, check server" % aos.server)
except aosExc.LoginAuthError:
module.fail_json(msg="AOS-server login credentials failed")
module.exit_json(changed=False,
ansible_facts=dict(aos_session=aos.session),
aos_session=dict(aos_session=aos.session))
def main():
module = AnsibleModule(
argument_spec=dict(
server=dict(required=True),
port=dict(default='443', type="int"),
user=dict(default='admin'),
passwd=dict(default='admin', no_log=True)))
if not HAS_AOS_PYEZ:
module.fail_json(msg='aos-pyez is not installed. Please see details '
'here: https://github.com/Apstra/aos-pyez')
# Check if aos-pyez is present and match the minimum version
check_aos_version(module, '0.6.1')
aos_login(module)
if __name__ == '__main__':
main()
|
gpl-3.0
|
lmtim/iOSBlogCN
|
Export.py
|
65
|
1482
|
#coding=utf-8
__author__ = 'wwxiang'
import os
import re
work = os.getcwd()
resxml = work + os.path.sep + 'blogcn.opml'
workmd = work + os.path.sep + 'README.md'
def handler():
isblock = True
handlerData = []
lineNo = 0
try:
        with open(workmd,'rb') as linefs:
            lineCount = len(linefs.readlines())
with open(workmd,'rb') as fs:
while isblock:
lineNo += 1
val = fs.readline().decode()
                if lineNo == lineCount:
isblock = False
if not val[0] == '[':
continue
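                # Each feed entry in README.md has the form [Title]<rss-url>(homepage-url);
                # the three regexes below pull out the bracketed, angled and parenthesised parts.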
title = re.findall(r'\[(.+?)\]',val)[0]
xmlUrl = re.findall(r'<(.+?)>',val)[0]
htmlUrl = re.findall(r'\((.+?)\)',val)[0]
handlerData.append('<outline text="{0}" title="{0}" type="rss" xmlUrl="{1}" htmlUrl="{2}"/>'.format(title,xmlUrl,htmlUrl))
    except Exception:
        print('Error handling', 'failed to read the file')
return
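    # Wrap the collected <outline> entries in a minimal one-group OPML skeleton.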
    export_xml = '<?xml version="1.0" encoding="UTF-8"?><opml version="1.0"><head><title>Exported Subscriptions</title></head><body><outline text="ios" title="ios">\n'
export_xml += '\r\n'.join(handlerData)
export_xml += '</outline></body></opml>\r\n'
with open(resxml,'wb') as fs:
fs.write(export_xml.encode())
    print('blogcn.opml file processing complete')
if os.path.isfile(workmd):
handler()
|
gpl-2.0
|
ddzialak/boto
|
boto/cloudsearch2/search.py
|
16
|
13430
|
# Copyright (c) 2014 Amazon.com, Inc. or its affiliates.
# All Rights Reserved
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the
# "Software"), to deal in the Software without restriction, including
# without limitation the rights to use, copy, modify, merge, publish, dis-
# tribute, sublicense, and/or sell copies of the Software, and to permit
# persons to whom the Software is furnished to do so, subject to the fol-
# lowing conditions:
#
# The above copyright notice and this permission notice shall be included
# in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABIL-
# ITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
# SHALL THE AUTHOR BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
# WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
# IN THE SOFTWARE.
#
from math import ceil
from boto.compat import json, map, six
import requests
SIMPLE = 'simple'
STRUCTURED = 'structured'
LUCENE = 'lucene'
DISMAX = 'dismax'
class SearchServiceException(Exception):
pass
class SearchResults(object):
def __init__(self, **attrs):
self.rid = attrs['status']['rid']
self.time_ms = attrs['status']['time-ms']
self.hits = attrs['hits']['found']
self.docs = attrs['hits']['hit']
self.start = attrs['hits']['start']
self.query = attrs['query']
self.search_service = attrs['search_service']
self.facets = {}
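        # The 2013-01-01 API returns each facet as a list of {'value', 'count'}
        # buckets; flatten every bucket list into a plain {value: count} dict.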
if 'facets' in attrs:
for (facet, values) in attrs['facets'].items():
if 'buckets' in values:
                    self.facets[facet] = dict((x['value'], x['count']) for x in values.get('buckets', []))
self.num_pages_needed = ceil(self.hits / self.query.real_size)
def __len__(self):
return len(self.docs)
def __iter__(self):
return iter(self.docs)
def next_page(self):
"""Call Cloudsearch to get the next page of search results
:rtype: :class:`boto.cloudsearch2.search.SearchResults`
:return: the following page of search results
"""
if self.query.page <= self.num_pages_needed:
self.query.start += self.query.real_size
self.query.page += 1
return self.search_service(self.query)
else:
raise StopIteration
class Query(object):
RESULTS_PER_PAGE = 500
def __init__(self, q=None, parser=None, fq=None, expr=None,
return_fields=None, size=10, start=0, sort=None,
facet=None, highlight=None, partial=None, options=None):
self.q = q
self.parser = parser
self.fq = fq
self.expr = expr or {}
self.sort = sort or []
self.return_fields = return_fields or []
self.start = start
self.facet = facet or {}
self.highlight = highlight or {}
self.partial = partial
self.options = options
self.page = 0
self.update_size(size)
def update_size(self, new_size):
self.size = new_size
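        # Cap each request at RESULTS_PER_PAGE; a requested size of 0 means
        # "use the maximum page size" and paging fetches the remainder.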
self.real_size = Query.RESULTS_PER_PAGE if (self.size >
Query.RESULTS_PER_PAGE or self.size == 0) else self.size
def to_params(self):
"""Transform search parameters from instance properties to a dictionary
:rtype: dict
:return: search parameters
"""
params = {'start': self.start, 'size': self.real_size}
if self.q:
params['q'] = self.q
if self.parser:
params['q.parser'] = self.parser
if self.fq:
params['fq'] = self.fq
if self.expr:
for k, v in six.iteritems(self.expr):
params['expr.%s' % k] = v
if self.facet:
for k, v in six.iteritems(self.facet):
if not isinstance(v, six.string_types):
v = json.dumps(v)
params['facet.%s' % k] = v
if self.highlight:
for k, v in six.iteritems(self.highlight):
params['highlight.%s' % k] = v
if self.options:
params['q.options'] = self.options
if self.return_fields:
params['return'] = ','.join(self.return_fields)
if self.partial is not None:
params['partial'] = self.partial
if self.sort:
params['sort'] = ','.join(self.sort)
return params
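    # For example, Query(q='tim', return_fields=['headline']).to_params()
    # yields {'start': 0, 'size': 10, 'q': 'tim', 'return': 'headline'}.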
class SearchConnection(object):
def __init__(self, domain=None, endpoint=None):
self.domain = domain
self.endpoint = endpoint
self.session = requests.Session()
# Copy proxy settings from connection
if self.domain and self.domain.layer1 and self.domain.layer1.use_proxy:
self.session.proxies['http'] = self.domain.layer1.get_proxy_url_with_auth()
if not endpoint:
self.endpoint = domain.search_service_endpoint
def build_query(self, q=None, parser=None, fq=None, rank=None, return_fields=None,
size=10, start=0, facet=None, highlight=None, sort=None,
partial=None, options=None):
return Query(q=q, parser=parser, fq=fq, expr=rank, return_fields=return_fields,
size=size, start=start, facet=facet, highlight=highlight,
sort=sort, partial=partial, options=options)
def search(self, q=None, parser=None, fq=None, rank=None, return_fields=None,
size=10, start=0, facet=None, highlight=None, sort=None, partial=None,
options=None):
"""
Send a query to CloudSearch
        Each search query should use at least the q argument to specify
the search parameter. The other options are used to specify the
criteria of the search.
:type q: string
:param q: A string to search the default search fields for.
:type parser: string
:param parser: The parser to use. 'simple', 'structured', 'lucene', 'dismax'
:type fq: string
:param fq: The filter query to use.
:type sort: List of strings
:param sort: A list of fields or rank expressions used to order the
search results. Order is handled by adding 'desc' or 'asc' after the field name.
``['year desc', 'author asc']``
:type return_fields: List of strings
:param return_fields: A list of fields which should be returned by the
search. If this field is not specified, only IDs will be returned.
``['headline']``
:type size: int
        :param size: Number of search results to return
:type start: int
:param start: Offset of the first search result to return (can be used
for paging)
:type facet: dict
:param facet: Dictionary of fields for which facets should be returned
The facet value is string of JSON options
``{'year': '{sort:"bucket", size:3}', 'genres': '{buckets:["Action","Adventure","Sci-Fi"]}'}``
:type highlight: dict
:param highlight: Dictionary of fields for which highlights should be returned
The facet value is string of JSON options
``{'genres': '{format:'text',max_phrases:2,pre_tag:'<b>',post_tag:'</b>'}'}``
:type partial: bool
        :param partial: Should partial results from a partitioned service be returned if
one or more index partitions are unreachable.
:type options: str
:param options: Options for the query parser specified in *parser*.
Specified as a string in JSON format.
``{fields: ['title^5', 'description']}``
:rtype: :class:`boto.cloudsearch2.search.SearchResults`
:return: Returns the results of this search
The following examples all assume we have indexed a set of documents
with fields: *author*, *date*, *headline*
A simple search will look for documents whose default text search
fields will contain the search word exactly:
>>> search(q='Tim') # Return documents with the word Tim in them (but not Timothy)
A simple search with more keywords will return documents whose default
text search fields contain the search strings together or separately.
>>> search(q='Tim apple') # Will match "tim" and "apple"
More complex searches require the boolean search operator.
Wildcard searches can be used to search for any words that start with
the search string.
        >>> search(q="'Tim*'") # Return documents with words like Tim or Timothy
Search terms can also be combined. Allowed operators are "and", "or",
"not", "field", "optional", "token", "phrase", or "filter"
>>> search(q="(and 'Tim' (field author 'John Smith'))", parser='structured')
Facets allow you to show classification information about the search
results. For example, you can retrieve the authors who have written
about Tim with a max of 3
>>> search(q='Tim', facet={'Author': '{sort:"bucket", size:3}'})
"""
query = self.build_query(q=q, parser=parser, fq=fq, rank=rank,
return_fields=return_fields,
size=size, start=start, facet=facet,
highlight=highlight, sort=sort,
partial=partial, options=options)
return self(query)
def __call__(self, query):
"""Make a call to CloudSearch
:type query: :class:`boto.cloudsearch2.search.Query`
:param query: A group of search criteria
:rtype: :class:`boto.cloudsearch2.search.SearchResults`
:return: search results
"""
api_version = '2013-01-01'
if self.domain:
api_version = self.domain.layer1.APIVersion
url = "http://%s/%s/search" % (self.endpoint, api_version)
params = query.to_params()
r = self.session.get(url, params=params)
_body = r.content.decode('utf-8')
try:
data = json.loads(_body)
except ValueError:
if r.status_code == 403:
msg = ''
import re
g = re.search('<html><body><h1>403 Forbidden</h1>([^<]+)<', _body)
try:
msg = ': %s' % (g.groups()[0].strip())
except AttributeError:
pass
raise SearchServiceException('Authentication error from Amazon%s' % msg)
raise SearchServiceException("Got non-json response from Amazon. %s" % _body, query)
if 'messages' in data and 'error' in data:
for m in data['messages']:
if m['severity'] == 'fatal':
raise SearchServiceException("Error processing search %s "
"=> %s" % (params, m['message']), query)
elif 'error' in data:
raise SearchServiceException("Unknown error processing search %s"
% json.dumps(data), query)
data['query'] = query
data['search_service'] = self
return SearchResults(**data)
def get_all_paged(self, query, per_page):
"""Get a generator to iterate over all pages of search results
:type query: :class:`boto.cloudsearch2.search.Query`
:param query: A group of search criteria
:type per_page: int
:param per_page: Number of docs in each :class:`boto.cloudsearch2.search.SearchResults` object.
:rtype: generator
:return: Generator containing :class:`boto.cloudsearch2.search.SearchResults`
"""
query.update_size(per_page)
page = 0
num_pages_needed = 0
while page <= num_pages_needed:
results = self(query)
num_pages_needed = results.num_pages_needed
yield results
query.start += query.real_size
page += 1
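    # Illustrative usage (the endpoint name and process() are placeholders):
    #   conn = SearchConnection(endpoint='search-mydomain.us-east-1.cloudsearch.amazonaws.com')
    #   query = conn.build_query(q='tim', size=0)
    #   for results in conn.get_all_paged(query, per_page=100):
    #       for doc in results:
    #           process(doc)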
def get_all_hits(self, query):
"""Get a generator to iterate over all search results
Transparently handles the results paging from Cloudsearch
search results so even if you have many thousands of results
you can iterate over all results in a reasonably efficient
manner.
:type query: :class:`boto.cloudsearch2.search.Query`
:param query: A group of search criteria
:rtype: generator
:return: All docs matching query
"""
page = 0
num_pages_needed = 0
while page <= num_pages_needed:
results = self(query)
num_pages_needed = results.num_pages_needed
for doc in results:
yield doc
query.start += query.real_size
page += 1
def get_num_hits(self, query):
"""Return the total number of hits for query
:type query: :class:`boto.cloudsearch2.search.Query`
:param query: a group of search criteria
:rtype: int
:return: Total number of hits for query
"""
query.update_size(1)
return self(query).hits
|
mit
|
Sylrob434/CouchPotatoServer
|
libs/CodernityDB/debug_stuff.py
|
44
|
7678
|
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Copyright 2011-2013 Codernity (http://codernity.com)
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from CodernityDB.tree_index import TreeBasedIndex
import struct
import os
import inspect
from functools import wraps
import json
class DebugTreeBasedIndex(TreeBasedIndex):
def __init__(self, *args, **kwargs):
super(DebugTreeBasedIndex, self).__init__(*args, **kwargs)
def print_tree(self):
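        # Dump the B+-tree level by level, starting at the root and descending
        # until the children flag reports that the next level holds leaves.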
print '-----CURRENT TREE-----'
print self.root_flag
if self.root_flag == 'l':
print '---ROOT---'
self._print_leaf_data(self.data_start)
return
else:
print '---ROOT---'
self._print_node_data(self.data_start)
nr_of_el, children_flag = self._read_node_nr_of_elements_and_children_flag(
self.data_start)
nodes = []
for index in range(nr_of_el):
l_pointer, key, r_pointer = self._read_single_node_key(
self.data_start, index)
nodes.append(l_pointer)
nodes.append(r_pointer)
print 'ROOT NODES', nodes
while children_flag == 'n':
self._print_level(nodes, 'n')
new_nodes = []
for node in nodes:
nr_of_el, children_flag = \
self._read_node_nr_of_elements_and_children_flag(node)
for index in range(nr_of_el):
l_pointer, key, r_pointer = self._read_single_node_key(
node, index)
new_nodes.append(l_pointer)
new_nodes.append(r_pointer)
nodes = new_nodes
self._print_level(nodes, 'l')
def _print_level(self, nodes, flag):
print '---NEXT LVL---'
if flag == 'n':
for node in nodes:
self._print_node_data(node)
elif flag == 'l':
for node in nodes:
self._print_leaf_data(node)
def _print_leaf_data(self, leaf_start_position):
print 'printing data of leaf at', leaf_start_position
nr_of_elements = self._read_leaf_nr_of_elements(leaf_start_position)
self.buckets.seek(leaf_start_position)
data = self.buckets.read(self.leaf_heading_size +
nr_of_elements * self.single_leaf_record_size)
leaf = struct.unpack('<' + self.leaf_heading_format +
nr_of_elements * self.single_leaf_record_format, data)
print leaf
print
def _print_node_data(self, node_start_position):
print 'printing data of node at', node_start_position
nr_of_elements = self._read_node_nr_of_elements_and_children_flag(
node_start_position)[0]
self.buckets.seek(node_start_position)
data = self.buckets.read(self.node_heading_size + self.pointer_size
+ nr_of_elements * (self.key_size + self.pointer_size))
node = struct.unpack('<' + self.node_heading_format + self.pointer_format
+ nr_of_elements * (
self.key_format + self.pointer_format),
data)
print node
print
# ------------------>
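# database_step_by_step wraps every public method of a database object so each
# call is appended as a JSON line (method name, args, kwargs, plus the result
# for inserts) to an operation log, which database_from_steps below can replay.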
def database_step_by_step(db_obj, path=None):
if not path:
# ugly for multiplatform support....
p = db_obj.path
p1 = os.path.split(p)
p2 = os.path.split(p1[0])
p3 = '_'.join([p2[1], 'operation_logger.log'])
path = os.path.join(os.path.split(p2[0])[0], p3)
f_obj = open(path, 'wb')
__stack = [] # inspect.stack() is not working on pytest etc
    def remove_from_stack(name):
        # Scan from the most recent entry so the innermost matching call is popped.
        for i in range(len(__stack) - 1, -1, -1):
            if __stack[i] == name:
                __stack.pop(i)
                break
def __dumper(f):
@wraps(f)
def __inner(*args, **kwargs):
funct_name = f.__name__
if funct_name == 'count':
name = args[0].__name__
meth_args = (name,) + args[1:]
elif funct_name in ('reindex_index', 'compact_index'):
name = args[0].name
meth_args = (name,) + args[1:]
else:
meth_args = args
kwargs_copy = kwargs.copy()
res = None
__stack.append(funct_name)
if funct_name == 'insert':
try:
res = f(*args, **kwargs)
except:
packed = json.dumps((funct_name,
meth_args, kwargs_copy, None))
f_obj.write('%s\n' % packed)
f_obj.flush()
raise
else:
packed = json.dumps((funct_name,
meth_args, kwargs_copy, res))
f_obj.write('%s\n' % packed)
f_obj.flush()
else:
if funct_name == 'get':
for curr in __stack:
if ('delete' in curr or 'update' in curr) and not curr.startswith('test'):
remove_from_stack(funct_name)
return f(*args, **kwargs)
packed = json.dumps((funct_name, meth_args, kwargs_copy))
f_obj.write('%s\n' % packed)
f_obj.flush()
res = f(*args, **kwargs)
remove_from_stack(funct_name)
return res
return __inner
for meth_name, meth_f in inspect.getmembers(db_obj, predicate=inspect.ismethod):
if not meth_name.startswith('_'):
setattr(db_obj, meth_name, __dumper(meth_f))
setattr(db_obj, 'operation_logger', f_obj)
def database_from_steps(db_obj, path):
# db_obj.insert=lambda data : insert_for_debug(db_obj, data)
with open(path, 'rb') as f_obj:
for current in f_obj:
line = json.loads(current[:-1])
if line[0] == 'count':
obj = getattr(db_obj, line[1][0])
line[1] = [obj] + line[1][1:]
name = line[0]
if name == 'insert':
try:
line[1][0].pop('_rev')
                except KeyError:
                    pass
elif name in ('delete', 'update'):
el = db_obj.get('id', line[1][0]['_id'])
line[1][0]['_rev'] = el['_rev']
# print 'FROM STEPS doing', line
meth = getattr(db_obj, line[0], None)
if not meth:
raise Exception("Method = `%s` not found" % line[0])
meth(*line[1], **line[2])
# def insert_for_debug(self, data):
#
# _rev = data['_rev']
#
# if not '_id' in data:
# _id = uuid4().hex
# else:
# _id = data['_id']
# data['_id'] = _id
# try:
# _id = bytes(_id)
# except:
# raise DatabaseException("`_id` must be valid bytes object")
# self._insert_indexes(_id, _rev, data)
# ret = {'_id': _id, '_rev': _rev}
# data.update(ret)
# return ret
|
gpl-3.0
|
cdepman/falcon_api
|
site-packages/wheel/test/test_install.py
|
455
|
1866
|
# Test wheel.
# The file has the following contents:
# hello.pyd
# hello/hello.py
# hello/__init__.py
# test-1.0.data/data/hello.dat
# test-1.0.data/headers/hello.dat
# test-1.0.data/scripts/hello.sh
# test-1.0.dist-info/WHEEL
# test-1.0.dist-info/METADATA
# test-1.0.dist-info/RECORD
# The root is PLATLIB
# So, some in PLATLIB, and one in each of DATA, HEADERS and SCRIPTS.
import wheel.tool
import wheel.pep425tags
from wheel.install import WheelFile
from tempfile import mkdtemp
import shutil
import os
THISDIR = os.path.dirname(__file__)
TESTWHEEL = os.path.join(THISDIR, 'test-1.0-py2.py3-none-win32.whl')
def check(*path):
return os.path.exists(os.path.join(*path))
def test_install():
tempdir = mkdtemp()
def get_supported():
return list(wheel.pep425tags.get_supported()) + [('py3', 'none', 'win32')]
whl = WheelFile(TESTWHEEL, context=get_supported)
assert whl.supports_current_python(get_supported)
try:
locs = {}
for key in ('purelib', 'platlib', 'scripts', 'headers', 'data'):
locs[key] = os.path.join(tempdir, key)
os.mkdir(locs[key])
whl.install(overrides=locs)
assert len(os.listdir(locs['purelib'])) == 0
assert check(locs['platlib'], 'hello.pyd')
assert check(locs['platlib'], 'hello', 'hello.py')
assert check(locs['platlib'], 'hello', '__init__.py')
assert check(locs['data'], 'hello.dat')
assert check(locs['headers'], 'hello.dat')
assert check(locs['scripts'], 'hello.sh')
assert check(locs['platlib'], 'test-1.0.dist-info', 'RECORD')
finally:
shutil.rmtree(tempdir)
def test_install_tool():
"""Slightly improve coverage of wheel.install"""
wheel.tool.install([TESTWHEEL], force=True, dry_run=True)
|
mit
|
bcornwellmott/erpnext
|
erpnext/hr/report/employee_birthday/employee_birthday.py
|
120
|
1328
|
# Copyright (c) 2015, Frappe Technologies Pvt. Ltd. and Contributors
# License: GNU General Public License v3. See license.txt
from __future__ import unicode_literals
import frappe
from frappe import _
from frappe.utils import flt
def execute(filters=None):
if not filters: filters = {}
columns = get_columns()
data = get_employees(filters)
return columns, data
def get_columns():
return [
_("Employee") + ":Link/Employee:120", _("Name") + ":Data:200", _("Date of Birth")+ ":Date:100",
_("Branch") + ":Link/Branch:120", _("Department") + ":Link/Department:120",
_("Designation") + ":Link/Designation:120", _("Gender") + "::60", _("Company") + ":Link/Company:120"
]
def get_employees(filters):
conditions = get_conditions(filters)
return frappe.db.sql("""select name, employee_name, date_of_birth,
branch, department, designation,
gender, company from tabEmployee where status = 'Active' %s""" % conditions, as_list=1)
def get_conditions(filters):
conditions = ""
if filters.get("month"):
month = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov",
"Dec"].index(filters["month"]) + 1
conditions += " and month(date_of_birth) = '%s'" % month
if filters.get("company"): conditions += " and company = '%s'" % \
filters["company"].replace("'", "\\'")
return conditions
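# For example, filters {'month': 'Feb', 'company': "ACME"} produces:
#   " and month(date_of_birth) = '2' and company = 'ACME'"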
|
gpl-3.0
|
notmyname/swift
|
test/unit/common/test_exceptions.py
|
51
|
1959
|
# Copyright (c) 2010-2012 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# TODO(creiht): Tests
import unittest
from swift.common import exceptions
class TestExceptions(unittest.TestCase):
def test_replication_exception(self):
self.assertEqual(str(exceptions.ReplicationException()), '')
self.assertEqual(str(exceptions.ReplicationException('test')), 'test')
def test_replication_lock_timeout(self):
exc = exceptions.ReplicationLockTimeout(15, 'test')
try:
self.assertTrue(isinstance(exc, exceptions.MessageTimeout))
finally:
exc.cancel()
def test_client_exception(self):
strerror = 'test: HTTP://random:888/randompath?foo=1 666 reason: ' \
'device /sdb1 content'
exc = exceptions.ClientException('test', http_scheme='HTTP',
http_host='random',
http_port=888,
http_path='/randompath',
http_query='foo=1',
http_status=666,
http_reason='reason',
http_device='/sdb1',
http_response_content='content')
self.assertEqual(str(exc), strerror)
if __name__ == '__main__':
unittest.main()
|
apache-2.0
|
danielvdao/TheAnimalFarm
|
venv/lib/python2.7/site-packages/setuptools/command/easy_install.py
|
206
|
72706
|
#!/usr/bin/env python
"""
Easy Install
------------
A tool for doing automatic download/extract/build of distutils-based Python
packages. For detailed documentation, see the accompanying EasyInstall.txt
file, or visit the `EasyInstall home page`__.
__ https://pythonhosted.org/setuptools/easy_install.html
"""
import sys
import os
import zipimport
import shutil
import tempfile
import zipfile
import re
import stat
import random
import platform
import textwrap
import warnings
import site
import struct
from glob import glob
from distutils import log, dir_util
import pkg_resources
from setuptools import Command, _dont_write_bytecode
from setuptools.sandbox import run_setup
from setuptools.py31compat import get_path, get_config_vars
from distutils.util import get_platform
from distutils.util import convert_path, subst_vars
from distutils.errors import DistutilsArgError, DistutilsOptionError, \
DistutilsError, DistutilsPlatformError
from distutils.command.install import INSTALL_SCHEMES, SCHEME_KEYS
from setuptools.command import setopt
from setuptools.archive_util import unpack_archive
from setuptools.package_index import PackageIndex
from setuptools.package_index import URL_SCHEME
from setuptools.command import bdist_egg, egg_info
from setuptools.compat import (iteritems, maxsize, basestring, unicode,
reraise)
from pkg_resources import (
yield_lines, normalize_path, resource_string, ensure_directory,
get_distribution, find_distributions, Environment, Requirement,
Distribution, PathMetadata, EggMetadata, WorkingSet, DistributionNotFound,
VersionConflict, DEVELOP_DIST,
)
sys_executable = os.environ.get('__VENV_LAUNCHER__',
os.path.normpath(sys.executable))
__all__ = [
'samefile', 'easy_install', 'PthDistributions', 'extract_wininst_cfg',
'main', 'get_exe_prefixes',
]
def is_64bit():
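    # A C pointer ("P") packs to 8 bytes only on a 64-bit Python build.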
return struct.calcsize("P") == 8
def samefile(p1, p2):
both_exist = os.path.exists(p1) and os.path.exists(p2)
use_samefile = hasattr(os.path, 'samefile') and both_exist
if use_samefile:
return os.path.samefile(p1, p2)
norm_p1 = os.path.normpath(os.path.normcase(p1))
norm_p2 = os.path.normpath(os.path.normcase(p2))
return norm_p1 == norm_p2
if sys.version_info <= (3,):
def _to_ascii(s):
return s
def isascii(s):
try:
unicode(s, 'ascii')
return True
except UnicodeError:
return False
else:
def _to_ascii(s):
return s.encode('ascii')
def isascii(s):
try:
s.encode('ascii')
return True
except UnicodeError:
return False
class easy_install(Command):
"""Manage a download/build/install process"""
description = "Find/get/install Python packages"
command_consumes_arguments = True
user_options = [
('prefix=', None, "installation prefix"),
("zip-ok", "z", "install package as a zipfile"),
("multi-version", "m", "make apps have to require() a version"),
("upgrade", "U", "force upgrade (searches PyPI for latest versions)"),
("install-dir=", "d", "install package to DIR"),
("script-dir=", "s", "install scripts to DIR"),
("exclude-scripts", "x", "Don't install scripts"),
("always-copy", "a", "Copy all needed packages to install dir"),
("index-url=", "i", "base URL of Python Package Index"),
("find-links=", "f", "additional URL(s) to search for packages"),
("build-directory=", "b",
"download/extract/build in DIR; keep the results"),
('optimize=', 'O',
"also compile with optimization: -O1 for \"python -O\", "
"-O2 for \"python -OO\", and -O0 to disable [default: -O0]"),
('record=', None,
"filename in which to record list of installed files"),
('always-unzip', 'Z', "don't install as a zipfile, no matter what"),
('site-dirs=','S',"list of directories where .pth files work"),
('editable', 'e', "Install specified packages in editable form"),
('no-deps', 'N', "don't install dependencies"),
('allow-hosts=', 'H', "pattern(s) that hostnames must match"),
('local-snapshots-ok', 'l',
"allow building eggs from local checkouts"),
('version', None, "print version information and exit"),
('no-find-links', None,
"Don't load find-links defined in packages being installed")
]
boolean_options = [
'zip-ok', 'multi-version', 'exclude-scripts', 'upgrade', 'always-copy',
'editable',
'no-deps', 'local-snapshots-ok', 'version'
]
if site.ENABLE_USER_SITE:
help_msg = "install in user site-package '%s'" % site.USER_SITE
user_options.append(('user', None, help_msg))
boolean_options.append('user')
negative_opt = {'always-unzip': 'zip-ok'}
create_index = PackageIndex
def initialize_options(self):
if site.ENABLE_USER_SITE:
whereami = os.path.abspath(__file__)
self.user = whereami.startswith(site.USER_SITE)
else:
self.user = 0
self.zip_ok = self.local_snapshots_ok = None
self.install_dir = self.script_dir = self.exclude_scripts = None
self.index_url = None
self.find_links = None
self.build_directory = None
self.args = None
self.optimize = self.record = None
self.upgrade = self.always_copy = self.multi_version = None
self.editable = self.no_deps = self.allow_hosts = None
self.root = self.prefix = self.no_report = None
self.version = None
self.install_purelib = None # for pure module distributions
self.install_platlib = None # non-pure (dists w/ extensions)
self.install_headers = None # for C/C++ headers
self.install_lib = None # set to either purelib or platlib
self.install_scripts = None
self.install_data = None
self.install_base = None
self.install_platbase = None
if site.ENABLE_USER_SITE:
self.install_userbase = site.USER_BASE
self.install_usersite = site.USER_SITE
else:
self.install_userbase = None
self.install_usersite = None
self.no_find_links = None
# Options not specifiable via command line
self.package_index = None
self.pth_file = self.always_copy_from = None
self.site_dirs = None
self.installed_projects = {}
self.sitepy_installed = False
# Always read easy_install options, even if we are subclassed, or have
# an independent instance created. This ensures that defaults will
# always come from the standard configuration file(s)' "easy_install"
# section, even if this is a "develop" or "install" command, or some
# other embedding.
self._dry_run = None
self.verbose = self.distribution.verbose
self.distribution._set_command_options(
self, self.distribution.get_option_dict('easy_install')
)
def delete_blockers(self, blockers):
for filename in blockers:
if os.path.exists(filename) or os.path.islink(filename):
log.info("Deleting %s", filename)
if not self.dry_run:
if os.path.isdir(filename) and not os.path.islink(filename):
rmtree(filename)
else:
os.unlink(filename)
def finalize_options(self):
if self.version:
print('setuptools %s' % get_distribution('setuptools').version)
sys.exit()
py_version = sys.version.split()[0]
prefix, exec_prefix = get_config_vars('prefix', 'exec_prefix')
self.config_vars = {
'dist_name': self.distribution.get_name(),
'dist_version': self.distribution.get_version(),
'dist_fullname': self.distribution.get_fullname(),
'py_version': py_version,
'py_version_short': py_version[0:3],
'py_version_nodot': py_version[0] + py_version[2],
'sys_prefix': prefix,
'prefix': prefix,
'sys_exec_prefix': exec_prefix,
'exec_prefix': exec_prefix,
# Only python 3.2+ has abiflags
'abiflags': getattr(sys, 'abiflags', ''),
}
if site.ENABLE_USER_SITE:
self.config_vars['userbase'] = self.install_userbase
self.config_vars['usersite'] = self.install_usersite
# fix the install_dir if "--user" was used
#XXX: duplicate of the code in the setup command
if self.user and site.ENABLE_USER_SITE:
self.create_home_path()
if self.install_userbase is None:
raise DistutilsPlatformError(
"User base directory is not specified")
self.install_base = self.install_platbase = self.install_userbase
if os.name == 'posix':
self.select_scheme("unix_user")
else:
self.select_scheme(os.name + "_user")
self.expand_basedirs()
self.expand_dirs()
self._expand('install_dir','script_dir','build_directory','site_dirs')
# If a non-default installation directory was specified, default the
# script directory to match it.
if self.script_dir is None:
self.script_dir = self.install_dir
if self.no_find_links is None:
self.no_find_links = False
# Let install_dir get set by install_lib command, which in turn
# gets its info from the install command, and takes into account
# --prefix and --home and all that other crud.
self.set_undefined_options('install_lib',
('install_dir','install_dir')
)
# Likewise, set default script_dir from 'install_scripts.install_dir'
self.set_undefined_options('install_scripts',
('install_dir', 'script_dir')
)
if self.user and self.install_purelib:
self.install_dir = self.install_purelib
self.script_dir = self.install_scripts
# default --record from the install command
self.set_undefined_options('install', ('record', 'record'))
# Should this be moved to the if statement below? It's not used
# elsewhere
normpath = map(normalize_path, sys.path)
self.all_site_dirs = get_site_dirs()
if self.site_dirs is not None:
site_dirs = [
os.path.expanduser(s.strip()) for s in self.site_dirs.split(',')
]
for d in site_dirs:
if not os.path.isdir(d):
log.warn("%s (in --site-dirs) does not exist", d)
elif normalize_path(d) not in normpath:
raise DistutilsOptionError(
d+" (in --site-dirs) is not on sys.path"
)
else:
self.all_site_dirs.append(normalize_path(d))
if not self.editable: self.check_site_dir()
self.index_url = self.index_url or "https://pypi.python.org/simple"
self.shadow_path = self.all_site_dirs[:]
for path_item in self.install_dir, normalize_path(self.script_dir):
if path_item not in self.shadow_path:
self.shadow_path.insert(0, path_item)
if self.allow_hosts is not None:
hosts = [s.strip() for s in self.allow_hosts.split(',')]
else:
hosts = ['*']
if self.package_index is None:
self.package_index = self.create_index(
self.index_url, search_path = self.shadow_path, hosts=hosts,
)
self.local_index = Environment(self.shadow_path+sys.path)
if self.find_links is not None:
if isinstance(self.find_links, basestring):
self.find_links = self.find_links.split()
else:
self.find_links = []
if self.local_snapshots_ok:
self.package_index.scan_egg_links(self.shadow_path+sys.path)
if not self.no_find_links:
self.package_index.add_find_links(self.find_links)
self.set_undefined_options('install_lib', ('optimize','optimize'))
if not isinstance(self.optimize,int):
try:
self.optimize = int(self.optimize)
if not (0 <= self.optimize <= 2): raise ValueError
except ValueError:
raise DistutilsOptionError("--optimize must be 0, 1, or 2")
if self.editable and not self.build_directory:
raise DistutilsArgError(
"Must specify a build directory (-b) when using --editable"
)
if not self.args:
raise DistutilsArgError(
"No urls, filenames, or requirements specified (see --help)")
self.outputs = []
def _expand_attrs(self, attrs):
for attr in attrs:
val = getattr(self, attr)
if val is not None:
if os.name == 'posix' or os.name == 'nt':
val = os.path.expanduser(val)
val = subst_vars(val, self.config_vars)
setattr(self, attr, val)
def expand_basedirs(self):
"""Calls `os.path.expanduser` on install_base, install_platbase and
root."""
self._expand_attrs(['install_base', 'install_platbase', 'root'])
def expand_dirs(self):
"""Calls `os.path.expanduser` on install dirs."""
self._expand_attrs(['install_purelib', 'install_platlib',
'install_lib', 'install_headers',
'install_scripts', 'install_data',])
def run(self):
if self.verbose != self.distribution.verbose:
log.set_verbosity(self.verbose)
try:
for spec in self.args:
self.easy_install(spec, not self.no_deps)
if self.record:
outputs = self.outputs
if self.root: # strip any package prefix
root_len = len(self.root)
for counter in range(len(outputs)):
outputs[counter] = outputs[counter][root_len:]
from distutils import file_util
self.execute(
file_util.write_file, (self.record, outputs),
"writing list of installed files to '%s'" %
self.record
)
self.warn_deprecated_options()
finally:
log.set_verbosity(self.distribution.verbose)
def pseudo_tempname(self):
"""Return a pseudo-tempname base in the install directory.
This code is intentionally naive; if a malicious party can write to
the target directory you're already in deep doodoo.
"""
try:
pid = os.getpid()
except:
pid = random.randint(0, maxsize)
return os.path.join(self.install_dir, "test-easy-install-%s" % pid)
def warn_deprecated_options(self):
pass
def check_site_dir(self):
"""Verify that self.install_dir is .pth-capable dir, if needed"""
instdir = normalize_path(self.install_dir)
pth_file = os.path.join(instdir,'easy-install.pth')
# Is it a configured, PYTHONPATH, implicit, or explicit site dir?
is_site_dir = instdir in self.all_site_dirs
if not is_site_dir and not self.multi_version:
# No? Then directly test whether it does .pth file processing
is_site_dir = self.check_pth_processing()
else:
# make sure we can write to target dir
testfile = self.pseudo_tempname()+'.write-test'
test_exists = os.path.exists(testfile)
try:
if test_exists: os.unlink(testfile)
open(testfile,'w').close()
os.unlink(testfile)
except (OSError,IOError):
self.cant_write_to_target()
if not is_site_dir and not self.multi_version:
# Can't install non-multi to non-site dir
raise DistutilsError(self.no_default_version_msg())
if is_site_dir:
if self.pth_file is None:
self.pth_file = PthDistributions(pth_file, self.all_site_dirs)
else:
self.pth_file = None
PYTHONPATH = os.environ.get('PYTHONPATH','').split(os.pathsep)
if instdir not in map(normalize_path, [_f for _f in PYTHONPATH if _f]):
# only PYTHONPATH dirs need a site.py, so pretend it's there
self.sitepy_installed = True
elif self.multi_version and not os.path.exists(pth_file):
self.sitepy_installed = True # don't need site.py in this case
self.pth_file = None # and don't create a .pth file
self.install_dir = instdir
def cant_write_to_target(self):
template = """can't create or remove files in install directory
The following error occurred while trying to add or remove files in the
installation directory:
%s
The installation directory you specified (via --install-dir, --prefix, or
the distutils default setting) was:
%s
"""
msg = template % (sys.exc_info()[1], self.install_dir,)
if not os.path.exists(self.install_dir):
msg += """
This directory does not currently exist. Please create it and try again, or
choose a different installation directory (using the -d or --install-dir
option).
"""
else:
msg += """
Perhaps your account does not have write access to this directory? If the
installation directory is a system-owned directory, you may need to sign in
as the administrator or "root" account. If you do not have administrative
access to this machine, you may wish to choose a different installation
directory, preferably one that is listed in your PYTHONPATH environment
variable.
For information on other options, you may wish to consult the
documentation at:
https://pythonhosted.org/setuptools/easy_install.html
Please make the appropriate changes for your system and try again.
"""
raise DistutilsError(msg)
def check_pth_processing(self):
"""Empirically verify whether .pth files are supported in inst. dir"""
instdir = self.install_dir
log.info("Checking .pth file support in %s", instdir)
pth_file = self.pseudo_tempname()+".pth"
ok_file = pth_file+'.ok'
ok_exists = os.path.exists(ok_file)
try:
if ok_exists: os.unlink(ok_file)
dirname = os.path.dirname(ok_file)
if not os.path.exists(dirname):
os.makedirs(dirname)
f = open(pth_file,'w')
except (OSError,IOError):
self.cant_write_to_target()
else:
try:
f.write("import os; f = open(%r, 'w'); f.write('OK'); f.close()\n" % (ok_file,))
f.close()
f=None
executable = sys.executable
if os.name=='nt':
dirname,basename = os.path.split(executable)
alt = os.path.join(dirname,'pythonw.exe')
if basename.lower()=='python.exe' and os.path.exists(alt):
# use pythonw.exe to avoid opening a console window
executable = alt
from distutils.spawn import spawn
spawn([executable,'-E','-c','pass'],0)
if os.path.exists(ok_file):
log.info(
"TEST PASSED: %s appears to support .pth files",
instdir
)
return True
finally:
if f:
f.close()
if os.path.exists(ok_file):
os.unlink(ok_file)
if os.path.exists(pth_file):
os.unlink(pth_file)
if not self.multi_version:
log.warn("TEST FAILED: %s does NOT support .pth files", instdir)
return False
def install_egg_scripts(self, dist):
"""Write all the scripts for `dist`, unless scripts are excluded"""
if not self.exclude_scripts and dist.metadata_isdir('scripts'):
for script_name in dist.metadata_listdir('scripts'):
if dist.metadata_isdir('scripts/' + script_name):
# The "script" is a directory, likely a Python 3
# __pycache__ directory, so skip it.
continue
self.install_script(
dist, script_name,
dist.get_metadata('scripts/'+script_name)
)
self.install_wrapper_scripts(dist)
def add_output(self, path):
if os.path.isdir(path):
for base, dirs, files in os.walk(path):
for filename in files:
self.outputs.append(os.path.join(base,filename))
else:
self.outputs.append(path)
def not_editable(self, spec):
if self.editable:
raise DistutilsArgError(
"Invalid argument %r: you can't use filenames or URLs "
"with --editable (except via the --find-links option)."
% (spec,)
)
def check_editable(self,spec):
if not self.editable:
return
if os.path.exists(os.path.join(self.build_directory, spec.key)):
raise DistutilsArgError(
"%r already exists in %s; can't do a checkout there" %
(spec.key, self.build_directory)
)
def easy_install(self, spec, deps=False):
tmpdir = tempfile.mkdtemp(prefix="easy_install-")
download = None
if not self.editable: self.install_site_py()
try:
if not isinstance(spec,Requirement):
if URL_SCHEME(spec):
# It's a url, download it to tmpdir and process
self.not_editable(spec)
download = self.package_index.download(spec, tmpdir)
return self.install_item(None, download, tmpdir, deps, True)
elif os.path.exists(spec):
# Existing file or directory, just process it directly
self.not_editable(spec)
return self.install_item(None, spec, tmpdir, deps, True)
else:
spec = parse_requirement_arg(spec)
self.check_editable(spec)
dist = self.package_index.fetch_distribution(
spec, tmpdir, self.upgrade, self.editable, not self.always_copy,
self.local_index
)
if dist is None:
msg = "Could not find suitable distribution for %r" % spec
if self.always_copy:
msg+=" (--always-copy skips system and development eggs)"
raise DistutilsError(msg)
elif dist.precedence==DEVELOP_DIST:
# .egg-info dists don't need installing, just process deps
self.process_distribution(spec, dist, deps, "Using")
return dist
else:
return self.install_item(spec, dist.location, tmpdir, deps)
finally:
if os.path.exists(tmpdir):
rmtree(tmpdir)
def install_item(self, spec, download, tmpdir, deps, install_needed=False):
        # Installation is also needed if the file is in tmpdir or is not an egg
install_needed = install_needed or self.always_copy
install_needed = install_needed or os.path.dirname(download) == tmpdir
install_needed = install_needed or not download.endswith('.egg')
install_needed = install_needed or (
self.always_copy_from is not None and
os.path.dirname(normalize_path(download)) ==
normalize_path(self.always_copy_from)
)
if spec and not install_needed:
            # At this point we know it's a local .egg; we just don't know
            # whether it's already installed.
for dist in self.local_index[spec.project_name]:
if dist.location==download:
break
else:
install_needed = True # it's not in the local index
log.info("Processing %s", os.path.basename(download))
if install_needed:
dists = self.install_eggs(spec, download, tmpdir)
for dist in dists:
self.process_distribution(spec, dist, deps)
else:
dists = [self.egg_distribution(download)]
self.process_distribution(spec, dists[0], deps, "Using")
if spec is not None:
for dist in dists:
if dist in spec:
return dist
def select_scheme(self, name):
"""Sets the install directories by applying the install schemes."""
# it's the caller's problem if they supply a bad name!
scheme = INSTALL_SCHEMES[name]
for key in SCHEME_KEYS:
attrname = 'install_' + key
if getattr(self, attrname) is None:
setattr(self, attrname, scheme[key])
def process_distribution(self, requirement, dist, deps=True, *info):
self.update_pth(dist)
self.package_index.add(dist)
self.local_index.add(dist)
self.install_egg_scripts(dist)
self.installed_projects[dist.key] = dist
log.info(self.installation_report(requirement, dist, *info))
if (dist.has_metadata('dependency_links.txt') and
not self.no_find_links):
self.package_index.add_find_links(
dist.get_metadata_lines('dependency_links.txt')
)
if not deps and not self.always_copy:
return
elif requirement is not None and dist.key != requirement.key:
log.warn("Skipping dependencies for %s", dist)
return # XXX this is not the distribution we were looking for
elif requirement is None or dist not in requirement:
# if we wound up with a different version, resolve what we've got
distreq = dist.as_requirement()
requirement = requirement or distreq
requirement = Requirement(
distreq.project_name, distreq.specs, requirement.extras
)
log.info("Processing dependencies for %s", requirement)
try:
distros = WorkingSet([]).resolve(
[requirement], self.local_index, self.easy_install
)
except DistributionNotFound:
e = sys.exc_info()[1]
raise DistutilsError(
"Could not find required distribution %s" % e.args
)
except VersionConflict:
e = sys.exc_info()[1]
raise DistutilsError(
"Installed distribution %s conflicts with requirement %s"
% e.args
)
if self.always_copy or self.always_copy_from:
# Force all the relevant distros to be copied or activated
for dist in distros:
if dist.key not in self.installed_projects:
self.easy_install(dist.as_requirement())
log.info("Finished processing dependencies for %s", requirement)
def should_unzip(self, dist):
if self.zip_ok is not None:
return not self.zip_ok
if dist.has_metadata('not-zip-safe'):
return True
if not dist.has_metadata('zip-safe'):
return True
return False
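    # A minimal decision sketch for should_unzip (comment only, not from the
    # original source): an explicit --zip-ok/--always-unzip flag wins; otherwise
    # eggs marked 'not-zip-safe', or lacking any 'zip-safe' marker, are
    # unzipped, and only eggs positively marked 'zip-safe' stay zipped.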
def maybe_move(self, spec, dist_filename, setup_base):
dst = os.path.join(self.build_directory, spec.key)
if os.path.exists(dst):
msg = "%r already exists in %s; build directory %s will not be kept"
log.warn(msg, spec.key, self.build_directory, setup_base)
return setup_base
if os.path.isdir(dist_filename):
setup_base = dist_filename
else:
if os.path.dirname(dist_filename)==setup_base:
os.unlink(dist_filename) # get it out of the tmp dir
contents = os.listdir(setup_base)
if len(contents)==1:
dist_filename = os.path.join(setup_base,contents[0])
if os.path.isdir(dist_filename):
# if the only thing there is a directory, move it instead
setup_base = dist_filename
ensure_directory(dst)
shutil.move(setup_base, dst)
return dst
def install_wrapper_scripts(self, dist):
if not self.exclude_scripts:
for args in get_script_args(dist):
self.write_script(*args)
def install_script(self, dist, script_name, script_text, dev_path=None):
"""Generate a legacy script wrapper and install it"""
spec = str(dist.as_requirement())
is_script = is_python_script(script_text, script_name)
def get_template(filename):
"""
There are a couple of template scripts in the package. This
function loads one of them and prepares it for use.
These templates use triple-quotes to escape variable
            substitutions so the scripts get the 2to3 treatment when built
on Python 3. The templates cannot use triple-quotes naturally.
"""
            raw_bytes = resource_string('setuptools', filename)
template_str = raw_bytes.decode('utf-8')
clean_template = template_str.replace('"""', '')
return clean_template
if is_script:
template_name = 'script template.py'
if dev_path:
template_name = template_name.replace('.py', ' (dev).py')
script_text = (get_script_header(script_text) +
get_template(template_name) % locals())
self.write_script(script_name, _to_ascii(script_text), 'b')
def write_script(self, script_name, contents, mode="t", blockers=()):
"""Write an executable file to the scripts directory"""
self.delete_blockers( # clean up old .py/.pyw w/o a script
[os.path.join(self.script_dir,x) for x in blockers])
log.info("Installing %s script to %s", script_name, self.script_dir)
target = os.path.join(self.script_dir, script_name)
self.add_output(target)
mask = current_umask()
if not self.dry_run:
ensure_directory(target)
if os.path.exists(target):
os.unlink(target)
f = open(target,"w"+mode)
f.write(contents)
f.close()
chmod(target, 0x1FF-mask) # 0777
def install_eggs(self, spec, dist_filename, tmpdir):
# .egg dirs or files are already built, so just return them
if dist_filename.lower().endswith('.egg'):
return [self.install_egg(dist_filename, tmpdir)]
elif dist_filename.lower().endswith('.exe'):
return [self.install_exe(dist_filename, tmpdir)]
# Anything else, try to extract and build
setup_base = tmpdir
if os.path.isfile(dist_filename) and not dist_filename.endswith('.py'):
unpack_archive(dist_filename, tmpdir, self.unpack_progress)
elif os.path.isdir(dist_filename):
setup_base = os.path.abspath(dist_filename)
if (setup_base.startswith(tmpdir) # something we downloaded
and self.build_directory and spec is not None):
setup_base = self.maybe_move(spec, dist_filename, setup_base)
# Find the setup.py file
setup_script = os.path.join(setup_base, 'setup.py')
if not os.path.exists(setup_script):
setups = glob(os.path.join(setup_base, '*', 'setup.py'))
if not setups:
raise DistutilsError(
"Couldn't find a setup script in %s" % os.path.abspath(dist_filename)
)
if len(setups)>1:
raise DistutilsError(
"Multiple setup scripts in %s" % os.path.abspath(dist_filename)
)
setup_script = setups[0]
# Now run it, and return the result
if self.editable:
log.info(self.report_editable(spec, setup_script))
return []
else:
return self.build_and_install(setup_script, setup_base)
def egg_distribution(self, egg_path):
if os.path.isdir(egg_path):
metadata = PathMetadata(egg_path,os.path.join(egg_path,'EGG-INFO'))
else:
metadata = EggMetadata(zipimport.zipimporter(egg_path))
return Distribution.from_filename(egg_path,metadata=metadata)
def install_egg(self, egg_path, tmpdir):
destination = os.path.join(self.install_dir,os.path.basename(egg_path))
destination = os.path.abspath(destination)
if not self.dry_run:
ensure_directory(destination)
dist = self.egg_distribution(egg_path)
if not samefile(egg_path, destination):
if os.path.isdir(destination) and not os.path.islink(destination):
dir_util.remove_tree(destination, dry_run=self.dry_run)
elif os.path.exists(destination):
self.execute(os.unlink,(destination,),"Removing "+destination)
uncache_zipdir(destination)
if os.path.isdir(egg_path):
if egg_path.startswith(tmpdir):
f,m = shutil.move, "Moving"
else:
f,m = shutil.copytree, "Copying"
elif self.should_unzip(dist):
self.mkpath(destination)
f,m = self.unpack_and_compile, "Extracting"
elif egg_path.startswith(tmpdir):
f,m = shutil.move, "Moving"
else:
f,m = shutil.copy2, "Copying"
self.execute(f, (egg_path, destination),
(m+" %s to %s") %
(os.path.basename(egg_path),os.path.dirname(destination)))
self.add_output(destination)
return self.egg_distribution(destination)
def install_exe(self, dist_filename, tmpdir):
# See if it's valid, get data
cfg = extract_wininst_cfg(dist_filename)
if cfg is None:
raise DistutilsError(
"%s is not a valid distutils Windows .exe" % dist_filename
)
# Create a dummy distribution object until we build the real distro
dist = Distribution(
None,
project_name=cfg.get('metadata','name'),
version=cfg.get('metadata','version'), platform=get_platform(),
)
# Convert the .exe to an unpacked egg
egg_path = dist.location = os.path.join(tmpdir, dist.egg_name()+'.egg')
egg_tmp = egg_path + '.tmp'
_egg_info = os.path.join(egg_tmp, 'EGG-INFO')
pkg_inf = os.path.join(_egg_info, 'PKG-INFO')
ensure_directory(pkg_inf) # make sure EGG-INFO dir exists
dist._provider = PathMetadata(egg_tmp, _egg_info) # XXX
self.exe_to_egg(dist_filename, egg_tmp)
# Write EGG-INFO/PKG-INFO
if not os.path.exists(pkg_inf):
f = open(pkg_inf,'w')
f.write('Metadata-Version: 1.0\n')
for k,v in cfg.items('metadata'):
if k != 'target_version':
f.write('%s: %s\n' % (k.replace('_','-').title(), v))
f.close()
script_dir = os.path.join(_egg_info,'scripts')
self.delete_blockers( # delete entry-point scripts to avoid duping
[os.path.join(script_dir,args[0]) for args in get_script_args(dist)]
)
# Build .egg file from tmpdir
bdist_egg.make_zipfile(
egg_path, egg_tmp, verbose=self.verbose, dry_run=self.dry_run
)
# install the .egg
return self.install_egg(egg_path, tmpdir)
def exe_to_egg(self, dist_filename, egg_tmp):
"""Extract a bdist_wininst to the directories an egg would use"""
# Check for .pth file and set up prefix translations
prefixes = get_exe_prefixes(dist_filename)
to_compile = []
native_libs = []
top_level = {}
def process(src,dst):
s = src.lower()
for old,new in prefixes:
if s.startswith(old):
src = new+src[len(old):]
parts = src.split('/')
dst = os.path.join(egg_tmp, *parts)
dl = dst.lower()
if dl.endswith('.pyd') or dl.endswith('.dll'):
parts[-1] = bdist_egg.strip_module(parts[-1])
top_level[os.path.splitext(parts[0])[0]] = 1
native_libs.append(src)
elif dl.endswith('.py') and old!='SCRIPTS/':
top_level[os.path.splitext(parts[0])[0]] = 1
to_compile.append(dst)
return dst
if not src.endswith('.pth'):
log.warn("WARNING: can't process %s", src)
return None
# extract, tracking .pyd/.dll->native_libs and .py -> to_compile
unpack_archive(dist_filename, egg_tmp, process)
stubs = []
for res in native_libs:
if res.lower().endswith('.pyd'): # create stubs for .pyd's
parts = res.split('/')
resource = parts[-1]
parts[-1] = bdist_egg.strip_module(parts[-1])+'.py'
pyfile = os.path.join(egg_tmp, *parts)
to_compile.append(pyfile)
stubs.append(pyfile)
bdist_egg.write_stub(resource, pyfile)
self.byte_compile(to_compile) # compile .py's
bdist_egg.write_safety_flag(os.path.join(egg_tmp,'EGG-INFO'),
bdist_egg.analyze_egg(egg_tmp, stubs)) # write zip-safety flag
for name in 'top_level','native_libs':
if locals()[name]:
txt = os.path.join(egg_tmp, 'EGG-INFO', name+'.txt')
if not os.path.exists(txt):
f = open(txt,'w')
f.write('\n'.join(locals()[name])+'\n')
f.close()
def installation_report(self, req, dist, what="Installed"):
"""Helpful installation message for display to package users"""
msg = "\n%(what)s %(eggloc)s%(extras)s"
if self.multi_version and not self.no_report:
msg += """
Because this distribution was installed --multi-version, before you can
import modules from this package in an application, you will need to
'import pkg_resources' and then use a 'require()' call similar to one of
these examples, in order to select the desired version:
pkg_resources.require("%(name)s") # latest installed version
pkg_resources.require("%(name)s==%(version)s") # this exact version
pkg_resources.require("%(name)s>=%(version)s") # this version or higher
"""
if self.install_dir not in map(normalize_path,sys.path):
msg += """
Note also that the installation directory must be on sys.path at runtime for
this to work. (e.g. by being the application's script directory, by being on
PYTHONPATH, or by being added to sys.path by your code.)
"""
eggloc = dist.location
name = dist.project_name
version = dist.version
extras = '' # TODO: self.report_extras(req, dist)
return msg % locals()
def report_editable(self, spec, setup_script):
dirname = os.path.dirname(setup_script)
python = sys.executable
return """\nExtracted editable version of %(spec)s to %(dirname)s
If it uses setuptools in its setup script, you can activate it in
"development" mode by going to that directory and running::
%(python)s setup.py develop
See the setuptools documentation for the "develop" command for more info.
""" % locals()
def run_setup(self, setup_script, setup_base, args):
sys.modules.setdefault('distutils.command.bdist_egg', bdist_egg)
sys.modules.setdefault('distutils.command.egg_info', egg_info)
args = list(args)
if self.verbose>2:
v = 'v' * (self.verbose - 1)
args.insert(0,'-'+v)
elif self.verbose<2:
args.insert(0,'-q')
if self.dry_run:
args.insert(0,'-n')
log.info(
"Running %s %s", setup_script[len(setup_base)+1:], ' '.join(args)
)
try:
run_setup(setup_script, args)
except SystemExit:
v = sys.exc_info()[1]
raise DistutilsError("Setup script exited with %s" % (v.args[0],))
def build_and_install(self, setup_script, setup_base):
args = ['bdist_egg', '--dist-dir']
dist_dir = tempfile.mkdtemp(
prefix='egg-dist-tmp-', dir=os.path.dirname(setup_script)
)
try:
self._set_fetcher_options(os.path.dirname(setup_script))
args.append(dist_dir)
self.run_setup(setup_script, setup_base, args)
all_eggs = Environment([dist_dir])
eggs = []
for key in all_eggs:
for dist in all_eggs[key]:
eggs.append(self.install_egg(dist.location, setup_base))
if not eggs and not self.dry_run:
log.warn("No eggs found in %s (setup script problem?)",
dist_dir)
return eggs
finally:
rmtree(dist_dir)
log.set_verbosity(self.verbose) # restore our log verbosity
def _set_fetcher_options(self, base):
"""
When easy_install is about to run bdist_egg on a source dist, that
source dist might have 'setup_requires' directives, requiring
additional fetching. Ensure the fetcher options given to easy_install
are available to that command as well.
"""
# find the fetch options from easy_install and write them out
# to the setup.cfg file.
ei_opts = self.distribution.get_option_dict('easy_install').copy()
fetch_directives = (
'find_links', 'site_dirs', 'index_url', 'optimize',
'site_dirs', 'allow_hosts',
)
fetch_options = {}
for key, val in ei_opts.items():
if key not in fetch_directives: continue
fetch_options[key.replace('_', '-')] = val[1]
# create a settings dictionary suitable for `edit_config`
settings = dict(easy_install=fetch_options)
cfg_filename = os.path.join(base, 'setup.cfg')
setopt.edit_config(cfg_filename, settings)
def update_pth(self, dist):
if self.pth_file is None:
return
for d in self.pth_file[dist.key]: # drop old entries
if self.multi_version or d.location != dist.location:
log.info("Removing %s from easy-install.pth file", d)
self.pth_file.remove(d)
if d.location in self.shadow_path:
self.shadow_path.remove(d.location)
if not self.multi_version:
if dist.location in self.pth_file.paths:
log.info(
"%s is already the active version in easy-install.pth",
dist
)
else:
log.info("Adding %s to easy-install.pth file", dist)
self.pth_file.add(dist) # add new entry
if dist.location not in self.shadow_path:
self.shadow_path.append(dist.location)
if not self.dry_run:
self.pth_file.save()
if dist.key=='setuptools':
# Ensure that setuptools itself never becomes unavailable!
# XXX should this check for latest version?
filename = os.path.join(self.install_dir,'setuptools.pth')
if os.path.islink(filename): os.unlink(filename)
f = open(filename, 'wt')
f.write(self.pth_file.make_relative(dist.location)+'\n')
f.close()
def unpack_progress(self, src, dst):
# Progress filter for unpacking
log.debug("Unpacking %s to %s", src, dst)
return dst # only unpack-and-compile skips files for dry run
def unpack_and_compile(self, egg_path, destination):
to_compile = []
to_chmod = []
def pf(src, dst):
if dst.endswith('.py') and not src.startswith('EGG-INFO/'):
to_compile.append(dst)
elif dst.endswith('.dll') or dst.endswith('.so'):
to_chmod.append(dst)
self.unpack_progress(src,dst)
return not self.dry_run and dst or None
unpack_archive(egg_path, destination, pf)
self.byte_compile(to_compile)
if not self.dry_run:
for f in to_chmod:
mode = ((os.stat(f)[stat.ST_MODE]) | 0x16D) & 0xFED # 0555, 07755
chmod(f, mode)
def byte_compile(self, to_compile):
if _dont_write_bytecode:
self.warn('byte-compiling is disabled, skipping.')
return
from distutils.util import byte_compile
try:
# try to make the byte compile messages quieter
log.set_verbosity(self.verbose - 1)
byte_compile(to_compile, optimize=0, force=1, dry_run=self.dry_run)
if self.optimize:
byte_compile(
to_compile, optimize=self.optimize, force=1,
dry_run=self.dry_run
)
finally:
log.set_verbosity(self.verbose) # restore original verbosity
def no_default_version_msg(self):
template = """bad install directory or PYTHONPATH
You are attempting to install a package to a directory that is not
on PYTHONPATH and which Python does not read ".pth" files from. The
installation directory you specified (via --install-dir, --prefix, or
the distutils default setting) was:
%s
and your PYTHONPATH environment variable currently contains:
%r
Here are some of your options for correcting the problem:
* You can choose a different installation directory, i.e., one that is
on PYTHONPATH or supports .pth files
* You can add the installation directory to the PYTHONPATH environment
variable. (It must then also be on PYTHONPATH whenever you run
Python and want to use the package(s) you are installing.)
* You can set up the installation directory to support ".pth" files by
using one of the approaches described here:
https://pythonhosted.org/setuptools/easy_install.html#custom-installation-locations
Please make the appropriate changes for your system and try again."""
return template % (self.install_dir, os.environ.get('PYTHONPATH',''))
def install_site_py(self):
"""Make sure there's a site.py in the target dir, if needed"""
if self.sitepy_installed:
return # already did it, or don't need to
sitepy = os.path.join(self.install_dir, "site.py")
source = resource_string("setuptools", "site-patch.py")
current = ""
if os.path.exists(sitepy):
log.debug("Checking existing site.py in %s", self.install_dir)
f = open(sitepy,'rb')
current = f.read()
# we want str, not bytes
if sys.version_info >= (3,):
current = current.decode()
f.close()
if not current.startswith('def __boot():'):
raise DistutilsError(
"%s is not a setuptools-generated site.py; please"
" remove it." % sitepy
)
if current != source:
log.info("Creating %s", sitepy)
if not self.dry_run:
ensure_directory(sitepy)
f = open(sitepy,'wb')
f.write(source)
f.close()
self.byte_compile([sitepy])
self.sitepy_installed = True
def create_home_path(self):
"""Create directories under ~."""
if not self.user:
return
home = convert_path(os.path.expanduser("~"))
for name, path in iteritems(self.config_vars):
if path.startswith(home) and not os.path.isdir(path):
self.debug_print("os.makedirs('%s', 0700)" % path)
os.makedirs(path, 0x1C0) # 0700
INSTALL_SCHEMES = dict(
posix = dict(
install_dir = '$base/lib/python$py_version_short/site-packages',
script_dir = '$base/bin',
),
)
DEFAULT_SCHEME = dict(
install_dir = '$base/Lib/site-packages',
script_dir = '$base/Scripts',
)
def _expand(self, *attrs):
config_vars = self.get_finalized_command('install').config_vars
if self.prefix:
# Set default install_dir/scripts from --prefix
config_vars = config_vars.copy()
config_vars['base'] = self.prefix
scheme = self.INSTALL_SCHEMES.get(os.name,self.DEFAULT_SCHEME)
for attr,val in scheme.items():
if getattr(self,attr,None) is None:
setattr(self,attr,val)
from distutils.util import subst_vars
for attr in attrs:
val = getattr(self, attr)
if val is not None:
val = subst_vars(val, config_vars)
if os.name == 'posix':
val = os.path.expanduser(val)
setattr(self, attr, val)
def get_site_dirs():
# return a list of 'site' dirs
sitedirs = [_f for _f in os.environ.get('PYTHONPATH',
'').split(os.pathsep) if _f]
prefixes = [sys.prefix]
if sys.exec_prefix != sys.prefix:
prefixes.append(sys.exec_prefix)
for prefix in prefixes:
if prefix:
if sys.platform in ('os2emx', 'riscos'):
sitedirs.append(os.path.join(prefix, "Lib", "site-packages"))
elif os.sep == '/':
sitedirs.extend([os.path.join(prefix,
"lib",
"python" + sys.version[:3],
"site-packages"),
os.path.join(prefix, "lib", "site-python")])
else:
sitedirs.extend(
[prefix, os.path.join(prefix, "lib", "site-packages")]
)
if sys.platform == 'darwin':
# for framework builds *only* we add the standard Apple
# locations. Currently only per-user, but /Library and
# /Network/Library could be added too
if 'Python.framework' in prefix:
home = os.environ.get('HOME')
if home:
sitedirs.append(
os.path.join(home,
'Library',
'Python',
sys.version[:3],
'site-packages'))
lib_paths = get_path('purelib'), get_path('platlib')
for site_lib in lib_paths:
if site_lib not in sitedirs: sitedirs.append(site_lib)
if site.ENABLE_USER_SITE:
sitedirs.append(site.USER_SITE)
sitedirs = list(map(normalize_path, sitedirs))
return sitedirs
def expand_paths(inputs):
"""Yield sys.path directories that might contain "old-style" packages"""
seen = {}
for dirname in inputs:
dirname = normalize_path(dirname)
if dirname in seen:
continue
seen[dirname] = 1
if not os.path.isdir(dirname):
continue
files = os.listdir(dirname)
yield dirname, files
for name in files:
if not name.endswith('.pth'):
# We only care about the .pth files
continue
if name in ('easy-install.pth','setuptools.pth'):
# Ignore .pth files that we control
continue
# Read the .pth file
f = open(os.path.join(dirname,name))
lines = list(yield_lines(f))
f.close()
# Yield existing non-dupe, non-import directory lines from it
for line in lines:
if not line.startswith("import"):
line = normalize_path(line.rstrip())
if line not in seen:
seen[line] = 1
if not os.path.isdir(line):
continue
yield line, os.listdir(line)
def extract_wininst_cfg(dist_filename):
"""Extract configuration data from a bdist_wininst .exe
Returns a ConfigParser.RawConfigParser, or None
"""
f = open(dist_filename,'rb')
try:
endrec = zipfile._EndRecData(f)
if endrec is None:
return None
prepended = (endrec[9] - endrec[5]) - endrec[6]
if prepended < 12: # no wininst data here
return None
f.seek(prepended-12)
from setuptools.compat import StringIO, ConfigParser
import struct
tag, cfglen, bmlen = struct.unpack("<iii",f.read(12))
if tag not in (0x1234567A, 0x1234567B):
return None # not a valid tag
f.seek(prepended-(12+cfglen))
cfg = ConfigParser.RawConfigParser({'version':'','target_version':''})
try:
part = f.read(cfglen)
# part is in bytes, but we need to read up to the first null
# byte.
if sys.version_info >= (2,6):
null_byte = bytes([0])
else:
null_byte = chr(0)
config = part.split(null_byte, 1)[0]
# Now the config is in bytes, but for RawConfigParser, it should
# be text, so decode it.
config = config.decode(sys.getfilesystemencoding())
cfg.readfp(StringIO(config))
except ConfigParser.Error:
return None
if not cfg.has_section('metadata') or not cfg.has_section('Setup'):
return None
return cfg
finally:
f.close()
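# Layout sketch for the parsing above (an assumption drawn from this code, not
# from a spec): a bdist_wininst .exe prepends data to a plain zip; the last 12
# prepended bytes are struct '<iii' = (tag, config length, bitmap length), and
# the config text sits immediately before that 12-byte trailer.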
def get_exe_prefixes(exe_filename):
"""Get exe->egg path translations for a given .exe file"""
prefixes = [
('PURELIB/', ''), ('PLATLIB/pywin32_system32', ''),
('PLATLIB/', ''),
('SCRIPTS/', 'EGG-INFO/scripts/'),
('DATA/lib/site-packages', ''),
]
z = zipfile.ZipFile(exe_filename)
try:
for info in z.infolist():
name = info.filename
parts = name.split('/')
if len(parts)==3 and parts[2]=='PKG-INFO':
if parts[1].endswith('.egg-info'):
prefixes.insert(0,('/'.join(parts[:2]), 'EGG-INFO/'))
break
if len(parts) != 2 or not name.endswith('.pth'):
continue
if name.endswith('-nspkg.pth'):
continue
if parts[0].upper() in ('PURELIB','PLATLIB'):
contents = z.read(name)
if sys.version_info >= (3,):
contents = contents.decode()
for pth in yield_lines(contents):
pth = pth.strip().replace('\\','/')
if not pth.startswith('import'):
prefixes.append((('%s/%s/' % (parts[0],pth)), ''))
finally:
z.close()
prefixes = [(x.lower(),y) for x, y in prefixes]
prefixes.sort()
prefixes.reverse()
return prefixes
def parse_requirement_arg(spec):
try:
return Requirement.parse(spec)
except ValueError:
raise DistutilsError(
"Not a URL, existing file, or requirement spec: %r" % (spec,)
)
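# e.g. (sketch): parse_requirement_arg('foo>=1.0') yields a pkg_resources
# Requirement, while an unparseable string such as 'not a req!' is reported
# as "Not a URL, existing file, or requirement spec".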
class PthDistributions(Environment):
"""A .pth file with Distribution paths in it"""
dirty = False
def __init__(self, filename, sitedirs=()):
self.filename = filename
self.sitedirs = list(map(normalize_path, sitedirs))
self.basedir = normalize_path(os.path.dirname(self.filename))
self._load()
Environment.__init__(self, [], None, None)
for path in yield_lines(self.paths):
list(map(self.add, find_distributions(path, True)))
def _load(self):
self.paths = []
saw_import = False
seen = dict.fromkeys(self.sitedirs)
if os.path.isfile(self.filename):
f = open(self.filename,'rt')
for line in f:
if line.startswith('import'):
saw_import = True
continue
path = line.rstrip()
self.paths.append(path)
if not path.strip() or path.strip().startswith('#'):
continue
# skip non-existent paths, in case somebody deleted a package
# manually, and duplicate paths as well
path = self.paths[-1] = normalize_path(
os.path.join(self.basedir,path)
)
if not os.path.exists(path) or path in seen:
self.paths.pop() # skip it
self.dirty = True # we cleaned up, so we're dirty now :)
continue
seen[path] = 1
f.close()
if self.paths and not saw_import:
self.dirty = True # ensure anything we touch has import wrappers
while self.paths and not self.paths[-1].strip():
self.paths.pop()
def save(self):
"""Write changed .pth file back to disk"""
if not self.dirty:
return
data = '\n'.join(map(self.make_relative,self.paths))
if data:
log.debug("Saving %s", self.filename)
data = (
"import sys; sys.__plen = len(sys.path)\n"
"%s\n"
"import sys; new=sys.path[sys.__plen:];"
" del sys.path[sys.__plen:];"
" p=getattr(sys,'__egginsert',0); sys.path[p:p]=new;"
" sys.__egginsert = p+len(new)\n"
) % data
if os.path.islink(self.filename):
os.unlink(self.filename)
f = open(self.filename,'wt')
f.write(data)
f.close()
elif os.path.exists(self.filename):
log.debug("Deleting empty %s", self.filename)
os.unlink(self.filename)
self.dirty = False
def add(self, dist):
"""Add `dist` to the distribution map"""
if (dist.location not in self.paths and (
dist.location not in self.sitedirs or
dist.location == os.getcwd() # account for '.' being in PYTHONPATH
)):
self.paths.append(dist.location)
self.dirty = True
Environment.add(self, dist)
def remove(self, dist):
"""Remove `dist` from the distribution map"""
while dist.location in self.paths:
self.paths.remove(dist.location)
self.dirty = True
Environment.remove(self, dist)
def make_relative(self,path):
npath, last = os.path.split(normalize_path(path))
baselen = len(self.basedir)
parts = [last]
sep = os.altsep=='/' and '/' or os.sep
while len(npath)>=baselen:
if npath==self.basedir:
parts.append(os.curdir)
parts.reverse()
return sep.join(parts)
npath, last = os.path.split(npath)
parts.append(last)
else:
return path
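# A sketch of make_relative with hypothetical paths: with basedir
# '/opt/site-packages', the path '/opt/site-packages/foo.egg' is rewritten to
# './foo.egg', while a path outside basedir is returned unchanged.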
def get_script_header(script_text, executable=sys_executable, wininst=False):
"""Create a #! line, getting options (if any) from script_text"""
from distutils.command.build_scripts import first_line_re
# first_line_re in Python >=3.1.4 and >=3.2.1 is a bytes pattern.
if not isinstance(first_line_re.pattern, str):
first_line_re = re.compile(first_line_re.pattern.decode())
first = (script_text+'\n').splitlines()[0]
match = first_line_re.match(first)
options = ''
if match:
options = match.group(1) or ''
if options: options = ' '+options
if wininst:
executable = "python.exe"
else:
executable = nt_quote_arg(executable)
hdr = "#!%(executable)s%(options)s\n" % locals()
if not isascii(hdr):
# Non-ascii path to sys.executable, use -x to prevent warnings
if options:
if options.strip().startswith('-'):
options = ' -x'+options.strip()[1:]
# else: punt, we can't do it, let the warning happen anyway
else:
options = ' -x'
executable = fix_jython_executable(executable, options)
hdr = "#!%(executable)s%(options)s\n" % locals()
return hdr
def auto_chmod(func, arg, exc):
if func is os.remove and os.name=='nt':
chmod(arg, stat.S_IWRITE)
return func(arg)
et, ev, _ = sys.exc_info()
reraise(et, (ev[0], ev[1] + (" %s %s" % (func,arg))))
def uncache_zipdir(path):
"""Ensure that the importer caches dont have stale info for `path`"""
from zipimport import _zip_directory_cache as zdc
_uncache(path, zdc)
_uncache(path, sys.path_importer_cache)
def _uncache(path, cache):
if path in cache:
del cache[path]
else:
path = normalize_path(path)
for p in cache:
if normalize_path(p)==path:
del cache[p]
return
def is_python(text, filename='<string>'):
"Is this string a valid Python script?"
try:
compile(text, filename, 'exec')
except (SyntaxError, TypeError):
return False
else:
return True
def is_sh(executable):
"""Determine if the specified executable is a .sh (contains a #! line)"""
try:
fp = open(executable)
magic = fp.read(2)
fp.close()
except (OSError,IOError): return executable
return magic == '#!'
def nt_quote_arg(arg):
"""Quote a command line argument according to Windows parsing rules"""
result = []
needquote = False
nb = 0
needquote = (" " in arg) or ("\t" in arg)
if needquote:
result.append('"')
for c in arg:
if c == '\\':
nb += 1
elif c == '"':
# double preceding backslashes, then add a \"
result.append('\\' * (nb*2) + '\\"')
nb = 0
else:
if nb:
result.append('\\' * nb)
nb = 0
result.append(c)
if nb:
result.append('\\' * nb)
if needquote:
result.append('\\' * nb) # double the trailing backslashes
result.append('"')
return ''.join(result)
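# A minimal self-check sketch (not part of setuptools) illustrating the quoting
# rules implemented above; the argument strings are made up for the example.
def _nt_quote_arg_demo():
    assert nt_quote_arg('simple') == 'simple'            # no whitespace: unchanged
    assert nt_quote_arg('with space') == '"with space"'  # whitespace: quoted
    assert nt_quote_arg('embedded"quote') == 'embedded\\"quote'  # quote escaped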
def is_python_script(script_text, filename):
"""Is this text, as a whole, a Python script? (as opposed to shell/bat/etc.
"""
if filename.endswith('.py') or filename.endswith('.pyw'):
return True # extension says it's Python
if is_python(script_text, filename):
return True # it's syntactically valid Python
if script_text.startswith('#!'):
# It begins with a '#!' line, so check if 'python' is in it somewhere
return 'python' in script_text.splitlines()[0].lower()
return False # Not any Python I can recognize
try:
from os import chmod as _chmod
except ImportError:
# Jython compatibility
def _chmod(*args): pass
def chmod(path, mode):
log.debug("changing mode of %s to %o", path, mode)
try:
_chmod(path, mode)
except os.error:
e = sys.exc_info()[1]
log.debug("chmod failed: %s", e)
def fix_jython_executable(executable, options):
if sys.platform.startswith('java') and is_sh(executable):
        # The Jython workaround is not needed on Linux systems.
import java
if java.lang.System.getProperty("os.name") == "Linux":
return executable
        # Work around Jython's sys.executable being a .sh (an invalid
        # shebang line interpreter)
if options:
# Can't apply the workaround, leave it broken
log.warn(
"WARNING: Unable to adapt shebang line for Jython,"
" the following script is NOT executable\n"
" see http://bugs.jython.org/issue1112 for"
" more information.")
else:
return '/usr/bin/env %s' % executable
return executable
class ScriptWriter(object):
"""
Encapsulates behavior around writing entry point scripts for console and
gui apps.
"""
template = textwrap.dedent("""
# EASY-INSTALL-ENTRY-SCRIPT: %(spec)r,%(group)r,%(name)r
__requires__ = %(spec)r
import sys
from pkg_resources import load_entry_point
if __name__ == '__main__':
sys.exit(
load_entry_point(%(spec)r, %(group)r, %(name)r)()
)
""").lstrip()
@classmethod
def get_script_args(cls, dist, executable=sys_executable, wininst=False):
"""
Yield write_script() argument tuples for a distribution's entrypoints
"""
gen_class = cls.get_writer(wininst)
spec = str(dist.as_requirement())
header = get_script_header("", executable, wininst)
for type_ in 'console', 'gui':
group = type_ + '_scripts'
for name, ep in dist.get_entry_map(group).items():
script_text = gen_class.template % locals()
for res in gen_class._get_script_args(type_, name, header,
script_text):
yield res
@classmethod
def get_writer(cls, force_windows):
if force_windows or sys.platform=='win32':
return WindowsScriptWriter.get_writer()
return cls
@classmethod
def _get_script_args(cls, type_, name, header, script_text):
# Simply write the stub with no extension.
yield (name, header+script_text)
class WindowsScriptWriter(ScriptWriter):
@classmethod
def get_writer(cls):
"""
Get a script writer suitable for Windows
"""
writer_lookup = dict(
executable=WindowsExecutableLauncherWriter,
natural=cls,
)
# for compatibility, use the executable launcher by default
launcher = os.environ.get('SETUPTOOLS_LAUNCHER', 'executable')
return writer_lookup[launcher]
@classmethod
def _get_script_args(cls, type_, name, header, script_text):
"For Windows, add a .py extension"
ext = dict(console='.pya', gui='.pyw')[type_]
if ext not in os.environ['PATHEXT'].lower().split(';'):
warnings.warn("%s not listed in PATHEXT; scripts will not be "
"recognized as executables." % ext, UserWarning)
old = ['.pya', '.py', '-script.py', '.pyc', '.pyo', '.pyw', '.exe']
old.remove(ext)
header = cls._adjust_header(type_, header)
blockers = [name+x for x in old]
yield name+ext, header+script_text, 't', blockers
@staticmethod
def _adjust_header(type_, orig_header):
"""
        Make sure 'pythonw' is used for gui and 'python' is used for
console (regardless of what sys.executable is).
"""
pattern = 'pythonw.exe'
repl = 'python.exe'
if type_ == 'gui':
pattern, repl = repl, pattern
pattern_ob = re.compile(re.escape(pattern), re.IGNORECASE)
new_header = pattern_ob.sub(string=orig_header, repl=repl)
clean_header = new_header[2:-1].strip('"')
if sys.platform == 'win32' and not os.path.exists(clean_header):
# the adjusted version doesn't exist, so return the original
return orig_header
return new_header
class WindowsExecutableLauncherWriter(WindowsScriptWriter):
@classmethod
def _get_script_args(cls, type_, name, header, script_text):
"""
For Windows, add a .py extension and an .exe launcher
"""
if type_=='gui':
launcher_type = 'gui'
ext = '-script.pyw'
old = ['.pyw']
else:
launcher_type = 'cli'
ext = '-script.py'
old = ['.py','.pyc','.pyo']
hdr = cls._adjust_header(type_, header)
blockers = [name+x for x in old]
yield (name+ext, hdr+script_text, 't', blockers)
yield (
name+'.exe', get_win_launcher(launcher_type),
'b' # write in binary mode
)
if not is_64bit():
# install a manifest for the launcher to prevent Windows
# from detecting it as an installer (which it will for
# launchers like easy_install.exe). Consider only
# adding a manifest for launchers detected as installers.
# See Distribute #143 for details.
m_name = name + '.exe.manifest'
yield (m_name, load_launcher_manifest(name), 't')
# for backward-compatibility
get_script_args = ScriptWriter.get_script_args
def get_win_launcher(type):
"""
Load the Windows launcher (executable) suitable for launching a script.
`type` should be either 'cli' or 'gui'
Returns the executable as a byte string.
"""
launcher_fn = '%s.exe' % type
if platform.machine().lower()=='arm':
launcher_fn = launcher_fn.replace(".", "-arm.")
if is_64bit():
launcher_fn = launcher_fn.replace(".", "-64.")
else:
launcher_fn = launcher_fn.replace(".", "-32.")
return resource_string('setuptools', launcher_fn)
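# e.g. (sketch): on 64-bit Windows get_win_launcher('cli') loads 'cli-64.exe',
# on 32-bit 'cli-32.exe'; an ARM machine gets '-arm' spliced in first, so the
# 64-bit ARM name comes out as 'cli-arm-64.exe'.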
def load_launcher_manifest(name):
manifest = pkg_resources.resource_string(__name__, 'launcher manifest.xml')
if sys.version_info[0] < 3:
return manifest % vars()
else:
return manifest.decode('utf-8') % vars()
def rmtree(path, ignore_errors=False, onerror=auto_chmod):
"""Recursively delete a directory tree.
This code is taken from the Python 2.4 version of 'shutil', because
the 2.3 version doesn't really work right.
"""
if ignore_errors:
def onerror(*args):
pass
elif onerror is None:
def onerror(*args):
raise
names = []
try:
names = os.listdir(path)
except os.error:
onerror(os.listdir, path, sys.exc_info())
for name in names:
fullname = os.path.join(path, name)
try:
mode = os.lstat(fullname).st_mode
except os.error:
mode = 0
if stat.S_ISDIR(mode):
rmtree(fullname, ignore_errors, onerror)
else:
try:
os.remove(fullname)
except os.error:
onerror(os.remove, fullname, sys.exc_info())
try:
os.rmdir(path)
except os.error:
onerror(os.rmdir, path, sys.exc_info())
def current_umask():
tmp = os.umask(0x12) # 022
os.umask(tmp)
return tmp
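# A tiny sketch (not part of setuptools): current_umask() reads the process
# umask without disturbing it, by setting a throwaway mask and putting the
# original straight back.
def _current_umask_demo():
    assert current_umask() == current_umask()  # two reads leave the mask intact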
def bootstrap():
# This function is called when setuptools*.egg is run using /bin/sh
import setuptools
argv0 = os.path.dirname(setuptools.__path__[0])
sys.argv[0] = argv0
sys.argv.append(argv0)
main()
def main(argv=None, **kw):
from setuptools import setup
from setuptools.dist import Distribution
import distutils.core
USAGE = """\
usage: %(script)s [options] requirement_or_url ...
or: %(script)s --help
"""
def gen_usage(script_name):
return USAGE % dict(
script=os.path.basename(script_name),
)
def with_ei_usage(f):
old_gen_usage = distutils.core.gen_usage
try:
distutils.core.gen_usage = gen_usage
return f()
finally:
distutils.core.gen_usage = old_gen_usage
class DistributionWithoutHelpCommands(Distribution):
common_usage = ""
def _show_help(self,*args,**kw):
with_ei_usage(lambda: Distribution._show_help(self,*args,**kw))
if argv is None:
argv = sys.argv[1:]
with_ei_usage(lambda:
setup(
script_args = ['-q','easy_install', '-v']+argv,
script_name = sys.argv[0] or 'easy_install',
distclass=DistributionWithoutHelpCommands, **kw
)
)
|
gpl-2.0
|
otherness-space/myProject002
|
my_project_002/lib/python2.7/site-packages/django/contrib/contenttypes/management.py
|
96
|
2903
|
from django.contrib.contenttypes.models import ContentType
from django.db import DEFAULT_DB_ALIAS, router
from django.db.models import get_apps, get_models, signals
from django.utils.encoding import smart_text
from django.utils import six
from django.utils.six.moves import input
def update_contenttypes(app, created_models, verbosity=2, db=DEFAULT_DB_ALIAS, **kwargs):
"""
Creates content types for models in the given app, removing any model
entries that no longer have a matching model class.
"""
if not router.allow_syncdb(db, ContentType):
return
ContentType.objects.clear_cache()
app_models = get_models(app)
if not app_models:
return
# They all have the same app_label, get the first one.
app_label = app_models[0]._meta.app_label
app_models = dict(
(model._meta.object_name.lower(), model)
for model in app_models
)
# Get all the content types
content_types = dict(
(ct.model, ct)
for ct in ContentType.objects.using(db).filter(app_label=app_label)
)
to_remove = [
ct
for (model_name, ct) in six.iteritems(content_types)
if model_name not in app_models
]
cts = [
ContentType(
name=smart_text(model._meta.verbose_name_raw),
app_label=app_label,
model=model_name,
)
for (model_name, model) in six.iteritems(app_models)
if model_name not in content_types
]
ContentType.objects.using(db).bulk_create(cts)
if verbosity >= 2:
for ct in cts:
print("Adding content type '%s | %s'" % (ct.app_label, ct.model))
# Confirm that the content type is stale before deletion.
if to_remove:
if kwargs.get('interactive', False):
content_type_display = '\n'.join([
' %s | %s' % (ct.app_label, ct.model)
for ct in to_remove
])
ok_to_delete = input("""The following content types are stale and need to be deleted:
%s
Any objects related to these content types by a foreign key will also
be deleted. Are you sure you want to delete these content types?
If you're unsure, answer 'no'.
Type 'yes' to continue, or 'no' to cancel: """ % content_type_display)
else:
ok_to_delete = False
if ok_to_delete == 'yes':
for ct in to_remove:
if verbosity >= 2:
print("Deleting stale content type '%s | %s'" % (ct.app_label, ct.model))
ct.delete()
else:
if verbosity >= 2:
print("Stale content types remain.")
def update_all_contenttypes(verbosity=2, **kwargs):
for app in get_apps():
update_contenttypes(app, None, verbosity, **kwargs)
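# Illustrative usage sketch (assumes a configured Django settings module; the
# 'auth' app is just an example):
#   from django.db.models import get_app
#   update_contenttypes(get_app('auth'), None, verbosity=2, interactive=True)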
signals.post_syncdb.connect(update_contenttypes)
if __name__ == "__main__":
update_all_contenttypes()
|
mit
|
onceuponatimeforever/oh-mainline
|
vendor/packages/bleach/bleach/tests/test_basics.py
|
21
|
5094
|
import html5lib
from nose.tools import eq_
import bleach
def test_empty():
eq_('', bleach.clean(''))
def test_comments_only():
comment = '<!-- this is a comment -->'
open_comment = '<!-- this is an open comment'
eq_('', bleach.clean(comment))
eq_('', bleach.clean(open_comment))
eq_(comment, bleach.clean(comment, strip_comments=False))
eq_('%s-->' % open_comment, bleach.clean(open_comment,
strip_comments=False))
def test_with_comments():
html = '<!-- comment -->Just text'
eq_('Just text', bleach.clean(html))
eq_(html, bleach.clean(html, strip_comments=False))
def test_no_html():
eq_('no html string', bleach.clean('no html string'))
def test_allowed_html():
eq_('an <strong>allowed</strong> tag',
bleach.clean('an <strong>allowed</strong> tag'))
eq_('another <em>good</em> tag',
bleach.clean('another <em>good</em> tag'))
def test_bad_html():
eq_('a <em>fixed tag</em>',
bleach.clean('a <em>fixed tag'))
def test_function_arguments():
TAGS = ['span', 'br']
ATTRS = {'span': ['style']}
eq_('a <br><span style="">test</span>',
bleach.clean('a <br/><span style="color:red">test</span>',
tags=TAGS, attributes=ATTRS))
def test_named_arguments():
ATTRS = {'a': ['rel', 'href']}
s = u'<a href="http://xx.com" rel="alternate">xx.com</a>'
eq_('<a href="http://xx.com">xx.com</a>', bleach.clean(s))
eq_(s, bleach.clean(s, attributes=ATTRS))
def test_disallowed_html():
eq_('a <script>safe()</script> test',
bleach.clean('a <script>safe()</script> test'))
eq_('a <style>body{}</style> test',
bleach.clean('a <style>body{}</style> test'))
def test_bad_href():
eq_('<em>no link</em>',
bleach.clean('<em href="fail">no link</em>'))
def test_bare_entities():
eq_('an & entity', bleach.clean('an & entity'))
eq_('an < entity', bleach.clean('an < entity'))
eq_('tag < <em>and</em> entity',
bleach.clean('tag < <em>and</em> entity'))
eq_('&', bleach.clean('&'))
def test_escaped_entities():
s = u'<em>strong</em>'
eq_(s, bleach.clean(s))
def test_serializer():
s = u'<table></table>'
eq_(s, bleach.clean(s, tags=['table']))
eq_(u'test<table></table>', bleach.linkify(u'<table>test</table>'))
eq_(u'<p>test</p>', bleach.clean(u'<p>test</p>', tags=['p']))
def test_no_href_links():
s = u'<a name="anchor">x</a>'
eq_(s, bleach.linkify(s))
eq_(s, bleach.linkify(s, nofollow=False))
def test_weird_strings():
s = '</3'
eq_(bleach.clean(s), '')
def test_xml_render():
parser = html5lib.HTMLParser()
eq_(bleach._render(parser.parseFragment('')), '')
def test_stripping():
eq_('a test <em>with</em> <b>html</b> tags',
bleach.clean('a test <em>with</em> <b>html</b> tags', strip=True))
eq_('a test <em>with</em> <b>html</b> tags',
bleach.clean('a test <em>with</em> <img src="http://example.com/"> '
'<b>html</b> tags', strip=True))
s = '<p><a href="http://example.com/">link text</a></p>'
eq_('<p>link text</p>', bleach.clean(s, tags=['p'], strip=True))
s = '<p><span>multiply <span>nested <span>text</span></span></span></p>'
eq_('<p>multiply nested text</p>', bleach.clean(s, tags=['p'], strip=True))
s = ('<p><a href="http://example.com/"><img src="http://example.com/">'
'</a></p>')
eq_('<p><a href="http://example.com/"></a></p>',
bleach.clean(s, tags=['p', 'a'], strip=True))
def test_allowed_styles():
ATTR = ['style']
STYLE = ['color']
blank = '<b style=""></b>'
s = '<b style="color: blue;"></b>'
eq_(blank, bleach.clean('<b style="top:0"></b>', attributes=ATTR))
eq_(s, bleach.clean(s, attributes=ATTR, styles=STYLE))
eq_(s, bleach.clean('<b style="top: 0; color: blue;"></b>',
attributes=ATTR, styles=STYLE))
def test_idempotent():
"""Make sure that applying the filter twice doesn't change anything."""
dirty = u'<span>invalid & </span> < extra http://link.com<em>'
clean = bleach.clean(dirty)
eq_(clean, bleach.clean(clean))
linked = bleach.linkify(dirty)
eq_(linked, bleach.linkify(linked))
def test_lowercase_html():
"""We should output lowercase HTML."""
dirty = u'<EM CLASS="FOO">BAR</EM>'
clean = u'<em class="FOO">BAR</em>'
eq_(clean, bleach.clean(dirty, attributes=['class']))
def test_wildcard_attributes():
ATTR = {
'*': ['id'],
'img': ['src'],
}
TAG = ['img', 'em']
dirty = (u'both <em id="foo" style="color: black">can</em> have '
u'<img id="bar" src="foo"/>')
clean = u'both <em id="foo">can</em> have <img id="bar" src="foo">'
eq_(clean, bleach.clean(dirty, tags=TAG, attributes=ATTR))
def test_sarcasm():
"""Jokes should crash.<sarcasm/>"""
dirty = u'Yeah right <sarcasm/>'
clean = u'Yeah right <sarcasm/>'
eq_(clean, bleach.clean(dirty))
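def test_clean_default_escapes():
    """A minimal sketch in the style of the suite above (not an original test):
    with no arguments, disallowed tags are escaped rather than stripped."""
    eq_('&lt;div&gt;x&lt;/div&gt;', bleach.clean('<div>x</div>'))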
|
agpl-3.0
|
blueyed/ycmd
|
ycmd/request_validation.py
|
29
|
2422
|
#!/usr/bin/env python
#
# Copyright (C) 2014 Google Inc.
#
# This file is part of YouCompleteMe.
#
# YouCompleteMe is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# YouCompleteMe is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with YouCompleteMe. If not, see <http://www.gnu.org/licenses/>.
from ycmd.responses import ServerError
# Throws an exception if request doesn't have all the required fields.
# TODO: Accept a request_type param so that we can also verify missing
# command_arguments and completer_target fields if necessary.
def EnsureRequestValid( request_json ):
required_fields = set(
[ 'line_num', 'column_num', 'filepath', 'file_data' ] )
missing = set( x for x in required_fields if x not in request_json )
if 'filepath' not in missing and 'file_data' not in missing:
missing.update( _MissingFieldsForFileData( request_json ) )
if not missing:
return True
message = '\n'.join( _FieldMissingMessage( field ) for field in missing )
raise ServerError( message )
def _FieldMissingMessage( field ):
return 'Request missing required field: {0}'.format( field )
def _FilepathInFileDataSpec( request_json ):
return 'file_data["{0}"]'.format( request_json[ 'filepath' ] )
def _SingleFileDataFieldSpec( request_json, field ):
return '{0}["{1}"]'.format( _FilepathInFileDataSpec( request_json ), field )
def _MissingFieldsForFileData( request_json ):
missing = set()
data_for_file = request_json[ 'file_data' ].get( request_json[ 'filepath' ] )
if data_for_file:
required_data = [ 'filetypes', 'contents' ]
for required in required_data:
if required not in data_for_file:
missing.add( _SingleFileDataFieldSpec( request_json, required ) )
filetypes = data_for_file.get( 'filetypes', [] )
if not filetypes:
missing.add( '{0}[0]'.format(
_SingleFileDataFieldSpec( request_json, 'filetypes' ) ) )
else:
missing.add( _FilepathInFileDataSpec( request_json ) )
return missing
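# Illustrative sketch (hypothetical request data, not part of ycmd): the
# smallest request that passes validation carries all four top-level fields
# plus non-empty 'filetypes' and 'contents' for the named filepath:
#   EnsureRequestValid( {
#     'line_num': 1, 'column_num': 1, 'filepath': '/tmp/foo.py',
#     'file_data': { '/tmp/foo.py': { 'filetypes': [ 'python' ],
#                                     'contents': 'pass' } } } )  # -> True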
|
gpl-3.0
|
nolanliou/tensorflow
|
tensorflow/python/kernel_tests/sparse_conditional_accumulator_test.py
|
132
|
22955
|
# Copyright 2015 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import time
import numpy as np
from tensorflow.python.framework import constant_op
from tensorflow.python.framework import dtypes as dtypes_lib
from tensorflow.python.framework import errors_impl
from tensorflow.python.framework import ops
from tensorflow.python.framework import tensor_shape
from tensorflow.python.ops import array_ops
from tensorflow.python.ops import data_flow_ops
from tensorflow.python.platform import test
def _indexedslice(x, noshape=False):
x = np.array(x)
dense_shape = x.shape
ndim = len(dense_shape)
indices = np.where(np.sum(x, tuple(range(1, ndim))))[0]
values = x[indices]
if noshape:
dense_shape = None
return ops.IndexedSlices(
indices=indices.tolist(), values=values, dense_shape=dense_shape)
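# A small sketch (not part of the original tests): _indexedslice keeps only
# rows whose entries sum to a nonzero value, so a row like [1, -1] would be
# dropped even though it is not all zeros.
def _indexedslice_demo():
  slices = _indexedslice(np.array([[1, 0], [0, 0]]))
  assert slices.indices == [0]           # only the nonzero row survives
  assert slices.dense_shape == (2, 2)    # shape of the original dense array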
class IndexedSlicesConditionalAccumulatorTest(test.TestCase):
def _assertEqual_indexedslices(self, expected_tensor, result):
self.assertAllEqual(expected_tensor.indices, result.indices)
self.assertAllEqual(expected_tensor.values, result.values)
if (result.dense_shape is not None and
expected_tensor.dense_shape is not None):
self.assertAllEqual(expected_tensor.dense_shape, result.dense_shape)
def _assertEqual_nparray(self, expected_array, result, sess):
expected_tensor = _indexedslice(expected_array)
self._assertEqual_indexedslices(expected_tensor, result)
def testConstructor(self):
with ops.Graph().as_default():
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q")
self.assertTrue(isinstance(q.accumulator_ref, ops.Tensor))
self.assertProtoEquals("""
name:'Q' op:'SparseConditionalAccumulator'
attr { key: 'dtype' value { type: DT_FLOAT } }
attr { key: 'shape' value { shape { unknown_rank: true} } }
attr { key: 'container' value { s: '' } }
attr { key: 'shared_name' value { s: '' } }
""", q.accumulator_ref.op.node_def)
def testConstructorWithShape(self):
with ops.Graph().as_default():
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32,
name="Q",
shape=tensor_shape.TensorShape([1, 5, 2, 8]))
self.assertTrue(isinstance(q.accumulator_ref, ops.Tensor))
self.assertProtoEquals("""
name:'Q' op:'SparseConditionalAccumulator'
attr { key: 'dtype' value { type: DT_FLOAT } }
attr { key: 'shape' value { shape { dim {size: 1 }
dim {size: 5 }
dim {size: 2 }
dim {size: 8 }
} } }
attr { key: 'container' value { s: '' } }
attr { key: 'shared_name' value { s: '' } }
""", q.accumulator_ref.op.node_def)
def testAccumulatorSizeEmpty(self):
with self.test_session():
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q")
self.assertEqual(q.num_accumulated().eval(), 0)
def testAccumulatorSetGlobalStep(self):
with self.test_session():
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=tensor_shape.TensorShape([1]))
set_global_step_op = q.set_global_step(1)
set_global_step_op.run()
def testAccumulatorApplyGradFloat32(self):
with self.test_session():
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=tensor_shape.TensorShape([3, 3]))
accum_op = q.apply_indexed_slices_grad(
ops.IndexedSlices(
indices=[0, 2],
values=np.array([[0, 0, 1], [3, 0, 4]]).astype(np.float32)))
accum_op.run()
self.assertEqual(q.num_accumulated().eval(), 1)
def testDtypes(self):
with self.test_session() as sess:
dtypes = [dtypes_lib.float16, dtypes_lib.float32, dtypes_lib.float64]
for i in range(len(dtypes)):
dtype = dtypes[i]
q = data_flow_ops.SparseConditionalAccumulator(
dtype, shape=tensor_shape.TensorShape([3, 3, 3]))
elems = np.arange(2)
sum_elems = np.zeros([3, 3, 3]).astype(dtype.as_numpy_dtype)
for e in elems:
mat_to_add = np.zeros([3, 3, 3]).astype(dtype.as_numpy_dtype)
mat_to_add[i, i, i] = e + 1
sum_elems += mat_to_add
t = _indexedslice(mat_to_add)
q.apply_indexed_slices_grad(t).run()
result = sess.run(q.take_indexed_slices_grad(1))
self._assertEqual_nparray(sum_elems / len(elems), result, sess)
def testAccumulatorMultipleAccumulators(self):
with self.test_session() as sess:
q_f32_0 = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=tensor_shape.TensorShape([2, 2]))
q_f32_1 = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=tensor_shape.TensorShape([2, 2]))
q_f16_0 = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float16, name="Q", shape=tensor_shape.TensorShape([2, 2]))
q_f16_1 = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float16, name="Q", shape=tensor_shape.TensorShape([2, 2]))
accums = [q_f16_0, q_f16_1, q_f32_0, q_f32_1]
elems = [[[1, 0], [0, 0]], [[0, 1], [0, 0]], [[0, 0], [1, 0]], [[0, 0],
[0, 1]]]
expected_tensors = []
for i in range(len(accums)):
tensor_to_add = np.array(elems[i]).astype(accums[i]
.dtype.as_numpy_dtype)
expected_tensor = _indexedslice(tensor_to_add)
expected_tensors.append(expected_tensor)
st = _indexedslice(tensor_to_add)
accums[i].apply_indexed_slices_grad(st).run()
for i in range(len(accums)):
result = sess.run(accums[i].take_indexed_slices_grad(1))
self._assertEqual_indexedslices(expected_tensors[i], result)
def testAccumulatorTakeGrad(self):
with self.test_session() as sess:
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=())
grad_indexed_slices = ops.IndexedSlices(
indices=[0, 1], values=np.array([[1, 0], [0, 2]]).astype(np.float32))
accum_op = q.apply_indexed_slices_grad(grad_indexed_slices)
accum_op.run()
accum_op = q.apply_grad([0, 2],
np.array([[0, 1], [3, 0]]).astype(np.float32),
[3, 2])
accum_op.run()
takeg_t = q.take_indexed_slices_grad(1)
val = sess.run(takeg_t)
self.assertAllEqual(val.indices, [0, 1, 2])
self.assertAllEqual(val.values, [[0.5, 0.5], [0, 2], [3, 0]])
self.assertAllEqual(val.dense_shape, [-1, 2])
def testAccumulatorRepeatedTakeGrad(self):
with self.test_session() as sess:
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=())
grad_indexed_slices = ops.IndexedSlices(
indices=[0, 1], values=np.array([[1, 0], [0, 2]]).astype(np.float32))
accum_op = q.apply_indexed_slices_grad(grad_indexed_slices, local_step=0)
accum_op.run()
accum_op = q.apply_grad(
[0, 2],
np.array([[0, 1], [3, 0]]).astype(np.float32), [3, 2],
local_step=0)
accum_op.run()
takeg_t = q.take_indexed_slices_grad(1)
val = sess.run(takeg_t)
self.assertAllEqual(val.indices, [0, 1, 2])
self.assertAllEqual(val.values, [[0.5, 0.5], [0, 2], [3, 0]])
self.assertAllEqual(val.dense_shape, [-1, 2])
grad_indexed_slices = ops.IndexedSlices(
indices=[0, 1],
values=np.array([[10, 0], [0, 20]]).astype(np.float32))
accum_op = q.apply_indexed_slices_grad(grad_indexed_slices, local_step=1)
accum_op.run()
accum_op = q.apply_grad(
[0, 2],
np.array([[0, 10], [30, 0]]).astype(np.float32), [3, 2],
local_step=1)
accum_op.run()
takeg_t = q.take_indexed_slices_grad(1)
val = sess.run(takeg_t)
self.assertAllEqual(val.indices, [0, 1, 2])
self.assertAllEqual(val.values, [[5, 5], [0, 20], [30, 0]])
self.assertAllEqual(val.dense_shape, [-1, 2])
def testParallelApplyGrad(self):
with self.test_session() as sess:
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=tensor_shape.TensorShape([2, 2]))
elems = [10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 70.0, 80.0, 90.0, 100.0]
accum_ops = []
for x in elems:
x = _indexedslice(np.array([[x, 0], [0, x]]).astype(np.float32))
accum_ops.append(q.apply_indexed_slices_grad(x, local_step=0))
takeg_t = q.take_indexed_slices_grad(1)
def apply_indexed_slices_grad(accum_op):
sess.run(accum_op)
threads = [
self.checkedThread(
target=apply_indexed_slices_grad, args=(o,)) for o in accum_ops
]
for thread in threads:
thread.start()
for thread in threads:
thread.join()
val = sess.run(takeg_t)
expected_val = sum(elems) / len(elems)
self._assertEqual_nparray(
np.array([[expected_val, 0], [0, expected_val]]).astype(np.float32),
val, sess)
def testParallelTakeGrad(self):
with self.test_session() as sess:
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=tensor_shape.TensorShape([2, 2]))
elems = [e + 1 for e in range(10)]
accum_ops = []
for e in elems:
v = _indexedslice(np.array([[0, 0], [e, 0]]).astype(np.float32))
accum_ops.append(q.apply_indexed_slices_grad(v, local_step=e - 1))
takeg_t = q.take_indexed_slices_grad(1)
results = []
def apply_indexed_slices_grad():
for accum_op in accum_ops:
time.sleep(1.0)
sess.run(accum_op)
apply_indexed_slices_grad_thread = self.checkedThread(
target=apply_indexed_slices_grad)
def take_grad():
t = sess.run(takeg_t)
results.append(t)
threads = [self.checkedThread(target=take_grad) for _ in range(10)]
for thread in threads:
thread.start()
apply_indexed_slices_grad_thread.start()
for thread in threads:
thread.join()
apply_indexed_slices_grad_thread.join()
for i in range(len(accum_ops)):
self._assertEqual_nparray(
np.array([[0, 0], [elems[i], 0]]), results[i], sess)
def testAccumulatorApplyAndBlockingTake(self):
with self.test_session() as sess:
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=tensor_shape.TensorShape([2, 2]))
elems = [10.0, 20.0, 30.0]
elems_ave = sum(elems) / len(elems)
accum_ops = []
for x in elems:
x = _indexedslice(np.array([[0, x], [0, 0]]).astype(np.float32))
accum_ops.append(q.apply_indexed_slices_grad(x, local_step=0))
takeg_t = q.take_indexed_slices_grad(3)
results = []
def apply_indexed_slices_grad():
for accum_op in accum_ops:
sess.run(accum_op)
def take_grad():
results.append(sess.run(takeg_t))
accum_thread = self.checkedThread(target=apply_indexed_slices_grad)
takeg_thread = self.checkedThread(target=take_grad)
accum_thread.start()
takeg_thread.start()
accum_thread.join()
takeg_thread.join()
self._assertEqual_nparray([[0, elems_ave], [0, 0]], results[0], sess)
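  # Helper for testAccumulatorCancel below: blocks on a take_grad op and
  # expects the pending operation to be cancelled when the session closes.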
def _blocking_takeg(self, sess, takeg_op):
with self.assertRaisesOpError("was cancelled"):
sess.run(takeg_op)
def testAccumulatorCancel(self):
with self.test_session() as sess:
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32,
name="Q",
shape=tensor_shape.TensorShape([1, 2, 3]))
takeg_t = q.take_indexed_slices_grad(1)
takeg_thread = self.checkedThread(
self._blocking_takeg, args=(sess, takeg_t))
takeg_thread.start()
time.sleep(1.0)
sess.close() # Will cancel blocked operation
takeg_thread.join()
def testNonVectorIndices(self):
with self.test_session():
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=tensor_shape.TensorShape([3, 3]))
with self.assertRaisesRegexp(
errors_impl.InvalidArgumentError,
"Input indices should be vector but received shape:"):
q.apply_grad(
grad_indices=[[0, 1], [1, 0]],
grad_values=np.array([1, 2]).astype(np.float32)).run()
def testZeroDimensionValues(self):
with self.test_session():
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=tensor_shape.TensorShape([3, 3]))
with self.assertRaisesRegexp(errors_impl.InvalidArgumentError,
"Values cannot be 0-dimensional."):
q.apply_grad(
grad_indices=[0], grad_values=np.array(1).astype(np.float32)).run()
def testWrongNonEmptyInputValues(self):
with self.test_session():
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=tensor_shape.TensorShape([3, 3]))
with self.assertRaisesRegexp(errors_impl.InvalidArgumentError,
" non-empty input values, got "):
q.apply_grad(
grad_indices=[0, 1],
grad_values=np.array([[0, 1, 1]]).astype(np.float32)).run()
def testDynamicNonVectorIndices(self):
with self.test_session() as sess:
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=tensor_shape.TensorShape([3, 3]))
x_indices = array_ops.placeholder(dtypes_lib.int64)
x_values = array_ops.placeholder(dtypes_lib.float32)
accum_op = q.apply_grad(grad_indices=x_indices, grad_values=x_values)
with self.assertRaisesRegexp(
errors_impl.InvalidArgumentError,
"Input indices should be vector but received shape:"):
sess.run(accum_op,
feed_dict={
x_indices: [[0, 1], [1, 0]],
x_values: np.array([1, 2]).astype(np.float32)
})
def testDynamicWrongNonEmptyInputValues(self):
with self.test_session() as sess:
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=tensor_shape.TensorShape([3, 3]))
x_indices = array_ops.placeholder(dtypes_lib.int64)
x_values = array_ops.placeholder(dtypes_lib.float32)
accum_op = q.apply_grad(grad_indices=x_indices, grad_values=x_values)
with self.assertRaisesRegexp(errors_impl.InvalidArgumentError,
" non-empty input values, got "):
sess.run(accum_op,
feed_dict={
x_indices: [0, 1],
x_values: np.array([[0, 1, 1]]).astype(np.float32)
})
def testEmptyShapeApply(self):
with self.test_session():
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=tensor_shape.TensorShape([]))
with self.assertRaisesRegexp(errors_impl.InvalidArgumentError,
"Input indices should be vector"):
q.apply_grad(grad_indices=0, grad_values=[1.0], grad_shape=[]).run()
with self.assertRaisesRegexp(errors_impl.InvalidArgumentError,
"Input indices should be vector"):
q.apply_grad(grad_indices=0, grad_values=[1.0]).run()
with self.assertRaisesRegexp(errors_impl.InvalidArgumentError,
"Values cannot be 0-dimensional."):
q.apply_grad(grad_indices=[0], grad_values=1.0, grad_shape=[]).run()
with self.assertRaisesRegexp(errors_impl.InvalidArgumentError,
"Values cannot be 0-dimensional."):
q.apply_grad(grad_indices=[0], grad_values=1.0).run()
      # The correct way to apply a scalar: wrap indices and values in lists.
q.apply_grad(grad_indices=[0], grad_values=[1.0], grad_shape=[]).run()
q.apply_grad(grad_indices=[0], grad_values=[1.0]).run()
def testValidateShape(self):
with self.test_session() as sess:
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=[2, 2, None])
# Provided shape has wrong rank
with self.assertRaisesRegexp(
errors_impl.InvalidArgumentError,
"Shape mismatch: expected shape rank at least 3, got 2"):
q.apply_grad(
grad_indices=[0],
grad_values=np.array([[1, 2]]).astype(np.float32),
grad_shape=[2, 2]).run()
# Provided shape has wrong dim
with self.assertRaisesRegexp(
errors_impl.InvalidArgumentError,
"Shape mismatch: expected shape dim 1 to be 2, got 3"):
q.apply_grad(
grad_indices=[0],
grad_values=np.array([[[1, 2], [3, 4], [5, 6]]]).astype(np.float32),
grad_shape=[2, 3, 2]).run()
      # Indices exceed the accumulator's shape limits
with self.assertRaisesRegexp(
errors_impl.InvalidArgumentError,
"Shape mismatch: index of slice 0 exceeded limits of shape;"
" index is 3 exceeded 2"):
q.apply_grad(
grad_indices=[3],
grad_values=np.array([[[1, 2], [3, 4]]]).astype(np.float32)).run()
# Values' rank does not match shape
with self.assertRaisesRegexp(
errors_impl.InvalidArgumentError,
"Shape mismatch: expected values rank at least 3, got 2"):
q.apply_grad(
grad_indices=[0, 1],
grad_values=np.array([[1, 2], [3, 4]]).astype(np.float32)).run()
# Values' dim does not match shape
with self.assertRaisesRegexp(
errors_impl.InvalidArgumentError,
"Shape mismatch: expected values dim 1 to be 2, got 3"):
q.apply_grad(
grad_indices=[0],
grad_values=np.array(
[[[1, 2], [3, 4], [5, 6]]]).astype(np.float32)).run()
      # The first successful gradient application creates additional
      # constraints: the shape will additionally be constrained to
      # [None, 2, 2, 2] hereafter.
q.apply_grad(
grad_indices=[0],
grad_values=np.array(
[[[[1, 2], [3, 4]], [[5, 6], [7, 8]]]]).astype(np.float32)).run()
# Values' rank does not match accumulated gradient
with self.assertRaisesRegexp(
errors_impl.InvalidArgumentError,
"Shape mismatch: expected values rank 4, got 3"):
q.apply_grad(
grad_indices=[0],
grad_values=np.array([[[1, 2], [3, 4]]]).astype(np.float32)).run()
# Values' dim does not match accumulated gradient
with self.assertRaisesRegexp(
errors_impl.InvalidArgumentError,
"Shape mismatch: expected values dim 3 to be 2, got 3"):
q.apply_grad(
grad_indices=[0],
grad_values=np.array(
[[[[1, 2, 3], [4, 5, 6]], [[7, 8, 9], [10, 11, 12]]]]).astype(
np.float32)).run()
# After take grad, constraints on accumulated gradient are removed
sess.run(q.take_grad(1))
      # The first successful gradient application imposes new constraints.
      # Hereafter, the shape will additionally be constrained to [None, 2, 2, 3].
q.apply_grad(
grad_indices=[0],
grad_values=np.array(
[[[[1, 2, 3], [4, 5, 6]], [[7, 8, 9], [10, 11, 12]]]]).astype(
np.float32),
local_step=1).run()
with self.assertRaisesRegexp(
errors_impl.InvalidArgumentError,
"Shape mismatch: expected values dim 3 to be 3, got 2"):
q.apply_grad(
grad_indices=[0],
grad_values=np.array(
[[[[1, 2], [3, 4]], [[5, 6], [7, 8]]]]).astype(np.float32),
local_step=1).run()
def testReturnShape(self):
with self.test_session() as sess:
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=[2, None])
q.apply_grad(
grad_indices=[0],
grad_values=np.array(
[[[[1, 2], [3, 4]], [[5, 6], [7, 8]]]]).astype(np.float32)).run()
val = sess.run(q.take_indexed_slices_grad(1))
self.assertAllEqual(val.dense_shape, [2, 2, 2, 2])
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=[None, 2])
q.apply_grad(
grad_indices=[0],
grad_values=np.array(
[[[[1, 2, 3], [4, 5, 6]], [[7, 8, 9], [10, 11, 12]]]]).astype(
np.float32)).run()
val = sess.run(q.take_indexed_slices_grad(1))
self.assertAllEqual(val.dense_shape, [-1, 2, 2, 3])
  def testApplyGradInt32IndicesAndShape(self):
with self.test_session() as sess:
q = data_flow_ops.SparseConditionalAccumulator(
dtypes_lib.float32, name="Q", shape=tensor_shape.TensorShape([3, 3]))
accum_op = q.apply_grad(
grad_indices=constant_op.constant(
[0, 2], dtype=dtypes_lib.int32),
grad_values=constant_op.constant(
[[0, 0, 1], [3, 0, 4]], dtype=dtypes_lib.float32),
grad_shape=constant_op.constant(
[3, 3], dtype=dtypes_lib.int32))
accum_op.run()
accum_op = q.apply_indexed_slices_grad(
ops.IndexedSlices(
indices=constant_op.constant(
[0, 2], dtype=dtypes_lib.int32),
values=constant_op.constant(
[[0, 0, 1], [3, 0, 4]], dtype=dtypes_lib.float32),
dense_shape=constant_op.constant(
[3, 3], dtype=dtypes_lib.int32)))
accum_op.run()
self.assertEqual(q.num_accumulated().eval(), 2)
val = sess.run(q.take_indexed_slices_grad(1))
self.assertAllEqual(val.indices, [0, 2])
self.assertAllEqual(val.values, [[0, 0, 1], [3, 0, 4]])
self.assertAllEqual(val.dense_shape, [3, 3])
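# A minimal standalone sketch (illustrative, not part of the original test
# suite) showing the averaging semantics exercised above. The imports are the
# same as for the tests; `_example_average` is a hypothetical helper.
def _example_average(sess):
  q = data_flow_ops.SparseConditionalAccumulator(
      dtypes_lib.float32, name="ExampleQ",
      shape=tensor_shape.TensorShape([2, 2]))
  # Apply two gradients to the same row; take_indexed_slices_grad(2) waits
  # for both and returns their element-wise mean at that row.
  sess.run(q.apply_grad(
      grad_indices=[0], grad_values=np.array([[1.0, 2.0]]).astype(np.float32)))
  sess.run(q.apply_grad(
      grad_indices=[0], grad_values=np.array([[3.0, 4.0]]).astype(np.float32)))
  return sess.run(q.take_indexed_slices_grad(2))  # values: [[2.0, 3.0]]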
if __name__ == "__main__":
test.main()
|
apache-2.0
|
bitfinder/thrift
|
contrib/fb303/py/fb303/FacebookBase.py
|
173
|
1917
|
#!/usr/bin/env python
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
import time
import FacebookService
import thrift.reflection.limited
from ttypes import fb_status
class FacebookBase(FacebookService.Iface):
def __init__(self, name):
self.name = name
self.alive = int(time.time())
self.counters = {}
  def getName(self):
    return self.name
  def getVersion(self):
    return ''
  def getStatus(self):
    return fb_status.ALIVE
def getCounters(self):
return self.counters
def resetCounter(self, key):
self.counters[key] = 0
  def getCounter(self, key):
    if key in self.counters:
      return self.counters[key]
    return 0
def incrementCounter(self, key):
self.counters[key] = self.getCounter(key) + 1
def setOption(self, key, value):
pass
def getOption(self, key):
return ""
def getOptions(self):
return {}
def aliveSince(self):
return self.alive
def getCpuProfile(self, duration):
return ""
def getLimitedReflection(self):
return thrift.reflection.limited.Service()
def reinitialize(self):
pass
def shutdown(self):
pass
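# A minimal usage sketch (illustrative, not part of the original module):
# instantiate the base service directly and exercise the counter API.
def _example_usage():
  service = FacebookBase('example_service')
  service.incrementCounter('requests')
  assert service.getCounter('requests') == 1
  assert service.getStatus() == fb_status.ALIVE
  return service.aliveSince()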
|
apache-2.0
|
tjsavage/full_nonrel_starter
|
django/templatetags/static.py
|
233
|
2149
|
from django import template
from django.utils.encoding import iri_to_uri
register = template.Library()
class PrefixNode(template.Node):
def __repr__(self):
return "<PrefixNode for %r>" % self.name
def __init__(self, varname=None, name=None):
if name is None:
raise template.TemplateSyntaxError(
"Prefix nodes must be given a name to return.")
self.varname = varname
self.name = name
@classmethod
def handle_token(cls, parser, token, name):
"""
Class method to parse prefix node and return a Node.
"""
tokens = token.contents.split()
if len(tokens) > 1 and tokens[1] != 'as':
raise template.TemplateSyntaxError(
"First argument in '%s' must be 'as'" % tokens[0])
if len(tokens) > 1:
varname = tokens[2]
else:
varname = None
return cls(varname, name)
@classmethod
def handle_simple(cls, name):
try:
from django.conf import settings
except ImportError:
prefix = ''
else:
prefix = iri_to_uri(getattr(settings, name, ''))
return prefix
def render(self, context):
prefix = self.handle_simple(self.name)
if self.varname is None:
return prefix
context[self.varname] = prefix
return ''
@register.tag
def get_static_prefix(parser, token):
"""
Populates a template variable with the static prefix,
``settings.STATIC_URL``.
Usage::
{% get_static_prefix [as varname] %}
Examples::
{% get_static_prefix %}
{% get_static_prefix as static_prefix %}
"""
return PrefixNode.handle_token(parser, token, "STATIC_URL")
@register.tag
def get_media_prefix(parser, token):
"""
    Populates a template variable with the media prefix,
``settings.MEDIA_URL``.
Usage::
{% get_media_prefix [as varname] %}
Examples::
{% get_media_prefix %}
{% get_media_prefix as media_prefix %}
"""
return PrefixNode.handle_token(parser, token, "MEDIA_URL")
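# Example template usage (illustrative; assumes STATIC_URL = '/static/' and
# MEDIA_URL = '/media/' in settings):
#
#   {% get_static_prefix as static_prefix %}
#   <img src="{{ static_prefix }}images/logo.png" alt="logo" />
#
#   {% get_media_prefix as media_prefix %}
#   <a href="{{ media_prefix }}docs/report.pdf">report</a>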
|
bsd-3-clause
|
michhar/flask-webapp-aml
|
env/Lib/site-packages/wtforms/i18n.py
|
142
|
2175
|
import os
def messages_path():
"""
    Determine the path to the 'messages' directory as reliably as possible.
"""
module_path = os.path.abspath(__file__)
locale_path = os.path.join(os.path.dirname(module_path), 'locale')
if not os.path.exists(locale_path):
locale_path = '/usr/share/locale'
return locale_path
def get_builtin_gnu_translations(languages=None):
"""
Get a gettext.GNUTranslations object pointing at the
included translation files.
:param languages:
A list of languages to try, in order. If omitted or None, then
gettext will try to use locale information from the environment.
"""
import gettext
return gettext.translation('wtforms', messages_path(), languages)
def get_translations(languages=None, getter=get_builtin_gnu_translations):
"""
Get a WTForms translation object which wraps a low-level translations object.
:param languages:
A sequence of languages to try, in order.
:param getter:
A single-argument callable which returns a low-level translations object.
"""
translations = getter(languages)
if hasattr(translations, 'ugettext'):
return DefaultTranslations(translations)
else:
# Python 3 has no ugettext/ungettext, so just return the translations object.
return translations
class DefaultTranslations(object):
"""
A WTForms translations object to wrap translations objects which use
ugettext/ungettext.
"""
def __init__(self, translations):
self.translations = translations
def gettext(self, string):
return self.translations.ugettext(string)
def ngettext(self, singular, plural, n):
return self.translations.ungettext(singular, plural, n)
class DummyTranslations(object):
"""
A translations object which simply returns unmodified strings.
This is typically used when translations are disabled or if no valid
translations provider can be found.
"""
def gettext(self, string):
return string
def ngettext(self, singular, plural, n):
if n == 1:
return singular
return plural
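# A minimal usage sketch (illustrative, not part of the original module):
# fall back to DummyTranslations when no message catalog can be found.
def _example_get_translations_with_fallback(languages=None):
    try:
        return get_translations(languages)
    except (IOError, OSError):  # raised by gettext when no catalog is found
        return DummyTranslations()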
|
mit
|