.. _embedding:
Embedding the interpreter
#########################
While pybind11 is mainly focused on extending Python using C++, it's also
possible to do the reverse: embed the Python interpreter into a C++ program.
All of the other documentation pages still apply here, so refer to them for
general pybind11 usage. This section will cover a few extra things required
for embedding.
Getting started
===============
A basic executable with an embedded interpreter can be created with just a few
lines of CMake and the ``pybind11::embed`` target, as shown below. For more
information, see :doc:`/compiling`.
.. code-block:: cmake
cmake_minimum_required(VERSION 3.4)
project(example)
find_package(pybind11 REQUIRED) # or `add_subdirectory(pybind11)`
add_executable(example main.cpp)
target_link_libraries(example PRIVATE pybind11::embed)
The essential structure of the ``main.cpp`` file looks like this:
.. code-block:: cpp
#include <pybind11/embed.h> // everything needed for embedding
namespace py = pybind11;
int main() {
py::scoped_interpreter guard{}; // start the interpreter and keep it alive
py::print("Hello, World!"); // use the Python API
}
The interpreter must be initialized before using any Python API, which includes
all the functions and classes in pybind11. The RAII guard class ``scoped_interpreter``
takes care of the interpreter lifetime. After the guard is destroyed, the interpreter
shuts down and clears its memory. No Python functions can be called after this.
Executing Python code
=====================
There are a few different ways to run Python code. One option is to use ``eval``,
``exec`` or ``eval_file``, as explained in :ref:`eval`. Here is a quick example in
the context of an executable with an embedded interpreter:
.. code-block:: cpp
#include <pybind11/embed.h>
namespace py = pybind11;
int main() {
py::scoped_interpreter guard{};
py::exec(R"(
kwargs = dict(name="World", number=42)
message = "Hello, {name}! The answer is {number}".format(**kwargs)
print(message)
)");
}
Alternatively, similar results can be achieved using pybind11's API (see
:doc:`/advanced/pycpp/index` for more details).
.. code-block:: cpp
#include <pybind11/embed.h>
namespace py = pybind11;
using namespace py::literals;
int main() {
py::scoped_interpreter guard{};
auto kwargs = py::dict("name"_a="World", "number"_a=42);
auto message = "Hello, {name}! The answer is {number}"_s.format(**kwargs);
py::print(message);
}
The two approaches can also be combined:
.. code-block:: cpp
#include <pybind11/embed.h>
#include <iostream>
namespace py = pybind11;
using namespace py::literals;
int main() {
py::scoped_interpreter guard{};
auto locals = py::dict("name"_a="World", "number"_a=42);
py::exec(R"(
message = "Hello, {name}! The answer is {number}".format(**locals())
)", py::globals(), locals);
auto message = locals["message"].cast<std::string>();
std::cout << message;
}
Importing modules
=================
Python modules can be imported using ``module_::import()``:
.. code-block:: cpp
py::module_ sys = py::module_::import("sys");
py::print(sys.attr("path"));
For convenience, the current working directory is included in ``sys.path`` when
embedding the interpreter. This makes it easy to import local Python files:
.. code-block:: python
"""calc.py located in the working directory"""
def add(i, j):
return i + j
.. code-block:: cpp
py::module_ calc = py::module_::import("calc");
py::object result = calc.attr("add")(1, 2);
int n = result.cast<int>();
assert(n == 3);
Modules can be reloaded using ``module_::reload()`` if the source is modified e.g.
by an external process. This can be useful in scenarios where the application
imports a user defined data processing script which needs to be updated after
changes by the user. Note that this function does not reload modules recursively.
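For illustration, reloading the ``calc`` module from the example above after its
source has been edited on disk might look like this (a minimal sketch):

.. code-block:: cpp

    py::module_ calc = py::module_::import("calc");
    py::object result = calc.attr("add")(1, 2);

    // ... calc.py is edited on disk by an external process ...

    calc.reload();                        // re-execute the module's source
    result = calc.attr("add")(1, 2);      // now uses the updated definition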
.. _embedding_modules:
Adding embedded modules
=======================
Embedded binary modules can be added using the ``PYBIND11_EMBEDDED_MODULE`` macro.
Note that the definition must be placed at global scope. They can be imported
like any other module.
.. code-block:: cpp
#include <pybind11/embed.h>
namespace py = pybind11;
PYBIND11_EMBEDDED_MODULE(fast_calc, m) {
// `m` is a `py::module_` which is used to bind functions and classes
m.def("add", [](int i, int j) {
return i + j;
});
}
int main() {
py::scoped_interpreter guard{};
auto fast_calc = py::module_::import("fast_calc");
auto result = fast_calc.attr("add")(1, 2).cast<int>();
assert(result == 3);
}
Unlike extension modules where only a single binary module can be created, on
the embedded side an unlimited number of modules can be added using multiple
``PYBIND11_EMBEDDED_MODULE`` definitions (as long as they have unique names).
These modules are added to Python's list of builtins, so they can also be
imported in pure Python files loaded by the interpreter. Everything interacts
naturally:
.. code-block:: python
"""py_module.py located in the working directory"""
import cpp_module
a = cpp_module.a
b = a + 1
.. code-block:: cpp
#include <pybind11/embed.h>
namespace py = pybind11;
using namespace py::literals;
PYBIND11_EMBEDDED_MODULE(cpp_module, m) {
m.attr("a") = 1;
}
int main() {
py::scoped_interpreter guard{};
auto py_module = py::module_::import("py_module");
auto locals = py::dict("fmt"_a="{} + {} = {}", **py_module.attr("__dict__"));
assert(locals["a"].cast<int>() == 1);
assert(locals["b"].cast<int>() == 2);
py::exec(R"(
c = a + b
message = fmt.format(a, b, c)
)", py::globals(), locals);
assert(locals["c"].cast<int>() == 3);
assert(locals["message"].cast<std::string>() == "1 + 2 = 3");
}
Interpreter lifetime
====================
The Python interpreter shuts down when ``scoped_interpreter`` is destroyed. After
this, creating a new instance will restart the interpreter. Alternatively, the
``initialize_interpreter`` / ``finalize_interpreter`` pair of functions can be used
to directly set the state at any time.
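A minimal sketch of the function-based approach (the surrounding program
structure is only illustrative):

.. code-block:: cpp

    #include <pybind11/embed.h>
    namespace py = pybind11;

    int main() {
        py::initialize_interpreter();   // start the interpreter
        py::print("first run");
        py::finalize_interpreter();     // shut it down and clear its state

        py::initialize_interpreter();   // restart it later in the program
        py::print("second run");
        py::finalize_interpreter();
    }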
Modules created with pybind11 can be safely re-initialized after the interpreter
has been restarted. However, this may not apply to third-party extension modules.
The issue is that Python itself cannot completely unload extension modules and
there are several caveats with regard to interpreter restarting. In short, not
all memory may be freed, either due to Python reference cycles or user-created
global data. All the details can be found in the CPython documentation.
.. warning::
Creating two concurrent ``scoped_interpreter`` guards is a fatal error. So is
calling ``initialize_interpreter`` for a second time after the interpreter
has already been initialized.
Do not use the raw CPython API functions ``Py_Initialize`` and
``Py_Finalize`` as these do not properly handle the lifetime of
pybind11's internal data.
Sub-interpreter support
=======================
Creating multiple copies of ``scoped_interpreter`` is not possible because it
represents the main Python interpreter. Sub-interpreters are something different
and they do permit the existence of multiple interpreters. This is an advanced
feature of the CPython API and should be handled with care. pybind11 does not
currently offer a C++ interface for sub-interpreters, so refer to the CPython
documentation for all the details regarding this feature.
We'll just mention a couple of caveats of the sub-interpreter support in pybind11:
1. Sub-interpreters will not receive independent copies of embedded modules.
Instead, these are shared and modifications in one interpreter may be
reflected in another.
2. Managing multiple threads, multiple interpreters and the GIL can be
challenging and there are several caveats here, even within the pure
CPython API (please refer to the Python docs for details). As for
pybind11, keep in mind that ``gil_scoped_release`` and ``gil_scoped_acquire``
do not take sub-interpreters into account.
Miscellaneous
#############
.. _macro_notes:
General notes regarding convenience macros
==========================================
pybind11 provides a few convenience macros such as
:func:`PYBIND11_DECLARE_HOLDER_TYPE` and ``PYBIND11_OVERRIDE_*``. Since these
are "just" macros that are evaluated in the preprocessor (which has no concept
of types), they *will* get confused by commas in a template argument; for
example, consider:
.. code-block:: cpp
PYBIND11_OVERRIDE(MyReturnType<T1, T2>, Class<T3, T4>, func)
The C preprocessor interprets this as five arguments (with new
arguments beginning after each comma) rather than three. To get around this,
there are two alternatives: you can use a type alias, or you can wrap the type
using the ``PYBIND11_TYPE`` macro:
.. code-block:: cpp
// Version 1: using a type alias
using ReturnType = MyReturnType<T1, T2>;
using ClassType = Class<T3, T4>;
PYBIND11_OVERRIDE(ReturnType, ClassType, func);
// Version 2: using the PYBIND11_TYPE macro:
PYBIND11_OVERRIDE(PYBIND11_TYPE(MyReturnType<T1, T2>),
PYBIND11_TYPE(Class<T3, T4>), func)
The ``PYBIND11_MAKE_OPAQUE`` macro does *not* require the above workarounds.
.. _gil:
Global Interpreter Lock (GIL)
=============================
When calling a C++ function from Python, the GIL is always held.
The classes :class:`gil_scoped_release` and :class:`gil_scoped_acquire` can be
used to acquire and release the global interpreter lock in the body of a C++
function call. In this way, long-running C++ code can be parallelized using
multiple Python threads. Taking :ref:`overriding_virtuals` as an example, this
could be realized as follows (important changes highlighted):
.. code-block:: cpp
:emphasize-lines: 8,9,31,32
class PyAnimal : public Animal {
public:
/* Inherit the constructors */
using Animal::Animal;
/* Trampoline (need one for each virtual function) */
std::string go(int n_times) {
/* Acquire GIL before calling Python code */
py::gil_scoped_acquire acquire;
PYBIND11_OVERRIDE_PURE(
std::string, /* Return type */
Animal, /* Parent class */
go, /* Name of function */
n_times /* Argument(s) */
);
}
};
PYBIND11_MODULE(example, m) {
py::class_<Animal, PyAnimal> animal(m, "Animal");
animal
.def(py::init<>())
.def("go", &Animal::go);
py::class_<Dog>(m, "Dog", animal)
.def(py::init<>());
m.def("call_go", [](Animal *animal) -> std::string {
/* Release GIL before calling into (potentially long-running) C++ code */
py::gil_scoped_release release;
return call_go(animal);
});
}
The ``call_go`` wrapper can also be simplified using the ``call_guard`` policy
(see :ref:`call_policies`) which yields the same result:
.. code-block:: cpp
m.def("call_go", &call_go, py::call_guard<py::gil_scoped_release>());
Binding sequence data types, iterators, the slicing protocol, etc.
==================================================================
Please refer to the supplemental example for details.
.. seealso::
The file :file:`tests/test_sequences_and_iterators.cpp` contains a
complete example that shows how to bind a sequence data type, including
length queries (``__len__``), iterators (``__iter__``), the slicing
protocol and other kinds of useful operations.
Partitioning code over multiple extension modules
=================================================
It's straightforward to split binding code over multiple extension modules,
while referencing types that are declared elsewhere. Everything "just" works
without any special precautions. One exception to this rule occurs when
extending a type declared in another extension module. Recall the basic example
from Section :ref:`inheritance`.
.. code-block:: cpp
py::class_<Pet> pet(m, "Pet");
pet.def(py::init<const std::string &>())
.def_readwrite("name", &Pet::name);
py::class_<Dog>(m, "Dog", pet /* <- specify parent */)
.def(py::init<const std::string &>())
.def("bark", &Dog::bark);
Suppose now that ``Pet`` bindings are defined in a module named ``basic``,
whereas the ``Dog`` bindings are defined somewhere else. The challenge is of
course that the variable ``pet`` is not available anymore though it is needed
to indicate the inheritance relationship to the constructor of ``class_<Dog>``.
However, it can be acquired as follows:
.. code-block:: cpp
py::object pet = (py::object) py::module_::import("basic").attr("Pet");
py::class_<Dog>(m, "Dog", pet)
.def(py::init<const std::string &>())
.def("bark", &Dog::bark);
Alternatively, you can specify the base class as a template parameter option to
``class_``, which performs an automated lookup of the corresponding Python
type. Like the above code, however, this also requires invoking the ``import``
function once to ensure that the pybind11 binding code of the module ``basic``
has been executed:
.. code-block:: cpp
py::module_::import("basic");
py::class_<Dog, Pet>(m, "Dog")
.def(py::init<const std::string &>())
.def("bark", &Dog::bark);
Naturally, both methods will fail when there are cyclic dependencies.
Note that pybind11 code compiled with hidden-by-default symbol visibility (e.g.
via the command line flag ``-fvisibility=hidden`` on GCC/Clang), which is
required for proper pybind11 functionality, can interfere with the ability to
access types defined in another extension module. Working around this requires
manually exporting types that are accessed by multiple extension modules;
pybind11 provides a macro to do just this:
.. code-block:: cpp
class PYBIND11_EXPORT Dog : public Animal {
...
};
Note also that it is possible (although it would rarely be required) to share arbitrary
C++ objects between extension modules at runtime. Internal library data is shared
between modules using capsule machinery [#f6]_, which can also be utilized for
storing, modifying and accessing user-defined data. Note that an extension module
will "see" other extensions' data if and only if they were built with the same
pybind11 version. Consider the following example:
.. code-block:: cpp
auto data = reinterpret_cast<MyData *>(py::get_shared_data("mydata"));
if (!data)
data = static_cast<MyData *>(py::set_shared_data("mydata", new MyData(42)));
If the above snippet was used in several separately compiled extension modules,
the first one to be imported would create a ``MyData`` instance and associate
a ``"mydata"`` key with a pointer to it. Extensions that are imported later
would be then able to access the data behind the same pointer.
.. [#f6] https://docs.python.org/3/extending/extending.html#using-capsules
Module Destructors
==================
pybind11 does not provide an explicit mechanism to invoke cleanup code at
module destruction time. In rare cases where such functionality is required, it
is possible to emulate it using Python capsules or weak references with a
destruction callback.
.. code-block:: cpp
auto cleanup_callback = []() {
// perform cleanup here -- this function is called with the GIL held
};
m.add_object("_cleanup", py::capsule(cleanup_callback));
This approach has the potential downside that instances of classes exposed
within the module may still be alive when the cleanup callback is invoked
(whether this is acceptable will generally depend on the application).
Alternatively, the capsule may also be stashed within a type object, which
ensures that it is not called before all instances of that type have been
collected:
.. code-block:: cpp
auto cleanup_callback = []() { /* ... */ };
m.attr("BaseClass").attr("_cleanup") = py::capsule(cleanup_callback);
Both approaches also expose a potentially dangerous ``_cleanup`` attribute in
Python, which may be undesirable from an API standpoint (a premature explicit
call from Python might lead to undefined behavior). Yet another approach that
avoids this issue involves using a weak reference with a cleanup callback:
.. code-block:: cpp
// Register a callback function that is invoked when the BaseClass object is collected
py::cpp_function cleanup_callback(
[](py::handle weakref) {
// perform cleanup here -- this function is called with the GIL held
weakref.dec_ref(); // release weak reference
}
);
// Create a weak reference with a cleanup callback and initially leak it
(void) py::weakref(m.attr("BaseClass"), cleanup_callback).release();
.. note::
PyPy does not garbage collect objects when the interpreter exits. An alternative
approach (which also works on CPython) is to use the :py:mod:`atexit` module [#f7]_,
for example:
.. code-block:: cpp
auto atexit = py::module_::import("atexit");
atexit.attr("register")(py::cpp_function([]() {
// perform cleanup here -- this function is called with the GIL held
}));
.. [#f7] https://docs.python.org/3/library/atexit.html
Generating documentation using Sphinx
=====================================
Sphinx [#f4]_ has the ability to inspect the signatures and documentation
strings in pybind11-based extension modules to automatically generate beautiful
documentation in a variety of formats. The python_example repository [#f5]_ contains a
simple example project which uses this approach.
There are two potential gotchas when using this approach: first, make sure that
the resulting strings do not contain any :kbd:`TAB` characters, which break the
docstring parsing routines. You may want to use C++11 raw string literals,
which are convenient for multi-line comments. Conveniently, any excess
indentation will automatically be removed by Sphinx. However, for this to
work, it is important that all lines are indented consistently, i.e.:
.. code-block:: cpp
// ok
m.def("foo", &foo, R"mydelimiter(
The foo function
Parameters
----------
)mydelimiter");
// *not ok*
m.def("foo", &foo, R"mydelimiter(The foo function
Parameters
----------
)mydelimiter");
By default, pybind11 automatically generates and prepends a signature to the docstring of a function
registered with ``module_::def()`` and ``class_::def()``. Sometimes this
behavior is not desirable, because you want to provide your own signature or remove
the docstring completely to exclude the function from the Sphinx documentation.
The class ``options`` allows you to selectively suppress auto-generated signatures:
.. code-block:: cpp
PYBIND11_MODULE(example, m) {
py::options options;
options.disable_function_signatures();
m.def("add", [](int a, int b) { return a + b; }, "A function which adds two numbers");
}
Note that changes to the settings affect only function bindings created during the
lifetime of the ``options`` instance. When it goes out of scope at the end of the module's init function,
the default settings are restored to prevent unwanted side effects.
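As a sketch of this behavior, scoping the ``options`` instance restricts its
effect to the bindings created inside that scope (the ``sub`` binding is only
illustrative):

.. code-block:: cpp

    PYBIND11_MODULE(example, m) {
        {
            py::options options;
            options.disable_function_signatures();
            // no signature is prepended to this docstring
            m.def("add", [](int a, int b) { return a + b; }, "A function which adds two numbers");
        }   // `options` is destroyed here; the default settings are restored

        // this binding gets an auto-generated signature again
        m.def("sub", [](int a, int b) { return a - b; }, "A function which subtracts two numbers");
    }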
.. [#f4] http://www.sphinx-doc.org
.. [#f5] http://github.com/pybind/python_example
.. _avoiding-cpp-types-in-docstrings:
Avoiding C++ types in docstrings
================================
Docstrings are generated at the time of the declaration, e.g. when ``.def(...)`` is called.
At this point parameter and return types should be known to pybind11.
If a custom type is not exposed yet through a ``py::class_`` constructor or a custom type caster,
its C++ type name will be used instead to generate the signature in the docstring:
.. code-block:: text
| __init__(...)
| __init__(self: example.Foo, arg0: ns::Bar) -> None
^^^^^^^
This limitation can be circumvented by ensuring that C++ classes are registered with pybind11
before they are used as a parameter or return type of a function:
.. code-block:: cpp
PYBIND11_MODULE(example, m) {
auto pyFoo = py::class_<ns::Foo>(m, "Foo");
auto pyBar = py::class_<ns::Bar>(m, "Bar");
pyFoo.def(py::init<const ns::Bar&>());
pyBar.def(py::init<const ns::Foo&>());
}
Utilities
#########
Using Python's print function in C++
====================================
The usual way to write output in C++ is using ``std::cout`` while in Python one
would use ``print``. Since these methods use different buffers, mixing them can
lead to output order issues. To resolve this, pybind11 modules can use the
:func:`py::print` function which writes to Python's ``sys.stdout`` for consistency.
Python's ``print`` function is replicated in the C++ API including optional
keyword arguments ``sep``, ``end``, ``file``, ``flush``. Everything works as
expected in Python:
.. code-block:: cpp
py::print(1, 2.0, "three"); // 1 2.0 three
py::print(1, 2.0, "three", "sep"_a="-"); // 1-2.0-three
auto args = py::make_tuple("unpacked", true);
py::print("->", *args, "end"_a="<-"); // -> unpacked True <-
.. _ostream_redirect:
Capturing standard output from ostream
======================================
Often, a library will use the streams ``std::cout`` and ``std::cerr`` to print,
but this does not play well with Python's standard ``sys.stdout`` and ``sys.stderr``
redirection. Replacing a library's printing with ``py::print <print>`` may not
be feasible. This can be fixed using a guard around the library function that
redirects output to the corresponding Python streams:
.. code-block:: cpp
#include <pybind11/iostream.h>
...
// Add a scoped redirect for your noisy code
m.def("noisy_func", []() {
py::scoped_ostream_redirect stream(
std::cout, // std::ostream&
py::module_::import("sys").attr("stdout") // Python output
);
call_noisy_func();
});
.. warning::
The implementation in ``pybind11/iostream.h`` is NOT thread safe. Multiple
threads writing to a redirected ostream concurrently cause data races
and potentially buffer overflows. Therefore it is currently a requirement
that all (possibly) concurrent redirected ostream writes are protected by
a mutex. #HelpAppreciated: Work on iostream.h thread safety. For more
background see the discussions under
`PR #2982 <https://github.com/pybind/pybind11/pull/2982>`_ and
`PR #2995 <https://github.com/pybind/pybind11/pull/2995>`_.
This method respects flushes on the output streams and will flush if needed
when the scoped guard is destroyed. This allows the output to be redirected in
real time, such as to a Jupyter notebook. The two arguments, the C++ stream and
the Python output, are optional, and default to standard output if not given. An
extra type, ``py::scoped_estream_redirect <scoped_estream_redirect>``, is identical
except for defaulting to ``std::cerr`` and ``sys.stderr``; this can be useful with
``py::call_guard``, which allows multiple items, but uses the default constructor:
.. code-block:: cpp
// Alternative: Call single function using call guard
m.def("noisy_func", &call_noisy_function,
py::call_guard<py::scoped_ostream_redirect,
py::scoped_estream_redirect>());
The redirection can also be done in Python with the addition of a context
manager, using the ``py::add_ostream_redirect() <add_ostream_redirect>`` function:
.. code-block:: cpp
py::add_ostream_redirect(m, "ostream_redirect");
The name in Python defaults to ``ostream_redirect`` if no name is passed. This
creates the following context manager in Python:
.. code-block:: python
with ostream_redirect(stdout=True, stderr=True):
noisy_function()
It defaults to redirecting both streams, though you can use the keyword
arguments to disable one of the streams if needed.
.. note::
The above methods will not redirect C-level output to file descriptors, such
as ``fprintf``. For those cases, you'll need to redirect the file
descriptors either directly in C or with Python's ``os.dup2`` function
in an operating-system dependent way.
.. _eval:
Evaluating Python expressions from strings and files
====================================================
pybind11 provides the ``eval``, ``exec`` and ``eval_file`` functions to evaluate
Python expressions and statements. The following example illustrates how they
can be used.
.. code-block:: cpp
// At beginning of file
#include <pybind11/eval.h>
...
// Evaluate in scope of main module
py::object scope = py::module_::import("__main__").attr("__dict__");
// Evaluate an isolated expression
int result = py::eval("my_variable + 10", scope).cast<int>();
// Evaluate a sequence of statements
py::exec(
"print('Hello')\n"
"print('world!');",
scope);
// Evaluate the statements in an separate Python file on disk
py::eval_file("script.py", scope);
C++11 raw string literals are also supported and quite handy for this purpose.
The only requirement is that the first statement must be on a new line following
the raw string delimiter ``R"(``, ensuring all lines have common leading indent:
.. code-block:: cpp
py::exec(R"(
x = get_answer()
if x == 42:
print('Hello World!')
else:
print('Bye!')
)", scope
);
.. note::
`eval` and `eval_file` accept a template parameter that describes how the
string/file should be interpreted. Possible choices include ``eval_expr``
(isolated expression), ``eval_single_statement`` (a single statement, return
value is always ``none``), and ``eval_statements`` (sequence of statements,
return value is always ``none``). `eval` defaults to ``eval_expr``,
`eval_file` defaults to ``eval_statements`` and `exec` is just a shortcut
for ``eval<eval_statements>``.
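A brief sketch of the three modes (the variable names are only illustrative):

.. code-block:: cpp

    py::object scope = py::module_::import("__main__").attr("__dict__");

    // eval_expr (the default): evaluate a single expression and return its value
    int three = py::eval("1 + 2", scope).cast<int>();

    // eval_single_statement: run one statement, the result is always none
    py::eval<py::eval_single_statement>("x = 40 + 2", scope);

    // eval_statements: run a block of statements, the result is always none
    py::eval<py::eval_statements>("y = x + 1\nprint(y)", scope);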
.. _numpy:
NumPy
#####
Buffer protocol
===============
Python supports an extremely general and convenient approach for exchanging
data between plugin libraries. Types can expose a buffer view [#f2]_, which
provides fast direct access to the raw internal data representation. Suppose we
want to bind the following simplistic Matrix class:
.. code-block:: cpp
class Matrix {
public:
Matrix(size_t rows, size_t cols) : m_rows(rows), m_cols(cols) {
m_data = new float[rows*cols];
}
float *data() { return m_data; }
size_t rows() const { return m_rows; }
size_t cols() const { return m_cols; }
private:
size_t m_rows, m_cols;
float *m_data;
};
The following binding code exposes the ``Matrix`` contents as a buffer object,
making it possible to cast Matrices into NumPy arrays. It is even possible to
completely avoid copy operations with Python expressions like
``np.array(matrix_instance, copy = False)``.
.. code-block:: cpp
py::class_<Matrix>(m, "Matrix", py::buffer_protocol())
.def_buffer([](Matrix &m) -> py::buffer_info {
return py::buffer_info(
m.data(), /* Pointer to buffer */
sizeof(float), /* Size of one scalar */
py::format_descriptor<float>::format(), /* Python struct-style format descriptor */
2, /* Number of dimensions */
{ m.rows(), m.cols() }, /* Buffer dimensions */
{ sizeof(float) * m.cols(), /* Strides (in bytes) for each index */
sizeof(float) }
);
});
Supporting the buffer protocol in a new type involves specifying the special
``py::buffer_protocol()`` tag in the ``py::class_`` constructor and calling the
``def_buffer()`` method with a lambda function that creates a
``py::buffer_info`` description record on demand describing a given matrix
instance. The contents of ``py::buffer_info`` mirror the Python buffer protocol
specification.
.. code-block:: cpp
struct buffer_info {
void *ptr;
py::ssize_t itemsize;
std::string format;
py::ssize_t ndim;
std::vector<py::ssize_t> shape;
std::vector<py::ssize_t> strides;
};
To create a C++ function that can take a Python buffer object as an argument,
simply use the type ``py::buffer`` as one of its arguments. Buffers can exist
in a great variety of configurations, hence some safety checks are usually
necessary in the function body. Below, you can see a basic example on how to
define a custom constructor for the Eigen double precision matrix
(``Eigen::MatrixXd``) type, which supports initialization from compatible
buffer objects (e.g. a NumPy matrix).
.. code-block:: cpp
/* Bind MatrixXd (or some other Eigen type) to Python */
typedef Eigen::MatrixXd Matrix;
typedef Matrix::Scalar Scalar;
constexpr bool rowMajor = Matrix::Flags & Eigen::RowMajorBit;
py::class_<Matrix>(m, "Matrix", py::buffer_protocol())
.def(py::init([](py::buffer b) {
typedef Eigen::Stride<Eigen::Dynamic, Eigen::Dynamic> Strides;
/* Request a buffer descriptor from Python */
py::buffer_info info = b.request();
/* Some sanity checks ... */
if (info.format != py::format_descriptor<Scalar>::format())
throw std::runtime_error("Incompatible format: expected a double array!");
if (info.ndim != 2)
throw std::runtime_error("Incompatible buffer dimension!");
auto strides = Strides(
info.strides[rowMajor ? 0 : 1] / (py::ssize_t)sizeof(Scalar),
info.strides[rowMajor ? 1 : 0] / (py::ssize_t)sizeof(Scalar));
auto map = Eigen::Map<Matrix, 0, Strides>(
static_cast<Scalar *>(info.ptr), info.shape[0], info.shape[1], strides);
return Matrix(map);
}));
For reference, the ``def_buffer()`` call for this Eigen data type should look
as follows:
.. code-block:: cpp
.def_buffer([](Matrix &m) -> py::buffer_info {
return py::buffer_info(
m.data(), /* Pointer to buffer */
sizeof(Scalar), /* Size of one scalar */
py::format_descriptor<Scalar>::format(), /* Python struct-style format descriptor */
2, /* Number of dimensions */
{ m.rows(), m.cols() }, /* Buffer dimensions */
{ sizeof(Scalar) * (rowMajor ? m.cols() : 1),
sizeof(Scalar) * (rowMajor ? 1 : m.rows()) }
/* Strides (in bytes) for each index */
);
})
For a much easier approach of binding Eigen types (although with some
limitations), refer to the section on :doc:`/advanced/cast/eigen`.
.. seealso::
The file :file:`tests/test_buffers.cpp` contains a complete example
that demonstrates using the buffer protocol with pybind11 in more detail.
.. [#f2] http://docs.python.org/3/c-api/buffer.html
Arrays
======
By exchanging ``py::buffer`` with ``py::array`` in the above snippet, we can
restrict the function so that it only accepts NumPy arrays (rather than any
type of Python object satisfying the buffer protocol).
In many situations, we want to define a function which only accepts a NumPy
array of a certain data type. This is possible via the ``py::array_t<T>``
template. For instance, the following function requires the argument to be a
NumPy array containing double precision values.
.. code-block:: cpp
void f(py::array_t<double> array);
When it is invoked with a different type (e.g. an integer or a list of
integers), the binding code will attempt to cast the input into a NumPy array
of the requested type. This feature requires the :file:`pybind11/numpy.h`
header to be included. Note that :file:`pybind11/numpy.h` does not depend on
the NumPy headers, and thus can be used without declaring a build-time
dependency on NumPy; NumPy>=1.7.0 is a runtime dependency.
Data in NumPy arrays is not guaranteed to be packed in a dense manner;
furthermore, entries can be separated by arbitrary column and row strides.
Sometimes, it can be useful to require a function to only accept dense arrays
using either the C (row-major) or Fortran (column-major) ordering. This can be
accomplished via a second template argument with values ``py::array::c_style``
or ``py::array::f_style``.
.. code-block:: cpp
void f(py::array_t<double, py::array::c_style | py::array::forcecast> array);
The ``py::array::forcecast`` argument is the default value of the second
template parameter, and it ensures that non-conforming arguments are converted
into an array satisfying the specified requirements instead of trying the next
function overload.
There are several methods on arrays. In addition to the reference-returning
accessors described below under *Direct access*, the following functions based
on the NumPy API are available:
- ``.dtype()`` returns the type of the contained values.
- ``.strides()`` returns a pointer to the strides of the array (optionally pass
an integer axis to get a number).
- ``.flags()`` returns the flag settings. ``.writable()`` and ``.owndata()``
are directly available.
- ``.offset_at()`` returns the offset (optionally pass indices).
- ``.squeeze()`` returns a view with length-1 axes removed.
- ``.view(dtype)`` returns a view of the array with a different dtype.
- ``.reshape({i, j, ...})`` returns a view of the array with a different shape.
``.resize({...})`` is also available.
- ``.index_at(i, j, ...)`` gets the count from the beginning to a given index.
There are also several methods for getting references (described below).
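For illustration, a hypothetical binding that exercises a few of the accessors
listed above could look like this:

.. code-block:: cpp

    m.def("describe", [](const py::array &a) {
        py::print("dtype:    ", a.dtype());
        py::print("ndim:     ", a.ndim());
        py::print("itemsize: ", a.itemsize());
        py::print("size:     ", a.size());
        py::print("writeable:", a.writeable());
    });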
Structured types
================
In order for ``py::array_t`` to work with structured (record) types, we first
need to register the memory layout of the type. This can be done via
``PYBIND11_NUMPY_DTYPE`` macro, called in the plugin definition code, which
expects the type followed by field names:
.. code-block:: cpp
struct A {
int x;
double y;
};
struct B {
int z;
A a;
};
// ...
PYBIND11_MODULE(test, m) {
// ...
PYBIND11_NUMPY_DTYPE(A, x, y);
PYBIND11_NUMPY_DTYPE(B, z, a);
/* now both A and B can be used as template arguments to py::array_t */
}
The structure should consist of fundamental arithmetic types, ``std::complex``,
previously registered substructures, and arrays of any of the above. Both C++
arrays and ``std::array`` are supported. While there is a static assertion to
prevent many types of unsupported structures, it is still the user's
responsibility to use only "plain" structures that can be safely manipulated as
raw memory without violating invariants.
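Once registered, such records can be passed to and from functions as arrays. A
small sketch using the ``A`` struct from above (the ``sum_x`` binding and the
dense-array requirement are illustrative assumptions):

.. code-block:: cpp

    // Accepts a dense 1-D array whose dtype matches struct A
    m.def("sum_x", [](py::array_t<A, py::array::c_style | py::array::forcecast> arr) {
        const A *ptr = arr.data();   // raw pointer to the packed records
        int total = 0;
        for (py::ssize_t i = 0; i < arr.size(); i++)
            total += ptr[i].x;
        return total;
    });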
Vectorizing functions
=====================
Suppose we want to bind a function with the following signature to Python so
that it can process arbitrary NumPy array arguments (vectors, matrices, general
N-D arrays) in addition to its normal arguments:
.. code-block:: cpp
double my_func(int x, float y, double z);
After including the ``pybind11/numpy.h`` header, this is extremely simple:
.. code-block:: cpp
m.def("vectorized_func", py::vectorize(my_func));
Invoking the function as shown below causes 4 calls to be made to ``my_func`` with
each of the array elements. The significant advantage of this compared to
solutions like ``numpy.vectorize()`` is that the loop over the elements runs
entirely on the C++ side and can be crunched down into a tight, optimized loop
by the compiler. The result is returned as a NumPy array of type
``numpy.dtype.float64``.
.. code-block:: pycon
>>> x = np.array([[1, 3], [5, 7]])
>>> y = np.array([[2, 4], [6, 8]])
>>> z = 3
>>> result = vectorized_func(x, y, z)
The scalar argument ``z`` is transparently replicated 4 times. The input
arrays ``x`` and ``y`` are automatically converted into the right types (they
are of type ``numpy.dtype.int64`` but need to be ``numpy.dtype.int32`` and
``numpy.dtype.float32``, respectively).
.. note::
Only arithmetic, complex, and POD types passed by value or by ``const &``
reference are vectorized; all other arguments are passed through as-is.
Functions taking rvalue reference arguments cannot be vectorized.
In cases where the computation is too complicated to be reduced to
``vectorize``, it will be necessary to create and access the buffer contents
manually. The following snippet contains a complete example that shows how this
works (the code is somewhat contrived, since it could have been done more
simply using ``vectorize``).
.. code-block:: cpp
#include <pybind11/pybind11.h>
#include <pybind11/numpy.h>
namespace py = pybind11;
py::array_t<double> add_arrays(py::array_t<double> input1, py::array_t<double> input2) {
py::buffer_info buf1 = input1.request(), buf2 = input2.request();
if (buf1.ndim != 1 || buf2.ndim != 1)
throw std::runtime_error("Number of dimensions must be one");
if (buf1.size != buf2.size)
throw std::runtime_error("Input shapes must match");
/* No pointer is passed, so NumPy will allocate the buffer */
auto result = py::array_t<double>(buf1.size);
py::buffer_info buf3 = result.request();
double *ptr1 = static_cast<double *>(buf1.ptr);
double *ptr2 = static_cast<double *>(buf2.ptr);
double *ptr3 = static_cast<double *>(buf3.ptr);
for (py::ssize_t idx = 0; idx < buf1.shape[0]; idx++)
ptr3[idx] = ptr1[idx] + ptr2[idx];
return result;
}
PYBIND11_MODULE(test, m) {
m.def("add_arrays", &add_arrays, "Add two NumPy arrays");
}
.. seealso::
The file :file:`tests/test_numpy_vectorize.cpp` contains a complete
example that demonstrates using :func:`vectorize` in more detail.
Direct access
=============
For performance reasons, particularly when dealing with very large arrays, it
is often desirable to directly access array elements without internal checking
of dimensions and bounds on every access when indices are known to be already
valid. To avoid such checks, the ``array`` class and ``array_t<T>`` template
class offer an unchecked proxy object that can be used for this unchecked
access through the ``unchecked<N>`` and ``mutable_unchecked<N>`` methods,
where ``N`` gives the required dimensionality of the array:
.. code-block:: cpp
m.def("sum_3d", [](py::array_t<double> x) {
auto r = x.unchecked<3>(); // x must have ndim = 3; can be non-writeable
double sum = 0;
for (py::ssize_t i = 0; i < r.shape(0); i++)
for (py::ssize_t j = 0; j < r.shape(1); j++)
for (py::ssize_t k = 0; k < r.shape(2); k++)
sum += r(i, j, k);
return sum;
});
m.def("increment_3d", [](py::array_t<double> x) {
auto r = x.mutable_unchecked<3>(); // Will throw if ndim != 3 or flags.writeable is false
for (py::ssize_t i = 0; i < r.shape(0); i++)
for (py::ssize_t j = 0; j < r.shape(1); j++)
for (py::ssize_t k = 0; k < r.shape(2); k++)
r(i, j, k) += 1.0;
}, py::arg().noconvert());
To obtain the proxy from an ``array`` object, you must specify both the data
type and number of dimensions as template arguments, such as ``auto r =
myarray.mutable_unchecked<float, 2>()``.
If the number of dimensions is not known at compile time, you can omit the
dimensions template parameter (i.e. calling ``arr_t.unchecked()`` or
``arr.unchecked<T>()``). This will give you a proxy object that works in the
same way, but results in less optimizable code and thus a small efficiency
loss in tight loops.
Note that the returned proxy object directly references the array's data, and
only reads its shape, strides, and writeable flag when constructed. You must
take care to ensure that the referenced array is not destroyed or reshaped for
the duration of the returned object, typically by limiting the scope of the
returned instance.
The returned proxy object supports some of the same methods as ``py::array`` so
that it can be used as a drop-in replacement for some existing, index-checked
uses of ``py::array``:
- ``.ndim()`` returns the number of dimensions
- ``.data(1, 2, ...)`` and ``.mutable_data(1, 2, ...)`` return a pointer to
the ``const T`` or ``T`` data, respectively, at the given indices. The
latter is only available to proxies obtained via ``a.mutable_unchecked()``.
- ``.itemsize()`` returns the size of an item in bytes, i.e. ``sizeof(T)``.
- ``.ndim()`` returns the number of dimensions.
- ``.shape(n)`` returns the size of dimension ``n``
- ``.size()`` returns the total number of elements (i.e. the product of the shapes).
- ``.nbytes()`` returns the number of bytes used by the referenced elements
(i.e. ``itemsize()`` times ``size()``).
.. seealso::
The file :file:`tests/test_numpy_array.cpp` contains additional examples
demonstrating the use of this feature.
Ellipsis
========
Python 3 provides a convenient ``...`` ellipsis notation that is often used to
slice multidimensional arrays. For instance, the following snippet extracts the
middle dimensions of a tensor with the first and last index set to zero.
In Python 2, the syntactic sugar ``...`` is not available, but the singleton
``Ellipsis`` (of type ``ellipsis``) can still be used directly.
.. code-block:: python
a = ... # a NumPy array
b = a[0, ..., 0]
The ``py::ellipsis()`` function can be used to perform the same
operation on the C++ side:
.. code-block:: cpp
py::array a = /* A NumPy array */;
py::array b = a[py::make_tuple(0, py::ellipsis(), 0)];
.. versionchanged:: 2.6
``py::ellipsis()`` is now also available in Python 2.
Memory view
===========
For cases where we simply want to provide a direct accessor to a C/C++ buffer
without a concrete class object, we can return a ``memoryview`` object. Suppose
we wish to expose a ``memoryview`` for a 2x4 ``uint8_t`` array; we can do the
following:
.. code-block:: cpp
const uint8_t buffer[] = {
0, 1, 2, 3,
4, 5, 6, 7
};
m.def("get_memoryview2d", []() {
return py::memoryview::from_buffer(
buffer, // buffer pointer
{ 2, 4 }, // shape (rows, cols)
{ sizeof(uint8_t) * 4, sizeof(uint8_t) } // strides in bytes
);
})
This approach is meant for providing a ``memoryview`` for a C/C++ buffer not
managed by Python. The user is responsible for managing the lifetime of the
buffer. Using a ``memoryview`` created in this way after deleting the buffer on
the C++ side results in undefined behavior.
We can also use ``memoryview::from_memory`` for a simple 1D contiguous buffer:
.. code-block:: cpp
m.def("get_memoryview1d", []() {
return py::memoryview::from_memory(
buffer, // buffer pointer
sizeof(uint8_t) * 8 // buffer size
);
})
.. note::
``memoryview::from_memory`` is not available in Python 2.
.. versionchanged:: 2.6
``memoryview::from_memory`` added.
Python types
############
.. _wrappers:
Available wrappers
==================
All major Python types are available as thin C++ wrapper classes. These
can also be used as function parameters -- see :ref:`python_objects_as_args`.
Available types include :class:`handle`, :class:`object`, :class:`bool_`,
:class:`int_`, :class:`float_`, :class:`str`, :class:`bytes`, :class:`tuple`,
:class:`list`, :class:`dict`, :class:`slice`, :class:`none`, :class:`capsule`,
:class:`iterable`, :class:`iterator`, :class:`function`, :class:`buffer`,
:class:`array`, and :class:`array_t`.
.. warning::
Be sure to review the :ref:`pytypes_gotchas` before using this heavily in
your C++ API.
.. _instantiating_compound_types:
Instantiating compound Python types from C++
============================================
Dictionaries can be initialized in the :class:`dict` constructor:
.. code-block:: cpp
using namespace pybind11::literals; // to bring in the `_a` literal
py::dict d("spam"_a=py::none(), "eggs"_a=42);
A tuple of python objects can be instantiated using :func:`py::make_tuple`:
.. code-block:: cpp
py::tuple tup = py::make_tuple(42, py::none(), "spam");
Each element is converted to a supported Python type.
A `simple namespace`_ can be instantiated using
.. code-block:: cpp
using namespace pybind11::literals; // to bring in the `_a` literal
py::object SimpleNamespace = py::module_::import("types").attr("SimpleNamespace");
py::object ns = SimpleNamespace("spam"_a=py::none(), "eggs"_a=42);
Attributes on a namespace can be modified with the :func:`py::delattr`,
:func:`py::getattr`, and :func:`py::setattr` functions. Simple namespaces can
be useful as lightweight stand-ins for class instances.
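Continuing the sketch above, the namespace attributes could be manipulated as
follows:

.. code-block:: cpp

    py::setattr(ns, "spam", py::int_(7));       // ns.spam = 7
    py::object eggs = py::getattr(ns, "eggs");  // ns.eggs
    py::delattr(ns, "spam");                    // del ns.spam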
.. _simple namespace: https://docs.python.org/3/library/types.html#types.SimpleNamespace
.. _casting_back_and_forth:
Casting back and forth
======================
In this kind of mixed code, it is often necessary to convert arbitrary C++
types to Python, which can be done using :func:`py::cast`:
.. code-block:: cpp
MyClass *cls = ...;
py::object obj = py::cast(cls);
The reverse direction uses the following syntax:
.. code-block:: cpp
py::object obj = ...;
MyClass *cls = obj.cast<MyClass *>();
When conversion fails, both directions throw the exception :class:`cast_error`.
.. _python_libs:
Accessing Python libraries from C++
===================================
It is also possible to import objects defined in the Python standard
library or available in the current Python environment (``sys.path``) and work
with these in C++.
This example obtains a reference to the Python ``Decimal`` class.
.. code-block:: cpp
// Equivalent to "from decimal import Decimal"
py::object Decimal = py::module_::import("decimal").attr("Decimal");
.. code-block:: cpp
// Try to import scipy
py::object scipy = py::module_::import("scipy");
return scipy.attr("__version__");
.. _calling_python_functions:
Calling Python functions
========================
It is also possible to call Python classes, functions and methods
via ``operator()``.
.. code-block:: cpp
// Construct a Python object of class Decimal
py::object pi = Decimal("3.14159");
.. code-block:: cpp
// Use Python to make our directories
py::object os = py::module_::import("os");
py::object makedirs = os.attr("makedirs");
makedirs("/tmp/path/to/somewhere");
One can convert the result obtained from Python to a pure C++ version
if a ``py::class_`` or type conversion is defined.
.. code-block:: cpp
py::function f = <...>;
py::object result_py = f(1234, "hello", some_instance);
MyClass &result = result_py.cast<MyClass>();
.. _calling_python_methods:
Calling Python methods
========================
To call an object's method, one can again use ``.attr`` to obtain access to the
Python method.
.. code-block:: cpp
// Calculate e^π in decimal
py::object exp_pi = pi.attr("exp")();
py::print(py::str(exp_pi));
In the example above ``pi.attr("exp")`` is a *bound method*: it will always call
the method for that same instance of the class. Alternately one can create an
*unbound method* via the Python class (instead of instance) and pass the ``self``
object explicitly, followed by other arguments.
.. code-block:: cpp
py::object decimal_exp = Decimal.attr("exp");
// Compute the e^n for n=0..4
for (int n = 0; n < 5; n++) {
py::print(decimal_exp(Decimal(n)));
}
Keyword arguments
=================
Keyword arguments are also supported. In Python, there is the usual call syntax:
.. code-block:: python
def f(number, say, to):
... # function code
f(1234, say="hello", to=some_instance) # keyword call in Python
In C++, the same call can be made using:
.. code-block:: cpp
using namespace pybind11::literals; // to bring in the `_a` literal
f(1234, "say"_a="hello", "to"_a=some_instance); // keyword call in C++
Unpacking arguments
===================
Unpacking of ``*args`` and ``**kwargs`` is also possible and can be mixed with
other arguments:
.. code-block:: cpp
// * unpacking
py::tuple args = py::make_tuple(1234, "hello", some_instance);
f(*args);
// ** unpacking
py::dict kwargs = py::dict("number"_a=1234, "say"_a="hello", "to"_a=some_instance);
f(**kwargs);
// mixed keywords, * and ** unpacking
py::tuple args = py::make_tuple(1234);
py::dict kwargs = py::dict("to"_a=some_instance);
f(*args, "say"_a="hello", **kwargs);
Generalized unpacking according to PEP448_ is also supported:
.. code-block:: cpp
py::dict kwargs1 = py::dict("number"_a=1234);
py::dict kwargs2 = py::dict("to"_a=some_instance);
f(**kwargs1, "say"_a="hello", **kwargs2);
.. seealso::
The file :file:`tests/test_pytypes.cpp` contains a complete
example that demonstrates passing native Python types in more detail. The
file :file:`tests/test_callbacks.cpp` presents a few examples of calling
Python functions from C++, including keywords arguments and unpacking.
.. _PEP448: https://www.python.org/dev/peps/pep-0448/
.. _implicit_casting:
Implicit casting
================
When using the C++ interface for Python types, or calling Python functions,
objects of type :class:`object` are returned. It is possible to invoke implicit
conversions to subclasses like :class:`dict`. The same holds for the proxy objects
returned by ``operator[]`` or ``obj.attr()``.
Casting to subtypes improves code readability and allows values to be passed to
C++ functions that require a specific subtype rather than a generic :class:`object`.
.. code-block:: cpp
#include <pybind11/numpy.h>
using namespace pybind11::literals;
py::module_ os = py::module_::import("os");
py::module_ path = py::module_::import("os.path"); // like 'import os.path as path'
py::module_ np = py::module_::import("numpy"); // like 'import numpy as np'
py::str curdir_abs = path.attr("abspath")(path.attr("curdir"));
py::print(py::str("Current directory: ") + curdir_abs);
py::dict environ = os.attr("environ");
py::print(environ["HOME"]);
py::array_t<float> arr = np.attr("ones")(3, "dtype"_a="float32");
py::print(py::repr(arr + py::int_(1)));
These implicit conversions are available for subclasses of :class:`object`; there
is no need to call ``obj.cast()`` explicitly as for custom classes, see
:ref:`casting_back_and_forth`.
.. note::
If a trivial conversion via move constructor is not possible, both implicit and
explicit casting (calling ``obj.cast()``) will attempt a "rich" conversion.
For instance, ``py::list env = os.attr("environ");`` will succeed and is
equivalent to the Python code ``env = list(os.environ)`` that produces a
list of the dict keys.
.. TODO: Adapt text once PR #2349 has landed
Handling exceptions
===================
Python exceptions from wrapper classes will be thrown as a ``py::error_already_set``.
See :ref:`Handling exceptions from Python in C++
<handling_python_exceptions_cpp>` for more information on handling exceptions
raised when calling C++ wrapper classes.
.. _pytypes_gotchas:
Gotchas
=======
Default-Constructed Wrappers
----------------------------
When a wrapper type is default-constructed, it is **not** a valid Python object (i.e. it is not ``py::none()``). It is simply the same as
a null ``PyObject*`` pointer. To check for this, use
``static_cast<bool>(my_wrapper)``.
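For example (a minimal sketch):

.. code-block:: cpp

    py::object obj;                       // default-constructed: wraps a null PyObject*
    if (!static_cast<bool>(obj)) {
        obj = py::none();                 // now a valid Python object
    }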
Assigning py::none() to wrappers
--------------------------------
You may be tempted to use types like ``py::str`` and ``py::dict`` in C++
signatures (either pure C++, or in bound signatures), and assign them default
values of ``py::none()``. However, in a best case scenario, it will fail fast
because ``None`` is not convertible to that type (e.g. ``py::dict``), or in a
worse case scenario, it will silently work but corrupt the types you want to
work with (e.g. ``py::str(py::none())`` will yield ``"None"`` in Python).
Functional
##########
The following features must be enabled by including :file:`pybind11/functional.h`.
Callbacks and passing anonymous functions
=========================================
The C++11 standard brought lambda functions and the generic polymorphic
function wrapper ``std::function<>`` to the C++ programming language, which
enable powerful new ways of working with functions. Lambda functions come in
two flavors: stateless lambda functions resemble classic function pointers that
link to an anonymous piece of code, while stateful lambda functions
additionally depend on captured variables that are stored in an anonymous
*lambda closure object*.
Here is a simple example of a C++ function that takes an arbitrary function
(stateful or stateless) with signature ``int -> int`` as an argument and runs
it with the value 10.
.. code-block:: cpp
int func_arg(const std::function<int(int)> &f) {
return f(10);
}
The example below is more involved: it takes a function of signature ``int -> int``
and returns another function of the same kind. The return value is a stateful
lambda function, which stores the value ``f`` in the capture object and adds 1 to
its return value upon execution.
.. code-block:: cpp
std::function<int(int)> func_ret(const std::function<int(int)> &f) {
return [f](int i) {
return f(i) + 1;
};
}
This example demonstrates using Python named parameters in C++ callbacks, which
requires using ``py::cpp_function`` as a wrapper. Usage is similar to defining
methods of classes:
.. code-block:: cpp
py::cpp_function func_cpp() {
return py::cpp_function([](int i) { return i+1; },
py::arg("number"));
}
After including the extra header file :file:`pybind11/functional.h`, it is almost
trivial to generate binding code for all of these functions.
.. code-block:: cpp
#include <pybind11/functional.h>
PYBIND11_MODULE(example, m) {
m.def("func_arg", &func_arg);
m.def("func_ret", &func_ret);
m.def("func_cpp", &func_cpp);
}
The following interactive session shows how to call them from Python.
.. code-block:: pycon
$ python
>>> import example
>>> def square(i):
... return i * i
...
>>> example.func_arg(square)
100L
>>> square_plus_1 = example.func_ret(square)
>>> square_plus_1(4)
17L
>>> plus_1 = example.func_cpp()
>>> plus_1(number=43)
44L
.. warning::
Keep in mind that passing a function from C++ to Python (or vice versa)
will instantiate a piece of wrapper code that translates function
invocations between the two languages. Naturally, this translation
increases the computational cost of each function call somewhat. A
problematic situation can arise when a function is copied back and forth
between Python and C++ many times in a row, in which case the underlying
wrappers will accumulate correspondingly. The resulting long sequence of
C++ -> Python -> C++ -> ... roundtrips can significantly decrease
performance.
There is one exception: pybind11 detects cases where a stateless function
(i.e. a function pointer or a lambda function without captured variables)
is passed as an argument to another C++ function exposed in Python. In this
case, there is no overhead. Pybind11 will extract the underlying C++
function pointer from the wrapped function to sidestep a potential C++ ->
Python -> C++ roundtrip. This is demonstrated in :file:`tests/test_callbacks.cpp`.
.. note::
This functionality is very useful when generating bindings for callbacks in
C++ libraries (e.g. GUI libraries, asynchronous networking libraries, etc.).
The file :file:`tests/test_callbacks.cpp` contains a complete example
that demonstrates how to work with callbacks and anonymous functions in
more detail.
Strings, bytes and Unicode conversions
######################################
.. note::
This section discusses string handling in terms of Python 3 strings. For
Python 2.7, replace all occurrences of ``str`` with ``unicode`` and
``bytes`` with ``str``. Python 2.7 users may find it best to use ``from
__future__ import unicode_literals`` to avoid unintentionally using ``str``
instead of ``unicode``.
Passing Python strings to C++
=============================
When a Python ``str`` is passed from Python to a C++ function that accepts
``std::string`` or ``char *`` as arguments, pybind11 will encode the Python
string to UTF-8. All Python ``str`` can be encoded in UTF-8, so this operation
does not fail.
The C++ language is encoding agnostic. It is the responsibility of the
programmer to track encodings. It's often easiest to simply `use UTF-8
everywhere <http://utf8everywhere.org/>`_.
.. code-block:: c++
m.def("utf8_test",
[](const std::string &s) {
cout << "utf-8 is icing on the cake.\n";
cout << s;
}
);
m.def("utf8_charptr",
[](const char *s) {
cout << "My favorite food is\n";
cout << s;
}
);
.. code-block:: pycon
>>> utf8_test("🎂")
utf-8 is icing on the cake.
🎂
>>> utf8_charptr("🍕")
My favorite food is
🍕
.. note::
Some terminal emulators do not support UTF-8 or emoji fonts and may not
display the example above correctly.
The results are the same whether the C++ function accepts arguments by value or
reference, and whether or not ``const`` is used.
Passing bytes to C++
--------------------
A Python ``bytes`` object will be passed to C++ functions that accept
``std::string`` or ``char*`` *without* conversion. On Python 3, in order to
make a function *only* accept ``bytes`` (and not ``str``), declare it as taking
a ``py::bytes`` argument.
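A small sketch (the ``bytes_length`` binding is only an illustrative name):

.. code-block:: c++

    m.def("bytes_length",
        [](const py::bytes &data) {
            std::string raw = data;   // copies the raw bytes, no transcoding
            return raw.size();
        }
    );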
Returning C++ strings to Python
===============================
When a C++ function returns a ``std::string`` or ``char*`` to a Python caller,
**pybind11 will assume that the string is valid UTF-8** and will decode it to a
native Python ``str``, using the same API as Python uses to perform
``bytes.decode('utf-8')``. If this implicit conversion fails, pybind11 will
raise a ``UnicodeDecodeError``.
.. code-block:: c++
m.def("std_string_return",
[]() {
return std::string("This string needs to be UTF-8 encoded");
}
);
.. code-block:: pycon
>>> isinstance(example.std_string_return(), str)
True
Because UTF-8 is inclusive of pure ASCII, there is never any issue with
returning a pure ASCII string to Python. If there is any possibility that the
string is not pure ASCII, it is necessary to ensure the encoding is valid
UTF-8.
.. warning::
Implicit conversion assumes that a returned ``char *`` is null-terminated.
If there is no null terminator a buffer overrun will occur.
Explicit conversions
--------------------
If some C++ code constructs a ``std::string`` that is not a UTF-8 string, one
can perform an explicit conversion and return a ``py::str`` object. Explicit
conversion has the same overhead as implicit conversion.
.. code-block:: c++
// This uses the Python C API to convert Latin-1 to Unicode
m.def("str_output",
[]() {
std::string s = "Send your r\xe9sum\xe9 to Alice in HR"; // Latin-1
py::str py_s = PyUnicode_DecodeLatin1(s.data(), s.length());
return py_s;
}
);
.. code-block:: pycon
>>> str_output()
'Send your résumé to Alice in HR'
The `Python C API
<https://docs.python.org/3/c-api/unicode.html#built-in-codecs>`_ provides
several built-in codecs.
One could also use a third party encoding library such as libiconv to transcode
to UTF-8.
Return C++ strings without conversion
-------------------------------------
If the data in a C++ ``std::string`` does not represent text and should be
returned to Python as ``bytes``, then one can return the data as a
``py::bytes`` object.
.. code-block:: c++
m.def("return_bytes",
[]() {
std::string s("\xba\xd0\xba\xd0"); // Not valid UTF-8
return py::bytes(s); // Return the data without transcoding
}
);
.. code-block:: pycon
>>> example.return_bytes()
b'\xba\xd0\xba\xd0'
Note the asymmetry: pybind11 will convert ``bytes`` to ``std::string`` without
encoding, but cannot convert ``std::string`` back to ``bytes`` implicitly.
.. code-block:: c++
m.def("asymmetry",
[](std::string s) { // Accepts str or bytes from Python
return s; // Looks harmless, but implicitly converts to str
}
);
.. code-block:: pycon
>>> isinstance(example.asymmetry(b"have some bytes"), str)
True
>>> example.asymmetry(b"\xba\xd0\xba\xd0") # invalid utf-8 as bytes
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xba in position 0: invalid start byte
Wide character strings
======================
When a Python ``str`` is passed to a C++ function expecting ``std::wstring``,
``wchar_t*``, ``std::u16string`` or ``std::u32string``, the ``str`` will be
encoded to UTF-16 or UTF-32 depending on how the C++ compiler implements each
type, in the platform's native endianness. When strings of these types are
returned, they are assumed to contain valid UTF-16 or UTF-32, and will be
decoded to Python ``str``.
.. code-block:: c++
#define UNICODE
#include <windows.h>
m.def("set_window_text",
[](HWND hwnd, std::wstring s) {
// Call SetWindowText with null-terminated UTF-16 string
::SetWindowText(hwnd, s.c_str());
}
);
m.def("get_window_text",
[](HWND hwnd) {
const int buffer_size = ::GetWindowTextLength(hwnd) + 1;
auto buffer = std::make_unique< wchar_t[] >(buffer_size);
::GetWindowText(hwnd, buffer.get(), buffer_size);
std::wstring text(buffer.get());
// wstring will be converted to Python str
return text;
}
);
.. warning::
Wide character strings may not work as described on Python 2.7 or Python
3.3 compiled with ``--enable-unicode=ucs2``.
Strings in multibyte encodings such as Shift-JIS must be transcoded to
UTF-8/16/32 before being returned to Python.
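For example, a sketch in the spirit of the Latin-1 example above (the function
name and the exact byte values are illustrative only) can use one of Python's
built-in codecs to perform the conversion:
.. code-block:: c++
    m.def("shiftjis_output",
        []() {
            std::string s = "\x93\xfa\x96\x7b\x8c\xea"; // Shift-JIS encoded bytes
            // Decode via the Python codec registry; error handling omitted for brevity
            return py::reinterpret_steal<py::str>(
                PyUnicode_Decode(s.data(), s.length(), "shift_jis", nullptr));
        }
    );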
Character literals
==================
C++ functions that accept character literals as input will receive the first
character of a Python ``str`` as their input. If the string is longer than one
Unicode character, trailing characters will be ignored.
When a character literal is returned from C++ (such as a ``char`` or a
``wchar_t``), it will be converted to a ``str`` that represents the single
character.
.. code-block:: c++
m.def("pass_char", [](char c) { return c; });
m.def("pass_wchar", [](wchar_t w) { return w; });
.. code-block:: pycon
>>> example.pass_char("A")
'A'
While C++ will cast integers to character types (``char c = 0x65;``), pybind11
does not convert Python integers to characters implicitly. The Python function
``chr()`` can be used to convert integers to characters.
.. code-block:: pycon
>>> example.pass_char(0x65)
TypeError
>>> example.pass_char(chr(0x65))
'A'
If the desire is to work with an 8-bit integer, use ``int8_t`` or ``uint8_t``
as the argument type.
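For example (a minimal sketch; the function name is hypothetical):
.. code-block:: c++
    m.def("pass_byte", [](std::uint8_t b) { return b; });
With such a binding, ``pass_byte(65)`` returns the Python integer ``65``, while
passing a one-character string raises a ``TypeError``.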
Grapheme clusters
-----------------
A single grapheme may be represented by two or more Unicode characters. For
example 'é' is usually represented as U+00E9 but can also be expressed as the
combining character sequence U+0065 U+0301 (that is, the letter 'e' followed by
a combining acute accent). The combining character will be lost if the
two-character sequence is passed as an argument, even though it renders as a
single grapheme.
.. code-block:: pycon
>>> example.pass_wchar("é")
'é'
>>> combining_e_acute = "e" + "\u0301"
>>> combining_e_acute
'é'
>>> combining_e_acute == "é"
False
>>> example.pass_wchar(combining_e_acute)
'e'
Normalizing combining characters before passing the character literal to C++
may resolve *some* of these issues:
.. code-block:: pycon
>>> example.pass_wchar(unicodedata.normalize("NFC", combining_e_acute))
'é'
In some languages (Thai for example), there are `graphemes that cannot be
expressed as a single Unicode code point
<http://unicode.org/reports/tr29/#Grapheme_Cluster_Boundaries>`_, so there is
no way to capture them in a C++ character type.
C++17 string views
==================
C++17 string views are automatically supported when compiling in C++17 mode.
They follow the same rules for encoding and decoding as the corresponding STL
string type (for example, a ``std::u16string_view`` argument will be passed
UTF-16-encoded data, and a returned ``std::string_view`` will be decoded as
UTF-8).
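For instance (a sketch; the function name is hypothetical), a ``std::string_view``
parameter receives the UTF-8 data of the ``str`` argument and can avoid an extra
``std::string`` copy; the view is only valid for the duration of the call:
.. code-block:: c++
    #include <algorithm>
    m.def("count_spaces", [](std::string_view s) {
        return std::count(s.begin(), s.end(), ' ');
    });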
References
==========
* `The Absolute Minimum Every Software Developer Absolutely, Positively Must Know About Unicode and Character Sets (No Excuses!) <https://www.joelonsoftware.com/2003/10/08/the-absolute-minimum-every-software-developer-absolutely-positively-must-know-about-unicode-and-character-sets-no-excuses/>`_
* `C++ - Using STL Strings at Win32 API Boundaries <https://msdn.microsoft.com/en-ca/magazine/mt238407.aspx>`_
| Abies | /Abies-0.0.5.tar.gz/Abies-0.0.5/extern/pybind11/docs/advanced/cast/strings.rst | strings.rst |
Eigen
#####
`Eigen <http://eigen.tuxfamily.org>`_ is a C++ header-based library for dense and
sparse linear algebra. Due to its popularity and widespread adoption, pybind11
provides transparent conversion and limited mapping support between Eigen and
Scientific Python linear algebra data types.
To enable the built-in Eigen support you must include the optional header file
:file:`pybind11/eigen.h`.
Pass-by-value
=============
When binding a function with ordinary Eigen dense object arguments (for
example, ``Eigen::MatrixXd``), pybind11 will accept any input value that is
already (or convertible to) a ``numpy.ndarray`` with dimensions compatible with
the Eigen type, copy its values into a temporary Eigen variable of the
appropriate type, then call the function with this temporary variable.
Sparse matrices are similarly copied to or from
``scipy.sparse.csr_matrix``/``scipy.sparse.csc_matrix`` objects.
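For instance (a minimal sketch; the function name is hypothetical), a parameter
of a column-major Eigen sparse type accepts a ``scipy.sparse.csc_matrix``,
which is copied into a temporary Eigen object for the duration of the call:
.. code-block:: cpp
    #include <pybind11/eigen.h>
    #include <Eigen/SparseCore>
    m.def("sparse_nonzeros", [](const Eigen::SparseMatrix<double> &m) {
        return m.nonZeros(); // number of stored (non-zero) entries
    });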
Pass-by-reference
=================
One major limitation of the above is that every data conversion implicitly
involves a copy, which can be expensive (for large matrices) and which rules out
binding functions that change their (Matrix) arguments. Pybind11 allows you to
work around this by using Eigen's ``Eigen::Ref<MatrixType>`` class much as you
would when writing a function taking a generic type in Eigen itself (subject to
some limitations discussed below).
When calling a bound function accepting a ``Eigen::Ref<const MatrixType>``
type, pybind11 will attempt to avoid copying by using an ``Eigen::Map`` object
that maps into the source ``numpy.ndarray`` data: this requires both that the
data types are the same (e.g. ``dtype='float64'`` and ``MatrixType::Scalar`` is
``double``); and that the storage is layout compatible. The latter limitation
is discussed in detail in the section below, and requires careful
consideration: by default, numpy matrices and Eigen matrices are *not* storage
compatible.
If the numpy matrix cannot be used as is (either because its types differ, e.g.
passing an array of integers to an Eigen parameter requiring doubles, or
because the storage is incompatible), pybind11 makes a temporary copy and
passes the copy instead.
When a bound function parameter is instead ``Eigen::Ref<MatrixType>`` (note the
lack of ``const``), pybind11 will only allow the function to be called if it
can be mapped *and* if the numpy array is writeable (that is
``a.flags.writeable`` is true). Any access (including modification) made to
the passed variable will be transparently carried out directly on the
``numpy.ndarray``.
This means you can write code such as the following and have it work as
expected:
.. code-block:: cpp
void scale_by_2(Eigen::Ref<Eigen::VectorXd> v) {
v *= 2;
}
Note, however, that you will likely run into limitations due to numpy and
Eigen's different default storage orders for data; see the section below on
:ref:`storage_orders` for details on how to bind code that won't run into such
limitations.
.. note::
Passing by reference is not supported for sparse types.
Returning values to Python
==========================
When returning an ordinary dense Eigen matrix type to numpy (e.g.
``Eigen::MatrixXd`` or ``Eigen::RowVectorXf``) pybind11 keeps the matrix and
returns a numpy array that directly references the Eigen matrix: no copy of the
data is performed. The numpy array will have ``array.flags.owndata`` set to
``False`` to indicate that it does not own the data, and the lifetime of the
stored Eigen matrix will be tied to the returned ``array``.
If you bind a function with a non-reference, ``const`` return type (e.g.
``const Eigen::MatrixXd``), the same thing happens except that pybind11 also
sets the numpy array's ``writeable`` flag to false.
If you return an lvalue reference or pointer, the usual pybind11 rules apply,
as dictated by the binding function's return value policy (see the
documentation on :ref:`return_value_policies` for full details). That means,
without an explicit return value policy, lvalue references will be copied and
pointers will be managed by pybind11. In order to avoid copying, you should
explicitly specify an appropriate return value policy, as in the following
example:
.. code-block:: cpp
class MyClass {
Eigen::MatrixXd big_mat = Eigen::MatrixXd::Zero(10000, 10000);
public:
Eigen::MatrixXd &getMatrix() { return big_mat; }
const Eigen::MatrixXd &viewMatrix() { return big_mat; }
};
// Later, in binding code:
py::class_<MyClass>(m, "MyClass")
.def(py::init<>())
.def("copy_matrix", &MyClass::getMatrix) // Makes a copy!
.def("get_matrix", &MyClass::getMatrix, py::return_value_policy::reference_internal)
.def("view_matrix", &MyClass::viewMatrix, py::return_value_policy::reference_internal)
;
.. code-block:: python
a = MyClass()
m = a.get_matrix() # flags.writeable = True, flags.owndata = False
v = a.view_matrix() # flags.writeable = False, flags.owndata = False
c = a.copy_matrix() # flags.writeable = True, flags.owndata = True
# m[5,6] and v[5,6] refer to the same element, c[5,6] does not.
Note in this example that ``py::return_value_policy::reference_internal`` is
used to tie the life of the MyClass object to the life of the returned arrays.
You may also return an ``Eigen::Ref``, ``Eigen::Map`` or other map-like Eigen
object (for example, the return value of ``matrix.block()`` and related
methods) that map into a dense Eigen type. When doing so, the default
behaviour of pybind11 is to simply reference the returned data: you must take
care to ensure that this data remains valid! You may ask pybind11 to
explicitly *copy* such a return value by using the
``py::return_value_policy::copy`` policy when binding the function. You may
also use ``py::return_value_policy::reference_internal`` or a
``py::keep_alive`` to ensure the data stays valid as long as the returned numpy
array does.
When returning such a reference or map, pybind11 additionally respects the
readonly-status of the returned value, marking the numpy array as non-writeable
if the reference or map was itself read-only.
.. note::
Sparse types are always copied when returned.
.. _storage_orders:
Storage orders
==============
Passing arguments via ``Eigen::Ref`` has some limitations that you must be
aware of in order to effectively pass matrices by reference. First and
foremost is that the default ``Eigen::Ref<MatrixType>`` class requires
contiguous storage along columns (for column-major types, the default in Eigen)
or rows if ``MatrixType`` is specifically an ``Eigen::RowMajor`` storage type.
The former, Eigen's default, is incompatible with ``numpy``'s default row-major
storage, and so you will not be able to pass numpy arrays to Eigen by reference
without making one of two changes.
(Note that this does not apply to vectors (or column or row matrices): for such
types the "row-major" and "column-major" distinction is meaningless).
The first approach is to change the use of ``Eigen::Ref<MatrixType>`` to the
more general ``Eigen::Ref<MatrixType, 0, Eigen::Stride<Eigen::Dynamic,
Eigen::Dynamic>>`` (or similar type with a fully dynamic stride type in the
third template argument). Since this is a rather cumbersome type, pybind11
provides a ``py::EigenDRef<MatrixType>`` type alias for your convenience (along
with EigenDMap for the equivalent Map, and EigenDStride for just the stride
type).
This type allows Eigen to map into any arbitrary storage order. This is not
the default in Eigen for performance reasons: contiguous storage allows
vectorization that cannot be done when storage is not known to be contiguous at
compile time. The default ``Eigen::Ref`` stride type allows non-contiguous
storage along the outer dimension (that is, the rows of a column-major matrix
or columns of a row-major matrix), but not along the inner dimension.
This type, however, has the added benefit of also being able to map numpy array
slices. For example, the following (contrived) example uses Eigen with a numpy
slice to multiply by 2 all coefficients that are both on even rows (0, 2, 4,
...) and in columns 2, 5, or 8:
.. code-block:: cpp
m.def("scale", [](py::EigenDRef<Eigen::MatrixXd> m, double c) { m *= c; });
.. code-block:: python
# myarray = np.array(...)
scale(myarray[0::2, 2:9:3], 2.0)
The second approach to avoid copying is more intrusive: rearranging the
underlying data types to not run into the non-contiguous storage problem in the
first place. In particular, that means using matrices with ``Eigen::RowMajor``
storage, where appropriate, such as:
.. code-block:: cpp
using RowMatrixXd = Eigen::Matrix<double, Eigen::Dynamic, Eigen::Dynamic, Eigen::RowMajor>;
// Use RowMatrixXd instead of MatrixXd
Now bound functions accepting ``Eigen::Ref<RowMatrixXd>`` arguments will be
callable with numpy's (default) arrays without involving a copy.
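For instance (a sketch; the function name is hypothetical), the following
modifies a default C-ordered ``float64`` numpy array in place, with no copy:
.. code-block:: cpp
    using RowMatrixXd = Eigen::Matrix<double, Eigen::Dynamic, Eigen::Dynamic, Eigen::RowMajor>;
    m.def("add_one", [](Eigen::Ref<RowMatrixXd> m) {
        m.array() += 1.0; // visible on the Python side, no temporary is created
    });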
You can, alternatively, change the storage order that numpy arrays use by
adding the ``order='F'`` option when creating an array:
.. code-block:: python
myarray = np.array(source, order="F")
Such an object will be passable to a bound function accepting an
``Eigen::Ref<MatrixXd>`` (or similar column-major Eigen type).
One major caveat with this approach, however, is that it is not entirely as
easy as simply flipping all Eigen or numpy usage from one to the other: some
operations may alter the storage order of a numpy array. For example, ``a2 =
array.transpose()`` results in ``a2`` being a view of ``array`` that references
the same data, but in the opposite storage order!
While this approach allows fully optimized vectorized calculations in Eigen, it
cannot be used with array slices, unlike the first approach.
When *returning* a matrix to Python (either a regular matrix, a reference via
``Eigen::Ref<>``, or a map/block into a matrix), no special storage
consideration is required: the created numpy array will have the required
stride that allows numpy to properly interpret the array, whatever its storage
order.
Failing rather than copying
===========================
The default behaviour when binding ``Eigen::Ref<const MatrixType>`` Eigen
references is to copy matrix values when passed a numpy array that does not
conform to the element type of ``MatrixType`` or does not have a compatible
stride layout. If you want to explicitly avoid copying in such a case, you
should bind arguments using the ``py::arg().noconvert()`` annotation (as
described in the :ref:`nonconverting_arguments` documentation).
The following example shows arguments that don't allow data
copying to take place:
.. code-block:: cpp
// The method and function to be bound:
class MyClass {
// ...
double some_method(const Eigen::Ref<const MatrixXd> &matrix) { /* ... */ }
};
float some_function(const Eigen::Ref<const MatrixXf> &big,
const Eigen::Ref<const MatrixXf> &small) {
// ...
}
// The associated binding code:
using namespace pybind11::literals; // for "arg"_a
py::class_<MyClass>(m, "MyClass")
// ... other class definitions
.def("some_method", &MyClass::some_method, py::arg().noconvert());
m.def("some_function", &some_function,
"big"_a.noconvert(), // <- Don't allow copying for this arg
"small"_a // <- This one can be copied if needed
);
With the above binding code, attempting to call the ``some_method(m)``
method on a ``MyClass`` object, or attempting to call ``some_function(m, m2)``
will raise a ``RuntimeError`` rather than making a temporary copy of the array.
It will, however, allow the ``m2`` argument to be copied into a temporary if
necessary.
Note that explicitly specifying ``.noconvert()`` is not required for *mutable*
Eigen references (e.g. ``Eigen::Ref<MatrixXd>`` without ``const`` on the
``MatrixXd``): mutable references will never be called with a temporary copy.
Vectors versus column/row matrices
==================================
Eigen and numpy have fundamentally different notions of a vector. In Eigen, a
vector is simply a matrix with the number of columns or rows set to 1 at
compile time (for a column vector or row vector, respectively). NumPy, in
contrast, has comparable 2-dimensional 1xN and Nx1 arrays, but *also* has
1-dimensional arrays of size N.
When passing a 2-dimensional 1xN or Nx1 array to Eigen, the Eigen type must
have matching dimensions: That is, you cannot pass a 2-dimensional Nx1 numpy
array to an Eigen value expecting a row vector, or a 1xN numpy array as a
column vector argument.
On the other hand, pybind11 allows you to pass 1-dimensional arrays of length N
as Eigen parameters. If the Eigen type can hold a column vector of length N it
will be passed as such a column vector. If not, but the Eigen type constraints
will accept a row vector, it will be passed as a row vector. (The column
vector takes precedence when both are supported, for example, when passing a
1D numpy array to a MatrixXd argument). Note that the type need not be
explicitly a vector: it is permitted to pass a 1D numpy array of size 5 to an
Eigen ``Matrix<double, Dynamic, 5>``: you would end up with a 1x5 Eigen matrix.
Passing the same to an ``Eigen::MatrixXd`` would result in a 5x1 Eigen matrix.
When returning an Eigen vector to numpy, the conversion is ambiguous: a row
vector of length 4 could be returned as either a 1D array of length 4, or as a
2D array of size 1x4. When encountering such a situation, pybind11 compromises
by considering the returned Eigen type: if it is a compile-time vector--that
is, the type has either the number of rows or columns set to 1 at compile
time--pybind11 converts to a 1D numpy array when returning the value. For
instances that are a vector only at run-time (e.g. ``MatrixXd``,
``Matrix<float, Dynamic, 4>``), pybind11 returns the vector as a 2D array to
numpy. If this isn't what you want, you can use ``array.reshape(...)`` to get
a view of the same data in the desired dimensions.
.. seealso::
The file :file:`tests/test_eigen.cpp` contains a complete example that
shows how to pass Eigen sparse and dense data types in more detail.
| Abies | /Abies-0.0.5.tar.gz/Abies-0.0.5/extern/pybind11/docs/advanced/cast/eigen.rst | eigen.rst |
Chrono
======
When including the additional header file :file:`pybind11/chrono.h`, conversions
from C++11 chrono datatypes to python datetime objects are automatically enabled.
This header also enables conversions of python floats (often from sources such
as ``time.monotonic()``, ``time.perf_counter()`` and ``time.process_time()``)
into durations.
An overview of clocks in C++11
------------------------------
A point of confusion when using these conversions is the differences between
clocks provided in C++11. There are three clock types defined by the C++11
standard and users can define their own if needed. Each of these clocks has
different properties and, when converted to and from python, will give different
results.
The first clock defined by the standard is ``std::chrono::system_clock``. This
clock measures the current date and time. However, this clock changes with
updates to the operating system time. For example, if your time is synchronised
with a time server this clock will change. This makes this clock a poor choice
for timing purposes but good for measuring the wall time.
The second clock defined in the standard is ``std::chrono::steady_clock``.
This clock ticks at a steady rate and is never adjusted. This makes it excellent
for timing purposes; however, the value in this clock does not correspond to the
current date and time. Often this clock will be the amount of time your system
has been on, although it does not have to be. This clock will never be the same
clock as the system clock as the system clock can change but steady clocks
cannot.
The third clock defined in the standard is ``std::chrono::high_resolution_clock``.
This clock is the clock that has the highest resolution out of the clocks in the
system. It is normally a typedef to either the system clock or the steady clock
but can be its own independent clock. This is important because, when using
these conversions, the type you get in python for this clock might be different
depending on the system.
If it is a typedef of the system clock, python will get datetime objects, but if
it is a different clock they will be timedelta objects.
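As a rough sketch of the difference (the function names are hypothetical):
.. code-block:: cpp
    #include <pybind11/chrono.h>
    // Arrives in Python as a naive datetime.datetime in local time
    m.def("system_now", []() { return std::chrono::system_clock::now(); });
    // Arrives in Python as a datetime.timedelta measured from the
    // steady clock's (unspecified) epoch
    m.def("steady_now", []() { return std::chrono::steady_clock::now(); });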
Provided conversions
--------------------
.. rubric:: C++ to Python
- ``std::chrono::system_clock::time_point`` → ``datetime.datetime``
System clock times are converted to python datetime instances. They are
in the local timezone, but do not have any timezone information attached
to them (they are naive datetime objects).
- ``std::chrono::duration`` → ``datetime.timedelta``
Durations are converted to timedeltas; any precision in the duration
finer than microseconds is lost by rounding towards zero.
- ``std::chrono::[other_clocks]::time_point`` → ``datetime.timedelta``
Any clock time that is not the system clock is converted to a time delta.
This timedelta measures the time from the clock's epoch to now.
.. rubric:: Python to C++
- ``datetime.datetime`` or ``datetime.date`` or ``datetime.time`` → ``std::chrono::system_clock::time_point``
Date/time objects are converted into system clock timepoints. Any
timezone information is ignored and the type is treated as a naive
object.
- ``datetime.timedelta`` → ``std::chrono::duration``
Time deltas are converted into durations with microsecond precision.
- ``datetime.timedelta`` → ``std::chrono::[other_clocks]::time_point``
Time deltas that are converted into clock timepoints are treated as
the amount of time from the start of the clock's epoch.
- ``float`` → ``std::chrono::duration``
Floats that are passed to C++ as durations will be interpreted as a number of
seconds. These will be converted to the duration using ``duration_cast``
from the float (see the sketch after this list).
- ``float`` → ``std::chrono::[other_clocks]::time_point``
Floats that are passed to C++ as time points will be interpreted as the
number of seconds from the start of the clock's epoch.
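A minimal sketch of the float-to-duration conversion mentioned above (the
function name is hypothetical):
.. code-block:: cpp
    #include <pybind11/chrono.h>
    // May be called from Python with a float number of seconds
    // (e.g. time.perf_counter() - start) or with a datetime.timedelta.
    m.def("to_milliseconds", [](std::chrono::duration<double> d) {
        return std::chrono::duration_cast<std::chrono::milliseconds>(d).count();
    });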
| Abies | /Abies-0.0.5.tar.gz/Abies-0.0.5/extern/pybind11/docs/advanced/cast/chrono.rst | chrono.rst |
STL containers
##############
Automatic conversion
====================
When including the additional header file :file:`pybind11/stl.h`, conversions
between ``std::vector<>``/``std::deque<>``/``std::list<>``/``std::array<>``/``std::valarray<>``,
``std::set<>``/``std::unordered_set<>``, and
``std::map<>``/``std::unordered_map<>`` and the Python ``list``, ``set`` and
``dict`` data structures are automatically enabled. The types ``std::pair<>``
and ``std::tuple<>`` are already supported out of the box with just the core
:file:`pybind11/pybind11.h` header.
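For example (a minimal sketch; the function name is hypothetical), once the
header is included, a signature like the following can be called with a plain
Python list of strings, and the returned map arrives in Python as a ``dict``:
.. code-block:: cpp
    #include <pybind11/stl.h>
    m.def("histogram", [](const std::vector<std::string> &words) {
        std::map<std::string, int> counts; // copied out as a new Python dict
        for (const auto &w : words)        // words was copied in from the list
            counts[w]++;
        return counts;
    });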
The major downside of these implicit conversions is that containers must be
converted (i.e. copied) on every Python->C++ and C++->Python transition, which
can have implications on the program semantics and performance. Please read the
next sections for more details and alternative approaches that avoid this.
.. note::
Arbitrary nesting of any of these types is possible.
.. seealso::
The file :file:`tests/test_stl.cpp` contains a complete
example that demonstrates how to pass STL data types in more detail.
.. _cpp17_container_casters:
C++17 library containers
========================
The :file:`pybind11/stl.h` header also includes support for ``std::optional<>``
and ``std::variant<>``. These require a C++17 compiler and standard library.
In C++14 mode, ``std::experimental::optional<>`` is supported if available.
Various versions of these containers also exist for C++11 (e.g. in Boost).
pybind11 provides an easy way to specialize the ``type_caster`` for such
types:
.. code-block:: cpp
// `boost::optional` as an example -- can be any `std::optional`-like container
namespace pybind11 { namespace detail {
template <typename T>
struct type_caster<boost::optional<T>> : optional_caster<boost::optional<T>> {};
}}
The above should be placed in a header file and included in all translation units
where automatic conversion is needed. Similarly, a specialization can be provided
for custom variant types:
.. code-block:: cpp
// `boost::variant` as an example -- can be any `std::variant`-like container
namespace pybind11 { namespace detail {
template <typename... Ts>
struct type_caster<boost::variant<Ts...>> : variant_caster<boost::variant<Ts...>> {};
// Specifies the function used to visit the variant -- `apply_visitor` instead of `visit`
template <>
struct visit_helper<boost::variant> {
template <typename... Args>
static auto call(Args &&...args) -> decltype(boost::apply_visitor(args...)) {
return boost::apply_visitor(args...);
}
};
}} // namespace pybind11::detail
The ``visit_helper`` specialization is not required if your ``name::variant`` provides
a ``name::visit()`` function. For any other function name, the specialization must be
included to tell pybind11 how to visit the variant.
.. warning::
When converting a ``variant`` type, pybind11 follows the same rules as when
determining which function overload to call (:ref:`overload_resolution`), and
so the same caveats hold. In particular, the order in which the ``variant``'s
alternatives are listed is important, since pybind11 will try conversions in
this order. This means that, for example, when converting ``variant<int, bool>``,
the ``bool`` variant will never be selected, as any Python ``bool`` is already
an ``int`` and is convertible to a C++ ``int``. Changing the order of alternatives
(and using ``variant<bool, int>``, in this example) provides a solution.
.. note::
pybind11 only supports the modern implementation of ``boost::variant``
which makes use of variadic templates. This requires Boost 1.56 or newer.
Additionally, on Windows, MSVC 2017 is required because ``boost::variant``
falls back to the old non-variadic implementation on MSVC 2015.
.. _opaque:
Making opaque types
===================
pybind11 heavily relies on a template matching mechanism to convert parameters
and return values that are constructed from STL data types such as vectors,
linked lists, hash tables, etc. This even works in a recursive manner, for
instance to deal with lists of hash maps of pairs of elementary and custom
types, etc.
However, a fundamental limitation of this approach is that internal conversions
between Python and C++ types involve a copy operation that prevents
pass-by-reference semantics. What does this mean?
Suppose we bind the following function
.. code-block:: cpp
void append_1(std::vector<int> &v) {
v.push_back(1);
}
and call it from Python, the following happens:
.. code-block:: pycon
>>> v = [5, 6]
>>> append_1(v)
>>> print(v)
[5, 6]
As you can see, when passing STL data structures by reference, modifications
are not propagated back to the Python side. A similar situation arises when
exposing STL data structures using the ``def_readwrite`` or ``def_readonly``
functions:
.. code-block:: cpp
/* ... definition ... */
class MyClass {
std::vector<int> contents;
};
/* ... binding code ... */
py::class_<MyClass>(m, "MyClass")
.def(py::init<>())
.def_readwrite("contents", &MyClass::contents);
In this case, properties can be read and written in their entirety. However, an
``append`` operation involving such a list type has no effect:
.. code-block:: pycon
>>> m = MyClass()
>>> m.contents = [5, 6]
>>> print(m.contents)
[5, 6]
>>> m.contents.append(7)
>>> print(m.contents)
[5, 6]
Finally, the involved copy operations can be costly when dealing with very
large lists. To deal with all of the above situations, pybind11 provides a
macro named ``PYBIND11_MAKE_OPAQUE(T)`` that disables the template-based
conversion machinery of types, thus rendering them *opaque*. The contents of
opaque objects are never inspected or extracted, hence they *can* be passed by
reference. For instance, to turn ``std::vector<int>`` into an opaque type, add
the declaration
.. code-block:: cpp
PYBIND11_MAKE_OPAQUE(std::vector<int>);
before any binding code (e.g. invocations to ``class_::def()``, etc.). This
macro must be specified at the top level (and outside of any namespaces), since
it adds a template instantiation of ``type_caster``. If your binding code consists of
multiple compilation units, it must be present in every file (typically via a
common header) preceding any usage of ``std::vector<int>``. Opaque types must
also have a corresponding ``class_`` declaration to associate them with a name
in Python, and to define a set of available operations, e.g.:
.. code-block:: cpp
py::class_<std::vector<int>>(m, "IntVector")
.def(py::init<>())
.def("clear", &std::vector<int>::clear)
.def("pop_back", &std::vector<int>::pop_back)
.def("__len__", [](const std::vector<int> &v) { return v.size(); })
.def("__iter__", [](std::vector<int> &v) {
return py::make_iterator(v.begin(), v.end());
}, py::keep_alive<0, 1>()) /* Keep vector alive while iterator is used */
// ....
.. seealso::
The file :file:`tests/test_opaque_types.cpp` contains a complete
example that demonstrates how to create and expose opaque types using
pybind11 in more detail.
.. _stl_bind:
Binding STL containers
======================
The ability to expose STL containers as native Python objects is a fairly
common request, hence pybind11 also provides an optional header file named
:file:`pybind11/stl_bind.h` that does exactly this. The mapped containers try
to match the behavior of their native Python counterparts as much as possible.
The following example showcases usage of :file:`pybind11/stl_bind.h`:
.. code-block:: cpp
// Don't forget this
#include <pybind11/stl_bind.h>
PYBIND11_MAKE_OPAQUE(std::vector<int>);
PYBIND11_MAKE_OPAQUE(std::map<std::string, double>);
// ...
// later in binding code:
py::bind_vector<std::vector<int>>(m, "VectorInt");
py::bind_map<std::map<std::string, double>>(m, "MapStringDouble");
When binding STL containers pybind11 considers the types of the container's
elements to decide whether the container should be confined to the local module
(via the :ref:`module_local` feature). If the container element types are
anything other than already-bound custom types bound without
``py::module_local()`` the container binding will have ``py::module_local()``
applied. This includes converting types such as numeric types, strings, Eigen
types; and types that have not yet been bound at the time of the stl container
binding. This module-local binding is designed to avoid potential conflicts
between module bindings (for example, from two separate modules each attempting
to bind ``std::vector<int>`` as a python type).
It is possible to override this behavior to force a definition to be either
module-local or global. To do so, you can pass the attributes
``py::module_local()`` (to make the binding module-local) or
``py::module_local(false)`` (to make the binding global) into the
``py::bind_vector`` or ``py::bind_map`` arguments:
.. code-block:: cpp
py::bind_vector<std::vector<int>>(m, "VectorInt", py::module_local(false));
Note, however, that such a global binding would make it impossible to load this
module at the same time as any other pybind module that also attempts to bind
the same container type (``std::vector<int>`` in the above example).
See :ref:`module_local` for more details on module-local bindings.
.. seealso::
The file :file:`tests/test_stl_binders.cpp` shows how to use the
convenience STL container wrappers.
| Abies | /Abies-0.0.5.tar.gz/Abies-0.0.5/extern/pybind11/docs/advanced/cast/stl.rst | stl.rst |
Custom type casters
===================
In very rare cases, applications may require custom type casters that cannot be
expressed using the abstractions provided by pybind11, thus requiring raw
Python C API calls. This is fairly advanced usage and should only be pursued by
experts who are familiar with the intricacies of Python reference counting.
The following snippets demonstrate how this works for a very simple ``inty``
type that should be convertible from Python types that provide a
``__int__(self)`` method.
.. code-block:: cpp
struct inty { long long_value; };
void print(inty s) {
std::cout << s.long_value << std::endl;
}
The following Python snippet demonstrates the intended usage from the Python side:
.. code-block:: python
class A:
def __int__(self):
return 123
from example import print
print(A())
To register the necessary conversion routines, it is necessary to add an
instantiation of the ``pybind11::detail::type_caster<T>`` template.
Although this is an implementation detail, adding an instantiation of this
type is explicitly allowed.
.. code-block:: cpp
namespace pybind11 { namespace detail {
template <> struct type_caster<inty> {
public:
/**
* This macro establishes the name 'inty' in
* function signatures and declares a local variable
* 'value' of type inty
*/
PYBIND11_TYPE_CASTER(inty, const_name("inty"));
/**
* Conversion part 1 (Python->C++): convert a PyObject into an inty
* instance or return false upon failure. The second argument
* indicates whether implicit conversions should be applied.
*/
bool load(handle src, bool) {
/* Extract PyObject from handle */
PyObject *source = src.ptr();
/* Try converting into a Python integer value */
PyObject *tmp = PyNumber_Long(source);
if (!tmp)
return false;
/* Now try to convert into a C++ int */
value.long_value = PyLong_AsLong(tmp);
Py_DECREF(tmp);
/* Ensure return code was OK (to avoid out-of-range errors etc) */
return !(value.long_value == -1 && !PyErr_Occurred());
}
/**
* Conversion part 2 (C++ -> Python): convert an inty instance into
* a Python object. The second and third arguments are used to
* indicate the return value policy and parent object (for
* ``return_value_policy::reference_internal``) and are generally
* ignored by implicit casters.
*/
static handle cast(inty src, return_value_policy /* policy */, handle /* parent */) {
return PyLong_FromLong(src.long_value);
}
};
}} // namespace pybind11::detail
.. note::
A ``type_caster<T>`` defined with ``PYBIND11_TYPE_CASTER(T, ...)`` requires
that ``T`` is default-constructible (``value`` is first default constructed
and then ``load()`` assigns to it).
.. warning::
When using custom type casters, it's important to declare them consistently
in every compilation unit of the Python extension module. Otherwise,
undefined behavior can ensue.
| Abies | /Abies-0.0.5.tar.gz/Abies-0.0.5/extern/pybind11/docs/advanced/cast/custom.rst | custom.rst |
Overview
########
.. rubric:: 1. Native type in C++, wrapper in Python
Exposing a custom C++ type using :class:`py::class_` was covered in detail
in the :doc:`/classes` section. There, the underlying data structure is
always the original C++ class while the :class:`py::class_` wrapper provides
a Python interface. Internally, when an object like this is sent from C++ to
Python, pybind11 will just add the outer wrapper layer over the native C++
object. Getting it back from Python is just a matter of peeling off the
wrapper.
.. rubric:: 2. Wrapper in C++, native type in Python
This is the exact opposite situation. Now, we have a type which is native to
Python, like a ``tuple`` or a ``list``. One way to get this data into C++ is
with the :class:`py::object` family of wrappers. These are explained in more
detail in the :doc:`/advanced/pycpp/object` section. We'll just give a quick
example here:
.. code-block:: cpp
void print_list(py::list my_list) {
for (auto item : my_list)
std::cout << item << " ";
}
.. code-block:: pycon
>>> print_list([1, 2, 3])
1 2 3
The Python ``list`` is not converted in any way -- it's just wrapped in a C++
:class:`py::list` class. At its core it's still a Python object. Copying a
:class:`py::list` will do the usual reference-counting like in Python.
Returning the object to Python will just remove the thin wrapper.
.. rubric:: 3. Converting between native C++ and Python types
In the previous two cases we had a native type in one language and a wrapper in
the other. Now, we have native types on both sides and we convert between them.
.. code-block:: cpp
void print_vector(const std::vector<int> &v) {
for (auto item : v)
std::cout << item << "\n";
}
.. code-block:: pycon
>>> print_vector([1, 2, 3])
1 2 3
In this case, pybind11 will construct a new ``std::vector<int>`` and copy each
element from the Python ``list``. The newly constructed object will be passed
to ``print_vector``. The same thing happens in the other direction: a new
``list`` is made to match the value returned from C++.
Lots of these conversions are supported out of the box, as shown in the table
below. They are very convenient, but keep in mind that these conversions are
fundamentally based on copying data. This is perfectly fine for small immutable
types but it may become quite expensive for large data structures. This can be
avoided by overriding the automatic conversion with a custom wrapper (i.e. the
above-mentioned approach 1). This requires some manual effort and more details
are available in the :ref:`opaque` section.
.. _conversion_table:
List of all builtin conversions
-------------------------------
The following basic data types are supported out of the box (some may require
an additional extension header to be included). To pass other data structures
as arguments and return values, refer to the section on binding :ref:`classes`.
+------------------------------------+---------------------------+-----------------------------------+
| Data type | Description | Header file |
+====================================+===========================+===================================+
| ``int8_t``, ``uint8_t`` | 8-bit integers | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``int16_t``, ``uint16_t`` | 16-bit integers | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``int32_t``, ``uint32_t`` | 32-bit integers | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``int64_t``, ``uint64_t`` | 64-bit integers | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``ssize_t``, ``size_t`` | Platform-dependent size | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``float``, ``double`` | Floating point types | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``bool`` | Two-state Boolean type | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``char`` | Character literal | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``char16_t`` | UTF-16 character literal | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``char32_t`` | UTF-32 character literal | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``wchar_t`` | Wide character literal | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``const char *`` | UTF-8 string literal | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``const char16_t *`` | UTF-16 string literal | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``const char32_t *`` | UTF-32 string literal | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``const wchar_t *`` | Wide string literal | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::string`` | STL dynamic UTF-8 string | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::u16string`` | STL dynamic UTF-16 string | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::u32string`` | STL dynamic UTF-32 string | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::wstring`` | STL dynamic wide string | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::string_view``, | STL C++17 string views | :file:`pybind11/pybind11.h` |
| ``std::u16string_view``, etc. | | |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::pair<T1, T2>`` | Pair of two custom types | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::tuple<...>`` | Arbitrary tuple of types | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::reference_wrapper<...>`` | Reference type wrapper | :file:`pybind11/pybind11.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::complex<T>`` | Complex numbers | :file:`pybind11/complex.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::array<T, Size>`` | STL static array | :file:`pybind11/stl.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::vector<T>`` | STL dynamic array | :file:`pybind11/stl.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::deque<T>`` | STL double-ended queue | :file:`pybind11/stl.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::valarray<T>`` | STL value array | :file:`pybind11/stl.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::list<T>`` | STL linked list | :file:`pybind11/stl.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::map<T1, T2>`` | STL ordered map | :file:`pybind11/stl.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::unordered_map<T1, T2>`` | STL unordered map | :file:`pybind11/stl.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::set<T>`` | STL ordered set | :file:`pybind11/stl.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::unordered_set<T>`` | STL unordered set | :file:`pybind11/stl.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::optional<T>`` | STL optional type (C++17) | :file:`pybind11/stl.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::experimental::optional<T>`` | STL optional type (exp.) | :file:`pybind11/stl.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::variant<...>`` | Type-safe union (C++17) | :file:`pybind11/stl.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::filesystem::path<T>`` | STL path (C++17) [#]_ | :file:`pybind11/stl/filesystem.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::function<...>`` | STL polymorphic function | :file:`pybind11/functional.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::chrono::duration<...>`` | STL time duration | :file:`pybind11/chrono.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``std::chrono::time_point<...>`` | STL date/time | :file:`pybind11/chrono.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``Eigen::Matrix<...>`` | Eigen: dense matrix | :file:`pybind11/eigen.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``Eigen::Map<...>`` | Eigen: mapped memory | :file:`pybind11/eigen.h` |
+------------------------------------+---------------------------+-----------------------------------+
| ``Eigen::SparseMatrix<...>`` | Eigen: sparse matrix | :file:`pybind11/eigen.h` |
+------------------------------------+---------------------------+-----------------------------------+
.. [#] ``std::filesystem::path`` is converted to ``pathlib.Path`` and
``os.PathLike`` is converted to ``std::filesystem::path``, but this requires
Python 3.6 (for ``__fspath__`` support).
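For instance (a minimal sketch; the function name is hypothetical), with
:file:`pybind11/stl/filesystem.h` included, a function can accept and return
``std::filesystem::path`` directly:
.. code-block:: cpp
    #include <pybind11/stl/filesystem.h>
    #include <filesystem>
    // Accepts pathlib.Path (and other path-like objects); the returned
    // path arrives in Python as a pathlib.Path.
    m.def("backup_name", [](const std::filesystem::path &p) {
        return p.parent_path() / (p.stem().string() + ".bak");
    });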
| Abies | /Abies-0.0.5.tar.gz/Abies-0.0.5/extern/pybind11/docs/advanced/cast/overview.rst | overview.rst |
# IMPORTANT: If you change this file in the pybind11 repo, also review
# setup_helpers.pyi for matching changes.
#
# If you copy this file in, you don't
# need the .pyi file; it's just an interface file for static type checkers.
import contextlib
import os
import platform
import shlex
import shutil
import sys
import sysconfig
import tempfile
import threading
import warnings
try:
from setuptools import Extension as _Extension
from setuptools.command.build_ext import build_ext as _build_ext
except ImportError:
from distutils.command.build_ext import build_ext as _build_ext
from distutils.extension import Extension as _Extension
import distutils.ccompiler
import distutils.errors
WIN = sys.platform.startswith("win32") and "mingw" not in sysconfig.get_platform()
PY2 = sys.version_info[0] < 3
MACOS = sys.platform.startswith("darwin")
STD_TMPL = "/std:c++{}" if WIN else "-std=c++{}"
# It is recommended to use PEP 518 builds if using this module. However, this
# file explicitly supports being copied into a user's project directory
# standalone, and pulling pybind11 with the deprecated setup_requires feature.
# If you copy the file, remember to add it to your MANIFEST.in, and add the current
# directory into your path if it sits beside your setup.py.
class Pybind11Extension(_Extension):
"""
Build a C++11+ Extension module with pybind11. This automatically adds the
recommended flags when you init the extension and assumes C++ sources - you
can further modify the options yourself.
The customizations are:
* ``/EHsc`` and ``/bigobj`` on Windows
* ``stdlib=libc++`` on macOS
* ``visibility=hidden`` and ``-g0`` on Unix
Finally, you can set ``cxx_std`` via constructor or afterwards to enable
flags for C++ std, and a few extra helper flags related to the C++ standard
level. It is _highly_ recommended you either set this, or use the provided
``build_ext``, which will search for the highest supported extension for
you if the ``cxx_std`` property is not set. Do not set the ``cxx_std``
property more than once, as flags are added when you set it. Set the
property to None to disable the addition of C++ standard flags.
If you want to add pybind11 headers manually, for example for an exact
git checkout, then set ``include_pybind11=False``.
Warning: do not use property-based access to the instance on Python 2 -
this is an ugly old-style class due to Distutils.
"""
# flags are prepended, so that they can be further overridden, e.g. by
# ``extra_compile_args=["-g"]``.
def _add_cflags(self, flags):
self.extra_compile_args[:0] = flags
def _add_ldflags(self, flags):
self.extra_link_args[:0] = flags
def __init__(self, *args, **kwargs):
self._cxx_level = 0
cxx_std = kwargs.pop("cxx_std", 0)
if "language" not in kwargs:
kwargs["language"] = "c++"
include_pybind11 = kwargs.pop("include_pybind11", True)
# Can't use super here because distutils has old-style classes in
# Python 2!
_Extension.__init__(self, *args, **kwargs)
# Include the installed package pybind11 headers
if include_pybind11:
# If using setup_requires, this fails the first time - that's okay
try:
import pybind11
pyinc = pybind11.get_include()
if pyinc not in self.include_dirs:
self.include_dirs.append(pyinc)
except ImportError:
pass
# Have to use the accessor manually to support Python 2 distutils
Pybind11Extension.cxx_std.__set__(self, cxx_std)
cflags = []
ldflags = []
if WIN:
cflags += ["/EHsc", "/bigobj"]
else:
cflags += ["-fvisibility=hidden"]
env_cflags = os.environ.get("CFLAGS", "")
env_cppflags = os.environ.get("CPPFLAGS", "")
c_cpp_flags = shlex.split(env_cflags) + shlex.split(env_cppflags)
if not any(opt.startswith("-g") for opt in c_cpp_flags):
cflags += ["-g0"]
if MACOS:
cflags += ["-stdlib=libc++"]
ldflags += ["-stdlib=libc++"]
self._add_cflags(cflags)
self._add_ldflags(ldflags)
@property
def cxx_std(self):
"""
The CXX standard level. If set, will add the required flags. If left
at 0, it will trigger an automatic search when pybind11's build_ext
is used. If None, will have no effect. Besides just the flags, this
may add a register warning/error fix for Python 2 or macos-min 10.9
or 10.14.
"""
return self._cxx_level
@cxx_std.setter
def cxx_std(self, level):
if self._cxx_level:
warnings.warn("You cannot safely change the cxx_level after setting it!")
# MSVC 2015 Update 3 and later only have 14 (and later 17) modes, so
# force a valid flag here.
if WIN and level == 11:
level = 14
self._cxx_level = level
if not level:
return
cflags = [STD_TMPL.format(level)]
ldflags = []
if MACOS and "MACOSX_DEPLOYMENT_TARGET" not in os.environ:
# C++17 requires a higher min version of macOS. An earlier version
# (10.12 or 10.13) can be set manually via environment variable if
# you are careful in your feature usage, but 10.14 is the safest
# setting for general use. However, never set higher than the
# current macOS version!
current_macos = tuple(int(x) for x in platform.mac_ver()[0].split(".")[:2])
desired_macos = (10, 9) if level < 17 else (10, 14)
macos_string = ".".join(str(x) for x in min(current_macos, desired_macos))
macosx_min = "-mmacosx-version-min=" + macos_string
cflags += [macosx_min]
ldflags += [macosx_min]
if PY2:
if WIN:
# Will be ignored on MSVC 2015, where C++17 is not supported so
# this flag is not valid.
cflags += ["/wd5033"]
elif level >= 17:
cflags += ["-Wno-register"]
elif level >= 14:
cflags += ["-Wno-deprecated-register"]
self._add_cflags(cflags)
self._add_ldflags(ldflags)
# Just in case someone clever tries to multithread
tmp_chdir_lock = threading.Lock()
cpp_cache_lock = threading.Lock()
@contextlib.contextmanager
def tmp_chdir():
"Prepare and enter a temporary directory, cleanup when done"
# Threadsafe
with tmp_chdir_lock:
olddir = os.getcwd()
try:
tmpdir = tempfile.mkdtemp()
os.chdir(tmpdir)
yield tmpdir
finally:
os.chdir(olddir)
shutil.rmtree(tmpdir)
# cf http://bugs.python.org/issue26689
def has_flag(compiler, flag):
"""
Return the flag if a flag name is supported on the
specified compiler, otherwise None (can be used as a boolean).
If multiple flags are passed, return the first that matches.
"""
with tmp_chdir():
fname = "flagcheck.cpp"
with open(fname, "w") as f:
# Don't trigger -Wunused-parameter.
f.write("int main (int, char **) { return 0; }")
try:
compiler.compile([fname], extra_postargs=[flag])
except distutils.errors.CompileError:
return False
return True
# Every call will cache the result
cpp_flag_cache = None
def auto_cpp_level(compiler):
"""
Return the max supported C++ std level (17, 14, or 11). Returns latest on Windows.
"""
if WIN:
return "latest"
global cpp_flag_cache
# If this has been previously calculated with the same args, return that
with cpp_cache_lock:
if cpp_flag_cache:
return cpp_flag_cache
levels = [17, 14, 11]
for level in levels:
if has_flag(compiler, STD_TMPL.format(level)):
with cpp_cache_lock:
cpp_flag_cache = level
return level
msg = "Unsupported compiler -- at least C++11 support is needed!"
raise RuntimeError(msg)
class build_ext(_build_ext): # noqa: N801
"""
Customized build_ext that allows an auto-search for the highest supported
C++ level for Pybind11Extension. This is only needed for the auto-search
for now, and is completely optional otherwise.
"""
def build_extensions(self):
"""
Build extensions, injecting C++ std for Pybind11Extension if needed.
"""
for ext in self.extensions:
if hasattr(ext, "_cxx_level") and ext._cxx_level == 0:
# Python 2 syntax - old-style distutils class
ext.__class__.cxx_std.__set__(ext, auto_cpp_level(self.compiler))
# Python 2 doesn't allow super here, since distutils uses old-style
# classes!
_build_ext.build_extensions(self)
def intree_extensions(paths, package_dir=None):
"""
Generate Pybind11Extensions from source files directly located in a Python
source tree.
``package_dir`` behaves as in ``setuptools.setup``. If unset, the Python
package root parent is determined as the first parent directory that does
not contain an ``__init__.py`` file.
"""
exts = []
for path in paths:
if package_dir is None:
parent, _ = os.path.split(path)
while os.path.exists(os.path.join(parent, "__init__.py")):
parent, _ = os.path.split(parent)
relname, _ = os.path.splitext(os.path.relpath(path, parent))
qualified_name = relname.replace(os.path.sep, ".")
exts.append(Pybind11Extension(qualified_name, [path]))
else:
found = False
for prefix, parent in package_dir.items():
if path.startswith(parent):
found = True
relname, _ = os.path.splitext(os.path.relpath(path, parent))
qualified_name = relname.replace(os.path.sep, ".")
if prefix:
qualified_name = prefix + "." + qualified_name
exts.append(Pybind11Extension(qualified_name, [path]))
if not found:
raise ValueError(
"path {} is not a child of any of the directories listed "
"in 'package_dir' ({})".format(path, package_dir)
)
return exts
def naive_recompile(obj, src):
"""
This will recompile only if the source file changes. It does not check
header files, so a more advanced function or Ccache is better if you have
editable header files in your package.
"""
return os.stat(obj).st_mtime < os.stat(src).st_mtime
def no_recompile(obj, src):
"""
This is the safest but slowest choice (and is the default) - will always
recompile sources.
"""
return True
# Optional parallel compile utility
# inspired by: http://stackoverflow.com/questions/11013851/speeding-up-build-process-with-distutils
# and: https://github.com/tbenthompson/cppimport/blob/stable/cppimport/build_module.py
# and NumPy's parallel distutils module:
# https://github.com/numpy/numpy/blob/master/numpy/distutils/ccompiler.py
class ParallelCompile(object):
"""
Make a parallel compile function. Inspired by
numpy.distutils.ccompiler.CCompiler_compile and cppimport.
This takes several arguments that allow you to customize the compile
function created:
envvar:
Set an environment variable to control the compilation threads, like
NPY_NUM_BUILD_JOBS
default:
0 will automatically multithread, or 1 will only multithread if the
envvar is set.
max:
The limit for automatic multithreading if non-zero
needs_recompile:
A function of (obj, src) that returns True when recompile is needed. No
effect in isolated mode; use ccache instead, see
https://github.com/matplotlib/matplotlib/issues/1507/
To use::
ParallelCompile("NPY_NUM_BUILD_JOBS").install()
or::
with ParallelCompile("NPY_NUM_BUILD_JOBS"):
setup(...)
By default, this assumes all files need to be recompiled. A smarter
function can be provided via needs_recompile. If the output has not yet
been generated, the compile will always run, and this function is not
called.
"""
__slots__ = ("envvar", "default", "max", "_old", "needs_recompile")
def __init__(self, envvar=None, default=0, max=0, needs_recompile=no_recompile):
self.envvar = envvar
self.default = default
self.max = max
self.needs_recompile = needs_recompile
self._old = []
def function(self):
"""
Builds a function object usable as distutils.ccompiler.CCompiler.compile.
"""
def compile_function(
compiler,
sources,
output_dir=None,
macros=None,
include_dirs=None,
debug=0,
extra_preargs=None,
extra_postargs=None,
depends=None,
):
# These lines are directly from distutils.ccompiler.CCompiler
macros, objects, extra_postargs, pp_opts, build = compiler._setup_compile(
output_dir, macros, include_dirs, sources, depends, extra_postargs
)
cc_args = compiler._get_cc_args(pp_opts, debug, extra_preargs)
# The number of threads; start with default.
threads = self.default
# Determine the number of compilation threads, unless set by an environment variable.
if self.envvar is not None:
threads = int(os.environ.get(self.envvar, self.default))
def _single_compile(obj):
try:
src, ext = build[obj]
except KeyError:
return
if not os.path.exists(obj) or self.needs_recompile(obj, src):
compiler._compile(obj, src, ext, cc_args, extra_postargs, pp_opts)
try:
# Importing .synchronize checks for platforms that have some multiprocessing
# capabilities but lack semaphores, such as AWS Lambda and Android Termux.
import multiprocessing.synchronize
from multiprocessing.pool import ThreadPool
except ImportError:
threads = 1
if threads == 0:
try:
threads = multiprocessing.cpu_count()
threads = self.max if self.max and self.max < threads else threads
except NotImplementedError:
threads = 1
if threads > 1:
pool = ThreadPool(threads)
# In Python 2, ThreadPool can't be used as a context manager.
# Once we are no longer supporting it, this can be 'with pool:'
try:
for _ in pool.imap_unordered(_single_compile, objects):
pass
finally:
pool.terminate()
else:
for ob in objects:
_single_compile(ob)
return objects
return compile_function
def install(self):
distutils.ccompiler.CCompiler.compile = self.function()
return self
def __enter__(self):
self._old.append(distutils.ccompiler.CCompiler.compile)
return self.install()
def __exit__(self, *args):
distutils.ccompiler.CCompiler.compile = self._old.pop()
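Taken together, the helpers above are meant to be driven from a project's `setup.py`. The sketch below shows one plausible way to wire them up; the module name `example`, the source path `src/main.cpp`, and the choice of `NPY_NUM_BUILD_JOBS` as the thread-count variable are illustrative assumptions, not anything mandated by this file.

```python
# setup.py -- minimal sketch using the helpers defined above.
from setuptools import setup

from pybind11.setup_helpers import (
    ParallelCompile,
    Pybind11Extension,
    build_ext,
    naive_recompile,
)

# Opt in to parallel builds; only recompile when a source file is newer than
# its object file (headers are not tracked -- see naive_recompile above).
ParallelCompile("NPY_NUM_BUILD_JOBS", needs_recompile=naive_recompile).install()

ext_modules = [
    # Leaving cxx_std at its default lets the customized build_ext auto-detect
    # the highest supported C++ standard at build time.
    Pybind11Extension("example", ["src/main.cpp"]),
]

setup(
    name="example",
    version="0.0.1",
    ext_modules=ext_modules,
    cmdclass={"build_ext": build_ext},
)
```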
<!--
Title (above): please place [branch_name] at the beginning if you are targeting a branch other than master. *Do not target stable*.
It is recommended to use conventional commit format, see conventionalcommits.org, but not required.
-->
## Description
<!-- Include relevant issues or PRs here, describe what changed and why -->
## Suggested changelog entry:
<!-- Fill in the below block with the expected RestructuredText entry. Delete if no entry needed;
but do not delete header or rst block if an entry is needed! Will be collected via a script. -->
```rst
```
<!-- If the upgrade guide needs updating, note that here too -->
Thank you for your interest in this project! Please refer to the following
sections on how to contribute code and bug reports.
### Reporting bugs
Before submitting a question or bug report, please take a moment of your time
and ensure that your issue isn't already discussed in the project documentation
provided at [pybind11.readthedocs.org][] or in the [issue tracker][]. You can
also check [gitter][] to see if it came up before.
Assuming that you have identified a previously unknown problem or an important
question, it's essential that you submit a self-contained and minimal piece of
code that reproduces the problem. In other words: no external dependencies,
isolate the function(s) that cause breakage, submit matched and complete C++
and Python snippets that can be easily compiled and run in isolation; or
ideally make a small PR with a failing test case that can be used as a starting
point.
## Pull requests
Contributions are submitted, reviewed, and accepted using GitHub pull requests.
Please refer to [this article][using pull requests] for details and adhere to
the following rules to make the process as smooth as possible:
* Make a new branch for every feature you're working on.
* Make small and clean pull requests that are easy to review but make sure they
do add value by themselves.
* Add tests for any new functionality and run the test suite (`cmake --build
build --target pytest`) to ensure that no existing features break.
* Please run [`pre-commit`][pre-commit] to check your code matches the
project style. (Note that `gawk` is required.) Use `pre-commit run
--all-files` before committing (or use installed-mode, check pre-commit docs)
to verify your code passes before pushing to save time.
* This project has a strong focus on providing general solutions using a
minimal amount of code, thus small pull requests are greatly preferred.
### Licensing of contributions
pybind11 is provided under a BSD-style license that can be found in the
``LICENSE`` file. By using, distributing, or contributing to this project, you
agree to the terms and conditions of this license.
You are under no obligation whatsoever to provide any bug fixes, patches, or
upgrades to the features, functionality or performance of the source code
("Enhancements") to anyone; however, if you choose to make your Enhancements
available either publicly, or directly to the author of this software, without
imposing a separate written license agreement for such Enhancements, then you
hereby grant the following license: a non-exclusive, royalty-free perpetual
license to install, use, modify, prepare derivative works, incorporate into
other computer software, distribute, and sublicense such enhancements or
derivative works thereof, in binary and source code form.
## Development of pybind11
### Quick setup
To setup a quick development environment, use [`nox`](https://nox.thea.codes).
This will allow you to do some common tasks with minimal setup effort, but will
take more time to run and be less flexible than a full development environment.
If you use [`pipx run nox`](https://pipx.pypa.io), you don't even need to
install `nox`. Examples:
```bash
# List all available sessions
nox -l
# Run linters
nox -s lint
# Run tests on Python 3.9
nox -s tests-3.9
# Build and preview docs
nox -s docs -- serve
# Build SDists and wheels
nox -s build
```
### Full setup
To setup an ideal development environment, run the following commands on a
system with CMake 3.14+:
```bash
python3 -m venv venv
source venv/bin/activate
pip install -r tests/requirements.txt
cmake -S . -B build -DDOWNLOAD_CATCH=ON -DDOWNLOAD_EIGEN=ON
cmake --build build -j4
```
Tips:
* You can use `virtualenv` (from PyPI) instead of `venv` (which is Python 3
only).
* You can select any name for your environment folder; if it contains "env" it
will be ignored by git.
* If you don’t have CMake 3.14+, just add “cmake” to the pip install command.
* You can use `-DPYBIND11_FINDPYTHON=ON` to use FindPython on CMake 3.12+
* In classic mode, you may need to set `-DPYTHON_EXECUTABLE=/path/to/python`.
FindPython uses `-DPython_ROOT_DIR=/path/to` or
`-DPython_EXECUTABLE=/path/to/python`.
### Configuration options
In CMake, configuration options are given with “-D”. Options are stored in the
build directory, in the `CMakeCache.txt` file, so they are remembered for each
build directory. Two selections are special - the generator, given with `-G`,
and the compiler, which is selected based on environment variables `CXX` and
similar, or `-DCMAKE_CXX_COMPILER=`. Unlike the others, these cannot be changed
after the initial run.
The valid options are:
* `-DCMAKE_BUILD_TYPE`: Release, Debug, MinSizeRel, RelWithDebInfo
* `-DPYBIND11_FINDPYTHON=ON`: Use CMake 3.12+’s FindPython instead of the
classic, deprecated, custom FindPythonLibs
* `-DPYBIND11_NOPYTHON=ON`: Disable all Python searching (disables tests)
* `-DBUILD_TESTING=ON`: Enable the tests
* `-DDOWNLOAD_CATCH=ON`: Download catch to build the C++ tests
* `-DDOWNLOAD_EIGEN=ON`: Download Eigen for the NumPy tests
* `-DPYBIND11_INSTALL=ON/OFF`: Enable the install target (on by default for the
master project)
* `-DUSE_PYTHON_INSTALL_DIR=ON`: Try to install into the python dir
<details><summary>A few standard CMake tricks: (click to expand)</summary><p>
* Use `cmake --build build -v` to see the commands used to build the files.
* Use `cmake build -LH` to list the CMake options with help.
* Use `ccmake` if available to see a curses (terminal) gui, or `cmake-gui` for
a completely graphical interface (not present in the PyPI package).
* Use `cmake --build build -j12` to build with 12 cores (for example).
* Use `-G` and the name of a generator to use something different. `cmake
--help` lists the generators available.
- On Unix, setting `CMAKE_GENERATOR=Ninja` in your environment will give
you automatic multithreading on all your CMake projects!
* Open the `CMakeLists.txt` with QtCreator to generate for that IDE.
* You can use `-DCMAKE_EXPORT_COMPILE_COMMANDS=ON` to generate the `.json` file
that some tools expect.
</p></details>
To run the tests, you can "build" the check target:
```bash
cmake --build build --target check
```
`--target` can be spelled `-t` in CMake 3.15+. You can also run individual
tests with these targets:
* `pytest`: Python tests only, using the
[pytest](https://docs.pytest.org/en/stable/) framework
* `cpptest`: C++ tests only
* `test_cmake_build`: Install / subdirectory tests
If you want to build just a subset of tests, use
`-DPYBIND11_TEST_OVERRIDE="test_callbacks;test_pickling"`. If this is
empty, all tests will be built. Tests are specified without an extension if they need both a .py and
.cpp file.
You may also pass flags to the `pytest` target by editing `tests/pytest.ini` or
by using the `PYTEST_ADDOPTS` environment variable
(see [`pytest` docs](https://docs.pytest.org/en/2.7.3/customize.html#adding-default-options)). As an example:
```bash
env PYTEST_ADDOPTS="--capture=no --exitfirst" \
cmake --build build --target pytest
# Or using abbreviated flags
env PYTEST_ADDOPTS="-s -x" cmake --build build --target pytest
```
### Formatting
All formatting is handled by pre-commit.
Install with brew (macOS) or pip (any OS):
```bash
# Any OS
python3 -m pip install pre-commit
# OR macOS with homebrew:
brew install pre-commit
```
Then, you can run it on the items you've added to your staging area, or all
files:
```bash
pre-commit run
# OR
pre-commit run --all-files
```
And, if you want to always use it, you can install it as a git hook (hence the
name, pre-commit):
```bash
pre-commit install
```
### Clang-Format
As of v2.6.2, pybind11 ships with a [`clang-format`][clang-format]
configuration file at the top level of the repo (the filename is
`.clang-format`). Currently, formatting is NOT applied automatically, but
manually using `clang-format` for newly developed files is highly encouraged.
To check if a file needs formatting:
```bash
clang-format -style=file --dry-run some.cpp
```
The output will show things to be fixed, if any. To actually format the file:
```bash
clang-format -style=file -i some.cpp
```
Note that the `-style=file` option searches the parent directories for the
`.clang-format` file, i.e. the commands above can be run in any subdirectory
of the pybind11 repo.
### Clang-Tidy
[`clang-tidy`][clang-tidy] performs deeper static code analyses and is
more complex to run, compared to `clang-format`, but support for `clang-tidy`
is built into the pybind11 CMake configuration. To run `clang-tidy`, the
following recipe should work. Run the `docker` command from the top-level
directory inside your pybind11 git clone. Files will be modified in place,
so you can use git to monitor the changes.
```bash
docker run --rm -v $PWD:/mounted_pybind11 -it silkeh/clang:12
apt-get update && apt-get install -y python3-dev python3-pytest
cmake -S /mounted_pybind11/ -B build -DCMAKE_CXX_CLANG_TIDY="$(which clang-tidy);-fix" -DDOWNLOAD_EIGEN=ON -DDOWNLOAD_CATCH=ON -DCMAKE_CXX_STANDARD=17
cmake --build build -j 2 -- --keep-going
```
### Include what you use
To run include what you use, install (`brew install include-what-you-use` on
macOS), then run:
```bash
cmake -S . -B build-iwyu -DCMAKE_CXX_INCLUDE_WHAT_YOU_USE=$(which include-what-you-use)
cmake --build build-iwyu
```
The report is sent to stderr; you can pipe it into a file if you wish.
### Build recipes
This builds with the Intel compiler (assuming it is in your path, along with a
recent CMake and Python 3):
```bash
python3 -m venv venv
. venv/bin/activate
pip install pytest
cmake -S . -B build-intel -DCMAKE_CXX_COMPILER=$(which icpc) -DDOWNLOAD_CATCH=ON -DDOWNLOAD_EIGEN=ON -DPYBIND11_WERROR=ON
```
This will test the PGI compilers:
```bash
docker run --rm -it -v $PWD:/pybind11 nvcr.io/hpc/pgi-compilers:ce
apt-get update && apt-get install -y python3-dev python3-pip python3-pytest
wget -qO- "https://cmake.org/files/v3.18/cmake-3.18.2-Linux-x86_64.tar.gz" | tar --strip-components=1 -xz -C /usr/local
cmake -S pybind11/ -B build
cmake --build build
```
### Explanation of the SDist/wheel building design
> These details below are _only_ for packaging the Python sources from git. The
> SDists and wheels created do not have any extra requirements at all and are
> completely normal.
The main objective of the packaging system is to create SDists (Python's source
distribution packages) and wheels (Python's binary distribution packages) that
include everything that is needed to work with pybind11, and which can be
installed without any additional dependencies. This is more complex than it
appears: in order to support CMake as a first class language even when using
the PyPI package, they must include the _generated_ CMake files (so as not to
require CMake when installing the `pybind11` package itself). They should also
provide the option to install to the "standard" location
(`<ENVROOT>/include/pybind11` and `<ENVROOT>/share/cmake/pybind11`) so they are
easy to find with CMake, but this can cause problems if you are not in an
environment or are using ``pyproject.toml`` requirements. This was solved by having
two packages; the "nice" pybind11 package that stores the includes and CMake
files inside the package, that you get access to via functions in the package,
and a `pybind11-global` package that can be included via `pybind11[global]` if
you want the more invasive but discoverable file locations.
If you want to install or package the GitHub source, it is best to have Pip 10
or newer on Windows, macOS, or Linux (manylinux1 compatible, includes most
distributions). You can then build the SDists, or run any procedure that makes
SDists internally, like making wheels or installing.
```bash
# Editable development install example
python3 -m pip install -e .
```
Since Pip itself does not have an `sdist` command (it does have `wheel` and
`install`), you may want to use the upcoming `build` package:
```bash
python3 -m pip install build
# Normal package
python3 -m build -s .
# Global extra
PYBIND11_GLOBAL_SDIST=1 python3 -m build -s .
```
If you want to use the classic "direct" usage of `python setup.py`, you will
need CMake 3.15+ and either `make` or `ninja` preinstalled (possibly via `pip
install cmake ninja`), since directly running Python on `setup.py` cannot pick
up and install `pyproject.toml` requirements. As long as you have those two
things, though, everything works the way you would expect:
```bash
# Normal package
python3 setup.py sdist
# Global extra
PYBIND11_GLOBAL_SDIST=1 python3 setup.py sdist
```
A detailed explanation of the build procedure design for developers wanting to
work on or maintain the packaging system is as follows:
#### 1. Building from the source directory
When you invoke any `setup.py` command from the source directory, including
`pip wheel .` and `pip install .`, you will activate a full source build. This
is made of the following steps:
1. If the tool is PEP 518 compliant, like Pip 10+, it will create a temporary
virtual environment and install the build requirements (mostly CMake) into
it. (if you are not on Windows, macOS, or a manylinux compliant system, you
can disable this with `--no-build-isolation` as long as you have CMake 3.15+
installed)
2. The environment variable `PYBIND11_GLOBAL_SDIST` is checked - if it is set
and truthy, this will make the accessory `pybind11-global` package,
instead of the normal `pybind11` package. This package is used for
installing the files directly to your environment root directory, using
`pybind11[global]`.
3. `setup.py` reads the version from `pybind11/_version.py` and verifies it
matches `includes/pybind11/detail/common.h`.
4. CMake is run with `-DCMAKE_INSTALL_PREFIX=pybind11`. Since the CMake install
procedure uses only relative paths and is identical on all platforms, these
files are valid as long as they stay in the correct relative position to the
includes. `pybind11/share/cmake/pybind11` has the CMake files, and
`pybind11/include` has the includes. The build directory is discarded.
5. Simpler files are placed in the SDist: `tools/setup_*.py.in`,
`tools/pyproject.toml` (`main` or `global`)
6. The package is created by running the setup function in the
`tools/setup_*.py`. `setup_main.py` fills in Python packages, and
`setup_global.py` fills in only the data/header slots.
7. A context manager cleans up the temporary CMake install directory (even if
an error is thrown).
#### 2. Building from SDist
Since the SDist has the rendered template files in `tools` along with the
includes and CMake files in the correct locations, the builds are completely
trivial and simple. No extra requirements are required. You can even use Pip 9
if you really want to.
[pre-commit]: https://pre-commit.com
[clang-format]: https://clang.llvm.org/docs/ClangFormat.html
[clang-tidy]: https://clang.llvm.org/extra/clang-tidy/
[pybind11.readthedocs.org]: http://pybind11.readthedocs.org/en/latest
[issue tracker]: https://github.com/pybind/pybind11/issues
[gitter]: https://gitter.im/pybind/Lobby
[using pull requests]: https://help.github.com/articles/using-pull-requests
## Utility Modules
- [x] dff
- [ ] skid buffer
- [ ] synchronous fifo
- [ ] asynchronous fifo
- [ ] bram
- [ ] mux
- [ ] width adapter
- [ ] fixed/float converter
- [ ] recording/memory
- [ ] arbiter
- [ ] register interface
## Wrapper Templates
- [ ] stereo wrapper
- [ ] skid inputs/outputs
- [ ] address decoder
## Debugging
- [ ] vivado ila
- [ ] open ila
## DSP Basics
- [ ] sum/mixer
- [ ] dds/vco
- [ ] fir filter
- [ ] symmetric fir filter
- [ ] iir/biquad filter
- [ ] upsample/downsample
- [ ] convolution
- [ ] fft
- [ ] tuner
## Audio Effects
- [ ] equalizer
- [ ] delay
- [ ] phaser
- [ ] symmetric distortion
- [ ] asymmetric distortion
- [ ] compressor/limiter
- [ ] reverb
- [ ] octaver
## Peripherals
- [ ] i2s
- [ ] uart
- [ ] spi
- [ ] memory
- [ ] flash
- [ ] gpio
# Signal Definitions
Modules must comply with naming and bus conventions.
Names must match exactly.
## Common Parameters
Modules may be required to implement these parameters.
Parameters should have defaults.
parameter ADW = 24; // Audio bus data width
parameter CDW = 8; // Control bus data width
parameter AW = 24; // Address Width
## Common signals
All modules must have these signals in their port:
input logic clk;
input logic rst;
clk will be the processing clock. This is not the sample clock.
rst can be unused, but the rst port must be present to comply with code tools.
## Audio Bus Definition
Modules may have audio input and audio output ports.
The valid signal acts as the sample clock. There will be fclk/fsclk clock cycles between samples (for example, 100e6 / 48e3 ≈ 2083). Audio modules can take no more clock cycles than this to process data.
Sometimes implementing ready is not necessary, or the developer may not want to implement stalling logic. So it is mostly optional. An audio sample clock is fairly slow and regular, so this shouldn't be a big deal.
Skid buffers can be used to connect modules that don't properly implement the ready signal. Use them if dropping data is a possibility. They are an easy way to automatically handle ready logic.
### Audio Input
input logic [DW-1:0] audio_i_data;
input logic audio_i_valid;
output logic audio_i_ready;
If your design is fully pipelined or takes much fewer than fclk/sclk clock cycles to process data, you probably don't need to worry about the ready signal. You can "assign audio_i_ready = 1;" and everything will be fine.
If your design takes close to fclk/fsclk cycles, you should handle the ready signal properly. If you are unsure if the previous module in the chain is going to respect the ready signal, you can connect a skid buffer before your module to make sure you don't lose data.
### Audio Output
output logic [DW-1:0] audio_o_data;
output logic audio_o_valid;
input logic audio_o_ready;
If you can't stall or don't want to implement it, you can ignore the ready signal.
import os
os.environ['PYGAME_HIDE_SUPPORT_PROMPT'] = "hide"
from PIL import Image as PILImage
import pygame
from pygame.locals import *
import time, random
import warnings
from pygame import mixer
import logging
import sys
import pkg_resources
print("Abinde version {}. Hello from the Abinde team! \nNot sure what to do? Check out the docs. https://abinde-game-dev.github.io/docs.\nJoin the Abinde team! https://github.com/Abinde-Game-Dev.".format(pkg_resources.get_distribution("Abinde").version))
mixer.init()
pygame.font.init()
windows = []
game_quit = False
class color:
ALICEBLUE = (240, 248, 255)
ANTIQUEWHITE = (250, 235, 215)
ANTIQUEWHITE1 = (255, 239, 219)
ANTIQUEWHITE2 = (238, 223, 204)
ANTIQUEWHITE3 = (205, 192, 176)
ANTIQUEWHITE4 = (139, 131, 120)
AQUA = (0, 255, 255)
AQUAMARINE1 = (127, 255, 212)
AQUAMARINE2 = (118, 238, 198)
AQUAMARINE3 = (102, 205, 170)
AQUAMARINE4 = (69, 139, 116)
AZURE1 = (240, 255, 255)
AZURE2 = (224, 238, 238)
AZURE3 = (193, 205, 205)
AZURE4 = (131, 139, 139)
BANANA = (227, 207, 87)
BEIGE = (245, 245, 220)
BISQUE1 = (255, 228, 196)
BISQUE2 = (238, 213, 183)
BISQUE3 = (205, 183, 158)
BISQUE4 = (139, 125, 107)
BLACK = (0, 0, 0)
BLANCHEDALMOND = (255, 235, 205)
BLUE = (0, 0, 255)
BLUE2 = (0, 0, 238)
BLUE3 = (0, 0, 205)
BLUE4 = (0, 0, 139)
BLUEVIOLET = (138, 43, 226)
BRICK = (156, 102, 31)
BROWN = (165, 42, 42)
BROWN1 = (255, 64, 64)
BROWN2 = (238, 59, 59)
BROWN3 = (205, 51, 51)
BROWN4 = (139, 35, 35)
BURLYWOOD = (222, 184, 135)
BURLYWOOD1 = (255, 211, 155)
BURLYWOOD2 = (238, 197, 145)
BURLYWOOD3 = (205, 170, 125)
BURLYWOOD4 = (139, 115, 85)
BURNTSIENNA = (138, 54, 15)
BURNTUMBER = (138, 51, 36)
CADETBLUE = (95, 158, 160)
CADETBLUE1 = (152, 245, 255)
CADETBLUE2 = (142, 229, 238)
CADETBLUE3 = (122, 197, 205)
CADETBLUE4 = (83, 134, 139)
CADMIUMORANGE = (255, 97, 3)
CADMIUMYELLOW = (255, 153, 18)
CARROT = (237, 145, 33)
CHARTREUSE1 = (127, 255, 0)
CHARTREUSE2 = (118, 238, 0)
CHARTREUSE3 = (102, 205, 0)
CHARTREUSE4 = (69, 139, 0)
CHOCOLATE = (210, 105, 30)
CHOCOLATE1 = (255, 127, 36)
CHOCOLATE2 = (238, 118, 33)
CHOCOLATE3 = (205, 102, 29)
CHOCOLATE4 = (139, 69, 19)
COBALT = (61, 89, 171)
COBALTGREEN = (61, 145, 64)
COLDGREY = (128, 138, 135)
CORAL = (255, 127, 80)
CORAL1 = (255, 114, 86)
CORAL2 = (238, 106, 80)
CORAL3 = (205, 91, 69)
CORAL4 = (139, 62, 47)
CORNFLOWERBLUE = (100, 149, 237)
CORNSILK1 = (255, 248, 220)
CORNSILK2 = (238, 232, 205)
CORNSILK3 = (205, 200, 177)
CORNSILK4 = (139, 136, 120)
CRIMSON = (220, 20, 60)
CYAN2 = (0, 238, 238)
CYAN3 = (0, 205, 205)
CYAN4 = (0, 139, 139)
DARKGOLDENROD = (184, 134, 11)
DARKGOLDENROD1 = (255, 185, 15)
DARKGOLDENROD2 = (238, 173, 14)
DARKGOLDENROD3 = (205, 149, 12)
DARKGOLDENROD4 = (139, 101, 8)
DARKGRAY = (169, 169, 169)
DARKGREEN = (0, 100, 0)
DARKKHAKI = (189, 183, 107)
DARKOLIVEGREEN = (85, 107, 47)
DARKOLIVEGREEN1 = (202, 255, 112)
DARKOLIVEGREEN2 = (188, 238, 104)
DARKOLIVEGREEN3 = (162, 205, 90)
DARKOLIVEGREEN4 = (110, 139, 61)
DARKORANGE = (255, 140, 0)
DARKORANGE1 = (255, 127, 0)
DARKORANGE2 = (238, 118, 0)
DARKORANGE3 = (205, 102, 0)
DARKORANGE4 = (139, 69, 0)
DARKORCHID = (153, 50, 204)
DARKORCHID1 = (191, 62, 255)
DARKORCHID2 = (178, 58, 238)
DARKORCHID3 = (154, 50, 205)
DARKORCHID4 = (104, 34, 139)
DARKSALMON = (233, 150, 122)
DARKSEAGREEN = (143, 188, 143)
DARKSEAGREEN1 = (193, 255, 193)
DARKSEAGREEN2 = (180, 238, 180)
DARKSEAGREEN3 = (155, 205, 155)
DARKSEAGREEN4 = (105, 139, 105)
DARKSLATEBLUE = (72, 61, 139)
DARKSLATEGRAY = (47, 79, 79)
DARKSLATEGRAY1 = (151, 255, 255)
DARKSLATEGRAY2 = (141, 238, 238)
DARKSLATEGRAY3 = (121, 205, 205)
DARKSLATEGRAY4 = (82, 139, 139)
DARKTURQUOISE = (0, 206, 209)
DARKVIOLET = (148, 0, 211)
DEEPPINK1 = (255, 20, 147)
DEEPPINK2 = (238, 18, 137)
DEEPPINK3 = (205, 16, 118)
DEEPPINK4 = (139, 10, 80)
DEEPSKYBLUE1 = (0, 191, 255)
DEEPSKYBLUE2 = (0, 178, 238)
DEEPSKYBLUE3 = (0, 154, 205)
DEEPSKYBLUE4 = (0, 104, 139)
DIMGRAY = (105, 105, 105)
DODGERBLUE1 = (30, 144, 255)
DODGERBLUE2 = (28, 134, 238)
DODGERBLUE3 = (24, 116, 205)
DODGERBLUE4 = (16, 78, 139)
EGGSHELL = (252, 230, 201)
EMERALDGREEN = (0, 201, 87)
FIREBRICK = (178, 34, 34)
FIREBRICK1 = (255, 48, 48)
FIREBRICK2 = (238, 44, 44)
FIREBRICK3 = (205, 38, 38)
FIREBRICK4 = (139, 26, 26)
FLESH = (255, 125, 64)
FLORALWHITE = (255, 250, 240)
FORESTGREEN = (34, 139, 34)
GAINSBORO = (220, 220, 220)
GHOSTWHITE = (248, 248, 255)
GOLD1 = (255, 215, 0)
GOLD2 = (238, 201, 0)
GOLD3 = (205, 173, 0)
GOLD4 = (139, 117, 0)
GOLDENROD = (218, 165, 32)
GOLDENROD1 = (255, 193, 37)
GOLDENROD2 = (238, 180, 34)
GOLDENROD3 = (205, 155, 29)
GOLDENROD4 = (139, 105, 20)
GRAY = (128, 128, 128)
GRAY1 = (3, 3, 3)
GRAY10 = (26, 26, 26)
GRAY11 = (28, 28, 28)
GRAY12 = (31, 31, 31)
GRAY13 = (33, 33, 33)
GRAY14 = (36, 36, 36)
GRAY15 = (38, 38, 38)
GRAY16 = (41, 41, 41)
GRAY17 = (43, 43, 43)
GRAY18 = (46, 46, 46)
GRAY19 = (48, 48, 48)
GRAY2 = (5, 5, 5)
GRAY20 = (51, 51, 51)
GRAY21 = (54, 54, 54)
GRAY22 = (56, 56, 56)
GRAY23 = (59, 59, 59)
GRAY24 = (61, 61, 61)
GRAY25 = (64, 64, 64)
GRAY26 = (66, 66, 66)
GRAY27 = (69, 69, 69)
GRAY28 = (71, 71, 71)
GRAY29 = (74, 74, 74)
GRAY3 = (8, 8, 8)
GRAY30 = (77, 77, 77)
GRAY31 = (79, 79, 79)
GRAY32 = (82, 82, 82)
GRAY33 = (84, 84, 84)
GRAY34 = (87, 87, 87)
GRAY35 = (89, 89, 89)
GRAY36 = (92, 92, 92)
GRAY37 = (94, 94, 94)
GRAY38 = (97, 97, 97)
GRAY39 = (99, 99, 99)
GRAY4 = (10, 10, 10)
GRAY40 = (102, 102, 102)
GRAY42 = (107, 107, 107)
GRAY43 = (110, 110, 110)
GRAY44 = (112, 112, 112)
GRAY45 = (115, 115, 115)
GRAY46 = (117, 117, 117)
GRAY47 = (120, 120, 120)
GRAY48 = (122, 122, 122)
GRAY49 = (125, 125, 125)
GRAY5 = (13, 13, 13)
GRAY50 = (127, 127, 127)
GRAY51 = (130, 130, 130)
GRAY52 = (133, 133, 133)
GRAY53 = (135, 135, 135)
GRAY54 = (138, 138, 138)
GRAY55 = (140, 140, 140)
GRAY56 = (143, 143, 143)
GRAY57 = (145, 145, 145)
GRAY58 = (148, 148, 148)
GRAY59 = (150, 150, 150)
GRAY6 = (15, 15, 15)
GRAY60 = (153, 153, 153)
GRAY61 = (156, 156, 156)
GRAY62 = (158, 158, 158)
GRAY63 = (161, 161, 161)
GRAY64 = (163, 163, 163)
GRAY65 = (166, 166, 166)
GRAY66 = (168, 168, 168)
GRAY67 = (171, 171, 171)
GRAY68 = (173, 173, 173)
GRAY69 = (176, 176, 176)
GRAY7 = (18, 18, 18)
GRAY70 = (179, 179, 179)
GRAY71 = (181, 181, 181)
GRAY72 = (184, 184, 184)
GRAY73 = (186, 186, 186)
GRAY74 = (189, 189, 189)
GRAY75 = (191, 191, 191)
GRAY76 = (194, 194, 194)
GRAY77 = (196, 196, 196)
GRAY78 = (199, 199, 199)
GRAY79 = (201, 201, 201)
GRAY8 = (20, 20, 20)
GRAY80 = (204, 204, 204)
GRAY81 = (207, 207, 207)
GRAY82 = (209, 209, 209)
GRAY83 = (212, 212, 212)
GRAY84 = (214, 214, 214)
GRAY85 = (217, 217, 217)
GRAY86 = (219, 219, 219)
GRAY87 = (222, 222, 222)
GRAY88 = (224, 224, 224)
GRAY89 = (227, 227, 227)
GRAY9 = (23, 23, 23)
GRAY90 = (229, 229, 229)
GRAY91 = (232, 232, 232)
GRAY92 = (235, 235, 235)
GRAY93 = (237, 237, 237)
GRAY94 = (240, 240, 240)
GRAY95 = (242, 242, 242)
GRAY97 = (247, 247, 247)
GRAY98 = (250, 250, 250)
GRAY99 = (252, 252, 252)
GREEN = (0, 128, 0)
GREEN1 = (0, 255, 0)
GREEN2 = (0, 238, 0)
GREEN3 = (0, 205, 0)
GREEN4 = (0, 139, 0)
GREENYELLOW = (173, 255, 47)
HONEYDEW1 = (240, 255, 240)
HONEYDEW2 = (224, 238, 224)
HONEYDEW3 = (193, 205, 193)
HONEYDEW4 = (131, 139, 131)
HOTPINK = (255, 105, 180)
HOTPINK1 = (255, 110, 180)
HOTPINK2 = (238, 106, 167)
HOTPINK3 = (205, 96, 144)
HOTPINK4 = (139, 58, 98)
INDIANRED = (176, 23, 31)
INDIANRED = (205, 92, 92)
INDIANRED1 = (255, 106, 106)
INDIANRED2 = (238, 99, 99)
INDIANRED3 = (205, 85, 85)
INDIANRED4 = (139, 58, 58)
INDIGO = (75, 0, 130)
IVORY1 = (255, 255, 240)
IVORY2 = (238, 238, 224)
IVORY3 = (205, 205, 193)
IVORY4 = (139, 139, 131)
IVORYBLACK = (41, 36, 33)
KHAKI = (240, 230, 140)
KHAKI1 = (255, 246, 143)
KHAKI2 = (238, 230, 133)
KHAKI3 = (205, 198, 115)
KHAKI4 = (139, 134, 78)
LAVENDER = (230, 230, 250)
LAVENDERBLUSH1 = (255, 240, 245)
LAVENDERBLUSH2 = (238, 224, 229)
LAVENDERBLUSH3 = (205, 193, 197)
LAVENDERBLUSH4 = (139, 131, 134)
LAWNGREEN = (124, 252, 0)
LEMONCHIFFON1 = (255, 250, 205)
LEMONCHIFFON2 = (238, 233, 191)
LEMONCHIFFON3 = (205, 201, 165)
LEMONCHIFFON4 = (139, 137, 112)
LIGHTBLUE = (173, 216, 230)
LIGHTBLUE1 = (191, 239, 255)
LIGHTBLUE2 = (178, 223, 238)
LIGHTBLUE3 = (154, 192, 205)
LIGHTBLUE4 = (104, 131, 139)
LIGHTCORAL = (240, 128, 128)
LIGHTCYAN1 = (224, 255, 255)
LIGHTCYAN2 = (209, 238, 238)
LIGHTCYAN3 = (180, 205, 205)
LIGHTCYAN4 = (122, 139, 139)
LIGHTGOLDENROD1 = (255, 236, 139)
LIGHTGOLDENROD2 = (238, 220, 130)
LIGHTGOLDENROD3 = (205, 190, 112)
LIGHTGOLDENROD4 = (139, 129, 76)
LIGHTGOLDENRODYELLOW = (250, 250, 210)
LIGHTGREY = (211, 211, 211)
LIGHTPINK = (255, 182, 193)
LIGHTPINK1 = (255, 174, 185)
LIGHTPINK2 = (238, 162, 173)
LIGHTPINK3 = (205, 140, 149)
LIGHTPINK4 = (139, 95, 101)
LIGHTSALMON1 = (255, 160, 122)
LIGHTSALMON2 = (238, 149, 114)
LIGHTSALMON3 = (205, 129, 98)
LIGHTSALMON4 = (139, 87, 66)
LIGHTSEAGREEN = (32, 178, 170)
LIGHTSKYBLUE = (135, 206, 250)
LIGHTSKYBLUE1 = (176, 226, 255)
LIGHTSKYBLUE2 = (164, 211, 238)
LIGHTSKYBLUE3 = (141, 182, 205)
LIGHTSKYBLUE4 = (96, 123, 139)
LIGHTSLATEBLUE = (132, 112, 255)
LIGHTSLATEGRAY = (119, 136, 153)
LIGHTSTEELBLUE = (176, 196, 222)
LIGHTSTEELBLUE1 = (202, 225, 255)
LIGHTSTEELBLUE2 = (188, 210, 238)
LIGHTSTEELBLUE3 = (162, 181, 205)
LIGHTSTEELBLUE4 = (110, 123, 139)
LIGHTYELLOW1 = (255, 255, 224)
LIGHTYELLOW2 = (238, 238, 209)
LIGHTYELLOW3 = (205, 205, 180)
LIGHTYELLOW4 = (139, 139, 122)
LIMEGREEN = (50, 205, 50)
LINEN = (250, 240, 230)
MAGENTA = (255, 0, 255)
MAGENTA2 = (238, 0, 238)
MAGENTA3 = (205, 0, 205)
MAGENTA4 = (139, 0, 139)
MANGANESEBLUE = (3, 168, 158)
MAROON = (128, 0, 0)
MAROON1 = (255, 52, 179)
MAROON2 = (238, 48, 167)
MAROON3 = (205, 41, 144)
MAROON4 = (139, 28, 98)
MEDIUMORCHID = (186, 85, 211)
MEDIUMORCHID1 = (224, 102, 255)
MEDIUMORCHID2 = (209, 95, 238)
MEDIUMORCHID3 = (180, 82, 205)
MEDIUMORCHID4 = (122, 55, 139)
MEDIUMPURPLE = (147, 112, 219)
MEDIUMPURPLE1 = (171, 130, 255)
MEDIUMPURPLE2 = (159, 121, 238)
MEDIUMPURPLE3 = (137, 104, 205)
MEDIUMPURPLE4 = (93, 71, 139)
MEDIUMSEAGREEN = (60, 179, 113)
MEDIUMSLATEBLUE = (123, 104, 238)
MEDIUMSPRINGGREEN = (0, 250, 154)
MEDIUMTURQUOISE = (72, 209, 204)
MEDIUMVIOLETRED = (199, 21, 133)
MELON = (227, 168, 105)
MIDNIGHTBLUE = (25, 25, 112)
MINT = (189, 252, 201)
MINTCREAM = (245, 255, 250)
MISTYROSE1 = (255, 228, 225)
MISTYROSE2 = (238, 213, 210)
MISTYROSE3 = (205, 183, 181)
MISTYROSE4 = (139, 125, 123)
MOCCASIN = (255, 228, 181)
NAVAJOWHITE1 = (255, 222, 173)
NAVAJOWHITE2 = (238, 207, 161)
NAVAJOWHITE3 = (205, 179, 139)
NAVAJOWHITE4 = (139, 121, 94)
NAVY = (0, 0, 128)
OLDLACE = (253, 245, 230)
OLIVE = (128, 128, 0)
OLIVEDRAB = (107, 142, 35)
OLIVEDRAB1 = (192, 255, 62)
OLIVEDRAB2 = (179, 238, 58)
OLIVEDRAB3 = (154, 205, 50)
OLIVEDRAB4 = (105, 139, 34)
ORANGE = (255, 128, 0)
ORANGE1 = (255, 165, 0)
ORANGE2 = (238, 154, 0)
ORANGE3 = (205, 133, 0)
ORANGE4 = (139, 90, 0)
ORANGERED1 = (255, 69, 0)
ORANGERED2 = (238, 64, 0)
ORANGERED3 = (205, 55, 0)
ORANGERED4 = (139, 37, 0)
ORCHID = (218, 112, 214)
ORCHID1 = (255, 131, 250)
ORCHID2 = (238, 122, 233)
ORCHID3 = (205, 105, 201)
ORCHID4 = (139, 71, 137)
PALEGOLDENROD = (238, 232, 170)
PALEGREEN = (152, 251, 152)
PALEGREEN1 = (154, 255, 154)
PALEGREEN2 = (144, 238, 144)
PALEGREEN3 = (124, 205, 124)
PALEGREEN4 = (84, 139, 84)
PALETURQUOISE1 = (187, 255, 255)
PALETURQUOISE2 = (174, 238, 238)
PALETURQUOISE3 = (150, 205, 205)
PALETURQUOISE4 = (102, 139, 139)
PALEVIOLETRED = (219, 112, 147)
PALEVIOLETRED1 = (255, 130, 171)
PALEVIOLETRED2 = (238, 121, 159)
PALEVIOLETRED3 = (205, 104, 137)
PALEVIOLETRED4 = (139, 71, 93)
PAPAYAWHIP = (255, 239, 213)
PEACHPUFF1 = (255, 218, 185)
PEACHPUFF2 = (238, 203, 173)
PEACHPUFF3 = (205, 175, 149)
PEACHPUFF4 = (139, 119, 101)
PEACOCK = (51, 161, 201)
PINK = (255, 192, 203)
PINK1 = (255, 181, 197)
PINK2 = (238, 169, 184)
PINK3 = (205, 145, 158)
PINK4 = (139, 99, 108)
PLUM = (221, 160, 221)
PLUM1 = (255, 187, 255)
PLUM2 = (238, 174, 238)
PLUM3 = (205, 150, 205)
PLUM4 = (139, 102, 139)
POWDERBLUE = (176, 224, 230)
PURPLE = (128, 0, 128)
PURPLE1 = (155, 48, 255)
PURPLE2 = (145, 44, 238)
PURPLE3 = (125, 38, 205)
PURPLE4 = (85, 26, 139)
RASPBERRY = (135, 38, 87)
RAWSIENNA = (199, 97, 20)
RED1 = (255, 0, 0)
RED2 = (238, 0, 0)
RED3 = (205, 0, 0)
RED4 = (139, 0, 0)
ROSYBROWN = (188, 143, 143)
ROSYBROWN1 = (255, 193, 193)
ROSYBROWN2 = (238, 180, 180)
ROSYBROWN3 = (205, 155, 155)
ROSYBROWN4 = (139, 105, 105)
ROYALBLUE = (65, 105, 225)
ROYALBLUE1 = (72, 118, 255)
ROYALBLUE2 = (67, 110, 238)
ROYALBLUE3 = (58, 95, 205)
ROYALBLUE4 = (39, 64, 139)
SALMON = (250, 128, 114)
SALMON1 = (255, 140, 105)
SALMON2 = (238, 130, 98)
SALMON3 = (205, 112, 84)
SALMON4 = (139, 76, 57)
SANDYBROWN = (244, 164, 96)
SAPGREEN = (48, 128, 20)
SEAGREEN1 = (84, 255, 159)
SEAGREEN2 = (78, 238, 148)
SEAGREEN3 = (67, 205, 128)
SEAGREEN4 = (46, 139, 87)
SEASHELL1 = (255, 245, 238)
SEASHELL2 = (238, 229, 222)
SEASHELL3 = (205, 197, 191)
SEASHELL4 = (139, 134, 130)
SEPIA = (94, 38, 18)
SGIBEET = (142, 56, 142)
SGIBRIGHTGRAY = (197, 193, 170)
SGICHARTREUSE = (113, 198, 113)
SGIDARKGRAY = (85, 85, 85)
SGIGRAY12 = (30, 30, 30)
SGIGRAY16 = (40, 40, 40)
SGIGRAY32 = (81, 81, 81)
SGIGRAY36 = (91, 91, 91)
SGIGRAY52 = (132, 132, 132)
SGIGRAY56 = (142, 142, 142)
SGIGRAY72 = (183, 183, 183)
SGIGRAY76 = (193, 193, 193)
SGIGRAY92 = (234, 234, 234)
SGIGRAY96 = (244, 244, 244)
SGILIGHTBLUE = (125, 158, 192)
SGILIGHTGRAY = (170, 170, 170)
SGIOLIVEDRAB = (142, 142, 56)
SGISALMON = (198, 113, 113)
SGISLATEBLUE = (113, 113, 198)
SGITEAL = (56, 142, 142)
SIENNA = (160, 82, 45)
SIENNA1 = (255, 130, 71)
SIENNA2 = (238, 121, 66)
SIENNA3 = (205, 104, 57)
SIENNA4 = (139, 71, 38)
SILVER = (192, 192, 192)
SKYBLUE = (135, 206, 235)
SKYBLUE1 = (135, 206, 255)
SKYBLUE2 = (126, 192, 238)
SKYBLUE3 = (108, 166, 205)
SKYBLUE4 = (74, 112, 139)
SLATEBLUE = (106, 90, 205)
SLATEBLUE1 = (131, 111, 255)
SLATEBLUE2 = (122, 103, 238)
SLATEBLUE3 = (105, 89, 205)
SLATEBLUE4 = (71, 60, 139)
SLATEGRAY = (112, 128, 144)
SLATEGRAY1 = (198, 226, 255)
SLATEGRAY2 = (185, 211, 238)
SLATEGRAY3 = (159, 182, 205)
SLATEGRAY4 = (108, 123, 139)
SNOW1 = (255, 250, 250)
SNOW2 = (238, 233, 233)
SNOW3 = (205, 201, 201)
SNOW4 = (139, 137, 137)
SPRINGGREEN = (0, 255, 127)
SPRINGGREEN1 = (0, 238, 118)
SPRINGGREEN2 = (0, 205, 102)
SPRINGGREEN3 = (0, 139, 69)
STEELBLUE = (70, 130, 180)
STEELBLUE1 = (99, 184, 255)
STEELBLUE2 = (92, 172, 238)
STEELBLUE3 = (79, 148, 205)
STEELBLUE4 = (54, 100, 139)
TAN = (210, 180, 140)
TAN1 = (255, 165, 79)
TAN2 = (238, 154, 73)
TAN3 = (205, 133, 63)
TAN4 = (139, 90, 43)
TEAL = (0, 128, 128)
THISTLE = (216, 191, 216)
THISTLE1 = (255, 225, 255)
THISTLE2 = (238, 210, 238)
THISTLE3 = (205, 181, 205)
THISTLE4 = (139, 123, 139)
TOMATO1 = (255, 99, 71)
TOMATO2 = (238, 92, 66)
TOMATO3 = (205, 79, 57)
TOMATO4 = (139, 54, 38)
TURQUOISE = (64, 224, 208)
TURQUOISE1 = (0, 245, 255)
TURQUOISE2 = (0, 229, 238)
TURQUOISE3 = (0, 197, 205)
TURQUOISE4 = (0, 134, 139)
TURQUOISEBLUE = (0, 199, 140)
VIOLET = (238, 130, 238)
VIOLETRED = (208, 32, 144)
VIOLETRED1 = (255, 62, 150)
VIOLETRED2 = (238, 58, 140)
VIOLETRED3 = (205, 50, 120)
VIOLETRED4 = (139, 34, 82)
WARMGREY = (128, 128, 105)
WHEAT = (245, 222, 179)
WHEAT1 = (255, 231, 186)
WHEAT2 = (238, 216, 174)
WHEAT3 = (205, 186, 150)
WHEAT4 = (139, 126, 102)
WHITE = (255, 255, 255)
WHITESMOKE = (245, 245, 245)
YELLOW1 = (255, 255, 0)
YELLOW2 = (238, 238, 0)
YELLOW3 = (205, 205, 0)
YELLOW4 = (139, 139, 0)
class key:
ZERO = 48
ONE = 49
TWO = 50
THREE = 51
FOUR = 52
FIVE = 53
SIX = 54
SEVEN = 55
EIGHT = 56
NINE = 57
AC_BACK = 1073742094
AMPERSAND = 38
ASTERISK = 42
AT = 64
BACKQUOTE = 96
BACKSLASH = 92
BACKSPACE = 8
BREAK = 1073741896
CAPSLOCK = 1073741881
CARET = 94
CLEAR = 1073741980
COLON = 58
COMMA = 44
CURRENCYSUBUNIT = 1073742005
CURRENCYUNIT = 1073742004
DELETE = 127
DOLLAR = 36
DOWN = 1073741905
END = 1073741901
EQUALS = 61
ESCAPE = 27
EURO = 1073742004
EXCLAIM = 33
F1 = 1073741882
F10 = 1073741891
F11 = 1073741892
F12 = 1073741893
F13 = 1073741928
F14 = 1073741929
F15 = 1073741930
F2 = 1073741883
F3 = 1073741884
F4 = 1073741885
F5 = 1073741886
F6 = 1073741887
F7 = 1073741888
F8 = 1073741889
F9 = 1073741890
GREATER = 62
HASH = 35
HELP = 1073741941
HOME = 1073741898
INSERT = 1073741897
KP0 = 1073741922
KP1 = 1073741913
KP2 = 1073741914
KP3 = 1073741915
KP4 = 1073741916
KP5 = 1073741917
KP6 = 1073741918
KP7 = 1073741919
KP8 = 1073741920
KP9 = 1073741921
KP_0 = 1073741922
KP_1 = 1073741913
KP_2 = 1073741914
KP_3 = 1073741915
KP_4 = 1073741916
KP_5 = 1073741917
KP_6 = 1073741918
KP_7 = 1073741919
KP_8 = 1073741920
KP_9 = 1073741921
KP_DIVIDE = 1073741908
KP_ENTER = 1073741912
KP_EQUALS = 1073741927
KP_MINUS = 1073741910
KP_MULTIPLY = 1073741909
KP_PERIOD = 1073741923
KP_PLUS = 1073741911
LALT = 1073742050
LCTRL = 1073742048
LEFT = 1073741904
LEFTBRACKET = 91
LEFTPAREN = 40
LESS = 60
LGUI = 1073742051
LMETA = 1073742051
LSHIFT = 1073742049
LSUPER = 1073742051
MENU = 1073741942
MINUS = 45
MODE = 1073742081
NUMLOCK = 1073741907
NUMLOCKCLEAR = 1073741907
PAGEDOWN = 1073741902
PAGEUP = 1073741899
PAUSE = 1073741896
PERCENT = 37
PERIOD = 46
PLUS = 43
POWER = 1073741926
PRINT = 1073741894
PRINTSCREEN = 1073741894
QUESTION = 63
QUOTE = 39
QUOTEDBL = 34
RALT = 1073742054
RCTRL = 1073742052
RETURN = 13
RGUI = 1073742055
RIGHT = 1073741903
RIGHTBRACKET = 93
RIGHTPAREN = 41
RMETA = 1073742055
RSHIFT = 1073742053
RSUPER = 1073742055
SCROLLLOCK = 1073741895
SCROLLOCK = 1073741895
SEMICOLON = 59
SLASH = 47
SPACE = 32
SYSREQ = 1073741978
TAB = 9
UNDERSCORE = 95
UNKNOWN = 0
UP = 1073741906
A = 97
B = 98
C = 99
D = 100
E = 101
F = 102
G = 103
H = 104
I = 105
J = 106
K = 107
L = 108
M = 109
N = 110
O = 111
P = 112
Q = 113
R = 114
S = 115
T = 116
U = 117
V = 118
W = 119
X = 120
Y = 121
Z = 122
class mod:
ALT = 768
CAPS = 8192
CTRL = 192
GUI = 3072
LALT = 256
LCTRL = 64
LGUI = 1024
LMETA = 1024
LSHIFT = 1
META = 3072
MODE = 16384
NONE = 0
NUM = 4096
RALT = 512
RCTRL = 128
RGUI = 2048
RMETA = 2048
RSHIFT = 2
SHIFT = 3
def check_all():
if not pkg_resources.get_distribution("pygame").version >= "2.1.2":
warnings.warn("Your version of pygame ({}) is outdated. Upgrading pygame is highly reccomended.".format(pkg_resources.get_distribution("pygame").version), Warning)
def pil_image_to_surface(pilImage):
return pygame.image.fromstring(
pilImage.tobytes(), pilImage.size, pilImage.mode).convert()
def LoadImage(path):
return pil_image_to_surface(PILImage.open(path))
class error:
class TitleError(Exception):
def __init__(self):
super().__init__("Title must be single-line string.")
class BackgroundError(Exception):
def __init__(self):
super().__init__("Background must be rgb tuple.")
class SizeError(Exception):
def __init__(self):
super().__init__("Size must be an int list.")
class MultipleInstanceError(Exception):
def __init__(self):
super().__init__("You can only have 1 window open at once.")
class SetModeError(Exception):
def __init__(self):
super().__init__("Only options 'PIL' and 'pygame' are supported.")
class Game(object):
def __init__(self, title="New Abinde Instance", size=[500, 600], bg=color.BLACK, warn_me="always", log_to="file", **kwargs):
global windows
self.variables = {}
for kwarg in kwargs:
self.variables[kwarg] = kwargs.get(kwarg)
self.all_s = []
pygame.init()
sys.stdout.flush()
try:
if len(windows) <= 1:
self.root = pygame.display.set_mode((size[0], size[1]))
else:
raise error.MultipleInstanceError
except:
raise error.SizeError
try:
pygame.display.set_caption(title)
except:
raise error.TitleError
if warn_me == "always":
warnings.simplefilter("always")
elif warn_me == "once":
warnings.simplefilter("once")
elif warn_me == "never":
warnings.simplefilter("never")
if log_to == "file":
logging.basicConfig(format='GAME - %(message)s', level=logging.INFO, filename="game.log", filemode="w")
elif log_to == "program":
logging.basicConfig(format='GAME - %(message)s', level=logging.INFO)
check_all()
self.fps = pygame.time.Clock()
self.looping = True
self.bg = bg
self.root.fill(bg)
self.size = size
windows.append(self)
self.rect = self.root.get_rect()
self.on_update = []
self.on_keydown = []
self.on_keyup = []
self.on_mousemotion = []
self.on_keypress = []
def loop(self):
global game_quit
while self.looping:
if not game_quit:
try:
try:
self.root.fill(self.bg)
except:
raise error.BackgroundError
for event in pygame.event.get():
if event.type == pygame.QUIT:
pygame.quit()
game_quit = True
self.looping = False
sys.exit()
if event.type == pygame.KEYUP:
for function in self.on_keyup:
function(event)
if event.type == pygame.KEYDOWN:
for function in self.on_keydown:
function(event)
if event.type == pygame.MOUSEMOTION:
for function in self.on_mousemotion:
function(event)
logging.info(pygame.event.event_name(event.type))
for function in self.on_update:
function(event)
self.checkkeypress()
for sprite in self.all_s:
sprite.draw(self)
pygame.display.flip()
self.fps.tick(60)
except KeyboardInterrupt:
logging.error("Loop Interrupt")
break
def mainloop(self):
self.looping = True
# To fix bug on mac run loop on Main Thread.
self.loop()
def checkkeypress(self):
self.keys = pygame.key.get_pressed()
if True in self.keys:
for function in self.on_keypress:
function(self.keys)
def get_size(self):
return self.size
def wait(self, ms):
time.sleep(ms / 1000)
def close(self):
logging.info("Quit")
pygame.display.quit()
def reset(self):
logging.info("Reset")
self.on_update = []
self.on_keydown = []
self.on_keyup = []
self.on_mousemotion = []
self.on_keypress = []
self.all_s = []
class OnKeyUp:
def __init__(self, game, do):
game.on_keyup.append(do)
logging.info("[Key Up] Event Added")
class OnKeyDown:
def __init__(self, game, do):
game.on_keydown.append(do)
logging.info("[Key Down] Event Added")
class OnUpdate:
def __init__(self, game, do):
game.on_update.append(do)
logging.info("[Update] Event Added")
class OnMouseMotion:
def __init__(self, game, do):
game.on_mousemotion.append(do)
logging.info("[Mouse Motion] Event Added")
class OnKeyPress:
def __init__(self, game, do):
game.on_keypress.append(do)
logging.info("[Key Press] Event Added")
class sprite:
class Rectangle(object):
def __init__(self, game, pos, size, color=color.WHITE, title="Rectangle"):
game.all_s.append(self)
self.game = game
self.x = pos[0]
self.y = pos[1]
self.width = size[0]
self.height = size[1]
self.color = color
self.title = title
self.rect = pygame.Rect(self.x, self.y, self.width, self.height)
def draw(self, game):
self.rect = pygame.Rect(self.x, self.y, self.width, self.height)
pygame.draw.rect(game.root, self.color, self.rect)
def returntitle(self):
return self.title
def move(self, move=[1, 1]):
self.x += move[0]
self.y += move[1]
self.rect = pygame.Rect(self.x, self.y, self.width, self.height)
def go_to(self, pos=[1, 1]):
self.x = pos[0]
self.y = pos[1]
self.rect = pygame.Rect(self.x, self.y, self.width, self.height)
def get_pos(self):
return [self.x, self.y]
def get_size(self):
return [self.width, self.height]
def kill(self):
self.game.all_s.remove(self)
def touching(self, sprite):
if self.rect.colliderect(sprite.rect):
return sprite
def touching_any(self, sprites):
self.sprites = []
for sprite in sprites:
if self.touching(sprite):
self.sprites.append(sprite)
return self.sprites
class Line(object):
def __init__(self, game, pos, size, color=color.WHITE, title="Line"):
game.all_s.append(self)
self.game = game
self.x = pos[0]
self.y = pos[1]
self.width = size[0]
self.height = size[1]
self.color = color
self.title = title
def draw(self, game):
self.rect = pygame.Rect(self.x, self.y, self.width, self.height)
pygame.draw.line(game.root, self.color, [self.x, self.y], [self.width, self.height])
def returntitle(self):
return self.title
def move(self, move=[1, 1]):
self.x += move[0]
self.y += move[1]
self.rect = pygame.Rect(self.x, self.y, self.width, self.height)
def go_to(self, pos=[1, 1]):
self.x = pos[0]
self.y = pos[1]
self.rect = pygame.Rect(self.x, self.y, self.width, self.height)
def get_pos(self):
return [self.x, self.y]
def get_size(self):
return [self.width, self.height]
def kill(self):
self.game.all_s.remove(self)
def touching(self, sprite):
if self.rect.colliderect(sprite.rect):
return sprite
def touching_any(self, sprites):
self.sprites = []
for sprite in sprites:
if self.touching(sprite):
self.sprites.append(sprite)
return self.sprites
class Ellipse(object):
def __init__(self, game, pos, size, color=color.WHITE, title="Ellipse"):
game.all_s.append(self)
self.game = game
self.x = pos[0]
self.y = pos[1]
self.width = size[0]
self.height = size[1]
self.color = color
self.title = title
def draw(self, game):
self.rect = pygame.Rect(self.x, self.y, self.width, self.height)
pygame.draw.ellipse(game.root, self.color, (self.x, self.y, self.width, self.height))
def returntitle(self):
return self.title
def move(self, move=[1, 1]):
self.x += move[0]
self.y += move[1]
def go_to(self, pos=[1, 1]):
self.x = pos[0]
self.y = pos[1]
def get_pos(self):
return [self.x, self.y]
def get_size(self):
return [self.width, self.height]
def kill(self):
self.game.all_s.remove(self)
def touching(self, sprite):
if self.rect.colliderect(sprite.rect):
return sprite
def touching_any(self, sprites):
self.sprites = []
for sprite in sprites:
if self.touching(sprite):
self.sprites.append(sprite)
return self.sprites
class Text(object):
def __init__(self, game, pos, text, fontsize=30, fontname="Sans Serif", color=color.WHITE):
game.all_s.append(self)
self.font = pygame.font.SysFont(fontname, fontsize)
self.root = self.font.render(text, False, color)
self.pos = pos
self.game = game
self.text = text
self.color = color
def draw(self, game):
self.root = self.font.render(self.text, False, self.color)
game.root.blit(self.root, self.pos)
def kill(self):
self.game.all_s.remove(self)
class Image(object):
def __init__(self, game, image, pos, title="Image"):
game.all_s.append(self)
self.game = game
self.x = pos[0]
self.y = pos[1]
self.color = color
self.title = title
self.image = image
self.rect = self.image.get_rect()
self.rect.topleft = (self.x, self.y)
def draw(self, game):
game.root.blit(self.image, self.rect)
def returntitle(self):
return self.title
def move(self, move=[1, 1]):
self.x += move[0]
self.y += move[1]
self.rect.topleft = (self.x, self.y)
def go_to(self, pos=[1, 1]):
self.x = pos[0]
self.y = pos[1]
self.rect.topleft = (self.x, self.y)
def get_pos(self):
return [self.x, self.y]
def kill(self):
self.game.all_s.remove(self)
def touching(self, sprite):
if self.rect.colliderect(sprite.rect):
return sprite
def touching_any(self, sprites):
self.sprites = []
for sprite in sprites:
if self.touching(sprite):
self.sprites.append(sprite)
return self.sprites
class Audio:
def __init__(self, file, volume=0.7):
mixer.music.load(file)
mixer.music.set_volume(volume)
def play(self):
mixer.music.play()
def pause(self):
mixer.music.pause()
def unpause(self):
mixer.music.unpause()
class spritesheet(object):
def __init__(self, filename):
try:
self.sheet = pygame.image.load(filename).convert()
except pygame.error as e:
print('Unable to load spritesheet image:', filename, ":", e)
raise SystemExit
def image_at(self, rectangle, colorkey = None):
"Loads image from x,y,x+offset,y+offset"
rect = pygame.Rect(rectangle[0], rectangle[1], rectangle[2], rectangle[3])
image = pygame.Surface(rect.size).convert()
image.blit(self.sheet, (0, 0), rect)
if colorkey != None:
if colorkey == -1:
colorkey = image.get_at((0,0))
image.set_colorkey(colorkey, pygame.RLEACCEL)
return image
def quit():
logging.info("Full Quit")
pygame.quit()
sys.exit()
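For context, here is a minimal, hypothetical script showing how the pieces above (`Game`, `sprite.Rectangle`, `OnKeyDown`, the `key` and `color` tables) are intended to fit together. The window title, sizes, key binding, and movement amount are arbitrary choices for illustration, not part of the library.

```python
# demo.py -- illustrative usage sketch, not shipped with Abinde.
import Abinde

game = Abinde.Game(title="Demo", size=[500, 600], bg=Abinde.color.BLACK)
player = Abinde.sprite.Rectangle(game, pos=[50, 50], size=[40, 40],
                                 color=Abinde.color.RED1)

def move_right(event):
    # Nudge the rectangle to the right whenever the D key goes down.
    if event.key == Abinde.key.D:
        player.move([5, 0])

Abinde.OnKeyDown(game, move_right)
game.mainloop()  # blocks until the window is closed
```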
# abrio-cli
SDK used by developers to create custom logic for their apps.
## CLI configurations for local usage
To test the CLI tool:
+ move to cli directory and create a virtual env
```
cd cli
virtualenv venv
. venv/bin/activate
```
+ install AbrIO CLI locally using
```
pip install --editable .
```
Abrio CLI is now installed.
## Abrio CLI on PyPi
Abrio CLI is now on the PyPI Test repository. Install it using
```
pip install AbrIO
# if not found :
pip install --extra-index-url https://testpypi.python.org/pypi AbrIO
```
import click, zipfile, json
import requests
import datetime
from colorclass import Color
from terminaltables import AsciiTable
from requests.auth import HTTPBasicAuth
from requests_toolbelt import MultipartEncoder, MultipartEncoderMonitor
from clint.textui.progress import Bar as ProgressBar
from ..util.file import *
from ..util.checker import *
from ..conf.conf import config, get_full_path
from ..conf.conf import config,errors
@click.option('--version', prompt="Enter component version", default="0.0.1" )
@click.option('--public', is_flag=True, prompt='Do you want to mark this component as public?' )
@click.option('--name', prompt="Enter component name", default="Test")
@click.command()
def init(name, public, version):
'''
Create new abrio component.
'''
if not ensure_abrio_root() :
click.secho('\nAbrio Root Directory Not Detected.\n' , fg="red", bold=True)
return
if os.path.exists(name) :
click.secho("\nDirecotry with name <{0}> already exists.\n".format(name), fg="red", bold=True)
return
click.secho("\nConnection to sever..." , bold=True, fg="yellow")
project_config = load_project_config()
email = project_config['email']
pwd = project_config['password']
name = name
is_private = not public
response = requests.post(
config['server']['host']+'component/create',
auth=HTTPBasicAuth(email, pwd),
json={'isPrivate': is_private, 'name': name}
)
if response.status_code == 201 :
pkey = response.json()['token']
os.mkdir(name)
zip_file = zipfile.ZipFile(os.path.join(get_full_path('data', config['sdk_file'])))
zip_file.extractall(name)
component_config = {
'pkey': pkey,
'public': public,
'version': version,
'name': name,
'last_uploaded': ''
}
# with open(os.path.join(name, (name+'.json')), 'w') as config_file :
# config_file.write(json.dumps(component_config, indent=4, separators=(',', ': ')))
add_component_project(component_config)
click.secho("\nComponent <{0}> created.\n".format(name), bold=True, fg='green')
else :
click.secho(errors["UNKNOWN_NETWORK"],bold=True, fg="red")
@click.command()
@click.argument('name')
def upload(name) :
'''
Upload Abrio component to server.
'''
if not ensure_abrio_root():
click.secho('\nAbrio Root Directory Not Detected.\n', fg="red", bold=True)
return
if not ensure_component_exists(name):
click.secho("\nComponent <{0}> does not exist.\n".format(name), bold=True, fg="red")
build_dir = '/sample/build/libs/'
os.system('cd {0} && gradle jar && cd ..'.format(name))
jar_dir = name + build_dir + name + '.jar'
os.rename(name + build_dir + 'sample.jar',jar_dir)
encoder = create_upload(jar_dir)
callback = create_callback(encoder)
monitor = MultipartEncoderMonitor(encoder, callback)
component_config = load_component_config(name)
component_config['last_uploaded'] = str(datetime.datetime.now())
write_component_config(name, component_config)
headers = {
'Content-Type': monitor.content_type,
'private key': component_config['pkey'],
'version' : component_config['version']
}
upload_response = requests.post(
config['server']['host'] + "component/upload",
data=monitor,
# auth=HTTPBasicAuth(email, pwd),
headers=headers)
if upload_response.status_code == 200 :
click.secho('\n\n\nComponent uploaded\n', bold=True, fg="green")
else :
click.secho(errors["UNKNOWN_NETWORK"], bold=True, fg="red")
@click.option('--sure', prompt="Are you sure you want to delete this component", is_flag=True)
@click.argument('name')
@click.command()
def rm(name, sure) :
'''
Delete Abrio Component.
'''
if not ensure_abrio_root():
click.secho('\nAbrio Root Directory Not Detected.\n', fg="red", bold=True)
return
if sure :
if ensure_component_exists(name) :
os.system('rm -Rf {0}'.format(name))
remove_component_project(name)
# todo delete from server
click.secho("\nComponent <{0}> deleted.\n".format(name), bold=True, fg="yellow")
else :
click.secho("\nComponent <{0}> does not exist.\n".format(name), bold=True, fg="red")
@click.command()
def ls() :
'''
List Available Abrio components
'''
if not ensure_abrio_root():
click.secho('\nAbrio Root Directory Not Detected.\n', fg="red", bold=True)
return
project_config = load_project_config()
response = requests.get(
config['server']['host'] + "project/list_components",
json={'private_key': project_config['private_key']}
)
if response.status_code == 200 :
component_table = [
['Component Name', 'Version', 'Public', "Last Upload" , "Type"]] + \
[
[
component['name'],
component['version'],
str(component['public']),
component['last_uploaded'],
Color('{autoyellow}Local{/autoyellow}')
] for component in project_config['components']
]
component_table += [
[
comp['name'],
comp['deploy_version'],
str(not comp['private']),
"---",
Color('{autocyan}Online{/autocyan}')
] for comp in json.loads(response.content)['result']]
table = AsciiTable(component_table)
click.echo(table.table)
else :
click.secho(errors["UNKNOWN_NETWORK"], bold=True, fg="red")
def create_callback(encoder):
encoder_len = encoder.len
bar = ProgressBar(expected_size=encoder_len, filled_char='=')
def callback(monitor):
bar.show(monitor.bytes_read)
return callback
def create_upload(file_path):
file_name = file_path.split("/")[-1]
return MultipartEncoder({'files':(file_name,open(file_path, 'rb'))})
import click, requests, json
from terminaltables import AsciiTable
from ..conf.conf import errors
from ..util.file import *
from ..util.checker import *
@click.command()
@click.option('--pkey', prompt="Enter Project private key (create one in website if you don't have one)",)
@click.option('--email', prompt="Enter your Abrio Account Email")
@click.option('--password', prompt="Enter your Abrio Account Password" , hide_input=True)
def init(pkey, email, password) :
'''
Create new Abrio Project
'''
config = {
'email' : email,
'password' : password,
'private_key' : pkey,
'components' : [],
'create_date' : ''
}
write_project_config(config)
click.secho('\ndirectory marked as abrio project root. happy coding.\n',bold=True, fg='green')
@click.command()
def deploy() :
'''
Deploy Abrio project
'''
if not ensure_abrio_root():
click.secho('\nAbrio Root Directory Not Detected.\n', fg="red", bold=True)
return
pkey = load_project_config()['private_key']
response = requests.post(
config['server']['host'] + 'project/start',
json={'private_key' : pkey }
)
if response.status_code == 200:
click.secho("\nProject Successfully Lunched.\n", bold=True, fg='green')
elif response.status_code == 409 :
click.secho("\nProject already lunched.\n" , bold=True, fg="yellow")
else:
click.secho(errors["UNKNOWN_NETWORK"], bold=True, fg="red")
@click.command()
def stop() :
'''
Stop Abrio Project
'''
if not ensure_abrio_root():
click.secho('\nAbrio Root Directory Not Detected.\n', fg="red", bold=True)
return
pkey = load_project_config()['private_key']
response = requests.post(
config['server']['host'] + 'project/stop',
json={'private_key': pkey}
)
if response.status_code == 200:
click.secho("\nProject Successfully Stopped.\n", bold=True, fg='green')
else:
click.secho(errors["UNKNOWN_NETWORK"], bold=True, fg="red")
@click.command()
def status() :
'''
View Abrio project status
'''
if not ensure_abrio_root():
click.secho('\nAbrio Root Directory Not Detected.\n', fg="red", bold=True)
return
click.secho("\nConnecting to server..\n", fg='yellow', bold=True)
project_config = load_project_config()
pkey = project_config['private_key']
response = requests.get(
config['server']['host'] + 'project/status',
json={'private_key': pkey}
)
if response.status_code == 200 :
content = json.loads(response.content)
project_config['create_date'] = content['create_date']
project_config['name'] = content['name']
write_project_config(project_config)
project_table = [
['Name', "Is Running", "Created At", "Owner"],
[
project_config['name'],
str(content['is_running']),
project_config['create_date'],
project_config['email']
]
]
click.echo(AsciiTable(project_table).table)
else :
        click.secho(errors["UNKNOWN_NETWORK"], bold=True, fg="red")
# ==== end of file: AbrIO-0.0.5/abriocli/project/project.py ====
# AbsBox
A structured finance cashflow engine wrapper:
* transparency -> open source for both the wrapper and the backend engine
* human readable waterfall -> no more coding/scripting, just lists and maps in Python (see the Example section below)
[](https://img.shields.io/pypi/pyversions/absbox)
[](https://badge.fury.io/py/absbox)
[](https://img.shields.io/pypi/dm/absbox)
## Installation

```
pip install absbox
```
## Community & Support
* [Discussion](https://github.com/yellowbean/AbsBox/discussions)
## Goals
* Provide building blocks to create cashflow models for ABS/MBS
* Adapt to multiple asset classes
* Residential Mortgage / Auto Loans
* Corp Loans
* Consumer Credit
* Lease
* Features
* Sensitivity Analysis on different scenarios
* Bond Cashflow Forecast, Pricing
* Tweaking on deal components
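
## Example

A minimal sketch of the workflow. The engine URL and every field value below are illustrative only, not a full schema; see the documentation for details.

```python
from absbox.client import API

# connect to a running cashflow engine (the URL here is an assumption for illustration)
api = API("https://absbox.org/api/latest", lang="english")

# a pool is just nested lists and maps -- one mortgage, described field by field
pool = {"assets": [["Mortgage",
                    {"originBalance": 120000, "originRate": ["fix", 0.045], "originTerm": 30,
                     "freq": "Monthly", "type": "Level", "originDate": "2021-02-01"},
                    {"currentBalance": 100000, "currentRate": 0.045, "remainTerm": 20,
                     "status": "Current"}]],
        "cutoffDate": "2021-03-01"}

# project the pool cashflow under a constant prepayment assumption
flow = api.runPool(pool, assumptions=[{"CPR": 0.01}])
```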
## Data flow

## Documentation
* English -> https://absbox-doc.readthedocs.io
* Chinese -> https://absbox.readthedocs.io
==== end of file: AbsBox-0.12.5/README.md ====
import logging, json, datetime, pickle,re
from json.decoder import JSONDecodeError
import requests
from requests.exceptions import ConnectionError
import urllib3
from dataclasses import dataclass,field
from absbox.local.util import mkTag,isDate,flat
from absbox.local.component import mkPool,mkAssumption,mkAssumption2
from absbox.local.base import *
import pandas as pd
from pyspecter import S,query
urllib3.disable_warnings()
@dataclass
class API:
url:str
lang:str = "chinese"
server_info = {}
version = "0","12","0"
hdrs = {'Content-type': 'application/json', 'Accept': 'text/plain'}
def __post_init__(self):
try:
_r = requests.get(f"{self.url}/version",verify=False).text
except (ConnectionRefusedError, ConnectionError):
            logging.error(f"Error: Can't connect to API server {self.url}")
self.url = None
return
echo = json.loads(_r)
self.server_info = echo
x,y,z = echo['version'].split(".")
logging.info(f"Connect with engine {self.url} version {echo['version']} successfully")
if self.version[1] != y:
            logging.error(f"Failed to init the api instance, lib support={self.version} but server version={echo['version']}, please upgrade the api package via: pip install -U absbox")
return
def build_req(self, deal, assumptions=None, pricing=None, read=None) -> str:
_assump = None
if isinstance(assumptions, dict):
_assump = mkTag(("Multiple", { scenarioName:mkAssumption2(a) for (scenarioName,a) in assumptions.items()}))
elif isinstance(assumptions, list) or isinstance(assumptions, tuple):
_assump = mkTag(("Single",mkAssumption2(assumptions)))
else:
_assump = None
return json.dumps({"deal": deal.json
,"assump": _assump
,"bondPricing": deal.read_pricing(pricing) if (pricing is not None) else None}
, ensure_ascii=False)
def build_pool_req(self, pool, assumptions=[], read=None) -> str:
return json.dumps({"pool": mkPool(pool)
,"pAssump": mkAssumption2(assumptions)}
,ensure_ascii=False)
def _validate_assump(self,x,e,w):
def asset_check(_e,_w):
return _e,_w
a = x['assump']
asset_ids = set(range(len(query(x,['deal','contents','pool','assets']))))
match a:
case {'tag':'Single','contents':{'tag':'PoolLevel'}}:
return [True,e,w]
case {'tag':'Multiple','contents':{'tag':'PoolLevel'}}:
return [True,e,w]
case {'tag':'Single', 'contents':{'tag':'ByIndex', 'contents':(assumps,_)}}:
_ids = set(flat([ assump[0] for assump in assumps ]))
if not _ids.issubset(asset_ids):
e.append(f"Not Valid Asset ID:{_ids - asset_ids}")
missing_asset_id = asset_ids - _ids
if len(missing_asset_id)>0:
w.append(f"Missing Asset to set assumption:{missing_asset_id}")
case {'tag':'Multiple', 'contents':scenarioMap}:
for k,v in scenarioMap.items():
if v['tag']=='PoolLevel':
continue
_ids = set(flat([ _a[0] for _a in v['contents'][0]]))
if not _ids.issubset(asset_ids):
e.append(f"Scenario:{k},Not Valid Asset ID:{_ids - asset_ids}")
missing_asset_id = asset_ids - _ids
if len(missing_asset_id)>0:
w.append(f"Scenario:{k},Missing Asset to set assumption:{missing_asset_id}")
case None:
return [True,e,w]
case _ :
raise RuntimeError(f"Failed to match:{a}")
if len(e)>0:
return [False,e,w]
return [True,e,w]
def validate(self, _r) -> list:
error = []
warning = []
_r = json.loads(_r)
__d = _r['deal']
_d = __d['contents']
valid_acc = set(_d['accounts'].keys())
valid_bnd = set(_d['bonds'].keys())
valid_fee = set(_d['fees'].keys())
_w = _d['waterfall']
_,error,warning = self._validate_assump(_r,error,warning)
if _w is None:
raise RuntimeError("Waterfall is None")
        # validating waterfall
for wn,wa in _w.items():
for idx,action in enumerate(wa):
action = action[1]
match action['tag']:
case 'PayFeeBy':
if (not set(action['contents'][1]).issubset(valid_acc)) \
or (not set(action['contents'][2]).issubset(valid_fee)):
error.append(f"{wn},{idx}")
case 'PayFee':
if (not set(action['contents'][0]).issubset(valid_acc)) \
or (not set(action['contents'][1]).issubset(valid_fee)):
error.append(f"{wn},{idx}")
case 'PayInt':
if (action['contents'][0] not in valid_acc) \
or (not set(action['contents'][1]).issubset(valid_bnd)):
error.append(f"{wn},{idx}")
case 'PayPrin':
if (action['contents'][0] not in valid_acc) \
or (not set(action['contents'][1]).issubset(valid_bnd)):
error.append(f"{wn},{idx}")
case 'PayPrinBy':
if (action['contents'][1] not in valid_acc) \
or (not set(action['contents'][2]).issubset(valid_bnd)):
error.append(f"{wn},{idx}")
case 'PayResidual':
if (action['contents'][1] not in valid_acc) \
or (action['contents'][2] not in valid_bnd):
error.append(f"{wn},{idx}")
case 'Transfer':
if (action['contents'][0] not in valid_acc) \
or (action['contents'][1] not in valid_acc):
error.append(f"{wn},{idx}")
case 'TransferBy':
if (action['contents'][1] not in valid_acc) \
or (action['contents'][2] not in valid_acc):
error.append(f"{wn},{idx}")
case 'PayTillYield':
if (action['contents'][0] not in valid_acc) \
or (not set(action['contents'][1]).issubset(valid_bnd)):
error.append(f"{wn},{idx}")
case 'PayFeeResidual':
if (action['contents'][1] not in valid_acc) \
or (action['contents'][2] not in valid_fee):
error.append(f"{wn},{idx}")
if warning:
logging.warning(f"Warning in modelling:{warning}")
        if error:
            logging.error(f"Error in modelling:{error}")
            return False, error, warning
        else:
            return True, error, warning
def run(self,
deal,
assumptions=None,
pricing=None,
custom_endpoint=None,
read=True,
position=None,
timing=False):
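        # Usage sketch (assumption values below are illustrative only):
        #   single run     : r  = api.run(deal, assumptions=[{"CPR": 0.01}], read=True)
        #   multi-scenario : rs = api.run(deal, assumptions={"base": [{"CPR": 0.01}], "fast": [{"CPR": 0.03}]})
        # passing a dict of assumptions runs every scenario and returns results keyed by scenario name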
if isinstance(assumptions,str):
            with open(assumptions, 'rb') as _f:
                assumptions = pickle.load(_f)
# if run req is a multi-scenario run
if assumptions:
multi_run_flag = isinstance(assumptions, dict)
else:
multi_run_flag = False
# overwrite any custom_endpoint
url = f"{self.url}/run_deal"
if custom_endpoint:
url = f"{self.url}/{custom_endpoint}"
if isinstance(deal, str):
with open(deal,'rb') as _f:
c = _f.read()
deal = pickle.loads(c)
# construct request
req = self.build_req(deal, assumptions, pricing)
#validate deal
deal_validate,err,warn = self.validate(req)
if not deal_validate:
return deal_validate,err,warn
result = self._send_req(req,url)
if read:
if multi_run_flag:
return { n:deal.read(_r,position=position) for (n,_r) in result.items()}
else:
return deal.read(result,position=position)
else:
return result
def runPool(self, pool, assumptions=[],read=True):
url = f"{self.url}/run_pool"
req = self.build_pool_req(pool, assumptions=assumptions)
result = self._send_req(req,url)
if read:
flow_header,idx = guess_pool_flow_header(result[0],self.lang)
try:
result = pd.DataFrame([_['contents'] for _ in result] , columns=flow_header)
except ValueError as e:
                logging.error(f"Failed to match header:{flow_header} with {result[0]['contents']}")
                raise
result = result.set_index(idx)
result.index.rename(idx, inplace=True)
result.sort_index(inplace=True)
return result
def runStructs(self, deals, **p):
assert isinstance(deals, dict),f"Deals should be a dict but got {deals}"
return {k: self.run(v,**p) for k,v in deals.items() }
def _send_req(self,_req,_url)->dict:
try:
r = requests.post(_url, data=_req.encode('utf-8'), headers=self.hdrs, verify=False)
except (ConnectionRefusedError, ConnectionError):
return None
if r.status_code != 200:
print(json.loads(_req))
raise RuntimeError(r.text)
try:
result = json.loads(r.text)
return result
except JSONDecodeError as e:
raise RuntimeError(e)
def guess_pool_flow_header(x,l):
match (x['tag'],l):
case ('MortgageFlow','chinese'):
return (china_mortgage_flow_fields_d,"日期")
case ('MortgageFlow','english'):
return (english_mortgage_flow_fields_d,"Date")
case ('LoanFlow','chinese'):
return (china_loan_flow_d,"日期")
case ('LoanFlow','english'):
return (english_loan_flow_d,"Date")
case ('LeaseFlow','chinese'):
return (china_rental_flow_d,"日期")
case ('LeaseFlow','english'):
return (english_rental_flow_d,"Date")
case _:
raise RuntimeError(f"Failed to match pool header with {x['tag']}{l}")
def save(deal,p:str):
def save_to(b):
with open(p,'wb') as _f:
pickle.dump(b,_f)
match deal:
case _:
            save_to(deal)
# ==== end of file: AbsBox-0.12.5/absbox/client.py ====
import pandas as pd
import functools,json
import itertools,re
from datetime import datetime  # needed by str2date below
from enum import Enum
import numpy as np
import dataclasses
from functools import reduce
from pyxirr import xirr,xnpv
from absbox.local.base import *
def flat(xss) -> list:
return reduce(lambda xs, ys: xs + ys, xss)
def mkTag(x):
match x:
case (tagName, tagValue):
return {"tag": tagName, "contents": tagValue}
case (tagName):
return {"tag": tagName}
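# Note on mkTag (above): illustrative examples derived from its two match arms:
#   mkTag(("Fix", 0.05))        -> {"tag": "Fix", "contents": 0.05}
#   mkTag("CurrentPoolBalance") -> {"tag": "CurrentPoolBalance"}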
def readTagStr(x:str):
_x = json.loads(x.replace("'","\""))
if 'contents' in _x:
return f"<{_x['tag']}:{','.join(_x['contents'])}>"
return f"<{_x['tag']}>"
def readTag(x:dict):
return f"<{x['tag']}:{','.join(x['contents'])}>"
def isDate(x):
return re.match(r"\d{4}\-\d{2}\-\d{2}",x)
def mkTs(n, vs):
return mkTag((n, vs))
def unify(xs, ns):
"union dataframes by stacking up with names provided"
index_name = xs[0].index.name
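    # Usage sketch (frame/key names are illustrative): given per-bond frames df_a and df_b
    # sharing the same date index, unify([df_a, df_b], ["A", "B"]) returns one frame whose
    # columns are grouped under the top-level keys "A" and "B".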
dfs = []
for x, n in zip(xs, ns):
dfs.append(pd.concat([x], keys=[n], axis=1))
r = functools.reduce(lambda acc, x: acc.merge(x
, how='outer'
, on=[index_name])
, dfs)
return r.sort_index()
def unifyTs(xs):
"unify time-series alike dataframe"
_index_set = set([x.index.name for x in xs])
assert len(_index_set)==1,f"Index of dataframes should have same name,got:{_index_set}"
_index_name = list(_index_set)[0]
r = functools.reduce(lambda acc, x: acc.merge(x
, how='outer'
, on=[_index_name]), xs)
return r.sort_index()
def backFillBal(x,ds):
b = pd.DataFrame({"日期": ds})
b.set_index("日期", inplace=True)
base = pd.concat([b, x], axis=1).sort_index()
paidOffDate = None
r = None
if any(base['余额']==0):
paidOffDate = base[base['余额']==0].index[0]
base['flag'] = (base.index >= paidOffDate)
base.loc[base['flag']==True, "余额"] = 0
base.loc[base['flag']==False, "余额"] = (base["余额"] + base["本金"]).shift(-1).fillna(method='bfill')
r = base.drop(["flag"], axis=1)
else:
last_index = base.index.to_list()[-1]
last_keep_balance = base.at[last_index, "余额"]
base["余额"] = (base["余额"] + base["本金"]).shift(-1).fillna(method='bfill')
base.at[last_index, "余额"] = last_keep_balance
r = base
return r
def bondView(r,flow=None, flowName=True,flowDates=None,rnd=2):
result = []
default_bnd_col_size = 6
bnd_names = r['bonds'].keys()
b_dates = [ set(r['bonds'][bn].index.tolist()) for bn in bnd_names ]
all_b_dates = set()
for bd in b_dates:
all_b_dates = all_b_dates | bd
all_b_dates_s = list(all_b_dates)
all_b_dates_s.sort()
if flowDates is None:
flowDates = all_b_dates_s
for (bn, bnd) in r['bonds'].items():
if flow :
result.append(backFillBal(bnd,flowDates)[flow])
else:
result.append(backFillBal(bnd,flowDates))
x = pd.concat(result,axis=1)
bnd_cols_count = len(flow) if flow else default_bnd_col_size
headers = [ bnd_cols_count*[bn] for bn in bnd_names]
if flowName:
x.columns = [ list(itertools.chain.from_iterable(headers)) ,x.columns]
else:
x.columns = list(itertools.chain.from_iterable(headers))
return x.sort_index().round(rnd)
def accView(r, flow=None, flowName=True):
result = []
default_acc_col_size = 3
acc_names = r['accounts'].keys()
for (an, acc) in r['accounts'].items():
if flow :
result.append(acc.groupby("日期").last()[flow])
else:
result.append(acc.groupby("日期").last())
x = pd.concat(result,axis=1)
account_cols_count = len(flow) if flow else default_acc_col_size
headers = [ account_cols_count*[an] for an in acc_names]
if flowName:
x.columns = [ list(itertools.chain.from_iterable(headers)) ,x.columns]
else:
x.columns = list(itertools.chain.from_iterable(headers))
return x.sort_index()
def feeView(r,flow=None):
fees = r['fees']
feeNames = list(fees.keys())
feeVals = list(fees.values())
if flow is None:
return unify(feeVals, feeNames)
else:
newFees = [ _f[flow] for _f in feeVals]
return unify(newFees,feeNames)
def peekAtDates(x,ds):
x_consol = x.groupby(["日期"]).last()
if x_consol.index.get_indexer(ds,method='pad').min()==-1:
raise RuntimeError(f"<查看日期:{ds}>早于当前DataFrame")
keep_idx = [x_consol.index.asof(d) for d in ds]
y = x_consol.loc[keep_idx]
y.reset_index("日期")
y["日期"] = ds
return y.set_index("日期")
def balanceSheetView(r, ds=None, equity=None, rnd=2):
bv = bondView(r, flow=["余额"],flowDates=ds,flowName=False)
av = accView(r, flow=["余额"],flowName=False)
pv = r['pool']['flow'][["未偿余额"]]
if "违约金额" in r['pool']['flow'] and "回收金额" in r['pool']['flow']:
r['pool']["flow"]["不良"] = r['pool']['flow']["违约金额"].cumsum() - r['pool']['flow']["回收金额"].cumsum()
pv = r['pool']['flow'][["未偿余额","不良"]]
if equity:
equityFlow = bondView(r, flow=["本息合计"],flowDates=ds,flowName=False)[equity]
equityFlow.columns = pd.MultiIndex.from_arrays([["权益"]*len(equity), list(equityFlow.columns)])
equityFlow["权益", f"合计分配{equity}"] = equityFlow.sum(axis=1)
if ds is None:
ds = list(bv.index)
if equity:
bv.drop(columns=equity, inplace=True)
try:
pvCol, avCol = [ peekAtDates(_, ds) for _ in [pv, av] ]
# need to add cutoff amount for equity tranche
for k, _ in [("资产池", pvCol), ("账户", avCol), ("债券", bv)]:
_[f'{k}-合计'] = _.sum(axis=1)
asset_cols = (len(pvCol.columns)+len(avCol.columns))*["资产"]
liability_cols = len(bv.columns)*["负债"]
header = asset_cols + liability_cols
bs = pd.concat([pvCol, avCol, bv], axis=1)
bs.columns = pd.MultiIndex.from_arrays([header, list(bs.columns)])
bs["资产", "合计"] = bs["资产", "资产池-合计"]+bs["资产", "账户-合计"]
bs["负债", "合计"] = bs["负债", "债券-合计"]
if equity:
bs["权益", "累计分配"] = equityFlow["权益", f"合计分配{equity}"].cumsum()
bs["权益", "合计"] = bs["资产", "合计"] - bs["负债", "合计"] + bs["权益", "累计分配"]
else:
bs["权益", "合计"] = bs["资产", "合计"] - bs["负债", "合计"]
except RuntimeError as e:
print(f"Error: 其他错误=>{e}")
return bs.round(rnd) # unify([pvCol,avCol,bvCol],["资产-资产池","资产-账户","负债"])
def PnLView(r,ds=None):
accounts = r['accounts']
consoleStmts = pd.concat([ acc for acc in accounts ])
return consoleStmts
def consolStmtByDate(s):
return s.groupby("日期").last()
def aggStmtByDate(s):
return s.groupby("日期").sum()
def aggCFby(_df, interval, cols):
df = _df.copy()
idx = None
dummy_col = '_index'
df[dummy_col] = df.index
_mapping = {"月份":"M"
,"Month":"M"
,"M":"M"
,"month":"M"}
if df.index.name == "日期":
idx = "日期"
else:
idx = "date"
df[dummy_col]=pd.to_datetime(df[dummy_col]).dt.to_period(_mapping[interval])
return df.groupby([dummy_col])[cols].sum().rename_axis(idx)#.drop(columns=[dummy_col])
def irr(flow,init=None):
def extract_cash_col(_cols):
if _cols == china_bondflow_fields_s:
return flow['本息合计']
elif _cols == english_bondflow_fields_s:
return flow['cash']
else:
raise RuntimeError("Failed to match",_cols)
cols = flow.columns.to_list()
dates = flow.index.to_list()
amounts = extract_cash_col(cols).to_list()
if init is not None:
invest_date,invest_amount = init
dates = [invest_date]+dates
amounts = [invest_amount]+amounts
return xirr(np.array(dates), np.array(amounts))
def sum_fields_to_field(_df,cols,col):
df = _df.copy()
df[col] = df[cols].sum(axis=1)
return df
def npv(_flow,**p):
flow = _flow.copy()
cols = flow.columns.to_list()
idx_name = flow.index.name
init_date,_init_amt = p['init']
init_amt = _init_amt if _init_amt!=0.00 else 0.0001
def _pv(_af):
af = flow[_af]
return xnpv(p['rate'],[init_date]+flow.index.to_list(),[-1*init_amt]+af.to_list())
    # NB: a bare name inside a match pattern is a capture pattern, so the expected
    # column lists are compared explicitly in guards rather than silently shadowed.
    match (cols, idx_name):
        case (_cols, "日期") if _cols == china_rental_flow:
            return _pv("租金")
        case (_cols, "Date") if _cols == english_rental_flow:
            return _pv("Rental")
        case (_cols, "Date") if _cols == english_mortgage_flow_fields:
            flow = sum_fields_to_field(flow, ["Principal", "Interest", "Prepayment", "Recovery"], "Cash")
            return _pv("Cash")
        case (_cols, "日期") if _cols == china_bondflow_fields:
            return _pv("本息合计")
        case (_cols, "Date") if _cols == english_bondflow_fields:
            return _pv("cash")
        case _:
            raise RuntimeError("Failed to match", cols, idx_name)
def update_deal(d,i,c):
    "A patch function to update a deal data component list in an immutable way"
_d = d.copy()
_d.pop(i)
_d.insert(i,c)
return _d
def mkDealsBy(d, m:dict)->dict:
return { k:dataclasses.replace(d, **v)
for k,v in m.items()}
class DC(Enum): # TODO need to check with HS code
DC_30E_360 = "DC_30E_360"
DC_30Ep_360 ="DC_30Ep_360"
DC_ACT_360 = "DC_ACT_360"
DC_ACT_365A = "DC_ACT_365A"
DC_ACT_365L = "DC_ACT_365L"
DC_NL_365 = "DC_NL_365"
DC_ACT_365F = "DC_ACT_365F"
DC_ACT_ACT = "DC_ACT_ACT"
DC_30_360_ISDA = "DC_30_360_ISDA"
DC_30_360_German = "DC_30_360_German"
DC_30_360_US = "DC_30_360_US"
def str2date(x:str):
return datetime.strptime(x, '%Y-%m-%d').date()
def daysBetween(sd,ed):
return (ed - sd).days
def guess_locale(x):
accs = x['accounts']
assert len(accs)>0,"Failed to identify via deal accounts result"
acc_cols = set(list(accs.values())[0].columns.to_list())
locale = None
if acc_cols == set(["余额", "变动额", "备注"]):
locale="cn"
if acc_cols == set(["balance", "change", "memo"]):
locale="en"
    return locale
# ==== end of file: AbsBox-0.12.5/absbox/local/util.py ====
import matplotlib.pyplot as plt
from matplotlib import font_manager
from absbox.local.util import guess_locale,aggStmtByDate,consolStmtByDate
from pyspecter import query, S
from functools import reduce
import numpy as np
import logging
dmap = {
"cn":{
"bond":"债券","scenario":"场景"
},
"en":{
"bond":"bond","scenario":"scenario"
}
}
def init_plot_fonts():
define_list = ['Source Han Serif CN','Microsoft Yahei','STXihei']
support_list = font_manager.findSystemFonts(fontpaths=None, fontext='ttf')
font_p = font_manager.FontProperties()
try:
for sl in support_list:
f = font_manager.get_font(sl)
if f.family_name in set(define_list):
font_p.set_family(f.family_name)
font_p.set_size(14)
return font_p
except RuntimeError as e:
logging.error("中文字体载入失败")
return None
font_p = init_plot_fonts()
def plot_bond(rs, bnd, flow='本息合计'):
"""Plot bonds across scenarios"""
plt.figure(figsize=(12,8))
_alpha = 0.8
locale = guess_locale(list(rs.values())[0])
    for idx, s in enumerate(rs.values()):
plt.step(s['bonds'][bnd].index,s['bonds'][bnd][[flow]], alpha=_alpha, linewidth=5, label=f"{dmap[locale]['scenario']}-{idx}")
plt.legend(loc='upper left', prop=font_p)
plt.title(f"{len(rs)} {dmap[locale]['scenario']} {dmap[locale]['bond']}:{bnd} - {flow}", fontproperties=font_p)
plt.grid(True)
plt.axis('tight')
plt.xticks(rotation=30)
current_values = plt.gca().get_yticks()
plt.gca().set_yticklabels(['{:.0f}(w)'.format(x/10000) for x in current_values])
return plt
def plot_bonds(r, bnds:list, flow='本息合计'):
"Plot bond flows with in a single run"
locale = guess_locale(r)
plt.figure(figsize=(12,8))
_alpha = 0.8
for b in bnds:
b_flow = r['bonds'][b]
plt.step(b_flow.index,b_flow[[flow]], alpha=_alpha, linewidth=5, label=f"{dmap[locale]['bond']}-{b}")
plt.legend(loc='upper left', prop=font_p)
bnd_title = ','.join(bnds)
plt.title(f"{dmap[locale]['bond']}:{bnd_title} - {flow}", fontproperties=font_p)
plt.grid(True)
plt.axis('tight')
plt.xticks(rotation=30)
current_values = plt.gca().get_yticks()
plt.gca().set_yticklabels(['{:.0f}(w)'.format(x/10000) for x in current_values])
return plt
def plot_by_scenario(rs, flowtype, flowpath):
"Plot with multiple scenario"
plt.figure(figsize=(12,8))
scenario_names = rs.keys()
dflows = [query(rs,[s]+flowpath) for s in scenario_names]
_alpha = 0.8
x_labels = reduce(lambda acc,x:acc.union(x) ,[ _.index for _ in dflows ]).unique()
x = np.arange(len(x_labels))
width = 1
step_length = width / (len(scenario_names)+1)
for (idx,(scen,dflow)) in enumerate(zip(scenario_names,dflows)):
if flowtype=="balance":
cb = consolStmtByDate(dflow)
plt.step(cb.index, cb, alpha=_alpha, linewidth=5, label=f"{scen}")
elif flowtype=="amount":
cb = aggStmtByDate(dflow)
_bar = plt.bar(x+idx*step_length,cb,width=step_length,label=scen)
else:
plt.plot(dflow.index,dflow, alpha=_alpha, linewidth=5, label=f"{scen}")
plt.legend(scenario_names,loc='upper right', prop=font_p)
plt.grid(True)
plt.axis('tight')
    plt.xticks(ticks=x, labels=x_labels, rotation=30)
# ==== end of file: AbsBox-0.12.5/absbox/local/plot.py ====
from absbox.local.util import mkTag, DC, mkTs, guess_locale, readTagStr
from enum import Enum
import itertools
import functools
import logging
import pandas as pd
from pyspecter import query, S
datePattern = {"月末": "MonthEnd", "季度末": "QuarterEnd", "年末": "YearEnd", "月初": "MonthFirst",
"季度初": "QuarterFirst", "年初": "YearFirst", "每年": "MonthDayOfYear", "每月": "DayOfMonth", "每周": "DayOfWeek"}
freqMap = {"每月": "Monthly", "每季度": "Quarterly", "每半年": "SemiAnnually", "每年": "Annually", "Monthly": "Monthly", "Quarterly": "Quarterly", "SemiAnnually": "SemiAnnually", "Annually": "Annually", "monthly": "Monthly", "quarterly": "Quarterly", "semiAnnually": "SemiAnnually", "annually": "Annually"
}
baseMap = {"资产池余额": "CurrentPoolBalance", "资产池期末余额": "CurrentPoolBalance", "资产池期初余额": "CurrentPoolBegBalance", "资产池初始余额": "OriginalPoolBalance", "初始资产池余额": "OriginalPoolBalance", "资产池当期利息": "PoolCollectionInt", "债券余额": "CurrentBondBalance", "债券初始余额": "OriginalBondBalance", "当期已付债券利息": "LastBondIntPaid", "当期已付费用": "LastFeePaid", "当期未付债券利息": "CurrentDueBondInt", "当期未付费用": "CurrentDueFee"
}
def mkLiq(x):
match x:
case {"正常余额折价": cf, "违约余额折价": df}:
return mkTag(("BalanceFactor", [cf, df]))
case {"CurrentFactor": cf, "DefaultFactor": df}:
return mkTag(("BalanceFactor", [cf, df]))
case {"贴现计价": df, "违约余额回收率": r}:
return mkTag(("PV", [df, r]))
case {"PV": df, "DefaultRecovery": r}:
return mkTag(("PV", [df, r]))
def mkDatePattern(x):
match x:
case ["每月", _d]:
return mkTag((datePattern["每月"], _d))
case ["每年", _m, _d]:
return mkTag((datePattern["每年"], [_m, _d]))
case ["DayOfMonth", _d]:
return mkTag(("DayOfMonth", _d))
case ["MonthDayOfYear", _m, _d]:
            return mkTag(("MonthDayOfYear", [_m, _d]))
case ["CustomDate", *_ds]:
return mkTag(("CustomDate", _ds))
case ["AllDatePattern", *_dps]:
return mkTag(("AllDatePattern", [ mkDatePattern(_) for _ in _dps]))
case _x if (_x in datePattern.values()):
return mkTag((_x))
case _x if (_x in datePattern.keys()):
return mkTag((datePattern[x]))
case _:
raise RuntimeError(f"Failed to match {x}")
def mkDate(x):
match x:
case {"封包日": a, "起息日": b, "首次兑付日": c, "法定到期日": d, "收款频率": pf, "付款频率": bf} | \
{"cutoff": a, "closing": b, "firstPay": c, "stated": d, "poolFreq": pf, "payFreq": bf}:
firstCollection = x.get("首次归集日", b)
mr = x.get("循环结束日", None)
return mkTag(("PreClosingDates", [a, b, mr, d, [firstCollection, mkDatePattern(pf)], [c, mkDatePattern(bf)]]))
case {"归集日": (lastCollected, nextCollect), "兑付日": (pp, np), "法定到期日": c, "收款频率": pf, "付款频率": bf} | \
{"collect": (lastCollected, nextCollect), "pay": (pp, np), "stated": c, "poolFreq": pf, "payFreq": bf}:
mr = x.get("循环结束日", None)
return mkTag(("CurrentDates", [[lastCollected, pp],
mr,
c,
[nextCollect, mkDatePattern(pf)],
[np, mkDatePattern(bf)]]))
case {"回款日": cdays, "分配日": ddays, "封包日": cutoffDate, "起息日": closingDate} | \
{"poolCollection": cdays, "distirbution": ddays, "cutoff": cutoffDate, "closing": closingDate}:
return mkTag(("CustomDates", [cutoffDate, [mkTag(("PoolCollection", [cd, ""])) for cd in cdays], closingDate, [mkTag(("RunWaterfall", [dd, ""])) for dd in ddays]]))
case _:
raise RuntimeError(f"Failed to match:{x}")
def mkFeeType(x):
match x:
case {"年化费率": [base, rate]} | {"annualPctFee": [base, rate]}:
return mkTag(("AnnualRateFee", [mkTag((baseMap[base], '1970-01-01')), rate]))
case {"百分比费率": [*desc, rate]} | {"pctFee": [*desc, rate]}:
match desc:
case ["资产池回款", "利息"] | ["poolCollection", "interest"]:
return mkTag(("PctFee", [mkTag(("PoolCollectionIncome", "CollectedInterest")), rate]))
case ["已付利息合计", *bns] | ["paidInterest", *bns]:
return mkTag(("PctFee", [mkTag(("LastBondIntPaid", bns)), rate]))
case ["已付本金合计", *bns] | ["paidPrincipal", *bns]:
return mkTag(("PctFee", [mkTag(("LastBondPrinPaid", bns)), rate]))
case _:
raise RuntimeError(f"Failed to match on 百分比费率:{desc,rate}")
case {"固定费用": amt} | {"fixFee": amt}:
return mkTag(("FixFee", amt))
case {"周期费用": [p, amt]} | {"recurFee": [p, amt]}:
return mkTag(("RecurFee", [mkDatePattern(p), amt]))
case {"自定义": fflow} | {"customFee": fflow}:
return mkTag(("FeeFlow", mkTs("BalanceCurve", fflow)))
case {"计数费用": [p, s, amt]} | {"numFee": [p, s, amt]}:
return mkTag(("NumFee", [mkDatePattern(p), mkDs(s), amt]))
case _:
raise RuntimeError(f"Failed to match on fee type:{x}")
def mkDateVector(x):
match x:
case dp if isinstance(dp, str):
return mkTag(datePattern[dp])
case [dp, *p] if (dp in datePattern.keys()):
return mkTag((datePattern[dp], p))
case _:
raise RuntimeError(f"not match found: {x}")
def mkDs(x):
"Making Deal Stats"
match x:
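        # Illustrative translations (bond/account names are examples only):
        #   mkDs(("bondBalance", "A")) -> {"tag": "CurrentBondBalanceOf", "contents": ["A"]}
        #   mkDs(("poolFactor",))      -> {"tag": "PoolFactor"}
        #   mkDs(("factor", ("poolBalance",), 0.5))
        #       -> {"tag": "Factor", "contents": [{"tag": "CurrentPoolBalance"}, 0.5]}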
case ("债券余额",) | ("bondBalance",):
return mkTag("CurrentBondBalance")
case ("债券余额", *bnds) | ("bondBalance", *bnds):
return mkTag(("CurrentBondBalanceOf", bnds))
case ("初始债券余额",) | ("originalBondBalance",):
return mkTag("OriginalBondBalance")
case ("到期月份", bn) | ("monthsTillMaturity", bn):
return mkTag(("MonthsTillMaturity", bn))
case ("资产池余额",) | ("poolBalance",):
return mkTag("CurrentPoolBalance")
case ("初始资产池余额",) | ("originalPoolBalance",):
return mkTag("OriginalPoolBalance")
case ("资产池违约余额",) | ("currentPoolDefaultedBalance",):
return mkTag("CurrentPoolDefaultedBalance")
case ("资产池累积违约余额",) | ("cumPoolDefaultedBalance",):
return mkTag("CumulativePoolDefaultedBalance")
case ("资产池累积违约率",) | ("cumPoolDefaultedRate",):
return mkTag("CumulativePoolDefaultedRate")
case ("债券系数",) | ("bondFactor",):
return mkTag("BondFactor")
case ("资产池系数",) | ("poolFactor",):
return mkTag("PoolFactor")
        case ("所有账户余额",) | ("accountBalance",):
return mkTag("AllAccBalance")
case ("账户余额", *ans) | ("accountBalance", *ans):
return mkTag(("AccBalance", ans))
case ("债券待付利息", *bnds) | ("bondDueInt", *bnds):
return mkTag(("CurrentDueBondInt", bnds))
case ("债券已付利息", *bnds) | ("lastBondIntPaid", *bnds):
return mkTag(("LastBondIntPaid", bnds))
case ("债券低于目标余额", bn) | ("behindTargetBalance", bn):
return mkTag(("BondBalanceGap", bn))
case ("债务人数量",) | ("borrowerNumber",):
return mkTag(("CurrentPoolBorrowerNum"))
# , "当期已付债券利息":"LastBondIntPaid"
# , "当期已付费用" :"LastFeePaid"
# , "当期未付债券利息" :"CurrentDueBondInt"
# , "当期未付费用": "CurrentDueFee"
case ("待付费用", *fns) | ("feeDue", *fns):
return mkTag(("CurrentDueFee", fns))
case ("已付费用", *fns) | ("lastFeePaid", *fns):
return mkTag(("LastFeePaid", fns))
case ("系数", ds, f) | ("factor", ds, f):
return mkTag(("Factor", [mkDs(ds), f]))
case ("Min", ds1, ds2):
return mkTag(("Min", [mkDs(ds1), mkDs(ds2)]))
case ("Max", ds1, ds2):
return mkTag(("Max", [mkDs(ds1), mkDs(ds2)]))
case ("合计", *ds) | ("sum", *ds):
return mkTag(("Sum", [mkDs(_ds) for _ds in ds]))
case ("差额", *ds) | ("substract", *ds):
return mkTag(("Substract", [mkDs(_ds) for _ds in ds]))
case ("常数", n) | ("constant", n):
return mkTag(("Constant", n))
case ("储备账户缺口", *accs) | ("reserveGap", *accs):
return mkTag(("ReserveAccGap", accs))
case ("自定义", n) | ("custom", n):
return mkTag(("UseCustomData", n))
case legacy if (legacy in baseMap.keys()):
return mkDs((legacy,))
case _:
raise RuntimeError(f"Failed to match DS/Formula: {x}")
def isPre(x):
try:
return mkPre(x) is not None
except RuntimeError as e:
return False
def mkPre(p):
def isIntQuery(y):
match y:
case ("monthsTillMaturity", _):
return True
case ("到期月份", _):
return True
case _:
return False
dealStatusMap = {"摊还": "Current", "加速清偿": "Accelerated", "循环": "Revolving"}
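    # Illustrative predicates (dates/values are examples only), built from the arms below:
    #   mkPre([">", "2024-01-01"])              -> {"tag": "IfAfterDate", "contents": "2024-01-01"}
    #   mkPre([("bondBalance", "A"), "<", 100]) -> {"tag": "IfLT", "contents":
    #                                               [{"tag": "CurrentBondBalanceOf", "contents": ["A"]}, 100]}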
match p:
        case [ds, "=", 0]:
            return mkTag(("IfZero", mkDs(ds)))
        case [ds, "=", n]:
            if isIntQuery(ds):
                return mkTag(("IfEqInt", [mkDs(ds), n]))
            else:
                return mkTag(("IfEqBal", [mkDs(ds), n]))
case [ds, ">", amt]:
if isIntQuery(ds):
return mkTag(("IfGTInt", [mkDs(ds), amt]))
else:
return mkTag(("IfGT", [mkDs(ds), amt]))
case [ds, "<", amt]:
if isIntQuery(ds):
return mkTag(("IfLTInt", [mkDs(ds), amt]))
else:
return mkTag(("IfLT", [mkDs(ds), amt]))
case [ds, ">=", amt]:
if isIntQuery(ds):
return mkTag(("IfGETInt", [mkDs(ds), amt]))
else:
return mkTag(("IfGET", [mkDs(ds), amt]))
case [ds, "<=", amt]:
if isIntQuery(ds):
return mkTag(("IfLETInt", [mkDs(ds), amt]))
else:
return mkTag(("IfLET", [mkDs(ds), amt]))
case [">", _d]:
return mkTag(("IfAfterDate", _d))
case ["<", _d]:
return mkTag(("IfBeforeDate", _d))
case [">=", _d]:
return mkTag(("IfAfterOnDate", _d))
case ["<=", _d]:
return mkTag(("IfBeforeOnDate", _d))
case ["状态", _st] | ["status", _st]:
return mkTag(("IfDealStatus", mkStatus(_st)))
        case ["同时满足", _p1, _p2] | ["all", _p1, _p2]:
            return mkTag(("And", [mkPre(_p1), mkPre(_p2)]))
        case ["任一满足", _p1, _p2] | ["any", _p1, _p2]:
            return mkTag(("Or", [mkPre(_p1), mkPre(_p2)]))
case _:
raise RuntimeError(f"Failed to match on Pre: {p}")
def mkAccInt(x):
match x:
case {"周期": _dp, "利率": idx, "利差": spd, "最近结息日": lsd} \
| {"period": _dp, "index": idx, "spread": spd, "lastSettleDate": lsd}:
return mkTag(("InvestmentAccount", [idx, spd, lsd, mkDateVector(_dp)]))
case {"周期": _dp, "利率": br, "最近结息日": lsd} \
| {"period": _dp, "rate": br, "lastSettleDate": lsd}:
return mkTag(("BankAccount", [br, lsd, mkDateVector(_dp)]))
case None:
return None
case _:
raise RuntimeError(
f"Failed to match on account interest definition: {x}")
def mkAccType(x):
match x:
case {"固定储备金额": amt} | {"fixReserve": amt}:
return mkTag(("FixReserve", amt))
case {"目标储备金额": [base, rate]} | {"targetReserve": [base, rate]}:
match base:
case ["合计", *qs] | ["Sum", *qs]:
sumDs = [mkDs(q) for q in qs]
return mkTag(("PctReserve", [mkTag(("Sum", sumDs)), rate]))
case _:
return mkTag(("PctReserve", [mkDs(base), rate]))
case {"目标储备金额": {"公式": ds, "系数": rate}} | {"targetReserve": {"formula": ds, "factor": rate}}:
return mkTag(("PctReserve", [mkDs(ds), rate]))
case {"目标储备金额": {"公式": ds}} | {"targetReserve": {"formula": ds}}:
return mkTag(("PctReserve", [mkDs(ds), 1.0]))
case {"较高": [a, b]} | {"max": [a, b]}:
return mkTag(("Max", [mkAccType(a), mkAccType(b)]))
case {"较低": [a, b]} | {"min": [a, b]}:
return mkTag(("Min", [mkAccType(a), mkAccType(b)]))
case {"分段": [p, a, b]} | {"When": [p, a, b]}:
return mkTag(("Either", [mkPre(p), mkAccType(a), mkAccType(b)]))
case None:
return None
case _:
raise RuntimeError(f"Failed to match {x} for account reserve type")
def mkAccTxn(xs):
"AccTxn T.Day Balance Amount Comment"
if xs is None:
return None
else:
return [mkTag(("AccTxn", x)) for x in xs]
def mkAcc(an, x):
match x:
case {"余额": b, "类型": t, "计息": i, "记录": tx} | {"balance": b, "type": t, "interest": i, "txn": tx}:
return {"accBalance": b, "accName": an, "accType": mkAccType(t), "accInterest": mkAccInt(i), "accStmt": mkAccTxn(tx)}
case {"余额": b} | {"balance": b}:
return mkAcc(an, x | {"计息": x.get("计息", None), "interest": x.get("interest", None), "记录": x.get("记录", None), "txn": x.get("txn", None), "类型": x.get("类型", None), "type": x.get("type", None)})
case _:
raise RuntimeError(f"Failed to match account: {an},{x}")
def mkBondType(x):
match x:
case {"固定摊还": schedule} | {"PAC": schedule}:
return mkTag(("PAC", mkTag(("BalanceCurve", schedule))))
case {"过手摊还": None} | {"Sequential": None}:
return mkTag(("Sequential"))
case {"锁定摊还": _after} | {"Lockout": _after}:
return mkTag(("Lockout", _after))
case {"权益": _} | {"Equity": _}:
return mkTag(("Equity"))
case _:
raise RuntimeError(f"Failed to match bond type: {x}")
def mkRateReset(x):
match x:
case {"重置期间": interval, "起始": sdate} | {"resetInterval": interval, "starts": sdate}:
return mkTag(("ByInterval", [freqMap[interval], sdate]))
case {"重置期间": interval} | {"resetInterval": interval}:
return mkTag(("ByInterval", [freqMap[interval], None]))
case {"重置月份": monthOfYear} | {"resetMonth": monthOfYear}:
return mkTag(("MonthOfYear", monthOfYear))
case _:
raise RuntimeError(f"Failed to match:{x}: mkRateReset")
def mkBondRate(x):
indexMapping = {"LPR5Y": "LPR5Y", "LIBOR1M": "LIBOR1M"}
match x:
case {"浮动": [_index, Spread, resetInterval], "日历": dc} | \
{"floater": [_index, Spread, resetInterval], "dayCount": dc}:
return mkTag(("Floater", [indexMapping[_index], Spread, mkRateReset(resetInterval), dc, None, None]))
case {"浮动": [_index, Spread, resetInterval]} | {"floater": [_index, Spread, resetInterval]}:
return mkBondRate(x | {"日历": DC.DC_ACT_365F.value, "dayCount": DC.DC_ACT_365F.value})
case {"固定": _rate, "日历": dc} | {"fix": _rate, "dayCount": dc}:
return mkTag(("Fix", [_rate, dc]))
case {"固定": _rate} | {"Fixed": _rate}:
return mkTag(("Fix", [_rate, DC.DC_ACT_365F.value]))
case {"期间收益": _yield}:
return mkTag(("InterestByYield", _yield))
case _:
raise RuntimeError(f"Failed to match bond rate type:{x}")
def mkBnd(bn, x):
match x:
case {"当前余额": bndBalance, "当前利率": bndRate, "初始余额": originBalance, "初始利率": originRate, "起息日": originDate, "利率": bndInterestInfo, "债券类型": bndType} | \
{"balance": bndBalance, "rate": bndRate, "originBalance": originBalance, "originRate": originRate, "startDate": originDate, "rateType": bndInterestInfo, "bondType": bndType}:
md = x.get("到期日", None) or x.get("maturityDate", None)
return {bn: {"bndName": bn, "bndBalance": bndBalance, "bndRate": bndRate, "bndOriginInfo":
{"originBalance": originBalance, "originDate": originDate, "originRate": originRate} | {"maturityDate": md}, "bndInterestInfo": mkBondRate(bndInterestInfo), "bndType": mkBondType(bndType), "bndDuePrin": 0, "bndDueInt": 0, "bndDueIntDate": None}}
case _:
raise RuntimeError(f"Failed to match bond:{bn},{x}:mkBnd")
def mkLiqMethod(x):
match x:
case ["正常|违约", a, b] | ["Cuurent|Defaulted", a, b]:
return mkTag(("BalanceFactor", [a, b]))
case ["正常|拖欠|违约", a, b, c] | ["Cuurent|Delinquent|Defaulted", a, b, c]:
return mkTag(("BalanceFactor2", [a, b, c]))
case ["贴现|违约", a, b] | ["PV|Defaulted", a, b]:
return mkTag(("PV", [a, b]))
case _:
raise RuntimeError(f"Failed to match {x}:mkLiqMethod")
def mkFeeCapType(x):
match x:
case {"应计费用百分比": pct} | {"duePct": pct}:
return mkTag(("DuePct", pct))
case {"应计费用上限": amt} | {"dueCapAmt": amt}:
return mkTag(("DueCapAmt", amt))
case _:
raise RuntimeError(f"Failed to match {x}:mkFeeCapType")
def mkPDA(x):
match x:
case {"公式": ds} | {"formula": ds}:
return mkTag(("DS", mkDs(ds)))
case _:
raise RuntimeError(f"Failed to match {x}:mkPDA")
def mkAccountCapType(x):
match x:
case {"余额百分比": pct} | {"balPct": pct}:
return mkTag(("DuePct", pct))
case {"金额上限": amt} | {"balCapAmt": amt}:
return mkTag(("DueCapAmt", amt))
case _:
raise RuntimeError(f"Failed to match {x}:mkAccountCapType")
def mkTransferLimit(x):
match x:
case {"余额百分比": pct} | {"balPct": pct}:
return mkTag(("DuePct", pct))
case {"金额上限": amt} | {"balCapAmt": amt}:
return mkTag(("DueCapAmt", amt))
case {"公式": "ABCD"}:
return mkTag(("Formula", "ABCD"))
case {"公式": formula} | {"formula": formula}:
return mkTag(("DS", mkDs(formula)))
case _:
raise RuntimeError(f"Failed to match :{x}:mkTransferLimit")
def mkAction(x):
match x:
case ["账户转移", source, target] | ["transfer", source, target]:
return mkTag(("Transfer", [source, target]))
case ["按公式账户转移", _limit, source, target] | ["transferBy", _limit, source, target]:
return mkTag(("TransferBy", [mkTransferLimit(_limit), source, target]))
case ["计提费用", *feeNames] | ["calcFee", *feeNames]:
return mkTag(("CalcFee", feeNames))
case ["计提利息", *bndNames] | ["calcInt", *bndNames]:
return mkTag(("CalcBondInt", bndNames))
case ["支付费用", source, target] | ["payFee", source, target]:
return mkTag(("PayFee", [source, target]))
case ["支付费用收益", source, target, _limit] | ["payFeeResidual", source, target, _limit]:
limit = mkAccountCapType(_limit)
return mkTag(("PayFeeResidual", [limit, source, target]))
case ["支付费用收益", source, target] | ["payFeeResidual", source, target]:
return mkTag(("PayFeeResidual", [None, source, target]))
case ["支付费用限额", source, target, _limit] | ["payFeeBy", source, target, _limit]:
limit = mkFeeCapType(_limit)
return mkTag(("PayFeeBy", [limit, source, target]))
case ["支付利息", source, target] | ["payInt", source, target]:
return mkTag(("PayInt", [source, target]))
case ["支付本金", source, target, _limit] | ["payPrin", source, target, _limit]:
pda = mkPDA(_limit)
return mkTag(("PayPrinBy", [pda, source, target]))
case ["支付本金", source, target] | ["payPrin", source, target]:
return mkTag(("PayPrin", [source, target]))
case ["支付剩余本金", source, target] | ["payPrinResidual", source, target]:
return mkTag(("PayPrinResidual", [source, target]))
case ["支付期间收益", source, target]:
return mkTag(("PayTillYield", [source, target]))
case ["支付收益", source, target, limit] | ["payResidual", source, target, limit]:
return mkTag(("PayResidual", [limit, source, target]))
case ["支付收益", source, target] | ["payResidual", source, target]:
return mkTag(("PayResidual", [None, source, target]))
case ["储备账户转移", source, target, satisfy] | ["transferReserve", source, target, satisfy]:
_map = {"源储备": "Source", "目标储备": "Target"}
return mkTag(("TransferReserve", [_map[satisfy], source, target]))
case ["出售资产", liq, target] | ["sellAsset", liq, target]:
return mkTag(("LiquidatePool", [mkLiqMethod(liq), target]))
case ["流动性支持", source, target, limit] | ["liqSupport", source, target, limit]:
return mkTag(("LiqSupport", [mkTag(("DS", mkDs(limit))), source, target]))
case ["流动性支持", source, target] | ["liqSupport", source, target]:
return mkTag(("LiqSupport", [None, source, target]))
case ["流动性支持偿还", source, target] | ["liqRepay", source, target]:
return mkTag(("LiqRepay", [None, source, target]))
case ["流动性支持报酬", source, target] | ["liqRepayResidual", source, target]:
return mkTag(("LiqYield", [None, source, target]))
case ["流动性支持计提", target] | ["liqAccrue", target]:
return mkTag(("LiqAccrue", target))
case _:
raise RuntimeError(f"Failed to match :{x}:mkAction")
def mkWaterfall2(x):
match x:
case (pre, *_action) if isPre(pre) and len(x) > 2: # pre with multiple actions
_pre = mkPre(pre)
return [[_pre, mkAction(a)] for a in _action]
case (pre, _action) if isPre(pre) and len(x) == 2: # pre with 1 actions
_pre = mkPre(pre)
return [[_pre, mkAction(_action)]]
case _:
return [[None, mkAction(x)]]
def mkStatus(x):
match x:
case "摊销" | "Amortizing":
return mkTag(("Amortizing"))
case "循环" | "Revolving":
return mkTag(("Revolving"))
case "加速清偿" | "Accelerated":
return mkTag(("DealAccelerated", None))
case "违约" | "Defaulted":
return mkTag(("DealDefaulted", None))
case "结束" | "Ended":
return mkTag(("Ended"))
case "设计" | "PreClosing":
return mkTag(("PreClosing"))
case _:
raise RuntimeError(f"Failed to match :{x}:mkStatus")
def readStatus(x, locale):
m = {"en": {'amort': "Amortizing", 'def': "Defaulted", 'acc': "Accelerated", 'end': "Ended",
'pre': "PreClosing"}, "cn": {'amort': "摊销", 'def': "违约", 'acc': "加速清偿", 'end': "结束", 'pre': "设计"}}
match x:
case {"tag": "Amortizing"}:
return m[locale]['amort']
case {"tag": "DealAccelerated"}:
return m[locale]['acc']
case {"tag": "DealDefaulted"}:
return m[locale]['def']
case {"tag": "Ended"}:
return m[locale]['end']
case {"tag": "PreClosing"}:
return m[locale]['pre']
case _:
raise RuntimeError(
f"Failed to read deal status:{x} with locale: {locale}")
def mkWhenTrigger(x):
match x:
case "回收后" | "BeforeCollect":
return "BeginCollectionWF"
case "回收动作后" | "AfterCollect":
return "EndCollectionWF"
case "分配前" | "BeforeDistribution":
return "BeginDistributionWF"
case "分配后" | "AfterDistribution":
return "EndDistributionWF"
case _:
raise RuntimeError(f"Failed to match :{x}:mkWhenTrigger")
def mkThreshold(x):
match x:
case ">":
return "Above"
case ">=":
return "EqAbove"
case "<":
return "Below"
case "<=":
return "EqBelow"
case _:
raise RuntimeError(f"Failed to match :{x}:mkThreshold")
def _rateTypeDs(x):
h = x[0]
if h in set(["资产池累积违约率", "cumPoolDefaultedRate", "债券系数", "bondFactor", "资产池系数", "poolFactor"]):
return True
return False
def mkTrigger(x):
match x:
case [">", _d]:
return mkTag(("AfterDate", _d))
case [">=", _d]:
return mkTag(("AfterOnDate", _d))
case ["到期日未兑付", _bn] | ["passMaturity", _bn]:
return mkTag(("PassMaturityDate", _bn))
case ["所有满足", *trgs] | ["all", *trgs]:
return mkTag(("AllTrigger", [mkTrigger(t) for t in trgs]))
case ["任一满足", *trgs] | ["any", *trgs]:
return mkTag(("AnyTrigger", [mkTrigger(t) for t in trgs]))
case ["一直", b] | ["always", b]:
return mkTag(("Always", b))
case [ds, cmp, v] if (isinstance(v, float) and _rateTypeDs(ds)):
return mkTag(("ThresholdRate", [mkThreshold(cmp), mkDs(ds), v]))
case [ds, cmp, ts] if _rateTypeDs(ds):
return mkTag(("ThresholdRateCurve", [mkThreshold(cmp), mkDs(ds), mkTs("ThresholdCurve", ts)]))
case [ds, cmp, v] if (isinstance(v, float) or isinstance(v, int)):
return mkTag(("ThresholdBal", [mkThreshold(cmp), mkDs(ds), v]))
case [ds, cmp, ts]:
return mkTag(("ThresholdBalCurve", [mkThreshold(cmp), mkDs(ds), mkTs("ThresholdCurve", ts)]))
case _:
raise RuntimeError(f"Failed to match :{x}:mkTrigger")
def mkTriggerEffect(x):
match x:
case ("新状态", s) | ("newStatus", s):
return mkTag(("DealStatusTo", mkStatus(s)))
case ["计提费用", *fn] | ["accrueFees", *fn]:
return mkTag(("DoAccrueFee", fn))
case ["新增事件", trg] | ["newTrigger", trg]:
return mkTag(("AddTrigger", mkTrigger(trg)))
case ["结果", *efs] | ["Effects", *efs]:
return mkTag(("TriggerEffects", [mkTriggerEffect(e) for e in efs]))
case _:
raise RuntimeError(f"Failed to match :{x}:mkTriggerEffect")
def mkWaterfall(r, x):
mapping = {
"未违约": "Amortizing",
"摊销": "Amortizing",
"循环": "Revolving",
"加速清偿": "DealAccelerated",
"违约": "DealDefaulted",
"未设立": "PreClosing",
}
if len(x) == 0:
return {k: list(v) for k, v in r.items()}
_k, _v = x.popitem()
_w_tag = None
match _k:
case ("兑付日", "加速清偿") | ("amortizing", "accelerated"):
_w_tag = f"DistributionDay (DealAccelerated Nothing)"
case ("兑付日", "违约") | ("amortizing", "defaulted"):
_w_tag = f"DistributionDay (DealDefaulted Nothing)"
case ("兑付日", _st) | ("amortizing", _st):
_w_tag = f"DistributionDay {mapping.get(_st,_st)}"
case "兑付日" | "未违约" | "amortizing":
_w_tag = f"DistributionDay Amortizing"
case "清仓回购" | "cleanUp":
_w_tag = "CleanUp"
case "回款日" | "回款后" | "endOfCollection":
_w_tag = f"EndOfPoolCollection"
case "设立日" | "closingDay":
_w_tag = f"OnClosingDay"
case _:
raise RuntimeError(f"Failed to match :{x}:mkWaterfall")
r[_w_tag] = itertools.chain.from_iterable([mkWaterfall2(_a) for _a in _v])
return mkWaterfall(r, x)
def mkAssetRate(x):
match x:
case ["固定", r] | ["fix", r]:
return mkTag(("Fix", r))
case ["浮动", r, {"基准": idx, "利差": spd, "重置频率": p}]:
return mkTag(("Floater", [idx, spd, r, freqMap[p], None]))
case ["floater", r, {"index": idx, "spread": spd, "reset": p}]:
return mkTag(("Floater", [idx, spd, r, freqMap[p], None]))
case ["Floater", r, {"index": idx, "spread": spd, "reset": p}]:
return mkTag(("Floater", [idx, spd, r, freqMap[p], None]))
case _:
raise RuntimeError(f"Failed to match {x}:mkAssetRate")
def mkAmortPlan(x) -> dict:
match x:
case "等额本息" | "Level" | "level":
return mkTag("Level")
case "等额本金" | "Even" | "even":
return mkTag("Even")
case "先息后本" | "I_P" | "i_p":
return mkTag("I_P")
case "等本等费" | "F_P" | "f_p":
return mkTag("F_P")
case _:
raise RuntimeError(f"Failed to match AmortPlan {x}:mkAmortPlan")
def mkAsset(x):
_statusMapping = {"正常": mkTag(("Current")), "违约": mkTag(("Defaulted", None)), "current": mkTag(("Current")), "defaulted": mkTag(("Defaulted", None)), "Current": mkTag(("Current")), "Defaulted": mkTag(("Defaulted", None))
}
match x:
case ["按揭贷款", {"放款金额": originBalance, "放款利率": originRate, "初始期限": originTerm, "频率": freq, "类型": _type, "放款日": startDate}, {"当前余额": currentBalance, "当前利率": currentRate, "剩余期限": remainTerms, "状态": status}] | \
["Mortgage", {"originBalance": originBalance, "originRate": originRate, "originTerm": originTerm, "freq": freq, "type": _type, "originDate": startDate}, {"currentBalance": currentBalance, "currentRate": currentRate, "remainTerm": remainTerms, "status": status}]:
borrowerNum1 = x[2].get("borrowerNum", None)
borrowerNum2 = x[2].get("借款数量", None)
return mkTag(("Mortgage", [
{"originBalance": originBalance,
"originRate": mkAssetRate(originRate),
"originTerm": originTerm,
"period": freqMap[freq],
"startDate": startDate,
"prinType": mkAmortPlan(_type)
} | mkTag("MortgageOriginalInfo"),
currentBalance,
currentRate,
remainTerms,
(borrowerNum1 or borrowerNum2),
_statusMapping[status]]))
case ["贷款", {"放款金额": originBalance, "放款利率": originRate, "初始期限": originTerm, "频率": freq, "类型": _type, "放款日": startDate}, {"当前余额": currentBalance, "当前利率": currentRate, "剩余期限": remainTerms, "状态": status}] \
| ["Loan", {"originBalance": originBalance, "originRate": originRate, "originTerm": originTerm, "freq": freq, "type": _type, "originDate": startDate}, {"currentBalance": currentBalance, "currentRate": currentRate, "remainTerm": remainTerms, "status": status}]:
return mkTag(("PersonalLoan", [
{"originBalance": originBalance,
"originRate": mkAssetRate(originRate),
"originTerm": originTerm,
"period": freqMap[freq],
"startDate": startDate,
"prinType": mkAmortPlan(_type)
} | mkTag("LoanOriginalInfo"),
currentBalance,
currentRate,
remainTerms,
_statusMapping[status]]))
case ["分期", {"放款金额": originBalance, "放款费率": originRate, "初始期限": originTerm, "频率": freq, "类型": _type, "放款日": startDate, "剩余期限": remainTerms}, {"当前余额": currentBalance, "状态": status}] \
| ["Installment", {"originBalance": originBalance, "feeRate": originRate, "originTerm": originTerm, "freq": freq, "type": _type, "originDate": startDate, "remainTerm": remainTerms}, {"currentBalance": currentBalance, "status": status}]:
return mkTag(("Installment", [
{"originBalance": originBalance,
"originRate": mkAssetRate(originRate),
"originTerm": originTerm,
"period": freqMap[freq],
"startDate": startDate,
"prinType": mkAmortPlan(_type)
} | mkTag("LoanOriginalInfo"),
currentBalance,
remainTerms,
_statusMapping[status]]))
case ["租赁", {"固定租金": dailyRate, "初始期限": originTerm, "频率": dp, "起始日": startDate, "状态": status, "剩余期限": remainTerms}] \
| ["Lease", {"fixRental": dailyRate, "originTerm": originTerm, "freq": dp, "originDate": startDate, "status": status, "remainTerm": remainTerms}]:
return mkTag(("RegularLease", [{"originTerm": originTerm, "startDate": startDate, "paymentDates": mkDatePattern(dp), "originRental": dailyRate} | mkTag("LeaseInfo"), 0, remainTerms, _statusMapping[status]]))
case ["租赁", {"初始租金": dailyRate, "初始期限": originTerm, "频率": dp, "起始日": startDate, "计提周期": accDp, "涨幅": rate, "状态": status, "剩余期限": remainTerms}] \
| ["Lease", {"initRental": dailyRate, "originTerm": originTerm, "freq": dp, "originDate": startDate, "accrue": accDp, "pct": rate, "status": status, "remainTerm": remainTerms}]:
dailyRatePlan = None
_stepUpType = "curve" if isinstance(rate, list) else "constant"
if _stepUpType == "constant":
dailyRatePlan = mkTag(
("FlatRate", [mkDatePattern(accDp), rate]))
else:
dailyRatePlan = mkTag(
("ByRateCurve", [mkDatePattern(accDp), rate]))
return mkTag(("StepUpLease", [{"originTerm": originTerm, "startDate": startDate, "paymentDates": mkDatePattern(dp), "originRental": dailyRate} | mkTag("LeaseInfo"), dailyRatePlan, 0, remainTerms, _statusMapping[status]]))
case _:
raise RuntimeError(f"Failed to match {x}:mkAsset")
def identify_deal_type(x):
match x:
case {"pool": {"assets": [{'tag': 'PersonalLoan'}, *rest]}}:
return "LDeal"
case {"pool": {"assets": [{'tag': 'Mortgage'}, *rest]}}:
return "MDeal"
case {"pool": {"assets": [], "futureCf": cfs}} if cfs[0]['tag'] == 'MortgageFlow':
return "MDeal"
case {"pool": {"assets": [{'tag': 'Installment'}, *rest]}}:
return "IDeal"
case {"pool": {"assets": [{'tag': 'Lease'}, *rest]}} | {"pool": {"assets": [{'tag': 'RegularLease'}, *rest]}}:
return "RDeal"
case {"pool": {"assets": [{'tag': 'StepUpLease'}, *rest]}}:
return "RDeal"
case _:
raise RuntimeError(f"Failed to identify deal type {x}")
def mkCallOptions(x):
match x:
case {"资产池余额": bal} | {"poolBalance": bal}:
return mkTag(("PoolBalance", bal))
case {"债券余额": bal} | {"bondBalance": bal}:
return mkTag(("BondBalance", bal))
case {"资产池余额剩余比率": factor} | {"poolFactor": factor}:
return mkTag(("PoolFactor", factor))
case {"债券余额剩余比率": factor} | {"bondFactor": factor}:
return mkTag(("BondFactor", factor))
case {"指定日之后": d} | {"afterDate": d}:
return mkTag(("AfterDate", d))
case {"任意满足": xs} | {"or": xs}:
return mkTag(("Or", [mkCallOptions(_x) for _x in xs]))
case {"全部满足": xs} | {"and": xs}:
return mkTag(("And", [mkCallOptions(_x) for _x in xs]))
case _:
raise RuntimeError(f"Failed to match {x}:mkCallOptions")
def mkAssumption(x) -> dict:
match x:
case {"CPR": cpr} if isinstance(cpr, list):
return mkTag(("PrepaymentVec", cpr))
case {"CDR": cdr} if isinstance(cdr, list):
return mkTag(("DefaultVec", cdr))
case {"CPR": cpr}:
return mkTag(("PrepaymentCPR", cpr))
case {"CPR调整": [*cprAdj, ed]} | {"CPRAdjust": [*cprAdj, ed]}:
return mkTag(("PrepaymentFactors", mkTs("FactorCurveClosed", [cprAdj, ed])))
case {"CDR": cdr}:
return mkTag(("DefaultCDR", cdr))
case {"CDR调整": [*cdrAdj, ed]} | {"CDRAdjust": [*cdrAdj, ed]}:
return mkTag(("DefaultFactors", mkTs("FactorCurveClosed", [cdrAdj, ed])))
case {"回收": (rr, rlag)} | {"Recovery": (rr, rlag)}:
return mkTag(("Recovery", (rr, rlag)))
case {"利率": [idx, rate]} if isinstance(rate, float):
return mkTag(("InterestRateConstant", [idx, rate]))
case {"Rate": [idx, rate]} if isinstance(rate, float):
return mkTag(("InterestRateConstant", [idx, rate]))
case {"利率": [idx, *rateCurve]} | {"Rate": [idx, *rateCurve]}:
return mkTag(("InterestRateCurve", [idx, *rateCurve]))
case {"清仓": opts} | {"CleanUp": opts}:
return mkTag(("CallWhen", [mkCallOptions(co) for co in opts]))
case {"停止": d} | {"StopAt": d}:
return mkTag(("StopRunBy", d))
case {"租赁截止日": d} | {"LeaseProjectEnd": d}:
return mkTag(("LeaseProjectionEnd", d))
case {"租赁年涨幅": r} | {"LeaseAnnualIncreaseRate": r} if not isinstance(r, list):
return mkTag(("LeaseBaseAnnualRate", r))
case {"租赁年涨幅": r} | {"LeaseAnnualIncreaseRate": r}:
return mkTag(("LeaseBaseCurve", mkTs("FloatCurve", r)))
case {"租赁间隔": n} | {"LeaseGapDays": n}:
return mkTag(("LeaseGapDays", n))
case {"租赁间隔表": (tbl, n)} | {"LeaseGapDaysByAmount": (tbl, n)}:
return mkTag(("LeaseGapDaysByAmount", [tbl, n]))
case {"查看":inspects} | {"Inspect":inspects}:
inspectVars = [ [mkDatePattern(dp),mkDs(ds)] for dp,ds in inspects ]
return mkTag(("InspectOn", inspectVars))
case _:
raise RuntimeError(f"Failed to match {x}:Assumption")
def mkAssumpList(xs):
return [mkAssumption(x) for x in xs]
def mkAssumption2(x) -> dict:
match x:
case (assetAssumpList, dealAssump) if isinstance(x, tuple):
return mkTag(("ByIndex", [[(ids, mkAssumpList(aps)) for ids, aps in assetAssumpList], mkAssumpList(dealAssump)]))
case xs if isinstance(xs, list):
return mkTag(("PoolLevel", mkAssumpList(xs)))
case None:
return None
case _:
raise RuntimeError(f"Failed to match {x}:mkAssumption2")
def mkPool(x):
mapping = {"LDeal": "LPool", "MDeal": "MPool",
"IDeal": "IPool", "RDeal": "RPool"}
match x:
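        # Illustrative input shape (values are examples only):
        #   mkPool({"assets": [["Mortgage", {<origination fields>}, {<current fields>}]],
        #           "cutoffDate": "2021-03-01"})
        #   -> each asset is wrapped via mkAsset and the pool is tagged by deal type (e.g. "MPool")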
case {"清单": assets, "封包日": d} | {"assets": assets, "cutoffDate": d}:
_pool = {"assets": [mkAsset(a) for a in assets], "asOfDate": d}
_pool_asset_type = identify_deal_type({"pool": _pool})
return mkTag((mapping[_pool_asset_type], _pool))
case _:
raise RuntimeError(f"Failed to match {x}:mkPool")
def mkCustom(x):
match x:
case {"常量": n} | {"Constant": n}:
return mkTag(("CustomConstant", n))
case {"余额曲线": ts} | {"BalanceCurve": ts}:
return mkTag(("CustomCurve", mkTs("BalanceCurve", ts)))
case {"公式": ds} | {"Formula": ds}:
return mkTag(("CustomDS", mkDs(ds)))
def mkLiqProviderType(x):
match x:
case {"总额度": amt} | {"Total": amt}:
return mkTag(("FixSupport"))
case {"日期": dp, "限额": amt} | {"Reset": dp, "Quota": amt}:
return mkTag(("ReplenishSupport", [mkDatePattern(dp), amt]))
case {"日期": dp, "公式": ds,"系数":pct} | {"Reset": dp, "Formula":ds, "Pct":pct}:
return mkTag(("ByPct", [mkDatePattern(dp),mkDs(ds),pct]))
case {}:
return mkTag(("UnLimit"))
case _:
raise RuntimeError(f"Failed to match LiqProvider Type:{x}")
def mkLiqProviderRate(x):
match x:
case {"fixRate":r ,"rateAccDates":rateAccDates,"lastAccDate":lastAccDate} | \
{"固定利率":r ,"结息日":rateAccDates,"上次结息日":lastAccDate} :
return mkTag(("FixRate",[mkDatePattern(rateAccDates),r,lastAccDate]))
case _:
return None
def mkLiqProvider(n, x):
match x:
case {"类型": "无限制", "起始日": _sd, **p} \
| {"type": "Unlimited", "start": _sd, **p}:
            return {"liqName": n, "liqType": mkLiqProviderType({})
                    , "liqBalance": None, "liqCredit": p.get("已提供", 0) or p.get("credit", 0), "liqStart": _sd
                    , "liqRate": mkLiqProviderRate(p)}
case {"类型": _sp, "额度": _ab, "起始日": _sd, **p} \
| {"type": _sp, "lineOfCredit": _ab, "start": _sd, **p}:
            return {"liqName": n, "liqType": mkLiqProviderType(_sp)
                    , "liqBalance": _ab, "liqCredit": p.get("已提供", 0) or p.get("credit", 0), "liqStart": _sd
                    , "liqRate": mkLiqProviderRate(p)}
case {"额度": _ab, "起始日": _sd, **p} \
| {"lineOfCredit": _ab, "start": _sd, **p}:
            return {"liqName": n, "liqType": mkTag(("FixSupport"))
                    , "liqBalance": _ab, "liqCredit": p.get("已提供", 0) or p.get("credit", 0), "liqStart": _sd
                    , "liqRate": mkLiqProviderRate(p)}
case _:
            raise RuntimeError(f"Failed to match LiqProvider:{x}")
def mkCf(x):
if len(x) == 0:
return None
else:
return [mkTag(("MortgageFlow", _x+[0.0]*5+[None])) for _x in x]
def mkCollection(xs):
sourceMapping = {"利息回款": "CollectedInterest", "本金回款": "CollectedPrincipal", "早偿回款": "CollectedPrepayment", "回收回款": "CollectedRecoveries", "租金回款": "CollectedRental", "CollectedInterest": "CollectedInterest", "CollectedPrincipal": "CollectedPrincipal", "CollectedPrepayment": "CollectedPrepayment", "CollectedRecoveries": "CollectedRecoveries", "CollectedRental": "CollectedRental"
}
return [[sourceMapping[x], acc] for (x, acc) in xs]
def mkAccTxn(xs):
"AccTxn T.Day Balance Amount Comment"
if xs is None:
return None
else:
return [mkTag(("AccTxn", x)) for x in xs]
def mk(x):
match x:
case ["资产", assets]:
return {"assets": [mkAsset(a) for a in assets]}
case ["账户", accName, attrs] | ["account", accName, attrs]:
return {accName: mkAcc(accName, attrs)}
case ["费用", feeName, {"类型": feeType, **fi}] \
| ["fee", feeName, {"type": feeType, **fi}]:
return {feeName: {"feeName": feeName, "feeType": mkFeeType(feeType), "feeStart": fi.get("起算日", None), "feeDueDate": fi.get("计算日", None), "feeDue": 0,
"feeArrears": 0, "feeLastPaidDay": None}}
case ["债券", bndName, bnd] | ["bond", bndName, bnd]:
return mkBnd(bndName, bnd)
case ["归集规则", collection]:
return mkCollection(collection)
def mkPricingAssump(x):
match x:
case {"贴现日": pricingDay, "贴现曲线": xs} | {"PVDate": pricingDay, "PVCurve": xs}:
return mkTag(("DiscountCurve", [pricingDay, mkTs("IRateCurve", xs)]))
case {"债券": bnd_with_price, "利率曲线": rdps} | {"bonds": bnd_with_price, "curve": rdps}:
return mkTag(("RunZSpread", [mkTs("IRateCurve", rdps), bnd_with_price]))
case _:
raise RuntimeError(f"Failed to match pricing assumption: {x}")
def readPricingResult(x, locale) -> dict:
if x is None:
return None
h = None
tag = query(x, [S.MVALS, S.ALL, "tag"])[0]
if tag == "PriceResult":
        h = {"cn": ["估值", "票面估值", "WAL", "久期", "凸性", "应计利息"],
             "en": ["pricing", "face", "WAL", "duration", "convexity", "accrued interest"]}
elif tag == "ZSpread":
h = {"cn": ["静态利差"], "en": ["Z-spread"]}
else:
        raise RuntimeError(
            f"Failed to read pricing result: {x} with tag={tag}")
return pd.DataFrame.from_dict({k: v['contents'] for k, v in x.items()}, orient='index', columns=h[locale]).sort_index()
def readRunSummary(x, locale) -> dict:
def filter_by_tags(xs, tags):
tags_set = set(tags)
return [ x for x in xs if x['tag'] in tags_set]
r = {}
if x is None:
return None
bndStatus = {'cn': ["本金违约", "利息违约", "起算余额"], 'en': [
"Balance Defaults", "Interest Defaults", "Original Balance"]}
bond_defaults = [(_['contents'][0], _['tag'], _['contents'][1], _['contents'][2])
for _ in x if _['tag'] in set(['BondOutstanding', 'BondOutstandingInt'])]
_fmap = {"cn": {'BondOutstanding': "本金违约", "BondOutstandingInt": "利息违约"}, "en": {
'BondOutstanding': "Balance Defaults", "BondOutstandingInt": "Interest Defaults"}}
bndNames = set([y[0] for y in bond_defaults])
bndSummary = pd.DataFrame(columns=bndStatus[locale], index=list(bndNames))
for bn, amt_type, amt, begBal in bond_defaults:
bndSummary.loc[bn][_fmap[locale][amt_type]] = amt
bndSummary.loc[bn][bndStatus[locale][2]] = begBal
bndSummary.fillna(0, inplace=True)
bndSummary["Total"] = bndSummary[bndStatus[locale][0]] + \
bndSummary[bndStatus[locale][1]]
r['bonds'] = bndSummary
dealStatusLog = {'cn': ["日期", "旧状态", "新状态"], 'en': ["Date", "From", "To"]}
status_change_logs = [(_['contents'][0], readStatus(_['contents'][1], locale), readStatus(_['contents'][2], locale))
for _ in x if _['tag'] in set(['DealStatusChangeTo'])]
r['status'] = pd.DataFrame(data=status_change_logs, columns=dealStatusLog[locale])
# inspection variables
def uplift_ds(df):
ds_name = readTagStr(df['DealStats'].iloc[0])
df.drop(columns=["DealStats"],inplace=True)
df.rename(columns={"Value":ds_name},inplace=True)
df.set_index("Date",inplace=True)
return df
inspect_vars = filter_by_tags(x, ["InspectBal"])
inspect_df = pd.DataFrame(data = [ (c['contents'][0],str(c['contents'][1]),c['contents'][2]) for c in inspect_vars ]
,columns=["Date","DealStats","Value"])
grped_inspect_df = inspect_df.groupby("DealStats")
r['inspect'] = {readTagStr(k):uplift_ds(v) for k,v in grped_inspect_df}
return r
def aggAccs(x, locale):
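# aggregate each account's transaction log by date into begin balance / change / end balance columns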
_header = {
"cn": {"idx": "日期", "change": "变动额", "bal": ("期初余额", '余额', "期末余额")}, "en": {"idx": "date", "change": "change", "bal": ("begin balance", 'balance', "end balance")}
}
header = _header[locale]
agg_acc = {}
for k, v in x.items():
acc_by_date = v.groupby(header["idx"])
acc_txn_amt = acc_by_date.agg(change=(header["change"], sum))
ending_bal_column = acc_by_date.last(
)[header["bal"][1]].rename(header["bal"][2])
begin_bal_column = ending_bal_column.shift(1).rename(header["bal"][0])
agg_acc[k] = acc_txn_amt.join([begin_bal_column, ending_bal_column])
if agg_acc[k].empty:
agg_acc[k].columns = header["bal"][0], header['change'], header["bal"][2]
continue
fst_idx = agg_acc[k].index[0]
agg_acc[k].at[fst_idx, header["bal"][0]] = round(
agg_acc[k].at[fst_idx, header["bal"][2]] - agg_acc[k].at[fst_idx, header['change']], 2)
agg_acc[k] = agg_acc[k][[header["bal"][0],
header['change'], header["bal"][2]]]
return agg_acc
def readIssuance(pool):
_map = {'cn': "发行", 'en': "Issuance"}
lang_flag = None
if '发行' in pool.keys():
lang_flag = 'cn'
elif 'Issuance' in pool.keys():
lang_flag = 'en'
else:
return None
validIssuanceFields = {
"资产池规模": "IssuanceBalance",
"IssuanceBalance": "IssuanceBalance"
}
r = {}
for k, v in pool[_map[lang_flag]].items():
if k in validIssuanceFields:
r[validIssuanceFields[k]] = v
else:
logging.warning(
f"Key {k} is not in pool fields {validIssuanceFields.keys()}")
return r
def show(r, x="full"):
''' show cashflow of SPV during the projection '''
def _map(y):
if y == 'cn':
return {"agg_accounts": "账户", "fees": "费用", "bonds": "债券", "pool": "资产池", "idx": "日期"}
else:
return {"agg_accounts": "Accounts", "fees": "Fees", "bonds": "Bonds", "pool": "Pool", "idx": "date"}
_comps = ['agg_accounts', 'fees', 'bonds']
dfs = {c: pd.concat(r[c].values(), axis=1, keys=r[c].keys())
for c in _comps if r[c]}
locale = guess_locale(r)
_m = _map(locale)
dfs2 = {}
for k, v in dfs.items():
dfs2[_m[k]] = pd.concat([v], keys=[_m[k]], axis=1)
agg_pool = pd.concat([r['pool']['flow']], axis=1, keys=[_m["pool"]])
agg_pool = pd.concat([agg_pool], axis=1, keys=[_m["pool"]])
_full = functools.reduce(lambda acc, x: acc.merge(
x, how='outer', on=[_m["idx"]]), [agg_pool]+list(dfs2.values()))
match x:
case "full":
return _full.loc[:, [_m["pool"]]+list(dfs2.keys())].sort_index()
case "cash":
return None # ""
def flow_by_scenario(rs, flowpath, annotation=True, aggFunc=None, rnd=2):
"pull flows from multiple scenario"
scenario_names = rs.keys()
locale = guess_locale(list(rs.values())[0])
def _map(y):
if y == 'cn':
return {"idx": "日期"}
else:
return {"idx": "date"}
m = _map(locale)
dflow = None
aggFM = {"max": pd.Series.max, "sum": pd.Series.sum, "min": pd.Series.min}
if aggFunc is None:
dflows = [query(rs, [s]+flowpath) for s in scenario_names]
else:
dflows = [query(rs, [s]+flowpath).groupby(m['idx']).aggregate(
aggFM.get(aggFunc, aggFunc)) for s in scenario_names]
if annotation:
dflows = [f.rename(f"{s}({flowpath[-1]})")
for (s, f) in zip(scenario_names, dflows)]
try:
return pd.concat(dflows, axis=1).round(rnd)
except ValueError as e:
return f"need to pass function to `aggFunc` to aggregate duplication rows, options: Min/Max/Sum " | AbsBox | /AbsBox-0.12.5.tar.gz/AbsBox-0.12.5/absbox/local/component.py | component.py |
import logging, os, re, itertools
import requests, shutil
from dataclasses import dataclass,field
import functools, pickle, collections
import pandas as pd
import numpy as np
from urllib.request import unquote
from functools import reduce
from pyspecter import query
from absbox import *
from absbox.local.util import mkTag,DC,mkTs,consolStmtByDate,aggStmtByDate
from absbox.local.component import *
@dataclass
class SPV:
名称: str
日期: dict
资产池: dict
账户: tuple
债券: tuple
费用: tuple
分配规则: dict
归集规则: tuple
清仓回购: tuple = None
流动性支持:dict = None
自定义: dict = None
触发事件: dict = None
状态:str = "摊销"
@classmethod
def load(cls,p):
with open(p,'rb') as _f:
c = _f.read()
return pickle.loads(c)
@classmethod
def pull(cls,_id,p,url=None,pw=None):
def get_filename_from_cd(cd):
if not cd:
return None
fname = re.findall("filename\*=utf-8''(.+)", cd)
if len(fname) == 0:
fname1 = re.findall("filename=\"(.+)\"", cd)
return fname1[0]
return unquote(fname[0])
with requests.get(f"{url}/china/deal/{_id}",stream=True,verify=False) as r:
filename = get_filename_from_cd(r.headers.get('content-disposition'))
if filename is None:
logging.error("Can't not find the Deal Name")
return None
with open(os.path.join(p,filename),'wb') as f:
shutil.copyfileobj(r.raw, f)
logging.info(f"Download {p} {filename} done ")
@property
def json(self):
stated = False
dists,collects,cleans = [ self.分配规则.get(wn,[]) for wn in ['未违约','回款后','清仓回购'] ]
distsAs,collectsAs,cleansAs = [ [ mkWaterfall2(_action) for _action in _actions] for _actions in [dists,collects,cleans] ]
distsflt,collectsflt,cleanflt = [ itertools.chain.from_iterable(x) for x in [distsAs,collectsAs,cleansAs] ]
parsedDates = mkDate(self.日期)
status = mkStatus(self.状态)
defaultStartDate = self.日期.get("起息日",None) or self.日期['归集日'][0]
"""
get the json formatted string
"""
_r = {
"dates": parsedDates,
"name": self.名称,
"status": status,
"pool":{"assets": [mkAsset(x) for x in self.资产池.get('清单',[])]
, "asOfDate": self.日期.get('封包日',None) or self.日期['归集日'][0]
, "issuanceStat": readIssuance(self.资产池)
, "futureCf":mkCf(self.资产池.get('归集表', []))},
"bonds": functools.reduce(lambda result, current: result | current
, [mk(['债券', bn, bo]) for (bn, bo) in self.债券]),
"waterfall": mkWaterfall({},self.分配规则.copy()),
"fees": functools.reduce(lambda result, current: result | current
, [mk(["费用", feeName, feeO]) for (feeName, feeO) in self.费用]) if self.费用 else {},
"accounts": functools.reduce(lambda result, current: result | current
, [mk(["账户", accName, accO]) for (accName, accO) in self.账户]),
"collects": mkCollection(self.归集规则)
}
for fn, fo in _r['fees'].items():
if fo['feeStart'] is None :
fo['feeStart'] = defaultStartDate
if hasattr(self, "自定义") and self.自定义 is not None:
_r["custom"] = {}
for n,ci in self.自定义.items():
_r["custom"][n] = mkCustom(ci)
if hasattr(self, "触发事件") and self.触发事件 is not None:
_trigger = self.触发事件
_trr = {mkWhenTrigger(_loc):
[[mkTrigger(_trig),mkTriggerEffect(_effect)] for (_trig,_effect) in _vs ]
for _loc,_vs in _trigger.items()}
_r["triggers"] = _trr
if hasattr(self, "流动性支持") and self.流动性支持 is not None:
_providers = {}
for (_k, _p) in self.流动性支持.items():
_providers[_k] = mkLiqProvider(_k, ( _p | {"起始日": defaultStartDate}))
_r["liqProvider"] = _providers
_dealType = identify_deal_type(_r)
return mkTag((_dealType,_r))
def _get_bond(self, bn):
for _bn,_bo in self.债券:
if _bn == bn:
return _bo
return None
def read_assump(self, assump):
if assump:
return [mkAssumption(a) for a in assump]
return None
def read_pricing(self, pricing):
if pricing:
return mkPricingAssump(pricing)
return None
def read(self, resp, position=None):
read_paths = {'bonds': ('bndStmt', china_bondflow_fields, "债券")
, 'fees': ('feeStmt', china_fee_flow_fields_d, "费用")
, 'accounts': ('accStmt', china_acc_flow_fields_d , "账户")
, 'liqProvider': ('liqStmt', china_liq_flow_fields_d, "流动性支持")
}
output = {}
for comp_name, comp_v in read_paths.items():
if (not comp_name in resp[0]) or (resp[0][comp_name] is None):
continue
output[comp_name] = {}
for k, x in resp[0][comp_name].items():
ir = None
if x[comp_v[0]]:
ir = [_['contents'] for _ in x[comp_v[0]]]
output[comp_name][k] = pd.DataFrame(ir, columns=comp_v[1]).set_index("日期")
output[comp_name] = collections.OrderedDict(sorted(output[comp_name].items()))
# aggregate fees
output['fees'] = {f: v.groupby('日期').agg({"余额": "min", "支付": "sum", "剩余支付": "min"})
for f, v in output['fees'].items()}
# aggregate accounts
output['agg_accounts'] = aggAccs(output['accounts'],'cn')
output['pool'] = {}
_pool_cf_header,_ = guess_pool_flow_header(resp[0]['pool']['futureCf'][0],"chinese")
output['pool']['flow'] = pd.DataFrame([_['contents'] for _ in resp[0]['pool']['futureCf']]
, columns=_pool_cf_header)
pool_idx = "日期"
output['pool']['flow'] = output['pool']['flow'].set_index(pool_idx)
output['pool']['flow'].index.rename(pool_idx, inplace=True)
output['pricing'] = readPricingResult(resp[3], 'cn')
if position:
output['position'] = {}
for k,v in position.items():
if k in output['bonds']:
b = self._get_bond(k)
factor = v / b["初始余额"] / 100
if factor > 1.0:
raise RuntimeError("持仓系数大于1.0")
output['position'][k] = output['bonds'][k][['本金','利息','本息合计']].apply(lambda x:x*factor).round(4)
output['result'] = readRunSummary(resp[2], 'cn')
return output
信贷ABS = SPV # Legacy ,to be deleted | AbsBox | /AbsBox-0.12.5.tar.gz/AbsBox-0.12.5/absbox/local/china.py | china.py |
from dataclasses import dataclass
import functools
import itertools
from absbox import *
from absbox.local.util import mkTag
from absbox.local.component import *
from absbox.local.base import *
import pandas as pd
import collections
@dataclass
class Generic:
name: str
dates: dict
pool: dict
accounts: tuple
bonds: tuple
fees: tuple
waterfall: dict
collection: list
call: tuple = None
liqFacility :dict = None
custom: dict = None
trigger:dict = None
status:str = "Amortizing"
@property
def json(self):
dists,collects,cleans = [ self.waterfall.get(wn,[]) for wn in ['Normal','PoolCollection','CleanUp']]
distsAs,collectsAs,cleansAs = [ [ mkWaterfall2(_action) for _action in _actions] for _actions in [dists,collects,cleans] ]
distsflt,collectsflt,cleanflt = [ itertools.chain.from_iterable(x) for x in [distsAs,collectsAs,cleansAs] ]
parsedDates = mkDate(self.dates)
"""
get the json formatted string
"""
_r = {
"dates": parsedDates,
"name": self.name,
"status":mkTag((self.status)),
"pool": {"assets": [mkAsset(x) for x in self.pool.get('assets', [])]
, "asOfDate": self.dates['cutoff']
, "issuanceStat": self.pool.get("issuanceStat")
, "futureCf":mkCf(self.pool.get('cashflow', [])) },
"bonds": functools.reduce(lambda result, current: result | current
, [mk(['bond', bn, bo]) for (bn, bo) in self.bonds]),
"waterfall": mkWaterfall({},self.waterfall.copy()),
"fees": functools.reduce(lambda result, current: result | current
, [mk(["fee", feeName, feeO]) for (feeName, feeO) in self.fees]) if self.fees else {},
"accounts": functools.reduce(lambda result, current: result | current
, [mk(["account", accName, accO]) for (accName, accO) in self.accounts]),
"collects": self.collection
}
for fn, fo in _r['fees'].items():
if fo['feeStart'] is None:
fo['feeStart'] = self.dates['closing']
if hasattr(self, "custom") and self.custom is not None:
_r["custom"] = {}
for n,ci in self.custom.items():
_r["custom"][n] = mkCustom(ci)
if hasattr(self, "trigger") and self.trigger is not None:
_trigger = self.trigger
_trr = {mkWhenTrigger(_loc):
[[mkTrigger(_trig),mkTriggerEffect(_effect)] for (_trig,_effect) in _vs ]
for _loc,_vs in _trigger.items()}
_r["triggers"] = _trr
if hasattr(self, "liqFacility") and self.liqFacility is not None:
_providers = {}
for (_k, _p) in self.liqFacility.items():
_providers[_k] = mkLiqProvider(_k, ( _p | {"start": self.dates['closing']}))
_r["liqProvider"] = _providers
_dealType = identify_deal_type(_r)
return mkTag((_dealType,_r))
def read_assump(self, assump):
if assump:
return [mkAssumption(a) for a in assump]
return None
def read_pricing(self, pricing):
if pricing:
return mkPricingAssump(pricing)
return None
def read(self, resp, position=None):
read_paths = {'bonds': ('bndStmt' , english_bondflow_fields , "bond")
, 'fees': ('feeStmt' , english_fee_flow_fields_d , "fee")
, 'accounts': ('accStmt' , english_acc_flow_fields_d , "account")
, 'liqProvider': ('liqStmt', english_liq_flow_fields_d, "")
}
output = {}
for comp_name, comp_v in read_paths.items():
if resp[0][comp_name] is None:
continue
output[comp_name] = {}
for k, x in resp[0][comp_name].items():
ir = None
if x[comp_v[0]]:
ir = [_['contents'] for _ in x[comp_v[0]]]
output[comp_name][k] = pd.DataFrame(ir, columns=comp_v[1]).set_index("date")
output[comp_name] = collections.OrderedDict(sorted(output[comp_name].items()))
# aggregate fees
output['fees'] = {f: v.groupby('date').agg({"balance": "min", "payment": "sum", "due": "min"})
for f, v in output['fees'].items()}
# aggregate accounts
output['agg_accounts'] = aggAccs(output['accounts'], 'en')
output['pool'] = {}
_pool_cf_header,_ = guess_pool_flow_header(resp[0]['pool']['futureCf'][0],"english")
output['pool']['flow'] = pd.DataFrame([_['contents'] for _ in resp[0]['pool']['futureCf']]
, columns=_pool_cf_header)
pool_idx = 'Date'
output['pool']['flow'] = output['pool']['flow'].set_index(pool_idx)
output['pool']['flow'].index.rename(pool_idx, inplace=True)
output['pricing'] = readPricingResult(resp[3], 'en')
output['result'] = readRunSummary(resp[2], 'en')
return output | AbsBox | /AbsBox-0.12.5.tar.gz/AbsBox-0.12.5/absbox/local/generic.py | generic.py |
import threading
import pyjsonrpc
import json
import os
import traceback
from geventwebsocket import WebSocketServer, WebSocketApplication, Resource
from absinthe.tools.commands import CommandRequestHandler, external_jsonrpc_command
from absinthe.message import Message
from absinthe.tools.utils import SimpleResponse
from absinthe.tools.remote_process_base import RemoteProcessBase
# gevent socket in thread, need to patch...
from gevent import monkey
monkey.patch_all()
class Client(WebSocketApplication):
def __init__(self, ws, manager):
WebSocketApplication.__init__(self, ws)
self.manager = manager
self.address = '%(REMOTE_ADDR)s:%(REMOTE_PORT)s' % ws.environ
def send(self, message):
self.ws.send(str(message))
def on_open(self):
pass
def on_message(self, message):
if message is None:
return
self.manager.on_message(message, self)
def on_close(self, reason):
self.manager.on_close(self)
class Session(object):
def __init__(self, name, client, path):
self.name = name
self.client = client
self.path = path
class PathHandler(object):
def __init__(self, name, path):
self.name = name
self.path = path
def parse(self, filename):
if filename.find(self.path) != 0:
raise Exception('Path mismatch')
fn = filename[len(self.path):]
return fn.split(os.sep)
class ClientManager(object):
def __init__(self, session_manager):
self.session_manager = session_manager
def __call__(self, ws):
return Client(ws, self.session_manager)
class SessionManager(object):
def __init__(self, logger):
self.logger = logger
self.sessions = {}
self.paths = {}
def register_path(self, path):
self.paths[path.name] = path
def find_sessions(self, path_name):
sessions = []
for name in self.sessions:
for session in self.sessions[name]:
if session.path.name == path_name:
sessions.append(session)
return sessions
@external_jsonrpc_command
def set_focus(self, path_name):
sessions = self.find_sessions(path_name)
for session in sessions:
session.client.send(Message(session.name, 'set_focus'))
return SimpleResponse(True)
@external_jsonrpc_command
def open_file(self, path_name, filename, line):
self.logger.debug('Open file %s in %s' % (path_name, filename))
msgs = []
try:
sessions = self.find_sessions(path_name)
if len(sessions) == 0:
msg = 'There is no client for this path %s' % path_name
self.logger.warning(msg)
return SimpleResponse(False, [msg])
msgs.append('Session found: %s' % path_name)
for session in sessions:
file_parts = session.path.parse(filename)
session.client.send(Message(session.name, 'open_file', dict(filename = file_parts, line = line)))
msgs.append('File open request sent to %s' % session.client.address)
for msg in msgs:
self.logger.debug(msg)
except Exception as e:
self.logger.exception(e)
msgs.append(str(e))
return SimpleResponse(True, msgs)
def on_message(self, message, client):
try:
msg = Message.from_str(message)
except Exception as e:
self.logger.error('Malformed message received via websocket: %s, %s' % (e, message))
return
if hasattr(self, msg.command):
func = getattr(self, msg.command)
func(msg.name, client, **msg.arguments)
else:
self.logger.warning('Undefined command received: %s' % msg.command)
def on_close(self, client):
for name in self.sessions:
for session in self.sessions[name]:
if session.client == client:
self.sessions[name].remove(session)
self.logger.info('Session close: %s from %s' % (name, client.address))
def session_start(self, name, client, remote_path):
self.logger.info('Session start: %s from %s' % (name, client.address))
session = Session(name, client, self.paths[remote_path])
if name not in self.sessions:
self.sessions[name] = []
self.sessions[name].append(session)
class AbsintheServer(RemoteProcessBase):
def __init__(self, config, logger):
self.logger = logger
self.config = config
self.session_manager = SessionManager(self.logger)
self.client_manager = ClientManager(self.session_manager)
@external_jsonrpc_command
def init(self):
for name in self.config.data['paths']:
self.session_manager.register_path(PathHandler(name, self.config.data['paths'][name].value))
server_address = self.config.data['agent_server']
self.server = WebSocketServer((server_address['host'].value, server_address['port'].value), Resource({'/': self.client_manager}))
th = threading.Thread(target=self.server.serve_forever)
th.setDaemon(True)
th.start()
self.logger.debug('init')
return SimpleResponse(True)
# Initialize the command server to receive IPC commands.
def start_command_server(self):
try:
command_server_address = self.config.data['command_server']
self.command_server = pyjsonrpc.ThreadingHttpServer(
server_address = (command_server_address['host'].value, command_server_address['port'].value),
RequestHandlerClass = CommandRequestHandler
)
except Exception as e:
self.logger.error('Exception occurred during the command server initialization: ' + str(e) + traceback.format_exc())
return
CommandRequestHandler.logger = self.logger
CommandRequestHandler.externals.extend([self, self.session_manager])
self.logger.debug('command server starting...')
self.command_server.serve_forever() | Absinthe | /Absinthe-1.1.0.tar.gz/Absinthe-1.1.0/absinthe/absinthe_server.py | absinthe_server.py |
import os
import sys
import signal
import time
import websocket
from threading import Thread
from logger import get_logger
from config import Config
from utils import FileReader
from message import Message
from tools.timer import Timer
import editors
class AbsintheClient():
def __init__(self, config, logger):
self.config = config.raw()
self.logger = logger
def start(self):
self.create_hosts(self.config['hosts'])
self.create_editors(self.config['editors'])
self.create_sessions(self.config['sessions'])
for name in self.hosts:
host = self.hosts[name]
if len(host.on_connect_callbacks):
host.connect()
# a very simple blocking mecha...
while True:
try:
sys.stdin.read(1)
except KeyboardInterrupt as k:
self.stop()
break
def create_sessions(self, sessions):
self.logger.debug('Create sessions')
self.sessions = {}
for session in sessions:
if session['enabled'] != True:
continue
name = session['name']
if session['host'] not in self.hosts:
self.logger.error('Undefined host %s in session %s. Ignoring session.' % (session['host'], name))
continue
if session['editor'] not in self.editors:
self.logger.error('Undefined editor %s in session %s. Ignoring session.' % (session['editor'], name))
continue
self.sessions[name] = Session(name, self.hosts[session['host']], self.editors[session['editor']], session['path'], session['remote-path'], self.logger)
def create_editors(self, editors_defs):
self.logger.debug('Create editors')
self.editors = {}
for name in editors_defs:
if not hasattr(editors, name):
self.logger.error('Undefined editor: %s. Skipping.' % name)
continue
TypedEditor = getattr(editors, name)
self.editors[name] = TypedEditor(name, editors_defs[name], self.logger)
def create_hosts(self, hosts):
self.logger.debug('Create hosts')
self.hosts = {}
for name in hosts:
addr = hosts[name]
self.hosts[name] = Host(name, addr['host'], addr['port'], self.logger)
def stop(self):
pass
class Host():
def __init__(self, name, host, port, logger):
self.msg_queue = []
self.logger = logger
self.name = name
self.host = "ws://%s:%d" % (host, port)
self.sessions = {}
self.connected = False
self.on_connect_callbacks = []
self.reconnect_timer = Timer(10, self.connect)
def connect(self):
self.logger.debug('Try to connect to %s - %s' % (self.name, self.host))
self.socket = websocket.WebSocketApp(self.host, on_message = self.on_message, on_error = self.on_error, on_close = self.on_close)
self.socket.on_open = self.on_open
th = Thread(target=self.socket.run_forever)
th.setDaemon(True)
th.start()
def on_connect(self, callback):
self.on_connect_callbacks.append(callback)
def send(self, message):
if not self.is_connected():
self.logger.error('Try to send message, but no open socket. Name: %s, Message: %s' % (self.name, message))
self.msg_queue.append(message)
return
self.logger.debug('Send message to %s: %s' % (self.host, message))
self.socket.send(str(message))
def is_connected(self):
return self.connected
def subscribe(self, name, session):
if name not in self.sessions:
self.sessions[name] = []
self.sessions[name].append(session)
def on_open(self, ws):
self.connected = True
self.reconnect_timer.stop()
self.logger.info('Connected to %s - %s' % (self.name, self.host))
for callback in self.on_connect_callbacks:
callback()
for msg in self.msg_queue:
self.send(msg)
self.msg_queue = []
def on_message(self, ws, message):
try:
msg = Message.from_str(message)
except Exception as e:
self.logger.error('Malformed message received via websocket: %s, %s' % (e, message))
return
self.logger.debug('Message received %s' % msg)
if msg.name not in self.sessions:
self.logger.error('Unknown session %s' % msg.name)
return
for session in self.sessions[msg.name]:
if hasattr(session, msg.command):
func = getattr(session, msg.command)
func(**msg.arguments)
else:
self.logger.error('Unknown command %s for session %s' % (msg.command, session.name))
def on_error(self, ws, error):
self.logger.error('Error occurred on websocket: %s' % error)
def on_close(self, reason):
if self.is_connected():
self.logger.error('%s socket closed, try to reconnect...' % self.name)
self.connected = False
self.reconnect_timer.start()
class Session():
def __init__(self, name, host, editor, path, remote_path, logger):
self.name = name
self.host = host
self.editor = editor
self.path = path
self.logger = logger
self.remote_path = remote_path
host.subscribe(name, self)
host.on_connect(self.on_host_connected)
def on_host_connected(self):
msg = Message(self.name, 'session_start', dict(remote_path = self.remote_path))
self.host.send(msg)
def set_focus(self):
self.editor.set_focus()
def open_content(self, content):
pass
def open_file(self, filename, line = 1):
full_path = os.path.join(self.path, *filename)
self.logger.debug('Open file: %s' % full_path)
self.editor.open_file(full_path, line) | Absinthe | /Absinthe-1.1.0.tar.gz/Absinthe-1.1.0/absinthe/absinthe_client.py | absinthe_client.py |
import pyjsonrpc
from utils import SimpleResponse, Storage
class commandline(object):
def __init__(self, help, arguments = dict()):
self.help = help
self.arguments = arguments
def __call__(self, func):
func.commandline = Storage(
help = self.help,
arguments = self.arguments
)
return func
class CommandRequestHandler(pyjsonrpc.HttpRequestHandler):
logger = None
externals = []
def log_message(self, format, *args):
# called from parent class...
self.logger.debug("HTTP request - %s - %s" % (self.client_address[0], format % args))
@commandline('Ping')
@pyjsonrpc.rpcmethod
def ping(self):
return SimpleResponse(True, 'pong')
"""
Example of json rpc method with parameters when should be callable from command line:
@commandline('paramed test method', dict(teve=dict(help='teveee', type=int), alma=dict(help = 'set the alma', type = int)))
@pyjsonrpc.rpcmethod
def paramed(self, alma, teve):
self.logger.info('paramed ' + str(alma) + str(teve))
return SimpleResponse(True, alma*teve)
"""
"""
This decorator function make an external class's methods to able to called from json rpc client
For examples see Plutonium class (the order of the decorators are important!)
"""
def external_jsonrpc_command(orig_func):
def runner(*args, **kwargs):
ext_obj = None
for external in CommandRequestHandler.externals:
if hasattr(external, orig_func.func_name):
ext_obj = external
break
if ext_obj is None:
return
# get the original function in the original external class
func = getattr(ext_obj, orig_func.func_name)
# need to remove the CommandRequestHandler instance from the beginning of args tuple
args_list = list(args)
args_list.pop(0)
args_tuple = tuple(args_list)
return func(*args_tuple, **kwargs)
# add the runner function to CommandRequestHandler by the decorated function name
setattr(CommandRequestHandler, orig_func.func_name, runner)
# call the orig decorator
pyjsonrpc.rpcmethod(runner)
# need to copy commandline params if exists
if hasattr(orig_func, 'commandline'):
runner.commandline = orig_func.commandline
return orig_func | Absinthe | /Absinthe-1.1.0.tar.gz/Absinthe-1.1.0/absinthe/tools/commands.py | commands.py |
import json
from abc import abstractmethod
from .utils import Storage
class ConfigBase(object):
def __init__(self):
self._subscribers = []
def on_change(self, handler):
self._subscribers.append(handler)
return self
def notify(self, new, old):
for handler in self._subscribers:
handler(new, old)
return self
class ConfigValue(ConfigBase):
def __init__(self, value):
ConfigBase.__init__(self)
self.value = value
def __repr__(self):
return str(self.value)
class ConfigNode(dict, ConfigBase):
def __init__(self, *args, **kwargs):
ConfigBase.__init__(self)
dict.__init__(self, *args, **kwargs)
def sub_values(self):
subs = dict()
for name in self:
val = self[name]
if type(val) is ConfigNode:
subs[name] = val.sub_values()
else:
subs[name] = val.value
return subs
class Config(object):
def __init__(self, logger):
self.data = Storage()
self.logger = logger
self.raw_data = Storage()
def raw(self):
return self.raw_data
def load(self, data):
self.raw_data = data
return self.parse()
def reload(self):
self.old_data = self.data
if not self.parse():
return False
self.logger.info('Reloading the config')
def check_eq(new_node, old_node):
new_node._subscribers = old_node._subscribers
res = False
if type(new_node) is ConfigNode and type(old_node) is ConfigNode:
# common nodes in new_node and old_node
common = []
# get new nodes in new_node dict
for name in new_node:
if name in old_node:
common.append(name)
else:
new_node.notify(name, None) #...
for name in old_node:
if name not in new_node:
old_node.notify(None, name)
for name in common:
res = res or check_eq(new_node[name], old_node[name])
if res:
new_node.notify(new_node, old_node)
elif type(new_node.value) is list and type(old_node.value) is list:
if len(new_node.value) != len(old_node.value):
new_node.notify(new_node, old_node)  # list length changed, notify subscribers
res = True
else:
if new_node.value != old_node.value:
new_node.notify(new_node, old_node)  # value changed, notify subscribers
res = True
return res
check_eq(self.data, self.old_data)
def parse(self):
data = ConfigNode(self.raw_data)
def iter(node):
for name in node:
val = node[name]
if type(val) is dict:
node[name] = ConfigNode(node[name])
iter(node[name])
else:
node[name] = ConfigValue(node[name])
iter(data)
self.data = data
return True | Absinthe | /Absinthe-1.1.0.tar.gz/Absinthe-1.1.0/absinthe/tools/config.py | config.py |
import sys, os, time, atexit
from signal import SIGTERM
class Daemon(object):
"""
A generic daemon class.
Usage: subclass the Daemon class and override the run() method
"""
def __init__(self, pidfile, logger):
self.logger = logger
self.pidfile = pidfile
def init(self):
"""
Called before the daemonize, so useful some init before the fork, eg print the user config parse errors, etc...
return False to stop the daemonize!
"""
return (True, 'Ok')
def daemonize(self):
"""
do the UNIX double-fork magic, see Stevens' "Advanced
Programming in the UNIX Environment" for details (ISBN 0201563177)
http://www.erlenstar.demon.co.uk/unix/faq_2.html#SEC16
"""
try:
pid = os.fork()
if pid > 0:
# exit first parent
sys.exit(0)
except OSError, e:
self.logger.error("Fork #1 failed: %d (%s)\n" % (e.errno, e.strerror))
sys.exit(1)
# decouple from parent environment
os.chdir("/")
os.setsid()
os.umask(0)
# do second fork
try:
pid = os.fork()
if pid > 0:
# exit from second parent
sys.exit(0)
except OSError, e:
self.logger.error("Fork #2 failed: %d (%s)\n" % (e.errno, e.strerror))
sys.exit(1)
# write pidfile
atexit.register(self.delpid)
pid = str(os.getpid())
file(self.pidfile,'w+').write("%s\n" % pid)
def delpid(self):
os.remove(self.pidfile)
def get_pid(self):
pid = None
try:
with open(self.pidfile, 'r') as f:
try:
pid = int(f.read().strip())
except TypeError as e:
pid = None
except IOError:
pid = None
return pid
def is_running(self):
pid = self.get_pid()
if pid is None:
return False
try:
os.kill(pid, 0)
except OSError:
return False
else:
return True
def status(self):
return 'Daemon is ' + ('running' if self.is_running() else 'not running')
def start(self):
"""
Start the daemon
"""
if self.is_running():
msg = "Daemon already running (pidfile:%s)" % self.pidfile
self.logger.error(msg)
return msg
initres = self.init()
if not initres[0]:
return initres[1]
# Start the daemon
self.daemonize()
return self.run()
def stop(self):
"""
Stop the daemon
"""
# Get the pid from the pidfile
pid = self.get_pid()
if not pid:
message = "Pidfile %s does not exist. Daemon not running?" % self.pidfile
self.logger.error(message)
return message # not an error in a restart
# Try killing the daemon process
try:
while 1:
os.kill(pid, SIGTERM)
time.sleep(0.1)
except OSError, err:
err = str(err)
if err.find("No such process") > 0:
if os.path.exists(self.pidfile):
os.remove(self.pidfile)
else:
self.logger.error(err)
sys.exit(1)
return 'Daemon is stopped'
def restart(self):
"""
Restart the daemon
"""
self.stop()
self.start()
def run(self):
"""
You should override this method when you subclass Daemon. It will be called after the process has been
daemonized by start() or restart().
""" | Absinthe | /Absinthe-1.1.0.tar.gz/Absinthe-1.1.0/absinthe/tools/daemon.py | daemon.py |
import importlib
import os
import urllib2, urllib
import re
import sys
import json
def ucfirst(str):
return str[0].upper() + str[1:]
def split_uppercase(str):
return re.findall('[A-Z][^A-Z]*', str)
def xml_element_to_storage(element):
res = Storage()
for item in element:
res[item.tag] = item.text
return res
def extend_path(root_dir, paths):
new_paths = []
for path in paths:
pp = [root_dir]
if type(path) is list:
pp.extend(path)
else:
pp.append(path)
new_path = os.path.join(*pp)
new_paths.append(new_path)
sys.path.extend(new_paths)
class Platforms():
folders = dict(
win32 = 'win',
linux2 = 'linux',
linux = 'linux',
)
def get_folder(self):
platform = sys.platform
if platform not in self.folders:
raise Exception('Unknown platform %s' % platform)
return self.folders[platform]
def import_class(self, name):
folder = self.get_folder()
file_name = '_'.join(split_uppercase(name)).lower()
module = importlib.import_module('%s.%s' % (folder, file_name))
return getattr(module, name)
class Storage(dict):
def __getattr__(self, key):
return self[key]
def __setattr__(self, key, value):
self[key] = value
def __hasattr__(self, key):
return key in self
class FileReader(object):
def read_all(self, file_name):
with open(file_name) as f:
content = f.read()
return content
class SimpleResponse(Storage):
def __init__(self, code, message = ''):
self.code = code
self.message = message
@staticmethod
def from_string(response):
try:
data = json.loads(response)
return SimpleResponse(data['code'], data['message'])
except:
raise
def __str__(self):
return "Code: %r, message: %s" % (self.code, self.message)
class HTTPResponse(object):
def __init__(self, code, content, headers = {}):
self.code = code
self.content = content
self.headers = headers
def __str__(self):
return "code: %d, headers: %s, content: %s" % (self.code, self.headers, self.content)
class URLLoader(object):
def load(self, url, data = None, headers = {}):
if data:
data = urllib.urlencode(data)
req = urllib2.Request(url, data, headers)
try:
response = urllib2.urlopen(req)
return HTTPResponse(response.getcode(), response.read(), response.info().dict)
except urllib2.URLError, e:
return HTTPResponse(e.code, e.read(), e.info().dict) | Absinthe | /Absinthe-1.1.0.tar.gz/Absinthe-1.1.0/absinthe/tools/utils.py | utils.py |
import os
import sys
import time
import subprocess
from signal import SIGTERM
from utils import SimpleResponse
class BackgroundProcessHandler(object):
def __init__(self, command, pid_file, logger):
self.pid_file = pid_file
self.logger = logger
self.command = command
def start(self):
if self.is_running():
return SimpleResponse(False, 'Daemon is already running. Pid: %d' % self.get_pid())
pid = subprocess.Popen(self.command).pid
file(self.pid_file,'w+').write("%s\n" % pid)
return SimpleResponse(True, 'Started (pid: %s)' % pid)
def get_pid(self):
pid = None
try:
with open(self.pid_file, 'r') as f:
try:
pid = int(f.read().strip())
except TypeError as e:
pid = None
except IOError:
pid = None
return pid
def is_running(self):
pid = self.get_pid()
if pid is None:
return False
try:
os.kill(pid, 0)
except OSError:
return False
else:
return True
def status(self):
return SimpleResponse(True, 'Daemon is ' + ('running' if self.is_running() else 'not running'))
def stop(self):
# Get the pid from the pidfile
pid = self.get_pid()
if not pid:
message = "Pidfile %s does not exist" % self.pid_file
self.logger.error(message)
return SimpleResponse(False, 'Daemon is not running')
# Try killing the daemon process
try:
while 1:
os.kill(pid, SIGTERM)
time.sleep(0.1)
except OSError, err:
err = str(err)
if err.find("No such process") > 0:
if os.path.exists(self.pid_file):
os.remove(self.pid_file)
else:
self.logger.error(err)
sys.exit(1)
return SimpleResponse(True, 'Daemon is stopped')
def restart(self):
self.stop()
return self.start() | Absinthe | /Absinthe-1.1.0.tar.gz/Absinthe-1.1.0/absinthe/tools/background_process_handler.py | background_process_handler.py |
import time
import urllib2
import pyjsonrpc
from utils import Storage, SimpleResponse
from background_process_handler import BackgroundProcessHandler
# used by the init script to manage the remote process, eg start, stop, create the IPC interface
class RemoteProcessManager(BackgroundProcessHandler):
def __init__(self, command, control_port, pid_file, logger):
super(RemoteProcessManager, self).__init__(command, pid_file, logger)
self.control_port = control_port
def get_rpc_client(self):
return pyjsonrpc.HttpClient(url = "http://localhost:%d/jsonrpc" % self.control_port)
def start(self):
result = super(RemoteProcessManager, self).start()
if result.code is False:
return result
rpc_client = self.get_rpc_client()
"""
The BackgroundProcessHandler.start function executes the remote script, and returns immediately.
But the command server not available yet, so we need to wait for it.
"""
attempts = 50
while True:
try:
rpc_client.ping()
break
except urllib2.URLError as e:
attempts = attempts - 1
if attempts > 0:
time.sleep(0.1)
else:
break
if not attempts:
# if the remote process unwilling to communicate, needs to stop it!
self.stop()
return SimpleResponse(False, 'Initialize has been failed')
self.logger.debug('send init')
initres = rpc_client.init()
self.logger.debug(initres)
init_res = Storage(initres)
self.logger.debug('Init remote process: ' + str(init_res))
if init_res.code:
result.message = result.message + ' and initialized'
return result
def control(self, name, args = {}):
self.logger.debug("Send rpc command '%s' with args: %s" % (name, args))
rpc_client = self.get_rpc_client()
func = getattr(rpc_client, name)
response = func(**args)
return SimpleResponse(**dict(response)) | Absinthe | /Absinthe-1.1.0.tar.gz/Absinthe-1.1.0/absinthe/tools/remote_process_manager.py | remote_process_manager.py |
from os.path import splitext, abspath
from sys import modules
import win32serviceutil
import win32service
import win32api
class ServiceManager(object):
def __init__(self, cls, name, display_name = None):
'''
cls : the class (derived from Service) that implement the Service
name : Service name
display_name : the name displayed in the service manager
'''
self.cls = cls
cls._svc_name_ = name
cls._svc_display_name_ = display_name or name
try:
module_path = modules[cls.__module__].__file__
except AttributeError:
# maybe py2exe went by
from sys import executable
module_path = executable
module_file = splitext(abspath(module_path))[0]
cls._svc_reg_class_ = '%s.%s' % (module_file, cls.__name__)
def start(self):
try:
win32serviceutil.StartService(self.cls._svc_name_)
return True
except Exception as e:
# the service is not installed
if e[0] in [1060, 1056]:
return False
else:
raise
def stop(self):
try:
win32serviceutil.StopService(self.cls._svc_name_)
return True
except Exception as e:
# the service is not running
if e[0] in [1060, 1062]:
return False
else:
raise
def remove(self):
self.stop()
try:
svc_mgr = win32service.OpenSCManager(None,None,win32service.SC_MANAGER_ALL_ACCESS)
svc_handle = win32service.OpenService(svc_mgr, self.cls._svc_name_, win32service.SERVICE_ALL_ACCESS)
win32service.DeleteService(svc_handle)
return True
except Exception as e:
# the service is not installed
if e[0] in [1060]:
return False
else:
raise
def install(self, stay_alive=True):
'''
stay_alive : Service will stop on logout if False
'''
if stay_alive: win32api.SetConsoleCtrlHandler(lambda x: True, True)
try:
win32serviceutil.InstallService(
self.cls._svc_reg_class_,
self.cls._svc_name_,
self.cls._svc_display_name_,
startType = win32service.SERVICE_AUTO_START
)
return True
except Exception as e:
# the service is already installed
if e[0] in [1073]:
return False
else:
raise | Absinthe | /Absinthe-1.1.0.tar.gz/Absinthe-1.1.0/absinthe/tools/win/service_manager.py | service_manager.py |
from .functions import convert_baseAmount_to_quoteAmount
from ccxt.base.exchange import Exchange
class CustomExchange(Exchange):
def __init__(self, config={}):
super().__init__(config=config)
self.tickers = dict()
self.currencies = dict()
self.apiName = None
def init(self, apiName):
self.apiName = apiName
def load_tickers(self):
self.tickers = super().fetch_tickers()
return self.tickers
def load_currencies(self): # it is not needed for now
self.currencies = super().fetch_currencies()
return self.currencies
def get_ask(self, symbol):
try:
return float(self.tickers[symbol]['ask'])
except:
return None
def get_bid(self, symbol):
try:
return float(self.tickers[symbol]['bid'])
except:
return None
def get_lastprice(self, symbol):
try:
return float(self.tickers[symbol]['last'])
except:
return None
def get_fee(self, code):
""" releated to child """
try:
return float(self.currencies[code]['fee'])
except:
return None
def check_withdrawal(self, code):
""" releated to child """
return self.currencies[code]['payout']
def check_deposit(self, code):
""" releated to child """
return self.currncies[code]['payin']
def convert_currency(self, active, passive):
quotes = {'DOGE', 'USDT', 'UST', 'USDC', 'TUSD',
'BTC', 'KCS', 'PAX', 'TRX', 'DAI', 'ETH'}
active = active.split(' ')
amount = float(active[0])
active_code = active[1].upper()
passive_code = passive.upper()
if active_code in quotes:
try:
price = self.fetch_custom_price(f'{passive_code}/{active_code}')
return float(amount / price)
except:
price = self.fetch_custom_price(f'{active_code}/{passive_code}')
return float(amount * price)
elif passive_code in quotes:
price = self.fetch_custom_price(f'{active_code}/{passive_code}')
return float(amount * price)
def fetch_custom_total_balance(self, currency):
return super().fetch_total_balance()[currency]
def fetch_custom_free_balance(self, currency):
return super().fetch_free_balance()[currency]
def fetch_custom_price(self, symbol):
return super().fetch_ticker(symbol)['last']
def fetch_custom_ask(self, symbol):
return super().fetch_ticker(symbol)['ask']
def fetch_custom_bid(self, symbol):
return super().fetch_ticker(symbol)['bid']
def is_order_successfull(self, orderId):
trades = super().fetch_my_trades()
for trade in trades:
if orderId == trade['info']['orderId']:
return True
return False
def fetch_BaseMinSize(self, symbol):
baseMinSize = self.fetch_market(symbol)['limits']['amount']['min']
return baseMinSize
def fetch_BaseMinSizeViaQuote(self, symbol):
baseMinSize = self.fetch_BaseMinSize(symbol)
quotePrice = self.fetch_custom_price(symbol)
return convert_baseAmount_to_quoteAmount(baseMinSize, quotePrice)
def fetch_market(self, symbol):
for i in super().fetch_markets():
if i['symbol'] == symbol:
return i
# if __name__ == '__main__':
# a = CustomExchange()
# a.apiName = "baby"
# print(a.__str__) | Abstract-Exchange | /Abstract_Exchange-0.0.2.tar.gz/Abstract_Exchange-0.0.2/Abstract_Exchange/CustomExchange.py | CustomExchange.py |
from time import mktime, sleep
from .CustomExchange import CustomExchange
from datetime import datetime
# from kucoin.client import
from ccxt.hitbtc import hitbtc as BaseHitbtc
from ccxt.kucoin import kucoin as BaseKucoin
from ccxt.bybit import bybit as BaseBybit
from pybit import usdt_perpetual
from concurrent.futures import ThreadPoolExecutor
class hitbtc(CustomExchange, BaseHitbtc):
def __init__(self, config={}):
super().__init__(config=config)
def check_withdrawal(self, code):
""" releated to child """
return self.currencies[code]['payout']
def check_deposit(self, code):
""" releated to child """
return self.currencies[code]['payin']
class kucoin(CustomExchange, BaseKucoin):
def __init__(self, config={}):
super().__init__(config=config)
# self.trade = Trade(self.apiKey, self.secret, self.password)
# self.user = User(self.apiKey, self.secret, self.password)
def check_withdrawal(self, code):
""" releated to child """
return self.currencies[code]['info']['isWithdrawEnabled']
def check_deposit(self, code):
""" releated to child """
return self.currencies[code]['info']['isDepositEnabled']
class bybit(CustomExchange, BaseBybit):
def __init__(self, config={}):
super().__init__(config=config)
self.pybit = usdt_perpetual.HTTP(
endpoint='https://api.bybit.com', api_key=self.apiKey, api_secret=self.secret)
def is_in_position_size(self, symbol, side=None): # check
positions = self.pybit.my_position(symbol=symbol)['result']
for pos in positions:
if pos['size'] > 0 and (side == None or pos['side'] == side):
return True
return False
def in_position_size(self, symbol):
positions = self.pybit.my_position(symbol=symbol)['result']
data = []
for pos in positions:
if pos['size'] > 0:
data.append(pos)
return data
def in_position_size_orders(self, order_ids, symbols):
orders = self.get_active_order_bulk(symbols, order_ids, order_status="Filled,PartiallyFilled")
# sleep(.1)
user_trade_records = self.user_trade_records_bulk(symbols=symbols, order_ids=order_ids)
data = []
for order in orders:
in_position = 1
for user_trade_record in user_trade_records:
if order["order_id"] == user_trade_record["order_id"] and user_trade_record["closed_size"] == order["qty"]:
in_position = 0
break
if in_position == 1:
data.append(order)
return data
def in_position_size_bulk(self, symbols: list, max_in_parallel=40):  # check
with ThreadPoolExecutor(max_workers=max_in_parallel) as executor:
data = []
executions = [
executor.submit(
self.in_position_size,
**{"symbol": symbol}
) for symbol in symbols
]
executor.shutdown()
for execution in executions:
data += execution.result()
return data
def is_in_active_order(self, symbol):
active_orders = self.pybit.query_active_order(symbol=symbol)['result']
if not(active_orders):
return False
return True
def get_custom_leverage(self, symbol, side="Both"):
data = self.pybit.my_position(symbol=symbol)['result']
if side != "Both":
for i in data:
if i['side'] == side:
return i['leverage']
else:
return {"buy_leverage": data[0]["leverage"], "sell_leverage": data[1]["leverage"]}
def set_custom_leverage(self, symbol, buy_leverage=None, sell_leverage=None):
last_leverage = self.get_custom_leverage(symbol=symbol, side="Both")
buy_leverage = last_leverage["buy_leverage"] if buy_leverage == None else buy_leverage
sell_leverage = last_leverage["sell_leverage"] if sell_leverage == None else sell_leverage
self.pybit.set_leverage(
symbol=symbol, buy_leverage=buy_leverage, sell_leverage=sell_leverage)
def fetch_custom_free_balance(self, currency):
return self.pybit.get_wallet_balance(coin=currency)['result'][currency]['available_balance']
def get_active_order_bulk_one_page(self, symbols: list, page=1, limit=50, order_status="Created,Rejected,New,PartiallyFilled,Filled,Cancelled,PendingCancel", max_in_parallel=10):
data = []
with ThreadPoolExecutor(max_workers=max_in_parallel) as executor:
executions = [
executor.submit(
self.pybit.get_active_order,
**{"symbol": symbol, "order_status": order_status, "limit": limit, "page": page}
) for symbol in symbols
]
executor.shutdown()
for execution in executions:
res = execution.result()["result"]["data"]
if res != None:
data += res
return data
def get_active_order_bulk(self, symbols: list, order_ids=[], order_status="Created,Rejected,New,PartiallyFilled,Filled,Cancelled,PendingCancel", max_in_parallel=10):
data = []
for page in range(1, 51):
res = self.get_active_order_bulk_one_page(
symbols=symbols, page=page, order_status=order_status, max_in_parallel=max_in_parallel)
if not(res):
break
else:
data += res
if order_ids:
data2 = []
for order in data:
if order["order_id"] in order_ids:
data2.append(order)
return data2
else:
return data
def closed_profit_and_loss_bulk_one_page(self, symbols: list, page=1, limit=50, start_time=0, end_time=mktime(datetime.timetuple(
datetime.now())), max_in_parallel=40):
data = []
with ThreadPoolExecutor(max_workers=max_in_parallel) as executor:
executions = [
executor.submit(
self.pybit.closed_profit_and_loss,
**{"symbol": symbol, "start_time": start_time, "end_time": end_time, "limit": limit, "page": page}
) for symbol in symbols
]
executor.shutdown()
for execution in executions:
res = execution.result()["result"]["data"]
if res != None:
data += res
return data
def closed_profit_and_loss_bulk(self, symbols: list, order_ids=[], start_time=0, end_time=mktime(datetime.timetuple(
datetime.now())), max_in_parallel=40):
data = []
for page in range(1, 51):
res = self.closed_profit_and_loss_bulk_one_page(
symbols=symbols, page=page, start_time=start_time, end_time=end_time, max_in_parallel=max_in_parallel)
if not(res):
break
else:
data += res
if order_ids:
data2 = []
for order in data:
if order["order_id"] in order_ids:
data2.append(order)
return data2
else:
return data
def user_trade_records_bulk_one_page(self, symbols: list, page=1, limit=200, start_time=0, end_time=mktime(datetime.timetuple(
datetime.now())), max_in_parallel=40):
data = []
with ThreadPoolExecutor(max_workers=max_in_parallel) as executor:
executions = [
executor.submit(
self.pybit.user_trade_records,
**{"symbol": symbol, "start_time": start_time, "end_time": end_time, "limit": limit, "page": page}
) for symbol in symbols
]
executor.shutdown()
for execution in executions:
res = execution.result()["result"]["data"]
if res != None:
data += res
return data
def user_trade_records_bulk(self, symbols: list, order_ids=[], start_time=0, end_time=mktime(datetime.timetuple(
datetime.now())), max_in_parallel=40):
data = []
for page in range(1, 51):
res = self.user_trade_records_bulk_one_page(
symbols=symbols, page=page, start_time=start_time, end_time=end_time, max_in_parallel=max_in_parallel)
if not(res):
break
else:
data += res
if order_ids:
data2 = []
for order in data:
if order["order_id"] in order_ids:
data2.append(order)
return data2
else:
return data
if __name__ == '__main__':
a = bybit({'apiKey': 'CSxcH3KzjGJqUrpwXe',
'secret': 'iGLSmVrfhbDXMyICTc7TnVnfiYHqJpOKN2Mk'})
print(a.pybit.my_position("DOGEUSDT")) | Abstract-Exchange | /Abstract_Exchange-0.0.2.tar.gz/Abstract_Exchange-0.0.2/Abstract_Exchange/Exchanges.py | Exchanges.py |
from multiprocessing import Process
def filter_dict_keys(dict: dict, keys: list):
i = 0
dict_keys = list(dict.keys())
while i < len(dict_keys):
if dict_keys[i] not in keys:
dict.pop(dict_keys[i])
i += 1
return dict
def dict_average(dict: dict):
s = 0
dict_values = dict.values()
for i in dict_values:
s += i
ave = s / len(dict_values)
return round(ave, 5)
def is_float(number):
try:
float(number)
return True
except:
return False
def calculate_benefit_percentage(buyed_price=None, selled_price=None, percentage=None):
if buyed_price and percentage:
return buyed_price * ((100+percentage)/100)
elif selled_price and percentage:
return selled_price * ((100+percentage)/100)
elif buyed_price and selled_price:
return ((selled_price/buyed_price)*100)-100
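# e.g. calculate_benefit_percentage(buyed_price=100, selled_price=110) returns 10.0 (a 10% gain)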
def calculate_percentage(amount=None, totalAmount=None, percentage=None):
if amount and percentage:
return amount * (percentage/100)
elif totalAmount and percentage:
return totalAmount * (percentage/100)
elif amount and totalAmount:
return (amount / totalAmount) * 100
def convert_quoteAmount_to_baseAmount(quouteAmount, price):
baseAmount = quouteAmount / price
return baseAmount
def convert_baseAmount_to_quoteAmount(baseAmount, price):
quoteAmount = baseAmount * price
return quoteAmount
def convert_symbol(symbol: str, delemiter=""):
quotes = {'DOGE', 'USDT', 'UST', 'USDC', 'TUSD',
'BTC', 'KCS', 'PAX', 'TRX', 'DAI', 'ETH'}
delemiters = {'-', ' ', '/'}
symbol = symbol.strip()
# delimeterd symbol
for i in symbol:
if i in delemiters:
b = symbol.split(i)[0]
q = symbol.split(i)[1]
s = b + delemiter + q
return s
# undelimeterd symbol
if symbol[-4:] in quotes:
q = symbol[-4:]
b = symbol[:-4]
s = b + delemiter + q
return s
elif symbol[-3:] in quotes:
q = symbol[-3:]
b = symbol[:-3]
s = b + delemiter + q
return s
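# e.g. convert_symbol("BTCUSDT", "/") returns "BTC/USDT"; convert_symbol("eth-usdt", "/") returns "eth/usdt"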
def reverse_side(side: str):
side = side.capitalize()
if side == "Buy":
return "Sell"
elif side == "Sell":
return "Buy"
return None
def precision_of_number(number):
c = 0
number = str(number)
for i in range(len(number)-1, 0, -1):
if number[i] == ".":
return c
c += 1
def run_in_parallel(fn, args):
prcs = []
for arg in args:
p = Process(target=fn, args=arg)
p.start()
prcs.append(p)
for p in prcs:
p.join() | Abstract-Exchange | /Abstract_Exchange-0.0.2.tar.gz/Abstract_Exchange-0.0.2/Abstract_Exchange/functions.py | functions.py |
def add(a,b):
return a+b
def subtract(a,b):
return a-b
def multiply(a,b):
return a*b
def divide(a,b):
return a/b
def divideInt(a,b):
return a//b
def mod(a,b):
return a%b
def exponent(a,b):
return a**b
## More specific stuff
def GCD(a,b):
bigVal , smallVal = max([a,b]) , min([a,b])
# start Euclid's algorithm
done = False
while not done:
tempVal = bigVal
bigVal = smallVal
potentialGCD = smallVal
smallVal = tempVal % smallVal
if smallVal ==0:
return potentialGCD
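# e.g. GCD(48, 18) returns 6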
def gcdSteps(a,b):
def equationFormat(valOne, valTwo, valThree, valFour):
return "{} = {}*{} + {}".format(valOne, valTwo, valThree, valFour)
def endingsFormat(valOne, valTwo, valThree, valFour):
return "{} = {} - {}*{}".format(valFour, valOne, valTwo, valThree)
def popEndValue(list):
return list[0:len(list)-1]
endingVals = []
allEquations=[]
bigVal , smallVal = max([a,b]) , min([a,b])
# start euclid's alogirthm
# FORMAT => A = M*X + B
aValues =[]
mValues =[]
xValues =[]
bValues =[]
done = False
while not done:
tempVal = bigVal
bigVal = smallVal
smallVal = tempVal % smallVal
endingVals.append(endingsFormat(tempVal,bigVal,tempVal//bigVal,smallVal))
allEquations.append(equationFormat(tempVal,bigVal,tempVal//bigVal,smallVal))
aValues.append(tempVal)
mValues.append(bigVal)
xValues.append(tempVal//bigVal)
bValues.append(smallVal)
if smallVal ==0:
break
aValues = popEndValue(aValues)
mValues = popEndValue(mValues)
xValues = popEndValue(xValues)
bValues = popEndValue(bValues)
endingVals = popEndValue(endingVals)
# print("\n",aValues,"\n",mValues,"\n", xValues,"\n" , bValues, "\n")
return allEquations, endingVals
def simplifyCongruence(initVal1, initVal2, iterVal):
returnVals = []
while initVal1 <0:
initVal1+=iterVal
if initVal1>0:
returnVals.append(initVal1)
break
while initVal1-iterVal> 0:
initVal1 -= iterVal
if initVal1-iterVal < 0:
returnVals.append(initVal1)
break
while initVal2 <0:
initVal2+=iterVal
if initVal2>0:
returnVals.append(initVal2)
break
while initVal2-iterVal> 0:
initVal2 -= iterVal
if initVal2-iterVal < 0:
returnVals.append(initVal2)
break
templateFormat = "\n\n {} ≊ {} (mod {})\n\n".format(initVal1, initVal2, iterVal)
return templateFormat
def linearCombination(valueOne, valueTwo):
gcdValue = GCD(valueOne,valueTwo)
equation = "{} = {} + {}".format(gcdValue,valueOne,valueTwo)
equation= equation.split("=")
return equation | Abstract1-Josephaw1022 | /Abstract1-Josephaw1022-0.0.1.tar.gz/Abstract1-Josephaw1022-0.0.1/Abstract1/Abstract1.py | Abstract1.py |
# AbundanceMatching
[](https://pypi.python.org/pypi/AbundanceMatching)
A python module to create (interpolate and extrapolate) abundance functions and also provide fiducial deconvolution
(with Peter Behroozi's implementation) and abundance matching.
## Installation
```bash
pip install abundancematching
```
## Example
Here's an example to do abundance matching with this code.
```python
"""
Assume you have a numpy structured array `halos`,
which contains a list of halos, with labels of the quantity names.
Assume you also have a luminosity function table `lf`,
whose first column is the quantity to match (e.g. magnitude),
and the second column is the abundance (per Mpc^3 per Mag).
"""
import numpy as np
import matplotlib.pyplot as plt
from AbundanceMatching import AbundanceFunction, LF_SCATTER_MULT, calc_number_densities
af = AbundanceFunction(lf[:,0], lf[:,1], (-27, -5))
# check the abundance function
plt.semilogy(lf[:,0], lf[:,1])
x = np.linspace(-27, -5, 101)
plt.semilogy(x, af(x))
# deconvolution and check results (it's a good idea to always check this)
scatter = 0.2
remainder = af.deconvolute(scatter*LF_SCATTER_MULT, 20)
x, nd = af.get_number_density_table()
plt.plot(x, remainder/nd);
# get number densities of the halo catalog
nd_halos = calc_number_densities(halos['vpeak'], box_size)
# do abundance matching with no scatter
catalog = af.match(nd_halos)
#do abundance matching with some scatter
catalog_sc = af.match(nd_halos, scatter*LF_SCATTER_MULT)
#if you want to do multiple (100) realizations:
catalog_deconv = af.match(nd_halos, scatter*LF_SCATTER_MULT, False)
for __ in range(100):
catalog_this = add_scatter(catalog_deconv, scatter*LF_SCATTER_MULT)
catalog_this = rematch(catalog_this, catalog, af._x_flipped)
# do something with catalog_this
```
| AbundanceMatching | /AbundanceMatching-0.3.0.tar.gz/AbundanceMatching-0.3.0/README.md | README.md |
# Abyiss Python Client - WebSocket & REST APIs
Python Client for Abyiss Cryptocurrency APIs.
To use our API please sign up for a free account here: [Sign Up](https://www.abyiss.com/signin), and find your API Key in your [Dashboard](https://www.abyiss.com/dashboard).
## Please use our official [Documentation](https://docs.abyiss.com/). It contains all the latest updates.
### We will be adding some of the additional features to this client library and our API roughly in this order:
* **WebSockets** - This will allow you to subscribe to real time cryptocurrency market data from the API.
* **Unified Endpoints** - This will allow you get a unified view of the entire cryptocurrency market.
* **CSV Export** - This will allow you to export the market data to a CSV file.
* **More Support** - Add support for more currencies, exchanges, markets, symbols, timeframes, functions, indicators and more.
If you have any problems with this library, please open an issue request on [Github](https://github.com/Abyiss/Client-python/issues) or for any additional support please email us at [[email protected]](mailto:[email protected]).
# Getting Started
### Install Abyiss Python Library
``` pip install abyiss ```
### Quick Start - Copy & Paste Code
```python
from Abyiss import Abyiss
apiKey = "YOUR API KEY""
client = Abyiss.Client(apiKey)
exchanges = client.getExchanges()
exchangeDetails = client.getExchangeDetails("coinbasepro")
exchangeStatus = client.getExchangeStatus("coinbasepro")
exchangeMarkets = client.getExchangeMarkets("coinbasepro")
exchangeMarketDetails = client.getMarketDetails("coinbasepro", "BTC-USDT")
aggregates = client.aggregates("coinbasepro", "BTC-USDT", "1h", '300')
trades = client.trades("coinbasepro", "BTC-USDT", '300')
quotes = client.quotes("coinbasepro", "BTC-USDT")
orderbook = client.orderBook("coinbasepro", "BTC-USDT")
```
# More Details
### Abyiss Client
```python
apiKey = "(s2nKF1s2S^Xj6(43z6x6VCh18Ao5Qhu@*6"
# Create an instance of the Abyiss class with your API key
client = Abyiss.Client(apiKey)
```
* Creates an instance of the Abyiss client class using your API key.
* Parameters:
- **apiKey**: String. Your Abyiss API Key
* Returns a 200 status code upon successful query.
## Reference Data
### Get Exchanges
```python
exchanges = client.getExchanges()
```
* Returns an array of all supported exchanges as exchange objects.
* Parameters: none.
* Response Attributes:
- **name**: String. The official name of the exchange.
- **id**: String. Unique exchange identifier used by Abyiss.
* Response Object:
```json
[
{
"name":"Binance",
"id":"binance"
},
{
"name":"Binance US",
"id":"binanceus"
},
{
"name":"Coinbase Pro",
"id":"coinbasepro"
},
{
"name":"BitBay",
"id":"bitbay"
}
]
```
### Get Exchange Details
```python
exchangeDetails = client.getExchangeDetails("exchange id")
```
* Returns an object with properties about the exchange.
* Parameters:
- **exchange id**: String. Unique exchange identifier used by Abyiss.
* Response Attributes:
- **name**: String. the official name of the exchange.
- **id**: String. the id of the exchange used within the api routes.
- **url**: String. the exchange's official website url.
- **hasTrades**: Boolean. Whether the api can be used to query market trades on the exchange.
- **hasAggregates**: Boolean. Whether the api can be used to query market candle aggregates on the exchange.
- **aggregateTimeframes**: Object containing all of the timeframes supported for market candle aggregates.
* Response Object:
```json
{
"name":"Coinbase Pro",
"id":"coinbasepro",
"url":"https://pro.coinbase.com/",
"hasTrades":true,
"hasAggregates":true,
"aggregateTimeframes":
{
"1m":60,
"5m":300,
"15m":900,
"1h":3600,
"6h":21600,
"1d":86400
}
}
```
### Get Exchange Status
```python
exchangeStatus = client.getExchangeStatus("exchange id*")
```
* Returns an object with properties that describe an exchange's status.
* Parameters:
- **exchange id**: String. Unique exchange identifier used by Abyiss.
* Response Attributes:
- **status**: String. The status of the exchange. 'ok' is good.
    - **updated**: Int. Unix timestamp of the last time the exchange's status was updated.
* Response Object:
```json
{
"status":"ok",
"updated":1634929487916
}
```
### Get Exchange Markets
```python
exchangeMarkets = client.getExchangeMarkets("exchange id*")
```
* Returns an array of all crypto pair ids on the exchange.
* Parameters:
- **exchange id**: String. Unique exchange identifier used by Abyiss.
* Response Attributes:
- **pair id**: String. Unique Crypto Pair identifier used by the exchange.
* Response Object:
```json
[
"OGN/BTC",
"REQ/BTC",
"KEEP/USD",
"AAVE/USD",
"SKL/GBP",
"MIR/EUR",
"FORTH/EUR",
"DOT/USDT"
]
```
### Get Exchange Markets Details
```python
exchangeMarketDetails = client.getMarketDetails("exchange id*", "market id*")
```
* Returns an object with properties about the crypto pair.
* Parameters:
- **exchange id**: String. Unique exchange identifier used by Abyiss.
- **market id**: String. Unique Crypto Pair identifier used by the exchange.
* Response Attributes:
- **exchange**: String. Unique identifier used by Abyiss for the exchange.
- **symbol**: String. The symbol of the market.
- **id**: String. Unique identifier used by Abyiss for the market.
- **active**: Boolean. Whether the market is active on the exchange.
    - **base**: String. The base asset of the market, i.e. the asset being bought.
    - **quote**: String. The quote asset of the market, i.e. the currency it is priced in.
- **percentage**: Boolean. Whether the taker and maker fee rate is a multiplier or a fixed flat amount.
- **taker**: Float. Taker fee rate, 0.002 = 0.2%.
- **maker**: Float. Maker fee rate, 0.0016 = 0.16%.
    - **type**: String. The exchange market type the pair is listed as (e.g. spot).
* Response Object:
```json
{
"exchange":"coinbasepro",
"symbol":"BTC/USD",
"id":"BTC-USD",
"active":true,
"base":"BTC",
"quote":"USD",
"percentage":true,
"taker":0.005,
"maker":0.005,
"type":"spot"
}
```
## Market Data
### Aggregates
```python
aggregates = client.aggregates("exchange id*", "market id*", "aggregate size*", 'limit')
```
* Returns an array of recent aggregate candlesticks of a given aggregate size for a market on an exchange.
* Parameters:
- **exchange id**: String. Unique exchange identifier used by Abyiss.
- **market id**: String. Unique Crypto Pair identifier used by the exchange.
- **aggregate size**: String. Aggregate bar or candlestick time frame. (1m, 5m, 15m, 1h, 6h, 1d)
- **limit**: String. Optional. Number of results per request. Maximum 500. (default 200)
* Response Attributes:
- **exchange**: String. Unique identifier used by Abyiss for the exchange.
- **market**: String. Unique identifier used by Abyiss for the market.
    - **timestamp**: Integer. Unix timestamp (in milliseconds) of the start of the aggregate window.
- **open**: float. The first, or opening, price of the aggregate's scope.
- **high**: float. The highest price recorded within the scope of the aggregate.
- **low**: float. The lowest price recorded within the scope of the aggregate.
- **close**: float. The last, or closing, price within the aggregate's scope.
- **volume**: float. The volume within the aggregate's scope.
* Response Object:
```json
{
"exchange": "coinbasepro",
"market": "BTC/USD",
"timestamp": 1639532040000,
"open": 48080,
"high": 48111.79,
"low": 48080,
"close": 48088.72,
"volume": 2.55482409
}
```
### Trades
```python
trades = client.trades("exchange id*", "market id*", 'limit')
```
* Returns an array of recent trades that have occurred on an exchange for a crypto pair.
* Parameters:
- **exchange id**: String. Unique exchange identifier used by Abyiss.
- **market id**: String. Unique Crypto Pair identifier used by the exchange.
- **limit**: String. Optional. Number of results per request. Maximum 500. (default 200)
* Response Attributes:
- **exchange**: String. Unique identifier used by Abyiss for the exchange.
- **market**: String. Unique identifier used by Abyiss for the market.
- **id**: String. The exchange specific unique id of the trade.
    - **timestamp**: String. Unix timestamp (in milliseconds) of when the trade occurred.
- **price**: float. The quote currency price of the market.
- **size**: float. The quantity traded.
- **cost**: float. The quote cost: (size * price).
- **side**: string. Whether the trade was a buy or sell.
* Response Object:
```json
{
"exchange": "coinbasepro",
"market": "BTC/USD",
"id": "251180247",
"timestamp": "1639534096083",
"price": 47887.03,
"size": 0.00013904,
"cost": 6.6582126511999995,
"side": "sell"
}
```
### Quotes
```python
quotes = client.quotes("exchange id*", "market id*")
```
* Returns an array of recent quotes that have occurred on an exchange for a crypto pair.
* Parameters:
- **exchange id**: String. Unique exchange identifier used by Abyiss.
- **market id**: String. Unique Crypto Pair identifier used by the exchange.
* Response Attributes:
- **exchange**: String. Unique identifier used by Abyiss for the exchange.
- **market**: String. Unique identifier used by Abyiss for the market.
- **bid price**: float. The bid price.
- **bid size**: float. The bid size.
- **ask price**: float. The ask price.
- **ask size**: float. The ask size.
    - **timestamp**: Integer. Unix timestamp (in milliseconds) of the quote.
* Response Object:
```json
{
"exchange":"coinbasepro",
"market":"BTC/USD",
"nonce":14601360013,
"bids":
[
[
61947.91,
1.48088
],
[
61947.9,
0.5
],
[
61944.07,
0.44094
]
],
"asks":
[
[
61947.92,
2.28573
],
[
61952.89,
0.11214
],
[
61952.9,
0.50224
]
]
}
```
### Level 2 OrderBook
```python
level2 = client.orderBook("exchange id*", "market id*")
```
* Returns a snapshot of the recent level 2 orderbook for a crypto pair on an exchange.
* Parameters:
- **exchange id**: String. Unique exchange identifier used by Abyiss.
- **market id**: String. Unique Crypto Pair identifier used by the exchange.
* Response Attributes:
- **exchange**: String. Unique identifier used by Abyiss for the exchange.
- **market**: String. Unique identifier used by Abyiss for the market.
- **bid price**: float. The bid price.
- **bid size**: float. The bid size.
- **ask price**: float. The ask price.
- **ask size**: float. The ask size.
    - **timestamp**: Integer. Unix timestamp (in milliseconds) of the order book snapshot.
* Response Object:
```json
{
"exchange":"coinbasepro",
"market":"BTC/USD",
"nonce":14601360013,
"bids":
[
[
61947.91,
1.48088
],
[
61947.9,
0.5
],
[
61944.07,
0.44094
]
],
"asks":
[
[
61947.92,
2.28573
],
[
61952.89,
0.11214
],
[
61952.9,
0.50224
]
]
}
```
| Abyiss | /Abyiss-1.0.0.tar.gz/Abyiss-1.0.0/README.md | README.md |
from colorama import Fore, Back, init
from os import system
from tqdm import tqdm
import pyfiglet
import requests
import json
import os
# #https://download1518.mediafire.com/pjbmrxu3fyig/ygh79hofhj9npol/config.json
dir_path = os.path.dirname(os.path.realpath(__file__))
def download(url, filename, callback):
print(Fore.GREEN)
response = requests.get(url, stream=True)
total_size_in_bytes= int(response.headers.get('content-length', 0))
block_size = 1024 #1 Kibibyte
progress_bar = tqdm(total=total_size_in_bytes, unit='iB', unit_scale=True)
with open(filename, 'wb') as file:
for data in response.iter_content(block_size):
progress_bar.update(len(data))
file.write(data)
progress_bar.close()
    if total_size_in_bytes != 0 and progress_bar.n != total_size_in_bytes:
        print("ERROR, something went wrong")
        callback('error')
        return
    callback('successful')
def ondownload(state):
if state == "successful":
        print(f'{Fore.GREEN}[ABYSS-INFO]: config.json file downloaded.')
        print(f'{Fore.GREEN}[ABYSS-INFO]: Please run the file again.')
else:
        print(f'{Fore.GREEN}[ABYSS-ERROR]: Error downloading config.json file, check your internet connection or try again.')
exit()
class Abyss:
def __init__(self):
self.foreground = None
self.background = None
self.error_color = None
self.Info_color = None
self.title_font = None
self.title = None
self.subtitle = None
def init(themename,path,filename):
try:
f = open(f'{path}/{filename}', 'r')
c = f.read()
js = json.loads(c)
foreground = js[themename]['style']['foreground'].upper()
background = js[themename]['style']['background'].upper()
error_color = js[themename]['style']['error_color'].upper()
info_color = js[themename]['style']['info_color'].upper()
title_font = js[themename]['style']['title_font'].upper()
title = js[themename]['shell_info']['title'].upper()
subtitle = js[themename]['shell_info']['subtitle'].upper()
foreground = getattr(Fore, foreground)
            background = getattr(Back, background)
error_color = getattr(Fore, error_color)
info_color = getattr(Fore, info_color)
build(Abyss,foreground,error_color,info_color,background,title_font,title,subtitle)
except FileNotFoundError:
print(f'{Fore.RED}[ABYSS-ERROR]:File config.json not found.')
print(f'{Fore.GREEN}[ABYSS-INFO]: Downloading config.json file...')
download('https://drive.google.com/u/0/uc?id=15hGFtFkaupcuzcpJYdU5vmLmZwK5pywz&export=download',
f'{path}/{filename}', ondownload)
def generate_menu(options):
option = None
        options[f'{len(options)+1}'] = ('exit', exitmenu)
while option != len(options):
show_menu(options)
option = read_option(options)
run(option, options)
def show_menu(options):
        print('Select an option:')
for i in sorted(options):
print(f' {i}) {options[i][0]}')
def read_option(options):
        while (a := input('Option: ')) not in options:
            print('Invalid option, please try again.')
return a
def run(option, options):
options[option][1]()
def exitmenu():
print('Exiting')
exit()
def build(self,foreground=Fore.BLACK, error_color=Fore.BLUE, info_color=Fore.CYAN, background='f0', title_font="wavy",title="ABYSS",subtitle="Shell Library, KAT"):
self.foreground = foreground
self.background = background
self.error_color = error_color
self.Info_color = info_color
self.title_font = title_font
self.title = title
self.subtitle = subtitle
cleanScreen()
title = pyfiglet.figlet_format(title, font = title_font)
print(foreground)
print(title)
print(subtitle)
def cleanScreen():
_ = system('cls') | Abyss-Shell | /Abyss-Shell-1.0.0.tar.gz/Abyss-Shell-1.0.0/Abyss/__init__.py | __init__.py |
# Assetto Corsa dummy Python library
A dummy library for Assetto Corsa's native functions, presented in Python.
Currently, it provides a single interface for the `ac` module.
The main goal of this package is to provide convenient autocomplete in your IDE (tested in PyCharm).
## Installation to develop AC mod
You will need to install the package:
```shell
pip install AcDummyLib
```
Add the following to your script instead of a plain `import ac`:
```python
if __name__ == '__main__':
### pip install AcDummyLib
from AcDummyLib import ac
else:
import ac
```
Now check your IDE; autocomplete should work.
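With the conditional import in place, a minimal app skeleton like the one below gets autocomplete for the `ac` calls. The function names follow the stock Assetto Corsa Python API (`ac.newApp`, `ac.setSize`, `ac.addLabel`, `ac.log`); the app name and layout are placeholders:
```python
appWindow = 0

def acMain(ac_version):
    global appWindow
    appWindow = ac.newApp("MyMod")        # create the in-game app window
    ac.setSize(appWindow, 200, 100)       # window size in pixels
    ac.addLabel(appWindow, "Hello, AC!")  # static text label
    ac.log("MyMod initialized")           # write to the AC log
    return "MyMod"
```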
## Contribution
You are very welcome to contribute changes to this code. =)
Please feel free to open merge/pull requests.
Or, you may raise an issue to highlight any discrepancies you find.
## Roadmap
- Migrate function descriptions into the interface file.
## References
#### Source documents:
- https://docs.google.com/document/d/13trBp6K1TjWbToUQs_nfFsB291-zVJzRZCNaTYt4Dzc/pub
- https://assettocorsamods.net/attachments/inofficial_acpythondoc_v2-pdf.7415/
#### Initial forum threads:
- https://assettocorsamods.net/threads/doc-python-doc.59
- https://assettocorsamods.net/threads/is-there-a-way-to-load-ac-library-to-have-autocomplete-in-an-ide-e-g-pycharm.3088/ | AcDummyLib | /AcDummyLib-0.2.0.tar.gz/AcDummyLib-0.2.0/README.md | README.md |
Binaural Beats and Monaural Beats with Python
=============================================
``AccelBrainBeat`` is a Python library for creating binaural beats or
monaural beats. You can play these beats and generate wav files. The
frequencies can be selected freely.
Description
-----------
This Python library enables you to manage your mind state with a kind of
"Brain-Wave Controller", generally known as binaural beats or
monaural beats, in a simplified way.
Documentation
-------------
Full documentation is available on
https://code.accel-brain.com/Binaural-Beat-and-Monaural-Beat-with-python/
. This document contains information on functional reusability,
functional scalability, and functional extensibility.
Demonstration in Movie
----------------------
- `Drive to design the brain's level
upper <https://www.youtube.com/channel/UCvQNSr2fVjI8bIMhJ_bfQmg>`__
(Youtube)
Installation
------------
Install using pip:
.. code:: bash
pip install AccelBrainBeat
Source code
~~~~~~~~~~~
The source code is currently hosted on GitHub.
- `Binaural-Beat-and-Monaural-Beat-with-python <https://github.com/chimera0/accel-brain-code/tree/master/Binaural-Beat-and-Monaural-Beat-with-python>`__
Python package index(PyPI)
~~~~~~~~~~~~~~~~~~~~~~~~~~
Binary installers for the latest released version are available at the
Python package index.
- `AccelBrainBeat: Python Package
Index <https://pypi.python.org/pypi/AccelBrainBeat/>`__
Dependencies
~~~~~~~~~~~~
- `NumPy <http://www.numpy.org/>`__: v1.7.0 or higher
To play the beats on console
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If you want to not only output wav files but also play the beats on
console, `PyAudio <https://people.csail.mit.edu/hubert/pyaudio/>`__
(v0.2.9 or higher) must be installed.
Use-case on console
-------------------
You can study or work while listening to binaural or monaural beats.
Before starting your job, run a batch program on console.
Create "Binaural Beat" and output wav file
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Run the batch program:
`save\_binaural\_beat.py <https://github.com/chimera0/accel-brain-code/blob/master/Binaural-Beat-and-Monaural-Beat-with-python/bat/save_binaural_beat.py>`__.
.. code:: bash
python bat/save_binaural_beat.py -o binaural_beat.wav -l 400 -r 430 -t 60 -v 0.01
The command line arguments is as follows.
.. code:: bash
python bat/save_binaural_beat.py -h
::
usage: save_binaural_beat.py [-h] [-o OUTPUT_FILE_NAME] [-l LEFT] [-r RIGHT]
[-t TIME] [-v VOLUME]
Create the Binaural Beat and save wav file.
optional arguments:
-h, --help show this help message and exit
-o OUTPUT_FILE_NAME, --output_file_name OUTPUT_FILE_NAME
Output file name.
-l LEFT, --left LEFT Left frequencys (Hz).
-r RIGHT, --right RIGHT
Right frequencys (Hz).
-t TIME, --time TIME Play time. This is per seconds.
-v VOLUME, --volume VOLUME
Sound volume.
Create "Monaural Beat" and output wav file
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Run the batch program:
`save\_monaural\_beat.py <https://github.com/chimera0/accel-brain-code/blob/master/Binaural-Beat-and-Monaural-Beat-with-python/bat/save_monaural_beat.py>`__.
.. code:: bash
python bat/save_monaural_beat.py -o monaural_beat.wav -l 400 -r 430 -t 60 -v 0.01
The command line arguments is as follows.
.. code:: bash
python bat/save_monaural_beat.py -h
::
usage: save_monaural_beat.py [-h] [-o OUTPUT_FILE_NAME] [-l LEFT] [-r RIGHT]
[-t TIME] [-v VOLUME]
Create the Monaural Beat and save wav file.
optional arguments:
-h, --help show this help message and exit
-o OUTPUT_FILE_NAME, --output_file_name OUTPUT_FILE_NAME
Output file name.
-l LEFT, --left LEFT Left frequencys (Hz).
-r RIGHT, --right RIGHT
Right frequencys (Hz).
-t TIME, --time TIME Play time. This is per seconds.
-v VOLUME, --volume VOLUME
Sound volume.
Create and play "Binaural Beat" on console
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Run the batch program:
`play\_binaural\_beat.py <https://github.com/chimera0/accel-brain-code/blob/master/Binaural-Beat-and-Monaural-Beat-with-python/bat/play_binaural_beat.py>`__.
.. code:: bash
    python bat/play_binaural_beat.py -l 400 -r 430 -t 60 -v 0.01
The command line arguments is as follows.
.. code:: bash
python bat/play_binaural_beat.py -h
::
usage: play_binaural_beat.py [-h] [-l LEFT] [-r RIGHT] [-t TIME] [-v VOLUME]
Create and play the Binaural Beat.
optional arguments:
-h, --help show this help message and exit
-l LEFT, --left LEFT Left frequencys (Hz).
-r RIGHT, --right RIGHT
Right frequencys (Hz).
-t TIME, --time TIME Play time. This is per seconds.
-v VOLUME, --volume VOLUME
Sound volume.
Create and play "Monaural Beat" on console
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Run the batch program:
`play\_monaural\_beat.py <https://github.com/chimera0/accel-brain-code/blob/master/Binaural-Beat-and-Monaural-Beat-with-python/bat/play_monaural_beat.py>`__.
.. code:: bash
    python bat/play_monaural_beat.py -l 400 -r 430 -t 60 -v 0.01
The command line arguments is as follows.
.. code:: bash
python bat/play_monaural_beat.py -h
::
usage: play_monaural_beat.py [-h] [-l LEFT] [-r RIGHT] [-t TIME] [-v VOLUME]
Create and play the Monaural Beat.
optional arguments:
-h, --help show this help message and exit
-l LEFT, --left LEFT Left frequencys (Hz).
-r RIGHT, --right RIGHT
Right frequencys (Hz).
-t TIME, --time TIME Play time. This is per seconds.
-v VOLUME, --volume VOLUME
Sound volume.
Use-case for coding
-------------------
You can use this library as a module by executing an import statement in
your Python source file.
Create wav file of "Binaural Beat"
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Call the method.
.. code:: python
from AccelBrainBeat.brainbeat.binaural_beat import BinauralBeat
brain_beat = BinauralBeat() # for binaural beats.
brain_beat.save_beat(
output_file_name="save_binaural_beat.wav",
frequencys=(400, 430),
play_time=10,
volume=0.01
)
- ``output_file_name`` is the wav file name or path.
Create wav file of "Monaural Beat"
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The interface of monaural beats is also the same as that of binaural beats.
.. code:: python
from AccelBrainBeat.brainbeat.monaural_beat import MonauralBeat
brain_beat = MonauralBeat() # for monaural beats.
brain_beat.save_beat(
output_file_name="save_monaural_beat.wav",
frequencys=(400, 430),
play_time=10,
volume=0.01
)
Create and play "Binaural Beat"
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
For example, if ``400`` Hz is played in the left ear and ``430`` Hz in the
right, the binaural beat has a frequency of 30 Hz.
Import Python and Cython modules.
.. code:: python
from AccelBrainBeat.brainbeat.binaural_beat import BinauralBeat
Instantiate objects and call the method.
.. code:: python
brain_beat = BinauralBeat()
brain_beat.play_beat(
frequencys=(400, 430),
play_time=10,
volume=0.01
)
- ``frequencys`` is a tuple containing the pair of frequencies (Hz).
- ``play_time`` is the playing time in seconds.
- ``volume`` is the sound volume. It depends on your environment.
Create and play "Monaural Beat"
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The interface of monaural beats is the same as that of binaural beats.
``MonauralBeat`` is functionally equivalent to ``BinauralBeat``.
.. code:: python
from AccelBrainBeat.brainbeat.monaural_beat import MonauralBeat
brain_beat = MonauralBeat()
brain_beat.play_beat(
frequencys=(400, 430),
play_time=10,
volume=0.01
)
Licence
-------
- `GPL2 <https://github.com/chimera0/Binaural-Beat-and-Monaural-Beat-with-python/blob/master/LICENSE>`__
Related products
----------------
Binaural beats and monaural beats can be implemented not only in Python
but also in Unity3D. I developed the Unity3D package `Immersive Brain's Level
Upper by Binaural Beat and Monaural
Beat <https://www.assetstore.unity3d.com/en/#!/content/66518>`__.
As a kind of "Brain-Wave Controller", this Unity3D package is
functionally equivalent to the Python library.
More detail
-----------
The purpose of this library is to induce a deeply immersive mental state
on the path to peak performance. You can manage your mind state with
this library, which influences your brain waves through binaural beats
and monaural beats.
Concept of Binaural beats and Monauarl beats
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
According to a popular theory, brain waves such as Delta, Theta, Alpha,
Beta, and Gamma rhythms tend to be correlated with mental states. The
delta waves (1-3 Hz) are regarded as the slowest brain waves, typically
produced during the deep stages of sleep. The theta waves (4-7 Hz) are
often induced by a meditative state or by focusing the mind. The
alpha waves (8-12 Hz) are associated with a relaxed state. The beta
waves (13-29 Hz) correspond to normal waking consciousness. The gamma
waves (30-100 Hz) are the fastest of the brain waves and are associated
with peak concentration and the brain's optimal frequency for cognitive
functioning.
According to the theory of binaural beats, when signals of two different
frequencies are presented separately through headphones or earphones, one
to each ear, your brain detects the phase variation between the frequencies
and tries to reconcile that difference. The effect on the brain waves depends
on the difference in frequencies of the two tones. For example, if 400 Hz is
played in one ear and 430 Hz in the other, the binaural beat has a frequency
of 30 Hz.
The monaural beats are similar to the binaural beats, but they differ in
distinct ways. The binaural beats seem to be "created" or perceived by
cortical areas combining the two different frequencies. On the other
hand, the monaural beats are due to direct stimulation of the basilar
membrane. This makes it possible to hear the beats directly.
Choose either binaural beats or monaural beats. If you set a difference of
5 Hz, your brain waves can be entrained to that frequency, so that you can
reach a meditative or focused state. If you prefer a relaxed state instead,
target the alpha waves (8-12 Hz).
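For instance, targeting the alpha range with the ``BinauralBeat`` API shown
above is just a matter of choosing a 10 Hz difference (a minimal sketch; the
400 Hz carrier is arbitrary):

.. code:: python

    from AccelBrainBeat.brainbeat.binaural_beat import BinauralBeat

    brain_beat = BinauralBeat()
    brain_beat.play_beat(
        frequencys=(400, 410),  # 410 - 400 = 10 Hz, within the alpha range
        play_time=60,
        volume=0.01
    )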
Related PoC
~~~~~~~~~~~
- `仏教の社会構造とマインドフルネス瞑想の意味論 <https://accel-brain.com/social-structure-of-buddhism-and-semantics-of-mindfulness-meditation/>`__
(Japanese)
- `プロトタイプの開発:バイノーラルビート <https://accel-brain.com/social-structure-of-buddhism-and-semantics-of-mindfulness-meditation/3/#i-6>`__
Author
------
- chimera0(RUM)
Author's websites
~~~~~~~~~~~~~~~~~
- `Accel Brain <https://accel-brain.com>`__ (Japanese)
References
~~~~~~~~~~
- Brandy, Queen., et al., (2003) “Binaural Beat Induced Theta EEG
Activity and Hypnotic Susceptibility : Contradictory Results and
Technical Considerations,” American Journal of Clinical Hypnosis,
pp295-309.
- Green, Barry., Gallwey, W. Timothy., (1986) The Inner Game of Music,
Doubleday.
- Kennerly, Richard Cauley., (1994) An empirical investigation into the
effect of beta frequency binaural beat audio signals on four measures
of human memory, Department of Psychology, West Georgia College,
Carrolton, Georgia.
- Kim, Jeansok J., Lee, Hongjoo J., Han, Jung-Soo., Packard, Mark G.
(2001) “Amygdala Is Critical for Stress-Induced Modulation of
Hippocampal Long-Term Potentiation and Learning,” The Journal of
Neuroscience, Vol. 21, pp5222-5228.
- LeDoux, Joseph. (1998) The emotional brain : the mysterious
underpinnings of emotional life, London : Weidenfeld & Nicolson.
- McEwen, Bruce S., Sapolsky, Robert M. (1995) “Stress and cognitive
function,” Current Opinion in Neurobiology, Vol. 5, pp205-216.
- Oster, Gerald., (1973) “Auditory Beats in the Brain,” Scientific
American, pp94-102.
- Radford, Benjamin., (2001) “Pokemon Contagion: Photosensitive
Epilepsy or Mass Psychogenic Illness?,” Southern Medical Journal,
Vol. 94, No. 2, pp197-204.
- Steward, Oswald., (2000) Functional neuroscience, Springer.
- Swann, R., et al. (1982) The Brain ? A User’s Manual, New York: G. P.
Putnam’s Sons.
- Takeo, Takahashi., et al., (1999) “Pokemon seizures,” Neurol J
Southeast Asia, Vol. 4, pp1-11.
- Vollenweider., Franz X., Geyer., Mark A. (2001) “A systems model of
altered consciousness: Integrating natural and drug-induced
psychoses,” Brain Research Bulletin, Vol. 56, No. 5, pp495-507.
- Wahbeh, Helane., Calabrese, Carlo., Zwickey, Heather., (2007)
“Binaural Beat Technology in Humans : A Pilot Study to Assess
Psychologic and Physiologic Effects,” The Journal of Alternative and
Complementary Medicine, Vol. 13, No. 1, pp25-32.
- Westman, Jack C., Walters, James R. (1981) “Noise and Stress : A
Comprehensive Approach,” Environmental Health Perspectives, Vol. 41,
pp291-309.
| AccelBrainBeat | /AccelBrainBeat-1.0.5-py3-none-any.whl/AccelBrainBeat-1.0.5.dist-info/DESCRIPTION.rst | DESCRIPTION.rst |
Accern for Python
=================
.. image:: https://raw.githubusercontent.com/Accern/accern-python/master/docs/_static/accern.png
:target: _static/accern.png
|pypi| |circleci| |sphinx|
.. snip
A Python library for consuming Accern's V4 REST API for Titan streaming/historical data.
Overview
--------
Accern is a fast-growing NYC startup that is disrupting the way quantitative
hedge funds can gain a competitive advantage using news and social media data.
It currently has the world’s largest financial news coverage, covering over
1 billion public news websites, blogs, financial documents, and social media
websites. Furthermore, Accern derives proprietary analytics from each news
story to help quantitative hedge funds make accurate trading decisions.
Accern consolidates multiple news data feeds into one to help drastically reduce
costs of both small and large hedge funds. With Accern proprietary data filters, we
are able to deliver relevant articles to clients with a 99 percent accuracy rate.
Accern’s delivery mechanism is a RESTful API where it delivers derived analytics
from news articles, including the original article URLs so quantitative hedge
funds can derive their own analytics in-house from the relevant articles.
The Accern library for Python helps users get fast, flexible data structures from
Accern's V4 Titan streaming/historical data.
.. snap
Install
------------
.. code-block:: console
pip install accern
Quick Start
---------------
1. Contact `[email protected]` and inquire about an Accern API token.
2. To quickly start using the Accern API, create an API instance and pass your token:
.. code-block:: python
from accern import API
token = 'YOUR TOKEN'
Client = API(token)
3. Pass params to get filtered data and make an API request.
.. code-block:: python
schema = {
'filters': {
'entity_ticker': 'AAPL'
}
}
resp = Client.request(schema)
4. Accern ``Historical API`` will be available in future releases.
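The package also ships a streaming client in ``accern.stream``. A minimal
sketch based on the classes defined there (``StreamClient`` and
``StreamListener``) looks like this:

.. code-block:: python

    from accern.stream import StreamClient, StreamListener

    class MyListener(StreamListener):
        def on_data(self, raw_data):
            print(raw_data)

    token = 'YOUR TOKEN'
    schema = {'filters': {'entity_ticker': 'AAPL'}}
    stream = StreamClient(MyListener(), token, schema)
    stream.performs()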
For more information see the `full documentation
<https://accern-python.readthedocs.io>`_ on Read The Docs.
Non default I/O urls
---------------------
The I/O urls can be changed by defining a config file:
.. code-block:: python
from accern import set_config_file
set_config_file("new-config-file.json")
The expected content of the config file can be found in `accern/config.py`
under `CONFIG_DEFAULT`.
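For reference, a minimal sketch of such a file might look like the following.
The key names are inferred from how the clients read the configuration
(``io``, ``v4_api``, ``v4_stream``); the URLs are placeholders, so consult
``CONFIG_DEFAULT`` for the authoritative keys and values:

.. code-block:: json

    {
        "io": {
            "production": "https://example.invalid/io"
        },
        "v4_api": "https://example.invalid/v4/api",
        "v4_stream": "https://example.invalid/v4/stream"
    }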
.. |circleci| image:: https://circleci.com/gh/Accern/accern-python.svg?style=shield&circle-token=4a51eaa89bd79c92bb9df0e48642146ad7091afc
:target: https://circleci.com/gh/Accern/accern-python
.. |sphinx| image:: https://readthedocs.org/projects/accern-python/badge/?version=latest
:target: http://accern-python.readthedocs.io/en/latest/?badge=latest
.. |pypi| image:: https://badge.fury.io/py/Accern.svg
:target: https://badge.fury.io/py/Accern
| Accern | /Accern-0.4.0.tar.gz/Accern-0.4.0/README.rst | README.rst |
from datetime import datetime
import logging
import os
import re
import sys
import six
try:
from urlparse import urlsplit, urlunsplit
except ImportError:
from urllib.parse import urlsplit, urlunsplit
try:
from urllib import urlencode
except ImportError:
from urllib.parse import urlencode
ACCERN_LOG = os.environ.get('ACCERN_LOG')
__all__ = [
'datetime',
'json',
'log_debug',
'log_info',
'logfmt',
'six',
'urlencode',
'urlsplit',
'urlunsplit'
]
try:
import json
except ImportError:
json = None
if not (json and hasattr(json, 'loads')):
try:
import simplejson as json
except ImportError:
if not json:
raise ImportError(
"Accern requires a JSON library, such as simplejson. "
"HINT: Try installing the "
"python simplejson library via 'pip install simplejson' or "
"'easy_install simplejson', or contact [email protected] "
"with questions.")
else:
raise ImportError(
"Accern requires a JSON library with the same interface as "
"the Python 2.6 'json' library. You appear to have a 'json' "
"library with a different interface. Please install "
"the simplejson library. HINT: Try installing the "
"python simplejson library via 'pip install simplejson' "
"or 'easy_install simplejson', or contact [email protected]"
"with questions.")
def utf8(value):
if six.PY2 and isinstance(value, unicode):
return value.encode('utf-8')
return value
def _console_log_level():
if ACCERN_LOG in ['debug', 'info']:
return ACCERN_LOG
return None
def log_debug(message, **params):
msg = logfmt(dict(message=message, **params))
if _console_log_level() == 'debug':
        print(msg, file=sys.stderr)
logger = logging.getLogger('accern')
logger.debug(msg)
def log_info(message, **params):
msg = logfmt(dict(message=message, **params))
if _console_log_level() in ['debug', 'info']:
        print(msg, file=sys.stderr)
logger = logging.getLogger('accern')
logger.info(msg)
def logfmt(props):
def fmt(key, val):
# Handle case where val is a bytes or bytesarray
if six.PY3 and hasattr(val, 'decode'):
val = val.decode('utf-8')
# Check if val is already a string to avoid re-encoding into
# ascii. Since the code is sent through 2to3, we can't just
# use unicode(val, encoding='utf8') since it will be
# translated incorrectly.
if not isinstance(val, six.string_types):
val = six.text_type(val)
if re.search(r'\s', val):
val = repr(val)
# key should already be a string
if re.search(r'\s', key):
key = repr(key)
return u'{key}={val}'.format(key=key, val=val)
return u' '.join([fmt(key, val) for key, val in sorted(props.items())]) | Accern | /Accern-0.4.0.tar.gz/Accern-0.4.0/accern/util.py | util.py |
from accern import default_client, error, util
from accern.default_client import AccernClient
from accern.schema import Schema
from accern.config import get_config
def get_api_base(env):
try:
return get_config()["io"][env]
except KeyError:
raise ValueError("Unknown env type: {0}".format(env))
class HistoricalClient(AccernClient):
"""Perform requests to the Accern API web services."""
def __init__(self, token=None, client=None, env=None):
"""Intialize with params.
:param client: default http client. Optional
:param token: Accern API token. Required.
"""
self.env = "production" if env is None else env
self.api_base = get_api_base(self.env)
self.token = token
self._client = client or default_client.new_http_client()
@staticmethod
def interpret_response(rbody, rcode, rheaders):
try:
if hasattr(rbody, 'decode'):
rbody = rbody.decode('utf-8')
resp = util.json.loads(rbody)
except Exception:
raise error.APIError(
"Invalid response body from API: %s "
"(HTTP response code was %d)" % (rbody, rcode),
rbody, rcode, rheaders)
if not 200 <= rcode < 300:
AccernClient.handle_error(rbody, rcode, resp)
return resp
def create_job(self, schema):
"""Create a job with schema.
:param schema: job detail, will be added to payload
:raises ApiError: when the API returns an error.
:raises Timeout: if the request timed out.
:raises TransportError: when something went wrong while trying to
            execute a request.
"""
schema = Schema.validate_schema(method='historical', schema=schema)
token = AccernClient.check_token(self.token)
method = 'POST'
headers = AccernClient.build_api_headers(token, method)
if method == 'POST':
post_data = util.json.dumps({'query': schema})
else:
post_data = None
rbody, rcode, rheaders = self._client.request(method, self.api_base, headers, post_data)
resp = self.interpret_response(rbody, rcode, rheaders)
return resp
def get_jobs(self, job_id=None):
"""Get the user's job history.
:param job_id: if job_id is valid, will return the job related
"""
token = AccernClient.check_token(self.token)
method = 'GET'
headers = AccernClient.build_api_headers(token, method)
if job_id is None:
rbody, rcode, rheaders = self._client.request(method, self.api_base, headers, post_data=None)
resp = self.interpret_response(rbody, rcode, rheaders)
else:
rbody, rcode, rheaders = self._client.request(method, '%s/%s' % (self.api_base, job_id), headers, post_data=None)
resp = self.interpret_response(rbody, rcode, rheaders)
return resp | Accern | /Accern-0.4.0.tar.gz/Accern-0.4.0/accern/historical.py | historical.py |
from os.path import dirname
from accern import error, util
MODULE_PATH = dirname(__file__)
FIELD_OPTIONS = util.json.load(open("%s/data/options.json" % MODULE_PATH))
class Schema(object):
@staticmethod
def _validate_categorical(field, value):
VALUE = FIELD_OPTIONS[field]['value']
return {
'field': field,
'type': 'categorical',
'value': {
'valid': [v for v in value if v in VALUE],
'invalid': [v for v in value if v not in VALUE]
}
}
@staticmethod
def _validate_range(field, value):
RANGE = FIELD_OPTIONS[field]['value']
if not isinstance(value, list):
return {
'field': field,
'error': 'Invalid value type of field.'
}
if len(value) != 2:
return {
'field': field,
'type': 'range',
'error': '"%s" has wrong number or arguments.' % (field)
}
if value[1] < value[0]:
return {
'field': field,
'type': 'range',
'error': '"%s" has malformed filter option value.' % (field)
}
if value[0] >= RANGE[0] and value[1] <= RANGE[1]:
return {
'field': field,
'type': 'range',
'value': {
'valid': True,
'default_range': RANGE
}
}
return {
'field': field,
'type': 'range',
'value': {
'valid': False,
'default_range': RANGE
}
}
@staticmethod
def _validate_norange(field, value):
if len(value) != 2:
return {
'field': field,
'type': 'no range',
'error': '"%s" has wrong number or arguments.' % (field)
}
if value[1] < value[0]:
return {
'field': field,
'type': 'range',
'error': '"%s" has malformed filter option value.' % (field)
}
return {
'field': field,
'type': 'no range'
}
@classmethod
def get_fields(cls):
return [
v for v in sorted(FIELD_OPTIONS.keys())
if ('filter' in FIELD_OPTIONS[v]['method']
or 'url_param' in FIELD_OPTIONS[v]['method'])]
@classmethod
def get_options(cls, field):
try:
if ('filter' in FIELD_OPTIONS[field]['method']
or 'url_param' in FIELD_OPTIONS[field]['method']):
options = FIELD_OPTIONS[field]
del options['method']
return options
return None
except KeyError:
raise error.SchemaError(
'Invalid field (%s) in filter option.' % field)
@classmethod
def get_url_params(cls):
return [
k for k, v in FIELD_OPTIONS.items()
if 'url_param' in v['method']]
@classmethod
def validate_options(cls, field=None, value=None):
if field not in FIELD_OPTIONS:
raise error.SchemaError(
'Invalid field (%s) in filter option.' % field)
if value is None:
raise error.SchemaError(
'No filter option value for "%s".' % field)
types = {
'categorical': cls._validate_categorical,
'norange': cls._validate_norange,
'range': cls._validate_range,
'other': lambda field, value: {'field': field, 'value': value}
}
return types[FIELD_OPTIONS[field]['type']](field, value)
@classmethod
def validate_schema_filters(cls, method, filters):
if isinstance(filters, list):
if method != 'historical':
raise error.SchemaError(
'Method "%s" does not support multiple filters.' % method)
for f in filters:
cls.validate_schema_filters(method=method, filters=f)
return
for f in filters:
if (isinstance(filters[f], list)
and any(isinstance(el, list) for el in filters[f])):
for el in filters[f]:
resp = cls.validate_options(field=f, value=el)
if 'error' in resp:
raise error.SchemaError(resp['error'])
else:
resp = cls.validate_options(field=f, value=filters[f])
if 'error' in resp:
raise error.SchemaError(resp['error'])
@classmethod
def validate_schema_select(cls, method, select):
if not select:
return None
if method in ['api', 'stream']:
if any('function' in el for el in select):
raise error.SchemaError('Method "%s" does not support select field functions.' % method)
if isinstance(select, list):
if any('field' not in v for v in select):
raise error.SchemaError('Missing "field" in select option.')
return [{'field': v['field'], 'alias': v.get('alias', v['field'])} for v in select]
if 'field' not in select:
raise error.SchemaError('Missing "field" in select option.')
return {'field': select['field'], 'alias': select.get('alias', select['field'])}
try:
if any(v["field"] == 'harvested_at' and v.get("function") is not None for v in select):
select = [{
'field': v['field'],
'alias': v.get('alias', v['field']),
'function': v.get('function', FIELD_OPTIONS[v['field']]['function'][0])
} for v in select]
if any(v['field'] == 'harvested_at' and v['alias'] != v['function'] for v in select):
raise error.SchemaError("Alias of harvested_at is different from it's aggregation function.")
else:
select = [
{
'field': v['field'],
'alias': v.get('alias', v['field'])
} for v in select]
return select
except KeyError:
raise error.SchemaError('Missing "field" in select option.')
@classmethod
def validate_schema(cls, method=None, schema=None):
if schema is None:
return None
schema = {k.lower(): v for k, v in schema.items()}
if method is None:
raise error.SchemaError('Method is missing.')
if method not in ['api', 'historical', 'stream']:
raise error.SchemaError('Illegal usage of validate schema function.')
if method in ['api', 'stream']:
if 'name' in schema:
raise error.SchemaError('Illegal "name" in %s schema.' % method)
if 'description' in schema:
raise error.SchemaError('Illegal "description" in %s schema.' % method)
elif method == 'historical':
if 'name' not in schema:
raise error.SchemaError('Required field "name" not found in %s schema.' % method)
if 'description' not in schema:
raise error.SchemaError('Required field "description" not found in %s schema.' % method)
if method in ['api', 'historical', 'stream']:
filters = schema.get('filters', {})
cls.validate_schema_filters(method=method, filters=filters)
select = schema.get('select', {})
select = cls.validate_schema_select(method=method, select=select)
if select is not None:
schema['select'] = select
return schema | Accern | /Accern-0.4.0.tar.gz/Accern-0.4.0/accern/schema.py | schema.py |
from __future__ import print_function
import codecs
import re
import time
import requests
from accern import util
from accern.default_client import AccernClient, Event
from accern.schema import Schema
from accern.config import get_config
END_OF_FIELD = re.compile(r'\r\n\r\n|\r\r|\n\n')
class StreamListener(AccernClient):
def on_data(self, raw_data):
"""Call when raw data is received from connection.
Override this method if you want to manually handle stream data. Return
False to stop stream and close connection.
"""
if 'disconnect' in raw_data:
if self.on_disconnect(raw_data['disconnect']) is False:
return False
return raw_data
@staticmethod
def on_disconnect(notice):
"""Call when server return a disconnect notice."""
if notice == 'disconnect':
return False
return True
class StreamClient(object):
"""Perform requests to the Accern API web services."""
def __init__(self, listener, token=None, schema=None, **kwargs):
"""Intialize with params.
:param client: default http client. Optional
:param token: Accern API token. Required.
"""
self._listener = listener or StreamListener()
self.api_base = get_config()["v4_stream"]
# Keep data here as it streams in
self.buf = u''
self.chunk_size = kwargs.get('chunk_size', 1024)
self.last_id = kwargs.get('last_id', None)
self.schema = schema or {}
# Any extra kwargs will be fed into the requests.get call later.
self.requests_kwargs = kwargs.get('request', {})
self.resp = None
self.resp_iterator = None
self.retry = kwargs.get('retry', 3000)
self.timeout = kwargs.get('timeout', 300.0)
self.token = token or None
self.url = None
# The SSE spec requires making requests with Cache-Control: nocache
if 'headers' not in self.requests_kwargs:
self.requests_kwargs['headers'] = {}
self.requests_kwargs['headers']['Cache-Control'] = 'no-cache'
# The 'Accept' header is not required, but explicit > implicit
self.requests_kwargs['headers']['Accept'] = 'text/event-stream'
self.new_session()
def __iter__(self):
return self
def __next__(self):
decoder = codecs.getincrementaldecoder(self.resp.encoding)(errors='replace')
while not self._event_complete():
try:
next_chunk = next(self.resp_iterator)
if not next_chunk:
raise EOFError()
self.buf += decoder.decode(next_chunk)
except (StopIteration, requests.RequestException, EOFError):
time.sleep(self.retry / 1000.0)
self._run()
# The SSE spec only supports resuming from a whole message, so
# if we have half a message we should throw it out.
head, sep, _ = self.buf.rpartition('\n')
self.buf = head + sep
continue
# Split the complete event (up to the END_OF_FIELD) into event_string,
# and retain anything after the current complete event in self.buf
# for next time.
(event_string, self.buf) = re.split(END_OF_FIELD, self.buf, maxsplit=1)
msg = Event.parse(event_string)
# If the server requests a specific retry delay, we need to honor it.
if msg.retry:
self.retry = msg.retry
# last_id should only be set if included in the message. It's not
# forgotten if a message omits it.
if msg.event_id:
self.last_id = msg.event_id
if msg.data:
raw_data = util.json.loads(util.json.loads(msg.data)['data'])['signals']
data = AccernClient.quant_filter(self.schema, raw_data)
data = AccernClient.select_fields(self.schema, data)
if self._listener.on_data(data) is False:
return False
return msg
def _event_complete(self):
return re.search(END_OF_FIELD, self.buf) is not None
def _run(self):
if self.last_id:
self.requests_kwargs['headers']['Last-Event-ID'] = self.last_id
# Use session if set. Otherwise fall back to requests module.
requester = self.session or requests
self.resp = requester.get(self.url, stream=True)
self.resp_iterator = self.resp.iter_content(chunk_size=self.chunk_size)
# TODO: Ensure we're handling redirects. Might also stick the 'origin'
# attribute on Events like the Javascript spec requires.
self.resp.raise_for_status()
while next(self, None) is not None:
next(self, None)
def new_session(self):
self.session = requests.Session()
self.session.params = None
def performs(self):
"""Perform HTTP GET/POST with credentials.
:param output: output config.
:param params: HTTP GET parameters.
:raises ApiError: when the API returns an error.
:raises Timeout: if the request timed out.
:raises TransportError: when something went wrong while trying to
            execute a request.
"""
print('%s - Start streaming, use [Ctrl+C] to stop...' % (util.datetime.now()))
schema = Schema.validate_schema(method='stream', schema=self.schema)
params = AccernClient.build_api_params(schema)
params['token'] = AccernClient.check_token(self.token)
encoded_params = util.urlencode(list(AccernClient.api_encode(params or {})))
self.url = AccernClient.build_api_url(self.api_base, encoded_params)
try:
self._run()
except KeyboardInterrupt:
print('%s - Streaming stopped...' % (util.datetime.now()))
else:
pass
if util.six.PY2:
next = __next__ | Accern | /Accern-0.4.0.tar.gz/Accern-0.4.0/accern/stream.py | stream.py |
from accern import error, util
from accern.default_client import AccernClient, new_http_client
from accern.schema import Schema
from accern.config import get_config
class API(AccernClient):
"""Perform requests to the Accern API web services."""
def __init__(self, token=None, client=None):
"""Intialize with params.
:param client: default http client. Optional
:param token: Accern API token. Required.
"""
self.api_base = get_config()["v4_api"]
self.token = token
self._client = client or new_http_client()
@staticmethod
def interpret_response(rbody, rcode, rheaders, schema):
try:
if hasattr(rbody, 'decode'):
rbody = rbody.decode('utf-8')
resp = util.json.loads(rbody)
except Exception:
raise error.APIError(
"Invalid response body from API: %s "
"(HTTP response code was %d)" % (rbody, rcode),
rbody, rcode, rheaders)
if not 200 <= rcode < 300:
raise error.APIError('API request failed.')
if resp['total'] > 0:
resp['signals'] = AccernClient.quant_filter(
schema, resp['signals'])
resp['signals'] = AccernClient.select_fields(
schema, resp['signals'])
resp['total'] = len(resp['signals'])
return resp
def request(self, schema=None):
rbody, rcode, rheaders = self.request_raw(schema)
resp = self.interpret_response(rbody, rcode, rheaders, schema)
return resp
def request_raw(self, schema):
"""Perform HTTP GET with credentials.
:raises AuthenticationError: when the token is invalid.
"""
schema = Schema.validate_schema(method='api', schema=schema)
params = AccernClient.build_api_params(schema)
params['token'] = AccernClient.check_token(self.token)
encoded_params = util.urlencode(list(AccernClient.api_encode(params)))
abs_url = AccernClient.build_api_url(self.api_base, encoded_params)
rbody, rcode, rheaders = self._client.request(
'GET', abs_url, headers=None, post_data=None)
return rbody, rcode, rheaders | Accern | /Accern-0.4.0.tar.gz/Accern-0.4.0/accern/api.py | api.py |
from datetime import datetime
import re
import sys
import textwrap
from accern import error, util
from accern.schema import Schema
try:
import requests
except ImportError:
requests = None
else:
try:
# Require version 0.8.8, but don't want to depend on distutils
version = requests.__version__
major, minor, patch = [int(i) for i in version.split('.')]
# pylint: disable=broad-except
except Exception:
# Probably some new-fangled version, so it should support verify
pass
else:
if (major, minor, patch) < (2, 20, 0):
sys.stderr.write(
'Warning: the Accern library requires that your Python '
'"requests" library be newer than version 2.20.0, but your '
'"requests" library is version %s. Accern will fall back to '
'an alternate HTTP library so everything should work. We '
'recommend upgrading your "requests" library. If you have any '
'questions, please contact [email protected]. (HINT: running '
'"pip install -U requests" should upgrade your requests '
'library to the latest version.)' % (version,))
requests = None
__all__ = [
'AccernClient',
'Event',
'new_http_client'
]
QUANT = [
'event_impact_gt_mu_add_sigma',
'event_impact_gt_mu_pos_add_sigma_pos',
'event_impact_gt_mu_pos_add_2sigma_pos',
'event_impact_gt_1pct_pos',
'event_impact_lt_mu_sub_sigma',
'event_impact_lt_mu_neg_sub_sigma_neg',
'event_impact_lt_mu_neg_sub_2sigma_neg',
'event_impact_lt_1pct_neg',
'event_impact_neg',
'event_impact_pct_change_avg',
'event_impact_pct_change_stdev',
'event_impact_pos',
'event_relevance',
'event_sentiment',
'event_source_timeliness_score',
'entity_exchange',
'entity_relevance',
'entity_sentiment',
'harvested_at',
'story_group_sentiment_avg',
'story_group_sentiment_stdev',
'story_group_count',
'story_group_traffic_sum',
'story_sentiment',
'story_traffic'
]
def new_http_client(*args, **kwargs):
return RequestsClient(*args, **kwargs)
class AccernClient(object):
@staticmethod
def api_encode(params):
if isinstance(params, object):
for key, value in util.six.iteritems(params):
key = util.utf8(key)
if value is None:
continue
elif isinstance(value, list):
yield (key, ",".join(value))
else:
yield (key, util.utf8(value))
@staticmethod
def build_api_url(url, query):
scheme, netloc, path, base_query, fragment = util.urlsplit(url)
if base_query:
query = '%s&%s' % (base_query, query)
return util.urlunsplit((scheme, netloc, path, query, fragment))
@staticmethod
def build_api_headers(token, method):
if method == "POST":
return {
'Content-Type': 'application/json',
'IO-Authorization': token
}
if method == "GET":
return {
'IO-Authorization': token
}
raise ValueError("Unknown API method: {0}".format(method))
@staticmethod
def build_api_params(schema):
if schema is None:
filters = {}
else:
filters = schema.get('filters', {})
avail_params = [value for value in filters if value in Schema.get_url_params()]
params = {key: filters[key] for key in avail_params}
if 'harvested_at' in filters:
del params['harvested_at']
params['from'] = filters['harvested_at'][0]
return params
@classmethod
def check_values(cls, raw_data, f, f_values):
if f == 'harvested_at':
for data in raw_data:
if (datetime.strptime(data[f], '%Y-%m-%dT%H:%M:%S.%fZ') >= datetime.strptime(f_values[0], '%Y-%m-%d %H:%M:%S')) and \
(datetime.strptime(data[f], '%Y-%m-%dT%H:%M:%S.%fZ') <= datetime.strptime(f_values[1], '%Y-%m-%d %H:%M:%S')):
yield data
else:
for data in raw_data:
for value in f_values:
if data[f] >= value[0] and data[f] <= value[1]:
yield data
@staticmethod
def check_token(token):
if token:
my_token = token
else:
from accern import token
my_token = token
if my_token is None:
raise error.AuthenticationError('No token provided.')
return my_token
@staticmethod
def handle_error(rbody, rcode, resp):
try:
error_data = resp['error']
except (KeyError, TypeError):
raise error.APIError(
"Invalid response object from API: %r (HTTP response code "
"was %d)" % (rbody, rcode), rbody, rcode, resp)
if rcode == 400:
raise error.AccernError(error_data, rbody, rcode)
@staticmethod
def select_fields(schema, raw_data):
if schema is None:
select = []
else:
select = schema.get('select', [])
names = [option['alias'] if 'alias' in option else option['field'] for option in select]
fields = [option['field'] for option in select]
data_selected = []
for data in raw_data:
if bool(select) > 0:
try:
if isinstance(select, list):
new_data = dict(zip(names, [data[field] for field in fields]))
else:
raise error.AccernError('Select field should be a list.')
except KeyError:
raise error.AccernError('Invalid select values passed.')
else:
new_data = data
if 'entity_competitors' in new_data:
new_data['entity_competitors'] = ' | '.join(new_data['entity_competitors'])
if 'entity_indices' in new_data:
new_data['entity_indices'] = ' | '.join(new_data['entity_indices'])
data_selected.append(new_data)
return data_selected
@classmethod
def quant_filter(cls, schema, raw_data):
if schema is None:
filters = {}
else:
filters = schema.get('filters', {})
for f in filters:
if f in QUANT:
data_filtered = list(cls.check_values(raw_data, f, filters[f]))
raw_data = data_filtered
return raw_data
class Event(object):
SSE_LINE_PATTERN = re.compile('(?P<name>[^:]*):?( ?(?P<value>.*))?')
def __init__(self, data='', event='message', event_id=None, retry=None):
self.data = data
self.event = event
self.event_id = event_id
self.retry = retry
def dump(self):
lines = []
if self.event_id:
lines.append('id: %s' % self.event_id)
# Only include an event line if it's not the default already.
if self.event != 'message':
lines.append('event: %s' % self.event)
if self.retry:
lines.append('retry: %s' % self.retry)
lines.extend('data: %s' % d for d in self.data.split('\n'))
return '\n'.join(lines) + '\n\n'
@classmethod
def parse(cls, raw):
"""Given a possibly-multiline string representing an SSE message, parse it and return a Event object."""
msg = cls()
for line in raw.splitlines():
m = cls.SSE_LINE_PATTERN.match(line)
if m is None:
# Malformed line. Discard but warn.
                sys.stderr.write('Invalid SSE line: "%s"\n' % line)
continue
name = m.group('name')
if name == '':
# line began with a ":", so is a comment. Ignore
continue
value = m.group('value')
if name == 'data':
# If we already have some data, then join to it with a newline.
# Else this is it.
if msg.data:
msg.data = '%s\n%s' % (msg.data, value)
else:
msg.data = value
elif name == 'event':
msg.event = value
elif name == 'id':
msg.event_id = value
elif name == 'retry':
msg.retry = int(value)
return msg
def __str__(self):
return self.data
class HTTPClient(object):
def request(self, method, url, headers, post_data=None):
raise NotImplementedError(
'HTTPClient subclasses must implement `request`')
class RequestsClient(HTTPClient):
name = 'requests'
def __init__(self, timeout=80, session=None, **kwargs):
super(RequestsClient, self).__init__(**kwargs)
self._timeout = timeout
self._session = session or requests.Session()
def request(self, method, url, headers, post_data=None):
kwargs = {}
try:
try:
result = self._session.request(method,
url,
headers=headers,
data=post_data,
timeout=self._timeout,
**kwargs)
except TypeError as e:
raise TypeError(
'Warning: It looks like your "requests" library is out of '
'date. You can fix that by running "pip install -U requests".) '
'The underlying error was: %s' % (e))
content = result.content
status_code = result.status_code
# pylint: disable=broad-except
except Exception as e:
# Would catch just requests.exceptions.RequestException, but can
# also raise ValueError, RuntimeError, etc.
self.handle_request_error(e)
return content, status_code, result.headers
@staticmethod
def handle_request_error(e):
if isinstance(e, requests.exceptions.RequestException):
msg = ("Unexpected error communicating with Accern. "
"If this problem persists, let us know at "
"[email protected].")
err = "%s: %s" % (type(e).__name__, str(e))
else:
msg = ("Unexpected error communicating with Accern. "
"It looks like there's probably a configuration "
"issue locally. If this problem persists, let us "
"know at [email protected].")
err = "A %s was raised" % (type(e).__name__,)
if str(e):
err += " with error message %s" % (str(e),)
else:
err += " with no error message"
msg = textwrap.fill(msg) + "\n\n(Network error: %s)" % (err,)
raise error.APIConnectionError(msg) | Accern | /Accern-0.4.0.tar.gz/Accern-0.4.0/accern/default_client.py | default_client.py |
import inspect
class AccessException(Exception): pass
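# What this module provides (summarized from the code below):
#   @access     -- class decorator that rebuilds the class, wrapping its methods
#                  according to the access level attached by the decorators below
#   @private    -- the method may only be called from within the same class
#   @protected  -- the method may only be called from the class or its subclasses
#   @public     -- the default; no restriction is applied
# Violations raise AccessException at call time.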
isFunc = lambda f: isinstance(f, type(lambda: 1))
isClass = lambda c: hasattr(c, '__call__') and not isFunc(c)
hasAttribute = \
lambda cls, caller: \
hasattr(cls, caller) or any(map(
lambda x: hasattr(x, caller),
list(filter(isClass, dict(cls.__dict__).values()))
))
def privatefunc(cls, f):
def _(*args, **kwargs):
caller = inspect.stack()[1].function
if not hasAttribute(cls, caller):
raise AccessException("Private function cannot be accessed.")
return f(*args, **kwargs)
return _
def protectedfunc(cls, f):
def _(*args, **kwargs):
mycls = type(args[0])
caller = inspect.stack()[1].function
if not (
hasAttribute(mycls, caller) and (
cls == mycls or
hasAttribute(cls, caller) or
cls.__name__ in list(map(
lambda x: x.__name__,
mycls.__bases__
))
)
):
raise AccessException("Protected function cannot be accessed.")
return f(*args, **kwargs)
return _
def access(cls):
d = dict(cls.__dict__)
functions = {
key : d[key] for key in d.keys()
if isFunc(d[key])
}
d['__access_modify'] = True
for key in functions:
f = d[key]
acc = getattr(f, 'access', 'public')
if acc == 'private':
d[key] = privatefunc(cls, f)
elif acc == 'public':
d[key] = f
elif acc == 'protected':
d[key] = protectedfunc(cls, f)
return type(cls.__name__, cls.__bases__, d)
def gen(name):
def decorator(func):
setattr(func, 'access', name)
return func
return decorator
private = gen('private')
protected = gen('protected')
public = gen('public')
if __name__ == '__main__':
@access
class test:
@protected
def __init__(self):
self.a = 1
class builder:
def build():
return test()
class Inside:
def b(self): return test.a(1)
print(test.builder.build())
@access
class asdf:
@protected
def a(self): return 1
class qwer(asdf):
def b(self): return self.a()+1
print(qwer().b()) | Access-Modify | /Access_Modify-1.0.8-py3-none-any.whl/access_modify/access_modify.py | access_modify.py |
Pre 3.0 Changelog
=================
2.13.12 (2012-10-31)
--------------------
- LP #1071067: Use a stronger random number generator and a constant time
comparison function.
2.13.11 (2012-10-21)
--------------------
- LP #966101: Recognize special `zope2.Private` permission in ZCML
role directive.
2.13.10 (2012-09-09)
--------------------
- LP #1047318: Tighten import restrictions for restricted code.
2.13.9 (2012-08-23)
-------------------
- Fix a bug in ZopeSecurityPolicy.py. Global variable `rolesForPermissionOn`
could be overridden if `__role__` had custom rolesForPermissionOn.
2.13.8 (2012-06-22)
-------------------
- Add Anonymous as a default role for Public permission.
2.13.7 (2011-12-12)
-------------------
- Exclude compiled `.so` and `.dll` files from source distributions.
2.13.6 (2011-12-12)
-------------------
- Added `manifest.in` to ensure the inclusion of the `include` directory into
the release.
2.13.5 (2011-12-12)
-------------------
- Apply changes made available in `Products.Zope_Hotfix_20111024` and make them
more robust.
2.13.4 (2011-01-11)
-------------------
- Return the created user in _doAddUser.
- Added IUser interface.
- LP #659968: Added support for level argument to the ``__import__`` function
as introduced in Python 2.5. Currently only level=-1 is supported.
2.13.3 (2010-08-28)
-------------------
- Added a ``role`` subdirective for the ``permission`` ZCML directive. If any
roles are specified, they will override the default set of default roles
(Manager).
2.13.2 (2010-07-16)
-------------------
- Added ``override_existing_protection`` parameter to the protectName helper.
2.13.1 (2010-06-19)
-------------------
- Restore security declarations for deprecated ``sets`` module.
2.13.0 (2010-06-19)
-------------------
- Released as separate package.
| AccessControl | /AccessControl-5.5.tar.gz/AccessControl-5.5/HISTORY.rst | HISTORY.rst |
Changelog
=========
For changes before version 3.0, see ``HISTORY.rst``.
5.5 (2022-10-10)
----------------
- Switch from ``-Ofast`` to ``-O3`` when compiling code for Linux wheels.
(`#133 <https://github.com/zopefoundation/AccessControl/pull/133>`_)
- Add support for Python 3.11 (as of 3.11.0rc2).
5.4 (2022-08-26)
----------------
- Add support for Python 3.11 (as of 3.11.0b5).
- Support ``default`` argument in ``next`` built-in function.
(`#131 <https://github.com/zopefoundation/AccessControl/pull/131>`_)
5.3.1 (2022-03-29)
------------------
- Prevent race condition in guarded_import
(`#123 <https://github.com/zopefoundation/AccessControl/issues/123>`_)
5.3 (2022-02-25)
----------------
- Provide ``AccessControl.get_safe_globals`` to facilitate safe use.
- Honor ``PURE_PYTHON`` environment variable to enable python implementation
during runtime.
- Add support for Python 3.10.
5.2 (2021-07-30)
----------------
- Fix Appveyor configuration so tests can run and wheels build.
5.1 (2021-07-30)
----------------
NOTE: This release has been yanked from PyPI due to wheel build issues.
- Fix a remote code execution issue by preventing access to
``string.Formatter`` from restricted code.
5.0 (2020-10-07)
----------------
- Add support for Python 3.9.
- Remove deprecated classes and functions in
(see `#32 <https://github.com/zopefoundation/AccessControl/issues/32>`_):
+ ``AccessControl/DTML.py``
+ ``AccessControl/Owned.py``
+ ``AccessControl/Role.py``
+ ``AccessControl/Permissions.py``
- Add deprecation warnings for BBB imports in:
+ ``AccessControl/AuthEncoding.py``
+ ``AccessControl/Owned.py``
+ ``AccessControl/Role.py``
+ ``AccessControl/User.py``
- Although this version might run on Zope 4, it is no longer supported because
of the dropped deprecation warnings.
4.2 (2020-04-20)
----------------
- Add missing permission ``Manage WebDAV Locks``
- Fix regression for BBB import of ``users.UnrestrictedUser``
(`#94 <https://github.com/zopefoundation/AccessControl/issues/94>`_)
- Add a check if database is present in ``.owner.ownerInfo``.
(`#91 <https://github.com/zopefoundation/AccessControl/issues/91>`_).
4.1 (2019-09-02)
----------------
- Python 3: Allow iteration over the result of ``dict.{keys,values,items}``
(`#89 <https://github.com/zopefoundation/AccessControl/issues/89>`_).
4.0 (2019-05-08)
----------------
Changes since 3.0.12:
- Add support for Python 3.5, 3.6, 3.7 and 3.8.
- Restore simple access to bytes methods in Python 3
(`#83 <https://github.com/zopefoundation/AccessControl/issues/83>`_)
- Clarify deprecation warnings for several BBB shims.
(`#32 <https://github.com/zopefoundation/AccessControl/issues/32>`_)
- Add a test to prove that a user folder flag cannot be acquired elsewhere.
(`#7 <https://github.com/zopefoundation/AccessControl/issues/7>`_)
- Tighten basic auth string handling in ``BasicUserFolder.identify``
(`#56 <https://github.com/zopefoundation/AccessControl/issues/56>`_)
- Prevent the Zope 4 ZMI from showing an add dialog for the user folder.
(`#82 <https://github.com/zopefoundation/AccessControl/issues/82>`_)
- Fix order of roles returned by
``AccessControl.rolemanager.RoleManager.userdefined_roles``.
- Add configuration for `zodbupdate`.
- Add ``TaintedBytes`` besides ``TaintedString`` in ``AccessControl.tainted``.
(`#57 <https://github.com/zopefoundation/AccessControl/issues/57>`_)
- Security fix: In ``str.format``, check the security for attributes that are
accessed. (Ported from 2.13).
- Port ``override_container`` context manager here from 2.13.
- Add AppVeyor configuration to automate building Windows eggs.
- Fix for compilers that only support C89 syntax (e.g. on Windows).
- Sanitize and test `RoleManager` role handling.
- Depend on RestrictedPython >= 4.0.
- #16: Fixed permission handling by avoiding column and row numbers as
identifiers for permissions and roles.
- Extract ``.AuthEncoding`` to its own package for reuse.
- Declare missing dependency on BTrees.
- Drop `Record` dependency, which now does its own security declaration.
- Remove leftovers from history support dropped in Zope.
- Remove duplicate guard against * imports.
(`#60 <https://github.com/zopefoundation/AccessControl/issues/60>`_)
3.0.12 (2015-12-21)
-------------------
- Avoid acquiring ``access`` from module wrapped by
``SecurityInfo._ModuleSecurityInfo``. See:
https://github.com/zopefoundation/AccessControl/issues/12
3.0.11 (2014-11-02)
-------------------
- Harden test fix for machines that do not define `localhost`.
3.0.10 (2014-11-02)
-------------------
- Test fix for machines that do not define `localhost`.
3.0.9 (2014-08-08)
------------------
- GitHub #6: Do not pass SecurityInfo instance itself to declarePublic/declarePrivate
when using the public/private decorator. This fixes ``Conflicting security declarations``
warnings on Zope startup.
- LP #1248529: Leave existing security manager in place inside
``RoleManager.manage_getUserRolesAndPermissions``.
3.0.8 (2013-07-16)
------------------
- LP #1169923: ensure initialization of shared ``ImplPython`` state
(used by ``ImplC``) when using the "C" security policy. Thanks to
Arnaud Fontaine for the patch.
3.0.7 (2013-05-14)
------------------
- Remove long-deprecated 'Shared' roles support (pre-dates Zope, never
used by Zope itself)
- Prevent infinite loop when looking up local roles in an acquisition chain
with cycles.
3.0.6 (2012-10-31)
------------------
- LP #1071067: Use a stronger random number generator and a constant time
comparison function.
3.0.5 (2012-10-21)
------------------
- LP #966101: Recognize special `zope2.Private` permission in ZCML
role directive.
3.0.4 (2012-09-09)
------------------
- LP #1047318: Tighten import restrictions for restricted code.
3.0.3 (2012-08-23)
------------------
- Fix a bug in ZopeSecurityPolicy.py. Global variable `rolesForPermissionOn`
could be overridden if `__role__` had custom rolesForPermissionOn.
3.0.2 (2012-06-22)
------------------
- Add Anonymous as a default role for Public permission.
3.0.1 (2012-05-24)
------------------
- Fix tests under Python 2.6.
3.0 (2012-05-12)
----------------
- Added decorators for public, private and protected security declarations.
- Update tests to take advantage of automatic test suite discovery.
| AccessControl | /AccessControl-5.5.tar.gz/AccessControl-5.5/CHANGES.rst | CHANGES.rst |
.. image:: https://github.com/zopefoundation/AccessControl/actions/workflows/tests.yml/badge.svg
:target: https://github.com/zopefoundation/AccessControl/actions/workflows/tests.yml
.. image:: https://coveralls.io/repos/github/zopefoundation/AccessControl/badge.svg?branch=master
:target: https://coveralls.io/github/zopefoundation/AccessControl?branch=master
.. image:: https://img.shields.io/pypi/v/AccessControl.svg
:target: https://pypi.org/project/AccessControl/
:alt: Current version on PyPI
.. image:: https://img.shields.io/pypi/pyversions/AccessControl.svg
:target: https://pypi.org/project/AccessControl/
:alt: Supported Python versions
AccessControl
=============
AccessControl provides a general security framework for use in Zope.
| AccessControl | /AccessControl-5.5.tar.gz/AccessControl-5.5/README.rst | README.rst |
set -e -x
# Running inside docker
# Set a cache directory for pip. This was
# mounted to be the same as it is outside docker so it
# can be persisted.
export XDG_CACHE_HOME="/cache"
# XXX: This works for macOS, where everything bind-mounted
# is seen as owned by root in the container. But when the host is Linux
# the actual UIDs come through to the container, triggering
# pip to disable the cache when it detects that the owner doesn't match.
# The below is an attempt to fix that, taken from bcrypt. It seems to work on
# Github Actions.
if [ -n "$GITHUB_ACTIONS" ]; then
echo Adjusting pip cache permissions
mkdir -p $XDG_CACHE_HOME/pip
chown -R $(whoami) $XDG_CACHE_HOME
fi
ls -ld /cache
ls -ld /cache/pip
# We need some libraries because we build wheels from scratch:
yum -y install libffi-devel
tox_env_map() {
case $1 in
*"cp27"*) echo 'py27';;
*"cp35"*) echo 'py35';;
*"cp311"*) echo 'py311';;
*"cp36"*) echo 'py36';;
*"cp37"*) echo 'py37';;
*"cp38"*) echo 'py38';;
*"cp39"*) echo 'py39';;
*"cp310"*) echo 'py310';;
*) echo 'py';;
esac
}
# Compile wheels
for PYBIN in /opt/python/*/bin; do
if \
[[ "${PYBIN}" == *"cp27"* ]] || \
[[ "${PYBIN}" == *"cp35"* ]] || \
[[ "${PYBIN}" == *"cp311"* ]] || \
[[ "${PYBIN}" == *"cp36"* ]] || \
[[ "${PYBIN}" == *"cp37"* ]] || \
[[ "${PYBIN}" == *"cp38"* ]] || \
[[ "${PYBIN}" == *"cp39"* ]] || \
[[ "${PYBIN}" == *"cp310"* ]] ; then
if [[ "${PYBIN}" == *"cp311"* ]] ; then
"${PYBIN}/pip" install --pre -e /io/
"${PYBIN}/pip" wheel /io/ --pre -w wheelhouse/
else
"${PYBIN}/pip" install -e /io/
"${PYBIN}/pip" wheel /io/ -w wheelhouse/
fi
if [ `uname -m` == 'aarch64' ]; then
cd /io/
${PYBIN}/pip install tox
TOXENV=$(tox_env_map "${PYBIN}")
${PYBIN}/tox -e ${TOXENV}
cd ..
fi
rm -rf /io/build /io/*.egg-info
fi
done
# Bundle external shared libraries into the wheels
for whl in wheelhouse/AccessControl*.whl; do
auditwheel repair "$whl" -w /io/wheelhouse/
done | AccessControl | /AccessControl-5.5.tar.gz/AccessControl-5.5/.manylinux-install.sh | .manylinux-install.sh |
Request method decorators
=========================
Using request method decorators, you can limit functions or methods to only
be callable when the HTTP request was made using a particular method.
To limit access to a function or method to POST requests, use the
`requestmethod` decorator factory::
>>> from AccessControl.requestmethod import requestmethod
>>> @requestmethod('POST')
... def foo(bar, REQUEST):
... return bar
When this method is accessed through a request that does not use POST, the
Forbidden exception will be raised::
>>> foo('spam', GET)
Traceback (most recent call last):
...
Forbidden: Request must be POST
Only when the request was made using POST, will the call succeed::
>>> foo('spam', POST)
'spam'
The `REQUEST` can be a positional or a keyword parameter.
*Not* passing an optional `REQUEST` always succeeds.
Note that the `REQUEST` parameter is a requirement for the decorator to
operate; not including it in the callable signature results in an error.
You can pass in multiple request methods to allow access by any of them.
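For example, a decorator created with both GET and POST accepts either kind of
request (a sketch following the pattern above; it assumes the factory simply
takes several method names)::

  >>> @requestmethod('GET', 'POST')
  ... def multi(bar, REQUEST):
  ...     return bar
  >>> multi('spam', GET)
  'spam'
  >>> multi('spam', POST)
  'spam'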
| AccessControl | /AccessControl-5.5.tar.gz/AccessControl-5.5/docs/requestmethod.rst | requestmethod.rst |
Security Architecture
---------------------
Users
-----
Objects representing users may be created in Principia
User Folder objects. User objects maintain the information
used to authenticate users, and allow roles to be associated
with a user.
Permissions
-----------
A "permission" is the smallest unit of access to an object,
roughly equivalent to the atomic permissions seen in NT:
R (Read), W(Write), X(Execute), etc. In Principia, a permission
usually describes a fine-grained logical operation on an object,
such as "View Management Screens", "Add Properties", etc.
Different types of objects will define different permissions
as appropriate for the object.
Types of access
---------------
A "type of access" is a named grouping of 0 or more of the
permissions defined by an object. All objects have one predefined
type of access called Full Access (all permissions defined by that
object). A user who has the special role "Manager" always has Full
Access to all objects at or below the level in the object hierarchy
at which the user is defined.
New types of access may be defined as combinations of the
various permissions defined by a given object. These new
types of access may be defined by the programmer, or by
users at runtime.
Roles
-----
A role is a name that ties users (authentication of identity)
to permissions (authorization for that identity) in the system.
Roles may be defined in any Folder (or Folderish) object in the
system. Sub folders can make use of roles defined higher in the
hierarchy. These roles can be assigned to users. All users,
including non-authenticated users have the built-in role of
"Anonymous".
Principia objects allow the association of defined roles
with a single "type of access" each, in the context of that
object. A single role is associated with one and only one
type of access in the context of a given object.
Examples
--------
User:
  o has the role "RoleA"
Object1:
  o has given "RoleA" Full Access
Result: the user has Full Access to Object1.
User:
  o has the role "RoleA"
Object2:
  o has given "RoleB" Full Access
  o has given the role "RoleA" View Access, a custom type of access
    that allows only viewing of the object.
Result: the user has only View Access.
Notes
-----
All objects define a permission called "Default permission". If this
permission is given to a role, users with that role will be able to
access subobjects which do not explicitly restrict access.
Technical
---------
Objects define their permissions as logical operations.
Programmers have to determine the appropriate operations
for their object type, and provide a mapping of permission
name to attribute names. It is important to note that permissions
cannot overlap - none of the attributes named in a permission
can occur in any of the other permissions. The following are
proposed permissions for some current principia objects:
Folder
o View management screens
o Change permissions
o Undo changes
o Add objects
o Delete objects
o Add properties
o Change properties
o Delete properties
o Default permission
Confera Topic
o View management screens
o Change permissions
o Undo changes
o Add objects
o Delete objects
o Add properties
o Change properties
o Delete properties
o Default permission
o Change Configuration
o Add Messages
o Change Messages
o Delete Messages
Tabula Collection
o View management screens
o Change permissions
o Undo changes
o Add objects
o Delete objects
o Add properties
o Change properties
o Delete properties
o Default permission
o Change schema
o Upload data
o Add computed fields
o Change computed fields
o Delete computed fields
Document/Image/File
o View management screens
o Change permissions
o Change/upload data
o View
Session
o View management screens
o Change permissions
o Change session config
o Join/leave session
o Save/discard session
Mail Host
o View management screens
o Change permissions
o Change configuration
To support the architecture, developers must derive an
object from the AccessControl.rolemanager.RoleManager mixin class,
and define in their class an __ac_permissions__ attribute.
This should be a tuple of tuples, where each tuple represents
a permission and contains a string permission name as its first
element and a list of attribute names as its second element.
Example:
__ac_permissions__=(
('View management screens',
['manage','manage_menu','manage_main','manage_copyright',
'manage_tabs','manage_propertiesForm','manage_UndoForm']),
('Undo changes', ['manage_undo_transactions']),
('Change permissions', ['manage_access']),
('Add objects', ['manage_addObject']),
('Delete objects', ['manage_delObjects']),
('Add properties', ['manage_addProperty']),
('Change properties', ['manage_editProperties']),
('Delete properties', ['manage_delProperties']),
('Default permission', ['']),
)
The developer may also predefine useful types of access, by
specifying an __ac_types__ attribute. This should be a tuple of
tuples, where each tuple represents a type of access and contains
a string name as its first element and a list of permission names
as its second element.
By default, only "Full Access" is defined (by the RoleManager mixin).
If you wish to override __ac_types__ to provide convenient types of
access, you must always be sure to define "Full Access" as containing
the names of all possible permissions for your object.
Example:
__ac_types__=(
('Full Access', tuple(map(lambda x: x[0], __ac_permissions__))),
('Change', ['Add Objects', 'Add Properties', 'Change Properties']),
)
Developers may also provide pre-defined role names that are
not deletable via the interface by specifying an __ac_roles__
attribute. This is probably not something we'll ever use under
the new architecture, but it's there if you need it.
Example:
__ac_roles__=('Manager', 'Anonymous')
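A minimal sketch tying these attributes together (the class and method names
below are illustrations only, not part of the framework):

    from AccessControl.rolemanager import RoleManager

    class Example(RoleManager):
        __ac_permissions__=(
            ('View management screens', ['manage_main']),
            ('Change properties', ['manage_editProperties']),
        )
        __ac_roles__=('Manager', 'Anonymous')
        def manage_main(self):
            "A method protected by 'View management screens'."
        def manage_editProperties(self):
            "A method protected by 'Change properties'."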
| AccessControl | /AccessControl-5.5.tar.gz/AccessControl-5.5/docs/AccessControl.rst | AccessControl.rst |
Modeler package
--------------------
1. set up virtual environment for python
- for Pycharm
- https://www.jetbrains.com/help/pycharm/creating-virtual-environment.html
- for pure python
- go to your/download/src/directory and run the below commands
>>> python3 -m venv venv
>>> source venv/bin/activate
2. after downloading source from git, open terminal and run the below command.
>>> python setup.py test install
3. if you added a new package then run the following command
>>> python setup.py install
4. Install protobuf 3.6.1 on Ubuntu 18.04
.. code-block::
#! /bin/bash
# Make sure you grab the latest version
curl -OL https://github.com/google/protobuf/releases/download/v3.6.1/protoc-3.6.1-linux-x86_64.zip
# Unzip
unzip protoc-3.6.1-linux-x86_64.zip -d protoc3
# Move protoc to /usr/local/bin/
    sudo mv protoc3/bin/* /usr/local/bin/     # or: sudo cp -pr protoc3/bin/* /usr/local/bin/
# Move protoc3/include to /usr/local/include/
    sudo mv protoc3/include/* /usr/local/include/     # or: sudo cp -pr protoc3/include/* /usr/local/include/
# Optional: change owner
sudo chown $USER /usr/local/bin/protoc
sudo chown -R $USER /usr/local/include/google
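- to verify the installation, check the compiler version (illustrative; it should report the release you installed, e.g. libprotoc 3.6.1)

>>> protoc --version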
- if you update or add "\*.proto" file in protos package
- execute generate-protos.sh
How to use
- for Lifecycle
- see Lifecycle/README.MD
| Accuinsight | /Accuinsight-3.4.20230721.tar.gz/Accuinsight-3.4.20230721/README.rst | README.rst |
import sqlite3
from Ace_todolist import now_date
from Ace_todolist import input_check
# Function that adds a todo item
def add_todo():
conn = sqlite3.connect("ace.db")
cur = conn.cursor()
yn_list = ['y', 'Y', 'n', 'N']
place_default = ' '
comment_default = ' '
title = input("Todo ? ")
print()
due = input("Due date? (yyyy-mm-dd or mm-dd or dd)")
print()
while not input_check.due_check(due) or not input_check.date_check(due):
if not input_check.due_check(due):
print("Wrong input. Type again.")
print()
due = input("Due date? (yyyy-mm-dd or mm-dd or dd)")
print()
elif not input_check.date_check(due):
print("Invalid date or month. Type again.")
print()
due = input("Due date? (yyyy-mm-dd or mm-dd or dd)")
print()
    # If the year or month is omitted, fill in the current year and month
if len(due) < 10:
due = now_date.convert_due(due)
category = input("Category ? ")
print()
priority = input("Priority ? (1 to 5) ")
print()
while not input_check.priority_check(priority):
print("Wrong input. Type again.")
print()
priority = input("Priority ? (1 to 5) ")
print()
edit_place = input("Want to add Place ? (y / n) ")
print()
while edit_place not in yn_list:
print("Wrong input. Type again.")
print()
edit_place = input("Want to add Place ? (y / n) ")
print()
if (edit_place == 'y') or (edit_place == 'Y'):
place = input("Place ? ")
print()
else:
place = place_default
edit_comment = input("Want to add Comment ? (y / n) ")
print()
while edit_comment not in yn_list:
print("Wrong input. Type again.")
print()
edit_comment = input("Want to add Comment ? (y / n) ")
print()
if (edit_comment == 'y') or (edit_comment == 'Y'):
comment = input("Comment ? ")
print()
else:
comment = comment_default
data = ((title, category, priority, due, place, comment), )
sql = "insert into todo (title, category, priority, due, place, comment, finished) values (?, ?, ?, ?, ?, ?,0);"
cur.executemany(sql, data)
conn.commit()
conn.close()
    # print() for a newline | Ace-todolist | /Ace_todolist-2.1-py3-none-any.whl/Ace_todolist/add_todo.py | add_todo.py
from Ace_todolist import list_todo
from Ace_todolist import edit_todo
from Ace_todolist import create_db
from Ace_todolist import add_todo
from Ace_todolist import remove_todo
from Ace_todolist import stat_todo
from Ace_todolist import search
from Ace_todolist import detail
import argparse
UPWARD_OPTIONS = ["t", "c", "p", "d"]
DOWNWARD_OPTIONS = ["T", "C", "P", "D"]
# Sort function (option is the column to sort by, status is the current sort state (ascending/descending),
# finished determines how items are displayed depending on completion)
# Program entry point
def run_program():
create_db.create_db()
    # print() for a newline
print()
parser = argparse.ArgumentParser()
parser.add_argument("--add", help="add item", action="store_true")
parser.add_argument("--list", choices=["a", "f"],
help= "print list of items(a: list of all items, f: split view of list based on completion)")
parser.add_argument("--edit", choices=["title", "category", "due", "priority",
"fin", "place", "comment"],
help="edit item based on title, category, ..., comment")
parser.add_argument("--stat", help="statistics of items", action="store_true")
parser.add_argument("--search", choices=["i", "t", "c", "d"],
help="search item based on (i : ID, t: title, c : category, d: due)")
parser.add_argument("--detail", help="print details of item", action="store_true")
parser.add_argument("--remove", help="remove item", action="store_true")
parser.add_argument("--version", action="version", version="version 2.1")
parser.add_argument("--sort", choices=["t", "c", "p", "d", "T", "C", "P", "D"],
help=" using with --list, option for sorting (t : title, c: category, p: priority, d: due)")
args = parser.parse_args()
if args.add:
add_todo.add_todo()
elif args.list:
if args.list == "f":
select2 = 1
if args.sort in UPWARD_OPTIONS:
status = list_todo.sort(UPWARD_OPTIONS.index(args.sort)+1, 0, select2)
elif args.sort in DOWNWARD_OPTIONS:
status = list_todo.sort(DOWNWARD_OPTIONS.index(args.sort)+1, 1, select2)
else:
list_todo.list_finished_unfinished()
status = 0
else:
select2 = 0
if args.sort in UPWARD_OPTIONS:
status = list_todo.sort(UPWARD_OPTIONS.index(args.sort)+1, 0, select2)
elif args.sort in DOWNWARD_OPTIONS:
status = list_todo.sort(DOWNWARD_OPTIONS.index(args.sort)+1, 1, select2)
else:
list_todo.list_all()
status = 0
        # Read the sort option (default is ascending; entering the same option again toggles the order)
while 1:
option = input("Sort in other way or quit this section:\n"
"(t: title, c: category, p: priority, d: due, q: quit)? ")
if option == 'q':
break
while option not in UPWARD_OPTIONS and option not in DOWNWARD_OPTIONS:
print("Wrong input!\n")
option = input("Sort in other way or quit this section:\n"
"(t: title, c: category, p: priority, d: due, q: quit)? ")
if option in UPWARD_OPTIONS:
option = UPWARD_OPTIONS.index(option) + 1
elif option in DOWNWARD_OPTIONS:
                option = DOWNWARD_OPTIONS.index(option) + 1
            # Sort and print according to the chosen option
status = list_todo.sort(int(option), status, select2)
elif args.edit:
target = args.edit
if not args.search:
searching = None
else:
searching = args.search
edit_todo.edit_todo(searching, target)
elif args.detail:
if not args.search:
detail.detail()
else:
detail.detail(args.search)
elif args.remove:
if not args.search:
remove_todo.remove_todo()
else:
remove_todo.remove_todo(args.search)
elif args.search:
search.search(args.search)
elif args.stat:
stat_todo.stat_todo()
else:
print("Hello!, This is ACE TODO APPLICATION")
print("If you want to run action of ACE, confirm options(-h or --help)")
print()
if __name__ == "__main__":
run_program() | Ace-todolist | /Ace_todolist-2.1-py3-none-any.whl/Ace_todolist/__main__.py | __main__.py |
import sqlite3
from Ace_todolist import list_todo
from Ace_todolist import search
from Ace_todolist import input_check
from Ace_todolist import now_date
def edit_todo(searching, target):
conn = sqlite3.connect("ace.db")
cur = conn.cursor()
loop = 'y'
    # Keep looping as long as the user answers yes
while loop == 'y' or loop == 'Y':
        # Search for the item
lists = search.search(searching)
while len(lists) == 0:
print("Nothing found! Please retry.")
print()
lists = search.search(searching)
        # Select the item to edit
if len(lists) != 1:
target_id = input("Record id? ")
while not target_id.isdigit():
target_id = input("Record id? ")
else:
target_id = lists[0][0]
        column_list = ['title', 'due', 'fin', 'priority', 'category', 'place', 'comment']
if target is None:
select = input("\nWhat do you want to edit? \n\n Title => title"
"\n Due date => due \n Finished => fin"
"\n priority => priority \n category = > category"
"\n place = > place \n comment = > comment\n")
while select not in column_list:
print("Incorrect type")
select = input("\nWhat do you want to edit?\n\n Title => title"
"\n Due date => due\n Finished => fin"
"\n priority => priority\n category = > category"
"\n place = > place\n comment = > comment\n")
else:
select = target
        # Apply the requested change
        # Edit the title
if 'title' in select:
sel_title = input("\nTitle?")
cur.execute("update todo set title = ? where id =?", (sel_title, target_id))
        # Edit the due date
elif 'due' in select:
set_due = input("\nDue date? ")
while not input_check.due_check(set_due):
print("wrong input. type again.")
print()
set_due = input("Due date? (yyyy-mm-dd or mm-dd or dd)")
print()
if len(set_due) < 10:
set_due = now_date.convert_due(set_due)
cur.execute("update todo set due = ? where id =?", (set_due, target_id))
        # Edit the priority
elif 'priority' in select:
sel_priority = input("\nPriority? ")
cur.execute("update todo set priority = ? where id =?", (sel_priority, target_id))
while not input_check.priority_check(sel_priority):
print("Wrong input")
sel_priority = input("Priority ? (1 to 5) ")
cur.execute("update todo set priority = ? where id =?", (sel_priority, target_id))
        # Toggle finished/unfinished
elif 'fin' in select:
sel_finished = input("\nFinished (1: yes, 0: no)? ")
cur.execute("update todo set finished = ? where id =?", (sel_finished, target_id))
while not input_check.finished_check(sel_finished):
print("Wrong input")
sel_finished = input("\nFinished (1: yes, 0: no)? ")
cur.execute("update todo set finished = ? where id =?", (sel_finished, target_id))
        # Edit the category
elif 'category' in select:
sel_category = input("\nCategory? ")
cur.execute("update todo set category = ? where id =?", (sel_category, target_id))
        # Edit the place
elif 'place' in select:
sel_place = input("\nPlace? ")
cur.execute("update todo set place = ? where id =?", (sel_place, target_id))
        # Edit the comment
elif 'comment' in select:
sel_comment = input("\nComment? ")
cur.execute("update todo set comment = ? where id =?", (sel_comment, target_id))
conn.commit()
loop = input("\nanything else you want to edit? (yes: y no: n): ")
    # Print the updated list
list_todo.list_all()
conn.close() | Ace-todolist | /Ace_todolist-2.1-py3-none-any.whl/Ace_todolist/edit_todo.py | edit_todo.py |
# Functions that search the todo table for todo items
import sqlite3
from Ace_todolist import list_todo as printer
def search(option=None):
conn = sqlite3.connect("ace.db")
cur = conn.cursor()
    # List holding the search results
search_list = []
search_answer_list = ["i", "d", "t", "c"]
    # Prompt for how the user wants to search / branch accordingly
if option is None:
search_type = input("How do you want to search? (i: id, t: title, d: due, c: category) ")
while search_type not in search_answer_list:
print()
print("Incorrect type")
search_type = input("How do you want to search? (i: id, t: title, d: due, c: category) ")
else:
search_type = option
if search_type == "i":
search_id = input("what id: ")
sql = "select * from todo where id=?"
cur.execute(sql, (search_id, ))
rows = cur.fetchall()
for row in rows:
search_list.append(row)
printer.print_list(search_list)
elif search_type == "t":
search_title = input("what title: ")
search_list = contain_thing(search_title, 1)
printer.print_list(search_list)
elif search_type == "d":
search_due = input("what due: ")
search_list = contain_thing(search_due, 3)
printer.print_list(search_list)
elif search_type == "c":
search_category = input("what category: ")
search_list = contain_thing(search_category, 2)
printer.print_list(search_list)
cur.close()
conn.close()
return search_list
# Find all items that contain the search term
def contain_thing(what_search, num_index):
conn = sqlite3.connect("ace.db")
cur = conn.cursor()
    # List of all items containing the search term
contain_list = []
    # List of all existing items
all_list = []
    # Collect all existing items
sql = "select * from todo where 1"
cur.execute(sql)
rows = cur.fetchall()
for row in rows:
all_list.append(row)
    # Keep every item that contains the search term
for elem in all_list:
if what_search in elem[num_index]:
contain_list.append(elem)
cur.close()
conn.close()
return contain_list | Ace-todolist | /Ace_todolist-2.1-py3-none-any.whl/Ace_todolist/search.py | search.py |
import sqlite3
from Ace_todolist import now_date
COLUMN = ["title", "category", "priority", "due"]
SORT = ["asc", "desc"]
# Width of each column
WIDTH_ID = 4
WIDTH_TITLE = 61
WIDTH_CATEGORY = 16
WIDTH_PRIORITY = 10
WIDTH_DUE = 11
COLUMN_SIZE = [WIDTH_ID, WIDTH_TITLE, WIDTH_CATEGORY, WIDTH_DUE, WIDTH_PRIORITY]
# Label of each column
ID = "ID"
TITLE = "TITLE"
CATEGORY = "CATEGORY"
PRIORITY = "PRIORITY"
DUE = "DUE"
COLUMN_LABEL = [ID, TITLE, CATEGORY, DUE, PRIORITY]
# Padding around each column label
SPACE_ID = int((WIDTH_ID - len(ID)) / 2)
SPACE_TITLE = int((WIDTH_TITLE - len(TITLE)) / 2)
SPACE_CATEGORY = int((WIDTH_CATEGORY - len(CATEGORY)) / 2)
SPACE_PRIORITY = int((WIDTH_PRIORITY - len(PRIORITY)) / 2)
SPACE_DUE = int((WIDTH_DUE - len(DUE)) / 2)
SPACE_COLUMN = [SPACE_ID, SPACE_TITLE, SPACE_CATEGORY, SPACE_DUE, SPACE_PRIORITY]
# Color codes for strings (ANSI escape sequences)
BLACK = "\x1b[30m"
WHITE = "\x1b[37m"
RED = "\x1b[31m"
BLUE = "\x1b[34m"
YELLOW = "\x1b[33m"
MAGENTA = "\x1b[35m"
RESET = "\x1b[0m"
# Print the given list of items to the console
def print_list(rows):
    # print() for a newline
print()
    # Print the table's column labels in black
print(label_string(BLACK))
    # Print a double horizontal line
print(line_string("="))
count = 0
    # Print each item
for row in rows:
due = str(row[3])
fin = row[-1]
        # Print a horizontal line after every five items
if count == 5:
print(line_string())
count = 0
        # Print unfinished items that have not expired in yellow
if fin == 0 and now_date.expired(due):
print(todo_string(YELLOW, row))
count += 1
        # Print items that expired while still unfinished in red
elif fin == 0:
print(todo_string(RED, row))
count += 1
        # Print finished items in blue
else:
print(todo_string(BLUE, row))
count += 1
    # print() for a newline
print()
# Print the entire TODO list
def list_all(where=" ", data="ace.db"):
conn = sqlite3.connect(data)
cur = conn.cursor()
    # Default option
if where == " ":
sql = "select * from todo where 1"
rows_fin = []
    # Sort by remaining due date
elif where == 'due asc' or where == 'due desc':
sql = "select * from todo where finished = 0 order by " + where
sql2 = "select * from todo where finished = 1 order by " + where
cur.execute(sql2)
rows_fin = cur.fetchall()
    # Other sort options
else:
sql = "select * from todo order by " + where
rows_fin = []
cur.execute(sql)
rows = cur.fetchall()
for row in rows_fin:
rows.append(row)
conn.close()
print_list(rows)
print("(※ RED: Expired, BLUE: Finished, YELLOW : Unfinished)")
print('\n')
# Print the TODO list split into finished and unfinished sections
def list_finished_unfinished(where=" ", data="ace.db"):
conn = sqlite3.connect(data)
cur = conn.cursor()
    # Default option
if where == " ":
sql_finished = "select * from todo where finished = 0"
sql_unfinished = "select * from todo where finished = 1"
    # Sort options
else:
sql_finished = "select * from todo where finished = 0 order by " + where
sql_unfinished = "select * from todo where finished = 1 order by " + where
cur.execute(sql_finished)
rows_finished = cur.fetchall()
cur.execute(sql_unfinished)
rows_unfinished = cur.fetchall()
print()
    # Print unfinished items
print("****Undone Todo List****")
print_list(rows_finished)
print()
    # Print finished items
print("****Done Todo List****")
print_list(rows_unfinished)
print("(※ RED: Expired, BLUE: Finished, YELLOW : Unfinished)")
print('\n')
# Build the display string for a single item
def todo_string(color, row, line=WHITE):
i = 0
string = ""
while i < len(COLUMN_SIZE):
if i != 4:
element = str(row[i])
else:
element = str(star(row[i]))
string += color + element + " " * (COLUMN_SIZE[i] - len(element))
if i < len(COLUMN_SIZE) - 1:
string += line + "|"
else:
string += RESET
i += 1
return string
# Build the string for a separator line
def line_string(shape="-", color=WHITE):
string = color
for size in COLUMN_SIZE:
string += shape * size
if size != COLUMN_SIZE[-1]:
string += "+"
else:
string += RESET
return string
# Build the string for the column labels
def label_string(color, line=WHITE):
string = ""
i = 0
while i < len(COLUMN_LABEL):
string += "\x1b[1m" + color + " " * SPACE_COLUMN[i] + COLUMN_LABEL[i] + " " * SPACE_COLUMN[i]
if i < len(COLUMN_LABEL) - 1:
string += RESET + line + "|"
else:
string += RESET
i += 1
return string
def star(number, star1="☆", star2="★"):
return star2 * number + star1 * (5 - number)
def sort(option, status, finished):
if finished == 0:
if option == status:
list_all(COLUMN[option-1] + " " + SORT[1])
return 0
else:
list_all(COLUMN[option-1] + " " + SORT[0])
return option
else:
if option == status:
list_finished_unfinished(COLUMN[option-1] + " " + SORT[1])
return 0
else:
list_finished_unfinished(COLUMN[option-1] + " " + SORT[0])
return option | Ace-todolist | /Ace_todolist-2.1-py3-none-any.whl/Ace_todolist/list_todo.py | list_todo.py |
class MorseCode:
"""
Class that holds everything you need to generate or translate Morse
"""
def __init__(self):
self.alphabet = {'0': '-----', '1': '.----', '2': '..---', '3': '...--',
'4': '....-', '5': '.....', '6': '-....', '7': '--...',
'8': '---..', '9': '----.', 'a': '.-', 'b': '-...',
'c': '-.-.', 'd': '-..', 'e': '.', 'f': '..-.', 'g': '--.',
'h': '....', 'i': '..', 'j': '.---', 'k': '-.-',
'l': '.-..', 'm': '--', 'n': '-.', 'o': '---', 'p': '.--.',
'q': '--.-', 'r': '.-.', 's': '...', 't': '-', 'u': '..-',
                         'v': '...-', 'w': '.--', 'x': '-..-', 'y': '-.--', 'z': '--..'}
self.punctuation = {
'!': '−·−·−−',
'?': '··−−··',
'/': '−··−·',
';': '−·−·−·',
',': '−−··−−',
'(': '−·−−·',
')': '−·−−·−',
':': '−−−···',
'=': '−···−',
'-': '−····−',
'+': '·−·−·',
'@': '·−−·−·',
'"': '·−··−·',
'$': '···−··−',
'_': '··−−·−',
'&': '·−···',
' ': ''}
# Only holds alphanumeric characters
self.alpha_list = [letter for letter in self.alphabet.keys()]
# Only holds Morse code
self.code_list = [code for code in self.alphabet.values()]
# Punctuation characters
self.punc_eng = [punc for punc in self.punctuation.keys()]
# Punctuation Morse code
self.punc_codes = [code for code in self.punctuation.values()]
def generate(self, text):
"""
Generates Morse code from a message in English
"""
text = text.lower() # Make everything lower case, as case doesn't matter in Morse
morse = [] # Create the list that will eventually hold the Morse code
for letter in text: # Search the message for its match in Morse
if letter in self.alphabet:
morse.append(self.alphabet[letter])
# Attach punctuation or spaces as needed (periods are left out because .
# is 'e' in Morse)
if letter == '':
morse.append('')
if letter in self.punctuation:
morse.append(self.punctuation[letter])
return ' '.join(morse)
def translate(self, morse):
"""
Translates a Morse message into English
"""
morse = morse.split(' ')
english = []
for code in morse:
if code in self.code_list:
x = self.code_list.index(code)
english.append(self.alpha_list[x])
# Attach punctuation or spaces as needed
if code == '':
english.append(' ')
if code in self.punc_codes:
index = self.punc_codes.index(code)
english.append(self.punc_eng[index])
return ''.join(english) | AceMorse | /AceMorse-1.0.tar.gz/AceMorse-1.0/acemorse.py | acemorse.py |
Achilterm
=========
**Achilterm** is a lightweight **UTF-8** web based terminal.
.. image:: https://raw.githubusercontent.com/fgallaire/achilterm/master/img/achilterm.png
Achilterm is written in Python (and some AJAX javascript for client
side).
Achilterm is **very simple to install** on Linux, MacOS X, FreeBSD,
Solaris, cygwin and any Unix that runs Python.
Achilterm is initially forked from Ajaxterm which was inspired by
Anyterm.
Achilterm is developed by Florent Gallaire [email protected].
Website: http://fgallaire.github.io/achilterm.
Download and Install
--------------------
To install the last stable version from PyPI:
::
$ sudo pip install achilterm
To install the development version from GitHub:
::
$ git clone https://github.com/fgallaire/achilterm.git
$ cd achilterm
$ sudo python setup.py install
To run Achilterm after installation:
::
$ achilterm
To run Achilterm from the source without installation:
::
$ ./achilterm/achilterm.py
Then point your browser to this URL : ``http://localhost:8022/``
Documentation and Caveats
-------------------------
- Achilterm supports Python 2.5 and above and Python 3.2 and above
- Achilterm requires WebOb >= 1.0
- If run as root, achilterm will run /bin/login, otherwise it will run
  ssh localhost. To use another command, use the -c option.
- By default Achilterm only listens at ``127.0.0.1:8022``. For remote
  access, it is strongly recommended to use **https SSL/TLS**, which is
  simple to configure if you use the Apache web server with ``mod_proxy``.
Using SSL will also speed up achilterm (probably because of keepalive).
- Using GET HTTP requests seems to speed up achilterm, just click on GET
  in the interface, but be warned that your keystrokes might be logged
  (by Apache or any proxy). I usually enable it after the login.
Achiltermlite
-------------
Achiltermlite is a stripped-down client-side version of Achilterm.
.. image:: https://raw.githubusercontent.com/fgallaire/achilterm/master/img/achiltermlite.png
Commandline usage
-----------------
::
usage: achilterm [options]
options:
--version show program's version number and exit
-h, --help show this help message and exit
-p PORT, --port=PORT set the TCP port (default: 8022)
-c CMD, --command=CMD set the command (default: /bin/login or ssh localhost)
-l, --log log requests to stderr (default: quiet mode)
-d, --daemon run as daemon in the background
-P PIDFILE, --pidfile=PIDFILE
set the pidfile (default: /var/run/achilterm.pid)
-i INDEX_FILE, --index=INDEX_FILE
default index file (default: achilterm.html)
-u UID, --uid=UID set the daemon's user id
-L, --lite use Achiltermlite
-w WIDTH, --width=WIDTH set the width (default: 80)
-H HEIGHT, --height=HEIGHT set the height (default: 25)
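For example, to serve Achiltermlite as a daemon on another port (the values
here are just an illustration composed from the options listed above)::

    $ achilterm --port 8023 --daemon --lite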
Configuration example
---------------------
::
Listen 443
NameVirtualHost *:443
<VirtualHost *:443>
ServerName localhost
SSLEngine On
SSLCertificateKeyFile ssl/apache.pem
SSLCertificateFile ssl/apache.pem
ProxyRequests Off
<Proxy *>
Order deny,allow
Allow from all
</Proxy>
ProxyPass /achilterm/ http://localhost:8022/
ProxyPassReverse /achilterm/ http://localhost:8022/
</VirtualHost>
Old versions
------------
- Achilterm 0.13 requires Python 2.5 and above
Older Achilterm versions only support latin1; if you use Ubuntu or any
``LANG==en_US.UTF-8`` distribution, don't forget to ``$ unset LANG``:
- Achilterm 0.12 requires WebOb >= 1.2 (use it with Python 2.6 and 2.7)
- Achilterm 0.11 requires WebOb < 1.0 (use it with Python 2.5)
Compared to anyterm
-------------------
- There are no partial updates, achilterm updates either all the screen
or nothing. That make the code simpler and I also think it's faster.
HTTP replies are always gzencoded. When used in 80x25 mode, almost
all of them are below the 1500 bytes (size of an ethernet frame) and
we just replace the screen with the reply (no javascript string
handling).
- Achilterm polls the server for updates with an exponentially growing
timeout when the screen hasn't changed. The timeout is also resetted
as soon as a key is pressed. Anyterm blocks on a pending request and
use a parallel connection for keypresses. The anyterm approch is
better when there aren't any keypress.
License
-------
Achilterm files are released under the GNU AGPLv3 or above license.
Achilterm codebase from Ajaxterm by Antony Lesuisse (email: al AT
udev.org), License Public Domain.
| Achilterm | /Achilterm-0.21.tar.gz/Achilterm-0.21/README.rst | README.rst |
achilterm={};
achilterm.Terminal_ctor=function(id,width,height) {
var ie=0;
if(window.ActiveXObject)
ie=1;
var chrome = navigator.userAgent.match('Chrome');
var webkit = navigator.userAgent.match('WebKit');
var sid=""+Math.round(Math.random()*1000000000);
var query0="s="+sid+"&w="+width+"&h="+height;
var query1=query0+"&c=1&k=";
var buf="";
var timeout;
var keybuf=[];
var sending=0;
var rmax=1;
var div=document.getElementById(id);
var dterm=document.createElement('div');
function update() {
if(sending==0) {
sending=1;
var r=new XMLHttpRequest();
var send="";
while(keybuf.length>0) {
send+=keybuf.pop();
}
var query=query1+send;
r.open("POST","u",true);
r.setRequestHeader('Content-Type','application/x-www-form-urlencoded');
r.onreadystatechange = function () {
if (r.readyState==4) {
if(r.status==200) {
de=r.responseXML.documentElement;
if(de.tagName=="pre") {
dterm.innerHTML = r.responseText;
rmax=100;
} else {
rmax*=2;
if(rmax>2000)
rmax=2000;
}
sending=0;
timeout=window.setTimeout(update,rmax);
}
}
}
r.send(query);
}
}
function queue(s) {
keybuf.unshift(s);
if(sending==0) {
window.clearTimeout(timeout);
timeout=window.setTimeout(update,1);
}
}
function keypress(ev) {
if (!ev) var ev=window.event;
var kc;
var k="";
if (ev.keyCode)
kc=ev.keyCode;
if (ev.which)
kc=ev.which;
if (ev.altKey) {
if (kc>=65 && kc<=90)
kc+=32;
if (kc>=97 && kc<=122) {
k=String.fromCharCode(27)+String.fromCharCode(kc);
}
} else if (ev.ctrlKey) {
if (kc>=65 && kc<=90) k=String.fromCharCode(kc-64); // Ctrl-A..Z
else if (kc>=97 && kc<=122) k=String.fromCharCode(kc-96); // Ctrl-A..Z
else if (kc==54) k=String.fromCharCode(30); // Ctrl-^
else if (kc==109) k=String.fromCharCode(31); // Ctrl-_
else if (kc==219) k=String.fromCharCode(27); // Ctrl-[
else if (kc==220) k=String.fromCharCode(28); // Ctrl-\
else if (kc==221) k=String.fromCharCode(29); // Ctrl-]
else if (kc==219) k=String.fromCharCode(29); // Ctrl-]
else if (kc==219) k=String.fromCharCode(0); // Ctrl-@
} else if (ev.which==0) {
if (kc==9) k=String.fromCharCode(9); // Tab
else if (kc==8) k=String.fromCharCode(127); // Backspace
else if (kc==27) k=String.fromCharCode(27); // Escape
else {
if (kc==33) k="[5~"; // PgUp
else if (kc==34) k="[6~"; // PgDn
else if (kc==35) k="[4~"; // End
else if (kc==36) k="[1~"; // Home
else if (kc==37) k="[D"; // Left
else if (kc==38) k="[A"; // Up
else if (kc==39) k="[C"; // Right
else if (kc==40) k="[B"; // Down
else if (kc==45) k="[2~"; // Ins
else if (kc==46) k="[3~"; // Del
else if (kc==112) k="[[A"; // F1
else if (kc==113) k="[[B"; // F2
else if (kc==114) k="[[C"; // F3
else if (kc==115) k="[[D"; // F4
else if (kc==116) k="[[E"; // F5
else if (kc==117) k="[17~"; // F6
else if (kc==118) k="[18~"; // F7
else if (kc==119) k="[19~"; // F8
else if (kc==120) k="[20~"; // F9
else if (kc==121) k="[21~"; // F10
else if (kc==122) k="[23~"; // F11
else if (kc==123) k="[24~"; // F12
if (k.length) {
k=String.fromCharCode(27)+k;
}
}
} else {
if (kc==8)
k=String.fromCharCode(127); // Backspace
else
k=String.fromCharCode(kc);
}
if(k.length) {
if(k=="+") {
queue("%2B");
} else {
queue(encodeURIComponent(k));
}
}
ev.cancelBubble=true;
if (ev.stopPropagation) ev.stopPropagation();
if (ev.preventDefault) ev.preventDefault();
return false;
}
function keydown(ev) {
if (!ev) var ev=window.event;
if (ie || chrome || webkit) {
o={9:1,8:1,27:1,33:1,34:1,35:1,36:1,37:1,38:1,39:1,40:1,45:1,46:1,112:1,
113:1,114:1,115:1,116:1,117:1,118:1,119:1,120:1,121:1,122:1,123:1};
if (o[ev.keyCode] || ev.ctrlKey || ev.altKey) {
ev.which=0;
return keypress(ev);
}
}
}
function init() {
div.appendChild(dterm);
document.onkeypress=keypress;
document.onkeydown=keydown;
timeout=window.setTimeout(update,100);
}
init();
}
achilterm.Terminal=function(id,width,height) {
return new this.Terminal_ctor(id,width,height);
} | Achilterm | /Achilterm-0.21.tar.gz/Achilterm-0.21/achilterm/achiltermlite.js | achiltermlite.js |
achilterm={};
achilterm.Terminal_ctor=function(id,width,height) {
var ie=0;
if(window.ActiveXObject)
ie=1;
var chrome = navigator.userAgent.match('Chrome');
var webkit = navigator.userAgent.match('WebKit');
var sid=""+Math.round(Math.random()*1000000000);
var query0="s="+sid+"&w="+width+"&h="+height;
var query1=query0+"&c=1&k=";
var buf="";
var timeout;
var error_timeout;
var keybuf=[];
var sending=0;
var rmax=1;
var div=document.getElementById(id);
var dstat=document.createElement('pre');
var sled=document.createElement('span');
var opt_get=document.createElement('a');
var opt_color=document.createElement('a');
var opt_paste=document.createElement('a');
var sdebug=document.createElement('span');
var dterm=document.createElement('div');
function debug(s) {
sdebug.innerHTML=s;
}
function error() {
sled.className='off';
debug("Connection lost timeout ts:"+((new Date).getTime()));
}
function opt_add(opt,name) {
opt.className='off';
opt.innerHTML=' '+name+' ';
dstat.appendChild(opt);
dstat.appendChild(document.createTextNode(' '));
}
function do_get(event) {
opt_get.className=(opt_get.className=='off')?'on':'off';
debug('GET '+opt_get.className);
}
function do_color(event) {
var o=opt_color.className=(opt_color.className=='off')?'on':'off';
if(o=='on')
query1=query0+"&c=1&k=";
else
query1=query0+"&k=";
debug('Color '+opt_color.className);
}
function mozilla_clipboard() {
// mozilla sucks
try {
netscape.security.PrivilegeManager.enablePrivilege("UniversalXPConnect");
} catch (err) {
debug('Access denied, <a href="http://kb.mozillazine.org/Granting_JavaScript_access_to_the_clipboard" target="_blank">more info</a>');
return undefined;
}
var clip = Components.classes["@mozilla.org/widget/clipboard;1"].createInstance(Components.interfaces.nsIClipboard);
var trans = Components.classes["@mozilla.org/widget/transferable;1"].createInstance(Components.interfaces.nsITransferable);
if (!clip || !trans) {
return undefined;
}
trans.addDataFlavor("text/unicode");
clip.getData(trans,clip.kGlobalClipboard);
var str=new Object();
var strLength=new Object();
try {
trans.getTransferData("text/unicode",str,strLength);
} catch(err) {
return "";
}
if (str) {
str=str.value.QueryInterface(Components.interfaces.nsISupportsString);
}
if (str) {
return str.data.substring(0,strLength.value / 2);
} else {
return "";
}
}
function do_paste(event) {
var p=undefined;
if (window.clipboardData) {
p=window.clipboardData.getData("Text");
} else if(window.netscape) {
p=mozilla_clipboard();
}
if (p) {
debug('Pasted');
queue(encodeURIComponent(p));
} else {
}
}
function update() {
// debug("ts: "+((new Date).getTime())+" rmax:"+rmax);
if(sending==0) {
sending=1;
sled.className='on';
var r=new XMLHttpRequest();
var send="";
while(keybuf.length>0) {
send+=keybuf.pop();
}
var query=query1+send;
if(opt_get.className=='on') {
r.open("GET","u?"+query,true);
if(ie) {
r.setRequestHeader("If-Modified-Since", "Sat, 1 Jan 2000 00:00:00 GMT");
}
} else {
r.open("POST","u",true);
}
r.setRequestHeader('Content-Type','application/x-www-form-urlencoded');
r.onreadystatechange = function () {
// debug("xhr:"+((new Date).getTime())+" state:"+r.readyState+" status:"+r.status+" statusText:"+r.statusText);
if (r.readyState==4) {
if(r.status==200) {
window.clearTimeout(error_timeout);
de=r.responseXML.documentElement;
if(de.tagName=="pre") {
dterm.innerHTML = r.responseText;
rmax=100;
} else {
rmax*=2;
if(rmax>2000)
rmax=2000;
}
sending=0;
sled.className='off';
timeout=window.setTimeout(update,rmax);
} else {
debug("Connection error status:"+r.status);
}
}
}
error_timeout=window.setTimeout(error,5000);
if(opt_get.className=='on') {
r.send(null);
} else {
r.send(query);
}
}
}
function queue(s) {
keybuf.unshift(s);
if(sending==0) {
window.clearTimeout(timeout);
timeout=window.setTimeout(update,1);
}
}
function keypress(ev) {
if (!ev) var ev=window.event;
// s="kp keyCode="+ev.keyCode+" which="+ev.which+" shiftKey="+ev.shiftKey+" ctrlKey="+ev.ctrlKey+" altKey="+ev.altKey;
// debug(s);
// return false;
// else { if (!ev.ctrlKey || ev.keyCode==17) { return; }
var kc;
var k="";
if (ev.keyCode)
kc=ev.keyCode;
if (ev.which)
kc=ev.which;
if (ev.altKey) {
if (kc>=65 && kc<=90)
kc+=32;
if (kc>=97 && kc<=122) {
k=String.fromCharCode(27)+String.fromCharCode(kc);
}
} else if (ev.ctrlKey) {
if (kc>=65 && kc<=90) k=String.fromCharCode(kc-64); // Ctrl-A..Z
else if (kc>=97 && kc<=122) k=String.fromCharCode(kc-96); // Ctrl-A..Z
else if (kc==54) k=String.fromCharCode(30); // Ctrl-^
else if (kc==109) k=String.fromCharCode(31); // Ctrl-_
else if (kc==219) k=String.fromCharCode(27); // Ctrl-[
else if (kc==220) k=String.fromCharCode(28); // Ctrl-\
else if (kc==221) k=String.fromCharCode(29); // Ctrl-]
else if (kc==219) k=String.fromCharCode(29); // Ctrl-]
else if (kc==219) k=String.fromCharCode(0); // Ctrl-@
} else if (ev.which==0) {
if (kc==9) k=String.fromCharCode(9); // Tab
else if (kc==8) k=String.fromCharCode(127); // Backspace
else if (kc==27) k=String.fromCharCode(27); // Escape
else {
if (kc==33) k="[5~"; // PgUp
else if (kc==34) k="[6~"; // PgDn
else if (kc==35) k="[4~"; // End
else if (kc==36) k="[1~"; // Home
else if (kc==37) k="[D"; // Left
else if (kc==38) k="[A"; // Up
else if (kc==39) k="[C"; // Right
else if (kc==40) k="[B"; // Down
else if (kc==45) k="[2~"; // Ins
else if (kc==46) k="[3~"; // Del
else if (kc==112) k="[[A"; // F1
else if (kc==113) k="[[B"; // F2
else if (kc==114) k="[[C"; // F3
else if (kc==115) k="[[D"; // F4
else if (kc==116) k="[[E"; // F5
else if (kc==117) k="[17~"; // F6
else if (kc==118) k="[18~"; // F7
else if (kc==119) k="[19~"; // F8
else if (kc==120) k="[20~"; // F9
else if (kc==121) k="[21~"; // F10
else if (kc==122) k="[23~"; // F11
else if (kc==123) k="[24~"; // F12
if (k.length) {
k=String.fromCharCode(27)+k;
}
}
} else {
if (kc==8)
k=String.fromCharCode(127); // Backspace
else
k=String.fromCharCode(kc);
}
if(k.length) {
// queue(encodeURIComponent(k));
if(k=="+") {
queue("%2B");
} else {
queue(encodeURIComponent(k));
}
}
ev.cancelBubble=true;
if (ev.stopPropagation) ev.stopPropagation();
if (ev.preventDefault) ev.preventDefault();
return false;
}
function keydown(ev) {
if (!ev) var ev=window.event;
if (ie || chrome || webkit) {
// s="kd keyCode="+ev.keyCode+" which="+ev.which+" shiftKey="+ev.shiftKey+" ctrlKey="+ev.ctrlKey+" altKey="+ev.altKey;
// debug(s);
o={9:1,8:1,27:1,33:1,34:1,35:1,36:1,37:1,38:1,39:1,40:1,45:1,46:1,112:1,
113:1,114:1,115:1,116:1,117:1,118:1,119:1,120:1,121:1,122:1,123:1};
if (o[ev.keyCode] || ev.ctrlKey || ev.altKey) {
ev.which=0;
return keypress(ev);
}
}
}
function init() {
sled.appendChild(document.createTextNode('\xb7'));
sled.className='off';
dstat.appendChild(sled);
dstat.appendChild(document.createTextNode(' '));
opt_add(opt_color,'Colors');
opt_color.className='on';
opt_add(opt_get,'GET');
opt_add(opt_paste,'Paste');
dstat.appendChild(sdebug);
dstat.className='stat';
div.appendChild(dstat);
div.appendChild(dterm);
if(opt_color.addEventListener) {
opt_get.addEventListener('click',do_get,true);
opt_color.addEventListener('click',do_color,true);
opt_paste.addEventListener('click',do_paste,true);
} else {
opt_get.attachEvent("onclick", do_get);
opt_color.attachEvent("onclick", do_color);
opt_paste.attachEvent("onclick", do_paste);
}
document.onkeypress=keypress;
document.onkeydown=keydown;
timeout=window.setTimeout(update,100);
}
init();
}
achilterm.Terminal=function(id,width,height) {
return new this.Terminal_ctor(id,width,height);
} | Achilterm | /Achilterm-0.21.tar.gz/Achilterm-0.21/achilterm/achilterm.js | achilterm.js |
__version__ = '0.21'
import array,cgi,fcntl,glob,mimetypes,optparse,os,pty,random,re,signal,select,sys,threading,time,termios,struct,pwd
import webob
from wsgiref.simple_server import make_server, WSGIRequestHandler
import gzip
try:
from cStringIO import StringIO
except ImportError:
try:
from StringIO import StringIO
except:
from io import BytesIO as StringIO
PY2 = sys.version[0] == '2'
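# Terminal emulates a fixed-size character screen: write() feeds it raw program output,
# a subset of VT100/ANSI escape sequences is interpreted, each cell's character and color
# attributes are packed into self.scr, and dumphtml() renders the current screen as HTML.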
class Terminal:
def __init__(self,width=80,height=24):
self.width=width
self.height=height
self.init()
self.reset()
def init(self):
self.esc_seq={
"\x00": None,
"\x05": self.esc_da,
"\x07": None,
"\x08": self.esc_0x08,
"\x09": self.esc_0x09,
"\x0a": self.esc_0x0a,
"\x0b": self.esc_0x0a,
"\x0c": self.esc_0x0a,
"\x0d": self.esc_0x0d,
"\x0e": None,
"\x0f": None,
"\x1b#8": None,
"\x1b=": None,
"\x1b>": None,
"\x1b(0": None,
"\x1b(A": None,
"\x1b(B": None,
"\x1b[c": self.esc_da,
"\x1b[0c": self.esc_da,
"\x1b]R": None,
"\x1b7": self.esc_save,
"\x1b8": self.esc_restore,
"\x1bD": None,
"\x1bE": None,
"\x1bH": None,
"\x1bM": self.esc_ri,
"\x1bN": None,
"\x1bO": None,
"\x1bZ": self.esc_da,
"\x1ba": None,
"\x1bc": self.reset,
"\x1bn": None,
"\x1bo": None,
}
for k,v in self.esc_seq.items():
if v==None:
self.esc_seq[k]=self.esc_ignore
# regex
d={
r'\[\??([0-9;]*)([@ABCDEFGHJKLMPXacdefghlmnqrstu`])' : self.csi_dispatch,
r'\]([^\x07]+)\x07' : self.esc_ignore,
}
self.esc_re=[]
for k,v in d.items():
self.esc_re.append((re.compile('\x1b'+k),v))
# define csi sequences
self.csi_seq={
'@': (self.csi_at,[1]),
'`': (self.csi_G,[1]),
'J': (self.csi_J,[0]),
'K': (self.csi_K,[0]),
}
for i in [i[4] for i in dir(self) if i.startswith('csi_') and len(i)==5]:
if i not in self.csi_seq:
self.csi_seq[i]=(getattr(self,'csi_'+i),[1])
def reset(self,s=""):
self.scr=array.array('i',[0x000700]*(self.width*self.height))
self.st=0
self.sb=self.height-1
self.cx_bak=self.cx=0
self.cy_bak=self.cy=0
self.cl=0
self.sgr=0x000700
self.buf=""
self.outbuf=""
self.last_html=""
def peek(self,y1,x1,y2,x2):
return self.scr[self.width*y1+x1:self.width*y2+x2]
def poke(self,y,x,s):
pos=self.width*y+x
self.scr[pos:pos+len(s)]=s
def zero(self,y1,x1,y2,x2):
w=self.width*(y2-y1)+x2-x1+1
z=array.array('i',[0x000700]*w)
self.scr[self.width*y1+x1:self.width*y2+x2+1]=z
def scroll_up(self,y1,y2):
self.poke(y1,0,self.peek(y1+1,0,y2,self.width))
self.zero(y2,0,y2,self.width-1)
def scroll_down(self,y1,y2):
self.poke(y1+1,0,self.peek(y1,0,y2-1,self.width))
self.zero(y1,0,y1,self.width-1)
def scroll_right(self,y,x):
self.poke(y,x+1,self.peek(y,x,y,self.width))
self.zero(y,x,y,x)
def cursor_down(self):
if self.cy>=self.st and self.cy<=self.sb:
self.cl=0
q,r=divmod(self.cy+1,self.sb+1)
if q:
self.scroll_up(self.st,self.sb)
self.cy=self.sb
else:
self.cy=r
def cursor_right(self):
q,r=divmod(self.cx+1,self.width)
if q:
self.cl=1
else:
self.cx=r
def echo(self,c):
if self.cl:
self.cursor_down()
self.cx=0
self.scr[(self.cy*self.width)+self.cx]=self.sgr|ord(c)
self.cursor_right()
def esc_0x08(self,s):
self.cx=max(0,self.cx-1)
def esc_0x09(self,s):
x=self.cx+8
q,r=divmod(x,8)
self.cx=(q*8)%self.width
def esc_0x0a(self,s):
self.cursor_down()
def esc_0x0d(self,s):
self.cl=0
self.cx=0
def esc_save(self,s):
self.cx_bak=self.cx
self.cy_bak=self.cy
def esc_restore(self,s):
self.cx=self.cx_bak
self.cy=self.cy_bak
self.cl=0
def esc_da(self,s):
self.outbuf="\x1b[?6c"
def esc_ri(self,s):
self.cy=max(self.st,self.cy-1)
if self.cy==self.st:
self.scroll_down(self.st,self.sb)
def esc_ignore(self,*s):
pass
# print "term:ignore: %s"%repr(s)
def csi_dispatch(self,seq,mo):
# CSI sequences
s=mo.group(1)
c=mo.group(2)
f=self.csi_seq.get(c,None)
if f:
try:
l=[min(int(i),1024) for i in s.split(';') if len(i)<4]
except ValueError:
l=[]
if len(l)==0:
l=f[1]
f[0](l)
# else:
# print 'csi ignore',c,l
def csi_at(self,l):
for i in range(l[0]):
self.scroll_right(self.cy,self.cx)
def csi_A(self,l):
self.cy=max(self.st,self.cy-l[0])
def csi_B(self,l):
self.cy=min(self.sb,self.cy+l[0])
def csi_C(self,l):
self.cx=min(self.width-1,self.cx+l[0])
self.cl=0
def csi_D(self,l):
self.cx=max(0,self.cx-l[0])
self.cl=0
def csi_E(self,l):
self.csi_B(l)
self.cx=0
self.cl=0
def csi_F(self,l):
self.csi_A(l)
self.cx=0
self.cl=0
def csi_G(self,l):
self.cx=min(self.width,l[0])-1
def csi_H(self,l):
if len(l)<2: l=[1,1]
self.cx=min(self.width,l[1])-1
self.cy=min(self.height,l[0])-1
self.cl=0
def csi_J(self,l):
if l[0]==0:
self.zero(self.cy,self.cx,self.height-1,self.width-1)
elif l[0]==1:
self.zero(0,0,self.cy,self.cx)
elif l[0]==2:
self.zero(0,0,self.height-1,self.width-1)
def csi_K(self,l):
if l[0]==0:
self.zero(self.cy,self.cx,self.cy,self.width-1)
elif l[0]==1:
self.zero(self.cy,0,self.cy,self.cx)
elif l[0]==2:
self.zero(self.cy,0,self.cy,self.width-1)
def csi_L(self,l):
for i in range(l[0]):
if self.cy<self.sb:
self.scroll_down(self.cy,self.sb)
def csi_M(self,l):
if self.cy>=self.st and self.cy<=self.sb:
for i in range(l[0]):
self.scroll_up(self.cy,self.sb)
def csi_P(self,l):
w,cx,cy=self.width,self.cx,self.cy
end=self.peek(cy,cx,cy,w)
self.csi_K([0])
self.poke(cy,cx,end[l[0]:])
def csi_X(self,l):
self.zero(self.cy,self.cx,self.cy,self.cx+l[0])
def csi_a(self,l):
self.csi_C(l)
def csi_c(self,l):
#'\x1b[?0c' 0-8 cursor size
pass
def csi_d(self,l):
self.cy=min(self.height,l[0])-1
def csi_e(self,l):
self.csi_B(l)
def csi_f(self,l):
self.csi_H(l)
def csi_h(self,l):
if l[0]==4:
pass
# print "insert on"
def csi_l(self,l):
if l[0]==4:
pass
# print "insert off"
def csi_m(self,l):
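		# SGR (Select Graphic Rendition): 0/27/39/49 reset the attributes,
		# 1 sets the bold bit (0x0800), 7 selects inverse video, 30-37 set
		# the foreground colour and 40-47 the background colour; the result
		# is packed into self.sgr as (bg<<16)|(fg<<8).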
for i in l:
if i==0 or i==39 or i==49 or i==27:
self.sgr=0x000700
elif i==1:
self.sgr=(self.sgr|0x000800)
elif i==7:
self.sgr=0x070000
elif i>=30 and i<=37:
c=i-30
self.sgr=(self.sgr&0xff08ff)|(c<<8)
elif i>=40 and i<=47:
c=i-40
self.sgr=(self.sgr&0x00ffff)|(c<<16)
# else:
# print "CSI sgr ignore",l,i
# print 'sgr: %r %x'%(l,self.sgr)
def csi_r(self,l):
if len(l)<2: l=[0,self.height]
self.st=min(self.height-1,l[0]-1)
self.sb=min(self.height-1,l[1]-1)
self.sb=max(self.st,self.sb)
def csi_s(self,l):
self.esc_save(0)
def csi_u(self,l):
self.esc_restore(0)
def escape(self):
e=self.buf
if len(e)>32:
# print "error %r"%e
self.buf=""
elif e in self.esc_seq:
self.esc_seq[e](e)
self.buf=""
else:
for r,f in self.esc_re:
mo=r.match(e)
if mo:
f(e,mo)
self.buf=""
break
# if self.buf=='': print "ESC %r\n"%e
def write(self,s):
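		# Feed raw pty output through the state machine: bytes are decoded,
		# escape sequences accumulate in self.buf until a handler matches,
		# and everything else is echoed into the screen buffer.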
s = s.decode('utf-8')
for i in s:
if len(self.buf) or (i in self.esc_seq):
self.buf+=i
self.escape()
elif i == '\x1b':
self.buf+=i
else:
self.echo(i)
def read(self):
b=self.outbuf
self.outbuf=""
return b
def dumphtml(self,color=1):
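		# Render the screen as HTML: runs of cells sharing the same colours
		# become <span class="fN bM"> elements; if the output is identical
		# to the previous call, a short <idem/> document is returned instead.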
h=self.height
w=self.width
r=""
span=""
span_bg,span_fg=-1,-1
for i in range(h*w):
q,c=divmod(self.scr[i],256)
if color:
bg,fg=divmod(q,256)
else:
bg,fg=0,7
if i==self.cy*w+self.cx:
bg,fg=1,7
if (bg!=span_bg or fg!=span_fg or i==h*w-1):
if len(span):
if PY2:
span = span.encode('utf-8')
r+='<span class="f%d b%d">%s</span>'%(span_fg,span_bg,cgi.escape(span))
span=""
span_bg,span_fg=bg,fg
if c == 0:
span+=' '
elif c > 0x10000:
span+='?'
else:
if PY2:
span+=unichr(c&0xFFFF)
else:
span+=chr(c&0xFFFF)
if i%w==w-1:
span+='\n'
r='<?xml version="1.0" encoding="UTF-8"?><pre class="term">%s</pre>'%r
if self.last_html==r:
return '<?xml version="1.0"?><idem></idem>'
else:
self.last_html=r
# print self
return r
def __repr__(self):
d=self.scr
r=""
for i in range(self.height):
r+="|%s|\n"%d[self.width*i:self.width*(i+1)]
return r
class SynchronizedMethod:
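	# Wraps a bound method so that every call runs while holding the shared lock.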
def __init__(self,lock,orig):
self.lock=lock
self.orig=orig
def __call__(self,*l):
self.lock.acquire()
r=self.orig(*l)
self.lock.release()
return r
class Multiplex:
def __init__(self,cmd=None):
#signal.signal(signal.SIGCHLD, signal.SIG_IGN)
self.cmd=cmd
self.proc={}
self.lock=threading.RLock()
self.thread=threading.Thread(target=self.loop)
self.alive=1
# synchronize methods
for name in ['create','fds','proc_read','proc_write','dump','die','run']:
orig=getattr(self,name)
setattr(self,name,SynchronizedMethod(self.lock,orig))
self.thread.start()
def create(self,w=80,h=25):
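		# Fork a pty: the child closes inherited descriptors and execs the
		# configured command, /bin/login (when root) or ssh to localhost;
		# the parent makes the pty non-blocking, sets the window size and
		# registers a Terminal instance for the new session.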
pid,fd=pty.fork()
if pid==0:
try:
fdl=[int(i) for i in os.listdir('/proc/self/fd')]
except OSError:
fdl=range(256)
for i in [i for i in fdl if i>2]:
try:
os.close(i)
except OSError:
pass
if self.cmd:
cmd=['/bin/sh','-c',self.cmd]
elif os.getuid()==0:
cmd=['/bin/login']
else:
sys.stdout.write("Login: ")
sys.stdout.flush()
login=sys.stdin.readline().strip()
if re.match('^[0-9A-Za-z-_. ]+$',login):
cmd=['ssh']
cmd+=['-oPreferredAuthentications=keyboard-interactive,password']
cmd+=['-oNoHostAuthenticationForLocalhost=yes']
cmd+=['-oLogLevel=FATAL']
cmd+=['-F/dev/null','-l',login,'localhost']
else:
os._exit(0)
env={}
env["COLUMNS"]=str(w)
env["LINES"]=str(h)
env["TERM"]="linux"
env["PATH"]=os.environ['PATH']
os.execvpe(cmd[0],cmd,env)
else:
fcntl.fcntl(fd, fcntl.F_SETFL, os.O_NONBLOCK)
# python bug http://python.org/sf/1112949 on amd64
fcntl.ioctl(fd, struct.unpack('i',struct.pack('I',termios.TIOCSWINSZ))[0], struct.pack("HHHH",h,w,0,0))
self.proc[fd]={'pid':pid,'term':Terminal(w,h),'buf':'','time':time.time()}
return fd
def die(self):
self.alive=0
def run(self):
return self.alive
def fds(self):
return self.proc.keys()
def proc_kill(self,fd):
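		# Mark the given session as dead, then reap every session that has
		# been idle for more than 120 seconds.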
if fd in self.proc:
self.proc[fd]['time']=0
t=time.time()
		for i in list(self.proc.keys()): # copy the keys: entries may be deleted below
t0=self.proc[i]['time']
if (t-t0)>120:
try:
os.close(i)
os.kill(self.proc[i]['pid'],signal.SIGTERM)
except (IOError,OSError):
pass
del self.proc[i]
def proc_read(self,fd):
try:
t=self.proc[fd]['term']
t.write(os.read(fd,65536))
reply=t.read()
if reply:
os.write(fd,reply)
self.proc[fd]['time']=time.time()
except (KeyError,IOError,OSError):
self.proc_kill(fd)
def proc_write(self,fd,s):
try:
os.write(fd,s.encode('utf-8'))
except (IOError,OSError):
self.proc_kill(fd)
def dump(self,fd,color=1):
try:
return self.proc[fd]['term'].dumphtml(color)
except KeyError:
return False
def loop(self):
while self.run():
fds=self.fds()
i,o,e=select.select(fds, [], [], 1.0)
for fd in i:
self.proc_read(fd)
if len(i):
time.sleep(0.002)
for i in self.proc.keys():
try:
os.close(i)
os.kill(self.proc[i]['pid'],signal.SIGTERM)
except (IOError,OSError):
pass
class AchilTerm:
def __init__(self,cmd=None,index_file='achilterm.html',lite=False,width=80,height=25):
os.chdir(os.path.normpath(os.path.dirname(__file__)))
if lite:
index_file = 'achiltermlite.html'
self.files={}
for i in ['css','html','js']:
for j in glob.glob('*.%s'%i):
self.files[j]=open(j).read()
self.files['index']=open(index_file).read() % {'width': width, 'height': height}
self.mime = mimetypes.types_map.copy()
self.mime['.html']= 'text/html; charset=UTF-8'
self.multi = Multiplex(cmd)
self.session = {}
def __call__(self, environ, start_response):
req = webob.Request(environ)
res = webob.Response()
if req.environ['PATH_INFO'].endswith('/u'):
s=req.params.get("s","")
k=req.params.get("k","")
c=req.params.get("c","")
w=int(req.params.get("w", 0))
h=int(req.params.get("h", 0))
if s in self.session:
term=self.session[s]
else:
if not (w>2 and w<256 and h>2 and h<100):
w,h=80,25
term=self.session[s]=self.multi.create(w,h)
if k:
self.multi.proc_write(term,k)
time.sleep(0.002)
dump=self.multi.dump(term,c)
res.content_type = 'text/xml'
if isinstance(dump,str):
res.content_encoding = 'gzip'
zbuf=StringIO()
zfile=gzip.GzipFile(mode='wb', fileobj=zbuf)
if not PY2:
zfile.write(''.join(dump).encode('utf-8'))
else:
zfile.write(''.join(dump))
zfile.close()
zbuf=zbuf.getvalue()
res.write(zbuf)
else:
del self.session[s]
res.write('<?xml version="1.0"?><idem></idem>')
# print "sessions %r"%self.session
else:
n=os.path.basename(req.environ['PATH_INFO'])
if n in self.files:
res.content_type = self.mime.get(os.path.splitext(n)[1].lower(), 'application/octet-stream')
res.write(self.files[n])
else:
res.content_type = 'text/html; charset=UTF-8'
res.write(self.files['index'])
return res(environ, start_response)
class NoLogWSGIRequestHandler(WSGIRequestHandler):
def log_message(self, format, *args):
pass
def main():
parser = optparse.OptionParser(version='Achilterm version ' + __version__)
parser.add_option("-p", "--port", dest="port", default="8022", help="set the TCP port (default: 8022)")
parser.add_option("-c", "--command", dest="cmd", default=None,help="set the command (default: /bin/login or ssh localhost)")
parser.add_option("-l", "--log", action="store_true", dest="log",default=0,help="log requests to stderr (default: quiet mode)")
parser.add_option("-d", "--daemon", action="store_true", dest="daemon", default=0, help="run as daemon in the background")
parser.add_option("-P", "--pidfile",dest="pidfile",default="/var/run/achilterm.pid",help="set the pidfile (default: /var/run/achilterm.pid)")
parser.add_option("-i", "--index", dest="index_file", default=0, help="default index file (default: achilterm.html)")
parser.add_option("-u", "--uid", dest="uid", help="set the daemon's user id")
parser.add_option("-L", "--lite", action="store_true", dest="lite", default=0, help="use Achiltermlite")
parser.add_option("-w", "--width", dest="width", default="80", help="set the width (default: 80)")
parser.add_option("-H", "--height", dest="height", default="25", help="set the height (default: 25)")
(o, a) = parser.parse_args()
if o.daemon:
pid=os.fork()
if pid == 0:
#os.setsid() ?
os.setpgrp()
nullin = open('/dev/null', 'r')
nullout = open('/dev/null', 'w')
os.dup2(nullin.fileno(), sys.stdin.fileno())
os.dup2(nullout.fileno(), sys.stdout.fileno())
os.dup2(nullout.fileno(), sys.stderr.fileno())
if os.getuid()==0 and o.uid:
try:
os.setuid(int(o.uid))
except:
os.setuid(pwd.getpwnam(o.uid).pw_uid)
else:
try:
open(o.pidfile,'w+').write(str(pid)+'\n')
except:
pass
print('Achilterm at http://localhost:%s/ pid: %d' % (o.port,pid))
sys.exit(0)
else:
print('Achilterm at http://localhost:%s/' % o.port)
if o.log:
handler_class = WSGIRequestHandler
else:
handler_class = NoLogWSGIRequestHandler
if o.index_file:
at=AchilTerm(o.cmd,os.path.abspath(o.index_file),o.lite,o.width,o.height)
else:
at=AchilTerm(o.cmd,lite=o.lite,width=o.width,height=o.height)
try:
make_server('localhost', int(o.port), at, handler_class=handler_class).serve_forever()
except KeyboardInterrupt:
sys.excepthook(*sys.exc_info())
at.multi.die()
if __name__ == '__main__':
main() | Achilterm | /Achilterm-0.21.tar.gz/Achilterm-0.21/achilterm/achilterm.py | achilterm.py |
import tornado.httpserver
import tornado.ioloop
import tornado.options
import tornado.web
import tornado.wsgi
from tornado.options import define, options
import array,cgi,fcntl,glob,mimetypes,optparse,os,pty,random,re,signal,select,sys,threading,time,termios,struct,pwd
import webob
import gzip
define("port", default=8888, help="run on the given port", type=int)
try:
from cStringIO import StringIO
except ImportError:
try:
from StringIO import StringIO
except:
from io import BytesIO as StringIO
PY2 = sys.version[0] == '2'
class Terminal:
def __init__(self,width=80,height=24):
self.width=width
self.height=height
self.init()
self.reset()
def init(self):
self.esc_seq={
"\x00": None,
"\x05": self.esc_da,
"\x07": None,
"\x08": self.esc_0x08,
"\x09": self.esc_0x09,
"\x0a": self.esc_0x0a,
"\x0b": self.esc_0x0a,
"\x0c": self.esc_0x0a,
"\x0d": self.esc_0x0d,
"\x0e": None,
"\x0f": None,
"\x1b#8": None,
"\x1b=": None,
"\x1b>": None,
"\x1b(0": None,
"\x1b(A": None,
"\x1b(B": None,
"\x1b[c": self.esc_da,
"\x1b[0c": self.esc_da,
"\x1b]R": None,
"\x1b7": self.esc_save,
"\x1b8": self.esc_restore,
"\x1bD": None,
"\x1bE": None,
"\x1bH": None,
"\x1bM": self.esc_ri,
"\x1bN": None,
"\x1bO": None,
"\x1bZ": self.esc_da,
"\x1ba": None,
"\x1bc": self.reset,
"\x1bn": None,
"\x1bo": None,
}
for k,v in self.esc_seq.items():
			if v is None:
self.esc_seq[k]=self.esc_ignore
# regex
d={
r'\[\??([0-9;]*)([@ABCDEFGHJKLMPXacdefghlmnqrstu`])' : self.csi_dispatch,
r'\]([^\x07]+)\x07' : self.esc_ignore,
}
self.esc_re=[]
for k,v in d.items():
self.esc_re.append((re.compile('\x1b'+k),v))
# define csi sequences
self.csi_seq={
'@': (self.csi_at,[1]),
'`': (self.csi_G,[1]),
'J': (self.csi_J,[0]),
'K': (self.csi_K,[0]),
}
for i in [i[4] for i in dir(self) if i.startswith('csi_') and len(i)==5]:
if i not in self.csi_seq:
self.csi_seq[i]=(getattr(self,'csi_'+i),[1])
def reset(self,s=""):
self.scr=array.array('i',[0x000700]*(self.width*self.height))
self.st=0
self.sb=self.height-1
self.cx_bak=self.cx=0
self.cy_bak=self.cy=0
self.cl=0
self.sgr=0x000700
self.buf=""
self.outbuf=""
self.last_html=""
def peek(self,y1,x1,y2,x2):
return self.scr[self.width*y1+x1:self.width*y2+x2]
def poke(self,y,x,s):
pos=self.width*y+x
self.scr[pos:pos+len(s)]=s
def zero(self,y1,x1,y2,x2):
w=self.width*(y2-y1)+x2-x1+1
z=array.array('i',[0x000700]*w)
self.scr[self.width*y1+x1:self.width*y2+x2+1]=z
def scroll_up(self,y1,y2):
self.poke(y1,0,self.peek(y1+1,0,y2,self.width))
self.zero(y2,0,y2,self.width-1)
def scroll_down(self,y1,y2):
self.poke(y1+1,0,self.peek(y1,0,y2-1,self.width))
self.zero(y1,0,y1,self.width-1)
def scroll_right(self,y,x):
self.poke(y,x+1,self.peek(y,x,y,self.width))
self.zero(y,x,y,x)
def cursor_down(self):
if self.cy>=self.st and self.cy<=self.sb:
self.cl=0
q,r=divmod(self.cy+1,self.sb+1)
if q:
self.scroll_up(self.st,self.sb)
self.cy=self.sb
else:
self.cy=r
def cursor_right(self):
q,r=divmod(self.cx+1,self.width)
if q:
self.cl=1
else:
self.cx=r
def echo(self,c):
if self.cl:
self.cursor_down()
self.cx=0
self.scr[(self.cy*self.width)+self.cx]=self.sgr|ord(c)
self.cursor_right()
def esc_0x08(self,s):
self.cx=max(0,self.cx-1)
def esc_0x09(self,s):
x=self.cx+8
q,r=divmod(x,8)
self.cx=(q*8)%self.width
def esc_0x0a(self,s):
self.cursor_down()
def esc_0x0d(self,s):
self.cl=0
self.cx=0
def esc_save(self,s):
self.cx_bak=self.cx
self.cy_bak=self.cy
def esc_restore(self,s):
self.cx=self.cx_bak
self.cy=self.cy_bak
self.cl=0
def esc_da(self,s):
self.outbuf="\x1b[?6c"
def esc_ri(self,s):
self.cy=max(self.st,self.cy-1)
if self.cy==self.st:
self.scroll_down(self.st,self.sb)
def esc_ignore(self,*s):
pass
# print "term:ignore: %s"%repr(s)
def csi_dispatch(self,seq,mo):
# CSI sequences
s=mo.group(1)
c=mo.group(2)
f=self.csi_seq.get(c,None)
if f:
try:
l=[min(int(i),1024) for i in s.split(';') if len(i)<4]
except ValueError:
l=[]
if len(l)==0:
l=f[1]
f[0](l)
# else:
# print 'csi ignore',c,l
def csi_at(self,l):
for i in range(l[0]):
self.scroll_right(self.cy,self.cx)
def csi_A(self,l):
self.cy=max(self.st,self.cy-l[0])
def csi_B(self,l):
self.cy=min(self.sb,self.cy+l[0])
def csi_C(self,l):
self.cx=min(self.width-1,self.cx+l[0])
self.cl=0
def csi_D(self,l):
self.cx=max(0,self.cx-l[0])
self.cl=0
def csi_E(self,l):
self.csi_B(l)
self.cx=0
self.cl=0
def csi_F(self,l):
self.csi_A(l)
self.cx=0
self.cl=0
def csi_G(self,l):
self.cx=min(self.width,l[0])-1
def csi_H(self,l):
if len(l)<2: l=[1,1]
self.cx=min(self.width,l[1])-1
self.cy=min(self.height,l[0])-1
self.cl=0
def csi_J(self,l):
if l[0]==0:
self.zero(self.cy,self.cx,self.height-1,self.width-1)
elif l[0]==1:
self.zero(0,0,self.cy,self.cx)
elif l[0]==2:
self.zero(0,0,self.height-1,self.width-1)
def csi_K(self,l):
if l[0]==0:
self.zero(self.cy,self.cx,self.cy,self.width-1)
elif l[0]==1:
self.zero(self.cy,0,self.cy,self.cx)
elif l[0]==2:
self.zero(self.cy,0,self.cy,self.width-1)
def csi_L(self,l):
for i in range(l[0]):
if self.cy<self.sb:
self.scroll_down(self.cy,self.sb)
def csi_M(self,l):
if self.cy>=self.st and self.cy<=self.sb:
for i in range(l[0]):
self.scroll_up(self.cy,self.sb)
def csi_P(self,l):
w,cx,cy=self.width,self.cx,self.cy
end=self.peek(cy,cx,cy,w)
self.csi_K([0])
self.poke(cy,cx,end[l[0]:])
def csi_X(self,l):
self.zero(self.cy,self.cx,self.cy,self.cx+l[0])
def csi_a(self,l):
self.csi_C(l)
def csi_c(self,l):
#'\x1b[?0c' 0-8 cursor size
pass
def csi_d(self,l):
self.cy=min(self.height,l[0])-1
def csi_e(self,l):
self.csi_B(l)
def csi_f(self,l):
self.csi_H(l)
def csi_h(self,l):
if l[0]==4:
pass
# print "insert on"
def csi_l(self,l):
if l[0]==4:
pass
# print "insert off"
def csi_m(self,l):
for i in l:
if i==0 or i==39 or i==49 or i==27:
self.sgr=0x000700
elif i==1:
self.sgr=(self.sgr|0x000800)
elif i==7:
self.sgr=0x070000
elif i>=30 and i<=37:
c=i-30
self.sgr=(self.sgr&0xff08ff)|(c<<8)
elif i>=40 and i<=47:
c=i-40
self.sgr=(self.sgr&0x00ffff)|(c<<16)
# else:
# print "CSI sgr ignore",l,i
# print 'sgr: %r %x'%(l,self.sgr)
def csi_r(self,l):
if len(l)<2: l=[0,self.height]
self.st=min(self.height-1,l[0]-1)
self.sb=min(self.height-1,l[1]-1)
self.sb=max(self.st,self.sb)
def csi_s(self,l):
self.esc_save(0)
def csi_u(self,l):
self.esc_restore(0)
def escape(self):
e=self.buf
if len(e)>32:
# print "error %r"%e
self.buf=""
elif e in self.esc_seq:
self.esc_seq[e](e)
self.buf=""
else:
for r,f in self.esc_re:
mo=r.match(e)
if mo:
f(e,mo)
self.buf=""
break
# if self.buf=='': print "ESC %r\n"%e
def write(self,s):
s = s.decode('utf-8')
for i in s:
if len(self.buf) or (i in self.esc_seq):
self.buf+=i
self.escape()
elif i == '\x1b':
self.buf+=i
else:
self.echo(i)
def read(self):
b=self.outbuf
self.outbuf=""
return b
def dumphtml(self,color=1):
h=self.height
w=self.width
r=""
span=""
span_bg,span_fg=-1,-1
for i in range(h*w):
q,c=divmod(self.scr[i],256)
if color:
bg,fg=divmod(q,256)
else:
bg,fg=0,7
if i==self.cy*w+self.cx:
bg,fg=1,7
if (bg!=span_bg or fg!=span_fg or i==h*w-1):
if len(span):
if PY2:
span = span.encode('utf-8')
r+='<span class="f%d b%d">%s</span>'%(span_fg,span_bg,cgi.escape(span))
span=""
span_bg,span_fg=bg,fg
if c == 0:
span+=' '
elif c > 0x10000:
span+='?'
else:
if PY2:
span+=unichr(c&0xFFFF)
else:
span+=chr(c&0xFFFF)
if i%w==w-1:
span+='\n'
r='<?xml version="1.0" encoding="UTF-8"?><pre class="term">%s</pre>'%r
if self.last_html==r:
return '<?xml version="1.0"?><idem></idem>'
else:
self.last_html=r
# print self
return r
def __repr__(self):
d=self.scr
r=""
for i in range(self.height):
r+="|%s|\n"%d[self.width*i:self.width*(i+1)]
return r
class SynchronizedMethod:
def __init__(self,lock,orig):
self.lock=lock
self.orig=orig
def __call__(self,*l):
self.lock.acquire()
r=self.orig(*l)
self.lock.release()
return r
class Multiplex:
def __init__(self,cmd=None):
#signal.signal(signal.SIGCHLD, signal.SIG_IGN)
self.cmd=cmd
self.proc={}
self.lock=threading.RLock()
self.thread=threading.Thread(target=self.loop)
self.alive=1
# synchronize methods
for name in ['create','fds','proc_read','proc_write','dump','die','run']:
orig=getattr(self,name)
setattr(self,name,SynchronizedMethod(self.lock,orig))
self.thread.start()
def create(self,w=80,h=25):
pid,fd=pty.fork()
if pid==0:
try:
fdl=[int(i) for i in os.listdir('/proc/self/fd')]
except OSError:
fdl=range(256)
for i in [i for i in fdl if i>2]:
try:
os.close(i)
except OSError:
pass
if self.cmd:
cmd=['/bin/sh','-c',self.cmd]
elif os.getuid()==0:
cmd=['/bin/login']
else:
sys.stdout.write("Login: ")
sys.stdout.flush()
login=sys.stdin.readline().strip()
if re.match('^[0-9A-Za-z-_. ]+$',login):
cmd=['ssh']
cmd+=['-oPreferredAuthentications=keyboard-interactive,password']
cmd+=['-oNoHostAuthenticationForLocalhost=yes']
cmd+=['-oLogLevel=FATAL']
cmd+=['-F/dev/null','-l',login,'localhost']
else:
os._exit(0)
env={}
env["COLUMNS"]=str(w)
env["LINES"]=str(h)
env["TERM"]="linux"
env["PATH"]=os.environ['PATH']
os.execvpe(cmd[0],cmd,env)
else:
fcntl.fcntl(fd, fcntl.F_SETFL, os.O_NONBLOCK)
# python bug http://python.org/sf/1112949 on amd64
fcntl.ioctl(fd, struct.unpack('i',struct.pack('I',termios.TIOCSWINSZ))[0], struct.pack("HHHH",h,w,0,0))
self.proc[fd]={'pid':pid,'term':Terminal(w,h),'buf':'','time':time.time()}
return fd
def die(self):
self.alive=0
def run(self):
return self.alive
def fds(self):
return self.proc.keys()
def proc_kill(self,fd):
if fd in self.proc:
self.proc[fd]['time']=0
t=time.time()
		for i in list(self.proc.keys()): # copy the keys: entries may be deleted below
t0=self.proc[i]['time']
if (t-t0)>120:
try:
os.close(i)
os.kill(self.proc[i]['pid'],signal.SIGTERM)
except (IOError,OSError):
pass
del self.proc[i]
def proc_read(self,fd):
try:
t=self.proc[fd]['term']
t.write(os.read(fd,65536))
reply=t.read()
if reply:
os.write(fd,reply)
self.proc[fd]['time']=time.time()
except (KeyError,IOError,OSError):
self.proc_kill(fd)
def proc_write(self,fd,s):
try:
os.write(fd,s.encode('utf-8'))
except (IOError,OSError):
self.proc_kill(fd)
def dump(self,fd,color=1):
try:
return self.proc[fd]['term'].dumphtml(color)
except KeyError:
return False
def loop(self):
while self.run():
fds=self.fds()
i,o,e=select.select(fds, [], [], 1.0)
for fd in i:
self.proc_read(fd)
if len(i):
time.sleep(0.002)
for i in self.proc.keys():
try:
os.close(i)
os.kill(self.proc[i]['pid'],signal.SIGTERM)
except (IOError,OSError):
pass
class AchilTerm:
def __init__(self,cmd=None,index_file='achilterm.html',lite=False):
if lite:
index_file = 'achiltermlite.html'
self.files={}
for i in ['css','html','js']:
for j in glob.glob('*.%s'%i):
self.files[j]=open(j).read()
self.files['index']=open(index_file).read()
self.mime = mimetypes.types_map.copy()
self.mime['.html']= 'text/html; charset=UTF-8'
self.multi = Multiplex(cmd)
self.session = {}
def __call__(self, environ, start_response):
req = webob.Request(environ)
res = webob.Response()
if req.environ['PATH_INFO'].endswith('/u'):
s=req.params.get("s","")
k=req.params.get("k","")
c=req.params.get("c","")
w=int(req.params.get("w", 0))
h=int(req.params.get("h", 0))
if s in self.session:
term=self.session[s]
else:
if not (w>2 and w<256 and h>2 and h<100):
w,h=80,25
term=self.session[s]=self.multi.create(w,h)
if k:
self.multi.proc_write(term,k)
time.sleep(0.002)
dump=self.multi.dump(term,c)
res.content_type = 'text/xml'
if isinstance(dump,str):
res.content_encoding = 'gzip'
zbuf=StringIO()
zfile=gzip.GzipFile(mode='wb', fileobj=zbuf)
if not PY2:
zfile.write(''.join(dump).encode('utf-8'))
else:
zfile.write(''.join(dump))
zfile.close()
zbuf=zbuf.getvalue()
res.write(zbuf)
else:
del self.session[s]
res.write('<?xml version="1.0"?><idem></idem>')
# print "sessions %r"%self.session
else:
n=os.path.basename(req.environ['PATH_INFO'])
if n in self.files:
res.content_type = self.mime.get(os.path.splitext(n)[1].lower(), 'application/octet-stream')
res.write(self.files[n])
else:
res.content_type = 'text/html; charset=UTF-8'
res.write(self.files['index'])
return res(environ, start_response)
def main():
parser = optparse.OptionParser()
parser.add_option("-p", "--port", dest="port", default="8022", help="Set the TCP port (default: 8022)")
parser.add_option("-c", "--command", dest="cmd", default=None,help="set the command (default: /bin/login or ssh localhost)")
parser.add_option("-l", "--log", action="store_true", dest="log",default=0,help="log requests to stderr (default: quiet mode)")
parser.add_option("-d", "--daemon", action="store_true", dest="daemon", default=0, help="run as daemon in the background")
parser.add_option("-P", "--pidfile",dest="pidfile",default="/var/run/achilterm.pid",help="set the pidfile (default: /var/run/achilterm.pid)")
parser.add_option("-i", "--index", dest="index_file", default="achilterm.html",help="default index file (default: achilterm.html)")
parser.add_option("-u", "--uid", dest="uid", help="Set the daemon's user id")
parser.add_option("-L", "--lite", action="store_true", dest="lite", default=0, help="use Achiltermlite")
(o, a) = parser.parse_args()
if o.daemon:
pid=os.fork()
if pid == 0:
#os.setsid() ?
os.setpgrp()
			nullin = open('/dev/null', 'r')
			nullout = open('/dev/null', 'w')
os.dup2(nullin.fileno(), sys.stdin.fileno())
os.dup2(nullout.fileno(), sys.stdout.fileno())
os.dup2(nullout.fileno(), sys.stderr.fileno())
if os.getuid()==0 and o.uid:
try:
os.setuid(int(o.uid))
except:
os.setuid(pwd.getpwnam(o.uid).pw_uid)
else:
try:
				open(o.pidfile,'w+').write(str(pid)+'\n')
except:
pass
print('Achilterm at http://localhost:%s/ pid: %d' % (o.port,pid))
sys.exit(0)
else:
print('Achilterm at http://localhost:%s/' % o.port)
at=AchilTerm(o.cmd,o.index_file,o.lite)
try:
http_server = tornado.httpserver.HTTPServer(tornado.wsgi.WSGIContainer(at))
		http_server.listen(int(o.port)) # listen on the -p port so it matches the printed URL
#wsgiref.simple_server.make_server('localhost', int(o.port), at).serve_forever()
tornado.ioloop.IOLoop.current().start()
except KeyboardInterrupt:
sys.excepthook(*sys.exc_info())
at.multi.die()
if __name__ == '__main__':
main() | Achilterm | /Achilterm-0.21.tar.gz/Achilterm-0.21/achilterm/achiltorn.py | achiltorn.py |
Achoo. A fluent interface for testing Python objects.
Copyright (C) 2008 Quuxo Software.
<http://web.quuxo.com/projects/achoo>
Achoo is a fluent interface for testing Python objects. It makes it easy
to make assertions about objects (is this object equal to this other one?)
and about function and method calls (does calling this function with these
arguments raise an error?).
It is designed to be used in conjunction with a unit testing
framework like PyUnit's `unittest' module, shipped with all modern
Python distributions.
This module was inspired by Martin Fowler's article on fluent interfaces[0],
by FEST Assert[1] and by too much perspiration from trying to use
unittest.TestCase's assert/fail methods.
Patches welcome!
[0] <http://www.martinfowler.com/bliki/FluentInterface.html>
[1] <http://fest.easytesting.org/assert/>
* Installation *
If you have a recent Setuptools[2] installed, just run this from the
command line:
easy_install Achoo
If you downloaded the Achoo distribution tarball, unpack it and
change to the Achoo directory, then run this from the command line:
python setup.py install
You may need to be root to do this.
[2] <http://pypi.python.org/pypi/setuptools>
* Using Achoo *
The 5 second demo:
from achoo import requiring
from achoo import calling
s = 'foo'
requiring(s).equal_to('foo').length(3)
calling(s.index).passing('quux').raises(ValueError)
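Because a failed assertion raises a plain AssertionError, Achoo expressions
can be dropped straight into a unittest test case. A minimal sketch (the test
class and method names below are made up for illustration; only the
`requiring' and `calling' calls are Achoo API):
import unittest
from achoo import requiring
from achoo import calling
class StringTest(unittest.TestCase):
    def test_index(self):
        s = 'foo'
        # 'foo'.index('o') returns 1
        requiring(s.index('o')).equal_to(1)
        # looking up a missing substring raises ValueError
        calling(s.index).passing('quux').raises(ValueError)
if __name__ == '__main__':
    unittest.main()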
For additional documentation, see the API documentation in the `doc'
directory or using Python's built-in help system.
* Licence *
Achoo is distributed under the GNU Lesser General Public Licence v3. As
such, it should be compatible with most other projects. See the included
files COPYING and COPYING.LESSER for the full details.
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Lesser General Public License as
published by the Free Software Foundation, either version 3 of the
License, or (at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU Lesser General Public
License along with this program. If not, see
<http://www.gnu.org/licenses/>. | Achoo | /Achoo-1.0.tar.gz/Achoo-1.0/README | README |
import sys
import gettext
_ = gettext.translation('achoo', fallback=True).ugettext
def requiring(value):
"""
Assertion builder factory for object properties.
To test an object, call C{requiring} and pass the object as the
sole argument. A C{ValueAssertionBuilder} is returned and can be used
to chain together assertions about it.
For example::
test_map = {'foo': 'bar'}
requiring(test_map)\
.length(1)\
.contains('foo')\
.index('foo').equal_to('bar')
@return: an instance of C{ValueAssertionBuilder} wrapping the value
passed in
@param value: an object to be tested
"""
# XXX maybe put some logging here? what about lookup
# for different builder types depending on the given type?
# otherwise this could probably be an alias for the builder
# constructor
return ValueAssertionBuilder(value)
class ValueAssertionBuilder(object):
"""
An assertion builder for testing properties of objects.
This object can be used to create a set of assertions about various
properties of an object. Most methods return a builder with the
same object so that more than one assertion to be made about it.
If any of the assertions fail, an C{AssertionError} is raised.
"""
def __init__(self, value, invert=False):
"""
Constructs a new builder.
In general, you want to use the C{requiring} function instead
of this directly.
@param value: an object to be tested
@param invert: optionally inverts the sense of the next assertion
if C{True}
"""
self.value = value
self.invert_sense = invert
@property
def is_not(self):
"""
Inverts the sense of the next assertion.
        This property causes the boolean sense of the next assertion
        to be inverted. That is, if a call to C{equal_to} is prefixed
        with C{is_not}, it will raise an error if the value object is
        equal to the given value. Only the next assertion is affected;
        subsequent assertions revert to the normal sense unless also
        prefixed with C{is_not}.
For example::
s = 'foo'
requiring(s.length).is_not.equal_to(0)
"""
return ValueAssertionBuilder(self.value, True)
def equal_to(self, other):
"""
Asserts the value object is equal to some other object.
@return: this assertion builder
@param other: another object to test against the builder's
value object
@raise AssertionError: if the builder's value is not equal to
C{other}
"""
        if (self.value != other) != self.invert_sense:
raise self._error(_('Value `%s\' expected to equal `%s\''),
_('Value `%s\' not expected to equal `%s\''),
other)
self.invert_sense = False
return self
def same_as(self, other):
"""
Asserts the value object is the same as another object.
@return: this assertion builder
@param other: another object to test for same identity
@raise AssertionError: if the builder's value is not the same
object as C{other}
"""
        if (self.value is not other) != self.invert_sense:
raise self._error(_('Value `%s\' expected to be `%s\''),
_('Value `%s\' not expected to be `%s\''),
other)
self.invert_sense = False
return self
def is_none(self):
"""
Asserts the value object is C{None}.
@return: this assertion builder
@raise AssertionError: if the builder's value is not C{None}
"""
return self.same_as(None)
def is_not_none(self):
"""
Asserts the value object is not C{None}.
@return: this assertion builder
@raise AssertionError: if the builder's value is C{None}
"""
        if (self.value is None) != self.invert_sense:
            raise self._error(_('Value `%s\' not expected to be `%s\''),
                              _('Value `%s\' expected to be `%s\''),
                              None)
self.invert_sense = False
return self
def is_a(self, clazz):
"""
Asserts the value object is an instance of a particular type.
@return: this assertion builder
@param clazz: type the value must be an instance of
@raise AssertionError: if the builder's value is not an instance
of C{clazz}
"""
        if (not isinstance(self.value, clazz)) != self.invert_sense:
raise self._error(_('Value `%s\' expected to be a `%s\''),
_('Value `%s\' not expected to be a `%s\''),
clazz)
self.invert_sense = False
return self
def length(self, length):
"""
Asserts the value object has a specific length.
@return: this assertion builder
@param length: the value that must be returned by passing
the builder value to the C{len} built-in
@raise AssertionError: if the length of the builder's value is
not equal to C{length}
"""
        if (len(self.value) != length) != self.invert_sense:
raise self._error(_('Length of `%s\' expected to equal `%s\''),
_('Length of `%s\' not expected to equal `%s\''),
length)
self.invert_sense = False
return self
def contains(self, element):
"""
Asserts the value object contains a specific element.
@return: this assertion builder
@param element: the element that must be contained by the
value object, as tested using the keyword C{in}
@raise AssertionError: if the builder's value does not contain
C{element}
"""
        if (element not in self.value) != self.invert_sense:
raise self._error(_('Value `%s\' expected to contain `%s\''),
_('Value of `%s\' not expected to contain `%s\''),
element)
self.invert_sense = False
return self
def index(self, index):
"""
Asserts the value object has a specific index.
B{Note:} this method returns a builder for the object at the
given index, allowing assertions to be made about that object
but not allowing any additional assertions to be made about
the original object.
The C{is_not} modifier has no effect on this method.
For example::
test_map = {'foo': 'bar'}
requiring(test_map).index('foo').equal_to('bar')
@return: an assertion builder for the object at the given
index
@param index: the index that must be contained by the
value object, as tested using the keyword C{in}
@raise AssertionError: if the builder's value does not contain
an element at C{index}
"""
if self.invert_sense:
raise AssertionError\
(_('A call to `index\' cannot be preceded by `is_not\''))
try:
return ValueAssertionBuilder(self.value[index])
except KeyError:
raise self._error(_('Value `%s\' expected to contain key `%s\''),
None, index)
except IndexError:
raise self._error(_('Value `%s\' expected to contain index `%s\''),
None, index)
def _error(self, message, inverse_message, other):
"""
Returns a new C{AssertionError} with an appropriate message.
"""
return AssertionError((message
if not self.invert_sense
else inverse_message) % (self.value, other))
def calling(callabl):
"""
Assertion builder factory for callable objects.
To test a callable, call C{requiring} and pass the object as the
sole argument. A C{ValueAssertionBuilder} is returned and can be used
to chain together assertions about it.
For example::
incr = lambda x: x + 1
calling(incr).passing(1).returns(2)
calling(incr).raises(TypeError)
@return: an instance of C{CallableAssertionBuilder} wrapping the
callable passed in
@param callabl: a callable object (function, method or similar) to
be tested
"""
# XXX maybe put some logging here? what about lookup
# for different builder types depending on the given type?
# otherwise this could probably be an alias for the builder
# constructor
return CallableAssertionBuilder(callabl)
class CallableAssertionBuilder(object):
"""
An assertion builder for testing callable objects.
This object can be used to create a set of assertions about
conditions when calling a callable object, such as a function
or method.
To provide parameters to the callable, use the C{passing} method.
The callable is not actually executed until one of the return
or raises methods is called.
"""
def __init__(self, callabl):
"""
Constructs a new builder.
In general, you want to use the C{calling} function instead
of this directly.
@param callabl: an object to be tested
"""
self.callable = callabl
self.args = None
self.kwargs = None
def passing(self, *args, **kwargs):
"""
Applies a set of arguments to be passed to the callable.
Use this method to specify what positional and keyword arguments
should be passed to the callable.
@return: this assertion builder
@param args: positional arguments to be passed to the callable
@param kwargs: keyword arguments to be passed to the callable
"""
self.args = args
self.kwargs = kwargs
return self
def returns(self, value=None):
"""
Invokes the callable, optionally checking the returned value.
Calling this method will cause the callable to be invoked,
with any arguments specified using C{passing} and returning
a C{ValueAssertionBuilder} for the object returned by the
callable.
An object can be optionally passed to this method for
conveniently checking the value of the object returned
by the callable.
@return: a C{ValueAssertionBuilder} for the object returned
by the invocation of the callable
@param value: optional value that must be equal to the
object returned by invoking the callable
@raise AssertionError: if the returned value is not equal to
C{value}
"""
ret = self._invoke()
builder = requiring(ret)
if value is not None:
builder.equal_to(value)
return builder
def returns_none(self):
"""
Invokes the callable and ensures the return value is C{None}.
Calling this method will cause the callable to be invoked,
with any arguments specified using C{passing}.
@raise AssertionError: if the value returned by invoking
the callable is not equal to C{None}
"""
self.returns().is_none()
def raises(self, error):
"""
Invokes the callable, ensuring it raises an exception.
Calling this method will cause the callable to be invoked,
with any arguments specified using C{passing}.
A C{ValueAssertionBuilder} for the exception is returned,
allowing its properties to be examined.
@return: a C{ValueAssertionBuilder} for the exception raised
by the invocation of the callable
@param error: type of the exception to be raised
@raise AssertionError: if the callable invocation did not
raise an exception or if it raised an exception that was
not of type C{BaseException}
"""
try:
self._invoke()
except:
e_type, e_value, tb = sys.exc_info()
if e_type == error:
return requiring(e_value)
raise AssertionError(_('Calling `%s\' raised a `%s\' error')
% (self.callable, e_type))
else:
raise AssertionError(_('Calling `%s\' did not raise any error')
% self.callable)
def _invoke(self):
"""
Invokes the callable with any parameters that have been specified.
@return: the return value from the callable invocation
"""
if self.args and self.kwargs:
return self.callable(*self.args, **self.kwargs)
if self.args:
return self.callable(*self.args)
if self.kwargs:
return self.callable(**self.kwargs)
return self.callable() | Achoo | /Achoo-1.0.tar.gz/Achoo-1.0/achoo.py | achoo.py |
import sys
DEFAULT_VERSION = "0.6c8"
DEFAULT_URL = "http://pypi.python.org/packages/%s/s/setuptools/" % sys.version[:3]
md5_data = {
'setuptools-0.6b1-py2.3.egg': '8822caf901250d848b996b7f25c6e6ca',
'setuptools-0.6b1-py2.4.egg': 'b79a8a403e4502fbb85ee3f1941735cb',
'setuptools-0.6b2-py2.3.egg': '5657759d8a6d8fc44070a9d07272d99b',
'setuptools-0.6b2-py2.4.egg': '4996a8d169d2be661fa32a6e52e4f82a',
'setuptools-0.6b3-py2.3.egg': 'bb31c0fc7399a63579975cad9f5a0618',
'setuptools-0.6b3-py2.4.egg': '38a8c6b3d6ecd22247f179f7da669fac',
'setuptools-0.6b4-py2.3.egg': '62045a24ed4e1ebc77fe039aa4e6f7e5',
'setuptools-0.6b4-py2.4.egg': '4cb2a185d228dacffb2d17f103b3b1c4',
'setuptools-0.6c1-py2.3.egg': 'b3f2b5539d65cb7f74ad79127f1a908c',
'setuptools-0.6c1-py2.4.egg': 'b45adeda0667d2d2ffe14009364f2a4b',
'setuptools-0.6c2-py2.3.egg': 'f0064bf6aa2b7d0f3ba0b43f20817c27',
'setuptools-0.6c2-py2.4.egg': '616192eec35f47e8ea16cd6a122b7277',
'setuptools-0.6c3-py2.3.egg': 'f181fa125dfe85a259c9cd6f1d7b78fa',
'setuptools-0.6c3-py2.4.egg': 'e0ed74682c998bfb73bf803a50e7b71e',
'setuptools-0.6c3-py2.5.egg': 'abef16fdd61955514841c7c6bd98965e',
'setuptools-0.6c4-py2.3.egg': 'b0b9131acab32022bfac7f44c5d7971f',
'setuptools-0.6c4-py2.4.egg': '2a1f9656d4fbf3c97bf946c0a124e6e2',
'setuptools-0.6c4-py2.5.egg': '8f5a052e32cdb9c72bcf4b5526f28afc',
'setuptools-0.6c5-py2.3.egg': 'ee9fd80965da04f2f3e6b3576e9d8167',
'setuptools-0.6c5-py2.4.egg': 'afe2adf1c01701ee841761f5bcd8aa64',
'setuptools-0.6c5-py2.5.egg': 'a8d3f61494ccaa8714dfed37bccd3d5d',
'setuptools-0.6c6-py2.3.egg': '35686b78116a668847237b69d549ec20',
'setuptools-0.6c6-py2.4.egg': '3c56af57be3225019260a644430065ab',
'setuptools-0.6c6-py2.5.egg': 'b2f8a7520709a5b34f80946de5f02f53',
'setuptools-0.6c7-py2.3.egg': '209fdf9adc3a615e5115b725658e13e2',
'setuptools-0.6c7-py2.4.egg': '5a8f954807d46a0fb67cf1f26c55a82e',
'setuptools-0.6c7-py2.5.egg': '45d2ad28f9750e7434111fde831e8372',
'setuptools-0.6c8-py2.3.egg': '50759d29b349db8cfd807ba8303f1902',
'setuptools-0.6c8-py2.4.egg': 'cba38d74f7d483c06e9daa6070cce6de',
'setuptools-0.6c8-py2.5.egg': '1721747ee329dc150590a58b3e1ac95b',
}
import sys, os
def _validate_md5(egg_name, data):
if egg_name in md5_data:
from md5 import md5
digest = md5(data).hexdigest()
if digest != md5_data[egg_name]:
print >>sys.stderr, (
"md5 validation of %s failed! (Possible download problem?)"
% egg_name
)
sys.exit(2)
return data
def use_setuptools(
version=DEFAULT_VERSION, download_base=DEFAULT_URL, to_dir=os.curdir,
download_delay=15
):
"""Automatically find/download setuptools and make it available on sys.path
`version` should be a valid setuptools version number that is available
as an egg for download under the `download_base` URL (which should end with
a '/'). `to_dir` is the directory where setuptools will be downloaded, if
it is not already available. If `download_delay` is specified, it should
be the number of seconds that will be paused before initiating a download,
should one be required. If an older version of setuptools is installed,
this routine will print a message to ``sys.stderr`` and raise SystemExit in
an attempt to abort the calling script.
"""
was_imported = 'pkg_resources' in sys.modules or 'setuptools' in sys.modules
def do_download():
egg = download_setuptools(version, download_base, to_dir, download_delay)
sys.path.insert(0, egg)
import setuptools; setuptools.bootstrap_install_from = egg
try:
import pkg_resources
except ImportError:
return do_download()
try:
pkg_resources.require("setuptools>="+version); return
except pkg_resources.VersionConflict, e:
if was_imported:
print >>sys.stderr, (
"The required version of setuptools (>=%s) is not available, and\n"
"can't be installed while this script is running. Please install\n"
" a more recent version first, using 'easy_install -U setuptools'."
"\n\n(Currently using %r)"
) % (version, e.args[0])
sys.exit(2)
else:
del pkg_resources, sys.modules['pkg_resources'] # reload ok
return do_download()
except pkg_resources.DistributionNotFound:
return do_download()
def download_setuptools(
version=DEFAULT_VERSION, download_base=DEFAULT_URL, to_dir=os.curdir,
delay = 15
):
"""Download setuptools from a specified location and return its filename
`version` should be a valid setuptools version number that is available
as an egg for download under the `download_base` URL (which should end
with a '/'). `to_dir` is the directory where the egg will be downloaded.
`delay` is the number of seconds to pause before an actual download attempt.
"""
import urllib2, shutil
egg_name = "setuptools-%s-py%s.egg" % (version,sys.version[:3])
url = download_base + egg_name
saveto = os.path.join(to_dir, egg_name)
src = dst = None
if not os.path.exists(saveto): # Avoid repeated downloads
try:
from distutils import log
if delay:
log.warn("""
---------------------------------------------------------------------------
This script requires setuptools version %s to run (even to display
help). I will attempt to download it for you (from
%s), but
you may need to enable firewall access for this script first.
I will start the download in %d seconds.
(Note: if this machine does not have network access, please obtain the file
%s
and place it in this directory before rerunning this script.)
---------------------------------------------------------------------------""",
version, download_base, delay, url
); from time import sleep; sleep(delay)
log.warn("Downloading %s", url)
src = urllib2.urlopen(url)
# Read/write all in one block, so we don't create a corrupt file
# if the download is interrupted.
data = _validate_md5(egg_name, src.read())
dst = open(saveto,"wb"); dst.write(data)
finally:
if src: src.close()
if dst: dst.close()
return os.path.realpath(saveto)
def main(argv, version=DEFAULT_VERSION):
"""Install or upgrade setuptools and EasyInstall"""
try:
import setuptools
except ImportError:
egg = None
try:
egg = download_setuptools(version, delay=0)
sys.path.insert(0,egg)
from setuptools.command.easy_install import main
return main(list(argv)+[egg]) # we're done here
finally:
if egg and os.path.exists(egg):
os.unlink(egg)
else:
if setuptools.__version__ == '0.0.1':
print >>sys.stderr, (
"You have an obsolete version of setuptools installed. Please\n"
"remove it from your system entirely before rerunning this script."
)
sys.exit(2)
req = "setuptools>="+version
import pkg_resources
try:
pkg_resources.require(req)
except pkg_resources.VersionConflict:
try:
from setuptools.command.easy_install import main
except ImportError:
from easy_install import main
main(list(argv)+[download_setuptools(delay=0)])
sys.exit(0) # try to force an exit
else:
if argv:
from setuptools.command.easy_install import main
main(argv)
else:
print "Setuptools version",version,"or greater has been installed."
print '(Run "ez_setup.py -U setuptools" to reinstall or upgrade.)'
def update_md5(filenames):
"""Update our built-in md5 registry"""
import re
from md5 import md5
for name in filenames:
base = os.path.basename(name)
f = open(name,'rb')
md5_data[base] = md5(f.read()).hexdigest()
f.close()
data = [" %r: %r,\n" % it for it in md5_data.items()]
data.sort()
repl = "".join(data)
import inspect
srcfile = inspect.getsourcefile(sys.modules[__name__])
f = open(srcfile, 'rb'); src = f.read(); f.close()
match = re.search("\nmd5_data = {\n([^}]+)}", src)
if not match:
print >>sys.stderr, "Internal error!"
sys.exit(2)
src = src[:match.start(1)] + repl + src[match.end(1):]
f = open(srcfile,'w')
f.write(src)
f.close()
if __name__=='__main__':
if len(sys.argv)>2 and sys.argv[1]=='--md5update':
update_md5(sys.argv[2:])
else:
main(sys.argv[1:]) | Achoo | /Achoo-1.0.tar.gz/Achoo-1.0/setup/ez_setup.py | ez_setup.py |
#############
Ackbar v. 0.2
#############
A Python program to assess Key Biodiversity Areas (KBA) delimitation.
The KBA standard requires detailed information from multiple sources in order
to take a practical decision about the boundaries of a new KBA.
This program only considers biological data (geographic occurrences of species)
to suggest and rank areas where a thorough assessment should be conducted.
The output is a set of shapefiles of such areas.
*************
Documentation
*************
Detailed documentation (rationale, installation, and usage) is hosted at the
Github `wiki <https://github.com/nrsalinas/ackbar/wiki>`_ of the project.
*************
Requirements
*************
- Python 3 interpreter.
- C++ compiler.
- `Shapely <https://pypi.org/project/Shapely/>`_.
- `Fiona <https://pypi.org/project/Fiona/>`_.
- `Pyproj <https://pypi.org/project/pyproj/>`_.
- Attention to detail.
*************
Installation
*************
Ackbar can be installed through pip::
pip install ackbar
*****
Usage
*****
All parameters are read from a configuration file, which should be passed as the
sole argument::
ackbar.py config.txt
Config file parameters are fully explained in the project
`wiki <https://github.com/nrsalinas/ackbar/wiki>`_ page.
*********************
License
*********************
Copyright 2020 Nelson R. Salinas
Ackbar is available under the GNU General Public License version 3. See LICENSE.md
for more information.
*******
Contact
*******
| Nelson R. Salinas
| Instituto de Investigación de Recursos Biológicos Alexander von Humboldt
| [email protected]
| Ackbar | /Ackbar-0.2.tar.gz/Ackbar-0.2/README.rst | README.rst |
### GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc.
<https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.
### Preamble
The GNU General Public License is a free, copyleft license for
software and other kinds of works.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom
to share and change all versions of a program--to make sure it remains
free software for all its users. We, the Free Software Foundation, use
the GNU General Public License for most of our software; it applies
also to any other work released this way by its authors. You can apply
it to your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you
have certain responsibilities if you distribute copies of the
software, or if you modify it: responsibilities to respect the freedom
of others.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.
Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the
manufacturer can do so. This is fundamentally incompatible with the
aim of protecting users' freedom to change the software. The
systematic pattern of such abuse occurs in the area of products for
individuals to use, which is precisely where it is most unacceptable.
Therefore, we have designed this version of the GPL to prohibit the
practice for those products. If such problems arise substantially in
other domains, we stand ready to extend this provision to those
domains in future versions of the GPL, as needed to protect the
freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish
to avoid the special danger that patents applied to a free program
could make it effectively proprietary. To prevent this, the GPL
assures that patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and
modification follow.
### TERMS AND CONDITIONS
#### 0. Definitions.
"This License" refers to version 3 of the GNU General Public License.
"Copyright" also means copyright-like laws that apply to other kinds
of works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of
an exact copy. The resulting work is called a "modified version" of
the earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user
through a computer network, with no transfer of a copy, is not
conveying.
An interactive user interface displays "Appropriate Legal Notices" to
the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
#### 1. Source Code.
The "source code" for a work means the preferred form of the work for
making modifications to it. "Object code" means any non-source form of
a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users can
regenerate automatically from other parts of the Corresponding Source.
The Corresponding Source for a work in source code form is that same
work.
#### 2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not convey,
without conditions so long as your license otherwise remains in force.
You may convey covered works to others for the sole purpose of having
them make modifications exclusively for you, or provide you with
facilities for running those works, provided that you comply with the
terms of this License in conveying all material for which you do not
control copyright. Those thus making or running the covered works for
you must do so exclusively on your behalf, under your direction and
control, on terms that prohibit them from making any copies of your
copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under the
conditions stated below. Sublicensing is not allowed; section 10 makes
it unnecessary.
#### 3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such
circumvention is effected by exercising rights under this License with
respect to the covered work, and you disclaim any intention to limit
operation or modification of the work as a means of enforcing, against
the work's users, your or third parties' legal rights to forbid
circumvention of technological measures.
#### 4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
#### 5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these
conditions:
- a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
- b) The work must carry prominent notices stating that it is
released under this License and any conditions added under
section 7. This requirement modifies the requirement in section 4
to "keep intact all notices".
- c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
- d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
#### 6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms of
sections 4 and 5, provided that you also convey the machine-readable
Corresponding Source under the terms of this License, in one of these
ways:
- a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
- b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the Corresponding
Source from a network server at no charge.
- c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
- d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
- e) Convey the object code using peer-to-peer transmission,
provided you inform other peers where the object code and
Corresponding Source of the work are being offered to the general
public at no charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal,
family, or household purposes, or (2) anything designed or sold for
incorporation into a dwelling. In determining whether a product is a
consumer product, doubtful cases shall be resolved in favor of
coverage. For a particular product received by a particular user,
"normally used" refers to a typical or common use of that class of
product, regardless of the status of the particular user or of the way
in which the particular user actually uses, or expects or is expected
to use, the product. A product is a consumer product regardless of
whether the product has substantial commercial, industrial or
non-consumer uses, unless such uses represent the only significant
mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to
install and execute modified versions of a covered work in that User
Product from a modified version of its Corresponding Source. The
information must suffice to ensure that the continued functioning of
the modified object code is in no case prevented or interfered with
solely because modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or
updates for a work that has been modified or installed by the
recipient, or for the User Product in which it has been modified or
installed. Access to a network may be denied when the modification
itself materially and adversely affects the operation of the network
or violates the rules and protocols for communication across the
network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
#### 7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders
of that material) supplement the terms of this License with terms:
- a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
- b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
- c) Prohibiting misrepresentation of the origin of that material,
or requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
- d) Limiting the use for publicity purposes of names of licensors
or authors of the material; or
- e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
- f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions
of it) with contractual assumptions of liability to the recipient,
for any liability that these contractual assumptions directly
impose on those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions; the
above requirements apply either way.
#### 8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your license
from a particular copyright holder is reinstated (a) provisionally,
unless and until the copyright holder explicitly and finally
terminates your license, and (b) permanently, if the copyright holder
fails to notify you of the violation by some reasonable means prior to
60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
#### 9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or run
a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
#### 10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
#### 11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims owned
or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within the
scope of its coverage, prohibits the exercise of, or is conditioned on
the non-exercise of one or more of the rights that are specifically
granted under this License. You may not convey a covered work if you
are a party to an arrangement with a third party that is in the
business of distributing software, under which you make payment to the
third party based on the extent of your activity of conveying the
work, and under which the third party grants, to any of the parties
who would receive the covered work from you, a discriminatory patent
license (a) in connection with copies of the covered work conveyed by
you (or copies made from those copies), or (b) primarily for and in
connection with specific products or compilations that contain the
covered work, unless you entered into that arrangement, or that patent
license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
#### 12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under
this License and any other pertinent obligations, then as a
consequence you may not convey it at all. For example, if you agree to
terms that obligate you to collect a royalty for further conveying
from those to whom you convey the Program, the only way you could
satisfy both those terms and this License would be to refrain entirely
from conveying the Program.
#### 13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
#### 14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions
of the GNU General Public License from time to time. Such new versions
will be similar in spirit to the present version, but may differ in
detail to address new problems or concerns.
Each version is given a distinguishing version number. If the Program
specifies that a certain numbered version of the GNU General Public
License "or any later version" applies to it, you have the option of
following the terms and conditions either of that numbered version or
of any later version published by the Free Software Foundation. If the
Program does not specify a version number of the GNU General Public
License, you may choose any version ever published by the Free
Software Foundation.
If the Program specifies that a proxy can decide which future versions
of the GNU General Public License can be used, that proxy's public
statement of acceptance of a version permanently authorizes you to
choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
#### 15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT
WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND
PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE
DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR
CORRECTION.
#### 16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR
CONVEYS THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES
ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT
NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR
LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM
TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER
PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
#### 17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
### How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these
terms.
To do so, attach the following notices to the program. It is safest to
attach them to the start of each source file to most effectively state
the exclusion of warranty; and each file should have at least the
"copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper
mail.
If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:
<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands \`show w' and \`show c' should show the
appropriate parts of the General Public License. Of course, your
program's commands might be different; for a GUI interface, you would
use an "about box".
You should also get your employer (if you work as a programmer) or
school, if any, to sign a "copyright disclaimer" for the program, if
necessary. For more information on this, and how to apply and follow
the GNU GPL, see <https://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your
program into proprietary programs. If your program is a subroutine
library, you may consider it more useful to permit linking proprietary
applications with the library. If this is what you want to do, use the
GNU Lesser General Public License instead of this License. But first,
please read <https://www.gnu.org/licenses/why-not-lgpl.html>.
| Ackbar | /Ackbar-0.2.tar.gz/Ackbar-0.2/LICENSE.md | LICENSE.md |
# Analysis and Corroboration of Key Biodiversity AReas - Ackbar
###############################################################################
#
# Copyright 2020 Nelson R. Salinas
#
#
# This file is part of Ackbar.
#
# Ackbar is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ackbar is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ackbar. If not, see <http://www.gnu.org/licenses/>.
#
###############################################################################
import sys
import os
import shutil
import datetime
import re
from ackbar_lib import fileio
from ackbar_lib import pydata
from ackbar_lib import shapes
oper_sys = None
if sys.platform.startswith('linux'):
oper_sys = 'linux'
elif sys.platform.startswith('darwin'):
oper_sys = 'darwin'
elif sys.platform.startswith('win32'):
oper_sys = 'windows'
#if oper_sys != 'linux':
# raise OSError('Operating system not supported. Currently, Ackbar only runs on Linux.')
# Track memory usage during execution
mem_tracking = False
if mem_tracking and oper_sys == 'linux':
import resource
else:
mem_tracking = False
version = "0.1"
logfile = ""
paramPass = True
critmap = {0: "A1a", 1: "A1b", 2: "A1c", 3: "A1d", 4: "A1e", 5: "B1", 6: "B2"}
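# critmap translates the integer criterion codes attached to each solution
# (sol.spp2crit, used further below) into the KBA criterion labels that are
# written to the output tables.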
parameters = {
"distribution_file" : None,
"iucn_file" : None,
"taxonomic_groups_file" : None,
"taxonomic_assignments_file" : None,
"kba_species_file" : None,
"kba_directory" : None,
"kba_index" : None,
#"exclusion_directory" : None,
"focal_area_directory": None,
"outfile_root" : None,
"overwrite_output" : None,
"cell_size" : None,
"offset_lat" : None,
"offset_lon" : None,
"pop_max_distance": None,
"eps" : None,
"iters" : None,
"max_kba" : None,
"congruency_factor" : None
}
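# Of the parameters above, `distribution_file`, `iucn_file`, and `cell_size` are
# mandatory (checked below); the remaining parameters are either optional or take
# the defaults assigned later in this script (e.g. eps = 0.2, iters = 1000,
# max_kba = 20, congruency_factor = 1).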
deb_counter = 0
today = datetime.datetime.now()
outfileRootDefault = today.strftime("Ackbar_output_%Y%m%d_%H%M%S")
bufferLog = "Ackbar ver. {0}\nAnalysis executed on {1}\n\n".format(version, today)
helloLog = '''
******************** Ackbar ver. {0} ********************
A Python program to assist the delimitation and update of Key
Biodiversity Areas.
Usage:
ackbar.py configuration_file
or
ackbar.py [option]
where `option` could be:
-i Prints the list of taxonomic groups recommended by the
IUCN for the application of criterion B.
All parameters required for executing an analysis are set through the configuration
file. The complete specification of the configuration file can be accessed at
https://github.com/nrsalinas/ackbar/wiki. Examples of input files can be accessed
and downloaded at https://github.com/nrsalinas/ackbar/tree/master/data.
'''.format(version)
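# Illustration only: a minimal configuration file in the layout parsed below
# ("parameter = value" pairs, "#" starts a comment). The parameter names come
# from the `parameters` dictionary above; the paths and numbers are hypothetical.
#
# distribution_file = my_occurrences.csv
# iucn_file = my_iucn_categories.csv
# cell_size = 0.5    # decimal degrees
# offset_lat = 0.1
# offset_lon = 0.1
# outfile_root = my_analysis
# overwrite_output = True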
if len(sys.argv) > 2:
raise IOError('Too many arguments were passed to Ackbar.')
elif len(sys.argv) == 1:
print(helloLog)
else:
if sys.argv[1] == '-i':
from ackbar_lib.B2_recommended_thresholds import groups as iucn_groups
for group in sorted(iucn_groups):
print(group)
for k in iucn_groups[group]:
if not iucn_groups[group][k] is None:
print('\t{0}: {1}'.format(k, iucn_groups[group][k]))
elif os.path.isfile(sys.argv[1]):
if mem_tracking:
print("{0}: {1}".format(deb_counter, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))
deb_counter += 1
with open(sys.argv[1], 'r') as config:
# Parse config info into `parameters` dictionary
for line in config:
line = line.rstrip()
line = re.sub(r'#.*', '', line, flags=re.DOTALL)
line = re.sub(r'^\s+', '', line)
line = re.sub(r'\s$', '', line)
if len(line) > 5:
par_name = re.sub(r'\s*=.*$', '', line, flags=re.DOTALL)
par_val = re.sub(r'^.*=\s*', '', line, flags=re.DOTALL)
#print(par_name, par_val)
if par_name and par_val and par_name in parameters:
parameters[par_name] = par_val
#print(par_name , par_val )
## Check presence/absence of parameters
# Check mandatory params
for manpar in ["distribution_file", "iucn_file", "cell_size"]:
if parameters[manpar] is None:
raise ValueError('Configuration file error: mandatory parameter `{0}` has not been set.'.format(manpar))
kba_pars = 0
# Optional parameters
for kbap in ["kba_species_file", "kba_directory", "kba_index"]:
if not parameters[kbap] is None:
kba_pars += 1
if kba_pars > 0 and kba_pars < 3:
raise ValueError("Configuration file error: not all the parameters required for including existing KBA were set (`kba_species_file`, `kba_directory`, and `kba_index`). Alternatively, ALL three parameters can be left blank to conduct an analysis without considering previous KBA information.")
if parameters["taxonomic_groups_file"] and not parameters["taxonomic_assignments_file"]:
print("Configuration file error: taxonomic assignment file missing. If criterion B2 is sought to be assess, taxonomic assignments file is mandatory and taxonomic groups file optional.", file = sys.stderr)
sys.exit(1)
# Check parsed values are valid
if parameters["taxonomic_groups_file"] and not os.path.exists(parameters["taxonomic_groups_file"]):
print("Taxonomic group file could not be found ({0}).".format(parameters["taxonomic_groups_file"]), file = sys.stderr)
sys.exit(1)
if parameters["taxonomic_assignments_file"] and not os.path.exists(parameters["taxonomic_assignments_file"]):
print("Taxonomic group assignment file could not be found ({0}).".format(parameters["taxonomic_assignments_file"]), file = sys.stderr)
sys.exit(1)
for fpar in filter(lambda x: re.search(r'_file$', x), parameters.keys()):
if parameters[fpar] and not os.path.isfile(parameters[fpar]):
raise ValueError('Configuration file error: parameter `{0}` does not have a valid value (`{1}` is not a file).'.format(fpar, parameters[fpar]))
for dpar in filter(lambda x: re.search(r'_directory$', x), parameters.keys()):
if parameters[dpar] and not os.path.isdir(parameters[dpar]):
raise ValueError('Configuration file error: parameter `{0}` does not have a valid value (`{1}` is not a directory).'.format(dpar, parameters[dpar]))
for par_name in ["cell_size", "offset_lat", "offset_lon", "eps", "congruency_factor", "iters", "max_kba", "pop_max_distance"]:
par_val = parameters[par_name]
if par_val:
try:
par_val = float(par_val)
if par_val < 0:
raise ValueError("Configuration file error: parameter `{0}` should be a positive number.".format(par_name))
if par_name in ["iters", "max_kba"] and par_val % 1 > 0:
raise ValueError('Configuration file error: parameter `{0}` does not have a valid value (`{1}` should be an integer).'.format(par_name, par_val))
if par_name == "cell_size" and par_val > 10:
raise ValueError("Configuration file error: `cell_size` value seems out of logical or practical range (`{0}`)".format(par_val))
if par_name == "max_kba" and par_val < 1:
raise ValueError("Configuration file error: `max_kba` value seems out of practical range (`{0}`)".format(par_val))
parameters[par_name] = par_val
except ValueError as te:
mess = str(te)
if mess.startswith('could not convert string to float'):
raise ValueError('Configuration file error: parameter `{0}` does not have a valid value (`{1}` should be a number).'.format(par_name, par_val))
else:
raise
if type(parameters["overwrite_output"]) == str:
if re.search(r'true', parameters["overwrite_output"], re.I):
parameters["overwrite_output"] = True
elif re.search(r'False', parameters["overwrite_output"], re.I):
parameters["overwrite_output"] = False
else:
mss = "\nConfiguration file error: value parsed as `overwrite_output` value is not valid ({0}). Parameter will be set as False.\n".format(parameters["overwrite_output"])
parameters["overwrite_output"] = False
bufferLog += mss
print(mss, file=sys.stderr)
else:
parameters["overwrite_output"] = False
if parameters["outfile_root"] is None:
parameters["outfile_root"] = outfileRootDefault
if parameters["offset_lat"] is None:
parameters["offset_lat"] = 0
if parameters["offset_lon"] is None:
parameters["offset_lon"] = 0
if parameters["eps"] is None:
parameters["eps"] = 0.2
if parameters["iters"] is None:
parameters["iters"] = 1000
if parameters["max_kba"] is None:
parameters["max_kba"] = 20
if parameters["congruency_factor"] is None:
parameters["congruency_factor"] = 1
if parameters["pop_max_distance"] is None:
parameters["pop_max_distance"] = 0
bufferLog += "Parameters set for the analysis:\n\n"
for par in parameters:
#print(par, " = ", parameters[par])
bufferLog += "{0} = {1}\n".format(par, parameters[par])
if not parameters["taxonomic_groups_file"] and parameters["taxonomic_assignments_file"]:
bufferLog += "\nB2 criterion: recommended IUCN taxonomic groups will be used (user parsed assignments and no group info).\n"
#print(bufferLog)
### Output file/directory names
new_trigger_file = parameters["outfile_root"] + "_trigger_spp_previous_KBA.csv"
sol_dir = parameters["outfile_root"] + "_solution_shapefiles"
logfile = parameters["outfile_root"] + "_log.txt"
soltablename = parameters["outfile_root"] + "_solution_scores.csv"
output_names = [new_trigger_file, sol_dir, logfile]
### Check output files/directories exists
if parameters["overwrite_output"] == True:
for name in output_names:
if os.path.exists(name):
if os.path.isfile(name):
os.remove(name)
elif os.path.isdir(name):
shutil.rmtree(name)
else:
for name in output_names:
if os.path.exists(name):
raise OSError("A file/directory named {0} already exists.".format(name))
if mem_tracking:
print("{0}: {1}".format(deb_counter, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))
deb_counter += 1
################################################################
data = fileio.InputData(parameters["distribution_file"])
if mem_tracking:
print("{0}: {1}".format(deb_counter, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))
deb_counter += 1
data.iucnFile(parameters["iucn_file"])
data.mergePointsAllTaxa(parameters["pop_max_distance"])
#if parameters["pop_max_distance"] > 0:
# data.mergePointsAllTaxa(parameters["pop_max_distance"])
if mem_tracking:
print("{0}: {1}".format(deb_counter, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))
deb_counter += 1
bufferLog += "\nNumber of species in distribution file: {0}\n\n".format(len(data.points))
bufferLog += "\nUnique datapoints per species:\n\n"
for sp in sorted(data.points):
bufferLog += "{0}: {1}\n".format(sp, len(data.points[sp]))
no_points = [x for x in data.iucn if not x in data.points]
if len(no_points) > 0:
bufferLog += "\nIUCN file contains {0} species with no data points:\n".format(len(no_points))
for sp in no_points:
bufferLog += "\t{0}\n".format(sp)
no_iucn = [x for x in data.points if not x in data.iucn]
if len(no_iucn) > 0:
bufferLog += "\nIUCN file lacks {0} species present in the distribution file:\n".format(len(no_iucn))
for sp in no_iucn:
bufferLog += "\t{0}\n".format(sp)
if parameters["focal_area_directory"]:
# points will be filtered with the first shapefile found in directory
breakout = False
for d, s, f in os.walk(parameters["focal_area_directory"]):
if breakout:
break
for file in f:
if file.endswith(".shp") and d == parameters["focal_area_directory"]:
data.reduceArea(d + "/" + file)
breakout = True
break
if len(data.points) < 1:
print("Analysis aborted. User-defined parameters values imply a null set of datapoints. Review the configuration file.", file=sys.stderr)
exit(1)
if parameters["taxonomic_assignments_file"]:
data.groupFiles(parameters["taxonomic_assignments_file"], parameters["taxonomic_groups_file"])
data.groups2search()
if mem_tracking:
print("{0}: {1}".format(deb_counter, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))
deb_counter += 1
no_points = [x for x in data.taxonGroups if not x in data.points]
if len(no_points) > 0:
bufferLog += "\nTaxon group assignments file contains {0} species with no data points:\n".format(len(no_points))
for sp in no_points:
bufferLog += "\t{0}\n".format(sp)
no_groups = [x for x in data.points if not x in data.taxonGroups]
if len(no_groups) > 0:
bufferLog += "\nTaxon group assignments file lacks {0} present in the distribution file (The analysis will not be executed until you fix this):\n".format(len(no_groups))
for sp in no_groups:
bufferLog += "\t{0}\n".format(sp)
groupAssign = {}
for x in data.taxonGroups:
groupAssign[data.taxonGroups[x]['group']] = 0
miss_groups = [x for x in groupAssign.keys() if not x in data.taxonGroupsInfo.keys()]
if mem_tracking:
print("{0}: {1}".format(deb_counter, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))
deb_counter += 1
if len(miss_groups) > 0:
bufferLog += "\nTaxonomic groups missing in the taxonomic groups file:\n"
for y in miss_groups:
bufferLog += "\t{0}\n".format(y)
if parameters["kba_species_file"] and parameters["kba_directory"] and parameters["kba_index"]:
old_kbas = shapes.KBA(parameters["kba_directory"], parameters["kba_index"])
if mem_tracking:
print("{0}: {1}".format(deb_counter, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))
deb_counter += 1
old_kbas.spp_inclusion(data)
if mem_tracking:
print("{0}: {1}".format(deb_counter, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))
deb_counter += 1
old_kbas.new_spp_table(new_trigger_file)
if mem_tracking:
print("{0}: {1}".format(deb_counter, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))
deb_counter += 1
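# Build the lattice of candidate cells ("tiles") at the requested cell size and
# lat/lon offsets; judging from the calls below, each tile aggregates the species
# occurrences that fall inside it and carries per-species threat information.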
tiles = data.getTiles(parameters["cell_size"],
offsetLat = parameters["offset_lat"],
offsetLon = parameters["offset_lon"]
)
if mem_tracking:
print("{0}: {1}".format(deb_counter, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))
deb_counter += 1
tiles = data.filter_nulls(tiles)
eff_total = len(tiles)
eff_cr = sum(1 for t in tiles if t.getThreatStatus() == "CR")
eff_en = sum(1 for t in tiles if t.getThreatStatus() == "EN")
eff_vu = sum(1 for t in tiles if t.getThreatStatus() == "VU")
eff_nt = sum(1 for t in tiles if t.getThreatStatus() == "NT")
eff_lc = sum(1 for t in tiles if t.getThreatStatus() == "LC")
bufferLog += "\nEffective number of species included in the delimitation of candidate areas:\n\tCR: {0}\n\tEN: {1}\n\tVU: {2}\n\tNT: {3}\n\tLC: {4}\n\tTotal: {5}\n".format(eff_cr, eff_en, eff_vu, eff_nt, eff_lc, eff_total)
if parameters["taxonomic_assignments_file"]:
#
# Check if data.groupDict and data.spp2groupDict are appropriate dicts
#
mysols = pydata.metasearchAlt(tiles,
parameters["eps"], parameters["iters"],
parameters["max_kba"], parameters["congruency_factor"],
data.groupDict, data.spp2groupDict)
if mem_tracking:
print("{0}: {1}".format(deb_counter, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))
deb_counter += 1
else:
mysols = pydata.metasearchAlt(
tiles,
parameters["eps"], parameters["iters"],
parameters["max_kba"], parameters["congruency_factor"])
if mem_tracking:
print("{0}: {1}".format(deb_counter, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))
deb_counter += 1
if len(mysols) > 0 and len(mysols[0]) > 0:
shapes.solution2shape(mysols, data, sol_dir)
if mem_tracking:
print("{0}: {1}".format(deb_counter, resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))
deb_counter += 1
sol_table = "Group,Solution,Aggregated_score,IUCN_score,NDM_score\n"
for ig, group in enumerate(mysols):
bufferLog += "\nSolution group {0}\n".format(ig)
#critmap = {0: "A1a", 1: "A1b", 2: "A1c", 3: "A1d",4 : "A1e", 5: "B1", 6: "B2"}
ttable = "Taxon,Solution," + ",".join(critmap.values()) + "\n"
for isol, sol in enumerate(group):
bufferLog += "\n\tSolution {0}:\n".format(isol)
sol_table += "{0},{1},{2},{3}, {4}\n".format(ig, isol, sol.aggrScore, sol.score, sol.ndmScore)
for spinx in sorted(sol.spp2crit, key = lambda x : tiles[x].getName()):
bufferLog += "\t\t{0}: ".format(tiles[spinx].getName())
ttable += "{0},{1}".format( tiles[spinx].getName() , isol )
tcrits = list(map(lambda x: critmap[x], sol.spp2crit[spinx]))
tcritsstr = " ".join(map(str, tcrits))
bufferLog += " {0}\n".format(tcritsstr)
#print(tcrits)
for k in critmap:
if critmap[k] in tcrits:
ttable += ",1"
else:
ttable += ",0"
ttable += "\n"
#ttable += "," + ",".join([tiles[spinx].aggrScore, tiles[spinx].score, tiles[spinx].ndmScore]) + "\n"
tablename = sol_dir + "/" + "group_{0}.csv".format(ig)
#print (tablename)
with open(tablename, "w") as thandle:
thandle.write(ttable)
with open(soltablename, "w") as thandle:
thandle.write(sol_table)
with open(logfile, "w") as logh:
logh.write(bufferLog)
exit(0) | Ackbar | /Ackbar-0.2.tar.gz/Ackbar-0.2/ackbar.py | ackbar.py |
import numpy as np
import rasterio
import re
from os import walk
class SDMs:
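# SDMs: utilities to derive an analysis grid from a directory of species
# distribution model (SDM) rasters (GeoTIFF files). Note that this module is
# marked as work in progress at the end of the file.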
def __init__(self, raster_directory):
self.directory = raster_directory
self.origin_N = None
self.origin_W = None
self.bounds = [None, None, None, None]
self.index_reg = {}
def iucnFile(self, categories_file):
with open(categories_file, "r") as fhandle:
pass
def get_bounds(self):
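# Scan every .tif/.tiff raster under self.directory, locate the first and last
# rows and columns that contain nonzero presence values, and store the overall
# bounding box as [northernmost, southernmost, westernmost, easternmost].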
northest = -1e9
westest = 1e9
southest = 1e9
eastest = -1e9
for d, s, f in walk(self.directory):
for file in f:
if re.search(r"\.tif{1,2}$", file):
with rasterio.open("/".join([d , file])) as src:
ras = src.read(1)
ras = np.where(ras < 0, np.nan, ras)
sumrow = 0
rowindx = 0
while sumrow <= 0:
row = ras[rowindx, :]
row = row[~np.isnan(row)]
if row.shape[0] > 0:
sumrow = row.sum()
rowindx += 1
sumcol = 0
colindx = 0
while sumcol <= 0:
col = ras[: , colindx]
col = col[~np.isnan(col)]
if col.shape[0] > 0:
sumcol = col.sum()
colindx += 1
thwest, thnorth = src.xy(rowindx, colindx)
sumrow = 0
rowindx = ras.shape[0] - 1
while sumrow <= 0:
row = ras[rowindx, :]
row = row[~np.isnan(row)]
if row.shape[0] > 0:
sumrow = row.sum()
rowindx -= 1
sumcol = 0
colindx = ras.shape[1] - 1
while sumcol <= 0:
col = ras[: , colindx]
col = col[~np.isnan(col)]
if col.shape[0] > 0:
sumcol = col.sum()
colindx -= 1
theast, thsouth = src.xy(rowindx, colindx)
if thwest < westest:
westest = thwest
if thnorth > northest:
northest = thnorth
if theast > eastest:
eastest = theast
if thsouth < southest:
southest = thsouth
self.bounds = [northest, southest, westest, eastest]
return None
def pop_in_cell(self, src, ras, points):
"""Point is a coordinate list of a cell corners: NW, NE, SE, SW"""
x0,y0 = src.index(points[0], points[1])
x1,y1 = src.index(points[2], points[3])
temp = ras[x0:(x1+1), y0:(y1+1)]
return temp[temp == 1].sum()
def set_grid_prop(self, cell_size, offset):
self.cellSize = cell_size
self.origin_N = self.bounds[0] + offset
self.origin_W = self.bounds[2] - offset # westernmost bound minus offset (the original used bounds[0], the northern bound, apparently a slip)
# span[0]: east-west extent (columns); span[1]: north-south extent (rows)
span = [max((self.bounds[3] - self.origin_W), self.cellSize), max((self.origin_N - self.bounds[1]), self.cellSize)]
totCols = int(np.ceil(span[0] / self.cellSize))
totRows = int(np.ceil(span[1] / self.cellSize))
#span[0] = totCols * self.cellSize
#span[1] = totRows * self.cellSize
self.rows, self.cols = totRows, totCols
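# Illustration (hypothetical figures): with cell_size = 0.5 and an east-west
# extent of 3.2 degrees, totCols = ceil(3.2 / 0.5) = 7, so the grid slightly
# overshoots the eastern (and, analogously, the southern) bound.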
def getTiles(self):
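# Build, for every raster in the directory, a grid of population counts per cell
# (gridColl, one grid per taxon) plus an overall presence grid; cells with any
# presence are then indexed consecutively in self.index_reg. The method is
# unfinished (see the "IN PROGRESS" note below).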
taxa = []
gridColl = []
presenceGrid = [[0 for x in range(self.cols)] for y in range(self.rows)]
for d, s, f in walk(self.directory):
for file in f:
if re.search(r"\.tif{1,2}$", file):
with rasterio.open("/".join([d , file])) as src:
thisgrid = []
taxon = re.sub(r"\.tif{1,2}$", "", file)
ras = src.read(1)
ras = np.where(ras < 0, np.nan, ras)
popExt = ras[~np.isnan(ras)].sum()
for ir, row in enumerate(range(self.rows)):
thisrow = []
for ic, col in enumerate(range(self.cols)):
# columns step eastward in longitude and rows step southward in latitude
# (the original used `row` for x and `col` for y, which contradicts how
# self.rows and self.cols are derived in set_grid_prop)
x0 = self.origin_W + col * self.cellSize
y0 = self.origin_N - row * self.cellSize
x1 = self.origin_W + (col + 1) * self.cellSize
y1 = self.origin_N - (row + 1) * self.cellSize
thcell = self.pop_in_cell(src, ras, (x0, y0, x1, y1))
thisrow.append(thcell)
presenceGrid[ir][ic] += thcell
thisgrid.append(thisrow)
gridColl.append(thisgrid)
taxa.append(taxon)
#---
self.index_reg = {}
act_size = 0
for ir, row in enumerate(presenceGrid):
for ic, cel in enumerate(row):
if cel > 0:
self.index_reg[(ir, ic)] = act_size
act_size += 1
#---
#for it, tax in taxa:
#################
### IN PROGRESS ### | Ackbar | /Ackbar-0.2.tar.gz/Ackbar-0.2/ackbar_lib/rasters.py | rasters.py |
groups = {
"Amphibia": {"min_spp": 2, "range_threshold": 10000},
"Myxini": {"min_spp": 2, "range_threshold": 10000},
"Mammalia": {"min_spp": 2, "range_threshold": 17614},
"Aves": {"min_spp": 2, "range_threshold": 50000},
"Ceratophyllales": {"min_spp": 2, "range_threshold": 50000},
"Chondrichthyes": {"min_spp": 2, "range_threshold": 50000},
"Sarcopterygii": {"min_spp": 2, "range_threshold": 50000},
"Acanthocephala": {"min_spp": 2, "range_threshold": None},
"Acanthopteroctetoidea": {"min_spp": 2, "range_threshold": None},
"Acorales": {"min_spp": 2, "range_threshold": None},
"Agathiphagoidea": {"min_spp": 2, "range_threshold": None},
"Alismatales": {"min_spp": 2, "range_threshold": None},
"Alucitoidea": {"min_spp": 2, "range_threshold": None},
"Amblypygi": {"min_spp": 2, "range_threshold": None},
"Amborellales": {"min_spp": 2, "range_threshold": None},
"Amphipoda": {"min_spp": 2, "range_threshold": None},
"Anaspidacea": {"min_spp": 2, "range_threshold": None},
"Anthocerotophyta": {"min_spp": 2, "range_threshold": None},
"Apiales": {"min_spp": 2, "range_threshold": None},
"Appendicularia": {"min_spp": 2, "range_threshold": None},
"Aquifoliales": {"min_spp": 2, "range_threshold": None},
"Archaeognatha": {"min_spp": 2, "range_threshold": None},
"Arecales": {"min_spp": 2, "range_threshold": None},
"Arthoniomycetes": {"min_spp": 2, "range_threshold": None},
"Ascidiacea": {"min_spp": 2, "range_threshold": None},
"Astigmata": {"min_spp": 2, "range_threshold": None},
"Austrobaileyales": {"min_spp": 2, "range_threshold": None},
"Axioidea": {"min_spp": 2, "range_threshold": None},
"Bathynellacea": {"min_spp": 2, "range_threshold": None},
"Berberidopsidales": {"min_spp": 2, "range_threshold": None},
"Bivalvia": {"min_spp": 2, "range_threshold": None},
"Blattodea": {"min_spp": 2, "range_threshold": None},
"Bochusacea": {"min_spp": 2, "range_threshold": None},
"Bombycoidea": {"min_spp": 2, "range_threshold": None},
"Boraginales": {"min_spp": 2, "range_threshold": None},
"Bostrichoidea": {"min_spp": 2, "range_threshold": None},
"Brachiopoda": {"min_spp": 2, "range_threshold": None},
"Branchiopoda": {"min_spp": 2, "range_threshold": None},
"Brassicales": {"min_spp": 2, "range_threshold": None},
"Bruniales": {"min_spp": 2, "range_threshold": None},
"Bryozoa": {"min_spp": 2, "range_threshold": None},
"Buprestoidea": {"min_spp": 2, "range_threshold": None},
"Buxales": {"min_spp": 2, "range_threshold": None},
"Byrrhoidea": {"min_spp": 2, "range_threshold": None},
"Calliduloidea": {"min_spp": 2, "range_threshold": None},
"Canellales": {"min_spp": 2, "range_threshold": None},
"Carnoidea": {"min_spp": 2, "range_threshold": None},
"Caryophyllales": {"min_spp": 2, "range_threshold": None},
"Caryophyllales": {"min_spp": 2, "range_threshold": None},
"Caryophyllales": {"min_spp": 2, "range_threshold": None},
"Caryophyllales": {"min_spp": 2, "range_threshold": None},
"Celastrales": {"min_spp": 2, "range_threshold": None},
"Cephalaspidomorphi": {"min_spp": 2, "range_threshold": None},
"Cephalocerida": {"min_spp": 2, "range_threshold": None},
"Cephalopoda": {"min_spp": 2, "range_threshold": None},
"Cephalorhyncha": {"min_spp": 2, "range_threshold": None},
"Cephoidea": {"min_spp": 2, "range_threshold": None},
"Ceraphronoidea": {"min_spp": 2, "range_threshold": None},
"Cestoda": {"min_spp": 2, "range_threshold": None},
"Chaetognatha": {"min_spp": 2, "range_threshold": None},
"Charipidae": {"min_spp": 2, "range_threshold": None},
"Charophyta": {"min_spp": 2, "range_threshold": None},
"Chilopoda": {"min_spp": 2, "range_threshold": None},
"Chloranthales": {"min_spp": 2, "range_threshold": None},
"Chlorophyta": {"min_spp": 2, "range_threshold": None},
"Choreutoidea": {"min_spp": 2, "range_threshold": None},
"Chrysidoidea": {"min_spp": 2, "range_threshold": None},
"Chytridiomycota": {"min_spp": 2, "range_threshold": None},
"Cimelioidea": {"min_spp": 2, "range_threshold": None},
"Cleroidea": {"min_spp": 2, "range_threshold": None},
"Clitellata": {"min_spp": 2, "range_threshold": None},
"Cnidaria": {"min_spp": 2, "range_threshold": None},
"Coccinelloidea": {"min_spp": 2, "range_threshold": None},
"Commelinales": {"min_spp": 2, "range_threshold": None},
"Copromorphoidea": {"min_spp": 2, "range_threshold": None},
"Cornales": {"min_spp": 2, "range_threshold": None},
"Cossoidea": {"min_spp": 2, "range_threshold": None},
"Crossosomatales": {"min_spp": 2, "range_threshold": None},
"Ctenophora": {"min_spp": 2, "range_threshold": None},
"Cucujoidea": {"min_spp": 2, "range_threshold": None},
"Cucurbitales": {"min_spp": 2, "range_threshold": None},
"Cumacea": {"min_spp": 2, "range_threshold": None},
"Cycadopsida": {"min_spp": 2, "range_threshold": None},
"Cycliophora": {"min_spp": 2, "range_threshold": None},
"Cynipoidea": {"min_spp": 2, "range_threshold": None},
"Dascilloidea": {"min_spp": 2, "range_threshold": None},
"Decapoda": {"min_spp": 2, "range_threshold": None},
"Dermaptera": {"min_spp": 2, "range_threshold": None},
"Derodontoidea": {"min_spp": 2, "range_threshold": None},
"Diaprioidea": {"min_spp": 2, "range_threshold": None},
"Dicyemida": {"min_spp": 2, "range_threshold": None},
"Dilleniales": {"min_spp": 2, "range_threshold": None},
"Diopsoidea": {"min_spp": 2, "range_threshold": None},
"Dioscoreales": {"min_spp": 2, "range_threshold": None},
"Dipsacales": {"min_spp": 2, "range_threshold": None},
"Drepanoidea": {"min_spp": 2, "range_threshold": None},
"Dytiscoidea": {"min_spp": 2, "range_threshold": None},
"Echinodermata": {"min_spp": 2, "range_threshold": None},
"Echiura": {"min_spp": 2, "range_threshold": None},
"Elateroidea": {"min_spp": 2, "range_threshold": None},
"Embioptera": {"min_spp": 2, "range_threshold": None},
"Empidoidea": {"min_spp": 2, "range_threshold": None},
"Entognatha": {"min_spp": 2, "range_threshold": None},
"Entoprocta": {"min_spp": 2, "range_threshold": None},
"Epermenioidea": {"min_spp": 2, "range_threshold": None},
"Ephemeroptera": {"min_spp": 2, "range_threshold": None},
"Ephydroidea": {"min_spp": 2, "range_threshold": None},
"Ericales": {"min_spp": 2, "range_threshold": None},
"Ericales": {"min_spp": 2, "range_threshold": None},
"Ericales": {"min_spp": 2, "range_threshold": None},
"Eriocranioidea": {"min_spp": 2, "range_threshold": None},
"Escalloniales": {"min_spp": 2, "range_threshold": None},
"Euphasiacea": {"min_spp": 2, "range_threshold": None},
"Eurotiomycetes": {"min_spp": 2, "range_threshold": None},
"Evanioidea": {"min_spp": 2, "range_threshold": None},
"Fagales": {"min_spp": 2, "range_threshold": None},
"Galacticoidea": {"min_spp": 2, "range_threshold": None},
"Garryales": {"min_spp": 2, "range_threshold": None},
"Gastrotricha": {"min_spp": 2, "range_threshold": None},
"Geoglossomycetes": {"min_spp": 2, "range_threshold": None},
"Geraniales": {"min_spp": 2, "range_threshold": None},
"Ginkgoopsida": {"min_spp": 2, "range_threshold": None},
"Glomeromycota": {"min_spp": 2, "range_threshold": None},
"Gnathostomulida": {"min_spp": 2, "range_threshold": None},
"Gnetopsida": {"min_spp": 2, "range_threshold": None},
"Gracillarioidea": {"min_spp": 2, "range_threshold": None},
"Grylloblattodea": {"min_spp": 2, "range_threshold": None},
"Gunnerales": {"min_spp": 2, "range_threshold": None},
"Haliploidea": {"min_spp": 2, "range_threshold": None},
"Hedyloidea": {"min_spp": 2, "range_threshold": None},
"Hemichordata": {"min_spp": 2, "range_threshold": None},
"Hepialoidea": {"min_spp": 2, "range_threshold": None},
"Hesperioidea": {"min_spp": 2, "range_threshold": None},
"Heterobathmioidea": {"min_spp": 2, "range_threshold": None},
"Hexanauplia": {"min_spp": 2, "range_threshold": None},
"Hippoboscoidea": {"min_spp": 2, "range_threshold": None},
"Histeroidea": {"min_spp": 2, "range_threshold": None},
"Holothyroidae": {"min_spp": 2, "range_threshold": None},
"Huerteales": {"min_spp": 2, "range_threshold": None},
"Hyblaeoidea": {"min_spp": 2, "range_threshold": None},
"Hydrophiloidea": {"min_spp": 2, "range_threshold": None},
"Icacinales": {"min_spp": 2, "range_threshold": None},
"Immoidea": {"min_spp": 2, "range_threshold": None},
"Incurvarioidea": {"min_spp": 2, "range_threshold": None},
"Isopoda": {"min_spp": 2, "range_threshold": None},
"Isoptera": {"min_spp": 2, "range_threshold": None},
"Ixodida": {"min_spp": 2, "range_threshold": None},
"Lasiocampoidea": {"min_spp": 2, "range_threshold": None},
"Laurales": {"min_spp": 2, "range_threshold": None},
"Lauxanioidea": {"min_spp": 2, "range_threshold": None},
"Lecanoromycetes": {"min_spp": 2, "range_threshold": None},
"Leotiomycetes": {"min_spp": 2, "range_threshold": None},
"Lepiceroidea": {"min_spp": 2, "range_threshold": None},
"Leptocardii": {"min_spp": 2, "range_threshold": None},
"Leptostraca": {"min_spp": 2, "range_threshold": None},
"Liliales": {"min_spp": 2, "range_threshold": None},
"Lophocoronoidea": {"min_spp": 2, "range_threshold": None},
"Lophogastrica": {"min_spp": 2, "range_threshold": None},
"Lycopodiopsida": {"min_spp": 2, "range_threshold": None},
"Lymexyloidea": {"min_spp": 2, "range_threshold": None},
"Magnoliales": {"min_spp": 2, "range_threshold": None},
"Malvales": {"min_spp": 2, "range_threshold": None},
"Mantaphasmatodea": {"min_spp": 2, "range_threshold": None},
"Mantodea": {"min_spp": 2, "range_threshold": None},
"Marchantiophyta": {"min_spp": 2, "range_threshold": None},
"Maxillopoda": {"min_spp": 2, "range_threshold": None},
"Mecoptera": {"min_spp": 2, "range_threshold": None},
"Megaloptera": {"min_spp": 2, "range_threshold": None},
"Merostomata": {"min_spp": 2, "range_threshold": None},
"Mesostigmata": {"min_spp": 2, "range_threshold": None},
"Mesozoa": {"min_spp": 2, "range_threshold": None},
"Metteniusales": {"min_spp": 2, "range_threshold": None},
"Micropterigiodea": {"min_spp": 2, "range_threshold": None},
"Microsporidia": {"min_spp": 2, "range_threshold": None},
"Mictacea": {"min_spp": 2, "range_threshold": None},
"Mimallonoidea": {"min_spp": 2, "range_threshold": None},
"Mnesarchaeoidea": {"min_spp": 2, "range_threshold": None},
"Monogenea": {"min_spp": 2, "range_threshold": None},
"Muscoidea": {"min_spp": 2, "range_threshold": None},
"Myrtales": {"min_spp": 2, "range_threshold": None},
"Mysida": {"min_spp": 2, "range_threshold": None},
"Myxozoa": {"min_spp": 2, "range_threshold": None},
"Nematomorpha": {"min_spp": 2, "range_threshold": None},
"Nemertina": {"min_spp": 2, "range_threshold": None},
"Nemestrinoidea": {"min_spp": 2, "range_threshold": None},
"Neolectomycetes": {"min_spp": 2, "range_threshold": None},
"Neopseustoidea": {"min_spp": 2, "range_threshold": None},
"Nepenthales": {"min_spp": 2, "range_threshold": None},
"Nepticuloidea": {"min_spp": 2, "range_threshold": None},
"Nerioidea": {"min_spp": 2, "range_threshold": None},
"Neuroptera": {"min_spp": 2, "range_threshold": None},
"Nymphaeales": {"min_spp": 2, "range_threshold": None},
"Odonata": {"min_spp": 2, "range_threshold": None},
"Oestroidea": {"min_spp": 2, "range_threshold": None},
"Onychophora": {"min_spp": 2, "range_threshold": None},
"Opilioacarida": {"min_spp": 2, "range_threshold": None},
"Opiliones": {"min_spp": 2, "range_threshold": None},
"Opomyzoidea": {"min_spp": 2, "range_threshold": None},
"Oribatida": {"min_spp": 2, "range_threshold": None},
"Ostracoda": {"min_spp": 2, "range_threshold": None},
"Oxalidales": {"min_spp": 2, "range_threshold": None},
"Palaephatoidea": {"min_spp": 2, "range_threshold": None},
"Palpigradi": {"min_spp": 2, "range_threshold": None},
"Pamphilioidea": {"min_spp": 2, "range_threshold": None},
"Pandanales": {"min_spp": 2, "range_threshold": None},
"Paracryphiales": {"min_spp": 2, "range_threshold": None},
"Pauropoda": {"min_spp": 2, "range_threshold": None},
"Perissommatidae": {"min_spp": 2, "range_threshold": None},
"Petrosaviales": {"min_spp": 2, "range_threshold": None},
"Pezizomycetes": {"min_spp": 2, "range_threshold": None},
"Phasmida": {"min_spp": 2, "range_threshold": None},
"Phoroidea": {"min_spp": 2, "range_threshold": None},
"Phoronida": {"min_spp": 2, "range_threshold": None},
"Picramniales": {"min_spp": 2, "range_threshold": None},
"Pinopsida": {"min_spp": 2, "range_threshold": None},
"Piperales": {"min_spp": 2, "range_threshold": None},
"Platygastroidea": {"min_spp": 2, "range_threshold": None},
"Plecoptera": {"min_spp": 2, "range_threshold": None},
"Polychaeta": {"min_spp": 2, "range_threshold": None},
"Polypodiopsida": {"min_spp": 2, "range_threshold": None},
"Porifera": {"min_spp": 2, "range_threshold": None},
"Proctotrupoidea": {"min_spp": 2, "range_threshold": None},
"Prostigmata": {"min_spp": 2, "range_threshold": None},
"Proteales": {"min_spp": 2, "range_threshold": None},
"Pseudoscorpiones": {"min_spp": 2, "range_threshold": None},
"Psocodea": {"min_spp": 2, "range_threshold": None},
"Psychodomorpha": {"min_spp": 2, "range_threshold": None},
"Pterophoroidea": {"min_spp": 2, "range_threshold": None},
"Ptychopteridae": {"min_spp": 2, "range_threshold": None},
"Puccioniomycetes": {"min_spp": 2, "range_threshold": None},
"Pycnogonida": {"min_spp": 2, "range_threshold": None},
"Ranunculales": {"min_spp": 2, "range_threshold": None},
"Remipedia": {"min_spp": 2, "range_threshold": None},
"Reptilia": {"min_spp": 2, "range_threshold": None},
"Rhodophyta": {"min_spp": 2, "range_threshold": None},
"Ricinulei": {"min_spp": 2, "range_threshold": None},
"Ropalomeridae": {"min_spp": 2, "range_threshold": None},
"Rosales": {"min_spp": 2, "range_threshold": None},
"Rotifera": {"min_spp": 2, "range_threshold": None},
"Santalales": {"min_spp": 2, "range_threshold": None},
"Sapindales": {"min_spp": 2, "range_threshold": None},
"Sapindales": {"min_spp": 2, "range_threshold": None},
"Saxifragales": {"min_spp": 2, "range_threshold": None},
"Scatopsoidea": {"min_spp": 2, "range_threshold": None},
"Schizomida": {"min_spp": 2, "range_threshold": None},
"Schreckensteinioidea": {"min_spp": 2, "range_threshold": None},
"Sciaroidea": {"min_spp": 2, "range_threshold": None},
"Sciomyzoidea": {"min_spp": 2, "range_threshold": None},
"Scirtoidea": {"min_spp": 2, "range_threshold": None},
"Scorpiones": {"min_spp": 2, "range_threshold": None},
"Serphitoidea": {"min_spp": 2, "range_threshold": None},
"Sesioidea": {"min_spp": 2, "range_threshold": None},
"Simaethistoidea": {"min_spp": 2, "range_threshold": None},
"Siphonaptera": {"min_spp": 2, "range_threshold": None},
"Sipuncula": {"min_spp": 2, "range_threshold": None},
"Siricoidea": {"min_spp": 2, "range_threshold": None},
"Solanales": {"min_spp": 2, "range_threshold": None},
"Solifugae": {"min_spp": 2, "range_threshold": None},
"Sphaeriusoidea": {"min_spp": 2, "range_threshold": None},
"Sphaeroceroidea": {"min_spp": 2, "range_threshold": None},
"Stomatopoda": {"min_spp": 2, "range_threshold": None},
"Stratiomyomorpha": {"min_spp": 2, "range_threshold": None},
"Strepsiptera": {"min_spp": 2, "range_threshold": None},
"Symphyla": {"min_spp": 2, "range_threshold": None},
"Syrphoidea": {"min_spp": 2, "range_threshold": None},
"Tabanomorpha": {"min_spp": 2, "range_threshold": None},
"Tanaidaceae": {"min_spp": 2, "range_threshold": None},
"Taphrinomycetes": {"min_spp": 2, "range_threshold": None},
"Tardigrada": {"min_spp": 2, "range_threshold": None},
"Tenebrionoidea": {"min_spp": 2, "range_threshold": None},
"Tenthredinoidea": {"min_spp": 2, "range_threshold": None},
"Tephritoidea": {"min_spp": 2, "range_threshold": None},
"Thaliacea": {"min_spp": 2, "range_threshold": None},
"Thermosbaenacea": {"min_spp": 2, "range_threshold": None},
"Thyridoidea": {"min_spp": 2, "range_threshold": None},
"Thysanoptera": {"min_spp": 2, "range_threshold": None},
"Tineoidea": {"min_spp": 2, "range_threshold": None},
"Tischerioidea": {"min_spp": 2, "range_threshold": None},
"Tortricoidea": {"min_spp": 2, "range_threshold": None},
"Tremellomycetes": {"min_spp": 2, "range_threshold": None},
"Trichoptera": {"min_spp": 2, "range_threshold": None},
"Trochodendrales": {"min_spp": 2, "range_threshold": None},
"Turbellaria": {"min_spp": 2, "range_threshold": None},
"Urodoidea": {"min_spp": 2, "range_threshold": None},
"Uropygi": {"min_spp": 2, "range_threshold": None},
"Ustilaginomycetes": {"min_spp": 2, "range_threshold": None},
"Vahliales": {"min_spp": 2, "range_threshold": None},
"Vermileoninae": {"min_spp": 2, "range_threshold": None},
"Vitales": {"min_spp": 2, "range_threshold": None},
"Whalleyanoidea": {"min_spp": 2, "range_threshold": None},
"Xenacoelomorpha": {"min_spp": 2, "range_threshold": None},
"Xiphydrioidea": {"min_spp": 2, "range_threshold": None},
"Xylophagomorpha": {"min_spp": 2, "range_threshold": None},
"Yponomeutoidea": {"min_spp": 2, "range_threshold": None},
"Zingiberales": {"min_spp": 2, "range_threshold": None},
"Zoraptera": {"min_spp": 2, "range_threshold": None},
"Zygaenoidea": {"min_spp": 2, "range_threshold": None},
"Zygentoma": {"min_spp": 2, "range_threshold": None},
"Zygomycota": {"min_spp": 2, "range_threshold": None},
"Zygophyllales": {"min_spp": 2, "range_threshold": None},
"Asiloidea": {"min_spp": 3, "range_threshold": None},
"Bibionomorpha": {"min_spp": 3, "range_threshold": None},
"Bryophyta": {"min_spp": 3, "range_threshold": None},
"Culicomorpha": {"min_spp": 3, "range_threshold": None},
"Diplopoda": {"min_spp": 3, "range_threshold": None},
"Gelechioidea": {"min_spp": 3, "range_threshold": None},
"Malpighiales": {"min_spp": 3, "range_threshold": None},
"Nematoda": {"min_spp": 3, "range_threshold": None},
"Papilionoidea": {"min_spp": 3, "range_threshold": None},
"Pyraloidea": {"min_spp": 3, "range_threshold": None},
"Tipulomorpha": {"min_spp": 3, "range_threshold": None},
"Trematoda": {"min_spp": 3, "range_threshold": None},
"Chalcidoidea": {"min_spp": 4, "range_threshold": None},
"Chromista": {"min_spp": 4, "range_threshold": None},
"Fabales": {"min_spp": 4, "range_threshold": None},
"Gentianales": {"min_spp": 4, "range_threshold": None},
"Geometroidea": {"min_spp": 4, "range_threshold": None},
"Lamiales": {"min_spp": 4, "range_threshold": None},
"Poales": {"min_spp": 4, "range_threshold": None},
"Sordariomycetes": {"min_spp": 4, "range_threshold": None},
"Vespoidea": {"min_spp": 4, "range_threshold": None},
"Apoidea": {"min_spp": 5, "range_threshold": None},
"Asterales": {"min_spp": 5, "range_threshold": None},
"Dothideomycetes": {"min_spp": 5, "range_threshold": None},
"Orthoptera": {"min_spp": 5, "range_threshold": None},
"Trombidiformes": {"min_spp": 5, "range_threshold": None},
"Actinopterygii": {"min_spp": 6, "range_threshold": None},
"Agaricomycetes": {"min_spp": 7, "range_threshold": None},
"Asparagales": {"min_spp": 7, "range_threshold": None},
"Caraboidea": {"min_spp": 7, "range_threshold": None},
"Scarabaeoidea": {"min_spp": 7, "range_threshold": None},
"Chrysomeloidea": {"min_spp": 8, "range_threshold": None},
"Ichneumonoidea": {"min_spp": 8, "range_threshold": None},
"Noctuoidea": {"min_spp": 8, "range_threshold": None},
"Araneae": {"min_spp": 9, "range_threshold": None},
"Gastropoda": {"min_spp": 10, "range_threshold": None},
"Staphylinoidea": {"min_spp": 14, "range_threshold": None},
"Hemiptera": {"min_spp": 16, "range_threshold": None},
"Curculionoidea": {"min_spp": 19, "range_threshold": None}
} | Ackbar | /Ackbar-0.2.tar.gz/Ackbar-0.2/ackbar_lib/B2_recommended_thresholds.py | B2_recommended_thresholds.py |
import sys
import csv
import re
from math import ceil, floor, sin, cos, atan2, pi
import fiona
from fiona.crs import from_epsg
from shapely.geometry import Point, Polygon, MultiPolygon, mapping
from ackbar_lib import pydata, shapes
from ackbar_lib.B2_recommended_thresholds import groups as iucn_groups
class InputData(object):
"""
Input data processor class. Class constructor requires a csv file (str) with
three columns: longitude, latitude, and taxon name.
"""
def __init__(self, infile, csv_pars = None):
self.points = {} # values are population fractions
self.totPops = {} # total populations per taxon
self.iucn = {} # taxon to (Category, subcriteriaA)
self.minLatitude = 91.0
self.maxLatitude = -91.0
self.minLongitude = 181.0
self.maxLongitude = -181.0
self.originN = None
self.cellSize = None
self.rows = None
self.cols = None
self.geometry = None
self.csvfile = infile
self.presence_grid = []
self.index_reg = {}
self.taxonGroups = {}
self.taxonGroupsInfo = {}
self.groupDict = {}
self.spp2groupDict = {}
self.csv_params = {}
lineCounter = 0
latCol = None
lonCol = None
taxCol = None
if type(csv_pars) == dict:
if "delimiter" in csv_pars:
self.csv_params["delimiter"] = csv_pars["delimiter"]
if "lineterminator" in csv_pars:
self.csv_params["lineterminator"] = csv_pars["lineterminator"]
if "quotechar" in csv_pars:
self.csv_params["quotechar"] = csv_pars["quotechar"]
with open(infile,'r') as fil:
table = csv.reader(fil, **self.csv_params)
for lineCounter, row in enumerate(table):
if lineCounter == 0:
for icol , col in enumerate(row):
if re.search("lon(gitude)*", col, flags=re.I):
lonCol = icol
continue
elif re.search("lat(titude)*", col, flags=re.I):
latCol = icol
continue
elif re.search("taxon", col, flags=re.I):
taxCol = icol
continue
if lonCol is None or latCol is None or taxCol is None:
raise IOError("Input file `{0}`: column labels do not follow the required format (`Taxon`, `Longitude`, `Latitude`).".format(infile))
else:
row[latCol] = re.sub(r"[\s\'\"]","",row[latCol])
row[lonCol] = re.sub(r"[\s\'\"]","",row[lonCol])
lat = None
lon = None
try:
lat = float(row[latCol])
if lat < -90 or lat > 90:
raise ValueError('Distribution file error: {2} is not a valid latitude value (line {0} in file `{1}`).'.format(lineCounter, infile, row[latCol]))
except ValueError as te:
mess = str(te)
if mess.startswith('could not convert string to float'):
raise ValueError('Distribution file error: {2} is not a valid latitude value (line {0} in file `{1}`).'.format(lineCounter, infile, row[latCol]))
else:
raise
try:
lon = float(row[lonCol])
if lon < -180 or lon > 180:
raise ValueError('Distribution file error: {2} is not a valid longitude value (line {0} in file `{1}`).'.format(lineCounter, infile, row[lonCol]))
except ValueError as te:
mess = str(te)
if mess.startswith('could not convert string to float'):
raise ValueError('Distribution file error: {2} is not a valid longitude value (line {0} in file `{1}`).'.format(lineCounter, infile, row[lonCol]))
else:
raise
if len(row[taxCol]) > 105:
raise IOError("Distribution file error: `{0}` exceeds the maximum taxon name size, 105 chars (line {1} in file `{2}`)".format(row[taxCol],lineCounter, infile))
#############################################################
# #
# Reproject data if user wishes to #
# #
#############################################################
if row[taxCol] in self.points:
self.points[row[taxCol]][(lon,lat)] = 1
else:
self.points[row[taxCol]] = { (lon,lat) : 1 }
if self.minLatitude > lat:
self.minLatitude = lat
if self.maxLatitude < lat:
self.maxLatitude = lat
if self.minLongitude > lon:
self.minLongitude = lon
if self.maxLongitude < lon:
self.maxLongitude = lon
if len(self.points) < 3:
raise ValueError("Input file only contain distribution data from {0} species (at least three are required).".format(len(self.points)))
return None
def groupFiles(self, assignments_file, diversity_file):
"""
Process information from files and store it in data structures.
Only stores data of taxa included in distribution file.
"""
log = '' # log buffer
assignments = False
taxonCol = None
groupCol = None
rangeCol = None
groupBisCol = None
globsppCol = None
minsppCol = None
rangeThresCol = None
self.taxonGroups = {}
self.taxonGroupsInfo = {}
with open(assignments_file, 'r') as afile:
table = csv.reader(afile, **self.csv_params)
for irow , row in enumerate(table):
if irow == 0:
for ic , cell in enumerate(row):
if re.search('group', cell, flags=re.I):
groupCol = ic
continue
if re.search('taxon', cell, flags=re.I):
taxonCol = ic
continue
if re.search('range_size', cell, flags=re.I):
rangeCol = ic
continue
if groupCol is None or taxonCol is None:
raise IOError("Input file `{0}`: column labels do not follow the required format (headers should be `Group`, `Taxon`, and `Range_size`).".format(assignments_file))
else:
rangeS = None
if not rangeCol is None:
                        rangeS = re.sub(r'^\s+', '', row[rangeCol])
                        rangeS = re.sub(r'\s+$', '', rangeS)
if len(rangeS) > 0:
rangeS = float(rangeS)
if rangeS <= 0:
raise IOError("Invalid range size provided (`{0}`)".format(rangeS))
else:
rangeS = None
                    group = re.sub(r'^\s+', '', row[groupCol])
                    group = re.sub(r'\s+$', '', group)
#############################################################
# #
# Does this threshold make sense??? #
# #
#############################################################
if len(group) < 4:
raise IOError("`{0}` does not seem an actual taxonomic membership".format(group))
                    taxon = re.sub(r'^\s+', '', row[taxonCol])
                    taxon = re.sub(r'\s+$', '', taxon)
if taxon in self.taxonGroups:
raise IOError("Taxon duplicated in group file (`{0}`)".format(taxon))
else:
self.taxonGroups[taxon] = {'group': group, 'range_size': rangeS}
if rangeS is None and taxon in self.points:
point_list = [x for x in self.points[taxon].keys()]
tarea = shapes.area_estimator(point_list)
self.taxonGroups[taxon]['range_size'] = tarea
if not diversity_file is None:
with open(diversity_file, 'r') as dhandle:
table = csv.reader(dhandle, **self.csv_params)
for irow , row in enumerate(table):
if irow == 0:
for ic , cell in enumerate(row):
if re.search('group', cell, flags=re.I):
groupBisCol = ic
continue
if re.search('global_species', cell, flags=re.I):
globsppCol = ic
continue
if re.search('min_species', cell, flags=re.I):
minsppCol = ic
continue
if re.search('range_threshold', cell, flags=re.I):
rangeThresCol = ic
continue
if groupBisCol is None or (globsppCol is None and minsppCol is None):
raise IOError("Input file `{0}`: column labels do not follow the required format (headers should be `Group`, `Global_species`, `Min_species`, and `Range_threshold`).".format(assignments_file))
else:
tgroup = row[groupBisCol]
                        tsp = row[globsppCol] if globsppCol is not None else ""
range_thr = None
min_spp = None
                        if rangeThresCol is not None:
range_thr = row[rangeThresCol]
if len(range_thr) > 0:
range_thr = int(range_thr)
else:
range_thr = 10000
                        if minsppCol is not None:
min_spp = row[minsppCol]
if len(min_spp) > 0:
min_spp = int(min_spp)
if len(tsp) > 0:
tsp = int(tsp)
else:
tsp = None
if tgroup in self.taxonGroupsInfo:
raise IOError("Group duplicated in group diversity file (`{0}`)".format(tgroup))
else:
self.taxonGroupsInfo[tgroup] = {
'range_threshold': range_thr,
'global_species': tsp,
'min_spp' : min_spp}
for taxon in self.points:
if taxon not in self.taxonGroups:
raise IOError("`{0}` not included in taxonomic group assignment file".format(taxon))
def groups2search(self):
"""Set dictionaries of taxonomic group info required for search function."""
self.groupDict = {}
self.spp2groupDict = {}
# Append groups from recommended IUCN list to group dictionaries
for taxon in self.points:
if taxon in self.taxonGroups:
tgroup = self.taxonGroups[taxon]['group']
if not tgroup in self.taxonGroupsInfo:
if tgroup in iucn_groups:
thres = None
if iucn_groups[tgroup]['range_threshold'] is None:
thres = 10000
else:
thres = iucn_groups[tgroup]['range_threshold']
self.taxonGroupsInfo[tgroup] = {
'range_threshold': thres,
'global_species': None,
'min_spp': iucn_groups[tgroup]['min_spp']}
else:
raise IOError("Taxonomic group `{0}` included in neither the group diversity file nor the official IUCN taxonomic group list.".format(tgroup))
else:
raise IOError("Taxon `{0}` not included in taxonomic group assignment file.".format(taxon))
for ita, taxon in enumerate(self.points):
tgr = self.taxonGroups[taxon]['group']
grouppy = None
for igr, gr in enumerate(sorted(self.taxonGroupsInfo.keys())):
if tgr == gr:
grouppy = igr
break
if grouppy is None:
                print('Group of {0} not found in taxonomic group info'.format(taxon))
else:
self.spp2groupDict[ita] = grouppy
for igr, gr in enumerate(sorted(self.taxonGroupsInfo.keys())):
mspp = None
mran = self.taxonGroupsInfo[gr]['range_threshold']
if self.taxonGroupsInfo[gr]['min_spp']:
mspp = self.taxonGroupsInfo[gr]['min_spp']
else:
mspp = int(self.taxonGroupsInfo[gr]['global_species'] * 0.0002)
if mspp < 2:
mspp = 2
self.groupDict[igr] = (mran, mspp)
def iucnFile(self, filename):
"""
Process IUCN categories and subcriteria for criterion A from a csv file.
"""
lineCounter = 0
nameCol = None
categCol = None
criterCol = None
validCats = ['CR', 'EN', 'VU', 'NT', 'LC', 'DD', 'NE']
with open(filename,'r') as fil:
table = csv.reader(fil, **self.csv_params)
for row in table:
lineCounter += 1
if lineCounter == 1:
for ic, cell in enumerate(row):
if re.search("taxon",row[ic],flags=re.I):
nameCol = ic
continue
if re.search("categor",row[ic],flags=re.I):
categCol = ic
continue
if re.search("criter",row[ic],flags=re.I):
criterCol = ic
continue
if nameCol is None or categCol is None or criterCol is None:
raise IOError("Input file `{0}`: column labels do not follow the required format (`Taxon`, `Category`, `Criteria`).".format(filename))
else:
                    cat = row[categCol]
                    isvalid = False
                    if type(cat) == str:
                        cat = re.sub(r'^\s+', '', cat)
                        cat = re.sub(r'\s+$', '', cat)
                        if cat == '':
                            cat = 'NE'
                        for vc in validCats:
                            if vc == cat.upper():
                                isvalid = True
                    if not isvalid:
                        raise IOError("{0} has an invalid IUCN category code (`{1}`)".format(row[nameCol], cat))
subcrA = []
if type(row[criterCol]) == str:
if not re.search(r'[BCDE]', row[criterCol]) and re.search(r'A', row[criterCol]):
digits = re.findall(r'\d', row[criterCol])
if len(digits) >= 1:
for dig in digits:
if dig == '1':
subcrA.append(1)
elif dig == '2':
subcrA.append(2)
elif dig == '4':
subcrA.append(4)
elif dig != '3':
raise IOError("{0} has non valid subcriteria A (`{1}`)".format(row[nameCol], row[criterCol]))
self.iucn[row[nameCol]] = {'category': cat, 'subcritA': subcrA}
for na in [x for x in self.points if not x in self.iucn]:
self.iucn[na] = {'category': 'NE', 'subcritA': []}
def mergePoints(self, taxonName, maxDist):
"""
        Merge points using DBSCAN. The cluster scheme is stored in the values of
        the points dictionary.
"""
clusters = self.dbscan(taxonName, maxDist)
totPops = len(clusters)
for cl in clusters:
factor = 1 / len(clusters[cl])
for loc in clusters[cl]:
self.points[taxonName][loc] = factor
return totPops
def mergePointsAllTaxa(self, maxDist):
"""
Merge points of all taxa; wrapper to self.mergePoints.
"""
for taxon in self.points:
# Join points that are too close to be different populations
self.totPops[taxon] = self.mergePoints(taxon, maxDist)
def reduceArea(self, shapefile):
"""
Reduce the spatial scope of the dataset by removing all points that lie outside a provided set of polygons.
"""
self.points = shapes.filter_points(self.points, shapefile)
oldTotPops = self.totPops
self.totPops = {}
for tax in self.points:
self.totPops[tax] = oldTotPops[tax]
def dbscan(self, taxon, eps):
"""
        DBSCAN-like algorithm to cluster points. There is no minimum cluster
        size and, therefore, no noise list.
- eps: maximum distance among cluster members.
"""
clusters = {}
visited = {x:0 for x in self.points[taxon]}
for pivot in self.points[taxon]:
if visited[pivot] == 0:
visited[pivot] = 1
clusters[pivot] = [pivot]
self.expand(taxon, clusters, pivot, pivot, visited, eps)
#########################################################################
# #
# Is the following loop necessary? #
# #
#########################################################################
for q in self.points[taxon]:
qIsAlone = 1
for key in clusters:
if q in clusters[key]:
qIsAlone = 0
if qIsAlone:
clusters[q] = [q]
return clusters
def expand(self, taxon, clusters, pivot, border, visited, eps):
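        """
        Recursively grow the cluster anchored at `pivot`: any unvisited point
        within `eps` km of the current `border` point is added to the cluster
        and used as a new border for further expansion.
        """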
for newborder in self.points[taxon]:
if visited[newborder] == 0:
if border != newborder:
td = self.haversine(border, newborder)
if td < eps:
clusters[pivot].append(newborder)
visited[newborder] = 1
self.expand(taxon, clusters, pivot, newborder, visited, eps)
def haversine(self, pointA, pointB, radius = 6371):
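        """
        Great-circle distance between two (lon, lat) points using the haversine
        formula: a = sin^2(dphi/2) + cos(phi1)cos(phi2)sin^2(dlambda/2) and
        d = 2 * radius * atan2(sqrt(a), sqrt(1 - a)). The default radius is the
        Earth's mean radius in km, so the result is in kilometers.
        """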
phi1 = pointA[1] * pi / 180 # in radians
phi2 = pointB[1] * pi / 180
phiDelta = (pointB[1] - pointA[1]) * pi / 180
lambdaDelta = (pointB[0] - pointA[0]) * pi / 180
a = sin(phiDelta / 2) ** 2 + cos(phi1) * cos(phi2) * sin(lambdaDelta / 2) ** 2
c = 2 * atan2(a ** 0.5, (1-a) ** 0.5)
d = radius * c
return d
def filter_nulls(self, tiles):
"""
        Filters out tiles that are null and updates the species-to-group dictionary.
        Null tiles often occur after clearing data points at localities that fall
        within previously delimited KBAs.
Arguments:
- tiles (list): Tiles output by `getlist` method.
"""
newlist = []
newdict = {}
counter = 0
for it, ti in enumerate(tiles):
if not ti.isNull():
newlist.append(ti)
if len(self.spp2groupDict) > 0:
newdict[counter] = self.spp2groupDict[it]
counter +=1
if len(self.spp2groupDict) > 0:
self.spp2groupDict = newdict
return newlist
def getTiles(self, cellSize, offsetLat = 0, offsetLon = 0):
"""
Create basic data structures required for the analysis from a collection
of distributional points. Returns a list of data.Tile objects.
Arguments:
- cellSize (int or float): Size of the cells making up the lattice. If
the grid is made up of squares, `cellSize` will be the side of the square.
"""
if cellSize > (self.maxLatitude - self.minLatitude) or cellSize > (self.maxLongitude - self.minLongitude):
raise ValueError("Grid cell size (`cellSize`) is larger than the extent of the input distributions.")
self.cellSize = float(cellSize)
tileStack = []
self.rows, self.cols = 0, 0
offsetLat = float(offsetLat)
offsetLon = float(offsetLon)
self.originN = ((self.minLongitude - offsetLon), (self.maxLatitude + offsetLat))
span = [max((self.maxLongitude - self.originN[0]), self.cellSize), max((self.originN[1] - self.minLatitude), self.cellSize)]
totCols = int(ceil(span[0] / self.cellSize))
totRows = int(ceil(span[1] / self.cellSize))
span[0] = totCols * self.cellSize
span[1] = totRows * self.cellSize
self.rows, self.cols = totRows, totCols
self.presence_grid = [[0 for x in range(totCols)] for x in range(totRows)]
grid_coll = []
for taxon in self.points:
grid = [[0 for x in range(totCols)] for x in range(totRows)]
for lon,lat in self.points[taxon]:
apprindx = ceil( ( (lon - self.originN[0]) / span[0]) * totCols)
apprindy = ceil( ( (self.originN[1] - lat) / span[1]) * totRows)
x = apprindx - 1
y = apprindy - 1
if x < 0:
x += 1
if y < 0:
y += 1
#th = self.points[taxon][lon,lat] / totPops
th = self.points[taxon][lon,lat]
th /= self.totPops[taxon]
grid[y][x] += th
self.presence_grid[y][x] += th
grid_coll.append(grid)
self.index_reg = {}
act_size = 0
for ir, row in enumerate(self.presence_grid):
for ic, cel in enumerate(row):
if cel > 0:
self.index_reg[(ir, ic)] = act_size
act_size += 1
for it, taxon in enumerate(self.points):
cat = self.iucn[taxon]['category']
tile = pydata.Meshpy(act_size, taxon, cat)
for sca in self.iucn[taxon]['subcritA']:
tile.newThreatSubcriteriaA(sca)
if len(self.taxonGroups) > 0 and taxon in self.taxonGroups and self.taxonGroups[taxon]['range_size']:
tile.setRange(self.taxonGroups[taxon]['range_size'])
for r in range(self.rows):
for c in range(self.cols):
if self.presence_grid[r][c] > 0:
rowNeighs = [r]
colNeighs = [c]
if grid_coll[it][r][c] > 0:
tile.setValue(self.index_reg[(r, c)], grid_coll[it][r][c])
if r > 0:
rowNeighs.append(r-1)
if r < (self.rows - 1):
rowNeighs.append(r+1)
if c > 0:
colNeighs.append(c-1)
if c < (self.cols - 1):
colNeighs.append(c+1)
for nr in rowNeighs:
if r != nr and self.presence_grid[nr][c] > 0:
tile.linkNeighs(self.index_reg[(r, c)], self.index_reg[(nr, c)])
for nc in colNeighs:
if c != nc and self.presence_grid[r][nc] > 0:
tile.linkNeighs(self.index_reg[(r, c)], self.index_reg[(r, nc)])
tileStack.append(tile)
return tileStack
def tile2str(self, tile):
"""
        Get a string representation of a Tile given an encoding scheme. Only for testing.
"""
if tile.getSize() == len(self.index_reg):
ms = ''
for r in range(self.rows):
for c in range(self.cols):
if self.presence_grid[r][c] > 0:
val = tile.getValue(self.index_reg[(r, c)])
if val > 0:
ms += '1 '
else:
ms += '0 '
else:
ms += '- '
ms += '\n'
return ms
def grid2shape(self, filename):
"""
Saves the grid into a shapefile.
"""
counter = 0
wrtMode = None
schema = {
'geometry': 'Polygon',
'properties': {'id': 'int', 'x': 'int', 'y': 'int', 'xBase': 'float', 'yBase': 'float'},
}
for y, x in self.index_reg:
xBase = self.originN[0] + self.cellSize * (x + 1)
yBase = self.originN[1] - self.cellSize * (y + 1)
ocor = [(xBase, yBase + self.cellSize),
(xBase - self.cellSize, yBase + self.cellSize),
(xBase - self.cellSize, yBase),
(xBase, yBase)]
pol = Polygon(ocor)
if counter == 0:
wrtMode = 'w'
else:
wrtMode = 'a'
with fiona.open(filename, wrtMode, 'ESRI Shapefile', schema, from_epsg(4326)) as fhandle:
fhandle.write({'geometry': mapping(pol), 'properties':
{'id': counter, 'x': x, 'y': y, 'xBase': xBase, 'yBase': yBase}})
counter += 1
return None | Ackbar | /Ackbar-0.2.tar.gz/Ackbar-0.2/ackbar_lib/fileio.py | fileio.py |
import os
import pyproj
import fiona
from fiona.crs import from_epsg
from shapely.geometry import shape, Point, Polygon, mapping, MultiPoint
from shapely.ops import unary_union, transform
from functools import partial
from ackbar_lib import fileio as fileio
class KBA(object):
"""
Shapefile reader.
    - path (str): Path to the folder containing shapefiles to read.
    - index_field (str): Name of the attribute field used to index the polygons.
    - encoding (str): Character encoding of the shapefiles (default 'utf8').
"""
def __init__(self, path, index_field, encoding = 'utf8'):
self.source_directory = None
self.polys = {}
self.new_trigger_spp = {}
self.encoding = encoding
self.index_field = index_field
if not os.path.exists(path):
raise IOError("`{0}` is not a valid directory.".format(path))
self.source_directory = path
#if shtype is None or (shtype != 'kba' and shtype != 'exclusion' and shtype != 'reserves'):
# raise ValueError("`{0}` is not a valid value for parameter `shtype`. Valid values are `kba`, `exclusion`, and `reserves`.".format(shtype))
#self.shape_type = shtype
for directory, subdi, files in os.walk(self.source_directory):
for fi in files:
if fi.endswith('.shp'):
try:
toop = directory + "/" + fi
filehandle = fiona.open(toop, crs= 'EPSG:4326', encoding = self.encoding)
for item in filehandle:
#self.polys.append(shape(item['geometry']))
self.polys[item['properties'][self.index_field]] = {
'shape': shape(item['geometry']),
'new_spp': {}
}
except:
raise IOError("Could not open file `{0}`".format(fi))
# Maybe it is not necessary to merge all polygons
#self.upoly = unary_union(self.polys)
def spp_inclusion(self, distroData):
"""
Verify if new species could support previously delimited KBA.
"""
if isinstance(distroData, fileio.InputData):
#print("In inclusion")
#self.new_trigger_spp = {x:{} for x in range(len(self.polys))}
for k in self.polys:
for spp in distroData.points:
pointsWithin = []
popsize = 0
isTrigger = False
criteria = []
for p in distroData.points[spp]:
tp = Point(p)
if self.polys[k]['shape'].contains(tp):
pointsWithin.append(p)
popsize += distroData.points[spp][p]
#print(popsize)
if distroData.iucn[spp]['category'] in ['CR', 'EN']:
if popsize > 0.95:
isTrigger = True
criteria.append(4)
criteria.append(0)
elif popsize > 0.005:
isTrigger = True
criteria.append(0)
if popsize >= 0.01 and len(distroData.iucn[spp]['subcritA']) > 0:
isTrigger = True
criteria.append(2)
elif distroData.iucn[spp]['category'] == 'VU':
if popsize > 0.01:
isTrigger = True
criteria.append(1)
elif popsize >= 0.02 and len(distroData.iucn[spp]['subcritA']) > 0:
isTrigger = True
criteria.append(3)
if popsize > 0.1:
isTrigger = True
criteria.append(5)
if isTrigger:
#self.new_trigger_spp[ik][spp] = criteria
self.polys[k]['new_spp'][spp] = criteria
if len(pointsWithin) > 0:
for q in pointsWithin:
distroData.points[spp][q] = 0
def new_spp_table(self, filename):
"""
Writes out a simple csv file indicating new trigger species to previously
delimited KBA.
"""
crmap = {0: 'A1a', 1: 'A1b', 2: 'A1c', 3: 'A1d', 4: 'A1e', 5: 'B1', 6: 'B2'}
bffr = '{0},Species,Criteria\n'.format(self.index_field)
for kbaid in self.polys:
for sp in self.polys[kbaid]['new_spp']:
cr = '"'
for c in self.polys[kbaid]['new_spp'][sp]:
cr += crmap[c] + ', '
cr = cr.rstrip(', ')
cr += '"'
bffr += '{0},{1},{2}\n'.format(kbaid, sp, cr)
with open(filename, 'w') as fhandle:
fhandle.write(bffr)
def solution2shape(mysols, indata, dic_name = 'solutions'):
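    """
    Write solutions to shapefiles, one file per solution group, inside a newly
    created directory `dic_name`. Each solution is dissolved into a single
    polygon (the union of its selected grid cells) and written with its
    `IUCNscore`, `aggrScore` and `NDMscore` attributes.
    """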
if os.path.exists(dic_name):
raise IOError("There is already a directory called `{0}`. Please chose another name for the solution folder.".format(dic_name))
else:
os.mkdir(dic_name)
irkeys = list(indata.index_reg.keys())
wrtMode = None
schema = {
'geometry': 'Polygon',
'properties': {'id': 'int',
'IUCNscore': 'int',
'aggrScore': 'float',
'NDMscore': 'float'},
}
for igr, gr in enumerate(mysols):
filename = '{0}/group_{1}.shp'.format(dic_name, igr)
for its, tsol in enumerate(gr):
polys = []
solpoly = None
for ic in range(tsol.getSize()):
if tsol.getValue(ic) > 0:
y, x = irkeys[ic]
xBase = indata.originN[0] + indata.cellSize * x
yBase = indata.originN[1] - indata.cellSize * y
ocor = [(xBase + indata.cellSize, yBase),
(xBase, yBase),
(xBase, yBase - indata.cellSize),
(xBase + indata.cellSize, yBase - indata.cellSize)]
polys.append(Polygon(ocor))
solpoly = unary_union(polys)
if its == 0:
wrtMode = 'w'
else:
wrtMode = 'a'
with fiona.open(filename, wrtMode, 'ESRI Shapefile', schema, from_epsg(4326)) as c:
c.write({
'geometry': mapping(solpoly),
'properties': {
'id': its,
'IUCNscore': tsol.score,
'aggrScore': tsol.aggrScore,
'NDMscore': tsol.ndmScore
}})
return None
def area_estimator(point_list, lat0 = 0, lon0 = -73, factor = 0.9992):
"""
Estimates the area (km^2) of the convex hull of a set of points. Points
should be longitude-latitude points, projected in the WGS84 datum. Area will
be estimated using the Transverse Mercator projection. Origin coordinates of
the Transverse Mercator projection should be provided; if not set, the
    Colombian official origin will be used. A scale factor can also be passed
    (default = 0.9992).
"""
out = None
if len(point_list) > 2:
convex_hull = MultiPoint(point_list).convex_hull
wgs84 = pyproj.Proj(init='epsg:4326')
tm = pyproj.Proj(proj='tmerc', lat_0 = lat0, lon_0 = lon0, k_0=factor, units='m')
project = partial(pyproj.transform, wgs84, tm)
tm_ch = transform(project, convex_hull)
out = tm_ch.area / 1000 ** 2
return out
def filter_points(points, shapefile):
"""
    Filter the points of a fileio.InputData object, keeping only those that fall
    within the polygons of the given shapefile.
"""
feats = []
filtered = {tax:{} for tax in points}
with fiona.open(shapefile, encoding="utf8") as src:
feats = [shape(x['geometry']) for x in src]
for taxon in points:
for lon, lat in points[taxon]:
keep = False
for polyg in feats:
if polyg.contains(Point(lon, lat)):
keep = True
break
if keep:
filtered[taxon][(lon, lat)] = points[taxon][(lon, lat)]
filtered = {tax: filtered[tax] for tax in filtered if len(filtered[tax]) > 0}
return filtered | Ackbar | /Ackbar-0.2.tar.gz/Ackbar-0.2/ackbar_lib/shapes.py | shapes.py |
# Acodis API Handler
This package provides easy to use python classes and functions to communicate with Acodis API (https://acodis.io).
Acodis is an IDP solution that focuses on extracting and structuring complex documents (PDFs, images).
## Installation
```bash
pip install AcodisApiHandler
```
## Usage
This package is particularly useful for programmatic access, since ACODIS' API structure requires a different
**user** and **password** for every export step. Hence, if you have to manage multiple exports
(e.g. multiple workflows), you just need to update the main class `user` and `password` attributes,
and call the `authenticate()` method again (a short sketch follows the basic example below).
```python
from AcodisApiHandler import AcodisApiHandler
# Set up your credentials
ACODIS_BASE_URL = "https://<YOUR-ACOIDS-INSTANCE-URL>/workbench/api/transaction"
ACODIS_USER = "<YOUR-EXPORT-USERNAME>"
ACODIS_PASSWORD = "<YOUR-EXPORT-PASSWORD>"
# Create an instance of the AcodisApiHandler class
handler = AcodisApiHandler(ACODIS_BASE_URL)
# Set the credentials
handler.user = ACODIS_USER
handler.password = ACODIS_PASSWORD
# Authenticate with the API
handler.authenticate()
handler.workflow(pdf_path="<PATH-TO-PDF-FILE>")
# The extraction result is an ElementTree XML object stored in the handler.result variable
# You can check it by:
print(handler.result)
```
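
To switch to another export step (e.g. a second workflow), you only need to swap the credentials and
re-authenticate. The snippet below is a minimal sketch; the second credential pair and file path are
placeholders you would replace with your own:

```python
# Hypothetical credentials for a second export step / workflow
handler.user = "<YOUR-OTHER-EXPORT-USERNAME>"
handler.password = "<YOUR-OTHER-EXPORT-PASSWORD>"

# Re-authenticate and run the other workflow
handler.authenticate()
handler.workflow(pdf_path="<PATH-TO-ANOTHER-PDF-FILE>")
```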
## Utils
This package also provides some utils to help you with the extraction process.
### Extracting tagged data points
Tags are used to identify the data points that you want to extract from the document.
This function will create a dictionary with the tags as keys and the extracted data as values.
```python
from AcodisApiHandler import extract_tags
tags_list = ["example_tag_1", "example_tag_1", "example_tag_1"]
# Using the previously created handler instance
tagged_data = extract_tags(handler, tags_list)
```
If we print the `tagged_data` variable we will get:
```python
{
"example_tag_1": "Example data 1",
"example_tag_2": "Example data 2",
"example_tag_3": "Example data 3"
}
```
## License
[MIT](https://choosealicense.com/licenses/mit/)
## Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
## Roadmap
- [ ] Additional utils: parsing tables, extracting images, etc.
- [ ] Add unit tests
- [ ] Add batch processing and parallelization | AcodisApiHandler | /AcodisApiHandler-0.3.4.tar.gz/AcodisApiHandler-0.3.4/README.md | README.md |
from acolyte.api import BaseAPIHandler
from acolyte.core.service import Result
from acolyte.core.storage.flow_template import FlowTemplateDAO
from acolyte.core.storage.flow_instance import FlowInstanceDAO
handlers = [
# start a flow instance
{
"url": r"/v1/flow/template/(\d+)/start",
"http_method": "post",
"service": "FlowExecutorService",
"method": "start_flow",
"path_variables": [
"flow_template_id"
],
"body_variables": {
"description": "description",
"start_flow_args": "start_flow_args",
"group": "group"
},
"context_variables": {
"current_user_id": "initiator"
}
},
# run an action of the job
{
"url": r"/v1/flow/instance/(\d+)/([\w_]+)/([\w_]+)",
"http_method": "post",
"service": "FlowExecutorService",
"method": "handle_job_action",
"path_variables": [
"flow_instance_id",
"target_step",
"target_action"
],
"body_variables": {
"action_args": "action_args"
},
"context_variables": {
"current_user_id": "actor"
}
},
# create a flow group
{
"url": r"/v1/flow/group/create",
"http_method": "post",
"service": "FlowExecutorService",
"method": "create_flow_instance_group",
"body_variables": {
"name": "name",
"description": "description",
"meta": "meta"
}
}
]
class StartFlowByTplNameHandler(BaseAPIHandler):
def post(self, tpl_name):
check_token_rs = self._check_token()
if not check_token_rs.is_success():
self._output_result(check_token_rs)
return
flow_tpl_dao = FlowTemplateDAO(self._("db"))
flow_template = flow_tpl_dao.query_flow_template_by_name(tpl_name)
if flow_template is None:
self._output_result(
Result.bad_request(
"unknow_template",
msg="找不到名称为'{tpl_name}'的flow template".format(
tpl_name=tpl_name)))
return
body = self.json_body()
initiator = body.get("initiator")
        # The initiator passed in the request body takes precedence
initiator = initiator or check_token_rs.data["id"]
rs = self._("FlowExecutorService").start_flow(
flow_template_id=flow_template.id,
description=body.get("description", ""),
start_flow_args=body.get("start_flow_args", ""),
initiator=initiator
)
self._output_result(rs)
class RunActionByTplNameHandler(BaseAPIHandler):
def post(self, tpl_name, target_step, target_action):
check_token_rs = self._check_token()
if not check_token_rs.is_success():
self._output_result(check_token_rs)
return
flow_tpl_dao = FlowTemplateDAO(self._("db"))
flow_instance_dao = FlowInstanceDAO(self._("db"))
flow_template = flow_tpl_dao.query_flow_template_by_name(tpl_name)
if flow_template is None:
self._output_result(
Result.bad_request(
"unknow_template",
msg="找不到名称为'{tpl_name}'的flow template".format(
tpl_name=tpl_name)))
return
running_instance_list = flow_instance_dao\
.query_running_instance_list_by_tpl_id(flow_template.id)
if not running_instance_list:
self._output_result(Result.bad_request(
"no_running_instance",
msg="'{tpl_name}'下没有任何正在执行的实例".format(tpl_name=tpl_name)))
return
elif len(running_instance_list) > 1:
self._output_result(Result.bad_request(
"more_than_one",
msg="'{tpl_name}'下有不止1个实例正在执行".format(tpl_name=tpl_name)
))
return
running_instance = running_instance_list.pop()
body = self.json_body()
rs = self._("FlowExecutorService").handle_job_action(
flow_instance_id=running_instance.id,
target_step=target_step,
target_action=target_action,
action_args=body.get("action_args", {}),
actor=check_token_rs.data["id"]
)
self._output_result(rs) | Acolyte | /Acolyte-0.0.1.tar.gz/Acolyte-0.0.1/acolyte/api/flow_executor.py | flow_executor.py |
from abc import ABCMeta
import simplejson as json
from functools import wraps
from tornado.web import RequestHandler
from acolyte.util import log
from acolyte.util.json import to_json
from acolyte.core.service import Result
class BaseAPIHandler(RequestHandler, metaclass=ABCMeta):
def __init__(self, application, request):
super().__init__(application, request)
def _(self, service_id):
return BaseAPIHandler.service_container.get_service(service_id)
def json_body(self):
body = self.request.body
if not body:
return {}
return json.loads(self.request.body)
def _check_token(self):
"""检查请求头中的token是否合法
"""
try:
token = self.request.headers["token"]
return self._("UserService").check_token(token)
except KeyError:
return Result.bad_request("token_not_exist",
"Can't find token in request headers")
def _output_result(self, rs):
"""将result对象按照json的格式输出
"""
self.set_header('Content-Type', 'application/json;charset=utf-8')
self.set_status(rs.status_code)
self.write(to_json(rs))
self.finish()
def response_json(func):
@wraps(func)
def _func(self, *args, **kwds):
rs = func(self, *args, **kwds)
self.set_header('Content-Type', 'application/json;charset=utf-8')
self.set_status(rs.status_code)
self.write(to_json(rs))
self.finish()
return _func
class APIHandlerBuilder:
"""该类创建的Builder对象可以由Service方法自动创建出
对应的APIHandler
"""
def __init__(self, service_id, method_name, http_mtd):
self._service_id = service_id
self._method_name = method_name
self._http_mtd = http_mtd
self._bind_path_vars = {}
self._bind_body_vars = {}
self._bind_context_vars = {}
def bind_path_var(self, path_var_index, mtd_arg_name, handler=None):
"""将tornado的path variable绑定到service方法的参数上
:param path_var_index: path variable的索引,从1计数
:param mtd_arg_name: 方法参数名
"""
self._bind_path_vars[path_var_index] = mtd_arg_name, handler
return self
def bind_body_var(self, body_var_name, mtd_arg_name, handler=None):
"""将body中的值提取出来,绑定到service方法的参数上
:param body_var_name: body参数名称
:param mtd_arg_name: 方法参数名
"""
self._bind_body_vars[body_var_name] = mtd_arg_name, handler
return self
def bind_context_var(self, context_var_name, mtd_arg_name, handler=None):
self._bind_context_vars[context_var_name] = mtd_arg_name, handler
return self
def build(self):
"""执行最终构建
"""
bases = (BaseAPIHandler,)
attrs = {}
_bind_path_vars = self._bind_path_vars
_bind_body_vars = self._bind_body_vars
_bind_context_vars = self._bind_context_vars
_service_id = self._service_id
_method_name = self._method_name
def handler(self, *args):
nonlocal _bind_path_vars
nonlocal _bind_body_vars
nonlocal _bind_context_vars
nonlocal _service_id
nonlocal _method_name
            # Check the token
check_token_rs = self._check_token()
if not check_token_rs.is_success():
self._output_result(check_token_rs)
return
current_user_id = check_token_rs.data["id"]
service_args = {}
            # Fill in the path variables
for idx, val in enumerate(args, start=1):
mtd_arg_name, handler = _bind_path_vars[idx]
service_args[mtd_arg_name] = val if handler is None \
else handler(val)
            # Fill in the body variables
json_body = self.json_body()
for body_var_name, (mtd_arg_name, handler) in \
_bind_body_vars.items():
val = json_body.get(body_var_name)
service_args[mtd_arg_name] = val if handler is None \
else handler(val)
            # Fill in the context variables
for context_var_name, (mtd_arg_name, handler) in \
_bind_context_vars.items():
if context_var_name == "current_user_id":
service_args[mtd_arg_name] = current_user_id \
if handler is None else handler(current_user_id)
rs = getattr(self._(_service_id), _method_name)(**service_args)
log.api.debug((
"execute service "
"service_id = {service_id} "
"method_name = {method_name} "
"service_args = {service_args}"
).format(service_id=_service_id, method_name=_method_name,
service_args=service_args))
self._output_result(rs)
attrs[self._http_mtd] = handler
class_name = self._mtd_name_to_class_name()
return type(class_name, bases, attrs)
def _mtd_name_to_class_name(self):
str_buf = []
for idx, char in enumerate(self._method_name):
            # Uppercase the first letter and letters that follow an underscore
if idx == 0 or self._method_name[idx - 1] == '_':
str_buf.append(char.upper())
elif char != '_':
str_buf.append(char.lower())
return "".join(str_buf) + "Handler" | Acolyte | /Acolyte-0.0.1.tar.gz/Acolyte-0.0.1/acolyte/api/__init__.py | __init__.py |
import typing
import inspect
import datetime
from abc import ABCMeta
from acolyte.core.job import ActionArg
from acolyte.core.flow import FlowMeta
from acolyte.core.notify import ReadStatus
from acolyte.core.mgr import AbstractManager
from acolyte.util.validate import (
Field,
IntField,
StrField
)
from acolyte.util.lang import get_source_code
class ViewObject(metaclass=ABCMeta):
"""ViewObject用做最终输出的视图对象,可以直接序列化为json
"""
...
class UserSimpleView(ViewObject):
"""用户信息的简单视图
"""
@classmethod
def from_user(cls, user):
return cls(user.id, user.email, user.name)
def __init__(self, id_, email, name):
self.id = id_
self.email = email
self.name = name
class FlowMetaView(ViewObject):
"""FlowMeta渲染视图
"""
@classmethod
def from_flow_meta(cls, flow_meta: FlowMeta, job_mgr: AbstractManager):
"""从FlowMeta对象来构建
:param flow_meta: flow meta对象
:param job_mgr: 用于获取Job对象
"""
jobs = [JobRefView.from_job_ref(job_ref) for job_ref in flow_meta.jobs]
return cls(
flow_meta.name, flow_meta.description, jobs,
get_source_code(flow_meta.__class__))
def __init__(self, name: str, description: str,
jobs: list, source_code: str):
"""
        :param name: flow meta name
        :param description: description
        :param jobs: list of jobs
        :param source_code: source code
"""
self.name = name
self.description = description
self.jobs = jobs
self.source_code = source_code
class JobRefView(ViewObject):
@classmethod
def from_job_ref(cls, job_ref) -> ViewObject:
return cls(job_ref.step_name, job_ref.job_name, job_ref.bind_args)
def __init__(self, step_name: str, job_name: str,
bind_args: dict):
"""
        :param step_name: step name
        :param job_name: job name
        :param bind_args: bound arguments
"""
self.step_name = step_name
self.job_name = job_name
self.bind_args = bind_args
class FieldInfoView(ViewObject):
"""字段的类型和验证视图
"""
@classmethod
def from_field_obj(cls, field: Field):
"""从Field对象进行构建
"""
if isinstance(field, StrField):
return _StrFieldView(field.required, field.default,
field.min_len, field.max_len, field.regex)
elif isinstance(field, IntField):
return _IntFieldView(
field.required, field.default, field.min, field.max)
else:
return cls(field.type.__name__, field.required, field.default)
def __init__(self, type_: str, required: bool, default: typing.Any):
self.type = type_
self.required = required
self.default = default
class _IntFieldView(FieldInfoView):
def __init__(self, required: bool, default: int,
min_: int or None, max_: int or None):
super().__init__('int', required, default)
self.min = min_
self.max = max_
class _StrFieldView(FieldInfoView):
def __init__(self, required: bool, default: str,
min_len: int or None, max_len: int or None,
regex: str or None):
super().__init__('str', required, default)
self.min_len = min_len
self.max_len = max_len
self.regex = regex
class JobArgView(ViewObject):
@classmethod
def from_job_arg(cls, job_arg: ActionArg) -> ViewObject:
return cls(
job_arg.name,
FieldInfoView.from_field_obj(job_arg.field_info),
job_arg.mark,
job_arg.comment,
)
def __init__(self, name: str, field_info: FieldInfoView,
mark: str, comment: str):
self.name = name
self.field_info = field_info
self.mark = mark
self.comment = comment
class FlowTemplateView(ViewObject):
@classmethod
def from_flow_template(cls, flow_template, user):
return FlowTemplateView(
id_=flow_template.id,
flow_meta=flow_template.flow_meta,
name=flow_template.name,
bind_args=flow_template.bind_args,
max_run_instance=flow_template.max_run_instance,
creator_info=UserSimpleView.from_user(user),
config=flow_template.config
)
def __init__(self, id_: int, flow_meta: str, name: str,
bind_args: dict, max_run_instance: int,
creator_info: UserSimpleView, config: dict):
self.id = id_
self.flow_meta = flow_meta
self.name = name
self.bind_args = bind_args
self.max_run_instance = max_run_instance
self.creator_info = creator_info
self.config = config
class FlowTemplateSimpleView(ViewObject):
"""简化版的FlowTemplate视图
主要用于FlowInstance视图
"""
@classmethod
def from_flow_template(cls, flow_template, flow_meta_mgr):
flow_meta = flow_meta_mgr.get(flow_template.flow_meta)
return cls(flow_template.id, flow_meta.name, flow_template.name)
def __init__(self, id_, flow_meta_name, name):
"""
        :param id_: id
        :param flow_meta_name: flow meta name
        :param name: flow template name
"""
self.id = id_
self.flow_meta_name = flow_meta_name
self.name = name
class FlowSimpleInstanceView(ViewObject):
"""描述一个FlowInstance的简单实例
"""
@classmethod
def from_flow_instance(
cls, flow_instance, group, flow_template, flow_meta_mgr, creator):
return cls(
id_=flow_instance.id,
status=flow_instance.status,
description=flow_instance.description,
current_step=flow_instance.current_step,
group=group,
created_on=flow_instance.created_on,
updated_on=flow_instance.updated_on,
flow_template_info=FlowTemplateSimpleView.from_flow_template(
flow_template, flow_meta_mgr),
creator_info=UserSimpleView.from_user(creator)
)
def __init__(self, id_, status, description, current_step,
group, created_on, updated_on, flow_template_info,
creator_info):
"""
        :param id_: id
        :param status: status
        :param description: description
        :param current_step: the step the flow is currently running
        :param group: the group the instance belongs to
        :param created_on: creation time
        :param updated_on: last update time
        :param flow_template_info: flow_template view
        :param creator_info: creator view
"""
self.id = id_
self.status = status
self.description = description
self.current_step = current_step
self.group = group
self.created_on = created_on
self.updated_on = updated_on
self.flow_template_info = flow_template_info
self.creator_info = creator_info
class ActionDetailsView(ViewObject):
@classmethod
def from_action_mtd(cls, act_mtd, job_args):
action_name = act_mtd.__name__[len("on_"):]
return cls(
action_name,
doc=act_mtd.__doc__,
args=[JobArgView.from_job_arg(a) for a in job_args[action_name]]
)
def __init__(self, name: str, doc: str, args: typing.List[JobArgView]):
self.name = name
self.doc = doc
self.args = args
class JobDetailsView(ViewObject):
@classmethod
def from_job(cls, job):
action_methods = [
ActionDetailsView.from_action_mtd(mtd, job.job_args)
for mtd_name, mtd in inspect.getmembers(job, inspect.ismethod)
if mtd_name.startswith("on_")
]
return cls(job.name, job.description, action_methods,
get_source_code(job.__class__))
def __init__(self, name: str, description: str,
actions: typing.List[ActionDetailsView], source_code: str):
self.name = name
self.description = description
self.actions = actions
self.source_code = source_code
class JobInstanceSimpleView(ViewObject):
@classmethod
def from_job_instance(cls, job_instance):
return cls(
id_=job_instance.id,
step_name=job_instance.step_name,
job_name=job_instance.job_name,
status=job_instance.status,
created_on=job_instance.created_on,
updated_on=job_instance.updated_on
)
def __init__(self, id_, step_name, job_name, status,
created_on, updated_on):
self.id = id_
self.step_name = step_name
self.job_name = job_name
self.status = status
self.created_on = created_on
self.updated_on = updated_on
class FlowInstanceDetailsView(ViewObject):
@classmethod
def from_flow_instance(cls, flow_instance, flow_tpl_view,
initiator_info, job_instance_list,
flow_discard_reason=None):
return cls(
id_=flow_instance.id,
flow_tpl_view=flow_tpl_view,
status=flow_instance.status,
initiator_info=initiator_info,
updated_on=flow_instance.updated_on,
created_on=flow_instance.created_on,
steps=[
JobInstanceSimpleView.from_job_instance(instance)
for instance in job_instance_list
],
flow_discard_reason=flow_discard_reason
)
def __init__(self, id_: int, flow_tpl_view: FlowTemplateSimpleView,
status: str, initiator_info: UserSimpleView,
updated_on: datetime.datetime, created_on: datetime.datetime,
steps: typing.List[JobInstanceSimpleView],
flow_discard_reason=None):
"""FlowInstance详情
:param id: 编号
:param flow_tpl_view: simple flow template view
:param status: 状态
:param initiator_info: 触发者信息
:param updated_on: 最近更新时间
:param created_on: 创建时间
:param steps: 执行步骤列表
:param flow_discard_reason: 废弃原因
"""
self.id = id_
self.flow_tpl = flow_tpl_view
self.status = status
self.initiator_info = initiator_info
self.updated_on = updated_on
self.created_on = created_on
self.steps = steps
self.flow_discard_reason = flow_discard_reason
class JobActionDataDetailsView(ViewObject):
@classmethod
def from_job_action_data(cls, job_action_data, actor: UserSimpleView=None):
return JobActionDataDetailsView(
id_=job_action_data.id,
action_name=job_action_data.action,
arguments=job_action_data.arguments,
data=job_action_data.data,
created_on=job_action_data.created_on,
updated_on=job_action_data.updated_on,
actor=actor
)
def __init__(self, id_: int, action_name: str, arguments: typing.Dict,
data: typing.Dict, created_on: datetime.datetime,
updated_on: datetime.datetime, actor: UserSimpleView=None):
self.id = id_
self.action_name = action_name
self.arguments = arguments
self.data = data
self.created_on = created_on
self.updated_on = updated_on
self.actor = actor
class JobInstanceDetailsView(ViewObject):
@classmethod
def from_job_instance(
cls, job_instance, action_data_list, action_actors=None):
if action_actors is None:
action_actors = {}
return JobInstanceDetailsView(
id_=job_instance.id,
step_name=job_instance.step_name,
job_name=job_instance.job_name,
status=job_instance.status,
updated_on=job_instance.updated_on,
created_on=job_instance.created_on,
actions=[
JobActionDataDetailsView.from_job_action_data(
act_data,
actor=UserSimpleView.from_user(
action_actors[act_data.actor])
)
for act_data in action_data_list
]
)
def __init__(self, id_: int, step_name: str, job_name: str,
status: str, created_on: datetime.datetime,
updated_on: datetime.datetime,
actions: typing.List[JobActionDataDetailsView]):
self.id = id_
self.step_name = step_name
self.job_name = job_name
self.status = status
self.updated_on = updated_on
self.created_on = created_on
self.actions = actions
class DecisionView(ViewObject):
@classmethod
def from_decision_define(cls, decision, job_define):
return cls(
job_name=job_define.name,
decision_name=decision.name,
title=decision.title,
prompt=decision.prompt,
options=[
DecisionOptionView.from_decision_option_define(
option,
job_define
) for option in decision.options
]
)
def __init__(self, job_name, decision_name, title, prompt, options):
self.job_name = job_name
self.decision_name = decision_name
self.title = title
self.prompt = prompt
self.options = options
class DecisionOptionView(ViewObject):
@classmethod
def from_decision_option_define(cls, decision_option, job_define):
return cls(
action=decision_option.action,
label=decision_option.label,
action_args=[
JobArgView.from_job_arg(action_arg)
for action_arg in job_define.job_args.get(
decision_option.action, [])]
)
def __init__(self, action: str, label: str,
action_args: typing.List[JobArgView]):
self.action = action
self.label = label
self.action_args = action_args
class FlowInstanceGroupDetailsView(ViewObject):
@classmethod
def from_flow_instance_group(
cls, flow_instance_group, flow_simple_instance_view_lst):
return cls(
id_=flow_instance_group.id,
name=flow_instance_group.name,
description=flow_instance_group.description,
status=flow_instance_group.status,
created_on=flow_instance_group.created_on,
updated_on=flow_instance_group.updated_on,
flow_simple_instance_view_lst=flow_simple_instance_view_lst
)
def __init__(self, id_: int, name: str, description: str, status: str,
created_on: datetime.datetime, updated_on: datetime.datetime,
flow_simple_instance_view_lst):
self.id = id_
self.name = name
self.description = description
self.status = status
self.created_on = created_on
self.updated_on = updated_on
self.flow_simple_instance_view_lst = flow_simple_instance_view_lst
class FlowInstanceGroupSimpleView(ViewObject):
@classmethod
def from_flow_instance_group(
cls, flow_instance_group, sub_flow_num, sub_flow_status):
return cls(
id_=flow_instance_group.id,
name=flow_instance_group.name,
description=flow_instance_group.description,
status=flow_instance_group.status,
created_on=flow_instance_group.created_on,
updated_on=flow_instance_group.updated_on,
sub_flow_num=sub_flow_num,
sub_flow_status=sub_flow_status
)
def __init__(self, id_: int, name: str, description: str,
status: str, created_on: datetime.datetime,
updated_on: datetime.datetime, sub_flow_num: int,
sub_flow_status: typing.Dict[str, int]):
self.id = id_
self.name = name
self.description = description
self.status = status
self.created_on = created_on
self.updated_on = updated_on
self.sub_flow_num = sub_flow_num
self.sub_flow_status = sub_flow_status
class NotifySimpleView(ViewObject):
@classmethod
def from_notify_index(cls, notify_index, notify_tpl_manager):
notify_tpl = notify_tpl_manager.get(notify_index.notify_template)
return cls(
id_=notify_index.id,
subject=notify_tpl.render_subject(
**notify_tpl.subject_template_args),
digest=notify_tpl.render_digest(
**notify_tpl.content_template_args),
read_status=notify_index.read_status,
created_on=notify_index.created_on
)
def __init__(
self, id_: int, subject: str,
digest: str, read_status: ReadStatus,
created_on: datetime.datetime):
self.id = id_
self.subject = subject
self.digest = digest
self.read_status = read_status
self.created_on = created_on
class NotifyDetailsView(ViewObject):
@classmethod
def from_notify_index(cls, notify_index, notify_tpl_manager):
notify_tpl = notify_tpl_manager.get(notify_index.notify_template)
return cls(
id_=notify_index.id,
subject=notify_tpl.render_subject(
**notify_tpl.subject_template_args),
digest=notify_tpl.render_digest(
**notify_tpl.digest_template_args),
content=notify_tpl.render_content(
**notify_tpl.content_template_args),
read_status=notify_index.read_status,
created_on=notify_index.created_on
)
def __init__(self, id_: int, subject: str, digest: str, content: str,
read_status: ReadStatus, created_on: datetime.datetime):
self.id = id_
self.subject = subject
self.digest = digest
self.content = content
self.read_status = read_status
self.created_on = created_on | Acolyte | /Acolyte-0.0.1.tar.gz/Acolyte-0.0.1/acolyte/core/view.py | view.py |
import collections
from abc import ABCMeta
from acolyte.core.storage.job_action_data import JobActionDataDAO
class AbstractFlowContext(collections.Mapping, metaclass=ABCMeta):
"""上下文对象用于在Flow运行中的Job之间传递数据
"""
def __init__(self, flow_executor, config):
super().__init__()
self._flow_executor = flow_executor
self._config = config
@property
def config(self):
"""用于获取一个template的配置对象
"""
return self._config
def init(self):
...
def destroy(self):
...
def failure(self):
"""Job可以在action中随时回调此方法终结flow的执行
"""
self._flow_executor._failure_whole_flow(self)
def finish(self, waiting_for=None):
"""Job可以在action中调用此方法来表示自己已经执行完毕
"""
self._flow_executor._finish_step(self, waiting_for)
def save(self, data):
"""Job可以通过该方法保存持久化的数据
"""
...
def finish_instance_group(self):
self._flow_executor._finish_instance_group(self)
class MySQLContext(AbstractFlowContext):
"""基于MySQL的上下文实现
"""
class _ActionQueueDelegate:
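        """Per-context wrapper around an action queue; every call forwards to
        the underlying queue together with the owning context."""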
def __init__(self, context, queue):
self._context = context
self._queue = queue
def init(self, *tasks, trigger_consume_action=False):
self._queue.init(
self._context, *tasks,
trigger_consume_action=trigger_consume_action)
def take(self):
return self._queue.take(self._context)
def ack(self,
task_id, *, trigger_consume_action=False, auto_finish=False):
self._queue.ack(self._context, task_id,
trigger_consume_action=trigger_consume_action,
auto_finish=auto_finish)
def clear(self):
self._queue.clear(self._context)
@property
def untake_num(self):
return self._queue.untake_num(self._context)
@property
def taken_num(self):
return self._queue.taken_num(self._context)
@property
def acked_num(self):
return self._queue.acked_num(self._context)
@property
def dropped_num(self):
return self._queue.dropped_num(self._context)
class _ActionQueueContainer(dict):
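        """Dict of queue delegates keyed by queue name, also accessible as attributes."""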
def __init__(self, context, action_queues):
if action_queues is None:
action_queues = []
super().__init__({
aq.name: MySQLContext._ActionQueueDelegate(context, aq)
for aq in action_queues})
def __setattr__(self, name, value):
self[name] = value
def __getattr__(self, name):
return self[name]
def __init__(self, flow_executor, config, db, flow_instance_id,
job_instance_id=None, job_action_id=None, job_action=None,
flow_meta=None, current_step=None, actor=0,
action_queues=None):
"""
        :param flow_executor: the executor currently running the flow
        :param db: data source
        :param flow_instance_id: flow instance ID
        :param job_instance_id: job instance ID
        :param job_action_id: job action ID
        :param job_action: job action name
        :param flow_meta: flow meta information
        :param current_step: the job step currently being run
"""
super().__init__(flow_executor, config)
self._db = db
self._flow_instance_id = flow_instance_id
self._job_instance_id = job_instance_id
self._job_action = job_action
self._job_action_id = job_action_id
self._flow_meta = flow_meta
self._current_step = current_step
self._actor = actor
self._queue = MySQLContext._ActionQueueContainer(self, action_queues)
@property
def flow_instance_id(self):
return self._flow_instance_id
@property
def job_instance_id(self):
return self._job_instance_id
@property
def job_action(self):
return self._job_action
@property
def job_action_id(self):
return self._job_action_id
@property
def flow_meta(self):
return self._flow_meta
@property
def current_step(self):
return self._current_step
@property
def actor(self):
return self._actor
@property
def queue(self):
return self._queue
def __getitem__(self, key):
return self._db.query_one_field((
"select v from flow_context where "
"flow_instance_id = %s and k = %s"
), (self._flow_instance_id, key))
def __setitem__(self, key, value):
self._db.execute((
"insert into flow_context ("
"flow_instance_id, k, v) values ("
"%s, %s, %s) on duplicate key update "
"v = %s"
), (self._flow_instance_id, key, value, value))
def __delitem__(self, key):
return self._db.execute((
"delete from flow_context where "
"flow_instance_id = %s and k = %s limit 1"
), (self._flow_instance_id, key))
def __len__(self):
return self._db.query_one_field((
"select count(*) as c from flow_context where "
"flow_instance_id = %s"
), (self._flow_instance_id,))
def __iter__(self):
return self.keys()
def get(self, key, value=None):
v = self[key]
if v is None:
return value
return v
def items(self):
rs = self._db.query_all((
"select k, v from flow_context where "
"flow_instance_id = %s"
), (self._flow_instance_id,))
return [(row["k"], row["v"]) for row in rs]
def keys(self):
rs = self._db.query_all((
"select k from flow_context where "
"flow_instance_id = %s"
), (self._flow_instance_id,))
return [row['k'] for row in rs]
def values(self):
rs = self._db.query_all((
"select v from flow_context where "
"flow_instance_id = %s"
), (self._flow_instance_id,))
return [row['v'] for row in rs]
def destroy(self):
self._db.execute((
"delete from flow_context where "
"flow_instance_id = %s"
), (self._flow_instance_id,))
def save(self, data):
action_dao = JobActionDataDAO(self._db)
action_dao.update_data(self._job_action_id, data)
def save_with_key(self, data_key, data):
action_dao = JobActionDataDAO(self._db)
        action_dao.update_data_with_key(self._job_action_id, data_key, data)

# ==== End of file: acolyte/core/context.py ====
import datetime
from typing import List, Dict, Any
from abc import ABCMeta
from acolyte.core.job import JobRef
class FlowMeta(metaclass=ABCMeta):
"""flow meta
每个流程都可以抽象成flow meta,比如工程更新、SQL审核、机器审核等等
"""
def __init__(self, name: str, jobs: List[JobRef],
start_args: Dict[str, Any]=None,
failure_args: Dict[str, Any]=None,
description: str=""):
"""
:param name: flow meta名称
:param jobs: 包含的JobRef对象列表
:param bind_args: 绑定的静态参数,格式 {start: {args}, stop: {args}}
"""
self._name = name
self._jobs = jobs
self._start_args = start_args
self._failure_args = failure_args
self._description = description
self._job_ref_map = {job_ref.step_name: job_ref for job_ref in jobs}
@property
def name(self):
return self._name
@property
def jobs(self):
return self._jobs
@property
def start_args(self):
if self._start_args is None:
return {}
return self._start_args
@property
def failure_args(self):
if self._failure_args is None:
return {}
return self._failure_args
@property
def description(self):
return self._description
def on_start(self, context, arguments):
"""当Flow启动时,执行此逻辑
:param context: flow执行上下文
:param arguments: 生成的运行时参数
"""
...
def on_failure(self, context, arguments):
"""当Flow被终止时,执行此逻辑
:param context: flow执行上下文
:param arguments: 生成的运行时参数
"""
...
def on_finish(self, context):
"""当flow结束是,执行此逻辑
:param context: flow执行上下文
"""
...
def on_discard(self, context):
"""当flow instance被废弃的时候,回调此逻辑
"""
...
def get_next_step(self, current_step):
"""根据当前步骤获取下一个执行步骤
"""
# 当前step为start的情况
if current_step == "start":
if self._jobs:
return self._jobs[0]
else:
return "finish"
for idx, job_ref in enumerate(self._jobs):
if job_ref.step_name == current_step:
if idx < len(self._jobs) - 1:
return self._jobs[idx + 1]
else:
return "finish"
def get_job_ref_by_step_name(self, step_name):
return self._job_ref_map.get(step_name, None)
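# A minimal sketch of how a concrete FlowMeta might be declared, assuming a
# hypothetical "deploy" job has been registered with the job manager (the job
# and step names below are made up for illustration):
#
#     class DeployFlowMeta(FlowMeta):
#
#         def __init__(self):
#             super().__init__(
#                 name="deploy_flow",
#                 jobs=[JobRef(step_name="deploy", job_name="deploy")],
#                 description="build and deploy a project")
#
#         def on_start(self, context, **arguments):
#             context["initiated"] = "yes"
#
# get_next_step() then walks the jobs list in order: "start" -> first JobRef,
# each JobRef -> the following one, and the last JobRef -> "finish".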
class FlowTemplate:
def __init__(self, id_: int, flow_meta: str, name: str,
bind_args: Dict[str, Any], max_run_instance: int,
config: Dict[str, Any], creator: int,
created_on: datetime.datetime):
"""根据FlowMeta来生成的Flow模板
:param flow_meta: 使用的flow_meta
:param name: 模板名称
:param bind_args: 绑定的参数
:param max_run_instance: 最大可运行实例数目
:param config: 配置数据,该数据可以在上下文当中被引用
:param creator: 创建者
:param created_on: 创建时间
"""
self.id = id_
self.flow_meta = flow_meta
self.name = name
self.bind_args = bind_args
self.max_run_instance = max_run_instance
self.config = config
self.creator = creator
self.created_on = created_on
class FlowStatus:
"""flow的当前运行状态
"""
STATUS_WAITING = "waiting" # 等待执行
STATUS_INIT = "init" # 初始化,正在运行on_start
STATUS_RUNNING = "running" # 正在执行
STATUS_FINISHED = "finished" # 已经完成
STATUS_FAILURE = "failure" # 已经失败
STATUS_EXCEPTION = "exception" # 由于异常而终止
STATUS_DISCARD = "discard" # 废弃
class FlowInstance:
"""描述flow template的运行实例
"""
def __init__(self, id_: int, flow_template_id: int, initiator: int,
current_step: str, status, description, created_on,
updated_on):
"""
:param id_: 每个flow运行实例都会有一个唯一ID
:param flow_template_id: 所属的flow_template
:param initiator: 发起人
:param current_step: 当前执行到的步骤
:param status: 执行状态
:param created_on: 创建时间
:param updated_on: 最新更新步骤时间
"""
self.id = id_
self.flow_template_id = flow_template_id
self.initiator = initiator
self.current_step = current_step
self.status = status
self.description = description
self.created_on = created_on
self.updated_on = updated_on
class FlowDiscardReason:
"""Flow instance的废弃原因
"""
def __init__(self, flow_instance_id, actor, reason, discard_time):
"""
:param flow_instance_id: flow instance id
:param actor: 执行人
:param reason: 废弃原因
:param discard_time: 废弃时间
"""
self.flow_instance_id = flow_instance_id
self._actor = actor
self.reason = reason
self.discard_time = discard_time
class FlowInstanceGroupStatus:
STATUS_RUNNING = "running"
STATUS_FINISHED = "finished"
class FlowInstanceGroup:
"""用于标记一组逻辑上有关联的FlowInstance
"""
def __init__(self,
id_, name, description, meta,
status, created_on, updated_on):
"""
:param id_: 编号
:param name: 名称
:param description: 描述
:param meta: 元数据
:param status: 状态
:param created_on: 创建时间
:param updated_on: 状态更新时间
"""
self.id = id_
self.name = name
self.description = description
self.meta = meta
self.status = status
self.created_on = created_on
        self.updated_on = updated_on

# ==== End of file: acolyte/core/flow.py ====
import locale
import simplejson as json
from collections import ChainMap
from contextlib import contextmanager
from typing import Dict, Any
from acolyte.util import log
from acolyte.util.json import to_json
from acolyte.util.service_container import ServiceContainer
from acolyte.core.service import (
AbstractService,
Result,
)
from acolyte.core.flow import (
FlowStatus,
FlowInstanceGroupStatus
)
from acolyte.core.job import (
JobStatus,
ActionArg,
ActionLockType,
ActionRunTimes,
)
from acolyte.core.context import MySQLContext
from acolyte.core.storage.user import UserDAO
from acolyte.core.storage.flow_template import FlowTemplateDAO
from acolyte.core.storage.flow_instance import FlowInstanceDAO
from acolyte.core.storage.job_instance import JobInstanceDAO
from acolyte.core.storage.job_action_data import JobActionDataDAO
from acolyte.core.storage.flow_discard_reason import FlowDiscardReasonDAO
from acolyte.core.storage.flow_instance_group import (
FlowInstanceGroupDAO,
FlowInstanceGroupRelationDAO,
)
from acolyte.core.message import default_validate_messages
from acolyte.util.validate import (
IntField,
StrField,
Field,
check,
BadReq,
InvalidFieldException
)
from acolyte.exception import ObjectNotFoundException
class FlowExecutorService(AbstractService):
def __init__(self, service_container: ServiceContainer):
super().__init__(service_container)
def _after_register(self):
# 获取各种所依赖的服务
self._db = self._("db")
self._flow_tpl_dao = FlowTemplateDAO(self._db)
self._flow_meta_mgr = self._("flow_meta_manager")
self._job_mgr = self._("job_manager")
self._flow_instance_dao = FlowInstanceDAO(self._db)
self._user_dao = UserDAO(self._db)
self._job_instance_dao = JobInstanceDAO(self._db)
self._job_action_dao = JobActionDataDAO(self._db)
self._flow_discard_dao = FlowDiscardReasonDAO(self._db)
self._flow_instance_group_dao = FlowInstanceGroupDAO(self._db)
self._flow_instance_group_rlt_dao = FlowInstanceGroupRelationDAO(
self._db)
@check(
IntField("flow_template_id", required=True),
IntField("initiator", required=True),
StrField("description", required=True, max_len=1000),
Field("start_flow_args", type_=dict, required=False,
default=None, value_of=json.loads),
IntField("group", required=False, default=0)
)
def start_flow(self, flow_template_id: int,
initiator: int, description: str,
start_flow_args: Dict[str, Any], group=0) -> Result:
"""开启一个flow进程,创建flow_instance并执行第一个Job
S1. 根据flow_template_id获取flow_template,然后获取flow_meta,如果获取失败,返回错误
S2. 检查并合并参数
S3. 检查max_run_instance
S4. 创建一条新的flow_instance记录
S5. 创建context
S6. 回调flow meta中on_start方法的逻辑
:param flow_template_id: 使用的flow template
:param initiator: 发起人
:param description: 本次flow描述
:param start_flow_args: 执行FlowMeta的on_start方法时所需要的参数
:param group: flow instance所属的组,默认为0,不属于任何组
"""
if start_flow_args is None:
start_flow_args = {}
# 检查flow_template_id是否合法
flow_template = self._flow_tpl_dao.query_flow_template_by_id(
flow_template_id)
if flow_template is None:
raise BadReq("invalid_flow_template",
flow_template_id=flow_template_id)
flow_meta = self._flow_meta_mgr.get(flow_template.flow_meta)
if flow_meta is None:
raise BadReq("invalid_flow_meta", flow_meta=flow_meta)
# 检查发起者
initiator_user = self._user_dao.query_user_by_id(initiator)
if initiator_user is None:
raise BadReq("invalid_initiator", initiator=initiator)
# 检查group
if group:
flow_instance_group = self._flow_instance_group_dao\
.query_by_id(group)
if flow_instance_group is None:
raise BadReq("group_not_exist")
# 检查并合并start_flow_args
field_rules = getattr(flow_meta.on_start, "field_rules", [])
rs = self._combine_and_check_args(
"start", field_rules, start_flow_args, flow_meta.start_args)
if rs.status_code == Result.STATUS_BADREQUEST:
return rs
start_flow_args = rs.data
# 锁定检查instance数目并创建第一条记录
flow_instance = None
if flow_template.max_run_instance > 0:
lock_key = "lock_instance_create_{tpl_id}".format(
tpl_id=flow_template_id)
with self._db.lock(lock_key):
current_instance_num = self._flow_instance_dao.\
query_running_instance_num_by_tpl_id(flow_template_id)
if current_instance_num >= flow_template.max_run_instance:
raise BadReq(
reason="too_many_instance",
allow_instance_num=flow_template.max_run_instance
)
flow_instance = self._flow_instance_dao.insert(
flow_template_id, initiator, description)
else:
flow_instance = self._flow_instance_dao.insert(
flow_template_id, initiator, description)
# 创建Context
ctx = MySQLContext(
flow_executor=self,
config=flow_template.config,
db=self._db,
flow_instance_id=flow_instance.id
)
# 回调on_start
flow_meta.on_start(ctx, **start_flow_args)
# 将状态更新到running
self._flow_instance_dao.update_status(
flow_instance.id, FlowStatus.STATUS_RUNNING)
# 添加组
if group:
self._flow_instance_group_rlt_dao.insert(flow_instance.id, group)
log.acolyte.info(
"start flow instance {}".format(to_json(flow_instance)))
return Result.ok(data=flow_instance)
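    # A minimal call sketch, assuming `executor` is a registered
    # FlowExecutorService and the template/user ids below are hypothetical:
    #
    #     rs = executor.start_flow(
    #         flow_template_id=1,
    #         initiator=42,
    #         description="nightly deploy",
    #         start_flow_args={"branch": "master"})
    #     if rs.is_success():
    #         flow_instance = rs.data  # the newly created FlowInstance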
@check(
IntField("flow_instance_id", required=True),
StrField("target_step", required=True),
StrField("target_action", required=True),
IntField("actor", required=True),
Field("action_args", type_=dict, required=False,
default=None, value_of=json.loads)
)
def handle_job_action(self, flow_instance_id: int,
target_step: str, target_action: str,
actor: int, action_args: Dict[str, Any]) -> Result:
"""处理Job中的Action
S1. 检查并获取flow实例
S2. 检查job以及job_action的存在性
S3. 检查执行人是否合法
S4. 检查当前是否可以允许该step及target_action的执行
S5. 合并以及检查相关参数
S6. 回调相关Action逻辑
S7. 返回回调函数的返回值
:param flow_instance_id: flow的标识
:param target_step: 要执行的Step
:param target_action: 自定义的动作名称
:param actor: 执行人
:param action_args: 执行该自定义动作所需要的参数
"""
if action_args is None:
action_args = {}
# 检查flow instance的id合法性
flow_instance = self._flow_instance_dao.query_by_instance_id(
flow_instance_id)
if flow_instance is None:
raise BadReq("invalid_flow_instance",
flow_instance_id=flow_instance_id)
# 检查flow instance的状态
if flow_instance.status != FlowStatus.STATUS_RUNNING:
raise BadReq("invalid_status", status=flow_instance.status)
# 获取对应的flow template和flow meta
flow_template = self._flow_tpl_dao\
.query_flow_template_by_id(flow_instance.flow_template_id)
if flow_template is None:
raise BadReq("unknown_flow_template",
flow_template_id=flow_instance.flow_template_id)
try:
flow_meta = self._flow_meta_mgr.get(flow_template.flow_meta)
except ObjectNotFoundException:
raise BadReq("unknown_flow_meta", flow_meta=flow_meta)
actor_info = self._user_dao.query_user_by_id(actor)
if actor_info is None:
raise BadReq("invalid_actor", actor=actor)
# 检查当前step以及当前step是否完成
# 检查下一个状态是否是目标状态
handler_mtd, job_def, job_ref = self._check_step(
flow_meta, flow_instance, target_step, target_action)
# 合并检查参数 request_args - template_bind_args - meta_bind_args
rs = self._check_and_combine_action_args(
job_def, target_action, action_args, job_ref, flow_template)
if rs.status_code == Result.STATUS_BADREQUEST:
return rs
action_args = rs.data
job_instance = self._job_instance_dao.query_by_instance_id_and_step(
instance_id=flow_instance_id,
step=target_step
)
action_constraint = job_def.action_constraints.get(target_action)
with self._check_constraint(
action_constraint, flow_instance_id,
job_instance, target_action):
# 如果是trigger事件,需要创建job_instance记录
if target_action == "trigger":
job_instance = self._job_instance_dao.insert(
flow_instance_id, target_step, job_def.name, actor)
self._flow_instance_dao.update_current_step(
flow_instance_id, target_step)
action = self._job_action_dao.insert(
job_instance_id=job_instance.id,
action=target_action,
actor=actor,
arguments=action_args,
data_key="",
data={}
)
ctx = MySQLContext(
flow_executor=self,
config=flow_template.config,
db=self._db,
flow_instance_id=flow_instance.id,
job_instance_id=job_instance.id,
job_action_id=action.id,
job_action=target_action,
flow_meta=flow_meta,
current_step=target_step,
actor=actor,
action_queues=job_def.action_queues
)
try:
exec_rs = handler_mtd(ctx, **action_args)
if not isinstance(exec_rs, Result):
exec_rs = Result.ok(data=exec_rs)
except Exception as e:
self._job_action_dao.delete_by_id(action.id)
log.error.exception(e)
return Result.service_error("service_error", msg="服务器开小差了")
else:
if not exec_rs.is_success():
# 如果返回结果不成功,那么允许重来
self._job_action_dao.delete_by_id(action.id)
else:
self._job_action_dao.sync_updated_on(action.id)
log.acolyte.info((
"Job action executed, "
"action_data = {action_data}, "
"action_result = {action_result}"
).format(
action_data=to_json(action),
                    action_result=to_json(exec_rs)
))
return exec_rs
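    # A minimal call sketch for driving a step by hand, assuming `executor` is a
    # registered FlowExecutorService and the ids/step/action names are hypothetical:
    #
    #     rs = executor.handle_job_action(
    #         flow_instance_id=100,
    #         target_step="deploy",
    #         target_action="trigger",   # "trigger" creates the job_instance record
    #         actor=42,
    #         action_args={"branch": "master"})
    #
    # Any other action name maps to an on_<action> method of the job definition
    # and is only allowed after "trigger" has run for that step.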
def _check_and_combine_action_args(
self, job_def, target_action, request_args,
job_ref, flow_template):
job_arg_defines = job_def.job_args.get(target_action)
# 无参数定义
if not job_arg_defines:
return Result.ok(data={})
# 获取各级的参数绑定
meta_bind_args = job_ref.bind_args.get(target_action, {})
tpl_bind_args = flow_template.bind_args.get(
job_ref.step_name, {}).get(target_action, {})
args_chain = ChainMap(request_args, tpl_bind_args, meta_bind_args)
# 最终生成使用的参数集合
args = {}
for job_arg_define in job_arg_defines:
try:
arg_name = job_arg_define.name
# auto 类型,直接从chain中取值
if job_arg_define.mark == ActionArg.MARK_AUTO:
value = args_chain[arg_name]
# static类型,从template中取值
elif job_arg_define.mark == ActionArg.MARK_STATIC:
value = tpl_bind_args.get(arg_name, None)
# 如果值是以$config.开头,那么从flow_template.config中替换值
if value and isinstance(value, str) and \
value.startswith('$config.'):
value = flow_template.config.get(
value[len('$config.'):])
# const类型,从meta中取值
elif job_arg_define.mark == ActionArg.MARK_CONST:
value = meta_bind_args.get(arg_name, None)
args[arg_name] = job_arg_define.field_info(value)
except InvalidFieldException as e:
full_field_name = "{step}.{action}.{arg}".format(
step=job_ref.step_name,
action=target_action,
arg=arg_name
)
return self._gen_bad_req_result(e, full_field_name)
return Result.ok(data=args)
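    # Argument resolution sketch: for an "auto" argument the value is looked up in
    # a ChainMap, so the caller's request_args win over the template bindings,
    # which in turn win over the flow-meta bindings. A hypothetical example:
    #
    #     request_args   = {"branch": "dev"}
    #     tpl_bind_args  = {"branch": "master", "host": "$config.deploy_host"}
    #     meta_bind_args = {"branch": "release", "timeout": 30}
    #     # -> branch resolves to "dev"; a "static" host of "$config.deploy_host"
    #     #    is replaced with flow_template.config["deploy_host"].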
def _check_step(self, flow_meta, flow_instance,
target_step, target_action):
current_step = flow_instance.current_step
# 检查当前action的方法是否存在
target_job_ref = flow_meta.get_job_ref_by_step_name(target_step)
if target_job_ref is None:
raise BadReq("unknown_target_step", target_step=target_step)
try:
job_def = self._job_mgr.get(target_job_ref.job_name)
except ObjectNotFoundException:
raise BadReq("unknown_job", job_name=target_job_ref.name)
handler_mtd = getattr(job_def, "on_" + target_action, None)
if handler_mtd is None:
raise BadReq("unknown_action_handler", action=target_action)
# 当前step即目标step
if current_step == target_step:
job_instance = self._job_instance_dao.\
query_by_instance_id_and_step(
instance_id=flow_instance.id,
step=current_step
)
if job_instance.status == JobStatus.STATUS_FINISHED:
raise BadReq("step_already_runned", step=target_step)
# 检查当前action是否被执行过
# action = self._job_action_dao.\
# query_by_job_instance_id_and_action(
# job_instance_id=job_instance.id,
# action=target_action
# )
# if action:
# raise BadReq("action_already_runned", action=target_action)
# 如果非trigger,则检查trigger是否执行过
if target_action != "trigger":
trigger_action = self._job_action_dao.\
query_by_job_instance_id_and_action(
job_instance_id=job_instance.id,
action="trigger"
)
if trigger_action is None:
raise BadReq("no_trigger")
return handler_mtd, job_def, target_job_ref
if current_step != "start":
# 当前step非目标step
job_instance = self._job_instance_dao.\
query_by_instance_id_and_step(
instance_id=flow_instance.id,
step=current_step
)
# 流程记录了未知的current_step
if job_instance is None:
raise BadReq("unknown_current_step", current_step=current_step)
# 当前的step尚未完成
if job_instance.status != JobStatus.STATUS_FINISHED:
raise BadReq("current_step_unfinished",
current_step=current_step)
# 获取下一个该运行的步骤
next_step = flow_meta.get_next_step(current_step)
if next_step.step_name != target_step:
raise BadReq("invalid_target_step", next_step=next_step)
if target_action != "trigger":
raise BadReq("no_trigger")
return handler_mtd, job_def, target_job_ref
def _combine_and_check_args(
self, action_name, field_rules, *args_dict):
"""合并 & 检查参数 先合并,后检查
:param field_rules: 字段规则
:param old_args
"""
_combined_args = ChainMap(*args_dict).new_child()
# 木有验证规则,直接返回数据
if not field_rules:
return Result.ok(data=_combined_args)
try:
for field_rule in field_rules:
# 检查并替换掉相关参数
val = field_rule(_combined_args[field_rule.name])
_combined_args[field_rule.name] = val
except InvalidFieldException as e:
full_field_name = "{action_name}.{field_name}".format(
action_name=action_name,
field_name=e.field_name
)
return self._gen_bad_req_result(e, full_field_name)
else:
return Result.ok(data=_combined_args)
def _gen_bad_req_result(self, e, full_field_name):
loc, _ = locale.getlocale(locale.LC_CTYPE)
full_reason = "{full_field_name}_{reason}".format(
full_field_name=full_field_name,
reason=e.reason
)
msg = default_validate_messages[loc][e.reason]
if e.expect is None or e.expect == "":
msg = msg.format(field_name=full_field_name)
else:
msg = msg.format(
field_name=full_field_name, expect=e.expect)
return Result.bad_request(reason=full_reason, msg=msg)
@contextmanager
def _check_constraint(self,
action_constraint, flow_instance_id,
job_instance, target_action):
if action_constraint is None:
yield
else:
if job_instance is not None:
# 测试最大可运行数
actions = self._job_action_dao.\
query_by_job_instance_id_and_action(
job_instance_id=job_instance.id,
action=target_action
)
if (
len(actions) > 0 and
action_constraint.run_times == ActionRunTimes.ONCE
):
raise BadReq("action_already_runned", action=target_action)
lock = action_constraint.lock
            # Handle the user-exclusive lock
            if lock and lock.lock_type == ActionLockType.USER_EXCLUSIVE_LOCK:
                with self._db.lock(
                        lock.gen_lock_key(flow_instance_id), 0) as lock_rs:
                    if not lock_rs:
                        raise BadReq("someone_operating")
                    yield
            else:
                # A constraint without an exclusive lock: the context manager
                # still has to yield exactly once
                yield
def _finish_step(self, ctx, waiting_for=None):
"""标记一个job instance完成,通常由action通过context进行回调
S1. 将job_instance的状态更新为finish
S2. 检查整个flow是否已经完成
S3. 如果整个流程已经完成,那么标记flow_instance的status
S4. 回调flow_meta中的on_finish事件
"""
flow_instance_id, job_instance_id = (
ctx.flow_instance_id,
ctx.job_instance_id
)
if waiting_for:
# 做个标记,偶已经执行完了
_barrier_key = self._gen_barrier_key(
job_instance_id, ctx.job_action)
ctx[_barrier_key] = True
# 加个大锁,避免mark finish状态mark重了
with self._db.lock(
"action_barrier_lock_{}".format(job_instance_id)):
job_instance = self._job_instance_dao.query_by_id(
job_instance_id)
# job 已经被标记完成了,什么都不需要做了
if job_instance.status == JobStatus.STATUS_FINISHED:
return
# 检查依赖的action是否都执行完毕了,都执行完毕了就可以安全的标记完成状态
if all(
ctx[self._gen_barrier_key(job_instance_id, action)]
for action in waiting_for
):
self._mark_finish_step(
ctx, flow_instance_id, job_instance_id)
else:
self._mark_finish_step(ctx, flow_instance_id, job_instance_id)
def _gen_barrier_key(self, job_instance_id, job_action):
return ("__action_barrier_"
"{job_instance_id}_"
"{job_action}"
).format(
job_instance_id=job_instance_id,
job_action=job_action)
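    # Barrier sketch: when a step must wait for several actions, each action calls
    # _finish_step(ctx, waiting_for=[...]) (actions usually reach it through the
    # context's finish callback). Every call first sets its own barrier key, e.g.
    # "__action_barrier_<job_instance_id>_<action>", and only the call that sees
    # all barrier keys of `waiting_for` set marks the step finished, guarded by
    # the "action_barrier_lock_<job_instance_id>" database lock.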
def _mark_finish_step(self, ctx, flow_instance_id, job_instance_id):
self._job_instance_dao.update_status(
job_instance_id=job_instance_id,
status=JobStatus.STATUS_FINISHED
)
next_step = ctx.flow_meta.get_next_step(ctx.current_step)
# 尚未完成,继续处理
if next_step != "finish":
next_step_job = self._job_mgr.get(next_step.job_name)
# 下一个Step为自动触发类型
if next_step_job.auto_trigger:
self.handle_job_action(
flow_instance_id=ctx.flow_instance_id,
target_step=next_step.step_name,
target_action="trigger",
actor=ctx.actor,
action_args={}
)
return
# 修改flow_instance的状态
self._flow_instance_dao.update_status(
flow_instance_id=flow_instance_id,
status=FlowStatus.STATUS_FINISHED
)
# 回调on_finish事件
on_finish_handler = getattr(ctx.flow_meta, "on_finish", None)
if on_finish_handler is None:
return
on_finish_handler(ctx)
def _failure_whole_flow(self, ctx):
"""终止整个flow的运行,通常由action通过context进行回调
S1. 标记job_instance的status为stop
S2. 标记flow_instance的status为stop
S3. 回调flow_meta中的on_stop事件
"""
self._job_instance_dao.update_status(
job_instance_id=ctx.job_instance_id,
status=JobStatus.STATUS_FAILURE
)
self._flow_instance_dao.update_status(
flow_instance_id=ctx.flow_instance_id,
status=FlowStatus.STATUS_FAILURE
)
# 回调on_stop事件
on_failure_handler = getattr(ctx.flow_meta, "on_failure", None)
if on_failure_handler is None:
return
on_failure_handler(ctx)
def _handle_exception(self, job_instance_id, exc_type, exc_val, traceback):
"""标记Flow为Exception状态
"""
pass
@check(
IntField("flow_instance_id", required=True),
IntField("actor_id", required=True),
StrField("discard_reason", required=False, max_len=1000, default="")
)
def discard_flow_instance(
self, flow_instance_id, actor_id, discard_reason):
"""手工废弃整个flow instance
"""
flow_instance = self._flow_instance_dao\
.query_by_instance_id(flow_instance_id)
# 检查flow_instance的存在性
if flow_instance is None:
raise BadReq("flow_instance_not_found")
if flow_instance.status not in (
FlowStatus.STATUS_RUNNING,
FlowStatus.STATUS_INIT,
FlowStatus.STATUS_WAITING
):
raise BadReq("invalid_status", current_status=flow_instance.status)
self._flow_instance_dao.update_status(
flow_instance_id, FlowStatus.STATUS_DISCARD)
# 插入废弃原因
self._flow_discard_dao.insert(
flow_instance_id, actor_id, discard_reason)
return Result.ok()
@check(
StrField("name", required=True),
StrField("description", required=False),
Field("meta", required=False, type_=dict,
value_of=json.loads, default=None)
)
def create_flow_instance_group(self, name, description, meta):
"""
创建flow instance group
S1. 检查name是否已经存在
S2. 执行创建
S3. 返回新创建的instance group对象
:param name: group name
:param description: 描述
:param meta: group相关的meta信息
"""
flow_instance_group = self._flow_instance_group_dao\
.query_by_name(name)
if flow_instance_group is not None:
raise BadReq("group_existed", name=name)
flow_instance_group = self._flow_instance_group_dao.insert(
name=name,
description=description,
meta=meta,
status=FlowInstanceGroupStatus.STATUS_RUNNING
)
return Result.ok(data=flow_instance_group)
def _finish_instance_group(self, context):
"""标记instance group记录为完成状态,通常由上下文对象进行回调
"""
flow_instance_id = context.flow_instance_id
group_id = self._flow_instance_group_rlt_dao\
.query_group_id_by_instance_id(flow_instance_id)
if not group_id:
# do nothing
return
self._flow_instance_group_dao\
.update_status(
                group_id, FlowInstanceGroupStatus.STATUS_FINISHED)

# ==== End of file: acolyte/core/flow_executor_service.py ====
import time
import random
from acolyte.util.validate import (
IntField,
StrField,
check,
BadReq
)
from acolyte.util.sec import sha1
from acolyte.core.service import AbstractService, Result
from acolyte.core.storage.user import UserDAO
from acolyte.core.storage.role import RoleDAO
from acolyte.core.storage.user_token import UserTokenDAO
class UserService(AbstractService):
_TOKEN_SALT = "6f81900c31f7fd80bd"
def __init__(self, service_container):
super().__init__(service_container)
def _after_register(self):
self._db = self._("db")
self._user_dao = UserDAO(self._db)
self._role_dao = RoleDAO(self._db)
self._user_token_dao = UserTokenDAO(self._db)
@check(
StrField("email", required=True),
StrField("password", required=True)
)
def login(self, email: str, password: str) -> Result:
"""登录
S1. 通过email和password检索用户
S2. 创建并获取新的token
S3. 存储相关用户数据到session_data
"""
user = self._user_dao.query_user_by_email_and_password(
email=email,
password=sha1(password)
)
if user is None:
raise BadReq("no_match")
# do upsert
new_token = self._gen_new_token(user.id)
self._user_token_dao.upsert_token(user.id, new_token)
# save user basic info to session data
self._user_token_dao.save_session_data(
new_token, name=user.name, email=user.email)
return Result.ok(data={"id": user.id, "token": new_token})
def _gen_new_token(self, user_id: int):
"""生成新token
规则: sha1({用户ID}{时间戳}{随机数}{salt})
"""
return sha1((
"{user_id}"
"{timestamp_int}"
"{randnum}"
"{salt}"
).format(
user_id=user_id,
timestamp_int=int(time.time()),
randnum=random.randint(10000, 99999),
salt=UserService._TOKEN_SALT))
@check(
StrField("email", required=True, regex=r'^[\w.-]+@[\w.-]+.\w+$'),
StrField("password", required=True, min_len=6, max_len=20),
StrField("name", required=True, max_len=10),
IntField("role", required=True),
StrField("github_account", required=True),
IntField("operator", required=True)
)
def add_user(self, email: str, password: str,
name: str, role: int, github_account: str,
operator: int) -> Result:
"""添加新用户
S1. 检查邮箱是否存在
S2. 检查角色是否存在
S3. 检查operator是否有权限
S4. 创建新用户
:param email: 邮箱地址
:param password: 密码,会经过sha1加密
:param name: 姓名
:param role: 角色编号
:param github_account: github账户
:param operator: 操作者
"""
# 检查是否已注册
if self._user_dao.is_email_exist(email):
raise BadReq("email_exist", email=email)
# 检查角色是否合法
if not self._role_dao.query_role_by_id(role):
raise BadReq("role_not_found", role=role)
# 检查操作者信息及权限
operator_model = self._user_dao.query_user_by_id(operator)
if operator_model is None:
raise BadReq("operator_not_found")
operator_role = self._role_dao.query_role_by_id(operator_model.role)
if operator_role.name != "admin":
raise BadReq("not_allow_operation")
# 创建新用户
new_user = self._user_dao.insert_user(
email, sha1(password), name, role, github_account)
return Result.ok(data=new_user)
@check(StrField("token", required=True))
def check_token(self, token: str) -> Result:
"""检查token
S1. 查找token相关的用户信息
S2. 返回token关联简单会话数据
"""
session_data = self._user_token_dao.query_session_data(token)
if session_data is None:
raise BadReq("invalid_token")
return Result.ok(data=session_data)
def logout(self, token: str) -> Result:
"""退出
S1. 直接从数据库中删除token记录
"""
self._user_token_dao.delete_by_token(token)
return Result.ok()
@check(
IntField("user_id", required=True),
StrField("old_password", required=True),
StrField("new_password", required=True, min_len=6)
)
def modify_password(self, user_id, old_password, new_password):
"""修改密码
:param user_id: 目标用户ID
:param old_password: 旧密码
:param new_password: 新密码
"""
# 当前密码不正确
current_password = self._user_dao.query_user_password(user_id)
if current_password != sha1(old_password):
raise BadReq("old_password_incorrect")
self._user_dao.update_password(user_id, sha1(new_password))
return Result.ok()
def profile(self, user_id: int) -> Result:
        ...

# ==== End of file: acolyte/core/user_service.py ====
import datetime
from typing import Dict, List, Any
from enum import Enum
class NotifyWay(Enum):
"""通知方式"""
WEB_INBOX = 1 # 站内通知
EMAIL = 2 # 邮件通知
SMS = 3 # 短信通知
WEIXIN = 4 # 微信通知
class NotifyTemplate:
"""通知模板
"""
def __init__(self, name, subject_template, content_template,
digest_template):
"""通知模板
"""
self._name = name
self._subject_template = subject_template
self._content_template = content_template
self._digest_template = digest_template
@property
def name(self):
return self._name
@property
def subject_template(self):
return self._subject_template
@property
def content_template(self):
return self._content_template
@property
def digest_template(self):
return self._digest_template
def render_subject(self, **subject_args):
return self._subject_template.format(**subject_args)
def render_digest(self, **digest_args):
return self._digest_template.format(**digest_args)
def render_content(self, **content_args):
return self._content_template.format(**content_args)
class Jinja2NotifyTemplate(NotifyTemplate):
"""基于jinja2的通知模板
"""
def __init__(self,
name, subject_template, content_template,
digest_template, jinja2_env):
super().__init__(
name=name,
subject_template=subject_template,
content_template=content_template,
digest_template=digest_template
)
self._jinja2_env = jinja2_env
def render_subject(self, **subject_args):
return self._render(self._subject_template, **subject_args)
def render_digest(self, **digest_args):
return self._render(self._digest_template, **digest_args)
def render_content(self, **content_args):
return self._render(self._content_template, **content_args)
def _render(self, tpl, **args):
if tpl.startswith("tpl:"):
return self._render_with_jinja2(tpl[len("tpl:"):], **args)
else:
return self._render_with_format(tpl, **args)
def _render_with_format(self, tpl, **args):
return tpl.format(**args)
def _render_with_jinja2(self, tpl, **args):
tpl = self._jinja2_env.get_template(tpl)
return tpl.render(**args)
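# Rendering dispatch sketch: a template string prefixed with "tpl:" is treated as
# the name of a Jinja2 template in the configured environment, anything else is
# rendered with str.format. A hypothetical example (template names made up):
#
#     tpl = Jinja2NotifyTemplate(
#         name="flow_finished",
#         subject_template="Flow {flow_name} finished",      # str.format path
#         content_template="tpl:notify/flow_finished.html",  # Jinja2 path
#         digest_template="{flow_name} finished",
#         jinja2_env=jinja2_env)
#     subject = tpl.render_subject(flow_name="deploy_flow")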
class ReadStatus(Enum):
"""通知阅读状态"""
UNREAD = 0 # 未读状态
READED = 1 # 已读状态
class NotifyReceiver:
"""用于通知发送接口,表示收件人
"""
def __init__(self, receiver_user, subject_template_args,
content_template_args, digest_template_args):
self._receiver_user = receiver_user
self._subject_template_args = subject_template_args
self._content_template_args = content_template_args
self._digest_template_args = digest_template_args
@property
def receiver_user(self):
return self._receiver_user
@property
def subject_template_args(self):
return self._subject_template_args
@property
def content_template_args(self):
return self._content_template_args
@property
def digest_template_args(self):
return self._digest_template_args
class NotifyIndex:
"""通知分发索引
"""
@classmethod
def from_notify_receiver(
cls, id_, notify_template, notify_receiver, notify_ways):
return cls(
id_=id_,
notify_template=notify_template,
receiver=notify_receiver._receiver_user.id,
subject_template_args=notify_receiver.subject_template_args,
content_template_args=notify_receiver.content_template_args,
digest_template_args=notify_receiver.digest_template_args,
notify_ways=notify_ways
)
def __init__(self, id_: int, notify_template: str, receiver: int,
subject_template_args: Dict[str, Any],
content_template_args: Dict[str, Any],
digest_template_args: Dict[str, Any],
notify_ways: List[NotifyWay]=None,
read_status: ReadStatus=ReadStatus.UNREAD,
created_on=None, updated_on=None):
"""
:param notify_template: 所引用的模板名称
:param receiver: 收件人
:param subject_template_args: 渲染标题所需要的参数
:param content_template_args: 渲染正文所需要的参数
:param digest_template_args: 渲染摘要所需要的参数
:param notify_ways: 通知方式
:param read_status: 阅读状态
:param created_on: 创建时间
:param updated_on: 更新时间
"""
self._id = id_
self._notify_template = notify_template
self._receiver = receiver
self._subject_template_args = subject_template_args
self._content_template_args = content_template_args
self._digest_template_args = digest_template_args
self._notify_ways = notify_ways if notify_ways else []
self._read_status = read_status
now = datetime.datetime.now()
self._created_on = created_on if created_on else now
self._updated_on = updated_on if updated_on else now
@property
def id(self):
return self._id
@property
def notify_template(self):
return self._notify_template
@property
def receiver(self):
return self._receiver
@property
def subject_template_args(self):
return self._subject_template_args
@property
def content_template_args(self):
return self._content_template_args
@property
def digest_template_args(self):
return self._digest_template_args
@property
def notify_ways(self):
return self._notify_ways
@property
def read_status(self):
return self._read_status
@property
def created_on(self):
return self._created_on
@property
def updated_on(self):
        return self._updated_on

# ==== End of file: acolyte/core/notify.py ====
messages = {
"zh_CN": {
"FlowService": {
"flow_executor_service": {
"template_not_found": "找不到ID为'{tpl_id}'的模板"
},
"get_flow_meta_info": {
"flow_meta_name_empty": "flow_meta_name参数不能为空",
"flow_meta_name_invalid_type": "flow_meta_name参数只允许字符串类型",
"flow_meta_not_exist": "找不到名称为'{flow_meta}'的FlowMeta"
},
"create_flow_template": {
"flow_meta_not_exist": "指定的flow meta对象'{flow_meta}'不存在",
"name_already_exist": "flow template '{name}' 已存在",
"invalid_creator_id": "创建者ID '{creator}' 不合法",
"not_allow_bind_const": "参数 '{arg_name}' 是const类型,不允许被覆盖"
},
"get_flow_template": {
"not_found": "找不到ID为'{flow_template_id}'的Flow template"
},
"modify_flow_template": {
"tpl_not_found": "找不到ID为 '{flow_tpl_id}' 的Flow template",
"name_exist": "名称 '{name}' 已存在"
},
"get_instance_by_time_scope": {
"invalid_time_scope": "不合法的时间范围"
},
"get_flow_instance_details": {
"not_found": "找不到ID为'{flow_instance_id}'的Flow instance",
},
"get_flow_instance_group_details": {
"group_not_exist": "找不到目标分组"
},
"get_flow_instance_group_history": {
"invalid_datescope": "不合法的时间范围"
},
},
"JobService": {
"get_job_details_by_name": {
"job_not_found": "找不到该Job的定义:'{job_name}'",
},
"get_job_instance_details": {
"instance_not_found": "找不到ID为'{job_instance_id}'的Job运行时实例"
},
"get_decision_info": {
"job_instance_not_found": "找不到ID为'{job_instance_id}'的Job运行时实例",
"decision_not_found": "找不到名称为'{decision_name}'的Decision信息",
},
},
"FlowExecutorService": {
"start_flow": {
"invalid_flow_template":
"不合法的flow template id: {flow_template_id}",
"invalid_flow_meta": "不合法的flow meta: {flow_meta}",
"invalid_initiator": "不合法的发起者ID: {initiator}",
"too_many_instance":
"无法创建更多的运行时实例,允许最大实例数目为: {allow_instance_num}"
},
"handle_job_action": {
"invalid_flow_instance":
"不合法的flow instance id: {flow_instance_id}",
"invalid_status": "当前flow instance的状态为 '{status}',无法执行action",
"unknown_flow_template":
"找不到对应的flow template: {flow_template_id}",
"unknown_flow_meta": "找不到对应的flow meta: '{flow_meta}'",
"invalid_actor": "不合法的actor id '{actor}'",
"unknown_target_step": "未知的target step: '{target_step}'",
"unknown_job": "未知的Job引用 '{job_name}'",
"unknown_action_handler": "找不到action handler: '{action}'",
"step_already_runned": "step '{step}' 已经被运行过了",
"action_already_runned": "该action已经被运行过了",
"no_trigger": "尚未执行trigger action",
"unknown_current_step": "当前step未知: '{current_step}'",
"current_step_unfinished": "当前step '{current_step}' 尚未完成",
"invalid_target_step": "下一个目标step为 '{next_step}'",
"someone_operating": "当前正有用户在执行该操作",
},
"discard_flow_instance": {
"flow_instance_not_found": "flow instance不存在",
"invalid_status": "当前工作流的状态为 '{current_status}',无法废弃",
},
"create_flow_instance_group": {
"group_existed": "名为 '{name}' 的flow instance group已经存在"
},
},
"UserService": {
"login": {
"no_match": "账号密码不匹配",
},
"add_user": {
"email_exist": "邮箱'{email}'已经存在",
"role_not_found": "指定的角色编号'{role}'不存在",
"operator_not_found": "操作人信息不合法",
"not_allow_operation": "您无权进行此项操作",
},
"check_token": {
"invalid_token": "不合法的token"
},
"modify_password": {
"old_password_incorrect": "当前密码输入有误",
}
},
}
}
# Default prompt messages for field validation
default_validate_messages = {
"zh_CN": {
"empty": "{field_name}参数不能为空",
"invalid_type": "{field_name}参数只接受{expect}类型",
"less_than_min": "{field_name}参数不能小于{expect}",
"more_than_max": "{field_name}参数不能大于{expect}",
"less_than_min_length": "{field_name}的长度不能小于{expect}",
"more_than_max_length": "{field_name}的长度不能大于{expect}",
"invalid_format": "{field_name}的格式不满足正则表达式'{expect}'"
}
}

# ==== End of file: acolyte/core/message.py ====
import datetime
import simplejson as json
from abc import ABCMeta, abstractmethod
from acolyte.util import db
from acolyte.util.json import to_json
class TaskStatus:
UNTAKE = "untake"
TAKEN = "taken"
ACKED = "acked"
DROPPED = "dropped"
class Task:
@classmethod
def from_args(cls, args):
return cls(0, TaskStatus.UNTAKE, args)
def __init__(self, id_, status, args):
self._id = id_
self._status = status
self._args = args
@property
def id(self):
return self._id
@property
def status(self):
return self._status
@property
def args(self):
return self._args
class ActionQueue(metaclass=ABCMeta):
"""Action队列
"""
def __init__(self, name, init_action, consume_action, ack_action):
self._name = name
self._init_action = init_action
self._consume_action = consume_action
self._ack_action = ack_action
@property
def name(self):
return self._name
def init(self, context, *tasks, trigger_consume_action=False):
"""初始化队列
:param trigger_consume_action: 如果为True,会在初始化之后自动触发
consume_action
"""
self._save_tasks_into_queue(context, *tasks)
if trigger_consume_action:
self._execute_action(
context, self._consume_action, {})
@abstractmethod
def _save_tasks_into_queue(self, context, *tasks):
"""批量保存任务到队列
"""
...
def take(self, context):
"""从队列中取出任务
"""
return self._take_from_queue(context)
@abstractmethod
def _take_from_queue(self, context):
...
def ack(self, context, task_id, *,
trigger_consume_action=False, auto_finish=False):
"""确认任务完毕
:param trigger_consume_action: 如果为True,并且队列没有全部消费完,
会自动触发consume_action
:param 如果队列全部消费完,是否调用finish方法
"""
self._mark_task_acked(task_id)
# 所有消息都被ack
if self._is_all_acked(context):
if auto_finish:
context.finish()
else:
if trigger_consume_action:
self._execute_action(context, self._consume_action, {})
@abstractmethod
def _mark_task_acked(self, task_id):
...
@abstractmethod
def _is_all_acked(self, context):
...
def give_back(self, context, task, *, trigger_consume_action=False):
"""将消息归还给队列
:param trigger_consume_action: 归还后是否触发consume_action
"""
        self._mark_task_untake(context, task)
if trigger_consume_action:
self._execute_action(context, self._consume_action, {})
@abstractmethod
def _mark_task_untake(self, context, task):
...
def drop_task(self, context, task, *, trigger_consume_action=False):
"""将消息丢弃
:param trigger_consume_action: 丢弃后是否触发consume_action
"""
self._mark_task_dropped(context, task)
if trigger_consume_action:
self._execute_action(context, self._consume_action, {})
@abstractmethod
def _mark_task_dropped(self, context, task):
...
@abstractmethod
def clear(self, context):
"""将整个队列中的任务标记为Dropped状态
"""
...
def _execute_action(self, context, action, args=None):
if args is None:
args = {}
context._flow_executor.handle_job_action(
flow_instance_id=context.flow_instance_id,
target_step=context.current_step,
target_action=action,
actor=context.actor,
action_args=args
)
# 各种状态数目
def untake_num(self, context):
return self._get_num_by_status(context, TaskStatus.UNTAKE)
def taken_num(self, context):
return self._get_num_by_status(context, TaskStatus.TAKEN)
def acked_num(self, context):
return self._get_num_by_status(context, TaskStatus.ACKED)
def dropped_num(self, context):
return self._get_num_by_status(context, TaskStatus.DROPPED)
@abstractmethod
def _get_num_by_status(self, context, status):
...
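# A minimal sketch of how a job action usually drives its queue through the
# context delegate (names such as ctx.queue.build_queue and the task args are
# hypothetical):
#
#     ctx.queue.build_queue.init(
#         *[Task.from_args({"module": m}) for m in ["web", "api"]],
#         trigger_consume_action=True)   # the consume action is called next
#     task = ctx.queue.build_queue.take()
#     if task is not None:
#         ctx.queue.build_queue.ack(task.id, auto_finish=True)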
class _TaskModel:
@classmethod
def from_task_and_ctx(cls, task, queue_name, context):
return cls(
id_=task.id,
flow_instance_id=context.flow_instance_id,
job_instance_id=context.job_instance_id,
queue_name=queue_name,
status=task.status,
args=task.args
)
def __init__(self, id_, flow_instance_id, job_instance_id,
queue_name, status, args):
self.id = id_
self.flow_instance_id = flow_instance_id
self.job_instance_id = job_instance_id
self.queue_name = queue_name
self.status = status
self.args = args
now = datetime.datetime.now()
self.updated_on = now
self.created_on = now
def _task_mapper(row):
return Task(
id_=row.pop("id"),
status=row["status"],
args=json.loads(row["args"])
)
class MySQLActionQueue(ActionQueue):
def __init__(self, name, init_action, consume_action, ack_action):
super().__init__(
name=name,
init_action=init_action,
consume_action=consume_action,
ack_action=ack_action
)
def _save_tasks_into_queue(self, context, *tasks):
if not tasks:
return
tasks = [_TaskModel.from_task_and_ctx(
t, self.name, context) for t in tasks]
db.default.executemany(
"insert into job_action_queue ("
"flow_instance_id, job_instance_id, queue_name, "
"status, args, updated_on, created_on) values ("
"%s, %s, %s, %s, %s, %s, %s);",
[(
t.flow_instance_id,
t.job_instance_id,
t.queue_name,
t.status,
to_json(t.args),
t.updated_on,
t.created_on
) for t in tasks])
def _take_from_queue(self, context):
task = db.default.query_one(
sql="select id, status, args from "
"job_action_queue where flow_instance_id = %s "
"and job_instance_id = %s and queue_name = %s and status = %s "
"limit 1",
args=(
context.flow_instance_id,
context.job_instance_id,
self.name,
TaskStatus.UNTAKE
),
mapper=_task_mapper
)
if not task:
return None
self._update_task_status(task.id, TaskStatus.TAKEN)
return task
def _mark_task_acked(self, task_id):
self._update_task_status(task_id, TaskStatus.ACKED)
    def _mark_task_untake(self, context, task):
        self._update_task_status(task.id, TaskStatus.UNTAKE)
    def _mark_task_dropped(self, context, task):
        self._update_task_status(task.id, TaskStatus.DROPPED)
def _update_task_status(self, id_, status):
now = datetime.datetime.now()
db.default.execute((
"update job_action_queue set status = %s, "
"updated_on = %s where id = %s"
), (status, now, id_))
def _is_all_acked(self, context):
unack_count = db.default.query_one_field(
sql="select count(*) from job_action_queue "
"where flow_instance_id = %s and job_instance_id = %s "
"and queue_name = %s and status != %s limit 1",
args=(
context.flow_instance_id,
context.job_instance_id,
self.name,
TaskStatus.ACKED
)
)
return unack_count == 0
def _get_num_by_status(self, context, status):
return db.default.query_one_field(
"select count(*) from job_action_queue "
"where flow_instance_id = %s and job_instance_id = %s "
"and queue_name = %s and status = %s",
args=(
context.flow_instance_id,
context.job_instance_id,
self._name,
status
)
)
def clear(self, context):
return db.default.execute((
"update job_action_queue set status = %s "
"where flow_instance_id = %s and job_instance_id = %s "
"and queue_name = %s"
), (
TaskStatus.DROPPED,
context.flow_instance_id,
context.job_instance_id,
self._name
        ))

# ==== End of file: acolyte/core/action_queue.py ====
from acolyte.util.validate import (
check,
StrField,
IntField,
BadReq,
)
from acolyte.core.service import (
Result,
AbstractService
)
from acolyte.core.storage.job_instance import JobInstanceDAO
from acolyte.core.storage.job_action_data import JobActionDataDAO
from acolyte.core.storage.user import UserDAO
from acolyte.core.view import (
JobDetailsView,
JobInstanceDetailsView,
DecisionView,
)
from acolyte.exception import ObjectNotFoundException
class JobService(AbstractService):
def __init__(self, service_container):
super().__init__(service_container)
self._job_mgr = self._("job_manager")
self._db = self._("db")
self._job_instance_dao = JobInstanceDAO(self._db)
self._job_action_data_dao = JobActionDataDAO(self._db)
self._user_dao = UserDAO(self._db)
@check(
StrField("job_name", required=True)
)
def get_job_details_by_name(self, job_name):
"""根据job名称来获取job定义详情
"""
try:
job_define = self._job_mgr.get(job_name)
except ObjectNotFoundException:
raise BadReq("job_not_found", job_name=job_name)
return Result.ok(data=JobDetailsView.from_job(job_define))
def get_all_job_definations(self):
"""获取所有的Job定义
"""
...
def get_job_instance_list_by_flow_instance(self, flow_instance_id):
"""根据flow_instance_id获取job_instance列表
"""
...
@check(
IntField("job_instance_id", required=True),
)
def get_job_instance_details(self, job_instance_id):
"""获取某个job_instance的详情数据,包括每个其中每个event的数据
"""
job_instance = self._job_instance_dao.query_by_id(
job_instance_id)
if job_instance is None:
raise BadReq(
"instance_not_found", job_instance_id=job_instance.id)
action_data_list = self._job_action_data_dao\
.query_by_instance_id(job_instance.id)
actor_id_list = [action_data.actor for action_data in action_data_list]
actors = self._user_dao.query_users_by_id_list(actor_id_list, True)
return Result.ok(data=JobInstanceDetailsView.from_job_instance(
job_instance, action_data_list, actors))
@check(
IntField("job_instance_id", required=True),
StrField("decision_name", required=True)
)
def get_decision_info(self, job_instance_id, decision_name):
"""获取Job的某个Decison摘要
"""
job_instance = self._job_instance_dao.query_by_id(job_instance_id)
if job_instance is None:
raise BadReq("job_instance_not_found",
job_instance_id=job_instance_id)
job_define = self._job_mgr.get(job_instance.job_name)
decision_define = job_define.get_decision(decision_name)
if decision_define is None:
raise BadReq("decision_not_found", decision_name=decision_name)
return Result.ok(DecisionView.from_decision_define(
            decision_define, job_define))

# ==== End of file: acolyte/core/job_service.py ====
import datetime
from typing import (
Dict,
Any
)
from abc import (
ABCMeta,
abstractmethod
)
from functools import wraps
from acolyte.util.validate import Field
class AbstractJob(metaclass=ABCMeta):
"""描述一个Job
实现的其它Job均需要继承该类
"""
def __init__(self, name: str, description: str, *,
ui=None, decisions=None, auto_trigger=False,
action_queues=None):
"""
:param name: Job名称
:param description: Job描述
:param job_args: Job参数声明
:param ui: 自定义终端页UI的相关信息
:param decisions: 相关的决策页面
:param auto_trigger: 是否为自动触发
:param action_queues: action队列
"""
self._name = name
self._description = description
self._job_args = {}
self._action_constraints = {}
self._ui = ui
if decisions is None:
decisions = []
self._decisions = decisions
self._decisions_dict = {
decision.name: decision for decision in decisions}
self._auto_trigger = auto_trigger
if action_queues is None:
self._action_queues = []
else:
self._action_queues = action_queues
@property
def name(self):
return self._name
@property
def description(self):
return self._description
@property
def job_args(self):
return self._job_args
@property
def action_constraints(self):
return self._action_constraints
@property
def ui(self):
return self._ui
@property
def decisions(self):
return self._decisions
@property
def auto_trigger(self):
return self._auto_trigger
@property
def action_queues(self):
return self._action_queues
def get_decision(self, decision_name):
"""根据名称获取指定的Decision对象
"""
return self._decisions_dict.get(decision_name, None)
@abstractmethod
def on_trigger(self, context, arguments):
"""当工作单元被触发时执行此动作
"""
...
class DetailsPageUI(metaclass=ABCMeta):
"""用于描述Job的自定义UI组件
"""
def __init__(self):
...
@abstractmethod
def render_instance_details_page(self, **data):
"""渲染JobInstance详情页
:param data: 渲染数据
"""
...
class Jinja2DetailsPageUI(DetailsPageUI):
"""基于Jinja2的UI自定义渲染组件
"""
def __init__(self, env, tpl):
"""
:param env: Jinja2 Environment对象
:param tpl: 引用的模板名称
"""
self._env = env
self._tpl = tpl
def render_instance_details_page(self, **data):
tpl = self._env.get_template(self._tpl)
return tpl.render(**data)
class Decision:
"""一个Decision表示一个决策页面,用户通过在DecisionUI进行表决来
触发对应的action
"""
def __init__(self, name, title, prompt, *options):
"""
:param name: Decision名称,用来在各处引用
:param title: 该Decision的简要文字说明,比如“反馈沙箱部署结果”
:param prompt: 展示在页面的提示说明
:param options: Decision的具体选项
"""
self._name = name
self._title = title
self._prompt = prompt
self._options = options
@property
def name(self):
return self._name
@property
def title(self):
return self._title
@property
def prompt(self):
return self._prompt
@property
def options(self):
return self._options
class DecisionOption:
"""Decision中所包含的决策选项
"""
def __init__(self, action, label):
"""
:param action: 该Option执行会触发的Action
:param label: 该Option在页面的简要说明
"""
self._action = action
self._label = label
@property
def action(self):
return self._action
@property
def label(self):
return self._label
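# A minimal sketch of declaring a decision with two options (the action and
# label strings are hypothetical); each option maps to an on_<action> method of
# the job that defines it:
#
#     sandbox_decision = Decision(
#         "sandbox_result", "Report the sandbox deploy result",
#         "Did the sandbox deployment succeed?",
#         DecisionOption("sandbox_ok", "Succeeded"),
#         DecisionOption("sandbox_failed", "Failed"))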
class JobStatus:
"""Job实例的各种运行状态
"""
STATUS_WAITING = "waiting"
STATUS_RUNNING = "running"
STATUS_FINISHED = "finished"
STATUS_FAILURE = "failure"
STATUS_EXCEPTION = "exception"
class JobInstance:
"""描述一个Job的运行状态
"""
def __init__(self, id_: int, flow_instance_id: int, step_name: str,
job_name: str, status: str, trigger_actor: int,
created_on, updated_on):
"""
:param id_: 每个Job的运行实例有一个编号
:param flow_instance_id: 隶属的flow_instance
:param step_name: step名称
:param job_name: job名称
:param status: 运行状态
:param trigger_actor: 触发者
:param created_on: 运行实例起始时间
:param updated_on: 最近更新状态时间
"""
self.id = id_
self.flow_instance_id = flow_instance_id
self.step_name = step_name
self.job_name = job_name
self.status = status
self.trigger_actor = trigger_actor
self.created_on = created_on
self.updated_on = updated_on
class JobActionData:
"""记录Job每一个Action执行的数据
"""
def __init__(self,
id_: int, job_instance_id: int,
action: str, actor: int,
arguments: Dict[str, Any],
data_key: str,
data: Dict[str, Any],
created_on: datetime.datetime,
updated_on: datetime.datetime):
"""
:param id_: Action实例编号
:param job_instance_id: 隶属的job instance
:param action: 动作名称
:param actor: 执行者
:param arguments: 执行该Action时所使用的参数
:param data_key: 用于标记本次action的执行
:param data: 该Action执行后回填的数据
:param created_on: 执行时间
:param finished_on: 执行结束时间
"""
self.id = id_
self.job_instance_id = job_instance_id
self.action = action
self.actor = actor
self.arguments = arguments
self.data_key = data_key
self.data = data
self.created_on = created_on
self.updated_on = updated_on
class JobRef:
"""还对象用于在FlowMeta等声明中引用一个Job
"""
def __init__(
self, step_name: str, job_name: str = None, **bind_args):
"""
:param step_name: 不能叫'start'或者'finish',这俩是保留字
"""
self._step_name = step_name
self._job_name = job_name if job_name is not None else step_name
self._bind_args = bind_args if bind_args is not None else {}
@property
def step_name(self):
return self._step_name
@property
def job_name(self):
return self._job_name
@property
def bind_args(self):
return self._bind_args
class ActionArg:
"""参数声明
"""
# 参数类型
MARK_AUTO = "auto" # 自动变量,绑定参数值可以被运行时参数值所覆盖
MARK_CONST = "const" # const类型的参数的值自FlowMeta指定后就不在变了
MARK_STATIC = "static" # static类型的参数值自FlowInstance指定后就不再变了
def __init__(self, field_info: Field, mark: str, comment: str):
"""
:param step_name: 当前步骤名称
:param job_name: 引用的job名称
:param field_info: 字段类型以及验证属性
:param mark: 字段标记 auto、const、static
:param comment: 说明
"""
self._name = field_info.name
self._field_info = field_info
self._mark = mark
self._comment = comment
@property
def name(self):
return self._name
@property
def field_info(self):
return self._field_info
@property
def mark(self):
return self._mark
@property
def comment(self):
return self._comment
def action_args(*action_args):
def _job_args(f):
f._action_args = action_args
@wraps(f)
def _f(*args, **kwds):
return f(*args, **kwds)
return _f
return _job_args
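# A minimal sketch of declaring the arguments of a job action with the
# @action_args decorator (the job class, argument names and bindings below are
# hypothetical):
#
#     class DeployJob(AbstractJob):
#
#         @action_args(
#             ActionArg(StrField("branch", required=True), ActionArg.MARK_AUTO,
#                       "branch to deploy"),
#             ActionArg(StrField("host", required=True), ActionArg.MARK_STATIC,
#                       "target host, usually bound via $config."))
#         def on_trigger(self, context, branch, host):
#             context.save({"branch": branch, "host": host})
#
# StrField comes from acolyte.util.validate; the executor reads the declared
# ActionArg list when merging request, template and meta bindings.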
class ActionLockType:
# 用户独占锁,同一时刻只允许一个用户执行
USER_EXCLUSIVE_LOCK = "user_exclusive_lock"
class ActionLock:
"""用于描述Action上加锁的信息
"""
def __init__(self, lock_key, lock_type, wait_time=0):
"""
:param lock_key: 用于标识该锁的字符串
:param lock_type: 锁类型
:param wait_time: 等待获取锁的时间,单位是秒,0为立即返回,
-1永不超时
"""
self._lock_key = lock_key
self._lock_type = lock_type
self._wait_time = wait_time
@property
def lock_key(self):
return self._lock_key
@property
def lock_type(self):
return self._lock_type
@property
def wait_time(self):
return self._wait_time
def gen_lock_key(self, value):
return "{lock_key}_{value}".format(
lock_key=self.lock_key, value=value)
class ActionRunTimes:
    ONCE = 1  # may run only once
    NO_LIMIT = 0  # no limit
class ActionConstraint:
"""描述Action执行的约束
"""
def __init__(self, *, lock=None, run_times=ActionRunTimes.ONCE):
"""
:param lock: Action执行所需要的锁
:param run_times: 运行次数
"""
self._lock = lock
self._run_times = run_times
@property
def lock(self):
return self._lock
@property
def run_times(self):
return self._run_times
def action_constraint(*, lock, run_times):
"""使用该注解为action添加约束
:param lock: Action执行的锁定类型
:param run_times: Action可运行的次数
"""
def _action_constraint(f):
f._action_constraint = ActionConstraint(
lock=lock,
run_times=run_times
)
@wraps(f)
def _f(*args, **kwds):
return f(*args, **kwds)
return _f
    return _action_constraint

# ==== End of file: acolyte/core/job.py ====
import simplejson as json
import datetime
import locale
from collections import Counter, defaultdict
from acolyte.util import log
from acolyte.util.json import to_json
from acolyte.util.validate import (
Field,
IntField,
StrField,
check,
BadReq,
InvalidFieldException
)
from acolyte.util.lang import get_from_nested_dict
from acolyte.core.service import (
AbstractService,
Result
)
from acolyte.core.flow import FlowTemplate, FlowStatus
from acolyte.core.mgr import ObjectNotFoundException
from acolyte.core.storage.flow_template import FlowTemplateDAO
from acolyte.core.storage.flow_instance import FlowInstanceDAO
from acolyte.core.storage.job_instance import JobInstanceDAO
from acolyte.core.storage.flow_discard_reason import FlowDiscardReasonDAO
from acolyte.core.storage.flow_instance_group import (
FlowInstanceGroupDAO,
FlowInstanceGroupRelationDAO
)
from acolyte.core.storage.user import UserDAO
from acolyte.core.view import (
FlowMetaView,
FlowTemplateView,
FlowTemplateSimpleView,
FlowSimpleInstanceView,
FlowInstanceDetailsView,
FlowInstanceGroupDetailsView,
FlowInstanceGroupSimpleView
)
from acolyte.core.job import ActionArg
from acolyte.core.message import default_validate_messages
class FlowService(AbstractService):
def __init__(self, service_container):
super().__init__(service_container)
def _after_register(self):
# 注入两个manager
self._flow_meta_mgr = self._("flow_meta_manager")
self._job_mgr = self._("job_manager")
db = self._("db")
self._flow_tpl_dao = FlowTemplateDAO(db)
self._user_dao = UserDAO(db)
self._flow_instance_dao = FlowInstanceDAO(db)
self._job_instance_dao = JobInstanceDAO(db)
self._flow_discard_reason_dao = FlowDiscardReasonDAO(db)
self._flow_instance_group_dao = FlowInstanceGroupDAO(db)
self._flow_instance_group_rlt_dao = FlowInstanceGroupRelationDAO(db)
def get_all_flow_meta(self) -> Result:
"""获得所有注册到容器的flow_meta信息
:return [
{
"name": "mooncake_flow",
"description": "just a test flow",
"jobs": [
{
"step_name": "programmer",
"job_name": "programmer",
"bind_args": {
"trigger": {
"a": 1,
"b": 2,
}
}
}
]
},
]
"""
all_flow_meta = [
FlowMetaView.from_flow_meta(meta, self._job_mgr)
for meta in self._flow_meta_mgr.all()
]
return Result.ok(data=all_flow_meta)
@check(
StrField("flow_meta_name", required=True),
)
def get_flow_meta_info(self, flow_meta_name) -> Result:
"""获取单个的flow_meta详情
"""
try:
flow_meta = self._flow_meta_mgr.get(flow_meta_name)
except ObjectNotFoundException:
raise BadReq("flow_meta_not_exist", flow_meta=flow_meta_name)
return Result.ok(data=FlowMetaView.from_flow_meta(
flow_meta, self._job_mgr))
@check(
StrField("flow_meta_name", required=True),
StrField("name", required=True, min_len=3, max_len=50,
regex="^[a-zA-Z0-9_]+$"),
Field("bind_args", type_=dict, required=True, value_of=json.loads),
IntField("max_run_instance", required=True, min_=0),
Field("config", type_=dict, required=True, value_of=json.loads),
IntField("creator", required=True)
)
def create_flow_template(self, flow_meta_name, name, bind_args,
max_run_instance, config, creator) -> Result:
"""创建flow_template
"""
# 获取flow_meta以及检查其存在性
try:
flow_meta = self._flow_meta_mgr.get(flow_meta_name)
except ObjectNotFoundException:
raise BadReq("flow_meta_not_exist", flow_meta=flow_meta_name)
# 检查name是否已经存在
if self._flow_tpl_dao.is_name_existed(name):
raise BadReq("name_already_exist", name=name)
# 检查creator是否存在
creator_user = self._user_dao.query_user_by_id(creator)
if not creator_user:
raise BadReq("invalid_creator_id", creator=creator)
# 校验参数
rs = self._validate_tpl_bind_args(flow_meta, config, bind_args)
if rs.status_code == Result.STATUS_BADREQUEST:
return rs
bind_args = rs.data
created_on = datetime.datetime.now()
# 插入吧!
flow_template = self._flow_tpl_dao.insert_flow_template(
flow_meta_name, name, bind_args, max_run_instance, config,
creator, created_on)
log.acolyte.info(
"New flow template created, {}".format(to_json(flow_template)))
# 返回刚创建的View
return Result.ok(
data=FlowTemplateView.from_flow_template(
flow_template, creator_user))
@check(
IntField("flow_tpl_id", required=True),
StrField("name", required=True, min_len=3, max_len=50,
regex="^[a-zA-Z0-9_]+$"),
Field("bind_args", type_=dict, required=True, value_of=json.loads),
IntField("max_run_instance", required=True, min_=0),
Field("config", type_=dict, required=True, value_of=json.loads)
)
def modify_flow_template(self, flow_tpl_id, name, bind_args,
max_run_instance, config):
"""
S1. 检查flow tpl是否存在
S2. 检查名字是否改变,如果名字改变了,那么检查是否重复
S3. 检查参数
S4. 执行修改,返回结果
"""
flow_tpl = self._flow_tpl_dao.query_flow_template_by_id(flow_tpl_id)
if flow_tpl is None:
raise BadReq("tpl_not_found", flow_tpl_id=flow_tpl_id)
# 名字已经改变了,检查新名字是否已经存在
if flow_tpl.name != name:
if self._flow_tpl_dao.is_name_existed(name):
raise BadReq("name_exist", name=name)
# 检查参数
flow_meta = self._flow_meta_mgr.get(flow_tpl.flow_meta)
rs = self._validate_tpl_bind_args(flow_meta, config, bind_args)
if rs.status_code == Result.STATUS_BADREQUEST:
return rs
bind_args = rs.data
self._flow_tpl_dao.update_flow_template(
flow_tpl_id, name, bind_args, max_run_instance, config)
creator_info = self._user_dao.query_user_by_id(flow_tpl.creator)
return Result.ok(data=FlowTemplateView.from_flow_template(
FlowTemplate(
id_=flow_tpl_id,
flow_meta=flow_tpl.flow_meta,
name=name,
bind_args=bind_args,
max_run_instance=max_run_instance,
config=config,
creator=flow_tpl.creator,
created_on=flow_tpl.created_on
),
creator_info
))
# Validate a template's bind arguments
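# The validation below rebuilds bind_args into the nested shape
# {job_name: {event: {arg_name: value}}}. For every argument a job declares:
# const arguments are never bound, auto arguments may be left as None, and
# string values prefixed with "$config." are resolved against the template's
# config dict before being validated by the declared field_info. The first
# invalid value short-circuits with a localized bad_request Result.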
def _validate_tpl_bind_args(self, flow_meta, config, bind_args):
new_bind_args = {}
for job_ref in flow_meta.jobs:
job = self._job_mgr.get(job_ref.job_name)
new_bind_args[job.name] = {}
for event, job_arg_declares in job.job_args.items():
new_bind_args[job.name][event] = {}
for job_arg_declare in job_arg_declares:
try:
bind_value = get_from_nested_dict(
bind_args, job.name, event, job_arg_declare.name)
# Arguments marked const cannot be bound
if job_arg_declare.mark == ActionArg.MARK_CONST:
continue
# If the value is None and the argument is marked auto, skip validation here
if bind_value is None and \
job_arg_declare.mark == ActionArg.MARK_AUTO:
continue
if bind_value and isinstance(bind_value, str) \
and bind_value.startswith("$config."):
bind_value = config.get(
bind_value[len("$config."):])
# Validate the value and store the normalized result
new_value = job_arg_declare.field_info(bind_value)
new_bind_args[job.name][event][
job_arg_declare.name] = new_value
except InvalidFieldException as e:
field_name = "{job_name}_{event}_{arg_name}".format(
job_name=job.name,
event=event,
arg_name=job_arg_declare.name
)
full_reason = "{field_name}_{reason}".format(
field_name=field_name,
reason=e.reason
)
# Build the localized error message
loc, _ = locale.getlocale(locale.LC_CTYPE)
msg = default_validate_messages[loc][e.reason]
if e.expect is None:
msg = msg.format(field_name=field_name)
else:
msg = msg.format(
field_name=field_name, expect=e.expect)
return Result.bad_request(full_reason, msg=msg)
return Result.ok(data=new_bind_args)
def get_all_flow_templates(self):
"""获取所有的flow_templates列表
"""
all_flow_templates = self._flow_tpl_dao.query_all_templates()
if not all_flow_templates:
return Result.ok(data=[])
users = self._user_dao.query_users_by_id_list(
[tpl.creator for tpl in all_flow_templates], to_dict=True)
templates_view = [FlowTemplateView.from_flow_template(
tpl, users[tpl.creator]) for tpl in all_flow_templates]
return Result.ok(data=templates_view)
@check(
IntField("flow_template_id", required=True)
)
def get_flow_template(self, flow_template_id: int):
"""获取单个的flow_template详情
"""
flow_template = self._flow_tpl_dao.query_flow_template_by_id(
flow_template_id)
if flow_template is None:
raise BadReq("not_found", flow_template_id=flow_template_id)
creator = self._user_dao.query_user_by_id(flow_template.creator)
return Result.ok(
FlowTemplateView.from_flow_template(flow_template, creator))
@check(
StrField("flow_meta_name", required=True)
)
def get_flow_templates_by_flow_meta_name(self, flow_meta_name: str):
"""根据flow meta来获取flow template列表
"""
try:
self._flow_meta_mgr.get(flow_meta_name)
except ObjectNotFoundException:
raise BadReq("unknown_flow_meta", flow_meta_name=flow_meta_name)
flow_templates = self._flow_tpl_dao.query_by_flow_meta_name(
flow_meta_name)
if not flow_templates:
return Result.ok(data=[])
users = self._user_dao.query_users_by_id_list(
[t.creator for t in flow_templates], True)
return Result.ok(data=[
FlowTemplateView.from_flow_template(tpl, users[tpl.creator])
for tpl in flow_templates
])
def get_all_running_instance(self):
"""获取所有运行中的实例
"""
all_running_instance = self._flow_instance_dao\
.get_flow_instance_by_status(FlowStatus.STATUS_RUNNING)
return self._build_instance_list_view(all_running_instance)
@check(
Field("begin_date", type_=datetime.date),
Field("end_date", type_=datetime.date)
)
def get_instance_by_time_scope(self, begin_date, end_date):
"""根据时间范围来查询运行实例
:param begin_date: 起始时间
:param end_date: 结束时间
"""
if begin_date > end_date:
raise BadReq("invalid_time_scope")
instance_list = self._flow_instance_dao.\
get_flow_instance_by_created_date(
begin_date,
end_date + datetime.timedelta(days=1))
return self._build_instance_list_view(instance_list)
def _build_instance_list_view(self, instance_list):
"""从instance model列表构建instance视图对象
"""
if not instance_list:
return Result.ok(data=[])
tpl_id_list = []
creator_id_list = []
instance_id_list = []
for instance in instance_list:
tpl_id_list.append(instance.flow_template_id)
creator_id_list.append(instance.initiator)
instance_id_list.append(instance.id)
templates = self._flow_tpl_dao.query_by_id_list(
tpl_id_list, True)
creators = self._user_dao.query_users_by_id_list(
creator_id_list, True)
instance_group_ids = self._flow_instance_group_rlt_dao\
.query_group_ids_by_instance_ids(instance_id_list)
groups = self._flow_instance_group_dao\
.query_by_id_list(list(instance_group_ids.values()), to_dict=True)
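# Resolve an instance's group through the instance_id -> group_id mapping;
# instances that do not belong to any group yield None.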
def get_group(instance_id):
group_id = instance_group_ids.get(instance_id)
if not group_id:
return None
return groups[group_id]
return Result.ok([
FlowSimpleInstanceView.from_flow_instance(
flow_instance=instance,
group=get_group(instance.id),
flow_template=templates[instance.flow_template_id],
flow_meta_mgr=self._flow_meta_mgr,
creator=creators[instance.initiator]
) for instance in instance_list])
@check(
IntField("flow_instance_id", required=True)
)
def get_flow_instance_details(self, flow_instance_id):
"""获取flow instance详情
"""
flow_instance = self._flow_instance_dao\
.query_by_instance_id(flow_instance_id)
if flow_instance is None:
raise BadReq("not_found", flow_instance_id=flow_instance_id)
# get flow template
flow_tpl = self._flow_tpl_dao.query_flow_template_by_id(
flow_instance.flow_template_id)
flow_tpl_view = FlowTemplateSimpleView.from_flow_template(
flow_tpl,
self._flow_meta_mgr
)
# get initiator info
initiator_info = self._user_dao.query_user_by_id(
flow_instance.initiator)
# job instance list
job_instance_list = self._job_instance_dao\
.query_by_flow_instance_id(flow_instance_id)
flow_discard_reason = None
if flow_instance.status == FlowStatus.STATUS_DISCARD:
flow_discard_reason = self._flow_discard_reason_dao\
.query_by_flow_instance_id(flow_instance.id)
return Result.ok(FlowInstanceDetailsView.from_flow_instance(
flow_instance=flow_instance,
flow_tpl_view=flow_tpl_view,
initiator_info=initiator_info,
job_instance_list=job_instance_list,
flow_discard_reason=flow_discard_reason
))
@check(
IntField("group_id", required=True)
)
def get_flow_instance_group_details(self, group_id):
"""获取flow instance group 详情
:param group_id flow instance group id
"""
flow_instance_group = self._flow_instance_group_dao\
.query_by_id(group_id)
if flow_instance_group is None:
raise BadReq("group_not_exist", group_id=group_id)
sub_flow_instance_ids = self._flow_instance_group_rlt_dao\
.query_instance_id_lst_by_group_id(group_id)
if not sub_flow_instance_ids:
return Result.ok(
FlowInstanceGroupDetailsView
.from_flow_instance_group(
flow_instance_group,
[]
))
flow_instances = self._flow_instance_dao\
.query_by_instance_id_list(sub_flow_instance_ids)
initiators = self._user_dao.query_users_by_id_list(
id_list=[fi.initiator for fi in flow_instances],
to_dict=True
)
flow_templates = self._flow_tpl_dao.query_by_id_list(
tpl_id_list=[fi.flow_template_id for fi in flow_instances],
to_dict=True
)
flow_instance_views = [
FlowSimpleInstanceView.from_flow_instance(
flow_instance=fi,
group=flow_instance_group,
flow_template=flow_templates[fi.flow_template_id],
flow_meta_mgr=self._flow_meta_mgr,
creator=initiators[fi.initiator]
) for fi in flow_instances
]
return Result.ok(
FlowInstanceGroupDetailsView
.from_flow_instance_group(
flow_instance_group,
flow_instance_views
))
@check(
Field("begin_date", type_=datetime.date, required=True),
Field("end_date", type_=datetime.date, required=True)
)
def get_flow_instance_group_history(self, begin_date, end_date):
"""
Query the flow instance group run history within the given date range.
:param begin_date: start date
:param end_date: end date
"""
if begin_date > end_date:
raise BadReq("invalid_datescope")
end_date += datetime.timedelta(days=1)
groups = self._flow_instance_group_dao.query_by_datescope(
begin_date=begin_date,
end_date=end_date
)
if not groups:
return Result.ok(data=[])
group_id_list = [g.id for g in groups]
g_f_relations = self._flow_instance_group_rlt_dao\
.query_by_group_id_list(
group_id_list=group_id_list
)
all_instance_ids = []
for v in g_f_relations.values():
all_instance_ids += v
flow_instances = self._flow_instance_dao\
.query_by_instance_id_list(all_instance_ids, to_dict=True)
all_status_counter = defaultdict(Counter)
for group_id, instance_ids in g_f_relations.items():
for instance_id in instance_ids:
flow_instance = flow_instances[instance_id]
all_status_counter[group_id][flow_instance.status] += 1
return Result.ok(data=[
FlowInstanceGroupSimpleView.from_flow_instance_group(
flow_instance_group=g,
sub_flow_num=len(g_f_relations[g.id]),
sub_flow_status=all_status_counter[g.id]
) for g in groups])
# ---- end of acolyte/core/flow_service.py (package Acolyte-0.0.1) ----
import inspect
import pkg_resources
from abc import ABCMeta, abstractmethod
from acolyte.exception import (
UnsupportOperationException,
ObjectAlreadyExistedException,
ObjectNotFoundException,
InvalidArgumentException
)
from acolyte.core.job import (
ActionConstraint,
ActionLock,
ActionLockType,
ActionRunTimes,
)
class AbstractManager(metaclass=ABCMeta):
def __init__(self):
...
@abstractmethod
def load(self):
"""加载所有对象到容器
"""
...
@abstractmethod
def register(self, name, obj):
"""注册对象到容器
"""
...
@abstractmethod
def get(self, name):
"""从容器中获取元素
"""
...
@abstractmethod
def all(self):
"""获取容器中的所有元素信息
"""
...
class ManagerChain(AbstractManager):
def __init__(self, *mgr_list):
self._mgr_list = mgr_list
def load(self):
# map() is lazy in Python 3, so iterate explicitly to make sure every
# chained manager is actually loaded.
for mgr in self._mgr_list:
mgr.load()
def register(self, name: str, obj: object):
raise UnsupportOperationException.build(ManagerChain, "register")
def get(self, name: str) -> object:
for mgr in self._mgr_list:
try:
return mgr.get(name)
except ObjectNotFoundException:
continue
else:
raise ObjectNotFoundException(name)
def all(self):
result = []
for mgr in self._mgr_list:
result += mgr.all()
return result
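# A minimal usage sketch (manager choice is illustrative): lookups fall
# through the chained managers in order and raise ObjectNotFoundException
# only when none of them holds the object; register() is intentionally
# unsupported on a chain.
#
#     chain = ManagerChain(job_manager, flow_meta_manager)
#     chain.load()
#     meta = chain.get("mooncake_flow")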
class DictBasedManager(AbstractManager):
def __init__(self):
super().__init__()
self._container = {}
def load(self):
raise UnsupportOperationException.build(DictBasedManager, "load")
def register(self, name, obj):
if name in self._container:
raise ObjectAlreadyExistedException(name)
self._container[name] = self._handle_obj(obj)
def get(self, name):
try:
return self._container[name]
except KeyError:
raise ObjectNotFoundException(name)
def all(self):
return self._container.values()
def _handle_obj(self, obj):
"""子类可以实现该方法对加载的对象做更多的处理
"""
return obj
class EntryPointManager(DictBasedManager):
def __init__(self, entry_point: str):
super().__init__()
self._entry_point = entry_point
self._container = {}
def load(self):
for ep in pkg_resources.iter_entry_points(self._entry_point):
obj = ep.load()()
self._container[obj.name] = self._handle_obj(obj)
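# Plugins are discovered through setuptools entry points. A hypothetical
# plugin package would declare something like the following in its setup.py
# (the project and class names are assumptions):
#
#     entry_points={
#         "acolyte.job": [
#             "my_job = my_pkg.jobs:MyJob",
#         ],
#         "acolyte.flow_meta": [
#             "my_flow = my_pkg.flows:MyFlowMeta",
#         ],
#     }
#
# load() imports each referenced class, instantiates it, and indexes the
# instance by its ``name`` attribute.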
class JobManager(EntryPointManager):
def __init__(self, entry_point: str):
super().__init__(entry_point)
def _handle_obj(self, obj):
for mtd_name, mtd in inspect.getmembers(obj, inspect.ismethod):
if mtd_name.startswith("on_"):
action_name = mtd_name[len("on_"):]
# Collect the declared action args
action_args = getattr(mtd, "_action_args", tuple())
obj.job_args[action_name] = action_args
# Attach the default execution constraints, which include:
# 1. a per-user exclusive lock
# 2. run_times limited to once
# 3. data_key checking
action_constraint = getattr(mtd, "_action_constraint", None)
if action_constraint is None:
# Default to a user-level exclusive lock
lock_key = "{job_name}_{action_name}".format(
job_name=obj.name, action_name=action_name)
action_lock = ActionLock(
lock_key=lock_key,
lock_type=ActionLockType.USER_EXCLUSIVE_LOCK
)
obj.action_constraints[action_name] = \
ActionConstraint(lock=action_lock,
run_times=ActionRunTimes.ONCE)
else:
if (
action_name == "trigger" and
action_constraint.run_times != ActionRunTimes.ONCE
):
# The trigger action only allows run_times = once
raise InvalidArgumentException((
"The trigger action of job '{job_name}' "
"only allow run_times = ActionRunTimes.ONCE"
).format(job_name=obj.name))
return obj
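# Illustrative sketch (the base class name is an assumption): a job method
# named on_<action> becomes an action of that name. Its argument
# declarations are read from the method's _action_args attribute and its
# constraints from _action_constraint; when no constraint is given, the
# action defaults to a per-user exclusive lock with run_times = ONCE.
#
#     class BuildJob(...):
#         name = "build"
#         def on_trigger(self, context, branch):
#             ...
#
# would expose a "trigger" action keyed under job_args["trigger"].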
# managers for job, flow_meta, notify
job_manager = JobManager("acolyte.job")
flow_meta_manager = EntryPointManager("acolyte.flow_meta")
notify_template_manager = EntryPointManager("acolyte.notify_template")
# ---- end of acolyte/core/mgr.py (package Acolyte-0.0.1) ----
import functools
from typing import Iterable
from acolyte.util.mail import send_mail
from acolyte.util.validate import check, BadReq
from acolyte.core.notify import (
NotifyReceiver,
NotifyWay,
NotifyTemplate,
ReadStatus
)
from acolyte.core.view import NotifyDetailsView, NotifySimpleView
from acolyte.core.service import AbstractService, Result
from acolyte.core.storage.notify_index import NotifyIndexDAO
class NotifyLogic(AbstractService):
def __init__(self, service_container):
super().__init__(service_container)
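# Local helper used below as the EMAIL entry of self._send_methods.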
def _send_email(tpl, receiver):
send_mail(
receiver=receiver.receiver_user.email,
subject=tpl.render_subject(**receiver.subject_template_args),
content=tpl.render_content(**receiver.content_template_args)
)
self._send_methods = {
NotifyWay.WEB_INBOX: None,
NotifyWay.EMAIL: _send_email,
NotifyWay.SMS: None,
NotifyWay.WEIXIN: None
}
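# Only EMAIL has a send implementation here; the None entries are
# unimplemented channels and are skipped by _notify(). WEB_INBOX is covered
# implicitly, since _notify() always inserts a notify index row first.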
def _after_register(self):
db = self._("db")
self._notify_index_dao = NotifyIndexDAO(db)
self._notify_executor = self._("notify_executor")
self._notify_tpl_manager = self._("notify_tpl_manager")
def notify(
self, notify_template_name: str,
notify_ways: Iterable[NotifyWay],
*receivers: NotifyReceiver):
tpl = self._notify_tpl_manager.get(notify_template_name)
notify_function = functools.partial(
self._notify, tpl_name=notify_template_name,
tpl=tpl, notify_ways=notify_ways)
def _batch_notify():
for receiver in receivers:
notify_function(receiver=receiver)
self._notify_executor.submit(_batch_notify)
return Result.ok()
def _notify(self, tpl_name: str, tpl: NotifyTemplate,
notify_ways: Iterable[NotifyWay], receiver: NotifyReceiver):
"""通知个体"""
# 不管用什么方式,先插一条站内信
self._notify_index_dao.insert(
notify_template=tpl_name,
notify_receiver=receiver,
notify_ways=notify_ways
)
for notify_way in notify_ways:
send_method = self._send_methods[notify_way]
if send_method is None:
continue
send_method(tpl, receiver)
@check()
def view_notify_details(self, notify_index_id):
notify_index = self._notify_index_dao.query_by_id(notify_index_id)
if notify_index is None:
raise BadReq("notify_index_not_found")
# Mark the notification as read if it is currently unread
if notify_index.read_status == ReadStatus.unread:
self._notify_index_dao.update_read_status(
notify_index_id, ReadStatus.READED.value)
return Result.ok(
NotifyDetailsView.from_notify_index(
notify_index, self._notify_tpl_manager))
def mark_all_readed(self, receiver_id):
"""全部标记已读
"""
self._notify_index_dao.update_read_status_by_receiver_id(
receiver_id, ReadStatus.READED.value)
return Result.ok()
def get_unread_count(self, receiver_id):
"""获得未读消息数目
"""
return Result.ok(
data=self._notify_index_dao.query_unread_num(receiver_id))
def get_all_unread(self, receiver_id):
"""获取所有的未读列表
"""
unread_notify_index_lst = self._notify_index_dao\
.query_unread(receiver_id)
if not unread_notify_index_lst:
return Result.ok(data=[])
return Result.ok(data=[
NotifySimpleView.from_notify_index(
notify_index, self._notify_tpl_manager
) for notify_index in unread_notify_index_lst])
def view_history(self, receiver_id, offset_id, limit):
"""查看历史
"""
notify_index_lst = self._notify_index_dao\
.query_by_receiver_id(receiver_id, offset_id, limit)
if not notify_index_lst:
return Result.ok(data=[])
return Result.ok(data=[
NotifySimpleView.from_notify_index(
notify_index, self._notify_tpl_manager
) for notify_index in notify_index_lst])
# ---- end of acolyte/core/notify_logic.py (package Acolyte-0.0.1) ----
from abc import (
ABCMeta,
abstractmethod
)
from typing import Dict, Any
from acolyte.util import db
from acolyte.util import log
from acolyte.util.service_container import ServiceContainer
from acolyte.core.mgr import (
job_manager,
flow_meta_manager
)
from acolyte.util.mail import load_smtp_config
from acolyte.core.flow_service import FlowService
from acolyte.core.user_service import UserService
from acolyte.core.job_service import JobService
from acolyte.core.flow_executor_service import FlowExecutorService
class AbstractBootstrap(metaclass=ABCMeta):
"""Bootstrap类用于统一初始化启动应用所需要的组件和服务
"""
def __init__(self):
...
@abstractmethod
def start(self, config):
...
class EasemobFlowBootstrap(AbstractBootstrap):
"""正式启动应用所需的Bootstrap
"""
def __init__(self):
super().__init__()
self._service_container = ServiceContainer()
def start(self, config: Dict[str, Dict[str, Any]]):
"""在这里对各种组件进行初始化
:param config: 配置数据,字典套字典什么的
"""
# Initialize logging
log.load_logger_config(config)
log.acolyte.info("Starting acolyte ...")
# Initialize the database connection pool
self._pool = db.load_db_config(config)
self._service_binding(self._service_container)
# Initialize the SMTP mail configuration
load_smtp_config(config)
log.acolyte.info("Acolyte started .")
@property
def service_container(self):
return self._service_container
def _service_binding(self, service_container: ServiceContainer):
"""将服务绑定到注册容器
"""
service_container.register(
service_id="db",
service_obj=self._pool
)
service_container.register(
service_id="job_manager",
service_obj=job_manager,
init_callback=lambda service_obj: service_obj.load()
)
service_container.register(
service_id="flow_meta_manager",
service_obj=flow_meta_manager,
init_callback=lambda service_obj: service_obj.load()
)
# Register the service classes
service_container.register_service(FlowService)
service_container.register_service(UserService)
service_container.register_service(JobService)
service_container.register_service(FlowExecutorService)
service_container.after_register()
# ---- end of acolyte/core/bootstrap.py (package Acolyte-0.0.1) ----
import datetime
from acolyte.core.storage import AbstractDAO
from acolyte.core.flow import FlowInstance, FlowStatus
def _mapper(result):
result["id_"] = result.pop("id")
return FlowInstance(**result)
class FlowInstanceDAO(AbstractDAO):
def __init__(self, db):
super().__init__(db)
def query_by_instance_id(self, instance_id):
return self._db.query_one((
"select * from flow_instance where id = %s limit 1"
), (instance_id,), _mapper)
def query_by_instance_id_list(self, id_list, to_dict=False):
if not id_list:
return []
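# Build one "%s" placeholder per id so the whole list can be passed as
# bound query parameters.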
holders = ",".join(("%s", ) * len(id_list))
rs = self._db.query_all((
"select * from flow_instance where "
"id in ({holders}) order by id"
).format(holders=holders), id_list, _mapper)
if to_dict:
return {instance.id: instance for instance in rs}
return rs
def query_running_instance_num_by_tpl_id(self, tpl_id):
return int(self._db.query_one_field((
"select count(*) as c from flow_instance "
"where flow_template_id = %s and "
"status in ('running', 'init')"
), (tpl_id,)))
def query_running_instance_list_by_tpl_id(self, tpl_id):
return self._db.query_all((
"select * from flow_instance "
"where flow_template_id = %s and "
"status = 'running'"
), (tpl_id,), _mapper)
def insert(self, flow_template_id, initiator, description):
now = datetime.datetime.now()
with self._db.connection() as conn:
with conn.cursor() as csr:
csr.execute((
"insert into flow_instance ("
"flow_template_id, initiator, current_step, status, "
"description, created_on, updated_on) values ("
"%s, %s, %s, %s, %s, %s, %s)"
), (flow_template_id, initiator, "start",
FlowStatus.STATUS_INIT, description, now, now))
conn.commit()
return FlowInstance(
id_=csr.lastrowid,
flow_template_id=flow_template_id,
initiator=initiator,
current_step="start",
status=FlowStatus.STATUS_INIT,
description=description,
created_on=now,
updated_on=now
)
def update_status(self, flow_instance_id, status):
now = datetime.datetime.now()
return self._db.execute((
"update flow_instance set status = %s, "
"updated_on = %s where id = %s limit 1"
), (status, now, flow_instance_id))
def update_current_step(self, flow_instance_id, current_step):
now = datetime.datetime.now()
return self._db.execute((
"update flow_instance set current_step = %s, "
"updated_on = %s where id = %s limit 1"
), (current_step, now, flow_instance_id))
def delete_by_instance_id(self, instance_id):
if isinstance(instance_id, list):
holders = ",".join(("%s", ) * len(instance_id))
return self._db.execute(
"delete from flow_instance where id in ({holders})".format(
holders=holders), instance_id)
else:
return self._db.execute("delete from flow_instance where id = %s",
(instance_id, ))
def get_flow_instance_by_status(self, flow_status):
return self._db.query_all((
"select * from flow_instance where status = %s"
), (flow_status, ), _mapper)
def get_flow_instance_by_created_date(self, begin_date, end_date):
return self._db.query_all((
"select * from flow_instance where "
"created_on between %s and %s order by created_on desc"
), (begin_date, end_date), _mapper)
# ---- end of acolyte/core/storage/flow_instance.py (package Acolyte-0.0.1) ----