ext | sha | content
---|---|---
py | b413e83ac6c0bcb18cc40592c12e1423eafe2294 | """
Implicit Plots
"""
from __future__ import absolute_import
from .implicit_surface import ImplicitSurface
def implicit_plot3d(f, xrange, yrange, zrange, **kwds):
r"""
Plot an isosurface of a function.
INPUT:
- ``f`` -- function
- ``xrange`` -- a 2-tuple (x_min, x_max) or a 3-tuple (x, x_min, x_max)
- ``yrange`` -- a 2-tuple (y_min, y_max) or a 3-tuple (y, y_min, y_max)
- ``zrange`` -- a 2-tuple (z_min, z_max) or a 3-tuple (z, z_min, z_max)
- ``plot_points`` -- (default: "automatic", which is 40) the number of
function evaluations in each direction. (The number of cubes in the
marching cubes algorithm will be one less than this). Can be a triple of
integers, to specify a different resolution in each of x,y,z.
- ``contour`` -- (default: 0) plot the isosurface f(x,y,z)==contour. Can be a
list, in which case multiple contours are plotted.
- ``region`` -- (default: None) If region is given, it must be a Python
callable. Only segments of the surface where region(x,y,z) returns a
number >0 will be included in the plot. (Note that returning a Python
boolean is acceptable, since True == 1 and False == 0).
EXAMPLES::
sage: var('x,y,z')
(x, y, z)
A simple sphere::
sage: implicit_plot3d(x^2+y^2+z^2==4, (x,-3,3), (y,-3,3), (z,-3,3))
Graphics3d Object
.. PLOT::
var('x,y,z')
F = x**2 + y**2 + z**2
P = implicit_plot3d(F==4, (x,-3,3), (y,-3,3), (z,-3,3))
sphinx_plot(P)
A nested set of spheres with a hole cut out::
sage: implicit_plot3d((x^2 + y^2 + z^2), (x,-2,2), (y,-2,2), (z,-2,2), plot_points=60, contour=[1,3,5],
....: region=lambda x,y,z: x<=0.2 or y>=0.2 or z<=0.2, color='aquamarine').show(viewer='tachyon')
.. PLOT::
var('x,y,z')
F = x**2 + y**2 + z**2
P = implicit_plot3d(F, (x,-2,2), (y,-2,2), (z,-2,2), plot_points=60, contour=[1,3,5],
region=lambda x,y,z: x<=0.2 or y>=0.2 or z<=0.2, color='aquamarine')
sphinx_plot(P)
A very pretty example, attributed to Douglas Summers-Stay (`archived page
<http://web.archive.org/web/20080529033738/http://iat.ubalt.edu/summers/math/platsol.htm>`_)::
sage: T = RDF(golden_ratio)
sage: F = 2 - (cos(x+T*y) + cos(x-T*y) + cos(y+T*z) + cos(y-T*z) + cos(z-T*x) + cos(z+T*x))
sage: r = 4.77
sage: implicit_plot3d(F, (x,-r,r), (y,-r,r), (z,-r,r), plot_points=40, color='darkkhaki').show(viewer='tachyon')
.. PLOT::
var('x,y,z')
T = RDF(golden_ratio)
F = 2 - (cos(x+T*y) + cos(x-T*y) + cos(y+T*z) + cos(y-T*z) + cos(z-T*x) + cos(z+T*x))
r = 4.77
V = implicit_plot3d(F, (x,-r,r), (y,-r,r), (z,-r,r), plot_points=40, color='darkkhaki')
sphinx_plot(V)
As I write this (but probably not as you read it), it's almost Valentine's
day, so let's try a heart (from http://mathworld.wolfram.com/HeartSurface.html)
::
sage: F = (x^2+9/4*y^2+z^2-1)^3 - x^2*z^3 - 9/(80)*y^2*z^3
sage: r = 1.5
sage: implicit_plot3d(F, (x,-r,r), (y,-r,r), (z,-r,r), plot_points=80, color='red', smooth=False).show(viewer='tachyon')
.. PLOT::
var('x,y,z')
F = (x**2+9.0/4.0*y**2+z**2-1)**3 - x**2*z**3 - 9.0/(80)*y**2*z**3
r = 1.5
V = implicit_plot3d(F, (x,-r,r), (y,-r,r), (z,-r,r), plot_points=80, color='red', smooth=False)
sphinx_plot(V)
The same examples also work with the default Jmol viewer; for example::
sage: T = RDF(golden_ratio)
sage: F = 2 - (cos(x + T*y) + cos(x - T*y) + cos(y + T*z) + cos(y - T*z) + cos(z - T*x) + cos(z + T*x))
sage: r = 4.77
sage: implicit_plot3d(F, (x,-r,r), (y,-r,r), (z,-r,r), plot_points=40, color='deepskyblue').show()
Here we use smooth=True with a Tachyon graph::
sage: implicit_plot3d(x^2 + y^2 + z^2, (x,-2,2), (y,-2,2), (z,-2,2), contour=4, color='deepskyblue', smooth=True)
Graphics3d Object
.. PLOT::
var('x,y,z')
F = x**2 + y**2 + z**2
P = implicit_plot3d(F, (x,-2,2), (y,-2,2), (z,-2,2), contour=4, color='deepskyblue', smooth=True)
sphinx_plot(P)
We explicitly specify a gradient function (in conjunction with smooth=True)
and invert the normals::
sage: gx = lambda x, y, z: -(2*x + y^2 + z^2)
sage: gy = lambda x, y, z: -(x^2 + 2*y + z^2)
sage: gz = lambda x, y, z: -(x^2 + y^2 + 2*z)
sage: implicit_plot3d(x^2+y^2+z^2, (x,-2,2), (y,-2,2), (z,-2,2), contour=4,
....: plot_points=40, smooth=True, gradient=(gx, gy, gz)).show(viewer='tachyon')
.. PLOT::
var('x,y,z')
gx = lambda x, y, z: -(2*x + y**2 + z**2)
gy = lambda x, y, z: -(x**2 + 2*y + z**2)
gz = lambda x, y, z: -(x**2 + y**2 + 2*z)
P = implicit_plot3d(x**2+y**2+z**2, (x,-2,2), (y,-2,2), (z,-2,2), contour=4,
plot_points=40, smooth=True, gradient=(gx, gy, gz))
sphinx_plot(P)
A graph of two metaballs interacting with each other::
sage: def metaball(x0, y0, z0): return 1 / ((x-x0)^2+(y-y0)^2+(z-z0)^2)
sage: implicit_plot3d(metaball(-0.6,0,0) + metaball(0.6,0,0), (x,-2,2), (y,-2,2), (z,-2,2), plot_points=60, contour=2, color='seagreen')
Graphics3d Object
.. PLOT::
var('x,y,z')
def metaball(x0, y0, z0): return 1 / ((x-x0)**2+(y-y0)**2+(z-z0)**2)
P = implicit_plot3d(metaball(-0.6,0,0) + metaball(0.6,0,0), (x,-2,2), (y,-2,2), (z,-2,2), plot_points=60, contour=2, color='seagreen')
sphinx_plot(P)
One can also color the surface using a coloring function and a
colormap as follows. Note that the coloring function must take
values in the interval [0,1]. ::
sage: t = (sin(3*z)**2).function(x,y,z)
sage: cm = colormaps.gist_rainbow
sage: G = implicit_plot3d(x^2 + y^2 + z^2, (x,-2,2), (y,-2,2), (z,-2, 2),
....: contour=4, color=(t,cm), plot_points=100)
sage: G.show(viewer='tachyon')
.. PLOT::
var('x,y,z')
t = (sin(3*z)**2).function(x,y,z)
cm = colormaps.gist_rainbow
G = implicit_plot3d(x**2 + y**2 + z**2, (x,-2,2), (y,-2,2), (z,-2, 2),
contour=4, color=(t,cm), plot_points=60)
sphinx_plot(G)
Here is another colored example::
sage: x, y, z = var('x,y,z')
sage: t = (x).function(x,y,z)
sage: cm = colormaps.PiYG
sage: G = implicit_plot3d(x^4 + y^2 + z^2, (x,-2,2),
....: (y,-2,2),(z,-2,2), contour=4, color=(t,cm), plot_points=40)
sage: G
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
t = (x).function(x,y,z)
cm = colormaps.PiYG
G = implicit_plot3d(x**4 + y**2 + z**2, (x,-2,2),
(y,-2,2),(z,-2,2), contour=4, color=(t,cm), plot_points=40)
sphinx_plot(G)
.. WARNING::
This kind of coloring using a colormap can be visualized using
Jmol, Tachyon (option ``viewer='tachyon'``) and Canvas3D
(option ``viewer='canvas3d'`` in the notebook).
MANY MORE EXAMPLES:
A kind of saddle::
sage: implicit_plot3d(x^3 + y^2 - z^2, (x,-2,2), (y,-2,2), (z,-2,2), plot_points=60, contour=0, color='lightcoral')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d(x**3 + y**2 - z**2, (x,-2,2), (y,-2,2), (z,-2,2), plot_points=60, contour=0, color='lightcoral')
sphinx_plot(G)
A smooth surface with six radial openings::
sage: implicit_plot3d(-(cos(x) + cos(y) + cos(z)), (x,-4,4), (y,-4,4), (z,-4,4), color='orchid')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d(-(cos(x) + cos(y) + cos(z)), (x,-4,4), (y,-4,4), (z,-4,4), color='orchid')
sphinx_plot(G)
A cube composed of eight conjoined blobs::
sage: F = x^2 + y^2 + z^2 + cos(4*x) + cos(4*y) + cos(4*z) - 0.2
sage: implicit_plot3d(F, (x,-2,2), (y,-2,2), (z,-2,2), color='mediumspringgreen')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
F = x**2 + y**2 + z**2 + cos(4*x) + cos(4*y) + cos(4*z) - 0.2
G = implicit_plot3d(F, (x,-2,2), (y,-2,2), (z,-2,2), color='mediumspringgreen')
sphinx_plot(G)
A variation of the blob cube featuring heterogeneously sized blobs::
sage: F = x^2 + y^2 + z^2 + sin(4*x) + sin(4*y) + sin(4*z) - 1
sage: implicit_plot3d(F, (x,-2,2), (y,-2,2), (z,-2,2), color='lavenderblush')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
F = x**2 + y**2 + z**2 + sin(4*x) + sin(4*y) + sin(4*z) - 1
G = implicit_plot3d(F, (x,-2,2), (y,-2,2), (z,-2,2), color='lavenderblush')
sphinx_plot(G)
A Klein bottle::
sage: G = x^2 + y^2 + z^2
sage: F = (G+2*y-1)*((G-2*y-1)^2-8*z^2) + 16*x*z*(G-2*y-1)
sage: implicit_plot3d(F, (x,-3,3), (y,-3.1,3.1), (z,-4,4), color='moccasin')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = x**2 + y**2 + z**2
F = (G+2*y-1)*((G-2*y-1)**2-8*z**2)+16*x*z*(G-2*y-1)
G = implicit_plot3d(F, (x,-3,3), (y,-3.1,3.1), (z,-4,4), color='moccasin')
sphinx_plot(G)
A lemniscate::
sage: F = 4*x^2*(x^2+y^2+z^2+z) + y^2*(y^2+z^2-1)
sage: implicit_plot3d(F, (x,-0.5,0.5), (y,-1,1), (z,-1,1), color='deeppink')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
F = 4*x**2*(x**2+y**2+z**2+z) + y**2*(y**2+z**2-1)
G = implicit_plot3d(F, (x,-0.5,0.5), (y,-1,1), (z,-1,1), color='deeppink')
sphinx_plot(G)
Drope::
sage: implicit_plot3d(z - 4*x*exp(-x^2-y^2), (x,-2,2), (y,-2,2), (z,-1.7,1.7), color='darkcyan')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d(z - 4*x*exp(-x**2-y**2), (x,-2,2), (y,-2,2), (z,-1.7,1.7), color='darkcyan')
sphinx_plot(G)
A cube with a circular aperture on each face::
sage: F = ((1/2.3)^2 * (x^2 + y^2 + z^2))^(-6) + ((1/2)^8 * (x^8 + y^8 + z^8))^6 - 1
sage: implicit_plot3d(F, (x,-2,2), (y,-2,2), (z,-2,2), color='palevioletred')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
F = ((1/2.3)**2 * (x**2 + y**2 + z**2))**(-6) + ((1/2)**8 * (x**8 + y**8 + z**8))**6 - 1
G = implicit_plot3d(F, (x,-2,2), (y,-2,2), (z,-2,2), color='palevioletred')
sphinx_plot(G)
A simple hyperbolic surface::
sage: implicit_plot3d(x^2 + y - z^2, (x,-1,1), (y,-1,1), (z,-1,1), color='darkslategray')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d(x**2 + y - z**2, (x,-1,1), (y,-1,1), (z,-1,1), color='darkslategray')
sphinx_plot(G)
A hyperboloid::
sage: implicit_plot3d(x^2 + y^2 - z^2 -0.3, (x,-2,2), (y,-2,2), (z,-1.8,1.8), color='honeydew')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d(x**2 + y**2 - z**2 -0.3, (x,-2,2), (y,-2,2), (z,-1.8,1.8), color='honeydew')
sphinx_plot(G)
Dupin cyclide (:wikipedia:`Dupin_cyclide`) ::
sage: x, y, z , a, b, c, d = var('x,y,z,a,b,c,d')
sage: a = 3.5
sage: b = 3
sage: c = sqrt(a^2 - b^2)
sage: d = 2
sage: F = (x^2 + y^2 + z^2 + b^2 - d^2)^2 - 4*(a*x-c*d)^2 - 4*b^2*y^2
sage: implicit_plot3d(F, (x,-6,6), (y,-6,6), (z,-6,6), color='seashell')
Graphics3d Object
.. PLOT::
x, y, z , a, b, c, d = var('x,y,z,a,b,c,d')
a = 3.5
b = 3
c = sqrt(a**2 - b**2)
d = 2
F = (x**2 + y**2 + z**2 + b**2 - d**2)**2 - 4*(a*x-c*d)**2 - 4*b**2*y**2
G = implicit_plot3d(F, (x,-6,6), (y,-6,6), (z,-6,6), color='seashell')
sphinx_plot(G)
A sine surface::
sage: implicit_plot3d(sin(pi*((x)^2+(y)^2))/2 + z, (x,-1,1), (y,-1,1), (z,-1,1), color='rosybrown')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d(sin(pi*((x)**2+(y)**2))/2 + z, (x,-1,1), (y,-1,1), (z,-1,1), color='rosybrown')
sphinx_plot(G)
A torus::
sage: implicit_plot3d((sqrt(x*x+y*y)-3)^2 + z*z - 1, (x,-4,4), (y,-4,4), (z,-1,1), color='indigo')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d((sqrt(x*x+y*y)-3)**2 + z*z - 1, (x,-4,4), (y,-4,4), (z,-1,1), color='indigo')
sphinx_plot(G)
An octahedron::
sage: implicit_plot3d(abs(x) + abs(y) + abs(z) - 1, (x,-1,1), (y,-1,1), (z,-1,1), color='olive')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d(abs(x) + abs(y) + abs(z) - 1, (x,-1,1), (y,-1,1), (z,-1,1), color='olive')
sphinx_plot(G)
A cube::
sage: implicit_plot3d(x^100 + y^100 + z^100 - 1, (x,-2,2), (y,-2,2), (z,-2,2), color='lightseagreen')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d(x**100 + y**100 + z**100 - 1, (x,-2,2), (y,-2,2), (z,-2,2), color='lightseagreen')
sphinx_plot(G)
A spinning top::
sage: implicit_plot3d((sqrt(x*x+y*y)-3)^3 + z*z - 1, (x,-4,4), (y,-4,4), (z,-6,6), color='mintcream')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d((sqrt(x*x+y*y)-3)**3 + z*z - 1, (x,-4,4), (y,-4,4), (z,-6,6), color='mintcream')
sphinx_plot(G)
A cube with rounded edges::
sage: F = x^4 + y^4 + z^4 - (x^2 + y^2 + z^2)
sage: implicit_plot3d(F, (x,-2,2), (y,-2,2), (z,-2,2), color='mediumvioletred')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
F = x**4 + y**4 + z**4 - (x**2 + y**2 + z**2)
G = implicit_plot3d(F, (x,-2,2), (y,-2,2), (z,-2,2), color='mediumvioletred')
sphinx_plot(G)
Chmutov::
sage: F = x^4 + y^4 + z^4 - (x^2 + y^2 + z^2 - 0.3)
sage: implicit_plot3d(F, (x,-1.5,1.5), (y,-1.5,1.5), (z,-1.5,1.5), color='lightskyblue')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
F = x**4 + y**4 + z**4 - (x**2 + y**2 + z**2 - 0.3)
G = implicit_plot3d(F, (x,-1.5,1.5), (y,-1.5,1.5), (z,-1.5,1.5), color='lightskyblue')
sphinx_plot(G)
Further Chmutov::
sage: F = 2*(x^2*(3-4*x^2)^2+y^2*(3-4*y^2)^2+z^2*(3-4*z^2)^2) - 3
sage: implicit_plot3d(F, (x,-1.3,1.3), (y,-1.3,1.3), (z,-1.3,1.3), color='darksalmon')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
F = 2*(x**2*(3-4*x**2)**2+y**2*(3-4*y**2)**2+z**2*(3-4*z**2)**2) - 3
G = implicit_plot3d(F, (x,-1.3,1.3), (y,-1.3,1.3), (z,-1.3,1.3), color='darksalmon')
sphinx_plot(G)
Clebsch surface::
sage: F_1 = 81 * (x^3+y^3+z^3)
sage: F_2 = 189 * (x^2*(y+z)+y^2*(x+z)+z^2*(x+y))
sage: F_3 = 54 * x * y * z
sage: F_4 = 126 * (x*y+x*z+y*z)
sage: F_5 = 9 * (x^2+y^2+z^2)
sage: F_6 = 9 * (x+y+z)
sage: F = F_1 - F_2 + F_3 + F_4 - F_5 + F_6 + 1
sage: implicit_plot3d(F, (x,-1,1), (y,-1,1), (z,-1,1), color='yellowgreen')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
F_1 = 81 * (x**3+y**3+z**3)
F_2 = 189 * (x**2*(y+z)+y**2*(x+z)+z**2*(x+y))
F_3 = 54 * x * y * z
F_4 = 126 * (x*y+x*z+y*z)
F_5 = 9 * (x**2+y**2+z**2)
F_6 = 9 * (x+y+z)
F = F_1 - F_2 + F_3 + F_4 - F_5 + F_6 + 1
G = implicit_plot3d(F, (x,-1,1), (y,-1,1), (z,-1,1), color='yellowgreen')
sphinx_plot(G)
Looks like a water droplet::
sage: implicit_plot3d(x^2 +y^2 -(1-z)*z^2, (x,-1.5,1.5), (y,-1.5,1.5), (z,-1,1), color='bisque')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d(x**2 +y**2 -(1-z)*z**2, (x,-1.5,1.5), (y,-1.5,1.5), (z,-1,1), color='bisque')
sphinx_plot(G)
Sphere in a cage::
sage: F = (x^8+z^30+y^8-(x^4 + z^50 + y^4 -0.3)) * (x^2+y^2+z^2-0.5)
sage: implicit_plot3d(F, (x,-1.2,1.2), (y,-1.3,1.3), (z,-1.5,1.5), color='firebrick')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
F = (x**8+z**30+y**8-(x**4 + z**50 + y**4 -0.3)) * (x**2+y**2+z**2-0.5)
G = implicit_plot3d(F, (x,-1.2,1.2), (y,-1.3,1.3), (z,-1.5,1.5), color='firebrick')
sphinx_plot(G)
Ortho circle::
sage: F = ((x^2+y^2-1)^2+z^2) * ((y^2+z^2-1)^2+x^2) * ((z^2+x^2-1)^2+y^2)-0.075^2 * (1+3*(x^2+y^2+z^2))
sage: implicit_plot3d(F, (x,-1.5,1.5), (y,-1.5,1.5), (z,-1.5,1.5), color='lemonchiffon')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
F = ((x**2+y**2-1)**2+z**2) * ((y**2+z**2-1)**2+x**2) * ((z**2+x**2-1)**2+y**2)-0.075**2 * (1+3*(x**2+y**2+z**2))
G = implicit_plot3d(F, (x,-1.5,1.5), (y,-1.5,1.5), (z,-1.5,1.5), color='lemonchiffon')
sphinx_plot(G)
Cube sphere::
sage: F = 12 - ((1/2.3)^2 *(x^2 + y^2 + z^2))^-6 - ((1/2)^8 * (x^8 + y^8 + z^8))^6
sage: implicit_plot3d(F, (x,-2,2), (y,-2,2), (z,-2,2), color='rosybrown')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
F = 12 - ((1/2.3)**2 *(x**2 + y**2 + z**2))**-6 - ( (1/2)**8 * (x**8 + y**8 + z**8) )**6
G = implicit_plot3d(F, (x,-2,2), (y,-2,2), (z,-2,2), color='rosybrown')
sphinx_plot(G)
Two cylinders intersect to make a cross::
sage: implicit_plot3d((x^2+y^2-1) * (x^2+z^2-1) - 1, (x,-3,3), (y,-3,3), (z,-3,3), color='burlywood')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d((x**2+y**2-1) * (x**2+z**2-1) - 1, (x,-3,3), (y,-3,3), (z,-3,3), color='burlywood')
sphinx_plot(G)
Three cylinders intersect in a similar fashion::
sage: implicit_plot3d((x^2+y^2-1) * (x^2+z^2-1) * (y^2+z^2-1)-1, (x,-3,3), (y,-3,3), (z,-3,3), color='aqua')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d((x**2+y**2-1) * (x**2+z**2-1) * (y**2+z**2-1)-1, (x,-3,3), (y,-3,3), (z,-3,3), color='aqua')
sphinx_plot(G)
A sphere-ish object with twelve holes, four on each XYZ plane::
sage: implicit_plot3d(3*(cos(x)+cos(y)+cos(z)) + 4*cos(x)*cos(y)*cos(z), (x,-3,3), (y,-3,3), (z,-3,3), color='orangered')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d(3*(cos(x)+cos(y)+cos(z)) + 4*cos(x)*cos(y)*cos(z), (x,-3,3), (y,-3,3), (z,-3,3), color='orangered')
sphinx_plot(G)
A gyroid::
sage: implicit_plot3d(cos(x)*sin(y) + cos(y)*sin(z) + cos(z)*sin(x), (x,-4,4), (y,-4,4), (z,-4,4), color='sandybrown')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d(cos(x)*sin(y) + cos(y)*sin(z) + cos(z)*sin(x), (x,-4,4), (y,-4,4), (z,-4,4), color='sandybrown')
sphinx_plot(G)
Tetrahedra::
sage: implicit_plot3d((x^2+y^2+z^2)^2 + 8*x*y*z - 10*(x^2+y^2+z^2) + 25, (x,-4,4), (y,-4,4), (z,-4,4), color='plum')
Graphics3d Object
.. PLOT::
x, y, z = var('x,y,z')
G = implicit_plot3d((x**2+y**2+z**2)**2 + 8*x*y*z - 10*(x**2+y**2+z**2) + 25, (x,-4,4), (y,-4,4), (z,-4,4), color='plum')
sphinx_plot(G)
TESTS:
Test a separate resolution in the X direction; this should look like a
regular sphere::
sage: implicit_plot3d(x^2 + y^2 + z^2, (x,-2,2), (y,-2,2), (z,-2,2), plot_points=(10,40,40), contour=4)
Graphics3d Object
Test using different plot ranges in the different directions; each
of these should generate half of a sphere. Note that we need to use
the ``aspect_ratio`` keyword to make it look right with the unequal
plot ranges::
sage: implicit_plot3d(x^2 + y^2 + z^2, (x,0,2), (y,-2,2), (z,-2,2), contour=4, aspect_ratio=1)
Graphics3d Object
sage: implicit_plot3d(x^2 + y^2 + z^2, (x,-2,2), (y,0,2), (z,-2,2), contour=4, aspect_ratio=1)
Graphics3d Object
sage: implicit_plot3d(x^2 + y^2 + z^2, (x,-2,2), (y,-2,2), (z,0,2), contour=4, aspect_ratio=1)
Graphics3d Object
Extra keyword arguments will be passed to show()::
sage: implicit_plot3d(x^2 + y^2 + z^2, (x,-2,2), (y,-2,2), (z,-2,2), contour=4, viewer='tachyon')
Graphics3d Object
An implicit plot that does not include any surface in the view volume
produces an empty plot::
sage: implicit_plot3d(x^2 + y^2 + z^2 - 5000, (x,-2,2), (y,-2,2), (z,-2,2), plot_points=6)
Graphics3d Object
Make sure that implicit_plot3d does not error if the function cannot
be symbolically differentiated::
sage: implicit_plot3d(max_symbolic(x, y^2) - z, (x,-2,2), (y,-2,2), (z,-2,2), plot_points=6)
Graphics3d Object
Check for :trac:`10599`::
sage: var('x,y,z')
(x, y, z)
sage: M = matrix(3,[1,-1,-1,-1,3,1,-1,1,3])
sage: v = 1/M.eigenvalues()[1]
sage: implicit_plot3d(x^2+y^2+z^2==v, [x,-3,3], [y,-3,3],[z,-3,3])
Graphics3d Object
"""
# These options, related to rendering with smooth shading, are irrelevant
# since IndexFaceSet does not support surface normals:
# smooth: (default: False) Whether to use vertex normals to produce a
# smooth-looking surface. False is slightly faster.
# gradient: (default: None) If smooth is True, then Tachyon rendering
# needs vertex normals. In that case, if gradient is None (the default),
# then we try to differentiate the function to get the gradient. If that
# fails, then we use central differencing on the scalar field. But it's
# also possible to specify the gradient; this must be either a single
# Python callable that takes (x,y,z) and returns a tuple (dx,dy,dz), or a
# tuple of three callables that each take (x,y,z) and return dx, dy, dz
# respectively.
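# Illustrative note (an assumption based on the description above, not part of
# the original comments): for f = x^2 + y^2 + z^2 the single-callable form of
# ``gradient`` could be written as
#     gradient=lambda x, y, z: (2*x, 2*y, 2*z)
# which matches the tuple-of-three-callables example in the EXAMPLES section.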
G = ImplicitSurface(f, xrange, yrange, zrange, **kwds)
G._set_extra_kwds(kwds)
return G
|
py | b413e88706bf7b847807dcc9cb4141437766e34b | # -*- coding: utf-8 -*-
from __future__ import absolute_import, print_function, unicode_literals
from .baseapi import BaseAPI
class Account(BaseAPI):
def __init__(self, *args, **kwargs):
self.balance = None
self.pending_charges = None
self.last_payment_date = None
self.last_payment_amount = None
super(Account, self).__init__(*args, **kwargs)
@classmethod
def get_object(cls, api_token):
"""
Class method that will return an Account object.
"""
acct = cls(token=api_token)
acct.load()
return acct
def load(self):
"""
Documentation: https://www.vultr.com/api/#account_info
"""
account = self.get_data("account/info")
for attr in account.keys():
setattr(self, attr, account[attr])
def __str__(self):
return "<%s>" % self.__class__.__name__
|
py | b413e8eb3345007c15b9eb1d27864b04f463e603 | # Written by Jagannath Bilgi <[email protected]>
from bs4 import BeautifulSoup
from bs4.element import Comment
"""
Below code is used for extracting readable text from site. Code is referenced from web
"""
def tag_visible(element):
if element.parent.name in ['style', 'script', 'head', 'title', 'meta', '[document]']:
return False
if isinstance(element, Comment):
return False
return True
def text_from_html(body):
soup = BeautifulSoup(body, 'html.parser')
texts = soup.findAll(text=True)
visible_texts = filter(tag_visible, texts)
return u" ".join(t.strip() for t in visible_texts)
|
py | b413ea1d9b7ef4c395b9a3da1fe28eba3f88acf6 | from collections import defaultdict
import numpy as np
import random
from typing import Any, Callable, Dict, List, Optional, TYPE_CHECKING
from ray.rllib.env.base_env import _DUMMY_AGENT_ID
from ray.rllib.policy.policy_map import PolicyMap
from ray.rllib.utils.annotations import Deprecated, DeveloperAPI
from ray.rllib.utils.deprecation import deprecation_warning
from ray.rllib.utils.spaces.space_utils import flatten_to_single_ndarray
from ray.rllib.utils.typing import SampleBatchType, AgentID, PolicyID, \
EnvActionType, EnvID, EnvInfoDict, EnvObsType
from ray.util import log_once
if TYPE_CHECKING:
from ray.rllib.evaluation.rollout_worker import RolloutWorker
from ray.rllib.evaluation.sample_batch_builder import \
MultiAgentSampleBatchBuilder
@DeveloperAPI
class Episode:
"""Tracks the current state of a (possibly multi-agent) episode.
Attributes:
new_batch_builder (func): Create a new MultiAgentSampleBatchBuilder.
add_extra_batch (func): Return a built MultiAgentBatch to the sampler.
batch_builder (obj): Batch builder for the current episode.
total_reward (float): Summed reward across all agents in this episode.
length (int): Length of this episode.
episode_id (int): Unique id identifying this trajectory.
agent_rewards (dict): Summed rewards broken down by agent.
custom_metrics (dict): Dict where you can add custom metrics.
user_data (dict): Dict that you can use for temporary storage. E.g.
in between two custom callbacks referring to the same episode.
hist_data (dict): Dict mapping str keys to List[float] for storage of
per-timestep float data throughout the episode.
Use case 1: Model-based rollouts in multi-agent:
A custom compute_actions() function in a policy can inspect the
current episode state and perform a number of rollouts based on the
policies and state of other agents in the environment.
Use case 2: Returning extra rollouts data.
The model rollouts can be returned back to the sampler by calling:
>>> batch = episode.new_batch_builder()
>>> for each transition:
batch.add_values(...) # see sampler for usage
>>> episode.extra_batches.add(batch.build_and_reset())
"""
def __init__(
self,
policies: PolicyMap,
policy_mapping_fn: Callable[[AgentID, "Episode", "RolloutWorker"],
PolicyID],
batch_builder_factory: Callable[[],
"MultiAgentSampleBatchBuilder"],
extra_batch_callback: Callable[[SampleBatchType], None],
env_id: EnvID,
*,
worker: Optional["RolloutWorker"] = None,
):
"""Initializes an Episode instance.
Args:
policies: The PolicyMap object (mapping PolicyIDs to Policy
objects) to use for determining, which policy is used for
which agent.
policy_mapping_fn: The mapping function mapping AgentIDs to
PolicyIDs.
batch_builder_factory:
extra_batch_callback:
env_id: The environment's ID in which this episode runs.
worker: The RolloutWorker instance, in which this episode runs.
"""
self.new_batch_builder: Callable[
[], "MultiAgentSampleBatchBuilder"] = batch_builder_factory
self.add_extra_batch: Callable[[SampleBatchType],
None] = extra_batch_callback
self.batch_builder: "MultiAgentSampleBatchBuilder" = \
batch_builder_factory()
self.total_reward: float = 0.0
self.length: int = 0
self.episode_id: int = random.randrange(2e9)
self.env_id = env_id
self.worker = worker
self.agent_rewards: Dict[AgentID, float] = defaultdict(float)
self.custom_metrics: Dict[str, float] = {}
self.user_data: Dict[str, Any] = {}
self.hist_data: Dict[str, List[float]] = {}
self.media: Dict[str, Any] = {}
self.policy_map: PolicyMap = policies
self._policies = self.policy_map # backward compatibility
self.policy_mapping_fn: Callable[[AgentID, "Episode", "RolloutWorker"],
PolicyID] = policy_mapping_fn
self._next_agent_index: int = 0
self._agent_to_index: Dict[AgentID, int] = {}
self._agent_to_policy: Dict[AgentID, PolicyID] = {}
self._agent_to_rnn_state: Dict[AgentID, List[Any]] = {}
self._agent_to_last_obs: Dict[AgentID, EnvObsType] = {}
self._agent_to_last_raw_obs: Dict[AgentID, EnvObsType] = {}
self._agent_to_last_done: Dict[AgentID, bool] = {}
self._agent_to_last_info: Dict[AgentID, EnvInfoDict] = {}
self._agent_to_last_action: Dict[AgentID, EnvActionType] = {}
self._agent_to_last_extra_action_outs: Dict[AgentID, dict] = {}
self._agent_to_prev_action: Dict[AgentID, EnvActionType] = {}
self._agent_reward_history: Dict[AgentID, List[int]] = defaultdict(
list)
@DeveloperAPI
def soft_reset(self) -> None:
"""Clears rewards and metrics, but retains RNN and other state.
This is used to carry state across multiple logical episodes in the
same env (i.e., if `soft_horizon` is set).
"""
self.length = 0
self.episode_id = random.randrange(2e9)
self.total_reward = 0.0
self.agent_rewards = defaultdict(float)
self._agent_reward_history = defaultdict(list)
@DeveloperAPI
def policy_for(self, agent_id: AgentID = _DUMMY_AGENT_ID) -> PolicyID:
"""Returns and stores the policy ID for the specified agent.
If the agent is new, the policy mapping fn will be called to bind the
agent to a policy for the duration of the entire episode (even if the
policy_mapping_fn is changed in the meantime!).
Args:
agent_id: The agent ID to lookup the policy ID for.
Returns:
The policy ID for the specified agent.
"""
# Perform a new policy_mapping_fn lookup and bind AgentID for the
# duration of this episode to the returned PolicyID.
if agent_id not in self._agent_to_policy:
# Try new API: pass in agent_id and episode as named args.
# New signature should be: (agent_id, episode, worker, **kwargs)
try:
policy_id = self._agent_to_policy[agent_id] = \
self.policy_mapping_fn(agent_id, self, worker=self.worker)
except TypeError as e:
if "positional argument" in e.args[0] or \
"unexpected keyword argument" in e.args[0]:
if log_once("policy_mapping_new_signature"):
deprecation_warning(
old="policy_mapping_fn(agent_id)",
new="policy_mapping_fn(agent_id, episode, "
"worker, **kwargs)")
policy_id = self._agent_to_policy[agent_id] = \
self.policy_mapping_fn(agent_id)
else:
raise e
# Use already determined PolicyID.
else:
policy_id = self._agent_to_policy[agent_id]
# PolicyID not found in policy map -> Error.
if policy_id not in self.policy_map:
raise KeyError("policy_mapping_fn returned invalid policy id "
f"'{policy_id}'!")
return policy_id
@DeveloperAPI
def last_observation_for(
self, agent_id: AgentID = _DUMMY_AGENT_ID) -> Optional[EnvObsType]:
"""Returns the last observation for the specified AgentID.
Args:
agent_id: The agent's ID to get the last observation for.
Returns:
Last observation the specified AgentID has seen. None in case
the agent has never made any observations in the episode.
"""
return self._agent_to_last_obs.get(agent_id)
@DeveloperAPI
def last_raw_obs_for(
self, agent_id: AgentID = _DUMMY_AGENT_ID) -> Optional[EnvObsType]:
"""Returns the last un-preprocessed obs for the specified AgentID.
Args:
agent_id: The agent's ID to get the last un-preprocessed
observation for.
Returns:
Last un-preprocessed observation the specified AgentID has seen.
None in case the agent has never made any observations in the
episode.
"""
return self._agent_to_last_raw_obs.get(agent_id)
@DeveloperAPI
def last_info_for(self, agent_id: AgentID = _DUMMY_AGENT_ID
) -> Optional[EnvInfoDict]:
"""Returns the last info for the specified AgentID.
Args:
agent_id: The agent's ID to get the last info for.
Returns:
Last info dict the specified AgentID has seen.
None in case the agent has never made any observations in the
episode.
"""
return self._agent_to_last_info.get(agent_id)
@DeveloperAPI
def last_action_for(self,
agent_id: AgentID = _DUMMY_AGENT_ID) -> EnvActionType:
"""Returns the last action for the specified AgentID, or zeros.
The "last" action is the most recent one taken by the agent.
Args:
agent_id: The agent's ID to get the last action for.
Returns:
Last action the specified AgentID has executed.
Zeros in case the agent has never performed any actions in the
episode.
"""
# Agent has already taken at least one action in the episode.
if agent_id in self._agent_to_last_action:
return flatten_to_single_ndarray(
self._agent_to_last_action[agent_id])
# Agent has not acted yet, return all zeros.
else:
policy_id = self.policy_for(agent_id)
policy = self.policy_map[policy_id]
flat = flatten_to_single_ndarray(policy.action_space.sample())
if hasattr(policy.action_space, "dtype"):
return np.zeros_like(flat, dtype=policy.action_space.dtype)
return np.zeros_like(flat)
@DeveloperAPI
def prev_action_for(self,
agent_id: AgentID = _DUMMY_AGENT_ID) -> EnvActionType:
"""Returns the previous action for the specified agent, or zeros.
The "previous" action is the one taken one timestep before the
most recent action taken by the agent.
Args:
agent_id: The agent's ID to get the previous action for.
Returns:
Previous action the specified AgentID has executed.
Zero in case the agent has never performed any actions (or only
one) in the episode.
"""
# We are at t > 1 -> There has been a previous action by this agent.
if agent_id in self._agent_to_prev_action:
return flatten_to_single_ndarray(
self._agent_to_prev_action[agent_id])
# We're at t <= 1, so return all zeros.
else:
return np.zeros_like(self.last_action_for(agent_id))
@DeveloperAPI
def last_reward_for(self, agent_id: AgentID = _DUMMY_AGENT_ID) -> float:
"""Returns the last reward for the specified agent, or zero.
The "last" reward is the one received most recently by the agent.
Args:
agent_id: The agent's ID to get the last reward for.
Returns:
Last reward for the specified AgentID.
Zero in case the agent has never performed any actions
(and thus received rewards) in the episode.
"""
history = self._agent_reward_history[agent_id]
# We are at t > 0 -> Return previously received reward.
if len(history) >= 1:
return history[-1]
# We're at t=0, so there is no previous reward, just return zero.
else:
return 0.0
@DeveloperAPI
def prev_reward_for(self, agent_id: AgentID = _DUMMY_AGENT_ID) -> float:
"""Returns the previous reward for the specified agent, or zero.
The "previous" reward is the one received one timestep before the
most recently received reward of the agent.
Args:
agent_id: The agent's ID to get the previous reward for.
Returns:
Previous reward for the specified AgentID.
Zero in case the agent has never performed any actions (or only
one) in the episode.
"""
history = self._agent_reward_history[agent_id]
# We are at t > 1 -> Return reward prior to most recent (last) one.
if len(history) >= 2:
return history[-2]
# We're at t <= 1, so there is no previous reward, just return zero.
else:
return 0.0
@DeveloperAPI
def rnn_state_for(self, agent_id: AgentID = _DUMMY_AGENT_ID) -> List[Any]:
"""Returns the last RNN state for the specified agent.
Args:
agent_id: The agent's ID to get the most recent RNN state for.
Returns:
Most recent RNN state of the specified AgentID.
"""
if agent_id not in self._agent_to_rnn_state:
policy_id = self.policy_for(agent_id)
policy = self.policy_map[policy_id]
self._agent_to_rnn_state[agent_id] = policy.get_initial_state()
return self._agent_to_rnn_state[agent_id]
@DeveloperAPI
def last_done_for(self, agent_id: AgentID = _DUMMY_AGENT_ID) -> bool:
"""Returns the last done flag for the specified AgentID.
Args:
agent_id: The agent's ID to get the last done flag for.
Returns:
Last done flag for the specified AgentID.
"""
if agent_id not in self._agent_to_last_done:
self._agent_to_last_done[agent_id] = False
return self._agent_to_last_done[agent_id]
@DeveloperAPI
def last_extra_action_outs_for(
self,
agent_id: AgentID = _DUMMY_AGENT_ID,
) -> dict:
"""Returns the last extra-action outputs for the specified agent.
This data is returned by a call to
`Policy.compute_actions_from_input_dict` as the 3rd return value
(1st return value = action; 2nd return value = RNN state outs).
Args:
agent_id: The agent's ID to get the last extra-action outs for.
Returns:
The last extra-action outs for the specified AgentID.
"""
return self._agent_to_last_extra_action_outs[agent_id]
@DeveloperAPI
def get_agents(self) -> List[AgentID]:
"""Returns list of agent IDs that have appeared in this episode.
Returns:
The list of all agent IDs that have appeared so far in this
episode.
"""
return list(self._agent_to_index.keys())
def _add_agent_rewards(self, reward_dict: Dict[AgentID, float]) -> None:
for agent_id, reward in reward_dict.items():
if reward is not None:
self.agent_rewards[agent_id,
self.policy_for(agent_id)] += reward
self.total_reward += reward
self._agent_reward_history[agent_id].append(reward)
def _set_rnn_state(self, agent_id, rnn_state):
self._agent_to_rnn_state[agent_id] = rnn_state
def _set_last_observation(self, agent_id, obs):
self._agent_to_last_obs[agent_id] = obs
def _set_last_raw_obs(self, agent_id, obs):
self._agent_to_last_raw_obs[agent_id] = obs
def _set_last_done(self, agent_id, done):
self._agent_to_last_done[agent_id] = done
def _set_last_info(self, agent_id, info):
self._agent_to_last_info[agent_id] = info
def _set_last_action(self, agent_id, action):
if agent_id in self._agent_to_last_action:
self._agent_to_prev_action[agent_id] = \
self._agent_to_last_action[agent_id]
self._agent_to_last_action[agent_id] = action
def _set_last_extra_action_outs(self, agent_id, pi_info):
self._agent_to_last_extra_action_outs[agent_id] = pi_info
def _agent_index(self, agent_id):
if agent_id not in self._agent_to_index:
self._agent_to_index[agent_id] = self._next_agent_index
self._next_agent_index += 1
return self._agent_to_index[agent_id]
@property
def _policy_mapping_fn(self):
deprecation_warning(
old="Episode._policy_mapping_fn",
new="Episode.policy_mapping_fn",
error=False,
)
return self.policy_mapping_fn
@Deprecated(new="Episode.last_extra_action_outs_for", error=False)
def last_pi_info_for(self, *args, **kwargs):
return self.last_extra_action_outs_for(*args, **kwargs)
# Backward compatibility. The name MultiAgentEpisode implies that there is
# also a separate (single-agent?) Episode, which is not the case.
@Deprecated(new="ray.rllib.evaluation.episode.Episode", error=False)
class MultiAgentEpisode(Episode):
pass
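# Illustrative sketch (not part of the RLlib API): code that receives a live
# Episode, e.g. inside a custom callback, can read the per-agent state that
# the episode tracks. The helper name below is an assumption for the example.
def summarize_agent(episode: Episode,
                    agent_id: AgentID = _DUMMY_AGENT_ID) -> Dict[str, Any]:
    """Collect the most recent per-agent values recorded by the episode."""
    return {
        "policy_id": episode.policy_for(agent_id),
        "last_obs": episode.last_observation_for(agent_id),
        "last_action": episode.last_action_for(agent_id),
        "last_reward": episode.last_reward_for(agent_id),
        "done": episode.last_done_for(agent_id),
    }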
|
py | b413ebd1241bed3f58f1b55b7c55a0a686392afd | # Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# pylint: disable=C,R,W
import logging
import re
from collections import OrderedDict
from datetime import datetime, timedelta
from typing import Any, Dict, Hashable, List, NamedTuple, Optional, Tuple, Union
import pandas as pd
import sqlalchemy as sa
import sqlparse
from flask import escape, Markup
from flask_appbuilder import Model
from flask_babel import lazy_gettext as _
from sqlalchemy import (
and_,
asc,
Boolean,
Column,
DateTime,
desc,
ForeignKey,
Integer,
or_,
select,
String,
Table,
Text,
)
from sqlalchemy.exc import CompileError
from sqlalchemy.orm import backref, Query, relationship, RelationshipProperty, Session
from sqlalchemy.orm.exc import NoResultFound
from sqlalchemy.schema import UniqueConstraint
from sqlalchemy.sql import column, ColumnElement, literal_column, table, text
from sqlalchemy.sql.expression import Label, Select, TextAsFrom
from superset import app, db, is_feature_enabled, security_manager
from superset.connectors.base.models import BaseColumn, BaseDatasource, BaseMetric
from superset.constants import NULL_STRING
from superset.db_engine_specs.base import TimestampExpression
from superset.exceptions import DatabaseNotFound
from superset.jinja_context import (
BaseTemplateProcessor,
ExtraCache,
get_template_processor,
)
from superset.models.annotations import Annotation
from superset.models.core import Database
from superset.models.helpers import AuditMixinNullable, QueryResult
from superset.typing import Metric, QueryObjectDict
from superset.utils import core as utils, import_datasource
config = app.config
metadata = Model.metadata # pylint: disable=no-member
logger = logging.getLogger(__name__)
class SqlaQuery(NamedTuple):
extra_cache_keys: List[Any]
labels_expected: List[str]
prequeries: List[str]
sqla_query: Select
class QueryStringExtended(NamedTuple):
labels_expected: List[str]
prequeries: List[str]
sql: str
class AnnotationDatasource(BaseDatasource):
""" Dummy object so we can query annotations using 'Viz' objects just like
regular datasources.
"""
cache_timeout = 0
changed_on = None
type = "annotation"
def query(self, query_obj: QueryObjectDict) -> QueryResult:
error_message = None
qry = db.session.query(Annotation)
qry = qry.filter(Annotation.layer_id == query_obj["filter"][0]["val"])
if query_obj["from_dttm"]:
qry = qry.filter(Annotation.start_dttm >= query_obj["from_dttm"])
if query_obj["to_dttm"]:
qry = qry.filter(Annotation.end_dttm <= query_obj["to_dttm"])
status = utils.QueryStatus.SUCCESS
try:
df = pd.read_sql_query(qry.statement, db.engine)
except Exception as ex:
df = pd.DataFrame()
status = utils.QueryStatus.FAILED
logger.exception(ex)
error_message = utils.error_msg_from_exception(ex)
return QueryResult(
status=status,
df=df,
duration=timedelta(0),
query="",
error_message=error_message,
)
def get_query_str(self, query_obj: QueryObjectDict) -> str:
raise NotImplementedError()
def values_for_column(self, column_name: str, limit: int = 10000) -> List[Any]:
raise NotImplementedError()
class TableColumn(Model, BaseColumn):
"""ORM object for table columns, each table can have multiple columns"""
__tablename__ = "table_columns"
__table_args__ = (UniqueConstraint("table_id", "column_name"),)
table_id = Column(Integer, ForeignKey("tables.id"))
table = relationship(
"SqlaTable",
backref=backref("columns", cascade="all, delete-orphan"),
foreign_keys=[table_id],
)
is_dttm = Column(Boolean, default=False)
expression = Column(Text)
python_date_format = Column(String(255))
export_fields = [
"table_id",
"column_name",
"verbose_name",
"is_dttm",
"is_active",
"type",
"groupby",
"filterable",
"expression",
"description",
"python_date_format",
]
update_from_object_fields = [s for s in export_fields if s not in ("table_id",)]
export_parent = "table"
@property
def is_numeric(self) -> bool:
db_engine_spec = self.table.database.db_engine_spec
return db_engine_spec.is_db_column_type_match(
self.type, utils.DbColumnType.NUMERIC
)
@property
def is_string(self) -> bool:
db_engine_spec = self.table.database.db_engine_spec
return db_engine_spec.is_db_column_type_match(
self.type, utils.DbColumnType.STRING
)
@property
def is_temporal(self) -> bool:
db_engine_spec = self.table.database.db_engine_spec
return db_engine_spec.is_db_column_type_match(
self.type, utils.DbColumnType.TEMPORAL
)
def get_sqla_col(self, label: Optional[str] = None) -> Column:
label = label or self.column_name
if self.expression:
col = literal_column(self.expression)
else:
db_engine_spec = self.table.database.db_engine_spec
type_ = db_engine_spec.get_sqla_column_type(self.type)
col = column(self.column_name, type_=type_)
col = self.table.make_sqla_column_compatible(col, label)
return col
@property
def datasource(self) -> RelationshipProperty:
return self.table
def get_time_filter(
self,
start_dttm: DateTime,
end_dttm: DateTime,
time_range_endpoints: Optional[
Tuple[utils.TimeRangeEndpoint, utils.TimeRangeEndpoint]
],
) -> ColumnElement:
col = self.get_sqla_col(label="__time")
l = []
if start_dttm:
l.append(
col >= text(self.dttm_sql_literal(start_dttm, time_range_endpoints))
)
if end_dttm:
if (
time_range_endpoints
and time_range_endpoints[1] == utils.TimeRangeEndpoint.EXCLUSIVE
):
l.append(
col < text(self.dttm_sql_literal(end_dttm, time_range_endpoints))
)
else:
l.append(col <= text(self.dttm_sql_literal(end_dttm, None)))
return and_(*l)
def get_timestamp_expression(
self, time_grain: Optional[str]
) -> Union[TimestampExpression, Label]:
"""
Return a SQLAlchemy Core element representation of self to be used in a query.
:param time_grain: Optional time grain, e.g. P1Y
:return: A TimeExpression object wrapped in a Label if supported by db
"""
label = utils.DTTM_ALIAS
db = self.table.database
pdf = self.python_date_format
is_epoch = pdf in ("epoch_s", "epoch_ms")
if not self.expression and not time_grain and not is_epoch:
sqla_col = column(self.column_name, type_=DateTime)
return self.table.make_sqla_column_compatible(sqla_col, label)
if self.expression:
col = literal_column(self.expression)
else:
col = column(self.column_name)
time_expr = db.db_engine_spec.get_timestamp_expr(
col, pdf, time_grain, self.type
)
return self.table.make_sqla_column_compatible(time_expr, label)
@classmethod
def import_obj(cls, i_column: "TableColumn") -> "TableColumn":
def lookup_obj(lookup_column: TableColumn) -> TableColumn:
return (
db.session.query(TableColumn)
.filter(
TableColumn.table_id == lookup_column.table_id,
TableColumn.column_name == lookup_column.column_name,
)
.first()
)
return import_datasource.import_simple_obj(db.session, i_column, lookup_obj)
def dttm_sql_literal(
self,
dttm: DateTime,
time_range_endpoints: Optional[
Tuple[utils.TimeRangeEndpoint, utils.TimeRangeEndpoint]
],
) -> str:
"""Convert datetime object to a SQL expression string"""
sql = (
self.table.database.db_engine_spec.convert_dttm(self.type, dttm)
if self.type
else None
)
if sql:
return sql
tf = self.python_date_format
# Fallback to the default format (if defined) only if the SIP-15 time range
# endpoints, i.e., [start, end) are enabled.
if not tf and time_range_endpoints == (
utils.TimeRangeEndpoint.INCLUSIVE,
utils.TimeRangeEndpoint.EXCLUSIVE,
):
tf = (
self.table.database.get_extra()
.get("python_date_format_by_column_name", {})
.get(self.column_name)
)
if tf:
if tf in ["epoch_ms", "epoch_s"]:
seconds_since_epoch = int(dttm.timestamp())
if tf == "epoch_s":
return str(seconds_since_epoch)
return str(seconds_since_epoch * 1000)
return f"'{dttm.strftime(tf)}'"
# TODO(john-bodley): SIP-15 will explicitly require a type conversion.
return f"""'{dttm.strftime("%Y-%m-%d %H:%M:%S.%f")}'"""
@property
def data(self) -> Dict[str, Any]:
attrs = (
"id",
"column_name",
"verbose_name",
"description",
"expression",
"filterable",
"groupby",
"is_dttm",
"type",
"python_date_format",
)
return {s: getattr(self, s) for s in attrs if hasattr(self, s)}
class SqlMetric(Model, BaseMetric):
"""ORM object for metrics, each table can have multiple metrics"""
__tablename__ = "sql_metrics"
__table_args__ = (UniqueConstraint("table_id", "metric_name"),)
table_id = Column(Integer, ForeignKey("tables.id"))
table = relationship(
"SqlaTable",
backref=backref("metrics", cascade="all, delete-orphan"),
foreign_keys=[table_id],
)
expression = Column(Text, nullable=False)
export_fields = [
"metric_name",
"verbose_name",
"metric_type",
"table_id",
"expression",
"description",
"d3format",
"warning_text",
]
update_from_object_fields = list(
[s for s in export_fields if s not in ("table_id",)]
)
export_parent = "table"
def get_sqla_col(self, label: Optional[str] = None) -> Column:
label = label or self.metric_name
sqla_col = literal_column(self.expression)
return self.table.make_sqla_column_compatible(sqla_col, label)
@property
def perm(self) -> Optional[str]:
return (
("{parent_name}.[{obj.metric_name}](id:{obj.id})").format(
obj=self, parent_name=self.table.full_name
)
if self.table
else None
)
def get_perm(self) -> Optional[str]:
return self.perm
@classmethod
def import_obj(cls, i_metric: "SqlMetric") -> "SqlMetric":
def lookup_obj(lookup_metric: SqlMetric) -> SqlMetric:
return (
db.session.query(SqlMetric)
.filter(
SqlMetric.table_id == lookup_metric.table_id,
SqlMetric.metric_name == lookup_metric.metric_name,
)
.first()
)
return import_datasource.import_simple_obj(db.session, i_metric, lookup_obj)
sqlatable_user = Table(
"sqlatable_user",
metadata,
Column("id", Integer, primary_key=True),
Column("user_id", Integer, ForeignKey("ab_user.id")),
Column("table_id", Integer, ForeignKey("tables.id")),
)
class SqlaTable(Model, BaseDatasource):
"""An ORM object for SqlAlchemy table references"""
type = "table"
query_language = "sql"
metric_class = SqlMetric
column_class = TableColumn
owner_class = security_manager.user_model
__tablename__ = "tables"
__table_args__ = (UniqueConstraint("database_id", "table_name"),)
table_name = Column(String(250), nullable=False)
main_dttm_col = Column(String(250))
database_id = Column(Integer, ForeignKey("dbs.id"), nullable=False)
fetch_values_predicate = Column(String(1000))
owners = relationship(owner_class, secondary=sqlatable_user, backref="tables")
database = relationship(
"Database",
backref=backref("tables", cascade="all, delete-orphan"),
foreign_keys=[database_id],
)
schema = Column(String(255))
sql = Column(Text)
is_sqllab_view = Column(Boolean, default=False)
template_params = Column(Text)
baselink = "tablemodelview"
export_fields = [
"table_name",
"main_dttm_col",
"description",
"default_endpoint",
"database_id",
"offset",
"cache_timeout",
"schema",
"sql",
"params",
"template_params",
"filter_select_enabled",
"fetch_values_predicate",
]
update_from_object_fields = [
f for f in export_fields if f not in ("table_name", "database_id")
]
export_parent = "database"
export_children = ["metrics", "columns"]
sqla_aggregations = {
"COUNT_DISTINCT": lambda column_name: sa.func.COUNT(sa.distinct(column_name)),
"COUNT": sa.func.COUNT,
"SUM": sa.func.SUM,
"AVG": sa.func.AVG,
"MIN": sa.func.MIN,
"MAX": sa.func.MAX,
}
def make_sqla_column_compatible(
self, sqla_col: Column, label: Optional[str] = None
) -> Column:
"""Takes a sqlalchemy column object and adds label info if supported by engine.
:param sqla_col: sqlalchemy column instance
:param label: alias/label that column is expected to have
:return: either a sql alchemy column or label instance if supported by engine
"""
label_expected = label or sqla_col.name
db_engine_spec = self.database.db_engine_spec
if db_engine_spec.allows_column_aliases:
label = db_engine_spec.make_label_compatible(label_expected)
sqla_col = sqla_col.label(label)
sqla_col._df_label_expected = label_expected
return sqla_col
def __repr__(self) -> str:
return self.name
@property
def changed_by_name(self) -> str:
if not self.changed_by:
return ""
return str(self.changed_by)
@property
def changed_by_url(self) -> str:
if not self.changed_by:
return ""
return f"/superset/profile/{self.changed_by.username}"
@property
def connection(self) -> str:
return str(self.database)
@property
def description_markeddown(self) -> str:
return utils.markdown(self.description)
@property
def datasource_name(self) -> str:
return self.table_name
@property
def database_name(self) -> str:
return self.database.name
@classmethod
def get_datasource_by_name(
cls,
session: Session,
datasource_name: str,
schema: Optional[str],
database_name: str,
) -> Optional["SqlaTable"]:
schema = schema or None
query = (
session.query(cls)
.join(Database)
.filter(cls.table_name == datasource_name)
.filter(Database.database_name == database_name)
)
# Handling schema being '' or None, which is easier to handle
# in python than in the SQLA query in a multi-dialect way
for tbl in query.all():
if schema == (tbl.schema or None):
return tbl
return None
@property
def link(self) -> Markup:
name = escape(self.name)
anchor = f'<a target="_blank" href="{self.explore_url}">{name}</a>'
return Markup(anchor)
def get_schema_perm(self) -> Optional[str]:
"""Returns schema permission if present, database one otherwise."""
return security_manager.get_schema_perm(self.database, self.schema)
def get_perm(self) -> str:
return ("[{obj.database}].[{obj.table_name}]" "(id:{obj.id})").format(obj=self)
@property
def name(self) -> str: # type: ignore
if not self.schema:
return self.table_name
return "{}.{}".format(self.schema, self.table_name)
@property
def full_name(self) -> str:
return utils.get_datasource_full_name(
self.database, self.table_name, schema=self.schema
)
@property
def dttm_cols(self) -> List[str]:
l = [c.column_name for c in self.columns if c.is_dttm]
if self.main_dttm_col and self.main_dttm_col not in l:
l.append(self.main_dttm_col)
return l
@property
def num_cols(self) -> List[str]:
return [c.column_name for c in self.columns if c.is_numeric]
@property
def any_dttm_col(self) -> Optional[str]:
cols = self.dttm_cols
return cols[0] if cols else None
@property
def html(self) -> str:
t = ((c.column_name, c.type) for c in self.columns)
df = pd.DataFrame(t)
df.columns = ["field", "type"]
return df.to_html(
index=False,
classes=("dataframe table table-striped table-bordered " "table-condensed"),
)
@property
def sql_url(self) -> str:
return self.database.sql_url + "?table_name=" + str(self.table_name)
def external_metadata(self) -> List[Dict[str, str]]:
cols = self.database.get_columns(self.table_name, schema=self.schema)
for col in cols:
try:
col["type"] = str(col["type"])
except CompileError:
col["type"] = "UNKNOWN"
return cols
@property
def time_column_grains(self) -> Dict[str, Any]:
return {
"time_columns": self.dttm_cols,
"time_grains": [grain.name for grain in self.database.grains()],
}
@property
def select_star(self) -> Optional[str]:
# show_cols and latest_partition set to false to avoid
# the expensive cost of inspecting the DB
return self.database.select_star(
self.table_name, schema=self.schema, show_cols=False, latest_partition=False
)
@property
def data(self) -> Dict:
d = super().data
if self.type == "table":
grains = self.database.grains() or []
if grains:
grains = [(g.duration, g.name) for g in grains]
d["granularity_sqla"] = utils.choicify(self.dttm_cols)
d["time_grain_sqla"] = grains
d["main_dttm_col"] = self.main_dttm_col
d["fetch_values_predicate"] = self.fetch_values_predicate
d["template_params"] = self.template_params
d["is_sqllab_view"] = self.is_sqllab_view
return d
def values_for_column(self, column_name: str, limit: int = 10000) -> List[Any]:
"""Runs query against sqla to retrieve some
sample values for the given column.
"""
cols = {col.column_name: col for col in self.columns}
target_col = cols[column_name]
tp = self.get_template_processor()
qry = (
select([target_col.get_sqla_col()])
.select_from(self.get_from_clause(tp))
.distinct()
)
if limit:
qry = qry.limit(limit)
if self.fetch_values_predicate:
tp = self.get_template_processor()
qry = qry.where(text(tp.process_template(self.fetch_values_predicate)))
engine = self.database.get_sqla_engine()
sql = "{}".format(qry.compile(engine, compile_kwargs={"literal_binds": True}))
sql = self.mutate_query_from_config(sql)
df = pd.read_sql_query(sql=sql, con=engine)
return df[column_name].to_list()
def mutate_query_from_config(self, sql: str) -> str:
"""Apply config's SQL_QUERY_MUTATOR
Typically adds comments to the query with context"""
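# Hedged example (an assumption, not from the Superset docs): a mutator such as
#     SQL_QUERY_MUTATOR = lambda sql, username, security_manager, database: (
#         "-- run by {}\n{}".format(username, sql))
# would prepend a comment naming the user, matching the call signature below.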
SQL_QUERY_MUTATOR = config["SQL_QUERY_MUTATOR"]
if SQL_QUERY_MUTATOR:
username = utils.get_username()
sql = SQL_QUERY_MUTATOR(sql, username, security_manager, self.database)
return sql
def get_template_processor(self, **kwargs: Any) -> BaseTemplateProcessor:
return get_template_processor(table=self, database=self.database, **kwargs)
def get_query_str_extended(self, query_obj: QueryObjectDict) -> QueryStringExtended:
sqlaq = self.get_sqla_query(**query_obj)
sql = self.database.compile_sqla_query(sqlaq.sqla_query)
logger.info(sql)
sql = sqlparse.format(sql, reindent=True)
sql = self.mutate_query_from_config(sql)
return QueryStringExtended(
labels_expected=sqlaq.labels_expected, sql=sql, prequeries=sqlaq.prequeries
)
def get_query_str(self, query_obj: QueryObjectDict) -> str:
query_str_ext = self.get_query_str_extended(query_obj)
all_queries = query_str_ext.prequeries + [query_str_ext.sql]
return ";\n\n".join(all_queries) + ";"
def get_sqla_table(self) -> table:
tbl = table(self.table_name)
if self.schema:
tbl.schema = self.schema
return tbl
def get_from_clause(
self, template_processor: Optional[BaseTemplateProcessor] = None
) -> Union[table, TextAsFrom]:
# Supporting arbitrary SQL statements in place of tables
if self.sql:
from_sql = self.sql
if template_processor:
from_sql = template_processor.process_template(from_sql)
from_sql = sqlparse.format(from_sql, strip_comments=True)
return TextAsFrom(sa.text(from_sql), []).alias("expr_qry")
return self.get_sqla_table()
def adhoc_metric_to_sqla(self, metric: Dict, cols: Dict) -> Optional[Column]:
"""
Turn an adhoc metric into a sqlalchemy column.
:param dict metric: Adhoc metric definition
:param dict cols: Columns for the current table
:returns: The metric defined as a sqlalchemy column
:rtype: sqlalchemy.sql.column
"""
expression_type = metric.get("expressionType")
label = utils.get_metric_name(metric)
if expression_type == utils.ADHOC_METRIC_EXPRESSION_TYPES["SIMPLE"]:
column_name = metric["column"].get("column_name")
table_column = cols.get(column_name)
if table_column:
sqla_column = table_column.get_sqla_col()
else:
sqla_column = column(column_name)
sqla_metric = self.sqla_aggregations[metric["aggregate"]](sqla_column)
elif expression_type == utils.ADHOC_METRIC_EXPRESSION_TYPES["SQL"]:
sqla_metric = literal_column(metric.get("sqlExpression"))
else:
return None
return self.make_sqla_column_compatible(sqla_metric, label)
def _get_sqla_row_level_filters(
self, template_processor: BaseTemplateProcessor
) -> List[str]:
"""
Return the appropriate row level security filters for this table and the current user.
:param BaseTemplateProcessor template_processor: The template processor to apply to the filters.
:returns: A list of SQL clauses to be ANDed together.
:rtype: List[str]
"""
return [
text("({})".format(template_processor.process_template(f.clause)))
for f in security_manager.get_rls_filters(self)
]
def get_sqla_query( # sqla
self,
metrics: List[Metric],
granularity: str,
from_dttm: Optional[datetime],
to_dttm: Optional[datetime],
columns: Optional[List[str]] = None,
groupby: Optional[List[str]] = None,
filter: Optional[List[Dict[str, Any]]] = None,
is_timeseries: bool = True,
timeseries_limit: int = 15,
timeseries_limit_metric: Optional[Metric] = None,
row_limit: Optional[int] = None,
inner_from_dttm: Optional[datetime] = None,
inner_to_dttm: Optional[datetime] = None,
orderby: Optional[List[Tuple[ColumnElement, bool]]] = None,
extras: Optional[Dict[str, Any]] = None,
order_desc: bool = True,
) -> SqlaQuery:
"""Querying any sqla table from this common interface"""
template_kwargs = {
"from_dttm": from_dttm,
"groupby": groupby,
"metrics": metrics,
"row_limit": row_limit,
"to_dttm": to_dttm,
"filter": filter,
"columns": {col.column_name: col for col in self.columns},
}
is_sip_38 = is_feature_enabled("SIP_38_VIZ_REARCHITECTURE")
template_kwargs.update(self.template_params_dict)
extra_cache_keys: List[Any] = []
template_kwargs["extra_cache_keys"] = extra_cache_keys
template_processor = self.get_template_processor(**template_kwargs)
db_engine_spec = self.database.db_engine_spec
prequeries: List[str] = []
orderby = orderby or []
# For backward compatibility
if granularity not in self.dttm_cols:
granularity = self.main_dttm_col
# Database spec supports join-free timeslot grouping
time_groupby_inline = db_engine_spec.time_groupby_inline
cols: Dict[str, Column] = {col.column_name: col for col in self.columns}
metrics_dict: Dict[str, SqlMetric] = {m.metric_name: m for m in self.metrics}
if not granularity and is_timeseries:
raise Exception(
_(
"Datetime column not provided as part table configuration "
"and is required by this type of chart"
)
)
if (
not metrics
and not columns
and (is_sip_38 or (not is_sip_38 and not groupby))
):
raise Exception(_("Empty query?"))
metrics_exprs: List[ColumnElement] = []
for m in metrics:
if utils.is_adhoc_metric(m):
assert isinstance(m, dict)
metrics_exprs.append(self.adhoc_metric_to_sqla(m, cols))
elif isinstance(m, str) and m in metrics_dict:
metrics_exprs.append(metrics_dict[m].get_sqla_col())
else:
raise Exception(_("Metric '%(metric)s' does not exist", metric=m))
if metrics_exprs:
main_metric_expr = metrics_exprs[0]
else:
main_metric_expr, label = literal_column("COUNT(*)"), "ccount"
main_metric_expr = self.make_sqla_column_compatible(main_metric_expr, label)
select_exprs: List[Column] = []
groupby_exprs_sans_timestamp: OrderedDict = OrderedDict()
if (is_sip_38 and metrics and columns) or (not is_sip_38 and groupby):
# dedup columns while preserving order
columns_ = columns if is_sip_38 else groupby
assert columns_
groupby = list(dict.fromkeys(columns_))
select_exprs = []
for s in groupby:
if s in cols:
outer = cols[s].get_sqla_col()
else:
outer = literal_column(f"({s})")
outer = self.make_sqla_column_compatible(outer, s)
groupby_exprs_sans_timestamp[outer.name] = outer
select_exprs.append(outer)
elif columns:
for s in columns:
select_exprs.append(
cols[s].get_sqla_col()
if s in cols
else self.make_sqla_column_compatible(literal_column(s))
)
metrics_exprs = []
assert extras is not None
time_range_endpoints = extras.get("time_range_endpoints")
groupby_exprs_with_timestamp = OrderedDict(groupby_exprs_sans_timestamp.items())
if granularity:
dttm_col = cols[granularity]
time_grain = extras.get("time_grain_sqla")
time_filters = []
if is_timeseries:
timestamp = dttm_col.get_timestamp_expression(time_grain)
select_exprs += [timestamp]
groupby_exprs_with_timestamp[timestamp.name] = timestamp
# Use main dttm column to support index with secondary dttm columns.
if (
db_engine_spec.time_secondary_columns
and self.main_dttm_col in self.dttm_cols
and self.main_dttm_col != dttm_col.column_name
):
time_filters.append(
cols[self.main_dttm_col].get_time_filter(
from_dttm, to_dttm, time_range_endpoints
)
)
time_filters.append(
dttm_col.get_time_filter(from_dttm, to_dttm, time_range_endpoints)
)
select_exprs += metrics_exprs
labels_expected = [c._df_label_expected for c in select_exprs]
select_exprs = db_engine_spec.make_select_compatible(
groupby_exprs_with_timestamp.values(), select_exprs
)
qry = sa.select(select_exprs)
tbl = self.get_from_clause(template_processor)
if (is_sip_38 and metrics) or (not is_sip_38 and not columns):
qry = qry.group_by(*groupby_exprs_with_timestamp.values())
where_clause_and = []
having_clause_and: List = []
for flt in filter: # type: ignore
if not all([flt.get(s) for s in ["col", "op"]]):
continue
col = flt["col"]
op = flt["op"].upper()
col_obj = cols.get(col)
if col_obj:
is_list_target = op in (
utils.FilterOperator.IN.value,
utils.FilterOperator.NOT_IN.value,
)
eq = self.filter_values_handler(
values=flt.get("val"),
target_column_is_numeric=col_obj.is_numeric,
is_list_target=is_list_target,
)
if op in (
utils.FilterOperator.IN.value,
utils.FilterOperator.NOT_IN.value,
):
cond = col_obj.get_sqla_col().in_(eq)
if isinstance(eq, str) and NULL_STRING in eq:
                        cond = or_(cond, col_obj.get_sqla_col() == None)  # emits IS NULL
if op == utils.FilterOperator.NOT_IN.value:
cond = ~cond
where_clause_and.append(cond)
else:
if col_obj.is_numeric:
eq = utils.cast_to_num(flt["val"])
if op == utils.FilterOperator.EQUALS.value:
where_clause_and.append(col_obj.get_sqla_col() == eq)
elif op == utils.FilterOperator.NOT_EQUALS.value:
where_clause_and.append(col_obj.get_sqla_col() != eq)
elif op == utils.FilterOperator.GREATER_THAN.value:
where_clause_and.append(col_obj.get_sqla_col() > eq)
elif op == utils.FilterOperator.LESS_THAN.value:
where_clause_and.append(col_obj.get_sqla_col() < eq)
elif op == utils.FilterOperator.GREATER_THAN_OR_EQUALS.value:
where_clause_and.append(col_obj.get_sqla_col() >= eq)
elif op == utils.FilterOperator.LESS_THAN_OR_EQUALS.value:
where_clause_and.append(col_obj.get_sqla_col() <= eq)
elif op == utils.FilterOperator.LIKE.value:
where_clause_and.append(col_obj.get_sqla_col().like(eq))
elif op == utils.FilterOperator.IS_NULL.value:
where_clause_and.append(col_obj.get_sqla_col() == None)
elif op == utils.FilterOperator.IS_NOT_NULL.value:
where_clause_and.append(col_obj.get_sqla_col() != None)
else:
raise Exception(
_("Invalid filter operation type: %(op)s", op=op)
)
if config["ENABLE_ROW_LEVEL_SECURITY"]:
where_clause_and += self._get_sqla_row_level_filters(template_processor)
if extras:
where = extras.get("where")
if where:
where = template_processor.process_template(where)
where_clause_and += [sa.text("({})".format(where))]
having = extras.get("having")
if having:
having = template_processor.process_template(having)
having_clause_and += [sa.text("({})".format(having))]
if granularity:
qry = qry.where(and_(*(time_filters + where_clause_and)))
else:
qry = qry.where(and_(*where_clause_and))
qry = qry.having(and_(*having_clause_and))
if not orderby and ((is_sip_38 and metrics) or (not is_sip_38 and not columns)):
orderby = [(main_metric_expr, not order_desc)]
# To ensure correct handling of the ORDER BY labeling we need to reference the
# metric instance if defined in the SELECT clause.
metrics_exprs_by_label = {m._label: m for m in metrics_exprs}
for col, ascending in orderby:
direction = asc if ascending else desc
if utils.is_adhoc_metric(col):
col = self.adhoc_metric_to_sqla(col, cols)
elif col in cols:
col = cols[col].get_sqla_col()
if isinstance(col, Label) and col._label in metrics_exprs_by_label:
col = metrics_exprs_by_label[col._label]
qry = qry.order_by(direction(col))
if row_limit:
qry = qry.limit(row_limit)
if (
is_timeseries
and timeseries_limit
and not time_groupby_inline
and ((is_sip_38 and columns) or (not is_sip_38 and groupby))
):
if self.database.db_engine_spec.allows_joins:
                # some sql dialects require order by expressions
# to also be in the select clause -- others, e.g. vertica,
# require a unique inner alias
inner_main_metric_expr = self.make_sqla_column_compatible(
main_metric_expr, "mme_inner__"
)
inner_groupby_exprs = []
inner_select_exprs = []
for gby_name, gby_obj in groupby_exprs_sans_timestamp.items():
inner = self.make_sqla_column_compatible(gby_obj, gby_name + "__")
inner_groupby_exprs.append(inner)
inner_select_exprs.append(inner)
inner_select_exprs += [inner_main_metric_expr]
subq = select(inner_select_exprs).select_from(tbl)
inner_time_filter = dttm_col.get_time_filter(
inner_from_dttm or from_dttm,
inner_to_dttm or to_dttm,
time_range_endpoints,
)
subq = subq.where(and_(*(where_clause_and + [inner_time_filter])))
subq = subq.group_by(*inner_groupby_exprs)
ob = inner_main_metric_expr
if timeseries_limit_metric:
ob = self._get_timeseries_orderby(
timeseries_limit_metric, metrics_dict, cols
)
direction = desc if order_desc else asc
subq = subq.order_by(direction(ob))
subq = subq.limit(timeseries_limit)
on_clause = []
for gby_name, gby_obj in groupby_exprs_sans_timestamp.items():
# in this case the column name, not the alias, needs to be
# conditionally mutated, as it refers to the column alias in
# the inner query
col_name = db_engine_spec.make_label_compatible(gby_name + "__")
on_clause.append(gby_obj == column(col_name))
tbl = tbl.join(subq.alias(), and_(*on_clause))
else:
if timeseries_limit_metric:
orderby = [
(
self._get_timeseries_orderby(
timeseries_limit_metric, metrics_dict, cols
),
False,
)
]
# run prequery to get top groups
prequery_obj = {
"is_timeseries": False,
"row_limit": timeseries_limit,
"metrics": metrics,
"granularity": granularity,
"from_dttm": inner_from_dttm or from_dttm,
"to_dttm": inner_to_dttm or to_dttm,
"filter": filter,
"orderby": orderby,
"extras": extras,
"columns": columns,
"order_desc": True,
}
if not is_sip_38:
prequery_obj["groupby"] = groupby
result = self.query(prequery_obj)
prequeries.append(result.query)
dimensions = [
c
for c in result.df.columns
if c not in metrics and c in groupby_exprs_sans_timestamp
]
top_groups = self._get_top_groups(
result.df, dimensions, groupby_exprs_sans_timestamp
)
qry = qry.where(top_groups)
return SqlaQuery(
extra_cache_keys=extra_cache_keys,
labels_expected=labels_expected,
sqla_query=qry.select_from(tbl),
prequeries=prequeries,
)
def _get_timeseries_orderby(
self,
timeseries_limit_metric: Metric,
metrics_dict: Dict[str, SqlMetric],
cols: Dict[str, Column],
) -> Optional[Column]:
if utils.is_adhoc_metric(timeseries_limit_metric):
assert isinstance(timeseries_limit_metric, dict)
ob = self.adhoc_metric_to_sqla(timeseries_limit_metric, cols)
elif (
isinstance(timeseries_limit_metric, str)
and timeseries_limit_metric in metrics_dict
):
ob = metrics_dict[timeseries_limit_metric].get_sqla_col()
else:
raise Exception(
_("Metric '%(metric)s' does not exist", metric=timeseries_limit_metric)
)
return ob
def _get_top_groups(
self, df: pd.DataFrame, dimensions: List, groupby_exprs: OrderedDict
) -> ColumnElement:
groups = []
for unused, row in df.iterrows():
group = []
for dimension in dimensions:
group.append(groupby_exprs[dimension] == row[dimension])
groups.append(and_(*group))
return or_(*groups)
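    # Illustrative sketch (the prequery rows are hypothetical, not taken from this
    # module): with dimensions ["gender", "name"] and two rows in df, the returned
    # clause is equivalent to
    #   (gender = 'girl' AND name = 'Amy') OR (gender = 'boy' AND name = 'Bob'),
    # which the outer query in get_sqla_query uses to keep only the top groups.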
def query(self, query_obj: QueryObjectDict) -> QueryResult:
qry_start_dttm = datetime.now()
query_str_ext = self.get_query_str_extended(query_obj)
sql = query_str_ext.sql
status = utils.QueryStatus.SUCCESS
errors = None
def mutator(df: pd.DataFrame) -> None:
"""
Some engines change the case or generate bespoke column names, either by
default or due to lack of support for aliasing. This function ensures that
the column names in the DataFrame correspond to what is expected by
the viz components.
:param df: Original DataFrame returned by the engine
"""
labels_expected = query_str_ext.labels_expected
if df is not None and not df.empty:
if len(df.columns) != len(labels_expected):
raise Exception(
f"For {sql}, df.columns: {df.columns}"
f" differs from {labels_expected}"
)
else:
df.columns = labels_expected
try:
df = self.database.get_df(sql, self.schema, mutator)
except Exception as ex:
df = pd.DataFrame()
status = utils.QueryStatus.FAILED
logger.warning(f"Query {sql} on schema {self.schema} failed", exc_info=True)
db_engine_spec = self.database.db_engine_spec
errors = db_engine_spec.extract_errors(ex)
return QueryResult(
status=status,
df=df,
duration=datetime.now() - qry_start_dttm,
query=sql,
errors=errors,
)
def get_sqla_table_object(self) -> Table:
return self.database.get_table(self.table_name, schema=self.schema)
def fetch_metadata(self, commit: bool = True) -> None:
"""Fetches the metadata for the table and merges it in"""
try:
table = self.get_sqla_table_object()
except Exception as ex:
logger.exception(ex)
raise Exception(
_(
"Table [{}] doesn't seem to exist in the specified database, "
"couldn't fetch column information"
).format(self.table_name)
)
metrics = []
any_date_col = None
db_engine_spec = self.database.db_engine_spec
db_dialect = self.database.get_dialect()
dbcols = (
db.session.query(TableColumn)
.filter(TableColumn.table == self)
.filter(or_(TableColumn.column_name == col.name for col in table.columns))
)
dbcols = {dbcol.column_name: dbcol for dbcol in dbcols}
for col in table.columns:
try:
datatype = db_engine_spec.column_datatype_to_string(
col.type, db_dialect
)
except Exception as ex:
datatype = "UNKNOWN"
logger.error("Unrecognized data type in {}.{}".format(table, col.name))
logger.exception(ex)
dbcol = dbcols.get(col.name, None)
if not dbcol:
dbcol = TableColumn(column_name=col.name, type=datatype, table=self)
dbcol.sum = dbcol.is_numeric
dbcol.avg = dbcol.is_numeric
dbcol.is_dttm = dbcol.is_temporal
db_engine_spec.alter_new_orm_column(dbcol)
else:
dbcol.type = datatype
dbcol.groupby = True
dbcol.filterable = True
self.columns.append(dbcol)
if not any_date_col and dbcol.is_temporal:
any_date_col = col.name
metrics.append(
SqlMetric(
metric_name="count",
verbose_name="COUNT(*)",
metric_type="count",
expression="COUNT(*)",
)
)
if not self.main_dttm_col:
self.main_dttm_col = any_date_col
self.add_missing_metrics(metrics)
db.session.merge(self)
if commit:
db.session.commit()
@classmethod
def import_obj(
cls, i_datasource: "SqlaTable", import_time: Optional[int] = None
) -> int:
"""Imports the datasource from the object to the database.
        Metrics, columns and the datasource will be overridden if they exist.
        This function can be used to import/export dashboards between multiple
        superset instances. Audit metadata isn't copied over.
"""
def lookup_sqlatable(table: "SqlaTable") -> "SqlaTable":
return (
db.session.query(SqlaTable)
.join(Database)
.filter(
SqlaTable.table_name == table.table_name,
SqlaTable.schema == table.schema,
Database.id == table.database_id,
)
.first()
)
def lookup_database(table: SqlaTable) -> Database:
try:
return (
db.session.query(Database)
.filter_by(database_name=table.params_dict["database_name"])
.one()
)
except NoResultFound:
raise DatabaseNotFound(
_(
"Database '%(name)s' is not found",
name=table.params_dict["database_name"],
)
)
return import_datasource.import_datasource(
db.session, i_datasource, lookup_database, lookup_sqlatable, import_time
)
@classmethod
def query_datasources_by_name(
cls,
session: Session,
database: Database,
datasource_name: str,
schema: Optional[str] = None,
) -> List["SqlaTable"]:
query = (
session.query(cls)
.filter_by(database_id=database.id)
.filter_by(table_name=datasource_name)
)
if schema:
query = query.filter_by(schema=schema)
return query.all()
@staticmethod
def default_query(qry: Query) -> Query:
return qry.filter_by(is_sqllab_view=False)
def has_extra_cache_key_calls(self, query_obj: QueryObjectDict) -> bool:
"""
Detects the presence of calls to `ExtraCache` methods in items in query_obj that
can be templated. If any are present, the query must be evaluated to extract
additional keys for the cache key. This method is needed to avoid executing the
template code unnecessarily, as it may contain expensive calls, e.g. to extract
the latest partition of a database.
:param query_obj: query object to analyze
:return: True if there are call(s) to an `ExtraCache` method, False otherwise
"""
templatable_statements: List[str] = []
if self.sql:
templatable_statements.append(self.sql)
if self.fetch_values_predicate:
templatable_statements.append(self.fetch_values_predicate)
extras = query_obj.get("extras", {})
if "where" in extras:
templatable_statements.append(extras["where"])
if "having" in extras:
templatable_statements.append(extras["having"])
for statement in templatable_statements:
if ExtraCache.regex.search(statement):
return True
return False
def get_extra_cache_keys(self, query_obj: QueryObjectDict) -> List[Hashable]:
"""
The cache key of a SqlaTable needs to consider any keys added by the parent
class and any keys added via `ExtraCache`.
:param query_obj: query object to analyze
:return: The extra cache keys
"""
extra_cache_keys = super().get_extra_cache_keys(query_obj)
if self.has_extra_cache_key_calls(query_obj):
sqla_query = self.get_sqla_query(**query_obj)
extra_cache_keys += sqla_query.extra_cache_keys
return extra_cache_keys
sa.event.listen(SqlaTable, "after_insert", security_manager.set_perm)
sa.event.listen(SqlaTable, "after_update", security_manager.set_perm)
RLSFilterRoles = Table(
"rls_filter_roles",
metadata,
Column("id", Integer, primary_key=True),
Column("role_id", Integer, ForeignKey("ab_role.id"), nullable=False),
Column("rls_filter_id", Integer, ForeignKey("row_level_security_filters.id")),
)
class RowLevelSecurityFilter(Model, AuditMixinNullable):
"""
Custom where clauses attached to Tables and Roles.
"""
__tablename__ = "row_level_security_filters"
id = Column(Integer, primary_key=True)
roles = relationship(
security_manager.role_model,
secondary=RLSFilterRoles,
backref="row_level_security_filters",
)
table_id = Column(Integer, ForeignKey("tables.id"), nullable=False)
table = relationship(SqlaTable, backref="row_level_security_filters")
clause = Column(Text, nullable=False)
|
py | b413ebda815b528aeb6591b02cb72e54831b4262 | #!/usr/bin/env python
#
# Copyright 2016 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""This example gets all rate cards that can be used for Marketplace products.
"""
# Import appropriate modules from the client library.
from googleads import dfp
def main(client):
# Initialize appropriate service.
rate_card_service = client.GetService('RateCardService', version='v201802')
# Create a statement to select rate cards.
statement = (dfp.StatementBuilder()
.Where('forMarketplace = :forMarketplace')
.WithBindVariable('forMarketplace', True))
  # Retrieve a small number of rate cards at a time, paging
# through until all rate cards have been retrieved.
while True:
response = rate_card_service.getRateCardsByStatement(statement.ToStatement(
))
if 'results' in response:
for rate_card in response['results']:
# Print out some information for each rate card.
print(
'Rate card with ID "%d", name "%s", and currency code "%s" was '
'found.\n'
% (rate_card['id'], rate_card['name'], rate_card['currencyCode']))
statement.offset += statement.limit
else:
break
  print('\nNumber of results found: %s' % response['totalResultSetSize'])
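# Illustrative sketch (the page size is an assumption about StatementBuilder's
# default, not stated in this example): the builder above yields a PQL statement
# roughly equivalent to
#   WHERE forMarketplace = :forMarketplace LIMIT 500 OFFSET 0
# and each loop iteration advances OFFSET by the limit until no 'results' remain.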
if __name__ == '__main__':
# Initialize client object.
dfp_client = dfp.DfpClient.LoadFromStorage()
main(dfp_client)
|
py | b413ec882a4858ef88d66aaddf1949742fd86469 | # --------------------------------------------------------
# Tensorflow Faster R-CNN
# Licensed under The MIT License [see LICENSE for details]
# Written by Jiasen Lu, Jianwei Yang, based on code from Ross Girshick
# --------------------------------------------------------
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import _init_paths
import os
import sys
import numpy as np
import argparse
import pprint
import pdb
import time
import cv2
import torch
from torch.autograd import Variable
import torch.nn as nn
import torch.optim as optim
import torchvision.transforms as transforms
import torchvision.datasets as dset
from scipy.misc import imread
from roi_data_layer.roidb import combined_roidb
from roi_data_layer.roibatchLoader import roibatchLoader
from model.utils.config import cfg, cfg_from_file, cfg_from_list, get_output_dir
from model.rpn.bbox_transform import clip_boxes
from model.nms.nms_wrapper import nms
from model.rpn.bbox_transform import bbox_transform_inv
from model.utils.net_utils import save_net, load_net, vis_detections
from model.utils.blob import im_list_to_blob
from model.faster_rcnn.vgg16 import vgg16
from model.faster_rcnn.resnet import resnet
import pdb
try:
xrange # Python 2
except NameError:
xrange = range # Python 3
def parse_args():
"""
Parse input arguments
"""
parser = argparse.ArgumentParser(description='Train a Fast R-CNN network')
parser.add_argument('--dataset', dest='dataset',
help='training dataset',
default='pascal_voc', type=str)
parser.add_argument('--cfg', dest='cfg_file',
help='optional config file',
default='cfgs/vgg16.yml', type=str)
parser.add_argument('--net', dest='net',
help='vgg16, res50, res101, res152',
default='res101', type=str)
parser.add_argument('--set', dest='set_cfgs',
help='set config keys', default=None,
nargs=argparse.REMAINDER)
parser.add_argument('--load_dir', dest='load_dir',
help='directory to load models',
default="/srv/share/jyang375/models")
parser.add_argument('--image_dir', dest='image_dir',
help='directory to load images for demo',
default="images/img1")
parser.add_argument('--cuda', dest='cuda',
help='whether use CUDA',
action='store_true')
parser.add_argument('--mGPUs', dest='mGPUs',
help='whether use multiple GPUs',
action='store_true')
parser.add_argument('--cag', dest='class_agnostic',
help='whether perform class_agnostic bbox regression',
action='store_true')
parser.add_argument('--parallel_type', dest='parallel_type',
help='which part of model to parallel, 0: all, 1: model before roi pooling',
default=0, type=int)
parser.add_argument('--checksession', dest='checksession',
help='checksession to load model',
default=1, type=int)
parser.add_argument('--checkepoch', dest='checkepoch',
help='checkepoch to load network',
default=1, type=int)
parser.add_argument('--checkpoint', dest='checkpoint',
help='checkpoint to load network',
default=10021, type=int)
parser.add_argument('--bs', dest='batch_size',
help='batch_size',
default=1, type=int)
parser.add_argument('--vis', dest='vis',
help='visualization mode',
action='store_true')
parser.add_argument('--webcam_num', dest='webcam_num',
help='webcam ID number',
default=-1, type=int)
args = parser.parse_args()
return args
lr = cfg.TRAIN.LEARNING_RATE
momentum = cfg.TRAIN.MOMENTUM
weight_decay = cfg.TRAIN.WEIGHT_DECAY
def _get_image_blob(im):
"""Converts an image into a network input.
Arguments:
im (ndarray): a color image in BGR order
Returns:
blob (ndarray): a data blob holding an image pyramid
im_scale_factors (list): list of image scales (relative to im) used
in the image pyramid
"""
im_orig = im.astype(np.float32, copy=True)
im_orig -= cfg.PIXEL_MEANS
im_shape = im_orig.shape
im_size_min = np.min(im_shape[0:2])
im_size_max = np.max(im_shape[0:2])
processed_ims = []
im_scale_factors = []
for target_size in cfg.TEST.SCALES:
im_scale = float(target_size) / float(im_size_min)
# Prevent the biggest axis from being more than MAX_SIZE
if np.round(im_scale * im_size_max) > cfg.TEST.MAX_SIZE:
im_scale = float(cfg.TEST.MAX_SIZE) / float(im_size_max)
im = cv2.resize(im_orig, None, None, fx=im_scale, fy=im_scale,
interpolation=cv2.INTER_LINEAR)
im_scale_factors.append(im_scale)
processed_ims.append(im)
# Create a blob to hold the input images
blob = im_list_to_blob(processed_ims)
return blob, np.array(im_scale_factors)
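# Worked example of the scaling above (the numbers are illustrative, not taken
# from the config): with cfg.TEST.SCALES = (600,) and cfg.TEST.MAX_SIZE = 1000, a
# 375x500 image gives im_scale = 600 / 375 = 1.6 and 1.6 * 500 = 800 <= 1000, so
# the blob holds a single 600x800 image; a 400x1200 image would instead be capped
# at im_scale = 1000 / 1200 to respect MAX_SIZE.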
if __name__ == '__main__':
args = parse_args()
print('Called with args:')
print(args)
if args.cfg_file is not None:
cfg_from_file(args.cfg_file)
if args.set_cfgs is not None:
cfg_from_list(args.set_cfgs)
cfg.USE_GPU_NMS = args.cuda
print('Using config:')
pprint.pprint(cfg)
np.random.seed(cfg.RNG_SEED)
# train set
  # -- Note: Use validation set and disable flipping to enable faster loading.
input_dir = args.load_dir + "/" + args.net + "/" + args.dataset
if not os.path.exists(input_dir):
raise Exception('There is no input directory for loading network from ' + input_dir)
load_name = os.path.join(input_dir,
'faster_rcnn_{}_{}_{}.pth'.format(args.checksession, args.checkepoch, args.checkpoint))
pascal_classes = np.asarray(['__background__',
'aeroplane', 'bicycle', 'bird', 'boat',
'bottle', 'bus', 'car', 'cat', 'chair',
'cow', 'diningtable', 'dog', 'horse',
'motorbike', 'person', 'pottedplant',
'sheep', 'sofa', 'train', 'tvmonitor'])
  # initialize the network here.
if args.net == 'vgg16':
fasterRCNN = vgg16(pascal_classes, pretrained=False, class_agnostic=args.class_agnostic)
elif args.net == 'res101':
fasterRCNN = resnet(pascal_classes, 101, pretrained=False, class_agnostic=args.class_agnostic)
elif args.net == 'res50':
fasterRCNN = resnet(pascal_classes, 50, pretrained=False, class_agnostic=args.class_agnostic)
elif args.net == 'res152':
fasterRCNN = resnet(pascal_classes, 152, pretrained=False, class_agnostic=args.class_agnostic)
else:
print("network is not defined")
pdb.set_trace()
fasterRCNN.create_architecture()
print("load checkpoint %s" % (load_name))
if args.cuda > 0:
checkpoint = torch.load(load_name)
else:
checkpoint = torch.load(load_name, map_location=(lambda storage, loc: storage))
fasterRCNN.load_state_dict(checkpoint['model'])
if 'pooling_mode' in checkpoint.keys():
cfg.POOLING_MODE = checkpoint['pooling_mode']
print('load model successfully!')
# pdb.set_trace()
print("load checkpoint %s" % (load_name))
  # initialize the tensor holder here.
im_data = torch.FloatTensor(1)
im_info = torch.FloatTensor(1)
num_boxes = torch.LongTensor(1)
gt_boxes = torch.FloatTensor(1)
# ship to cuda
if args.cuda > 0:
im_data = im_data.cuda()
im_info = im_info.cuda()
num_boxes = num_boxes.cuda()
gt_boxes = gt_boxes.cuda()
# make variable
im_data = Variable(im_data, volatile=True)
im_info = Variable(im_info, volatile=True)
num_boxes = Variable(num_boxes, volatile=True)
gt_boxes = Variable(gt_boxes, volatile=True)
if args.cuda > 0:
cfg.CUDA = True
if args.cuda > 0:
fasterRCNN.cuda()
fasterRCNN.eval()
start = time.time()
max_per_image = 100
thresh = 0.05
vis = True
webcam_num = args.webcam_num
# Set up webcam or get image directories
if webcam_num >= 0 :
cap = cv2.VideoCapture(webcam_num)
num_images = 0
else:
imglist = os.listdir(args.image_dir)
num_images = len(imglist)
print('Loaded Photo: {} images.'.format(num_images))
while (num_images >= 0):
total_tic = time.time()
if webcam_num == -1:
num_images -= 1
# Get image from the webcam
if webcam_num >= 0:
if not cap.isOpened():
raise RuntimeError("Webcam could not open. Please check connection.")
ret, frame = cap.read()
im_in = np.array(frame)
# Load the demo image
else:
im_file = os.path.join(args.image_dir, imglist[num_images])
# im = cv2.imread(im_file)
im_in = np.array(imread(im_file))
if len(im_in.shape) == 2:
im_in = im_in[:,:,np.newaxis]
im_in = np.concatenate((im_in,im_in,im_in), axis=2)
# rgb -> bgr
im = im_in[:,:,::-1]
blobs, im_scales = _get_image_blob(im)
assert len(im_scales) == 1, "Only single-image batch implemented"
im_blob = blobs
im_info_np = np.array([[im_blob.shape[1], im_blob.shape[2], im_scales[0]]], dtype=np.float32)
im_data_pt = torch.from_numpy(im_blob)
im_data_pt = im_data_pt.permute(0, 3, 1, 2)
im_info_pt = torch.from_numpy(im_info_np)
im_data.data.resize_(im_data_pt.size()).copy_(im_data_pt)
im_info.data.resize_(im_info_pt.size()).copy_(im_info_pt)
gt_boxes.data.resize_(1, 1, 5).zero_()
num_boxes.data.resize_(1).zero_()
# pdb.set_trace()
det_tic = time.time()
rois, cls_prob, bbox_pred, \
rpn_loss_cls, rpn_loss_box, \
RCNN_loss_cls, RCNN_loss_bbox, \
rois_label = fasterRCNN(im_data, im_info, gt_boxes, num_boxes)
scores = cls_prob.data
boxes = rois.data[:, :, 1:5]
if cfg.TEST.BBOX_REG:
# Apply bounding-box regression deltas
box_deltas = bbox_pred.data
if cfg.TRAIN.BBOX_NORMALIZE_TARGETS_PRECOMPUTED:
# Optionally normalize targets by a precomputed mean and stdev
if args.class_agnostic:
if args.cuda > 0:
box_deltas = box_deltas.view(-1, 4) * torch.FloatTensor(cfg.TRAIN.BBOX_NORMALIZE_STDS).cuda() \
+ torch.FloatTensor(cfg.TRAIN.BBOX_NORMALIZE_MEANS).cuda()
else:
box_deltas = box_deltas.view(-1, 4) * torch.FloatTensor(cfg.TRAIN.BBOX_NORMALIZE_STDS) \
+ torch.FloatTensor(cfg.TRAIN.BBOX_NORMALIZE_MEANS)
box_deltas = box_deltas.view(1, -1, 4)
else:
if args.cuda > 0:
box_deltas = box_deltas.view(-1, 4) * torch.FloatTensor(cfg.TRAIN.BBOX_NORMALIZE_STDS).cuda() \
+ torch.FloatTensor(cfg.TRAIN.BBOX_NORMALIZE_MEANS).cuda()
else:
box_deltas = box_deltas.view(-1, 4) * torch.FloatTensor(cfg.TRAIN.BBOX_NORMALIZE_STDS) \
+ torch.FloatTensor(cfg.TRAIN.BBOX_NORMALIZE_MEANS)
box_deltas = box_deltas.view(1, -1, 4 * len(pascal_classes))
pred_boxes = bbox_transform_inv(boxes, box_deltas, 1)
pred_boxes = clip_boxes(pred_boxes, im_info.data, 1)
else:
# Simply repeat the boxes, once for each class
pred_boxes = np.tile(boxes, (1, scores.shape[1]))
pred_boxes /= im_scales[0]
scores = scores.squeeze()
pred_boxes = pred_boxes.squeeze()
det_toc = time.time()
detect_time = det_toc - det_tic
misc_tic = time.time()
if vis:
im2show = np.copy(im)
for j in xrange(1, len(pascal_classes)):
inds = torch.nonzero(scores[:,j]>thresh).view(-1)
# if there is det
if inds.numel() > 0:
cls_scores = scores[:,j][inds]
_, order = torch.sort(cls_scores, 0, True)
if args.class_agnostic:
cls_boxes = pred_boxes[inds, :]
else:
cls_boxes = pred_boxes[inds][:, j * 4:(j + 1) * 4]
cls_dets = torch.cat((cls_boxes, cls_scores.unsqueeze(1)), 1)
# cls_dets = torch.cat((cls_boxes, cls_scores), 1)
cls_dets = cls_dets[order]
keep = nms(cls_dets, cfg.TEST.NMS, force_cpu=not cfg.USE_GPU_NMS)
cls_dets = cls_dets[keep.view(-1).long()]
if vis:
im2show = vis_detections(im2show, pascal_classes[j], cls_dets.cpu().numpy(), 0.5)
misc_toc = time.time()
nms_time = misc_toc - misc_tic
if webcam_num == -1:
sys.stdout.write('im_detect: {:d}/{:d} {:.3f}s {:.3f}s \r' \
.format(num_images + 1, len(imglist), detect_time, nms_time))
sys.stdout.flush()
if vis and webcam_num == -1:
# cv2.imshow('test', im2show)
# cv2.waitKey(0)
result_path = os.path.join(args.image_dir, imglist[num_images][:-4] + "_det.jpg")
cv2.imwrite(result_path, im2show)
else:
im2showRGB = cv2.cvtColor(im2show, cv2.COLOR_BGR2RGB)
cv2.imshow("frame", im2showRGB)
total_toc = time.time()
total_time = total_toc - total_tic
frame_rate = 1 / total_time
print('Frame rate:', frame_rate)
if cv2.waitKey(1) & 0xFF == ord('q'):
break
if webcam_num >= 0:
cap.release()
cv2.destroyAllWindows()
|
py | b413ee047465100d9f06cd7dba23c5a344ac608d | #!/usr/bin/env python3
import os
import subprocess
import argparse
import shutil
"""
Class to wrap terraform commands for shared account infrastructure
examples
### Provision infrastructure
aws-vault exec bichard7-sandbox -- scripts/shared_account_terraform.py sandbox infra
### destroy infrastructure
aws-vault exec bichard7-sandbox -- scripts/shared_account_terraform.py sandbox infra destroy
"""
class TerraformRunner(object):
_account_name = None
_user_type = None
_state_buckets = {
"sandbox": "cjse-bichard7-default-sharedaccount-sandbox-bootstrap-tfstate",
"pathtolive": "cjse-bichard7-default-pathtolive-bootstrap-tfstate"
}
_state_files = {
"infra": "tfstatefile",
"infra_ci": "ci/tfstatefile",
"infra_ci_sonarqube": "sonarqube/tfstatefile",
"users": "users/tfstatefile"
}
_lock_table_region = "eu-west-2"
_args = None
def _set_state_file(self):
self._statefile = self._state_files.get(self._args.module, "tfstatefile")
def _parse_arguments(self):
parser = argparse.ArgumentParser()
parser.add_argument('environment', type=str, help="The environment ")
parser.add_argument('module', type=str, help="The terraform module we want to run")
parser.add_argument('action', type=str, default='apply', help="The terraform action we want to perform",
nargs="?")
self._args = parser.parse_args()
def _get_bucket(self):
return self._state_buckets.get(self._args.environment, "cjse-bichard7-default-sharedaccount-sandbox-bootstrap-tfstate")
@staticmethod
def _generate_command(action="apply", extra_args=[]):
return " ".join(["terraform", action] + extra_args)
def _init_args(self):
return [
'-backend-config "bucket={}"'.format(self._get_bucket()),
'-backend-config "dynamodb_table={}-lock"'.format(self._get_bucket()),
'-backend-config "key={}"'.format(self._statefile),
'-backend-config "region={}"'.format(self._lock_table_region)
]
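    # Illustrative sketch (bucket value taken from _state_buckets above): running
    # `scripts/shared_account_terraform.py sandbox infra` makes the init step execute
    # roughly:
    #   terraform init \
    #     -backend-config "bucket=cjse-bichard7-default-sharedaccount-sandbox-bootstrap-tfstate" \
    #     -backend-config "dynamodb_table=cjse-bichard7-default-sharedaccount-sandbox-bootstrap-tfstate-lock" \
    #     -backend-config "key=tfstatefile" \
    #     -backend-config "region=eu-west-2"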
def _run_command(self, action="apply", extra_args=[]):
command = self._generate_command(action=action, extra_args=extra_args)
running_env = os.environ.copy()
try:
subprocess.run(args=command, env=running_env, shell=True, check=True)
except subprocess.CalledProcessError as e:
print(e.output)
exit(1)
def run(self):
self._parse_arguments()
self._set_state_file()
try:
os.chdir(
os.path.join(
"terraform",
"shared_account_{}_{}".format(
self._args.environment,
self._args.module
)
)
)
except FileNotFoundError:
print("Invalid target directory specified")
exit(-1)
try:
shutil.rmtree(".terraform")
os.unlink(".terraform.lock.hcl")
except FileNotFoundError:
pass
self._run_command(action="init", extra_args=self._init_args())
self._run_command(action=self._args.action,
extra_args=["-auto-approve" if os.getenv("AUTO_APPROVE") == "true" else ""])
if __name__ == "__main__":
runner = TerraformRunner()
runner.run()
|
py | b413ef16270c1236a6da5cc0602273de5a850945 | from unittest import TestCase
from mock import Mock
from plugins.announcer_plugin.announcer_plugin import Announcer
class AnnoucerTestCase(TestCase):
def test_after_connect_success(self):
mock_factory = Mock()
mock_config = Mock(colors='colors config')
mock_protocol = Mock()
mock_protocol.player.colored_name.return_value = 'player name'
plugin = Announcer()
plugin.factory = mock_factory
plugin.config = mock_config
plugin.protocol = mock_protocol
plugin.after_connect_success(None)
mock_factory.broadcast.assert_called_with(
'player name logged in.', 'Announcer'
)
mock_protocol.player.colored_name.assert_called_with('colors config')
def test_on_client_disconnect_request(self):
mock_factory = Mock()
mock_config = Mock(colors='colors config')
mock_protocol = Mock()
mock_protocol.player.colored_name.return_value = 'player name'
plugin = Announcer()
plugin.factory = mock_factory
plugin.config = mock_config
plugin.protocol = mock_protocol
plugin.on_client_disconnect_request(None)
mock_factory.broadcast.assert_called_with(
'player name logged out.', 'Announcer'
)
mock_protocol.player.colored_name.assert_called_with('colors config')
def test_on_client_disconnect_request_player_is_none(self):
mock_factory = Mock()
mock_protocol = Mock()
mock_protocol.player = None
plugin = Announcer()
plugin.factory = mock_factory
plugin.protocol = mock_protocol
        plugin.on_client_disconnect_request(None)
        self.assertFalse(mock_factory.broadcast.called)
|
py | b413efad302cad795a8bd1368a2ded880994bddb | import os
import site
from unittest import mock
from unittest.mock import patch
from contextlib import ExitStack
from cauldron.environ import systems
from cauldron.test.support import scaffolds
class TestSystems(scaffolds.ResultsTest):
"""..."""
def test_end_success(self):
"""Should end quietly with a success exit code."""
with patch('sys.exit') as sys_exit:
systems.end(0)
sys_exit.assert_called_with(0)
def test_end_fail(self):
"""Should end in error with non-success exit code."""
with patch('sys.exit') as sys_exit:
systems.end(1)
sys_exit.assert_called_with(1)
def test_remove_no_path(self):
"""Should abort removal if path is None."""
self.assertFalse(systems.remove(None))
def test_remove_file_fail(self):
"""Should fail to remove file when os.remove fails."""
path = 'NOT-REAL-PATH'
with ExitStack() as stack:
exists = stack.enter_context(patch('os.path.exists'))
exists.return_value = True
is_file = stack.enter_context(patch('os.path.isfile'))
is_file.return_value = True
remove = stack.enter_context(patch('os.remove'))
remove.side_effect = IOError('FAKE ERROR')
self.assertFalse(systems.remove(path))
exists.assert_called_with(path)
self.assertEqual(remove.call_count, 3)
def test_remove_file_success(self):
"""Should remove file when os.remove succeeds."""
path = 'NOT-REAL-PATH'
with ExitStack() as stack:
exists = stack.enter_context(patch('os.path.exists'))
exists.return_value = True
is_file = stack.enter_context(patch('os.path.isfile'))
is_file.return_value = True
remove = stack.enter_context(patch('os.remove'))
remove.return_value = True
self.assertTrue(systems.remove(path))
exists.assert_called_with(path)
self.assertEqual(remove.call_count, 1)
def test_remove_folder_fail(self):
"""Should fail to remove folder when os.remove fails."""
path = 'NOT-REAL-PATH'
with ExitStack() as stack:
exists = stack.enter_context(patch('os.path.exists'))
exists.return_value = True
is_file = stack.enter_context(patch('os.path.isfile'))
is_file.return_value = False
remove_tree = stack.enter_context(patch('shutil.rmtree'))
remove_tree.side_effect = IOError('FAKE ERROR')
self.assertFalse(systems.remove(path))
exists.assert_called_with(path)
self.assertEqual(remove_tree.call_count, 3)
def test_remove_folder_success(self):
"""Should remove file when os.remove succeeds."""
path = 'NOT-REAL-PATH'
with ExitStack() as stack:
exists = stack.enter_context(patch('os.path.exists'))
exists.return_value = True
is_file = stack.enter_context(patch('os.path.isfile'))
is_file.return_value = False
remove_tree = stack.enter_context(patch('shutil.rmtree'))
remove_tree.return_value = True
self.assertTrue(systems.remove(path))
exists.assert_called_with(path)
self.assertEqual(remove_tree.call_count, 1)
def test_simplify_path(self):
"""Should simplify the path """
path = os.path.dirname(os.path.realpath(__file__))
parent = os.path.dirname(path)
result = systems.simplify_path(path, [('[ME]', parent)])
self.assertTrue(result.startswith('[ME]'))
def test_simplify_path_no_match(self):
"""Should not simplify the path if it doesn't match """
path = 'NOT-A-REAL-PATH'
result = systems.simplify_path(path, [])
self.assertEqual(path, result)
def test_module_to_package_data_core_package(self):
"""Should return None when module is part of the standard library."""
result = systems.module_to_package_data('os', os)
self.assertIsNone(result)
def test_module_to_package_data_submodule(self):
"""Should return None if the module is a submodule."""
result = systems.module_to_package_data('cauldron.environ', None)
self.assertIsNone(result)
def test_get_site_packages_success(self):
"""Should get site packages."""
if not hasattr(site, 'getsitepackages'): # pragma: no cover
# Some versions of python on different systems lacked this
# function. Here we enforce its existence for testing.
setattr(site, 'getsitepackages', lambda *args, **kwargs: [])
data = [1, 2, 3]
with patch('site.getsitepackages') as get_site_packages:
get_site_packages.return_value = data
result = systems.get_site_packages()
self.assertEqual(data, result)
def test_get_site_packages_failed(self):
"""Should return an empty list if unable to get site packages."""
if not hasattr(site, 'getsitepackages'): # pragma: no cover
# Some versions of python on different systems lacked this
# function. Here we enforce its existence for testing.
setattr(site, 'getsitepackages', lambda *args, **kwargs: [])
with patch('site.getsitepackages') as get_site_packages:
get_site_packages.side_effect = ValueError('FAKE ERROR')
result = systems.get_site_packages()
self.assertIsInstance(result, list)
self.assertEqual(len(result), 0)
|
py | b413efede895cef03c4f19b6321ef93eb4dcfcba | import sys
sys.path.append('./amc_dl')
sys.path.append('../accomontage code/amc_dl')
from torch_plus import PytorchModel
from torch_plus.train_utils import get_zs_from_dists, kl_with_normal
import torch
from torch import nn
from torch.distributions import Normal
import numpy as np
sys.path.append('./models')
sys.path.append('../accomontage code/models')
from ptvae import RnnEncoder, RnnDecoder, PtvaeDecoder, \
TextureEncoder
class DisentangleVAE(PytorchModel):
def __init__(self, name, device, chd_encoder, rhy_encoder, decoder,
chd_decoder):
super(DisentangleVAE, self).__init__(name, device)
self.chd_encoder = chd_encoder
self.rhy_encoder = rhy_encoder
self.decoder = decoder
self.num_step = self.decoder.num_step
self.chd_decoder = chd_decoder
def confuse_prmat(self, pr_mat):
non_zero_ent = torch.nonzero(pr_mat.long())
eps = torch.randint(0, 2, (non_zero_ent.size(0),))
eps = ((2 * eps) - 1).long()
confuse_ent = torch.clamp(non_zero_ent[:, 2] + eps, min=0, max=127)
pr_mat[non_zero_ent[:, 0], non_zero_ent[:, 1], confuse_ent] = \
pr_mat[non_zero_ent[:, 0], non_zero_ent[:, 1], non_zero_ent[:, 2]]
return pr_mat
def get_chroma(self, pr_mat):
bs = pr_mat.size(0)
pad = torch.zeros(bs, 32, 4).to(self.device)
pr_mat = torch.cat([pr_mat, pad], dim=-1)
c = pr_mat.view(bs, 32, -1, 12).contiguous()
c = c.sum(dim=-2) # (bs, 32, 12)
c = c.view(bs, 8, 4, 12)
c = c.sum(dim=-2).float()
c = torch.log(c + 1)
return c.to(self.device)
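    # Shape walkthrough of get_chroma (the (bs, 32, 128) piano-roll input size is
    # inferred, not stated here): padding 4 zero columns gives (bs, 32, 132); the
    # first view/sum collapses 11 octaves into a (bs, 32, 12) pitch-class count;
    # the second view/sum pools every 4 steps into (bs, 8, 12); log(c + 1) then
    # compresses the counts into a soft chroma feature.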
def run(self, x, c, pr_mat, tfr1, tfr2, tfr3, confuse=True):
embedded_x, lengths = self.decoder.emb_x(x)
# cc = self.get_chroma(pr_mat)
dist_chd = self.chd_encoder(c)
# pr_mat = self.confuse_prmat(pr_mat)
dist_rhy = self.rhy_encoder(pr_mat)
z_chd, z_rhy = get_zs_from_dists([dist_chd, dist_rhy], True)
dec_z = torch.cat([z_chd, z_rhy], dim=-1)
pitch_outs, dur_outs = self.decoder(dec_z, False, embedded_x,
lengths, tfr1, tfr2)
recon_root, recon_chroma, recon_bass = self.chd_decoder(z_chd, False,
tfr3, c)
return pitch_outs, dur_outs, dist_chd, dist_rhy, recon_root, \
recon_chroma, recon_bass
def loss_function(self, x, c, recon_pitch, recon_dur, dist_chd,
dist_rhy, recon_root, recon_chroma, recon_bass,
beta, weights, weighted_dur=False):
recon_loss, pl, dl = self.decoder.recon_loss(x, recon_pitch, recon_dur,
weights, weighted_dur)
kl_loss, kl_chd, kl_rhy = self.kl_loss(dist_chd, dist_rhy)
chord_loss, root, chroma, bass = self.chord_loss(c, recon_root,
recon_chroma,
recon_bass)
loss = recon_loss + beta * kl_loss + chord_loss
return loss, recon_loss, pl, dl, kl_loss, kl_chd, kl_rhy, chord_loss, \
root, chroma, bass
def chord_loss(self, c, recon_root, recon_chroma, recon_bass):
loss_fun = nn.CrossEntropyLoss()
root = c[:, :, 0: 12].max(-1)[-1].view(-1).contiguous()
chroma = c[:, :, 12: 24].long().view(-1).contiguous()
bass = c[:, :, 24:].max(-1)[-1].view(-1).contiguous()
recon_root = recon_root.view(-1, 12).contiguous()
recon_chroma = recon_chroma.view(-1, 2).contiguous()
recon_bass = recon_bass.view(-1, 12).contiguous()
root_loss = loss_fun(recon_root, root)
chroma_loss = loss_fun(recon_chroma, chroma)
bass_loss = loss_fun(recon_bass, bass)
chord_loss = root_loss + chroma_loss + bass_loss
return chord_loss, root_loss, chroma_loss, bass_loss
def kl_loss(self, *dists):
# kl = kl_with_normal(dists[0])
kl_chd = kl_with_normal(dists[0])
kl_rhy = kl_with_normal(dists[1])
kl_loss = kl_chd + kl_rhy
return kl_loss, kl_chd, kl_rhy
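    # kl_with_normal (imported from torch_plus.train_utils) presumably measures each
    # posterior against a standard normal prior; for a diagonal Gaussian N(mu, sigma)
    # that KL term is 0.5 * sum(sigma^2 + mu^2 - 1 - log(sigma^2)).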
def loss(self, x, c, pr_mat, dt_x, tfr1=0., tfr2=0., tfr3=0., beta=0.1, weights=(1, 0.5)):
#print(pr_mat.shape, dt_x.shape)
outputs = self.run(x, c, pr_mat, tfr1, tfr2, tfr3)
loss = self.loss_function(x, c, *outputs, beta, weights)
return loss
# def inference(self, c, pr_mat):
# self.eval()
# with torch.no_grad():
# dist_chd = self.chd_encoder(c)
# # pr_mat = self.confuse_prmat(pr_mat)
# dist_rhy = self.rhy_encoder(pr_mat)
# z_chd, z_rhy = get_zs_from_dists([dist_chd, dist_rhy], True)
# dec_z = torch.cat([z_chd, z_rhy], dim=-1)
# pitch_outs, dur_outs = self.decoder(dec_z, True, None,
# None, 0., 0.)
# est_x, _, _ = self.decoder.output_to_numpy(pitch_outs, dur_outs)
# return est_x
#
# def swap(self, c1, c2, pr_mat1, pr_mat2, fix_rhy, fix_chd):
# pr_mat = pr_mat1 if fix_rhy else pr_mat2
# c = c1 if fix_chd else c2
# est_x = self.inference(c, pr_mat)
# return est_x
def inference_encode(self, pr_mat, c):
self.eval()
with torch.no_grad():
dist_chd = self.chd_encoder(c)
dist_rhy = self.rhy_encoder(pr_mat)
return dist_chd, dist_rhy
def inference_decode(self, z_chd, z_rhy):
self.eval()
with torch.no_grad():
dec_z = torch.cat([z_chd, z_rhy], dim=-1)
pitch_outs, dur_outs = self.decoder(dec_z, True, None,
None, 0., 0.)
est_x, _, _ = self.decoder.output_to_numpy(pitch_outs, dur_outs)
return est_x
def inference(self, pr_mat, c, sample):
self.eval()
with torch.no_grad():
dist_chd = self.chd_encoder(c)
dist_rhy = self.rhy_encoder(pr_mat)
z_chd, z_rhy = get_zs_from_dists([dist_chd, dist_rhy], sample)
dec_z = torch.cat([z_chd, z_rhy], dim=-1)
pitch_outs, dur_outs = self.decoder(dec_z, True, None,
None, 0., 0.)
est_x, _, _ = self.decoder.output_to_numpy(pitch_outs, dur_outs)
return est_x
def swap(self, pr_mat1, pr_mat2, c1, c2, fix_rhy, fix_chd):
pr_mat = pr_mat1 if fix_rhy else pr_mat2
c = c1 if fix_chd else c2
est_x = self.inference(pr_mat, c, sample=False)
return est_x
def posterior_sample(self, pr_mat, c, scale=None, sample_chd=True,
sample_txt=True):
if scale is None and sample_chd and sample_txt:
est_x = self.inference(pr_mat, c, sample=True)
else:
dist_chd, dist_rhy = self.inference_encode(pr_mat, c)
if scale is not None:
mean_chd = dist_chd.mean
mean_rhy = dist_rhy.mean
# std_chd = torch.ones_like(dist_chd.mean) * scale
# std_rhy = torch.ones_like(dist_rhy.mean) * scale
std_chd = dist_chd.scale * scale
std_rhy = dist_rhy.scale * scale
dist_rhy = Normal(mean_rhy, std_rhy)
dist_chd = Normal(mean_chd, std_chd)
z_chd, z_rhy = get_zs_from_dists([dist_chd, dist_rhy], True)
if not sample_chd:
z_chd = dist_chd.mean
if not sample_txt:
z_rhy = dist_rhy.mean
est_x = self.inference_decode(z_chd, z_rhy)
return est_x
def prior_sample(self, x, c, sample_chd=False, sample_rhy=False,
scale=1.):
dist_chd, dist_rhy = self.inference_encode(x, c)
mean = torch.zeros_like(dist_rhy.mean)
loc = torch.ones_like(dist_rhy.mean) * scale
if sample_chd:
dist_chd = Normal(mean, loc)
if sample_rhy:
dist_rhy = Normal(mean, loc)
z_chd, z_rhy = get_zs_from_dists([dist_chd, dist_rhy], True)
return self.inference_decode(z_chd, z_rhy)
def gt_sample(self, x):
out = x[:, :, 1:].numpy()
return out
def interp(self, pr_mat1, c1, pr_mat2, c2, interp_chd=False,
interp_rhy=False, int_count=10):
dist_chd1, dist_rhy1 = self.inference_encode(pr_mat1, c1)
dist_chd2, dist_rhy2 = self.inference_encode(pr_mat2, c2)
[z_chd1, z_rhy1, z_chd2, z_rhy2] = \
get_zs_from_dists([dist_chd1, dist_rhy1, dist_chd2, dist_rhy2],
False)
if interp_chd:
z_chds = self.interp_z(z_chd1, z_chd2, int_count)
else:
z_chds = z_chd1.unsqueeze(1).repeat(1, int_count, 1)
if interp_rhy:
z_rhys = self.interp_z(z_rhy1, z_rhy2, int_count)
else:
z_rhys = z_rhy1.unsqueeze(1).repeat(1, int_count, 1)
bs = z_chds.size(0)
z_chds = z_chds.view(bs * int_count, -1).contiguous()
z_rhys = z_rhys.view(bs * int_count, -1).contiguous()
estxs = self.inference_decode(z_chds, z_rhys)
return estxs.reshape((bs, int_count, 32, 15, -1))
def interp_z(self, z1, z2, int_count=10):
z1 = z1.numpy()
z2 = z2.numpy()
zs = torch.stack([self.interp_path(zz1, zz2, int_count)
for zz1, zz2 in zip(z1, z2)], dim=0)
return zs
def interp_path(self, z1, z2, interpolation_count=10):
result_shape = z1.shape
z1 = z1.reshape(-1)
z2 = z2.reshape(-1)
def slerp2(p0, p1, t):
omega = np.arccos(
np.dot(p0 / np.linalg.norm(p0), p1 / np.linalg.norm(p1)))
so = np.sin(omega)
return np.sin((1.0 - t) * omega)[:, None] / so * p0[
None] + np.sin(
t * omega)[:, None] / so * p1[None]
percentages = np.linspace(0.0, 1.0, interpolation_count)
normalized_z1 = z1 / np.linalg.norm(z1)
normalized_z2 = z2 / np.linalg.norm(z2)
dirs = slerp2(normalized_z1, normalized_z2, percentages)
length = np.linspace(np.log(np.linalg.norm(z1)),
np.log(np.linalg.norm(z2)),
interpolation_count)
out = (dirs * np.exp(length[:, None])).reshape(
[interpolation_count] + list(result_shape))
# out = np.array([(1 - t) * z1 + t * z2 for t in percentages])
return torch.from_numpy(out).to(self.device).float()
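    # The interpolation above is a slerp: for unit vectors p0, p1 with angle
    # omega = arccos(p0 . p1), each step t uses
    #   sin((1 - t) * omega) / sin(omega) * p0 + sin(t * omega) / sin(omega) * p1,
    # while the overall vector length is interpolated separately in log space via
    # np.exp(np.linspace(log|z1|, log|z2|, interpolation_count)).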
@staticmethod
def init_model(device=None, chd_size=256, txt_size=256, num_channel=10):
name = 'disvae'
if device is None:
device = torch.device('cuda' if torch.cuda.is_available()
else 'cpu')
# chd_encoder = RnnEncoder(36, 1024, 256)
chd_encoder = RnnEncoder(36, 1024, chd_size)
# rhy_encoder = TextureEncoder(256, 1024, 256)
rhy_encoder = TextureEncoder(256, 1024, txt_size, num_channel)
# pt_encoder = PtvaeEncoder(device=device, z_size=152)
# chd_decoder = RnnDecoder(z_dim=256)
chd_decoder = RnnDecoder(z_dim=chd_size)
# pt_decoder = PtvaeDecoder(note_embedding=None,
# dec_dur_hid_size=64, z_size=512)
pt_decoder = PtvaeDecoder(note_embedding=None,
dec_dur_hid_size=64,
z_size=chd_size + txt_size)
model = DisentangleVAE(name, device, chd_encoder,
rhy_encoder, pt_decoder, chd_decoder)
return model
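# Hypothetical usage sketch (tensor shapes are inferred from the encoders above,
# not stated in this module): pr_mat is a (bs, 32, 128) piano-roll and c is a
# (bs, 8, 36) chord tensor.
#   model = DisentangleVAE.init_model(torch.device('cpu'))
#   est_x = model.swap(pr_mat1, pr_mat2, c1, c2, fix_rhy=True, fix_chd=False)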
|
py | b413f0151b09ed2de7a284d1fd31dc335bddbd41 | # coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
import uuid
from msrest.pipeline import ClientRawResponse
from .. import models
class RegisteredAsnsOperations(object):
"""RegisteredAsnsOperations operations.
    You should not instantiate this class directly; create a Client instance, which will create it for you and attach it as an attribute.
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An object model deserializer.
:ivar api_version: The client API version. Constant value: "2020-01-01-preview".
"""
models = models
def __init__(self, client, config, serializer, deserializer):
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self.api_version = "2020-01-01-preview"
self.config = config
def get(
self, resource_group_name, peering_name, registered_asn_name, custom_headers=None, raw=False, **operation_config):
"""Gets an existing registered ASN with the specified name under the given
subscription, resource group and peering.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param peering_name: The name of the peering.
:type peering_name: str
:param registered_asn_name: The name of the registered ASN.
:type registered_asn_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: PeeringRegisteredAsn or ClientRawResponse if raw=true
:rtype: ~azure.mgmt.peering.models.PeeringRegisteredAsn or
~msrest.pipeline.ClientRawResponse
:raises:
:class:`ErrorResponseException<azure.mgmt.peering.models.ErrorResponseException>`
"""
# Construct URL
url = self.get.metadata['url']
path_format_arguments = {
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
'peeringName': self._serialize.url("peering_name", peering_name, 'str'),
'registeredAsnName': self._serialize.url("registered_asn_name", registered_asn_name, 'str'),
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.ErrorResponseException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('PeeringRegisteredAsn', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
get.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Peering/peerings/{peeringName}/registeredAsns/{registeredAsnName}'}
def create_or_update(
self, resource_group_name, peering_name, registered_asn_name, asn=None, custom_headers=None, raw=False, **operation_config):
"""Creates a new registered ASN with the specified name under the given
subscription, resource group and peering.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param peering_name: The name of the peering.
:type peering_name: str
:param registered_asn_name: The name of the ASN.
:type registered_asn_name: str
:param asn: The customer's ASN from which traffic originates.
:type asn: int
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: PeeringRegisteredAsn or ClientRawResponse if raw=true
:rtype: ~azure.mgmt.peering.models.PeeringRegisteredAsn or
~msrest.pipeline.ClientRawResponse
:raises:
:class:`ErrorResponseException<azure.mgmt.peering.models.ErrorResponseException>`
"""
registered_asn = models.PeeringRegisteredAsn(asn=asn)
# Construct URL
url = self.create_or_update.metadata['url']
path_format_arguments = {
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
'peeringName': self._serialize.url("peering_name", peering_name, 'str'),
'registeredAsnName': self._serialize.url("registered_asn_name", registered_asn_name, 'str'),
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(registered_asn, 'PeeringRegisteredAsn')
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200, 201]:
raise models.ErrorResponseException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('PeeringRegisteredAsn', response)
if response.status_code == 201:
deserialized = self._deserialize('PeeringRegisteredAsn', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
create_or_update.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Peering/peerings/{peeringName}/registeredAsns/{registeredAsnName}'}
def delete(
self, resource_group_name, peering_name, registered_asn_name, custom_headers=None, raw=False, **operation_config):
"""Deletes an existing registered ASN with the specified name under the
given subscription, resource group and peering.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param peering_name: The name of the peering.
:type peering_name: str
:param registered_asn_name: The name of the registered ASN.
:type registered_asn_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: None or ClientRawResponse if raw=true
:rtype: None or ~msrest.pipeline.ClientRawResponse
:raises:
:class:`ErrorResponseException<azure.mgmt.peering.models.ErrorResponseException>`
"""
# Construct URL
url = self.delete.metadata['url']
path_format_arguments = {
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
'peeringName': self._serialize.url("peering_name", peering_name, 'str'),
'registeredAsnName': self._serialize.url("registered_asn_name", registered_asn_name, 'str'),
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.delete(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200, 204]:
raise models.ErrorResponseException(self._deserialize, response)
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
delete.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Peering/peerings/{peeringName}/registeredAsns/{registeredAsnName}'}
def list_by_peering(
self, resource_group_name, peering_name, custom_headers=None, raw=False, **operation_config):
"""Lists all registered ASNs under the given subscription, resource group
and peering.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param peering_name: The name of the peering.
:type peering_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: An iterator like instance of PeeringRegisteredAsn
:rtype:
~azure.mgmt.peering.models.PeeringRegisteredAsnPaged[~azure.mgmt.peering.models.PeeringRegisteredAsn]
:raises:
:class:`ErrorResponseException<azure.mgmt.peering.models.ErrorResponseException>`
"""
def prepare_request(next_link=None):
if not next_link:
# Construct URL
url = self.list_by_peering.metadata['url']
path_format_arguments = {
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
'peeringName': self._serialize.url("peering_name", peering_name, 'str'),
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
else:
url = next_link
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
return request
def internal_paging(next_link=None):
request = prepare_request(next_link)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.ErrorResponseException(self._deserialize, response)
return response
# Deserialize response
header_dict = None
if raw:
header_dict = {}
deserialized = models.PeeringRegisteredAsnPaged(internal_paging, self._deserialize.dependencies, header_dict)
return deserialized
list_by_peering.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Peering/peerings/{peeringName}/registeredAsns'}
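# Illustrative usage (not part of the generated operations class): these
# methods are normally reached through a service client's operations
# attribute. The client class name, attribute name and argument values below
# are assumptions for the sketch only, not taken from this file.
#
#     client = PeeringManagementClient(credentials, subscription_id)
#     for asn in client.registered_asns.list_by_peering('my-rg', 'my-peering'):
#         print(asn.name)
#     client.registered_asns.delete('my-rg', 'my-peering', 'asn-1')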
|
py | b413f07d71ef80d340dc7ffcb4497810369d6750 | # coding:utf-8
import sys
import copy
import collections
from LxBasic import bscObjects
from LxBasic import bscCfg, bscMethods
class ShmUtility(object):
MOD_sys = sys
MOD_copy = copy
CLS_dic_order = collections.OrderedDict
DEF_shm__rsc__category__platform = 'platform'
DEF_shm__rsc__category__plf_language = 'plf-language'
DEF_shm__rsc__category__plf_application = 'plf-application'
DEF_shm__rsc__category__plf_lng_language = 'plf-app-language'
DEF_shm__rsc__category__plf_package = 'plf-package'
DEF_shm__rsc__category__plf_lng_package = 'plf-lan-package'
DEF_shm__rsc__category__plf_App_Package = 'plf-app-package'
DEF_shm__rsc__category__plf_app_lng_Package = 'plf-app-lan-package'
DEF_shm__rsc__category__plf_plug = 'plf-plug'
DEF_shm__rsc__category__plf_lng_plug = 'plf-lan-plug'
DEF_shm__rsc__category__plf_app_lng_plug = 'plf-app-lan-plug'
DEF_shm__rsc__category__plf_app_plug = 'plf-app-plug'
DEF_shm__rsc__category__plf_module = 'plf-module'
DEF_shm__rsc__category__plf_lng_module = 'plf-lan-module'
DEF_shm__rsc__category__plf_app_lng_module = 'plf-app-lan-module'
DEF_shm__rsc__category__plf_app_module = 'plf-app-module'
DEF_shm__rsc__category__plf_scheme = 'plf-scheme'
DEF_shm__rsc__category__plf_app_scheme = 'plf-app-scheme'
DEF_shm__rsc__category__plf_lng_scheme = 'plf-lan-scheme'
DEF_shm__rsc__category__plf_app_lng_scheme = 'plf-app-lan-scheme'
DEF_shm__rsc__category__plf_App_Scheme = 'plf-app-scheme'
DEF_shm__rsc__category__plf_Lng_tool = 'plf-lan-tool'
DEF_shm__rsc__category__plf_app_lng_tool = 'plf-app-lan-tool'
DEF_shm__rsc__category__plf_app_tool = 'plf-app-tool'
# **************************************************************************************************************** #
Category_Project = 'project'
Category_Scheme_Lis = [
DEF_shm__rsc__category__plf_scheme,
DEF_shm__rsc__category__plf_lng_scheme,
DEF_shm__rsc__category__plf_app_scheme,
DEF_shm__rsc__category__plf_app_lng_scheme
]
Category_Package_Lis = [
DEF_shm__rsc__category__plf_package,
DEF_shm__rsc__category__plf_lng_package,
DEF_shm__rsc__category__plf_app_lng_Package,
DEF_shm__rsc__category__plf_App_Package
]
Category_Bin_Lis = [
DEF_shm__rsc__category__platform,
DEF_shm__rsc__category__plf_language,
DEF_shm__rsc__category__plf_application,
DEF_shm__rsc__category__plf_lng_language
]
Category_Module_Lis = [
DEF_shm__rsc__category__plf_module,
DEF_shm__rsc__category__plf_lng_module,
DEF_shm__rsc__category__plf_app_lng_module,
DEF_shm__rsc__category__plf_app_module
]
Category_Plug_Lis = [
DEF_shm__rsc__category__plf_plug,
DEF_shm__rsc__category__plf_lng_plug,
DEF_shm__rsc__category__plf_app_lng_plug,
DEF_shm__rsc__category__plf_app_plug
]
DEF_shm__keyword__share = 'share'
DEF_util__environ_key__name_scheme = 'LYNXI_NAME_SCHEME'
Environ_KeyVAR_kit__window__version_Scheme = 'LYNXI_VERSION_SCHEME'
Environ_Key_File_Scheme = 'LYNXI_SETUP_FILE_SCHEME'
Environ_Key_Config_File_Scheme = 'LYNXI_CONFIG_FILE_SCHEME'
Environ_Key_Enable_Develop = 'LYNXI_ENABLE_DEVELOP'
DEF_util__environ_key__path_preset = u'LYNXI_PATH_PRESET'
DEF_util__environ_key__path_kit = u'LYNXI_PATH_KIT'
DEF_util__environ_key__path_appkit = u'LYNXI_PATH_APPKIT'
DEF_util__environ_key__path_toolkit = u'LYNXI_PATH_TOOLKIT'
DEF_util__environ_key__paths_source = u'LYNXI_PATHS_SOURCE'
Environ_Key_Python_Bin_Path = u'LYNXI_BIN_PYTHON_PATH'
Environ_Key_Loadname_Plug = u'LYNXI_LOADNAME_PLUG'
Environ_Key_Loadname_Module = u'LYNXI_LOADNAME_MODULE'
DEF_shm__folder__source = 'source'
Key_User = 'user'
Key_Timestamp = 'timestamp'
Ext_Json = '.json'
DEF_shm__key__enable = 'enable'
DEF_shm__key__category = 'category'
DEF_shm__key__name = 'name'
DEF_shm__key__system = 'system'
DEF_shm__key__version = 'version'
Key_Record = 'record'
Key_Active = 'active'
Key_Develop = 'develop'
Key_Custom = 'custom'
Key_Application = 'application'
Key_Bin = 'bin'
Key_Platform = 'platform'
Key_App = 'app'
Key_PythonVAR_kit__window__version = 'python_version'
Key_Resource = 'resource'
Key_Config = 'config'
Key_Program = 'program'
Key_Dependent = 'dependent'
Key_Dependent_Module = 'dependent_module'
Key_Dependent_Package = 'dependent_package'
Key_Language = 'language'
Key_Language_Name = 'language_name'
Key_LanguageVAR_kit__window__version = 'language_version'
Key_Module = 'module'
Key_Plug = 'plug'
Key_Python_Package = 'python_package'
Key_Python_Module = 'python_module'
Key_Resource_Source_Path = 'sourcepath'
Key_Resource_Compile_Path = 'compilepath'
DEF_key_plug_name = 'plugname'
DEF_key_plug_version = 'plugversion'
Key_Plug_App = 'plugapp'
Key_Plug_Source_Path = 'plugpath'
Key_Plug_Load_Name = 'loadname'
Key_Plug_Module_Name = 'moduleName'
Language_Python = 'python'
Version_Default = '0.0.0'
DEF_shm_keyword__version_active = 'active'
DEF_shm_keyword__system_active = 'active'
App_Maya = 'maya'
Platform_Windows = 'windows'
PythonVAR_kit__window__version_27 = '2.7'
Key_Path = 'path'
Key_Environ = 'environ'
Key_Value = 'value'
Key_Operate = 'operate'
DEF_shm__keyword_index = 'index'
Operation_Add = '+='
Operation_Replace = '='
DEF_shm__keyword__self = 'self'
Attr_Key_Root = 'root'
Attr_Key_Path = 'path'
Path_Key_Active = 'active'
Path_Key_Server = 'server'
Path_Key_Local = 'local'
Path_Key_Develop = 'develop'
Path_Key_Product = 'product'
Path_Key_Workspace = 'workspace'
Attr_Key_Path_Source = 'sourcepath'
_String_Indent = ' '
@staticmethod
def _toSubPathMethod(*args):
if args:
sep = '/'
if len(args) > 1:
if isinstance(args[0], list) or isinstance(args[0], tuple):
return sep.join(list(args[0]))
return sep.join(list(args))
return args[0]
@staticmethod
def _toSubNameMethod(*args):
if args:
sep = '-'
if len(args) > 1:
if isinstance(args[0], list) or isinstance(args[0], tuple):
return sep.join(list(args[0]))
return sep.join(list(args))
return args[0]
return ''
@staticmethod
def _createTimestampMethod(osPath, osJsonFile):
isAscii = False
timestampDic = bscMethods.OsDirectory.allFileTimestampDict(osPath)
bscMethods.OsJsonFile.write(osJsonFile, timestampDic, ensure_ascii=isAscii)
@staticmethod
def _getChangedFileMethod(sourceTimestamp, targetTimestamp):
lis = []
for localOsFile, sourceTime in sourceTimestamp.items():
if targetTimestamp.__contains__(localOsFile):
targetTime = targetTimestamp[localOsFile]
if sourceTime != targetTime:
lis.append(localOsFile)
#
else:
lis.append(localOsFile)
return lis
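    # Illustrative example for _getChangedFileMethod (assumed values):
    #   sourceTimestamp = {'a.py': 100, 'b.py': 200, 'c.py': 300}
    #   targetTimestamp = {'a.py': 100, 'b.py': 150}
    #   -> returns ['b.py', 'c.py'] (order follows dict iteration): files whose
    #      timestamp differs from the target snapshot plus files missing from it.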
@classmethod
def isDevelop(cls):
return [False, True][bscMethods.OsEnviron.get(bscCfg.BscUtility.DEF_util__environ_key__enable_develop, u'FALSE').lower() == u'true']
@classmethod
def isUsedef(cls):
return [False, True][bscMethods.OsEnviron.get(bscCfg.BscUtility.DEF_util__environ_key__enable_usedef, u'FALSE').lower() == u'true']
@classmethod
def setUsedef(cls, boolean):
bscMethods.OsEnviron.set(bscCfg.BscUtility.DEF_util__environ_key__enable_usedef, [u'FALSE', u'TRUE'][boolean])
# noinspection PyMethodMayBeStatic
def _jsonStrRaw(self):
return {}
@classmethod
def _toJsonStringMethod(cls, raw, indent=4):
def addNoneFnc_(lString, rString):
lis.append(u'{}null{}'.format(lString, rString))
def addStringFnc_(raw_, lString, rString):
lis.append(u'{}"{}"{}'.format(lString, raw_, rString))
def addUnicodeFnc_(raw_, lString, rString):
lis.append(u'{}"{}"{}'.format(lString, raw_, rString))
def addNumberFnc_(raw_, lString, rString):
lis.append(u'{}{}{}'.format(lString, raw_, rString))
def addBooleanFnc_(raw_, lString, rString):
lis.append(u'{}{}{}'.format(lString, str(raw_).lower(), rString))
def addMemberFnc_(raw_, lString, rString):
if isinstance(raw_, bool):
addBooleanFnc_(raw_, lString, rString)
elif isinstance(raw_, int) or isinstance(raw_, float):
addNumberFnc_(raw_, lString, rString)
elif isinstance(raw_, str):
addStringFnc_(raw_, lString, rString)
elif isinstance(raw_, unicode):
addUnicodeFnc_(raw_, lString, rString)
def addValueFnc_(raw_, lString, rString, rawtype=None):
if raw_ is None:
addNoneFnc_(lString=lString, rString='\r\n')
elif isinstance(raw_, list) or isinstance(raw_, tuple):
lString += defIndentString
addListFnc_(raw_, lString=lString, rString=rString)
elif isinstance(raw_, dict):
lString += defIndentString
addDictionaryFnc_(raw_, lString=lString, rString=rString)
else:
if rawtype == dict:
addMemberFnc_(raw_, lString='', rString=rString)
else:
addMemberFnc_(raw_, lString=lString+defIndentString, rString=rString)
def addListFnc_(raw_, lString, rString):
if raw_:
lis.append(u'{lString}[{rString}'.format(lString='', rString='\r\n'))
c = len(raw_)
for seq, i in enumerate(raw_):
if seq < c - 1:
addValueFnc_(i, lString=lString, rString=',\r\n', rawtype=list)
else:
addValueFnc_(i, lString=lString, rString='\r\n', rawtype=list)
lis.append(u'{lString}]{rString}'.format(lString=lString, rString=rString))
else:
lis.append(u'{lString}[]{rString}\r\n'.format(lString=lString, rString=rString))
def addDictionaryFnc_(raw_, lString, rString):
if raw_:
lis.append(u'{lString}{{{rString}'.format(lString='', rString='\r\n'))
c = len(raw_)
for seq, (k, v) in enumerate(raw_.items()):
addMemberFnc_(k, lString=lString + defIndentString, rString=': ')
if seq < c - 1:
addValueFnc_(v, lString=lString, rString=',\r\n', rawtype=dict)
else:
addValueFnc_(v, lString=lString, rString='\r\n', rawtype=dict)
lis.append(u'{lString}}}{rString}'.format(lString=lString, rString=rString))
else:
lis.append(u'{lString}{{}}{rString}'.format(lString='', rString=rString))
def addRawFnc_(raw_):
if raw_ is None:
addNoneFnc_(lString='', rString='\r\n')
elif isinstance(raw_, list) or isinstance(raw_, tuple):
addListFnc_(raw_, lString='', rString='\r\n')
elif isinstance(raw_, dict):
addDictionaryFnc_(raw_, lString='', rString='\r\n')
defIndentString = ' ' * indent
lis = [
u'{} = '.format(cls.__name__)
]
addRawFnc_(raw)
return ''.join(lis)
def __str__(self):
if self._jsonStrRaw():
return self._toJsonStringMethod(self._jsonStrRaw())
return ''
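    # Illustrative note: a subclass that overrides _jsonStrRaw(), e.g. to return
    # {'name': 'kit', 'enable': True, 'tags': ['a', 'b']}, would have str()
    # render roughly as follows (class name taken from cls.__name__):
    #
    #   Demo = {
    #       "name": "kit",
    #       "enable": true,
    #       "tags": [
    #           "a",
    #           "b"
    #       ]
    #   }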
class ShmResourceCategory(object):
pass
class ShmSystemArgument(object):
# platform
platform = [
'share', 'share'
]
platform__python_27 = [
'share', 'share',
'python', '2.7'
]
platform__maya = [
'share', 'share',
'maya', 'share'
]
platform__maya__python_27 = [
'share', 'share',
'maya', 'share',
'python', '2.7'
]
platform__maya_2019 = [
'share', 'share',
'maya', '2019'
]
platform__houdini = [
'share', 'share',
'houdini', 'share'
]
platform__houdini__python_27 = [
'share', 'share',
'houdini', 'share',
'python', '2.7'
]
platform__houdini_18 = [
'share', 'share',
'houdini', '18'
]
# windows
windows = [
'windows', 'share'
]
windows__python_27 = [
'windows', 'share',
'python', '2.7'
]
windows__maya = [
'windows', 'share',
'maya', 'share'
]
windows__maya_2019 = [
'windows', 'share',
'maya', '2019'
]
windows__houdini = [
'windows', 'share',
'houdini', 'share'
]
windows__houdini_18 = [
'windows', 'share',
'houdini', '18'
]
# windows-x64
windows_x64 = [
'windows', 'x64'
]
# linux
linux = [
'linux', 'share'
]
linux__python_27 = [
'linux', 'share',
'python', '2.7'
]
linux__houdini_18 = [
'linux', 'share',
'houdini', '18'
]
# linux x64
linux_x64 = [
'linux', 'x64'
]
linux_x64__python_27 = [
'linux', 'x64',
'python', '2.7'
]
linux_x64__houdini = [
'linux', 'x64',
'houdini', 'share'
]
linux_x64__houdini_18 = [
'linux', 'x64',
'houdini', '18'
]
linux_x64__maya = [
'linux', 'x64',
'maya', 'share'
]
linux_x64__maya_18 = [
'linux', 'x64',
'maya', '18'
]
class ShmRootDefaultKey(object):
pass
class ShmRootDefaultValue(object):
develop = bscCfg.BscUtility.DEF_util__root__default_develop
product = bscCfg.BscUtility.DEF_util__root__default_product
local = bscCfg.BscUtility.DEF_util__root__default_local
|
py | b413f29603c3f6a202d08a49afeb8240ef921de2 | import sqlite3
from tkinter import*
from tkinter import ttk
import random
from datetime import datetime
import tkinter.messagebox
import tkinter as tk
from tkinter.scrolledtext import ScrolledText
import datetime
from source import DB
db = DB()
class Library:
def __init__(self,root):
self.root = root
self.root.title("Sasiram Library Management System")
#self.root.geometry("1350x750+0+0")
width_value=self.root.winfo_screenwidth()
height_value=self.root.winfo_screenheight()
self.root.geometry('%dx%d+0+0'%(width_value,height_value))
self.root.configure(background='powder blue')
#========================================page tab===================================================
nb = ttk.Notebook(root.master)
page1 = ttk.Frame(nb)
page1.pack(fill="both")
page2 = ttk.Frame(nb)
page2.pack(fill="both")
#text = ScrolledText(page2)
#text.pack(expand=1, fill="both")
page3 = ttk.Frame(nb)
page3.pack(fill="both")
nb.add(page1, text='\tBook Management\t ')
nb.add(page2, text='\tAll Data Records\t')
nb.add(page3, text='\tAdd Books\t')
#===========================================page tab================================================
#*******************************************************************************************************************************************************************
#==============================page 1 frame===============================================================================
MainFrame = Frame(page1, bg='powder blue')
MainFrame.pack(fill='both')
TitleFrame = Frame(MainFrame, width=135,padx=20, bd=5, relief=RIDGE)#
TitleFrame.pack(side=TOP,fill=X)
self.lblTitle = Label(TitleFrame, width=35, font=('arial',40,'bold'),text="\tLibrary Management System\t", padx=12)
self.lblTitle.pack()
DataFrame = Frame(MainFrame, bd=5, width=130, height=400, padx=20, relief=RIDGE)
DataFrame.pack(side=TOP,fill=X)
endFrame =Frame(MainFrame, bd=5,width=135,height=50,padx=20)#,relief=RIDGE
endFrame.pack(side=BOTTOM,fill='both')# end frame cover last space
ButtonFrame =Frame(MainFrame, bd=5,width=135,height=50,padx=20,pady=20,relief=RIDGE)#,relief=RIDGE
ButtonFrame.pack(side=BOTTOM,fill=X)
FrameDeatail = Frame(MainFrame, bd=5, width=1350, height=90, pady=20, padx=20, relief=RIDGE)#
FrameDeatail.pack(side=BOTTOM,fill=X)
DataFrameLEFT = LabelFrame(DataFrame,bd=10,width=800,height=300,padx=20,relief=RIDGE, font=('arial',12,'bold'),text="Library Membership Info:")
DataFrameLEFT.pack(side=LEFT,fill=X)
DataFrameRight = LabelFrame(DataFrame,bd=10,width=450,height=300,padx=20,relief=RIDGE,font=('arial',12,'bold'),text="Book Details:")
DataFrameRight.pack(side=LEFT,fill=X)
#==============================page 1 frame===============================================================================
def Issued_get_selected_row(event):
global selected_tuple
index= self.txtDisplayr.curselection()[0]
selected_tuple = self.txtDisplayr.get(index)
Issued_BookID.delete(0,END)
Issued_BookID.insert(END, selected_tuple[1])
Issued_BookTitle.delete(0,END)
Issued_BookTitle.insert(END, selected_tuple[2])
Issued_Author.delete(0,END)
Issued_Author.insert(END, selected_tuple[3])
Issued_BookSTD.delete(0,END)
Issued_BookSTD.insert(END, selected_tuple[4])
Issued_BookPrice.delete(0,END)
Issued_BookPrice.insert(END, selected_tuple[5])
def Return_get_selected_row(event):
global selected_tuple
index= student_record_display.curselection()[0]
selected_tuple = student_record_display.get(index)
Issued_MemberType.delete(0,END)
Issued_MemberType.insert(END, selected_tuple[1])
Issued_StudentID.delete(0,END)
Issued_StudentID.insert(END, selected_tuple[2])
Issued_Name.delete(0,END)
Issued_Name.insert(END, selected_tuple[3])
Issued_Class.delete(0,END)
Issued_Class.insert(END, selected_tuple[4])
Issued_Section.delete(0,END)
Issued_Section.insert(END, selected_tuple[5])
Issued_IssuedDate.delete(0,END)
Issued_IssuedDate.insert(END, selected_tuple[6])
Issued_ReturnDate.delete(0,END)
Issued_ReturnDate.insert(END, selected_tuple[7])
Issued_BookID.delete(0,END)
Issued_BookID.insert(END, selected_tuple[8])
Issued_BookTitle.delete(0,END)
Issued_BookTitle.insert(END, selected_tuple[9])
Issued_Author.delete(0,END)
Issued_Author.insert(END, selected_tuple[10])
Issued_BookSTD.delete(0,END)
Issued_BookSTD.insert(END, selected_tuple[11])
Issued_BookPrice.delete(0,END)
Issued_BookPrice.insert(END, selected_tuple[12])
Issued_Return.delete(0,END)
Issued_Return.insert(END, selected_tuple[13])
#==============================WIDGE============Library Membership Info===================================================================
MemberType_issued= StringVar()
StudentID_issued= StringVar()
Name_issued= StringVar()
Class_issued= StringVar()
Section_issued= StringVar()
IssuedDate_issued= StringVar()
ReturnDate_issued= StringVar()
BookID_issued= StringVar()
BookTitle_issued= StringVar()
Author_issued= StringVar()
BookSTD_issued= StringVar()
BookPrice_issued= StringVar()
Return_issued= StringVar()
Issued_MemberType= Label(DataFrameLEFT,font=('arial',12,'bold'), text="member type", padx=2, pady=2)
Issued_MemberType.grid(row=0, column=0, sticky=W)
Issued_MemberType = ttk.Combobox(DataFrameLEFT, font=('arial', 12,'bold'),width=23, textvariable=MemberType_issued)#state='readonly',
Issued_MemberType['value']=('student','Lecturer','Admin Staff')
Issued_MemberType.current(0)
Issued_MemberType.grid(row=0,column=1)
Issued_StudentID = Label(DataFrameLEFT, font=('arial',12,'bold'), text="Student ID", padx=2, pady=2)
Issued_StudentID.grid(row=1, column=0, sticky=W)
Issued_StudentID = Entry(DataFrameLEFT, font=('arial',12,'bold'),width=25, textvariable=StudentID_issued)
Issued_StudentID.grid(row=1, column=1)
Issued_Name = Label(DataFrameLEFT,font=('arial',12,'bold'), text="Name", padx=2, pady=2)
Issued_Name.grid(row=2, column=0, sticky=W)
Issued_Name = Entry(DataFrameLEFT, font=('arial',12,'bold'),width=25, textvariable=Name_issued)
Issued_Name.grid(row=2, column=1)
Issued_Class = Label(DataFrameLEFT, font=('arial',12,'bold'), text="Class", padx=2, pady=2)
Issued_Class.grid(row=3, column=0, sticky=W)
Issued_Class = Entry(DataFrameLEFT, font=('arial',12,'bold'),width=25, textvariable=Class_issued)
Issued_Class.grid(row=3, column=1)
Issued_Section = Label(DataFrameLEFT, font=('arial',12,'bold'), text="Section", padx=2, pady=2)
Issued_Section.grid(row=4, column=0, sticky=W)
Issued_Section = Entry(DataFrameLEFT, font=('arial',12,'bold'),width=25, textvariable=Section_issued)
Issued_Section.grid(row=4, column=1)
#date
date_now= datetime.datetime.now()
date_seven=(datetime.datetime.now() + datetime.timedelta(days=7))
Issued_IssuedDate = Label(DataFrameLEFT, font=('arial',12,'bold'), text="Issued Date", padx=2, pady=2)
Issued_IssuedDate.grid(row=5, column=0, sticky=W)
Issued_IssuedDate = ttk.Combobox(DataFrameLEFT, font=('arial',12,'bold'),width=23, textvariable=IssuedDate_issued)
Issued_IssuedDate['value']=(date_now,date_now)
Issued_IssuedDate.current(0)
Issued_IssuedDate.grid(row=5, column=1)
Issued_ReturnDate = Label(DataFrameLEFT, font=('arial',12,'bold'), text="Return Date", padx=2, pady=2)
Issued_ReturnDate.grid(row=6, column=0, sticky=W)
Issued_ReturnDate = ttk.Combobox(DataFrameLEFT, font=('arial',12,'bold'),width=23, textvariable=ReturnDate_issued)
Issued_ReturnDate['value']=(date_seven,date_seven)
Issued_ReturnDate.current(0)
Issued_ReturnDate.grid(row=6, column=1)
Issued_BookID = Label(DataFrameLEFT, font=('arial',12,'bold'), text="Book ID", padx=2, pady=2)
Issued_BookID.grid(row=0, column=2, sticky=W)
Issued_BookID = Entry(DataFrameLEFT, font=('arial',12,'bold'),width=25, textvariable=BookID_issued)
Issued_BookID.grid(row=0, column=3)
Issued_BookTitle = Label(DataFrameLEFT, font=('arial',12,'bold'), text="Book Title", padx=2, pady=2)
Issued_BookTitle.grid(row=1, column=2, sticky=W)
Issued_BookTitle = Entry(DataFrameLEFT, font=('arial',12,'bold'),width=25, textvariable=BookTitle_issued)
Issued_BookTitle.grid(row=1, column=3)
Issued_Author = Label(DataFrameLEFT, font=('arial',12,'bold'), text="Author", padx=2, pady=2)
Issued_Author.grid(row=2, column=2, sticky=W)
Issued_Author = Entry(DataFrameLEFT, font=('arial',12,'bold'),width=25, textvariable=Author_issued)
Issued_Author.grid(row=2, column=3)
Issued_BookSTD = Label(DataFrameLEFT, font=('arial',12,'bold'), text="Book std ", padx=2, pady=2)
Issued_BookSTD.grid(row=3, column=2, sticky=W)
Issued_BookSTD = Entry(DataFrameLEFT, font=('arial',12,'bold'),width=25, textvariable=BookSTD_issued)
Issued_BookSTD.grid(row=3, column=3)
Issued_BookPrice= Label(DataFrameLEFT, font=('arial',12,'bold'), text="Book price", padx=2, pady=2)
Issued_BookPrice.grid(row=4, column=2, sticky=W)
Issued_BookPrice = Entry(DataFrameLEFT, font=('arial',12,'bold'),width=25, textvariable=BookPrice_issued)
Issued_BookPrice.grid(row=4, column=3)
Issued_Return = Label(DataFrameLEFT, font=('arial',12,'bold'), text="Return ",bg="red", padx=2, pady=2)
Issued_Return.grid(row=5, column=2, sticky=W)
Issued_Return = ttk.Combobox(DataFrameLEFT, font=('arial',12,'bold'),width=23, textvariable=Return_issued)
Issued_Return['value']=('NO','YES')
Issued_Return.current(0)
Issued_Return.grid(row=5, column=3)
#==============================WIDGE============Library Membership Info===================================================================
#==============================Right WIDGE============
#==============================List box============
self.txtDisplayr=Listbox(DataFrameRight,font=('arial',12,'bold'),width=50,height=11)#,padx=8,pady=20
self.txtDisplayr.bind('<<ListboxSelect>>', Issued_get_selected_row)
self.txtDisplayr.grid(row=0, column=0)
scrollbar_10 = Scrollbar(DataFrameRight) #Scrollbar
scrollbar_10.grid(row=0,column=1, sticky='ns')
self.txtDisplayr.configure(yscrollcommand=scrollbar_10.set)
scrollbar_10.configure(command=self.txtDisplayr.yview)
scrollbar_11 = Scrollbar(DataFrameRight, orient=HORIZONTAL) #Scrollbar
scrollbar_11.grid(row=1,column=0, sticky='ew')
self.txtDisplayr.configure(xscrollcommand=scrollbar_11.set)
scrollbar_11.configure(command=self.txtDisplayr.xview)
for row in db.books():#disply books details
self.txtDisplayr.insert(0, row)
#==============================List box============
#==============================Right WIDGE============
#==============================list all details===================================================================
student_record_display=Listbox(FrameDeatail,font=('arial',12,'bold'),width=141,height=7)
student_record_display.bind('<<ListboxSelect>>', Return_get_selected_row)
student_record_display.grid(row=1, column=0)
scrollbar_12 = Scrollbar(FrameDeatail) #Scrollbar
scrollbar_12.grid(row=1,column=1, sticky='ns')
student_record_display.configure(yscrollcommand=scrollbar_12.set)
        scrollbar_12.configure(command=student_record_display.yview)
#==============================list===================================================================
def save_issued():
add_issued_command()
            tkinter.messagebox.showinfo("Student Detail", "Library record saved successfully")
def add_issued_command():
db.insert2(MemberType_issued.get(),StudentID_issued.get(),Name_issued.get(),Class_issued.get(),Section_issued.get(),IssuedDate_issued.get(),ReturnDate_issued.get(),BookID_issued.get(),BookTitle_issued.get(),Author_issued.get(),BookSTD_issued.get(),BookPrice_issued.get(),Return_issued.get())
student_record_display.delete(0,END)
student_record_display.insert(END, (MemberType_issued.get(),StudentID_issued.get(),Name_issued.get(),Class_issued.get(),Section_issued.get(),IssuedDate_issued.get(),ReturnDate_issued.get(),BookID_issued.get(),BookTitle_issued.get(),Author_issued.get(),BookSTD_issued.get(),BookPrice_issued.get(),Return_issued.get()))
def non_return():
student_record_display.delete(0,END)
for row in db.STUDENT_RECORD():
student_record_display.insert(0, row)
def Reset_issued():
MemberType_issued.set("")
StudentID_issued.set("")
Name_issued.set("")
Class_issued.set("")
Section_issued.set("")
IssuedDate_issued.set("")
ReturnDate_issued.set("")
BookID_issued.set("")
BookTitle_issued.set("")
Author_issued.set("")
BookSTD_issued.set("")
BookPrice_issued.set("")
Return_issued.set("")
        def Book_Returned():  # mark the selected issued-book record as returned
            db.update2(selected_tuple[0],Return_issued.get())
            tkinter.messagebox.showinfo("Student Detail", "Record updated successfully")
def search_command_issued():
student_record_display.delete(0, END)
for row in db.search2(Return_issued.get()):
student_record_display.insert(END,row)
#==============================Button===================================================================
ButSave=Button(DataFrameLEFT, text='SAVE',font=('Roboto',11,'bold'), bg='powder blue',width=9,bd=6, padx=12,command=save_issued)
ButSave.grid(row=6, column=3)
self.btnDIsplayData=Button(ButtonFrame, text='display data',font=('arial',12,'bold'),width=30,bd=4,command=non_return)
self.btnDIsplayData.grid(row=0,column=0)
        self.btnDelete=Button(ButtonFrame, text='Book Returned',font=('arial',12,'bold'),width=30,bd=4,command=Book_Returned)
self.btnDelete.grid(row=0,column=1)
self.btnReset=Button(ButtonFrame, text='Reset',font=('arial',12,'bold'),width=30,bd=4,command=Reset_issued)
self.btnReset.grid(row=0,column=2)
self.btnExit=Button(ButtonFrame, text='Exit',font=('arial',12,'bold'),width=30,bd=4, command=self.root.destroy)#search_command_issued
self.btnExit.grid(row=0,column=3)
#==============================Button===================================================================
#===========================================page 1 frame =====================================================
#*******************************************************************************************************************************************************************
self.tree = ttk.Treeview(page2)
self.tree["columns"]=("Member_type", "Student_ID", "Name", "Class", "Section", "Issue_Date","Return_Date","Book_ID","Book_Title","Author","Book_Std","Book_Price","Return")
self.tree.column("Member_type", width=100 )
self.tree.column("Student_ID", width=100)
self.tree.column("Name", width=100)
self.tree.column("Class", width=50)
self.tree.column("Section", width=50)
self.tree.column("Issue_Date", width=100)
self.tree.column("Return_Date", width=100 )
self.tree.column("Book_ID", width=50)
self.tree.column("Book_Title", width=100)
self.tree.column("Author", width=100)
self.tree.column("Book_Std", width=50)
self.tree.column("Book_Price", width=100)
self.tree.column("Return", width=100)
###############################
self.tree.heading("Member_type", text="Member type")
self.tree.heading("Student_ID", text="Student ID")
self.tree.heading("Name", text="Name")
self.tree.heading("Class", text="Class")
self.tree.heading("Section", text="Section")
self.tree.heading("Issue_Date", text="Issue Date")
self.tree.heading("Return_Date", text="Return Date")
self.tree.heading("Book_ID", text="Book ID")
self.tree.heading("Book_Title", text="Book Title")
self.tree.heading("Author", text="Author")
self.tree.heading("Book_Std", text="Book STD")
self.tree.heading("Book_Price", text="Book Price")
self.tree.heading("Return", text="Return")
#################################
for row in db.STUDENT_RECORD():
self.tree.insert("",0,text=row[0], values=(row[1],row[2],row[3],row[4],row[5],row[6],row[7],row[8],row[9],row[10],row[11],row[12],row[13]))
self.tree.pack(fill='both')
#*******************************************************************************************************************************************************************
#===========================================page 3 frame =====================================================
#--------------------------------------------------
MainFrame = Frame(page3, bg='powder blue')
MainFrame.pack(fill='both')
TitleFrame = Frame(MainFrame, width=135,padx=20,relief=RIDGE, bd=5)#
TitleFrame.pack(side=TOP,fill=X)
self.lblTitle = Label(TitleFrame, width=35, font=('arial',25,'bold'),text="\t Library Management System \t", padx=12)
self.lblTitle.pack()
Frame_2 = Frame(page3, width=135,relief=RIDGE,bd=5)#,padx=5
Frame_2.pack(side=TOP,fill='both')
#--------------------------------------------------
DetailsFrame1 = Frame(Frame_2, width=350,height=750,padx=12, bg='powder blue', bd=2)
DetailsFrame1.pack(side=LEFT, fill=Y)# fill=X,, expand=YES ,anchor=W,
DetailsFrame1_title = Frame(DetailsFrame1, width=350,height=50,padx=12, bg='powder blue', bd=2)
DetailsFrame1_title.pack(fill=X)
DetailsFrame = Frame(DetailsFrame1, width=350,height=750,padx=12, bg='powder blue', bd=2)
DetailsFrame.pack(side=LEFT, fill=Y)#, expand=YES, anchor=W,
LblTitle = Label(DetailsFrame1_title, width=27, font=('Roboto',18,'bold'),text=" Book Details ", bg='cadet blue', padx=8)
LblTitle.pack( fill=X)
TitleFrame = Frame(Frame_2, width=100,padx=15, bg='green', bd=6)
TitleFrame.pack(side=TOP, fill=X)
LblTitle = Label(TitleFrame, width=72, font=('Roboto',16,'bold'),text=" Books Details ", bg='powder blue', padx=8)
LblTitle.grid()
MiddleFrame=Frame(Frame_2, width=1350,height=600, bd=6,bg='orange')
MiddleFrame.pack(side=TOP, fill=X)#anchor=W,
ButtonFrame =Frame(Frame_2, bd=20,width=1350,height=50,padx=20)
ButtonFrame.pack(side=TOP, fill=X)#, anchor=W,
#--------------------------------------------------
def get_selected_row(event):
global selected_tuple
index= Display.curselection()[0]
selected_tuple = Display.get(index)
Name.delete(0,END)
Name.insert(END, selected_tuple[1])
FatherName.delete(0,END)
FatherName.insert(END, selected_tuple[2])
Occuption.delete(0,END)
Occuption.insert(END, selected_tuple[3])
Address1.delete(0,END)
Address1.insert(END, selected_tuple[4])
Address2.delete(0,END)
Address2.insert(END, selected_tuple[5])
PostCode.delete(0,END)
PostCode.insert(END, selected_tuple[6])
#--------------------------------------------------------Middle frame List-------------------------------
Display=Listbox(MiddleFrame, font=('Roboto',16,'bold'),width=70,height=15,background='#f0f0ff')#,padx=2,pady=4
Display.bind('<<ListboxSelect>>', get_selected_row)
Display.grid(row=0, column=0)
scrollbar1 = Scrollbar(MiddleFrame) #Scrollbar #,bg="red"
scrollbar1.grid(row=0,column=1, sticky='ns')
Display.configure(yscrollcommand=scrollbar1.set)
scrollbar1.configure(command=Display.yview)
scrollbar2 = Scrollbar(MiddleFrame, orient=HORIZONTAL) #Scrollbar
scrollbar2.grid(row=1,column=0, sticky='ew')
Display.configure(xscrollcommand=scrollbar2.set)
scrollbar2.configure(command=Display.xview)
#--------------------------------------------------------DeatialsFrame-------------------------------
BookID_text= StringVar()
BookTitle_text= StringVar()
Author_text= StringVar()
BookSTD_text= StringVar()
BookPrice_text= StringVar()
Date_text= StringVar()
Name = Label(DetailsFrame, font=('Roboto',18,'bold'), text="Book ID", bg='powder blue', padx=2, pady=8)
Name.grid(row=0, column=0, sticky=W)
Name = Entry(DetailsFrame, font=('Roboto',16,'bold'),width=22, textvariable=BookID_text)
Name.grid(row=0, column=1)
FatherName = Label(DetailsFrame, font=('Roboto',18,'bold'), text="Book Title", bg='powder blue', padx=2, pady=8)
FatherName.grid(row=1, column=0, sticky=W)
FatherName = Entry(DetailsFrame, font=('Roboto',16,'bold'),width=22, textvariable=BookTitle_text)
FatherName.grid(row=1, column=1)
Occuption = Label(DetailsFrame, font=('Roboto',18,'bold'), text="Author", bg='powder blue', padx=2, pady=8)
Occuption.grid(row=2, column=0, sticky=W)
Occuption = Entry(DetailsFrame, font=('Roboto',16,'bold'),width=22, textvariable=Author_text)
Occuption.grid(row=2, column=1)
Address1 = Label(DetailsFrame, font=('Roboto',18,'bold'), text="Book STD", bg='powder blue', padx=2, pady=8)
Address1.grid(row=3, column=0, sticky=W)
Address1 = Entry(DetailsFrame, font=('Roboto',16,'bold'),width=22, textvariable=BookSTD_text)
Address1.grid(row=3, column=1)
#PlaceAss = Label(DataFrameLEFT, font=('arial',12,'bold'), text="Placement Assistance:", padx=2, pady=2)
#PlaceAss.grid(row=4, column=2, sticky=W)
#Address1 = ttk.Combobox(DetailsFrame, font=('Roboto', 15,'bold'),width=23, textvariable=add1_text)#,state='readonly'
#Address1['value']=('STD 1','STD 2 ','STD 3 ','STD 4 ','STD 5 ','STD 6 ','STD 7 ','STD 8 ','STD 9 ','STD 10 ','STD 11 ','STD 12 ','OTHER')
#Address1.current(0)
#Address1.grid(row=3,column=1)
Address2 = Label(DetailsFrame, font=('Roboto',18,'bold'), text="Book Price", bg='powder blue', padx=2, pady=8)
Address2.grid(row=4, column=0, sticky=W)
Address2 = Entry(DetailsFrame, font=('Roboto',16,'bold'),width=22, textvariable=BookPrice_text)
Address2.grid(row=4, column=1)
PostCode = Label(DetailsFrame, font=('Roboto',18,'bold'), text="Date", bg='powder blue', padx=2, pady=8)
PostCode.grid(row=5, column=0, sticky=W)
PostCode = Entry(DetailsFrame, font=('Roboto',16,'bold'),width=22, textvariable=Date_text)
PostCode.grid(row=5, column=1)
#-----------------------------------------
def Reset():
BookID_text.set("")
BookTitle_text.set("")
Author_text.set("")
BookSTD_text.set("")
BookPrice_text.set("")
Date_text.set("")
def view_command():
Display.delete(0, END)
for row in db.books():
Display.insert(0, row)#END
def search_command():
Display.delete(0, END)
for row in db.search1(BookSTD_text.get()):
Display.insert(END,row)
def add_command():
db.insert1(BookID_text.get(), BookTitle_text.get(),Author_text.get(),BookSTD_text.get(),BookPrice_text.get(),Date_text.get())
Display.delete(0,END)
Display.insert(END, (BookID_text.get(), BookTitle_text.get(),Author_text.get(),BookSTD_text.get(),BookPrice_text.get(),Date_text.get()))
def delete_command():
db.delete1(selected_tuple[0])
def update_command():
db.update1(selected_tuple[0], BookID_text.get(), BookTitle_text.get(),Author_text.get(),BookSTD_text.get(),BookPrice_text.get(),Date_text.get())
#-------------------------------------------------------------Buttons-----------------------------------------------------------------------------------
ButSave=Button(DetailsFrame, text='SAVE',font=('Roboto',11,'bold'), bg='powder blue',width=9,bd=6, padx=12,command=add_command)
ButSave.grid(row=6, column=1)
BtnDelete=Button(ButtonFrame, text='LIST',font=('Roboto',11,'bold'), bg='powder blue',width=9,bd=6, padx=12,command=view_command)
BtnDelete.pack(side=LEFT)#grid(row=0,column=1)
BtnDelete=Button(ButtonFrame, text='UPDATE',font=('Roboto',11,'bold'), bg='powder blue',width=9,bd=6, padx=12,command=update_command)
BtnDelete.pack(side=LEFT)#grid(row=0,column=2)
BtnSearch=Button(ButtonFrame, text='SEARCH STD',font=('Roboto',11,'bold'), bg='powder blue',width=9,bd=6, padx=12,command=search_command)
BtnSearch.pack(side=LEFT)#grid(row=0,column=3)
self.btnReset=Button(ButtonFrame, text='RESET',font=('Roboto',11,'bold'), bg='powder blue',width=9,bd=6, padx=12,command=Reset)
self.btnReset.pack(side=LEFT)#grid(row=0,column=4)
BtnDelete=Button(ButtonFrame, text='DELETE',font=('Roboto',11,'bold'), bg='powder blue',width=9,bd=6, padx=12,command=delete_command)
BtnDelete.pack(side=LEFT)#grid(row=0,column=5)
self.btnExit=Button(ButtonFrame, text='EXIT',font=('Roboto',11,'bold'), bg='powder blue',width=9,bd=6, padx=12, command=self.root.destroy)
self.btnExit.pack(side=LEFT)#grid(row=0,column=6)
#===========================================page 3 frame =====================================================
#******************************************************************************************************************************************************************
nb.pack(expand=1, fill="both")
if __name__=='__main__':
root=Tk()
application = Library(root)
root.mainloop()
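# Illustrative note (assumption): the `DB` class imported from `source` at the
# top of this file is not shown here. Judging only from the calls made above,
# it is assumed to expose roughly this interface:
#   books()                          -> iterable of book rows
#   insert1(id, title, author, std, price, date)        -> add a book
#   update1(rowid, id, title, author, std, price, date) -> edit a book
#   delete1(rowid)                   -> remove a book
#   search1(std)                     -> books filtered by standard/class
#   STUDENT_RECORD()                 -> iterable of issued-book records
#   insert2(member_type, student_id, name, class_, section, issued_date,
#           return_date, book_id, title, author, std, price, returned)
#   update2(rowid, returned_flag)    -> mark an issued record as returned
#   search2(returned_flag)           -> issued records filtered by return status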
|
py | b413f2bb5e33ae3e8a9a2184da30df7c0f1ffdc9 | # -*- coding: utf-8 -*-
import os
import json
from random import sample
from itertools import chain
main_dir = os.path.dirname(os.path.abspath(__file__))
class Random(dict):
def __init__(self, file):
self.types_file = {
'nouns': self.load_nouns,
'nicknames': self.load_nicknames,
'dmails': self.load_dmails,
}
self.load_file(file)
super(Random, self).__init__()
def load_file(self, file):
"""
:param str file: filename
"""
self.types_file[file](file)
def load_nouns(self, file):
"""
Load dict from file for random words.
:param str file: filename
"""
with open(os.path.join(main_dir, file + '.dat'), 'r') as f:
self.nouns = json.load(f)
def load_dmails(self, file):
"""
Load list from file for random mails
:param str file: filename
"""
with open(os.path.join(main_dir, file + '.dat'), 'r') as f:
self.dmails = frozenset(json.load(f))
def load_nicknames(self, file):
"""
Load dict from file for random nicknames.
:param str file: filename
"""
with open(os.path.join(main_dir, file + '.dat'), 'r') as f:
self.nicknames = json.load(f)
@staticmethod
def check_count(count):
"""
Checks count
:param int count: count number ;)
:raises: ValueError
"""
if type(count) is not int:
raise ValueError('Param "count" must be int.')
if count < 1:
raise ValueError('Param "count" must be greater than 0.')
class RandomWords(Random):
def __init__(self):
self.available_letters = 'qwertyuiopasdfghjklzcvbnm'
super(RandomWords, self).__init__('nouns')
def random_word(self, letter=None):
"""
Return random word.
:param str letter: letter
:rtype: str
:returns: random word
"""
return self.random_words(letter)[0]
def random_words(self, letter=None, count=1):
"""
Returns list of random words.
:param str letter: letter
:param int count: how much words
:rtype: list
:returns: list of random words
:raises: ValueError
"""
self.check_count(count)
words = []
if letter is None:
all_words = list(
chain.from_iterable(self.nouns.values()))
try:
words = sample(all_words, count)
except ValueError:
len_sample = len(all_words)
raise ValueError('Param "count" must be less than {0}. \
(It is only {0} words)'.format(len_sample + 1, letter))
elif type(letter) is not str:
raise ValueError('Param "letter" must be string.')
elif letter not in self.available_letters:
raise ValueError(
'Param "letter" must be in {0}.'.format(
self.available_letters))
elif letter in self.available_letters:
try:
words = sample(self.nouns[letter], count)
except ValueError:
len_sample = len(self.nouns[letter])
raise ValueError('Param "count" must be less than {0}. \
(It is only {0} words for letter "{1}")'.format(len_sample + 1, letter))
return words
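    # Illustrative usage (assumes the bundled nouns.dat data file sits next to
    # this module, as load_nouns above requires):
    #   rw = RandomWords()
    #   rw.random_word('a')          # one random noun starting with 'a'
    #   rw.random_words(count=3)     # three random nouns drawn from any letter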
class RandomNicknames(Random):
def __init__(self):
self.available_letters = 'qwertyuiopasdfghjklzxcvbnm'
super(RandomNicknames, self).__init__('nicknames')
    def random_nick(self, letter=None, gender='u'):
"""
Return random nick.
:param str letter: letter
        :param str gender: ``'f'`` for female, ``'m'`` for male and ``'u'`` for\
        both
:rtype: str
:returns: random nick
"""
return self.random_nicks(letter, gender)[0]
def random_nicks(self, letter=None, gender='u', count=1):
"""
Return list of random nicks.
:param str letter: letter
        :param str gender: ``'f'`` for female, ``'m'`` for male and ``'u'`` for both
:param int count: how much nicks
:rtype: list
:returns: list of random nicks
:raises: ValueError
"""
self.check_count(count)
nicks = []
if gender not in ('f', 'm', 'u'):
raise ValueError('Param "gender" must be in (f, m, u)')
if letter is None:
all_nicks = list(
chain.from_iterable(self.nicknames[gender].values()))
try:
nicks = sample(all_nicks, count)
except ValueError:
len_sample = len(all_nicks)
raise ValueError('Param "count" must be less than {0}. \
(It is only {0} words.")'.format(len_sample + 1))
elif type(letter) is not str:
raise ValueError('Param "letter" must be string.')
elif letter not in self.available_letters:
raise ValueError(
'Param "letter" must be in "{0}".'.format(
self.available_letters))
elif letter in self.available_letters:
try:
nicks = sample(self.nicknames[gender][letter], count)
except ValueError:
len_sample = len(self.nicknames[gender][letter])
raise ValueError('Param "count" must be less than {0}. \
(It is only {0} nicks for letter "{1}")'.format(len_sample + 1, letter))
return nicks
class RandomEmails(Random):
def __init__(self):
self.rn = RandomNicknames()
super(RandomEmails, self).__init__('dmails')
def randomMail(self):
"""
Return random e-mail.
:rtype: str
:returns: random e-mail
"""
return self.randomMails()[0]
def randomMails(self, count=1):
"""
Return random e-mails.
:rtype: list
:returns: list of random e-mails
"""
self.check_count(count)
random_nicks = self.rn.random_nicks(count=count)
random_domains = sample(list(self.dmails), count)
return [
nick.lower() + "@" + domain for nick, domain in zip(random_nicks,
random_domains)
]
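if __name__ == '__main__':
    # Small illustrative demo (not part of the library): assumes the nouns.dat,
    # nicknames.dat and dmails.dat data files sit next to this module, which
    # the loaders above require in any case.
    rw = RandomWords()
    rn = RandomNicknames()
    rm = RandomEmails()
    print(rw.random_words(letter='a', count=3))
    print(rn.random_nicks(gender='f', count=2))
    print(rm.randomMails(count=2))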
|
py | b413f358a8d6de6312c7be5ed8a63baea1c0b132 | from pathlib import Path
import subprocess
import sys
_local_path = Path(__file__).parent
def test_build_git(tmp_path):
subprocess.run(
[sys.executable, str(_local_path / "pypi2pkgbuild.py"),
"-b", tmp_path, "-I", "git+file://{}".format(_local_path)],
check=True)
def test_build_wheel(tmp_path):
subprocess.run(
[sys.executable, "-mpip", "wheel", "--no-deps", "-w", tmp_path,
str(_local_path)],
check=True)
wheel_path, = tmp_path.iterdir()
subprocess.run(
[sys.executable, str(_local_path / "pypi2pkgbuild.py"),
"-b", tmp_path, "-I", "file://{}".format(wheel_path)],
check=True)
|
py | b413f4675445721c3fa364259531fb6ef14c425c | import numpy as np
import IPython.display
import librosa
import librosa.display
import matplotlib.pyplot as plt
import os
import shutil
from collections import defaultdict
from random import shuffle
import keras
from keras.datasets import mnist
from keras.models import Sequential, load_model
from keras.layers import Dense, Dropout, Flatten
from keras.layers import Conv2D, MaxPooling2D
from keras import backend as K
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.image as mpimg
from keras.utils import np_utils
import pickle
from collections import Counter
import keras
from keras.models import Sequential, Model
from keras.layers import Dense, Dropout, Flatten, Activation, BatchNormalization, regularizers
from keras.layers.noise import GaussianNoise
from keras.layers import Conv2D, MaxPooling2D
from keras import backend as K
from keras.callbacks import ModelCheckpoint, EarlyStopping
from keras.utils.np_utils import to_categorical
from sklearn import metrics
import random
import numpy as np
import matplotlib.pyplot as plt
from sklearn import svm, datasets
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix
from sklearn.utils.multiclass import unique_labels
import sklearn
from sklearn import mixture
import itertools
import wave
import contextlib
import math
import json
def _read_wav(file_path):
try:
sr, d = wavfile.read(file_path)
if not str(d.dtype).startswith('float'):
if d.dtype == 'int16':
nb_bits = 16
elif d.dtype == 'int32':
nb_bits = 32
else:
print d.dtype, d
raise Exception('???')
max_nb_bit = float(2 ** (nb_bits - 1))
d = d / (max_nb_bit + 1.0)
if len(d.shape) == 2:
d = d[:, 0]
return d, sr
except:
# print traceback.format_exc()
y, sr = librosa.load(file_path, sr=None)
return y, sr
def _get_wav_mfcc(values, sr, winlen=25, winstep=15, n_mels=128, n_mfcc=13, mfccd=False, mfccdd=False, norm_mfcc=False, fmin=0, fmax=6000, **kwargs):
winlen = int((winlen / 1000.0) * sr)
winstep = int((winstep / 1000.0) * sr)
mfcc = librosa.feature.mfcc(
y=values,
sr=sr,
n_fft=winlen,
n_mfcc=n_mfcc,
n_mels=n_mels,
hop_length=winstep,
fmin=fmin,
fmax=fmax
)
if norm_mfcc:
mfcc = (mfcc - mfcc.mean()) / mfcc.var()
blocks = [mfcc]
if mfccd:
d = librosa.feature.delta(mfcc)
blocks.append(d)
if mfccdd:
dd = librosa.feature.delta(mfcc, order=2)
blocks.append(dd)
return np.vstack(blocks)
# Computing SDC
# From src:
# http://webcache.googleusercontent.com/search?q=cache:y_zQF8CnrysJ:www-lium.univ-lemans.fr/sidekit/_modules/frontend/features.html+&cd=2&hl=ru&ct=clnk&gl=ru&lr=lang_be%7Clang_ru&client=ubuntu
def _compute_delta(features, win=3, **kwargs):
x = np.zeros((features.shape[0] + 2 * win, features.shape[1]), dtype=np.float32)
x[:win, :] = features[0, :]
x[win:-win, :] = features
x[-win:, :] = features[-1, :]
delta = np.zeros(x.shape, dtype=np.float32)
filt = np.zeros(2 * win + 1, dtype=np.float32)
filt[0] = -1
filt[-1] = 1
for i in range(features.shape[1]):
delta[:, i] = np.convolve(features[:, i], filt)
return delta[win:-win, :]
def _get_sdc_features(cep, d=1, p=3, k=7, **kwargs):
"""
Compute the Shifted-Delta-Cepstral features for language identification
:param cep: matrix of feature, 1 vector per line
:param d: represents the time advance and delay for the delta computation
:param k: number of delta-cepstral blocks whose delta-cepstral
coefficients are stacked to form the final feature vector
:param p: time shift between consecutive blocks.
    :return: cepstral coefficient concatenated with shifted deltas
"""
y = np.r_[
np.resize(cep[0, :], (d, cep.shape[1])),
cep,
np.resize(cep[-1, :], (k * 3 + d, cep.shape[1]))
]
delta = _compute_delta(y, win=d)
sdc = np.empty((cep.shape[0], cep.shape[1] * k))
idx = np.zeros(delta.shape[0], dtype='bool')
for ii in range(k):
idx[d + ii * p] = True
for ff in range(len(cep)):
sdc[ff, :] = delta[idx, :].reshape(1, -1)
idx = np.roll(idx, 1)
return sdc
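# Illustrative shape check for _get_sdc_features (assumed input): for a cepstral
# matrix cep of shape (T, 13) and k=7, the output has shape (T, 13 * 7) = (T, 91):
# each frame gets k stacked delta blocks built from the original 13 coefficients.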
def _get_wav_spectrogram(values, sr, winlen=25, winstep=15, n_mels=128, **kwargs):
# ms to number of samples
winlen = int((winlen / 1000.0) * sr)
winstep = int((winstep / 1000.0) * sr)
S = librosa.feature.melspectrogram(
y=values,
sr=sr,
hop_length=winstep,
n_fft=winlen,
n_mels=n_mels,
)
log_S = librosa.power_to_db(S, ref=np.max)
return log_S
def _get_columns_silence_flags(cur_mfccs, silence_percentile=25, **kwargs):
percentile = np.percentile(cur_mfccs[0], silence_percentile)
left_cols = []
for i, v in enumerate(cur_mfccs[0]):
if v >= percentile:
left_cols.append(True)
else:
left_cols.append(False)
return left_cols
def _generator_features_extractor(wav_path, remove_silence=False, **kwargs):
values, sr = _read_wav(wav_path)
# # N1
# mfccs = _get_wav_mfcc(values, sr, **kwargs)
# spectrogram = _get_wav_spectrogram(values, sr, **kwargs)
# return np.vstack([mfccs, spectrogram])
# N2
mfccs_d_dd = _get_wav_mfcc(values, sr, mfccd=True, mfccdd=True)
sdcs_mfcc = _get_sdc_features(mfccs_d_dd[:13, :].T, d=1, p=3, k=7).T
sdcs_d = _get_sdc_features(mfccs_d_dd[13:26, :].T, d=1, p=3, k=7).T
sdcs_dd = _get_sdc_features(mfccs_d_dd[26:39, :].T, d=1, p=3, k=7).T
sdcs_mfcc_2 = _get_sdc_features(mfccs_d_dd[:13, :].T, d=5, p=2, k=5).T
sdcs_mfcc_3 = _get_sdc_features(mfccs_d_dd[:13, :].T, d=3, p=1, k=5).T
sdcs_mfcc_4 = _get_sdc_features(mfccs_d_dd[:13, :].T, d=1, p=2, k=6).T
res = np.vstack([mfccs_d_dd, sdcs_mfcc, sdcs_d, sdcs_dd, sdcs_mfcc_2, sdcs_mfcc_3, sdcs_mfcc_4])
if remove_silence:
flags = _get_columns_silence_flags(mfccs_d_dd, **kwargs)
res = res[:, flags]
return res
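# Note on dimensionality: with the parameters above the stacked feature matrix
# has 39 (MFCC + delta + delta-delta) + 3 * 91 (the k=7 SDC blocks) + 65 + 65
# (the k=5 blocks) + 78 (the k=6 block) = 520 rows per frame, which matches the
# input_dim=520 of the commented-out Dense network definition further down.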
# def _generator_features_extractor(file_path):
# b1 = get_wav_feature(file_path, d=3, p=1, k=5, mfccd=True, mfccdd=True, remove_silence=True)
# b2 = get_wav_feature(file_path, d=1, p=1, k=7, mfccd=True, mfccdd=True, remove_silence=True)
# return np.vstack([b1, b2])
class DatasetGenerator(object):
def __init__(self, class_to_files, batch_size):
self.batch_size = batch_size
self.class_to_files = class_to_files
self.classes = list(set(self.class_to_files))
if batch_size % len(self.class_to_files) != 0:
            raise ValueError(u'batch_size should be divisible by the number of classes ({})'.format(len(class_to_files)))
self.language_vectors_in_batch = batch_size / len(self.class_to_files)
self.class_to_vector = dict()
for c, _ in self.class_to_files.iteritems():
class_vector = [0.0 for _ in xrange(len(self.class_to_files))]
class_vector[c] = 1.0
self.class_to_vector[c] = class_vector
self._files_generators_by_class = dict()
for c, _ in self.class_to_files.iteritems():
self._files_generators_by_class[c] = self._get_class_generator(c)
self._queue_by_class = defaultdict(list)
def _get_class_generator(self, c):
random.shuffle(self.class_to_files[c])
return iter(self.class_to_files[c])
def _get_next_class_file(self, c):
gen = self._files_generators_by_class[c]
try:
file_path = next(gen)
except StopIteration as e:
gen = self._get_class_generator(c)
file_path = next(gen)
self._files_generators_by_class[c] = gen
return file_path
def _get_next_class_vector(self, c):
if self._queue_by_class[c]:
return self._queue_by_class[c].pop()
file_path = self._get_next_class_file(c)
features = _generator_features_extractor(file_path)
self._queue_by_class[c].extend(features.T)
return self._queue_by_class[c].pop()
def next(self):
cur_batch_X = []
cur_batch_Y = []
for _ in xrange(self.language_vectors_in_batch):
for c in self.classes:
vector = None
while vector is None:
try:
vector = self._get_next_class_vector(c)
except:
continue
cur_batch_X.append(vector)
cur_batch_Y.append(self.class_to_vector[c])
return np.array(cur_batch_X), np.array(cur_batch_Y)
def __iter__(self):
return self
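# Illustrative usage of DatasetGenerator (assumed paths): batch_size must be a
# multiple of the number of classes, otherwise __init__ raises ValueError.
#   class_to_files = {0: ['/data/en/a.wav', ...], 1: ['/data/de/b.wav', ...]}
#   gen = DatasetGenerator(class_to_files, batch_size=128)  # 64 vectors per class
#   X, Y = next(gen)   # X: (128, n_features), Y: (128, 2) one-hot class rows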
def init_class_to_files_list(folder):
MAP_LANGUAGE_TO_FILES = defaultdict(list)
for dirname in os.listdir(folder):
all_language_files = os.path.join(folder, dirname)
for filename in os.listdir(all_language_files):
full_file_path = os.path.join(all_language_files, filename)
MAP_LANGUAGE_TO_FILES[dirname].append(full_file_path)
LANGUAGE_TO_CLASS = dict(zip(sorted(MAP_LANGUAGE_TO_FILES.keys()), range(len(MAP_LANGUAGE_TO_FILES))))
MAP_CLASS_TO_FILES_LIST = dict()
for l, f in MAP_LANGUAGE_TO_FILES.iteritems():
MAP_CLASS_TO_FILES_LIST[LANGUAGE_TO_CLASS[l]] = f
return LANGUAGE_TO_CLASS, MAP_LANGUAGE_TO_FILES, MAP_CLASS_TO_FILES_LIST
print 'Initializing class to files'
LANGUAGE_TO_CLASS, MAP_LANGUAGE_TO_FILES, MAP_CLASS_TO_FILES_LIST = init_class_to_files_list('/home/kolegor/Study/Master/data/use2big_wav_big_splitted_5')
print len(MAP_CLASS_TO_FILES_LIST)
def _get_vectors_in_second():
# 171 in 3 seconds -> 57 in one second
return 57
def _get_file_duration(file_path):
with contextlib.closing(wave.open(file_path,'r')) as f:
frames = f.getnframes()
rate = f.getframerate()
duration = frames / float(rate)
return duration
def _get_files_list_total_duration(files_list, known_one_file_duration=None):
if known_one_file_duration is not None:
return known_one_file_duration * len(files_list)
return sum(map(_get_file_duration, files_list))
def get_total_dataset_duration(class_to_files, known_one_file_duration=None):
result = 0.0
for c, files_list in class_to_files.iteritems():
print c, len(class_to_files)
result += _get_files_list_total_duration(files_list, known_one_file_duration=known_one_file_duration)
return result
print 'Initializing models'
## instantiate model
# model = Sequential()
# # we can think of this chunk as the input layer
# model.add(Dense(520, input_dim=520))
# model.add(BatchNormalization())
# model.add(Activation('relu'))
# model.add(Dropout(0.3))
# # we can think of this chunk as the hidden layer
# model.add(Dense(256))
# model.add(Activation('relu'))
# # we can think of this chunk as the hidden layer
# model.add(Dense(128))
# model.add(BatchNormalization())
# model.add(Activation('relu'))
# model.add(Dropout(0.25))
# model.add(Dense(64))
# model.add(BatchNormalization())
# model.add(Activation('relu'))
# model.add(Dense(128))
# model.add(Activation('relu'))
# model.add(Dropout(0.2))
# model.add(Dense(50))
# model.add(BatchNormalization())
# model.add(Activation('relu'))
# model.add(Dense(len(MAP_CLASS_TO_FILES_LIST)))
# model.add(Activation('softmax'))
# model.compile(loss=keras.losses.categorical_crossentropy,
# optimizer=keras.optimizers.RMSprop(lr=0.0009),
# metrics=['accuracy'])
model = load_model('/home/kolegor/Study/Master/experiments/last_3_different_sdc_new_nn_layout/model.last_bottleneck_11_langs.keras')
def learning_rate_update_callback(epoch):
initial_lrate = 0.008
drop = 0.98
epochs_drop = 5000.0
lrate = initial_lrate * math.pow(drop, math.floor((1+epoch)/epochs_drop))
return lrate
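# Worked example of the step-decay schedule above: floor((1 + epoch) / 5000) is 0
# for epochs 0..4998, so the rate stays at 0.008; at epoch 4999 it drops to
# 0.008 * 0.98 = 0.00784, and so on every 5000 epochs. With the short runs used
# here (and the scheduler callback commented out below) the rate is effectively
# constant.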
# model.summary()
def _split_language_files(files_list, train_size, test_size):
count_by_file_id = defaultdict(int)
random.seed(42)
random.shuffle(files_list)
for filename in files_list:
uid = filename.split('.mp3')[0].split('/')[-1]
count_by_file_id[uid] += 1
if len(count_by_file_id) == 3:
ctrain, ctest, cval = 1, 1, 1
else:
count = len(count_by_file_id)
ctrain = int(count * 0.7)
ctestval = count - ctrain
ctest = (ctestval + 1) / 2
cval = ctestval - ctest
train_files_uids = [uid for uid, count in sorted(count_by_file_id.iteritems(), key=lambda x: x[1], reverse=True)[:ctrain]]
test_files_uids = [uid for uid, count in sorted(count_by_file_id.iteritems(), key=lambda x: x[1], reverse=True)[ctrain:ctrain+ctest]]
val_files_uids = [uid for uid, count in sorted(count_by_file_id.iteritems(), key=lambda x: x[1], reverse=True)[ctrain + ctest:]]
train_files = []
test_files = []
val_files = []
for filename in files_list:
uid = filename.split('.mp3')[0].split('/')[-1]
if uid in train_files_uids:
train_files.append(filename)
elif uid in test_files_uids:
test_files.append(filename)
elif uid in val_files_uids:
val_files.append(filename)
return train_files, test_files, val_files
def init_train_val_test_folders_iter(language_to_files_list):
_TRAIN_CLASS_TO_FILES = defaultdict(list)
_TEST_CLASS_TO_FILES = defaultdict(list)
_VAL_CLASS_TO_FILES = defaultdict(list)
for c, files_paths in language_to_files_list.iteritems():
train_files, test_files, val_files = _split_language_files(files_paths, 0.6, 0.2)
_TRAIN_CLASS_TO_FILES[c].extend(train_files)
_TEST_CLASS_TO_FILES[c].extend(test_files)
_VAL_CLASS_TO_FILES[c].extend(val_files)
return _TRAIN_CLASS_TO_FILES, _TEST_CLASS_TO_FILES, _VAL_CLASS_TO_FILES
print 'Splitting data'
train_gen_data, test_gen_data, val_gen_data = init_train_val_test_folders_iter(MAP_CLASS_TO_FILES_LIST)
def estimate_total_dataset_duration(class_to_files_list, known_one_file_duration=None):
total_duration = get_total_dataset_duration(class_to_files_list, known_one_file_duration=known_one_file_duration)
vectors_in_second = _get_vectors_in_second()
return total_duration * vectors_in_second
if False:
epochs = 3
batch_size = len(train_gen_data) * 3
train_data_vectors = estimate_total_dataset_duration(train_gen_data, known_one_file_duration=5)
val_data_vectors = estimate_total_dataset_duration(val_gen_data, known_one_file_duration=5)
train_data_steps = train_data_vectors / batch_size
val_data_steps = val_data_vectors / batch_size
print train_data_vectors, train_data_steps
print val_data_vectors, val_data_steps
history = model.fit_generator(
DatasetGenerator(train_gen_data, batch_size),
validation_data=DatasetGenerator(val_gen_data, batch_size),
validation_steps=val_data_steps,
steps_per_epoch=train_data_steps,
# callbacks=[keras.callbacks.LearningRateScheduler(learning_rate_update_callback)],
epochs=epochs
)
# 1641169/1641168 [==============================] - 27144s 17ms/step - loss: 0.8971 - acc: 0.7219 - val_loss: 2.7384 - val_acc: 0.2895
# Epoch 2/3
# 1641169/1641168 [==============================] - 26922s 16ms/step - loss: 0.8968 - acc: 0.7271 - val_loss: 2.4356 - val_acc: 0.3395
# Epoch 3/3
# 1641169/1641168 [==============================] - 26972s 16ms/step - loss: 0.8689 - acc: 0.7322 - val_loss: 2.7592 - val_acc: 0.3773
model.save('/home/kolegor/Study/Master/experiments/last_3_different_sdc_new_nn_layout/model.last_bottleneck_11_langs.keras')
else:
def _get_file_uid(file_path):
return file_path.split('.mp3')[0].split('/')[-1]
# tuples (file_path, real_class, predicted_class)
all_predictions = []
all_predictions_by_file = defaultdict(list)
all_files_count = sum(len(b) for a, b in test_gen_data.iteritems())
i = 0
count_correct = 0
count_all = 0
for cl, files_paths in test_gen_data.iteritems():
for file_path in files_paths:
if i % 100 == 0:
print i, all_files_count
i += 1
try:
features = _generator_features_extractor(file_path)
cur_predictions = model.predict(features.T)
for column, p in zip(features.T, cur_predictions):
pcl = np.argmax(p)
count_all += 1
if pcl == cl:
count_correct += 1
all_predictions.append((file_path, cl, tuple(list(p)), pcl))
all_predictions_by_file[file_path].append((cl, pcl))
except KeyboardInterrupt:
break
except:
continue
# 1386030 2936299 0.472032991191
print count_correct, count_all, float(count_correct) / count_all
count_correct_files = 0
for file_path, file_predictions in all_predictions_by_file.iteritems():
real_class = file_predictions[0][0]
most_common = Counter([pcl for _, pcl in file_predictions]).most_common(1)[0][0]
if most_common == real_class:
count_correct_files += 1
print count_correct_files, len(all_predictions_by_file), 100.0 * count_correct_files / len(all_predictions_by_file)
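    # --- Hedged follow-up sketch (not part of the original script) ---
    # Using the structures built above, per-class file-level accuracy could be
    # derived from the same majority-vote predictions, e.g.:
    #
    #   per_class_total = Counter()
    #   per_class_correct = Counter()
    #   for file_path, file_predictions in all_predictions_by_file.iteritems():
    #       real_class = file_predictions[0][0]
    #       voted = Counter([pcl for _, pcl in file_predictions]).most_common(1)[0][0]
    #       per_class_total[real_class] += 1
    #       if voted == real_class:
    #           per_class_correct[real_class] += 1
    #   for cl in sorted(per_class_total):
    #       print cl, per_class_correct[cl], per_class_total[cl]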
|
py | b413f4de7f8248311428c5b3480fc1b9c038644e | """ Main class of SOM visualization
This library tries to duplicate as much as possible of the SOM Toolbox
application from TU Vienna.
Args:
m (int): vertical number of neurons on the mxn grid.
n (int): horizontal number of neurons on the mxn grid.
dimension (numpy): vector size of the SOM's weights.
input_data (numpy): input data for projection on SOM grid.
Returns:
SOMVisualization
"""
import panel as pn
import numpy as np
import holoviews as hv
from holoviews import opts
from holoviews.streams import Pipe, Buffer
from controls.controllers import MainController
hv.extension('bokeh')
from visualizations.complane import ComponentPlane
from visualizations.dmatrix import DMatrix
from visualizations.hithistogram import HitHist
from visualizations.sdh import SDH
from visualizations.qerror import QError
from visualizations.umatrix import UMatrix
from visualizations.upmatrix import UStar_PMatrix
from visualizations.neighbourhood_graph import NeighbourhoodGraph
from visualizations.clustering import Clustering
from visualizations.metromap import MetroMap
from visualizations.piechart import PieChart
from visualizations.chessboard import Chessboard
from visualizations.time_series import TimeSeries
from visualizations.sky_metaphor import SkyMetaphor
from skimage.transform import resize
OBJECTS_CLASSES = [SkyMetaphor, ComponentPlane, HitHist, UMatrix, DMatrix, UStar_PMatrix,
SDH, PieChart, NeighbourhoodGraph, Chessboard, Clustering,
MetroMap, QError, TimeSeries]
_COLOURS_93 = ['#FF5555', '#5555FF', '#55FF55', '#FFFF55', '#FF55FF', '#55FFFF', '#FFAFAF', '#808080',
'#C00000', '#0000C0', '#00C000', '#C0C000', '#C000C0', '#00C0C0', '#404040', '#FF4040',
'#4040FF', '#40FF40', '#FFFF40', '#FF40FF', '#40FFFF', '#C0C0C0', '#800000', '#000080',
'#008000', '#808000', '#800080', '#008080', '#FF8080', '#8080FF', '#80FF80', '#FFFF80',
'#FF80FF', '#80FFFF', '#FF5555', '#5555FF', '#55FF55', '#FFFF55', '#FF55FF', '#55FFFF',
'#FFAFAF', '#808080', '#C00000', '#0000C0', '#00C000', '#C0C000', '#C000C0', '#00C0C0',
'#404040', '#FF4040', '#4040FF', '#40FF40', '#FFFF40', '#FF40FF', '#40FFFF', '#C0C0C0',
'#800000', '#000080', '#008000', '#808000', '#800080', '#008080', '#FF8080', '#8080FF',
'#80FF80', '#FFFF80', '#FF80FF', '#80FFFF', '#FF5555', '#5555FF', '#55FF55', '#FFFF55',
'#FF55FF', '#55FFFF', '#FFAFAF', '#808080', '#C00000', '#0000C0', '#00C000', '#C0C000',
'#C000C0', '#00C0C0', '#404040', '#FF4040', '#4040FF', '#40FF40', '#FFFF40', '#FF40FF',
'#40FFFF', '#C0C0C0', '#800000', '#000080', '#000000', '#FFFFFF']
class SOMToolbox():
def __init__(self, m, n, dimension, weights, input_data=None, classes=None, component_names=None):
self._height = self._width = 500
self._pipe = Pipe(data=[])
self._pipe_paths = Pipe(data=[])
self._visualizations = []
self._m = m
self._n = n
self._weights = weights
self._dim = dimension
self._idata = input_data
if input_data is not None:
self._distance = np.linalg.norm(self._idata[:, None, :] - self._idata[None, :, :], axis=-1)
if classes is not None:
self._classes = classes.astype(int)
else:
self._classes = classes
self._component_names = component_names
self._plot = None
self._maincontrol = MainController(self._interpolation, self._rotate, self._visualizations, OBJECTS_CLASSES,
name='', colormap='Greys')
self._mainp = pn.Column(pn.panel(self._maincontrol, default_layout=pn.Row, width=700))
self._xlim = (-.5 * self._m / self._n, .5 * self._m / self._n) if self._m > self._n else (-.5, .5)
self._ylim = (-.5 * self._n / self._m, .5 * self._n / self._m) if self._n > self._m else (-.5, .5)
# _COLOURS_93
self._Image = hv.DynamicMap(hv.Image, streams=[self._pipe]).apply.opts(cmap=self._maincontrol.param.colormap,
width=self._width, height=self._height,
xlim=self._xlim, ylim=self._ylim)
self._Paths = hv.DynamicMap(hv.Segments, streams=[self._pipe_paths]).apply.opts(line_width=1, color='red')
self._pdmap = pn.Column(self._Image * self._Paths)
self._controls = pn.Row()
self._timeseries = pn.Row()
self._mainview = pn.Column(pn.Column(self._mainp, pn.Row(self._pdmap, self._controls)),
pn.Column(self._timeseries))
self._visualizations.append(SkyMetaphor(self))
self._visualizations.append(ComponentPlane(self))
if input_data is not None: self._visualizations.append(HitHist(self))
self._visualizations.append(UMatrix(self))
self._visualizations.append(DMatrix(self))
if input_data is not None:
self._visualizations.append(UStar_PMatrix(self))
self._visualizations.append(SDH(self))
self._visualizations.append(PieChart(self))
self._visualizations.append(NeighbourhoodGraph(self))
self._visualizations.append(Chessboard(self))
self._visualizations.append(Clustering(self))
self._visualizations.append(MetroMap(self))
self._visualizations.append(QError(self))
self._visualizations.append(TimeSeries(self))
self._visualizations[0]._activate_controllers()
def _rotate(self, k):
if self._m != self._n: # in case SOM's sides are not equal
self._xlim, self._ylim = self._ylim, self._xlim
self._pdmap[0] = pn.Column(self._Image.opts(xlim=self._xlim, ylim=self._ylim) * self._Paths)
self._plot = np.rot90(self._plot, k=k, axes=(1, 0))
paths_rotated = []
paths_old = self._pipe_paths.data if type(self._pipe_paths.data) == list else [
self._pipe_paths.data] # check if only 1 path
for p in paths_old:
if k > 0: paths_rotated.append((p[1], -1 * p[0], p[3], -1 * p[2])) # clockwise
if k < 0: paths_rotated.append((-1 * p[1], p[0], -1 * p[3], p[2])) # counter clockwise
self._pipe.send(np.rot90(self._pipe.data, k=k, axes=(1, 0)))
self._pipe_paths.send(paths_rotated)
def _interpolation(self, ):
if self._maincontrol.interpolation:
self._pipe.send(resize(self._plot, (1000, 1000)))
else:
self._pipe.send(self._plot)
def _get_neuron_xy(self, neuron):
rotation = self._maincontrol._orientation % 4
if (rotation == 2 or rotation == 0):
m, n = self._m, self._n
else:
m, n = self._n, self._m
i, j = neuron // n, neuron % n
ir, jr = i, j # if clockwise rotated 0
if rotation == 1: ir, jr = j, self._m - i # if clockwise rotated 90
if rotation == 2: ir, jr = self._m - i, self._n - j # if clockwise rotated 180
if rotation == 3: ir, jr = self._n - j, i # if clockwise rotated 270
diffx = 1 / n if (rotation == 3 or rotation == 0) else -1 / n
diffy = 1 / m if (rotation == 3 or rotation == 2) else -1 / m
x, y = -0.5 + diffx / 2 + jr * (1 / n), 0.5 + diffy / 2 - ir * (1 / m)
return x, y
def _from_xy_to_neuron(self, pos_xy):
return int(pos_xy[0] * self._m + pos_xy[1])
def _get_xy(self, p):
rotation = self._maincontrol._orientation % 4
if (rotation == 2 or rotation == 0):
m, n = self._m, self._n
else:
m, n = self._n, self._m
# if we want to scale into [b-a]: (b-a)*((x-min)/(max-min))+a
scale = lambda a, b, x, minx, maxx: (b - a) * ((x - minx) / (maxx - minx)) + a
x = scale(-0.5, 0.5, p[0], -0.5, n - 0.5) # -.45 + (.95/(n-.5))*(p[0]-0)
y = scale(-0.5, 0.5, p[1], -0.5, m - 0.5) # -0.45 + ((0.5+0.45)/(m-0))*(p[1]-0)
return [x, y]
def _display(self, plot=None, paths=None):
if plot is not None:
self._plot = np.rot90(plot, k=self._maincontrol._orientation, axes=(1, 0))
if self._maincontrol.interpolation:
self._pipe.send(resize(self._plot, (1000, 1000)))
else:
self._pipe.send(self._plot)
if paths is not None:
self._pipe_paths.send(paths)
def _onbigscreen(self, ):
pn.serve(self._mainview) # , start=False, show=False
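# --- Hedged usage sketch (not part of the original module) ---
# Assuming a trained SOM with an m x n grid of weight vectors and matching input
# data, the toolbox is typically instantiated and served like this (the arrays
# below are placeholders, not real SOM data):
#
#   import numpy as np
#   m, n, dim = 10, 10, 4
#   weights = np.random.rand(m * n, dim)
#   idata = np.random.rand(200, dim)
#   smap = SOMToolbox(m, n, dim, weights, input_data=idata)
#   smap._onbigscreen()   # serves the Panel dashboard defined above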
|
py | b413f546c82e857d11328d083a3b2bce9f190ba0 | from rest_framework.decorators import api_view
from rest_framework.response import Response
from django.core.paginator import Paginator, EmptyPage, PageNotAnInteger
from homework.models import Student
from homework.serializers import StudentSerializer
from homework.services.student_services import (
add_student_,
update_student_,
upload_image_,
)
@api_view(["GET"])
def get_students(request):
students = Student.objects.all()
page = request.query_params.get("page")
paginator = Paginator(students, 4)
try:
students = paginator.page(page)
except PageNotAnInteger:
students = paginator.page(1)
except EmptyPage:
students = paginator.page(paginator.num_pages)
    # guard against a missing or non-numeric page query parameter
    try:
        page = int(page)
    except (TypeError, ValueError):
        page = 1
serializer = StudentSerializer(students, many=True)
return Response(
{"students": serializer.data, "page": page, "pages": paginator.num_pages}
)
@api_view(["POST"])
def add_student(request):
student = add_student_(request.data)
serializer = StudentSerializer(student, many=False)
return Response(serializer.data)
@api_view(["PUT"])
def update_student(request, pk):
student = update_student_(request.data, pk)
serializer = StudentSerializer(student, many=False)
return Response(serializer.data)
@api_view(["POST"])
def upload_image(request):
result = upload_image_(request.data)
return Response(result)
@api_view(["DELETE"])
def delete_student(request, pk):
student = Student.objects.get(id=pk)
student.delete()
return Response("Student Deleted")
|
py | b413f56d89d62598137b3c54914e032e2ddbae7e | #-*- coding:utf-8 -*-
from urllib.parse import quote
from urllib.request import urlopen
from bs4 import BeautifulSoup
from threading import Timer
import re
import requests
from pygame import mixer
def VOA_search(standard=True):
try:
        if standard:
url_str = 'VOA_Standard_English/'
idx_num = 0
response_str = ' VOA standard '
else:
url_str = 'VOA_Special_English/'
idx_num = 1
response_str = ' VOA special '
resp1 = urlopen('http://www.51voa.com/' + url_str)
soup1 = BeautifulSoup(resp1, 'html.parser')
        scope1 = soup1.select('a[target="_blank"]')  # find <a> tags whose target is "_blank"
latest_url = scope1[idx_num].attrs['href']
resp2 = urlopen('http://www.51voa.com' + latest_url)
soup2 = BeautifulSoup(resp2, 'html.parser')
        scope2 = soup2.select('a[id="mp3"]')  # find the tag whose id is "mp3"
mp3_url = scope2[0].attrs['href']
        Timer(3.5, VOA_download, (mp3_url,)).start()  # start a Timer thread to download the file
return "马上为你播放最新" + response_str + ",正在下载。"
except:
return "对不起,找不到" + response_str + ",请检查连接。"
def VOA_download(mp3_url):
print("Downloading VOA MP3.")
r = requests.get(mp3_url)
with open("sound/VOA.mp3", "wb") as code:
code.write(r.content)
print("VOA MP3 downloaded, ready to play.")
#mixer.init()
mixer.music.load('sound/VOA.mp3')
mixer.music.play()
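# --- Hedged usage sketch (not part of the original module) ---
# The pygame mixer is expected to be initialised by the caller before playback
# (note the commented-out mixer.init() above), e.g.:
#
#   mixer.init()
#   print(VOA_search(standard=True))   # schedules the download and playback via Timer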
|
py | b413f7f3b7969822e541d9e1b054219ae142e58f | from flask import Flask, jsonify, render_template
import random
app = Flask(__name__)
@app.route('/')
def hello():
return 'Home page'
@app.route('/rmovie')
def random_movie():
movies = {1: 'Toy story', 2: 'The Raid', 3: 'Hero',
4: 'Ip Man', 5: 'Kung Fu Panda'}
movie = random.choice(list(movies.items()))
print(movie)
return jsonify(movie)
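# Hedged addition: the original module has no entry point and is presumably run
# with `flask run`; a direct entry point would look like this.
if __name__ == '__main__':
    app.run(debug=True)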
|
py | b413f85175bec63942864d53890239c99ec9ab6f | import numpy as np
from taichi.core.util import ti_core as _ti_core
from taichi.lang import impl
from taichi.lang.enums import Layout
from taichi.lang.util import (cook_dtype, has_pytorch, python_scope,
to_numpy_type, to_pytorch_type, to_taichi_type)
if has_pytorch():
import torch
class Ndarray:
"""Taichi ndarray class implemented with a torch tensor.
Args:
dtype (DataType): Data type of each value.
shape (Tuple[int]): Shape of the torch tensor.
"""
def __init__(self, dtype, shape):
self.host_accessor = None
if impl.current_cfg().ndarray_use_torch:
assert has_pytorch(
), "PyTorch must be available if you want to create a Taichi ndarray with PyTorch as its underlying storage."
self.arr = torch.zeros(shape,
dtype=to_pytorch_type(cook_dtype(dtype)))
if impl.current_cfg().arch == _ti_core.Arch.cuda:
self.arr = self.arr.cuda()
else:
self.arr = _ti_core.Ndarray(impl.get_runtime().prog,
cook_dtype(dtype), shape)
@property
def shape(self):
"""Gets ndarray shape.
Returns:
Tuple[Int]: Ndarray shape.
"""
raise NotImplementedError()
@property
def element_shape(self):
"""Gets ndarray element shape.
Returns:
Tuple[Int]: Ndarray element shape.
"""
raise NotImplementedError()
@property
def dtype(self):
"""Gets data type of each individual value.
Returns:
DataType: Data type of each individual value.
"""
return to_taichi_type(self.arr.dtype)
@property
def data_handle(self):
"""Gets the pointer to underlying data.
Returns:
int: The pointer to underlying data.
"""
return self.arr.data_ptr()
@python_scope
def __setitem__(self, key, value):
"""Sets ndarray element in Python scope.
Args:
key (Union[List[int], int, None]): Coordinates of the ndarray element.
value (element type): Value to set.
"""
raise NotImplementedError()
@python_scope
def __getitem__(self, key):
"""Gets ndarray element in Python scope.
Args:
key (Union[List[int], int, None]): Coordinates of the ndarray element.
Returns:
element type: Value retrieved.
"""
raise NotImplementedError()
@python_scope
def fill(self, val):
"""Fills ndarray with a specific scalar value.
Args:
val (Union[int, float]): Value to fill.
"""
if impl.current_cfg().ndarray_use_torch:
self.arr.fill_(val)
else:
from taichi.lang.meta import fill_ndarray # pylint: disable=C0415
fill_ndarray(self, val)
@python_scope
def to_numpy(self):
"""Converts ndarray to a numpy array.
Returns:
numpy.ndarray: The result numpy array.
"""
if impl.current_cfg().ndarray_use_torch:
return self.arr.cpu().numpy()
arr = np.zeros(shape=self.arr.shape, dtype=to_numpy_type(self.dtype))
from taichi.lang.meta import \
ndarray_to_ext_arr # pylint: disable=C0415
ndarray_to_ext_arr(self, arr)
impl.get_runtime().sync()
return arr
@python_scope
def from_numpy(self, arr):
"""Loads all values from a numpy array.
Args:
arr (numpy.ndarray): The source numpy array.
"""
if not isinstance(arr, np.ndarray):
raise TypeError(f"{np.ndarray} expected, but {type(arr)} provided")
if tuple(self.arr.shape) != tuple(arr.shape):
raise ValueError(
f"Mismatch shape: {tuple(self.arr.shape)} expected, but {tuple(arr.shape)} provided"
)
if impl.current_cfg().ndarray_use_torch:
self.arr = torch.from_numpy(arr).to(self.arr.dtype)
if impl.current_cfg().arch == _ti_core.Arch.cuda:
self.arr = self.arr.cuda()
else:
if hasattr(arr, 'contiguous'):
arr = arr.contiguous()
from taichi.lang.meta import \
ext_arr_to_ndarray # pylint: disable=C0415
ext_arr_to_ndarray(arr, self)
impl.get_runtime().sync()
def pad_key(self, key):
if key is None:
key = ()
if not isinstance(key, (tuple, list)):
key = (key, )
assert len(key) == len(self.arr.shape)
return key
def initialize_host_accessor(self):
if self.host_accessor:
return
impl.get_runtime().materialize()
self.host_accessor = NdarrayHostAccessor(self.arr)
class ScalarNdarray(Ndarray):
"""Taichi ndarray with scalar elements implemented with a torch tensor.
Args:
dtype (DataType): Data type of each value.
shape (Tuple[int]): Shape of the ndarray.
"""
@property
def shape(self):
return tuple(self.arr.shape)
@property
def element_shape(self):
return ()
@python_scope
def __setitem__(self, key, value):
if impl.current_cfg().ndarray_use_torch:
self.arr.__setitem__(key, value)
else:
self.initialize_host_accessor()
self.host_accessor.setter(value, *self.pad_key(key))
@python_scope
def __getitem__(self, key):
if impl.current_cfg().ndarray_use_torch:
return self.arr.__getitem__(key)
self.initialize_host_accessor()
return self.host_accessor.getter(*self.pad_key(key))
def __repr__(self):
return '<ti.ndarray>'
class NdarrayHostAccessor:
def __init__(self, ndarray):
if _ti_core.is_real(ndarray.dtype):
def getter(*key):
return ndarray.read_float(key)
def setter(value, *key):
ndarray.write_float(key, value)
else:
if _ti_core.is_signed(ndarray.dtype):
def getter(*key):
return ndarray.read_int(key)
else:
def getter(*key):
return ndarray.read_uint(key)
def setter(value, *key):
ndarray.write_int(key, value)
self.getter = getter
self.setter = setter
class NdarrayHostAccess:
"""Class for accessing VectorNdarray/MatrixNdarray in Python scope.
Args:
arr (Union[VectorNdarray, MatrixNdarray]): See above.
indices_first (Tuple[Int]): Indices of first-level access (coordinates in the field).
indices_second (Tuple[Int]): Indices of second-level access (indices in the vector/matrix).
"""
def __init__(self, arr, indices_first, indices_second):
self.ndarr = arr
self.arr = arr.arr
if arr.layout == Layout.SOA:
self.indices = indices_second + indices_first
else:
self.indices = indices_first + indices_second
if impl.current_cfg().ndarray_use_torch:
def getter():
return self.arr[self.indices]
def setter(value):
self.arr[self.indices] = value
else:
def getter():
self.ndarr.initialize_host_accessor()
return self.ndarr.host_accessor.getter(
*self.ndarr.pad_key(self.indices))
def setter(value):
self.ndarr.initialize_host_accessor()
self.ndarr.host_accessor.setter(
value, *self.ndarr.pad_key(self.indices))
self.getter = getter
self.setter = setter
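# --- Hedged usage sketch (not part of the original module) ---
# Assuming the usual Taichi entry points (`ti.init`, `ti.ndarray`) are available,
# a scalar ndarray is typically created via the `ti.ndarray` factory and then
# manipulated with the methods defined above:
#
#   import taichi as ti
#   import numpy as np
#
#   ti.init(arch=ti.cpu)
#   arr = ti.ndarray(dtype=ti.f32, shape=(4, 4))   # a ScalarNdarray under the hood
#   arr.fill(1.5)                                  # Ndarray.fill
#   host_copy = arr.to_numpy()                     # Ndarray.to_numpy
#   arr.from_numpy(np.ones((4, 4), dtype=np.float32))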
|
py | b413f9176ebea210152971c8decb701e877de936 | from django import forms
from .models import NewCarModel
# linked to model
class NewCarForm(forms.ModelForm):
class Meta:
model = NewCarModel
fields = '__all__'
def clean_mpg(self):
mpg = self.cleaned_data['mpg']
if mpg < 20:
raise forms.ValidationError('That is less than a truck!')
if mpg > 500:
raise forms.ValidationError('That is impossible in 2019!')
return mpg
def clean_modelYear(self):
year = self.cleaned_data['modelYear']
if year < 2019:
raise forms.ValidationError('That is not new!!!')
return year |
py | b413f9370a4944ef59a18713cd6d607e453f8423 | """
I needed a simple gauge, so I've made one with Pmw.
It might be useful for others to use as a base to develop more complex
gauges with.
Is it worth cleaning up and submitting?
cheers and thanks
chris
Dr. Chris Wright
Intensive Care Unit
Monash Medical Centre
Clayton. VIC Australia
"""
import sys
import Tkinter
import Pmw
import time
if sys.platform == 'win32':
# MS-Windows specific fonts
label_font = "-family Ariel -size 12"
value_font = "-family Ariel -size 12"
small_font = "-family {MS Sans Serif} -size 9 -weight bold"
header_font = "-family {MS Sans Serif} -weight bold"
else:
# X-Windows specific fonts
label_font = "-*-helvetica-*-r-*-*-*-160-*-*-*-*-*-*"
value_font = "-*-courier-*-r-*-*-*-160-*-*-*-*-*-*"
small_font = "-*-helvetica-*-r-*-*-*-130-*-*-*-*-*-*"
header_font = "-*-helvetica-bold-r-*-*-*-150-*-*-*-*-*-*"
class VerticalGauge(Pmw.MegaWidget):
"""Vertical gauge with actual and desired settings"""
def __init__(self, parent = None, **kw):
optiondefs = (
('min', 0, None),
('max', 100, None),
('majortickinterval', 10, None),
('minortickinterval', 5, None),
('units', '', None),
('bg', 'grey', self._backgroundSet),
('actualvalue', 50, self._actualSet),
('desiredvalue', 50, self._desiredSet),
('actualcolour', 'yellow1', None),
('desiredcolour', 'turquoise1', None),
('label', 'Label', None),
)
self.defineoptions(kw, optiondefs)
Pmw.MegaWidget.__init__(self, parent)
interior = self.interior()
interior.grid_rowconfigure(1, weight = 1)
for r in range(3):
interior.grid_columnconfigure(r, weight = 1)
self.actuallabel = self.createcomponent('actualLabel',
(), None,
Tkinter.Label, (interior,),
text = '',
width = 3,
relief = 'sunken',
bd = 1,
fg = self['actualcolour'],
font = value_font)
self.actuallabel.grid(sticky = "nswe", row = 0, column = 0)
self.label = self.createcomponent('label',
(), None,
Tkinter.Label, (interior,),
text = self['label'],
relief = 'raised',
font = label_font,
fg = 'navy',
bd = 2)
self.label.grid(sticky = "nsew", row = 0, column = 1)
self.desiredlabel = self.createcomponent('desiredLabel',
(), None,
Tkinter.Label, (interior,),
text = '',
width = 3,
relief = 'sunken',
bd = 1,
fg = self['desiredcolour'],
font = value_font)
self.desiredlabel.grid(sticky = "nswe", row = 0, column = 2)
self.canvas = self.createcomponent('canvas',
(), None,
Tkinter.Canvas, (interior,),
width = 100,
height = 300,
bg = 'grey')
self.canvas.grid(sticky = "nsew", columnspan = 3, pady = 1)
self.canvas.bind("<Configure>", self._createGaugeAxes)
self._createGaugeAxes()
self.initialiseoptions()
def _createGaugeAxes(self, event = None):
min = self['min']
max = self['max']
units = self['units']
majortickinterval = self['majortickinterval']
gauge_range = max - min
c = self.canvas
c.delete("all")
if event:
h, w = event.height, event.width
else:
h = int(c.configure("height")[4])
w = int(c.configure("width")[4])
self.lower = h - 15
self.upper = 15
self.middle = w / 2
c.create_line(self.middle, self.lower, self.middle, self.upper)
majortickcount = int((max - min) / majortickinterval)
self.axislength = self.lower - self.upper
self.majortickdistance = float(self.axislength) / majortickcount
self.majortickwidth = w / 5
labeloffset = (w / 4) + 10
for i in range(majortickcount + 1):
v = min + i * majortickinterval
d = self.lower - i * self.majortickdistance
c.create_line(self.middle, d, self.middle + self.majortickwidth, d)
c.create_text(self.middle + labeloffset, d, font = small_font, text = str(v))
self._desiredSet(event)
self._actualSet(event)
def _backgroundSet(self):
self.canvas.configure(bg = self['bg'])
def _desiredSet(self, event = None):
c = self.canvas
desired = self['desiredvalue']
desiredcolour = self['desiredcolour']
min = self['min']
max = self['max']
if desired > max: desired = max
if desired < min: desired = min
gauge_range = max - min
c = self.canvas
if event:
h, w = event.height, event.width
else:
h = int(c.configure("height")[4])
w = int(c.configure("width")[4])
desired_y = self.lower - (float(desired - min) / gauge_range) * self.axislength
try:
c.delete('desiredBar')
except:
pass
c.create_line(self.middle - self.majortickwidth, desired_y,
self.middle + self.majortickwidth, desired_y,
fill = desiredcolour, stipple = 'gray50',
width = 10, tag = 'desiredBar')
self.desiredlabel.configure(text = desired)
def setActual(self, value):
self.configure(actualvalue = value)
def getActual(self):
return self.cget('actualvalue')
def _actualSet(self, event = None):
c = self.canvas
actual = self['actualvalue']
actualcolour = self['actualcolour']
min = self['min']
max = self['max']
if actual > max: actual = max
if actual < min: actual = min
gauge_range = max - min
c = self.canvas
if event:
h, w = event.height, event.width
else:
h = int(c.configure("height")[4])
w = int(c.configure("width")[4])
actual_y = self.lower - (float(actual - min) / gauge_range) * self.axislength
try:
c.delete('actualPointer')
except:
pass
triangle = ((self.middle, actual_y),
(self.middle - 1.4 * self.majortickwidth, actual_y - self.majortickwidth / 2),
(self.middle - 1.4 * self.majortickwidth, actual_y + self.majortickwidth / 2))
c.create_polygon(triangle, fill = actualcolour, tag = 'actualPointer')
self.actuallabel.configure(text = actual)
Pmw.forwardmethods(VerticalGauge, Tkinter.Canvas, 'canvas')
if __name__ == '__main__':
# Initialise Tkinter and Pmw.
root = Pmw.initialise()
root.title('Pmw VerticalGauge demonstration')
def increase():
av = g1.getActual()
g1.setActual(av + 1)
def decrease():
av = g1.getActual()
g1.setActual(av - 1)
g1 = VerticalGauge(min = 0,
max = 30,
actualvalue = 15,
desiredvalue = 22,
majortickinterval = 2,
label = "Pms")
g1.grid(sticky = "nsew")
root.grid_rowconfigure(0, weight = 1)
root.grid_columnconfigure(0, weight = 1)
b1 = Tkinter.Button(text = "Increase", command = increase)
b1.grid()
b2 = Tkinter.Button(text = "Decrease", command = decrease)
b2.grid()
# Let's go.
root.mainloop()
|
py | b413fa8e3595e4821fc1045556755237fd02088c | from typing import List, Optional
from warnings import filterwarnings
import requests
from beartype import beartype
from beartype.roar import BeartypeDecorHintPep585DeprecationWarning
from .base_client import BaseClient
from .types.dataset_types import DatasetType
from .types.thing_types import ThingType
from .types.timeseries_types import TimeseriesType
from .utils import filter_none_values_from_dict
filterwarnings("ignore", category=BeartypeDecorHintPep585DeprecationWarning)
Response = requests.models.Response
class ThingsClient(BaseClient):
"""
A client for handling the things section of NODA Self-host API
"""
@beartype
def __init__(self,
base_url: Optional[str] = None,
username: Optional[str] = None,
password: Optional[str] = None
) -> None:
super().__init__(base_url, username, password)
self._things_api_path = 'things'
@beartype
def get_things(self,
limit: Optional[int] = None,
offset: Optional[int] = None,
tags: Optional[List[str]] = None
) -> List[ThingType]:
"""Fetches things from NODA Self-host API
Args:
limit (Optional[int]): The numbers of items to return.
offset (Optional[int]): The number of items to skip before starting to collect the result set.
tags (Optional[List[str]]): List of tags to match on.
Returns:
List[:class:`.ThingType`]
Raises:
:class:`.SelfHostBadRequestException`: Sent request had insufficient data or invalid options.
:class:`.SelfHostUnauthorizedException`: Request was refused due to lacking authentication credentials.
:class:`.SelfHostForbiddenException`: Server understands the request but refuses to authorize it.
:class:`.SelfHostTooManyRequestsException`: Sent too many requests in a given amount of time.
:class:`.SelfHostInternalServerException`: Server encountered an unexpected condition that prevented it
from fulfilling the request.
"""
response: Response = self._session.get(
url=f'{self._base_url}/{self._api_version}/{self._things_api_path}',
params=filter_none_values_from_dict({
'limit': limit,
'offset': offset,
'tags': tags
})
)
return self._process_response(response)
@beartype
def create_thing(self,
name: str,
thing_type: Optional[str] = None,
tags: Optional[List[str]] = None
) -> ThingType:
"""Add a new thing to the NODA Self-host API
Args:
name (str): The name of the thing
thing_type (Optional[str]): Type of the thing
tags (Optional[List[str]]): Tags pinned on the thing
Returns:
:class:`.ThingType`
Raises:
:class:`.SelfHostBadRequestException`: Sent request had insufficient data or invalid options.
:class:`.SelfHostUnauthorizedException`: Request was refused due to lacking authentication credentials.
:class:`.SelfHostForbiddenException`: Server understands the request but refuses to authorize it.
:class:`.SelfHostTooManyRequestsException`: Sent too many requests in a given amount of time.
:class:`.SelfHostInternalServerException`: Server encountered an unexpected condition that prevented it
from fulfilling the request.
"""
response: Response = self._session.post(
url=f'{self._base_url}/{self._api_version}/{self._things_api_path}',
json=filter_none_values_from_dict({
'name': name,
'type': thing_type,
'tags': tags
})
)
return self._process_response(response)
@beartype
def get_thing(self, thing_uuid: str) -> ThingType:
"""Returns a thing from NODA Self-host API by UUID
Args:
thing_uuid (str): UUID of thing to fetch.
Returns:
:class:`.ThingType`
Raises:
:class:`.SelfHostBadRequestException`: Sent request had insufficient data or invalid options.
:class:`.SelfHostUnauthorizedException`: Request was refused due to lacking authentication credentials.
:class:`.SelfHostForbiddenException`: Server understands the request but refuses to authorize it.
:class:`.SelfHostNotFoundException`: The requested resource was not found.
:class:`.SelfHostTooManyRequestsException`: Sent too many requests in a given amount of time.
:class:`.SelfHostInternalServerException`: Server encountered an unexpected condition that prevented it
from fulfilling the request.
"""
response: Response = self._session.get(
url=f'{self._base_url}/{self._api_version}/{self._things_api_path}/{thing_uuid}'
)
return self._process_response(response)
@beartype
def update_thing(self,
thing_uuid: str,
name: Optional[str] = None,
state: Optional[str] = None,
thing_type: Optional[str] = None,
tags: Optional[List[str]] = None
) -> None:
"""Updates a thing from NODA Self-host API
Args:
thing_uuid (str): UUID of thing to update.
name (Optional[str]): The name of the thing
state (Optional[str]): The state of the thing.
- active
- inactive
- passive
- archived
thing_type (Optional[str]): Thing type declaration
tags (Optional[List[str]]): Tags pinned on the thing
Raises:
:class:`.SelfHostBadRequestException`: Sent request had insufficient data or invalid options.
:class:`.SelfHostUnauthorizedException`: Request was refused due to lacking authentication credentials.
:class:`.SelfHostForbiddenException`: Server understands the request but refuses to authorize it.
:class:`.SelfHostNotFoundException`: The requested resource was not found.
:class:`.SelfHostTooManyRequestsException`: Sent too many requests in a given amount of time.
:class:`.SelfHostInternalServerException`: Server encountered an unexpected condition that prevented it
from fulfilling the request.
"""
response: Response = self._session.put(
url=f'{self._base_url}/{self._api_version}/{self._things_api_path}/{thing_uuid}',
json=filter_none_values_from_dict({
'name': name,
'state': state,
'type': thing_type,
'tags': tags
})
)
return self._process_response(response)
@beartype
def delete_thing(self, thing_uuid: str) -> None:
"""Deletes a thing from NODA Self-host API
Args:
thing_uuid (str): UUID of thing to delete.
Raises:
:class:`.SelfHostBadRequestException`: Sent request had insufficient data or invalid options.
:class:`.SelfHostUnauthorizedException`: Request was refused due to lacking authentication credentials.
:class:`.SelfHostForbiddenException`: Server understands the request but refuses to authorize it.
:class:`.SelfHostNotFoundException`: The requested resource was not found.
:class:`.SelfHostTooManyRequestsException`: Sent too many requests in a given amount of time.
:class:`.SelfHostInternalServerException`: Server encountered an unexpected condition that prevented it
from fulfilling the request.
"""
response: Response = self._session.delete(
url=f'{self._base_url}/{self._api_version}/{self._things_api_path}/{thing_uuid}'
)
return self._process_response(response)
@beartype
def get_thing_datasets(self, thing_uuid: str) -> List[DatasetType]:
"""Returns a list of datasets associated with the specified thing from NODA Self-host API
Args:
thing_uuid (str): UUID of the target user.
Returns:
List[:class:`.DatasetType`]
Raises:
:class:`.SelfHostBadRequestException`: Sent request had insufficient data or invalid options.
:class:`.SelfHostUnauthorizedException`: Request was refused due to lacking authentication credentials.
:class:`.SelfHostForbiddenException`: Server understands the request but refuses to authorize it.
:class:`.SelfHostNotFoundException`: The requested resource was not found.
:class:`.SelfHostTooManyRequestsException`: Sent too many requests in a given amount of time.
:class:`.SelfHostInternalServerException`: Server encountered an unexpected condition that prevented it
from fulfilling the request.
"""
response: Response = self._session.get(
url=f'{self._base_url}/{self._api_version}/{self._things_api_path}/{thing_uuid}/datasets'
)
return self._process_response(response)
@beartype
def get_thing_timeseries(self, thing_uuid: str) -> List[TimeseriesType]:
"""Returns a list of timeseries associated with the specified thing from NODA Self-host API
Args:
thing_uuid (str): UUID of the target user.
Returns:
List[:class:`.TimeseriesType`]
Raises:
:class:`.SelfHostBadRequestException`: Sent request had insufficient data or invalid options.
:class:`.SelfHostUnauthorizedException`: Request was refused due to lacking authentication credentials.
:class:`.SelfHostForbiddenException`: Server understands the request but refuses to authorize it.
:class:`.SelfHostNotFoundException`: The requested resource was not found.
:class:`.SelfHostTooManyRequestsException`: Sent too many requests in a given amount of time.
:class:`.SelfHostInternalServerException`: Server encountered an unexpected condition that prevented it
from fulfilling the request.
"""
response: Response = self._session.get(
url=f'{self._base_url}/{self._api_version}/{self._things_api_path}/{thing_uuid}/timeseries'
)
return self._process_response(response)
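# --- Hedged usage sketch (not part of the original module) ---
# Assuming a reachable NODA Self-host instance and valid credentials, the client
# defined above would typically be used like this (the field access on the
# returned ThingType dictionaries is an assumption):
#
#   client = ThingsClient(base_url="https://selfhost.example.com",
#                         username="user", password="secret")
#   thing = client.create_thing(name="Substation 1", thing_type="heating", tags=["demo"])
#   for ts in client.get_thing_timeseries(thing["uuid"]):
#       print(ts)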
|
py | b413fbb73187ea85511de15b12c0b4d12a6cddfc | # Created by always0ne on 2020.09.16
from Crypto.PublicKey import RSA
from Crypto.Cipher import PKCS1_OAEP
def encrypt_message_rsa():
message = input("input your message : ")
# generate RSA Key
key_pair = RSA.generate(3072)
pub_key = key_pair.publickey()
# generate Private Key and save private.pem
with open("private.pem",'wb') as private_key:
private_key.write(key_pair.export_key(passphrase="1234"))
# encrypt message and save enc.txt
encryptor = PKCS1_OAEP.new(pub_key)
with open("enc.txt",'wb') as encrypted_message:
encrypted_message.write(encryptor.encrypt(message.encode('utf-8')))
encrypt_message_rsa()
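# --- Hedged decryption sketch (not part of the original script) ---
# The message written to enc.txt can be recovered with the saved private key and
# the same passphrase used above:
#
#   with open("private.pem", "rb") as key_file:
#       private_key = RSA.import_key(key_file.read(), passphrase="1234")
#   decryptor = PKCS1_OAEP.new(private_key)
#   with open("enc.txt", "rb") as encrypted_message:
#       print(decryptor.decrypt(encrypted_message.read()).decode("utf-8"))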
|
py | b413fc3be792807fef5856e23d199d77d34a026e | import os
import qt, vtk
import collections
from collections import OrderedDict
class CardiacDeviceBase(object):
DEVICE_CLASS_MODIFIED_EVENT = 20000
DEVICE_PARAMETER_VALUE_MODIFIED_EVENT = 20001
DEVICE_PROFILE_MODIFIED_EVENT = 20002
QUANTIFICATION_RESULT_UPDATED_EVENT = 20003
NAME=None
ID=None
@classmethod
def getParameters(cls):
raise NotImplementedError
@classmethod
def getInternalParameters(cls):
return {'interpolationSmoothness': 0.0}
@classmethod
def getSegments(cls):
return []
@staticmethod
def _genParameters(name, info, value, unit, minimum, maximum, singleStep, pageStep, decimals=2):
return {"name": name, "info": info, "value": value, "unit": unit, "minimum": minimum, "maximum": maximum,
"singleStep": singleStep, "pageStep": pageStep, "decimals": decimals}
@staticmethod
def getProfilePoints(params, segment=None, openSegment=True): #segment: one of 'distal', 'middle', 'proximal', 'whole'
raise NotImplementedError
@classmethod
def updateModel(cls, modelNode, parameterNode):
"""Most devices provides only a profile (via getProfilePoints) and model is computed from rotational sweep
of these points. However, a device can generate a model of arbitrary shape by overriding this method."""
raise NotImplementedError
@classmethod
def getPresets(cls):
csvFile = os.path.join(cls.RESOURCES_PATH, "Presets", cls.ID + ".csv")
#if os.path.exists(csvFile):
return DeviceImplantPresets(csvFile, cls.getParameters())
@classmethod
def getIcon(cls):
if cls.ID:
pngFile = os.path.join(cls.RESOURCES_PATH, "Icons", cls.ID + ".png")
if os.path.exists(pngFile):
return qt.QIcon(qt.QPixmap(pngFile))
return None
@classmethod
def getParameterValuesFromNode(cls, parameterNode):
parameterValues = {}
parameters = cls.getParameters()
for paramName, paramAttributes in parameters.items():
paramValueStr = parameterNode.GetParameter(cls.ID + "_" + paramName)
if paramValueStr:
parameterValues[paramName] = float(parameterNode.GetParameter(cls.ID+"_"+paramName))
else:
# value not defined in parameter node, use the default
paramScale = (0.01 if paramAttributes["unit"] == "%" else 1.0)
value = paramAttributes["value"] * paramScale
parameterValues[paramName] = value
return parameterValues
class DeviceImplantPresets(OrderedDict):
def __init__(self, csvfile, defaultParameters):
super(DeviceImplantPresets, self).__init__()
self._presetCSVFile = csvfile
if self._presetCSVFile and os.path.exists(self._presetCSVFile):
self._readPresets()
else:
# If preset file is not available then create a default preset from the parameter info
presets = OrderedDict()
defaultPreset = {}
for paramName, paramAttributes in defaultParameters.items():
paramScale = (0.01 if paramAttributes["unit"] == "%" else 1.0)
value = paramAttributes["value"] * paramScale
defaultPreset[paramName] = str(value)
presets["Default"] = defaultPreset
self.update(presets)
def _readPresets(self):
import csv
presets = OrderedDict()
with open(self._presetCSVFile, mode='r') as csv_file:
for row in csv.DictReader(csv_file):
presets[row["Model"]] = {key:row[key] for key in row.keys() if key.lower() != "model"}
self.update(presets)
class HarmonyDevice(CardiacDeviceBase):
NAME="Harmony TCPV"
ID="Harmony"
RESOURCES_PATH = os.path.join(os.path.dirname(__file__), "..", "Resources")
@classmethod
def getParameters(cls):
scale = 2.0 # allow scaling animal device to humans
PA = "Pulmonary artery side"
PV = "Right ventricle side"
return {
"distalStraightRadiusMm": cls._genParameters("Distal straight radius", PA, 15.5, "mm", 5, 15.5 * scale, 0.1, 1),
"distalStraightLengthMm":cls._genParameters("Distal straight length", PA, 8.9, "mm", 0, 17.7 * scale, 0.1, 1),
"distalCurvedRadiusMm": cls._genParameters("Distal curved radius", PA, 15, "mm", 0, 15.5 * scale, 0.1, 1),
"distalCurvedLengthMm": cls._genParameters("Distal curved length", PA, 8.8, "mm", 0, 17.7 * scale, 0.1, 1),
"midRadiusMm": cls._genParameters("Middle radius", "", 11, "mm", 3, 11 * scale, 0.1, 1),
"midLengthMm": cls._genParameters("Middle length", "", 17.7, "mm", 5, 17.7 * scale, 0.1, 1),
"proximalCurvedRadiusMm": cls._genParameters("Proximal curved radius", PA, 21, "mm", 0, 21.5 * scale, 0.1, 1),
"proximalCurvedLengthMm": cls._genParameters("Proximal curved length", PA, 8.8, "mm", 0, 17.7 * scale, 0.1, 1),
"proximalStraightRadiusMm": cls._genParameters("Proximal straight radius", PV, 21.5, "mm", 10, 21.5 * scale, 0.1, 1),
"proximalStraightLengthMm": cls._genParameters("Proximal straight length", PV, 8.9, "mm", 0, 17.7 * scale, 0.1, 1)
}
@classmethod
def getInternalParameters(cls):
return {'interpolationSmoothness': -1.0}
@classmethod
def getSegments(cls):
return ['distal', 'middle', 'proximal']
@staticmethod
def getProfilePoints(params, segment=None, openSegment=True):
import math
curvedSectionNumberOfPoints = 7
curvedSegmentSlope = 2.0
points = vtk.vtkPoints()
if segment is None or segment == 'distal' or segment == "whole":
if not openSegment: # add a point at center, to make closed model
points.InsertNextPoint(0, 0, -params['distalStraightLengthMm'] - params['distalCurvedLengthMm'] - params[
'midLengthMm'] * 0.5)
points.InsertNextPoint(params['distalStraightRadiusMm'], 0,
-params['distalStraightLengthMm'] - params['distalCurvedLengthMm'] - params[
'midLengthMm'] * 0.5)
curvedSectionStartX = params['midRadiusMm']
curvedSectionStartZ = -params['distalCurvedLengthMm'] - params['midLengthMm'] * 0.5
radiusScale = (params['distalCurvedRadiusMm'] - params['midRadiusMm']) / (
math.tanh(0.5 * curvedSegmentSlope) - math.tanh(-0.5 * curvedSegmentSlope))
for pointIndex in range(0, curvedSectionNumberOfPoints - 1):
normalizedPos = float(pointIndex) / float(curvedSectionNumberOfPoints - 1) # goes from 0 to 1
x = curvedSectionStartX + radiusScale * (
math.tanh((0.5 - normalizedPos) * curvedSegmentSlope) - math.tanh(-0.5 * curvedSegmentSlope))
z = curvedSectionStartZ + normalizedPos * params['distalCurvedLengthMm']
points.InsertNextPoint(x, 0, z)
points.InsertNextPoint(params['midRadiusMm'], 0, -params['midLengthMm'] * 0.5)
if (segment == "distal" or segment == "middle") and openSegment == False: # add a point at center, to make closed model
points.InsertNextPoint(0, 0, -params['midLengthMm'] * 0.5)
if segment == "middle": # whole models should only contain one copy of these point
points.InsertNextPoint(params['midRadiusMm'], 0, -params['midLengthMm'] * 0.5)
points.InsertNextPoint(params['midRadiusMm'], 0, +params['midLengthMm'] * 0.5)
if (segment == "middle" or segment == "proximal") and openSegment == False: # add a point at center, to make closed model
points.InsertNextPoint(0, 0, +params['midLengthMm'] * 0.5)
if segment is None or segment == "proximal" or segment == "whole":
points.InsertNextPoint(params['midRadiusMm'], 0,
+params['midLengthMm'] * 0.5) # TODO: check if point duplication is not an issue
curvedSectionStartX = params['midRadiusMm']
curvedSectionStartZ = params['midLengthMm'] * 0.5
radiusScale = (params['proximalCurvedRadiusMm'] - params['midRadiusMm']) / (
math.tanh(0.5 * curvedSegmentSlope) - math.tanh(-0.5 * curvedSegmentSlope))
for pointIndex in range(1, curvedSectionNumberOfPoints):
normalizedPos = float(pointIndex) / float(curvedSectionNumberOfPoints - 1) # goes from 0 to 1
x = curvedSectionStartX + radiusScale * (
math.tanh((normalizedPos - 0.5) * curvedSegmentSlope) - math.tanh(-0.5 * curvedSegmentSlope))
z = curvedSectionStartZ + normalizedPos * params['proximalCurvedLengthMm']
points.InsertNextPoint(x, 0, z)
points.InsertNextPoint(params['proximalStraightRadiusMm'], 0,
params['midLengthMm'] * 0.5 + params['proximalCurvedLengthMm'] + params[
'proximalStraightLengthMm'])
if segment is not None and openSegment == False: # add a point at center, to make closed model
points.InsertNextPoint(0, 0, params['midLengthMm'] * 0.5 + params['proximalCurvedLengthMm'] + params[
'proximalStraightLengthMm'])
return points
class CylinderDevice(CardiacDeviceBase):
NAME = "Cylinder valve/stent"
ID = "Cylinder"
RESOURCES_PATH = os.path.join(os.path.dirname(__file__), "..", "Resources")
@classmethod
def getParameters(cls):
# Use an OrderedDict to display sliders in the order we define them here
return collections.OrderedDict([
("expansionPercent", cls._genParameters("Expansion", "100% means expanded, 0% means crimped", 100, "%", 0, 100, 1, 10)),
("expandedDiameterMm", cls._genParameters("Expanded diameter", "", 22.4, "mm", 0, 60, 0.1, 1)),
("expandedLengthMm", cls._genParameters("Expanded length", "", 24, "mm", 0, 100, 0.1, 1)),
("crimpedDiameterMm", cls._genParameters("Crimped diameter", "", 7, "mm", 0, 60, 0.1, 1)),
("crimpedLengthMm", cls._genParameters("Crimped length", "", 32, "mm", 0, 100, 0.1, 1)),
("anchorPositionPercent", cls._genParameters("Anchor position", "Defines what point of the device remains in the same position as it expands/contracts", 0, "%", 0, 100, 1, 10))
])
@staticmethod
def getProfilePoints(params, segment=None, openSegment=True):
lengthMm = params['crimpedLengthMm'] + params['expansionPercent'] * (params['expandedLengthMm']-params['crimpedLengthMm'])
radiusMm = (params['crimpedDiameterMm'] + params['expansionPercent'] * (params['expandedDiameterMm']-params['crimpedDiameterMm'])) / 2.0
print("Expansion = {0}, actual diameter = {1}, length = {2}".format(params['expansionPercent'], radiusMm * 2.0, lengthMm))
origin = -lengthMm * params['anchorPositionPercent']
points = vtk.vtkPoints()
points.InsertNextPoint(radiusMm, 0, origin+lengthMm * 0.00)
points.InsertNextPoint(radiusMm, 0, origin+lengthMm * 0.25)
points.InsertNextPoint(radiusMm, 0, origin+lengthMm * 0.50)
points.InsertNextPoint(radiusMm, 0, origin+lengthMm * 0.75)
points.InsertNextPoint(radiusMm, 0, origin+lengthMm * 1.00)
return points
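# --- Hedged worked example (not part of the original module) ---
# CylinderDevice interpolates diameter and length linearly between the crimped and
# expanded values. With the default parameters above (crimped 7 mm / expanded
# 22.4 mm diameter, crimped 32 mm / expanded 24 mm length) and an expansion of 0.5
# (the 50 % slider value after the percent scaling in getParameterValuesFromNode):
#
#   diameter = 7 + 0.5 * (22.4 - 7) = 14.7 mm
#   length   = 32 + 0.5 * (24 - 32) = 28.0 mm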
|
py | b413fd3e736c3ce2f5466f7826c8e4bbe4d80f3f | from typing import (
List,
Tuple,
TYPE_CHECKING,
Union,
)
from cached_property import cached_property
from cancel_token import CancelToken
from eth.rlp.accounts import Account
from eth.rlp.headers import BlockHeader
from eth.rlp.receipts import Receipt
from lahja import EndpointAPI
from eth_typing import BlockNumber
from eth.constants import GENESIS_BLOCK_NUMBER
from lahja import (
BroadcastConfig,
)
from p2p.abc import BehaviorAPI, CommandAPI, HandshakerAPI, SessionAPI
from p2p.peer_pool import BasePeerPool
from p2p.typing import Payload
from trinity.rlp.block_body import BlockBody
from trinity.protocol.common.peer import (
BaseChainPeer,
BaseProxyPeer,
BaseChainPeerFactory,
BaseChainPeerPool,
)
from trinity.protocol.common.peer_pool_event_bus import (
PeerPoolEventServer,
BaseProxyPeerPool,
)
from .api import LESAPI
from .commands import GetBlockHeaders
from .constants import (
MAX_HEADERS_FETCH,
)
from .events import (
GetBlockHeadersEvent,
SendBlockHeadersEvent,
)
from .proto import (
LESHandshakeParams,
LESProtocolV1,
LESProtocolV2,
ProxyLESProtocol,
)
from .events import (
GetAccountRequest,
GetBlockBodyByHashRequest,
GetBlockHeaderByHashRequest,
GetContractCodeRequest,
GetReceiptsRequest,
)
from .handshaker import LESV1Handshaker, LESV2Handshaker
if TYPE_CHECKING:
from trinity.sync.light.service import BaseLightPeerChain # noqa: F401
class LESPeer(BaseChainPeer):
max_headers_fetch = MAX_HEADERS_FETCH
supported_sub_protocols = (LESProtocolV1, LESProtocolV2)
sub_proto: Union[LESProtocolV1, LESProtocolV2] = None
def get_behaviors(self) -> Tuple[BehaviorAPI, ...]:
return super().get_behaviors() + (LESAPI().as_behavior(),)
@cached_property
def les_api(self) -> LESAPI:
return self.connection.get_logic(LESAPI.name, LESAPI)
class LESProxyPeer(BaseProxyPeer):
"""
A ``LESPeer`` that can be used from any process instead of the actual peer pool peer.
Any action performed on the ``BCCProxyPeer`` is delegated to the actual peer in the pool.
This does not yet mimic all APIs of the real peer.
"""
def __init__(self,
session: SessionAPI,
event_bus: EndpointAPI,
sub_proto: ProxyLESProtocol):
super().__init__(session, event_bus)
self.sub_proto = sub_proto
@classmethod
def from_session(cls,
session: SessionAPI,
event_bus: EndpointAPI,
broadcast_config: BroadcastConfig) -> 'LESProxyPeer':
return cls(session, event_bus, ProxyLESProtocol(session, event_bus, broadcast_config))
class LESPeerFactory(BaseChainPeerFactory):
peer_class = LESPeer
async def get_handshakers(self) -> Tuple[HandshakerAPI, ...]:
headerdb = self.context.headerdb
wait = self.cancel_token.cancellable_wait
head = await wait(headerdb.coro_get_canonical_head())
total_difficulty = await wait(headerdb.coro_get_score(head.hash))
genesis_hash = await wait(
headerdb.coro_get_canonical_block_hash(BlockNumber(GENESIS_BLOCK_NUMBER))
)
handshake_params_kwargs = dict(
network_id=self.context.network_id,
head_td=total_difficulty,
head_hash=head.hash,
head_number=head.block_number,
genesis_hash=genesis_hash,
serve_headers=True,
# TODO: these should be configurable to allow us to serve this data.
serve_chain_since=None,
serve_state_since=None,
serve_recent_state=None,
serve_recent_chain=None,
tx_relay=None,
flow_control_bl=None,
flow_control_mcr=None,
flow_control_mrr=None,
announce_type=None,
)
v1_handshake_params = LESHandshakeParams(version=1, **handshake_params_kwargs)
v2_handshake_params = LESHandshakeParams(version=2, **handshake_params_kwargs)
return (
LESV1Handshaker(handshake_params=v1_handshake_params),
LESV2Handshaker(handshake_params=v2_handshake_params),
)
class LESPeerPoolEventServer(PeerPoolEventServer[LESPeer]):
"""
LES protocol specific ``PeerPoolEventServer``. See ``PeerPoolEventServer`` for more info.
"""
def __init__(self,
event_bus: EndpointAPI,
peer_pool: BasePeerPool,
token: CancelToken = None,
chain: 'BaseLightPeerChain' = None) -> None:
super().__init__(event_bus, peer_pool, token)
self.chain = chain
subscription_msg_types = frozenset({GetBlockHeaders})
async def _run(self) -> None:
self.run_daemon_event(
SendBlockHeadersEvent,
lambda ev: self.try_with_session(
ev.session,
lambda peer: peer.sub_proto.send_block_headers(ev.headers, ev.buffer_value, ev.request_id) # noqa: E501
)
)
self.run_daemon_request(
GetBlockHeaderByHashRequest,
self.handle_get_blockheader_by_hash_requests
)
self.run_daemon_request(
GetBlockBodyByHashRequest,
self.handle_get_blockbody_by_hash_requests
)
self.run_daemon_request(GetReceiptsRequest, self.handle_get_receipts_by_hash_requests)
self.run_daemon_request(GetAccountRequest, self.handle_get_account_requests)
self.run_daemon_request(GetContractCodeRequest, self.handle_get_contract_code_requests)
await super()._run()
async def handle_get_blockheader_by_hash_requests(
self,
event: GetBlockHeaderByHashRequest) -> BlockHeader:
return await self.chain.coro_get_block_header_by_hash(event.block_hash)
async def handle_get_blockbody_by_hash_requests(
self,
event: GetBlockBodyByHashRequest) -> BlockBody:
return await self.chain.coro_get_block_body_by_hash(event.block_hash)
async def handle_get_receipts_by_hash_requests(
self,
event: GetReceiptsRequest) -> List[Receipt]:
return await self.chain.coro_get_receipts(event.block_hash)
async def handle_get_account_requests(
self,
event: GetAccountRequest) -> Account:
return await self.chain.coro_get_account(event.block_hash, event.address)
async def handle_get_contract_code_requests(
self,
event: GetContractCodeRequest) -> bytes:
return await self.chain.coro_get_contract_code(event.block_hash, event.address)
async def handle_native_peer_message(self,
session: SessionAPI,
cmd: CommandAPI,
msg: Payload) -> None:
if isinstance(cmd, GetBlockHeaders):
await self.event_bus.broadcast(GetBlockHeadersEvent(session, cmd, msg))
else:
raise Exception(f"Command {cmd} is not broadcasted")
class LESPeerPool(BaseChainPeerPool):
peer_factory_class = LESPeerFactory
class LESProxyPeerPool(BaseProxyPeerPool[LESProxyPeer]):
def convert_session_to_proxy_peer(self,
session: SessionAPI,
event_bus: EndpointAPI,
broadcast_config: BroadcastConfig) -> LESProxyPeer:
return LESProxyPeer.from_session(
session,
self.event_bus,
self.broadcast_config
)
|
py | b413fdc2907a071109d176bdb38b8facec70bd82 | # coding=utf8
# Copyright 2018 JDCLOUD.COM
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# NOTE: This class is auto generated by the jdcloud code generator program.
class BizTypeSetting(object):
def __init__(self, resourceType=None, antispam=None, porn=None, terrorism=None, ad=None):
"""
        :param resourceType: (Optional) resource type; allowed values: text, image, audio, video
        :param antispam: (Optional) anti-spam policy; use this struct for text and audio
        :param porn: (Optional) porn-detection policy; use this struct for images. For video, configuring it enables detection; leaving it empty disables detection
        :param terrorism: (Optional) violence/terrorism policy; use this struct for images and video
        :param ad: (Optional) image-and-text advertising policy; use this struct for images and video
"""
self.resourceType = resourceType
self.antispam = antispam
self.porn = porn
self.terrorism = terrorism
self.ad = ad
|
py | b413ffa493489e126a632b450ff70e25cfffa9a8 | #-------------------------------------------------------------
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
#-------------------------------------------------------------
# -- Path setup --------------------------------------------------------------
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute.
import os
import sys
sys.path.insert(0, os.path.abspath('../..'))
# -- Project information -----------------------------------------------------
project = 'SystemDS'
copyright = '2020, Apache SystemDS'
author = 'Apache SystemDS'
# The full version, including alpha/beta/rc tags
release = '2.0.0'
# -- General configuration ---------------------------------------------------
# Add any Sphinx extension module names here, as strings.
extensions = ['sphinx.ext.autodoc',
'sphinx.ext.coverage',
'sphinx.ext.todo',
'sphinx_rtd_theme',
'sphinx.ext.autodoc.typehints']
# TODO: there is currently ongoing development for nicer typehints in the description, so for now let's just roll with
# the current version and switch to 'description' once it is working
# autodoc_typehints = 'description'
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
# -- Options for HTML output -------------------------------------------------
# The theme to use for HTML and HTML Help pages.
html_theme = 'sphinx_rtd_theme'
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
# html_static_path = ['_static']
|
py | b414003017c46ab0350a3785c3769e7494ac1606 | from typing import Dict
def get_order_details(self,
order_id: str) -> Dict:
"""
Shows order details
:param order_id: (required) order ID
:return: Dict
"""
method = 'GET'
api_url = '/api/v1/orders/details'
path = self._host + api_url
params = {
"order_id": order_id
}
signature = self._make_signature(method, api_url, params)
response = self._make_request(method, path, signature, params=params)
return response.get('order_details')
def get_orders_history(self,
page: str = None,
items_per_page: str = None,
order_type: str = None,
order_status: str = None,
order_sub_type: str = None,
from_timestamp: str = None,
till_timestamp: str = None,
currency: str = None,
address: str = None) -> Dict:
"""
Shows orders' history for specified account
Orders' history has pagination with 10 orders per page.
:param page: (optional) page number
:param items_per_page: (optional) how many items to show per page
:param order_type: (optional) - Allowed values are ‘DEPOSIT’, ‘WITHDRAWAL’,
‘EXCHANGE’, ‘INTERNAL_MOVEMENT’, ‘INVOICE’.
:param order_status: (optional)- Allowed values:
"NEW", "ERROR", "CLOSED", "EXPIRED", "CANCELLED",
"CANCELLING", "WAITING_FOR_CONFIRMATION",
"PAYMENT_IN_PROGRESS", "WAITING_FOR_PRICE", "BLOCKED"
:param order_sub_type: (optional) Allowed values: "GATEWAY", "CASHBOX", "BANK_COLLECTION", "SEPA"
:param from_timestamp: (optional) filtering by time of order creation.
:param till_timestamp: (optional) filtering by time of order creation.
:param currency: (optional) filtering by currency
:param address: (optional) deposit source - payment card number for fiat,
crypto wallet address for cryptocurrencies, IBAN and other details for SEPA etc
:return: Dict
"""
method = 'GET'
api_url = '/api/v1/orders/history'
path = self._host + api_url
params = {
"page": page,
"limits": items_per_page,
"order_type": order_type,
"order_status": order_status,
"order_sub_type": order_sub_type,
"from_timestamp": from_timestamp,
"till_timestamp": till_timestamp,
"currency": currency,
"address": address
}
params = {key: value for key, value in params.items() if value}
signature = self._make_signature(method, api_url, params)
return self._make_request(method, path, signature, params=params)
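# --- Hedged usage sketch (not part of the original module) ---
# These functions are written as methods of an API client class: they rely on
# self._host, self._make_signature and self._make_request, none of which are
# defined in this file. Attached to such a client, typical calls would look like:
#
#   details = client.get_order_details(order_id="12345")
#   history = client.get_orders_history(page="1", order_type="DEPOSIT", currency="BTC")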
|
py | b4140113f92ed0cf53c2327e0fccad0973a42204 | # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
"""Centralized catalog of paths."""
import os
class DatasetCatalog(object):
DATA_DIR = "data/raw_data/"
DATASETS = {
"coco_2017_train": {
"img_dir": "coco/train2017",
"ann_file": "coco/images/annotations/instances_train2017.json"
},
"coco_2017_val": {
"img_dir": "coco/images/val2017",
"ann_file": "coco/annotations/instances_val2017.json"
},
"coco_2017_test_dev": {
"img_dir": "coco/images/test2017",
"ann_file": "coco/annotations/image_info_test-dev2017.json"
},
"coco_2014_train": {
"img_dir": "coco/images/train2014",
"ann_file": "coco/annotations/instances_train2014.json"
},
"coco_2014_val": {
"img_dir": "coco/images/val2014",
"ann_file": "coco/annotations/instances_val2014.json"
},
"coco_2014_minival": {
"img_dir": "coco/images/val2014",
"ann_file": "coco/annotations/instances_minival2014.json"
},
"coco_2014_valminusminival": {
"img_dir": "coco/images/val2014",
"ann_file": "coco/annotations/instances_valminusminival2014.json"
},
"keypoints_coco_2014_train": {
"img_dir": "coco/train2014",
"ann_file": "coco/annotations/person_keypoints_train2014.json",
},
"keypoints_coco_2014_val": {
"img_dir": "coco/val2014",
"ann_file": "coco/annotations/person_keypoints_val2014.json"
},
"keypoints_coco_2014_minival": {
"img_dir": "coco/val2014",
"ann_file": "coco/annotations/person_keypoints_minival2014.json",
},
"keypoints_coco_2014_valminusminival": {
"img_dir": "coco/val2014",
"ann_file": "coco/annotations/person_keypoints_valminusminival2014.json",
},
"voc_2007_train": {
"data_dir": "voc/VOC2007",
"split": "train"
},
"voc_2007_train_cocostyle": {
"img_dir": "voc/VOC2007/JPEGImages",
"ann_file": "voc/VOC2007/Annotations/pascal_train2007.json"
},
"voc_2007_val": {
"data_dir": "voc/VOC2007",
"split": "val"
},
"voc_2007_val_cocostyle": {
"img_dir": "voc/VOC2007/JPEGImages",
"ann_file": "voc/VOC2007/Annotations/pascal_val2007.json"
},
"voc_2007_test": {
"data_dir": "voc/VOC2007",
"split": "test"
},
"voc_2007_test_cocostyle": {
"img_dir": "voc/VOC2007/JPEGImages",
"ann_file": "voc/VOC2007/Annotations/pascal_test2007.json"
},
"voc_2012_train": {
"data_dir": "voc/VOC2012",
"split": "train"
},
"voc_2012_train_cocostyle": {
"img_dir": "voc/VOC2012/JPEGImages",
"ann_file": "voc/VOC2012/Annotations/pascal_train2012.json"
},
"voc_2012_val": {
"data_dir": "voc/VOC2012",
"split": "val"
},
"voc_2012_val_cocostyle": {
"img_dir": "voc/VOC2012/JPEGImages",
"ann_file": "voc/VOC2012/Annotations/pascal_val2012.json"
},
"voc_2012_test": {
"data_dir": "voc/VOC2012",
"split": "test"
            # PASCAL VOC2012 does not make its test annotations available, so there is no JSON annotation file
},
"cityscapes_fine_instanceonly_seg_train_cocostyle": {
"img_dir": "cityscapes/images",
"ann_file": "cityscapes/annotations/instancesonly_filtered_gtFine_train.json"
},
"cityscapes_fine_instanceonly_seg_val_cocostyle": {
"img_dir": "cityscapes/images",
"ann_file": "cityscapes/annotations/instancesonly_filtered_gtFine_val.json"
},
"cityscapes_fine_instanceonly_seg_test_cocostyle": {
"img_dir": "cityscapes/images",
"ann_file": "cityscapes/annotations/instancesonly_filtered_gtFine_test.json"
}
}
@staticmethod
def get(name):
if name.startswith('$'):
# the name is '$s', where s is the yaml str for the param
from qd.qd_common import load_from_yaml_str
args = load_from_yaml_str(name[1:])
import logging
from pprint import pformat
logging.info('MaskTSVDataset args\n{}'.format(pformat(args)))
return {'factory': 'MaskTSVDataset',
'args': args}
elif "coco" in name:
data_dir = DatasetCatalog.DATA_DIR
attrs = DatasetCatalog.DATASETS[name]
args = dict(
root=os.path.join(data_dir, attrs["img_dir"]),
ann_file=os.path.join(data_dir, attrs["ann_file"]),
)
return dict(
factory="COCODataset",
args=args,
)
elif "voc" in name:
data_dir = DatasetCatalog.DATA_DIR
attrs = DatasetCatalog.DATASETS[name]
args = dict(
data_dir=os.path.join(data_dir, attrs["data_dir"]),
split=attrs["split"],
)
return dict(
factory="PascalVOCDataset",
args=args,
)
raise RuntimeError("Dataset not available: {}".format(name))
class ModelCatalog(object):
S3_C2_DETECTRON_URL = "https://dl.fbaipublicfiles.com/detectron"
C2_IMAGENET_MODELS = {
"MSRA/R-50": "ImageNetPretrained/MSRA/R-50.pkl",
"MSRA/R-50-GN": "ImageNetPretrained/47261647/R-50-GN.pkl",
"MSRA/R-101": "ImageNetPretrained/MSRA/R-101.pkl",
"MSRA/R-101-GN": "ImageNetPretrained/47592356/R-101-GN.pkl",
"FAIR/20171220/X-101-32x8d": "ImageNetPretrained/20171220/X-101-32x8d.pkl",
"FAIR/20171220/X-101-64x4d": "ImageNetPretrained/20171220/X-101-64x4d.pkl",
}
C2_DETECTRON_SUFFIX = "output/train/{}coco_2014_train%3A{}coco_2014_valminusminival/generalized_rcnn/model_final.pkl"
C2_DETECTRON_MODELS = {
"35857197/e2e_faster_rcnn_R-50-C4_1x": "01_33_49.iAX0mXvW",
"35857345/e2e_faster_rcnn_R-50-FPN_1x": "01_36_30.cUF7QR7I",
"35857890/e2e_faster_rcnn_R-101-FPN_1x": "01_38_50.sNxI7sX7",
"36761737/e2e_faster_rcnn_X-101-32x8d-FPN_1x": "06_31_39.5MIHi1fZ",
"35858791/e2e_mask_rcnn_R-50-C4_1x": "01_45_57.ZgkA7hPB",
"35858933/e2e_mask_rcnn_R-50-FPN_1x": "01_48_14.DzEQe4wC",
"35861795/e2e_mask_rcnn_R-101-FPN_1x": "02_31_37.KqyEK4tT",
"36761843/e2e_mask_rcnn_X-101-32x8d-FPN_1x": "06_35_59.RZotkLKI",
"37129812/e2e_mask_rcnn_X-152-32x8d-FPN-IN5k_1.44x": "09_35_36.8pzTQKYK",
# keypoints
"37697547/e2e_keypoint_rcnn_R-50-FPN_1x": "08_42_54.kdzV35ao"
}
@staticmethod
def get(name):
if name.startswith("Caffe2Detectron/COCO"):
return ModelCatalog.get_c2_detectron_12_2017_baselines(name)
if name.startswith("ImageNetPretrained"):
return ModelCatalog.get_c2_imagenet_pretrained(name)
raise RuntimeError("model not present in the catalog {}".format(name))
@staticmethod
def get_c2_imagenet_pretrained(name):
prefix = ModelCatalog.S3_C2_DETECTRON_URL
name = name[len("ImageNetPretrained/"):]
name = ModelCatalog.C2_IMAGENET_MODELS[name]
url = "/".join([prefix, name])
return url
@staticmethod
def get_c2_detectron_12_2017_baselines(name):
# Detectron C2 models are stored following the structure
# prefix/<model_id>/2012_2017_baselines/<model_name>.yaml.<signature>/suffix
# we use as identifiers in the catalog Caffe2Detectron/COCO/<model_id>/<model_name>
prefix = ModelCatalog.S3_C2_DETECTRON_URL
dataset_tag = "keypoints_" if "keypoint" in name else ""
suffix = ModelCatalog.C2_DETECTRON_SUFFIX.format(dataset_tag, dataset_tag)
# remove identification prefix
name = name[len("Caffe2Detectron/COCO/"):]
# split in <model_id> and <model_name>
model_id, model_name = name.split("/")
# parsing to make it match the url address from the Caffe2 models
model_name = "{}.yaml".format(model_name)
signature = ModelCatalog.C2_DETECTRON_MODELS[name]
unique_name = ".".join([model_name, signature])
url = "/".join([prefix, model_id, "12_2017_baselines", unique_name, suffix])
return url
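    # For illustration, ModelCatalog.get("Caffe2Detectron/COCO/35857345/e2e_faster_rcnn_R-50-FPN_1x")
    # resolves, per the logic above, to (wrapped here for readability; the real string
    # has no line breaks):
    #   https://dl.fbaipublicfiles.com/detectron/35857345/12_2017_baselines/
    #   e2e_faster_rcnn_R-50-FPN_1x.yaml.01_36_30.cUF7QR7I/output/train/
    #   coco_2014_train%3Acoco_2014_valminusminival/generalized_rcnn/model_final.pkl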
|
py | b41402a58820baa0641ea180db1e6cf203a687e2 | # Copyright (C) 2018 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions
# and limitations under the License.
#
#
# SPDX-License-Identifier: Apache-2.0
"""
This module implements resource contention detection on one workload
"""
import logging
from datetime import datetime
from collections import deque
from owca.metrics import Metric as OwcaMetric
from owca.metrics import Measurements, MetricName
from owca.detectors import ContendedResource
from prm.analyze.analyzer import Metric
log = logging.getLogger(__name__)
class Container:
"""
This class is the abstraction of one task, container metrics and
contention detection method are encapsulated in this module
"""
def __init__(self, cid, history_depth=5):
self.cid = cid
self.metrics = dict()
self.measurements = None
self.timestamp = 0
self.cpu_usage = 0
self.util = 0
self.usg_tt = 0
self.total_llc_occu = 0
self.llc_cnt = 0
self.history_depth = history_depth + 1
self.metrics_history = deque([], self.history_depth)
    '''
    Add metric data to the metrics history.
    The metrics history only keeps the most recent metrics data, bounded by
    self.history_depth; if the history grows beyond self.history_depth, the
    oldest entry is discarded.
    '''
def _update_metrics_history(self):
self.metrics_history.append(self.metrics.copy())
def _get_history_delta_by_Type(self, columnname):
length = len(self.metrics_history)
if length == 0:
return 0
if length == 1:
return self.metrics_history[length - 1][columnname]
data_sum = 0
for x in range(length - 1):
data_sum = data_sum + self.metrics_history[x][columnname]
data_delta = self.metrics_history[length - 1][columnname] -\
data_sum / (length - 1)
return data_delta
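    # Worked example: if the history for a column holds [10, 12, 17], the delta is
    # the latest value minus the average of the earlier ones: 17 - (10 + 12) / 2 = 6.0.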
def get_llcoccupany_delta(self):
return self._get_history_delta_by_Type(Metric.L3OCC)
def get_freq_delta(self):
return self._get_history_delta_by_Type(Metric.NF)
def get_latest_mbt(self):
if Metric.MB not in self.metrics:
return 0
return self.metrics[Metric.MB]
def get_metrics(self):
""" retrieve container platform metrics """
return self.metrics
def get_owca_metrics(self, app):
metrics = []
if self.metrics:
for met, val in self.metrics.items():
label_dict = dict(
task_id=self.cid
)
if app:
label_dict['application'] = app
metric = OwcaMetric(
name=met,
value=val,
labels=label_dict
)
metrics.append(metric)
return metrics
def update_measurement(self, timestamp: float,
measurements: Measurements, agg: bool):
"""
update measurements in current cycle and calculate metrics
"""
if self.cpu_usage != 0:
self.util = (measurements[MetricName.CPU_USAGE_PER_TASK] -
self.cpu_usage) * 100 / ((timestamp - self.usg_tt) * 1e9)
self.cpu_usage = measurements[MetricName.CPU_USAGE_PER_TASK]
self.usg_tt = timestamp
if measurements[MetricName.LLC_OCCUPANCY] > 0:
self.total_llc_occu += measurements[MetricName.LLC_OCCUPANCY]
self.llc_cnt += 1
if self.measurements and agg:
metrics = self.metrics
delta_t = timestamp - self.timestamp
metrics[Metric.CYC] = measurements[MetricName.CYCLES] -\
self.measurements[MetricName.CYCLES]
metrics[Metric.INST] = measurements[MetricName.INSTRUCTIONS] -\
self.measurements[MetricName.INSTRUCTIONS]
metrics[Metric.L3MISS] = measurements[MetricName.CACHE_MISSES] -\
self.measurements[MetricName.CACHE_MISSES]
metrics[Metric.MEMSTALL] = measurements[MetricName.MEMSTALL] -\
self.measurements[MetricName.MEMSTALL]
if self.llc_cnt == 0:
metrics[Metric.L3OCC] = 0
else:
metrics[Metric.L3OCC] = self.total_llc_occu / self.llc_cnt / 1024
self.total_llc_occu = 0
self.llc_cnt = 0
if metrics[Metric.INST] == 0:
metrics[Metric.CPI] = 0
metrics[Metric.L3MPKI] = 0
else:
metrics[Metric.CPI] = metrics[Metric.CYC] /\
metrics[Metric.INST]
metrics[Metric.L3MPKI] = metrics[Metric.L3MISS] * 1000 /\
metrics[Metric.INST]
metrics[Metric.MSPKI] = metrics[Metric.MEMSTALL] * 1000 /\
metrics[Metric.INST]
metrics[Metric.UTIL] = (measurements[MetricName.CPU_USAGE_PER_TASK]
- self.measurements[MetricName.CPU_USAGE_PER_TASK])\
* 100 / (delta_t * 1e9)
metrics[Metric.MB] = (measurements[MetricName.MEM_BW] -
self.measurements[MetricName.MEM_BW]) /\
1024 / 1024 / delta_t
if metrics[Metric.UTIL] == 0:
metrics[Metric.NF] = 0
else:
metrics[Metric.NF] = metrics[Metric.CYC] / delta_t / 10000 /\
metrics[Metric.UTIL]
self._update_metrics_history()
if not self.measurements or agg:
self.measurements = measurements
self.timestamp = timestamp
def _append_metrics(self, metrics, mname, mvalue):
metric = OwcaMetric(
name=mname,
value=mvalue,
labels=dict(
task_id=self.cid,
)
)
metrics.append(metric)
def _detect_in_bin(self, thresh):
owca_metrics = []
cond_res = []
metrics = self.metrics
unknown_reason = True
if metrics[Metric.CPI] > thresh['cpi']:
self._append_metrics(owca_metrics, Metric.CPI,
metrics[Metric.CPI])
self._append_metrics(owca_metrics, 'cpi_threshold', thresh['cpi'])
if metrics[Metric.L3MPKI] > thresh['mpki']:
log.info('Last Level Cache contention is detected:')
log.info('Latency critical container %s CPI = %f MPKI = %f \n',
self.cid, metrics[Metric.CPI], metrics[Metric.L3MPKI])
self._append_metrics(owca_metrics, Metric.L3MPKI,
metrics[Metric.L3MPKI])
self._append_metrics(owca_metrics, 'mpki_threshold',
thresh['mpki'])
cond_res.append(ContendedResource.LLC)
unknown_reason = False
if metrics[Metric.MSPKI] > thresh['mspki']:
log.info('Memory Bandwidth contention detected:')
log.info('Latency critical container %s CPI = %f MSPKI = %f \n',
self.cid, metrics[Metric.CPI], metrics[Metric.MSPKI])
self._append_metrics(owca_metrics, Metric.MSPKI,
metrics[Metric.MSPKI])
self._append_metrics(owca_metrics, 'mspki_threshold',
thresh['mspki'])
cond_res.append(ContendedResource.MEMORY_BW)
unknown_reason = False
if unknown_reason:
log.info('Performance is impacted by unknown reason:')
log.info('Latency critical container %s CPI exceeds threshold = %f',
self.cid, metrics[Metric.CPI])
cond_res.append(ContendedResource.UNKN)
return cond_res, owca_metrics
return [], owca_metrics
def tdp_contention_detect(self, tdp_thresh):
""" detect TDP contention in container """
owca_metrics = []
if not tdp_thresh:
return None, owca_metrics
metrics = self.metrics
log.debug('Current utilization = %f, frequency = %f, tdp utilization\
threshold = %f, tdp frequency bar = %f', metrics[Metric.UTIL],
metrics[Metric.NF], tdp_thresh['util'], tdp_thresh['bar'])
if metrics[Metric.UTIL] >= tdp_thresh['util'] and\
self.metrics[Metric.NF] < tdp_thresh['bar']:
log.info('TDP Contention Alert!')
self._append_metrics(owca_metrics, Metric.NF, metrics[Metric.NF])
self._append_metrics(owca_metrics, 'nf_threshold',
tdp_thresh['bar'])
self._append_metrics(owca_metrics, Metric.UTIL,
metrics[Metric.UTIL])
self._append_metrics(owca_metrics, 'util_threshold',
tdp_thresh['util'])
return ContendedResource.TDP, owca_metrics
return None, owca_metrics
def contention_detect(self, threshs):
""" detect resouce contention after find proper utilization bin """
if not threshs:
return [], []
metrics = self.metrics
for i in range(0, len(threshs)):
thresh = threshs[i]
if metrics[Metric.UTIL] < thresh['util_start']:
if i == 0:
return [], []
return self._detect_in_bin(threshs[i - 1])
if metrics[Metric.UTIL] >= thresh['util_start']:
if metrics[Metric.UTIL] < thresh['util_end'] or i == len(threshs) - 1:
return self._detect_in_bin(thresh)
def __str__(self):
metrics = self.metrics
return datetime.fromtimestamp(self.timestamp).isoformat() + ',' +\
self.cid + ',' + str(metrics[Metric.INST]) +\
',' + str(metrics[Metric.CYC]) + ',' +\
str(metrics[Metric.CPI]) + ',' + str(metrics[Metric.L3MPKI]) +\
',' + str(metrics[Metric.L3MISS]) + ',' +\
str(metrics[Metric.NF]) + ',' + str(metrics[Metric.UTIL]) +\
',' + str(metrics[Metric.L3OCC]) + ',' +\
str(metrics[Metric.MB]) + ',' + str(metrics[Metric.MSPKI]) + '\n'
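    # Hedged usage sketch (the measurement dicts, timestamps and threshold values
    # below are illustrative; real measurements come from OWCA keyed by MetricName):
    #
    #   c = Container('task-1')
    #   c.update_measurement(t0, measurements_at_t0, agg=True)
    #   c.update_measurement(t1, measurements_at_t1, agg=True)
    #   contended, owca_metrics = c.contention_detect(
    #       [{'util_start': 0, 'util_end': 100, 'cpi': 1.5, 'mpki': 10, 'mspki': 5}])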
|
py | b414035f8e337eeada3f8f999cc204ad29d15b6a | class ScalarCoercible:
cache_ok = True
def _coerce(self, value):
raise NotImplementedError
def coercion_listener(self, target, value, oldvalue, initiator):
return self._coerce(value)
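# Minimal subclass sketch (an assumption about intended use: concrete types only
# need to provide _coerce, which coercion_listener then applies when the attribute
# is set):
#
#   class LowercaseCoercible(ScalarCoercible):
#       def _coerce(self, value):
#           return value.lower() if isinstance(value, str) else value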
|
py | b414066de646790a0f88d3b880300cdf9b62b7b7 | # coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
from typing import Any, AsyncIterable, Callable, Dict, Generic, Optional, TypeVar
import warnings
from azure.core.async_paging import AsyncItemPaged, AsyncList
from azure.core.exceptions import ClientAuthenticationError, HttpResponseError, ResourceExistsError, ResourceNotFoundError, map_error
from azure.core.pipeline import PipelineResponse
from azure.core.pipeline.transport import AsyncHttpResponse, HttpRequest
from azure.mgmt.core.exceptions import ARMErrorFormat
from ... import models as _models
T = TypeVar('T')
ClsType = Optional[Callable[[PipelineResponse[HttpRequest, AsyncHttpResponse], T, Dict[str, Any]], Any]]
class AvailableResourceGroupDelegationsOperations:
"""AvailableResourceGroupDelegationsOperations async operations.
You should not instantiate this class directly. Instead, you should create a Client instance that
instantiates it for you and attaches it as an attribute.
:ivar models: Alias to model classes used in this operation group.
:type models: ~azure.mgmt.network.v2020_08_01.models
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An object model deserializer.
"""
models = _models
def __init__(self, client, config, serializer, deserializer) -> None:
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self._config = config
def list(
self,
location: str,
resource_group_name: str,
**kwargs: Any
) -> AsyncIterable["_models.AvailableDelegationsResult"]:
"""Gets all of the available subnet delegations for this resource group in this region.
:param location: The location of the domain name.
:type location: str
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: An iterator like instance of either AvailableDelegationsResult or the result of cls(response)
:rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.mgmt.network.v2020_08_01.models.AvailableDelegationsResult]
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.AvailableDelegationsResult"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2020-08-01"
accept = "application/json"
def prepare_request(next_link=None):
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
if not next_link:
# Construct URL
url = self.list.metadata['url'] # type: ignore
path_format_arguments = {
'location': self._serialize.url("location", location, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
request = self._client.get(url, query_parameters, header_parameters)
else:
url = next_link
query_parameters = {} # type: Dict[str, Any]
request = self._client.get(url, query_parameters, header_parameters)
return request
async def extract_data(pipeline_response):
deserialized = self._deserialize('AvailableDelegationsResult', pipeline_response)
list_of_elem = deserialized.value
if cls:
list_of_elem = cls(list_of_elem)
return deserialized.next_link or None, AsyncList(list_of_elem)
async def get_next(next_link=None):
request = prepare_request(next_link)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
return pipeline_response
return AsyncItemPaged(
get_next, extract_data
)
list.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Network/locations/{location}/availableDelegations'} # type: ignore
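    # Hedged usage sketch (assumes an authenticated azure-mgmt-network client that
    # exposes this operation group; the attribute and variable names below are
    # assumptions, not taken from this file):
    #
    #   delegations = client.available_resource_group_delegations.list(
    #       location="westeurope", resource_group_name="my-rg")
    #   async for delegation in delegations:
    #       print(delegation)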
|
py | b4140812a9e8416e4f092a549852f04e958e7f43 | ###############################################################################
# Caleydo - Visualization for Molecular Biology - http://caleydo.org
# Copyright (c) The Caleydo Team. All rights reserved.
# Licensed under the new BSD license, available at http://caleydo.org/license
###############################################################################
from builtins import str
from builtins import object
from . import plugin as p
import sys
ANONYMOUS = 'anonymous'
class User(object):
def __init__(self, id):
self.id = id
self.name = ANONYMOUS
self.roles = [ANONYMOUS]
def get_id(self):
return str(self.id)
@property
def is_authenticated(self):
return False
@property
def is_active(self):
return False
@property
def is_anonymous(self):
return self.name == ANONYMOUS
def has_role(self, role):
return role in self.roles
def __eq__(self, other):
"""
Checks the equality of two `UserMixin` objects using `get_id`.
"""
if isinstance(other, User):
return self.get_id() == other.get_id()
return NotImplemented
def __ne__(self, other):
"""
Checks the inequality of two `UserMixin` objects using `get_id`.
"""
equal = self.__eq__(other)
if equal is NotImplemented:
return NotImplemented
return not equal
if sys.version_info[0] != 2: # pragma: no cover
# Python 3 implicitly set __hash__ to None if we override __eq__
# We set it back to its default implementation
__hash__ = object.__hash__
ANONYMOUS_USER = User(ANONYMOUS)
class SecurityManager(object):
"""
a basic security manager
"""
def __init__(self):
pass
def login_required(self, f):
return f
def login(self, username, extra_fields={}):
"""logs the given user in
:returns the logged in user object or None if login failed
"""
return User('')
def logout(self):
"""
logs the current logged in user out
"""
pass
@property
def current_user(self):
"""
:returns the current logged in user
"""
return User('')
def is_authenticated(self):
"""whether the current user is authenticated
"""
return self.current_user.is_authenticated
def has_role(self, role):
"""whether the current use has the role
"""
return self.current_user.has_role(role)
def init_app(self, app):
pass
def add_login_routes(self, app):
pass
class DummyManager(SecurityManager):
"""
a dummy implementation of the security manager where everyone is authenticated
"""
def is_authenticated(self):
return True
def has_role(self, role):
return True
_manager = None
def manager():
"""
:return: the security manager
"""
global _manager
if _manager is None:
_manager = p.lookup('security_manager')
if _manager is None:
_manager = DummyManager()
return _manager
def is_logged_in():
return manager().is_authenticated()
def current_username():
u = manager().current_user
return u.name if hasattr(u, 'name') else ANONYMOUS
def current_user():
user = manager().current_user
if user.is_anonymous:
return ANONYMOUS_USER
return user
def login_required(f):
"""
Decorator for views that require login.
"""
return manager().login_required(f)
def init_app(app):
"""
initializes this app by for login mechanism
:param app:
:return:
"""
manager().init_app(app)
def add_login_routes(app):
"""
initializes this flask for providing access to /login and /logout
:param app:
:return:
"""
manager().add_login_routes(app)
PERMISSION_READ = 4
PERMISSION_WRITE = 2
PERMISSION_EXECUTE = 1
def to_number(p_set):
return (PERMISSION_READ if PERMISSION_READ in p_set else 0) + \
(PERMISSION_WRITE if PERMISSION_WRITE in p_set else 0) + \
(PERMISSION_EXECUTE if PERMISSION_EXECUTE in p_set else 0)
def to_string(p_set):
return ('r' if PERMISSION_READ in p_set else '-') + \
('w' if PERMISSION_WRITE in p_set else '-') + \
('x' if PERMISSION_EXECUTE in p_set else '-')
def _from_number(p):
r = set()
if p >= 4:
r.add(PERMISSION_READ)
p -= 4
if p >= 2:
r.add(PERMISSION_WRITE)
p -= 2
if p >= 1:
r.add(PERMISSION_EXECUTE)
return r
DEFAULT_PERMISSION = 744
def _decode(permission=DEFAULT_PERMISSION):
permission = int(permission)
others = _from_number(permission % 10)
group = _from_number((permission // 10) % 10)
user = _from_number((permission // 100) % 10)
buddies = _from_number((permission // 1000) % 10)
return user, group, others, buddies
def _is_equal(a, b):
if a == b:
return True
if not a or not b:
return False
a = a.lower()
b = b.lower()
return a == b
def _includes(items, item):
if not item:
return False
for check in items:
if _is_equal(check, item):
return True
return False
def can(item, permission, user=None):
if user is None:
user = current_user()
if not isinstance(item, dict):
# assume we have an object
item = {
'creator': getattr(item, 'creator', ANONYMOUS),
'buddies': getattr(item, 'buddies', []),
'group': getattr(item, 'group', ANONYMOUS),
'permissions': getattr(item, 'permissions', DEFAULT_PERMISSION)
}
owner, group, others, buddies = _decode(item.get('permissions', DEFAULT_PERMISSION))
# I'm the creator
if _is_equal(user.name, item.get('creator', ANONYMOUS)) and permission in owner:
return True
# check if I'm in the buddies list
if 'buddies' in item and _includes(item.get('buddies'), user.name) and permission in buddies:
return True
# check if I'm in the group
if 'group' in item and _includes(user.roles, item.get('group')) and permission in group:
return True
return permission in others
def can_read(data_description, user=None):
return can(data_description, PERMISSION_READ, user)
def can_write(data_description, user=None):
return can(data_description, PERMISSION_WRITE, user)
def can_execute(data_description, user=None):
return can(data_description, PERMISSION_EXECUTE, user)
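# Worked example (derived from the functions above): the default permission 744
# decodes to owner={read, write, execute}, group={read}, others={read}, buddies={}.
# So for an item like
#   item = {'creator': 'alice', 'group': 'team', 'permissions': 744}
# can_read(item) is True for any user, while can_write(item) is True only when the
# current user is the creator 'alice' (group members and everyone else lack write).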
|
py | b41409607dfb50e1375c1dcdd522e03fdc5cfd8e | from bokeh.io import output_file, show
from bokeh.models import ColumnDataSource, FactorRange
from bokeh.plotting import figure
from bokeh.transform import factor_cmap
output_file("bars.html")
fruits = ['Apples', 'Pears', 'Nectarines', 'Plums', 'Grapes', 'Strawberries']
years = ['2015', '2016', '2017']
data = {'fruits' : fruits,
'2015' : [2, 1, 4, 3, 2, 4],
'2016' : [5, 3, 3, 2, 4, 6],
'2017' : [3, 2, 4, 4, 5, 3]}
palette = ["#c9d9d3", "#718dbf", "#e84d60"]
# this creates [ ("Apples", "2015"), ("Apples", "2016"), ("Apples", "2017"), ("Pears", "2015), ... ]
x = [ (fruit, year) for fruit in fruits for year in years ]
counts = sum(zip(data['2015'], data['2016'], data['2017']), ()) # like an hstack
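# For example, sum(zip([2, 1], [5, 3], [3, 2]), ()) == (2, 5, 3, 1, 3, 2): the
# per-year counts are interleaved so each fruit's three yearly values sit adjacent.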
source = ColumnDataSource(data=dict(x=x, counts=counts))
p = figure(x_range=FactorRange(*x), plot_height=250, title="Fruit Counts by Year",
toolbar_location=None, tools="")
p.vbar(x='x', top='counts', width=0.9, source=source, line_color="white",
fill_color=factor_cmap('x', palette=palette, factors=years, start=1, end=2))
p.y_range.start = 0
p.x_range.range_padding = 0.1
p.xaxis.major_label_orientation = 1
p.xgrid.grid_line_color = None
show(p)
|
py | b4140a1bb77f0df5c91a51e310472a84b81c499b | from random import randint
class ListItem:
@staticmethod
def list_item(series):
length = len(series)
position = randint(0, length-1)
return series[position]
|
py | b4140b017b9113bc062be98e0b8911ea0eeb6a94 | from knowledge_model import Base, Knowledge
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
engine = create_engine('sqlite:///knowledge.db')
Base.metadata.create_all(engine)
DBSession = sessionmaker(bind=engine)
session = DBSession()
def add_article(topic,title,rating):
article=Knowledge(topic=topic, title=title, rating=rating)
session.add(article)
session.commit()
def query_all_articles():
knowledge=session.query(Knowledge).all()
return knowledge
def query_article_by_topic(topic):
knowledge=session.query(Knowledge).filter_by(topic=topic)
return knowledge
def query_article_by_rating(threshold):
    knowledge = session.query(Knowledge).filter(Knowledge.rating < threshold).all()
    return knowledge
def delete_article_by_topic(topic):
session.query(Knowledge).filter_by(topic=topic).delete()
session.commit()
def delete_all_articles():
session.query(Knowledge).delete()
session.commit()
def edit_article_rating(update_rating,article_title):
art_rate=session.query(Knowledge).filter_by(topic=article_title).first()
art_rate.rating=update_rating
session.commit()
def delete_article_by_rating(threshold):
session.query(Knowledge).filter(Knowledge.rating<threshold).delete()
session.commit()
def get_top_rated_5():
    # return the five highest-rated articles (behaviour inferred from the function name)
    knowledge = query_all_articles()
    return sorted(knowledge, key=lambda article: article.rating, reverse=True)[:5]
delete_all_articles()
#add_article("birds", "birds are nice", 7)
#add_article("idk", "idk, google it",4)
#edit_article_rating(10,"birds")
#delete_article_by_topic("birds")
print(query_all_articles())
|
py | b4140b75d1b08ba825961d0d314e82c3fe65af77 | from .datagram import *
from .frame import *
|
py | b4140bd54e31ba3964d2575d07f53bff6e7e0aa1 | #!/usr/bin/python3
# -*- coding: utf-8 -*-
from zenipy import entry
import os
import datetime
user = 'adewinter'
home_dir = os.path.join('/', 'home', user)
data_dir = os.path.join(home_dir, '.flowping')
def get_placeholder_from_line(line):
fields = line.split(',')
    if len(fields) > 0:
        return fields[-1].strip()
    else:
        return ''
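# Example: get_placeholder_from_line("2020-01-01 10:00:00,Taking over the world\n")
# returns "Taking over the world" (the last comma-separated field, stripped).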
def start():
try:
# Create target Directory
os.mkdir(data_dir)
print("Directory", data_dir, "Created.")
except FileExistsError:
pass
file_path = os.path.join(data_dir, 'data.csv')
if not (os.path.exists(file_path)):
f = open(file_path, 'w')
f.close()
read_file = open(file_path, 'r+')
lines = read_file.readlines()
if (lines.__len__() == 0):
placeholder = 'Taking over the world'
else:
placeholder = get_placeholder_from_line(lines[-1])
read_file.close()
append_file = open(file_path, 'a+')
result = entry(text="What are you working on?", placeholder=placeholder)
now = datetime.datetime.now()
data_to_append = f"{now},{result}\n"
print(f"Appending {data_to_append} to datefile: {file_path}.")
append_file.write(data_to_append)
append_file.close()
if __name__ == "__main__":
start()
|
py | b4140c71ae6668f407ac8b40556b398a76aad315 | # Lint as: python2, python3
# Copyright 2018 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Tests for task_scheduler."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import os
from lingvo.core import early_stop
from lingvo.core import task_scheduler
from lingvo.core import test_utils
import numpy as np
from six.moves import range
import tensorflow as tf
_NUMPY_RANDOM_SEED = 9885784
class SchedulerTests(test_utils.TestCase):
def _TestSchedulerHelper(self, schedule, global_step, count_a):
np.random.seed(_NUMPY_RANDOM_SEED)
task_counts = {'a': 0, 'b': 0}
for _ in range(100):
task = schedule.Sample(global_step)
task_counts[task] += 1
self.assertEqual(task_counts['a'], count_a)
self.assertEqual(task_counts['b'], 100 - count_a)
def testConstantScheduler(self):
"""Approximate expected probabilities: (a:0.8, b:0.2)."""
p = task_scheduler.ConstantScheduler.Params()
p.task_probs = [('a', 0.8), ('b', 0.2)]
schedule = p.Instantiate()
self._TestSchedulerHelper(schedule, 0, 83)
def testExponentialScheduler(self):
"""Test exponential scheduler.
Approximate probabilities:
t=0: (a:0, b:1)
t=1e5: (a:0.63, b:0.37)
t=1e10: (a:1, b:0)
"""
p = task_scheduler.ExponentialScheduler.Params()
p.alpha = 1e-5
p.task_probs = [('a', (0, 1)), ('b', (1, 0))]
schedule = p.Instantiate()
self._TestSchedulerHelper(schedule, global_step=0, count_a=0)
self._TestSchedulerHelper(schedule, global_step=1e5, count_a=63)
self._TestSchedulerHelper(schedule, global_step=1e10, count_a=100)
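    # The expected counts above are consistent with each task weight interpolating
    # exponentially from the first tuple entry towards the second,
    # w(t) = target + (start - target) * exp(-alpha * t); e.g. for task 'a' at t=1e5:
    #   1 - exp(-1e-5 * 1e5) = 1 - exp(-1) ~= 0.632
    # (a reading of the test expectations, not a statement of the scheduler internals).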
def testSigmoidScheduler(self):
"""Test sigmoid scheduler.
Approximate probabilities:
t=0: (a:0.5, b:0.5)
t=1e5: (a:0.73, b:0.27)
t=1e10: (a:1, b:0)
"""
p = task_scheduler.SigmoidScheduler.Params()
p.alpha = 1e-5
p.task_probs = [('a', (0.5, 1)), ('b', (0.5, 0))]
schedule = p.Instantiate()
self._TestSchedulerHelper(schedule, global_step=0, count_a=54)
self._TestSchedulerHelper(schedule, global_step=1e5, count_a=73)
self._TestSchedulerHelper(schedule, global_step=1e10, count_a=100)
def _setupTestAdaptiveScheduler(self, p):
logdir = tf.test.get_temp_dir()
tf.gfile.MkDir(os.path.join(logdir, 'decoder_dev_a'))
tf.gfile.MkDir(os.path.join(logdir, 'decoder_dev_b'))
early_stop.MetricHistory.SetLogdirInMetricHistories(p, logdir)
p.epsilon = 0.05
p.tasks = ['a', 'b']
p.expected = [0.3, 0.5]
mh_a = early_stop.MetricHistory.Params()
mh_a.jobname = 'decoder_dev_a'
mh_a.metric = 'corpus_bleu'
mh_a.logdir = logdir
mh_a.local_filesystem = True
mh_b = early_stop.MetricHistory.Params()
mh_b.jobname = 'decoder_dev_b'
mh_b.metric = 'corpus_bleu'
mh_b.logdir = logdir
mh_b.local_filesystem = True
p.mh_a = mh_a
p.mh_b = mh_b
schedule = p.Instantiate()
early_stop.MetricHistory.ConditionalAppend(mh_a.jobname, mh_a.metric, 1,
0.05)
early_stop.MetricHistory.ConditionalAppend(mh_b.jobname, mh_b.metric, 1,
0.25)
return schedule
def testSimpleAdaptiveScheduler(self):
"""Test simple adaptive schedule.
Probability of task a:
    (1.05 - 0.05/0.3) / ((1.05 - 0.05/0.3) + (1.05 - 0.25/0.5)) ≈ 0.616
"""
np.random.seed(_NUMPY_RANDOM_SEED)
p = task_scheduler.SimpleAdaptiveScheduler.Params()
schedule = self._setupTestAdaptiveScheduler(p)
self._TestSchedulerHelper(schedule, 0, 63)
def testInverseRatioAdaptiveScheduler(self):
"""Test simple adaptive schedule.
Probability of task a:
    1.05/(13/60.) / (1.05/(13/60.) + 1.05/(11/20.)) ≈ 0.717
"""
np.random.seed(_NUMPY_RANDOM_SEED)
p = task_scheduler.InverseRatioAdaptiveScheduler.Params()
schedule = self._setupTestAdaptiveScheduler(p)
self._TestSchedulerHelper(schedule, 0, 71)
def testRoundRobinScheduler(self):
"""Test round-robin scheduler."""
p = task_scheduler.RoundRobinScheduler.Params()
p.tasks = ['a', 'b']
schedule = p.Instantiate()
for global_step in range(20):
task = schedule.Sample(global_step)
if global_step % 2 == 0:
self.assertEqual('a', task)
else:
self.assertEqual('b', task)
if __name__ == '__main__':
tf.test.main()
|
py | b4140c98f33885f00625952092c27397fdc4ad90 | """Urls for the Zinnia sitemap"""
from django.urls import path
from zinnia.views.sitemap import Sitemap
urlpatterns = [
path('', Sitemap.as_view(),
name='sitemap'),
]
|
py | b4140dfd40b15c1db62e16a4c0bc203aa4aff15a | from __future__ import print_function
try:
input = raw_input
except NameError:
pass
import argparse
import pc_lib_api
import pc_lib_general
import json
import pandas
import time
import sys
from datetime import datetime, date
# --Execution Block-- #
# --Parse command line arguments-- #
parser = argparse.ArgumentParser(prog='rltoolbox')
parser.add_argument(
'-u',
'--username',
type=str,
help='*Required* - Prisma Cloud API Access Key ID that you want to set to access your Prisma Cloud account.')
parser.add_argument(
'-p',
'--password',
type=str,
help='*Required* - Prisma Cloud API Secret Key that you want to set to access your Prisma Cloud account.')
parser.add_argument(
'-url',
'--uiurl',
type=str,
help='*Required* - Base URL used in the UI for connecting to Prisma Cloud. '
'Formatted as app.prismacloud.io or app2.prismacloud.io or app.eu.prismacloud.io, etc. '
'You can also input the api version of the URL if you know it and it will be passed through.')
parser.add_argument(
'-y',
'--yes',
action='store_true',
help='(Optional) - Override user input for verification (auto answer for yes).')
parser.add_argument(
'--detailed',
action='store_true',
help='(Optional) - Detailed alerts response.')
parser.add_argument(
'--matrixmode',
action='store_true',
help='(Optional) - Print out JSON responses.')
parser.add_argument(
'-fas',
'--alertstatus',
type=str,
help='(Optional) - Filter - Alert Status.')
parser.add_argument(
'-aid',
'--alertid',
type=str,
help='(Optional) - Filter - Alert ID.')
parser.add_argument(
'-fpt',
'--policytype',
type=str,
help='(Optional) - Filter - Policy Type.')
parser.add_argument(
'-fpcs',
'--policycomplianceStandard',
type=str,
help='(Optional) - Filter - Policy Compliance Standard.')
parser.add_argument(
'-fps',
'--policyseverity',
type=str,
help='(Optional) - Filter - Policy Severity.')
parser.add_argument(
'-fct',
'--cloudtype',
type=str,
help='(Optional) - Filter - Cloud Type.')
parser.add_argument(
'-fca',
'--cloudaccount',
type=str,
help='(Optional) - Filter - Cloud Account.')
parser.add_argument(
'-fcaid',
'--cloudaccountid',
type=str,
help='(Optional) - Filter - Cloud Account ID.')
parser.add_argument(
'-fcr',
'--cloudregion',
type=str,
help='(Optional) - Filter - Cloud Region.')
parser.add_argument(
'-tr',
'--timerange',
type=int,
default=30,
help='(Optional) - Time Range in days. Defaults to 30.')
parser.add_argument(
'-l',
'--limit',
type=int,
default=500,
help='(Optional) - Return values limit (Default to 500).')
parser.add_argument(
'-fagt',
'--accountgroup',
type=str,
help='(Optional) - Filter - Account Group.')
parser.add_argument(
'-fpid',
'--policyid',
type=str,
help='(Optional) - Filter - Policy ID.')
parser.add_argument(
'-frid',
'--resourceid',
type=str,
help='(Optional) - Filter - Resource ID.')
args = parser.parse_args()
# --End parse command line arguments-- #
print('User login')
pc_settings = pc_lib_general.pc_login_get(args.username, args.password, args.uiurl)
print('Done')
# Verification (override with -y)
if not args.yes:
print()
    print('Ready to execute commands against your Prisma Cloud tenant.')
verification_response = str(input('Would you like to continue (y or yes to continue)?'))
continue_response = {'yes', 'y'}
print()
if verification_response not in continue_response:
pc_lib_general.pc_exit_error(400, 'Verification failed due to user response. Exiting...')
#----------------------------------------------------------------------
print('API - Data Call 1 - Getting authentication token..')
pc_settings = pc_lib_api.pc_jwt_get(pc_settings)
print('Done.')
#current time/date utilized in CSV output filesnames
now = datetime.now().strftime("%m_%d_%Y-%I_%M_%p")
print ('Cloud Type Specified in CLI =', args.cloudtype)
print('Building the filters for JSON package.')
alerts_filter = {}
if args.detailed:
alerts_filter['detailed'] = True
else:
alerts_filter['detailed'] = False
alerts_filter['timeRange'] = {}
alerts_filter['timeRange']['type'] = "relative"
alerts_filter['timeRange']['value'] = {}
alerts_filter['timeRange']['value']['unit'] = "day"
alerts_filter['timeRange']['value']['amount'] = args.timerange
alerts_filter['sortBy'] = ["id:asc"]
alerts_filter['offset'] = 0
alerts_filter['limit'] = args.limit
alerts_filter['filters'] = []
if args.alertstatus is not None:
temp_filter = {}
temp_filter['operator'] = "="
temp_filter['name'] = "alert.status"
temp_filter['value'] = args.alertstatus
alerts_filter['filters'].append(temp_filter)
if args.alertid is not None:
temp_filter = {}
temp_filter['operator'] = "="
temp_filter['name'] = "alert.id"
temp_filter['value'] = args.alertid
alerts_filter['filters'].append(temp_filter)
if args.cloudaccount is not None:
temp_filter = {}
temp_filter['operator'] = "="
temp_filter['name'] = "cloud.account"
temp_filter['value'] = args.cloudaccount
alerts_filter['filters'].append(temp_filter)
if args.cloudregion is not None:
temp_filter = {}
temp_filter['operator'] = "="
temp_filter['name'] = "cloud.region"
temp_filter['value'] = args.cloudregion
alerts_filter['filters'].append(temp_filter)
if args.accountgroup is not None:
temp_filter = {}
temp_filter['operator'] = "="
temp_filter['name'] = "account.group"
temp_filter['value'] = args.accountgroup
alerts_filter['filters'].append(temp_filter)
if args.cloudaccountid is not None:
temp_filter = {}
temp_filter['operator'] = "="
temp_filter['name'] = "cloud.accountId"
temp_filter['value'] = args.cloudaccountid
alerts_filter['filters'].append(temp_filter)
if args.cloudtype is not None:
temp_filter = {}
temp_filter['operator'] = "="
temp_filter['name'] = "cloud.type"
temp_filter['value'] = args.cloudtype
alerts_filter['filters'].append(temp_filter)
if args.policytype is not None:
temp_filter = {}
temp_filter['operator'] = "="
temp_filter['name'] = "policy.type"
temp_filter['value'] = args.policytype
alerts_filter['filters'].append(temp_filter)
if args.policycomplianceStandard is not None:
temp_filter = {}
temp_filter['operator'] = "="
temp_filter['name'] = "policy.complianceStandard"
temp_filter['value'] = args.policycomplianceStandard
alerts_filter['filters'].append(temp_filter)
if args.policyseverity is not None:
temp_filter = {}
temp_filter['operator'] = "="
temp_filter['name'] = "policy.severity"
temp_filter['value'] = args.policyseverity
alerts_filter['filters'].append(temp_filter)
if args.policyid is not None:
temp_filter = {}
temp_filter['operator'] = "="
temp_filter['name'] = "policy.id"
temp_filter['value'] = args.policyid
alerts_filter['filters'].append(temp_filter)
if args.resourceid is not None:
    temp_filter = {}
    temp_filter['operator'] = "="
    temp_filter['name'] = "resource.id"
    temp_filter['value'] = args.resourceid
    alerts_filter['filters'].append(temp_filter)
print('Done building filters specified in CLI.')
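# For illustration, running with `-fct aws -fps high -tr 7` (and defaults otherwise)
# produces roughly this payload:
#   {"detailed": false,
#    "timeRange": {"type": "relative", "value": {"unit": "day", "amount": 7}},
#    "sortBy": ["id:asc"], "offset": 0, "limit": 500,
#    "filters": [{"operator": "=", "name": "cloud.type", "value": "aws"},
#                {"operator": "=", "name": "policy.severity", "value": "high"}]}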
#----------------------------------------------------------------------
print('API - Data Call 2 - Plugging in filters, a granular JSON response is now being prepared by Prisma. A job ID will be provided.')
pc_settings, response_package = pc_lib_api.api_async_alerts_job(pc_settings, data=alerts_filter)
alerts_job_number = response_package
print('Putting JSON response inside a dataframe - job_id')
job_id_json = pandas.json_normalize(alerts_job_number) #put json inside a dataframe
#Grab the job ID, which we will plug into a URL; this will allow a user to check status and download. We first must convert it to a string (for URL schema purposes) and then strip unnecessary characters.
job_id_string = job_id_json['data.id'].to_string()
#For the job ID, will remove the first 5 characters since JSON pulls characters not relevant to the job ID.
job_id = job_id_string[5:]
print('Our job number is', job_id)
#----------------------------------------------------------------------
print('API - Data Call 3 - Using the job ID, we can now plug this into a URL to track status updates for alerts job.')
pc_settings, response_package = pc_lib_api.api_async_alerts_job_status(pc_settings, job_id)
alerts_job_status = response_package
if args.matrixmode == True:
print(alerts_job_status)
else:
print('Done')
print('Putting JSON response inside a dataframe - alert_status')
jobs_status_json = pandas.json_normalize(alerts_job_status)
print('Done')
#Before using this status check in the "IF" "ELSE" section below, we first have to convert the data to a string in order to help strip unnecessary characters.
jobs_status_string = jobs_status_json['data.status'].to_string()
#For the status, will remove the first 5 characters since it pulls characters not relevant to the status.
status_check = jobs_status_string[5:]
test = status_check.split()
print('Now let\'s create a loop to continuously check on job status. Once the status changes from "IN_PROGRESS" to "READY_TO_DOWNLOAD", we will break the loop.')
for boston in test:
while status_check == "IN_PROGRESS":
print('Please wait, alert data job still in progress. Once status changes from "IN_PROGRESS" to "READY_TO_DOWNLOAD", this message will disappear and the code will proceed to the next step. Retries occur every 60 seconds:')
for i in range(60,0,-1):
sys.stdout.write(str(i)+' ')
sys.stdout.flush()
time.sleep(1)
pc_settings, response_package = pc_lib_api.api_async_alerts_job_status(pc_settings, job_id)
alerts_job_status1 = response_package
jobs_status_json1 = pandas.json_normalize(alerts_job_status1)
jobs_status_string1 = jobs_status_json1['data.status'].to_string()
status_check1 = jobs_status_string1[5:]
if status_check1 == "READY_TO_DOWNLOAD":
break
print('Job is now ready for download.')
#----------------------------------------------------------------------
print('API - Data Call 4 - Downloading the list of alerts as a JSON response.')
pc_settings, response_package = pc_lib_api.api_async_alerts_job_download(pc_settings, job_id)
alerts_job_download = response_package
print('Done grabbing the JSON')
if args.matrixmode == True:
print(alerts_job_download)
else:
print('Done')
print('Putting JSON response inside a dataframe - async_alerts_list')
#data simply refers to the topmost JSON key you want to begin sorting by in the JSON response.
rr = pandas.json_normalize(alerts_job_download['data'])
print('Done')
#----------------------------------------------------------------------
#In order to get an RQL query column populated and mapped to specific alerts, we first need to combine a policy list response and a saved search list response. The alerts response has a policyId field which we can map to this combined response to extract the associated RQL (if applicable).
print('API - Data Call 5 - Getting current policy list, this will help tie alerts to an RQL query. "rule.criteria" to "id" mapping from saved search in data call 2')
pc_settings, response_package = pc_lib_api.api_policy_v2_list_get_enabled(pc_settings)
policy_v2_list = response_package['data']
print('Done')
if args.matrixmode == True:
print(policy_v2_list)
else:
print('Done')
#put json inside a dataframe
pu = pandas.json_normalize(policy_v2_list)
print('Putting JSON response inside dataframe - policy_v2_list')
print('Done')
#----------------------------------------------------------------------
print('API - Data Call 6 - Getting saved search history list, this will help tie alerts to an RQL query. "id" to "rule.criteria" mapping from policy in data call 1')
pc_settings, response_package = pc_lib_api.api_search_get_all(pc_settings)
saved_searches = response_package['data']
if args.matrixmode == True:
print(saved_searches)
else:
print('Done')
pu2 = pandas.json_normalize(saved_searches)
print('Putting JSON response inside dataframe - saved_searches')
print('Done')
print('API - Mapping "policy" and "saved search (saved)" dataframes before appending RQL "query" column')
pu['query'] = pu['rule.criteria'].map(pu2.set_index('id')['query'])
print('Done')
#----------------------------------------------------------------------
print('API - Data Call 7 - Getting saved search history list, this will help tie alerts to a custom RQL query. "id" to "rule.criteria" mapping from policy in data call 1')
pc_settings, response_package = pc_lib_api.api_search_get_all_recent(pc_settings)
saved_searches_recent = response_package['data']
if args.matrixmode == True:
print(saved_searches_recent)
else:
print('Done')
pu3 = pandas.json_normalize(saved_searches_recent)
print('Putting JSON response inside dataframe - saved_searches_recent')
print('Done')
print('API - Mapping "policy" and "saved search (recent)" dataframes before appending RQL "query" column')
pu['custom_query'] = pu['rule.criteria'].map(pu3.set_index('id')['query'])
print('Done')
#----------------------------------------------------------------------
print('API - Data Call 8 - Getting alerts list. "policy.policyid" to "policyId" mapping from policy in data call 1. The more days pulled, the longer this step will take. If this times out with a 504 server-side error, apply more filters or lower the days pulled. If a policy ID error pops up, please check that an alert even exists for your specified time range in the UI. Please wait...')
pc_settings, response_package = pc_lib_api.api_alert_v2_list_get(pc_settings, data=alerts_filter)
alerts_list = response_package['data']
#print(args.matrixmode)
if args.matrixmode == True:
print(alerts_list)
else:
print('Done')
type = args.cloudtype
#----------------------------------------------------------------------
#Now that the query column from the Saved Search response has been merged into the policy dataframe, the next step is to map the policy dataframe to the alerts dataframe (policy ID is the index). Once mapped, one can associate the "query" from the saved search with a specific alert.
print('API - Mapping "policy" dataframe with appended RQL column to "alerts" data frame. This will allow the script to add the query column to the alerts dump.')
rr['query'] = rr['policy.policyId'].map(pu.set_index('policyId')['query'])
rr['custom_query'] = rr['policy.policyId'].map(pu.set_index('policyId')['custom_query'])
print('Done')
print ('Converting the main time stamp column in dataframe to time/date. By default, Prisma Cloud stores the time stamp in Unix epoch time. This code will also convert the default time zone from Coordinated Universal Time (UTC) to Chicago/Central Time (CDT).')
rr['alertTime']=(pandas.to_datetime(rr['alertTime'],unit='ms')).apply(lambda x: x.tz_localize('UTC').tz_convert('America/Chicago'))
column_exist_check = "investigateOptions.searchId" in rr
print ('Check on whether any investigation links were provided in the JSON response: ' + str(column_exist_check))
print('Done')
print ('Assembling columns specific to AWS or GCP, this includes all tag/label information pulled in from ServiceNow(SNOW). If tags/labels from SNOW exists for a specific alert in Prisma Cloud, they will show up in the CSV.')
#Specifies which columns to grab in this lite version. AWS uses "tags" and GCP uses "labels" so we must be sure the correct column names are called. The columns below can be swapped out for anything found in the JSON response ("rr" in this case). The condition check above is for the investigate column, which isn't always populated with data.
if args.cloudtype == "gcp":
if column_exist_check == True:
gcp_LITE_FIELDS = ["id", "status", "alertTime", "policy.severity", "policy.name", "policy.policyId", "policy.policyType", "policy.recommendation", "policy.labels", "resource.cloudType", "resource.cloudAccountGroups", "resource.resourceType", "resource.resourceApiName", "resource.accountId", "resource.rrn", "resource.name", "resource.region", "resource.regionId", "resource.data.labels.owner", "resource.data.labels.owner_email","resource.data.labels.contact_email", "resource.data.payload.authenticationInfo.principalEmail", "resource.data.labels.business_service", "resource.data.labels.environment","resource.data.labels.business_unit", "resource.data.labels.name", "resource.data.status", "investigateOptions.searchId", "query", "custom_query"]
#Reindex, if one of our columns is empty the code will proceed and not error out.
rr2 = rr.reindex(columns=gcp_LITE_FIELDS)
rr2.loc[rr2['investigateOptions.searchId'].notnull(), 'investigateOptions.searchId'] = rr2['investigateOptions.searchId'].apply(lambda x: "{}{}".format('https://app3.prismacloud.io/investigate?searchId=', x))
#rr2.loc[rr2['investigateOptions.searchId'].isnull(), 'investigateOptions.searchId'] =
#We can specify additional parameters in the post processing. Data_Format, provides the time format for the AlertTime column. Index=false, removes the 1st column of numbers (index).
rr2.to_csv('%s_alerts_output_{}.csv'.format(now) % type, sep=',', encoding='utf-8', index=False, date_format='%m-%d-%y || %I:%M:%S %p CDT%z')
else:
gcp_LITE_FIELDS = ["id", "status", "alertTime", "policy.severity", "policy.name", "policy.policyId", "policy.policyType", "policy.recommendation", "policy.labels", "resource.cloudType", "resource.cloudAccountGroups", "resource.resourceType", "resource.resourceApiName", "resource.accountId", "resource.rrn", "resource.name", "resource.region", "resource.regionId", "resource.data.labels.owner", "resource.data.labels.owner_email","resource.data.labels.contact_email", "resource.data.payload.authenticationInfo.principalEmail", "resource.data.labels.business_service", "resource.data.labels.environment","resource.data.labels.business_unit", "resource.data.labels.name", "resource.data.status", "query", "custom_query"]
#Reindex, if one of our columns is empty the code will proceed and not error out.
rr2 = rr.reindex(columns=gcp_LITE_FIELDS)
#We can specify additional parameters in the post processing. Data_Format, provides the time format for the AlertTime column. Index=false, removes the 1st column of numbers (index).
rr2.to_csv('%s_alerts_output_{}.csv'.format(now) % type, sep=',', encoding='utf-8', index=False, date_format='%m-%d-%y || %I:%M:%S %p CDT%z')
if args.cloudtype == "aws":
if column_exist_check == True:
aws_LITE_FIELDS = ["id", "status", "alertTime", "policy.severity", "policy.name", "policy.policyId", "policy.policyType", "policy.recommendation", "policy.labels", "resource.cloudType", "resource.cloudAccountGroups", "resource.resourceType", "resource.resourceApiName", "resource.account", "resource.rrn", "resource.id", "resource.name", "resource.region", "resource.regionId", "resource.data.tagSets.Owner", "resource.data.tagSets.OwnerEmail", "resource.data.tagSets.ContactEmail","resource.data.tagSets.TechnicalService", "resource.data.tagSets.BusinessService","resource.data.tagSets.Environment","resource.data.tagSets.BusinessUnit", "investigateOptions.searchId", "query", "custom_query"]
#Reindex, if one of our columns is empty the code will proceed and not error out.
rr2 = rr.reindex(columns=aws_LITE_FIELDS)
rr2.loc[rr2['investigateOptions.searchId'].notnull(), 'investigateOptions.searchId'] = rr2['investigateOptions.searchId'].apply(lambda x: "{}{}".format('https://app3.prismacloud.io/investigate?searchId=', x))
#rr2.loc[rr2['investigateOptions.searchId'].isnull(), 'investigateOptions.searchId'] =
#We can specify additional parameters in the post processing. Data_Format, provides the time format for the AlertTime column. Index=false, removes the 1st column of numbers (index).
rr2.to_csv('%s_alerts_output_{}.csv'.format(now) % type, sep=',', encoding='utf-8', index=False, date_format='%m-%d-%y || %I:%M:%S %p CDT%z')
else:
aws_LITE_FIELDS = ["id", "status", "alertTime", "policy.severity", "policy.name", "policy.policyId", "policy.policyType", "policy.recommendation", "policy.labels", "resource.cloudType", "resource.cloudAccountGroups", "resource.resourceType", "resource.resourceApiName", "resource.account", "resource.rrn", "resource.id", "resource.name", "resource.region", "resource.regionId", "resource.data.tagSets.Owner", "resource.data.tagSets.OwnerEmail", "resource.data.tagSets.ContactEmail","resource.data.tagSets.TechnicalService", "resource.data.tagSets.BusinessService","resource.data.tagSets.Environment","resource.data.tagSets.BusinessUnit", "query", "custom_query"]
#Reindex, if one of our columns is empty the code will proceed and not error out.
rr2 = rr.reindex(columns=aws_LITE_FIELDS)
#We can specify additional parameters in the post processing. Data_Format, provides the time format for the AlertTime column. Index=false, removes the 1st column of numbers (index).
rr2.to_csv('%s_alerts_output_{}.csv'.format(now) % type, sep=',', encoding='utf-8', index=False, date_format='%m-%d-%y || %I:%M:%S %p CDT%z')
print('Done')
print('Saving JSON contents as a CSV...')
print('Done.')
|
py | b4140e7fff63ea9ba0be1451ae9d7cf5b3b7c48f | import _plotly_utils.basevalidators
class OpacitysrcValidator(_plotly_utils.basevalidators.SrcValidator):
def __init__(
self,
plotly_name='opacitysrc',
parent_name='scatterternary.marker',
**kwargs
):
super(OpacitysrcValidator, self).__init__(
plotly_name=plotly_name,
parent_name=parent_name,
edit_type='none',
role='info',
**kwargs
)
|
py | b4140eb4e91d7a2f72053eaa11f026bc6f78dca8 | from .exceptions import HetznerActionException
from .locations import HetznerCloudLocation
from .shared import _get_results
class HetznerCloudDatacentersAction(object):
def __init__(self, config):
self._config = config
def get_all(self, name=None):
status_code, results = _get_results(self._config, "datacenters",
url_params={"name": name} if name is not None else None)
if status_code != 200:
raise HetznerActionException(results)
for result in results["datacenters"]:
yield HetznerCloudDatacenter._load_from_json(result)
def get(self, id):
status_code, results = _get_results(self._config, "datacenters/%s" % id)
if status_code != 200:
raise HetznerActionException(results)
return HetznerCloudDatacenter._load_from_json(results["datacenter"])
class HetznerCloudDatacenter(object):
def __init__(self):
self.id = 0
self.name = ""
self.description = ""
self.location = None
self.supported_server_types = []
self.available_server_types = []
@staticmethod
def _load_from_json(json):
dc = HetznerCloudDatacenter()
dc.id = int(json["id"])
dc.name = json["name"]
dc.description = json["description"]
dc.location = HetznerCloudLocation._load_from_json(json["location"])
dc.supported_server_types = json["server_types"]["supported"]
dc.available_server_types = json["server_types"]["available"]
return dc
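# Hedged usage sketch (the configuration object comes from elsewhere in this
# library, so `config` below is assumed to be already set up with an API token):
#
#   datacenters = HetznerCloudDatacentersAction(config)
#   for dc in datacenters.get_all():
#       print(dc.id, dc.name, dc.description)
#   one_dc = datacenters.get(1)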
|
py | b4140f525ee793e23a06ad699dbc65623706ce28 | # Copyright 2010-2018 Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
# http://aws.amazon.com/apache2.0/
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
import boto3
# Create SES client
ses = boto3.client('ses')
response = ses.delete_template(
TemplateName = 'TEMPLATE_NAME'
)
print(response)
#snippet-sourcedescription:[ses_deletetemplate.py demonstrates how to remove a specific email template.]
#snippet-keyword:[Python]
#snippet-keyword:[AWS SDK for Python (Boto3)]
#snippet-keyword:[Code Sample]
#snippet-keyword:[Amazon Simple Email Service]
#snippet-service:[ses]
#snippet-sourcetype:[full-example]
#snippet-sourcedate:[2018-08-11]
#snippet-sourceauthor:[tapasweni-pathak]
|
py | b4141050bb8f7bc968162afa9cf0b000837e00f6 | import traceback
import sys
class Check(object):
def __init__(self):
self.errors = []
self.skipped = False
super(Check, self).__init__()
def log_error(self, error, message, detail=None, strip=4):
"Add an error message and optional user message to the error list"
if message:
msg = message + ": " + error
else:
msg = error
tb = traceback.format_stack()
if sys.version_info >= (3, 0):
tb = tb[:-strip]
else:
tb = tb[strip:]
self.errors.append({
'message': msg,
'traceback': tb,
'detail': detail
})
def error_message(self):
"Get a single error message string with all errors joined"
return ", ".join([e['message'] for e in self.errors])
def equal(self, a, b, message=None):
"Check if two values are equal"
if a != b:
self.log_error("{} != {}".format(str(a), str(b)), message)
return False
return True
def not_equal(self, a, b, message=None):
"Check if two values are not equal"
if a == b:
self.log_error("{} == {}".format(str(a), str(b)), message)
return False
return True
def is_none(self, a, message=None):
"Check if a value is None"
if a is not None:
self.log_error("{} is not None".format(str(a)), message)
return False
return True
def is_not_none(self, a, message=None):
"Check if a value is not None"
if a is None:
self.log_error("{} is None".format(str(a)), message)
return False
return True
def is_true(self, a, message=None):
"Check if a value is True"
if not a:
self.log_error("{} is False".format(str(a)), message)
return False
return True
def is_false(self, a, message=None):
"Check if a value is False"
if a:
self.log_error("{} is True".format(str(a)), message)
return False
return True
def fail(self, message):
"Just log an error message"
self.log_error(message, None)
return False
def raises(self, exception_type, function, *args, **kwargs):
"""
Check if a function raises a specified exception type,
*args and **kwargs are forwarded to the function
"""
try:
result = function(*args, **kwargs)
self.log_error("{} did not throw exception {}".format(
function.__name__,
exception_type.__name__
), None)
return result
except Exception as e:
if type(e) != exception_type:
self.log_error("{} did raise {}: {}".format(
function.__name__,
type(e).__name__, e
), None)
def does_not_raise(self, function, *args, **kwargs):
"""
Check if a function does not raise an exception,
*args and **kwargs are forwarded to the function
"""
try:
return function(*args, **kwargs)
except Exception as e:
self.log_error("{} did raise {}: {}".format(
function.__name__,
type(e).__name__, e
), None)
|
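A short, self-contained sketch of how the Check helper above can be used; the checked values are made up for illustration.

check = Check()
check.equal(1 + 1, 2, "simple addition")
check.is_not_none({"key": "value"}, "dict should exist")
check.is_true("abc".startswith("a"), "prefix check")
check.raises(ZeroDivisionError, lambda: 1 / 0)  # passes: the expected exception is raised

if check.errors:
    print("FAILED:", check.error_message())
else:
    print("all checks passed")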
py | b414105edc38ad0df3abf38c551a4f53c9f37906 | # -*- coding: utf-8 -*-
from brue import brue
from store import store
from routes import routes
brue.use(store)
brue.use(routes)
|
py | b414123163f44c9fc71604d403a8ef8039faed22 | # -*- coding: utf-8 -*-
"""
Testing class for report-data-entry endpoint of the Castor EDC API Wrapper.
Link: https://data.castoredc.com/api#/report-data-entry
@author: R.C.A. van Linschoten
https://orcid.org/0000-0003-3052-596X
"""
import pytest
from exceptions.exceptions import CastorException
from tests.test_api_endpoints.data_models import (
survey_data_point_extended_model,
data_options,
)
from tests.test_api_endpoints.helpers_api_endpoints import allowed_value
class TestSurveyDataEntry:
model_keys = survey_data_point_extended_model.keys()
data_options = data_options
test_field = {
"record_id": "000005",
"field_variable_name": "SF12_1",
"field_id": "FC4FAA2D-08FD-41F7-B482-444B2B6D3116",
"survey_instance_id": "1FFBCDD8-2FC2-4838-B6DD-0EAE3FF8818E",
"value": "1",
"updated_on": "2020-08-14 11:59:20",
"_embedded": {
"record": {
"id": "000005",
"record_id": "000005",
"ccr_patient_id": "",
"last_opened_step": "FFF23B2C-AEE6-4304-9CC4-9C7C431D5387",
"progress": 7,
"status": "open",
"archived": False,
"archived_reason": None,
"created_by": "B23ABCC4-3A53-FB32-7B78-3960CC907F25",
"created_on": {
"date": "2019-10-28 13:30:15.000000",
"timezone_type": 3,
"timezone": "Europe/Amsterdam",
},
"updated_by": "B23ABCC4-3A53-FB32-7B78-3960CC907F25",
"updated_on": {
"date": "2020-08-19 14:09:04.000000",
"timezone_type": 3,
"timezone": "Europe/Amsterdam",
},
"randomized_id": None,
"randomized_on": None,
"randomization_group": None,
"randomization_group_name": None,
"_embedded": {
"institute": {
"id": "1CFF5802-0B07-471F-B97E-B5166332F2C5",
"institute_id": "1CFF5802-0B07-471F-B97E-B5166332F2C5",
"name": "Test Institute",
"abbreviation": "TES",
"code": "TES",
"order": 0,
"deleted": False,
"country_id": 169,
"_links": {
"self": {
"href": "https://data.castoredc.com/api/study/D234215B-D956-482D-BF17-71F2BB12A2FD/institute/1CFF5802-0B07-471F-B97E-B5166332F2C5"
}
},
}
},
"_links": {
"self": {
"href": "https://data.castoredc.com/api/study/D234215B-D956-482D-BF17-71F2BB12A2FD/record/000005"
}
},
},
"field": {
"id": "FC4FAA2D-08FD-41F7-B482-444B2B6D3116",
"parent_id": "C19211FE-1C53-43F9-BC85-460DF1255153",
"field_id": "FC4FAA2D-08FD-41F7-B482-444B2B6D3116",
"field_number": 5,
"field_label": "How would you rate your overall health?",
"field_variable_name": "SF12_1",
"field_type": "radio",
"field_required": 1,
"field_hidden": 0,
"field_info": "",
"field_units": "",
"field_min": None,
"field_min_label": "",
"field_max": None,
"field_max_label": "",
"field_summary_template": "",
"field_slider_step": None,
"report_id": "",
"field_length": None,
"additional_config": "",
"exclude_on_data_export": False,
"option_group": None,
"metadata_points": [],
"validations": [],
"dependency_parents": [],
"dependency_children": [],
"_links": {
"self": {
"href": "https://data.castoredc.com/api/study/D234215B-D956-482D-BF17-71F2BB12A2FD/field/FC4FAA2D-08FD-41F7-B482-444B2B6D3116"
}
},
},
},
"_links": {
"self": {
"href": "https://data.castoredc.com/api/study/D234215B-D956-482D-BF17-71F2BB12A2FD/record/000005/data-point/survey/1FFBCDD8-2FC2-4838-B6DD-0EAE3FF8818E/FC4FAA2D-08FD-41F7-B482-444B2B6D3116"
}
},
}
def test_single_survey_instance_all_fields_record_success(self, client):
"""Tests if single survey instance returns the right data."""
survey = client.single_survey_instance_all_fields_record(
"000005", "1FFBCDD8-2FC2-4838-B6DD-0EAE3FF8818E"
)
for field in survey:
field_keys = field.keys()
assert len(field_keys) == len(self.model_keys)
for key in field_keys:
assert key in self.model_keys
assert type(field[key]) in survey_data_point_extended_model[key]
def test_single_survey_instance_all_fields_record_fail(self, client):
"""Tests if failing to call a single survey instance throws an error."""
with pytest.raises(CastorException) as e:
client.single_survey_instance_all_fields_record(
"00FAKE", "1FFBCDD8-2FC2-4838-B6DD-0EAE3FF8818E"
)
assert str(e.value) == "404 The record you requested data for does not exist."
def test_single_survey_instance_single_field_record_success(self, client):
"""Tests if single survey field returns the proper data."""
field = client.single_survey_instance_single_field_record(
"000005",
"1FFBCDD8-2FC2-4838-B6DD-0EAE3FF8818E",
"FC4FAA2D-08FD-41F7-B482-444B2B6D3116",
)
assert field == self.test_field
def test_single_survey_instance_single_field_record_fail_record(self, client):
"""Tests if calling a single survey field throws an error when failing"""
with pytest.raises(CastorException) as e:
client.single_survey_instance_single_field_record(
"00FAKE",
"1FFBCDD8-2FC2-4838-B6DD-0EAE3FF8818E",
"FC4FAA2D-08FD-41F7-B482-444B2B6D3116",
)
assert str(e.value) == "404 The record you requested data for does not exist."
def test_update_survey_instance_single_field_record_success(self, client):
"""Tests correctly changing a single survey field"""
field = "FC4FAA2D-08FD-41F7-B482-444B2B6D3116"
post_value = allowed_value(client, field)
# Update the field
change_reason = "Testing API"
client.update_survey_instance_single_field_record(
"000011",
"5F420735-03B5-4736-9CCA-D3B02DA2BFF4",
field,
post_value,
change_reason,
)
# Check if changing worked
new_value = client.single_survey_instance_single_field_record(
"000011", "5F420735-03B5-4736-9CCA-D3B02DA2BFF4", field
)
assert new_value["value"] == str(post_value)
def test_update_survey_instance_single_field_record_fail(self, client):
"""Tests failing to change a single survey field"""
field = "ED12B07E-EDA8-4D64-8268-BE751BD5DB36"
post_value = allowed_value(client, field)
old_value = client.single_survey_instance_single_field_record(
"110002", "0FFD2C09-C5F2-4072-BDF1-736516C0D60A", field
)
# Update the field
change_reason = "Testing API"
with pytest.raises(CastorException) as e:
client.update_survey_instance_single_field_record(
"110002",
"0FFD2C09-C5F2-4072-BDF1-736516C0D60A",
field + "FAKE",
post_value,
change_reason,
)
assert str(e.value) == "400 The request you made was malformed"
# Check if changing failed
new_value = client.single_survey_instance_single_field_record(
"110002", "0FFD2C09-C5F2-4072-BDF1-736516C0D60A", field
)
assert new_value["value"] == old_value["value"]
|
py | b4141402b433d6fcdbe02f2f87f9dcd0623e7aa4 | """Migrate entity IDs and simplify schema.
Revision ID: b3959bf8cc66
Revises: 1519391870a0
Create Date: 2020-02-07 07:10:40.437321
"""
from alembic import op
import sqlalchemy as sa
from followthemoney import model
from followthemoney.namespace import Namespace
# revision identifiers, used by Alembic.
revision = 'b3959bf8cc66'
down_revision = '1519391870a0'
def upgrade():
bind = op.get_bind()
meta = sa.MetaData()
meta.bind = bind
meta.reflect()
entity_table = meta.tables['entity']
collection_table = meta.tables['collection']
q = sa.select([collection_table])
crp = bind.execute(q)
for collection in crp.fetchall():
ns = Namespace(collection.foreign_id)
q = sa.select([entity_table])
q = q.where(entity_table.c.collection_id == collection.id)
erp = bind.execute(q)
while True:
entity = erp.fetchone()
if not entity:
break
proxy = model.get_proxy({
'id': entity.id,
'schema': entity.schema,
'properties': entity.data
}, cleaned=False)
proxy.add('name', entity.name, quiet=True, cleaned=False)
proxy = ns.apply(proxy)
q = sa.update(entity_table)
q = q.where(entity_table.c.id == entity.id)
q = q.values(id=proxy.id, data=proxy.properties)
bind.execute(q)
op.drop_column('entity', 'foreign_id')
op.drop_column('entity', 'name')
def downgrade():
pass
|
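To make the rewrite performed in upgrade() above concrete, here is a hedged sketch of what happens to a single entity row, using the followthemoney API the migration already imports; the id, schema and properties are invented sample data.

ns = Namespace("test_collection")  # built from collection.foreign_id in the migration
proxy = model.get_proxy({
    'id': 'deadbeef',
    'schema': 'Person',
    'properties': {'name': ['Jane Doe']},
}, cleaned=False)
proxy.add('name', 'J. Doe', quiet=True, cleaned=False)  # merge the legacy name column
proxy = ns.apply(proxy)  # re-signs the entity id into the collection namespace
print(proxy.id)          # original id plus a namespace signature
print(proxy.properties)  # {'name': ['Jane Doe', 'J. Doe']}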
py | b4141421720dcd3be508aeb06211b535d3f8dc9b | from purpleserver.manager.tests.test_addresses import *
from purpleserver.manager.tests.test_parcels import *
from purpleserver.manager.tests.test_shipments import *
from purpleserver.manager.tests.test_trackers import *
from purpleserver.manager.tests.test_custom_infos import *
from purpleserver.manager.tests.test_pickups import *
|
py | b414147dc0d5235a0c406d5fc12dd46a84eb6550 | import pytest
import subprocess
import warnings
import conda_remove_envs as cenv
VALID_ENV_NAMES = [["test"], ["test", "test2"]]
CLI_STR = "python conda_remove_envs.py"
def create_env(env: str):
subprocess.run(f"conda create --name {env}".split())
@pytest.mark.parametrize("envlist", VALID_ENV_NAMES)
def test_removal_by_name_should_pass(envlist):
expected = cenv.list_envs()
for env in envlist:
create_env(env)
cmd = CLI_STR + " -n " + " ".join(envlist)
subprocess.run(cmd.split())
actual = cenv.list_envs()
assert expected == actual
def test_list_envs():
actual = cenv.list_envs()
actual.sort()
expected = list(set(actual))
expected.sort()
assert actual == expected
def test_remove_base_warning():
with pytest.warns(UserWarning) as record:
warnings.warn("Cannot remove 'base' environment", UserWarning)
cenv.remove_env("base")
assert record[0].message.args[0] == "Cannot remove 'base' environment"
|
py | b414151c600e3955b5c955fdd4b64a682554a64e | # Generated by Django 3.1.7 on 2021-02-19 22:04
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('envnet', '0009_network_registrations'),
]
operations = [
migrations.RemoveField(
model_name='network',
name='registrations',
),
]
|
py | b414157a17bfd136db8c667f145c5209b47d4886 | from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from math import cos
from math import pi
from math import sin
from compas.geometry import matrix_from_frame
from compas.geometry import transform_points
from compas.geometry import Circle
from compas.geometry import Frame
from compas.geometry import Plane
from ._shape import Shape
class Cylinder(Shape):
"""A cylinder is defined by a circle and a height.
Parameters
----------
circle: :class:`compas.geometry.Circle`
The circle of the cylinder.
height: float
The height of the cylinder.
Attributes
----------
plane : :class:`compas.geometry.Plane`
The plane containing the circle.
circle : :class:`compas.geometry.Circle`
The base circle of the cylinder.
radius : float
The radius of the base circle.
height : float
The height of the cylinder.
normal (read-only) : :class:`compas.geometry.Vector`
The normal of the base plane.
diameter : float
The diameter of the cylinder.
Examples
--------
>>> from compas.geometry import Plane
>>> from compas.geometry import Cylinder
>>> plane = Plane([0, 0, 0], [0, 0, 1])
>>> circle = Circle(plane, 5)
>>> cylinder = Cylinder(circle, 7)
"""
@property
def DATASCHEMA(self):
import schema
return schema.Schema({
'circle': {
'plane': Plane.DATASCHEMA.fget(None),
'radius': schema.And(float, lambda x: x > 0)
},
'height': schema.And(float, lambda x: x > 0)
})
@property
def JSONSCHEMANAME(self):
return 'cylinder'
__slots__ = ['_circle', '_height']
def __init__(self, circle, height, **kwargs):
super(Cylinder, self).__init__(**kwargs)
self._circle = None
self._height = None
self.circle = circle
self.height = height
@property
def data(self):
"""Returns the data dictionary that represents the cylinder.
Returns
-------
dict
The cylinder data.
"""
return {'circle': self.circle.data, 'height': self.height}
@data.setter
def data(self, data):
self.circle = Circle.from_data(data['circle'])
self.height = data['height']
@property
def plane(self):
"""Plane: The plane of the cylinder."""
return self.circle.plane
@plane.setter
def plane(self, plane):
self.circle.plane = Plane(*plane)
@property
def circle(self):
"""float: The circle of the cylinder."""
return self._circle
@circle.setter
def circle(self, circle):
self._circle = Circle(*circle)
@property
def radius(self):
"""float: The radius of the cylinder."""
return self.circle.radius
@radius.setter
def radius(self, radius):
self.circle.radius = float(radius)
@property
def height(self):
"""float: The height of the cylinder."""
return self._height
@height.setter
def height(self, height):
self._height = float(height)
@property
def normal(self):
"""Vector: The normal of the cylinder."""
return self.plane.normal
@property
def diameter(self):
"""float: The diameter of the cylinder."""
return self.circle.diameter
@property
def center(self):
"""Point: The center of the cylinder."""
return self.circle.center
@center.setter
def center(self, point):
self.circle.center = point
@property
def area(self):
"""Float: The surface area of the cylinder."""
return (self.circle.area * 2) + (self.circle.circumference * self.height)
@property
def volume(self):
"""Float: The volume of the cylinder."""
return self.circle.area * self.height
# ==========================================================================
# customisation
# ==========================================================================
def __repr__(self):
return 'Cylinder({0!r}, {1!r})'.format(self.circle, self.height)
def __len__(self):
return 2
def __getitem__(self, key):
if key == 0:
return self.circle
elif key == 1:
return self.height
else:
raise KeyError
def __setitem__(self, key, value):
if key == 0:
self.circle = value
elif key == 1:
self.height = value
else:
raise KeyError
def __iter__(self):
return iter([self.circle, self.height])
# ==========================================================================
# constructors
# ==========================================================================
@classmethod
def from_data(cls, data):
"""Construct a cylinder from its data representation.
Parameters
----------
data : :obj:`dict`
The data dictionary.
Returns
-------
Cylinder
The constructed cylinder.
Examples
--------
>>> from compas.geometry import Cylinder
>>> from compas.geometry import Circle
>>> from compas.geometry import Plane
>>> data = {'circle': Circle(Plane.worldXY(), 5).data, 'height': 7.}
>>> cylinder = Cylinder.from_data(data)
"""
cylinder = cls(Circle.from_data(data['circle']), data['height'])
return cylinder
# ==========================================================================
# methods
# ==========================================================================
def to_vertices_and_faces(self, u=16, triangulated=False):
"""Returns a list of vertices and faces.
Parameters
----------
u : int, optional
Number of faces in the "u" direction.
triangulated: bool, optional
Flag indicating that the faces have to be triangulated.
Returns
-------
(vertices, faces)
A list of vertex locations and a list of faces,
with each face defined as a list of indices into the list of vertices.
"""
if u < 3:
            raise ValueError('The value for u should be u >= 3.')
vertices = []
a = 2 * pi / u
z = self.height / 2
for i in range(u):
x = self.circle.radius * cos(i * a)
y = self.circle.radius * sin(i * a)
vertices.append([x, y, z])
vertices.append([x, y, -z])
        # add a vertex at the center of the top and bottom circles
vertices.append([0, 0, z])
vertices.append([0, 0, -z])
# transform vertices to cylinder's plane
frame = Frame.from_plane(self.circle.plane)
M = matrix_from_frame(frame)
vertices = transform_points(vertices, M)
faces = []
# side faces
for i in range(0, u * 2, 2):
faces.append([i, i + 1, (i + 3) % (u * 2), (i + 2) % (u * 2)])
# top and bottom circle faces
for i in range(0, u * 2, 2):
top = [i, (i + 2) % (u * 2), len(vertices) - 2]
bottom = [i + 1, (i + 3) % (u * 2), len(vertices) - 1]
faces.append(top)
faces.append(bottom[::-1])
if triangulated:
triangles = []
for face in faces:
if len(face) == 4:
triangles.append(face[0:3])
triangles.append([face[0], face[2], face[3]])
else:
triangles.append(face)
faces = triangles
return vertices, faces
def transform(self, transformation):
"""Transform the cylinder.
Parameters
----------
transformation : :class:`Transformation`
The transformation used to transform the cylinder.
Examples
--------
>>> from compas.geometry import Frame
>>> from compas.geometry import Transformation
>>> from compas.geometry import Plane
>>> from compas.geometry import Circle
>>> from compas.geometry import Cylinder
>>> circle = Circle(Plane.worldXY(), 5)
>>> cylinder = Cylinder(circle, 7)
>>> frame = Frame([1, 1, 1], [0.68, 0.68, 0.27], [-0.67, 0.73, -0.15])
>>> T = Transformation.from_frame(frame)
>>> cylinder.transform(T)
"""
self.circle.transform(transformation)
|
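A brief usage sketch of the Cylinder class above, mirroring its docstring examples; the numbers are arbitrary, and Plane and Circle are already imported at the top of the module.

plane = Plane([0, 0, 0], [0, 0, 1])
circle = Circle(plane, 5.0)
cylinder = Cylinder(circle, 7.0)

print(cylinder.radius, cylinder.diameter, cylinder.height)
print(cylinder.volume)  # circle area * height

# Discretize into a mesh with 16 segments around the circumference.
vertices, faces = cylinder.to_vertices_and_faces(u=16, triangulated=True)
print(len(vertices), len(faces))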
py | b414166a272253e909896ce14b5b2aeef7273a3f | """
Django settings for test04012022_dev_23303 project.
Generated by 'django-admin startproject' using Django 2.2.2.
For more information on this file, see
https://docs.djangoproject.com/en/2.2/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/2.2/ref/settings/
"""
import os
import io
import environ
import logging
import google.auth
from google.cloud import secretmanager
from google.auth.exceptions import DefaultCredentialsError
from google.api_core.exceptions import PermissionDenied
from modules.manifest import get_modules
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
env_file = os.path.join(BASE_DIR, ".env")
env = environ.Env()
env.read_env(env_file)
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = env.bool("DEBUG", default=False)
try:
# Pull secrets from Secret Manager
_, project = google.auth.default()
client = secretmanager.SecretManagerServiceClient()
settings_name = os.environ.get("SETTINGS_NAME", "django_settings")
name = client.secret_version_path(project, settings_name, "latest")
payload = client.access_secret_version(name=name).payload.data.decode("UTF-8")
env.read_env(io.StringIO(payload))
except (DefaultCredentialsError, PermissionDenied):
pass
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/2.2/howto/deployment/checklist/
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = env.str("SECRET_KEY")
ALLOWED_HOSTS = env.list("HOST", default=["*"])
SITE_ID = 1
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
SECURE_SSL_REDIRECT = env.bool("SECURE_REDIRECT", default=False)
# Application definition
INSTALLED_APPS = [
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
'django.contrib.sites'
]
LOCAL_APPS = [
'home',
'users.apps.UsersConfig',
]
THIRD_PARTY_APPS = [
'rest_framework',
'rest_framework.authtoken',
'rest_auth',
'rest_auth.registration',
'bootstrap4',
'allauth',
'allauth.account',
'allauth.socialaccount',
'allauth.socialaccount.providers.google',
'django_extensions',
'drf_yasg',
'storages',
]
MODULES_APPS = get_modules()
INSTALLED_APPS += LOCAL_APPS + THIRD_PARTY_APPS + MODULES_APPS
MIDDLEWARE = [
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
]
ROOT_URLCONF = 'test04012022_dev_23303.urls'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [os.path.join(BASE_DIR, 'web_build')],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
WSGI_APPLICATION = 'test04012022_dev_23303.wsgi.application'
# Database
# https://docs.djangoproject.com/en/2.2/ref/settings/#databases
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
}
}
if env.str("DATABASE_URL", default=None):
DATABASES = {
'default': env.db()
}
# Password validation
# https://docs.djangoproject.com/en/2.2/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
},
]
# Internationalization
# https://docs.djangoproject.com/en/2.2/topics/i18n/
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/2.2/howto/static-files/
STATIC_URL = '/static/'
MIDDLEWARE += ['whitenoise.middleware.WhiteNoiseMiddleware']
AUTHENTICATION_BACKENDS = (
'django.contrib.auth.backends.ModelBackend',
'allauth.account.auth_backends.AuthenticationBackend'
)
STATIC_ROOT = os.path.join(BASE_DIR, "staticfiles")
STATICFILES_DIRS = [os.path.join(BASE_DIR, 'static'), os.path.join(BASE_DIR, 'web_build/static')]
STATICFILES_STORAGE = 'whitenoise.storage.CompressedManifestStaticFilesStorage'
# allauth / users
ACCOUNT_EMAIL_REQUIRED = True
ACCOUNT_AUTHENTICATION_METHOD = 'email'
ACCOUNT_USERNAME_REQUIRED = False
ACCOUNT_EMAIL_VERIFICATION = "optional"
ACCOUNT_CONFIRM_EMAIL_ON_GET = True
ACCOUNT_LOGIN_ON_EMAIL_CONFIRMATION = True
ACCOUNT_UNIQUE_EMAIL = True
LOGIN_REDIRECT_URL = "users:redirect"
ACCOUNT_ADAPTER = "users.adapters.AccountAdapter"
SOCIALACCOUNT_ADAPTER = "users.adapters.SocialAccountAdapter"
ACCOUNT_ALLOW_REGISTRATION = env.bool("ACCOUNT_ALLOW_REGISTRATION", True)
SOCIALACCOUNT_ALLOW_REGISTRATION = env.bool("SOCIALACCOUNT_ALLOW_REGISTRATION", True)
REST_AUTH_SERIALIZERS = {
# Replace password reset serializer to fix 500 error
"PASSWORD_RESET_SERIALIZER": "home.api.v1.serializers.PasswordSerializer",
}
REST_AUTH_REGISTER_SERIALIZERS = {
# Use custom serializer that has no username and matches web signup
"REGISTER_SERIALIZER": "home.api.v1.serializers.SignupSerializer",
}
# Custom user model
AUTH_USER_MODEL = "users.User"
EMAIL_HOST = env.str("EMAIL_HOST", "smtp.sendgrid.net")
EMAIL_HOST_USER = env.str("SENDGRID_USERNAME", "")
EMAIL_HOST_PASSWORD = env.str("SENDGRID_PASSWORD", "")
EMAIL_PORT = 587
EMAIL_USE_TLS = True
# AWS S3 config
AWS_ACCESS_KEY_ID = env.str("AWS_ACCESS_KEY_ID", "")
AWS_SECRET_ACCESS_KEY = env.str("AWS_SECRET_ACCESS_KEY", "")
AWS_STORAGE_BUCKET_NAME = env.str("AWS_STORAGE_BUCKET_NAME", "")
AWS_STORAGE_REGION = env.str("AWS_STORAGE_REGION", "")
USE_S3 = (
AWS_ACCESS_KEY_ID and
AWS_SECRET_ACCESS_KEY and
AWS_STORAGE_BUCKET_NAME and
AWS_STORAGE_REGION
)
if USE_S3:
AWS_S3_CUSTOM_DOMAIN = env.str("AWS_S3_CUSTOM_DOMAIN", "")
AWS_S3_OBJECT_PARAMETERS = {"CacheControl": "max-age=86400"}
AWS_DEFAULT_ACL = env.str("AWS_DEFAULT_ACL", "public-read")
AWS_MEDIA_LOCATION = env.str("AWS_MEDIA_LOCATION", "media")
AWS_AUTO_CREATE_BUCKET = env.bool("AWS_AUTO_CREATE_BUCKET", True)
DEFAULT_FILE_STORAGE = env.str(
"DEFAULT_FILE_STORAGE", "home.storage_backends.MediaStorage"
)
MEDIA_URL = '/mediafiles/'
MEDIA_ROOT = os.path.join(BASE_DIR, 'mediafiles')
# Swagger settings for api docs
SWAGGER_SETTINGS = {
"DEFAULT_INFO": f"{ROOT_URLCONF}.api_info",
}
if DEBUG or not (EMAIL_HOST_USER and EMAIL_HOST_PASSWORD):
# output email to console instead of sending
if not DEBUG:
logging.warning("You should setup `SENDGRID_USERNAME` and `SENDGRID_PASSWORD` env vars to send emails.")
EMAIL_BACKEND = "django.core.mail.backends.console.EmailBackend"
# GCP config
GS_BUCKET_NAME = env.str("GS_BUCKET_NAME", "")
if GS_BUCKET_NAME:
DEFAULT_FILE_STORAGE = "storages.backends.gcloud.GoogleCloudStorage"
STATICFILES_STORAGE = "storages.backends.gcloud.GoogleCloudStorage"
GS_DEFAULT_ACL = "publicRead"
|
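Because the settings module above reads its configuration through django-environ, a minimal `.env` file next to it could look like the following; every value is a placeholder, and only variables the settings actually reference are shown.

DEBUG=True
SECRET_KEY=replace-me
HOST=localhost,127.0.0.1
DATABASE_URL=postgres://user:password@localhost:5432/appdb
SENDGRID_USERNAME=apikey
SENDGRID_PASSWORD=replace-me
SECURE_REDIRECT=False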
py | b41416ad3306fcd46c577e967650dd114b893595 | # Copyright (c) OpenMMLab. All rights reserved.
import argparse
import os
import re
import matplotlib.pyplot as plt
import numpy as np
from mmcls.utils import load_json_log
TEST_METRICS = ('precision', 'recall', 'f1_score', 'support', 'mAP', 'CP',
'CR', 'CF1', 'OP', 'OR', 'OF1', 'accuracy')
def cal_train_time(log_dicts, args):
"""Compute the average time per training iteration."""
for i, log_dict in enumerate(log_dicts):
print(f'{"-" * 5}Analyze train time of {args.json_logs[i]}{"-" * 5}')
all_times = []
for epoch in log_dict.keys():
if args.include_outliers:
all_times.append(log_dict[epoch]['time'])
else:
all_times.append(log_dict[epoch]['time'][1:])
all_times = np.array(all_times)
epoch_ave_time = all_times.mean(-1)
slowest_epoch = epoch_ave_time.argmax()
fastest_epoch = epoch_ave_time.argmin()
std_over_epoch = epoch_ave_time.std()
print(f'slowest epoch {slowest_epoch + 1}, '
f'average time is {epoch_ave_time[slowest_epoch]:.4f}')
print(f'fastest epoch {fastest_epoch + 1}, '
f'average time is {epoch_ave_time[fastest_epoch]:.4f}')
print(f'time std over epochs is {std_over_epoch:.4f}')
print(f'average iter time: {np.mean(all_times):.4f} s/iter')
print()
def get_legends(args):
"""if legend is None, use {filename}_{key} as legend."""
legend = args.legend
if legend is None:
legend = []
for json_log in args.json_logs:
for metric in args.keys:
# remove '.json' in the end of log names
basename = os.path.basename(json_log)[:-5]
if basename.endswith('.log'):
basename = basename[:-4]
legend.append(f'{basename}_{metric}')
assert len(legend) == (len(args.json_logs) * len(args.keys))
return legend
def plot_phase_train(metric, log_dict, epochs, curve_label, json_log):
"""plot phase of train cruve."""
if metric not in log_dict[epochs[0]]:
raise KeyError(f'{json_log} does not contain metric {metric}'
f' in train mode')
xs, ys = [], []
for epoch in epochs:
iters = log_dict[epoch]['iter']
if log_dict[epoch]['mode'][-1] == 'val':
iters = iters[:-1]
num_iters_per_epoch = iters[-1]
assert len(iters) > 0, (
'The training log is empty, please try to reduce the '
'interval of log in config file.')
xs.append(np.array(iters) / num_iters_per_epoch + (epoch - 1))
ys.append(np.array(log_dict[epoch][metric][:len(iters)]))
xs = np.concatenate(xs)
ys = np.concatenate(ys)
plt.xlabel('Epochs')
plt.plot(xs, ys, label=curve_label, linewidth=0.75)
def plot_phase_val(metric, log_dict, epochs, curve_label, json_log):
"""plot phase of val cruves."""
# some epoch may not have evaluation. as [(train, 5),(val, 1)]
xs = [e for e in epochs if metric in log_dict[e]]
ys = [log_dict[e][metric] for e in xs if metric in log_dict[e]]
assert len(xs) > 0, (f'{json_log} does not contain metric {metric}')
plt.xlabel('Epochs')
plt.plot(xs, ys, label=curve_label, linewidth=0.75)
def plot_curve_helper(log_dicts, metrics, args, legend):
"""plot curves from log_dicts by metrics."""
num_metrics = len(metrics)
for i, log_dict in enumerate(log_dicts):
epochs = list(log_dict.keys())
for j, metric in enumerate(metrics):
json_log = args.json_logs[i]
print(f'plot curve of {json_log}, metric is {metric}')
curve_label = legend[i * num_metrics + j]
if any(m in metric for m in TEST_METRICS):
plot_phase_val(metric, log_dict, epochs, curve_label, json_log)
else:
plot_phase_train(metric, log_dict, epochs, curve_label,
json_log)
plt.legend()
def plot_curve(log_dicts, args):
"""Plot train metric-iter graph."""
# set backend and style
if args.backend is not None:
plt.switch_backend(args.backend)
try:
import seaborn as sns
sns.set_style(args.style)
except ImportError:
print("Attention: The plot style won't be applied because 'seaborn' "
'package is not installed, please install it if you want better '
'show style.')
# set plot window size
wind_w, wind_h = args.window_size.split('*')
wind_w, wind_h = int(wind_w), int(wind_h)
plt.figure(figsize=(wind_w, wind_h))
# get legends and metrics
legends = get_legends(args)
metrics = args.keys
# plot curves from log_dicts by metrics
plot_curve_helper(log_dicts, metrics, args, legends)
# set title and show or save
if args.title is not None:
plt.title(args.title)
if args.out is None:
plt.show()
else:
print(f'save curve to: {args.out}')
plt.savefig(args.out)
plt.cla()
def add_plot_parser(subparsers):
parser_plt = subparsers.add_parser(
'plot_curve', help='parser for plotting curves')
parser_plt.add_argument(
'json_logs',
type=str,
nargs='+',
help='path of train log in json format')
parser_plt.add_argument(
'--keys',
type=str,
nargs='+',
default=['loss'],
help='the metric that you want to plot')
parser_plt.add_argument('--title', type=str, help='title of figure')
parser_plt.add_argument(
'--legend',
type=str,
nargs='+',
default=None,
help='legend of each plot')
parser_plt.add_argument(
'--backend', type=str, default=None, help='backend of plt')
parser_plt.add_argument(
'--style', type=str, default='whitegrid', help='style of plt')
parser_plt.add_argument('--out', type=str, default=None)
parser_plt.add_argument(
'--window-size',
default='12*7',
help='size of the window to display images, in format of "$W*$H".')
def add_time_parser(subparsers):
parser_time = subparsers.add_parser(
'cal_train_time',
help='parser for computing the average time per training iteration')
parser_time.add_argument(
'json_logs',
type=str,
nargs='+',
help='path of train log in json format')
parser_time.add_argument(
'--include-outliers',
action='store_true',
help='include the first value of every epoch when computing '
'the average time')
def parse_args():
parser = argparse.ArgumentParser(description='Analyze Json Log')
# currently only support plot curve and calculate average train time
subparsers = parser.add_subparsers(dest='task', help='task parser')
add_plot_parser(subparsers)
add_time_parser(subparsers)
args = parser.parse_args()
if args.window_size != '':
assert re.match(r'\d+\*\d+', args.window_size), \
"'window-size' must be in format 'W*H'."
return args
def main():
args = parse_args()
json_logs = args.json_logs
for json_log in json_logs:
assert json_log.endswith('.json')
log_dicts = [load_json_log(json_log) for json_log in json_logs]
eval(args.task)(log_dicts, args)
if __name__ == '__main__':
main()
|
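Assuming the script above is saved as analyze_logs.py (the file name is an assumption), the two subcommands registered by add_plot_parser and add_time_parser can be invoked as follows, where work_dir/log.json is a placeholder path to a JSON training log:

    python analyze_logs.py plot_curve work_dir/log.json --keys loss --out loss_curve.png
    python analyze_logs.py cal_train_time work_dir/log.json --include-outliers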
py | b414174553c2ce0ae267eea52a2d1be077cd28a4 | import argparse
import logging
import os
import sys
from pprint import pformat
from bridge.deploy.registry import DEPLOY_REGISTRY
from bridge.analytics import AnalyticsClient
# The minimal required version of Python
MIN_PYTHON_VERSION = (3, 9)
def cli_init(args):
logger = logging.getLogger(__name__)
deploy_client = DEPLOY_REGISTRY[args.deploy_service]()
logger.info(f"Initializing Bridge '{pformat(args)}'.")
deploy_client.init()
AnalyticsClient(deploy_client).track_deploy_client_init()
def cli_destroy(args):
logger = logging.getLogger(__name__)
deploy_client = DEPLOY_REGISTRY[args.deploy_service]()
logger.info(f"Destroying Bridge '{pformat(args)}'.")
deploy_client.teardown()
AnalyticsClient(deploy_client).track_deploy_client_destroy()
def cli_run(args):
from bridge.app import main
logger = logging.getLogger(__name__)
logger.info(f"Starting Bridge '{pformat(args)}'.")
main()
def main():
if sys.version_info < MIN_PYTHON_VERSION:
raise AssertionError("Invalid Python version")
LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")
level = logging.getLevelName(LOG_LEVEL)
logging.basicConfig(level=level)
logger = logging.getLogger(__name__)
# Main Parser
main_parser = argparse.ArgumentParser(
prog="bridge",
description=(
"Tool to sync model registries with" " production deployments."
),
)
subparsers = main_parser.add_subparsers()
# Init
init_parser = subparsers.add_parser(
"init", help="Create global cloud resources needed for deployment."
)
init_parser.add_argument(
"deploy_service",
choices=list(DEPLOY_REGISTRY.keys()),
help="The model deployment provider.",
)
init_parser.set_defaults(func=cli_init)
# Destroy
destroy_parser = subparsers.add_parser(
"destroy",
help=(
"Delete all deployments and global cloud resources "
"created by 'init'."
),
)
destroy_parser.add_argument(
"deploy_service",
choices=list(DEPLOY_REGISTRY.keys()),
help="The model deployment provider.",
)
destroy_parser.set_defaults(func=cli_destroy)
# Run
run_parser = subparsers.add_parser(
"run",
help="Start watcher service to automatically manage deployments.",
)
run_parser.set_defaults(func=cli_run)
args, config = main_parser.parse_known_args()
args.config = config
logger.debug(args)
if hasattr(args, "func"):
args.func(args)
else:
main_parser.print_help()
if __name__ == "__main__":
try:
main()
except KeyboardInterrupt:
exit()
|
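The argparse wiring above exposes three subcommands. Typical invocations look like the following, where <deploy_service> is a placeholder for whichever keys DEPLOY_REGISTRY actually provides, and log verbosity is controlled by the LOG_LEVEL environment variable read in main():

    bridge init <deploy_service>
    bridge run
    bridge destroy <deploy_service>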
py | b414179e49f6264782ace49a57cf8bc88d0da6c9 | #!/usr/bin/python
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
#
from __future__ import absolute_import, division, print_function
__metaclass__ = type
ANSIBLE_METADATA = {'status': ['preview'],
'supported_by': 'community',
'metadata_version': '1.1'}
DOCUMENTATION = '''
---
module: fmgr_secprof_dns
version_added: "2.8"
notes:
- Full Documentation at U(https://ftnt-ansible-docs.readthedocs.io/en/latest/).
author:
- Luke Weighall (@lweighall)
- Andrew Welsh (@Ghilli3)
- Jim Huber (@p4r4n0y1ng)
short_description: Manage DNS security profiles in FortiManager
description:
- Manage DNS security profiles in FortiManager
options:
adom:
description:
- The ADOM the configuration should belong to.
required: false
default: root
mode:
description:
- Sets one of three modes for managing the object.
- Allows use of soft-adds instead of overwriting existing values.
choices: ['add', 'set', 'delete', 'update']
required: false
default: add
youtube_restrict:
type: str
description:
- Set safe search for YouTube restriction level.
- choice | strict | Enable strict safe seach for YouTube.
- choice | moderate | Enable moderate safe search for YouTube.
required: false
choices: ["strict", "moderate"]
sdns_ftgd_err_log:
type: str
description:
- Enable/disable FortiGuard SDNS rating error logging.
- choice | disable | Disable FortiGuard SDNS rating error logging.
- choice | enable | Enable FortiGuard SDNS rating error logging.
required: false
choices: ["disable", "enable"]
sdns_domain_log:
type: str
description:
- Enable/disable domain filtering and botnet domain logging.
- choice | disable | Disable domain filtering and botnet domain logging.
- choice | enable | Enable domain filtering and botnet domain logging.
required: false
choices: ["disable", "enable"]
safe_search:
type: str
description:
- Enable/disable Google, Bing, and YouTube safe search.
- choice | disable | Disable Google, Bing, and YouTube safe search.
- choice | enable | Enable Google, Bing, and YouTube safe search.
required: false
choices: ["disable", "enable"]
redirect_portal:
type: str
description:
- IP address of the SDNS redirect portal.
required: false
name:
type: str
description:
- Profile name.
required: false
log_all_domain:
type: str
description:
- Enable/disable logging of all domains visited (detailed DNS logging).
- choice | disable | Disable logging of all domains visited.
- choice | enable | Enable logging of all domains visited.
required: false
choices: ["disable", "enable"]
external_ip_blocklist:
type: str
description:
- One or more external IP block lists.
required: false
comment:
type: str
description:
- Comment for the security profile to show in the FortiManager GUI.
required: false
block_botnet:
type: str
description:
      - Enable/disable blocking botnet C&C DNS lookups.
      - choice | disable | Disable blocking botnet C&C DNS lookups.
      - choice | enable | Enable blocking botnet C&C DNS lookups.
required: false
choices: ["disable", "enable"]
block_action:
type: str
description:
- Action to take for blocked domains.
- choice | block | Return NXDOMAIN for blocked domains.
- choice | redirect | Redirect blocked domains to SDNS portal.
required: false
choices: ["block", "redirect"]
domain_filter_domain_filter_table:
type: str
description:
- DNS domain filter table ID.
required: false
ftgd_dns_options:
type: str
description:
- FortiGuard DNS filter options.
- FLAG Based Options. Specify multiple in list form.
- flag | error-allow | Allow all domains when FortiGuard DNS servers fail.
- flag | ftgd-disable | Disable FortiGuard DNS domain rating.
required: false
choices: ["error-allow", "ftgd-disable"]
ftgd_dns_filters_action:
type: str
description:
- Action to take for DNS requests matching the category.
- choice | monitor | Allow DNS requests matching the category and log the result.
- choice | block | Block DNS requests matching the category.
required: false
choices: ["monitor", "block"]
ftgd_dns_filters_category:
type: str
description:
- Category number.
required: false
ftgd_dns_filters_log:
type: str
description:
- Enable/disable DNS filter logging for this DNS profile.
- choice | disable | Disable DNS filter logging.
- choice | enable | Enable DNS filter logging.
required: false
choices: ["disable", "enable"]
'''
EXAMPLES = '''
- name: DELETE Profile
fmgr_secprof_dns:
name: "Ansible_DNS_Profile"
comment: "Created by Ansible Module TEST"
mode: "delete"
- name: CREATE Profile
fmgr_secprof_dns:
name: "Ansible_DNS_Profile"
comment: "Created by Ansible Module TEST"
mode: "set"
block_action: "block"
'''
RETURN = """
api_result:
description: full API response, includes status code and message
returned: always
type: str
"""
from ansible.module_utils.basic import AnsibleModule, env_fallback
from ansible.module_utils.connection import Connection
from ansible.module_utils.network.fortimanager.fortimanager import FortiManagerHandler
from ansible.module_utils.network.fortimanager.common import FMGBaseException
from ansible.module_utils.network.fortimanager.common import FMGRCommon
from ansible.module_utils.network.fortimanager.common import FMGRMethods
from ansible.module_utils.network.fortimanager.common import DEFAULT_RESULT_OBJ
from ansible.module_utils.network.fortimanager.common import FAIL_SOCKET_MSG
from ansible.module_utils.network.fortimanager.common import prepare_dict
from ansible.module_utils.network.fortimanager.common import scrub_dict
###############
# START METHODS
###############
def fmgr_dnsfilter_profile_modify(fmgr, paramgram):
"""
:param fmgr: The fmgr object instance from fortimanager.py
:type fmgr: class object
:param paramgram: The formatted dictionary of options to process
:type paramgram: dict
:return: The response from the FortiManager
:rtype: dict
"""
mode = paramgram["mode"]
adom = paramgram["adom"]
url = ""
datagram = {}
response = DEFAULT_RESULT_OBJ
# EVAL THE MODE PARAMETER FOR SET OR ADD
if mode in ['set', 'add', 'update']:
url = '/pm/config/adom/{adom}/obj/dnsfilter/profile'.format(adom=adom)
datagram = scrub_dict(prepare_dict(paramgram))
# EVAL THE MODE PARAMETER FOR DELETE
elif mode == "delete":
# SET THE CORRECT URL FOR DELETE
url = '/pm/config/adom/{adom}/obj/dnsfilter/profile/{name}'.format(adom=adom, name=paramgram["name"])
datagram = {}
response = fmgr.process_request(url, datagram, paramgram["mode"])
return response
#############
# END METHODS
#############
def main():
argument_spec = dict(
adom=dict(type="str", default="root"),
mode=dict(choices=["add", "set", "delete", "update"], type="str", default="add"),
youtube_restrict=dict(required=False, type="str", choices=["strict", "moderate"]),
sdns_ftgd_err_log=dict(required=False, type="str", choices=["disable", "enable"]),
sdns_domain_log=dict(required=False, type="str", choices=["disable", "enable"]),
safe_search=dict(required=False, type="str", choices=["disable", "enable"]),
redirect_portal=dict(required=False, type="str"),
name=dict(required=False, type="str"),
log_all_domain=dict(required=False, type="str", choices=["disable", "enable"]),
external_ip_blocklist=dict(required=False, type="str"),
comment=dict(required=False, type="str"),
block_botnet=dict(required=False, type="str", choices=["disable", "enable"]),
block_action=dict(required=False, type="str", choices=["block", "redirect"]),
domain_filter_domain_filter_table=dict(required=False, type="str"),
ftgd_dns_options=dict(required=False, type="str", choices=["error-allow", "ftgd-disable"]),
ftgd_dns_filters_action=dict(required=False, type="str", choices=["monitor", "block"]),
ftgd_dns_filters_category=dict(required=False, type="str"),
ftgd_dns_filters_log=dict(required=False, type="str", choices=["disable", "enable"]),
)
module = AnsibleModule(argument_spec=argument_spec, supports_check_mode=False, )
# MODULE PARAMGRAM
paramgram = {
"mode": module.params["mode"],
"adom": module.params["adom"],
"youtube-restrict": module.params["youtube_restrict"],
"sdns-ftgd-err-log": module.params["sdns_ftgd_err_log"],
"sdns-domain-log": module.params["sdns_domain_log"],
"safe-search": module.params["safe_search"],
"redirect-portal": module.params["redirect_portal"],
"name": module.params["name"],
"log-all-domain": module.params["log_all_domain"],
"external-ip-blocklist": module.params["external_ip_blocklist"],
"comment": module.params["comment"],
"block-botnet": module.params["block_botnet"],
"block-action": module.params["block_action"],
"domain-filter": {
"domain-filter-table": module.params["domain_filter_domain_filter_table"],
},
"ftgd-dns": {
"options": module.params["ftgd_dns_options"],
"filters": {
"action": module.params["ftgd_dns_filters_action"],
"category": module.params["ftgd_dns_filters_category"],
"log": module.params["ftgd_dns_filters_log"],
}
}
}
module.paramgram = paramgram
fmgr = None
if module._socket_path:
connection = Connection(module._socket_path)
fmgr = FortiManagerHandler(connection, module)
fmgr.tools = FMGRCommon()
else:
module.fail_json(**FAIL_SOCKET_MSG)
results = DEFAULT_RESULT_OBJ
try:
results = fmgr_dnsfilter_profile_modify(fmgr, paramgram)
fmgr.govern_response(module=module, results=results,
ansible_facts=fmgr.construct_ansible_facts(results, module.params, paramgram))
except Exception as err:
raise FMGBaseException(err)
return module.exit_json(**results[1])
if __name__ == "__main__":
main()
|
py | b41417d8e3cddcbf129bf7830967d7cf3c050cd1 | """
Copyright (c) 2019, TransChain.
This source code is licensed under the Apache 2.0 license found in the
LICENSE file in the root directory of this source tree.
"""
import typing
from katena_chain_sdk_py.serializer.base_schema import BaseSchema
def println_json(data: any, schema: typing.Type[BaseSchema], is_array: bool = False):
print(schema().dumps(data, indent=2, many=is_array))
print()
|
py | b41417e2ab232c4aebf2e710a109d56876d2ae8b | #!/usr/bin/env python3
# Copyright (c) 2014-2021 The Bitcoin and Qogecoin Core Authors
# Distributed under the MIT software license, see the accompanying
# file COPYING or http://www.opensource.org/licenses/mit-license.php.
"""Test coinbase transactions return the correct categories.
Tests listtransactions, listsinceblock, and gettransaction.
"""
from test_framework.test_framework import QogecoinTestFramework
from test_framework.util import (
assert_array_result
)
class CoinbaseCategoryTest(QogecoinTestFramework):
def set_test_params(self):
self.num_nodes = 1
def skip_test_if_missing_module(self):
self.skip_if_no_wallet()
def assert_category(self, category, address, txid, skip):
assert_array_result(self.nodes[0].listtransactions(skip=skip),
{"address": address},
{"category": category})
assert_array_result(self.nodes[0].listsinceblock()["transactions"],
{"address": address},
{"category": category})
assert_array_result(self.nodes[0].gettransaction(txid)["details"],
{"address": address},
{"category": category})
def run_test(self):
# Generate one block to an address
address = self.nodes[0].getnewaddress()
self.generatetoaddress(self.nodes[0], 1, address)
hash = self.nodes[0].getbestblockhash()
txid = self.nodes[0].getblock(hash)["tx"][0]
# Coinbase transaction is immature after 1 confirmation
self.assert_category("immature", address, txid, 0)
# Mine another 99 blocks on top
self.generate(self.nodes[0], 99)
# Coinbase transaction is still immature after 100 confirmations
self.assert_category("immature", address, txid, 99)
# Mine one more block
self.generate(self.nodes[0], 1)
# Coinbase transaction is now matured, so category is "generate"
self.assert_category("generate", address, txid, 100)
# Orphan block that paid to address
self.nodes[0].invalidateblock(hash)
# Coinbase transaction is now orphaned
self.assert_category("orphan", address, txid, 100)
if __name__ == '__main__':
CoinbaseCategoryTest().main()
|
py | b414187dd1c7298c5a8ff96ceee9956c184d1fcc | import re
import emoji
from preprocessing.sinhalese_characters import get_simplified_character
def replace_url(text: str) -> str:
"""
    remove URLs from a text
    :param text: text from which URLs should be removed
    :return: text with URLs removed
"""
return re.sub(r'(http://www\.|https://www\.|http://|https://)[a-z0-9]+([\-.]{1}[a-z0-9A-Z/]+)*', '', text)
def remove_retweet_state(text: str) -> str:
"""
remove retweet states in the beginning such as "RT @sam92ky: "
:param text: text
:return: text removed retweets state
"""
return re.sub(r'^RT @\w*: ', '', text)
def replace_mention(text: str) -> str:
return re.sub(r'@\w*', 'PERSON', text)
def split_tokens(text: str) -> list:
"""
tokenize text
:param text: text
:return: token list
"""
# text characters to split is from: https://github.com/madurangasiriwardena/corpus.sinhala.tools
emojis = ''.join(emj for emj in emoji.UNICODE_EMOJI.keys())
return [token for token in
re.split(r'[.…, ¸‚\"/|—¦”‘\'“’´!@#$%^&*+\-£?˜()\[\]{\}:;–Ê �0123456789' + emojis + ']', text)
if token != ""]
def set_spaces_among_emojis(text: str) -> str:
"""
    insert a space after each emoji so that emojis can be tokenized separately
:param text: text to be modified
:return: modified text
"""
modified_text = ""
for c in text:
modified_text += c
if c in emoji.UNICODE_EMOJI:
modified_text += " "
return modified_text
def simplify_sinhalese_text(text: str) -> str:
"""
    simplify Sinhalese text by replacing each character with its simplified form
    :param text: text to simplify
    :return: simplified text
"""
modified_text = ""
for c in text:
modified_text += get_simplified_character(c)
return modified_text
def stem_word(word: str) -> str:
"""
Stemming words
:param word: word
:return: stemmed word
"""
if len(word) < 4:
return word
# remove 'ට'
if word[-1] == 'ට':
return word[:-1]
# remove 'ද'
if word[-1] == 'ද':
return word[:-1]
# remove 'ටත්'
if word[-3:] == 'ටත්':
return word[:-3]
# remove 'එක්'
if word[-3:] == 'ෙක්':
return word[:-3]
# remove 'එ'
if word[-1:] == 'ෙ':
return word[:-1]
# remove 'ක්'
if word[-2:] == 'ක්':
return word[:-2]
# remove 'ගෙ' (instead of ගේ because this step comes after simplifying text)
if word[-2:] == 'ගෙ':
return word[:-2]
# else
return word
def tokenize(text: str) -> list:
    # full pipeline: strip the retweet prefix, simplify and lower-case, replace mentions, drop URLs, then split and stem each token
return [stem_word(token) for token in split_tokens(replace_url(replace_mention(
simplify_sinhalese_text(remove_retweet_state(text.strip('"')).lower()))))]
|
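A small usage sketch of the tokenize() pipeline above; the sample tweet is invented and the exact tokens depend on the stemming and simplification rules.

sample = 'RT @someone: hello @friend, check https://www.example.com/page'
print(tokenize(sample))
# the retweet prefix is stripped, @friend becomes PERSON, the URL is dropped,
# and the remaining tokens are lower-cased and stemmed.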
py | b41419468efae6b9fa1d349639cf150d9da1c935 | import pytest
from dvc.config import RemoteNotFoundError
from dvc.fs import get_fs_cls, get_fs_config
from dvc.fs.hdfs import HDFSFileSystem
from dvc.fs.http import HTTPFileSystem
from dvc.fs.https import HTTPSFileSystem
from dvc.fs.local import LocalFileSystem
from dvc.fs.s3 import S3FileSystem
from dvc.fs.ssh import SSHFileSystem
@pytest.mark.parametrize(
"url, cls",
[
("s3://bucket/path", S3FileSystem),
("ssh://example.com:/dir/path", SSHFileSystem),
("hdfs://example.com/dir/path", HDFSFileSystem),
("http://example.com/path/to/file", HTTPFileSystem),
("https://example.com/path/to/file", HTTPSFileSystem),
("path/to/file", LocalFileSystem),
("path\\to\\file", LocalFileSystem),
("file", LocalFileSystem),
("./file", LocalFileSystem),
(".\\file", LocalFileSystem),
("../file", LocalFileSystem),
("..\\file", LocalFileSystem),
("unknown://path", LocalFileSystem),
],
)
def test_get_fs_cls(url, cls):
assert get_fs_cls({"url": url}) == cls
def test_get_fs_config():
with pytest.raises(RemoteNotFoundError):
get_fs_config(None, {"remote": {}}, name="myremote")
|
py | b414195438d13e5a1e7e208e74024503d26fbe10 | """
Database schema.
"""
import datetime
import enum
import os
import copy
import gwemopt.utils
import gwemopt.ztf_tiling
from astropy import table
from astropy import coordinates
from astropy import units as u
from flask_login.mixins import UserMixin
from flask_sqlalchemy import SQLAlchemy
import gcn
import healpy as hp
from ligo.skymap.bayestar import rasterize
import lxml.etree
import pkg_resources
import numpy as np
from sqlalchemy.ext.associationproxy import association_proxy
from sqlalchemy.ext.hybrid import hybrid_property
from sqlalchemy_utils import EmailType, PhoneNumberType
from tqdm import tqdm
from .flask import app
db = SQLAlchemy(app)
def get_ztf_quadrants():
"""Calculate ZTF quadrant footprints as offsets from the telescope
boresight."""
quad_prob = gwemopt.ztf_tiling.QuadProb(0, 0)
ztf_tile = gwemopt.ztf_tiling.ZTFtile(0, 0)
quad_cents_ra, quad_cents_dec = ztf_tile.quadrant_centers()
offsets = np.asarray([
quad_prob.getWCS(
quad_cents_ra[quadrant_id],
quad_cents_dec[quadrant_id]
).calc_footprint(axes=quad_prob.quadrant_size)
for quadrant_id in range(64)])
return np.transpose(offsets, (2, 0, 1))
def create_all():
db.create_all(bind=None)
telescopes = ["ZTF", "Gattini", "DECam", "KPED", "GROWTH-India"]
available_filters = {"ZTF": ["g", "r", "i"],
"Gattini": ["J"],
"DECam": ["g", "r", "i", "z"],
"KPED": ["U", "g", "r", "i"],
"GROWTH-India": ["g", "r", "i", "z"]}
plan_args = {
'ZTF': {
'filt': ['g', 'r', 'g'],
'exposuretimes': [300.0, 300.0, 300.0],
'doReferences': True,
'doUsePrimary': True,
'doBalanceExposure': False,
'doDither': False,
'usePrevious': False,
'doCompletedObservations': False,
'doPlannedObservations': False,
'cobs': [None, None],
'schedule_type': 'greedy',
'filterScheduleType': 'block',
'airmass': 2.5,
'schedule_strategy': 'tiling',
'mindiff': 30.*60.,
'doMaxTiles': False,
'max_nb_tiles': 1000
},
'DECam': {
'filt': ['g', 'z'],
'exposuretimes': [25.0, 25.0],
'doReferences': True,
'doUsePrimary': False,
'doBalanceExposure': False,
'doDither': True,
'usePrevious': False,
'doCompletedObservations': False,
'doPlannedObservations': False,
'cobs': [None, None],
'schedule_type': 'greedy_slew',
'filterScheduleType': 'integrated',
'airmass': 2.5,
'schedule_strategy': 'tiling',
'mindiff': 30.*60.,
'doMaxTiles': False,
'max_nb_tiles': 1000
},
'Gattini': {
'filt': ['J'],
'exposuretimes': [300.0],
'doReferences': False,
'doUsePrimary': False,
'doBalanceExposure': False,
'doDither': False,
'usePrevious': False,
'doCompletedObservations': False,
'doPlannedObservations': False,
'cobs': [None, None],
'schedule_type': 'greedy',
'filterScheduleType': 'block',
'airmass': 2.5,
'schedule_strategy': 'tiling',
'mindiff': 30.*60.,
'doMaxTiles': False,
'max_nb_tiles': 1000
},
'KPED': {
'filt': ['r'],
'exposuretimes': [300.0],
'doReferences': False,
'doUsePrimary': False,
'doBalanceExposure': False,
'doDither': False,
'usePrevious': False,
'doCompletedObservations': False,
'doPlannedObservations': False,
'cobs': [None, None],
'schedule_type': 'greedy',
'filterScheduleType': 'integrated',
'airmass': 2.5,
'schedule_strategy': 'catalog',
'mindiff': 30.*60.,
'doMaxTiles': False,
'max_nb_tiles': 1000
},
'GROWTH-India': {
'filt': ['r'],
'exposuretimes': [300.0],
'doReferences': False,
'doUsePrimary': False,
'doBalanceExposure': False,
'doDither': False,
'usePrevious': False,
'doCompletedObservations': False,
'doPlannedObservations': False,
'cobs': [None, None],
'schedule_type': 'greedy',
'filterScheduleType': 'integrated',
'airmass': 2.5,
'schedule_strategy': 'catalog',
'mindiff': 30.*60.,
'doMaxTiles': False,
'max_nb_tiles': 1000
}
}
with tqdm(telescopes) as telescope_progress:
for tele in telescope_progress:
telescope_progress.set_description('populating {}'.format(tele))
filename = pkg_resources.resource_filename(
__name__, 'input/%s.ref' % tele)
if os.path.isfile(filename):
refstable = table.Table.read(
filename, format='ascii', data_start=2, data_end=-1)
refs = table.unique(refstable, keys=['field', 'fid'])
if "maglimcat" not in refs.columns:
refs["maglimcat"] = np.nan
reference_images = {
group[0]['field']: group['fid'].astype(int).tolist()
for group in refs.group_by('field').groups}
reference_mags = {
group[0]['field']: group['maglimcat'].tolist()
for group in refs.group_by('field').groups}
else:
reference_images = {}
reference_mags = {}
tesspath = 'input/%s.tess' % tele
try:
tessfile = app.open_instance_resource(tesspath)
except IOError:
tessfile = pkg_resources.resource_stream(__name__, tesspath)
tessfilename = tessfile.name
tessfile.close()
fields = np.recfromtxt(
tessfilename, usecols=range(3),
names=['field_id', 'ra', 'dec'])
with pkg_resources.resource_stream(
__name__, 'config/%s.config' % tele) as g:
config_struct = {}
for line in g.readlines():
line_without_return = line.decode().split("\n")
line_split = line_without_return[0].split(" ")
line_split = list(filter(None, line_split))
if line_split:
try:
config_struct[line_split[0]] = float(line_split[1])
except ValueError:
config_struct[line_split[0]] = line_split[1]
db.session.merge(Telescope(telescope=tele,
lat=config_struct["latitude"],
lon=config_struct["longitude"],
elevation=config_struct["elevation"],
timezone=config_struct["timezone"],
filters=available_filters[tele],
default_plan_args=plan_args[tele]))
for field_id, ra, dec in tqdm(fields, 'populating fields'):
ref_filter_ids = reference_images.get(field_id, [])
ref_filter_mags = []
for val in reference_mags.get(field_id, []):
ref_filter_mags.append(val)
bands = {1: 'g', 2: 'r', 3: 'i', 4: 'z', 5: 'J'}
ref_filter_bands = [bands.get(n, n) for n
in ref_filter_ids]
if config_struct["FOV_type"] == "square":
ipix, radecs, patch, area = gwemopt.utils.getSquarePixels(
ra, dec, config_struct["FOV"], Localization.nside)
elif config_struct["FOV_type"] == "circle":
ipix, radecs, patch, area = gwemopt.utils.getCirclePixels(
ra, dec, config_struct["FOV"], Localization.nside)
if len(radecs) == 0:
continue
corners = np.vstack((radecs, radecs[0, :]))
if corners.size == 10:
corners_copy = copy.deepcopy(corners)
corners[2] = corners_copy[3]
corners[3] = corners_copy[2]
contour = {
'type': 'Feature',
'geometry': {
'type': 'MultiLineString',
'coordinates': [corners.tolist()]
},
'properties': {
'telescope': tele,
'field_id': int(field_id),
'ra': ra,
'dec': dec,
'depth': dict(zip(ref_filter_bands, ref_filter_mags))
}
}
db.session.merge(Field(telescope=tele,
field_id=int(field_id),
ra=ra, dec=dec, contour=contour,
reference_filter_ids=ref_filter_ids,
reference_filter_mags=ref_filter_mags,
ipix=ipix.tolist()))
if tele == "ZTF":
quadrant_coords = get_ztf_quadrants()
skyoffset_frames = coordinates.SkyCoord(
fields['ra'], fields['dec'], unit=u.deg
).skyoffset_frame()
quadrant_coords_icrs = coordinates.SkyCoord(
*np.tile(
quadrant_coords[:, np.newaxis, ...],
(len(fields), 1, 1)), unit=u.deg,
frame=skyoffset_frames[:, np.newaxis, np.newaxis]
).transform_to(coordinates.ICRS)
quadrant_xyz = np.moveaxis(
quadrant_coords_icrs.cartesian.xyz.value, 0, -1)
for field_id, xyz in zip(
tqdm(fields['field_id'], 'populating subfields'),
quadrant_xyz):
for ii, xyz in enumerate(xyz):
ipix = hp.query_polygon(Localization.nside, xyz)
db.session.merge(SubField(telescope=tele,
field_id=int(field_id),
subfield_id=int(ii),
ipix=ipix.tolist()))
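# Illustrative sketch (not part of the ingestion loop above): how a single
# field footprint becomes HEALPix indices and an area. The corner coordinates
# are hypothetical; only the healpy calls (query_polygon, nside2pixarea)
# mirror what the code above does.
#
#     import healpy as hp
#     import numpy as np
#
#     nside = 512
#     corner_ra = np.array([10.0, 13.5, 13.5, 10.0])    # deg, hypothetical
#     corner_dec = np.array([20.0, 20.0, 23.5, 23.5])   # deg, hypothetical
#     xyz = hp.ang2vec(corner_ra, corner_dec, lonlat=True)
#     ipix = hp.query_polygon(nside, xyz)
#     area_deg2 = hp.nside2pixarea(nside, degrees=True) * len(ipix)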
class User(db.Model, UserMixin):
name = db.Column(
db.String,
primary_key=True,
comment='Unique username')
email = db.Column(
EmailType,
comment='E-mail address')
phone = db.Column(
PhoneNumberType,
comment='Mobile/SMS phone number')
voice = db.Column(
db.Boolean,
nullable=False,
default=False,
comment='Set to true for voice alerts (default: SMS only)')
timezone = db.Column(
db.Unicode,
nullable=False,
default='America/New_York')
alert_from = db.Column(
db.Time,
comment='Start of hours for alerts')
alert_to = db.Column(
db.Time,
comment='End of hours for alerts')
def get_id(self):
"""Provide user ID for flask_login."""
return self.name
class Event(db.Model):
"""Event information, including an event ID, mission, and time of the
event"""
dateobs = db.Column(
db.DateTime,
primary_key=True,
comment='Event time')
gcn_notices = db.relationship(
lambda: GcnNotice,
order_by=lambda: GcnNotice.date)
_tags = db.relationship(
lambda: Tag,
lazy='selectin',
order_by=lambda: (
db.func.lower(Tag.text).notin_({'fermi', 'swift', 'amon', 'lvc'}),
db.func.lower(Tag.text).notin_({'long', 'short'}),
db.func.lower(Tag.text).notin_({'grb', 'gw', 'transient'})
)
)
tags = association_proxy(
'_tags',
'text',
creator=lambda tag: Tag(text=tag))
localizations = db.relationship(lambda: Localization)
plans = db.relationship(lambda: Plan, backref='event')
@hybrid_property
def retracted(self):
return 'retracted' in self.tags
@retracted.expression
def retracted(cls):
return db.literal('retracted').in_(cls.tags)
@property
def lightcurve(self):
try:
notice = self.gcn_notices[0]
except IndexError:
return None
root = lxml.etree.fromstring(notice.content)
elem = root.find(".//Param[@name='LightCurve_URL']")
if elem is None:
return None
else:
return elem.attrib.get('value', '').replace('http://', 'https://')
@property
def gracedb(self):
try:
notice = self.gcn_notices[0]
except IndexError:
return None
root = lxml.etree.fromstring(notice.content)
elem = root.find(".//Param[@name='EventPage']")
if elem is None:
return None
else:
return elem.attrib.get('value', '')
@property
def ned_gwf(self):
return "https://ned.ipac.caltech.edu/gwf/events"
@property
def graceid(self):
try:
notice = self.gcn_notices[0]
except IndexError:
return None
root = lxml.etree.fromstring(notice.content)
elem = root.find(".//Param[@name='GraceID']")
if elem is None:
return None
else:
return elem.attrib.get('value', '')
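# Illustrative sketch of the VOEvent lookup pattern shared by the lightcurve,
# gracedb, and graceid properties above. The XML snippet and Param value are
# hypothetical; only the lxml calls mirror the code.
#
#     import lxml.etree
#
#     xml = (b'<voe:VOEvent xmlns:voe="http://www.ivoa.net/xml/VOEvent/v2.0">'
#            b'<What><Param name="GraceID" value="S190000x"/></What>'
#            b'</voe:VOEvent>')
#     root = lxml.etree.fromstring(xml)
#     elem = root.find(".//Param[@name='GraceID']")
#     graceid = elem.attrib.get('value', '') if elem is not None else None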
class Tag(db.Model):
"""Store qualitative tags for events."""
dateobs = db.Column(
db.DateTime,
db.ForeignKey(Event.dateobs),
primary_key=True)
text = db.Column(
db.Unicode,
primary_key=True)
class Telescope(db.Model):
"""Telescope information"""
telescope = db.Column(
db.String,
primary_key=True,
comment='Telescope name')
lat = db.Column(
db.Float,
nullable=False,
comment='Latitude')
lon = db.Column(
db.Float,
nullable=False,
comment='Longitude')
elevation = db.Column(
db.Float,
nullable=False,
comment='Elevation')
timezone = db.Column(
db.String,
nullable=False,
comment='Time zone')
filters = db.Column(
db.ARRAY(db.String),
nullable=False,
comment='Available filters')
fields = db.relationship(lambda: Field)
plans = db.relationship(lambda: Plan)
default_plan_args = db.Column(
db.JSON,
nullable=False,
comment='Default plan arguments')
class Field(db.Model):
"""Footprints and number of observations in each filter for standard PTF
tiles"""
telescope = db.Column(
db.String,
db.ForeignKey(Telescope.telescope),
primary_key=True,
comment='Telescope')
field_id = db.Column(
db.Integer,
primary_key=True,
comment='Field ID')
ra = db.Column(
db.Float,
nullable=False,
comment='RA of field center')
dec = db.Column(
db.Float,
nullable=False,
comment='Dec of field center')
contour = db.Column(
db.JSON,
nullable=False,
comment='GeoJSON contours')
reference_filter_ids = db.Column(
db.ARRAY(db.Integer),
nullable=False,
comment='Reference filter IDs')
reference_filter_mags = db.Column(
db.ARRAY(db.Float),
nullable=False,
comment='Reference filter mags')
ipix = db.Column(
db.ARRAY(db.Integer),
comment='Healpix indices')
subfields = db.relationship(lambda: SubField)
class SubField(db.Model):
"""SubFields"""
__table_args__ = (
db.ForeignKeyConstraint(
['telescope',
'field_id'],
['field.telescope',
'field.field_id']
),
)
telescope = db.Column(
db.String,
db.ForeignKey(Telescope.telescope),
primary_key=True,
comment='Telescope')
field_id = db.Column(
db.Integer,
primary_key=True,
comment='Field ID')
subfield_id = db.Column(
db.Integer,
primary_key=True,
comment='SubField ID')
ipix = db.Column(
db.ARRAY(db.Integer),
comment='Healpix indices')
class GcnNotice(db.Model):
"""Records of ingested GCN notices"""
ivorn = db.Column(
db.String,
primary_key=True,
comment='Unique identifier of VOEvent')
notice_type = db.Column(
db.Enum(gcn.NoticeType, native_enum=False),
nullable=False,
comment='GCN Notice type')
stream = db.Column(
db.String,
nullable=False,
comment='Event stream or mission (e.g., "Fermi")')
date = db.Column(
db.DateTime,
nullable=False,
comment='UTC message timestamp')
dateobs = db.Column(
db.DateTime,
db.ForeignKey(Event.dateobs),
nullable=False,
comment='UTC event timestamp')
content = db.deferred(db.Column(
db.LargeBinary,
nullable=False,
comment='Raw VOEvent content'))
def _get_property(self, property_name, value=None):
root = lxml.etree.fromstring(self.content)
path = ".//Param[@name='{}']".format(property_name)
elem = root.find(path)
value = float(elem.attrib.get('value', '')) * 100
return value
@property
def has_ns(self):
return self._get_property(property_name="HasNS")
@property
def has_remnant(self):
return self._get_property(property_name="HasRemnant")
@property
def far(self):
return self._get_property(property_name="FAR")
@property
def bns(self):
return self._get_property(property_name="BNS")
@property
def nsbh(self):
return self._get_property(property_name="NSBH")
@property
def bbh(self):
return self._get_property(property_name="BBH")
@property
def mass_gap(self):
return self._get_property(property_name="MassGap")
@property
def noise(self):
return self._get_property(property_name="Terrestrial")
class Localization(db.Model):
"""Localization information, including the localization ID, event ID, right
ascension, declination, error radius (if applicable), and the healpix
map."""
nside = 512
"""HEALPix resolution used for flat (non-multiresolution) operations."""
dateobs = db.Column(
db.DateTime,
db.ForeignKey(Event.dateobs),
primary_key=True,
comment='UTC event timestamp')
localization_name = db.Column(
db.String,
primary_key=True,
comment='Localization name')
uniq = db.deferred(db.Column(
db.ARRAY(db.BigInteger),
nullable=False,
comment='Multiresolution HEALPix UNIQ pixel index array'))
probdensity = db.deferred(db.Column(
db.ARRAY(db.Float),
nullable=False,
comment='Multiresolution HEALPix probability density array'))
distmu = db.deferred(db.Column(
db.ARRAY(db.Float),
comment='Multiresolution HEALPix distance mu array'))
distsigma = db.deferred(db.Column(
db.ARRAY(db.Float),
comment='Multiresolution HEALPix distance sigma array'))
distnorm = db.deferred(db.Column(
db.ARRAY(db.Float),
comment='Multiresolution HEALPix distance normalization array'))
contour = db.deferred(db.Column(
db.JSON,
comment='GeoJSON contours'))
@hybrid_property
def is_3d(self):
return (self.distmu is not None and
self.distsigma is not None and
self.distnorm is not None)
@is_3d.expression
def is_3d(self):
return (self.distmu.isnot(None) and
self.distsigma.isnot(None) and
self.distnorm.isnot(None))
@property
def table_2d(self):
"""Get multiresolution HEALPix dataset, probability density only."""
return table.Table(
[self.uniq, self.probdensity], names=['UNIQ', 'PROBDENSITY'])
@property
def table(self):
"""Get multiresolution HEALPix dataset, probability density and
distance."""
if self.is_3d:
return table.Table(
[
self.uniq,
self.probdensity, self.distmu,
self.distsigma, self.distnorm],
names=[
'UNIQ', 'PROBDENSITY', 'DISTMU', 'DISTSIGMA', 'DISTNORM'])
else:
return self.table_2d
@property
def flat_2d(self):
"""Get flat resolution HEALPix dataset, probability density only."""
order = hp.nside2order(Localization.nside)
result = rasterize(self.table_2d, order)['PROB']
return hp.reorder(result, 'NESTED', 'RING')
@property
def flat(self):
"""Get flat resolution HEALPix dataset, probability density and
distance."""
if self.is_3d:
order = hp.nside2order(Localization.nside)
t = rasterize(self.table, order)
result = t['PROB'], t['DISTMU'], t['DISTSIGMA'], t['DISTNORM']
return hp.reorder(result, 'NESTED', 'RING')
else:
return self.flat_2d,
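# Background sketch (not used by the model above): the UNIQ column follows the
# HEALPix NUNIQ convention, uniq = ipix_nest + 4 * nside**2 with nside = 2**order,
# so the order and NESTED pixel index can be recovered as below. Values hypothetical.
#
#     import numpy as np
#
#     uniq = np.asarray([4, 1000, 123456])
#     order = np.log2(uniq).astype(int) // 2 - 1
#     nside = 2 ** order
#     ipix_nest = uniq - 4 * nside ** 2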
class Plan(db.Model):
"""Tiling information, including the event time, localization ID, tile IDs,
and plan name"""
dateobs = db.Column(
db.DateTime,
db.ForeignKey(Event.dateobs),
primary_key=True,
comment='UTC event timestamp')
telescope = db.Column(
db.String,
db.ForeignKey(Telescope.telescope),
primary_key=True,
comment='Telescope')
plan_name = db.Column(
db.String,
primary_key=True,
comment='Plan name')
validity_window_start = db.Column(
db.DateTime,
nullable=False,
default=lambda: datetime.datetime.now(),
comment='Start of validity window')
validity_window_end = db.Column(
db.DateTime,
nullable=False,
default=lambda: datetime.datetime.now() + datetime.timedelta(1),
comment='End of validity window')
plan_args = db.Column(
db.JSON,
nullable=False,
comment='Plan arguments')
# FIXME: Hard-code program_id, filter_id, subprogram_name
program_id = 2
class Status(enum.IntEnum):
WORKING = 0
READY = 1
SUBMITTED = 2
status = db.Column(
db.Enum(Status),
default=Status.WORKING,
nullable=False,
comment='Plan status')
planned_observations = db.relationship(
'PlannedObservation', backref='plan',
order_by=lambda: PlannedObservation.obstime)
@property
def start_observation(self):
"""Time of the first planned observation."""
if self.planned_observations:
return self.planned_observations[0].obstime
else:
return None
@hybrid_property
def num_observations(self):
"""Number of planned observation."""
return len(self.planned_observations)
@num_observations.expression
def num_observations(cls):
"""Number of planned observation."""
return cls.planned_observations.count()
@property
def num_observations_per_filter(self):
"""Number of planned observation per filter."""
filters = list(Telescope.query.get(self.telescope).filters)
nepochs = np.zeros(len(filters),)
bands = {1: 'g', 2: 'r', 3: 'i', 4: 'z', 5: 'J'}
for planned_observation in self.planned_observations:
filt = bands[planned_observation.filter_id]
idx = filters.index(filt)
nepochs[idx] = nepochs[idx] + 1
nobs_per_filter = []
for ii, filt in enumerate(filters):
nobs_per_filter.append("%s: %d" % (filt, nepochs[ii]))
return " ".join(nobs_per_filter)
@property
def total_time(self):
"""Total observation time (seconds)."""
return sum(_.exposure_time for _ in self.planned_observations)
@property
def tot_time_with_overheads(self):
overhead = sum(
_.overhead_per_exposure for _ in self.planned_observations)
return overhead + self.total_time
@property
def ipix(self):
return {
i for planned_observation in self.planned_observations
if planned_observation.field.ipix is not None
for i in planned_observation.field.ipix}
@property
def area(self):
nside = Localization.nside
return hp.nside2pixarea(nside, degrees=True) * len(self.ipix)
def get_probability(self, localization):
ipix = np.asarray(list(self.ipix))
if len(ipix) > 0:
return localization.flat_2d[ipix].sum()
else:
return 0.0
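# Minimal sketch of the area/probability computations above, assuming a flat
# RING-ordered probability map such as Localization.flat_2d. Pixel indices and
# map values are hypothetical.
#
#     import healpy as hp
#     import numpy as np
#
#     nside = 512
#     npix = hp.nside2npix(nside)
#     prob = np.full(npix, 1.0 / npix)    # uniform map, hypothetical
#     ipix = np.arange(1000)              # pixels covered by a plan
#     covered_prob = prob[ipix].sum()     # cf. Plan.get_probability
#     covered_area = hp.nside2pixarea(nside, degrees=True) * len(ipix)  # cf. Plan.area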
class PlannedObservation(db.Model):
"""Tile information, including the event time, localization ID, field IDs,
tiling name, and tile probabilities."""
__table_args__ = (
db.ForeignKeyConstraint(
['dateobs',
'telescope',
'plan_name'],
['plan.dateobs',
'plan.telescope',
'plan.plan_name'],
ondelete='CASCADE',
onupdate='CASCADE'
),
db.ForeignKeyConstraint(
['telescope',
'field_id'],
['field.telescope',
'field.field_id']
),
)
planned_observation_id = db.Column(
db.Integer,
primary_key=True,
comment='Exposure ID')
dateobs = db.Column(
db.DateTime,
db.ForeignKey(Event.dateobs),
primary_key=True,
comment='UTC event timestamp')
telescope = db.Column(
db.String,
db.ForeignKey(Telescope.telescope),
primary_key=True,
comment='Telescope')
field_id = db.Column(
db.Integer,
primary_key=True,
comment='Field ID')
plan_name = db.Column(
db.String,
primary_key=True,
comment='Plan name')
field = db.relationship(Field, viewonly=True)
exposure_time = db.Column(
db.Integer,
nullable=False,
comment='Exposure time in seconds')
# FIXME: remove
weight = db.Column(
db.Float,
nullable=False,
comment='Weight associated with each observation')
filter_id = db.Column(
db.Integer,
nullable=False,
comment='Filter ID (g=1, r=2, i=3, z=4, J=5)')
obstime = db.Column(
db.DateTime,
nullable=False,
comment='UTC observation timestamp')
overhead_per_exposure = db.Column(
db.Integer,
nullable=False,
comment='Overhead time per exposure in seconds')
class Observation(db.Model):
"""Observation information, including the field ID, exposure time, and
filter."""
__table_args__ = (
db.ForeignKeyConstraint(
['telescope',
'field_id'],
['field.telescope',
'field.field_id']
),
)
telescope = db.Column(
db.String,
db.ForeignKey(Telescope.telescope),
primary_key=True,
comment='Telescope')
field_id = db.Column(
db.Integer,
primary_key=True,
comment='Field ID')
observation_id = db.Column(
db.Integer,
primary_key=True,
comment='Observation ID')
obstime = db.Column(
db.DateTime,
comment='Exposure timestamp')
field = db.relationship(Field)
filter_id = db.Column(
db.Integer,
nullable=False,
comment='Filter ID (g=1, r=2, i=3, z=4, J=5)')
exposure_time = db.Column(
db.Integer,
nullable=False,
comment='Exposure times')
airmass = db.Column(
db.Float,
comment='Airmass')
seeing = db.Column(
db.Float,
comment='Seeing')
limmag = db.Column(
db.Float,
comment='Limiting magnitude')
subfield_id = db.Column(
db.Integer,
default=0,
primary_key=True,
nullable=False,
comment='Subfield (e.g. quadrant/chip, as relevant for the instrument)')
successful = db.Column(
db.Boolean,
nullable=False,
comment='processed successfully?')
class Candidate(db.Model):
name = db.Column(
db.String,
primary_key=True,
comment='Candidate name')
growth_marshal_id = db.Column(
db.String,
unique=True, nullable=False,
comment='GROWTH marshal ID')
subfield_id = db.Column(
db.Integer,
nullable=True,
comment='Readout channel ID')
creationdate = db.Column(
db.DateTime,
comment='Date of candidate creation')
classification = db.Column(
db.String,
nullable=True,
comment='Classification')
redshift = db.Column(
db.Float,
nullable=True,
comment='Redshift of the source')
iauname = db.Column(
db.String,
nullable=True,
comment='IAU name on TNS')
field_id = db.Column(
db.Integer,
comment='Field ID')
candid = db.Column(
db.BigInteger,
comment='Candidate ID')
ra = db.Column(
db.Float,
nullable=False,
comment='RA of the candidate')
dec = db.Column(
db.Float,
nullable=False,
comment='Dec of the candidate')
last_updated = db.Column(
db.DateTime,
nullable=False,
comment='Date of last update')
autoannotations = db.Column(
db.String,
nullable=True,
comment='Autoannotations from the GROWTH marshal')
photometry = db.relationship(
lambda: CandidatePhotometry,
backref='candidate',
order_by=lambda: CandidatePhotometry.dateobs)
@hybrid_property
def first_detection_time(self):
return self.photometry[0].dateobs
@first_detection_time.expression
def first_detection_time(cls):
return db.select(
[db.func.min(cls.dateobs)]
).where(
CandidatePhotometry.name == cls.name
).label(__name__)
class CandidatePhotometry(db.Model):
"""Candidate light curve pulled from the GROWTH
Marshal"""
lcid = db.Column(
db.BigInteger,
primary_key=True)
name = db.Column(
db.ForeignKey(Candidate.name),
nullable=False,
comment='Candidate name')
dateobs = db.Column(
db.DateTime,
nullable=True,
comment='Observation date')
fil = db.Column(
db.String,
nullable=True,
comment='Filter')
instrument = db.Column(
db.String,
nullable=True,
comment='Instruments')
limmag = db.Column(
db.Float,
nullable=True,
comment='Limiting magnitude')
mag = db.Column(
db.Float,
nullable=True,
comment='Mag PSF')
magerr = db.Column(
db.Float,
nullable=True,
comment='Mag uncertainty')
exptime = db.Column(
db.Float,
nullable=True,
comment='Exposure time')
programid = db.Column(
db.Integer,
nullable=True,
comment='Program ID number (1,2,3)')
|
py | b41419b48f6d9cbccdb181c3f0376049d85209c7 | from typing import Any, Dict, Optional
class RaidenError(Exception):
""" Base exception, used to catch all raiden related exceptions. """
pass
class RaidenRecoverableError(RaidenError):
pass
class RaidenUnrecoverableError(RaidenError):
pass
# Exceptions raised due to programming errors
class HashLengthNot32(RaidenError):
""" Raised if the length of the provided element is not 32 bytes in length,
a keccak hash is required to include the element in the merkle tree.
"""
pass
class UnknownEventType(RaidenError):
"""Raised if decoding of an event failed."""
pass
# Exceptions raised due to user interaction (the user may be another software)
class ChannelNotFound(RaidenError):
""" Raised when a provided channel via the REST api is not found in the
internal data structures"""
pass
class PaymentConflict(RaidenRecoverableError):
""" Raised when there is another payment with the same identifier but the
attributes of the payment don't match.
"""
pass
class InsufficientFunds(RaidenError):
""" Raised when provided account doesn't have token funds to complete the
requested deposit.
Used when a *user* tries to deposit a given amount of token in a channel,
but his account doesn't have enough funds to pay for the deposit.
"""
pass
class DepositOverLimit(RaidenError):
""" Raised when the requested deposit is over the limit
Used when a *user* tries to deposit a given amount of token in a channel,
but the amount is over the testing limit.
"""
pass
class DepositMismatch(RaidenRecoverableError):
""" Raised when the requested deposit is lower than actual channel deposit
Used when a *user* tries to deposit a given amount of token in a channel,
but the on-chain amount is already higher.
"""
pass
class InvalidAddress(RaidenError):
""" Raised when the user provided value is not a valid address. """
pass
class InvalidSecret(RaidenError):
""" Raised when the user provided value is not a valid secret. """
pass
class InvalidSecretHash(RaidenError):
""" Raised when the user provided value is not a valid secrethash. """
pass
class InvalidAmount(RaidenError):
""" Raised when the user provided value is not a positive integer and
cannot be used to define a transfer value.
"""
pass
class InvalidSettleTimeout(RaidenError):
""" Raised when the user provided timeout value is less than the minimum
settle timeout"""
pass
class InvalidSignature(RaidenError):
"""Raised on invalid signature recover/verify"""
pass
class SamePeerAddress(RaidenError):
""" Raised when a user tries to create a channel where the address of both
peers is the same.
"""
class UnknownAddress(RaidenError):
""" Raised when the user provided address is valid but is not from a known
node. """
pass
class UnknownTokenAddress(RaidenError):
""" Raised when the token address in unknown. """
pass
class TokenNotRegistered(RaidenError):
""" Raised if there is no token network for token used when opening a channel """
pass
class AlreadyRegisteredTokenAddress(RaidenError):
""" Raised when the token address in already registered with the given network. """
pass
class InvalidToken(RaidenError):
""" Raised if the token does not follow the ERC20 standard """
pass
# Exceptions raised due to protocol errors (this includes messages received
# from a byzantine node)
class STUNUnavailableException(RaidenError):
pass
class EthNodeCommunicationError(RaidenError):
""" Raised when something unexpected has happened during
communication with the underlying ethereum node"""
def __init__(self, error_msg: str) -> None:
super().__init__(error_msg)
class EthNodeInterfaceError(RaidenError):
""" Raised when the underlying ETH node does not support an rpc interface"""
pass
class AddressWithoutCode(RaidenError):
"""Raised on attempt to execute contract on address without a code."""
pass
class AddressWrongContract(RaidenError):
"""Raised on attempt to execute contract on address that has code but
is probably not the contract we wanted."""
pass
class DuplicatedChannelError(RaidenRecoverableError):
"""Raised if someone tries to create a channel that already exists."""
class ContractVersionMismatch(RaidenError):
"""Raised if deployed version of the contract differs."""
class TransactionThrew(RaidenError):
"""Raised when, after waiting for a transaction to be mined,
the receipt has a 0x0 status field"""
def __init__(self, txname: str, receipt: Optional[Dict[str, Any]]) -> None:
super().__init__("{} transaction threw. Receipt={}".format(txname, receipt))
class InvalidProtocolMessage(RaidenError):
"""Raised on an invalid or an unknown Raiden protocol message"""
class APIServerPortInUseError(RaidenError):
"""Raised when API server port is already in use"""
class RaidenServicePortInUseError(RaidenError):
"""Raised when Raiden service port is already in use"""
class InvalidDBData(RaidenUnrecoverableError):
"""Raised when the data of the WAL are in an unexpected format"""
class InvalidBlockNumberInput(RaidenError):
"""Raised when the user provided a block number that is < 0 or > UINT64_MAX"""
class NoStateForBlockIdentifier(RaidenError):
"""
Raised when we attempt to provide a block identifier older
than STATE_PRUNING_AFTER_BLOCKS blocks
"""
class InvalidNumberInput(RaidenError):
"""Raised when the user provided an invalid number"""
class TokenAppNotFound(RaidenError):
"""Raised when the token app is not found"""
class TokenAppExpired(RaidenError):
"""Raised when the token app is not found"""
class TransportError(RaidenError):
""" Raised when a transport encounters an unexpected error """
class ReplacementTransactionUnderpriced(RaidenError):
"""Raised when a replacement transaction is rejected by the blockchain"""
class TransactionAlreadyPending(RaidenUnrecoverableError):
"""Raised when a transaction is already pending"""
class ChannelOutdatedError(RaidenError):
""" Raised when an action is invoked on a channel whose
identifier has been replaced with a new channel identifier
due to a close/re-open of current channel.
"""
class InsufficientGasReserve(RaidenError):
""" Raised when an action cannot be done because the available balance
is not sufficient for the lifecycles of all active channels.
"""
class ServiceRequestFailed(RaidenError):
""" Raised when a request to one of the raiden services fails. """
class ServiceRequestIOURejected(ServiceRequestFailed):
""" Raised when a service request fails due to a problem with the iou. """
def __init__(self, message: str, error_code: int) -> None:
super().__init__(f"{message} ({error_code})")
self.error_code = error_code
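# Usage sketch (illustrative only): callers typically split recoverable from
# unrecoverable failures and fall back to RaidenError for anything else
# raiden-related. `do_work` is a hypothetical callable.
#
#     def handle(do_work):
#         try:
#             do_work()
#         except RaidenRecoverableError:
#             pass   # safe to retry or surface to the user
#         except RaidenUnrecoverableError:
#             raise  # internal state may be inconsistent; stop
#         except RaidenError:
#             raise  # any other raiden-related failure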
|
py | b41419ec2ac37fa4656da636b439a51cf3af30f0 | from next.api import api_blueprint
from next.dashboard.dashboard import dashboard
from next.assistant.assistant_blueprint import assistant
from next.home import home
from next.query_page import query_page
import next.constants as constants
from flask import Flask
app = Flask(__name__)
app.register_blueprint(api_blueprint.api, url_prefix='/api')
app.register_blueprint(assistant, url_prefix='/assistant')
app.register_blueprint(home, url_prefix='/home')
if constants.SITE_KEY:
dashboard_prefix = '/dashboard/{}'.format(constants.SITE_KEY)
else:
dashboard_prefix = '/dashboard'
app.register_blueprint(dashboard, url_prefix=dashboard_prefix)
app.register_blueprint(query_page, url_prefix='/query')
import logging
import sys
# Log to standard out. Remember to turn off in production
app.logger.addHandler(logging.StreamHandler(sys.stdout))
app.logger.setLevel(logging.DEBUG)
# Handle 404 (resource not found) errors with a custom JSON error message
import json
@app.errorhandler(404)
def internal_system_error(error):
response = {
'meta':{
'status':'FAIL',
'code':404,
'message':'Resource not found'
}
}
return json.dumps(response), 404
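if __name__ == '__main__':
    # Development entry point (illustrative): host/port are assumptions; in a
    # real deployment the app is served by the surrounding NEXT infrastructure.
    app.run(host='0.0.0.0', port=8000, debug=False)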
|
py | b4141a3f49624252c386ab1f2c7ee5a9a90df048 | '''
(c) REACT LAB, Harvard University
Author: Ninad Jadhav, Weiying Wang
'''
#!/usr/bin/env python
import math
#from tf.transformations import euler_from_quaternion
from transformations import euler_from_quaternion
from mpl_toolkits.mplot3d import Axes3D # noqa: F401 unused import
import matplotlib.pyplot as plt
class RobotTrajectory:
def __init__(self, plot=True):
self.default_robot_id = 0
self.plot_traj = plot
def parse_trajectory(self, traj_type, parser_type, rx_robot_traj_fn, tx_robot_traj_fn=None, rx_mocap_id=None, tx_mocap_id=None):
parsed_rx_robot_trajectory, parsed_tx_robot_trajectory, parsed_relative_traj = {},{},{}
if parser_type == "optitrack":
parsed_rx_robot_trajectory = self.optitrack_mocap_data_parser(rx_robot_traj_fn, rx_mocap_id)
elif parser_type == "t265":
parsed_rx_robot_trajectory = self.T265_camera_data_parser(rx_robot_traj_fn)
elif parser_type == "vicon":
parsed_rx_robot_trajectory = self.vicon_mocap_ros_data_parser(rx_robot_traj_fn)
# For moving ends, need relative trajectory
if traj_type == "relative":
if parser_type == "optitrack":
parsed_tx_robot_trajectory = self.optitrack_mocap_data_parser(rx_robot_traj_fn, tx_mocap_id) #For optitrack, the trajectory file has traj for all robots
elif parser_type == "t265":
parsed_tx_robot_trajectory = self.T265_camera_data_parser(tx_robot_traj_fn)
elif parser_type == "vicon":
parsed_tx_robot_trajectory = self.vicon_mocap_ros_data_parser(tx_robot_traj_fn)
parsed_relative_traj = self.get_relative_trajectory(parsed_rx_robot_trajectory, parsed_tx_robot_trajectory)
return parsed_relative_traj
return parsed_rx_robot_trajectory
'''
#TODO
Return relative trajectory
'''
def get_relative_trajectory(self, parsed_rx_robot_trajectory, parsed_tx_robot_trajectory):
parse_relative_traj = {}
#How to get relative traj for t265 and vicon, if there is mismatch between number of poses? (match based on timestamp?)
return parse_relative_traj
'''
Read data from the mocap file
Return: dictionary with position (meters) and orientation data (degrees) for a specific robot
'''
def optitrack_mocap_data_parser(self, robot_traj, robot_id):
self.file = open(robot_traj, "r")
parsed_robot_trajectory = {}
pos_x = []
pos_y = []
pos_z = []
time_nan_secs = []
latency = []
pitch = []
yaw = []
m = self.file.readlines()
i = 0
while "Client is connected to server and listening for data..." not in m[i] and i < len(m): i+=1
#print("Found at ",i-1," Parsing now")
len_m = len(m)
#print(len_m)
#ignore the last data packet from the end of file.
for l in range(len(m)-1,0,-1):
#print m[l]
if "header" in m[l]:
len_m = l - 1
#print("new eof", len_m)
break
count=0
while i < len_m-3:
if "nsecs" in m[i]:
latency_val = int(m[i + 2].split(':', 1)[1].strip())
if latency_val < 0:
i = i + 13
continue
val = int(m[i].split(':', 1)[1].strip())
time_nan_secs.append(val)
i = i + 1
elif "orientation" in m[i] and i+4<len(m):
check_id = int(m[i - 5 ].split(':', 1)[1].strip())
if check_id == int(robot_id):
#print "count = ", count
count+=1
ori_x = float(m[i + 1].split(':', 1)[1].strip())
ori_y = float(m[i + 2].split(':', 1)[1].strip())
ori_z = float(m[i + 3].split(':', 1)[1].strip())
ori_w = float(m[i + 4].split(':', 1)[1].strip())
tmp_angle = euler_from_quaternion([ori_x, ori_y, ori_z, ori_w])
pitch.append(tmp_angle[1])
yaw.append(tmp_angle[2])
i = i + 4
else:
i = i + 1
elif "latency" in m[i]:
latency.append(int(m[i].split(':', 1)[1].strip()))
i = i + 1
elif "position" in m[i]:
check_id = int(m[i - 1 ].split(':', 1)[1].strip())
if check_id == int(robot_id):
pos_x.append(float(m[i + 1].split(':', 1)[1].strip()))
pos_y.append(float(m[i + 2].split(':', 1)[1].strip()))
pos_z.append(float(m[i + 3].split(':', 1)[1].strip()))
i= i + 3
else:
i = i + 1
else:
i = i + 1
print("frame count = ", count)
print("len of nsecs = ", len(time_nan_secs))
print("len of latency = ", len(latency))
print("len of orientation = ", len(pitch))
print("len of pos_x = ", len(pos_x))
print("len of pos_y = ", len(pos_y))
min_len = min(len(latency), count)
parsed_robot_trajectory['robot_id'] = robot_id
parsed_robot_trajectory['pose_list']= []
for pose_count in range(min_len):
corrected_time = str(time_nan_secs[pose_count] - latency[pose_count]) #Latency is in nanoseconds as well
pose_data = {
'pose_num' : pose_count,
'x' : pos_x[pose_count],
'y' : pos_y[pose_count],
'z' : pos_z[pose_count],
'pitch' : math.degrees(pitch[pose_count]),
'yaw' : math.degrees(yaw[pose_count]),
'time_sec' : int(corrected_time[:10]),
# 'time_sec' : int(corrected_time)
'time_nsec' : int(corrected_time[10:])
}
parsed_robot_trajectory['pose_list'].append(pose_data)
print("Completed extracting pose information for robot ID: ", robot_id)
#Plot the trajectory
# self.visualize_trajectory(pos_x, pos_y, pos_z)
return parsed_robot_trajectory
'''
Read data from the T265 trajectory file
Return: dictionary with position (meters) obtained from tracking camera. Latency and orientation are not obtained
** Note: The coordinate system is different for the T265 tracking camera compared to the mocap systems. Modify the parser accordingly
** refer : https://github.com/IntelRealSense/librealsense/blob/master/doc/t265.md
** The coordinate axis convention should be :
+x => move forward
+y => move left
+z => move up
'''
def T265_camera_data_parser(self, robot_traj_file):
self.file = open(robot_traj_file, "r")
parsed_robot_trajectory = {}
pos_x = []
pos_y = []
pos_z = []
time_nan_secs = []
latency = []
pitch = []
yaw = []
m = self.file.readlines()
i = 0
#print("Found at ",i-1," Parsing now")
len_m = len(m)
#print(len_m)
#ignore the last data packet from the end of file.
for l in range(len_m-1,0,-1):
#print m[l]
if "header" in m[l]:
len_m = l - 1
#print("new eof", len_m)
break
count=0
while i < len_m-4:
if "nsecs" in m[i]:
val = int(m[i].split(':', 1)[1].strip())
time_nan_secs.append(val)
i = i + 1
elif "orientation" in m[i] and i+4<len_m:
count+=1
ori_x = float(m[i + 1].split(':', 1)[1].strip())
ori_y = float(m[i + 2].split(':', 1)[1].strip())
ori_z = float(m[i + 3].split(':', 1)[1].strip())
ori_w = float(m[i + 4].split(':', 1)[1].strip())
tmp_angle = euler_from_quaternion([ori_x, ori_y, ori_z, ori_w])
pitch.append(tmp_angle[1])
yaw.append(tmp_angle[2])
i = i + 4
elif "position" in m[i]:
#Modified for use with the raw camera api. Align the mocap x,y,z with camera x,y,z
#mocap (x,y,z) = TrackingCamera(z,x,y)
pos_x.append(-float(m[i + 3].split(':', 1)[1].strip()))
pos_y.append(-float(m[i + 1].split(':', 1)[1].strip()))
pos_z.append(float(m[i + 2].split(':', 1)[1].strip()))
i= i + 3
else:
i = i + 1
print("frame count = ", count)
print("len of nsecs = ", len(time_nan_secs))
print("len of pos_x = ", len(pos_x))
print("len of pos_y = ", len(pos_y))
print("len of pos_z = ", len(pos_z))
min_len = min(len(pos_x), count)
parsed_robot_trajectory['robot_id'] = self.default_robot_id
parsed_robot_trajectory['pose_list']= []
for pose_count in range(min_len):
corrected_time = str(time_nan_secs[pose_count])
pose_data = {
'pose_num' : pose_count,
'x' : pos_x[pose_count],
'y' : pos_y[pose_count],
'z' : pos_z[pose_count],
'pitch' : math.degrees(pitch[pose_count]),
'yaw' : math.degrees(yaw[pose_count]),
'time_sec' : int(corrected_time[:10]),
'time_nsec' : int(corrected_time[10:])
}
parsed_robot_trajectory['pose_list'].append(pose_data)
print("Completed extracting pose information for")
#Plot the trajectory
if self.plot_traj:
self.visualize_trajectory(pos_x, pos_y, pos_z)
return parsed_robot_trajectory
'''
parse trajectory obtained from vicon motion capture system
'''
def vicon_mocap_ros_data_parser(self, robot_traj):
self.file = open(robot_traj, "r")
parsed_robot_trajectory = {}
pos_x = []
pos_y = []
pos_z = []
time_secs = []
time_nsecs = []
latency = []
pitch = []
yaw = []
m = self.file.readlines()
i = 0
len_m = len(m)
#print(len_m)
#ignore the last data packet from the end of file.
for l in range(len(m)-1,0,-1):
#print m[l]
if "header" in m[l]:
len_m = l - 1
#print("new eof", len_m)
break
count=0
while i < len_m-3:
if "secs" in m[i] and "nsecs" not in m[i]:
#Latency values are already incorporated in the timestamp of ros message
time_secs.append(int(m[i].split(':', 1)[1].strip()))
print("pp ",m[i+1])
time_nsecs.append(int(m[i+1].split(':', 1)[1].strip()))
i = i + 1
elif "orientation" in m[i] and i+4<len_m:
count+=1
ori_x = float(m[i + 1].split(':', 1)[1].strip())
ori_y = float(m[i + 2].split(':', 1)[1].strip())
ori_z = float(m[i + 3].split(':', 1)[1].strip())
ori_w = float(m[i + 4].split(':', 1)[1].strip())
tmp_angle = euler_from_quaternion([ori_x, ori_y, ori_z, ori_w])
pitch.append(tmp_angle[1])
yaw.append(tmp_angle[2])
i = i + 4
elif "position" in m[i] and "orientation" in m[i+4]:
pos_x.append(float(m[i + 1].split(':', 1)[1].strip()))
pos_y.append(float(m[i + 2].split(':', 1)[1].strip()))
pos_z.append(float(m[i + 3].split(':', 1)[1].strip()))
i= i + 3
else:
i = i + 1
print("frame count = ", count)
print("len of nsecs = ", len(time_nsecs))
print("len of pos_x = ", len(pos_x))
print("len of pos_y = ", len(pos_y))
print("len of pos_z = ", len(pos_z))
print("len of pitch = ", len(pitch))
min_len = min(len(pos_x), count)
parsed_robot_trajectory['robot_id'] = self.default_robot_id
parsed_robot_trajectory['pose_list']= []
for pose_count in range(min_len):
pose_data = {
'pose_num' : pose_count,
'x' : pos_x[pose_count],
'y' : pos_y[pose_count],
'z' : pos_z[pose_count],
'pitch' : math.degrees(pitch[pose_count]),
'yaw' : math.degrees(yaw[pose_count]),
'time_sec' : time_secs[pose_count],
'time_nsec' : time_nsecs[pose_count]
}
parsed_robot_trajectory['pose_list'].append(pose_data)
print("Completed extracting pose information")
#Plot the trajectory
self.visualize_trajectory(pos_x, pos_y, pos_z)
return parsed_robot_trajectory
def visualize_trajectory(self,pos_x, pos_y, pos_z):
fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
ax.scatter(pos_x, pos_y, pos_z, marker='o')
ax.set_xlabel('X Label')
ax.set_ylabel('Y Label')
ax.set_zlabel('Z Label')
plt.show()
'''
Currently not used. TODO Functionality for 2 antennas
'''
def imu_data_parser(self, input_file):
file = open(input_file, "r")
time_secs = []
time_nan_secs = []
latency = []
ang_x = []
ang_y = []
ang_z = []
lin_x = []
lin_y = []
lin_z = []
m = file.readlines()
i = 0
len_m = len(m)
#print(len_m)
#ignore the last data packet from the end of file.
for l in range(len(m)-1,0,-1):
#print m[l]
if "header" in m[l]:
len_m = l - 1
print("new eof", len_m)
break
count=0
print("Collecting data")
while i < len_m-3:
if "stamp" in m[i]:
time_secs.append(int(m[i+1].split(':', 1)[1].strip()))
time_nan_secs.append(int(m[i + 2].split(':', 1)[1].strip()))
i = i + 3
# if time_secs[len(time_secs)-1] - time_secs[len(time_secs)-2] < 0:
# print "line is " , i
# print time_secs[len(time_secs)-1]
# print time_secs[len(time_secs)-2]
#a= input()
elif "angular_velocity" in m[i]:
count+=1
ang_x.append(float(m[i + 1].split(':', 1)[1].strip()))
ang_y.append(float(m[i + 2].split(':', 1)[1].strip()))
ang_z.append(float(m[i + 3].split(':', 1)[1].strip()))
lin_x.append(float(m[i + 6].split(':', 1)[1].strip()))
lin_y.append(float(m[i + 7].split(':', 1)[1].strip()))
lin_z.append(float(m[i + 8].split(':', 1)[1].strip()))
i = i + 9
i = i+1
print("frame count = ", count)
print("len of nsecs = ", len(time_nan_secs))
print("len of imu angluar velocity = ", len(ang_z))
min_len = min(len(time_nan_secs), count)
print("min_len = ", min_len)
yaw = 0.0
#Yaw Angle from Angular velocity
with open(pwd+'/deg_'+fname, 'w') as f:
f.write("%s\n" % str(min_len-1))
for i in range(min_len-1):
time_a = time_secs[i] + time_nan_secs[i]*0.000000001
time_b = time_secs[i+1] + time_nan_secs[i+1]*0.000000001
yaw = yaw + ((ang_z[i] + ang_z[i+1]) * (time_b-time_a)/2)
if yaw > 2*math.pi:
yaw = yaw - 2*math.pi
deg_yaw = yaw * 180/math.pi
if deg_yaw > 180:
deg_yaw = deg_yaw - 360
f.write("%s\n" % str(deg_yaw))
with open(pwd+'/deg_'+fname, 'a') as f:
for i in range(min_len-1):
f.write("%s\n" % str(time_secs[i]))
with open(pwd+'/deg_'+fname, 'a') as f:
for i in range(min_len-1):
f.write("%s\n" % str(time_nan_secs[i]))
print(pwd+'deg_'+fname)
print("Completed")
|
py | b4141a9cd539013f35beb35e1ac9c834d503e37e | #!/usr/bin/env python
#
# Electrum - lightweight Bitcoin client
# Copyright (C) 2012 thomasv@gitorious
#
# Permission is hereby granted, free of charge, to any person
# obtaining a copy of this software and associated documentation files
# (the "Software"), to deal in the Software without restriction,
# including without limitation the rights to use, copy, modify, merge,
# publish, distribute, sublicense, and/or sell copies of the Software,
# and to permit persons to whom the Software is furnished to do so,
# subject to the following conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS
# BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN
# ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
# CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
import sys, time, threading
import os, json, traceback
import shutil
import weakref
import webbrowser
import csv
from decimal import Decimal
import base64
from functools import partial
from PyQt5.QtGui import *
from PyQt5.QtCore import *
import PyQt5.QtCore as QtCore
from .exception_window import Exception_Hook
from PyQt5.QtWidgets import *
from electrum import keystore, simple_config
from electrum.bitcoin import COIN, is_address, TYPE_ADDRESS
from electrum import constants
from electrum.plugins import run_hook
from electrum.i18n import _
from electrum.util import (format_time, format_satoshis, PrintError,
format_satoshis_plain, NotEnoughFunds,
UserCancelled, NoDynamicFeeEstimates, profiler,
export_meta, import_meta, bh2u, bfh, InvalidPassword)
from electrum import Transaction
from electrum import util, bitcoin, commands, coinchooser
from electrum import paymentrequest
from electrum.wallet import Multisig_Wallet, AddTransactionException
from .amountedit import AmountEdit, BTHAmountEdit, MyLineEdit, FeerateEdit
from .qrcodewidget import QRCodeWidget, QRDialog
from .qrtextedit import ShowQRTextEdit, ScanQRTextEdit
from .transaction_dialog import show_transaction
from .fee_slider import FeeSlider
from .util import *
class StatusBarButton(QPushButton):
def __init__(self, icon, tooltip, func):
QPushButton.__init__(self, icon, '')
self.setToolTip(tooltip)
self.setFlat(True)
self.setMaximumWidth(25)
self.clicked.connect(self.onPress)
self.func = func
self.setIconSize(QSize(25,25))
def onPress(self, checked=False):
'''Drops the unwanted PyQt5 "checked" argument'''
self.func()
def keyPressEvent(self, e):
if e.key() == Qt.Key_Return:
self.func()
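# Usage sketch (illustrative): the icon path and the on_click callback are
# hypothetical; a StatusBarButton is meant to be added to the status bar of a
# QMainWindow such as ElectrumWindow, e.g.
#
#     sb = StatusBarButton(QIcon(":icons/network.png"), _("Network"), on_click)
#     self.statusBar().addPermanentWidget(sb)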
from electrum.paymentrequest import PR_PAID
class ElectrumWindow(QMainWindow, MessageBoxMixin, PrintError):
payment_request_ok_signal = pyqtSignal()
payment_request_error_signal = pyqtSignal()
notify_transactions_signal = pyqtSignal()
new_fx_quotes_signal = pyqtSignal()
new_fx_history_signal = pyqtSignal()
network_signal = pyqtSignal(str, object)
alias_received_signal = pyqtSignal()
computing_privkeys_signal = pyqtSignal()
show_privkeys_signal = pyqtSignal()
def __init__(self, gui_object, wallet):
QMainWindow.__init__(self)
self.gui_object = gui_object
self.config = config = gui_object.config
self.setup_exception_hook()
self.network = gui_object.daemon.network
self.fx = gui_object.daemon.fx
self.invoices = wallet.invoices
self.contacts = wallet.contacts
self.tray = gui_object.tray
self.app = gui_object.app
self.cleaned_up = False
self.is_max = False
self.payment_request = None
self.checking_accounts = False
self.qr_window = None
self.not_enough_funds = False
self.pluginsdialog = None
self.require_fee_update = False
self.tx_notifications = []
self.tl_windows = []
self.tx_external_keypairs = {}
self.create_status_bar()
self.need_update = threading.Event()
self.minimize_to_tray = config.get('minimize_tray', True)
self.decimal_point = config.get('decimal_point', 8)
self.num_zeros = int(config.get('num_zeros', 0))
self.completions = QStringListModel()
self.tabs = tabs = QTabWidget(self)
self.send_tab = self.create_send_tab()
self.receive_tab = self.create_receive_tab()
self.addresses_tab = self.create_addresses_tab()
self.utxo_tab = self.create_utxo_tab()
self.console_tab = self.create_console_tab()
self.contacts_tab = self.create_contacts_tab()
tabs.addTab(self.create_history_tab(), QIcon(":icons/tab_history.png"), _('History'))
tabs.addTab(self.send_tab, QIcon(":icons/tab_send.png"), _('Send'))
tabs.addTab(self.receive_tab, QIcon(":icons/tab_receive.png"), _('Receive'))
def add_optional_tab(tabs, tab, icon, description, name):
tab.tab_icon = icon
tab.tab_description = description
tab.tab_pos = len(tabs)
tab.tab_name = name
if self.config.get('show_{}_tab'.format(name), False):
tabs.addTab(tab, icon, description.replace("&", ""))
add_optional_tab(tabs, self.addresses_tab, QIcon(":icons/tab_addresses.png"), _("&Addresses"), "addresses")
add_optional_tab(tabs, self.utxo_tab, QIcon(":icons/tab_coins.png"), _("Co&ins"), "utxo")
add_optional_tab(tabs, self.contacts_tab, QIcon(":icons/tab_contacts.png"), _("Con&tacts"), "contacts")
add_optional_tab(tabs, self.console_tab, QIcon(":icons/tab_console.png"), _("Con&sole"), "console")
tabs.setSizePolicy(QSizePolicy.Expanding, QSizePolicy.Expanding)
self.setCentralWidget(tabs)
if self.config.get("is_maximized"):
self.showMaximized()
self.setWindowIcon(QIcon(":icons/electrumbth.png"))
self.init_menubar()
wrtabs = weakref.proxy(tabs)
QShortcut(QKeySequence("Ctrl+W"), self, self.close)
QShortcut(QKeySequence("Ctrl+Q"), self, self.close)
QShortcut(QKeySequence("Ctrl+R"), self, self.update_wallet)
QShortcut(QKeySequence("Ctrl+PgUp"), self, lambda: wrtabs.setCurrentIndex((wrtabs.currentIndex() - 1)%wrtabs.count()))
QShortcut(QKeySequence("Ctrl+PgDown"), self, lambda: wrtabs.setCurrentIndex((wrtabs.currentIndex() + 1)%wrtabs.count()))
for i in range(wrtabs.count()):
QShortcut(QKeySequence("Alt+" + str(i + 1)), self, lambda i=i: wrtabs.setCurrentIndex(i))
self.payment_request_ok_signal.connect(self.payment_request_ok)
self.payment_request_error_signal.connect(self.payment_request_error)
self.notify_transactions_signal.connect(self.notify_transactions)
self.history_list.setFocus(True)
# network callbacks
if self.network:
self.network_signal.connect(self.on_network_qt)
interests = ['updated', 'new_transaction', 'status',
'banner', 'verified', 'fee']
# To avoid leaking references to "self" that prevent the
# window from being GC-ed when closed, callbacks should be
# methods of this class only, and specifically not be
# partials, lambdas or methods of subobjects. Hence...
self.network.register_callback(self.on_network, interests)
# set initial message
self.console.showMessage(_('Welcome to ElectrumBTH!'))
self.network.register_callback(self.on_quotes, ['on_quotes'])
self.network.register_callback(self.on_history, ['on_history'])
self.new_fx_quotes_signal.connect(self.on_fx_quotes)
self.new_fx_history_signal.connect(self.on_fx_history)
# update fee slider in case we missed the callback
self.fee_slider.update()
self.load_wallet(wallet)
self.connect_slots(gui_object.timer)
self.fetch_alias()
def on_history(self, b):
self.new_fx_history_signal.emit()
def setup_exception_hook(self):
Exception_Hook(self)
def on_fx_history(self):
self.history_list.refresh_headers()
self.history_list.update()
self.address_list.update()
def on_quotes(self, b):
self.new_fx_quotes_signal.emit()
def on_fx_quotes(self):
self.update_status()
# Refresh edits with the new rate
edit = self.fiat_send_e if self.fiat_send_e.is_last_edited else self.amount_e
edit.textEdited.emit(edit.text())
edit = self.fiat_receive_e if self.fiat_receive_e.is_last_edited else self.receive_amount_e
edit.textEdited.emit(edit.text())
# History tab needs updating if it used spot
if self.fx.history_used_spot:
self.history_list.update()
def toggle_tab(self, tab):
show = not self.config.get('show_{}_tab'.format(tab.tab_name), False)
self.config.set_key('show_{}_tab'.format(tab.tab_name), show)
item_text = (_("Hide") if show else _("Show")) + " " + tab.tab_description
tab.menu_action.setText(item_text)
if show:
# Find out where to place the tab
index = len(self.tabs)
for i in range(len(self.tabs)):
try:
if tab.tab_pos < self.tabs.widget(i).tab_pos:
index = i
break
except AttributeError:
pass
self.tabs.insertTab(index, tab, tab.tab_icon, tab.tab_description.replace("&", ""))
else:
i = self.tabs.indexOf(tab)
self.tabs.removeTab(i)
def push_top_level_window(self, window):
'''Used for e.g. tx dialog box to ensure new dialogs are appropriately
parented. This used to be done by explicitly providing the parent
window, but that isn't something hardware wallet prompts know.'''
self.tl_windows.append(window)
def pop_top_level_window(self, window):
self.tl_windows.remove(window)
def top_level_window(self, test_func=None):
'''Do the right thing in the presence of tx dialog windows'''
override = self.tl_windows[-1] if self.tl_windows else None
if override and test_func and not test_func(override):
override = None # only override if ok for test_func
return self.top_level_window_recurse(override, test_func)
def diagnostic_name(self):
return "%s/%s" % (PrintError.diagnostic_name(self),
self.wallet.basename() if self.wallet else "None")
def is_hidden(self):
return self.isMinimized() or self.isHidden()
def show_or_hide(self):
if self.is_hidden():
self.bring_to_top()
else:
self.hide()
def bring_to_top(self):
self.show()
self.raise_()
def on_error(self, exc_info):
if not isinstance(exc_info[1], UserCancelled):
traceback.print_exception(*exc_info)
self.show_error(str(exc_info[1]))
def on_network(self, event, *args):
if event == 'updated':
self.need_update.set()
self.gui_object.network_updated_signal_obj.network_updated_signal \
.emit(event, args)
elif event == 'new_transaction':
self.tx_notifications.append(args[0])
self.notify_transactions_signal.emit()
elif event in ['status', 'banner', 'verified', 'fee']:
# Handle in GUI thread
self.network_signal.emit(event, args)
else:
self.print_error("unexpected network message:", event, args)
def on_network_qt(self, event, args=None):
# Handle a network message in the GUI thread
if event == 'status':
self.update_status()
elif event == 'banner':
pass
elif event == 'verified':
self.history_list.update_item(*args)
elif event == 'fee':
if self.config.is_dynfee():
self.fee_slider.update()
self.do_update_fee()
elif event == 'fee_histogram':
if self.config.is_dynfee():
self.fee_slider.update()
self.do_update_fee()
# todo: update only unconfirmed tx
self.history_list.update()
else:
self.print_error("unexpected network_qt signal:", event, args)
def fetch_alias(self):
self.alias_info = None
alias = self.config.get('alias')
if alias:
alias = str(alias)
def f():
self.alias_info = self.contacts.resolve_openalias(alias)
self.alias_received_signal.emit()
t = threading.Thread(target=f)
t.setDaemon(True)
t.start()
def close_wallet(self):
if self.wallet:
self.print_error('close_wallet', self.wallet.storage.path)
run_hook('close_wallet', self.wallet)
@profiler
def load_wallet(self, wallet):
wallet.thread = TaskThread(self, self.on_error)
self.wallet = wallet
self.update_recently_visited(wallet.storage.path)
# address used to create a dummy transaction and estimate transaction fee
self.history_list.update()
self.address_list.update()
self.utxo_list.update()
self.need_update.set()
# Once GUI has been initialized check if we want to announce something since the callback has been called before the GUI was initialized
self.notify_transactions()
# update menus
self.seed_menu.setEnabled(self.wallet.has_seed())
self.update_lock_icon()
self.update_buttons_on_seed()
self.update_console()
self.clear_receive_tab()
self.request_list.update()
self.tabs.show()
self.init_geometry()
if self.config.get('hide_gui') and self.gui_object.tray.isVisible():
self.hide()
else:
self.show()
self.watching_only_changed()
run_hook('load_wallet', wallet, self)
def init_geometry(self):
winpos = self.wallet.storage.get("winpos-qt")
try:
screen = self.app.desktop().screenGeometry()
assert screen.contains(QRect(*winpos))
self.setGeometry(*winpos)
except:
self.print_error("using default geometry")
self.setGeometry(100, 100, 840, 400)
def watching_only_changed(self):
name = "ElectrumBTH Testnet" if constants.net.TESTNET else "ElectrumBTH"
title = '%s %s - %s' % (name, self.wallet.electrum_version,
self.wallet.basename())
extra = [self.wallet.storage.get('wallet_type', '?')]
if self.wallet.is_watching_only():
self.warn_if_watching_only()
extra.append(_('watching only'))
title += ' [%s]'% ', '.join(extra)
self.setWindowTitle(title)
self.password_menu.setEnabled(self.wallet.may_have_password())
self.import_privkey_menu.setVisible(self.wallet.can_import_privkey())
self.import_address_menu.setVisible(self.wallet.can_import_address())
self.export_menu.setEnabled(self.wallet.can_export())
def warn_if_watching_only(self):
if self.wallet.is_watching_only():
msg = ' '.join([
_("This wallet is watching-only."),
_("This means you will not be able to spend Bithereum with it."),
_("Make sure you own the seed phrase or the private keys, before you request Bithereum to be sent to this wallet.")
])
self.show_warning(msg, title=_('Information'))
def open_wallet(self):
try:
wallet_folder = self.get_wallet_folder()
except FileNotFoundError as e:
self.show_error(str(e))
return
filename, __ = QFileDialog.getOpenFileName(self, "Select your wallet file", wallet_folder)
if not filename:
return
self.gui_object.new_window(filename)
def backup_wallet(self):
path = self.wallet.storage.path
wallet_folder = os.path.dirname(path)
filename, __ = QFileDialog.getSaveFileName(self, _('Enter a filename for the copy of your wallet'), wallet_folder)
if not filename:
return
new_path = os.path.join(wallet_folder, filename)
if new_path != path:
try:
shutil.copy2(path, new_path)
self.show_message(_("A copy of your wallet file was created in")+" '%s'" % str(new_path), title=_("Wallet backup created"))
except BaseException as reason:
self.show_critical(_("ElectrumBTH was unable to copy your wallet file to the specified location.") + "\n" + str(reason), title=_("Unable to create backup"))
def update_recently_visited(self, filename):
recent = self.config.get('recently_open', [])
try:
sorted(recent)
except:
recent = []
if filename in recent:
recent.remove(filename)
recent.insert(0, filename)
recent = recent[:5]
self.config.set_key('recently_open', recent)
self.recently_visited_menu.clear()
for i, k in enumerate(sorted(recent)):
b = os.path.basename(k)
def loader(k):
return lambda: self.gui_object.new_window(k)
self.recently_visited_menu.addAction(b, loader(k)).setShortcut(QKeySequence("Ctrl+%d"%(i+1)))
self.recently_visited_menu.setEnabled(len(recent))
def get_wallet_folder(self):
return os.path.dirname(os.path.abspath(self.config.get_wallet_path()))
def new_wallet(self):
try:
wallet_folder = self.get_wallet_folder()
except FileNotFoundError as e:
self.show_error(str(e))
return
i = 1
while True:
filename = "wallet_%d" % i
if filename in os.listdir(wallet_folder):
i += 1
else:
break
full_path = os.path.join(wallet_folder, filename)
self.gui_object.start_new_window(full_path, None)
def init_menubar(self):
menubar = QMenuBar()
file_menu = menubar.addMenu(_("&File"))
self.recently_visited_menu = file_menu.addMenu(_("&Recently open"))
file_menu.addAction(_("&Open"), self.open_wallet).setShortcut(QKeySequence.Open)
file_menu.addAction(_("&New/Restore"), self.new_wallet).setShortcut(QKeySequence.New)
file_menu.addAction(_("&Save Copy"), self.backup_wallet).setShortcut(QKeySequence.SaveAs)
file_menu.addAction(_("Delete"), self.remove_wallet)
file_menu.addSeparator()
file_menu.addAction(_("&Quit"), self.close)
wallet_menu = menubar.addMenu(_("&Wallet"))
wallet_menu.addAction(_("&Information"), self.show_master_public_keys)
wallet_menu.addSeparator()
self.password_menu = wallet_menu.addAction(_("&Password"), self.change_password_dialog)
self.seed_menu = wallet_menu.addAction(_("&Seed"), self.show_seed_dialog)
self.private_keys_menu = wallet_menu.addMenu(_("&Private keys"))
self.private_keys_menu.addAction(_("&Sweep"), self.sweep_key_dialog)
self.import_privkey_menu = self.private_keys_menu.addAction(_("&Import"), self.do_import_privkey)
self.export_menu = self.private_keys_menu.addAction(_("&Export"), self.export_privkeys_dialog)
self.import_address_menu = wallet_menu.addAction(_("Import addresses"), self.import_addresses)
wallet_menu.addSeparator()
addresses_menu = wallet_menu.addMenu(_("&Addresses"))
addresses_menu.addAction(_("&Filter"), lambda: self.address_list.toggle_toolbar(self.config))
labels_menu = wallet_menu.addMenu(_("&Labels"))
labels_menu.addAction(_("&Import"), self.do_import_labels)
labels_menu.addAction(_("&Export"), self.do_export_labels)
history_menu = wallet_menu.addMenu(_("&History"))
history_menu.addAction(_("&Filter"), lambda: self.history_list.toggle_toolbar(self.config))
history_menu.addAction(_("&Summary"), self.history_list.show_summary)
history_menu.addAction(_("&Plot"), self.history_list.plot_history_dialog)
history_menu.addAction(_("&Export"), self.history_list.export_history_dialog)
contacts_menu = wallet_menu.addMenu(_("Contacts"))
contacts_menu.addAction(_("&New"), self.new_contact_dialog)
contacts_menu.addAction(_("Import"), lambda: self.contact_list.import_contacts())
contacts_menu.addAction(_("Export"), lambda: self.contact_list.export_contacts())
invoices_menu = wallet_menu.addMenu(_("Invoices"))
invoices_menu.addAction(_("Import"), lambda: self.invoice_list.import_invoices())
invoices_menu.addAction(_("Export"), lambda: self.invoice_list.export_invoices())
wallet_menu.addSeparator()
wallet_menu.addAction(_("Find"), self.toggle_search).setShortcut(QKeySequence("Ctrl+F"))
def add_toggle_action(view_menu, tab):
is_shown = self.config.get('show_{}_tab'.format(tab.tab_name), False)
item_name = (_("Hide") if is_shown else _("Show")) + " " + tab.tab_description
tab.menu_action = view_menu.addAction(item_name, lambda: self.toggle_tab(tab))
view_menu = menubar.addMenu(_("&View"))
add_toggle_action(view_menu, self.addresses_tab)
add_toggle_action(view_menu, self.utxo_tab)
add_toggle_action(view_menu, self.contacts_tab)
add_toggle_action(view_menu, self.console_tab)
tools_menu = menubar.addMenu(_("&Tools"))
# Settings / Preferences are reserved keywords in macOS; use this naming as a workaround
tools_menu.addAction(_("ElectrumBTH preferences") if sys.platform == 'darwin' else _("Preferences"), self.settings_dialog)
tools_menu.addAction(_("&Network"), lambda: self.gui_object.show_network_dialog(self))
tools_menu.addAction(_("&Plugins"), self.plugins_dialog)
tools_menu.addSeparator()
tools_menu.addAction(_("&Sign/verify message"), self.sign_verify_message)
tools_menu.addAction(_("&Encrypt/decrypt message"), self.encrypt_message)
tools_menu.addSeparator()
paytomany_menu = tools_menu.addAction(_("&Pay to many"), self.paytomany)
raw_transaction_menu = tools_menu.addMenu(_("&Load transaction"))
raw_transaction_menu.addAction(_("&From file"), self.do_process_from_file)
raw_transaction_menu.addAction(_("&From text"), self.do_process_from_text)
raw_transaction_menu.addAction(_("&From the blockchain"), self.do_process_from_txid)
raw_transaction_menu.addAction(_("&From QR code"), self.read_tx_from_qrcode)
self.raw_transaction_menu = raw_transaction_menu
run_hook('init_menubar_tools', self, tools_menu)
help_menu = menubar.addMenu(_("&Help"))
help_menu.addAction(_("&About"), self.show_about)
help_menu.addAction(_("&Official website"), lambda: webbrowser.open("https://bithereum.network"))
help_menu.addSeparator()
help_menu.addAction(_("&Documentation"), lambda: webbrowser.open("http://docs.electrum.org/")).setShortcut(QKeySequence.HelpContents)
help_menu.addAction(_("&Report Bug"), self.show_report_bug)
help_menu.addSeparator()
help_menu.addAction(_("&Donate to server"), self.donate_to_server)
self.setMenuBar(menubar)
def donate_to_server(self):
d = self.network.get_donation_address()
if d:
host = self.network.get_parameters()[0]
self.pay_to_URI('bithereumnetwork:%s?message=donation for %s'%(d, host))
else:
self.show_error(_('No donation address for this server'))
def show_about(self):
QMessageBox.about(self, "ElectrumBTH",
_("Version") + " %s" % (self.wallet.electrum_version) + "\n\n" +
_("ElectrumBTH's focus is speed, with low resource usage and simplifying Bithereum. You do not need to perform regular backups, because your wallet can be recovered from a secret phrase that you can memorize or write on paper. Startup times are instant because it operates in conjunction with high-performance servers that handle the most complicated parts of the Bithereum system.") + "\n\n" +
_("Uses icons from the Icons8 icon pack (icons8.com)."))
def show_report_bug(self):
msg = ' '.join([
_("Please report any bugs as issues on github:<br/>"),
"<a href=\"{0}\">{0}</a><br/><br/>".format(constants.GIT_ISSUE_URL),
_("Before reporting a bug, upgrade to the most recent version of ElectrumBTH (latest release or git HEAD), and include the version number in your report."),
_("Try to explain not only what the bug is, but how it occurs.")
])
self.show_message(msg, title="ElectrumBTH - " + _("Reporting Bugs"))
def notify_transactions(self):
if not self.network or not self.network.is_connected():
return
self.print_error("Notifying GUI")
if len(self.tx_notifications) > 0:
# Combine the transactions if there are at least three
num_txns = len(self.tx_notifications)
if num_txns >= 3:
total_amount = 0
for tx in self.tx_notifications:
is_relevant, is_mine, v, fee = self.wallet.get_wallet_delta(tx)
if v > 0:
total_amount += v
self.notify(_("{} new transactions received: Total amount received in the new transactions {}")
.format(num_txns, self.format_amount_and_units(total_amount)))
self.tx_notifications = []
else:
# iterate over a snapshot: the list is mutated inside the loop
for tx in list(self.tx_notifications):
if tx:
self.tx_notifications.remove(tx)
is_relevant, is_mine, v, fee = self.wallet.get_wallet_delta(tx)
if v > 0:
self.notify(_("New transaction received: {}").format(self.format_amount_and_units(v)))
def notify(self, message):
if self.tray:
try:
# this requires Qt 5.9
self.tray.showMessage("ElectrumBTH", message, QIcon(":icons/electrumbth_dark_icon"), 20000)
except TypeError:
self.tray.showMessage("ElectrumBTH", message, QSystemTrayIcon.Information, 20000)
# Custom wrappers for getOpenFileName and getSaveFileName that remember the path selected by the user
def getOpenFileName(self, title, filter = ""):
directory = self.config.get('io_dir', os.path.expanduser('~'))
fileName, __ = QFileDialog.getOpenFileName(self, title, directory, filter)
if fileName and directory != os.path.dirname(fileName):
self.config.set_key('io_dir', os.path.dirname(fileName), True)
return fileName
def getSaveFileName(self, title, filename, filter = ""):
directory = self.config.get('io_dir', os.path.expanduser('~'))
path = os.path.join( directory, filename )
fileName, __ = QFileDialog.getSaveFileName(self, title, path, filter)
if fileName and directory != os.path.dirname(fileName):
self.config.set_key('io_dir', os.path.dirname(fileName), True)
return fileName
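# Usage sketch for the wrappers above (the filter string is hypothetical): a call like
#   fn = self.getSaveFileName(_("Save as"), "export.csv", "*.csv")
# returns an empty string when the user cancels; when a file is chosen, the containing
# directory is persisted under the 'io_dir' config key so the next open/save dialog starts there.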
def connect_slots(self, sender):
sender.timer_signal.connect(self.timer_actions)
def timer_actions(self):
# Note this runs in the GUI thread
if self.need_update.is_set():
self.need_update.clear()
self.update_wallet()
# resolve aliases
# FIXME this is a blocking network call that has a timeout of 5 sec
self.payto_e.resolve()
# update fee
if self.require_fee_update:
self.do_update_fee()
self.require_fee_update = False
def format_amount(self, x, is_diff=False, whitespaces=False):
return format_satoshis(x, is_diff, self.num_zeros, self.decimal_point, whitespaces)
def format_amount_and_units(self, amount):
text = self.format_amount(amount) + ' '+ self.base_unit()
x = self.fx.format_amount_and_units(amount) if self.fx else None
if text and x:
text += ' (%s)'%x
return text
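# Illustrative output (values are placeholders): format_amount_and_units(123450) might
# return '1.2345 mBTH (0.53 USD)' when a fiat rate source is enabled via self.fx,
# or just '1.2345 mBTH' otherwise.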
def format_fee_rate(self, fee_rate):
return format_satoshis(fee_rate/1000, False, self.num_zeros, 0, False) + ' sat/byte'
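# fee_rate appears to be expressed in satoshis per kilobyte here; dividing by 1000 and
# formatting with zero decimal places yields a per-byte figure, e.g. (hypothetical)
# fee_rate=5000 -> '5 sat/byte'.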
def get_decimal_point(self):
return self.decimal_point
def base_unit(self):
assert self.decimal_point in [2, 5, 8]
if self.decimal_point == 2:
return 'bits'
if self.decimal_point == 5:
return 'mBTH'
if self.decimal_point == 8:
return 'BTH'
raise Exception('Unknown base unit')
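# Unit mapping implied by the branches above: decimal_point is the number of decimal
# places shown, so 8 -> 'BTH' (1 BTH = 100,000,000 satoshis), 5 -> 'mBTH'
# (1 BTH = 1,000 mBTH) and 2 -> 'bits' (1 BTH = 1,000,000 bits). For example, with
# decimal_point == 5 an amount of 123456 satoshis is rendered as 1.23456 mBTH.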
def connect_fields(self, window, btc_e, fiat_e, fee_e):
def edit_changed(edit):
if edit.follows:
return
edit.setStyleSheet(ColorScheme.DEFAULT.as_stylesheet())
fiat_e.is_last_edited = (edit == fiat_e)
amount = edit.get_amount()
rate = self.fx.exchange_rate() if self.fx else Decimal('NaN')
if rate.is_nan() or amount is None:
if edit is fiat_e:
btc_e.setText("")
if fee_e:
fee_e.setText("")
else:
fiat_e.setText("")
else:
if edit is fiat_e:
btc_e.follows = True
btc_e.setAmount(int(amount / Decimal(rate) * COIN))
btc_e.setStyleSheet(ColorScheme.BLUE.as_stylesheet())
btc_e.follows = False
if fee_e:
window.update_fee()
else:
fiat_e.follows = True
fiat_e.setText(self.fx.ccy_amount_str(
amount * Decimal(rate) / COIN, False))
fiat_e.setStyleSheet(ColorScheme.BLUE.as_stylesheet())
fiat_e.follows = False
btc_e.follows = False
fiat_e.follows = False
fiat_e.textChanged.connect(partial(edit_changed, fiat_e))
btc_e.textChanged.connect(partial(edit_changed, btc_e))
fiat_e.is_last_edited = False
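# Behavioural sketch of connect_fields (as read from the closure above): whichever
# amount field the user edits becomes the master; the other field is recomputed from
# self.fx.exchange_rate() and tinted blue to mark it as auto-filled, while the
# 'follows' flag suppresses the recursive textChanged round-trip between the two edits.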
def update_status(self):
if not self.wallet:
return
if self.network is None or not self.network.is_running():
text = _("Offline")
icon = QIcon(":icons/status_disconnected.png")
elif self.network.is_bootstrapping:
text = _("Bootstrapping - Do not interrupt. Please stand by...")
icon = QIcon(":icons/status_waiting.png")
elif self.network.is_connected():
server_height = self.network.get_server_height()
server_lag = self.network.get_local_height() - server_height
# Server height can be 0 after switching to a new server
# until we get a headers subscription request response.
# Display the synchronizing message in that case.
if not self.wallet.up_to_date or server_height == 0 or server_lag < 0:
text = _("Synchronizing...") + ' ({}/{})'.format(self.network.get_local_height(), server_height)
icon = QIcon(":icons/status_waiting.png")
elif server_lag > 1:
text = _("Server is lagging ({} blocks)").format(server_lag)
icon = QIcon(":icons/status_lagging.png")
else:
c, u, x = self.wallet.get_balance()
text = _("Balance" ) + ": %s "%(self.format_amount_and_units(c))
if u:
text += " [%s unconfirmed]"%(self.format_amount(u, True).strip())
if x:
text += " [%s unmatured]"%(self.format_amount(x, True).strip())
# append fiat balance and price
if self.fx.is_enabled():
text += self.fx.get_fiat_status_text(c + u + x,
self.base_unit(), self.get_decimal_point()) or ''
if not self.network.proxy:
icon = QIcon(":icons/status_connected.png")
else:
icon = QIcon(":icons/status_connected_proxy.png")
else:
text = _("Not connected")
icon = QIcon(":icons/status_disconnected.png")
self.tray.setToolTip("%s (%s)" % (text, self.wallet.basename()))
self.balance_label.setText(text)
self.status_button.setIcon( icon )
def update_wallet(self):
self.update_status()
if self.wallet.up_to_date or not self.network or not self.network.is_connected():
self.update_tabs()
def update_tabs(self):
self.history_list.update()
self.request_list.update()
self.address_list.update()
self.utxo_list.update()
self.contact_list.update()
self.invoice_list.update()
self.update_completions()
def create_history_tab(self):
from .history_list import HistoryList
self.history_list = l = HistoryList(self)
l.searchable_list = l
toolbar = l.create_toolbar(self.config)
toolbar_shown = self.config.get('show_toolbar_history', False)
l.show_toolbar(toolbar_shown)
return self.create_list_tab(l, toolbar)
def show_address(self, addr):
from . import address_dialog
d = address_dialog.AddressDialog(self, addr)
d.exec_()
def show_transaction(self, tx, tx_desc = None):
'''tx_desc is set only for txs created in the Send tab'''
show_transaction(tx, self, tx_desc)
def create_receive_tab(self):
# A 4-column grid layout. All the stretch is in the last column.
# The exchange rate plugin adds a fiat widget in column 2
self.receive_grid = grid = QGridLayout()
grid.setSpacing(8)
grid.setColumnStretch(3, 1)
self.receive_address_e = ButtonsLineEdit()
self.receive_address_e.addCopyButton(self.app)
self.receive_address_e.setReadOnly(True)
msg = _('Bithereum address where the payment should be received. Note that each payment request uses a different Bithereum address.')
self.receive_address_label = HelpLabel(_('Receiving address'), msg)
self.receive_address_e.textChanged.connect(self.update_receive_qr)
self.receive_address_e.setFocusPolicy(Qt.ClickFocus)
grid.addWidget(self.receive_address_label, 0, 0)
grid.addWidget(self.receive_address_e, 0, 1, 1, -1)
self.receive_message_e = QLineEdit()
grid.addWidget(QLabel(_('Description')), 1, 0)
grid.addWidget(self.receive_message_e, 1, 1, 1, -1)
self.receive_message_e.textChanged.connect(self.update_receive_qr)
self.receive_amount_e = BTHAmountEdit(self.get_decimal_point)
grid.addWidget(QLabel(_('Requested amount')), 2, 0)
grid.addWidget(self.receive_amount_e, 2, 1)
self.receive_amount_e.textChanged.connect(self.update_receive_qr)
self.fiat_receive_e = AmountEdit(self.fx.get_currency if self.fx else '')
if not self.fx or not self.fx.is_enabled():
self.fiat_receive_e.setVisible(False)
grid.addWidget(self.fiat_receive_e, 2, 2, Qt.AlignLeft)
self.connect_fields(self, self.receive_amount_e, self.fiat_receive_e, None)
self.expires_combo = QComboBox()
self.expires_combo.addItems([i[0] for i in expiration_values])
self.expires_combo.setCurrentIndex(3)
self.expires_combo.setFixedWidth(self.receive_amount_e.width())
msg = ' '.join([
_('Expiration date of your request.'),
_('This information is seen by the recipient if you send them a signed payment request.'),
_('Expired requests have to be deleted manually from your list, in order to free the corresponding Bithereum addresses.'),
_('The Bithereum address never expires and will always be part of this ElectrumBTH wallet.'),
])
grid.addWidget(HelpLabel(_('Request expires'), msg), 3, 0)
grid.addWidget(self.expires_combo, 3, 1)
self.expires_label = QLineEdit('')
self.expires_label.setReadOnly(True)
self.expires_label.setFocusPolicy(Qt.NoFocus)
self.expires_label.hide()
grid.addWidget(self.expires_label, 3, 1)
self.save_request_button = QPushButton(_('Save'))
self.save_request_button.clicked.connect(self.save_payment_request)
self.new_request_button = QPushButton(_('New'))
self.new_request_button.clicked.connect(self.new_payment_request)
self.receive_qr = QRCodeWidget(fixedSize=200)
self.receive_qr.mouseReleaseEvent = lambda x: self.toggle_qr_window()
self.receive_qr.enterEvent = lambda x: self.app.setOverrideCursor(QCursor(Qt.PointingHandCursor))
self.receive_qr.leaveEvent = lambda x: self.app.setOverrideCursor(QCursor(Qt.ArrowCursor))
self.receive_buttons = buttons = QHBoxLayout()
buttons.addStretch(1)
buttons.addWidget(self.save_request_button)
buttons.addWidget(self.new_request_button)
grid.addLayout(buttons, 4, 1, 1, 2)
self.receive_requests_label = QLabel(_('Requests'))
from .request_list import RequestList
self.request_list = RequestList(self)
# layout
vbox_g = QVBoxLayout()
vbox_g.addLayout(grid)
vbox_g.addStretch()
hbox = QHBoxLayout()
hbox.addLayout(vbox_g)
hbox.addWidget(self.receive_qr)
w = QWidget()
w.searchable_list = self.request_list
vbox = QVBoxLayout(w)
vbox.addLayout(hbox)
vbox.addStretch(1)
vbox.addWidget(self.receive_requests_label)
vbox.addWidget(self.request_list)
vbox.setStretchFactor(self.request_list, 1000)
return w
def delete_payment_request(self, addr):
self.wallet.remove_payment_request(addr, self.config)
self.request_list.update()
self.clear_receive_tab()
def get_request_URI(self, addr):
req = self.wallet.receive_requests[addr]
message = self.wallet.labels.get(addr, '')
amount = req['amount']
URI = util.create_URI(addr, amount, message)
if req.get('time'):
URI += "&time=%d"%req.get('time')
if req.get('exp'):
URI += "&exp=%d"%req.get('exp')
if req.get('name') and req.get('sig'):
sig = bfh(req.get('sig'))
sig = bitcoin.base_encode(sig, base=58)
URI += "&name=" + req['name'] + "&sig="+sig
return str(URI)
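# Shape of the generated URI, assuming util.create_URI uses the same 'bithereumnetwork:'
# scheme seen in pay_to_URI/read_tx_from_qrcode (all parameter values below are placeholders):
#   bithereumnetwork:<address>?amount=0.001&message=donation&time=1530000000&exp=3600&name=merchant&sig=<base58 signature>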
def sign_payment_request(self, addr):
alias = self.config.get('alias')
alias_privkey = None
if alias and self.alias_info:
alias_addr, alias_name, validated = self.alias_info
if alias_addr:
if self.wallet.is_mine(alias_addr):
msg = _('This payment request will be signed.') + '\n' + _('Please enter your password')
password = None
if self.wallet.has_keystore_encryption():
password = self.password_dialog(msg)
if not password:
return
try:
self.wallet.sign_payment_request(addr, alias, alias_addr, password)
except Exception as e:
self.show_error(str(e))
return
else:
return
def save_payment_request(self):
addr = str(self.receive_address_e.text())
amount = self.receive_amount_e.get_amount()
message = self.receive_message_e.text()
if not message and not amount:
self.show_error(_('No message or amount'))
return False
i = self.expires_combo.currentIndex()
expiration = list(map(lambda x: x[1], expiration_values))[i]
req = self.wallet.make_payment_request(addr, amount, message, expiration)
try:
self.wallet.add_payment_request(req, self.config)
except Exception as e:
traceback.print_exc(file=sys.stderr)
self.show_error(_('Error adding payment request') + ':\n' + str(e))
else:
self.sign_payment_request(addr)
self.save_request_button.setEnabled(False)
finally:
self.request_list.update()
self.address_list.update()
def view_and_paste(self, title, msg, data):
dialog = WindowModalDialog(self, title)
vbox = QVBoxLayout()
label = QLabel(msg)
label.setWordWrap(True)
vbox.addWidget(label)
pr_e = ShowQRTextEdit(text=data)
vbox.addWidget(pr_e)
vbox.addLayout(Buttons(CopyCloseButton(pr_e.text, self.app, dialog)))
dialog.setLayout(vbox)
dialog.exec_()
def export_payment_request(self, addr):
r = self.wallet.receive_requests.get(addr)
pr = paymentrequest.serialize_request(r).SerializeToString()
name = r['id'] + '.bip70'
fileName = self.getSaveFileName(_("Select where to save your payment request"), name, "*.bip70")
if fileName:
with open(fileName, "wb+") as f:
f.write(util.to_bytes(pr))
self.show_message(_("Request saved successfully"))
self.saved = True
def new_payment_request(self):
addr = self.wallet.get_unused_address()
if addr is None:
if not self.wallet.is_deterministic():
msg = [
_('No more addresses in your wallet.'),
_('You are using a non-deterministic wallet, which cannot create new addresses.'),
_('If you want to create new addresses, use a deterministic wallet instead.')
]
self.show_message(' '.join(msg))
return
if not self.question(_("Warning: The next address will not be recovered automatically if you restore your wallet from seed; you may need to add it manually.\n\nThis occurs because you have too many unused addresses in your wallet. To avoid this situation, use the existing addresses first.\n\nCreate anyway?")):
return
addr = self.wallet.create_new_address(False)
self.set_receive_address(addr)
self.expires_label.hide()
self.expires_combo.show()
self.new_request_button.setEnabled(False)
self.receive_message_e.setFocus(1)
def set_receive_address(self, addr):
self.receive_address_e.setText(addr)
self.receive_message_e.setText('')
self.receive_amount_e.setAmount(None)
def clear_receive_tab(self):
addr = self.wallet.get_receiving_address() or ''
self.receive_address_e.setText(addr)
self.receive_message_e.setText('')
self.receive_amount_e.setAmount(None)
self.expires_label.hide()
self.expires_combo.show()
def toggle_qr_window(self):
from . import qrwindow
if not self.qr_window:
self.qr_window = qrwindow.QR_Window(self)
self.qr_window.setVisible(True)
self.qr_window_geometry = self.qr_window.geometry()
else:
if not self.qr_window.isVisible():
self.qr_window.setVisible(True)
self.qr_window.setGeometry(self.qr_window_geometry)
else:
self.qr_window_geometry = self.qr_window.geometry()
self.qr_window.setVisible(False)
self.update_receive_qr()
def show_send_tab(self):
self.tabs.setCurrentIndex(self.tabs.indexOf(self.send_tab))
def show_receive_tab(self):
self.tabs.setCurrentIndex(self.tabs.indexOf(self.receive_tab))
def receive_at(self, addr):
if not bitcoin.is_address(addr):
return
self.show_receive_tab()
self.receive_address_e.setText(addr)
self.new_request_button.setEnabled(True)
def update_receive_qr(self):
addr = str(self.receive_address_e.text())
amount = self.receive_amount_e.get_amount()
message = self.receive_message_e.text()
self.save_request_button.setEnabled((amount is not None) or (message != ""))
uri = util.create_URI(addr, amount, message)
self.receive_qr.setData(uri)
if self.qr_window and self.qr_window.isVisible():
self.qr_window.set_content(addr, amount, message, uri)
def set_feerounding_text(self, num_satoshis_added):
self.feerounding_text = (_('Additional {} satoshis are going to be added.')
.format(num_satoshis_added))
def create_send_tab(self):
# A 4-column grid layout. All the stretch is in the last column.
# The exchange rate plugin adds a fiat widget in column 2
self.send_grid = grid = QGridLayout()
grid.setSpacing(8)
grid.setColumnStretch(3, 1)
from .paytoedit import PayToEdit
self.amount_e = BTHAmountEdit(self.get_decimal_point)
self.payto_e = PayToEdit(self)
msg = _('Recipient of the funds.') + '\n\n'\
+ _('You may enter a Bithereum address, a label from your list of contacts (a list of completions will be proposed), or an alias (email-like address that forwards to a Bithereum address)')
payto_label = HelpLabel(_('Pay to'), msg)
grid.addWidget(payto_label, 1, 0)
grid.addWidget(self.payto_e, 1, 1, 1, -1)
completer = QCompleter()
completer.setCaseSensitivity(False)
self.payto_e.set_completer(completer)
completer.setModel(self.completions)
msg = _('Description of the transaction (not mandatory).') + '\n\n'\
+ _('The description is not sent to the recipient of the funds. It is stored in your wallet file, and displayed in the \'History\' tab.')
description_label = HelpLabel(_('Description'), msg)
grid.addWidget(description_label, 2, 0)
self.message_e = MyLineEdit()
grid.addWidget(self.message_e, 2, 1, 1, -1)
self.from_label = QLabel(_('From'))
grid.addWidget(self.from_label, 3, 0)
self.from_list = MyTreeWidget(self, self.from_list_menu, ['',''])
self.from_list.setHeaderHidden(True)
self.from_list.setMaximumHeight(80)
grid.addWidget(self.from_list, 3, 1, 1, -1)
self.set_pay_from([])
msg = _('Amount to be sent.') + '\n\n' \
+ _('The amount will be displayed in red if you do not have enough funds in your wallet.') + ' ' \
+ _('Note that if you have frozen some of your addresses, the available funds will be lower than your total balance.') + '\n\n' \
+ _('Keyboard shortcut: type "!" to send all your coins.')
amount_label = HelpLabel(_('Amount'), msg)
grid.addWidget(amount_label, 4, 0)
grid.addWidget(self.amount_e, 4, 1)
self.fiat_send_e = AmountEdit(self.fx.get_currency if self.fx else '')
if not self.fx or not self.fx.is_enabled():
self.fiat_send_e.setVisible(False)
grid.addWidget(self.fiat_send_e, 4, 2)
self.amount_e.frozen.connect(
lambda: self.fiat_send_e.setFrozen(self.amount_e.isReadOnly()))
self.max_button = EnterButton(_("Max"), self.spend_max)
self.max_button.setFixedWidth(140)
grid.addWidget(self.max_button, 4, 3)
hbox = QHBoxLayout()
hbox.addStretch(1)
grid.addLayout(hbox, 4, 4)
msg = _('Bithereum transactions are in general not free. A transaction fee is paid by the sender of the funds.') + '\n\n'\
+ _('The amount of fee can be decided freely by the sender. However, transactions with low fees take more time to be processed.') + '\n\n'\
+ _('A suggested fee is automatically added to this field. You may override it. The suggested fee increases with the size of the transaction.')
self.fee_e_label = HelpLabel(_('Fee'), msg)
def fee_cb(dyn, pos, fee_rate):
if dyn:
if self.config.use_mempool_fees():
self.config.set_key('depth_level', pos, False)
else:
self.config.set_key('fee_level', pos, False)
else:
self.config.set_key('fee_per_kb', fee_rate, False)
if fee_rate:
self.feerate_e.setAmount(fee_rate // 1000)
else:
self.feerate_e.setAmount(None)
self.fee_e.setModified(False)
self.fee_slider.activate()
self.spend_max() if self.is_max else self.update_fee()
self.fee_slider = FeeSlider(self, self.config, fee_cb)
self.fee_slider.setFixedWidth(140)
def on_fee_or_feerate(edit_changed, editing_finished):
edit_other = self.feerate_e if edit_changed == self.fee_e else self.fee_e
if editing_finished:
if not edit_changed.get_amount():
# This is so that when the user blanks the fee and moves on,
# we go back to auto-calculate mode and put a fee back.
edit_changed.setModified(False)
else:
# edit_changed was edited just now, so make sure we will
# freeze the correct fee setting (this)
edit_other.setModified(False)
self.fee_slider.deactivate()
self.update_fee()
class TxSizeLabel(QLabel):
def setAmount(self, byte_size):
self.setText(('x %s bytes =' % byte_size) if byte_size else '')
self.size_e = TxSizeLabel()
self.size_e.setAlignment(Qt.AlignCenter)
self.size_e.setAmount(0)
self.size_e.setFixedWidth(140)
self.size_e.setStyleSheet(ColorScheme.DEFAULT.as_stylesheet())
self.feerate_e = FeerateEdit(lambda: 0)
self.feerate_e.setAmount(self.config.fee_per_byte())
self.feerate_e.textEdited.connect(partial(on_fee_or_feerate, self.feerate_e, False))
self.feerate_e.editingFinished.connect(partial(on_fee_or_feerate, self.feerate_e, True))
self.fee_e = BTHAmountEdit(self.get_decimal_point)
self.fee_e.textEdited.connect(partial(on_fee_or_feerate, self.fee_e, False))
self.fee_e.editingFinished.connect(partial(on_fee_or_feerate, self.fee_e, True))
def feerounding_onclick():
text = (self.feerounding_text + '\n\n' +
_('To somewhat protect your privacy, ElectrumBTH tries to create change with similar precision to other outputs.') + ' ' +
_('At most 100 satoshis might be lost due to this rounding.') + ' ' +
_("You can disable this setting in '{}'.").format(_('Preferences')) + '\n' +
_('Also, dust is not kept as change, but added to the fee.'))
QMessageBox.information(self, 'Fee rounding', text)
self.feerounding_icon = QPushButton(QIcon(':icons/info.png'), '')
self.feerounding_icon.setFixedWidth(20)
self.feerounding_icon.setFlat(True)
self.feerounding_icon.clicked.connect(feerounding_onclick)
self.feerounding_icon.setVisible(False)
self.connect_fields(self, self.amount_e, self.fiat_send_e, self.fee_e)
vbox_feelabel = QVBoxLayout()
vbox_feelabel.addWidget(self.fee_e_label)
vbox_feelabel.addStretch(1)
grid.addLayout(vbox_feelabel, 5, 0)
self.fee_adv_controls = QWidget()
hbox = QHBoxLayout(self.fee_adv_controls)
hbox.setContentsMargins(0, 0, 0, 0)
hbox.addWidget(self.feerate_e)
hbox.addWidget(self.size_e)
hbox.addWidget(self.fee_e)
hbox.addWidget(self.feerounding_icon, Qt.AlignLeft)
hbox.addStretch(1)
vbox_feecontrol = QVBoxLayout()
vbox_feecontrol.addWidget(self.fee_adv_controls)
vbox_feecontrol.addWidget(self.fee_slider)
grid.addLayout(vbox_feecontrol, 5, 1, 1, -1)
if not self.config.get('show_fee', False):
self.fee_adv_controls.setVisible(False)
self.preview_button = EnterButton(_("Preview"), self.do_preview)
self.preview_button.setToolTip(_('Display the details of your transaction before signing it.'))
self.send_button = EnterButton(_("Send"), self.do_send)
self.clear_button = EnterButton(_("Clear"), self.do_clear)
buttons = QHBoxLayout()
buttons.addStretch(1)
buttons.addWidget(self.clear_button)
buttons.addWidget(self.preview_button)
buttons.addWidget(self.send_button)
grid.addLayout(buttons, 6, 1, 1, 3)
self.amount_e.shortcut.connect(self.spend_max)
self.payto_e.textChanged.connect(self.update_fee)
self.amount_e.textEdited.connect(self.update_fee)
def reset_max(t):
self.is_max = False
self.max_button.setEnabled(not bool(t))
self.amount_e.textEdited.connect(reset_max)
self.fiat_send_e.textEdited.connect(reset_max)
def entry_changed():
text = ""
amt_color = ColorScheme.DEFAULT
fee_color = ColorScheme.DEFAULT
feerate_color = ColorScheme.DEFAULT
if self.not_enough_funds:
amt_color, fee_color = ColorScheme.RED, ColorScheme.RED
feerate_color = ColorScheme.RED
text = _( "Not enough funds" )
c, u, x = self.wallet.get_frozen_balance()
if c+u+x:
text += ' (' + self.format_amount(c+u+x).strip() + ' ' + self.base_unit() + ' ' +_("are frozen") + ')'
# blue color denotes auto-filled values
elif self.fee_e.isModified():
feerate_color = ColorScheme.BLUE
elif self.feerate_e.isModified():
fee_color = ColorScheme.BLUE
elif self.amount_e.isModified():
fee_color = ColorScheme.BLUE
feerate_color = ColorScheme.BLUE
else:
amt_color = ColorScheme.BLUE
fee_color = ColorScheme.BLUE
feerate_color = ColorScheme.BLUE
self.statusBar().showMessage(text)
self.amount_e.setStyleSheet(amt_color.as_stylesheet())
self.fee_e.setStyleSheet(fee_color.as_stylesheet())
self.feerate_e.setStyleSheet(feerate_color.as_stylesheet())
self.amount_e.textChanged.connect(entry_changed)
self.fee_e.textChanged.connect(entry_changed)
self.feerate_e.textChanged.connect(entry_changed)
self.invoices_label = QLabel(_('Invoices'))
from .invoice_list import InvoiceList
self.invoice_list = InvoiceList(self)
vbox0 = QVBoxLayout()
vbox0.addLayout(grid)
hbox = QHBoxLayout()
hbox.addLayout(vbox0)
w = QWidget()
vbox = QVBoxLayout(w)
vbox.addLayout(hbox)
vbox.addStretch(1)
vbox.addWidget(self.invoices_label)
vbox.addWidget(self.invoice_list)
vbox.setStretchFactor(self.invoice_list, 1000)
w.searchable_list = self.invoice_list
run_hook('create_send_tab', grid)
return w
def spend_max(self):
self.is_max = True
self.do_update_fee()
def update_fee(self):
self.require_fee_update = True
def get_payto_or_dummy(self):
r = self.payto_e.get_recipient()
if r:
return r
return (TYPE_ADDRESS, self.wallet.dummy_address())
def do_update_fee(self):
'''Recalculate the fee. If the fee was manually input, retain it, but
still build the TX to see if there are enough funds.
'''
freeze_fee = self.is_send_fee_frozen()
freeze_feerate = self.is_send_feerate_frozen()
amount = '!' if self.is_max else self.amount_e.get_amount()
if amount is None:
if not freeze_fee:
self.fee_e.setAmount(None)
self.not_enough_funds = False
self.statusBar().showMessage('')
else:
fee_estimator = self.get_send_fee_estimator()
outputs = self.payto_e.get_outputs(self.is_max)
if not outputs:
_type, addr = self.get_payto_or_dummy()
outputs = [(_type, addr, amount)]
is_sweep = bool(self.tx_external_keypairs)
make_tx = lambda fee_est: \
self.wallet.make_unsigned_transaction(
self.get_coins(), outputs, self.config,
fixed_fee=fee_est, is_sweep=is_sweep)
try:
tx = make_tx(fee_estimator)
self.not_enough_funds = False
except (NotEnoughFunds, NoDynamicFeeEstimates) as e:
if not freeze_fee:
self.fee_e.setAmount(None)
if not freeze_feerate:
self.feerate_e.setAmount(None)
self.feerounding_icon.setVisible(False)
if isinstance(e, NotEnoughFunds):
self.not_enough_funds = True
elif isinstance(e, NoDynamicFeeEstimates):
try:
tx = make_tx(0)
size = tx.estimated_size()
self.size_e.setAmount(size)
except BaseException:
pass
return
except BaseException:
traceback.print_exc(file=sys.stderr)
return
size = tx.estimated_size()
self.size_e.setAmount(size)
fee = tx.get_fee()
fee = None if self.not_enough_funds else fee
# Displayed fee/fee_rate values are set according to user input.
# Due to rounding or dropping dust in CoinChooser,
# actual fees often differ somewhat.
if freeze_feerate or self.fee_slider.is_active():
displayed_feerate = self.feerate_e.get_amount()
if displayed_feerate:
displayed_feerate = displayed_feerate // 1000
else:
# fallback to actual fee
displayed_feerate = fee // size if fee is not None else None
self.feerate_e.setAmount(displayed_feerate)
displayed_fee = displayed_feerate * size if displayed_feerate is not None else None
self.fee_e.setAmount(displayed_fee)
else:
if freeze_fee:
displayed_fee = self.fee_e.get_amount()
else:
# fallback to actual fee if nothing is frozen
displayed_fee = fee
self.fee_e.setAmount(displayed_fee)
displayed_fee = displayed_fee if displayed_fee else 0
displayed_feerate = displayed_fee // size if displayed_fee is not None else None
self.feerate_e.setAmount(displayed_feerate)
# show/hide fee rounding icon
feerounding = (fee - displayed_fee) if fee else 0
self.set_feerounding_text(feerounding)
self.feerounding_icon.setToolTip(self.feerounding_text)
self.feerounding_icon.setVisible(bool(feerounding))
if self.is_max:
amount = tx.output_value()
self.amount_e.setAmount(amount)
def from_list_delete(self, item):
i = self.from_list.indexOfTopLevelItem(item)
self.pay_from.pop(i)
self.redraw_from_list()
self.update_fee()
def from_list_menu(self, position):
item = self.from_list.itemAt(position)
menu = QMenu()
menu.addAction(_("Remove"), lambda: self.from_list_delete(item))
menu.exec_(self.from_list.viewport().mapToGlobal(position))
def set_pay_from(self, coins):
self.pay_from = list(coins)
self.redraw_from_list()
def redraw_from_list(self):
self.from_list.clear()
self.from_label.setHidden(len(self.pay_from) == 0)
self.from_list.setHidden(len(self.pay_from) == 0)
def format(x):
h = x.get('prevout_hash')
return h[0:10] + '...' + h[-10:] + ":%d"%x.get('prevout_n') + u'\t' + "%s"%x.get('address')
for item in self.pay_from:
self.from_list.addTopLevelItem(QTreeWidgetItem( [format(item), self.format_amount(item['value']) ]))
def get_contact_payto(self, key):
_type, label = self.contacts.get(key)
return label + ' <' + key + '>' if _type == 'address' else key
def update_completions(self):
l = [self.get_contact_payto(key) for key in self.contacts.keys()]
self.completions.setStringList(l)
def protected(func):
'''Password request wrapper. The password is passed to the function
as the 'password' named argument. "None" indicates either an
unencrypted wallet, or the user cancelled the password request.
An empty input is passed as the empty string.'''
def request_password(self, *args, **kwargs):
parent = self.top_level_window()
password = None
while self.wallet.has_keystore_encryption():
password = self.password_dialog(parent=parent)
if password is None:
# User cancelled password input
return
try:
self.wallet.check_password(password)
break
except Exception as e:
self.show_error(str(e), parent=parent)
continue
kwargs['password'] = password
return func(self, *args, **kwargs)
return request_password
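# Minimal usage sketch for the decorator above (the method name is hypothetical):
#
#   @protected
#   def reveal_secret(self, address, password):
#       return self.wallet.export_private_key(address, password)
#
# The wrapper keeps prompting until wallet.check_password() accepts the input or the
# user cancels, in which case the wrapped method is never called.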
def is_send_fee_frozen(self):
return self.fee_e.isVisible() and self.fee_e.isModified() \
and (self.fee_e.text() or self.fee_e.hasFocus())
def is_send_feerate_frozen(self):
return self.feerate_e.isVisible() and self.feerate_e.isModified() \
and (self.feerate_e.text() or self.feerate_e.hasFocus())
def get_send_fee_estimator(self):
if self.is_send_fee_frozen():
fee_estimator = self.fee_e.get_amount()
elif self.is_send_feerate_frozen():
amount = self.feerate_e.get_amount()
amount = 0 if amount is None else amount
fee_estimator = partial(
simple_config.SimpleConfig.estimate_fee_for_feerate, amount)
else:
fee_estimator = None
return fee_estimator
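# The estimator returned above takes one of three shapes: an absolute fee in satoshis
# (frozen fee field), a callable built with partial() that maps a transaction size to a
# fee (frozen fee rate), or None (fall back to the config defaults). How
# make_unsigned_transaction consumes the 'fixed_fee' argument is assumed here, not
# shown in this file.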
def read_send_tab(self):
if self.payment_request and self.payment_request.has_expired():
self.show_error(_('Payment request has expired'))
return
label = self.message_e.text()
if self.payment_request:
outputs = self.payment_request.get_outputs()
else:
errors = self.payto_e.get_errors()
if errors:
self.show_warning(_("Invalid Lines found:") + "\n\n" + '\n'.join([ _("Line #") + str(x[0]+1) + ": " + x[1] for x in errors]))
return
outputs = self.payto_e.get_outputs(self.is_max)
if self.payto_e.is_alias and self.payto_e.validated is False:
alias = self.payto_e.toPlainText()
msg = _('WARNING: the alias "{}" could not be validated via an additional '
'security check, DNSSEC, and thus may not be correct.').format(alias) + '\n'
msg += _('Do you wish to continue?')
if not self.question(msg):
return
if not outputs:
self.show_error(_('No outputs'))
return
for _type, addr, amount in outputs:
if addr is None:
self.show_error(_('Bithereum Address is None'))
return
if _type == TYPE_ADDRESS and not bitcoin.is_address(addr):
self.show_error(_('Invalid Bithereum Address'))
return
if amount is None:
self.show_error(_('Invalid Amount'))
return
fee_estimator = self.get_send_fee_estimator()
coins = self.get_coins()
return outputs, fee_estimator, label, coins
def do_preview(self):
self.do_send(preview = True)
def do_send(self, preview = False):
if run_hook('abort_send', self):
return
r = self.read_send_tab()
if not r:
return
outputs, fee_estimator, tx_desc, coins = r
try:
is_sweep = bool(self.tx_external_keypairs)
tx = self.wallet.make_unsigned_transaction(
coins, outputs, self.config, fixed_fee=fee_estimator,
is_sweep=is_sweep)
except NotEnoughFunds:
self.show_message(_("Insufficient funds"))
return
except BaseException as e:
traceback.print_exc(file=sys.stdout)
self.show_message(str(e))
return
amount = tx.output_value() if self.is_max else sum(map(lambda x:x[2], outputs))
fee = tx.get_fee()
use_rbf = self.config.get('use_rbf', True)
if use_rbf:
tx.set_rbf(True)
if fee < self.wallet.relayfee() * tx.estimated_size() / 1000:
self.show_error('\n'.join([
_("This transaction requires a higher fee, or it will not be propagated by your current server"),
_("Try to raise your transaction fee, or use a server with a lower relay fee.")
]))
return
if preview:
self.show_transaction(tx, tx_desc)
return
if not self.network:
self.show_error(_("You can't broadcast a transaction without a live network connection."))
return
# confirmation dialog
msg = [
_("Amount to be sent") + ": " + self.format_amount_and_units(amount),
_("Mining fee") + ": " + self.format_amount_and_units(fee),
]
x_fee = run_hook('get_tx_extra_fee', self.wallet, tx)
if x_fee:
x_fee_address, x_fee_amount = x_fee
msg.append( _("Additional fees") + ": " + self.format_amount_and_units(x_fee_amount) )
confirm_rate = simple_config.FEERATE_WARNING_HIGH_FEE
if fee > confirm_rate * tx.estimated_size() / 1000:
msg.append(_('Warning') + ': ' + _("The fee for this transaction seems unusually high."))
if self.wallet.has_keystore_encryption():
msg.append("")
msg.append(_("Enter your password to proceed"))
password = self.password_dialog('\n'.join(msg))
if not password:
return
else:
msg.append(_('Proceed?'))
password = None
if not self.question('\n'.join(msg)):
return
def sign_done(success):
if success:
if not tx.is_complete():
self.show_transaction(tx)
self.do_clear()
else:
self.broadcast_transaction(tx, tx_desc)
self.sign_tx_with_password(tx, sign_done, password)
@protected
def sign_tx(self, tx, callback, password):
self.sign_tx_with_password(tx, callback, password)
def sign_tx_with_password(self, tx, callback, password):
'''Sign the transaction in a separate thread. When done, calls
the callback with a success code of True or False.
'''
def on_signed(result):
callback(True)
def on_failed(exc_info):
self.on_error(exc_info)
callback(False)
if self.tx_external_keypairs:
# can sign directly
task = partial(Transaction.sign, tx, self.tx_external_keypairs)
else:
# call hook to see if plugin needs gui interaction
run_hook('sign_tx', self, tx)
task = partial(self.wallet.sign_transaction, tx, password)
WaitingDialog(self, _('Signing transaction...'), task,
on_signed, on_failed)
def broadcast_transaction(self, tx, tx_desc):
def broadcast_thread():
# non-GUI thread
pr = self.payment_request
if pr and pr.has_expired():
self.payment_request = None
return False, _("Payment request has expired")
status, msg = self.network.broadcast(tx)
if pr and status is True:
self.invoices.set_paid(pr, tx.txid())
self.invoices.save()
self.payment_request = None
refund_address = self.wallet.get_receiving_addresses()[0]
ack_status, ack_msg = pr.send_ack(str(tx), refund_address)
if ack_status:
msg = ack_msg
return status, msg
# Capture the current top-level window; the dialog override might be gone by the time broadcast_done runs
parent = self.top_level_window(lambda win: isinstance(win, MessageBoxMixin))
def broadcast_done(result):
# GUI thread
if result:
status, msg = result
if status:
if tx_desc is not None and tx.is_complete():
self.wallet.set_label(tx.txid(), tx_desc)
parent.show_message(_('Payment sent.') + '\n' + msg)
self.invoice_list.update()
self.do_clear()
else:
parent.show_error(msg)
WaitingDialog(self, _('Broadcasting transaction...'),
broadcast_thread, broadcast_done, self.on_error)
def query_choice(self, msg, choices):
# Needed by QtHandler for hardware wallets
dialog = WindowModalDialog(self.top_level_window())
clayout = ChoicesLayout(msg, choices)
vbox = QVBoxLayout(dialog)
vbox.addLayout(clayout.layout())
vbox.addLayout(Buttons(OkButton(dialog)))
if not dialog.exec_():
return None
return clayout.selected_index()
def lock_amount(self, b):
self.amount_e.setFrozen(b)
self.max_button.setEnabled(not b)
def prepare_for_payment_request(self):
self.show_send_tab()
self.payto_e.is_pr = True
for e in [self.payto_e, self.amount_e, self.message_e]:
e.setFrozen(True)
self.payto_e.setText(_("please wait..."))
return True
def delete_invoice(self, key):
self.invoices.remove(key)
self.invoice_list.update()
def payment_request_ok(self):
pr = self.payment_request
key = self.invoices.add(pr)
status = self.invoices.get_status(key)
self.invoice_list.update()
if status == PR_PAID:
self.show_message("invoice already paid")
self.do_clear()
self.payment_request = None
return
self.payto_e.is_pr = True
if not pr.has_expired():
self.payto_e.setGreen()
else:
self.payto_e.setExpired()
self.payto_e.setText(pr.get_requestor())
self.amount_e.setText(format_satoshis_plain(pr.get_amount(), self.decimal_point))
self.message_e.setText(pr.get_memo())
# signal to set fee
self.amount_e.textEdited.emit("")
def payment_request_error(self):
self.show_message(self.payment_request.error)
self.payment_request = None
self.do_clear()
def on_pr(self, request):
self.payment_request = request
if self.payment_request.verify(self.contacts):
self.payment_request_ok_signal.emit()
else:
self.payment_request_error_signal.emit()
def pay_to_URI(self, URI):
if not URI:
return
try:
out = util.parse_URI(URI, self.on_pr)
except BaseException as e:
self.show_error(_('Invalid bithereumnetwork URI:') + '\n' + str(e))
return
self.show_send_tab()
r = out.get('r')
sig = out.get('sig')
name = out.get('name')
if r or (name and sig):
self.prepare_for_payment_request()
return
address = out.get('address')
amount = out.get('amount')
label = out.get('label')
message = out.get('message')
# use label as description (not BIP21 compliant)
if label and not message:
message = label
if address:
self.payto_e.setText(address)
if message:
self.message_e.setText(message)
if amount:
self.amount_e.setAmount(amount)
self.amount_e.textEdited.emit("")
def do_clear(self):
self.is_max = False
self.not_enough_funds = False
self.payment_request = None
self.payto_e.is_pr = False
for e in [self.payto_e, self.message_e, self.amount_e, self.fiat_send_e,
self.fee_e, self.feerate_e]:
e.setText('')
e.setFrozen(False)
self.fee_slider.activate()
self.feerate_e.setAmount(self.config.fee_per_byte())
self.size_e.setAmount(0)
self.feerounding_icon.setVisible(False)
self.set_pay_from([])
self.tx_external_keypairs = {}
self.update_status()
run_hook('do_clear', self)
def set_frozen_state(self, addrs, freeze):
self.wallet.set_frozen_state(addrs, freeze)
self.address_list.update()
self.utxo_list.update()
self.update_fee()
def create_list_tab(self, l, toolbar=None):
w = QWidget()
w.searchable_list = l
vbox = QVBoxLayout()
w.setLayout(vbox)
vbox.setContentsMargins(0, 0, 0, 0)
vbox.setSpacing(0)
if toolbar:
vbox.addLayout(toolbar)
vbox.addWidget(l)
return w
def create_addresses_tab(self):
from .address_list import AddressList
self.address_list = l = AddressList(self)
toolbar = l.create_toolbar(self.config)
toolbar_shown = self.config.get('show_toolbar_addresses', False)
l.show_toolbar(toolbar_shown)
return self.create_list_tab(l, toolbar)
def create_utxo_tab(self):
from .utxo_list import UTXOList
self.utxo_list = l = UTXOList(self)
return self.create_list_tab(l)
def create_contacts_tab(self):
from .contact_list import ContactList
self.contact_list = l = ContactList(self)
return self.create_list_tab(l)
def remove_address(self, addr):
if self.question(_("Do you want to remove")+" %s "%addr +_("from your wallet?")):
self.wallet.delete_address(addr)
self.need_update.set() # history, addresses, coins
self.clear_receive_tab()
def get_coins(self):
if self.pay_from:
return self.pay_from
else:
return self.wallet.get_spendable_coins(None, self.config)
def spend_coins(self, coins):
self.set_pay_from(coins)
self.show_send_tab()
self.update_fee()
def paytomany(self):
self.show_send_tab()
self.payto_e.paytomany()
msg = '\n'.join([
_('Enter a list of outputs in the \'Pay to\' field.'),
_('One output per line.'),
_('Format: address, amount'),
_('You may load a CSV file using the file icon.')
])
self.show_message(msg, title=_('Pay to many'))
def payto_contacts(self, labels):
paytos = [self.get_contact_payto(label) for label in labels]
self.show_send_tab()
if len(paytos) == 1:
self.payto_e.setText(paytos[0])
self.amount_e.setFocus()
else:
text = "\n".join([payto + ", 0" for payto in paytos])
self.payto_e.setText(text)
self.payto_e.setFocus()
def set_contact(self, label, address):
if not is_address(address):
self.show_error(_('Invalid Address'))
self.contact_list.update() # Displays original unchanged value
return False
self.contacts[address] = ('address', label)
self.contact_list.update()
self.history_list.update()
self.update_completions()
return True
def delete_contacts(self, labels):
if not self.question(_("Remove {} from your list of contacts?")
.format(" + ".join(labels))):
return
for label in labels:
self.contacts.pop(label)
self.history_list.update()
self.contact_list.update()
self.update_completions()
def show_invoice(self, key):
pr = self.invoices.get(key)
if pr is None:
self.show_error(_('Cannot find payment request in wallet.'))
return
pr.verify(self.contacts)
self.show_pr_details(pr)
def show_pr_details(self, pr):
key = pr.get_id()
d = WindowModalDialog(self, _("Invoice"))
vbox = QVBoxLayout(d)
grid = QGridLayout()
grid.addWidget(QLabel(_("Requestor") + ':'), 0, 0)
grid.addWidget(QLabel(pr.get_requestor()), 0, 1)
grid.addWidget(QLabel(_("Amount") + ':'), 1, 0)
outputs_str = '\n'.join(map(lambda x: self.format_amount(x[2])+ self.base_unit() + ' @ ' + x[1], pr.get_outputs()))
grid.addWidget(QLabel(outputs_str), 1, 1)
expires = pr.get_expiration_date()
grid.addWidget(QLabel(_("Memo") + ':'), 2, 0)
grid.addWidget(QLabel(pr.get_memo()), 2, 1)
grid.addWidget(QLabel(_("Signature") + ':'), 3, 0)
grid.addWidget(QLabel(pr.get_verify_status()), 3, 1)
if expires:
grid.addWidget(QLabel(_("Expires") + ':'), 4, 0)
grid.addWidget(QLabel(format_time(expires)), 4, 1)
vbox.addLayout(grid)
def do_export():
fn = self.getSaveFileName(_("Save invoice to file"), "*.bip70")
if not fn:
return
with open(fn, 'wb') as f:
f.write(pr.raw)
self.show_message(_('Invoice saved as') + ' ' + fn)
exportButton = EnterButton(_('Save'), do_export)
def do_delete():
if self.question(_('Delete invoice?')):
self.invoices.remove(key)
self.history_list.update()
self.invoice_list.update()
d.close()
deleteButton = EnterButton(_('Delete'), do_delete)
vbox.addLayout(Buttons(exportButton, deleteButton, CloseButton(d)))
d.exec_()
def do_pay_invoice(self, key):
pr = self.invoices.get(key)
self.payment_request = pr
self.prepare_for_payment_request()
pr.error = None # this forces verify() to re-run
if pr.verify(self.contacts):
self.payment_request_ok()
else:
self.payment_request_error()
def create_console_tab(self):
from .console import Console
self.console = console = Console()
return console
def update_console(self):
console = self.console
console.history = self.config.get("console-history",[])
console.history_index = len(console.history)
console.updateNamespace({'wallet' : self.wallet,
'network' : self.network,
'plugins' : self.gui_object.plugins,
'window': self})
console.updateNamespace({'util' : util, 'bithereumnetwork':bitcoin})
c = commands.Commands(self.config, self.wallet, self.network, lambda: self.console.set_json(True))
methods = {}
def mkfunc(f, method):
return lambda *args: f(method, args, self.password_dialog)
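# mkfunc binds 'method' by value on purpose: a bare lambda created inside the loop
# below would close over the loop variable and, due to Python's late binding, every
# console command would end up dispatching to the last entry of dir(c).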
for m in dir(c):
if m[0]=='_' or m in ['network','wallet']: continue
methods[m] = mkfunc(c._run, m)
console.updateNamespace(methods)
def create_status_bar(self):
sb = QStatusBar()
sb.setFixedHeight(35)
qtVersion = qVersion()
self.balance_label = QLabel("")
self.balance_label.setTextInteractionFlags(Qt.TextSelectableByMouse)
self.balance_label.setStyleSheet("""QLabel { padding: 0 }""")
sb.addWidget(self.balance_label)
self.search_box = QLineEdit()
self.search_box.textChanged.connect(self.do_search)
self.search_box.hide()
sb.addPermanentWidget(self.search_box)
self.lock_icon = QIcon()
self.password_button = StatusBarButton(self.lock_icon, _("Password"), self.change_password_dialog )
sb.addPermanentWidget(self.password_button)
sb.addPermanentWidget(StatusBarButton(QIcon(":icons/preferences.png"), _("Preferences"), self.settings_dialog ) )
self.seed_button = StatusBarButton(QIcon(":icons/seed.png"), _("Seed"), self.show_seed_dialog )
sb.addPermanentWidget(self.seed_button)
self.status_button = StatusBarButton(QIcon(":icons/status_disconnected.png"), _("Network"), lambda: self.gui_object.show_network_dialog(self))
sb.addPermanentWidget(self.status_button)
run_hook('create_status_bar', sb)
self.setStatusBar(sb)
def update_lock_icon(self):
icon = QIcon(":icons/lock.png") if self.wallet.has_password() else QIcon(":icons/unlock.png")
self.password_button.setIcon(icon)
def update_buttons_on_seed(self):
self.seed_button.setVisible(self.wallet.has_seed())
self.password_button.setVisible(self.wallet.may_have_password())
self.send_button.setVisible(not self.wallet.is_watching_only())
def change_password_dialog(self):
from electrum.storage import STO_EV_XPUB_PW
if self.wallet.get_available_storage_encryption_version() == STO_EV_XPUB_PW:
from .password_dialog import ChangePasswordDialogForHW
d = ChangePasswordDialogForHW(self, self.wallet)
ok, encrypt_file = d.run()
if not ok:
return
try:
hw_dev_pw = self.wallet.keystore.get_password_for_storage_encryption()
except UserCancelled:
return
except BaseException as e:
traceback.print_exc(file=sys.stderr)
self.show_error(str(e))
return
old_password = hw_dev_pw if self.wallet.has_password() else None
new_password = hw_dev_pw if encrypt_file else None
else:
from .password_dialog import ChangePasswordDialogForSW
d = ChangePasswordDialogForSW(self, self.wallet)
ok, old_password, new_password, encrypt_file = d.run()
if not ok:
return
try:
self.wallet.update_password(old_password, new_password, encrypt_file)
except InvalidPassword as e:
self.show_error(str(e))
return
except BaseException:
traceback.print_exc(file=sys.stdout)
self.show_error(_('Failed to update password'))
return
msg = _('Password was updated successfully') if self.wallet.has_password() else _('Password is disabled, this wallet is not protected')
self.show_message(msg, title=_("Success"))
self.update_lock_icon()
def toggle_search(self):
tab = self.tabs.currentWidget()
#if hasattr(tab, 'searchable_list'):
# tab.searchable_list.toggle_toolbar()
#return
self.search_box.setHidden(not self.search_box.isHidden())
if not self.search_box.isHidden():
self.search_box.setFocus(1)
else:
self.do_search('')
def do_search(self, t):
tab = self.tabs.currentWidget()
if hasattr(tab, 'searchable_list'):
tab.searchable_list.filter(t)
def new_contact_dialog(self):
d = WindowModalDialog(self, _("New Contact"))
vbox = QVBoxLayout(d)
vbox.addWidget(QLabel(_('New Contact') + ':'))
grid = QGridLayout()
line1 = QLineEdit()
line1.setFixedWidth(280)
line2 = QLineEdit()
line2.setFixedWidth(280)
grid.addWidget(QLabel(_("Address")), 1, 0)
grid.addWidget(line1, 1, 1)
grid.addWidget(QLabel(_("Name")), 2, 0)
grid.addWidget(line2, 2, 1)
vbox.addLayout(grid)
vbox.addLayout(Buttons(CancelButton(d), OkButton(d)))
if d.exec_():
self.set_contact(line2.text(), line1.text())
def show_master_public_keys(self):
dialog = WindowModalDialog(self, _("Wallet Information"))
dialog.setMinimumSize(500, 100)
mpk_list = self.wallet.get_master_public_keys()
vbox = QVBoxLayout()
wallet_type = self.wallet.storage.get('wallet_type', '')
grid = QGridLayout()
basename = os.path.basename(self.wallet.storage.path)
grid.addWidget(QLabel(_("Wallet name")+ ':'), 0, 0)
grid.addWidget(QLabel(basename), 0, 1)
grid.addWidget(QLabel(_("Wallet type")+ ':'), 1, 0)
grid.addWidget(QLabel(wallet_type), 1, 1)
grid.addWidget(QLabel(_("Script type")+ ':'), 2, 0)
grid.addWidget(QLabel(self.wallet.txin_type), 2, 1)
vbox.addLayout(grid)
if self.wallet.is_deterministic():
mpk_text = ShowQRTextEdit()
mpk_text.setMaximumHeight(150)
mpk_text.addCopyButton(self.app)
def show_mpk(index):
mpk_text.setText(mpk_list[index])
# only show the combobox in case multiple accounts are available
if len(mpk_list) > 1:
def label(key):
if isinstance(self.wallet, Multisig_Wallet):
return _("cosigner") + ' ' + str(key+1)
return ''
labels = [label(i) for i in range(len(mpk_list))]
on_click = lambda clayout: show_mpk(clayout.selected_index())
labels_clayout = ChoicesLayout(_("Master Public Keys"), labels, on_click)
vbox.addLayout(labels_clayout.layout())
else:
vbox.addWidget(QLabel(_("Master Public Key")))
show_mpk(0)
vbox.addWidget(mpk_text)
vbox.addStretch(1)
vbox.addLayout(Buttons(CloseButton(dialog)))
dialog.setLayout(vbox)
dialog.exec_()
def remove_wallet(self):
if self.question('\n'.join([
_('Delete wallet file?'),
"%s" % self.wallet.storage.path,
_('If your wallet contains funds, make sure you have saved its seed.')]),
title='ElectrumBTH', icon=QMessageBox.Warning):
if self.question('\n'.join([
_('Are you sure you want to delete this wallet file?'),
"%s" % self.wallet.storage.path, '\n',
_("This CANNOT be undone!")]),
title='ElectrumBTH', icon=QMessageBox.Warning):
self._delete_wallet()
@protected
def _delete_wallet(self, password):
wallet_path = self.wallet.storage.path
basename = os.path.basename(wallet_path)
self.gui_object.daemon.stop_wallet(wallet_path)
self.close()
os.unlink(wallet_path)
self.show_error("Wallet removed:" + basename)
@protected
def show_seed_dialog(self, password):
if not self.wallet.has_seed():
self.show_message(_('This wallet has no seed'))
return
keystore = self.wallet.get_keystore()
try:
seed = keystore.get_seed(password)
passphrase = keystore.get_passphrase(password)
except BaseException as e:
self.show_error(str(e))
return
from .seed_dialog import SeedDialog
d = SeedDialog(self, seed, passphrase)
d.exec_()
def show_qrcode(self, data, title = _("QR code"), parent=None):
if not data:
return
d = QRDialog(data, parent or self, title)
d.exec_()
@protected
def show_private_key(self, address, password):
if not address:
return
try:
pk, redeem_script = self.wallet.export_private_key(address, password)
except Exception as e:
traceback.print_exc(file=sys.stdout)
self.show_message(str(e))
return
xtype = bitcoin.deserialize_privkey(pk)[0]
d = WindowModalDialog(self, _("Private key"))
d.setMinimumSize(600, 150)
vbox = QVBoxLayout()
vbox.addWidget(QLabel(_("Address") + ': ' + address))
vbox.addWidget(QLabel(_("Script type") + ': ' + xtype))
vbox.addWidget(QLabel(_("Private key") + ':'))
keys_e = ShowQRTextEdit(text=pk)
keys_e.addCopyButton(self.app)
vbox.addWidget(keys_e)
if redeem_script:
vbox.addWidget(QLabel(_("Redeem Script") + ':'))
rds_e = ShowQRTextEdit(text=redeem_script)
rds_e.addCopyButton(self.app)
vbox.addWidget(rds_e)
vbox.addLayout(Buttons(CloseButton(d)))
d.setLayout(vbox)
d.exec_()
msg_sign = _("Signing with an address actually means signing with the corresponding "
"private key, and verifying with the corresponding public key. The "
"address you have entered does not have a unique public key, so these "
"operations cannot be performed.") + '\n\n' + \
_('The operation is undefined. Not just in ElectrumBTH, but in general.')
@protected
def do_sign(self, address, message, signature, password):
address = address.text().strip()
message = message.toPlainText().strip()
if not bitcoin.is_address(address):
self.show_message(_('Invalid Bithereum address.'))
return
if self.wallet.is_watching_only():
self.show_message(_('This is a watching-only wallet.'))
return
if not self.wallet.is_mine(address):
self.show_message(_('Address not in wallet.'))
return
txin_type = self.wallet.get_txin_type(address)
if txin_type not in ['p2pkh', 'p2wpkh', 'p2wpkh-p2sh']:
self.show_message(_('Cannot sign messages with this type of address:') + \
' ' + txin_type + '\n\n' + self.msg_sign)
return
task = partial(self.wallet.sign_message, address, message, password)
def show_signed_message(sig):
try:
signature.setText(base64.b64encode(sig).decode('ascii'))
except RuntimeError:
# (signature) wrapped C/C++ object has been deleted
pass
self.wallet.thread.add(task, on_success=show_signed_message)
def do_verify(self, address, message, signature):
address = address.text().strip()
message = message.toPlainText().strip().encode('utf-8')
if not bitcoin.is_address(address):
self.show_message(_('Invalid Bithereum address.'))
return
try:
# This can throw on invalid base64
sig = base64.b64decode(str(signature.toPlainText()))
verified = bitcoin.verify_message(address, sig, message)
except Exception as e:
verified = False
if verified:
self.show_message(_("Signature verified"))
else:
self.show_error(_("Wrong signature"))
def sign_verify_message(self, address=''):
d = WindowModalDialog(self, _('Sign/verify Message'))
d.setMinimumSize(610, 290)
layout = QGridLayout(d)
message_e = QTextEdit()
layout.addWidget(QLabel(_('Message')), 1, 0)
layout.addWidget(message_e, 1, 1)
layout.setRowStretch(2,3)
address_e = QLineEdit()
address_e.setText(address)
layout.addWidget(QLabel(_('Address')), 2, 0)
layout.addWidget(address_e, 2, 1)
signature_e = QTextEdit()
layout.addWidget(QLabel(_('Signature')), 3, 0)
layout.addWidget(signature_e, 3, 1)
layout.setRowStretch(3,1)
hbox = QHBoxLayout()
b = QPushButton(_("Sign"))
b.clicked.connect(lambda: self.do_sign(address_e, message_e, signature_e))
hbox.addWidget(b)
b = QPushButton(_("Verify"))
b.clicked.connect(lambda: self.do_verify(address_e, message_e, signature_e))
hbox.addWidget(b)
b = QPushButton(_("Close"))
b.clicked.connect(d.accept)
hbox.addWidget(b)
layout.addLayout(hbox, 4, 1)
d.exec_()
@protected
def do_decrypt(self, message_e, pubkey_e, encrypted_e, password):
if self.wallet.is_watching_only():
self.show_message(_('This is a watching-only wallet.'))
return
cyphertext = encrypted_e.toPlainText()
task = partial(self.wallet.decrypt_message, pubkey_e.text(), cyphertext, password)
def setText(text):
try:
message_e.setText(text.decode('utf-8'))
except RuntimeError:
# (message_e) wrapped C/C++ object has been deleted
pass
self.wallet.thread.add(task, on_success=setText)
def do_encrypt(self, message_e, pubkey_e, encrypted_e):
message = message_e.toPlainText()
message = message.encode('utf-8')
try:
encrypted = bitcoin.encrypt_message(message, pubkey_e.text())
encrypted_e.setText(encrypted.decode('ascii'))
except BaseException as e:
traceback.print_exc(file=sys.stdout)
self.show_warning(str(e))
def encrypt_message(self, address=''):
d = WindowModalDialog(self, _('Encrypt/decrypt Message'))
d.setMinimumSize(610, 490)
layout = QGridLayout(d)
message_e = QTextEdit()
layout.addWidget(QLabel(_('Message')), 1, 0)
layout.addWidget(message_e, 1, 1)
layout.setRowStretch(2,3)
pubkey_e = QLineEdit()
if address:
pubkey = self.wallet.get_public_key(address)
pubkey_e.setText(pubkey)
layout.addWidget(QLabel(_('Public key')), 2, 0)
layout.addWidget(pubkey_e, 2, 1)
encrypted_e = QTextEdit()
layout.addWidget(QLabel(_('Encrypted')), 3, 0)
layout.addWidget(encrypted_e, 3, 1)
layout.setRowStretch(3,1)
hbox = QHBoxLayout()
b = QPushButton(_("Encrypt"))
b.clicked.connect(lambda: self.do_encrypt(message_e, pubkey_e, encrypted_e))
hbox.addWidget(b)
b = QPushButton(_("Decrypt"))
b.clicked.connect(lambda: self.do_decrypt(message_e, pubkey_e, encrypted_e))
hbox.addWidget(b)
b = QPushButton(_("Close"))
b.clicked.connect(d.accept)
hbox.addWidget(b)
layout.addLayout(hbox, 4, 1)
d.exec_()
def password_dialog(self, msg=None, parent=None):
from .password_dialog import PasswordDialog
parent = parent or self
d = PasswordDialog(parent, msg)
return d.run()
def tx_from_text(self, txt):
from electrum.transaction import tx_from_str
try:
tx = tx_from_str(txt)
return Transaction(tx)
except BaseException as e:
self.show_critical(_("ElectrumBTH was unable to parse your transaction") + ":\n" + str(e))
return
def read_tx_from_qrcode(self):
from electrum import qrscanner
try:
data = qrscanner.scan_barcode(self.config.get_video_device())
except BaseException as e:
self.show_error(str(e))
return
if not data:
return
# if the user scanned a bithereumnetwork URI
if str(data).startswith("bithereumnetwork:"):
self.pay_to_URI(data)
return
# else if the user scanned an offline signed tx
try:
data = bh2u(bitcoin.base_decode(data, length=None, base=43))
except BaseException as e:
self.show_error((_('Could not decode QR code')+':\n{}').format(e))
return
tx = self.tx_from_text(data)
if not tx:
return
self.show_transaction(tx)
def read_tx_from_file(self):
fileName = self.getOpenFileName(_("Select your transaction file"), "*.txn")
if not fileName:
return
try:
with open(fileName, "r") as f:
file_content = f.read()
except (ValueError, IOError, os.error) as reason:
self.show_critical(_("ElectrumBTH was unable to open your transaction file") + "\n" + str(reason), title=_("Unable to read file or no transaction found"))
return
return self.tx_from_text(file_content)
def do_process_from_text(self):
text = text_dialog(self, _('Input raw transaction'), _("Transaction:"), _("Load transaction"))
if not text:
return
tx = self.tx_from_text(text)
if tx:
self.show_transaction(tx)
def do_process_from_file(self):
tx = self.read_tx_from_file()
if tx:
self.show_transaction(tx)
def do_process_from_txid(self):
from electrum import transaction
txid, ok = QInputDialog.getText(self, _('Lookup transaction'), _('Transaction ID') + ':')
if ok and txid:
txid = str(txid).strip()
try:
r = self.network.synchronous_get(('blockchain.transaction.get', [txid]))
except BaseException as e:
self.show_message(str(e))
return
tx = transaction.Transaction(r)
self.show_transaction(tx)
@protected
def export_privkeys_dialog(self, password):
if self.wallet.is_watching_only():
self.show_message(_("This is a watching-only wallet"))
return
if isinstance(self.wallet, Multisig_Wallet):
self.show_message(_('WARNING: This is a multi-signature wallet.') + '\n' +
_('It cannot be "backed up" by simply exporting these private keys.'))
d = WindowModalDialog(self, _('Private keys'))
d.setMinimumSize(980, 300)
vbox = QVBoxLayout(d)
msg = "%s\n%s\n%s" % (_("WARNING: ALL your private keys are secret."),
_("Exposing a single private key can compromise your entire wallet!"),
_("In particular, DO NOT use 'redeem private key' services proposed by third parties."))
vbox.addWidget(QLabel(msg))
e = QTextEdit()
e.setReadOnly(True)
vbox.addWidget(e)
defaultname = 'electrumbth-private-keys.csv'
select_msg = _('Select file to export your private keys to')
hbox, filename_e, csv_button = filename_field(self, self.config, defaultname, select_msg)
vbox.addLayout(hbox)
b = OkButton(d, _('Export'))
b.setEnabled(False)
vbox.addLayout(Buttons(CancelButton(d), b))
private_keys = {}
addresses = self.wallet.get_addresses()
done = False
cancelled = False
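# The private keys are derived on a worker thread so the dialog stays responsive;
# progress and completion are handed back to the GUI thread through the
# computing_privkeys_signal / show_privkeys_signal Qt signals, because Qt widgets
# may only be updated from the main thread.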
def privkeys_thread():
for addr in addresses:
time.sleep(0.1)
if done or cancelled:
break
privkey = self.wallet.export_private_key(addr, password)[0]
private_keys[addr] = privkey
self.computing_privkeys_signal.emit()
if not cancelled:
self.computing_privkeys_signal.disconnect()
self.show_privkeys_signal.emit()
def show_privkeys():
s = "\n".join( map( lambda x: x[0] + "\t"+ x[1], private_keys.items()))
e.setText(s)
b.setEnabled(True)
self.show_privkeys_signal.disconnect()
nonlocal done
done = True
def on_dialog_closed(*args):
nonlocal done
nonlocal cancelled
if not done:
cancelled = True
self.computing_privkeys_signal.disconnect()
self.show_privkeys_signal.disconnect()
self.computing_privkeys_signal.connect(lambda: e.setText("Please wait... %d/%d"%(len(private_keys),len(addresses))))
self.show_privkeys_signal.connect(show_privkeys)
d.finished.connect(on_dialog_closed)
threading.Thread(target=privkeys_thread).start()
if not d.exec_():
done = True
return
filename = filename_e.text()
if not filename:
return
try:
self.do_export_privkeys(filename, private_keys, csv_button.isChecked())
except (IOError, os.error) as reason:
txt = "\n".join([
_("ElectrumBTH was unable to produce a private key-export."),
str(reason)
])
self.show_critical(txt, title=_("Unable to create csv"))
except Exception as e:
self.show_message(str(e))
return
self.show_message(_("Private keys exported."))
def do_export_privkeys(self, fileName, pklist, is_csv):
with open(fileName, "w+") as f:
if is_csv:
transaction = csv.writer(f)
transaction.writerow(["address", "private_key"])
for addr, pk in pklist.items():
transaction.writerow(["%34s"%addr,pk])
else:
import json
f.write(json.dumps(pklist, indent = 4))
def do_import_labels(self):
def import_labels(path):
def _validate(data):
return data # TODO
def import_labels_assign(data):
for key, value in data.items():
self.wallet.set_label(key, value)
import_meta(path, _validate, import_labels_assign)
def on_import():
self.need_update.set()
import_meta_gui(self, _('labels'), import_labels, on_import)
def do_export_labels(self):
def export_labels(filename):
export_meta(self.wallet.labels, filename)
export_meta_gui(self, _('labels'), export_labels)
def sweep_key_dialog(self):
d = WindowModalDialog(self, title=_('Sweep private keys'))
d.setMinimumSize(600, 300)
vbox = QVBoxLayout(d)
vbox.addWidget(QLabel(_("Enter private keys:")))
keys_e = ScanQRTextEdit(allow_multi=True)
keys_e.setTabChangesFocus(True)
vbox.addWidget(keys_e)
addresses = self.wallet.get_unused_addresses()
if not addresses:
try:
addresses = self.wallet.get_receiving_addresses()
except AttributeError:
addresses = self.wallet.get_addresses()
h, address_e = address_field(addresses)
vbox.addLayout(h)
vbox.addStretch(1)
button = OkButton(d, _('Sweep'))
vbox.addLayout(Buttons(CancelButton(d), button))
button.setEnabled(False)
def get_address():
addr = str(address_e.text()).strip()
if bitcoin.is_address(addr):
return addr
def get_pk():
text = str(keys_e.toPlainText())
return keystore.get_private_keys(text)
f = lambda: button.setEnabled(get_address() is not None and get_pk() is not None)
on_address = lambda text: address_e.setStyleSheet((ColorScheme.DEFAULT if get_address() else ColorScheme.RED).as_stylesheet())
keys_e.textChanged.connect(f)
address_e.textChanged.connect(f)
address_e.textChanged.connect(on_address)
if not d.exec_():
return
from electrum.wallet import sweep_preparations
try:
self.do_clear()
coins, keypairs = sweep_preparations(get_pk(), self.network)
self.tx_external_keypairs = keypairs
self.spend_coins(coins)
self.payto_e.setText(get_address())
self.spend_max()
self.payto_e.setFrozen(True)
self.amount_e.setFrozen(True)
except BaseException as e:
self.show_message(str(e))
return
self.warn_if_watching_only()
def _do_import(self, title, msg, func):
text = text_dialog(self, title, msg + ' :', _('Import'),
allow_multi=True)
if not text:
return
bad = []
good = []
for key in str(text).split():
try:
addr = func(key)
good.append(addr)
except BaseException as e:
bad.append(key)
continue
if good:
self.show_message(_("The following addresses were added") + ':\n' + '\n'.join(good))
if bad:
self.show_critical(_("The following inputs could not be imported") + ':\n'+ '\n'.join(bad))
self.address_list.update()
self.history_list.update()
def import_addresses(self):
if not self.wallet.can_import_address():
return
title, msg = _('Import addresses'), _("Enter addresses")
self._do_import(title, msg, self.wallet.import_address)
@protected
def do_import_privkey(self, password):
if not self.wallet.can_import_privkey():
return
title, msg = _('Import private keys'), _("Enter private keys")
self._do_import(title, msg, lambda x: self.wallet.import_private_key(x, password))
def update_fiat(self):
b = self.fx and self.fx.is_enabled()
self.fiat_send_e.setVisible(b)
self.fiat_receive_e.setVisible(b)
self.history_list.refresh_headers()
self.history_list.update()
self.address_list.refresh_headers()
self.address_list.update()
self.update_status()
def settings_dialog(self):
self.need_restart = False
d = WindowModalDialog(self, _('Preferences'))
vbox = QVBoxLayout()
tabs = QTabWidget()
gui_widgets = []
fee_widgets = []
tx_widgets = []
id_widgets = []
# language
lang_help = _('Select which language is used in the GUI (after restart).')
lang_label = HelpLabel(_('Language') + ':', lang_help)
lang_combo = QComboBox()
from electrum.i18n import languages
lang_combo.addItems(list(languages.values()))
try:
index = list(languages.keys()).index(self.config.get("language", ''))
except Exception:
index = 0
lang_combo.setCurrentIndex(index)
if not self.config.is_modifiable('language'):
for w in [lang_combo, lang_label]: w.setEnabled(False)
def on_lang(x):
lang_request = list(languages.keys())[lang_combo.currentIndex()]
if lang_request != self.config.get('language'):
self.config.set_key("language", lang_request, True)
self.need_restart = True
lang_combo.currentIndexChanged.connect(on_lang)
gui_widgets.append((lang_label, lang_combo))
# minimize to tray
mini_help = _('Set if the wallet should minimize to system tray instead of closing')
mini_label = HelpLabel(_('Minimize to tray') + ':', mini_help)
mini_checkbox = QCheckBox()
mini_checkbox.setChecked(self.minimize_to_tray)
if not self.config.is_modifiable('minimize_tray'):
for w in [mini_checkbox, mini_label]: w.setEnabled(False)
def on_minimize_tray(i):
self.minimize_to_tray = mini_checkbox.isChecked()
self.config.set_key('minimize_tray', self.minimize_to_tray, True)
mini_checkbox.stateChanged.connect(on_minimize_tray)
gui_widgets.append((mini_label, mini_checkbox))
nz_help = _('Number of zeros displayed after the decimal point. For example, if this is set to 2, "1." will be displayed as "1.00"')
nz_label = HelpLabel(_('Zeros after decimal point') + ':', nz_help)
nz = QSpinBox()
nz.setMinimum(0)
nz.setMaximum(self.decimal_point)
nz.setValue(self.num_zeros)
if not self.config.is_modifiable('num_zeros'):
for w in [nz, nz_label]: w.setEnabled(False)
def on_nz():
value = nz.value()
if self.num_zeros != value:
self.num_zeros = value
self.config.set_key('num_zeros', value, True)
self.history_list.update()
self.address_list.update()
nz.valueChanged.connect(on_nz)
gui_widgets.append((nz_label, nz))
msg = '\n'.join([
_('Time based: fee rate is based on average confirmation time estimates'),
_('Mempool based: fee rate is targeting a depth in the memory pool')
]
)
fee_type_label = HelpLabel(_('Fee estimation') + ':', msg)
fee_type_combo = QComboBox()
fee_type_combo.addItems([_('Static'), _('ETA'), _('Mempool')])
fee_type_combo.setCurrentIndex((2 if self.config.use_mempool_fees() else 1) if self.config.is_dynfee() else 0)
def on_fee_type(x):
self.config.set_key('mempool_fees', x==2)
self.config.set_key('dynamic_fees', x>0)
self.fee_slider.update()
fee_type_combo.currentIndexChanged.connect(on_fee_type)
fee_widgets.append((fee_type_label, fee_type_combo))
feebox_cb = QCheckBox(_('Edit fees manually'))
feebox_cb.setChecked(self.config.get('show_fee', False))
feebox_cb.setToolTip(_("Show fee edit box in send tab."))
def on_feebox(x):
self.config.set_key('show_fee', x == Qt.Checked)
self.fee_adv_controls.setVisible(bool(x))
feebox_cb.stateChanged.connect(on_feebox)
fee_widgets.append((feebox_cb, None))
use_rbf_cb = QCheckBox(_('Use Replace-By-Fee'))
use_rbf_cb.setChecked(self.config.get('use_rbf', True))
use_rbf_cb.setToolTip(
_('If you check this box, your transactions will be marked as non-final,') + '\n' + \
_('and you will have the possibility, while they are unconfirmed, to replace them with transactions that pay higher fees.') + '\n' + \
_('Note that some merchants do not accept non-final transactions until they are confirmed.'))
def on_use_rbf(x):
self.config.set_key('use_rbf', x == Qt.Checked)
use_rbf_cb.stateChanged.connect(on_use_rbf)
fee_widgets.append((use_rbf_cb, None))
# SSL certificate
msg = ' '.join([
_('SSL certificate used to sign payment requests.'),
_('Use setconfig to set ssl_chain and ssl_privkey.'),
])
if self.config.get('ssl_privkey') or self.config.get('ssl_chain'):
try:
SSL_identity = paymentrequest.check_ssl_config(self.config)
SSL_error = None
except BaseException as e:
SSL_identity = "error"
SSL_error = str(e)
else:
SSL_identity = ""
SSL_error = None
SSL_id_label = HelpLabel(_('SSL certificate') + ':', msg)
SSL_id_e = QLineEdit(SSL_identity)
SSL_id_e.setStyleSheet((ColorScheme.RED if SSL_error else ColorScheme.GREEN).as_stylesheet(True) if SSL_identity else '')
if SSL_error:
SSL_id_e.setToolTip(SSL_error)
SSL_id_e.setReadOnly(True)
id_widgets.append((SSL_id_label, SSL_id_e))
units = ['BTH', 'mBTH', 'bits']
msg = (_('Base unit of your wallet.')
+ '\n1 BTH = 1000 mBTH. 1 mBTH = 1000 bits.\n'
+ _('This setting affects the Send tab, and all balance related fields.'))
unit_label = HelpLabel(_('Base unit') + ':', msg)
unit_combo = QComboBox()
unit_combo.addItems(units)
unit_combo.setCurrentIndex(units.index(self.base_unit()))
def on_unit(x, nz):
unit_result = units[unit_combo.currentIndex()]
if self.base_unit() == unit_result:
return
edits = self.amount_e, self.fee_e, self.receive_amount_e
amounts = [edit.get_amount() for edit in edits]
if unit_result == 'BTH':
self.decimal_point = 8
elif unit_result == 'mBTH':
self.decimal_point = 5
elif unit_result == 'bits':
self.decimal_point = 2
else:
raise Exception('Unknown base unit')
self.config.set_key('decimal_point', self.decimal_point, True)
nz.setMaximum(self.decimal_point)
self.history_list.update()
self.request_list.update()
self.address_list.update()
for edit, amount in zip(edits, amounts):
edit.setAmount(amount)
self.update_status()
unit_combo.currentIndexChanged.connect(lambda x: on_unit(x, nz))
gui_widgets.append((unit_label, unit_combo))
block_explorers = sorted(util.block_explorer_info().keys())
msg = _('Choose which online block explorer to use for functions that open a web browser')
block_ex_label = HelpLabel(_('Online Block Explorer') + ':', msg)
block_ex_combo = QComboBox()
block_ex_combo.addItems(block_explorers)
block_ex_combo.setCurrentIndex(block_ex_combo.findText(util.block_explorer(self.config)))
def on_be(x):
be_result = block_explorers[block_ex_combo.currentIndex()]
self.config.set_key('block_explorer', be_result, True)
block_ex_combo.currentIndexChanged.connect(on_be)
gui_widgets.append((block_ex_label, block_ex_combo))
from electrum import qrscanner
system_cameras = qrscanner._find_system_cameras()
qr_combo = QComboBox()
qr_combo.addItem("Default","default")
for camera, device in system_cameras.items():
qr_combo.addItem(camera, device)
#combo.addItem("Manually specify a device", config.get("video_device"))
index = qr_combo.findData(self.config.get("video_device"))
qr_combo.setCurrentIndex(index)
msg = _("Install the zbar package to enable this.")
qr_label = HelpLabel(_('Video Device') + ':', msg)
qr_combo.setEnabled(qrscanner.libzbar is not None)
on_video_device = lambda x: self.config.set_key("video_device", qr_combo.itemData(x), True)
qr_combo.currentIndexChanged.connect(on_video_device)
gui_widgets.append((qr_label, qr_combo))
usechange_cb = QCheckBox(_('Use change addresses'))
usechange_cb.setChecked(self.wallet.use_change)
if not self.config.is_modifiable('use_change'): usechange_cb.setEnabled(False)
def on_usechange(x):
usechange_result = x == Qt.Checked
if self.wallet.use_change != usechange_result:
self.wallet.use_change = usechange_result
self.wallet.storage.put('use_change', self.wallet.use_change)
multiple_cb.setEnabled(self.wallet.use_change)
usechange_cb.stateChanged.connect(on_usechange)
usechange_cb.setToolTip(_('Using change addresses makes it more difficult for other people to track your transactions.'))
tx_widgets.append((usechange_cb, None))
def on_multiple(x):
multiple = x == Qt.Checked
if self.wallet.multiple_change != multiple:
self.wallet.multiple_change = multiple
self.wallet.storage.put('multiple_change', multiple)
multiple_change = self.wallet.multiple_change
multiple_cb = QCheckBox(_('Use multiple change addresses'))
multiple_cb.setEnabled(self.wallet.use_change)
multiple_cb.setToolTip('\n'.join([
_('In some cases, use up to 3 change addresses in order to break '
'up large coin amounts and obfuscate the recipient address.'),
_('This may result in higher transaction fees.')
]))
multiple_cb.setChecked(multiple_change)
multiple_cb.stateChanged.connect(on_multiple)
tx_widgets.append((multiple_cb, None))
def fmt_docs(key, klass):
lines = [ln.lstrip(" ") for ln in klass.__doc__.split("\n")]
return '\n'.join([key, "", " ".join(lines)])
choosers = sorted(coinchooser.COIN_CHOOSERS.keys())
if len(choosers) > 1:
chooser_name = coinchooser.get_name(self.config)
msg = _('Choose coin (UTXO) selection method. The following are available:\n\n')
msg += '\n\n'.join(fmt_docs(*item) for item in coinchooser.COIN_CHOOSERS.items())
chooser_label = HelpLabel(_('Coin selection') + ':', msg)
chooser_combo = QComboBox()
chooser_combo.addItems(choosers)
i = choosers.index(chooser_name) if chooser_name in choosers else 0
chooser_combo.setCurrentIndex(i)
def on_chooser(x):
chooser_name = choosers[chooser_combo.currentIndex()]
self.config.set_key('coin_chooser', chooser_name)
chooser_combo.currentIndexChanged.connect(on_chooser)
tx_widgets.append((chooser_label, chooser_combo))
def on_unconf(x):
self.config.set_key('confirmed_only', bool(x))
conf_only = self.config.get('confirmed_only', False)
unconf_cb = QCheckBox(_('Spend only confirmed coins'))
unconf_cb.setToolTip(_('Spend only confirmed inputs.'))
unconf_cb.setChecked(conf_only)
unconf_cb.stateChanged.connect(on_unconf)
tx_widgets.append((unconf_cb, None))
def on_outrounding(x):
self.config.set_key('coin_chooser_output_rounding', bool(x))
enable_outrounding = self.config.get('coin_chooser_output_rounding', False)
outrounding_cb = QCheckBox(_('Enable output value rounding'))
outrounding_cb.setToolTip(
_('Set the value of the change output so that it has similar precision to the other outputs.') + '\n' +
_('This might improve your privacy somewhat.') + '\n' +
_('If enabled, at most 100 satoshis might be lost due to this, per transaction.'))
outrounding_cb.setChecked(enable_outrounding)
outrounding_cb.stateChanged.connect(on_outrounding)
tx_widgets.append((outrounding_cb, None))
# Fiat Currency
hist_checkbox = QCheckBox()
hist_capgains_checkbox = QCheckBox()
fiat_address_checkbox = QCheckBox()
ccy_combo = QComboBox()
ex_combo = QComboBox()
def update_currencies():
if not self.fx: return
currencies = sorted(self.fx.get_currencies(self.fx.get_history_config()))
ccy_combo.clear()
ccy_combo.addItems([_('None')] + currencies)
if self.fx.is_enabled():
ccy_combo.setCurrentIndex(ccy_combo.findText(self.fx.get_currency()))
def update_history_cb():
if not self.fx: return
hist_checkbox.setChecked(self.fx.get_history_config())
hist_checkbox.setEnabled(self.fx.is_enabled())
def update_fiat_address_cb():
if not self.fx: return
fiat_address_checkbox.setChecked(self.fx.get_fiat_address_config())
def update_history_capgains_cb():
if not self.fx: return
hist_capgains_checkbox.setChecked(self.fx.get_history_capital_gains_config())
hist_capgains_checkbox.setEnabled(hist_checkbox.isChecked())
def update_exchanges():
if not self.fx: return
b = self.fx.is_enabled()
ex_combo.setEnabled(b)
if b:
h = self.fx.get_history_config()
c = self.fx.get_currency()
exchanges = self.fx.get_exchanges_by_ccy(c, h)
else:
exchanges = self.fx.get_exchanges_by_ccy('USD', False)
ex_combo.clear()
ex_combo.addItems(sorted(exchanges))
ex_combo.setCurrentIndex(ex_combo.findText(self.fx.config_exchange()))
def on_currency(hh):
if not self.fx: return
b = bool(ccy_combo.currentIndex())
ccy = str(ccy_combo.currentText()) if b else None
self.fx.set_enabled(b)
if b and ccy != self.fx.ccy:
self.fx.set_currency(ccy)
update_history_cb()
update_exchanges()
self.update_fiat()
def on_exchange(idx):
exchange = str(ex_combo.currentText())
if self.fx and self.fx.is_enabled() and exchange and exchange != self.fx.exchange.name():
self.fx.set_exchange(exchange)
def on_history(checked):
if not self.fx: return
self.fx.set_history_config(checked)
update_exchanges()
self.history_list.refresh_headers()
if self.fx.is_enabled() and checked:
# reset timeout to get historical rates
self.fx.timeout = 0
update_history_capgains_cb()
def on_history_capgains(checked):
if not self.fx: return
self.fx.set_history_capital_gains_config(checked)
self.history_list.refresh_headers()
def on_fiat_address(checked):
if not self.fx: return
self.fx.set_fiat_address_config(checked)
self.address_list.refresh_headers()
self.address_list.update()
update_currencies()
update_history_cb()
update_history_capgains_cb()
update_fiat_address_cb()
update_exchanges()
ccy_combo.currentIndexChanged.connect(on_currency)
hist_checkbox.stateChanged.connect(on_history)
hist_capgains_checkbox.stateChanged.connect(on_history_capgains)
fiat_address_checkbox.stateChanged.connect(on_fiat_address)
ex_combo.currentIndexChanged.connect(on_exchange)
fiat_widgets = []
fiat_widgets.append((QLabel(_('Fiat currency')), ccy_combo))
fiat_widgets.append((QLabel(_('Show history rates')), hist_checkbox))
fiat_widgets.append((QLabel(_('Show capital gains in history')), hist_capgains_checkbox))
fiat_widgets.append((QLabel(_('Show Fiat balance for addresses')), fiat_address_checkbox))
fiat_widgets.append((QLabel(_('Source')), ex_combo))
tabs_info = [
(fee_widgets, _('Fees')),
(tx_widgets, _('Transactions')),
(gui_widgets, _('Appearance')),
(fiat_widgets, _('Fiat')),
(id_widgets, _('Identity')),
]
for widgets, name in tabs_info:
tab = QWidget()
grid = QGridLayout(tab)
grid.setColumnStretch(0,1)
for a,b in widgets:
i = grid.rowCount()
if b:
if a:
grid.addWidget(a, i, 0)
grid.addWidget(b, i, 1)
else:
grid.addWidget(a, i, 0, 1, 2)
tabs.addTab(tab, name)
vbox.addWidget(tabs)
vbox.addStretch(1)
vbox.addLayout(Buttons(CloseButton(d)))
d.setLayout(vbox)
# run the dialog
d.exec_()
if self.fx:
self.fx.timeout = 0
run_hook('close_settings_dialog')
if self.need_restart:
self.show_warning(_('Please restart ElectrumBTH to activate the new GUI settings'), title=_('Success'))
def closeEvent(self, event):
if self.is_hidden() or not self.minimize_to_tray:
# It seems in some rare cases this closeEvent() is called twice
if not self.cleaned_up:
self.cleaned_up = True
self.clean_up()
event.accept()
else:
event.ignore()
self.hide()
self.tray.showMessage(
_("Info"),
_("Wallet was minimized to Tray"),
QSystemTrayIcon.Information,
1500
)
def clean_up(self):
self.wallet.thread.stop()
if self.network:
self.network.unregister_callback(self.on_network)
self.config.set_key("is_maximized", self.isMaximized())
if not self.isMaximized():
g = self.geometry()
self.wallet.storage.put("winpos-qt", [g.left(),g.top(),
g.width(),g.height()])
self.config.set_key("console-history", self.console.history[-50:],
True)
if self.qr_window:
self.qr_window.close()
self.close_wallet()
self.gui_object.close_window(self)
def plugins_dialog(self):
self.pluginsdialog = d = WindowModalDialog(self, _('ElectrumBTH Plugins'))
plugins = self.gui_object.plugins
vbox = QVBoxLayout(d)
# plugins
scroll = QScrollArea()
scroll.setEnabled(True)
scroll.setWidgetResizable(True)
scroll.setMinimumSize(400,250)
vbox.addWidget(scroll)
w = QWidget()
scroll.setWidget(w)
w.setMinimumHeight(plugins.count() * 35)
grid = QGridLayout()
grid.setColumnStretch(0,1)
w.setLayout(grid)
settings_widgets = {}
def enable_settings_widget(p, name, i):
widget = settings_widgets.get(name)
if not widget and p and p.requires_settings():
widget = settings_widgets[name] = p.settings_widget(d)
grid.addWidget(widget, i, 1)
if widget:
widget.setEnabled(bool(p and p.is_enabled()))
def do_toggle(cb, name, i):
p = plugins.toggle(name)
cb.setChecked(bool(p))
enable_settings_widget(p, name, i)
run_hook('init_qt', self.gui_object)
for i, descr in enumerate(plugins.descriptions.values()):
name = descr['__name__']
p = plugins.get(name)
if descr.get('registers_keystore'):
continue
try:
cb = QCheckBox(descr['fullname'])
plugin_is_loaded = p is not None
cb_enabled = (not plugin_is_loaded and plugins.is_available(name, self.wallet)
or plugin_is_loaded and p.can_user_disable())
cb.setEnabled(cb_enabled)
cb.setChecked(plugin_is_loaded and p.is_enabled())
grid.addWidget(cb, i, 0)
enable_settings_widget(p, name, i)
cb.clicked.connect(partial(do_toggle, cb, name, i))
msg = descr['description']
if descr.get('requires'):
msg += '\n\n' + _('Requires') + ':\n' + '\n'.join(map(lambda x: x[1], descr.get('requires')))
grid.addWidget(HelpButton(msg), i, 2)
except Exception:
self.print_msg("error: cannot display plugin", name)
traceback.print_exc(file=sys.stdout)
grid.setRowStretch(len(plugins.descriptions.values()), 1)
vbox.addLayout(Buttons(CloseButton(d)))
d.exec_()
def cpfp(self, parent_tx, new_tx):
total_size = parent_tx.estimated_size() + new_tx.estimated_size()
d = WindowModalDialog(self, _('Child Pays for Parent'))
vbox = QVBoxLayout(d)
msg = (
"A CPFP is a transaction that sends an unconfirmed output back to "
"yourself, with a high fee. The goal is to have miners confirm "
"the parent transaction in order to get the fee attached to the "
"child transaction.")
vbox.addWidget(WWLabel(_(msg)))
msg2 = ("The proposed fee is computed using your "
"fee/kB settings, applied to the total size of both child and "
"parent transactions. After you broadcast a CPFP transaction, "
"it is normal to see a new unconfirmed transaction in your history.")
vbox.addWidget(WWLabel(_(msg2)))
grid = QGridLayout()
grid.addWidget(QLabel(_('Total size') + ':'), 0, 0)
grid.addWidget(QLabel('%d bytes'% total_size), 0, 1)
max_fee = new_tx.output_value()
grid.addWidget(QLabel(_('Input amount') + ':'), 1, 0)
grid.addWidget(QLabel(self.format_amount(max_fee) + ' ' + self.base_unit()), 1, 1)
output_amount = QLabel('')
grid.addWidget(QLabel(_('Output amount') + ':'), 2, 0)
grid.addWidget(output_amount, 2, 1)
fee_e = BTHAmountEdit(self.get_decimal_point)
# FIXME with dyn fees, without estimates, there are all kinds of crashes here
def f(x):
a = max_fee - fee_e.get_amount()
output_amount.setText((self.format_amount(a) + ' ' + self.base_unit()) if a else '')
fee_e.textChanged.connect(f)
fee = self.config.fee_per_kb() * total_size / 1000
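# fee_per_kb() is the configured rate in satoshis per 1000 bytes, so dividing by 1000
# scales it to the combined parent+child size. Illustrative numbers only: at 5000 sat/kB
# and total_size = 400 bytes, the suggested fee is 5000 * 400 / 1000 = 2000 satoshis.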
fee_e.setAmount(fee)
grid.addWidget(QLabel(_('Fee') + ':'), 3, 0)
grid.addWidget(fee_e, 3, 1)
def on_rate(dyn, pos, fee_rate):
fee = fee_rate * total_size / 1000
fee = min(max_fee, fee)
fee_e.setAmount(fee)
fee_slider = FeeSlider(self, self.config, on_rate)
fee_slider.update()
grid.addWidget(fee_slider, 4, 1)
vbox.addLayout(grid)
vbox.addLayout(Buttons(CancelButton(d), OkButton(d)))
if not d.exec_():
return
fee = fee_e.get_amount()
if fee > max_fee:
self.show_error(_('Max fee exceeded'))
return
new_tx = self.wallet.cpfp(parent_tx, fee)
new_tx.set_rbf(True)
self.show_transaction(new_tx)
def bump_fee_dialog(self, tx):
is_relevant, is_mine, v, fee = self.wallet.get_wallet_delta(tx)
tx_label = self.wallet.get_label(tx.txid())
tx_size = tx.estimated_size()
d = WindowModalDialog(self, _('Bump Fee'))
vbox = QVBoxLayout(d)
vbox.addWidget(QLabel(_('Current fee') + ': %s'% self.format_amount(fee) + ' ' + self.base_unit()))
vbox.addWidget(QLabel(_('New fee') + ':'))
fee_e = BTHAmountEdit(self.get_decimal_point)
fee_e.setAmount(fee * 1.5)
vbox.addWidget(fee_e)
def on_rate(dyn, pos, fee_rate):
fee = fee_rate * tx_size / 1000
fee_e.setAmount(fee)
fee_slider = FeeSlider(self, self.config, on_rate)
vbox.addWidget(fee_slider)
cb = QCheckBox(_('Final'))
vbox.addWidget(cb)
vbox.addLayout(Buttons(CancelButton(d), OkButton(d)))
if not d.exec_():
return
is_final = cb.isChecked()
new_fee = fee_e.get_amount()
delta = new_fee - fee
if delta < 0:
self.show_error("fee too low")
return
try:
new_tx = self.wallet.bump_fee(tx, delta)
except BaseException as e:
self.show_error(str(e))
return
if is_final:
new_tx.set_rbf(False)
self.show_transaction(new_tx, tx_label)
def save_transaction_into_wallet(self, tx):
try:
if not self.wallet.add_transaction(tx.txid(), tx):
self.show_error(_("Transaction could not be saved.") + "\n" +
_("It conflicts with current history."))
return False
except AddTransactionException as e:
self.show_error(e)
return False
else:
self.wallet.save_transactions(write=True)
# need to update at least: history_list, utxo_list, address_list
self.need_update.set()
self.msg_box(QPixmap(":icons/offline_tx.png"), None, _('Success'), _("Transaction added to wallet history"))
return True
|
py | b4141baaf88205179e3becf9ac4acf36829f7340 | #!/usr/bin/env python3
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# Author: Jeremy Maurer, Raymond Hogenson, David Bekaert & Yang Lei
# Copyright 2019, by the California Institute of Technology. ALL RIGHTS
# RESERVED. United States Government Sponsorship acknowledged.
#
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
import itertools
import multiprocessing as mp
import time
import h5py
import numpy as np
from pyproj import CRS, Transformer
from scipy.interpolate import RegularGridInterpolator
from RAiDER.constants import _STEP
from RAiDER.interpolator import RegularGridInterpolator as Interpolator
from RAiDER.logger import *
from RAiDER.makePoints import makePoints1D
def calculate_rays(pnts_file, stepSize=_STEP):
'''
From a set of lats/lons/hgts, compute ray paths from the ground to the
top of the atmosphere, using either a set of look vectors or the zenith
'''
logger.debug('calculate_rays: Starting look vector calculation')
logger.debug('The integration stepsize is %f m', stepSize)
# get the lengths of each ray for doing the interpolation
getUnitLVs(pnts_file)
# This projects the ground pixels into earth-centered, earth-fixed coordinate
# system and sorts by position
lla2ecef(pnts_file)
def getUnitLVs(pnts_file):
'''
Get a set of look vectors normalized by their lengths
'''
get_lengths(pnts_file)
with h5py.File(pnts_file, 'r+') as f:
slv = f['LOS'][()] / f['Rays_len'][()][..., np.newaxis]
f['Rays_SLV'][...] = slv
def get_lengths(pnts_file):
'''
Returns the lengths of a vector or set of vectors, fast.
Inputs:
looks_vecs - an Nx3 numpy array containing look vectors with absolute
lengths; i.e., the absolute position of the top of the
atmosphere.
Outputs:
lengths - an Nx1 numpy array containing the absolute distance in
meters of the top of the atmosphere from the ground pnt.
'''
with h5py.File(pnts_file, 'r+') as f:
lengths = np.linalg.norm(f['LOS'][()], axis=-1)
try:
lengths[~np.isfinite(lengths)] = 0
except TypeError:
if ~np.isfinite(lengths):
lengths = 0
f['Rays_len'][:] = lengths.astype(np.float64)
f['Rays_len'].attrs['MaxLen'] = np.nanmax(lengths)
def lla2ecef(pnts_file):
'''
reproject a set of lat/lon/hgts to earth-centered, earth-fixed coordinate system
'''
t = Transformer.from_crs(4326, 4978, always_xy=True) # converts from WGS84 geodetic to WGS84 geocentric
with h5py.File(pnts_file, 'r+') as f:
ndv = f.attrs['NoDataValue']
lon = f['lon'][()]
lat = f['lat'][()]
hgt = f['hgt'][()]
lon[lon == ndv] = np.nan
lat[lat == ndv] = np.nan
hgt[hgt == ndv] = np.nan
sp = np.moveaxis(np.array(t.transform(lon, lat, hgt)), 0, -1)
f['Rays_SP'][...] = sp.astype(np.float64) # ensure double is maintained
def get_delays(stepSize, pnts_file, wm_file, interpType='3D',
delayType="Zenith", cpu_num=0):
'''
Create the integration points for each ray path.
'''
t0 = time.time()
# Get the weather model data
with h5py.File(wm_file, 'r') as f:
xs_wm = f['x'][()].copy()
ys_wm = f['y'][()].copy()
zs_wm = f['z'][()].copy()
wet = f['wet'][()].copy()
hydro = f['hydro'][()].copy()
ifWet = Interpolator((ys_wm, xs_wm, zs_wm), wet, fill_value=np.nan)
ifHydro = Interpolator((ys_wm, xs_wm, zs_wm), hydro, fill_value=np.nan)
with h5py.File(pnts_file, 'r') as f:
Nrays = f.attrs['NumRays']
chunkSize = f.attrs['ChunkSize']
in_shape = f['lon'].attrs['Shape']
arrSize = f['lon'].shape
max_len = np.nanmax(f['Rays_len'])
CHUNKS = chunk(chunkSize, in_shape)
Nchunks = len(CHUNKS)
with h5py.File(pnts_file, 'r') as f:
chunk_inputs = [(kk, CHUNKS[kk], np.array(f['Rays_SP']), np.array(f['Rays_SLV']),
chunkSize, stepSize, ifWet, ifHydro, max_len, wm_file) for kk in range(Nchunks)]
if Nchunks == 1:
delays = process_chunk(*chunk_inputs[0])
else:
with mp.Pool() as pool:
individual_results = pool.starmap(process_chunk, chunk_inputs)
try:
delays = np.concatenate(individual_results)
except ValueError:
delays = np.concatenate(individual_results, axis=-1)
wet_delay = delays[0, ...].reshape(in_shape)
hydro_delay = delays[1, ...].reshape(in_shape)
time_elapse = (time.time() - t0)
with open('get_delays_time_elapse.txt', 'w') as f:
f.write('{}'.format(time_elapse))
time_elapse_hr = int(np.floor(time_elapse / 3600.0))
time_elapse_min = int(np.floor((time_elapse - time_elapse_hr * 3600.0) / 60.0))
time_elapse_sec = (time_elapse - time_elapse_hr * 3600.0 - time_elapse_min * 60.0)
logger.debug(
"Delay estimation cost %d hour(s) %d minute(s) %d second(s) using %d cpu threads",
time_elapse_hr, time_elapse_min, time_elapse_sec, cpu_num
)
return wet_delay, hydro_delay
def make_interpolator(xs, ys, zs, data):
'''
Function to create and return an Interpolator object
'''
return RegularGridInterpolator(
(ys.ravel(), xs.ravel(), zs.ravel()),
data,
bounds_error=False,
fill_value=np.nan
)
def chunk(chunkSize, in_shape):
'''
Create a set of indices to use as chunks
'''
startInds = makeChunkStartInds(chunkSize, in_shape)
chunkInds = makeChunksFromInds(startInds, chunkSize, in_shape)
return chunkInds
def makeChunkStartInds(chunkSize, in_shape):
'''
Create a list of start indices for chunking a numpy D-dimensional array.
Inputs:
chunkSize - length-D tuple containing chunk sizes
in_shape - length-D tuple containing the shape of the array to be chunked
Outputs
chunkInds - a list of length-D tuples, where each tuple is the starting
multi-index of each chunk
Example:
makeChunkStartInds((2,2,16), (4,8,16))
Output: [(0, 0, 0),
(0, 2, 0),
(0, 4, 0),
(0, 6, 0),
(2, 0, 0),
(2, 2, 0),
(2, 4, 0),
(2, 6, 0)]
'''
if len(in_shape) == 1:
chunkInds = [(i,) for i in range(0, in_shape[0], chunkSize[0])]
elif len(in_shape) == 2:
chunkInds = [(i, j) for i, j in itertools.product(range(0, in_shape[0], chunkSize[0]),
range(0, in_shape[1], chunkSize[1]))]
elif len(in_shape) == 3:
chunkInds = [(i, j, k) for i, j, k in itertools.product(range(0, in_shape[0], chunkSize[0]),
range(0, in_shape[1], chunkSize[1]),
range(0, in_shape[2], chunkSize[2]))]
else:
raise NotImplementedError('makeChunkStartInds: ndim > 3 not supported')
return chunkInds
def makeChunksFromInds(startInd, chunkSize, in_shape):
'''
From a length-N list of tuples containing starting indices,
create a list of indices into chunks of a numpy D-dimensional array.
Inputs:
startInd - A length-N list of D-dimensional tuples containing the
starting indices of a set of chunks
chunkSize - A D-dimensional tuple containing chunk size in each dimension
in_shape - A D-dimensional tuple containing the size of each dimension
Outputs:
chunks - A length-N list of length-D lists, where each element of the
length-D list is a numpy array of indices
Example:
makeChunksFromInds([(0, 0), (0, 2), (2, 0), (2, 2)],(4,4),(2,2))
Output:
[[np.array([0, 0, 1, 1]), np.array([0, 1, 0, 1])],
[np.array([0, 0, 1, 1]), np.array([2, 3, 2, 3])],
[np.array([2, 2, 3, 3]), np.array([0, 1, 0, 1])],
[np.array([2, 2, 3, 3]), np.array([2, 3, 2, 3])]]
'''
indices = []
for ci in startInd:
index = []
for si, k, dim in zip(ci, chunkSize, range(len(chunkSize))):
if si + k > in_shape[dim]:
dend = in_shape[dim]
else:
dend = si + k
index.append(np.array(range(si, dend)))
indices.append(index)
# Now create the index mesh (for Ndim > 1)
chunks = []
if len(in_shape) > 1:
for index in indices:
chunks.append([np.array(g) for g in zip(*list(itertools.product(*index)))])
else:
chunks = indices
return chunks
def process_chunk(k, chunkInds, SP, SLV, chunkSize, stepSize, ifWet, ifHydro, max_len, wm_file):
"""
Perform the interpolation and integration over a single chunk.
"""
# Transformer from ECEF to weather model
p1 = CRS.from_epsg(4978)
proj_wm = getProjFromWMFile(wm_file)
t = Transformer.from_proj(p1, proj_wm, always_xy=True)
# datatype must be specific for the cython makePoints* function
_DTYPE = np.float64
# H5PY does not support fancy indexing with tuples, hence this if/else check
if len(chunkSize) == 1:
row = chunkInds[0]
ray = makePoints1D(max_len, SP[row, :].astype(_DTYPE), SLV[row, :].astype(_DTYPE), stepSize)
elif len(chunkSize) == 2:
row, col = chunkInds
ray = makePoints1D(max_len, SP[row, col, :].astype(_DTYPE), SLV[row, col, :].astype(_DTYPE), stepSize)
elif len(chunkSize) == 3:
row, col, zind = chunkInds
ray = makePoints1D(max_len, SP[row, col, zind, :].astype(_DTYPE), SLV[row, col, zind, :].astype(_DTYPE), stepSize)
else:
raise RuntimeError('Data in more than 4 dimensions is not supported')
ray_x, ray_y, ray_z = t.transform(ray[..., 0, :], ray[..., 1, :], ray[..., 2, :])
delay_wet = interpolate2(ifWet, ray_x, ray_y, ray_z)
delay_hydro = interpolate2(ifHydro, ray_x, ray_y, ray_z)
int_delays = _integrateLOS(stepSize, delay_wet, delay_hydro)
return int_delays
def getProjFromWMFile(wm_file):
'''
Returns the projection of an HDF5 file
'''
with h5py.File(wm_file, 'r') as f:
wm_proj = CRS.from_json(f['Projection'][()])
return wm_proj
def interpolate2(fun, x, y, z):
'''
helper function to make the interpolation step cleaner
'''
in_shape = x.shape
out = fun((y.ravel(), x.ravel(), z.ravel())) # note that this re-ordering is on purpose to match the weather model
outData = out.reshape(in_shape)
return outData
def _integrateLOS(stepSize, wet_pw, hydro_pw, Npts=None):
delays = []
for d in (wet_pw, hydro_pw):
if d.ndim == 1:
delays.append(np.array([int_fcn(d, stepSize)]))
else:
delays.append(_integrate_delays(stepSize, d, Npts))
return np.stack(delays, axis=0)
def _integrate_delays(stepSize, refr, Npts=None):
'''
This function gets the actual delays by integrating the refractivity in
each node. Refractivity is given in the 'refr' variable.
'''
delays = []
if Npts is not None:
for n, ray in zip(Npts, refr):
delays.append(int_fcn(ray, stepSize, n))
else:
for ray in refr:
delays.append(int_fcn(ray, stepSize))
return np.array(delays)
def int_fcn(y, dx, N=None):
return 1e-6 * dx * np.nansum(y[:N])
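# Minimal usage sketch (illustrative only, not part of RAiDER): int_fcn is a
# rectangle-rule integral of refractivity (in ppm) sampled every dx metres along a ray,
# returning the excess path delay in metres. The sample values below are made up.
#
# refr = np.array([300.0, 280.0, 250.0, 200.0])  # hypothetical refractivity samples (ppm)
# dx = 100.0                                      # integration step in metres
# int_fcn(refr, dx)                               # 1e-6 * 100.0 * 1030.0 = 0.103 m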
|
py | b4141c076063b2a2a4f481de6d279c4a3ab8843e | #!/usr/bin/python -u
#
# Setup script for libxml2 and libxslt if found
#
import sys, os
from distutils.core import setup, Extension
# Below ROOT, we expect to find include, include/libxml2, lib and bin.
# On *nix, it is not needed (but should not harm),
# on Windows, it is set by configure.js.
ROOT = r'/usr/local'
# Thread-enabled libxml2
with_threads = 0
# If this flag is set (windows only),
# a private copy of the dlls are included in the package.
# If this flag is not set, the libxml2 and libxslt
# dlls must be found somewhere in the PATH at runtime.
WITHDLLS = 1 and sys.platform.startswith('win')
def missing(file):
if os.access(file, os.R_OK) == 0:
return 1
return 0
try:
HOME = os.environ['HOME']
except:
HOME="C:"
if WITHDLLS:
# libxml dlls (expected in ROOT/bin)
dlls = [ 'iconv.dll','libxml2.dll','libxslt.dll','libexslt.dll' ]
dlls = map(lambda dll: os.path.join(ROOT,'bin',dll),dlls)
# create __init__.py for the libxmlmods package
if not os.path.exists("libxmlmods"):
os.mkdir("libxmlmods")
open("libxmlmods/__init__.py","w").close()
def altImport(s):
s = s.replace("import libxml2mod","from libxmlmods import libxml2mod")
s = s.replace("import libxsltmod","from libxmlmods import libxsltmod")
return s
if sys.platform.startswith('win'):
libraryPrefix = 'lib'
platformLibs = []
else:
libraryPrefix = ''
platformLibs = ["m","z"]
# those are examined to find
# - libxml2/libxml/tree.h
# - iconv.h
# - libxslt/xsltconfig.h
includes_dir = [
"/usr/include",
"/usr/local/include",
"/opt/include",
os.path.join(ROOT,'include'),
HOME
];
xml_includes=""
for dir in includes_dir:
if not missing(dir + "/libxml2/libxml/tree.h"):
xml_includes=dir + "/libxml2"
break;
if xml_includes == "":
print "failed to find headers for libxml2: update includes_dir"
sys.exit(1)
iconv_includes=""
for dir in includes_dir:
if not missing(dir + "/iconv.h"):
iconv_includes=dir
break;
if iconv_includes == "":
print "failed to find headers for libiconv: update includes_dir"
sys.exit(1)
# those are added in the linker search path for libraries
libdirs = [
os.path.join(ROOT,'lib'),
]
xml_files = ["libxml2-api.xml", "libxml2-python-api.xml",
"libxml.c", "libxml.py", "libxml_wrap.h", "types.c",
"xmlgenerator.py", "README", "TODO", "drv_libxml2.py"]
xslt_files = ["libxslt-api.xml", "libxslt-python-api.xml",
"libxslt.c", "libxsl.py", "libxslt_wrap.h",
"xsltgenerator.py"]
if missing("libxml2-py.c") or missing("libxml2.py"):
try:
try:
import xmlgenerator
except:
import generator
except:
print "failed to find and generate stubs for libxml2, aborting ..."
print sys.exc_type, sys.exc_value
sys.exit(1)
head = open("libxml.py", "r")
generated = open("libxml2class.py", "r")
result = open("libxml2.py", "w")
for line in head.readlines():
if WITHDLLS:
result.write(altImport(line))
else:
result.write(line)
for line in generated.readlines():
result.write(line)
head.close()
generated.close()
result.close()
with_xslt=0
if missing("libxslt-py.c") or missing("libxslt.py"):
if missing("xsltgenerator.py") or missing("libxslt-api.xml"):
print "libxslt stub generator not found, libxslt not built"
else:
try:
import xsltgenerator
except:
print "failed to generate stubs for libxslt, aborting ..."
print sys.exc_type, sys.exc_value
else:
head = open("libxsl.py", "r")
generated = open("libxsltclass.py", "r")
result = open("libxslt.py", "w")
for line in head.readlines():
if WITHDLLS:
result.write(altImport(line))
else:
result.write(line)
for line in generated.readlines():
result.write(line)
head.close()
generated.close()
result.close()
with_xslt=1
else:
with_xslt=1
if with_xslt == 1:
xslt_includes=""
for dir in includes_dir:
if not missing(dir + "/libxslt/xsltconfig.h"):
xslt_includes=dir + "/libxslt"
break;
if xslt_includes == "":
print "failed to find headers for libxslt: update includes_dir"
with_xslt = 0
descr = "libxml2 package"
modules = [ 'libxml2', 'drv_libxml2' ]
if WITHDLLS:
modules.append('libxmlmods.__init__')
c_files = ['libxml2-py.c', 'libxml.c', 'types.c' ]
includes= [xml_includes, iconv_includes]
libs = [libraryPrefix + "xml2"] + platformLibs
macros = []
if with_threads:
macros.append(('_REENTRANT','1'))
if with_xslt == 1:
descr = "libxml2 and libxslt package"
if not sys.platform.startswith('win'):
#
# We are gonna build 2 identical shared libs with merge initializing
# both libxml2mod and libxsltmod
#
c_files = c_files + ['libxslt-py.c', 'libxslt.c']
xslt_c_files = c_files
macros.append(('MERGED_MODULES', '1'))
else:
#
# On windows the MERGED_MODULE option is not needed
# (and does not work)
#
xslt_c_files = ['libxslt-py.c', 'libxslt.c', 'types.c']
libs.insert(0, libraryPrefix + 'exslt')
libs.insert(0, libraryPrefix + 'xslt')
includes.append(xslt_includes)
modules.append('libxslt')
extens=[Extension('libxml2mod', c_files, include_dirs=includes,
library_dirs=libdirs,
libraries=libs, define_macros=macros)]
if with_xslt == 1:
extens.append(Extension('libxsltmod', xslt_c_files, include_dirs=includes,
library_dirs=libdirs,
libraries=libs, define_macros=macros))
if missing("MANIFEST"):
manifest = open("MANIFEST", "w")
manifest.write("setup.py\n")
for file in xml_files:
manifest.write(file + "\n")
if with_xslt == 1:
for file in xslt_files:
manifest.write(file + "\n")
manifest.close()
if WITHDLLS:
ext_package = "libxmlmods"
if sys.version >= "2.2":
base = "lib/site-packages/"
else:
base = ""
data_files = [(base+"libxmlmods",dlls)]
else:
ext_package = None
data_files = []
setup (name = "libxml2-python",
# On *nix, the version number is created from setup.py.in
# On windows, it is set by configure.js
version = "2.6.26",
description = descr,
author = "Daniel Veillard",
author_email = "[email protected]",
url = "http://xmlsoft.org/python.html",
licence="MIT Licence",
py_modules=modules,
ext_modules=extens,
ext_package=ext_package,
data_files=data_files,
)
sys.exit(0)
|
py | b4141c0c1b3fbdb9bea296d0d94406336c86d1e7 | # Write the benchmarking functions here.
# See "Writing benchmarks" in the asv docs for more information.
import subprocess
import os
from pathlib import Path
import numpy as np
import tempfile
base_path = Path("~/regression/dgl/")
class GCNBenchmark:
params = [['pytorch'], ['cora', 'pubmed'], ['0', '-1']]
param_names = ['backend', 'dataset', 'gpu_id']
timeout = 120
# def setup_cache(self):
# self.tmp_dir = Path(tempfile.mkdtemp())
def setup(self, backend, dataset, gpu_id):
log_filename = Path("gcn_{}_{}_{}.log".format(backend, dataset, gpu_id))
if log_filename.exists():
return
gcn_path = base_path / "examples/{}/gcn/train.py".format(backend)
bashCommand = "/opt/conda/envs/{}-ci/bin/python {} --dataset {} --gpu {} --n-epochs 50".format(
backend, gcn_path.expanduser(), dataset, gpu_id)
process = subprocess.Popen(bashCommand.split(), stdout=subprocess.PIPE,env=dict(os.environ, DGLBACKEND=backend))
output, error = process.communicate()
print(str(error))
log_filename.write_text(str(output))
def track_gcn_time(self, backend, dataset, gpu_id):
log_filename = Path("{}_{}_{}.log".format(backend, dataset, gpu_id))
lines = log_filename.read_text().split("\\n")
time_list = []
for line in lines:
# print(line)
if 'Time' in line:
time_str = line.strip().split('|')[1]
time = float(time_str.split()[-1])
time_list.append(time)
return np.array(time_list)[-10:].mean()
def track_gcn_accuracy(self, backend, dataset, gpu_id):
log_filename = Path("{}_{}_{}.log".format(backend, dataset, gpu_id))
lines = log_filename.read_text().split("\\n")
test_acc = -1
for line in lines:
if 'Test accuracy' in line:
test_acc = float(line.split()[-1][:-1])
print(test_acc)
return test_acc
GCNBenchmark.track_gcn_time.unit = 's'
GCNBenchmark.track_gcn_accuracy.unit = '%'
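# Note: the track_* methods follow airspeed velocity (asv) conventions: asv runs setup()
# for each parameter combination and records whatever each track_* method returns as the
# benchmark value, displayed with the `unit` attributes set above.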
|
py | b4141e481a9704e16ec30f5db495913e9ef9b497 | # -*- coding: utf-8 -*-
import re
import string
from typing import Callable, Optional, Pattern, List, Tuple
from decimal import Decimal, InvalidOperation
import attr
from ._currencies import (CURRENCY_CODES, CURRENCY_NATIONAL_SYMBOLS,
CURRENCY_SYMBOLS)
@attr.s(auto_attribs=True)
class Price:
amount: Optional[Decimal] # price numeric value, as Decimal
currency: Optional[str] # currency symbol (as appeared in text)
# price value, as a raw string
amount_text: Optional[str] = attr.ib(repr=False)
@property
def amount_float(self) -> Optional[float]:
""" price numeric value, as float """
if self.amount is not None:
return float(self.amount)
@classmethod
def fromstring(cls, price: Optional[str],
currency_hint: Optional[str] = None) -> 'Price':
"""
Given price and currency text extracted from HTML elements, return
``Price`` instance, which provides a clean currency symbol and
price amount as a Decimal number.
``currency_hint`` is optional; you can pass value of some element
which may contain currency, as a hint. If currency is present in
``price`` string, it could be **preferred** over a value extracted
from ``currency_hint`` string.
"""
amount_text = extract_price_text(price) if price is not None else None
amount_num = parse_number(amount_text) if amount_text is not None else None
currency = extract_currency_symbol(price, currency_hint)
if currency is not None:
currency = currency.strip()
return Price(
amount=amount_num,
currency=currency,
amount_text=amount_text,
)
parse_price = Price.fromstring
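# Typical usage of the class above (doctest-style, for illustration; the values shown
# are what the parsing rules in this module should produce):
#
# >>> Price.fromstring("22,90 €")
# Price(amount=Decimal('22.90'), currency='€')
# >>> parse_price("$1,199.00").amount_float
# 1199.0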
def or_regex(symbols: List[str]) -> Pattern:
""" Return a regex which matches any of ``symbols`` """
return re.compile('|'.join(re.escape(s) for s in symbols))
# If one of these symbols is found either in price or in currency,
# it is considered currency symbol, and returned as a currency, regardless
# of its position in text.
SAFE_CURRENCY_SYMBOLS = [
# Variants of $, etc. They need to be before $.
'Bds$', 'CUC$', 'MOP$',
'AR$', 'AU$', 'BN$', 'BZ$', 'CA$', 'CL$', 'CO$', 'CV$', 'HK$', 'MX$',
'NT$', 'NZ$', 'TT$', 'RD$', 'WS$', 'US$',
'$U', 'C$', 'J$', 'N$', 'R$', 'S$', 'T$', 'Z$', 'A$',
'SY£', 'LB£', 'CN¥', 'GH₵',
# unique currency symbols
'$', '€', '£', 'zł', 'Zł', 'Kč', '₽', '¥', '¥',
'฿', 'դր.', 'դր', '₦', '₴', '₱', '৳', '₭', '₪', '﷼', '៛', '₩', '₫', '₡',
'টকা', 'ƒ', '₲', '؋', '₮', 'नेरू', '₨',
'₶', '₾', '֏', 'ރ', '৲', '૱', '௹', '₠', '₢', '₣', '₤', '₧', '₯',
'₰', '₳', '₷', '₸', '₹', '₺', '₼', '₾', '₿', 'ℳ',
'ر.ق.\u200f', 'د.ك.\u200f', 'د.ع.\u200f', 'ر.ع.\u200f', 'ر.ي.\u200f',
'ر.س.\u200f', 'د.ج.\u200f', 'د.م.\u200f', 'د.إ.\u200f', 'د.ت.\u200f',
'د.ل.\u200f', 'ل.س.\u200f', 'د.ب.\u200f', 'د.أ.\u200f', 'ج.م.\u200f',
'ل.ل.\u200f',
' تومان', 'تومان',
# other common symbols, which we consider unambiguous
'EUR', 'euro', 'eur', 'CHF', 'DKK', 'Rp', 'lei',
'руб.', 'руб', 'грн.', 'грн', 'дин.', 'Dinara', 'динар', 'лв.', 'лв',
'р.', 'тңг', 'тңг.', 'ман.',
]
# "D" in some abbreviations means "dollar", and so currency
# can be written as SGD$123 or NZD $123. Currency code should take priority
# over $ symbol in this case.
DOLLAR_CODES = [k for k in CURRENCY_CODES if k.endswith('D')]
DOLLAR_REGEXES = [
r"""
\b{} # code like NZD
(?:[^\w]|$) # not a letter
""".format(k) for k in DOLLAR_CODES
]
# Other common currency symbols: 3-letter codes, less safe abbreviations
OTHER_CURRENCY_SYMBOLS_SET = (
set(
CURRENCY_CODES +
CURRENCY_SYMBOLS +
CURRENCY_NATIONAL_SYMBOLS +
# even if they appear in text, currency is likely to be rouble
['р', 'Р']
)
- set(SAFE_CURRENCY_SYMBOLS) # already handled
- {'-', 'XXX'} # placeholder values
- set(string.ascii_uppercase) # very unreliable on their own
)
OTHER_CURRENCY_SYMBOLS = sorted(OTHER_CURRENCY_SYMBOLS_SET,
key=len, reverse=True)
_search_dollar_code = re.compile("|".join(DOLLAR_REGEXES), re.VERBOSE).search
_search_safe_currency = or_regex(SAFE_CURRENCY_SYMBOLS).search
_search_unsafe_currency = or_regex(OTHER_CURRENCY_SYMBOLS).search
def extract_currency_symbol(price: Optional[str],
currency_hint: Optional[str]) -> Optional[str]:
"""
Guess currency symbol from extracted price and currency strings.
Return None if a symbol is not found.
"""
methods: List[Tuple[Callable, Optional[str]]] = [
(_search_safe_currency, price),
(_search_safe_currency, currency_hint),
(_search_unsafe_currency, price),
(_search_unsafe_currency, currency_hint),
]
if currency_hint and '$' in currency_hint:
methods.insert(0, (_search_dollar_code, currency_hint))
if price and '$' in price:
methods.insert(0, (_search_dollar_code, price))
for meth, attr in methods:
m = meth(attr) if attr else None
if m:
return m.group(0)
return None
def extract_price_text(price: str) -> Optional[str]:
"""
Extract text of a price from a string which contains price and
maybe some other text. If multiple price-looking substrings are present,
the first is returned (FIXME: it is better to return a number
which is near a currency symbol).
>>> extract_price_text("price: $12.99")
'12.99'
>>> extract_price_text("Free")
'0'
>>> extract_price_text("Foo")
>>> extract_price_text("1,235 USD")
'1,235'
In addition to numbers, it has a limited support for a case where
currency symbol (currently only euro) is a decimal separator:
>>> extract_price_text("99 €, 79 €")
'99'
>>> extract_price_text("99 € 79 €")
'99'
>>> extract_price_text("35€ 99")
'35€99'
>>> extract_price_text("35€ 999")
'35'
>>> extract_price_text("1,235€ 99")
'1,235€99'
>>> extract_price_text("50% OFF")
>>> extract_price_text("50%")
>>> extract_price_text("50")
'50'
"""
if price.count('€') == 1:
m = re.search(r"""
[\d\s.,]*?\d # number, probably with thousand separators
\s*?€\s*? # euro, probably separated by whitespace
\d\d
(?:$|[^\d]) # something which is not a digit
""", price, re.VERBOSE)
if m:
return m.group(0).replace(' ', '')
m = re.search(r"""
(\d[\d\s.,]*) # number, probably with thousand separators
\s*? # skip whitespace
(?:[^%\d]|$) # capture next symbol - it shouldn't be %
""", price, re.VERBOSE)
if m:
return m.group(1).strip(',.').strip()
if 'free' in price.lower():
return '0'
return None
# NOTE: Keep supported separators in sync with parse_number()
_search_decimal_sep = re.compile(r"""
\d # at least one digit (there can be more before it)
([.,€]) # decimal separator
(?: # 1,2 or 4+ digits. 3 digits is likely to be a thousand separator.
\d{1,2}?|
\d{4}\d*?
)
$
""", re.VERBOSE).search
def get_decimal_separator(price: str) -> Optional[str]:
""" Return decimal separator symbol or None if there
is no decimal separator.
>>> get_decimal_separator("1000")
>>> get_decimal_separator("12.99")
'.'
>>> get_decimal_separator("12,99")
','
>>> get_decimal_separator("12.999")
>>> get_decimal_separator("3,0000")
','
>>> get_decimal_separator("1,235€99")
'€'
"""
m = _search_decimal_sep(price)
if m:
return m.group(1)
def parse_number(num: str) -> Optional[Decimal]:
""" Parse a string with a number to a Decimal, guessing its format:
decimal separator, thousand separator. Return None if parsing fails.
>>> parse_number("1,234")
Decimal('1234')
>>> parse_number("12,34")
Decimal('12.34')
>>> parse_number("12,345")
Decimal('12345')
>>> parse_number("1,1")
Decimal('1.1')
>>> parse_number("1.1")
Decimal('1.1')
>>> parse_number("1234")
Decimal('1234')
>>> parse_number("12€34")
Decimal('12.34')
>>> parse_number("12€ 34")
Decimal('12.34')
>>> parse_number("1 234.99")
Decimal('1234.99')
>>> parse_number("1,235€99")
Decimal('1235.99')
>>> parse_number("1 235€99")
Decimal('1235.99')
>>> parse_number("1.235€99")
Decimal('1235.99')
>>> parse_number("")
>>> parse_number("foo")
"""
if not num:
return None
num = num.strip().replace(' ', '')
decimal_separator = get_decimal_separator(num)
# NOTE: Keep supported separators in sync with _search_decimal_sep
if decimal_separator is None:
num = num.replace('.', '').replace(',', '')
elif decimal_separator == '.':
num = num.replace(',', '')
elif decimal_separator == ',':
num = num.replace('.', '').replace(',', '.')
else:
assert decimal_separator == '€'
num = num.replace('.', '').replace(',', '').replace('€', '.')
try:
return Decimal(num)
except InvalidOperation:
return None
|
py | b4141e7ae7f1caddbb4c85537b3cfcd466f7129e | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time : 5/15/20 4:49 PM
# @File : grover.py
# qubit number=4
# total number=14
import cirq
import cirq.google as cg
from typing import Optional
import sys
from math import log2
import numpy as np
#thatsNoCode
from cirq.contrib.svg import SVGCircuit
# Symbols for the rotation angles in the QAOA circuit.
def make_circuit(n: int, input_qubit):
c = cirq.Circuit() # circuit begin
c.append(cirq.H.on(input_qubit[0])) # number=1
c.append(cirq.H.on(input_qubit[1])) # number=2
c.append(cirq.H.on(input_qubit[2])) # number=3
c.append(cirq.H.on(input_qubit[3])) # number=4
c.append(cirq.Y.on(input_qubit[3])) # number=5
c.append(cirq.SWAP.on(input_qubit[1],input_qubit[0])) # number=6
c.append(cirq.SWAP.on(input_qubit[1],input_qubit[0])) # number=7
c.append(cirq.CNOT.on(input_qubit[2],input_qubit[0])) # number=8
c.append(cirq.CNOT.on(input_qubit[2],input_qubit[0])) # number=9
c.append(cirq.H.on(input_qubit[3])) # number=10
c.append(cirq.X.on(input_qubit[2])) # number=11
c.append(cirq.X.on(input_qubit[1])) # number=12
c.append(cirq.X.on(input_qubit[1])) # number=13
# circuit end
c.append(cirq.measure(*input_qubit, key='result'))
return c
def bitstring(bits):
return ''.join(str(int(b)) for b in bits)
if __name__ == '__main__':
qubit_count = 4
input_qubits = [cirq.GridQubit(i, 0) for i in range(qubit_count)]
circuit = make_circuit(qubit_count,input_qubits)
circuit = cg.optimized_for_sycamore(circuit, optimizer_type='sqrt_iswap')
circuit_sample_count =2000
simulator = cirq.Simulator()
result = simulator.run(circuit, repetitions=circuit_sample_count)
frequencies = result.histogram(key='result', fold_func=bitstring)
writefile = open("../data/startCirq643.csv","w+")
print(format(frequencies),file=writefile)
print("results end", file=writefile)
print(circuit.__len__(), file=writefile)
print(circuit,file=writefile)
writefile.close() |
py | b4141efafd5f1ac01a640d3e579d659871506196 | import xlrd
import numpy
wb = xlrd.open_workbook('salaries.xlsx')
sh = wb.sheet_by_index(0)
# Rows: the first column is treated as a label and the remaining cells as numbers.
# Key each row by its 4th-smallest value, then print the label of the row whose
# 4th-smallest value is the largest.
d = {}
for row in range(1, sh.nrows):
l = sh.row_values(row)
d[sorted(l[1:])[3]] = l[0]
print(d[sorted(list(d.keys()), reverse=True)[0]])
# Columns: the first row is treated as a header. Key each column by the mean of its
# values and print the header of the column with the highest mean.
d = {}
for col in range(1, sh.ncols):
l = sh.col_values(col)
d[numpy.mean(l[1:])] = l[0]
print(d[sorted(list(d.keys()), reverse=True)[0]])
|
py | b4141f7cb4da8bf3a0877893b1e1ed7c3df0d232 | from rest_framework import filters
from rest_framework import viewsets
from rest_framework.permissions import IsAuthenticated
from rest_framework.authentication import SessionAuthentication
from rest_framework_jwt.authentication import JSONWebTokenAuthentication
from django_filters.rest_framework import DjangoFilterBackend
from common.permissions import IsOwnerOrReadOnly
from .models import VideoCollection, Video, VideoCollectionDetail
from .paginations import CommonPagination
from .filters import VideoCollectionFilter
from .serializers import VideoCollectionSerializer, VideoSerializer, VideoCollectionDetailSerializer,\
VideoCollectionListDetailSerializer, VideoCollectionDetail2Serializer
class VideoCollectionViewset(viewsets.ModelViewSet):
"""
Video collections
list:
Get the list of video collections
retrieve:
Get the details of a video collection
create:
Create a video collection
update:
Update a video collection
destroy:
Delete a video collection
"""
pagination_class = CommonPagination
authentication_classes = (JSONWebTokenAuthentication, SessionAuthentication)
filter_backends = (DjangoFilterBackend, filters.SearchFilter, filters.OrderingFilter)
filter_class = VideoCollectionFilter
search_fields = ('name', 'desc')
ordering_fields = ('add_time',)
def get_permissions(self):
if self.action == "create":
return [IsAuthenticated()]
if self.action == "update" or self.action == "destroy":
return [IsAuthenticated(), IsOwnerOrReadOnly()]
return super(VideoCollectionViewset, self).get_permissions()
def get_serializer_class(self):
if self.action == "retrieve":
return VideoCollectionListDetailSerializer
return VideoCollectionSerializer
def get_queryset(self):
if self.action == "list":
return VideoCollection.objects.filter(is_del=0).order_by("add_time")
return VideoCollection.objects.filter(is_del=0, user=self.request.user).order_by("add_time")
class VideoViewset(viewsets.ModelViewSet):
"""
Videos
list:
Get the list of videos
retrieve:
Get the details of a video
create:
Create a video
update:
Update a video
destroy:
Delete a video
"""
pagination_class = CommonPagination
permission_classes = (IsAuthenticated, IsOwnerOrReadOnly)
authentication_classes = (JSONWebTokenAuthentication, SessionAuthentication)
serializer_class = VideoSerializer
def get_queryset(self):
return Video.objects.filter(is_del=0, user=self.request.user).order_by("add_time")
class VideoCollectionDetailViewset(viewsets.ModelViewSet):
"""
视频合集明细
list:
查询视频合集明细列表
retrieve:
获取视频合集明细详情
create:
向视频合集中添加视频
update:
更新视频合集明细
destroy:
删除视频合集明细
"""
pagination_class = CommonPagination
permission_classes = (IsAuthenticated, IsOwnerOrReadOnly)
authentication_classes = (JSONWebTokenAuthentication, SessionAuthentication)
def get_serializer_class(self):
if self.action == "create" or self.action == "update":
return VideoCollectionDetail2Serializer
return VideoCollectionDetailSerializer
def get_queryset(self):
return VideoCollectionDetail.objects.filter(is_del=0, user=self.request.user).order_by("add_time")
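
# --- Hedged usage sketch (not part of the original project) -----------------
# These viewsets are typically exposed through a DRF router; the URL prefixes
# and basenames below are assumptions for illustration only, e.g. in urls.py:
#
#   from rest_framework.routers import DefaultRouter
#
#   router = DefaultRouter()
#   router.register(r'video-collections', VideoCollectionViewset, basename='videocollection')
#   router.register(r'videos', VideoViewset, basename='video')
#   router.register(r'collection-details', VideoCollectionDetailViewset, basename='videocollectiondetail')
#   urlpatterns = router.urls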
|
py | b414203fa32f97d1165b00ed41a418935f3b76f4 | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: darts_match.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='darts_match.proto',
package='app',
syntax='proto3',
serialized_options=None,
create_key=_descriptor._internal_create_key,
serialized_pb=b'\n\x11\x64\x61rts_match.proto\x12\x03\x61pp\"3\n\x0cMatchRequest\x12\x10\n\x08userName\x18\x01 \x01(\t\x12\x11\n\tmatchType\x18\x02 \x01(\t\" \n\rMatchResponse\x12\x0f\n\x07matchId\x18\x01 \x01(\x0c\"4\n\x0fRegisterRequest\x12\x0f\n\x07matchId\x18\x01 \x01(\x0c\x12\x10\n\x08userName\x18\x02 \x01(\t\"\'\n\x10RegisterResponse\x12\x13\n\x0bplayerIndex\x18\x01 \x01(\x05\"\"\n\x0f\x46inalizeRequest\x12\x0f\n\x07matchId\x18\x01 \x01(\x0c\"\x12\n\x10\x46inalizeResponse\"\x85\x01\n\x04\x44\x61rt\x12,\n\nmultiplier\x18\x01 \x01(\x0e\x32\x18.app.Dart.DartMultiplier\x12\x0f\n\x07segment\x18\x02 \x01(\x05\">\n\x0e\x44\x61rtMultiplier\x12\x08\n\x04MISS\x10\x00\x12\n\n\x06SINGLE\x10\x01\x12\n\n\x06\x44OUBLE\x10\x02\x12\n\n\x06TREBLE\x10\x03\"N\n\x0cVisitRequest\x12\x0f\n\x07matchId\x18\x01 \x01(\x0c\x12\x13\n\x0bplayerIndex\x18\x02 \x01(\x05\x12\x18\n\x05visit\x18\x03 \x03(\x0b\x32\t.app.Dart\"0\n\rVisitResponse\x12\x0e\n\x06result\x18\x01 \x01(\x05\x12\x0f\n\x07message\x18\x02 \x01(\t\"M\n\x0bListRequest\x12\x0f\n\x07matchId\x18\x01 \x01(\x0c\x12\x13\n\x0bplayerIndex\x18\x02 \x01(\x05\x12\x18\n\x05visit\x18\x03 \x03(\x0b\x32\t.app.Dart\"/\n\x06Player\x12\x10\n\x08userName\x18\x01 \x01(\t\x12\x13\n\x0bplayerIndex\x18\x02 \x01(\x05\"%\n\x05Match\x12\x1c\n\x07players\x18\x01 \x03(\x0b\x32\x0b.app.Player\"+\n\x0cListResponse\x12\x1b\n\x07matches\x18\x01 \x03(\x0b\x32\n.app.Match\"\x0e\n\x0cWatchRequest\"U\n\rWatchResponse\x12\x1b\n\x06player\x18\x01 \x01(\x0b\x32\x0b.app.Player\x12\x18\n\x05\x64\x61rts\x18\x02 \x03(\x0b\x32\t.app.Dart\x12\r\n\x05score\x18\x03 \x01(\x05\"3\n\x0bLeftRequest\x12\x0f\n\x07matchId\x18\x01 \x01(\x0c\x12\x13\n\x0bplayerIndex\x18\x02 \x01(\x05\"!\n\x0cLeftResponse\x12\x11\n\tremainder\x18\x01 \x01(\x05\x32\xa2\x03\n\nDartsMatch\x12\x36\n\x0b\x43reateMatch\x12\x11.app.MatchRequest\x1a\x12.app.MatchResponse\"\x00\x12?\n\x0eRegisterPlayer\x12\x14.app.RegisterRequest\x1a\x15.app.RegisterResponse\"\x00\x12>\n\rFinalizeMatch\x12\x14.app.FinalizeRequest\x1a\x15.app.FinalizeResponse\"\x00\x12\x37\n\x0cProcessVisit\x12\x11.app.VisitRequest\x1a\x12.app.VisitResponse\"\x00\x12\x34\n\x0bListMatches\x12\x10.app.ListRequest\x1a\x11.app.ListResponse\"\x00\x12\x37\n\nWatchMatch\x12\x11.app.WatchRequest\x1a\x12.app.WatchResponse\"\x00\x30\x01\x12\x33\n\nWhatIsLeft\x12\x10.app.LeftRequest\x1a\x11.app.LeftResponse\"\x00\x62\x06proto3'
)
_DART_DARTMULTIPLIER = _descriptor.EnumDescriptor(
name='DartMultiplier',
full_name='app.Dart.DartMultiplier',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='MISS', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SINGLE', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DOUBLE', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='TREBLE', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=336,
serialized_end=398,
)
_sym_db.RegisterEnumDescriptor(_DART_DARTMULTIPLIER)
_MATCHREQUEST = _descriptor.Descriptor(
name='MatchRequest',
full_name='app.MatchRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='userName', full_name='app.MatchRequest.userName', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='matchType', full_name='app.MatchRequest.matchType', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=26,
serialized_end=77,
)
_MATCHRESPONSE = _descriptor.Descriptor(
name='MatchResponse',
full_name='app.MatchResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='matchId', full_name='app.MatchResponse.matchId', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=79,
serialized_end=111,
)
_REGISTERREQUEST = _descriptor.Descriptor(
name='RegisterRequest',
full_name='app.RegisterRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='matchId', full_name='app.RegisterRequest.matchId', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='userName', full_name='app.RegisterRequest.userName', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=113,
serialized_end=165,
)
_REGISTERRESPONSE = _descriptor.Descriptor(
name='RegisterResponse',
full_name='app.RegisterResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='playerIndex', full_name='app.RegisterResponse.playerIndex', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=167,
serialized_end=206,
)
_FINALIZEREQUEST = _descriptor.Descriptor(
name='FinalizeRequest',
full_name='app.FinalizeRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='matchId', full_name='app.FinalizeRequest.matchId', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=208,
serialized_end=242,
)
_FINALIZERESPONSE = _descriptor.Descriptor(
name='FinalizeResponse',
full_name='app.FinalizeResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=244,
serialized_end=262,
)
_DART = _descriptor.Descriptor(
name='Dart',
full_name='app.Dart',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='multiplier', full_name='app.Dart.multiplier', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='segment', full_name='app.Dart.segment', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_DART_DARTMULTIPLIER,
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=265,
serialized_end=398,
)
_VISITREQUEST = _descriptor.Descriptor(
name='VisitRequest',
full_name='app.VisitRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='matchId', full_name='app.VisitRequest.matchId', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='playerIndex', full_name='app.VisitRequest.playerIndex', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='visit', full_name='app.VisitRequest.visit', index=2,
number=3, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=400,
serialized_end=478,
)
_VISITRESPONSE = _descriptor.Descriptor(
name='VisitResponse',
full_name='app.VisitResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='result', full_name='app.VisitResponse.result', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='message', full_name='app.VisitResponse.message', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=480,
serialized_end=528,
)
_LISTREQUEST = _descriptor.Descriptor(
name='ListRequest',
full_name='app.ListRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='matchId', full_name='app.ListRequest.matchId', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='playerIndex', full_name='app.ListRequest.playerIndex', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='visit', full_name='app.ListRequest.visit', index=2,
number=3, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=530,
serialized_end=607,
)
_PLAYER = _descriptor.Descriptor(
name='Player',
full_name='app.Player',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='userName', full_name='app.Player.userName', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='playerIndex', full_name='app.Player.playerIndex', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=609,
serialized_end=656,
)
_MATCH = _descriptor.Descriptor(
name='Match',
full_name='app.Match',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='players', full_name='app.Match.players', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=658,
serialized_end=695,
)
_LISTRESPONSE = _descriptor.Descriptor(
name='ListResponse',
full_name='app.ListResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='matches', full_name='app.ListResponse.matches', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=697,
serialized_end=740,
)
_WATCHREQUEST = _descriptor.Descriptor(
name='WatchRequest',
full_name='app.WatchRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=742,
serialized_end=756,
)
_WATCHRESPONSE = _descriptor.Descriptor(
name='WatchResponse',
full_name='app.WatchResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='player', full_name='app.WatchResponse.player', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='darts', full_name='app.WatchResponse.darts', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='score', full_name='app.WatchResponse.score', index=2,
number=3, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=758,
serialized_end=843,
)
_LEFTREQUEST = _descriptor.Descriptor(
name='LeftRequest',
full_name='app.LeftRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='matchId', full_name='app.LeftRequest.matchId', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='playerIndex', full_name='app.LeftRequest.playerIndex', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=845,
serialized_end=896,
)
_LEFTRESPONSE = _descriptor.Descriptor(
name='LeftResponse',
full_name='app.LeftResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='remainder', full_name='app.LeftResponse.remainder', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=898,
serialized_end=931,
)
_DART.fields_by_name['multiplier'].enum_type = _DART_DARTMULTIPLIER
_DART_DARTMULTIPLIER.containing_type = _DART
_VISITREQUEST.fields_by_name['visit'].message_type = _DART
_LISTREQUEST.fields_by_name['visit'].message_type = _DART
_MATCH.fields_by_name['players'].message_type = _PLAYER
_LISTRESPONSE.fields_by_name['matches'].message_type = _MATCH
_WATCHRESPONSE.fields_by_name['player'].message_type = _PLAYER
_WATCHRESPONSE.fields_by_name['darts'].message_type = _DART
DESCRIPTOR.message_types_by_name['MatchRequest'] = _MATCHREQUEST
DESCRIPTOR.message_types_by_name['MatchResponse'] = _MATCHRESPONSE
DESCRIPTOR.message_types_by_name['RegisterRequest'] = _REGISTERREQUEST
DESCRIPTOR.message_types_by_name['RegisterResponse'] = _REGISTERRESPONSE
DESCRIPTOR.message_types_by_name['FinalizeRequest'] = _FINALIZEREQUEST
DESCRIPTOR.message_types_by_name['FinalizeResponse'] = _FINALIZERESPONSE
DESCRIPTOR.message_types_by_name['Dart'] = _DART
DESCRIPTOR.message_types_by_name['VisitRequest'] = _VISITREQUEST
DESCRIPTOR.message_types_by_name['VisitResponse'] = _VISITRESPONSE
DESCRIPTOR.message_types_by_name['ListRequest'] = _LISTREQUEST
DESCRIPTOR.message_types_by_name['Player'] = _PLAYER
DESCRIPTOR.message_types_by_name['Match'] = _MATCH
DESCRIPTOR.message_types_by_name['ListResponse'] = _LISTRESPONSE
DESCRIPTOR.message_types_by_name['WatchRequest'] = _WATCHREQUEST
DESCRIPTOR.message_types_by_name['WatchResponse'] = _WATCHRESPONSE
DESCRIPTOR.message_types_by_name['LeftRequest'] = _LEFTREQUEST
DESCRIPTOR.message_types_by_name['LeftResponse'] = _LEFTRESPONSE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
MatchRequest = _reflection.GeneratedProtocolMessageType('MatchRequest', (_message.Message,), {
'DESCRIPTOR' : _MATCHREQUEST,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.MatchRequest)
})
_sym_db.RegisterMessage(MatchRequest)
MatchResponse = _reflection.GeneratedProtocolMessageType('MatchResponse', (_message.Message,), {
'DESCRIPTOR' : _MATCHRESPONSE,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.MatchResponse)
})
_sym_db.RegisterMessage(MatchResponse)
RegisterRequest = _reflection.GeneratedProtocolMessageType('RegisterRequest', (_message.Message,), {
'DESCRIPTOR' : _REGISTERREQUEST,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.RegisterRequest)
})
_sym_db.RegisterMessage(RegisterRequest)
RegisterResponse = _reflection.GeneratedProtocolMessageType('RegisterResponse', (_message.Message,), {
'DESCRIPTOR' : _REGISTERRESPONSE,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.RegisterResponse)
})
_sym_db.RegisterMessage(RegisterResponse)
FinalizeRequest = _reflection.GeneratedProtocolMessageType('FinalizeRequest', (_message.Message,), {
'DESCRIPTOR' : _FINALIZEREQUEST,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.FinalizeRequest)
})
_sym_db.RegisterMessage(FinalizeRequest)
FinalizeResponse = _reflection.GeneratedProtocolMessageType('FinalizeResponse', (_message.Message,), {
'DESCRIPTOR' : _FINALIZERESPONSE,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.FinalizeResponse)
})
_sym_db.RegisterMessage(FinalizeResponse)
Dart = _reflection.GeneratedProtocolMessageType('Dart', (_message.Message,), {
'DESCRIPTOR' : _DART,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.Dart)
})
_sym_db.RegisterMessage(Dart)
VisitRequest = _reflection.GeneratedProtocolMessageType('VisitRequest', (_message.Message,), {
'DESCRIPTOR' : _VISITREQUEST,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.VisitRequest)
})
_sym_db.RegisterMessage(VisitRequest)
VisitResponse = _reflection.GeneratedProtocolMessageType('VisitResponse', (_message.Message,), {
'DESCRIPTOR' : _VISITRESPONSE,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.VisitResponse)
})
_sym_db.RegisterMessage(VisitResponse)
ListRequest = _reflection.GeneratedProtocolMessageType('ListRequest', (_message.Message,), {
'DESCRIPTOR' : _LISTREQUEST,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.ListRequest)
})
_sym_db.RegisterMessage(ListRequest)
Player = _reflection.GeneratedProtocolMessageType('Player', (_message.Message,), {
'DESCRIPTOR' : _PLAYER,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.Player)
})
_sym_db.RegisterMessage(Player)
Match = _reflection.GeneratedProtocolMessageType('Match', (_message.Message,), {
'DESCRIPTOR' : _MATCH,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.Match)
})
_sym_db.RegisterMessage(Match)
ListResponse = _reflection.GeneratedProtocolMessageType('ListResponse', (_message.Message,), {
'DESCRIPTOR' : _LISTRESPONSE,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.ListResponse)
})
_sym_db.RegisterMessage(ListResponse)
WatchRequest = _reflection.GeneratedProtocolMessageType('WatchRequest', (_message.Message,), {
'DESCRIPTOR' : _WATCHREQUEST,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.WatchRequest)
})
_sym_db.RegisterMessage(WatchRequest)
WatchResponse = _reflection.GeneratedProtocolMessageType('WatchResponse', (_message.Message,), {
'DESCRIPTOR' : _WATCHRESPONSE,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.WatchResponse)
})
_sym_db.RegisterMessage(WatchResponse)
LeftRequest = _reflection.GeneratedProtocolMessageType('LeftRequest', (_message.Message,), {
'DESCRIPTOR' : _LEFTREQUEST,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.LeftRequest)
})
_sym_db.RegisterMessage(LeftRequest)
LeftResponse = _reflection.GeneratedProtocolMessageType('LeftResponse', (_message.Message,), {
'DESCRIPTOR' : _LEFTRESPONSE,
'__module__' : 'darts_match_pb2'
# @@protoc_insertion_point(class_scope:app.LeftResponse)
})
_sym_db.RegisterMessage(LeftResponse)
_DARTSMATCH = _descriptor.ServiceDescriptor(
name='DartsMatch',
full_name='app.DartsMatch',
file=DESCRIPTOR,
index=0,
serialized_options=None,
create_key=_descriptor._internal_create_key,
serialized_start=934,
serialized_end=1352,
methods=[
_descriptor.MethodDescriptor(
name='CreateMatch',
full_name='app.DartsMatch.CreateMatch',
index=0,
containing_service=None,
input_type=_MATCHREQUEST,
output_type=_MATCHRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='RegisterPlayer',
full_name='app.DartsMatch.RegisterPlayer',
index=1,
containing_service=None,
input_type=_REGISTERREQUEST,
output_type=_REGISTERRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='FinalizeMatch',
full_name='app.DartsMatch.FinalizeMatch',
index=2,
containing_service=None,
input_type=_FINALIZEREQUEST,
output_type=_FINALIZERESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='ProcessVisit',
full_name='app.DartsMatch.ProcessVisit',
index=3,
containing_service=None,
input_type=_VISITREQUEST,
output_type=_VISITRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='ListMatches',
full_name='app.DartsMatch.ListMatches',
index=4,
containing_service=None,
input_type=_LISTREQUEST,
output_type=_LISTRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='WatchMatch',
full_name='app.DartsMatch.WatchMatch',
index=5,
containing_service=None,
input_type=_WATCHREQUEST,
output_type=_WATCHRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='WhatIsLeft',
full_name='app.DartsMatch.WhatIsLeft',
index=6,
containing_service=None,
input_type=_LEFTREQUEST,
output_type=_LEFTRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
])
_sym_db.RegisterServiceDescriptor(_DARTSMATCH)
DESCRIPTOR.services_by_name['DartsMatch'] = _DARTSMATCH
# @@protoc_insertion_point(module_scope)
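# --- Hedged usage sketch (not part of the generated code) -------------------
# The message classes above follow the standard protobuf Python API; for
# example, a visit could be built and round-tripped like this (values are
# illustrative only):
#
#   dart = Dart(multiplier=Dart.TREBLE, segment=20)
#   request = VisitRequest(matchId=b'match-1', playerIndex=0, visit=[dart])
#   payload = request.SerializeToString()
#   assert VisitRequest.FromString(payload) == request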
|
py | b414205e9369412ef4b59a2c85c4af7df1c49622 | from memshark_config import MemsharkConfig
from games import Games
from gui import gui
from background import background, memshark, server
# Load the configuration and game definitions, wire up the background services
# (the Memshark engine and the exploit server), then hand everything to the GUI.
config = MemsharkConfig('config.json')
games = Games('games')
memshark = memshark.Memshark(config)  # note: rebinds the imported module name to the instance
exploit_server = server.ExploitServer()
background.schedule(memshark)
background.schedule(exploit_server)
app = gui.MainApp(config, games, memshark, exploit_server)
app.start()
|
bzl | b41421031ea2f2b4ab01850421f5542b3ed36287 | """
@generated
cargo-raze generated Bazel file.
DO NOT EDIT! Replaced on runs of cargo-raze
"""
load("@bazel_tools//tools/build_defs/repo:git.bzl", "new_git_repository") # buildifier: disable=load
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive") # buildifier: disable=load
load("@bazel_tools//tools/build_defs/repo:utils.bzl", "maybe") # buildifier: disable=load
# EXPERIMENTAL -- MAY CHANGE AT ANY TIME: A mapping of package names to a set of normal dependencies for the Rust targets of that package.
_DEPENDENCIES = {
"remote/binary_dependencies": {
"ferris-says": "@remote_binary_dependencies__ferris_says__0_2_0//:ferris_says",
},
}
# EXPERIMENTAL -- MAY CHANGE AT ANY TIME: A mapping of package names to a set of proc_macro dependencies for the Rust targets of that package.
_PROC_MACRO_DEPENDENCIES = {
"remote/binary_dependencies": {
},
}
# EXPERIMENTAL -- MAY CHANGE AT ANY TIME: A mapping of package names to a set of normal dev dependencies for the Rust targets of that package.
_DEV_DEPENDENCIES = {
"remote/binary_dependencies": {
},
}
# EXPERIMENTAL -- MAY CHANGE AT ANY TIME: A mapping of package names to a set of proc_macro dev dependencies for the Rust targets of that package.
_DEV_PROC_MACRO_DEPENDENCIES = {
"remote/binary_dependencies": {
},
}
def crate_deps(deps, package_name = None):
"""EXPERIMENTAL -- MAY CHANGE AT ANY TIME: Finds the fully qualified label of the requested crates for the package where this macro is called.
    WARNING: This macro is part of an experimental API and is subject to change.
Args:
deps (list): The desired list of crate targets.
package_name (str, optional): The package name of the set of dependencies to look up.
Defaults to `native.package_name()`.
Returns:
list: A list of labels to cargo-raze generated targets (str)
"""
if not package_name:
package_name = native.package_name()
# Join both sets of dependencies
dependencies = _flatten_dependency_maps([
_DEPENDENCIES,
_PROC_MACRO_DEPENDENCIES,
_DEV_DEPENDENCIES,
_DEV_PROC_MACRO_DEPENDENCIES,
])
if not deps:
return []
missing_crates = []
crate_targets = []
for crate_target in deps:
if crate_target not in dependencies[package_name]:
missing_crates.append(crate_target)
else:
crate_targets.append(dependencies[package_name][crate_target])
if missing_crates:
fail("Could not find crates `{}` among dependencies of `{}`. Available dependencies were `{}`".format(
missing_crates,
package_name,
dependencies[package_name],
))
return crate_targets
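
# Hedged usage sketch (not part of this generated file; load paths and target
# names are assumptions): a BUILD file under `remote/binary_dependencies` could
# resolve its crate labels through `crate_deps`, which defaults to
# `native.package_name()` for the lookup:
#
#   load("@rules_rust//rust:defs.bzl", "rust_binary")
#   load("//remote/binary_dependencies/cargo:crates.bzl", "crate_deps")
#
#   rust_binary(
#       name = "say_hello",
#       srcs = ["src/main.rs"],
#       deps = crate_deps(["ferris-says"]),
#   )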
def all_crate_deps(normal = False, normal_dev = False, proc_macro = False, proc_macro_dev = False, package_name = None):
"""EXPERIMENTAL -- MAY CHANGE AT ANY TIME: Finds the fully qualified label of all requested direct crate dependencies \
for the package where this macro is called.
If no parameters are set, all normal dependencies are returned. Setting any one flag will
otherwise impact the contents of the returned list.
Args:
normal (bool, optional): If True, normal dependencies are included in the
output list. Defaults to False.
        normal_dev (bool, optional): If True, normal dev dependencies will be
included in the output list. Defaults to False.
proc_macro (bool, optional): If True, proc_macro dependencies are included
in the output list. Defaults to False.
proc_macro_dev (bool, optional): If True, dev proc_macro dependencies are
included in the output list. Defaults to False.
package_name (str, optional): The package name of the set of dependencies to look up.
Defaults to `native.package_name()`.
Returns:
list: A list of labels to cargo-raze generated targets (str)
"""
if not package_name:
package_name = native.package_name()
# Determine the relevant maps to use
all_dependency_maps = []
if normal:
all_dependency_maps.append(_DEPENDENCIES)
if normal_dev:
all_dependency_maps.append(_DEV_DEPENDENCIES)
if proc_macro:
all_dependency_maps.append(_PROC_MACRO_DEPENDENCIES)
if proc_macro_dev:
all_dependency_maps.append(_DEV_PROC_MACRO_DEPENDENCIES)
# Default to always using normal dependencies
if not all_dependency_maps:
all_dependency_maps.append(_DEPENDENCIES)
dependencies = _flatten_dependency_maps(all_dependency_maps)
if not dependencies:
return []
return dependencies[package_name].values()
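
# Hedged usage sketch (not part of this generated file): instead of naming
# individual crates, the same hypothetical BUILD file could take every normal
# dependency recorded for its package:
#
#   rust_binary(
#       name = "say_hello",
#       srcs = ["src/main.rs"],
#       deps = all_crate_deps(normal = True),
#   )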
def _flatten_dependency_maps(all_dependency_maps):
"""Flatten a list of dependency maps into one dictionary.
Dependency maps have the following structure:
```python
DEPENDENCIES_MAP = {
# The first key in the map is a Bazel package
# name of the workspace this file is defined in.
"package_name": {
# An alias to a crate target. # The label of the crate target the
# Aliases are only crate names. # alias refers to.
"alias": "@full//:label",
}
}
```
Args:
all_dependency_maps (list): A list of dicts as described above
Returns:
dict: A dictionary as described above
"""
dependencies = {}
for dep_map in all_dependency_maps:
for pkg_name in dep_map:
if pkg_name not in dependencies:
# Add a non-frozen dict to the collection of dependencies
dependencies.setdefault(pkg_name, dict(dep_map[pkg_name].items()))
continue
duplicate_crate_aliases = [key for key in dependencies[pkg_name] if key in dep_map[pkg_name]]
if duplicate_crate_aliases:
fail("There should be no duplicate crate aliases: {}".format(duplicate_crate_aliases))
dependencies[pkg_name].update(dep_map[pkg_name])
return dependencies
def remote_binary_dependencies_fetch_remote_crates():
"""This function defines a collection of repos and should be called in a WORKSPACE file"""
maybe(
http_archive,
name = "remote_binary_dependencies__addr2line__0_14_0",
url = "https://crates.io/api/v1/crates/addr2line/0.14.0/download",
type = "tar.gz",
sha256 = "7c0929d69e78dd9bf5408269919fcbcaeb2e35e5d43e5815517cdc6a8e11a423",
strip_prefix = "addr2line-0.14.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.addr2line-0.14.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__adler__0_2_3",
url = "https://crates.io/api/v1/crates/adler/0.2.3/download",
type = "tar.gz",
sha256 = "ee2a4ec343196209d6594e19543ae87a39f96d5534d7174822a3ad825dd6ed7e",
strip_prefix = "adler-0.2.3",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.adler-0.2.3.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__adler32__1_2_0",
url = "https://crates.io/api/v1/crates/adler32/1.2.0/download",
type = "tar.gz",
sha256 = "aae1277d39aeec15cb388266ecc24b11c80469deae6067e17a1a7aa9e5c1f234",
strip_prefix = "adler32-1.2.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.adler32-1.2.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__ansi_term__0_11_0",
url = "https://crates.io/api/v1/crates/ansi_term/0.11.0/download",
type = "tar.gz",
sha256 = "ee49baf6cb617b853aa8d93bf420db2383fab46d314482ca2803b40d5fde979b",
strip_prefix = "ansi_term-0.11.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.ansi_term-0.11.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__atty__0_2_14",
url = "https://crates.io/api/v1/crates/atty/0.2.14/download",
type = "tar.gz",
sha256 = "d9b39be18770d11421cdb1b9947a45dd3f37e93092cbf377614828a319d5fee8",
strip_prefix = "atty-0.2.14",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.atty-0.2.14.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__autocfg__1_0_1",
url = "https://crates.io/api/v1/crates/autocfg/1.0.1/download",
type = "tar.gz",
sha256 = "cdb031dd78e28731d87d56cc8ffef4a8f36ca26c38fe2de700543e627f8a464a",
strip_prefix = "autocfg-1.0.1",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.autocfg-1.0.1.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__backtrace__0_3_54",
url = "https://crates.io/api/v1/crates/backtrace/0.3.54/download",
type = "tar.gz",
sha256 = "2baad346b2d4e94a24347adeee9c7a93f412ee94b9cc26e5b59dea23848e9f28",
strip_prefix = "backtrace-0.3.54",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.backtrace-0.3.54.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__bitflags__1_2_1",
url = "https://crates.io/api/v1/crates/bitflags/1.2.1/download",
type = "tar.gz",
sha256 = "cf1de2fe8c75bc145a2f577add951f8134889b4795d47466a54a5c846d691693",
strip_prefix = "bitflags-1.2.1",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.bitflags-1.2.1.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__bytemuck__1_4_1",
url = "https://crates.io/api/v1/crates/bytemuck/1.4.1/download",
type = "tar.gz",
sha256 = "41aa2ec95ca3b5c54cf73c91acf06d24f4495d5f1b1c12506ae3483d646177ac",
strip_prefix = "bytemuck-1.4.1",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.bytemuck-1.4.1.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__byteorder__1_3_4",
url = "https://crates.io/api/v1/crates/byteorder/1.3.4/download",
type = "tar.gz",
sha256 = "08c48aae112d48ed9f069b33538ea9e3e90aa263cfa3d1c24309612b1f7472de",
strip_prefix = "byteorder-1.3.4",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.byteorder-1.3.4.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__cfg_if__0_1_10",
url = "https://crates.io/api/v1/crates/cfg-if/0.1.10/download",
type = "tar.gz",
sha256 = "4785bdd1c96b2a846b2bd7cc02e86b6b3dbf14e7e53446c4f54c92a361040822",
strip_prefix = "cfg-if-0.1.10",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.cfg-if-0.1.10.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__cfg_if__1_0_0",
url = "https://crates.io/api/v1/crates/cfg-if/1.0.0/download",
type = "tar.gz",
sha256 = "baf1de4339761588bc0619e3cbc0120ee582ebb74b53b4efbf79117bd2da40fd",
strip_prefix = "cfg-if-1.0.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.cfg-if-1.0.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__clap__2_33_3",
url = "https://crates.io/api/v1/crates/clap/2.33.3/download",
type = "tar.gz",
sha256 = "37e58ac78573c40708d45522f0d80fa2f01cc4f9b4e2bf749807255454312002",
strip_prefix = "clap-2.33.3",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.clap-2.33.3.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__console__0_13_0",
url = "https://crates.io/api/v1/crates/console/0.13.0/download",
type = "tar.gz",
sha256 = "a50aab2529019abfabfa93f1e6c41ef392f91fbf179b347a7e96abb524884a08",
strip_prefix = "console-0.13.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.console-0.13.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__crc32fast__1_2_1",
url = "https://crates.io/api/v1/crates/crc32fast/1.2.1/download",
type = "tar.gz",
sha256 = "81156fece84ab6a9f2afdb109ce3ae577e42b1228441eded99bd77f627953b1a",
strip_prefix = "crc32fast-1.2.1",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.crc32fast-1.2.1.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__crossbeam_utils__0_7_2",
url = "https://crates.io/api/v1/crates/crossbeam-utils/0.7.2/download",
type = "tar.gz",
sha256 = "c3c7c73a2d1e9fc0886a08b93e98eb643461230d5f1925e4036204d5f2e261a8",
strip_prefix = "crossbeam-utils-0.7.2",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.crossbeam-utils-0.7.2.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__deflate__0_8_6",
url = "https://crates.io/api/v1/crates/deflate/0.8.6/download",
type = "tar.gz",
sha256 = "73770f8e1fe7d64df17ca66ad28994a0a623ea497fa69486e14984e715c5d174",
strip_prefix = "deflate-0.8.6",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.deflate-0.8.6.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__encode_unicode__0_3_6",
url = "https://crates.io/api/v1/crates/encode_unicode/0.3.6/download",
type = "tar.gz",
sha256 = "a357d28ed41a50f9c765dbfe56cbc04a64e53e5fc58ba79fbc34c10ef3df831f",
strip_prefix = "encode_unicode-0.3.6",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.encode_unicode-0.3.6.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__error_chain__0_10_0",
url = "https://crates.io/api/v1/crates/error-chain/0.10.0/download",
type = "tar.gz",
sha256 = "d9435d864e017c3c6afeac1654189b06cdb491cf2ff73dbf0d73b0f292f42ff8",
strip_prefix = "error-chain-0.10.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.error-chain-0.10.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__ferris_says__0_2_0",
url = "https://crates.io/api/v1/crates/ferris-says/0.2.0/download",
type = "tar.gz",
sha256 = "7f34f82e9a8b1533c027d018abd90b8687bf923be287b2617dfce4bea4ea3687",
strip_prefix = "ferris-says-0.2.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.ferris-says-0.2.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__gimli__0_23_0",
url = "https://crates.io/api/v1/crates/gimli/0.23.0/download",
type = "tar.gz",
sha256 = "f6503fe142514ca4799d4c26297c4248239fe8838d827db6bd6065c6ed29a6ce",
strip_prefix = "gimli-0.23.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.gimli-0.23.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__heck__0_3_1",
url = "https://crates.io/api/v1/crates/heck/0.3.1/download",
type = "tar.gz",
sha256 = "20564e78d53d2bb135c343b3f47714a56af2061f1c928fdb541dc7b9fdd94205",
strip_prefix = "heck-0.3.1",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.heck-0.3.1.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__hermit_abi__0_1_17",
url = "https://crates.io/api/v1/crates/hermit-abi/0.1.17/download",
type = "tar.gz",
sha256 = "5aca5565f760fb5b220e499d72710ed156fdb74e631659e99377d9ebfbd13ae8",
strip_prefix = "hermit-abi-0.1.17",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.hermit-abi-0.1.17.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__image__0_23_10",
url = "https://crates.io/api/v1/crates/image/0.23.10/download",
type = "tar.gz",
sha256 = "985fc06b1304d19c28d5c562ed78ef5316183f2b0053b46763a0b94862373c34",
strip_prefix = "image-0.23.10",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.image-0.23.10.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__indicatif__0_14_0",
url = "https://crates.io/api/v1/crates/indicatif/0.14.0/download",
type = "tar.gz",
sha256 = "49a68371cf417889c9d7f98235b7102ea7c54fc59bcbd22f3dea785be9d27e40",
strip_prefix = "indicatif-0.14.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.indicatif-0.14.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__jpeg_decoder__0_1_20",
url = "https://crates.io/api/v1/crates/jpeg-decoder/0.1.20/download",
type = "tar.gz",
sha256 = "cc797adac5f083b8ff0ca6f6294a999393d76e197c36488e2ef732c4715f6fa3",
strip_prefix = "jpeg-decoder-0.1.20",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.jpeg-decoder-0.1.20.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__lazy_static__1_4_0",
url = "https://crates.io/api/v1/crates/lazy_static/1.4.0/download",
type = "tar.gz",
sha256 = "e2abad23fbc42b3700f2f279844dc832adb2b2eb069b2df918f455c4e18cc646",
strip_prefix = "lazy_static-1.4.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.lazy_static-1.4.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__libc__0_2_80",
url = "https://crates.io/api/v1/crates/libc/0.2.80/download",
type = "tar.gz",
sha256 = "4d58d1b70b004888f764dfbf6a26a3b0342a1632d33968e4a179d8011c760614",
strip_prefix = "libc-0.2.80",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.libc-0.2.80.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__miniz_oxide__0_3_7",
url = "https://crates.io/api/v1/crates/miniz_oxide/0.3.7/download",
type = "tar.gz",
sha256 = "791daaae1ed6889560f8c4359194f56648355540573244a5448a83ba1ecc7435",
strip_prefix = "miniz_oxide-0.3.7",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.miniz_oxide-0.3.7.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__miniz_oxide__0_4_3",
url = "https://crates.io/api/v1/crates/miniz_oxide/0.4.3/download",
type = "tar.gz",
sha256 = "0f2d26ec3309788e423cfbf68ad1800f061638098d76a83681af979dc4eda19d",
strip_prefix = "miniz_oxide-0.4.3",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.miniz_oxide-0.4.3.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__num_integer__0_1_44",
url = "https://crates.io/api/v1/crates/num-integer/0.1.44/download",
type = "tar.gz",
sha256 = "d2cc698a63b549a70bc047073d2949cce27cd1c7b0a4a862d08a8031bc2801db",
strip_prefix = "num-integer-0.1.44",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.num-integer-0.1.44.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__num_iter__0_1_42",
url = "https://crates.io/api/v1/crates/num-iter/0.1.42/download",
type = "tar.gz",
sha256 = "b2021c8337a54d21aca0d59a92577a029af9431cb59b909b03252b9c164fad59",
strip_prefix = "num-iter-0.1.42",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.num-iter-0.1.42.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__num_rational__0_3_1",
url = "https://crates.io/api/v1/crates/num-rational/0.3.1/download",
type = "tar.gz",
sha256 = "e5fa6d5f418879385b213d905f7cf5bf4aa553d4c380f0152d1d4f2749186fa9",
strip_prefix = "num-rational-0.3.1",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.num-rational-0.3.1.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__num_traits__0_2_14",
url = "https://crates.io/api/v1/crates/num-traits/0.2.14/download",
type = "tar.gz",
sha256 = "9a64b1ec5cda2586e284722486d802acf1f7dbdc623e2bfc57e65ca1cd099290",
strip_prefix = "num-traits-0.2.14",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.num-traits-0.2.14.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__num_cpus__1_13_0",
url = "https://crates.io/api/v1/crates/num_cpus/1.13.0/download",
type = "tar.gz",
sha256 = "05499f3756671c15885fee9034446956fff3f243d6077b91e5767df161f766b3",
strip_prefix = "num_cpus-1.13.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.num_cpus-1.13.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__number_prefix__0_3_0",
url = "https://crates.io/api/v1/crates/number_prefix/0.3.0/download",
type = "tar.gz",
sha256 = "17b02fc0ff9a9e4b35b3342880f48e896ebf69f2967921fe8646bf5b7125956a",
strip_prefix = "number_prefix-0.3.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.number_prefix-0.3.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__object__0_22_0",
url = "https://crates.io/api/v1/crates/object/0.22.0/download",
type = "tar.gz",
sha256 = "8d3b63360ec3cb337817c2dbd47ab4a0f170d285d8e5a2064600f3def1402397",
strip_prefix = "object-0.22.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.object-0.22.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__pdqselect__0_1_0",
url = "https://crates.io/api/v1/crates/pdqselect/0.1.0/download",
type = "tar.gz",
sha256 = "4ec91767ecc0a0bbe558ce8c9da33c068066c57ecc8bb8477ef8c1ad3ef77c27",
strip_prefix = "pdqselect-0.1.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.pdqselect-0.1.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__png__0_16_7",
url = "https://crates.io/api/v1/crates/png/0.16.7/download",
type = "tar.gz",
sha256 = "dfe7f9f1c730833200b134370e1d5098964231af8450bce9b78ee3ab5278b970",
strip_prefix = "png-0.16.7",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.png-0.16.7.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__ppv_lite86__0_2_9",
url = "https://crates.io/api/v1/crates/ppv-lite86/0.2.9/download",
type = "tar.gz",
sha256 = "c36fa947111f5c62a733b652544dd0016a43ce89619538a8ef92724a6f501a20",
strip_prefix = "ppv-lite86-0.2.9",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.ppv-lite86-0.2.9.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__proc_macro_error__1_0_4",
url = "https://crates.io/api/v1/crates/proc-macro-error/1.0.4/download",
type = "tar.gz",
sha256 = "da25490ff9892aab3fcf7c36f08cfb902dd3e71ca0f9f9517bea02a73a5ce38c",
strip_prefix = "proc-macro-error-1.0.4",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.proc-macro-error-1.0.4.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__proc_macro_error_attr__1_0_4",
url = "https://crates.io/api/v1/crates/proc-macro-error-attr/1.0.4/download",
type = "tar.gz",
sha256 = "a1be40180e52ecc98ad80b184934baf3d0d29f979574e439af5a55274b35f869",
strip_prefix = "proc-macro-error-attr-1.0.4",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.proc-macro-error-attr-1.0.4.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__proc_macro2__1_0_24",
url = "https://crates.io/api/v1/crates/proc-macro2/1.0.24/download",
type = "tar.gz",
sha256 = "1e0704ee1a7e00d7bb417d0770ea303c1bccbabf0ef1667dae92b5967f5f8a71",
strip_prefix = "proc-macro2-1.0.24",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.proc-macro2-1.0.24.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__quote__1_0_7",
url = "https://crates.io/api/v1/crates/quote/1.0.7/download",
type = "tar.gz",
sha256 = "aa563d17ecb180e500da1cfd2b028310ac758de548efdd203e18f283af693f37",
strip_prefix = "quote-1.0.7",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.quote-1.0.7.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__rand__0_7_3",
url = "https://crates.io/api/v1/crates/rand/0.7.3/download",
type = "tar.gz",
sha256 = "6a6b1679d49b24bbfe0c803429aa1874472f50d9b363131f0e89fc356b544d03",
strip_prefix = "rand-0.7.3",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.rand-0.7.3.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__rand_chacha__0_2_2",
url = "https://crates.io/api/v1/crates/rand_chacha/0.2.2/download",
type = "tar.gz",
sha256 = "f4c8ed856279c9737206bf725bf36935d8666ead7aa69b52be55af369d193402",
strip_prefix = "rand_chacha-0.2.2",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.rand_chacha-0.2.2.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__rand_core__0_5_1",
url = "https://crates.io/api/v1/crates/rand_core/0.5.1/download",
type = "tar.gz",
sha256 = "90bde5296fc891b0cef12a6d03ddccc162ce7b2aff54160af9338f8d40df6d19",
strip_prefix = "rand_core-0.5.1",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.rand_core-0.5.1.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__rand_hc__0_2_0",
url = "https://crates.io/api/v1/crates/rand_hc/0.2.0/download",
type = "tar.gz",
sha256 = "ca3129af7b92a17112d59ad498c6f81eaf463253766b90396d39ea7a39d6613c",
strip_prefix = "rand_hc-0.2.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.rand_hc-0.2.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__rand_pcg__0_2_1",
url = "https://crates.io/api/v1/crates/rand_pcg/0.2.1/download",
type = "tar.gz",
sha256 = "16abd0c1b639e9eb4d7c50c0b8100b0d0f849be2349829c740fe8e6eb4816429",
strip_prefix = "rand_pcg-0.2.1",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.rand_pcg-0.2.1.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__regex__1_4_1",
url = "https://crates.io/api/v1/crates/regex/1.4.1/download",
type = "tar.gz",
sha256 = "8963b85b8ce3074fecffde43b4b0dded83ce2f367dc8d363afc56679f3ee820b",
strip_prefix = "regex-1.4.1",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.regex-1.4.1.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__regex_syntax__0_6_20",
url = "https://crates.io/api/v1/crates/regex-syntax/0.6.20/download",
type = "tar.gz",
sha256 = "8cab7a364d15cde1e505267766a2d3c4e22a843e1a601f0fa7564c0f82ced11c",
strip_prefix = "regex-syntax-0.6.20",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.regex-syntax-0.6.20.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__rstar__0_7_1",
url = "https://crates.io/api/v1/crates/rstar/0.7.1/download",
type = "tar.gz",
sha256 = "0650eaaa56cbd1726fd671150fce8ac6ed9d9a25d1624430d7ee9d196052f6b6",
strip_prefix = "rstar-0.7.1",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.rstar-0.7.1.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__rustc_demangle__0_1_18",
url = "https://crates.io/api/v1/crates/rustc-demangle/0.1.18/download",
type = "tar.gz",
sha256 = "6e3bad0ee36814ca07d7968269dd4b7ec89ec2da10c4bb613928d3077083c232",
strip_prefix = "rustc-demangle-0.1.18",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.rustc-demangle-0.1.18.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__smallvec__0_4_5",
url = "https://crates.io/api/v1/crates/smallvec/0.4.5/download",
type = "tar.gz",
sha256 = "f90c5e5fe535e48807ab94fc611d323935f39d4660c52b26b96446a7b33aef10",
strip_prefix = "smallvec-0.4.5",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.smallvec-0.4.5.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__strsim__0_8_0",
url = "https://crates.io/api/v1/crates/strsim/0.8.0/download",
type = "tar.gz",
sha256 = "8ea5119cdb4c55b55d432abb513a0429384878c15dde60cc77b1c99de1a95a6a",
strip_prefix = "strsim-0.8.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.strsim-0.8.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__structopt__0_3_20",
url = "https://crates.io/api/v1/crates/structopt/0.3.20/download",
type = "tar.gz",
sha256 = "126d630294ec449fae0b16f964e35bf3c74f940da9dca17ee9b905f7b3112eb8",
strip_prefix = "structopt-0.3.20",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.structopt-0.3.20.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__structopt_derive__0_4_13",
url = "https://crates.io/api/v1/crates/structopt-derive/0.4.13/download",
type = "tar.gz",
sha256 = "65e51c492f9e23a220534971ff5afc14037289de430e3c83f9daf6a1b6ae91e8",
strip_prefix = "structopt-derive-0.4.13",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.structopt-derive-0.4.13.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__syn__1_0_48",
url = "https://crates.io/api/v1/crates/syn/1.0.48/download",
type = "tar.gz",
sha256 = "cc371affeffc477f42a221a1e4297aedcea33d47d19b61455588bd9d8f6b19ac",
strip_prefix = "syn-1.0.48",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.syn-1.0.48.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__terminal_size__0_1_13",
url = "https://crates.io/api/v1/crates/terminal_size/0.1.13/download",
type = "tar.gz",
sha256 = "9a14cd9f8c72704232f0bfc8455c0e861f0ad4eb60cc9ec8a170e231414c1e13",
strip_prefix = "terminal_size-0.1.13",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.terminal_size-0.1.13.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__texture_synthesis__0_8_0",
url = "https://crates.io/api/v1/crates/texture-synthesis/0.8.0/download",
type = "tar.gz",
sha256 = "2ff7e9b61c0c11d66b78f7474d69217a4e0fc5b758ff33b67c6dc0b87b126191",
strip_prefix = "texture-synthesis-0.8.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.texture-synthesis-0.8.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__texture_synthesis_cli__0_8_0",
url = "https://crates.io/api/v1/crates/texture-synthesis-cli/0.8.0/download",
type = "tar.gz",
sha256 = "4d0a65298b735ba486081633e90469182d7c0ca59018fec1e81383bde852be2e",
strip_prefix = "texture-synthesis-cli-0.8.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.texture-synthesis-cli-0.8.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__textwrap__0_11_0",
url = "https://crates.io/api/v1/crates/textwrap/0.11.0/download",
type = "tar.gz",
sha256 = "d326610f408c7a4eb6f51c37c330e496b08506c9457c9d34287ecc38809fb060",
strip_prefix = "textwrap-0.11.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.textwrap-0.11.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__unicode_segmentation__1_6_0",
url = "https://crates.io/api/v1/crates/unicode-segmentation/1.6.0/download",
type = "tar.gz",
sha256 = "e83e153d1053cbb5a118eeff7fd5be06ed99153f00dbcd8ae310c5fb2b22edc0",
strip_prefix = "unicode-segmentation-1.6.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.unicode-segmentation-1.6.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__unicode_width__0_1_8",
url = "https://crates.io/api/v1/crates/unicode-width/0.1.8/download",
type = "tar.gz",
sha256 = "9337591893a19b88d8d87f2cec1e73fad5cdfd10e5a6f349f498ad6ea2ffb1e3",
strip_prefix = "unicode-width-0.1.8",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.unicode-width-0.1.8.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__unicode_xid__0_2_1",
url = "https://crates.io/api/v1/crates/unicode-xid/0.2.1/download",
type = "tar.gz",
sha256 = "f7fe0bb3479651439c9112f72b6c505038574c9fbb575ed1bf3b797fa39dd564",
strip_prefix = "unicode-xid-0.2.1",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.unicode-xid-0.2.1.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__vec_map__0_8_2",
url = "https://crates.io/api/v1/crates/vec_map/0.8.2/download",
type = "tar.gz",
sha256 = "f1bddf1187be692e79c5ffeab891132dfb0f236ed36a43c7ed39f1165ee20191",
strip_prefix = "vec_map-0.8.2",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.vec_map-0.8.2.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__version_check__0_9_2",
url = "https://crates.io/api/v1/crates/version_check/0.9.2/download",
type = "tar.gz",
sha256 = "b5a972e5669d67ba988ce3dc826706fb0a8b01471c088cb0b6110b805cc36aed",
strip_prefix = "version_check-0.9.2",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.version_check-0.9.2.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__winapi__0_3_9",
url = "https://crates.io/api/v1/crates/winapi/0.3.9/download",
type = "tar.gz",
sha256 = "5c839a674fcd7a98952e593242ea400abe93992746761e38641405d28b00f419",
strip_prefix = "winapi-0.3.9",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.winapi-0.3.9.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__winapi_i686_pc_windows_gnu__0_4_0",
url = "https://crates.io/api/v1/crates/winapi-i686-pc-windows-gnu/0.4.0/download",
type = "tar.gz",
sha256 = "ac3b87c63620426dd9b991e5ce0329eff545bccbbb34f3be09ff6fb6ab51b7b6",
strip_prefix = "winapi-i686-pc-windows-gnu-0.4.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.winapi-i686-pc-windows-gnu-0.4.0.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__winapi_util__0_1_5",
url = "https://crates.io/api/v1/crates/winapi-util/0.1.5/download",
type = "tar.gz",
sha256 = "70ec6ce85bb158151cae5e5c87f95a8e97d2c0c4b001223f33a334e3ce5de178",
strip_prefix = "winapi-util-0.1.5",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.winapi-util-0.1.5.bazel"),
)
maybe(
http_archive,
name = "remote_binary_dependencies__winapi_x86_64_pc_windows_gnu__0_4_0",
url = "https://crates.io/api/v1/crates/winapi-x86_64-pc-windows-gnu/0.4.0/download",
type = "tar.gz",
sha256 = "712e227841d057c1ee1cd2fb22fa7e5a5461ae8e48fa2ca79ec42cfc1931183f",
strip_prefix = "winapi-x86_64-pc-windows-gnu-0.4.0",
build_file = Label("//remote/binary_dependencies/cargo/remote:BUILD.winapi-x86_64-pc-windows-gnu-0.4.0.bazel"),
)
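    # Note (editorial, illustrative): each repository declared above is typically
    # consumed from a BUILD file via a label derived from its generated build_file,
    # e.g. something like "@remote_binary_dependencies__structopt__0_3_20//:structopt".
    # The exact target name is defined by the corresponding BUILD.*.bazel file and is
    # an assumption here, not guaranteed by this macro.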
|
py | b414214199b5c376d38e51f7c0ae7e31b1fe8f5d | from discord import User, DMChannel, Client, Message, Member, Guild
from src.utils.poker.pokerImage import getPokerImage
from src.utils.casino.table.BlackJackTable import BlackJackTable
from src.utils.gamePlayerWaiting.GamePlayerWaiting import GamePlayerWaiting
from src.controller.onMessage.blackJack.stay import blackJackStay
from src.utils.casino.Casino import Casino
from loguru import logger
from pymysql import Connection
from src.model.makeDatabaseConnection import makeDatabaseConnection
async def blackJackGameStart(table: BlackJackTable, message: Message, self: Client, gamePlayerWaiting: GamePlayerWaiting, casino: Casino, db: Connection):
table.gameStart()
await message.channel.send("开始了,底牌已经私聊你们了,请各位查看自己的牌")
playerListStr = ""
myGuild: Guild = self.guilds[0]
for userID in table.players:
playerListStr += str(userID) + ", "
member: Member = await myGuild.fetch_member(userID)
dmChannel: DMChannel = await member.create_dm()
await dmChannel.send("这是你的牌:")
cards = table.viewCards(userID)
await dmChannel.send(file=getPokerImage(cards))
await dmChannel.send("你还要牌吗,要的话,在这里回复\"要\"或者\"不要\"")
await creategamePlayerWaiting(member, message, self, casino, gamePlayerWaiting, db)
logger.info(f"Black Jack started in table {table.inviteMessage.channel.id} with players: {playerListStr}")
async def creategamePlayerWaiting(member: Member, message: Message, self: Client, casino: Casino, gamePlayerWaiting: GamePlayerWaiting, db: Connection):
async def timeoutFun():
dbTemp = makeDatabaseConnection()
await message.channel.send(f"玩家{member.display_name}由于长时间没反应,自动开牌")
await blackJackStay(self, dbTemp, message, casino, member.id, member, gamePlayerWaiting, removeWait=False)
dbTemp.close()
async def warningFun():
await message.channel.send(f"玩家{member.display_name}由于长时间没反应,将会在5秒后自动开牌")
await gamePlayerWaiting.newWait(member.id, timeoutFun, warningFun) |
py | b41421bc0bf8566d6dfa98d41b046db5a416c4c7 | # -*- coding: utf-8 -*-
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import print_function
import airflow
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
args = {
'owner': 'airflow',
'start_date': airflow.utils.dates.days_ago(2),
'provide_context': True
}
dag = DAG(
'example_xcom',
schedule_interval="@once",
default_args=args)
value_1 = [1, 2, 3]
value_2 = {'a': 'b'}
def push(**kwargs):
# pushes an XCom without a specific target
kwargs['ti'].xcom_push(key='value from pusher 1', value=value_1)
def push_by_returning(**kwargs):
# pushes an XCom without a specific target, just by returning it
return value_2
def puller(**kwargs):
ti = kwargs['ti']
# get value_1
v1 = ti.xcom_pull(key=None, task_ids='push')
assert v1 == value_1
# get value_2
v2 = ti.xcom_pull(task_ids='push_by_returning')
assert v2 == value_2
# get both value_1 and value_2
v1, v2 = ti.xcom_pull(key=None, task_ids=['push', 'push_by_returning'])
assert (v1, v2) == (value_1, value_2)
push1 = PythonOperator(
task_id='push', dag=dag, python_callable=push)
push2 = PythonOperator(
task_id='push_by_returning', dag=dag, python_callable=push_by_returning)
pull = PythonOperator(
task_id='puller', dag=dag, python_callable=puller)
pull.set_upstream([push1, push2])
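# Equivalent, more idiomatic wiring in recent Airflow releases (illustrative):
#   [push1, push2] >> pull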
|
py | b414223fe8a41a3612237eec85a882392bfdd2d9 | from __future__ import print_function
import glob
from sys import argv
import numpy as np
import os
dir1 = argv[1]
dir2 = argv[2]
files1 = sorted(glob.glob(dir1+"/V_t*"), key=os.path.getmtime)
files2 = sorted(glob.glob(dir2+"/V_t*"), key=os.path.getmtime)
data_final = []
for f1, f2 in zip(files1, files2):
data1 = open(f1, "r")
data2 = open(f2, "r")
data_list1 = []
data_list2 = []
for line in data1:
line = line.strip().split(",")
data_list1.append(float(line[4]))
for line in data2:
line = line.strip().split(",")
data_list2.append(float(line[4]))
x =np.array(data_list1)
y =np.array(data_list2)
error = (np.sqrt(sum( np.power(x-y,2) )) / np.sqrt( sum( np.power(y,2))))*100.0
    print(f1, f2, error)
data_final.append(error)
z = np.array(data_final)
print(z.mean(axis=None))
|
py | b414234206ba41fd1c9968723a6775c349704662 | # #Copyright (c) 2012, GPy authors (see AUTHORS.txt).
# Licensed under the BSD 3-clause license (see LICENSE.txt)
try:
import Tango
import pylab as pb
except:  # Tango and pylab are optional plotting dependencies; ignore any import-time failure
pass
import numpy as np
def ax_default(fignum, ax):
if ax is None:
fig = pb.figure(fignum)
ax = fig.add_subplot(111)
else:
fig = ax.figure
return fig, ax
def meanplot(x, mu, color=Tango.colorsHex['darkBlue'], ax=None, fignum=None, linewidth=2,**kw):
_, axes = ax_default(fignum, ax)
return axes.plot(x,mu,color=color,linewidth=linewidth,**kw)
def gpplot(x, mu, lower, upper, edgecol=Tango.colorsHex['darkBlue'], fillcol=Tango.colorsHex['lightBlue'], ax=None, fignum=None, **kwargs):
_, axes = ax_default(fignum, ax)
mu = mu.flatten()
x = x.flatten()
lower = lower.flatten()
upper = upper.flatten()
plots = []
#here's the mean
plots.append(meanplot(x, mu, edgecol, axes))
#here's the box
kwargs['linewidth']=0.5
if not 'alpha' in kwargs.keys():
kwargs['alpha'] = 0.3
plots.append(axes.fill(np.hstack((x,x[::-1])),np.hstack((upper,lower[::-1])),color=fillcol,**kwargs))
#this is the edge:
plots.append(meanplot(x, upper,color=edgecol,linewidth=0.2,ax=axes))
plots.append(meanplot(x, lower,color=edgecol,linewidth=0.2,ax=axes))
return plots
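# Minimal usage sketch for gpplot (illustrative only; assumes pylab/Tango imported
# successfully above and that mu/lower/upper come from a fitted GP):
#   x = np.linspace(0, 1, 50)
#   mu = np.sin(2 * np.pi * x)
#   gpplot(x, mu, mu - 0.2, mu + 0.2)
#   pb.show()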
def removeRightTicks(ax=None):
ax = ax or pb.gca()
for i, line in enumerate(ax.get_yticklines()):
if i%2 == 1: # odd indices
line.set_visible(False)
def removeUpperTicks(ax=None):
ax = ax or pb.gca()
for i, line in enumerate(ax.get_xticklines()):
if i%2 == 1: # odd indices
line.set_visible(False)
def fewerXticks(ax=None,divideby=2):
ax = ax or pb.gca()
ax.set_xticks(ax.get_xticks()[::divideby])
def align_subplots(N,M,xlim=None, ylim=None):
"""make all of the subplots have the same limits, turn off unnecessary ticks"""
#find sensible xlim,ylim
if xlim is None:
xlim = [np.inf,-np.inf]
for i in range(N*M):
pb.subplot(N,M,i+1)
xlim[0] = min(xlim[0],pb.xlim()[0])
xlim[1] = max(xlim[1],pb.xlim()[1])
if ylim is None:
ylim = [np.inf,-np.inf]
for i in range(N*M):
pb.subplot(N,M,i+1)
ylim[0] = min(ylim[0],pb.ylim()[0])
ylim[1] = max(ylim[1],pb.ylim()[1])
for i in range(N*M):
pb.subplot(N,M,i+1)
pb.xlim(xlim)
pb.ylim(ylim)
if (i)%M:
pb.yticks([])
else:
removeRightTicks()
if i<(M*(N-1)):
pb.xticks([])
else:
removeUpperTicks()
def align_subplot_array(axes,xlim=None, ylim=None):
"""
    Make all of the axes in the array have the same limits, turn off unnecessary ticks
use pb.subplots() to get an array of axes
"""
#find sensible xlim,ylim
if xlim is None:
xlim = [np.inf,-np.inf]
for ax in axes.flatten():
xlim[0] = min(xlim[0],ax.get_xlim()[0])
xlim[1] = max(xlim[1],ax.get_xlim()[1])
if ylim is None:
ylim = [np.inf,-np.inf]
for ax in axes.flatten():
ylim[0] = min(ylim[0],ax.get_ylim()[0])
ylim[1] = max(ylim[1],ax.get_ylim()[1])
N,M = axes.shape
for i,ax in enumerate(axes.flatten()):
ax.set_xlim(xlim)
ax.set_ylim(ylim)
if (i)%M:
ax.set_yticks([])
else:
removeRightTicks(ax)
if i<(M*(N-1)):
ax.set_xticks([])
else:
removeUpperTicks(ax)
def x_frame1D(X,plot_limits=None,resolution=None):
"""
Internal helper function for making plots, returns a set of input values to plot as well as lower and upper limits
"""
assert X.shape[1] ==1, "x_frame1D is defined for one-dimensional inputs"
if plot_limits is None:
xmin,xmax = X.min(0),X.max(0)
xmin, xmax = xmin-0.2*(xmax-xmin), xmax+0.2*(xmax-xmin)
elif len(plot_limits)==2:
xmin, xmax = plot_limits
else:
raise ValueError, "Bad limits for plotting"
Xnew = np.linspace(xmin,xmax,resolution or 200)[:,None]
return Xnew, xmin, xmax
def x_frame2D(X,plot_limits=None,resolution=None):
"""
Internal helper function for making plots, returns a set of input values to plot as well as lower and upper limits
"""
assert X.shape[1] ==2, "x_frame2D is defined for two-dimensional inputs"
if plot_limits is None:
xmin,xmax = X.min(0),X.max(0)
xmin, xmax = xmin-0.2*(xmax-xmin), xmax+0.2*(xmax-xmin)
elif len(plot_limits)==2:
xmin, xmax = plot_limits
else:
raise ValueError, "Bad limits for plotting"
resolution = resolution or 50
xx,yy = np.mgrid[xmin[0]:xmax[0]:1j*resolution,xmin[1]:xmax[1]:1j*resolution]
Xnew = np.vstack((xx.flatten(),yy.flatten())).T
return Xnew, xx, yy, xmin, xmax
|
py | b414238c478b117b14af9e9359f6111fc9d0ddcb | # -*- coding: utf-8 -*-
from operator import attrgetter
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType
from pyangbind.lib.yangtypes import RestrictedClassType
from pyangbind.lib.yangtypes import TypedListType
from pyangbind.lib.yangtypes import YANGBool
from pyangbind.lib.yangtypes import YANGListType
from pyangbind.lib.yangtypes import YANGDynClass
from pyangbind.lib.yangtypes import ReferenceType
from pyangbind.lib.base import PybindBase
from collections import OrderedDict
from decimal import Decimal
from bitarray import bitarray
import six
# PY3 support of some PY2 keywords (needs improved)
if six.PY3:
import builtins as __builtin__
long = int
elif six.PY2:
import __builtin__
class state(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module openconfig-interfaces - based on the path /interfaces/interface/routed-vlan/ipv4/state. Each member element of
the container is represented as a class variable - with a specific
YANG type.
YANG Description: Top level IPv4 operational state data
"""
__slots__ = ("_path_helper", "_extmethods", "__enabled", "__mtu")
_yang_name = "state"
_pybind_generated_by = "container"
def __init__(self, *args, **kwargs):
self._path_helper = False
self._extmethods = False
self.__enabled = YANGDynClass(
base=YANGBool,
default=YANGBool("true"),
is_leaf=True,
yang_name="enabled",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/interfaces/ip",
defining_module="openconfig-if-ip",
yang_type="boolean",
is_config=False,
)
self.__mtu = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..65535"]}, int_size=16
),
restriction_dict={"range": ["68..max"]},
),
is_leaf=True,
yang_name="mtu",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/interfaces/ip",
defining_module="openconfig-if-ip",
yang_type="uint16",
is_config=False,
)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path() + [self._yang_name]
else:
return ["interfaces", "interface", "routed-vlan", "ipv4", "state"]
def _get_enabled(self):
"""
Getter method for enabled, mapped from YANG variable /interfaces/interface/routed_vlan/ipv4/state/enabled (boolean)
YANG Description: Controls whether IPv4 is enabled or disabled on this
interface. When IPv4 is enabled, this interface is
connected to an IPv4 stack, and the interface can send
and receive IPv4 packets.
"""
return self.__enabled
def _set_enabled(self, v, load=False):
"""
Setter method for enabled, mapped from YANG variable /interfaces/interface/routed_vlan/ipv4/state/enabled (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_enabled is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_enabled() directly.
YANG Description: Controls whether IPv4 is enabled or disabled on this
interface. When IPv4 is enabled, this interface is
connected to an IPv4 stack, and the interface can send
and receive IPv4 packets.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=YANGBool,
default=YANGBool("true"),
is_leaf=True,
yang_name="enabled",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/interfaces/ip",
defining_module="openconfig-if-ip",
yang_type="boolean",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """enabled must be of a type compatible with boolean""",
"defined-type": "boolean",
"generated-type": """YANGDynClass(base=YANGBool, default=YANGBool("true"), is_leaf=True, yang_name="enabled", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/interfaces/ip', defining_module='openconfig-if-ip', yang_type='boolean', is_config=False)""",
}
)
self.__enabled = t
if hasattr(self, "_set"):
self._set()
def _unset_enabled(self):
self.__enabled = YANGDynClass(
base=YANGBool,
default=YANGBool("true"),
is_leaf=True,
yang_name="enabled",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/interfaces/ip",
defining_module="openconfig-if-ip",
yang_type="boolean",
is_config=False,
)
def _get_mtu(self):
"""
Getter method for mtu, mapped from YANG variable /interfaces/interface/routed_vlan/ipv4/state/mtu (uint16)
YANG Description: The size, in octets, of the largest IPv4 packet that the
interface will send and receive.
The server may restrict the allowed values for this leaf,
depending on the interface's type.
If this leaf is not configured, the operationally used MTU
depends on the interface's type.
"""
return self.__mtu
def _set_mtu(self, v, load=False):
"""
Setter method for mtu, mapped from YANG variable /interfaces/interface/routed_vlan/ipv4/state/mtu (uint16)
If this variable is read-only (config: false) in the
source YANG file, then _set_mtu is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_mtu() directly.
YANG Description: The size, in octets, of the largest IPv4 packet that the
interface will send and receive.
The server may restrict the allowed values for this leaf,
depending on the interface's type.
If this leaf is not configured, the operationally used MTU
depends on the interface's type.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int,
restriction_dict={"range": ["0..65535"]},
int_size=16,
),
restriction_dict={"range": ["68..max"]},
),
is_leaf=True,
yang_name="mtu",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/interfaces/ip",
defining_module="openconfig-if-ip",
yang_type="uint16",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """mtu must be of a type compatible with uint16""",
"defined-type": "uint16",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..65535']},int_size=16), restriction_dict={'range': ['68..max']}), is_leaf=True, yang_name="mtu", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/interfaces/ip', defining_module='openconfig-if-ip', yang_type='uint16', is_config=False)""",
}
)
self.__mtu = t
if hasattr(self, "_set"):
self._set()
def _unset_mtu(self):
self.__mtu = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..65535"]}, int_size=16
),
restriction_dict={"range": ["68..max"]},
),
is_leaf=True,
yang_name="mtu",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/interfaces/ip",
defining_module="openconfig-if-ip",
yang_type="uint16",
is_config=False,
)
enabled = __builtin__.property(_get_enabled)
mtu = __builtin__.property(_get_mtu)
_pyangbind_elements = OrderedDict([("enabled", enabled), ("mtu", mtu)])
|
py | b41423a7fe69e81d3e6a33a4f37a3780e396e32b | from django.db import models
from django.contrib.auth.models import AbstractBaseUser
from django.contrib.auth.models import PermissionsMixin
from django.contrib.auth.models import BaseUserManager
# Create your models here.
class UserProfileManager(BaseUserManager):
"""manages for user profiles"""
def create_user(self,email, name, password=None):
"""create a new user profile"""
if not email:
            raise ValueError('user must have an email address')
email = self.normalize_email(email)
user = self.model(email=email, name=name)
user.set_password(password)
user.save(using=self._db)
return user
def create_superuser(self, email, name, password):
"""create and save new super user with given details"""
user = self.create_user(email, name, password)
user.is_superuser = True
user.is_staff = True
user.save(using = self._db)
return user
class UserProfile(AbstractBaseUser, PermissionsMixin):
"""Database model for user in the system"""
    email = models.EmailField(max_length=255, unique=True)
name = models.CharField(max_length=225)
is_active = models.BooleanField(default=True)
is_staff = models.BooleanField(default=False)
objects = UserProfileManager()
USERNAME_FIELD = 'email'
REQUIRED_FIELDS = ['name']
def get_full_name(self):
"""tetrieve full name of user"""
return self.name
def get_short_name(self):
"""Retrieve short name of the user"""
return self.name
def __str__(self):
"""Return string respresentation of our
user"""
return self.email |
py | b41424fe7496919b577c0b3c09d8eafb2a6411cb | """
archive_defn.py
Copyright 2016 University of Melbourne.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License.
You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and limitations under the License.
"""
import os
from fourdvar.params.root_path_defn import store_path
#Settings for archive processes
#location of archive directory
archive_path = os.path.join( store_path, 'archive' )
#archive model output of each successful iteration
iter_model_output = False
#archive observation-lite of each successful iteration
iter_obs_lite = True
#experiment name & name of directory to save results in
experiment = 'example_experiment'
#description is copied into a txt file in the experiment directory
description = """This is a test of the fourdvar system.
The description here should contain details of the experiment
and is written to the description text file."""
#name of txt file holding the description, if empty string ('') file is not created.
desc_name = 'description.txt'
#if True, delete any existing archive with the same name.
#if False, create a new archive name to save results into.
overwrite = True
#pattern used to create new archive name if overwrite is False
#<E> is replaced with the experiment name
#<I> is replaced with a number to make a unique directory name
#if a tag is missing the assumed format is: <E>extension<I>
extension = '<E>_vsn<I>'
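#e.g. with the defaults above, '<E>_vsn<I>' would expand to names like
#'example_experiment_vsn1', 'example_experiment_vsn2', ... (numbering illustrative)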
#cmaq datadef files can be archived. These require an archive name pattern
#patterns can include <YYYYMMDD>, <YYYYDDD> or <YYYY-MM-DD> tags to specify day
#initial conditions file
icon_file = 'icon.nc'
#emission file, requires a tag to map date
emis_file = 'emis.<YYYYMMDD>.nc'
#concentration file, requires a tag to map date
conc_file = 'conc.<YYYYMMDD>.nc'
#adjoint forcing file, requires a tag to map date
force_file = 'force.<YYYYMMDD>.nc'
#concentration sensitivity file, requires a tag to map date
sens_conc_file = 'sens_conc.<YYYYMMDD>.nc'
#emission sensitivity file, requires a tag to map date
sens_emis_file = 'sens_emis.<YYYYMMDD>.nc'
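#e.g. for 1 Jan 2016 the <YYYYMMDD> tag resolves emis_file to 'emis.20160101.nc'
#(date chosen purely for illustration)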
|
py | b414250e669ed8f1895157d1512f885be41d16b3 | import uuid
from datetime import datetime
from flask import current_app
from notifications_utils.template import SMSMessageTemplate
from app import notify_celery, statsd_client
from app.clients import ClientException
from app.clients.sms.firetext import get_firetext_responses
from app.clients.sms.mmg import get_mmg_responses
from app.dao import notifications_dao
from app.dao.templates_dao import dao_get_template_by_id
from app.models import NOTIFICATION_PENDING
from app.notifications.notifications_ses_callback import (
check_and_queue_callback_task,
)
sms_response_mapper = {
'MMG': get_mmg_responses,
'Firetext': get_firetext_responses,
}
@notify_celery.task(bind=True, name="process-sms-client-response", max_retries=5, default_retry_delay=300)
def process_sms_client_response(self, status, provider_reference, client_name, detailed_status_code=None):
# validate reference
try:
uuid.UUID(provider_reference, version=4)
except ValueError as e:
current_app.logger.exception(f'{client_name} callback with invalid reference {provider_reference}')
raise e
response_parser = sms_response_mapper[client_name]
# validate status
try:
notification_status, detailed_status = response_parser(status, detailed_status_code)
current_app.logger.info(
f'{client_name} callback returned status of {notification_status}'
f'({status}): {detailed_status}({detailed_status_code}) for reference: {provider_reference}'
)
except KeyError:
_process_for_status(
notification_status='technical-failure',
client_name=client_name,
provider_reference=provider_reference
)
raise ClientException(f'{client_name} callback failed: status {status} not found.')
_process_for_status(
notification_status=notification_status,
client_name=client_name,
provider_reference=provider_reference,
detailed_status_code=detailed_status_code
)
def _process_for_status(notification_status, client_name, provider_reference, detailed_status_code=None):
# record stats
notification = notifications_dao.update_notification_status_by_id(
notification_id=provider_reference,
status=notification_status,
sent_by=client_name.lower(),
detailed_status_code=detailed_status_code
)
if not notification:
return
statsd_client.incr('callback.{}.{}'.format(client_name.lower(), notification_status))
if notification.sent_at:
statsd_client.timing_with_dates(
f'callback.{client_name.lower()}.{notification_status}.elapsed-time',
datetime.utcnow(),
notification.sent_at
)
if notification.billable_units == 0:
service = notification.service
template_model = dao_get_template_by_id(notification.template_id, notification.template_version)
template = SMSMessageTemplate(
template_model.__dict__,
values=notification.personalisation,
prefix=service.name,
show_prefix=service.prefix_sms,
)
notification.billable_units = template.fragment_count
notifications_dao.dao_update_notification(notification)
if notification_status != NOTIFICATION_PENDING:
check_and_queue_callback_task(notification)
|
py | b414251e821af6c31590e492e98237b5cee4cdd0 | # constant values required
MALE = 'Male'
FEMALE = 'Female'
MEMBER_ADDITION_SUCCEEDED = 'MEMBER_ADDITION_SUCCEEDED'
SPOUSE_ADDITION_SUCCEEDED = 'SPOUSE_ADDITION_SUCCEEDED'
SPOUSE_ADDITION_FAILED = 'SPOUSE_ADDITION_FAILED'
CHILD_ADDITION_SUCCEEDED = 'CHILD_ADDITION_SUCCEEDED'
CHILD_ADDITION_FAILED = 'CHILD_ADDITION_FAILED'
PERSON_NOT_FOUND = 'PERSON_NOT_FOUND'
NONE = 'NONE'
|
py | b41425b129b3c1fc770b3c71612b6d71344ef8d3 | """
Where Sun OS platform specific modules go.
"""
__all__ = ['procfs']
|
py | b4142618ba3b03977116f63522c710a536929b51 | from os import environ
import pathlib
import posixpath
from premailer import transform
import re
files = [
"code.html",
"match.html",
"reset.html",
]
def handler():
in_path = posixpath.join(pathlib.Path(__file__).parent.absolute(), "./raw/")
out_path = posixpath.join(pathlib.Path(__file__).parent.absolute(), "./minified/")
for file_name in files:
print("processing %s" % (file_name))
template = open(
posixpath.join(in_path, file_name), encoding="utf-8", mode="r"
).read()
replaced = transform(
template,
base_path=in_path,
disable_validation=True,
cache_css_parsing=False,
disable_link_rewrites=True,
exclude_pseudoclasses=True,
)
replaced = re.sub(r"#([a-z_]+)#", r"{{\1}}", replaced)
replaced = re.sub(r"\u00A0", "", replaced)
f = open(posixpath.join(out_path, file_name), encoding="utf-8", mode="w",)
f.write(replaced)
f.close()
if __name__ == "__main__":
handler()
|
py | b414265056d1b2720de67c589596004243e5bcd9 | # coding: utf-8
from setuptools import setup
with open('README.rst', 'r') as f:
long_description = f.read()
setup(
name='requests-awsv2-auth',
version='1.1.3',
url='https://github.com/brutasse/requests-awsv2-auth',
license='BSD',
author=u'Bruno Renié',
description=('AWS v2 auth support for Python-Requests.'),
long_description=long_description,
py_modules=('awsv2_auth',),
zip_safe=False,
include_package_data=True,
platforms='any',
classifiers=(
'Intended Audience :: Developers',
'Intended Audience :: System Administrators',
'License :: OSI Approved :: BSD License',
'Operating System :: OS Independent',
'Programming Language :: Python',
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 3',
),
install_requires=(
'requests',
),
)
|
py | b4142705e509c97dfc9663a0b27b35116de0427c | import urllib.request
import json
from datetime import datetime
from i3pystatus import IntervalModule
from i3pystatus.core.util import internet, require, user_open
import locale
import threading
from contextlib import contextmanager
LOCALE_LOCK = threading.Lock()
@contextmanager
def setlocale(name):
# To deal with locales only in this module and keep it thread save
with LOCALE_LOCK:
saved = locale.setlocale(locale.LC_ALL)
try:
yield locale.setlocale(locale.LC_ALL, name)
finally:
locale.setlocale(locale.LC_ALL, saved)
class Bitcoin(IntervalModule):
"""
This module fetches and displays current Bitcoin market prices and
optionally monitors transactions to and from a list of user-specified
wallet addresses. Market data is pulled from the Bitaps Market
API <https://bitaps.com> and it is possible to specify
the exchange to be monitored.
Transaction data is pulled from blockchain.info
<https://blockchain.info/api/blockchain_api>.
.. rubric:: Available formatters
* {last_price}
* {ask_price}
* {bid_price}
* {open_price}
* {volume}
* {volume_thousand}
* {volume_percent}
* {age}
* {status}
* {last_tx_type}
* {last_tx_addr}
* {last_tx_value}
* {balance_btc}
* {balance_fiat}
* {symbol}
"""
settings = (
("format", "Format string used for output."),
("currency", "Base fiat currency used for pricing."),
("wallet_addresses", "List of wallet address(es) to monitor."),
("color", "Standard color"),
("exchange", "Get ticker from a custom exchange instead"),
("colorize", "Enable color change on price increase/decrease"),
("color_up", "Color for price increases"),
("color_down", "Color for price decreases"),
("interval", "Update interval."),
("symbol", "Symbol for bitcoin sign"),
"status"
)
format = "{symbol} {status}{last_price}"
currency = "USD"
exchange = "bitstamp"
symbol = "\uF15A"
wallet_addresses = ""
color = "#FFFFFF"
colorize = False
color_up = "#00FF00"
color_down = "#FF0000"
interval = 600
status = {
"price_up": "▲",
"price_down": "▼",
}
on_leftclick = "electrum"
on_rightclick = ["open_something", "https://bitaps.com/"]
_price_prev = 0
def _get_age(self, bitcoinaverage_timestamp):
with setlocale('C'): # Deal with locales (months name differ)
# Assume format is always utc, to avoid import pytz
diff = datetime.utcnow() - \
datetime.fromtimestamp(bitcoinaverage_timestamp)
return int(diff.total_seconds())
def _query_api(self, api_url):
url = "{}/{}".format(api_url, self.exchange.upper())
response = urllib.request.urlopen(url).read().decode("utf-8")
return json.loads(response)
def _fetch_price_data(self):
api_url = "https://api.bitaps.com/market/v1/tickers"
ret = self._query_api(api_url)["data"]
exchange = ret[self.exchange.upper()]["pairs"]["BTC{}".format(self.currency.upper())]
# Adapt values to global ticker format
exchange['24h_avg'] = None
return exchange
def _fetch_blockchain_data(self):
api = "https://blockchain.info/multiaddr?active="
addresses = "|".join(self.wallet_addresses)
url = "{}{}".format(api, addresses)
return json.loads(urllib.request.urlopen(url).read().decode("utf-8"))
@require(internet)
def run(self):
price_data = self._fetch_price_data()
fdict = {
"symbol": self.symbol,
"open": price_data["open"],
"ask_price": price_data["ask"],
"bid_price": price_data["bid"],
"last_price": price_data["last"],
"volume": price_data["volume"],
"volume_thousand": float(price_data["volume"]) / 1000,
"age": self._get_age(price_data['timestamp'])
}
if self._price_prev and fdict["last_price"] > self._price_prev:
color = self.color_up
fdict["status"] = self.status["price_up"]
elif self._price_prev and fdict["last_price"] < self._price_prev:
color = self.color_down
fdict["status"] = self.status["price_down"]
else:
color = self.color
fdict["status"] = ""
self._price_prev = fdict["last_price"]
if not self.colorize:
color = self.color
if self.wallet_addresses:
blockchain_data = self._fetch_blockchain_data()
wallet_data = blockchain_data["wallet"]
balance_btc = wallet_data["final_balance"] / 100000000
fdict["balance_btc"] = round(balance_btc, 2)
balance_fiat = fdict["balance_btc"] * fdict["last_price"]
fdict["balance_fiat"] = round(balance_fiat, 2)
fdict["total_sent"] = wallet_data["total_sent"]
fdict["total_received"] = wallet_data["total_received"]
fdict["transactions"] = wallet_data["n_tx"]
if fdict["transactions"]:
last_tx = blockchain_data["txs"][0]
fdict["last_tx_addr"] = last_tx["out"][0]["addr"]
fdict["last_tx_value"] = last_tx["out"][0]["value"] / 100000000
if fdict["last_tx_addr"] in self.wallet_addresses:
fdict["last_tx_type"] = "recv"
else:
fdict["last_tx_type"] = "sent"
self.data = fdict
self.output = {
"full_text": self.format.format(**fdict),
"color": color,
}
def open_something(self, url_or_command):
"""
Wrapper function, to pass the arguments to user_open
"""
user_open(url_or_command)
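# Illustrative i3pystatus registration sketch (not part of this module; the exact
# call follows the generic i3pystatus pattern and is an assumption here):
#
#   from i3pystatus import Status
#   status = Status()
#   status.register("bitcoin", format="{symbol} {status}{last_price}", colorize=True)
#   status.run()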
|
py | b4142765cc15ed15ba99a73564375b17f5d376ea | import collections
from copy import copy
from inspect import signature
from forbiddenfruit import curse, curses
def issequenceforme(obj):
if isinstance(obj, str):
return False
return isinstance(obj, collections.Sequence)
@curses(list, "from")
@classmethod
def _from(cls, obj, mapfn=None):
mapfn = mapfn or (lambda x: x)
return cls(list(map(mapfn, list(obj))))
@curses(list, "constructor")
@property
def constructor(self):
"""Returns the function that created the Array object's prototype"""
return type(self)
def get_length(self):
"""Returns the current length of the Array"""
return len(self)
def set_length(self, new_length):
if new_length < 0:
raise ValueError("New Array length must be positive")
for i in self[new_length:]:
self.remove(i)
old_len = len(self)
if new_length > old_len:
for i in range(
new_length - old_len
): # allow setting a higher length than currently existing
self.append(None)
curse(list, "length", property(get_length, set_length))
@curses(list, "prototype")
@property
def prototype(self):
# TODO: impl setattr via curses?
raise NotImplementedError("Not possible in Python yet, coming soon")
@curses(list, "concat")
def concat(self, *arrays):
"""Merge the Array with any other Arrays."""
for arr in arrays:
self.extend(arr)
@curses(list, "copyWithin")
def copyWithin(self, target, start=0, end=None):
if target < 0:
target = len(self) + target
if end is None:
end = len(self)
array_to_copy = self[start:end][: len(self) - target]
tmp = copy(self)
for i, j in enumerate(array_to_copy):
tmp[target + i] = j
return tmp
@curses(list, "entries")
def entries(self):
for i, j in enumerate(self):
yield [i, j]
@curses(list, "every")
def every(self, callback, this=None):
self = this or self
params = signature(callback).parameters
if len(params) == 1:
for i in self:
if callback(i) is False:
return False
elif len(params) == 2:
for i, j in enumerate(self):
if callback(j, i) is False:
return False
elif len(params) == 3:
for i, j in enumerate(self):
if callback(j, i, self) is False:
return False
return True
@curses(list, "forEach")
def forEach(self, callback, this=None):
self = this or self
params = signature(callback).parameters
if len(params) == 1:
for i in self:
callback(i)
elif len(params) == 2:
for i, j in enumerate(self):
callback(j, i)
elif len(params) == 3:
for i, j in enumerate(self):
callback(j, i, self)
@curses(list, "filter")
def filter(self, callback, this=None):
self = this or self
params = signature(callback).parameters
passed = []
if len(params) == 1:
for i in self:
if callback(i) is True:
passed.append(i)
elif len(params) == 2:
for i, j in enumerate(self):
if callback(j, i) is True:
passed.append(j)
elif len(params) == 3:
for i, j in enumerate(self):
if callback(j, i, self) is True:
passed.append(j)
return passed
@curses(list, "map")
def _map(self, callback, this=None):
self = this or self
params = signature(callback).parameters
res = []
if len(params) == 1:
for i in self:
res.append(callback(i))
elif len(params) == 2:
for i, j in enumerate(self):
res.append(callback(j, i))
elif len(params) == 3:
for i, j in enumerate(self):
res.append(callback(j, i, self))
return res
@curses(list, "findIndex")
def findIndex(self, callback, this=None):
self = this or self
params = signature(callback).parameters
if len(params) == 1:
for i, j in enumerate(self):
if callback(j) is True:
return i
elif len(params) == 2:
for i, j in enumerate(self):
if callback(j, i) is True:
return i
elif len(params) == 3:
for i, j in enumerate(self):
if callback(j, i, self) is True:
return i
return None
@curses(list, "find")
def find(self, callback, this=None):
self = this or self
params = signature(callback).parameters
if len(params) == 1:
for i in self:
if callback(i) is True:
return i
elif len(params) == 2:
for i, j in enumerate(self):
if callback(j, i) is True:
return j
elif len(params) == 3:
for i, j in enumerate(self):
if callback(j, i, self) is True:
return j
return None
@curses(list, "fill")
def fill(self, val, start=0, end=None):
if end is None or end > len(self):
end = len(self)
for i in range(start, end):
self[i] = val
return self
@curses(list, "includes")
def includes(self, item, start=0):
return item in self[start:]
@curses(list, "indexOf")
def indexOf(self, item, start=0):
try:
return self[start:].index(item)
except ValueError: # we want it to be the exact JS way and return -1 if not found
return -1
@curses(list, "lastIndexOf")
def lastIndexOf(self, item, start=-1):
try:
return len(self) - 1 - self[::start].index(item)
except ValueError: # we want it to be the exact JS way and return -1 if not found
return -1
@curses(list, "push")
def push(self, *items):
for i in items:
self.append(i)
return len(self)
@curses(list, "reduce")
def reduce(self, callback, initial=None):
params = signature(callback).parameters
if initial is None:
ret = self[0]
idx = 1
else:
ret = initial
idx = 0
while idx < len(self):
if len(params) == 2:
ret = callback(ret, self[idx])
elif len(params) == 3:
ret = callback(ret, self[idx], idx)
elif len(params) == 4:
ret = callback(ret, self[idx], idx, self)
idx += 1
return ret
@curses(list, "reduceRight")
def reduceRight(self, callback, initial=None):
params = signature(callback).parameters
self_2 = self[::-1]
if initial is None:
ret = self_2[0]
idx = 1
else:
ret = initial
idx = 0
while idx < len(self_2):
if len(params) == 2:
ret = callback(ret, self_2[idx])
elif len(params) == 3:
ret = callback(ret, self_2[idx], idx)
elif len(params) == 4:
ret = callback(ret, self_2[idx], idx, self)
idx += 1
return ret
@curses(list, "reverse")
def reverse(self):
old = copy(self)
for i, j in enumerate(old):
self[-(i + 1)] = j
return self
@curses(list, "shift")
def shift(self):
i = self[0]
del self[0]
return i
@curses(list, "slice")
def slice(self, start, end):
return self[start:end]
@curses(list, "some")
def some(self, callback, this=None):
self = this or self
params = signature(callback).parameters
if len(params) == 1:
for i in self:
if callback(i) is True:
return True
elif len(params) == 2:
for i, j in enumerate(self):
if callback(j, i) is True:
return True
elif len(params) == 3:
for i, j in enumerate(self):
if callback(j, i, self) is True:
return True
return False
_old_sort = list.sort
@curses(list, "sort")
def _sort(self, func=None):
if not func:
_old_sort(self)
else:
_old_sort(self, key=func)
@curses(list, "splice")
def splice(self, index, delete_count=0, *added):
out = []
    # delete delete_count consecutive elements starting at index; always delete at
    # the same position because the remaining elements shift left after each del
    for _ in range(delete_count):
        out.append(self[index])
        del self[index]
for i, j in enumerate(added):
self.insert(index + i, j)
return out
@curses(list, "toString")
def toString(self):
return ",".join(str(i) for i in self)
@curses(list, "unshift")
def unshift(self, *elements):
self[0:0] = elements
return len(self)
@curses(list, "valueOf")
def valueOf(self):
return self
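# Minimal demonstration of a few of the cursed methods above (illustrative only;
# requires the forbiddenfruit package and running this module directly):
if __name__ == "__main__":
    nums = [1, 2, 3, 4]
    print(nums.includes(3))           # True
    print(nums.indexOf(5))            # -1, the JS-style "not found" result
    print(nums.map(lambda x: x * 2))  # [2, 4, 6, 8]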
|
py | b4142816843566f6fd905f712f1e8d5d561d557b | '''
Created on Aug 1, 2017
@author: I310003
'''
#!/usr/bin/env python
# -*- coding: UTF-8 -*-
#
# generated by wxGlade 0.6.8 on Fri Jan 23 22:59:56 2015
#
import wx
# begin wxGlade: dependencies
import gettext
# end wxGlade
# begin wxGlade: extracode
# end wxGlade
class MyFrame(wx.Frame):
def __init__(self, *args, **kwds):
# begin wxGlade: MyFrame.__init__
kwds["style"] = wx.DEFAULT_FRAME_STYLE
wx.Frame.__init__(self, *args, **kwds)
self.button_1 = wx.Button(self, wx.ID_ANY, _("Hello World!"))
self.button_1.Bind(wx.EVT_BUTTON, self.OnButton )
self.__set_properties()
self.__do_layout()
# end wxGlade
def __set_properties(self):
# begin wxGlade: MyFrame.__set_properties
self.SetTitle(_("wxWidgets button example. pythonspot.com "))
# end wxGlade
def __do_layout(self):
# begin wxGlade: MyFrame.__do_layout
sizer_1 = wx.BoxSizer(wx.VERTICAL)
sizer_1.Add(self.button_1, 0, 0, 0)
self.SetSizer(sizer_1)
sizer_1.Fit(self)
self.Layout()
# end wxGlade
    def OnButton(self, event):
        wx.MessageBox("This is a message.", "Button pressed.")
# end of class MyFrame
if __name__ == "__main__":
gettext.install("app") # replace with the appropriate catalog name
app = wx.PySimpleApp(0)
wx.InitAllImageHandlers()
frame_1 = MyFrame(None, wx.ID_ANY, "")
app.SetTopWindow(frame_1)
frame_1.Show()
app.MainLoop() |
py | b414283d166412254f3ee6b32c39fe9ea6b86ab0 | # coding: utf-8
from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals
import os
import platform
import shutil
import socket
import tempfile
import time
import unittest
import uuid
from six import text_type
from ftplib import error_perm
from ftplib import error_temp
from pyftpdlib.authorizers import DummyAuthorizer
from fs import errors
from fs.opener import open_fs
from fs.ftpfs import FTPFS, ftp_errors
from fs.path import join
from fs.subfs import SubFS
from fs.test import FSTestCases
try:
from pytest import mark
except ImportError:
from . import mark
# Prevent socket timeouts from slowing tests too much
socket.setdefaulttimeout(1)
class TestFTPFSClass(unittest.TestCase):
def test_parse_ftp_time(self):
self.assertIsNone(FTPFS._parse_ftp_time("notreallyatime"))
t = FTPFS._parse_ftp_time("19740705000000")
self.assertEqual(t, 142214400)
def test_parse_mlsx(self):
info = list(
FTPFS._parse_mlsx(["create=19740705000000;modify=19740705000000; /foo"])
)[0]
self.assertEqual(info["details"]["modified"], 142214400)
self.assertEqual(info["details"]["created"], 142214400)
info = list(FTPFS._parse_mlsx(["foo=bar; .."]))
self.assertEqual(info, [])
def test_parse_mlsx_type(self):
lines = [
"Type=cdir;Modify=20180731114724;UNIX.mode=0755; /tmp",
"Type=pdir;Modify=20180731112024;UNIX.mode=0775; /",
"Type=file;Size=331523;Modify=20180731112041;UNIX.mode=0644; a.csv",
"Type=file;Size=368340;Modify=20180731112041;UNIX.mode=0644; b.csv",
]
expected = [
{
"basic": {"name": "a.csv", "is_dir": False},
"ftp": {
"type": "file",
"size": "331523",
"modify": "20180731112041",
"unix.mode": "0644",
},
"details": {"type": 2, "size": 331523, "modified": 1533036041},
},
{
"basic": {"name": "b.csv", "is_dir": False},
"ftp": {
"type": "file",
"size": "368340",
"modify": "20180731112041",
"unix.mode": "0644",
},
"details": {"type": 2, "size": 368340, "modified": 1533036041},
},
]
info = list(FTPFS._parse_mlsx(lines))
self.assertEqual(info, expected)
def test_opener(self):
ftp_fs = open_fs("ftp://will:[email protected]")
self.assertIsInstance(ftp_fs, FTPFS)
self.assertEqual(ftp_fs.host, "ftp.example.org")
ftps_fs = open_fs("ftps://will:[email protected]")
self.assertIsInstance(ftps_fs, FTPFS)
self.assertTrue(ftps_fs.tls)
class TestFTPErrors(unittest.TestCase):
"""Test the ftp_errors context manager."""
def test_manager(self):
mem_fs = open_fs("mem://")
with self.assertRaises(errors.ResourceError):
with ftp_errors(mem_fs, path="foo"):
raise error_temp
with self.assertRaises(errors.OperationFailed):
with ftp_errors(mem_fs):
raise error_temp
with self.assertRaises(errors.InsufficientStorage):
with ftp_errors(mem_fs):
raise error_perm("552 foo")
with self.assertRaises(errors.ResourceNotFound):
with ftp_errors(mem_fs):
raise error_perm("501 foo")
with self.assertRaises(errors.PermissionDenied):
with ftp_errors(mem_fs):
raise error_perm("999 foo")
def test_manager_with_host(self):
mem_fs = open_fs("mem://")
mem_fs.host = "ftp.example.com"
with self.assertRaises(errors.RemoteConnectionError) as err_info:
with ftp_errors(mem_fs):
raise EOFError
self.assertEqual(str(err_info.exception), "lost connection to ftp.example.com")
with self.assertRaises(errors.RemoteConnectionError) as err_info:
with ftp_errors(mem_fs):
raise socket.error
self.assertEqual(
str(err_info.exception), "unable to connect to ftp.example.com"
)
@mark.slow
@unittest.skipIf(platform.python_implementation() == "PyPy", "ftp unreliable with PyPy")
class TestFTPFS(FSTestCases, unittest.TestCase):
user = "user"
pasw = "1234"
@classmethod
def setUpClass(cls):
from pyftpdlib.test import ThreadedTestFTPd
super(TestFTPFS, cls).setUpClass()
cls._temp_dir = tempfile.mkdtemp("ftpfs2tests")
cls._temp_path = os.path.join(cls._temp_dir, text_type(uuid.uuid4()))
os.mkdir(cls._temp_path)
cls.server = ThreadedTestFTPd()
cls.server.shutdown_after = -1
cls.server.handler.authorizer = DummyAuthorizer()
cls.server.handler.authorizer.add_user(
cls.user, cls.pasw, cls._temp_path, perm="elradfmw"
)
cls.server.handler.authorizer.add_anonymous(cls._temp_path)
cls.server.start()
# Don't know why this is necessary on Windows
if platform.system() == "Windows":
time.sleep(0.1)
# Poll until a connection can be made
if not cls.server.is_alive():
raise RuntimeError("could not start FTP server.")
@classmethod
def tearDownClass(cls):
cls.server.stop()
shutil.rmtree(cls._temp_dir)
super(TestFTPFS, cls).tearDownClass()
def make_fs(self):
return open_fs(
"ftp://{}:{}@{}:{}".format(
self.user, self.pasw, self.server.host, self.server.port
)
)
def tearDown(self):
shutil.rmtree(self._temp_path)
os.mkdir(self._temp_path)
super(TestFTPFS, self).tearDown()
def test_ftp_url(self):
self.assertEqual(
self.fs.ftp_url,
"ftp://{}:{}@{}:{}".format(
self.user, self.pasw, self.server.host, self.server.port
),
)
def test_geturl(self):
self.fs.makedir("foo")
self.fs.create("bar")
self.fs.create("foo/bar")
self.assertEqual(
self.fs.geturl("foo"),
"ftp://{}:{}@{}:{}/foo".format(
self.user, self.pasw, self.server.host, self.server.port
),
)
self.assertEqual(
self.fs.geturl("bar"),
"ftp://{}:{}@{}:{}/bar".format(
self.user, self.pasw, self.server.host, self.server.port
),
)
self.assertEqual(
self.fs.geturl("foo/bar"),
"ftp://{}:{}@{}:{}/foo/bar".format(
self.user, self.pasw, self.server.host, self.server.port
),
)
def test_host(self):
self.assertEqual(self.fs.host, self.server.host)
def test_connection_error(self):
fs = FTPFS("ftp.not.a.chance", timeout=1)
with self.assertRaises(errors.RemoteConnectionError):
fs.listdir("/")
with self.assertRaises(errors.RemoteConnectionError):
fs.makedir("foo")
with self.assertRaises(errors.RemoteConnectionError):
fs.open("foo.txt")
def test_getmeta_unicode_path(self):
self.assertTrue(self.fs.getmeta().get("unicode_paths"))
self.fs.features
del self.fs.features["UTF8"]
self.assertFalse(self.fs.getmeta().get("unicode_paths"))
def test_opener_path(self):
self.fs.makedir("foo")
self.fs.writetext("foo/bar", "baz")
ftp_fs = open_fs(
"ftp://user:1234@{}:{}/foo".format(self.server.host, self.server.port)
)
self.assertIsInstance(ftp_fs, SubFS)
self.assertEqual(ftp_fs.readtext("bar"), "baz")
ftp_fs.close()
def test_create(self):
directory = join("home", self.user, "test", "directory")
base = "ftp://user:1234@{}:{}/foo".format(self.server.host, self.server.port)
url = "{}/{}".format(base, directory)
# Make sure unexisting directory raises `CreateFailed`
with self.assertRaises(errors.CreateFailed):
ftp_fs = open_fs(url)
# Open with `create` and try touching a file
with open_fs(url, create=True) as ftp_fs:
ftp_fs.touch("foo")
# Open the base filesystem and check the subdirectory exists
with open_fs(base) as ftp_fs:
self.assertTrue(ftp_fs.isdir(directory))
self.assertTrue(ftp_fs.isfile(join(directory, "foo")))
# Open without `create` and check the file exists
with open_fs(url) as ftp_fs:
self.assertTrue(ftp_fs.isfile("foo"))
# Open with create and check this does fail
with open_fs(url, create=True) as ftp_fs:
self.assertTrue(ftp_fs.isfile("foo"))
class TestFTPFSNoMLSD(TestFTPFS):
def make_fs(self):
ftp_fs = super(TestFTPFSNoMLSD, self).make_fs()
ftp_fs.features
del ftp_fs.features["MLST"]
return ftp_fs
def test_features(self):
pass
@mark.slow
@unittest.skipIf(platform.python_implementation() == "PyPy", "ftp unreliable with PyPy")
class TestAnonFTPFS(FSTestCases, unittest.TestCase):
user = "anonymous"
pasw = ""
@classmethod
def setUpClass(cls):
from pyftpdlib.test import ThreadedTestFTPd
super(TestAnonFTPFS, cls).setUpClass()
cls._temp_dir = tempfile.mkdtemp("ftpfs2tests")
cls._temp_path = os.path.join(cls._temp_dir, text_type(uuid.uuid4()))
os.mkdir(cls._temp_path)
cls.server = ThreadedTestFTPd()
cls.server.shutdown_after = -1
cls.server.handler.authorizer = DummyAuthorizer()
cls.server.handler.authorizer.add_anonymous(cls._temp_path, perm="elradfmw")
cls.server.start()
# Don't know why this is necessary on Windows
if platform.system() == "Windows":
time.sleep(0.1)
# Poll until a connection can be made
if not cls.server.is_alive():
raise RuntimeError("could not start FTP server.")
@classmethod
def tearDownClass(cls):
cls.server.stop()
shutil.rmtree(cls._temp_dir)
super(TestAnonFTPFS, cls).tearDownClass()
def make_fs(self):
return open_fs("ftp://{}:{}".format(self.server.host, self.server.port))
def tearDown(self):
shutil.rmtree(self._temp_path)
os.mkdir(self._temp_path)
super(TestAnonFTPFS, self).tearDown()
def test_ftp_url(self):
self.assertEqual(
self.fs.ftp_url, "ftp://{}:{}".format(self.server.host, self.server.port)
)
def test_geturl(self):
self.fs.makedir("foo")
self.fs.create("bar")
self.fs.create("foo/bar")
self.assertEqual(
self.fs.geturl("foo"),
"ftp://{}:{}/foo".format(self.server.host, self.server.port),
)
self.assertEqual(
self.fs.geturl("bar"),
"ftp://{}:{}/bar".format(self.server.host, self.server.port),
)
self.assertEqual(
self.fs.geturl("foo/bar"),
"ftp://{}:{}/foo/bar".format(self.server.host, self.server.port),
)
|
py | b4142859fc1abc96fac57334a77975fccd880f09 | from django.utils.functional import cached_property
from waldur_core.structure import models as structure_models
from waldur_core.structure.tests import factories as structure_factories
from waldur_core.structure.tests import fixtures as structure_fixtures
from waldur_mastermind.marketplace import models as marketplace_models
from waldur_mastermind.marketplace.tests import factories as marketplace_factories
from waldur_mastermind.marketplace_support import PLUGIN_NAME
from waldur_mastermind.support.tests.fixtures import SupportFixture
class MarketplaceSupportApprovedFixture(SupportFixture):
def __init__(self, provider_customer=None):
self.provider_customer = (
provider_customer or structure_factories.CustomerFactory()
)
self.plan_component
self.order_item
@cached_property
def provider(self):
return marketplace_factories.ServiceProviderFactory(
customer=self.provider_customer
)
@cached_property
def marketplace_offering(self):
return marketplace_factories.OfferingFactory(
customer=self.provider.customer, type=PLUGIN_NAME,
)
@cached_property
def resource(self):
return marketplace_factories.ResourceFactory(
offering=self.marketplace_offering, project=self.project, plan=self.plan
)
@cached_property
def plan(self):
return marketplace_factories.PlanFactory(
unit=marketplace_models.Plan.Units.PER_MONTH,
offering=self.marketplace_offering,
)
@cached_property
def offering_component(self):
return marketplace_factories.OfferingComponentFactory(
offering=self.marketplace_offering,
)
@cached_property
def plan_component(self):
return marketplace_factories.PlanComponentFactory(
plan=self.plan, component=self.offering_component,
)
@cached_property
def order(self):
return marketplace_factories.OrderFactory(project=self.project)
@cached_property
def order_item(self):
return marketplace_factories.OrderItemFactory(
order=self.order,
offering=self.marketplace_offering,
plan=self.plan,
state=marketplace_models.OrderItem.States.DONE,
resource=self.resource,
)
class SupportFixture(structure_fixtures.ProjectFixture):
def __init__(self):
self.plan_component_cpu
self.plan_component_ram
self.new_plan_component_cpu
self.new_plan_component_ram
self.service_provider
self.update_plan_prices()
@cached_property
def offering(self):
return marketplace_factories.OfferingFactory(
type=PLUGIN_NAME, options={'order': []}
)
@cached_property
def plan(self):
plan = marketplace_factories.PlanFactory(
offering=self.offering,
name='Standard plan',
unit_price=0,
unit=marketplace_models.Plan.Units.PER_MONTH,
)
return plan
@cached_property
def plan_component_cpu(self):
return marketplace_factories.PlanComponentFactory(
plan=self.plan, component=self.offering_component_cpu, price=4, amount=1,
)
@cached_property
def plan_component_ram(self):
return marketplace_factories.PlanComponentFactory(
plan=self.plan, component=self.offering_component_ram, price=3, amount=2,
)
@cached_property
def order(self):
return marketplace_factories.OrderFactory(project=self.project)
@cached_property
def order_item(self):
return marketplace_factories.OrderItemFactory(
order=self.order,
offering=self.offering,
attributes={'name': 'item_name', 'description': 'Description'},
plan=self.plan,
)
@cached_property
def offering_component_cpu(self):
return marketplace_factories.OfferingComponentFactory(
offering=self.offering,
billing_type=marketplace_models.OfferingComponent.BillingTypes.FIXED,
)
@cached_property
def offering_component_ram(self):
return marketplace_factories.OfferingComponentFactory(
offering=self.offering,
billing_type=marketplace_models.OfferingComponent.BillingTypes.FIXED,
type='ram',
)
@cached_property
def new_plan(self):
new_plan = marketplace_factories.PlanFactory(
offering=self.offering,
unit_price=0,
name='Small plan',
unit=marketplace_models.Plan.Units.PER_MONTH,
)
return new_plan
@cached_property
def new_plan_component_cpu(self):
return marketplace_factories.PlanComponentFactory(
plan=self.new_plan, component=self.offering_component_cpu, price=3
)
@cached_property
def new_plan_component_ram(self):
return marketplace_factories.PlanComponentFactory(
plan=self.new_plan, component=self.offering_component_ram, price=2
)
@cached_property
def new_order(self):
return marketplace_factories.OrderFactory(project=self.project)
@cached_property
def new_order_item(self):
return marketplace_factories.OrderItemFactory(
offering=self.offering,
attributes={'name': 'item_name_2', 'description': 'Description_2'},
plan=self.plan,
order=self.new_order,
)
@cached_property
def service_provider(self):
return marketplace_factories.ServiceProviderFactory(
customer=self.order_item.offering.customer,
description='ServiceProvider\'s description',
)
@cached_property
def service_owner(self):
owner = structure_factories.UserFactory()
self.offering.customer.add_user(owner, structure_models.CustomerRole.OWNER)
return owner
def update_plan_prices(self):
self._update_plan_price('plan')
self._update_plan_price('new_plan')
def _update_plan_price(self, plan_name):
plan = getattr(self, plan_name)
fixed_components = plan.components.filter(component__billing_type='fixed')
plan.unit_price = sum(comp.amount * comp.price for comp in fixed_components)
plan.save()
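    # Illustrative note (not part of the fixture itself): _update_plan_price sums
    # amount * price over the plan's fixed components, so with the values above the
    # 'Standard plan' ends up with unit_price = 1 * 4 (cpu) + 2 * 3 (ram) = 10, assuming
    # both offering components are stored with the 'fixed' billing type. The 'Small plan'
    # is priced the same way from its own components and their factory-default amounts.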
|
py | b41428753394aeea8e20a1719f986cbc449f7e08 | import demistomock as demisto
from CommonServerPython import *
from CommonServerUserPython import *
''' IMPORTS '''
import re
import json
import requests
import socket
# Disable insecure warnings
requests.packages.urllib3.disable_warnings()
''' GLOBALS/PARAMS '''
PARAMS = demisto.params()
API_KEY = PARAMS.get('api_key')
# Remove trailing slash to prevent wrong URL path to service
SERVER = 'https://autofocus.paloaltonetworks.com'
# Should we use SSL
USE_SSL = not PARAMS.get('insecure', False)
# Service base URL
BASE_URL = SERVER + '/api/v1.0'
VENDOR_NAME = 'AutoFocus V2'
# Headers to be sent in requests
HEADERS = {
'Content-Type': 'application/json'
}
API_PARAM_DICT = {
'scope': {
'Private': 'private',
'Public': 'public',
'Global': 'global'
},
'order': {
'Ascending': 'asc',
'Descending': 'desc'
},
'sort': {
'App Name': 'app_name',
'App Packagename': 'app_packagename',
'File type': 'filetype',
'Size': 'size',
'Finish Date': 'finish_date',
'First Seen (Create Date)': 'create_date',
'Last Updated (Update Date)': 'update_date',
'MD5': 'md5',
'SHA1': 'sha1',
'SHA256': 'sha256',
'Ssdeep Fuzzy Hash': 'ssdeep',
'Application': 'app',
'Device Country': 'device_country',
'Device Country Code': 'device_countrycode',
'Device Hostname': 'device_hostname',
'Device Serial': 'device_serial',
'Device vsys': 'vsys',
'Destination Country': 'dst_country',
'Destination Country Code': 'dst_countrycode',
'Destination IP': 'dst_ip',
'Destination Port': 'dst_port',
'Email Charset': 'emailsbjcharset',
'Industry': 'device_industry',
'Source Country': 'src_country',
'Source Country Code': 'src_countrycode',
'Source IP': 'src_ip',
'Source Port': 'src_port',
'Time': 'tstamp',
'Upload source': 'upload_srcPossible'
},
'tag_class': {
'Actor': 'actor',
'Campaign': 'campaign',
'Exploit': 'exploit',
'Malicious Behavior': 'malicious_behavior',
'Malware Family': 'malware_family'
},
'search_arguments': {
'file_hash': {
'api_name': 'alias.hash',
'operator': 'contains'
},
'domain': {
'api_name': 'alias.domain',
'operator': 'contains'
},
'ip': {
'api_name': 'alias.ip_address',
'operator': 'contains'
},
'url': {
'api_name': 'alias.url',
'operator': 'contains'
},
'wildfire_verdict': {
'api_name': 'sample.malware',
'operator': 'is',
'translate': {
'Malware': 1,
'Grayware': 2,
'Benign': 3,
'Phishing': 4,
}
},
'first_seen': {
'api_name': 'sample.create_date',
'operator': 'is in the range'
},
'last_updated': {
'api_name': 'sample.update_date',
'operator': 'is in the range'
},
'time_range': {
'api_name': 'session.tstamp',
'operator': 'is in the range'
},
'time_after': {
'api_name': 'session.tstamp',
'operator': 'is after'
},
'time_before': {
'api_name': 'session.tstamp',
'operator': 'is before'
}
},
'file_indicators': {
'Size': 'Size',
'SHA1': 'SHA1',
'SHA256': 'SHA256',
'FileType': 'Type',
'Tags': 'Tags',
'FileName': 'Name'
},
'search_results': {
'sha1': 'SHA1',
'sha256': 'SHA256',
'filetype': 'FileType',
'malware': 'Verdict',
'size': 'Size',
'create_date': 'Created',
'finish_date': 'Finished',
'md5': 'MD5',
'region': 'Region',
'tag': 'Tags',
'_id': 'ID',
'tstamp': 'Seen',
'filename': 'FileName',
'device_industry': 'Industry',
'upload_src': 'UploadSource',
'fileurl': 'FileURL'
}
}
SAMPLE_ANALYSIS_LINE_KEYS = {
'behavior': {
'display_name': 'behavior',
'indexes': {
'risk': 0,
'behavior': -1
}
},
'process': {
'display_name': 'processes',
'indexes': {
'parent_process': 0,
'action': 1
}
},
'file': {
'display_name': 'files',
'indexes': {
'parent_process': 0,
'action': 1
}
},
'registry': {
'display_name': 'registry',
'indexes': {
'action': 1,
'parameters': 2
}
},
'dns': {
'display_name': 'DNS',
'indexes': {
'query': 0,
'response': 1
}
},
'http': {
'display_name': 'HTTP',
'indexes': {
'host': 0,
'method': 1,
'url': 2
}
},
'connection': {
'display_name': 'connections',
'indexes': {
'destination': 2
}
},
'mutex': {
'display_name': 'mutex',
'indexes': {
'process': 0,
'action': 1,
'parameters': 2
}
}
}
SAMPLE_ANALYSIS_COVERAGE_KEYS = {
'wf_av_sig': {
'display_name': 'wildfire_signatures',
'fields': ['name', 'create_date']
},
'fileurl_sig': {
'display_name': 'fileurl_signatures',
'fields': ['name', 'create_date']
},
'dns_sig': {
'display_name': 'dns_signatures',
'fields': ['name', 'create_date']
},
'url_cat': {
'display_name': 'url_categories',
'fields': ['url', 'cat']
}
}
VERDICTS_TO_DBOTSCORE = {
'benign': 1,
'malware': 3,
'grayware': 2,
'phishing': 3,
'c2': 3
}
ERROR_DICT = {
'404': 'Invalid URL.',
'409': 'Invalid message or missing parameters.',
'500': 'Internal error.',
'503': 'Rate limit exceeded.'
}
if PARAMS.get('mark_as_malicious'):
verdicts = argToList(PARAMS.get('mark_as_malicious'))
for verdict in verdicts:
VERDICTS_TO_DBOTSCORE[verdict] = 3
''' HELPER FUNCTIONS '''
def parse_response(resp, err_operation):
try:
# Handle error responses gracefully
res_json = resp.json()
resp.raise_for_status()
return res_json
# Errors returned from AutoFocus
except requests.exceptions.HTTPError:
err_msg = f'{err_operation}: {res_json.get("message")}'
if res_json.get("message").find('Requested sample not found') != -1:
demisto.results(err_msg)
sys.exit(0)
elif res_json.get("message").find("AF Cookie Not Found") != -1:
demisto.results(err_msg)
sys.exit(0)
elif err_operation == 'Tag details operation failed' and \
res_json.get("message").find("Tag") != -1 and res_json.get("message").find("not found") != -1:
demisto.results(err_msg)
sys.exit(0)
else:
return return_error(err_msg)
# Unexpected errors (where no json object was received)
except Exception as err:
err_msg = f'{err_operation}: {err}'
return return_error(err_msg)
def http_request(url_suffix, method='POST', data={}, err_operation=None):
    # A wrapper around the requests library that sends our requests and handles responses consistently
data.update({'apiKey': API_KEY})
res = requests.request(
method=method,
url=BASE_URL + url_suffix,
verify=USE_SSL,
data=json.dumps(data),
headers=HEADERS
)
return parse_response(res, err_operation)
def validate_sort_and_order(sort, order):
if sort and not order:
return_error('Please specify the order of sorting (Ascending or Descending).')
if order and not sort:
return_error('Please specify a field to sort by.')
return sort and order
def do_search(search_object, query, scope, size=None, sort=None, order=None, err_operation=None):
path = '/samples/search' if search_object == 'samples' else '/sessions/search'
data = {
'query': query,
'size': size
}
if scope:
data.update({'scope': API_PARAM_DICT['scope'][scope]}) # type: ignore
if validate_sort_and_order(sort, order):
data.update({'sort': {API_PARAM_DICT['sort'][sort]: {'order': API_PARAM_DICT['order'][order]}}}) # type: ignore
# Remove nulls
data = createContext(data, removeNull=True)
result = http_request(path, data=data, err_operation=err_operation)
return result
def run_search(search_object, query, scope=None, size=None, sort=None, order=None):
result = do_search(search_object, query=json.loads(query), scope=scope, size=size, sort=sort, order=order,
err_operation='Search operation failed')
in_progress = result.get('af_in_progress')
status = 'in progress' if in_progress else 'complete'
search_info = {
'AFCookie': result.get('af_cookie'),
'Status': status,
'SessionStart': datetime.now().strftime("%Y-%m-%dT%H:%M:%S"),
}
return search_info
def run_get_search_results(search_object, af_cookie):
path = f'/samples/results/{af_cookie}' if search_object == 'samples' else f'/sessions/results/{af_cookie}'
results = http_request(path, err_operation='Fetching search results failed')
return results
def get_fields_from_hit_object(result_object, response_dict_name):
new_object = {}
af_params_dict = API_PARAM_DICT.get(response_dict_name)
for key, value in result_object.items():
if key in af_params_dict: # type: ignore
new_key = af_params_dict.get(key) # type: ignore
new_object[new_key] = value
else:
new_object[key] = value
return new_object
def parse_hits_response(hits, response_dict_name):
parsed_objects = [] # type: ignore
if not hits:
return parsed_objects
else:
for hit in hits:
flattened_obj = {} # type: ignore
flattened_obj.update(hit.get('_source'))
flattened_obj['_id'] = hit.get('_id')
parsed_obj = get_fields_from_hit_object(flattened_obj, response_dict_name)
parsed_objects.append(parsed_obj)
return parsed_objects
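# Illustrative sketch (hypothetical hit): a raw search hit such as
#   {'_id': '179972200903', '_source': {'sha256': '18c9...', 'malware': 1}}
# is flattened and remapped via API_PARAM_DICT['search_results'] into
#   {'ID': '179972200903', 'SHA256': '18c9...', 'Verdict': 1}
# Keys that have no mapping entry are kept unchanged.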
def get_search_results(search_object, af_cookie):
results = run_get_search_results(search_object, af_cookie)
retry_count = 0
# Checking if the query has no results because the server has not fetched them yet.
# In this case, the complete percentage would be 0 (or lower than 100).
# In a case where there really aren't results (hits), the af_complete_percentage would be 100.
while (not results.get('hits') and (results.get('af_complete_percentage', 0) != 100)) and retry_count < 10:
time.sleep(5)
results = run_get_search_results(search_object, af_cookie)
retry_count += 1
parsed_results = parse_hits_response(results.get('hits'), 'search_results')
in_progress = results.get('af_in_progress')
status = 'in progress' if in_progress else 'complete'
return parsed_results, status
def get_session_details(session_id):
path = f'/session/{session_id}'
result = http_request(path, err_operation='Get session failed')
parsed_result = parse_hits_response(result.get('hits'), 'search_results')
return parsed_result
def validate_if_line_needed(category, info_line):
line = info_line.get('line')
line_values = line.split(',')
category_indexes = SAMPLE_ANALYSIS_LINE_KEYS.get(category).get('indexes') # type: ignore
if category == 'behavior':
risk_index = category_indexes.get('risk') # type: ignore
risk = line_values[risk_index].strip()
        # Only lines with a risk level higher than informational are considered
return not risk == 'informational'
elif category == 'registry':
action_index = category_indexes.get('action') # type: ignore
action = line_values[action_index].strip()
# Only lines with actions SetValueKey, CreateKey or RegSetValueEx are considered
return action == 'SetValueKey' or action == 'CreateKey' or action == 'RegSetValueEx'
elif category == 'file':
action_index = category_indexes.get('action') # type: ignore
action = line_values[action_index].strip()
benign_count = info_line.get('b') if info_line.get('b') else 0
malicious_count = info_line.get('m') if info_line.get('m') else 0
        # Only lines with actions Create or CreateFileW where the malicious count is greater than the benign count are considered
return (action == 'Create' or action == 'CreateFileW') and malicious_count > benign_count
elif category == 'process':
action_index = category_indexes.get('action') # type: ignore
action = line_values[action_index].strip()
        # Only lines with actions created or CreateProcessInternalW are considered
return action == 'created' or action == 'CreateProcessInternalW'
else:
return True
def get_data_from_line(line, category_name):
category_indexes = SAMPLE_ANALYSIS_LINE_KEYS.get(category_name).get('indexes') # type: ignore
values = line.split(',')
sub_categories = {} # type: ignore
if not category_indexes:
return sub_categories
else:
for sub_category in category_indexes: # type: ignore
sub_category_index = category_indexes.get(sub_category) # type: ignore
sub_categories.update({
sub_category: values[sub_category_index]
})
return sub_categories
def get_data_from_coverage_sub_category(sub_category_name, sub_category_data):
sub_categories_list = []
for item in sub_category_data:
new_sub_category = {}
fields_to_extract = SAMPLE_ANALYSIS_COVERAGE_KEYS.get(sub_category_name).get('fields') # type: ignore
for field in fields_to_extract: # type: ignore
new_sub_category[field] = item.get(field) # type: ignore
sub_categories_list.append(new_sub_category)
return sub_categories_list
def parse_coverage_sub_categories(coverage_data):
new_coverage = {}
for sub_category_name, sub_category_data in coverage_data.items():
if sub_category_name in SAMPLE_ANALYSIS_COVERAGE_KEYS:
new_sub_category_data = get_data_from_coverage_sub_category(sub_category_name, sub_category_data)
new_sub_category_name = SAMPLE_ANALYSIS_COVERAGE_KEYS.get(sub_category_name).get( # type: ignore
'display_name') # type: ignore
new_coverage[new_sub_category_name] = new_sub_category_data
return {'coverage': new_coverage}
def parse_lines_from_os(category_name, data, filter_data_flag):
new_lines = []
for info_line in data:
if not filter_data_flag or validate_if_line_needed(category_name, info_line):
new_sub_categories = get_data_from_line(info_line.get('line'), category_name)
new_lines.append(new_sub_categories)
return new_lines
def parse_sample_analysis_response(resp, filter_data_flag):
analysis = {}
for category_name, category_data in resp.items():
if category_name in SAMPLE_ANALYSIS_LINE_KEYS:
new_category = {}
for os_name, os_data in category_data.items():
os_sanitized_data = parse_lines_from_os(category_name, os_data, filter_data_flag)
new_category[os_name] = os_sanitized_data
category_dict = SAMPLE_ANALYSIS_LINE_KEYS.get(category_name)
analysis.update({category_dict['display_name']: new_category}) # type: ignore
elif category_name == 'coverage':
new_category = parse_coverage_sub_categories(category_data)
analysis.update(new_category)
return analysis
def sample_analysis(sample_id, os, filter_data_flag):
path = f'/sample/{sample_id}/analysis'
data = {
'coverage': 'true'
}
if os:
data['platforms'] = [os] # type: ignore
result = http_request(path, data=data, err_operation='Sample analysis failed')
analysis_obj = parse_sample_analysis_response(result, filter_data_flag)
return analysis_obj
def parse_tag_details_response(resp):
tag_details = resp.get('tag')
fields_to_extract_from_tag_details = [
'public_tag_name',
'tag_name',
'customer_name',
'source',
'tag_definition_scope',
'tag_definition_status',
'tag_class',
'count',
'lasthit',
'description'
]
new_tag_info = {}
for field in fields_to_extract_from_tag_details:
new_tag_info[field] = tag_details.get(field)
tag_group_details = resp.get('tag_groups')
if tag_group_details:
new_tag_info['tag_group'] = tag_group_details
return new_tag_info
def autofocus_tag_details(tag_name):
path = f'/tag/{tag_name}'
resp = http_request(path, err_operation='Tag details operation failed')
tag_info = parse_tag_details_response(resp)
return tag_info
def validate_tag_scopes(private, public, commodity, unit42):
if not private and not public and not commodity and not unit42:
return_error('Add at least one Tag scope by setting `commodity`, `private`, `public` or `unit42` to True')
def autofocus_top_tags_search(scope, tag_class_display, private, public, commodity, unit42):
validate_tag_scopes(private, public, commodity, unit42)
tag_class = API_PARAM_DICT['tag_class'][tag_class_display] # type: ignore
query = {
"operator": "all",
"children": [
{
"field": "sample.tag_class",
"operator": "is",
"value": tag_class
}
]
}
tag_scopes = list()
if private:
tag_scopes.append('private')
if public:
tag_scopes.append('public')
if commodity:
tag_scopes.append('commodity')
if unit42:
tag_scopes.append('unit42')
data = {
'query': query,
'scope': scope,
'tagScopes': tag_scopes
}
path = '/top-tags/search/'
resp = http_request(path, data=data, err_operation='Top tags operation failed')
in_progress = resp.get('af_in_progress')
status = 'in progress' if in_progress else 'complete'
search_info = {
'AFCookie': resp.get('af_cookie'),
'Status': status
}
return search_info
def parse_top_tags_response(response):
top_tags_list = [] # type: ignore
top_tags = response.get('top_tags')
if not top_tags:
return top_tags_list
else:
for tag in top_tags:
fields_to_extract_from_top_tags = ['tag_name', 'public_tag_name', 'count', 'lasthit']
new_tag = {}
for field in fields_to_extract_from_top_tags:
new_tag[field] = tag[field]
top_tags_list.append(new_tag)
return top_tags_list
def get_top_tags_results(af_cookie):
path = f'/top-tags/results/{af_cookie}'
results = http_request(path, err_operation='Fetching top tags results failed')
top_tags = parse_top_tags_response(results)
in_progress = results.get('af_in_progress')
status = 'in progress' if in_progress else 'complete'
return top_tags, status
def print_hr_by_category(category_name, category_data):
hr = content = f'### {string_to_table_header(category_name)}:\nNo entries'
if category_name == 'coverage':
content = category_data
if category_data:
hr = tableToMarkdown(f'{string_to_table_header(category_name)}:', category_data,
headerTransform=string_to_table_header)
else:
hr = f'### {string_to_table_header(category_name)}:\nNo entries'
demisto.results({
'Type': entryTypes['note'],
'ContentsFormat': formats['text'],
'Contents': content,
'HumanReadable': hr
})
else:
for os_name, os_data in category_data.items():
content = os_data
table_header = f'{category_name}_{os_name}'
if os_data:
hr = tableToMarkdown(f'{string_to_table_header(table_header)}:', os_data,
headerTransform=string_to_table_header)
else:
hr = f'### {string_to_table_header(table_header)}:\nNo entries'
demisto.results({
'Type': entryTypes['note'],
'ContentsFormat': formats['text'],
'Contents': content,
'HumanReadable': hr
})
def get_files_data_from_results(results):
"""
    Gets a list of results and, for each result, returns a file object that includes all relevant file indicators
    present in that result
:param results: a list of dictionaries
:return: a list of file objects
"""
files = []
if results:
for result in results:
raw_file = get_fields_from_hit_object(result, 'file_indicators')
file_data = filter_object_entries_by_dict_values(raw_file, 'file_indicators')
files.append(file_data)
return files
def filter_object_entries_by_dict_values(result_object, response_dict_name):
"""
    Gets a dictionary (result_object) and filters its keys by the values of another
dictionary (response_dict_name)
input: response_dict_name = 'file_indicators' - see API_PARAM_DICT above
result_object = {
"app": "web-browsing",
"vsys": 1,
"SHA256": "18c9acd34a3aea09121f027857e0004a3ea33a372b213a8361e8a978330f0dc8",
"UploadSource": "Firewall",
"src_port": 80,
"device_serial": "007051000050926",
"Seen": "2019-07-24T09:37:04",
"Name": "wildfire-test-pe-file.exe",
"user_id": "unknown",
"src_country": "United States",
"src_countrycode": "US",
"dst_port": 65168,
"device_countrycode": "US",
"Industry": "High Tech",
"Region": "us",
"device_country": "United States",
"ID": "179972200903"
}
output: {
"SHA256": "18c9acd34a3aea09121f027857e0004a3ea33a372b213a8361e8a978330f0dc8",
"Name": "wildfire-test-pe-file.exe"
}
:param result_object: a dictionary representing an object
    :param response_dict_name: a dictionary whose values are the relevant fields (filters)
:return: the result_object filtered by the relevant fields
"""
af_params_dict = API_PARAM_DICT.get(response_dict_name)
result_object_filtered = {}
if af_params_dict and isinstance(result_object, dict) and isinstance(af_params_dict, dict):
for key in result_object.keys():
if key in af_params_dict.values(): # type: ignore
result_object_filtered[key] = result_object.get(key)
return result_object_filtered
def search_samples(query=None, scope=None, size=None, sort=None, order=None, file_hash=None, domain=None, ip=None,
url=None, wildfire_verdict=None, first_seen=None, last_updated=None):
validate_no_query_and_indicators(query, [file_hash, domain, ip, url, wildfire_verdict, first_seen, last_updated])
if not query:
validate_no_multiple_indicators_for_search([file_hash, domain, ip, url])
query = build_sample_search_query(file_hash, domain, ip, url, wildfire_verdict, first_seen, last_updated)
return run_search('samples', query=query, scope=scope, size=size, sort=sort, order=order)
def build_sample_search_query(file_hash, domain, ip, url, wildfire_verdict, first_seen, last_updated):
indicator_args_for_query = {
'file_hash': file_hash,
'domain': domain,
'ip': ip,
'url': url
}
indicator_list = build_indicator_children_query(indicator_args_for_query)
indicator_query = build_logic_query('OR', indicator_list)
filtering_args_for_search = {} # type: ignore
if wildfire_verdict:
filtering_args_for_search['wildfire_verdict'] = \
demisto.get(API_PARAM_DICT, f'search_arguments.wildfire_verdict.translate.{wildfire_verdict}')
if first_seen:
filtering_args_for_search['first_seen'] = first_seen
if last_updated:
filtering_args_for_search['last_updated'] = last_updated
filters_list = build_children_query(filtering_args_for_search)
filters_list.append(indicator_query)
logic_query = build_logic_query('AND', filters_list)
return json.dumps(logic_query)
def search_sessions(query=None, size=None, sort=None, order=None, file_hash=None, domain=None, ip=None, url=None,
from_time=None, to_time=None):
validate_no_query_and_indicators(query, [file_hash, domain, ip, url, from_time, to_time])
if not query:
validate_no_multiple_indicators_for_search([file_hash, domain, ip, url])
query = build_session_search_query(file_hash, domain, ip, url, from_time, to_time)
return run_search('sessions', query=query, size=size, sort=sort, order=order)
def build_session_search_query(file_hash, domain, ip, url, from_time, to_time):
indicator_args_for_query = {
'file_hash': file_hash,
'domain': domain,
'ip': ip,
'url': url
}
indicator_list = build_indicator_children_query(indicator_args_for_query)
indicator_query = build_logic_query('OR', indicator_list)
time_filters_for_search = {} # type: ignore
if from_time and to_time:
time_filters_for_search = {'time_range': [from_time, to_time]}
elif from_time:
time_filters_for_search = {'time_after': [from_time]}
elif to_time:
time_filters_for_search = {'time_before': [to_time]}
filters_list = build_children_query(time_filters_for_search)
filters_list.append(indicator_query)
logic_query = build_logic_query('AND', filters_list)
return json.dumps(logic_query)
def build_logic_query(logic_operator, condition_list):
operator = None
if logic_operator == 'AND':
operator = 'all'
elif logic_operator == 'OR':
operator = 'any'
return {
'operator': operator,
'children': condition_list
}
def build_children_query(args_for_query):
children_list = [] # type: ignore
for key, val in args_for_query.items():
field_api_name = API_PARAM_DICT['search_arguments'][key]['api_name'] # type: ignore
operator = API_PARAM_DICT['search_arguments'][key]['operator'] # type: ignore
children_list += children_list_generator(field_api_name, operator, [val])
return children_list
def build_indicator_children_query(args_for_query):
for key, val in args_for_query.items():
if val:
field_api_name = API_PARAM_DICT['search_arguments'][key]['api_name'] # type: ignore
operator = API_PARAM_DICT['search_arguments'][key]['operator'] # type: ignore
children_list = children_list_generator(field_api_name, operator, val)
return children_list
def children_list_generator(field_name, operator, val_list):
query_list = []
for value in val_list:
query_list.append({
'field': field_name,
'operator': operator,
'value': value
})
return query_list
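# Illustrative sketch (hypothetical values) of how the builders above compose an
# AutoFocus query before it is json.dumps()-ed by build_sample_search_query /
# build_session_search_query:
#   children_list_generator('alias.domain', 'contains', ['example.com'])
#       -> [{'field': 'alias.domain', 'operator': 'contains', 'value': 'example.com'}]
#   build_logic_query('OR', <list above>)
#       -> {'operator': 'any', 'children': [{'field': 'alias.domain', ...}]}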
def validate_no_query_and_indicators(query, arg_list):
if query:
for arg in arg_list:
if arg:
return_error(f'The search command can either run a search using a custom query '
f'or use the builtin arguments, but not both')
def validate_no_multiple_indicators_for_search(arg_list):
used_arg = None
for arg in arg_list:
if arg and used_arg:
return_error(f'The search command can receive one indicator type at a time, two were given: {used_arg}, '
f'{arg}. For multiple indicator types use the custom query')
elif arg:
used_arg = arg
if not used_arg:
return_error('In order to perform a samples/sessions search, a query or an indicator must be given.')
return
def search_indicator(indicator_type, indicator_value):
headers = HEADERS
headers['apiKey'] = API_KEY
params = {
'indicatorType': indicator_type,
'indicatorValue': indicator_value,
'includeTags': 'true',
}
try:
result = requests.request(
method='GET',
url=f'{BASE_URL}/tic',
verify=USE_SSL,
headers=headers,
params=params
)
# Handle error responses gracefully
result.raise_for_status()
result_json = result.json()
# Unexpected errors (where no json object was received)
except Exception as err:
try:
text_error = result.json()
except ValueError:
text_error = {}
error_message = text_error.get('message')
if error_message:
return_error(f'Request Failed with status: {result.status_code}.\n'
f'Reason is: {str(error_message)}.')
elif str(result.status_code) in ERROR_DICT:
return_error(f'Request Failed with status: {result.status_code}.\n'
f'Reason is: {ERROR_DICT[str(result.status_code)]}.')
else:
err_msg = f'Request Failed with message: {err}.'
return return_error(err_msg)
return result_json
def parse_indicator_response(res, raw_tags, indicator_type):
indicator = {}
indicator['IndicatorValue'] = res.get('indicatorValue', '')
indicator['IndicatorType'] = res.get('indicatorType', '')
indicator['LatestPanVerdicts'] = res.get('latestPanVerdicts', '')
indicator['WildfireRelatedSampleVerdictCounts'] = res.get('wildfireRelatedSampleVerdictCounts', '')
indicator['SeenBy'] = res.get('seenByDataSourceIds', '')
first_seen = res.get('firstSeenTsGlobal', '')
last_seen = res.get('lastSeenTsGlobal', '')
if first_seen:
indicator['FirstSeen'] = timestamp_to_datestring(first_seen)
if last_seen:
indicator['LastSeen'] = timestamp_to_datestring(last_seen)
if raw_tags:
tags = []
for tag in raw_tags:
tags.append({
'PublicTagName': tag.get('public_tag_name', ''),
'TagName': tag.get('tag_name', ''),
'CustomerName': tag.get('customer_name', ''),
'Source': tag.get('source', ''),
'TagDefinitionScopeID': tag.get('tag_definition_scope_id', ''),
'TagDefinitionStatusID': tag.get('tag_definition_status_id', ''),
'TagClassID': tag.get('tag_class_id', ''),
'Count': tag.get('count', ''),
'Lasthit': tag.get('lasthit', ''),
'Description': tag.get('description', '')})
indicator['Tags'] = tags
if indicator_type == 'Domain':
indicator['WhoisAdminCountry'] = res.get('whoisAdminCountry', '')
indicator['WhoisAdminEmail'] = res.get('whoisAdminEmail', '')
indicator['WhoisAdminName'] = res.get('whoisAdminName', '')
indicator['WhoisDomainCreationDate'] = res.get('whoisDomainCreationDate', '')
indicator['WhoisDomainExpireDate'] = res.get('whoisDomainExpireDate', '')
indicator['WhoisDomainUpdateDate'] = res.get('whoisDomainUpdateDate', '')
indicator['WhoisRegistrar'] = res.get('whoisRegistrar', '')
indicator['WhoisRegistrarUrl'] = res.get('whoisRegistrarUrl', '')
indicator['WhoisRegistrant'] = res.get('whoisRegistrant', '')
return indicator
def calculate_dbot_score(indicator_response, indicator_type):
latest_pan_verdicts = indicator_response['latestPanVerdicts']
if not latest_pan_verdicts:
raise Exception('latestPanVerdicts value is empty in indicator response.')
pan_db = latest_pan_verdicts.get('PAN_DB')
wf_sample = latest_pan_verdicts.get('WF_SAMPLE')
# use WF_SAMPLE value for file indicator and PAN_DB for domain,url and ip indicators
if indicator_type == 'File' and wf_sample:
return VERDICTS_TO_DBOTSCORE.get(wf_sample.lower(), 0)
elif pan_db:
return VERDICTS_TO_DBOTSCORE.get(pan_db.lower(), 0)
else:
score = next(iter(latest_pan_verdicts.values()))
return VERDICTS_TO_DBOTSCORE.get(score.lower(), 0)
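# Illustrative sketch of the scoring above (hypothetical responses): with the default
# VERDICTS_TO_DBOTSCORE mapping, a File indicator whose latestPanVerdicts is
# {'WF_SAMPLE': 'MALWARE'} scores 3, a Domain/URL/IP indicator with {'PAN_DB': 'BENIGN'}
# scores 1, and any unrecognised verdict falls back to 0. The 'mark_as_malicious'
# integration parameter (handled above) can additionally raise selected verdicts to 3.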
def get_indicator_outputs(indicator_type, indicators, indicator_context_output):
human_readable = ''
raw_res = []
scores = []
_indicators = []
context = []
for indicator in indicators:
indicator_response = indicator['response']
indicator_value = indicator['value']
dbot_score = {
'Indicator': indicator_value,
'Type': indicator_type.lower(),
'Vendor': VENDOR_NAME,
'Score': indicator['score']
}
indicator_context = {
indicator_context_output: indicator_value
}
if indicator['score'] == 3:
indicator_context['Malicious'] = {
'Vendor': VENDOR_NAME
}
if indicator_type == 'Domain':
whois = dict() # type: ignore
whois['Admin'] = dict()
whois['Registrant'] = dict()
whois['Registrar'] = dict()
whois['CreationDate'] = indicator_response['WhoisDomainCreationDate']
whois['ExpirationDate'] = indicator_response['WhoisDomainExpireDate']
whois['UpdatedDate'] = indicator_response['WhoisDomainUpdateDate']
whois['Admin']['Email'] = indicator_response['WhoisAdminEmail']
whois['Admin']['Name'] = indicator_response['WhoisAdminName']
whois['Registrar']['Name'] = indicator_response['WhoisRegistrar']
whois['Registrant']['Name'] = indicator_response['WhoisRegistrant']
indicator_context['WHOIS'] = whois
tags = indicator_response.get('Tags')
table_name = f'{VENDOR_NAME} {indicator_type} reputation for: {indicator_value}'
if tags:
indicators_data = indicator_response.copy()
del indicators_data['Tags']
md = tableToMarkdown(table_name, indicators_data, headerTransform=string_to_table_header)
md += tableToMarkdown('Indicator Tags:', tags, headerTransform=string_to_table_header)
else:
md = tableToMarkdown(table_name, indicator_response, headerTransform=string_to_table_header)
human_readable += md
raw_res.append(indicator['raw_response'])
scores.append(dbot_score)
context.append(indicator_context)
_indicators.append(indicator_response)
ec = {
outputPaths['dbotscore']: scores,
outputPaths[indicator_type.lower()]: context,
f'AutoFocus.{indicator_type}(val.IndicatorValue === obj.IndicatorValue)': _indicators,
}
return_outputs(readable_output=human_readable, outputs=ec, raw_response=raw_res)
def check_for_ip(indicator):
if '-' in indicator:
# check for address range
ip1, ip2 = indicator.split('-', 1)
if re.match(ipv4Regex, ip1) and re.match(ipv4Regex, ip2):
return FeedIndicatorType.IP
elif re.match(ipv6Regex, ip1) and re.match(ipv6Regex, ip2):
return FeedIndicatorType.IPv6
elif re.match(ipv4cidrRegex, ip1) and re.match(ipv4cidrRegex, ip2):
return FeedIndicatorType.CIDR
elif re.match(ipv6cidrRegex, ip1) and re.match(ipv6cidrRegex, ip2):
return FeedIndicatorType.IPv6CIDR
return None
if '/' in indicator:
if re.match(ipv4cidrRegex, indicator):
return FeedIndicatorType.CIDR
elif re.match(ipv6cidrRegex, indicator):
return FeedIndicatorType.IPv6CIDR
return None
else:
if re.match(ipv4Regex, indicator):
return FeedIndicatorType.IP
elif re.match(ipv6Regex, indicator):
return FeedIndicatorType.IPv6
return None
def find_indicator_type(indicator):
"""Infer the type of the indicator.
Args:
indicator(str): The indicator whose type we want to check.
Returns:
str. The type of the indicator.
"""
# trying to catch X.X.X.X:portNum
if ':' in indicator and '/' not in indicator:
sub_indicator = indicator.split(':', 1)[0]
ip_type = check_for_ip(sub_indicator)
if ip_type:
return ip_type
ip_type = check_for_ip(indicator)
if ip_type:
# catch URLs of type X.X.X.X/path/url or X.X.X.X:portNum/path/url
if '/' in indicator and (ip_type not in [FeedIndicatorType.IPv6CIDR, FeedIndicatorType.CIDR]):
return FeedIndicatorType.URL
else:
return ip_type
elif re.match(sha256Regex, indicator):
return FeedIndicatorType.File
# in AutoFocus, URLs include a path while domains do not - so '/' is a good sign for us to catch URLs.
elif '/' in indicator:
return FeedIndicatorType.URL
else:
return FeedIndicatorType.Domain
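# Illustrative examples (hypothetical indicator values) of the inference above:
#   '192.0.2.1'           -> FeedIndicatorType.IP
#   '192.0.2.1:8080'      -> FeedIndicatorType.IP    (the port is stripped before the check)
#   '192.0.2.1/some/path' -> FeedIndicatorType.URL
#   a 64-character hex string matching sha256Regex -> FeedIndicatorType.File
#   'example.com'         -> FeedIndicatorType.Domain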
def resolve_ip_address(ip):
if check_for_ip(ip):
return socket.gethostbyaddr(ip)[0]
return None
''' COMMANDS'''
def test_module():
"""
    Performs a basic samples search request to verify connectivity and the API key
"""
query = {
'operator': 'all',
'children': [
{
'field': 'sample.malware',
'operator': 'is',
'value': 1
}
]
}
do_search('samples', query=query, scope='Public', err_operation='Test module failed')
return
def search_samples_command():
args = demisto.args()
file_hash = argToList(args.get('file_hash'))
domain = argToList(args.get('domain'))
ip = argToList(args.get('ip'))
url = argToList(args.get('url'))
wildfire_verdict = args.get('wildfire_verdict')
first_seen = argToList(args.get('first_seen'))
last_updated = argToList(args.get('last_updated'))
query = args.get('query')
scope = args.get('scope').capitalize()
max_results = args.get('max_results')
sort = args.get('sort')
order = args.get('order')
info = search_samples(query=query, scope=scope, size=max_results, sort=sort, order=order, file_hash=file_hash,
domain=domain, ip=ip, url=url, wildfire_verdict=wildfire_verdict, first_seen=first_seen,
last_updated=last_updated)
md = tableToMarkdown(f'Search Samples Info:', info)
demisto.results({
'Type': entryTypes['note'],
'ContentsFormat': formats['text'],
'Contents': info,
'EntryContext': {'AutoFocus.SamplesSearch(val.AFCookie == obj.AFCookie)': info},
'HumanReadable': md
})
def search_sessions_command():
args = demisto.args()
file_hash = argToList(args.get('file_hash'))
domain = argToList(args.get('domain'))
ip = argToList(args.get('ip'))
url = argToList(args.get('url'))
from_time = args.get('from_time')
to_time = args.get('to_time')
query = args.get('query')
max_results = args.get('max_results')
sort = args.get('sort')
order = args.get('order')
info = search_sessions(query=query, size=max_results, sort=sort, order=order, file_hash=file_hash, domain=domain,
ip=ip, url=url, from_time=from_time, to_time=to_time)
md = tableToMarkdown(f'Search Sessions Info:', info)
demisto.results({
'Type': entryTypes['note'],
'ContentsFormat': formats['text'],
'Contents': info,
'EntryContext': {'AutoFocus.SessionsSearch(val.AFCookie == obj.AFCookie)': info},
'HumanReadable': md
})
def samples_search_results_command():
args = demisto.args()
af_cookie = args.get('af_cookie')
results, status = get_search_results('samples', af_cookie)
files = get_files_data_from_results(results)
if not results or len(results) == 0:
md = results = 'No entries found that match the query'
status = 'complete'
else:
md = tableToMarkdown(f'Search Samples Results is {status}', results)
context = {
'AutoFocus.SamplesResults(val.ID === obj.ID)': results,
        'AutoFocus.SamplesSearch(val.AFCookie === obj.AFCookie)': {'Status': status, 'AFCookie': af_cookie},
outputPaths['file']: files
}
demisto.results({
'Type': entryTypes['note'],
'ContentsFormat': formats['text'],
'Contents': results,
'EntryContext': context,
'HumanReadable': md
})
def sessions_search_results_command():
args = demisto.args()
af_cookie = args.get('af_cookie')
results, status = get_search_results('sessions', af_cookie)
files = get_files_data_from_results(results)
if not results or len(results) == 0:
md = results = 'No entries found that match the query'
status = 'complete'
else:
md = tableToMarkdown(f'Search Sessions Results is {status}', results)
context = {
'AutoFocus.SessionsResults(val.ID === obj.ID)': results,
'AutoFocus.SessionsSearch(val.AFCookie === obj.AFCookie)': {'Status': status, 'AFCookie': af_cookie},
outputPaths['file']: files
}
demisto.results({
'Type': entryTypes['note'],
'ContentsFormat': formats['text'],
'Contents': results,
'EntryContext': context,
'HumanReadable': md
})
def get_session_details_command():
args = demisto.args()
session_id = args.get('session_id')
result = get_session_details(session_id)
files = get_files_data_from_results(result)
md = tableToMarkdown(f'Session {session_id}:', result)
context = {
'AutoFocus.Sessions(val.ID === obj.ID)': result,
outputPaths['file']: files
}
demisto.results({
'Type': entryTypes['note'],
'ContentsFormat': formats['text'],
'Contents': result,
'EntryContext': context,
'HumanReadable': md
})
def sample_analysis_command():
args = demisto.args()
sample_id = args.get('sample_id')
os = args.get('os')
filter_data = False if args.get('filter_data') == 'False' else True
analysis = sample_analysis(sample_id, os, filter_data)
context = createContext(analysis, keyTransform=string_to_context_key)
demisto.results({
'Type': entryTypes['note'],
'ContentsFormat': formats['text'],
'Contents': {'ID': sample_id, 'Analysis': analysis},
'HumanReadable': f'### Sample Analysis results for {sample_id}:',
'EntryContext': {f'AutoFocus.SampleAnalysis(val.ID == obj.ID)': {'ID': sample_id, 'Analysis': context}},
})
for category_name, category_data in analysis.items():
print_hr_by_category(category_name, category_data)
def tag_details_command():
args = demisto.args()
tag_name = args.get('tag_name')
result = autofocus_tag_details(tag_name)
md = tableToMarkdown(f'Tag {tag_name} details:', result, headerTransform=string_to_table_header)
context = createContext(result, keyTransform=string_to_context_key)
demisto.results({
'Type': entryTypes['note'],
'ContentsFormat': formats['text'],
'Contents': result,
'EntryContext': {'AutoFocus.Tag(val.ID == obj.ID)': context},
'HumanReadable': md
})
def top_tags_search_command():
args = demisto.args()
scope = args.get('scope')
tag_class = args.get('class')
private = args.get('private') == 'True'
public = args.get('public') == 'True'
commodity = args.get('commodity') == 'True'
unit42 = args.get('unit42') == 'True'
info = autofocus_top_tags_search(scope, tag_class, private, public, commodity, unit42)
md = tableToMarkdown(f'Top tags search Info:', info)
demisto.results({
'Type': entryTypes['note'],
'ContentsFormat': formats['text'],
'Contents': info,
'EntryContext': {'AutoFocus.TopTagsSearch(val.AFCookie == obj.AFCookie)': info},
'HumanReadable': md
})
def top_tags_results_command():
args = demisto.args()
af_cookie = args.get('af_cookie')
results, status = get_top_tags_results(af_cookie)
md = tableToMarkdown(f'Search Top Tags Results is {status}:', results, headerTransform=string_to_table_header)
context = createContext(results, keyTransform=string_to_context_key)
demisto.results({
'Type': entryTypes['note'],
'ContentsFormat': formats['text'],
'Contents': results,
'EntryContext': {'AutoFocus.TopTagsResults(val.PublicTagName == obj.PublicTagName)': context,
'AutoFocus.TopTagsSearch(val.AFCookie == obj.AFCookie)': {'Status': status,
'AFCookie': af_cookie}},
'HumanReadable': md
})
def search_ip_command(ip):
indicator_type = 'IP'
ip_list = argToList(ip)
ip_indicators = []
outputs = []
raw_response = []
human_readable = ''
for ip_address in ip_list:
raw_res = search_indicator('ipv4_address', ip_address)
if not raw_res.get('indicator'):
raise ValueError('Invalid response for indicator')
indicator = raw_res.get('indicator')
raw_tags = raw_res.get('tags')
score = calculate_dbot_score(indicator, indicator_type)
dbot_score = Common.DBotScore(
indicator=ip_address,
indicator_type=DBotScoreType.IP,
integration_name=VENDOR_NAME,
score=score
)
ip = Common.IP(
ip=ip_address,
dbot_score=dbot_score
)
autofocus_ip_output = parse_indicator_response(indicator, raw_tags, indicator_type)
# create human readable markdown for ip
tags = autofocus_ip_output.get('Tags')
table_name = f'{VENDOR_NAME} {indicator_type} reputation for: {ip_address}'
if tags:
indicators_data = autofocus_ip_output.copy()
del indicators_data['Tags']
md = tableToMarkdown(table_name, indicators_data)
md += tableToMarkdown('Indicator Tags:', tags)
else:
md = tableToMarkdown(table_name, autofocus_ip_output)
human_readable += md
ip_indicators.append(ip)
outputs.append(autofocus_ip_output)
raw_response.append(raw_res)
command_results = CommandResults(
outputs_prefix='AutoFocus.IP',
outputs_key_field='IndicatorValue',
outputs=outputs,
readable_output=human_readable,
raw_response=raw_response,
indicators=ip_indicators
)
return command_results
def search_domain_command(args):
indicator_type = 'Domain'
domain_name_list = argToList(args.get('domain'))
domain_indicator_list = []
autofocus_domain_list = []
raw_response = []
human_readable = ''
for domain_name in domain_name_list:
raw_res = search_indicator('domain', domain_name)
if not raw_res.get('indicator'):
raise ValueError('Invalid response for indicator')
indicator = raw_res.get('indicator')
raw_tags = raw_res.get('tags')
score = calculate_dbot_score(indicator, indicator_type)
dbot_score = Common.DBotScore(
indicator=domain_name,
indicator_type=DBotScoreType.DOMAIN,
integration_name=VENDOR_NAME,
score=score
)
demisto.log(json.dumps(raw_res, indent=4))
domain = Common.Domain(
domain=domain_name,
dbot_score=dbot_score,
whois=Common.WHOIS(
creation_date=indicator.get('whoisDomainCreationDate'),
expiration_date=indicator.get('whoisDomainExpireDate'),
update_date=indicator.get('whoisDomainUpdateDate'),
admin_email=indicator.get('whoisAdminEmail'),
admin_name=indicator.get('whoisAdminName'),
registrar_name=indicator.get('whoisRegistrar'),
registrant_name=indicator.get('whoisRegistrant')
)
)
autofocus_domain_output = parse_indicator_response(indicator, raw_tags, indicator_type)
        # create human readable markdown for domain
tags = autofocus_domain_output.get('Tags')
table_name = f'{VENDOR_NAME} {indicator_type} reputation for: {domain_name}'
if tags:
indicators_data = autofocus_domain_output.copy()
del indicators_data['Tags']
md = tableToMarkdown(table_name, indicators_data)
md += tableToMarkdown('Indicator Tags:', tags)
else:
md = tableToMarkdown(table_name, autofocus_domain_output)
human_readable += md
domain_indicator_list.append(domain)
raw_response.append(raw_res)
autofocus_domain_list.append(autofocus_domain_output)
command_results = CommandResults(
outputs_prefix='AutoFocus.Domain',
outputs_key_field='IndicatorValue',
outputs=autofocus_domain_list,
readable_output=human_readable,
raw_response=raw_response,
indicators=domain_indicator_list
)
return command_results
def search_url_command(url):
indicator_type = 'URL'
url_list = argToList(url)
url_indicator_list = []
autofocus_url_list = []
raw_response = []
human_readable = ''
for url_name in url_list:
raw_res = search_indicator('url', url_name)
if not raw_res.get('indicator'):
raise ValueError('Invalid response for indicator')
indicator = raw_res.get('indicator')
raw_tags = raw_res.get('tags')
score = calculate_dbot_score(indicator, indicator_type)
dbot_score = Common.DBotScore(
indicator=url_name,
indicator_type=DBotScoreType.URL,
integration_name=VENDOR_NAME,
score=score
)
url = Common.URL(
url=url_name,
dbot_score=dbot_score
)
autofocus_url_output = parse_indicator_response(indicator, raw_tags, indicator_type)
tags = autofocus_url_output.get('Tags')
table_name = f'{VENDOR_NAME} {indicator_type} reputation for: {url_name}'
if tags:
indicators_data = autofocus_url_output.copy()
del indicators_data['Tags']
md = tableToMarkdown(table_name, indicators_data)
md += tableToMarkdown('Indicator Tags:', tags)
else:
md = tableToMarkdown(table_name, autofocus_url_output)
human_readable += md
url_indicator_list.append(url)
raw_response.append(raw_res)
autofocus_url_list.append(autofocus_url_output)
command_results = CommandResults(
outputs_prefix='AutoFocus.URL',
outputs_key_field='IndicatorValue',
outputs=autofocus_url_list,
readable_output=human_readable,
raw_response=raw_response,
indicators=url_indicator_list
)
return command_results
def search_file_command(file):
indicator_type = 'File'
file_list = argToList(file)
file_indicator_list = []
autofocus_file_list = []
raw_response = []
human_readable = ''
for sha256 in file_list:
raw_res = search_indicator('sha256', sha256.lower())
if not raw_res.get('indicator'):
raise ValueError('Invalid response for indicator')
indicator = raw_res.get('indicator')
raw_tags = raw_res.get('tags')
score = calculate_dbot_score(indicator, indicator_type)
dbot_score = Common.DBotScore(
indicator=sha256,
indicator_type=DBotScoreType.FILE,
integration_name=VENDOR_NAME,
score=score
)
file = Common.File(
sha256=sha256,
dbot_score=dbot_score
)
autofocus_file_output = parse_indicator_response(indicator, raw_tags, indicator_type)
tags = autofocus_file_output.get('Tags')
table_name = f'{VENDOR_NAME} {indicator_type} reputation for: {sha256}'
if tags:
indicators_data = autofocus_file_output.copy()
del indicators_data['Tags']
md = tableToMarkdown(table_name, indicators_data)
md += tableToMarkdown('Indicator Tags:', tags)
else:
md = tableToMarkdown(table_name, autofocus_file_output)
human_readable += md
file_indicator_list.append(file)
raw_response.append(raw_res)
autofocus_file_list.append(autofocus_file_output)
command_results = CommandResults(
outputs_prefix='AutoFocus.File',
outputs_key_field='IndicatorValue',
outputs=autofocus_file_list,
readable_output=human_readable,
raw_response=raw_response,
indicators=file_indicator_list
)
return command_results
def get_export_list_command(args):
# the label is the name of the export list we want to fetch.
# panosFormatted is a flag stating that only indicators should be returned in the list.
data = {
'label': args.get('label'),
'panosFormatted': True,
'apiKey': ''
}
results = http_request(url_suffix='/export', method='POST', data=data,
err_operation=f"Failed to fetch export list: {args.get('label')}")
indicators = []
context_ip = []
context_url = []
context_domain = []
context_file = []
for indicator_value in results.get('export_list'):
indicator_type = find_indicator_type(indicator_value)
if indicator_type in [FeedIndicatorType.IP,
FeedIndicatorType.IPv6, FeedIndicatorType.IPv6CIDR, FeedIndicatorType.CIDR]:
if '-' in indicator_value:
context_ip.append({
'Address': indicator_value.split('-')[0]
})
context_ip.append({
'Address': indicator_value.split('-')[1]
})
elif ":" in indicator_value:
context_ip.append({
'Address': indicator_value.split(":", 1)[0]
})
else:
context_ip.append({
'Address': indicator_value
})
elif indicator_type in [FeedIndicatorType.Domain]:
context_domain.append({
'Name': indicator_value
})
elif indicator_type in [FeedIndicatorType.File]:
context_file.append({
'SHA256': indicator_value
})
elif indicator_type in [FeedIndicatorType.URL]:
if ":" in indicator_value:
resolved_address = resolve_ip_address(indicator_value.split(":", 1)[0])
semicolon_suffix = indicator_value.split(":", 1)[1]
slash_suffix = None
else:
resolved_address = resolve_ip_address(indicator_value.split("/", 1)[0])
slash_suffix = indicator_value.split("/", 1)[1]
semicolon_suffix = None
if resolved_address:
if semicolon_suffix:
indicator_value = resolved_address + ":" + semicolon_suffix
else:
indicator_value = resolved_address + "/" + slash_suffix
context_url.append({
'Data': indicator_value,
})
indicators.append({
'Type': indicator_type,
'Value': indicator_value,
})
hr = tableToMarkdown(f"Export list {args.get('label')}", indicators, headers=['Type', 'Value'])
return_outputs(hr, {'AutoFocus.Indicator(val.Value == obj.Value && val.Type == obj.Type)': indicators,
'IP(obj.Address == val.Address)': context_ip,
'URL(obj.Data == val.Data)': context_url,
'File(obj.SHA256 == val.SHA256)': context_file,
'Domain(obj.Name == val.Name)': context_domain},
results)
def main():
demisto.debug('Command being called is %s' % (demisto.command()))
try:
# Remove proxy if not set to true in params
handle_proxy()
active_command = demisto.command()
args = {k: v for (k, v) in demisto.args().items() if v}
if active_command == 'test-module':
# This is the call made when pressing the integration test button.
test_module()
demisto.results('ok')
elif active_command == 'autofocus-search-samples':
search_samples_command()
elif active_command == 'autofocus-search-sessions':
search_sessions_command()
elif active_command == 'autofocus-samples-search-results':
samples_search_results_command()
elif active_command == 'autofocus-sessions-search-results':
sessions_search_results_command()
elif active_command == 'autofocus-get-session-details':
get_session_details_command()
elif active_command == 'autofocus-sample-analysis':
sample_analysis_command()
elif active_command == 'autofocus-tag-details':
tag_details_command()
elif active_command == 'autofocus-top-tags-search':
top_tags_search_command()
elif active_command == 'autofocus-top-tags-results':
top_tags_results_command()
elif active_command == 'autofocus-get-export-list-indicators':
get_export_list_command(args)
elif active_command == 'ip':
return_results(search_ip_command(**args))
elif active_command == 'domain':
return_results(search_domain_command(args))
elif active_command == 'url':
return_results(search_url_command(**args))
elif active_command == 'file':
return_results(search_file_command(**args))
except Exception as e:
return_error(f'Unexpected error: {e}')
if __name__ == "__builtin__" or __name__ == "builtins":
main()
|
py | b41428824c4521efa94d9ffc075ad8a03abf8684 | import io
from zipfile import BadZipfile
from tempfile import NamedTemporaryFile
import openpyxl
from openpyxl.utils.exceptions import InvalidFileException
from django.core.files.uploadedfile import UploadedFile
from django.utils.translation import ugettext as _
class InvalidExcelFileException(Exception):
pass
class JSONReaderError(Exception):
pass
class HeaderValueError(Exception):
pass
class StringTypeRequiredError(Exception):
pass
class WorkbookJSONError(Exception):
pass
class IteratorJSONReader(object):
"""
>>> def normalize(it):
... r = []
... for row in IteratorJSONReader(it):
... r.append(sorted(row.items()))
... return r
>>> normalize([])
[]
>>> normalize([['A', 'B', 'C'], ['1', '2', '3']])
[[('A', '1'), ('B', '2'), ('C', '3')]]
>>> normalize([['A', 'data: key', 'user 1', 'user 2', 'is-ok?'],
... ['1', '2', '3', '4', 'yes']])
[[('A', '1'), ('data', {'key': '2'}), ('is-ok', True), ('user', ['3', '4'])]]
"""
def __init__(self, rows):
# you can only call __iter__ once
self._rows = iter(rows)
try:
self.headers = list(next(self._rows))
except StopIteration:
self.headers = []
self.fieldnames = self.get_fieldnames()
def row_to_json(self, row):
obj = {}
for value, header in zip(row, self.headers):
self.set_field_value(obj, header, value)
return obj
def __iter__(self):
try:
for row in self._rows:
yield self.row_to_json(row)
finally:
del self._rows
def get_fieldnames(self):
obj = {}
for field, value in zip(self.headers, [''] * len(self.headers)):
if not isinstance(field, str):
raise HeaderValueError('Field %s is not a string.' % field)
self.set_field_value(obj, field, value)
return list(obj)
@classmethod
def set_field_value(cls, obj, field, value):
if isinstance(value, bytes):
value = value.decode('utf-8')
if isinstance(value, str):
value = value.strip()
# try dict
try:
field, subfield = field.split(':')
except Exception:
pass
else:
field = field.strip()
if field not in obj:
obj[field] = {}
cls.set_field_value(obj[field], subfield, value)
return
# try list
try:
field, _ = field.split()
except Exception:
pass
else:
dud = {}
cls.set_field_value(dud, field, value)
(field, value), = list(dud.items())
if field not in obj:
obj[field] = []
elif not isinstance(obj[field], list):
obj[field] = [obj[field]]
if value not in (None, ''):
obj[field].append(value)
return
# else flat
# try boolean
try:
field, nothing = field.split('?')
assert(nothing.strip() == '')
except Exception:
pass
else:
try:
value = {
'yes': True,
'true': True,
'no': False,
'false': False,
'': False,
None: False,
}[value.lower() if hasattr(value, 'lower') else value]
except KeyError:
raise JSONReaderError(
'Values for field %s must be "yes" or "no", not "%s"' % (
field, value)
)
# set for any flat type
field = field.strip()
if field in obj:
raise JSONReaderError(
'You have a repeat field: %s' % field
)
obj[field] = value
def get_workbook(file_or_filename):
try:
return WorkbookJSONReader(file_or_filename)
except (HeaderValueError, InvalidExcelFileException) as e:
raise WorkbookJSONError(_(
"Upload failed! "
"Please make sure you are using a valid Excel 2007 or later (.xlsx) file. "
"Error details: {}."
).format(e))
except JSONReaderError as e:
raise WorkbookJSONError(_(
"Upload failed due to a problem with Excel columns. Error details: {}."
).format(e))
except HeaderValueError as e:
raise WorkbookJSONError(_(
"Upload encountered a data type error: {}."
).format(e))
except AttributeError as e:
raise WorkbookJSONError(_(
"Error processing Excel file: {}."
).format(e))
def get_single_worksheet(file_or_filename, title=None):
workbook = get_workbook(file_or_filename)
try:
worksheet = workbook.get_worksheet(title=title)
except WorksheetNotFound:
raise WorkbookJSONError(_(
"Could not find sheet '{title}'."
        ).format(title=title) if title else _("Uploaded file does not contain any sheets."))
return worksheet
class WorksheetNotFound(Exception):
def __init__(self, title):
self.title = title
super(WorksheetNotFound, self).__init__()
class WorksheetJSONReader(IteratorJSONReader):
def __init__(self, worksheet, title=None):
width = 0
self.title = title
self.worksheet = worksheet
try:
header_row = next(self.worksheet.iter_rows())
except StopIteration:
header_row = []
for cell in header_row:
if cell.value is None:
break
else:
width += 1
self.worksheet.calculate_dimension(force=True)
def iterator():
def _convert_float(value):
"""
                Excel doesn't distinguish between 1 and 1.0;
                if the value can be an integer, assume it is
"""
if isinstance(value, float) and int(value) == value:
return int(value)
else:
# Specifically check for None so that we can allow a value of 0
return value if value is not None else ''
for row in self.worksheet.iter_rows():
cell_values = [
_convert_float(cell.value)
for cell in row[:width]
]
if not any(cell != '' for cell in cell_values):
break
yield cell_values
super(WorksheetJSONReader, self).__init__(iterator())
class WorkbookJSONReader(object):
def __init__(self, file_or_filename):
check_types = (UploadedFile, io.RawIOBase, io.BufferedIOBase)
if isinstance(file_or_filename, check_types):
tmp = NamedTemporaryFile(mode='wb', suffix='.xlsx', delete=False)
file_or_filename.seek(0)
tmp.write(file_or_filename.read())
file_or_filename.seek(0)
tmp.close()
file_or_filename = tmp.name
try:
self.wb = openpyxl.load_workbook(file_or_filename, read_only=True, data_only=True)
except (BadZipfile, InvalidFileException, KeyError) as e:
raise InvalidExcelFileException(str(e))
self.worksheets_by_title = {}
self.worksheets = []
for worksheet in self.wb.worksheets:
try:
ws = WorksheetJSONReader(worksheet, title=worksheet.title)
except IndexError:
raise JSONReaderError('This Excel file has unrecognised formatting. Please try downloading '
'the lookup table first, and then add data to it.')
self.worksheets_by_title[worksheet.title] = ws
self.worksheets.append(ws)
def get_worksheet(self, title=None, index=None):
if title is not None and index is not None:
raise TypeError("Can only get worksheet by title *or* index")
if title:
try:
return self.worksheets_by_title[title]
except KeyError:
raise WorksheetNotFound(title=title)
elif index:
try:
return self.worksheets[index]
except IndexError:
raise WorksheetNotFound(title=index)
else:
try:
return self.worksheets[0]
except IndexError:
raise WorksheetNotFound(title=0)
def flatten_json_to_path(obj, path=()):
if isinstance(obj, dict):
for key, value in obj.items():
for item in flatten_json_to_path(value, path + (key,)):
yield item
elif isinstance(obj, list):
for key, value in enumerate(obj):
for item in flatten_json_to_path(value, path + (key,)):
yield item
else:
yield (path, obj)
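# Illustrative example (hypothetical input) of the flattening above:
#   list(flatten_json_to_path({'data': {'key': '2'}, 'user': ['3', '4']}))
#       -> [(('data', 'key'), '2'), (('user', 0), '3'), (('user', 1), '4')]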
def format_header(path, value):
# pretty sure making a string-builder would be slower than concatenation
s = path[0]
for p in path[1:]:
if isinstance(p, str):
s += f': {p}'
elif isinstance(p, int):
s += f' {p + 1}'
if isinstance(value, bool):
s += '?'
value = 'yes' if value else 'no'
return s, value
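# Illustrative examples of the header conventions produced above (they mirror the
# IteratorJSONReader doctest):
#   format_header(('data', 'key'), '2') -> ('data: key', '2')
#   format_header(('user', 0), '3')     -> ('user 1', '3')
#   format_header(('is-ok',), True)     -> ('is-ok?', 'yes')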
def flatten_json(obj):
for key, value in flatten_json_to_path(obj):
yield format_header(key, value)
def json_to_headers(obj):
return [key for key, value in sorted(flatten_json(obj), key=lambda t: alphanumeric_sort_key(t[0]))]
def alphanumeric_sort_key(key):
"""
Sort the given iterable in the way that humans expect.
Thanks to http://stackoverflow.com/a/2669120/240553
"""
import re
convert = lambda text: int(text) if text.isdigit() else text
return [convert(c) for c in re.split('([0-9]+)', key)]
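# Illustrative example: sorted(['user 10', 'user 2'], key=alphanumeric_sort_key) yields
# ['user 2', 'user 10'], whereas plain lexicographic sorting would put 'user 10' first.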
def enforce_string_type(value):
if isinstance(value, str):
return value
if isinstance(value, int):
return str(value)
# Don't try to guess for decimal types how they should be converted to string
raise StringTypeRequiredError()
|
py | b41429429febe4761285d91d001482101bbb14e6 | from localgraphclustering import *
import time
import numpy as np
import networkx as nx
import random
def load_example_graph(vtype,itype):
return GraphLocal("localgraphclustering/tests/data/dolphins.edges",separator=" ",vtype=vtype,itype=itype)
def load_example_weighted_graph(vtype,itype):
return GraphLocal("localgraphclustering/tests/data/neuro-fmri-01.edges",header=True,separator=" ",vtype=vtype,itype=itype)
def generate_random_3Dgraph(n_nodes, radius, seed=None):
if seed is not None:
random.seed(seed)
# Generate a dict of positions
pos = {i: (random.uniform(0, 1), random.uniform(0, 1), random.uniform(0, 1)) for i in range(n_nodes)}
# Create random 3D network
G = nx.random_geometric_graph(n_nodes, radius, pos=pos)
return G
def test_GraphLocal_methods():
g = load_example_graph(np.uint32,np.uint32)
g1 = GraphLocal.from_shared(g.to_shared())
assert(g == g1)
g.largest_component()
g.biconnected_components()
g.core_number()
ei,ej,e = [],[],[]
for i in range(g._num_vertices):
for j in range(g.ai[i],g.ai[i+1]):
ei.append(i)
ej.append(g.aj[j])
e.append(g.adjacency_matrix.data[j])
g1 = GraphLocal()
g1.list_to_gl(ei,ej,e)
assert(g1 == g)
g1.discard_weights()
# Compute triangle clusters and cluster metrics
cond,cut,vol,cc,t = triangleclusters(g)
minverts, minvals = g.local_extrema(cond,True)
print("vertices with minimum conductance neighborhood:",minverts)
# Test a graph with more than one component
G = GraphLocal("notebooks/datasets/neuro-fmri-01.edges",file_type = "edgelist", separator = " ", header = True)
# Test drawing functions
# Read graph. This also supports gml and graphml format.
# The data set contains pairwise similarities of blasted
# sequences of 232 proteins belonging to the amidohydrolase superfamily.
g = GraphLocal('notebooks/datasets/sfld_brown_et_al_amidohydrolases_protein_similarities_for_beh.graphml','graphml',' ')
# Load pre-computed coordinates for nodes.
pos = np.loadtxt('notebooks/datasets/sfld_brown_et_al_amidohydrolases_protein_similarities_for_beh.xy', dtype = 'float')
groups = np.loadtxt('notebooks/datasets/sfld_brown_et_al_amidohydrolases_protein_similarities_for_beh.class', dtype = 'float')
drawing = g.draw_groups(groups, pos=pos, figsize=(15,15),nodesize_list=[10**2],edgealpha=0.05)
# Find the solution of L1-regularized PageRank using localized accelerated gradient descent.
# This method is the fastest of the available l1-regularized and approximate PageRank solvers.
reference_node = [218]
l1_reg_vector = approximate_pagerank(g, reference_node, rho=1.0e-4, method="l1reg")
# Call C++ version of sweep cut rounding on the l1-regularized PageRank solution.
output_sc_fast = sweep_cut(g,l1_reg_vector)
# Extract the partition for g and store it.
l1_reg_vector_rounded = output_sc_fast[0]
# Highlight local cluster
drawing.highlight(l1_reg_vector_rounded,otheredges=True)
drawing.nodesize(l1_reg_vector_rounded,10**2)
drawing.nodecolor(l1_reg_vector_rounded,c='y')
# Make reference node larger and thicker
drawing.nodecolor(reference_node,facecolor='r',edgecolor='g',alpha=1)
drawing.nodesize(reference_node,15**2)
drawing.nodewidth(reference_node,3)
drawing.show()
#redraw the graph first
drawing = g.draw_groups(groups, pos=pos, figsize=(15,15),nodesize_list=[5**2],edgealpha=0.05)
# Circled nodes whose color is not blue are misclassified
drawing.nodecolor(l1_reg_vector_rounded,edgecolor='g',alpha=1)
drawing.nodesize(l1_reg_vector_rounded,15**2)
drawing.nodewidth(l1_reg_vector_rounded,5)
drawing.show()
N = generate_random_3Dgraph(n_nodes=200, radius=0.25, seed=1)
pos = np.array(list(nx.get_node_attributes(N,'pos').values()))
G = GraphLocal()
G = G.from_networkx(N)
drawing = G.draw(pos=pos, nodealpha=0.5, edgealpha=0.01, values=[random.uniform(0, 1) for i in range(200)])
drawing = G.draw_groups([range(50),range(50,100),range(100,150)], pos=pos, edgealpha=0.01,nodealpha=0.5)
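# Hedged sketch, not part of the original test suite: it distills the local-clustering
# pattern exercised in test_GraphLocal_methods above (seed set -> l1-regularized
# PageRank -> sweep-cut rounding -> conductance). Only calls already used in this
# file are reused; the seed node and rho value are illustrative assumptions.
def example_local_cluster_pattern():
    g = load_example_graph(np.uint32, np.uint32)
    seed = [1]
    # approximate PageRank with l1 regularization, then round with a sweep cut
    ppr_vector = approximate_pagerank(g, seed, rho=1.0e-4, method="l1reg")
    cluster = sweep_cut(g, ppr_vector)[0]
    # conductance of the rounded cluster
    return g.set_scores(cluster)["cond"]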
def test_sweepcut_self_loop():
""" This is a regression test for sweep-cuts with self-loops """
g = GraphLocal()
# make a graph with a triangle that has a self-loop on one vertex
# connected to a lot of other vertices and where there are lots
# of other things connected as well...
#g.list_to_gl([0,0,1,2,2,2,2],[1,2,2,2])
import networkx as nx
G = nx.Graph()
K6uK3 = nx.complete_graph(6)
K6uK3.add_edge(5,6)
K6uK3.add_edge(5,7)
K6uK3.add_edge(6,7)
K6uK3.add_edge(5,5)
gloop = GraphLocal().from_networkx(K6uK3)
S = [5,6,7]
w = [3,2,1]
Sp,phi1 = sweep_cut(gloop,(S,np.array(w)))
assert(set(S) == set(Sp))
assert(phi1 == 5/12)
assert(phi1 == gloop.set_scores(S)["cond"])
def setup_fiedler_test(g):
output_sp = fiedler(g)
R = [1]
R.extend(g.neighbors(R[0]))
output_sp2 = fiedler_local(g,R)
assert(np.all(output_sp2[0][1] >= -1.0e-6)) # make sure everything is almost positive.
def setup_fiedler_local_test(g):
R = [1]
R.extend(g.neighbors(R[0]))
phi0 = g.set_scores(R)["cond"]
sparse_vec = fiedler_local(g,R)[0]
S = sweep_cut(g,sparse_vec)[0]
phi1 = g.set_scores(S)["cond"]
assert(phi1 <= phi0)
def setup_sweep_cut_test(g):
tmp1 = sweep_cut(g,([1,2,3,4,5,6,7,8,9],[1,2,3,4,5,6,7,8,9]),cpp=False)
tmp2 = sweep_cut(g,([1,2,3,4,5,6,7,8,9],[1,2,3,4,5,6,7,8,9]))
assert(tmp1[1]==tmp2[1])
def setup_spectral_clustering_test(g):
output_sp = fiedler(g)
R = [1]
R.extend(g.neighbors(R[0]))
insize = len(R)
output_sp2 = spectral_clustering(g,R,method="fiedler_local")[0]
Z = set(R).union(output_sp2)
assert(len(Z) == insize)
phi0 = g.set_scores(R)["cond"]
for method in ["fiedler_local","acl","l1reg","nibble"]:
phi = g.set_scores(spectral_clustering(g, R, method=method)[0])["cond"]
assert(phi <= phi0)
def setup_flow_clustering_test(g):
# Call the global spectral partitioning algorithm.
output_sp = fiedler(g)
eig2 = output_sp
# Round the eigenvector
output_sc = sweep_cut(g,eig2)
# Extract the partition for g and store it.
R = output_sc[0]
phi0 = g.set_scores(R)["cond"]
for method in ["mqi","mqi_weighted","crd","sl"]:
phi = g.set_scores(flow_clustering(g,R,method=method)[0])["cond"]
assert(phi <= phi0)
def setup_flow_weighted_test():
for vtype,itype in [(np.uint32,np.uint32),(np.uint32,np.int64),(np.int64,np.int64)]:
g = load_example_graph(vtype,itype)
g.discard_weights()
cond1 = flow_clustering(g,range(20),method="mqi")[1]
cond2 = flow_clustering(g,range(20),method="mqi_weighted")[1]
# MQI_weighted should give the same result as MQI when running on unweighted graph
assert(cond1 == cond2)
cond1 = flow_clustering(g,range(20),method="sl")[1]
cond2 = flow_clustering(g,range(20),method="sl_weighted")[1]
# sl_weighted should give the same result as sl when running on unweighted graph
assert(cond1 == cond2)
for vtype,itype in [(np.uint32,np.uint32),(np.uint32,np.int64),(np.int64,np.int64)]:
g = load_example_weighted_graph(vtype,itype)
g1 = g.largest_component()
cond1 = flow_clustering(g1,range(100),method="mqi_weighted")[1]
cond2 = flow_clustering(g1,range(100),method="sl_weighted",delta=1.0e6)[1]
# sl_weighted should give the same result as mqi_weighted when delta is large
assert(cond1 == cond2)
    # create a 100 node clique
    ei, ej, e = [], [], []
    for i in range(100):
        for j in range(i + 1, 100):
            # edges inside the five node subclique get weight 10, all others weight 1
            w = 10 if (i < 5 and j < 5) else 1
            ei.append(i)
            ej.append(j)
            e.append(w)
    g = GraphLocal()
    g.list_to_gl(ei, ej, e)
    cluster = MQI_weighted(g, range(20))[0]
    # MQI_weighted should return the five node subclique whose edge weights are 10
    assert(np.array_equal(cluster, np.array(range(5))))
def test_all_algs():
for vtype,itype in [(np.uint32,np.uint32),(np.uint32,np.int64),(np.int64,np.int64)]:
g = load_example_graph(vtype,itype)
setup_fiedler_test(g)
setup_fiedler_local_test(g)
setup_sweep_cut_test(g)
setup_spectral_clustering_test(g)
setup_flow_clustering_test(g)
|
py | b414296e9ccfaf34d92511a59711ed3392330567 | # Copyright 2019 by Michiel de Hoon. All rights reserved.
# Based on code contributed and copyright 2016 by Peter Cock.
#
# This file is part of the Biopython distribution and governed by your
# choice of the "Biopython License Agreement" or the "BSD 3-Clause License".
# Please see the LICENSE file that should have been included as part of this
# package.
"""Bio.SeqIO support for the UCSC nib file format.
Nib stands for nibble (4 bit) representation of nucleotide sequences.
The two nibbles in a byte each store one nucleotide, represented numerically
as follows:
- ``0`` - T
- ``1`` - C
- ``2`` - A
- ``3`` - G
- ``4`` - N (unknown)
As the first bit in a nibble is set if the nucleotide is soft-masked, we
additionally have:
- ``8`` - t
- ``9`` - c
- ``a`` - a
- ``b`` - g
- ``c`` - n (unknown)
A nib file contains only one sequence record.
You are expected to use this module via the Bio.SeqIO functions under
the format name "nib":
>>> from Bio import SeqIO
>>> record = SeqIO.read("Nib/test_even_bigendian.nib", "nib")
>>> print("%i %s..." % (len(record), record.seq[:20]))
50 nAGAAGagccgcNGgCActt...
For detailed information on the file format, please see the UCSC
description at https://genome.ucsc.edu/FAQ/FAQformat.html.
"""
from __future__ import print_function
from Bio.SeqIO.Interfaces import SequenceWriter
from Bio.Seq import Seq
from Bio.SeqRecord import SeqRecord
import struct
import sys
try:
hex2bytes = bytes.fromhex # python3
except AttributeError:
# python 2
hex2bytes = lambda s: s.decode("hex") # noqa: E731
if sys.version_info < (3,):
# python2
import binascii
bytes2hex = binascii.hexlify
elif sys.version_info < (3, 5):
# python 3.4
import binascii
bytes2hex = lambda b: binascii.hexlify(b).decode("ascii") # noqa: E731
else:
# python 3.5 and later
bytes2hex = lambda b: b.hex() # noqa: E731
try:
int.from_bytes # python3
except AttributeError:
def byte2int(b, byteorder):
"""Convert byte array to integer."""
if byteorder == "little":
return struct.unpack("<i", b)[0]
elif byteorder == "big":
return struct.unpack(">i", b)[0]
else:
# python 3
byte2int = lambda b, byteorder: int.from_bytes(b, byteorder) # noqa: E731
try:
maketrans = str.maketrans # python3
except AttributeError:
import string
maketrans = string.maketrans
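# Hedged sketch, not part of the original module: it illustrates the nibble encoding
# described in the module docstring by translating a short hex string with the same
# table used by NibIterator below. The expected strings were derived by hand from
# the table in the docstring, so treat them as an assumption.
def _example_nibble_decoding():  # hypothetical helper, illustration only
    table = maketrans("0123489abc", "TCAGNtcagn")
    assert "20413".translate(table) == "ATNCG"   # upper case: unmasked bases
    assert "a8".translate(table) == "at"         # high bit set: soft-masked bases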
# This is a generator function!
def NibIterator(handle, alphabet=None):
"""Iterate over a nib file and yield a SeqRecord.
- handle - input file in the nib file format as defined by UCSC.
This must be opened in binary mode!
- alphabet - always ignored.
Note that a nib file always contains only one sequence record.
The sequence of the resulting SeqRecord object should match the sequence
generated by Jim Kent's nibFrag utility, except that it will be in upper
case, whereas nibFrag uses lower case.
This function is used internally via the Bio.SeqIO functions:
>>> from Bio import SeqIO
>>> record = SeqIO.read("Nib/test_even_bigendian.nib", "nib")
>>> print("%s %i" % (record.seq, len(record)))
nAGAAGagccgcNGgCActtGAnTAtCGTCgcCacCaGncGncTtGNtGG 50
You can also call it directly:
>>> with open("Nib/test_even_bigendian.nib", "rb") as handle:
... for record in NibIterator(handle):
... print("%s %i" % (record.seq, len(record)))
...
nAGAAGagccgcNGgCActtGAnTAtCGTCgcCacCaGncGncTtGNtGG 50
"""
if alphabet is not None:
raise ValueError("Alphabets are ignored.")
word = handle.read(4)
# check if file is empty
if not word:
raise ValueError("Empty file.")
signature = bytes2hex(word)
if signature == "3a3de96b":
byteorder = "little" # little-endian
elif signature == "6be93d3a":
byteorder = "big" # big-endian
else:
raise ValueError("unexpected signature in Nib header")
number = handle.read(4)
length = byte2int(number, byteorder)
data = handle.read()
indices = bytes2hex(data)
if length % 2 == 0:
if len(indices) != length:
raise ValueError("Unexpected file size")
elif length % 2 == 1:
if len(indices) != length + 1:
raise ValueError("Unexpected file size")
indices = indices[:length]
if not set(indices).issubset("0123489abc"):
raise ValueError("Unexpected sequence data found in file")
table = maketrans("0123489abc", "TCAGNtcagn")
nucleotides = indices.translate(table)
sequence = Seq(nucleotides)
record = SeqRecord(sequence)
yield record
class NibWriter(SequenceWriter):
"""Nib file writer."""
def __init__(self, handle):
"""Initialize an Nib writer object.
Arguments:
- handle - Output handle, in binary write mode.
"""
self.handle = handle
byteorder = sys.byteorder
if byteorder == "little": # little-endian
signature = "3a3de96b"
elif byteorder == "big": # big-endian
signature = "6be93d3a"
else:
raise RuntimeError("unexpected system byte order %s" % byteorder)
handle.write(hex2bytes(signature))
def write_file(self, records):
"""Use this to write an entire file containing the given record."""
count = 0
for record in records:
count += 1
if count == 0:
raise ValueError("Must have one sequence")
if count > 1:
raise ValueError("More than one sequence found")
handle = self.handle
sequence = record.seq
nucleotides = str(sequence)
length = len(sequence)
handle.write(struct.pack("i", length))
table = maketrans("TCAGNtcagn", "0123489abc")
padding = length % 2
suffix = padding * "T"
nucleotides += suffix
if not set(nucleotides).issubset("ACGTNacgtn"):
raise ValueError("Sequence should contain A,C,G,T,N,a,c,g,t,n only")
indices = nucleotides.translate(table)
handle.write(hex2bytes(indices))
return count
if __name__ == "__main__":
from Bio._utils import run_doctest
run_doctest(verbose=0)
|
py | b414298e221a10d70a7927492df618c770379064 | """Common utility functions"""
import argparse
import collections
import contextlib
import logging.handlers
import os
import pathlib
import sys
def first(xs, fn=lambda _: True, default=None):
"""Return the first element from iterable that satisfies predicate `fn`,
or `default` if no such element exists.
Args:
xs (Iterable[Any]): collection
fn (Callable[[Any],bool]): predicate
default (Any): default
Returns:
Any
"""
return next((i for i in xs if fn(i)), default)
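# Hedged usage sketch, not part of the original module, for first(); the expected
# values are worked out by hand from the implementation above.
def _example_first():  # hypothetical helper, illustration only
    assert first([1, 2, 3], lambda x: x > 1) == 2
    assert first([], default='empty') == 'empty'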
def namedtuple(name, *field_props):
"""Create a documented Type[collections.namedtuple]`.
The `name` can be a string; or a tuple containing name and documentation.
The `field_props` are an iterable of tuple field properties. A property
can be a field name; a tuple of field name and documentation; or a tuple
of field name, documentation and default value.
Args:
name (Union[str,Tuple[str,str]]):
tuple name with optional documentation
field_props (Iterable[Union[str,Tuple[str,str],Tuple[str,str,Any]]]):
tuple field properties
Returns:
Type[collections.namedtuple]
"""
field_props = [(i, None) if isinstance(i, str) else list(i)
for i in field_props]
cls = collections.namedtuple(name if isinstance(name, str) else name[0],
[i[0] for i in field_props])
default_values = []
for i in field_props:
if default_values and len(i) < 3:
raise Exception("property with default value not at end")
if len(i) > 2:
default_values.append(i[2])
if default_values:
cls.__new__.__defaults__ = tuple(default_values)
if not isinstance(name, str):
cls.__doc__ = name[1]
for i in field_props:
if i[1]:
getattr(cls, i[0]).__doc__ = i[1]
with contextlib.suppress(AttributeError, ValueError):
cls.__module__ = sys._getframe(1).f_globals.get('__name__', '__main__')
return cls
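# Hedged usage sketch, not part of the original module, for the documented namedtuple
# factory above. Point, x and y are hypothetical names; the field 'y' carries both
# documentation and a default value.
def _example_namedtuple():  # hypothetical helper, illustration only
    Point = namedtuple(('Point', "2D point"),
                       ('x', "x coordinate"),
                       ('y', "y coordinate", 0))
    return Point(1)  # -> Point(x=1, y=0), with docstrings attached to the fields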
def extend_enum_doc(cls, description=None):
"""Extend enumeration documentation with a list of all members.
Args:
cls (enum.EnumMeta): enumeration class
description (Optional[str]): description used in place of the class docstring
Returns:
enum.EnumMeta
"""
doc = description or cls.__doc__
cls.__doc__ = doc + "\n\n" + "\n".join("* " + i.name for i in cls)
return cls
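# Hedged usage sketch, not part of the original module, for extend_enum_doc; the
# Color enumeration is a hypothetical example.
def _example_extend_enum_doc():  # hypothetical helper, illustration only
    import enum
    Color = extend_enum_doc(enum.Enum('Color', ['RED', 'GREEN']), 'Colors')
    return Color.__doc__  # -> 'Colors\n\n* RED\n* GREEN'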
class RegisterCallbackHandle(namedtuple('RegisterCallbackHandle', 'cancel')):
"""Handle for canceling callback registration.
Args:
cancel (Callable[[],None]): cancel callback registration
"""
def __enter__(self):
return self
def __exit__(self, *args):
self.cancel()
class CallbackRegistry:
"""Registry that enables callback registration and notification.
Callbacks in the registry are notified sequentially with
:meth:`CallbackRegistry.notify`. If a callback raises an exception, the
exception is caught and `exception_cb` handler is called. Notification of
subsequent callbacks is not interrupted. If handler is `None`, the
exception is reraised and no subsequent callback is notified.
Args:
exception_cb (Optional[Callable[[Exception],None]]): exception handler
"""
def __init__(self, exception_cb=None):
self._exception_cb = exception_cb
self._cbs = []
def register(self, cb):
"""Register a callback. A handle for registration canceling is
returned.
Args:
cb (Callable): callback
Returns:
RegisterCallbackHandle
"""
self._cbs.append(cb)
return RegisterCallbackHandle(lambda: self._cbs.remove(cb))
def notify(self, *args, **kwargs):
"""Notify all registered callbacks."""
for cb in self._cbs:
try:
cb(*args, **kwargs)
except Exception as e:
if self._exception_cb:
self._exception_cb(e)
else:
raise
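# Hedged usage sketch, not part of the original module, for CallbackRegistry: register
# a callback, notify it, then cancel the registration through the returned handle.
def _example_callback_registry():  # hypothetical helper, illustration only
    seen = []
    registry = CallbackRegistry()
    handle = registry.register(seen.append)
    registry.notify(1)   # seen == [1]
    handle.cancel()
    registry.notify(2)   # callback was removed, seen is still [1]
    return seen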
def parse_url_query(query):
"""Parse url query string.
Returns a dictionary of field names and their values.
Args:
query (str): url query string
Returns:
Dict[str,str]
"""
ret = {}
for i in query.split('&'):
if not i:
continue
temp = i.split('=')
if not temp[0]:
continue
ret[temp[0]] = temp[1] if len(temp) > 1 else None
return ret
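# Hedged usage sketch, not part of the original module, for parse_url_query; the
# expected dictionary was worked out by hand from the implementation above.
def _example_parse_url_query():  # hypothetical helper, illustration only
    assert parse_url_query('a=1&b=&flag') == {'a': '1', 'b': '', 'flag': None}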
def parse_env_path(path):
"""Parse path with environment variables.
Parse file path and replace all file path segments equal to ``$<name>``
with value of ``<name>`` environment variable. If environment variable is
not set, segments are replaced with ``.``.
Args:
path (os.PathLike): file path
Returns:
pathlib.Path
"""
path = pathlib.Path(path)
segments = []
for segment in path.parts:
if segment.startswith('$'):
segment = os.environ.get(segment[1:], '.')
segments.append(segment)
return pathlib.Path(*segments)
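# Hedged usage sketch, not part of the original module, for parse_env_path;
# EXAMPLE_DIR is a hypothetical environment variable set only for illustration.
def _example_parse_env_path():  # hypothetical helper, illustration only
    os.environ['EXAMPLE_DIR'] = '/tmp'
    assert parse_env_path('$EXAMPLE_DIR/app.log') == pathlib.Path('/tmp/app.log')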
class LogRotatingFileHandler(logging.handlers.RotatingFileHandler):
def __init__(self, filename, *vargs, **kwargs):
"""Filename is parsed with :func:`parse_env_path`.
For other arguments, see:
:class:`logging.handlers.RotatingFileHandler`.
"""
super().__init__(parse_env_path(filename), *vargs, **kwargs)
class EnvPathArgParseAction(argparse.Action):
"""Argparse action for parsing file paths with environment variables.
Each path is parsed with :func:`parse_env_path`.
"""
def __call__(self, parser, namespace, values, option_string=None):
ret = []
for value in (values if self.nargs else [values]):
try:
ret.append(parse_env_path(value))
except Exception as e:
parser.error(str(e))
setattr(namespace, self.dest, ret if self.nargs else ret[0])
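# Hedged usage sketch, not part of the original module, wiring EnvPathArgParseAction
# into an argparse parser; the --log option is a hypothetical flag used only here.
def _example_env_path_arg():  # hypothetical helper, illustration only
    parser = argparse.ArgumentParser()
    parser.add_argument('--log', action=EnvPathArgParseAction)
    args = parser.parse_args(['--log', '$HOME/app.log'])
    return args.log  # pathlib.Path with $HOME expanded (or '.' if HOME is unset)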
|