Dataset schema (one row per source file; the data rows below keep this column
order, with each file's `content` inlined between the metadata fields):

| column | dtype | range / distinct values |
|---|---|---|
| blob_id | string | length 40 |
| directory_id | string | length 40 |
| path | string | length 3–616 |
| content_id | string | length 40 |
| detected_licenses | sequence | length 0–112 |
| license_type | string | 2 classes |
| repo_name | string | length 5–115 |
| snapshot_id | string | length 40 |
| revision_id | string | length 40 |
| branch_name | string | 777 classes |
| visit_date | timestamp[us] | 2015-08-06 10:31:46 – 2023-09-06 10:44:38 |
| revision_date | timestamp[us] | 1970-01-01 02:38:32 – 2037-05-03 13:00:00 |
| committer_date | timestamp[us] | 1970-01-01 02:38:32 – 2023-09-06 01:08:06 |
| github_id | int64 (nullable) | 4.92k – 681M |
| star_events_count | int64 | 0 – 209k |
| fork_events_count | int64 | 0 – 110k |
| gha_license_id | string | 22 classes |
| gha_event_created_at | timestamp[us] (nullable) | 2012-06-04 01:52:49 – 2023-09-14 21:59:50 |
| gha_created_at | timestamp[us] (nullable) | 2008-05-22 07:58:19 – 2023-08-21 12:35:19 |
| gha_language | string | 149 classes |
| src_encoding | string | 26 classes |
| language | string | 1 class |
| is_vendor | bool | 2 classes |
| is_generated | bool | 2 classes |
| length_bytes | int64 | 3 – 10.2M |
| extension | string | 188 classes |
| content | string | length 3–10.2M |
| authors | sequence | length 1–1 |
| author_id | string | length 1–132 |
0d03f4997dff14b4117c84cbf186adcb19848f23 | 75ed4fe365819c9cb64522bd2bcb1590295dd4a8 | /login/jwt_practice.py | 5a325c3daf7ee8c0583f4b3065523263105acef1 | [] | no_license | thals7/web | 2aaa36fecf44851d65031dd0c9f9062748bfb3f5 | f2c9aca7b3cf0116985fe17190a1274264bdd2c1 | refs/heads/master | 2023-02-26T08:30:49.663381 | 2021-01-29T00:48:50 | 2021-01-29T00:48:50 | 298,844,661 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 294 | py | # https://pyjwt.readthedocs.io/en/latest/usage.html#encoding-decoding-tokens-with-rs256-rsa
import jwt
key = "thisissecret"
# sign the payload with a shared secret (HS256 = HMAC-SHA256)
encoded = jwt.encode({"some": "payload", "like": "user_id"}, key, algorithm="HS256")
print(encoded)
# PyJWT expects the accepted algorithms as a list when decoding
decoded = jwt.decode(encoded, key, algorithms=["HS256"])
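# An extra sketch, not in the original file: tokens commonly carry an "exp"
# (expiry) claim, which PyJWT checks automatically and reports via
# jwt.ExpiredSignatureError.
# import datetime
# expiring = jwt.encode(
#     {"user_id": 1, "exp": datetime.datetime.utcnow() + datetime.timedelta(minutes=5)},
#     key, algorithm="HS256")
# jwt.decode(expiring, key, algorithms=["HS256"])  # raises once the token expires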
print(decoded) | [
"[email protected]"
] | |
8c208878ff29e2333826cf8bd74f0aa5d3bb8157 | 70f5f279e051360310f95be895320d8fa6cd8d93 | /extraPackages/matplotlib-3.0.2/tutorials/introductory/sample_plots.py | f1a1542ebec6b0b538466dedce1c4d478568158b | [
"BSD-3-Clause"
] | permissive | spacetime314/python3_ios | 4b16ab3e81c31213b3db1e1eb00230621b0a7dc8 | e149f1bc2e50046c8810f83dae7739a8dea939ee | refs/heads/master | 2020-05-09T20:39:14.980041 | 2019-04-08T15:07:53 | 2019-04-08T15:07:53 | 181,415,024 | 2 | 0 | BSD-3-Clause | 2019-04-15T05:00:14 | 2019-04-15T05:00:12 | null | UTF-8 | Python | false | false | 12,052 | py | """
==========================
Sample plots in Matplotlib
==========================
Here you'll find a host of example plots with the code that
generated them.
.. _matplotlibscreenshots:
Line Plot
=========
Here's how to create a line plot with text labels using
:func:`~matplotlib.pyplot.plot`.
.. figure:: ../../gallery/lines_bars_and_markers/images/sphx_glr_simple_plot_001.png
:target: ../../gallery/lines_bars_and_markers/simple_plot.html
:align: center
:scale: 50
Simple Plot
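A minimal sketch of the call (the data and labels are illustrative, not from
the demo)::

    import numpy as np
    import matplotlib.pyplot as plt

    t = np.arange(0.0, 2.0, 0.01)
    plt.plot(t, 1 + np.sin(2 * np.pi * t))
    plt.xlabel('time (s)')
    plt.ylabel('voltage (mV)')
    plt.show()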
.. _screenshots_subplot_demo:
Multiple subplots in one figure
===============================
Multiple axes (i.e. subplots) are created with the
:func:`~matplotlib.pyplot.subplot` function:
.. figure:: ../../gallery/subplots_axes_and_figures/images/sphx_glr_subplot_001.png
:target: ../../gallery/subplots_axes_and_figures/subplot.html
:align: center
:scale: 50
Subplot
.. _screenshots_images_demo:
Images
======
Matplotlib can display images (assuming equally spaced
horizontal dimensions) using the :func:`~matplotlib.pyplot.imshow` function.
.. figure:: ../../gallery/images_contours_and_fields/images/sphx_glr_image_demo_003.png
:target: ../../gallery/images_contours_and_fields/image_demo.html
:align: center
:scale: 50
Example of using :func:`~matplotlib.pyplot.imshow` to display a CT scan
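A minimal sketch (any 2D array works; the colormap choice is illustrative)::

    import numpy as np
    import matplotlib.pyplot as plt

    Z = np.random.rand(32, 32)
    plt.imshow(Z, cmap='viridis', origin='lower')
    plt.colorbar()
    plt.show()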
.. _screenshots_pcolormesh_demo:
Contouring and pseudocolor
==========================
The :func:`~matplotlib.pyplot.pcolormesh` function can make a colored
representation of a two-dimensional array, even if the horizontal dimensions
are unevenly spaced. The
:func:`~matplotlib.pyplot.contour` function is another way to represent
the same data:
.. figure:: ../../gallery/images_contours_and_fields/images/sphx_glr_pcolormesh_levels_001.png
:target: ../../gallery/images_contours_and_fields/pcolormesh_levels.html
:align: center
:scale: 50
Example comparing :func:`~matplotlib.pyplot.pcolormesh` and :func:`~matplotlib.pyplot.contour` for plotting two-dimensional data
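A minimal sketch comparing the two calls on the same grid (the field is
illustrative)::

    import numpy as np
    import matplotlib.pyplot as plt

    x, y = np.meshgrid(np.linspace(-2, 2, 40), np.linspace(-2, 2, 40))
    z = np.exp(-(x ** 2 + y ** 2))
    plt.pcolormesh(x, y, z)
    plt.contour(x, y, z, colors='k')
    plt.show()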
.. _screenshots_histogram_demo:
Histograms
==========
The :func:`~matplotlib.pyplot.hist` function automatically generates
histograms and returns the bin counts or probabilities:
.. figure:: ../../gallery/statistics/images/sphx_glr_histogram_features_001.png
:target: ../../gallery/statistics/histogram_features.html
:align: center
:scale: 50
Histogram Features
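A minimal sketch (sample size and bin count are illustrative)::

    import numpy as np
    import matplotlib.pyplot as plt

    x = np.random.randn(1000)
    n, bins, patches = plt.hist(x, bins=50, density=True)
    plt.show()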
.. _screenshots_path_demo:
Paths
=====
You can add arbitrary paths in Matplotlib using the
:mod:`matplotlib.path` module:
.. figure:: ../../gallery/shapes_and_collections/images/sphx_glr_path_patch_001.png
:target: ../../gallery/shapes_and_collections/path_patch.html
:align: center
:scale: 50
Path Patch
.. _screenshots_mplot3d_surface:
Three-dimensional plotting
==========================
The mplot3d toolkit (see :ref:`toolkit_mplot3d-tutorial` and
:ref:`mplot3d-examples-index`) has support for simple 3d graphs
including surface, wireframe, scatter, and bar charts.
.. figure:: ../../gallery/mplot3d/images/sphx_glr_surface3d_001.png
:target: ../../gallery/mplot3d/surface3d.html
:align: center
:scale: 50
Surface3d
Thanks to John Porter, Jonathon Taylor, Reinier Heeres, and Ben Root for
the `mplot3d` toolkit. This toolkit is included with all standard Matplotlib
installs.
.. _screenshots_ellipse_demo:
Streamplot
==========
The :meth:`~matplotlib.pyplot.streamplot` function plots the streamlines of
a vector field. In addition to simply plotting the streamlines, it allows you
to map the colors and/or line widths of streamlines to a separate parameter,
such as the speed or local intensity of the vector field.
.. figure:: ../../gallery/images_contours_and_fields/images/sphx_glr_plot_streamplot_001.png
:target: ../../gallery/images_contours_and_fields/plot_streamplot.html
:align: center
:scale: 50
Streamplot with various plotting options.
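A minimal sketch of a simple rotational field (the field itself is
illustrative)::

    import numpy as np
    import matplotlib.pyplot as plt

    Y, X = np.mgrid[-2:2:40j, -2:2:40j]
    U, V = -Y, X
    plt.streamplot(X, Y, U, V, color=np.hypot(U, V))
    plt.show()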
This feature complements the :meth:`~matplotlib.pyplot.quiver` function for
plotting vector fields. Thanks to Tom Flannaghan and Tony Yu for adding the
streamplot function.
Ellipses
========
In support of the `Phoenix <http://www.jpl.nasa.gov/news/phoenix/main.php>`_
mission to Mars (which used Matplotlib to display ground tracking of
spacecraft), Michael Droettboom built on work by Charlie Moad to provide
an extremely accurate 8-spline approximation to elliptical arcs (see
:class:`~matplotlib.patches.Arc`), which are insensitive to zoom level.
.. figure:: ../../gallery/shapes_and_collections/images/sphx_glr_ellipse_demo_001.png
:target: ../../gallery/shapes_and_collections/ellipse_demo.html
:align: center
:scale: 50
Ellipse Demo
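A minimal sketch of adding an ellipse patch (position, size and angle are
illustrative)::

    import matplotlib.pyplot as plt
    from matplotlib.patches import Ellipse

    fig, ax = plt.subplots()
    ax.add_patch(Ellipse((0.5, 0.5), width=0.6, height=0.3, angle=30, alpha=0.5))
    plt.show()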
.. _screenshots_barchart_demo:
Bar charts
==========
Use the :func:`~matplotlib.pyplot.bar` function to make bar charts, which
includes customizations such as error bars:
.. figure:: ../../gallery/statistics/images/sphx_glr_barchart_demo_001.png
:target: ../../gallery/statistics/barchart_demo.html
:align: center
:scale: 50
Barchart Demo
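A minimal sketch with error bars (the values are illustrative)::

    import matplotlib.pyplot as plt

    labels = ['A', 'B', 'C']
    plt.bar(labels, [3, 7, 5], yerr=[0.5, 1.0, 0.7])
    plt.show()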
You can also create stacked bars
(`bar_stacked.py <../../gallery/lines_bars_and_markers/bar_stacked.html>`_),
or horizontal bar charts
(`barh.py <../../gallery/lines_bars_and_markers/barh.html>`_).
.. _screenshots_pie_demo:
Pie charts
==========
The :func:`~matplotlib.pyplot.pie` function allows you to create pie
charts. Optional features include auto-labeling the percentage of area,
exploding one or more wedges from the center of the pie, and a shadow effect.
Take a close look at the attached code, which generates this figure in just
a few lines of code.
.. figure:: ../../gallery/pie_and_polar_charts/images/sphx_glr_pie_features_001.png
:target: ../../gallery/pie_and_polar_charts/pie_features.html
:align: center
:scale: 50
Pie Features
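A minimal sketch with auto-labelled percentages and one exploded wedge (the
slices are illustrative)::

    import matplotlib.pyplot as plt

    sizes = [15, 30, 45, 10]
    plt.pie(sizes, labels=['a', 'b', 'c', 'd'], explode=(0, 0.1, 0, 0),
            autopct='%1.1f%%', shadow=True)
    plt.axis('equal')
    plt.show()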
.. _screenshots_table_demo:
Tables
======
The :func:`~matplotlib.pyplot.table` function adds a text table
to an axes.
.. figure:: ../../gallery/misc/images/sphx_glr_table_demo_001.png
:target: ../../gallery/misc/table_demo.html
:align: center
:scale: 50
Table Demo
.. _screenshots_scatter_demo:
Scatter plots
=============
The :func:`~matplotlib.pyplot.scatter` function makes a scatter plot
with (optional) size and color arguments. This example plots changes
in Google's stock price, with marker sizes reflecting the
trading volume and colors varying with time. Here, the
alpha attribute is used to make semitransparent circle markers.
.. figure:: ../../gallery/lines_bars_and_markers/images/sphx_glr_scatter_demo2_001.png
:target: ../../gallery/lines_bars_and_markers/scatter_demo2.html
:align: center
:scale: 50
Scatter Demo2
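A minimal sketch mapping size and color to data (random values stand in for
the stock series)::

    import numpy as np
    import matplotlib.pyplot as plt

    x, y = np.random.randn(2, 100)
    sizes = 100 * np.random.rand(100)
    plt.scatter(x, y, s=sizes, c=np.arange(100), alpha=0.5)
    plt.show()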
.. _screenshots_slider_demo:
GUI widgets
===========
Matplotlib has basic GUI widgets that are independent of the graphical
user interface you are using, allowing you to write cross-GUI figures
and widgets. See :mod:`matplotlib.widgets` and the
`widget examples <../../gallery/index.html>`_.
.. figure:: ../../gallery/widgets/images/sphx_glr_slider_demo_001.png
:target: ../../gallery/widgets/slider_demo.html
:align: center
:scale: 50
Slider and radio-button GUI.
.. _screenshots_fill_demo:
Filled curves
=============
The :func:`~matplotlib.pyplot.fill` function lets you
plot filled curves and polygons:
.. figure:: ../../gallery/lines_bars_and_markers/images/sphx_glr_fill_001.png
:target: ../../gallery/lines_bars_and_markers/fill.html
:align: center
:scale: 50
Fill
Thanks to Andrew Straw for adding this function.
.. _screenshots_date_demo:
Date handling
=============
You can plot timeseries data with major and minor ticks and custom
tick formatters for both.
.. figure:: ../../gallery/text_labels_and_annotations/images/sphx_glr_date_001.png
:target: ../../gallery/text_labels_and_annotations/date.html
:align: center
:scale: 50
Date
See :mod:`matplotlib.ticker` and :mod:`matplotlib.dates` for details and usage.
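A minimal sketch of date-aware ticks (the locator and formatter choices are
illustrative)::

    import datetime
    import matplotlib.pyplot as plt
    import matplotlib.dates as mdates

    days = [datetime.date(2018, 1, d) for d in range(1, 29)]
    fig, ax = plt.subplots()
    ax.plot(days, range(28))
    ax.xaxis.set_major_locator(mdates.WeekdayLocator())
    ax.xaxis.set_major_formatter(mdates.DateFormatter('%b %d'))
    plt.show()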
.. _screenshots_log_demo:
Log plots
=========
The :func:`~matplotlib.pyplot.semilogx`,
:func:`~matplotlib.pyplot.semilogy` and
:func:`~matplotlib.pyplot.loglog` functions simplify the creation of
logarithmic plots.
.. figure:: ../../gallery/scales/images/sphx_glr_log_demo_001.png
:target: ../../gallery/scales/log_demo.html
:align: center
:scale: 50
Log Demo
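A minimal sketch of a log-scaled y axis (the decaying signal is illustrative)::

    import numpy as np
    import matplotlib.pyplot as plt

    t = np.arange(0.01, 20.0, 0.01)
    plt.semilogy(t, np.exp(-t / 5.0))
    plt.show()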
Thanks to Andrew Straw, Darren Dale and Gregory Lielens for contributions
to the log-scaling infrastructure.
.. _screenshots_polar_demo:
Polar plots
===========
The :func:`~matplotlib.pyplot.polar` function generates polar plots.
.. figure:: ../../gallery/pie_and_polar_charts/images/sphx_glr_polar_demo_001.png
:target: ../../gallery/pie_and_polar_charts/polar_demo.html
:align: center
:scale: 50
Polar Demo
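A minimal sketch (the spiral is illustrative)::

    import numpy as np
    import matplotlib.pyplot as plt

    theta = np.linspace(0, 2 * np.pi, 200)
    plt.polar(theta, 1 + theta / np.pi)
    plt.show()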
.. _screenshots_legend_demo:
Legends
=======
The :func:`~matplotlib.pyplot.legend` function automatically
generates figure legends, with MATLAB-compatible legend-placement
functions.
.. figure:: ../../gallery/text_labels_and_annotations/images/sphx_glr_legend_001.png
:target: ../../gallery/text_labels_and_annotations/legend.html
:align: center
:scale: 50
Legend
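A minimal sketch (the labels and placement are illustrative)::

    import numpy as np
    import matplotlib.pyplot as plt

    t = np.arange(0.0, 1.0, 0.01)
    plt.plot(t, np.sin(2 * np.pi * t), label='sine')
    plt.plot(t, np.cos(2 * np.pi * t), label='cosine')
    plt.legend(loc='upper right')
    plt.show()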
Thanks to Charles Twardy for input on the legend function.
.. _screenshots_mathtext_examples_demo:
TeX-notation for text objects
=============================
Below is a sampling of the many TeX expressions now supported by Matplotlib's
internal mathtext engine. The mathtext module provides TeX style mathematical
expressions using `FreeType <https://www.freetype.org/>`_
and the DejaVu, BaKoMa computer modern, or `STIX <http://www.stixfonts.org>`_
fonts. See the :mod:`matplotlib.mathtext` module for additional details.
.. figure:: ../../gallery/text_labels_and_annotations/images/sphx_glr_mathtext_examples_001.png
:target: ../../gallery/text_labels_and_annotations/mathtext_examples.html
:align: center
:scale: 50
Mathtext Examples
Matplotlib's mathtext infrastructure is an independent implementation and
does not require TeX or any external packages installed on your computer. See
the tutorial at :doc:`/tutorials/text/mathtext`.
.. _screenshots_tex_demo:
Native TeX rendering
====================
Although Matplotlib's internal math rendering engine is quite
powerful, sometimes you need TeX. Matplotlib supports external TeX
rendering of strings with the *usetex* option.
.. figure:: ../../gallery/text_labels_and_annotations/images/sphx_glr_tex_demo_001.png
:target: ../../gallery/text_labels_and_annotations/tex_demo.html
:align: center
:scale: 50
Tex Demo
.. _screenshots_eeg_demo:
EEG GUI
=======
You can embed Matplotlib into pygtk, wx, Tk, or Qt applications.
Here is a screenshot of an EEG viewer called `pbrain
<https://github.com/nipy/pbrain>`__.
.. image:: ../../_static/eeg_small.png
The lower axes uses :func:`~matplotlib.pyplot.specgram`
to plot the spectrogram of one of the EEG channels.
For examples of how to embed Matplotlib in different toolkits, see:
* :doc:`/gallery/user_interfaces/embedding_in_gtk3_sgskip`
* :doc:`/gallery/user_interfaces/embedding_in_wx2_sgskip`
* :doc:`/gallery/user_interfaces/mpl_with_glade3_sgskip`
* :doc:`/gallery/user_interfaces/embedding_in_qt_sgskip`
* :doc:`/gallery/user_interfaces/embedding_in_tk_sgskip`
XKCD-style sketch plots
=======================
Just for fun, Matplotlib supports plotting in the style of `xkcd
<http://www.xkcd.com/>`_.
.. figure:: ../../gallery/showcase/images/sphx_glr_xkcd_001.png
:target: ../../gallery/showcase/xkcd.html
:align: center
:scale: 50
Xkcd
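A minimal sketch; the context manager restyles everything drawn inside it
(the data is illustrative)::

    import matplotlib.pyplot as plt

    with plt.xkcd():
        plt.plot([1, 2, 3, 4], [1, 4, 9, 16])
        plt.title('totally scientific')
        plt.show()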
"""
###################################################################
# Subplot example
# ===============
#
# Many plot types can be combined in one figure to create
# powerful and flexible representations of data.
#
import matplotlib.pyplot as plt
import numpy as np
np.random.seed(19680801)
data = np.random.randn(2, 100)
fig, axs = plt.subplots(2, 2, figsize=(5, 5))
axs[0, 0].hist(data[0])
axs[1, 0].scatter(data[0], data[1])
axs[0, 1].plot(data[0], data[1])
axs[1, 1].hist2d(data[0], data[1])
plt.show()
| [
"[email protected]"
] | |
cabda38c0a0fe289c78c7072a6bd20d7cfacf53c | 968aedcc9e58d718bb3895f89de5292d8caabe52 | /leetcode/Hash-Table/valid-sudoku.py | 0a49bd177048dcb0d60662a3ac173ff7e1202e64 | [] | no_license | iCodeIN/competitive-programming-5 | 0729c9f09f12543455121fc633b051eb68529152 | 30bfafb6a7727c9305b22933b63d9d645182c633 | refs/heads/master | 2022-04-14T08:50:19.568207 | 2019-09-26T14:49:56 | 2019-09-26T14:49:56 | null | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 2,208 | py | class Solution:
def isValidSudoku(self, board):
"""
https://leetcode.com/problems/valid-sudoku/description/
:type board: List[List[str]]
:rtype: bool
"""
        def unit_is_valid(cells):
            # a unit (row, column or 3x3 box) is valid when no digit repeats;
            # "." marks an empty cell and may appear any number of times
            seen = set()
            for value in cells:
                if value != "." and value in seen:
                    return False
                seen.add(value)
            return True

        # rows
        for row in board:
            if not unit_is_valid(row):
                return False
        # columns
        for col in range(9):
            if not unit_is_valid(row[col] for row in board):
                return False
        # 3x3 boxes
        for box_row in (0, 3, 6):
            for box_col in (0, 3, 6):
                box = (board[r][c]
                       for r in range(box_row, box_row + 3)
                       for c in range(box_col, box_col + 3))
                if not unit_is_valid(box):
                    return False
        return True
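# A usage sketch (the board layout follows the LeetCode problem statement):
# board = [["5","3",".",".","7",".",".",".","."],
#          ...eight more rows of digit strings and "."...]
# print(Solution().isValidSudoku(board))  # -> True for a valid board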
| [
"[email protected]"
] | |
db7c4a0b79b27d1685be085bd88b35cdddf098b6 | 35f163f0db45094ea44e6e698750e5780b8f1a79 | /aiortc/rtcsctptransport.py | 40a7178eca033abaac30e6921e02a2854cdbc32d | [
"BSD-3-Clause"
] | permissive | royfu/aiortc | 886ac442a56af93e8771d9be0aa15702b0635fb4 | 4a781e15e061176470f8b2c7d30ab2e64d71da9e | refs/heads/master | 2020-04-15T01:43:00.905480 | 2019-01-04T16:06:29 | 2019-01-04T16:06:29 | null | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 55,614 | py | import asyncio
import enum
import hmac
import logging
import math
import os
import time
import warnings
from struct import pack, unpack, unpack_from
import attr
import crcmod.predefined
from pyee import EventEmitter
from .exceptions import InvalidStateError
from .rtcdatachannel import RTCDataChannel, RTCDataChannelParameters
from .utils import random32, uint16_add, uint16_gt
try:
import crcmod._crcfunext
except ImportError: # pragma: no cover
warnings.warn('crcmod C extension was not found, datachannel performance will be reduced')
crc32c = crcmod.predefined.mkPredefinedCrcFun('crc-32c')
logger = logging.getLogger('sctp')
# local constants
COOKIE_LENGTH = 24
COOKIE_LIFETIME = 60
MAX_OUTBOUND_QUEUE = 100
MAX_STREAMS = 65535
USERDATA_MAX_LENGTH = 1200
# protocol constants
SCTP_CAUSE_INVALID_STREAM = 0x0001
SCTP_CAUSE_STALE_COOKIE = 0x0003
SCTP_DATA_LAST_FRAG = 0x01
SCTP_DATA_FIRST_FRAG = 0x02
SCTP_DATA_UNORDERED = 0x04
SCTP_MAX_ASSOCIATION_RETRANS = 10
SCTP_MAX_BURST = 4
SCTP_MAX_INIT_RETRANS = 8
SCTP_RTO_ALPHA = 1 / 8
SCTP_RTO_BETA = 1 / 4
SCTP_RTO_INITIAL = 3
SCTP_RTO_MIN = 1
SCTP_RTO_MAX = 60
SCTP_TSN_MODULO = 2 ** 32
RECONFIG_CHUNK = 130
RECONFIG_MAX_STREAMS = 135
FORWARD_TSN_CHUNK = 192
# parameters
SCTP_STATE_COOKIE = 0x0007
SCTP_STR_RESET_OUT_REQUEST = 0x000d
SCTP_STR_RESET_RESPONSE = 0x0010
SCTP_STR_RESET_ADD_OUT_STREAMS = 0x0011
SCTP_SUPPORTED_CHUNK_EXT = 0x8008
SCTP_PRSCTP_SUPPORTED = 0xc000
# data channel constants
DATA_CHANNEL_ACK = 2
DATA_CHANNEL_OPEN = 3
DATA_CHANNEL_RELIABLE = 0x00
DATA_CHANNEL_PARTIAL_RELIABLE_REXMIT = 0x01
DATA_CHANNEL_PARTIAL_RELIABLE_TIMED = 0x02
DATA_CHANNEL_RELIABLE_UNORDERED = 0x80
DATA_CHANNEL_PARTIAL_RELIABLE_REXMIT_UNORDERED = 0x81
DATA_CHANNEL_PARTIAL_RELIABLE_TIMED_UNORDERED = 0x82
WEBRTC_DCEP = 50
WEBRTC_STRING = 51
WEBRTC_BINARY = 53
WEBRTC_STRING_EMPTY = 56
WEBRTC_BINARY_EMPTY = 57
def chunk_type(chunk):
return chunk.__class__.__name__
def decode_params(body):
params = []
pos = 0
while pos <= len(body) - 4:
param_type, param_length = unpack('!HH', body[pos:pos + 4])
params.append((param_type, body[pos + 4:pos + param_length]))
pos += param_length + padl(param_length)
return params
def encode_params(params):
body = b''
padding = b''
for param_type, param_value in params:
param_length = len(param_value) + 4
body += padding
body += pack('!HH', param_type, param_length) + param_value
padding = b'\x00' * padl(param_length)
return body
def padl(l):
return 4 * ((l + 3) // 4) - l
def swapl(i):
return unpack("<I", pack(">I", i))[0]
def tsn_gt(a, b):
"""
Return True if tsn a is greater than b.
"""
half_mod = (1 << 31)
return (((a < b) and ((b - a) > half_mod)) or
((a > b) and ((a - b) < half_mod)))
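# a worked illustration of the wrap-around comparison (values chosen for clarity):
# tsn_gt(1, 2 ** 32 - 1) is True, since TSN 1 lies ahead of 4294967295 once the
# 32-bit counter wraps, while tsn_gt(2 ** 32 - 1, 1) is False.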
def tsn_gte(a, b):
"""
Return True if tsn a is greater than or equal to b.
"""
return (a == b) or tsn_gt(a, b)
def tsn_minus_one(a):
return (a - 1) % SCTP_TSN_MODULO
def tsn_plus_one(a):
return (a + 1) % SCTP_TSN_MODULO
class Chunk:
def __init__(self, flags=0, body=b''):
self.flags = flags
self.body = body
def __bytes__(self):
body = self.body
data = pack('!BBH', self.type, self.flags, len(body) + 4) + body
data += b'\x00' * padl(len(body))
return data
def __repr__(self):
return '%s(flags=%d)' % (chunk_type(self), self.flags)
@property
def type(self):
for k, cls in CHUNK_TYPES.items():
if isinstance(self, cls):
return k
class BaseParamsChunk(Chunk):
def __init__(self, flags=0, body=b''):
self.flags = flags
if body:
self.params = decode_params(body)
else:
self.params = []
@property
def body(self):
return encode_params(self.params)
class AbortChunk(BaseParamsChunk):
pass
class CookieAckChunk(Chunk):
pass
class CookieEchoChunk(Chunk):
pass
class DataChunk(Chunk):
def __init__(self, flags=0, body=b''):
self.flags = flags
if body:
(self.tsn, self.stream_id, self.stream_seq, self.protocol) = unpack('!LHHL', body[0:12])
self.user_data = body[12:]
else:
self.tsn = 0
self.stream_id = 0
self.stream_seq = 0
self.protocol = 0
self.user_data = b''
@property
def body(self):
body = pack('!LHHL', self.tsn, self.stream_id, self.stream_seq, self.protocol)
body += self.user_data
return body
def __repr__(self):
return 'DataChunk(flags=%d, tsn=%d, stream_id=%d, stream_seq=%d)' % (
self.flags, self.tsn, self.stream_id, self.stream_seq)
class ErrorChunk(BaseParamsChunk):
pass
class ForwardTsnChunk(Chunk):
def __init__(self, flags=0, body=b''):
self.flags = flags
self.streams = []
if body:
self.cumulative_tsn = unpack_from('!L', body, 0)[0]
pos = 4
while pos < len(body):
self.streams.append(unpack_from('!HH', body, pos))
pos += 4
else:
self.cumulative_tsn = 0
@property
def body(self):
body = pack('!L', self.cumulative_tsn)
for stream_id, stream_seq in self.streams:
body += pack('!HH', stream_id, stream_seq)
return body
def __repr__(self):
return 'ForwardTsnChunk(cumulative_tsn=%d, streams=%s)' % (
self.cumulative_tsn, self.streams)
class HeartbeatChunk(BaseParamsChunk):
pass
class HeartbeatAckChunk(BaseParamsChunk):
pass
class BaseInitChunk(Chunk):
def __init__(self, flags=0, body=b''):
self.flags = flags
if body:
(self.initiate_tag, self.advertised_rwnd, self.outbound_streams,
self.inbound_streams, self.initial_tsn) = unpack('!LLHHL', body[0:16])
self.params = decode_params(body[16:])
else:
self.initiate_tag = 0
self.advertised_rwnd = 0
self.outbound_streams = 0
self.inbound_streams = 0
self.initial_tsn = 0
self.params = []
@property
def body(self):
body = pack(
'!LLHHL', self.initiate_tag, self.advertised_rwnd, self.outbound_streams,
self.inbound_streams, self.initial_tsn)
body += encode_params(self.params)
return body
class InitChunk(BaseInitChunk):
pass
class InitAckChunk(BaseInitChunk):
pass
class ReconfigChunk(BaseParamsChunk):
pass
class SackChunk(Chunk):
def __init__(self, flags=0, body=b''):
self.flags = flags
self.gaps = []
self.duplicates = []
if body:
self.cumulative_tsn, self.advertised_rwnd, nb_gaps, nb_duplicates = unpack(
'!LLHH', body[0:12])
pos = 12
for i in range(nb_gaps):
self.gaps.append(unpack('!HH', body[pos:pos + 4]))
pos += 4
for i in range(nb_duplicates):
self.duplicates.append(unpack('!L', body[pos:pos + 4])[0])
pos += 4
else:
self.cumulative_tsn = 0
self.advertised_rwnd = 0
@property
def body(self):
body = pack('!LLHH', self.cumulative_tsn, self.advertised_rwnd,
len(self.gaps), len(self.duplicates))
for gap in self.gaps:
body += pack('!HH', *gap)
for tsn in self.duplicates:
body += pack('!L', tsn)
return body
def __repr__(self):
return 'SackChunk(flags=%d, advertised_rwnd=%d, cumulative_tsn=%d, gaps=%s)' % (
self.flags, self.advertised_rwnd, self.cumulative_tsn, self.gaps)
class ShutdownChunk(Chunk):
def __init__(self, flags=0, body=b''):
self.flags = flags
if body:
self.cumulative_tsn = unpack('!L', body[0:4])[0]
else:
self.cumulative_tsn = 0
@property
def body(self):
return pack('!L', self.cumulative_tsn)
def __repr__(self):
return 'ShutdownChunk(flags=%d, cumulative_tsn=%d)' % (
self.flags, self.cumulative_tsn)
class ShutdownAckChunk(Chunk):
pass
class ShutdownCompleteChunk(Chunk):
pass
CHUNK_TYPES = {
0: DataChunk,
1: InitChunk,
2: InitAckChunk,
3: SackChunk,
4: HeartbeatChunk,
5: HeartbeatAckChunk,
6: AbortChunk,
7: ShutdownChunk,
8: ShutdownAckChunk,
9: ErrorChunk,
10: CookieEchoChunk,
11: CookieAckChunk,
14: ShutdownCompleteChunk,
130: ReconfigChunk,
192: ForwardTsnChunk,
}
class Packet:
def __init__(self, source_port, destination_port, verification_tag, chunks):
self.source_port = source_port
self.destination_port = destination_port
self.verification_tag = verification_tag
self.chunks = chunks
def __bytes__(self):
checksum = 0
data = pack(
'!HHLL',
self.source_port,
self.destination_port,
self.verification_tag,
checksum)
for chunk in self.chunks:
data += bytes(chunk)
# calculate checksum
checksum = swapl(crc32c(data))
return data[0:8] + pack('!L', checksum) + data[12:]
@classmethod
def parse(cls, data):
if len(data) < 12:
raise ValueError('SCTP packet length is less than 12 bytes')
source_port, destination_port, verification_tag, checksum = unpack(
'!HHLL', data[0:12])
# verify checksum
check_data = data[0:8] + b'\x00\x00\x00\x00' + data[12:]
if checksum != swapl(crc32c(check_data)):
raise ValueError('SCTP packet has invalid checksum')
packet = cls(
source_port=source_port,
destination_port=destination_port,
verification_tag=verification_tag,
chunks=[])
pos = 12
while pos <= len(data) - 4:
chunk_type, chunk_flags, chunk_length = unpack('!BBH', data[pos:pos + 4])
chunk_body = data[pos + 4:pos + chunk_length]
chunk_cls = CHUNK_TYPES.get(chunk_type)
if chunk_cls:
packet.chunks.append(chunk_cls(
flags=chunk_flags,
body=chunk_body))
pos += chunk_length + padl(chunk_length)
return packet
# RFC 6525
@attr.s
class StreamResetOutgoingParam:
request_sequence = attr.ib()
response_sequence = attr.ib()
last_tsn = attr.ib()
streams = attr.ib(default=attr.Factory(list))
def __bytes__(self):
data = pack(
'!LLL',
self.request_sequence,
self.response_sequence,
self.last_tsn)
for stream in self.streams:
data += pack('!H', stream)
return data
@classmethod
def parse(cls, data):
request_sequence, response_sequence, last_tsn = unpack('!LLL', data[0:12])
streams = []
for pos in range(12, len(data), 2):
streams.append(unpack('!H', data[pos:pos + 2])[0])
return cls(
request_sequence=request_sequence,
response_sequence=response_sequence,
last_tsn=last_tsn,
streams=streams)
@attr.s
class StreamAddOutgoingParam:
request_sequence = attr.ib()
new_streams = attr.ib()
def __bytes__(self):
data = pack(
'!LHH',
self.request_sequence,
self.new_streams,
0)
return data
@classmethod
def parse(cls, data):
request_sequence, new_streams, reserved = unpack('!LHH', data[0:8])
return cls(
request_sequence=request_sequence,
new_streams=new_streams)
@attr.s
class StreamResetResponseParam:
response_sequence = attr.ib()
result = attr.ib()
def __bytes__(self):
return pack('!LL', self.response_sequence, self.result)
@classmethod
def parse(cls, data):
response_sequence, result = unpack('!LL', data[0:8])
return cls(response_sequence=response_sequence, result=result)
RECONFIG_PARAM_TYPES = {
13: StreamResetOutgoingParam,
16: StreamResetResponseParam,
17: StreamAddOutgoingParam
}
class InboundStream:
def __init__(self):
self.reassembly = []
self.sequence_number = 0
def add_chunk(self, chunk):
if not self.reassembly or tsn_gt(chunk.tsn, self.reassembly[-1].tsn):
self.reassembly.append(chunk)
return
for i, rchunk in enumerate(self.reassembly):
# should never happen, the chunk should have been eliminated
# as a duplicate when _mark_received() is called
assert rchunk.tsn != chunk.tsn, 'duplicate chunk in reassembly'
if tsn_gt(rchunk.tsn, chunk.tsn):
self.reassembly.insert(i, chunk)
break
def pop_messages(self):
pos = 0
start_pos = None
while pos < len(self.reassembly):
chunk = self.reassembly[pos]
if start_pos is None:
ordered = not (chunk.flags & SCTP_DATA_UNORDERED)
if not (chunk.flags & SCTP_DATA_FIRST_FRAG):
if ordered:
break
else:
pos += 1
continue
if ordered and uint16_gt(chunk.stream_seq, self.sequence_number):
break
expected_tsn = chunk.tsn
start_pos = pos
elif chunk.tsn != expected_tsn:
if ordered:
break
else:
start_pos = None
pos += 1
continue
if (chunk.flags & SCTP_DATA_LAST_FRAG):
user_data = b''.join([c.user_data for c in self.reassembly[start_pos:pos + 1]])
self.reassembly = self.reassembly[:start_pos] + self.reassembly[pos + 1:]
if ordered and chunk.stream_seq == self.sequence_number:
self.sequence_number = uint16_add(self.sequence_number, 1)
pos = start_pos
yield (chunk.stream_id, chunk.protocol, user_data)
else:
pos += 1
expected_tsn = tsn_plus_one(expected_tsn)
def prune_chunks(self, tsn):
"""
Prune chunks up to the given TSN.
"""
pos = -1
size = 0
for i, chunk in enumerate(self.reassembly):
if tsn_gte(tsn, chunk.tsn):
pos = i
size += len(chunk.user_data)
else:
break
self.reassembly = self.reassembly[pos + 1:]
return size
@attr.s
class RTCSctpCapabilities:
"""
The :class:`RTCSctpCapabilities` dictionary provides information about the
capabilities of the :class:`RTCSctpTransport`.
"""
maxMessageSize = attr.ib()
"""
The maximum size of data that the implementation can send or
0 if the implementation can handle messages of any size.
"""
class RTCSctpTransport(EventEmitter):
"""
The :class:`RTCSctpTransport` interface includes information relating to
Stream Control Transmission Protocol (SCTP) transport.
:param: transport: An :class:`RTCDtlsTransport`.
"""
def __init__(self, transport, port=5000):
if transport.state == 'closed':
raise InvalidStateError
super().__init__()
self._association_state = self.State.CLOSED
self.__transport = transport
self._started = False
self.__state = 'new'
self._loop = asyncio.get_event_loop()
self._hmac_key = os.urandom(16)
self._local_partial_reliability = True
self._local_port = port
self._local_verification_tag = random32()
self._remote_extensions = []
self._remote_partial_reliability = False
self._remote_port = None
self._remote_verification_tag = 0
# inbound
self._advertised_rwnd = 1024 * 1024
self._inbound_streams = {}
self._inbound_streams_count = 0
self._inbound_streams_max = MAX_STREAMS
self._last_received_tsn = None
self._sack_duplicates = []
self._sack_misordered = set()
self._sack_needed = False
# outbound
self._cwnd = 3 * USERDATA_MAX_LENGTH
self._fast_recovery_exit = None
self._fast_recovery_transmit = False
self._forward_tsn_chunk = None
self._flight_size = 0
self._local_tsn = random32()
self._last_sacked_tsn = tsn_minus_one(self._local_tsn)
self._advanced_peer_ack_tsn = tsn_minus_one(self._local_tsn)
self._outbound_queue = []
self._outbound_queue_pos = 0
self._outbound_stream_seq = {}
self._outbound_streams_count = MAX_STREAMS
self._partial_bytes_acked = 0
# reconfiguration
self._reconfig_queue = []
self._reconfig_request = None
self._reconfig_request_seq = self._local_tsn
self._reconfig_response_seq = 0
# rtt calculation
self._srtt = None
self._rttvar = None
# timers
self._rto = SCTP_RTO_INITIAL
self._t1_handle = None
self._t2_handle = None
self._t3_handle = None
# data channels
self._data_channel_id = None
self._data_channel_queue = []
self._data_channels = {}
@property
def is_server(self):
return self.transport.transport.role != 'controlling'
@property
def port(self):
"""
The local SCTP port number used for data channels.
"""
return self._local_port
@property
def state(self):
"""
The current state of the SCTP transport.
"""
return self.__state
@property
def transport(self):
"""
The :class:`RTCDtlsTransport` over which SCTP data is transmitted.
"""
return self.__transport
@classmethod
def getCapabilities(cls):
"""
Retrieve the capabilities of the transport.
:rtype: RTCSctpCapabilities
"""
return RTCSctpCapabilities(maxMessageSize=65536)
def setTransport(self, transport):
self.__transport = transport
async def start(self, remoteCaps, remotePort):
"""
Start the transport.
"""
if not self._started:
self._started = True
self.__state = 'connecting'
self._remote_port = remotePort
# initialise local channel ID counter
if self.is_server:
self._data_channel_id = 0
else:
self._data_channel_id = 1
self.__transport._register_data_receiver(self)
if not self.is_server:
await self._init()
async def stop(self):
"""
Stop the transport.
"""
if self._association_state != self.State.CLOSED:
await self._abort()
self.__transport._unregister_data_receiver(self)
self._set_state(self.State.CLOSED)
async def _abort(self):
"""
Abort the association.
"""
chunk = AbortChunk()
try:
await self._send_chunk(chunk)
except ConnectionError:
pass
async def _init(self):
"""
Initialize the association.
"""
chunk = InitChunk()
chunk.initiate_tag = self._local_verification_tag
chunk.advertised_rwnd = self._advertised_rwnd
chunk.outbound_streams = self._outbound_streams_count
chunk.inbound_streams = self._inbound_streams_max
chunk.initial_tsn = self._local_tsn
self._set_extensions(chunk.params)
await self._send_chunk(chunk)
# start T1 timer and enter COOKIE-WAIT state
self._t1_start(chunk)
self._set_state(self.State.COOKIE_WAIT)
def _flight_size_decrease(self, chunk):
self._flight_size = max(0, self._flight_size - chunk._book_size)
def _flight_size_increase(self, chunk):
self._flight_size += chunk._book_size
def _get_extensions(self, params):
"""
Gets what extensions are supported by the remote party.
"""
for k, v in params:
if k == SCTP_PRSCTP_SUPPORTED:
self._remote_partial_reliability = True
elif k == SCTP_SUPPORTED_CHUNK_EXT:
self._remote_extensions = list(v)
def _set_extensions(self, params):
"""
Sets what extensions are supported by the local party.
"""
extensions = []
if self._local_partial_reliability:
params.append((SCTP_PRSCTP_SUPPORTED, b''))
extensions.append(FORWARD_TSN_CHUNK)
extensions.append(RECONFIG_CHUNK)
params.append((SCTP_SUPPORTED_CHUNK_EXT, bytes(extensions)))
def _get_inbound_stream(self, stream_id):
"""
Get or create the inbound stream with the specified ID.
"""
if stream_id not in self._inbound_streams:
self._inbound_streams[stream_id] = InboundStream()
return self._inbound_streams[stream_id]
def _get_timestamp(self):
return int(time.time())
async def _handle_data(self, data):
"""
Handle data received from the network.
"""
try:
packet = Packet.parse(data)
except ValueError:
return
# is this an init?
init_chunk = len([x for x in packet.chunks if isinstance(x, InitChunk)])
if init_chunk:
assert len(packet.chunks) == 1
expected_tag = 0
else:
expected_tag = self._local_verification_tag
# verify tag
if packet.verification_tag != expected_tag:
self.__log_debug('Bad verification tag %d vs %d',
packet.verification_tag, expected_tag)
return
# handle chunks
for chunk in packet.chunks:
await self._receive_chunk(chunk)
# send SACK if needed
if self._sack_needed:
await self._send_sack()
def _maybe_abandon(self, chunk):
"""
Determine if a chunk needs to be marked as abandoned.
        If it does, it marks the chunk and any other chunks belonging to the
        same message as abandoned.
"""
if chunk._abandoned:
return True
abandon = (
(chunk._max_retransmits is not None and chunk._sent_count > chunk._max_retransmits) or
(chunk._expiry is not None and chunk._expiry < time.time())
)
if not abandon:
return False
chunk_pos = self._outbound_queue.index(chunk)
for pos in range(chunk_pos, -1, -1):
ochunk = self._outbound_queue[pos]
ochunk._abandoned = True
ochunk._retransmit = False
if (ochunk.flags & SCTP_DATA_FIRST_FRAG):
break
for pos in range(chunk_pos, len(self._outbound_queue)):
ochunk = self._outbound_queue[pos]
ochunk._abandoned = True
ochunk._retransmit = False
if (ochunk.flags & SCTP_DATA_LAST_FRAG):
break
return True
def _mark_received(self, tsn):
"""
Mark an incoming data TSN as received.
"""
# it's a duplicate
if tsn_gte(self._last_received_tsn, tsn) or tsn in self._sack_misordered:
self._sack_duplicates.append(tsn)
return True
# consolidate misordered entries
self._sack_misordered.add(tsn)
for tsn in sorted(self._sack_misordered):
if tsn == tsn_plus_one(self._last_received_tsn):
self._last_received_tsn = tsn
else:
break
# filter out obsolete entries
def is_obsolete(x):
return tsn_gt(x, self._last_received_tsn)
self._sack_duplicates = list(filter(is_obsolete, self._sack_duplicates))
self._sack_misordered = set(filter(is_obsolete, self._sack_misordered))
async def _receive(self, stream_id, pp_id, data):
"""
Receive data stream -> ULP.
"""
await self._data_channel_receive(stream_id, pp_id, data)
async def _receive_chunk(self, chunk):
"""
Handle an incoming chunk.
"""
self.__log_debug('< %s', chunk)
# server
if isinstance(chunk, InitChunk) and self.is_server:
self._last_received_tsn = tsn_minus_one(chunk.initial_tsn)
self._reconfig_response_seq = tsn_minus_one(chunk.initial_tsn)
self._remote_verification_tag = chunk.initiate_tag
self._ssthresh = chunk.advertised_rwnd
self._get_extensions(chunk.params)
self.__log_debug('- Peer supports %d outbound streams, %d max inbound streams',
chunk.outbound_streams, chunk.inbound_streams)
self._inbound_streams_count = min(chunk.outbound_streams, self._inbound_streams_max)
self._outbound_streams_count = min(self._outbound_streams_count, chunk.inbound_streams)
ack = InitAckChunk()
ack.initiate_tag = self._local_verification_tag
ack.advertised_rwnd = self._advertised_rwnd
ack.outbound_streams = self._outbound_streams_count
ack.inbound_streams = self._inbound_streams_max
ack.initial_tsn = self._local_tsn
self._set_extensions(ack.params)
# generate state cookie
cookie = pack('!L', self._get_timestamp())
cookie += hmac.new(self._hmac_key, cookie, 'sha1').digest()
ack.params.append((SCTP_STATE_COOKIE, cookie))
await self._send_chunk(ack)
elif isinstance(chunk, CookieEchoChunk) and self.is_server:
# check state cookie MAC
cookie = chunk.body
if (len(cookie) != COOKIE_LENGTH or
hmac.new(self._hmac_key, cookie[0:4], 'sha1').digest() != cookie[4:]):
self.__log_debug('x State cookie is invalid')
return
# check state cookie lifetime
now = self._get_timestamp()
stamp = unpack('!L', cookie[0:4])[0]
if stamp < now - COOKIE_LIFETIME or stamp > now:
self.__log_debug('x State cookie has expired')
error = ErrorChunk()
error.params.append((SCTP_CAUSE_STALE_COOKIE, b'\x00' * 8))
await self._send_chunk(error)
return
ack = CookieAckChunk()
await self._send_chunk(ack)
self._set_state(self.State.ESTABLISHED)
# client
elif isinstance(chunk, InitAckChunk) and self._association_state == self.State.COOKIE_WAIT:
# cancel T1 timer and process chunk
self._t1_cancel()
self._last_received_tsn = tsn_minus_one(chunk.initial_tsn)
self._reconfig_response_seq = tsn_minus_one(chunk.initial_tsn)
self._remote_verification_tag = chunk.initiate_tag
self._ssthresh = chunk.advertised_rwnd
self._get_extensions(chunk.params)
self.__log_debug('- Peer supports %d outbound streams, %d max inbound streams',
chunk.outbound_streams, chunk.inbound_streams)
self._inbound_streams_count = min(chunk.outbound_streams, self._inbound_streams_max)
self._outbound_streams_count = min(self._outbound_streams_count, chunk.inbound_streams)
echo = CookieEchoChunk()
for k, v in chunk.params:
if k == SCTP_STATE_COOKIE:
echo.body = v
break
await self._send_chunk(echo)
# start T1 timer and enter COOKIE-ECHOED state
self._t1_start(echo)
self._set_state(self.State.COOKIE_ECHOED)
elif (isinstance(chunk, CookieAckChunk) and
self._association_state == self.State.COOKIE_ECHOED):
# cancel T1 timer and enter ESTABLISHED state
self._t1_cancel()
self._set_state(self.State.ESTABLISHED)
elif (isinstance(chunk, ErrorChunk) and
self._association_state in [self.State.COOKIE_WAIT, self.State.COOKIE_ECHOED]):
self._t1_cancel()
self._set_state(self.State.CLOSED)
self.__log_debug('x Could not establish association')
return
# common
elif isinstance(chunk, DataChunk):
await self._receive_data_chunk(chunk)
elif isinstance(chunk, SackChunk):
await self._receive_sack_chunk(chunk)
elif isinstance(chunk, HeartbeatChunk):
ack = HeartbeatAckChunk()
ack.params = chunk.params
await self._send_chunk(ack)
elif isinstance(chunk, AbortChunk):
self.__log_debug('x Association was aborted by remote party')
self._set_state(self.State.CLOSED)
elif isinstance(chunk, ShutdownChunk):
self._t2_cancel()
self._set_state(self.State.SHUTDOWN_RECEIVED)
ack = ShutdownAckChunk()
await self._send_chunk(ack)
self._t2_start(ack)
self._set_state(self.State.SHUTDOWN_ACK_SENT)
elif (isinstance(chunk, ShutdownCompleteChunk) and
self._association_state == self.State.SHUTDOWN_ACK_SENT):
self._t2_cancel()
self._set_state(self.State.CLOSED)
elif (isinstance(chunk, ReconfigChunk) and
self._association_state == self.State.ESTABLISHED):
for param in chunk.params:
cls = RECONFIG_PARAM_TYPES.get(param[0])
if cls:
await self._receive_reconfig_param(cls.parse(param[1]))
elif isinstance(chunk, ForwardTsnChunk):
await self._receive_forward_tsn_chunk(chunk)
async def _receive_data_chunk(self, chunk):
"""
Handle a DATA chunk.
"""
self._sack_needed = True
# mark as received
if self._mark_received(chunk.tsn):
return
# find stream
inbound_stream = self._get_inbound_stream(chunk.stream_id)
# defragment data
inbound_stream.add_chunk(chunk)
self._advertised_rwnd -= len(chunk.user_data)
for message in inbound_stream.pop_messages():
self._advertised_rwnd += len(message[2])
await self._receive(*message)
async def _receive_forward_tsn_chunk(self, chunk):
"""
Handle a FORWARD TSN chunk.
"""
self._sack_needed = True
# it's a duplicate
if tsn_gte(self._last_received_tsn, chunk.cumulative_tsn):
return
def is_obsolete(x):
return tsn_gt(x, self._last_received_tsn)
# advance cumulative TSN
self._last_received_tsn = chunk.cumulative_tsn
self._sack_misordered = set(filter(is_obsolete, self._sack_misordered))
for tsn in sorted(self._sack_misordered):
if tsn == tsn_plus_one(self._last_received_tsn):
self._last_received_tsn = tsn
else:
break
# filter out obsolete entries
self._sack_duplicates = list(filter(is_obsolete, self._sack_duplicates))
self._sack_misordered = set(filter(is_obsolete, self._sack_misordered))
# update reassembly
for stream_id, stream_seq in chunk.streams:
inbound_stream = self._get_inbound_stream(stream_id)
# advance sequence number and perform delivery
inbound_stream.sequence_number = uint16_add(stream_seq, 1)
for message in inbound_stream.pop_messages():
self._advertised_rwnd += len(message[2])
await self._receive(*message)
# prune obsolete chunks
for stream_id, inbound_stream in self._inbound_streams.items():
self._advertised_rwnd += inbound_stream.prune_chunks(self._last_received_tsn)
async def _receive_sack_chunk(self, chunk):
"""
Handle a SACK chunk.
"""
if tsn_gt(self._last_sacked_tsn, chunk.cumulative_tsn):
return
received_time = time.time()
self._last_sacked_tsn = chunk.cumulative_tsn
done = 0
done_bytes = 0
restart_t3 = False
# handle acknowledged data
for i in range(len(self._outbound_queue)):
schunk = self._outbound_queue[i]
if tsn_gt(schunk.tsn, self._last_sacked_tsn):
break
done += 1
if not schunk._acked:
done_bytes += schunk._book_size
self._flight_size_decrease(schunk)
# update RTO estimate
if done == 1 and schunk._sent_count == 1:
self._update_rto(received_time - schunk._sent_time)
# handle gap blocks
loss = False
if chunk.gaps:
highest_seen_tsn = (chunk.cumulative_tsn + chunk.gaps[-1][1]) % SCTP_TSN_MODULO
seen = set()
for gap in chunk.gaps:
for pos in range(gap[0], gap[1] + 1):
tsn = (chunk.cumulative_tsn + pos) % SCTP_TSN_MODULO
seen.add(tsn)
for i in range(done, len(self._outbound_queue)):
schunk = self._outbound_queue[i]
if tsn_gt(schunk.tsn, highest_seen_tsn):
break
if schunk.tsn not in seen:
schunk._misses += 1
if schunk._misses == 3:
schunk._misses = 0
if not self._maybe_abandon(schunk):
schunk._retransmit = True
schunk._acked = False
self._flight_size_decrease(schunk)
loss = True
if i == done:
restart_t3 = True
elif not schunk._acked:
done_bytes += schunk._book_size
schunk._acked = True
self._flight_size_decrease(schunk)
# discard acknowledged data
if done:
self._outbound_queue = self._outbound_queue[done:]
self._outbound_queue_pos = max(0, self._outbound_queue_pos - done)
restart_t3 = True
# adjust congestion window
if self._fast_recovery_exit is None:
if done:
if self._cwnd <= self._ssthresh:
# slow start
self._cwnd += min(done_bytes, USERDATA_MAX_LENGTH)
else:
# congestion avoidance
self._partial_bytes_acked += done_bytes
if self._partial_bytes_acked >= self._cwnd:
self._partial_bytes_acked -= self._cwnd
self._cwnd += USERDATA_MAX_LENGTH
if loss:
self._ssthresh = max(self._cwnd // 2, 4 * USERDATA_MAX_LENGTH)
self._cwnd = self._ssthresh
self._partial_bytes_acked = 0
self._fast_recovery_exit = highest_seen_tsn
self._fast_recovery_transmit = True
elif tsn_gte(chunk.cumulative_tsn, self._fast_recovery_exit):
self._fast_recovery_exit = None
if not len(self._outbound_queue):
# there is no outstanding data, stop T3
self._t3_cancel()
elif restart_t3:
# the earliest outstanding chunk was acknowledged, restart T3
self._t3_handle.cancel()
self._t3_handle = None
self._t3_start()
self._update_advanced_peer_ack_point()
await self._data_channel_flush()
await self._transmit()
async def _receive_reconfig_param(self, param):
"""
Handle a RE-CONFIG parameter.
"""
self.__log_debug('<< %s', param)
if isinstance(param, StreamResetOutgoingParam):
# mark closed inbound streams
for stream_id in param.streams:
self._inbound_streams.pop(stream_id, None)
# close data channel
channel = self._data_channels.get(stream_id)
if channel:
self._data_channel_close(channel)
# send response
response_param = StreamResetResponseParam(
response_sequence=param.request_sequence,
result=1)
self._reconfig_response_seq = param.request_sequence
await self._send_reconfig_param(response_param)
elif isinstance(param, StreamAddOutgoingParam):
# increase inbound streams
self._inbound_streams_count += param.new_streams
# send response
response_param = StreamResetResponseParam(
response_sequence=param.request_sequence,
result=1)
self._reconfig_response_seq = param.request_sequence
await self._send_reconfig_param(response_param)
elif isinstance(param, StreamResetResponseParam):
if (self._reconfig_request and
param.response_sequence == self._reconfig_request.request_sequence):
# mark closed streams
for stream_id in self._reconfig_request.streams:
self._outbound_stream_seq.pop(stream_id, None)
self._data_channel_closed(stream_id)
self._reconfig_request = None
await self._transmit_reconfig()
async def _send(self, stream_id, pp_id, user_data,
expiry=None, max_retransmits=None, ordered=True):
"""
Send data ULP -> stream.
"""
if ordered:
stream_seq = self._outbound_stream_seq.get(stream_id, 0)
else:
stream_seq = 0
fragments = math.ceil(len(user_data) / USERDATA_MAX_LENGTH)
pos = 0
for fragment in range(0, fragments):
chunk = DataChunk()
chunk.flags = 0
if not ordered:
chunk.flags = SCTP_DATA_UNORDERED
if fragment == 0:
chunk.flags |= SCTP_DATA_FIRST_FRAG
if fragment == fragments - 1:
chunk.flags |= SCTP_DATA_LAST_FRAG
chunk.tsn = self._local_tsn
chunk.stream_id = stream_id
chunk.stream_seq = stream_seq
chunk.protocol = pp_id
chunk.user_data = user_data[pos:pos + USERDATA_MAX_LENGTH]
# initialize counters
chunk._abandoned = False
chunk._acked = False
chunk._book_size = len(chunk.user_data)
chunk._expiry = expiry
chunk._max_retransmits = max_retransmits
chunk._misses = 0
chunk._retransmit = False
chunk._sent_count = 0
chunk._sent_time = None
pos += USERDATA_MAX_LENGTH
self._local_tsn = tsn_plus_one(self._local_tsn)
self._outbound_queue.append(chunk)
if ordered:
self._outbound_stream_seq[stream_id] = uint16_add(stream_seq, 1)
# transmit outbound data
if not self._t3_handle:
await self._transmit()
async def _send_chunk(self, chunk):
"""
Transmit a chunk (no bundling for now).
"""
self.__log_debug('> %s', chunk)
packet = Packet(
source_port=self._local_port,
destination_port=self._remote_port,
verification_tag=self._remote_verification_tag,
chunks=[chunk])
await self.transport._send_data(bytes(packet))
async def _send_reconfig_param(self, param):
chunk = ReconfigChunk()
for k, cls in RECONFIG_PARAM_TYPES.items():
if isinstance(param, cls):
param_type = k
break
chunk.params.append((param_type, bytes(param)))
self.__log_debug('>> %s', param)
await self._send_chunk(chunk)
async def _send_sack(self):
"""
Build and send a selective acknowledgement (SACK) chunk.
"""
gaps = []
gap_next = None
for tsn in sorted(self._sack_misordered):
pos = (tsn - self._last_received_tsn) % SCTP_TSN_MODULO
if tsn == gap_next:
gaps[-1][1] = pos
else:
gaps.append([pos, pos])
gap_next = tsn_plus_one(tsn)
sack = SackChunk()
sack.cumulative_tsn = self._last_received_tsn
sack.advertised_rwnd = max(0, self._advertised_rwnd)
sack.duplicates = self._sack_duplicates[:]
sack.gaps = [tuple(x) for x in gaps]
await self._send_chunk(sack)
self._sack_duplicates.clear()
self._sack_needed = False
def _set_state(self, state):
"""
Transition the SCTP association to a new state.
"""
if state != self._association_state:
self.__log_debug('- %s -> %s', self._association_state, state)
self._association_state = state
if state == self.State.ESTABLISHED:
self.__state = 'connected'
asyncio.ensure_future(self._data_channel_flush())
elif state == self.State.CLOSED:
self._t1_cancel()
self._t2_cancel()
self._t3_cancel()
self.__state = 'closed'
# close data channels
for stream_id in list(self._data_channels.keys()):
self._data_channel_closed(stream_id)
# no more events will be emitted, so remove all event listeners
# to facilitate garbage collection.
self.remove_all_listeners()
# timers
def _t1_cancel(self):
if self._t1_handle is not None:
self.__log_debug('- T1(%s) cancel', chunk_type(self._t1_chunk))
self._t1_handle.cancel()
self._t1_handle = None
self._t1_chunk = None
def _t1_expired(self):
self._t1_failures += 1
self._t1_handle = None
self.__log_debug('x T1(%s) expired %d', chunk_type(self._t1_chunk), self._t1_failures)
if self._t1_failures > SCTP_MAX_INIT_RETRANS:
self._set_state(self.State.CLOSED)
else:
asyncio.ensure_future(self._send_chunk(self._t1_chunk))
self._t1_handle = self._loop.call_later(self._rto, self._t1_expired)
def _t1_start(self, chunk):
assert self._t1_handle is None
self._t1_chunk = chunk
self._t1_failures = 0
self.__log_debug('- T1(%s) start', chunk_type(self._t1_chunk))
self._t1_handle = self._loop.call_later(self._rto, self._t1_expired)
def _t2_cancel(self):
if self._t2_handle is not None:
self.__log_debug('- T2(%s) cancel', chunk_type(self._t2_chunk))
self._t2_handle.cancel()
self._t2_handle = None
self._t2_chunk = None
def _t2_expired(self):
self._t2_failures += 1
self._t2_handle = None
self.__log_debug('x T2(%s) expired %d', chunk_type(self._t2_chunk), self._t2_failures)
if self._t2_failures > SCTP_MAX_ASSOCIATION_RETRANS:
self._set_state(self.State.CLOSED)
else:
asyncio.ensure_future(self._send_chunk(self._t2_chunk))
self._t2_handle = self._loop.call_later(self._rto, self._t2_expired)
def _t2_start(self, chunk):
assert self._t2_handle is None
self._t2_chunk = chunk
self._t2_failures = 0
self.__log_debug('- T2(%s) start', chunk_type(self._t2_chunk))
self._t2_handle = self._loop.call_later(self._rto, self._t2_expired)
def _t3_expired(self):
self._t3_handle = None
self.__log_debug('x T3 expired')
# clear retransmit flag, mark abandoned chunks
for pos in range(self._outbound_queue_pos):
chunk = self._outbound_queue[pos]
chunk._retransmit = False
self._maybe_abandon(chunk)
self._update_advanced_peer_ack_point()
# retransmit
self._flight_size = 0
self._outbound_queue_pos = 0
self._partial_bytes_acked = 0
self._ssthresh = max(self._cwnd // 2, 4 * USERDATA_MAX_LENGTH)
self._cwnd = USERDATA_MAX_LENGTH
asyncio.ensure_future(self._transmit())
def _t3_start(self):
assert self._t3_handle is None
self.__log_debug('- T3 start')
self._t3_handle = self._loop.call_later(self._rto, self._t3_expired)
def _t3_cancel(self):
if self._t3_handle is not None:
self.__log_debug('- T3 cancel')
self._t3_handle.cancel()
self._t3_handle = None
async def _transmit(self):
"""
Transmit outbound data.
"""
# send FORWARD TSN
if self._forward_tsn_chunk is not None:
await self._send_chunk(self._forward_tsn_chunk)
self._forward_tsn_chunk = None
# ensure T3 is running
if not self._t3_handle:
self._t3_start()
# retransmit
for pos in range(self._outbound_queue_pos):
chunk = self._outbound_queue[pos]
if chunk._retransmit:
if self._fast_recovery_transmit:
self._fast_recovery_transmit = False
elif self._flight_size + chunk._book_size > self._cwnd:
return
self._flight_size_increase(chunk)
chunk._retransmit = False
chunk._sent_count += 1
await self._send_chunk(chunk)
while self._outbound_queue_pos < len(self._outbound_queue):
chunk = self._outbound_queue[self._outbound_queue_pos]
if self._flight_size + chunk._book_size > self._cwnd:
break
self._flight_size_increase(chunk)
# update counters
chunk._sent_count += 1
chunk._sent_time = time.time()
await self._send_chunk(chunk)
if not self._t3_handle:
self._t3_start()
self._outbound_queue_pos += 1
async def _transmit_reconfig(self):
if self._reconfig_queue and not self._reconfig_request:
streams = self._reconfig_queue[0:RECONFIG_MAX_STREAMS]
self._reconfig_queue = self._reconfig_queue[RECONFIG_MAX_STREAMS:]
param = StreamResetOutgoingParam(
request_sequence=self._reconfig_request_seq,
response_sequence=self._reconfig_response_seq,
last_tsn=tsn_minus_one(self._local_tsn),
streams=streams,
)
self._reconfig_request = param
self._reconfig_request_seq = tsn_plus_one(self._reconfig_request_seq)
await self._send_reconfig_param(param)
def _update_advanced_peer_ack_point(self):
"""
Try to advance "Advanced.Peer.Ack.Point" according to RFC 3758.
"""
if tsn_gt(self._last_sacked_tsn, self._advanced_peer_ack_tsn):
self._advanced_peer_ack_tsn = self._last_sacked_tsn
done = 0
streams = {}
for pos in range(self._outbound_queue_pos):
chunk = self._outbound_queue[pos]
if chunk._abandoned:
self._advanced_peer_ack_tsn = chunk.tsn
done += 1
if not (chunk.flags & SCTP_DATA_UNORDERED):
streams[chunk.stream_id] = chunk.stream_seq
else:
break
if done:
# build FORWARD TSN
self._forward_tsn_chunk = ForwardTsnChunk()
self._forward_tsn_chunk.cumulative_tsn = self._advanced_peer_ack_tsn
self._forward_tsn_chunk.streams = list(streams.items())
# drop data
self._outbound_queue = self._outbound_queue[done:]
self._outbound_queue_pos = max(0, self._outbound_queue_pos - done)
def _update_rto(self, R):
"""
Update RTO given a new roundtrip measurement R.
"""
if self._srtt is None:
self._rttvar = R / 2
self._srtt = R
else:
self._rttvar = (1 - SCTP_RTO_BETA) * self._rttvar + SCTP_RTO_BETA * abs(self._srtt - R)
self._srtt = (1 - SCTP_RTO_ALPHA) * self._srtt + SCTP_RTO_ALPHA * R
self._rto = max(SCTP_RTO_MIN, min(self._srtt + 4 * self._rttvar, SCTP_RTO_MAX))
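        # worked illustration (values chosen for clarity): a first measurement
        # R = 1.0s gives SRTT = 1.0 and RTTVAR = 0.5, so
        # RTO = max(1, min(1.0 + 4 * 0.5, 60)) = 3.0 seconds.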
def _data_channel_close(self, channel, transmit=True):
"""
Request closing the datachannel by sending an Outgoing Stream Reset Request.
"""
if channel.readyState not in ['closing', 'closed']:
channel._setReadyState('closing')
self._reconfig_queue.append(channel.id)
if len(self._reconfig_queue) == 1:
asyncio.ensure_future(self._transmit_reconfig())
def _data_channel_closed(self, stream_id):
channel = self._data_channels.pop(stream_id)
channel._setReadyState('closed')
async def _data_channel_flush(self):
"""
Try to flush buffered data to the SCTP layer.
We wait until the association is established, as we need to know
whether we are a client or a server to correctly assign an odd/even ID
to the data channels.
"""
if self._association_state != self.State.ESTABLISHED:
return
while len(self._outbound_queue) < MAX_OUTBOUND_QUEUE and self._data_channel_queue:
channel, protocol, user_data = self._data_channel_queue.pop(0)
# register channel if necessary
stream_id = channel.id
if stream_id is None:
stream_id = self._data_channel_id
self._data_channels[stream_id] = channel
self._data_channel_id += 2
channel._setId(stream_id)
# send data
if protocol == WEBRTC_DCEP:
await self._send(stream_id, protocol, user_data)
else:
if channel.maxPacketLifeTime:
expiry = time.time() + (channel.maxPacketLifeTime / 1000)
else:
expiry = None
await self._send(stream_id, protocol, user_data,
expiry=expiry,
max_retransmits=channel.maxRetransmits,
ordered=channel.ordered)
channel._addBufferedAmount(-len(user_data))
def _data_channel_open(self, channel):
channel_type = DATA_CHANNEL_RELIABLE
priority = 0
reliability = 0
if not channel.ordered:
channel_type |= 0x80
if channel.maxRetransmits is not None:
channel_type |= 1
reliability = channel.maxRetransmits
elif channel.maxPacketLifeTime is not None:
channel_type |= 2
reliability = channel.maxPacketLifeTime
data = pack('!BBHLHH', DATA_CHANNEL_OPEN, channel_type,
priority, reliability, len(channel.label), len(channel.protocol))
data += channel.label.encode('utf8')
data += channel.protocol.encode('utf8')
self._data_channel_queue.append((channel, WEBRTC_DCEP, data))
asyncio.ensure_future(self._data_channel_flush())
async def _data_channel_receive(self, stream_id, pp_id, data):
if pp_id == WEBRTC_DCEP and len(data):
msg_type = unpack('!B', data[0:1])[0]
if msg_type == DATA_CHANNEL_OPEN and len(data) >= 12:
# we should not receive an open for an existing channel
assert stream_id not in self._data_channels
# one side should be using even IDs, the other odd IDs
assert (stream_id % 2) != (self._data_channel_id % 2)
(msg_type, channel_type, priority, reliability,
label_length, protocol_length) = unpack('!BBHLHH', data[0:12])
pos = 12
label = data[pos:pos + label_length].decode('utf8')
pos += label_length
protocol = data[pos:pos + protocol_length].decode('utf8')
# check channel type
maxPacketLifeTime = None
maxRetransmits = None
if (channel_type & 0x03) == 1:
maxRetransmits = reliability
elif (channel_type & 0x03) == 2:
maxPacketLifeTime = reliability
# register channel
parameters = RTCDataChannelParameters(
label=label,
ordered=(channel_type & 0x80) == 0,
maxPacketLifeTime=maxPacketLifeTime,
maxRetransmits=maxRetransmits,
protocol=protocol)
channel = RTCDataChannel(self, parameters, id=stream_id)
channel._setReadyState('open')
self._data_channels[stream_id] = channel
# send ack
self._data_channel_queue.append(
(channel, WEBRTC_DCEP, pack('!B', DATA_CHANNEL_ACK)))
await self._data_channel_flush()
# emit channel
self.emit('datachannel', channel)
elif msg_type == DATA_CHANNEL_ACK:
assert stream_id in self._data_channels
channel = self._data_channels[stream_id]
channel._setReadyState('open')
elif pp_id == WEBRTC_STRING and stream_id in self._data_channels:
# emit message
self._data_channels[stream_id].emit('message', data.decode('utf8'))
elif pp_id == WEBRTC_STRING_EMPTY and stream_id in self._data_channels:
# emit message
self._data_channels[stream_id].emit('message', '')
elif pp_id == WEBRTC_BINARY and stream_id in self._data_channels:
# emit message
self._data_channels[stream_id].emit('message', data)
elif pp_id == WEBRTC_BINARY_EMPTY and stream_id in self._data_channels:
# emit message
self._data_channels[stream_id].emit('message', b'')
def _data_channel_send(self, channel, data):
if data == '':
pp_id, user_data = WEBRTC_STRING_EMPTY, b'\x00'
elif isinstance(data, str):
pp_id, user_data = WEBRTC_STRING, data.encode('utf8')
elif data == b'':
pp_id, user_data = WEBRTC_BINARY_EMPTY, b'\x00'
else:
pp_id, user_data = WEBRTC_BINARY, data
channel._addBufferedAmount(len(user_data))
self._data_channel_queue.append((channel, pp_id, user_data))
asyncio.ensure_future(self._data_channel_flush())
    def __log_debug(self, msg, *args):
        role = 'server' if self.is_server else 'client'
        logger.debug(role + ' ' + msg, *args)
class State(enum.Enum):
CLOSED = 1
COOKIE_WAIT = 2
COOKIE_ECHOED = 3
ESTABLISHED = 4
SHUTDOWN_PENDING = 5
SHUTDOWN_SENT = 6
SHUTDOWN_RECEIVED = 7
SHUTDOWN_ACK_SENT = 8
# === File: /venv/Lib/site-packages/w3af/core/data/misc/response_cache_key.py (repo: AravindChan96/Vulcan) ===
"""
response_cache_key.py
Copyright 2019 Andres Riancho
This file is part of w3af, http://w3af.org/ .
w3af is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation version 2 of the License.
w3af is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with w3af; if not, write to the Free Software
Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
"""
import zlib
# pylint: disable=E0401
from darts.lib.utils.lru import SynchronizedLRUDict
# pylint: enable=E0401
from w3af.core.controllers.core_helpers.not_found.response import FourOhFourResponse
from w3af.core.data.misc.xml_bones import get_xml_bones
from w3af.core.data.misc.encoding import smart_str_ignore
def get_response_cache_key(http_response,
clean_response=None,
headers=None):
"""
Note: query.body has been cleaned by get_clean_body()
:param http_response: The HTTP response we want to get a cache key for
:param clean_response: The FourOhFourResponse associated with the HTTPResponse
passed as parameter (optional, will be calculated if not
provided)
:param headers: A string containing the HTTP response headers that have to be
used to calculate the hash
:return: Hash of the HTTP response body
"""
    headers = headers or ''  # was `'' or headers`, which left a None value unchanged
#
# Only some HTTP responses benefit from the XML-bones signature
#
if _should_use_xml_bones(http_response):
body = get_xml_bones(http_response.get_body())
normalized_path = FourOhFourResponse.normalize_path(http_response.get_uri())
else:
#
# Get a clean_response if it was not provided
#
if clean_response is None:
clean_response = FourOhFourResponse.from_http_response(http_response)
body = clean_response.body
normalized_path = clean_response.normalized_path
#
# Calculate the hash using all the captured information
#
key = ''.join([str(http_response.get_code()),
smart_str_ignore(normalized_path),
str(headers),
smart_str_ignore(body)])
return quick_hash(key)
def _should_use_xml_bones(http_response):
# Ignore small responses (the bones for this document is not so
# representative)
if len(http_response.get_body()) < 256:
return False
# Ignore large responses (might break lxml parser)
if len(http_response.get_body()) > 1024 * 1024:
return False
# Check that this document is xml / html
has_expected_content_type = False
for content_type in ('xml', 'html'):
if content_type in http_response.content_type:
has_expected_content_type = True
if not has_expected_content_type:
return False
# Check that it actually has tags
if http_response.get_body().count('<') < 20:
return False
return True
def quick_hash(text):
text = smart_str_ignore(text)
return '%s%s' % (hash(text), zlib.adler32(text))
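# quick_hash trades collision resistance for speed: hash() is fast but weak,
# so the adler32 checksum is appended to reduce accidental collisions.
# (Note: under Python 3, hash() of bytes is salted per process; under
# Python 2, which w3af targets, it is stable across runs.)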
class ResponseCacheKeyCache(object):
#
# The memory impact of having a large number of items in this cache is
# really low, both the keys and the values are short strings (the result of
# quick_hash)
#
MAX_SIZE = 2000
def __init__(self):
self._cache = SynchronizedLRUDict(self.MAX_SIZE)
def get_response_cache_key(self,
http_response,
clean_response=None,
headers=None):
# When the clean response is available, use that body to calculate the
# cache key. It has been cleaned (removed request paths and QS parameters)
# so it has a higher chance of being equal to other responses / being
# already in the cache
if clean_response is not None:
body = clean_response.body
else:
body = http_response.body
cache_key = '%s%s' % (smart_str_ignore(body), headers)
cache_key = quick_hash(cache_key)
result = self._cache.get(cache_key, None)
if result is not None:
return result
result = get_response_cache_key(http_response,
clean_response=clean_response,
headers=headers)
self._cache[cache_key] = result
return result
def clear_cache(self):
self._cache.clear()
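# Typical usage (sketch, assuming an HTTPResponse instance `resp` from the
# w3af core):
#
#     cache = ResponseCacheKeyCache()
#     key = cache.get_response_cache_key(resp)
#
# Repeated responses with the same cleaned body hit the LRU cache and skip
# the more expensive key computation above.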
# === File: drawBot/context/gifContext.py (repo: bitforks/drawbot, BSD-2/3-Clause) ===
import AppKit
import Quartz
import os
import tempfile
import subprocess
from imageContext import ImageContext
gifsiclePath = os.path.join(os.path.dirname(__file__), "tools", "gifsicle")
if not os.path.exists(gifsiclePath):
gifsiclePath = os.path.join(os.getcwd(), "tools", "gifsicle")
class GifContext(ImageContext):
_saveImageFileTypes = {
"gif": AppKit.NSGIFFileType,
}
fileExtensions = _saveImageFileTypes.keys()
_delay = 10
def __init__(self):
super(GifContext, self).__init__()
self._delayData = []
def _frameDuration(self, seconds):
# gifsicle -h: Set frame delay to TIME (in 1/100sec).
self._delayData[-1] = int(seconds * 100)
def _newPage(self, width, height):
super(GifContext, self)._newPage(width, height)
self._delayData.append(self._delay)
def _writeDataToFile(self, data, path, multipage):
pdfDocument = Quartz.PDFDocument.alloc().initWithData_(data)
pageCount = pdfDocument.pageCount()
shouldBeAnimated = pageCount > 1
tempPath = path
if shouldBeAnimated:
tempPath = tempfile.mkstemp(suffix=".gif")[1]
inputPaths = super(GifContext, self)._writeDataToFile(data, tempPath, shouldBeAnimated)
if shouldBeAnimated:
cmds = [
# gifsicle path
gifsiclePath,
# optimize level
# "-O3",
# force to 256 colors
"--colors", "256",
# make it loop
"--loop",
]
# add source paths with delay for each frame
for i, inputPath in enumerate(inputPaths):
cmds += [
# add the frame duration
"--delay", "%i" % self._delayData[i],
# add the input gif for each frame
inputPath
]
cmds += [
# output path
"--output",
path
]
# make a string of escaped commands
cmds = subprocess.list2cmdline(cmds)
# go
popen = subprocess.Popen(cmds, shell=True)
popen.wait()
# remove the temp input gifs
for inputPath in inputPaths:
os.remove(inputPath)
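# For a two-frame animation the assembled command looks roughly like this
# (illustrative paths; the --delay values come from self._delayData):
#
#     gifsicle --colors 256 --loop --delay 10 /tmp/frame0.gif --delay 10 /tmp/frame1.gif --output final.gif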
# === File: /xai/brain/wordbase/otherforms/_sobers.py (repo: cash2one/xai, MIT) ===
#calss header
class _SOBERS():
def __init__(self,):
self.name = "SOBERS"
self.definitions = sober
self.parents = []
self.childen = []
self.properties = []
self.jsondata = {}
self.basic = ['sober']
| [
"[email protected]"
] | |
e8e3d8de20abef556139aa2fe44ad71d69297b8a | ba3231b25c60b73ca504cd788efa40d92cf9c037 | /nitro-python-13.0.36/nssrc/com/citrix/netscaler/nitro/resource/config/lb/lbvserver_dospolicy_binding.py | c16c8f0e9fb625627902597e36abce766c98e1f5 | [
"Apache-2.0",
"Python-2.0",
"LicenseRef-scancode-unknown-license-reference"
] | permissive | zhuweigh/vpx13 | f6d559ae85341e56472e3592cbc67062dac34b93 | b36caa3729d3ca5515fa725f2d91aeaabdb2daa9 | refs/heads/master | 2020-07-04T22:15:16.595728 | 2019-09-20T00:19:56 | 2019-09-20T00:19:56 | 202,435,307 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 6,292 | py | #
# Copyright (c) 2008-2019 Citrix Systems, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License")
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from nssrc.com.citrix.netscaler.nitro.resource.base.base_resource import base_resource
from nssrc.com.citrix.netscaler.nitro.resource.base.base_resource import base_response
from nssrc.com.citrix.netscaler.nitro.service.options import options
from nssrc.com.citrix.netscaler.nitro.exception.nitro_exception import nitro_exception
from nssrc.com.citrix.netscaler.nitro.util.nitro_util import nitro_util
class lbvserver_dospolicy_binding(base_resource) :
""" Binding class showing the dospolicy that can be bound to lbvserver.
"""
def __init__(self) :
self._policyname = None
self._priority = None
self._name = None
self.___count = None
@property
def priority(self) :
r"""Priority.
"""
try :
return self._priority
except Exception as e:
raise e
@priority.setter
def priority(self, priority) :
r"""Priority.
"""
try :
self._priority = priority
except Exception as e:
raise e
@property
def policyname(self) :
r"""Name of the policy bound to the LB vserver.
"""
try :
return self._policyname
except Exception as e:
raise e
@policyname.setter
def policyname(self, policyname) :
r"""Name of the policy bound to the LB vserver.
"""
try :
self._policyname = policyname
except Exception as e:
raise e
@property
def name(self) :
r"""Name for the virtual server. Must begin with an ASCII alphanumeric or underscore (_) character, and must contain only ASCII alphanumeric, underscore, hash (#), period (.), space, colon (:), at sign (@), equal sign (=), and hyphen (-) characters. Can be changed after the virtual server is created.
CLI Users: If the name includes one or more spaces, enclose the name in double or single quotation marks (for example, "my vserver" or 'my vserver'). .<br/>Minimum length = 1.
"""
try :
return self._name
except Exception as e:
raise e
@name.setter
def name(self, name) :
r"""Name for the virtual server. Must begin with an ASCII alphanumeric or underscore (_) character, and must contain only ASCII alphanumeric, underscore, hash (#), period (.), space, colon (:), at sign (@), equal sign (=), and hyphen (-) characters. Can be changed after the virtual server is created.
CLI Users: If the name includes one or more spaces, enclose the name in double or single quotation marks (for example, "my vserver" or 'my vserver'). .<br/>Minimum length = 1
"""
try :
self._name = name
except Exception as e:
raise e
def _get_nitro_response(self, service, response) :
r""" converts nitro response into object and returns the object array in case of get request.
"""
try :
result = service.payload_formatter.string_to_resource(lbvserver_dospolicy_binding_response, response, self.__class__.__name__)
if(result.errorcode != 0) :
if (result.errorcode == 444) :
service.clear_session(self)
if result.severity :
if (result.severity == "ERROR") :
raise nitro_exception(result.errorcode, str(result.message), str(result.severity))
else :
raise nitro_exception(result.errorcode, str(result.message), str(result.severity))
return result.lbvserver_dospolicy_binding
except Exception as e :
raise e
def _get_object_name(self) :
r""" Returns the value of object identifier argument
"""
try :
if self.name is not None :
return str(self.name)
return None
except Exception as e :
raise e
@classmethod
def get(cls, service, name="", option_="") :
r""" Use this API to fetch lbvserver_dospolicy_binding resources.
"""
try :
if not name :
obj = lbvserver_dospolicy_binding()
response = obj.get_resources(service, option_)
else :
obj = lbvserver_dospolicy_binding()
obj.name = name
response = obj.get_resources(service)
return response
except Exception as e:
raise e
@classmethod
def get_filtered(cls, service, name, filter_) :
r""" Use this API to fetch filtered set of lbvserver_dospolicy_binding resources.
Filter string should be in JSON format.eg: "port:80,servicetype:HTTP".
"""
try :
obj = lbvserver_dospolicy_binding()
obj.name = name
option_ = options()
option_.filter = filter_
response = obj.getfiltered(service, option_)
return response
except Exception as e:
raise e
@classmethod
def count(cls, service, name) :
r""" Use this API to count lbvserver_dospolicy_binding resources configued on NetScaler.
"""
try :
obj = lbvserver_dospolicy_binding()
obj.name = name
option_ = options()
option_.count = True
response = obj.get_resources(service, option_)
if response :
return response[0].__dict__['___count']
return 0
except Exception as e:
raise e
@classmethod
def count_filtered(cls, service, name, filter_) :
r""" Use this API to count the filtered set of lbvserver_dospolicy_binding resources.
Filter string should be in JSON format.eg: "port:80,servicetype:HTTP".
"""
try :
obj = lbvserver_dospolicy_binding()
obj.name = name
option_ = options()
option_.count = True
option_.filter = filter_
response = obj.getfiltered(service, option_)
if response :
return response[0].__dict__['___count']
return 0
except Exception as e:
raise e
class Bindpoint:
REQUEST = "REQUEST"
RESPONSE = "RESPONSE"
class Labeltype:
reqvserver = "reqvserver"
resvserver = "resvserver"
policylabel = "policylabel"
class lbvserver_dospolicy_binding_response(base_response) :
def __init__(self, length=1) :
self.lbvserver_dospolicy_binding = []
self.errorcode = 0
self.message = ""
self.severity = ""
self.sessionid = ""
self.lbvserver_dospolicy_binding = [lbvserver_dospolicy_binding() for _ in range(length)]
| [
"[email protected]"
] | |
e4eb072d8f1eeaaa4cdf7f001effd5d4efd96d64 | 3f84f51751c4191bb81c9df7094578461fb12a2d | /AtcoderProblems/ABC/ABC063/C.py | a0d1edc24569a3ff1337dee5f6488331c804eeb4 | [] | no_license | rikukawamura/atcoder | 7ff49f1bd8534b99d87fe81ef950e1ba77eee8b8 | 09c0cfe3ce25be56d338614a29e996f4106117cd | refs/heads/master | 2023-08-13T21:21:19.058219 | 2021-09-28T10:02:42 | 2021-09-28T10:02:42 | 329,206,601 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 661 | py | import pdb
import itertools
N = int(input())
S = sorted([int(input()) for _ in range(N)], reverse=False)
non_ten = []
ten = []
for s in S:
if s % 10 == 0:
ten.append(s)
else:
non_ten.append(s)
ten = sorted(ten)
non_ten = sorted(non_ten)
total = sum(ten)+sum(non_ten)
'''
1. 総合点が10の倍数且つ,10の倍数でない問題がある場合
2. 総合点が10の倍数且つ,10の倍数でない問題がない場合
3. 総合点が10の倍数でない
の三つに場合分け
'''
if total % 10 == 0 and non_ten != []:
print(total-non_ten[0])
elif total % 10 == 0 and non_ten == []:
print(0)
else:
print(total) | [
"[email protected]"
] | |
5ca637eb2ebfb7ed1a6f2ba5d043a4f944c62dc9 | 8ff12c53e31f134b9f39f59b9a6f7d4f9142cea7 | /src/if.py | e048368b8880f8521d2dce9ee91380e173b38b8e | [] | no_license | quhuohuo/python | 5b0a80dbec7d22a0b274e4a32d269e85d254718c | 5732c5974519da8e8919dab42b36ab0ab2c99b37 | refs/heads/master | 2021-06-13T11:41:12.356329 | 2017-04-07T08:58:05 | 2017-04-07T08:58:05 | 75,054,981 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 94 | py | #!/usr/bin/python
#coding=utf-8
#简单的if语句
a = input()
if a > 0:
print "a > 0"
| [
"[email protected]"
] | |
50899501d48aefbf8069736a20d4adf832d5c014 | 195f19578f3eea3f8c3e3a780655ce2f8dd009d0 | /caixa_racional/views.py | 11f834c7fb9fdce95b8fa501e59832eee8e6a44b | [] | no_license | osmarsalesjr/TheBeeFreshTeam | 5b68c26d413940badc0814fb5c4cfc953b4fb695 | 9ca839083d903236054a813b265b0d09f34cb288 | refs/heads/master | 2022-12-17T11:22:30.482340 | 2019-07-14T00:11:50 | 2019-07-14T00:11:50 | 194,682,998 | 0 | 0 | null | 2022-12-08T01:47:50 | 2019-07-01T13:57:06 | Python | UTF-8 | Python | false | false | 1,240 | py | from rest_framework.response import Response
from rest_framework.reverse import reverse
from rest_framework import generics
from caixa_racional.models import Temperatura, BaseDeDados
from caixa_racional.serializers import TemperaturaSerializer, BaseDeDadosSerializer
class TemperaturaList(generics.ListCreateAPIView):
queryset = Temperatura.objects.order_by('-tempo')
serializer_class = TemperaturaSerializer
name = 'temperatura-list'
class TemperaturaDetail(generics.RetrieveUpdateDestroyAPIView):
queryset = Temperatura.objects.all()
serializer_class = TemperaturaSerializer
name = 'temperatura-detail'
class BaseDeDadosList(generics.ListCreateAPIView):
queryset = BaseDeDados.objects.all()
serializer_class = BaseDeDadosSerializer
name = 'base-de-dados-list'
class BaseDeDadosDetail(generics.RetrieveUpdateDestroyAPIView):
queryset = BaseDeDados.objects.all()
serializer_class = BaseDeDadosSerializer
name = 'base-de-dados-detail'
# Create your views here.
class ApiRoot(generics.GenericAPIView):
name = 'api-root'
def get(self, request, *args, **kwargs):
return Response({
'temperaturas': reverse(TemperaturaList.name, request=request),
})
| [
"[email protected]"
] | |
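# These views are typically wired up in urls.py (sketch; the route names must
# match each view's `name` attribute above):
#
#     path('temperaturas/', TemperaturaList.as_view(), name=TemperaturaList.name),
#     path('temperaturas/<int:pk>/', TemperaturaDetail.as_view(), name=TemperaturaDetail.name),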
# === File: /make_plot.py (repo: RaymondSimons/clear_local) ===
import numpy as np
from matplotlib.backends.backend_pdf import PdfPages
from astropy.io import fits, ascii
from astropy.coordinates import SkyCoord
import astropy.units as u
import matplotlib as mpl
from astropy.cosmology import Planck15 as cosmo
import matplotlib.pyplot as plt
from numpy import *
from glob import glob
plt.rcParams['xtick.labelsize']=14
plt.rcParams['ytick.labelsize']=14
plt.ioff()
plt.close('all')
mpl.rcParams['text.usetex'] = True
mpl.rcParams['text.latex.preamble'] = [r'\usepackage{amsmath}']
cat = np.loadtxt('/Users/rsimons/Desktop/clear/Catalogs/z_r_O32.cat',dtype = 'str')
fls = glob('/Users/rsimons/Desktop/clear/izi_metal_profiles/fits/*npy')
ma_cat = np.loadtxt('/Users/rsimons/Dropbox/rcs_clear/catalogs/ma17.cat')
wang_cat = np.loadtxt('/Users/rsimons/Dropbox/rcs_clear/catalogs/wang17.cat')
wang18_cat = np.loadtxt('/Users/rsimons/Dropbox/rcs_clear/catalogs/wang18.cat')
jones_cat = np.loadtxt('/Users/rsimons/Dropbox/rcs_clear/catalogs/jones+13.cat')
swinbank_cat = np.loadtxt('/Users/rsimons/Dropbox/rcs_clear/catalogs/swinbank12.cat')
#GN1_physcat = np.loadtxt('/Users/rsimons/Desktop/clear/Catalogs/GN1_physcat.cat')
#GN2_physcat = np.loadtxt('/Users/rsimons/Desktop/clear/Catalogs/GN2_physcat.cat')
#GN3_physcat = np.loadtxt('/Users/rsimons/Desktop/clear/Catalogs/GN3_physcat.cat')
#gs_fout = fits.open('/Users/rsimons/Desktop/clear/Catalogs/goodss_3dhst.v4.4.fout.FITS')
#gn_fout = fits.open('/Users/rsimons/Desktop/clear/Catalogs/goodsn_3dhst.v4.4.fout.FITS')
#gn_cat = fits.open('/Users/rsimons/Desktop/clear/Catalogs/goodsn_3dhst.v4.4.cats/Catalog/goodsn_3dhst.v4.4.cat.FITS')
#gs_cat = fits.open('/Users/rsimons/Desktop/clear/Catalogs/goodss_3dhst.v4.4.cats/Catalog/goodss_3dhst.v4.4.cat.FITS')
count_all = 0
count_flat = 0
count_falling = 0
count_rising = 0
if False:
gn_cat = fits.open('/Users/rsimons/Desktop/clear/Catalogs/goodsn_3dhst.v4.4.zout.fits')
gs_cat = fits.open('/Users/rsimons/Desktop/clear/Catalogs/goodss_3dhst.v4.4.zout.fits')
x_gds = ascii.read('/Users/rsimons/Desktop/clear/Catalogs/clear_gdsxray.dat')
x_gdn = ascii.read('/Users/rsimons/Desktop/clear/Catalogs/clear_gdnxray.dat')
gf_gds = ascii.read('/Users/rsimons/Desktop/clear/Catalogs/allfields/goodss/goodss_3dhst.v4.1_f160w.galfit')
gf_gdn = ascii.read('/Users/rsimons/Desktop/clear/Catalogs/allfields/goodsn/goodsn_3dhst.v4.1_f160w.galfit')
re_arr = []
ms_arr = []
fit_types = array(['', '_S', '_EC', '_S_EC'])
if True:
for ft, fit_type in enumerate(fit_types):
with PdfPages('/Users/rsimons/Desktop/clear/figures/izi_z_radius%s.pdf'%fit_type) as pdf:
ms = 5
fig, ax = plt.subplots(1,1, figsize = (9, 4))
if False:
ax.errorbar(wang_cat[:,5], wang_cat[:,3], yerr =wang_cat[:,4], fmt = 'o', ms = ms,color = 'blue', label = 'Wang+ 17', zorder = 1)
ax.errorbar(wang18_cat[:,3], wang18_cat[:,1], yerr =wang18_cat[:,2], fmt = 'o', ms = ms,color = 'darkblue', label = 'Wang+ 18', zorder = 1)
ax.errorbar(jones_cat[:,0], jones_cat[:,1], yerr =jones_cat[:,2], fmt = 'o', ms = ms,color = 'skyblue', label = 'Jones+ 13', zorder = 1)
ax.errorbar(swinbank_cat[:,0], swinbank_cat[:,1], yerr =swinbank_cat[:,2], ms = ms,fmt = 'o', color = 'darkgreen', label = 'Swinbank+ 12', zorder = 1)
ax.errorbar(log10(ma_cat[:,1]), ma_cat[:,7], yerr =ma_cat[:,8], fmt = 's', ms = ms,markeredgecolor = 'black', markerfacecolor = "None", color = 'black', label = 'Ma+ 17; simulations', zorder = 1)
wuyts_x = linspace(10.0, 11.5, 100)
wuyts_y = -0.017*(wuyts_x - 10) + 0.0
ax.errorbar(-99, -1, yerr = 1., fmt = mrker, color = 'red', markeredgecolor = 'black', ms = 6, alpha = 1.0, label = 'Simons+ in prep')
ax.plot(wuyts_x, wuyts_y, '--', linewidth = 3, color = 'midnightblue', label = 'Wuyts+ 16', zorder = 1)
ax.axhline(y = 0, linestyle = '-', color = 'grey', alpha = 0.4, linewidth = 2, zorder = 0)
to_pl = []
for fl in fls:
fld = fl.split('/')[-1].split('_')[0]
di = fl.split('/')[-1].split('_')[1]
if 'N' in fld:
ct = gn_cat
xcat = x_gdn
gcat = gf_gdn
elif 'S' in fld:
ct = gs_cat
xcat = x_gds
gcat = gf_gds
match_cat = where(int(di) == ct[1].data['id'])[0][0]
match_xcat = where(int(di) == xcat['ID'])[0]
match_gcat = where(int(di) == gcat['NUMBER'])[0]
if len(match_xcat) > 0: match_xcat = match_xcat[0]
xclass = xcat['Xclass'][match_xcat]
ra_c = ct[1].data['ra'][match_cat]
dec_c = ct[1].data['dec'][match_cat]
z = ct[1].data['z500'][match_cat]
re = gcat['re'][match_cat]
re_kpc = re/cosmo.arcsec_per_kpc_proper(z).value
mrker = 'o'
mstar = np.log10(ct[1].data['mass'][match_cat])
                fit_data = np.load(fl, allow_pickle = True)[()]  # renamed from `ft`, which shadowed the loop index above
if xclass != 'AGN':
kpc_per_pix = 0.1 / cosmo.arcsec_per_kpc_proper(z).value
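                    # 0.1 arcsec/pixel is the assumed pixel scale here; dividing by
                    # the angular scale (arcsec/kpc) at redshift z converts fitted
                    # gradients from dex/pixel to dex/kpc.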
try:
                        crit1 = fit_data['p%s'%fit_type][0]/kpc_per_pix + 2*max(float(sqrt(fit_data['V%s'%fit_type][0,0])/kpc_per_pix), 0.03) < 0.0
                        crit2 = fit_data['p%s'%fit_type][0]/kpc_per_pix - 2*max(float(sqrt(fit_data['V%s'%fit_type][0,0])/kpc_per_pix), 0.03) > 0.0
if crit1 | crit2:
pass
else:
count_flat+=1
alp = 1.0
if False:
if crit1:
count_falling+=1
alp = 1.0
if False:
if crit2:
count_rising+=1
alp = 1.0
                        ax.errorbar(mstar, float(fit_data['p%s'%fit_type][0]/kpc_per_pix), yerr = float(sqrt(fit_data['V%s'%fit_type][0,0])/kpc_per_pix), fmt = mrker, color = 'red', markeredgecolor = 'black', ms = 5, alpha = alp)
                        #ax.errorbar(mstar, float(fit_data['p_S_EC'][0]/kpc_per_pix*re_kpc), yerr = 0., fmt = mrker, color = 'red', markeredgecolor = 'black', ms = 5, alpha = alp)
                        re_arr.append(fit_data['p%s'%fit_type][0]/kpc_per_pix*re_kpc)
ms_arr.append(mstar)
count_all+=1
                    except Exception: pass
#ax.errorbar(-99, -1, yerr = 0.01, fmt = 's', color = 'red', fillstyle = 'none', markeredgecolor = 'red', label = 'CLEAR, (N = 112)', zorder = 10)
#ax.errorbar(9.25, 0.0246, xerr = 0.25, yerr = 0.003, fmt = 'o', color = 'red', markeredgecolor = 'black', ms = 10, label = 'CLEAR, STACK', zorder = 10)
#ax.errorbar(9.75, 0.0163, xerr = 0.25, yerr = 0.003, fmt = 'o', color = 'red', markeredgecolor = 'black', ms = 10, zorder = 10)
#ax.errorbar(10.25, 0.0121, xerr = 0.25, yerr = 0.004, fmt = 'o', color = 'red', markeredgecolor = 'black', ms = 10, zorder = 10)
#ax.errorbar(10.75, 0.0055, xerr = 0.25, yerr = 0.008, fmt = 'o', color = 'red', markeredgecolor = 'black', ms = 10, zorder = 10)
ax.annotate(r'$0.7 < z < 2.0$', (0.55, 0.85), xycoords = 'axes fraction', fontsize = 25, fontweight = 'bold')
ax.set_xlabel(r'$\log$ M$_{*}$ (M$_{\odot}$)', fontsize = 20)
ax.set_ylabel(r'$\frac{\Delta \log(O/H)}{\Delta R}$ (dex kpc$^{-1}$)', rotation = 90, fontsize = 20)
ax.legend(bbox_to_anchor=(1.0, 1.05), frameon = False, fontsize = 18)
ax.set_ylim(-0.33, 0.5)
ax.set_xlim(8.2, 11.5)
fig.subplots_adjust(bottom = 0.20, left = 0.15, right = 0.65, top = 0.95)
pdf.savefig()
res = np.load('test.npy', allow_pickle = True)[()]
re_arr = res['re_arr']
ms_arr = res['mstar']
if False:
with PdfPages('/Users/rsimons/Desktop/clear/figures/izi_z_effradius.pdf') as pdf:
ms = 5
fig, ax = plt.subplots(1,1, figsize = (9, 4))
ax.axhline(y = 0, linestyle = '-', color = 'grey', alpha = 0.4, linewidth = 2, zorder = 0)
bins = np.arange(9, 10.50, 0.25)
for b, bn in enumerate(bins):
gd = where((ms_arr > bn) & (ms_arr < bn+0.25) & (abs(re_arr) < 0.5))[0]
re_bin = re_arr[gd]
med = np.median(re_bin)
d = np.abs(re_bin - med)
mdev = np.median(d)
mean_mstar = bn + 0.25/2.
mrker = 'D'
alp = 1.0
med = np.mean(re_bin)
#mdev = np.std(re_bin)/np.sqrt(len(re_bin))
print (med)
ax.errorbar(mean_mstar, med, yerr = mdev, fmt = mrker, color = 'red', markeredgecolor = 'black', ms = 10., alpha = alp)
#eyeballed from figure 11
belfiore_re = np.array([0.020, -0.04, -0.045, -0.08, -0.13, -0.14])
belfiore_ere = np.array([0.022, 0.01, 0.05, 0.05, 0.02, 0.02])
belfiore_mass = np.arange(9, 10.5, 0.25)
ax.errorbar(belfiore_mass, belfiore_re, yerr = belfiore_ere, fmt = mrker, color = 'black', markeredgecolor = 'black', ms = 10., alpha = alp)
#ax.annotate(r'$0.7 < z < 2.0$', (0.55, 0.85), xycoords = 'axes fraction', fontsize = 25, fontweight = 'bold')
ax.set_xlabel(r'$\log$ M$_{*}$ (M$_{\odot}$)', fontsize = 20)
ax.set_ylabel(r'$\frac{\Delta \log(O/H)}{\Delta R}$ (dex R$_{\text{eff}}$$^{-1}$)', rotation = 90, fontsize = 20)
ax.legend(bbox_to_anchor=(1.0, 1.05), frameon = False, fontsize = 18)
ax.set_ylim(-0.2, 0.15)
ax.set_xlim(8.7, 10.8)
ax.set_xticks(arange(9, 11, 0.5))
ax.set_yticks(arange(-0.2, 0.2, 0.1))
fig.subplots_adjust(bottom = 0.20, left = 0.15, right = 0.65, top = 0.95)
pdf.savefig()
if False:
with PdfPages('/Users/rsimons/Dropbox/rcs_clear/z_radius_plots/mstar_sfr.pdf') as pdf:
fig, ax = plt.subplots(1,1, figsize = (6.5, 3))
for c in cat:
fld = c[0]
if fld == 'GN1': physcat = GN1_physcat
if fld == 'GN2': physcat = GN2_physcat
if fld == 'GN3': physcat = GN3_physcat
#mstar = physcat[where(physcat[:,0] == int(c[1]))[0][0],1]
mstar = float(c[-5])
print (mstar)
sfr = float(c[4])
ax.plot(mstar, sfr, color = 'red', marker = 'o', zorder = 10, markeredgecolor = 'black', markersize = 10)
#ax.plot(-99, -99, color = 'red', marker = 'o', zorder = 10, markeredgecolor = 'black', label = 'CLEAR G102\n+ archival G141\n(2 of 10 pointings)')
whitaker_all = np.loadtxt('/Users/rsimons/Downloads/goodsn_3dhst_v4.1.5_catalogs/goodsn_3dhst.v4.1.5.zbest.sfr')
fast_all = fits.open('/Users/rsimons/Desktop/clear/Catalogs/goodsn_3dhst.v4.1.cats/Fast/goodsn_3dhst.v4.1.fout.FITS')
whit = []
for i in arange(len(whitaker_all)):
good = where(fast_all[1].data['id'] == whitaker_all[i,0])[0][0]
whit.append([fast_all[1].data['lmass'][good], whitaker_all[i,1], fast_all[1].data['z'][good]])
whit_sfr = array([[8.8 , -0.03, 0.13],
[9.1 , 0.17, 0.08],
[9.3 , 0.38, 0.05],
[9.5 , 0.64, 0.02],
[9.7 , 0.81, 0.02],
[9.9 , 1.02, 0.03],
[10.1, 1.18, 0.04],
[10.3, 1.35, 0.04],
[10.5, 1.47, 0.05],
[10.7, 1.58, 0.05],
[10.9, 1.69, 0.08],
[11.1, 1.74, 0.13],
[11.3, 1.81, 0.11]])
whit = array(whit)
ax.errorbar(whit_sfr[:,0], 10**whit_sfr[:,1], yerr = 10.**whit_sfr[:,2], fmt = 'o', color = 'grey', zorder = 1, label = 'GDN (medians)')
gz = where((whit[:,2] > 1.0) & (whit[:,2] < 1.5))[0]
ax.plot(whit[gz,0], whit[gz,1], marker = '.', linewidth = 0., color = 'grey', markersize = 1, zorder = 0, alpha = 0.4, label = 'GDN (all)')
M_whit = linspace(9, 12, 1000)
sfr = -24 + 4.17*M_whit - 0.16*M_whit**2.
ax.set_xlabel(r'$\log$ M$_{*}$ (M$_{\odot}$) [FAST]', fontsize = 15)
ax.set_ylabel(r'SFR$_{UV+IR}$ (M$_{\odot}$ yr$^{-1}$)', rotation = 90, fontsize = 15)
ax.legend(bbox_to_anchor=(1.0, 1.05), frameon = False)
ax.set_xlim(8.0, 11.5)
ax.set_ylim(0.1, 300)
ax.set_yscale('log')
ax.legend(bbox_to_anchor=(0.99, 0.95), frameon = False)
fig.subplots_adjust(bottom = 0.20, left = 0.18, right = 0.70, top = 0.95)
pdf.savefig()
# === File: /01-OpencvPythonTutorial/ch22/06-hist-normalized.py (repo: Damon0626/OpenCV3ForBeigner) ===
# -*-coding:utf-8-*-
# @Author: Damon0626
# @Time : 18-12-16 11:25 PM
# @Email : [email protected]
# @Software: PyCharm
import cv2
import numpy as np
import matplotlib.pyplot as plt
image = cv2.imread('contrast75.png')
hist, bins = np.histogram(image.flatten(), 256, [0, 256])
cdf = hist.cumsum()  # cumulative distribution
cdf_normalized = cdf*hist.max()/cdf.max()
plt.plot(cdf_normalized, color='b')
plt.hist(image.flatten(), 256, [0, 256], color='r')
plt.xlim([0, 256])
plt.legend(['cdf', 'histogram'], loc='upper left')
plt.show()
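# The normalized CDF plotted above is the first step of histogram
# equalization; a sketch of the usual next step (following the OpenCV
# tutorial this chapter is based on):
#
#     cdf_m = np.ma.masked_equal(cdf, 0)
#     lut = ((cdf_m - cdf_m.min())*255/(cdf_m.max() - cdf_m.min())).filled(0).astype('uint8')
#     equalized = lut[image]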
# === File: /capture/noworkflow/now/utils/bytecode/f_trace.py (repo: hugobowne/noworkflow, MIT) ===
# Copyright (c) 2015 Universidade Federal Fluminense (UFF)
# Copyright (c) 2015 Polytechnic Institute of New York University.
# This file is part of noWorkflow.
# Please, consult the license terms in the LICENSE file.
""" Define f_trace related interpreters and functions """
# pylint: disable=R0902
from __future__ import (absolute_import, print_function,
division, unicode_literals)
import sys
from .code_interpreter import CodeInterpreter, PyInterpreter
class AlmostReadOnlyDict(dict):
""" Use it to avoid changes on the original dict """
def __init__(self, *args, **kwargs):
super(AlmostReadOnlyDict, self).__init__(*args, **kwargs)
self.other = {}
def __getitem__(self, item):
if item in self.other:
return self.other[item]
return super(AlmostReadOnlyDict, self).__getitem__(item)
def __setitem__(self, item, value):
self.other[item] = value
def __delitem__(self, item):
if item in self.other:
del self.other[item]
class FindFTrace(CodeInterpreter):
""" Find <expr>.f_trace attribution """
def __init__(self, *args, **kwargs):
# Disable operations that may cause effect
# Default
# self.store_fast = self.nop
self.store_subscr = self.nop
# self.store_name = self.nop
self.store_global = self.nop
# self.delete_fast = self.nop
self.delete_subscr = self.nop
# self.delete_name = self.nop
self.delete_attr = self.nop
self.delete_global = self.nop
self.print_expr = self.nop
# Python 2
        self.store_slice__0 = self.nop
self.store_slice__1 = self.nop
self.store_slice__2 = self.nop
self.store_slice__3 = self.nop
self.delete_slice__0 = self.nop
self.delete_slice__1 = self.nop
self.delete_slice__2 = self.nop
self.delete_slice__3 = self.nop
super(FindFTrace, self).__init__(*args, **kwargs)
self._locals = AlmostReadOnlyDict(self._locals)
self._globals = AlmostReadOnlyDict(self._globals)
def store_attr(self):
""" STORE_ATTR opcode """
if self.names[self.oparg] == 'f_trace':
self._stop = True
self.result = self.stack.pop() if self.stack else True
f_trace_name = u'FTraceExe' if sys.version_info >= (3, 0) else b'FTraceExe'
FTraceExe = type(f_trace_name, (FindFTrace, PyInterpreter), {})
def get_f_trace(code, loc, glob):
""" Get frame from frame.f_trace attribution """
interpreter = FTraceExe(code, loc, glob)
interpreter.execute()
return interpreter.result
def find_f_trace(code, loc, glob, lasti):
""" Check if code has frame.f_trace attribution """
if 'f_trace' not in code.co_names:
return False
interpreter = FindFTrace(code, loc, glob)
interpreter.execute()
if not interpreter.result:
return False
last_offset = 0
for offset in interpreter.linestarts:
if offset >= interpreter.opi:
return lasti == last_offset
last_offset = offset
return False
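# Sketch of how these helpers combine (`frame` is a hypothetical Python frame
# object whose code may assign to some_frame.f_trace):
#
#     if find_f_trace(frame.f_code, frame.f_locals, frame.f_globals, frame.f_lasti):
#         traced = get_f_trace(frame.f_code, frame.f_locals, frame.f_globals)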
# === File: host_im/mount/malware-classification-master/samples/not/sample_good807.py (repo: Barnsa/Dissertation) ===
import stringprep
import difflib
import math
import array
import textwrap
import datetime
import readline
import random
nterms = 618
n1, n2 = 0, 1
if nterms <= 0:
print("Please provide a positive integer.")
elif nterms == 1:
print("Fibonacci sequence upto", nterms, ":")
print(n1)
else:
	print("Fibonacci sequence:")
	count = 0
	# loop until nterms values have been printed (the original compared a
	# constant, `618 > 0`, which never terminated)
	while count < nterms:
		print(n1)
		nth = n1 + n2
		n1 = n2
		n2 = nth
		count += 1
# === File: /content/pythia/pythia/tasks/vqa/textvqa/dataset.py (repo: aluka1994/textvqa, BSD-3-Clause) ===
# Copyright (c) Facebook, Inc. and its affiliates.
from pythia.tasks.vqa.vizwiz import VizWizDataset
from pythia.utils.text_utils import word_tokenize
class TextVQADataset(VizWizDataset):
def __init__(self, dataset_type, imdb_file_index, config, *args, **kwargs):
super().__init__(dataset_type, imdb_file_index, config, *args, **kwargs)
self._name = "textvqa"
def format_for_evalai(self, report):
answers = report.scores.argmax(dim=1)
predictions = []
answer_space_size = self.answer_processor.get_true_vocab_size()
for idx, question_id in enumerate(report.question_id):
answer_id = answers[idx].item()
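            # ids beyond the fixed answer vocabulary point into the per-image
            # OCR tokens (copy mechanism), so map them back into context_tokens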
if answer_id >= answer_space_size:
answer_id -= answer_space_size
answer = word_tokenize(report.context_tokens[idx][answer_id])
else:
answer = self.answer_processor.idx2word(answer_id)
predictions.append({"question_id": question_id.item(), "answer": answer})
return predictions
# === File: /20200529/UI_PPT2PDF/myUI_ppt2pdf.py (repo: Aimee888/python-20200513) ===
#!/usr/bin/env python
# _*_ coding: UTF-8 _*_
"""=================================================
@Project -> File : six-dialog_design -> myUI_ppt2pdf.py
@IDE : PyCharm
@Author : Aimee
@Date : 2020/5/29 17:11
@Desc :
================================================="""
import sys
from PyQt5.QtWidgets import QApplication, QMainWindow
from PyQt5.QtGui import QPainter, QPixmap
from ui_ppt2pdf import Ui_MainWindow
class QmyMainWindow(QMainWindow):
def __init__(self, parent=None):
        super().__init__(parent)   # call the parent class constructor
        self.ui = Ui_MainWindow()  # create the UI object
        self.ui.setupUi(self)      # build the UI
if __name__ == '__main__':
    app = QApplication(sys.argv)  # create the application
form = QmyMainWindow()
form.show()
sys.exit(app.exec_())
# === File: ui/RepositoryTreeObject.py (repo: DronMDF/vanadis) ===
from pathlib import Path
from ui import RepositoryId
class RepositoryTreeObject:
''' This is a tree object (blob or tree) '''
def __init__(self, entry, prefix, repo=None):
self.entry = entry
self.prefix = prefix
self.repo = repo
def id(self):
return RepositoryId(self.entry.id)
def path(self):
return str(Path(self.prefix, self.entry.name))
def name(self):
return self.entry.name
def is_dir(self):
return self.entry.type == 'tree'
def content(self):
return self.repo[self.entry.id].data.decode('utf8')
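# Sketch of intended use (assuming a pygit2-style repo/tree API, which the
# entry attributes above suggest):
#
#     for entry in repo[tree_oid]:
#         obj = RepositoryTreeObject(entry, prefix='', repo=repo)
#         if not obj.is_dir():
#             print(obj.path(), len(obj.content()))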
# === File: /password_generator/migrations/0001_initial.py (repo: rwgeaston/django-password-generator, MIT) ===
# Generated by Django 2.0.6 on 2018-09-15 15:00
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='Word',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('word', models.CharField(max_length=15)),
('word_length', models.IntegerField(db_index=True)),
('count', models.IntegerField(db_index=True, default=0)),
],
),
migrations.CreateModel(
name='Wordset',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=30, unique=True)),
],
),
migrations.AddField(
model_name='word',
name='wordset',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='words', to='password_generator.Wordset'),
),
]
# === File: /data/parse_wiki.py (repo: mohit1997/DeepZip, MIT) ===
import io
import sys
import numpy as np
import json
import argparse
parser = argparse.ArgumentParser(description='Input')
parser.add_argument('-param_file', action='store', dest='param_file',
                    help='param file path', default='params.json')
parser.add_argument('-input', action='store', dest='input_file_path',
help='input file path', default='enwik8')
parser.add_argument('-output', action='store',dest='output_file_path',
help='output file path', default='npwik8')
args = parser.parse_args()
f = io.open(args.input_file_path, mode="r", encoding="utf-8")
data = f.read()
print(len(data))
vals = list(set(data))
char2id_dict = {c: i for (i,c) in enumerate(vals)}
id2char_dict = {i: c for (i,c) in enumerate(vals)}
params = {'char2id_dict':char2id_dict, 'id2char_dict':id2char_dict}
with open(args.param_file, 'w') as f:
json.dump(params, f, indent=4)
print(char2id_dict)
print(id2char_dict)
out = [char2id_dict[c] for c in data]
integer_encoded = np.array(out)
integer_encoded = integer_encoded.reshape(len(integer_encoded), 1)
print(integer_encoded[:10])
print(data[:10])
np.save(args.output_file_path, integer_encoded)
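# The saved ids can be mapped back to text with the stored dictionaries
# (sketch; note that JSON turns the integer keys of id2char_dict into strings):
#
#     params = json.load(open(args.param_file))
#     text = ''.join(params['id2char_dict'][str(i)] for i in integer_encoded[:, 0])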
# === File: Data Structure ZJU/printN2.py (repo: francislinking/PTA-codes) ===
# -*- coding: utf-8 -*-
"""
Created on Fri Feb 21 18:48:24 2020
@author: Deng Jie
"""
def printN( N ):
if N == 1:
print(1)
else:
printN( N-1 )
print(N)
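# Example: printN(3) prints 1, 2, 3 -- each call recurses before printing,
# so the smallest number comes out first.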
n = int(input())  # int() is safer than eval() for numeric input
printN(n)
# === File: /yelp/migrations/0002_userreview.py (repo: hldai/labelel) ===
# -*- coding: utf-8 -*-
# Generated by Django 1.10.7 on 2017-05-17 08:28
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('yelp', '0001_initial'),
]
operations = [
migrations.CreateModel(
name='UserReview',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('username', models.CharField(max_length=64)),
('review_id', models.CharField(max_length=64)),
],
),
]
# === File: /01-codes/scipy-master/scipy/cluster/tests/test_hierarchy.py (repo: QPanProjects/Surrogate-Model, MIT) ===
#! /usr/bin/env python
#
# Author: Damian Eads
# Date: April 17, 2008
#
# Copyright (C) 2008 Damian Eads
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions
# are met:
#
# 1. Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following
# disclaimer in the documentation and/or other materials provided
# with the distribution.
#
# 3. The name of the author may not be used to endorse or promote
# products derived from this software without specific prior
# written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS
# OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY
# DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE
# GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
# WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
# NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
from __future__ import division, print_function, absolute_import
import hierarchy_test_data
import numpy as np
import scipy.cluster.hierarchy
from numpy.testing import (TestCase, run_module_suite, dec, assert_raises,
assert_allclose, assert_equal, assert_, assert_warns)
from scipy._lib.six import xrange, u
from scipy.cluster._hierarchy import Heap
from scipy.cluster.hierarchy import (
ClusterWarning, linkage, from_mlab_linkage, to_mlab_linkage,
num_obs_linkage, inconsistent, cophenet, fclusterdata, fcluster,
is_isomorphic, single, leaders, correspond, is_monotonic, maxdists, maxinconsts, maxRstat,
is_valid_linkage, is_valid_im, to_tree, leaves_list, dendrogram,
set_link_color_palette, cut_tree, _order_cluster_tree,
_hierarchy, _LINKAGE_METHODS)
from scipy.spatial.distance import pdist
# Matplotlib is not a scipy dependency but is optionally used in dendrogram, so
# check if it's available
try:
import matplotlib
# and set the backend to be Agg (no gui)
matplotlib.use('Agg')
# before importing pyplot
import matplotlib.pyplot as plt
have_matplotlib = True
except Exception:
have_matplotlib = False
class TestLinkage(object):
def test_linkage_non_finite_elements_in_distance_matrix(self):
# Tests linkage(Y) where Y contains a non-finite element (e.g. NaN or Inf).
# Exception expected.
y = np.zeros((6,))
y[0] = np.nan
assert_raises(ValueError, linkage, y)
def test_linkage_empty_distance_matrix(self):
# Tests linkage(Y) where Y is a 0x4 linkage matrix. Exception expected.
y = np.zeros((0,))
assert_raises(ValueError, linkage, y)
def test_linkage_tdist(self):
for method in ['single', 'complete', 'average', 'weighted', u('single')]:
yield self.check_linkage_tdist, method
def check_linkage_tdist(self, method):
# Tests linkage(Y, method) on the tdist data set.
Z = linkage(hierarchy_test_data.ytdist, method)
expectedZ = getattr(hierarchy_test_data, 'linkage_ytdist_' + method)
assert_allclose(Z, expectedZ, atol=1e-10)
def test_linkage_X(self):
for method in ['centroid', 'median', 'ward']:
yield self.check_linkage_q, method
def check_linkage_q(self, method):
# Tests linkage(Y, method) on the Q data set.
Z = linkage(hierarchy_test_data.X, method)
expectedZ = getattr(hierarchy_test_data, 'linkage_X_' + method)
assert_allclose(Z, expectedZ, atol=1e-06)
y = scipy.spatial.distance.pdist(hierarchy_test_data.X,
metric="euclidean")
Z = linkage(y, method)
assert_allclose(Z, expectedZ, atol=1e-06)
def test_compare_with_trivial(self):
rng = np.random.RandomState(0)
n = 20
X = rng.rand(n, 2)
d = pdist(X)
for method, code in _LINKAGE_METHODS.items():
Z_trivial = _hierarchy.linkage(d, n, code)
Z = linkage(d, method)
assert_allclose(Z_trivial, Z, rtol=1e-14, atol=1e-15)
class TestLinkageTies(object):
_expectations = {
'single': np.array([[0, 1, 1.41421356, 2],
[2, 3, 1.41421356, 3]]),
'complete': np.array([[0, 1, 1.41421356, 2],
[2, 3, 2.82842712, 3]]),
'average': np.array([[0, 1, 1.41421356, 2],
[2, 3, 2.12132034, 3]]),
'weighted': np.array([[0, 1, 1.41421356, 2],
[2, 3, 2.12132034, 3]]),
'centroid': np.array([[0, 1, 1.41421356, 2],
[2, 3, 2.12132034, 3]]),
'median': np.array([[0, 1, 1.41421356, 2],
[2, 3, 2.12132034, 3]]),
'ward': np.array([[0, 1, 1.41421356, 2],
[2, 3, 2.44948974, 3]]),
}
def test_linkage_ties(self):
for method in ['single', 'complete', 'average', 'weighted', 'centroid', 'median', 'ward']:
yield self.check_linkage_ties, method
def check_linkage_ties(self, method):
X = np.array([[-1, -1], [0, 0], [1, 1]])
Z = linkage(X, method=method)
expectedZ = self._expectations[method]
assert_allclose(Z, expectedZ, atol=1e-06)
class TestInconsistent(object):
def test_inconsistent_tdist(self):
for depth in hierarchy_test_data.inconsistent_ytdist:
yield self.check_inconsistent_tdist, depth
def check_inconsistent_tdist(self, depth):
Z = hierarchy_test_data.linkage_ytdist_single
assert_allclose(inconsistent(Z, depth),
hierarchy_test_data.inconsistent_ytdist[depth])
class TestCopheneticDistance(object):
def test_linkage_cophenet_tdist_Z(self):
# Tests cophenet(Z) on tdist data set.
expectedM = np.array([268, 295, 255, 255, 295, 295, 268, 268, 295, 295,
295, 138, 219, 295, 295])
Z = hierarchy_test_data.linkage_ytdist_single
M = cophenet(Z)
assert_allclose(M, expectedM, atol=1e-10)
def test_linkage_cophenet_tdist_Z_Y(self):
# Tests cophenet(Z, Y) on tdist data set.
Z = hierarchy_test_data.linkage_ytdist_single
(c, M) = cophenet(Z, hierarchy_test_data.ytdist)
expectedM = np.array([268, 295, 255, 255, 295, 295, 268, 268, 295, 295,
295, 138, 219, 295, 295])
expectedc = 0.639931296433393415057366837573
assert_allclose(c, expectedc, atol=1e-10)
assert_allclose(M, expectedM, atol=1e-10)
class TestMLabLinkageConversion(object):
def test_mlab_linkage_conversion_empty(self):
# Tests from/to_mlab_linkage on empty linkage array.
X = np.asarray([])
assert_equal(from_mlab_linkage([]), X)
assert_equal(to_mlab_linkage([]), X)
def test_mlab_linkage_conversion_single_row(self):
# Tests from/to_mlab_linkage on linkage array with single row.
Z = np.asarray([[0., 1., 3., 2.]])
Zm = [[1, 2, 3]]
assert_equal(from_mlab_linkage(Zm), Z)
assert_equal(to_mlab_linkage(Z), Zm)
def test_mlab_linkage_conversion_multiple_rows(self):
# Tests from/to_mlab_linkage on linkage array with multiple rows.
Zm = np.asarray([[3, 6, 138], [4, 5, 219],
[1, 8, 255], [2, 9, 268], [7, 10, 295]])
Z = np.array([[2., 5., 138., 2.],
[3., 4., 219., 2.],
[0., 7., 255., 3.],
[1., 8., 268., 4.],
[6., 9., 295., 6.]],
dtype=np.double)
assert_equal(from_mlab_linkage(Zm), Z)
assert_equal(to_mlab_linkage(Z), Zm)
class TestFcluster(object):
def test_fclusterdata(self):
for t in hierarchy_test_data.fcluster_inconsistent:
yield self.check_fclusterdata, t, 'inconsistent'
for t in hierarchy_test_data.fcluster_distance:
yield self.check_fclusterdata, t, 'distance'
for t in hierarchy_test_data.fcluster_maxclust:
yield self.check_fclusterdata, t, 'maxclust'
def check_fclusterdata(self, t, criterion):
# Tests fclusterdata(X, criterion=criterion, t=t) on a random 3-cluster data set.
expectedT = getattr(hierarchy_test_data, 'fcluster_' + criterion)[t]
X = hierarchy_test_data.Q_X
T = fclusterdata(X, criterion=criterion, t=t)
assert_(is_isomorphic(T, expectedT))
def test_fcluster(self):
for t in hierarchy_test_data.fcluster_inconsistent:
yield self.check_fcluster, t, 'inconsistent'
for t in hierarchy_test_data.fcluster_distance:
yield self.check_fcluster, t, 'distance'
for t in hierarchy_test_data.fcluster_maxclust:
yield self.check_fcluster, t, 'maxclust'
def check_fcluster(self, t, criterion):
# Tests fcluster(Z, criterion=criterion, t=t) on a random 3-cluster data set.
expectedT = getattr(hierarchy_test_data, 'fcluster_' + criterion)[t]
Z = single(hierarchy_test_data.Q_X)
T = fcluster(Z, criterion=criterion, t=t)
assert_(is_isomorphic(T, expectedT))
def test_fcluster_monocrit(self):
for t in hierarchy_test_data.fcluster_distance:
yield self.check_fcluster_monocrit, t
for t in hierarchy_test_data.fcluster_maxclust:
yield self.check_fcluster_maxclust_monocrit, t
def check_fcluster_monocrit(self, t):
expectedT = hierarchy_test_data.fcluster_distance[t]
Z = single(hierarchy_test_data.Q_X)
T = fcluster(Z, t, criterion='monocrit', monocrit=maxdists(Z))
assert_(is_isomorphic(T, expectedT))
def check_fcluster_maxclust_monocrit(self, t):
expectedT = hierarchy_test_data.fcluster_maxclust[t]
Z = single(hierarchy_test_data.Q_X)
T = fcluster(Z, t, criterion='maxclust_monocrit', monocrit=maxdists(Z))
assert_(is_isomorphic(T, expectedT))
class TestLeaders(object):
def test_leaders_single(self):
# Tests leaders using a flat clustering generated by single linkage.
X = hierarchy_test_data.Q_X
Y = pdist(X)
Z = linkage(Y)
T = fcluster(Z, criterion='maxclust', t=3)
Lright = (np.array([53, 55, 56]), np.array([2, 3, 1]))
L = leaders(Z, T)
assert_equal(L, Lright)
class TestIsIsomorphic(object):
def test_is_isomorphic_1(self):
# Tests is_isomorphic on test case #1 (one flat cluster, different labellings)
a = [1, 1, 1]
b = [2, 2, 2]
assert_(is_isomorphic(a, b))
assert_(is_isomorphic(b, a))
def test_is_isomorphic_2(self):
# Tests is_isomorphic on test case #2 (two flat clusters, different labelings)
a = [1, 7, 1]
b = [2, 3, 2]
assert_(is_isomorphic(a, b))
assert_(is_isomorphic(b, a))
def test_is_isomorphic_3(self):
# Tests is_isomorphic on test case #3 (no flat clusters)
a = []
b = []
assert_(is_isomorphic(a, b))
def test_is_isomorphic_4A(self):
# Tests is_isomorphic on test case #4A (3 flat clusters, different labelings, isomorphic)
a = [1, 2, 3]
b = [1, 3, 2]
assert_(is_isomorphic(a, b))
assert_(is_isomorphic(b, a))
def test_is_isomorphic_4B(self):
# Tests is_isomorphic on test case #4B (3 flat clusters, different labelings, nonisomorphic)
a = [1, 2, 3, 3]
b = [1, 3, 2, 3]
assert_(is_isomorphic(a, b) == False)
assert_(is_isomorphic(b, a) == False)
def test_is_isomorphic_4C(self):
# Tests is_isomorphic on test case #4C (3 flat clusters, different labelings, isomorphic)
a = [7, 2, 3]
b = [6, 3, 2]
assert_(is_isomorphic(a, b))
assert_(is_isomorphic(b, a))
def test_is_isomorphic_5(self):
# Tests is_isomorphic on test case #5 (1000 observations, 2/3/5 random
# clusters, random permutation of the labeling).
for nc in [2, 3, 5]:
yield self.help_is_isomorphic_randperm, 1000, nc
def test_is_isomorphic_6(self):
# Tests is_isomorphic on test case #5A (1000 observations, 2/3/5 random
# clusters, random permutation of the labeling, slightly
# nonisomorphic.)
for nc in [2, 3, 5]:
yield self.help_is_isomorphic_randperm, 1000, nc, True, 5
def test_is_isomorphic_7(self):
# Regression test for gh-6271
assert_(not is_isomorphic([1, 2, 3], [1, 1, 1]))
def help_is_isomorphic_randperm(self, nobs, nclusters, noniso=False, nerrors=0):
for k in range(3):
a = np.int_(np.random.rand(nobs) * nclusters)
b = np.zeros(a.size, dtype=np.int_)
P = np.random.permutation(nclusters)
for i in xrange(0, a.shape[0]):
b[i] = P[a[i]]
if noniso:
Q = np.random.permutation(nobs)
b[Q[0:nerrors]] += 1
b[Q[0:nerrors]] %= nclusters
assert_(is_isomorphic(a, b) == (not noniso))
assert_(is_isomorphic(b, a) == (not noniso))
class TestIsValidLinkage(object):
def test_is_valid_linkage_various_size(self):
for nrow, ncol, valid in [(2, 5, False), (2, 3, False),
(1, 4, True), (2, 4, True)]:
yield self.check_is_valid_linkage_various_size, nrow, ncol, valid
def check_is_valid_linkage_various_size(self, nrow, ncol, valid):
# Tests is_valid_linkage(Z) with linkage matrics of various sizes
Z = np.asarray([[0, 1, 3.0, 2, 5],
[3, 2, 4.0, 3, 3]], dtype=np.double)
Z = Z[:nrow, :ncol]
assert_(is_valid_linkage(Z) == valid)
if not valid:
assert_raises(ValueError, is_valid_linkage, Z, throw=True)
def test_is_valid_linkage_int_type(self):
# Tests is_valid_linkage(Z) with integer type.
Z = np.asarray([[0, 1, 3.0, 2],
[3, 2, 4.0, 3]], dtype=int)
assert_(is_valid_linkage(Z) == False)
assert_raises(TypeError, is_valid_linkage, Z, throw=True)
def test_is_valid_linkage_empty(self):
# Tests is_valid_linkage(Z) with empty linkage.
Z = np.zeros((0, 4), dtype=np.double)
assert_(is_valid_linkage(Z) == False)
assert_raises(ValueError, is_valid_linkage, Z, throw=True)
def test_is_valid_linkage_4_and_up(self):
# Tests is_valid_linkage(Z) on linkage on observation sets between
# sizes 4 and 15 (step size 3).
for i in xrange(4, 15, 3):
y = np.random.rand(i * (i - 1) // 2)
Z = linkage(y)
assert_(is_valid_linkage(Z) == True)
def test_is_valid_linkage_4_and_up_neg_index_left(self):
# Tests is_valid_linkage(Z) on linkage on observation sets between
# sizes 4 and 15 (step size 3) with negative indices (left).
for i in xrange(4, 15, 3):
y = np.random.rand(i * (i - 1) // 2)
Z = linkage(y)
Z[i // 2, 0] = -2
assert_(is_valid_linkage(Z) == False)
assert_raises(ValueError, is_valid_linkage, Z, throw=True)
def test_is_valid_linkage_4_and_up_neg_index_right(self):
# Tests is_valid_linkage(Z) on linkage on observation sets between
# sizes 4 and 15 (step size 3) with negative indices (right).
for i in xrange(4, 15, 3):
y = np.random.rand(i * (i - 1) // 2)
Z = linkage(y)
Z[i // 2, 1] = -2
assert_(is_valid_linkage(Z) == False)
assert_raises(ValueError, is_valid_linkage, Z, throw=True)
def test_is_valid_linkage_4_and_up_neg_dist(self):
# Tests is_valid_linkage(Z) on linkage on observation sets between
# sizes 4 and 15 (step size 3) with negative distances.
for i in xrange(4, 15, 3):
y = np.random.rand(i * (i - 1) // 2)
Z = linkage(y)
Z[i // 2, 2] = -0.5
assert_(is_valid_linkage(Z) == False)
assert_raises(ValueError, is_valid_linkage, Z, throw=True)
def test_is_valid_linkage_4_and_up_neg_counts(self):
# Tests is_valid_linkage(Z) on linkage on observation sets between
# sizes 4 and 15 (step size 3) with negative counts.
for i in xrange(4, 15, 3):
y = np.random.rand(i * (i - 1) // 2)
Z = linkage(y)
Z[i // 2, 3] = -2
assert_(is_valid_linkage(Z) == False)
assert_raises(ValueError, is_valid_linkage, Z, throw=True)
class TestIsValidInconsistent(object):
def test_is_valid_im_int_type(self):
# Tests is_valid_im(R) with integer type.
R = np.asarray([[0, 1, 3.0, 2],
[3, 2, 4.0, 3]], dtype=int)
assert_(is_valid_im(R) == False)
assert_raises(TypeError, is_valid_im, R, throw=True)
def test_is_valid_im_various_size(self):
for nrow, ncol, valid in [(2, 5, False), (2, 3, False),
(1, 4, True), (2, 4, True)]:
yield self.check_is_valid_im_various_size, nrow, ncol, valid
def check_is_valid_im_various_size(self, nrow, ncol, valid):
        # Tests is_valid_im(R) with inconsistency matrices of various sizes
R = np.asarray([[0, 1, 3.0, 2, 5],
[3, 2, 4.0, 3, 3]], dtype=np.double)
R = R[:nrow, :ncol]
assert_(is_valid_im(R) == valid)
if not valid:
assert_raises(ValueError, is_valid_im, R, throw=True)
def test_is_valid_im_empty(self):
# Tests is_valid_im(R) with empty inconsistency matrix.
R = np.zeros((0, 4), dtype=np.double)
assert_(is_valid_im(R) == False)
assert_raises(ValueError, is_valid_im, R, throw=True)
def test_is_valid_im_4_and_up(self):
# Tests is_valid_im(R) on im on observation sets between sizes 4 and 15
# (step size 3).
for i in xrange(4, 15, 3):
y = np.random.rand(i * (i - 1) // 2)
Z = linkage(y)
R = inconsistent(Z)
assert_(is_valid_im(R) == True)
def test_is_valid_im_4_and_up_neg_index_left(self):
# Tests is_valid_im(R) on im on observation sets between sizes 4 and 15
# (step size 3) with negative link height means.
for i in xrange(4, 15, 3):
y = np.random.rand(i * (i - 1) // 2)
Z = linkage(y)
R = inconsistent(Z)
R[i // 2, 0] = -2.0
assert_(is_valid_im(R) == False)
assert_raises(ValueError, is_valid_im, R, throw=True)
def test_is_valid_im_4_and_up_neg_index_right(self):
# Tests is_valid_im(R) on im on observation sets between sizes 4 and 15
# (step size 3) with negative link height standard deviations.
for i in xrange(4, 15, 3):
y = np.random.rand(i * (i - 1) // 2)
Z = linkage(y)
R = inconsistent(Z)
R[i // 2, 1] = -2.0
assert_(is_valid_im(R) == False)
assert_raises(ValueError, is_valid_im, R, throw=True)
def test_is_valid_im_4_and_up_neg_dist(self):
# Tests is_valid_im(R) on im on observation sets between sizes 4 and 15
# (step size 3) with negative link counts.
for i in xrange(4, 15, 3):
y = np.random.rand(i * (i - 1) // 2)
Z = linkage(y)
R = inconsistent(Z)
R[i // 2, 2] = -0.5
assert_(is_valid_im(R) == False)
assert_raises(ValueError, is_valid_im, R, throw=True)
class TestNumObsLinkage(TestCase):
def test_num_obs_linkage_empty(self):
# Tests num_obs_linkage(Z) with empty linkage.
Z = np.zeros((0, 4), dtype=np.double)
assert_raises(ValueError, num_obs_linkage, Z)
def test_num_obs_linkage_1x4(self):
# Tests num_obs_linkage(Z) on linkage over 2 observations.
Z = np.asarray([[0, 1, 3.0, 2]], dtype=np.double)
assert_equal(num_obs_linkage(Z), 2)
def test_num_obs_linkage_2x4(self):
# Tests num_obs_linkage(Z) on linkage over 3 observations.
Z = np.asarray([[0, 1, 3.0, 2],
[3, 2, 4.0, 3]], dtype=np.double)
assert_equal(num_obs_linkage(Z), 3)
def test_num_obs_linkage_4_and_up(self):
# Tests num_obs_linkage(Z) on linkage on observation sets between sizes
# 4 and 15 (step size 3).
for i in xrange(4, 15, 3):
y = np.random.rand(i * (i - 1) // 2)
Z = linkage(y)
assert_equal(num_obs_linkage(Z), i)
class TestLeavesList(object):
def test_leaves_list_1x4(self):
# Tests leaves_list(Z) on a 1x4 linkage.
Z = np.asarray([[0, 1, 3.0, 2]], dtype=np.double)
to_tree(Z)
assert_equal(leaves_list(Z), [0, 1])
def test_leaves_list_2x4(self):
# Tests leaves_list(Z) on a 2x4 linkage.
Z = np.asarray([[0, 1, 3.0, 2],
[3, 2, 4.0, 3]], dtype=np.double)
to_tree(Z)
assert_equal(leaves_list(Z), [0, 1, 2])
def test_leaves_list_Q(self):
for method in ['single', 'complete', 'average', 'weighted', 'centroid',
'median', 'ward']:
yield self.check_leaves_list_Q, method
def check_leaves_list_Q(self, method):
# Tests leaves_list(Z) on the Q data set
X = hierarchy_test_data.Q_X
Z = linkage(X, method)
node = to_tree(Z)
assert_equal(node.pre_order(), leaves_list(Z))
def test_Q_subtree_pre_order(self):
# Tests that pre_order() works when called on sub-trees.
X = hierarchy_test_data.Q_X
Z = linkage(X, 'single')
node = to_tree(Z)
assert_equal(node.pre_order(), (node.get_left().pre_order()
+ node.get_right().pre_order()))
class TestCorrespond(TestCase):
def test_correspond_empty(self):
# Tests correspond(Z, y) with empty linkage and condensed distance matrix.
y = np.zeros((0,))
Z = np.zeros((0, 4))
assert_raises(ValueError, correspond, Z, y)
def test_correspond_2_and_up(self):
# Tests correspond(Z, y) on linkage and CDMs over observation sets of
# different sizes.
for i in xrange(2, 4):
y = np.random.rand(i * (i - 1) // 2)
Z = linkage(y)
assert_(correspond(Z, y))
for i in xrange(4, 15, 3):
y = np.random.rand(i * (i - 1) // 2)
Z = linkage(y)
assert_(correspond(Z, y))
def test_correspond_4_and_up(self):
# Tests correspond(Z, y) on linkage and CDMs over observation sets of
        # different sizes. Correspondence should be false.
for (i, j) in (list(zip(list(range(2, 4)), list(range(3, 5)))) +
list(zip(list(range(3, 5)), list(range(2, 4))))):
y = np.random.rand(i * (i - 1) // 2)
y2 = np.random.rand(j * (j - 1) // 2)
Z = linkage(y)
Z2 = linkage(y2)
assert_equal(correspond(Z, y2), False)
assert_equal(correspond(Z2, y), False)
def test_correspond_4_and_up_2(self):
# Tests correspond(Z, y) on linkage and CDMs over observation sets of
        # different sizes. Correspondence should be false.
for (i, j) in (list(zip(list(range(2, 7)), list(range(16, 21)))) +
list(zip(list(range(2, 7)), list(range(16, 21))))):
y = np.random.rand(i * (i - 1) // 2)
y2 = np.random.rand(j * (j - 1) // 2)
Z = linkage(y)
Z2 = linkage(y2)
assert_equal(correspond(Z, y2), False)
assert_equal(correspond(Z2, y), False)
def test_num_obs_linkage_multi_matrix(self):
# Tests num_obs_linkage with observation matrices of multiple sizes.
for n in xrange(2, 10):
X = np.random.rand(n, 4)
Y = pdist(X)
Z = linkage(Y)
assert_equal(num_obs_linkage(Z), n)
class TestIsMonotonic(TestCase):
def test_is_monotonic_empty(self):
# Tests is_monotonic(Z) on an empty linkage.
Z = np.zeros((0, 4))
assert_raises(ValueError, is_monotonic, Z)
def test_is_monotonic_1x4(self):
# Tests is_monotonic(Z) on 1x4 linkage. Expecting True.
Z = np.asarray([[0, 1, 0.3, 2]], dtype=np.double)
assert_equal(is_monotonic(Z), True)
def test_is_monotonic_2x4_T(self):
# Tests is_monotonic(Z) on 2x4 linkage. Expecting True.
Z = np.asarray([[0, 1, 0.3, 2],
[2, 3, 0.4, 3]], dtype=np.double)
assert_equal(is_monotonic(Z), True)
def test_is_monotonic_2x4_F(self):
# Tests is_monotonic(Z) on 2x4 linkage. Expecting False.
Z = np.asarray([[0, 1, 0.4, 2],
[2, 3, 0.3, 3]], dtype=np.double)
assert_equal(is_monotonic(Z), False)
def test_is_monotonic_3x4_T(self):
# Tests is_monotonic(Z) on 3x4 linkage. Expecting True.
Z = np.asarray([[0, 1, 0.3, 2],
[2, 3, 0.4, 2],
[4, 5, 0.6, 4]], dtype=np.double)
assert_equal(is_monotonic(Z), True)
def test_is_monotonic_3x4_F1(self):
# Tests is_monotonic(Z) on 3x4 linkage (case 1). Expecting False.
Z = np.asarray([[0, 1, 0.3, 2],
[2, 3, 0.2, 2],
[4, 5, 0.6, 4]], dtype=np.double)
assert_equal(is_monotonic(Z), False)
def test_is_monotonic_3x4_F2(self):
# Tests is_monotonic(Z) on 3x4 linkage (case 2). Expecting False.
Z = np.asarray([[0, 1, 0.8, 2],
[2, 3, 0.4, 2],
[4, 5, 0.6, 4]], dtype=np.double)
assert_equal(is_monotonic(Z), False)
def test_is_monotonic_3x4_F3(self):
# Tests is_monotonic(Z) on 3x4 linkage (case 3). Expecting False
Z = np.asarray([[0, 1, 0.3, 2],
[2, 3, 0.4, 2],
[4, 5, 0.2, 4]], dtype=np.double)
assert_equal(is_monotonic(Z), False)
def test_is_monotonic_tdist_linkage1(self):
# Tests is_monotonic(Z) on clustering generated by single linkage on
# tdist data set. Expecting True.
Z = linkage(hierarchy_test_data.ytdist, 'single')
assert_equal(is_monotonic(Z), True)
def test_is_monotonic_tdist_linkage2(self):
# Tests is_monotonic(Z) on clustering generated by single linkage on
# tdist data set. Perturbing. Expecting False.
Z = linkage(hierarchy_test_data.ytdist, 'single')
Z[2, 2] = 0.0
assert_equal(is_monotonic(Z), False)
def test_is_monotonic_Q_linkage(self):
# Tests is_monotonic(Z) on clustering generated by single linkage on
# Q data set. Expecting True.
X = hierarchy_test_data.Q_X
Z = linkage(X, 'single')
assert_equal(is_monotonic(Z), True)
class TestMaxDists(object):
def test_maxdists_empty_linkage(self):
# Tests maxdists(Z) on empty linkage. Expecting exception.
Z = np.zeros((0, 4), dtype=np.double)
assert_raises(ValueError, maxdists, Z)
def test_maxdists_one_cluster_linkage(self):
# Tests maxdists(Z) on linkage with one cluster.
Z = np.asarray([[0, 1, 0.3, 4]], dtype=np.double)
MD = maxdists(Z)
expectedMD = calculate_maximum_distances(Z)
assert_allclose(MD, expectedMD, atol=1e-15)
def test_maxdists_Q_linkage(self):
for method in ['single', 'complete', 'ward', 'centroid', 'median']:
yield self.check_maxdists_Q_linkage, method
def check_maxdists_Q_linkage(self, method):
# Tests maxdists(Z) on the Q data set
X = hierarchy_test_data.Q_X
Z = linkage(X, method)
MD = maxdists(Z)
expectedMD = calculate_maximum_distances(Z)
assert_allclose(MD, expectedMD, atol=1e-15)
class TestMaxInconsts(object):
def test_maxinconsts_empty_linkage(self):
# Tests maxinconsts(Z, R) on empty linkage. Expecting exception.
Z = np.zeros((0, 4), dtype=np.double)
R = np.zeros((0, 4), dtype=np.double)
assert_raises(ValueError, maxinconsts, Z, R)
def test_maxinconsts_difrow_linkage(self):
# Tests maxinconsts(Z, R) on linkage and inconsistency matrices with
# different numbers of clusters. Expecting exception.
Z = np.asarray([[0, 1, 0.3, 4]], dtype=np.double)
R = np.random.rand(2, 4)
assert_raises(ValueError, maxinconsts, Z, R)
def test_maxinconsts_one_cluster_linkage(self):
# Tests maxinconsts(Z, R) on linkage with one cluster.
Z = np.asarray([[0, 1, 0.3, 4]], dtype=np.double)
R = np.asarray([[0, 0, 0, 0.3]], dtype=np.double)
MD = maxinconsts(Z, R)
expectedMD = calculate_maximum_inconsistencies(Z, R)
assert_allclose(MD, expectedMD, atol=1e-15)
def test_maxinconsts_Q_linkage(self):
for method in ['single', 'complete', 'ward', 'centroid', 'median']:
yield self.check_maxinconsts_Q_linkage, method
def check_maxinconsts_Q_linkage(self, method):
# Tests maxinconsts(Z, R) on the Q data set
X = hierarchy_test_data.Q_X
Z = linkage(X, method)
R = inconsistent(Z)
MD = maxinconsts(Z, R)
expectedMD = calculate_maximum_inconsistencies(Z, R)
assert_allclose(MD, expectedMD, atol=1e-15)
class TestMaxRStat(object):
def test_maxRstat_invalid_index(self):
for i in [3.3, -1, 4]:
yield self.check_maxRstat_invalid_index, i
def check_maxRstat_invalid_index(self, i):
# Tests maxRstat(Z, R, i). Expecting exception.
Z = np.asarray([[0, 1, 0.3, 4]], dtype=np.double)
R = np.asarray([[0, 0, 0, 0.3]], dtype=np.double)
if isinstance(i, int):
assert_raises(ValueError, maxRstat, Z, R, i)
else:
assert_raises(TypeError, maxRstat, Z, R, i)
def test_maxRstat_empty_linkage(self):
for i in range(4):
yield self.check_maxRstat_empty_linkage, i
def check_maxRstat_empty_linkage(self, i):
# Tests maxRstat(Z, R, i) on empty linkage. Expecting exception.
Z = np.zeros((0, 4), dtype=np.double)
R = np.zeros((0, 4), dtype=np.double)
assert_raises(ValueError, maxRstat, Z, R, i)
def test_maxRstat_difrow_linkage(self):
for i in range(4):
yield self.check_maxRstat_difrow_linkage, i
def check_maxRstat_difrow_linkage(self, i):
# Tests maxRstat(Z, R, i) on linkage and inconsistency matrices with
# different numbers of clusters. Expecting exception.
Z = np.asarray([[0, 1, 0.3, 4]], dtype=np.double)
R = np.random.rand(2, 4)
assert_raises(ValueError, maxRstat, Z, R, i)
def test_maxRstat_one_cluster_linkage(self):
for i in range(4):
yield self.check_maxRstat_one_cluster_linkage, i
def check_maxRstat_one_cluster_linkage(self, i):
# Tests maxRstat(Z, R, i) on linkage with one cluster.
Z = np.asarray([[0, 1, 0.3, 4]], dtype=np.double)
R = np.asarray([[0, 0, 0, 0.3]], dtype=np.double)
        MD = maxRstat(Z, R, i)
        expectedMD = calculate_maximum_inconsistencies(Z, R, i)
assert_allclose(MD, expectedMD, atol=1e-15)
def test_maxRstat_Q_linkage(self):
for method in ['single', 'complete', 'ward', 'centroid', 'median']:
for i in range(4):
yield self.check_maxRstat_Q_linkage, method, i
def check_maxRstat_Q_linkage(self, method, i):
# Tests maxRstat(Z, R, i) on the Q data set
X = hierarchy_test_data.Q_X
Z = linkage(X, method)
R = inconsistent(Z)
        MD = maxRstat(Z, R, i)
        expectedMD = calculate_maximum_inconsistencies(Z, R, i)
assert_allclose(MD, expectedMD, atol=1e-15)
class TestDendrogram(object):
def test_dendrogram_single_linkage_tdist(self):
# Tests dendrogram calculation on single linkage of the tdist data set.
Z = linkage(hierarchy_test_data.ytdist, 'single')
R = dendrogram(Z, no_plot=True)
leaves = R["leaves"]
assert_equal(leaves, [2, 5, 1, 0, 3, 4])
def test_valid_orientation(self):
Z = linkage(hierarchy_test_data.ytdist, 'single')
assert_raises(ValueError, dendrogram, Z, orientation="foo")
@dec.skipif(not have_matplotlib)
def test_dendrogram_plot(self):
for orientation in ['top', 'bottom', 'left', 'right']:
yield self.check_dendrogram_plot, orientation
def check_dendrogram_plot(self, orientation):
# Tests dendrogram plotting.
Z = linkage(hierarchy_test_data.ytdist, 'single')
expected = {'color_list': ['g', 'b', 'b', 'b', 'b'],
'dcoord': [[0.0, 138.0, 138.0, 0.0],
[0.0, 219.0, 219.0, 0.0],
[0.0, 255.0, 255.0, 219.0],
[0.0, 268.0, 268.0, 255.0],
[138.0, 295.0, 295.0, 268.0]],
'icoord': [[5.0, 5.0, 15.0, 15.0],
[45.0, 45.0, 55.0, 55.0],
[35.0, 35.0, 50.0, 50.0],
[25.0, 25.0, 42.5, 42.5],
[10.0, 10.0, 33.75, 33.75]],
'ivl': ['2', '5', '1', '0', '3', '4'],
'leaves': [2, 5, 1, 0, 3, 4]}
fig = plt.figure()
ax = fig.add_subplot(221)
# test that dendrogram accepts ax keyword
R1 = dendrogram(Z, ax=ax, orientation=orientation)
assert_equal(R1, expected)
# test that dendrogram accepts and handle the leaf_font_size and
# leaf_rotation keywords
R1a = dendrogram(Z, ax=ax, orientation=orientation,
leaf_font_size=20, leaf_rotation=90)
testlabel = (
ax.get_xticklabels()[0]
if orientation in ['top', 'bottom']
else ax.get_yticklabels()[0]
)
assert_equal(testlabel.get_rotation(), 90)
assert_equal(testlabel.get_size(), 20)
R1a = dendrogram(Z, ax=ax, orientation=orientation,
leaf_rotation=90)
testlabel = (
ax.get_xticklabels()[0]
if orientation in ['top', 'bottom']
else ax.get_yticklabels()[0]
)
assert_equal(testlabel.get_rotation(), 90)
R1a = dendrogram(Z, ax=ax, orientation=orientation,
leaf_font_size=20)
testlabel = (
ax.get_xticklabels()[0]
if orientation in ['top', 'bottom']
else ax.get_yticklabels()[0]
)
assert_equal(testlabel.get_size(), 20)
plt.close()
# test plotting to gca (will import pylab)
R2 = dendrogram(Z, orientation=orientation)
plt.close()
assert_equal(R2, expected)
@dec.skipif(not have_matplotlib)
def test_dendrogram_truncate_mode(self):
Z = linkage(hierarchy_test_data.ytdist, 'single')
R = dendrogram(Z, 2, 'lastp', show_contracted=True)
plt.close()
assert_equal(R, {'color_list': ['b'],
'dcoord': [[0.0, 295.0, 295.0, 0.0]],
'icoord': [[5.0, 5.0, 15.0, 15.0]],
'ivl': ['(2)', '(4)'],
'leaves': [6, 9]})
R = dendrogram(Z, 2, 'mtica', show_contracted=True)
plt.close()
assert_equal(R, {'color_list': ['g', 'b', 'b', 'b'],
'dcoord': [[0.0, 138.0, 138.0, 0.0],
[0.0, 255.0, 255.0, 0.0],
[0.0, 268.0, 268.0, 255.0],
[138.0, 295.0, 295.0, 268.0]],
'icoord': [[5.0, 5.0, 15.0, 15.0],
[35.0, 35.0, 45.0, 45.0],
[25.0, 25.0, 40.0, 40.0],
[10.0, 10.0, 32.5, 32.5]],
'ivl': ['2', '5', '1', '0', '(2)'],
'leaves': [2, 5, 1, 0, 7]})
def test_dendrogram_colors(self):
# Tests dendrogram plots with alternate colors
Z = linkage(hierarchy_test_data.ytdist, 'single')
set_link_color_palette(['c', 'm', 'y', 'k'])
R = dendrogram(Z, no_plot=True,
above_threshold_color='g', color_threshold=250)
set_link_color_palette(['g', 'r', 'c', 'm', 'y', 'k'])
color_list = R['color_list']
assert_equal(color_list, ['c', 'm', 'g', 'g', 'g'])
# reset color palette (global list)
set_link_color_palette(None)
def calculate_maximum_distances(Z):
# Used for testing correctness of maxdists.
n = Z.shape[0] + 1
B = np.zeros((n - 1,))
q = np.zeros((3,))
for i in xrange(0, n - 1):
q[:] = 0.0
left = Z[i, 0]
right = Z[i, 1]
if left >= n:
q[0] = B[int(left) - n]
if right >= n:
q[1] = B[int(right) - n]
q[2] = Z[i, 2]
B[i] = q.max()
return B
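# Illustrative cross-check (added; not part of the original suite): maxdists
# and the reference implementation above should agree entry-wise on any valid
# linkage, e.g. on the 2x4 linkage literal used elsewhere in these tests:
#
#     Z = np.asarray([[0, 1, 3.0, 2],
#                     [3, 2, 4.0, 3]], dtype=np.double)
#     assert_allclose(maxdists(Z), calculate_maximum_distances(Z), atol=1e-15)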
def calculate_maximum_inconsistencies(Z, R, k=3):
# Used for testing correctness of maxinconsts.
n = Z.shape[0] + 1
B = np.zeros((n - 1,))
q = np.zeros((3,))
for i in xrange(0, n - 1):
q[:] = 0.0
left = Z[i, 0]
right = Z[i, 1]
if left >= n:
q[0] = B[int(left) - n]
if right >= n:
q[1] = B[int(right) - n]
q[2] = R[i, k]
B[i] = q.max()
return B
def within_tol(a, b, tol):
return np.abs(a - b).max() < tol
def test_unsupported_uncondensed_distance_matrix_linkage_warning():
assert_warns(ClusterWarning, linkage, [[0, 1], [1, 0]])
def test_euclidean_linkage_value_error():
for method in scipy.cluster.hierarchy._EUCLIDEAN_METHODS:
assert_raises(ValueError, linkage, [[1, 1], [1, 1]],
method=method, metric='cityblock')
def test_2x2_linkage():
Z1 = linkage([1], method='single', metric='euclidean')
Z2 = linkage([[0, 1], [0, 0]], method='single', metric='euclidean')
assert_allclose(Z1, Z2)
def test_node_compare():
np.random.seed(23)
nobs = 50
X = np.random.randn(nobs, 4)
Z = scipy.cluster.hierarchy.ward(X)
tree = to_tree(Z)
assert_(tree > tree.get_left())
assert_(tree.get_right() > tree.get_left())
assert_(tree.get_right() == tree.get_right())
assert_(tree.get_right() != tree.get_left())
def test_cut_tree():
np.random.seed(23)
nobs = 50
X = np.random.randn(nobs, 4)
Z = scipy.cluster.hierarchy.ward(X)
cutree = cut_tree(Z)
assert_equal(cutree[:, 0], np.arange(nobs))
assert_equal(cutree[:, -1], np.zeros(nobs))
assert_equal(cutree.max(0), np.arange(nobs - 1, -1, -1))
assert_equal(cutree[:, [-5]], cut_tree(Z, n_clusters=5))
assert_equal(cutree[:, [-5, -10]], cut_tree(Z, n_clusters=[5, 10]))
assert_equal(cutree[:, [-10, -5]], cut_tree(Z, n_clusters=[10, 5]))
nodes = _order_cluster_tree(Z)
heights = np.array([node.dist for node in nodes])
assert_equal(cutree[:, np.searchsorted(heights, [5])],
cut_tree(Z, height=5))
assert_equal(cutree[:, np.searchsorted(heights, [5, 10])],
cut_tree(Z, height=[5, 10]))
assert_equal(cutree[:, np.searchsorted(heights, [10, 5])],
cut_tree(Z, height=[10, 5]))
def test_Heap():
values = np.array([2, -1, 0, -1.5, 3])
heap = Heap(values)
pair = heap.get_min()
assert_equal(pair['key'], 3)
assert_equal(pair['value'], -1.5)
heap.remove_min()
pair = heap.get_min()
assert_equal(pair['key'], 1)
assert_equal(pair['value'], -1)
heap.change_value(1, 2.5)
pair = heap.get_min()
assert_equal(pair['key'], 2)
assert_equal(pair['value'], 0)
heap.remove_min()
heap.remove_min()
heap.change_value(1, 10)
pair = heap.get_min()
assert_equal(pair['key'], 4)
assert_equal(pair['value'], 3)
heap.remove_min()
pair = heap.get_min()
assert_equal(pair['key'], 1)
assert_equal(pair['value'], 10)
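# Reading of the Heap API exercised above (inferred from the assertions, not
# from separate documentation): Heap(values) builds a min-heap over `values`,
# get_min() returns {'key': <index into values>, 'value': <current minimum>},
# remove_min() pops that entry, and change_value(key, new_value) re-prices a
# single entry in place.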
if __name__ == "__main__":
run_module_suite()
| [
"[email protected]"
] | |
d26db268212730243e31afef9dac80f44edda814 | d7ccb4225f623139995a7039f0981e89bf6365a4 | /.history/store/views_20211010181824.py | 5404e2083299cd5fa19e695dd5bbf5bc15364268 | [] | no_license | tonnymuchui/django-mall | 64fd4abc3725c1bd0a3dcf20b93b490fe9307b37 | 55c083d8433be3c77adc61939cd197902de4ce76 | refs/heads/master | 2023-08-23T04:59:20.418732 | 2021-10-13T15:59:37 | 2021-10-13T15:59:37 | 415,668,388 | 1 | 0 | null | null | null | null | UTF-8 | Python | false | false | 495 | py | from django.shortcuts import get_object_or_404, render
from store.models import Product
from category.models import Category
# Create your views here.
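# A minimal sketch (assumed, not part of this file) of how the store view
# below is typically wired up in store/urls.py; the URL patterns are
# illustrative guesses, only the view itself comes from this module:
#
#   from django.urls import path
#   from . import views
#
#   urlpatterns = [
#       path('', views.store, name='store'),
#       path('category/<slug:category_slug>/', views.store, name='products_by_category'),
#   ]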
def store(request, category_slug=None):
    if category_slug is not None:
        # Narrow the catalogue to one category (404 on a bad slug); assumes
        # Product has a `category` foreign key, as the slug routing implies.
        category = get_object_or_404(Category, slug=category_slug)
        products = Product.objects.filter(category=category, is_available=True)
    else:
        products = Product.objects.all().filter(is_available=True)
    products_count = products.count()
    content = {
        'products': products,
        'products_count': products_count,
    }
    return render(request, 'store/store.html', content) | [
"[email protected]"
] | |
a7e1a5f16a4ea6519e2f6f5df35e23b32f5345ba | 9a486a87e028303a551fbd0d1e1b6b650387ea14 | /parse_tlog/guide_flow.py | 2cebc5bd734a8a21dc044b2fbc02dd903521f052 | [] | no_license | shanlihou/pythonFunc | 7b8e7064fddd4522e492c915c086cc6c5abc6eec | 646920256551ccd8335446dd4fe11aa4b9916f64 | refs/heads/master | 2022-08-24T20:33:12.287464 | 2022-07-21T12:00:10 | 2022-07-21T12:00:10 | 24,311,639 | 3 | 0 | null | null | null | null | UTF-8 | Python | false | false | 1,067 | py | # coding:utf-8
import utils
import const
import LogOne
import csv_output
def guide_flow():
fname = utils.filter_from_origin('GuideFlow')
id_dic = {}
avatar_count = utils.get_avatar_count()
with utils.utf8_open(fname) as fr:
for line in fr:
lo = LogOne.get_log_from_line(line)
if not lo:
continue
id_dic.setdefault(lo.guide_id, set())
id_dic[lo.guide_id].add(lo.gbid)
rets = [(int(k), len(v)) for k, v in id_dic.items()]
rets.sort(key=lambda x: x[0])
csv = csv_output.CSVOutPut()
    csv.set(0, 0, '节点')          # header: guide node id
    csv.set(0, 1, '创角数')        # header: number of characters created
    csv.set(0, 2, '节点通过人数')  # header: players who passed this node
    csv.set(0, 3, '节点通过率')    # header: node pass rate
idx = 1
for key, num in rets:
csv.set(idx, 0, key)
csv.set(idx, 1, avatar_count)
csv.set(idx, 2, num)
csv.set(idx, 3, num / avatar_count)
idx += 1
out_name = utils.get_out_name('out', 'guide_flow.csv')
csv.output(out_name)
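# Illustrative output layout (assumed comma-separated; the actual formatting
# is up to csv_output.CSVOutPut). Headers are the Chinese labels set above --
# guide node id, characters created, players passing the node, pass rate:
#
#   节点,创角数,节点通过人数,节点通过率
#   1,1000,980,0.98
#   2,1000,953,0.953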
if __name__ == '__main__':
guide_flow() | [
"[email protected]"
] | |
4d2f1b947c37bea509fec0603fec028f2816c5f7 | 2bdedcda705f6dcf45a1e9a090377f892bcb58bb | /src/main/output/level_head/aws_company_right_program.py | a269a3cf60adfad0ae305533decf4c19f884f035 | [] | no_license | matkosoric/GenericNameTesting | 860a22af1098dda9ea9e24a1fc681bb728aa2d69 | 03f4a38229c28bc6d83258e5a84fce4b189d5f00 | refs/heads/master | 2021-01-08T22:35:20.022350 | 2020-02-21T11:28:21 | 2020-02-21T11:28:21 | 242,123,053 | 1 | 0 | null | null | null | null | UTF-8 | Python | false | false | 570 | py | import srt_translate as st #must use python 3
#import os
#subdirec = 'home/ben/test'
#os.chdir(test)
# Replace the subscriptionKey string value with your valid subscription key.
subscriptionKey = 'a3487bf0249992cc26cd5aaf14d5f0b0'
# Language codes and names:
# English: en
# Greek: el
# More: http://www.emreakkas.com/internationalization/microsoft-translator-api-languages-list-language-codes-and-names
inputfile='test.srt'
outputfile='test_result.srt'
fromlang = 'en'
tolang = 'el'
st.convert_srt_file(inputfile, outputfile, subscriptionKey, tolang)#, fromlang)
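# Behaviour inferred from the call above (see srt_translate.py for the actual
# implementation): convert_srt_file() reads `inputfile`, translates each
# subtitle text block with the Microsoft Translator API using the subscription
# key, targets Greek ('el'), and writes the result to `outputfile`. The
# commented-out `fromlang` argument suggests the source language can be forced
# instead of auto-detected.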
| [
"[email protected]"
] | |
06e1f9b50cd4203f4526aa2af52be37ed254a692 | f82757475ea13965581c2147ff57123b361c5d62 | /gi-stubs/repository/GnomeDesktop/RRScreen.py | 4092b6b1843f0804d1cbd81498ddbc29582ae5d0 | [] | no_license | ttys3/pygobject-stubs | 9b15d1b473db06f47e5ffba5ad0a31d6d1becb57 | d0e6e93399212aada4386d2ce80344eb9a31db48 | refs/heads/master | 2022-09-23T12:58:44.526554 | 2020-06-06T04:15:00 | 2020-06-06T04:15:00 | 269,693,287 | 8 | 2 | null | 2020-06-05T15:57:54 | 2020-06-05T15:57:54 | null | UTF-8 | Python | false | false | 19,400 | py | # encoding: utf-8
# module gi.repository.GnomeDesktop
# from /usr/lib64/girepository-1.0/GnomeDesktop-3.0.typelib
# by generator 1.147
"""
An object which wraps an introspection typelib.
This wrapping creates a python module like representation of the typelib
using gi repository as a foundation. Accessing attributes of the module
will dynamically pull them in and create wrappers for the members.
These members are then cached on this introspection module.
"""
# imports
import gi as __gi
import gi.overrides.GObject as __gi_overrides_GObject
import gi.repository.Gio as __gi_repository_Gio
import gobject as __gobject
class RRScreen(__gi_overrides_GObject.Object, __gi_repository_Gio.AsyncInitable, __gi_repository_Gio.Initable):
"""
:Constructors:
::
RRScreen(**properties)
new(screen:Gdk.Screen) -> GnomeDesktop.RRScreen
new_finish(result:Gio.AsyncResult) -> GnomeDesktop.RRScreen
"""
def bind_property(self, *args, **kwargs): # real signature unknown
pass
def bind_property_full(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def chain(self, *args, **kwargs): # real signature unknown
pass
def compat_control(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def connect(self, *args, **kwargs): # real signature unknown
pass
def connect_after(self, *args, **kwargs): # real signature unknown
pass
def connect_data(self, detailed_signal, handler, *data, **kwargs): # reliably restored by inspect
"""
Connect a callback to the given signal with optional user data.
:param str detailed_signal:
A detailed signal to connect to.
:param callable handler:
Callback handler to connect to the signal.
:param *data:
Variable data which is passed through to the signal handler.
:param GObject.ConnectFlags connect_flags:
Flags used for connection options.
:returns:
A signal id which can be used with disconnect.
"""
pass
def connect_object(self, *args, **kwargs): # real signature unknown
pass
def connect_object_after(self, *args, **kwargs): # real signature unknown
pass
def disconnect(*args, **kwargs): # reliably restored by inspect
""" signal_handler_disconnect(instance:GObject.Object, handler_id:int) """
pass
def disconnect_by_func(self, *args, **kwargs): # real signature unknown
pass
def do_changed(self, *args, **kwargs): # real signature unknown
""" changed(self) """
pass
def do_output_connected(self, *args, **kwargs): # real signature unknown
""" output_connected(self, output:GnomeDesktop.RROutput) """
pass
def do_output_disconnected(self, *args, **kwargs): # real signature unknown
""" output_disconnected(self, output:GnomeDesktop.RROutput) """
pass
def emit(self, *args, **kwargs): # real signature unknown
pass
def emit_stop_by_name(self, detailed_signal): # reliably restored by inspect
""" Deprecated, please use stop_emission_by_name. """
pass
def find_property(self, property_name): # real signature unknown; restored from __doc__
""" find_property(self, property_name:str) -> GObject.ParamSpec """
pass
def force_floating(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def freeze_notify(self): # reliably restored by inspect
"""
Freezes the object's property-changed notification queue.
:returns:
A context manager which optionally can be used to
automatically thaw notifications.
This will freeze the object so that "notify" signals are blocked until
the thaw_notify() method is called.
.. code-block:: python
with obj.freeze_notify():
pass
"""
pass
def getv(self, names, values): # real signature unknown; restored from __doc__
""" getv(self, names:list, values:list) """
pass
def get_crtc_by_id(self, id): # real signature unknown; restored from __doc__
""" get_crtc_by_id(self, id:int) -> GnomeDesktop.RRCrtc """
pass
def get_data(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def get_dpms_mode(self): # real signature unknown; restored from __doc__
""" get_dpms_mode(self) -> bool, mode:GnomeDesktop.RRDpmsMode """
return False
def get_output_by_id(self, id): # real signature unknown; restored from __doc__
""" get_output_by_id(self, id:int) -> GnomeDesktop.RROutput """
pass
def get_output_by_name(self, name): # real signature unknown; restored from __doc__
""" get_output_by_name(self, name:str) -> GnomeDesktop.RROutput """
pass
def get_properties(self, *args, **kwargs): # real signature unknown
pass
def get_property(self, *args, **kwargs): # real signature unknown
pass
def get_qdata(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def get_ranges(self): # real signature unknown; restored from __doc__
""" get_ranges(self) -> min_width:int, max_width:int, min_height:int, max_height:int """
pass
def handler_block(obj, handler_id): # reliably restored by inspect
"""
Blocks the signal handler from being invoked until
handler_unblock() is called.
:param GObject.Object obj:
Object instance to block handlers for.
:param int handler_id:
Id of signal to block.
:returns:
A context manager which optionally can be used to
automatically unblock the handler:
.. code-block:: python
with GObject.signal_handler_block(obj, id):
pass
"""
pass
def handler_block_by_func(self, *args, **kwargs): # real signature unknown
pass
def handler_disconnect(*args, **kwargs): # reliably restored by inspect
""" signal_handler_disconnect(instance:GObject.Object, handler_id:int) """
pass
def handler_is_connected(*args, **kwargs): # reliably restored by inspect
""" signal_handler_is_connected(instance:GObject.Object, handler_id:int) -> bool """
pass
def handler_unblock(*args, **kwargs): # reliably restored by inspect
""" signal_handler_unblock(instance:GObject.Object, handler_id:int) """
pass
def handler_unblock_by_func(self, *args, **kwargs): # real signature unknown
pass
def init(self, cancellable=None): # real signature unknown; restored from __doc__
""" init(self, cancellable:Gio.Cancellable=None) -> bool """
return False
def init_async(self, io_priority, cancellable=None, callback=None, user_data=None): # real signature unknown; restored from __doc__
""" init_async(self, io_priority:int, cancellable:Gio.Cancellable=None, callback:Gio.AsyncReadyCallback=None, user_data=None) """
pass
def init_finish(self, res): # real signature unknown; restored from __doc__
""" init_finish(self, res:Gio.AsyncResult) -> bool """
return False
def install_properties(self, pspecs): # real signature unknown; restored from __doc__
""" install_properties(self, pspecs:list) """
pass
def install_property(self, property_id, pspec): # real signature unknown; restored from __doc__
""" install_property(self, property_id:int, pspec:GObject.ParamSpec) """
pass
def interface_find_property(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def interface_install_property(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def interface_list_properties(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def is_floating(self): # real signature unknown; restored from __doc__
""" is_floating(self) -> bool """
return False
def list_clone_modes(self): # real signature unknown; restored from __doc__
""" list_clone_modes(self) -> list """
return []
def list_crtcs(self): # real signature unknown; restored from __doc__
""" list_crtcs(self) -> list """
return []
def list_modes(self): # real signature unknown; restored from __doc__
""" list_modes(self) -> list """
return []
def list_outputs(self): # real signature unknown; restored from __doc__
""" list_outputs(self) -> list """
return []
def list_properties(self): # real signature unknown; restored from __doc__
""" list_properties(self) -> list, n_properties:int """
return []
def new(self, screen): # real signature unknown; restored from __doc__
""" new(screen:Gdk.Screen) -> GnomeDesktop.RRScreen """
pass
def newv(self, object_type, parameters): # real signature unknown; restored from __doc__
""" newv(object_type:GType, parameters:list) -> GObject.Object """
pass
def newv_async(self, object_type, n_parameters, parameters, io_priority, cancellable=None, callback=None, user_data=None): # real signature unknown; restored from __doc__
""" newv_async(object_type:GType, n_parameters:int, parameters:GObject.Parameter, io_priority:int, cancellable:Gio.Cancellable=None, callback:Gio.AsyncReadyCallback=None, user_data=None) """
pass
def new_async(self, screen, callback=None, user_data=None): # real signature unknown; restored from __doc__
""" new_async(screen:Gdk.Screen, callback:Gio.AsyncReadyCallback=None, user_data=None) """
pass
def new_finish(self, result): # real signature unknown; restored from __doc__
""" new_finish(result:Gio.AsyncResult) -> GnomeDesktop.RRScreen """
pass
def notify(self, property_name): # real signature unknown; restored from __doc__
""" notify(self, property_name:str) """
pass
def notify_by_pspec(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def override_property(self, property_id, name): # real signature unknown; restored from __doc__
""" override_property(self, property_id:int, name:str) """
pass
def ref(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def refresh(self): # real signature unknown; restored from __doc__
""" refresh(self) -> bool """
return False
def ref_sink(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def replace_data(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def replace_qdata(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def run_dispose(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def set_data(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def set_dpms_mode(self, mode): # real signature unknown; restored from __doc__
""" set_dpms_mode(self, mode:GnomeDesktop.RRDpmsMode) -> bool """
return False
def set_properties(self, *args, **kwargs): # real signature unknown
pass
def set_property(self, *args, **kwargs): # real signature unknown
pass
def steal_data(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def steal_qdata(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def stop_emission(self, detailed_signal): # reliably restored by inspect
""" Deprecated, please use stop_emission_by_name. """
pass
def stop_emission_by_name(*args, **kwargs): # reliably restored by inspect
""" signal_stop_emission_by_name(instance:GObject.Object, detailed_signal:str) """
pass
def thaw_notify(self): # real signature unknown; restored from __doc__
""" thaw_notify(self) """
pass
def unref(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def watch_closure(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def weak_ref(self, *args, **kwargs): # real signature unknown
pass
def _force_floating(self, *args, **kwargs): # real signature unknown
""" force_floating(self) """
pass
def _ref(self, *args, **kwargs): # real signature unknown
""" ref(self) -> GObject.Object """
pass
def _ref_sink(self, *args, **kwargs): # real signature unknown
""" ref_sink(self) -> GObject.Object """
pass
def _unref(self, *args, **kwargs): # real signature unknown
""" unref(self) """
pass
def _unsupported_data_method(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def _unsupported_method(self, *args, **kargs): # reliably restored by inspect
# no doc
pass
def __copy__(self, *args, **kwargs): # real signature unknown
pass
def __deepcopy__(self, *args, **kwargs): # real signature unknown
pass
def __delattr__(self, *args, **kwargs): # real signature unknown
""" Implement delattr(self, name). """
pass
def __dir__(self, *args, **kwargs): # real signature unknown
""" Default dir() implementation. """
pass
def __eq__(self, *args, **kwargs): # real signature unknown
""" Return self==value. """
pass
def __format__(self, *args, **kwargs): # real signature unknown
""" Default object formatter. """
pass
def __getattribute__(self, *args, **kwargs): # real signature unknown
""" Return getattr(self, name). """
pass
def __ge__(self, *args, **kwargs): # real signature unknown
""" Return self>=value. """
pass
def __gt__(self, *args, **kwargs): # real signature unknown
""" Return self>value. """
pass
def __hash__(self, *args, **kwargs): # real signature unknown
""" Return hash(self). """
pass
def __init_subclass__(self, *args, **kwargs): # real signature unknown
"""
This method is called when a class is subclassed.
The default implementation does nothing. It may be
overridden to extend subclasses.
"""
pass
def __init__(self, **properties): # real signature unknown; restored from __doc__
pass
def __le__(self, *args, **kwargs): # real signature unknown
""" Return self<=value. """
pass
def __lt__(self, *args, **kwargs): # real signature unknown
""" Return self<value. """
pass
@staticmethod # known case of __new__
def __new__(*args, **kwargs): # real signature unknown
""" Create and return a new object. See help(type) for accurate signature. """
pass
def __ne__(self, *args, **kwargs): # real signature unknown
""" Return self!=value. """
pass
def __reduce_ex__(self, *args, **kwargs): # real signature unknown
""" Helper for pickle. """
pass
def __reduce__(self, *args, **kwargs): # real signature unknown
""" Helper for pickle. """
pass
def __repr__(self, *args, **kwargs): # real signature unknown
""" Return repr(self). """
pass
def __setattr__(self, *args, **kwargs): # real signature unknown
""" Implement setattr(self, name, value). """
pass
def __sizeof__(self, *args, **kwargs): # real signature unknown
""" Size of object in memory, in bytes. """
pass
def __str__(self, *args, **kwargs): # real signature unknown
""" Return str(self). """
pass
def __subclasshook__(self, *args, **kwargs): # real signature unknown
"""
Abstract classes can override this to customize issubclass().
This is invoked early on by abc.ABCMeta.__subclasscheck__().
It should return True, False or NotImplemented. If it returns
NotImplemented, the normal algorithm is used. Otherwise, it
overrides the normal algorithm (and the outcome is cached).
"""
pass
def __weakref__(self, *args, **kwargs): # real signature unknown
pass
g_type_instance = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
parent = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
priv = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
qdata = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
ref_count = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
__gpointer__ = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
__grefcount__ = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
props = None # (!) real value is '<gi._gi.GProps object at 0x7fc62d5a7700>'
__class__ = None # (!) real value is "<class 'gi.types.GObjectMeta'>"
__dict__ = None # (!) real value is "mappingproxy({'__info__': ObjectInfo(RRScreen), '__module__': 'gi.repository.GnomeDesktop', '__gtype__': <GType GnomeRRScreen (93939703625120)>, '__doc__': None, '__gsignals__': {}, 'new': gi.FunctionInfo(new), 'new_finish': gi.FunctionInfo(new_finish), 'new_async': gi.FunctionInfo(new_async), 'get_crtc_by_id': gi.FunctionInfo(get_crtc_by_id), 'get_dpms_mode': gi.FunctionInfo(get_dpms_mode), 'get_output_by_id': gi.FunctionInfo(get_output_by_id), 'get_output_by_name': gi.FunctionInfo(get_output_by_name), 'get_ranges': gi.FunctionInfo(get_ranges), 'list_clone_modes': gi.FunctionInfo(list_clone_modes), 'list_crtcs': gi.FunctionInfo(list_crtcs), 'list_modes': gi.FunctionInfo(list_modes), 'list_outputs': gi.FunctionInfo(list_outputs), 'refresh': gi.FunctionInfo(refresh), 'set_dpms_mode': gi.FunctionInfo(set_dpms_mode), 'do_changed': gi.VFuncInfo(changed), 'do_output_connected': gi.VFuncInfo(output_connected), 'do_output_disconnected': gi.VFuncInfo(output_disconnected), 'parent': <property object at 0x7fc62d5a54f0>, 'priv': <property object at 0x7fc62d5a5680>})"
__gdoc__ = 'Object GnomeRRScreen\n\nSignals from GnomeRRScreen:\n changed ()\n output-connected (gpointer)\n output-disconnected (gpointer)\n\nProperties from GnomeRRScreen:\n gdk-screen -> GdkScreen: GDK Screen\n The GDK Screen represented by this GnomeRRScreen\n dpms-mode -> GnomeRRDpmsModeType: DPMS Mode\n The DPMS mode for this GnomeRRScreen\n\nSignals from GObject:\n notify (GParam)\n\n'
__gsignals__ = {}
__gtype__ = None # (!) real value is '<GType GnomeRRScreen (93939703625120)>'
__info__ = ObjectInfo(RRScreen)
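# Hedged usage sketch for the real (non-stub) class; the names are taken from
# the stubs above, and the Gdk calls are assumptions about the companion
# library rather than part of this file:
#
#   import gi
#   gi.require_version('GnomeDesktop', '3.0')
#   from gi.repository import Gdk, GnomeDesktop
#
#   screen = GnomeDesktop.RRScreen.new(Gdk.Screen.get_default())
#   for output in screen.list_outputs():
#       print(output)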
| [
"[email protected]"
] | |
4dd01610b4147862288e32fff7184a7315c42ec7 | 5d0edf31b17c5375faf6126c1a7be8e79bfe2ab8 | /buildout-cache/eggs/Products.Archetypes-1.9.11-py2.7.egg/Products/Archetypes/tests/test_utils.py | 7c7df642c1f4a38a6ef5e21daf0932724c758cb4 | [] | no_license | renansfs/Plone_SP | 27cba32ebd9fc03dae3941ec23cf1bf0a7b6667a | 8a7bdbdb98c3f9fc1073c6061cd2d3a0ec80caf5 | refs/heads/master | 2021-01-15T15:32:43.138965 | 2016-08-24T15:30:19 | 2016-08-24T15:30:19 | 65,313,812 | 0 | 3 | null | null | null | null | UTF-8 | Python | false | false | 12,246 | py | # -*- coding: utf-8 -*-
################################################################################
#
# Copyright (c) 2002-2005, Benjamin Saller <[email protected]>, and
# the respective authors. All rights reserved.
# For a list of Archetypes contributors see docs/CREDITS.txt.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# * Redistributions of source code must retain the above copyright notice, this
# list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions and the following disclaimer in the documentation
# and/or other materials provided with the distribution.
# * Neither the name of the author nor the names of its contributors may be used
# to endorse or promote products derived from this software without specific
# prior written permission.
#
# THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED
# WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS
# FOR A PARTICULAR PURPOSE.
#
################################################################################
"""
"""
from Products.Archetypes.tests.attestcase import ATTestCase
from Products.Archetypes.utils import DisplayList
from Products.Archetypes.utils import IntDisplayList
from Products.Archetypes.utils import Vocabulary
from Products.Archetypes.utils import make_uuid
class UidGeneratorTest(ATTestCase):
"""Some ppl have reported problems with uids. This test isn't mathematical
correct but should show the issue on plattform. I suspect it's Windows :|
"""
def test_uuid(self):
uids = {}
loop_length = 10 ** 5 # about 1.5 seconds on a fast cpu
for i in xrange(loop_length):
uid = make_uuid()
uids[uid] = 1
self.assertEqual(len(uids), loop_length)
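    # Note (added): with only 10**5 draws, honest collisions from a healthy
    # uuid generator are astronomically unlikely, so any duplicate collapsing
    # the dict above points at a platform-specific generator problem (the
    # suspected Windows issue) rather than bad luck.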
class DisplayListTest(ATTestCase):
def test_cmp(self):
ta = ('a', 'b', 'c')
tb = ('a', 'c', 'b')
td = ('c', 'b', 'a')
self.assertTrue(DisplayList(zip(ta, ta)) == DisplayList(zip(ta, ta)))
self.assertFalse(DisplayList(zip(ta, ta)) == DisplayList(zip(ta, tb)))
self.assertTrue(DisplayList(zip(ta, ta)) == DisplayList(zip(td, td)))
self.assertTrue(DisplayList(zip(tb, ta)) == DisplayList(zip(tb, ta)))
self.assertRaises(TypeError, cmp, DisplayList(), '')
def test_slice(self):
ta = ('a', 'b', 'c')
l = zip(ta, ta)
sub = l[1:]
self.assertTrue(DisplayList(l)[1:] == sub)
def test_item(self):
ta = ('a', 'b', 'c')
l = zip(ta, ta)
for i in range(0, 2):
item = ta[i]
self.assertTrue(DisplayList(l)[i] == item)
def test_add(self):
ta = ('a', 'b', 'c')
l = zip(ta, ta)
dl = DisplayList(l)[:]
self.assertTrue(dl == l)
l.append(('d', 'd'))
dl.append(('d', 'd'))
self.assertTrue(dl == l)
def test_len(self):
ta = ('a', 'b', 'c')
l = zip(ta, ta)
dl = DisplayList(l)
self.assertTrue(len(dl) == len(l))
def test_keys(self):
ta = ('a', 'b', 'c')
l = zip(ta, ta)
dl = DisplayList(l)
self.assertTrue(tuple(dl.keys()) == ta)
def test_values(self):
ta = ('a', 'b', 'c')
l = zip(ta, ta)
dl = DisplayList(l)
self.assertTrue(tuple(dl.values()) == ta)
def test_items(self):
ta = ('a', 'b', 'c')
l = zip(ta, ta)
dl = DisplayList(l)
self.assertTrue(dl.items() == tuple(l))
def test_repr(self):
ta = ('a', 'b', 'c')
l = zip(ta, ta)
dl = DisplayList(l)
self.assertTrue(repr(dl).find(str(l)))
def test_str(self):
ta = ('a', 'b', 'c')
l = zip(ta, ta)
dl = DisplayList(l)
self.assertTrue(str(dl) == str(l))
def test_call(self):
ta = ('a', 'b', 'c')
l = zip(ta, ta)
dl = DisplayList(l)
self.assertTrue(dl == dl)
self.assertTrue(dl() == dl())
self.assertTrue(dl[:] == l)
self.assertTrue(dl()[:] == l)
def test_sort(self):
a = (('a', 'a',), ('b', 'b'), ('c', 'c'))
b = (('z', 'Z',), ('y', 'Y'), ('x', 'X'))
c = (('a', 'Z',), ('c', 'Y'), ('b', 'X'))
dla = DisplayList(a)
dlb = DisplayList(b)
dlc = DisplayList(c)
assert dla.values() == ['a', 'b', 'c']
dlb_s = dlb.sortedByValue()
assert dlb_s.values() == ['X', 'Y', 'Z']
dlc_s = dlc.sortedByKey()
assert dlc_s.values() == ['Z', 'X', 'Y']
def test_getValue(self):
        a = (('a', 'A',), ('b', '\xc3\xab'), ('c', u'\xeb'), ('d', 42))
        dla = DisplayList(a)
        self.assertEqual(dla.getValue('a'), 'A')
        self.assertEqual(dla.getValue('b'), '\xc3\xab')
        self.assertEqual(dla.getValue('c'), u'\xeb')
        self.assertEqual(dla.getValue('d'), 42)
        self.assertEqual(dla.getValue('e'), None)
        self.assertEqual(dla.getValue('e', 'default'), 'default')
        # Switch the keys and values around.
        b = (('A', 'a',), ('\xc3\xab', 'b'), (u'\xeb', 'c'))
        dlb = DisplayList(b)
        self.assertEqual(dlb.getValue('A'), 'a')
        self.assertEqual(dlb.getValue('\xc3\xab'), 'b')
        self.assertEqual(dlb.getValue(u'\xeb'), 'c')
self.assertEqual(dlb.getValue('e'), None)
self.assertEqual(dlb.getValue('e', 'default'), 'default')
class IntDisplayListTest(ATTestCase):
def test_cmp(self):
ta = (1, 2, 3)
tb = (1, 3, 2)
td = (3, 2, 1)
self.assertTrue(IntDisplayList(zip(ta, ta)) == IntDisplayList(zip(ta, ta)))
self.assertFalse(IntDisplayList(zip(ta, ta)) == IntDisplayList(zip(ta, tb)))
self.assertTrue(IntDisplayList(zip(ta, ta)) == IntDisplayList(zip(td, td)))
self.assertTrue(IntDisplayList(zip(tb, ta)) == IntDisplayList(zip(tb, ta)))
self.assertRaises(TypeError, cmp, IntDisplayList(), '')
def test_slice(self):
ta = (1, 2, 3)
l = zip(ta, ta)
sub = l[1:]
self.assertTrue(IntDisplayList(l)[1:] == sub)
def test_item(self):
ta = (1, 2, 3)
l = zip(ta, ta)
for i in range(0, 2):
item = ta[i]
self.assertTrue(IntDisplayList(l)[i] == item)
def test_add(self):
ta = (1, 2, 3)
l = zip(ta, ta)
dl = IntDisplayList(l)[:]
self.assertTrue(dl == l)
l.append((4, 4))
dl.append((4, 4))
self.assertTrue(dl == l)
def test_len(self):
ta = (1, 2, 3)
l = zip(ta, ta)
dl = IntDisplayList(l)
self.assertTrue(len(dl) == len(l))
def test_keys(self):
ta = (1, 2, 3)
l = zip(ta, ta)
dl = IntDisplayList(l)
self.assertTrue(tuple(dl.keys()) == ta)
def test_values(self):
ta = (1, 2, 3)
l = zip(ta, ta)
dl = IntDisplayList(l)
self.assertTrue(tuple(dl.values()) == ta)
def test_items(self):
ta = (1, 2, 3)
l = zip(ta, ta)
dl = IntDisplayList(l)
self.assertTrue(dl.items() == tuple(l))
def test_repr(self):
ta = (1, 2, 3)
l = zip(ta, ta)
dl = IntDisplayList(l)
self.assertTrue(repr(dl).find(str(l)))
def test_str(self):
ta = (1, 2, 3)
l = zip(ta, ta)
dl = IntDisplayList(l)
self.assertTrue(str(dl) == str(l))
def test_call(self):
ta = (1, 2, 3)
l = zip(ta, ta)
dl = IntDisplayList(l)
self.assertTrue(dl == dl)
self.assertTrue(dl() == dl())
self.assertTrue(dl[:] == l)
self.assertTrue(dl()[:] == l)
def test_sort(self):
a = ((1, 'a',), (2, 'b'), (3, 'c'))
b = ((10, 'Z',), (9, 'Y'), (8, 'X'))
c = ((1, 'Z',), (3, 'Y'), (2, 'X'))
dla = IntDisplayList(a)
dlb = IntDisplayList(b)
dlc = IntDisplayList(c)
assert dla.values() == ['a', 'b', 'c']
dlb_s = dlb.sortedByValue()
assert dlb_s.values() == ['X', 'Y', 'Z']
dlc_s = dlc.sortedByKey()
assert dlc_s.values() == ['Z', 'X', 'Y']
def test_getValue(self):
        a = ((1, 'A',), (2, '\xc3\xab'), (3, u'\xeb'), (4, 42))
        dla = IntDisplayList(a)
        self.assertEqual(dla.getValue(1), 'A')
        self.assertEqual(dla.getValue(2), '\xc3\xab')
        self.assertEqual(dla.getValue(3), u'\xeb')
self.assertEqual(dla.getValue(4), 42)
self.assertEqual(dla.getValue(5), None)
self.assertEqual(dla.getValue(5, 'default'), 'default')
class VocabularyTest(ATTestCase):
def test_getValue(self):
        a = (('a', 'A',), ('b', '\xc3\xab'), ('c', u'\xeb'), ('d', 42))
        dla = DisplayList(a)
        va = Vocabulary(dla, instance=None, i18n_domain=None)
        self.assertEqual(va.getValue('a'), 'A')
        self.assertEqual(va.getValue('b'), '\xc3\xab')
        self.assertEqual(va.getValue('c'), u'\xeb')
        self.assertEqual(va.getValue('d'), 42)
        self.assertEqual(va.getValue('e'), None)
        self.assertEqual(va.getValue('e', 'default'), 'default')
        b = (('A', 'a',), ('\xc3\xab', 'b'), (u'\xeb', 'c'))
        dlb = DisplayList(b)
        vb = Vocabulary(dlb, instance=None, i18n_domain=None)
        self.assertEqual(vb.getValue('A'), 'a')
        self.assertEqual(vb.getValue('\xc3\xab'), 'b')
        self.assertEqual(vb.getValue(u'\xeb'), 'c')
        self.assertEqual(vb.getValue('e'), None)
        self.assertEqual(vb.getValue('e', 'default'), 'default')
        c = ((1, 'A',), (2, '\xc3\xab'), (3, u'\xeb'), (4, 42))
        dlc = IntDisplayList(c)
        vb = Vocabulary(dlc, instance=None, i18n_domain=None)
        self.assertEqual(dlc.getValue(1), 'A')
        self.assertEqual(dlc.getValue(2), '\xc3\xab')
        self.assertEqual(dlc.getValue(3), u'\xeb')
self.assertEqual(dlc.getValue(4), 42)
self.assertEqual(dlc.getValue(5), None)
self.assertEqual(dlc.getValue(5, 'default'), 'default')
def test_translating_getValue(self):
# We use the same base displaylists as above (hopefully), but
# now we pass an instance and an i18n_domain. The instance is
# expected to be an Archetypes object, but currently it only
# needs to have a True boolean value.
        a = (('a', 'A',), ('b', '\xc3\xab'), ('c', u'\xeb'), ('d', 42))
        dla = DisplayList(a)
        va = Vocabulary(dla, instance=object(), i18n_domain='plone')
        self.assertEqual(va.getValue('a'), 'A')
        self.assertEqual(va.getValue('b'), '\xc3\xab'.decode('utf-8'))
        self.assertEqual(va.getValue('c'), u'\xeb')
        self.assertEqual(va.getValue('d'), 42)
        self.assertEqual(va.getValue('e'), None)
        self.assertEqual(va.getValue('e', 'default'), 'default')
        b = (('A', 'a',), ('\xc3\xab', 'b'), (u'\xeb', 'c'))
        dlb = DisplayList(b)
        vb = Vocabulary(dlb, instance=object(), i18n_domain='plone')
        self.assertEqual(vb.getValue('A'), 'a')
        self.assertEqual(vb.getValue('\xc3\xab'), 'b')
        self.assertEqual(vb.getValue(u'\xeb'), 'c')
        self.assertEqual(vb.getValue('e'), None)
        self.assertEqual(vb.getValue('e', 'default'), 'default')
        c = ((1, 'A',), (2, '\xc3\xab'), (3, u'\xeb'), (4, 42))
        dlc = IntDisplayList(c)
        vb = Vocabulary(dlc, instance=object(), i18n_domain='plone')
        self.assertEqual(dlc.getValue(1), 'A')
        self.assertEqual(dlc.getValue(2), '\xc3\xab')
        self.assertEqual(dlc.getValue(3), u'\xeb')
self.assertEqual(dlc.getValue(4), 42)
self.assertEqual(dlc.getValue(5), None)
self.assertEqual(dlc.getValue(5, 'default'), 'default')
def test_suite():
from unittest import TestSuite, makeSuite
suite = TestSuite()
suite.addTest(makeSuite(VocabularyTest))
suite.addTest(makeSuite(DisplayListTest))
suite.addTest(makeSuite(IntDisplayListTest))
suite.addTest(makeSuite(UidGeneratorTest))
return suite
| [
"[email protected]"
] | |
1ff640fe7e36b1fa053268a8444e4d290cd90c50 | f514f2746c69726ac38f8e8679eb2b646d11ec91 | /dota2_stats/views/matches.py | 857690b007094ca61e06f0758e261b316c1b1ecb | [] | no_license | bobbyrward/dota2_stats | 871b99ca6550496acc95ff44947a23566708861f | b3d2e7fbe4712dcb08f75e3a15b358a8388711a3 | refs/heads/master | 2021-01-19T03:23:54.158490 | 2013-02-15T22:56:08 | 2013-02-15T22:56:08 | 8,169,782 | 1 | 0 | null | null | null | null | UTF-8 | Python | false | false | 770 | py | from pyramid.view import view_config
from dota2_stats.models import DBSession
from dota2_stats.models import Match
from dota2_stats.models import PlayerMatch
from dota2_stats.models import Player
from dota2_stats.models import Hero
from dota2_stats.views.common import template_params
@view_config(route_name='recent_matches', renderer='templates/recent_matches.jinja2')
def recent_matches(request):
matches = DBSession.query(Match).order_by(Match.start_time.desc()).limit(25).all()
return template_params(request, matches=matches)
@view_config(route_name='match_details', renderer='templates/match_details.jinja2')
def match_details(request):
match = DBSession.query(Match).get(request.matchdict['id'])
return template_params(request, match=match)
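# Hedged sketch (not part of this module) of the Pyramid route setup these
# views expect; only the route names come from the @view_config decorators
# above, the URL patterns themselves are assumptions:
#
#   config.add_route('recent_matches', '/matches')
#   config.add_route('match_details', '/matches/{id}')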
| [
"[email protected]"
] | |
93f9ff7f0c20a17eac24bdb842d09cdd06d72f77 | 94bfb1346a9ce4cf6ca8bfeeb5194b7a467731a6 | /aclark/db/migrations/0031_siteconfiguration_company.py | d98d10bc777b9bace6f33643961cd5d7281078eb | [
"MIT"
] | permissive | aclark4life/aclarknet-best-pro | 4006cad37c2eec166a98a73e988b9b490a10e5cb | e256bfdd63ad4445bf0a75ef0b91f6e1fd2479ea | refs/heads/master | 2023-03-01T09:10:04.041913 | 2020-12-01T18:40:07 | 2020-12-01T18:40:07 | 140,634,961 | 0 | 0 | MIT | 2021-02-10T01:57:38 | 2018-07-11T22:49:33 | CSS | UTF-8 | Python | false | false | 598 | py | # Generated by Django 2.2.3 on 2019-07-30 18:22
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [("db", "0030_company")]
operations = [
migrations.AddField(
model_name="siteconfiguration",
name="company",
field=models.ForeignKey(
blank=True,
limit_choices_to={"active": True},
null=True,
on_delete=django.db.models.deletion.CASCADE,
to="db.Company",
),
)
]
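# (Added note) Effect of this migration: SiteConfiguration gains a nullable
# ForeignKey `company` -> db.Company, limited to active companies; existing
# rows keep company = NULL, so no data migration is needed.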
| [
"[email protected]"
] | |
5cb5f6b8ccf26d58cbd0363a37d076f1d55c436f | 16ac02b8f427bd622af1564f1236e4913ed63521 | /Codes/Version 1.8.6/geometry_def_DEP.py | 7c22289e317f50fec51f1ba4fa4e82052a6f6a8e | [
"MIT"
] | permissive | gharib85/Brownian-dynamics-in-a-time-varying-force-field | 20660665747310e1201e8ca7d404acc15ec7a3bd | 1dce268fcc4f27e066be0ec0b511178cbc1437c5 | refs/heads/main | 2023-08-16T03:47:51.957137 | 2021-10-23T19:09:50 | 2021-10-23T19:09:50 | null | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 8,523 | py | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Sat May 8 13:18:12 2021
@author: Mohammad Asif Zaman
Date: May 15, 2021
This code defines geometrical elements by a set of points. It also calculates the
normal vector for each surface of that geometry.
Definition parameters:
    - In geo_pointsX() functions, the point array set p_wall[] contains the edge points of each geometry element
- All numbers in the geo_pointsX() functions are in microns (um)
Outputs:
- n_vector_set: Normal vectors for all the walls n = (a,b,c), n_vector_set = [(a1,b1,c1), (a2,b2,c2)....]. Size = (Nwalls,3)
- d_set : Set of d coefficients defining the wall planes ax + by + cz = d. Size (Nwalls,)
    - p_lim_set   : x,y,z limits for each wall. p_lim_1 = [xlow_1, xhigh_1, ylow_1, yhigh_1, zlow_1, zhigh_1], p_lim_set = [p_lim_1,p_lim_2....]. Size (Nwalls,6)
- Here, Nwalls = number of walls
- May 17, 2021:
- Added different geometry segment types (rectangular, triangular etc.)
- Implemented call function by string
- Functionalized different parts
"""
import numpy as np
# This function states which elements are present in the definition of the geometry
def geo_element_types():
# Types of geometrical elements used
# 4 = rectangular geometry segments (geo_points4() function must exist)
# 3 = triangular geometry segments (geo_points3() function must exist)
# .... and so on
return np.array([]) # return empty array (no walls defined). Only the default z = 0 reflecting wall will be considered.
# # Triangular geometry segments
# def geo_points3():
# # Points defining the geometry. Format
# # [ [ [x1,y1], [x2,y2], [x3,y3], [x4,y4] ] # first geometry segment
# # [ [X1,Y1], [X2,Y2], [X3,Y3], [X4,Y4] ] # second geometry segment
# # [ ] # .... (so on)
# # ]
# p_wall = [ [ [0,30], [70,30], [70,60] ],
# [ [-60,-30], [-40,-30], [-40,-50] ],
# ]
# # z range of each of the points
# z_wall = [ [0,140],
# [0,140],
# ]
# p_wall = np.array(p_wall)
# return p_wall, z_wall
# Calculates normal vector,d coeff and limits of x,y,z for a specific geometry set (example: geo_points4() )
def normal_v(pset,zset):
n_vector = [[]] # Set of normal vectors for each reflecting surface. Each element: n = (a,b,c)
d_coeff = [] # Corresponding d coefficient for ax + by + cz = d plane equation
p_lims =[[]] # x,y,z extent/limit of the wall. Format: [xlow, xhigh, ylow, yhigh, zlow, zhigh]
Nw = pset.shape[0] # Number of walls
for q in range(Nw): # goes through every wall geometry
ps = pset[q] # Select one geometry segment. ps contains all points defining that geometry segment.
zs = zset[q]
M = ps.shape[0] # Number of points in that geometry segment.For rectangle, it's 4 points.
        for m in range(M): # goes through all the points in qth wall geometry
            # indices for pair of points
            i1 = m
            i2 = m + 1 if m < M-1 else 0 # wrap around: the last point pairs with the first point
# making 3d coordiantes from 2d using z = 0 value for p1, p2 and z = 1 value for p3
# 3 points are defined
p1 = np.append(ps[i1],zs[0])
p2 = np.append(ps[i2],zs[1])
p3 = np.append(ps[i2],zs[0]) # a third arbitrary point at a different z value
# min-max limit of the three points
# format: xmin, xmax, ymin, ymax, zmin, zmax
temp = [min(p1[0],p2[0],p3[0]),max(p1[0],p2[0],p3[0]),
min(p1[1],p2[1],p3[1]),max(p1[1],p2[1],p3[1]),
min(p1[2],p2[2],p3[2]),max(p1[2],p2[2],p3[2])]
# Note: We are assuming that the geoemtry segments are uniform in the z direction (they extrude up from the xy plane in the z direction)
# For generalized case, the z points should be defined within the definition of the points. However, for most LOC microfluidic devices,
# there are no variations along the z direction and this assumption holds.
nv = np.cross(p3-p2,p2-p1) # normal vector to the plane containing the three points
d = np.dot(nv,p1) # corresponding d coefficient
d = d/np.linalg.norm(nv)
nv = nv/np.linalg.norm(nv)
n_vector = np.append(n_vector,[nv], axis = 1) if m+q == 0 else np.append(n_vector,[nv], axis = 0)
p_lims = np.append(p_lims,[temp], axis = 1) if m+q == 0 else np.append(p_lims,[temp], axis = 0)
# Note for n_vector if statement: When appending the first element within the empty
# n_vector, the axis is 1. For other cases, axis = 0. This ensures that the data structure is
# correct and the array of arrays don't blend into a 1D array. The first element occurs at
# q = 0 and m = 0 (thus, when m+q = 0).
d_coeff = np.append(d_coeff,d)
# print(n_vector)
return n_vector,d_coeff, p_lims
# return ps
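
# Worked example (a sketch): for a wall whose first edge runs from (0,0) to (1,0)
# and is extruded over z in [0,1], normal_v() returns for that face the unit
# normal (0, -1, 0) with d = 0, i.e. the plane y = 0.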
# Calculate normal vectors, d coeff and (x,y,z) limits for all geometries (combines results for diff type of geometries)
# Main function that will return normal vectors for all the geometry segments, d coefficients and (x,y,z) limits
def geo_define():
el_type = geo_element_types() # Array containing types of geometries used
n_vector_set = np.array([[0,0,1]]) # List for storing normal vectors with z = 0 plane inputted as the 0th wall (substrate)
d_set = np.array([0]) # List for storing d coefficient with z = 0 plane inputted as the 0th wall
p_lim_set =np.array([[-np.inf, np.inf, -np.inf, np.inf, -np.inf, np.inf]]) # List for storing the limits/extent of each wall with an infinite z = 0 substrate already inputted
# # Use these empty list initialization if z = 0 substrate is not present
# n_vector_set = [[]] # List for storing normal vectors
# d_set = [] # List for storing d coefficient
# p_lim_set =[[]] # List for storing the limits/extent of each wall
for m in range(el_type.shape[0]): # looping over all geometry element types
fname = 'geo_points' + str(el_type[m]) + '()' # name of the corresponding geometry function
# Evaluate geo_points*()function by string name
p_w1,z_w1 = eval(fname)
nv,d,pp = normal_v(p_w1, z_w1)
n_vector_set = np.append(n_vector_set,nv, axis = 0)
d_set = np.append(d_set,d)
p_lim_set = np.append(p_lim_set,pp, axis = 0)
# # Use the following if you wish to initialize with an empty list instead of z = 0 default plane
# n_vector_set = nv if m == 0 else np.append(n_vector_set,nv, axis = 0)
# d_set = d if m == 0 else np.append(d_set,d)
# p_lim_set = pp if m == 0 else np.append(p_lim_set,pp, axis = 0)
return n_vector_set, d_set, p_lim_set
# Simulation parameters: particle count and radii, run time, axis limits, initial-position limits
def time_pos_ax_limits():
    # Particle parameters (number and radius array)
Np = 3 # Number of particles
# ro = np.zeros((Np,1)) + 10e-6
ro = np.zeros(Np) + 10e-6
# ro[0] = 5e-6
# ro[1] = 8e-6
# Time parameters
tfinal = 38
# Axes parameters (in microns)
x_lim = [-250, 250]
y_lim = [-250, 250]
z_lim = [-20, 150]
# Limit of initial particle positions
xi_lim = [-80e-6, 80e-6]
yi_lim = [-80e-6, 80e-6]
zi_lim = [max(ro)*1.5, 80e-6]
return Np, ro, tfinal, x_lim, y_lim, z_lim, xi_lim, yi_lim, zi_lim
# Electrode array geometry parameters (units in um)
elec_width = 15
elec_spacing = 50
# =============================================================================
# n_vector_set, d_set, p_lim_set = geo_define()
# print(n_vector_set)
# print(d_set)
# print(p_lim_set)
| [
"[email protected]"
] | |
87ef5d8ed682d518f47eb6e1e03850066d251895 | ff268c31f10cbd3e1c44261ca65a45c88ed3dae5 | /Transfer Learning/Code/classify.py | 2670af12d79b9e3e4f9dd6f648aa8ad8c6399325 | [
"MIT"
] | permissive | gyani91/Machine-Learning | 6642c65359ed48b212a0f4296f5ce908ed6e95e3 | 2fabaa6386d3be24e56aaa9a19d58cd19d225198 | refs/heads/master | 2023-05-27T10:25:55.222053 | 2023-05-15T18:12:45 | 2023-05-15T18:12:45 | 114,811,646 | 2 | 2 | null | null | null | null | UTF-8 | Python | false | false | 1,485 | py | import tensorflow as tf
from resize import *
from convert import *
#IMAGE_PATH = "Classify/panda.jpg"
TRAINED_GRAPH = "sets_graph.pb"
LABELS = "label.txt"
FINAL_TENSOR_NAME = "final_tensor"
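
# Usage sketch (the path is an assumption, taken from the commented-out
# IMAGE_PATH above; any image that convert() handles should work):
#   classify("Classify/panda.jpg")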
def classify(IMAGE_PATH):
# Convert the image to JPEG
converted_image = convert(IMAGE_PATH)
# Resize the image
resized_image = resize(converted_image)
# Read the input_image
input_image = tf.gfile.FastGFile(resized_image, 'rb').read()
# Load labels
class_labels = [line.rstrip() for line
in tf.gfile.GFile(LABELS)]
#Load the trained model
with tf.gfile.FastGFile(TRAINED_GRAPH, 'rb') as f:
graph_def = tf.GraphDef()
graph_def.ParseFromString(f.read())
_ = tf.import_graph_def(graph_def, name='')
init = tf.global_variables_initializer()
with tf.Session() as sess:
sess.run(init)
# Feed the input_image to the graph and get the prediction
softmax_tensor = sess.graph.get_tensor_by_name(FINAL_TENSOR_NAME+':0')
predictions = sess.run(softmax_tensor, {'DecodeJpeg/contents:0': input_image})
# Sort the labels of the prediction in order of confidence
sorted_labels = predictions[0].argsort()[-len(predictions[0]):][::-1]
print('Classification:')
for index in sorted_labels:
class_label = class_labels[index]
percentage = predictions[0][index]*100
print(('%s (%.2f' % (class_label, percentage))+'%)') | [
"[email protected]"
] | |
851e188926412577b670edf374b8128bec3b8f91 | 773dc03117f8b0d51f7a10e2a4577229c8be6ba3 | /alttprbot_discord/util/alttpr_discord.py | b109cfd6aa74d9845a9daf9414c8e7910803208f | [
"LicenseRef-scancode-unknown-license-reference",
"MIT"
] | permissive | tcprescott/sahasrahbot | 382cdff058d63feb5f42dbbd7729eb4b08c4d1bd | 64a125d948873d0faa5ea3f2d306075ad9e013be | refs/heads/master | 2023-08-31T15:33:01.533206 | 2023-08-31T01:58:48 | 2023-08-31T01:58:48 | 178,310,225 | 22 | 43 | MIT | 2023-09-01T08:45:52 | 2019-03-29T01:34:45 | Python | UTF-8 | Python | false | false | 17,139 | py | import datetime
import os
import aiohttp
import discord
import html2markdown
from pyz3r import ALTTPR
emoji_code_map = {
'Bow': 'Bow',
'Boomerang': 'BestBoomerang',
'Hookshot': 'Hookshot',
'Bombs': 'Blowup',
'Mushroom': 'Mushroom',
'Magic Powder': 'Powder',
'Ice Rod': 'IceRod',
'Pendant': 'PendantOfCourage',
'Bombos': 'Bombos',
'Ether': 'Ether',
'Quake': 'Quake',
'Lamp': 'Lamp',
'Hammer': 'MCHammer',
'Shovel': 'shovel',
'Flute': 'Flute',
'Bugnet': 'BugNet',
'Book': 'Mudora',
'Empty Bottle': 'EmptyBottle',
'Green Potion': 'GreenPotion',
'Somaria': 'somaria',
'Cape': 'Cape',
'Mirror': 'mirror',
'Boots': 'GoFast',
'Gloves': 'PowerGlove',
'Flippers': 'Flippers',
'Moon Pearl': 'MoonPearl',
'Shield': 'MirrorShield',
'Tunic': 'GreenTunic',
'Heart': 'ALotOfLove',
'Map': 'DungeonMap',
'Compass': 'DungeonCompass',
'Big Key': 'BigKey'
}
class ALTTPRDiscord(ALTTPR):
def __init__(self, *args, **kwargs):
super(ALTTPRDiscord, self).__init__(*args, **kwargs)
if 'baseurl' not in kwargs:
self.baseurl = os.environ.get("ALTTPR_BASEURL", 'https://alttpr.com')
username = os.environ.get("ALTTPR_USERNAME", None)
password = os.environ.get("ALTTPR_PASSWORD", None)
self.auth = aiohttp.BasicAuth(login=username, password=password) if username and password else None
@property
def generated_goal(self):
settings_list = []
meta = self.data['spoiler'].get('meta', {})
if meta.get('spoilers', 'off') == 'mystery':
return 'mystery'
settings = {
'mode': meta.get('mode', 'open'),
'weapons': meta.get('weapons', 'randomized'),
'goal': meta.get('goal', 'ganon'),
'logic': meta.get('logic', 'NoGlitches'),
'shuffle': meta.get('shuffle', 'none'),
'item_pool': meta.get('item_pool', 'normal'),
'dungeon_items': meta.get('dungeon_items', 'standard'),
'item_functionality': meta.get('item_functionality', 'normal'),
'entry_crystals_ganon': meta.get('entry_crystals_ganon', '7'),
'entry_crystals_tower': meta.get('entry_crystals_tower', '7'),
'enemizer.boss_shuffle': meta.get('enemizer.boss_shuffle', 'none'),
'enemizer.enemy_damage': meta.get('enemizer.enemy_damage', 'default'),
'enemizer.enemy_health': meta.get('enemizer.enemy_health', 'default'),
'enemizer.enemy_shuffle': meta.get('enemizer.enemy_shuffle', 'none'),
}
        if settings['item_pool'] not in ['easy', 'normal'] or settings['item_functionality'] not in ['easy', 'normal']:
settings_list.append('hard')
elif settings['dungeon_items'] == 'full' and not settings['goal'] == 'dungeons':
settings_list.append('normal')
if is_enemizer(settings):
settings_list.append("enemizer")
if settings['weapons'] == 'swordless':
settings_list.append('swordless')
if not (settings['dungeon_items'] == 'full' and settings['goal'] == 'dungeons'):
if settings['mode'] == 'open':
if settings['shuffle'] == 'none' and not is_enemizer(settings) and (settings['item_pool'] == 'normal' and settings['item_functionality'] == 'normal') and not settings['weapons'] == 'swordless':
settings_list.append('casual')
settings_list.append('open')
elif settings['mode'] == 'standard' and settings['weapons'] == 'randomized':
settings_list.append('standard')
elif settings['mode'] == 'standard' and settings['weapons'] == 'assured' and (settings['item_pool'] == 'normal' and settings['item_functionality'] == 'normal'):
settings_list.append('casual')
elif settings['mode'] == 'inverted':
settings_list.append('inverted')
elif settings['mode'] == 'retro':
settings_list.append('retro')
        # These values live under the 'logic' key (see the settings dict above),
        # not under 'goal'.
        if settings['logic'] == 'OverworldGlitches':
            settings_list.append("overworld glitches")
        elif settings['logic'] == 'MajorGlitches':
            settings_list.append("major glitches")
        elif settings['logic'] == 'NoLogic':
            settings_list.append("no logic")
if not settings['entry_crystals_tower'] == '7' or not settings['entry_crystals_ganon'] == '7':
settings_list.append(
f"{settings['entry_crystals_tower']}/{settings['entry_crystals_ganon']}")
if settings['goal'] == 'ganon' and settings['shuffle'] != 'none':
settings_list.append("defeat ganon")
elif settings['goal'] == 'fast_ganon' and settings['shuffle'] == 'none':
settings_list.append("fast ganon")
elif settings['goal'] == 'dungeons':
settings_list.append("all dungeons")
elif settings['goal'] == 'triforce-hunt':
settings_list.append("triforce hunt")
elif settings['goal'] == 'pedestal':
settings_list.append("pedestal")
if settings['dungeon_items'] == 'mc':
settings_list.append("mc")
elif settings['dungeon_items'] == 'mcs':
settings_list.append("mcs")
elif settings['dungeon_items'] == 'full':
settings_list.append("keysanity")
if not settings['shuffle'] == 'none':
settings_list.append("+ entrance shuffle")
if meta.get('difficulty', None) == 'custom':
settings_list.append("(customizer)")
return " ".join(settings_list)
async def embed(self, emojis=False, name=False, notes=False, include_settings=True):
settings_map = await self.randomizer_settings()
meta = self.data['spoiler'].get('meta', {})
embed = discord.Embed(
title=meta.get('name', 'Requested Seed') if not name else name,
description=html2markdown.convert(
meta.get('notes', '')) if not notes else notes,
color=discord.Colour.dark_red(),
timestamp=datetime.datetime.fromisoformat(self.data['generated'])
)
if include_settings:
if meta.get('spoilers', 'off') == "mystery":
embed.add_field(
name='Mystery Game',
value="No meta information is available for this game.",
inline=False)
embed.add_field(
name='Item Placement',
value=f"**Glitches Required:** {meta['logic']}",
inline=True)
else:
if meta.get('special', False):
embed.add_field(
name='Festive Randomizer',
value="This game is a festive randomizer. Spooky!",
inline=False)
embed.add_field(
name='Settings',
value=(f"**Item Placement:** {settings_map['item_placement'][meta['item_placement']]}\n"
f"**Dungeon Items:** {settings_map['dungeon_items'][meta['dungeon_items']]}\n"
f"**Accessibility:** {settings_map['accessibility'][meta['accessibility']]}\n"
f"**World State:** {settings_map['world_state'][meta['mode']]}\n"
f"**Hints:** {meta['hints']}\n"
f"**Swords:** {settings_map['weapons'][meta['weapons']]}\n"
f"**Item Pool:** {settings_map['item_pool'][meta['item_pool']]}\n"
f"**Item Functionality:** {settings_map['item_functionality'][meta['item_functionality']]}"
),
inline=False
)
else:
embed.add_field(
name='Item Placement',
value="**Glitches Required:** {logic}\n**Item Placement:** {item_placement}\n**Dungeon Items:** {dungeon_items}\n**Accessibility:** {accessibility}".format(
logic=meta['logic'],
item_placement=settings_map['item_placement'][meta['item_placement']],
dungeon_items=settings_map['dungeon_items'][meta['dungeon_items']],
accessibility=settings_map['accessibility'][meta['accessibility']],
),
inline=True)
embed.add_field(
name='Goal',
value="**Goal:** {goal}\n**Open Tower:** {tower}\n**Ganon Vulnerable:** {ganon}".format(
goal=settings_map['goals'][meta['goal']],
tower=meta.get(
'entry_crystals_tower', 'unknown'),
ganon=meta.get(
'entry_crystals_ganon', 'unknown'),
),
inline=True)
embed.add_field(
name='Gameplay',
value="**World State:** {mode}\n**Entrance Shuffle:** {entrance}\n**Boss Shuffle:** {boss}\n**Enemy Shuffle:** {enemy}\n**Pot Shuffle:** {pot}\n**Hints:** {hints}".format(
mode=settings_map['world_state'][meta['mode']],
entrance=settings_map['entrance_shuffle'][meta['shuffle']
] if 'shuffle' in meta else "None",
boss=settings_map['boss_shuffle'][meta['enemizer.boss_shuffle']],
enemy=settings_map['enemy_shuffle'][meta['enemizer.enemy_shuffle']],
pot=meta.get('enemizer.pot_shuffle', 'off'),
hints=meta['hints']
),
inline=True)
embed.add_field(
name='Difficulty',
value="**Swords:** {weapons}\n**Item Pool:** {pool}\n**Item Functionality:** {functionality}\n**Enemy Damage:** {damage}\n**Enemy Health:** {health}".format(
weapons=settings_map['weapons'][meta['weapons']],
pool=settings_map['item_pool'][meta['item_pool']],
functionality=settings_map['item_functionality'][meta['item_functionality']],
damage=settings_map['enemy_damage'][meta['enemizer.enemy_damage']],
health=settings_map['enemy_health'][meta['enemizer.enemy_health']],
),
inline=True)
embed.add_field(name='File Select Code', value=self.build_file_select_code(
emojis=emojis), inline=False)
embed.add_field(name='Permalink', value=self.url, inline=False)
embed.set_footer(text="Generated", icon_url=discord.utils.get(
emojis, name="SahasrahBot").url)
return embed
async def tournament_embed(self, emojis=False, name=False, notes=False, include_settings=True):
settings_map = await self.randomizer_settings()
meta = self.data['spoiler'].get('meta', {})
embed = discord.Embed(
title=meta.get('name', 'Requested Seed') if not name else name,
description=html2markdown.convert(
meta.get('notes', '')) if not notes else notes,
color=discord.Colour.dark_gold(),
timestamp=datetime.datetime.fromisoformat(self.data['generated'])
)
if include_settings:
if meta.get('spoilers', 'off') == "mystery":
embed.add_field(
name='Mystery Game',
value="No meta information is available for this game.",
inline=False)
embed.add_field(
name='Item Placement',
value=f"**Glitches Required:** {meta['logic']}",
inline=True)
else:
if meta.get('special', False):
embed.add_field(
name='Festive Randomizer',
value="This game is a festive randomizer. Spooky!",
inline=False)
embed.add_field(
name='Settings',
value=(f"**Item Placement:** {settings_map['item_placement'][meta['item_placement']]}\n"
f"**Dungeon Items:** {settings_map['dungeon_items'][meta['dungeon_items']]}\n"
f"**Accessibility:** {settings_map['accessibility'][meta['accessibility']]}\n"
f"**World State:** {settings_map['world_state'][meta['mode']]}\n"
f"**Hints:** {meta['hints']}\n"
f"**Swords:** {settings_map['weapons'][meta['weapons']]}\n"
f"**Item Pool:** {settings_map['item_pool'][meta['item_pool']]}\n"
f"**Item Functionality:** {settings_map['item_functionality'][meta['item_functionality']]}"
),
inline=False
)
else:
embed.add_field(
name='Item Placement',
value="**Glitches Required:** {logic}\n**Item Placement:** {item_placement}\n**Dungeon Items:** {dungeon_items}\n**Accessibility:** {accessibility}".format(
logic=meta['logic'],
item_placement=settings_map['item_placement'][meta['item_placement']],
dungeon_items=settings_map['dungeon_items'][meta['dungeon_items']],
accessibility=settings_map['accessibility'][meta['accessibility']],
),
inline=True)
embed.add_field(
name='Goal',
value="**Goal:** {goal}\n**Open Tower:** {tower}\n**Ganon Vulnerable:** {ganon}".format(
goal=settings_map['goals'][meta['goal']],
tower=meta.get(
'entry_crystals_tower', 'unknown'),
ganon=meta.get(
'entry_crystals_ganon', 'unknown'),
),
inline=True)
embed.add_field(
name='Gameplay',
value="**World State:** {mode}\n**Entrance Shuffle:** {entrance}\n**Boss Shuffle:** {boss}\n**Enemy Shuffle:** {enemy}\n**Pot Shuffle:** {pot}\n**Hints:** {hints}".format(
mode=settings_map['world_state'][meta['mode']],
entrance=settings_map['entrance_shuffle'][meta['shuffle']
] if 'shuffle' in meta else "None",
boss=settings_map['boss_shuffle'][meta['enemizer.boss_shuffle']],
enemy=settings_map['enemy_shuffle'][meta['enemizer.enemy_shuffle']],
pot=meta.get('enemizer.pot_shuffle', 'off'),
hints=meta['hints']
),
inline=True)
embed.add_field(
name='Difficulty',
value="**Swords:** {weapons}\n**Item Pool:** {pool}\n**Item Functionality:** {functionality}\n**Enemy Damage:** {damage}\n**Enemy Health:** {health}".format(
weapons=settings_map['weapons'][meta['weapons']],
pool=settings_map['item_pool'][meta['item_pool']],
functionality=settings_map['item_functionality'][meta['item_functionality']],
damage=settings_map['enemy_damage'][meta['enemizer.enemy_damage']],
health=settings_map['enemy_health'][meta['enemizer.enemy_health']],
),
inline=True)
embed.add_field(name='File Select Code', value=self.build_file_select_code(
emojis=emojis), inline=False)
embed.set_footer(text="Generated", icon_url=discord.utils.get(
emojis, name="SahasrahBot").url)
return embed
def build_file_select_code(self, emojis=None):
if emojis:
emoji_list = list(map(lambda x: str(discord.utils.get(
emojis, name=emoji_code_map[x])), self.code))
return ' '.join(emoji_list) + ' (' + '/'.join(self.code) + ')'
else:
return '/'.join(self.code)
def is_enemizer(settings):
return settings['enemizer.boss_shuffle'] != 'none' or settings['enemizer.enemy_shuffle'] != 'none' or settings['enemizer.enemy_damage'] != 'default' or settings['enemizer.enemy_health'] != 'default'
| [
"[email protected]"
] | |
61d21cd5046836892b809cc0fc7f1a977605c227 | 5c2824ff58eb8a57d71b3c24873c4695c7c3a2ba | /Fundamentals_Final_Exam/03.Problem_Three.py | 1e7d798f28b43c1d3e3e8e862cdef755df0fecb0 | [] | no_license | svetoslavastoyanova/Python_Fundamentals_Mid_and_Final_Exams | e7ff6677bc762b24262019a0ebb0ed6a5952c50d | 781e03fd5f540d55b41fbe6ef1d722d39ed62176 | refs/heads/main | 2023-04-17T17:35:38.894988 | 2021-05-02T12:28:26 | 2021-05-02T12:28:26 | 349,411,531 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 389 | py | line = input()
minutes = int(input())
seconds = int(input())
total_time = 0
text = ""
while line != "Finish":
total_time = minutes*60 + seconds
if total_time < 55:
text = "Gold"
elif 55 <= total_time <= 85:
text = "Silver"
elif 85 < total_time <= 120:
text = "Bronze"
line = input()
minutes = int(input())
seconds = int(input())
| [
"[email protected]"
] | |
16a3a8560d14738c480e32368a3b4c2b7f240037 | fd474c0c0df7de6c09f802586068a2069222aadd | /reviewboard/reviews/evolutions/file_attachment_comment_extra_data.py | 86e3d2fc9eb496b260de4b12d19e98ab74fb6221 | [
"MIT"
] | permissive | pombredanne/reviewboard | a2970fa18cfff4b15adfe65fd0098287d73c650e | 15f1d7236ec7a5cb4778ebfeb8b45d13a46ac71d | refs/heads/master | 2022-03-09T22:24:19.951964 | 2022-02-09T07:12:23 | 2022-02-09T07:12:23 | 2,324,135 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 180 | py | from django_evolution.mutations import AddField
from djblets.db.fields import JSONField
MUTATIONS = [
AddField('FileAttachmentComment', 'extra_data', JSONField, null=True)
]
| [
"[email protected]"
] | |
ec8da544cd59eff81c89e4f327ad9a081c8125d6 | b45d33675b38fd3bd15fb2f73a29851a3cc4037d | /0x01-python-if_else_loops_functions/1-last_digit.py | 6fd126b08396c288541339756d823b1a022c70f4 | [] | no_license | angelah1994/holbertonschool-higher_level_programming-1 | 38b8ca1859af2ec08aa50a862ecf37cabf993b46 | 61ab83696ed45686456317c485f7adb7220654ff | refs/heads/master | 2023-03-16T04:08:44.868909 | 2020-05-15T15:40:50 | 2020-05-15T15:40:50 | null | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 429 | py | #!/usr/bin/python3
import random
number = random.randint(-10000, 10000)
last = abs(number) % 10
if number < 10:
last = last * -1
if last > 5:
print('Last digit of {} is {} and is greater than 5'. format(number, last))
elif last < 6 and last != 0:
str = 'Last digit of {} is {} and is less than 6 and not 0'
print(str. format(number, last))
else:
print('Last digit of {} is {} and is 0'. format(number, last))
| [
"[email protected]"
] | |
e5ec7d80a9bd8ec0d31e8695546aa7bfb197c39d | 209a7a4023a9a79693ec1f6e8045646496d1ea71 | /COMP0016_2020_21_Team12-datasetsExperimentsAna/pwa/FADapp/pythonScripts/venv/Lib/site-packages/pandas/tests/arrays/boolean/test_reduction.py | 7ac6c13a933d640f3b303c790f7220a60ada525b | [
"MIT"
] | permissive | anzhao920/MicrosoftProject15_Invictus | 5e2347015411bbffbdf0ceb059df854661fb240c | 15f44eebb09561acbbe7b6730dfadf141e4c166d | refs/heads/main | 2023-04-16T13:24:39.332492 | 2021-04-27T00:47:13 | 2021-04-27T00:47:13 | 361,913,170 | 0 | 0 | MIT | 2021-04-26T22:41:56 | 2021-04-26T22:41:55 | null | UTF-8 | Python | false | false | 2,077 | py | import numpy as np
import pytest
import pandas as pd
@pytest.fixture
def data():
return pd.array(
[True, False] * 4 + [np.nan] + [True, False] * 44 + [np.nan] + [True, False],
dtype="boolean",
)
@pytest.mark.parametrize(
"values, exp_any, exp_all, exp_any_noskip, exp_all_noskip",
[
([True, pd.NA], True, True, True, pd.NA),
([False, pd.NA], False, False, pd.NA, False),
([pd.NA], False, True, pd.NA, pd.NA),
([], False, True, False, True),
# GH-33253: all True / all False values buggy with skipna=False
([True, True], True, True, True, True),
([False, False], False, False, False, False),
],
)
def test_any_all(values, exp_any, exp_all, exp_any_noskip, exp_all_noskip):
# the methods return numpy scalars
exp_any = pd.NA if exp_any is pd.NA else np.bool_(exp_any)
exp_all = pd.NA if exp_all is pd.NA else np.bool_(exp_all)
exp_any_noskip = pd.NA if exp_any_noskip is pd.NA else np.bool_(exp_any_noskip)
exp_all_noskip = pd.NA if exp_all_noskip is pd.NA else np.bool_(exp_all_noskip)
for con in [pd.array, pd.Series]:
a = con(values, dtype="boolean")
assert a.any() is exp_any
assert a.all() is exp_all
assert a.any(skipna=False) is exp_any_noskip
assert a.all(skipna=False) is exp_all_noskip
assert np.any(a.any()) is exp_any
assert np.all(a.all()) is exp_all
@pytest.mark.parametrize("dropna", [True, False])
def test_reductions_return_types(dropna, data, all_numeric_reductions):
op = all_numeric_reductions
s = pd.Series(data)
if dropna:
s = s.dropna()
if op == "sum":
assert isinstance(getattr(s, op)(), np.int_)
elif op == "prod":
assert isinstance(getattr(s, op)(), np.int_)
elif op in ("min", "max"):
assert isinstance(getattr(s, op)(), np.bool_)
else:
# "mean", "std", "var", "median", "kurt", "skew"
assert isinstance(getattr(s, op)(), np.float64)
| [
"[email protected]"
] | |
1029307e17ff37e33f2b89833d70f7879f9f5e45 | 60dbecafad0eb3baf67265ebda5c6230dfc99088 | /old_plotter_files/CLUSTER_PLOT_NEW.py | d4f5332e0882a5300eb2ffad687c5550409349dd | [] | no_license | shanto268/NaSch_CA_Traffic_Flow_Analysis_Software | fbddadd70a70458b96a9a12c5a1c731d29266e34 | d9065df9b8288790aa688bf5bf4c30750ba2889c | refs/heads/master | 2020-09-20T01:47:41.301182 | 2020-05-17T03:28:20 | 2020-05-17T03:28:20 | 224,346,779 | 14 | 1 | null | null | null | null | UTF-8 | Python | false | false | 3,044 | py | # -*- coding: utf-8 -*-
"""
Created on Wed Sep 4 15:13:35 2019
@author: Owner
"""
import matplotlib.pyplot as plt
import csv
def plot1(fname):
fn = fname
nn = fn.split('.')
fr = 'processed_' + str(fname) + '.txt'
#dnewdata = "0.0, 0.0, 0.0, 0.0, 0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, "
dnewdata = "dnew line"
with open(fn, 'r') as f:
lines = f.read().split('\n')
#to delete line use "del lines[4]"
#to replace line:
for i in range(0,len(lines)):
        if (i % 100) == 0 or ((i % 100) < 19 and i > 0): # or (i % 4) == 1 ; parentheses make the existing and/or precedence explicit
lines[i] = dnewdata
with open(fr,'w') as f:
f.write('\n'.join(lines))
with open(fr, "r") as f:
lines = f.readlines()
with open(fr, "w") as f:
for line in lines:
if line.strip("\n") != "dnew line":
f.write(line)
with open(fn, "r+") as f: #fr change
a = f.read()
# with open(fr, "w+") as f: #fr change
# f.write("0.0, 0.0, 0.0, 0.0, 0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, \n" + a)
density = []
flow = []
updates = []
densityrv = []
flowrv = []
densityav = []
flowav = []
clnum = []
avgclsize = []
# clsize = []
with open(fr,'r') as csvfile:
plots = csv.reader(csvfile, delimiter=',')
for row in plots:
density.append(float(row[0]))
flow.append(float(row[1]))
updates.append(float(row[2]))
densityrv.append(float(row[3]))
flowrv.append(float(row[4]))
densityav.append(float(row[5]))
flowav.append(float(row[6]))
clnum.append(int(row[13]))
avgclsize.append(float(row[14]))
# clsize.append(float(row[11]))
    plt.plot(updates, clnum, ':', linewidth=1)
plt.xlabel("Timesteps")
plt.ylabel("Number of Clusters")
plt.title("Number of Clusters over time")
plt.savefig("final/new/cluster_num_"+str(nn[0])+".pdf")
plt.show()
    plt.plot(updates, avgclsize, linewidth=1)
plt.xlabel("Timesteps")
plt.ylabel("Average Size of Clusters")
plt.title("Average Size of Clusters over time")
plt.savefig("final/new/cluster_size_"+str(nn[0])+".pdf")
plt.show()
#r1m1 = plot1('type_aware_crit_density.txt')
#r1m2 = plot1('type_unaware_crit_density.txt')
#r2m1 = plot1('control_crit_density.txt')
r = plot1('type_unaware_low_density_same_vf.txt')
#show histograms:
#cluster numbers at each time period
#average size at each time period
#size
#combined graphs:
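#
# A sketch of the histogram idea above (variable names reuse the columns read
# inside plot1; the output filename is an assumption):
# plt.hist(clnum, bins=30)
# plt.xlabel("Number of Clusters")
# plt.ylabel("Frequency")
# plt.savefig("final/new/cluster_num_hist.pdf")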
| [
"[email protected]"
] | |
1f36a514482352b15f26407d72ee9ba6027dac94 | 781e2692049e87a4256320c76e82a19be257a05d | /all_data/exercism_data/python/anagram/09ab13ac53e54a21819b45d33c272459.py | 30c00dfe1a41d1c24e13263850e15a2fbd439762 | [] | no_license | itsolutionscorp/AutoStyle-Clustering | 54bde86fe6dbad35b568b38cfcb14c5ffaab51b0 | be0e2f635a7558f56c61bc0b36c6146b01d1e6e6 | refs/heads/master | 2020-12-11T07:27:19.291038 | 2016-03-16T03:18:00 | 2016-03-16T03:18:42 | 59,454,921 | 4 | 0 | null | 2016-05-23T05:40:56 | 2016-05-23T05:40:56 | null | UTF-8 | Python | false | false | 394 | py | def detect_anagrams(src, words):
ret = []
t1 = sort_string(src)
for wd in words:
t2 = sort_string(wd)
if t1 == t2 and src.lower() != wd.lower():
ret.append(wd)
return ret
def sort_string(st):
    # Behavior-preserving rewrite: lowercase, sort the characters, and rejoin.
    return ''.join(sorted(st.lower()))
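
# Example: detect_anagrams('allergy', ['gallery', 'ballerina', 'regally', 'largely'])
# returns ['gallery', 'regally', 'largely']; 'allergy' itself would be excluded.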
| [
"[email protected]"
] | |
92b8cb7b47b9ee887f6ae8c34067590019668fc1 | 77ec9edf40b34b48477a627d149b6c2054b98a93 | /abc_188_d.py | 07fb56df0536dddc7d23de1cb381a788cdc223e9 | [] | no_license | junkhp/atcorder | fa4eeb204e3a4ac713001ab89c205039703abc88 | 028ddf7a39534d5907232c4576a03af79feb6073 | refs/heads/main | 2023-04-11T02:15:10.088883 | 2021-04-22T07:06:06 | 2021-04-22T07:06:06 | 313,284,147 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 1,127 | py | # -*- coding: utf-8 -*-
def main():
n, C = map(int, input().split())
abc_list = [list(map(int, input().split())) for i in range(n)]
keikaku_dict = {}
day_set = set()
for i, abc in enumerate(abc_list):
        a = abc[0]  # first day the service is active
        b = abc[1] + 1  # first day after the service ends (its cost drops out here)
if a in day_set:
keikaku_dict[a].append([i, True])
else:
keikaku_dict[a] = [[i, True]]
day_set.add(a)
if b in day_set:
keikaku_dict[b].append([i, False])
else:
keikaku_dict[b] = [[i, False]]
day_set.add(b)
day_set = sorted(day_set)
day_cost = 0
day_cost_dict = {}
for day in day_set:
for xxx in keikaku_dict[day]:
service = xxx[0]
is_in = xxx[1]
if is_in:
day_cost += abc_list[service][2]
else:
day_cost -= abc_list[service][2]
day_cost_dict[day] = day_cost
ans = 0
for i in range(len(day_set) - 1):
ans += min(C, day_cost_dict[day_set[i]]) * (day_set[i + 1] - day_set[i])
print(ans)
if __name__ == '__main__':
main()
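
# Worked trace (the input is an assumption shaped like a typical test case):
# n=2, C=6 with services (a,b,c) = (1,2,4) and (2,4,4) gives event days
# {1,2,3,5}; per-day costs 4, 8, 4 capped at C=6 yield 4*1 + 6*1 + 4*2 = 18.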
| [
"[email protected]"
] | |
0827574c78dbf6f8411927ec8c2f368c165aded5 | 135254b8c00935efd0efd33c708ce69470e23741 | /Hard/335. Self Crossing.py | 84c1eaf99e013d78a368297707b298a1dacf7618 | [] | no_license | MinecraftDawn/LeetCode | 4974e6f96612f01e4774ecd5c30bc42dfff79467 | 0404bcce27ff363430e6ab71dbc27a69055fd261 | refs/heads/master | 2021-06-19T05:50:08.000396 | 2021-06-14T05:57:09 | 2021-06-14T05:57:09 | 188,446,485 | 1 | 0 | null | null | null | null | UTF-8 | Python | false | false | 689 | py | # Reference: https://leetcode.com/problems/self-crossing/discuss/386087/Python-simple-case-easy-to-read
class Solution:
    def isSelfCrossing(self, edge: list) -> bool:
        """Return True if the path traced by the move distances in `edge` crosses itself.

        Example: [2, 1, 1, 2] crosses (case 1 below), [1, 2, 3, 4] spirals outward (False).
        """
        if len(edge) < 4:
            return False
        for i in range(3, len(edge)):
            # Case 1: edge i crosses edge i-3.
            if edge[i-1] <= edge[i-3] and edge[i] >= edge[i-2]:
                return True
            # Case 2: edge i overlaps edge i-4 on the same line.
            if i >= 4 and edge[i-1] == edge[i-3] and edge[i] + edge[i-4] >= edge[i-2]:
                return True
            # Case 3: edge i crosses edge i-5.
            if i >= 5 and edge[i-1] <= edge[i-3] and edge[i-3] <= edge[i-1] + edge[i-5] and edge[i] + edge[i-4] >= edge[i-2] and edge[i-4] <= edge[i-2]:
                return True
return False | [
"[email protected]"
] | |
72603bc959fdf19473817fce9d1ad0b71955d359 | 650076fb94a086e15bdaa5bd2f51ce72df42dce4 | /test/functional/test_framework/messages.py | 49710adae056fb25980006084a7696e53a066492 | [
"MIT"
] | permissive | c0de0x/ErosCore | 548075fe85c46e2bb3946f94361689dbad692da8 | a71767f7ee7105dc83973aac8ac60903b69459c9 | refs/heads/master | 2022-11-25T14:35:59.091923 | 2020-07-30T14:38:39 | 2020-07-30T14:38:39 | null | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 39,420 | py | #!/usr/bin/env python3
# Copyright (c) 2010 ArtForz -- public domain half-a-node
# Copyright (c) 2012 Jeff Garzik
# Copyright (c) 2010-2017 The Bitcoin Core developers
# Distributed under the MIT software license, see the accompanying
# file COPYING or http://www.opensource.org/licenses/mit-license.php.
"""Bitcoin test framework primitive and message strcutures
CBlock, CTransaction, CBlockHeader, CTxIn, CTxOut, etc....:
data structures that should map to corresponding structures in
bitcoin/primitives
msg_block, msg_tx, msg_headers, etc.:
data structures that represent network messages
ser_*, deser_*: functions that handle serialization/deserialization."""
from codecs import encode
import copy
import hashlib
from io import BytesIO
import random
import socket
import struct
import time
from test_framework.siphash import siphash256
from test_framework.util import hex_str_to_bytes, bytes_to_hex_str
MIN_VERSION_SUPPORTED = 60001
MY_VERSION = 70918
MY_SUBVERSION = b"/python-mininode-tester:0.0.3/"
MY_RELAY = 1 # from version 70001 onwards, fRelay should be appended to version messages (BIP37)
MAX_INV_SZ = 50000
MAX_BLOCK_BASE_SIZE = 1000000
COIN = 100000000 # 1 btc in satoshis
NODE_NETWORK = (1 << 0)
# NODE_GETUTXO = (1 << 1)
NODE_BLOOM = (1 << 2)
# Serialization/deserialization tools
def sha256(s):
return hashlib.new('sha256', s).digest()
def ripemd160(s):
return hashlib.new('ripemd160', s).digest()
def hash256(s):
return sha256(sha256(s))
def ser_compact_size(l):
r = b""
if l < 253:
r = struct.pack("B", l)
elif l < 0x10000:
r = struct.pack("<BH", 253, l)
elif l < 0x100000000:
r = struct.pack("<BI", 254, l)
else:
r = struct.pack("<BQ", 255, l)
return r
def deser_compact_size(f):
nit = struct.unpack("<B", f.read(1))[0]
if nit == 253:
nit = struct.unpack("<H", f.read(2))[0]
elif nit == 254:
nit = struct.unpack("<I", f.read(4))[0]
elif nit == 255:
nit = struct.unpack("<Q", f.read(8))[0]
return nit
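
# For illustration, the CompactSize encoding spends 1, 3, 5 or 9 bytes
# depending on magnitude:
#   ser_compact_size(252)     == b'\xfc'
#   ser_compact_size(253)     == b'\xfd\xfd\x00'
#   ser_compact_size(0x10000) == b'\xfe\x00\x00\x01\x00'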
def deser_string(f):
nit = deser_compact_size(f)
return f.read(nit)
def ser_string(s):
return ser_compact_size(len(s)) + s
def deser_uint256(f):
r = 0
for i in range(8):
t = struct.unpack("<I", f.read(4))[0]
r += t << (i * 32)
return r
def ser_uint256(u):
rs = b""
for i in range(8):
rs += struct.pack("<I", u & 0xFFFFFFFF)
u >>= 32
return rs
def ser_uint64(u):
rs = b""
for i in range(2):
rs += struct.pack("<I", u & 0xFFFFFFFF)
u >>= 32
return rs
def uint256_from_str(s):
r = 0
t = struct.unpack("<IIIIIIII", s[:32])
for i in range(8):
r += t[i] << (i * 32)
return r
def uint256_from_compact(c):
nbytes = (c >> 24) & 0xFF
v = (c & 0xFFFFFF) << (8 * (nbytes - 3))
return v
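
# For illustration: the difficulty-1 compact target 0x1d00ffff expands to
# 0xffff << 208, i.e. 0x00000000ffff0000...0000.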
def deser_vector(f, c):
nit = deser_compact_size(f)
r = []
for i in range(nit):
t = c()
t.deserialize(f)
r.append(t)
return r
# ser_function_name: Allow for an alternate serialization function on the
# entries in the vector (we use this for serializing the vector of transactions
# for a witness block).
def ser_vector(l, ser_function_name=None):
r = ser_compact_size(len(l))
for i in l:
if ser_function_name:
r += getattr(i, ser_function_name)()
else:
r += i.serialize()
return r
def deser_uint256_vector(f):
nit = deser_compact_size(f)
r = []
for i in range(nit):
t = deser_uint256(f)
r.append(t)
return r
def ser_uint256_vector(l):
r = ser_compact_size(len(l))
for i in l:
r += ser_uint256(i)
return r
def deser_string_vector(f):
nit = deser_compact_size(f)
r = []
for i in range(nit):
t = deser_string(f)
r.append(t)
return r
def ser_string_vector(l):
r = ser_compact_size(len(l))
for sv in l:
r += ser_string(sv)
return r
# Deserialize from a hex string representation (eg from RPC)
def FromHex(obj, hex_string):
obj.deserialize(BytesIO(hex_str_to_bytes(hex_string)))
return obj
# Convert a binary-serializable object to hex (eg for submission via RPC)
def ToHex(obj):
return bytes_to_hex_str(obj.serialize())
# Objects that map to bitcoind objects, which can be serialized/deserialized
class CAddress():
def __init__(self):
self.nServices = 1
self.pchReserved = b"\x00" * 10 + b"\xff" * 2
self.ip = "0.0.0.0"
self.port = 0
def deserialize(self, f):
self.nServices = struct.unpack("<Q", f.read(8))[0]
self.pchReserved = f.read(12)
self.ip = socket.inet_ntoa(f.read(4))
self.port = struct.unpack(">H", f.read(2))[0]
def serialize(self):
r = b""
r += struct.pack("<Q", self.nServices)
r += self.pchReserved
r += socket.inet_aton(self.ip)
r += struct.pack(">H", self.port)
return r
def __repr__(self):
return "CAddress(nServices=%i ip=%s port=%i)" % (self.nServices,
self.ip, self.port)
class CInv():
typemap = {
0: "MSG_ERROR",
1: "MSG_TX",
2: "MSG_BLOCK",
3: "MSG_FILTERED_BLOCK",
4: "MSG_TXLOCK_REQUEST",
5: "MSG_TXLOCK_VOTE",
6: "MSG_SPORK",
7: "MSG_MASTERNODE_WINNER",
8: "MSG_MASTERNODE_SCANNING_ERROR",
9: "MSG_BUDGET_VOTE",
10: "MSG_BUDGET_PROPOSAL",
11: "MSG_BUDGET_FINALIZED",
12: "MSG_BUDGET_FINALIZED_VOTE",
13: "MSG_MASTERNODE_QUORUM",
14: "MSG_MASTERNODE_QUORUM",
15: "MSG_MASTERNODE_ANNOUNCE",
16: "MSG_MASTERNODE_PING",
17: "MSG_DSTX",
18: "MSG_PUBCOINS",
19: "MSG_GENWIT",
20: "MSG_ACC_VALUE"
}
def __init__(self, t=0, h=0):
self.type = t
self.hash = h
def deserialize(self, f):
self.type = struct.unpack("<i", f.read(4))[0]
self.hash = deser_uint256(f)
def serialize(self):
r = b""
r += struct.pack("<i", self.type)
r += ser_uint256(self.hash)
return r
def __repr__(self):
return "CInv(type=%s hash=%064x)" \
% (self.typemap[self.type], self.hash)
class CBlockLocator():
def __init__(self):
self.nVersion = MY_VERSION
self.vHave = []
def deserialize(self, f):
self.nVersion = struct.unpack("<i", f.read(4))[0]
self.vHave = deser_uint256_vector(f)
def serialize(self):
r = b""
r += struct.pack("<i", self.nVersion)
r += ser_uint256_vector(self.vHave)
return r
def __repr__(self):
return "CBlockLocator(nVersion=%i vHave=%s)" \
% (self.nVersion, repr(self.vHave))
class COutPoint():
def __init__(self, hash=0, n=0):
self.hash = hash
self.n = n
def deserialize(self, f):
self.hash = deser_uint256(f)
self.n = struct.unpack("<I", f.read(4))[0]
def serialize(self):
r = b""
r += ser_uint256(self.hash)
r += struct.pack("<I", self.n)
return r
def __repr__(self):
return "COutPoint(hash=%064x n=%i)" % (self.hash, self.n)
class CTxIn():
def __init__(self, outpoint=None, scriptSig=b"", nSequence=0):
if outpoint is None:
self.prevout = COutPoint()
else:
self.prevout = outpoint
self.scriptSig = scriptSig
self.nSequence = nSequence
def deserialize(self, f):
self.prevout = COutPoint()
self.prevout.deserialize(f)
self.scriptSig = deser_string(f)
self.nSequence = struct.unpack("<I", f.read(4))[0]
def serialize(self):
r = b""
r += self.prevout.serialize()
r += ser_string(self.scriptSig)
r += struct.pack("<I", self.nSequence)
return r
def __repr__(self):
return "CTxIn(prevout=%s scriptSig=%s nSequence=%i)" \
% (repr(self.prevout), bytes_to_hex_str(self.scriptSig),
self.nSequence)
class CTxOut():
def __init__(self, nValue=0, scriptPubKey=b""):
self.nValue = nValue
self.scriptPubKey = scriptPubKey
def deserialize(self, f):
self.nValue = struct.unpack("<q", f.read(8))[0]
self.scriptPubKey = deser_string(f)
def serialize(self):
r = b""
r += struct.pack("<q", self.nValue)
r += ser_string(self.scriptPubKey)
return r
def __repr__(self):
return "CTxOut(nValue=%i.%08i scriptPubKey=%s)" \
% (self.nValue // COIN, self.nValue % COIN,
bytes_to_hex_str(self.scriptPubKey))
class CTransaction():
def __init__(self, tx=None):
if tx is None:
self.nVersion = 1
self.vin = []
self.vout = []
self.nLockTime = 0
self.sha256 = None
self.hash = None
else:
self.nVersion = tx.nVersion
self.vin = copy.deepcopy(tx.vin)
self.vout = copy.deepcopy(tx.vout)
self.nLockTime = tx.nLockTime
self.sha256 = tx.sha256
self.hash = tx.hash
def deserialize(self, f):
self.nVersion = struct.unpack("<i", f.read(4))[0]
self.vin = deser_vector(f, CTxIn)
flags = 0
if len(self.vin) == 0:
flags = struct.unpack("<B", f.read(1))[0]
# Not sure why flags can't be zero, but this
# matches the implementation in bitcoind
if (flags != 0):
self.vin = deser_vector(f, CTxIn)
self.vout = deser_vector(f, CTxOut)
else:
self.vout = deser_vector(f, CTxOut)
self.nLockTime = struct.unpack("<I", f.read(4))[0]
self.sha256 = None
self.hash = None
def serialize_without_witness(self):
r = b""
r += struct.pack("<i", self.nVersion)
r += ser_vector(self.vin)
r += ser_vector(self.vout)
r += struct.pack("<I", self.nLockTime)
return r
    # Regular serialization here is the witness-less form; this port does not
    # carry witness data.
def serialize(self):
return self.serialize_without_witness()
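
    # This port has no segwit, but several callers (msg_witness_tx,
    # PrefilledTransaction.serialize, CBlock.serialize) still invoke
    # serialize_with_witness(); alias it to the witness-less form so those
    # code paths don't raise AttributeError.
    def serialize_with_witness(self):
        return self.serialize_without_witness()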
# Recalculate the txid (transaction hash without witness)
def rehash(self):
self.sha256 = None
self.calc_sha256()
# We will only cache the serialization without witness in
# self.sha256 and self.hash -- those are expected to be the txid.
def calc_sha256(self, with_witness=False):
if self.sha256 is None:
self.sha256 = uint256_from_str(hash256(self.serialize_without_witness()))
self.hash = encode(hash256(self.serialize_without_witness())[::-1], 'hex_codec').decode('ascii')
def is_valid(self):
self.calc_sha256()
for tout in self.vout:
if tout.nValue < 0 or tout.nValue > 21000000 * COIN:
return False
return True
def __repr__(self):
return "CTransaction(nVersion=%i vin=%s vout=%s nLockTime=%i)" \
% (self.nVersion, repr(self.vin), repr(self.vout), self.nLockTime)
class CBlockHeader():
def __init__(self, header=None):
if header is None:
self.set_null()
else:
self.nVersion = header.nVersion
self.hashPrevBlock = header.hashPrevBlock
self.hashMerkleRoot = header.hashMerkleRoot
self.nTime = header.nTime
self.nBits = header.nBits
self.nNonce = header.nNonce
self.nAccumulatorCheckpoint = header.nAccumulatorCheckpoint
self.sha256 = header.sha256
self.hash = header.hash
self.calc_sha256()
def set_null(self):
self.nVersion = 4
self.hashPrevBlock = 0
self.hashMerkleRoot = 0
self.nTime = 0
self.nBits = 0
self.nNonce = 0
self.nAccumulatorCheckpoint = 0
self.sha256 = None
self.hash = None
def deserialize(self, f):
self.nVersion = struct.unpack("<i", f.read(4))[0]
self.hashPrevBlock = deser_uint256(f)
self.hashMerkleRoot = deser_uint256(f)
self.nTime = struct.unpack("<I", f.read(4))[0]
self.nBits = struct.unpack("<I", f.read(4))[0]
self.nNonce = struct.unpack("<I", f.read(4))[0]
self.nAccumulatorCheckpoint = deser_uint256(f)
self.sha256 = None
self.hash = None
def serialize(self):
r = b""
r += struct.pack("<i", self.nVersion)
r += ser_uint256(self.hashPrevBlock)
r += ser_uint256(self.hashMerkleRoot)
r += struct.pack("<I", self.nTime)
r += struct.pack("<I", self.nBits)
r += struct.pack("<I", self.nNonce)
r += ser_uint256(self.nAccumulatorCheckpoint)
return r
def calc_sha256(self):
if self.sha256 is None:
r = b""
r += struct.pack("<i", self.nVersion)
r += ser_uint256(self.hashPrevBlock)
r += ser_uint256(self.hashMerkleRoot)
r += struct.pack("<I", self.nTime)
r += struct.pack("<I", self.nBits)
r += struct.pack("<I", self.nNonce)
r += ser_uint256(self.nAccumulatorCheckpoint)
self.sha256 = uint256_from_str(hash256(r))
self.hash = encode(hash256(r)[::-1], 'hex_codec').decode('ascii')
def rehash(self):
self.sha256 = None
self.calc_sha256()
return self.sha256
# ERS Uniqueness
def get_uniqueness(self, prevout):
r = b""
r += struct.pack("<I", prevout.n)
r += ser_uint256(prevout.hash)
return r
def solve_stake(self, prevouts, isModifierV2=False):
target0 = uint256_from_compact(self.nBits)
loop = True
while loop:
for prevout in prevouts:
nvalue, txBlockTime, hashStake = prevouts[prevout]
target = int(target0 * nvalue / 100) % 2**256
data = b""
if isModifierV2:
data += ser_uint256(0)
else:
data += ser_uint64(0)
#data += ser_uint64(stakeModifier)
data += struct.pack("<I", txBlockTime)
# prevout for zPoS is serial hashes hex strings
if isinstance(prevout, COutPoint):
data += self.get_uniqueness(prevout)
else:
data += ser_uint256(uint256_from_str(bytes.fromhex(hashStake)[::-1]))
data += struct.pack("<I", self.nTime)
posHash = uint256_from_str(hash256(data))
if posHash <= target:
self.prevoutStake = prevout
loop = False
break
if loop:
self.nTime += 1
return True
def __repr__(self):
return "CBlockHeader(nVersion=%i hashPrevBlock=%064x hashMerkleRoot=%064x nTime=%s nBits=%08x nNonce=%08x)" \
% (self.nVersion, self.hashPrevBlock, self.hashMerkleRoot,
time.ctime(self.nTime), self.nBits, self.nNonce)
class CBlock(CBlockHeader):
def __init__(self, header=None):
super(CBlock, self).__init__(header)
self.vtx = []
def deserialize(self, f):
super(CBlock, self).deserialize(f)
self.vtx = deser_vector(f, CTransaction)
def serialize(self, with_witness=False):
r = b""
r += super(CBlock, self).serialize()
if with_witness:
r += ser_vector(self.vtx, "serialize_with_witness")
else:
r += ser_vector(self.vtx, "serialize_without_witness")
if hasattr(self, 'vchBlockSig'):
r += ser_string(self.vchBlockSig)
return r
# Calculate the merkle root given a vector of transaction hashes
@classmethod
def get_merkle_root(cls, hashes):
while len(hashes) > 1:
newhashes = []
for i in range(0, len(hashes), 2):
i2 = min(i+1, len(hashes)-1)
newhashes.append(hash256(hashes[i] + hashes[i2]))
hashes = newhashes
return uint256_from_str(hashes[0])
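
    # For example: with a single transaction the merkle root is just that
    # transaction's hash; with two, it is hash256(h0 + h1).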
def calc_merkle_root(self):
hashes = []
for tx in self.vtx:
tx.calc_sha256()
hashes.append(ser_uint256(tx.sha256))
return self.get_merkle_root(hashes)
def calc_witness_merkle_root(self):
# For witness root purposes, the hash of the
# coinbase, with witness, is defined to be 0...0
hashes = [ser_uint256(0)]
for tx in self.vtx[1:]:
# Calculate the hashes with witness data
hashes.append(ser_uint256(tx.calc_sha256(True)))
return self.get_merkle_root(hashes)
def is_valid(self):
self.calc_sha256()
target = uint256_from_compact(self.nBits)
if self.sha256 > target:
return False
for tx in self.vtx:
if not tx.is_valid():
return False
if self.calc_merkle_root() != self.hashMerkleRoot:
return False
return True
def solve(self):
self.rehash()
target = uint256_from_compact(self.nBits)
while self.sha256 > target:
self.nNonce += 1
self.rehash()
def sign_block(self, key, low_s=True):
data = b""
data += struct.pack("<i", self.nVersion)
data += ser_uint256(self.hashPrevBlock)
data += ser_uint256(self.hashMerkleRoot)
data += struct.pack("<I", self.nTime)
data += struct.pack("<I", self.nBits)
data += struct.pack("<I", self.nNonce)
data += ser_uint256(self.nAccumulatorCheckpoint)
sha256NoSig = hash256(data)
self.vchBlockSig = key.sign(sha256NoSig, low_s=low_s)
def __repr__(self):
return "CBlock(nVersion=%i hashPrevBlock=%064x hashMerkleRoot=%064x nTime=%s nBits=%08x nNonce=%08x vtx=%s)" \
% (self.nVersion, self.hashPrevBlock, self.hashMerkleRoot,
time.ctime(self.nTime), self.nBits, self.nNonce, repr(self.vtx))
class PrefilledTransaction():
def __init__(self, index=0, tx = None):
self.index = index
self.tx = tx
def deserialize(self, f):
self.index = deser_compact_size(f)
self.tx = CTransaction()
self.tx.deserialize(f)
def serialize(self, with_witness=True):
r = b""
r += ser_compact_size(self.index)
if with_witness:
r += self.tx.serialize_with_witness()
else:
r += self.tx.serialize_without_witness()
return r
def serialize_without_witness(self):
return self.serialize(with_witness=False)
def serialize_with_witness(self):
return self.serialize(with_witness=True)
def __repr__(self):
return "PrefilledTransaction(index=%d, tx=%s)" % (self.index, repr(self.tx))
# This is what we send on the wire, in a cmpctblock message.
class P2PHeaderAndShortIDs():
def __init__(self):
self.header = CBlockHeader()
self.nonce = 0
self.shortids_length = 0
self.shortids = []
self.prefilled_txn_length = 0
self.prefilled_txn = []
def deserialize(self, f):
self.header.deserialize(f)
self.nonce = struct.unpack("<Q", f.read(8))[0]
self.shortids_length = deser_compact_size(f)
for i in range(self.shortids_length):
# shortids are defined to be 6 bytes in the spec, so append
# two zero bytes and read it in as an 8-byte number
self.shortids.append(struct.unpack("<Q", f.read(6) + b'\x00\x00')[0])
self.prefilled_txn = deser_vector(f, PrefilledTransaction)
self.prefilled_txn_length = len(self.prefilled_txn)
# When using version 2 compact blocks, we must serialize with_witness.
def serialize(self, with_witness=False):
r = b""
r += self.header.serialize()
r += struct.pack("<Q", self.nonce)
r += ser_compact_size(self.shortids_length)
for x in self.shortids:
# We only want the first 6 bytes
r += struct.pack("<Q", x)[0:6]
if with_witness:
r += ser_vector(self.prefilled_txn, "serialize_with_witness")
else:
r += ser_vector(self.prefilled_txn, "serialize_without_witness")
return r
def __repr__(self):
return "P2PHeaderAndShortIDs(header=%s, nonce=%d, shortids_length=%d, shortids=%s, prefilled_txn_length=%d, prefilledtxn=%s" % (repr(self.header), self.nonce, self.shortids_length, repr(self.shortids), self.prefilled_txn_length, repr(self.prefilled_txn))
# P2P version of the above that will use witness serialization (for compact
# block version 2)
class P2PHeaderAndShortWitnessIDs(P2PHeaderAndShortIDs):
def serialize(self):
return super(P2PHeaderAndShortWitnessIDs, self).serialize(with_witness=True)
# Calculate the BIP 152-compact blocks shortid for a given transaction hash
def calculate_shortid(k0, k1, tx_hash):
expected_shortid = siphash256(k0, k1, tx_hash)
expected_shortid &= 0x0000ffffffffffff
return expected_shortid
# This version gets rid of the array lengths, and reinterprets the differential
# encoding into indices that can be used for lookup.
class HeaderAndShortIDs():
def __init__(self, p2pheaders_and_shortids = None):
self.header = CBlockHeader()
self.nonce = 0
self.shortids = []
self.prefilled_txn = []
self.use_witness = False
if p2pheaders_and_shortids != None:
self.header = p2pheaders_and_shortids.header
self.nonce = p2pheaders_and_shortids.nonce
self.shortids = p2pheaders_and_shortids.shortids
last_index = -1
for x in p2pheaders_and_shortids.prefilled_txn:
self.prefilled_txn.append(PrefilledTransaction(x.index + last_index + 1, x.tx))
last_index = self.prefilled_txn[-1].index
def to_p2p(self):
if self.use_witness:
ret = P2PHeaderAndShortWitnessIDs()
else:
ret = P2PHeaderAndShortIDs()
ret.header = self.header
ret.nonce = self.nonce
ret.shortids_length = len(self.shortids)
ret.shortids = self.shortids
ret.prefilled_txn_length = len(self.prefilled_txn)
ret.prefilled_txn = []
last_index = -1
for x in self.prefilled_txn:
ret.prefilled_txn.append(PrefilledTransaction(x.index - last_index - 1, x.tx))
last_index = x.index
return ret
def get_siphash_keys(self):
header_nonce = self.header.serialize()
header_nonce += struct.pack("<Q", self.nonce)
hash_header_nonce_as_str = sha256(header_nonce)
key0 = struct.unpack("<Q", hash_header_nonce_as_str[0:8])[0]
key1 = struct.unpack("<Q", hash_header_nonce_as_str[8:16])[0]
return [ key0, key1 ]
# Version 2 compact blocks use wtxid in shortids (rather than txid)
def initialize_from_block(self, block, nonce=0, prefill_list = [0], use_witness = False):
self.header = CBlockHeader(block)
self.nonce = nonce
self.prefilled_txn = [ PrefilledTransaction(i, block.vtx[i]) for i in prefill_list ]
self.shortids = []
self.use_witness = use_witness
[k0, k1] = self.get_siphash_keys()
for i in range(len(block.vtx)):
if i not in prefill_list:
tx_hash = block.vtx[i].sha256
if use_witness:
tx_hash = block.vtx[i].calc_sha256(with_witness=True)
self.shortids.append(calculate_shortid(k0, k1, tx_hash))
def __repr__(self):
return "HeaderAndShortIDs(header=%s, nonce=%d, shortids=%s, prefilledtxn=%s" % (repr(self.header), self.nonce, repr(self.shortids), repr(self.prefilled_txn))
class BlockTransactionsRequest():
def __init__(self, blockhash=0, indexes = None):
self.blockhash = blockhash
self.indexes = indexes if indexes != None else []
def deserialize(self, f):
self.blockhash = deser_uint256(f)
indexes_length = deser_compact_size(f)
for i in range(indexes_length):
self.indexes.append(deser_compact_size(f))
def serialize(self):
r = b""
r += ser_uint256(self.blockhash)
r += ser_compact_size(len(self.indexes))
for x in self.indexes:
r += ser_compact_size(x)
return r
# helper to set the differentially encoded indexes from absolute ones
def from_absolute(self, absolute_indexes):
self.indexes = []
last_index = -1
for x in absolute_indexes:
self.indexes.append(x-last_index-1)
last_index = x
def to_absolute(self):
absolute_indexes = []
last_index = -1
for x in self.indexes:
absolute_indexes.append(x+last_index+1)
last_index = absolute_indexes[-1]
return absolute_indexes
def __repr__(self):
return "BlockTransactionsRequest(hash=%064x indexes=%s)" % (self.blockhash, repr(self.indexes))
class BlockTransactions():
def __init__(self, blockhash=0, transactions = None):
self.blockhash = blockhash
self.transactions = transactions if transactions != None else []
def deserialize(self, f):
self.blockhash = deser_uint256(f)
self.transactions = deser_vector(f, CTransaction)
def serialize(self, with_witness=True):
r = b""
r += ser_uint256(self.blockhash)
if with_witness:
r += ser_vector(self.transactions, "serialize_with_witness")
else:
r += ser_vector(self.transactions, "serialize_without_witness")
return r
def __repr__(self):
return "BlockTransactions(hash=%064x transactions=%s)" % (self.blockhash, repr(self.transactions))
class CPartialMerkleTree():
def __init__(self):
self.nTransactions = 0
self.vHash = []
self.vBits = []
self.fBad = False
def deserialize(self, f):
self.nTransactions = struct.unpack("<i", f.read(4))[0]
self.vHash = deser_uint256_vector(f)
vBytes = deser_string(f)
self.vBits = []
for i in range(len(vBytes) * 8):
self.vBits.append(vBytes[i//8] & (1 << (i % 8)) != 0)
def serialize(self):
r = b""
r += struct.pack("<i", self.nTransactions)
r += ser_uint256_vector(self.vHash)
vBytesArray = bytearray([0x00] * ((len(self.vBits) + 7)//8))
for i in range(len(self.vBits)):
vBytesArray[i // 8] |= self.vBits[i] << (i % 8)
r += ser_string(bytes(vBytesArray))
return r
def __repr__(self):
return "CPartialMerkleTree(nTransactions=%d, vHash=%s, vBits=%s)" % (self.nTransactions, repr(self.vHash), repr(self.vBits))
class CMerkleBlock():
def __init__(self):
self.header = CBlockHeader()
self.txn = CPartialMerkleTree()
def deserialize(self, f):
self.header.deserialize(f)
self.txn.deserialize(f)
def serialize(self):
r = b""
r += self.header.serialize()
r += self.txn.serialize()
return r
def __repr__(self):
return "CMerkleBlock(header=%s, txn=%s)" % (repr(self.header), repr(self.txn))
# Objects that correspond to messages on the wire
class msg_version():
command = b"version"
def __init__(self):
self.nVersion = MY_VERSION
self.nServices = NODE_NETWORK
self.nTime = int(time.time())
self.addrTo = CAddress()
self.addrFrom = CAddress()
self.nNonce = random.getrandbits(64)
self.strSubVer = MY_SUBVERSION
self.nStartingHeight = -1
self.nRelay = MY_RELAY
def deserialize(self, f):
self.nVersion = struct.unpack("<i", f.read(4))[0]
if self.nVersion == 10300:
self.nVersion = 300
self.nServices = struct.unpack("<Q", f.read(8))[0]
self.nTime = struct.unpack("<q", f.read(8))[0]
self.addrTo = CAddress()
self.addrTo.deserialize(f)
if self.nVersion >= 106:
self.addrFrom = CAddress()
self.addrFrom.deserialize(f)
self.nNonce = struct.unpack("<Q", f.read(8))[0]
self.strSubVer = deser_string(f)
else:
self.addrFrom = None
self.nNonce = None
self.strSubVer = None
self.nStartingHeight = None
if self.nVersion >= 209:
self.nStartingHeight = struct.unpack("<i", f.read(4))[0]
else:
self.nStartingHeight = None
if self.nVersion >= 70001:
# Relay field is optional for version 70001 onwards
try:
self.nRelay = struct.unpack("<b", f.read(1))[0]
except:
self.nRelay = 0
else:
self.nRelay = 0
def serialize(self):
r = b""
r += struct.pack("<i", self.nVersion)
r += struct.pack("<Q", self.nServices)
r += struct.pack("<q", self.nTime)
r += self.addrTo.serialize()
r += self.addrFrom.serialize()
r += struct.pack("<Q", self.nNonce)
r += ser_string(self.strSubVer)
r += struct.pack("<i", self.nStartingHeight)
r += struct.pack("<b", self.nRelay)
return r
def __repr__(self):
return 'msg_version(nVersion=%i nServices=%i nTime=%s addrTo=%s addrFrom=%s nNonce=0x%016X strSubVer=%s nStartingHeight=%i nRelay=%i)' \
% (self.nVersion, self.nServices, time.ctime(self.nTime),
repr(self.addrTo), repr(self.addrFrom), self.nNonce,
self.strSubVer, self.nStartingHeight, self.nRelay)
class msg_verack():
command = b"verack"
def __init__(self):
pass
def deserialize(self, f):
pass
def serialize(self):
return b""
def __repr__(self):
return "msg_verack()"
class msg_addr():
command = b"addr"
def __init__(self):
self.addrs = []
def deserialize(self, f):
self.addrs = deser_vector(f, CAddress)
def serialize(self):
return ser_vector(self.addrs)
def __repr__(self):
return "msg_addr(addrs=%s)" % (repr(self.addrs))
class msg_inv():
command = b"inv"
def __init__(self, inv=None):
if inv is None:
self.inv = []
else:
self.inv = inv
def deserialize(self, f):
self.inv = deser_vector(f, CInv)
def serialize(self):
return ser_vector(self.inv)
def __repr__(self):
return "msg_inv(inv=%s)" % (repr(self.inv))
class msg_getdata():
command = b"getdata"
def __init__(self, inv=None):
self.inv = inv if inv != None else []
def deserialize(self, f):
self.inv = deser_vector(f, CInv)
def serialize(self):
return ser_vector(self.inv)
def __repr__(self):
return "msg_getdata(inv=%s)" % (repr(self.inv))
class msg_getblocks():
command = b"getblocks"
def __init__(self):
self.locator = CBlockLocator()
self.hashstop = 0
def deserialize(self, f):
self.locator = CBlockLocator()
self.locator.deserialize(f)
self.hashstop = deser_uint256(f)
def serialize(self):
r = b""
r += self.locator.serialize()
r += ser_uint256(self.hashstop)
return r
def __repr__(self):
return "msg_getblocks(locator=%s hashstop=%064x)" \
% (repr(self.locator), self.hashstop)
class msg_tx():
command = b"tx"
    def __init__(self, tx=None):
        # avoid a mutable default argument: a shared CTransaction instance
        # would be mutated by every deserialize() call
        self.tx = tx if tx is not None else CTransaction()
def deserialize(self, f):
self.tx.deserialize(f)
def serialize(self):
return self.tx.serialize_without_witness()
def __repr__(self):
return "msg_tx(tx=%s)" % (repr(self.tx))
class msg_witness_tx(msg_tx):
def serialize(self):
return self.tx.serialize_with_witness()
class msg_block():
command = b"block"
def __init__(self, block=None):
if block is None:
self.block = CBlock()
else:
self.block = block
def deserialize(self, f):
self.block.deserialize(f)
def serialize(self):
return self.block.serialize(with_witness=False)
def __repr__(self):
return "msg_block(block=%s)" % (repr(self.block))
# for cases where a user needs tighter control over what is sent over the wire
# note that the user must supply the name of the command, and the data
class msg_generic():
def __init__(self, command, data=None):
self.command = command
self.data = data
def serialize(self):
return self.data
def __repr__(self):
return "msg_generic()"
class msg_witness_block(msg_block):
def serialize(self):
r = self.block.serialize(with_witness=True)
return r
class msg_getaddr():
command = b"getaddr"
def __init__(self):
pass
def deserialize(self, f):
pass
def serialize(self):
return b""
def __repr__(self):
return "msg_getaddr()"
class msg_ping():
command = b"ping"
def __init__(self, nonce=0):
self.nonce = nonce
def deserialize(self, f):
self.nonce = struct.unpack("<Q", f.read(8))[0]
def serialize(self):
r = b""
r += struct.pack("<Q", self.nonce)
return r
def __repr__(self):
return "msg_ping(nonce=%08x)" % self.nonce
class msg_pong():
command = b"pong"
def __init__(self, nonce=0):
self.nonce = nonce
def deserialize(self, f):
self.nonce = struct.unpack("<Q", f.read(8))[0]
def serialize(self):
r = b""
r += struct.pack("<Q", self.nonce)
return r
def __repr__(self):
return "msg_pong(nonce=%08x)" % self.nonce
class msg_mempool():
command = b"mempool"
def __init__(self):
pass
def deserialize(self, f):
pass
def serialize(self):
return b""
def __repr__(self):
return "msg_mempool()"
class msg_sendheaders():
command = b"sendheaders"
def __init__(self):
pass
def deserialize(self, f):
pass
def serialize(self):
return b""
def __repr__(self):
return "msg_sendheaders()"
# getheaders message has
# number of entries
# vector of hashes
# hash_stop (hash of last desired block header, 0 to get as many as possible)
class msg_getheaders():
command = b"getheaders"
def __init__(self):
self.locator = CBlockLocator()
self.hashstop = 0
def deserialize(self, f):
self.locator = CBlockLocator()
self.locator.deserialize(f)
self.hashstop = deser_uint256(f)
def serialize(self):
r = b""
r += self.locator.serialize()
r += ser_uint256(self.hashstop)
return r
def __repr__(self):
return "msg_getheaders(locator=%s, stop=%064x)" \
% (repr(self.locator), self.hashstop)
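# Usage sketch (illustrative; `node` stands for any mininode-style connection,
# and the locator field layout is assumed from CBlockLocator):
#
#   req = msg_getheaders()
#   req.locator.vHave = [int(tip_hash, 16)]
#   req.hashstop = 0  # 0 = return as many headers as possible
#   node.send_message(req)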
# headers message has
# <count> <vector of block headers>
class msg_headers():
command = b"headers"
def __init__(self, headers=None):
self.headers = headers if headers is not None else []
def deserialize(self, f):
# comment in bitcoind indicates these should be deserialized as blocks
blocks = deser_vector(f, CBlock)
for x in blocks:
self.headers.append(CBlockHeader(x))
def serialize(self):
blocks = [CBlock(x) for x in self.headers]
return ser_vector(blocks)
def __repr__(self):
return "msg_headers(headers=%s)" % repr(self.headers)
class msg_reject():
command = b"reject"
REJECT_MALFORMED = 1
def __init__(self):
self.message = b""
self.code = 0
self.reason = b""
self.data = 0
def deserialize(self, f):
self.message = deser_string(f)
self.code = struct.unpack("<B", f.read(1))[0]
self.reason = deser_string(f)
if (self.code != self.REJECT_MALFORMED and
(self.message == b"block" or self.message == b"tx")):
self.data = deser_uint256(f)
def serialize(self):
r = ser_string(self.message)
r += struct.pack("<B", self.code)
r += ser_string(self.reason)
if (self.code != self.REJECT_MALFORMED and
(self.message == b"block" or self.message == b"tx")):
r += ser_uint256(self.data)
return r
def __repr__(self):
return "msg_reject: %s %d %s [%064x]" \
% (self.message, self.code, self.reason, self.data)
class msg_feefilter():
command = b"feefilter"
def __init__(self, feerate=0):
self.feerate = feerate
def deserialize(self, f):
self.feerate = struct.unpack("<Q", f.read(8))[0]
def serialize(self):
r = b""
r += struct.pack("<Q", self.feerate)
return r
def __repr__(self):
return "msg_feefilter(feerate=%08x)" % self.feerate
class msg_sendcmpct():
command = b"sendcmpct"
def __init__(self):
self.announce = False
self.version = 1
def deserialize(self, f):
self.announce = struct.unpack("<?", f.read(1))[0]
self.version = struct.unpack("<Q", f.read(8))[0]
def serialize(self):
r = b""
r += struct.pack("<?", self.announce)
r += struct.pack("<Q", self.version)
return r
def __repr__(self):
return "msg_sendcmpct(announce=%s, version=%lu)" % (self.announce, self.version)
class msg_cmpctblock():
command = b"cmpctblock"
def __init__(self, header_and_shortids = None):
self.header_and_shortids = header_and_shortids
def deserialize(self, f):
self.header_and_shortids = P2PHeaderAndShortIDs()
self.header_and_shortids.deserialize(f)
def serialize(self):
r = b""
r += self.header_and_shortids.serialize()
return r
def __repr__(self):
return "msg_cmpctblock(HeaderAndShortIDs=%s)" % repr(self.header_and_shortids)
class msg_getblocktxn():
command = b"getblocktxn"
def __init__(self):
self.block_txn_request = None
def deserialize(self, f):
self.block_txn_request = BlockTransactionsRequest()
self.block_txn_request.deserialize(f)
def serialize(self):
r = b""
r += self.block_txn_request.serialize()
return r
def __repr__(self):
return "msg_getblocktxn(block_txn_request=%s)" % (repr(self.block_txn_request))
class msg_blocktxn():
command = b"blocktxn"
def __init__(self):
self.block_transactions = BlockTransactions()
def deserialize(self, f):
self.block_transactions.deserialize(f)
def serialize(self):
r = b""
r += self.block_transactions.serialize(with_witness=False)
return r
def __repr__(self):
return "msg_blocktxn(block_transactions=%s)" % (repr(self.block_transactions))
class msg_witness_blocktxn(msg_blocktxn):
def serialize(self):
r = b""
r += self.block_transactions.serialize(with_witness=True)
return r
| [
"[email protected]"
] | |
10f7e3c347eb30004151f1556f490b053246fe90 | d42b771f64bc2185a8c0dca0f5bcfa5a2e13c5ed | /users/migrations/0004_auto_20210401_1122.py | c6031beb0bb860b42025e95c43890b7f126098f1 | [] | no_license | bgy1060/Daily_Project | 4b38de59c09f5e3f82211a9860e1f32a8ef46b37 | bcc955bddd9941f2bc54f7577c26c1ddc6b36a48 | refs/heads/main | 2023-05-15T17:26:56.858438 | 2021-06-17T05:59:10 | 2021-06-17T05:59:10 | 353,864,798 | 1 | 1 | null | null | null | null | UTF-8 | Python | false | false | 446 | py | # Generated by Django 3.1.7 on 2021-04-01 02:22
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('users', '0003_customuser_username'),
]
operations = [
migrations.RemoveField(
model_name='customuser',
name='is_staff',
),
migrations.RemoveField(
model_name='customuser',
name='is_superuser',
),
]
| [
"[email protected]"
] | |
6b0e78d50451bb8ccc11aab0a05214bf732c8bdb | 163bbb4e0920dedd5941e3edfb2d8706ba75627d | /Code/CodeRecords/2479/60586/261474.py | a9f57d33b28f100b2c88c064fd239af6152dd305 | [] | no_license | AdamZhouSE/pythonHomework | a25c120b03a158d60aaa9fdc5fb203b1bb377a19 | ffc5606817a666aa6241cfab27364326f5c066ff | refs/heads/master | 2022-11-24T08:05:22.122011 | 2020-07-28T16:21:24 | 2020-07-28T16:21:24 | 259,576,640 | 2 | 1 | null | null | null | null | UTF-8 | Python | false | false | 369 | py | def exam4():
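    # For each of the t test cases, read two strings and collect the characters
    # that appear in exactly one of them; the combined result is printed once,
    # de-duplicated and sorted, after all cases are read.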
t=int(input())
res=[]
for i in range(t):
a=list(input())
b=list(input())
for item in a:
if b.count(item)==0:
res.append(item)
for item in b:
if a.count(item)==0:
res.append(item)
    res = sorted(set(res))
s="".join(res)
print(s)
exam4() | [
"[email protected]"
] | |
210c32bffdfcbe2fea8b31746088ce7e6896645f | 2a39fe8bd203531c9bcdb470d19b80beac665eae | /read_best.py | 380979ab3e9af8575ef0e733e5486bb67e30d4e4 | [] | no_license | davidharvey1986/lenstoolTools | 7bf11af1a38700503a731c6fe7e83fdc92bf58c1 | 85bcf729603d34341f5f41c57c4e233b08055baa | refs/heads/master | 2021-09-08T14:29:52.695461 | 2018-03-10T13:54:50 | 2018-03-10T13:54:50 | 124,657,727 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 7,896 | py | from lensing.lenstool import potentiels
from write_par import *
import csv
import ipdb as pdb
def read_best( filename='best.par',
pot_type='AUTO',
verbose=False,
return_limits=False,
return_image=False):
'''
Read the input into lenstool, the .par file and read it
into a python class
It assumes that the input file is best.par however this
can be changed with the keyword
RETURNS :
2 Rec arrays
- run_mode : contains the reference position of the halo
- potentiel_list : a n halo list of the potentiels found in the recon
For more information on these rec arrays see write_par.py and potentiels.py
BUGS:
    It does require the output best.par to be in a particular order
    (the one output from lenstool)
    UPDATE : CAN NOW READ INPUT PAR FILES (25/05/2016) DRH
'''
best_obj = open( filename, "rb")
run_mode = runmode()
mode = 'Comment'
pot = -1
ind=0
iLine=0
comment_flag = 0
limit_list = []
limit_flag = 0
z_m_limit = []
for iline in best_obj:
line = iline.splitlines()[0].split('\t')
if verbose:
print line
        if (line[0] != 'runmode') and (comment_flag == 0):
continue
#if the length of the line is 1 then it is a key word
if line[0] == 'runmode':
mode='runmode'
comment_flag = 1
if line[0] == 'grille':
mode='grille'
if line[0] == 'image':
mode='image'
ret_image = image()
if (len(line) == 1):
if line[0].strip()[0] == '#':
if verbose:
print 'SKIPPING',line
continue
if line[0].split()[0] == 'potentiel':
mode='potentiel'
pot +=1
pot_name = line[0].split()[1]
if line[0].split()[0] == 'limit':
mode='limit'
limit_list.append(limits.get_limit( potentiel_list[pot]['profil']['int'] ))
limit_flag = 1
if len(line) > 1:
if line[1].strip()[0] == '#':
if verbose:
print 'SKIPPING ',line
continue
if line[1].strip() != 'end':
if (mode == 'runmode') :
option=line[1].split()
keys = run_mode[option[0]].dtype.names
for iKey in xrange(len(keys)):
if iKey < len(option):
run_mode[option[0]][keys[iKey]] = option[iKey]
else:
continue
if (mode == 'grille'):
option=line[1].split()
#If the filename is the best.par then
#the number of lenses is the nlentille
                    #if the input par file then this is the nlens_opt
                    if (option[0] == 'nlentille') or (option[0] == 'nlens'):
                        nlens = int(option[1])
#If I assume pot is NFW use this
#otherwise get the potentiel type automatically
if pot_type == 'NFW':
potentiel_list = [ potentiels.nfw() for i in xrange( nlens ) ]
else:
if pot_type == 'AUTO':
potentiel_list = get_potential_list( best_file = filename )
if (mode == 'image'):
option=line[1].split()
if option[0] == 'sigposArcsec':
option[0] = 'sigpos_arcsec'
if (option[0] == 'z_m_limit'):
dtype = [('name', object), ('im_label', float),
('int', np.int), ('lo', float), ('hi', float), ('res', float)]
iz_m_limit = np.array(('z_m_limit '+str(option[1]),
option[2],option[3],
option[4], option[5],
option[6]), dtype=dtype)
ret_image[ 'z_m_limit '+str(option[1]) ] = \
iz_m_limit
else:
image_keys = ret_image[option[0]].dtype.names
for i in xrange( 1, len(image_keys) ):
ret_image[ option[0] ][image_keys[i]] = option[i]
if (mode == 'potentiel'):
if pot >= nlens:
continue
option = line[1].split()
try:
data_type = potentiel_list[pot][option[0]].dtype.names[1]
potentiel_list[pot][option[0]][data_type] = \
np.array(option[1]).astype(data_type)
                    except Exception:
                        if verbose:
                            print option[0], ' does not exist in potentiel'
ra_halo = run_mode['reference']['ra'] - \
potentiel_list[pot]['x_centre']['float']/3600./\
np.cos(run_mode['reference']['dec']*np.pi/180.)
dec_halo = run_mode['reference']['dec'] + \
potentiel_list[pot]['y_centre']['float']/3600.
potentiel_list[pot]['ra'] = \
np.array(('ra', ra_halo), dtype=[('name', object), ('float', float)])
potentiel_list[pot]['dec'] = \
np.array(('dec', dec_halo), dtype=[('name', object), ('float', float)])
potentiel_list[pot]['identity'] =\
np.array(('identity', pot_name), dtype=[('name', object), ('str', object)])
if (mode == 'limit'):
option = line[1].split()
try:
data_type = limit_list[pot][option[0]].dtype.names[1]
limit_list[pot][option[0]][data_type] = \
np.array(option[1]).astype(data_type)
                    except Exception:
                        if verbose:
                            print option[0], ' does not exist in limit'
else:
mode = 'end'
if return_limits:
if limit_flag == 1:
return run_mode, potentiel_list, limit_list
else:
print 'NO LIMIT SECTION > IS THIS A BEST.PAR?'
return 0
elif return_image:
return ret_image
else:
return run_mode, potentiel_list
def get_potential_list( best_file='best.par', verbose=False ):
'''
Run this script to get the list of potentials
which may not be necessarily NFW
Currently only valid for NFW or PIEMD since these
are the only ones in limit.py and potentiels.py
'''
run_mode, pots = read_best( best_file, pot_type='NFW', verbose=verbose )
return [ potentiels.get_profile(iPot['profil']['int']) for iPot in pots ]
def get_limit_list( pots ):
return [ limits.get_limit( iPot['profil']['int']) for iPot in pots ]
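# Minimal usage sketch (illustrative; assumes a lenstool best.par is present):
#
#   run_mode, halos = read_best('best.par', pot_type='AUTO')
#   for halo in halos:
#       print halo['identity']['str'], halo['ra']['float'], halo['dec']['float']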
| [
"[email protected]"
] | |
d952e3aa3f2702ee77fba5e59c2ae950a3d54637 | 9f5fcff2513f2d78f27e5313698dcc47fce1e754 | /Experiment/RL_EA_search/graphnas/rl_trainer.py | 98698e1f65a8d86c9fc924e46fecd052d959f4cb | [
"Apache-2.0"
] | permissive | ncucjm/notebook | c2495f790e9fc2ca55c1c29a8eaa2dc1bfe7463f | 7271a0d1b10cdd6298e223c7ff150d4df031aa76 | refs/heads/master | 2023-07-20T05:55:48.946687 | 2021-01-27T09:12:19 | 2021-01-27T09:12:19 | 202,633,012 | 0 | 0 | null | 2023-07-06T21:28:29 | 2019-08-16T00:58:45 | Jupyter Notebook | UTF-8 | Python | false | false | 14,605 | py | from collections import deque
import os
import glob
import torch
import numpy as np
import scipy.signal
import graphnas.utils.tensor_utils as utils
from graphnas_variants.macro_graphnas.pyg.pyg_gnn_model_manager import GeoCitationManager
import time
logger = utils.get_logger()
def discount(x, amount):
return scipy.signal.lfilter([1], [1, -amount], x[::-1], axis=0)[::-1]
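# The lfilter call on the reversed sequence implements the usual discounted
# return G[t] = sum_k amount**k * x[t+k]; e.g. discount([1., 1., 1.], 0.5)
# yields [1.75, 1.5, 1.0].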
history = []
def scale(value, last_k=10, scale_value=1):
'''
    scale value into [-scale_value, scale_value], according to the last_k most recent history entries
'''
max_reward = np.max(history[-last_k:])
if max_reward == 0:
return value
return scale_value / max_reward * value
def _get_optimizer(name):
if name.lower() == 'sgd':
optim = torch.optim.SGD
    elif name.lower() == 'adam':
        optim = torch.optim.Adam
    else:
        raise ValueError(f"unsupported optimizer name: {name}")
    return optim
def experiment_data_save(name,time_list,acc_list):
path = path_get()[1]
with open(path+"/"+name, "w") as f:
f.write(str(time_list))
f.write("\n"+str(acc_list))
print("the ", name, " have written")
def path_get():
    # directory of the current file
current_path = os.path.abspath('')
    # parent directory of the current folder
father_path = os.path.abspath(os.path.dirname(current_path))
# corpus_path = os.path.join(father_path, corpus)
return father_path, current_path
class RL_Trainer(object):
def __init__(self, args):
self.args = args
self.controller_step = 0 # counter for controller
self.cuda = args.cuda
self.epoch = 0
self.start_epoch = 0
self.max_length = self.args.shared_rnn_max_length
self.with_retrain = False
self.submodel_manager = None
self.controller = None
self.build_model() # build controller and sub-model
self.RL_train_time = []
self.RL_search_time = []
self.RL_train_acc = []
self.RL_search_acc = []
controller_optimizer = _get_optimizer(self.args.controller_optim)
self.controller_optim = controller_optimizer(self.controller.parameters(), lr=self.args.controller_lr)
if self.args.mode == "derive":
self.load_model()
def build_model(self):
self.args.share_param = False
self.with_retrain = True
self.args.shared_initial_step = 0
if self.args.search_mode == "macro":
# generate model description in macro way (generate entire network description)
from graphnas.search_space import MacroSearchSpace
search_space_cls = MacroSearchSpace()
self.search_space = search_space_cls.get_search_space()
self.action_list = search_space_cls.generate_action_list(self.args.layers_of_child_model)
# build RNN controller
from graphnas.graphnas_controller import SimpleNASController
self.controller = SimpleNASController(self.args, action_list=self.action_list,
search_space=self.search_space,
cuda=self.args.cuda)
if self.args.dataset in ["Cora", "Citeseer", "Pubmed"]:
# implements based on pyg
self.submodel_manager = GeoCitationManager(self.args)
if self.cuda:
self.controller.cuda()
def form_gnn_info(self, gnn):
if self.args.search_mode == "micro":
actual_action = {}
if self.args.predict_hyper:
actual_action["action"] = gnn[:-4]
actual_action["hyper_param"] = gnn[-4:]
else:
actual_action["action"] = gnn
actual_action["hyper_param"] = [0.005, 0.8, 5e-5, 128]
return actual_action
return gnn
def train(self, action_list):
model_path = "/home/jerry/experiment/RL_nas/graphnas/Citeseer"
# Training the controller
        if not os.listdir(model_path):  # check whether the controller checkpoint folder is empty (os.listdir returns an empty, falsy list if so)
self.train_controller()
print("*" * 35, "using controller search the initialize population", "*" * 35)
populations, accuracies = self.derive(self.args.population_size, action_list)
print("*" * 35, "the search DONE", "*" * 35)
self.save_model()
else:
            self.load_model()  # each time, load the controller checkpoint with the largest step index, then search
print("*" * 35, "using controller search the initialize population", "*" * 35)
populations, accuracies = self.derive(self.args.population_size, action_list)
print("*" * 35, "the search DONE", "*" * 35)
return populations, accuracies
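    # Usage sketch (illustrative): the returned population/accuracy lists seed
    # an evolutionary search, e.g.
    #   populations, accuracies = trainer.train(trainer.action_list)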
def derive(self, sample_num, action_list):
if sample_num is None and self.args.derive_from_history:
return self.derive_from_history()
else:
if sample_num is None:
sample_num = self.args.derive_num_sample
gnn_list, _, entropies = self.controller.sample(sample_num, with_details=True)
accuracies = []
epoch = 0
for action in gnn_list:
once_RL_search_start_time = time.time()
gnn = self.form_gnn_info(action)
reward = self.submodel_manager.test_with_param(gnn, format=self.args.format,
with_retrain=self.with_retrain)
acc_score = reward[1]
accuracies.append(acc_score)
once_RL_search_end_time = time.time()
print("the", epoch, "epcoh controller train time: ",
once_RL_search_end_time - once_RL_search_start_time, 's')
if epoch == 0:
self.RL_search_time.append(once_RL_search_start_time)
self.RL_search_time.append(once_RL_search_end_time)
self.RL_search_acc.append(acc_score)
else:
self.RL_search_time.append(once_RL_search_end_time)
self.RL_search_acc.append(acc_score)
epoch += 1
father_path = path_get()[0]
experiment_data_save("controler_search.txt", self.RL_search_time, self.RL_search_acc)
print("all RL search time list: ", self.RL_search_time)
print("all RL search acc list: ", self.RL_search_acc)
for individual, ind_acc in zip(gnn_list, accuracies):
print("individual:", individual, " val_score:", ind_acc)
            # gene encoding: map each gnn_structure onto indices in the search space
population = []
for gnn_structure in gnn_list:
i = 0
single = []
for operator, action_name in zip(gnn_structure, action_list):
if i == 9:
operator = 8
i += 1
single.append(self.search_space[action_name].index(operator))
population.append(single)
return population, accuracies
def save_model(self):
torch.save(self.controller.state_dict(), self.controller_path)
torch.save(self.controller_optim.state_dict(), self.controller_optimizer_path)
logger.info(f'[*] SAVED: {self.controller_path}')
epochs, shared_steps, controller_steps = self.get_saved_models_info()
for epoch in epochs[:-self.args.max_save_num]:
paths = glob.glob(
os.path.join(self.args.dataset, f'*_epoch{epoch}_*.pth'))
for path in paths:
utils.remove_file(path)
def load_model(self):
epochs, shared_steps, controller_steps = self.get_saved_models_info()
if len(epochs) == 0:
logger.info(f'[!] No checkpoint found in {self.args.dataset}...')
return
self.epoch = self.start_epoch = max(epochs)
self.controller_step = max(controller_steps)
self.controller.load_state_dict(
torch.load(self.controller_path))
self.controller_optim.load_state_dict(
torch.load(self.controller_optimizer_path))
logger.info(f'[*] LOADED: {self.controller_path}')
def get_reward(self, gnn_list, entropies, hidden):
"""
Computes the reward of a single sampled model on validation data.
"""
if not isinstance(entropies, np.ndarray):
entropies = entropies.data.cpu().numpy()
if isinstance(gnn_list, dict):
gnn_list = [gnn_list]
if isinstance(gnn_list[0], list) or isinstance(gnn_list[0], dict):
pass
else:
gnn_list = [gnn_list] # when structure_list is one structure
reward_list = []
for gnn in gnn_list:
gnn = self.form_gnn_info(gnn)
reward = self.submodel_manager.test_with_param(gnn,
format=self.args.format,
with_retrain=self.with_retrain)
            if reward is None:  # a CUDA error happened; count it as zero reward
                reward_list.append(0)
                acc_validation = 0
            else:
                reward_list.append(reward[0])  # the reward was computed correctly
                acc_validation = reward[1]
if self.args.entropy_mode == 'reward':
rewards = reward_list + self.args.entropy_coeff * entropies
elif self.args.entropy_mode == 'regularizer':
rewards = reward_list * np.ones_like(entropies)
else:
            raise NotImplementedError(f'Unknown entropy mode: {self.args.entropy_mode}')
return rewards, hidden, acc_validation
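    # With entropy_mode == 'reward', the scalar model reward is broadcast
    # against the per-action entropies, i.e. each action receives
    # reward + entropy_coeff * H_i -- the usual entropy-regularised
    # REINFORCE bonus that discourages premature convergence.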
def train_controller(self):
"""
Train controller to find better structure.
"""
print("*" * 35, "training controller", "*" * 35)
model = self.controller
model.train()
baseline = None
adv_history = []
entropy_history = []
reward_history = []
hidden = self.controller.init_hidden(self.args.batch_size)
total_loss = 0
for step in range(self.args.controller_max_step):
            # time for one controller training step
once_controller_train_start_time = time.time()
# sample graphnas
structure_list, log_probs, entropies = self.controller.sample(with_details=True)
# calculate reward
np_entropies = entropies.data.cpu().numpy()
results = self.get_reward(structure_list, np_entropies, hidden)
torch.cuda.empty_cache()
if results: # has reward
rewards, hidden, acc = results
else:
continue
# discount
if 1 > self.args.discount > 0:
rewards = discount(rewards, self.args.discount)
reward_history.extend(rewards)
entropy_history.extend(np_entropies)
# moving average baseline
if baseline is None:
baseline = rewards
else:
decay = self.args.ema_baseline_decay
baseline = decay * baseline + (1 - decay) * rewards
adv = rewards - baseline
history.append(adv)
adv = scale(adv, scale_value=0.5)
adv_history.extend(adv)
adv = utils.get_variable(adv, self.cuda, requires_grad=False)
# policy loss
loss = -log_probs * adv
if self.args.entropy_mode == 'regularizer':
loss -= self.args.entropy_coeff * entropies
loss = loss.sum() # or loss.mean()
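            # REINFORCE: minimising -log_prob * advantage performs gradient
            # ascent on expected reward; the moving-average baseline above only
            # reduces variance, it does not bias the gradient.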
# update
self.controller_optim.zero_grad()
loss.backward()
if self.args.controller_grad_clip > 0:
                torch.nn.utils.clip_grad_norm_(model.parameters(),
self.args.controller_grad_clip)
self.controller_optim.step()
total_loss += utils.to_item(loss.data)
self.controller_step += 1
torch.cuda.empty_cache()
once_controller_train_end_time = time.time()
print("the", step, "epcoh controller train time: ",
once_controller_train_end_time-once_controller_train_start_time, "s")
if step == 0:
self.RL_train_time.append(once_controller_train_start_time)
self.RL_train_time.append(once_controller_train_end_time)
self.RL_train_acc.append(acc)
else:
self.RL_train_time.append(once_controller_train_end_time)
self.RL_train_acc.append(acc)
print("all RL train time list: ", self.RL_train_time)
print("all RL train acc list: ", self.RL_train_acc)
print("*" * 35, "training controller over", "*" * 35)
experiment_data_save("controler_train.txt", self.RL_train_time, self.RL_train_acc)
def evaluate(self, gnn):
"""
Evaluate a structure on the validation set.
"""
self.controller.eval()
gnn = self.form_gnn_info(gnn)
results = self.submodel_manager.retrain(gnn, format=self.args.format)
if results:
reward, scores = results
else:
return
logger.info(f'eval | {gnn} | reward: {reward:8.2f} | scores: {scores:8.2f}')
@property
def model_info_filename(self):
return f"{self.args.dataset}_{self.args.search_mode}_{self.args.format}_results.txt"
@property
def controller_path(self):
return f'{self.args.dataset}/controller_epoch{self.epoch}_step{self.controller_step}.pth'
@property
def controller_optimizer_path(self):
return f'{self.args.dataset}/controller_epoch{self.epoch}_step{self.controller_step}_optimizer.pth'
def get_saved_models_info(self):
paths = glob.glob(os.path.join(self.args.dataset, '*.pth'))
paths.sort()
def get_numbers(items, delimiter, idx, replace_word, must_contain=''):
return list(set([int(
name.split(delimiter)[idx].replace(replace_word, ''))
for name in items if must_contain in name]))
basenames = [os.path.basename(path.rsplit('.', 1)[0]) for path in paths]
epochs = get_numbers(basenames, '_', 1, 'epoch')
shared_steps = get_numbers(basenames, '_', 2, 'step', 'shared')
controller_steps = get_numbers(basenames, '_', 2, 'step', 'controller')
epochs.sort()
shared_steps.sort()
controller_steps.sort()
return epochs, shared_steps, controller_steps
| [
"[email protected]"
] | |
b61968dae81932acb65d39767e1e265e0cacf305 | 53fab060fa262e5d5026e0807d93c75fb81e67b9 | /backup/user_167/ch50_2019_06_11_13_14_47_474417.py | 72022271fba4966682e6c89d23ed329447de3c78 | [] | no_license | gabriellaec/desoft-analise-exercicios | b77c6999424c5ce7e44086a12589a0ad43d6adca | 01940ab0897aa6005764fc220b900e4d6161d36b | refs/heads/main | 2023-01-31T17:19:42.050628 | 2020-12-16T05:21:31 | 2020-12-16T05:21:31 | 306,735,108 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 193 | py | def numero_no_indice (lista):
    l = []
    i = 0
    while i < len(lista):
        # keep the indices whose stored value equals the index itself
        if lista[i] == i:
            l.append(i)
        i += 1
    return l
"[email protected]"
] | |
52aaa46352ec022d7349aab054fec390a4a8e785 | ca7aa979e7059467e158830b76673f5b77a0f5a3 | /Python_codes/p03359/s399632677.py | 4b9e4ef5080c28b6522f683bf717ce679011d8cb | [] | no_license | Aasthaengg/IBMdataset | 7abb6cbcc4fb03ef5ca68ac64ba460c4a64f8901 | f33f1c5c3b16d0ea8d1f5a7d479ad288bb3f48d8 | refs/heads/main | 2023-04-22T10:22:44.763102 | 2021-05-13T17:27:22 | 2021-05-13T17:27:22 | 367,112,348 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 411 | py | import sys
read = sys.stdin.read
readline = sys.stdin.readline
readlines = sys.stdin.readlines
sys.setrecursionlimit(10 ** 9)
INF = 1 << 60
MOD = 1000000007
def main():
a, b = map(int, readline().split())
ans = 0
for i in range(1, 13):
if i < a:
ans += 1
elif i == a and i <= b:
ans += 1
print(ans)
return
if __name__ == '__main__':
main()
| [
"[email protected]"
] | |
a51a38a2e71d34f8aa6a2e82026c584eeacddb12 | ca7aa979e7059467e158830b76673f5b77a0f5a3 | /Python_codes/p03363/s286998192.py | 091f166035f13606310d3cb4f31a2b7bd57a4859 | [] | no_license | Aasthaengg/IBMdataset | 7abb6cbcc4fb03ef5ca68ac64ba460c4a64f8901 | f33f1c5c3b16d0ea8d1f5a7d479ad288bb3f48d8 | refs/heads/main | 2023-04-22T10:22:44.763102 | 2021-05-13T17:27:22 | 2021-05-13T17:27:22 | 367,112,348 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 297 | py | from collections import defaultdict
n = int(input())
li_a = list(map(int, input().split()))
s = [0] * (n+1)
dd = defaultdict(lambda:0)
for i in range(n):
s[i+1] += s[i] + li_a[i]
for i in range(n+1):
dd[s[i]] += 1
ans = 0
for key in dd:
ans += (dd[key]*dd[key] - dd[key])//2
print(ans) | [
"[email protected]"
] | |
cc2fb3795c9dafa0f738c912004965c3b7b25f9a | ab4f0df599159b2c3929c24b12e2766718efdacb | /pyplan_engine/classes/CustomImports.py | e28736de5115b4624154fca75a5693ce79bf591c | [
"MIT"
] | permissive | jorgedouglas71/pyplan-ide | f01ec438f727ee0dea01b0d265155b49a26ccdb8 | 5ad0e4a2592b5f2716ff680018f717c65de140f5 | refs/heads/master | 2020-09-25T03:10:01.201707 | 2019-12-04T15:39:55 | 2019-12-04T15:39:55 | 225,904,173 | 0 | 0 | MIT | 2019-12-04T15:57:06 | 2019-12-04T15:57:05 | null | UTF-8 | Python | false | false | 1,083 | py |
custom = {
}
defaults = {
"abs": abs,
"range": range,
"abs": abs,
"dict": dict,
"min": min,
"all": all,
"hex": hex,
"next": next,
"slice": slice,
"any": any,
"divmod": divmod,
"object": object,
"sorted": sorted,
"ascii": ascii,
"enumerate": enumerate,
"input": input,
"oct": oct,
"bin": bin,
"int": int,
"str": str,
"bool": bool,
"isinstance": isinstance,
"ord": ord,
"sum": sum,
"bytearray": bytearray,
"filter": filter,
"issubclass": issubclass,
"pow": pow,
"bytes": bytes,
"float": float,
"iter": iter,
"tuple": tuple,
"callable": callable,
"format": format,
"len": len,
"property": property,
"type": type,
"chr": chr,
"frozenset": frozenset,
"list": list,
"range": range,
"locals": locals,
"repr": repr,
"zip": zip,
"map": map,
"reversed": reversed,
"complex": complex,
"max": max,
"round": round
}
imports = {**custom, **defaults}
# bypass for tests
imports["bypass"] = True
| [
"[email protected]"
] | |
991ef0d671ac96b7e9c8992788bc4c9aaefb75ee | eb7b84b70b335e913aa7d6bf9effba3034810e27 | /FirstApp_Flask/employeeInfo.py | 73e6fe9ffb1ca4a70d0a2d3879854a6a22310f0c | [] | no_license | chavhanpunamchand/flask | ccc27c9909799bcc4fa9208b49c478942d569eb4 | 840c7299eb5cfc708f8a6cbefe8955115af9b179 | refs/heads/master | 2023-02-21T13:57:43.012794 | 2021-01-28T11:09:41 | 2021-01-28T11:09:41 | 326,448,594 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 1,008 | py |
class Employee:
def __init__(self,eid,ename,eage,egen,ecity,esalary,email,erole,eskil,ehobs):
self.empid=eid
self.empName=ename
self.empAge=eage
self.eGender=egen
self.empCity=ecity
self.empSalary=esalary
self.empEmail=email
self.empRole=erole
self.empSkill=eskil
self.emphobs=ehobs
    def __str__(self):
        return f" EmpId : {self.empid} " \
               f" Name :{self.empName} " \
               f" Age :{self.empAge}" \
               f" Gender :{self.eGender}" \
               f" City :{self.empCity}" \
               f" Salary:{self.empSalary}" \
               f" Role:{self.empRole} " \
               f" Email:{self.empEmail}" \
               f" Skill:{self.empSkill}" \
               f" Hobbies:{self.emphobs} "
    def __repr__(self):
        # __repr__ must return a string, not the instance itself
        return self.__str__()
# if __name__ == '__main__':
# s1=Employee(101,'Punamchand',27,'M','Pune',25000,'[email protected]','SSE','Python,Java','Cricket,Hocky')
# print(s1)
| [
"[email protected]"
] | |
4834fd49bfc14729688e48ddef0a083f66fbdcb8 | ca7aa979e7059467e158830b76673f5b77a0f5a3 | /Python_codes/p02659/s688514282.py | 9e2ccbf9b9d040bce71fcaff7dcf3be04de6d05d | [] | no_license | Aasthaengg/IBMdataset | 7abb6cbcc4fb03ef5ca68ac64ba460c4a64f8901 | f33f1c5c3b16d0ea8d1f5a7d479ad288bb3f48d8 | refs/heads/main | 2023-04-22T10:22:44.763102 | 2021-05-13T17:27:22 | 2021-05-13T17:27:22 | 367,112,348 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 104 | py | import math
from decimal import *
a, b = input().split()
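# keeping the operands as strings lets Decimal see the exact decimal text
# (no binary float rounding); // 1 then floors the product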
ans=(Decimal(a)*Decimal(b))//1
print(ans) | [
"[email protected]"
] | |
54d596a8f7abeaa708ce12829210e4b01ed2cd1c | 9b5d0b7d7c9cdaef2851b675292e5eef651ab257 | /database/compiled_templates/root/history_common.mako.py | 10318451f22158193738b9a03b46af42ae5c7016 | [
"CC-BY-2.5",
"MIT"
] | permissive | msGenDev/Yeps-EURAC | 392fd497a6891a5a22204b236c26dcd133793f21 | 7b679ea17ba294893cc560354d759cfd61c0b450 | refs/heads/master | 2021-01-16T21:49:26.499975 | 2010-04-05T17:52:50 | 2010-04-05T17:52:50 | null | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 14,425 | py | from mako import runtime, filters, cache
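# NOTE: Mako-compiled template module (see _template_filename below); it is
# regenerated automatically, so edits belong in the source .mako template.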
UNDEFINED = runtime.UNDEFINED
__M_dict_builtin = dict
__M_locals_builtin = locals
_magic_number = 5
_modified_time = 1262455474.7263279
_template_filename=u'templates/root/history_common.mako'
_template_uri=u'root/history_common.mako'
_template_cache=cache.Cache(__name__, _modified_time)
_source_encoding=None
_exports = ['render_dataset']
def render_body(context,**pageargs):
context.caller_stack._push_frame()
try:
__M_locals = __M_dict_builtin(pageargs=pageargs)
n_ = context.get('n_', UNDEFINED)
__M_writer = context.writer()
# SOURCE LINE 1
_=n_
__M_locals.update(__M_dict_builtin([(__M_key, __M_locals_builtin()[__M_key]) for __M_key in ['_'] if __M_key in __M_locals_builtin()]))
__M_writer(u'\n')
# SOURCE LINE 136
__M_writer(u'\n')
return ''
finally:
context.caller_stack._pop_frame()
def render_render_dataset(context,data,hid,show_deleted_on_refresh=False,user_owns_dataset=True):
context.caller_stack._push_frame()
try:
h = context.get('h', UNDEFINED)
app = context.get('app', UNDEFINED)
def render_dataset(data,hid,show_deleted_on_refresh=False,user_owns_dataset=True):
return render_render_dataset(context,data,hid,show_deleted_on_refresh,user_owns_dataset)
request = context.get('request', UNDEFINED)
len = context.get('len', UNDEFINED)
enumerate = context.get('enumerate', UNDEFINED)
trans = context.get('trans', UNDEFINED)
_ = context.get('_', UNDEFINED)
__M_writer = context.writer()
# SOURCE LINE 3
__M_writer(u'\n <a name="')
# SOURCE LINE 4
__M_writer(unicode(trans.security.encode_id( data.id )))
__M_writer(u'"></a>\n ')
# SOURCE LINE 5
if data.state in ['no state','',None]:
data_state = "queued"
else:
data_state = data.state
user, roles = trans.get_user_and_roles()
# SOURCE LINE 11
__M_writer(u'\n')
# SOURCE LINE 12
if not trans.user_is_admin() and not trans.app.security_agent.can_access_dataset( roles, data.dataset ):
# SOURCE LINE 13
__M_writer(u' <div class="historyItemWrapper historyItem historyItem-')
__M_writer(unicode(data_state))
__M_writer(u' historyItem-noPermission" id="historyItem-')
__M_writer(unicode(data.id))
__M_writer(u'">\n')
# SOURCE LINE 14
else:
# SOURCE LINE 15
__M_writer(u' <div class="historyItemWrapper historyItem historyItem-')
__M_writer(unicode(data_state))
__M_writer(u'" id="historyItem-')
__M_writer(unicode(data.id))
__M_writer(u'">\n')
# SOURCE LINE 17
__M_writer(u' \n')
# SOURCE LINE 18
if data.deleted:
# SOURCE LINE 19
__M_writer(u' <div class="warningmessagesmall">\n <strong>This dataset has been deleted. Click <a href="')
# SOURCE LINE 20
__M_writer(unicode(h.url_for( controller='dataset', action='undelete', id=data.id )))
__M_writer(u'" class="historyItemUndelete" id="historyItemUndeleter-')
__M_writer(unicode(data.id))
__M_writer(u'" target="galaxy_history">here</a> to undelete.</strong>\n </div>\n')
# SOURCE LINE 23
__M_writer(u'\n')
# SOURCE LINE 25
__M_writer(u'\t<div style="overflow: hidden;" class="historyItemTitleBar">\t\t\n\t <div class="historyItemButtons">\n')
# SOURCE LINE 27
if data_state == "upload":
# SOURCE LINE 31
__M_writer(u' \t <img src="')
__M_writer(unicode(h.url_for('/static/images/eye_icon_grey.png')))
__M_writer(u'" width=\'16\' height=\'16\' alt=\'display data\' title=\'display data\' class=\'button display\' border=\'0\'>\n')
# SOURCE LINE 32
if user_owns_dataset:
# SOURCE LINE 33
__M_writer(u' \t <img src="')
__M_writer(unicode(h.url_for('/static/images/pencil_icon_grey.png')))
__M_writer(u'" width=\'16\' height=\'16\' alt=\'edit attributes\' title=\'edit attributes\' class=\'button edit\' border=\'0\'>\n')
# SOURCE LINE 35
else:
# SOURCE LINE 36
__M_writer(u' \t <a class="icon-button display" title="display data" href="')
__M_writer(unicode(h.url_for( controller='dataset', action='display', dataset_id=trans.security.encode_id( data.id ), preview=True, filename='' )))
__M_writer(u'" target="galaxy_main"></a>\n')
# SOURCE LINE 37
if user_owns_dataset:
# SOURCE LINE 38
__M_writer(u' \t <a class="icon-button edit" title="edit attributes" href="')
__M_writer(unicode(h.url_for( controller='root', action='edit', id=data.id )))
__M_writer(u'" target="galaxy_main"></a>\n')
# SOURCE LINE 41
if user_owns_dataset:
# SOURCE LINE 42
__M_writer(u'\t <a class="icon-button delete" title="delete" href="')
__M_writer(unicode(h.url_for( action='delete', id=data.id, show_deleted_on_refresh=show_deleted_on_refresh )))
__M_writer(u'" id="historyItemDeleter-')
__M_writer(unicode(data.id))
__M_writer(u'"></a>\n')
# SOURCE LINE 44
__M_writer(u'\t </div>\n\t <span class="state-icon"></span>\n\t <span class="historyItemTitle"><b>')
# SOURCE LINE 46
__M_writer(unicode(hid))
__M_writer(u': ')
__M_writer(unicode(data.display_name()))
__M_writer(u'</b></span>\n\t</div>\n \n')
# SOURCE LINE 50
__M_writer(u' \n <div id="info')
# SOURCE LINE 51
__M_writer(unicode(data.id))
__M_writer(u'" class="historyItemBody">\n')
# SOURCE LINE 52
if not trans.user_is_admin() and not trans.app.security_agent.can_access_dataset( roles, data.dataset ):
# SOURCE LINE 53
__M_writer(u' <div>You do not have permission to view this dataset.</div>\n')
# SOURCE LINE 54
elif data_state == "upload":
# SOURCE LINE 55
__M_writer(u' <div>Dataset is uploading</div>\n')
# SOURCE LINE 56
elif data_state == "queued":
# SOURCE LINE 57
__M_writer(u' <div>')
__M_writer(unicode(_('Job is waiting to run')))
__M_writer(u'</div>\n')
# SOURCE LINE 58
elif data_state == "running":
# SOURCE LINE 59
__M_writer(u' <div>')
__M_writer(unicode(_('Job is currently running')))
__M_writer(u'</div>\n')
# SOURCE LINE 60
elif data_state == "error":
# SOURCE LINE 61
__M_writer(u' <div>\n An error occurred running this job: <i>')
# SOURCE LINE 62
__M_writer(unicode(data.display_info().strip()))
__M_writer(u'</i>\n </div>\n\t\t<div>\n\t\t <a href="')
# SOURCE LINE 65
__M_writer(unicode(h.url_for( controller='dataset', action='errors', id=data.id )))
__M_writer(u'" target="galaxy_main">report this error</a>\n\t\t | <a href="')
# SOURCE LINE 66
__M_writer(unicode(h.url_for( controller='tool_runner', action='rerun', id=data.id )))
__M_writer(u'" target="galaxy_main">rerun</a>\n\t\t</div>\n')
# SOURCE LINE 68
elif data_state == "discarded":
# SOURCE LINE 69
__M_writer(u' <div>\n The job creating this dataset was cancelled before completion.\n </div>\n')
# SOURCE LINE 72
elif data_state == 'setting_metadata':
# SOURCE LINE 73
__M_writer(u' <div>')
__M_writer(unicode(_('Metadata is being Auto-Detected.')))
__M_writer(u'</div>\n')
# SOURCE LINE 74
elif data_state == "empty":
# SOURCE LINE 75
__M_writer(u' <div>')
__M_writer(unicode(_('No data: ')))
__M_writer(u'<i>')
__M_writer(unicode(data.display_info()))
__M_writer(u'</i></div>\n')
# SOURCE LINE 76
elif data_state == "ok":
# SOURCE LINE 77
__M_writer(u' <div>\n ')
# SOURCE LINE 78
__M_writer(unicode(data.blurb))
__M_writer(u',\n format: <span class="')
# SOURCE LINE 79
__M_writer(unicode(data.ext))
__M_writer(u'">')
__M_writer(unicode(data.ext))
__M_writer(u'</span>, \n database:\n')
# SOURCE LINE 81
if data.dbkey == '?':
# SOURCE LINE 82
__M_writer(u' <a href="')
__M_writer(unicode(h.url_for( controller='root', action='edit', id=data.id )))
__M_writer(u'" target="galaxy_main">')
__M_writer(unicode(_(data.dbkey)))
__M_writer(u'</a>\n')
# SOURCE LINE 83
else:
# SOURCE LINE 84
__M_writer(u' <span class="')
__M_writer(unicode(data.dbkey))
__M_writer(u'">')
__M_writer(unicode(_(data.dbkey)))
__M_writer(u'</span>\n')
# SOURCE LINE 86
__M_writer(u' </div>\n <div class="info">')
# SOURCE LINE 87
__M_writer(unicode(_('Info: ')))
__M_writer(unicode(data.display_info()))
__M_writer(u'</div>\n <div> \n')
# SOURCE LINE 89
if data.has_data:
# SOURCE LINE 90
__M_writer(u' <a href="')
__M_writer(unicode(h.url_for( controller='dataset', action='display', dataset_id=trans.security.encode_id( data.id ), to_ext=data.ext )))
__M_writer(u'">save</a>\n')
# SOURCE LINE 91
if user_owns_dataset:
# SOURCE LINE 92
__M_writer(u'\t\t\t | <a href="')
__M_writer(unicode(h.url_for( controller='tool_runner', action='rerun', id=data.id )))
__M_writer(u'" target="galaxy_main">rerun</a>\n')
# SOURCE LINE 94
for display_app in data.datatype.get_display_types():
# SOURCE LINE 95
__M_writer(u' ')
target_frame, display_links = data.datatype.get_display_links( data, display_app, app, request.base )
__M_writer(u'\n')
# SOURCE LINE 96
if len( display_links ) > 0:
# SOURCE LINE 97
__M_writer(u' | ')
__M_writer(unicode(data.datatype.get_display_label(display_app)))
__M_writer(u'\n')
# SOURCE LINE 98
for display_name, display_link in display_links:
# SOURCE LINE 99
__M_writer(u'\t\t\t\t <a target="')
__M_writer(unicode(target_frame))
__M_writer(u'" href="')
__M_writer(unicode(display_link))
__M_writer(u'">')
__M_writer(unicode(_(display_name)))
__M_writer(u'</a> \n')
# SOURCE LINE 104
__M_writer(u' </div>\n')
# SOURCE LINE 105
if data.peek != "no peek":
# SOURCE LINE 106
__M_writer(u' <div><pre id="peek')
__M_writer(unicode(data.id))
__M_writer(u'" class="peek">')
__M_writer(unicode(_(data.display_peek())))
__M_writer(u'</pre></div>\n')
# SOURCE LINE 108
else:
# SOURCE LINE 109
__M_writer(u'\t\t<div>')
__M_writer(unicode(_('Error: unknown dataset state "%s".') % data_state))
__M_writer(u'</div>\n')
# SOURCE LINE 111
__M_writer(u' \n')
# SOURCE LINE 113
__M_writer(u' \n')
# SOURCE LINE 114
if len( data.children ) > 0:
# SOURCE LINE 117
__M_writer(u' ')
children = []
for child in data.children:
if child.visible:
children.append( child )
# SOURCE LINE 122
__M_writer(u'\n')
# SOURCE LINE 123
if len( children ) > 0:
# SOURCE LINE 124
__M_writer(u' <div>\n There are ')
# SOURCE LINE 125
__M_writer(unicode(len( children )))
__M_writer(u' secondary datasets.\n')
# SOURCE LINE 126
for idx, child in enumerate(children):
# SOURCE LINE 127
__M_writer(u' ')
__M_writer(unicode(render_dataset( child, idx + 1, show_deleted_on_refresh = show_deleted_on_refresh )))
__M_writer(u'\n')
# SOURCE LINE 129
__M_writer(u' </div>\n')
# SOURCE LINE 132
__M_writer(u'\n </div>\n </div>\n\n')
return ''
finally:
context.caller_stack._pop_frame()
| [
"[email protected]"
] | |
c6283b432e7cc517ab49d5868cb8608925fc9ed8 | 60eec837687166288bed41aa406fb92ff8b63f1b | /model/tensorflow/examples/tfdnnc/tfdnnc.py | d764e5f63f5b4c9824f4e993163d0e65ed8a9156 | [
"MIT",
"LicenseRef-scancode-generic-export-compliance"
] | permissive | shivam-bit/dffml | 1c5ca75e946fddec78fd3943bdcb6cf5650bcd89 | 4c1e7870dba12c298d5a7f4740a23af634cacb7e | refs/heads/master | 2022-06-19T22:28:10.396364 | 2020-05-08T21:53:53 | 2020-05-08T21:53:53 | 262,698,152 | 0 | 0 | MIT | 2020-05-10T02:36:10 | 2020-05-10T02:36:09 | null | UTF-8 | Python | false | false | 1,128 | py | from dffml import CSVSource, Features, DefFeature
from dffml.noasync import train, accuracy, predict
from dffml_model_tensorflow.dnnc import DNNClassifierModel
model = DNNClassifierModel(
features=Features(
DefFeature("SepalLength", float, 1),
DefFeature("SepalWidth", float, 1),
DefFeature("PetalLength", float, 1),
DefFeature("PetalWidth", float, 1),
),
predict=DefFeature("classification", int, 1),
epochs=3000,
steps=20000,
classifications=[0, 1, 2],
clstype=int,
)
# Train the model
train(model, "iris_training.csv")
# Assess accuracy (alternate way of specifying data source)
print("Accuracy:", accuracy(model, CSVSource(filename="iris_test.csv")))
# Make prediction
for i, features, prediction in predict(
model,
{
"PetalLength": 4.2,
"PetalWidth": 1.5,
"SepalLength": 5.9,
"SepalWidth": 3.0,
},
{
"PetalLength": 5.4,
"PetalWidth": 2.1,
"SepalLength": 6.9,
"SepalWidth": 3.1,
},
):
features["classification"] = prediction["classification"]["value"]
print(features)
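# Expected behaviour (illustrative): one dict per sample is printed with the
# predicted "classification" value filled in; exact values depend on training.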
| [
"[email protected]"
] | |
bb74ddac0a7bc8a9c141488aea6f2faa98c9f820 | 8fb1d41797595550418ecfc0e7558f38254b4606 | /django/contrib/flatpages/tests/forms.py | 7282f009c9dc86d398bb8e67aa09a2233c66ef42 | [
"MIT"
] | permissive | hunch/hunch-gift-app | 2aad70a9f18124bf0de02d7a125fa93c765da008 | 8c7cad24cc0d9900deb4175e6b768c64a3d7adcf | refs/heads/master | 2016-09-06T03:13:52.153974 | 2012-03-26T18:11:59 | 2012-03-26T18:11:59 | 1,191,221 | 6 | 3 | null | null | null | null | UTF-8 | Python | false | false | 1,245 | py | from django.contrib.flatpages.admin import FlatpageForm
from django.test import TestCase
class FlatpageAdminFormTests(TestCase):
def setUp(self):
self.form_data = {
'title': "A test page",
'content': "This is a test",
'sites': [1],
}
def test_flatpage_admin_form_url_validation(self):
"The flatpage admin form validates correctly validates urls"
self.assertTrue(FlatpageForm(data=dict(url='/new_flatpage/', **self.form_data)).is_valid())
self.assertTrue(FlatpageForm(data=dict(url='/some.special~chars/', **self.form_data)).is_valid())
self.assertTrue(FlatpageForm(data=dict(url='/some.very_special~chars-here/', **self.form_data)).is_valid())
self.assertFalse(FlatpageForm(data=dict(url='/a space/', **self.form_data)).is_valid())
self.assertFalse(FlatpageForm(data=dict(url='/a % char/', **self.form_data)).is_valid())
self.assertFalse(FlatpageForm(data=dict(url='/a ! char/', **self.form_data)).is_valid())
self.assertFalse(FlatpageForm(data=dict(url='/a & char/', **self.form_data)).is_valid())
self.assertFalse(FlatpageForm(data=dict(url='/a ? char/', **self.form_data)).is_valid())
| [
"[email protected]"
] | |
5bf881452b66fca7b77a5d8014fa09f19fdeb494 | 87163acf1614292be250754f28114f89013f73a3 | /Codechef/COLOR.py | bb42d808690439dedd343a0a5262394440066d8d | [] | no_license | khush-01/Python-codes | 742a9d9966d2ceb3ad2e7c78e34ef88e55df955a | da3cae8df0aafe763399066eefc9b786538fdb35 | refs/heads/main | 2023-03-20T04:37:14.020134 | 2021-03-12T04:56:30 | 2021-03-12T04:56:30 | 346,941,048 | 2 | 0 | null | null | null | null | UTF-8 | Python | false | false | 151 | py | for _ in range(int(input())):
n = int(input())
s = input()
a = {'R': 0, 'G': 0, 'B': 0}
for x in s:
a[x] += 1
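    # repaint every cell that is not the most frequent colour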
print(n - max(a.values()))
| [
"[email protected]"
] | |
e11e7f157bd594146ba4dff9987c8483968a3046 | 1d75146a66245dc046dc216bb602129208e00733 | /closed/GIGABYTE/code/rnnt/tensorrt/preprocessing/parts/features.py | 20935e8f677b7c52a9c49336f23ad6259ef4b8b6 | [
"Apache-2.0",
"LicenseRef-scancode-unknown-license-reference"
] | permissive | georgelyuan/inference_results_v1.1 | febf287bd5967bf7f087355a81f06a2bd298cbfe | 3196a5587887c39203ee3ac246fa5dbe789d9085 | refs/heads/main | 2023-08-16T08:49:45.274284 | 2021-09-23T20:57:17 | 2021-09-23T20:57:17 | 409,773,141 | 0 | 0 | NOASSERTION | 2021-09-23T23:36:37 | 2021-09-23T23:36:37 | null | UTF-8 | Python | false | false | 13,634 | py | # Copyright (c) 2021, NVIDIA CORPORATION. All rights reserved.
# Copyright (c) 2019, Myrtle Software Limited. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import torch
import torch.nn as nn
import math
import librosa
from .perturb import AudioAugmentor
from .segment import AudioSegment
from apex import amp
def audio_from_file(file_path, offset=0, duration=0, trim=False, target_sr=16000):
audio = AudioSegment.from_file(file_path,
target_sr=target_sr,
int_values=False,
offset=offset, duration=duration, trim=trim)
samples = torch.tensor(audio.samples, dtype=torch.float).cuda()
num_samples = torch.tensor(samples.shape[0]).int().cuda()
return (samples.unsqueeze(0), num_samples.unsqueeze(0))
class WaveformFeaturizer(object):
def __init__(self, input_cfg, augmentor=None):
self.augmentor = augmentor if augmentor is not None else AudioAugmentor()
self.cfg = input_cfg
def max_augmentation_length(self, length):
return self.augmentor.max_augmentation_length(length)
def process(self, file_path, offset=0, duration=0, trim=False):
audio = AudioSegment.from_file(file_path,
target_sr=self.cfg['sample_rate'],
int_values=self.cfg.get('int_values', False),
offset=offset, duration=duration, trim=trim)
return self.process_segment(audio)
def process_segment(self, audio_segment):
self.augmentor.perturb(audio_segment)
return torch.tensor(audio_segment.samples, dtype=torch.float)
@classmethod
def from_config(cls, input_config, perturbation_configs=None):
if perturbation_configs is not None:
aa = AudioAugmentor.from_config(perturbation_configs)
else:
aa = None
return cls(input_config, augmentor=aa)
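# small additive epsilon used below to keep the normalisation std strictly positive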
constant = 1e-5
def normalize_batch(x, seq_len, normalize_type):
if normalize_type == "per_feature":
x_mean = torch.zeros((seq_len.shape[0], x.shape[1]), dtype=x.dtype,
device=x.device)
x_std = torch.zeros((seq_len.shape[0], x.shape[1]), dtype=x.dtype,
device=x.device)
for i in range(x.shape[0]):
x_mean[i, :] = x[i, :, :seq_len[i]].mean(dim=1)
x_std[i, :] = x[i, :, :seq_len[i]].std(dim=1)
# make sure x_std is not zero
x_std += constant
return (x - x_mean.unsqueeze(2)) / x_std.unsqueeze(2)
elif normalize_type == "all_features":
x_mean = torch.zeros(seq_len.shape, dtype=x.dtype, device=x.device)
x_std = torch.zeros(seq_len.shape, dtype=x.dtype, device=x.device)
for i in range(x.shape[0]):
x_mean[i] = x[i, :, :seq_len[i].item()].mean()
x_std[i] = x[i, :, :seq_len[i].item()].std()
# make sure x_std is not zero
x_std += constant
return (x - x_mean.view(-1, 1, 1)) / x_std.view(-1, 1, 1)
else:
return x
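# Note: the statistics above are computed only over the first seq_len[i] frames
# of each utterance, so zero padding never skews the per-feature mean/std.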
def splice_frames(x, frame_splicing):
""" Stacks frames together across feature dim
input is batch_size, feature_dim, num_frames
output is batch_size, feature_dim*frame_splicing, num_frames
"""
seq = [x]
for n in range(1, frame_splicing):
tmp = torch.zeros_like(x)
tmp[:, :, :-n] = x[:, :, n:]
seq.append(tmp)
return torch.cat(seq, dim=1)[:, :, ::frame_splicing]
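# e.g. with frame_splicing=2 a (B, F, T) batch becomes (B, 2*F, ceil(T/2)):
# each retained frame is stacked with its right-hand neighbour.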
class SpectrogramFeatures(nn.Module):
def __init__(self, sample_rate=8000, window_size=0.02, window_stride=0.01,
n_fft=None,
window="hamming", normalize="per_feature", log=True, center=True,
dither=constant, pad_to=8, max_duration=16.7,
frame_splicing=1):
super(SpectrogramFeatures, self).__init__()
torch_windows = {
'hann': torch.hann_window,
'hamming': torch.hamming_window,
'blackman': torch.blackman_window,
'bartlett': torch.bartlett_window,
'none': None,
}
self.win_length = int(sample_rate * window_size)
self.hop_length = int(sample_rate * window_stride)
self.n_fft = n_fft or 2 ** math.ceil(math.log2(self.win_length))
window_fn = torch_windows.get(window, None)
window_tensor = window_fn(self.win_length,
periodic=False) if window_fn else None
self.window = window_tensor
self.normalize = normalize
self.log = log
self.center = center
self.dither = dither
self.pad_to = pad_to
self.frame_splicing = frame_splicing
max_length = 1 + math.ceil(
(max_duration * sample_rate - self.win_length) / self.hop_length
)
max_pad = 16 - (max_length % 16)
self.max_length = max_length + max_pad
def get_seq_len(self, seq_len):
x = torch.ceil(seq_len.to(dtype=torch.float) / self.hop_length).to(
dtype=torch.int)
if self.frame_splicing > 1:
x = torch.ceil(x.float() / self.frame_splicing).to(dtype=torch.int)
return x
@torch.no_grad()
def forward(self, inp):
x, seq_len = inp
dtype = x.dtype
seq_len = self.get_seq_len(seq_len)
# dither
if self.dither > 0:
x += self.dither * torch.randn_like(x)
# do preemphasis
if hasattr(self, 'preemph') and self.preemph is not None:
x = torch.cat((x[:, 0].unsqueeze(1), x[:, 1:] - self.preemph * x[:, :-1]),
dim=1)
# get spectrogram
x = torch.stft(x, n_fft=self.n_fft, hop_length=self.hop_length,
win_length=self.win_length, center=self.center,
window=self.window.to(torch.float))
x = torch.sqrt(x.pow(2).sum(-1))
# log features if required
if self.log:
x = torch.log(x + 1e-20)
# frame splicing if required
if self.frame_splicing > 1:
x = splice_frames(x, self.frame_splicing)
# normalize if required
if self.normalize:
x = normalize_batch(x, seq_len, normalize_type=self.normalize)
# mask to zero any values beyond seq_len in batch, pad to multiple of `pad_to` (for efficiency)
#max_len = x.size(-1)
#mask = torch.arange(max_len).to(seq_len.dtype).to(seq_len.device).expand(x.size(0), max_len) >= seq_len.unsqueeze(1)
#x = x.masked_fill(mask.unsqueeze(1).to(device=x.device), 0)
#del mask
x = x[:, :, :seq_len.max()] # rnnt loss requires lengths to match
pad_to = self.pad_to
if pad_to != 0:
raise NotImplementedError()
# if pad_to == "max":
# x = nn.functional.pad(x, (0, self.max_length - x.size(-1)))
# elif pad_to > 0:
# pad_amt = x.size(-1) % pad_to
# if pad_amt != 0:
# x = nn.functional.pad(x, (0, pad_to - pad_amt))
return x.to(dtype)
@classmethod
def from_config(cls, cfg, log=False):
return cls(sample_rate=cfg['sample_rate'], window_size=cfg['window_size'],
window_stride=cfg['window_stride'],
n_fft=cfg['n_fft'], window=cfg['window'],
normalize=cfg['normalize'],
max_duration=cfg.get('max_duration', 16.7),
dither=cfg.get('dither', 1e-5), pad_to=cfg.get("pad_to", 0),
frame_splicing=cfg.get("frame_splicing", 1), log=log)
class FilterbankFeatures(nn.Module):
def __init__(self, sample_rate=8000, window_size=0.02, window_stride=0.01,
window="hamming", normalize="per_feature", n_fft=None,
preemph=0.97,
nfilt=64, lowfreq=0, highfreq=None, log=True, dither=constant,
pad_to=8,
max_duration=16.7,
frame_splicing=1):
super(FilterbankFeatures, self).__init__()
# print("PADDING: {}".format(pad_to))
torch_windows = {
'hann': torch.hann_window,
'hamming': torch.hamming_window,
'blackman': torch.blackman_window,
'bartlett': torch.bartlett_window,
'none': None,
}
self.win_length = int(sample_rate * window_size) # frame size
self.hop_length = int(sample_rate * window_stride)
self.n_fft = n_fft or 2 ** math.ceil(math.log2(self.win_length))
self.normalize = normalize
self.log = log
self.dither = dither
self.frame_splicing = frame_splicing
self.nfilt = nfilt
self.preemph = preemph
self.pad_to = pad_to
highfreq = highfreq or sample_rate / 2
window_fn = torch_windows.get(window, None)
window_tensor = window_fn(self.win_length,
periodic=False) if window_fn else None
filterbanks = torch.tensor(
librosa.filters.mel(sample_rate, self.n_fft, n_mels=nfilt, fmin=lowfreq,
fmax=highfreq), dtype=torch.float).unsqueeze(0)
# self.fb = filterbanks
# self.window = window_tensor
self.register_buffer("fb", filterbanks)
self.register_buffer("window", window_tensor)
# Calculate maximum sequence length (# frames)
max_length = 1 + math.ceil(
(max_duration * sample_rate - self.win_length) / self.hop_length
)
max_pad = 16 - (max_length % 16)
self.max_length = max_length + max_pad
def get_seq_len(self, seq_len):
x = torch.ceil(seq_len.to(dtype=torch.float) / self.hop_length).to(
dtype=torch.int)
# dtype=torch.long)
if self.frame_splicing > 1:
x = torch.ceil(x.float() / self.frame_splicing).to(dtype=torch.int)
return x
@torch.no_grad()
def forward(self, inp):
x, seq_len = inp
dtype = x.dtype
seq_len = self.get_seq_len(seq_len)
# dither
if self.dither > 0:
x += self.dither * torch.randn_like(x)
# do preemphasis
if self.preemph is not None:
x = torch.cat((x[:, 0].unsqueeze(1), x[:, 1:] - self.preemph * x[:, :-1]),
dim=1)
# do stft
x = torch.stft(x, n_fft=self.n_fft, hop_length=self.hop_length,
win_length=self.win_length,
center=True, window=self.window.to(dtype=torch.float))
# get power spectrum
x = x.pow(2).sum(-1)
# dot with filterbank energies
x = torch.matmul(self.fb.to(x.dtype), x)
# log features if required
if self.log:
x = torch.log(x + 1e-20)
# frame splicing if required
if self.frame_splicing > 1:
x = splice_frames(x, self.frame_splicing)
# normalize if required
if self.normalize:
x = normalize_batch(x, seq_len, normalize_type=self.normalize)
# mask to zero any values beyond seq_len in batch, pad to multiple of `pad_to` (for efficiency)
#max_len = x.size(-1)
x = x[:, :, :seq_len.max()] # rnnt loss requires lengths to match
# mask = torch.arange(max_len).to(seq_len.dtype).to(x.device).expand(x.size(0),
# max_len) >= seq_len.unsqueeze(1)
#x = x.masked_fill(mask.unsqueeze(1).to(device=x.device), 0)
pad_to = self.pad_to
if pad_to != 0:
raise NotImplementedError()
# if pad_to == "max":
# x = nn.functional.pad(x, (0, self.max_length - x.size(-1)))
# elif pad_to > 0:
# pad_amt = x.size(-1) % pad_to
# if pad_amt != 0:
# x = nn.functional.pad(x, (0, pad_to - pad_amt))
return x.to(dtype)
@classmethod
def from_config(cls, cfg, log=False):
return cls(sample_rate=cfg['sample_rate'], window_size=cfg['window_size'],
window_stride=cfg['window_stride'], n_fft=cfg['n_fft'],
nfilt=cfg['features'], window=cfg['window'],
normalize=cfg['normalize'],
max_duration=cfg.get('max_duration', 16.7),
dither=cfg['dither'], pad_to=cfg.get("pad_to", 0),
frame_splicing=cfg.get("frame_splicing", 1), log=log)
class FeatureFactory(object):
featurizers = {
"logfbank": FilterbankFeatures,
"fbank": FilterbankFeatures,
"stft": SpectrogramFeatures,
"logspect": SpectrogramFeatures,
"logstft": SpectrogramFeatures
}
def __init__(self):
pass
@classmethod
def from_config(cls, cfg):
feat_type = cfg.get('feat_type', "logspect")
featurizer = cls.featurizers[feat_type]
# return featurizer.from_config(cfg, log="log" in cfg['feat_type'])
return featurizer.from_config(cfg, log="log" in feat_type)
| [
"[email protected]"
] | |
6200af4f551c8e6534b8a3a0ca176346d8b54d3e | dfff816642f4e1afeab268f441906a6d811d3fb4 | /polling_stations/apps/data_collection/management/commands/import_swale.py | 1450c85e75893e03f2e966075e02b5698e14b4c4 | [] | permissive | mtravis/UK-Polling-Stations | 2c07e03d03959492c7312e5a4bfbb71e12320432 | 26e0331dc29253dc436a0462ffaa01e974c5dc52 | refs/heads/master | 2020-05-14T18:36:31.501346 | 2019-04-17T12:54:57 | 2019-04-17T12:54:57 | 181,912,382 | 0 | 0 | BSD-3-Clause | 2019-04-17T14:48:26 | 2019-04-17T14:48:26 | null | UTF-8 | Python | false | false | 3,327 | py | from django.contrib.gis.geos import Point
from data_collection.management.commands import BaseHalaroseCsvImporter
class Command(BaseHalaroseCsvImporter):
council_id = "E07000113"
addresses_name = (
"local.2019-05-02/Version 1/polling_station_export-2019-02-28Swle.csv"
)
stations_name = (
"local.2019-05-02/Version 1/polling_station_export-2019-02-28Swle.csv"
)
elections = ["local.2019-05-02"]
def address_record_to_dict(self, record):
rec = super().address_record_to_dict(record)
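        # Normalise the UPRN (strip whitespace, drop leading zeros) so it
        # matches the keys used in the override lists below.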
uprn = record.uprn.strip().lstrip("0")
if uprn in [
"200001875126", # ME122HP -> ME122LZ : Morland House Augustine Road, Minster-on-Sea, Sheerness, Kent
"100061078702", # ME122LE -> ME122LA : Uppal Villa Minster Drive, Minster-on-Sea, Sheerness, Kent
"100061081138", # ME122LS -> ME122LX : 150 Scarborough Drive, Minster-on-Sea, Sheerness, Kent
"10023197317", # ME122SG -> ME122SH : 1A Baldwin Cottage Baldwin Road, Minster-on-Sea, Sheerness, Kent
"10023197934", # ME122LT -> ME122LX : 152 Scarborough Drive, Minster-on-Sea, Sheerness, Kent
"100061080835", # ME123JE -> ME123HT : 1A Rosemary Avenue, Minster-on-Sea, Sheerness, Kent
"10023200030", # ME103TU -> ME123TU : 28 Petunia Avenue, Minster-on-Sea, Sheerness, Kent
"10035061220", # ME121AG -> ME122AG : 178 Invicta Road, Sheerness, Kent
"10093083738", # ME124JB -> ME101QA : Flat Above Marinos Fish Bar 212 London Road, Sittingbourne, Kent
"100061078990", # ME123PA -> ME123NZ : 382B Minster Road, Minster-on-Sea, Sheerness, Kent
"100061083074", # ME122SG -> ME122SD : Llamedos The Glen, Minster-on-Sea, Sheerness, Kent
"100062379223", # ME124JA -> ME124JB : Sheringham Bell Farm Lane, Minster-on-Sea, Sheerness, Kent
"100061073637", # ME124JA -> ME124JB : The Laurels Bell Farm Lane, Minster-on-Sea, Sheerness, Kent
"100061073623", # ME124JA -> ME124JB : Merry Moments Bell Farm Lane, Minster-on-Sea, Sheerness, Kent
"200002539987", # ME101NL -> ME101NS : Flat L 94 London Road, Sittingbourne, Kent
]:
rec["accept_suggestion"] = True
if uprn in [
"100062087806", # ME121TP -> ME122RT : Flat 2 36A Broadway, Sheerness, Kent
"100062087803", # ME121TP -> ME122RT : Flat 1 36A Broadway, Sheerness, Kent
"200002535746", # ME121NX -> ME102RD : 45A High Street, Sheerness, Kent
"10013741961", # ME130SG -> ME104ES : Winterbourne Cottage Annexe Rushett Lane, Norton, Faversham, Kent
"10023196555", # ME122DH -> ME121AG : 3 The Crescent Parklands Village The Broadway, Minster-on-Sea, Sheerness, Kent
"10023196556", # ME122DH -> ME121AG : 1 The Crescent Parklands Village The Broadway, Minster-on-Sea, Sheerness, Kent
"10023200723", # ME137JG -> ME98XF : 23 West Street, Faversham, Kent
]:
rec["accept_suggestion"] = False
return rec
def station_record_to_dict(self, record):
rec = super().station_record_to_dict(record)
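        # Override the supplied location for stations 113 and 114 with a
        # fixed point (lon, lat in WGS84 / srid=4326).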
if record.pollingstationnumber in ["113", "114"]:
rec["location"] = Point(0.735912, 51.337309, srid=4326)
return rec
| [
"[email protected]"
] | |
efbce9c46c5c14d023e8afee6f5fd7be1921c7d8 | 4c252eb68446d5fd050e28a6b5ba1a7879b70b0a | /pyuavcan/transport/can/media/socketcan/__init__.py | c6ee7989569c0ca65d99e412f740125f7b1a788f | [
"MIT"
] | permissive | jxltom/pyuavcan | ce2cdf3a95ba4c6f3a0fd8aae24b341e46481fae | 42063b65ee2af431ab485f228d1ed5465a576449 | refs/heads/master | 2021-01-16T15:09:48.547764 | 2020-05-26T09:31:25 | 2020-05-26T09:31:25 | 243,163,363 | 0 | 0 | MIT | 2020-02-26T03:53:47 | 2020-02-26T03:53:46 | null | UTF-8 | Python | false | false | 901 | py | #
# Copyright (c) 2019 UAVCAN Development Team
# This software is distributed under the terms of the MIT License.
# Author: Pavel Kirienko <[email protected]>
#
"""
The module is always importable but is functional only on GNU/Linux.
For testing or experimentation on a local machine it is often convenient to use a virtual CAN bus instead of a real one.
Using SocketCAN, one can set up a virtual CAN bus interface as follows::
modprobe can
modprobe can_raw
modprobe vcan
ip link add dev vcan0 type vcan
ip link set vcan0 mtu 72 # Enable CAN FD by configuring the MTU of 64+8
ip link set up vcan0
Where ``vcan0`` can be replaced with any other valid interface name.
Please read the SocketCAN documentation for more information.
"""
from sys import platform as _platform
if _platform == 'linux':
from ._socketcan import SocketCANMedia as SocketCANMedia
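# --- Hedged usage note (added for illustration; not in the original module) ---
# With a vcan0 interface configured as in the docstring above, the media layer
# can then be constructed roughly as follows (constructor signature assumed;
# check the pyuavcan documentation):
#
#     media = SocketCANMedia('vcan0', mtu=64)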
| [
"[email protected]"
] | |
9a0550f2cdac6a2e0d0f2070fccd45e0531db9ea | 7add1f8fc31b09bb79efd2b25cc15e23666c1d1d | /tfx/orchestration/kubeflow/v2/test_utils.py | 4431711f7007e93516012675c192b8ae8b27b24d | [
"Apache-2.0"
] | permissive | twitter-forks/tfx | b867e9fee9533029ca799c4a4c5d1c5430ba05fe | cb3561224c54a5dad4d5679165d5b3bafc8b451b | refs/heads/master | 2021-11-19T18:45:09.157744 | 2021-10-19T00:02:34 | 2021-10-19T00:02:34 | 205,426,993 | 2 | 1 | Apache-2.0 | 2021-10-18T21:03:50 | 2019-08-30T17:21:03 | Python | UTF-8 | Python | false | false | 21,046 | py | # Copyright 2020 Google LLC. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Test utilities for kubeflow v2 runner."""
import os
from typing import List
from kfp.pipeline_spec import pipeline_spec_pb2 as pipeline_pb2
import tensorflow_model_analysis as tfma
from tfx import v1 as tfx
from tfx.components.trainer.executor import Executor
from tfx.dsl.component.experimental import executor_specs
from tfx.dsl.component.experimental import placeholders
from tfx.dsl.components.base import base_component
from tfx.dsl.components.base import base_executor
from tfx.dsl.components.base import base_node
from tfx.dsl.components.base import executor_spec
from tfx.dsl.experimental.conditionals import conditional
from tfx.types import channel_utils
from tfx.types import component_spec
from tfx.types.experimental import simple_artifacts
from google.protobuf import message
_TEST_TWO_STEP_PIPELINE_NAME = 'two-step-pipeline'
_TEST_FULL_PIPELINE_NAME = 'full-taxi-pipeline'
_TEST_PIPELINE_ROOT = 'path/to/my/root'
_TEST_INPUT_DATA = 'path/to/my/data'
_TEST_MODULE_FILE_LOCATION = 'path/to/my/module_utils.py'
TEST_RUNTIME_CONFIG = pipeline_pb2.PipelineJob.RuntimeConfig(
gcs_output_directory=_TEST_PIPELINE_ROOT,
parameters={
'string_param': pipeline_pb2.Value(string_value='test-string'),
'int_param': pipeline_pb2.Value(int_value=42),
'float_param': pipeline_pb2.Value(double_value=3.14)
})
# TODO(b/158245564): Reevaluate whether to keep this test helper function
def two_step_pipeline() -> tfx.dsl.Pipeline:
"""Returns a simple 2-step pipeline under test."""
example_gen = tfx.extensions.google_cloud_big_query.BigQueryExampleGen(
query='SELECT * FROM TABLE').with_beam_pipeline_args([
'--runner=DataflowRunner',
])
statistics_gen = tfx.components.StatisticsGen(
examples=example_gen.outputs['examples'])
return tfx.dsl.Pipeline(
pipeline_name=_TEST_TWO_STEP_PIPELINE_NAME,
pipeline_root=_TEST_PIPELINE_ROOT,
components=[example_gen, statistics_gen],
# Needs to set GCP project because BQ is used.
beam_pipeline_args=[
'--project=my-gcp-project',
])
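# --- Hedged usage sketch (added for illustration; not in the original file) ---
# Compiling the two-step pipeline with the Kubeflow V2 runner. The runner and
# config class names follow the tfx.orchestration.kubeflow.v2 package this
# file belongs to, but treat the exact import path, constructor arguments and
# output filename as assumptions.
def _example_compile_two_step_pipeline():
  from tfx.orchestration.kubeflow.v2 import kubeflow_v2_dag_runner
  runner = kubeflow_v2_dag_runner.KubeflowV2DagRunner(
      config=kubeflow_v2_dag_runner.KubeflowV2DagRunnerConfig(),
      output_filename='two_step_pipeline.json')
  runner.run(two_step_pipeline())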
def create_pipeline_components(
pipeline_root: str,
transform_module: str,
trainer_module: str,
bigquery_query: str = '',
csv_input_location: str = '',
) -> List[base_node.BaseNode]:
"""Creates components for a simple Chicago Taxi TFX pipeline for testing.
Args:
pipeline_root: The root of the pipeline output.
transform_module: The location of the transform module file.
trainer_module: The location of the trainer module file.
bigquery_query: The query to get input data from BigQuery. If not empty,
BigQueryExampleGen will be used.
csv_input_location: The location of the input data directory.
Returns:
A list of TFX components that constitutes an end-to-end test pipeline.
"""
if bool(bigquery_query) == bool(csv_input_location):
    raise ValueError(
        'Exactly one example gen is expected. '
        'Please provide either bigquery_query or csv_input_location.')
if bigquery_query:
example_gen = tfx.extensions.google_cloud_big_query.BigQueryExampleGen(
query=bigquery_query)
else:
example_gen = tfx.components.CsvExampleGen(input_base=csv_input_location)
statistics_gen = tfx.components.StatisticsGen(
examples=example_gen.outputs['examples'])
schema_gen = tfx.components.SchemaGen(
statistics=statistics_gen.outputs['statistics'],
infer_feature_shape=False)
example_validator = tfx.components.ExampleValidator(
statistics=statistics_gen.outputs['statistics'],
schema=schema_gen.outputs['schema'])
transform = tfx.components.Transform(
examples=example_gen.outputs['examples'],
schema=schema_gen.outputs['schema'],
module_file=transform_module)
latest_model_resolver = tfx.dsl.Resolver(
strategy_class=tfx.dsl.experimental.LatestArtifactStrategy,
model=tfx.dsl.Channel(type=tfx.types.standard_artifacts.Model)).with_id(
'Resolver.latest_model_resolver')
trainer = tfx.components.Trainer(
custom_executor_spec=executor_spec.ExecutorClassSpec(Executor),
examples=transform.outputs['transformed_examples'],
schema=schema_gen.outputs['schema'],
base_model=latest_model_resolver.outputs['model'],
transform_graph=transform.outputs['transform_graph'],
train_args=tfx.proto.TrainArgs(num_steps=10),
eval_args=tfx.proto.EvalArgs(num_steps=5),
module_file=trainer_module,
)
# Get the latest blessed model for model validation.
model_resolver = tfx.dsl.Resolver(
strategy_class=tfx.dsl.experimental.LatestBlessedModelStrategy,
model=tfx.dsl.Channel(type=tfx.types.standard_artifacts.Model),
model_blessing=tfx.dsl.Channel(
type=tfx.types.standard_artifacts.ModelBlessing)).with_id(
'Resolver.latest_blessed_model_resolver')
# Set the TFMA config for Model Evaluation and Validation.
eval_config = tfma.EvalConfig(
model_specs=[tfma.ModelSpec(signature_name='eval')],
metrics_specs=[
tfma.MetricsSpec(
metrics=[tfma.MetricConfig(class_name='ExampleCount')],
thresholds={
'binary_accuracy':
tfma.MetricThreshold(
value_threshold=tfma.GenericValueThreshold(
lower_bound={'value': 0.5}),
change_threshold=tfma.GenericChangeThreshold(
direction=tfma.MetricDirection.HIGHER_IS_BETTER,
absolute={'value': -1e-10}))
})
],
slicing_specs=[
tfma.SlicingSpec(),
tfma.SlicingSpec(feature_keys=['trip_start_hour'])
])
evaluator = tfx.components.Evaluator(
examples=example_gen.outputs['examples'],
model=trainer.outputs['model'],
baseline_model=model_resolver.outputs['model'],
eval_config=eval_config)
with conditional.Cond(evaluator.outputs['blessing'].future()
[0].custom_property('blessed') == 1):
pusher = tfx.components.Pusher(
model=trainer.outputs['model'],
push_destination=tfx.proto.PushDestination(
filesystem=tfx.proto.PushDestination.Filesystem(
base_directory=os.path.join(pipeline_root, 'model_serving'))))
return [
example_gen, statistics_gen, schema_gen, example_validator, transform,
latest_model_resolver, trainer, model_resolver, evaluator, pusher
]
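# full_taxi_pipeline below is the concrete CSV-based invocation of the helper
# above; see it for a working combination of arguments.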
# TODO(b/158245564): Reevaluate whether to keep this test helper function
def full_taxi_pipeline() -> tfx.dsl.Pipeline:
"""Returns a full taxi pipeline under test."""
pipeline_components = create_pipeline_components(
pipeline_root=_TEST_PIPELINE_ROOT,
transform_module=_TEST_MODULE_FILE_LOCATION,
trainer_module=_TEST_MODULE_FILE_LOCATION,
csv_input_location=_TEST_INPUT_DATA)
return tfx.dsl.Pipeline(
pipeline_name=_TEST_FULL_PIPELINE_NAME,
pipeline_root=_TEST_PIPELINE_ROOT,
components=pipeline_components)
class TransformerSpec(component_spec.ComponentSpec):
"""ComponentSpec for a dummy container component."""
INPUTS = {
'input1':
component_spec.ChannelParameter(
type=tfx.types.standard_artifacts.Model),
}
OUTPUTS = {
'output1':
component_spec.ChannelParameter(
type=tfx.types.standard_artifacts.Model),
}
PARAMETERS = {
'param1': component_spec.ExecutionParameter(type=str),
}
class ProducerSpec(component_spec.ComponentSpec):
INPUTS = {}
OUTPUTS = {
'output1':
component_spec.ChannelParameter(
type=tfx.types.standard_artifacts.Model),
}
PARAMETERS = {
'param1': component_spec.ExecutionParameter(type=str),
}
class DummyContainerSpecComponent(base_component.BaseComponent):
"""Dummy ContainerSpec component."""
SPEC_CLASS = TransformerSpec
EXECUTOR_SPEC = executor_specs.TemplatedExecutorContainerSpec(
image='dummy/transformer',
command=[
'transformer',
'--input1',
placeholders.InputUriPlaceholder('input1'),
'--output1',
placeholders.OutputUriPlaceholder('output1'),
'--param1',
placeholders.InputValuePlaceholder('param1'),
])
def __init__(self, input1, param1, output1, instance_name=None):
spec = TransformerSpec(
input1=input1,
output1=output1,
param1=param1,
)
super().__init__(spec=spec)
if instance_name:
self._id = '{}.{}'.format(self.__class__.__name__, instance_name)
else:
self._id = self.__class__.__name__
class DummyProducerComponent(base_component.BaseComponent):
"""Dummy producer component."""
SPEC_CLASS = ProducerSpec
EXECUTOR_SPEC = executor_specs.TemplatedExecutorContainerSpec(
image='dummy/producer',
command=[
'producer',
'--output1',
placeholders.OutputUriPlaceholder('output1'),
'--param1',
placeholders.InputValuePlaceholder('param1'),
'--wrapped-param',
placeholders.ConcatPlaceholder([
'prefix-',
placeholders.InputValuePlaceholder('param1'),
'-suffix',
]),
])
def __init__(self, param1, output1, instance_name=None):
spec = ProducerSpec(
output1=output1,
param1=param1,
)
super().__init__(spec=spec)
if instance_name:
self._id = '{}.{}'.format(self.__class__.__name__, instance_name)
else:
self._id = self.__class__.__name__
dummy_transformer_component = tfx.dsl.experimental.create_container_component(
name='DummyContainerSpecComponent',
inputs={
'input1': tfx.types.standard_artifacts.Model,
},
outputs={
'output1': tfx.types.standard_artifacts.Model,
},
parameters={
'param1': str,
},
image='dummy/transformer',
command=[
'transformer',
'--input1',
placeholders.InputUriPlaceholder('input1'),
'--output1',
placeholders.OutputUriPlaceholder('output1'),
'--param1',
placeholders.InputValuePlaceholder('param1'),
],
)
dummy_producer_component = tfx.dsl.experimental.create_container_component(
name='DummyProducerComponent',
outputs={
'output1': tfx.types.standard_artifacts.Model,
},
parameters={
'param1': str,
},
image='dummy/producer',
command=[
'producer',
'--output1',
placeholders.OutputUriPlaceholder('output1'),
'--param1',
placeholders.InputValuePlaceholder('param1'),
'--wrapped-param',
placeholders.ConcatPlaceholder([
'prefix-',
placeholders.InputValuePlaceholder('param1'),
'-suffix',
]),
],
)
dummy_producer_component_2 = tfx.dsl.experimental.create_container_component(
name='DummyProducerComponent2',
outputs={
'output1': tfx.types.standard_artifacts.Model,
},
parameters={
'param1': str,
},
image='dummy/producer',
command=[
'producer',
],
)
dummy_consumer_component = tfx.dsl.experimental.create_container_component(
name='DummyConsumerComponent',
inputs={
'input1': tfx.types.standard_artifacts.Model,
},
outputs={
'output1': tfx.types.standard_artifacts.Model,
},
parameters={
'param1': int,
},
image='dummy/consumer',
command=[
'consumer',
],
)
def pipeline_with_one_container_spec_component() -> tfx.dsl.Pipeline:
"""Pipeline with container."""
importer_task = tfx.dsl.Importer(
source_uri='some-uri',
artifact_type=tfx.types.standard_artifacts.Model,
).with_id('my_importer')
container_task = DummyContainerSpecComponent(
input1=importer_task.outputs['result'],
output1=channel_utils.as_channel([tfx.types.standard_artifacts.Model()]),
param1='value1',
)
return tfx.dsl.Pipeline(
pipeline_name='pipeline-with-container',
pipeline_root=_TEST_PIPELINE_ROOT,
components=[importer_task, container_task],
)
def pipeline_with_two_container_spec_components() -> tfx.dsl.Pipeline:
"""Pipeline with container."""
container1_task = DummyProducerComponent(
output1=channel_utils.as_channel([tfx.types.standard_artifacts.Model()]),
param1='value1',
)
container2_task = DummyContainerSpecComponent(
input1=container1_task.outputs['output1'],
output1=channel_utils.as_channel([tfx.types.standard_artifacts.Model()]),
param1='value2',
)
return tfx.dsl.Pipeline(
pipeline_name='pipeline-with-container',
pipeline_root=_TEST_PIPELINE_ROOT,
components=[container1_task, container2_task],
)
def pipeline_with_two_container_spec_components_2() -> tfx.dsl.Pipeline:
"""Pipeline with container."""
container1_task = dummy_producer_component(
output1=channel_utils.as_channel([tfx.types.standard_artifacts.Model()]),
param1='value1',
)
container2_task = dummy_transformer_component(
input1=container1_task.outputs['output1'],
output1=channel_utils.as_channel([tfx.types.standard_artifacts.Model()]),
param1='value2',
)
return tfx.dsl.Pipeline(
pipeline_name='pipeline-with-container',
pipeline_root=_TEST_PIPELINE_ROOT,
components=[container1_task, container2_task],
)
def get_proto_from_test_data(filename: str,
pb_message: message.Message) -> message.Message:
"""Helper function that gets proto from testdata."""
filepath = os.path.join(os.path.dirname(__file__), 'testdata', filename)
return tfx.utils.parse_pbtxt_file(filepath, pb_message)
def get_text_from_test_data(filename: str) -> str:
"""Helper function that gets raw string from testdata."""
filepath = os.path.join(os.path.dirname(__file__), 'testdata', filename)
return tfx.dsl.io.fileio.open(filepath, 'rb').read().decode('utf-8')
class _ProducerComponentSpec(component_spec.ComponentSpec):
"""Test component spec using AI Platform simple artifact types."""
INPUTS = {}
OUTPUTS = {
'examples':
component_spec.ChannelParameter(type=simple_artifacts.Dataset),
'external_data':
component_spec.ChannelParameter(type=simple_artifacts.File),
}
PARAMETERS = {}
class _ConsumerComponentSpec(component_spec.ComponentSpec):
"""Test component spec using AI Platform simple artifact types."""
INPUTS = {
'examples':
component_spec.ChannelParameter(type=simple_artifacts.Dataset),
'external_data':
component_spec.ChannelParameter(type=simple_artifacts.File),
}
OUTPUTS = {
'stats':
component_spec.ChannelParameter(type=simple_artifacts.Statistics),
'metrics':
component_spec.ChannelParameter(type=simple_artifacts.Metrics)
}
PARAMETERS = {}
class ProducerComponent(base_component.BaseComponent):
"""Test component used in step 1 of a 2-step pipeline testing AI Platform simple artifact types."""
SPEC_CLASS = _ProducerComponentSpec
EXECUTOR_SPEC = executor_spec.ExecutorClassSpec(
executor_class=base_executor.EmptyExecutor)
def __init__(self):
examples_channel = channel_utils.as_channel([simple_artifacts.Dataset()])
external_data_channel = channel_utils.as_channel([simple_artifacts.File()])
super().__init__(
_ProducerComponentSpec(
examples=examples_channel, external_data=external_data_channel))
class ConsumerComponent(base_component.BaseComponent):
"""Test component used in step 2 of a 2-step pipeline testing AI Platform simple artifact types."""
SPEC_CLASS = _ConsumerComponentSpec
EXECUTOR_SPEC = executor_spec.ExecutorClassSpec(
executor_class=base_executor.EmptyExecutor)
def __init__(self, examples: tfx.dsl.Channel, external_data: tfx.dsl.Channel):
stats_output_channel = channel_utils.as_channel(
[simple_artifacts.Statistics()])
metrics_output_channel = channel_utils.as_channel(
[simple_artifacts.Metrics()])
super().__init__(
_ConsumerComponentSpec(
examples=examples,
external_data=external_data,
stats=stats_output_channel,
metrics=metrics_output_channel))
def two_step_kubeflow_artifacts_pipeline() -> tfx.dsl.Pipeline:
"""Builds 2-Step pipeline to test AI Platform simple artifact types."""
step1 = ProducerComponent()
step2 = ConsumerComponent(
examples=step1.outputs['examples'],
external_data=step1.outputs['external_data'])
return tfx.dsl.Pipeline(
pipeline_name='two-step-kubeflow-artifacts-pipeline',
pipeline_root=_TEST_PIPELINE_ROOT,
components=[step1, step2],
beam_pipeline_args=[
'--project=my-gcp-project',
])
def two_step_pipeline_with_task_only_dependency() -> tfx.dsl.Pipeline:
"""Returns a simple 2-step pipeline with task only dependency between them."""
step_1 = tfx.dsl.experimental.create_container_component(
name='Step 1',
inputs={},
outputs={},
parameters={},
image='step-1-image',
command=['run', 'step-1'])()
step_2 = tfx.dsl.experimental.create_container_component(
name='Step 2',
inputs={},
outputs={},
parameters={},
image='step-2-image',
command=['run', 'step-2'])()
step_2.add_upstream_node(step_1)
return tfx.dsl.Pipeline(
pipeline_name='two-step-task-only-dependency-pipeline',
pipeline_root=_TEST_PIPELINE_ROOT,
components=[step_1, step_2],
)
primitive_producer_component = tfx.dsl.experimental.create_container_component(
name='ProducePrimitives',
outputs={
'output_string': tfx.types.standard_artifacts.String,
'output_int': tfx.types.standard_artifacts.Integer,
'output_float': tfx.types.standard_artifacts.Float,
},
image='busybox',
command=[
'produce',
placeholders.OutputUriPlaceholder('output_string'),
placeholders.OutputUriPlaceholder('output_int'),
placeholders.OutputUriPlaceholder('output_float'),
],
)
primitive_consumer_component = tfx.dsl.experimental.create_container_component(
name='ConsumeByValue',
inputs={
'input_string': tfx.types.standard_artifacts.String,
'input_int': tfx.types.standard_artifacts.Integer,
'input_float': tfx.types.standard_artifacts.Float,
},
parameters={
'param_string': str,
'param_int': int,
'param_float': float,
},
image='busybox',
command=[
'consume',
placeholders.InputValuePlaceholder('input_string'),
placeholders.InputValuePlaceholder('input_int'),
placeholders.InputValuePlaceholder('input_float'),
placeholders.InputValuePlaceholder('param_string'),
placeholders.InputValuePlaceholder('param_int'),
placeholders.InputValuePlaceholder('param_float'),
],
)
def consume_primitive_artifacts_by_value_pipeline() -> tfx.dsl.Pipeline:
"""Pipeline which features consuming artifacts by value."""
producer_task = primitive_producer_component()
consumer_task = primitive_consumer_component(
input_string=producer_task.outputs['output_string'],
input_int=producer_task.outputs['output_int'],
input_float=producer_task.outputs['output_float'],
param_string='string value',
param_int=42,
param_float=3.14,
)
return tfx.dsl.Pipeline(
pipeline_name='consume-primitive-artifacts-by-value-pipeline',
pipeline_root=_TEST_PIPELINE_ROOT,
components=[producer_task, consumer_task],
)
def pipeline_with_runtime_parameter() -> tfx.dsl.Pipeline:
"""Pipeline which contains a runtime parameter."""
producer_task = primitive_producer_component()
consumer_task = primitive_consumer_component(
input_string=producer_task.outputs['output_string'],
input_int=producer_task.outputs['output_int'],
input_float=producer_task.outputs['output_float'],
param_string=tfx.dsl.experimental.RuntimeParameter(
ptype=str, name='string_param', default='string value'),
param_int=42,
param_float=3.14,
)
return tfx.dsl.Pipeline(
pipeline_name='pipeline-with-runtime-parameter',
pipeline_root=_TEST_PIPELINE_ROOT,
components=[producer_task, consumer_task],
)
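# --- Hedged usage note (added for illustration; not in the original file) ---
# At submission time the 'string_param' RuntimeParameter above is resolved from
# the job's runtime config; TEST_RUNTIME_CONFIG at the top of this module shows
# the shape of such a config, mapping string_param -> 'test-string'.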
| [
"[email protected]"
] | |
7e17aef7c234a83dd5f625f3f79766cdd0c89a88 | ca7aa979e7059467e158830b76673f5b77a0f5a3 | /Python_codes/p02689/s473246876.py | 934535d4b4972167937a8498aa36c123addb4e8b | [] | no_license | Aasthaengg/IBMdataset | 7abb6cbcc4fb03ef5ca68ac64ba460c4a64f8901 | f33f1c5c3b16d0ea8d1f5a7d479ad288bb3f48d8 | refs/heads/main | 2023-04-22T10:22:44.763102 | 2021-05-13T17:27:22 | 2021-05-13T17:27:22 | 367,112,348 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 374 | py | n, m = map(int, input().split())
h = list(map(int, input().split()))
ab = [list(map(int, input().split())) for _ in range(m)]
# an index stays "good" until a road connects it to an equal or higher value
ans = [1] * n
for a, b in ab:
    a -= 1
    b -= 1
    if h[a] == h[b]:
        # equal heights: neither endpoint is strictly higher than the other
        ans[a] = 0
        ans[b] = 0
    elif h[a] > h[b]:
        ans[b] = 0
    else:
        ans[a] = 0
print(sum(ans))
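# Worked example (assumed to be the problem's sample input):
#   n=4 m=3, h=[1, 2, 3, 4], roads (1,3), (2,3), (2,4)
#   indices 3 and 4 are higher than all their direct neighbours -> prints 2.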
| [
"[email protected]"
] | |
1058bf0e58b27114ec41ce159f6a6412a3122981 | 2e3430eefb94fe6bc6ea8256ceadaf25bbf34e76 | /puma/helpers/testing/mixin/__init__.py | e5d5031dc92b6e294d4d8a37e566046b3bcab595 | [
"Apache-2.0"
] | permissive | gift-surg/puma | 33e08b464fe4241da512fefcab5e8909e6f1d768 | 58beae3459a0c8d96adfe9af323e26868428df4d | refs/heads/master | 2022-11-27T11:26:25.557773 | 2020-06-04T11:21:38 | 2020-06-04T11:21:38 | 198,999,143 | 1 | 0 | Apache-2.0 | 2020-07-29T16:36:50 | 2019-07-26T10:38:46 | Python | UTF-8 | Python | false | false | 173 | py | from puma.helpers.testing.mixin.not_a_test_case import NotATestCase # noqa: F401
from puma.helpers.testing.mixin.not_a_test_case_enum import NotATestCaseEnum # noqa: F401
| [
"[email protected]"
] | |
aeba0cf66cd6b63cc2cd690e704544f6c0591260 | 8541f4118c6093c84e78d768285e7007ee5f6a6c | /apps/inventory/migrations/0009_auto_20151220_1554.py | e2b8366a29c48b68bbe2d712b6d251b066c4ba96 | [] | no_license | iraycd/awecounting | c81a8ca6b7a4a942e63cf6b7d723f9883e57a107 | 388df4de63146e0a9a211afa522ec50e0f3df443 | refs/heads/master | 2021-01-15T23:30:27.439759 | 2016-03-16T10:34:40 | 2016-03-16T10:34:40 | 57,046,467 | 1 | 0 | null | 2016-04-25T14:03:40 | 2016-04-25T14:03:40 | null | UTF-8 | Python | false | false | 444 | py | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models, migrations
import njango.fields
class Migration(migrations.Migration):
dependencies = [
('inventory', '0008_auto_20151219_1547'),
]
operations = [
migrations.AlterField(
model_name='sale',
name='date',
field=njango.fields.BSDateField(default=njango.fields.today),
),
]
| [
"[email protected]"
] | |
9399c9273cd93034db22b32eba2127b884c75e9e | 4869c5e4d4b5ba6af434b62a2369ed58891c4eb0 | /addons/plugin.video.fen/resources/lib/indexers/dialogs.py | 806f73f4a98af300d972b66eed68208ef5f003b4 | [] | no_license | JohnnyBlackwater/Zephyr-mod | 1bd73a04549da83965a0979a1957ab4f98b03a6d | 2e9472793b45287b1114221f5dd1674ce886bca1 | refs/heads/master | 2023-08-31T22:54:47.827433 | 2021-11-15T03:01:01 | 2021-11-15T03:01:01 | 428,104,158 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 35,327 | py | # -*- coding: utf-8 -*-
import json
import metadata
from windows import open_window
from modules import kodi_utils
from modules import settings
from modules.source_utils import clear_and_rescrape, clear_scrapers_cache, rescrape_with_disabled, scrape_with_custom_values
from modules.nav_utils import open_settings, clear_cache, refresh_cached_data
from modules.settings_reader import get_setting, set_setting
# from modules.kodi_utils import logger
ls = kodi_utils.local_string
build_url = kodi_utils.build_url
icon = kodi_utils.translate_path('special://home/addons/plugin.video.fen/icon.png')
fanart = kodi_utils.translate_path('special://home/addons/plugin.video.fen/fanart.png')
icon_directory = 'special://home/addons/script.tikiart/resources/media/%s'
def trailer_choice(db_type, poster, tmdb_id, trailer_url, all_trailers=None):
	# `None` default avoids a shared mutable list; the falsy checks below handle it
if settings.get_language() != 'en' and not trailer_url and not all_trailers:
from apis.tmdb_api import tmdb_media_videos
try: all_trailers = tmdb_media_videos(db_type, tmdb_id)['results']
except: pass
if all_trailers:
from modules.utils import clean_file_name, to_utf8
if len(all_trailers) == 1:
video_id = all_trailers[0].get('key')
else:
items = [{'line': clean_file_name(i['name']), 'function': i['key']} for i in all_trailers]
video_id = open_window(['windows.extras', 'ShowSelectMedia'], 'select_media.xml', items=items, poster=poster)
if video_id == None: return 'canceled'
trailer_url = 'plugin://plugin.video.youtube/play/?video_id=%s' % video_id
return trailer_url
def genres_choice(db_type, genres, poster, return_genres=False):
from modules.meta_lists import movie_genres, tvshow_genres
def _process_dicts(genre_str, _dict):
final_genres_list = []
append = final_genres_list.append
for key, value in _dict.items():
if key in genre_str: append({'genre': key, 'value': value})
return final_genres_list
if db_type in ('movie', 'movies'): genre_action, meta_type, action = movie_genres, 'movie', 'tmdb_movies_genres'
else: genre_action, meta_type, action = tvshow_genres, 'tvshow', 'tmdb_tv_genres'
genre_list = _process_dicts(genres, genre_action())
if return_genres: return genre_list
if len(genre_list) == 0:
kodi_utils.notification(32760, 2500)
return None
mode = 'build_%s_list' % meta_type
items = [{'line': item['genre'], 'function': json.dumps({'mode': mode, 'action': action, 'genre_id': item['value'][0]})} for item in genre_list]
return open_window(['windows.extras', 'ShowSelectMedia'], 'select_media.xml', items=items, poster=poster)
def imdb_keywords_choice(db_type, imdb_id, poster):
from apis.imdb_api import imdb_keywords
def _builder():
for item in keywords_info:
obj = {'line': item, 'function': json.dumps({'mode': mode, 'action': 'imdb_keywords_list_contents', 'list_id': item, 'iconImage': 'imdb.png'})}
yield obj
kodi_utils.show_busy_dialog()
keywords_info = imdb_keywords(imdb_id)
if len(keywords_info) == 0:
kodi_utils.hide_busy_dialog()
kodi_utils.notification(32760, 2500)
return None
meta_type = 'movie' if db_type == 'movies' else 'tvshow'
mode = 'build_%s_list' % meta_type
items = list(_builder())
kodi_utils.hide_busy_dialog()
return open_window(['windows.extras', 'ShowSelectMedia'], 'select_media.xml', items=items, poster=poster, context_active_action='keywords')
def imdb_videos_choice(videos, poster, media=True):
try: videos = json.loads(videos)
except: pass
videos.sort(key=lambda x: x['quality_rank'])
if media:
items = [{'line': i['quality'], 'function': i['url']} for i in videos]
choice = open_window(['windows.extras', 'ShowSelectMedia'], 'select_media.xml', items=items, poster=poster)
else:
dl = [i['quality'] for i in videos]
fl = [i['url'] for i in videos]
list_items = [{'line1': item, 'icon': poster} for item in dl]
kwargs = {'items': json.dumps(list_items), 'heading': 'Fen', 'enumerate': 'false', 'multi_choice': 'false', 'multi_line': 'false'}
choice = kodi_utils.select_dialog(fl, **kwargs)
return choice
def trakt_manager_choice(params):
if not get_setting('trakt_user', ''): return kodi_utils.notification(32760, 3500)
icon = kodi_utils.translate_path('special://home/addons/script.tikiart/resources/media/trakt.png')
choices = [('%s %s...' % (ls(32602), ls(32199)), 'Add'), ('%s %s...' % (ls(32603), ls(32199)), 'Remove')]
list_items = [{'line1': item[0], 'icon': icon} for item in choices]
kwargs = {'items': json.dumps(list_items), 'heading': ls(32198).replace('[B]', '').replace('[/B]', ''), 'enumerate': 'false', 'multi_choice': 'false', 'multi_line': 'false'}
choice = kodi_utils.select_dialog([i[1] for i in choices], **kwargs)
if choice == None: return
if choice == 'Add':
from apis.trakt_api import trakt_add_to_list
trakt_add_to_list(params)
else:
from apis.trakt_api import trakt_remove_from_list
trakt_remove_from_list(params)
def playback_choice(content, meta):
items = [{'line': ls(32014), 'function': 'clear_and_rescrape'},
{'line': ls(32006), 'function': 'rescrape_with_disabled'},
{'line': ls(32135), 'function': 'scrape_with_custom_values'}]
choice = open_window(['windows.extras', 'ShowSelectMedia'], 'select_media.xml', items=items, rootname=meta['rootname'], poster=meta['poster'])
if choice == None: return
from modules.source_utils import clear_and_rescrape, rescrape_with_disabled, scrape_with_custom_values
if choice == 'clear_and_rescrape': clear_and_rescrape(content, meta)
elif choice == 'rescrape_with_disabled': rescrape_with_disabled(content, meta)
else: scrape_with_custom_values(content, meta)
def set_quality_choice(quality_setting):
include = ls(32188)
dl = ['%s SD' % include, '%s 720p' % include, '%s 1080p' % include, '%s 4K' % include]
fl = ['SD', '720p', '1080p', '4K']
try: preselect = [fl.index(i) for i in get_setting(quality_setting).split(', ')]
except: preselect = []
list_items = [{'line1': item} for item in dl]
kwargs = {'items': json.dumps(list_items), 'heading': 'Fen', 'enumerate': 'false', 'multi_choice': 'true', 'multi_line': 'false', 'preselect': preselect}
choice = kodi_utils.select_dialog(fl, **kwargs)
if choice is None: return
if choice == []:
kodi_utils.ok_dialog(text=32574, top_space=True)
return set_quality_choice(quality_setting)
set_setting(quality_setting, ', '.join(choice))
def extras_lists_choice():
screenshots_directory = 'special://home/addons/script.tikiart/resources/screenshots/extras/%s'
fl = [2051, 2052, 2053, 2054, 2055, 2056, 2057, 2058, 2059, 2060, 2061, 2062]
dl = [{'name': ls(32503), 'image': kodi_utils.translate_path(screenshots_directory % '001_recommended.jpg')},
{'name': ls(32607), 'image': kodi_utils.translate_path(screenshots_directory % '002_reviews.jpg')},
{'name': ls(32984), 'image': kodi_utils.translate_path(screenshots_directory % '003_trivia.jpg')},
{'name': ls(32986), 'image': kodi_utils.translate_path(screenshots_directory % '004_blunders.jpg')},
{'name': ls(32989), 'image': kodi_utils.translate_path(screenshots_directory % '005_parentalguide.jpg')},
{'name': ls(33032), 'image': kodi_utils.translate_path(screenshots_directory % '006_videos.jpg')},
{'name': ls(32616), 'image': kodi_utils.translate_path(screenshots_directory % '007_posters.jpg')},
{'name': ls(32617), 'image': kodi_utils.translate_path(screenshots_directory % '008_fanart.jpg')},
{'name': '%s %s' % (ls(32612), ls(32543)), 'image': kodi_utils.translate_path(screenshots_directory % '009_morefromyear.jpg')},
{'name': '%s %s' % (ls(32612), ls(32470)), 'image': kodi_utils.translate_path(screenshots_directory % '010_morefromgenres.jpg')},
{'name': '%s %s' % (ls(32612), ls(32480)), 'image': kodi_utils.translate_path(screenshots_directory % '011_morefromnetwork.jpg')},
{'name': '%s %s' % (ls(32612), ls(32499)), 'image': kodi_utils.translate_path(screenshots_directory % '012_collection.jpg')}]
try: preselect = [fl.index(i) for i in settings.extras_enabled_lists()]
except: preselect = []
kwargs = {'items': json.dumps(dl), 'preselect': preselect}
selection = open_window(('windows.extras', 'ExtrasChooser'), 'extras_chooser.xml', **kwargs)
if selection == []: return set_setting('extras.enabled_lists', 'noop')
elif selection == None: return
selection = [str(fl[i]) for i in selection]
set_setting('extras.enabled_lists', ','.join(selection))
def set_language_filter_choice(filter_setting):
from modules.meta_lists import language_choices
lang_choices = language_choices()
lang_choices.pop('None')
dl = list(lang_choices.keys())
fl = list(lang_choices.values())
try: preselect = [fl.index(i) for i in get_setting(filter_setting).split(', ')]
except: preselect = []
list_items = [{'line1': item} for item in dl]
kwargs = {'items': json.dumps(list_items), 'heading': 'Fen', 'enumerate': 'false', 'multi_choice': 'true', 'multi_line': 'false', 'preselect': preselect}
choice = kodi_utils.select_dialog(fl, **kwargs)
if choice == None: return
if choice == []:
return set_setting(filter_setting, 'eng')
set_setting(filter_setting, ', '.join(choice))
def enable_scrapers_choice():
scrapers = ['external', 'furk', 'easynews', 'rd-cloud', 'pm-cloud', 'ad-cloud', 'folders']
cloud_scrapers = {'rd-cloud': 'rd.enabled', 'pm-cloud': 'pm.enabled', 'ad-cloud': 'ad.enabled'}
scraper_names = [ls(32118).upper(), ls(32069).upper(), ls(32070).upper(), ls(32098).upper(), ls(32097).upper(), ls(32099).upper(), ls(32108).upper()]
preselect = [scrapers.index(i) for i in settings.active_scrapers(group_folders=True)]
list_items = [{'line1': item} for item in scraper_names]
kwargs = {'items': json.dumps(list_items), 'heading': 'Fen', 'enumerate': 'false', 'multi_choice': 'true', 'multi_line': 'false', 'preselect': preselect}
choice = kodi_utils.select_dialog(scrapers, **kwargs)
if choice is None: return
for i in scrapers:
set_setting('provider.%s' % i, ('true' if i in choice else 'false'))
if i in cloud_scrapers and i in choice:
set_setting(cloud_scrapers[i], 'true')
def folder_scraper_manager_choice(folder_info=None):
def _get_property(setting_id):
return kodi_utils.window.getProperty('fen_%s' % setting_id) or get_setting(setting_id)
def _set_property(setting_id, setting_value):
kodi_utils.window.setProperty('fen_%s' % setting_id, setting_value)
def _clear_property(setting_id):
kodi_utils.window.clearProperty(setting_id)
def _exit_save_settings():
for folder_no in range(1,6):
set_setting(name_setting % folder_no, _get_property(name_setting % folder_no))
set_setting(movie_dir_setting % folder_no, _get_property(movie_dir_setting % folder_no))
set_setting(tvshow_dir_setting % folder_no, _get_property(tvshow_dir_setting % folder_no))
_clear_property('fen_%s' % name_setting % folder_no)
_clear_property('fen_%s' % movie_dir_setting % folder_no)
_clear_property('fen_%s' % tvshow_dir_setting % folder_no)
def _return(passed_folder_info):
return folder_scraper_manager_choice(passed_folder_info)
def _make_folders():
return [{'number': folder_no, 'name': folder_names[folder_no], 'display_setting': name_setting % folder_no,
'movie_setting': movie_dir_setting % folder_no, 'tvshow_setting': tvshow_dir_setting % folder_no, 'display': _get_property(name_setting % folder_no),
'movie_dir': _get_property(movie_dir_setting % folder_no), 'tvshow_dir': _get_property(tvshow_dir_setting % folder_no)} \
for folder_no in range(1,6)]
def _update_folder_info():
folder_info.update({'display': _get_property(name_setting % folder_info['number']), 'movie_dir': _get_property(movie_dir_setting % folder_info['number']),
'tvshow_dir': _get_property(tvshow_dir_setting % folder_info['number'])})
def _make_listing():
return [('[B]%s[/B]: [I]%s[/I]' % (folder_name_str, folder_info['display']), folder_info['display_setting']),
('[B]%s[/B]: [I]%s[/I]' % (movie_dir_str, folder_info['movie_dir']), folder_info['movie_setting']),
('[B]%s[/B]: [I]%s[/I]' % (tv_dir_str, folder_info['tvshow_dir']), folder_info['tvshow_setting'])]
def _process_setting():
		if setting == None: return _return(None)
if 'display_name' in setting: _set_display()
else: _set_folder_path()
def _set_display():
default = folder_info['display']
folder_title = kodi_utils.dialog.input(folder_name_str, defaultt=default)
if not folder_title: folder_title = 'None'
_set_property(folder_info['display_setting'], folder_title)
_return(folder_info)
def _set_folder_path():
if _get_property(setting) not in ('', 'None'):
list_items = [{'line1': item} for item in [ls(32682), ls(32683)]]
kwargs = {'items': json.dumps(list_items), 'heading': 'Fen', 'enumerate': 'false', 'multi_choice': 'false', 'multi_line': 'false'}
action = kodi_utils.select_dialog([1, 2], **kwargs)
if action == None: _return(folder_info)
if action == 1:
_set_property(setting, 'None')
_return(folder_info)
else:
folder = kodi_utils.dialog.browse(0, 'Fen', '')
if not folder: folder = 'None'
_set_property(setting, folder)
_return(folder_info)
else:
folder = kodi_utils.dialog.browse(0, 'Fen', '')
if not folder: folder = 'None'
_set_property(setting, folder)
_return(folder_info)
try:
choose_folder_str, folder_name_str, movie_dir_str, tv_dir_str = ls(32109), ls(32115), ls(32116), ls(32117)
name_setting, movie_dir_setting, tvshow_dir_setting = 'folder%d.display_name', 'folder%d.movies_directory', 'folder%d.tv_shows_directory'
folder_names = {1: ls(32110), 2: ls(32111), 3: ls(32112), 4: ls(32113), 5: ls(32114)}
if not folder_info:
folders = _make_folders()
list_items = [{'line1': '%s: [I]%s[/I]' % (item['name'], item['display'])} for item in folders]
kwargs = {'items': json.dumps(list_items), 'heading': choose_folder_str, 'enumerate': 'false', 'multi_choice': 'false', 'multi_line': 'false'}
folder_info = kodi_utils.select_dialog(folders, **kwargs)
if folder_info == None: return _exit_save_settings()
else: _update_folder_info()
listing = _make_listing()
list_items = [{'line1': item[0]} for item in listing]
kwargs = {'items': json.dumps(list_items), 'heading': 'Fen', 'enumerate': 'false', 'multi_choice': 'false', 'multi_line': 'false'}
setting = kodi_utils.select_dialog([i[1] for i in listing], **kwargs)
_process_setting()
	except Exception:
return
def results_sorting_choice():
quality, provider, size = ls(32241), ls(32583), ls(32584)
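	# each tuple pairs a display label with the index stored in
	# 'results.sort_order'; the six entries are the permutations of
	# quality / provider / size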
choices = [('%s, %s, %s' % (quality, provider, size), '0'), ('%s, %s, %s' % (quality, size, provider), '1'), ('%s, %s, %s' % (provider, quality, size), '2'),
('%s, %s, %s' % (provider, size, quality), '3'), ('%s, %s, %s' % (size, quality, provider), '4'), ('%s, %s, %s' % (size, provider, quality), '5')]
list_items = [{'line1': item[0]} for item in choices]
kwargs = {'items': json.dumps(list_items), 'heading': 'Fen', 'enumerate': 'false', 'multi_choice': 'false', 'multi_line': 'false'}
choice = kodi_utils.select_dialog(choices, **kwargs)
if choice:
set_setting('results.sort_order_display', choice[0])
set_setting('results.sort_order', choice[1])
def results_highlights_choice():
choices = ((ls(32240), '0'), (ls(32583), '1'), (ls(32241), '2'))
list_items = [{'line1': item[0]} for item in choices]
kwargs = {'items': json.dumps(list_items), 'heading': 'Fen', 'enumerate': 'false', 'multi_choice': 'false', 'multi_line': 'false'}
choice = kodi_utils.select_dialog([i[1] for i in choices], **kwargs)
if choice: return set_setting('highlight.type', choice)
def results_layout_choice():
screenshots_directory = 'special://home/addons/script.tikiart/resources/screenshots/results/%s'
xml_choices = [
('List Default', kodi_utils.translate_path(screenshots_directory % 'source_results_list.default.jpg')),
('List Contrast Default', kodi_utils.translate_path(screenshots_directory % 'source_results_list.contrast.default.jpg')),
('List Details', kodi_utils.translate_path(screenshots_directory % 'source_results_list.details.jpg')),
('List Contrast Details', kodi_utils.translate_path(screenshots_directory % 'source_results_list.contrast.details.jpg')),
('InfoList Default', kodi_utils.translate_path(screenshots_directory % 'source_results_infolist.default.jpg')),
('InfoList Contrast Default', kodi_utils.translate_path(screenshots_directory % 'source_results_infolist.contrast.default.jpg')),
('InfoList Details', kodi_utils.translate_path(screenshots_directory % 'source_results_infolist.details.jpg')),
('InfoList Contrast Details', kodi_utils.translate_path(screenshots_directory % 'source_results_infolist.contrast.details.jpg')),
('Shift Default', kodi_utils.translate_path(screenshots_directory % 'source_results_shift.default.jpg')),
('Shift Contrast Default', kodi_utils.translate_path(screenshots_directory % 'source_results_shift.contrast.default.jpg')),
('Shift Details', kodi_utils.translate_path(screenshots_directory % 'source_results_shift.details.jpg')),
('Shift Contrast Details', kodi_utils.translate_path(screenshots_directory % 'source_results_shift.contrast.details.jpg')),
('Thumb Default', kodi_utils.translate_path(screenshots_directory % 'source_results_thumb.default.jpg')),
('Thumb Contrast Default', kodi_utils.translate_path(screenshots_directory % 'source_results_thumb.contrast.default.jpg')),
('Thumb Details', kodi_utils.translate_path(screenshots_directory % 'source_results_thumb.details.jpg')),
('Thumb Contrast Details', kodi_utils.translate_path(screenshots_directory % 'source_results_thumb.contrast.details.jpg'))
]
choice = open_window(['windows.sources', 'SourceResultsChooser'], 'sources_chooser.xml', xml_choices=xml_choices)
if choice: set_setting('results.xml_style', choice)
def set_subtitle_choice():
choices = ((ls(32192), '0'), (ls(32193), '1'), (ls(32027), '2'))
list_items = [{'line1': item[0]} for item in choices]
kwargs = {'items': json.dumps(list_items), 'heading': 'Fen', 'enumerate': 'false', 'multi_choice': 'false', 'multi_line': 'false'}
choice = kodi_utils.select_dialog([i[1] for i in choices], **kwargs)
if choice: return set_setting('subtitles.subs_action', choice)
def scraper_dialog_color_choice(setting):
	setting = 'int_dialog_highlight' if setting == 'internal' else 'ext_dialog_highlight'
chosen_color = color_chooser('Fen')
if chosen_color: set_setting(setting, chosen_color)
def scraper_quality_color_choice(setting):
chosen_color = color_chooser('Fen')
if chosen_color: set_setting(setting, chosen_color)
def scraper_color_choice(setting):
choices = [('furk', 'provider.furk_colour'),
('easynews', 'provider.easynews_colour'),
('debrid_cloud', 'provider.debrid_cloud_colour'),
('folders', 'provider.folders_colour'),
('hoster', 'hoster.identify'),
('torrent', 'torrent.identify'),
('rd', 'provider.rd_colour'),
('pm', 'provider.pm_colour'),
('ad', 'provider.ad_colour'),
('free', 'provider.free_colour')]
setting = [i[1] for i in choices if i[0] == setting][0]
chosen_color = color_chooser('Fen')
if chosen_color: set_setting(setting, chosen_color)
def external_scrapers_manager():
icon = kodi_utils.ext_addon('script.module.fenomscrapers').getAddonInfo('icon')
all_color, hosters_color, torrent_color = 'mediumvioletred', get_setting('hoster.identify'), get_setting('torrent.identify')
enable_string, disable_string, specific_string, all_string = ls(32055), ls(32024), ls(32536), ls(32525)
scrapers_string, hosters_string, torrent_string = ls(32533), ls(33031), ls(32535)
fs_default_string = ls(32137)
all_scrapers_string = '%s %s' % (all_string, scrapers_string)
hosters_scrapers_string = '%s %s' % (hosters_string, scrapers_string)
torrent_scrapers_string = '%s %s' % (torrent_string, scrapers_string)
enable_string_base = '%s %s %s %s' % (enable_string, all_string, '%s', scrapers_string)
disable_string_base = '%s %s %s %s' % (disable_string, all_string, '%s', scrapers_string)
enable_disable_string_base = '%s/%s %s %s %s' % (enable_string, disable_string, specific_string, '%s', scrapers_string)
all_scrapers_base = '[COLOR %s]%s [/COLOR]' % (all_color, all_scrapers_string.upper())
debrid_scrapers_base = '[COLOR %s]%s [/COLOR]' % (hosters_color, hosters_scrapers_string.upper())
torrent_scrapers_base = '[COLOR %s]%s [/COLOR]' % (torrent_color, torrent_scrapers_string.upper())
tools_menu = \
[(all_scrapers_base, fs_default_string, {'mode': 'set_default_scrapers'}),
(all_scrapers_base, enable_string_base % '', {'mode': 'toggle_all', 'folder': 'all', 'setting': 'true'}),
(all_scrapers_base, disable_string_base % '', {'mode': 'toggle_all', 'folder': 'all', 'setting': 'false'}),
(all_scrapers_base, enable_disable_string_base % '',{'mode': 'enable_disable_specific_all', 'folder': 'all'}),
(debrid_scrapers_base, enable_string_base % hosters_string, {'mode': 'toggle_all', 'folder': 'hosters', 'setting': 'true'}),
(debrid_scrapers_base, disable_string_base % hosters_string, {'mode': 'toggle_all', 'folder': 'hosters', 'setting': 'false'}),
(debrid_scrapers_base, enable_disable_string_base % hosters_string, {'mode': 'enable_disable_specific_all', 'folder': 'hosters'}),
(torrent_scrapers_base, enable_string_base % torrent_string, {'mode': 'toggle_all', 'folder': 'torrents', 'setting': 'true'}),
(torrent_scrapers_base, disable_string_base % torrent_string, {'mode': 'toggle_all', 'folder': 'torrents', 'setting': 'false'}),
(torrent_scrapers_base, enable_disable_string_base % torrent_string, {'mode': 'enable_disable_specific_all', 'folder': 'torrents'})]
list_items = [{'line1': item[0], 'line2': item[1], 'icon': icon} for item in tools_menu]
kwargs = {'items': json.dumps(list_items), 'heading': 'Fen', 'enumerate': 'false', 'multi_choice': 'false', 'multi_line': 'true'}
chosen_tool = kodi_utils.select_dialog(tools_menu, **kwargs)
if chosen_tool == None: return
from modules import source_utils
params = chosen_tool[2]
mode = params['mode']
if mode == 'toggle_all':
source_utils.toggle_all(params['folder'], params['setting'])
elif mode == 'enable_disable_specific_all':
source_utils.enable_disable_specific_all(params['folder'])
elif mode == 'set_default_scrapers':
source_utils.set_default_scrapers()
kodi_utils.sleep(350)
return external_scrapers_manager()
def meta_language_choice():
from modules.meta_lists import meta_lang_choices
langs = meta_lang_choices()
list_items = [{'line1': i['name']} for i in langs]
kwargs = {'items': json.dumps(list_items), 'heading': ls(32145), 'enumerate': 'false', 'multi_choice': 'false', 'multi_line': 'false'}
list_choose = kodi_utils.select_dialog(langs, **kwargs)
if list_choose == None: return None
from caches.meta_cache import delete_meta_cache
chosen_language = list_choose['iso']
chosen_language_display = list_choose['name']
set_setting('meta_language', chosen_language)
set_setting('meta_language_display', chosen_language_display)
delete_meta_cache(silent=True)
def favorites_choice(params):
from modules.favourites import Favourites
favourites = Favourites(params)
db_type = params['db_type']
tmdb_id = params['tmdb_id']
title = params['title']
current_favourites = favourites.get_favourites(db_type)
if any(i['tmdb_id'] == tmdb_id for i in current_favourites):
action = favourites.remove_from_favourites
text = '%s Fen %s?' % (ls(32603), ls(32453))
else:
action = favourites.add_to_favourites
text = '%s Fen %s?' % (ls(32602), ls(32453))
if not kodi_utils.confirm_dialog(heading='Fen - %s' % title, text=text, top_space=True): return
action()
def options_menu(params, meta=None):
def _builder():
for item in listing:
line2 = item[1]
if line2 == '': line2 = item[0]
yield {'line1': item[0], 'line2': line2}
content = params.get('content', None)
if not content: content = kodi_utils.container_content()[:-1]
season = params.get('season', None)
episode = params.get('episode', None)
if not meta:
function = metadata.movie_meta if content == 'movie' else metadata.tvshow_meta
meta_user_info = metadata.retrieve_user_info()
meta = function('tmdb_id', params['tmdb_id'], meta_user_info)
watched_indicators = settings.watched_indicators()
on_str, off_str, currently_str, open_str, settings_str = ls(32090), ls(32027), ls(32598), ls(32641), ls(32247)
autoplay_status, autoplay_toggle, quality_setting = (on_str, 'false', 'autoplay_quality_%s' % content) if settings.auto_play(content) \
else (off_str, 'true', 'results_quality_%s' % content)
quality_filter_setting = 'autoplay_quality_%s' % content if autoplay_status == on_str else 'results_quality_%s' % content
autoplay_next_status, autoplay_next_toggle = (on_str, 'false') if settings.autoplay_next_episode() else (off_str, 'true')
results_xml_style_status = get_setting('results.xml_style', 'Default')
results_filter_ignore_status, results_filter_ignore_toggle = (on_str, 'false') if settings.ignore_results_filter() else (off_str, 'true')
results_sorting_status = get_setting('results.sort_order_display').replace('$ADDON[plugin.video.fen 32582]', ls(32582))
current_results_highlights_action = get_setting('highlight.type')
results_highlights_status = ls(32240) if current_results_highlights_action == '0' else ls(32583) if current_results_highlights_action == '1' else ls(32241)
current_subs_action = get_setting('subtitles.subs_action')
current_subs_action_status = 'Auto' if current_subs_action == '0' else ls(32193) if current_subs_action == '1' else off_str
active_scrapers = [i.replace('-', '') for i in settings.active_scrapers(group_folders=True)]
current_scrapers_status = ', '.join([i for i in active_scrapers]) if len(active_scrapers) > 0 else 'N/A'
current_quality_status = ', '.join(settings.quality_filter(quality_setting))
uncached_torrents_status, uncached_torrents_toggle = (on_str, 'false') if settings.display_uncached_torrents() else (off_str, 'true')
listing = []
base_str1 = '%s%s'
base_str2 = '%s: [B]%s[/B]' % (currently_str, '%s')
if content in ('movie', 'episode'):
multi_line = 'true'
listing += [(ls(32014), '', 'clear_and_rescrape')]
listing += [(ls(32006), '', 'rescrape_with_disabled')]
listing += [(ls(32135), '', 'scrape_with_custom_values')]
listing += [(base_str1 % (ls(32175), ' (%s)' % content), base_str2 % autoplay_status, 'toggle_autoplay')]
if autoplay_status == on_str and content == 'episode':
listing += [(base_str1 % (ls(32178), ''), base_str2 % autoplay_next_status, 'toggle_autoplay_next')]
listing += [(base_str1 % (ls(32105), ' (%s)' % content), base_str2 % current_quality_status, 'set_quality')]
listing += [(base_str1 % ('', '%s %s' % (ls(32055), ls(32533))), base_str2 % current_scrapers_status, 'enable_scrapers')]
if autoplay_status == off_str:
listing += [(base_str1 % ('', ls(32140)), base_str2 % results_xml_style_status, 'set_results_xml_display')]
listing += [(base_str1 % ('', ls(32151)), base_str2 % results_sorting_status, 'set_results_sorting')]
listing += [(base_str1 % ('', ls(32138)), base_str2 % results_highlights_status, 'set_results_highlights')]
listing += [(base_str1 % ('', ls(32686)), base_str2 % results_filter_ignore_status, 'set_results_filter_ignore')]
listing += [(base_str1 % ('', ls(32183)), base_str2 % current_subs_action_status, 'set_subs_action')]
if 'external' in active_scrapers:
listing += [(base_str1 % ('', ls(32160)), base_str2 % uncached_torrents_status, 'toggle_torrents_display_uncached')]
else: multi_line = 'false'
listing += [(ls(32046), '', 'extras_lists_choice')]
if content in ('movie', 'tvshow') and meta: listing += [(ls(32604) % (ls(32028) if meta['mediatype'] == 'movie' else ls(32029)), '', 'clear_media_cache')]
if watched_indicators == 1: listing += [((ls(32497) % ls(32037)), 'Clear Trakt Cache', 'clear_trakt_cache')]
if content in ('movie', 'episode'): listing += [(ls(32637), '', 'clear_scrapers_cache')]
listing += [('%s %s' % (ls(32118), ls(32513)), '', 'open_external_scrapers_manager')]
listing += [('%s %s %s' % (open_str, ls(32522), settings_str), '', 'open_scraper_settings')]
listing += [('%s %s %s' % (open_str, ls(32036), settings_str), '', 'open_fen_settings')]
listing += [(ls(32640), '', 'save_and_exit')]
list_items = list(_builder())
heading = ls(32646).replace('[B]', '').replace('[/B]', '')
kwargs = {'items': json.dumps(list_items), 'heading': heading, 'enumerate': 'false', 'multi_choice': 'false', 'multi_line': multi_line}
choice = kodi_utils.select_dialog([i[2] for i in listing], **kwargs)
if choice in (None, 'save_and_exit'): return
elif choice == 'clear_and_rescrape': return clear_and_rescrape(content, meta, season, episode)
elif choice == 'rescrape_with_disabled': return rescrape_with_disabled(content, meta, season, episode)
elif choice == 'scrape_with_custom_values': return scrape_with_custom_values(content, meta, season, episode)
elif choice == 'toggle_autoplay': set_setting('auto_play_%s' % content, autoplay_toggle)
elif choice == 'toggle_autoplay_next': set_setting('autoplay_next_episode', autoplay_next_toggle)
elif choice == 'enable_scrapers': enable_scrapers_choice()
elif choice == 'set_results_xml_display': results_layout_choice()
elif choice == 'set_results_sorting': results_sorting_choice()
elif choice == 'set_results_filter_ignore': set_setting('ignore_results_filter', results_filter_ignore_toggle)
elif choice == 'set_results_highlights': results_highlights_choice()
elif choice == 'set_quality': set_quality_choice(quality_filter_setting)
elif choice == 'set_subs_action': set_subtitle_choice()
elif choice == 'extras_lists_choice': extras_lists_choice()
elif choice == 'clear_media_cache': return refresh_cached_data(meta['mediatype'], 'tmdb_id', meta['tmdb_id'], meta['tvdb_id'], settings.get_language())
elif choice == 'toggle_torrents_display_uncached': set_setting('torrent.display.uncached', uncached_torrents_toggle)
elif choice == 'clear_trakt_cache': return clear_cache('trakt')
elif choice == 'clear_scrapers_cache': return clear_scrapers_cache()
elif choice == 'open_external_scrapers_manager': return external_scrapers_manager()
elif choice == 'open_scraper_settings': return kodi_utils.execute_builtin('Addon.OpenSettings(script.module.fenomscrapers)')
elif choice == 'open_fen_settings': return open_settings('0.0')
if choice == 'clear_trakt_cache' and content in ('movie', 'tvshow', 'season', 'episode'): kodi_utils.execute_builtin('Container.Refresh')
kodi_utils.show_busy_dialog()
kodi_utils.sleep(200)
kodi_utils.hide_busy_dialog()
options_menu(params, meta=meta)
def extras_menu(params):
function = metadata.movie_meta if params['db_type'] == 'movie' else metadata.tvshow_meta
meta_user_info = metadata.retrieve_user_info(kodi_utils.window)
meta = function('tmdb_id', params['tmdb_id'], meta_user_info)
open_window(['windows.extras', 'Extras'], 'extras.xml', meta=meta, is_widget=params.get('is_widget', 'false'), is_home=params.get('is_home', 'false'))
def media_extra_info(media_type, meta):
extra_info = meta.get('extra_info', None)
body = []
append = body.append
tagline_str, premiered_str, rating_str, votes_str, runtime_str = ls(32619), ls(32620), ls(32621), ls(32623), ls(32622)
genres_str, budget_str, revenue_str, director_str, writer_str = ls(32624), ls(32625), ls(32626), ls(32627), ls(32628)
studio_str, collection_str, homepage_str, status_str, type_str, classification_str = ls(32615), ls(32499), ls(32629), ls(32630), ls(32631), ls(32632)
network_str, created_by_str, last_aired_str, next_aired_str, seasons_str, episodes_str = ls(32480), ls(32633), ls(32634), ls(32635), ls(32636), ls(32506)
try:
if media_type == 'movie':
if 'tagline' in meta and meta['tagline']: append('[B]%s:[/B] %s' % (tagline_str, meta['tagline']))
if 'alternative_titles' in meta and meta['alternative_titles']: append('[B]%s:[/B] %s' % ('Aliases', ', '.join(meta['alternative_titles'])))
if 'status' in extra_info: append('[B]%s:[/B] %s' % (status_str, extra_info['status']))
append('[B]%s:[/B] %s' % (premiered_str, meta['premiered']))
append('[B]%s:[/B] %s (%s %s)' % (rating_str, meta['rating'], meta['votes'], votes_str))
append('[B]%s:[/B] %d mins' % (runtime_str, int(float(meta['duration'])/60)))
append('[B]%s:[/B] %s' % (genres_str, meta['genre']))
if 'budget' in extra_info: append('[B]%s:[/B] %s' % (budget_str, extra_info['budget']))
if 'revenue' in extra_info: append('[B]%s:[/B] %s' % (revenue_str, extra_info['revenue']))
append('[B]%s:[/B] %s' % (director_str, meta['director']))
append('[B]%s:[/B] %s' % (writer_str, meta['writer'] or 'N/A'))
append('[B]%s:[/B] %s' % (studio_str, meta['studio'] or 'N/A'))
if extra_info.get('collection_name'): append('[B]%s:[/B] %s' % (collection_str, extra_info['collection_name']))
if extra_info.get('homepage'): append('[B]%s:[/B] %s' % (homepage_str, extra_info['homepage']))
else:
if 'type' in extra_info: append('[B]%s:[/B] %s' % (type_str, extra_info['type']))
if 'alternative_titles' in meta and meta['alternative_titles']: append('[B]%s:[/B] %s' % ('Aliases', ', '.join(meta['alternative_titles'])))
if 'status' in extra_info: append('[B]%s:[/B] %s' % (status_str, extra_info['status']))
append('[B]%s:[/B] %s' % (premiered_str, meta['premiered']))
append('[B]%s:[/B] %s (%s %s)' % (rating_str, meta['rating'], meta['votes'], votes_str))
append('[B]%s:[/B] %d mins' % (runtime_str, int(float(meta['duration'])/60)))
append('[B]%s:[/B] %s' % (classification_str, meta['mpaa']))
append('[B]%s:[/B] %s' % (genres_str, meta['genre']))
append('[B]%s:[/B] %s' % (network_str, meta['studio']))
if 'created_by' in extra_info: append('[B]%s:[/B] %s' % (created_by_str, extra_info['created_by']))
if extra_info.get('last_episode_to_air', False):
last_ep = extra_info['last_episode_to_air']
lastep_str = '[%s] S%.2dE%.2d - %s' % (last_ep['air_date'], last_ep['season_number'], last_ep['episode_number'], last_ep['name'])
append('[B]%s:[/B] %s' % (last_aired_str, lastep_str))
if extra_info.get('next_episode_to_air', False):
next_ep = extra_info['next_episode_to_air']
nextep_str = '[%s] S%.2dE%.2d - %s' % (next_ep['air_date'], next_ep['season_number'], next_ep['episode_number'], next_ep['name'])
append('[B]%s:[/B] %s' % (next_aired_str, nextep_str))
append('[B]%s:[/B] %s' % (seasons_str, meta['total_seasons']))
append('[B]%s:[/B] %s' % (episodes_str, meta['total_aired_eps']))
if 'homepage' in extra_info: append('[B]%s:[/B] %s' % (homepage_str, extra_info['homepage']))
except: return kodi_utils.notification(32574, 2000)
return '\n\n'.join(body)
def color_chooser(msg_dialog, no_color=False):
from modules.meta_lists import meta_color_choices
color_chart = meta_color_choices()
color_display = ['[COLOR=%s]%s[/COLOR]' % (i, i.capitalize()) for i in color_chart]
if no_color:
color_chart.insert(0, 'No Color')
color_display.insert(0, 'No Color')
list_items = [{'line1': item} for item in color_display]
kwargs = {'items': json.dumps(list_items), 'heading': 'Fen', 'enumerate': 'false', 'multi_choice': 'false', 'multi_line': 'false'}
choice = kodi_utils.select_dialog(color_chart, **kwargs)
if choice == None: return
return choice
| [
"[email protected]"
] | |
5b5451db8c50ff61a68abf0e926e2717d703c310 | 5e381364c2ab31ff3618369085afffba6caa8edb | /recipes/ghc-filesystem/all/conanfile.py | a6f41d56f7d61a30937802fdd0e2b28998c42def | [
"MIT"
] | permissive | CAMOBAP/conan-center-index | 16aea68a6d22da22831ba985773125e8eda08f00 | 67d57532bdad549fef3fa6cb8fcdfa86bc55e4f1 | refs/heads/master | 2023-07-30T08:58:57.285571 | 2021-10-02T14:57:54 | 2021-10-02T14:57:54 | 323,262,699 | 1 | 0 | MIT | 2021-05-29T13:37:04 | 2020-12-21T07:30:02 | Python | UTF-8 | Python | false | false | 1,846 | py | import os
from conans import ConanFile, CMake, tools
class GhcFilesystemRecipe(ConanFile):
name = "ghc-filesystem"
description = "A header-only single-file std::filesystem compatible helper library"
topics = ("conan", "ghc-filesystem", "header-only", "filesystem")
homepage = "https://github.com/gulrak/filesystem"
url = "https://github.com/conan-io/conan-center-index"
license = "MIT"
generators = "cmake"
no_copy_source = True
_cmake = None
@property
def _source_subfolder(self):
return "source_subfolder"
def source(self):
tools.get(**self.conan_data["sources"][self.version])
extracted_dir = "filesystem-" + self.version
os.rename(extracted_dir, self._source_subfolder)
def _configure_cmake(self):
if self._cmake:
return self._cmake
self._cmake = CMake(self)
self._cmake.definitions["GHC_FILESYSTEM_BUILD_TESTING"] = False
self._cmake.definitions["GHC_FILESYSTEM_BUILD_EXAMPLES"] = False
self._cmake.definitions["GHC_FILESYSTEM_WITH_INSTALL"] = True
self._cmake.configure(source_folder=self._source_subfolder)
return self._cmake
def package(self):
self.copy(pattern="LICENSE", dst="licenses", src=self._source_subfolder)
cmake = self._configure_cmake()
cmake.install()
tools.rmdir(os.path.join(self.package_folder, "lib"))
def package_id(self):
self.info.header_only()
def package_info(self):
self.cpp_info.names["cmake_find_package"] = "ghcFilesystem"
self.cpp_info.names["cmake_find_package_multi"] = "ghcFilesystem"
self.cpp_info.components["filesystem"].names["cmake_find_package"] = "ghc_filesystem"
self.cpp_info.components["filesystem"].names["cmake_find_package_multi"] = "ghc_filesystem"
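    # A minimal consumer sketch (the version below is assumed, not taken from
    # this recipe): a downstream conanfile would declare something like
    #   requires = "ghc-filesystem/1.5.8"
    # and link the CMake target ghcFilesystem::ghc_filesystem exposed above.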
| [
"[email protected]"
] | |
183611ac29eb22fef0ace8fc886bf34f6a42b118 | 6923f79f1eaaba0ab28b25337ba6cb56be97d32d | /Non_Linear_Finite_Element_Analysis_of_Solids_and_Structures_Borst/pyfem-1.0/pyfem/solvers/Solver.py | 2f5f42f519614cbfcae51e0f61b1ad62fe84b778 | [] | no_license | burakbayramli/books | 9fe7ba0cabf06e113eb125d62fe16d4946f4a4f0 | 5e9a0e03aa7ddf5e5ddf89943ccc68d94b539e95 | refs/heads/master | 2023-08-17T05:31:08.885134 | 2023-08-14T10:05:37 | 2023-08-14T10:05:37 | 72,460,321 | 223 | 174 | null | 2022-10-24T12:15:06 | 2016-10-31T17:24:00 | Jupyter Notebook | UTF-8 | Python | false | false | 2,684 | py | ############################################################################
# This Python file is part of PyFEM-1.0, released on Aug. 29, 2012. #
# The PyFEM code accompanies the book: #
# #
# 'Non-Linear Finite Element Analysis of Solids and Structures' #
# R. de Borst, M.A. Crisfield, J.J.C. Remmers and C.V. Verhoosel #
# John Wiley and Sons, 2012, ISBN 978-0470666449 #
# #
# The code is written by J.J.C. Remmers, C.V. Verhoosel and R. de Borst. #
# Comments and suggestions can be sent to: #
# [email protected] #
# #
# The latest version can be downloaded from the web-site: #
# http://www.wiley.com/go/deborst #
# #
# The code is open source and intended for educational and scientific #
# purposes only. If you use PyFEM in your research, the developers would #
# be grateful if you could cite the book. #
# #
# Disclaimer: #
# The authors reserve all rights but do not guarantee that the code is #
# free from errors. Furthermore, the authors shall not be liable in any #
# event caused by the use of the program. #
############################################################################
#------------------------------------------------------------------------------
#
#------------------------------------------------------------------------------
class Solver:
def __init__( self , props , globdat ):
solverProps = getattr( props, "solver" )
solverType = solverProps.type
exec "from pyfem.solvers."+solverType+" import "+solverType
props.currentModule = "solver"
self.solver = eval(solverType+"( props , globdat )")
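    # Note: the exec/eval lines above are Python 2 statement syntax. A Python 3
    # compatible sketch of the same dynamic dispatch would be:
    #   module = __import__("pyfem.solvers." + solverType, fromlist=[solverType])
    #   self.solver = getattr(module, solverType)(props, globdat)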
#------------------------------------------------------------------------------
#
#------------------------------------------------------------------------------
def run( self , props , globdat ):
self.solver.run( props , globdat )
| [
"[email protected]"
] | |
5b80fd002f0f429389f82658da7f4196dab59f60 | 9640f0b9a51ead702e1fc70c2571c116893f5046 | /products/migrations/0004_auto_20201006_2154.py | 04bb2798e01b633ba64b7ddc57eb2292697db4c1 | [] | no_license | CODE-Easyy/optic-backend | d7f46194d91fa25e1d54c4a9cb88faac9ca57ba9 | ed885265125b9d95be9467d95308964d25551230 | refs/heads/main | 2023-01-07T04:10:00.746112 | 2020-10-31T01:00:13 | 2020-10-31T01:00:13 | 302,156,695 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 1,255 | py | # Generated by Django 3.1.2 on 2020-10-06 21:54
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('products', '0003_auto_20201006_2115'),
]
operations = [
migrations.RemoveField(
model_name='brand',
name='cat',
),
migrations.RemoveField(
model_name='brand',
name='subcat',
),
migrations.RemoveField(
model_name='material',
name='cat',
),
migrations.RemoveField(
model_name='material',
name='subcat',
),
migrations.RemoveField(
model_name='opticalpower',
name='cat',
),
migrations.RemoveField(
model_name='opticalpower',
name='subcat',
),
migrations.RemoveField(
model_name='radius',
name='cat',
),
migrations.RemoveField(
model_name='radius',
name='subcat',
),
migrations.RemoveField(
model_name='volume',
name='cat',
),
migrations.RemoveField(
model_name='volume',
name='subcat',
),
]
| [
"[email protected]"
] | |
646d4b249fae7d33c288ffbadc7669cfe7b0b787 | aa265e03e73f718d4008cfe30ada7ee32c852eec | /ABC_B/ABC158_B.py | 72d7ebebec13e3477adb87e193dac338d8e588da | [
"MIT"
] | permissive | ryosuke0825/atcoder_python | 4fb9de9733cd9ef41c2ad9ad38b3f190f49d3ad5 | 52d037d0bc9ef2c721bf2958c1c2ead558cb0cf5 | refs/heads/master | 2023-03-11T22:47:56.963089 | 2023-03-05T01:21:06 | 2023-03-05T01:21:06 | 181,768,029 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 114 | py | n, a, b = map(int, input().split())
ans = n//(a+b)*a
amari = n-(a+b)*(n//(a+b))
ans += min(amari, a)
print(ans)
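# Why this works: the first n items follow a repeating pattern of a "hits"
# followed by b "misses", so each of the n//(a+b) full cycles contributes a,
# and the leftover amari items contribute min(amari, a) more.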
| [
"[email protected]"
] | |
2ed9c19851b9bc62ae4c409439f33728ff16f76a | 8692807f1dfa8c18c61df07cfafbbd27d4e66fba | /previous_problems/BEGINNER/CHN09.sol.py | a30740f15472fbe4e2738dd1ba776740691bc28d | [] | no_license | sharmakajal0/codechef_problems | 00381e9bf1996b859e46f087c2ffafd9d7a10ef1 | 0b979029e0a821f47fbdd6f9c624daee785a02e7 | refs/heads/master | 2020-05-29T15:04:40.459979 | 2020-03-29T08:44:53 | 2020-03-29T08:44:53 | 189,212,028 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 326 | py | #!/usr/bin/env python
for _ in range(int(input())):
s = str(input())
amber = 0
brass = 0
for i in s:
if i == 'a':
amber += 1
elif i == 'b':
brass += 1
if amber > brass:
print(brass)
elif brass > amber:
print(amber)
else:
print(amber) | [
"[email protected]"
] | |
11a635d63cb3cc6ec4f23bc2d39a60be2dea79e2 | e37f5df6d380e9395e2433558be05090c2e2cd72 | /tspec_mapping/make_index_from_field.py | 8384803f596600e08a0ab8f71024bd3eefc7a835 | [] | no_license | keflavich/tspec_mapping | c8bcff914095cb3bc2bfc9dccdcbe7897e5399c7 | b61e561235a5c3c104cb3a59e016fc252dbd5845 | refs/heads/master | 2021-01-18T14:15:14.241892 | 2015-02-13T06:22:29 | 2015-02-13T06:22:29 | 7,315,394 | 0 | 0 | null | 2013-09-28T15:54:17 | 2012-12-25T07:32:22 | Python | UTF-8 | Python | false | false | 5,970 | py | import astroquery.irsa
import astrometry
import astropy.io.fits
import numpy as np
import atpy
import os
def make_index_from_table(table,fieldname,fov=None,clobber=False,**kwargs):
"""
Given a table with RA and Dec columns (case-sensitive!), build an astrometry.net
quad index
Parameters
----------
table : Table
astropy.io.table or atpy.table instance (recarrays require different cleaning operations)
fieldname : str
Seed name for the output catalog file and index file
clobber : bool
Overwrite existing fits files / indices?
kwargs :
Are passed to astrometry.build_index
"""
#fitstable = astropy.io.fits.BinTableHDU(data=table)
newtable = atpy.Table(name=fieldname)
for colname in table.dtype.names:
newtable.add_column(colname, table[colname])
# sanitize fieldname
fieldname = fieldname.replace(" ","_") # what other chars should I be careful of?
#fitstable.writeto(fieldname+".fits",clobber=clobber)
newtable.write(fieldname+".fits",overwrite=clobber)
if fov is None:
# guess the FOV... sort of
rarange = (np.max(table['RA']) - np.min(table['RA']))*3600
decrange = (np.max(table['Dec'])-np.min(table['Dec']))*3600
fov = (rarange+decrange)/2.
return make_index_from_fitstable(fieldname+'.fits',fieldname,fov=fov,**kwargs)
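# A minimal usage sketch (catalog path and FOV below are assumed): any record
# array with 'RA' and 'Dec' columns works, e.g.
#   tbl = atpy.Table('my_catalog.fits')
#   make_index_from_table(tbl._data, 'MyField', fov=600, scan_catalog=True)
# (the ._data access mirrors what _clean_table below does with atpy tables)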
def make_index_from_fitstable(fitstablename, fieldname=None, fov=None, preset_list=None, **kwargs):
"""
Build an index from a FITS table already on disk (very thin wrapper of build_index)
Parameters
----------
fitstablename : str
Full path to a .fits table with the 2nd header being a BinTableHDU for
astrometry's build-index to parse
preset_list : list
List of presets, in the range -5 to 21, to build indices for
fov : int
field of view in arcseconds
fieldname : str
output prefix for the index file. If not specified, will use the root string
of the fitsfilename
"""
if fov is None and 'scale_number' not in kwargs and preset_list is None:
raise ValueError("Must specify a preset or a FOV")
elif 'scale_number' in kwargs:
presets = [kwargs.pop('scale_number')]
elif preset_list is not None:
presets = preset_list
else:
# determine appropriate "presets" to use
preset = astrometry.get_closest_preset(fov/60.)
if preset > -4:
presets = [preset-2, preset-1,preset,preset+1]
elif preset > -5:
presets = [preset-1,preset,preset+1]
else:
presets = [preset,preset+1]
if fieldname is None:
fieldname = os.path.split( os.path.splitext(fitstablename)[0] )[1]
stdout,stderr = "",""
for preset in presets:
_stdout,_stderr = astrometry.build_index(fieldname+".fits",scale_number=preset,**kwargs)
stdout += _stdout
stderr += _stderr
return stdout,stderr
def make_index_from_field_2MASS(coords, fieldname, fov=900, clobber=False,
quality_exclude="UX", **kwargs):
"""
Create an index file. The input should be IRSA-parseable coordinates, e.g.
a name, ra/dec, or glon/glat coords
Parameters
----------
coords : str
IRSA-parseable coordinates or SIMBAD-readable filename
fieldname : str
Prefix string for output file name
fov : int
Field of view to include in arcseconds (Circular)
clobber : bool
Overwrite existing data files?
quality_exclude : str
Entries in the catalog with these characters will be excluded
Example
-------
>>> make_index_from_field_2MASS('Sgr C','Sgr C',300,scan_catalog=True,clobber=True)
>>> make_index_from_field_2MASS('266.1512 -29.4703','Sgr C',300,scan_catalog=True,clobber=True)
>>> make_index_from_field_2MASS('359.4288 -00.0898 gal','Sgr C',300,scan_catalog=True,clobber=True)
"""
fieldname = fieldname.replace(" ","_") # what other chars should I be careful of?
table = astroquery.irsa.query_gator_box('pt_src_cat',coords,fov)
table.rename_column('ra','RA')
table.rename_column('dec','Dec')
table = table[table['extd_flg']==0] # don't use extended sources; they're bad for astrometry
cleantable = _clean_table(table)
return make_index_from_table(cleantable,fieldname,fov=fov,clobber=clobber,**kwargs)
def make_index_from_field_UKIDSS(glon,glat,fieldname,catalog='GPS',fov=900,clobber=False,**kwargs):
"""
Create an index file. The input should be UKIDSS-parseable coordinates, e.g.
glon,glat (so far, only a galactic lon/lat query tool is implemented
Example
-------
>>> make_index_from_field_UKIDSS(359.4288,-00.0898,'Sgr C',fov=300,scan_catalog=True,clobber=True)
"""
fieldname = fieldname.replace(" ","_") # what other chars should I be careful of?
ukquery = astroquery.ukidss.UKIDSSQuery()
ukquery.programmeID = catalog
uktable = ukquery.get_catalog_gal(glon,glat,radius=fov/60.)[0]
uktable.writeto(fieldname+".fits",clobber=clobber)
#bintab = table[0][1]
#bintab.data = bintab.data.astype(newtype)
#table.rename_column('ra','RA')
#table.rename_column('dec','Dec')
#cleantable = _clean_table(table)
return make_index_from_fitstable(fieldname+".fits",fieldname=fieldname,fov=fov,**kwargs)
def _clean_table(table):
"""
Hack to convert a table to a FITS-friendly numpy ndarray;
this will become obsolete when astropy's table includes a FITS writer
"""
float_types = [np.float, np.float128, np.float16, np.float32, np.float64, np.float_, np.floating]
new_fields = [(k,np.dtype('S8')) if v[0].type == np.object_ else
(k,np.float64) if v[0].type in float_types else (k,v[0])
                  for (k, v) in table._data.dtype.fields.items()]  # .iteritems() is Python 2 only
new_array = np.array(table._data, dtype=new_fields)
return new_array
| [
"[email protected]"
] | |
c423aa3aae2fbf9cab46298e451c9a089326366a | 6fa7f99d3d3d9b177ef01ebf9a9da4982813b7d4 | /vcFgGJHxhTwRiLK5d_14.py | 4a507ff87ec2b6c51b89dc31487dd6f18bceb1b8 | [] | no_license | daniel-reich/ubiquitous-fiesta | 26e80f0082f8589e51d359ce7953117a3da7d38c | 9af2700dbe59284f5697e612491499841a6c126f | refs/heads/master | 2023-04-05T06:40:37.328213 | 2021-04-06T20:17:44 | 2021-04-06T20:17:44 | 355,318,759 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 149 | py |
from fractions import gcd
def smallest(n):
lcm = 1
for i in range(1, n+1):
lcm = (lcm * i)//(gcd(int(lcm), i))
return int(lcm)
| [
"[email protected]"
] | |
463cf524dbd01e72dbb67f1b1b24dff9c7398279 | 8bbb508b2efd88beff28e072b6ba3ad5ad86f9c7 | /Config_ipython_import/ipython/import_here.py | f079107a00955301b1fb74660f6674d951bf3ce6 | [] | no_license | wubinbai/2019 | 1f31a96d49c3383e696f208358347341783f6141 | 5f1e191a037b793abb97120f4e13185782c64a2c | refs/heads/master | 2021-06-30T11:48:16.230757 | 2020-09-25T09:09:01 | 2020-09-25T09:09:01 | 162,510,722 | 0 | 2 | null | null | null | null | UTF-8 | Python | false | false | 1,467 | py | import numpy as np
import sys
#np.set_printoptions(threshold=sys.maxsize)
np.set_printoptions(threshold=100)
import pandas as pd
import matplotlib.pyplot as plt
plt.ion()
import seaborn as sns
from tqdm import tqdm
def plot_whole(df):
with pd.option_context('display.max_rows', None, 'display.max_columns', None):
print(df)
h = help
# Better help function he():
def he():
global ar
ar = input('Enter the function name for help:')
help(eval(ar))
# for . operation of dir
# use eval(repr(xxx.i))
from pandas import read_csv as pdrc
# for ipython to display all results in the jupyter notebook:
from IPython.core.interactiveshell import InteractiveShell
InteractiveShell.ast_node_interactivity = "all"
def my_plot(data_array):
plt.figure()
plt.plot(data_array)
plt.grid()
def my_plotas(data_array):
plt.figure()
plt.plot(data_array)
plt.plot(data_array,'b*')
plt.grid()
def save_model_keras(model,save_path):
from keras.utils import plot_model
plot_model(model,show_shapes=True,to_file=save_path)
def torchviz_pdf(model,input_tensor):
from torchviz import make_dot
vis_graph = make_dot(model(input_tensor), params=dict(model.named_parameters()))
vis_graph.view() # 会在当前目录下保存一个“Digraph.gv.pdf”文件,并在默认浏览器中打开
def torch_summary(model,input_size):
from torchsummary import summary
summary(model.cuda(), input_size=input_size)
| [
"[email protected]"
] | |
3df683e8903138ebdc951f06fbbd6c3e9bf2ff7b | d664f6cdcd280d7575dd1f6082a2ad8bdabed14c | /test/functional/interface_rest.py | 6fd897edac33a30478a179e7b4ae8e5bfb1c91d5 | [
"MIT"
] | permissive | minblock/motherofweeddaycoin | 22c287cac9c925f03a5aa1f36243ce51f89ebd80 | eeb0625c0f2f35412b3a69da50bc55f6acd6806d | refs/heads/master | 2022-04-26T05:10:07.935378 | 2020-04-28T23:11:46 | 2020-04-28T23:11:46 | 259,339,910 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 14,715 | py | #!/usr/bin/env python3
# Copyright (c) 2014-2018 The Bitcoin Core developers
# Distributed under the MIT software license, see the accompanying
# file COPYING or http://www.opensource.org/licenses/mit-license.php.
"""Test the REST API."""
import binascii
from decimal import Decimal
from enum import Enum
from io import BytesIO
import json
from struct import pack, unpack
import http.client
import urllib.parse
from test_framework.test_framework import BitcoinTestFramework
from test_framework.util import (
assert_equal,
assert_greater_than,
assert_greater_than_or_equal,
hex_str_to_bytes,
)
from test_framework.messages import BLOCK_HEADER_SIZE
class ReqType(Enum):
JSON = 1
BIN = 2
HEX = 3
class RetType(Enum):
OBJ = 1
BYTES = 2
JSON = 3
def filter_output_indices_by_value(vouts, value):
for vout in vouts:
if vout['value'] == value:
yield vout['n']
class RESTTest (BitcoinTestFramework):
def set_test_params(self):
self.setup_clean_chain = True
self.num_nodes = 2
self.extra_args = [["-rest"], []]
def skip_test_if_missing_module(self):
self.skip_if_no_wallet()
def test_rest_request(self, uri, http_method='GET', req_type=ReqType.JSON, body='', status=200, ret_type=RetType.JSON):
rest_uri = '/rest' + uri
if req_type == ReqType.JSON:
rest_uri += '.json'
elif req_type == ReqType.BIN:
rest_uri += '.bin'
elif req_type == ReqType.HEX:
rest_uri += '.hex'
conn = http.client.HTTPConnection(self.url.hostname, self.url.port)
self.log.debug('%s %s %s', http_method, rest_uri, body)
if http_method == 'GET':
conn.request('GET', rest_uri)
elif http_method == 'POST':
conn.request('POST', rest_uri, body)
resp = conn.getresponse()
assert_equal(resp.status, status)
if ret_type == RetType.OBJ:
return resp
elif ret_type == RetType.BYTES:
return resp.read()
elif ret_type == RetType.JSON:
return json.loads(resp.read().decode('utf-8'), parse_float=Decimal)
def run_test(self):
self.url = urllib.parse.urlparse(self.nodes[0].url)
self.log.info("Mine blocks and send Motherofweeddaycoin to node 1")
# Random address so node1's balance doesn't increase
not_related_address = "2MxqoHEdNQTyYeX1mHcbrrpzgojbosTpCvJ"
self.nodes[0].generate(1)
self.sync_all()
self.nodes[1].generatetoaddress(100, not_related_address)
self.sync_all()
assert_equal(self.nodes[0].getbalance(), 50)
txid = self.nodes[0].sendtoaddress(self.nodes[1].getnewaddress(), 0.1)
self.sync_all()
self.log.info("Test the /tx URI")
json_obj = self.test_rest_request("/tx/{}".format(txid))
assert_equal(json_obj['txid'], txid)
# Check hex format response
hex_response = self.test_rest_request("/tx/{}".format(txid), req_type=ReqType.HEX, ret_type=RetType.OBJ)
assert_greater_than_or_equal(int(hex_response.getheader('content-length')),
json_obj['size']*2)
spent = (json_obj['vin'][0]['txid'], json_obj['vin'][0]['vout']) # get the vin to later check for utxo (should be spent by then)
# get n of 0.1 outpoint
n, = filter_output_indices_by_value(json_obj['vout'], Decimal('0.1'))
spending = (txid, n)
self.log.info("Query an unspent TXO using the /getutxos URI")
self.nodes[1].generatetoaddress(1, not_related_address)
self.sync_all()
bb_hash = self.nodes[0].getbestblockhash()
assert_equal(self.nodes[1].getbalance(), Decimal("0.1"))
# Check chainTip response
json_obj = self.test_rest_request("/getutxos/{}-{}".format(*spending))
assert_equal(json_obj['chaintipHash'], bb_hash)
# Make sure there is one utxo
assert_equal(len(json_obj['utxos']), 1)
assert_equal(json_obj['utxos'][0]['value'], Decimal('0.1'))
self.log.info("Query a spent TXO using the /getutxos URI")
json_obj = self.test_rest_request("/getutxos/{}-{}".format(*spent))
# Check chainTip response
assert_equal(json_obj['chaintipHash'], bb_hash)
# Make sure there is no utxo in the response because this outpoint has been spent
assert_equal(len(json_obj['utxos']), 0)
# Check bitmap
assert_equal(json_obj['bitmap'], "0")
self.log.info("Query two TXOs using the /getutxos URI")
json_obj = self.test_rest_request("/getutxos/{}-{}/{}-{}".format(*(spending + spent)))
assert_equal(len(json_obj['utxos']), 1)
assert_equal(json_obj['bitmap'], "10")
self.log.info("Query the TXOs using the /getutxos URI with a binary response")
bin_request = b'\x01\x02'
for txid, n in [spending, spent]:
bin_request += hex_str_to_bytes(txid)
bin_request += pack("i", n)
bin_response = self.test_rest_request("/getutxos", http_method='POST', req_type=ReqType.BIN, body=bin_request, ret_type=RetType.BYTES)
output = BytesIO(bin_response)
chain_height, = unpack("i", output.read(4))
response_hash = binascii.hexlify(output.read(32)[::-1]).decode('ascii')
assert_equal(bb_hash, response_hash) # check if getutxo's chaintip during calculation was fine
assert_equal(chain_height, 102) # chain height must be 102
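        # The binary getutxos reply starts with a little-endian int32 chain
        # height followed by the 32-byte chain-tip hash in wire (reversed) byte
        # order, which is why the hash bytes are flipped before hex-encoding.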
self.log.info("Test the /getutxos URI with and without /checkmempool")
# Create a transaction, check that it's found with /checkmempool, but
# not found without. Then confirm the transaction and check that it's
# found with or without /checkmempool.
# do a tx and don't sync
txid = self.nodes[0].sendtoaddress(self.nodes[1].getnewaddress(), 0.1)
json_obj = self.test_rest_request("/tx/{}".format(txid))
# get the spent output to later check for utxo (should be spent by then)
spent = (json_obj['vin'][0]['txid'], json_obj['vin'][0]['vout'])
# get n of 0.1 outpoint
n, = filter_output_indices_by_value(json_obj['vout'], Decimal('0.1'))
spending = (txid, n)
json_obj = self.test_rest_request("/getutxos/{}-{}".format(*spending))
assert_equal(len(json_obj['utxos']), 0)
json_obj = self.test_rest_request("/getutxos/checkmempool/{}-{}".format(*spending))
assert_equal(len(json_obj['utxos']), 1)
json_obj = self.test_rest_request("/getutxos/{}-{}".format(*spent))
assert_equal(len(json_obj['utxos']), 1)
json_obj = self.test_rest_request("/getutxos/checkmempool/{}-{}".format(*spent))
assert_equal(len(json_obj['utxos']), 0)
self.nodes[0].generate(1)
self.sync_all()
json_obj = self.test_rest_request("/getutxos/{}-{}".format(*spending))
assert_equal(len(json_obj['utxos']), 1)
json_obj = self.test_rest_request("/getutxos/checkmempool/{}-{}".format(*spending))
assert_equal(len(json_obj['utxos']), 1)
# Do some invalid requests
self.test_rest_request("/getutxos", http_method='POST', req_type=ReqType.JSON, body='{"checkmempool', status=400, ret_type=RetType.OBJ)
self.test_rest_request("/getutxos", http_method='POST', req_type=ReqType.BIN, body='{"checkmempool', status=400, ret_type=RetType.OBJ)
self.test_rest_request("/getutxos/checkmempool", http_method='POST', req_type=ReqType.JSON, status=400, ret_type=RetType.OBJ)
# Test limits
long_uri = '/'.join(["{}-{}".format(txid, n_) for n_ in range(20)])
self.test_rest_request("/getutxos/checkmempool/{}".format(long_uri), http_method='POST', status=400, ret_type=RetType.OBJ)
long_uri = '/'.join(['{}-{}'.format(txid, n_) for n_ in range(15)])
self.test_rest_request("/getutxos/checkmempool/{}".format(long_uri), http_method='POST', status=200)
self.nodes[0].generate(1) # generate block to not affect upcoming tests
self.sync_all()
self.log.info("Test the /block, /blockhashbyheight and /headers URIs")
bb_hash = self.nodes[0].getbestblockhash()
# Check result if block does not exists
assert_equal(self.test_rest_request('/headers/1/0000000000000000000000000000000000000000000000000000000000000000'), [])
self.test_rest_request('/block/0000000000000000000000000000000000000000000000000000000000000000', status=404, ret_type=RetType.OBJ)
# Check result if block is not in the active chain
self.nodes[0].invalidateblock(bb_hash)
assert_equal(self.test_rest_request('/headers/1/{}'.format(bb_hash)), [])
self.test_rest_request('/block/{}'.format(bb_hash))
self.nodes[0].reconsiderblock(bb_hash)
# Check binary format
response = self.test_rest_request("/block/{}".format(bb_hash), req_type=ReqType.BIN, ret_type=RetType.OBJ)
assert_greater_than(int(response.getheader('content-length')), BLOCK_HEADER_SIZE)
response_bytes = response.read()
# Compare with block header
response_header = self.test_rest_request("/headers/1/{}".format(bb_hash), req_type=ReqType.BIN, ret_type=RetType.OBJ)
assert_equal(int(response_header.getheader('content-length')), BLOCK_HEADER_SIZE)
response_header_bytes = response_header.read()
assert_equal(response_bytes[:BLOCK_HEADER_SIZE], response_header_bytes)
# Check block hex format
response_hex = self.test_rest_request("/block/{}".format(bb_hash), req_type=ReqType.HEX, ret_type=RetType.OBJ)
assert_greater_than(int(response_hex.getheader('content-length')), BLOCK_HEADER_SIZE*2)
response_hex_bytes = response_hex.read().strip(b'\n')
assert_equal(binascii.hexlify(response_bytes), response_hex_bytes)
# Compare with hex block header
response_header_hex = self.test_rest_request("/headers/1/{}".format(bb_hash), req_type=ReqType.HEX, ret_type=RetType.OBJ)
assert_greater_than(int(response_header_hex.getheader('content-length')), BLOCK_HEADER_SIZE*2)
response_header_hex_bytes = response_header_hex.read(BLOCK_HEADER_SIZE*2)
assert_equal(binascii.hexlify(response_bytes[:BLOCK_HEADER_SIZE]), response_header_hex_bytes)
# Check json format
block_json_obj = self.test_rest_request("/block/{}".format(bb_hash))
assert_equal(block_json_obj['hash'], bb_hash)
assert_equal(self.test_rest_request("/blockhashbyheight/{}".format(block_json_obj['height']))['blockhash'], bb_hash)
# Check hex/bin format
resp_hex = self.test_rest_request("/blockhashbyheight/{}".format(block_json_obj['height']), req_type=ReqType.HEX, ret_type=RetType.OBJ)
assert_equal(resp_hex.read().decode('utf-8').rstrip(), bb_hash)
resp_bytes = self.test_rest_request("/blockhashbyheight/{}".format(block_json_obj['height']), req_type=ReqType.BIN, ret_type=RetType.BYTES)
blockhash = binascii.hexlify(resp_bytes[::-1]).decode('utf-8')
assert_equal(blockhash, bb_hash)
# Check invalid blockhashbyheight requests
resp = self.test_rest_request("/blockhashbyheight/abc", ret_type=RetType.OBJ, status=400)
assert_equal(resp.read().decode('utf-8').rstrip(), "Invalid height: abc")
resp = self.test_rest_request("/blockhashbyheight/1000000", ret_type=RetType.OBJ, status=404)
assert_equal(resp.read().decode('utf-8').rstrip(), "Block height out of range")
resp = self.test_rest_request("/blockhashbyheight/-1", ret_type=RetType.OBJ, status=400)
assert_equal(resp.read().decode('utf-8').rstrip(), "Invalid height: -1")
self.test_rest_request("/blockhashbyheight/", ret_type=RetType.OBJ, status=400)
# Compare with json block header
json_obj = self.test_rest_request("/headers/1/{}".format(bb_hash))
assert_equal(len(json_obj), 1) # ensure that there is one header in the json response
assert_equal(json_obj[0]['hash'], bb_hash) # request/response hash should be the same
# Compare with normal RPC block response
rpc_block_json = self.nodes[0].getblock(bb_hash)
for key in ['hash', 'confirmations', 'height', 'version', 'merkleroot', 'time', 'nonce', 'bits', 'difficulty', 'chainwork', 'previousblockhash']:
assert_equal(json_obj[0][key], rpc_block_json[key])
# See if we can get 5 headers in one response
self.nodes[1].generate(5)
self.sync_all()
json_obj = self.test_rest_request("/headers/5/{}".format(bb_hash))
assert_equal(len(json_obj), 5) # now we should have 5 header objects
self.log.info("Test tx inclusion in the /mempool and /block URIs")
# Make 3 tx and mine them on node 1
txs = []
txs.append(self.nodes[0].sendtoaddress(not_related_address, 11))
txs.append(self.nodes[0].sendtoaddress(not_related_address, 11))
txs.append(self.nodes[0].sendtoaddress(not_related_address, 11))
self.sync_all()
# Check that there are exactly 3 transactions in the TX memory pool before generating the block
json_obj = self.test_rest_request("/mempool/info")
assert_equal(json_obj['size'], 3)
# the size of the memory pool should be greater than 3x ~100 bytes
assert_greater_than(json_obj['bytes'], 300)
# Check that there are our submitted transactions in the TX memory pool
json_obj = self.test_rest_request("/mempool/contents")
for i, tx in enumerate(txs):
assert tx in json_obj
assert_equal(json_obj[tx]['spentby'], txs[i + 1:i + 2])
assert_equal(json_obj[tx]['depends'], txs[i - 1:i])
# Now mine the transactions
newblockhash = self.nodes[1].generate(1)
self.sync_all()
# Check if the 3 tx show up in the new block
json_obj = self.test_rest_request("/block/{}".format(newblockhash[0]))
non_coinbase_txs = {tx['txid'] for tx in json_obj['tx']
if 'coinbase' not in tx['vin'][0]}
assert_equal(non_coinbase_txs, set(txs))
# Check the same but without tx details
json_obj = self.test_rest_request("/block/notxdetails/{}".format(newblockhash[0]))
for tx in txs:
assert tx in json_obj['tx']
self.log.info("Test the /chaininfo URI")
bb_hash = self.nodes[0].getbestblockhash()
json_obj = self.test_rest_request("/chaininfo")
assert_equal(json_obj['bestblockhash'], bb_hash)
if __name__ == '__main__':
RESTTest().main()
| [
"[email protected]"
] | |
d9b120c6b3b4f2a5ecffddf3d0f51f0bbf07cc85 | 401f783a202949adbf144b5780bcd87a6daf2299 | /code/python/Day-89/SumNested.py | 790c9104c096a14e9c9ead74bef4561443cec9d7 | [] | no_license | TalatWaheed/100-days-code | 1934c8113e6e7be86ca86ea66c518d2f2cedf82a | b8fd92d4ddb6adc4089d38ac7ccd2184f9c47919 | refs/heads/master | 2021-07-04T14:28:45.363798 | 2019-03-05T13:49:55 | 2019-03-05T13:49:55 | 140,101,486 | 0 | 1 | null | null | null | null | UTF-8 | Python | false | false | 240 | py | def sum1(lst):
total = 0
for element in lst:
if (type(element) == type([])):
total = total + sum1(element)
else:
total = total + element
return total
print( "Sum is:",sum1([[1,2],[3,4]]))
| [
"[email protected]"
] | |
7258c482e8ee9b88d64784f4b8132d59778a383d | 6c721f3cfce6dc88396cd3b5f6a59d65a2ea5033 | /some_learn/Data_Set_handle/Caltech-Dateset/test/caltech_generate_xml/generate_xml.py | 5252a995b10a61e6a84100101e23d4a81ac98bc2 | [
"MIT"
] | permissive | unicoe/PycharmProjects | 20a3dabe88c7874da54451c7bb16999afc0eee35 | 23ff314eb5ac9bfa01a8278089d722b5d0061751 | refs/heads/master | 2020-03-23T09:16:25.907188 | 2019-12-21T03:10:49 | 2019-12-21T03:10:49 | 141,377,686 | 4 | 1 | null | null | null | null | UTF-8 | Python | false | false | 6,550 | py | #!/usr/bin/env python
# coding:utf-8
#from xml.etree.ElementTree import Element, SubElement, tostring
from lxml.etree import Element, SubElement, tostring
import pprint
from xml.dom.minidom import parseString
import os
def mkdir(path):
import os
path = path.strip()
path = path.rstrip("\\")
isExists = os.path.exists(path)
if not isExists:
os.makedirs(path)
print path + 'ok'
return True
else:
print path + 'failed!'
return False
def generate_xml(file_info, obj):
node_root = Element('annotation')
node_folder = SubElement(node_root, 'folder')
node_folder.text = file_info[0]
node_filename = SubElement(node_root, 'filename')
node_filename.text = file_info[1]
node_size = SubElement(node_root, 'size')
node_width = SubElement(node_size, 'width')
node_width.text = '640'
node_height = SubElement(node_size, 'height')
node_height.text = '480'
node_depth = SubElement(node_size, 'depth')
node_depth.text = '3'
for obj_i in obj:
print obj_i
node_object = SubElement(node_root, 'object')
node_name = SubElement(node_object, 'name')
#node_name.text = 'mouse'
node_name.text = 'person'
node_name = SubElement(node_object, 'difficult')
# node_name.text = 'mouse'
node_name.text = '0'
node_bndbox = SubElement(node_object, 'bndbox')
node_xmin = SubElement(node_bndbox, 'xmin')
#node_xmin.text = '99'
node_xmin.text = obj_i['xmin']
node_ymin = SubElement(node_bndbox, 'ymin')
#node_ymin.text = '358'
node_ymin.text = obj_i['ymin']
node_xmax = SubElement(node_bndbox, 'xmax')
#node_xmax.text = '135'
node_xmax.text = obj_i['xmax']
node_ymax = SubElement(node_bndbox, 'ymax')
#node_ymax.text = '375'
node_ymax.text = obj_i['ymax']
xml = tostring(node_root, pretty_print=True) #格式化显示,该换行的换行
dom = parseString(xml)
#file_root = '/home/user/Downloads/caltech_data_set/data_train/'
file_root = '/home/user/Downloads/caltech_data_set/data_reasonable_3_19/'
file_name = file_root + file_info[0];
mkdir (file_name)
fw = open(file_name+"/"+file_info[1].split('.')[0]+".xml", 'a+')
fw.write(xml)
print "xml _ ok"
fw.close()
#for debug
#print xml
def printPath(level, path):
global allFileNum
'''''
打印一个目录下的所有文件夹和文件
'''
    # all files
dirList = []
# 所有文件
fileList = []
# 返回一个列表,其中包含在目录条目的名称(google翻译)
files = os.listdir(path)
# 先添加目录级别
dirList.append(str(level))
for f in files:
if(os.path.isdir(path + '/' + f)):
# 排除隐藏文件夹。因为隐藏文件夹过多
if(f[0] == '.'):
pass
else:
# 添加非隐藏文件夹
dirList.append(f)
if(os.path.isfile(path + '/' + f)):
# 添加文件
fileList.append(f)
# 当一个标志使用,文件夹列表第一个级别不打印
i_dl = 0
for dl in dirList:
if(i_dl == 0):
i_dl = i_dl + 1
else:
# 打印至控制台,不是第一个的目录
print '-' * (int(dirList[0])), dl
# 打印目录下的所有文件夹和文件,目录级别+1
printPath((int(dirList[0]) + 1), path + '/' + dl)
print fileList
for fl in fileList:
# 打印文件
#print '-' * (int(dirList[0])), fl
# 随便计算一下有多少个文件
#allFileNum = allFileNum + 1
"""
操作文件进行读写
"""
print fl[12:17],fl[17:21]
file_info = []
file_info.append(fl[12:17]+'/'+fl[17:21])
print file_info
print path
file_name = path+"/"+fl
fw = open(file_name, 'r');
line_content = fw.readlines()
fw.close()
print line_content
tmp = -1
obj = []
con_len = len(line_content)
try:
string = line_content[0].split(" ")
tmp = int(string[0])
except Exception:
continue
file_info.append(str(tmp) + '.jpg')
xmin = str(int(float(string[1])))
ymin = str(int(float(string[2])))
xmax = str(int(float(string[1]) + float(string[3])))
ymax = str(int(float(string[2]) + float(string[4])))
dict1 = {}
dict1["xmin"] = xmin
dict1["ymin"] = ymin
dict1["xmax"] = xmax
dict1["ymax"] = ymax
obj.append(dict1)
for con_i in xrange(1, con_len):
string = line_content[con_i].split(" ")
tmp1 = int(string[0])
if tmp == tmp1:
xmin = str(int(float(string[1])))
ymin = str(int(float(string[2])))
xmax = str(int(float(string[1]) + float(string[3])))
ymax = str(int(float(string[2]) + float(string[4])))
dict1 = {}
dict1["xmin"] = xmin
dict1["ymin"] = ymin
dict1["xmax"] = xmax
dict1["ymax"] = ymax
obj.append(dict1)
elif tmp1 > 0:
generate_xml(file_info, obj)
obj = []
tmp = tmp1
file_info[1] = str(tmp1) + ".jpg"
xmin = str(int(float(string[1])))
ymin = str(int(float(string[2])))
xmax = str(int(float(string[1]) + float(string[3])))
ymax = str(int(float(string[2]) + float(string[4])))
dict1 = {}
dict1["xmin"] = xmin
dict1["ymin"] = ymin
dict1["xmax"] = xmax
dict1["ymax"] = ymax
obj.append(dict1)
continue
def read_annotations_generate_fileinfo_obj(file_path):
pass
if __name__=="__main__":
#
# file_info = ['set00/V000', '1.jpg']
#
# obj = []
# obj1 = {"xmin":"1", "ymin":"1", "xmax":"5", "ymax":"5"}
# obj2 = {"xmin":"2", "ymin":"2", "xmax":"6", "ymax":"6"}
# obj.append(obj1)
# obj.append(obj2)
#
# generate_xml(file_info, obj)
#
printPath(1, "/home/user/Downloads/caltech_data_set/data_reasonable_3_19")
#printPath(1, "/home/user/Downloads/caltech_data_set/data_reasonable_train") | [
"[email protected]"
] | |
c91effaaaa7685813f70b6e5719bd5298048a24f | f714db4463dd37fc33382364dc4b1963a9053e49 | /src/sentry/integrations/msteams/utils.py | 3dd201dde1ca3bb9f4e5d09390e9154ffa859fb7 | [
"BUSL-1.1",
"Apache-2.0"
] | permissive | macher91/sentry | 92171c2ad23564bf52627fcd711855685b138cbd | dd94d574403c95eaea6d4ccf93526577f3d9261b | refs/heads/master | 2021-07-07T08:23:53.339912 | 2020-07-21T08:03:55 | 2020-07-21T08:03:55 | 140,079,930 | 0 | 0 | BSD-3-Clause | 2020-05-13T11:28:35 | 2018-07-07T11:50:48 | Python | UTF-8 | Python | false | false | 1,755 | py | from __future__ import absolute_import
from sentry.models import Integration
from sentry.utils.compat import filter
from .client import MsTeamsClient
MSTEAMS_MAX_ITERS = 100
def channel_filter(channel, name):
# the general channel has no name in the list
# retrieved from the REST API call
if channel.get("name"):
return name == channel.get("name")
else:
return name == "General"
def get_channel_id(organization, integration_id, name):
try:
integration = Integration.objects.get(
provider="msteams", organizations=organization, id=integration_id
)
except Integration.DoesNotExist:
return None
team_id = integration.external_id
client = MsTeamsClient(integration)
# handle searching for channels first
channel_list = client.get_channel_list(team_id)
filtered_channels = list(filter(lambda x: channel_filter(x, name), channel_list))
if len(filtered_channels) > 0:
return filtered_channels[0].get("id")
# handle searching for users
members = client.get_member_list(team_id, None)
for i in range(MSTEAMS_MAX_ITERS):
member_list = members.get("members")
continuation_token = members.get("continuationToken")
filtered_members = list(filter(lambda x: x.get("name") == name, member_list))
if len(filtered_members) > 0:
# TODO: handle duplicate username case
user_id = filtered_members[0].get("id")
tenant_id = filtered_members[0].get("tenantId")
return client.get_user_conversation_id(user_id, tenant_id)
if not continuation_token:
return None
members = client.get_member_list(team_id, continuation_token)
return None
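    # Member listing is paginated: each page carries a continuationToken, so the
    # loop above walks at most MSTEAMS_MAX_ITERS pages before giving up.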
| [
"[email protected]"
] | |
15b4273440bb14b4e5c6be697d42d8398d30d59e | 6034b74d0d31997aa46f669d3d77bb46e199a430 | /tests/test_dataset_processor.py | 6cae4a56b6cba92bb31d6fa50efab979b3b32076 | [
"Apache-2.0"
] | permissive | bupt-nlp/sequence-labeling-models | 5ea641eec4fad1f470bdb7f3dd64ef0b892ba557 | 75bb8c24098ad9f307605fad61d811bcd299cfcc | refs/heads/master | 2023-06-01T00:59:56.753952 | 2021-06-19T11:28:13 | 2021-06-19T11:28:13 | 287,694,702 | 2 | 2 | Apache-2.0 | 2020-08-15T07:07:06 | 2020-08-15T06:54:35 | null | UTF-8 | Python | false | false | 5,152 | py | from __future__ import annotations
import os
from typing import List, Tuple
from allennlp.data import Vocabulary, Instance
from allennlp.data.data_loaders import SimpleDataLoader, DataLoader
from allennlp.data.token_indexers.pretrained_transformer_indexer import PretrainedTransformerIndexer
from allennlp.data.dataset_readers.sequence_tagging import SequenceTaggingDatasetReader
from allennlp.models.model import Model
from allennlp.models.simple_tagger import SimpleTagger
from allennlp.modules.text_field_embedders.basic_text_field_embedder import BasicTextFieldEmbedder
from allennlp.modules.token_embedders.pretrained_transformer_embedder import PretrainedTransformerEmbedder
from allennlp.modules.seq2seq_encoders import PassThroughEncoder
from allennlp.training.trainer import Trainer
from allennlp.training import GradientDescentTrainer
import torch
from allennlp.data import DatasetReader, TextFieldTensors  # the remaining names (DataLoader, Instance, Vocabulary, etc.) are already imported above
from allennlp.training.optimizers import AdamOptimizer
from utils.dataset_processor import read_line, convert_bmes_to_sequence_tagging, convert_two_array_to_text_classification_corpus
def test_read_line():
line = '我爱 O'
tokens, labels = read_line(line)
assert tokens == ['我', '爱']
assert labels == ['O', 'O']
line = '吴京 PER'
tokens, labels = read_line(line)
assert tokens == ['吴', '京']
assert labels == ['B-PER', 'E-PER']
line = '吴京京 PER'
tokens, labels = read_line(line)
assert tokens == ['吴', '京', '京']
assert labels == ['B-PER', 'M-PER', 'E-PER']
def test_bmes_converter():
base_dir = './data/weibo'
for file in os.listdir(base_dir):
if not file.endswith('.txt'):
continue
input_file = os.path.join(base_dir, file)
output_file = os.path.join(base_dir, file.replace('txt', 'corpus'))
convert_bmes_to_sequence_tagging(input_file, output_file)
def test_sequence_tagging_reader():
model_name = 'bert-base-chinese'
bert_token_indexers = PretrainedTransformerIndexer(model_name=model_name)
reader = SequenceTaggingDatasetReader(token_indexers={"tokens": bert_token_indexers})
train_file = './data/weibo/train.corpus'
dev_file = './data/weibo/dev.corpus'
test_file = './data/weibo/dev.corpus'
train_instances = list(reader.read(train_file))
dev_instances = list(reader.read(dev_file))
test_instances = list(reader.read(test_file))
vocab: Vocabulary = Vocabulary.from_instances(train_instances)
assert vocab.get_namespaces() is not None
bert_text_field_embedder = PretrainedTransformerEmbedder(model_name=model_name)
tagger = SimpleTagger(
vocab=vocab,
text_field_embedder=BasicTextFieldEmbedder(
token_embedders={
'tokens': bert_text_field_embedder
}
),
encoder=PassThroughEncoder(bert_text_field_embedder.get_output_dim()),
calculate_span_f1=True,
label_encoding="BMES",
# verbose_metrics=True
)
train_data_loader, dev_data_loader = build_data_loaders(train_instances, dev_instances)
train_data_loader.index_with(vocab)
dev_data_loader.index_with(vocab)
trainer = build_trainer(model=tagger, serialization_dir='./output', train_loader=train_data_loader, dev_loader=dev_data_loader)
print("Starting training")
trainer.train()
print("Finished training")
def build_data_loaders(
train_data: List[Instance],
dev_data: List[Instance],
) -> Tuple[DataLoader, DataLoader]:
train_loader = SimpleDataLoader(train_data, 8, shuffle=True)
dev_loader = SimpleDataLoader(dev_data, 8, shuffle=False)
return train_loader, dev_loader
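# SimpleDataLoader batches entirely in memory; callers still need to call
# index_with(vocab) on both loaders (as test_sequence_tagging_reader does above)
# before handing them to the trainer.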
def build_trainer(
model: Model,
serialization_dir: str,
train_loader: DataLoader,
dev_loader: DataLoader,
) -> Trainer:
parameters = [(n, p) for n, p in model.named_parameters() if p.requires_grad]
optimizer = AdamOptimizer(parameters) # type: ignore
trainer = GradientDescentTrainer(
model=model,
serialization_dir=serialization_dir,
data_loader=train_loader,
validation_data_loader=dev_loader,
num_epochs=5,
optimizer=optimizer,
)
return trainer
def test_ask_ubuntu_corpus():
base_dirs = [
'./data/ask_ubuntu/intent-classification',
'./data/banking/intent-classification',
'./data/chatbot/intent-classification',
'./data/clinc/intent-classification',
'./data/hwu/intent-classification',
'./data/web_applications/intent-classification',
]
for base_dir in base_dirs:
for file in os.listdir(base_dir):
if file.endswith('.corpus'):
continue
file_path = os.path.join(base_dir, file)
convert_two_array_to_text_classification_corpus(
file_path
)
if __name__ == "__main__":
test_sequence_tagging_reader() | [
"[email protected]"
] | |
f98f2e83b1096b53c66e15f0fbf035ddc93c1d8e | 197420c1f28ccb98059888dff214c9fd7226e743 | /happy_pythoning_cource_2_prodv/3.1 Bool/3.1. bool.py | be2e6728722c29ced0d3c7440e63523df6444bf2 | [] | no_license | Vovanuch/python-basics-1 | fc10b6f745defff31364b66c65a704a9cf05d076 | a29affec12e8b80a1d3beda3a50cde4867b1dee2 | refs/heads/master | 2023-07-06T17:10:46.341121 | 2021-08-06T05:38:19 | 2021-08-06T05:38:19 | 267,504,364 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 283 | py | '''
What will be printed to the screen when the following program runs?
'''
numbers = [-6, -8, 0, 1, 3, 8, -7, 12, 17, 24, 25, 3, 5, 1]
res = 0
for num in numbers:
res += (num % 2 == 1) and (num > 1)
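# Each term is a bool, and bools are ints (False == 0, True == 1), so res counts
# the odd numbers greater than 1: here 3, 17, 25, 3 and 5, and 5 is printed.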
print(res) | [
"[email protected]"
] | |
cfcc64c2dd2a5398b2f9b30038ba8ada47ef3178 | 31023b59e743b5bef1c2c935dc1f2b26e8e10e9b | /线程进程/进程间通信/数据共享.py | ef96b8051c960f8e1af50dca4856fa31d9922af5 | [] | no_license | hsyy673150343/PythonLearning | 417650d8ab5dbafbede08ef40223b29e82738443 | 817c6bd4c2ecba2549fa0be9f0c41337fe5acfdf | refs/heads/master | 2020-05-18T06:36:25.648709 | 2019-05-23T13:40:59 | 2019-05-23T13:40:59 | 184,239,403 | 3 | 0 | null | null | null | null | UTF-8 | Python | false | false | 600 | py | #!/usr/bin/env python
# -*- coding:utf8 -*-
# @TIME :2019/3/20 19:33
# @Author : 洪松
# @File : 数据共享.py
from multiprocessing import Process, Manager
def f(d, l,n):
d[n] = n
d["name"] ="alvin"
l.append(n)
if __name__ == '__main__':
with Manager() as manager:
d = manager.dict()
l = manager.list(range(5))
p_list = []
for i in range(10):
p = Process(target=f, args=(d,l,i))
p.start()
p_list.append(p)
for res in p_list:
res.join()
print(d)
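        # The Manager proxies share state across processes: after the joins,
        # d holds the keys 0-9 plus 'name', and l holds the initial range(5)
        # values followed by one appended value per worker (order varies).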
print(l) | [
"[email protected]"
] | |
4d4d16b4c0f08b20c039e19a68cbe44848bc592a | e9a210d6e58bd0450197dfb4bbbc03f66788a297 | /eventplus/talks/migrations/0005_auto_20171031_0209.py | 3d87473b034b28fb35cd41f7b8916e329532208b | [] | no_license | werberth/EventPlus | 3895587c8f4b2b6cc03507d10badd3c3cd2c28c9 | d80c2ab5ec30fa83950380567195514df1cc9892 | refs/heads/master | 2021-08-28T17:00:21.363698 | 2017-12-12T21:10:16 | 2017-12-12T21:10:16 | 108,935,215 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 449 | py | # -*- coding: utf-8 -*-
# Generated by Django 1.11 on 2017-10-31 05:09
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('talks', '0004_auto_20171029_2142'),
]
operations = [
migrations.AlterField(
model_name='talk',
name='date',
field=models.DateField(verbose_name='Date'),
),
]
| [
"[email protected]"
] | |
3548555fe01d146e2f03abab2eaebe86099355ba | 71e49ae86e6398f3278a031edaeebee2218c8b9b | /todo_django/todo_django/settings.py | 6abb1e168465164fa89608d6d98a73a2d985b19f | [] | no_license | swestphal/todo | 38558688063089b30e0ac1ca4906ef1f2d816ad8 | 97ef67696abd4e46fcc5a61270d6ad60596de321 | refs/heads/main | 2023-05-13T11:25:26.614616 | 2021-05-28T15:26:19 | 2021-05-28T15:26:19 | 371,403,392 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 3,405 | py | """
Django settings for todo_django project.
Generated by 'django-admin startproject' using Django 3.2.3.
For more information on this file, see
https://docs.djangoproject.com/en/3.2/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/3.2/ref/settings/
"""
from pathlib import Path
# Build paths inside the project like this: BASE_DIR / 'subdir'.
BASE_DIR = Path(__file__).resolve().parent.parent
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/3.2/howto/deployment/checklist/
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = 'django-insecure-6zj_#s34n7qjx$z739po7=4kl&a7i^*%t2w=w$n4986h+y4#x7'
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True
ALLOWED_HOSTS = ['localhost', '127.0.0.1']
CORS_ORIGIN_ALLOW_ALL = True
# Application definition
INSTALLED_APPS = [
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
'rest_framework',
'corsheaders',
'task'
]
MIDDLEWARE = [
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'corsheaders.middleware.CorsMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
]
ROOT_URLCONF = 'todo_django.urls'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
WSGI_APPLICATION = 'todo_django.wsgi.application'
# Database
# https://docs.djangoproject.com/en/3.2/ref/settings/#databases
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': BASE_DIR / 'db.sqlite3',
}
}
# Password validation
# https://docs.djangoproject.com/en/3.2/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
},
]
# Internationalization
# https://docs.djangoproject.com/en/3.2/topics/i18n/
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/3.2/howto/static-files/
STATIC_URL = '/static/'
# Default primary key field type
# https://docs.djangoproject.com/en/3.2/ref/settings/#default-auto-field
DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField'
| [
"[email protected]"
] | |
bc638535aa88eb0734da8bea69ec47da930a0515 | 472f531c32b63a39ed76c2d95517bae3342a9c09 | /lib/config/config.py | a45ecff4c1583df9226bd3d4fb918a0c772693e0 | [
"Apache-2.0"
] | permissive | TrendingTechnology/PyAnomaly | 90cc3a2d05ea6ecfd0ab5c3768c67ccd4c45b77d | 8e9bf7564ab0a39e251f439dfaa3d55ba053f3a9 | refs/heads/master | 2022-11-06T09:29:06.034245 | 2020-06-12T04:34:22 | 2020-06-12T04:34:22 | null | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 10,904 | py | from yacs.config import CfgNode as CN
__all__ = ['update_config']
config = CN()
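# A minimal usage sketch (the experiment file path is assumed): clone the
# defaults and overlay a YAML experiment config with the standard yacs calls:
#   cfg = config.clone()
#   cfg.merge_from_file('./experiments/example.yaml')
#   cfg.freeze()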
# configure the system related matters, such as gpus, cudnn and so on
config.SYSTEM = CN()
config.SYSTEM.multigpus = False # whether to use multiple GPUs for training or testing (data parallel)
config.SYSTEM.num_gpus = 1 # the number of GPUs to use
config.SYSTEM.gpus = [0]
config.SYSTEM.cudnn = CN()
config.SYSTEM.cudnn.benchmark = True
config.SYSTEM.cudnn.deterministic = False
config.SYSTEM.cudnn.enable = True
# about use the distributed
config.SYSTEM.distributed = CN()
config.SYSTEM.distributed.use = False
# configure the log things
config.LOG = CN()
config.LOG.log_output_dir = ''
config.LOG.tb_output_dir = '' # tensorboard log output dir
config.LOG.vis_dir = './output/vis'
# configure the dataset
config.DATASET = CN()
config.DATASET.name = ''
config.DATASET.seed = 2020
config.DATASET.read_format = 'opencv'
config.DATASET.channel_num = 3 # 1: grayscale image | 2: optical flow | 3: RGB or other 3 channel image
config.DATASET.channel_name = 'rgb' # 'gray' | 'uv' | 'rgb' | ....
config.DATASET.train_path = ''
config.DATASET.train_clip_length = 5 # the total clip length, including frames that are not sampled
config.DATASET.train_frame_step = 1 # frame sample frequency
config.DATASET.train_clip_step = 1 # clip sample frequency
config.DATASET.test_path = ''
config.DATASET.test_clip_length = 5
config.DATASET.test_frame_step = 1
config.DATASET.test_clip_step = 1
config.DATASET.gt_path = ''
config.DATASET.number_of_class = 1 # used when converting labels to one-hot
config.DATASET.score_normalize = True
config.DATASET.score_type = 'normal' # 'normal' | 'abnormal'
config.DATASET.decidable_idx = 1
config.DATASET.smooth = CN()
config.DATASET.smooth.guassian = True
config.DATASET.smooth.guassian_sigma = 10
config.DATASET.mini_dataset = CN()
config.DATASET.mini_dataset.samples = 2
config.DATASET.evaluate_function_type = 'compute_auc_score'
# ****************configure the argument of the data*************************
config.ARGUMENT = CN()
#========================Train Augment===================
config.ARGUMENT.train = CN()
config.ARGUMENT.train.use = False
config.ARGUMENT.train.resize = CN()
config.ARGUMENT.train.resize.use = False
config.ARGUMENT.train.resize.height = 32
config.ARGUMENT.train.resize.width = 32
config.ARGUMENT.train.grayscale = CN()
config.ARGUMENT.train.grayscale.use = False
config.ARGUMENT.train.fliplr = CN()
config.ARGUMENT.train.fliplr.use = False
config.ARGUMENT.train.fliplr.p = 1.0
config.ARGUMENT.train.flipud = CN()
config.ARGUMENT.train.flipud.use = False
config.ARGUMENT.train.flipud.p = 1.0
config.ARGUMENT.train.rote = CN()
config.ARGUMENT.train.rote.use = False
config.ARGUMENT.train.rote.degrees = [0,0]
config.ARGUMENT.train.JpegCompression = CN()
config.ARGUMENT.train.JpegCompression.use = False
config.ARGUMENT.train.JpegCompression.high = 100
config.ARGUMENT.train.JpegCompression.low = 80
config.ARGUMENT.train.GaussianBlur = CN()
config.ARGUMENT.train.GaussianBlur.use = False
config.ARGUMENT.train.GaussianBlur.high = 0.3
config.ARGUMENT.train.GaussianBlur.low = 0.03
config.ARGUMENT.train.CropToFixedSize = CN()
config.ARGUMENT.train.CropToFixedSize.use = False
config.ARGUMENT.train.CropToFixedSize.height = 256
config.ARGUMENT.train.CropToFixedSize.width = 256
config.ARGUMENT.train.CropToFixedSize.position = 'center' # uniform | normal | center | ...
#-------------------Normal------------------------
config.ARGUMENT.train.normal = CN()
config.ARGUMENT.train.normal.use = False
config.ARGUMENT.train.normal.mean = [0.485, 0.456, 0.406]
config.ARGUMENT.train.normal.std = [0.229, 0.224, 0.225]
#========================Val Augment===================
config.ARGUMENT.val = CN()
config.ARGUMENT.val.use = False
config.ARGUMENT.val.resize = CN()
config.ARGUMENT.val.resize.use = False
config.ARGUMENT.val.resize.height = 32
config.ARGUMENT.val.resize.width = 32
config.ARGUMENT.val.grayscale = CN()
config.ARGUMENT.val.grayscale.use = False
config.ARGUMENT.val.fliplr = CN()
config.ARGUMENT.val.fliplr.use = False
config.ARGUMENT.val.fliplr.p = 1.0
config.ARGUMENT.val.flipud = CN()
config.ARGUMENT.val.flipud.use = False
config.ARGUMENT.val.flipud.p = 1.0
config.ARGUMENT.val.rote = CN()
config.ARGUMENT.val.rote.use = False
config.ARGUMENT.val.rote.degrees = [0,0]
config.ARGUMENT.val.JpegCompression = CN()
config.ARGUMENT.val.JpegCompression.use = False
config.ARGUMENT.val.JpegCompression.high = 100
config.ARGUMENT.val.JpegCompression.low = 80
config.ARGUMENT.val.GaussianBlur = CN()
config.ARGUMENT.val.GaussianBlur.use = False
config.ARGUMENT.val.GaussianBlur.high = 0.3
config.ARGUMENT.val.GaussianBlur.low = 0.03
config.ARGUMENT.val.CropToFixedSize = CN()
config.ARGUMENT.val.CropToFixedSize.use = False
config.ARGUMENT.val.CropToFixedSize.height = 256
config.ARGUMENT.val.CropToFixedSize.width = 256
config.ARGUMENT.val.CropToFixedSize.position = 'center' # uniform | normal | center | ...
#-------------------Normal------------------------
config.ARGUMENT.val.normal = CN()
config.ARGUMENT.val.normal.use = False
config.ARGUMENT.val.normal.mean = [0.485, 0.456, 0.406]
config.ARGUMENT.val.normal.std = [0.229, 0.224, 0.225]
# *************************************************************************
# configure the model related things
config.MODEL = CN()
config.MODEL.name = '' # the name of the network, such as resnet
config.MODEL.type = '' # the type of the network, such as resnet50, resnet101 or resnet152
config.MODEL.hooks = CN() # determine the hooks used in training and validation
config.MODEL.hooks.train = [] # determine the hooks used during training
config.MODEL.hooks.val = [] # determine the hooks used during validation
config.MODEL.flow_model_path = ''
config.MODEL.discriminator_channels = []
config.MODEL.pretrain_model = ''
config.MODEL.detector_config = ''
config.MODEL.detector_model_path = ''
# configure the resume
config.RESUME = CN()
config.RESUME.flag = False
config.RESUME.checkpoint_path = ''
# configure the freezing layers
config.FINETUNE = CN()
config.FINETUNE.flag = False
config.FINETUNE.layer_list = []
# configure the training process
#-----------------basic-----------------
config.TRAIN = CN()
config.TRAIN.batch_size = 2
config.TRAIN.start_step = 0
config.TRAIN.max_steps = 20000 # epoch * len(dataset)
config.TRAIN.log_step = 5 # the step to print the info
config.TRAIN.vis_step = 100 # the step interval at which to visualize the training results
config.TRAIN.mini_eval_step = 10 # the step to exec the light-weight eval
config.TRAIN.eval_step = 100 # the step to use the evaluate function
config.TRAIN.save_step = 500 # the step to save the model
config.TRAIN.epochs = 1
config.TRAIN.loss = ['mse', 'cross']
config.TRAIN.loss_coefficients = [0.5, 0.5] # the length must match the loss list
config.TRAIN.mode = 'general' # general | adversarial | ....
#===============General Mode Settings==============
config.TRAIN.general = CN()
#---------------Optimizer configure---------------
config.TRAIN.general.optimizer = CN()
config.TRAIN.general.optimizer.include = ['A', 'B', 'C']
config.TRAIN.general.optimizer.name = 'adam'
config.TRAIN.general.optimizer.lr = 1e-3
config.TRAIN.general.optimizer.momentum = 0.9
config.TRAIN.general.optimizer.weight_decay = 0.0001
config.TRAIN.general.optimizer.nesterov = False
config.TRAIN.general.optimizer.output_name = ['optimizer_abc']
#-----------------Scheduler configure--------------
config.TRAIN.general.scheduler = CN()
config.TRAIN.general.scheduler.use = True
config.TRAIN.general.scheduler.name = 'none'
config.TRAIN.general.scheduler.step_size = 30 # the number of iterations; should be len(dataset) * desired_epochs
config.TRAIN.general.scheduler.steps = [10000, 20000] # the number of iterations; should be len(dataset) * desired_epochs
config.TRAIN.general.scheduler.gamma = 0.1
config.TRAIN.general.scheduler.T_max = 300 # used in the cosine annealing LR scheduler
config.TRAIN.general.scheduler.eta_min = 0
config.TRAIN.general.scheduler.warmup_factor = 0.001
config.TRAIN.general.scheduler.warmup_iters = 1000
config.TRAIN.general.scheduler.warmup_method = 'linear' # 'linear' | 'constant'
#==================Adversarial Mode Setting==================
config.TRAIN.adversarial = CN()
#---------------Optimizer configure---------------
config.TRAIN.adversarial.optimizer = CN()
config.TRAIN.adversarial.optimizer.include = ['Generator', 'Discriminator']
config.TRAIN.adversarial.optimizer.name = 'adam'
config.TRAIN.adversarial.optimizer.g_lr = 1e-2
config.TRAIN.adversarial.optimizer.d_lr = 1e-2
config.TRAIN.adversarial.optimizer.weight_decay = 0.0001
config.TRAIN.adversarial.optimizer.output_name = ['optimizer_g', 'optimizer_d']
#-----------------Scheduler configure--------------
config.TRAIN.adversarial.scheduler = CN()
config.TRAIN.adversarial.scheduler.use = True
config.TRAIN.adversarial.scheduler.name = 'none'
config.TRAIN.adversarial.scheduler.step_size = 30 # the number of iterations; should be len(dataset) * desired_epochs
config.TRAIN.adversarial.scheduler.steps = [1000,2000] # the number of iterations; should be len(dataset) * desired_epochs
config.TRAIN.adversarial.scheduler.gamma = 0.1
config.TRAIN.adversarial.scheduler.T_max = 300 # used in the cosine annealing LR scheduler
config.TRAIN.adversarial.scheduler.eta_min = 0
config.TRAIN.adversarial.scheduler.warmup_factor = 0.001
config.TRAIN.adversarial.scheduler.warmup_iters = 5000
config.TRAIN.adversarial.scheduler.warmup_method = 'linear' # 'linear' | 'constant'
#----------------Train save configure------------
config.TRAIN.split = ''
config.TRAIN.model_output = '' # used to save the final model
config.TRAIN.checkpoint_output = '' # used to save intermediate checkpoints (lr, optimizer, state_dict, ...)
config.TRAIN.pusedo_data_path = ''
#-------------------cluster setting--------------
config.TRAIN.cluster = CN()
config.TRAIN.cluster.k = 10
# configure the val process
config.VAL = CN()
config.VAL.name = ''
config.VAL.path = '' # if not use the data in the TRAIN.test_path
config.VAL.batch_size = 2
# configure the test process
config.TEST = CN()
config.TEST.name = ''
config.TEST.path = '' # if not use the data in the TRAIN.test_path
config.TEST.model_file = ''
config.TEST.result_output = ''
config.TEST.label_folder = ''
def _get_cfg_defaults():
    '''
    Get a clone of the default config template.
    Do NOT use this function in other files!
    '''
return config.clone()
def update_config(yaml_path, opts):
    '''
    Update the default config template with the given YAML file and option list
    '''
    print('=> Merging the config with {}'.format(yaml_path))
cfg = _get_cfg_defaults()
cfg.merge_from_file(yaml_path)
cfg.merge_from_list(opts)
cfg.freeze()
return cfg
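# ---------------------------------------------------------------------------
# Illustrative usage sketch (assumption: './experiment.yaml' is a
# hypothetical override file; a YAML file only needs the keys it overrides,
# e.g. "TRAIN:\n  batch_size: 8"). Extra command-line overrides are passed
# as the flat key/value list that yacs' merge_from_list() expects.
if __name__ == '__main__':
    cfg = update_config('./experiment.yaml', ['TRAIN.max_steps', 30000])
    print(cfg.TRAIN.batch_size, cfg.TRAIN.max_steps)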
| [
"[email protected]"
] | |
e5715062db6a8e7bb894b4f5bd2d275d98e604be | 0fb505765604b586c3a46e608fc23930f8501db5 | /venv/lib/python2.7/site-packages/django/contrib/messages/apps.py | 1a9189383ef26083e87cbc3f46cf61eaad518797 | [
"MIT"
] | permissive | domenicosolazzo/practice-django | b05edecc302d97c97b7ce1de809ea46d59e2f0e6 | 44e74c973384c38bd71e7c8a1aacd1e10d6a6893 | refs/heads/master | 2021-08-19T15:36:22.732954 | 2015-01-22T18:42:14 | 2015-01-22T18:42:14 | 25,118,384 | 0 | 0 | MIT | 2021-06-10T19:50:51 | 2014-10-12T12:08:47 | Python | UTF-8 | Python | false | false | 196 | py | from django.apps import AppConfig
from django.utils.translation import ugettext_lazy as _
class MessagesConfig(AppConfig):
name = 'django.contrib.messages'
verbose_name = _("Messages")
| [
"[email protected]"
] | |
7f847a8a75b0bd5215fc381713b860453fc20803 | d05a59feee839a4af352b7ed2fd6cf10a288a3cb | /xlsxwriter/test/comparison/test_chart_format24.py | 017038f19ec35b57c30a8bd3b6c3821087c59443 | [
"BSD-2-Clause-Views"
] | permissive | elessarelfstone/XlsxWriter | 0d958afd593643f990373bd4d8a32bafc0966534 | bb7b7881c7a93c89d6eaac25f12dda08d58d3046 | refs/heads/master | 2020-09-24T06:17:20.840848 | 2019-11-24T23:43:01 | 2019-11-24T23:43:01 | 225,685,272 | 1 | 0 | NOASSERTION | 2019-12-03T18:09:06 | 2019-12-03T18:09:05 | null | UTF-8 | Python | false | false | 1,555 | py | ###############################################################################
#
# Tests for XlsxWriter.
#
# Copyright (c), 2013-2019, John McNamara, [email protected]
#
from ..excel_comparsion_test import ExcelComparisonTest
from ...workbook import Workbook
class TestCompareXLSXFiles(ExcelComparisonTest):
"""
Test file created by XlsxWriter against a file created by Excel.
"""
def setUp(self):
self.set_filename('chart_format24.xlsx')
def test_create_file(self):
"""Test the creation of an XlsxWriter file with chart formatting."""
workbook = Workbook(self.got_filename)
worksheet = workbook.add_worksheet()
chart = workbook.add_chart({'type': 'column'})
chart.axis_ids = [115374720, 115389568]
data = [
[1, 2, 3, 4, 5],
[2, 4, 6, 8, 10],
[3, 6, 9, 12, 15],
]
worksheet.write_column('A1', data[0])
worksheet.write_column('B1', data[1])
worksheet.write_column('C1', data[2])
chart.add_series({
'categories': '=Sheet1!$A$1:$A$5',
'values': '=Sheet1!$B$1:$B$5',
})
chart.add_series({
'categories': '=Sheet1!$A$1:$A$5',
'values': '=Sheet1!$C$1:$C$5',
})
chart.set_chartarea({'fill': {'color': 'yellow', 'transparency': 75}})
chart.set_plotarea({'fill': {'color': 'red', 'transparency': 25}})
worksheet.insert_chart('E9', chart)
workbook.close()
self.assertExcelEqual()
| [
"[email protected]"
] | |
ec9733d55c01eb7e99f50e9f9fa51fd0cb38aaf4 | 6662aa37e6a5df5ef72359f5e52ba74a77ef00b6 | /sensors/gps_2.py | 886adcf175323e317a3326910e154dad195e52be | [] | no_license | rice-eclipse/RTR-PocketBeagle | b3f4a7536a28d5e412e7184dc8399558264f0e78 | 11939281355204c08f32da316ee35061def254f0 | refs/heads/master | 2023-01-01T18:28:53.201874 | 2020-10-24T18:48:15 | 2020-10-24T18:48:15 | 295,223,021 | 0 | 1 | null | null | null | null | UTF-8 | Python | false | false | 3,667 | py | # Simple GPS module demonstration.
# Will print NMEA sentences received from the GPS, great for testing connection
# Uses the GPS to send some commands, then reads directly from the GPS
import time
import board
#import busio
import adafruit_gps
import digitalio
# Create a serial connection for the GPS connection using default speed and
# a slightly higher timeout (GPS modules typically update once a second).
# These are the defaults you should use for the GPS FeatherWing.
# For other boards set RX = GPS module TX, and TX = GPS module RX pins.
#uart = busio.UART(board.TX, board.RX, baudrate=9600, timeout=10)
# for a computer, use the pyserial library for uart access
import serial
import os
permission_command = "sudo chmod 777 /dev/ttyO0"
sudoPassword = "temppwd"
os.system('echo %s|sudo -S %s' % (sudoPassword, permission_command))
reset = digitalio.DigitalInOut(board.P1_31)
reset.direction = digitalio.Direction.OUTPUT
standby = digitalio.DigitalInOut(board.P1_35)
standby.direction = digitalio.Direction.OUTPUT
fix = digitalio.DigitalInOut(board.P1_34)
fix.direction = digitalio.Direction.INPUT
reset.value = True #set reset pin to high, pin is active low
standby.value = True #set standby pin to high, pin is active low
uart = serial.Serial("/dev/ttyO0", baudrate=9600, timeout=1)
# If using I2C, we'll create an I2C interface to talk to using default pins
# i2c = board.I2C()
# Create a GPS module instance.
gps = adafruit_gps.GPS(uart) # Use UART/pyserial
# gps = adafruit_gps.GPS_GtopI2C(i2c) # Use I2C interface
# Initialize the GPS module by changing what data it sends and at what rate.
# These are NMEA extensions for PMTK_314_SET_NMEA_OUTPUT and
# PMTK_220_SET_NMEA_UPDATERATE but you can send anything from here to adjust
# the GPS module behavior:
# https://cdn-shop.adafruit.com/datasheets/PMTK_A11.pdf
# Turn on the basic GGA and RMC info (what you typically want)
# gps.send_command(b"PMTK314,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0")
# Turn on just minimum info (RMC only, location):
# gps.send_command(b'PMTK314,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0')
# Turn off everything:
# gps.send_command(b'PMTK314,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0')
# Turn on everything (not all of it is parsed!)
gps.send_command(b'PMTK314,1,1,1,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0')
# Set update rate to once a second (1hz) which is what you typically want.
gps.send_command(b"PMTK220,1000")
# Or decrease to once every two seconds by doubling the millisecond value.
# Be sure to also increase your UART timeout above!
# gps.send_command(b'PMTK220,2000')
# You can also speed up the rate, but don't go too fast or else you can lose
# data during parsing. This would be twice a second (2hz, 500ms delay):
# gps.send_command(b'PMTK220,500')
# Main loop runs forever printing data as it comes in
timestamp = time.monotonic()
while True:
#os.system('echo %s|sudo -S %s' % (sudoPassword, permission_command))
    if uart.in_waiting:  # pyserial 3.x property; the bare 'inWaiting' method object was always truthy
print("in waiting")
else:
print("not in waiting")
if fix.value:
print("fix pin is HIGH")
else:
print("fix pin is LOW")
try:
data = gps.read(32) # read up to 32 bytes
print(data) # this is a bytearray type
if data is not None:
# convert bytearray to string
data_string = "".join([chr(b) for b in data])
print(data_string, end="")
if time.monotonic() - timestamp > 5:
# every 5 seconds...
gps.send_command(b"PMTK605") # request firmware version
timestamp = time.monotonic()
except Exception as e:
print("Exception Caught: " + str(e))
| [
"[email protected]"
] | |
317316afc2eda10a2c3dfedce7768855c411e93a | 6aa36fee3f4fcc9ac8f5509e51ea6bd8fc05b39b | /virtualenv-flask/lib/python2.7/site-packages/cybox/helper.py | f65d1b9819c9e719d461e09d6c1d1c63d37795f0 | [] | no_license | syn-ack-zack/msg-stix-parser | 8c46c4d897d579162f224360a077ac42f28ffe89 | 1edb7c3b6d60f76f24b91830a1ae7076d46ede14 | refs/heads/master | 2021-03-27T15:01:07.344754 | 2016-09-30T16:43:22 | 2016-09-30T16:43:22 | 69,684,161 | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 2,178 | py | #!/usr/bin/env python
# Copyright (c) 2013, The MITRE Corporation. All rights reserved.
# See LICENSE.txt for complete terms.
'''
CybOX Common Indicator API
An api for creating observables for common indicators:
ipv4 addresses, domain names, file hashes, and urls.
'''
from cybox.core import Observable
from cybox.common import Hash
from cybox.objects.address_object import Address
from cybox.objects.file_object import File
from cybox.objects.uri_object import URI
def create_ipv4_observable(ipv4_address):
'''Create a CybOX Observable representing an IPv4 address'''
ipv4_object = Address.from_dict({'address_value': ipv4_address,
'category': Address.CAT_IPV4})
return Observable(ipv4_object)
def create_ipv4_list_observables(list_ipv4_addresses):
    '''Create a list of CybOX Observables, each representing an IPv4 address'''
    # create_ipv4_observable() already wraps each Address in an Observable,
    # so it can be reused directly for every entry in the list.
    return [create_ipv4_observable(ipv4_address)
            for ipv4_address in list_ipv4_addresses]
def create_email_address_observable(email_address):
    '''Create a CybOX Observable representing an email address'''
email_address_object = Address.from_dict({'address_value': email_address,
'category': Address.CAT_EMAIL})
return Observable(email_address_object)
def create_domain_name_observable(domain_name):
'''Create a CybOX Observable representing a domain name.'''
domain_name_object = URI.from_dict({'value': domain_name,
'type': URI.TYPE_DOMAIN})
return Observable(domain_name_object)
def create_file_hash_observable(fn, hash_value):
'''Create a CybOX Observable representing a file hash.'''
hash_ = Hash(hash_value)
file_ = File()
file_.file_name = fn
file_.add_hash(hash_)
return Observable(file_)
def create_url_observable(url):
url_object = URI.from_dict({'value': url, 'type': URI.TYPE_URL})
return Observable(url_object)
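# ---------------------------------------------------------------------------
# Minimal usage sketch (assumption: the indicator values below are made-up
# examples, and cybox entities expose to_xml() for serialization, as in the
# cybox Entity API).
if __name__ == '__main__':
    for observable in [create_ipv4_observable('10.0.0.1'),
                       create_domain_name_observable('example.com'),
                       create_file_hash_observable(
                           'dropper.exe', 'd41d8cd98f00b204e9800998ecf8427e'),
                       create_url_observable('http://example.com/payload')]:
        print(observable.to_xml())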
| [
"[email protected]"
] | |
7197e5b320ce581df451038b10cf2e2ca4b3c204 | e28009b0a4584e8d128ed6fbd4ba84a1db11d1b9 | /824.Goat Latin/824.Goat Latin.py | 9a074961d672b68bff7b9d3ae8faee89d99a8b11 | [] | no_license | jerrylance/LeetCode | 509d16e4285296167feb51a80d6c382b3833405e | 06ed3e9b27a3f1c0c517710d57fbbd794fd83e45 | refs/heads/master | 2020-12-02T23:10:27.382142 | 2020-08-02T02:03:54 | 2020-08-02T02:03:54 | 231,141,551 | 3 | 0 | null | null | null | null | UTF-8 | Python | false | false | 906 | py | # LeetCode Solution
# Zeyu Liu
# 2019.6.12
# 824.Goat Latin
from typing import List
# method 1 straight forward
class Solution:
def toGoatLatin(self, S: str) -> str:
vowel = ['a', 'e', 'i', 'o', 'u', 'A', 'E', 'I', 'O', 'U']
res = []
S = S.split()
for i,v in enumerate(S):
if v[0] in vowel:
v += 'ma'
else:
v = v[1:] + v[0]
v += 'ma'
v += 'a' * (i + 1)
res.append(v)
return " ".join(res)
# quick sanity check
solve = Solution()
print(solve.toGoatLatin("I speak Goat Latin"))
# method 2 one-line
class Solution:
def toGoatLatin(self, S: str) -> str:
return ' '.join((w if w[0].lower() in 'aeiou' else w[1:] + w[0]) + 'ma' + 'a' * (i + 1) for i, w in enumerate(S.split()))
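    # How the one-liner above works: it enumerates the words with 1-based
    # position i+1; vowel-initial words keep their spelling, consonant-initial
    # words move the first letter to the end, and every word then gets 'ma'
    # plus (i+1) trailing 'a' characters.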
# quick sanity check
solve = Solution()
print(solve.toGoatLatin("I speak Goat Latin")) | [
"[email protected]"
] | |
5948942590110adf43f925fd248d417aae92946d | ea88fac1da1b02e77180c2e20e1fd49b919eecf5 | /installer/core/providers/aws/boto3/rds.py | 30e547edc215610260da165de7c20d672e56f79c | [
"Apache-2.0"
] | permissive | jacob-kinzer/pacbot | 7eeb667505e0ddfd333f6e423b51b8199b481692 | d78605e9e3fd8c34435636495cd6d51c677754e3 | refs/heads/master | 2020-04-23T21:46:42.251948 | 2019-02-06T20:47:08 | 2019-02-06T20:47:08 | 171,480,569 | 0 | 0 | null | 2019-02-19T13:41:01 | 2019-02-19T13:41:00 | null | UTF-8 | Python | false | false | 1,610 | py | import boto3
def get_rds_client(access_key, secret_key, region):
return boto3.client(
'rds',
region_name=region,
aws_access_key_id=access_key,
aws_secret_access_key=secret_key)
def check_rds_instance_exists(instance_identifier, access_key, secret_key, region):
client = get_rds_client(access_key, secret_key, region)
try:
response = client.describe_db_instances(
DBInstanceIdentifier=instance_identifier
)
        return bool(response['DBInstances'])
    except Exception:  # e.g. instance not found or the API call failed
return False
def check_rds_option_group_exists(name, access_key, secret_key, region):
client = get_rds_client(access_key, secret_key, region)
try:
response = client.describe_option_groups(
OptionGroupName=name
)
        return bool(response['OptionGroupsList'])
    except Exception:  # e.g. option group not found or the API call failed
return False
def check_rds_parameter_group_exists(name, access_key, secret_key, region):
client = get_rds_client(access_key, secret_key, region)
try:
response = client.describe_db_parameter_groups(
DBParameterGroupName=name
)
        return bool(response['DBParameterGroups'])
    except Exception:  # e.g. parameter group not found or the API call failed
return False
def check_rds_subnet_group_exists(name, access_key, secret_key, region):
client = get_rds_client(access_key, secret_key, region)
try:
response = client.describe_db_subnet_groups(
DBSubnetGroupName=name
)
        return bool(response['DBSubnetGroups'])
    except Exception:  # e.g. subnet group not found or the API call failed
return False
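# ---------------------------------------------------------------------------
# Illustrative usage sketch (assumption: the identifier, keys and region
# below are placeholders, not real resources or credentials).
if __name__ == '__main__':
    found = check_rds_instance_exists(
        'my-db-instance',          # hypothetical DBInstanceIdentifier
        'ACCESS_KEY_PLACEHOLDER',
        'SECRET_KEY_PLACEHOLDER',
        'us-east-1')
    print('RDS instance exists:', found)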
| [
"[email protected]"
] | |
d2c259f4c791b27708cdc4496151cd77b169c2a2 | 82b946da326148a3c1c1f687f96c0da165bb2c15 | /sdk/python/pulumi_azure_native/network/v20210201/get_virtual_network_tap.py | 974801dd5908402ca2d9bf972ee8e210857e8afc | [
"BSD-3-Clause",
"Apache-2.0"
] | permissive | morrell/pulumi-azure-native | 3916e978382366607f3df0a669f24cb16293ff5e | cd3ba4b9cb08c5e1df7674c1c71695b80e443f08 | refs/heads/master | 2023-06-20T19:37:05.414924 | 2021-07-19T20:57:53 | 2021-07-19T20:57:53 | 387,815,163 | 0 | 0 | Apache-2.0 | 2021-07-20T14:18:29 | 2021-07-20T14:18:28 | null | UTF-8 | Python | false | false | 8,952 | py | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from ... import _utilities
from . import outputs
__all__ = [
'GetVirtualNetworkTapResult',
'AwaitableGetVirtualNetworkTapResult',
'get_virtual_network_tap',
]
@pulumi.output_type
class GetVirtualNetworkTapResult:
"""
Virtual Network Tap resource.
"""
def __init__(__self__, destination_load_balancer_front_end_ip_configuration=None, destination_network_interface_ip_configuration=None, destination_port=None, etag=None, id=None, location=None, name=None, network_interface_tap_configurations=None, provisioning_state=None, resource_guid=None, tags=None, type=None):
if destination_load_balancer_front_end_ip_configuration and not isinstance(destination_load_balancer_front_end_ip_configuration, dict):
raise TypeError("Expected argument 'destination_load_balancer_front_end_ip_configuration' to be a dict")
pulumi.set(__self__, "destination_load_balancer_front_end_ip_configuration", destination_load_balancer_front_end_ip_configuration)
if destination_network_interface_ip_configuration and not isinstance(destination_network_interface_ip_configuration, dict):
raise TypeError("Expected argument 'destination_network_interface_ip_configuration' to be a dict")
pulumi.set(__self__, "destination_network_interface_ip_configuration", destination_network_interface_ip_configuration)
if destination_port and not isinstance(destination_port, int):
raise TypeError("Expected argument 'destination_port' to be a int")
pulumi.set(__self__, "destination_port", destination_port)
if etag and not isinstance(etag, str):
raise TypeError("Expected argument 'etag' to be a str")
pulumi.set(__self__, "etag", etag)
if id and not isinstance(id, str):
raise TypeError("Expected argument 'id' to be a str")
pulumi.set(__self__, "id", id)
if location and not isinstance(location, str):
raise TypeError("Expected argument 'location' to be a str")
pulumi.set(__self__, "location", location)
if name and not isinstance(name, str):
raise TypeError("Expected argument 'name' to be a str")
pulumi.set(__self__, "name", name)
if network_interface_tap_configurations and not isinstance(network_interface_tap_configurations, list):
raise TypeError("Expected argument 'network_interface_tap_configurations' to be a list")
pulumi.set(__self__, "network_interface_tap_configurations", network_interface_tap_configurations)
if provisioning_state and not isinstance(provisioning_state, str):
raise TypeError("Expected argument 'provisioning_state' to be a str")
pulumi.set(__self__, "provisioning_state", provisioning_state)
if resource_guid and not isinstance(resource_guid, str):
raise TypeError("Expected argument 'resource_guid' to be a str")
pulumi.set(__self__, "resource_guid", resource_guid)
if tags and not isinstance(tags, dict):
raise TypeError("Expected argument 'tags' to be a dict")
pulumi.set(__self__, "tags", tags)
if type and not isinstance(type, str):
raise TypeError("Expected argument 'type' to be a str")
pulumi.set(__self__, "type", type)
@property
@pulumi.getter(name="destinationLoadBalancerFrontEndIPConfiguration")
def destination_load_balancer_front_end_ip_configuration(self) -> Optional['outputs.FrontendIPConfigurationResponse']:
"""
The reference to the private IP address on the internal Load Balancer that will receive the tap.
"""
return pulumi.get(self, "destination_load_balancer_front_end_ip_configuration")
@property
@pulumi.getter(name="destinationNetworkInterfaceIPConfiguration")
def destination_network_interface_ip_configuration(self) -> Optional['outputs.NetworkInterfaceIPConfigurationResponse']:
"""
The reference to the private IP Address of the collector nic that will receive the tap.
"""
return pulumi.get(self, "destination_network_interface_ip_configuration")
@property
@pulumi.getter(name="destinationPort")
def destination_port(self) -> Optional[int]:
"""
The VXLAN destination port that will receive the tapped traffic.
"""
return pulumi.get(self, "destination_port")
@property
@pulumi.getter
def etag(self) -> str:
"""
A unique read-only string that changes whenever the resource is updated.
"""
return pulumi.get(self, "etag")
@property
@pulumi.getter
def id(self) -> Optional[str]:
"""
Resource ID.
"""
return pulumi.get(self, "id")
@property
@pulumi.getter
def location(self) -> Optional[str]:
"""
Resource location.
"""
return pulumi.get(self, "location")
@property
@pulumi.getter
def name(self) -> str:
"""
Resource name.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="networkInterfaceTapConfigurations")
def network_interface_tap_configurations(self) -> Sequence['outputs.NetworkInterfaceTapConfigurationResponse']:
"""
Specifies the list of resource IDs for the network interface IP configuration that needs to be tapped.
"""
return pulumi.get(self, "network_interface_tap_configurations")
@property
@pulumi.getter(name="provisioningState")
def provisioning_state(self) -> str:
"""
The provisioning state of the virtual network tap resource.
"""
return pulumi.get(self, "provisioning_state")
@property
@pulumi.getter(name="resourceGuid")
def resource_guid(self) -> str:
"""
The resource GUID property of the virtual network tap resource.
"""
return pulumi.get(self, "resource_guid")
@property
@pulumi.getter
def tags(self) -> Optional[Mapping[str, str]]:
"""
Resource tags.
"""
return pulumi.get(self, "tags")
@property
@pulumi.getter
def type(self) -> str:
"""
Resource type.
"""
return pulumi.get(self, "type")
class AwaitableGetVirtualNetworkTapResult(GetVirtualNetworkTapResult):
# pylint: disable=using-constant-test
def __await__(self):
if False:
yield self
return GetVirtualNetworkTapResult(
destination_load_balancer_front_end_ip_configuration=self.destination_load_balancer_front_end_ip_configuration,
destination_network_interface_ip_configuration=self.destination_network_interface_ip_configuration,
destination_port=self.destination_port,
etag=self.etag,
id=self.id,
location=self.location,
name=self.name,
network_interface_tap_configurations=self.network_interface_tap_configurations,
provisioning_state=self.provisioning_state,
resource_guid=self.resource_guid,
tags=self.tags,
type=self.type)
def get_virtual_network_tap(resource_group_name: Optional[str] = None,
tap_name: Optional[str] = None,
opts: Optional[pulumi.InvokeOptions] = None) -> AwaitableGetVirtualNetworkTapResult:
"""
Virtual Network Tap resource.
:param str resource_group_name: The name of the resource group.
:param str tap_name: The name of virtual network tap.
"""
__args__ = dict()
__args__['resourceGroupName'] = resource_group_name
__args__['tapName'] = tap_name
if opts is None:
opts = pulumi.InvokeOptions()
if opts.version is None:
opts.version = _utilities.get_version()
__ret__ = pulumi.runtime.invoke('azure-native:network/v20210201:getVirtualNetworkTap', __args__, opts=opts, typ=GetVirtualNetworkTapResult).value
return AwaitableGetVirtualNetworkTapResult(
destination_load_balancer_front_end_ip_configuration=__ret__.destination_load_balancer_front_end_ip_configuration,
destination_network_interface_ip_configuration=__ret__.destination_network_interface_ip_configuration,
destination_port=__ret__.destination_port,
etag=__ret__.etag,
id=__ret__.id,
location=__ret__.location,
name=__ret__.name,
network_interface_tap_configurations=__ret__.network_interface_tap_configurations,
provisioning_state=__ret__.provisioning_state,
resource_guid=__ret__.resource_guid,
tags=__ret__.tags,
type=__ret__.type)
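# ---------------------------------------------------------------------------
# Illustrative usage sketch (not part of the generated SDK; the resource
# group and tap names below are hypothetical). Inside a Pulumi program the
# lookup is a plain function call:
#
#     import pulumi
#     import pulumi_azure_native.network.v20210201 as network
#
#     tap = network.get_virtual_network_tap(
#         resource_group_name='my-rg',
#         tap_name='my-vnet-tap')
#     pulumi.export('tapProvisioningState', tap.provisioning_state)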
| [
"[email protected]"
] | |
929c23a9620adf2f1c003ec9135e45ad81261b01 | 010c5fbc97731286be00028ff33fc981d943bca3 | /primal/src/system/bin-data/lift_over.py | a2fcb668400bd441d0a7b9f2cd1b78e1afe870a1 | [] | no_license | orenlivne/ober | 6ce41e0f75d3a8baebc53e28d7f6ae4aeb645f30 | 810b16b2611f32c191182042240851152784edea | refs/heads/master | 2021-01-23T13:48:49.172653 | 2014-04-03T13:57:44 | 2014-04-03T13:57:44 | 6,902,212 | 7 | 1 | null | null | null | null | UTF-8 | Python | false | false | 1,139 | py | #!/usr/bin/env python
'''
============================================================
Convert coordinates read from stdin from one human genome
build to another.
Usage: lift_over.py <from-build> <to-build>
stdin line format: chrom bp_in_from_build
stdout line format: bp_in_to_build, or '-' if not found
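Example (illustrative; the coordinate below is made up):
  echo "1 743267" | lift_over.py hg18 hg19
prints the matching hg19 position on a single line, or '-' if unmapped.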
Created on February 19, 2014
@author: Oren Livne <[email protected]>
============================================================
'''
import sys, traceback, util
from pyliftover import LiftOver
if __name__ == '__main__':
try:
src, target = sys.argv[1:3]
if src == target:
for _, bp in (line.strip().split(' ') for line in sys.stdin):
                print '%d' % (int(bp),)  # one target-build coordinate per line, matching the conversion branch below
else:
lo = LiftOver(src, target)
for chrom, bp in (line.strip().split(' ') for line in sys.stdin):
out = lo.convert_coordinate('chr' + chrom, int(bp))
if not out:
print '-'
else:
print '%d' % (out[0][1],)
except:
traceback.print_exc(file=sys.stdout)
sys.exit(util.EXIT_FAILURE)
| [
"[email protected]"
] | |
414bc115ae17b2308119de2eb29319cb4b453588 | 8e6005ff82a6b37b8c4e2a2fed5791323837d316 | /HighMassAnalysis/Analysis/vbf_muTau_LS.py | 3d44fab3699a7b0094d4520d2f6e8f035c7b18ee | [] | no_license | CMSRA2Tau/gurrola-sl5-on-sl6 | 633050a5ec5fd1a81a15c2e1dcf6b4952b718a9e | f56a99cd7121bcbdf301c2bea9fe397a6b9ef6a1 | refs/heads/master | 2020-04-15T06:13:09.462508 | 2014-12-17T17:57:01 | 2014-12-17T17:57:01 | null | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 76,430 | py |
import FWCore.ParameterSet.Config as cms
import copy
process = cms.Process('HiMassTau')
process.load('Configuration.StandardSequences.Services_cff')
process.load('Configuration.StandardSequences.FrontierConditions_GlobalTag_cff')
process.GlobalTag.globaltag = 'START53_V7F::All'
process.load('JetMETCorrections.Configuration.JetCorrectionServices_cff')
process.load('CommonTools/RecoAlgos/HBHENoiseFilter_cfi')
process.load('FWCore.MessageService.MessageLogger_cfi')
process.MessageLogger.cerr.FwkReport.reportEvery = 100
process.maxEvents = cms.untracked.PSet(
input = cms.untracked.int32(-1)
)
process.source = cms.Source("PoolSource",
skipEvents = cms.untracked.uint32(0),
fileNames = cms.untracked.vstring(
'file:skimPat_100_1_Smw.root',
'file:skimPat_101_1_Gyx.root',
'file:skimPat_102_1_eQH.root',
'file:skimPat_104_1_v5d.root',
)
)
process.TFileService = cms.Service("TFileService",
fileName = cms.string("analysis.root")
)
process.analyzeHiMassTau = cms.EDAnalyzer('HiMassTauAnalysis')
process.analyzeHiMassTau.DoMSUGRApoint = cms.bool(False)
process.analyzeHiMassTau.Prefix = cms.string('susyScan')
process.analyzeHiMassTau.Suffix = cms.string('')
process.analyzeHiMassTau.ScanFormat = cms.string(r"# model msugra_(\\d*)_(\\d*)_(m?\\d*)_(m?\\d*)_(m?\\d)\\s")
process.analyzeHiMassTau.ScanParameters = cms.vstring('M0', 'M12', 'tanbeta', 'A0', 'Mu')
process.analyzeHiMassTau.DoSMpoint = cms.bool(False)
process.analyzeHiMassTau.mLSP = cms.double(425.)
process.analyzeHiMassTau.mGL = cms.double(1150.)
#----- Fill histos?
process.analyzeHiMassTau.FillRecoVertexHists = cms.bool(True)
process.analyzeHiMassTau.FillGenTauHists = cms.bool(True)
process.analyzeHiMassTau.FillRecoTauHists = cms.bool(True)
process.analyzeHiMassTau.FillRecoMuonHists = cms.bool(True)
process.analyzeHiMassTau.FillRecoElectronHists = cms.bool(True)
process.analyzeHiMassTau.FillRecoJetHists = cms.bool(True)
process.analyzeHiMassTau.FillTopologyHists = cms.bool(True)
#-----Generator level Inputs
process.analyzeHiMassTau.GenParticleSource = cms.untracked.InputTag('genParticles')
process.analyzeHiMassTau.GenTauPtMinCut = cms.double(0.)
process.analyzeHiMassTau.GenTauPtMaxCut = cms.double(9999.)
process.analyzeHiMassTau.GenTauEtaMaxCut = cms.double(999.)
process.analyzeHiMassTau.SelectSusyScanPoint = cms.bool(False)
process.analyzeHiMassTau.M0 = cms.double(400.)
process.analyzeHiMassTau.M12 = cms.double(360.)
#-----Reco Tau Inputs
process.analyzeHiMassTau.RecoTauSource = cms.InputTag('patTaus')
process.analyzeHiMassTau.RecoTau1EtaCut = cms.double(2.1)
process.analyzeHiMassTau.RecoTau1PtMinCut = cms.double(20.)
process.analyzeHiMassTau.RecoTau1PtMaxCut = cms.double(9999.)
process.analyzeHiMassTau.DoRecoTau1DiscrByLeadTrack = cms.bool(False)
process.analyzeHiMassTau.UseRecoTau1DiscrByLeadTrackFlag = cms.bool(True)
process.analyzeHiMassTau.RecoTau1DiscrByLeadTrack = cms.untracked.string('leadingPionPtCut')
process.analyzeHiMassTau.DoRecoTau1DiscrByLeadTrackNhits = cms.bool(False)
process.analyzeHiMassTau.RecoTau1LeadTrackMinHits = cms.int32(12)
process.analyzeHiMassTau.DoRecoTau1DiscrByH3x3OverP = cms.bool(False)
process.analyzeHiMassTau.RecoTau1H3x3OverP = cms.double(0.03)
process.analyzeHiMassTau.DoRecoTau1DiscrByIsolation = cms.bool(True)
process.analyzeHiMassTau.UseRecoTau1DiscrByIsolationFlag = cms.bool(True)
process.analyzeHiMassTau.RecoTau1DiscrByIsolation = cms.untracked.string('byTightIsolationMVA3newDMwLT')
process.analyzeHiMassTau.UseRecoTau1IsoSumPtInsteadOfNiso = cms.bool(False)
process.analyzeHiMassTau.UseRecoTau1EllipseForEcalIso = cms.bool(False)
process.analyzeHiMassTau.RecoTau1EcalIsoRphiForEllipse = cms.double(0.15)
process.analyzeHiMassTau.RecoTau1EcalIsoRetaForEllipse = cms.double(0.07)
process.analyzeHiMassTau.RecoTau1TrackNisoMax = cms.int32(0)
process.analyzeHiMassTau.RecoTau1EcalNisoMax = cms.int32(0)
process.analyzeHiMassTau.RecoTau1TrackIsoSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.RecoTau1TrackIsoSumPtMinCutValue = cms.double(0.0)
process.analyzeHiMassTau.RecoTau1EcalIsoSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.RecoTau1EcalIsoSumPtMinCutValue = cms.double(0.0)
process.analyzeHiMassTau.RecoTau1DiscrByProngType = cms.string('1or3hps')
process.analyzeHiMassTau.RecoTau1LeadTrackThreshold = cms.double(5.0)
process.analyzeHiMassTau.RecoTau1SigGamThreshold = cms.double(1.0)
process.analyzeHiMassTau.RecoTau1IsoDeltaRCone = cms.double(0.5)
process.analyzeHiMassTau.RecoTau1TrackIsoTrkThreshold = cms.double(0.8)
process.analyzeHiMassTau.RecoTau1GammaIsoGamThreshold = cms.double(0.8)
process.analyzeHiMassTau.DoRecoTau1DiscrAgainstElectron = cms.bool(True)
process.analyzeHiMassTau.RecoTau1DiscrAgainstElectron = cms.untracked.string('againstElectronLooseMVA5')
process.analyzeHiMassTau.DoRecoTau1DiscrByCrackCut = cms.bool(False)
process.analyzeHiMassTau.DoRecoTau1DiscrAgainstMuon = cms.bool(True)
process.analyzeHiMassTau.RecoTau1DiscrAgainstMuon = cms.untracked.string('againstMuonTight3')
process.analyzeHiMassTau.SelectTau1sThatAreMuons = cms.bool(False)
process.analyzeHiMassTau.SelectTau1sThatAreElectrons = cms.bool(False)
process.analyzeHiMassTau.RemoveTau1OverlapWithMuon1s = cms.bool(True)
process.analyzeHiMassTau.Tau1Muon1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveTau1OverlapWithElectron1s = cms.bool(False)
process.analyzeHiMassTau.Tau1Electron1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveTau1OverlapWithMuon2s = cms.bool(False)
process.analyzeHiMassTau.Tau1Muon2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveTau1OverlapWithElectron2s = cms.bool(False)
process.analyzeHiMassTau.Tau1Electron2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RecoTau2EtaCut = cms.double(2.1)
process.analyzeHiMassTau.RecoTau2PtMinCut = cms.double(20.)
process.analyzeHiMassTau.RecoTau2PtMaxCut = cms.double(9999.)
process.analyzeHiMassTau.DoRecoTau2DiscrByLeadTrack = cms.bool(False)
process.analyzeHiMassTau.UseRecoTau2DiscrByLeadTrackFlag = cms.bool(True)
process.analyzeHiMassTau.RecoTau2DiscrByLeadTrack = cms.untracked.string('leadingPionPtCut')
process.analyzeHiMassTau.DoRecoTau2DiscrByLeadTrackNhits = cms.bool(False)
process.analyzeHiMassTau.RecoTau2LeadTrackMinHits = cms.int32(12)
process.analyzeHiMassTau.DoRecoTau2DiscrByH3x3OverP = cms.bool(False)
process.analyzeHiMassTau.RecoTau2H3x3OverP = cms.double(0.03)
process.analyzeHiMassTau.DoRecoTau2DiscrByIsolation = cms.bool(True)
process.analyzeHiMassTau.UseRecoTau2DiscrByIsolationFlag = cms.bool(True)
process.analyzeHiMassTau.RecoTau2DiscrByIsolation = cms.untracked.string('byTightIsolationMVA3newDMwLT')
process.analyzeHiMassTau.UseRecoTau2IsoSumPtInsteadOfNiso = cms.bool(False)
process.analyzeHiMassTau.UseRecoTau2EllipseForEcalIso = cms.bool(False)
process.analyzeHiMassTau.RecoTau2EcalIsoRphiForEllipse = cms.double(0.15)
process.analyzeHiMassTau.RecoTau2EcalIsoRetaForEllipse = cms.double(0.07)
process.analyzeHiMassTau.RecoTau2TrackNisoMax = cms.int32(0)
process.analyzeHiMassTau.RecoTau2EcalNisoMax = cms.int32(0)
process.analyzeHiMassTau.RecoTau2TrackIsoSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.RecoTau2TrackIsoSumPtMinCutValue = cms.double(0.0)
process.analyzeHiMassTau.RecoTau2EcalIsoSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.RecoTau2EcalIsoSumPtMinCutValue = cms.double(0.0)
process.analyzeHiMassTau.RecoTau2DiscrByProngType = cms.string('1or3hps')
process.analyzeHiMassTau.RecoTau2LeadTrackThreshold = cms.double(5.0)
process.analyzeHiMassTau.RecoTau2SigGamThreshold = cms.double(1.0)
process.analyzeHiMassTau.RecoTau2IsoDeltaRCone = cms.double(0.5)
process.analyzeHiMassTau.RecoTau2TrackIsoTrkThreshold = cms.double(0.8)
process.analyzeHiMassTau.RecoTau2GammaIsoGamThreshold = cms.double(0.8)
process.analyzeHiMassTau.DoRecoTau2DiscrAgainstElectron = cms.bool(True)
process.analyzeHiMassTau.RecoTau2DiscrAgainstElectron = cms.untracked.string('againstElectronLooseMVA5')
process.analyzeHiMassTau.DoRecoTau2DiscrByCrackCut = cms.bool(False)
process.analyzeHiMassTau.DoRecoTau2DiscrAgainstMuon = cms.bool(True)
process.analyzeHiMassTau.RecoTau2DiscrAgainstMuon = cms.untracked.string('againstMuonTight3')
process.analyzeHiMassTau.SelectTau2sThatAreMuons = cms.bool(False)
process.analyzeHiMassTau.SelectTau2sThatAreElectrons = cms.bool(False)
process.analyzeHiMassTau.RemoveTau2OverlapWithMuon1s = cms.bool(False)
process.analyzeHiMassTau.Tau2Muon1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveTau2OverlapWithElectron1s = cms.bool(False)
process.analyzeHiMassTau.Tau2Electron1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveTau2OverlapWithMuon2s = cms.bool(False)
process.analyzeHiMassTau.Tau2Muon2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveTau2OverlapWithElectron2s = cms.bool(False)
process.analyzeHiMassTau.Tau2Electron2MatchingDeltaR = cms.double(0.3)
#-----Reco Muon Inputs
process.analyzeHiMassTau.RecoMuonSource = cms.InputTag('patMuons')
process.analyzeHiMassTau.UseTuneP = cms.bool(False)
process.analyzeHiMassTau.RecoMuon1EtaCut = cms.double(2.1)
process.analyzeHiMassTau.RecoMuon1PtMinCut = cms.double(30.)
process.analyzeHiMassTau.RecoMuon1PtMaxCut = cms.double(9999.)
process.analyzeHiMassTau.DoRecoMuon1DiscrByGlobal = cms.bool(True)
process.analyzeHiMassTau.DoRecoMuon1DiscrByIsolation = cms.bool(True)
process.analyzeHiMassTau.RecoMuon1IsoSumPtMaxCutValue = cms.double(0.2)
process.analyzeHiMassTau.RecoMuon1IsoSumPtMinCutValue = cms.double(0.0)
process.analyzeHiMassTau.RecoMuon1IsoDeltaRCone = cms.double(0.4)
process.analyzeHiMassTau.RecoMuon1TrackIsoTrkThreshold = cms.double(1.0)
process.analyzeHiMassTau.RecoMuon1EcalIsoRecHitThreshold = cms.double(1.0)
process.analyzeHiMassTau.DoRecoMuon1DiscrByIp = cms.bool(True)
process.analyzeHiMassTau.RecoMuon1IpCut = cms.double(0.2)
process.analyzeHiMassTau.RecoMuon1dzCut = cms.double(0.5)
process.analyzeHiMassTau.DoRecoMuon1DiscrByPionVeto = cms.bool(False)
process.analyzeHiMassTau.RecoMuon1CaloCompCoefficient = cms.double(0.8)
process.analyzeHiMassTau.RecoMuon1SegmCompCoefficient = cms.double(1.2)
process.analyzeHiMassTau.RecoMuon1AntiPionCut = cms.double(1.0)
process.analyzeHiMassTau.DoRecoMuon1DiscrByNormalizedChi2 = cms.bool(False)
process.analyzeHiMassTau.RecoMuon1NormalizedChi2MaxCut = cms.int32(10)
process.analyzeHiMassTau.DoRecoMuon1DiscrByChamberHits = cms.bool(True)
process.analyzeHiMassTau.RecoMuon1ChamberHitsMinCut = cms.int32(0)
process.analyzeHiMassTau.DoRecoMuon1DiscrByMatchedStations = cms.bool(True)
process.analyzeHiMassTau.RecoMuon1MatchedStationsMinCut = cms.int32(1)
process.analyzeHiMassTau.DoRecoMuon1DiscrByPixelHits = cms.bool(True)
process.analyzeHiMassTau.RecoMuon1PixelHitsMinCut = cms.int32(0)
process.analyzeHiMassTau.DoRecoMuon1DiscrByTrackerLayerWithHits = cms.bool(True)
process.analyzeHiMassTau.RecoMuon1TrackerLayerWithHitsMinCut = cms.int32(5)
process.analyzeHiMassTau.DoRecoMuon1DiscrByDptOpt = cms.bool(False)
process.analyzeHiMassTau.RecoMuon1DptOptMaxCut = cms.double(0.3)
process.analyzeHiMassTau.RecoMuon2EtaCut = cms.double(2.1)
process.analyzeHiMassTau.RecoMuon2PtMinCut = cms.double(10.)
process.analyzeHiMassTau.RecoMuon2PtMaxCut = cms.double(9999.)
process.analyzeHiMassTau.DoRecoMuon2DiscrByGlobal = cms.bool(True)
process.analyzeHiMassTau.DoRecoMuon2DiscrByIsolation = cms.bool(True)
process.analyzeHiMassTau.RecoMuon2IsoSumPtMaxCutValue = cms.double(0.2)
process.analyzeHiMassTau.RecoMuon2IsoSumPtMinCutValue = cms.double(0.0)
process.analyzeHiMassTau.RecoMuon2IsoDeltaRCone = cms.double(0.4)
process.analyzeHiMassTau.RecoMuon2TrackIsoTrkThreshold = cms.double(1.0)
process.analyzeHiMassTau.RecoMuon2EcalIsoRecHitThreshold = cms.double(1.0)
process.analyzeHiMassTau.DoRecoMuon2DiscrByIp = cms.bool(True)
process.analyzeHiMassTau.RecoMuon2IpCut = cms.double(0.2)
process.analyzeHiMassTau.RecoMuon2dzCut = cms.double(0.5)
process.analyzeHiMassTau.DoRecoMuon2DiscrByPionVeto = cms.bool(False)
process.analyzeHiMassTau.RecoMuon2CaloCompCoefficient = cms.double(0.8)
process.analyzeHiMassTau.RecoMuon2SegmCompCoefficient = cms.double(1.2)
process.analyzeHiMassTau.RecoMuon2AntiPionCut = cms.double(1.0)
process.analyzeHiMassTau.DoRecoMuon2DiscrByNormalizedChi2 = cms.bool(False)
process.analyzeHiMassTau.RecoMuon2NormalizedChi2MaxCut = cms.int32(10)
process.analyzeHiMassTau.DoRecoMuon2DiscrByChamberHits = cms.bool(True)
process.analyzeHiMassTau.RecoMuon2ChamberHitsMinCut = cms.int32(0)
process.analyzeHiMassTau.DoRecoMuon2DiscrByMatchedStations = cms.bool(True)
process.analyzeHiMassTau.RecoMuon2MatchedStationsMinCut = cms.int32(1)
process.analyzeHiMassTau.DoRecoMuon2DiscrByPixelHits = cms.bool(True)
process.analyzeHiMassTau.RecoMuon2PixelHitsMinCut = cms.int32(0)
process.analyzeHiMassTau.DoRecoMuon2DiscrByTrackerLayerWithHits = cms.bool(True)
process.analyzeHiMassTau.RecoMuon2TrackerLayerWithHitsMinCut = cms.int32(5)
process.analyzeHiMassTau.DoRecoMuon2DiscrByDptOpt = cms.bool(False)
process.analyzeHiMassTau.RecoMuon2DptOptMaxCut = cms.double(0.3)
#-----Reco Electron Inputs
process.analyzeHiMassTau.RecoElectronSource = cms.InputTag('heepPatElectrons')
process.analyzeHiMassTau.IsoValElec = cms.VInputTag(cms.InputTag('elPFIsoValueCharged03PFIdPFIso'),
cms.InputTag('elPFIsoValueNeutral03PFIdPFIso'),
cms.InputTag('elPFIsoValueGamma03NoPFIdPFIso'),
cms.InputTag('elPFIsoValuePU03NoPFIdPFIso'))
process.analyzeHiMassTau.UseHeepInfo = cms.bool(False)
process.analyzeHiMassTau.RecoElectron1EtaCut = cms.double(2.1)
process.analyzeHiMassTau.RecoElectron1PtMinCut = cms.double(30.)
process.analyzeHiMassTau.RecoElectron1PtMaxCut = cms.double(9999.)
process.analyzeHiMassTau.DoRecoElectron1DiscrByIsolation = cms.bool(True)
process.analyzeHiMassTau.RecoElectron1IsolationCone = cms.string('dR03')
process.analyzeHiMassTau.RecoElectron1IsoSumPtMaxCutValue = cms.double(0.15)
process.analyzeHiMassTau.RecoElectron1IsoSumPtMinCutValue = cms.double(0.0)
process.analyzeHiMassTau.DoRecoElectron1DiscrByIp = cms.bool(True)
process.analyzeHiMassTau.RecoElectron1IpCut = cms.double(0.02)
process.analyzeHiMassTau.RecoElectron1dzCut = cms.double(0.1)
process.analyzeHiMassTau.DoRecoElectron1DiscrByEoverP = cms.bool(True)
process.analyzeHiMassTau.RecoElectron1EoverPMax = cms.double(0.05)
process.analyzeHiMassTau.RecoElectron1EoverPMin = cms.double(0.0)
process.analyzeHiMassTau.DoRecoElectron1DiscrByHoverEm = cms.bool(True)
process.analyzeHiMassTau.RecoElectron1EEHoverEmCut = cms.double(0.10)
process.analyzeHiMassTau.RecoElectron1EBHoverEmCut = cms.double(0.12)
process.analyzeHiMassTau.DoRecoElectron1DiscrBySigmaIEtaIEta = cms.bool(True)
process.analyzeHiMassTau.RecoElectron1EESigmaIEtaIEta = cms.double(0.03)
process.analyzeHiMassTau.RecoElectron1EBSigmaIEtaIEta = cms.double(0.01)
process.analyzeHiMassTau.DoRecoElectron1DiscrByDEtaIn = cms.bool(True)
process.analyzeHiMassTau.RecoElectron1EEDEtaIn = cms.double(0.007)
process.analyzeHiMassTau.RecoElectron1EBDEtaIn = cms.double(0.004)
process.analyzeHiMassTau.DoRecoElectron1DiscrByDPhiIn = cms.bool(True)
process.analyzeHiMassTau.RecoElectron1EEDPhiIn = cms.double(0.03)
process.analyzeHiMassTau.RecoElectron1EBDPhiIn = cms.double(0.06)
process.analyzeHiMassTau.DoRecoElectron1DiscrBySCE2by5Over5by5 = cms.bool(False)
process.analyzeHiMassTau.RecoElectron1EBscE1by5Over5by5 = cms.double(0.83)
process.analyzeHiMassTau.RecoElectron1EBscE2by5Over5by5 = cms.double(0.94)
process.analyzeHiMassTau.DoRecoElectron1DiscrByMissingHits = cms.bool(True)
process.analyzeHiMassTau.RecoElectron1MissingHits = cms.int32(1)
process.analyzeHiMassTau.DoRecoElectron1DiscrByEcalDrivenSeed = cms.bool(False)
process.analyzeHiMassTau.DoRecoElectron1DiscrByTrackerDrivenSeed = cms.bool(False)
process.analyzeHiMassTau.RecoElectron2EtaCut = cms.double(2.1)
process.analyzeHiMassTau.RecoElectron2PtMinCut = cms.double(10.)
process.analyzeHiMassTau.RecoElectron2PtMaxCut = cms.double(9999.)
process.analyzeHiMassTau.DoRecoElectron2DiscrByIsolation = cms.bool(True)
process.analyzeHiMassTau.RecoElectron2IsolationCone = cms.string('dR03')
process.analyzeHiMassTau.RecoElectron2IsoSumPtMaxCutValue = cms.double(0.15)
process.analyzeHiMassTau.RecoElectron2IsoSumPtMinCutValue = cms.double(0.0)
process.analyzeHiMassTau.DoRecoElectron2DiscrByIp = cms.bool(True)
process.analyzeHiMassTau.RecoElectron2IpCut = cms.double(0.02)
process.analyzeHiMassTau.RecoElectron2dzCut = cms.double(0.1)
process.analyzeHiMassTau.DoRecoElectron2DiscrByEoverP = cms.bool(True)
process.analyzeHiMassTau.RecoElectron2EoverPMax = cms.double(0.05)
process.analyzeHiMassTau.RecoElectron2EoverPMin = cms.double(0.0)
process.analyzeHiMassTau.DoRecoElectron2DiscrByHoverEm = cms.bool(True)
process.analyzeHiMassTau.RecoElectron2EEHoverEmCut = cms.double(0.10)
process.analyzeHiMassTau.RecoElectron2EBHoverEmCut = cms.double(0.12)
process.analyzeHiMassTau.DoRecoElectron2DiscrBySigmaIEtaIEta = cms.bool(True)
process.analyzeHiMassTau.RecoElectron2EESigmaIEtaIEta = cms.double(0.03)
process.analyzeHiMassTau.RecoElectron2EBSigmaIEtaIEta = cms.double(0.01)
process.analyzeHiMassTau.DoRecoElectron2DiscrByDEtaIn = cms.bool(True)
process.analyzeHiMassTau.RecoElectron2EEDEtaIn = cms.double(0.007)
process.analyzeHiMassTau.RecoElectron2EBDEtaIn = cms.double(0.004)
process.analyzeHiMassTau.DoRecoElectron2DiscrByDPhiIn = cms.bool(True)
process.analyzeHiMassTau.RecoElectron2EEDPhiIn = cms.double(0.03)
process.analyzeHiMassTau.RecoElectron2EBDPhiIn = cms.double(0.06)
process.analyzeHiMassTau.DoRecoElectron2DiscrBySCE2by5Over5by5 = cms.bool(False)
process.analyzeHiMassTau.RecoElectron2EBscE1by5Over5by5 = cms.double(0.83)
process.analyzeHiMassTau.RecoElectron2EBscE2by5Over5by5 = cms.double(0.94)
process.analyzeHiMassTau.DoRecoElectron2DiscrByMissingHits = cms.bool(True)
process.analyzeHiMassTau.RecoElectron2MissingHits = cms.int32(1)
process.analyzeHiMassTau.DoRecoElectron2DiscrByEcalDrivenSeed = cms.bool(False)
process.analyzeHiMassTau.DoRecoElectron2DiscrByTrackerDrivenSeed = cms.bool(False)
#-----Reco Jet Inputs
process.analyzeHiMassTau.RecoJetSource = cms.InputTag('selectedPatJets')
process.analyzeHiMassTau.puJetIdwp = cms.untracked.string('puJetIdLoose') # puJetIdLoose puJetIdMedium puJetIdTight
process.analyzeHiMassTau.UseCorrectedJet = cms.bool(True)
process.analyzeHiMassTau.RecoJet1EtaMinCut = cms.double(0.0)
process.analyzeHiMassTau.RecoJet1EtaMaxCut = cms.double(5.0)
process.analyzeHiMassTau.RecoJet1PtCut = cms.double(30.0)
process.analyzeHiMassTau.ApplyJet1LooseID = cms.bool(True)
process.analyzeHiMassTau.ApplyJet1PileupID = cms.bool(False)
process.analyzeHiMassTau.RemoveJet1OverlapWithMuon1s = cms.bool(True)
process.analyzeHiMassTau.Jet1Muon1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveJet1OverlapWithElectron1s = cms.bool(False)
process.analyzeHiMassTau.Jet1Electron1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveJet1OverlapWithTau1s = cms.bool(True)
process.analyzeHiMassTau.Jet1Tau1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveJet1OverlapWithMuon2s = cms.bool(False)
process.analyzeHiMassTau.Jet1Muon2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveJet1OverlapWithElectron2s = cms.bool(False)
process.analyzeHiMassTau.Jet1Electron2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveJet1OverlapWithTau2s = cms.bool(False)
process.analyzeHiMassTau.Jet1Tau2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RecoJet2EtaMinCut = cms.double(0.0)
process.analyzeHiMassTau.RecoJet2EtaMaxCut = cms.double(5.0)
process.analyzeHiMassTau.RecoJet2PtCut = cms.double(30.0)
process.analyzeHiMassTau.ApplyJet2LooseID = cms.bool(True)
process.analyzeHiMassTau.ApplyJet2PileupID = cms.bool(False)
process.analyzeHiMassTau.RemoveJet2OverlapWithMuon1s = cms.bool(True)
process.analyzeHiMassTau.Jet2Muon1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveJet2OverlapWithElectron1s = cms.bool(False)
process.analyzeHiMassTau.Jet2Electron1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveJet2OverlapWithTau1s = cms.bool(True)
process.analyzeHiMassTau.Jet2Tau1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveJet2OverlapWithMuon2s = cms.bool(False)
process.analyzeHiMassTau.Jet2Muon2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveJet2OverlapWithElectron2s = cms.bool(False)
process.analyzeHiMassTau.Jet2Electron2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveJet2OverlapWithTau2s = cms.bool(False)
process.analyzeHiMassTau.Jet2Tau2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RecoCentralJetPtCut = cms.double(30.0)
process.analyzeHiMassTau.ApplyCentralJetLooseID = cms.bool(True)
process.analyzeHiMassTau.ApplyCentralJetPileupID = cms.bool(False)
process.analyzeHiMassTau.RemoveCentralJetOverlapWithMuon1s = cms.bool(True)
process.analyzeHiMassTau.CentralJetMuon1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveCentralJetOverlapWithElectron1s = cms.bool(False)
process.analyzeHiMassTau.CentralJetElectron1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveCentralJetOverlapWithTau1s = cms.bool(True)
process.analyzeHiMassTau.CentralJetTau1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveCentralJetOverlapWithMuon2s = cms.bool(False)
process.analyzeHiMassTau.CentralJetMuon2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveCentralJetOverlapWithElectron2s = cms.bool(False)
process.analyzeHiMassTau.CentralJetElectron2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveCentralJetOverlapWithTau2s = cms.bool(False)
process.analyzeHiMassTau.CentralJetTau2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.ApplyLeadingJetsLooseID = cms.bool(True)
process.analyzeHiMassTau.ApplyLeadingJetsPileupID = cms.bool(False)
process.analyzeHiMassTau.DoDiscrByFirstLeadingJet = cms.bool(True)
process.analyzeHiMassTau.RecoFirstLeadingJetPt = cms.double(30.0)
process.analyzeHiMassTau.RecoFirstLeadingJetEtaMinCut = cms.double(0.0)
process.analyzeHiMassTau.RecoFirstLeadingJetEtaMaxCut = cms.double(5.0)
process.analyzeHiMassTau.DoDiscrBySecondLeadingJet = cms.bool(True)
process.analyzeHiMassTau.RecoSecondLeadingJetPt = cms.double(30.0)
process.analyzeHiMassTau.RecoSecondLeadingJetEtaMinCut = cms.double(0.0)
process.analyzeHiMassTau.RecoSecondLeadingJetEtaMaxCut = cms.double(5.0)
process.analyzeHiMassTau.RemoveFirstLeadingJetOverlapWithMuon1s = cms.bool(True)
process.analyzeHiMassTau.FirstLeadingJetMuon1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveFirstLeadingJetOverlapWithElectron1s = cms.bool(False)
process.analyzeHiMassTau.FirstLeadingJetElectron1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveFirstLeadingJetOverlapWithTau1s = cms.bool(True)
process.analyzeHiMassTau.FirstLeadingJetTau1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveSecondLeadingJetOverlapWithMuon1s = cms.bool(True)
process.analyzeHiMassTau.SecondLeadingJetMuon1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveSecondLeadingJetOverlapWithElectron1s = cms.bool(False)
process.analyzeHiMassTau.SecondLeadingJetElectron1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveSecondLeadingJetOverlapWithTau1s = cms.bool(True)
process.analyzeHiMassTau.SecondLeadingJetTau1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveFirstLeadingJetOverlapWithMuon2s = cms.bool(False)
process.analyzeHiMassTau.FirstLeadingJetMuon2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveFirstLeadingJetOverlapWithElectron2s = cms.bool(False)
process.analyzeHiMassTau.FirstLeadingJetElectron2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveFirstLeadingJetOverlapWithTau2s = cms.bool(False)
process.analyzeHiMassTau.FirstLeadingJetTau2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveSecondLeadingJetOverlapWithMuon2s = cms.bool(False)
process.analyzeHiMassTau.SecondLeadingJetMuon2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveSecondLeadingJetOverlapWithElectron2s = cms.bool(False)
process.analyzeHiMassTau.SecondLeadingJetElectron2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveSecondLeadingJetOverlapWithTau2s = cms.bool(False)
process.analyzeHiMassTau.SecondLeadingJetTau2MatchingDeltaR = cms.double(0.3)
#-----Reco b-Jet Inputs
process.analyzeHiMassTau.RecoBJetEtaMinCut = cms.double(0.0)
process.analyzeHiMassTau.RecoBJetEtaMaxCut = cms.double(2.5)
process.analyzeHiMassTau.RecoBJetPtCut = cms.double(20.0)
process.analyzeHiMassTau.RemoveBJetOverlapWithMuon1s = cms.bool(True)
process.analyzeHiMassTau.BJetMuon1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveBJetOverlapWithElectron1s = cms.bool(False)
process.analyzeHiMassTau.BJetElectron1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveBJetOverlapWithTau1s = cms.bool(True)
process.analyzeHiMassTau.BJetTau1MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveBJetOverlapWithMuon2s = cms.bool(False)
process.analyzeHiMassTau.BJetMuon2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveBJetOverlapWithElectron2s = cms.bool(False)
process.analyzeHiMassTau.BJetElectron2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.RemoveBJetOverlapWithTau2s = cms.bool(False)
process.analyzeHiMassTau.BJetTau2MatchingDeltaR = cms.double(0.3)
process.analyzeHiMassTau.ApplyJetBTagging = cms.bool(True)
process.analyzeHiMassTau.bTagger = cms.untracked.string('combinedSecondaryVertexBJetTags')
process.analyzeHiMassTau.JetBTaggingCut = cms.double(0.244) # CSVL=0.244, CSVM=0.679, CSVT=0.898
#-----Vertex Inputs
process.analyzeHiMassTau.RecoVertexSource = cms.InputTag('offlinePrimaryVertices') # vertex collection
process.analyzeHiMassTau.RecoVertexMaxZposition = cms.double(20.0) # vertex |z| < X
process.analyzeHiMassTau.RecoVertexMinTracks = cms.int32(2) # vertex must have >= 2 "good" tracks used to reconstruct it
process.analyzeHiMassTau.RecoVertexTrackWeight = cms.double(0.5) # weight used to define "good" tracks used to reconstruct vertex
#-----Trigger Inputs
process.analyzeHiMassTau.RecoTriggerSource = cms.InputTag("TriggerResults","","HLT") # trigger collection
process.analyzeHiMassTau.Trigger1Requirements = cms.vstring('HLT_IsoMu24_eta2p1_v') # trigger path name
process.analyzeHiMassTau.Trigger2Requirements = cms.vstring('HLT_IsoMu24_eta2p1_v') # trigger path name
#-----Susy Topology Inputs
process.analyzeHiMassTau.JetPtForMhtAndHt = cms.double(30.0)
process.analyzeHiMassTau.JetEtaForMhtAndHt = cms.double(5.0)
process.analyzeHiMassTau.ApplyJetLooseIDforMhtAndHt = cms.bool(True)
process.analyzeHiMassTau.ApplyJetPileupIDforMhtAndHt = cms.bool(False)
process.analyzeHiMassTau.DoSUSYDiscrByMHT = cms.bool(False)
process.analyzeHiMassTau.MhtCut = cms.double(250.0)
process.analyzeHiMassTau.DoSUSYDiscrByHT = cms.bool(False)
process.analyzeHiMassTau.HtCut = cms.double(250.0)
process.analyzeHiMassTau.DoSUSYDiscrByR1 = cms.bool(False)
process.analyzeHiMassTau.R1MinCut = cms.double(0.85)
process.analyzeHiMassTau.R1MaxCut = cms.double(999.0)
process.analyzeHiMassTau.DoSUSYDiscrByR2 = cms.bool(False)
process.analyzeHiMassTau.R2MinCut = cms.double(0.0)
process.analyzeHiMassTau.R2MaxCut = cms.double(3.6)
process.analyzeHiMassTau.DoSUSYDiscrByAlpha = cms.bool(False)
process.analyzeHiMassTau.AlphaMinCut = cms.double(0.5)
process.analyzeHiMassTau.AlphaMaxCut = cms.double(9999999999.9)
process.analyzeHiMassTau.DoSUSYDiscrByDphi1 = cms.bool(False)
process.analyzeHiMassTau.Dphi1MinCut = cms.double(0.9)
process.analyzeHiMassTau.Dphi1MaxCut = cms.double(999.9)
process.analyzeHiMassTau.DoSUSYDiscrByDphi2 = cms.bool(False)
process.analyzeHiMassTau.Dphi2MinCut = cms.double(0.5)
process.analyzeHiMassTau.Dphi2MaxCut = cms.double(9999.5)
process.analyzeHiMassTau.DoSUSYDiscrByLeadDiJetMass = cms.bool(False)
process.analyzeHiMassTau.LeadDiJetMinMassCut = cms.double(650.0)
process.analyzeHiMassTau.LeadDiJetMaxMassCut = cms.double(9999.0)
process.analyzeHiMassTau.DoSUSYDiscrByLeadDiJetPt = cms.bool(False)
process.analyzeHiMassTau.LeadDiJetMinPtCut = cms.double(0.5)
process.analyzeHiMassTau.LeadDiJetMaxPtCut = cms.double(0.5)
process.analyzeHiMassTau.DoSUSYDiscrByLeadDiJetDeltaEta = cms.bool(False)
process.analyzeHiMassTau.LeadDiJetMinDeltaEtaCut = cms.double(4.2)
process.analyzeHiMassTau.LeadDiJetMaxDeltaEtaCut = cms.double(9999.0)
process.analyzeHiMassTau.DoSUSYDiscrByLeadDiJetDeltaPhi = cms.bool(False)
process.analyzeHiMassTau.LeadDiJetMinDeltaPhiCut = cms.double(4.2)
process.analyzeHiMassTau.LeadDiJetMaxDeltaPhiCut = cms.double(9999.0)
process.analyzeHiMassTau.DoSUSYDiscrByLeadDiJetOSEta = cms.bool(False)
#-----Topology Inputs
process.analyzeHiMassTau.RecoMetSource = cms.InputTag('patPfMetT0pcT1Txy')
process.analyzeHiMassTau.DoDiscrByMet = cms.bool(True)
process.analyzeHiMassTau.TreatMuonsAsNeutrinos = cms.bool(False)
process.analyzeHiMassTau.RecoMetCut = cms.double(75.0)
process.analyzeHiMassTau.DoDiJetDiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.DiJetDeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.DoDiJetDiscrByDeltaEta = cms.bool(True)
process.analyzeHiMassTau.DiJetMinDeltaEtaCut = cms.double(4.2)
process.analyzeHiMassTau.DiJetMaxDeltaEtaCut = cms.double(999.9)
process.analyzeHiMassTau.DoDiJetDiscrByDeltaPhi = cms.bool(False)
process.analyzeHiMassTau.DiJetMinDeltaPhiCut = cms.double(0.0)
process.analyzeHiMassTau.DiJetMaxDeltaPhiCut = cms.double(2.5)
process.analyzeHiMassTau.DoDiJetDiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.DiJetCosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.DiJetCosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByDiJetMassReco = cms.bool(False)
process.analyzeHiMassTau.DiJetMassMinCut = cms.double(1000.0)
process.analyzeHiMassTau.DiJetMassMaxCut = cms.double(9999.0)
process.analyzeHiMassTau.DoDiJetDiscrByOSEta = cms.bool(True)
process.analyzeHiMassTau.DoDiMuonDiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.DiMuonDeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.DiMuonDiscrByOSLSType = cms.string('NONE')
process.analyzeHiMassTau.DoDiMuonDiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.DiMuonCosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.DiMuonCosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByDiMuonMassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfDiMuonProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxDiMuonMassReco = cms.bool(False)
process.analyzeHiMassTau.DiMuonMassMinCut = cms.double(60.0)
process.analyzeHiMassTau.DiMuonMassMaxCut = cms.double(120.0)
process.analyzeHiMassTau.DoDiMuonDiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.DiMuonPZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.DiMuonPZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.DiMuonCDFzeta2DCutValue = cms.double(-7.00)
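# Note: as the parameter names suggest, the CDF zeta cut evaluated here has the form
#   PZetaCutCoefficient*pZeta + PZetaVisCutCoefficient*pZetaVis > CDFzeta2DCutValue,
# i.e. with the values above: pZeta - 0.875*pZetaVis > -7.0. The same convention
# applies to the analogous blocks for the other lepton pairings below.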
process.analyzeHiMassTau.DoDiMuonDiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.DiMuonDeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.DiMuonDeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoDiMuonDiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.DiMuonDeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.DiMuonDeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoDiTauDiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.DiTauDeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.DiTauDiscrByOSLSType = cms.string('NONE')
process.analyzeHiMassTau.UseTauSeedTrackForDiTauDiscrByOSLS = cms.bool(False)
process.analyzeHiMassTau.DoDiTauDiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.DiTauCosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.DiTauCosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByDiTauMassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfDiTauProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxDiTauMassReco = cms.bool(False)
process.analyzeHiMassTau.DiTauMassMinCut = cms.double(150.0)
process.analyzeHiMassTau.DiTauMassMaxCut = cms.double(1000.0)
process.analyzeHiMassTau.DoDiTauDiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.DiTauPZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.DiTauPZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.DiTauCDFzeta2DCutValue = cms.double(-7.00)
process.analyzeHiMassTau.DoDiTauDiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.DiTauDeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.DiTauDeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoDiTauDiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.DiTauDeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.DiTauDeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoDiElectronDiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.DiElectronDeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.DiElectronDiscrByOSLSType = cms.string('NONE')
process.analyzeHiMassTau.DoDiElectronDiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.DiElectronCosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.DiElectronCosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByDiElectronMassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfDiElectronProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxDiElectronMassReco = cms.bool(False)
process.analyzeHiMassTau.DiElectronMassMinCut = cms.double(150.0)
process.analyzeHiMassTau.DiElectronMassMaxCut = cms.double(1000.0)
process.analyzeHiMassTau.DoDiElectronDiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.DiElectronPZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.DiElectronPZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.DiElectronCDFzeta2DCutValue = cms.double(-7.00)
process.analyzeHiMassTau.DoDiElectronDiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.DiElectronDeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.DiElectronDeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoDiElectronDiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.DiElectronDeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.DiElectronDeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoMuon1Tau1DiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.Muon1Tau1DeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.Muon1Tau1DiscrByOSLSType = cms.string('LS')
process.analyzeHiMassTau.UseTauSeedTrackForMuon1Tau1DiscrByOSLS = cms.bool(False)
process.analyzeHiMassTau.DoMuon1Tau1DiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.Muon1Tau1CosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.Muon1Tau1CosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByMuon1Tau1MassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfMuon1Tau1ProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxMuon1Tau1MassReco = cms.bool(False)
process.analyzeHiMassTau.Muon1Tau1MassMinCut = cms.double(150.0)
process.analyzeHiMassTau.Muon1Tau1MassMaxCut = cms.double(1000.0)
process.analyzeHiMassTau.DoMuon1Tau1DiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.Muon1Tau1PZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.Muon1Tau1PZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.Muon1Tau1CDFzeta2DCutValue = cms.double(-7.00)
process.analyzeHiMassTau.DoMuon1Tau1DiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.Muon1Tau1DeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.Muon1Tau1DeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoMuon1Tau1DiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.Muon1Tau1DeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.Muon1Tau1DeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoMuon1Tau2DiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.Muon1Tau2DeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.Muon1Tau2DiscrByOSLSType = cms.string('NONE')
process.analyzeHiMassTau.UseTauSeedTrackForMuon1Tau2DiscrByOSLS = cms.bool(False)
process.analyzeHiMassTau.DoMuon1Tau2DiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.Muon1Tau2CosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.Muon1Tau2CosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByMuon1Tau2MassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfMuon1Tau2ProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxMuon1Tau2MassReco = cms.bool(False)
process.analyzeHiMassTau.Muon1Tau2MassMinCut = cms.double(150.0)
process.analyzeHiMassTau.Muon1Tau2MassMaxCut = cms.double(1000.0)
process.analyzeHiMassTau.DoMuon1Tau2DiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.Muon1Tau2PZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.Muon1Tau2PZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.Muon1Tau2CDFzeta2DCutValue = cms.double(-7.00)
process.analyzeHiMassTau.DoMuon1Tau2DiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.Muon1Tau2DeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.Muon1Tau2DeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoMuon1Tau2DiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.Muon1Tau2DeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.Muon1Tau2DeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoMuon2Tau1DiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.Muon2Tau1DeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.Muon2Tau1DiscrByOSLSType = cms.string('NONE')
process.analyzeHiMassTau.UseTauSeedTrackForMuon2Tau1DiscrByOSLS = cms.bool(False)
process.analyzeHiMassTau.DoMuon2Tau1DiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.Muon2Tau1CosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.Muon2Tau1CosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByMuon2Tau1MassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfMuon2Tau1ProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxMuon2Tau1MassReco = cms.bool(False)
process.analyzeHiMassTau.Muon2Tau1MassMinCut = cms.double(150.0)
process.analyzeHiMassTau.Muon2Tau1MassMaxCut = cms.double(1000.0)
process.analyzeHiMassTau.DoMuon2Tau1DiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.Muon2Tau1PZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.Muon2Tau1PZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.Muon2Tau1CDFzeta2DCutValue = cms.double(-7.00)
process.analyzeHiMassTau.DoMuon2Tau1DiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.Muon2Tau1DeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.Muon2Tau1DeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoMuon2Tau1DiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.Muon2Tau1DeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.Muon2Tau1DeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoMuon2Tau2DiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.Muon2Tau2DeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.Muon2Tau2DiscrByOSLSType = cms.string('NONE')
process.analyzeHiMassTau.UseTauSeedTrackForMuon2Tau2DiscrByOSLS = cms.bool(False)
process.analyzeHiMassTau.DoMuon2Tau2DiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.Muon2Tau2CosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.Muon2Tau2CosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByMuon2Tau2MassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfMuon2Tau2ProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxMuon2Tau2MassReco = cms.bool(False)
process.analyzeHiMassTau.Muon2Tau2MassMinCut = cms.double(150.0)
process.analyzeHiMassTau.Muon2Tau2MassMaxCut = cms.double(1000.0)
process.analyzeHiMassTau.DoMuon2Tau2DiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.Muon2Tau2PZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.Muon2Tau2PZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.Muon2Tau2CDFzeta2DCutValue = cms.double(-7.00)
process.analyzeHiMassTau.DoMuon2Tau2DiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.Muon2Tau2DeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.Muon2Tau2DeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoMuon2Tau2DiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.Muon2Tau2DeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.Muon2Tau2DeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoElectron1Tau1DiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.Electron1Tau1DeltaRCut = cms.double(0.7)
process.analyzeHiMassTau.Electron1Tau1DiscrByOSLSType = cms.string('NONE')
process.analyzeHiMassTau.UseTauSeedTrackForElectron1Tau1DiscrByOSLS = cms.bool(False)
process.analyzeHiMassTau.DoElectron1Tau1DiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.Electron1Tau1CosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.Electron1Tau1CosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByElectron1Tau1MassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfElectron1Tau1ProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxElectron1Tau1MassReco = cms.bool(False)
process.analyzeHiMassTau.Electron1Tau1MassMinCut = cms.double(150.0)
process.analyzeHiMassTau.Electron1Tau1MassMaxCut = cms.double(1000.0)
process.analyzeHiMassTau.DoElectron1Tau1DiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.Electron1Tau1PZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.Electron1Tau1PZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.Electron1Tau1CDFzeta2DCutValue = cms.double(-7.00)
process.analyzeHiMassTau.DoElectron1Tau1DiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.Electron1Tau1DeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.Electron1Tau1DeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoElectron1Tau1DiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.Electron1Tau1DeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.Electron1Tau1DeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoElectron1Tau2DiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.Electron1Tau2DeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.Electron1Tau2DiscrByOSLSType = cms.string('NONE')
process.analyzeHiMassTau.UseTauSeedTrackForElectron1Tau2DiscrByOSLS = cms.bool(False)
process.analyzeHiMassTau.DoElectron1Tau2DiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.Electron1Tau2CosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.Electron1Tau2CosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByElectron1Tau2MassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfElectron1Tau2ProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxElectron1Tau2MassReco = cms.bool(False)
process.analyzeHiMassTau.Electron1Tau2MassMinCut = cms.double(150.0)
process.analyzeHiMassTau.Electron1Tau2MassMaxCut = cms.double(1000.0)
process.analyzeHiMassTau.DoElectron1Tau2DiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.Electron1Tau2PZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.Electron1Tau2PZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.Electron1Tau2CDFzeta2DCutValue = cms.double(-7.00)
process.analyzeHiMassTau.DoElectron1Tau2DiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.Electron1Tau2DeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.Electron1Tau2DeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoElectron1Tau2DiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.Electron1Tau2DeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.Electron1Tau2DeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoElectron2Tau1DiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.Electron2Tau1DeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.Electron2Tau1DiscrByOSLSType = cms.string('NONE')
process.analyzeHiMassTau.UseTauSeedTrackForElectron2Tau1DiscrByOSLS = cms.bool(False)
process.analyzeHiMassTau.DoElectron2Tau1DiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.Electron2Tau1CosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.Electron2Tau1CosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByElectron2Tau1MassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfElectron2Tau1ProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxElectron2Tau1MassReco = cms.bool(False)
process.analyzeHiMassTau.Electron2Tau1MassMinCut = cms.double(150.0)
process.analyzeHiMassTau.Electron2Tau1MassMaxCut = cms.double(1000.0)
process.analyzeHiMassTau.DoElectron2Tau1DiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.Electron2Tau1PZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.Electron2Tau1PZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.Electron2Tau1CDFzeta2DCutValue = cms.double(-7.00)
process.analyzeHiMassTau.DoElectron2Tau1DiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.Electron2Tau1DeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.Electron2Tau1DeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoElectron2Tau1DiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.Electron2Tau1DeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.Electron2Tau1DeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoElectron2Tau2DiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.Electron2Tau2DeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.Electron2Tau2DiscrByOSLSType = cms.string('NONE')
process.analyzeHiMassTau.UseTauSeedTrackForElectron2Tau2DiscrByOSLS = cms.bool(False)
process.analyzeHiMassTau.DoElectron2Tau2DiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.Electron2Tau2CosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.Electron2Tau2CosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByElectron2Tau2MassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfElectron2Tau2ProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxElectron2Tau2MassReco = cms.bool(False)
process.analyzeHiMassTau.Electron2Tau2MassMinCut = cms.double(150.0)
process.analyzeHiMassTau.Electron2Tau2MassMaxCut = cms.double(1000.0)
process.analyzeHiMassTau.DoElectron2Tau2DiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.Electron2Tau2PZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.Electron2Tau2PZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.Electron2Tau2CDFzeta2DCutValue = cms.double(-7.00)
process.analyzeHiMassTau.DoElectron2Tau2DiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.Electron2Tau2DeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.Electron2Tau2DeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoElectron2Tau2DiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.Electron2Tau2DeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.Electron2Tau2DeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoElectron1Muon1DiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.Electron1Muon1DeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.Electron1Muon1DiscrByOSLSType = cms.string('NONE')
process.analyzeHiMassTau.DoElectron1Muon1DiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.Electron1Muon1CosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.Electron1Muon1CosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByElectron1Muon1MassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfElectron1Muon1ProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxElectron1Muon1MassReco = cms.bool(False)
process.analyzeHiMassTau.Electron1Muon1MassMinCut = cms.double(150.0)
process.analyzeHiMassTau.Electron1Muon1MassMaxCut = cms.double(1000.0)
process.analyzeHiMassTau.DoElectron1Muon1DiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.Electron1Muon1PZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.Electron1Muon1PZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.Electron1Muon1CDFzeta2DCutValue = cms.double(-7.00)
process.analyzeHiMassTau.DoElectron1Muon1DiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.Electron1Muon1DeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.Electron1Muon1DeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoElectron1Muon1DiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.Electron1Muon1DeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.Electron1Muon1DeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoElectron1Muon2DiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.Electron1Muon2DeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.Electron1Muon2DiscrByOSLSType = cms.string('NONE')
process.analyzeHiMassTau.DoElectron1Muon2DiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.Electron1Muon2CosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.Electron1Muon2CosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByElectron1Muon2MassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfElectron1Muon2ProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxElectron1Muon2MassReco = cms.bool(False)
process.analyzeHiMassTau.Electron1Muon2MassMinCut = cms.double(150.0)
process.analyzeHiMassTau.Electron1Muon2MassMaxCut = cms.double(1000.0)
process.analyzeHiMassTau.DoElectron1Muon2DiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.Electron1Muon2PZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.Electron1Muon2PZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.Electron1Muon2CDFzeta2DCutValue = cms.double(-7.00)
process.analyzeHiMassTau.DoElectron1Muon2DiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.Electron1Muon2DeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.Electron1Muon2DeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoElectron1Muon2DiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.Electron1Muon2DeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.Electron1Muon2DeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoElectron2Muon1DiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.Electron2Muon1DeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.Electron2Muon1DiscrByOSLSType = cms.string('NONE')
process.analyzeHiMassTau.DoElectron2Muon1DiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.Electron2Muon1CosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.Electron2Muon1CosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByElectron2Muon1MassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfElectron2Muon1ProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxElectron2Muon1MassReco = cms.bool(False)
process.analyzeHiMassTau.Electron2Muon1MassMinCut = cms.double(150.0)
process.analyzeHiMassTau.Electron2Muon1MassMaxCut = cms.double(1000.0)
process.analyzeHiMassTau.DoElectron2Muon1DiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.Electron2Muon1PZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.Electron2Muon1PZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.Electron2Muon1CDFzeta2DCutValue = cms.double(-7.00)
process.analyzeHiMassTau.DoElectron2Muon1DiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.Electron2Muon1DeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.Electron2Muon1DeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoElectron2Muon1DiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.Electron2Muon1DeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.Electron2Muon1DeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoElectron2Muon2DiscrByDeltaR = cms.bool(True)
process.analyzeHiMassTau.Electron2Muon2DeltaRCut = cms.double(0.3)
process.analyzeHiMassTau.Electron2Muon2DiscrByOSLSType = cms.string('NONE')
process.analyzeHiMassTau.DoElectron2Muon2DiscrByCosDphi = cms.bool(False)
process.analyzeHiMassTau.Electron2Muon2CosDphiMaxCut = cms.double(-0.95)
process.analyzeHiMassTau.Electron2Muon2CosDphiMinCut = cms.double(-1.00)
process.analyzeHiMassTau.DoDiscrByElectron2Muon2MassReco = cms.bool(False)
process.analyzeHiMassTau.UseVectorSumOfElectron2Muon2ProductsAndMetMassReco = cms.bool(False)
process.analyzeHiMassTau.UseCollinerApproxElectron2Muon2MassReco = cms.bool(False)
process.analyzeHiMassTau.Electron2Muon2MassMinCut = cms.double(150.0)
process.analyzeHiMassTau.Electron2Muon2MassMaxCut = cms.double(1000.0)
process.analyzeHiMassTau.DoElectron2Muon2DiscrByCDFzeta2D = cms.bool(False)
process.analyzeHiMassTau.Electron2Muon2PZetaCutCoefficient = cms.double(1.0)
process.analyzeHiMassTau.Electron2Muon2PZetaVisCutCoefficient = cms.double(-0.875)
process.analyzeHiMassTau.Electron2Muon2CDFzeta2DCutValue = cms.double(-7.00)
process.analyzeHiMassTau.DoElectron2Muon2DiscrByDeltaPtDivSumPt = cms.bool(False)
process.analyzeHiMassTau.Electron2Muon2DeltaPtDivSumPtMinCutValue = cms.double(0.1)
process.analyzeHiMassTau.Electron2Muon2DeltaPtDivSumPtMaxCutValue = cms.double(1.0)
process.analyzeHiMassTau.DoElectron2Muon2DiscrByDeltaPt = cms.bool(False)
process.analyzeHiMassTau.Electron2Muon2DeltaPtMinCutValue = cms.double(30.0)
process.analyzeHiMassTau.Electron2Muon2DeltaPtMaxCutValue = cms.double(9999.0)
process.analyzeHiMassTau.DoDiscrByMuon1MetDphi = cms.bool(False)
process.analyzeHiMassTau.Muon1MetDphiMinCut = cms.double(1.30)
process.analyzeHiMassTau.Muon1MetDphiMaxCut = cms.double(3.15)
process.analyzeHiMassTau.DoDiscrByMuon2MetDphi = cms.bool(False)
process.analyzeHiMassTau.Muon2MetDphiMinCut = cms.double(1.30)
process.analyzeHiMassTau.Muon2MetDphiMaxCut = cms.double(3.15)
process.analyzeHiMassTau.DoDiscrByElectron1MetDphi = cms.bool(False)
process.analyzeHiMassTau.Electron1MetDphiMinCut = cms.double(1.30)
process.analyzeHiMassTau.Electron1MetDphiMaxCut = cms.double(3.15)
process.analyzeHiMassTau.DoDiscrByElectron2MetDphi = cms.bool(False)
process.analyzeHiMassTau.Electron2MetDphiMinCut = cms.double(1.30)
process.analyzeHiMassTau.Electron2MetDphiMaxCut = cms.double(3.15)
process.analyzeHiMassTau.DoDiscrByTau1MetDphi = cms.bool(False)
process.analyzeHiMassTau.Tau1MetDphiMinCut = cms.double(1.30)
process.analyzeHiMassTau.Tau1MetDphiMaxCut = cms.double(3.15)
process.analyzeHiMassTau.DoDiscrByTau2MetDphi = cms.bool(False)
process.analyzeHiMassTau.Tau2MetDphiMinCut = cms.double(1.30)
process.analyzeHiMassTau.Tau2MetDphiMaxCut = cms.double(3.15)
process.analyzeHiMassTau.DoDiscrByMuon1MetMt = cms.bool(False)
process.analyzeHiMassTau.Muon1MetMtMinCut = cms.double(0.0)
process.analyzeHiMassTau.Muon1MetMtMaxCut = cms.double(40.0)
process.analyzeHiMassTau.DoDiscrByMuon2MetMt = cms.bool(False)
process.analyzeHiMassTau.Muon2MetMtMinCut = cms.double(0.0)
process.analyzeHiMassTau.Muon2MetMtMaxCut = cms.double(99999.9)
process.analyzeHiMassTau.DoDiscrByElectron1MetMt = cms.bool(False)
process.analyzeHiMassTau.Electron1MetMtMinCut = cms.double(50.0)
process.analyzeHiMassTau.Electron1MetMtMaxCut = cms.double(100.0)
process.analyzeHiMassTau.DoDiscrByElectron2MetMt = cms.bool(False)
process.analyzeHiMassTau.Electron2MetMtMinCut = cms.double(0.0)
process.analyzeHiMassTau.Electron2MetMtMaxCut = cms.double(99999.9)
process.analyzeHiMassTau.DoDiscrByTau1MetMt = cms.bool(False)
process.analyzeHiMassTau.Tau1MetMtMinCut = cms.double(50.0)
process.analyzeHiMassTau.Tau1MetMtMaxCut = cms.double(100.0)
process.analyzeHiMassTau.DoDiscrByTau2MetMt = cms.bool(False)
process.analyzeHiMassTau.Tau2MetMtMinCut = cms.double(0.0)
process.analyzeHiMassTau.Tau2MetMtMaxCut = cms.double(99999.9)
process.analyzeHiMassTau.DoMuon1DiscrByIsZllCut = cms.bool(False)
process.analyzeHiMassTau.DoMuon2DiscrByIsZllCut = cms.bool(False)
process.analyzeHiMassTau.DoElectron1DiscrByIsZllCut = cms.bool(False)
process.analyzeHiMassTau.DoElectron2DiscrByIsZllCut = cms.bool(False)
#-----do matching to gen?
process.analyzeHiMassTau.MatchBToGen = cms.bool(False)
process.analyzeHiMassTau.MatchLeptonToGen = cms.bool(False)
process.analyzeHiMassTau.UseLeptonMotherId = cms.bool(False)
process.analyzeHiMassTau.UseLeptonGrandMotherId = cms.bool(False)
process.analyzeHiMassTau.LeptonMotherId = cms.int32(24)
process.analyzeHiMassTau.LeptonGrandMotherId = cms.int32(32)
process.analyzeHiMassTau.MatchTauToGen = cms.bool(False)
process.analyzeHiMassTau.UseTauMotherId = cms.bool(False)
process.analyzeHiMassTau.UseTauGrandMotherId = cms.bool(False)
process.analyzeHiMassTau.TauMotherId = cms.int32(25)
process.analyzeHiMassTau.TauGrandMotherId = cms.int32(1)
process.analyzeHiMassTau.TauToGenMatchingDeltaR = cms.double(0.25)
process.analyzeHiMassTau.TauDecayModeType = cms.vstring('1','2','3','4','5','6','7') #1 = rho1prong
#2 = a1prong
#3 = a3prong
#4 = noresonance1prong0pizero
#5 = noresonance1prongGte1pizero
#6 = noresonance3prong0pizero
#7 = noresonance3prongGte1pizero
#-----Create the Ntuple?
process.analyzeHiMassTau.DoProduceNtuple = cms.bool(False)
#-----Event Sequence inputs
process.analyzeHiMassTau.GenTauNmin = cms.int32(0)
process.analyzeHiMassTau.GenTauNmax = cms.int32(10000)
process.analyzeHiMassTau.GenTopNmin = cms.int32(0)
process.analyzeHiMassTau.GenTopNmax = cms.int32(10000)
process.analyzeHiMassTau.GenElectronNmin = cms.int32(0)
process.analyzeHiMassTau.GenElectronNmax = cms.int32(10000)
process.analyzeHiMassTau.GenMuonNmin = cms.int32(0)
process.analyzeHiMassTau.GenMuonNmax = cms.int32(10000)
process.analyzeHiMassTau.GenZNmin = cms.int32(0)
process.analyzeHiMassTau.GenZNmax = cms.int32(10000)
process.analyzeHiMassTau.GenWNmin = cms.int32(0)
process.analyzeHiMassTau.GenWNmax = cms.int32(10000)
process.analyzeHiMassTau.GenSMHiggsNmin = cms.int32(0)
process.analyzeHiMassTau.GenSMHiggsNmax = cms.int32(10000)
process.analyzeHiMassTau.RecoVertexNmin = cms.int32(0) # require event to have >=X vertices passing specified cuts
process.analyzeHiMassTau.RecoVertexNmax = cms.int32(10000) # require event to have <=X vertices passing specified cuts
process.analyzeHiMassTau.RecoTriggers1Nmin = cms.int32(0) # require event to pass >=X trigger paths defined above
process.analyzeHiMassTau.RecoMuon1Nmin = cms.int32(1)
process.analyzeHiMassTau.RecoMuon1Nmax = cms.int32(10000)
process.analyzeHiMassTau.RecoMuon2Nmin = cms.int32(0)
process.analyzeHiMassTau.RecoMuon2Nmax = cms.int32(10000)
process.analyzeHiMassTau.RecoElectron1Nmin = cms.int32(0)
process.analyzeHiMassTau.RecoElectron1Nmax = cms.int32(10000)
process.analyzeHiMassTau.RecoElectron2Nmin = cms.int32(0)
process.analyzeHiMassTau.RecoElectron2Nmax = cms.int32(10000)
process.analyzeHiMassTau.RecoTau1Nmin = cms.int32(1)
process.analyzeHiMassTau.RecoTau1Nmax = cms.int32(10000)
process.analyzeHiMassTau.RecoTau2Nmin = cms.int32(0)
process.analyzeHiMassTau.RecoTau2Nmax = cms.int32(10000)
process.analyzeHiMassTau.RecoJet1Nmin = cms.int32(2)
process.analyzeHiMassTau.RecoJet1Nmax = cms.int32(10000)
process.analyzeHiMassTau.RecoJet2Nmin = cms.int32(2)
process.analyzeHiMassTau.RecoJet2Nmax = cms.int32(10000)
process.analyzeHiMassTau.RecoCentralJetNmin = cms.int32(0)
process.analyzeHiMassTau.RecoCentralJetNmax = cms.int32(1000)
process.analyzeHiMassTau.RecoFirstLeadingJetNmin = cms.int32(1)
process.analyzeHiMassTau.RecoSecondLeadingJetNmin = cms.int32(1)
process.analyzeHiMassTau.RecoBJetNmin = cms.int32(0) # require event to have >=X "jets" passing specified cuts
process.analyzeHiMassTau.RecoBJetNmax = cms.int32(0) # require event to have <=X "jets" passing specified cuts
process.analyzeHiMassTau.SusyCombinationsNmin = cms.int32(1)
process.analyzeHiMassTau.RecoMuon1MetTopologyNmin = cms.int32(0)
process.analyzeHiMassTau.RecoMuon1MetTopologyNmax = cms.int32(1000)
process.analyzeHiMassTau.RecoMuon2MetTopologyNmin = cms.int32(0)
process.analyzeHiMassTau.RecoMuon2MetTopologyNmax = cms.int32(1000)
process.analyzeHiMassTau.RecoElectron1MetTopologyNmin = cms.int32(0)
process.analyzeHiMassTau.RecoElectron1MetTopologyNmax = cms.int32(1000)
process.analyzeHiMassTau.RecoElectron2MetTopologyNmin = cms.int32(0)
process.analyzeHiMassTau.RecoElectron2MetTopologyNmax = cms.int32(1000)
process.analyzeHiMassTau.RecoTau1MetTopologyNmin = cms.int32(0)
process.analyzeHiMassTau.RecoTau1MetTopologyNmax = cms.int32(1000)
process.analyzeHiMassTau.RecoTau2MetTopologyNmin = cms.int32(0)
process.analyzeHiMassTau.RecoTau2MetTopologyNmax = cms.int32(1000)
process.analyzeHiMassTau.DiMuonCombinationsNmin = cms.int32(0)
process.analyzeHiMassTau.DiMuonCombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.DiElectronCombinationsNmin = cms.int32(0)
process.analyzeHiMassTau.DiElectronCombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.DiTauCombinationsNmin = cms.int32(0)
process.analyzeHiMassTau.DiTauCombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.DiJetCombinationsNmin = cms.int32(1)
process.analyzeHiMassTau.DiJetCombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.Muon1Tau1CombinationsNmin = cms.int32(1)
process.analyzeHiMassTau.Muon1Tau1CombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.Muon1Tau2CombinationsNmin = cms.int32(0)
process.analyzeHiMassTau.Muon1Tau2CombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.Muon2Tau1CombinationsNmin = cms.int32(0)
process.analyzeHiMassTau.Muon2Tau1CombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.Muon2Tau2CombinationsNmin = cms.int32(0)
process.analyzeHiMassTau.Muon2Tau2CombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.Electron1Tau1CombinationsNmin = cms.int32(0)
process.analyzeHiMassTau.Electron1Tau1CombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.Electron1Tau2CombinationsNmin = cms.int32(0)
process.analyzeHiMassTau.Electron1Tau2CombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.Electron2Tau1CombinationsNmin = cms.int32(0)
process.analyzeHiMassTau.Electron2Tau1CombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.Electron2Tau2CombinationsNmin = cms.int32(0)
process.analyzeHiMassTau.Electron2Tau2CombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.Electron1Muon1CombinationsNmin = cms.int32(0)
process.analyzeHiMassTau.Electron1Muon1CombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.Electron1Muon2CombinationsNmin = cms.int32(0)
process.analyzeHiMassTau.Electron1Muon2CombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.Electron2Muon1CombinationsNmin = cms.int32(0)
process.analyzeHiMassTau.Electron2Muon1CombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.Electron2Muon2CombinationsNmin = cms.int32(0)
process.analyzeHiMassTau.Electron2Muon2CombinationsNmax = cms.int32(10000)
process.analyzeHiMassTau.RecoTriggers2Nmin = cms.int32(1)
process.analyzeHiMassTau.TopologicalSelectionSequence = cms.vstring('RecoTriggers1Nmin',
'RecoMuon1Nmax',
'RecoTau1Nmax',
'RecoSecondLeadingJetNmin',
'RecoBJetNmax',
'SusyCombinationsNmin',
'DiJetCombinationsNmax',
'Muon1Tau1CombinationsNmax',
'RecoTriggers2Nmin',
)
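# The strings above name cut counters defined earlier in this configuration;
# the full ordered cut flow is spelled out in EventSelectionSequence below.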
process.analyzeHiMassTau.EventSelectionSequence = cms.vstring('GenTauNmin','GenTauNmax',
'GenTopNmin','GenTopNmax',
'GenElectronNmin','GenElectronNmax',
'GenMuonNmin','GenMuonNmax',
'GenZNmin','GenZNmax',
'GenWNmin','GenWNmax',
'GenSMHiggsNmin','GenSMHiggsNmax',
'RecoVertexNmin','RecoVertexNmax',
'RecoTriggers1Nmin',
'RecoMuon1Nmin','RecoMuon1Nmax',
'RecoMuon2Nmin','RecoMuon2Nmax',
'RecoElectron1Nmin','RecoElectron1Nmax',
'RecoElectron2Nmin','RecoElectron2Nmax',
'RecoTau1Nmin','RecoTau1Nmax',
'RecoTau2Nmin','RecoTau2Nmax',
'RecoJet1Nmin','RecoJet1Nmax',
'RecoJet2Nmin','RecoJet2Nmax',
'RecoCentralJetNmin','RecoCentralJetNmax',
'RecoFirstLeadingJetNmin','RecoSecondLeadingJetNmin',
'RecoBJetNmin','RecoBJetNmax','SusyCombinationsNmin',
'RecoMuon1MetTopologyNmin','RecoMuon1MetTopologyNmax',
'RecoMuon2MetTopologyNmin','RecoMuon2MetTopologyNmax',
'RecoElectron1MetTopologyNmin','RecoElectron1MetTopologyNmax',
'RecoElectron2MetTopologyNmin','RecoElectron2MetTopologyNmax',
'RecoTau1MetTopologyNmin','RecoTau1MetTopologyNmax',
'RecoTau2MetTopologyNmin','RecoTau2MetTopologyNmax',
'DiMuonCombinationsNmin','DiMuonCombinationsNmax',
'DiElectronCombinationsNmin','DiElectronCombinationsNmax',
'DiTauCombinationsNmin','DiTauCombinationsNmax',
'DiJetCombinationsNmin','DiJetCombinationsNmax',
'Muon1Tau1CombinationsNmin','Muon1Tau1CombinationsNmax',
'Muon1Tau2CombinationsNmin','Muon1Tau2CombinationsNmax',
'Muon2Tau1CombinationsNmin','Muon2Tau1CombinationsNmax',
'Muon2Tau2CombinationsNmin','Muon2Tau2CombinationsNmax',
'Electron1Tau1CombinationsNmin','Electron1Tau1CombinationsNmax',
'Electron1Tau2CombinationsNmin','Electron1Tau2CombinationsNmax',
'Electron2Tau1CombinationsNmin','Electron2Tau1CombinationsNmax',
'Electron2Tau2CombinationsNmin','Electron2Tau2CombinationsNmax',
'Electron1Muon1CombinationsNmin','Electron1Muon1CombinationsNmax',
'Electron1Muon2CombinationsNmin','Electron1Muon2CombinationsNmax',
'Electron2Muon1CombinationsNmin','Electron2Muon1CombinationsNmax',
'Electron2Muon2CombinationsNmin','Electron2Muon2CombinationsNmax',
'RecoTriggers2Nmin',
)
#-----Inputs for systematic uncertainties
process.analyzeHiMassTau.CalculatePdfSystematicUncertanties = cms.bool(False) # if true, pdf systematic uncertainties will be calculated
process.analyzeHiMassTau.PdfWeightTags = cms.untracked.VInputTag("cteq65PdfWeights") # collection of weights for systematic uncertainties
process.analyzeHiMassTau.CalculateFSRSystematics = cms.bool(False)
process.analyzeHiMassTau.CalculateISRGluonSystematics = cms.bool(False)
process.analyzeHiMassTau.CalculateISRGammaSystematics = cms.bool(False)
process.analyzeHiMassTau.SmearTheMuon = cms.bool(False)
process.analyzeHiMassTau.MuonPtScaleOffset = cms.double(1.0)
process.analyzeHiMassTau.MuonPtSigmaOffset = cms.double(1.0)
process.analyzeHiMassTau.MuonEtaScaleOffset = cms.double(1.0)
process.analyzeHiMassTau.MuonEtaSigmaOffset = cms.double(1.0)
process.analyzeHiMassTau.MuonPhiScaleOffset = cms.double(1.0)
process.analyzeHiMassTau.MuonPhiSigmaOffset = cms.double(1.0)
process.analyzeHiMassTau.SmearTheElectron = cms.bool(False)
process.analyzeHiMassTau.ElectronPtScaleOffset = cms.double(1.0)
process.analyzeHiMassTau.ElectronPtSigmaOffset = cms.double(1.0)
process.analyzeHiMassTau.ElectronEtaScaleOffset = cms.double(1.0)
process.analyzeHiMassTau.ElectronEtaSigmaOffset = cms.double(1.0)
process.analyzeHiMassTau.ElectronPhiScaleOffset = cms.double(1.0)
process.analyzeHiMassTau.ElectronPhiSigmaOffset = cms.double(1.0)
process.analyzeHiMassTau.SmearTheTau = cms.bool(False)
process.analyzeHiMassTau.TauPtScaleOffset = cms.double(1.0)
process.analyzeHiMassTau.TauPtSigmaOffset = cms.double(1.0)
process.analyzeHiMassTau.TauEtaScaleOffset = cms.double(1.0)
process.analyzeHiMassTau.TauEtaSigmaOffset = cms.double(1.0)
process.analyzeHiMassTau.TauPhiScaleOffset = cms.double(1.0)
process.analyzeHiMassTau.TauPhiSigmaOffset = cms.double(1.0)
process.analyzeHiMassTau.SmearTheJet = cms.bool(False)
process.analyzeHiMassTau.UseDataBaseForJEC = cms.bool(True)
process.analyzeHiMassTau.JES_UpOrDown = cms.double(1.0)
process.analyzeHiMassTau.JetEnergyScaleOffset = cms.double(1.0)
process.analyzeHiMassTau.SmearThePt = cms.bool(False)
process.analyzeHiMassTau.SmearTheEta = cms.bool(False)
process.analyzeHiMassTau.SmearThePhi = cms.bool(False)
process.analyzeHiMassTau.CalculatePUSystematics = cms.bool(False)
process.analyzeHiMassTau.DataHistos = cms.string("DataPUFile_22Jan2013ReReco_Run2012.root")
process.analyzeHiMassTau.MCHistos = cms.string("S10MC_PUFile.root")
process.p = cms.Path(
process.analyzeHiMassTau
)
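# A minimal run sketch (assuming a CMSSW release area is set up, and that the
# input source and GlobalTag are configured earlier in this file):
#   cmsRun thisConfigFile_cfg.py
# where thisConfigFile_cfg.py is a placeholder for this file's actual name.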
| [
"[email protected]"
] | |
4ac2d30029f1696f61dc06f4ed732f1df62f1975 | 5ce393ca7d50b35c41f65188dfe8423646b2fcbc | /flask_with_spark/utilities/decorators.py | eef1b78e8690d55b6feb452f294174d73054c5b1 | [
"MIT"
] | permissive | todhm/have_you_read_this_book | 0c7894319705ed4f047a6b936ab80ea239e5509c | 905cb1934bafc987b76b6e57dbc63285f491ac88 | refs/heads/master | 2022-11-24T07:36:12.514646 | 2018-08-07T05:00:33 | 2018-08-07T05:00:33 | 141,091,550 | 2 | 0 | MIT | 2022-11-11T07:26:01 | 2018-07-16T05:42:52 | JavaScript | UTF-8 | Python | false | false | 750 | py | from functools import wraps
from flask import session, request, redirect, url_for, abort
def login_required(f):
@wraps(f)
def decorated_function(*args,**kwargs):
if session.get('email') is None:
return redirect(url_for('user_app.login',next=request.url))
        if session.get('userIntId') is None:
return redirect(url_for('user_app.login',next=request.url))
return f(*args,**kwargs)
return decorated_function
def logout_required(f):
@wraps(f)
def decorated_function(*args, **kwargs):
if session.get('email') or session.get('userIntId'):
return redirect(url_for('shopping_app.homepage',next=request.url))
return f(*args,**kwargs)
return decorated_function
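# Minimal usage sketch (illustrative route and blueprint names, not from this file):
#
#   @shopping_app.route('/cart')
#   @login_required
#   def cart():
#       ...
#
# login_required redirects anonymous users to 'user_app.login' (carrying the
# requested URL in ?next=); logout_required sends already-authenticated users
# to 'shopping_app.homepage' instead.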
| [
"[email protected]"
] | |
4defe82f22c6b83db3cff78f6e922aefcdaf6ec9 | f569978afb27e72bf6a88438aa622b8c50cbc61b | /douyin_open/VideoListVideoList/models/inline_response200_data.py | 84ccfc29afdb3da81d954674bb7beb2eca74eec5 | [] | no_license | strangebank/swagger-petstore-perl | 4834409d6225b8a09b8195128d74a9b10ef1484a | 49dfc229e2e897cdb15cbf969121713162154f28 | refs/heads/master | 2023-01-05T10:21:33.518937 | 2020-11-05T04:33:16 | 2020-11-05T04:33:16 | 310,189,316 | 1 | 0 | null | null | null | null | UTF-8 | Python | false | false | 6,561 | py | # coding: utf-8
"""
No description provided (generated by Swagger Codegen https://github.com/swagger-api/swagger-codegen) # noqa: E501
OpenAPI spec version: 1.0.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
import pprint
import re # noqa: F401
import six
class InlineResponse200Data(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
"""
"""
Attributes:
swagger_types (dict): The key is attribute name
and the value is attribute type.
attribute_map (dict): The key is attribute name
and the value is json key in definition.
"""
swagger_types = {
'error_code': 'ErrorCode',
'description': 'Description',
'cursor': 'Cursor',
'has_more': 'HasMore',
'list': 'list[Video]'
}
attribute_map = {
'error_code': 'error_code',
'description': 'description',
'cursor': 'cursor',
'has_more': 'has_more',
'list': 'list'
}
def __init__(self, error_code=None, description=None, cursor=None, has_more=None, list=None): # noqa: E501
"""InlineResponse200Data - a model defined in Swagger""" # noqa: E501
self._error_code = None
self._description = None
self._cursor = None
self._has_more = None
self._list = None
self.discriminator = None
self.error_code = error_code
self.description = description
self.cursor = cursor
self.has_more = has_more
if list is not None:
self.list = list
@property
def error_code(self):
"""Gets the error_code of this InlineResponse200Data. # noqa: E501
:return: The error_code of this InlineResponse200Data. # noqa: E501
:rtype: ErrorCode
"""
return self._error_code
@error_code.setter
def error_code(self, error_code):
"""Sets the error_code of this InlineResponse200Data.
:param error_code: The error_code of this InlineResponse200Data. # noqa: E501
:type: ErrorCode
"""
if error_code is None:
raise ValueError("Invalid value for `error_code`, must not be `None`") # noqa: E501
self._error_code = error_code
@property
def description(self):
"""Gets the description of this InlineResponse200Data. # noqa: E501
:return: The description of this InlineResponse200Data. # noqa: E501
:rtype: Description
"""
return self._description
@description.setter
def description(self, description):
"""Sets the description of this InlineResponse200Data.
:param description: The description of this InlineResponse200Data. # noqa: E501
:type: Description
"""
if description is None:
raise ValueError("Invalid value for `description`, must not be `None`") # noqa: E501
self._description = description
@property
def cursor(self):
"""Gets the cursor of this InlineResponse200Data. # noqa: E501
:return: The cursor of this InlineResponse200Data. # noqa: E501
:rtype: Cursor
"""
return self._cursor
@cursor.setter
def cursor(self, cursor):
"""Sets the cursor of this InlineResponse200Data.
:param cursor: The cursor of this InlineResponse200Data. # noqa: E501
:type: Cursor
"""
if cursor is None:
raise ValueError("Invalid value for `cursor`, must not be `None`") # noqa: E501
self._cursor = cursor
@property
def has_more(self):
"""Gets the has_more of this InlineResponse200Data. # noqa: E501
:return: The has_more of this InlineResponse200Data. # noqa: E501
:rtype: HasMore
"""
return self._has_more
@has_more.setter
def has_more(self, has_more):
"""Sets the has_more of this InlineResponse200Data.
:param has_more: The has_more of this InlineResponse200Data. # noqa: E501
:type: HasMore
"""
if has_more is None:
raise ValueError("Invalid value for `has_more`, must not be `None`") # noqa: E501
self._has_more = has_more
@property
def list(self):
"""Gets the list of this InlineResponse200Data. # noqa: E501
        Because of pinned videos, the length of the list may be slightly more or less than the number requested via count.  # noqa: E501
:return: The list of this InlineResponse200Data. # noqa: E501
:rtype: list[Video]
"""
return self._list
@list.setter
def list(self, list):
"""Sets the list of this InlineResponse200Data.
        Because of pinned videos, the length of the list may be slightly more or less than the number requested via count.  # noqa: E501
:param list: The list of this InlineResponse200Data. # noqa: E501
:type: list[Video]
"""
self._list = list
def to_dict(self):
"""Returns the model properties as a dict"""
result = {}
for attr, _ in six.iteritems(self.swagger_types):
value = getattr(self, attr)
if isinstance(value, list):
result[attr] = list(map(
lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
value
))
elif hasattr(value, "to_dict"):
result[attr] = value.to_dict()
elif isinstance(value, dict):
result[attr] = dict(map(
lambda item: (item[0], item[1].to_dict())
if hasattr(item[1], "to_dict") else item,
value.items()
))
else:
result[attr] = value
if issubclass(InlineResponse200Data, dict):
for key, value in self.items():
result[key] = value
return result
def to_str(self):
"""Returns the string representation of the model"""
return pprint.pformat(self.to_dict())
def __repr__(self):
"""For `print` and `pprint`"""
return self.to_str()
def __eq__(self, other):
"""Returns true if both objects are equal"""
if not isinstance(other, InlineResponse200Data):
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
"""Returns true if both objects are not equal"""
return not self == other
| [
"[email protected]"
] | |
0955c0d21f9a00b48526ba7a9c1c612b49f3afa2 | a0eea3e416a050168aba0876c33a8fddf310868a | /src/eavatar.x.hub/eavatar/hub/avatar.py | db30881d9fe987bf3e1717de9e5f9638a4226eeb | [] | no_license | eavatar/exchange-hub | a0dee945da6922408773a8d5bd1b9029fcf55dd6 | c78cdc6aa357442001c76c0daaca3597cd8b4adf | refs/heads/master | 2021-01-13T14:28:29.968008 | 2015-03-05T13:47:18 | 2015-03-05T13:47:18 | 30,749,051 | 0 | 0 | null | 2015-03-05T13:47:19 | 2015-02-13T09:16:04 | Python | UTF-8 | Python | false | false | 3,307 | py | # -*- coding: utf-8 -*-
from __future__ import absolute_import, division, print_function, unicode_literals
"""
Avatar-specific functionality.
"""
import json
import time
import logging
import falcon
from datetime import datetime, timedelta
from cqlengine import columns
from cqlengine.models import Model
from eavatar.hub.app import api
from eavatar.hub import views
from eavatar.hub import managers
from eavatar.hub.hooks import check_authentication
from eavatar.hub.util import crypto, codecs
logger = logging.getLogger(__name__)
def _default_expired_at():
return datetime.utcnow() + timedelta(seconds=86400)
# models #
class Avatar(Model):
"""
Represents anything with an identity that can send or receive messages.
"""
xid = columns.Text(primary_key=True, partition_key=True)
owner_xid = columns.Text(default=None)
created_at = columns.DateTime(default=datetime.utcnow)
modified_at = columns.DateTime(default=datetime.utcnow)
expired_at = columns.DateTime(default=_default_expired_at)
# properties = columns.Map(columns.Text, columns.Text)
# links = columns.Set(columns.Text)
# aliases = columns.Set(columns.Text)
class Possession(Model):
"""
Represents relationship between an avatar and its possessions.
"""
owner_xid = columns.Text(primary_key=True)
avatar_xid = columns.Text(primary_key=True, clustering_order="ASC")
@staticmethod
def find_possessions(owner_xid):
return Possession.objects(owner_xid=owner_xid)
# managers #
class AvatarManager(managers.BaseManager):
model = Avatar
def __init__(self):
super(AvatarManager, self).__init__(self.model)
# views #
@falcon.before(check_authentication)
class AvatarCollection(views.ResourceBase):
def on_get(self, req, resp):
"""
Gets avatars belongs to the client.
:param req:
:param resp:
:return:
"""
owner_xid = req.context['client_xid']
qs = Possession.find_possessions(owner_xid)
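        # NOTE: the possessions query above is not serialized yet; the handler
        # currently answers with an empty JSON list regardless of qs.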
resp.body = views.EMPTY_LIST
resp.status = falcon.HTTP_200
def on_put(self, req, resp):
try:
data = json.load(req.stream)
avatar = Avatar(xid=data.get('xid'), kind=data.get('kind'))
avatar.save()
resp.body = views.RESULT_OK
resp.status = falcon.HTTP_200
        except Exception as e:
logger.error(e)
raise
@falcon.before(check_authentication)
class AvatarResource(views.ResourceBase):
def on_get(self, req, resp, avatar_xid):
if 'self' == avatar_xid:
avatar_xid = req.context['client_xid']
self_link = {
"rel": "self",
"href": "%s" % (req.uri,)
}
messages_link = {
"rel": "messages",
"href": "%s/messages" % (req.uri,),
}
result = dict(
subject="%s" % (req.uri,),
aliases=["avatar:%s" % avatar_xid],
links=[self_link, messages_link],
)
resp.content_type = b"application/jrd+json"
resp.data = json.dumps(result)
resp.status = falcon.HTTP_200
# routes
logger.debug("Binding routes for Avatar module.")
_avatar_resource = AvatarResource()
api.add_route("/{avatar_xid}", _avatar_resource)
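# For reference, GET /<avatar_xid> (or /self for the authenticated client)
# answers with a JRD-style document shaped as in AvatarResource.on_get above:
#
#   {"subject": "<request uri>",
#    "aliases": ["avatar:<xid>"],
#    "links": [{"rel": "self", ...}, {"rel": "messages", ...}]}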
| [
"[email protected]"
] | |
f3dd5b4353dff6f6cef8c9eb4092d47fdf047aaa | 80b545522375b2b8bbfdff0f540b1172e53b140c | /djecommerce/settings/development.py | 3911c6f54c156da5237c8b83486eff63c55810d2 | [] | no_license | DuncanMoyo/django-ecommerce-website | 18e1e8dcf358de6758ad7974a703145ed5cab4db | 21783c3d4159adffabbfc522099cf9c55346bed8 | refs/heads/master | 2022-12-11T15:14:51.039929 | 2019-08-31T20:48:59 | 2019-08-31T20:48:59 | 196,033,607 | 0 | 0 | null | 2022-12-08T05:51:59 | 2019-07-09T15:14:11 | JavaScript | UTF-8 | Python | false | false | 1,244 | py | from .base import *
DEBUG = True
ALLOWED_HOSTS = ['127.0.0.1']
INSTALLED_APPS += [
'debug_toolbar'
]
MIDDLEWARE += ['debug_toolbar.middleware.DebugToolbarMiddleware',]
# DEBUG TOOLBAR SETTINGS
DEBUG_TOOLBAR_PANELS = [
'debug_toolbar.panels.versions.VersionsPanel',
'debug_toolbar.panels.timer.TimerPanel',
'debug_toolbar.panels.settings.SettingsPanel',
'debug_toolbar.panels.headers.HeadersPanel',
'debug_toolbar.panels.request.RequestPanel',
'debug_toolbar.panels.sql.SQLPanel',
'debug_toolbar.panels.staticfiles.StaticFilesPanel',
'debug_toolbar.panels.templates.TemplatesPanel',
'debug_toolbar.panels.cache.CachePanel',
'debug_toolbar.panels.signals.SignalsPanel',
'debug_toolbar.panels.logging.LoggingPanel',
'debug_toolbar.panels.redirects.RedirectsPanel',
]
def show_toolbar(request):
return True
DEBUG_TOOLBAR_CONFIG = {
'INTERCEPT_REDIRECTS': False,
'SHOW_TOOLBAR_CALLBACK': show_toolbar
}
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
}
}
STRIPE_PUBLIC_KEY = 'pk_test_tx8uE8Va8yN3YmRlRZ0fZULC00RLAevl8q'
STRIPE_SECRET_KEY = 'sk_test_M6hBTe1EWdJqZplNq23kn54Q00HcSnrNNZ'
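# A common alternative (an assumption, not part of this settings module) is to
# read the keys from the environment instead of committing them:
#
#   STRIPE_PUBLIC_KEY = os.environ.get('STRIPE_PUBLIC_KEY', '')
#   STRIPE_SECRET_KEY = os.environ.get('STRIPE_SECRET_KEY', '')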
| [
"[email protected]"
] | |
691d6cbe2f9a10ae712c4829e33e1f99b6e87147 | f1e7f6c34316828eb541d76e160c43e2e61743c8 | /ate/main.py | 7b19cea3a56cefb0c8e9b9d5baa47c7a1d9d312e | [
"MIT"
] | permissive | luochun3731/ApiTestEngine | 216fc95b8e1d5423cc24083ffac22d119f4a789e | b75e17e7c7a73047660fe13accab4653aa0fa5fb | refs/heads/master | 2021-01-01T15:47:26.516079 | 2017-07-18T04:24:51 | 2017-07-18T04:24:51 | null | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 2,597 | py | import os
import argparse
import logging
import unittest
import HtmlTestRunner
from ate import runner, utils
class ApiTestCase(unittest.TestCase):
""" create a testcase.
"""
def __init__(self, test_runner, testcase):
super(ApiTestCase, self).__init__()
self.test_runner = test_runner
self.testcase = testcase
def runTest(self):
""" run testcase and check result.
"""
result = self.test_runner.run_test(self.testcase)
self.assertEqual(result, (True, []))
def create_suite(testset):
""" create test suite with a testset, it may include one or several testcases.
each suite should initialize a seperate Runner() with testset config.
"""
suite = unittest.TestSuite()
test_runner = runner.Runner()
config_dict = testset.get("config", {})
test_runner.init_config(config_dict, level="testset")
testcases = testset.get("testcases", [])
for testcase in testcases:
if utils.PYTHON_VERSION == 3:
ApiTestCase.runTest.__doc__ = testcase['name']
else:
ApiTestCase.runTest.__func__.__doc__ = testcase['name']
test = ApiTestCase(test_runner, testcase)
suite.addTest(test)
return suite
def create_task(testcase_path):
""" create test task suite with specified testcase path.
each task suite may include one or several test suite.
"""
task_suite = unittest.TestSuite()
testsets = utils.load_testcases_by_path(testcase_path)
for testset in testsets:
suite = create_suite(testset)
task_suite.addTest(suite)
return task_suite
def main():
""" parse command line options and run commands.
"""
parser = argparse.ArgumentParser(
description='Api Test Engine.')
parser.add_argument(
'--testcase-path', default='testcases',
help="testcase file path")
parser.add_argument(
'--log-level', default='INFO',
help="Specify logging level, default is INFO.")
parser.add_argument(
'--report-name',
help="Specify report name, default is generated time.")
args = parser.parse_args()
log_level = getattr(logging, args.log_level.upper())
logging.basicConfig(level=log_level)
testcase_path = args.testcase_path.rstrip('/')
task_suite = create_task(testcase_path)
output_folder_name = os.path.basename(os.path.splitext(testcase_path)[0])
kwargs = {
"output": output_folder_name,
"report_name": args.report_name
}
HtmlTestRunner.HTMLTestRunner(**kwargs).run(task_suite)
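# Command-line flags (from the argparse setup above):
#   --testcase-path PATH   testcase file path (default: testcases)
#   --log-level LEVEL      logging level (default: INFO)
#   --report-name NAME     HTML report name (default: a generated timestamp)
# Note: main() is expected to be invoked by an external entry point; this
# module does not call it under __main__ itself.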
| [
"[email protected]"
] | |
70141887bb0e0c852255ad1ce9359dbad564f6f4 | 5d1e737816e53cdb0056031f49267467b0a8de77 | /visualize_cspace_and_workspace.py | aa42f09da550cf0bb13d9c643deee5e32f830a16 | [] | no_license | aorthey/configuration-space-visualizer | ab4fe297b586d0e7acb2bc58c14b09253da47fec | e839cc839be37de98c6f34ff5ca6347173fc8e45 | refs/heads/master | 2020-06-01T09:56:01.209503 | 2020-04-28T10:43:27 | 2020-04-28T10:43:27 | 190,740,072 | 2 | 3 | null | 2019-07-25T11:20:53 | 2019-06-07T12:30:03 | Python | UTF-8 | Python | false | false | 2,745 | py | import sys
import numpy as np
from src.cspace_visualizer import *
import os
if not os.path.exists("images"):
os.makedirs("images")
from worlds import manipulator_2dof as World
from matplotlib.ticker import MaxNLocator
world = World.Manipulator2dof()
worldName = world.getName()
N = 200
c1 = (0.9,0.9,0.9)
c2 = (0.7,0.7,0.7)
c3 = (0.5,0.5,0.5)
q1 = np.linspace(-np.pi,np.pi,N)
q2 = np.linspace(-np.pi,np.pi,N)
P1 = []
P2 = []
M = np.zeros((q1.shape[0], q2.shape[0]))
for i in range(q1.shape[0]):
for j in range(q2.shape[0]):
q = np.array((q1[i],q2[j]))
if not world.isFeasible(q):
M[i,j] = 1
P1 = np.append(P1,q1[i])
P2 = np.append(P2,q2[j])
infeasibleColumns = np.sum(M,axis=1)>=N
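# Columns of M (fixed q1 value) where every q2 sample is infeasible; the loop
# below collects those all-infeasible columns into Q1/Q2 (computed here but not
# used by the plots that follow).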
Q1 = []
Q2 = []
for i in range(q1.shape[0]):
for j in range(q2.shape[0]):
q = np.array((q1[i],q2[j]))
if infeasibleColumns[i]:
Q1 = np.append(Q1,q1[i])
Q2 = np.append(Q2,q2[j])
font_size = 25
offset = 0.5
p1 = np.array([0.5,2.57])
p2 = np.array([-1.57,-0.9])
p3 = np.array([2.5,-1.2])
symbol='x'
greyshade = 0.75
x1loc= (p1[0]-offset,p1[1]-offset)
x2loc= (p2[0]-offset,p2[1]-offset)
x3loc= (p3[0]-offset,p3[1]-offset)
###########################################################
fig = plt.figure(0)
fig.patch.set_facecolor('white')
ax = fig.gca()
ax.set_xlabel(r'x',fontsize=font_size)
ax.set_ylabel(r'y',rotation=1.57,fontsize=font_size)
ax.tick_params(axis='both', which='major', pad=15)
lim=1.1
plt.axis([-lim,lim,-lim,lim])
world.COLOR = c1
world.plotRobotAtConfig(ax,p1)
world.COLOR = c2
world.plotRobotAtConfig(ax,p2)
world.COLOR = c3
world.plotRobotAtConfig(ax,p3)
w1 = world.getEndeffectorPositions(p1)
w2 = world.getEndeffectorPositions(p2)
w3 = world.getEndeffectorPositions(p3)
yoffset = np.array((0.0,0.1))
xoffset = np.array((0.1,0.0))
ax.annotate(r''+symbol+'_1', w1+yoffset)
ax.annotate(r''+symbol+'_2', w2-2*yoffset-2*xoffset)
ax.annotate(r''+symbol+'_3', w3+yoffset)
world.plotObstacles(ax)
plt.savefig("images/"+worldName+"_workspace.png", bbox_inches='tight')
############################################################
fig = plt.figure(1)
fig.patch.set_facecolor('white')
ax = fig.gca()
ax.set_xlabel(r'\theta_1',fontsize=font_size)
ax.set_ylabel(r'\theta_2',rotation=1.57,fontsize=font_size)
ax.tick_params(axis='both', which='major', pad=15)
lim=3.14
plt.axis([-lim,lim,-lim,lim])
ax.annotate(r''+symbol+'_1', x1loc)
ax.annotate(r''+symbol+'_2', x2loc)
ax.annotate(r''+symbol+'_3', x3loc)
plt.plot(p1[0],p1[1],'o',color='black',markersize=10)
plt.plot(p2[0],p2[1],'o',color='black',markersize=10)
plt.plot(p3[0],p3[1],'o',color='black',markersize=10)
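# Shade the infeasible region from the collision samples; plotCSpaceDelaunayGrey
# is a helper in src.cspace_visualizer (the exact role of the 0.15 argument --
# presumably a triangulation/shading granularity parameter -- is an assumption).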
plotCSpaceDelaunayGrey(P1,P2,0.15)
plt.savefig("images/"+worldName+"_configuration_space.png", bbox_inches='tight')
plt.show()
| [
"[email protected]"
] | |
b1d35b14319ee88f737697a4ff38b5548b64a8e8 | cd4bbecc3f713b0c25508d0c5674d9e103db5df4 | /toontown/fishing/DistributedFishingTarget.py | 69a3ef13a7dc66adb05d96a2986dc55852deddf6 | [] | no_license | peppythegod/ToontownOnline | dce0351cfa1ad8c476e035aa3947fdf53de916a6 | 2e5a106f3027714d301f284721382cb956cd87a0 | refs/heads/master | 2020-04-20T05:05:22.934339 | 2020-01-02T18:05:28 | 2020-01-02T18:05:28 | 168,646,608 | 11 | 2 | null | null | null | null | UTF-8 | Python | false | false | 2,803 | py | from pandac.PandaModules import *
from direct.distributed.ClockDelta import *
from direct.interval.IntervalGlobal import *
from direct.directnotify import DirectNotifyGlobal
from direct.distributed import DistributedNode
from direct.fsm import ClassicFSM
from direct.fsm import State
from direct.directutil import Mopath
from toontown.toonbase import ToontownGlobals
from direct.actor import Actor
import FishingTargetGlobals
import random
import math
from toontown.effects import Bubbles
class DistributedFishingTarget(DistributedNode.DistributedNode):
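    """Client-side view of a pond fishing target: a bubbling drop shadow
    that lerps between server-chosen points inside the pond's target circle."""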
notify = DirectNotifyGlobal.directNotify.newCategory(
'DistributedFishingTarget')
radius = 2.5
def __init__(self, cr):
DistributedNode.DistributedNode.__init__(self, cr)
NodePath.__init__(self)
self.pond = None
self.centerPoint = (0, 0, 0)
self.maxRadius = 1.0
self.track = None
def generate(self):
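        # Build the visible node: a translucent drop shadow with a bubble
        # effect attached, parented under render.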
self.assign(render.attachNewNode('DistributedFishingTarget'))
shadow = loader.loadModel('phase_3/models/props/drop_shadow')
shadow.setPos(0, 0, -0.10000000000000001)
shadow.setScale(0.33000000000000002)
shadow.setColorScale(1, 1, 1, 0.75)
shadow.reparentTo(self)
self.bubbles = Bubbles.Bubbles(self, render)
self.bubbles.renderParent.setDepthWrite(0)
self.bubbles.start()
DistributedNode.DistributedNode.generate(self)
def disable(self):
if self.track:
self.track.finish()
self.track = None
self.bubbles.destroy()
del self.bubbles
self.pond.removeTarget(self)
self.pond = None
DistributedNode.DistributedNode.disable(self)
def delete(self):
del self.pond
DistributedNode.DistributedNode.delete(self)
def setPondDoId(self, pondDoId):
self.pond = base.cr.doId2do[pondDoId]
self.pond.addTarget(self)
self.centerPoint = FishingTargetGlobals.getTargetCenter(
self.pond.getArea())
self.maxRadius = FishingTargetGlobals.getTargetRadius(
self.pond.getArea())
def getDestPos(self, angle, radius):
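        # Polar-to-Cartesian conversion around the pond's target center.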
x = radius * math.cos(angle) + self.centerPoint[0]
y = radius * math.sin(angle) + self.centerPoint[1]
z = self.centerPoint[2]
return (x, y, z)
def setState(self, stateIndex, angle, radius, time, timeStamp):
ts = globalClockDelta.localElapsedTime(timeStamp)
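        # ts is how long ago the server issued this move; shortening the lerp
        # by ts keeps the target (roughly) in sync across clients.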
pos = self.getDestPos(angle, radius)
if self.track and self.track.isPlaying():
self.track.finish()
self.track = Sequence(
LerpPosInterval(
self, time - ts, Point3(*pos), blendType='easeInOut'))
self.track.start()
def getRadius(self):
return self.radius
| [
"[email protected]"
] | |
92d2540caa11631b2211b05ce1cd8a3ec146552b | 9743d5fd24822f79c156ad112229e25adb9ed6f6 | /xai/brain/wordbase/otherforms/_currycombs.py | 697fc9498aafda38d96c4b64c23d5633a1a39751 | [
"MIT"
] | permissive | cash2one/xai | de7adad1758f50dd6786bf0111e71a903f039b64 | e76f12c9f4dcf3ac1c7c08b0cc8844c0b0a104b6 | refs/heads/master | 2021-01-19T12:33:54.964379 | 2017-01-28T02:00:50 | 2017-01-28T02:00:50 | null | 0 | 0 | null | null | null | null | UTF-8 | Python | false | false | 234 | py |
# class header
class _CURRYCOMBS():
    def __init__(self):
        self.name = "CURRYCOMBS"
        self.definitions = 'currycomb'  # base form stored as a string (the bare name here was undefined)
        self.parents = []
        self.childen = []  # (sic) attribute name kept as generated
        self.properties = []
        self.jsondata = {}
        self.basic = ['currycomb']
| [
"[email protected]"
] | |
f064a7e6a09a8181c698703b8a964957bcf9ab3a | 5d9bab690aab86ba8d8e9ee315e68e6028fe8048 | /flumine/execution/betfairexecution.py | f3da633c30a1edbd5dbfdb73c982c771be4162f4 | [
"MIT"
] | permissive | chrlanct/flumine | 888b2e83a26930c7d1f17b7815fddccd036fc276 | 3728274bce91cf3d56f42da67b43d116bc2f860e | refs/heads/master | 2022-12-23T09:41:21.744645 | 2020-09-21T09:07:34 | 2020-09-21T09:07:34 | 297,768,368 | 0 | 0 | null | 2020-09-22T20:47:18 | 2020-09-22T20:47:17 | null | UTF-8 | Python | false | false | 9,802 | py | import logging
import requests
from typing import Callable
from betfairlightweight import BetfairError
from .baseexecution import BaseExecution
from ..clients.clients import ExchangeType
from ..order.orderpackage import BaseOrderPackage, OrderPackageType
from ..exceptions import OrderExecutionError
logger = logging.getLogger(__name__)
class BetfairExecution(BaseExecution):
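    """Executes place/cancel/update/replace order packages against the
    Betfair exchange and maps instruction reports onto order state."""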
EXCHANGE = ExchangeType.BETFAIR
def execute_place(
self, order_package: BaseOrderPackage, http_session: requests.Session
) -> None:
response = self._execution_helper(self.place, order_package, http_session)
if response:
for (order, instruction_report) in zip(
order_package, response.place_instruction_reports
):
with order.trade:
self._order_logger(
order, instruction_report, OrderPackageType.PLACE
)
if instruction_report.status == "SUCCESS":
order.executable()
elif instruction_report.status == "FAILURE":
order.lapsed() # todo correct?
elif instruction_report.status == "TIMEOUT":
# https://docs.developer.betfair.com/display/1smk3cen4v3lu3yomq5qye0ni/Betting+Enums#BettingEnums-ExecutionReportStatus
pass
def place(self, order_package: OrderPackageType, session: requests.Session):
return order_package.client.betting_client.betting.place_orders(
market_id=order_package.market_id,
instructions=order_package.place_instructions,
customer_ref=order_package.id.hex,
market_version=order_package.market_version,
customer_strategy_ref=order_package.customer_strategy_ref,
async_=order_package.async_,
session=session,
)
def execute_cancel(
self, order_package: BaseOrderPackage, http_session: requests.Session
) -> None:
response = self._execution_helper(self.cancel, order_package, http_session)
if response:
order_lookup = {o.bet_id: o for o in order_package}
for instruction_report in response.cancel_instruction_reports:
# get order (can't rely on the order they are returned)
order = order_lookup.pop(instruction_report.instruction.bet_id)
with order.trade:
self._order_logger(
order, instruction_report, OrderPackageType.CANCEL
)
if instruction_report.status == "SUCCESS":
if (
instruction_report.size_cancelled == order.size_remaining
): # todo what if?
order.execution_complete()
else:
order.executable()
elif instruction_report.status == "FAILURE":
order.executable()
elif instruction_report.status == "TIMEOUT":
order.executable()
# reset any not returned so that they can be picked back up
for order in order_lookup.values():
with order.trade:
order.executable()
def cancel(self, order_package: OrderPackageType, session: requests.Session):
# temp copy to prevent an empty list of instructions sent
# this can occur if order is matched during the execution
# cycle, resulting in all orders being cancelled!
cancel_instructions = list(order_package.cancel_instructions)
if not cancel_instructions:
logger.warning("Empty cancel_instructions", extra=order_package.info)
raise OrderExecutionError()
return order_package.client.betting_client.betting.cancel_orders(
market_id=order_package.market_id,
instructions=cancel_instructions,
customer_ref=order_package.id.hex,
session=session,
)
def execute_update(
self, order_package: BaseOrderPackage, http_session: requests.Session
) -> None:
response = self._execution_helper(self.update, order_package, http_session)
if response:
for (order, instruction_report) in zip(
order_package, response.update_instruction_reports
):
with order.trade:
self._order_logger(
order, instruction_report, OrderPackageType.UPDATE
)
if instruction_report.status == "SUCCESS":
order.executable()
elif instruction_report.status == "FAILURE":
order.executable()
elif instruction_report.status == "TIMEOUT":
order.executable()
def update(self, order_package: OrderPackageType, session: requests.Session):
return order_package.client.betting_client.betting.update_orders(
market_id=order_package.market_id,
instructions=order_package.update_instructions,
customer_ref=order_package.id.hex,
session=session,
)
def execute_replace(
self, order_package: BaseOrderPackage, http_session: requests.Session
) -> None:
response = self._execution_helper(self.replace, order_package, http_session)
if response:
for (order, instruction_report) in zip(
order_package, response.replace_instruction_reports
):
with order.trade:
# process cancel response
if (
instruction_report.cancel_instruction_reports.status
== "SUCCESS"
):
self._order_logger(
order,
instruction_report.cancel_instruction_reports,
OrderPackageType.CANCEL,
)
order.execution_complete()
elif (
instruction_report.cancel_instruction_reports.status
== "FAILURE"
):
order.executable()
elif (
instruction_report.cancel_instruction_reports.status
== "TIMEOUT"
):
order.executable()
# process place response
if instruction_report.place_instruction_reports.status == "SUCCESS":
# create new order
replacement_order = order.trade.create_order_replacement(
order,
instruction_report.place_instruction_reports.instruction.limit_order.price,
)
self._order_logger(
replacement_order,
instruction_report.place_instruction_reports,
OrderPackageType.REPLACE,
)
# add to blotter
order_package.market.place_order(
replacement_order, execute=False
)
replacement_order.executable()
elif (
instruction_report.place_instruction_reports.status == "FAILURE"
):
pass # todo
elif (
instruction_report.place_instruction_reports.status == "TIMEOUT"
):
pass # todo
def replace(self, order_package: OrderPackageType, session: requests.Session):
return order_package.client.betting_client.betting.replace_orders(
market_id=order_package.market_id,
instructions=order_package.replace_instructions,
customer_ref=order_package.id.hex,
market_version=order_package.market_version,
async_=order_package.async_,
session=session,
)
def _execution_helper(
self,
trading_function: Callable,
order_package: BaseOrderPackage,
http_session: requests.Session,
):
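        # Runs the API call, logs timing, and recycles the HTTP session; on
        # BetfairError the package is re-queued while order_package.retry() allows.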
if order_package.orders:
try:
response = trading_function(order_package, http_session)
except BetfairError as e:
logger.error(
"Execution error",
extra={
"trading_function": trading_function.__name__,
"response": e,
"order_package": order_package.info,
},
exc_info=True,
)
if order_package.retry():
self.handler_queue.put(order_package)
self._return_http_session(http_session, err=True)
return
logger.info(
"execute_%s" % trading_function.__name__,
extra={
"trading_function": trading_function.__name__,
"elapsed_time": response.elapsed_time,
"response": response._data,
"order_package": order_package.info,
},
)
self._return_http_session(http_session)
return response
else:
logger.warning("Empty package, not executing", extra=order_package.info)
self._return_http_session(http_session)
| [
"[email protected]"
] |