info_result.rs
//
// imag - the personal information management suite for the commandline
// Copyright (C) 2015-2020 Matthias Beyer <[email protected]> and contributors
//
// This library is free software; you can redistribute it and/or
// modify it under the terms of the GNU Lesser General Public
// License as published by the Free Software Foundation; version
// 2.1 of the License.
//
// This library is distributed in the hope that it will be useful,
// but WITHOUT ANY WARRANTY; without even the implied warranty of
// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
// Lesser General Public License for more details.
//
// You should have received a copy of the GNU Lesser General Public
// License along with this library; if not, write to the Free Software
// Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
//
// Generates an extension for the `Result<T, E>`, named `InfoResult`, which has functionality to
// print either `T` or `E` via `info!()`.
generate_result_logging_extension!(
    InfoResult,
    map_info,
    map_info_str,
    map_info_err,
    map_info_err_str,
    |s| { info!("{}", s); }
);

SpeedMeter.py
# -*- coding: iso-8859-1 -*-
# --------------------------------------------------------------------------- #
# SPEEDMETER Control wxPython IMPLEMENTATION
# Python Code By:
#
# Andrea Gavana, @ 25 Sep 2005
# Latest Revision: 10 Oct 2005, 22.40 CET
#
#
# TODO List/Caveats
#
# 1. Combination Of The Two Styles:
#
# SM_DRAW_PARTIAL_FILLER
# SM_DRAW_SECTORS
#
# Does Not Work Very Well. It Works Well Only When The Sector Colours
# Are The Same For All Intervals.
#
#
# Thanks To Gerard Grazzini, Who Tried The Demo On MacOS; I Corrected A
# Bug On Line 246
#
#
# For All Kind Of Problems, Requests Of Enhancements And Bug Reports, Please
# Write To Me At:
#
# [email protected]
# [email protected]
#
# Or, Obviously, To The wxPython Mailing List!!!
#
# MODIFIED to add native Python wx.gizmos.LEDNumberCtrl-type display, and a number of other things.
# by Jason Antman <http://www.jasonantman.com> <[email protected]>
# Modifications Copyright 2010 Jason Antman.
#
#
# End Of Comments
# --------------------------------------------------------------------------- #
"""Description:
SpeedMeter Tries To Reproduce The Behavior Of Some Car Controls (But Not Only),
By Creating An "Angular" Control (Actually, Circular). I Remember To Have Seen
It Somewhere, And I Decided To Implement It In wxPython.
SpeedMeter Starts Its Construction From An Empty Bitmap, And It Uses Some
Functions Of The wx.DC Class To Create The Rounded Effects. Everything Is
Processed In The Draw() Method Of SpeedMeter Class.
This Implementation Allows You To Use Either Directly The wx.PaintDC, Or The
Better (For Me) Double Buffered Style With wx.BufferedPaintDC. The Double
Buffered Implementation Has Been Adapted From The wxPython Wiki Example:
http://wiki.wxpython.org/index.cgi/DoubleBufferedDrawing
Usage:
SpeedWindow1 = SM.SpeedMeter(parent,
bufferedstyle,
extrastyle,
mousestyle
)
None Of The Options (Apart From The Parent Class) Are Strictly Required; If You
Use The Defaults You Get A Very Simple SpeedMeter. For The Full Listing Of
The Input Parameters, See The SpeedMeter __init__() Method.
Methods And Settings:
SpeedMeter Is Highly Customizable, And In Particular You Can Set:
- The Start And End Angle Of Existence For SpeedMeter;
- The Intervals In Which You Divide The SpeedMeter (Numerical Values);
- The Corresponding Ticks For The Intervals;
- The Interval Colours (Different Intervals May Have Different Filling Colours);
- The Ticks Font And Colour;
- The Background Colour (Outside The SpeedMeter Region);
- The External Arc Colour;
- The Hand (Arrow) Colour;
- The Hand's Shadow Colour;
- The Hand's Style ("Arrow" Or "Hand");
- The Partial Filler Colour;
- The Number Of Secondary (Intermediate) Ticks;
- The Direction Of Increasing Speed ("Advance" Or "Reverse");
- The Text To Be Drawn In The Middle And Its Font;
- The Icon To Be Drawn In The Middle;
- The First And Second Gradient Colours (That Fills The SpeedMeter Control);
- The Current Value.
For More Info On Methods And Initial Styles, Please Refer To The __init__()
Method For SpeedMeter Or To The Specific Functions.
SpeedMeter Control Is Freeware And Distributed Under The wxPython License.
Latest Revision: Andrea Gavana @ 10 Oct 2005, 22.40 CET
"""
#----------------------------------------------------------------------
# Beginning Of SPEEDMETER wxPython Code
#----------------------------------------------------------------------
import wx
import wx.lib.colourdb
import wx.lib.fancytext as fancytext
import wx.gizmos as gizmos # for LEDControl
import exceptions
from math import pi, sin, cos, log, sqrt, atan2
#----------------------------------------------------------------------
# DC Drawing Options
#----------------------------------------------------------------------
# SM_NORMAL_DC Uses The Normal wx.PaintDC
# SM_BUFFERED_DC Uses The Double Buffered Drawing Style
SM_NORMAL_DC = 0
SM_BUFFERED_DC = 1
#----------------------------------------------------------------------
# SpeedMeter Styles
#----------------------------------------------------------------------
# SM_ROTATE_TEXT: Draws The Ticks Rotated: The Ticks Are Rotated
# Accordingly To The Tick Marks Positions
# SM_DRAW_SECTORS: Different Intervals Are Painted In Different Colours
# (Every Sector Of The Circle Has Its Own Colour)
# SM_DRAW_PARTIAL_SECTORS: Every Interval Has Its Own Colour, But Only
# A Circle Corona Is Painted Near The Ticks
# SM_DRAW_HAND: The Hand (Arrow Indicator) Is Drawn
# SM_DRAW_SHADOW: A Shadow For The Hand Is Drawn
# SM_DRAW_PARTIAL_FILLER: A Circle Corona That Follows The Hand Position
# Is Drawn Near The Ticks
# SM_DRAW_SECONDARY_TICKS: Intermediate (Smaller) Ticks Are Drawn Between
# Principal Ticks
# SM_DRAW_MIDDLE_TEXT: Some Text Is Printed In The Middle Of The Control
# Near The Center
# SM_DRAW_MIDDLE_ICON: An Icon Is Drawn In The Middle Of The Control Near
# The Center
# SM_DRAW_GRADIENT: A Gradient Of Colours Will Fill The Control
# SM_DRAW_FANCY_TICKS: With This Style You Can Use XML Tags To Create
# Some Custom Text And Draw It At The Ticks Position.
# See wx.lib.fancytext For The Tags.
# SM_DRAW_BOTTOM_TEXT: Some Text Is Printed In The Bottom Of The Control
# SM_DRAW_BOTTOM_LED: A gizmos.LEDNumberCtrl-style value display is drawn at the bottom
SM_ROTATE_TEXT = 1
SM_DRAW_SECTORS = 2
SM_DRAW_PARTIAL_SECTORS = 4
SM_DRAW_HAND = 8
SM_DRAW_SHADOW = 16
SM_DRAW_PARTIAL_FILLER = 32
SM_DRAW_SECONDARY_TICKS = 64
SM_DRAW_MIDDLE_TEXT = 128
SM_DRAW_MIDDLE_ICON = 256
SM_DRAW_GRADIENT = 512
SM_DRAW_FANCY_TICKS = 1024
SM_DRAW_BOTTOM_TEXT = 2048
SM_DRAW_BOTTOM_LED = 4096
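# The SM_* styles above are bit flags, so they are meant to be OR-ed
# together when building the extrastyle parameter, e.g. (illustrative):
#   extrastyle = SM_DRAW_HAND | SM_DRAW_SHADOW | SM_DRAW_SECONDARY_TICKS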
#----------------------------------------------------------------------
# Event Binding
#----------------------------------------------------------------------
# SM_MOUSE_TRACK: The Mouse Left Click/Drag Allow You To Change The
# SpeedMeter Value Interactively
SM_MOUSE_TRACK = 1
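# Mouse tracking is opt-in; a sketch (illustrative):
#   meter = SpeedMeter(parent, mousestyle=SM_MOUSE_TRACK)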
LINE1 = 1
LINE2 = 2
LINE3 = 4
LINE4 = 8
LINE5 = 16
LINE6 = 32
LINE7 = 64
DECIMALSIGN = 128
DIGIT0 = LINE1 | LINE2 | LINE3 | LINE4 | LINE5 | LINE6
DIGIT1 = LINE2 | LINE3
DIGIT2 = LINE1 | LINE2 | LINE4 | LINE5 | LINE7
DIGIT3 = LINE1 | LINE2 | LINE3 | LINE4 | LINE7
DIGIT4 = LINE2 | LINE3 | LINE6 | LINE7
DIGIT5 = LINE1 | LINE3 | LINE4 | LINE6 | LINE7
DIGIT6 = LINE1 | LINE3 | LINE4 | LINE5 | LINE6 | LINE7
DIGIT7 = LINE1 | LINE2 | LINE3
DIGIT8 = LINE1 | LINE2 | LINE3 | LINE4 | LINE5 | LINE6 | LINE7
DIGIT9 = LINE1 | LINE2 | LINE3 | LINE6 | LINE7
DASH = LINE7
DIGITALL = -1
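# The DIGIT* masks above compose a classic seven-segment digit; the segment
# positions implied by _DrawDigit() below are:
#
#        -- LINE1 --
#       |           |
#     LINE6       LINE2
#       |           |
#        -- LINE7 --
#       |           |
#     LINE5       LINE3
#       |           |
#        -- LINE4 --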
fontfamily = range(70, 78)
familyname = ["default", "decorative", "roman", "script", "swiss", "modern", "teletype"]
weights = range(90, 93)
weightsname = ["normal", "light", "bold"]
styles = [90, 93, 94]
stylesname = ["normal", "italic", "slant"]
#----------------------------------------------------------------------
# BUFFERENDWINDOW Class
# This Class Has Been Taken From The wxPython Wiki, And Slightly
# Adapted To Fill My Needs. See:
#
# http://wiki.wxpython.org/index.cgi/DoubleBufferedDrawing
#
# For More Info About DC And Double Buffered Drawing.
#----------------------------------------------------------------------
class BufferedWindow(wx.Window):
"""
A Buffered window class.
To use it, subclass it and define a Draw(DC) method that takes a DC
to draw to. In that method, put the code needed to draw the picture
you want. The window will automatically be double buffered, and the
screen will be automatically updated when a Paint event is received.
When the drawing needs to change, your app needs to call the
UpdateDrawing() method. Since the drawing is stored in a bitmap, you
can also save the drawing to file by calling the
SaveToFile(self,file_name,file_type) method.
"""
def __init__(self, parent, id,
pos = wx.DefaultPosition,
size = wx.DefaultSize,
style=wx.NO_FULL_REPAINT_ON_RESIZE,
bufferedstyle=SM_BUFFERED_DC):
wx.Window.__init__(self, parent, id, pos, size, style)
self.Bind(wx.EVT_PAINT, self.OnPaint)
self.Bind(wx.EVT_SIZE, self.OnSize)
self.Bind(wx.EVT_ERASE_BACKGROUND, lambda x: None)
# OnSize called to make sure the buffer is initialized.
# This might result in OnSize getting called twice on some
# platforms at initialization, but little harm done.
self.OnSize(None)
def Draw(self, dc):
"""
just here as a place holder.
This method should be over-ridden when sub-classed
"""
pass
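    # A subclass sketch (illustrative only) showing the intended pattern:
    #
    #   class CircleWindow(BufferedWindow):
    #       def Draw(self, dc):
    #           dc.SetBackground(wx.Brush("White"))
    #           dc.Clear()
    #           dc.DrawCircle(50, 50, 25)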
def OnPaint(self, event):
"""
All that is needed here is to draw the buffer to screen
"""
if self._bufferedstyle == SM_BUFFERED_DC:
dc = wx.BufferedPaintDC(self, self._Buffer)
else:
dc = wx.PaintDC(self)
dc.DrawBitmap(self._Buffer,0,0)
def OnSize(self,event):
# The Buffer init is done here, to make sure the buffer is always
# the same size as the Window
self.Width, self.Height = self.GetClientSizeTuple()
# Make new off screen bitmap: this bitmap will always have the
# current drawing in it, so it can be used to save the image to
# a file, or whatever.
# This seems required on MacOS, it doesn't like wx.EmptyBitmap with
# size = (0, 0)
# Thanks to Gerard Grazzini
if "__WXMAC__" in wx.Platform:
if self.Width == 0:
self.Width = 1
if self.Height == 0:
self.Height = 1
self._Buffer = wx.EmptyBitmap(self.Width, self.Height)
self.UpdateDrawing()
def UpdateDrawing(self):
"""
This would get called if the drawing needed to change, for whatever reason.
The idea here is that the drawing is based on some data generated
elsewhere in the system. IF that data changes, the drawing needs to
be updated.
"""
if self._bufferedstyle == SM_BUFFERED_DC:
dc = wx.BufferedDC(wx.ClientDC(self), self._Buffer)
self.Draw(dc)
else:
# update the buffer
dc = wx.MemoryDC()
dc.SelectObject(self._Buffer)
self.Draw(dc)
# update the screen
wx.ClientDC(self).Blit(0, 0, self.Width, self.Height, dc, 0, 0)
#----------------------------------------------------------------------
# SPEEDMETER Class
# This Is The Main Class Implementation. See __init__() Method For
# Details.
#----------------------------------------------------------------------
class SpeedMeter(BufferedWindow):
"""
Class for a gauge-style display using an arc marked with tick marks and interval numbers, and a moving needle/hand/pointer.
MODIFIED to add native Python wx.gizmos.LEDNumberCtrl-type display, and a number of other things by Jason Antman <http://www.jasonantman.com> <[email protected]>
@todo: Need to document everything (all methods).
@todo: Build example code.
@todo: Find everything used internally only and prefix methods with "__"
@todo: Find all "raise" statements, and any "print" statements that print an error, make them work with exceptions - IndexError, TypeError, RuntimeError, LookupError
@todo: change all mentions of "hand" to "needle"
@todo: make sure we have setters/getters for DrawFaded, Alignment, Value (for LED)
@todo: in client, test gradients
"""
bottomTextBottom = None
DEBUG = False # controls debugging print statements
def __init__(self, parent, id=wx.ID_ANY, pos=wx.DefaultPosition,
size=wx.DefaultSize, extrastyle=SM_DRAW_HAND,
bufferedstyle=SM_BUFFERED_DC,
mousestyle=0):
""" Default Class Constructor.
Non Standard wxPython Parameters Are:
a) extrastyle: This Value Specifies The SpeedMeter Styles:
- SM_ROTATE_TEXT: Draws The Ticks Rotated: The Ticks Are Rotated
Accordingly To The Tick Marks Positions;
- SM_DRAW_SECTORS: Different Intervals Are Painted In Different Colours
(Every Sector Of The Circle Has Its Own Colour);
- SM_DRAW_PARTIAL_SECTORS: Every Interval Has Its Own Colour, But Only
A Circle Corona Is Painted Near The Ticks;
- SM_DRAW_HAND: The Hand (Arrow Indicator) Is Drawn;
- SM_DRAW_SHADOW: A Shadow For The Hand Is Drawn;
- SM_DRAW_PARTIAL_FILLER: A Circle Corona That Follows The Hand Position
Is Drawn Near The Ticks;
- SM_DRAW_SECONDARY_TICKS: Intermediate (Smaller) Ticks Are Drawn Between
Principal Ticks;
- SM_DRAW_MIDDLE_TEXT: Some Text Is Printed In The Middle Of The Control
Near The Center;
- SM_DRAW_MIDDLE_ICON: An Icon Is Drawn In The Middle Of The Control Near
The Center;
- SM_DRAW_GRADIENT: A Gradient Of Colours Will Fill The Control;
- SM_DRAW_FANCY_TICKS: With This Style You Can Use XML Tags To Create
Some Custom Text And Draw It At The Ticks Position.
See wx.lib.fancytext For The Tags.;
- SM_DRAW_BOTTOM_TEXT: Some Text Is Printed In The Bottom Of The Control
- SM_DRAW_BOTTOM_LED: A wx.gizmos.LEDNumberCtrl-style value display is printed at the bottom
b) bufferedstyle: This Value Allows You To Use The Normal wx.PaintDC Or The
Double Buffered Drawing Options:
- SM_NORMAL_DC Uses The Normal wx.PaintDC;
- SM_BUFFERED_DC Uses The Double Buffered Drawing Style.
c) mousestyle: This Value Allows You To Use The Mouse To Change The SpeedMeter
Value Interactively With Left Click/Drag Events:
- SM_MOUSE_TRACK: The Mouse Left Click/Drag Allow You To Change The
SpeedMeter Value Interactively.
"""
self._extrastyle = extrastyle
self._bufferedstyle = bufferedstyle
self._mousestyle = mousestyle
        if self._extrastyle & SM_DRAW_SECTORS and self._extrastyle & SM_DRAW_GRADIENT:
            raise ValueError("Incompatible Options: SM_DRAW_SECTORS Can Not Be Used In "
                             "Conjunction With SM_DRAW_GRADIENT.")
        if self._extrastyle & SM_DRAW_PARTIAL_SECTORS and self._extrastyle & SM_DRAW_SECTORS:
            raise ValueError("Incompatible Options: SM_DRAW_SECTORS Can Not Be Used In "
                             "Conjunction With SM_DRAW_PARTIAL_SECTORS.")
        if self._extrastyle & SM_DRAW_PARTIAL_SECTORS and self._extrastyle & SM_DRAW_PARTIAL_FILLER:
            raise ValueError("Incompatible Options: SM_DRAW_PARTIAL_SECTORS Can Not Be Used In "
                             "Conjunction With SM_DRAW_PARTIAL_FILLER.")
        if self._extrastyle & SM_DRAW_FANCY_TICKS and self._extrastyle & SM_ROTATE_TEXT:
            raise ValueError("Incompatible Options: SM_DRAW_FANCY_TICKS Can Not Be Used In "
                             "Conjunction With SM_ROTATE_TEXT.")
        # This check previously built an error string but never raised it; raise it properly.
        if self._extrastyle & SM_DRAW_SHADOW and self._extrastyle & SM_DRAW_HAND == 0:
            raise ValueError("Incompatible Options: SM_DRAW_SHADOW Can Be Used Only In "
                             "Conjunction With SM_DRAW_HAND.")
if self._extrastyle & SM_DRAW_FANCY_TICKS:
wx.lib.colourdb.updateColourDB()
self.SetValueMultiplier() # for LED control
self.SetAngleRange()
self.SetIntervals()
self.SetSpeedValue()
self.SetIntervalColours()
self.SetArcColour()
self.SetTicks()
self.SetTicksFont()
self.SetTicksColour()
self.SetSpeedBackground()
self.SetHandColour()
self.SetShadowColour()
self.SetFillerColour()
self.SetDirection()
self.SetNumberOfSecondaryTicks()
self.SetMiddleText()
self.SetMiddleTextFont()
self.SetMiddleTextColour()
self.SetBottomText()
self.SetBottomTextFont()
self.SetBottomTextColour()
self.SetFirstGradientColour()
self.SetSecondGradientColour()
self.SetHandStyle()
self.DrawExternalArc()
self.DrawExternalCircle()
# for LED control
self._LEDwidth = 0
self._LEDheight = 0
self._LEDx = 0
self._LEDy = 0
self._InitLEDInternals()
self.SetLEDAlignment()
self.SetDrawFaded()
BufferedWindow.__init__(self, parent, id, pos, size,
style=wx.NO_FULL_REPAINT_ON_RESIZE,
bufferedstyle=bufferedstyle)
if self._mousestyle & SM_MOUSE_TRACK:
self.Bind(wx.EVT_MOUSE_EVENTS, self.OnMouseMotion)
def Draw(self, dc):
"""
Draws Everything On The Empty Bitmap.
Here All The Chosen Styles Are Applied.
GIGANTIC HUMONGOUS UGLY function that draws I{everything} on the bitmap except for the LEDs.
@param dc: the dc
@type dc: L{wx.BufferedDC}
"""
size = self.GetClientSize()
if size.x < 21 or size.y < 21:
return
new_dim = size.Get()
if not hasattr(self, "dim"):
self.dim = new_dim
self.scale = min([float(new_dim[0]) / self.dim[0],
float(new_dim[1]) / self.dim[1]])
# Create An Empty Bitmap
self.faceBitmap = wx.EmptyBitmap(size.width, size.height)
dc.BeginDrawing()
speedbackground = self.GetSpeedBackground()
# Set Background Of The Control
dc.SetBackground(wx.Brush(speedbackground))
dc.Clear()
centerX = self.faceBitmap.GetWidth()/2
centerY = self.faceBitmap.GetHeight()/2
self.CenterX = centerX
self.CenterY = centerY
# Get The Radius Of The Sector. Set It A Bit Smaller So The Arc Is Drawn Fully Inside The Control
radius = min(centerX, centerY) - 2
self.Radius = radius
# Get The Angle Of Existence Of The Sector
anglerange = self.GetAngleRange()
startangle = anglerange[1]
endangle = anglerange[0]
self.StartAngle = startangle
self.EndAngle = endangle
# Initialize The Colours And The Intervals - Just For Reference To The
# Children Functions
colours = None
intervals = None
if self._extrastyle & SM_DRAW_SECTORS or self._extrastyle & SM_DRAW_PARTIAL_SECTORS:
# Get The Intervals Colours
colours = self.GetIntervalColours()[:]
textangles = []
colourangles = []
xcoords = []
ycoords = []
# Get The Intervals (Partial Sectors)
intervals = self.GetIntervals()[:]
start = min(intervals)
end = max(intervals)
span = end - start
self.StartValue = start
self.EndValue = end
self.Span = span
# Get The Current Value For The SpeedMeter
currentvalue = self.GetSpeedValue()
# Get The Direction Of The SpeedMeter
direction = self.GetDirection()
if direction == "Reverse":
intervals.reverse()
if self._extrastyle & SM_DRAW_SECTORS or self._extrastyle & SM_DRAW_PARTIAL_SECTORS:
colours.reverse()
currentvalue = end - currentvalue
# This Because DrawArc Does Not Draw Last Point
offset = 0.1*self.scale/180.0
xstart, ystart = self.__CircleCoords(radius+1, -endangle, centerX, centerY)
xend, yend = self.__CircleCoords(radius+1, -startangle-offset, centerX, centerY)
# Calculate The Angle For The Current Value Of SpeedMeter
accelangle = (currentvalue - start)/float(span)*(startangle-endangle) - startangle
dc.SetPen(wx.TRANSPARENT_PEN)
if self._extrastyle & SM_DRAW_PARTIAL_FILLER:
# Get Some Data For The Partial Filler
fillercolour = self.GetFillerColour()
fillerendradius = radius - 10.0*self.scale
fillerstartradius = radius
if direction == "Advance":
fillerstart = accelangle
fillerend = -startangle
else:
fillerstart = -endangle
fillerend = accelangle
xs1, ys1 = self.__CircleCoords(fillerendradius, fillerstart, centerX, centerY)
xe1, ye1 = self.__CircleCoords(fillerendradius, fillerend, centerX, centerY)
xs2, ys2 = self.__CircleCoords(fillerstartradius, fillerstart, centerX, centerY)
xe2, ye2 = self.__CircleCoords(fillerstartradius, fillerend, centerX, centerY)
# Get The Sector In Which The Current Value Is
intersection = self.__GetIntersection(currentvalue, intervals)
sectorradius = radius - 10*self.scale
else:
sectorradius = radius
if self._extrastyle & SM_DRAW_PARTIAL_FILLER:
# Draw The Filler (Both In "Advance" And "Reverse" Directions)
dc.SetBrush(wx.Brush(fillercolour))
dc.DrawArc(xs2, ys2, xe2, ye2, centerX, centerY)
if self._extrastyle & SM_DRAW_SECTORS == 0:
dc.SetBrush(wx.Brush(speedbackground))
xclean1, yclean1 = self.__CircleCoords(sectorradius, -endangle, centerX, centerY)
xclean2, yclean2 = self.__CircleCoords(sectorradius, -startangle-offset, centerX, centerY)
dc.DrawArc(xclean1, yclean1, xclean2, yclean2, centerX, centerY)
# This Is Needed To Fill The Partial Sector Correctly
xold, yold = self.__CircleCoords(radius, startangle+endangle, centerX, centerY)
# Draw The Sectors
for ii, interval in enumerate(intervals):
if direction == "Advance":
current = interval - start
else:
current = end - interval
angle = (current/float(span))*(startangle-endangle) - startangle
angletext = -((pi/2.0) + angle)*180/pi
textangles.append(angletext)
colourangles.append(angle)
xtick, ytick = self.__CircleCoords(radius, angle, centerX, centerY)
# Keep The Coordinates, We Will Need Them After To Position The Ticks
xcoords.append(xtick)
ycoords.append(ytick)
x = xtick
y = ytick
if self._extrastyle & SM_DRAW_SECTORS:
if self._extrastyle & SM_DRAW_PARTIAL_FILLER:
if direction == "Advance":
if current > currentvalue:
x, y = self.__CircleCoords(radius, angle, centerX, centerY)
else:
x, y = self.__CircleCoords(sectorradius, angle, centerX, centerY)
else:
if current < end - currentvalue:
x, y = self.__CircleCoords(radius, angle, centerX, centerY)
else:
x, y = self.__CircleCoords(sectorradius, angle, centerX, centerY)
else:
x, y = self.__CircleCoords(radius, angle, centerX, centerY)
if ii > 0:
if self._extrastyle & SM_DRAW_PARTIAL_FILLER and ii == intersection:
# We Got The Interval In Which There Is The Current Value. If We Choose
# A "Reverse" Direction, First We Draw The Partial Sector, Next The Filler
dc.SetBrush(wx.Brush(speedbackground))
if direction == "Reverse":
if self._extrastyle & SM_DRAW_SECTORS:
dc.SetBrush(wx.Brush(colours[ii-1]))
dc.DrawArc(xe2, ye2, xold, yold, centerX, centerY)
if self._extrastyle & SM_DRAW_SECTORS:
dc.SetBrush(wx.Brush(colours[ii-1]))
else:
dc.SetBrush(wx.Brush(speedbackground))
dc.DrawArc(xs1, ys1, xe1, ye1, centerX, centerY)
if self._extrastyle & SM_DRAW_SECTORS:
dc.SetBrush(wx.Brush(colours[ii-1]))
# Here We Draw The Rest Of The Sector In Which The Current Value Is
if direction == "Advance":
dc.DrawArc(xs1, ys1, x, y, centerX, centerY)
x = xs1
y = ys1
else:
dc.DrawArc(xe2, ye2, x, y, centerX, centerY)
elif self._extrastyle & SM_DRAW_SECTORS:
dc.SetBrush(wx.Brush(colours[ii-1]))
# Here We Still Use The SM_DRAW_PARTIAL_FILLER Style, But We Are Not
# In The Sector Where The Current Value Resides
if self._extrastyle & SM_DRAW_PARTIAL_FILLER and ii != intersection:
if direction == "Advance":
dc.DrawArc(x, y, xold, yold, centerX, centerY)
else:
if ii < intersection:
dc.DrawArc(x, y, xold, yold, centerX, centerY)
# This Is The Case Where No SM_DRAW_PARTIAL_FILLER Has Been Chosen
else:
dc.DrawArc(x, y, xold, yold, centerX, centerY)
else:
if self._extrastyle & SM_DRAW_PARTIAL_FILLER and self._extrastyle & SM_DRAW_SECTORS:
dc.SetBrush(wx.Brush(fillercolour))
dc.DrawArc(xs2, ys2, xe2, ye2, centerX, centerY)
x, y = self.__CircleCoords(sectorradius, angle, centerX, centerY)
dc.SetBrush(wx.Brush(colours[ii]))
dc.DrawArc(xs1, ys1, xe1, ye1, centerX, centerY)
x = xs2
y = ys2
xold = x
yold = y
if self._extrastyle & SM_DRAW_PARTIAL_SECTORS:
sectorendradius = radius - 10.0*self.scale
sectorstartradius = radius
xps, yps = self.__CircleCoords(sectorstartradius, angle, centerX, centerY)
if ii > 0:
dc.SetBrush(wx.Brush(colours[ii-1]))
dc.DrawArc(xps, yps, xpsold, ypsold, centerX, centerY)
xpsold = xps
ypsold = yps
if self._extrastyle & SM_DRAW_PARTIAL_SECTORS:
xps1, yps1 = self.__CircleCoords(sectorendradius, -endangle+2*offset, centerX, centerY)
xps2, yps2 = self.__CircleCoords(sectorendradius, -startangle-2*offset, centerX, centerY)
dc.SetBrush(wx.Brush(speedbackground))
dc.DrawArc(xps1, yps1, xps2, yps2, centerX, centerY)
if self._extrastyle & SM_DRAW_GRADIENT:
dc.SetPen(wx.TRANSPARENT_PEN)
xcurrent, ycurrent = self.__CircleCoords(radius, accelangle, centerX, centerY)
# calculate gradient coefficients
col2 = self.GetSecondGradientColour()
col1 = self.GetFirstGradientColour()
r1, g1, b1 = int(col1.Red()), int(col1.Green()), int(col1.Blue())
r2, g2, b2 = int(col2.Red()), int(col2.Green()), int(col2.Blue())
flrect = float(radius+self.scale)
numsteps = 200
rstep = float((r2 - r1)) / numsteps
gstep = float((g2 - g1)) / numsteps
bstep = float((b2 - b1)) / numsteps
rf, gf, bf = 0, 0, 0
radiusteps = flrect/numsteps
interface = 0
for ind in range(numsteps+1):
currCol = (r1 + rf, g1 + gf, b1 + bf)
dc.SetBrush(wx.Brush(currCol))
gradradius = flrect - radiusteps*ind
xst1, yst1 = self.__CircleCoords(gradradius, -endangle, centerX, centerY)
xen1, yen1 = self.__CircleCoords(gradradius, -startangle-offset, centerX, centerY)
if self._extrastyle & SM_DRAW_PARTIAL_FILLER:
if gradradius >= fillerendradius:
if direction == "Advance":
dc.DrawArc(xstart, ystart, xcurrent, ycurrent, centerX, centerY)
else:
dc.DrawArc(xcurrent, ycurrent, xend, yend, centerX, centerY)
else:
if interface == 0:
interface = 1
myradius = fillerendradius + 1
xint1, yint1 = self.__CircleCoords(myradius, -endangle, centerX, centerY)
xint2, yint2 = self.__CircleCoords(myradius, -startangle-offset, centerX, centerY)
dc.DrawArc(xint1, yint1, xint2, yint2, centerX, centerY)
dc.DrawArc(xst1, yst1, xen1, yen1, centerX, centerY)
else:
if self._extrastyle & SM_DRAW_PARTIAL_SECTORS:
if gradradius <= sectorendradius:
if interface == 0:
interface = 1
myradius = sectorendradius + 1
xint1, yint1 = self.__CircleCoords(myradius, -endangle, centerX, centerY)
xint2, yint2 = self.__CircleCoords(myradius, -startangle-offset, centerX, centerY)
dc.DrawArc(xint1, yint1, xint2, yint2, centerX, centerY)
else:
dc.DrawArc(xst1, yst1, xen1, yen1, centerX, centerY)
else:
dc.DrawArc(xst1, yst1, xen1, yen1, centerX, centerY)
rf = rf + rstep
gf = gf + gstep
bf = bf + bstep
textheight = 0
# Get The Ticks And The Ticks Colour
ticks = self.GetTicks()[:]
tickscolour = self.GetTicksColour()
if direction == "Reverse":
ticks.reverse()
if self._extrastyle & SM_DRAW_SECONDARY_TICKS:
ticknum = self.GetNumberOfSecondaryTicks()
oldinterval = intervals[0]
dc.SetPen(wx.Pen(tickscolour, 1))
dc.SetBrush(wx.Brush(tickscolour))
dc.SetTextForeground(tickscolour)
# Get The Font For The Ticks
tfont, fontsize = self.GetTicksFont()
tfont = tfont[0]
myfamily = tfont.GetFamily()
fsize = self.scale*fontsize
tfont.SetPointSize(int(fsize))
tfont.SetFamily(myfamily)
dc.SetFont(tfont)
if self._extrastyle & SM_DRAW_FANCY_TICKS:
facename = tfont.GetFaceName()
ffamily = familyname[fontfamily.index(tfont.GetFamily())]
fweight = weightsname[weights.index(tfont.GetWeight())]
fstyle = stylesname[styles.index(tfont.GetStyle())]
fcolour = wx.TheColourDatabase.FindName(tickscolour)
textheight = 0
# Draw The Ticks And The Markers (Text Ticks)
for ii, angles in enumerate(textangles):
strings = ticks[ii]
if self._extrastyle & SM_DRAW_FANCY_TICKS == 0:
width, height, dummy, dummy = dc.GetFullTextExtent(strings, tfont)
textheight = height
else:
width, height, dummy = fancytext.GetFullExtent(strings, dc)
textheight = height
lX = dc.GetCharWidth()/2.0
lY = dc.GetCharHeight()/2.0
if self._extrastyle & SM_ROTATE_TEXT:
angis = colourangles[ii] - float(width)/(2.0*radius)
x, y = self.__CircleCoords(radius-10.0*self.scale, angis, centerX, centerY)
dc.DrawRotatedText(strings, x, y, angles)
else:
angis = colourangles[ii]
if self._extrastyle & SM_DRAW_FANCY_TICKS == 0:
x, y = self.__CircleCoords(radius-10*self.scale, angis, centerX, centerY)
lX = lX*len(strings)
x = x - lX - width*cos(angis)/2.0
y = y - lY - height*sin(angis)/2.0
if self._extrastyle & SM_DRAW_FANCY_TICKS:
fancystr = '<font family="' + ffamily + '" size="' + str(int(fsize)) + '" weight="' + fweight + '"'
fancystr = fancystr + ' color="' + fcolour + '"' + ' style="' + fstyle + '"> ' + strings + ' </font>'
width, height, dummy = fancytext.GetFullExtent(fancystr, dc)
x, y = self.__CircleCoords(radius-10*self.scale, angis, centerX, centerY)
x = x - width/2.0 - width*cos(angis)/2.0
y = y - height/2.0 - height*sin(angis)/2.0
fancytext.RenderToDC(fancystr, dc, x, y)
else:
dc.DrawText(strings, x, y)
# This Is The Small Rectangle --> Tick Mark
rectangle = colourangles[ii] + pi/2.0
sinrect = sin(rectangle)
cosrect = cos(rectangle)
x1 = xcoords[ii] - self.scale*cosrect
y1 = ycoords[ii] - self.scale*sinrect
x2 = x1 + 3*self.scale*cosrect
y2 = y1 + 3*self.scale*sinrect
x3 = x1 - 10*self.scale*sinrect
y3 = y1 + 10*self.scale*cosrect
x4 = x3 + 3*self.scale*cosrect
y4 = y3 + 3*self.scale*sinrect
points = [(x1, y1), (x2, y2), (x4, y4), (x3, y3)]
dc.DrawPolygon(points)
if self._extrastyle & SM_DRAW_SECONDARY_TICKS:
if ii > 0:
newinterval = intervals[ii]
oldinterval = intervals[ii-1]
spacing = (newinterval - oldinterval)/float(ticknum+1)
for tcount in xrange(ticknum):
if direction == "Advance":
oldinterval = (oldinterval + spacing) - start
stint = oldinterval
else:
oldinterval = start + (oldinterval + spacing)
stint = end - oldinterval
angle = (stint/float(span))*(startangle-endangle) - startangle
rectangle = angle + pi/2.0
sinrect = sin(rectangle)
cosrect = cos(rectangle)
xt, yt = self.__CircleCoords(radius, angle, centerX, centerY)
x1 = xt - self.scale*cosrect
y1 = yt - self.scale*sinrect
x2 = x1 + self.scale*cosrect
y2 = y1 + self.scale*sinrect
x3 = x1 - 6*self.scale*sinrect
y3 = y1 + 6*self.scale*cosrect
x4 = x3 + self.scale*cosrect
y4 = y3 + self.scale*sinrect
points = [(x1, y1), (x2, y2), (x4, y4), (x3, y3)]
dc.DrawPolygon(points)
oldinterval = newinterval
tfont.SetPointSize(fontsize)
tfont.SetFamily(myfamily)
self.SetTicksFont(tfont)
# Draw The External Arc
dc.SetBrush(wx.TRANSPARENT_BRUSH)
if self._drawarc and not self._drawfullarc:
dc.SetPen(wx.Pen(self.GetArcColour(), 2.0))
# If It's Not A Complete Circle, Draw The Connecting Lines And The Arc
if abs(abs(startangle - endangle) - 2*pi) > 1.0/180.0:
dc.DrawArc(xstart, ystart, xend, yend, centerX, centerY)
dc.DrawLine(xstart, ystart, centerX, centerY)
dc.DrawLine(xend, yend, centerX, centerY)
else:
# Draw A Circle, Is A 2*pi Extension Arc = Complete Circle
dc.DrawCircle(centerX, centerY, radius)
if self._drawfullarc:
dc.DrawCircle(centerX, centerY, radius)
# Here We Draw The Text In The Middle, Near The Start Of The Arrow (If Present)
# This Is Like The "Km/h" Or "mph" Text In The Cars
if self._extrastyle & SM_DRAW_MIDDLE_TEXT:
middlecolour = self.GetMiddleTextColour()
middletext = self.GetMiddleText()
middleangle = (startangle + endangle)/2.0
middlefont, middlesize = self.GetMiddleTextFont()
middlesize = self.scale*middlesize
middlefont.SetPointSize(int(middlesize))
dc.SetFont(middlefont)
mw, mh, dummy, dummy = dc.GetFullTextExtent(middletext, middlefont)
newx = centerX + 1.5*mw*cos(middleangle) - mw/2.0
newy = centerY - 1.5*mh*sin(middleangle) - mh/2.0
dc.SetTextForeground(middlecolour)
dc.DrawText(middletext, newx, newy)
# Here We Draw The Text In The Bottom
# This Is Like The "Km/h" Or "mph" Text In The Cars
if self._extrastyle & SM_DRAW_BOTTOM_TEXT:
bottomcolour = self.GetBottomTextColour()
bottomtext = self.GetBottomText()
# hack for two lines of text
if bottomtext.find("\n") != -1:
# we have a newline
foo = bottomtext.partition("\n")
bottomtext1 = foo[0]
bottomtext2 = foo[2]
bottomangle = (startangle + endangle)/2.0
bottomfont, bottomsize = self.GetBottomTextFont()
bottomsize = self.scale*bottomsize
bottomfont.SetPointSize(int(bottomsize))
dc.SetFont(bottomfont)
mw, mh, dummy, dummy = dc.GetFullTextExtent(bottomtext1, bottomfont)
newx = centerX + 1.5*mw*cos(bottomangle) - mw/2.0
newy = ystart
yoffset = mh + (mh * 2)
dc.SetTextForeground(bottomcolour)
dc.DrawText(bottomtext1, newx, newy)
mw, mh, dummy, dummy = dc.GetFullTextExtent(bottomtext2, bottomfont)
newx = centerX + 1.5*mw*cos(bottomangle) - mw/2.0
newy = ystart + yoffset
dc.SetTextForeground(bottomcolour)
dc.DrawText(bottomtext2, newx, newy)
else:
bottomangle = (startangle + endangle)/2.0
bottomfont, bottomsize = self.GetBottomTextFont()
bottomsize = self.scale*bottomsize
bottomfont.SetPointSize(int(bottomsize))
dc.SetFont(bottomfont)
mw, mh, dummy, dummy = dc.GetFullTextExtent(bottomtext, bottomfont)
newx = centerX + 1.5*mw*cos(bottomangle) - mw/2.0
newy = ystart
dc.SetTextForeground(bottomcolour)
dc.DrawText(bottomtext, newx, newy)
self.bottomTextBottom = int(newy + mh)
# Here We Draw The Icon In The Middle, Near The Start Of The Arrow (If Present)
# This Is Like The "Fuel" Icon In The Cars
if self._extrastyle & SM_DRAW_MIDDLE_ICON:
middleicon = self.GetMiddleIcon()
middlewidth, middleheight = self.__GetMiddleIconDimens()
middleicon.SetWidth(middlewidth*self.scale)
middleicon.SetHeight(middleheight*self.scale)
middleangle = (startangle + endangle)/2.0
mw = middleicon.GetWidth()
mh = middleicon.GetHeight()
newx = centerX + 1.5*mw*cos(middleangle) - mw/2.0
newy = centerY - 1.5*mh*sin(middleangle) - mh/2.0
dc.DrawIcon(middleicon, newx, newy)
# Restore Icon Dimension, If Not Something Strange Happens
middleicon.SetWidth(middlewidth)
middleicon.SetHeight(middleheight)
# Requested To Draw The Hand
if self._extrastyle & SM_DRAW_HAND:
handstyle = self.GetHandStyle()
handcolour = self.GetHandColour()
# Calculate The Data For The Hand
if textheight == 0:
maxradius = radius-10*self.scale
else:
maxradius = radius-5*self.scale-textheight
xarr, yarr = self.__CircleCoords(maxradius, accelangle, centerX, centerY)
if handstyle == "Arrow":
x1, y1 = self.__CircleCoords(maxradius, accelangle - 4.0/180, centerX, centerY)
x2, y2 = self.__CircleCoords(maxradius, accelangle + 4.0/180, centerX, centerY)
x3, y3 = self.__CircleCoords(maxradius+3*(abs(xarr-x1)), accelangle, centerX, centerY)
newx = centerX + 4*cos(accelangle)*self.scale
newy = centerY + 4*sin(accelangle)*self.scale
else:
x1 = centerX + 4*self.scale*sin(accelangle)
y1 = centerY - 4*self.scale*cos(accelangle)
x2 = xarr
y2 = yarr
x3 = centerX - 4*self.scale*sin(accelangle)
y3 = centerY + 4*self.scale*cos(accelangle)
x4, y4 = self.__CircleCoords(5*self.scale*sqrt(3), accelangle+pi, centerX, centerY)
if self._extrastyle & SM_DRAW_SHADOW:
if handstyle == "Arrow":
# Draw The Shadow
shadowcolour = self.GetShadowColour()
dc.SetPen(wx.Pen(shadowcolour, 5*log(self.scale+1)))
dc.SetBrush(wx.Brush(shadowcolour))
shadowdistance = 2.0*self.scale
dc.DrawLine(newx + shadowdistance, newy + shadowdistance,
xarr + shadowdistance, yarr + shadowdistance)
dc.DrawPolygon([(x1+shadowdistance, y1+shadowdistance),
(x2+shadowdistance, y2+shadowdistance),
(x3+shadowdistance, y3+shadowdistance)])
else:
# Draw The Shadow
shadowcolour = self.GetShadowColour()
dc.SetBrush(wx.Brush(shadowcolour))
dc.SetPen(wx.Pen(shadowcolour, 1.0))
shadowdistance = 1.5*self.scale
dc.DrawPolygon([(x1+shadowdistance, y1+shadowdistance),
(x2+shadowdistance, y2+shadowdistance),
(x3+shadowdistance, y3+shadowdistance),
(x4+shadowdistance, y4+shadowdistance)])
if handstyle == "Arrow":
dc.SetPen(wx.Pen(handcolour, 1.5))
# Draw The Small Circle In The Center --> The Hand "Holder"
dc.SetBrush(wx.Brush(speedbackground))
dc.DrawCircle(centerX, centerY, 4*self.scale)
dc.SetPen(wx.Pen(handcolour, 5*log(self.scale+1)))
# Draw The "Hand", An Arrow
dc.DrawLine(newx, newy, xarr, yarr)
# Draw The Arrow Pointer
dc.SetBrush(wx.Brush(handcolour))
dc.DrawPolygon([(x1, y1), (x2, y2), (x3, y3)])
else:
# Draw The Hand Pointer
dc.SetPen(wx.Pen(handcolour, 1.5))
dc.SetBrush(wx.Brush(handcolour))
dc.DrawPolygon([(x1, y1), (x2, y2), (x3, y3), (x4, y4)])
# Draw The Small Circle In The Center --> The Hand "Holder"
dc.SetBrush(wx.Brush(speedbackground))
dc.DrawCircle(centerX, centerY, 4*self.scale)
# here is where we draw the LEDNumberCtrl-style display at the bottom, if requested
if self._extrastyle & SM_DRAW_BOTTOM_LED:
self._DrawLED(dc, centerX)
dc.EndDrawing()
def SetIntervals(self, intervals=None):
"""
Sets The Intervals For SpeedMeter (Main Ticks Numeric Values).
@param intervals: list of the interval end points
@type intervals: L{list} of L{int}s or L{float}s, one marking the end of each interval
"""
if intervals is None:
intervals = [0, 50, 100]
self._intervals = intervals
def GetIntervals(self):
"""
Gets The Intervals For SpeedMeter.
@rtype: L{list} of L{int}s or L{float}s, one marking the end of each interval
"""
return self._intervals
def GetBottomTextBottom(self):
"""
Gets the Y position of the bottom of the BottomText.
Used to position the LEDNumberCtrl if one is present.
@return: Y position of the bottom of the BottomText on the BufferedWindow (DC)
@rtype: int
"""
return self.bottomTextBottom
def GetWidth(self):
"""
Gets the whole width of the SpeedMeter.
Used to position the LEDNumberCtrl if present.
@return: Width (px) of the whole faceBitmap
@rtype: int
"""
return self.faceBitmap.GetWidth()
def SetSpeedValue(self, value=None):
"""
Sets The Current Value For SpeedMeter.
Please also see L{SetValueMultiplier}() function.
The value MUST be within the range specified by the L{intervals} (see L{GetIntervals}).
Calling this function will trigger the L{UpdateDrawing}() method to redraw.
@param value: the desired value
@type value: L{int} or L{float}
"""
if value is None:
value = (max(self._intervals) - min(self._intervals))/2.0
else:
if not (isinstance(value, int) or isinstance(value, float)):
raise TypeError("value parameter of SetSpeedValue must be of int or float type, not " + str(type(value)))
if value < min(self._intervals):
raise IndexError("value parameter of SetSpeedValue is smaller than the minimum element in the points (intervals) list")
elif value > max(self._intervals):
raise IndexError("value parameter of SetSpeedValue Greater Than Maximum Element In Points List")
self._speedvalue = value
self._speedStr = str(int(value * self._ValueMultiplier))
try:
self.UpdateDrawing()
except:
pass
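    # Note: the LED string is the speed value times the multiplier (see
    # self._speedStr above). For example, assuming SetValueMultiplier(1000)
    # was called, SetSpeedValue(3.5) points the needle at 3.5 on the dial
    # while the LED display reads "3500".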
def GetSpeedValue(self):
"""
Gets The Current Value For SpeedMeter.
@rtype: L{int} or L{float}
"""
return self._speedvalue
def SetAngleRange(self, start=0, end=pi):
"""
Sets The Range Of Existence For SpeedMeter.
These Values *Must* Be Specified In RADIANS.
@param start: the start angle (default 0)
@type start: L{int} in radians
@param end: the end angle (default pi)
@type end: L{int} in radians
"""
self._anglerange = [start, end]
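    # Example (illustrative): a wider gauge spanning three quarters of a
    # circle instead of the default half circle:
    #   meter.SetAngleRange(-pi/4, pi + pi/4)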
def GetAngleRange(self):
"""
Gets The Range Of Existence For SpeedMeter.
The Returned Values Are In RADIANS.
@rtype: L{list} of L{int}s (radians) like [start, end]
"""
return self._anglerange
    def SetIntervalColours(self, colours=None):
        """
        Sets The Colours For The Intervals.
        Every Interval (Circle Sector) Should Have A Colour.
        Expects a list of L{wx.Colour}s, one per sector, i.e. one fewer than the number of interval end points.
        @param colours: list of colours to use for intervals
        @type colours: L{list} of L{wx.Colour}s, of length len(intervals) - 1
        """
        if colours is None:
            if not hasattr(self, "_anglerange"):
                raise RuntimeError("Impossible To Set Interval Colours, "
                                   "Please Define The Intervals Ranges Before.")
            # one colour per sector (one fewer than the number of end points)
            colours = [wx.WHITE]*(len(self._intervals) - 1)
        else:
            if len(colours) != len(self._intervals) - 1:
                raise ValueError("Length Of Colour List Does Not Match Length "
                                 "Of Intervals Ranges List.")
        self._intervalcolours = colours
def GetIntervalColours(self):
"""
Gets The Colours For The Intervals.
@rtype: L{list} of L{wx.Colour}s
"""
if hasattr(self, "_intervalcolours"):
return self._intervalcolours
else:
raise "\nERROR: No Interval Colours Have Been Defined"
def SetTicks(self, ticks=None):
"""
Sets The Ticks For SpeedMeter Intervals (Main Ticks String Values).
Must be a list of strings, of the same length as the number of intervals.
This should probably not be called from outside the class, unless you want to set the interval ticks to something weird (maybe a fuel meter using "1/4", "1/2", etc.).
It is probably better to use the L{SetValueMultiplier}() function if you're dealing with linear integers.
@param ticks: list of strings, of the same length as the number of intervals.
@type ticks: L{list} of L{string}s
"""
        if ticks is None:
            if not hasattr(self, "_anglerange"):
                raise RuntimeError("Impossible To Set Interval Ticks, "
                                   "Please Define The Intervals Ranges Before.")
            ticks = []
            for values in self._intervals:
                ticks.append(str(values))
        else:
            if len(ticks) != len(self._intervals):
                raise ValueError("Length Of Ticks List Does Not Match Length "
                                 "Of Intervals Ranges List.")
        self._intervalticks = ticks
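    # Consistent setup sketch (illustrative): the list lengths matter; N
    # interval end points need N - 1 sector colours and N tick labels:
    #
    #   meter.SetIntervals([0, 30, 70, 100])
    #   meter.SetIntervalColours([wx.GREEN, wx.Colour(255, 150, 50), wx.RED])
    #   meter.SetTicks(["0", "30", "70", "100"])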
def GetTicks(self):
"""
Gets The Ticks For SpeedMeter Intervals (Main Ticks String Values).
@rtype: L{list} of L{string}s
"""
if hasattr(self, "_intervalticks"):
return self._intervalticks
else:
raise "\nERROR: No Interval Ticks Have Been Defined"
def SetTicksFont(self, font=None):
"""
Sets The Ticks Font.
@param font: the font for the text (default wx.Font(10, wx.SWISS, wx.NORMAL, wx.BOLD, False))
@type font: L{wx.Font}
"""
if font is None:
self._originalfont = [wx.Font(10, wx.SWISS, wx.NORMAL, wx.BOLD, False)]
self._originalsize = 10
else:
self._originalfont = [font]
self._originalsize = font.GetPointSize()
def GetTicksFont(self):
"""
Gets The Ticks Font.
@rtype: L{tuple} of (L{wx.Font}, L{float} size)
"""
return self._originalfont[:], self._originalsize
def SetTicksColour(self, colour=None):
"""
Sets The Ticks Colour.
@param colour
@type colour: L{wx.Colour}
"""
if colour is None:
colour = wx.BLUE
self._tickscolour = colour
def GetTicksColour(self):
"""
Gets The Ticks Colour.
@rtype: L{wx.Colour}
"""
return self._tickscolour
def SetSpeedBackground(self, colour=None):
"""
Sets The Background Colour Outside The SpeedMeter Control.
@param colour
@type colour: L{wx.Colour}
"""
if colour is None:
colour = wx.SystemSettings_GetColour(0)
self._speedbackground = colour
def GetSpeedBackground(self):
"""
Gets The Background Colour Outside The SpeedMeter Control.
@rtype: L{wx.Colour}
"""
return self._speedbackground
def SetHandColour(self, colour=None):
"""
Sets The Hand (Arrow Indicator) Colour.
@param colour
@type colour: L{wx.Colour}
"""
if colour is None:
colour = wx.RED
self._handcolour = colour
def GetHandColour(self):
"""
Gets The Hand (Arrow Indicator) Colour.
@rtype: L{wx.Colour}
"""
return self._handcolour
def SetArcColour(self, colour=None):
"""
Sets The External Arc Colour (Thicker Line).
@param colour
@type colour: L{wx.Colour}
"""
if colour is None:
colour = wx.BLACK
self._arccolour = colour
def GetArcColour(self):
"""
Gets The External Arc Colour.
@rtype: L{wx.Colour}
"""
return self._arccolour
def SetShadowColour(self, colour=None):
"""
Sets The Hand's Shadow Colour.
@param colour
@type colour: L{wx.Colour}
"""
if colour is None:
colour = wx.Colour(150, 150, 150)
self._shadowcolour = colour
def GetShadowColour(self):
"""
Gets The Hand's Shadow Colour.
@rtype: L{wx.Colour}
"""
return self._shadowcolour
def SetFillerColour(self, colour=None):
"""
Sets The Partial Filler Colour.
A Circle Corona Near The Ticks Will Be Filled With This Colour, From
The Starting Value To The Current Value Of SpeedMeter.
@param colour: the colour
@type colour: L{wx.Colour}
"""
if colour is None:
colour = wx.Colour(255, 150, 50)
self._fillercolour = colour
def GetFillerColour(self):
"""
Gets The Partial Filler Colour.
@rtype: L{wx.Colour}
"""
return self._fillercolour
def SetDirection(self, direction=None):
"""
Sets The Direction Of Advancing SpeedMeter Value.
Specifying "Advance" Will Move The Hand In Clock-Wise Direction (Like Normal
Car Speed Control), While Using "Reverse" Will Move It CounterClock-Wise
Direction.
@param direction: direction of needle movement
@type direction: L{string} "Advance" (default) or "Reverse"
"""
        if direction is None:
            direction = "Advance"
        if direction not in ["Advance", "Reverse"]:
            raise ValueError('Direction Parameter Should Be One Of "Advance" Or "Reverse".')
        self._direction = direction
def GetDirection(self):
"""
Gets The Direction Of Advancing SpeedMeter Value.
@rtype: L{string} "Advance" or "Reverse"
"""
return self._direction
def SetNumberOfSecondaryTicks(self, ticknum=None):
"""
Sets The Number Of Secondary (Intermediate) Ticks.
@param ticknum: number of secondary ticks (MUST be >= 1, default is 3)
@type ticknum: L{int}
"""
        if ticknum is None:
            ticknum = 3
        if ticknum < 1:
            raise ValueError("Number Of Ticks Must Be At Least 1.")
        self._secondaryticks = ticknum
def GetNumberOfSecondaryTicks(self):
"""
Gets The Number Of Secondary (Intermediate) Ticks.
@rtype: L{int}
"""
return self._secondaryticks
def SetMiddleText(self, text=None):
"""
Sets The Text To Be Drawn Near The Center Of SpeedMeter.
@param text: the text to draw
@type text: L{string}
"""
if text is None:
text = ""
self._middletext = text
def GetMiddleText(self):
"""
Gets The Text To Be Drawn Near The Center Of SpeedMeter.
@rtype: L{string}
"""
return self._middletext
def SetMiddleTextFont(self, font=None):
"""
Sets The Font For The Text In The Middle.
@param font: the font for the text (default 10pt, wx.Font(1, wx.SWISS, wx.NORMAL, wx.BOLD, False))
@type font: L{wx.Font}
"""
if font is None:
self._middletextfont = wx.Font(1, wx.SWISS, wx.NORMAL, wx.BOLD, False)
self._middletextsize = 10.0
self._middletextfont.SetPointSize(self._middletextsize)
else:
self._middletextfont = font
self._middletextsize = font.GetPointSize()
self._middletextfont.SetPointSize(self._middletextsize)
def GetMiddleTextFont(self):
"""
Gets The Font For The Text In The Middle.
@rtype: L{tuple} of (L{wx.Font}, L{float} size)
"""
return self._middletextfont, self._middletextsize
def SetMiddleTextColour(self, colour=None):
"""
Sets The Colour For The Text In The Middle.
@param colour: the colour for the text
@type colour: L{wx.Colour}
"""
if colour is None:
colour = wx.BLUE
self._middlecolour = colour
def GetMiddleTextColour(self):
"""
Gets The Colour For The Text In The Middle.
@rtype: L{wx.Colour}
"""
return self._middlecolour
def SetBottomText(self, text=None):
"""
        Sets The Text To Be Drawn Near The Bottom Of SpeedMeter. Can have up to one newline. This should be used for a label, such as the gauge type and scale (e.g. "RPM x1000").
        Newlines are understood: the text is drawn as two separate lines, and this is taken into account when positioning the LED digits, if used.
@param text: the text to draw
@type text: L{string}
"""
if text is None:
text = ""
self._bottomtext = text
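    # Example (illustrative) of the two-line label hack:
    #   meter.SetBottomText("RPM\nx1000")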
def GetBottomText(self):
"""
Gets The Text To Be Drawn Near The Bottom Of SpeedMeter (label)
@rtype: L{string}
"""
return self._bottomtext
def SetBottomTextFont(self, font=None):
"""
Sets The Font For The Text In The Bottom.
@param font: the font for the text (default 10pt, wx.Font(1, wx.SWISS, wx.NORMAL, wx.BOLD, False))
@type font: L{wx.Font}
"""
if font is None:
self._bottomtextfont = wx.Font(1, wx.SWISS, wx.NORMAL, wx.BOLD, False)
self._bottomtextsize = 10.0
self._bottomtextfont.SetPointSize(self._bottomtextsize)
else:
self._bottomtextfont = font
self._bottomtextsize = font.GetPointSize()
self._bottomtextfont.SetPointSize(self._bottomtextsize)
def GetBottomTextFont(self):
"""
Gets The Font For The Text In The Bottom.
@rtype: L{tuple} of (L{wx.Font}, L{float} size)
"""
return self._bottomtextfont, self._bottomtextsize
def SetBottomTextColour(self, colour=None):
"""
Sets The Colour For The Text In The Bottom of the gauge (label).
@param colour: the colour for the text
@type colour: L{wx.Colour}
"""
if colour is None:
colour = wx.BLUE
self._bottomcolour = colour
def SetLEDColour(self, colour=None):
"""
Sets The Colour For Bottom LED digits.
@param colour: the colour for the digits
@type colour: L{wx.Colour}
"""
if colour is None:
colour = wx.GREEN
self._ledcolour = colour
def GetLEDColour(self):
"""
Gets The Colour For The LED Digits
@rtype: L{wx.Colour}
"""
return self._ledcolour
def GetBottomTextColour(self):
"""
Gets The Colour For The Text In The Bottom
@rtype: L{wx.Colour}
"""
return self._bottomcolour
def SetMiddleIcon(self, icon):
"""
Sets The Icon To Be Drawn Near The Center Of SpeedMeter.
@param icon: The icon to be drawn
@type icon: L{wx.Icon}
"""
if icon.Ok():
self._middleicon = icon
else:
# edited 2010-06-13 by jantman to get rid of error - was raising an error as a string
print "\nERROR: Invalid Icon Passed To SpeedMeter."
return False
def GetMiddleIcon(self):
"""
Gets The Icon To Be Drawn Near The Center Of SpeedMeter.
@rtype: L{wx.Icon}
"""
return self._middleicon
def __GetMiddleIconDimens(self):
"""
USED INTERNALLY ONLY - Undocumented. Do NOT call from outside this class.
"""
return self._middleicon.GetWidth(), self._middleicon.GetHeight()
def __CircleCoords(self, radius, angle, centerX, centerY):
"""
USED INTERNALLY ONLY - Undocumented. Do NOT call from outside this class.
Method to get the coordinates of the circle.
"""
x = radius*cos(angle) + centerX
y = radius*sin(angle) + centerY
return x, y
def __GetIntersection(self, current, intervals):
"""
USED INTERNALLY ONLY - Undocumented. Do NOT call from outside this class.
"""
if self.GetDirection() == "Reverse":
interval = intervals[:]
interval.reverse()
else:
interval = intervals
indexes = range(len(intervals))
try:
intersection = [ind for ind in indexes if interval[ind] <= current <= interval[ind+1]]
except:
if self.GetDirection() == "Reverse":
intersection = [len(intervals) - 1]
else:
intersection = [0]
return intersection[0]
def SetFirstGradientColour(self, colour=None):
"""
Sets The First Gradient Colour (Near The Ticks).
@param colour: Colour for the first gradient
@type colour: L{wx.Colour}
"""
if colour is None:
colour = wx.Colour(145, 220, 200)
self._firstgradientcolour = colour
def GetFirstGradientColour(self):
"""
Gets The First Gradient Colour (Near The Ticks).
@return: first gradient color
@rtype: L{wx.Colour}
"""
return self._firstgradientcolour
def SetSecondGradientColour(self, colour=None):
"""
Sets The Second Gradient Colour (Near The Center).
@param colour: Color for the second gradient
@type colour: L{wx.Colour}
"""
if colour is None:
colour = wx.WHITE
self._secondgradientcolour = colour
def GetSecondGradientColour(self):
"""
Gets The Second Gradient Colour (Near The Center).
@return: second gradient color
@rtype: L{wx.Colour}
"""
return self._secondgradientcolour
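    # Gradient sketch (illustrative): requires SM_DRAW_GRADIENT in extrastyle.
    #   meter.SetFirstGradientColour(wx.Colour(255, 150, 50))  # near the ticks
    #   meter.SetSecondGradientColour(wx.WHITE)                # near the center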
def SetHandStyle(self, style=None):
"""
Sets The Style For The Hand (Arrow Indicator).
By Specifying "Hand" SpeedMeter Will Draw A Polygon That Simulates The Car
Speed Control Indicator. Using "Arrow" Will Force SpeedMeter To Draw A
Simple Arrow.
@param style: hand style, string, either "Arrow" or "Hand"
@type style: L{string}
"""
        if style is None:
            style = "Hand"
        if style not in ["Hand", "Arrow"]:
            raise ValueError('Hand Style Parameter Should Be One Of "Hand" Or "Arrow".')
        self._handstyle = style
def GetHandStyle(self):
"""
Gets The Style For The Hand (Arrow Indicator)
@return: hand style, string either "Arrow" or "Hand"
@rtype: L{string}
"""
return self._handstyle
def DrawExternalArc(self, draw=True):
"""
        Specify Whether Or Not You Wish To Draw The External (Thicker) Arc.
        @param draw: Whether or not to draw the external arc (default True).
@type draw: L{boolean}
"""
self._drawarc = draw
def DrawExternalCircle(self, draw=False):
"""
Specify Whether Or Not You Wish To Draw The External (Thicker) Arc as a full circle.
@param draw: boolean, whether or not to draw the full circle (default False)
@type draw: L{boolean}
"""
self._drawfullarc = draw
def OnMouseMotion(self, event):
""" Handles The Mouse Events.
Here Only Left Clicks/Drags Are Involved. Should SpeedMeter Have Something More?
@todo: Do we even want this? What does it do? Seems like it would allow the user to change the value or something, which is BAD.
"""
mousex = event.GetX()
mousey = event.GetY()
if event.Leaving():
return
        pos = self.GetPosition()
        size = self.GetClientSize()
centerX = self.CenterX
centerY = self.CenterY
direction = self.GetDirection()
if event.LeftIsDown():
angle = atan2(float(mousey) - centerY, centerX - float(mousex)) + pi - self.EndAngle
if angle >= 2*pi:
angle = angle - 2*pi
if direction == "Advance":
currentvalue = (self.StartAngle - self.EndAngle - angle)*float(self.Span)/(self.StartAngle - self.EndAngle) + self.StartValue
else:
currentvalue = (angle)*float(self.Span)/(self.StartAngle - self.EndAngle) + self.StartValue
if currentvalue >= self.StartValue and currentvalue <= self.EndValue:
self.SetSpeedValue(currentvalue)
event.Skip()
def GetSpeedStyle(self):
""" Returns A List Of Strings And A List Of Integers Containing The Styles. """
stringstyle = []
integerstyle = []
if self._extrastyle & SM_ROTATE_TEXT:
stringstyle.append("SM_ROTATE_TEXT")
integerstyle.append(SM_ROTATE_TEXT)
if self._extrastyle & SM_DRAW_SECTORS:
stringstyle.append("SM_DRAW_SECTORS")
integerstyle.append(SM_DRAW_SECTORS)
if self._extrastyle & SM_DRAW_PARTIAL_SECTORS:
stringstyle.append("SM_DRAW_PARTIAL_SECTORS")
integerstyle.append(SM_DRAW_PARTIAL_SECTORS)
if self._extrastyle & SM_DRAW_HAND:
stringstyle.append("SM_DRAW_HAND")
integerstyle.append(SM_DRAW_HAND)
if self._extrastyle & SM_DRAW_SHADOW:
stringstyle.append("SM_DRAW_SHADOW")
integerstyle.append(SM_DRAW_SHADOW)
if self._extrastyle & SM_DRAW_PARTIAL_FILLER:
stringstyle.append("SM_DRAW_PARTIAL_FILLER")
integerstyle.append(SM_DRAW_PARTIAL_FILLER)
if self._extrastyle & SM_DRAW_SECONDARY_TICKS:
stringstyle.append("SM_DRAW_SECONDARY_TICKS")
integerstyle.append(SM_DRAW_SECONDARY_TICKS)
if self._extrastyle & SM_DRAW_MIDDLE_TEXT:
stringstyle.append("SM_DRAW_MIDDLE_TEXT")
integerstyle.append(SM_DRAW_MIDDLE_TEXT)
if self._extrastyle & SM_DRAW_BOTTOM_TEXT:
stringstyle.append("SM_DRAW_BOTTOM_TEXT")
integerstyle.append(SM_DRAW_BOTTOM_TEXT)
if self._extrastyle & SM_DRAW_BOTTOM_LED:
stringstyle.append("SM_DRAW_BOTTOM_LED")
integerstyle.append(SM_DRAW_BOTTOM_LED)
if self._extrastyle & SM_DRAW_MIDDLE_ICON:
stringstyle.append("SM_DRAW_MIDDLE_ICON")
integerstyle.append(SM_DRAW_MIDDLE_ICON)
if self._extrastyle & SM_DRAW_GRADIENT:
stringstyle.append("SM_DRAW_GRADIENT")
integerstyle.append(SM_DRAW_GRADIENT)
if self._extrastyle & SM_DRAW_FANCY_TICKS:
stringstyle.append("SM_DRAW_FANCY_TICKS")
integerstyle.append(SM_DRAW_FANCY_TICKS)
return stringstyle, integerstyle
# below here is stuff added by jantman for the LED control
def SetDrawFaded(self, DrawFaded=None, Redraw=False):
"""
Set the option to draw the faded (non-used) LED segments.
@param DrawFaded: Whether or not to draw the unused segments.
@type DrawFaded: L{boolean}
@param Redraw: Whether or not to redraw NOW.
@type Redraw: L{boolean}
"""
        if DrawFaded is None:
            DrawFaded = False  # default: faded segments off (the old code stored the falsy None here)
        if not hasattr(self, "_DrawFaded") or DrawFaded != self._DrawFaded:
            self._DrawFaded = DrawFaded
            if Redraw:
                self.Refresh(False)
def _InitLEDInternals(self):
"""
Sets up the class variables for the LED control stuff.
Should ONLY be called INTERNALLY.
"""
self._LineMargin = None
self._LineLength = None
self._LineWidth = None
self._DigitMargin = None
self._LeftStartPos = None
def _DrawLED(self, dc, CenterX):
"""
Handles all of the drawing for the LED control, just an extension to the original SpeedMeter Draw() method.
Should ONLY be called INTERNALLY.
@todo: this is hard coded to ignore the background - doesn't draw it. If you want something different, you need to change it.
@param dc: the DC
@type dc: L{dc}
@param CenterX: The X coordinate of the center of the gauge, as found in the original SpeedMeter code.
@type CenterX: L{int}
"""
self._RecalcInternals()
# Iterate each digit in the value, and draw.
if self.DEBUG is True:
print "===Drawing LED Value String: " + self._speedStr
for i in range(len(self._speedStr)):
c = self._speedStr[i]
if self.DEBUG:
print "Digit Number: " + str(i)
print "Drawing Digit: " + c
# Draw faded lines if wanted.
if self._DrawFaded and (c != '.'):
self._DrawDigit(dc, DIGITALL, i)
# Draw the digits.
if c == '0':
self._DrawDigit(dc, DIGIT0, i)
elif c == '1':
self._DrawDigit(dc, DIGIT1, i)
elif c == '2':
self._DrawDigit(dc, DIGIT2, i)
elif c == '3':
self._DrawDigit(dc, DIGIT3, i)
elif c == '4':
self._DrawDigit(dc, DIGIT4, i)
elif c == '5':
self._DrawDigit(dc, DIGIT5, i)
elif c == '6':
self._DrawDigit(dc, DIGIT6, i)
elif c == '7':
self._DrawDigit(dc, DIGIT7, i)
elif c == '8':
self._DrawDigit(dc, DIGIT8, i)
elif c == '9':
self._DrawDigit(dc, DIGIT9, i)
elif c == '-':
self._DrawDigit(dc, DASH, i)
elif c == '.':
self._DrawDigit(dc, DECIMALSIGN, (i-1))
elif c == ' ':
# skip this
pass
else:
print "Error: Undefined Digit Value: " + c
def _DrawDigit(self, dc, Digit, Column):
"""
Internal code to actually draw the lines that make up a single digit.
Should be called INTERNALLY ONLY.
@param dc: The DC.
@type dc: L{dc}
@param Digit: The constant (mask) defining the lines of the specified digit.
@type Digit: L{int}
@param Column: the number of the column that the digit should be in
@type Column: L{int}
"""
LineColor = self.GetForegroundColour()
if Digit == DIGITALL:
R = LineColor.Red() / 16
G = LineColor.Green() / 16
B = LineColor.Blue() / 16
LineColor = wx.Colour(R, G, B)
XPos = self._LeftStartPos + (Column * (self._LineLength + self._DigitMargin))
# Create a pen and draw the lines.
Pen = wx.Pen(LineColor, self._LineWidth, wx.SOLID)
dc.SetPen(Pen)
if Digit & LINE1:
dc.DrawLine(XPos + self._LineMargin*2, self._LineMargin + self.LEDyOffset,
XPos + self._LineLength + self._LineMargin*2, self._LineMargin + self.LEDyOffset)
if self.DEBUG:
print "Line1"
if Digit & LINE2:
dc.DrawLine(XPos + self._LineLength + self._LineMargin*3,
self._LineMargin*2 + self.LEDyOffset, XPos + self._LineLength + self._LineMargin*3,
self._LineLength + (self._LineMargin*2) + self.LEDyOffset)
if self.DEBUG:
print "Line2"
if Digit & LINE3:
dc.DrawLine(XPos + self._LineLength + self._LineMargin*3, self._LineLength + (self._LineMargin*4) + self.LEDyOffset,
XPos + self._LineLength + self._LineMargin*3, self._LineLength*2 + (self._LineMargin*4) + self.LEDyOffset)
if self.DEBUG:
print "Line3"
if Digit & LINE4:
dc.DrawLine(XPos + self._LineMargin*2, self._LineLength*2 + (self._LineMargin*5) + self.LEDyOffset,
XPos + self._LineLength + self._LineMargin*2, self._LineLength*2 + (self._LineMargin*5) + self.LEDyOffset)
if self.DEBUG:
print "Line4"
if Digit & LINE5:
dc.DrawLine(XPos + self._LineMargin, self._LineLength + (self._LineMargin*4) + self.LEDyOffset,
XPos + self._LineMargin, self._LineLength*2 + (self._LineMargin*4) + self.LEDyOffset)
if self.DEBUG:
print "Line5"
if Digit & LINE6:
dc.DrawLine(XPos + self._LineMargin, self._LineMargin*2 + self.LEDyOffset,
XPos + self._LineMargin, self._LineLength + (self._LineMargin*2) + self.LEDyOffset)
if self.DEBUG:
print "Line6"
if Digit & LINE7:
dc.DrawLine(XPos + self._LineMargin*2, self._LineLength + (self._LineMargin*3) + self.LEDyOffset,
XPos + self._LineMargin*2 + self._LineLength, self._LineLength + (self._LineMargin*3) + self.LEDyOffset)
if self.DEBUG:
print "Line7"
if Digit & DECIMALSIGN:
dc.DrawLine(XPos + self._LineLength + self._LineMargin*4, self._LineLength*2 + (self._LineMargin*5) + self.LEDyOffset,
XPos + self._LineLength + self._LineMargin*4, self._LineLength*2 + (self._LineMargin*5) + self.LEDyOffset)
if self.DEBUG:
print "Line DecimalSign"
#Dc.SetPen(wxNullPen);
def _RecalcInternals(self):
"""
        Recalculates all variables controlling the placement and geometry of the digits, based on the frame size. This should calculate everything like the gauge center and work off of that.
        Should be called INTERNALLY ONLY.
        Dimensions of LED segments
        Size of character is based on the HEIGHT of the widget, NOT the width.
        Segment height is calculated as follows:
        Each segment is m_LineLength pixels long.
        There is m_LineMargin pixels at the top and bottom of each line segment
        There is m_LineMargin pixels at the top and bottom of each digit
        Therefore, the height of each character is:
        m_LineMargin : Top digit border
        m_LineMargin+m_LineLength+m_LineMargin : Top half of segment
        m_LineMargin+m_LineLength+m_LineMargin : Bottom half of segment
        m_LineMargin : Bottom digit border
        ----------------------
        m_LineMargin*6 + m_LineLength*2 == Total height of digit.
        Therefore, (m_LineMargin*6 + m_LineLength*2) must equal Height
        Spacing between characters can then be calculated as follows:
        m_LineMargin : before the digit,
        m_LineMargin+m_LineLength+m_LineMargin : for the digit width
        m_LineMargin : after the digit
        = m_LineMargin*4 + m_LineLength
        """
# the size params for just the LED area itself
size = self.GetClientSize()
LEDHeight = int(size.y / 7) # based off of height of 30 in a 214px high client
Height = LEDHeight
LEDWidth = int(size.x / 2.4) # based off of width of 120 in a 290px wide client
ClientWidth = size.x
self.LEDyOffset = self.bottomTextBottom
if (Height * 0.075) < 1:
self._LineMargin = 1
else:
self._LineMargin = int(Height * 0.075)
if (Height * 0.275) < 1:
self._LineLength = 1
else:
self._LineLength = int(Height * 0.275)
self._LineWidth = self._LineMargin
self._DigitMargin = self._LineMargin * 4
# Count the number of characters in the string; '.' characters are not
# included because they do not take up space in the display
        count = 0
for char in self._speedStr:
if char != '.':
count = count + 1
ValueWidth = (self._LineLength + self._DigitMargin) * count
        # NOTE (assumption): the LED drawing area starts at the left edge of
        # the client area, so the offset is zero.
        LeftEdge = 0
        if self._Alignment == gizmos.LED_ALIGN_LEFT:
            self._LeftStartPos = self._LineMargin + LeftEdge
        elif self._Alignment == gizmos.LED_ALIGN_RIGHT:
            self._LeftStartPos = ClientWidth - ValueWidth - self._LineMargin + LeftEdge
else:
# self._Alignment == gizmos.LED_ALIGN_CENTER:
# centered is the default
self._LeftStartPos = (ClientWidth /2 ) - (ValueWidth / 2)
def SetLEDAlignment(self, Alignment=None, Redraw=False):
"""
Sets LED digit alignment.
@param Alignment - the alignment of the LED digits - valid values are L{gizmos.LED_ALIGN_LEFT}, L{gizmos.LED_ALIGN_RIGHT}, L{gizmos.LED_ALIGN_CENTER} (center is default).
@type Alignment: wxLEDValueAlign
@param Redraw: Whether or not to redraw NOW.
@type Redraw: L{boolean}
"""
        if Alignment is None:
            return
        if Alignment != self._Alignment:
            self._Alignment = Alignment
if Redraw:
try:
self.UpdateDrawing()
except:
pass
def SetValueMultiplier(self, multiplier=1):
"""
Sets the value multiplier. Values set with SetValue() will be multiplied by this amount before being displayed on the LED control.
@param multiplier: the value multiplier
@type multiplier: L{int} or L{float}
@todo: re-do all this by setting a ValueScale (maybe at create time) and using this scale to determine the gauge scale, also divide values by it before feeding into the meter code itself (i.e. LED will show value as passed with SetValue()).
"""
self._ValueMultiplier = multiplier<|fim▁end|> | |
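        # Example (hypothetical usage): a gauge scaled 0-8 that represents RPM
        # in thousands could call SetValueMultiplier(1000), so that SetValue(6.5)
        # shows 6500 on the LED readout.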
<|file_name|>psmetrics.py<|end_file_name|><|fim▁begin|># $Id$
# Christopher Lee [email protected]
# based upon pdfmetrics.py by Andy Robinson
from . import fontinfo
from . import latin1MetricsCache
##############################################################
#
# PDF Metrics
# This is a preamble to give us a stringWidth function.
# loads and caches AFM files, but won't need to as the
# standard fonts are there already
##############################################################
_stdenc_widths = {
'courier':
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 0, 600, 600, 600,
600, 0, 600, 600, 600, 600, 600, 600, 600, 600, 0, 600, 0, 600, 600, 600, 600, 600, 600, 600,
600, 0, 600, 600, 0, 600, 600, 600, 600, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 600, 0,
600, 0, 0, 0, 0, 600, 600, 600, 600, 0, 0, 0, 0, 0, 600, 0, 0, 0, 600, 0, 0, 600, 600, 600, 600,
0, 0, 600],
'courier-bold':
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 0, 600, 600, 600,
600, 0, 600, 600, 600, 600, 600, 600, 600, 600, 0, 600, 0, 600, 600, 600, 600, 600, 600, 600,
600, 0, 600, 600, 0, 600, 600, 600, 600, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 600, 0,
600, 0, 0, 0, 0, 600, 600, 600, 600, 0, 0, 0, 0, 0, 600, 0, 0, 0, 600, 0, 0, 600, 600, 600, 600,
0, 0, 600],
'courier-boldoblique':
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 0, 600, 600, 600,
600, 0, 600, 600, 600, 600, 600, 600, 600, 600, 0, 600, 0, 600, 600, 600, 600, 600, 600, 600,
600, 0, 600, 600, 0, 600, 600, 600, 600, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 600, 0,
600, 0, 0, 0, 0, 600, 600, 600, 600, 0, 0, 0, 0, 0, 600, 0, 0, 0, 600, 0, 0, 600, 600, 600, 600,
0, 0, 600],
'courier-oblique':
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600,
600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 600, 0, 600, 600, 600,
600, 0, 600, 600, 600, 600, 600, 600, 600, 600, 0, 600, 0, 600, 600, 600, 600, 600, 600, 600,
600, 0, 600, 600, 0, 600, 600, 600, 600, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 600, 0,
600, 0, 0, 0, 0, 600, 600, 600, 600, 0, 0, 0, 0, 0, 600, 0, 0, 0, 600, 0, 0, 600, 600, 600, 600,
0, 0, 600],
'helvetica':
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
278, 278, 355, 556, 556, 889, 667, 222, 333, 333, 389, 584, 278, 333, 278, 278, 556, 556, 556,
556, 556, 556, 556, 556, 556, 556, 278, 278, 584, 584, 584, 556, 1015, 667, 667, 722, 722, 667,
611, 778, 722, 278, 500, 667, 556, 833, 722, 778, 667, 778, 722, 667, 611, 722, 667, 944, 667,
667, 611, 278, 278, 278, 469, 556, 222, 556, 556, 500, 556, 556, 278, 556, 556, 222, 222, 500,
222, 833, 556, 556, 556, 556, 333, 500, 278, 556, 500, 722, 500, 500, 500, 334, 260, 334, 584, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 333, 556, 556, 167, 556, 556, 556, 556, 191, 333, 556, 333, 333, 500, 500, 0, 556, 556, 556,
278, 0, 537, 350, 222, 333, 333, 556, 1000, 1000, 0, 611, 0, 333, 333, 333, 333, 333, 333, 333,
333, 0, 333, 333, 0, 333, 333, 333, 1000, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1000,
0, 370, 0, 0, 0, 0, 556, 778, 1000, 365, 0, 0, 0, 0, 0, 889, 0, 0, 0, 278, 0, 0, 222, 611, 944,
611, 0, 0, 834],
'helvetica-bold':
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
278, 333, 474, 556, 556, 889, 722, 278, 333, 333, 389, 584, 278, 333, 278, 278, 556, 556, 556,
556, 556, 556, 556, 556, 556, 556, 333, 333, 584, 584, 584, 611, 975, 722, 722, 722, 722, 667,
611, 778, 722, 278, 556, 722, 611, 833, 722, 778, 667, 778, 722, 667, 611, 722, 667, 944, 667,
667, 611, 333, 278, 333, 584, 556, 278, 556, 611, 556, 611, 556, 333, 611, 611, 278, 278, 556,
278, 889, 611, 611, 611, 611, 389, 556, 333, 611, 556, 778, 556, 556, 500, 389, 280, 389, 584, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 333, 556, 556, 167, 556, 556, 556, 556, 238, 500, 556, 333, 333, 611, 611, 0, 556, 556, 556,
278, 0, 556, 350, 278, 500, 500, 556, 1000, 1000, 0, 611, 0, 333, 333, 333, 333, 333, 333, 333,
333, 0, 333, 333, 0, 333, 333, 333, 1000, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1000,
0, 370, 0, 0, 0, 0, 611, 778, 1000, 365, 0, 0, 0, 0, 0, 889, 0, 0, 0, 278, 0, 0, 278, 611, 944,
611, 0, 0, 834],
'helvetica-boldoblique':
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
278, 333, 474, 556, 556, 889, 722, 278, 333, 333, 389, 584, 278, 333, 278, 278, 556, 556, 556,
556, 556, 556, 556, 556, 556, 556, 333, 333, 584, 584, 584, 611, 975, 722, 722, 722, 722, 667,
611, 778, 722, 278, 556, 722, 611, 833, 722, 778, 667, 778, 722, 667, 611, 722, 667, 944, 667,
667, 611, 333, 278, 333, 584, 556, 278, 556, 611, 556, 611, 556, 333, 611, 611, 278, 278, 556,
278, 889, 611, 611, 611, 611, 389, 556, 333, 611, 556, 778, 556, 556, 500, 389, 280, 389, 584, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 333, 556, 556, 167, 556, 556, 556, 556, 238, 500, 556, 333, 333, 611, 611, 0, 556, 556, 556,
278, 0, 556, 350, 278, 500, 500, 556, 1000, 1000, 0, 611, 0, 333, 333, 333, 333, 333, 333, 333,
333, 0, 333, 333, 0, 333, 333, 333, 1000, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1000,
0, 370, 0, 0, 0, 0, 611, 778, 1000, 365, 0, 0, 0, 0, 0, 889, 0, 0, 0, 278, 0, 0, 278, 611, 944,
611, 0, 0, 834],
'helvetica-oblique':<|fim▁hole|> [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
278, 278, 355, 556, 556, 889, 667, 222, 333, 333, 389, 584, 278, 333, 278, 278, 556, 556, 556,
556, 556, 556, 556, 556, 556, 556, 278, 278, 584, 584, 584, 556, 1015, 667, 667, 722, 722, 667,
611, 778, 722, 278, 500, 667, 556, 833, 722, 778, 667, 778, 722, 667, 611, 722, 667, 944, 667,
667, 611, 278, 278, 278, 469, 556, 222, 556, 556, 500, 556, 556, 278, 556, 556, 222, 222, 500,
222, 833, 556, 556, 556, 556, 333, 500, 278, 556, 500, 722, 500, 500, 500, 334, 260, 334, 584, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 333, 556, 556, 167, 556, 556, 556, 556, 191, 333, 556, 333, 333, 500, 500, 0, 556, 556, 556,
278, 0, 537, 350, 222, 333, 333, 556, 1000, 1000, 0, 611, 0, 333, 333, 333, 333, 333, 333, 333,
333, 0, 333, 333, 0, 333, 333, 333, 1000, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1000,
0, 370, 0, 0, 0, 0, 556, 778, 1000, 365, 0, 0, 0, 0, 0, 889, 0, 0, 0, 278, 0, 0, 222, 611, 944,
611, 0, 0, 834],
'symbol':
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
250, 333, 713, 500, 549, 833, 778, 439, 333, 333, 500, 549, 250, 549, 250, 278, 500, 500, 500,
500, 500, 500, 500, 500, 500, 500, 278, 278, 549, 549, 549, 444, 549, 722, 667, 722, 612, 611,
763, 603, 722, 333, 631, 722, 686, 889, 722, 722, 768, 741, 556, 592, 611, 690, 439, 768, 645,
795, 611, 333, 863, 333, 658, 500, 500, 631, 549, 549, 494, 439, 521, 411, 603, 329, 603, 549,
549, 576, 521, 549, 549, 521, 549, 603, 439, 576, 713, 686, 493, 686, 494, 480, 200, 480, 549, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 620, 247, 549, 167, 713, 500, 753, 753, 753, 753, 1042, 987, 603, 987, 603, 400, 549, 411,
549, 549, 713, 494, 460, 549, 549, 549, 549, 1000, 603, 1000, 658, 823, 686, 795, 987, 768, 768,
823, 768, 768, 713, 713, 713, 713, 713, 713, 713, 768, 713, 790, 790, 890, 823, 549, 250, 713,
603, 603, 1042, 987, 603, 987, 603, 494, 329, 790, 790, 786, 713, 384, 384, 384, 384, 384, 384,
494, 494, 494, 494, 0, 329, 274, 686, 686, 686, 384, 384, 384, 384, 384, 384, 494, 494, 790],
'times-bold':
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
250, 333, 555, 500, 500, 1000, 833, 333, 333, 333, 500, 570, 250, 333, 250, 278, 500, 500, 500,
500, 500, 500, 500, 500, 500, 500, 333, 333, 570, 570, 570, 500, 930, 722, 667, 722, 722, 667,
611, 778, 778, 389, 500, 778, 667, 944, 722, 778, 611, 778, 722, 556, 667, 722, 722, 1000, 722,
722, 667, 333, 278, 333, 581, 500, 333, 500, 556, 444, 556, 444, 333, 500, 556, 278, 333, 556,
278, 833, 556, 500, 556, 556, 444, 389, 333, 556, 500, 722, 500, 500, 444, 394, 220, 394, 520, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 333, 500, 500, 167, 500, 500, 500, 500, 278, 500, 500, 333, 333, 556, 556, 0, 500, 500, 500,
250, 0, 540, 350, 333, 500, 500, 500, 1000, 1000, 0, 500, 0, 333, 333, 333, 333, 333, 333, 333,
333, 0, 333, 333, 0, 333, 333, 333, 1000, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1000,
0, 300, 0, 0, 0, 0, 667, 778, 1000, 330, 0, 0, 0, 0, 0, 722, 0, 0, 0, 278, 0, 0, 278, 500, 722,
556, 0, 0, 750],
'times-bolditalic':
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
250, 389, 555, 500, 500, 833, 778, 333, 333, 333, 500, 570, 250, 333, 250, 278, 500, 500, 500,
500, 500, 500, 500, 500, 500, 500, 333, 333, 570, 570, 570, 500, 832, 667, 667, 667, 722, 667,
667, 722, 778, 389, 500, 667, 611, 889, 722, 722, 611, 722, 667, 556, 611, 722, 667, 889, 667,
611, 611, 333, 278, 333, 570, 500, 333, 500, 500, 444, 500, 444, 333, 500, 556, 278, 278, 500,
278, 778, 556, 500, 500, 500, 389, 389, 278, 556, 444, 667, 500, 444, 389, 348, 220, 348, 570, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 389, 500, 500, 167, 500, 500, 500, 500, 278, 500, 500, 333, 333, 556, 556, 0, 500, 500, 500,
250, 0, 500, 350, 333, 500, 500, 500, 1000, 1000, 0, 500, 0, 333, 333, 333, 333, 333, 333, 333,
333, 0, 333, 333, 0, 333, 333, 333, 1000, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 944, 0,
266, 0, 0, 0, 0, 611, 722, 944, 300, 0, 0, 0, 0, 0, 722, 0, 0, 0, 278, 0, 0, 278, 500, 722, 500,
0, 0, 750],
'times-italic':
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
250, 333, 420, 500, 500, 833, 778, 333, 333, 333, 500, 675, 250, 333, 250, 278, 500, 500, 500,
500, 500, 500, 500, 500, 500, 500, 333, 333, 675, 675, 675, 500, 920, 611, 611, 667, 722, 611,
611, 722, 722, 333, 444, 667, 556, 833, 667, 722, 611, 722, 611, 500, 556, 722, 611, 833, 611,
556, 556, 389, 278, 389, 422, 500, 333, 500, 500, 444, 500, 444, 278, 500, 500, 278, 278, 444,
278, 722, 500, 500, 500, 500, 389, 389, 278, 500, 444, 667, 444, 444, 389, 400, 275, 400, 541, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 389, 500, 500, 167, 500, 500, 500, 500, 214, 556, 500, 333, 333, 500, 500, 0, 500, 500, 500,
250, 0, 523, 350, 333, 556, 556, 500, 889, 1000, 0, 500, 0, 333, 333, 333, 333, 333, 333, 333,
333, 0, 333, 333, 0, 333, 333, 333, 889, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 889, 0,
276, 0, 0, 0, 0, 556, 722, 944, 310, 0, 0, 0, 0, 0, 667, 0, 0, 0, 278, 0, 0, 278, 500, 667, 500,
0, 0, 750],
'times-roman':
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
250, 333, 408, 500, 500, 833, 778, 333, 333, 333, 500, 564, 250, 333, 250, 278, 500, 500, 500,
500, 500, 500, 500, 500, 500, 500, 278, 278, 564, 564, 564, 444, 921, 722, 667, 667, 722, 611,
556, 722, 722, 333, 389, 722, 611, 889, 722, 722, 556, 722, 667, 556, 611, 722, 722, 944, 722,
722, 611, 333, 278, 333, 469, 500, 333, 444, 500, 444, 500, 444, 333, 500, 500, 278, 278, 500,
278, 778, 500, 500, 500, 500, 333, 389, 278, 500, 500, 722, 500, 500, 444, 480, 200, 480, 541, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 333, 500, 500, 167, 500, 500, 500, 500, 180, 444, 500, 333, 333, 556, 556, 0, 500, 500, 500,
250, 0, 453, 350, 333, 444, 444, 500, 1000, 1000, 0, 444, 0, 333, 333, 333, 333, 333, 333, 333,
333, 0, 333, 333, 0, 333, 333, 333, 1000, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 889, 0,
276, 0, 0, 0, 0, 611, 722, 889, 310, 0, 0, 0, 0, 0, 667, 0, 0, 0, 278, 0, 0, 278, 500, 722, 500,
0, 0, 750],
'zapfdingbats':
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
278, 974, 961, 974, 980, 719, 789, 790, 791, 690, 960, 939, 549, 855, 911, 933, 911, 945, 974,
755, 846, 762, 761, 571, 677, 763, 760, 759, 754, 494, 552, 537, 577, 692, 786, 788, 788, 790,
793, 794, 816, 823, 789, 841, 823, 833, 816, 831, 923, 744, 723, 749, 790, 792, 695, 776, 768,
792, 759, 707, 708, 682, 701, 826, 815, 789, 789, 707, 687, 696, 689, 786, 787, 713, 791, 785,
791, 873, 761, 762, 762, 759, 759, 892, 892, 788, 784, 438, 138, 277, 415, 392, 392, 668, 668, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 732, 544, 544, 910, 667, 760, 760, 776, 595, 694, 626, 788, 788, 788, 788, 788, 788, 788, 788,
788, 788, 788, 788, 788, 788, 788, 788, 788, 788, 788, 788, 788, 788, 788, 788, 788, 788, 788,
788, 788, 788, 788, 788, 788, 788, 788, 788, 788, 788, 788, 788, 894, 838, 1016, 458, 748, 924,
748, 918, 927, 928, 928, 834, 873, 828, 924, 924, 917, 930, 931, 463, 883, 836, 836, 867, 867,
696, 696, 874, 0, 874, 760, 946, 771, 865, 771, 888, 967, 888, 831, 873, 927, 970, 234]
}
ascent_descent = {'Courier': (629, -157),
'Courier-Bold': (626, -142),
'Courier-BoldOblique': (626, -142),
'Courier-Oblique': (629, -157),
'Helvetica': (718, -207),
'Helvetica-Bold': (718, -207),
'Helvetica-BoldOblique': (718, -207),
'Helvetica-Oblique': (718, -207),
'Symbol': (0, 0),
'Times-Bold': (676, -205),
'Times-BoldItalic': (699, -205),
'Times-Italic': (683, -205),
'Times-Roman': (683, -217),
'ZapfDingbats': (0, 0)}
_Widths = {'StandardEncoding': _stdenc_widths, 'Latin1Encoding': latin1MetricsCache.FontWidths}
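# NOTE: widths are expressed in 1/1000 of the font size (standard AFM units),
# so e.g. stringwidth('Hello', 'helvetica', 'StandardEncoding') * fontsize / 1000.0
# gives the rendered width in points.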
def stringwidth(text, font, encoding):
if font in fontinfo.NonRomanFonts:
widths = _Widths['StandardEncoding'][font.lower()]
else:
try:
widths = _Widths[encoding][font.lower()]
        except KeyError:
raise KeyError("Improper encoding {0} or font name {1}".format(encoding, font))
w = 0
for char in text:
chr_idx = ord(char)
if chr_idx < len(widths):
chr_width = widths[chr_idx]
else:
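            # character is outside this width table; fall back to the widest
            # glyph so the estimate errs on the large side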
chr_width = max(widths)
w = w + chr_width
return w<|fim▁end|> | |
<|file_name|>BatchCommand.java<|end_file_name|><|fim▁begin|>/*
* Copyright 2015 JBoss, by Red Hat, Inc
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.uberfire.ext.wires.bpmn.client.commands.impl;
import java.util.Arrays;
import java.util.List;
import java.util.Stack;
import org.uberfire.commons.validation.PortablePreconditions;
import org.uberfire.ext.wires.bpmn.client.commands.Command;
import org.uberfire.ext.wires.bpmn.client.commands.ResultType;
import org.uberfire.ext.wires.bpmn.client.commands.Results;
import org.uberfire.ext.wires.bpmn.client.rules.RuleManager;
/**
* A batch of Commands to be executed as an atomic unit
*/<|fim▁hole|>public class BatchCommand implements Command {
private List<Command> commands;
public BatchCommand( final List<Command> commands ) {
this.commands = PortablePreconditions.checkNotNull( "commands",
commands );
}
public BatchCommand( final Command... commands ) {
this.commands = Arrays.asList( PortablePreconditions.checkNotNull( "commands",
commands ) );
}
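    // Example (hypothetical usage): new BatchCommand( addNodeCmd, addEdgeCmd )
    // .apply( ruleManager ) applies both commands as a unit, rolling back the
    // first if the second reports an ERROR result.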
@Override
public Results apply( final RuleManager ruleManager ) {
final Results results = new DefaultResultsImpl();
final Stack<Command> appliedCommands = new Stack<Command>();
for ( Command command : commands ) {
results.getMessages().addAll( command.apply( ruleManager ).getMessages() );
if ( results.contains( ResultType.ERROR ) ) {
                // roll back already-applied commands in reverse (LIFO) order;
                // iterating a Stack directly would walk it bottom-up
                while ( !appliedCommands.isEmpty() ) {
                    appliedCommands.pop().undo( ruleManager );
                }
return results;
} else {
appliedCommands.add( command );
}
}
return results;
}
@Override
public Results undo( final RuleManager ruleManager ) {
final Results results = new DefaultResultsImpl();
final Stack<Command> appliedCommands = new Stack<Command>();
for ( Command command : commands ) {
results.getMessages().addAll( command.undo( ruleManager ).getMessages() );
if ( results.contains( ResultType.ERROR ) ) {
                // re-apply already-undone commands in reverse (LIFO) order
                while ( !appliedCommands.isEmpty() ) {
                    appliedCommands.pop().apply( ruleManager );
                }
return results;
} else {
appliedCommands.add( command );
}
}
return results;
}
}<|fim▁end|> | |
<|file_name|>elementIdElements.js<|end_file_name|><|fim▁begin|>/**
*
* Search for multiple elements on the page, starting from an element. The located
 * elements will be returned as WebElement JSON objects. The table below lists the
* locator strategies that each server should support. Elements should be returned in
* the order located in the DOM.
*
* @param {String} ID ID of a WebElement JSON object to route the command to
* @param {String} selector selector to query the elements
* @return {Object[]} A list of WebElement JSON objects for the located elements.
*
* @see https://w3c.github.io/webdriver/webdriver-spec.html#find-elements-from-element
* @type protocol
*
*/
import { ProtocolError } from '../utils/ErrorHandler'
import findStrategy from '../helpers/findElementStrategy'
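// Example (assumed usage): client.elementIdElements(elementId, 'ul li')
// resolves with WebElement JSON objects for every matching <li> descendant.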
export default function elementIdElements (id, selector) {
if (typeof id !== 'string' && typeof id !== 'number') {
throw new ProtocolError('number or type of arguments don\'t agree with elementIdElements protocol command')
}
let found = findStrategy(selector, true)
return this.requestHandler.create(`/session/:sessionId/element/${id}/elements`, {
using: found.using,
value: found.value
}).then((result) => {
result.selector = selector
/**
* W3C webdriver protocol has changed element identifier from `ELEMENT` to
* `element-6066-11e4-a52e-4f735466cecf`. Let's make sure both identifier
* are supported.
*/
result.value = result.value.map((elem) => {
const elemValue = elem.ELEMENT || elem['element-6066-11e4-a52e-4f735466cecf']
return {
ELEMENT: elemValue,
'element-6066-11e4-a52e-4f735466cecf': elemValue
}<|fim▁hole|>}<|fim▁end|> | })
return result
}) |
<|file_name|>build.rs<|end_file_name|><|fim▁begin|>// Copyright © 2015, Peter Atashian
// Licensed under the MIT License <LICENSE.md>
fn main() {<|fim▁hole|><|fim▁end|> | println!("cargo:rustc-flags=-l mscoree");
} |
<|file_name|>lib.rs<|end_file_name|><|fim▁begin|>// Copyright © 2015, Peter Atashian
// Licensed under the MIT License <LICENSE.md>
//! FFI bindings to d3d9.
#![cfg(windows)]
extern crate winapi;
use winapi::*;
#[cfg(any(target_arch = "x86", target_arch = "x86_64", target_arch = "arm"))]
extern "system" {
pub fn D3DPERF_BeginEvent(col: D3DCOLOR, wszName: LPCWSTR) -> INT;
pub fn D3DPERF_EndEvent() -> INT;
pub fn D3DPERF_GetStatus() -> DWORD;
pub fn D3DPERF_QueryRepeatFrame() -> BOOL;
pub fn D3DPERF_SetMarker(col: D3DCOLOR, wszName: LPCWSTR) -> ();
pub fn D3DPERF_SetOptions(dwOptions: DWORD) -> ();
pub fn D3DPERF_SetRegion(col: D3DCOLOR, wszName: LPCWSTR) -> ();
pub fn Direct3DCreate9(SDKVersion: UINT) -> *mut IDirect3D9;<|fim▁hole|>}<|fim▁end|> | pub fn Direct3DCreate9Ex(SDKVersion: UINT, arg1: *mut *mut IDirect3D9Ex) -> HRESULT; |
<|file_name|>ticketing.py<|end_file_name|><|fim▁begin|># -*- coding: utf-8 -*-
"""
logbook.ticketing
~~~~~~~~~~~~~~~~~
Implements long handlers that write to remote data stores and assign
each logging message a ticket id.
:copyright: (c) 2010 by Armin Ronacher, Georg Brandl.
:license: BSD, see LICENSE for more details.
"""
import time
import json
from logbook.base import NOTSET, level_name_property, LogRecord
from logbook.handlers import Handler, HashingHandlerMixin
from logbook.helpers import cached_property, b, PY2, u
class Ticket(object):
"""Represents a ticket from the database."""
level_name = level_name_property()
def __init__(self, db, row):
self.db = db
self.__dict__.update(row)
@cached_property
def last_occurrence(self):
"""The last occurrence."""
rv = self.get_occurrences(limit=1)
if rv:
return rv[0]
def get_occurrences(self, order_by='-time', limit=50, offset=0):
"""Returns the occurrences for this ticket."""
return self.db.get_occurrences(self.ticket_id, order_by, limit, offset)
def solve(self):
"""Marks this ticket as solved."""
self.db.solve_ticket(self.ticket_id)
self.solved = True
def delete(self):
"""Deletes the ticket from the database."""
self.db.delete_ticket(self.ticket_id)
# Silence DeprecationWarning
__hash__ = None
def __eq__(self, other):
equal = True
for key in self.__dict__.keys():
if getattr(self, key) != getattr(other, key):
equal = False
break
return equal
def __ne__(self, other):
return not self.__eq__(other)
class Occurrence(LogRecord):
"""Represents an occurrence of a ticket."""
def __init__(self, db, row):
self.update_from_dict(json.loads(row['data']))
self.db = db
self.time = row['time']
self.ticket_id = row['ticket_id']
self.occurrence_id = row['occurrence_id']
class BackendBase(object):
"""Provides an abstract interface to various databases."""
def __init__(self, **options):
self.options = options
self.setup_backend()
def setup_backend(self):
"""Setup the database backend."""
raise NotImplementedError()
def record_ticket(self, record, data, hash, app_id):
"""Records a log record as ticket."""
raise NotImplementedError()
def count_tickets(self):
"""Returns the number of tickets."""
raise NotImplementedError()
def get_tickets(self, order_by='-last_occurrence_time',
limit=50, offset=0):
"""Selects tickets from the database."""
raise NotImplementedError()
def solve_ticket(self, ticket_id):
"""Marks a ticket as solved."""
raise NotImplementedError()
def delete_ticket(self, ticket_id):
"""Deletes a ticket from the database."""
raise NotImplementedError()
def get_ticket(self, ticket_id):
"""Return a single ticket with all occurrences."""
raise NotImplementedError()
def get_occurrences(self, ticket, order_by='-time', limit=50, offset=0):
"""Selects occurrences from the database for a ticket."""
raise NotImplementedError()
class SQLAlchemyBackend(BackendBase):
"""Implements a backend that is writing into a database SQLAlchemy can
interface.
This backend takes some additional options:
`table_prefix`
an optional table prefix for all tables created by
the logbook ticketing handler.
`metadata`
an optional SQLAlchemy metadata object for the table creation.
`autocreate_tables`
can be set to `False` to disable the automatic
creation of the logbook tables.
"""
def setup_backend(self):
from sqlalchemy import create_engine, MetaData
from sqlalchemy.orm import sessionmaker, scoped_session
engine_or_uri = self.options.pop('uri', None)
metadata = self.options.pop('metadata', None)
table_prefix = self.options.pop('table_prefix', 'logbook_')
if hasattr(engine_or_uri, 'execute'):
self.engine = engine_or_uri
else:
# Pool recycle keeps connections from going stale,
# which happens in MySQL Databases
            # Pool size is more custom for our stack
self.engine = create_engine(engine_or_uri, convert_unicode=True,
pool_recycle=360, pool_size=1000)
# Create session factory using session maker
session = sessionmaker()
        # Bind to the engine
session.configure(bind=self.engine)
# Scoped session is a thread safe solution for
# interaction with the Database
self.session = scoped_session(session)
if metadata is None:
metadata = MetaData()
self.table_prefix = table_prefix
self.metadata = metadata
self.create_tables()
if self.options.get('autocreate_tables', True):
self.metadata.create_all(bind=self.engine)
def create_tables(self):
"""Creates the tables required for the handler on the class and
metadata.
"""
import sqlalchemy as db
def table(name, *args, **kwargs):
return db.Table(self.table_prefix + name, self.metadata,
*args, **kwargs)
self.tickets = table('tickets',
db.Column('ticket_id', db.Integer,
primary_key=True),
db.Column('record_hash', db.String(40),
unique=True),
db.Column('level', db.Integer),
db.Column('channel', db.String(120)),
db.Column('location', db.String(512)),
db.Column('module', db.String(256)),
db.Column('last_occurrence_time', db.DateTime),
db.Column('occurrence_count', db.Integer),
db.Column('solved', db.Boolean),
db.Column('app_id', db.String(80)))
self.occurrences = table('occurrences',
db.Column('occurrence_id',
db.Integer, primary_key=True),
db.Column('ticket_id', db.Integer,
db.ForeignKey(self.table_prefix +
'tickets.ticket_id')),
db.Column('time', db.DateTime),
db.Column('data', db.Text),
db.Column('app_id', db.String(80)))
def _order(self, q, table, order_by):
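        # a leading '-' selects descending order, e.g. '-last_occurrence_time'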
if order_by[0] == '-':
return q.order_by(table.c[order_by[1:]].desc())
return q.order_by(table.c[order_by])
def record_ticket(self, record, data, hash, app_id):
"""Records a log record as ticket."""
# Can use the session instead engine.connection and transaction
s = self.session
try:
q = self.tickets.select(self.tickets.c.record_hash == hash)
row = s.execute(q).fetchone()
if row is None:
row = s.execute(self.tickets.insert().values(
record_hash=hash,
level=record.level,
channel=record.channel or u(''),
location=u('%s:%d') % (record.filename, record.lineno),
module=record.module or u('<unknown>'),
occurrence_count=0,
solved=False,
app_id=app_id
))
ticket_id = row.inserted_primary_key[0]
else:
ticket_id = row['ticket_id']
s.execute(self.occurrences.insert()
.values(ticket_id=ticket_id,
time=record.time,
app_id=app_id,
data=json.dumps(data)))
s.execute(
self.tickets.update()
.where(self.tickets.c.ticket_id == ticket_id)
.values(occurrence_count=self.tickets.c.occurrence_count + 1,
last_occurrence_time=record.time,
solved=False))
s.commit()
except Exception:
s.rollback()
raise
# Closes the session and removes it from the pool
s.remove()
def count_tickets(self):
"""Returns the number of tickets."""
return self.engine.execute(self.tickets.count()).fetchone()[0]
def get_tickets(self, order_by='-last_occurrence_time', limit=50,
offset=0):
"""Selects tickets from the database."""
return [Ticket(self, row) for row in self.engine.execute(
self._order(self.tickets.select(), self.tickets, order_by)
.limit(limit).offset(offset)).fetchall()]
def solve_ticket(self, ticket_id):
"""Marks a ticket as solved."""
self.engine.execute(self.tickets.update()
.where(self.tickets.c.ticket_id == ticket_id)
.values(solved=True))
def delete_ticket(self, ticket_id):
"""Deletes a ticket from the database."""
self.engine.execute(self.occurrences.delete()
.where(self.occurrences.c.ticket_id == ticket_id))
self.engine.execute(self.tickets.delete()
.where(self.tickets.c.ticket_id == ticket_id))
def get_ticket(self, ticket_id):
"""Return a single ticket with all occurrences."""
row = self.engine.execute(self.tickets.select().where(
self.tickets.c.ticket_id == ticket_id)).fetchone()
if row is not None:
return Ticket(self, row)
def get_occurrences(self, ticket, order_by='-time', limit=50, offset=0):
"""Selects occurrences from the database for a ticket."""
return [Occurrence(self, row) for row in
self.engine.execute(self._order(
self.occurrences.select()
.where(self.occurrences.c.ticket_id == ticket),
self.occurrences, order_by)
.limit(limit).offset(offset)).fetchall()]
class MongoDBBackend(BackendBase):
"""Implements a backend that writes into a MongoDB database."""
class _FixedTicketClass(Ticket):
@property
def ticket_id(self):
return self._id
class _FixedOccurrenceClass(Occurrence):
def __init__(self, db, row):
self.update_from_dict(json.loads(row['data']))
self.db = db
self.time = row['time']
self.ticket_id = row['ticket_id']
self.occurrence_id = row['_id']
# TODO: Update connection setup once PYTHON-160 is solved.
def setup_backend(self):
from pymongo import ASCENDING, DESCENDING
from pymongo.connection import Connection
try:
from pymongo.uri_parser import parse_uri
except ImportError:
from pymongo.connection import _parse_uri as parse_uri
from pymongo.errors import AutoReconnect
_connection = None
uri = self.options.pop('uri', u(''))
_connection_attempts = 0
parsed_uri = parse_uri(uri, Connection.PORT)
if type(parsed_uri) is tuple:
# pymongo < 2.0
database = parsed_uri[1]
else:
# pymongo >= 2.0
database = parsed_uri['database']
# Handle auto reconnect signals properly
while _connection_attempts < 5:
try:
if _connection is None:
_connection = Connection(uri)
database = _connection[database]
break
except AutoReconnect:
_connection_attempts += 1<|fim▁hole|> # setup correct indexes
database.tickets.ensure_index([('record_hash', ASCENDING)],
unique=True)
database.tickets.ensure_index([('solved', ASCENDING),
('level', ASCENDING)])
database.occurrences.ensure_index([('time', DESCENDING)])
def _order(self, q, order_by):
from pymongo import ASCENDING, DESCENDING
col = '%s' % (order_by[0] == '-' and order_by[1:] or order_by)
if order_by[0] == '-':
return q.sort(col, DESCENDING)
return q.sort(col, ASCENDING)
def _oid(self, ticket_id):
from pymongo.objectid import ObjectId
return ObjectId(ticket_id)
def record_ticket(self, record, data, hash, app_id):
"""Records a log record as ticket."""
db = self.database
ticket = db.tickets.find_one({'record_hash': hash})
if not ticket:
doc = {
'record_hash': hash,
'level': record.level,
'channel': record.channel or u(''),
'location': u('%s:%d') % (record.filename,
record.lineno),
'module': record.module or u('<unknown>'),
'occurrence_count': 0,
'solved': False,
'app_id': app_id,
}
ticket_id = db.tickets.insert(doc)
else:
ticket_id = ticket['_id']
db.tickets.update({'_id': ticket_id}, {
'$inc': {
'occurrence_count': 1
},
'$set': {
'last_occurrence_time': record.time,
'solved': False
}
})
        # We store occurrences in a separate collection so that
# we can make it a capped collection optionally.
db.occurrences.insert({
'ticket_id': self._oid(ticket_id),
'app_id': app_id,
'time': record.time,
'data': json.dumps(data),
})
def count_tickets(self):
"""Returns the number of tickets."""
return self.database.tickets.count()
def get_tickets(self, order_by='-last_occurrence_time', limit=50,
offset=0):
"""Selects tickets from the database."""
query = (self._order(self.database.tickets.find(), order_by)
.limit(limit).skip(offset))
return [self._FixedTicketClass(self, obj) for obj in query]
def solve_ticket(self, ticket_id):
"""Marks a ticket as solved."""
self.database.tickets.update({'_id': self._oid(ticket_id)},
{'solved': True})
def delete_ticket(self, ticket_id):
"""Deletes a ticket from the database."""
self.database.occurrences.remove({'ticket_id': self._oid(ticket_id)})
self.database.tickets.remove({'_id': self._oid(ticket_id)})
def get_ticket(self, ticket_id):
"""Return a single ticket with all occurrences."""
ticket = self.database.tickets.find_one({'_id': self._oid(ticket_id)})
if ticket:
return Ticket(self, ticket)
def get_occurrences(self, ticket, order_by='-time', limit=50, offset=0):
"""Selects occurrences from the database for a ticket."""
collection = self.database.occurrences
occurrences = self._order(collection.find(
{'ticket_id': self._oid(ticket)}
), order_by).limit(limit).skip(offset)
return [self._FixedOccurrenceClass(self, obj) for obj in occurrences]
class TicketingBaseHandler(Handler, HashingHandlerMixin):
"""Baseclass for ticketing handlers. This can be used to interface
ticketing systems that do not necessarily provide an interface that
would be compatible with the :class:`BackendBase` interface.
"""
def __init__(self, hash_salt, level=NOTSET, filter=None, bubble=False):
Handler.__init__(self, level, filter, bubble)
self.hash_salt = hash_salt
def hash_record_raw(self, record):
"""Returns the unique hash of a record."""
hash = HashingHandlerMixin.hash_record_raw(self, record)
if self.hash_salt is not None:
hash_salt = self.hash_salt
if not PY2 or isinstance(hash_salt, unicode):
hash_salt = hash_salt.encode('utf-8')
hash.update(b('\x00') + hash_salt)
return hash
class TicketingHandler(TicketingBaseHandler):
"""A handler that writes log records into a remote database. This
database can be connected to from different dispatchers which makes
this a nice setup for web applications::
from logbook.ticketing import TicketingHandler
handler = TicketingHandler('sqlite:////tmp/myapp-logs.db')
:param uri: a backend specific string or object to decide where to log to.
:param app_id: a string with an optional ID for an application. Can be
used to keep multiple application setups apart when logging
into the same database.
:param hash_salt: an optional salt (binary string) for the hashes.
:param backend: A backend class that implements the proper database
handling.
Backends available are: :class:`SQLAlchemyBackend`,
:class:`MongoDBBackend`.
"""
#: The default backend that is being used when no backend is specified.
#: Unless overriden by a subclass this will be the
#: :class:`SQLAlchemyBackend`.
default_backend = SQLAlchemyBackend
def __init__(self, uri, app_id='generic', level=NOTSET,
filter=None, bubble=False, hash_salt=None, backend=None,
**db_options):
if hash_salt is None:
hash_salt = u('apphash-') + app_id
TicketingBaseHandler.__init__(self, hash_salt, level, filter, bubble)
if backend is None:
backend = self.default_backend
db_options['uri'] = uri
self.set_backend(backend, **db_options)
self.app_id = app_id
def set_backend(self, cls, **options):
self.db = cls(**options)
def process_record(self, record, hash):
"""Subclasses can override this to tamper with the data dict that
is sent to the database as JSON.
"""
return record.to_dict(json_safe=True)
def record_ticket(self, record, data, hash):
"""Record either a new ticket or a new occurrence for a
ticket based on the hash.
"""
self.db.record_ticket(record, data, hash, self.app_id)
def emit(self, record):
"""Emits a single record and writes it to the database."""
hash = self.hash_record(record).encode('utf-8')
data = self.process_record(record, hash)
self.record_ticket(record, data, hash)<|fim▁end|> | time.sleep(0.1)
self.database = database
|
<|file_name|>multi_vms_file_transfer.py<|end_file_name|><|fim▁begin|>import time, os, logging
from autotest.client import utils
from autotest.client.shared import error
from virttest import remote, utils_misc
@error.context_aware
def run_multi_vms_file_transfer(test, params, env):
"""
    Transfer a file back and forth between multiple VMs for a long time.
<|fim▁hole|> 3) Copy this file to VM1.
4) Compare copied file's md5 with original file.
5) Copy this file from VM1 to VM2.
6) Compare copied file's md5 with original file.
7) Copy this file from VM2 to VM1.
8) Compare copied file's md5 with original file.
9) Repeat step 5-8
@param test: KVM test object.
@param params: Dictionary with the test parameters.
@param env: Dictionary with test environment.
"""
def md5_check(session, orig_md5):
msg = "Compare copied file's md5 with original file."
error.context(msg, logging.info)
md5_cmd = "md5sum %s | awk '{print $1}'" % guest_path
s, o = session.cmd_status_output(md5_cmd)
if s:
msg = "Fail to get md5 value from guest. Output is %s" % o
raise error.TestError(msg)
new_md5 = o.splitlines()[-1]
if new_md5 != orig_md5:
msg = "File changed after transfer host -> VM1. Original md5 value"
msg += " is %s. Current md5 value is %s" % (orig_md5, new_md5)
raise error.TestFail(msg)
vm1 = env.get_vm(params["main_vm"])
vm1.verify_alive()
login_timeout = int(params.get("login_timeout", 360))
vm2 = env.get_vm(params["vms"].split()[-1])
vm2.verify_alive()
session_vm1 = vm1.wait_for_login(timeout=login_timeout)
session_vm2 = vm2.wait_for_login(timeout=login_timeout)
transfer_timeout = int(params.get("transfer_timeout", 1000))
username = params.get("username")
password = params.get("password")
port = int(params.get("file_transfer_port"))
if (not port) or (not username) or (not password):
raise error.TestError("Please set file_transfer_port, username,"
" password paramters for guest")
tmp_dir = params.get("tmp_dir", "/tmp/")
repeat_time = int(params.get("repeat_time", "10"))
clean_cmd = params.get("clean_cmd", "rm -f")
filesize = int(params.get("filesize", 4000))
count = int(filesize / 10)
if count == 0:
count = 1
host_path = os.path.join(tmp_dir, "tmp-%s" %
utils_misc.generate_random_string(8))
cmd = "dd if=/dev/zero of=%s bs=10M count=%d" % (host_path, count)
guest_path = (tmp_dir + "file_transfer-%s" %
utils_misc.generate_random_string(8))
try:
error.context("Creating %dMB file on host" % filesize, logging.info)
utils.run(cmd)
orig_md5 = utils.hash_file(host_path, method="md5")
error.context("Transfering file host -> VM1, timeout: %ss" % \
transfer_timeout, logging.info)
t_begin = time.time()
vm1.copy_files_to(host_path, guest_path, timeout=transfer_timeout)
t_end = time.time()
throughput = filesize / (t_end - t_begin)
logging.info("File transfer host -> VM1 succeed, "
"estimated throughput: %.2fMB/s", throughput)
md5_check(session_vm1, orig_md5)
ip_vm1 = vm1.get_address()
ip_vm2 = vm2.get_address()
for i in range(repeat_time):
log_vm1 = os.path.join(test.debugdir, "remote_scp_to_vm1_%s.log" %i)
log_vm2 = os.path.join(test.debugdir, "remote_scp_to_vm2_%s.log" %i)
msg = "Transfering file VM1 -> VM2, timeout: %ss." % transfer_timeout
msg += " Repeat: %s/%s" % (i + 1, repeat_time)
error.context(msg, logging.info)
t_begin = time.time()
s = remote.scp_between_remotes(src=ip_vm1, dst=ip_vm2, port=port,
s_passwd=password, d_passwd=password,
s_name=username, d_name=username,
s_path=guest_path, d_path=guest_path,
timeout=transfer_timeout,
log_filename=log_vm1)
t_end = time.time()
throughput = filesize / (t_end - t_begin)
logging.info("File transfer VM1 -> VM2 succeed, "
"estimated throughput: %.2fMB/s", throughput)
md5_check(session_vm2, orig_md5)
session_vm1.cmd("rm -rf %s" % guest_path)
msg = "Transfering file VM2 -> VM1, timeout: %ss." % transfer_timeout
msg += " Repeat: %s/%s" % (i + 1, repeat_time)
error.context(msg, logging.info)
t_begin = time.time()
remote.scp_between_remotes(src=ip_vm2, dst=ip_vm1, port=port,
s_passwd=password, d_passwd=password,
s_name=username, d_name=username,
s_path=guest_path, d_path=guest_path,
timeout=transfer_timeout,
                                   log_filename=log_vm2)
t_end = time.time()
throughput = filesize / (t_end - t_begin)
logging.info("File transfer VM2 -> VM1 succeed, "
"estimated throughput: %.2fMB/s", throughput)
md5_check(session_vm1, orig_md5)
session_vm2.cmd("%s %s" % (clean_cmd, guest_path))
finally:
try:
session_vm1.cmd("%s %s" % (clean_cmd, guest_path))
except Exception:
pass
try:
session_vm2.cmd("%s %s" % (clean_cmd, guest_path))
except Exception:
pass
try:
os.remove(host_path)
except OSError:
pass
if session_vm1:
session_vm1.close()
if session_vm2:
session_vm2.close()<|fim▁end|> | 1) Boot up two VMs .
2) Create a large file by dd on host. |
<|file_name|>BenchmarkTest10564.java<|end_file_name|><|fim▁begin|>/**
* OWASP Benchmark Project v1.1
*
* This file is part of the Open Web Application Security Project (OWASP)
* Benchmark Project. For details, please see
* <a href="https://www.owasp.org/index.php/Benchmark">https://www.owasp.org/index.php/Benchmark</a>.
*
* The Benchmark is free software: you can redistribute it and/or modify it under the terms
* of the GNU General Public License as published by the Free Software Foundation, version 2.
*
* The Benchmark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without
* even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details
*
* @author Dave Wichers <a href="https://www.aspectsecurity.com">Aspect Security</a>
* @created 2015
*/
package org.owasp.benchmark.testcode;
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
@WebServlet("/BenchmarkTest10564")
public class BenchmarkTest10564 extends HttpServlet {
private static final long serialVersionUID = 1L;
@Override
public void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
doPost(request, response);
}
@Override
public void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
java.util.Map<String,String[]> map = request.getParameterMap();
String param = "";
if (!map.isEmpty()) {
param = map.get("foo")[0];
}
String bar = new Test().doSomething(param);
Object[] obj = { "a", "b"};<|fim▁hole|>
private class Test {
public String doSomething(String param) throws ServletException, IOException {
String bar;
String guess = "ABC";
char switchTarget = guess.charAt(1); // condition 'B', which is safe
// Simple case statement that assigns param to bar on conditions 'A' or 'C'
switch (switchTarget) {
case 'A':
bar = param;
break;
case 'B':
bar = "bob";
break;
case 'C':
case 'D':
bar = param;
break;
default:
bar = "bob's your uncle";
break;
}
return bar;
}
} // end innerclass Test
} // end DataflowThruInnerClass<|fim▁end|> |
response.getWriter().printf(java.util.Locale.US,bar,obj);
} // end doPost |
<|file_name|>test_alias.py<|end_file_name|><|fim▁begin|>import os
from cauldron.test import support
from cauldron.test.support import scaffolds
class TestAlias(scaffolds.ResultsTest):
"""..."""
def test_unknown_command(self):
"""Should fail if the command is not recognized."""
r = support.run_command('alias fake')
self.assertTrue(r.failed, 'should have failed')
self.assertEqual(r.errors[0].code, 'UNKNOWN_COMMAND')
def test_list(self):
"""..."""
r = support.run_command('alias list')
self.assertFalse(r.failed, 'should not have failed')
def test_add(self):
"""..."""
p = self.get_temp_path('aliaser')
r = support.run_command('alias add test "{}" --temporary'.format(p))
self.assertFalse(r.failed, 'should not have failed')
def test_remove(self):
"""..."""
directory = self.get_temp_path('aliaser')
path = os.path.join(directory, 'test.text')
with open(path, 'w+') as f:
f.write('This is a test')
support.run_command('alias add test "{}" --temporary'.format(path))
r = support.run_command('alias remove test --temporary')
self.assertFalse(r.failed, 'should not have failed')
self.assertFalse(r.failed, 'should not have failed')
def test_empty(self):<|fim▁hole|> r = support.run_command('alias add')
self.assertTrue(r.failed, 'should have failed')
self.assertEqual(r.errors[0].code, 'MISSING_ARG')
def test_autocomplete_command(self):
"""..."""
result = support.autocomplete('alias ad')
self.assertEqual(len(result), 1)
self.assertEqual(result[0], 'add')
def test_autocomplete_alias(self):
"""..."""
result = support.autocomplete('alias add fake-alias-not-real')
self.assertEqual(len(result), 0)
def test_autocomplete_path(self):
"""..."""
path = os.path.dirname(os.path.realpath(__file__))
result = support.autocomplete('alias add test {}'.format(path))
self.assertIsNotNone(result)<|fim▁end|> | """..."""
|
<|file_name|>es.js<|end_file_name|><|fim▁begin|>MessageFormat.locale.es = function ( n ) {<|fim▁hole|> return "other";
};<|fim▁end|> | if ( n === 1 ) {
return "one";
} |
<|file_name|>AnswerGenerator.java<|end_file_name|><|fim▁begin|>/**
*
*/
package qa.AnswerFormation;
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Queue;
import qa.IQuestion;
import qa.Utility;
/**
* @author Deepak
*
*/
public class AnswerGenerator implements IAnswerGenerator {
/**
     * Creates a generator that consumes processed questions from the queue.
     * @param processedQuestionsQueue queue of questions ready for answer generation
     * @param processedQuestions list that collects questions once answers are attached
*/
public AnswerGenerator(Queue<IQuestion> processedQuestionsQueue, List<IQuestion> processedQuestions) {
this.processedQuestionsQueue = processedQuestionsQueue;
this.processedQuestions = processedQuestions;
}
<|fim▁hole|> @Override
public void run() {
while(!processedQuestionsQueue.isEmpty() || !Utility.IsPassageRetrivalDone) {
IQuestion question = processedQuestionsQueue.poll();
if(question == null) continue;
HashSet<String> answers = new HashSet<String>();
for(String passage : question.getRelevantPassages()) {
List<String> output = new ArrayList<String>();
String passageWithoutKeywords = null;
String nerTaggedPassage = Utility.getNERTagging(passage);
String posTaggedPassage = Utility.getPOSTagging(passage);
output.addAll(getDataFromOutput(nerTaggedPassage, question.getAnswerTypes()));
output.addAll(getDataFromOutput(posTaggedPassage, question.getAnswerTypes()));
for(String answer : output) {
if(!question.getQuestion().toLowerCase().contains(answer.toLowerCase()) && !answers.contains(answer)) {
answers.add(answer);
passageWithoutKeywords = Utility.removeKeywords(answer, question.getKeywords());
question.addAnswer(passageWithoutKeywords);
}
}
}
for(String passage : question.getRelevantPassages()) {
List<String> output = new ArrayList<String>();
String passageWithoutKeywords = null;
if(answers.size() >= 10) break;
try{
output.addAll(Utility.getNounPhrases(passage, false));
} catch (Exception ex) {
System.out.println(ex.getMessage());
}
for(String answer : output) {
if(!question.getQuestion().toLowerCase().contains(answer.toLowerCase()) && !answers.contains(answer)) {
answers.add(answer);
passageWithoutKeywords = Utility.removeKeywords(answer, question.getKeywords());
question.addAnswer(passageWithoutKeywords);
}
}
}
// for(String answer : answers) {
// boolean flag = true;
// for(String answer1 : answers) {
// if(!answer.equals(answer1)) {
// if(answer1.toLowerCase().contains(answer.toLowerCase())) {
// flag = false;
// break;
// }
// }
// }
// if(flag) {
// question.addAnswer(answer);
// }
// }
this.processedQuestions.add(question);
}
AnswerWriter writer = new AnswerWriter("answer.txt");
writer.writeAnswers(processedQuestions);
}
/**
     * Extracts candidate answers from tagger output by collecting consecutive
     * tokens whose tag matches one of the expected answer-type tags.
     * @param output tagger output as token/TAG (or token_TAG) pairs
     * @param tagName pipe-separated "name:tag" answer types, e.g. "ANS:NNP|ANS:LOCATION"
     * @return candidate answer strings assembled from matching tokens
*/
private List<String> getDataFromOutput(String output, String tagName) {
List<String> answers = new ArrayList<String>();
StringBuilder temp = new StringBuilder();
String[] outputArray = output.split("[/_\\s]");
String[] tags = tagName.split("\\|");
for(String tag : tags) {
if(tag == null || tag.equals("")) continue;
String[] tagsArray = tag.split(":");
for(int arrayIndex = 1; arrayIndex < outputArray.length; arrayIndex+=2) {
if(outputArray[arrayIndex].trim().equals(tagsArray[1].trim())) {
temp.append(outputArray[arrayIndex - 1] + " ");
} else {
if(!temp.toString().equals("")) {
answers.add(temp.toString().trim());
}
temp = new StringBuilder();
}
}
if(!temp.toString().equals("") ) {
answers.add(temp.toString().trim());
}
}
return answers;
}
/**
     * Shared state: the input queue of processed questions and the output
     * list of questions with answers attached.
*/
private Queue<IQuestion> processedQuestionsQueue;
private List<IQuestion> processedQuestions;
}<|fim▁end|> | /**
*
*/
|
<|file_name|>app.module.ts<|end_file_name|><|fim▁begin|>import { BrowserModule } from '@angular/platform-browser';
import { NgModule } from '@angular/core';
import { FormsModule } from '@angular/forms';<|fim▁hole|>import { CovalentLayoutModule, CovalentStepsModule, CovalentDataTableModule, CovalentMessageModule } from '@covalent/core';
import { AppComponent } from './app.component';
import { ProjectsComponent } from './projects/projects.component';
@NgModule({
declarations: [
AppComponent,
ProjectsComponent
],
imports: [
BrowserModule,
FormsModule,
HttpModule,
MaterialModule,
BrowserAnimationsModule,
CovalentLayoutModule,
CovalentStepsModule,
CovalentDataTableModule,
CovalentMessageModule
],
providers: [],
bootstrap: [AppComponent]
})
export class AppModule { }<|fim▁end|> | import { HttpModule } from '@angular/http';
import { MaterialModule } from '@angular/material';
import {BrowserAnimationsModule} from '@angular/platform-browser/animations'; |
<|file_name|>error.rs<|end_file_name|><|fim▁begin|>// Copyright (c) 2015, The Radare Project. All rights reserved.
// See the COPYING file at the top-level directory of this distribution.
// Licensed under the BSD 3-Clause License:
// <http://opensource.org/licenses/BSD-3-Clause>
// This file may not be copied, modified, or distributed
// except according to those terms.
use std::{error, fmt};
use super::ssa_traits::SSA;
use super::ssastorage::SSAStorage;
use std::fmt::Debug;
#[derive(Debug)]
pub enum SSAErr<T: SSA + Debug> {
InvalidBlock(T::ActionRef),
InvalidType(String),
InvalidTarget(T::ActionRef, T::CFEdgeRef, T::ActionRef),
InvalidControl(T::ActionRef, T::CFEdgeRef),
WrongNumOperands(T::ValueRef, usize, usize),
WrongNumEdges(T::ActionRef, usize, usize),
NoSelector(T::ActionRef),
UnexpectedSelector(T::ActionRef, T::ValueRef),
UnreachableBlock(T::ActionRef),
InvalidExpr(T::ValueRef),
IncompatibleWidth(T::ValueRef, u16, u16),
}
impl fmt::Display for SSAErr<SSAStorage> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
let err = match *self {
SSAErr::InvalidBlock(ni) => {
format!("Found Block with index: {:?}.", ni)
}
SSAErr::InvalidType(ref e) => {
format!("Expected type: {}.", e)
}
SSAErr::InvalidControl(bi, ei) => {
format!("Block {:?} has invalid outgoing edge: {:?}", bi, ei)
}
SSAErr::WrongNumOperands(n, e, f) => {
format!("{:?} expected {} number of operands, found: {}", n, e, f)
}
SSAErr::InvalidTarget(bi, ei, ti) => {<|fim▁hole|> format!("Block {:?} has Edge {:?} with invalid target {:?}",
bi,
ei,
ti)
}
SSAErr::WrongNumEdges(ni, e, f) => {
format!("Block {:?} expects {} edge(s), found: {}", ni, e, f)
}
SSAErr::NoSelector(bi) => {
format!("Block {:?} expects a selector. None found.", bi)
}
SSAErr::UnexpectedSelector(bi, ni) => {
format!("Block {:?} expected no selector, found: {:?}", bi, ni)
}
SSAErr::UnreachableBlock(bi) => {
format!("Unreachable block found: {:?}", bi)
}
SSAErr::InvalidExpr(ni) => {
format!("Found an invalid expression: {:?}", ni)
}
SSAErr::IncompatibleWidth(ni, e, f) => {
format!("{:?} expected with to be {}, found width: {}", ni, e, f)
}
};
write!(f, "{}.", err)
}
}
impl error::Error for SSAErr<SSAStorage> {
fn description(&self) -> &str {
""
}
fn cause(&self) -> Option<&error::Error> {
None
}
}<|fim▁end|> | |
<|file_name|>val_or_vec.rs<|end_file_name|><|fim▁begin|>use std::{iter, ptr, vec};
use serde::de::{
self,
value::{Error, SeqDeserializer},
Deserializer, IntoDeserializer,
};
#[derive(Debug)]
pub enum ValOrVec<T> {
Val(T),
Vec(Vec<T>),
}
impl<T> ValOrVec<T> {
pub fn push(&mut self, new_val: T) {
match self {
// To transform a Self::Val into a Self::Vec, we take the existing
// value out via ptr::read and add it to a vector, together with the
// new value. Since setting self to `ValOrVec::Vec` normally would
// cause T's Drop implementation to run if it has one (which would
// free resources that will now be owned by the first vec element),<|fim▁hole|> // there is no opportunity for outside code to observe an
// invalid state of self.
unsafe {
let existing_val = ptr::read(val);
vec.push(existing_val);
vec.push(new_val);
ptr::write(self, ValOrVec::Vec(vec))
}
}
ValOrVec::Vec(vec) => vec.push(new_val),
}
}
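    // Usage sketch (assuming a String payload; not part of the original file):
    //
    //     let mut vv = ValOrVec::Val("a".to_owned());
    //     vv.push("b".to_owned()); // promoted to ValOrVec::Vec(vec!["a", "b"])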
fn deserialize_val<U, E, F>(self, f: F) -> Result<U, E>
where
F: FnOnce(T) -> Result<U, E>,
E: de::Error,
{
match self {
ValOrVec::Val(val) => f(val),
ValOrVec::Vec(_) => Err(de::Error::custom("unsupported value")),
}
}
}
impl<T> IntoIterator for ValOrVec<T> {
type Item = T;
type IntoIter = IntoIter<T>;
fn into_iter(self) -> Self::IntoIter {
IntoIter::new(self)
}
}
pub enum IntoIter<T> {
Val(iter::Once<T>),
Vec(vec::IntoIter<T>),
}
impl<T> IntoIter<T> {
fn new(vv: ValOrVec<T>) -> Self {
match vv {
ValOrVec::Val(val) => IntoIter::Val(iter::once(val)),
ValOrVec::Vec(vec) => IntoIter::Vec(vec.into_iter()),
}
}
}
impl<T> Iterator for IntoIter<T> {
type Item = T;
fn next(&mut self) -> Option<Self::Item> {
match self {
IntoIter::Val(iter) => iter.next(),
IntoIter::Vec(iter) => iter.next(),
}
}
}
impl<'de, T> IntoDeserializer<'de> for ValOrVec<T>
where
T: IntoDeserializer<'de> + Deserializer<'de, Error = Error>,
{
type Deserializer = Self;
fn into_deserializer(self) -> Self::Deserializer {
self
}
}
macro_rules! forward_to_part {
($($method:ident,)*) => {
$(
fn $method<V>(self, visitor: V) -> Result<V::Value, Self::Error>
where V: de::Visitor<'de>
{
self.deserialize_val(move |val| val.$method(visitor))
}
)*
}
}
impl<'de, T> Deserializer<'de> for ValOrVec<T>
where
T: IntoDeserializer<'de> + Deserializer<'de, Error = Error>,
{
type Error = Error;
fn deserialize_any<V>(self, visitor: V) -> Result<V::Value, Self::Error>
where
V: de::Visitor<'de>,
{
match self {
ValOrVec::Val(val) => val.deserialize_any(visitor),
ValOrVec::Vec(_) => self.deserialize_seq(visitor),
}
}
fn deserialize_seq<V>(self, visitor: V) -> Result<V::Value, Self::Error>
where
V: de::Visitor<'de>,
{
visitor.visit_seq(SeqDeserializer::new(self.into_iter()))
}
fn deserialize_enum<V>(
self,
name: &'static str,
variants: &'static [&'static str],
visitor: V,
) -> Result<V::Value, Self::Error>
where
V: de::Visitor<'de>,
{
self.deserialize_val(move |val| val.deserialize_enum(name, variants, visitor))
}
fn deserialize_tuple<V>(self, len: usize, visitor: V) -> Result<V::Value, Self::Error>
where
V: de::Visitor<'de>,
{
self.deserialize_val(move |val| val.deserialize_tuple(len, visitor))
}
fn deserialize_struct<V>(
self,
name: &'static str,
fields: &'static [&'static str],
visitor: V,
) -> Result<V::Value, Self::Error>
where
V: de::Visitor<'de>,
{
self.deserialize_val(move |val| val.deserialize_struct(name, fields, visitor))
}
fn deserialize_unit_struct<V>(
self,
name: &'static str,
visitor: V,
) -> Result<V::Value, Self::Error>
where
V: de::Visitor<'de>,
{
self.deserialize_val(move |val| val.deserialize_unit_struct(name, visitor))
}
fn deserialize_tuple_struct<V>(
self,
name: &'static str,
len: usize,
visitor: V,
) -> Result<V::Value, Self::Error>
where
V: de::Visitor<'de>,
{
self.deserialize_val(move |val| val.deserialize_tuple_struct(name, len, visitor))
}
fn deserialize_newtype_struct<V>(
self,
name: &'static str,
visitor: V,
) -> Result<V::Value, Self::Error>
where
V: de::Visitor<'de>,
{
self.deserialize_val(move |val| val.deserialize_newtype_struct(name, visitor))
}
fn deserialize_ignored_any<V>(self, visitor: V) -> Result<V::Value, Self::Error>
where
V: de::Visitor<'de>,
{
visitor.visit_unit()
}
forward_to_part! {
deserialize_bool,
deserialize_char,
deserialize_str,
deserialize_string,
deserialize_bytes,
deserialize_byte_buf,
deserialize_unit,
deserialize_u8,
deserialize_u16,
deserialize_u32,
deserialize_u64,
deserialize_i8,
deserialize_i16,
deserialize_i32,
deserialize_i64,
deserialize_f32,
deserialize_f64,
deserialize_option,
deserialize_identifier,
deserialize_map,
}
}
#[cfg(test)]
mod tests {
use std::borrow::Cow;
use matches::assert_matches;
use super::ValOrVec;
#[test]
fn cow_borrowed() {
let mut x = ValOrVec::Val(Cow::Borrowed("a"));
x.push(Cow::Borrowed("b"));
x.push(Cow::Borrowed("c"));
assert_matches!(x, ValOrVec::Vec(v) if v == vec!["a", "b", "c"]);
}
#[test]
fn cow_owned() {
let mut x = ValOrVec::Val(Cow::from("a".to_owned()));
x.push(Cow::from("b".to_owned()));
x.push(Cow::from("c".to_owned()));
assert_matches!(
x,
ValOrVec::Vec(v) if v == vec!["a".to_owned(), "b".to_owned(), "c".to_owned()]
);
}
}<|fim▁end|> | // we instead use ptr::write to set self to Self::Vec.
ValOrVec::Val(val) => {
let mut vec = Vec::with_capacity(2);
// Safety: since the vec is pre-allocated, push can't panic, so |
<|file_name|>try_regexp.py<|end_file_name|><|fim▁begin|>from __future__ import unicode_literals, division, absolute_import
from builtins import * # pylint: disable=unused-import, redefined-builtin
from past.builtins import basestring
import logging
import re
from flexget import options, plugin
from flexget.event import event
from flexget.terminal import console
log = logging.getLogger('try_regexp')
class PluginTryRegexp(object):
"""
    This plugin allows the user to test regexps for a task.
"""
def __init__(self):
self.abort = False
def matches(self, entry, regexp):
"""Return True if any of the entry string fields match given regexp"""
for field, value in entry.items():
if not isinstance(value, basestring):
continue
if re.search(regexp, value, re.IGNORECASE | re.UNICODE):
return (True, field)
return (False, None)
def on_task_filter(self, task, config):
if not task.options.try_regexp:
return
if self.abort:
return
console('-' * 79)
console('Hi there, welcome to try regexps in realtime!')
console('Press ^D or type \'exit\' to continue. Type \'continue\' to continue non-interactive execution.')
console('Task \'%s\' has %s entries, enter regexp to see what matches it.' % (task.name, len(task.entries)))
while (True):
try:
s = input('--> ')
if s == 'exit':
break
if s == 'abort' or s == 'continue':
self.abort = True
break
except EOFError:
break<|fim▁hole|> count = 0
for entry in task.entries:
try:
match, field = self.matches(entry, s)
if match:
console('Title: %-40s URL: %-30s From: %s' % (entry['title'], entry['url'], field))
count += 1
except re.error:
console('Invalid regular expression')
break
console('%s of %s entries matched' % (count, len(task.entries)))
console('Bye!')
@event('plugin.register')
def register_plugin():
plugin.register(PluginTryRegexp, '--try-regexp', builtin=True, api_ver=2)
@event('options.register')
def register_parser_arguments():
options.get_parser('execute').add_argument('--try-regexp', action='store_true', dest='try_regexp', default=False,
help='try regular expressions interactively')<|fim▁end|> | |
<|file_name|>net.py<|end_file_name|><|fim▁begin|>#!/usr/bin/env python
import time
import sys
if len(sys.argv) > 1:
INTERFACE = sys.argv[1]
else:
INTERFACE = 'eth1'
STATS = []
print 'Interface:',INTERFACE
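# /proc/net/dev layout (assumed): for each interface line, field 1 holds the
# cumulative RX byte counter and field 9 the TX byte counter; rx()/tx() below
# snapshot these counters into STATS.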
def rx():
ifstat = open('/proc/net/dev').readlines()
for interface in ifstat:
#print '----', interface, '-----'
if INTERFACE in interface:
stat = float(interface.split()[1])
STATS[0:] = [stat]
def tx():
ifstat = open('/proc/net/dev').readlines()
for interface in ifstat:
if INTERFACE in interface:
stat = float(interface.split()[9])
STATS[1:] = [stat]
print 'In Out'
rx()
tx()<|fim▁hole|>
while True:
time.sleep(1)
rxstat_o = list(STATS)
rx()
tx()
RX = float(STATS[0])
RX_O = rxstat_o[0]
TX = float(STATS[1])
TX_O = rxstat_o[1]
RX_RATE = round((RX - RX_O)/1024/1024,3)
TX_RATE = round((TX - TX_O)/1024/1024,3)
print RX_RATE ,'MB ',TX_RATE ,'MB'<|fim▁end|> | |
<|file_name|>event.py<|end_file_name|><|fim▁begin|># encoding: utf-8
#
# Copyright 2017 University of Oslo, Norway
#
# This file is part of Cerebrum.
#
# Cerebrum is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# Cerebrum is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Cerebrum; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307, USA.
""" An abstract event that can be stored in the database. """
from __future__ import absolute_import
import datetime
import itertools
import mx.DateTime
import pytz
import cereconf
class _VerbSingleton(type):
""" A metaclass that makes each EventType verb a singleton. """
verbs = {}
def __call__(cls, verb, *args):
if verb not in cls.verbs:
cls.verbs[verb] = super(_VerbSingleton, cls).__call__(verb, *args)
return cls.verbs[verb]
def get_verb(cls, verb):
return cls.verbs.get(verb)
class EventType(_VerbSingleton('EventTypeSingleton', (object,), {})):
"""Holds an event type."""
__slots__ = ['verb', 'description', ]
def __init__(self, verb, description):
""" Initialize EventType.
:verb: Scim verb
:description: HR description text
"""
self.verb = verb
self.description = description
def __repr__(self):
return '<{0.__class__.__name__!s} {0.verb}>'.format(self)
def __eq__(self, other):
"""Equality."""
return isinstance(other, EventType) and other.verb == self.verb
def __hash__(self):
"""Hash."""
return hash(self.verb)
# Define event types:
ADD = EventType('add', 'Add an object to subject')
CREATE = EventType('create', 'Create a new subject')
ACTIVATE = EventType('activate', 'Subject no longer has quarantines in system')
MODIFY = EventType('modify', 'Attributes have changed')
DEACTIVATE = EventType('deactivate', 'Quarantine is activated')
DELETE = EventType('delete', 'Subject is deleted')
REMOVE = EventType('remove', 'Remove an object from subject')
PASSWORD = EventType('password', 'Subject has changed password')
JOIN = EventType('join', 'Join two objects')
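# Usage sketch (not part of the original module): EventType verbs are
# singletons, so re-constructing a known verb returns the existing constant:
#
#     assert EventType('add', 'ignored description') is ADD
#     assert EventType.get_verb('delete') is DELETE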
class EntityRef(object):
""" Representation of a single entity.
    The entity_id can be used internally to identify which object we reference.
    The entity_type and ident are used to generate a reference to the object
that other systems can use.
"""
__slots__ = ['ident', 'entity_type', 'entity_id', ]
def __init__(self, entity_id, entity_type, ident):
self.entity_id = int(entity_id)
self.entity_type = entity_type
self.ident = ident
def __repr__(self):
return ("<{0.__class__.__name__}"
" id={0.entity_id!r}"
" type={0.entity_type!r}"
" ident={0.ident!r}>").format(self)
def __eq__(self, other):
return (isinstance(other, EntityRef) and
self.entity_id == other.entity_id)
def to_dict(self):
return {
'ident': self.ident,
'entity_id': self.entity_id,
'entity_type': self.entity_type, }
class DateTimeDescriptor(object):
""" Datetime descriptor that handles timezones.
When setting the datetime, this method will try to localize it with the
default_timezone in the following ways:
- mx.DateTime.DateTimeType: Naive datetime, assume in default_timezone
- datetime.datetime: Assume in default_timezone if naive
- integer: Assume timestamp in UTC
The returned object will always be a localized datetime.datetime
"""
default_timezone = pytz.timezone(cereconf.TIMEZONE)
def __init__(self, slot):
""" Creates a new datetime descriptor.
:param str slot:
The attribute name where the actual value is stored.
"""
self.slot = slot
def __repr__(self):
return '{0.__class__.__name__}({0.slot!r})'.format(self)
def __get__(self, obj, cls=None):
if not obj:
return self
return getattr(obj, self.slot, None)
def __set__(self, obj, value):
if value is None:
self.__delete__(obj)
return
if isinstance(value, (int, long, )):
# UTC timestamp
            value = pytz.utc.localize(
                datetime.datetime.utcfromtimestamp(value))
elif isinstance(value, mx.DateTime.DateTimeType):
# Naive datetime in default_timezone
value = self.default_timezone.localize(value.pydatetime())
elif isinstance(value, datetime.datetime):
if value.tzinfo is None:
value = self.default_timezone.localize(value)
else:
raise TypeError('Invalid datetime {0} ({1})'.format(type(value),
repr(value)))
setattr(obj, self.slot, value)
def __delete__(self, obj):
if hasattr(obj, self.slot):
delattr(obj, self.slot)
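# Minimal usage sketch (hypothetical class, not part of the original module):
#
#     class Thing(object):
#         when = DateTimeDescriptor('_when')
#
#     t = Thing()
#     t.when = 1500000000                     # int timestamp, assumed UTC
#     t.when = datetime.datetime(2017, 1, 1)  # naive, localized to cereconf.TIMEZONE
#     t.when                                  # always a tz-aware datetime.datetime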
class Event(object):
""" Event abstraction.
    Contains all the necessary data to serialize an event.
"""
DEFAULT_TIMEZONE = 'Europe/Oslo'
__slots__ = ['event_type', 'subject', 'objects', 'context', 'attributes',
'_timestamp', '_scheduled', ]
timestamp = DateTimeDescriptor('_timestamp')
scheduled = DateTimeDescriptor('_scheduled')
def __init__(self, event_type,
subject=None,
objects=None,
context=None,
attributes=None,
timestamp=None,
scheduled=None):
"""
        :param EventType event_type: the type of event
:param EntityRef subject: reference to the affected entity
:param list objects: sequence of other affected objects (EntityRef)
:param list context: sequence of affected systems (str)
:param list attributes: sequence of affected attributes (str)
:param datetime timestamp: when the event originated
        :param datetime scheduled: when the event should be issued
"""
self.event_type = event_type
self.subject = subject
self.timestamp = timestamp
self.scheduled = scheduled
self.objects = set(objects or [])
self.context = set(context or [])
self.attributes = set(attributes or [])
def __repr__(self):
return ('<{0.__class__.__name__}'
' event={0.event_type!r}'
' subject={0.subject!r}>').format(self)
def mergeable(self, other):
"""Can this event be merged with other."""
if self.scheduled is not None:
return False
if self.subject != other.subject:
return False
if self.event_type == CREATE:
return other.event_type not in (DEACTIVATE, REMOVE)
if self.event_type == DELETE:
return other.event_type in (REMOVE, DEACTIVATE, ADD, ACTIVATE,
MODIFY, PASSWORD)
if (self.event_type == other.event_type and
self.event_type in (ADD, REMOVE, ACTIVATE, DEACTIVATE)):
return True
if self.context != other.context:
return False
return True
def merge(self, other):
"""Merge messages."""
def ret_self():<|fim▁hole|> return [self, other]
if self.event_type == CREATE:
if other.event_type == DELETE:
return []
if other.event_type == ADD:
self.context.update(other.context)
return ret_self()
if other.event_type == ACTIVATE:
return ret_self() # TODO: if quarantine is an attr, delete it
if other.event_type == MODIFY:
self.attributes.update(other.attributes)
return ret_self()
if other.event_type == PASSWORD:
self.attributes.add('password')
return ret_self()
elif self.event_type == DELETE:
return ret_self()
elif other.event_type == DELETE:
return [other]
elif (ACTIVATE == self.event_type and
DEACTIVATE == other.event_type and
self.context == other.context):
return []
elif (ADD == self.event_type and
REMOVE == other.event_type and
self.context == other.context):
return []
elif self.event_type == other.event_type:
if self.event_type in (ADD, REMOVE, ACTIVATE, DEACTIVATE):
self.context.update(other.context)
return ret_self()
if self.context != other.context:
return [self, other]
self.attributes.update(other.attributes)
return ret_self()
return [self, other]
def merge_events(events):
"""Merge events with similarities.
As long as subject is the same:
* create + add/activate/modify/password = create with attributes merged
* create + deactivate/remove is untouched
* create + delete should be removed
* delete + remove/deactivate/add/activate/modify/password = delete
* x + x = x
* activate + deactivate = noop (careful with aud)
Sort into canonical order:
#. create
#. delete
#. add
#. activate
#. modify
#. password
#. deactivate
#. remove
"""
order = (CREATE, DELETE, ADD, ACTIVATE, MODIFY, PASSWORD, DEACTIVATE,
REMOVE, JOIN)
ps = [[] for x in order]
for pl in events:
pltype = pl.event_type
idx = order.index(pltype)
ps[idx].append(pl)
result = {}
for idx, tp, pl in zip(range(len(order)), order, ps):
for p in pl:
if p.subject not in result:
result[p.subject] = [p]
else:
result[p.subject].append(p)
def merge_list(finished, merged, current, rest):
while rest or merged:
if rest:
new = current.merge(rest[0])
if not new:
rest.pop(0)
merged.extend(rest)
rest = merged
if not rest:
return finished
merged = []
current = rest.pop(0)
elif len(new) == 1:
if new[0] is not current:
merged.extend(rest)
rest = merged
current = rest.pop(0)
merged = []
else:
rest.pop(0)
else:
merged.append(rest.pop(0))
else: # merged is not empty
finished.append(current)
rest = merged
merged = []
current = rest.pop(0)
finished.append(current)
return finished
for sub, lst in result.items():
result[sub] = merge_list([], [], lst[0], lst[1:])
return list(itertools.chain(*result.values()))<|fim▁end|> | self.objects.update(other.objects)
return [self]
if not self.mergeable(other): |
<|file_name|>__openerp__.py<|end_file_name|><|fim▁begin|># -*- coding: utf-8 -*-
##############################################################################
#
# Copyright (C) 2015 ADHOC SA (http://www.adhoc.com.ar)
# All Rights Reserved.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
##############################################################################
{
'name': 'Portal Partner Fix',
'version': '8.0.1.0.0',
'category': '',
'sequence': 14,
'summary': '',
'description': """
Portal Partner Fix
==================
Let user read his commercial partner
""",
'author': 'ADHOC SA',
'website': 'www.adhoc.com.ar',
'images': [
],
'depends': [<|fim▁hole|> 'data': [
'security/portal_security.xml',
],
'demo': [
],
'test': [
],
'installable': True,
'auto_install': False,
'application': False,
}
# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:<|fim▁end|> | 'portal',
], |
<|file_name|>SqlRuntimeTester.java<|end_file_name|><|fim▁begin|>/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to you under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.calcite.test;
import org.apache.calcite.sql.SqlNode;
import org.apache.calcite.sql.parser.StringAndPos;
import org.apache.calcite.sql.test.AbstractSqlTester;
import org.apache.calcite.sql.test.SqlTestFactory;
import org.apache.calcite.sql.test.SqlTests;
import org.apache.calcite.sql.validate.SqlValidator;
import org.checkerframework.checker.nullness.qual.Nullable;
import static org.junit.jupiter.api.Assertions.assertNotNull;
/**
* Tester of {@link SqlValidator} and runtime execution of the input SQL.
*/
class SqlRuntimeTester extends AbstractSqlTester {
SqlRuntimeTester() {
}
@Override public void checkFails(SqlTestFactory factory, StringAndPos sap,
String expectedError, boolean runtime) {
final StringAndPos sap2 =
StringAndPos.of(runtime ? buildQuery2(factory, sap.addCarets())
: buildQuery(sap.addCarets()));
assertExceptionIsThrown(factory, sap2, expectedError, runtime);
}
@Override public void checkAggFails(SqlTestFactory factory,
String expr,
String[] inputValues,
String expectedError,
boolean runtime) {
String query =
SqlTests.generateAggQuery(expr, inputValues);
final StringAndPos sap = StringAndPos.of(query);
assertExceptionIsThrown(factory, sap, expectedError, runtime);
}
@Override public void assertExceptionIsThrown(SqlTestFactory factory,
StringAndPos sap, @Nullable String expectedMsgPattern) {
assertExceptionIsThrown(factory, sap, expectedMsgPattern, false);
}
public void assertExceptionIsThrown(SqlTestFactory factory,
StringAndPos sap, @Nullable String expectedMsgPattern, boolean runtime) {
final SqlNode sqlNode;
try {
sqlNode = parseQuery(factory, sap.sql);
} catch (Throwable e) {
checkParseEx(e, expectedMsgPattern, sap);
return;
}
Throwable thrown = null;
final SqlTests.Stage stage;
final SqlValidator validator = factory.createValidator();
if (runtime) {
stage = SqlTests.Stage.RUNTIME;
SqlNode validated = validator.validate(sqlNode);
assertNotNull(validated);
try {<|fim▁hole|> thrown = ex;
}
} else {
stage = SqlTests.Stage.VALIDATE;
try {
validator.validate(sqlNode);
} catch (Throwable ex) {
thrown = ex;
}
}
SqlTests.checkEx(thrown, expectedMsgPattern, sap, stage);
}
}<|fim▁end|> | check(factory, sap.sql, SqlTests.ANY_TYPE_CHECKER,
SqlTests.ANY_PARAMETER_CHECKER, SqlTests.ANY_RESULT_CHECKER);
} catch (Throwable ex) {
// get the real exception in runtime check |
<|file_name|>page_test.py<|end_file_name|><|fim▁begin|># Copyright (c) 2012 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
import logging
from telemetry.page.actions import all_page_actions
from telemetry.page.actions import page_action
def _GetActionFromData(action_data):
action_name = action_data['action']
action = all_page_actions.FindClassWithName(action_name)
if not action:
logging.critical('Could not find an action named %s.', action_name)
logging.critical('Check the page set for a typo and check the error '
'log for possible Python loading/compilation errors.')
raise Exception('Action "%s" not found.' % action_name)
return action(action_data)
def GetCompoundActionFromPage(page, action_name):
if not action_name:
return []
action_data_list = getattr(page, action_name)
if not isinstance(action_data_list, list):
action_data_list = [action_data_list]
action_list = []
for subaction_data in action_data_list:
subaction_name = subaction_data['action']
if hasattr(page, subaction_name):
subaction = GetCompoundActionFromPage(page, subaction_name)
else:
subaction = [_GetActionFromData(subaction_data)]
action_list += subaction * subaction_data.get('repeat', 1)
return action_list
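# Illustrative sketch (hypothetical page set data): if a page defines
#   smoothness = {'action': 'scroll'}
# then GetCompoundActionFromPage(page, 'smoothness') returns a list with one
# resolved action instance; adding 'repeat': 3 to the dict yields three.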
class Failure(Exception):
"""Exception that can be thrown from PageBenchmark to indicate an
undesired but designed-for problem."""
pass
class PageTestResults(object):
def __init__(self):<|fim▁hole|> self.skipped_pages = []
def AddSuccess(self, page):
self.page_successes.append({'page': page})
def AddFailure(self, page, message, details):
self.page_failures.append({'page': page,
'message': message,
'details': details})
def AddSkippedPage(self, page, message, details):
self.skipped_pages.append({'page': page,
'message': message,
'details': details})
class PageTest(object):
"""A class styled on unittest.TestCase for creating page-specific tests."""
def __init__(self,
test_method_name,
action_name_to_run='',
needs_browser_restart_after_each_run=False,
discard_first_result=False):
self.options = None
try:
self._test_method = getattr(self, test_method_name)
except AttributeError:
      raise ValueError, 'No such method %s.%s' % (
          self.__class__, test_method_name)
self._action_name_to_run = action_name_to_run
self._needs_browser_restart_after_each_run = (
needs_browser_restart_after_each_run)
self._discard_first_result = discard_first_result
@property
def needs_browser_restart_after_each_run(self):
return self._needs_browser_restart_after_each_run
@property
def discard_first_result(self):
"""When set to True, the first run of the test is discarded. This is
useful for cases where it's desirable to have some test resource cached so
the first run of the test can warm things up. """
return self._discard_first_result
def AddCommandLineOptions(self, parser):
"""Override to expose command-line options for this benchmark.
The provided parser is an optparse.OptionParser instance and accepts all
normal results. The parsed options are available in Run as
self.options."""
pass
def CustomizeBrowserOptions(self, options):
"""Override to add test-specific options to the BrowserOptions object"""
pass
def CustomizeBrowserOptionsForPage(self, page, options):
"""Add options specific to the test and the given page."""
if not self.CanRunForPage(page):
return
for action in GetCompoundActionFromPage(page, self._action_name_to_run):
action.CustomizeBrowserOptions(options)
def SetUpBrowser(self, browser):
"""Override to customize the browser right after it has launched."""
pass
def CanRunForPage(self, page): #pylint: disable=W0613
"""Override to customize if the test can be ran for the given page."""
return True
def WillRunPageSet(self, tab, results):
"""Override to do operations before the page set is navigated."""
pass
def DidRunPageSet(self, tab, results):
"""Override to do operations after page set is completed, but before browser
is torn down."""
pass
def WillNavigateToPage(self, page, tab):
"""Override to do operations before the page is navigated."""
pass
def DidNavigateToPage(self, page, tab):
"""Override to do operations right after the page is navigated, but before
any waiting for completion has occurred."""
pass
def WillRunAction(self, page, tab, action):
"""Override to do operations before running the action on the page."""
pass
def DidRunAction(self, page, tab, action):
"""Override to do operations after running the action on the page."""
pass
def Run(self, options, page, tab, results):
self.options = options
compound_action = GetCompoundActionFromPage(page, self._action_name_to_run)
self._RunCompoundAction(page, tab, compound_action)
try:
self._test_method(page, tab, results)
finally:
self.options = None
def _RunCompoundAction(self, page, tab, actions):
for i, action in enumerate(actions):
prev_action = actions[i - 1] if i > 0 else None
next_action = actions[i + 1] if i < len(actions) - 1 else None
if (action.RunsPreviousAction() and
next_action and next_action.RunsPreviousAction()):
raise page_action.PageActionFailed('Consecutive actions cannot both '
'have RunsPreviousAction() == True.')
if not (next_action and next_action.RunsPreviousAction()):
action.WillRunAction(page, tab)
self.WillRunAction(page, tab, action)
try:
action.RunAction(page, tab, prev_action)
finally:
self.DidRunAction(page, tab, action)
@property
def action_name_to_run(self):
return self._action_name_to_run<|fim▁end|> | self.page_successes = []
self.page_failures = [] |
<|file_name|>views.py<|end_file_name|><|fim▁begin|>import requests
from allauth.socialaccount.providers.oauth2.views import (OAuth2Adapter,
OAuth2LoginView,
OAuth2CallbackView)
from .provider import BasecampProvider
class BasecampOAuth2Adapter(OAuth2Adapter):
provider_id = BasecampProvider.id
access_token_url = 'https://launchpad.37signals.com/authorization/token?type=web_server' # noqa
authorize_url = 'https://launchpad.37signals.com/authorization/new'
profile_url = 'https://launchpad.37signals.com/authorization.json'
def complete_login(self, request, app, token, **kwargs):
headers = {'Authorization': 'Bearer {0}'.format(token.token)}
resp = requests.get(self.profile_url, headers=headers)
extra_data = resp.json()<|fim▁hole|>oauth2_login = OAuth2LoginView.adapter_view(BasecampOAuth2Adapter)
oauth2_callback = OAuth2CallbackView.adapter_view(BasecampOAuth2Adapter)<|fim▁end|> | return self.get_provider().sociallogin_from_response(request,
extra_data)
|
<|file_name|>plot_capecod_onlevel.py<|end_file_name|><|fim▁begin|>"""
=====================================
Include On-leveling into Cape Cod
=====================================
This example demonstrates how to incorporate on-leveling into the `CapeCod`
estimator. The on-level approach emulates the one taken by Friedland in
"Estimating Unpaid Claims Using Basic Techniques" Chapter 10. The `ParallelogramOLF`
estimator is new in chainladder 0.7.9 as is the ``xyz`` triangle.
"""
import chainladder as cl
import pandas as pd
# Grab a triangle
xyz = cl.load_sample('xyz')
# Premium on-leveling factors
rate_history = pd.DataFrame({
'date': ['1/1/1999', '1/1/2000', '1/1/2001', '1/1/2002', '1/1/2003',
'1/1/2004', '1/1/2005', '1/1/2006', '1/1/2007', '1/1/2008'],<|fim▁hole|># Loss on-leveling factors
tort_reform = pd.DataFrame({
'date': ['1/1/2006', '1/1/2007'],
'rate_change': [-0.1067, -.25]
})
# In addition to development, include onlevel estimator in pipeline for loss
pipe = cl.Pipeline(steps=[
('olf', cl.ParallelogramOLF(tort_reform, change_col='rate_change', date_col='date', vertical_line=True)),
('dev', cl.Development(n_periods=2)),
('model', cl.CapeCod(trend=0.034))
])
# Define X
X = cl.load_sample('xyz')['Incurred']
# Separately apply on-level factors for premium
sample_weight = cl.ParallelogramOLF(
rate_history, change_col='rate_change', date_col='date',
vertical_line=True).fit_transform(xyz['Premium'].latest_diagonal)
# Fit Cod Estimator
pipe.fit(X, sample_weight=sample_weight).named_steps.model.ultimate_
# Create a Cape Cod pipeline without onleveling
pipe2 = cl.Pipeline(steps=[
('dev', cl.Development(n_periods=2)),
('model', cl.CapeCod(trend=0.034))
])
# Finally fit Cod Estimator without on-leveling
pipe2.fit(X, sample_weight=xyz['Premium'].latest_diagonal).named_steps.model.ultimate_
# Plot results
cl.concat((
pipe.named_steps.model.ultimate_.rename('columns', ['With On-level']),
pipe2.named_steps.model.ultimate_.rename('columns', ['Without On-level'])), 1).T.plot(
kind='bar', title='Cape Cod sensitivity to on-leveling', grid=True, subplots=True, legend=False);<|fim▁end|> | 'rate_change': [.02, .02, .02, .02, .05, .075, .15, .1, -.2, -.2]
})
|
<|file_name|>classify.rs<|end_file_name|><|fim▁begin|>#![feature(core, core_float)]
extern crate core;
#[cfg(test)]
mod tests {
use core::num::Float;
use core::num::FpCategory::{self, Nan, Infinite, Zero, Subnormal, Normal};
// #[derive(Copy, Clone, PartialEq, Debug)]
// #[stable(feature = "rust1", since = "1.0.0")]
// pub enum FpCategory {
// /// "Not a Number", often obtained by dividing by zero
// #[stable(feature = "rust1", since = "1.0.0")]
// Nan,
//
// /// Positive or negative infinity
// #[stable(feature = "rust1", since = "1.0.0")]
// Infinite ,
//
// /// Positive or negative zero
// #[stable(feature = "rust1", since = "1.0.0")]
// Zero,
//
// /// De-normalized floating point representation (less precise than `Normal`)
// #[stable(feature = "rust1", since = "1.0.0")]
// Subnormal,
//
// /// A regular floating point number
// #[stable(feature = "rust1", since = "1.0.0")]
// Normal,
// }
// impl Float for f32 {
// #[inline]
// fn nan() -> f32 { NAN }
//
// #[inline]
// fn infinity() -> f32 { INFINITY }
//
// #[inline]
// fn neg_infinity() -> f32 { NEG_INFINITY }
//
// #[inline]
// fn zero() -> f32 { 0.0 }
//
// #[inline]
// fn neg_zero() -> f32 { -0.0 }
//
// #[inline]
// fn one() -> f32 { 1.0 }
//
// from_str_radix_float_impl! { f32 }
//
// /// Returns `true` if the number is NaN.
// #[inline]
// fn is_nan(self) -> bool { self != self }
//
// /// Returns `true` if the number is infinite.
// #[inline]
// fn is_infinite(self) -> bool {
// self == Float::infinity() || self == Float::neg_infinity()
// }
//
// /// Returns `true` if the number is neither infinite or NaN.
// #[inline]
// fn is_finite(self) -> bool {
// !(self.is_nan() || self.is_infinite())
// }
//
// /// Returns `true` if the number is neither zero, infinite, subnormal or NaN.
// #[inline]
// fn is_normal(self) -> bool {
// self.classify() == Fp::Normal
// }
//
// /// Returns the floating point category of the number. If only one property
// /// is going to be tested, it is generally faster to use the specific
// /// predicate instead.
// fn classify(self) -> Fp {
// const EXP_MASK: u32 = 0x7f800000;
// const MAN_MASK: u32 = 0x007fffff;
//
// let bits: u32 = unsafe { mem::transmute(self) };
// match (bits & MAN_MASK, bits & EXP_MASK) {
// (0, 0) => Fp::Zero,
// (_, 0) => Fp::Subnormal,
// (0, EXP_MASK) => Fp::Infinite,
// (_, EXP_MASK) => Fp::Nan,
// _ => Fp::Normal,
// }
// }
//
// /// Returns the mantissa, exponent and sign as integers.
// fn integer_decode(self) -> (u64, i16, i8) {
// let bits: u32 = unsafe { mem::transmute(self) };
// let sign: i8 = if bits >> 31 == 0 { 1 } else { -1 };
// let mut exponent: i16 = ((bits >> 23) & 0xff) as i16;
// let mantissa = if exponent == 0 {
// (bits & 0x7fffff) << 1
// } else {
// (bits & 0x7fffff) | 0x800000
// };
// // Exponent bias + mantissa shift
// exponent -= 127 + 23;
// (mantissa as u64, exponent, sign)
// }
//
// /// Rounds towards minus infinity.
// #[inline]
// fn floor(self) -> f32 {
// unsafe { intrinsics::floorf32(self) }
// }
//
// /// Rounds towards plus infinity.
// #[inline]
// fn ceil(self) -> f32 {
// unsafe { intrinsics::ceilf32(self) }
// }
//
// /// Rounds to nearest integer. Rounds half-way cases away from zero.
// #[inline]
// fn round(self) -> f32 {
// unsafe { intrinsics::roundf32(self) }
// }
//
// /// Returns the integer part of the number (rounds towards zero).
// #[inline]
// fn trunc(self) -> f32 {
// unsafe { intrinsics::truncf32(self) }
// }
//
// /// The fractional part of the number, satisfying:
// ///
// /// ```
// /// let x = 1.65f32;
// /// assert!(x == x.trunc() + x.fract())
// /// ```
// #[inline]
// fn fract(self) -> f32 { self - self.trunc() }
//
// /// Computes the absolute value of `self`. Returns `Float::nan()` if the
// /// number is `Float::nan()`.
// #[inline]
// fn abs(self) -> f32 {
// unsafe { intrinsics::fabsf32(self) }
// }
//
// /// Returns a number that represents the sign of `self`.
// ///
// /// - `1.0` if the number is positive, `+0.0` or `Float::infinity()`
// /// - `-1.0` if the number is negative, `-0.0` or `Float::neg_infinity()`
// /// - `Float::nan()` if the number is `Float::nan()`
// #[inline]
// fn signum(self) -> f32 {
// if self.is_nan() {
// Float::nan()
// } else {
// unsafe { intrinsics::copysignf32(1.0, self) }
// }
// }
//
// /// Returns `true` if `self` is positive, including `+0.0` and
// /// `Float::infinity()`.
// #[inline]
// fn is_positive(self) -> bool {
// self > 0.0 || (1.0 / self) == Float::infinity()
// }
//
// /// Returns `true` if `self` is negative, including `-0.0` and
// /// `Float::neg_infinity()`.
// #[inline]
// fn is_negative(self) -> bool {
// self < 0.0 || (1.0 / self) == Float::neg_infinity()
// }
//
// /// Fused multiply-add. Computes `(self * a) + b` with only one rounding
// /// error. This produces a more accurate result with better performance than
// /// a separate multiplication operation followed by an add.
// #[inline]
// fn mul_add(self, a: f32, b: f32) -> f32 {
// unsafe { intrinsics::fmaf32(self, a, b) }
// }
//
// /// Returns the reciprocal (multiplicative inverse) of the number.
// #[inline]
// fn recip(self) -> f32 { 1.0 / self }
//
// #[inline]
// fn powi(self, n: i32) -> f32 {
// unsafe { intrinsics::powif32(self, n) }
// }
//
// #[inline]
// fn powf(self, n: f32) -> f32 {
// unsafe { intrinsics::powf32(self, n) }
// }
//
// #[inline]
// fn sqrt(self) -> f32 {
// if self < 0.0 {
// NAN
// } else {
// unsafe { intrinsics::sqrtf32(self) }
// }
// }
//
// #[inline]
// fn rsqrt(self) -> f32 { self.sqrt().recip() }
//
// /// Returns the exponential of the number.
// #[inline]
// fn exp(self) -> f32 {
// unsafe { intrinsics::expf32(self) }
// }
//
// /// Returns 2 raised to the power of the number.
// #[inline]
// fn exp2(self) -> f32 {
// unsafe { intrinsics::exp2f32(self) }
// }
//
// /// Returns the natural logarithm of the number.
// #[inline]
// fn ln(self) -> f32 {
// unsafe { intrinsics::logf32(self) }
// }
//
// /// Returns the logarithm of the number with respect to an arbitrary base.
// #[inline]
// fn log(self, base: f32) -> f32 { self.ln() / base.ln() }
//
// /// Returns the base 2 logarithm of the number.
// #[inline]
// fn log2(self) -> f32 {
// unsafe { intrinsics::log2f32(self) }
// }
//
// /// Returns the base 10 logarithm of the number.
// #[inline]
// fn log10(self) -> f32 {
// unsafe { intrinsics::log10f32(self) }
// }
//
// /// Converts to degrees, assuming the number is in radians.
// #[inline]
// fn to_degrees(self) -> f32 { self * (180.0f32 / consts::PI) }
//
// /// Converts to radians, assuming the number is in degrees.
// #[inline]
// fn to_radians(self) -> f32 {
// let value: f32 = consts::PI;
// self * (value / 180.0f32)
// }
// }
#[test]
fn classify_test1() {
let value: f32 = f32::nan();
let result: FpCategory = value.classify();
assert_eq!(result, Nan);
}
#[test]
fn classify_test2() {
let value: f32 = f32::infinity();
let result: FpCategory = value.classify();
assert_eq!(result, Infinite);
}
#[test]
fn classify_test3() {
let value: f32 = f32::neg_infinity();
let result: FpCategory = value.classify();
assert_eq!(result, Infinite);
}
#[test]<|fim▁hole|>
assert_eq!(result, Zero);
}
#[test]
fn classify_test5() {
let mut value: f32 = 0.0_f32;
unsafe {
*(&mut value as *const f32 as *mut u32) =
0b0_00000000_11111111111111111111111;
}
let result: FpCategory = value.classify();
assert_eq!(result, Subnormal);
}
#[test]
fn classify_test6() {
let mut value: f32 = 0.0_f32;
unsafe {
*(&mut value as *const f32 as *mut u32) =
0b1_00000000_11111111111111111111111;
}
let result: FpCategory = value.classify();
assert_eq!(result, Subnormal);
}
#[test]
fn classify_test7() {
let value: f32 = 68_f32;
let result: FpCategory = value.classify();
assert_eq!(result, Normal);
}
}<|fim▁end|> | fn classify_test4() {
let value: f32 = f32::zero();
let result: FpCategory = value.classify(); |
<|file_name|>index.d.ts<|end_file_name|><|fim▁begin|>import { Message } from "botkit";
import { Intent, Entity } from "../index";
export interface DialogFlowMessage extends Message {
/**
* The top intent of the message.
*/
topIntent: Intent,
/**
* A list of possible intents.
*/
intents: Intent[],
/**
* The entities that match the utterance.
*/
entities: Entity[],
/**
* Fulfillments from DialogFlow.
*/
fulfillment: DialogFlowFulfillment
}
export interface DialogFlowFulfillmentMessage {
type: number,
speech: string
}
export interface DialogFlowFulfillment {
speech: string,
messages: DialogFlowFulfillmentMessage[]
}
export interface DialogFlowResult {
source?: string,
action?: string,
actionIncomplete?: boolean,
    parameters?: any,
context?: any[],
fulfillment: DialogFlowFulfillment,
score: number
}
export interface DialogFlowResponse {
/**
* The unique identifier of the response.
*/
id: string,
/**
* The results of the conversational query or event processing.
*/
result: DialogFlowResult,
/**
* The session ID of the request.
*/<|fim▁hole|> sessionId: string
}
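/*
 * Illustrative sketch (hypothetical values): an object shaped by the
 * interfaces above.
 *
 *   const response: DialogFlowResponse = {
 *     id: "7c2f...",
 *     result: {
 *       fulfillment: { speech: "Hi!", messages: [{ type: 0, speech: "Hi!" }] },
 *       score: 0.87
 *     },
 *     sessionId: "session-123"
 *   };
 */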
/**
* The interface for DialogFlow Configuration
*/
export interface DialogFlowConfiguration {
/**
* The DialogFlow Endpoint.
*/
endpoint: string,
/**
 * DialogFlow API Key.
*/
accessToken: string,
/**
* The Google Cloud Project ID.
*/
projectId: string
}<|fim▁end|> | |
<|file_name|>layout_assignment.cc<|end_file_name|><|fim▁begin|>/* Copyright 2017 The TensorFlow Authors. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
==============================================================================*/
#include "tensorflow/compiler/xla/service/cpu/layout_assignment.h"
#include <numeric>
#include "tensorflow/compiler/xla/map_util.h"
#include "tensorflow/compiler/xla/service/cpu/dot_op_emitter.h"
#include "tensorflow/compiler/xla/service/cpu/ir_emission_utils.h"
#include "tensorflow/core/lib/core/errors.h"
namespace xla {
namespace cpu {
Status CpuLayoutAssignment::AddBackendConstraints(
LayoutConstraints* constraints) {
auto row_major_shape = [](const Shape& old_shape) {
Shape new_shape(old_shape);
std::vector<int64> dimension_order(new_shape.dimensions_size());
std::iota(dimension_order.rbegin(), dimension_order.rend(), 0);
*new_shape.mutable_layout() = LayoutUtil::MakeLayout(dimension_order);
return new_shape;
};
auto col_major_shape = [](const Shape& old_shape) {
Shape new_shape(old_shape);
std::vector<int64> dimension_order(new_shape.dimensions_size());
std::iota(dimension_order.begin(), dimension_order.end(), 0);
*new_shape.mutable_layout() = LayoutUtil::MakeLayout(dimension_order);
return new_shape;
};
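  // For example (sketch): for a rank-3 shape, row_major_shape produces
  // minor_to_major {2, 1, 0} (dimension 0 most major), while col_major_shape
  // produces {0, 1, 2} (dimension 0 most minor).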
// We want to change the layout of constant arrays to be column major when all
// of their users are dot operations that can be made faster with the flipped
// layout. To avoid going quadratic over the # of instructions, we cache
// this property in should_make_rhs_col_major -- it maps a constant to true if
// all of the users of said constant are dot operations that can be sped up.
// This cache is populated lazily as we encounter dot operations traversing
// the instruction stream.
tensorflow::gtl::FlatMap<const HloInstruction*, bool>
should_make_rhs_col_major_cache;
auto should_make_rhs_col_major = [&](const HloInstruction& instruction) {
if (ProfitableToImplementDotInUntiledLlvmIr(instruction) !=
DotInLlvmIrProfitable::kWithColumnMajorRhs) {
return false;
}
const auto* rhs = instruction.operand(1);
if (rhs->opcode() != HloOpcode::kConstant) {
return false;
}
auto it = should_make_rhs_col_major_cache.find(rhs);
if (it != should_make_rhs_col_major_cache.end()) {
return it->second;
}
bool result = std::all_of(
rhs->users().begin(), rhs->users().end(), [&](HloInstruction* user) {
return ProfitableToImplementDotInUntiledLlvmIr(*user) ==
DotInLlvmIrProfitable::kWithColumnMajorRhs &&
user->operand(0) != rhs;
});
InsertOrDie(&should_make_rhs_col_major_cache, rhs, result);
return result;
};
const HloComputation* computation = constraints->computation();
for (auto* instruction : computation->instructions()) {
if (instruction->opcode() == HloOpcode::kConvolution &&
PotentiallyImplementedAsEigenConvolution(*instruction)) {
const HloInstruction* convolution = instruction;
const HloInstruction* lhs_instruction = convolution->operand(0);
const HloInstruction* rhs_instruction = convolution->operand(1);
// In order to implement `convolution` with Eigen convolution, the layouts
// of the input, filter, and output need to be row-major.
//
// These constraints are not hard constraints. Ideally, we should decide
// which layouts to choose according to some cost model.
Shape output_shape(row_major_shape(convolution->shape()));
Shape input_shape(row_major_shape(lhs_instruction->shape()));
Shape filter_shape(row_major_shape(rhs_instruction->shape()));
<|fim▁hole|> TF_RETURN_IF_ERROR(
constraints->SetOperandLayout(input_shape, convolution, 0));
TF_RETURN_IF_ERROR(
constraints->SetOperandLayout(filter_shape, convolution, 1));
TF_RETURN_IF_ERROR(
constraints->SetInstructionLayout(output_shape, convolution));
} else if (should_make_rhs_col_major(*instruction)) {
auto* dot = instruction;
const auto& rhs_shape = dot->operand(1)->shape();
TF_RETURN_IF_ERROR(
constraints->SetOperandLayout(col_major_shape(rhs_shape), dot, 1));
} else if (PotentiallyImplementedAsEigenDot(*instruction)) {
const HloInstruction* dot = instruction;
// In order to implement `dot` with Eigen dot, the layouts of the lhs,
// rhs, and output need to be row-major.
//
// These constraints are not hard constraints. Ideally, we should decide
// which layouts to choose according to some cost model.
Shape output_shape(row_major_shape(dot->shape()));
const HloInstruction* lhs_instruction = dot->operand(0);
Shape lhs_shape(row_major_shape(lhs_instruction->shape()));
TF_RETURN_IF_ERROR(constraints->SetOperandLayout(lhs_shape, dot, 0));
// dot is a kDot or a kTransposeDot fusion node. In the latter case, if
// it represents X @ X, it may have just one operand.
if (dot->operand_count() > 1) {
const HloInstruction* rhs_instruction = dot->operand(1);
Shape rhs_shape(row_major_shape(rhs_instruction->shape()));
TF_RETURN_IF_ERROR(constraints->SetOperandLayout(rhs_shape, dot, 1));
}
// Set layouts of the instructions' shapes.
TF_RETURN_IF_ERROR(constraints->SetInstructionLayout(output_shape, dot));
} else {
for (int64 operand_no = 0; operand_no < instruction->operand_count();
++operand_no) {
// Skip operands which already have a constraint.
if (constraints->OperandLayout(instruction, operand_no) != nullptr) {
continue;
}
// Skip over forwarded operands.
if (constraints->OperandBufferForwarded(instruction, operand_no)) {
continue;
}
Shape operand_shape(
row_major_shape(instruction->operand(operand_no)->shape()));
TF_RETURN_IF_ERROR(constraints->SetOperandLayout(
operand_shape, instruction, operand_no));
}
// Skip over the root instruction for the top-level computation.
if (computation->parent()->entry_computation() == computation &&
computation->root_instruction() == instruction) {
continue;
}
// Skip instructions which don't produce array shapes (tuples, opaque,
// etc.).
if (!ShapeUtil::IsArray(instruction->shape())) {
continue;
}
}
}
return tensorflow::Status::OK();
}
} // namespace cpu
} // namespace xla<|fim▁end|> | // Set layouts of the instructions' shapes. |
<|file_name|>helpers.py<|end_file_name|><|fim▁begin|># -*- coding: utf-8 -*-
#########################################################################
#
# Copyright (C) 2012 OpenPlans
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
#########################################################################
import json
import sys
import os
import urllib
import logging
import re
import time
import errno
import uuid
import datetime
from bs4 import BeautifulSoup
import geoserver
import httplib2
from urlparse import urlparse
from urlparse import urlsplit
from threading import local
from collections import namedtuple
from itertools import cycle, izip
from lxml import etree
import xml.etree.ElementTree as ET
from decimal import Decimal
from owslib.wcs import WebCoverageService
from owslib.util import http_post
from django.core.exceptions import ImproperlyConfigured
from django.contrib.contenttypes.models import ContentType
from django.db.models.signals import pre_delete
from django.template.loader import render_to_string
from django.conf import settings
from django.utils.translation import ugettext as _
from dialogos.models import Comment
from agon_ratings.models import OverallRating
from gsimporter import Client
from owslib.wms import WebMapService
from geoserver.store import CoverageStore, DataStore, datastore_from_index,\
coveragestore_from_index, wmsstore_from_index
from geoserver.workspace import Workspace
from geoserver.catalog import Catalog
from geoserver.catalog import FailedRequestError, UploadError
from geoserver.catalog import ConflictingDataError
from geoserver.resource import FeatureType, Coverage
from geoserver.support import DimensionInfo
from geonode import GeoNodeException
from geonode.layers.utils import layer_type, get_files
from geonode.layers.models import Layer, Attribute, Style
from geonode.layers.enumerations import LAYER_ATTRIBUTE_NUMERIC_DATA_TYPES
logger = logging.getLogger(__name__)
if not hasattr(settings, 'OGC_SERVER'):
msg = (
'Please configure OGC_SERVER when enabling geonode.geoserver.'
' More info can be found at '
'http://docs.geonode.org/en/master/reference/developers/settings.html#ogc-server')
raise ImproperlyConfigured(msg)
def check_geoserver_is_up():
"""Verifies all geoserver is running,
this is needed to be able to upload.
"""
url = "%sweb/" % ogc_server_settings.LOCATION
resp, content = http_client.request(url, "GET")
msg = ('Cannot connect to the GeoServer at %s\nPlease make sure you '
'have started it.' % ogc_server_settings.LOCATION)
assert resp['status'] == '200', msg
def _add_sld_boilerplate(symbolizer):
"""
Wrap an XML snippet representing a single symbolizer in the appropriate
elements to make it a valid SLD which applies that symbolizer to all features,
including format strings to allow interpolating a "name" variable in.
"""
return """
<StyledLayerDescriptor version="1.0.0" xmlns="http://www.opengis.net/sld" xmlns:ogc="http://www.opengis.net/ogc"
xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.opengis.net/sld http://schemas.opengis.net/sld/1.0.0/StyledLayerDescriptor.xsd">
<NamedLayer>
<Name>%(name)s</Name>
<UserStyle>
<Name>%(name)s</Name>
<Title>%(name)s</Title>
<FeatureTypeStyle>
<Rule>
""" + symbolizer + """
</Rule>
</FeatureTypeStyle>
</UserStyle>
</NamedLayer>
</StyledLayerDescriptor>
"""
_raster_template = """
<RasterSymbolizer>
<Opacity>1.0</Opacity>
</RasterSymbolizer>
"""
_polygon_template = """
<PolygonSymbolizer>
<Fill>
<CssParameter name="fill">%(bg)s</CssParameter>
</Fill>
<Stroke>
<CssParameter name="stroke">%(fg)s</CssParameter>
<CssParameter name="stroke-width">0.7</CssParameter>
</Stroke>
</PolygonSymbolizer>
"""
_line_template = """
<LineSymbolizer>
<Stroke>
<CssParameter name="stroke">%(bg)s</CssParameter>
<CssParameter name="stroke-width">3</CssParameter>
</Stroke>
</LineSymbolizer>
</Rule>
</FeatureTypeStyle>
<FeatureTypeStyle>
<Rule>
<LineSymbolizer>
<Stroke>
<CssParameter name="stroke">%(fg)s</CssParameter>
</Stroke>
</LineSymbolizer>
"""
_point_template = """
<PointSymbolizer>
<Graphic>
<Mark>
<WellKnownName>%(mark)s</WellKnownName>
<Fill>
<CssParameter name="fill">%(bg)s</CssParameter>
</Fill>
<Stroke>
<CssParameter name="stroke">%(fg)s</CssParameter>
</Stroke>
</Mark>
<Size>10</Size>
</Graphic>
</PointSymbolizer>
"""
_style_templates = dict(
raster=_add_sld_boilerplate(_raster_template),
polygon=_add_sld_boilerplate(_polygon_template),
line=_add_sld_boilerplate(_line_template),
point=_add_sld_boilerplate(_point_template)
)
_punc = re.compile(r"[\.:]")  # regex for punctuation that confuses restconfig
_foregrounds = ["#ffbbbb", "#ee9999", "#dd7777", "#cc5555", "#bb3333"]
_backgrounds = ["#bbffbb", "#99ee99", "#77dd77", "#55cc55", "#33bb33"]
_marks = ["square", "circle", "cross", "x", "triangle"]
_style_contexts = izip(cycle(_foregrounds), cycle(_backgrounds), cycle(_marks))
_default_style_names = ["point", "line", "polygon", "raster"]
def _style_name(resource):
return _punc.sub("_", resource.store.workspace.name + ":" + resource.name)
def get_sld_for(layer):
# FIXME: GeoServer sometimes fails to associate a style with the data, so
    # for now we default to using a point style. (It works for lines and
    # polygons; hope this doesn't happen for rasters though.)
name = layer.default_style.name if layer.default_style is not None else "point"
# FIXME: When gsconfig.py exposes the default geometry type for vector
# layers we should use that rather than guessing based on the auto-detected
# style.
if name in _style_templates:
fg, bg, mark = _style_contexts.next()
return _style_templates[name] % dict(
name=layer.name,
fg=fg,
bg=bg,
mark=mark)
else:
return None
def fixup_style(cat, resource, style):
logger.debug("Creating styles for layers associated with [%s]", resource)
layers = cat.get_layers(resource=resource)
logger.info("Found %d layers associated with [%s]", len(layers), resource)
for lyr in layers:
if lyr.default_style.name in _style_templates:
logger.info("%s uses a default style, generating a new one", lyr)
name = _style_name(resource)
if style is None:
sld = get_sld_for(lyr)
else:
sld = style.read()
logger.info("Creating style [%s]", name)
style = cat.create_style(name, sld)
lyr.default_style = cat.get_style(name)
logger.info("Saving changes to %s", lyr)
cat.save(lyr)
logger.info("Successfully updated %s", lyr)
def cascading_delete(cat, layer_name):
resource = None
try:
if layer_name.find(':') != -1:
workspace, name = layer_name.split(':')
ws = cat.get_workspace(workspace)
try:
store = get_store(cat, name, workspace=ws)
except FailedRequestError:
if ogc_server_settings.DATASTORE:
try:
store = get_store(cat, ogc_server_settings.DATASTORE, workspace=ws)
except FailedRequestError:
logger.debug(
'the store was not found in geoserver')
return
else:
logger.debug(
'the store was not found in geoserver')
return
if ws is None:
logger.debug(
'cascading delete was called on a layer where the workspace was not found')
return
resource = cat.get_resource(name, store=store, workspace=workspace)
else:
resource = cat.get_resource(layer_name)
except EnvironmentError as e:
if e.errno == errno.ECONNREFUSED:
msg = ('Could not connect to geoserver at "%s"'
'to save information for layer "%s"' % (
ogc_server_settings.LOCATION, layer_name)
)
logger.warn(msg, e)
return None
else:
raise e
if resource is None:
# If there is no associated resource,
# this method can not delete anything.
# Let's return and make a note in the log.
logger.debug(
'cascading_delete was called with a non existent resource')
return
resource_name = resource.name
lyr = cat.get_layer(resource_name)
if(lyr is not None): # Already deleted
store = resource.store
styles = lyr.styles + [lyr.default_style]
cat.delete(lyr)
for s in styles:
if s is not None and s.name not in _default_style_names:
try:
cat.delete(s, purge='true')
except FailedRequestError as e:
# Trying to delete a shared style will fail
# We'll catch the exception and log it.
logger.debug(e)
# Due to a possible bug of geoserver, we need this trick for now
# TODO: inspect the issue reported by this hack. Should be solved
# with GS 2.7+
try:
cat.delete(resource, recurse=True) # This may fail
except:
            cat.reload()  # this preserves the integrity of geoserver
if store.resource_type == 'dataStore' and 'dbtype' in store.connection_parameters and \
store.connection_parameters['dbtype'] == 'postgis':
delete_from_postgis(resource_name)
elif store.type and store.type.lower() == 'geogig':
# Prevent the entire store from being removed when the store is a
# GeoGig repository.
return
else:
if store.resource_type == 'coverageStore':
try:
logger.info(" - Going to purge the " + store.resource_type + " : " + store.href)
cat.reset() # this resets the coverage readers and unlocks the files
cat.delete(store, purge='all', recurse=True)
                    cat.reload()  # this preserves the integrity of geoserver
except FailedRequestError as e:
# Trying to recursively purge a store may fail
# We'll catch the exception and log it.
logger.debug(e)
else:
try:
if not store.get_resources():
cat.delete(store, recurse=True)
except FailedRequestError as e:
# Catch the exception and log it.
logger.debug(e)
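# Usage sketch (hypothetical layer name): cascading_delete(cat, 'geonode:roads')
# removes the catalog layer, any non-default styles, and the backing resource,
# and cleans up the store where that is safe to do.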
def delete_from_postgis(resource_name):
"""
Delete a table from PostGIS (because Geoserver won't do it yet);
to be used after deleting a layer from the system.
"""
import psycopg2
db = ogc_server_settings.datastore_db
conn = psycopg2.connect(
"dbname='" +
db['NAME'] +
"' user='" +
db['USER'] +
"' password='" +
db['PASSWORD'] +
"' port=" +
db['PORT'] +
" host='" +
db['HOST'] +
"'")
try:
cur = conn.cursor()
cur.execute("SELECT DropGeometryTable ('%s')" % resource_name)
conn.commit()
except Exception as e:
logger.error(
"Error deleting PostGIS table %s:%s",
resource_name,
str(e))
finally:
conn.close()
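# A parameterised variant of the connection above, sketched with psycopg2
# keyword arguments (illustrative only; the DSN string above is what runs):
#
#   conn = psycopg2.connect(dbname=db['NAME'], user=db['USER'],
#                           password=db['PASSWORD'], host=db['HOST'],
#                           port=db['PORT'])
#   cur = conn.cursor()
#   cur.execute("SELECT DropGeometryTable (%s)", (resource_name,))
#   conn.commit()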
def gs_slurp(
ignore_errors=True,
verbosity=1,
console=None,
owner=None,
workspace=None,
store=None,
filter=None,
skip_unadvertised=False,
skip_geonode_registered=False,
remove_deleted=False):
"""Configure the layers available in GeoServer in GeoNode.
It returns a list of dictionaries with the name of the layer,
the result of the operation and the errors and traceback if it failed.
"""
if console is None:
console = open(os.devnull, 'w')
if verbosity > 1:
print >> console, "Inspecting the available layers in GeoServer ..."
cat = Catalog(ogc_server_settings.internal_rest, _user, _password)
if workspace is not None:
workspace = cat.get_workspace(workspace)
if workspace is None:
resources = []
else:
# obtain the store from within the workspace. if it exists, obtain resources
# directly from store, otherwise return an empty list:
if store is not None:
store = get_store(cat, store, workspace=workspace)
if store is None:
resources = []
else:
resources = cat.get_resources(store=store)
else:
resources = cat.get_resources(workspace=workspace)
elif store is not None:
store = get_store(cat, store)
resources = cat.get_resources(store=store)
else:
resources = cat.get_resources()
if remove_deleted:
resources_for_delete_compare = resources[:]
workspace_for_delete_compare = workspace
# filter out layers for delete comparison with GeoNode layers by following criteria:
# enabled = true, if --skip-unadvertised: advertised = true, but
# disregard the filter parameter in the case of deleting layers
resources_for_delete_compare = [
k for k in resources_for_delete_compare if k.enabled in ["true", True]]
if skip_unadvertised:
resources_for_delete_compare = [
k for k in resources_for_delete_compare if k.advertised in ["true", True]]
if filter:
resources = [k for k in resources if filter in k.name]
# filter out layers depending on enabled, advertised status:
resources = [k for k in resources if k.enabled in ["true", True]]
if skip_unadvertised:
resources = [k for k in resources if k.advertised in ["true", True]]
# filter out layers already registered in geonode
layer_names = Layer.objects.all().values_list('typename', flat=True)
if skip_geonode_registered:
resources = [k for k in resources
if not '%s:%s' % (k.workspace.name, k.name) in layer_names]
# TODO: Should we do something with these?
# i.e. look for matching layers in GeoNode and also disable?
# disabled_resources = [k for k in resources if k.enabled == "false"]
number = len(resources)
if verbosity > 1:
msg = "Found %d layers, starting processing" % number
print >> console, msg
output = {
'stats': {
'failed': 0,
'updated': 0,
'created': 0,
'deleted': 0,
},
'layers': [],
'deleted_layers': []
}
start = datetime.datetime.now()
for i, resource in enumerate(resources):
name = resource.name
the_store = resource.store
workspace = the_store.workspace
try:
layer, created = Layer.objects.get_or_create(name=name, defaults={
"workspace": workspace.name,
"store": the_store.name,
"storeType": the_store.resource_type,
"typename": "%s:%s" % (workspace.name.encode('utf-8'), resource.name.encode('utf-8')),
"title": resource.title or 'No title provided',
"abstract": resource.abstract or 'No abstract provided',
"owner": owner,
"uuid": str(uuid.uuid4()),
"bbox_x0": Decimal(resource.latlon_bbox[0]),
"bbox_x1": Decimal(resource.latlon_bbox[1]),
"bbox_y0": Decimal(resource.latlon_bbox[2]),
"bbox_y1": Decimal(resource.latlon_bbox[3])
})
# recalculate the layer statistics
set_attributes(layer, overwrite=True)
# Fix metadata links if the ip has changed
if layer.link_set.metadata().count() > 0:
if not created and settings.SITEURL not in layer.link_set.metadata()[0].url:
layer.link_set.metadata().delete()
layer.save()
metadata_links = []
for link in layer.link_set.metadata():
metadata_links.append((link.mime, link.name, link.url))
resource.metadata_links = metadata_links
cat.save(resource)
except Exception as e:
if ignore_errors:
status = 'failed'
exception_type, error, traceback = sys.exc_info()
else:
if verbosity > 0:
msg = "Stopping process because --ignore-errors was not set and an error was found."
print >> sys.stderr, msg
raise Exception(
'Failed to process %s' %
resource.name.encode('utf-8'), e), None, sys.exc_info()[2]
else:
if created:
layer.set_default_permissions()
status = 'created'
output['stats']['created'] += 1
else:
status = 'updated'
output['stats']['updated'] += 1
msg = "[%s] Layer %s (%d/%d)" % (status, name, i + 1, number)
info = {'name': name, 'status': status}
if status == 'failed':
output['stats']['failed'] += 1
info['traceback'] = traceback
info['exception_type'] = exception_type
info['error'] = error
output['layers'].append(info)
if verbosity > 0:
print >> console, msg
if remove_deleted:
q = Layer.objects.filter()
if workspace_for_delete_compare is not None:
if isinstance(workspace_for_delete_compare, Workspace):
q = q.filter(
workspace__exact=workspace_for_delete_compare.name)
else:
q = q.filter(workspace__exact=workspace_for_delete_compare)
if store is not None:
if isinstance(
store,
CoverageStore) or isinstance(
store,
DataStore):
q = q.filter(store__exact=store.name)
else:
q = q.filter(store__exact=store)
logger.debug("Executing 'remove_deleted' logic")
logger.debug("GeoNode Layers Found:")
# compare the list of GeoNode layers obtained via query/filter with valid resources found in GeoServer
# filtered per options passed to updatelayers: --workspace, --store, --skip-unadvertised
# add any layers not found in GeoServer to deleted_layers (must match
# workspace and store as well):
deleted_layers = []
for layer in q:
logger.debug(
"GeoNode Layer info: name: %s, workspace: %s, store: %s",
layer.name,
layer.workspace,
layer.store)
layer_found_in_geoserver = False
for resource in resources_for_delete_compare:
# if layer.name matches a GeoServer resource, check also that
# workspace and store match, mark valid:
if layer.name == resource.name:
if layer.workspace == resource.workspace.name and layer.store == resource.store.name:
logger.debug(
"Matches GeoServer layer: name: %s, workspace: %s, store: %s",
resource.name,
resource.workspace.name,
resource.store.name)
layer_found_in_geoserver = True
if not layer_found_in_geoserver:
logger.debug(
"----- Layer %s not matched, marked for deletion ---------------",
layer.name)
deleted_layers.append(layer)
number_deleted = len(deleted_layers)
if verbosity > 1:
msg = "\nFound %d layers to delete, starting processing" % number_deleted if number_deleted > 0 else \
"\nFound %d layers to delete" % number_deleted
print >> console, msg
for i, layer in enumerate(deleted_layers):
logger.debug(
"GeoNode Layer to delete: name: %s, workspace: %s, store: %s",
layer.name,
layer.workspace,
layer.store)
try:
# delete ratings, comments, and taggit tags:
ct = ContentType.objects.get_for_model(layer)
OverallRating.objects.filter(
content_type=ct,
object_id=layer.id).delete()
Comment.objects.filter(
content_type=ct,
object_id=layer.id).delete()
layer.keywords.clear()
layer.delete()
output['stats']['deleted'] += 1
status = "delete_succeeded"
except Exception as e:
status = "delete_failed"
finally:
from .signals import geoserver_pre_delete
pre_delete.connect(geoserver_pre_delete, sender=Layer)
msg = "[%s] Layer %s (%d/%d)" % (status,
layer.name,
i + 1,
number_deleted)
info = {'name': layer.name, 'status': status}
if status == "delete_failed":
exception_type, error, traceback = sys.exc_info()
info['traceback'] = traceback
info['exception_type'] = exception_type
info['error'] = error
output['deleted_layers'].append(info)
if verbosity > 0:
print >> console, msg
finish = datetime.datetime.now()
td = finish - start
output['stats']['duration_sec'] = td.microseconds / \
1000000 + td.seconds + td.days * 24 * 3600
return output
def get_stores(store_type=None):
cat = Catalog(ogc_server_settings.internal_rest, _user, _password)
stores = cat.get_stores()
store_list = []
for store in stores:
store.fetch()
stype = store.dom.find('type').text.lower()
if store_type and store_type.lower() == stype:
store_list.append({'name': store.name, 'type': stype})
elif store_type is None:
store_list.append({'name': store.name, 'type': stype})
return store_list
def set_attributes(layer, overwrite=False):
"""
Retrieve layer attribute names & types from Geoserver,
then store in GeoNode database using Attribute model
"""
attribute_map = []
server_url = ogc_server_settings.LOCATION if layer.storeType != "remoteStore" else layer.service.base_url
if layer.storeType == "remoteStore" and layer.service.ptype == "gxp_arcrestsource":
dft_url = server_url + ("%s?f=json" % layer.typename)
try:
# The code below will fail if http_client cannot be imported
body = json.loads(http_client.request(dft_url)[1])
attribute_map = [[n["name"], _esri_types[n["type"]]]
for n in body["fields"] if n.get("name") and n.get("type")]
except Exception:
attribute_map = []
elif layer.storeType in ["dataStore", "remoteStore", "wmsStore"]:
dft_url = re.sub("\/wms\/?$",
"/",
server_url) + "wfs?" + urllib.urlencode({"service": "wfs",
"version": "1.0.0",
"request": "DescribeFeatureType",
"typename": layer.typename.encode('utf-8'),
})
try:
# The code below will fail if http_client cannot be imported or
# WFS not supported
body = http_client.request(dft_url)[1]
doc = etree.fromstring(body)
path = ".//{xsd}extension/{xsd}sequence/{xsd}element".format(
xsd="{http://www.w3.org/2001/XMLSchema}")
attribute_map = [[n.attrib["name"], n.attrib["type"]] for n in doc.findall(
path) if n.attrib.get("name") and n.attrib.get("type")]
except Exception:
attribute_map = []
# Try WMS instead
dft_url = server_url + "?" + urllib.urlencode({
"service": "wms",
"version": "1.0.0",
"request": "GetFeatureInfo",
"bbox": ','.join([str(x) for x in layer.bbox]),
"LAYERS": layer.typename.encode('utf-8'),
"QUERY_LAYERS": layer.typename.encode('utf-8'),
"feature_count": 1,
"width": 1,
"height": 1,
"srs": "EPSG:4326",
"info_format": "text/html",
"x": 1,
"y": 1
})
try:
body = http_client.request(dft_url)[1]
soup = BeautifulSoup(body)
for field in soup.findAll('th'):
if(field.string is None):
field_name = field.contents[0].string
else:
field_name = field.string
attribute_map.append([field_name, "xsd:string"])
except Exception:
attribute_map = []
elif layer.storeType in ["coverageStore"]:
dc_url = server_url + "wcs?" + urllib.urlencode({
"service": "wcs",
"version": "1.1.0",
"request": "DescribeCoverage",
"identifiers": layer.typename.encode('utf-8')
})
try:
response, body = http_client.request(dc_url)
doc = etree.fromstring(body)
path = ".//{wcs}Axis/{wcs}AvailableKeys/{wcs}Key".format(
wcs="{http://www.opengis.net/wcs/1.1.1}")
attribute_map = [[n.text, "raster"] for n in doc.findall(path)]
except Exception:
attribute_map = []
# we need 3 more items for description, attribute_label and display_order
attribute_map_dict = {
'field': 0,
'ftype': 1,
'description': 2,
'label': 3,
'display_order': 4,
}
for attribute in attribute_map:
attribute.extend((None, None, 0))
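    # Each attribute_map row now has the shape
    #   [field, ftype, description, label, display_order]
    # e.g. (hypothetical vector layer):
    #   ['the_geom', 'gml:PointPropertyType', None, None, 0]
    #   ['name', 'xsd:string', None, None, 0]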
attributes = layer.attribute_set.all()
# Delete existing attributes if they no longer exist in an updated layer
for la in attributes:
lafound = False
for attribute in attribute_map:
field, ftype, description, label, display_order = attribute
if field == la.attribute:
lafound = True
# store description and attribute_label in attribute_map
attribute[attribute_map_dict['description']] = la.description
attribute[attribute_map_dict['label']] = la.attribute_label
attribute[attribute_map_dict['display_order']] = la.display_order
if overwrite or not lafound:
logger.debug(
"Going to delete [%s] for [%s]",
la.attribute,
layer.name.encode('utf-8'))
la.delete()
# Add new layer attributes if they don't already exist
if attribute_map is not None:
iter = len(Attribute.objects.filter(layer=layer)) + 1
for attribute in attribute_map:
field, ftype, description, label, display_order = attribute
if field is not None:
la, created = Attribute.objects.get_or_create(
layer=layer, attribute=field, attribute_type=ftype,
description=description, attribute_label=label,
display_order=display_order)
if created:
if is_layer_attribute_aggregable(
layer.storeType,
field,
ftype):
logger.debug("Generating layer attribute statistics")
result = get_attribute_statistics(layer.name, field)
if result is not None:
la.count = result['Count']
la.min = result['Min']
la.max = result['Max']
la.average = result['Average']
la.median = result['Median']
la.stddev = result['StandardDeviation']
la.sum = result['Sum']
la.unique_values = result['unique_values']
la.last_stats_updated = datetime.datetime.now()
la.visible = ftype.find("gml:") != 0
la.display_order = iter
la.save()
iter += 1
logger.debug(
"Created [%s] attribute for [%s]",
field,
layer.name.encode('utf-8'))
else:
logger.debug("No attributes found")
def set_styles(layer, gs_catalog):
style_set = []
gs_layer = gs_catalog.get_layer(layer.name)
default_style = gs_layer.default_style
layer.default_style = save_style(default_style)
# FIXME: This should remove styles that are no longer valid
style_set.append(layer.default_style)
alt_styles = gs_layer.styles
for alt_style in alt_styles:
style_set.append(save_style(alt_style))
layer.styles = style_set
return layer
def save_style(gs_style):
style, created = Style.objects.get_or_create(name=gs_style.name)
style.sld_title = gs_style.sld_title
style.sld_body = gs_style.sld_body
style.sld_url = gs_style.body_href
style.save()
return style
def is_layer_attribute_aggregable(store_type, field_name, field_type):
"""
Decipher whether layer attribute is suitable for statistical derivation
"""
# must be vector layer
if store_type != 'dataStore':
return False
# must be a numeric data type
if field_type not in LAYER_ATTRIBUTE_NUMERIC_DATA_TYPES:
return False
# must not be an identifier type field
if field_name.lower() in ['id', 'identifier']:
return False
return True
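# Illustrative outcomes (assuming 'xsd:int' is listed in
# LAYER_ATTRIBUTE_NUMERIC_DATA_TYPES):
#   is_layer_attribute_aggregable('dataStore', 'population', 'xsd:int')     -> True
#   is_layer_attribute_aggregable('coverageStore', 'population', 'xsd:int') -> False (raster store)
#   is_layer_attribute_aggregable('dataStore', 'id', 'xsd:int')             -> False (identifier)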
def get_attribute_statistics(layer_name, field):
"""
Generate statistics (range, mean, median, standard deviation, unique values)
for layer attribute
"""
logger.debug('Deriving aggregate statistics for attribute %s', field)
if not ogc_server_settings.WPS_ENABLED:
return None
try:
return wps_execute_layer_attribute_statistics(layer_name, field)
except Exception:
logger.exception('Error generating layer aggregate statistics')
def get_wcs_record(instance, retry=True):
wcs = WebCoverageService(ogc_server_settings.LOCATION + 'wcs', '1.0.0')
key = instance.workspace + ':' + instance.name
logger.debug(wcs.contents)
if key in wcs.contents:
return wcs.contents[key]
else:
msg = ("Layer '%s' was not found in WCS service at %s." %
(key, ogc_server_settings.public_url)
)
if retry:
logger.debug(
msg +
' Waiting a couple of seconds before trying again.')
time.sleep(2)
return get_wcs_record(instance, retry=False)
else:
raise GeoNodeException(msg)
def get_coverage_grid_extent(instance):
"""
Returns a list of integers with the size of the coverage
extent in pixels
"""
instance_wcs = get_wcs_record(instance)
grid = instance_wcs.grid
return [(int(h) - int(l) + 1) for
h, l in zip(grid.highlimits, grid.lowlimits)]
GEOSERVER_LAYER_TYPES = {
'vector': FeatureType.resource_type,
'raster': Coverage.resource_type,
}
def geoserver_layer_type(filename):
the_type = layer_type(filename)
return GEOSERVER_LAYER_TYPES[the_type]
def cleanup(name, uuid):
"""Deletes GeoServer and Catalogue records for a given name.
Useful to clean the mess when something goes terribly wrong.
It also verifies if the Django record existed, in which case
it performs no action.
"""
try:
Layer.objects.get(name=name)
except Layer.DoesNotExist as e:
pass
else:
msg = ('Not doing any cleanup because the layer %s exists in the '
'Django db.' % name)
raise GeoNodeException(msg)
cat = gs_catalog
gs_store = None
gs_layer = None
gs_resource = None
# FIXME: Could this lead to someone deleting for example a postgis db
    # with the same name as the uploaded file?
try:
gs_store = cat.get_store(name)
if gs_store is not None:
gs_layer = cat.get_layer(name)
if gs_layer is not None:
gs_resource = gs_layer.resource
else:
gs_layer = None
gs_resource = None
except FailedRequestError as e:
        msg = ('Couldn\'t connect to GeoServer while cleaning up layer '
               '[%s] !!' % str(e))
        logger.warning(msg)
if gs_layer is not None:
try:
cat.delete(gs_layer)
        except Exception:
logger.warning("Couldn't delete GeoServer layer during cleanup()")
if gs_resource is not None:
try:
cat.delete(gs_resource)
        except Exception:
msg = 'Couldn\'t delete GeoServer resource during cleanup()'
logger.warning(msg)
if gs_store is not None:
try:
cat.delete(gs_store)
        except Exception:
logger.warning("Couldn't delete GeoServer store during cleanup()")
logger.warning('Deleting dangling Catalogue record for [%s] '
'(no Django record to match)', name)
if 'geonode.catalogue' in settings.INSTALLED_APPS:
from geonode.catalogue import get_catalogue
catalogue = get_catalogue()
catalogue.remove_record(uuid)
logger.warning('Finished cleanup after failed Catalogue/Django '
'import for layer: %s', name)
def _create_featurestore(name, data, overwrite=False, charset="UTF-8", workspace=None):
cat = gs_catalog
cat.create_featurestore(name, data, overwrite=overwrite, charset=charset)
store = get_store(cat, name, workspace=workspace)
return store, cat.get_resource(name, store=store, workspace=workspace)
def _create_coveragestore(name, data, overwrite=False, charset="UTF-8", workspace=None):
cat = gs_catalog
cat.create_coveragestore(name, data, overwrite=overwrite)
store = get_store(cat, name, workspace=workspace)
return store, cat.get_resource(name, store=store, workspace=workspace)
def _create_db_featurestore(name, data, overwrite=False, charset="UTF-8", workspace=None):
"""Create a database store then use it to import a shapefile.
If the import into the database fails then delete the store
(and delete the PostGIS table for it).
"""
cat = gs_catalog
dsname = ogc_server_settings.DATASTORE
try:
ds = get_store(cat, dsname, workspace=workspace)
except FailedRequestError:
ds = cat.create_datastore(dsname, workspace=workspace)
db = ogc_server_settings.datastore_db
db_engine = 'postgis' if \
'postgis' in db['ENGINE'] else db['ENGINE']
ds.connection_parameters.update(
{'validate connections': 'true',
'max connections': '10',
'min connections': '1',
'fetch size': '1000',
'host': db['HOST'],
'port': db['PORT'],
'database': db['NAME'],
'user': db['USER'],
'passwd': db['PASSWORD'],
'dbtype': db_engine}
)
cat.save(ds)
ds = get_store(cat, dsname, workspace=workspace)
try:
cat.add_data_to_store(ds, name, data,
overwrite=overwrite,
charset=charset)
return ds, cat.get_resource(name, store=ds, workspace=workspace)
except Exception:
msg = _("An exception occurred loading data to PostGIS")
msg += "- %s" % (sys.exc_info()[1])
try:
delete_from_postgis(name)
except Exception:
msg += _(" Additionally an error occured during database cleanup")
msg += "- %s" % (sys.exc_info()[1])
raise GeoNodeException(msg)
def get_store(cat, name, workspace=None):
# Make sure workspace is a workspace object and not a string.
# If the workspace does not exist, continue as if no workspace had been defined.
if isinstance(workspace, basestring):
workspace = cat.get_workspace(workspace)
if workspace is None:
workspace = cat.get_default_workspace()
try:
store = cat.get_xml('%s/%s.xml' % (workspace.datastore_url[:-4], name))
except FailedRequestError:
try:
store = cat.get_xml('%s/%s.xml' % (workspace.coveragestore_url[:-4], name))
except FailedRequestError:
try:
store = cat.get_xml('%s/%s.xml' % (workspace.wmsstore_url[:-4], name))
except FailedRequestError:
raise FailedRequestError("No store found named: " + name)
if store.tag == 'dataStore':
store = datastore_from_index(cat, workspace, store)
elif store.tag == 'coverageStore':
store = coveragestore_from_index(cat, workspace, store)
elif store.tag == 'wmsStore':
store = wmsstore_from_index(cat, workspace, store)
return store
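# Illustrative usage (hypothetical names): the lookup above falls back from
# datastore to coveragestore to wmsstore REST endpoints, so callers don't need
# to know the store type up front:
#   store = get_store(gs_catalog, 'roads', workspace='geonode')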
def geoserver_upload(
layer,
base_file,
user,
name,
overwrite=True,
title=None,
abstract=None,
permissions=None,
keywords=(),
charset='UTF-8'):
# Step 2. Check that it is uploading to the same resource type as
# the existing resource
    logger.info('>>> Step 2. Make sure we are not trying to overwrite an '
                'existing resource named [%s] with the wrong type', name)
the_layer_type = geoserver_layer_type(base_file)
# Get a short handle to the gsconfig geoserver catalog
cat = gs_catalog
# Fix bug on layer replace #2642
# https://github.com/GeoNode/geonode/issues/2462
cat.reload()
workspace = cat.get_default_workspace()
# Check if the store exists in geoserver
try:
store = get_store(cat, name, workspace=workspace)
except geoserver.catalog.FailedRequestError as e:
# There is no store, ergo the road is clear
pass
else:
# If we get a store, we do the following:
resources = store.get_resources()
# If the store is empty, we just delete it.
if len(resources) == 0:
cat.delete(store)
else:
# If our resource is already configured in the store it needs
# to have the right resource type
for resource in resources:
if resource.name == name:
msg = 'Name already in use and overwrite is False'
assert overwrite, msg
existing_type = resource.resource_type
if existing_type != the_layer_type:
                        msg = ('Type of uploaded file %s (%s) '
                               'does not match type of existing '
                               'resource (%s)' % (name, the_layer_type, existing_type))
logger.info(msg)
raise GeoNodeException(msg)
# Step 3. Identify whether it is vector or raster and which extra files
# are needed.
logger.info('>>> Step 3. Identifying if [%s] is vector or raster and '
'gathering extra files', name)
if the_layer_type == FeatureType.resource_type:
logger.debug('Uploading vector layer: [%s]', base_file)
if ogc_server_settings.DATASTORE:
create_store_and_resource = _create_db_featurestore
else:
create_store_and_resource = _create_featurestore
elif the_layer_type == Coverage.resource_type:
logger.debug("Uploading raster layer: [%s]", base_file)
create_store_and_resource = _create_coveragestore
else:
msg = ('The layer type for name %s is %s. It should be '
               '%s or %s.' % (name,
the_layer_type,
FeatureType.resource_type,
Coverage.resource_type))
logger.warn(msg)
raise GeoNodeException(msg)
# Step 4. Create the store in GeoServer
logger.info('>>> Step 4. Starting upload of [%s] to GeoServer...', name)
# Get the helper files if they exist
files = get_files(base_file)
data = files
if 'shp' not in files:
data = base_file
try:
store, gs_resource = create_store_and_resource(name,
data,
charset=charset,
overwrite=overwrite,
workspace=workspace)
except UploadError as e:
msg = ('Could not save the layer %s, there was an upload '
'error: %s' % (name, str(e)))
logger.warn(msg)
e.args = (msg,)
raise
except ConflictingDataError as e:
# A datastore of this name already exists
msg = ('GeoServer reported a conflict creating a store with name %s: '
'"%s". This should never happen because a brand new name '
'should have been generated. But since it happened, '
'try renaming the file or deleting the store in '
'GeoServer.' % (name, str(e)))
logger.warn(msg)
e.args = (msg,)
raise
else:
logger.debug('Finished upload of [%s] to GeoServer without '
'errors.', name)
# Step 5. Create the resource in GeoServer
logger.info('>>> Step 5. Generating the metadata for [%s] after '
'successful import to GeoSever', name)
# Verify the resource was created
if gs_resource is not None:
assert gs_resource.name == name
else:
        msg = ('GeoNode encountered problems when creating layer %s. '
               'It cannot find the Layer that matches this Workspace. '
               'Try renaming your files.' % name)
logger.warn(msg)
raise GeoNodeException(msg)
# Step 6. Make sure our data always has a valid projection
# FIXME: Put this in gsconfig.py
logger.info('>>> Step 6. Making sure [%s] has a valid projection' % name)
if gs_resource.latlon_bbox is None:
box = gs_resource.native_bbox[:4]
minx, maxx, miny, maxy = [float(a) for a in box]
if -180 <= minx <= 180 and -180 <= maxx <= 180 and \
-90 <= miny <= 90 and -90 <= maxy <= 90:
logger.info('GeoServer failed to detect the projection for layer '
'[%s]. Guessing EPSG:4326', name)
# If GeoServer couldn't figure out the projection, we just
# assume it's lat/lon to avoid a bad GeoServer configuration
gs_resource.latlon_bbox = gs_resource.native_bbox
gs_resource.projection = "EPSG:4326"
cat.save(gs_resource)
else:
msg = ('GeoServer failed to detect the projection for layer '
'[%s]. It doesn\'t look like EPSG:4326, so backing out '
'the layer.')
logger.info(msg, name)
cascading_delete(cat, name)
raise GeoNodeException(msg % name)
# Step 7. Create the style and assign it to the created resource
# FIXME: Put this in gsconfig.py
logger.info('>>> Step 7. Creating style for [%s]' % name)
publishing = cat.get_layer(name)
if 'sld' in files:
f = open(files['sld'], 'r')
sld = f.read()
f.close()
else:
sld = get_sld_for(publishing)
if sld is not None:
try:
cat.create_style(name, sld)
except geoserver.catalog.ConflictingDataError as e:
msg = ('There was already a style named %s in GeoServer, '
'cannot overwrite: "%s"' % (name, str(e)))
logger.warn(msg)
e.args = (msg,)
# FIXME: Should we use the fully qualified typename?
publishing.default_style = cat.get_style(name)
cat.save(publishing)
# Step 10. Create the Django record for the layer
logger.info('>>> Step 10. Creating Django record for [%s]', name)
# FIXME: Do this inside the layer object
typename = workspace.name + ':' + gs_resource.name
layer_uuid = str(uuid.uuid1())
defaults = dict(store=gs_resource.store.name,
storeType=gs_resource.store.resource_type,
typename=typename,
title=title or gs_resource.title,
uuid=layer_uuid,
abstract=abstract or gs_resource.abstract or '',
owner=user)
return name, workspace.name, defaults, gs_resource
class ServerDoesNotExist(Exception):
pass
class OGC_Server(object):
"""
OGC Server object.
"""
def __init__(self, ogc_server, alias):
self.alias = alias
self.server = ogc_server
def __getattr__(self, item):
return self.server.get(item)
@property
def credentials(self):
"""
Returns a tuple of the server's credentials.
"""
creds = namedtuple('OGC_SERVER_CREDENTIALS', ['username', 'password'])
return creds(username=self.USER, password=self.PASSWORD)
@property
def datastore_db(self):
"""
        Returns the server's datastore dict, or an empty dict if none is configured.
"""
if self.DATASTORE and settings.DATABASES.get(self.DATASTORE, None):
return settings.DATABASES.get(self.DATASTORE, dict())
else:
return dict()
@property
def ows(self):
"""
The Open Web Service url for the server.
"""
location = self.PUBLIC_LOCATION if self.PUBLIC_LOCATION else self.LOCATION
return self.OWS_LOCATION if self.OWS_LOCATION else location + 'ows'
@property
def rest(self):
"""
The REST endpoint for the server.
"""
return self.LOCATION + \
'rest' if not self.REST_LOCATION else self.REST_LOCATION
@property
def public_url(self):
"""
The global public endpoint for the server.
"""
return self.LOCATION if not self.PUBLIC_LOCATION else self.PUBLIC_LOCATION
@property
def internal_ows(self):
"""
The Open Web Service url for the server used by GeoNode internally.
"""
location = self.LOCATION
return location + 'ows'
@property
def internal_rest(self):
"""
The internal REST endpoint for the server.
"""
return self.LOCATION + 'rest'
@property
def hostname(self):
return urlsplit(self.LOCATION).hostname
@property
def netloc(self):
return urlsplit(self.LOCATION).netloc
def __str__(self):
return self.alias
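    # Illustrative endpoints assuming LOCATION = 'http://localhost:8080/geoserver/'
    # and no *_LOCATION overrides:
    #   ows           -> http://localhost:8080/geoserver/ows
    #   rest          -> http://localhost:8080/geoserver/rest
    #   internal_rest -> http://localhost:8080/geoserver/rest
    #   public_url    -> http://localhost:8080/geoserver/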
class OGC_Servers_Handler(object):
"""
OGC Server Settings Convenience dict.
"""
def __init__(self, ogc_server_dict):
self.servers = ogc_server_dict
# FIXME(Ariel): Are there better ways to do this without involving
# local?
self._servers = local()
def ensure_valid_configuration(self, alias):
"""
Ensures the settings are valid.
"""
try:
server = self.servers[alias]
except KeyError:
raise ServerDoesNotExist("The server %s doesn't exist" % alias)
datastore = server.get('DATASTORE')
uploader_backend = getattr(
settings,
'UPLOADER',
dict()).get(
'BACKEND',
'geonode.rest')
if uploader_backend == 'geonode.importer' and datastore and not settings.DATABASES.get(
datastore):
raise ImproperlyConfigured(
'The OGC_SERVER setting specifies a datastore '
'but no connection parameters are present.')
if uploader_backend == 'geonode.importer' and not datastore:
raise ImproperlyConfigured(
'The UPLOADER BACKEND is set to geonode.importer but no DATASTORE is specified.')
if 'PRINTNG_ENABLED' in server:
raise ImproperlyConfigured("The PRINTNG_ENABLED setting has been removed, use 'PRINT_NG_ENABLED' instead.")
def ensure_defaults(self, alias):
"""
Puts the defaults into the settings dictionary for a given connection where no settings is provided.
"""
try:
server = self.servers[alias]
except KeyError:
raise ServerDoesNotExist("The server %s doesn't exist" % alias)
server.setdefault('BACKEND', 'geonode.geoserver')
server.setdefault('LOCATION', 'http://localhost:8080/geoserver/')
server.setdefault('USER', 'admin')
server.setdefault('PASSWORD', 'geoserver')
server.setdefault('DATASTORE', str())
server.setdefault('GEOGIG_DATASTORE_DIR', str())
for option in ['MAPFISH_PRINT_ENABLED', 'PRINT_NG_ENABLED', 'GEONODE_SECURITY_ENABLED',
'BACKEND_WRITE_ENABLED']:
server.setdefault(option, True)
for option in ['GEOGIG_ENABLED', 'WMST_ENABLED', 'WPS_ENABLED']:
server.setdefault(option, False)
def __getitem__(self, alias):
if hasattr(self._servers, alias):
return getattr(self._servers, alias)
self.ensure_defaults(alias)
self.ensure_valid_configuration(alias)
server = self.servers[alias]
server = OGC_Server(alias=alias, ogc_server=server)
setattr(self._servers, alias, server)
return server
def __setitem__(self, key, value):
setattr(self._servers, key, value)
def __iter__(self):
return iter(self.servers)
def all(self):
return [self[alias] for alias in self]
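# Illustrative configuration (assumed shape of settings.OGC_SERVER; unset keys
# are filled in by ensure_defaults above):
#
#   OGC_SERVER = {
#       'default': {
#           'BACKEND': 'geonode.geoserver',
#           'LOCATION': 'http://localhost:8080/geoserver/',
#           'USER': 'admin',
#           'PASSWORD': 'geoserver',
#       }
#   }
#   ogc_server_settings = OGC_Servers_Handler(OGC_SERVER)['default']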
def get_wms():
wms_url = ogc_server_settings.internal_ows + \
"?service=WMS&request=GetCapabilities&version=1.1.0"
netloc = urlparse(wms_url).netloc
http = httplib2.Http()
http.add_credentials(_user, _password)
http.authorizations.append(
httplib2.BasicAuthentication(
(_user, _password),
netloc,
wms_url,
{},
None,
None,
http
)
)
body = http.request(wms_url)[1]
_wms = WebMapService(wms_url, xml=body)
return _wms
def wps_execute_layer_attribute_statistics(layer_name, field):
"""Derive aggregate statistics from WPS endpoint"""
# generate statistics using WPS
url = '%s/ows' % (ogc_server_settings.LOCATION)
# TODO: use owslib.wps.WebProcessingService for WPS interaction
# this requires GeoServer's WPS gs:Aggregate function to
# return a proper wps:ExecuteResponse
request = render_to_string('layers/wps_execute_gs_aggregate.xml', {
'layer_name': 'geonode:%s' % layer_name,
'field': field
})
response = http_post(
url,
request,
timeout=ogc_server_settings.TIMEOUT,
username=ogc_server_settings.credentials.username,
password=ogc_server_settings.credentials.password)
exml = etree.fromstring(response)
result = {}
for f in ['Min', 'Max', 'Average', 'Median', 'StandardDeviation', 'Sum']:
fr = exml.find(f)
if fr is not None:
result[f] = fr.text
else:
result[f] = 'NA'
count = exml.find('Count')<|fim▁hole|> result['Count'] = int(count.text)
else:
result['Count'] = 0
result['unique_values'] = 'NA'
return result
# TODO: find way of figuring out threshold better
# Looks incomplete: what is the purpose of the next lines?
# if result['Count'] < 10000:
# request = render_to_string('layers/wps_execute_gs_unique.xml', {
# 'layer_name': 'geonode:%s' % layer_name,
# 'field': field
# })
# response = http_post(
# url,
# request,
# timeout=ogc_server_settings.TIMEOUT,
# username=ogc_server_settings.credentials.username,
# password=ogc_server_settings.credentials.password)
# exml = etree.fromstring(response)
def style_update(request, url):
"""
Sync style stuff from GS to GN.
Ideally we should call this from a view straight from GXP, and we should use
gsConfig, that at this time does not support styles updates. Before gsConfig
is updated, for now we need to parse xml.
In case of a DELETE, we need to query request.path to get the style name,
and then remove it.
In case of a POST or PUT, we need to parse the xml from
request.body, which is in this format:
"""
if request.method in ('POST', 'PUT'): # we need to parse xml
# Need to remove NSx from IE11
if "HTTP_USER_AGENT" in request.META:
if ('Trident/7.0' in request.META['HTTP_USER_AGENT'] and
'rv:11.0' in request.META['HTTP_USER_AGENT']):
txml = re.sub(r'xmlns:NS[0-9]=""', '', request.body)
txml = re.sub(r'NS[0-9]:', '', txml)
request._body = txml
tree = ET.ElementTree(ET.fromstring(request.body))
elm_namedlayer_name = tree.findall(
'.//{http://www.opengis.net/sld}Name')[0]
elm_user_style_name = tree.findall(
'.//{http://www.opengis.net/sld}Name')[1]
elm_user_style_title = tree.find(
'.//{http://www.opengis.net/sld}Title')
        if elm_user_style_title is None:
elm_user_style_title = elm_user_style_name
layer_name = elm_namedlayer_name.text
style_name = elm_user_style_name.text
sld_body = '<?xml version="1.0" encoding="UTF-8"?>%s' % request.body
# add style in GN and associate it to layer
if request.method == 'POST':
style = Style(name=style_name, sld_body=sld_body, sld_url=url)
style.save()
layer = Layer.objects.all().filter(typename=layer_name)[0]
style.layer_styles.add(layer)
style.save()
if request.method == 'PUT': # update style in GN
style = Style.objects.all().filter(name=style_name)[0]
style.sld_body = sld_body
style.sld_url = url
if len(elm_user_style_title.text) > 0:
style.sld_title = elm_user_style_title.text
style.save()
for layer in style.layer_styles.all():
layer.save()
if request.method == 'DELETE': # delete style from GN
style_name = os.path.basename(request.path)
style = Style.objects.all().filter(name=style_name)[0]
style.delete()
def set_time_info(layer, attribute, end_attribute, presentation,
precision_value, precision_step, enabled=True):
'''Configure the time dimension for a layer.
:param layer: the layer to configure
:param attribute: the attribute used to represent the instant or period
start
:param end_attribute: the optional attribute used to represent the end
period
:param presentation: either 'LIST', 'DISCRETE_INTERVAL', or
'CONTINUOUS_INTERVAL'
:param precision_value: number representing number of steps
:param precision_step: one of 'seconds', 'minutes', 'hours', 'days',
'months', 'years'
:param enabled: defaults to True
'''
layer = gs_catalog.get_layer(layer.name)
if layer is None:
raise ValueError('no such layer: %s' % layer.name)
resource = layer.resource
resolution = None
if precision_value and precision_step:
resolution = '%s %s' % (precision_value, precision_step)
info = DimensionInfo("time", enabled, presentation, resolution, "ISO8601",
None, attribute=attribute, end_attribute=end_attribute)
metadata = dict(resource.metadata or {})
metadata['time'] = info
resource.metadata = metadata
gs_catalog.save(resource)
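# Illustrative call (hypothetical layer with a "date" attribute): expose the
# instants as a list, stepping in one-day increments:
#   set_time_info(layer, attribute='date', end_attribute=None,
#                 presentation='LIST', precision_value=1, precision_step='days')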
def get_time_info(layer):
'''Get the configured time dimension metadata for the layer as a dict.
The keys of the dict will be those of the parameters of `set_time_info`.
:returns: dict of values or None if not configured
'''
layer = gs_catalog.get_layer(layer.name)
if layer is None:
raise ValueError('no such layer: %s' % layer.name)
resource = layer.resource
info = resource.metadata.get('time', None) if resource.metadata else None
vals = None
if info:
value = step = None
resolution = info.resolution_str()
if resolution:
value, step = resolution.split()
vals = dict(
enabled=info.enabled,
attribute=info.attribute,
end_attribute=info.end_attribute,
presentation=info.presentation,
precision_value=value,
precision_step=step,
)
return vals
ogc_server_settings = OGC_Servers_Handler(settings.OGC_SERVER)['default']
_wms = None
_csw = None
_user, _password = ogc_server_settings.credentials
http_client = httplib2.Http()
http_client.add_credentials(_user, _password)
_netloc = urlparse(ogc_server_settings.LOCATION).netloc
http_client.authorizations.append(
httplib2.BasicAuthentication(
(_user, _password),
_netloc,
ogc_server_settings.LOCATION,
{},
None,
None,
http_client
)
)
url = ogc_server_settings.rest
gs_catalog = Catalog(url, _user, _password)
gs_uploader = Client(url, _user, _password)
_punc = re.compile(r"[\.:]") # regex for punctuation that confuses restconfig
_foregrounds = [
"#ffbbbb",
"#bbffbb",
"#bbbbff",
"#ffffbb",
"#bbffff",
"#ffbbff"]
_backgrounds = [
"#880000",
"#008800",
"#000088",
"#888800",
"#008888",
"#880088"]
_marks = ["square", "circle", "cross", "x", "triangle"]
_style_contexts = izip(cycle(_foregrounds), cycle(_backgrounds), cycle(_marks))
_default_style_names = ["point", "line", "polygon", "raster"]
_esri_types = {
"esriFieldTypeDouble": "xsd:double",
"esriFieldTypeString": "xsd:string",
"esriFieldTypeSmallInteger": "xsd:int",
"esriFieldTypeInteger": "xsd:int",
"esriFieldTypeDate": "xsd:dateTime",
"esriFieldTypeOID": "xsd:long",
"esriFieldTypeGeometry": "xsd:geometry",
"esriFieldTypeBlob": "xsd:base64Binary",
"esriFieldTypeRaster": "raster",
"esriFieldTypeGUID": "xsd:string",
"esriFieldTypeGlobalID": "xsd:string",
"esriFieldTypeXML": "xsd:anyType"}
def _render_thumbnail(req_body):
spec = _fixup_ows_url(req_body)
url = "%srest/printng/render.png" % ogc_server_settings.LOCATION
hostname = urlparse(settings.SITEURL).hostname
params = dict(width=240, height=180, auth="%s,%s,%s" % (hostname, _user, _password))
url = url + "?" + urllib.urlencode(params)
# @todo annoying but not critical
    # openlayers controls posted back contain a bad character. This seems
    # to come from a "−" (minus) entity in the html: it gets converted to a
    # unicode dash but is not encoded properly during transmission.
# 'ignore' the error for now as controls are not being rendered...
data = spec
if type(data) == unicode:
# make sure any stored bad values are wiped out
# don't use keyword for errors - 2.6 compat
# though unicode accepts them (as seen below)
data = data.encode('ASCII', 'ignore')
data = unicode(data, errors='ignore').encode('UTF-8')
try:
resp, content = http_client.request(url, "POST", data, {
'Content-type': 'text/html'
})
except Exception:
logging.warning('Error generating thumbnail')
return
return content
def _fixup_ows_url(thumb_spec):
# @HACK - for whatever reason, a map's maplayers ows_url contains only /geoserver/wms
# so rendering of thumbnails fails - replace those uri's with full geoserver URL
import re
gspath = '"' + ogc_server_settings.public_url # this should be in img src attributes
repl = '"' + ogc_server_settings.LOCATION
return re.sub(gspath, repl, thumb_spec)<|fim▁end|> | if count is not None: |
<|file_name|>fat.rs<|end_file_name|><|fim▁begin|>use alloc::vec::Vec;
use core::mem;
use crate::address::Align;
#[repr(packed)]
pub struct BPB {
jmp_instr: [u8; 3],
oem_identifier: [u8; 8],
bytes_per_sector: u16,
sectors_per_cluster: u8,
resd_sectors: u16,
fat_count: u8,
root_dir_entry_count: u16,
sector_count: u16, // if 0, then sector_count >= 65536. actual value is large_sector_count
media_type: u8,
sectors_per_fat: u16, // for fat12/fat16 only
sectors_per_track: u16,
head_count: u16,
hidden_sector_count: u32,
large_sector_count: u32,
}
impl BPB {
fn fat_type(&self) -> FatFormat {
let dirent_sectors = (
(self.root_dir_entry_count as u32 * mem::size_of::<DirectoryEntry>() as u32)
.align(self.bytes_per_sector as u32)) / self.bytes_per_sector as u32;
let cluster_count =
(if self.sector_count == 0 { self.large_sector_count } else { self.sector_count as u32 }
- self.resd_sectors as u32 - (self.fat_count as u32 * self.sectors_per_fat as u32)
- dirent_sectors) / self.sectors_per_cluster as u32;
match cluster_count {
n if n < 4085 => FatFormat::Fat12,
n if n < 65525 => FatFormat::Fat16,
_ => FatFormat::Fat32,
}
}
}
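// Worked example (assumed 1.44 MB floppy layout): 2880 sectors of 512 bytes,
// 1 reserved sector, 2 FATs of 9 sectors each, and 224 root entries
// (224 * 32 bytes = 14 sectors) with 1 sector per cluster leaves
// 2880 - 1 - 18 - 14 = 2847 clusters, which is below 4085, so fat_type()
// reports FatFormat::Fat12.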
#[repr(packed)]
pub struct DirectoryEntry {
name: [u8; 11],
attribute: u8,
_reserved: [u8; 10],
time: u16,
date: u16,
start_cluster: u16,
size: u32,
}
impl DirectoryEntry {
pub const READ_ONLY: u8 = 1 << 0;
pub const HIDDEN: u8 = 1 << 1;
pub const SYSTEM: u8 = 1 << 2;
pub const VOL_LABEL: u8 = 1 << 3;
pub const SUB_DIR: u8 = 1 << 4;
pub const ARCHIVE: u8 = 1 << 5;
// Returns true if the char is to be replaced by an underscore
fn is_replace_char(c: u8) -> bool {
match c {
b'+' | b',' | b';' | b'=' | b'[' | b']' => true,
_ => false
}
}
/* Legal characters are:
* <space>
* Numbers: 0-9
* Upper-case letters: A-Z
* Punctuation: !, #, $, %, & ', (, ), -, @, ^, _, `, {, }, ~
* 0x80-0xff (note that 0xE5 is stored as 0x05 on disk)
     * 0x2E ('.') is reserved for dot entries
*/
fn is_legal_dos_char(c: u8) -> bool {
match c {
b' ' | b'0'..=b'9' | b'A'..=b'Z' | b'!' | b'#' | b'$' | b'%' | b'&' | b'\'' | b'(' | b')'
| b'-' | b'@' | b'^' | b'_' | b'`' | b'{' | b'}' | b'~' | 0x80..=0xff => true,
_ => false
}
}
fn short_filename(filename: Vec<u8>) -> Option<([u8; 8], Option<[u8; 3]>, bool)> {
let mut parts = filename
.rsplitn(2, |c| *c == b'.');
if let Some(name) = parts.next() {
let mut basename = [b' '; 8];
let mut extension = [b' '; 3];
let mut oversized = false;
let base = if let Some(b) = parts.next() {
// Filename contains extension
// Replace some invalid chars with underscores, convert lowercase to uppercase, and
// remove any remaining invalid characters.
let ext_iter = name
.iter()
.map(|c| {
if c.is_ascii_lowercase() {
c.to_ascii_uppercase()
} else if Self::is_replace_char(*c) {
b'_'
} else {
*c
}
})
.filter(|c| Self::is_legal_dos_char(*c));
let mut ext_i = 0;
for c in ext_iter {
// trim any leading spaces
if c == b' ' && ext_i == 0 {
continue;
} else if ext_i == extension.len() {
break;
}
extension[ext_i] = c;
ext_i += 1;
}
<|fim▁hole|> // No extension
name
};
// Replace some invalid chars with underscores, convert lowercase to uppercase, and
// remove any remaining invalid characters.
let base_iter = base
.iter()
.map(|c| {
if c.is_ascii_lowercase() {
c.to_ascii_uppercase()
} else if Self::is_replace_char(*c) {
b'_'
} else {
*c
}
})
.filter(|c| Self::is_legal_dos_char(*c));
let mut base_i = 0;
for c in base_iter {
// Trim leading spaces
if c == b' ' && base_i == 0 {
continue;
} else if base_i == basename.len() {
// Base name is too long. A tilde and a digit will need to follow the filename,
// but the digit depends on how many other similarly named files exist in the
// directory.
oversized = true;
break;
}
basename[base_i] = c;
base_i += 1;
}
if extension == [b' ', b' ', b' '] {
Some((basename, None, oversized))
} else {
Some((basename, Some(extension), oversized))
}
} else {
None
}
}
}
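// Illustrative behaviour of short_filename() per the rules above:
//   "report+final.txt" -> (b"REPORT_F", Some(b"TXT"), true): lower case is
//   upper-cased, '+' becomes '_', and the over-long base is flagged so the
//   caller can append a "~N" suffix;
//   "readme" -> (b"README  ", None, false).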
#[repr(packed)]
pub struct FatBPB {
bpb: BPB,
drive_number: u8,
win_nt_flags: u8,
signature: u8, // either 0x28 or 0x29
volume_id: u32,
volume_label: [u8; 11],
system_id: [u8; 8],
boot_code: [u8; 448],
partition_signature: u16, // 0xAA55
}
#[repr(packed)]
pub struct Fat32BPB {
bpb: BPB,
sectors_per_fat: u32,
flags: u16,
fat_version: u16,
root_dir_cluster: u32,
fs_info_sector: u16,
backup_boot_sector: u16,
_reserved: [u8; 12],
drive_number: u8,
win_nt_flags: u8,
signature: u8, // either 0x28 or 0x29
volume_id: u32,
volume_label: [u8; 11],
system_id: [u8; 8],
boot_code: [u8; 420],
partition_signature: u16, // 0xAA55
}
pub enum ClusterType {
Free,
Used,
Bad,
Reserved,
End,
Invalid,
}
pub enum FatFormat {
Fat12,
Fat16,
Fat32,
}
impl FatFormat {
pub fn cluster_type(&self, cluster: u32) -> ClusterType {
match self {
Self::Fat12 => match cluster {
0 => ClusterType::Free,
0xFF8..=0xFFF => ClusterType::End,
2..=0xFEF => ClusterType::Used,
0xFF7 => ClusterType::Bad,
1 | 0xFF0..=0xFF6 => ClusterType::Reserved,
_ => ClusterType::Invalid,
},
Self::Fat16 => match cluster {
0 => ClusterType::Free,
0xFFF8..=0xFFFF => ClusterType::End,
2..=0xFFEF => ClusterType::Used,
0xFFF7 => ClusterType::Bad,
1 | 0xFFF0..=0xFFF6 => ClusterType::Reserved,
_ => ClusterType::Invalid,
},
Self::Fat32 => match cluster {
0 => ClusterType::Free,
0xFFFFFF8..=0xFFFFFFF => ClusterType::End,
2..=0xFFFFFEF => ClusterType::Used,
0xFFFFFF7 => ClusterType::Bad,
1 | 0xFFFFFF0..=0xFFFFFF6 => ClusterType::Reserved,
_ => ClusterType::Invalid,
}
}
}
}<|fim▁end|> | b
} else {
|
<|file_name|>log_handlers.py<|end_file_name|><|fim▁begin|>from logging.handlers import BaseRotatingHandler
import string
import time
import datetime
import os
class TimePatternRotatingHandler(BaseRotatingHandler):
def __init__(self, filename, when, encoding=None, delay=0):
self.when = string.upper(when)
self.fname_pat = filename
self.mock_dt = None
self.computeNextRollover()
BaseRotatingHandler.__init__(self, self.filename, 'a', encoding, delay)
def get_now_dt(self):<|fim▁hole|> return datetime.datetime.now()
def computeNextRollover(self):
now = self.get_now_dt()
if self.when == 'MONTH':
dtfmt = '%Y-%m'
dt = (now.replace(day=1) + datetime.timedelta(days=40)).replace(day=1, hour=0, minute=0, second=0)
rolloverAt = time.mktime(dt.timetuple())
elif self.when == 'DAY':
dtfmt = '%Y-%m-%d'
dt = (now + datetime.timedelta(days=1)).replace(hour=0, minute=0, second=0)
rolloverAt = time.mktime(dt.timetuple())
self.rolloverAt = rolloverAt
self.dtfmt = dtfmt
self.filename = os.path.abspath(self.fname_pat % (now.strftime(self.dtfmt)))
#print now, self.filename
def shouldRollover(self, record):
now = self.get_now_dt()
t = time.mktime(now.timetuple())
#print t, self.rolloverAt
if t >= self.rolloverAt:
return 1
return 0
def doRollover(self):
if self.stream:
self.stream.close()
self.computeNextRollover()
self.baseFilename = self.filename
self.stream = self._open()<|fim▁end|> | if self.mock_dt is not None:
return self.mock_dt
|
<|file_name|>item6.py<|end_file_name|><|fim▁begin|>#!/usr/local/bin/python
a=['red','orange','yellow','green','blue','purple']
odds=a[::2]
evens=a[1::2]
print odds
print evens
x=b'abcdefg'
y=x[::-1]
print y
c=['a','b','c','d','e','f']
d=c[::2]
e=d[1:-1]<|fim▁hole|><|fim▁end|> | print e |
<|file_name|>eventflags_string.go<|end_file_name|><|fim▁begin|>// Code generated by "bitstringer -type=EventFlags"; DO NOT EDIT
package perffile
import "strconv"
func (i EventFlags) String() string {
if i == 0 {
return "0"
}
s := ""
if i&EventFlagClockID != 0 {
s += "ClockID|"
}
if i&EventFlagComm != 0 {
s += "Comm|"
}
if i&EventFlagCommExec != 0 {
s += "CommExec|"
}
if i&EventFlagContextSwitch != 0 {
s += "ContextSwitch|"
}
if i&EventFlagDisabled != 0 {
s += "Disabled|"
}
if i&EventFlagEnableOnExec != 0 {
s += "EnableOnExec|"
}
if i&EventFlagExcludeCallchainKernel != 0 {
s += "ExcludeCallchainKernel|"
}
if i&EventFlagExcludeCallchainUser != 0 {
s += "ExcludeCallchainUser|"
}
if i&EventFlagExcludeGuest != 0 {
s += "ExcludeGuest|"
}
if i&EventFlagExcludeHost != 0 {
s += "ExcludeHost|"
}
if i&EventFlagExcludeHypervisor != 0 {
s += "ExcludeHypervisor|"
}
if i&EventFlagExcludeIdle != 0 {
s += "ExcludeIdle|"
}
if i&EventFlagExcludeKernel != 0 {
s += "ExcludeKernel|"
}
if i&EventFlagExcludeUser != 0 {
s += "ExcludeUser|"
}
if i&EventFlagExclusive != 0 {
s += "Exclusive|"
}
if i&EventFlagFreq != 0 {
s += "Freq|"
}
if i&EventFlagInherit != 0 {
s += "Inherit|"
}
if i&EventFlagInheritStat != 0 {
s += "InheritStat|"
}
if i&EventFlagMmap != 0 {
s += "Mmap|"
}
if i&EventFlagMmapData != 0 {
s += "MmapData|"
}
if i&EventFlagMmapInodeData != 0 {
s += "MmapInodeData|"
}
if i&EventFlagNamespaces != 0 {
s += "Namespaces|"
}
if i&EventFlagPinned != 0 {<|fim▁hole|> }
if i&EventFlagSampleIDAll != 0 {
s += "SampleIDAll|"
}
if i&EventFlagTask != 0 {
s += "Task|"
}
if i&EventFlagWakeupWatermark != 0 {
s += "WakeupWatermark|"
}
if i&EventFlagWriteBackward != 0 {
s += "WriteBackward|"
}
i &^= 536772607
if i == 0 {
return s[:len(s)-1]
}
return s + "0x" + strconv.FormatUint(uint64(i), 16)
}<|fim▁end|> | s += "Pinned|" |
<|file_name|>strings.js<|end_file_name|><|fim▁begin|>define({<|fim▁hole|> sidebarTitle: 'Legend',
modalAboutTitle: 'About',
modalAboutContent: 'The goal of this application boilerplate is to demonstrate how to build a mapping application that utilizes the best parts of Dojo (AMD modules, classes and widgets, promises, i18n, routing, etc) along with the responsive UI of Bootstrap.',
modalAboutMoreInfo: 'More...',
widgets : {
geocoder : {
placeholder : 'Address or Location'
}
}
}),
fr: true,
es: true,
it: true,
de: true
// TODO: define other locales as needed
});<|fim▁end|> | root: ({
appTitle: 'Dojo Bootstrap Map',
navBasemaps: 'Basemaps',
navAbout: 'About', |
<|file_name|>bcbio_fastq_umi_prep.py<|end_file_name|><|fim▁begin|>#!/usr/bin/env python
"""Convert 3 fastq inputs (read 1, read 2, UMI) into paired inputs with UMIs in read names
Usage:
bcbio_fastq_umi_prep.py single <out basename> <read 1 fastq> <read 2 fastq> <umi fastq>
or:
bcbio_fastq_umi_prep.py autopair [<list> <of> <fastq> <files>]
Creates two fastq files with embedded UMIs: <out_basename>_R1.fq.gz <out_basename>_R2.fq.gz
or a directory of fastq files with UMIs added to the names.
"""
from __future__ import print_function
import argparse
import os
import sys
from bcbio import utils
from bcbio.bam import fastq
from bcbio.provenance import do
from bcbio.distributed.multi import run_multicore, zeromq_aware_logging
transform_json = r"""{
"read1": "(?P<name>@.*)\\n(?P<seq>.*)\\n\\+(.*)\\n(?P<qual>.*)\\n",
"read2": "(?P<name>@.*)\\n(?P<seq>.*)\\n\\+(.*)\\n(?P<qual>.*)\\n",
"read3": "(@.*)\\n(?P<MB>.*)\\n\\+(.*)\\n(.*)\\n"
}
"""
def run_single(args):
add_umis_to_fastq(args.out_base, args.read1_fq, args.read2_fq, args.umi_fq)
@utils.map_wrap
@zeromq_aware_logging
def add_umis_to_fastq_parallel(out_base, read1_fq, read2_fq, umi_fq, config):
add_umis_to_fastq(out_base, read1_fq, read2_fq, umi_fq)
def add_umis_to_fastq(out_base, read1_fq, read2_fq, umi_fq):
print("Processing", read1_fq, read2_fq, umi_fq)
cores = 8
out1_fq = out_base + "_R1.fq.gz"
out2_fq = out_base + "_R2.fq.gz"
transform_json_file = out_base + "-transform.json"
with open(transform_json_file, "w") as out_handle:
out_handle.write(transform_json)
with utils.open_gzipsafe(read1_fq) as in_handle:
ex_name = in_handle.readline().split(" ")
if len(ex_name) == 2:
fastq_tags_arg = "--keep_fastq_tags"
else:
fastq_tags_arg = ""
cmd = ("umis fastqtransform {fastq_tags_arg} "
"--fastq1out >(pbgzip -n {cores} -c > {out1_fq}) "
"--fastq2out >(pbgzip -n {cores} -c > {out2_fq}) "
"{transform_json_file} {read1_fq} "
"{read2_fq} {umi_fq}")
do.run(cmd.format(**locals()), "Add UMIs to paired fastq files")
os.remove(transform_json_file)
def run_autopair(args):
outdir = utils.safe_makedir(args.outdir)
to_run = []
extras = []
for fnames in fastq.combine_pairs(sorted(args.files)):
if len(fnames) == 2:
to_run.append(fnames)
elif len(fnames) == 3:
r1, r2, r3 = sorted(fnames)
to_run.append([r1, r2])
extras.append(r3)
else:
assert len(fnames) == 1, fnames
extras.append(fnames[0])
ready_to_run = []
for r1, r2 in to_run:
target = os.path.commonprefix([r1, r2])
r3 = None
for test_r3 in extras:
if (os.path.commonprefix([r1, test_r3]) == target and
os.path.commonprefix([r2, test_r3]) == target):
r3 = test_r3
break
assert r3, (r1, r2, extras)
base_name = os.path.join(outdir, os.path.commonprefix([r1, r2, r3]).rstrip("_R"))
ready_to_run.append([base_name, r1, r3, r2, {"algorithm": {}, "resources": {}}])
parallel = {"type": "local", "cores": len(ready_to_run), "progs": []}
run_multicore(add_umis_to_fastq_parallel, ready_to_run, {"algorithm": {}}, parallel)
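# Illustrative pairing (hypothetical file names): for
#   sample1_R1.fq.gz, sample1_R2.fq.gz, sample1_R3.fq.gz
# combine_pairs() groups R1/R2, the loop above matches R3 via the shared
# "sample1_R" prefix, and outputs are written under --outdir as
# sample1_R1.fq.gz / sample1_R2.fq.gz with UMIs embedded.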
if __name__ == "__main__":
parser = argparse.ArgumentParser(description="Add UMIs to fastq read names")<|fim▁hole|>
p = sp.add_parser("autopair", help="Automatically pair R1/R2/R3 fastq inputs")
p.add_argument("--outdir", default="with_umis", help="Output directory to write UMI prepped fastqs")
p.add_argument("files", nargs="*", help="All fastq files to pair and process")
p.set_defaults(func=run_autopair)
p = sp.add_parser("single", help="Run single set of fastq files with separate UMIs")
p.add_argument("out_base", help="Base name for output files -- you get <base_name>_R1.fq.gz")
p.add_argument("read1_fq", help="Input fastq, read 1")
p.add_argument("read2_fq", help="Input fastq, read 2")
p.add_argument("umi_fq", help="Input fastq, UMIs")
p.set_defaults(func=run_single)
if len(sys.argv) == 1:
parser.print_help()
args = parser.parse_args()
args.func(args)<|fim▁end|> | sp = parser.add_subparsers(title="[sub-commands]") |
<|file_name|>settings.spec.js<|end_file_name|><|fim▁begin|>/* eslint prefer-arrow-callback: 0, func-names: 0, 'react/jsx-filename-extension': [1, { "extensions": [".js", ".jsx"] }] */
import React from 'react';
import assert from 'assert';
import { shallow } from 'enzyme';
import sinon from 'sinon';
import DisplayNameSlugEditor from '../partials/display-name-slug-editor';
import CollectionSettings from './settings';
import Thumbnail from '../components/thumbnail';
const collectionWithDefaultSubject = {
id: '1',
default_subject_src: 'subject.png',
display_name: 'A collection',
private: false,
slug: 'username/a-collection'
};
const collectionWithoutDefaultSubject = {
id: '1',
display_name: 'A collection',
private: false,
slug: 'username/a-collection'
};
describe('<CollectionSettings />', function () {
let wrapper;
let confirmationSpy;
let deleteButton;
let handleDescriptionInputChangeStub;
before(function () {
confirmationSpy = sinon.spy(CollectionSettings.prototype, 'confirmDelete');
handleDescriptionInputChangeStub = sinon.stub(CollectionSettings.prototype, 'handleDescriptionInputChange');
wrapper = shallow(<CollectionSettings
canCollaborate={true}
collection={collectionWithoutDefaultSubject}
/>,
{ context: { router: {}}}
);
deleteButton = wrapper.find('button.error');
});
it('should render without crashing', function () {
        assert.ok(wrapper.exists());
});
it('should render a <DisplayNameSlugEditor /> component', function () {
assert.equal(wrapper.find(DisplayNameSlugEditor).length, 1);
});
it('should render the correct default checked attribute for visibility', function () {
const privateChecked = wrapper.find('input[type="radio"]').first().props().defaultChecked;<|fim▁hole|> });
it('should render the thumbnail correctly depending on presence of default_subject_src', function () {
const thumbnailFirstInstance = wrapper.find(Thumbnail);
assert.equal(thumbnailFirstInstance.length, 0);
wrapper.setProps({ collection: collectionWithDefaultSubject });
const thumbnailSecondInstance = wrapper.find(Thumbnail);
assert.equal(thumbnailSecondInstance.length, 1);
});
it('should call this.handleDescriptionInputChange() when a user changes description text', function () {
wrapper.find('textarea').simulate('change');
sinon.assert.calledOnce(handleDescriptionInputChangeStub);
});
it('should render permission messaging if there is no user', function () {
wrapper.setProps({ canCollaborate: false, user: null });
const permissionMessage = wrapper.contains(<p>Not allowed to edit this collection</p>);
assert(permissionMessage, true);
});
it('should call this.confirmDelete() when the delete button is clicked', function () {
deleteButton.simulate('click');
sinon.assert.calledOnce(confirmationSpy);
});
});<|fim▁end|> | const publicChecked = wrapper.find('input[type="radio"]').last().props().defaultChecked;
assert.equal(privateChecked, collectionWithoutDefaultSubject.private);
assert.notEqual(publicChecked, collectionWithoutDefaultSubject.private); |
<|file_name|>harvest.py<|end_file_name|><|fim▁begin|>import re
import click
from lancet.utils import hr
@click.command()
@click.argument("query", required=False)
@click.pass_obj
def projects(lancet, query):
"""List Harvest projects, optionally filtered with a regexp."""
projects = lancet.timer.projects()
if query:
regexp = re.compile(query, flags=re.IGNORECASE)
def match(project):
match = regexp.search(project["name"])<|fim▁hole|> return False
project["match"] = match
return True
projects = (p for p in projects if match(p))
for project in sorted(projects, key=lambda p: p["name"].lower()):
name = project["name"]
if "match" in project:
m = project["match"]
s, e = m.start(), m.end()
match = click.style(name[s:e], fg="green")
name = name[:s] + match + name[e:]
click.echo(
"{:>9d} {} {}".format(
project["id"], click.style("‣", fg="yellow"), name
)
)
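# Illustrative output (hypothetical data), with the query match highlighted
# in green by the styling above:
#     12345 ‣ Internal - Website redesign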
@click.command()
@click.argument("project_id", type=int)
@click.pass_obj
def tasks(lancet, project_id):
"""List Harvest tasks for the given project ID."""
for task in lancet.timer.tasks(project_id):
click.echo(
"{:>9d} {} {}".format(
task["id"], click.style("‣", fg="yellow"), task["name"]
)
)<|fim▁end|> | if match is None: |
<|file_name|>UniqueLbPortViolationException.java<|end_file_name|><|fim▁begin|>package org.openstack.atlas.service.domain.exception;
public class UniqueLbPortViolationException extends PersistenceServiceException {
private final String message;
public UniqueLbPortViolationException(final String message) {<|fim▁hole|> @Override
public String getMessage() {
return message;
}
}<|fim▁end|> | this.message = message;
}
|
<|file_name|>app.component.ts<|end_file_name|><|fim▁begin|>import { Component, ViewChild } from '@angular/core';
import { jqxNumberInputComponent } from '../../../../../jqwidgets-ts/angular_jqxnumberinput';
@Component({
selector: 'app-root',
templateUrl: './app.component.html'
})
export class AppComponent {
@ViewChild('numericInput') numericInput: jqxNumberInputComponent;
symboltypes: string[] = ['$', '%', 'None'];
decimaldigitsNumbers: string[] = ['0', '1', '2', '3', '4'];
digitsNumbers: string[] = ['1', '2', '3', '4', '5', '6', '7', '8'];
change(event) {
let checked = event.args.checked;
this.numericInput.spinButtons(checked);
};
checkedLeftbutton(event) {
this.numericInput.symbolPosition('left');
};
checkedRightbutton(event) {
this.numericInput.symbolPosition('right');
};
symboltypeSelect(event) {
let index = event.args.index;
if (index == 2) {
this.numericInput.symbol('');
}
else {
let symbol = this.symboltypes[index];
this.numericInput.symbol(symbol);
}
};
decimaldigitsSelect(event) {
let index = event.args.index;
this.numericInput.decimalDigits(this.decimaldigitsNumbers[index]);
};
digitsSelect(event) {
let index = event.args.index;
this.numericInput.digits(this.digitsNumbers[index]);<|fim▁hole|><|fim▁end|> | };
} |
<|file_name|>AbstractFileOutput.java<|end_file_name|><|fim▁begin|>package com.debugtoday.htmldecoder.output;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.io.UnsupportedEncodingException;
import java.util.List;
import java.util.regex.Matcher;
import org.slf4j.Logger;
import com.debugtoday.htmldecoder.decoder.GeneralDecoder;
import com.debugtoday.htmldecoder.exception.GeneralException;
import com.debugtoday.htmldecoder.log.CommonLog;
import com.debugtoday.htmldecoder.output.object.FileOutputArg;
import com.debugtoday.htmldecoder.output.object.PaginationOutputArg;
import com.debugtoday.htmldecoder.output.object.TagFileOutputArg;
import com.debugtoday.htmldecoder.output.object.TagOutputArg;
import com.debugtoday.htmldecoder.output.object.TagWrapper;
import com.debugtoday.htmldecoder.output.object.TemplateFullTextWrapper;
/**
 * Base class for file-based output, e.g. article/article page/tag page/category page.
* @author zydecx
*
*/
public class AbstractFileOutput implements Output {
private static final Logger logger = CommonLog.getLogger();
@Override
public String export(Object object) throws GeneralException {
// TODO Auto-generated method stub
return DONE;
}
/**
 * @param file target file to write; the file (and its parent directories) is created if missing
 * @param template page template whose placeholders are substituted with the given content
 * @param arg head/title/body/pagination content to insert into the template
 * @throws GeneralException if the file cannot be created or written
*/
public void writeToFile(File file, TemplateFullTextWrapper template, FileOutputArg arg) throws GeneralException {
File parentFile = file.getParentFile();<|fim▁hole|>
if (!file.exists()) {
try {
file.createNewFile();
} catch (IOException e) {
throw new GeneralException("fail to create file[" + file.getAbsolutePath() + "]", e);
}
}
try (
PrintWriter pw = new PrintWriter(new OutputStreamWriter(new FileOutputStream(file), "UTF-8"));
) {
String formattedHead = "";
if (arg.getPageTitle() != null) {
formattedHead += "<title>" + arg.getPageTitle() + "</title>";
}
if (arg.getHead() != null) {
formattedHead += "\n" + arg.getHead();
}
String templateFullText = template.getFullText()
.replace(GeneralDecoder.formatPlaceholderRegex("head"), formattedHead);
if (arg.getBodyTitle() != null) {
templateFullText = templateFullText
.replace(GeneralDecoder.formatPlaceholderRegex("body_title"), arg.getBodyTitle());
}
if (arg.getBody() != null) {
templateFullText = templateFullText
.replace(GeneralDecoder.formatPlaceholderRegex("body"), arg.getBody());
}
if (arg.getPagination() != null) {
templateFullText = templateFullText
.replace(GeneralDecoder.formatPlaceholderRegex("pagination"), arg.getPagination());
}
pw.write(templateFullText);
} catch (FileNotFoundException | UnsupportedEncodingException e) {
throw new GeneralException("fail to write to file[" + file.getAbsolutePath() + "]", e);
}
}
/**
* used for tag page/category page output
* @param templateFullTextWrapper
* @param arg
* @throws GeneralException
*/
public void exportTagPage(TemplateFullTextWrapper templateFullTextWrapper, TagFileOutputArg arg) throws GeneralException {
List<TagWrapper> itemList = arg.getTagList();
int itemSize = itemList.size();
int pagination = arg.getPagination();
int pageSize = (int) Math.ceil(itemSize * 1.0 / pagination);
Output tagOutput = arg.getTagOutput();
Output paginationOutput = arg.getPaginationOutput();
for (int i = 1; i <= pageSize; i++) {
List<TagWrapper> subList = itemList.subList((i - 1) * pagination, Math.min(itemSize, i * pagination));
StringBuilder sb = new StringBuilder();
for (TagWrapper item : subList) {
String itemName = item.getName();
TagOutputArg tagArg = new TagOutputArg(itemName, arg.getRootUrl() + "/" + item.getName(), item.getArticleList().size());
sb.append(tagOutput.export(tagArg));
}
File file = new File(formatPageFilePath(arg.getRootFile().getAbsolutePath(), i));
FileOutputArg fileOutputArg = new FileOutputArg();
fileOutputArg.setBodyTitle(arg.getBodyTitle());
fileOutputArg.setBody(sb.toString());
fileOutputArg.setPageTitle(arg.getBodyTitle());
fileOutputArg.setPagination(paginationOutput.export(new PaginationOutputArg(arg.getRootUrl(), pageSize, i)));
writeToFile(file, templateFullTextWrapper, fileOutputArg);
}
}
protected String formatPageUrl(String rootUrl, int index) {
return PaginationOutput.formatPaginationUrl(rootUrl, index);
}
protected String formatPageFilePath(String rootPath, int index) {
return PaginationOutput.formatPaginationFilePath(rootPath, index);
}
}<|fim▁end|> | if (!parentFile.exists()) {
parentFile.mkdirs();
} |
<|file_name|>gridportfoliodividend.js<|end_file_name|><|fim▁begin|>define(['./alasqlportfoliodividenddata', './monthstaticvalues', './bankdatadividend', './dateperiod'], function(alasqlportfoliodividenddata, monthstaticvalues, bankdatadividend, dateperiod) {
var gridData = [];
var gridId;
var months = monthstaticvalues.getMonthWithLettersValues();
var currentMonth = new Date().getMonth();
var selectedYear = new Date().getFullYear();
function setId(fieldId) {
gridId = fieldId;
}
function setData(year) {
selectedYear = year;
var result = alasqlportfoliodividenddata.getPortfolioDividends(selectedYear);
var data = [];
var id = 0;
result.forEach(function(entry) {
if(entry == null) return;
var månad = months[entry.Månad -1];
var land = entry.Land == null ? "x" : entry.Land.toLowerCase();
data.push({
Id: id,
Name : entry.Värdepapper,
Antal : entry.Antal,
Typ: entry.Typ,
Månad: månad,
Utdelningsdatum : entry.Utdelningsdag,
Utdelningsbelopp : entry.UtdelningaktieValuta,
Utdelningtotal: entry.Belopp,
Valuta: entry.Valuta,
ValutaKurs: entry.ValutaKurs,
Land: land,
UtdelningDeklarerad: entry.UtdelningDeklarerad,
Utv: entry.Utv
});
id++;
});
gridData = data;
}
function onDataBound(e) {
var columns = e.sender.columns;
var dataItems = e.sender.dataSource.view();
var today = new Date().toISOString();
for (var j = 0; j < dataItems.length; j++) {
if(dataItems[j].items == null) return;
for (var i = 0; i < dataItems[j].items.length; i++) {
var utdelningsdatum = new Date(dataItems[j].items[i].get("Utdelningsdatum")).toISOString();
var utdelningdeklarerad = dataItems[j].items[i].get("UtdelningDeklarerad");
var row = e.sender.tbody.find("[data-uid='" + dataItems[j].items[i].uid + "']");
if(utdelningsdatum <= today)
row.addClass("grid-ok-row");
if(utdelningdeklarerad == "N")
row.addClass("grid-yellow-row");
}
}
}
function load() {
var today = new Date().toISOString().slice(0, 10);
var grid = $(gridId).kendoGrid({
toolbar: ["excel", "pdf"],
excel: {
fileName: "förväntade_utdelningar" + "_" + today + ".xlsx",
filterable: true
},
pdf: {
fileName: "förväntade_utdelningar" + "_" + today + ".pdf",
allPages: true,
avoidLinks: true,
paperSize: "A4",
margin: { top: "2cm", left: "1cm", right: "1cm", bottom: "1cm" },
landscape: true,
repeatHeaders: true,
scale: 0.8
},
theme: "bootstrap",
dataBound: onDataBound,
dataSource: {
data: gridData,
schema: {
model: {
fields: {
Name: { type: "string" },
Antal: { type: "number" },
Typ: { type: "string" },
Utdelningsdatum: { type: "date" },
Utdelningsbelopp: { type: "string" },
Utdelningtotal: { type: "number"},
Land: {type: "string" },
ValutaKurs: { type: "string"},
Valuta: {type: "string" }
}
}
},
group: {
field: "Månad", dir: "asc", aggregates: [
{ field: "Månad", aggregate: "sum" },
{ field: "Name", aggregate: "count" },
{ field: "Utdelningtotal", aggregate: "sum"}
]
},<|fim▁hole|> aggregate: [ { field: "Månad", aggregate: "sum" },
{ field: "Name", aggregate: "count" },
{ field: "Utdelningtotal", aggregate: "sum" }
],
sort: ({ field: "Utdelningsdatum", dir: "asc" }),
pageSize: gridData.length
},
scrollable: true,
sortable: true,
filterable: true,
groupable: true,
pageable: false,
columns: [
{ field: "Månad", groupHeaderTemplate: "#= value.substring(2, value.length) #", hidden: true },
{ field: "UtdelningDeklarerad", hidden: true },
{ field: "Name", title: "Värdepapper", template: "<div class='gridportfolio-country-picture' style='background-image: url(/styles/images/#:data.Land#.png);'></div><div class='gridportfolio-country-name'>#: Name #</div>", width: "150px", aggregates: ["count"], footerTemplate: "Totalt antal förväntade utdelningar: #=count# st", groupFooterTemplate: gridNameGroupFooterTemplate },
{ field: "Utdelningsdatum", title: "Utd/Handl. utan utd", format: "{0:yyyy-MM-dd}", width: "75px" },
{ field: "Typ", title: "Typ", width: "70px" },
{ field: "Antal", title: "Antal", format: "{0} st", width: "40px" },
{ field: "Utdelningsbelopp", title: "Utdelning/aktie", width: "60px" },
{ title: "Utv.", template: '<span class="#= gridPortfolioDividendDivChangeClass(data) #"></span>', width: "15px" },
{ field: "Utdelningtotal", title: "Belopp", width: "110px", format: "{0:n2} kr", aggregates: ["sum"], footerTemplate: gridUtdelningtotalFooterTemplate, groupFooterTemplate: gridUtdelningtotalGroupFooterTemplate },
{ title: "", template: '<span class="k-icon k-i-info" style="#= gridPortfolioDividendInfoVisibility(data) #"></span>', width: "15px" }
],
excelExport: function(e) {
var sheet = e.workbook.sheets[0];
for (var i = 0; i < sheet.columns.length; i++) {
sheet.columns[i].width = getExcelColumnWidth(i);
}
}
}).data("kendoGrid");
grid.thead.kendoTooltip({
filter: "th",
content: function (e) {
var target = e.target;
return $(target).text();
}
});
addTooltipForColumnFxInfo(grid, gridId);
addTooltipForColumnUtvInfo(grid, gridId);
}
function addTooltipForColumnUtvInfo(grid, gridId) {
$(gridId).kendoTooltip({
show: function(e){
if(this.content.text().length > 1){
this.content.parent().css("visibility", "visible");
}
},
hide:function(e){
this.content.parent().css("visibility", "hidden");
},
filter: "td:nth-child(9)",
position: "left",
width: 200,
content: function(e) {
var dataItem = grid.dataItem(e.target.closest("tr"));
if(dataItem == null || e.target[0].parentElement.className == "k-group-footer" || dataItem.Utv == 0) return "";
var content = "Utdelningsutveckling jmf fg utdelning: " + dataItem.Utv.replace('.', ',') + " %";
return content
}
}).data("kendoTooltip");
}
function addTooltipForColumnFxInfo(grid, gridId) {
$(gridId).kendoTooltip({
show: function(e){
if(this.content.text().length > 1){
this.content.parent().css("visibility", "visible");
}
},
hide:function(e){
this.content.parent().css("visibility", "hidden");
},
filter: "td:nth-child(11)",
position: "left",
width: 200,
content: function(e) {
var dataItem = grid.dataItem(e.target.closest("tr"));
if(dataItem == null || dataItem.ValutaKurs <= 1 || e.target[0].parentElement.className == "k-group-footer") return "";
var content = "Förväntat belopp beräknat med " + dataItem.Valuta + " växelkurs: " + (dataItem.ValutaKurs).replace(".", ",") + "kr";
return content
}
}).data("kendoTooltip");
}
window.gridPortfolioDividendDivChangeClass = function gridPortfolioDividendDivChangeClass(data) {
if(data.Utv == 0 || data.Utv == null)
return "hidden";
else if(data.Utv > 0)
return "k-icon k-i-arrow-up";
else
return "k-icon k-i-arrow-down";
}
window.gridPortfolioDividendInfoVisibility = function gridPortfolioDividendInfoVisibility(data) {
return data.ValutaKurs > 1 ? "" : "display: none;";
}
function getExcelColumnWidth(index) {
var columnWidth = 150;
switch(index) {
case 0: // Månad
columnWidth = 80;
break;
case 1: // Värdepapper
columnWidth = 220;
break;
case 2: // Datum
columnWidth = 80;
break;
case 3: // Typ
columnWidth = 130;
break;
case 4: // Antal
columnWidth = 70;
break;
case 5: // Utdelning/aktie
columnWidth = 120;
break;
case 6: // Belopp
columnWidth = 260;
break;
default:
columnWidth = 150;
}
return columnWidth;
}
function gridNameGroupFooterTemplate(e) {
var groupNameValue = e.Månad.sum;
if(typeof e.Name.group !== 'undefined')
groupNameValue = e.Name.group.value;
var groupMonthValue = months.indexOf(groupNameValue);
if(currentMonth <= groupMonthValue) {
return "Antal förväntade utdelningar: " + e.Name.count + " st";
}
else {
return "Antal erhållna utdelningar: " + e.Name.count + " st";
}
}
function gridUtdelningtotalFooterTemplate(e) {
var startPeriod = dateperiod.getStartOfYear((selectedYear -1));
var endPeriod = dateperiod.getEndOfYear((selectedYear -1));
var isTaxChecked = $('#checkboxTax').is(":checked");
var selectedYearTotalNumeric = e.Utdelningtotal.sum;
var selectedYearTotal = kendo.toString(selectedYearTotalNumeric, 'n2') + " kr";
var lastYearTotalNumeric = bankdatadividend.getTotalDividend(startPeriod, endPeriod, isTaxChecked);
var lastYearTotal = kendo.toString(lastYearTotalNumeric, 'n2') + " kr";
var growthValueNumeric = calculateGrowthChange(selectedYearTotalNumeric, lastYearTotalNumeric);
var growthValue = kendo.toString(growthValueNumeric, 'n2').replace(".", ",") + "%";
var spanChange = buildSpanChangeArrow(selectedYearTotalNumeric, lastYearTotalNumeric);
return "Totalt förväntat belopp: " + selectedYearTotal + " " + spanChange + growthValue + " (" + lastYearTotal + ")";
}
function gridUtdelningtotalGroupFooterTemplate(e) {
var groupNameValue = e.Månad.sum;
if(typeof e.Name.group !== 'undefined')
groupNameValue = e.Name.group.value;
var groupMonthValue = months.indexOf(groupNameValue);
var isTaxChecked = $('#checkboxTax').is(":checked");
var lastYearValueNumeric = bankdatadividend.getDividendMonthSumBelopp((selectedYear -1), (groupMonthValue +1), isTaxChecked);
var selectedYearValueNumeric = e.Utdelningtotal.sum;
var lastYearValue = kendo.toString(lastYearValueNumeric, 'n2') + " kr";
var selectedYearValue = kendo.toString(selectedYearValueNumeric, 'n2') + " kr";
var monthName = groupNameValue.substring(3, groupNameValue.length).toLowerCase();
var spanChange = buildSpanChangeArrow(selectedYearValueNumeric, lastYearValueNumeric);
var growthValueNumeric = calculateGrowthChange(selectedYearValueNumeric, lastYearValueNumeric);
var growthValue = kendo.toString(growthValueNumeric, 'n2').replace(".", ",") + "%";
if(months.includes(groupNameValue)) {
if(currentMonth <= groupMonthValue) {
return "Förväntat belopp " + monthName + ": " + selectedYearValue + " " + spanChange + growthValue + " (" + lastYearValue + ")";
}
else {
return "Erhållet belopp " + monthName + ": " + selectedYearValue + " " + spanChange + growthValue + " (" + lastYearValue + ")";
}
}
else {
return "Förväntat belopp " + groupNameValue + ": " + selectedYearValue + " " + spanChange + growthValue + " (" + lastYearValue + ")";
}
}
function buildSpanChangeArrow(current, last) {
var spanArrowClass = current > last ? "k-i-arrow-up" : "k-i-arrow-down";
var titleText = current > last ? "To the moon" : "Back to earth";
return "<span class='k-icon " + spanArrowClass + "' title='" + titleText + "'></span>";
}
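// Percentage change from "last" to "current", rounded to two decimals
// (e.g. current=110, last=100 -> 10); returns 0 when "last" is 0.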
function calculateGrowthChange(current, last) {
if(last == 0) return 0;
var changeValue = current - last;
return Number(((changeValue / last) * 100).toFixed(2));
}
return {
setId: setId,
setData: setData,
load: load
};
});<|fim▁end|> | |
<|file_name|>http.rs<|end_file_name|><|fim▁begin|>use hyper::{Client, Error};
use hyper::client::Response;
use hyper::header::ContentType;
use hyper::method::Method;
/// Makes a DELETE request to etcd.
pub fn delete(url: String) -> Result<Response, Error> {
request(Method::Delete, url)
}
/// Makes a GET request to etcd.
pub fn get(url: String) -> Result<Response, Error> {
request(Method::Get, url)
}
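// Illustrative usage (assumes an etcd instance at 127.0.0.1:2379; the key is a placeholder):
// let response = get("http://127.0.0.1:2379/v2/keys/foo".to_string());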
/// Makes a POST request to etcd.<|fim▁hole|> request_with_body(Method::Post, url, body)
}
/// Makes a PUT request to etcd.
pub fn put(url: String, body: String) -> Result<Response, Error> {
request_with_body(Method::Put, url, body)
}
// private
/// Makes a request to etcd.
fn request(method: Method, url: String) -> Result<Response, Error> {
let client = Client::new();
let request = client.request(method, &url);
request.send()
}
/// Makes a request with an HTTP body to etcd.
fn request_with_body(method: Method, url: String, body: String) -> Result<Response, Error> {
let client = Client::new();
let content_type: ContentType = ContentType(
"application/x-www-form-urlencoded".parse().unwrap()
);
let request = client.request(method, &url).header(content_type).body(&body);
request.send()
}<|fim▁end|> | pub fn post(url: String, body: String) -> Result<Response, Error> { |
<|file_name|>sql_to_sheets.py<|end_file_name|><|fim▁begin|># Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import datetime
import numbers
from contextlib import closing
from typing import Any, Iterable, Mapping, Optional, Sequence, Union
from airflow.operators.sql import BaseSQLOperator
from airflow.providers.google.suite.hooks.sheets import GSheetsHook
class SQLToGoogleSheetsOperator(BaseSQLOperator):
"""
Copy data from SQL results to provided Google Spreadsheet.
:param sql: The SQL to execute.
:param spreadsheet_id: The Google Sheet ID to interact with.
:param conn_id: the connection ID used to connect to the database.
:param parameters: The parameters to render the SQL query with.
:param database: name of database which overwrite the defined one in connection
:param spreadsheet_range: The A1 notation of the values to retrieve.
:param gcp_conn_id: The connection ID to use when fetching connection info.
:param delegate_to: The account to impersonate using domain-wide delegation of authority,
if any. For this to work, the service account making the request must have
domain-wide delegation enabled.
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
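    Example (illustrative only; the task ID, SQL, connection ID and spreadsheet ID
    below are placeholders, not values defined elsewhere)::

        upload = SQLToGoogleSheetsOperator(
            task_id="sql_to_sheet",
            sql="SELECT name, total FROM sales",
            sql_conn_id="my_database",
            spreadsheet_id="1aBcDeFgHiJkLmNoP",
        )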
"""
template_fields: Sequence[str] = (
"sql",
"spreadsheet_id",
"spreadsheet_range",
"impersonation_chain",
)
template_fields_renderers = {"sql": "sql"}
template_ext: Sequence[str] = (".sql",)
ui_color = "#a0e08c"
def __init__(
self,
*,
sql: str,
spreadsheet_id: str,
sql_conn_id: str,
parameters: Optional[Union[Mapping, Iterable]] = None,
database: Optional[str] = None,
spreadsheet_range: str = "Sheet1",
gcp_conn_id: str = "google_cloud_default",
delegate_to: Optional[str] = None,
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.sql = sql
self.conn_id = sql_conn_id
self.database = database
self.parameters = parameters
self.gcp_conn_id = gcp_conn_id
self.spreadsheet_id = spreadsheet_id
self.spreadsheet_range = spreadsheet_range
self.delegate_to = delegate_to
self.impersonation_chain = impersonation_chain
def _data_prep(self, data):
for row in data:
item_list = []
for item in row:
if isinstance(item, (datetime.date, datetime.datetime)):
item = item.isoformat()
elif isinstance(item, int): # To exclude int from the number check.
pass
elif isinstance(item, numbers.Number):
item = float(item)
item_list.append(item)
yield item_list
def _get_data(self):
hook = self.get_db_hook()
with closing(hook.get_conn()) as conn, closing(conn.cursor()) as cur:
self.log.info("Executing query")
cur.execute(self.sql, self.parameters or ())
yield [field[0] for field in cur.description]
yield from self._data_prep(cur.fetchall())<|fim▁hole|> values = list(self._get_data())
self.log.info("Connecting to Google")
sheet_hook = GSheetsHook(
gcp_conn_id=self.gcp_conn_id,
delegate_to=self.delegate_to,
impersonation_chain=self.impersonation_chain,
)
self.log.info(f"Uploading data to https://docs.google.com/spreadsheets/d/{self.spreadsheet_id}")
sheet_hook.update_values(
spreadsheet_id=self.spreadsheet_id,
range_=self.spreadsheet_range,
values=values,
)<|fim▁end|> |
def execute(self, context: Any) -> None:
self.log.info("Getting data") |
<|file_name|>exceptions.js<|end_file_name|><|fim▁begin|>import { ListWrapper } from 'angular2/src/facade/collection';
import { stringify, isBlank } from 'angular2/src/facade/lang';
import { BaseException, WrappedException } from 'angular2/src/facade/exceptions';
function findFirstClosedCycle(keys) {
var res = [];
for (var i = 0; i < keys.length; ++i) {
if (ListWrapper.contains(res, keys[i])) {
res.push(keys[i]);
return res;
}
else {
res.push(keys[i]);
}
}
return res;
}
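// Renders the chain of dependency keys for error messages, e.g. " (TokenA -> TokenB -> TokenA)";
// returns "" when the chain contains only a single key.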
function constructResolvingPath(keys) {
if (keys.length > 1) {
var reversed = findFirstClosedCycle(ListWrapper.reversed(keys));
var tokenStrs = reversed.map(k => stringify(k.token));
return " (" + tokenStrs.join(' -> ') + ")";
}
else {
return "";
}
}
/**
* Base class for all errors arising from misconfigured providers.
*/
export class AbstractProviderError extends BaseException {
constructor(injector, key, constructResolvingMessage) {
super("DI Exception");
this.keys = [key];
this.injectors = [injector];
this.constructResolvingMessage = constructResolvingMessage;
this.message = this.constructResolvingMessage(this.keys);
}
addKey(injector, key) {
this.injectors.push(injector);
this.keys.push(key);
this.message = this.constructResolvingMessage(this.keys);
}
get context() { return this.injectors[this.injectors.length - 1].debugContext(); }
}
/**
* Thrown when trying to retrieve a dependency by `Key` from {@link Injector}, but the
* {@link Injector} does not have a {@link Provider} for {@link Key}.
*
* ### Example ([live demo](http://plnkr.co/edit/vq8D3FRB9aGbnWJqtEPE?p=preview))
*
* ```typescript
* class A {
* constructor(b:B) {}
* }
*
* expect(() => Injector.resolveAndCreate([A])).toThrowError();
* ```
*/
export class NoProviderError extends AbstractProviderError {
constructor(injector, key) {
super(injector, key, function (keys) {
var first = stringify(ListWrapper.first(keys).token);
return `No provider for ${first}!${constructResolvingPath(keys)}`;
});
}
}
/**
* Thrown when dependencies form a cycle.
*
* ### Example ([live demo](http://plnkr.co/edit/wYQdNos0Tzql3ei1EV9j?p=info))
*
* ```typescript
* var injector = Injector.resolveAndCreate([
* provide("one", {useFactory: (two) => "two", deps: [[new Inject("two")]]}),
* provide("two", {useFactory: (one) => "one", deps: [[new Inject("one")]]})
* ]);<|fim▁hole|> *
* Retrieving `A` or `B` throws a `CyclicDependencyError` as the graph above cannot be constructed.
*/
export class CyclicDependencyError extends AbstractProviderError {
constructor(injector, key) {
super(injector, key, function (keys) {
return `Cannot instantiate cyclic dependency!${constructResolvingPath(keys)}`;
});
}
}
/**
* Thrown when a constructing type returns with an Error.
*
* The `InstantiationError` class contains the original error plus the dependency graph which caused
* this object to be instantiated.
*
* ### Example ([live demo](http://plnkr.co/edit/7aWYdcqTQsP0eNqEdUAf?p=preview))
*
* ```typescript
* class A {
* constructor() {
* throw new Error('message');
* }
* }
*
* var injector = Injector.resolveAndCreate([A]);
* try {
* injector.get(A);
* } catch (e) {
* expect(e instanceof InstantiationError).toBe(true);
* expect(e.originalException.message).toEqual("message");
* expect(e.originalStack).toBeDefined();
* }
* ```
*/
export class InstantiationError extends WrappedException {
constructor(injector, originalException, originalStack, key) {
super("DI Exception", originalException, originalStack, null);
this.keys = [key];
this.injectors = [injector];
}
addKey(injector, key) {
this.injectors.push(injector);
this.keys.push(key);
}
get wrapperMessage() {
var first = stringify(ListWrapper.first(this.keys).token);
return `Error during instantiation of ${first}!${constructResolvingPath(this.keys)}.`;
}
get causeKey() { return this.keys[0]; }
get context() { return this.injectors[this.injectors.length - 1].debugContext(); }
}
/**
 * Thrown when an object other than {@link Provider} (or `Type`) is passed to {@link Injector}
* creation.
*
* ### Example ([live demo](http://plnkr.co/edit/YatCFbPAMCL0JSSQ4mvH?p=preview))
*
* ```typescript
* expect(() => Injector.resolveAndCreate(["not a type"])).toThrowError();
* ```
*/
export class InvalidProviderError extends BaseException {
constructor(provider) {
super("Invalid provider - only instances of Provider and Type are allowed, got: " +
provider.toString());
}
}
/**
* Thrown when the class has no annotation information.
*
* Lack of annotation information prevents the {@link Injector} from determining which dependencies
* need to be injected into the constructor.
*
* ### Example ([live demo](http://plnkr.co/edit/rHnZtlNS7vJOPQ6pcVkm?p=preview))
*
* ```typescript
* class A {
* constructor(b) {}
* }
*
* expect(() => Injector.resolveAndCreate([A])).toThrowError();
* ```
*
* This error is also thrown when the class not marked with {@link Injectable} has parameter types.
*
* ```typescript
* class B {}
*
* class A {
* constructor(b:B) {} // no information about the parameter types of A is available at runtime.
* }
*
* expect(() => Injector.resolveAndCreate([A,B])).toThrowError();
* ```
*/
export class NoAnnotationError extends BaseException {
constructor(typeOrFunc, params) {
super(NoAnnotationError._genMessage(typeOrFunc, params));
}
static _genMessage(typeOrFunc, params) {
var signature = [];
for (var i = 0, ii = params.length; i < ii; i++) {
var parameter = params[i];
if (isBlank(parameter) || parameter.length == 0) {
signature.push('?');
}
else {
signature.push(parameter.map(stringify).join(' '));
}
}
return "Cannot resolve all parameters for " + stringify(typeOrFunc) + "(" +
signature.join(', ') + "). " + 'Make sure they all have valid type or annotations.';
}
}
/**
* Thrown when getting an object by index.
*
* ### Example ([live demo](http://plnkr.co/edit/bRs0SX2OTQiJzqvjgl8P?p=preview))
*
* ```typescript
* class A {}
*
* var injector = Injector.resolveAndCreate([A]);
*
* expect(() => injector.getAt(100)).toThrowError();
* ```
*/
export class OutOfBoundsError extends BaseException {
constructor(index) {
super(`Index ${index} is out-of-bounds.`);
}
}
// TODO: add a working example after alpha38 is released
/**
* Thrown when a multi provider and a regular provider are bound to the same token.
*
* ### Example
*
* ```typescript
* expect(() => Injector.resolveAndCreate([
* new Provider("Strings", {useValue: "string1", multi: true}),
* new Provider("Strings", {useValue: "string2", multi: false})
* ])).toThrowError();
* ```
*/
export class MixingMultiProvidersWithRegularProvidersError extends BaseException {
constructor(provider1, provider2) {
super("Cannot mix multi providers and regular providers, got: " + provider1.toString() + " " +
provider2.toString());
}
}
//# sourceMappingURL=exceptions.js.map<|fim▁end|> | *
* expect(() => injector.get("one")).toThrowError();
* ``` |
<|file_name|>object_is.js<|end_file_name|><|fim▁begin|>Object.is(1, 1);
Object.is(1, 2);
Object.is(1, {});
Object.is(1, NaN);
Object.is(0, 0);
Object.is(0, -0);
Object.is(NaN, NaN);
Object.is({}, {});
var emptyObject = {};
var emptyArray = [];
Object.is(emptyObject, emptyObject);
Object.is(emptyArray, emptyArray);
Object.is(emptyObject, emptyArray);
var squared = x => x * x;
Object.is(squared, squared);
var a: boolean = Object.is('a', 'a');<|fim▁hole|>var c: boolean = Object.is('a');
var d: boolean = Object.is('a', 'b', 'c');<|fim▁end|> | var b: string = Object.is('a', 'a'); |
<|file_name|>attenuator.py<|end_file_name|><|fim▁begin|># Copyright 2016 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Controller module for attenuators.
Sample Config:
.. code-block:: python
"Attenuator": [
{
"address": "192.168.1.12",
"port": 23,
"model": "minicircuits",
"paths": ["AP1-2G", "AP1-5G", "AP2-2G", "AP2-5G"]
},
{
"address": "192.168.1.14",
"port": 23,
"model": "minicircuits",
"paths": ["AP-DUT"]
}
]
"""
import importlib
import logging
MOBLY_CONTROLLER_CONFIG_NAME = "Attenuator"
# Keys used inside a config dict for attenuator.
# Keys for making the connection to the attenuator device. Right now we only
# use telnet lib. This can be refactored when the need for a different
# communication protocol arises.
KEY_ADDRESS = "address"
KEY_PORT = "port"
# A string that is the model of the attenuator used. This is essentially the
# module name for the underlying driver for the attenuator hardware.
KEY_MODEL = "model"
# A list of strings, each describing what's the connected to this attenuation
# path
KEY_PATHS = "paths"
PACKAGE_PATH_TEMPLATE = "mobly.controllers.attenuator_lib.%s"
def create(configs):
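  """Creates AttenuatorPath objects from a list of attenuator config dicts.

  Args:
    configs: A list of dicts, each shaped like the sample config above.

  Returns:
    A list of AttenuatorPath objects, one per entry in each config's "paths".
  """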
objs = []
for config in configs:
_validate_config(config)
attenuator_model = config[KEY_MODEL]
# Import the correct driver module for the attenuator device
module_name = PACKAGE_PATH_TEMPLATE % attenuator_model
module = importlib.import_module(module_name)
# Create each
attenuation_device = module.AttenuatorDevice(
path_count=len(config[KEY_PATHS]))
attenuation_device.model = attenuator_model
attenuation_device.open(config[KEY_ADDRESS], config[KEY_PORT])
for idx, path_name in enumerate(config[KEY_PATHS]):
path = AttenuatorPath(attenuation_device, idx=idx, name=path_name)
objs.append(path)
return objs
def destroy(objs):
for attenuation_path in objs:
attenuation_path.attenuation_device.close()
class Error(Exception):
"""This is the Exception class defined for all errors generated by
Attenuator-related modules.
"""
def _validate_config(config):
"""Verifies that a config dict for an attenuator device is valid.
Args:
config: A dict that is the configuration for an attenuator device.
Raises:
attenuator.Error: A config is not valid.
"""
required_keys = [KEY_ADDRESS, KEY_MODEL, KEY_PORT, KEY_PATHS]
for key in required_keys:
if key not in config:
raise Error("Required key %s missing from config %s" % (key, config))
class AttenuatorPath:
"""A convenience class that allows users to control each attenuator path
separately as different objects, as opposed to passing in an index number
to the functions of an attenuator device object.
This decouples the test code from the actual attenuator device used in the
physical test bed.
For example, if a test needs to attenuate four signal paths, this allows the
test to do:
.. code-block:: python
self.attenuation_paths[0].set_atten(50)
self.attenuation_paths[1].set_atten(40)
instead of:
.. code-block:: python
self.attenuators[0].set_atten(0, 50)
self.attenuators[0].set_atten(1, 40)
The benefit the former is that the physical test bed can use either four
single-channel attenuators, or one four-channel attenuators. Whereas the
latter forces the test bed to use a four-channel attenuator.
"""
def __init__(self, attenuation_device, idx=0, name=None):
self.model = attenuation_device.model
self.attenuation_device = attenuation_device
self.idx = idx
if self.idx >= attenuation_device.path_count:
raise IndexError("Attenuator index out of range!")
def set_atten(self, value):
"""This function sets the attenuation of Attenuator.
Args:
value: This is a floating point value for nominal attenuation to be
set. Unit is db.
"""
self.attenuation_device.set_atten(self.idx, value)
def get_atten(self):<|fim▁hole|> """Gets the current attenuation setting of Attenuator.
Returns:
A float that is the current attenuation value. Unit is db.
"""
return self.attenuation_device.get_atten(self.idx)
def get_max_atten(self):
"""Gets the max attenuation supported by the Attenuator.
Returns:
A float that is the max attenuation value.
"""
return self.attenuation_device.max_atten<|fim▁end|> | |
<|file_name|>pattern_correlation.py<|end_file_name|><|fim▁begin|>"""
This file is part of pyCMBS.
(c) 2012- Alexander Loew
For COPYING and LICENSE details, please refer to the LICENSE file
"""
"""
development script for pattern correlation analysis
"""
from pycmbs.diagnostic import PatternCorrelation
from pycmbs.data import Data<|fim▁hole|>
import matplotlib.pyplot as plt
plt.close('all')
fname = '../pycmbs/examples/example_data/air.mon.mean.nc'
# generate two datasets
x = Data(fname, 'air', read=True)
xc = x.get_climatology(return_object=True)
yc = xc.copy()
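# perturb the copied climatology with random noise so the two datasets differ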
yc.data = yc.data * np.random.random(yc.shape)*10.
PC = PatternCorrelation(xc, yc)
PC.plot()
plt.show()<|fim▁end|> | import numpy as np |
<|file_name|>paint_preview_base_service.cc<|end_file_name|><|fim▁begin|>// Copyright 2019 The Chromium Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#include "components/paint_preview/browser/paint_preview_base_service.h"
#include <memory>
#include <utility>
#include "base/bind.h"
#include "base/callback.h"
#include "base/files/file_path.h"
#include "base/logging.h"
#include "base/metrics/histogram_functions.h"
#include "build/build_config.h"
#include "components/paint_preview/browser/compositor_utils.h"
#include "components/paint_preview/browser/paint_preview_client.h"
#include "components/paint_preview/common/mojom/paint_preview_recorder.mojom.h"
#include "content/public/browser/browser_thread.h"
#include "content/public/browser/web_contents.h"
#include "ui/gfx/geometry/rect.h"
namespace paint_preview {
PaintPreviewBaseService::PaintPreviewBaseService(
std::unique_ptr<PaintPreviewFileMixin> file_mixin,
std::unique_ptr<PaintPreviewPolicy> policy,
bool is_off_the_record)
: file_mixin_(std::move(file_mixin)),
policy_(std::move(policy)),
is_off_the_record_(is_off_the_record) {}
PaintPreviewBaseService::~PaintPreviewBaseService() = default;
void PaintPreviewBaseService::CapturePaintPreview(CaptureParams capture_params,
OnCapturedCallback callback) {
DCHECK_CURRENTLY_ON(content::BrowserThread::UI);
content::WebContents* web_contents = capture_params.web_contents;
content::RenderFrameHost* render_frame_host =
capture_params.render_frame_host ? capture_params.render_frame_host
: web_contents->GetMainFrame();
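  // Bail out early when the embedder's policy does not allow capturing these
  // contents.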
if (policy_ && !policy_->SupportedForContents(web_contents)) {
std::move(callback).Run(CaptureStatus::kContentUnsupported, {});
return;
}
PaintPreviewClient::CreateForWebContents(web_contents); // Is a singleton.
auto* client = PaintPreviewClient::FromWebContents(web_contents);
if (!client) {
std::move(callback).Run(CaptureStatus::kClientCreationFailed, {});
return;
}
PaintPreviewClient::PaintPreviewParams params(capture_params.persistence);
if (capture_params.root_dir) {
params.root_dir = *capture_params.root_dir;
}
params.inner.clip_rect = capture_params.clip_rect;
params.inner.is_main_frame =
(render_frame_host == web_contents->GetMainFrame());
params.inner.capture_links = capture_params.capture_links;
params.inner.max_capture_size = capture_params.max_per_capture_size;
params.inner.max_decoded_image_size_bytes =
capture_params.max_decoded_image_size_bytes;
params.inner.skip_accelerated_content =
capture_params.skip_accelerated_content;
// TODO(crbug/1064253): Consider moving to client so that this always happens.
// Although, it is harder to get this right in the client due to its
// lifecycle.
auto capture_handle =
web_contents->IncrementCapturerCount(gfx::Size(), /*stay_hidden=*/true,
/*stay_awake=*/true);
auto start_time = base::TimeTicks::Now();
client->CapturePaintPreview(
params, render_frame_host,
base::BindOnce(&PaintPreviewBaseService::OnCaptured,
weak_ptr_factory_.GetWeakPtr(), std::move(capture_handle),
start_time, std::move(callback)));
}
void PaintPreviewBaseService::OnCaptured(
base::ScopedClosureRunner capture_handle,
base::TimeTicks start_time,
OnCapturedCallback callback,
base::UnguessableToken guid,
mojom::PaintPreviewStatus status,
std::unique_ptr<CaptureResult> result) {
capture_handle.RunAndReset();
if (!(status == mojom::PaintPreviewStatus::kOk ||
status == mojom::PaintPreviewStatus::kPartialSuccess) ||
!result->capture_success) {
DVLOG(1) << "ERROR: Paint Preview failed to capture for document "
<< guid.ToString() << " with error " << status;<|fim▁hole|> return;
}
base::UmaHistogramTimes("Browser.PaintPreview.Capture.TotalCaptureDuration",
base::TimeTicks::Now() - start_time);
std::move(callback).Run(CaptureStatus::kOk, std::move(result));
}
} // namespace paint_preview<|fim▁end|> | std::move(callback).Run(CaptureStatus::kCaptureFailed, {}); |
<|file_name|>0001_initial.py<|end_file_name|><|fim▁begin|># -*- coding: utf-8 -*-
# Generated by Django 1.10.5 on 2017-06-02 20:34
from __future__ import unicode_literals
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):<|fim▁hole|>
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='Credencial',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('user_name', models.CharField(max_length=60, unique=True)),
('password', models.CharField(max_length=255)),
('token', models.CharField(blank=True, max_length=60, unique=True)),
('agente', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='Ferramenta',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('nome', models.CharField(max_length=60, unique=True)),
('link', models.URLField()),
],
),
migrations.CreateModel(
name='Linguagem',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('nome', models.CharField(max_length=60, unique=True)),
],
),
migrations.CreateModel(
name='Projeto',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('nome', models.CharField(max_length=60, unique=True)),
('dono', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='dono', to=settings.AUTH_USER_MODEL)),
('ferramentas', models.ManyToManyField(related_name='ferramentas', to='project_manager.Ferramenta')),
('linguagem', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='linguagem', to='project_manager.Linguagem')),
('participantes', models.ManyToManyField(related_name='participantes', to=settings.AUTH_USER_MODEL)),
],
),
migrations.AddField(
model_name='credencial',
name='ferramenta',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='project_manager.Ferramenta'),
),
]<|fim▁end|> | |
<|file_name|>subscriber.rs<|end_file_name|><|fim▁begin|>/*
* Copyright (c) Facebook, Inc. and its affiliates.
*
* This software may be used and distributed according to the terms of the
* GNU General Public License version 2.
*/
use crate::action::CloudSyncTrigger;
use crate::config::CommitCloudConfig;
use crate::error::*;
use crate::receiver::CommandName::{
self, CommitCloudCancelSubscriptions, CommitCloudRestartSubscriptions,
CommitCloudStartSubscriptions,
};
use crate::util;
use anyhow::{bail, Result};
use eventsource::reqwest::Client;
use log::{error, info, warn};
use reqwest::Url;
use serde::Deserialize;
use std::collections::HashMap;
use std::path::PathBuf;
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::{mpsc, Arc};
use std::time::{Duration, SystemTime};
use std::{str, thread};
#[allow(unused_macros)]
macro_rules! tinfo {
($throttler:expr, $($args:tt)+) => ( $throttler.execute(&|| {
info!($($args)+);
}))
}<|fim▁hole|>macro_rules! terror {
($throttler:expr, $($args:tt)+) => ( $throttler.execute(&|| {
error!($($args)+);
}))
}
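/// Payload of a Commit Cloud notification: the latest workspace version plus
/// any added or removed heads.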
#[derive(Deserialize)]
pub struct Notification {
pub(crate) version: u64,
pub(crate) new_heads: Option<Vec<String>>,
pub(crate) removed_heads: Option<Vec<String>>,
}
#[derive(PartialEq, Eq, Hash)]
pub struct Subscription {
pub(crate) repo_name: String,
pub(crate) workspace: String,
}
struct ThrottlingExecutor {
/// throttling rate in seconds
rate: Duration,
/// last time of command execution
last_time: SystemTime,
}
impl ThrottlingExecutor {
/// create ThrottlingExecutor with some duration
pub fn new(rate: Duration) -> ThrottlingExecutor {
ThrottlingExecutor {
rate,
last_time: SystemTime::now() - rate,
}
}
/// Run function if it is time, skip otherwise
#[inline]
fn execute(&mut self, f: &dyn Fn()) {
let now = SystemTime::now();
if now
.duration_since(self.last_time)
.map(|elapsed| elapsed >= self.rate)
.unwrap_or(true)
{
f();
self.last_time = now;
}
}
/// Reset time to pretend the command last execution was a while ago
#[inline]
fn reset(&mut self) {
self.last_time = SystemTime::now() - self.rate;
}
}
/// WorkspaceSubscriberService manages a set of running subscriptions
/// and triggers `hg cloud sync` on notifications
/// The workflow is simple:
/// * fire `hg cloud sync` on start in every repo
/// * read and start current set of subscriptions and
/// fire `hg cloud sync` on notifications
/// * fire `hg cloud sync` when the connection recovers (we could have missed notifications)
/// * also provide actions (callbacks) to a few TcpReceiver commands
/// the commands are:
/// "start_subscriptions"
/// "restart_subscriptions"
/// "cancel_subscriptions"
/// if a command comes, gracefully cancel all previous subscriptions
/// and restart if requested
/// main use case:
/// if a client (hg) adds itself as a new subscriber (hg cloud join),
/// it is also the client's responsibility to send the "restart_subscriptions" command
/// same for unsubscribing (hg cloud leave)
/// The serve function starts the service
pub struct WorkspaceSubscriberService {
/// Server-Sent Events endpoint for Commit Cloud Notifications
pub(crate) notification_url: String,
/// Http endpoint for Commit Cloud requests
pub(crate) service_url: String,
/// OAuth token path (optional) for access to Commit Cloud SSE endpoint
pub(crate) user_token_path: Option<PathBuf>,
/// Directory with connected subscribers
pub(crate) connected_subscribers_path: PathBuf,
/// Number of retries for `hg cloud sync`
pub(crate) cloudsync_retries: u32,
/// Throttling rate for logging alive notification
pub(crate) alive_throttling_rate: Duration,
/// Throttling rate for logging errors
pub(crate) error_throttling_rate: Duration,
/// Channel for communication between threads
pub(crate) channel: (mpsc::Sender<CommandName>, mpsc::Receiver<CommandName>),
/// Interrupt barrier for joining threads
pub(crate) interrupt: Arc<AtomicBool>,
}
impl WorkspaceSubscriberService {
pub fn new(config: &CommitCloudConfig) -> Result<WorkspaceSubscriberService> {
Ok(WorkspaceSubscriberService {
notification_url: config
.notification_url
.clone()
.ok_or_else(|| ErrorKind::CommitCloudConfigError("undefined 'notification_url'"))?,
service_url: config
.service_url
.clone()
.ok_or_else(|| ErrorKind::CommitCloudConfigError("undefined 'service_url'"))?,
user_token_path: config.user_token_path.clone(),
connected_subscribers_path: config.connected_subscribers_path.clone().ok_or_else(
|| ErrorKind::CommitCloudConfigError("undefined 'connected_subscribers_path'"),
)?,
cloudsync_retries: config.cloudsync_retries,
alive_throttling_rate: Duration::new(config.alive_throttling_rate_sec, 0),
error_throttling_rate: Duration::new(config.error_throttling_rate_sec, 0),
channel: mpsc::channel(),
interrupt: Arc::new(AtomicBool::new(false)),
})
}
pub fn actions(&self) -> HashMap<CommandName, Box<dyn Fn() + Send>> {
let mut actions: HashMap<CommandName, Box<dyn Fn() + Send>> = HashMap::new();
actions.insert(CommitCloudRestartSubscriptions, {
let sender = self.channel.0.clone();
let interrupt = self.interrupt.clone();
Box::new(move || {
match sender.send(CommitCloudRestartSubscriptions) {
Err(err) => error!(
"Send CommitCloudRestartSubscriptions via mpsc::channel failed, reason: {}",
err
),
Ok(_) => {
info!("Restart subscriptions can take a while because it is graceful");
interrupt.store(true, Ordering::Relaxed);
}
}
})
});
actions.insert(CommitCloudCancelSubscriptions, {
let sender = self.channel.0.clone();
let interrupt = self.interrupt.clone();
Box::new(move || {
match sender.send(CommitCloudCancelSubscriptions) {
Err(err) => error!(
"Send CommitCloudCancelSubscriptions via mpsc::channel failed with {}",
err
),
Ok(_) => {
info!("Cancel subscriptions can take a while because it is graceful");
interrupt.store(true, Ordering::Relaxed);
}
}
})
});
actions.insert(CommitCloudStartSubscriptions, {
let sender = self.channel.0.clone();
let interrupt = self.interrupt.clone();
Box::new(move || {
match sender.send(CommitCloudStartSubscriptions) {
Err(err) => error!(
"Send CommitCloudStartSubscriptions via mpsc::channel failed with {}",
err
),
Ok(_) => interrupt.store(true, Ordering::Relaxed),
}
})
});
actions
}
pub fn serve(self) -> Result<thread::JoinHandle<Result<()>>> {
self.channel.0.send(CommitCloudStartSubscriptions)?;
Ok(thread::spawn(move || {
info!("Starting CommitCloud WorkspaceSubscriberService");
loop {
let command = self.channel.1.recv_timeout(Duration::from_secs(60));
match command {
Ok(CommitCloudCancelSubscriptions) => {
info!(
"All previous subscriptions have been canceled! \
Waiting for another commands..."
);
self.interrupt.store(false, Ordering::Relaxed);
}
Ok(CommitCloudRestartSubscriptions) => {
info!(
"All previous subscriptions have been canceled! \
Restarting subscriptions..."
);
self.interrupt.store(false, Ordering::Relaxed);
// start subscription threads
let access_token = util::read_access_token(&self.user_token_path);
if let Ok(access_token) = access_token {
let subscriptions = self.run_subscriptions(access_token)?;
for child in subscriptions {
let _ = child.join();
}
} else {
info!("User is not authenticated with Commit Cloud yet");
continue;
}
}
Ok(CommitCloudStartSubscriptions) => {
info!("Starting subscriptions...");
self.interrupt.store(false, Ordering::Relaxed);
let access_token = util::read_access_token(&self.user_token_path);
// start subscription threads
if let Ok(access_token) = access_token {
let subscriptions = self.run_subscriptions(access_token)?;
for child in subscriptions {
let _ = child.join();
}
} else {
info!("User is not authenticated with Commit Cloud yet");
continue;
}
}
Err(mpsc::RecvTimeoutError::Timeout) => {
if !util::read_subscriptions(&self.connected_subscribers_path)?.is_empty() {
self.channel.0.send(CommitCloudStartSubscriptions)?;
}
continue;
}
Err(e) => {
error!("Receive from mpsc::channel failed with {}", e);
bail!("Receive and wait on mpsc::channel failed with {}", e);
}
}
}
}))
}
/// This helper function reads the list of current connected subscribers
/// It starts all the requested subscriptions by simply runing a separate thread for each one
/// All threads keep checking the interrupt flag and join gracefully if it is restart or stop
fn run_subscriptions(&self, access_token: util::Token) -> Result<Vec<thread::JoinHandle<()>>> {
util::read_subscriptions(&self.connected_subscribers_path)?
.into_iter()
.map(|(subscription, repo_roots)| {
self.run_subscription(access_token.clone(), subscription, repo_roots)
})
.collect::<Result<Vec<thread::JoinHandle<()>>>>()
}
/// Helper function to run a single subscription
fn run_subscription(
&self,
access_token: util::Token,
subscription: Subscription,
repo_roots: Vec<PathBuf>,
) -> Result<thread::JoinHandle<()>> {
let mut notification_url = Url::parse(&self.notification_url)?;
let service_url = Url::parse(&self.service_url)?;
let sid = format!("({} @ {})", subscription.repo_name, subscription.workspace);
info!("{} Subscribing to {}", sid, notification_url);
notification_url
.query_pairs_mut()
.append_pair("workspace", &subscription.workspace)
.append_pair("repo_name", &subscription.repo_name)
.append_pair("access_token", &access_token.token)
.append_pair("token_type", &access_token.token_type.to_string());
let client = Client::new(notification_url);
info!("{} Spawn a thread to handle the subscription", sid);
let cloudsync_retries = self.cloudsync_retries;
let alive_throttling_rate = self.alive_throttling_rate;
let error_throttling_rate = self.error_throttling_rate;
let interrupt = self.interrupt.clone();
Ok(thread::spawn(move || {
info!("{} Thread started...", sid);
let fire = |reason: &'static str, version: Option<u64>| {
if service_url.socket_addrs(|| None).is_err() {
warn!(
"{} Skip CloudSyncTrigger: failed to lookup address information {}",
sid, service_url
);
return;
}
for repo_root in repo_roots.iter() {
info!(
"{} Fire CloudSyncTrigger in '{}' {}",
sid,
repo_root.display(),
reason,
);
// log outputs, results and continue even if unsuccessful
let _res = CloudSyncTrigger::fire(
&sid,
repo_root,
cloudsync_retries,
version,
subscription.workspace.clone(),
);
if interrupt.load(Ordering::Relaxed) {
break;
}
}
};
fire("before starting subscription", None);
if interrupt.load(Ordering::Relaxed) {
return;
}
info!("{} Start listening to notifications", sid);
let mut throttler_alive = ThrottlingExecutor::new(alive_throttling_rate);
let mut throttler_error = ThrottlingExecutor::new(error_throttling_rate);
let mut last_error = false;
// the library handles automatic reconnection
for event in client {
if interrupt.load(Ordering::Relaxed) {
return;
}
let event = event.map_err(|e| ErrorKind::CommitCloudHttpError(format!("{}", e)));
if let Err(e) = event {
terror!(throttler_error, "{} {}. Continue...", sid, e);
throttler_alive.reset();
last_error = true;
if format!("{}", e).contains("401 Unauthorized") {
// interrupt execution earlier
// all subscriptions have to be restarted from scratch
interrupt.store(true, Ordering::Relaxed);
}
continue;
}
let data = event.unwrap().data;
if data.is_empty() {
tinfo!(
throttler_alive,
"{} Received empty event. Subscription is alive",
sid
);
throttler_error.reset();
if last_error {
fire("after recover from error", None);
if interrupt.load(Ordering::Relaxed) {
return;
}
}
last_error = false;
continue;
}
throttler_alive.reset();
throttler_error.reset();
last_error = false;
info!("{} Received new notification event", sid);
let notification = serde_json::from_str::<Notification>(&data);
if let Err(e) = notification {
error!(
"{} Unable to decode json data in the event, reason: {}. Continue...",
sid, e
);
continue;
}
let notification = notification.unwrap();
info!(
"{} CommitCloud informs that the latest workspace version is {}",
sid, notification.version
);
if let Some(ref new_heads) = notification.new_heads {
if !new_heads.is_empty() {
info!("{} New heads:\n{}", sid, new_heads.join("\n"));
}
}
if let Some(ref removed_heads) = notification.removed_heads {
if !removed_heads.is_empty() {
info!("{} Removed heads:\n{}", sid, removed_heads.join("\n"));
}
}
fire("on new version notification", Some(notification.version));
if interrupt.load(Ordering::Relaxed) {
return;
}
}
}))
}
}<|fim▁end|> |
#[allow(unused_macros)] |
<|file_name|>rom_nist384_32.rs<|end_file_name|><|fim▁begin|>/*
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
*/
<|fim▁hole|>// Base Bits= 29
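// 384-bit values are stored in radix-2^29 form: 14 limbs of 29 bits each
// (see BASEBITS and MODBYTES below).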
// nist384 Modulus
pub const MODULUS: [Chunk; NLEN] = [
0x1FFFFFFF, 0x7, 0x0, 0x1FFFFE00, 0x1FFFEFFF, 0x1FFFFFFF, 0x1FFFFFFF, 0x1FFFFFFF, 0x1FFFFFFF,
0x1FFFFFFF, 0x1FFFFFFF, 0x1FFFFFFF, 0x1FFFFFFF, 0x7F,
];
pub const R2MODP: [Chunk; NLEN] = [
0x0, 0x8000, 0x1FF80000, 0x1FFFFF, 0x2000000, 0x0, 0x0, 0x1FFFFFFC, 0xF, 0x100, 0x400, 0x0,
0x0, 0x0,
];
pub const MCONST: Chunk = 0x1;
// nist384 Curve
pub const CURVE_COF_I: isize = 1;
pub const CURVE_A: isize = -3;
pub const CURVE_B_I: isize = 0;
pub const CURVE_COF: [Chunk; NLEN] = [
0x1, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0,
];
pub const CURVE_B: [Chunk; NLEN] = [
0x13EC2AEF, 0x142E476E, 0xBB4674A, 0xC731B14, 0x1875AC65, 0x447A809, 0x4480C50, 0xDDFD028,
0x19181D9C, 0x1F1FC168, 0x623815A, 0x47DCFC9, 0x1312FA7E, 0x59,
];
pub const CURVE_ORDER: [Chunk; NLEN] = [
0xCC52973, 0x760CB56, 0xC29DEBB, 0x141B6491, 0x12DDF581, 0x6C0FA1B, 0x1FFF1D8D, 0x1FFFFFFF,
0x1FFFFFFF, 0x1FFFFFFF, 0x1FFFFFFF, 0x1FFFFFFF, 0x1FFFFFFF, 0x7F,
];
pub const CURVE_GX: [Chunk; NLEN] = [
0x12760AB7, 0x12A2F1C3, 0x154A5B0E, 0x5E4BB7E, 0x2A38550, 0xF0412A, 0xE6167DD, 0xC5174F3,
0x146E1D3B, 0x1799056B, 0x3AC71C7, 0x1D160A6F, 0x87CA22B, 0x55,
];
pub const CURVE_GY: [Chunk; NLEN] = [
0x10EA0E5F, 0x1218EBE4, 0x1FA0675E, 0x1639C3A, 0xB8C00A6, 0x1889DAF8, 0x11F3A768, 0x17A51342,
0x9F8F41D, 0x1C9496E1, 0x1767A62F, 0xC4C58DE, 0x17DE4A9, 0x1B,
];
pub const MODBYTES: usize = 48;
pub const BASEBITS: usize = 29;
pub const MODBITS: usize = 384;
pub const MOD8: usize = 7;
pub const MODTYPE: ModType = ModType::NOT_SPECIAL;
pub const SH: usize = 14;
pub const CURVETYPE: CurveType = CurveType::WEIERSTRASS;
pub const CURVE_PAIRING_TYPE: CurvePairingType = CurvePairingType::NOT;
pub const SEXTIC_TWIST: SexticTwist = SexticTwist::NOT;
pub const ATE_BITS: usize = 0;
pub const SIGN_OF_X: SignOfX = SignOfX::NOT;
pub const HASH_TYPE: usize = 48;
pub const AESKEY: usize = 24;<|fim▁end|> | use nist384::big::NLEN;
use super::super::arch::Chunk;
use types::{ModType, CurveType, CurvePairingType, SexticTwist, SignOfX};
|
<|file_name|>index.ts<|end_file_name|><|fim▁begin|>export { KeyValueCache } from './KeyValueCache';<|fim▁hole|><|fim▁end|> | export { InMemoryLRUCache } from './InMemoryLRUCache'; |
<|file_name|>get_active_creative_wrappers.py<|end_file_name|><|fim▁begin|>#!/usr/bin/python
#
# Copyright 2014 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""This code example gets all active creative wrappers.
To create creative wrappers, run create_creative_wrappers.py.
The LoadFromStorage method is pulling credentials and properties from a
"googleads.yaml" file. By default, it looks for this file in your home
directory. For more information, see the "Caching authentication information"
section of our README.
Tags: CreativeWrapperService.getCreativeWrappersByStatement
"""
<|fim▁hole|># Import appropriate modules from the client library.
from googleads import dfp
def main(client):
# Initialize appropriate service.
creative_wrapper_service = client.GetService('CreativeWrapperService',
version='v201405')
# Create statement object to only select active creative wrappers.
values = [{
'key': 'status',
'value': {
'xsi_type': 'TextValue',
'value': 'ACTIVE'
}
}]
query = 'WHERE status = :status'
statement = dfp.FilterStatement(query, values)
# Get creative wrappers by statement.
while True:
response = creative_wrapper_service.getCreativeWrappersByStatement(
statement.ToStatement())
if 'results' in response:
# Display results.
for creative_wrapper in response['results']:
print ('Creative wrapper with ID \'%s\' applying to label \'%s\' was '
'found.' % (creative_wrapper['id'], creative_wrapper['labelId']))
statement.offset += dfp.SUGGESTED_PAGE_LIMIT
else:
break
print '\nNumber of results found: %s' % response['totalResultSetSize']
if __name__ == '__main__':
# Initialize client object.
dfp_client = dfp.DfpClient.LoadFromStorage()
main(dfp_client)<|fim▁end|> | __author__ = ('Nicholas Chen',
'Joseph DiLallo')
|
<|file_name|>test-load.js<|end_file_name|><|fim▁begin|>/*global describe, beforeEach, it*/
'use strict';
<|fim▁hole|>
describe('video generator', function () {
it('can be imported without blowing up', function () {
var app = require('../app');
assert(app !== undefined);
});
});<|fim▁end|> | var assert = require('assert'); |
<|file_name|>TestRtiOpCode.py<|end_file_name|><|fim▁begin|>from Chip import OpCodeDefinitions
from Tests.OpCodeTests.OpCodeTestBase import OpCodeTestBase
class TestRtiOpCode(OpCodeTestBase):<|fim▁hole|><|fim▁end|> | def test_execute_rti_implied_command_calls_and_method(self):
self.assert_opcode_execution(OpCodeDefinitions.rti_implied_command, self.target.get_rti_command_executed) |
<|file_name|>regions-variance-invariant-use-covariant.rs<|end_file_name|><|fim▁begin|>// Copyright 2012 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// Test that a type which is invariant with respect to its region
// parameter used in a covariant way yields an error.
//
// Note: see variance-regions-*.rs for the tests that check that the
// variance inference works in the first place.
<|fim▁hole|> f: &'a mut &'a isize
}
fn use_<'b>(c: Invariant<'b>) {
// For this assignment to be legal, Invariant<'b> <: Invariant<'static>.
// Since 'b <= 'static, this would be true if Invariant were covariant
// with respect to its parameter 'a.
let _: Invariant<'static> = c; //~ ERROR mismatched types
}
fn main() { }<|fim▁end|> | struct Invariant<'a> { |
<|file_name|>orf.rs<|end_file_name|><|fim▁begin|>// Copyright 2014-2016 Johannes Köster, Martin Larralde.
// Licensed under the MIT license (http://opensource.org/licenses/MIT)
// This file may not be copied, modified, or distributed
// except according to those terms.
//! One-way orf finder algorithm.
//!
//! Complexity: O(n).
//!<|fim▁hole|>//! use bio::seq_analysis::orf::{Finder, Orf};
//! let start_codons = vec!(b"ATG");
//! let stop_codons = vec!(b"TGA", b"TAG", b"TAA");
//! let min_len = 50;
//! let finder = Finder::new(start_codons, stop_codons, min_len);
//!
//! let sequence = b"ACGGCTAGAAAAGGCTAGAAAA";
//!
//! for Orf{start, end, offset} in finder.find_all(sequence) {
//! let orf = &sequence[start..end];
//! //...do something with orf sequence...
//! }
//! ```
//!
//! Right now the only way to check the reverse strand for orf is to use
//! the `alphabet::dna::RevComp` struct and to check for both sequences.
//! But that's not so performance friendly, as the reverse complementation and the orf research
//! could go on at the same time.
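//!
//! A minimal sketch of scanning the reverse strand as well (this assumes the
//! crate exposes a `revcomp` helper under `alphabets::dna`; exact paths may
//! differ between versions):
//!
//! ```ignore
//! use bio::alphabets::dna;
//! let rc = dna::revcomp(sequence.iter());
//! for orf in finder.find_all(&rc) {
//!     // coordinates here are relative to the reverse strand
//! }
//! ```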
use std::collections::VecDeque;
use std::iter;
use utils::{IntoTextIterator, TextIterator};
/// An implementation of a naive algorithm finder
pub struct Finder {
start_codons: Vec<VecDeque<u8>>,
stop_codons: Vec<VecDeque<u8>>,
min_len: usize,
}
impl Finder {
/// Create a new instance of a finder for the given start and stop codons and a particular length
pub fn new<'a>(
start_codons: Vec<&'a [u8; 3]>,
stop_codons: Vec<&'a [u8; 3]>,
min_len: usize,
) -> Self {
Finder {
start_codons: start_codons.into_iter() // Convert start_ and
.map(|x| { // stop_codons from
x.into_iter() // Vec<&[u8;3]> to
.map(|&x| x as u8) // Vec<VecDeque<u8>>
.collect::<VecDeque<u8>>() // so they can be
}) // easily compared
.collect(), // with codon built
stop_codons: stop_codons.into_iter() // from IntoTextIterator
.map(|x| { // object.
x.into_iter()
.map(|&x| x as u8)
.collect::<VecDeque<u8>>()
})
.collect(),
min_len,
}
}
/// Find all orfs in the given sequence
pub fn find_all<'a, I: IntoTextIterator<'a>>(&'a self, seq: I) -> Matches<I::IntoIter> {
Matches {
finder: self,
state: State::new(),
seq: seq.into_iter().enumerate(),
}
}
}
/// An orf representation with start and end position of said orf,
/// as well as offset of the reading frame (1,2,3) and strand location
// (current: +, reverse complementary: -).
pub struct Orf {
pub start: usize,
pub end: usize,
pub offset: i8,
}
/// The current algorithm state.
struct State {
start_pos: [Option<usize>; 3],
codon: VecDeque<u8>,
}
impl State {
/// Create new state.
pub fn new() -> Self {
State {
start_pos: [None, None, None],
codon: VecDeque::new(),
}
}
}
/// Iterator over offset, start position, end position and sequence of matched orfs.
pub struct Matches<'a, I: TextIterator<'a>> {
finder: &'a Finder,
state: State,
seq: iter::Enumerate<I>,
}
impl<'a, I: Iterator<Item = &'a u8>> Iterator for Matches<'a, I> {
type Item = Orf;
fn next(&mut self) -> Option<Orf> {
let mut result: Option<Orf> = None;
let mut offset: usize;
for (index, &nuc) in self.seq.by_ref() {
// update the codon
if self.state.codon.len() >= 3 {
self.state.codon.pop_front();
}
self.state.codon.push_back(nuc);
offset = (index + 1) % 3;
// inside orf
if self.state.start_pos[offset].is_some() {
// check if leaving orf
if self.finder.stop_codons.contains(&self.state.codon) {
// check if length is sufficient
if index + 1 - self.state.start_pos[offset].unwrap() > self.finder.min_len {
// build results
result = Some(Orf {
start: self.state.start_pos[offset].unwrap() - 2,
end: index + 1,
offset: offset as i8,
});
}
// reinitialize
self.state.start_pos[offset] = None;
}
// check if entering orf
} else if self.finder.start_codons.contains(&self.state.codon) {
self.state.start_pos[offset] = Some(index);
}
if result.is_some() {
return result;
}
}
None
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_orf() {
let start_codons = vec![b"ATG"];
let stop_codons = vec![b"TGA", b"TAG", b"TAA"];
let min_len = 50;
let finder = Finder::new(start_codons, stop_codons, min_len);
let sequence = b"ACGGCTAGAAAAGGCTAGAAAA";
for Orf { start, end, .. } in finder.find_all(sequence) {
let _ = &sequence[start..end];
}
}
}<|fim▁end|> | //! # Example
//!
//! ``` |
<|file_name|>syntax-extension-regex-unused.rs<|end_file_name|><|fim▁begin|>// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// ignore-stage1
#![feature(phase)]
extern crate regex;
#[phase(plugin)] extern crate regex_macros;
#[deny(unused_variables)]
#[deny(dead_code)]
// Tests to make sure that extraneous dead code warnings aren't emitted from
// the code generated by regex!.
fn main() {
let fubar = regex!("abc"); //~ ERROR unused variable: `fubar`<|fim▁hole|>}<|fim▁end|> | |
<|file_name|>test_uitypes.py<|end_file_name|><|fim▁begin|>from __future__ import with_statement
import sys
import unittest
import maya.cmds as cmds
import pymel.core as pm
import pymel.core.uitypes as ui
if not hasattr(pm, 'currentMenuParent'):
def currentMenuParent():
return ui.PyUI(cmds.setParent(q=1, menu=1))
pm.currentMenuParent = currentMenuParent
class TestWithStatement(unittest.TestCase):
def setUp(self):
cmds.setParent(None, menu=1)
self.win = cmds.window()
def tearDown(self):
cmds.deleteUI(self.win, window=True)
def test_classInit(self):
with ui.FormLayout() as fl:
self.assertEqual(pm.currentParent(), fl)
self.assertEqual(pm.currentParent(), self.win)
with ui.RowLayout() as rl:
self.assertEqual(pm.currentParent(), rl)
# Since there can only be one top-level layout,
# what happens is that before creating the row layout, the
# parent is set to the window; but that automatically gets translated
# to mean the top-level layout for that window, which is the form
# layout... so the row layout has it's parent set to the form
# layout
self.assertEqual(pm.currentParent(), fl)
with ui.ColumnLayout() as cl:
self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentParent(), fl)
def test_cmdInit(self):
with pm.formLayout() as fl:
self.assertEqual(pm.currentParent(), fl)
self.assertEqual(pm.currentParent(), self.win)
with pm.rowLayout() as rl:
self.assertEqual(pm.currentParent(), rl)
# Since there can only be one top-level layout,
# what happens is that before creating the row layout, the
# parent is set to the window; but that automatically gets translated
# to mean the top-level layout for that window, which is the form
# layout... so the row layout has it's parent set to the form
# layout
self.assertEqual(pm.currentParent(), fl)
with pm.columnLayout() as cl:
self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentParent(), fl)
def test_parentJump(self):
cl = ui.ColumnLayout()
rl1 = ui.RowLayout()
with pm.rowLayout(parent=cl) as rl2:
self.assertEqual(pm.currentParent(), rl2)
self.assertEqual(pm.currentParent(), cl)
def test_nested(self):
with ui.ColumnLayout() as cl:
self.assertEqual(pm.currentParent(), cl)
with pm.rowLayout() as rl:
self.assertEqual(pm.currentParent(), rl)
self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentParent(), self.win)
def test_nestedParentJump(self):
with ui.ColumnLayout() as cl:
self.assertEqual(pm.currentParent(), cl)
with pm.rowLayout() as rl:
self.assertEqual(pm.currentParent(), rl)
with cl:
# set the parent BACK to the column layout<|fim▁hole|> self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentParent(), self.win)
def test_nestedMenu(self):
self.assertEqual(pm.currentParent(), self.win)
self.assertEqual(pm.currentMenuParent(), None)
with ui.ColumnLayout() as cl:
self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentMenuParent(), None)
cmds.button()
with pm.popupMenu() as m:
self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentMenuParent(), m)
with ui.MenuItem(subMenu=1) as sm:
self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentMenuParent(), sm)
self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentMenuParent(), m)
self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentParent(), self.win)
def test_rowGroupLayout(self):
self.assertEqual(pm.currentParent(), self.win)
self.assertEqual(pm.currentMenuParent(), None)
with pm.textFieldButtonGrp( label='Label', text='Text', buttonLabel='Button' ) as tfbg:
self.assertEqual(pm.currentParent(), tfbg)
self.assertEqual(pm.currentMenuParent(), None)
cmds.button()
with pm.popupMenu() as m:
self.assertEqual(pm.currentParent(), tfbg)
self.assertEqual(pm.currentMenuParent(), m)
with pm.menuItem(subMenu=1) as sm:
self.assertEqual(pm.currentParent(), tfbg)
self.assertEqual(pm.currentMenuParent(), sm)
self.assertEqual(pm.currentParent(), tfbg)
self.assertEqual(pm.currentMenuParent(), m)
self.assertEqual(pm.currentParent(), tfbg)
self.assertEqual(pm.currentParent(), self.win)
fl = pm.formLayout()
tfbg2 = pm.textFieldButtonGrp( label='Label', text='Text', buttonLabel='Button' )
self.assertEqual(pm.currentParent(), fl)
with pm.columnLayout() as cl:
cmds.button()
with pm.popupMenu() as m:
self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentMenuParent(), m)
with pm.menuItem(subMenu=1) as sm:
self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentMenuParent(), sm)
self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentMenuParent(), m)
self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentParent(), fl)
def test_optionMenuGrp(self):
self.assertEqual(pm.currentParent(), self.win)
self.assertEqual(pm.currentMenuParent(), None)
with ui.ColumnLayout() as cl:
self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentMenuParent(), None)
cmds.button()
with ui.OptionMenuGrp() as m:
self.assertEqual(pm.currentParent(), m)
self.assertEqual(pm.currentMenuParent(), m.menu())
self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentParent(), self.win)
def test_windowExit(self):
self.assertEqual(pm.currentParent(), self.win)
newWin = ui.Window()
try:
with newWin:
self.assertEqual(pm.currentParent(), newWin)
with pm.formLayout() as fl:
self.assertEqual(pm.currentParent(), fl)
self.assertEqual(pm.currentParent(), newWin)
self.assertTrue(pm.currentParent() in (None, newWin, fl))
finally:
pm.deleteUI(newWin, window=True)
otherWin = ui.Window()
# try NOT using with statement, to make sure the last newWin
# statement's exit popped it off the stack correctly
try:
with pm.formLayout() as fl:
self.assertEqual(pm.currentParent(), fl)
self.assertEqual(pm.currentParent(), otherWin)
finally:
pm.deleteUI(otherWin, window=True)
class TestTextScrollList(unittest.TestCase):
def setUp(self):
cmds.setParent(None, menu=1)
self.win = cmds.window()
def tearDown(self):
cmds.deleteUI(self.win, window=True)
def test_selectItemEmptyList(self):
with ui.Window(self.win):
with pm.formLayout():
tsl = pm.textScrollList()
tsl.extend(['a','b','c'])
# Make sure this is NOT None
self.assertEqual(tsl.getSelectItem(), [])
if not pm.about(batch=1):
    for key, obj in list(globals().items()):
        if isinstance(obj, type) and issubclass(obj, unittest.TestCase):
del globals()[key]
obj.__name__ = '_canceledTest_' + obj.__name__<|fim▁end|> | self.assertEqual(pm.currentParent(), cl)
self.assertEqual(pm.currentParent(), rl) |
<|file_name|>config.rs<|end_file_name|><|fim▁begin|>use std::io::prelude::*;
use std::fs::File;
use std::path::Path;
use std::io::{Error, ErrorKind};
use toml;
#[derive(Clone)]
pub struct Config {
config: toml::Value,
}
<|fim▁hole|>impl Config {
pub fn from_path<P: AsRef<Path>>(path: P) -> Result<Config, Error> {
let mut fd = try!(File::open(path));
let mut toml = String::new();
try!(fd.read_to_string(&mut toml));
Config::from_string(&toml)
}
pub fn from_string(toml: &str) -> Result<Config, Error> {
let config = match toml.parse() {
Ok(config) => config,
Err(_) => {
return Err(Error::new(ErrorKind::InvalidData,
"Syntax error - config file is not valid TOML"))
}
};
Ok(Config { config: config })
}
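    /// Looks up a value by a dotted path such as "server.port".
    ///
    /// A minimal usage sketch (the `[server]` table is illustrative, and this
    /// assumes a toml crate version whose `Value` still provides `lookup`):
    ///
    /// ```ignore
    /// let cfg = Config::from_string("[server]\nport = 8080").unwrap();
    /// let port = cfg.lookup("server.port").and_then(|v| v.as_integer());
    /// ```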
pub fn lookup<'a>(&'a self, path: &'a str) -> Option<&'a toml::Value> {
self.config.lookup(path)
}
}<|fim▁end|> | |
<|file_name|>graph.rs<|end_file_name|><|fim▁begin|>use std::vec::Vec;
use gnuplot::*;
/// Draws a graph with x-axis indicating the iteration number and
/// y-axis showing the cost at that iteration.
pub fn draw_cost_graph(fg : &mut Figure, cost_history : &Vec<f64>) {
let mut x = Vec::with_capacity(cost_history.len());
for i in 0..cost_history.len() {
x.push(i)
}
fg.axes2d()
.set_aspect_ratio(Fix(1.0))
.set_x_label("Iteration", &[Rotate(0.0)])
.set_y_label("Cost", &[Rotate(90.0)])
.lines(x.iter(), cost_history.iter(), &[Color("#006633")])
.set_title("Cost Graph", &[]);
}
/// Draws a decision boundary at p_f(x, y) = 0 by evaluating the function
/// within the supplied limits. The function in evaluated as a grid of size
/// (grid_size x grid_size).
pub fn draw_decision_boundary_2d(fg : &mut Figure, limits : (f64, f64, f64, f64), grid_size : usize, p_f : &Fn(f64, f64) -> f64) {
assert!(limits.0 < limits.2);
assert!(limits.1 < limits.3);
assert!(grid_size >= 1);
let mut z_array : Vec<f64> = vec![];
let x_start = limits.0;
let y_start = limits.1;
let x_end = limits.2;
let y_end = limits.3;
let x_step = (x_end - x_start) / grid_size as f64;
let y_step = (y_end - y_start) / grid_size as f64;
for row in 0..grid_size {
for col in 0..grid_size {
let z = p_f(x_start + col as f64 * x_step, y_start + row as f64 * y_step);
z_array.push(z);
}
}
fg.axes3d()
.surface(z_array.iter(), grid_size, grid_size, Some((x_start, y_start, x_end, y_end)), &[])<|fim▁hole|>}<|fim▁end|> | .show_contours_custom(true, true, Linear, Fix("Decision Boundary"), [0.0f64].iter()); |
<|file_name|>log_entry.py<|end_file_name|><|fim▁begin|># TODO: fix this violation of visibility
from functools import partial
from jarvis_cli.client.common import _get_jarvis_resource, _post_jarvis_resource, \
_put_jarvis_resource, query
def _construct_log_entry_endpoint(event_id):
return "events/{0}/logentries".format(event_id)
def get_log_entry(event_id, conn, log_entry_id):
return _get_jarvis_resource(_construct_log_entry_endpoint(event_id), conn,
log_entry_id)<|fim▁hole|>def post_log_entry(event_id, conn, log_entry_request, quiet=False,
skip_tags_check=False):
return _post_jarvis_resource(_construct_log_entry_endpoint(event_id), conn,
log_entry_request, quiet, skip_tags_check)
def put_log_entry(event_id, conn, log_entry_id, log_entry_request):
return _put_jarvis_resource(_construct_log_entry_endpoint(event_id), conn,
log_entry_id, log_entry_request)
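# query_log_entries pre-binds the search endpoint via functools.partial, so a
# caller only supplies the connection and query parameters. Hypothetical usage:
#   results = query_log_entries(conn, {"tags": ["jarvis"]})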
query_log_entries = partial(query, "search/logentries")<|fim▁end|> | |
<|file_name|>custom.js<|end_file_name|><|fim▁begin|>(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),<|fim▁hole|> })(window,document,'script','//www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-65704319-1', 'auto');
ga('send', 'pageview');<|fim▁end|> | m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m) |
<|file_name|>zh-tw.py<|end_file_name|><|fim▁begin|># coding: utf8
{
'"update" is an optional expression like "field1=\'newvalue\'". You cannot update or delete the results of a JOIN': '"更新" 是選擇性的條件式, 格式就像 "欄位1=\'值\'". 但是 JOIN 的資料不可以使用 update 或是 delete"',
'%Y-%m-%d': '%Y-%m-%d',
'%Y-%m-%d %H:%M:%S': '%Y-%m-%d %H:%M:%S',
'%s rows deleted': '已刪除 %s 筆',
'%s rows updated': '已更新 %s 筆',
'(something like "it-it")': '(格式類似 "zh-tw")',
'A new version of web2py is available': '新版的 web2py 已發行',
'A new version of web2py is available: %s': '新版的 web2py 已發行: %s',
'ATTENTION: Login requires a secure (HTTPS) connection or running on localhost.': '注意: 登入管理帳號需要安全連線(HTTPS)或是在本機連線(localhost).',
'ATTENTION: TESTING IS NOT THREAD SAFE SO DO NOT PERFORM MULTIPLE TESTS CONCURRENTLY.': '注意: 因為在測試模式不保證多執行緒安全性,也就是說不可以同時執行多個測試案例',
'ATTENTION: you cannot edit the running application!': '注意:不可編輯正在執行的應用程式!',
'About': '關於',
'About application': '關於本應用程式',
'Admin is disabled because insecure channel': '管理功能(Admin)在不安全連線環境下自動關閉',
'Admin is disabled because unsecure channel': '管理功能(Admin)在不安全連線環境下自動關閉',
'Administrator Password:': '管理員密碼:',
'Are you sure you want to delete file "%s"?': '確定要刪除檔案"%s"?',
'Are you sure you want to uninstall application "%s"': '確定要移除應用程式 "%s"',
'Are you sure you want to uninstall application "%s"?': '確定要移除應用程式 "%s"',
'Asíncrona': 'Asíncrona',
'Authentication': '驗證',
'Available databases and tables': '可提供的資料庫和資料表',
'Ayuda': 'Ayuda',
'Cannot be empty': '不可空白',<|fim▁hole|>'Cannot compile: there are errors in your app. Debug it, correct errors and try again.': '無法編譯:應用程式中含有錯誤,請除錯後再試一次.',
'Change Password': '變更密碼',
'Check to delete': '打勾代表刪除',
'Check to delete:': '點選以示刪除:',
'Client IP': '客戶端網址(IP)',
'Comprobantes': 'Comprobantes',
'Configuración': 'Configuración',
'Configurar': 'Configurar',
'Consultas': 'Consultas',
'Controller': '控件',
'Controllers': '控件',
'Copyright': '版權所有',
'Cotización': 'Cotización',
'Create new application': '創建應用程式',
'Current request': '目前網路資料要求(request)',
'Current response': '目前網路資料回應(response)',
'Current session': '目前網路連線資訊(session)',
'DB Model': '資料庫模組',
'DESIGN': '設計',
'Database': '資料庫',
'Date and Time': '日期和時間',
'Delete': '刪除',
'Delete:': '刪除:',
'Deploy on Google App Engine': '配置到 Google App Engine',
'Description': '描述',
'Design for': '設計為了',
'Detalles': 'Detalles',
'E-mail': '電子郵件',
'EDIT': '編輯',
'Edit': '編輯',
'Edit Profile': '編輯設定檔',
'Edit This App': '編輯本應用程式',
'Edit application': '編輯應用程式',
'Edit current record': '編輯當前紀錄',
'Editing file': '編輯檔案',
'Editing file "%s"': '編輯檔案"%s"',
'Emisión': 'Emisión',
'Error logs for "%(app)s"': '"%(app)s"的錯誤紀錄',
'Estado (dummy)': 'Estado (dummy)',
'FacturaLibre': 'FacturaLibre',
'FacturaLibre. Aplicación en desarrollo': 'FacturaLibre. Aplicación en desarrollo',
'FacturaLibre. Aplicación web para factura electrónica': 'FacturaLibre. Aplicación web para factura electrónica',
'FacturaLibre: interfase alternativa': 'FacturaLibre: interfase alternativa',
'FacturaLibre: interfaz de usuario alternativa': 'FacturaLibre: interfaz de usuario alternativa',
'First name': '名',
'Functions with no doctests will result in [passed] tests.': '沒有 doctests 的函式會顯示 [passed].',
'Group ID': '群組編號',
'Hello World': '嗨! 世界',
'Import/Export': '匯入/匯出',
'Index': '索引',
'Información General': 'Información General',
'Información Técnica': 'Información Técnica',
'Inicio': 'Inicio',
'Installed applications': '已安裝應用程式',
'Internal State': '內部狀態',
'Invalid Query': '不合法的查詢',
'Invalid action': '不合法的動作(action)',
'Invalid email': '不合法的電子郵件',
'Language files (static strings) updated': '語言檔已更新',
'Languages': '各國語言',
'Last name': '姓',
'Last saved on:': '最後儲存時間:',
'Layout': '網頁配置',
'License for': '軟體版權為',
'Listar comprobantes.': 'Listar comprobantes.',
'Listar detalles': 'Listar detalles',
'Login': '登入',
'Login to the Administrative Interface': '登入到管理員介面',
'Logout': '登出',
'Lost Password': '密碼遺忘',
'Main Menu': '主選單',
'Menu Model': '選單模組(menu)',
'Models': '資料模組',
'Modules': '程式模組',
'NO': '否',
'Name': '名字',
'New Record': '新紀錄',
'No databases in this application': '這應用程式不含資料庫',
'Origin': '原文',
'Original/Translation': '原文/翻譯',
'Password': '密碼',
"Password fields don't match": '密碼欄不匹配',
'Peeking at file': '選擇檔案',
'Powered by': '基於以下技術構建:',
'Query:': '查詢:',
'Record ID': '紀錄編號',
'Register': '註冊',
'Registration key': '註冊金鑰',
'Remember me (for 30 days)': '記住我(30 天)',
'Reset Password key': '重設密碼',
'Resolve Conflict file': '解決衝突檔案',
'Role': '角色',
'Rows in table': '在資料表裏的資料',
'Rows selected': '筆資料被選擇',
'Saved file hash:': '檔案雜湊值已紀錄:',
'Secuencial': 'Secuencial',
'Servicios Web': 'Servicios Web',
'Static files': '靜態檔案',
'Stylesheet': '網頁風格檔',
'Submit': '傳送',
'Sure you want to delete this object?': '確定要刪除此物件?',
'Table name': '資料表名稱',
'Testing application': '測試中的應用程式',
'The "query" is a condition like "db.table1.field1==\'value\'". Something like "db.table1.field1==db.table2.field2" results in a SQL JOIN.': '"查詢"是一個像 "db.表1.欄位1==\'值\'" 的條件式. 以"db.表1.欄位1==db.表2.欄位2"方式則相當於執行 JOIN SQL.',
'There are no controllers': '沒有控件(controllers)',
'There are no models': '沒有資料庫模組(models)',
'There are no modules': '沒有程式模組(modules)',
'There are no static files': '沒有靜態檔案',
'There are no translators, only default language is supported': '沒有翻譯檔,只支援原始語言',
'There are no views': '沒有視圖',
'This is the %(filename)s template': '這是%(filename)s檔案的樣板(template)',
'Ticket': '問題單',
'Timestamp': '時間標記',
'Unable to check for upgrades': '無法做升級檢查',
'Unable to download': '無法下載',
'Unable to download app': '無法下載應用程式',
'Update:': '更新:',
'Upload existing application': '更新存在的應用程式',
'Use (...)&(...) for AND, (...)|(...) for OR, and ~(...) for NOT to build more complex queries.': '使用下列方式來組合更複雜的條件式, (...)&(...) 代表同時存在的條件, (...)|(...) 代表擇一的條件, ~(...)則代表反向條件.',
'User %(id)s Logged-in': '使用者 %(id)s 已登入',
'User %(id)s Registered': '使用者 %(id)s 已註冊',
'User ID': '使用者編號',
'Verify Password': '驗證密碼',
'View': '視圖',
'Views': '視圖',
'WSBFE': 'WSBFE',
'WSFEX': 'WSFEX',
'WSFEv0': 'WSFEv0',
'WSFEv1': 'WSFEv1',
'WSMTXCA': 'WSMTXCA',
'Welcome %s': '歡迎 %s',
'Welcome to web2py': '歡迎使用 web2py',
'YES': '是',
'about': '關於',
'appadmin is disabled because insecure channel': '因為來自非安全通道,管理介面關閉',
'cache': '快取記憶體',
'change password': '變更密碼',
'click here for online examples': '點此處進入線上範例',
'click here for the administrative interface': '點此處進入管理介面',
'customize me!': '請調整我!',
'data uploaded': '資料已上傳',
'database': '資料庫',
'database %s select': '已選擇 %s 資料庫',
'db': 'db',
'design': '設計',
'done!': '完成!',
'edit profile': '編輯設定檔',
'export as csv file': '以逗號分隔檔(csv)格式匯出',
'insert new': '插入新資料',
'insert new %s': '插入新資料 %s',
'invalid request': '不合法的網路要求(request)',
'login': '登入',
'logout': '登出',
'new record inserted': '已插入新紀錄',
'next 100 rows': '往後 100 筆',
'or import from csv file': '或是從逗號分隔檔(CSV)匯入',
'previous 100 rows': '往前 100 筆',
'record': '紀錄',
'record does not exist': '紀錄不存在',
'record id': '紀錄編號',
'register': '註冊',
'selected': '已選擇',
'state': '狀態',
'table': '資料表',
'unable to parse csv file': '無法解析逗號分隔檔(csv)',
'Últ.ID': 'Últ.ID',
'Últ.Nro.Cbte.': 'Últ.Nro.Cbte.',
}<|fim▁end|> | |
<|file_name|>base_module.py<|end_file_name|><|fim▁begin|>from abc import ABCMeta
from up.utils.up_logger import UpLogger
class BaseModule(metaclass=ABCMeta):
LOAD_ORDER = 0
def __init__(self, silent=False):
self.__silent = silent
self.__logger = UpLogger.get_logger()
self.__up = None
def initialize(self, up):
self.__up = up
self._log_debug("Initializing {}".format(self.class_name))
self._execute_initialization()
def _execute_initialization(self):
"""
Subclasses must override this method. Initialize the provider here.
:return: None
"""
pass
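    # A minimal subclass sketch (the class name and log message are
    # illustrative, not part of Up):
    #
    # class TelemetryModule(BaseModule):
    #     def _execute_initialization(self):
    #         self._log_info("Telemetry ready")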
def _log_debug(self, message):
if not self.__silent:
self.logger.debug(message)
def _log_info(self, message):
if not self.__silent:
self.logger.info(message)
def _log_warning(self, message):
if not self.__silent:
self.logger.warning(message)
def _log_error(self, message):
if not self.__silent:
self.logger.error(message)
def _log_critical(self, message):
if not self.__silent:
self.logger.critical(message)
def load(self):
return False
def is_a(self, module_name):
return self.__class__.__name__ == module_name
@property
def logger(self):
return self.__logger
@property<|fim▁hole|> :rtype: str
"""
return self.__class__.__name__
@property
def up(self):
return self.__up
@property
def telemetry_content(self):
return None
@classmethod
def instance(cls, up):
        return up.get_module(cls)<|fim▁end|> | def class_name(self):
"""
|
<|file_name|>sti.go<|end_file_name|><|fim▁begin|>package sti
import (
"errors"
"fmt"
"io"
"io/ioutil"
"os"
"path"
"path/filepath"
"regexp"
"strings"
"time"
"github.com/openshift/source-to-image/pkg/api"
"github.com/openshift/source-to-image/pkg/build"
"github.com/openshift/source-to-image/pkg/build/strategies/layered"
dockerpkg "github.com/openshift/source-to-image/pkg/docker"
s2ierr "github.com/openshift/source-to-image/pkg/errors"
"github.com/openshift/source-to-image/pkg/ignore"
"github.com/openshift/source-to-image/pkg/scm"
"github.com/openshift/source-to-image/pkg/scm/git"
"github.com/openshift/source-to-image/pkg/scripts"
"github.com/openshift/source-to-image/pkg/tar"
"github.com/openshift/source-to-image/pkg/util"
"github.com/openshift/source-to-image/pkg/util/cmd"
"github.com/openshift/source-to-image/pkg/util/fs"
utilglog "github.com/openshift/source-to-image/pkg/util/glog"
utilstatus "github.com/openshift/source-to-image/pkg/util/status"
)
var (
glog = utilglog.StderrLog
// List of directories that needs to be present inside working dir
workingDirs = []string{
api.UploadScripts,
api.Source,
api.DefaultScripts,
api.UserScripts,
}
errMissingRequirements = errors.New("missing requirements")
)
// STI strategy executes the S2I build.
// For more details about S2I, visit https://github.com/openshift/source-to-image
type STI struct {
config *api.Config
result *api.Result
postExecutor dockerpkg.PostExecutor
installer scripts.Installer
runtimeInstaller scripts.Installer
git git.Git
fs fs.FileSystem
tar tar.Tar
docker dockerpkg.Docker
incrementalDocker dockerpkg.Docker
runtimeDocker dockerpkg.Docker
callbackInvoker util.CallbackInvoker
requiredScripts []string
optionalScripts []string
optionalRuntimeScripts []string
externalScripts map[string]bool
installedScripts map[string]bool
scriptsURL map[string]string
incremental bool
sourceInfo *git.SourceInfo
env []string
newLabels map[string]string
// Interfaces
preparer build.Preparer
ignorer build.Ignorer
artifacts build.IncrementalBuilder
scripts build.ScriptsHandler
source build.Downloader
garbage build.Cleaner
layered build.Builder
// post executors steps
postExecutorStage int
postExecutorFirstStageSteps []postExecutorStep
postExecutorSecondStageSteps []postExecutorStep
postExecutorStepsContext *postExecutorStepContext
}
// New returns the instance of STI builder strategy for the given config.
// If the layeredBuilder parameter is specified, then the builder provided will
// be used for the case that the base Docker image does not have 'tar' or 'bash'
// installed.
func New(client dockerpkg.Client, config *api.Config, fs fs.FileSystem, overrides build.Overrides) (*STI, error) {
excludePattern, err := regexp.Compile(config.ExcludeRegExp)
if err != nil {
return nil, err
}
docker := dockerpkg.New(client, config.PullAuthentication)
var incrementalDocker dockerpkg.Docker
if config.Incremental {
incrementalDocker = dockerpkg.New(client, config.IncrementalAuthentication)
}
inst := scripts.NewInstaller(
config.BuilderImage,
config.ScriptsURL,
config.ScriptDownloadProxyConfig,
docker,
config.PullAuthentication,
fs,
)
tarHandler := tar.NewParanoid(fs)
tarHandler.SetExclusionPattern(excludePattern)
builder := &STI{
installer: inst,
config: config,
docker: docker,
incrementalDocker: incrementalDocker,
git: git.New(fs, cmd.NewCommandRunner()),
fs: fs,
tar: tarHandler,
callbackInvoker: util.NewCallbackInvoker(),
requiredScripts: scripts.RequiredScripts,
optionalScripts: scripts.OptionalScripts,
optionalRuntimeScripts: []string{api.AssembleRuntime},
externalScripts: map[string]bool{},
installedScripts: map[string]bool{},
scriptsURL: map[string]string{},
newLabels: map[string]string{},
}
if len(config.RuntimeImage) > 0 {
builder.runtimeDocker = dockerpkg.New(client, config.RuntimeAuthentication)
builder.runtimeInstaller = scripts.NewInstaller(
config.RuntimeImage,
config.ScriptsURL,
config.ScriptDownloadProxyConfig,
builder.runtimeDocker,
config.RuntimeAuthentication,
builder.fs,
)
}
// The sources are downloaded using the Git downloader.
// TODO: Add more SCM in future.
// TODO: explicit decision made to customize processing for usage specifically vs.
// leveraging overrides; also, we ultimately want to simplify s2i usage a good bit,
// which would lead to replacing this quick short circuit (so this change is tactical)
builder.source = overrides.Downloader
if builder.source == nil && !config.Usage {
downloader, err := scm.DownloaderForSource(builder.fs, config.Source, config.ForceCopy)
if err != nil {
return nil, err
}
builder.source = downloader
}
builder.garbage = build.NewDefaultCleaner(builder.fs, builder.docker)
builder.layered, err = layered.New(client, config, builder.fs, builder, overrides)
if err != nil {
return nil, err
}
// Set interfaces
builder.preparer = builder
// later on, if we support say .gitignore func in addition to .dockerignore
// func, setting ignorer will be based on config setting
builder.ignorer = &ignore.DockerIgnorer{}
builder.artifacts = builder
builder.scripts = builder
builder.postExecutor = builder
builder.initPostExecutorSteps()
return builder, nil
}
// Build processes a Request and returns a *api.Result and an error.
// An error represents a failure performing the build rather than a failure
// of the build itself. Callers should check the Success field of the result
// to determine whether a build succeeded or not.
func (builder *STI) Build(config *api.Config) (*api.Result, error) {
builder.result = &api.Result{}
if len(builder.config.CallbackURL) > 0 {
defer func() {
builder.result.Messages = builder.callbackInvoker.ExecuteCallback(
builder.config.CallbackURL,
builder.result.Success,
builder.postExecutorStepsContext.labels,
builder.result.Messages,
)
}()
}
defer builder.garbage.Cleanup(config)
glog.V(1).Infof("Preparing to build %s", config.Tag)
if err := builder.preparer.Prepare(config); err != nil {
return builder.result, err
}
if builder.incremental = builder.artifacts.Exists(config); builder.incremental {
tag := firstNonEmpty(config.IncrementalFromTag, config.Tag)
glog.V(1).Infof("Existing image for tag %s detected for incremental build", tag)
} else {
glog.V(1).Info("Clean build will be performed")
}
glog.V(2).Infof("Performing source build from %s", config.Source)
if builder.incremental {
if err := builder.artifacts.Save(config); err != nil {
glog.Warning("Clean build will be performed because of error saving previous build artifacts")
glog.V(2).Infof("error: %v", err)
}
}
if len(config.AssembleUser) > 0 {
glog.V(1).Infof("Running %q in %q as %q user", api.Assemble, config.Tag, config.AssembleUser)
} else {
glog.V(1).Infof("Running %q in %q", api.Assemble, config.Tag)
}
startTime := time.Now()
if err := builder.scripts.Execute(api.Assemble, config.AssembleUser, config); err != nil {
if err == errMissingRequirements {
glog.V(1).Info("Image is missing basic requirements (sh or tar), layered build will be performed")
return builder.layered.Build(config)
}
if e, ok := err.(s2ierr.ContainerError); ok {
if !isMissingRequirements(e.Output) {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonAssembleFailed,
utilstatus.ReasonMessageAssembleFailed,
)
return builder.result, err
}
glog.V(1).Info("Image is missing basic requirements (sh or tar), layered build will be performed")
buildResult, err := builder.layered.Build(config)
return buildResult, err
}
return builder.result, err
}
builder.result.BuildInfo.Stages = api.RecordStageAndStepInfo(builder.result.BuildInfo.Stages, api.StageAssemble, api.StepAssembleBuildScripts, startTime, time.Now())
builder.result.Success = true
return builder.result, nil
}
// Prepare prepares the source code and tar for build.
// NOTE: this func serves both the sti and onbuild strategies, as the OnBuild
// struct Build func leverages the STI struct Prepare func directly below.
func (builder *STI) Prepare(config *api.Config) error {
var err error
if builder.result == nil {
builder.result = &api.Result{}
}
if len(config.WorkingDir) == 0 {
if config.WorkingDir, err = builder.fs.CreateWorkingDirectory(); err != nil {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonFSOperationFailed,
utilstatus.ReasonMessageFSOperationFailed,
)
return err
}
}
builder.result.WorkingDir = config.WorkingDir
if len(config.RuntimeImage) > 0 {
startTime := time.Now()
		err = dockerpkg.GetRuntimeImage(config, builder.runtimeDocker)
builder.result.BuildInfo.Stages = api.RecordStageAndStepInfo(builder.result.BuildInfo.Stages, api.StagePullImages, api.StepPullRuntimeImage, startTime, time.Now())
if err != nil {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonPullRuntimeImageFailed,
utilstatus.ReasonMessagePullRuntimeImageFailed,
)
glog.Errorf("Unable to pull runtime image %q: %v", config.RuntimeImage, err)
return err
}
// user didn't specify mapping, let's take it from the runtime image then
if len(builder.config.RuntimeArtifacts) == 0 {
var mapping string
mapping, err = builder.docker.GetAssembleInputFiles(config.RuntimeImage)
if err != nil {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonInvalidArtifactsMapping,
utilstatus.ReasonMessageInvalidArtifactsMapping,
)
return err
}
if len(mapping) == 0 {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonGenericS2IBuildFailed,
utilstatus.ReasonMessageGenericS2iBuildFailed,
)
return errors.New("no runtime artifacts to copy were specified")
}
for _, value := range strings.Split(mapping, ";") {
if err = builder.config.RuntimeArtifacts.Set(value); err != nil {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonGenericS2IBuildFailed,
utilstatus.ReasonMessageGenericS2iBuildFailed,
)
return fmt.Errorf("could not parse %q label with value %q on image %q: %v",
dockerpkg.AssembleInputFilesLabel, mapping, config.RuntimeImage, err)
}
}
}
// we're validating values here to be sure that we're handling both of the cases of the invocation:
// from main() and as a method from OpenShift
for _, volumeSpec := range builder.config.RuntimeArtifacts {
var volumeErr error
switch {
case !path.IsAbs(filepath.ToSlash(volumeSpec.Source)):
volumeErr = fmt.Errorf("invalid runtime artifacts mapping: %q -> %q: source must be an absolute path", volumeSpec.Source, volumeSpec.Destination)
case path.IsAbs(volumeSpec.Destination):
volumeErr = fmt.Errorf("invalid runtime artifacts mapping: %q -> %q: destination must be a relative path", volumeSpec.Source, volumeSpec.Destination)
case strings.HasPrefix(volumeSpec.Destination, ".."):
volumeErr = fmt.Errorf("invalid runtime artifacts mapping: %q -> %q: destination cannot start with '..'", volumeSpec.Source, volumeSpec.Destination)
default:
continue
}
if volumeErr != nil {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonInvalidArtifactsMapping,
utilstatus.ReasonMessageInvalidArtifactsMapping,
)
return volumeErr
}
}
}
// Setup working directories
for _, v := range workingDirs {
if err = builder.fs.MkdirAllWithPermissions(filepath.Join(config.WorkingDir, v), 0755); err != nil {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonFSOperationFailed,
utilstatus.ReasonMessageFSOperationFailed,
)
return err
}
}
// fetch sources, for their .s2i/bin might contain s2i scripts
if config.Source != nil {
if builder.sourceInfo, err = builder.source.Download(config); err != nil {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonFetchSourceFailed,
utilstatus.ReasonMessageFetchSourceFailed,
)
return err
}
if config.SourceInfo != nil {
builder.sourceInfo = config.SourceInfo
}
}
// get the scripts
required, err := builder.installer.InstallRequired(builder.requiredScripts, config.WorkingDir)
if err != nil {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonInstallScriptsFailed,
utilstatus.ReasonMessageInstallScriptsFailed,
)
return err
}
optional := builder.installer.InstallOptional(builder.optionalScripts, config.WorkingDir)
requiredAndOptional := append(required, optional...)
if len(config.RuntimeImage) > 0 && builder.runtimeInstaller != nil {
optionalRuntime := builder.runtimeInstaller.InstallOptional(builder.optionalRuntimeScripts, config.WorkingDir)
requiredAndOptional = append(requiredAndOptional, optionalRuntime...)
}
// If a ScriptsURL was specified, but no scripts were downloaded from it, throw an error
if len(config.ScriptsURL) > 0 {
failedCount := 0
for _, result := range requiredAndOptional {
if includes(result.FailedSources, scripts.ScriptURLHandler) {
failedCount++
}
}
if failedCount == len(requiredAndOptional) {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonScriptsFetchFailed,
utilstatus.ReasonMessageScriptsFetchFailed,
)
return fmt.Errorf("could not download any scripts from URL %v", config.ScriptsURL)
}
}
for _, r := range requiredAndOptional {
if r.Error != nil {
glog.Warningf("Error getting %v from %s: %v", r.Script, r.URL, r.Error)
continue
}
builder.externalScripts[r.Script] = r.Downloaded
builder.installedScripts[r.Script] = r.Installed
builder.scriptsURL[r.Script] = r.URL
}
// see if there is a .s2iignore file, and if so, read in the patterns an then
// search and delete on
return builder.ignorer.Ignore(config)
}
// SetScripts allows to override default required and optional scripts
func (builder *STI) SetScripts(required, optional []string) {
builder.requiredScripts = required
builder.optionalScripts = optional
}
// PostExecute allows to execute post-build actions after the Docker
// container execution finishes.
func (builder *STI) PostExecute(containerID, destination string) error {
builder.postExecutorStepsContext.containerID = containerID
builder.postExecutorStepsContext.destination = destination
stageSteps := builder.postExecutorFirstStageSteps
if builder.postExecutorStage > 0 {
stageSteps = builder.postExecutorSecondStageSteps
}
for _, step := range stageSteps {
if err := step.execute(builder.postExecutorStepsContext); err != nil {
glog.V(0).Info("error: Execution of post execute step failed")
return err
}
}
return nil
}
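// CreateBuildEnvironment merges the environment declared with the sources
// (read from the uploaded source directory) with the environment from the
// build config; config entries are appended after the source-provided ones.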
func CreateBuildEnvironment(sourcePath string, cfgEnv api.EnvironmentList) []string {
s2iEnv, err := scripts.GetEnvironment(filepath.Join(sourcePath, api.Source))
if err != nil {
glog.V(3).Infof("No user environment provided (%v)", err)
}
return append(scripts.ConvertEnvironmentList(s2iEnv), scripts.ConvertEnvironmentList(cfgEnv)...)
}
// Exists determines if the current build supports incremental workflow.
// It checks if the previous image exists in the system and if so, then it
// verifies that the save-artifacts script is present.
func (builder *STI) Exists(config *api.Config) bool {
if !config.Incremental {
return false
}
policy := config.PreviousImagePullPolicy
if len(policy) == 0 {
policy = api.DefaultPreviousImagePullPolicy
}
tag := firstNonEmpty(config.IncrementalFromTag, config.Tag)
startTime := time.Now()
result, err := dockerpkg.PullImage(tag, builder.incrementalDocker, policy)
builder.result.BuildInfo.Stages = api.RecordStageAndStepInfo(builder.result.BuildInfo.Stages, api.StagePullImages, api.StepPullPreviousImage, startTime, time.Now())
if err != nil {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonPullPreviousImageFailed,
utilstatus.ReasonMessagePullPreviousImageFailed,
)
glog.V(2).Infof("Unable to pull previously built image %q: %v", tag, err)
return false
}
return result.Image != nil && builder.installedScripts[api.SaveArtifacts]
}
// Save extracts and restores the build artifacts from the previous build to
// the current build.
func (builder *STI) Save(config *api.Config) (err error) {
artifactTmpDir := filepath.Join(config.WorkingDir, "upload", "artifacts")
if builder.result == nil {
builder.result = &api.Result{}
}
if err = builder.fs.Mkdir(artifactTmpDir); err != nil {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonFSOperationFailed,
utilstatus.ReasonMessageFSOperationFailed,
)
return err
}
image := firstNonEmpty(config.IncrementalFromTag, config.Tag)
outReader, outWriter := io.Pipe()
errReader, errWriter := io.Pipe()
glog.V(1).Infof("Saving build artifacts from image %s to path %s", image, artifactTmpDir)
extractFunc := func(string) error {
startTime := time.Now()
extractErr := builder.tar.ExtractTarStream(artifactTmpDir, outReader)
io.Copy(ioutil.Discard, outReader) // must ensure reader from container is drained
builder.result.BuildInfo.Stages = api.RecordStageAndStepInfo(builder.result.BuildInfo.Stages, api.StageRetrieve, api.StepRetrievePreviousArtifacts, startTime, time.Now())
return extractErr
}
user := config.AssembleUser
if len(user) == 0 {
user, err = builder.docker.GetImageUser(image)
if err != nil {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonGenericS2IBuildFailed,
utilstatus.ReasonMessageGenericS2iBuildFailed,
)
return err
}
glog.V(3).Infof("The assemble user is not set, defaulting to %q user", user)
} else {
glog.V(3).Infof("Using assemble user %q to extract artifacts", user)
}
opts := dockerpkg.RunContainerOptions{
Image: image,
User: user,
ExternalScripts: builder.externalScripts[api.SaveArtifacts],
ScriptsURL: config.ScriptsURL,
Destination: config.Destination,
PullImage: false,
Command: api.SaveArtifacts,
Stdout: outWriter,
Stderr: errWriter,
OnStart: extractFunc,
NetworkMode: string(config.DockerNetworkMode),
CGroupLimits: config.CGroupLimits,
CapDrop: config.DropCapabilities,
Binds: config.BuildVolumes,
SecurityOpt: config.SecurityOpt,
}
dockerpkg.StreamContainerIO(errReader, nil, func(s string) { glog.Info(s) })
err = builder.docker.RunContainer(opts)
if e, ok := err.(s2ierr.ContainerError); ok {
err = s2ierr.NewSaveArtifactsError(image, e.Output, err)
}
	if err != nil {
		builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
			utilstatus.ReasonGenericS2IBuildFailed,
			utilstatus.ReasonMessageGenericS2iBuildFailed,
		)
	}
	return err
}
// Execute runs the specified STI script in the builder image.
func (builder *STI) Execute(command string, user string, config *api.Config) error {
glog.V(2).Infof("Using image name %s", config.BuilderImage)
// Ensure that the builder image is present in the local Docker daemon.
// The image should have been pulled when the strategy was created, so
// this should be a quick inspect of the existing image. However, if
// the image has been deleted since the strategy was created, this will ensure
// it exists before executing a script on it.
builder.docker.CheckAndPullImage(config.BuilderImage)
// we can't invoke this method before (for example in New() method)
// because of later initialization of config.WorkingDir
builder.env = CreateBuildEnvironment(config.WorkingDir, config.Environment)
errOutput := ""
outReader, outWriter := io.Pipe()
errReader, errWriter := io.Pipe()
externalScripts := builder.externalScripts[command]
// if LayeredBuild is called then all the scripts will be placed inside the image
if config.LayeredBuild {
externalScripts = false
}
opts := dockerpkg.RunContainerOptions{
Image: config.BuilderImage,
Stdout: outWriter,
Stderr: errWriter,
// The PullImage is false because the PullImage function should be called
// before we run the container
PullImage: false,
ExternalScripts: externalScripts,
ScriptsURL: config.ScriptsURL,
Destination: config.Destination,
Command: command,
Env: builder.env,
User: user,
PostExec: builder.postExecutor,
NetworkMode: string(config.DockerNetworkMode),
CGroupLimits: config.CGroupLimits,
CapDrop: config.DropCapabilities,
Binds: config.BuildVolumes,
SecurityOpt: config.SecurityOpt,
}
// If there are injections specified, override the original assemble script
// and wait till all injections are uploaded into the container that runs the
// assemble script.
injectionError := make(chan error)
if len(config.Injections) > 0 && command == api.Assemble {
workdir, err := builder.docker.GetImageWorkdir(config.BuilderImage)
if err != nil {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonGenericS2IBuildFailed,
utilstatus.ReasonMessageGenericS2iBuildFailed,
)
return err
}
config.Injections = util.FixInjectionsWithRelativePath(workdir, config.Injections)
truncatedFiles, err := util.ListFilesToTruncate(builder.fs, config.Injections)
if err != nil {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(
utilstatus.ReasonInstallScriptsFailed,
utilstatus.ReasonMessageInstallScriptsFailed,
)
return err
}
rmScript, err := util.CreateTruncateFilesScript(truncatedFiles, "/tmp/rm-injections")
if len(rmScript) != 0 {
defer os.Remove(rmScript)
}
if err != nil {
builder.result.BuildInfo.FailureReason = utilstatus.NewFailureReason(<|fim▁hole|> return err
}
opts.CommandOverrides = func(cmd string) string {
return fmt.Sprintf("while [ ! -f %q ]; do sleep 0.5; done; %s; result=$?; . %[1]s; exit $result",
"/tmp/rm-injections", cmd)
}
originalOnStart := opts.OnStart
opts.OnStart = func(containerID string) error {
defer close(injectionError)
glog.V(2).Info("starting the injections uploading ...")
for _, s := range config.Injections {
if err := builder.docker.UploadToContainer(builder.fs, s.Source, s.Destination, containerID); err != nil {
injectionError <- util.HandleInjectionError(s, err)
return err
}
}
if err := builder.docker.UploadToContainer(builder.fs, rmScript, "/tmp/rm-injections", containerID); err != nil {
injectionError <- util.HandleInjectionError(api.VolumeSpec{Source: rmScript, Destination: "/tmp/rm-injections"}, err)
return err
}
if originalOnStart != nil {
return originalOnStart(containerID)
}
return nil
}
} else {
close(injectionError)
}
if !config.LayeredBuild {
r, w := io.Pipe()
opts.Stdin = r
go func() {
// Wait for the injections to complete and check the error. Do not start
// streaming the sources when the injection failed.
if <-injectionError != nil {
w.Close()
return
}
glog.V(2).Info("starting the source uploading ...")
uploadDir := filepath.Join(config.WorkingDir, "upload")
w.CloseWithError(builder.tar.CreateTarStream(uploadDir, false, w))
}()
}
dockerpkg.StreamContainerIO(outReader, nil, func(s string) {
if !config.Quiet {
glog.Info(strings.TrimSpace(s))
}
})
c := dockerpkg.StreamContainerIO(errReader, &errOutput, func(s string) { glog.Info(s) })
err := builder.docker.RunContainer(opts)
if err != nil {
// Must wait for StreamContainerIO goroutine above to exit before reading errOutput.
<-c
if isMissingRequirements(errOutput) {
err = errMissingRequirements
} else if e, ok := err.(s2ierr.ContainerError); ok {
err = s2ierr.NewContainerError(config.BuilderImage, e.ErrorCode, errOutput+e.Output)
}
}
return err
}
func (builder *STI) initPostExecutorSteps() {
builder.postExecutorStepsContext = &postExecutorStepContext{}
if len(builder.config.RuntimeImage) == 0 {
builder.postExecutorFirstStageSteps = []postExecutorStep{
&storePreviousImageStep{
builder: builder,
docker: builder.docker,
},
&commitImageStep{
image: builder.config.BuilderImage,
builder: builder,
docker: builder.docker,
fs: builder.fs,
tar: builder.tar,
},
&reportSuccessStep{
builder: builder,
},
&removePreviousImageStep{
builder: builder,
docker: builder.docker,
},
}
} else {
builder.postExecutorFirstStageSteps = []postExecutorStep{
&downloadFilesFromBuilderImageStep{
builder: builder,
docker: builder.docker,
fs: builder.fs,
tar: builder.tar,
},
&startRuntimeImageAndUploadFilesStep{
builder: builder,
docker: builder.docker,
fs: builder.fs,
},
}
builder.postExecutorSecondStageSteps = []postExecutorStep{
&commitImageStep{
image: builder.config.RuntimeImage,
builder: builder,
docker: builder.docker,
tar: builder.tar,
},
&reportSuccessStep{
builder: builder,
},
}
}
}
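// isMissingRequirements reports whether container output indicates that the
// builder image lacks tar or /bin/sh, the basic requirements for running the
// s2i scripts; callers fall back to a layered build in that case.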
func isMissingRequirements(text string) bool {
tarCommand, _ := regexp.MatchString(`.*tar.*not found`, text)
shCommand, _ := regexp.MatchString(`.*/bin/sh.*no such file or directory`, text)
return tarCommand || shCommand
}
func includes(arr []string, str string) bool {
for _, s := range arr {
if s == str {
return true
}
}
return false
}
func firstNonEmpty(args ...string) string {
for _, value := range args {
if len(value) > 0 {
return value
}
}
return ""
}<|fim▁end|> | utilstatus.ReasonGenericS2IBuildFailed,
utilstatus.ReasonMessageGenericS2iBuildFailed,
) |
<|file_name|>filters.py<|end_file_name|><|fim▁begin|>import django_filters
from django_filters import rest_framework as filters
from django_rv_apps.apps.believe_his_prophets.models.spirit_prophecy_chapter import SpiritProphecyChapterLanguage
from django_rv_apps.apps.believe_his_prophets.models.language import Language
from django.utils import timezone
class SpiritProphecyChapterLanguageFilter(django_filters.FilterSet):
code_iso = filters.ModelMultipleChoiceFilter(
queryset=Language.objects.all(),
field_name='language__code_iso',
to_field_name='code_iso'
)
start_date = filters.CharFilter(method='filter_date')
class Meta:
model = SpiritProphecyChapterLanguage
        fields = ('id', 'code_iso', 'start_date')
<|fim▁hole|> t = timezone.localtime(timezone.now())
return queryset.filter(
            spirit_prophecy_chapter__start_date__year=t.year,
            spirit_prophecy_chapter__start_date__month=t.month,
            spirit_prophecy_chapter__start_date__day=t.day,
)<|fim▁end|> | def filter_date(self, queryset, name, value): |
<|file_name|>EstimatorsTest.java<|end_file_name|><|fim▁begin|>package cz.cuni.lf1.lge.ThunderSTORM.estimators;
import cz.cuni.lf1.lge.ThunderSTORM.detectors.CentroidOfConnectedComponentsDetector;
import cz.cuni.lf1.lge.ThunderSTORM.estimators.PSF.EllipticGaussianPSF;
import cz.cuni.lf1.lge.ThunderSTORM.estimators.PSF.EllipticGaussianWAnglePSF;
import cz.cuni.lf1.lge.ThunderSTORM.estimators.PSF.Molecule;<|fim▁hole|>import cz.cuni.lf1.lge.ThunderSTORM.estimators.PSF.IntegratedSymmetricGaussianPSF;
import cz.cuni.lf1.lge.ThunderSTORM.estimators.PSF.PSFModel.Params;
import cz.cuni.lf1.lge.ThunderSTORM.util.CSV;
import static cz.cuni.lf1.lge.ThunderSTORM.util.MathProxy.sqr;
import cz.cuni.lf1.lge.ThunderSTORM.util.Point;
import ij.IJ;
import ij.process.FloatProcessor;
import java.util.List;
import java.util.Vector;
import org.junit.Test;
import static org.junit.Assert.*;
public class EstimatorsTest {
@Test
public void testRadialSymmetry() {
testEstimator(new MultipleLocationsImageFitting(5, new RadialSymmetryFitter()));
}
@Test
public void testLSQSym() {
testEstimator(new MultipleLocationsImageFitting(5, new LSQFitter(new SymmetricGaussianPSF(1), false, Params.BACKGROUND)));
testEstimator(new MultipleLocationsImageFitting(5, new LSQFitter(new SymmetricGaussianPSF(1), true, Params.BACKGROUND)));
}
@Test
public void testLSQIntSym() {
testEstimator(new MultipleLocationsImageFitting(5, new LSQFitter(new IntegratedSymmetricGaussianPSF(1), false, Params.BACKGROUND)));
testEstimator(new MultipleLocationsImageFitting(5, new LSQFitter(new IntegratedSymmetricGaussianPSF(1), true, Params.BACKGROUND)));
}
@Test
public void testMLEIntSym() {
testEstimator(new MultipleLocationsImageFitting(5, new MLEFitter(new IntegratedSymmetricGaussianPSF(1), Params.BACKGROUND)));
}
@Test
public void testMLESym() {
testEstimator(new MultipleLocationsImageFitting(5, new MLEFitter(new SymmetricGaussianPSF(1), Params.BACKGROUND)));
}
@Test
public void testLSQEllipticAngle() {
testEstimator(new MultipleLocationsImageFitting(5, new LSQFitter(new EllipticGaussianWAnglePSF(1, 0), false, Params.BACKGROUND)));
testEstimator(new MultipleLocationsImageFitting(5, new LSQFitter(new EllipticGaussianWAnglePSF(1, 0), true, Params.BACKGROUND)));
}
@Test
public void testMLEEllipticAngle() {
testEstimator(new MultipleLocationsImageFitting(5, new MLEFitter(new EllipticGaussianWAnglePSF(1, 0), Params.BACKGROUND)));
}
@Test
public void testLSQElliptic() {
testEstimator(new MultipleLocationsImageFitting(5, new LSQFitter(new EllipticGaussianPSF(1, 45), false, Params.BACKGROUND)));
testEstimator(new MultipleLocationsImageFitting(5, new LSQFitter(new EllipticGaussianPSF(1, 45), true, Params.BACKGROUND)));
}
@Test
public void testMLEElliptic() {
testEstimator(new MultipleLocationsImageFitting(5, new MLEFitter(new EllipticGaussianPSF(1, 45), Params.BACKGROUND)));
}
private void testEstimator(IEstimator estimator) throws FormulaParserException {
String basePath = this.getClass().getProtectionDomain().getCodeSource().getLocation().getPath();
FloatProcessor image = (FloatProcessor) IJ.openImage(basePath + "tubulins1_00020.tif").getProcessor().convertToFloat();
FloatProcessor filtered = (new CompoundWaveletFilter()).filterImage(image);
Vector<Point> detections = (new CentroidOfConnectedComponentsDetector("16", true)).detectMoleculeCandidates(filtered);
List<Molecule> fits = estimator.estimateParameters(image, detections);
for(Molecule fit : fits) {
convertXYToNanoMeters(fit, 150.0);
}
Vector<Molecule> ground_truth = null;
try {
ground_truth = CSV.csv2psf(basePath + "tubulins1_00020.csv", 1, 2);
} catch(Exception ex) {
fail(ex.getMessage());
}
Vector<Pair> pairs = pairFitsAndDetections2GroundTruths(detections, fits, ground_truth);
for(Pair pair : pairs) {
assertFalse("Result from the estimator should be better than guess from the detector.", dist2(pair.fit, pair.ground_truth) > dist2(pair.detection, pair.ground_truth));
}
//
        // Note: a better test would compare these results against reference results from Matlab, which are not currently available.
//
}
static void convertXYToNanoMeters(Molecule fit, double px2nm) {
fit.setX(fit.getX() * px2nm);
fit.setY(fit.getY() * px2nm);
}
static class Pair {
Point detection;
Molecule fit;
Molecule ground_truth;
public Pair(Point detection, Molecule fit, Molecule ground_truth) {
this.detection = detection;
this.fit = fit;
this.ground_truth = ground_truth;
}
}
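    // Greedily pairs every fit (and the detection it originated from) with the
    // nearest ground-truth molecule by squared distance; a single ground truth
    // may be matched by more than one fit.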
static Vector<Pair> pairFitsAndDetections2GroundTruths(Vector<Point> detections, List<Molecule> fits, Vector<Molecule> ground_truth) {
assertNotNull(fits);
assertNotNull(detections);
assertNotNull(ground_truth);
assertFalse(fits.isEmpty());
assertFalse(detections.isEmpty());
assertFalse(ground_truth.isEmpty());
assertEquals("Number of detections should be the same as number of fits!", detections.size(), fits.size());
Vector<Pair> pairs = new Vector<Pair>();
int best_fit;
double best_dist2, dist2;
for(int i = 0, im = fits.size(); i < im; i++) {
best_fit = 0;
best_dist2 = dist2(fits.get(i), ground_truth.elementAt(best_fit));
for(int j = 1, jm = ground_truth.size(); j < jm; j++) {
dist2 = dist2(fits.get(i), ground_truth.elementAt(j));
if(dist2 < best_dist2) {
best_dist2 = dist2;
best_fit = j;
}
}
pairs.add(new Pair(detections.elementAt(i), fits.get(i), ground_truth.elementAt(best_fit)));
}
return pairs;
}
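// Squared Euclidean distance; the square root is omitted since the values
// are only ever compared against each other.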
static double dist2(Point detection, Molecule ground_truth) {
return sqr(detection.x.doubleValue() - ground_truth.getX()) + sqr(detection.y.doubleValue() - ground_truth.getY());
}
static double dist2(Molecule fit, Molecule ground_truth) {
return sqr(fit.getX() - ground_truth.getX()) + sqr(fit.getY() - ground_truth.getY());
}
}<|fim▁end|> | import cz.cuni.lf1.lge.ThunderSTORM.estimators.PSF.SymmetricGaussianPSF;
import cz.cuni.lf1.lge.ThunderSTORM.filters.CompoundWaveletFilter;
import cz.cuni.lf1.lge.ThunderSTORM.FormulaParser.FormulaParserException; |
<|file_name|>struct.rs<|end_file_name|><|fim▁begin|>// Copyright 2013 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// compile-flags:-Z extra-debug-info
// debugger:set print pretty off
// debugger:break 29
// debugger:run
// debugger:print pair
// check:$1 = {x = 1, y = 2}
// debugger:print pair.x
// check:$2 = 1
// debugger:print pair.y
// check:$3 = 2<|fim▁hole|>
struct Pair {
x: int,
y: int
}
fn main() {
let pair = Pair { x: 1, y: 2 };
let _z = ();
}<|fim▁end|> | |
<|file_name|>AssertionGenerator.js<|end_file_name|><|fim▁begin|>import UnexpectedHtmlLike from 'unexpected-htmllike';
import React from 'react';
import REACT_EVENT_NAMES from '../reactEventNames';
const PENDING_TEST_EVENT_TYPE = { dummy: 'Dummy object to identify a pending event on the test renderer' };
function getDefaultOptions(flags) {
return {
diffWrappers: flags.exactly || flags.withAllWrappers,
diffExtraChildren: flags.exactly || flags.withAllChildren,
diffExtraAttributes: flags.exactly || flags.withAllAttributes,
diffExactClasses: flags.exactly,
diffExtraClasses: flags.exactly || flags.withAllClasses
};
}
/**
*
* @param options {object}
* @param options.ActualAdapter {function} constructor function for the HtmlLike adapter for the `actual` value (usually the renderer)
* @param options.ExpectedAdapter {function} constructor function for the HtmlLike adapter for the `expected` value
* @param options.QueryAdapter {function} constructor function for the HtmlLike adapter for the query value (`queried for` and `on`)
* @param options.actualTypeName {string} name of the unexpected type for the `actual` value
* @param options.expectedTypeName {string} name of the unexpected type for the `expected` value
* @param options.queryTypeName {string} name of the unexpected type for the query value (used in `queried for` and `on`)
* @param options.actualRenderOutputType {string} the unexpected type for the actual output value
* @param options.getRenderOutput {function} called with the actual value, and returns the `actualRenderOutputType` type
* @param options.getDiffInputFromRenderOutput {function} called with the value from `getRenderOutput`, result passed to HtmlLike diff
* @param options.rewrapResult {function} called with the `actual` value (usually the renderer), and the found result
* @param options.wrapResultForReturn {function} called with the `actual` value (usually the renderer), and the found result
* from HtmlLike `contains()` call (usually the same type returned from `getDiffInputFromRenderOutput`. Used to create a
* value that can be passed back to the user as the result of the promise. Used by `queried for` when no further assertion is
* provided, therefore the return value is provided as the result of the promise. If this is not present, `rewrapResult` is used.
* @param options.triggerEvent {function} called the `actual` value (renderer), the optional target (or null) as the result
* from the HtmlLike `contains()` call target, the eventName, and optional eventArgs when provided (undefined otherwise)
* @constructor
*/
function AssertionGenerator(options) {
this._options = Object.assign({}, options);
this._PENDING_EVENT_IDENTIFIER = (options.mainAssertionGenerator && options.mainAssertionGenerator.getEventIdentifier()) ||
{ dummy: options.actualTypeName + 'PendingEventIdentifier' };
this._actualPendingEventTypeName = options.actualTypeName + 'PendingEvent';
}
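// Illustrative sketch only -- the adapter and renderer names below are
// hypothetical, not part of this module; a typical integration wires the
// generator up roughly like this:
//
//   const generator = new AssertionGenerator({
//     ActualAdapter: SomeRendererAdapter,
//     ExpectedAdapter: ReactElementAdapter,
//     QueryAdapter: ReactElementAdapter,
//     actualTypeName: 'SomeRenderer',
//     expectedTypeName: 'ReactElement',
//     queryTypeName: 'ReactElement',
//     actualRenderOutputType: 'SomeRendererOutput',
//     getRenderOutput: renderer => renderer.getRenderOutput(),
//     getDiffInputFromRenderOutput: output => output,
//     rewrapResult: (renderer, found) => found,
//     triggerEvent: (renderer, target, eventName, eventArgs) => { /* dispatch */ }
//   });
//   generator.installInto(expect);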
AssertionGenerator.prototype.getEventIdentifier = function () {
return this._PENDING_EVENT_IDENTIFIER;
};
AssertionGenerator.prototype.installInto = function installInto(expect) {
this._installEqualityAssertions(expect);
this._installQueriedFor(expect);
this._installPendingEventType(expect);
this._installWithEvent(expect);
this._installWithEventOn(expect);
this._installEventHandlerAssertions(expect);
};
AssertionGenerator.prototype.installAlternativeExpected = function (expect) {
this._installEqualityAssertions(expect);
this._installEventHandlerAssertions(expect);
};
AssertionGenerator.prototype._installEqualityAssertions = function (expect) {
const {
actualTypeName, expectedTypeName,
getRenderOutput, actualRenderOutputType,
getDiffInputFromRenderOutput,
ActualAdapter, ExpectedAdapter
} = this._options;
expect.addAssertion([`<${actualTypeName}> to have [exactly] rendered <${expectedTypeName}>`,
`<${actualTypeName}> to have rendered [with all children] [with all wrappers] [with all classes] [with all attributes] <${expectedTypeName}>`],
function (expect, subject, renderOutput) {
var actual = getRenderOutput(subject);
return expect(actual, 'to have [exactly] rendered [with all children] [with all wrappers] [with all classes] [with all attributes]', renderOutput)
.then(() => subject);
});
expect.addAssertion([
`<${actualRenderOutputType}> to have [exactly] rendered <${expectedTypeName}>`,
`<${actualRenderOutputType}> to have rendered [with all children] [with all wrappers] [with all classes] [with all attributes] <${expectedTypeName}>`
], function (expect, subject, renderOutput) {
const exactly = this.flags.exactly;
const withAllChildren = this.flags['with all children'];
const withAllWrappers = this.flags['with all wrappers'];
const withAllClasses = this.flags['with all classes'];
const withAllAttributes = this.flags['with all attributes'];
const actualAdapter = new ActualAdapter();
const expectedAdapter = new ExpectedAdapter();
const testHtmlLike = new UnexpectedHtmlLike(actualAdapter);
if (!exactly) {
expectedAdapter.setOptions({concatTextContent: true});
actualAdapter.setOptions({concatTextContent: true});
}
const options = getDefaultOptions({exactly, withAllWrappers, withAllChildren, withAllClasses, withAllAttributes});
const diffResult = testHtmlLike.diff(expectedAdapter, getDiffInputFromRenderOutput(subject), renderOutput, expect, options);
return testHtmlLike.withResult(diffResult, result => {
if (result.weight !== 0) {
return expect.fail({
diff: function (output, diff, inspect) {
return output.append(testHtmlLike.render(result, output.clone(), diff, inspect));
}
});
}
return result;
});
});
expect.addAssertion([`<${actualTypeName}> [not] to contain [exactly] <${expectedTypeName}|string>`,
`<${actualTypeName}> [not] to contain [with all children] [with all wrappers] [with all classes] [with all attributes] <${expectedTypeName}|string>`], function (expect, subject, renderOutput) {
var actual = getRenderOutput(subject);
return expect(actual, '[not] to contain [exactly] [with all children] [with all wrappers] [with all classes] [with all attributes]', renderOutput);
});
expect.addAssertion([`<${actualRenderOutputType}> [not] to contain [exactly] <${expectedTypeName}|string>`,
`<${actualRenderOutputType}> [not] to contain [with all children] [with all wrappers] [with all classes] [with all attributes] <${expectedTypeName}|string>`], function (expect, subject, expected) {
var not = this.flags.not;
var exactly = this.flags.exactly;
var withAllChildren = this.flags['with all children'];
var withAllWrappers = this.flags['with all wrappers'];
var withAllClasses = this.flags['with all classes'];
var withAllAttributes = this.flags['with all attributes'];
var actualAdapter = new ActualAdapter();
var expectedAdapter = new ExpectedAdapter();
var testHtmlLike = new UnexpectedHtmlLike(actualAdapter);
if (!exactly) {
actualAdapter.setOptions({concatTextContent: true});
expectedAdapter.setOptions({concatTextContent: true});
}
var options = getDefaultOptions({exactly, withAllWrappers, withAllChildren, withAllClasses, withAllAttributes});
const containsResult = testHtmlLike.contains(expectedAdapter, getDiffInputFromRenderOutput(subject), expected, expect, options);
return testHtmlLike.withResult(containsResult, result => {
if (not) {
if (result.found) {
expect.fail({
diff: (output, diff, inspect) => {
return output.error('but found the following match').nl().append(testHtmlLike.render(result.bestMatch, output.clone(), diff, inspect));
}
});
}
return;
}
if (!result.found) {
expect.fail({
diff: function (output, diff, inspect) {
return output.error('the best match was').nl().append(testHtmlLike.render(result.bestMatch, output.clone(), diff, inspect));
}
});
}
});
});
// More generic assertions
expect.addAssertion(`<${actualTypeName}> to equal <${expectedTypeName}>`, function (expect, subject, expected) {
expect(getRenderOutput(subject), 'to equal', expected);
});
expect.addAssertion(`<${actualRenderOutputType}> to equal <${expectedTypeName}>`, function (expect, subject, expected) {
expect(subject, 'to have exactly rendered', expected);
});
expect.addAssertion(`<${actualTypeName}> to satisfy <${expectedTypeName}>`, function (expect, subject, expected) {
expect(getRenderOutput(subject), 'to satisfy', expected);
});
expect.addAssertion(`<${actualRenderOutputType}> to satisfy <${expectedTypeName}>`, function (expect, subject, expected) {
expect(subject, 'to have rendered', expected);
});
};
AssertionGenerator.prototype._installQueriedFor = function (expect) {
const {
actualTypeName, queryTypeName,
getRenderOutput, actualRenderOutputType,
getDiffInputFromRenderOutput, rewrapResult, wrapResultForReturn,
ActualAdapter, QueryAdapter
} = this._options;
expect.addAssertion([`<${actualTypeName}> queried for [exactly] <${queryTypeName}> <assertion?>`,
`<${actualTypeName}> queried for [with all children] [with all wrappers] [with all classes] [with all attributes] <${queryTypeName}> <assertion?>`
], function (expect, subject, query, assertion) {
return expect.apply(expect,
[
getRenderOutput(subject), 'queried for [exactly] [with all children] [with all wrappers] [with all classes] [with all attributes]', query
].concat(Array.prototype.slice.call(arguments, 3)));
});
expect.addAssertion([`<${actualRenderOutputType}> queried for [exactly] <${queryTypeName}> <assertion?>`,
`<${actualRenderOutputType}> queried for [with all children] [with all wrappers] [with all classes] [with all attributes] <${queryTypeName}> <assertion?>`], function (expect, subject, query) {
var exactly = this.flags.exactly;
var withAllChildren = this.flags['with all children'];
var withAllWrappers = this.flags['with all wrappers'];
var withAllClasses = this.flags['with all classes'];
var withAllAttributes = this.flags['with all attributes'];
var actualAdapter = new ActualAdapter();
var queryAdapter = new QueryAdapter();
var testHtmlLike = new UnexpectedHtmlLike(actualAdapter);
if (!exactly) {
actualAdapter.setOptions({concatTextContent: true});
queryAdapter.setOptions({concatTextContent: true});
}
const options = getDefaultOptions({exactly, withAllWrappers, withAllChildren, withAllClasses, withAllAttributes});
options.findTargetAttrib = 'queryTarget';
const containsResult = testHtmlLike.contains(queryAdapter, getDiffInputFromRenderOutput(subject), query, expect, options);
const args = arguments;
return testHtmlLike.withResult(containsResult, function (result) {
if (!result.found) {
expect.fail({
diff: (output, diff, inspect) => {
const resultOutput = output.error('`queried for` found no match.');
if (result.bestMatch) {
resultOutput.error(' The best match was')
.nl()
.append(testHtmlLike.render(result.bestMatch, output.clone(), diff, inspect));
}
return resultOutput;
}
});
}
if (args.length > 3) {
// There is an assertion continuation...
expect.errorMode = 'nested';
return expect.apply(null,
[
rewrapResult(subject, result.bestMatch.target || result.bestMatchItem)
].concat(Array.prototype.slice.call(args, 3)));
}
// There is no assertion continuation, so we need to wrap the result for public consumption
// i.e. create a value that we can give back from the `expect` promise
return expect.shift((wrapResultForReturn || rewrapResult)(subject, result.bestMatch.target || result.bestMatchItem));
});
});
};
AssertionGenerator.prototype._installPendingEventType = function (expect) {
const actualPendingEventTypeName = this._actualPendingEventTypeName;
const PENDING_EVENT_IDENTIFIER = this._PENDING_EVENT_IDENTIFIER;
expect.addType({
name: actualPendingEventTypeName,
base: 'object',
identify(value) {
return value && typeof value === 'object' && value.$$typeof === PENDING_EVENT_IDENTIFIER;
},
inspect(value, depth, output, inspect) {
return output.append(inspect(value.renderer)).red(' with pending event \'').cyan(value.eventName).red('\'');
}
});
};
AssertionGenerator.prototype._installWithEvent = function (expect) {
const { actualTypeName, actualRenderOutputType, triggerEvent, canTriggerEventsOnOutputType } = this._options;
let { wrapResultForReturn = (value) => value } = this._options;
const actualPendingEventTypeName = this._actualPendingEventTypeName;
<|fim▁hole|> expect.addAssertion(`<${actualTypeName}> with event <string> <assertion?>`, function (expect, subject, eventName, ...assertion) {
if (arguments.length > 3) {
return expect.apply(null, [{
$$typeof: PENDING_EVENT_IDENTIFIER,
renderer: subject,
eventName: eventName
}].concat(assertion));
} else {
triggerEvent(subject, null, eventName);
return expect.shift(wrapResultForReturn(subject));
}
});
expect.addAssertion(`<${actualTypeName}> with event (${REACT_EVENT_NAMES.join('|')}) <assertion?>`, function (expect, subject, ...assertion) {
return expect(subject, 'with event', expect.alternations[0], ...assertion);
});
expect.addAssertion(`<${actualTypeName}> with event <string> <object> <assertion?>`, function (expect, subject, eventName, eventArgs) {
if (arguments.length > 4) {
return expect.shift({
$$typeof: PENDING_EVENT_IDENTIFIER,
renderer: subject,
eventName: eventName,
eventArgs: eventArgs
});
} else {
triggerEvent(subject, null, eventName, eventArgs);
return expect.shift(subject);
}
});
expect.addAssertion(`<${actualTypeName}> with event (${REACT_EVENT_NAMES.join('|')}) <object> <assertion?>`, function (expect, subject, eventArgs, ...assertion) {
return expect(subject, 'with event', expect.alternations[0], eventArgs, ...assertion);
});
if (canTriggerEventsOnOutputType) {
expect.addAssertion(`<${actualRenderOutputType}> with event <string> <assertion?>`, function (expect, subject, eventName, ...assertion) {
if (arguments.length > 3) {
return expect.apply(null, [{
$$typeof: PENDING_EVENT_IDENTIFIER,
renderer: subject,
eventName: eventName,
isOutputType: true
}].concat(assertion));
} else {
triggerEvent(subject, null, eventName);
return expect.shift(wrapResultForReturn(subject));
}
});
expect.addAssertion(`<${actualRenderOutputType}> with event (${REACT_EVENT_NAMES.join('|')}) <assertion?>`, function (expect, subject, ...assertion) {
return expect(subject, 'with event', expect.alternations[0], ...assertion);
});
expect.addAssertion(`<${actualRenderOutputType}> with event <string> <object> <assertion?>`, function (expect, subject, eventName, args) {
if (arguments.length > 4) {
return expect.shift({
$$typeof: PENDING_EVENT_IDENTIFIER,
renderer: subject,
eventName: eventName,
eventArgs: args,
isOutputType: true
});
} else {
triggerEvent(subject, null, eventName, args);
return expect.shift(subject);
}
});
expect.addAssertion(`<${actualRenderOutputType}> with event (${REACT_EVENT_NAMES.join('|')}) <object> <assertion?>`, function (expect, subject, eventArgs, ...assertion) {
return expect(subject, 'with event', expect.alternations[0], eventArgs, ...assertion);
});
}
expect.addAssertion(`<${actualPendingEventTypeName}> [and] with event <string> <assertion?>`,
function (expect, subject, eventName) {
triggerEvent(subject.renderer, subject.target, subject.eventName, subject.eventArgs);
if (arguments.length > 3) {
return expect.shift({
$$typeof: PENDING_EVENT_IDENTIFIER,
renderer: subject.renderer,
eventName: eventName
});
} else {
triggerEvent(subject.renderer, null, eventName);
return expect.shift(subject.renderer);
}
});
expect.addAssertion(`<${actualPendingEventTypeName}> [and] with event (${REACT_EVENT_NAMES.join('|')}) <assertion?>`, function (expect, subject, ...assertion) {
return expect(subject, 'with event', expect.alternations[0], ...assertion);
});
expect.addAssertion(`<${actualPendingEventTypeName}> [and] with event <string> <object> <assertion?>`,
function (expect, subject, eventName, eventArgs) {
triggerEvent(subject.renderer, subject.target, subject.eventName, subject.eventArgs);
if (arguments.length > 4) {
return expect.shift({
$$typeof: PENDING_EVENT_IDENTIFIER,
renderer: subject.renderer,
eventName: eventName,
eventArgs: eventArgs
});
} else {
triggerEvent(subject.renderer, null, eventName, eventArgs);
return expect.shift(subject.renderer);
}
});
expect.addAssertion(`<${actualPendingEventTypeName}> [and] with event (${REACT_EVENT_NAMES.join('|')}) <object> <assertion?>`, function (expect, subject, eventArgs, ...assertion) {
return expect(subject, 'with event', expect.alternations[0], eventArgs, ...assertion);
});
};
AssertionGenerator.prototype._installWithEventOn = function (expect) {
const {
actualTypeName, queryTypeName, expectedTypeName,
getRenderOutput,
getDiffInputFromRenderOutput, triggerEvent,
ActualAdapter, QueryAdapter
} = this._options;
const actualPendingEventTypeName = this._actualPendingEventTypeName;
expect.addAssertion(`<${actualPendingEventTypeName}> on [exactly] [with all children] [with all wrappers] [with all classes] [with all attributes]<${queryTypeName}> <assertion?>`,
function (expect, subject, target) {
const actualAdapter = new ActualAdapter({ convertToString: true, concatTextContent: true });
const queryAdapter = new QueryAdapter({ convertToString: true, concatTextContent: true });
const testHtmlLike = new UnexpectedHtmlLike(actualAdapter);
const exactly = this.flags.exactly;
const withAllChildren = this.flags['with all children'];
const withAllWrappers = this.flags['with all wrappers'];
const withAllClasses = this.flags['with all classes'];
const withAllAttributes = this.flags['with all attributes'];
const options = getDefaultOptions({ exactly, withAllWrappers, withAllChildren, withAllClasses, withAllAttributes});
options.findTargetAttrib = 'eventTarget';
const containsResult = testHtmlLike.contains(queryAdapter, getDiffInputFromRenderOutput(getRenderOutput(subject.renderer)), target, expect, options);
return testHtmlLike.withResult(containsResult, result => {
if (!result.found) {
return expect.fail({
diff: function (output, diff, inspect) {
output.error('Could not find the target for the event. ');
if (result.bestMatch) {
output.error('The best match was').nl().nl().append(testHtmlLike.render(result.bestMatch, output.clone(), diff, inspect));
}
return output;
}
});
}
const newSubject = Object.assign({}, subject, {
target: result.bestMatch.target || result.bestMatchItem
});
if (arguments.length > 3) {
return expect.shift(newSubject);
} else {
triggerEvent(newSubject.renderer, newSubject.target, newSubject.eventName, newSubject.eventArgs);
return expect.shift(newSubject.renderer);
}
});
});
expect.addAssertion([`<${actualPendingEventTypeName}> queried for [exactly] <${queryTypeName}> <assertion?>`,
`<${actualPendingEventTypeName}> queried for [with all children] [with all wrappers] [with all classes] [with all attributes] <${queryTypeName}> <assertion?>`],
function (expect, subject, expected) {
triggerEvent(subject.renderer, subject.target, subject.eventName, subject.eventArgs);
return expect.apply(expect,
[subject.renderer, 'queried for [exactly] [with all children] [with all wrappers] [with all classes] [with all attributes]', expected]
.concat(Array.prototype.slice.call(arguments, 3)));
}
);
};
AssertionGenerator.prototype._installEventHandlerAssertions = function (expect) {
const { actualTypeName, expectedTypeName, triggerEvent } = this._options;
const actualPendingEventTypeName = this._actualPendingEventTypeName;
expect.addAssertion([`<${actualPendingEventTypeName}> [not] to contain [exactly] <${expectedTypeName}>`,
`<${actualPendingEventTypeName}> [not] to contain [with all children] [with all wrappers] [with all classes] [with all attributes] <${expectedTypeName}>`],
function (expect, subject, expected) {
triggerEvent(subject.renderer, subject.target, subject.eventName, subject.eventArgs);
return expect(subject.renderer, '[not] to contain [exactly] [with all children] [with all wrappers] [with all classes] [with all attributes]', expected);
});
expect.addAssertion(`<${actualPendingEventTypeName}> to have [exactly] rendered [with all children] [with all wrappers] [with all classes] [with all attributes] <${expectedTypeName}>`,
function (expect, subject, expected) {
triggerEvent(subject.renderer, subject.target, subject.eventName, subject.eventArgs);
return expect(subject.renderer, 'to have [exactly] rendered [with all children] [with all wrappers] [with all classes] [with all attributes]', expected);
});
};
export default AssertionGenerator;<|fim▁end|> | const PENDING_EVENT_IDENTIFIER = this._PENDING_EVENT_IDENTIFIER;
|
<|file_name|>utils.py<|end_file_name|><|fim▁begin|>import os
import csv
def get_value_or_default(value, default=None):
result = value.strip()
if len(result) == 0:
result = default
return result
def read_csv_file(csv_file_name,
delimiter,
quote_char='"', <|fim▁hole|> fd = open(file=csv_file_name, mode='r', encoding=encoding)
csv_reader = csv.reader(fd, delimiter=delimiter, quotechar=quote_char)
if skip_header:
next(csv_reader)
for row in csv_reader:
yield row
fd.close()<|fim▁end|> | skip_header=True,
encoding='latin-1'):
print(csv_file_name) |
<|file_name|>app.py<|end_file_name|><|fim▁begin|>import json
import os.path
import sys
sys.path.append(os.path.join(os.path.dirname(__file__), '../scripts'))
from flask import Flask, render_template, request
from py2neo import neo4j
from ollie import pipeline
app = Flask(__name__)
"""@app.route('/render', method=['POST'])
def render():
pairs = json.loads(request.form['data'])
edges = []
for pair in pairs:
n1 = pair[0]
n2 = pair[2]
rel = pair[1]
edges.append({'source': str(n1), 'target': str(n2), 'type': str(rel)})
return render_template('index4.html', links=edges)
"""
@app.route('/graph', methods=['POST', 'GET'])
def graph():
if request.method == 'POST':
f = request.files['file']
f.save('/tmp/doc.txt')
pairs = pipeline('/tmp/doc.txt')
edges = []
for pair in pairs:
n1 = pair[0]
n2 = pair[2]
rel = pair[1]
edges.append({'source': str(n1), 'target': str(n2), 'type': str(rel)})
#return render_template('graph.html', links=edges)
return json.dumps(edges)
else:
graph_db = neo4j.GraphDatabaseService('http://localhost:7474/db/data/')
relations = graph_db.get_index(neo4j.Relationship, 'relations')
q = relations.query('relation_name:*')
pairs = []
for rel in q:
pairs.append([rel.start_node['name'], rel.type, rel.end_node['name']])
return json.dumps(pairs)
@app.route('/graph/<concept>')
def concept(concept):
graph_db = neo4j.GraphDatabaseService('http://localhost:7474/db/data/')
relations = graph_db.get_index(neo4j.Node, 'concepts')
q = relations.query('concept_name:%s' % concept)
pairs = []
try:
concept = q.next()
except:
return json.dumps(pairs)
rels = concept.match()
for rel in rels:
pairs.append([rel.start_node['name'], rel.type, rel.end_node['name']])
return json.dumps(pairs)
@app.route('/search/<query>')
def search(query):
graph_db = neo4j.GraphDatabaseService('http://localhost:7474/db/data/')
concepts = graph_db.get_index(neo4j.Node, 'concepts')
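# Build a Lucene wildcard query, e.g. "machine learning" -> "*machine*learning*"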
query = '*' + '*'.join(query.strip().split(' ')) + '*'
print query
q = concepts.query('concept_name:%s' % str(query))
pairs = []
try:
concept = q.next()
except:
return json.dumps(pairs)
rels = concept.match()
for rel in rels:
pairs.append([rel.start_node['name'], rel.type, rel.end_node['name']])
return json.dumps(pairs)
@app.route('/graphical/<concepts>')
def graphical(concepts):
graph_db = neo4j.GraphDatabaseService('http://localhost:7474/db/data/')
relations = graph_db.get_index(neo4j.Node, 'concepts')
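# Build an OR'd exact-match query, e.g. "cat,dog" -> '"cat" OR "dog"'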
query = '"' + '" OR "'.join(concepts.split(',')) + '"'
q = relations.query('concept_name:(%s)' % str(query))
pairs = []
rels = []
concept = None
while True:
try:
concept = q.next()
except:
break
rels += concept.match()
if not concept:
return json.dumps(pairs)
nodes = {}
edges = []
for rel in rels:
n1 = rel.start_node['name']
n2 = rel.end_node['name']
if str(n1) not in nodes:
nodes[str(n1)] = {"radius":10.0, "weight":1.00, "centrality":0.00, "fill":"rgba(0,127,255,0.70)", "stroke":"rgba(0,0,0,0.80)"}
if str(n2) not in nodes:
nodes[str(n2)] = {"radius":10.0, "weight":1.00, "centrality":0.00, "fill":"rgba(0,127,255,0.70)", "stroke":"rgba(0,0,0,0.80)"}
nodes[str(rel.type)] = {"radius":10.0, "weight":1.00, "centrality":0.00, "fill":"rgba(0,127,255,0.70)", "stroke":"rgba(0,0,0,0.80)"}
#edges.append([str(n1), str(rel.type), {"length":50.00, "stroke":"rgba(135,234,135,1.00)"}])
#edges.append([str(rel.type), str(n2), {"length":50.00, "stroke":"rgba(135,234,135,1.00)"}])
edges.append({'source': str(n1), 'target': str(n2), 'type': str(rel.type)})
return render_template('graph.html', links=edges, nodes=nodes)
@app.route('/')
def home():
return render_template('new.html')
@app.route('/browse')
def browse():
concepts = []
graph_db = neo4j.GraphDatabaseService('http://localhost:7474/db/data/')
relations = graph_db.get_index(neo4j.Node, 'concepts')
q = relations.query('concept_name:*')
while True:
try:
concept = q.next()
concepts.append(str(concept['name']))
except:
break
return render_template('browse.html', concepts=concepts)
app.debug = True<|fim▁hole|><|fim▁end|> | app.run() |
<|file_name|>AzeriteArmor.ts<|end_file_name|><|fim▁begin|>import { OvaleDebug } from "./Debug";
import { Ovale } from "./Ovale";
import aceEvent from "@wowts/ace_event-3.0";
import { LuaObj, LuaArray, wipe, pairs, tostring, lualength, ipairs } from "@wowts/lua";
import { sort, insert, concat } from "@wowts/table";
import { C_Item, ItemLocation, C_AzeriteEmpoweredItem, GetSpellInfo, ItemLocationMixin } from "@wowts/wow-mock";
import { OvaleEquipment } from "./Equipment";
let tsort = sort;
let tinsert = insert;
let tconcat = concat;
let item = C_Item;
let itemLocation = ItemLocation;
let azeriteItem = C_AzeriteEmpoweredItem;
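// Inventory slot IDs that can hold azerite-empowered gear:
// 1 = head, 3 = shoulder, 5 = chest.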
let azeriteSlots: LuaArray<boolean> = {
[1]: true,
[3]: true,
[5]: true
}
interface Trait {
name?: string;
spellID: number;
rank: number
}
let OvaleAzeriteArmorBase = OvaleDebug.RegisterDebugging(Ovale.NewModule("OvaleAzerite", aceEvent));
class OvaleAzeriteArmor extends OvaleAzeriteArmorBase {
self_traits: LuaObj<Trait> = {}
output: LuaArray<string> = {}
debugOptions = {
azeraittraits: {
name: "Azerite traits",
type: "group",
args: {
azeraittraits: {
name: "Azerite traits",
type: "input",
multiline: 25,
width: "full",
get: (info: LuaArray<string>) => {
return this.DebugTraits();
}
}
}
}
}
constructor() {
super();
for (const [k, v] of pairs(this.debugOptions)) {
OvaleDebug.options.args[k] = v;
}
}
OnInitialize() {<|fim▁hole|> }
OnDisable() {
this.UnregisterMessage("Ovale_EquipmentChanged")
this.UnregisterEvent("AZERITE_EMPOWERED_ITEM_SELECTION_UPDATED")
this.UnregisterEvent("PLAYER_ENTERING_WORLD")
}
ItemChanged(){
let slotId = OvaleEquipment.lastChangedSlot;
if(slotId != undefined && azeriteSlots[slotId]){
this.UpdateTraits()
}
}
AZERITE_EMPOWERED_ITEM_SELECTION_UPDATED(event: string, itemSlot: ItemLocationMixin){
this.UpdateTraits()
}
PLAYER_ENTERING_WORLD(event:string){
this.UpdateTraits()
}
UpdateTraits() {
this.self_traits = {}
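// Scan every azerite-capable slot; if the same power is selected on
// several items, its rank is incremented once per item.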
for(const [slotId,] of pairs(azeriteSlots)){
let itemSlot = itemLocation.CreateFromEquipmentSlot(slotId)
if(item.DoesItemExist(itemSlot) && azeriteItem.IsAzeriteEmpoweredItem(itemSlot)){
let allTraits = azeriteItem.GetAllTierInfo(itemSlot)
for(const [,traitsInRow] of pairs(allTraits)){
for(const [,powerId] of pairs(traitsInRow.azeritePowerIDs)){
let isEnabled = azeriteItem.IsPowerSelected(itemSlot, powerId);
if(isEnabled){
let powerInfo = azeriteItem.GetPowerInfo(powerId)
let [name] = GetSpellInfo(powerInfo.spellID);
if(this.self_traits[powerInfo.spellID]){
let rank = this.self_traits[powerInfo.spellID].rank
this.self_traits[powerInfo.spellID].rank = rank + 1
}else{
this.self_traits[powerInfo.spellID] = {
spellID: powerInfo.spellID,
name: name,
rank: 1
};
}
break
}
}
}
}
}
}
HasTrait(spellId: number) {
return (this.self_traits[spellId]) && true || false;
}
TraitRank(spellId: number) {
if (!this.self_traits[spellId]) {
return 0;
}
return this.self_traits[spellId].rank;
}
DebugTraits(){
wipe(this.output);
let array: LuaArray<string> = {}
for (const [k, v] of pairs(this.self_traits)) {
tinsert(array, `${tostring(v.name)}: ${tostring(k)} (${v.rank})`);
}
tsort(array);
for (const [, v] of ipairs(array)) {
this.output[lualength(this.output) + 1] = v;
}
return tconcat(this.output, "\n");
}
}
export const OvaleAzerite = new OvaleAzeriteArmor();<|fim▁end|> | this.RegisterMessage("Ovale_EquipmentChanged", "ItemChanged")
this.RegisterEvent("AZERITE_EMPOWERED_ITEM_SELECTION_UPDATED")
this.RegisterEvent("PLAYER_ENTERING_WORLD") |
<|file_name|>mlaunch.py<|end_file_name|><|fim▁begin|>#!/usr/bin/env python3
import argparse
import functools
import json
import os
import re
import signal
import socket
import ssl
import subprocess
import sys
import threading
import time
import warnings
from collections import defaultdict
from operator import itemgetter
import psutil
from mtools.util import OrderedDict
from mtools.util.cmdlinetool import BaseCmdLineTool
from mtools.util.print_table import print_table
from mtools.version import __version__
try:
import Queue
except ImportError:
import queue as Queue
try:
from pymongo import MongoClient as Connection
from pymongo import version_tuple as pymongo_version
from bson import SON
from io import BytesIO
from distutils.version import LooseVersion
from pymongo.errors import ConnectionFailure, AutoReconnect
from pymongo.errors import OperationFailure, ConfigurationError
except ImportError as e:
raise ImportError("Can't import pymongo. See "
"https://api.mongodb.com/python/current/ for "
"instructions on how to install pymongo: " + str(e))
class MongoConnection(Connection):
"""
MongoConnection class.
Wrapper around Connection (itself conditionally a MongoClient or
pymongo.Connection) to specify timeout and directConnection.
"""
def __init__(self, *args, **kwargs):
kwargs.setdefault('directConnection', True)
kwargs.setdefault('serverSelectionTimeoutMS', 1)
# Set client application name for MongoDB 3.4+ servers
kwargs['appName'] = f'''mlaunch v{__version__}'''
Connection.__init__(self, *args, **kwargs)
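# Illustrative sketch: a direct, fast-failing connection to a single node.
#
#   con = MongoConnection('localhost:27017')
#   con.admin.command('ping')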
def wait_for_host(port, interval=1, timeout=30, to_start=True, queue=None,
ssl_pymongo_options=None, tls_pymongo_options=None):
"""
Ping server and wait for response.
Ping a mongod or mongos every `interval` seconds until it responds, or
`timeout` seconds have passed. If `to_start` is set to False, will wait for
the node to shut down instead. This function can be called as a separate
thread.
If queue is provided, it will place the results in the message queue and
return, otherwise it will just return the result directly.
"""
host = 'localhost:%i' % port
start_time = time.time()
while True:
if (time.time() - start_time) > timeout:
if queue:
queue.put_nowait((port, False))
return False
try:
# make connection and ping host
con = MongoConnection(host,
**(ssl_pymongo_options or {}),
**(tls_pymongo_options or {}))
con.admin.command('ping')
if to_start:
if queue:
queue.put_nowait((port, True))
return True
else:
time.sleep(interval)
except Exception:
if to_start:
time.sleep(interval)
else:
if queue:
queue.put_nowait((port, True))
return True
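# Illustrative sketch: the queue contract above allows several ports to be
# polled in parallel (port numbers here are arbitrary examples):
#
#   q = Queue.Queue()
#   for p in (27017, 27018, 27019):
#       threading.Thread(target=wait_for_host, args=(p,),
#                        kwargs={'queue': q}).start()
#   results = [q.get() for _ in range(3)]   # list of (port, success) tuples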
def shutdown_host(port, username=None, password=None, authdb=None):
"""
Send the shutdown command to a mongod or mongos on given port.
This function can be called as a separate thread.
"""
host = 'localhost:%i' % port
try:
if username and password and authdb:
if authdb != "admin":
raise RuntimeError("given username/password is not for "
"admin database")
mc = MongoConnection(host, username=username, password=password)
else:
mc = MongoConnection(host)
try:
mc.admin.command('shutdown', force=True)
except AutoReconnect:
pass
except OperationFailure:
print("Error: cannot authenticate to shut down %s." % host)
return
except ConnectionFailure:
pass
else:
mc.close()
@functools.lru_cache()
def check_mongo_server_output(binary, argument):
"""Call mongo[d|s] with arguments such as --help or --version.
This is used only to check the server's output. We expect the server to
exit immediately.
"""
try:
proc = subprocess.Popen(['%s' % binary, argument],
stderr=subprocess.STDOUT,
stdout=subprocess.PIPE, shell=False)
except OSError as exc:
print('Failed to launch %s' % binary)
raise exc
out, err = proc.communicate()
if proc.returncode:
raise OSError(out or err)
return out
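# Illustrative sketch: check_mongo_server_output('mongod', '--version')
# returns the raw bytes of the version banner; results are memoized per
# (binary, argument) pair by functools.lru_cache.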
class MLaunchTool(BaseCmdLineTool):
UNDOCUMENTED_MONGOD_ARGS = ['--nopreallocj', '--wiredTigerEngineConfigString']
UNSUPPORTED_MONGOS_ARGS = ['--wiredTigerCacheSizeGB', '--storageEngine']
UNSUPPORTED_CONFIG_ARGS = ['--oplogSize', '--storageEngine', '--smallfiles', '--nojournal']
def __init__(self, test=False):
BaseCmdLineTool.__init__(self)
# arguments
self.args = None
# startup parameters for each port
self.startup_info = {}
# data structures for the discovery feature
self.cluster_tree = {}
self.cluster_tags = defaultdict(list)
self.cluster_running = {}
# memoize ignored arguments passed to different binaries
self.ignored_arguments = {}
# config docs for replica sets (key is replica set name)
self.config_docs = {}
# shard connection strings
self.shard_connection_str = []
# ssl configuration to start mongod or mongos, or create a MongoClient
self.ssl_server_args = ''
self.ssl_pymongo_options = {}
# tls configuration to start mongod or mongos, or create a MongoClient
self.tls_server_args = ''
self.tls_pymongo_options = {}
# indicate if running in testing mode
self.test = test
# version of MongoDB server
self.current_version = self.getMongoDVersion()
def run(self, arguments=None):
"""
Main run method.
Called for all sub-commands and parameters. It sets up argument
parsing, then calls the sub-command method with the same name.
"""
# set up argument parsing in run, so that subsequent calls
# to run can call different sub-commands
self.argparser = argparse.ArgumentParser()
self.argparser.add_argument('--version', action='version',
version=f'''mtools version {__version__} || Python {sys.version}''')
self.argparser.add_argument('--no-progressbar', action='store_true',
default=False,
help='disables progress bar')
self.argparser.description = ('script to launch MongoDB stand-alone '
'servers, replica sets and shards.')
# make sure init is default command even when specifying
# arguments directly
if arguments and arguments.startswith('-'):
arguments = 'init ' + arguments
# default sub-command is `init` if none provided
elif (len(sys.argv) > 1 and sys.argv[1].startswith('-') and
sys.argv[1] not in ['-h', '--help', '--version']):
sys.argv = sys.argv[0:1] + ['init'] + sys.argv[1:]
# create command sub-parsers
subparsers = self.argparser.add_subparsers(dest='command')
self.argparser._action_groups[0].title = 'commands'
self.argparser._action_groups[0].description = \
('init is the default command and can be omitted. To get help on '
'individual commands, run mlaunch <command> --help. Command line '
'arguments which are not handled by mlaunch will be passed '
'through to mongod/mongos if those options are listed in the '
'--help output for the current binary. For example: '
'--storageEngine, --logappend, or --config.')
# init command
helptext = ('initialize a new MongoDB environment and start '
'stand-alone instances, replica sets, or sharded '
'clusters.')
desc = ('Initialize a new MongoDB environment and start stand-alone '
'instances, replica sets, or sharded clusters. Command line '
'arguments which are not handled by mlaunch will be passed '
'through to mongod/mongos if those options are listed in the '
'--help output for the current binary. For example: '
'--storageEngine, --logappend, or --config.')
init_parser = subparsers.add_parser('init', help=helptext,
description=desc)
# either single or replica set
me_group = init_parser.add_mutually_exclusive_group(required=True)
me_group.add_argument('--single', action='store_true',
help=('creates a single stand-alone mongod '
'instance'))
me_group.add_argument('--replicaset', action='store_true',
help=('creates replica set with several mongod '
'instances'))
# replica set arguments
init_parser.add_argument('--nodes', action='store', metavar='NUM',
type=int, default=3,
help=('adds NUM data nodes to replica set '
'(requires --replicaset, default=3)'))
init_parser.add_argument('--arbiter', action='store_true',
default=False,
help=('adds arbiter to replica set '
'(requires --replicaset)'))
init_parser.add_argument('--name', action='store', metavar='NAME',
default='replset',
help='name for replica set (default=replset)')
init_parser.add_argument('--priority', action='store_true',
default=False,
help='make lowest-port member primary')
# sharded clusters
init_parser.add_argument('--sharded', '--shards', action='store',
nargs='+', metavar='N',
help=('creates a sharded setup consisting of '
'several singles or replica sets. '
'Provide either list of shard names or '
'number of shards.'))
init_parser.add_argument('--config', action='store', default=-1,
type=int, metavar='NUM',
help=('adds NUM config servers to sharded '
'setup (requires --sharded, default=1, '
'with --csrs default=3)'))
init_parser.add_argument('--csrs', default=False, action='store_true',
help=('deploy config servers as a replica '
'set (requires MongoDB >= 3.2.0)'))
init_parser.add_argument('--mongos', action='store', default=1,
type=int, metavar='NUM',
help=('starts NUM mongos processes (requires '
'--sharded, default=1)'))
# verbose, port, binary path
init_parser.add_argument('--verbose', action='store_true',
default=False,
help='outputs more verbose information.')
init_parser.add_argument('--port', action='store', type=int,
default=27017,
help=('port for mongod, start of port range '
'in case of replica set or shards '
'(default=27017)'))
init_parser.add_argument('--binarypath', action='store', default=None,
metavar='PATH',
help=('search for mongod/s binaries in the '
'specified PATH.'))
init_parser.add_argument('--dir', action='store', default='./data',
help=('base directory to create db and log '
'paths (default=./data/)'))
init_parser.add_argument('--hostname', action='store',
default='localhost',
help=('override hostname for replica set '
'configuration'))
# authentication, users, roles
self._default_auth_roles = ['dbAdminAnyDatabase',
'readWriteAnyDatabase',
'userAdminAnyDatabase',
'clusterAdmin']
init_parser.add_argument('--auth', action='store_true', default=False,
help=('enable authentication and create a '
'key file and admin user '
'(default=user/password)'))
init_parser.add_argument('--username', action='store', type=str,
default='user',
help=('username to add (requires --auth, '
'default=user)'))
init_parser.add_argument('--password', action='store', type=str,
default='password',
help=('password for given username (requires '
'--auth, default=password)'))
init_parser.add_argument('--auth-db', action='store', type=str,
default='admin', metavar='DB',
help=('database where user will be added '
'(requires --auth, default=admin)'))
init_parser.add_argument('--auth-roles', action='store',
default=self._default_auth_roles,
metavar='ROLE', nargs='*',
help=("admin user's privilege roles; note "
"that the clusterAdmin role is "
'required to run the stop command '
'(requires --auth, default="%s")'
% ' '.join(self._default_auth_roles)))
init_parser.add_argument('--auth-role-docs', action='store_true',
default=False,
help='auth-roles are JSON documents')
init_parser.add_argument('--no-initial-user', action='store_false',
default=True, dest='initial-user',
help=('Do not create an initial user if auth '
'is enabled'))
def is_file(arg):
if not os.path.exists(os.path.expanduser(arg)):
init_parser.error("The file [%s] does not exist" % arg)
return arg
# MongoDB 4.2 adds TLS options to replace the corresponding SSL options
# https://docs.mongodb.com/manual/release-notes/4.2/#new-tls-options
if (LooseVersion(self.current_version) >= LooseVersion("4.2.0")):
# tls
tls_args = init_parser.add_argument_group('TLS options')
tls_args.add_argument('--tlsCAFile',
help='Certificate Authority file for TLS',
type=is_file)
tls_args.add_argument('--tlsCRLFile',
help='Certificate Revocation List file for TLS',
type=is_file)
tls_args.add_argument('--tlsAllowInvalidHostnames',
action='store_true',
help=('allow client and server certificates to '
'provide non-matching hostnames'))
tls_args.add_argument('--tlsAllowInvalidCertificates',
action='store_true',
help=('allow client or server connections with '
'invalid certificates'))
tls_server_args = init_parser.add_argument_group('Server TLS options')
tls_server_args.add_argument('--tlsMode',
help='set the TLS operation mode',
choices=('disabled allowTLS preferTLS '
'requireTLS'.split()))
tls_server_args.add_argument('--tlsCertificateKeyFile',
help='PEM file for TLS', type=is_file)
tls_server_args.add_argument('--tlsCertificateKeyFilePassword',
help='PEM file password')
tls_server_args.add_argument('--tlsClusterFile',
help=('key file for internal TLS '
'authentication'), type=is_file)
tls_server_args.add_argument('--tlsClusterPassword',
help=('internal authentication key '
'file password'))
tls_server_args.add_argument('--tlsDisabledProtocols',
help=('comma separated list of TLS '
'protocols to disable '
'[TLS1_0,TLS1_1,TLS1_2]'))
tls_server_args.add_argument('--tlsAllowConnectionsWithoutCertificates',
action='store_true',
help=('allow client to connect without '
'presenting a certificate'))
tls_server_args.add_argument('--tlsFIPSMode', action='store_true',
help='activate FIPS 140-2 mode')
tls_client_args = init_parser.add_argument_group('Client TLS options')
tls_client_args.add_argument('--tlsClientCertificate',
help='client certificate file for TLS',
type=is_file)
tls_client_args.add_argument('--tlsClientCertificateKeyFile',
help='client certificate key file for TLS',
type=is_file)
tls_client_args.add_argument('--tlsClientCertificateKeyFilePassword',
help='client certificate key file password')
self.tls_args = tls_args
self.tls_client_args = tls_client_args
self.tls_server_args = tls_server_args
else:
# ssl
ssl_args = init_parser.add_argument_group('TLS/SSL options')
ssl_args.add_argument('--sslCAFile',
help='Certificate Authority file for TLS/SSL',
type=is_file)
ssl_args.add_argument('--sslCRLFile',
help='Certificate Revocation List file for TLS/SSL',
type=is_file)
ssl_args.add_argument('--sslAllowInvalidHostnames',
action='store_true',
help=('allow client and server certificates to '
'provide non-matching hostnames'))
ssl_args.add_argument('--sslAllowInvalidCertificates',
action='store_true',
help=('allow client or server connections with '
'invalid certificates'))
ssl_server_args = init_parser.add_argument_group('Server TLS/SSL options')
ssl_server_args.add_argument('--sslMode',
help='set the TLS/SSL operation mode',
choices=('disabled allowSSL preferSSL '
'requireSSL'.split()))
ssl_server_args.add_argument('--sslPEMKeyFile',
help='PEM file for TLS/SSL', type=is_file)
ssl_server_args.add_argument('--sslPEMKeyPassword',
help='PEM file password')
ssl_server_args.add_argument('--sslClusterFile',
help=('key file for internal TLS/SSL '
'authentication'), type=is_file)
ssl_server_args.add_argument('--sslClusterPassword',
help=('internal authentication key '
'file password'))
ssl_server_args.add_argument('--sslDisabledProtocols',
help=('comma separated list of TLS '
'protocols to disable '
'[TLS1_0,TLS1_1,TLS1_2]'))
ssl_server_args.add_argument('--sslAllowConnectionsWithoutCertificates',
action='store_true',
help=('allow client to connect without '
'presenting a certificate'))
ssl_server_args.add_argument('--sslFIPSMode', action='store_true',
help='activate FIPS 140-2 mode')
ssl_client_args = init_parser.add_argument_group('Client TLS/SSL options')
ssl_client_args.add_argument('--sslClientCertificate',
help='client certificate file for TLS/SSL',
type=is_file)
ssl_client_args.add_argument('--sslClientPEMKeyFile',
help='client PEM file for TLS/SSL',
type=is_file)
ssl_client_args.add_argument('--sslClientPEMKeyPassword',
help='client PEM file password')
self.ssl_args = ssl_args
self.ssl_client_args = ssl_client_args
self.ssl_server_args = ssl_server_args
# start command
start_parser = subparsers.add_parser('start',
help=('starts existing MongoDB '
'instances. Example: '
'"mlaunch start config" '
'will start all config '
'servers.'),
description=('starts existing '
'MongoDB instances. '
'Example: "mlaunch '
'start config" will '
'start all config '
'servers.'))
start_parser.add_argument('tags', metavar='TAG', action='store',
nargs='*', default=[],
help=('without tags, all non-running nodes '
'will be restarted. Provide '
'additional tags to narrow down the '
'set of nodes to start.'))
start_parser.add_argument('--verbose', action='store_true',
default=False,
help='outputs more verbose information.')
start_parser.add_argument('--dir', action='store', default='./data',
help=('base directory to start nodes '
'(default=./data/)'))
start_parser.add_argument('--binarypath', action='store',
default=None, metavar='PATH',
help=('search for mongod/s binaries in the '
'specified PATH.'))
# stop command
helptext = ('stops running MongoDB instances. Example: "mlaunch stop '
'shard 2 secondary" will stop all secondary nodes '
'of shard 2.')
desc = ('stops running MongoDB instances with the shutdown command. '
'Example: "mlaunch stop shard 2 secondary" will stop all '
'secondary nodes of shard 2.')
stop_parser = subparsers.add_parser('stop',
help=helptext,
description=desc)
helptext = ('without tags, all running nodes will be stopped. '
'Provide additional tags to narrow down the set of '
'nodes to stop.')
stop_parser.add_argument('tags', metavar='TAG', action='store',
nargs='*', default=[], help=helptext)
stop_parser.add_argument('--verbose', action='store_true',
default=False,
help='outputs more verbose information.')
stop_parser.add_argument('--dir', action='store', default='./data',
help=('base directory to stop nodes '
'(default=./data/)'))
# restart command
desc = ('stops running MongoDB instances with the shutdown command. '
'Then restarts the stopped instances.')
restart_parser = subparsers.add_parser('restart',
help=('stops, then restarts '
'MongoDB instances.'),
description=desc)
restart_parser.add_argument('tags', metavar='TAG', action='store',
nargs='*', default=[],
help=('without tags, all non-running '
'nodes will be restarted. Provide '
'additional tags to narrow down the '
'set of nodes to start.'))
restart_parser.add_argument('--verbose', action='store_true',
default=False,
help='outputs more verbose information.')
restart_parser.add_argument('--dir', action='store', default='./data',
help=('base directory to restart nodes '
'(default=./data/)'))
restart_parser.add_argument('--binarypath', action='store',
default=None, metavar='PATH',
help=('search for mongod/s binaries in '
'the specified PATH.'))
# list command
list_parser = subparsers.add_parser('list',
help=('list MongoDB instances of '
'this environment.'),
description=('list MongoDB '
'instances of this '
'environment.'))
list_parser.add_argument('--dir', action='store', default='./data',
help=('base directory to list nodes '
'(default=./data/)'))
list_parser.add_argument('--json', action='store_true', default=False,
help='output in JSON format')
list_parser.add_argument('--tags', action='store_true', default=False,
help=('outputs the tags for each instance. '
'Tags can be used to target instances '
'for start/stop/kill.'))
list_parser.add_argument('--startup', action='store_true',
default=False,
help=('outputs the startup command lines for '
'each instance.'))
list_parser.add_argument('--verbose', action='store_true',
default=False, help='alias for --tags.')
# list command
helptext = ('kills (or sends another signal to) MongoDB instances '
'of this environment.')
desc = ('kills (or sends another signal to) MongoDB instances '
'of this environment.')
kill_parser = subparsers.add_parser('kill', help=helptext,
description=desc)
kill_parser.add_argument('tags', metavar='TAG', action='store',
nargs='*', default=[],
help=('without tags, all running nodes will '
'be killed. Provide additional tags to '
'narrow down the set of nodes to '
'kill.'))
kill_parser.add_argument('--dir', action='store', default='./data',
help=('base directory to kill nodes '
'(default=./data/)'))
kill_parser.add_argument('--signal', action='store', default=15,
help=('signal to send to processes, '
'default=15 (SIGTERM)'))
kill_parser.add_argument('--verbose', action='store_true',
default=False,
help='outputs more verbose information.')
# argparser is set up, now call base class run()
BaseCmdLineTool.run(self, arguments, get_unknowns=True)
# conditions on argument combinations
if (self.args['command'] == 'init' and
'single' in self.args and self.args['single']):
if self.args['arbiter']:
self.argparser.error("can't specify --arbiter for "
"single nodes.")
# replace path with absolute path, but store relative path as well
if ('dir' in self.args and self.args['dir']):
self.relative_dir = self.args['dir']
self.dir = os.path.abspath(self.args['dir'])
self.args['dir'] = self.dir
if (self.args['command'] is None):
self.argparser.print_help()
self.argparser.exit()
else:
# branch out in sub-commands
getattr(self, self.args['command'])()
# -- below are the main commands: init, start, stop, list, kill
def init(self):
"""
Sub-command init.
Branches out to sharded, replicaset or single node methods.
"""
# check for existing environment. Only allow subsequent
# 'mlaunch init' if they are identical.
if self._load_parameters():
if self.loaded_args != self.args:
raise SystemExit('A different environment already exists '
'at %s.' % self.dir)
first_init = False
else:
first_init = True
self.ssl_pymongo_options = self._get_ssl_pymongo_options(self.args)
self.tls_pymongo_options = self._get_tls_pymongo_options(self.args)
# Only one of the SSL/TLS argument groups is registered (depending on
# the detected server version), so use .get() to avoid KeyErrors.
if (self._get_ssl_server_args() and not
self.args.get('sslAllowConnectionsWithoutCertificates') and not
self.args.get('sslClientCertificate') and not
self.args.get('sslClientPEMKeyFile')):
sys.stderr.write('warning: server requires certificates but no'
' --sslClientCertificate provided\n')
if (self._get_tls_server_args() and not
self.args.get('tlsAllowConnectionsWithoutCertificates') and not
self.args.get('tlsClientCertificate') and not
self.args.get('tlsClientCertificateKeyFile')):
sys.stderr.write('warning: server requires certificates but no'
' --tlsClientCertificate provided\n')
# number of default config servers
if self.args['config'] == -1:
self.args['config'] = 1
# Exit with error if --csrs is set and MongoDB < 3.1.0
if (self.args['csrs'] and
LooseVersion(self.current_version) < LooseVersion("3.1.0") and
LooseVersion(self.current_version) != LooseVersion("0.0.0")):
errmsg = (" \n * The '--csrs' option requires MongoDB version "
"3.2.0 or greater, the current version is %s.\n"
% self.current_version)
raise SystemExit(errmsg)
# add the 'csrs' parameter as default for MongoDB >= 3.3.0
if (LooseVersion(self.current_version) >= LooseVersion("3.3.0") or
LooseVersion(self.current_version) == LooseVersion("0.0.0")):
self.args['csrs'] = True
# construct startup strings
self._construct_cmdlines()
# write out parameters
if self.args['verbose']:
print("writing .mlaunch_startup file.")
self._store_parameters()
# exit if running in testing mode
if self.test:
return
# check if authentication is enabled, make key file
if self.args['auth'] and first_init:
if not os.path.exists(self.dir):
os.makedirs(self.dir)
if '--keyFile' in self.unknown_args:
# Check if keyfile is readable
keyfile = None
keyfile_path = None
try:
keyfile_idx = self.unknown_args.index('--keyFile') + 1
keyfile_path = self.unknown_args[keyfile_idx]
keyfile = self._read_key_file(keyfile_path)
except Exception:
print(f'\n WARNING: Specified keyFile does not appear readable: {keyfile_path}\n')
else:
keyfile = os.path.join(self.dir, "keyfile")
print(f'Generating keyfile: {keyfile}')
os.system('openssl rand -base64 753 > "%s"' % keyfile)
if os.name != 'nt':
os.system(f'chmod 600 "{keyfile}"')
# if not all ports are free, complain and suggest alternatives.
all_ports = self.get_tagged(['all'])
ports_avail = self.wait_for(all_ports, 1, 1, to_start=False)
if not all(map(itemgetter(1), ports_avail)):
dir_addon = (' --dir %s' % self.relative_dir
if self.relative_dir != './data' else '')
errmsg = ('\nThe following ports are not available: %s\n\n'
% ', '.join([str(p[0])
for p in ports_avail if not p[1]]))
errmsg += (" * If you want to restart nodes from this "
"environment, use 'mlaunch start%s' instead.\n"
% dir_addon)
errmsg += (" * If the ports are used by a different mlaunch "
"environment, stop those first with 'mlaunch stop "
"--dir <env>'.\n")
errmsg += (" * You can also specify a different port range with "
"an additional '--port <startport>'\n")
raise SystemExit(errmsg)
if self.args['sharded']:
shard_names = self._get_shard_names(self.args)
# start mongod (shard and config) nodes and wait
nodes = self.get_tagged(['mongod', 'down'])
self._start_on_ports(nodes, wait=True, override_auth=True)
# initiate replica sets if init is called for the first time
if first_init:
if self.args['csrs']:
# Initiate config servers in a replicaset
if self.args['verbose']:
print('Initiating config server replica set.')
members = sorted(self.get_tagged(["config"]))
self._initiate_replset(members[0], "configRepl")
for shard in shard_names:
# initiate replica set on first member
if self.args['verbose']:
print('Initiating shard replica set %s.' % shard)
members = sorted(self.get_tagged([shard]))
self._initiate_replset(members[0], shard)
# add mongos
mongos = sorted(self.get_tagged(['mongos', 'down']))
self._start_on_ports(mongos, wait=True, override_auth=True)
if first_init:
# add shards
mongos = sorted(self.get_tagged(['mongos']))
con = self.client('localhost:%i' % mongos[0])
shards_to_add = len(self.shard_connection_str)
nshards = con['config']['shards'].count_documents({})
if nshards < shards_to_add:
if self.args['replicaset']:
print("adding shards. can take up to 30 seconds...")
else:
print("adding shards.")
shard_conns_and_names = list(zip(self.shard_connection_str,
shard_names))
while True:
try:
nshards = con['config']['shards'].count_documents({})
except Exception:
nshards = 0
if nshards >= shards_to_add:
break
for conn_str, name in shard_conns_and_names:
try:
res = con['admin'].command(SON([('addShard',
conn_str),
('name', name)]))
except Exception as e:
if self.args['verbose']:
print('%s will retry in a moment.' % e)
continue
if res['ok']:
if self.args['verbose']:
print("shard %s added successfully" % conn_str)
shard_conns_and_names.remove((conn_str, name))
break
else:
if self.args['verbose']:
print('%s - will retry' % res)
time.sleep(1)
elif self.args['single']:
# just start node
nodes = self.get_tagged(['single', 'down'])
self._start_on_ports(nodes, wait=False)
elif self.args['replicaset']:
# start nodes and wait
nodes = sorted(self.get_tagged(['mongod', 'down']))
self._start_on_ports(nodes, wait=True)
# initiate replica set
if first_init:
self._initiate_replset(nodes[0], self.args['name'])
# wait for all nodes to be running
nodes = self.get_tagged(['all'])
self.wait_for(nodes)
# now that nodes are running, add admin user if authentication enabled
if self.args['auth'] and self.args['initial-user'] and first_init:
self.discover()
nodes = []
if self.args['sharded']:
nodes = self.get_tagged(['mongos', 'running'])
elif self.args['single']:
nodes = self.get_tagged(['single', 'running'])
elif self.args['replicaset']:
print("waiting for primary to add a user.")
if self._wait_for_primary():
nodes = self.get_tagged(['primary', 'running'])
else:
raise RuntimeError("failed to find a primary, so adding "
"admin user isn't possible")
if not nodes:
raise RuntimeError("can't connect to server, so adding admin "
"user isn't possible")
roles = []
found_cluster_admin = False
if self.args['auth_role_docs']:
for role_str in self.args['auth_roles']:
role_doc = json.loads(role_str)
roles.append(role_doc)
if role_doc['role'] == "clusterAdmin":
found_cluster_admin = True
else:
roles = self.args['auth_roles']
found_cluster_admin = "clusterAdmin" in roles
if not found_cluster_admin:
warnings.warn("the stop command will not work with auth "
"because the user does not have the "
"clusterAdmin role")
self._add_user(sorted(nodes)[0], name=self.args['username'],
password=self.args['password'],
database=self.args['auth_db'],
roles=roles)
if self.args['sharded']:
for shard in shard_names:
members = sorted(self.get_tagged([shard]))
if self.args['verbose']:
print("adding users to %s" % shard)
self._add_user(members[0],
name=self.args['username'],
password=self.args['password'],
database=self.args['auth_db'],
roles=roles)
if self.args['verbose']:
print("added user %s on %s database" % (self.args['username'],
self.args['auth_db']))
# in sharded env, if --mongos 0, kill the dummy mongos
if self.args['sharded'] and self.args['mongos'] == 0:
port = self.args['port']
print("shutting down temporary mongos on localhost:%s" % port)
username = self.args['username'] if self.args['auth'] else None
password = self.args['password'] if self.args['auth'] else None
authdb = self.args['auth_db'] if self.args['auth'] else None
shutdown_host(port, username, password, authdb)
# discover again, to get up-to-date info
self.discover()
# for sharded authenticated clusters, restart after first_init
# to enable auth
if self.args['sharded'] and self.args['auth'] and first_init:
if self.args['verbose']:
print("restarting cluster to enable auth...")
self.restart()
if self.args['auth'] and self.args['initial-user']:
print('Username "%s", password "%s"'
% (self.args['username'], self.args['password']))
if self.args['verbose']:
print("done.")
# Get the "mongod" version, useful for checking for support or
# non-support of features.
# Normally we expect to get back something like "db version v3.4.0",
# but with release candidates we get abck something like
# "db version v3.4.0-rc2". This code exact the "major.minor.revision"
# part of the string
def getMongoDVersion(self):
binary = "mongod"
if self.args and self.args.get('binarypath'):
binary = os.path.join(self.args['binarypath'], binary)
try:
out = check_mongo_server_output(binary, '--version')
except Exception:
return "0.0"
buf = BytesIO(out)
current_version = buf.readline().strip().decode('utf-8')
# remove prefix "db version v"
if current_version.rindex('v') > 0:
current_version = current_version.rpartition('v')[2]
# remove suffix making assumption that all release candidates
# equal revision 0
try:
if current_version.rindex('-') > 0: # release candidate?
current_version = current_version.rpartition('-')[0]
except Exception:
pass
if self.args and self.args['verbose']:
print("Detected mongod version: %s" % current_version)
return current_version
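    # Worked examples of the parsing above (hypothetical inputs, shown here
    # only as a sketch): the first line of `mongod --version` output is
    # reduced as follows:
    #
    #   "db version v3.4.0"     -> "3.4.0"
    #   "db version v3.4.0-rc2" -> "3.4.0"   (release-candidate suffix dropped)
    #   unparseable output      -> "0.0"     (from the except clause above)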
def client(self, host_and_port, **kwargs):
kwargs.update(self.ssl_pymongo_options)
kwargs.update(self.tls_pymongo_options)
return MongoConnection(host_and_port, **kwargs)
def stop(self):
"""
Sub-command stop.
Parse the list of tags and stop the matching nodes. Each tag has a set
of nodes associated with it, and only the nodes matching all tags
(intersection) will be shut down.
Currently this is an alias for kill()
"""
self.kill()
def start(self):
"""Sub-command start."""
self.discover()
# startup_info only gets loaded from protocol version 2 on,
# check if it's loaded
if not self.startup_info:
# hack to make environment startable with older protocol
# versions < 2: try to start nodes via init if all nodes are down
if len(self.get_tagged(['down'])) == len(self.get_tagged(['all'])):
self.args = self.loaded_args
print("upgrading mlaunch environment meta-data.")
return self.init()
else:
raise SystemExit("These nodes were created with an older "
"version of mlaunch (v1.1.1 or below). To "
"upgrade this environment and make use of "
"the start/stop/list commands, stop all "
"nodes manually, then run 'mlaunch start' "
"again. You only have to do this once.")
# if new unknown_args are present, compare them with loaded ones
# (here we can be certain of protocol v2+)
if (self.args['binarypath'] is not None or
(self.unknown_args and
set(self.unknown_args) != set(self.loaded_unknown_args))):
# store current args, use self.args from file (self.loaded_args)
start_args = self.args
self.args = self.loaded_args
self.args['binarypath'] = start_args['binarypath']
# construct new startup strings with updated unknown args.
# They are for this start only and will not be persisted in
# the .mlaunch_startup file
self._construct_cmdlines()
# reset to original args for this start command
self.args = start_args
matches = self._get_ports_from_args(self.args, 'down')
if len(matches) == 0:
raise SystemExit('no nodes started.')
# start config servers first
config_matches = self.get_tagged(['config']).intersection(matches)
self._start_on_ports(config_matches, wait=True)
# start shards next
mongod_matches = (self.get_tagged(['mongod']) -
self.get_tagged(['config']))
mongod_matches = mongod_matches.intersection(matches)
self._start_on_ports(mongod_matches, wait=True)
# now start mongos
mongos_matches = self.get_tagged(['mongos']).intersection(matches)
self._start_on_ports(mongos_matches)
# wait for all matched nodes to be running
self.wait_for(matches)
# refresh discover
self.discover()
def list(self):
"""
Sub-command list.
Takes no further parameters. Will discover the current configuration
and print a table of all the nodes with status and port.
"""
self.discover()
print_docs = []
# mongos
for node in sorted(self.get_tagged(['mongos'])):
doc = OrderedDict([('process', 'mongos'), ('port', node),
('status', 'running'
if self.cluster_running[node] else 'down')])
print_docs.append(doc)
if len(self.get_tagged(['mongos'])) > 0:
print_docs.append(None)
# configs
for node in sorted(self.get_tagged(['config'])):
doc = OrderedDict([('process', 'config server'),
('port', node),
('status', 'running'
if self.cluster_running[node] else 'down')])
print_docs.append(doc)
if len(self.get_tagged(['config'])) > 0:
print_docs.append(None)
# mongod
for shard in self._get_shard_names(self.loaded_args):
tags = []
replicaset = ('replicaset' in self.loaded_args and
self.loaded_args['replicaset'])
padding = ''
if shard:
print_docs.append(shard)
tags.append(shard)
padding = ' '
if replicaset:
# primary
primary = self.get_tagged(tags + ['primary', 'running'])
if len(primary) > 0:
node = list(primary)[0]
print_docs.append(OrderedDict
([('process', padding + 'primary'),
('port', node),
('status', 'running'
if self.cluster_running[node]
else 'down')]))
# secondaries
secondaries = self.get_tagged(tags + ['secondary', 'running'])
for node in sorted(secondaries):
print_docs.append(OrderedDict
([('process', padding + 'secondary'),
('port', node),
('status', 'running'
if self.cluster_running[node]
else 'down')]))
# data-bearing nodes that are down or not in the
# replica set yet
mongods = self.get_tagged(tags + ['mongod'])
arbiters = self.get_tagged(tags + ['arbiter'])
nodes = sorted(mongods - primary - secondaries - arbiters)
for node in nodes:
print_docs.append(OrderedDict
([('process', padding + 'mongod'),
('port', node),
('status', 'running'
if self.cluster_running[node]
else 'down')]))
# arbiters
for node in arbiters:
print_docs.append(OrderedDict
([('process', padding + 'arbiter'),
('port', node),
('status', 'running'
if self.cluster_running[node]
else 'down')]))
else:
nodes = self.get_tagged(tags + ['mongod'])
if len(nodes) > 0:
node = nodes.pop()
print_docs.append(OrderedDict
([('process', padding + 'single'),
('port', node),
('status', 'running'
if self.cluster_running[node]
else 'down')]))
if shard:
print_docs.append(None)
processes = self._get_processes()
startup = self.startup_info
# print tags as well
for doc in [x for x in print_docs if type(x) == OrderedDict]:
try:
doc['pid'] = processes[doc['port']].pid
except KeyError:
doc['pid'] = '-'
if self.args['verbose'] or self.args['tags']:
tags = self.get_tags_of_port(doc['port'])
doc['tags'] = ', '.join(tags)
if self.args['startup']:
try:
# first try running process (startup may be modified
# via start command)
doc['startup command'] = ' '.join(processes[doc['port']]
.cmdline())
except KeyError:
# if not running, use stored startup_info
doc['startup command'] = startup[str(doc['port'])]
if (self.args['json']):
print(json.dumps(print_docs))
else:
print()
print_docs.append(None)
print_table(print_docs)
if self.loaded_args.get('auth'):
print('\tauth: "%s:%s"' % (self.loaded_args.get('username'),
self.loaded_args.get('password')))
def kill(self):
self.discover()
# get matching tags, can only send signals to running nodes
matches = self._get_ports_from_args(self.args, 'running')
processes = self._get_processes()
# convert signal to int, default is SIGTERM for graceful shutdown
sig = self.args.get('signal') or 'SIGTERM'
if os.name == 'nt':
sig = signal.CTRL_BREAK_EVENT
if type(sig) == int:
pass
elif isinstance(sig, str):
try:
sig = int(sig)
except ValueError:
try:
sig = getattr(signal, sig)
except AttributeError:
raise SystemExit("can't parse signal '%s', use integer or "
"signal name (SIGxxx)." % sig)
for port in processes:
# only send signal to matching processes
if port in matches:
p = processes[port]
p.send_signal(sig)
if self.args['verbose']:
print(" %s on port %i, pid=%i" % (p.name, port, p.pid))
print("sent signal %s to %i process%s."
% (sig, len(matches), '' if len(matches) == 1 else 'es'))
# there is a very brief period in which nodes are not reachable
# anymore, but the port is not torn down fully yet and an immediate
# start command would fail. This very short sleep prevents that case,
        # and it is practically not noticeable by users.
time.sleep(0.1)
# refresh discover
self.discover()
def restart(self):
# get all running processes
processes = self._get_processes()
procs = [processes[k] for k in list(processes.keys())]
# stop nodes via stop command
self.stop()
# wait until all processes terminate
psutil.wait_procs(procs)
# start nodes again via start command
self.start()
# --- below are api helper methods, can be called after creating an
# MLaunchTool() object
def discover(self):
"""
        Fetch state for each process.
Build the self.cluster_tree, self.cluster_tags, self.cluster_running
data structures, needed for sub-commands start, stop, list.
"""
# need self.args['command'] so fail if it's not available
if (not self.args or 'command' not in self.args or not
self.args['command']):
return
# load .mlaunch_startup file for start, stop, list, use current
# parameters for init
if self.args['command'] == 'init':
self.loaded_args = self.args
self.loaded_unknown_args = self.unknown_args
else:
if not self._load_parameters():
startup_file = os.path.join(self.dir, ".mlaunch_startup")
raise SystemExit("Can't read %s, use 'mlaunch init ...' first."
% startup_file)
self.ssl_pymongo_options = self._get_ssl_pymongo_options(self.loaded_args)
self.tls_pymongo_options = self._get_tls_pymongo_options(self.loaded_args)
# reset cluster_* variables
self.cluster_tree = {}
self.cluster_tags = defaultdict(list)
self.cluster_running = {}
# get shard names
shard_names = self._get_shard_names(self.loaded_args)
# some shortcut variables
is_sharded = ('sharded' in self.loaded_args and
self.loaded_args['sharded'] is not None)
is_replicaset = ('replicaset' in self.loaded_args and
self.loaded_args['replicaset'])
is_single = 'single' in self.loaded_args and self.loaded_args['single']
has_arbiter = ('arbiter' in self.loaded_args and
self.loaded_args['arbiter'])
# determine number of nodes to inspect
if is_sharded:
num_config = self.loaded_args['config']
# at least one temp. mongos for adding shards, will be
# killed later on
num_mongos = max(1, self.loaded_args['mongos'])
num_shards = len(shard_names)
else:
num_shards = 1
num_config = 0
num_mongos = 0
num_nodes_per_shard = self.loaded_args['nodes'] if is_replicaset else 1
if has_arbiter:
num_nodes_per_shard += 1
num_nodes = num_shards * num_nodes_per_shard + num_config + num_mongos
current_port = self.loaded_args['port']
# tag all nodes with 'all'
self.cluster_tags['all'].extend(list(range(current_port,
current_port + num_nodes)))
# tag all nodes with their port number (as string) and whether
# they are running
for port in range(current_port, current_port + num_nodes):
self.cluster_tags[str(port)].append(port)
running = self.is_running(port)
self.cluster_running[port] = running
self.cluster_tags['running' if running else 'down'].append(port)
# find all mongos
for i in range(num_mongos):
port = i + current_port
# add mongos to cluster tree
self.cluster_tree.setdefault('mongos', []).append(port)
# add mongos to tags
self.cluster_tags['mongos'].append(port)
current_port += num_mongos
# find all mongods (sharded, replicaset or single)
if shard_names is None:
shard_names = [None]
for shard in shard_names:
port_range = list(range(current_port,
current_port + num_nodes_per_shard))
# all of these are mongod nodes
self.cluster_tags['mongod'].extend(port_range)
if shard:
# if this is a shard, store in cluster_tree and tag shard name
self.cluster_tree.setdefault('shard', []).append(port_range)
self.cluster_tags[shard].extend(port_range)
if is_replicaset:
# get replica set states
rs_name = shard if shard else self.loaded_args['name']
try:
mrsc = self.client(
','.join('localhost:%i' % i for i in port_range),
replicaSet=rs_name)
# primary, secondaries, arbiters
# @todo: this is no longer working because MongoClient
# is now non-blocking
if mrsc.primary:
self.cluster_tags['primary'].append(mrsc.primary[1])
self.cluster_tags['secondary'].extend(list(map
(itemgetter(1),
mrsc.secondaries)))
self.cluster_tags['arbiter'].extend(list(map(itemgetter(1),
mrsc.arbiters)))
# secondaries in cluster_tree (order is now important)
self.cluster_tree.setdefault('secondary', [])
for i, secondary in enumerate(sorted(map
(itemgetter(1),
mrsc.secondaries))):
if len(self.cluster_tree['secondary']) <= i:
self.cluster_tree['secondary'].append([])
self.cluster_tree['secondary'][i].append(secondary)
except (ConnectionFailure, ConfigurationError):
pass
elif is_single:
self.cluster_tags['single'].append(current_port)
# increase current_port
current_port += num_nodes_per_shard
# If not CSRS, set the number of config servers to be 1 or 3
# This is needed, otherwise `mlaunch init --sharded 2 --replicaset
# --config 2` on <3.3.0 will crash
if not self.args.get('csrs') and self.args['command'] == 'init':
if num_config >= 3:
num_config = 3
else:
num_config = 1
for i in range(num_config):
port = i + current_port
try:
mc = self.client('localhost:%i' % port)
mc.admin.command('ping')
running = True
except ConnectionFailure:
# node not reachable
running = False
# add config server to cluster tree
self.cluster_tree.setdefault('config', []).append(port)
# add config server to tags
self.cluster_tags['config'].append(port)
self.cluster_tags['mongod'].append(port)
        current_port += num_config
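    # Illustrative sketch (assumed values, not computed here): after
    # discover() on an environment created with
    # `mlaunch init --replicaset --nodes 3 --port 27017`, one would expect
    # roughly:
    #
    #   self.cluster_tags['all']     == [27017, 27018, 27019]
    #   self.cluster_tags['mongod']  == [27017, 27018, 27019]
    #   self.cluster_tags['primary'] == [<whichever node won the election>]
    #   self.cluster_running         == {27017: True, 27018: True, 27019: True}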
def is_running(self, port):
"""Return True if a host on a specific port is running."""
try:
con = self.client('localhost:%s' % port)
con.admin.command('ping')
return True
except (AutoReconnect, ConnectionFailure, OperationFailure):
# Catch OperationFailure to work around SERVER-31916.
return False
def get_tagged(self, tags):
"""
Tag format.
The format for the tags list is tuples for tags: mongos, config, shard,
secondary tags of the form (tag, number), e.g. ('mongos', 2) which
references the second mongos in the list. For all other tags, it is
simply the string, e.g. 'primary'.
"""
# if tags is a simple string, make it a list (note: tuples like
# ('mongos', 2) must be in a surrounding list)
if not hasattr(tags, '__iter__') and type(tags) == str:
tags = [tags]
nodes = set(self.cluster_tags['all'])
for tag in tags:
if re.match(r"\w+ \d{1,2}", tag):
# special case for tuple tags: mongos, config, shard,
                # secondary. These can contain a number
                tag, number = tag.split()

                try:
branch = self.cluster_tree[tag][int(number) - 1]
except (IndexError, KeyError):
continue
if hasattr(branch, '__iter__'):
subset = set(branch)
else:
subset = set([branch])
else:
# otherwise use tags dict to get the subset
subset = set(self.cluster_tags[tag])
nodes = nodes.intersection(subset)
return nodes
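    # Usage sketch (hypothetical, assuming discover() already ran on a
    # sharded environment): plain tags intersect, while "tuple" tags select
    # the n-th group, e.g.
    #
    #   self.get_tagged(['running'])             # all running nodes
    #   self.get_tagged(['shard01', 'primary'])  # primary of shard01
    #   self.get_tagged(['secondary 1'])         # first secondary branch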
def get_tags_of_port(self, port):
"""
Get all tags related to a given port.
        This is the inverse of what is stored in self.cluster_tags.
"""
return(sorted([tag for tag in self.cluster_tags
if port in self.cluster_tags[tag]]))
def wait_for(self, ports, interval=1.0, timeout=30, to_start=True):
"""
Spawn threads to ping host using a list of ports.
Returns when all hosts are running (if to_start=True) / shut down (if
to_start=False).
"""
threads = []
queue = Queue.Queue()
for port in ports:
threads.append(threading.Thread(target=wait_for_host, args=(
port, interval, timeout, to_start, queue,
self.ssl_pymongo_options, self.tls_pymongo_options)))
if self.args and 'verbose' in self.args and self.args['verbose']:
print("waiting for nodes %s..."
% ('to start' if to_start else 'to shutdown'))
for thread in threads:
thread.start()
for thread in threads:
thread.join()
# get all results back and return tuple
return tuple(queue.get_nowait() for _ in ports)
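    # Sketch of the returned value (assumed ports): for ports (27017, 27018)
    # with to_start=True, a fully started cluster yields something like
    # ((27017, True), (27018, True)); a (port, False) entry means the timeout
    # expired before that node responded. This is the (port, status) shape
    # that init() unpacks with itemgetter(1) above.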
# --- below here are internal helper methods, do not call externally ---
def _load_parameters(self):
"""
Load the .mlaunch_startup file that exists in each datadir.
Handles different protocol versions.
"""
datapath = self.dir
startup_file = os.path.join(datapath, '.mlaunch_startup')
if not os.path.exists(startup_file):
return False
in_dict = json.load(open(startup_file, 'rb'))
# handle legacy version without versioned protocol
if 'protocol_version' not in in_dict:
in_dict['protocol_version'] = 1
self.loaded_args = in_dict
self.startup_info = {}
# hostname was added recently
self.loaded_args['hostname'] = socket.gethostname()
elif in_dict['protocol_version'] == 2:
self.startup_info = in_dict['startup_info']
self.loaded_unknown_args = in_dict['unknown_args']
self.loaded_args = in_dict['parsed_args']
# changed 'authentication' to 'auth', if present (from old env) rename
if 'authentication' in self.loaded_args:
self.loaded_args['auth'] = self.loaded_args['authentication']
del self.loaded_args['authentication']
return True
def _store_parameters(self):
"""Store startup params and config in datadir/.mlaunch_startup."""
datapath = self.dir
out_dict = {
'protocol_version': 2,
'mtools_version': __version__,
'parsed_args': self.args,
'unknown_args': self.unknown_args,
'startup_info': self.startup_info
}
if not os.path.exists(datapath):
os.makedirs(datapath)
try:
json.dump(out_dict,
open(os.path.join(datapath,
'.mlaunch_startup'), 'w'), indent=-1)
except Exception as ex:
print("ERROR STORING Parameters:", ex)
def _create_paths(self, basedir, name=None):
"""Create datadir and subdir paths."""
if name:
datapath = os.path.join(basedir, name)
else:
datapath = basedir
dbpath = os.path.join(datapath, 'db')
if not os.path.exists(dbpath):
os.makedirs(dbpath)
if self.args['verbose']:
print('creating directory: %s' % dbpath)
return datapath
def _get_ports_from_args(self, args, extra_tag):
tags = []
if 'tags' not in args:
args['tags'] = []
for tag1, tag2 in zip(args['tags'][:-1], args['tags'][1:]):
if re.match(r'^\d{1,2}$', tag1):
print("warning: ignoring numeric value '%s'" % tag1)
continue
if re.match(r'^\d{1,2}$', tag2):
if tag1 in ['mongos', 'shard', 'secondary', 'config']:
# combine tag with number, separate by string
tags.append('%s %s' % (tag1, tag2))
continue
else:
print("warning: ignoring numeric value '%s' after '%s'"
% (tag2, tag1))
tags.append(tag1)
if len(args['tags']) > 0:
tag = args['tags'][-1]
if not re.match(r'^\d{1,2}$', tag):
tags.append(tag)
tags.append(extra_tag)
matches = self.get_tagged(tags)
return matches
def _filter_valid_arguments(self, arguments, binary="mongod",
config=False):
"""
Return a list of accepted arguments.
Check which arguments in list are accepted by the specified binary
(mongod, mongos). If an argument does not start with '-' but its
preceding argument was accepted, then it is accepted as well. Example
['--slowms', '1000'] both arguments would be accepted for a mongod.
"""
# get the help list of the binary
if self.args and self.args['binarypath']:
binary = os.path.join(self.args['binarypath'], binary)
try:
out = check_mongo_server_output(binary, '--help')
except Exception:
raise SystemExit("Fatal error trying get output from `%s`."
"Is the binary in your path?" % binary)
accepted_arguments = []
# extract all arguments starting with a '-'
for line in [option for option in out.decode('utf-8').split('\n')]:
line = line.lstrip()
if line.startswith('-'):
argument = line.split()[0]
accepted_arguments.append(argument)
# add undocumented options
accepted_arguments.append('--setParameter')
if binary.endswith('mongod'):
accepted_arguments.append('--wiredTigerEngineConfigString')
# filter valid arguments
result = []
for i, arg in enumerate(arguments):
if arg.startswith('-'):
# check if the binary accepts this argument
# or special case -vvv for any number of v
argname = arg.split('=', 1)[0]
if (binary.endswith('mongod') and config and
argname in self.UNSUPPORTED_CONFIG_ARGS):
continue
elif argname in accepted_arguments or re.match(r'-v+', arg):
result.append(arg)
elif (binary.endswith('mongod') and
argname in self.UNDOCUMENTED_MONGOD_ARGS):
result.append(arg)
elif self.ignored_arguments.get(binary + argname) is None:
# warn once for each combination of binary and unknown arg
self.ignored_arguments[binary + argname] = True
if not (binary.endswith("mongos") and
arg in self.UNSUPPORTED_MONGOS_ARGS):
print("warning: ignoring unknown argument %s for %s" %
(arg, binary))
elif i > 0 and arguments[i - 1] in result:
# if it doesn't start with a '-', it could be the value of
# the last argument, e.g. `--slowms 1000`
# NB: arguments are always quoted
result.append(f'"{arg}"')
# return valid arguments as joined string
return ' '.join(result)
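    # Example (hypothetical call): for a mongod build that documents
    # --slowms,
    #   self._filter_valid_arguments(['--slowms', '1000', '--bogus'])
    # would return '--slowms "1000"' and print a one-time warning
    # for '--bogus'.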
def _get_ssl_server_args(self):
s = ''
if not self.ssl_server_args:
return s
for parser in self.ssl_args, self.ssl_server_args:
for action in parser._group_actions:
name = action.dest
value = self.args.get(name)
if value:
if value is True:
s += ' --%s' % (name,)
else:
s += ' --%s "%s"' % (name, value)
return s
def _get_ssl_pymongo_options(self, args):
opts = {}
if not self.ssl_server_args:
return opts
for parser in [self.ssl_server_args]:
for action in parser._group_actions:
name = action.dest
value = args.get(name)
if value:
opts['ssl'] = True
opts['ssl_cert_reqs'] = ssl.CERT_NONE
for parser in self.ssl_args, self.ssl_client_args:
for action in parser._group_actions:
name = action.dest
value = args.get(name)
if value:
opts['ssl'] = True
if name == 'sslClientCertificate':
opts['ssl_certfile'] = value
elif name == 'sslClientPEMKeyFile':
opts['ssl_keyfile'] = value
elif name == 'sslClientPEMKeyPassword':
opts['ssl_pem_passphrase'] = value
elif name == 'sslAllowInvalidCertificates':
opts['ssl_cert_reqs'] = ssl.CERT_OPTIONAL
elif name == 'sslAllowInvalidHostnames':
opts['ssl_match_hostname'] = False
elif name == 'sslCAFile':
opts['ssl_ca_certs'] = value
elif name == 'sslCRLFile':
opts['ssl_crlfile'] = value
return opts
def _get_tls_server_args(self):
s = ''
if not self.tls_server_args:
return s
for parser in self.tls_args, self.tls_server_args:
for action in parser._group_actions:
name = action.dest
value = self.args.get(name)
if value:
if value is True:
s += ' --%s' % (name,)
else:
s += ' --%s "%s"' % (name, value)
return s
def _get_tls_pymongo_options(self, args):
opts = {}
if not self.tls_server_args:
return opts
for parser in [self.tls_server_args]:
for action in parser._group_actions:
name = action.dest
value = args.get(name)
if value:
opts['tls'] = True
opts['tls_cert_reqs'] = ssl.CERT_NONE
for parser in self.tls_args, self.tls_client_args:
for action in parser._group_actions:
name = action.dest
value = args.get(name)
if value:
opts['tls'] = True
# TLS parameters require PyMongo 3.9.0+
# https://api.mongodb.com/python/3.9.0/changelog.html
if name == 'tlsCertificateKeyFile':
opts['tlsCertificateKeyFile'] = value
elif name == 'tlsCertificateKeyFilePassword':
opts['tlsCertificateKeyFilePassword'] = value
elif name == 'tlsAllowInvalidCertificates':
opts['tlsAllowInvalidCertificates'] = ssl.CERT_OPTIONAL
elif name == 'tlsAllowInvalidHostnames':
opts['tlsAllowInvalidHostnames'] = False
elif name == 'tlsCAFile':
opts['tlsCAFile'] = value
elif name == 'tlsCRLFile':
opts['tlsCRLFile'] = value
return opts
def _get_shard_names(self, args):
"""
Get the shard names based on the self.args['sharded'] parameter.
If it's a number, create shard names of type shard##, where ## is a
2-digit number. Returns a list [None] if no shards are present.
"""
if 'sharded' in args and args['sharded']:
if len(args['sharded']) == 1:
try:
# --sharded was a number, name shards shard01, shard02,
# ... (only works with replica sets)
n_shards = int(args['sharded'][0])
shard_names = ['shard%.2i'
% (i + 1) for i in range(n_shards)]
except ValueError:
# --sharded was a string, use it as name for the one shard
shard_names = args['sharded']
else:
shard_names = args['sharded']
else:
shard_names = [None]
return shard_names
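    # Examples (sketch of the mapping above):
    #   args['sharded'] == ['2']          -> ['shard01', 'shard02']
    #   args['sharded'] == ['tic', 'tac'] -> ['tic', 'tac']
    #   no --sharded at all               -> [None]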
def _get_last_error_log(self, command_str):
logpath = re.search(r'--logpath ([^\s]+)', command_str)
loglines = ''
try:
with open(logpath.group(1), 'rb') as logfile:
for line in logfile:
if not line.startswith('----- BEGIN BACKTRACE -----'):
loglines += line
else:
break
except IOError:
pass
return loglines
def _start_on_ports(self, ports, wait=False, override_auth=False):
if override_auth and self.args['verbose']:
print("creating cluster without auth for setup, "
"will enable auth at the end...")
for port in ports:
command_str = self.startup_info[str(port)]
if override_auth:
# this is to set up sharded clusters without auth first,
# then relaunch with auth
command_str = re.sub(r'--keyFile \S+', '', command_str)
try:
if os.name == 'nt':
subprocess.check_call(command_str, shell=True)
# create sub process on windows doesn't wait for output,
# wait a few seconds for mongod instance up
time.sleep(5)
else:
subprocess.check_output([command_str], shell=True,
stderr=subprocess.STDOUT)
binary = command_str.split()[0]
if '--configsvr' in command_str:
binary = 'config server'
if self.args['verbose']:
print("launching: %s" % command_str)
else:
print("launching: %s on port %s" % (binary, port))
except subprocess.CalledProcessError as e:
print(e.output)
print(self._get_last_error_log(command_str), file=sys.stderr)
raise SystemExit("can't start process, return code %i. "
"tried to launch: %s"
% (e.returncode, command_str))
if wait:
self.wait_for(ports)
def _initiate_replset(self, port, name, maxwait=30):
"""Initiate replica set."""
if not self.args['replicaset'] and name != 'configRepl':
if self.args['verbose']:
print('Skipping replica set initialization for %s' % name)
return
con = self.client('localhost:%i' % port)
try:
rs_status = con['admin'].command({'replSetGetStatus': 1})
return rs_status
except OperationFailure:
# not initiated yet
for i in range(maxwait):
try:
con['admin'].command({'replSetInitiate':
self.config_docs[name]})
break
except OperationFailure as e:
                    print(str(e) + " - will retry")
time.sleep(1)
if self.args['verbose']:
print("initializing replica set '%s' with configuration: %s"
% (name, self.config_docs[name]))
print("replica set '%s' initialized." % name)
def _add_user(self, port, name, password, database, roles):
con = self.client('localhost:%i' % port, serverSelectionTimeoutMS=10000)
ismaster = con['admin'].command('isMaster')
set_name = ismaster.get('setName')
if set_name:
con.close()
con = self.client('localhost:%i' % port, replicaSet=set_name,
serverSelectionTimeoutMS=10000)
v = ismaster.get('maxWireVersion', 0)
if v >= 7:
# Until drivers have implemented SCRAM-SHA-256, use old mechanism.
opts = {'mechanisms': ['SCRAM-SHA-1']}
else:
opts = {}
if database == "$external":
password = None
try:
con[database].command("createUser", name, pwd=password, roles=roles,
**opts)
except OperationFailure as e:
raise e
def _get_processes(self):
all_ports = self.get_tagged(['running'])
process_dict = {}
for p in psutil.process_iter():
# deal with zombie process errors in OSX
try:
name = p.name()
except psutil.NoSuchProcess:
continue
# skip all but mongod / mongos
if os.name == 'nt':
if name not in ['mongos.exe', 'mongod.exe']:
continue
else:
if name not in ['mongos', 'mongod']:
continue
port = None
for possible_port in self.startup_info:
# compare ports based on command line argument
startup = self.startup_info[possible_port].split()
try:
p_port = p.cmdline()[p.cmdline().index('--port') + 1]
startup_port = startup[startup.index('--port') + 1]
except ValueError:
continue
if str(p_port) == str(startup_port):
port = int(possible_port)
break
# only consider processes belonging to this environment
if port in all_ports:
process_dict[port] = p
return process_dict
def _wait_for_primary(self):
hosts = ([x['host']
for x in self.config_docs[self.args['name']]['members']])
rs_name = self.config_docs[self.args['name']]['_id']
mrsc = self.client(hosts, replicaSet=rs_name,
serverSelectionTimeoutMS=30000)
if mrsc.is_primary:
# update cluster tags now that we have a primary
self.cluster_tags['primary'].append(mrsc.primary[1])
self.cluster_tags['secondary'].extend(list(map(itemgetter(1),
mrsc.secondaries)))
self.cluster_tags['arbiter'].extend(list(map(itemgetter(1),
mrsc.arbiters)))
# secondaries in cluster_tree (order is now important)
self.cluster_tree.setdefault('secondary', [])
for i, secondary in enumerate(sorted(map(itemgetter(1),
mrsc.secondaries))):
if len(self.cluster_tree['secondary']) <= i:
self.cluster_tree['secondary'].append([])
self.cluster_tree['secondary'][i].append(secondary)
return True
return False
# --- below are command line constructor methods, that build the command
# --- line strings to be called
def _construct_cmdlines(self):
"""
Top-level _construct_* method.
From here, it will branch out to the different cases:
_construct_sharded, _construct_replicaset, _construct_single. These can
themselves call each other (for example sharded needs to create the
shards with either replicaset or single node). At the lowest level, the
construct_mongod, _mongos, _config will create the actual command line
strings and store them in self.startup_info.
"""
if self.args['sharded']:
# construct startup string for sharded environments
self._construct_sharded()
elif self.args['single']:
# construct startup string for single node environment
self._construct_single(self.dir, self.args['port'])
elif self.args['replicaset']:
# construct startup strings for a non-sharded replica set
self._construct_replset(self.dir, self.args['port'],
self.args['name'],
list(range(self.args['nodes'])),
self.args['arbiter'])
# discover current setup
self.discover()
def _construct_sharded(self):
"""Construct command line strings for a sharded cluster."""
num_mongos = self.args['mongos'] if self.args['mongos'] > 0 else 1
shard_names = self._get_shard_names(self.args)
# create shards as stand-alones or replica sets
nextport = self.args['port'] + num_mongos
for shard in shard_names:
if (self.args['single'] and
LooseVersion(self.current_version) >= LooseVersion("3.6.0")):
errmsg = " \n * In MongoDB 3.6 and above a Shard must be " \
"made up of a replica set. Please use --replicaset " \
"option when starting a sharded cluster.*"
raise SystemExit(errmsg)
elif (self.args['single'] and
LooseVersion(self.current_version) < LooseVersion("3.6.0")):
self.shard_connection_str.append(
self._construct_single(
self.dir, nextport, name=shard, extra='--shardsvr'))
nextport += 1
elif self.args['replicaset']:
self.shard_connection_str.append(
self._construct_replset(
self.dir, nextport, shard,
num_nodes=list(range(self.args['nodes'])),
arbiter=self.args['arbiter'], extra='--shardsvr'))
nextport += self.args['nodes']
if self.args['arbiter']:
nextport += 1
# start up config server(s)
config_string = []
# SCCC config servers (MongoDB <3.3.0)
if not self.args['csrs'] and self.args['config'] >= 3:
config_names = ['config1', 'config2', 'config3']
else:
config_names = ['config']
# CSRS config servers (MongoDB >=3.1.0)
if self.args['csrs']:
config_string.append(self._construct_config(self.dir, nextport,
"configRepl", True))
else:
for name in config_names:
self._construct_config(self.dir, nextport, name)
config_string.append('%s:%i' % (self.args['hostname'],
nextport))
nextport += 1
# multiple mongos use <datadir>/mongos/ as subdir for log files
if num_mongos > 1:
mongosdir = os.path.join(self.dir, 'mongos')
if not os.path.exists(mongosdir):
if self.args['verbose']:
print("creating directory: %s" % mongosdir)
os.makedirs(mongosdir)
# start up mongos, but put them to the front of the port range
nextport = self.args['port']
for i in range(num_mongos):
if num_mongos > 1:
mongos_logfile = 'mongos/mongos_%i.log' % nextport
else:
mongos_logfile = 'mongos.log'
self._construct_mongos(os.path.join(self.dir, mongos_logfile),
nextport, ','.join(config_string))
nextport += 1
def _construct_replset(self, basedir, portstart, name, num_nodes,
arbiter, extra=''):
"""
Construct command line strings for a replicaset.
Handles single set or sharded cluster.
"""
self.config_docs[name] = {'_id': name, 'members': []}
# Construct individual replica set nodes
for i in num_nodes:
datapath = self._create_paths(basedir, '%s/rs%i' % (name, i + 1))
self._construct_mongod(os.path.join(datapath, 'db'),
os.path.join(datapath, 'mongod.log'),
portstart + i, replset=name, extra=extra)
host = '%s:%i' % (self.args['hostname'], portstart + i)
member_config = {
'_id': len(self.config_docs[name]['members']),
'host': host,
}
# First node gets increased priority.
if i == 0 and self.args['priority']:
member_config['priority'] = 10
if i >= 7:
member_config['votes'] = 0
member_config['priority'] = 0
self.config_docs[name]['members'].append(member_config)
# launch arbiter if True
if arbiter:
datapath = self._create_paths(basedir, '%s/arb' % (name))
self._construct_mongod(os.path.join(datapath, 'db'),
os.path.join(datapath, 'mongod.log'),
portstart + self.args['nodes'],
replset=name)
host = '%s:%i' % (self.args['hostname'],
portstart + self.args['nodes'])
(self.config_docs[name]['members']
.append({'_id': len(self.config_docs[name]['members']),
'host': host,
'arbiterOnly': True}))
return(name + '/' +
','.join([c['host']
for c in self.config_docs[name]['members']]))
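    # The returned seed string (illustrative, assuming hostname 'localhost',
    # start port 27018 and 3 nodes) looks like:
    #   'shard01/localhost:27018,localhost:27019,localhost:27020'
    # which is exactly the format the addShard command expects.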
def _construct_config(self, basedir, port, name=None, isreplset=False):
"""Construct command line strings for a config server."""
if isreplset:
return self._construct_replset(basedir=basedir, portstart=port,
name=name,
num_nodes=list(range(
self.args['config'])),
arbiter=False, extra='--configsvr')
else:
datapath = self._create_paths(basedir, name)
self._construct_mongod(os.path.join(datapath, 'db'),
os.path.join(datapath, 'mongod.log'),
port, replset=None, extra='--configsvr')
def _construct_single(self, basedir, port, name=None, extra=''):
"""
Construct command line strings for a single node.
Handles shards and stand-alones.
"""
datapath = self._create_paths(basedir, name)
self._construct_mongod(os.path.join(datapath, 'db'),
os.path.join(datapath, 'mongod.log'), port,
replset=None, extra=extra)
host = '%s:%i' % (self.args['hostname'], port)
return host
def _construct_mongod(self, dbpath, logpath, port, replset=None, extra=''):
"""Construct command line strings for mongod process."""
rs_param = ''
if replset:
rs_param = '--replSet %s' % replset
auth_param = ''
if self.args['auth']:
auth_param = '--auth'
if '--keyFile' not in self.unknown_args:
key_path = os.path.abspath(os.path.join(self.dir, 'keyfile'))
auth_param = f'{auth_param} --keyFile "{key_path}"'
if self.unknown_args:
config = '--configsvr' in extra
extra = self._filter_valid_arguments(self.unknown_args, "mongod",
config=config) + ' ' + extra
# set WiredTiger cache size to 1 GB by default
if ('--wiredTigerCacheSizeGB' not in extra and
self._filter_valid_arguments(['--wiredTigerCacheSizeGB'],
'mongod')):
extra += ' --wiredTigerCacheSizeGB 1 '
# Exit with error if hostname is specified but not bind_ip options
if (self.args['hostname'] != 'localhost'
and LooseVersion(self.current_version) >= LooseVersion("3.6.0")
and (self.args['sharded'] or self.args['replicaset'])
and '--bind_ip' not in extra):
os.removedirs(dbpath)
errmsg = " \n * If hostname is specified, please include "\
"'--bind_ip_all' or '--bind_ip' options when deploying "\
"replica sets or sharded cluster with MongoDB version 3.6.0 "\
"or greater"
raise SystemExit(errmsg)
extra += self._get_ssl_server_args()
path = self.args['binarypath'] or ''
if os.name == 'nt':
newdbpath = dbpath.replace('\\', '\\\\')
newlogpath = logpath.replace('\\', '\\\\')
command_str = ("start /b \"\" \"%s\" %s --dbpath \"%s\" "
" --logpath \"%s\" --port %i "
"%s %s" % (os.path.join(path, 'mongod.exe'),
rs_param, newdbpath, newlogpath, port,
auth_param, extra))
else:
command_str = ("\"%s\" %s --dbpath \"%s\" --logpath \"%s\" "
"--port %i --fork "
"%s %s" % (os.path.join(path, 'mongod'), rs_param,
dbpath, logpath, port, auth_param,
extra))
# store parameters in startup_info
self.startup_info[str(port)] = command_str
def _construct_mongos(self, logpath, port, configdb):
"""Construct command line strings for a mongos process."""
extra = ''
auth_param = ''
if self.args['auth']:
auth_param = '--auth'
if '--keyFile' not in self.unknown_args:
key_path = os.path.abspath(os.path.join(self.dir, 'keyfile'))
auth_param = f'{auth_param} --keyFile "{key_path}"'
if self.unknown_args:
extra = self._filter_valid_arguments(self.unknown_args,
"mongos") + extra
extra += ' ' + self._get_ssl_server_args()
path = self.args['binarypath'] or ''
if os.name == 'nt':
newlogpath = logpath.replace('\\', '\\\\')
command_str = ("start /b %s --logpath \"%s\" --port %i --configdb %s "
"%s %s " % (os.path.join(path, 'mongos'),
newlogpath, port, configdb,
auth_param, extra))
else:
command_str = ("%s --logpath \"%s\" --port %i --configdb %s %s %s "
"--fork" % (os.path.join(path, 'mongos'), logpath,
port, configdb, auth_param, extra))
# store parameters in startup_info
self.startup_info[str(port)] = command_str
    def _read_key_file(self, keyfile=None):
        if not keyfile:
            keyfile = os.path.join(self.dir, 'keyfile')
        with open(keyfile, 'r') as f:
            return f.read()
def main():
tool = MLaunchTool()
tool.run()
if __name__ == '__main__':
    sys.exit(main())
<|file_name|>CSCI567_hw5_fall16.py<|end_file_name|><|fim▁begin|># coding: utf-8
'''
Name : ThammeGowda Narayanaswamy
USCID: 2074669439
'''
import math
from scipy.stats import multivariate_normal
import numpy as np
from matplotlib import pyplot as plt
import matplotlib.patches as mpatches
import scipy as sp
from scipy import spatial
from scipy import stats
from pprint import pprint
blob_file = "hw5_blob.csv"
circle_file = "hw5_circle.csv"
def load_points(f_name):
with open(f_name) as f:
res = []
for l in f:
x,y = l.split(",")
res.append([float(x), float(y)])
return np.array(res)
blobs = load_points(blob_file)
circles = load_points(circle_file)
'''
# In[4]:
plt.plot(*zip(*circles), marker='o', color='r', ls='')
plt.show()
plt.plot(*zip(*blobs), marker='o', color='b', ls='')
plt.show()
'''
# In[5]:
def k_means(k, pts, get_indices=False, silent=True, tol=1e-5):
N = len(pts)
assert k <= N
print("K=%d, N=%d" % (k, N))
# pick random k points
pos = set()
while len(pos) < k:
r = np.random.randint(N)
pos.add(r)
centroids = []
for p in pos:
centroids.append(tuple(pts[p]))
change = float('inf')
itr, max_iters = 0, 100
while change > tol and itr < max_iters:
itr += 1
# assign cluster to each point
asgn = {}
indices = {}
for ct in centroids:
asgn[ct] = []
indices[ct] = []
for idx, pt in enumerate(pts):
mindist = float('inf')
a = None
for ct in centroids:
dist = spatial.distance.cdist([ct], [pt])
if dist < mindist:
mindist = dist
a = ct
asgn[a].append(pt)
indices[a].append(idx)
# compute means of each cluster
oldcentr = centroids
centroids = []
for ct, cluster in asgn.items():
centroids.append(tuple(np.array(cluster).mean(axis=0)))
dist_matrix = spatial.distance.cdist(oldcentr, centroids)
# has distance between each pair of {new, old} centroids
# need the diagonal values
change = dist_matrix.trace()
if not silent:
print("Movement in centroids", change)
return indices if get_indices else asgn
# In[6]:
print("# K Means")
colors = ['r', 'g', 'b', 'y', 'c', 'k']
plt.figure(1, figsize=(15, 10))
plt.title("K Means")
ks = {2,3,5}
dss = {'Blobs': blobs, 'Circles': circles}
j = 1
for title, ds in dss.items():
for k in ks:
clstrs = k_means(k, ds)
plt.subplot(2, 3, j)
i = 0
for cnt, cpts in clstrs.items():
plt.plot(*zip(*cpts), marker='o', color=colors[i], ls='')
i += 1
plt.title("%s , K=%d" % (title, k))
j += 1
plt.show()
# # Kernel
'''
# ## Feature Mapping
# In[7]:
center = [0.0, 0.0]
newdim = sp.spatial.distance.cdist([center], circles).transpose()
clusters = k_means(2, newdim, get_indices=True)
i = 0
for cnt, cpts in clusters.items():
cpts = map(lambda x: circles[x], cpts)
plt.plot(*zip(*cpts), marker='o', color=colors[i], ls='')
i += 1
plt.show()
'''
# ## Kernel K Means
#
# Kernel used :
# 1 - (radius of x1) / (radius of x2)
#
# It ensures that the smaller radius goes to numerator and larger radius goes to denominator - for symmetry and bounding
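# Worked example (assuming the center defaults to the origin): x1 = (1, 0)
# and x2 = (0, 2) have squared radii 1 and 4, so the kernel gives
# 1 - 1/4 = 0.75, while two points on the same circle give 1 - 1 = 0.
# Points on the same ring therefore look "close", which is what separates
# the two concentric circles.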
print("Kernel K means")
class KernelKMeans(object):
def kernel_matrix(self, data, kernel_func):
''' Computes kernel matrix
: params:
data - data points
kernel_func - kernel function
:returns: nxn matrix
'''
n = data.shape[0]
K = np.zeros((n,n), dtype=float)
for i in range(n):
for j in range(n):
K[i,j] = kernel_func(data[i], data[j])
return K
def cluster(self, X, k, kernel_func, max_itr=100, tol=1e-3):
'''
Clusters the points
:params:
X - data points
k - number of clusters
kernel_func - kernel function that outputs smaller values for points in same cluster
:returns: Nx1 vector of assignments
'''
# N
N = X.shape[0]
# NxN matrix from kernel funnction element wise
K = self.kernel_matrix(X, kernel_func)
# equal weightage to all
cluster_weights = np.ones(N)
# Assignments : random assignments to begin with
A = np.random.randint(k, size=N)
        for it in xrange(max_itr):  # may get stuck between two local minima; abort after max_itr
# N x k matrix that stores distance between every point and cluster center
dist = self.compute_dist(K, k, A, sw=cluster_weights)
oldA, A = A, dist.argmin(axis=1)
            # Check if it has converged
n_same = np.sum(np.abs(A - oldA) == 0)
if 1 - float(n_same) / N < tol:
print "Converged at iteration:", it + 1
break
return A
def compute_dist(self, K, k, A, sw):
"""
Computes Nxk distance matrix using kernel matrix
: params:
K - NxN kernel Matrix
k - number of clusters
A - Nx1 Assignments
sw - sample weights
: returns : Nxk distance matrix
"""
dist = np.zeros((K.shape[0], k))
for cl in xrange(k):
mask = A == cl
if np.sum(mask) == 0:
raise Error("ERROR:cluster '%d' is empty. Looks like we cant make %d clusters" % (cl, k))
N_ = sw[mask].sum()
KK = K[mask][:, mask]
dist[:, cl] += np.sum(np.outer(sw[mask], sw[mask]) * KK / (N_*N_))
dist[:, cl] -= 2 * np.sum(sw[mask] * K[:, mask], axis=1) / N_
return dist
def distance(x1, x2):
'''Squared Eucledian distance between 2 points
:params:
x1 - point1
x2 - point2
'''
return np.sum((x1 - x2) ** 2)
def circular_kernel(x1, x2, center=None):
'''This kernel outputs lesser distance for the points that are from circumference
:params:
x1 - first point
x2 - second point
center - center of circle(default = origin (0,0,...))
'''
if center is None:
center = np.zeros(len(x1))
dist1 = distance(x1, center)
dist2 = distance(x2, center)
return 1.0 - min(dist1, dist2) / max(dist1, dist2)
clusters = KernelKMeans().cluster(circles, 2, circular_kernel)
for i in range(2):
    cpts = circles[clusters == i]
    plt.plot(*zip(*cpts), marker='o', color=colors[i], ls='')
plt.show()
# # EM Algorithm with GMM
print("EM Algorithm")
# In[62]:
def multivar_gaussian_pdf(x, mu, covar):
return multivariate_normal.pdf(x, mean=mu, cov=covar)
class EM_GMM(object):
def __init__(self, data, k):
self.data = data
self.k = k
self.N = data.shape[0]
# theta param
self.mean, self.cov, self.weight = [], [], []
# random initialization
A = np.random.randint(k, size=data.shape[0])
for c in range(k):
cpts = data[A == c]
self.mean.append(np.mean(cpts, axis=0))
self.cov.append(np.cov(np.array(cpts).transpose()))
self.weight.append(1.0 * cpts.shape[0] / data.shape[0])
def compute_gamma(self):
gamma = np.zeros((self.N, self.k), dtype=float)
        for idx, pt in enumerate(self.data):
            pdf = []
            for ct in range(self.k):
temp = multivar_gaussian_pdf(pt, self.mean[ct], self.cov[ct])
pdf.append(temp * self.weight[ct])
gamma[idx] = np.array(pdf) / sum(pdf)
        return gamma

    def update_theta(self, P):
        weights = P.sum(axis=0) / P.sum()
        means = []
        covs = []
for i in range(self.k):
nr_mu = (P[:, i:i+1] * self.data).sum(axis=0)
dr_mu = P[:, i].sum(axis=0)
pt_mu = nr_mu / dr_mu
means.append(pt_mu)
for i in range(self.k):
nr_cov = (P[:, i:i+1] * (self.data - means[i])).transpose().dot(self.data - means[i])
dr_cov = P[:, i].sum(axis=0)
covs.append(nr_cov / dr_cov)
        self.mean = means
self.cov = covs
self.weight = weights
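    # M-step recap (sketch of the formulas implemented above), with
    # responsibilities P of shape (N, k) whose rows sum to 1:
    #   weight_j = sum_i P[i, j] / N
    #   mean_j   = sum_i P[i, j] * x_i / sum_i P[i, j]
    #   cov_j    = sum_i P[i, j] (x_i - mean_j)(x_i - mean_j)^T / sum_i P[i, j]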
def log_likelihood(self):
log_sum = 0.
for _, pt in enumerate(self.data):
row_sum = []
for ct in range(self.k):
p_X_given_N = multivar_gaussian_pdf(pt, self.mean[ct], self.cov[ct])
p_N = self.weight[ct]
joint = p_N * p_X_given_N
row_sum.append(joint)
res = sum(row_sum)
log_sum += math.log(res)
return log_sum
def gmm(self, max_itr = 50):
ll = []
for itr in range(max_itr):
old_means = self.mean # used for convergance test
gamma = self.compute_gamma()
self.update_theta(gamma)
ll.append(self.log_likelihood())
if np.sum(np.abs(np.array(self.mean) - np.array(old_means))) < 1e-3:
break
return gamma, ll
data = blobs
max_ll = float('-inf')
plt.figure(1, figsize=(8, 6))
legends = []
k = 3
for i in range(1,6):
em = EM_GMM(data, k)
gamma, ll = em.gmm()
    if ll[-1] >= max_ll:
        best_gamma = gamma
        best = em
        max_ll = ll[-1]
print "Converged: ", len(ll)
plt.plot(range(len(ll)), ll , '-', color=colors[i])
legends.append(mpatches.Patch(color=colors[i], label='Iteration: %d' % i))
plt.legend(handles=legends)
plt.show()
idx = best_gamma.argmax(axis=1)
print "Best parameters: "
print "Mean:", best.mean
print "Covar:", best.cov
plt.scatter(data[:,0], data[:,1], color=[colors[i] for i in idx] )
plt.show()
<|file_name|>UdfExecutionFailureInfo.java<|end_file_name|><|fim▁begin|>/*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.facebook.presto.thrift.api.udf;
import com.facebook.drift.annotations.ThriftConstructor;
import com.facebook.drift.annotations.ThriftField;
import com.facebook.drift.annotations.ThriftStruct;
import com.google.common.collect.ImmutableList;
import javax.annotation.Nullable;
import javax.annotation.concurrent.Immutable;
import java.util.List;
import static com.facebook.drift.annotations.ThriftField.Recursiveness.TRUE;
import static com.facebook.drift.annotations.ThriftField.Requiredness.OPTIONAL;
import static java.util.Objects.requireNonNull;
@Immutable
@ThriftStruct
public class UdfExecutionFailureInfo
{
private final String type;
private final String message;
private final UdfExecutionFailureInfo cause;
private final List<UdfExecutionFailureInfo> suppressed;
private final List<String> stack;
@ThriftConstructor
public UdfExecutionFailureInfo(
String type,
String message,
@Nullable UdfExecutionFailureInfo cause,
List<UdfExecutionFailureInfo> suppressed,
List<String> stack)
{
this.type = requireNonNull(type, "type is null");
this.message = requireNonNull(message, "message is null");
this.cause = cause;
this.suppressed = ImmutableList.copyOf(suppressed);
this.stack = ImmutableList.copyOf(stack);
}
    @ThriftField(1)
    public String getType()
    {
        return type;
    }

    @Nullable
@ThriftField(2)
public String getMessage()
{
return message;
}
@Nullable
@ThriftField(value = 3, isRecursive = TRUE, requiredness = OPTIONAL)
public UdfExecutionFailureInfo getCause()
{
return cause;
}
@ThriftField(4)
public List<UdfExecutionFailureInfo> getSuppressed()
{
return suppressed;
}
@ThriftField(5)
public List<String> getStack()
{
return stack;
}
}
<|file_name|>models.py<|end_file_name|><|fim▁begin|># -*- coding: utf-8 -*-
#
# Copyright (c) 2013 Clione Software
# Copyright (c) 2010-2013 Cidadania S. Coop. Galega
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from datetime import datetime
from django.core.validators import RegexValidator
from django.db import models
from django.utils.translation import ugettext_lazy as _
from django.contrib.auth.models import User
from django.contrib.sites.models import Site
from core.spaces.file_validation import ContentTypeRestrictedFileField
from fields import StdImageField
from allowed_types import ALLOWED_CONTENT_TYPES
class Space(models.Model):
"""
Spaces model. This model stores a "space" or "place" also known as a
participative process in reality. Every place has a minimum set of
settings for customization.
There are three main permission roles in every space: administrator
(admins), moderators (mods) and regular users (users).
"""
name = models.CharField(_('Name'), max_length=250, unique=True,
help_text=_('Max: 250 characters'))
url = models.CharField(_('URL'), max_length=100, unique=True,
validators=[RegexValidator(regex='^[a-z0-9_]+$',
message='Invalid characters in the space URL.')],
help_text=_('Valid characters are lowercase, digits and \
underscore. This will be the accesible URL'))
description = models.TextField(_('Description'),
default=_('Write here your description.'))
pub_date = models.DateTimeField(_('Date of creation'), auto_now_add=True)
author = models.ForeignKey(User, blank=True, null=True,
verbose_name=_('Space creator'), help_text=_('Select a user that \
will be marked as creator of the space'))
logo = StdImageField(upload_to='spaces/logos', size=(100, 75, False),
help_text = _('Valid extensions are jpg, jpeg, png and gif'))
banner = StdImageField(upload_to='spaces/banners', size=(500, 75, False),
help_text = _('Valid extensions are jpg, jpeg, png and gif'))
public = models.BooleanField(_('Public space'), help_text=_("This will \
make the space visible to everyone, but registration will be \
necessary to participate."))
# Modules
mod_debate = models.BooleanField(_('Debate'))
mod_proposals = models.BooleanField(_('Proposals'))
mod_news = models.BooleanField(_('News'))
mod_cal = models.BooleanField(_('Calendar'))
mod_docs = models.BooleanField(_('Documents'))
mod_voting = models.BooleanField(_('Voting'))
class Meta:
ordering = ['name']
verbose_name = _('Space')
verbose_name_plural = _('Spaces')
get_latest_by = 'pub_date'
permissions = (
('view_space', 'Can view this space.'),
('admin_space', 'Can administrate this space.'),
('mod_space', 'Can moderate this space.')
)
def __unicode__(self):
return self.name
@models.permalink
def get_absolute_url(self):
return ('space-index', (), {
'space_url': self.url})
class Entity(models.Model):
"""
This model stores the name of the entities responsible for the creation
of the space or supporting it.
"""
name = models.CharField(_('Name'), max_length=100, unique=True)
website = models.CharField(_('Website'), max_length=100, null=True,
blank=True)
logo = models.ImageField(upload_to='spaces/logos', verbose_name=_('Logo'),
blank=True, null=True)
space = models.ForeignKey(Space, blank=True, null=True)
class Meta:
ordering = ['name']
verbose_name = _('Entity')
verbose_name_plural = _('Entities')
def __unicode__(self):
return self.name
class Document(models.Model):
"""
This models stores documents for the space, like a document repository,
There is no restriction in what a user can upload to the space.
:methods: get_file_ext, get_file_size
"""
title = models.CharField(_('Document title'), max_length=100,
help_text=_('Max: 100 characters'))
space = models.ForeignKey(Space, blank=True, null=True,
help_text=_('Change the space to whom belongs this document'))
docfile = ContentTypeRestrictedFileField(_('File'),
upload_to='spaces/documents/%Y/%m/%d',
content_types=ALLOWED_CONTENT_TYPES,
max_upload_size=26214400,
help_text=_('Permitted file types: DOC, DOCX, PPT, ODT, ODF, ODP, \
PDF, RST, TXT.'))
pub_date = models.DateTimeField(auto_now_add=True)
author = models.ForeignKey(User, verbose_name=_('Author'), blank=True,
null=True, help_text=_('Change the user that will figure as the \
author'))
def get_file_ext(self):
filename = self.docfile.name
extension = filename.split('.')
return extension[1].upper()
    def get_file_size(self):
        if self.docfile.size < 1024:
            return str(self.docfile.size) + " Bytes"
        elif self.docfile.size < 1048576:
            return str(round(self.docfile.size / 1024.0, 2)) + " KB"
        else:
            return str(round(self.docfile.size / 1048576.0, 2)) + " MB"
class Meta:
ordering = ['pub_date']
verbose_name = _('Document')
verbose_name_plural = _('Documents')
get_latest_by = 'pub_date'
# There is no 'view-document' view, so I'll leave the get_absolute_url
# method without permalink. Remember that the document files are accesed
# through the url() method in templates.
def get_absolute_url(self):
return '/spaces/%s/docs/%s' % (self.space.url, self.id)
class Event(models.Model):
"""
Meeting data model. Every space (process) has N meetings. This will
keep record of the assistants, meeting name, etc.
"""
title = models.CharField(_('Event name'), max_length=250,
help_text="Max: 250 characters")
space = models.ForeignKey(Space, blank=True, null=True)
user = models.ManyToManyField(User, verbose_name=_('Users'),
help_text=_('List of the users that will assist or assisted to the \
event.'))
pub_date = models.DateTimeField(auto_now_add=True)
event_author = models.ForeignKey(User, verbose_name=_('Created by'),
blank=True, null=True, related_name='meeting_author',
help_text=_('Select the user that will be designated as author.'))
event_date = models.DateTimeField(verbose_name=_('Event date'),
help_text=_('Select the date where the event is celebrated.'))
description = models.TextField(_('Description'), blank=True, null=True)
location = models.TextField(_('Location'), blank=True, null=True)
latitude = models.DecimalField(_('Latitude'), blank=True, null=True,
max_digits=17, decimal_places=15, help_text=_('Specify it in decimal'))
longitude = models.DecimalField(_('Longitude'), blank=True, null=True,
max_digits=17, decimal_places=15, help_text=_('Specify it in decimal'))
def is_due(self):
if self.event_date < datetime.now():
return True
else:
return False
class Meta:
ordering = ['event_date']
verbose_name = _('Event')
verbose_name_plural = _('Events')
get_latest_by = 'event_date'
permissions = (
('view_event', 'Can view this event'),
('admin_event', 'Can administrate this event'),
('mod_event', 'Can moderate this event'),
)
def __unicode__(self):
return self.title
@models.permalink
def get_absolute_url(self):
return ('view-event', (), {
'space_url': self.space.url,
'event_id': str(self.id)})
class Intent(models.Model):
"""
Intent data model. Intent stores the reference of a user-token when a user
asks entering in a restricted space.
.. versionadded: 0.1.5
"""
user = models.ForeignKey(User)
space = models.ForeignKey(Space)
token = models.CharField(max_length=32)
requested_on = models.DateTimeField(auto_now_add=True)
def get_approve_url(self):
site = Site.objects.all()[0]
return "http://%s%sintent/approve/%s" % (site.domain, self.space.get_absolute_url(), self.token)<|fim▁end|> | # You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0 |
<|file_name|>SynchronizedThreading.java<|end_file_name|><|fim▁begin|>package com.wjiec.learn.reordering;
public class SynchronizedThreading {
private int number = 0;
private boolean flag = false;
public synchronized void write() {
number = 1;
flag = true;
}
public synchronized int read() {
if (flag) {
return number * number;
}<|fim▁hole|> return -1;
}
}<|fim▁end|> | |
<|file_name|>index.js<|end_file_name|><|fim▁begin|>/* */
"format cjs";
/**
* @license
* Copyright Google Inc. All Rights Reserved.
*
* Use of this source code is governed by an MIT-style license that can be
* found in the LICENSE file at https://angular.io/license
*/
/**
* @module
* @description
* Entry point for all public APIs of the common/testing package.
*/
export { SpyLocation } from './location_mock';<|fim▁hole|>//# sourceMappingURL=index.js.map<|fim▁end|> | export { MockLocationStrategy } from './mock_location_strategy'; |
<|file_name|>f_expression.py<|end_file_name|><|fim▁begin|>#######################################################################
#
# Author: Gabi Roeger
# Modified by: Silvia Richter ([email protected])
# (C) Copyright 2008: Gabi Roeger and NICTA
#
# This file is part of LAMA.
#
# LAMA is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 3
# of the license, or (at your option) any later version.
#
# LAMA is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, see <http://www.gnu.org/licenses/>.
#
#######################################################################
import string
import conditions
def parse_expression(exp):
if isinstance(exp, list):
functionsymbol = exp[0]
return PrimitiveNumericExpression(functionsymbol,
[conditions.parse_term(arg) for arg in exp[1:]])
elif exp.replace(".","").isdigit():
return NumericConstant(string.atof(exp))
else:
return PrimitiveNumericExpression(exp,[])
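# Illustrative behaviour of parse_expression (reviewer sketch, not original code):
#   parse_expression("4.5")                  -> NumericConstant(4.5)
#   parse_expression("fuel")                 -> PrimitiveNumericExpression("fuel", ())
#   parse_expression(["distance", "a", "b"]) -> PrimitiveNumericExpression over
#       the symbol "distance" and the two parsed argument terms.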
def parse_assignment(alist):
assert len(alist) == 3
op = alist[0]
head = parse_expression(alist[1])<|fim▁hole|> if op == "=":
return Assign(head, exp)
elif op == "increase":
return Increase(head, exp)
else:
assert False, "Assignment operator not supported."
class FunctionalExpression(object):
def __init__(self, parts):
self.parts = tuple(parts)
def dump(self, indent=" "):
print "%s%s" % (indent, self._dump())
for part in self.parts:
part.dump(indent + " ")
def _dump(self):
return self.__class__.__name__
def instantiate(self, var_mapping, init_facts):
raise ValueError("Cannot instantiate condition: not normalized")
class NumericConstant(FunctionalExpression):
parts = ()
def __init__(self, value):
self.value = value
def __eq__(self, other):
return (self.__class__ == other.__class__ and self.value == other.value)
def __str__(self):
return "%s %s" % (self.__class__.__name__, self.value)
def _dump(self):
return str(self)
def instantiate(self, var_mapping, init_facts):
return self
class PrimitiveNumericExpression(FunctionalExpression):
parts = ()
def __init__(self, symbol, args):
self.symbol = symbol
self.args = tuple(args)
def __eq__(self, other):
if not (self.__class__ == other.__class__ and self.symbol == other.symbol
and len(self.args) == len(other.args)):
return False
else:
for s,o in zip(self.args, other.args):
if not s == o:
return False
return True
def __str__(self):
return "%s %s(%s)" % ("PNE", self.symbol, ", ".join(map(str, self.args)))
def dump(self, indent=" "):
print "%s%s" % (indent, self._dump())
for arg in self.args:
arg.dump(indent + " ")
def _dump(self):
return str(self)
def instantiate(self, var_mapping, init_facts):
args = [conditions.ObjectTerm(var_mapping.get(arg.name, arg.name)) for arg in self.args]
pne = PrimitiveNumericExpression(self.symbol, args)
assert not self.symbol == "total-cost"
# We know this expression is constant. Substitute it by corresponding
# initialization from task.
for fact in init_facts:
if isinstance(fact, FunctionAssignment):
if fact.fluent == pne:
return fact.expression
assert False, "Could not find instantiation for PNE!"
class FunctionAssignment(object):
def __init__(self, fluent, expression):
self.fluent = fluent
self.expression = expression
def __str__(self):
return "%s %s %s" % (self.__class__.__name__, self.fluent, self.expression)
def dump(self, indent=" "):
print "%s%s" % (indent, self._dump())
self.fluent.dump(indent + " ")
self.expression.dump(indent + " ")
def _dump(self):
return self.__class__.__name__
def instantiate(self, var_mapping, init_facts):
if not (isinstance(self.expression, PrimitiveNumericExpression) or
isinstance(self.expression, NumericConstant)):
raise ValueError("Cannot instantiate assignment: not normalized")
# We know that this assignment is a cost effect of an action (for initial state
# assignments, "instantiate" is not called). Hence, we know that the fluent is
# the 0-ary "total-cost" which does not need to be instantiated
assert self.fluent.symbol == "total-cost"
fluent = self.fluent
expression = self.expression.instantiate(var_mapping, init_facts)
return self.__class__(fluent, expression)
class Assign(FunctionAssignment):
def __str__(self):
return "%s := %s" % (self.fluent, self.expression)
class Increase(FunctionAssignment):
pass<|fim▁end|> | exp = parse_expression(alist[2]) |
<|file_name|>server.js<|end_file_name|><|fim▁begin|>var fs = require('fs');
var os=require('os');
var express = require('express'),
// wine = require('./routes/wines');
user = require('./services/user');
contact = require('./services/contact');
inbox = require('./services/inbox');
outbox = require('./services/outbox');
device = require('./services/device');
audio = require('./services/audio');
GLOBAL.GCMessage = require('./lib/GCMessage.js');
GLOBAL.Sound = require('./lib/Sound.js');
GLOBAL.PORT = 3000;
GLOBAL.IP = "54.214.9.117";
GLOBAL.HOSTNAME = "vivu.uni.me";
//GLOBAL.IP = "127.0.0.1";
GLOBAL.__dirname = __dirname;
<|fim▁hole|>app.configure(function () {
app.use(express.logger('dev')); /* 'default', 'short', 'tiny', 'dev' */
app.use(express.bodyParser());
app.use(express.static(__dirname + '/public/html'));
});
app.set('views', __dirname + '/views');
app.engine('html', require('ejs').renderFile);
app.get('/$',function (req, res){
res.render('homepage.html');
});
app.get('/audio/flash',function (req, res){
try{
var filePath = __dirname + "/public/flash/dewplayer.swf";
var audioFile = fs.statSync(filePath);
res.setHeader('Content-Type', 'application/x-shockwave-flash');
res.setHeader('Content-Length',audioFile.size);
var readStream = fs.createReadStream(filePath);
readStream.on('data', function(data) {
res.write(data);
});
readStream.on('end', function() {
res.end();
});
}
catch(e){
console.log("Error when process get /audio/:id, error: "+ e);
res.send("Error from server");
}
});
app.get('/cfs/audio/:id',function (req, res){
try{
var audioId = req.params.id;
var ip = GLOBAL.IP;
var host = ip+":8888";
var source = "http://"+host+"/public/audio/"+audioId+".mp3";
res.render('index1.html',
{
sourceurl:source,
sourceid: audioId
});
}
catch(e){
console.log("Error when process get cfs/audio/:id, error: "+ e);
res.send("Error from server");
}
});
/*
app.get('/cfs/audio/:id',function (req, res)
{
console.log("hostname:"+os.hostname());
try{
var audioId = req.params.id;
var ip = GLOBAL.IP;
var host = ip+":8888";
var source = "http://"+host+"/public/audio/"+audioId+".mp3";
//var source = "http://s91.stream.nixcdn.com/3ce626c71801e67a340ef3b7997001be/51828581/NhacCuaTui025/RingBackTone-SunnyHill_4g6z.mp3";
res.render(__dirname + '/public/template/index.jade', {sourceurl:source, sourceid: audioId});
}
catch(e){
console.log("Error when process get cfs/audio/:id, error: "+ e);
res.send("Error from server");
}
});
*/
/*
app.get('/audio/:id',function (req, res)
{
try{
var audioId = req.params.id;
var filePath = __dirname + "/public/audio/"+audioId+".mp3";
var audioFile = fs.statSync(filePath);
res.setHeader('Content-Type', 'audio/mpeg');
res.setHeader('Content-Length',audioFile.size);
var readStream = fs.createReadStream(filePath);
readStream.on('data', function(data) {
res.write(data);
});
readStream.on('end', function() {
res.end();
});
}
catch(e){
console.log("Error when process get /audio/:id, error: "+ e);
res.send("Error from server");
}
});
*/
/*
app.get('/wines', wine.findAll);
app.get('/wines/:id', wine.findById);
app.post('/wines', wine.addWine);
app.put('/wines', wine.updateWine);
app.delete('/wines/:id', wine.deleteWine);
*/
//gcm service
//app.get('/services/gcm', bllUsers);
// user service
app.get('/services/users', user.findAllUsers);
//app.get('/services/users/:id', user.findUserByUserId);
app.get('/services/users/:facebookid',user.findUserByFacebookID);
app.post('/services/users/searchname', user.findUserByUserName);
app.post('/services/users', user.addUser);
app.put('/services/users/:id', user.updateUser);
app.delete('/services/users/:id', user.deleteUserByUserId);
app.delete('/services/users', user.deleteAllUsers);
//contact service
app.get('/services/contacts', contact.findAllContacts);
app.get('/services/contacts/byuserid/:id', contact.getAllContactByUserId);
app.post('/services/contacts', contact.addContact);
app.delete('/services/contacts', contact.deleteAllContacts);
/*
app.post('/services/contacts', user.addUser);
app.put('/services/contacts/:id', user.updateUser);
app.delete('/services/contacts/:id', user.deleteUserByUserId);
*/
//inbox service
app.post('/services/inboxs/delete', inbox.deleteInboxs);
app.post('/services/inboxs', inbox.addInbox);
app.get('/services/inboxs', inbox.findAllInboxs);
app.get('/services/inboxs/byuserid/:id', inbox.findInboxByUserId);
app.delete('/services/inboxs', inbox.deleteAllInboxs);
app.put('/services/inboxs', inbox.updateInboxs);
//app.put('/services/inbox', inbox.updateInbox);
//outbox service
app.post('/services/outboxs/delete', outbox.deleteOutboxs);
app.post('/services/outboxs', outbox.addOutbox);
app.get('/services/outboxs', outbox.findAllInboxs);
app.get('/services/outboxs/byuserid/:id', outbox.findOutboxByUserId);
app.delete('/services/outboxs', outbox.deleteAllOutboxs);
//device service
app.post('/services/devices', device.addDevice);
app.get('/services/devices', device.getAllDevices);
app.get('/services/devices/byuserid/:id', device.getAllDevicesByUserId);
app.delete('/services/devices', device.deleteAllDevices);
//audio service
app.post('/upload', audio.uploadAudio);
app.get('/upload/view', function(req, res){
res.send(
'<form action="/upload" method="post" enctype="multipart/form-data">'+
'<input type="file" name="source">'+
'<input type="submit" value="Upload">'+
'</form>'
);
});
/************************test******************************************/
app.get('/convert', function(req, res){
function removeSubstring(str,strrm){
var newstr = str.replace(strrm,"");
return newstr;
}
GLOBAL.__staticDir = removeSubstring(GLOBAL.__dirname,"/vivuserver/trunk");
var audioDir = GLOBAL.__staticDir + "/staticserver/public/audio/";
var soundlib = new GLOBAL.Sound;
soundlib.convertWavToMp3(audioDir + "24.wav",audioDir + "testout1.mp3");
res.send("Ok");
});
/************************end test******************************************/
app.listen(GLOBAL.PORT);
console.log('Listening on port 3000...');<|fim▁end|> | var app = express();
|
<|file_name|>default.rs<|end_file_name|><|fim▁begin|>use crate::deriving::generic::ty::*;
use crate::deriving::generic::*;
use rustc_ast::ptr::P;
use rustc_ast::walk_list;
use rustc_ast::EnumDef;
use rustc_ast::VariantData;
use rustc_ast::{Expr, MetaItem};
use rustc_errors::Applicability;
use rustc_expand::base::{Annotatable, DummyResult, ExtCtxt};
use rustc_span::symbol::Ident;
use rustc_span::symbol::{kw, sym};
use rustc_span::Span;
use smallvec::SmallVec;
pub fn expand_deriving_default(
cx: &mut ExtCtxt<'_>,
span: Span,
mitem: &MetaItem,
item: &Annotatable,
push: &mut dyn FnMut(Annotatable),
) {
item.visit_with(&mut DetectNonVariantDefaultAttr { cx });
let inline = cx.meta_word(span, sym::inline);
let attrs = vec![cx.attribute(inline)];
let trait_def = TraitDef {
span,
attributes: Vec::new(),
path: Path::new(vec![kw::Default, sym::Default]),
additional_bounds: Vec::new(),
generics: Bounds::empty(),
is_unsafe: false,
supports_unions: false,
methods: vec![MethodDef {
name: kw::Default,
generics: Bounds::empty(),
explicit_self: None,
args: Vec::new(),
ret_ty: Self_,
attributes: attrs,
is_unsafe: false,
unify_fieldless_variants: false,
combine_substructure: combine_substructure(Box::new(|cx, trait_span, substr| {
match substr.fields {
StaticStruct(_, fields) => {
default_struct_substructure(cx, trait_span, substr, fields)
}
StaticEnum(enum_def, _) => {
if !cx.sess.features_untracked().derive_default_enum {
rustc_session::parse::feature_err(
cx.parse_sess(),
sym::derive_default_enum,
span,
"deriving `Default` on enums is experimental",
)
.emit();
}
default_enum_substructure(cx, trait_span, enum_def)
}
_ => cx.span_bug(trait_span, "method in `derive(Default)`"),
}
})),
}],
associated_types: Vec::new(),
};
trait_def.expand(cx, mitem, item, push)
}
fn default_struct_substructure(
cx: &mut ExtCtxt<'_>,
trait_span: Span,
substr: &Substructure<'_>,
summary: &StaticFields,
) -> P<Expr> {
// Note that `kw::Default` is "default" and `sym::Default` is "Default"!
let default_ident = cx.std_path(&[kw::Default, sym::Default, kw::Default]);
let default_call = |span| cx.expr_call_global(span, default_ident.clone(), Vec::new());<|fim▁hole|> cx.expr_ident(trait_span, substr.type_ident)
} else {
let exprs = fields.iter().map(|sp| default_call(*sp)).collect();
cx.expr_call_ident(trait_span, substr.type_ident, exprs)
}
}
Named(ref fields) => {
let default_fields = fields
.iter()
.map(|&(ident, span)| cx.field_imm(span, ident, default_call(span)))
.collect();
cx.expr_struct_ident(trait_span, substr.type_ident, default_fields)
}
}
}
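// Illustrative expansion (reviewer sketch): for `#[derive(Default)] struct Point { x: i32, y: i32 }`
// the code above generates roughly `Point { x: Default::default(), y: Default::default() }`,
// and `Point(Default::default(), Default::default())` for the tuple-struct case.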
fn default_enum_substructure(
cx: &mut ExtCtxt<'_>,
trait_span: Span,
enum_def: &EnumDef,
) -> P<Expr> {
let default_variant = match extract_default_variant(cx, enum_def, trait_span) {
Ok(value) => value,
Err(()) => return DummyResult::raw_expr(trait_span, true),
};
// At this point, we know that there is exactly one variant with a `#[default]` attribute. The
// attribute hasn't yet been validated.
if let Err(()) = validate_default_attribute(cx, default_variant) {
return DummyResult::raw_expr(trait_span, true);
}
// We now know there is exactly one unit variant with exactly one `#[default]` attribute.
cx.expr_path(cx.path(
default_variant.span,
vec![Ident::new(kw::SelfUpper, default_variant.span), default_variant.ident],
))
}
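// Illustrative expansion (reviewer sketch): for `enum E { #[default] A, B }`
// the generated `default()` body is simply the path expression `Self::A`.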
fn extract_default_variant<'a>(
cx: &mut ExtCtxt<'_>,
enum_def: &'a EnumDef,
trait_span: Span,
) -> Result<&'a rustc_ast::Variant, ()> {
let default_variants: SmallVec<[_; 1]> = enum_def
.variants
.iter()
.filter(|variant| cx.sess.contains_name(&variant.attrs, kw::Default))
.collect();
let variant = match default_variants.as_slice() {
[variant] => variant,
[] => {
let possible_defaults = enum_def
.variants
.iter()
.filter(|variant| matches!(variant.data, VariantData::Unit(..)))
.filter(|variant| !cx.sess.contains_name(&variant.attrs, sym::non_exhaustive));
let mut diag = cx.struct_span_err(trait_span, "no default declared");
diag.help("make a unit variant default by placing `#[default]` above it");
for variant in possible_defaults {
// Suggest making each unit variant default.
diag.tool_only_span_suggestion(
variant.span,
&format!("make `{}` default", variant.ident),
format!("#[default] {}", variant.ident),
Applicability::MaybeIncorrect,
);
}
diag.emit();
return Err(());
}
[first, rest @ ..] => {
let mut diag = cx.struct_span_err(trait_span, "multiple declared defaults");
diag.span_label(first.span, "first default");
diag.span_labels(rest.iter().map(|variant| variant.span), "additional default");
diag.note("only one variant can be default");
for variant in &default_variants {
// Suggest making each variant already tagged default.
let suggestion = default_variants
.iter()
.filter_map(|v| {
if v.ident == variant.ident {
None
} else {
Some((cx.sess.find_by_name(&v.attrs, kw::Default)?.span, String::new()))
}
})
.collect();
diag.tool_only_multipart_suggestion(
&format!("make `{}` default", variant.ident),
suggestion,
Applicability::MaybeIncorrect,
);
}
diag.emit();
return Err(());
}
};
if !matches!(variant.data, VariantData::Unit(..)) {
cx.struct_span_err(
variant.ident.span,
"the `#[default]` attribute may only be used on unit enum variants",
)
.help("consider a manual implementation of `Default`")
.emit();
return Err(());
}
if let Some(non_exhaustive_attr) = cx.sess.find_by_name(&variant.attrs, sym::non_exhaustive) {
cx.struct_span_err(variant.ident.span, "default variant must be exhaustive")
.span_label(non_exhaustive_attr.span, "declared `#[non_exhaustive]` here")
.help("consider a manual implementation of `Default`")
.emit();
return Err(());
}
Ok(variant)
}
fn validate_default_attribute(
cx: &mut ExtCtxt<'_>,
default_variant: &rustc_ast::Variant,
) -> Result<(), ()> {
let attrs: SmallVec<[_; 1]> =
cx.sess.filter_by_name(&default_variant.attrs, kw::Default).collect();
let attr = match attrs.as_slice() {
[attr] => attr,
[] => cx.bug(
"this method must only be called with a variant that has a `#[default]` attribute",
),
[first, rest @ ..] => {
// FIXME(jhpratt) Do we want to perform this check? It doesn't exist
// for `#[inline]`, `#[non_exhaustive]`, and presumably others.
let suggestion_text =
if rest.len() == 1 { "try removing this" } else { "try removing these" };
cx.struct_span_err(default_variant.ident.span, "multiple `#[default]` attributes")
.note("only one `#[default]` attribute is needed")
.span_label(first.span, "`#[default]` used here")
.span_label(rest[0].span, "`#[default]` used again here")
.span_help(rest.iter().map(|attr| attr.span).collect::<Vec<_>>(), suggestion_text)
// This would otherwise display the empty replacement, hence the otherwise
// repetitive `.span_help` call above.
.tool_only_multipart_suggestion(
suggestion_text,
rest.iter().map(|attr| (attr.span, String::new())).collect(),
Applicability::MachineApplicable,
)
.emit();
return Err(());
}
};
if !attr.is_word() {
cx.struct_span_err(attr.span, "`#[default]` attribute does not accept a value")
.span_suggestion_hidden(
attr.span,
"try using `#[default]`",
"#[default]".into(),
Applicability::MaybeIncorrect,
)
.emit();
return Err(());
}
Ok(())
}
struct DetectNonVariantDefaultAttr<'a, 'b> {
cx: &'a ExtCtxt<'b>,
}
impl<'a, 'b> rustc_ast::visit::Visitor<'a> for DetectNonVariantDefaultAttr<'a, 'b> {
fn visit_attribute(&mut self, attr: &'a rustc_ast::Attribute) {
if attr.has_name(kw::Default) {
self.cx
.struct_span_err(
attr.span,
"the `#[default]` attribute may only be used on unit enum variants",
)
.emit();
}
rustc_ast::visit::walk_attribute(self, attr);
}
fn visit_variant(&mut self, v: &'a rustc_ast::Variant) {
self.visit_ident(v.ident);
self.visit_vis(&v.vis);
self.visit_variant_data(&v.data);
walk_list!(self, visit_anon_const, &v.disr_expr);
for attr in &v.attrs {
rustc_ast::visit::walk_attribute(self, attr);
}
}
}<|fim▁end|> |
match summary {
Unnamed(ref fields, is_tuple) => {
if !is_tuple { |
<|file_name|>create_dns_zone.py<|end_file_name|><|fim▁begin|>from lib.action import PyraxBaseAction
from lib.formatters import to_dns_zone_dict
__all__ = [
'CreateDNSZoneAction'
]<|fim▁hole|>class CreateDNSZoneAction(PyraxBaseAction):
def run(self, name, email_address, ttl=None, comment=None):
cdns = self.pyrax.cloud_dns
zone = cdns.create(name=name, emailAddress=email_address, ttl=ttl,
comment=comment)
result = to_dns_zone_dict(zone)
return result<|fim▁end|> | |
<|file_name|>execute_action.cc<|end_file_name|><|fim▁begin|>// Copyright 2015 The Chromium Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#include "chrome/browser/ash/file_system_provider/operations/execute_action.h"
#include <algorithm>
#include <string>
#include "chrome/common/extensions/api/file_system_provider.h"
#include "chrome/common/extensions/api/file_system_provider_internal.h"
namespace ash {
namespace file_system_provider {
namespace operations {
ExecuteAction::ExecuteAction(extensions::EventRouter* event_router,
const ProvidedFileSystemInfo& file_system_info,
const std::vector<base::FilePath>& entry_paths,
const std::string& action_id,
storage::AsyncFileUtil::StatusCallback callback)
: Operation(event_router, file_system_info),
entry_paths_(entry_paths),
action_id_(action_id),
callback_(std::move(callback)) {}
ExecuteAction::~ExecuteAction() {<|fim▁hole|>
bool ExecuteAction::Execute(int request_id) {
using extensions::api::file_system_provider::ExecuteActionRequestedOptions;
ExecuteActionRequestedOptions options;
options.file_system_id = file_system_info_.file_system_id();
options.request_id = request_id;
for (const auto& entry_path : entry_paths_)
options.entry_paths.push_back(entry_path.AsUTF8Unsafe());
options.action_id = action_id_;
return SendEvent(
request_id,
extensions::events::FILE_SYSTEM_PROVIDER_ON_EXECUTE_ACTION_REQUESTED,
extensions::api::file_system_provider::OnExecuteActionRequested::
kEventName,
extensions::api::file_system_provider::OnExecuteActionRequested::Create(
options));
}
void ExecuteAction::OnSuccess(int /* request_id */,
std::unique_ptr<RequestValue> result,
bool has_more) {
DCHECK(callback_);
std::move(callback_).Run(base::File::FILE_OK);
}
void ExecuteAction::OnError(int /* request_id */,
std::unique_ptr<RequestValue> /* result */,
base::File::Error error) {
DCHECK(callback_);
std::move(callback_).Run(error);
}
} // namespace operations
} // namespace file_system_provider
} // namespace ash<|fim▁end|> | } |
<|file_name|>dispatch.py<|end_file_name|><|fim▁begin|># Copyright 2017-2020 The GPflow Contributors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from deprecated import deprecated
from ..utilities import Dispatcher
conditional = Dispatcher("conditional")
conditional._gpflow_internal_register = conditional.register
# type-ignore below is because mypy doesn't like it when we assign to a function.
conditional.register = deprecated( # type: ignore
reason="Registering new implementations of conditional() is deprecated. "<|fim▁hole|>)(conditional._gpflow_internal_register)
sample_conditional = Dispatcher("sample_conditional")<|fim▁end|> | "Instead, create your own subclass of gpflow.posteriors.AbstractPosterior "
"and register an implementation of gpflow.posteriors.get_posterior_class "
"that returns your class." |
<|file_name|>AtomicState.cpp<|end_file_name|><|fim▁begin|>/**<|fim▁hole|> * License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*
**/
#include "stateengine/AtomicState.h"
#include "stateengine/Defines.h"
//--------------------------------------------------------------------
// TAtomicState
//--------------------------------------------------------------------
//--------------------------------------------------------------------
namespace donut
{
typedef void (*TUpdaterFunction)(const TStateEngineId&, double);
typedef void (*TStateCallBack)(const TStateEngineId&, TStateData *);
//--------------------------------------------------------------------
TAtomicState::TAtomicState(TStateEngineId parStateEngineId, TStateId parId, TStateData (* parTStateData), void (* parEnterCallBack), void (* parLeaveCallBack), void (* parUpdater))
: FStateEngineId (parStateEngineId)
{
FEnterCallBack = parEnterCallBack;
FLeaveCallBack = parLeaveCallBack;
FStateData = parTStateData;
FUpdater = parUpdater;
FId = parId;
}
TAtomicState::~TAtomicState()
{
// assert_msg_NO_RELEASE(FStateData!=NULL, "The state data has been already deleted. you do not need to.")
delete FStateData;
}
void TAtomicState::AddTransition(TTransitionId parId, TTransition * parTransition)
{
FOutTransitions[parId] = parTransition;
}
#if _DEBUG
void TAtomicState::AddInTransition(const TTransition * parTransition)
{
// FInTransitions.push_back(parTransition)
}
#endif
void TAtomicState::Update(double parDt)
{
		// FUpdater is stored as an opaque callback pointer; reinterpret it as
		// the updater signature before invoking it.
		void (*updater)(const TStateEngineId&, double) = *((TUpdaterFunction*) (&FUpdater));
		updater(FStateEngineId, parDt);
}
void TAtomicState::Enter()
{
void (*enter)(const TStateEngineId&, TStateData *) = *((TStateCallBack*) (&FEnterCallBack));
enter(FStateEngineId, FStateData);
}
void TAtomicState::Leave()
{
void (*leave)(const TStateEngineId&, TStateData *) = *((TStateCallBack*) (&FLeaveCallBack));
leave(FStateEngineId, FStateData);
}
void TAtomicState::TransitionCallBack()
{
}
} // End namestate StateEngine<|fim▁end|> | * This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as
* published by the Free Software Foundation, either version 3 of the |
<|file_name|>CitiesAdapter.java<|end_file_name|><|fim▁begin|>package com.ruenzuo.weatherapp.adapters;
import android.app.Activity;
import android.content.Context;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.ArrayAdapter;
import android.widget.TextView;
import com.ruenzuo.weatherapp.R;<|fim▁hole|>import com.ruenzuo.weatherapp.models.City;
/**
* Created by ruenzuo on 08/05/14.
*/
public class CitiesAdapter extends ArrayAdapter<City> {
private int resourceId;
public CitiesAdapter(Context context, int resource) {
super(context, resource);
resourceId = resource;
}
@Override
public View getView(int position, View convertView, ViewGroup parent) {
if (convertView == null) {
LayoutInflater inflater = ((Activity)getContext()).getLayoutInflater();
convertView = inflater.inflate(resourceId, null);
}
        City city = getItem(position);
        TextView txtViewCityName = (TextView) convertView.findViewById(R.id.txtViewCityName);
        txtViewCityName.setText(city.getName());
return convertView;
}
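    // Performance note (reviewer sketch): caching the TextView in a ViewHolder
    // attached via convertView.setTag() would avoid repeating the
    // findViewById() lookup on every getView() call.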
}<|fim▁end|> | |
<|file_name|>0003_auto_20161206_0538.py<|end_file_name|><|fim▁begin|># -*- coding: utf-8 -*-
# Generated by Django 1.10.4 on 2016-12-06 05:38
from __future__ import unicode_literals<|fim▁hole|>from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('api', '0002_hackathon_name'),
]
operations = [
migrations.AlterField(
model_name='hackathon',
name='name',
field=models.CharField(max_length=100, unique=True),
),
]<|fim▁end|> | |
<|file_name|>BlockModelRenderer.java<|end_file_name|><|fim▁begin|>package org.craft.client.render.blocks;
import java.util.*;
import java.util.Map.Entry;
import com.google.common.collect.*;
import org.craft.blocks.*;
import org.craft.blocks.states.*;
import org.craft.client.*;
import org.craft.client.models.*;
import org.craft.client.render.*;
import org.craft.client.render.texture.TextureIcon;
import org.craft.client.render.texture.TextureMap;
import org.craft.maths.*;
import org.craft.utils.*;
import org.craft.world.*;
public class BlockModelRenderer extends AbstractBlockRenderer<|fim▁hole|>{
private HashMap<BlockVariant, HashMap<String, TextureIcon>> icons;
private List<BlockVariant> blockVariants;
private static Quaternion rotationQuaternion;
/**
* Creates a new renderer for given block variants
*/
public BlockModelRenderer(List<BlockVariant> list)
{
this.blockVariants = list;
icons = Maps.newHashMap();
for(BlockVariant v : list)
icons.put(v, new HashMap<String, TextureIcon>());
if(rotationQuaternion == null)
rotationQuaternion = new Quaternion();
}
@Override
public void render(RenderEngine engine, OffsettedOpenGLBuffer buffer, World w, Block b, int x, int y, int z)
{
if(!b.shouldRender())
return;
Chunk chunk = null;
if(w != null)
chunk = w.getChunk(x, y, z);
BlockVariant variant = null;
float lightValue = 1f;
if(chunk == null)
variant = blockVariants.get(0);
else
{
variant = getVariant(w, b, x, y, z);
lightValue = chunk.getLightValue(w, x, y, z);
}
if(variant == null)
variant = blockVariants.get(0);
Model blockModel = variant.getModels().get(w == null ? 0 : w.getRNG().nextInt(variant.getModels().size())); // TODO: random model ?
for(int i = 0; i < blockModel.getElementsCount(); i++ )
{
ModelElement element = blockModel.getElement(i);
if(element.hasRotation())
{
                Vector3 axis = Vector3.xAxis;
                String rotationAxis = element.getRotationAxis();
                if(rotationAxis != null)
                {
                    if(rotationAxis.equalsIgnoreCase("y"))
                        axis = Vector3.yAxis;
                    else if(rotationAxis.equalsIgnoreCase("z"))
                        axis = Vector3.zAxis;
                }
rotationQuaternion.init(axis, (float) Math.toRadians(element.getRotationAngle()));
}
Set<Entry<String, ModelFace>> entries = element.getFaces().entrySet();
Vector3 startPos = element.getFrom();
Vector3 size = element.getTo().sub(startPos);
for(Entry<String, ModelFace> entry : entries)
{
Vector3 faceStart = Vector3.NULL;
Vector3 faceSize = Vector3.NULL;
TextureIcon icon = getTexture(blockModel, variant, entry.getValue().getTexture());
boolean flip = false;
EnumSide cullface = EnumSide.fromString(entry.getValue().getCullface());
EnumSide side = EnumSide.fromString(entry.getKey());
if(side != null)
{
if(entry.getValue().hideIfSameAdjacent() && w.getBlockNextTo(x, y, z, side) == b)
{
continue;
}
}
if(cullface != EnumSide.UNDEFINED)
{
Block next = Blocks.air;
if(w != null)
next = w.getBlockNextTo(x, y, z, cullface);
if(next.isSideOpaque(w, x, y, z, cullface.opposite()))
{
continue;
}
}
if(entry.getKey().equals("up"))
{
faceStart = Vector3.get(startPos.getX(), startPos.getY() + size.getY(), startPos.getZ());
faceSize = Vector3.get(size.getX(), 0, size.getZ());
flip = true;
}
else if(entry.getKey().equals("down"))
{
faceStart = Vector3.get(startPos.getX(), startPos.getY(), startPos.getZ());
faceSize = Vector3.get(size.getX(), 0, size.getZ());
flip = true;
}
else if(entry.getKey().equals("west"))
{
faceStart = Vector3.get(startPos.getX(), startPos.getY(), startPos.getZ());
faceSize = Vector3.get(0, size.getY(), size.getZ());
}
else if(entry.getKey().equals("east"))
{
faceStart = Vector3.get(startPos.getX() + size.getX(), startPos.getY(), startPos.getZ());
faceSize = Vector3.get(0, size.getY(), size.getZ());
}
else if(entry.getKey().equals("north"))
{
faceStart = Vector3.get(startPos.getX(), startPos.getY(), startPos.getZ());
faceSize = Vector3.get(size.getX(), size.getY(), 0);
}
else if(entry.getKey().equals("south"))
{
faceStart = Vector3.get(startPos.getX(), startPos.getY(), startPos.getZ() + size.getZ());
faceSize = Vector3.get(size.getX(), size.getY(), 0);
}
else
{
continue;
}
renderFace(lightValue, buffer, x, y, z, icon, faceStart, faceSize, flip, entry.getValue().getMinUV(), entry.getValue().getMaxUV(), element.getRotationOrigin(), rotationQuaternion, element.shouldRescale());
faceSize.dispose();
faceStart.dispose();
}
size.dispose();
}
}
/**
     * Returns the most relevant variant depending on the block state values at (x,y,z)
*/
private BlockVariant getVariant(World w, Block b, int x, int y, int z)
{
BlockVariant variant = null;
variantLoop: for(BlockVariant v : blockVariants)
{
if(v.getBlockStates() == null && variant == null)
{
variant = v;
}
else if(v.getBlockStates().isEmpty() && variant == null)
{
variant = v;
}
else
{
for(int i = 0; i < v.getBlockStates().size(); i++ )
{
BlockState state = v.getBlockStates().get(i);
IBlockStateValue value = v.getBlockStateValues().get(i);
if(w.getBlockState(x, y, z, state) != value)
{
continue variantLoop;
}
}
if(variant != null)
{
if(variant.getBlockStates().size() <= v.getBlockStates().size())
{
variant = v;
}
else if(variant.getBlockStates().size() > v.getBlockStates().size())
continue variantLoop;
}
else
{
variant = v;
}
}
}
return variant;
}
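    // Selection rule illustrated (reviewer sketch): among variants whose block
    // state constraints all hold at (x,y,z), the variant with the most
    // constraints wins; a variant with no constraints acts as the fallback.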
/**
* Gets TextureIcon from texture variable found in json model file
*/
private TextureIcon getTexture(Model blockModel, BlockVariant variant, String texture)
{
if(texture == null)
return null;
if(!icons.get(variant).containsKey(texture))
{
if(texture.startsWith("#"))
{
TextureIcon icon = getTexture(blockModel, variant, blockModel.getTexturePath(texture.substring(1)));
icons.get(variant).put(texture, icon);
}
else
{
TextureMap blockMap = OurCraft.getOurCraft().getRenderEngine().blocksAndItemsMap;
icons.get(variant).put(texture, blockMap.get(texture + ".png"));
}
}
return icons.get(variant).get(texture);
}
@Override
public boolean shouldRenderInPass(EnumRenderPass currentPass, World w, Block b, int x, int y, int z)
{
BlockVariant variant = getVariant(w, b, x, y, z);
if(variant == null)
return false;
return currentPass == variant.getPass();
}
}<|fim▁end|> | |
<|file_name|>RR_action_server.py<|end_file_name|><|fim▁begin|>#!/usr/bin/env python
"""
# Software License Agreement (BSD License)
#
# Copyright (c) 2012, University of California, Berkeley
# All rights reserved.
# Authors: Cameron Lee ([email protected]) and Dmitry Berenson (
[email protected])
#
# Redistribution and use in source and binary forms, with or without<|fim▁hole|># are met:
#
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following
# disclaimer in the documentation and/or other materials provided
# with the distribution.
# * Neither the name of University of California, Berkeley nor the names
of its
# contributors may be used to endorse or promote products derived
# from this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
# FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
# COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
# LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
# ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
"""
"""
This node advertises an action which is used by the main lightning node
(see run_lightning.py) to run the Retrieve and Repair portion of LightningROS.
This node relies on a planner_stoppable type node to repair the paths and on
the PathTools python library (a library it calls directly, not a separate
node) both to retrieve paths from the path library and to call the
collision_checker service and advertise a topic for displaying markers in
RViz.
"""
import roslib
import rospy
import actionlib
import threading
from tools.PathTools import PlanTrajectoryWrapper, InvalidSectionWrapper, DrawPointsWrapper
from pathlib.PathLibrary import *
from lightning.msg import Float64Array, RRAction, RRResult
from lightning.msg import StopPlanning, RRStats
from lightning.srv import ManagePathLibrary, ManagePathLibraryResponse
import sys
import pickle
import time
# Name of this node.
RR_NODE_NAME = "rr_node"
# Name to use for stopping the repair planner. Published from this node.
STOP_PLANNER_NAME = "stop_rr_planning"
# Topic to subscribe to for stopping the whole node in the middle of processing.
STOP_RR_NAME = "stop_all_rr"
# Name of library managing service run from this node.
MANAGE_LIBRARY = "manage_path_library"
STATE_RETRIEVE, STATE_REPAIR, STATE_RETURN_PATH, STATE_FINISHED = (0, 1, 2, 3)
class RRNode:
def __init__(self):
# Retrieve ROS parameters and configuration and cosntruct various objects.
self.robot_name = rospy.get_param("robot_name")
self.planner_config_name = rospy.get_param("planner_config_name")
self.current_joint_names = []
self.current_group_name = ""
self.plan_trajectory_wrapper = PlanTrajectoryWrapper("rr", int(rospy.get_param("~num_rr_planners")))
self.invalid_section_wrapper = InvalidSectionWrapper()
self.path_library = PathLibrary(rospy.get_param("~path_library_dir"), rospy.get_param("step_size"), node_size=int(rospy.get_param("~path_library_path_node_size")), sg_node_size=int(rospy.get_param("~path_library_sg_node_size")), dtw_dist=float(rospy.get_param("~dtw_distance")))
self.num_paths_checked = int(rospy.get_param("~num_paths_to_collision_check"))
self.stop_lock = threading.Lock()
self.stop = True
self.rr_server = actionlib.SimpleActionServer(RR_NODE_NAME, RRAction, execute_cb=self._retrieve_repair, auto_start=False)
self.rr_server.start()
self.stop_rr_subscriber = rospy.Subscriber(STOP_RR_NAME, StopPlanning, self._stop_rr_planner)
self.stop_rr_planner_publisher = rospy.Publisher(STOP_PLANNER_NAME, StopPlanning, queue_size=10)
self.manage_library_service = rospy.Service(MANAGE_LIBRARY, ManagePathLibrary, self._do_manage_action)
self.stats_pub = rospy.Publisher("rr_stats", RRStats, queue_size=10)
self.repaired_sections_lock = threading.Lock()
self.repaired_sections = []
self.working_lock = threading.Lock() #to ensure that node is not doing RR and doing a library management action at the same time
#if draw_points is True, then display points in rviz
self.draw_points = rospy.get_param("draw_points")
if self.draw_points:
self.draw_points_wrapper = DrawPointsWrapper()
def _set_repaired_section(self, index, section):
"""
After you have done the path planning to repair a section, store
the repaired path section.
Args:
index (int): the index corresponding to the section being repaired.
section (path, list of list of float): A path to store.
"""
self.repaired_sections_lock.acquire()
self.repaired_sections[index] = section
self.repaired_sections_lock.release()
def _call_planner(self, start, goal, planning_time):
"""
Calls a standard planner to plan between two points with an allowed
planning time.
Args:
start (list of float): A joint configuration corresponding to the
start position of the path.
            goal (list of float): The joint configuration corresponding to the
            goal position for the path.
            planning_time (float): Maximum allowed planning time, in seconds.
Returns:
path: A list of joint configurations corresponding to the planned
path.
"""
ret = None
planner_number = self.plan_trajectory_wrapper.acquire_planner()
if not self._need_to_stop():
ret = self.plan_trajectory_wrapper.plan_trajectory(start, goal, planner_number, self.current_joint_names, self.current_group_name, planning_time, self.planner_config_name)
self.plan_trajectory_wrapper.release_planner(planner_number)
return ret
def _repair_thread(self, index, start, goal, start_index, goal_index, planning_time):
"""
Handles repairing a portion of the path.
All that this function really does is to plan from scratch between
the start and goal configurations and then store the planned path
in the appropriate places and draws either the repaired path or, if
the repair fails, the start and goal.
Args:
index (int): The index to pass to _set_repaired_section(),
corresponding to which of the invalid sections of the path we are
repairing.
start (list of float): The start joint configuration to use.
goal (list of float): The goal joint configuration to use.
start_index (int): The index in the overall path corresponding to
start. Only used for debugging info.
goal_index (int): The index in the overall path corresponding to
goal. Only used for debugging info.
planning_time (float): Maximum allowed time to spend planning, in
seconds.
"""
repaired_path = self._call_planner(start, goal, planning_time)
if self.draw_points:
if repaired_path is not None and len(repaired_path) > 0:
rospy.loginfo("RR action server: got repaired section with start = %s, goal = %s" % (repaired_path[0], repaired_path[-1]))
self.draw_points_wrapper.draw_points(repaired_path, self.current_group_name, "repaired"+str(start_index)+"_"+str(goal_index), DrawPointsWrapper.ANGLES, DrawPointsWrapper.GREENBLUE, 1.0, 0.01)
else:
if self.draw_points:
rospy.loginfo("RR action server: path repair for section (%i, %i) failed, start = %s, goal = %s" % (start_index, goal_index, start, goal))
self.draw_points_wrapper.draw_points([start, goal], self.current_group_name, "failed_repair"+str(start_index)+"_"+str(goal_index), DrawPointsWrapper.ANGLES, DrawPointsWrapper.GREENBLUE, 1.0)
if self._need_to_stop():
self._set_repaired_section(index, None)
else:
self._set_repaired_section(index, repaired_path)
    def _need_to_stop(self):
        with self.stop_lock:
            return self.stop
    def _set_stop_value(self, val):
        with self.stop_lock:
            self.stop = val
def do_retrieved_path_drawing(self, projected, retrieved, invalid):
"""
Draws the points from the various paths involved in the planning
in different colors in different namespaces.
All of the arguments are lists of joint configurations, where each
joint configuration is a list of joint angles.
The only distinction between the different arguments being passed in
are which color the points in question are being drawn in.
Uses the DrawPointsWrapper to draw the points.
Args:
projected (list of list of float): List of points to draw as
projected between the library path and the actual start/goal
position. Will be drawn in blue.
retrieved (list of list of float): The path retrieved straight
from the path library. Will be drawn in white.
invalid (list of list of float): List of points which were invalid.
Will be drawn in red.
"""
if len(projected) > 0:
if self.draw_points:
self.draw_points_wrapper.draw_points(retrieved, self.current_group_name, "retrieved", DrawPointsWrapper.ANGLES, DrawPointsWrapper.WHITE, 0.1)
projectionDisplay = projected[:projected.index(retrieved[0])]+projected[projected.index(retrieved[-1])+1:]
self.draw_points_wrapper.draw_points(projectionDisplay, self.current_group_name, "projection", DrawPointsWrapper.ANGLES, DrawPointsWrapper.BLUE, 0.2)
invalidDisplay = []
for invSec in invalid:
invalidDisplay += projected[invSec[0]+1:invSec[-1]]
self.draw_points_wrapper.draw_points(invalidDisplay, self.current_group_name, "invalid", DrawPointsWrapper.ANGLES, DrawPointsWrapper.RED, 0.2)
def _retrieve_repair(self, action_goal):
"""
Callback which performs the full Retrieve and Repair for the path.
"""
self.working_lock.acquire()
self.start_time = time.time()
self.stats_msg = RRStats()
self._set_stop_value(False)
if self.draw_points:
self.draw_points_wrapper.clear_points()
rospy.loginfo("RR action server: RR got an action goal")
s, g = action_goal.start, action_goal.goal
res = RRResult()
res.status.status = res.status.FAILURE
self.current_joint_names = action_goal.joint_names
self.current_group_name = action_goal.group_name
projected, retrieved, invalid = [], [], []
repair_state = STATE_RETRIEVE
self.stats_msg.init_time = time.time() - self.start_time
# Go through the retrieve, repair, and return stages of the planning.
# The while loop should only ever go through 3 iterations, one for each
# stage.
while not self._need_to_stop() and repair_state != STATE_FINISHED:
if repair_state == STATE_RETRIEVE:
start_retrieve = time.time()
projected, retrieved, invalid = self.path_library.retrieve_path(s, g, self.num_paths_checked, self.robot_name, self.current_group_name, self.current_joint_names)
self.stats_msg.retrieve_time.append(time.time() - start_retrieve)
if len(projected) == 0:
rospy.loginfo("RR action server: got an empty path for retrieve state")
repair_state = STATE_FINISHED
else:
start_draw = time.time()
if self.draw_points:
self.do_retrieved_path_drawing(projected, retrieved, invalid)
self.stats_msg.draw_time.append(time.time() - start_draw)
repair_state = STATE_REPAIR
elif repair_state == STATE_REPAIR:
start_repair = time.time()
repaired = self._path_repair(projected, action_goal.allowed_planning_time.to_sec(), invalid_sections=invalid)
self.stats_msg.repair_time.append(time.time() - start_repair)
if repaired is None:
rospy.loginfo("RR action server: path repair didn't finish")
repair_state = STATE_FINISHED
else:
repair_state = STATE_RETURN_PATH
elif repair_state == STATE_RETURN_PATH:
start_return = time.time()
res.status.status = res.status.SUCCESS
res.retrieved_path = [Float64Array(p) for p in retrieved]
res.repaired_path = [Float64Array(p) for p in repaired]
rospy.loginfo("RR action server: returning a path")
repair_state = STATE_FINISHED
self.stats_msg.return_time = time.time() - start_return
if repair_state == STATE_RETRIEVE:
rospy.loginfo("RR action server: stopped before it retrieved a path")
elif repair_state == STATE_REPAIR:
rospy.loginfo("RR action server: stopped before it could repair a retrieved path")
elif repair_state == STATE_RETURN_PATH:
rospy.loginfo("RR action server: stopped before it could return a repaired path")
self.rr_server.set_succeeded(res)
self.stats_msg.total_time = time.time() - self.start_time
self.stats_pub.publish(self.stats_msg)
self.working_lock.release()
def _path_repair(self, original_path, planning_time, invalid_sections=None, use_parallel_repairing=True):
"""
Goes through each invalid section in a path and calls a planner to
repair it, with the potential for multi-threading. Returns the
repaired path.
Args:
original_path (path): The original path which needs repairing.
planning_time (float): The maximum allowed planning time for
each repair, in seconds.
invalid_sections (list of pairs of indicies): The pairs of indicies
describing the invalid sections. If None, then the invalid
sections will be computed by this function.
use_parallel_repairing (bool): Whether or not to use multi-threading.
Returns:
path: The repaired path.
"""
zeros_tuple = tuple([0 for i in xrange(len(self.current_joint_names))])
rospy.loginfo("RR action server: got path with %d points" % len(original_path))
if invalid_sections is None:
invalid_sections = self.invalid_section_wrapper.getInvalidSectionsForPath(original_path, self.current_group_name)
rospy.loginfo("RR action server: invalid sections: %s" % (str(invalid_sections)))
if len(invalid_sections) > 0:
if invalid_sections[0][0] == -1:
rospy.loginfo("RR action server: Start is not a valid state...nothing can be done")
return None
if invalid_sections[-1][1] == len(original_path):
rospy.loginfo("RR action server: Goal is not a valid state...nothing can be done")
return None
if use_parallel_repairing:
#multi-threaded repairing
self.repaired_sections = [None for i in xrange(len(invalid_sections))]
#each thread replans an invalid section
threadList = []
for i, sec in enumerate(invalid_sections):
th = threading.Thread(target=self._repair_thread, args=(i, original_path[sec[0]], original_path[sec[-1]], sec[0], sec[-1], planning_time))
threadList.append(th)
th.start()
for th in threadList:
th.join()
#once all threads return, then the repaired sections can be combined
for item in self.repaired_sections:
if item is None:
rospy.loginfo("RR action server: RR node was stopped during repair or repair failed")
return None
#replace invalid sections with replanned sections
new_path = original_path[0:invalid_sections[0][0]]
for i in xrange(len(invalid_sections)):
new_path += self.repaired_sections[i]
if i+1 < len(invalid_sections):
new_path += original_path[invalid_sections[i][1]+1:invalid_sections[i+1][0]]
new_path += original_path[invalid_sections[-1][1]+1:]
self.repaired_sections = [] #reset repaired_sections
else:
#single-threaded repairing
rospy.loginfo("RR action server: Got invalid sections: %s" % str(invalid_sections))
new_path = original_path[0:invalid_sections[0][0]]
for i in xrange(len(invalid_sections)):
if not self._need_to_stop():
#start_invalid and end_invalid must correspond to valid states when passed to the planner
start_invalid, end_invalid = invalid_sections[i]
rospy.loginfo("RR action server: Requesting path to replace from %d to %d" % (start_invalid, end_invalid))
                    repairedSection = self._call_planner(original_path[start_invalid], original_path[end_invalid], planning_time)
if repairedSection is None:
rospy.loginfo("RR action server: RR section repair was stopped or failed")
return None
rospy.loginfo("RR action server: Planner returned a trajectory of %d points for %d to %d" % (len(repairedSection), start_invalid, end_invalid))
new_path += repairedSection
if i+1 < len(invalid_sections):
new_path += original_path[end_invalid+1:invalid_sections[i+1][0]]
else:
rospy.loginfo("RR action server: RR was stopped while it was repairing the retrieved path")
return None
new_path += original_path[invalid_sections[-1][1]+1:]
rospy.loginfo("RR action server: Trajectory after replan has %d points" % len(new_path))
else:
new_path = original_path
rospy.loginfo("RR action server: new trajectory has %i points" % (len(new_path)))
return new_path
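    # Illustrative splice (sketch): for a 10-point path with invalid sections
    # [(2, 5), (7, 9)], the result is path[0:2] + repair(2..5) + path[6:7]
    # + repair(7..9) + path[10:].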
def _stop_rr_planner(self, msg):
self._set_stop_value(True)
rospy.loginfo("RR action server: RR node got a stop message")
self.stop_rr_planner_publisher.publish(msg)
def _do_manage_action(self, request):
"""
Processes a ManagePathLibraryRequest as part of the ManagePathLibrary
service. Basically, either stores a path in the library or deletes it.
"""
response = ManagePathLibraryResponse()
response.result = response.FAILURE
if request.robot_name == "" or len(request.joint_names) == 0:
rospy.logerr("RR action server: robot name or joint names were not provided")
return response
self.working_lock.acquire()
if request.action == request.ACTION_STORE:
rospy.loginfo("RR action server: got a path to store in path library")
if len(request.path_to_store) > 0:
new_path = [p.positions for p in request.path_to_store]
if len(request.retrieved_path) == 0:
#PFS won so just store the path
store_path_result = self.path_library.store_path(new_path, request.robot_name, request.joint_names)
else:
store_path_result = self.path_library.store_path(new_path, request.robot_name, request.joint_names, [p.positions for p in request.retrieved_path])
response.result = response.SUCCESS
response.path_stored, response.num_library_paths = store_path_result
else:
response.message = "Path to store had no points"
elif request.action == request.ACTION_DELETE_PATH:
rospy.loginfo("RR action server: got a request to delete path %i in the path library" % (request.delete_id))
if self.path_library.delete_path_by_id(request.delete_id, request.robot_name, request.joint_names):
response.result = response.SUCCESS
else:
response.message = "No path in the library had id %i" % (request.delete_id)
elif request.action == request.ACTION_DELETE_LIBRARY:
rospy.loginfo("RR action server: got a request to delete library corresponding to robot %s and joints %s" % (request.robot_name, request.joint_names))
if self.path_library.delete_library(request.robot_name, request.joint_names):
response.result = response.SUCCESS
else:
response.message = "No library corresponding to robot %s and joint names %s exists"
else:
rospy.logerr("RR action server: manage path library request did not have a valid action set")
self.working_lock.release()
return response
if __name__ == "__main__":
try:
rospy.init_node("rr_node")
RRNode()
rospy.loginfo("Retrieve-repair: ready")
rospy.spin()
except rospy.ROSInterruptException:
pass<|fim▁end|> | # modification, are permitted provided that the following conditions |
<|file_name|>foo.js<|end_file_name|><|fim▁begin|><|fim▁hole|><|fim▁end|> | alert("foo!"); |
<|file_name|>settings.py<|end_file_name|><|fim▁begin|>import numpy as np
from astropy.coordinates import EarthLocation, SkyCoord
__all__ = ['MWA_LOC', 'MWA_FIELD_EOR0', 'MWA_FIELD_EOR1', 'MWA_FIELD_EOR2',
'MWA_FREQ_EOR_ALL_40KHZ', 'MWA_FREQ_EOR_ALL_80KHZ',
'MWA_FREQ_EOR_HI_40KHZ', 'MWA_FREQ_EOR_HI_80KHZ',
'MWA_FREQ_EOR_LOW_40KHZ', 'MWA_FREQ_EOR_LOW_80KHZ',
'HERA_ANT_DICT', 'F21']
F21 = 1420.405751786e6  # rest frequency of the 21 cm hydrogen line, in Hz
MWA_LOC = EarthLocation(lat='-26d42m11.94986s', lon='116d40m14.93485s',
height=377.827)
MWA_FIELD_EOR0 = SkyCoord(ra='0.0h', dec='-30.0d')
MWA_FIELD_EOR1 = SkyCoord(ra='4.0h', dec='-30.0d')
MWA_FIELD_EOR2 = SkyCoord(ra='10.33h', dec='-10.0d')
MWA_FREQ_EOR_LOW_40KHZ = np.arange(138.895, 167.055, 0.04)
MWA_FREQ_EOR_HI_40KHZ = np.arange(167.055, 195.255, 0.04)
MWA_FREQ_EOR_ALL_40KHZ = np.arange(138.895, 195.255, 0.04)
MWA_FREQ_EOR_LOW_80KHZ = np.arange(138.915, 167.075, 0.08)
MWA_FREQ_EOR_HI_80KHZ = np.arange(167.075, 195.275, 0.08)<|fim▁hole|>HERA_ANT_DICT = {'hera19': 3, 'hera37': 4, 'hera61': 5, 'hera91': 6,
'hera127': 7, 'hera169': 8, 'hera217': 9, 'hera271': 10,
'hera331': 11}<|fim▁end|> | MWA_FREQ_EOR_ALL_80KHZ = np.arange(138.915, 195.275, 0.08) |
<|file_name|>rollup.config.js<|end_file_name|><|fim▁begin|>import typescript from 'rollup-plugin-typescript'
import bundleWorker from 'rollup-plugin-bundle-worker'
export default {
input: 'src/index.ts',
output: {<|fim▁hole|> file: 'docs/nonogram.js',
format: 'iife',
},
name: 'nonogram',
plugins: [
typescript({ typescript: require('typescript') }),
bundleWorker(),
],
}<|fim▁end|> | |
<|file_name|>integration-tests.ts<|end_file_name|><|fim▁begin|>import * as chai from 'chai';
import * as chaiAsPromised from 'chai-as-promised';
import {
GraphQLSchema,
GraphQLObjectType,
GraphQLString,
GraphQLInt,
GraphQLBoolean,
} from 'graphql';
import {SubscriptionManager} from 'graphql-subscriptions';
import {connect} from 'mqtt';
import {MQTTPubSub} from '../mqtt-pubsub';
chai.use(chaiAsPromised);
const expect = chai.expect;
const assert = chai.assert;
const schema = new GraphQLSchema({
query: new GraphQLObjectType({
name: 'Query',
fields: {
testString: {
type: GraphQLString,
resolve: function () {
return 'works';
},
},
},
}),
subscription: new GraphQLObjectType({
name: 'Subscription',
fields: {
testSubscription: {
type: GraphQLString,
resolve: function (root) {
return root;
},
},
testFilter: {
type: GraphQLString,
resolve: function (_, {filterBoolean}) {
return filterBoolean ? 'goodFilter' : 'badFilter';
},
args: {
filterBoolean: {type: GraphQLBoolean},
},
},
testFilterMulti: {
type: GraphQLString,
resolve: function (_, {filterBoolean}) {
return filterBoolean ? 'goodFilter' : 'badFilter';
},
args: {
filterBoolean: {type: GraphQLBoolean},
a: {type: GraphQLString},
b: {type: GraphQLInt},
},
},
testChannelOptions: {
type: GraphQLString,
resolve: function (root) {
return root;
},
args: {
repoName: {type: GraphQLString},
},
},
},
}),
});
const mqttClient = connect('mqtt://localhost');
const subManager = new SubscriptionManager({
schema,
setupFunctions: {
'testFilter': (_, {filterBoolean}) => {
return {
'Filter1': {filter: (root) => root.filterBoolean === filterBoolean},
};
},
'testFilterMulti': () => {
return {
'Trigger1': {filter: () => true},
'Trigger2': {filter: () => true},
};
},
},
pubsub: new MQTTPubSub({
client: mqttClient,
}),
});
describe('SubscriptionManager', function () {
before('wait for connection', function (done) {
mqttClient.on('connect', () => {
done();
});
});
it('throws an error if query is not valid', function () {
const query = 'query a{ testInt }';
const callback = () => null;
return expect(subManager.subscribe({query, operationName: 'a', callback}))
.to.eventually.be.rejectedWith('Subscription query has validation errors');
});
it('rejects subscriptions with more than one root field', function () {
const query = 'subscription X{ a: testSubscription, b: testSubscription }';
const callback = () => null;
return expect(subManager.subscribe({query, operationName: 'X', callback}))
.to.eventually.be.rejectedWith('Subscription query has validation errors');
});
it('can subscribe with a valid query and gets a subId back', function () {
const query = 'subscription X{ testSubscription }';<|fim▁hole|> subManager.unsubscribe(subId);
});
});
  it('can subscribe with a valid query and get the root value', function (done) {
const query = 'subscription X{ testSubscription }';
const callback = function (err, payload) {
if (err) {
done(err);
}
try {
expect(payload.data.testSubscription).to.equals('good');
} catch (e) {
done(e);
return;
}
done();
};
subManager.subscribe({query, operationName: 'X', callback}).then(subId => {
subManager.publish('testSubscription', 'good');
setTimeout(() => {
subManager.unsubscribe(subId);
}, 10000);
});
});
it('can use filter functions properly', function (done) {
const query = `subscription Filter1($filterBoolean: Boolean){
testFilter(filterBoolean: $filterBoolean)
}`;
const callback = function (err, payload) {
if (err) {
done(err);
}
try {
expect(payload.data.testFilter).to.equals('goodFilter');
} catch (e) {
done(e);
return;
}
done();
};
subManager.subscribe({
query,
operationName: 'Filter1',
variables: {filterBoolean: true},
callback,
}).then(subId => {
subManager.publish('Filter1', {filterBoolean: false});
subManager.publish('Filter1', {filterBoolean: true});
setTimeout(() => {
subManager.unsubscribe(subId);
}, 20);
});
});
it('can subscribe to more than one trigger', function (done) {
// I also used this for testing arg parsing (with console.log)
// args a and b can safely be removed.
// TODO: write real tests for argument parsing
let triggerCount = 0;
const query = `subscription multiTrigger($filterBoolean: Boolean, $uga: String){
testFilterMulti(filterBoolean: $filterBoolean, a: $uga, b: 66)
}`;
const callback = function (err, payload) {
if (err) {
done(err);
}
try {
expect(payload.data.testFilterMulti).to.equals('goodFilter');
triggerCount++;
} catch (e) {
done(e);
return;
}
if (triggerCount === 2) {
done();
}
};
subManager.subscribe({
query,
operationName: 'multiTrigger',
variables: {filterBoolean: true, uga: 'UGA'},
callback,
}).then(subId => {
subManager.publish('NotATrigger', {filterBoolean: false});
subManager.publish('Trigger1', {filterBoolean: true});
subManager.publish('Trigger2', {filterBoolean: true});
setTimeout(() => {
subManager.unsubscribe(subId);
}, 30);
});
});
it('can unsubscribe', function (done) {
const query = 'subscription X{ testSubscription }';
const callback = (err) => {
if (err) {
done(err);
}
try {
assert(false);
} catch (e) {
done(e);
return;
}
done();
};
subManager.subscribe({query, operationName: 'X', callback}).then(subId => {
subManager.unsubscribe(subId);
subManager.publish('testSubscription', 'bad');
setTimeout(done, 30);
});
});
it('throws an error when trying to unsubscribe from unknown id', function () {
expect(() => subManager.unsubscribe(123))
.to.throw('undefined');
});
it('calls the error callback if there is an execution error', function (done) {
const query = `subscription X($uga: Boolean!){
testSubscription @skip(if: $uga)
}`;
const callback = function (err, payload) {
try {
// tslint:disable-next-line no-unused-expression
expect(payload).to.be.undefined;
expect(err.message).to.equals(
'Variable "$uga" of required type "Boolean!" was not provided.',
);
} catch (e) {
done(e);
return;
}
done();
};
subManager.subscribe({query, operationName: 'X', callback}).then(subId => {
subManager.publish('testSubscription', 'good');
setTimeout(() => {
subManager.unsubscribe(subId);
}, 40);
});
});
it('can use transform function to convert the trigger name given into more explicit channel name', function (done) {
const triggerTransform = (trigger, {path}) => [trigger, ...path].join('.');
const pubsub = new MQTTPubSub({
triggerTransform,
});
const subManager2 = new SubscriptionManager({
schema,
setupFunctions: {
testChannelOptions: (_, {repoName}) => ({
comments: {
channelOptions: {path: [repoName]},
},
}),
},
pubsub,
});
const callback = (err, payload) => {
if (err) {
done(err);
}
try {
expect(payload.data.testChannelOptions).to.equals('test');
done();
} catch (e) {
done(e);
}
};
const query = `
subscription X($repoName: String!) {
testChannelOptions(repoName: $repoName)
}
`;
const variables = {repoName: 'graphql-redis-subscriptions'};
subManager2.subscribe({query, operationName: 'X', variables, callback}).then(subId => {
pubsub.publish('comments.graphql-redis-subscriptions', 'test');
setTimeout(() => pubsub.unsubscribe(subId), 4);
});
});
});<|fim▁end|> | const callback = () => null;
subManager.subscribe({query, operationName: 'X', callback}).then(subId => {
expect(subId).to.be.a('number'); |
<|file_name|>fetch_update_findings.py<|end_file_name|><|fim▁begin|>import json
import django
import urllib3
if __name__ == '__main__':
django.setup()
from infrastructure.models import Server, CustomField
from resourcehandlers.aws.models import AWSHandler
from common.methods import set_progress
from django.core.serializers.json import DjangoJSONEncoder
def fetch_arns_for_findings(inspector_client):
"""
Fetch all ARNs for findings discovered in the latest run of all enabled Assessment Templates.
:param inspector_client:
:return:
"""
# at: Assessment template
# arn: Amazon resource name
findings = set()
# Get all assessment templates for current region
at_arns = inspector_client.list_assessment_templates()['assessmentTemplateArns']
if len(at_arns) > 0:
at_details = inspector_client.describe_assessment_templates(assessmentTemplateArns=at_arns)
# For each template, get the ARN for the latest run
if "assessmentTemplates" in at_details:
at_runs = [t['lastAssessmentRunArn'] for t in at_details['assessmentTemplates']]
            paginator = inspector_client.get_paginator('list_findings')
            for page in paginator.paginate(assessmentRunArns=at_runs,
                                           maxResults=500):
                if len(page['findingArns']) > 0:
                    findings.update(page['findingArns'])
return findings
def get_instance_id(finding):
"""
Given a finding, go find and return the corresponding AWS Instance ID
:param finding:
:return:
"""
for kv in finding['attributes']:
if kv['key'] == 'INSTANCE_ID':
return kv['value']
return None
def update_instances(findings):
"""
For each finding build-up a dict keyed by instance ID with an array value of all applicable
findings. Then create or update the aws_inspector_findings custom field for each<|fim▁hole|> :param findings:
:return:
"""
instances = {}
# Group findings by instance
for finding in findings['findings']:
instance_id = get_instance_id(finding)
        if instance_id not in instances:
            instances[instance_id] = []
        instances[instance_id].append(finding)
    # For each instance, find its CloudBolt Server record and update aws_inspector_findings
for instance in instances.keys():
try:
s = Server.objects.get(resource_handler_svr_id=instance)
cf, _ = CustomField.objects.get_or_create(name='aws_inspector_findings', type='TXT',
label="AWS Inspector Findings")
s.set_value_for_custom_field(cf.name, json.dumps(instances[instance], indent=True,
cls=DjangoJSONEncoder))
except Server.DoesNotExist as ex:
# Unable to locate and update the server, carry on
pass
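# Illustrative sketch (hypothetical data): after the grouping loop in
# update_instances(), `instances` maps each AWS instance ID to its findings,
# e.g. {'i-0123456789abcdef0': [finding_a, finding_b]}; each matching
# CloudBolt Server then gets that list JSON-serialized into the
# aws_inspector_findings custom field.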
def describe_findings(inspector_client, all_finding_arns):
"""
    Given a list of finding ARNs, return the details for each finding.
:param inspector_client:
:param all_finding_arns:
:return:
"""
arns = list(all_finding_arns)
if len(arns) == 0:
return None
findings = inspector_client.describe_findings(findingArns=arns)
return findings
def run(job, *args, **kwargs):
rh: AWSHandler
for rh in AWSHandler.objects.all():
regions = set([env.aws_region for env in rh.environment_set.all()])
# For each region currently used by the current AWSHandler
for region in regions:
inspector = rh.get_boto3_client(service_name='inspector', region_name=region)
set_progress(f'Fetching findings for {rh.name} ({region}).')
all_finding_arns = fetch_arns_for_findings(inspector)
inspector_findings = describe_findings(inspector, all_finding_arns)
set_progress(f'Updating CloudBolt instances in {region}.')
if inspector_findings:
update_instances(inspector_findings)
return "", "", ""
if __name__ == '__main__':
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
run(None)<|fim▁end|> | corresponding CloudBolt server record. |
<|file_name|>name.rs<|end_file_name|><|fim▁begin|>use super::errors::{ErrorKind, Result};
use failchain::{bail, ensure};
use serde::de::{Deserialize, Deserializer, Error as SerdeDeError};
use std::borrow::Borrow;
use std::fmt;
use std::fmt::Debug;
use std::fmt::Display;
use std::ops::Deref;
use std::result::Result as StdResult;
use std::str::{self, FromStr};
#[derive(Clone, Copy, PartialEq, PartialOrd, Ord, Eq, Hash)]
pub struct WadName([u8; 8]);
impl WadName {
pub fn push(&mut self, new_byte: u8) -> Result<()> {
let new_byte = match new_byte.to_ascii_uppercase() {
b @ b'A'..=b'Z'
| b @ b'0'..=b'9'
| b @ b'_'
| b @ b'%'
| b @ b'-'
| b @ b'['
| b @ b']'
| b @ b'\\' => b,
b => {
bail!(ErrorKind::invalid_byte_in_wad_name(b, &self.0));
}
};
for byte in &mut self.0 {
if *byte == 0 {
*byte = new_byte;
return Ok(());
}
}
bail!(ErrorKind::wad_name_too_long(&self.0));
}
pub fn from_bytes(value: &[u8]) -> Result<WadName> {
let mut name = [0u8; 8];
let mut nulled = false;
for (dest, &src) in name.iter_mut().zip(value.iter()) {
ensure!(
src.is_ascii(),
ErrorKind::invalid_byte_in_wad_name(src, value)
);
let new_byte = match src.to_ascii_uppercase() {
b @ b'A'..=b'Z'
| b @ b'0'..=b'9'
| b @ b'_'
| b @ b'-'
| b @ b'['
| b @ b']'
| b @ b'%'
| b @ b'\\' => b,
b'\0' => {
nulled = true;
break;
}
b => {
bail!(ErrorKind::invalid_byte_in_wad_name(b, value));
}
};
*dest = new_byte;
}
ensure!(
nulled || value.len() <= 8,
ErrorKind::wad_name_too_long(value)
);
Ok(WadName(name))
}
}
impl FromStr for WadName {
type Err = super::errors::Error;
fn from_str(value: &str) -> Result<WadName> {
WadName::from_bytes(value.as_bytes())
}
}<|fim▁hole|> write!(formatter, "{}", str::from_utf8(&self[..]).unwrap())
}
}
impl Deref for WadName {
type Target = [u8; 8];
fn deref(&self) -> &[u8; 8] {
&self.0
}
}
impl Debug for WadName {
fn fmt(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
write!(
formatter,
"WadName({:?})",
str::from_utf8(&self[..]).unwrap()
)
}
}
impl PartialEq<[u8; 8]> for WadName {
fn eq(&self, rhs: &[u8; 8]) -> bool {
self.deref() == rhs
}
}
impl Borrow<[u8; 8]> for WadName {
fn borrow(&self) -> &[u8; 8] {
self.deref()
}
}
impl AsRef<str> for WadName {
fn as_ref(&self) -> &str {
str::from_utf8(self.deref()).expect("wad name is not valid utf-8")
}
}
impl<'de> Deserialize<'de> for WadName {
fn deserialize<D>(deserializer: D) -> StdResult<Self, D::Error>
where
D: Deserializer<'de>,
{
WadName::from_bytes(&<[u8; 8]>::deserialize(deserializer)?).map_err(D::Error::custom)
}
}
pub trait IntoWadName {
fn into_wad_name(self) -> Result<WadName>;
}
impl IntoWadName for &[u8] {
fn into_wad_name(self) -> Result<WadName> {
WadName::from_bytes(self)
}
}
impl IntoWadName for &[u8; 8] {
fn into_wad_name(self) -> Result<WadName> {
WadName::from_bytes(self)
}
}
impl IntoWadName for &str {
fn into_wad_name(self) -> Result<WadName> {
WadName::from_str(self)
}
}
impl IntoWadName for WadName {
fn into_wad_name(self) -> Result<WadName> {
Ok(self)
}
}
#[cfg(test)]
mod test {
use super::WadName;
use std::str::FromStr;
#[test]
fn test_wad_name() {
assert_eq!(&WadName::from_str("").unwrap(), b"\0\0\0\0\0\0\0\0");
assert_eq!(&WadName::from_str("\0").unwrap(), b"\0\0\0\0\0\0\0\0");
assert_eq!(
&WadName::from_str("\01234567").unwrap(),
b"\0\0\0\0\0\0\0\0"
);
assert_eq!(&WadName::from_str("A").unwrap(), b"A\0\0\0\0\0\0\0");
assert_eq!(&WadName::from_str("1234567").unwrap(), b"1234567\0");
assert_eq!(&WadName::from_str("12345678").unwrap(), b"12345678");
assert_eq!(&WadName::from_str("123\05678").unwrap(), b"123\0\0\0\0\0");
assert_eq!(&WadName::from_str("SKY1").unwrap(), b"SKY1\0\0\0\0");
assert_eq!(&WadName::from_str("-").unwrap(), b"-\0\0\0\0\0\0\0");
assert_eq!(&WadName::from_str("_").unwrap(), b"_\0\0\0\0\0\0\0");
assert!(WadName::from_bytes(b"123456789").is_err());
assert!(WadName::from_bytes(b"1234\xfb").is_err());
assert!(WadName::from_bytes(b"\xff123").is_err());
assert!(WadName::from_bytes(b"$$ASDF_").is_err());
assert!(WadName::from_bytes(b"123456789\0").is_err());
}
}<|fim▁end|> |
impl Display for WadName {
fn fmt(&self, formatter: &mut fmt::Formatter) -> fmt::Result { |
<|file_name|>Threshold.java<|end_file_name|><|fim▁begin|>// Catalano Imaging Library
// The Catalano Framework
//
// Copyright © Diego Catalano, 2015
// diego.catalano at live.com
//
// This library is free software; you can redistribute it and/or
<|fim▁hole|>// modify it under the terms of the GNU Lesser General Public
// License as published by the Free Software Foundation; either
// version 2.1 of the License, or (at your option) any later version.
//
// This library is distributed in the hope that it will be useful,
// but WITHOUT ANY WARRANTY; without even the implied warranty of
// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
// Lesser General Public License for more details.
//
// You should have received a copy of the GNU Lesser General Public
// License along with this library; if not, write to the Free Software
// Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
//
package Catalano.Imaging.Filters;
import Catalano.Imaging.FastBitmap;
import Catalano.Imaging.IBaseInPlace;
/**
* The filter does image binarization using specified threshold value. All pixels with intensities equal or higher than threshold value are converted to white pixels. All other pixels with intensities below threshold value are converted to black pixels.
*
* Supported types: Grayscale.
* Coordinate System: Independent.
*
* @author Diego Catalano
*/
public class Threshold implements IBaseInPlace{
private int value = 128;
private boolean invert = false;
/**
* Initialize a new instance of the Threshold class.
*/
public Threshold() {}
/**
* Initialize a new instance of the Threshold class.
* @param value Threshold value.
*/
public Threshold(int value){
this.value = value;
}
/**
* Initialize a new instance of the Threshold class.
* @param value Threshold value.
* @param invert All pixels with intensities equal or higher than threshold value are converted to black pixels. All other pixels with intensities below threshold value are converted to white pixels.
*/
public Threshold(int value, boolean invert){
this.value = value;
this.invert = invert;
}
/**
* Threshold value.
* @return Threshold value.
*/
public int getValue() {
return value;
}
/**
* Threshold value.
* @param value Threshold value.
*/
public void setValue(int value) {
this.value = value;
}
@Override
public void applyInPlace(FastBitmap fastBitmap){
if (!fastBitmap.isGrayscale())
            throw new IllegalArgumentException("Threshold works only with grayscale images.");
int[] pixels = fastBitmap.getData();
for (int i = 0; i < pixels.length; i++) {
int l = pixels[i] & 0xFF;
if(invert == false){
if(l >= value){
pixels[i] = 255 << 24 | 255 << 16 | 255 << 8 | 255;
}
else{
pixels[i] = 0;
}
}
else{
if(l < value){
pixels[i] = 0;
}
else{
pixels[i] = 255 << 24 | 255 << 16 | 255 << 8 | 255;
}
}
}
}
}<|fim▁end|> | |
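// Usage sketch (illustrative; assumes an image file readable by FastBitmap):
//   FastBitmap fb = new FastBitmap("input.png");
//   fb.toGrayscale();
//   new Threshold(128).applyInPlace(fb); // pixels >= 128 -> white, others -> black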
<|file_name|>ConnectionActivity.java<|end_file_name|><|fim▁begin|>package se.dsv.waora.deviceinternetinformation;
import android.content.Context;
import android.net.ConnectivityManager;
import android.net.NetworkInfo;
import android.os.Bundle;
import android.support.v7.app.ActionBarActivity;
import android.widget.TextView;
/**
* <code>ConnectionActivity</code> presents UI for showing if the device
* is connected to internet.
*
* @author Dushant Singh
*/
public class ConnectionActivity extends ActionBarActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
// Initiate view
TextView connectivityStatus = (TextView) findViewById(R.id.textViewDeviceConnectivity);
// Get connectivity service.
ConnectivityManager manager = (ConnectivityManager) getSystemService(Context.CONNECTIVITY_SERVICE);
// Get active network information
<|fim▁hole|> // Check if active network is connected.
boolean isConnected = activeNetwork != null && activeNetwork.isConnectedOrConnecting();
if (isConnected) {
// Set status connected
connectivityStatus.setText(getString(R.string.online));
connectivityStatus.setTextColor(getResources().getColor(R.color.color_on));
// Check if connected with wifi
boolean isWifiOn = activeNetwork.getType() == ConnectivityManager.TYPE_WIFI;
if (isWifiOn) {
// Set wifi status on
TextView wifiTextView = (TextView) findViewById(R.id.textViewWifi);
wifiTextView.setText(getString(R.string.on));
wifiTextView.setTextColor(getResources().getColor(R.color.color_on));
} else {
// Set mobile data status on.
TextView mobileDataTextView = (TextView) findViewById(R.id.textViewMobileData);
mobileDataTextView.setText(getString(R.string.on));
mobileDataTextView.setTextColor(getResources().getColor(R.color.color_on));
}
}
}
}<|fim▁end|> | NetworkInfo activeNetwork = manager.getActiveNetworkInfo();
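// Note: ConnectivityManager.getActiveNetworkInfo() requires the
// android.permission.ACCESS_NETWORK_STATE permission in the app manifest.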
|
<|file_name|>Pomeranian.java<|end_file_name|><|fim▁begin|>/*
* JBoss, Home of Professional Open Source
* Copyright 2010, Red Hat, Inc., and individual contributors
* by the @authors tag. See the copyright.txt in the distribution for a
* full listing of individual contributors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.apache.org/licenses/LICENSE-2.0
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.jboss.weld.tests.event.observer.transactional;
import static javax.ejb.TransactionManagementType.BEAN;
import static javax.enterprise.event.TransactionPhase.AFTER_COMPLETION;
import static javax.enterprise.event.TransactionPhase.AFTER_FAILURE;
import static javax.enterprise.event.TransactionPhase.AFTER_SUCCESS;
import static javax.enterprise.event.TransactionPhase.BEFORE_COMPLETION;
import static javax.enterprise.event.TransactionPhase.IN_PROGRESS;
import java.io.Serializable;
import javax.annotation.Priority;
import javax.ejb.Stateful;<|fim▁hole|>@Stateful
@TransactionManagement(BEAN)
@Tame
@SessionScoped
@SuppressWarnings("serial")
public class Pomeranian implements PomeranianInterface, Serializable {
@Override
public void observeInProgress(@Observes(during = IN_PROGRESS) Bark event) {
Actions.add(IN_PROGRESS);
}
@Override
public void observeAfterCompletion(@Observes(during = AFTER_COMPLETION) Bark someEvent) {
Actions.add(AFTER_COMPLETION);
}
@Override
public void observeAfterSuccess(@Observes(during = AFTER_SUCCESS) Bark event) {
Actions.add(AFTER_SUCCESS);
}
@Override
public void observeAfterSuccessWithHighPriority(@Priority(1) @Observes(during = AFTER_SUCCESS) Bark event) {
Actions.add(AFTER_SUCCESS + "1");
}
@Override
public void observeAfterSuccessWithLowPriority(@Priority(100) @Observes(during = AFTER_SUCCESS) Bark event) {
Actions.add(AFTER_SUCCESS + "100");
}
@Override
public void observeAfterFailure(@Observes(during = AFTER_FAILURE) Bark event) {
Actions.add(AFTER_FAILURE);
}
@Override
public void observeBeforeCompletion(@Observes(during = BEFORE_COMPLETION) Bark event) {
Actions.add(BEFORE_COMPLETION);
}
@Override
public void observeAndFail(@Observes(during=BEFORE_COMPLETION) @Gnarly Bark event) throws FooException {
Actions.add(BEFORE_COMPLETION);
throw new FooException();
}
}<|fim▁end|> | import javax.ejb.TransactionManagement;
import javax.enterprise.context.SessionScoped;
import javax.enterprise.event.Observes;
|
<|file_name|>useInputValueToggle.js<|end_file_name|><|fim▁begin|>import { computed, toRefs } from '@vue/composition-api'
export const useInputValueToggleProps = {
label: {
type: String
},
options: {
type: Array,
default: () => ([
{ value: 0 },
{ value: 1, color: 'var(--primary)' }
])
},
hints: {
type: Array
}
}
export const useInputValueToggle = (valueProps, props, context) => {
const {
value: rxValue,
onInput: rxOnInput
} = valueProps
const {
options
} = toRefs(props)
// middleware
const txValue = computed(() => options.value.findIndex(map => {
if (!map.value && !rxValue.value)
return true // compare False(y) w/ [null|undefined]
else
return `${map.value}` === `${rxValue.value}` // compare String(s)
}))
const txOnInput = value => {
const {
0: { value: defaultValue } = {}, // 0th option is default
[value]: { value: mappedValue } = {}, // map value (N) w/ Nth option<|fim▁hole|> } = options.value
const rxValue = (mappedValue !== undefined) ? mappedValue : defaultValue
if (mappedPromise) // handle Promise
return mappedPromise(rxValue, props, context)
else // otherwise emit
return rxOnInput(rxValue)
}
// state
const max = computed(() => `${options.value.length - 1}`)
const label = computed(() => {
    const { 0: { label: defaultLabel } = {}, [txValue.value]: { label: mappedLabel } = {} } = options.value
return mappedLabel || defaultLabel
})
const color = computed(() => {
    const { 0: { color: defaultColor } = {}, [txValue.value]: { color: mappedColor } = {} } = options.value
return mappedColor || defaultColor
})
const icon = computed(() => {
    const { 0: { icon: defaultIcon } = {}, [txValue.value]: { icon: mappedIcon } = {} } = options.value
return mappedIcon || defaultIcon
})
const tooltip = computed(() => {
    const { 0: { tooltip: defaultTooltip } = {}, [txValue.value]: { tooltip: mappedTooltip } = {} } = options.value
return mappedTooltip || defaultTooltip
})
return {
// middleware
value: txValue,
onInput: txOnInput,
// state
max,
label,
color,
icon,
tooltip
}
}<|fim▁end|> | [value]: { promise: mappedPromise } = {} // map promise |
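// Usage sketch (illustrative; `rxValue`/`rxOnInput` come from a value-binding
// composable and `props`/`context` from the component's setup()):
//   const { value, onInput, label, color } = useInputValueToggle(
//     { value: rxValue, onInput: rxOnInput },
//     props, // e.g. options: [{ value: 'off' }, { value: 'on', color: 'var(--primary)' }]
//     context
//   )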
<|file_name|>sale.py<|end_file_name|><|fim▁begin|># -*- coding: utf-8 -*-
# © 2011 Raphaël Valyi, Renato Lima, Guewen Baconnier, Sodexis
# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).
from odoo import api, models, fields
class ExceptionRule(models.Model):
_inherit = 'exception.rule'
rule_group = fields.Selection(
selection_add=[('sale', 'Sale')],
)
model = fields.Selection(
selection_add=[
('sale.order', 'Sale order'),
('sale.order.line', 'Sale order line'),
])
class SaleOrder(models.Model):
_inherit = ['sale.order', 'base.exception']
_name = 'sale.order'
_order = 'main_exception_id asc, date_order desc, name desc'
rule_group = fields.Selection(
selection_add=[('sale', 'Sale')],
default='sale',
)
@api.model
def test_all_draft_orders(self):
order_set = self.search([('state', '=', 'draft')])
order_set.test_exceptions()
return True
@api.constrains('ignore_exception', 'order_line', 'state')
def sale_check_exception(self):
orders = self.filtered(lambda s: s.state == 'sale')
if orders:
orders._check_exception()
@api.onchange('order_line')
def onchange_ignore_exception(self):
if self.state == 'sale':
self.ignore_exception = False
@api.multi
def action_confirm(self):
if self.detect_exceptions():
return self._popup_exceptions()
else:
return super(SaleOrder, self).action_confirm()
@api.multi
def action_draft(self):
res = super(SaleOrder, self).action_draft()
orders = self.filtered(lambda s: s.ignore_exception)
orders.write({
'ignore_exception': False,
})
return res<|fim▁hole|> self.ensure_one()
return self.order_line
@api.model
def _get_popup_action(self):
action = self.env.ref('sale_exception.action_sale_exception_confirm')
return action<|fim▁end|> |
def _sale_get_lines(self): |