package
|
package-description
|
---|---|
anqa-events
|
anqa-events
|
anqa-rest
|
anqa-rest
|
anqr
|
No description available on PyPI.
|
anrg.saga
|
Saga: Scheduling Algorithms Gathered.

Introduction: This repository contains a collection of scheduling algorithms.
The algorithms are implemented in Python using a common interface.
Scripts for validating the schedules produced by the algorithms and for comparing their performance are also provided.

Algorithms. The following algorithms are implemented:

Common: HEFT (Heterogeneous Earliest Finish Time), CPoP (Critical Path on Processor), FastestNode (schedule all tasks on the fastest node).
Stochastic (stochastic task cost, data size, compute speed, and communication strength): SHEFT, Improved SHEFT, Stochastic HEFT, Mean HEFT.

Usage. Installation: clone the repository and install the requirements:

pip install anrg.saga

Running the algorithms: the algorithms are implemented as Python modules.
The following example shows how to run the HEFT algorithm on a workflow:

from saga.common.heft import HeftScheduler

scheduler = HeftScheduler()
network: nx.Graph = ...
task_graph: nx.DiGraph = ...
scheduler.schedule(network, task_graph)
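To make the placeholder graphs above concrete, here is a minimal, hedged sketch that builds a toy network and task graph with networkx and feeds them to the scheduler; the "weight" attribute names on nodes and edges are assumptions and may differ from what saga actually expects.

# Hedged sketch: drive HeftScheduler on toy graphs. The "weight" attributes
# are assumptions -- check the saga documentation for the exact attributes.
import networkx as nx
from saga.common.heft import HeftScheduler

# Network of 3 compute nodes, fully connected (node speed / link strength as weights).
network = nx.Graph()
network.add_nodes_from([0, 1, 2], weight=1.0)
network.add_edges_from([(0, 1), (0, 2), (1, 2)], weight=1.0)

# Small fork-join task DAG (task cost / data size as weights).
task_graph = nx.DiGraph()
task_graph.add_nodes_from(["a", "b", "c", "d"], weight=1.0)
task_graph.add_edges_from([("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")], weight=1.0)

schedule = HeftScheduler().schedule(network, task_graph)
print(schedule)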
|
anritsu-lightning
|
anritsu_lightning: Python interface to the Anritsu Lightning 37xxxD VNA.

Installation:

> pip install anritsu_lightning

Usage:

>>> from anritsu_lightning import CommChannel
>>> with CommChannel(address=6) as vna:
...     vna.ch3.parameter = "S21"
...     s21 = vna.read(channel=3, data_status="corrected")
>>>

It is also possible to read the S-parameters in Touchstone SnP format.

>>> from anritsu_lightning import CommChannel
>>> with CommChannel(address=6) as vna:
...     vna.measurement_setup.start = 40e6  # Hz
...     vna.measurement_setup.stop = 20e9  # Hz
...     vna.measurement_setup.data_points = 401
...     vna.ch1.parameter = "S11"
...     vna.ch1.graph_type = "log magnitude"
...     vna.ch1.graph_scale = 20.0  # dB/div
...     vna.ch2.parameter = "S12"
...     vna.ch3.parameter = "S21"
...     vna.ch3.graph_type = "log magnitude"
...     vna.ch3.graph_scale = 2.0  # dB/div
...     vna.ch4.parameter = "S22"
...     vna.display_mode = "dual channels 1 & 3"
...     with open(<file>, "wt") as f:
...         f.write(vna.get_s2p(previous=False))
>>>

It is also possible to use the markers to find the -3 dB point of a filter. In this
example, the VNA measured the -3 dB bandwidth of a Mini-Circuits VLF-1000+ low pass
filter with nominal specification of 1.3 GHz. The VNA was already set up to measure
S21 on channel 3 between 40 MHz and 5 GHz.

In [1]: from anritsu_lightning import CommChannel
In [2]: cc = CommChannel(address=6)
In [3]: vna = cc.get_instrument()
In [4]: vna.markers.mode = "normal"
In [5]: vna.markers.enable([1, 2])
In [6]: vna.markers.set_active(1)
In [7]: vna.markers.set_xaxis_location(1, "40 MHz")
In [8]: vna.markers.delta_reference = 1
In [9]: vna.markers.set_active(2)
In [10]: bw = vna.markers.search("-3 dB", reference="delta reference", timeout=5000)
In [11]: print(f"{bw/1e9:.2f} GHz")
1.26 GHz

Supported features: measurement setup (frequency sweep, data points, etc.); channel setup (parameter (S11, S12, ...), graph type, etc.); graph setup (scale, reference, offset); data transfer (channel data, screen bitmap, S2P file); markers.
|
anritsu-ms2090a-ams
|
No description available on PyPI.
|
anritsu-pwrmtr
|
anritsu_pwrmtr: Python interface to the Anritsu power meters.

Installation:

> pip install anritsu_pwrmtr

Usage:

For ML243xA models that use GPIB:

>>> from anritsu_pwrmtr import CommChannel
>>> with CommChannel(13) as pm:
...     pm.ch1.read()
...
-10.1

For MA243x0A models that use USB:

>>> from anritsu_pwrmtr import CommChannel
>>> with CommChannel('<USB0::0x...::RAW>') as pm:
...     pm.read()
...
-10.1

Supported models: ML243xA, MA243x0A.

Supported features: channel configuration for Readout mode; sensor calibration and zeroing; measuring power in Readout mode.
|
ans
|
ANS: Ambient Noise Seismology. ans is a Python wrapper for ambient noise seismology tasks, with a GUI for easier configuration of ambient-noise seismology projects. In its backend, this package depends on the Perl interpreter, GMT (Generic Mapping Tools), and SAC (Seismic Analysis Code), as well as Python modules including ObsPy. ans is successfully tested on Python 3.6.* and 3.8.* versions.

Version 0.0.1: This version includes all the necessary commands and tools required for generating Rayleigh wave (ZZ and RR cross-correlations) and Love wave (TT cross-correlation component) Empirical Green's Functions (EGFs). A brief description of the most useful CLI commands is given below:

- $> ans init <maindir>
Description: initialize ans project at project main directory i.e. <maindir>
- $> ans config
$> ans config --maindir <maindir>
Description: open the program GUI to configure the ans project
Note: default <maindir> is current working directory
- $> ans download stations
Description: download the list of available stations for the given
region boundary and dates that were previously set using the GUI.
Datacenters, desired station components etc. should also be set using '$> ans config'.
- $> ans download metadata
Description: download station metadata files (xml file format) that will be used for
instrument response removal and updating sac headers.
- $> ans download mseeds
Description: main data acquisition module; download seismograms in mseed format
- $> ans mseed2sac <mseeds_dir> <sacs_dir>
Description: convert mseed to sac files while applying the listed processing steps
in project configuration mseed2sac tab (i.e., '$> ans config').
<mseeds_dir>: input mseed dataset directory
<sacs_dir>: output sac files dataset directory
- $> ans sac2ncf <sacs_dir> <ncfs_dir>
Description: process the input sacfiles in <sacs_dir> and output NCFs (noise cross-correlation functions)
while applying the list of sac2ncf processes defined in project configuration file.
Note: At this step it is necessary that all sac headers are updated and this can be done by either
performing instrument response removal or adding "write headers" process to the list of processes.
- $> ans ncf2egf <ncfs> <egfs_dir>
<ncfs>: either path to the <ncfs_dir> (full stack EGFs) or an ASCII datalist containing a one-column data format
list of paths to event directories (seasonal EGFs; e.g., "14001000000" i.e. 2014/01/01)
<egfs_dir>: path to output stacked EGF
|
ansaittua
|
ansaittua: Earned Value Management (Finnish: ansaittua arvonhallintaa) when coding swiftly. License: MIT. Documentation: user and developer documentation of ansaittua. Bug Tracker: feature requests and bug reports are best entered in the todos of ansaittua. Primary Source repository: the primary source repository of ansaittua is at sourcehut, a collection of tools useful for software development. Status: Experimental. Note: the default branch is "default".
|
ansaotuvi
|
Open-source Tử vi (Vietnamese astrology) star-charting program ansaotuvi.
[x] Fork code from the doannguyen/lasotuvi repository
[x] Update the libraries to their new versions
[x] Deploy ansaotuvi and ansaotuvi-website to PyPI
[x] Test-run the project integrated with ansaotuvi-website
[x] Adjust the miếu/hãm positions and five-element attributes of the stars
[-] Standardize star positions and star-placement logic
|
ansaotuvi-website
|
Open-source Tử vi star-charting program ansaotuvi. This is a wrapper of ansaotuvi as a Django application.

Create a virtual environment and activate it:

pip install virtualenv
virtualenv .env

If you are a *nix user:

source .env/bin/activate

If you are a Windows user, just type .env/Scripts/activate.bat and make sure you are working in the command prompt (cmd.exe), not PowerShell. Now you are working in the virtual environment.

If you do not have a Django project, install the django and ansaotuvi applications:

pip install django ansaotuvi_website

If you already have a Django project, just install the ansaotuvi_website application:

pip install ansaotuvi_website

Add the ansaotuvi_website application to your INSTALLED_APPS by adding:

# settings.py
INSTALLED_APPS = [
    # Your other applications
    'ansaotuvi_website',
]

Add the router to urls.py:

# urls.py
from django.urls import path, include
urlpatterns = [
    # ....
    path('ansaotuvi/', include('ansaotuvi_website.urls'))
]

Here is a tutorial to show you how to add the ansaotuvi app into a Django project. Hope this helps!
|
ansar-connect
|
ansar-connect

The ansar-connect library implements sophisticated asynchronous network messaging. It builds
on the features of the ansar-encode <https://pypi.org/project/ansar-encode> and
ansar-create <https://pypi.org/project/ansar-create> libraries to integrate network
messaging into the asynchronous model of execution. The result of this approach is that sending
a message across a network is no different to sending a message between any two async objects.
There is also complete "portability" of messages - an application data object read from the disk
using ansar.encode can immediately be sent across a network, with zero additional effort.

This essential messaging capability is combined with the multi-process capabilities
of ansar.create, to deliver practical, distributed computing. Designs for software
solutions can now consider: multi-process compositions that run on a host, compositions of processes distributed across a LAN, and compositions where processes may be anywhere on the Internet.

Smaller projects will use this library to deliver multi-process solutions on a single host.
The library will arrange for all the network messaging that binds the processes together
into a single operational unit.

The most ambitious projects will involve processes spread across the Internet, e.g. between
desktop PCs at different branches of an organization, or between head-office servers and mobile
personnel on laptops, or between those same servers and SBCs operating as weather stations. As
long as there is a reasonably up-to-date Python runtime (>=3.10) and Internet connectivity, this
library will arrange for full, asynchronous messaging between any two members of a composition.

Features: implements full-duplex network messaging over asynchronous sockets; inherits the complete type sophistication of ansar.encode; seamlessly extends the async model of operation to span local and wide-area networks.
|
ansar-create
|
ansar-create

The ansar-create library uses multi-threading and multi-processing to solve difficult
software challenges such as concurrency, interruption and cancellation. It wraps those
platform facilities in a standard runtime model, giving developers the ability to express
that type of software in a clear and consistent manner.

This type of software is often referred to as asynchronous, event-driven or reactive
software. It acknowledges the fundamental fact that significant events can occur at
any time, and that software must be able to respond to those events in a reliable
and timely manner.

Features: based on a standard model for complex software operations (SDL); uniform management of threads, processes and state machines; built-in runtime facilities such as timers and logging; persistent application configuration; process orchestration; development automation.
|
ansar-encode
|
ansar-encode

The ansar-encode library provides for the convenient storage and recovery of
application data using system files. Files are created using standard encodings - the
default is JSON - and are human readable. Complex application data can be stored,
including containers, instances of classes and object graphs.

Features: broad suite of primitive types, e.g. integers, floats, strings, times and enumerations; structured data, e.g. an 8-by-8 table of user-defined class instances; recovered data is fully typed, e.g. reading a class User produces a User instance; graphs, including graphs with cycles, e.g. circular lists, syntax trees and state diagrams; polymorphism, e.g. reading an object of unknown type; type-checking; plain text files; managed folders of files; object versioning.
|
ansatz
|
ansatz: a physics Python library
|
anscenter
|
backend-module: main backend module, which is used for developing web-app logic and deploying an AI model.

Installation: run the following to install:

pip install anscenter

Usage:

from anscester.predict import MachineLearningModel

# Generate model
model = MachineLearningModel()

# Usage example
data_input = [1.0, 2.0, 3.0, 4.0]
result = model.predict(data_input)
print(result)

Developing: to install the package, along with the tools you need to develop and run tests, run the following in your virtualenv:

$ pip install anscenter[dev]
|
anschlusspruefer
|
This is a dummy package intended to prevent any dependency confusion attacks against our internal Python packages at SBB. For now this is the only foolproof way to prevent some dependency confusion attacks until the following PEP has been implemented: PEP-708. For any questions regarding this package feel free to reach out to [email protected].
|
ansciier
|
Ansciier: mimic video or image in your terminal.

How to use it: first you need Python 3; I recommend the latest version, but I think it would run on older versions too. Install the package with $ pip install ansciier (or pip3 if pip points to a Python 2 version). Run it with $ ansciier /path/to/image-or-video.mp4. Type $ ansciier -h for help.

Example usage:

$ ansciier ~/Videos/rickroll.mp4 --aspect-ratio 20:9 --fps 24

The command above will draw frames from rickroll.mp4 in the Videos folder with an aspect ratio of 20:9 and max fps of 24.

$ ansciier ~/Video/frame{0}.png --ascii --char @ --dim 200x50 --start-frame 59

This command will draw images with the @ character and a 200x50 square block dimension, from frame59.png, frame60.png, and so on until it reaches the highest frame number. It will automatically find the last frame when --last-frame is not specified, and it will still continue if a frame is missing.

$ ansciier camera

will use your camera.

Note: supports only truecolor terminals (because the majority of terminals nowadays are truecolor, I think you already have one). Should work on Windows and Linux.
|
ansel
|
Codecs for reading/writing documents in the ANSEL character set. Free software: MIT license. Documentation: https://python-ansel.readthedocs.io.

Features: adds support for the character set encodings ANSEL (ANSI/NISO Z39.47) and GEDCOM; re-orders combining characters for consistency with the ANSEL specification.

Credits: this package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.
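A minimal, hedged usage sketch follows; the register() entry point and the "ansel"/"gedcom" codec names are assumptions based on the feature list above, so check the package documentation before relying on them.

# Hedged sketch: register the codecs, then use them like any other encoding.
# register() and the codec names "ansel"/"gedcom" are assumptions.
import ansel

ansel.register()

raw = b"GEDCOM TEXT"          # plain ASCII bytes decode unchanged
print(raw.decode("ansel"))

# Non-ASCII ANSEL bytes map to combining characters, which the codec
# re-orders for consistency with the ANSEL specification.
# with open("family.ged", encoding="gedcom") as handle:  # hypothetical file
#     data = handle.read()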
|
anserializer
|
anserializer: a module for serializing and deserializing complex data structures to/from JSON. It allows the user to (de)serialize a complex dictionary/list structure in one go by defining serializers/deserializers for arbitrary sets of classes. Tested with Python 3. The Serializer can be used either instantiated or non-instantiated.

Install:

pip3 install anserializer

Examples

Instantiated example:

from anserializer import Serializer, DatetimeSerializer, ObjectSerializer
class MyObjectClass(object):
    pass
# instantiate the serializer
s = Serializer([ DatetimeSerializer(), ObjectSerializer(MyObjectClass) ])
# create object
o = MyObjectClass()
print(o)
# serialize object
x = s.get_serialized(o)
print(x)
# deserialize object
o = s.get_deserialized(x)
print(o)

Non-instantiated example:

from anserializer import Serializer, DatetimeSerializer, ObjectSerializer
class MyObjectClass(object):
    pass
# put our list of serializer classes available for use into a variable
serializers = [ DatetimeSerializer(), ObjectSerializer(MyObjectClass) ]
# create object
o = MyObjectClass()
print(o)
# serialize object
x = Serializer.serialize(o, serializers)
print(x)
# deserialize object
o = Serializer.deserialize(x, serializers)
print(o)

Allow children to be serialized by a serializer defined for their ancestor:

from anserializer import Serializer, DatetimeSerializer, ObjectSerializer
class MyObjectClass(object):
    pass
# instantiate the serializer
s = Serializer([ DatetimeSerializer(), ObjectSerializer(object) ], serialize_children=True)
# create object
o = MyObjectClass()
print(o)
# serialize object
x = s.get_serialized(o)
print(x)

Use your own serializer:

from anserializer import Serializer, DatetimeSerializer, ObjectSerializer, BaseSerializer
class MyObjectClass(object):
    pass
# create your serializer
class MySerializer(BaseSerializer):
    def __init__(self):
        super().__init__([MyObjectClass], r'^!MyObjectClass\(\)$')

    def serialize(self, obj):
        # do the magic and return a serialized element
        return { '!MyObjectClass()': {
                     # insert object data here
                 }
               }

    def deserialize(self, serialized_obj):
        # do the magic and return an object with the data given in serialized format
        kwargs = {}
        return MyObjectClass(**kwargs)
# instantiate the serializer
s = Serializer([ DatetimeSerializer(), MySerializer(), ObjectSerializer(object) ])
# create object
o = MyObjectClass()
print(o)
# serialize object
x = s.get_serialized(o)
print(x)
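As a small illustration of the API shown above, the following hedged sketch round-trips a nested structure containing a datetime and a custom object; it only uses the calls documented in the examples (get_serialized / get_deserialized) and assumes the default DatetimeSerializer covers datetime.datetime values.

# Hedged round-trip sketch built only from the calls shown above.
from datetime import datetime
from anserializer import Serializer, DatetimeSerializer, ObjectSerializer

class MyObjectClass(object):
    pass

s = Serializer([ DatetimeSerializer(), ObjectSerializer(MyObjectClass) ])

data = {"created": datetime(2024, 1, 1, 12, 0), "items": [MyObjectClass()]}

serialized = s.get_serialized(data)      # JSON-compatible structure
restored = s.get_deserialized(serialized)

print(serialized)
print(restored)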
|
anser-module-upload
|
Failed to fetch description. HTTP Status Code: 404
|
ansh997
|
SpreadG: SpreadG is a small wrapper around the Google Sheets API (v4) and gspread 3.6.0 to provide more convenient access to Google Sheets from Python scripts and to plot graphs with the help of Matplotlib. Turn on the API, download an OAuth client ID as a JSON file, and start working.

Installation:

# pip install -r requirement.txt
pip install spreadg

This will also install google-api-python-client and its dependencies, notably httplib2 and oauth2client, as required dependencies.

Quickstart:

from spreadg import *

data = dataloader('credentials.json', "Sheetname")
df = to_df(data)
fig = plotter(df, X, y)
fig.savefig("something.png")

Author: Himanshu Pal, mailto: palhimanshu997[at]gmail[.]com
|
anshika
|
This package has Multiple Accounts.
|
anshils
|
No description available on PyPI.
|
anshitsu
|
Anshitsu: a tiny digital photographic utility. "Anshitsu" means a darkroom in Japanese.

Install: run this command in an environment where Python 3.10 or higher is installed. We have tested it on Windows, Mac, and Ubuntu on GitHub Actions, but we have not tested it on Macs with Apple Silicon, so please use it at your own risk there.

pip install anshitsu

Usage: it is as described in the following help.

INFO: Showing help with the command 'anshitsu -- --help'.

NAME
    anshitsu - Process Runner for Command Line Interface

SYNOPSIS
    anshitsu PATH <flags>

DESCRIPTION
    This utility converts the colors of images such as photos.
    If you specify a directory path, it will convert the image files in the specified directory.
    If you specify a file path, it will convert the specified file.
    If you specify an option, the specified conversion will be performed.
    Tosaka mode is a mode that expresses the preference of Tosaka-senpai, a character in
    "Kyūkyoku Chōjin R", for "photos taken with Tri-X that look like they were burned onto
    No. 4 or No. 5 photographic paper". Only use floating-point numbers when using this mode;
    numbers around 2.4 will make it look right.

POSITIONAL ARGUMENTS
    PATH
        Type: str
        Directory or File Path

FLAGS
    --colorautoadjust=COLORAUTOADJUST
        Type: bool  Default: False
        Use color auto adjust algorithm. Defaults to False.
    --colorstretch=COLORSTRETCH
        Type: bool  Default: False
        Use color stretch algorithm. Defaults to False.
    --grayscale=GRAYSCALE
        Type: bool  Default: False
        Convert to grayscale. Defaults to False.
    --invert=INVERT
        Type: bool  Default: False
        Invert color. Defaults to False.
    --tosaka=TOSAKA
        Type: Optional[Optional]  Default: None
        Use Tosaka mode. Defaults to None.
    --outputrgb=OUTPUTRGB
        Type: bool  Default: False
        Outputs a monochrome image in RGB. Defaults to False.
    --noise=NOISE
        Type: Optional[Optional]  Default: None
        Add Gaussian noise. Defaults to None.

NOTES
    You can also use flags syntax for POSITIONAL ARGUMENTS.

If a directory is specified in the path, an "out" directory will be created in the specified directory, and the converted JPEG and PNG images will be stored in PNG format. If you specify a JPEG or PNG image file as the path, an "out" directory will be created in the directory where the image is stored, and the converted image will be stored in PNG format. Note: if you specify a file in any other format in the path, be aware there is no error handling; the program will terminate abnormally.

Algorithms: the following algorithms are available in this tool.

RGBA to RGB Convert: converts an image that contains alpha, such as RGBA, to image data that does not contain alpha. Transparent areas will be filled with white. This algorithm is applied to any image file.

invert: inverts the colors of an image using Pillow's built-in algorithm. In the case of negative film, color conversion that takes into account the film base color is not performed, but we plan to follow up with a feature to be developed in the future.

colorautoajust: we use the "automatic color equalization" algorithm described in the following paper to apply color correction. This process is more time consuming than the algorithm used in "colorstretch", but it can reproduce more natural colors. (References) A. Rizzi, C. Gatta and D. Marini, "A new algorithm for unsupervised global and local color correction.", Pattern Recognition Letters, vol. 24, no. 11, 2003.

colorstretch: the "gray world" and "stretch" algorithms described in the following paper are combined to apply color correction. This process is faster than the algorithm used in "colorautoajust". (References) D. Nikitenko, M. Wirth and K. Trudel, "Applicability Of White-Balancing Algorithms to Restoring Faded Colour Slides: An Empirical Evaluation.", Journal of Multimedia, vol. 3, no. 5, 2008.

grayscale: converts a color image to grayscale using the algorithm described in the following article: "Python でグレースケール(grayscale)化". Note: this article is written in Japanese.

Tosaka mode: Tosaka mode is a mode that expresses the preference of Tosaka-senpai, a character in "Kyūkyoku Chōjin R", for "photos taken with Tri-X that look like they were burned onto No. 4 or No. 5 photographic paper". Only use floating-point numbers when using this mode; numbers around 2.4 will make it look right. When this mode is specified, color images will also be converted to grayscale.

outputrgb: outputs a monochrome image in RGB.

noise: adds Gaussian noise. To add noise, you need to specify a floating-point number; a value of about 10.0 will be just right.

Special Thanks: we are using the following libraries: shunsukeaihara/colorcorrect
|
anshls
|
Failed to fetch description. HTTP Status Code: 404
|
anshudi
|
No description available on PyPI.
|
anshukak
|
No description available on PyPI.
|
anshupdf
|
This is our home page.
|
ansi
|
ANSI

Various ANSI escape codes, used in moving the cursor in a text console or
rendering coloured text.

Example: print something in bold yellow on a red background:

>>> from ansi.colour import fg, bg
>>> from ansi.colour.fx import reset
>>> msg = (bg.red, fg.yellow, 'Hello world!', reset)
>>> print(''.join(map(str, msg)))
...

If you like syntactic sugar, you may also do:

>>> from ansi.colour import fg, bg
>>> print(bg.red(fg.yellow('Hello world!')))
...

Also, 256 RGB colours are supported:

>>> from ansi.colour.rgb import rgb256
>>> from ansi.colour.fx import reset
>>> msg = (rgb256(0xff, 0x80, 0x00), 'hello world', reset)
>>> print(''.join(map(str, msg)))
...

If you prefer to use American English instead:

>>> from ansi.color import ...

References: https://www.ecma-international.org/publications-and-standards/standards/ecma-48/

Requirements: Ansi requires Python 3.6 and supports typing.
|
ansi256colors
|
ansi256-colors: a tool to print and demo ansi color codes for a 256 color terminal.

Install: install using pip:

pip install ansi256colors

This will install ansi256 to your python bin. You can then call the script so
long as that is in your path.

Usage:

> ansi256 -h
usage: ansi256 [-h] {print-table,test,write} ...
a tool for printing, testing, and exporting ansi color escapes
positional arguments:
{print-table,test,write}
print-table print a table of the ansi color codes
test test color codes on a string
write write a zsh-rc style file that exports all color codes
options:
  -h, --help            show this help message and exit

Printing Tables:

> ansi256 print-table --help
usage: ansi256 print-table [-h] [-f [0-255]] [-b [0-255]] {fg,bg,both}
positional arguments:
{fg,bg,both} specify whether to print foreground, background, or both color code tables
options:
-h, --help show this help message and exit
-f [0-255], --foreground [0-255]
specify a foreground color to be on top of the background table
-b [0-255], --background [0-255]
                        specify a background color to be the background of the foreground table

For example:

ansi256 print-table both
ansi256 print-table -b 218 fg

Printing Tests:

> ansi256 test -h
usage: ansi256 test [-h] [-f [0-255]] [-b [0-255]] TEXT
positional arguments:
TEXT text to test
options:
-h, --help show this help message and exit
-f [0-255], --foreground [0-255]
specify the foreground color code (0-255)
-b [0-255], --background [0-255]
                        specify the background color code (0-255)

For example:

ansi256 test -b 218 -f 196 "This is a test of red on pink"

Writing RC files:

> ansi256 write -h
usage: ansi256 write [-h] FILE
positional arguments:
FILE file to write the exports to
options:
  -h, --help  show this help message and exit

> ansi256 write testrc
> head -5 testrc
export COLOR0_FG=$'%{\e[38;5;0m%}'
export COLOR0_BG=$'%{\e[48;5;0m%}'
export COLOR1_FG=$'%{\e[38;5;1m%}'
export COLOR1_BG=$'%{\e[48;5;1m%}'
export COLOR2_FG=$'%{\e[38;5;2m%}'
|
ansi2html
|
ansi2html: convert text with ANSI color codes to HTML or to LaTeX. Inspired by and developed off of the work of pixelbeat and blackjack. Read the docs for more information.

Example - Python API:

import sys

from ansi2html import Ansi2HTMLConverter

conv = Ansi2HTMLConverter()
ansi = "".join(sys.stdin.readlines())
html = conv.convert(ansi)

Example - Shell Usage:

$ ls --color=always | ansi2html > directories.html
$ sudo tail /var/log/messages | ccze -A | ansi2html > logs.html
$ task rc._forcecolor:yes limit:0 burndown | ansi2html > burndown.html

See the list of full options with:

$ ansi2html --help

Get this project:

$ pip3 install ansi2html

Source: https://github.com/pycontribs/ansi2html/
pypi: https://pypi.org/project/ansi2html/

License: ansi2html is licensed LGPLv3+.

Credits: Author: Ralph Bean. Contributor: Robin Schneider.
|
ansi2image
|
ANSI to Image: a Python lib to convert ANSI text to an image. ANSI2Image officially supports Python 3.8+.

Main features: read an ANSI file (or ANSI stdin) and save an image (JPG or PNG).

Installation:

pip3 install --upgrade ansi2image

Help:

ANSI to image v0.1.1 by Helvio Junior
ANSI to Image converts ANSI text to an image.
https://github.com/helviojunior/ansi2image

positional arguments:
  [filename]               File path or - to stdin

Options:
  -o, --output [filename]  image output file.
  --font [font]            font type. (default: JetBrains Mono Regular).
  --font-list              List all supported font family and variations
  -h, --help               show help message and exit
  -v                       Specify verbosity level (default: 0). Example: -v, -vv, -vvv
  --version                show current version

ANSI reference: https://en.wikipedia.org/wiki/ANSI_escape_code
|
ansi2txt
|
ansi2txt: ansi to plain text converter.

Related: here are some related projects: colorized-logs, strip-ansi-cli.

Acknowledgements: this code base is a translation/port of the ansi2txt.c code base from colorized-logs to Python 3 and Bash.
This project came about because I liked the original ansi2txt's output but did not want to have to compile it or ship binaries around.
I ported ansi2txt.c → ansi2txt.py but then came across an environment without Python, so went ansi2txt.py → ansi2txt.sh.

License: AGPLv3 AND MIT.

Running Tests: to run tests, run the following command:

bats ansi2txt.bats
|
ansi2utf8
|
ansi2utf8: ANSI to UTF-8. Usage: a2u FILE [NAME]

Installation:

> pip install ansi2utf8

a2u file.txt
a2u file.txt ~file.txt

Todo: a2u FILEURL [NAME]
|
ansibeautifier
|
ANSI Beautifier

This is a beautifier project to beautify your Python command-line or GUI applications. It uses ANSI escape codes (\u001b) to color the text.

Installation:

pip install ansibeautifier

Usage:

from ansibeautifier import Beautifier

b = Beautifier()
print(b.red(text="Hello World", bright=False))

Returns "Hello World" in red but not bright, as we have given the bright=False parameter. The default bright parameter is False.

Variable usage: you can also store the colored text in a variable.

from ansibeautifier import Beautifier

b = Beautifier()
text = b.green(text="Hello World", bright=False)
print(text)

This also prints the same result but in a pythonic way!

Colors supported: Green - ansibeautifier.Beautifier.green(); Red - ansibeautifier.Beautifier.red(); Blue - ansibeautifier.Beautifier.blue(); Black - ansibeautifier.Beautifier.black(); White - ansibeautifier.Beautifier.white(); Yellow - ansibeautifier.Beautifier.yellow(); Magenta - ansibeautifier.Beautifier.magenta(); Cyan - ansibeautifier.Beautifier.cyan().

Other functions:

ansibeautifier.Beautifier.always_<color_name>(): it colors the terminal in which you run the Python code. It can be reset by using the ansibeautifier.Beautifier.reset_foreground_color() function.

from ansibeautifier import Beautifier

b = Beautifier()
print(b.always_white(text="Hello World"))
print(b.reset_foreground_color())

ansibeautifier.Beautifier.background_<color_name>(): it colors the background of the text to the color specified. It can be nullified by the ansibeautifier.Beautifier.reset_background_color() function.

from ansibeautifier import Beautifier

b = Beautifier()
print(b.background_black(text="Hello World", bright=True))
print(b.reset_colors())

ansibeautifier.Beautifier.always_background_<color_name>(): it colors the background of the terminal and the text specified.

from ansibeautifier import Beautifier

b = Beautifier()
print(b.always_background_cyan(text="Hello World"))
print(b.reset_colors())

ansibeautifier.Beautifier.bold(): it returns bold text. It can be neutralized by the ansibeautifier.Beautifier.reset_intensity() function.

from ansibeautifier import Beautifier

b = Beautifier()
print(b.bold(text="Hello World"))
print(b.reset_intensity())

It returns "Hello World" in bold.

ansibeautifier.Beautifier.underline(): it returns underlined text. It can also be neutralized by the ansibeautifier.Beautifier.reset_intensity() function.

from ansibeautifier import Beautifier

b = Beautifier()
print(b.underline(text="Hello World", always=True))
print(b.reset_intensity())

This underlines all the text in the terminal and the text given.

ansibeautifier.Beautifier.reverse(): it returns text that looks highlighted.

from ansibeautifier import Beautifier

b = Beautifier()
print(b.reverse(text="Hello World", always=True))
print(b.reset_intensity())

ansibeautifier.Beautifier.conceal(): it returns hidden text, and you can unhide it with the ansibeautifier.Beautifier.reset_intensity() function.

import time

from ansibeautifier import Beautifier

b = Beautifier()
print("Hello World")
print(b.conceal(text="Hello World"))
print("Hello World")
time.sleep(0.5)
print(b.reset_intensity())

Try this!

Thanks for reading. If you want the source code you can visit my GitHub page: https://github.com/BoughtData92730/ansibeautifier
|
ansibel
|
No description available on PyPI.
|
ansibell
|
No description available on PyPI.
|
ansibilo
|
Ansibilo is a set of tools for Ansible. It provides: facto, an Ansible module to create secrets and select unused ports on the remote host (and reuse them at the next run); a command line interface to export the Ansible inventory in different formats; some filter and callback plugins; a Sphinx extension which allows including a graph of the inventory in the documentation.

Installation: install the package from PyPI, using pip:

pip install ansibilo

Or from GitHub:

git clone git://github.com/Polyconseil/ansibilo

Links: Sources: http://github.com/Polyconseil/ansibilo - Documentation: http://readthedocs.org/docs/ansibilo/ - Package: http://pypi.python.org/pypi/ansibilo/
|
ansiblator
|
Ansiblator - makes the Ansible API more Pythonic.

This wrapper allows an easier way to use Ansible in Python. Chain commands without playbooks, more like Fabric. With this, Ansible can be more powerful and it will allow chaining commands with Python commands. Ansible documentation is at http://docs.ansible.com/. The API is now trying to feel like Fabric, but it's still not complete; there will be some changes.

Get started
===========

For installation you can download the package and then just unpack it from https://pypi.python.org/pypi/ansiblator and use it:

    python setup.py install

or install by pip:

    pip install ansiblator

Quickstart
==========

For the quickest example you can just create your Ansible host file named ansible_hosts inside your home directory or give the full path to the file. Ansiblator mainly uses a file such as ~/ansible_hosts.

    import ansiblator.api as an

    ans = an.Ansiblator()
    ret = ans.local("uname -a", now=True, use_shell=True)
    ans.run("uname -a", now=True)
    ans.runner("uptime")
    ans.run_all()
    ans.copy(src="/tmp/aabc.csv", dest="/tmp/", pattern="pc", now=True)

Specify an Ansible hosts file and select a pattern:

    ans = an.Ansiblator(inventory="/tmp/ansible_file", pattern="pc")

Use a dictionary to create the inventory:

    inv = {'pc': [{'ssh_host': '192.168.0.10', 'ssh_user': 'test_user', 'su_user': 'root'},
                  {'ssh_host': '192.168.0.12', 'ssh_user': 'test_user2', 'su_pass': 'paasswd', 'su_user': 'root'}]}
    ans = an.Ansiblator(inventory=inv)
    ans.run("uname -a", now=True)

Prepare commands and run after:

    ans = an.Ansiblator(run_at_once=False)
    ans.get(src="/tmp/file", dest="/tmp/")
    ans.get(src="/tmp/file2", dest="/tmp/")
    ans.run_all()

Make a custom class:

    class Automatization(Ansiblator):
        def update_server(self, su=True, sudo=False):
            self.run("apt-get update", su=su, sudo=sudo)
            self.run("apt-get upgrade -y", su=su, sudo=sudo)

Use the custom class and more patterns together:

    ans = Automatization(pattern=['servers', 'production', 'test', 'pc'])
    ans.update_server()

With this, you can create full commands or functions and just pass a pattern to them and run at the end.

Need all modules inside Ansible?

    ans = an.Ansiblator()
    ans.get_all_modules()
    # now you should be able to do
    ans.user(name="hugo")
    # or even
    ans.pip(name="six", virtualenv="/tmp/venv", virtualenv_site_packages="yes")

More information
================

Ansiblator automatically saves the returned JSON values for actual runs, so you can use them for testing and conditions. For example:

    return_code = ans.local("uname -a", now=True, use_shell=True)
    return_code['contacted']

or

    return_code = ans.local(["uname", "-a"], now=True, use_shell=False)
    return_code['contacted']

Todo
====

- make more tests
- improve logging
- improve DictToInventory mapper, so more options are possible, such as groups and so on

Changes
=======

- ability to run on more patterns
- fixes on more runs
- run all modules on ansible

Info
====

For more information you can consult the functions or the actual Ansible documentation. More information can also be found at http://www.pripravto.cz. You can also contact us there.
|
ansible
|
Ansible

Ansible is a radically simple IT automation system. It handles configuration management, application
deployment, cloud provisioning, ad-hoc task execution, network automation, and multi-node
orchestration. Ansible makes complex changes like zero-downtime rolling updates with load balancers
easy. More information on the Ansible website.

This is the ansible community package. The ansible python package contains a set of
independent Ansible collections that are curated by the community, and it pulls in ansible-core.
The ansible-core python package contains the core runtime and CLI tools, such as ansible and
ansible-playbook, while the ansible package contains extra modules, plugins, and roles.

ansible follows semantic versioning. Each major version of ansible depends on a specific major
version of ansible-core and contains specific major versions of the collections it includes.

Design Principles: have an extremely simple setup process and a minimal learning curve; manage machines quickly and in parallel; avoid custom agents and additional open ports, be agentless by leveraging the existing SSH daemon; describe infrastructure in a language that is both machine and human friendly; focus on security and easy auditability/review/rewriting of content; manage new remote machines instantly, without bootstrapping any software; allow module development in any dynamic language, not just Python; be usable as non-root; be the easiest IT automation system to use, ever.

Use Ansible: you can install a released version of Ansible with pip or a package manager. See our Installation guide for details on installing Ansible on a variety of platforms.

Reporting Issues: issues with plugins and modules in the Ansible package should be reported
on the individual collection's issue tracker. Issues with ansible-core should be reported on
the ansible-core issue tracker. Issues with the ansible package build process or serious bugs or
vulnerabilities in a collection that are not addressed after opening an issue
in the collection's issue tracker should be reported on ansible-build-data's issue tracker.
Refer to the Communication page for a list of support channels if you need assistance from the
community or are unsure where to report your issue.

Get Involved: read Community Information for ways to contribute to and interact with the project,
including mailing list information and how to submit bug reports and code to Ansible or Ansible
collections. Join a Working Group, an organized community devoted to a specific technology domain
or platform. Talk to us before making larger changes to avoid duplicate efforts. This not only
helps everyone know what is going on, but it also helps save time and effort if we decide some
changes are needed. For a list of email lists, Matrix and IRC channels, and Working Groups, see
the Communication page.

Coding Guidelines: we document our Coding Guidelines in the Developer Guide. We also suggest you
review: Developing modules checklist, Collection contributor guide.

Branch Info: the Ansible package is a 'batteries included' package that brings in ansible-core and
a curated set of collections. Ansible uses semantic versioning (for example, Ansible 5.6.0). The
Ansible package has only one stable branch, called 'latest' in the documentation. See Ansible
release and maintenance for information about active branches and their corresponding ansible-core
versions. Refer to the ansible-build-data repository for the exact versions of ansible-core and
collections that are included in each ansible release.

Roadmap: based on team and community feedback, an initial roadmap will be published for a major
version (example: 5, 6). The Ansible Roadmap details what is planned and how to influence the roadmap.

Authors: Ansible was created by Michael DeHaan and has contributions from over 4700 users (and growing). Thanks everyone! Ansible is sponsored by Red Hat, Inc.

License: GNU General Public License v3.0 or later. See COPYING for the full license text.
|
ansible-1password-lookup-plugin
|
1Password Local Lookup Plugin

This is a simple lookup plugin that searches for secrets in a local 1Password database (B5.sqlite format).
It uses the onepassword-local-search python module, which greatly improves performance over querying the 1Password servers directly.

Requirements: you require python 3.7 and the onepassword-local-search module:

pip3 install onepassword-local-search

Example Playbook:

- hosts: servers
  roles:
    - role: mickaelperrin.ansible-onepassword-local-lookup-plugin
  tasks:
    - debug:
        msg: "{{ lookup('onepassword_local', 'p6iyvjqv4xdxw52hsacpkq4rgi', field='name') }}"
    - debug:
        msg: "{{ lookup('onepassword_local', 'c3264cef-1e5e-4c96-a192-26729539f3f5', field='your_custom_field') }}"
    - debug:
        msg: "{{ lookup('onepassword_local', '1234567890', field='password') }}"

Custom uuid feature: the uuid in 1Password changes when you move an item from one vault to another. To prevent this issue, a custom uuid mapping feature has been implemented. You need to add on each item a field named UUID (in capitals). Then run "op-local mapping update" to generate the mapping table relationship. You can display the UUID mapping by running "op-local mapping list". As we migrated from Lastpass to 1Password, we have also implemented a UUID mapping feature related to a field named LASTPASS_ID. If the uuid given is 100% numeric, the search query will be performed over this field.

Tests: tests are managed by pytest for the python part and molecule for the ansible part, with docker as driver.

mkvirtualenv3 ansible-onepassword-local-lookup-plugin
pip install -r requirements/dev.txt

Pytest:

pytest

Molecule: ensure that the docker service is up and running:

molecule test

License: GPLv3
|
ansible2puml
|
ansible2puml

About ansible2puml: create a PlantUML activity diagram from playbooks and roles through Python. A .puml file with the PlantUML syntax is generated, and a link to display the diagram as PNG is generated.

Requirements: Python version >3.6.

Install package:

Install via pypi: pip install ansible2puml
Install via git: pip install git+https://github.com/ProfileID/ansible2puml

How to: Playbook:

ansible2puml --source play.yml --destination play.puml

Example Source: example-playbook.yml
|
ansible-aisnippet
|
ansible-aisnippet

Features

Quickstart

Install python package: install the latest version of ansible-aisnippet with pip or pipx:

pip install ansible-aisnippet

Usage: you must have an openai.api_key. You can create it from the openai website.

export OPENAI_KEY=<yourtoken>
ansible-aisnippet --help
Usage: ansible-aisnippet [OPTIONS] COMMAND [ARGS]...
Options:
  --version, -v         Show the application's version and exit.
  --install-completion  Install completion for the current shell.
  --show-completion     Show completion for the current shell, to copy it or customize the installation.
  --help                Show this message and exit.

Commands:
  generate  Ask ChatGPT to write an ansible task using a template

Generate task(s): ansible-aisnippet can generate one or several ansible tasks from a description or
a tasks file.

ansible-aisnippet generate --help
Usage: ansible-aisnippet generate [OPTIONS] [TEXT]

Ask ChatGPT to write an ansible task using a template

Arguments:
  text [TEXT]  A description of the task to get [default: Install package htop]

Options:
  --verbose, -v          verbose mode
  --filetasks, -f PATH   [default: None]
  --outputfile, -o PATH  [default: None]
  --playbook, -p         Create a playbook
  --help                 Show this message and exit.

Generate a task:

export OPENAI_KEY=<yourtoken>
ansible-aisnippetgenerate"execute command to start /opt/application/start.sh create /var/run/test.lock"Buildingprefixdictfromthedefaultdictionary...
Loadingmodelfromcache/tmp/jieba.cache
Loadingmodelcost0.686seconds.
Prefixdicthasbeenbuiltsuccessfully.
name:Executecommandtostart/opt/application/start.shcreate/var/run/test.lock
ansible.builtin.command:chdir:/opt/applicationcmd:./start.sh&&touch/var/run/test.lockcreates:/var/run/test.lockremoves:''Generate severals tasksansible-aisnippet can generate severals tasks from a file. The file is a yaml file which contains a list of tasks and blocksEx:-task:Install package htop, nginx and net-tools with generic module-task:Copy file from local file /tmp/toto to remote /tmp/titi set mode 0666 owner bob group wwwregister:test-name:A blockwhen:test.rc == 0block:-task:wait for port 6300 on localhost timeout 25rescue:-task:Execute command /opt/application/start.sh creates /var/run/test.lock-task:Download file from https://tmp.io/test/ set mode 0640 and force trueThis file produces this result :exportOPENAI_KEY=<yourtoken>
ansible-aisnippetgenerate-ftest.yml-p
Buildingprefixdictfromthedefaultdictionary...
Loadingmodelfromcache/tmp/jieba.cache
Loadingmodelcost0.671seconds.
Prefixdicthasbeenbuiltsuccessfully.
Result:
-name:Playbookgeneratedwithchatgpthosts:allgather_facts:truetasks:-name:Installpackagehtop,nginxandnet-toolsansible.builtin.yum:name:-htop-nginx-net-toolsstate:present-name:Copyfilefromlocalfile/tmp/tototoremote/tmp/titiansible.builtin.copy:src:/tmp/totodest:/tmp/titimode:'0666'owner:bobgroup:wwwregister:test-name:Ablockwhen:test.rc==0block:-name:Waitforport6300onlocalhosttimeout25ansible.builtin.wait_for:host:127.0.0.1port:'6300'timeout:'25'rescue:-name:Executecommand/opt/application/start.shcreates/var/run/test.lockansible.builtin.command:chdir:/tmp/testcmd:/opt/application/start.shcreates:/var/run/test.lock-name:Downloadfilefromhttps://tmp.io/test/ansible.builtin.get_url:backup:falsedecompress:truedest:/tmp/testforce:truegroup:rootmode:'0640'owner:roottimeout:'10'tmp_dest:/tmp/testurl:https://tmp.io/test/validate_certs:true
|
ansible-alicloud
|
No description available on PyPI.
|
ansible-alicloud-bak
|
No description available on PyPI.
|
ansible-alicloud-module-utils
|
No description available on PyPI.
|
ansible-anonymizer
|
Library to clean up Ansible tasks from any Personally Identifiable Information (PII). Free software: Apache Software License 2.0.

Anonymized fields: credit card number, email address, IP address, MAC address, US SSN, US phone number, YAML comment, password value (when the field name is identified as being sensitive), user name from home directory path.

Usage: the library can be used to remove the PII from a multi-level structure:

from ansible_anonymizer.anonymizer import anonymize_struct

example = [{"name": "foo bar", "email": "[email protected]"}]
anonymize_struct(example)
# [{'name': 'foo bar', 'email': '[email protected]'}]

But you can also anonymize a block of text:

from ansible_anonymizer.anonymizer import anonymize_text_block

some_text = """
- name: a task
  a_module:
    secret: foobar
"""
anonymize_text_block(some_text)
# '\n- name: a task\n  a_module:\n    secret: "{{ secret }}"\n'

You can also use the ansible-anonymizer command:

ansible-anonymizer my-secret-file

Customize the anonymized strings: by default, the variables are anonymized with a string based on the name of the field.
You can customize it with the value_template parameter:

from ansible_anonymizer.anonymizer import anonymize_struct
from string import Template

original = {"password": "$RvEDSRW#R"}
value_template = Template("_${variable_name}_")
anonymize_struct(original, value_template=value_template)
# {'password': '_password_'}

Limitations: anonymize_text_block() relies on its own text parser, which only supports a subset of YAML features. Because of this, it may not be able to identify some PII. When possible, use anonymize_struct, which accepts a Python structure instead.
The Anonymizer is not a silver bullet and it's still possible to see PII going through the filters.
|
ansibleapi
|
Ansible API is a REST based front-end to run Ansible Playbooks in a very lightweight server. Current features are: query the API for what roles/playbooks are available; run an Ansible Playbook from the API.
|
ansible-api
|
ansible-api v0.5.1: a restful http api for ansible (python version >= 3.7).

What is it? Ansible is a radically simple IT automation system.
If you are trying to use it and do not like the CLI, you can try this now. It lets you use ansible through a RESTful HTTP API and a realtime processing message (websocket api), so you can see all details.

Changelog:
0.5.1: add sha256 encryption support for signature (thx: jbackman); fit for latest ansible (v2.8.6) and ansible-runner (v1.4.2); add more error event capture in response
0.5.0: replace tornado with sanic, more lightweight (python>=3.7)
0.3.0: using ansible-runner as middleware
0.2.6: adapt to ansible 2.6.4 and add asynchronization mode
0.2.2: optimize log
0.2.1: optimize log and allow multi-instance on the same host
0.2.0: support websocket, remove code invaded in ansible

Structure chart

How to install: [preparatory work] python version >= 3.7 (uses the asyncio feature)

pip3 install ansible-api

How to start it: default configuration: /etc/ansible/api.cfg

start: ansible-api -c [Configfile, Optional] -d [Daemon Mode, Optional]
eg: ansible-api -c /etc/ansible/api.cfg -d > /dev/null &

How to prepare your data / HTTP API Usage
ansible-apply
|
Ansible-apply takes the tasks file spec as the first argument:

ansible-apply role.name
ansible-apply role.name/some_tasks
ansible-apply https://some/playbook.yml
ansible-apply ./playbook.yml

It will automatically download the playbook or role if not found. Then, the command takes any number of hosts and inventory variables on
the command line:

ansible-apply role.name server1 server2 update=true

Finally, any arguments passed with dashes are forwarded to the
ansible-playbook command it generates, but named args must use the = notation, and not a space, to not confuse the command line parser:

# works:
ansible-apply role.name server2 update=true --become-user=root
# does not:
ansible-apply role.name server2 update=true --become-user root
|
ansible-argspec-gen
|
This package contains code for the Ansible argument specification program. Its main
audience is Ansible module maintainers who would like to reduce the
duplication in their modules by generating the argument specification directly
from the module’s user documentation.

Quickstart

The documentation extractor is published on PyPI and we can install it using pip:

$ pip install ansible-argspec-gen[base]    # This will install ansible-base
$ pip install ansible-argspec-gen[ansible] # This will install ansible
$ pip install ansible-argspec-gen # We need to install ansible or
# ansible-base ourselves

If the previous command did not fail, we are ready to start updating our
modules. When we use the generator for the first time, we need to perform the
following three steps:

1. Add two comments to the module’s source that will mark the location for the
generated code. By default, the generator searches for the # AUTOMATIC MODULE ARGUMENTS comment, but this can be changed with the --marker command-line parameter.
2. Run the generator, possibly in dry-run and diff mode first to spot any
issues.
3. Remove any hand-written remnants that are not needed anymore.

For example, let us assume that the first few lines of our module’s main
function look like this before the generator run:

def main():
# AUTOMATIC MODULE ARGUMENTS
# AUTOMATIC MODULE ARGUMENTS
module = AnsibleModule(

If we run the generator now in check mode with difference printing switched
on, we will get back something like this:

$ ansible-argspec-gen --diff --dry-run plugins/modules/route.py
--- ../ansible_collections/steampunk/nginx_unit/plugins/modules/route.py.old
+++ ../ansible_collections/steampunk/nginx_unit/plugins/modules/route.py.new
@@ -359,6 +359,52 @@
def main():
# AUTOMATIC MODULE ARGUMENTS
+ argument_spec = {
+ "global": {"default": False, "type": "bool"},
+ "name": {"type": "str"},
+ "socket": {"type": "path"},
+ "state": {
+ "choices": ["present", "absent"],
+ "default": "present",
+ "type": "str",
+ },
+ }
+ required_if = [("global", False, ("name",)), ("state", "present", ("steps",))]
# AUTOMATIC MODULE ARGUMENTS
module = AnsibleModule(

Once we are happy with the proposed changes, we can write them to the file:

$ ansible-argspec-gen plugins/modules/route.py

If we update the module’s documentation, we can simply rerun the previous
command and the generator will take care of updating the specification. Note that the
generator will overwrite the content between the markers, so make sure you
do not manually modify that part of the file or you will lose the changes on
next update.

Writing module documentation

Generating the argument specification for the AnsibleModule class should work
on any module that has documentation. But getting the generator to produce
other parameters such as conditional requirements takes a bit of work.

In order to generate a required_if specification, our parameters need to
have a sentence in their description that fits the template "required if
I({param_name}) is C({param_value})". The next example:

options:
name:
description:
- Name of the resource. Required if I(state) is C(present).

will produce the following specification:

required_if = [("state", "present", ("name", ))]

Another thing that the generator knows how to produce is the mutually_exclusive specification. The pattern that the generator is looking for in this case is "Mutually exclusive with I({param1}), I({param2}), and I({param3})", where the
number of parameters that we can specify is not limited. Example:

options:
processes:
description:
- Dynamic process limits.
- Mutually exclusive with I(no_processes).
no_processes:
description:
- Static process limit.
- Mutually exclusive with I(processes).

This will produce:

mutually_exclusive = [("no_processes", "processes")]

Development setup

Getting the development environment up and running is relatively simple if we
have pipenv installed:

$ pipenv update

To test the extractor, we can run:

$ pipenv run ansible-argspec-gen
|
ansible-art
|
A simple tool to apply an Ansible role.
|
ansible-autodoc
|
ansible-autodoc

Generate documentation from annotated playbooks and roles using templates. Note: this project is currently in Beta; issues, ideas and pull requests are welcome.

Features: document playbook projects and roles; use templates to generate and maintain the documentation; extended functions when documenting, e.g. tags: the autodoc will search for used tags in the project.

Getting started

# install
pip install ansible-autodoc
# print help
ansible-autodoc -h
# print parsed annotation results in the cli
ansible-autodoc -p all path/to/role_or_playbook
# generate README file based on annotations
ansible-autodoc [path/to/project]

Notes: you can use grip to see the live changes; this only runs with Python 3, if you still have Python 2.x use pip3.

Annotations

Use the following annotations in your playbooks and roles.

meta: use @meta to annotate the metadata of a playbook or role, like author.
Check the below list of useful metadata: author (self explanatory); description: playbook / role description; name: to define a different role/project name instead of the folder name; license (self explanatory); email (self explanatory).

# @meta author: Author Name
# @meta description: Project description

todo: use @todo to annotate improvements, bugs etc.

# @todo bug: bug description
# @todo improvement: improvement

action: use @action to annotate actions performed by the playbook/role.

# @action install # this action describes the installation
# @action # this action does not have a section, only description

tags: use @tag to annotate tags; this is a special annotation, as it will not only search for annotations
but also for used tags in the project and add them to the generated output.

# @tag tagname # tag description

variables: use @var to annotate configuration variables.

# @var my_var: default_value # description of the variable

example: the idea is that after every annotation, we can define an example block, linked to the annotation.
In this case the example will be part of the var annotation.

# @var my_var: default_value # description of the variable
my_var: default_value
# @example # the hash is needed due to the parser constraints
# my_var:
#   - subitem: string
#   - subitem2: string
# @end

Templates

ansible-autodoc comes with 3 templates out of the box; the default is "readme", and you can change this in configuration. If you want to create your own project-specific templates, see the template documentation. If a file already exists in the output, you will be prompted to overwrite or abort.

README: the default "readme" template will generate a README.md file in the root of the project, detailing the sections: title and description, actions, tags, variables, todos, license, author information. You can extend this by creating a file "_readme_doby.md" in the root of your project; this will be included in the rendered Readme just after the
initial description.

Doc and README: the "doc_and_readme" template is an extended template intended to be used for playbook projects with several roles. It will generate a minimal
README.md file and a documentation subfolder "doc" with more detailed information. You can extend this by creating a file "_readme_doby.md" in the root of your project; this will be included in the rendered Readme just after the
initial description.

The files created in the documentation folder will cover: tags: list all tags classified by roles; variables: list all variables classified by roles; todo: list all todo actions classified by roles; report: provides a report of the project and useful information during development. You can extend the documentation in this folder, just keep in mind that generated files will be overwritten.

Command line: the "cliprint" template is used to display the content when you use the command line print parameter "-p".

Configuration: you can create a configuration file "autodoc.config.yaml" in the root of your project in order to modify
several behaviours, see the sample config file for more details:

# role or project with playbooks
$ cd <project>
# create sample configuration (optional)
# you can pass the options as parameters too
$ ansible-autodoc --sample-doc > autodoc.config.yaml
|
ansible-autodoc-fork
|
ansible-autodoc

Generate documentation from annotated playbooks and roles using templates. Note: this project is currently in Beta; issues, ideas and pull requests are welcome.

Features: document playbook projects and roles; use templates to generate and maintain the documentation; extended functions when documenting, e.g. tags: the autodoc will search for used tags in the project.

Getting started

# install
pip install ansible-autodoc
# print help
ansible-autodoc -h
# print parsed annotation results in the cli
ansible-autodoc -p all path/to/role_or_playbook
# generate README file based on annotations
ansible-autodoc [path/to/project]notes:you can usegripto see the live changes.this only runs with python 3, if you still have python 2.x use pip3AnnotationsUse the following annotations in your playbooks and rolesmeta:use @meta to annotate the metadata of playbook or role, like author
check below list of useful metadataauthor: (self explanatory)description: playbook / role descriptionname: to define a different role/project name instead of the folder namelicense: (self explanatory)email: (self explanatory)# @meta author: Author Name# @meta description: Project descriptiontodo:use @todo to annotate improvements, bugs etc# @todo bug: bug description# @todo improvement: improvementaction:use @action to annotate a actions performed by the playbook/role# @action install # this action describes the installation# @action # this action does not have a section, only descriptiontags:use @tag to annotate tags, this is a special annotation as this will not only search for annotations,
but also for used tags in the project and add that to the generated output.# @tag tagname # tag descriptionvariables:use @var to annotate configuration variables# @var my_var: default_value # description of the variableexample:the idea is that after every annotation, we can define an example block, linked to the annotation.
in this case the example will be part of the var annotation.# @var my_var: default_value # description of the variablemy_var:default_value# @example # the hash is needed due to the parser constraints# my_var:# - subitem: string# - subitem2: string# @endTemplatesansible-autodoc comes with 3 templates out of the box, the default is "readme", you can change this in configuration.If you want to create your own project specific templates, see thetemplate documentationIf a file already exists in the output, then you will be prompted to overwrite or abort.READMEThe default "readme" template will generate a README.md file in the root of the project, detailing the sections:title and descriptionactionstagsvariablestodoslicenseauthor informationyou can extend this by creating a file"_readme_doby.md"in the root of your project, this will be included in the rendered Readme just after the
initial description.Doc and READMEThe "doc_and_readme" template is an extended template intended to be used in playbook projects with several roles; it will generate a minimal
README.md file and a documentation subfolder "doc" with more detailed information.you can extend this by creating a file"_readme_doby.md"in the root of your project, this will be included in the rendered Readme just after the
initial description.the files created in the documentation folder will cover:tags: list all tags classified by rolesvariables: list all variables classified by rolestodo: list all todo actions classified by rolesreport: provides a report of the project and useful information during developmentyou can extend the documentation in this folder, just keep in mind that generated files will be overwritten.Command lineThe "cliprint" template is used to display the content when you use the command line print parameter "-p"Configurationyou can create a configuration file "autodoc.config.yaml" in the root of your project in order to modify
several behaviours, see the sample config file for more details:# role or project with playbooks
$ cd <project>
# create sample configuration (optional)
# you can pass the options as parameters too
$ ansible-autodoc --sample-doc > autodoc.config.yaml
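Putting the annotations together, a hedged sketch of an annotated tasks file is shown below; the author, description, action and tag values are hypothetical, not taken from the project docs:
# @meta author: Jane Doe
# @meta description: Installs and configures an example service
# @todo improvement: make the package name configurable
# @action install # installs the example package
- name: Install the example package
  package:
    name: example
    state: present
  tags:
    - install # ordinary task tags like this one are also picked up by the @tag scan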
|
ansibleawx-client
|
Ansible AWX ClientDonate to help keep this project maintainedSummaryThis is an unofficial python API client for Ansible AWX.RequirementsrequestsQuick Start GuideInstall Ansible AWX Clientpip install ansibleawx-clientInitialize API ClientYou can do this with your username and password or using your Token.Initialize client with your username and passwordimport ansibleawx
API_URL = "http://my-ansibleawx.com/api/v2"
client = ansibleawx.Api("username", "password", api_url=API_URL)Initialize client with your tokenimport ansibleawx
API_URL = "http://my-ansibleawx.com/api/v2"
TOKEN = "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"
client = ansibleawx.Api(api_url=API_URL, token=TOKEN)ExamplesGet Inventories# to get all inventories
response = client.get_inventories()
# to get specific inventory by id
response = client.get_inventories(1)Get Jobs Templates# to get all jobs templates
response = client.get_jobs_templates()
# to get specific job template by id
response = client.get_jobs_templates(1)Launch Job Template by idresponse = client.launch_job_template(1)Relaunch Job by idresponse = client.relaunch_job(1)Cancel Job by idresponse = client.cancel_job(1)Function ReferenceConsult theAnsible Tower documentationfor more details.
|
ansible-base
|
AnsibleAnsible is a radically simple IT automation system. It handles
configuration management, application deployment, cloud provisioning,
ad-hoc task execution, network automation, and multi-node orchestration. Ansible makes complex
changes like zero-downtime rolling updates with load balancers easy. More information onthe Ansible website.Design PrinciplesHave a dead simple setup process and a minimal learning curve.Manage machines very quickly and in parallel.Avoid custom-agents and additional open ports, be agentless by
leveraging the existing SSH daemon.Describe infrastructure in a language that is both machine and human
friendly.Focus on security and easy auditability/review/rewriting of content.Manage new remote machines instantly, without bootstrapping any
software.Allow module development in any dynamic language, not just Python.Be usable as non-root.Be the easiest IT automation system to use, ever.Use AnsibleYou can install a released version of Ansible viapip, a package manager, or
ourrelease repository. See ourinstallation guidefor details on installing Ansible
on a variety of platforms.Red Hat offers supported builds ofAnsible Engine.Power users and developers can run thedevelbranch, which has the latest
features and fixes, directly. Although it is reasonably stable, you are more likely to encounter
breaking changes when running thedevelbranch. We recommend getting involved
in the Ansible community if you want to run thedevelbranch.Get InvolvedReadCommunity
Informationfor all
kinds of ways to contribute to and interact with the project,
including mailing list information and how to submit bug reports and
code to Ansible.Join aWorking Group, an organized community devoted to a specific technology domain or platform.Submit a proposed code update through a pull request to thedevelbranch.Talk to us before making larger changes
to avoid duplicate efforts. This not only helps everyone
know what is going on, it also helps save time and effort if we decide
some changes are needed.For a list of email lists, IRC channels and Working Groups, see theCommunication pageCoding GuidelinesWe document our Coding Guidelines in theDeveloper Guide. We particularly suggest you review:Contributing your module to AnsibleConventions, tips and pitfallsBranch InfoThedevelbranch corresponds to the release actively under development.Thestable-2.Xbranches correspond to stable releases.Create a branch based ondeveland set up adev environmentif you want to open a PR.See theAnsible release and maintenancepage for information about active branches.RoadmapBased on team and community feedback, an initial roadmap will be published for a major or minor version (ex: 2.7, 2.8).
TheAnsible Roadmap pagedetails what is planned and how to influence the roadmap.AuthorsAnsible was created byMichael DeHaanand has contributions from over 4700 users (and growing). Thanks everyone!Ansibleis sponsored byRed Hat, Inc.LicenseGNU General Public License v3.0 or laterSeeCOPYINGto see the full text.
|
ansible-bender
|
ansible-benderThis tool bends containers usingAnsibleplaybooksand turns them into container images. It has a pluggable builder selection —
it is up to you to pick the tool which will be used to construct your container
image. Right now the only supported builder isbuildah.Moreto
comein the future.
Ansible-bender (ab) relies onAnsible connection
pluginsfor
performing builds.tl;dr Ansible is the frontend, buildah is the backend.The concept is described in following blog posts:Building containers with buildah and ansible.Ansible and Podman Can Play Together Now.Looking for maintainers ❤This project doesn't have an active maintainer right now thatwould watch issues daily.If you are a user of ansible-bender and are familiar with Python, please consider becoming a maintainer.FeaturesYou can build your container images with buildah as a backend.Ansible playbook is your build recipe.You are able to set various image metadata via CLI or as specific Ansible vars:working directoryenvironment variableslabelsuserdefault commandexposed portsYou can do volume mounts during build.Caching mechanism:Every task result is cached as a container image layer.You can turn this off with--no-cache.You can disable caching from a certain point by adding a tagno-cacheto a task.You can stop creating new image layers by adding tagstop-layeringto a task.If an image build fails, it's committed and named with a suffix-[TIMESTAMP]-failed(so
you can take a look inside and resolve the issue).The tool tries to find python interpreter inside the base image.You can push images you built to remote locations such as:a registry, a tarball, docker daemon, ...podman pushis used to perform the push.DocumentationYou can read more about this project in the documentation:Documentation homeInterfaceInstallationConfigurationUsageCaching and Layering mechanismContribution guideAnsible-bender in OKD
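To illustrate the caching tags described in the feature list above, a hedged playbook sketch follows; the package name and commands are hypothetical:
- hosts: all
  tasks:
    - name: Install packages (each task result is cached as an image layer by default)
      package:
        name: httpd
        state: present
    - name: Disable caching from this task onwards
      command: touch /tmp/cache-buster   # hypothetical step
      tags:
        - no-cache
    - name: Stop creating new image layers for the remaining tasks
      command: rm -f /tmp/cache-buster   # hypothetical step
      tags:
        - stop-layering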
|
ansible-builder
|
Ansible BuilderAnsible Builder is a tool that automates the process of building execution
environments using the schemas and tooling defined in various Ansible
Collections and by the user.See the readthedocs page foransible-builderat:https://ansible-builder.readthedocs.io/en/stable/Get Involved:We useGitHub issuesto
track bug reports and feature ideasWant to contribute, check out ourguideJoin us in the#ansible-builderchannel on Libera.chat IRCFor the full list of Ansible email Lists, IRC channels and working groups,
check out theAnsible Mailing
listspage of the official Ansible documentation.Code of ConductWe ask all of our community members and contributors to adhere to theAnsible
code of
conduct. If
you have questions, or need assistance, please reach out to our community team
[email protected] License v2.0
|
ansible-bundle
|
# ansible-bundleSmall tool for automatic download roles and libs a-la-Gemfile# PreambleAs many roles have changed their configurations among time, anyone would use aspecific version of a role (for instance, a commit, or a branch, or a tag).Moreover, a complex playbook could need different versions from the same role.This app will download roles (bundles, from now on) from theirrepositories before launching a playbook. _That means that a role should be inits own repository_.# Prerequisites- Ansible. Any version.- Git >= 1.8.5- Python 2 >= 2.6# Installation## The easy way: pip`sudo pip install ansible-bundle`## The not-that-easy way: from code- Download [latest release](../../archive/master.zip)- Run `sudo python setup.py`# Syntax`ansible-bundle FILEYAML [ansible-playbook-options] [ansible-bundle-options]`ansible-bundle, along the ansible-playbook parameters, has also these:- `--bundle-clean-roles` Will clean roles and library directories before download (*)- `--bundle-dry` Shows what will be run (as it won't download anything,also won't search for dependencies)- `--bundle-deps-only` Don't run the playbook, just satisfy dependencies.- `--bundle-disable-color` Useful for non-interactive consoles- `--bundle-workers` concurrent connections when downloading/updating roles. Default: 1- `--bundle-safe-update` Don't clean existing roles. (*)(*) If both `bundle-clean-roles` and `bundle-safe-update` are set, `bundle-clean-roles` will take effect.# Configurationansible-bundle expects to find a `[bundle]` section into ansible.cfg, which maycontain some of the command lines parameters:- workers- verbosity- safeAnd the following extra options:- `url`: URL where the roles are located. For example, if role `apache` is in`github.com/foo/roles/apache`, the `url` should be set to `github.com/foo/roles`.Default is 'https://github.com'## bundle.cfg example[bundle]url='[email protected]:devopsysadmin/ansible-roles'workers=5verbosity=1# Example of useGiven the following playbook (site.yml):- include: site-common.ymltags:- common- hosts: allroles:- [email protected] { role: apache, version: '2.4' }Running `ansible-bundle site.yml` will search roles into the `site-common.yml` file andadd to download queue, which already includes postgresql 1.0 and apache master.Please note that each role is intended to be in its own repository, not in a folder.# Changelog## v 0.6- Syntax role/version changes to role@version. This simplifies the configuration inansible.cfg and allows branches names such as feature/something in role version.If a versioned role with the previous syntax is found, will complain with adeprecation warning. The previous syntax will be obsoleted in v 0.7.# AuthorDavid Pedersen (david.3pwood AT gmail DOT com)# LicenseGNU Public License v2 (GPLv2)
|
ansible-butler
|
ansible-butlerButler CLI for Ansible projectsFunctionsObjectActionDescriptiondirectoryinitinitialize an ansible directorydirectorycleancleanup an ansible directoryeeinitinitialize an execution environment directory for ansible-buildereedependencies[deps]parse the dependency tree based on execution environment definition (or collection requirements)rolelistlist rolesroledependencies[deps]Build a dependency graph between roles in a specified directoryrolecleanclean role directory structure (remove empty yml files & dirs)rolemk-readmeauto generate readme based on role meta and basic yml infoplaybookupdatemap legacy module names to FQCNsplaybooklist-collections[lc]list collections used in a playbook (following include_* directives)UsageUsage:
ansible-butler directory init [<dir>] [--config=PATH]
ansible-butler directory clean [<dir>] [--skip-roles]
ansible-butler ee init [<dir>] [--config=PATH]
ansible-butler ee [dependencies|deps] [--config=PATH] [<name>]
ansible-butler role list [--roles-path=PATH] [<name> --recursive]
ansible-butler role [dependencies|deps] [--roles-path=PATH]
ansible-butler role clean [--roles-path=PATH] [<name> --recursive]
ansible-butler role mk-readme [--roles-path=PATH] [<name> --recursive]
ansible-butler playbook update [--context=CONTEXT] [--config=PATH] [<name>] [--recursive] [--force]
ansible-butler playbook [list-collections|lc] [--context=CONTEXT] [--config=PATH] [<name>] [--recursive] [--force]
Arguments:
name name of target (accepts glob patterns)
dir path to directory [default: ./]
Options:
-h --help Show this screen
-r --recursive Apply glob recursively [default: False]
-f --force Make file changes in place
--config=PATH Path to config file
--roles-path=PATH Path to roles directory [default: ./roles]
--context=CONTEXT Path to context directory [default: ./]
--skip-roles Flag to skip cleaning rolesExamplesInitialize Ansible Directoryansible-butler directory init ./sandboxansible-butler directory init ./sandbox --config=~/configs/ansible-butler.ymlClean an Ansible Directoryansible-butler directory clean ./sandboxansible-butler directory clean ./sandbox --skip-rolesInitialize Execution Environment Directoryansible-butler ee init ./ee-windowsansible-butler ee init ./ee-windows --config=~/configs/ansible-butler.ymlInspect Execution Environment Dependenciesansible-butler ee dependencies execution-environment.ymlansible-butler ee deps requirements.yml --config=~/configs/ansible-butler.ymlClean Rolesansible-butler role clean my-role-1ansible-butler role clean my-role-*List Rolesansible-butler role listansible-butler role list ansible_collections/namespace/collection/rolesGenerate Dependency Graphansible-butler role depsansible-butler role deps ansible_collections/namespace/collection/rolesGenerate READMEansible-butler role mk-readme my-role-1ansible-butler role mk-readme my-role-*Update Playbooksansible-butler playbook update --context=./playbooks -ransible-butler playbook update legacy-*.ymlansible-butler playbook update -fList Collectionsansible-butler playbook list-collections --context=./playbooks -ransible-butler playbook lc example-playbook.ymlConfigurationCreate an.ansible-butler.ymlin one or more of the following locations:/etc/ansible-butler/ ## least precedence
~/
./ ## highest precedenceYou can also specify a specific path at runtime via the--configoption.# Example Configuration Schemaexecution_environment:init:version:2ansible_config:ansible.cfgee_base_image:quay.io/ansible/ansible-runner:latestee_builder_image:quay.io/ansible/ansible-builder:latestprepend_build_steps:-...append_build_steps:-...directory:init:folders:-name:pluginsfolders:...files:-README.mdfiles:-playbook.ymlrole:dependencies:output_fmt:html# [html,json]output_dest:graph.htmlinclude_tests:falsemaster_node:role-common-setupinitial_direction:downstreamtitle:ansible-butler roles dependency graphtitle_text_color:whitetitle_background_color:blacktree_options:# Customize the color palettecircleStrokeColor:'#2b8f91'linkStrokeColor:'#dddddd'closedNodeCircleColor:'#9bd3d4'openNodeCircleColor:whitecyclicNodeColor:'#FF4242'missingNodeColor:'#CC0100'maxDepthNodeColor:'#FF5850'playbook:update:modules:smart_device:redirect:zjleblanc.kasa.smart_devicecustom_module:redirect:company.it.custom_module🔗 Default configuration file🔗 Example adding test plugins directory🔗 Example adding module redirectsTroubleshootingansible-butler: command not foundcheck the $PATH environment variable and ensure that~/.local/binis includedLicenseGNU General Public LicenseAuthor InformationZach LeBlancRed Hat
|
ansible-cached-lookup
|
ansible-cached-lookupAn Ansible lookup plugin that caches the results of any other lookup, most
useful in group/host vars.By default, Ansible evaluates any lookups in a group/host var whenever the var
is accessed. For example, given a group/host var:content:"{{lookup('pipe','a-very-slow-command')}}"any tasks that accesscontent(e.g. in a template) will re-evaluate the
lookup, which adds up very quickly. Seeansible/ansible#9263.InstallationPick a name that you want to use to call this plugin in Ansible playbooks.
This documentation assumes you're using the namecached.pip install ansible-cached-lookupCreate alookup_pluginsdirectory in the directory in which you run Ansible.By default, Ansible will look for lookup plugins in anlookup_pluginsfolder
adjacent to the running playbook. For more information on this, or to change
the location where Ansible looks for lookup plugins, see theAnsible
docs.Create a file calledcached.py(or whatever name you picked) in thelookup_pluginsdirectory, with one line:fromansible_cached_lookupimportLookupModuleContributingTo run the tests, runtox.To format code to passtox -e lint, runtox -e format.
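A hedged usage sketch for the plugin described above, assuming the plugin file was named cached.py and that the wrapped lookup's name is passed as the first term:
# group_vars/all.yml
content: "{{ lookup('cached', 'pipe', 'a-very-slow-command') }}"
With this in place, the slow pipe lookup should only be evaluated once and its result reused on later accesses.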
|
ansible-cmdb
|



AboutAnsible-cmdb takes the output of Ansible’s fact gathering and converts it into
a static HTML overview page (and other things) containing system configuration
information.It supports multiple types of output (html, csv, sql, etc) and extending
information gathered by Ansible with custom data. For each host it also shows
the groups, host variables, custom variables and machine-local facts.[HTML example](https://rawgit.com/fboender/ansible-cmdb/master/example/html_fancy.html) output.Features(Not all features are supported by all templates)Multiple formats / templates:Fancy HTML (--template html_fancy), as seen in the screenshots above.Fancy HTML Split (--template html_fancy_split), with each host’s details
in a separate file (for large number of hosts).CSV (--template csv), the trustworthy and flexible comma-separated format.JSON (--template json), a dump of all facts in JSON format.Markdown (--template markdown), useful for copy-pasting into Wikis and
such.Markdown Split (--template markdown_split), with each host’s details
in a separate file (for large number of hosts).SQL (--template sql), for importing host facts into a (My)SQL database.Plain Text table (--template txt_table), for the console gurus.and of course, any custom template you’re willing to make.Host overview and detailed host information.Host and group variables.Gathered host facts and manual custom facts.Adding and extending facts of existing hosts and manually adding entirely
new hosts.Custom columnsGetting startedLinks to the full documentation can be found below, but here’s a rough
indication of how Ansible-cmdb works to give you an idea:Install Ansible-cmdb from [source, a release
package](https://github.com/fboender/ansible-cmdb/releases) or through pip:pip
install ansible-cmdb.Fetch your host’s facts through ansible:$ mkdir out
$ ansible -m setup --tree out/ allGenerate the CMDB HTML with Ansible-cmdb:$ ansible-cmdb out/ > overview.htmlOpenoverview.htmlin your browser.That’s it! Please do read the full documentation on usage, as there are some
caveats to how you can use the generated HTML.DocumentationAll documentation can be viewed at [readthedocs.io](http://ansible-cmdb.readthedocs.io/en/latest/).[Full documentation](http://ansible-cmdb.readthedocs.io/en/latest/)[Requirements and installation](http://ansible-cmdb.readthedocs.io/en/latest/installation/)[Usage](http://ansible-cmdb.readthedocs.io/en/latest/usage/)[Contributing and development](http://ansible-cmdb.readthedocs.io/en/latest/dev/)LicenseAnsible-cmdb is licensed under the GPLv3:This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.For the full license, see the LICENSE file.
|
ansible-collections.python.dist.boo
|
ansible-collections.python.dist.boo: an Ansible Collection containing a boo moduleThis is a demo of anAnsible Collectionpython.distpackaged
as aPython distribution package. It contains only oneAnsible
modulecalledboo. Given that this dist and Ansible are
installed into the same (virtual)env, it’s accessible from Ansible byFQDNpython.dist.boo. Here’s how you can call it adhoc-style:$ PYTHONPATH=`pwd` ansible -m python.dist.boo -a name=Bob localhost
[WARNING]: No inventory was parsed, only implicit localhost is available
localhost | SUCCESS => {
"changed": false,
"greeting": "Hello, Bob!",
"msg": "Greeting Bob completed."
}Alternatively, install it, instead of mangling withPYTHONPATH:$ pip install ansible-collections.python.dist.boo
$ ansible -m python.dist.boo -a name=Bob localhost
[WARNING]: No inventory was parsed, only implicit localhost is available
localhost | SUCCESS => {
"changed": false,
"greeting": "Hello, Bob!",
"msg": "Greeting Bob completed."
}PurposeAt the moment of publishing this project, this demo only works with
the code supplied byAnsible PR #67093. One can test it by
installing Ansible from that PR-branch:$ pip install git+https://github.com/sivel/ansible@acd-content-dirPublishing to PyPIThis project implements automatic building and publishing of a Python
distribution package to PyPI for tagged commits and to TestPyPI for
every push tomaster. All the heavy lifting is done by the
combination of GitHub Actions CI/CD workflows,pep517CLI tool andsetuptools-scm.The published dists are then installable by:$ pip install ansible-collections.python.dist.boo # for PyPIand$ pip install ansible-collections.python.dist.boo \
-i https://test.pypi.org/simple/ --pre # for TestPyPIIf you’re learning by example, take a look at the following files:pyproject.tomlsetup.cfg.github/workflows/publish-to-test-pypi.ymlBesides, you can follow aPyPA guide about publishing packages via
GitHub Actionsthat will walk you through the process.PrerequisitesPython 3.7+LicenseThe source code and the documentation in this project are released under
theGPL v3 license.
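As a complement to the adhoc calls shown earlier, a hedged playbook sketch using the same module and argument:
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Greet Bob via the collection module
      python.dist.boo:
        name: Bob
      register: result
    - name: Show the returned greeting
      debug:
        var: result.greeting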
|
ansible-collection-test-utils
|
No description available on PyPI.
|
ansible-compat
|
ansible-compatA python package that contains functions that facilitate working with various
versions of Ansible 2.12 and newer.Documentation is available atansible-compat.readthedocs.io.
|
ansible-compose
|
Deploying a docker-compose.ymlFirst, configure the dsn variable to have the user/host and path to store the
compose files in. If not set, localhost in current working directory will be
used:export dsn=user@host:22/absolute/target/pathThen, run ansible-compose apply -f as you would with k8s:ansible-compose apply -f ./docker-compose-example.yml -f ./examples/nginx.ymlOr, deploy from a url, just like kubectl apply would let you:ansible-compose apply -f https://raw.git.../docker-compose.ymlRunning commands on the compose of a targetProvide ssh configuration and target dir as first argument, then the
docker-compose command:export dsn=user@host:22/absolute/target/path
ansible-compose stop
ansible-compose start
ansible-compose logs
ansible-compose helpTransforms for docker-compose.yml on the CLITransforms are applied to the docker-compose.yml as if it were a template we are going
to render using environment variables.Suppose that you start with such a docker-compose.yml:version:'3.4'services:web:build:.environment:base:herewrong:todropFirst, all build lines will be dropped. You will have to
specify images with env vars ie.:web_image=abc # sets services[web][image]=abcIf you wanted to drop the environment line use the drop
prefix:drop_web_environment= # dels services[web][environment]You can also override deeper in the YAML tree:web_environment_SECRET_TOKEN=yoursecretEven for drop:drop_web_environment_VERSION=Or even drop a whole service:drop_web=You can however require all variables to have a particular prefix, ie:dev_drop_mail= # to prevent the mail service on dev branches
ansible_compose_prefix=dev # to enable only variables starting with dev_Don’t forget ansible-compose will also forward extraneous arguments to the
ansible-playbook call beneath.Automatic creation of volume dirsIt will try to automatically create the volume bind dirs for you. To set the
uid and/or guid, set them as env vars, either in the compose.yml:environment:
uid: 1001
gid: 100Or with an env var:web_environment_uid=1001InstallationLocal:pip install --user ansible-compose
export PATH="$HOME/.local/bin:$PATH"
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrcThe current configuration in CI for image yourlabs/ansible-compose is:ANSIBLE_HOST_KEY_CHECKING=falseSSH_PRIVATE_KEY=our passwordless private key generated with ssh-keygen -t ed25519 and default optionsSSH_INSECURE=true not checking any host key today !Example gitlab ci jobdeploy:stage:deployimage:yourlabs/ansible-composevariables:[email protected]/home/stagingscript:-ansible-compose apply -f ./docker-compose.yml
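Tying the transform examples above together: starting from the sample compose file and exporting web_image=abc plus drop_web_environment=, the effective compose content would, as a sketch, look roughly like this:
version: '3.4'
services:
  web:
    image: abc   # set by web_image=abc; build: lines are always dropped and
                 # the environment section is removed by drop_web_environment=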
|
ansible-conductor
|
maintain dynamic inventory in etcd, sync to/from disk
|
ansible-config-template
|
Team and repository tagsOpenStack-Ansible config_templateThis module is an ansible plugin that extends regular template functionality
with the ability to override items in config, in transit, through the use
of a simple dictionary without having to write out various temp files on target
machines.WARNINGInstallation from PyPI or as a Python module is deprecated and will be removed
in future releases.
Please, use ansible-galaxy to install the collection.More information about the projectDocumentation for the project can be found at:https://docs.openstack.org/ansible-config_template/latest/Release notes for the project can be found at:https://docs.openstack.org/releasenotes/ansible-config_template/The project source code repository is located at:https://opendev.org/openstack/ansible-config_template/The project home is at:https://launchpad.net/openstack-ansibleThe project bug tracker is located at:https://bugs.launchpad.net/openstack-ansible
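A hedged task sketch of how the plugin is typically invoked; the fully qualified module name and option names follow the upstream collection as best understood, and the file paths and override values are hypothetical:
- name: Render a config file and override items in transit
  openstack.config_template.config_template:
    src: templates/service.conf.j2
    dest: /etc/service/service.conf
    config_type: ini
    config_overrides:
      DEFAULT:
        debug: true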
|
ansibleconnect
|
Ansibleconnect - connect to all hosts from the inventory with one commandAnsible VersionCI Status2.52.62.72.82.9SSH into all hosts in your inventory with one command.Ansibleconnect creates a bash script based on your ansible inventory.
That script will create a new tmux window or session and create a separate pane
for each one of your 'sshable' inventory hosts. Inside of each of the
panes an ssh connection to the pane's host will be established.Setup example (on Ubuntu):sudo apt install tmux
sudo apt install sshpass
pip install ansibleconnectUsage examples:Connect to all hosts in inventory:source <(ansibleconnect -i inventory.yml)Connect to all hosts from group1 and group2:source <(ansibleconnect -i inventory.yml -g 'group1:group2')Connect to all hosts from group1 except for hosts that are also in group2:source <(ansibleconnect -i inventory.yml -g 'group1:!group2')Connect to all hosts from inventory except for hosts in group1:source <(ansibleconnect -i inventory.yml -g '!group1')Connect to all hosts that have AWS provider:source <(ansibleconnect -i inventory.yml -vars provider:aws)>NOTE:In case you don't use bash, you can also use theevalcommand, for example:eval "$(ansibleconnect -i inventories/inventory.yml)"Possible flags-i,--inventory- Path to ansible inventory-g,--groups- Inventory groups of hosts to connect to (multiple groups should be concatenated with:.!in front of group name means that ansibleconnect should not connect to hosts from this group). Example:-g computes:!storage--hosts- List of hostnames to connect to. Example:--hosts hostA,hostB-vars,--variables- Variables that host should have defined in inventory to connect to it. Accepted format:key:valuein case where host should have variable with specific value orkeyin case where host should have defined variable no matter what value. Example:-v type:dev,team:ui-novars,--no-variables- Variables that host should not have defined in inventory to connect to it. Accepted format:key:valuein case where host should not have variable with specific value orkeyin case where host should not have defined variable no matter what value.Example:-novars type:prod,team:salesConfigurationAnsibleconnect looks for theansible.cfgfile in the same locations as ansible.Authenticationssh-agentFor authentication one can use ssh keys.ansibleconnectwill scan the inventory file for connection options (ansible_ssh_common_args,ansible_ssh_user,ansible_host,ansible_private_key_file, etc.). Ssh keys can be passed via them. Otherwise, one can use thessh-agenttool. Environment args (SSH_AGENT_PIDandSSH_AUTH_SOCK) will be passed to each one of the tmux shells.ssh-agent setup exampleeval $(ssh-agent)
ssh-add ~/.ssh/my_private_key.pemsshpassIfansible_ssh_passvariable is used in the inventory, one should install thesshpassand make it discoverable viaPATH. Please note that when using sshpass, the password will be passed in plaintext and it will be saved in each of the tmux shells' history.
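A hedged YAML inventory sketch showing the connection options and custom variables (such as provider) referenced above; host names and addresses are hypothetical:
all:
  children:
    group1:
      hosts:
        web01:
          ansible_host: 192.0.2.10
          ansible_ssh_user: ubuntu
          ansible_private_key_file: ~/.ssh/my_private_key.pem
          provider: aws
    group2:
      hosts:
        db01:
          ansible_host: 192.0.2.20
          ansible_ssh_pass: secret   # requires sshpass, see the note above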
|
ansible-container
|
No description available on PyPI.
|
ansible-content-parser
|
ansible-content-parserOverviewansible-content-parseranalyzes Ansible files in a given source
(a local directory, an archive file or a git URL)
by runningansible-lintinternally,
updates Ansible files using theAutofix feature ofansible-lintand generates theftdata.jsonlfile, which is the training dataset for
developing custom AI models.BuildExecute thetoxcommand. Installable images are created under
thedistdirectory.InstallationPrerequisitesPython version 3.10 or later.UNIX OS, such as Linux or Mac OS.Note:Installation on Microsoft Windows OS is not supported.Procedureansible-content-parseruses a newer version ofansible-lintand its
dependent components. In order to isolate them from the existing
Ansible installations, it is recommended to installansible-content-parserin
a Python virtual environment with the following steps:Create a working directory and set up venv Python virtual environment:python -m venv ./venv
source ./venv/bin/activateInstallansible-content-parserfrom the pip repository:pip install --upgrade pip
pip install --upgrade ansible-content-parserAfter the installation is completed, verify thatansible-content-parserandansible-lintare installed correctly:ansible-content-parser --version
ansible-lint --versionA list of application versions and their dependencies is displayed.
In the output that is displayed, ensure that you have the same version ofansible-lint.Important:If there is a mismatch in the installedansible-lintversions, you cannot get consistent results from the content parser and ansible-lint.
For example, the following result shows a mismatch inansible-lintversions:$ ansible-content-parser --version
ansible-content-parser 0.0.1 using ansible-lint:6.20.0 ansible-core:2.15.4
$ ansible-lint --version
ansible-lint 6.13.1 using ansible 2.15.4
A new release of ansible-lint is available: 6.13.1 → 6.20.0If theansible-lintversions do not match, perform the following tasks:Deactivate and reactivate venv:deactivate
source ./venv/bin/activateVerify that theansible-lintversions match:ansible-content-parser --version
ansible-lint --versionFor example, the following output shows the same ansible-lint versions:$ ansible-content-parser --version
ansible-content-parser 0.0.1 using ansible-lint:6.20.0 ansible-core:2.15.4
$ ansible-lint --version
ansible-lint 6.20.0 using ansible-core:2.15.4 ansible-compat:4.1.10 ruamel-yaml:0.17.32 ruamel-yaml-clib:0.2.7Executionansible-content-parseraccepts two positional parameters (sourceandoutput)
with a few optional parameters.$ ansible-content-parser --help
usage: ansible-content-parser [-h] [--config-file CONFIG_FILE]
[--profile {min,basic,moderate,safety,shared,production}] [--fix WRITE_LIST]
[--skip-ansible-lint] [--no-exclude] [-v] [--source-license SOURCE_LICENSE]
[--source-description SOURCE_DESCRIPTION] [--repo-name REPO_NAME] [--repo-url REPO_URL]
[--version]
source output
Parse Ansible files in the given repository by running ansible-lint and generate a training dataset for Ansible
Lightspeed.
positional arguments:
source source, which can be an zip/tar archive, a git URL or a local directory
output output directory
options:
-h, --help show this help message and exit
--config-file CONFIG_FILE
Specify the configuration file to use for ansible-lint. By default it will look for '.ansible-
lint', '.config/ansible-lint.yml', or '.config/ansible-lint.yaml' in the source repository.
--profile {min,basic,moderate,safety,shared,production}
Specify which rules profile to be used for ansible-lint
--fix WRITE_LIST Specify how ansible-lint performs auto-fixes, including YAML reformatting. You can limit the
effective rule transforms (the 'write_list') by passing a keywords 'all' (=default) or 'none'
or a comma separated list of rule ids or rule tags.
-S, --skip-ansible-lint
Skip the execution of ansible-lint.
--no-exclude Do not let ansible-content-parser to generate training dataset by excluding files that caused
lint errors. With this option specified, a single lint error terminates the execution without
generating the training dataset.
-v, --verbose Explain what is being done
--source-license SOURCE_LICENSE
Specify the license that will be included in the training dataset.
--source-description SOURCE_DESCRIPTION
Specify the description of the source that will be included in the training dataset.
--repo-name REPO_NAME
Specify the repository name that will be included in the training dataset. If it is not
specified, it is generated from the source name.
--repo-url REPO_URL Specify the repository url that will be included in the training dataset. If it is not
specified, it is generated from the source name.
--version show program's version number and exitsourcepositional argumentThe first positional parameter issource, which specifies
the source repository to be used. Following three types of sources are supported:File directory.Archive file in the following table:File FormatFile ExtensionZIP.zipUncompressed TAR.tarCompressed TAR.tar.gz, .tgz, .tar.bz2, .tbz2, .tar.xz, .txzGit URL, [email protected]:ansible/workshop-examples.gitorhttps://github.com/ansible/workshop-examples.gitoutputpositional argumentThe second positional parameter isoutput, which specifies a writable
directory. If the directory already exists, it has to be
an empty directory. If it does not exist, it will be newly created with
the given name.ansible-content-parsercreates therepositorysubdirectory in theoutputdirectory and copies the contents of thesourcerepository
to it. The copied contents may be changed during the execution
of the Content Parser.OutputsThe following directory structure is created in the directory specified with theoutputpositional argument.output/
|-- ftdata.jsonl # Training dataset
|-- report.txt # A human-readable report
|
|-- repository/
| |-- (files copied from the source repository)
|
|-- metadata/
|-- lint-result.json # Metadata generated by ansible-lint
|-- sarif.json # ansible-lint results in SARIF format
|-- (other metadata files generated)ftdata.jsonlThis is the training dataset file, which is the main output ofansible-content-parser.It is in the JSONL format, each line of which represents a JSON object.report.txtThis is a human-readable report that provides the summary information of the run
ofansible-content-parser, which contains sections like:File counts per typeList of Ansible files identifiedIssues found by ansible-lintList of Ansible modules found in tasksNote: When the--skip-ansible-lintoption is specified, the first three sections do
not appear in the report.metadata directoryThis subdirectory contains a few files that contain metadata generated
in the Content Parser run.lint-result.jsonlint-result.jsonis created in themetadatasubdirectory
as the result of the execution
ofansible-content-parser. The file contains a dictionary, which
has two key/value pairs:filesThis is for the list of files that were found
in the execution. The format of each file entry is explained below.Each file entry is represented as a dictionary that contains following keysKeyDescriptionbase_kindMIME type of the file, for example,text/yamldirDirectory where the file resides.excException found while processing this file. It can be null.filenameFile namekindFile type, for example,playbook,tasksorrolenameFile name (Usually same asfilename)parentName of the parent, like a role. It can be nullroleAnsible role. It can be nullstop_processingIdentifies whether processing was stopped on this file or notupdatedIdentifies whether contents were updated byansible-lintFollowing shows an example of a file entry:{"base_kind":"text/yaml","dir":"/mnt/input/roles/delete_compute_node/tasks","exc":null,"filename":"roles/delete_compute_node/tasks/main.yaml","kind":"tasks","name":"roles/delete_compute_node/tasks/main.yaml","parent":"roles/delete_compute_node","role":"delete_compute_node","stop_processing":false,"updated":false}excludedThis is for the list of file paths, which were excluded in the secondansible-lintexecution because syntax check errors were found in those files on the first execution.
The files included in the list will not appear in the entries associated with thefileskey.Note:Ifansible-content-parseris executed with the--no-excludeoption, the second execution
does not occur even if syntax check errors were found on the first execution and
the training dataset will not be created.sarif.jsonThis is the output ofansible-lintwith the--sarif-fileoption.
Thereport.txtcontains a summary generated
from this file in the "Issues found by ansible-lint" section.
|
ansible-core
|
AnsibleAnsible is a radically simple IT automation system. It handles
configuration management, application deployment, cloud provisioning,
ad-hoc task execution, network automation, and multi-node orchestration. Ansible makes complex
changes like zero-downtime rolling updates with load balancers easy. More information on the Ansiblewebsite.Design PrinciplesHave an extremely simple setup process with a minimal learning curve.Manage machines quickly and in parallel.Avoid custom-agents and additional open ports, be agentless by
leveraging the existing SSH daemon.Describe infrastructure in a language that is both machine and human
friendly.Focus on security and easy auditability/review/rewriting of content.Manage new remote machines instantly, without bootstrapping any
software.Allow module development in any dynamic language, not just Python.Be usable as non-root.Be the easiest IT automation system to use, ever.Use AnsibleYou can install a released version of Ansible withpipor a package manager. See ourinstallation guidefor details on installing Ansible
on a variety of platforms.Power users and developers can run thedevelbranch, which has the latest
features and fixes, directly. Although it is reasonably stable, you are more likely to encounter
breaking changes when running thedevelbranch. We recommend getting involved
in the Ansible community if you want to run thedevelbranch.Get InvolvedReadCommunity Informationfor all
kinds of ways to contribute to and interact with the project,
including mailing list information and how to submit bug reports and
code to Ansible.Join aWorking Group,
an organized community devoted to a specific technology domain or platform.Submit a proposed code update through a pull request to thedevelbranch.Talk to us before making larger changes
to avoid duplicate efforts. This not only helps everyone
know what is going on, but it also helps save time and effort if we decide
some changes are needed.For a list of email lists, IRC channels and Working Groups, see theCommunication pageCoding GuidelinesWe document our Coding Guidelines in theDeveloper Guide. We particularly suggest you review:Contributing your module to AnsibleConventions, tips, and pitfallsBranch InfoThedevelbranch corresponds to the release actively under development.Thestable-2.Xbranches correspond to stable releases.Create a branch based ondeveland set up adev environmentif you want to open a PR.See theAnsible release and maintenancepage for information about active branches.RoadmapBased on team and community feedback, an initial roadmap will be published for a major or minor version (ex: 2.7, 2.8).
TheAnsible Roadmap pagedetails what is planned and how to influence the roadmap.AuthorsAnsible was created byMichael DeHaanand has contributions from over 5000 users (and growing). Thanks everyone!Ansibleis sponsored byRed Hat, Inc.LicenseGNU General Public License v3.0 or laterSeeCOPYINGto see the full text.
|
ansible-coverage-callback
|
# ansible-coverage-callback[](https://pypi.python.org/pypi/ansible-coverage-callback/)[](https://opensource.org/licenses/MIT)Coverage Tool for Ansible.## Requirements* Ansible >=2.4## InstallationInstall this Ansible plugin with:```$ pip install ansible-coverage-callback```Be sure to whitelist the plugin in your `ansible.cfg`:```[defaults]callback_whitelist = coverage```## Skip coverage tagYou may skip task or tasks from coverage report by adding `skip_coverage` tag:```---- name: Test handlercommand: whoamiwhen:- test_var == Falsetags:- skip_coverage```## Acknowledged issues* Imported handlers has no tags, so they can't be skipped* There is some magic hacks for skipping Molecule's system playbooks* Tasks from non imported files are not counted
|
ansible-creator
|
ansible-creatorA CLI tool for scaffolding all your Ansible Content.Installation$ pip install ansible-creator
$ ansible-creator --help
usage: ansible-creator [-h] [--version] {init} ...
Tool to scaffold Ansible Content. Get started by looking at the help text.
optional arguments:
-h, --help show this help message and exit
--version Print ansible-creator version and exit.
Commands:
{init} The subcommand to invoke.
init Initialize an Ansible Collection.UsageFull documentation on how to use this, along with its integration with VS Code Ansible Extension can be found inhttps://ansible.readthedocs.io/projects/creator/.Upcoming featuresScaffold Ansible plugins of your choice with thecreateaction.
Switch to thecreatebranch and try it out!Licensingansible-creator is released under the Apache License version 2.See theLICENSEfile for more details.
|
ansible-cry
|
CRYEncrypt and decrypt ansible-vault strings and files; a perfect external tool.CRY me a river of encrypted data. Cry cry Cry.
Usage:
cry [OPTIONS] [SUB_COMMAND [OPTIONS]]
Meta-options:
-h, --help Print this message
--help-all Prints help messages of all sub-commands and quits
-v, --version Show version
Options:
--config VALUE:ExistingFile Filename with the vault password; default: $HOME/.vault.pass
-d, --decrypt Encrypt by default, add -d to decrypt
--show-params Show parameters given to cry and exit.
Sub-commands:
file see 'cry file --help' for info
string see 'cry string --help' for infoUsage:
cry file [OPTIONS] files...
Hidden-switches:
-h, --help Print this message
--help-all Prints help messages of all sub-commands and quits
-v, --version Show version
Options:
-c, --show-context Show the context around the string to decodeUsage:
cry string [OPTIONS] strings...
Hidden-switches:
-h, --help Print this message
--help-all Prints help messages of all sub-commands and quits
-v, --version Show version
Options:
-s, --stdin Use stdin as source
|
ansible-dependencies
|
No description available on PyPI.
|
ansible-deploy
|
No description available on PyPI.
|
ansible-deployer
|
ansible-deployerAlthough ansible is a great tool for IT infrastructure automation, in certain cases it is hard to use
it for the complete deployment; because of that, tools like ansible-tower/AWX try to wrap its
activity, making further assumptions on the way ansible code is stored (versioning repository) and the
way it is executed - certain combinations of inventory/playbook/tag options used to achieve specific
goals. With that in mind, ansible-deployer may be treated as yet another ansible-playbook wrapper, but
one focused on a comprehensive command line interface and easy YAML based configuration.Some main features areThe results of the whole execution are logged and saved together with the ansible code state for
potential review in the future.Only one active ansible-deployer per ansible inventory is allowed; an attempt to execute ansible-deployer
on an already locked infrastructure will be rejected.A working directory is set up separately for every execution of ansible-deployer to store the code
state used. This is done by a site-configurable hook.It's possible to lock/unlock an inventory (defined as --infra --stage pairs) for manual manipulation,
stopping ansible-deployer from being used.Configuration filestasks.yaml- Configuration of tasks (sets of playbooks to be executed)infra.yaml- Configuration of infrastructures and stages of those, mapping to ansible inventoryExamplesansible-deployer run --task updateUsers --infra webServers --stage prod
|
ansible-deployment
|
No description available on PyPI.
|
ansible-development-environment
|
ansible-development-environmentA pip-like install for ansible collections.FeaturesPromotes an "ephemeral" development approachEnsures the current development environment is isolatedInstall all collection python requirementsInstall all collection test requirementsChecks for missing system packagesSymlinks the current collection into the current python interpreter's site-packagesInstall all collection collection dependencies into the current python interpreter's site-packagesBy placing collections into the python site-packages directory they are discoverable by ansible as well as python and pytest.UsageSetting up a development environment$ pip install ansible-development-environment --user
$ git clone <collection_repo>
$ cd <collection_repo>
$ ansible-development-environment install -e .\[test] --venv venv
INFO: Found collection name: network.interfaces from /home/bthornto/github/network.interfaces/galaxy.yml.
INFO: Creating virtual environment: /home/bthornto/github/network.interfaces/venv
INFO: Virtual environment: /home/bthornto/github/network.interfaces/venv
INFO: Using specified interpreter: /home/bthornto/github/network.interfaces/venv/bin/python
INFO: Requirements file /home/bthornto/github/network.interfaces/requirements.txt is empty, skipping
INFO: Installing python requirements from /home/bthornto/github/network.interfaces/test-requirements.txt
INFO: Installing ansible-core.
INFO: Initializing build directory: /home/bthornto/github/network.interfaces/build
INFO: Copying collection to build directory using git ls-files.
INFO: Running ansible-galaxy to build collection.
INFO: Running ansible-galaxy to install collection and it's dependencies.
INFO: Removing installed /home/bthornto/github/network.interfaces/venv/lib64/python3.11/site-packages/ansible_collections/network/interfaces
INFO: Symlinking /home/bthornto/github/network.interfaces/venv/lib64/python3.11/site-packages/ansible_collections/network/interfaces to /home/bthornto/github/network.interfaces
WARNING: A virtual environment was specified but has not been activated.
WARNING: Please activate the virtual environment:
source venv/bin/activateTearing down the development environment$ ansible-development-environment uninstall ansible.scm
INFO Found collection name: ansible.scm from /home/bthornto/github/ansible.scm/galaxy.yml.
INFO Requirements file /home/bthornto/github/ansible.scm/requirements.txt is empty, skipping
INFO Uninstalling python requirements from /home/bthornto/github/ansible.scm/test-requirements.txt
INFO Removed ansible.utils: /home/bthornto/github/ansible.scm/venv/lib64/python3.11/site-packages/ansible_collections/ansible/utils
INFO Removed ansible.utils*.info: /home/bthornto/github/ansible.scm/venv/lib64/python3.11/site-packages/ansible_collections/ansible.utils-2.10.3.info
INFO Removed ansible.scm: /home/bthornto/github/ansible.scm/venv/lib64/python3.11/site-packages/ansible_collections/ansible/scm
INFO Removed collection namespace root: /home/bthornto/github/ansible.scm/venv/lib64/python3.11/site-packages/ansible_collections/ansible
INFO Removed collection root: /home/bthornto/github/ansible.scm/venv/lib64/python3.11/site-packages/ansible_collectionsHelpansible-development-environment --helpansible-development-environment install --helpansible-development-environment uninstall --help
|
ansible-dev-environment
|
ansible-dev-environmentA pip-like install for ansible collections.FeaturesPromotes an "ephemeral" development approachEnsures the current development environment is isolatedInstall all collection python requirementsInstall all collection test requirementsChecks for missing system packagesSymlinks the current collection into the current python interpreter's site-packagesInstall all collection collection dependencies into the current python interpreter's site-packagesBy placing collections into the python site-packages directory they are discoverable by ansible as well as python and pytest.UsageSetting up a development environment$ pip install ansible-dev-environment --user
$ git clone <collection_repo>
$ cd <collection_repo>
$ ade install -e .\[test] --venv venv
INFO: Found collection name: network.interfaces from /home/bthornto/github/network.interfaces/galaxy.yml.
INFO: Creating virtual environment: /home/bthornto/github/network.interfaces/venv
INFO: Virtual environment: /home/bthornto/github/network.interfaces/venv
INFO: Using specified interpreter: /home/bthornto/github/network.interfaces/venv/bin/python
INFO: Requirements file /home/bthornto/github/network.interfaces/requirements.txt is empty, skipping
INFO: Installing python requirements from /home/bthornto/github/network.interfaces/test-requirements.txt
INFO: Installing ansible-core.
INFO: Initializing build directory: /home/bthornto/github/network.interfaces/build
INFO: Copying collection to build directory using git ls-files.
INFO: Running ansible-galaxy to build collection.
INFO: Running ansible-galaxy to install collection and it's dependencies.
INFO: Removing installed /home/bthornto/github/network.interfaces/venv/lib64/python3.11/site-packages/ansible_collections/network/interfaces
INFO: Symlinking /home/bthornto/github/network.interfaces/venv/lib64/python3.11/site-packages/ansible_collections/network/interfaces to /home/bthornto/github/network.interfaces
WARNING: A virtual environment was specified but has not been activated.
WARNING: Please activate the virtual environment:
source venv/bin/activateTearing down the development environment$ ade uninstall ansible.scm
INFO Found collection name: ansible.scm from /home/bthornto/github/ansible.scm/galaxy.yml.
INFO Requirements file /home/bthornto/github/ansible.scm/requirements.txt is empty, skipping
INFO Uninstalling python requirements from /home/bthornto/github/ansible.scm/test-requirements.txt
INFO Removed ansible.utils: /home/bthornto/github/ansible.scm/venv/lib64/python3.11/site-packages/ansible_collections/ansible/utils
INFO Removed ansible.utils*.info: /home/bthornto/github/ansible.scm/venv/lib64/python3.11/site-packages/ansible_collections/ansible.utils-2.10.3.info
INFO Removed ansible.scm: /home/bthornto/github/ansible.scm/venv/lib64/python3.11/site-packages/ansible_collections/ansible/scm
INFO Removed collection namespace root: /home/bthornto/github/ansible.scm/venv/lib64/python3.11/site-packages/ansible_collections/ansible
INFO Removed collection root: /home/bthornto/github/ansible.scm/venv/lib64/python3.11/site-packages/ansible_collectionsHelpade --helpade install --helpade uninstall --help
|
ansible-dev-tools
|
Ansible Development Tools (ADT)Theansible-dev-toolspython package provides an easy way to install and discover the best tools available to create and test ansible content.The curated list of tools installed as part of the Ansible automation developer tools package includes:ansible-core: Ansible is a radically simple IT automation platform that makes your applications and systems easier to deploy and maintain. Automate everything from code deployment to network configuration to cloud management, in a language that approaches plain English, using SSH, with no agents to install on remote systems.ansible-builder: Ansible Builder is a tool that automates the process of building execution environments using the schemas and tooling defined in various Ansible Collections and by the user.ansible-creator: The fastest way to generate all your ansible content!ansible-lint: Checks playbooks for practices and behavior that could potentially be improved.ansible-navigatorA text-based user interface (TUI) for Ansible.ansible-sign: Utility for signing and verifying Ansible project directory contents.molecule: Molecule aids in the development and testing of Ansible content: collections, playbooks and rolespytest-ansible: A pytest plugin that enables the use of ansible in tests, enables the use of pytest as a collection unit test runner, and exposes molecule scenarios using a pytest fixture.tox-ansible: The tox-ansible plugin dynamically creates a full matrix of python interpreter and ansible-core version environments for running integration, sanity, and unit for an ansible collection both locally and in a Github action. tox virtual environments are leveraged for collection building, collection installation, dependency installation, and testing.ansible-dev-environment: A pip-like install for Ansible collections.Installationpython -m pip install ansible-dev-toolsUsageIn addition to installing each of the above tools,ansible-dev-toolsprovides an easy way to show the versions of the content creation tools that make up the current development environment.$ adt --version
ansible-builder 3.0.0
ansible-core 2.16.3
ansible-creator 24.2.0
ansible-dev-environment 24.1.0
ansible-dev-tools 0.2.0a0
ansible-lint 24.2.0
ansible-navigator 24.2.0
ansible-sign 0.1.1
molecule 24.2.0
pytest-ansible 24.1.2
tox-ansible 24.2.0DocumentationFor more information, please visit ourdocumentationpage.
|
ansible-discover
|
ansible-discoveransible-discoveris a command line tool to list dependencies and
dependants ofAnsibleroles and playbooks, respectively.One of its prime uses is in a CI tool like Jenkins. Once a change on,
say a role, is committed, useansible-discoverto gather the dependant
roles and playbooks. From this list, the respective CI jobs for playbook
and role validations may then be triggered.Installationpip install ansible-discoverUsageOne use case (as outlined above) is to determine all roles (directly
or indirectly) depending on a given set of roles:ansible-discover roles predecessors PATHSwherePATHSis a space-delimited list of paths to roles (e.g.,roles/my_sample_role).In addition to predecessors (i.e., dependants) for roles, you can also
discoversuccessors (i.e., dependencies) of roles:ansible-discoverroles successors;predecessors for playbooks:ansible-discoverplaybooks predecessors; andsuccessors of playbooks:ansible-discoverplaybooks successors.Related toolsansigenomeansible-roles-graphansible-reviewLicenseDistributed under the XYZ license. SeeLICENSE.txtfor more
information.ContributingFork it!Create your feature branch:git checkout-bmy-new-featureCommit your changes:git commit-am'Add some feature'Push to the branch:git push originmy-new-featureSubmit a pull request :)
|
ansible-doc-extractor
|
This package contains code for Ansible collection documentation extractor. Its
main audience is Ansible collection maintainers who would like to publish
API docs in the HTML form without having to manually copy the data already
present in the module’s metadata.QuickstartDocumentation extractor is published onPyPIand we can install it usingpip:$ pip install ansible-doc-extractor # If we already have ansible installed
$ pip install ansible-doc-extractor[ansible] # To also install ansible
$ pip install ansible-doc-extractor[base] # To also install ansible-base
$ pip install ansible-doc-extractor[core] # To also install ansible-core

If the previous command did not fail, we are ready to start extracting the documentation:

$ ansible-doc-extractor \
    /tmp/output-folder \
    ~/.ansible/collections/ansible_collections/my/col/plugins/modules/*.py

This will extract the documentation from modules in the my.col collection and place the resulting rst files into /tmp/output-folder.

Note: Always extract documentation from an installed collection. The documentation fragment loader fails to combine the various parts of the documentation otherwise.

RST and Markdown support

By default ansible-doc-extractor will output files in .rst format using the built-in Jinja2 template for rst. Pass the --markdown flag to output files in markdown.

Custom template

ansible-doc-extractor supports a custom Jinja2 template file via --template. The following variables are sent to the template:

| Variable name     | Type       | Description                                                                          | Module’s documentation key                      |
|-------------------|------------|--------------------------------------------------------------------------------------|-------------------------------------------------|
| short_description | str        | Short description of a module.                                                        | short_description                               |
| description       | str / list | Longer description of a module; the type depends on the module’s description type.    | description                                     |
| requirements      | list       | Requirements needed on the host that executes this module.                            | requirements                                    |
| options           | dict       | All module options, often called parameters or arguments.                             | options                                         |
| notes             | list       | Module’s additional notes.                                                            | notes                                           |
| seealso           | list       | Details of any important information that doesn’t fit in one of the above sections.   | seealso                                         |
| deprecated        | str        | Marks modules that will be removed in future releases.                                | deprecated                                      |
| author            | str / list | Author of the module; the type can vary depending on how many authors the module has. | author                                          |
| metadata          | dict       | This section provides information about the module.                                   | Refers to ANSIBLE_METADATA block in the module. |
| examples          | str        | Code examples.                                                                        | Refers to EXAMPLES block in the module.         |
| returndocs        | dict       | This section documents the information the module returns.                            | Refers to RETURN block in the module.           |

The output files will use the same file extension as the custom template file.

You can always refer to the default Jinja2 template for rst and the default Jinja2 template for markdown.

Development setup

Getting the development environment up and running is relatively simple:

$ python3 -m venv venv
$ . venv/bin/activate
(venv) $ pip install -e .

To test the extractor, we can run:

$ ansible-doc-extractor
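To make the custom-template idea above concrete, here is an illustrative snippet that renders a tiny Jinja2 template with a few of the variable names from the table; the sample module data is invented, and in real use you would pass the template file to ansible-doc-extractor via --template rather than render it yourself:

```
# Illustration only: a hand-rolled render of a minimal custom template using
# the variable names listed above; the data values are made up.
from jinja2 import Template

TEMPLATE = """\
{{ short_description }}
{{ description }}

Options:
{% for name in options %}- {{ name }}: {{ options[name]['description'] }}
{% endfor %}"""

rendered = Template(TEMPLATE).render(
    short_description="Manage widgets",
    description="Create, update and delete widgets on a host.",
    options={
        "name": {"description": "Name of the widget."},
        "state": {"description": "Desired state of the widget."},
    },
)
print(rendered)
```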
|
ansible-docgen
|
No description available on PyPI.
|
ansible-docgenerator
|
# Ansible-DocGenA python utility for generating Ansible README.md files for roles[](https://travis-ci.org/toast38coza/Ansible-DocGen)* Ansible-DocGen is tested in `python3`## Installation### PIPpip install ansible-docgeneration**Note if you installed it from pip, you will need to use it like so from the command line:**python -m docgen### From Source```git clone [email protected]:toast38coza/Ansible-DocGen.gitcd Ansible-DocGenvirtualenv-3.4 env -p python3source env/bin/activatepip install -r requirements.txt```## Usage```Usage: docgen.py [OPTIONS]# Basic usage:python docgen.py#Specify a path to a role:python docgen.py --path=/path/to/ansible/role#Specify a path to a role, and output the result to a file:python docgen.py --path=/path/to/ansible/role > /path/to/ansible/role/README.md```**Options:*** `--path` Path to your role (defaults to current directory)* `--help` Show this message and exit.## Test.. coming soon## Example output:A role for settings up a django application* **Required ansible version:** >= 1.2## Required variables:| Parameter | Example | Comment ||--------------|----------|----------||`django_app_name`|`polls`|The name of your application. Typcially this will be the name of the folder containing your settings.py file|## Optional variables:| Parameter | Default ||--------------|----------||`django_settings_directory_path`|`{{django_app_path}}/{{django_project_name}}`||`django_gunicorn_workers`|`2`||`django_venv_path`|`/srv/venvs/{{django_app_name}}`||`django_media_path`|`/srv/{{django_app_name}}/media`||`django_http_port`|`8000`||`manage_commands`|`['collectstatic', 'syncdb', 'migrate']`||`django_settings_vars`|`[]`||`django_static_path`|`/srv/{{django_app_name}}/static`||`django_log_path`|`/var/log/{{django_app_name}}`||`django_requirements_location`|`{{django_app_path}}/requirements.txt`||`django_env_vars`|`[]`||`django_version`|`1.8`||`django_app_path`|`/var/www/{{django_app_name}}`||`django_github_version`|`master`|### Basic usage```- hosts: allvars:- django_app_name: pollsroles:- { role: django, tags: django }```### TODO* TODO: use pex to package virtualenv* Unit testing* Travis CI* Better error handling
|
ansible-docker-ci
|
ansible-docker-ci

Docker image-based dynamic hosts for Ansible.

Installation

# Install the python package
pip install ansible-docker-ci

# Add the plugin file to your connection plugins directory
echo "from ansible_docker_ci.image.connection.plugin import *" > ./connection_plugins/docker_image.py
|
ansible-doctor
|
ansible-doctor

Annotation-based documentation for your Ansible roles.

This project is based on the idea (and in some parts on the code) of ansible-autodoc by Andres Bott, so credit goes to him for his work.

ansible-doctor is a simple annotation-like documentation generator based on Jinja2 templates. While ansible-doctor comes with a default template called readme, it is also possible to write custom templates to customize the output or render the data to other formats like HTML or XML as well.

ansible-doctor is designed to work within a CI pipeline to complete the existing testing and deployment workflow. Releases are available as Python packages on GitHub or PyPI and as a Docker image on Docker Hub. The full documentation is available at https://ansible-doctor.geekdocs.de.

Contributors

Special thanks to all contributors. If you would like to contribute, please see the instructions.

License

This project is licensed under the GPL-3.0 License - see the LICENSE file for details.
|
ansible-dotdiff
|
Nested structure diff library with dot-path notation for Ansible.

Description

This package is built to plug into Ansible's module_utils, which doesn't require it to be installed on managed remote hosts, only the controller. It will be picked up by Ansiballz, which zips a module's dependencies and ships it over SSH. The library is kept small to keep the footprint down.

Usage

By design, the algorithm will ignore any keys that are omitted on the right (the target), to allow an API endpoint to choose plausible defaults. For example, an API client implementing this library will diff the desired state with a JSON REST resource to predict whether or not a REST call needs to occur for the user's changes to be applied.

from ansible.module_utils.dotdiff import dotdiff

orig = { 'one': 'one', 'two': 'two' }
dest = { 'one': 'another', 'three': 'three' }

dotdiff(orig, dest)

dotdiff() yields a list of DiffEntry objects:

[one: "one" => "another", three: "<undefined>" => "three"]

Keys that would be added to the structure in this transaction have their values marked as "<undefined>".

Nested lists and dictionaries are supported at an arbitrary level and will be indicated using dot-separated paths. Changing a list's member count will yield a DiffEntry indicating a change in cardinality with a pound (#) sign:

mylist.#: "3" => "4"

This visualization is inspired by Terraform.

License

MIT.
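To make the dot-path notation above concrete, here is a small standalone sketch that is independent of ansible-dotdiff (it is not the library's implementation); it only mimics the documented behaviour of ignoring keys omitted on the right and reporting list cardinality under a # path:

```
# Standalone illustration of the dot-path diff idea, not ansible-dotdiff itself.
def flatten(obj, prefix=""):
    """Flatten nested dicts/lists into {dot.path: value} pairs."""
    items = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            items.update(flatten(value, f"{prefix}.{key}" if prefix else key))
    elif isinstance(obj, list):
        items[f"{prefix}.#"] = len(obj)  # cardinality entry for the list
        for i, value in enumerate(obj):
            items.update(flatten(value, f"{prefix}.{i}"))
    else:
        items[prefix] = obj
    return items


def dot_diff(orig, dest):
    """Yield (path, old, new) for paths present in dest; keys omitted on the right are ignored."""
    left, right = flatten(orig), flatten(dest)
    for path, new in right.items():
        old = left.get(path, "<undefined>")
        if old != new:
            yield path, old, new


orig = {"one": "one", "two": "two", "mylist": [1, 2, 3]}
dest = {"one": "another", "three": "three", "mylist": [1, 2, 3, 4]}
for path, old, new in dot_diff(orig, dest):
    print(f'{path}: "{old}" => "{new}"')
```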
|
ansible-droplet
|
# Ansible Droplet[](https://travis-ci.org/FlorianKempenich/ansible-droplet) [](https://pypi.org/project/ansible-droplet/)`ansible-droplet` is a cli tool to easily create _ready-to-use_ droplets on Digital Ocean.* Create **ready-to-use** Ubuntu Droplet on Digital Ocean* One **simple** command: `ansible-droplet create my_droplet`* Access it directly via its **name**:* SSH: `ssh my_droplet`* Ansible: `- hosts: my_droplet`* And more:* New sudo user automatically created* Swap added (configurable)* DigitalOcean advanced metrics enabled* `glances` server running to monitor the Droplet from outside## Installation**Important Note:** For now only `python 2` is supported```pip install ansible-droplet```> Use a `virtualenv` or install with `pip install ansible-droplet --user`---## Usage### One time setupBefore using the tool, a simple _one-time-setup_ must be done.1. Make sure you have a **`ssh` public key**, or [generate one](https://help.github.com/articles/generating-a-new-ssh-key-and-adding-it-to-the-ssh-agent/#generating-a-new-ssh-key)1. Make sure you have a **Digital Ocean API token**, or [generate one](https://www.digitalocean.com/community/tutorials/how-to-use-the-digitalocean-api-v2)1. **Store in a file** the Digital Ocean API token in clear<sup>[1](#f1)</sup>1. Note down:* _Path_ to your **ssh public key*** _Path_ to the file containing the **Digital Ocean API token**1. Run `ansible-droplet config` see below for more detail on the config parameters### Create```ansible-droplet create my_droplet```##### Creation process* **Create a new Droplet** on your Ditigal Ocean Account* **Set it up** with: Swap, new sudo user, glances server* **Create a SSH entry** in your `~/.ssh/config` to be able to ssh directly with its name: `ssh my_droplet`* **Create a ansible inventory entry** in `~/.ansible-droplet-inventory`### Destroy```ansible-droplet destroy my_droplet```##### Destruction process* **Destroy the Droplet** from your Ditigal Ocean Account* **Remove the SSH entry** from your `~/.ssh/config` to be able to ssh directly with its name: `ssh my_droplet`* **Remove the ansible inventory entry** from `~/.ansible-droplet-inventory`## Advanced Usage### Config parametersWhen running `ansible-droplet config` a couple of parameters must be provided:* **Path to SSH key**:No brainer, the path to your SSH public key._Default: `~/.ssh/id_rsa.pub`_* **Name of SSH key on Digital Ocean**:To prevent uploading the public keys each time, Digital Ocean offers to store them under a name. 
It can be anything._Default: `Main SSH Key`_* **Path to Ditigal Ocean token**:The path to the file containing your Digital Ocean token in plain text._No defaults_* **User on Droplet - Username:**Username for the sudo user being created on the Droplet._No defaults_* **User on Droplet - Default Password:**Default password for the sudo user being created on the Droplet._No defaults__Do not forget to change it after the first login!_### Droplet specsYou can specify the droplet size and specs when creating a new droplet.Simply list it after the droplet name:```ansible-droplet create my_droplet SPEC_NAME```For now, only 3 specs are supported<sup>[2](#f2)</sup>:* **Micro*** size: "512mb"* region: "fra1"* image: "ubuntu-16-04-x64"* swap: "4GiB"* **Mini*** size: "1gb"* region: "fra1"* image: "ubuntu-16-04-x64"* swap: "4GiB"* **Power*** size: "4gb"* region: "fra1"* image: "ubuntu-16-04-x64"* swap: "4GiB"The list of specification are stored on the repository: [Droplet Specs](https://github.com/FlorianKempenich/ansible-droplet/tree/master/ansible_droplet/ansible/droplet_specs)### Ansible Droplet InventoryIt is possible to access the Droplet from any other ansible playbook via its **name**:- hosts: my_dropletFor that purpose the file `~/.ansible-droplet-inventory` is created.`~/.ansible-droplet-inventory` contains ansible inventory entries for each Droplet created with the `ansible-droplet` tool.To use the droplet in a playbook, either:* Point your _inventory_ to the `~/.ansible-droplet-inventory` file* Point your _inventory_ to a directory containing a symlink to the `~/.ansible-droplet-inventory` fileFrom there you can reference the Droplet directly by name.### Multiple configuration - Multiple Digital Ocean accountsYou can use multiple configurations to support:* Multiple **DigitalOcean accounts*** Multiple **SHH Keys*** Multiple **default user/password**The configuration generated by `ansible-droplet config` is kept in the installation directory.To allow multiple configs, simply **install multiple versions of `ansible-droplet` in different _virtualenvs_**<sup>[3](#f3)</sup>---1. <span id="f1"></span>Yes, this is a security concern. Feel free to open a pull request.2. <span id="f2"></span>For now, the addition of new droplet specs is not supported. Again, pull requests are welcome :)3. <span id="f3"></span>This is not optimal... You know what to do ;)
|
ansible-dynamic-inventory
|
# ansible-dynamic-inventoryGenerate ansible dynamic inventory from static inventory.Optionally, Replace the host list of ansible static inventory with ServiceAddress registered in consul service.[](https://badge.fury.io/py/ansible-dynamic-inventory)[](https://travis-ci.org/Yoshiyuki-Nakahara/python-ansible-dynamic-inventory)[](https://landscape.io/github/Yoshiyuki-Nakahara/python-ansible-dynamic-inventory/master)[](https://opensource.org/licenses/MIT)# Assumed Use Case- Dynamic inventory conversion from Static inventory- Merge Dynamic inventory into Static inventorycf. gce.py(dynamic inventory) + group_vars(static inventory)- In the service operated using [Consul](https://www.consul.io/), inventory is dynamically generated without rewriting static inventory when host information changes dynamically, such as automatic failover- Confirm the inventory structure with plantuml# References[Ansible inventory](http://docs.ansible.com/ansible/intro_inventory.html)[Ansible Dynamic Inventory](http://docs.ansible.com/ansible/intro_dynamic_inventory.html)[Consul Catalog Service](https://www.consul.io/docs/agent/http/catalog.html#catalog_service)# Installation$ yum install gcc python-devel openssl-devel python-pip$ pip install --upgrade pip setuptools (optional)$ pip install ansible-dynamic-inventory# Prerequisite of Replace with Consul ServiceIf the group name written in the ansible static inventory and the service name registered in the consul service are the same, the host name is replaced.# Usage# Stand alone execution$ ansible-dynamic-inventory --list# Stand alone execution and specified config file$ ansible-dynamic-inventory --list --config /path/to/config# or Specify config file with environment variable$ ANSIBLE_DYNAMIC_INVENTORY_CONFIG_PATH=/path/to/config ansible-dynamic-inventory --list# As Ansible Dynamic Inventory execution$ ansible-playbook --inventory ansible-dynamic-inventory /path/to/playbook.yml# outut in platuml formatansible-dynamic-inventory --plantuml# Configuration# vi ansible_dynamic_inventory.ini[ansible]# If both static_inventory_path and dynamic_inventory_path are specified,# merge dynamic_inventory into static_inventory# Either static_inventory_path or dynamic_inventory_path must not be empty# path to static inventory file or directorystatic_inventory_path = /path/to/ansible_inventory# path to dynamic inventory file#dynamic_inventory_path = ./gce.pydynamic_inventory_path =[consul]#url = http://localhost:8500/v1url =# Stand alone execution exampleex. 
ansible:static_inventory_path = ${this repository}/sample_inventory$ ansible-dynamic-inventory --list{"all": {"hosts": ["10.10.10.12","10.10.10.13","10.10.10.14","10.10.10.15","10.10.10.11"],"children": ["ungrouped","_mysql_replication_config","mysql_backup_storage"],"vars": {"datacenter": "vagrant","net_cidr": "10.10.10.0/24"}},"_mysql_replication_config": {"hosts": ["10.10.10.12","10.10.10.13","10.10.10.14","10.10.10.15","10.10.10.11"],"children": ["mysql_replication_master","mysql_replication_slave","mysql_replication_backup","mysql_failover"],"vars": {"mysql_replication_user": "root","mysql_master_host": "{{groups.mysql_replication_master[0]}}","mysql_master_group_name": "mysql_replication_master","mysql_version": "5.6.34","mysql_replication_group_name": "single","mysql_slave_group_name": "mysql_replication_slave"}},"_meta": {"hostvars": {"10.10.10.15": {},"10.10.10.14": {},"10.10.10.11": {},"10.10.10.13": {},"10.10.10.12": {"hostvar": "dummy"}}},"mysql_backup_storage": {"hosts": ["10.10.10.15"],"vars": {"mysql_backup_target_host": "{{groups.mysql_replication_backup[0]}}","mysql_backup_storage_cron_file": "../../../../varfiles/vagrant/mysql/backup_storage/mysql_backup"}},"mysql_replication_slave": {"hosts": ["10.10.10.13","10.10.10.14"]},"mysql_replication_backup": {"hosts": ["10.10.10.15"]},"ungrouped": {},"mysql_failover": {"hosts": ["10.10.10.11"],"vars": {"mysql_failover_config_file": "../../../../varfiles/vagrant/mysql/failover/config.yml"}},"mysql_replication_master": {"hosts": ["10.10.10.12"]}}# output in platuml format exampleex. ansible:static_inventory_path = ${this repository}/sample_inventory- Ansible host information is converted in plantuml "object"- Ansible group information is converted in plantuml "package"- Ansible group variable information is converted in plantuml "class"$ /usr/bin/ansible-dynamic-inventory --plantuml@startumlobject 10.10.10.15object 10.10.10.14object 10.10.10.11object 10.10.10.13object 10.10.10.12 {"hostvar": "dummy"}package all {all_hosts - 10.10.10.12all_hosts - 10.10.10.13all_hosts - 10.10.10.14all_hosts - 10.10.10.15all_hosts - 10.10.10.11class all_varsall_children - ungroupedall_children - _mysql_replication_configall_children - mysql_backup_storage}class all_vars {"datacenter": "vagrant""net_cidr": "10.10.10.0/24"}package _mysql_replication_config {_mysql_replication_config_hosts - 10.10.10.12_mysql_replication_config_hosts - 10.10.10.13_mysql_replication_config_hosts - 10.10.10.14_mysql_replication_config_hosts - 10.10.10.15_mysql_replication_config_hosts - 10.10.10.11class _mysql_replication_config_vars_mysql_replication_config_children - mysql_replication_master_mysql_replication_config_children - mysql_replication_slave_mysql_replication_config_children - mysql_replication_backup_mysql_replication_config_children - mysql_failover}class _mysql_replication_config_vars {"mysql_replication_user": "root""mysql_master_host": "{{groups.mysql_replication_master[0]}}""mysql_master_group_name": "mysql_replication_master""mysql_version": "5.6.34""mysql_replication_group_name": "single""mysql_slave_group_name": "mysql_replication_slave"}package mysql_backup_storage {mysql_backup_storage_hosts - 10.10.10.15class mysql_backup_storage_vars}class mysql_backup_storage_vars {"mysql_backup_target_host": "{{groups.mysql_replication_backup[0]}}""mysql_backup_storage_cron_file": "../../../../varfiles/vagrant/mysql/backup_storage/mysql_backup"}package mysql_replication_slave {mysql_replication_slave_hosts - 10.10.10.13mysql_replication_slave_hosts - 
10.10.10.14}package mysql_replication_backup {mysql_replication_backup_hosts - 10.10.10.15}package ungrouped {}package mysql_failover {mysql_failover_hosts - 10.10.10.11class mysql_failover_vars}class mysql_failover_vars {"mysql_failover_config_file": "../../../../varfiles/vagrant/mysql/failover/config.yml"}package mysql_replication_master {mysql_replication_master_hosts - 10.10.10.12}@enduml# LicenseMIT License
|
ansible-dynamic-launcher
|
UNKNOWN
|
ansible-ec2-inventory
|
Ansible EC2 inventory
=====================

This Python module is based on the [original Ansible EC2 inventory script](https://raw.githubusercontent.com/ansible/ansible/devel/contrib/inventory/ec2.py) that is linked in the [Ansible docs](http://docs.ansible.com/ansible/intro_dynamic_inventory.html#example-aws-ec2-external-inventory-script).

The Python module in this repo fixes a few issues by being

- installable via `pip` / PyPi: no need to place code from the Ansible repo in your inventory.
- extendable for your needs: the class `Ec2Inventory` can be used as a base class for customizations.

Installation
------------

    pip install ansible-ec2-inventory

Usage
-----

### As a script

    ansible-ec2-inventory --config ec2.ini

*Note:* if you want to provide a boto profile, prefix the command with `AWS_PROFILE=myprofile`.

### As a Python module

Example:

``` {.python}
from ansible_ec2_inventory import Ec2Inventory
import json
import os  # needed for the path handling below


def main():
    # get path of ec2.ini
    path = os.path.dirname(os.path.realpath(__file__))

    # get inventory
    ec2inventory = Ec2Inventory(configfile=path + '/ec2.ini')
    data = ec2inventory.get_inventory()

    # print json
    print(json.dumps(data, sort_keys=True, indent=2))


if __name__ == '__main__':
    main()
```
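The README notes that `Ec2Inventory` can be used as a base class for customizations. A hedged sketch of what that might look like, using only the constructor and get_inventory() call shown above (the subclass name and the "prod"-group filtering are illustrative assumptions, not documented behaviour):

```
# Hedged sketch: builds only on the documented Ec2Inventory(configfile=...) and
# get_inventory() shown above; the filtering logic is purely illustrative.
import json

from ansible_ec2_inventory import Ec2Inventory


class ProdOnlyEc2Inventory(Ec2Inventory):
    """Example subclass that post-processes the generated inventory."""

    def get_inventory(self):
        inventory = super(ProdOnlyEc2Inventory, self).get_inventory()
        # keep _meta and only groups whose name mentions "prod"
        return {
            group: hosts
            for group, hosts in inventory.items()
            if group == "_meta" or "prod" in group
        }


if __name__ == "__main__":
    inv = ProdOnlyEc2Inventory(configfile="ec2.ini")
    print(json.dumps(inv.get_inventory(), sort_keys=True, indent=2))
```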
|
ansible-events
|
ansible-events

Free software: Apache Software License 2.0
Documentation: https://ansible-events.readthedocs.io

Event-driven automation for Ansible.

The real world is full of events that change the state of our software and systems. Our automation needs to be able to react to those events. ansible-events is a command line tool that lets you recognize the events you care about and react accordingly by running a playbook or other actions.

Let's get started with a simple hello world example to familiarize ourselves with the concepts:

---
- name: Hello Events
hosts: localhost
sources:
- benthomasson.eda.range:
limit: 5
rules:
- name: Say Hello
condition: event.i == 1
action:
run_playbook:
name: benthomasson.eda.hello
...

Events come from an event source and then are checked against rules to determine if an action should be taken. If the condition of a rule matches the event, it will run the action for that rule.

In this example the event source is the Python range function. It produces events that count from i=0 to i=<limit>. When i is equal to 1, the condition for the Say Hello rule matches and it runs a playbook.

Normally events would come from monitoring and alerting systems or other software. The following
is a more complete example that accepts alerts from Alertmanager:---
- name: Automatic Remediation of a webserver
hosts: all
sources:
- name: listen for alerts
benthomasson.eda.alertmanager:
host: 0.0.0.0
port: 8000
rules:
- name: restart web server
condition: event.alert.labels.job == "fastapi" and event.alert.status == "firing"
action:
run_playbook:
name: benthomasson.eda.start_app
...

This example sets up a webhook to receive events from Alertmanager and then matches events where the fastapi job alert has a status of firing. This runs a playbook that will remediate the issue.

Features

- Conditionally launch playbooks based on rules that match events in event streams.

Examples

Rules are organized into rulesets using a syntax that is similar to ansible-playbooks:

---
- name: Hello Events
hosts: localhost
sources:
- benthomasson.eda.range:
limit: 5
rules:
- name: Say Hello
condition: event.i == 1
action:
run_playbook:
name: benthomasson.eda.hello
...

Each ruleset defines: a set of hosts to pass to the playbook, a set of event sources,
and a set of rules. The set of hosts is the normal hosts pattern from ansible playbooks.
The event sources are a new type of plugin that subscribe to events from event streams.
The rules have conditions that match values in the events and actions that can run playbooks,
assert facts, retract facts, and print information to the console.

Let's look closer at the event source:

- benthomasson.eda.range:
    limit: 5

This section of YAML defines that an event source plugin from the benthomasson.eda collection should be loaded and given the argument limit=5. This source will generate a range of numbers from zero to 4 and then exit.

The rules YAML structure looks like the following:

- name: Say Hello
condition: event.i == 1
action:
run_playbook:
name: benthomasson.eda.hello

This block of YAML defines a rule with the name "Say Hello", a condition that matches when an event has a value "i" that is equal to 1, and an action that runs a playbook inside the collection benthomasson.eda.

How to install

Via PyPI: pip install ansible-events

Via Docker: docker build -t ansible-events .

Usage

ansible-events --help

Credits

This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.

History

0.1.0 (2022-02-16)

First release on PyPI.
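As a mental model of the event → condition → action flow described above, here is a plain-Python sketch that mirrors the "Say Hello" rule; it is not how ansible-events is implemented, only an illustration of the matching logic:

```
# Standalone illustration of event -> condition -> action, modelled on the
# range source (i = 0..4) and the "event.i == 1" rule above. Not the package's code.
def range_source(limit=5):
    for i in range(limit):
        yield {"i": i}


rules = [
    {
        "name": "Say Hello",
        "condition": lambda event: event.get("i") == 1,
        "action": lambda event: print("run_playbook benthomasson.eda.hello", event),
    }
]

for event in range_source(limit=5):
    for rule in rules:
        if rule["condition"](event):
            rule["action"](event)
```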
|
ansible-exec
|
No description available on PyPI.
|
ansible-extras
|
No description available on PyPI.
|
ansible-filter
|
Ansible Filter

Contains an Ansible-related filter set for collection/object operations. Aims to extend the official Ansible Filters. Available filters are listed below.

Install

PyPI

pip install -U ansible-filter

Source

git clone [email protected]:nl2go/ansible-filter.git
cd ansible-filter/
pip install .

Filters

Change Set

Computes the change set between a list of objects using a key attribute, non-recursively.

The filter arguments local and origin are non-associative. Objects present at the origin but missing within local are considered not managed.

The result is a change set with a map of lists:

- create - objects to create at origin
- update - objects to update at origin
- delete - objects to delete at origin
- noop - objects with no operation required

Useful for interacting with any kind of stateful API.

from ansible_filter import change_set
local = [{ 'id': 1, 'foo': 'bar' }, { 'id': 2, 'foo': 'foo' }, { 'id': 3, 'foz':'baz' }, { 'id': 4, 'state': 'absent' }]
origin = [{ 'id': 2, 'foo': 'bar' }, { 'id': 3, 'foz':'baz' }, { 'id': 4, 'x': 'y' }, { 'id': 5, 'foo': 'bar' }]
result = change_set.change_set(local, origin, 'id')
print(result)
{
'create': [{ 'id': 1, 'foo': 'bar' }],
'update': [{ 'id': 2, 'foo': 'foo' }],
'delete': [{ 'id': 4, 'x': 'y' }],
'noop': [{ 'id': 3, 'foz': 'baz' }]
}

Form URL Encode

Encodes arbitrary objects to form URL format.

from ansible_filter import form_urlencode
obj = { 'foo': 'bar', 'foz': ['baz'] }
result = form_urlencode.form_urlencode(obj)
print(result)
foo=bar&foz[0]=baz&

Pick

Filters a list of objects retaining attributes only matching the names passed as argument, non-recursively.

from ansible_filter import pick
elements = [{ 'foo': 'bar', 'foz': 'baz' }]
result = pick.pick(elements, ['foo'])
print(result)
[{ 'foo': 'bar' }]

Omit

Filters a list of objects, omitting attributes matching the names passed as argument, non-recursively.

from ansible_filter import omit
elements = [{ 'foo': 'bar', 'foz': 'baz' }]
result = omit.omit(elements, ['foo'])
print(result)
[{ 'foz': 'baz' }]

Group By

Groups elements by key attribute.

from ansible_filter import group_by
left = [{ 'id': '1', 'foo': 'a' }, { 'id': '2', 'foz': 'x' }]
right = [{ 'id': '1', 'foo': 'b' }, { 'id': '2', 'foz': 'y' }]
result = group_by.group_by(left, right, 'id')
print(result)
[
{ 'id': '1', 'group': [{ 'id': '1', 'foo': 'a' }, { 'id': '1', 'foo': 'b' }] },
{ 'id': '2', 'group': [{ 'id': '2', 'foz': 'x' }, { 'id': '2', 'foz': 'y' }] }
]

List 2 Dict

Converts a list to dict by key attribute.

from ansible_filter import list_to_dict
elements = [{ 'id': '1', 'foo': 'bar' }, { 'id': '2', 'foz': 'baz' }]
result = list_to_dict.list_to_dict(elements, 'id')
print(result)
{'1': {'foo': 'bar', 'id': '1'}, '2': {'id': '2', 'foz': 'baz'}}

Point to Point Connections

Resolves point to point connections between the local and remote hosts.

from ansible_filter import network
remote_hostnames = ['two', 'three']
hostname = 'one'
hostvars = {
'one': {
'ansible_default_ipv4': {
'interface': 'eth0'
},
'ansible_eth0': {
'ipv4': {
'address': '127.0.0.1',
}
}
},
'two': {
'ansible_default_ipv4': {
'interface': 'eth0'
},
'ansible_eth0': {
'ipv4': {
'address': '127.0.0.2',
}
}
},
'three': {
'ansible_default_ipv4': {
'interface': 'eth0'
},
'ansible_eth0': {
'ipv4': {
'address': '127.0.0.3',
}
}
}
}
result = network.get_point_to_point_connections(remote_hostnames, hostname, hostvars)
[
{
'remote': {
'interface': 'eth0',
'hostname': 'two',
'address': '127.0.0.2'
},
'local': {
'interface': 'eth0',
'hostname': 'one',
'address': '127.0.0.1'
}
},
{
'remote': {
'interface': 'eth0',
'hostname': 'three',
'address': '127.0.0.3'
},
'local': {
'interface': 'eth0',
'hostname': 'one',
'address': '127.0.0.1'
}
}
]

Links

- Website: https://newsletter2go.com/
- License: MIT
- Releases: https://pypi.org/project/ansible-filter/
- Code: https://github.com/nl2go/ansible-filter
- Issue tracker: https://github.com/nl2go/ansible-filter/issues
|
ansible-filters-ldif
|
Ansible filter to read or write LDIF.Install this Ansible Filter:viapip:pip install ansible-filters-ldifviaansible-galaxy:ansible-galaxy install 'git+https://github.com/atterdag/ansible-filters-ldif.git'Ansible filters always runs on localhost.ExamplesConvert dictionary to LDIF----name:Create dictionary with entriesset_fact:dictionary:--dc=example,dc=com-dc:-exampledescription:-This is a line longer than 79 characters, so LDIF breaks it up over multiple lineso:-example.comobjectClass:-dcObject-organization--ou=people,dc=example,dc=com-objectClass:-organizationalUnitou:-people--cn=Jane Doe,ou=people,dc=example,dc=com-cn:-Jane Doemail:[email protected]:-inetOrgPersonsn:-Doe--cn=John Doe,ou=people,dc=example,dc=com-cn:-John Doemail:[email protected]:-inetOrgPersonsn:-Doe--ou=groups,dc=example,dc=com-objectClass:-organizationalUnitou:-groups--cn=users,ou=groups,dc=example,dc=com-cn:-usersmember:-cn=Jane Doe,ou=people,dc=example,dc=com-cn=John Doe,ou=people,dc=example,dc=comobjectClass:-groupOfNames-name:"ConvertdictionarytoLDIFwhilewritingitto/tmp/test.ldifusing'to_ldif'filter"copy:content:"{{dictionary|to_ldif}}"dest:"/tmp/test.ldif"Convert LDIF to JSON----name:"Createmulti-linestringvariablewithLDIFdata"set_fact:ldif:|dn: dc=example,dc=comdc: exampledescription: This is one line which is longer than79 characters, so LDIF breaks it up over multiple linesobjectClass: dcObjectobjectClass: organizationo: example.comdn: ou=people,dc=example,dc=comobjectClass: organizationalUnitou: peopledn: cn=Jane Doe,ou=people,dc=example,dc=comobjectClass: inetOrgPersoncn: Jane Doesn: Doemail: [email protected]: cn=John Doe,ou=people,dc=example,dc=comobjectClass: inetOrgPersoncn: John Doesn: Doemail: [email protected]: ou=groups,dc=example,dc=comobjectClass: organizationalUnitou: groupsdn: cn=users,ou=groups,dc=example,dc=comobjectClass: groupOfNamescn: usersmember: cn=Jane Doe,ou=people,dc=example,dc=commember: cn=John Doe,ou=people,dc=example,dc=com-name:"ConvertstringtoJSONwhilewritingitto/tmp/test.jsonusing'from_ldif'filter"copy:content:"{{(ldif|from_ldif)|to_nice_json}}"dest:"/tmp/test.json"Build dependenciesInstall the following OS development packages first.sudo apt-get install libssl-dev libldap2-dev libsasl2-dev python2-dev python3-dev
mkvirtualenv --python=/usr/bin/python3 python3-development
pip install --requirement requirements.txt
gem install travis fry

License

GPLv3.
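A standalone sketch of the dict-to-LDIF idea behind the to_ldif filter above; it is not the plugin's code, and the (dn, attributes) pair layout is an assumption made for illustration (real LDIF also folds long lines and base64-encodes unsafe values, which this skips):

```
# Illustration only: a minimal dict -> LDIF serializer under the assumptions above.
def to_ldif(entries):
    lines = []
    for dn, attrs in entries:
        lines.append(f"dn: {dn}")
        for attr, values in attrs.items():
            for value in values:
                lines.append(f"{attr}: {value}")
        lines.append("")  # blank line separates entries
    return "\n".join(lines)


entries = [
    ("dc=example,dc=com", {"objectClass": ["dcObject", "organization"],
                           "dc": ["example"], "o": ["example.com"]}),
    ("ou=people,dc=example,dc=com", {"objectClass": ["organizationalUnit"],
                                     "ou": ["people"]}),
]
print(to_ldif(entries))
```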
|
ansible-flow
|
Ansible-flow is a simple utility to help make ansible easier to use with a
specific set of production use-cases.

If you need to do the following, then ansible-flow might be for you:

- Use the same playbooks against multiple environments
- Use bastions in your deployment
- Run a collection of playbooks in sequence.

Links:

- GitHub: https://github.com/jmvrbanac/ansible-flow
- ReadTheDocs: http://ansible-flow.readthedocs.org/en/latest/
|