package (string, lengths 1-122)
package-description (string, lengths 0-1.3M)
ajpy
AJPy aims to craft AJP requests in order to communicate with AJP connectors.
ajr
No description available on PyPI.
ajrnester
UNKNOWN
ajson
# AJson (Annotations Json Serializer)

AJson is a serializer based on annotations that gives you a lot of flexibility and configuration for your serialization process.

[![Build Status](https://travis-ci.org/JorgeGarciaIrazabal/ajson.svg?branch=master)](https://travis-ci.org/JorgeGarciaIrazabal/ajson) [![codecov](https://codecov.io/gh/JorgeGarciaIrazabal/ajson/branch/master/graph/badge.svg)](https://codecov.io/gh/JorgeGarciaIrazabal/ajson)

### Install (Python 3.6 or higher)

`pip install ajson`

#### Motivation

There are amazing serialization libraries like [jsonpickle](https://jsonpickle.github.io/), especially when the serialized object is meant to be used in Python too. But there are no libraries that let you filter which fields to serialize or rename attributes. These features are very useful, mainly for HTTP APIs. This library gives you those features in a simple and intuitive way.

#### Serialize examples

##### Simple serialization with "groups"

If you want to filter out sensitive data in some scenarios, you can define `groups` per attribute to control what is serialized and what is not.

```python
from ajson import AJson, ASerializer

@AJson()
class Restaurant:
    location: str  # @aj(groups=["public","admin"])
    tables: int    # @aj(groups=["public","admin"])
    owner: str     # @aj(groups=["admin"])

    def __init__(self, location, tables, owner):
        self.location = location
        self.tables = tables
        self.owner = owner

serializer = ASerializer()
restaurant = Restaurant("Manhattan", 30, "John Smith")
print(serializer.serialize(restaurant, groups=["public"]))
# {"location": "Manhattan", "tables": 30}
print(serializer.serialize(restaurant, groups=["admin"]))
# {"location": "Manhattan", "tables": 30, "owner": "John Smith"}
```

##### Rename attributes with "name"

```python
from ajson import AJson
from ajson.aserializer import ASerializer

@AJson()
class Customer:
    name: str           # @aj(name=firstName)
    primary_email: str  # @aj(name=email)
    last_name: str      # @aj(name=lastName)

    def __init__(self):
        self.name = "John"
        self.last_name = "Smith"
        self.primary_email = "[email protected]"

serializer = ASerializer()
customer = Customer()
print(serializer.serialize(customer))
# {"firstName": "John", "lastName": "Smith", "email": "[email protected]"}
```

##### Nested objects with groups and names

```python
from typing import List
from ajson import AJson, ASerializer

@AJson()
class Customer:
    name: str  # @aj(name=firstName, groups=["public"])
    primary_email: str
    '''
    You can also add the annotation in a multi-line docstring
    @aj(
        name=email,
        groups=["public"]
    )
    '''
    def __init__(self, name, primary_email):
        self.name = name
        self.primary_email = primary_email

@AJson()
class Restaurant:
    location: str  # @aj(groups=["public","admin"])
    owner: str     # @aj(groups=["admin"])
    customer_list: List[Customer]  # @aj(groups=["with_customers"] name=customers)

    def __init__(self):
        self.location = None
        self.owner = "John Smith"
        self.customer_list = [
            Customer("Dani", "[email protected]"),
            Customer("Mike", "[email protected]"),
        ]

restaurant = Restaurant()
print(ASerializer().serialize(restaurant, groups=["public"]))
# '{"location": null}'

# if you want the dictionary instead of a string, call `to_dict` instead of `serialize`
print(ASerializer().to_dict(restaurant, groups=["public", "with_customers"]))
'''
{
    "location": None,
    "customers": [
        {"firstName": "Dani", "email": "[email protected]"},
        {"firstName": "Mike", "email": "[email protected]"}
    ]
}
'''
```

#### Unserialize examples

##### Unserialization with custom names

```python
from ajson import AJson, ASerializer

@AJson()
class Customer:
    name: str           # @aj(name=firstName)
    primary_email: str  # @aj(name=email)
    last_name: str      # @aj(name=lastName)

serializer = ASerializer()
serialize_str = '{"firstName": "John", "lastName": "Smith", "email": "[email protected]"}'
customer = serializer.unserialize(serialize_str, Customer)
print(customer.name)           # "John"
print(customer.last_name)      # "Smith"
print(customer.primary_email)  # "[email protected]"
```

##### Nested objects

```python
from typing import List, Optional
from ajson import AJson, ASerializer

@AJson()
class Customer:
    def __init__(self):
        # we can also create the @aj annotation in the attribute's definition
        self.name = None           # @aj(name=firstName)
        self.primary_email = None  # @aj(name=email)

@AJson()
class Restaurant:
    customer_list: List[Customer]  # for nested objects, we need to define the type hints
    '''
    @aj(name=customers)
    we can create the @aj annotation in the attribute's definition
    '''
    owner: str = "John Smith"
    location: Optional[str] = None

restaurant_str = '''
{
    "location": "Spain",
    "customers": [
        {"firstName": "Dani", "email": "[email protected]"},
        {"firstName": "Mike", "email": "[email protected]"}
    ]
}
'''
serializer = ASerializer()
restaurant = serializer.unserialize(restaurant_str, Restaurant)
print(restaurant.owner)                  # "John Smith"
print(restaurant.customer_list[0].name)  # "Dani"
```

##### Known limitations

1. Unserializing a `Dict` with typed values (`Dict[str, MyObject]`) is not supported; it will just be unserialized as a plain dict.
2. Unserializing a `Dict` whose keys are not strings (`Dict[int, str]`) is not supported.

#### Documentation

Documentation and additional information [here](https://jorgegarciairazabal.github.io/ajson/).

#### Contributing

Any contribution, feature request, or bug report is always welcome. Please feel free to create issues or PRs.
ajsonapi
ajsonapi: asynchronous JSON API

#### What is it?

`ajsonapi` is a Python package for creating a JSON API web server backed by a database from a user-provided object model.

#### How to specify an object model?

Let's look at a simple object model specification.

```python
# model.py
from ajsonapi import (
    JSON_API,
    OneToManyRelationship,
    ManyToOneRelationship,
    Attribute,
    String,
)

class Persons(JSON_API):
    name = Attribute(String)
    articles = OneToManyRelationship('Articles', rfkey='person_id')

class Articles(JSON_API):
    title = Attribute(String)
    author = ManyToOneRelationship('Persons', lfkey='person_id')
```

This model contains two class definitions: `Persons` and `Articles`. A person has a name and can author zero or more articles. An article has a title and exactly one author (who is a person). The only parts of the model that may be unobvious are the `lfkey` and `rfkey` parameters in the relationship definitions. They are abbreviations for *local foreign key* and *remote foreign key*, respectively. Ajsonapi uses these parameters to identify that `Persons.articles` and `Articles.author` are each other's reverse relationship, and to persist objects and their relationships in the database.

For a more elaborate (albeit abstract) object model, see ajsonapi's model for functional testing.

#### How to create a web server?

```python
# app.py
from aiohttp.web import run_app
from ajsonapi import Application

import model  # Or directly include the above code snippet

async def make_app():
    app = Application()
    await app.connect_database('postgresql://user:password@localhost:5432/db')
    await app.create_tables()
    app.add_json_api_routes()
    return app.app

run_app(make_app())
```

#### What does ajsonapi provide?

From the above six-line model, ajsonapi creates a web server that supports the following eighteen operations (combinations of HTTP method and URI) as described by the JSON API specification:

```
GET, POST                 /persons
GET, PATCH, DELETE        /persons/{id}
GET, POST, PATCH, DELETE  /persons/{id}/relationships/articles
GET                       /persons/{id}/articles
GET, POST                 /articles
GET, PATCH, DELETE        /articles/{id}
GET, PATCH                /articles/{id}/relationships/author
GET                       /articles/{id}/author
```

All `GET` operations that return a collection support the `?include`, `?fields`, `?filter`, `?sort`, and `?page` query parameters. All objects created and manipulated through the web server are persisted in a Postgres database by ajsonapi.

#### Where to get it?

`pip install ajsonapi`
ajsonrpc
Async JSON-RPC 2.0 protocol + asyncio server

Lightweight JSON-RPC 2.0 protocol implementation and asynchronous server powered by asyncio. This library is a successor of json-rpc, written by the same team.

Features:

* Full JSON-RPC 2.0 implementation.
* Async request manager that handles the protocol.
* Vanilla Python, no dependencies.
* API server setup in 1 min.
* Same development team as json-rpc, largely compatible code.

#### Installing

```
$ pip install ajsonrpc
```

#### Quick Start

This package contains core JSON-RPC 2.0 primitives (request, response, etc.) and convenient backend-independent abstractions on top of them: dispatcher and request manager. These modules mirror the implementation in the original json-rpc package with minor changes and improvements. Below is a summary of each module.

#### Core Module

Consists of JSON-RPC 2.0 primitives: request, batch request, response, batch response, error. It also defines base classes for custom errors and exceptions.

Development principles:

* If a Python object is created or modified without exceptions, it contains valid data.
* Private state `<object>._body` contains the single source of truth. It is accessible and modifiable via getters (properties) and setters that ensure validation.
* `body` is always a dictionary with primitive keys and values (the only exception is `response.result`, which could hold any value defined by the application).
* Constructors, getters and setters operate with JSON-RPC defined types, e.g. `response.error` always has the `JSONRPC20Error` type. Most other types are strings and numbers.
* Unlike the json-rpc package, the core module does not deal with serialization/de-serialization; this logic was moved to the manager.

#### Dispatcher

Dispatcher is a dict-like object that maps method names to executables. One can think of it as an improved dictionary; in fact, it is inherited from `MutableMapping`. Some of the ways to add methods to a dispatcher:

```python
# init
d = Dispatcher({"sum": lambda a, b: a + b})

# set item
d["max"] = lambda a, b: max(a, b)

# function decorator
@d.add_function
def add(x, y):
    return x + y

# Add class or object
class Math:
    def sum(self, a, b):
        return a + b

    def diff(self, a, b):
        return a - b

d.add_class(Math)
d.add_object(Math())
d.add_dict({"min": lambda a, b: min(a, b)})

# rename function
d.add_function(add, name="my_add")

# prefix methods
d.add_class(Math, prefix="get_")
```

#### Manager

Manager generates a response for a request. It handles common routines: request parsing, exception handling and error generation, parallel request execution for batch requests, and serialization/de-serialization. Manager is asynchronous and backend agnostic; it exposes the following common methods:

```python
# Get a response object for a single request. Used by other methods.
async def get_response_for_request(
    self, request: JSONRPC20Request
) -> Optional[JSONRPC20Response]

# Get a (batch) response for a string payload. Handles de-serialization and parse errors.
async def get_response_for_payload(
    self, payload: str
) -> Optional[Union[JSONRPC20Response, JSONRPC20BatchResponse]]

# Most high-level method, returns string json for a string payload.
async def get_payload_for_payload(self, payload: str) -> str
```

#### Vanilla Server (Demo)

This package comes with an asyncio `Protocol`-based minimalistic server script, `async-json-rpc-server`. One could think of it as a bottle-py of API servers.

This was an experiment turned prototype: unlike json-rpc, which requires some "shell" like Django or Flask to work, this package relies on asyncio and therefore could build on top of its TCP server. Indeed, JSON-RPC 2.0 is intentionally simple: the server does not require views, has only one endpoint (routing is not required), and only deals with JSON. Hence, vanilla code would be not only sufficient but likely faster than any framework.

This idea of a self-sufficient server was extended further: what would be the minimum interface that allows one to plug in application code? What if zero integration were required? Likely, this was possible with runtime method introspection: `async-json-rpc-server` parses a given file with methods and exposes all of them. Let's consider an example:

```python
# examples/methods.py
import asyncio

def echo(s='pong'):
    return s

def mul2(a, b):
    return a * b

async def say_after(delay, what):
    await asyncio.sleep(delay)
    return what
```

To launch a server based on the above methods, simply run:

```
$ async-json-rpc-server examples/methods.py --port=8888
```

(Ctrl+C stops the server.)

Single request example:

```
$ curl -H 'Content-Type: application/json' \
    -d '{"jsonrpc": "2.0", "method": "echo", "id": 0}' \
    http://127.0.0.1:8888
{"jsonrpc": "2.0", "id": 0, "result": "pong"}
```

Batch request example:

#### Backends

Backend support is syntactic sugar that wraps dispatcher and manager under one API class and provides convenient boilerplate, such as handler generation. Currently supported frameworks: Tornado, Sanic, Quart.
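The dispatcher-plus-manager split described above is easy to picture with a small pure-Python sketch. This is not ajsonrpc's actual classes (the names here are made up for illustration): a plain dict stands in for the dispatcher, and one function plays the manager's role of turning a JSON-RPC 2.0 request payload into a response payload, including parse and method-not-found errors.

```python
import json

# Hypothetical minimal "dispatcher": a plain dict mapping method names to callables.
methods = {
    "echo": lambda s="pong": s,
    "mul2": lambda a, b: a * b,
}

def handle_payload(payload: str) -> str:
    """Turn a JSON-RPC 2.0 request string into a response string."""
    try:
        req = json.loads(payload)
    except ValueError:
        return json.dumps({"jsonrpc": "2.0", "id": None,
                           "error": {"code": -32700, "message": "Parse error"}})
    method = methods.get(req.get("method"))
    if method is None:
        return json.dumps({"jsonrpc": "2.0", "id": req.get("id"),
                           "error": {"code": -32601, "message": "Method not found"}})
    params = req.get("params", [])
    # Per the spec, params may be positional (list) or named (dict).
    result = method(**params) if isinstance(params, dict) else method(*params)
    return json.dumps({"jsonrpc": "2.0", "id": req.get("id"), "result": result})

print(handle_payload('{"jsonrpc": "2.0", "method": "echo", "id": 0}'))
# {"jsonrpc": "2.0", "id": 0, "result": "pong"}
```

The real package adds batching, validation, and async execution on top of this shape; the error codes `-32700` and `-32601` are the ones reserved by the JSON-RPC 2.0 specification.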
ajson-rpc2
No description available on PyPI.
ajsum
Small Python package that adds functionality convenient for my personal code.
ajtest
This is a simple example package. You can use GitHub-flavored Markdown to write your content.
ajum
we-love-ajum

This small library is a Python wrapper for ajum.de, querying the book review database of the German working group for children's and youth literature and media ("Arbeitsgemeinschaft Jugendliteratur und Medien" or "AJuM"), which is part of the German Education and Science Workers' Union ("Gewerkschaft Erziehung und Wissenschaft" or "GEW").

We deem their work to be invaluable for kindergartens, (pre)schools, universities and other educational institutions. We are thankful for AJuM's commitment and want to give something back by spreading the word and providing an easy way to interact with their review database.

**Note:** Since we DO NOT UNDER ANY CIRCUMSTANCES want to disrupt their services, `asyncio.sleep()` is called after each API request.

The included `ajum/data/index.json` file contains URL slugs for each ISBN. It was created using `--strict` mode, skipping invalid ISBNs - currently totalling 46203 (valid) ISBNs with 87939 reviews (averaging 1.90 reviews per ISBN).

#### Getting started

Simply install all dependencies inside a virtual environment to get started:

```bash
# Set up & activate virtualenv
virtualenv -p python3 venv

# shellcheck disable=SC1091
source venv/bin/activate

# Install dependencies, either ..
# (1) .. from PyPi (stable)
python -m pip install ajum

# (2) .. from repository (dev)
python -m pip install --editable .
```

From there, it's easy to roll out your own script:

```python
from ajum import Ajum

# Initialize object
ajum = Ajum()

# Fetch reviews from first page
slugs = ajum.get_slugs()

# Display their data:
print(ajum.get_reviews(slugs))
```

For more examples, have a look at `src/cli.py` and `src/ajum.py` to get you started - feedback appreciated, as always!

#### Usage

The following commands are available:

```
$ ajum --help
Usage: ajum [OPTIONS] COMMAND [ARGS]...

  Tools for interacting with the 'AJuM' database.

Options:
  -c, --config PATH  Path to user settings file.
  -u, --ua PATH      Path to "UA" strings file.
  -v, --verbose      Enable verbose mode.
  --version          Show the version and exit.
  --help             Show this message and exit.

Commands:
  backup  Backs up remote database
  clear   Removes cached results files
  export  Exports review data to FILE
  query   Queries remote database
  show    Shows data for given ISBN
  stats   Shows statistics
```

#### Commands

##### `backup`

.. remote database:

```
$ ajum backup --help
Usage: ajum backup [OPTIONS]

  Backs up remote database

Options:
  -p, --parallel INTEGER  Number of parallel downloads.
  -n, --number INTEGER    Number of results pages to be scraped.
  --help                  Show this message and exit.
```

##### `export`

.. review data as index (or full database):

```
$ ajum export --help
Usage: ajum export [OPTIONS] [FILE]

  Exports review data to FILE

Options:
  -s, --strict        Whether to skip invalid ISBNs.
  -f, --full          Whether to export full database.
  -j, --jobs INTEGER  Number of jobs.
  --help              Show this message and exit.
```

##### `show`

.. review data for given ISBN:

```
$ ajum show --help
Usage: ajum show [OPTIONS] ISBN

  Shows data for given ISBN

Options:
  --help  Show this message and exit.
```

##### `query`

.. remote database for given search terms:

```
$ ajum query --help
Usage: ajum query [OPTIONS]

  Queries remote database

Options:
  -q, --query TEXT         Search term.
  -t, --search-field TEXT  Search field type.
  -r, --rating TEXT        Rating.
  -f, --application TEXT   Field of application.
  -m, --media-type TEXT    Media type.
  -t, --topics TEXT        Topics.
  -a, --ages TEXT          Recommendable age range(s).
  -y, --year TEXT          Publishing year.
  --help                   Show this message and exit.
```

##### `stats`

.. about (cached) reviews:

```
$ ajum stats --help
Usage: ajum stats [OPTIONS]

  Shows statistics

Options:
  --help  Show this message and exit.
```

##### `clear`

.. cached results files:

```
$ ajum clear --help
Usage: ajum clear [OPTIONS]

  Removes cached results files

Options:
  -r, --reset  Whether to remove cached results pages.
  --help       Show this message and exit.
```

#### Disclaimer

For legal reasons, we only provide you with the means to download reviews. We assume neither ownership nor intellectual property of any review - they are publicly available on the AJuM website and are subject to their legal sphere alone.

Happy coding!

:copyright: Fundevogel Kinder- und Jugendbuchhandlung
ajuste-ebo
Curve Fitting Library with Kernels in Python

This library provides tools for performing curve fits using three types of kernels: Gaussian, tricube, and Epanechnikov. Curve fits are useful in statistics and data analysis for modeling relationships between variables.

#### Installation

You can install the library using `pip`:

```
pip install ajuste_ebo
```
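Kernel-based curve fitting of the kind described above is typically a Nadaraya-Watson weighted average: each prediction is a local mean of the observed y-values, weighted by a kernel of the distance to the query point. The sketch below illustrates the idea with the three kernels mentioned; it is not ajuste_ebo's actual API (the function names are made up for illustration).

```python
import math

# The three kernels named above; u is the scaled distance |x - xi| / bandwidth.
def gaussian(u):
    return math.exp(-0.5 * u * u)

def tricube(u):
    return (1 - abs(u) ** 3) ** 3 if abs(u) < 1 else 0.0

def epanechnikov(u):
    return 0.75 * (1 - u * u) if abs(u) < 1 else 0.0

def kernel_fit(xs, ys, x, kernel=gaussian, bandwidth=1.0):
    """Nadaraya-Watson estimate of y at x: a kernel-weighted mean of ys."""
    weights = [kernel((x - xi) / bandwidth) for xi in xs]
    total = sum(weights)
    if total == 0:
        raise ValueError("no points within bandwidth of x")
    return sum(w * yi for w, yi in zip(weights, ys)) / total

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 0.9, 2.1, 2.9, 4.2]  # roughly y = x
print(kernel_fit(xs, ys, 2.0, kernel=epanechnikov))  # close to 2.1
```

Note how the compactly supported kernels (tricube, Epanechnikov) ignore points farther than one bandwidth away, while the Gaussian kernel weights every point.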
ajv.programme
programme is a program that writes programmes.

This isn't really intended for general use. It's something I put together so someone I know could make nice-ish music-recital programmes, without having to manually futz with word-processor layouts. But if you run across it on PyPI, you're welcome to use it.

#### Quick Start

Install Python 3, pip, and programme itself, if they're not already present. The commands will differ by OS; this example is for Mint, Ubuntu, and other Debian-based systems:

```
sudo apt install python3-pip
pip3 install ajv.programme
```

Now generate a blank performer file, as follows:

```
programme --init > performers.tsv
```

Open performers.tsv in your spreadsheet program of choice. It may prompt you for import settings; the only relevant one is the delimiter, which should be a tab.

Fill in the columns on the spreadsheet. Anything left blank will be ignored. Save when done. Then:

```
programme performers.tsv output.html
```

That's it. Open output.html and you should get something usable. If you specified images in the tsv file, and if those files are present in the same directory, they'll be included in the output as portraits.

You can run `programme --help` to see additional options.
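The TSV-to-HTML flow above is simple to picture. Here is a toy sketch of the general idea only, not programme's actual code: the column names (`name`, `piece`) and the output markup are invented for illustration, and blank rows are skipped as the text describes.

```python
import csv
import html
import io

def render_programme(tsv_text: str) -> str:
    """Render tab-separated performer rows as a minimal HTML list."""
    rows = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    items = []
    for row in rows:
        name = (row.get("name") or "").strip()
        piece = (row.get("piece") or "").strip()
        if not name:  # blank rows are ignored, as described above
            continue
        items.append(f"<li>{html.escape(name)}: {html.escape(piece)}</li>")
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

tsv = "name\tpiece\nAda\tGymnopedie No. 1\n\t\nBen\tClair de lune\n"
print(render_programme(tsv))
```

A real implementation would add page layout and portrait images, but the core is just this row-by-row transformation.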
ak
No description available on PyPI.
ak4777
ak47
ak48
ak47
ak7-distributions
No description available on PyPI.
aka
#### Abstract

This package provides a command line utility called `aka` for swiftly renaming (or copying) files using Python code. This makes it easy to rename files even when the changes you are making are quite complicated. It always renames files in two passes to avoid collisions; it tries to detect miscellaneous errors in advance; and if errors occur underway, it will put you in an emergency mode to resolve the problem or roll back changes. It also provides the functions `aka.rename` and `aka.copy`, which are the underlying interface.

#### The problem being solved

Let's say you have a directory with the files `File0`, `File1`, and `File2`. Then some people come along and complain (rightly or wrongly) that the numbering starts at zero. So you decide to write a program to rename all those files, but a problem arises. You cannot do it in any order you like; you have to start with `File2 -> File3` in order to avoid conflicts. It'd be nice to just write a function that knows how to change the names of individual files and let another program sort out the rest. This is what `aka.rename` is about:

```pycon
>>> import aka
>>> from contex import rules
>>> def machine(fn):
...     return rules(r'File(\d+)', {1: lambda digit: int(digit) + 1}).apply(fn)
>>> machine('File42')
'File43'
>>> aka.rename(machine)
Actions to be taken (simplified; doesn't show the temporary stage):
  /home/uglemat/Documents/File0 -> /home/uglemat/Documents/File1
  /home/uglemat/Documents/File1 -> /home/uglemat/Documents/File2
  /home/uglemat/Documents/File2 -> /home/uglemat/Documents/File3
Target directories:
  /home/uglemat/Documents
The files will be renamed as shown above (in two passes though, in
order to avoid collisions). This program searched for name conflicts
in all target directories and did not find any. If errors do pop up,
you'll be taken to an emergency mode where you can roll back changes.
Continue? [N/y]: y
Renaming /home/uglemat/Documents/File0 -> /tmp/aka_maok91r8/File0
Renaming /home/uglemat/Documents/File1 -> /tmp/aka_maok91r8/File1
Renaming /home/uglemat/Documents/File2 -> /tmp/aka_maok91r8/File2
Renaming /tmp/aka_maok91r8/File0 -> /home/uglemat/Documents/File1
Renaming /tmp/aka_maok91r8/File1 -> /home/uglemat/Documents/File2
Renaming /tmp/aka_maok91r8/File2 -> /home/uglemat/Documents/File3
True
```

I used `contex.rules` to manipulate the string, but you can do whatever you like inside `machine`; you just need to return the new name of the file.

By default it renames files in the current working directory, but that can be changed with the `location` argument to `aka.rename`. `aka.copy` is basically the same, it just copies files instead. Read the docstrings of those functions to learn the details.

#### Command line utility

That's all fine and dandy, but when you just have some files and you need to rename them, you want to do it with a command line utility. This is the basics:

```
$ aka --help
Useful information ...
$ aka -p 'fn + ".jpg"'
```

That will add a ".jpg" suffix to all files in the working directory. But let's do what we did above with `aka.rename`:

```
$ aka -p 'rules(r"File(\d+)", {1: lambda digit: int(digit) + 1})'
```

The expression after `-p` doesn't need to be a new filename; it can also be a unary callable (like `machine` above) that returns the new filename. That is why the example above works; `contex.rules` returns a callable. If you want to copy instead of rename, just add in the `-c` option:

```
$ aka -c -p 'rules(r"File(\d+)", {1: lambda digit: int(digit) + 1})'
-- COPYING FILES IN .
ERROR: /home/uglemat/Documents/File1 -> /home/uglemat/Documents/File2 is a conflict!
ERROR: /home/uglemat/Documents/File2 -> /home/uglemat/Documents/File3 is a conflict!
Aborting ...
```

Err, yes, that won't work, of course. Good thing `aka` detects naming conflicts in advance!

#### More complicated renaming schemes

That's great, but what if it's not a simple one-liner? Then you need to create a new file, write some Python code, launch the Python interpreter, import the stuff you need… It's cumbersome, which is why `aka` can help with that:

```
$ aka -e emacs
```

This will launch emacs and take you to a temporary file which looks kind of like this:

```python
import re
from os.path import join
from contex import rules

# Directories in which to perform changes:
#   /home/uglemat/Documents

def rename(fn, dirname):
    return fn
```

Your job is to complete `rename`, and when you exit the editor it will do the job (after asking you if you want to continue).

Let's do something more advanced. Say you have lots of files in `~/Documents/files` of the format `File<num>` and you want to split them into the folders `odd` and `even`, like this:

```
~/Documents/files $ for i in {0..20}; do touch "File$i"; done
~/Documents/files $ ls
File0  File1  File10  File11  File12  File13  File14  File15  File16  File17
File18  File19  File2  File20  File3  File4  File5  File6  File7  File8  File9
~/Documents/files $ mkdir odd even
```

There is a slight problem in that you can't rename `odd` and `even`, but they are in the same directory. You just have to make sure that the rename function returns a falsy value for those filenames (btw, aka treats directories like files and will rename them too). Let's go to the editor with `aka -e 'emacs -nw'` and write this:

```python
import re
from os.path import join
from contex import rules

# Directories in which to perform changes:
#   /home/uglemat/Documents/files

def rename(fn, dirname):
    match = re.search(r'\d+', fn)
    if match:
        digit = int(match.group(0))
        return join('even' if even(digit) else 'odd', fn)

def even(d):
    return (d % 2) == 0
```

The directories `odd` and `even` don't match, so `rename` returns `None` for those names and thus they are ignored, and the code above works as expected:

```
~/Documents/files $ aka -e 'emacs -nw'
running $ emacs -nw +9:14 /tmp/aka_3uvuyn8c.py
Aka: Proceed? [Y/n]: y
-- RENAMING FILES IN .
Actions to be taken (simplified; doesn't show the temporary stage):
  /home/uglemat/Documents/files/File3 -> /home/uglemat/Documents/files/odd/File3
  /home/uglemat/Documents/files/File18 -> /home/uglemat/Documents/files/even/File18
  /home/uglemat/Documents/files/File13 -> /home/uglemat/Documents/files/odd/File13
  ...
Target directories:
  /home/uglemat/Documents/files/odd
  /home/uglemat/Documents/files/even
The files will be renamed as shown above (in two passes though, in
order to avoid collisions). This program searched for name conflicts
in all target directories and did not find any. If errors do pop up,
you'll be taken to an emergency mode where you can roll back changes.
Continue? [N/y]: y
Renaming /home/uglemat/Documents/files/File3 -> /tmp/aka_st72r5jp/File3
Renaming /home/uglemat/Documents/files/File18 -> /tmp/aka_st72r5jp/File18
Renaming /home/uglemat/Documents/files/File13 -> /tmp/aka_st72r5jp/File13
...
Renaming /tmp/aka_st72r5jp/File3 -> /home/uglemat/Documents/files/odd/File3
Renaming /tmp/aka_st72r5jp/File18 -> /home/uglemat/Documents/files/even/File18
Renaming /tmp/aka_st72r5jp/File13 -> /home/uglemat/Documents/files/odd/File13
~/Documents/files $ ls *
even:
File0  File10  File12  File14  File16  File18  File2  File20  File4  File6  File8

odd:
File1  File11  File13  File15  File17  File19  File3  File5  File7  File9
```

#### Rollbacks

To test the rollback feature of `aka`, let's first launch this command:

```
$ aka -p 'rules(r"File(\d+)", {1: lambda digit: int(digit) + 1})'
-- RENAMING FILES IN .
Actions to be taken (simplified; doesn't show the temporary stage):
  /home/uglemat/Documents/File3 -> /home/uglemat/Documents/File4
  /home/uglemat/Documents/File1 -> /home/uglemat/Documents/File2
  /home/uglemat/Documents/File2 -> /home/uglemat/Documents/File3
Target directories:
  /home/uglemat/Documents
The files will be renamed as shown above (in two passes though, in
order to avoid collisions). This program searched for name conflicts
in all target directories and did not find any. If errors do pop up,
you'll be taken to an emergency mode where you can roll back changes.
Continue? [N/y]:
```

Now it's waiting for confirmation from the user, so we have time to do some sabotage in another shell:

```
$ touch File4
$ ls
File1  File2  File3  File4
```

In the first shell, let's enter `y` to see how it fails:

```
Renaming /home/uglemat/Documents/File3 -> /tmp/aka_1ozr4w4b/File3
Renaming /home/uglemat/Documents/File1 -> /tmp/aka_1ozr4w4b/File1
Renaming /home/uglemat/Documents/File2 -> /tmp/aka_1ozr4w4b/File2
Renaming /tmp/aka_1ozr4w4b/File3 -> /home/uglemat/Documents/File4
EMERGENCY MODE: File /home/uglemat/Documents/File4 already exists!
ERROR: Error happened when trying to rename /tmp/aka_1ozr4w4b/File3 -> /home/uglemat/Documents/File4
What should the program do?
  retry     : try again (presumably you've fixed something in the meantime)
  rollback  : attempt to undo changes (except for the ones previously continue'd)
  showroll  : show which actions will be taken if you choose `rollback`
  exit      : exit the program
  continue  : ignore the error and move on
>
```

Oh my, looks like things didn't go as planned. You're now in the emergency prompt of `aka`. You could easily fix the problem by deleting `File4` and entering `retry`, but that's boring. Let's first see what happens when you select `continue`:

```
> continue
Renaming /tmp/aka_1ozr4w4b/File1 -> /home/uglemat/Documents/File2
Renaming /tmp/aka_1ozr4w4b/File2 -> /home/uglemat/Documents/File3
LOST FILES IN TEMP DIR: '/tmp/aka_1ozr4w4b'
$ ls /tmp/aka_1ozr4w4b
File3
```

It's not very nice that it just left the file in the temp dir. `continue` is probably rarely a good option. Let's be more sophisticated and choose `rollback`:

```
> showroll
Rollback actions:
  /tmp/aka_1ozr4w4b/File2 -> /home/uglemat/Documents/File2
  /tmp/aka_1ozr4w4b/File1 -> /home/uglemat/Documents/File1
  /tmp/aka_1ozr4w4b/File3 -> /home/uglemat/Documents/File3
What should the program do?
  retry     : try again (presumably you've fixed something in the meantime)
  rollback  : attempt to undo changes (except for the ones previously continue'd)
  showroll  : show which actions will be taken if you choose `rollback`
  exit      : exit the program
  continue  : ignore the error and move on
> rollback
Rollback renaming /tmp/aka_1ozr4w4b/File2 -> /home/uglemat/Documents/File2
Rollback renaming /tmp/aka_1ozr4w4b/File1 -> /home/uglemat/Documents/File1
Rollback renaming /tmp/aka_1ozr4w4b/File3 -> /home/uglemat/Documents/File3
$ ls
File1  File2  File3  File4
```

Rollback will "undo" all previous actions, in the reverse order that they were done. If you use the `--copy` option, then the undoing consists of deleting files already copied. If any of the rollback actions fails, then `aka` will ignore it and try to undo as much as possible.

#### Installing

`aka` works only in Python 3. Install with `$ pip3 install aka`. You might want to replace `pip3` with `pip`, depending on how your system is configured.

#### Windows compatibility

I developed this program on GNU/Linux, but it should work on Windows as well. It understands that filenames are case insensitive on Windows when checking for naming conflicts, yet the case sensitivity is preserved when the actual renames are done.

#### Developing

Aka has some tests. Run `$ nosetests` or `$ python3 setup.py test` to run the tests. The code is hosted at https://notabug.org/Uglemat/aka

You can install in development mode with `$ python3 setup.py develop`; then your changes to aka will take effect immediately. Launch the same command with the `--uninstall` option to (kind of) remove it.

#### License

The code is licensed under the GNU General Public License 3 or later. This README file is public domain.
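The two-pass scheme aka describes (move everything to a temporary staging directory first, then to the final names) is what makes order-independent renaming safe. Here is a minimal standalone sketch of that idea, not aka's implementation (no conflict detection or rollback, and the staging directory is assumed to be on the same filesystem):

```python
import os
import tempfile

def two_pass_rename(dirname, machine):
    """Apply machine(filename) -> new name (or None to skip) to every file
    in dirname, staging through a temp dir so renames can't collide."""
    plan = []
    for fn in os.listdir(dirname):
        new = machine(fn)
        if new and new != fn:
            plan.append((fn, new))
    # Pass 1: move sources out of the way.
    staging = tempfile.mkdtemp(prefix="aka_sketch_")
    for old, _ in plan:
        os.rename(os.path.join(dirname, old), os.path.join(staging, old))
    # Pass 2: move into final names.
    for old, new in plan:
        os.rename(os.path.join(staging, old), os.path.join(dirname, new))
    os.rmdir(staging)

# Shift File0..File2 up by one, which would collide if done naively in order.
d = tempfile.mkdtemp()
for i in range(3):
    open(os.path.join(d, f"File{i}"), "w").close()
two_pass_rename(d, lambda fn: f"File{int(fn[4:]) + 1}" if fn.startswith("File") else None)
print(sorted(os.listdir(d)))  # ['File1', 'File2', 'File3']
```

Because every source is moved to the staging directory before any destination is written, the order of the plan no longer matters.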
akachi
No description available on PyPI.
akad
Akad stores all the LINE protocols you need.
akadata
# Python Akamai EdgeScape client

[![Build Status](https://img.shields.io/travis/redjack/akadata-py/master.svg)](https://travis-ci.org/redjack/akadata-py)

Python client for Akamai EdgeScape. Supports Python 2.7/3.3+, and PyPy.

## Installation

```
pip install akadata
```

## Usage

To look up an IP address, initialize an `akadata.EdgeScape` object with the host and port of the EdgeScape Facilitator server, and call its `ip_lookup` method. See the docstring for `akadata.EdgeScape.ip_lookup` for details about the resulting dictionary, parsing, and possible exceptions.

```pycon
>>> from pprint import pprint
>>> from akadata import EdgeScape
>>>
>>> client = EdgeScape('192.168.59.103', 49158)
>>>
>>> pprint(client.ip_lookup('208.78.4.5', timeout=1))
{u'areacode': [301],
 u'asnum': [40287],
 u'bw': 5000,
 u'city': u'SILVERSPRING',
 u'company': u'RedJack_LLC',
 u'continent': u'NA',
 u'country_code': u'US',
 u'county': [u'MONTGOMERY'],
 'default_answer': False,
 u'dma': 511,
 u'domain': u'redjack.com',
 u'fips': ['24031'],
 'ip': u'208.78.4.5',
 u'lat': 39.0245,
 u'long': -77.0094,
 u'msa': 8872,
 u'pmsa': 8840,
 u'region_code': u'MD',
 u'throughput': u'vhigh',
 u'timezone': u'EST',
 u'zip': ['20901', '20902', '20903', '20904', '20905', '20906', '20907',
          '20908', '20910', '20911', '20914', '20915', '20916', '20918',
          '20993', '20997']}
```
akadav
akaDAV is a python module to provide WebDAV (RFC 2518) capabilities for Twisted 1.3. It enables you to quickly write your own WebDAV server application in Python. The package also includes easy-to-use and lightweight WebDAV server application.
ak-adb
This repository contains a pure-python implementation of the Android ADB and Fastboot protocols, using libusb1 for USB communications. This is a complete replacement and rearchitecture of the Android project's ADB and fastboot code available at https://github.com/android/platform_system_core/tree/master/adb

This code is mainly targeted at users who need to communicate with Android devices in an automated fashion, such as in automated testing. It does not have a daemon between the client and the device, and therefore does not support multiple simultaneous commands to the same device. It does support any number of devices and never communicates with a device that it wasn't intended to, unlike the Android project's ADB.
akademy
AkademyAkademy is a module containing composable object classes for developing reinforcement learning algorithms focused on quantitative trading and time-series forecasting. This module is a work-in-progress and should, at no time, be assumed to be designed well or be free of bugs.OverviewAkademy is designed using anAgent-Environmentmodel such thatAgent-class objects ingest information fromEnvironment-class objects (Env), produce anAction, which is then applied to theEnvironmentwhich results in a change inStateand possible reward to offer feedback to the agent.Note: this module does not provide any training routines -- only the object class that can be used to support the implementation of custom training routines.Getting StartedTo installakademyuse the following command in the desired Python 3.7+ environment:pip install akademyOnce installed, developers will have access toAgent,TradeEnv, andNetworkclass objects in which to design Reinforcement Learning algorithms to train models.Sample training routine:fromakademy.models.envsimportTradeEnvfromakademy.models.agentsimportDQNAgentfromakademy.common.utilsimportload_spy_daily# loads the dataset used during trainingdata=load_spy_daily(count=2500)# load the Trading Environmentenv=TradeEnv(data=data,window=50,asset="spy",)# load the agent to trainagent=DQNAgent(action_count=env.action_space.n,state_shape=env.observation_space.shape)# load user-defined training routinetraining_routine(agent=agent,env=env)TestsUnit testing can be run via the following command:python -m unittestFor detailed information the--verboseflag can be used. For more detailed usage consult theunittestmodule documentation.Available DataThis module comes with minimal data for Agents and Environments to train on. The current data available is listed below, along with sources for the most up-to-date versions as well:1. 
S&P 500 (SPY). Location: /data/SPY.CSV; Start: 1993-01-29; End: 2023-01-23; Total rows: 7,454 (excludes header); Header: Date,Open,High,Low,Close,Adj Close,Volume; Source: https://finance.yahoo.com/quote/SPY/history?p=SPY. Note: any data can be used easily enough via conversion into a Pandas DataFrame object, but it must contain a date along with pricing data for open, high, low, and close, as well as volume, so that each row has at least those 6 features (or the latter 5 plus an index representing the date). Notes. Gym vs. Gymnasium: the Gym project by OpenAI has been sunset and is now maintained as Gymnasium by the Farama Foundation. The Env classes present here make use of the newer Gymnasium package which, among other differences, produces an extra item in the step method indicating whether an environment has been truncated. PyTorch: PyTorch requires some additional consideration for setup depending on use-case. Akademy uses an approach whereby CPU-based training and inference are possible via parameterized function calls; however, GPU use (e.g. CUDA) requires local configuration. See https://pytorch.org/get-started/locally/ for a more in-depth discussion and guide. This module currently uses the 1.* version of PyTorch, though a 2.* release is imminent and an upgrade to that version is planned.
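The Agent-Environment cycle described above (an Agent ingests a State from the Env, emits an Action, and the Env returns the next State, a reward, and Gymnasium's terminated/truncated flags) can be sketched with plain-Python stand-ins. CountdownEnv, RandomAgent, and this training_routine are hypothetical toys written only to show the control flow; they are not part of akademy:

```python
# Illustrative sketch of the Agent-Environment loop. All class and
# function names here are hypothetical stand-ins, not akademy's API.
import random

random.seed(0)  # reproducible episodes

class CountdownEnv:
    """Toy environment: state is a counter; episode ends at zero."""
    def reset(self):
        self.state = 10
        return self.state, {}          # gymnasium-style (obs, info)

    def step(self, action):
        self.state -= action           # action in {0, 1}
        reward = 1.0 if self.state == 0 else 0.0
        terminated = self.state <= 0
        truncated = False              # the extra flag gymnasium's step adds
        return self.state, reward, terminated, truncated, {}

class RandomAgent:
    def act(self, state):
        return random.choice([0, 1])

def training_routine(agent, env, episodes=3):
    totals = []
    for _ in range(episodes):
        state, _ = env.reset()
        total, done = 0.0, False
        while not done:
            action = agent.act(state)
            state, reward, terminated, truncated, _ = env.step(action)
            total += reward
            done = terminated or truncated
        totals.append(total)
    return totals

print(training_routine(RandomAgent(), CountdownEnv()))  # [1.0, 1.0, 1.0]
```

A real routine would replace RandomAgent with a learning agent (e.g. akademy's DQNAgent) and add experience storage and network updates inside the loop.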
aka-distribution
No description available on PyPI.
akafra-demo
No description available on PyPI.
akafuji
No description available on PyPI.
akagami
No description available on PyPI.
akagi
akagi
Free software: MIT license
Features: akagi enables you to access various data sources such as Amazon Redshift, Amazon S3 and Google Spreadsheet (more in the future) from Python.
Installation. Install via pip: pip install akagi, or from source:
$ git clone https://github.com/ayemos/akagi akagi
$ cd akagi
$ python setup.py install
Setup. To use RedshiftDataSource, you need to set the environment variable AKAGI_UNLOAD_BUCKET to the name of the Amazon S3 bucket you would like to use as intermediate storage for the Redshift UNLOAD command.
$ export AKAGI_UNLOAD_BUCKET=xyz-unload-bucket.ap-northeast-1
To use SpreadsheetDataSource, you need to set the environment variable GOOGLE_APPLICATION_CREDENTIAL to point at your service account credentials file. You can get the credential from here. The associated client has to have read access to the sheets.
$ export GOOGLE_APPLICATION_CREDENTIAL=$HOME/.credentials/service-1a2b.json
Example. RedshiftDataSource:
from akagi.data_sources import RedshiftDataSource
ds = RedshiftDataSource('select * from (select user_id, path from logs.imp limit 10000)')
for d in ds:
    print(d)  # iterate on result
S3DataSource:
from akagi.data_sources import S3DataSource
ds = S3DataSource.for_prefix('image-data.ap-northeast-1', 'data/image_net/zebra', file_format='binary')
for d in ds:
    print(d)  # iterate on result
SpreadsheetDataSource:
from akagi.data_sources import SpreadsheetDataSource
ds = SpreadsheetDataSource(
    '1BxiMVs0XRA5nFMdKvBdBZjgmUUqptlbs74OgvE2upms',  # sample sheet provided by Google
    sheet_range='Class Data!A2:F31')
for d in ds:
    print(d)  # iterate on result
LocalDataSource:
from akagi.data_sources import LocalDataSource
ds = LocalDataSource('./PATH/TO/YOUR/DATA/DIR', file_format='csv')
for d in ds:
    print(d)  # iterate on result
Credits: This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.
akaifire
fire.pyA library for the Akai Fire FL Studio Controller.
akai-mpkmini-mkii-ctrl
Command-line controller for AKAI MPKmini MK IISource–CC BY 2.0Best effort project to overcome the fact that AKAI doesn't seem to be interested in fixing Segmentation faults in theirMPKmini Editor.It currently fixes my own itches but I gladly accept feedback!InstallTo install viaPyPi:pip3 install --user akai-mpkmini-mkii-ctrlPlease note that the dependencypython-rtmidirequires compilation resources to be present on your system. For Debian-like systems for example you need to installsudo apt-get install libasound2-dev. Refer to theproject documentationfor details.To install from source you can use:python3 setup.py install, ormake installwhich will run apipenvincluding linting, tests, etc.Usageakai_mpkmini_mkii_ctrlsupports a set of commands to push or pull presets to and from the device. All commands have a common set of options:-p, --preset NUM Target preset slot (0 = RAM, 1-4 = Stored preset, default: 0) -m, --midi-port NUM MIDI Port on which the device is located (default: 0) -v, --verbose Verbose output --help Show this message and exit.Commandsprint-preset: Print preset on device in human readable format. In this example it will print the preset stored in slot 1 on the device.python3-makai_mpkmini_mkii_ctrl\--preset1print-presetpull-preset: Pull a binary from the device and write to file.python3-makai_mpkmini_mkii_ctrl\--preset0\pull-preset\--output-fileram-preset.mk2push-preset: Push a local binary preset to the device. This also works withfactory binary presets.python3-makai_mpkmini_mkii_ctrl\--preset2\push-preset\--input-fileresources/factory-patches/preset1.mk2push-config-preset: Push a local configuration preset (Example) to the device. Notice that you are able to combine several input files for easier re-use. YAML and JSON format is supported. 
The configurations are applied in order, e.g., in this caseBase-Config.yamlwill be extended/overwritten with the properties found inLogic-RetroSynth+Juno.yaml.python3-makai_mpkmini_mkii_ctrl\--preset0\push-config-preset\--input-fileresources/config-presets/Base-Config.yaml\--input-fileresources/config-presets/Logic-RetroSynth+Juno.yamlDevelopmentYou can prepare apipenv-based development environment using:makecleanvenvYou can also install the controller to your system using:makeinstallTo use the localpipenv-based version you can use the following command from where you cloned the repository:pipenvrunpythonakai_mpkmini_mkii_ctrlResourcesThe implementation is based upon the following resources:https://github.com/gljubojevic/akai-mpk-mini-editorhttps://github.com/mungewell/mpd-utilshttps://www.snoize.com/midimonitor/https://github.com/gbevin/SendMIDIhttps://github.com/gbevin/ReceiveMIDIhttps://www.akaipro.com/mpk-mini-mkii
akamai
No description available on PyPI.
akamai-authtoken
Akamai-AuthToken: Akamai Authorization Token for Python=================================================.. image:: https://img.shields.io/pypi/v/akamai-authtoken.svg:target: https://pypi.python.org/pypi/akamai-authtoken.. image:: https://travis-ci.org/AstinCHOI/Akamai-AuthToken-Python.svg?branch=master:target: https://travis-ci.org/AstinCHOI/Akamai-AuthToken-Python.. image:: http://img.shields.io/:license-apache-blue.svg:target: https://github.com/AstinCHOI/Akamai-AuthToken-Python/blob/master/LICENSEAkamai-AuthToken is Akamai Authorization Token in the HTTP Cookie, Query String and Header for a client.You can configure it in the Property Manager at https://control.akamai.com.It's a behavior which is Auth Token 2.0 Verification.Akamai-AuthToken supports Python 2.6–2.7 & 3.3–3.6, and runs great on PyPy. (This is Akamai unofficial code).. image:: https://github.com/AstinCHOI/akamai-asset/blob/master/authtoken/authtoken.png?raw=true:align: centerInstallation------------To install Akamai Authorization Token for Python:.. code-block:: bash$ pip install akamai-authtokenExample-------.. code-block:: pythonfrom akamai.authtoken import AuthToken, AuthTokenErrorimport requests # just for this exampleAT_HOSTNAME = 'auth-token.akamaized.net'AT_ENCRYPTION_KEY = 'YourEncryptionKey'DURATION = 500 # seconds::AT_ENCRYPTION_KEY must be hexadecimal digit string with even-length.Don't expose AT_ENCRYPTION_KEY on the public repository.**URL parameter option**.. 
code-block:: python# 1) Cookieat = AuthToken(key=AT_ENCRYPTION_KEY, window_seconds=DURATION, escape_early=True)token = at.generateToken(url="/akamai/authtoken")url = "http://{0}{1}".format(AT_HOSTNAME, "/akamai/authtoken")response = requests.get(url, cookies={at.token_name: token})print(response) # Maybe not 403# 2) Query stringtoken = at.generateToken(url="/akamai/authtoken")url = "http://{0}{1}?{2}={3}".format(AT_HOSTNAME, "/akamai/authtoken", at.token_name, token)response = requests.get(url)print(response)::It depends on turning on/off 'Escape token input' in the property manager. (on: escape_early=True / off: escape_early=False)In [Example 2], it's only okay for 'Ignore query string' option on in the property manager.If you want to 'Ignore query string' off using query string as your token, Please contact your Akamai representative.**ACL(Access Control List) parameter option**.. code-block:: python# 1) Header using *at = AuthToken(key=AT_ENCRYPTION_KEY, window_seconds=DURATION)token = at.generateToken(acl="/akamai/authtoken/list/*")url = "http://{0}{1}".format(AT_HOSTNAME, "/akamai/authtoken/list/something")response = requests.get(url, headers={at.token_name: token})print(response)# 2) Cookie Delimited by '!'acl = ["/akamai/authtoken", "/akamai/authtoken/list/*"]token = at.generateToken(acl=AuthToken.ACL_DELIMITER.join(acl))url = "http://{0}{1}".format(AT_HOSTNAME, "/akamai/authtoken/list/something2")# or "/akamai/authtoken"response = requests.get(url, cookies={at.token_name: token})print(response)::It doesn't matter turning on/off 'Escape token input' in the property manager, but you should keep escape_early=False (Default)Usage-----**AuthToken Class**.. 
code-block:: pythonAuthToken(token_type=None, token_name='__token__', key=None, algorithm='sha256',salt=None, start_time=None, end_time=None, window_seconds=None,field_delimiter='~', escape_early=False, verbose=False)#==================== ===================================================================================================Parameter Description==================== ===================================================================================================token_type Select a preset. (Not Supported Yet)token_name Parameter name for the new token. [Default: __token__]key Secret required to generate the token. It must be hexadecimal digit string with even-length.algorithm Algorithm to use to generate the token. (sha1, sha256, or md5) [Default:sha256]salt Additional data validated by the token but NOT included in the token body. (It will be deprecated)start_time What is the start time? (Use string 'now' for the current time)end_time When does this token expire? 'end_time' overrides 'window_seconds'window_seconds How long is this token valid for?field_delimiter Character used to delimit token body fields. [Default: ~]escape_early Causes strings to be 'url' encoded before being used.verbose Print all parameters.==================== ===================================================================================================**AuthToken's Static Variable**.. code-block:: pythonACL_DELIMITER = '!' # Character used to delimit acl fields.**AuthToken's Method**.. code-block:: pythongenerateToken(url=None, acl=None, start_time=None, end_time=None,window_seconds=None, ip=None, payload=None, session_id=None)# Returns the authorization token string.+----------------+---------------------------------------------------------------------------------------------------------+| Parameter | Description |+================+=========================================================================================================+| url | Single URL path. 
|+----------------+---------------------------------------------------------------------------------------------------------+| acl | Access control list delimited by ! [ie. /\*] |+----------------+---------------------------------------------------------------------------------------------------------+| start_time | |+----------------+ +| end_time | Same as Authtoken's parameters, but they overrides Authtoken's. |+----------------+ +| window_seconds | |+----------------+---------------------------------------------------------------------------------------------------------+| ip | IP Address to restrict this token to. (Troublesome in many cases (roaming, NAT, etc) so not often used) |+----------------+---------------------------------------------------------------------------------------------------------+| payload | Additional text added to the calculated digest. |+----------------+---------------------------------------------------------------------------------------------------------+| session_id | The session identifier for single use tokens or other advanced cases. |+----------------+---------------------------------------------------------------------------------------------------------+Command-------.. code-block:: bash$ python cms_authtoken.py -k YourEncryptionKey -w 5000 -u /hello/world -xUse -h or --help option for the detail.License-------Copyright 2017 Akamai Technologies, Inc. All rights reserved.Licensed under the Apache License, Version 2.0 (the "License");you may not use this file except in compliance with the License.You may obtain a copy of the License at `<http://www.apache.org/licenses/LICENSE-2.0>`_.Unless required by applicable law or agreed to in writing, softwaredistributed under the License is distributed on an "AS IS" BASIS,WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.See the License for the specific language governing permissions andlimitations under the License.
akamaichinaCDN
A Python Class for Akamai China CDN ManagerAn Object oriented implementation of Akamai China CDN Portal.CredentialsIn order to use this configuration, you need to:Set up your credential files as described in theauthorizationandcredentialssections of the getting started guide on developer.akamai.com.When working through this process you need to give grants for the property manager API and the User Admin API (if you will want to move properties). The section in your configuration file should be called 'default'.OverviewAn Object oriented implementation of Akamai China CDN ManagerInstall pip package available$ pip install akamaichinaCDNInstantiate the object.>>> from akamaichinaCDN import ChinaCDNManager >>> china_manager = ChinaCDNManager('/Users/apadmana/.edgerc')Whitelist Hostname>>> china_manager.whiteList(newhostname)Class Definitionclass ChinaCDNManager(): def __init__(self,edgercLocation,accountSwitchKey=None): self.icp_info = '' self._edgerc = '' self._prdHttpCaller = '' self._session = '' self._baseurl_prd = '' self._host = '' self._icp_info = '' self._icp_entities_info = '' self.accountSwitchKey = '' self._edgehostnames = '' self._groups = '' self._appmapping = '' self._legalmapping = '' def getICPEntities(self) def getICPNumbers(self) def getEdgeHostNames(self) def getGroups(self) def getProperties(self) def getPropertyInfo(self,hostname) def getDeProvisionPolicy(self,edgeHostname) def specifyDeProvisionPolicy(self,edgeHostname) def getProvisionStateChange(self,hostname,changeId) def getcurrentProvisionStateChange(self,hostname) def getProvisionStatus(self,status_filter=None) def createPropertyHostname(self,hostname,icpNumberId,serviceCategoryName,groupId) def whiteList(self,hostname)
akamaiclient
This is a simple example package. You can useGithub-flavored Markdownto write your content.
akamai-edgeauth
EdgeAuth-Token-Python is Akamai Edge Authorization Token in the HTTP Cookie, Query String, and Header for a client. You can configure it in the Property Manager at https://control.akamai.com. It's a behavior which is Auth Token 2.0 Verification. EdgeAuth-Token-Python supports Python 2.6–2.7 & 3.3–3.6 and runs great on PyPy.
Installation
To install Akamai Edge Authorization Token for Python:
$ pip install akamai-edgeauth
Example
from akamai.edgeauth import EdgeAuth, EdgeAuthError
import requests  # just for this example

ET_HOSTNAME = 'edgeauth.akamaized.net'
ET_ENCRYPTION_KEY = 'YourEncryptionKey'
DURATION = 500  # seconds

ET_ENCRYPTION_KEY must be a hexadecimal digit string with even length. Don't expose ET_ENCRYPTION_KEY in a public repository.
URL parameter option
# 1) Cookie
et = EdgeAuth(**{'key': ET_ENCRYPTION_KEY, 'window_seconds': DURATION})
token = et.generate_url_token("/akamai/edgeauth")
url = "http://{0}{1}".format(ET_HOSTNAME, "/akamai/edgeauth")
response = requests.get(url, cookies={et.token_name: token})
print(response)  # Maybe not 403
# 2) Query string
token = et.generate_url_token("/akamai/edgeauth")
url = "http://{0}{1}?{2}={3}".format(ET_HOSTNAME, "/akamai/edgeauth", et.token_name, token)
response = requests.get(url)
print(response)
The 'Escape token input' option in the Property Manager corresponds to 'escape_early' in the code: Escape token input (on) == escape_early (True); Escape token input (off) == escape_early (False). In [Example 2] for the query string, it's only okay with the 'Ignore query string' option (on). If you want the 'Ignore query string' option (off) while using the query string as your token, please contact your Akamai representative.
ACL (Access Control List) parameter option
# 1) Header using *
et = EdgeAuth(**{'key': ET_ENCRYPTION_KEY, 'window_seconds': DURATION})
token = et.generate_acl_token("/akamai/edgeauth/list/*")
url = "http://{0}{1}".format(ET_HOSTNAME, "/akamai/edgeauth/list/something")
response = requests.get(url, headers={et.token_name: token})
print(response)
# 2) Cookie delimited by
'!'
acl_path = ["/akamai/edgeauth", "/akamai/edgeauth/list/*"]
token = et.generate_acl_token(acl_path)
# url = "http://{0}{1}".format(ET_HOSTNAME, "/akamai/edgeauth")
url = "http://{0}{1}".format(ET_HOSTNAME, "/akamai/edgeauth/list/something2")
response = requests.get(url, cookies={et.token_name: token})
print(response)
An ACL can use the wildcards (*, ?) in the path. Don't use '!' in your path because it is the ACL delimiter. Use 'escape_early=False' (the default); it doesn't matter whether the 'Escape token input' option is turned on or off in the Property Manager.
Usage
EdgeAuth Class
class EdgeAuth(token_type=None, token_name='__token__', key=None, algorithm='sha256', salt=None, ip=None, payload=None, session_id=None, start_time=None, end_time=None, window_seconds=None, field_delimiter='~', acl_delimiter='!', escape_early=False, verbose=False)
Parameters:
token_type: Select a preset. (Not supported yet)
token_name: Parameter name for the new token. [Default: '__token__']
key: Secret required to generate the token. It must be a hexadecimal digit string with even length.
algorithm: Algorithm used to generate the token ('sha1', 'sha256', or 'md5'). [Default: 'sha256']
salt: Additional data validated by the token but NOT included in the token body. (It will be deprecated)
ip: IP address to restrict this token to. (Troublesome in many cases (roaming, NAT, etc.) so not often used)
payload: Additional text added to the calculated digest.
session_id: The session identifier for single-use tokens or other advanced cases.
start_time: What is the start time? (Use the string 'now' for the current time)
end_time: When does this token expire? end_time overrides window_seconds.
window_seconds: How long is this token valid for?
field_delimiter: Character used to delimit token body fields. [Default: ~]
acl_delimiter: Character used to delimit ACL paths. [Default: !]
escape_early: Causes strings to be URL-encoded before being used.
verbose: Print all parameters.
EdgeAuth's Methods
def generate_url_token(url)
def generate_acl_token(acl)
# Returns the authorization token string.
url: Single URL path (string).
acl: Access Control List; can use the wildcards (*, ?). It can be a string (single path) or an array (multiple paths).
Test
The "/test" directory is only for internal testing.
Others
If you use the Segmented Media Protection behavior in the AMD (Adaptive Media Delivery) product, token_name should be 'hdnts'.
Command
$ python cms_edgeauth.py -k YourEncryptionKey -w 5000 -u /hello/world -x
Use the -h or --help option for details.
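The parameters above (an even-length hex key, a validity window, '~' as the field delimiter, '!' as the ACL delimiter) suggest how such a token is assembled. The sketch below is a simplified, stdlib-only illustration of that scheme: a delimited field body signed with HMAC-SHA256. It is not the library's exact implementation, and make_token is a hypothetical helper:

```python
# Simplified sketch of an Edge Auth style token: a '~'-delimited body
# ("st=...~exp=...~acl=...") HMAC-signed with a hex-decoded key.
# Illustrative only; not akamai-edgeauth's actual logic.
import binascii
import hashlib
import hmac
import time

def make_token(key_hex, acl_paths, window_seconds,
               field_delimiter="~", acl_delimiter="!", start_time=None):
    start = int(start_time if start_time is not None else time.time())
    expiry = start + window_seconds
    acl = acl_delimiter.join(acl_paths)          # '!' joins multiple paths
    body = field_delimiter.join(
        ["st=%d" % start, "exp=%d" % expiry, "acl=%s" % acl])
    key = binascii.unhexlify(key_hex)            # key must be even-length hex
    digest = hmac.new(key, body.encode("utf-8"), hashlib.sha256).hexdigest()
    return body + field_delimiter + "hmac=" + digest

token = make_token("deadbeef",
                   ["/akamai/edgeauth", "/akamai/edgeauth/list/*"],
                   window_seconds=500, start_time=1700000000)
print(token)
```

The server side recomputes the HMAC over the received fields with the shared key and rejects the request if the digest differs or the expiry has passed.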
akamaihttp
A Python Class for Akamai PropertyAn Object oriented implementation of Akamai Property. The advantage of Akamai Property class is application developers need not know about the PAPI calls and their usage. Application developers can just focus on getting their work done on Property Manager configs programmatically using the objects of AkamaiProperty.CredentialsIn order to use this configuration, you need to:Set up your credential files as described in theauthorizationandcredentialssections of the getting started guide on developer.akamai.com.When working through this process you need to give grants for the property manager API and the User Admin API (if you will want to move properties). The section in your configuration file should be called 'papi'.OverviewThe advantage of Akamai Property class is application developers need not know about the PAPI calls and their usage. Application developers can just focus on getting their work done on Property Manager configs programmatically using the objects of AkamaiProperty.Install Dependencies If you are using source code.$ pip install -r requirements.txtInstall pip package available$ pip install akamaipropertyInstantiate the object.>>> from akamaiproperty import AkamaiProperty >>> myProperty = AkamaiProperty("/Users/apadmana/.edgerc","test_bulkseach_update_1","<accountSwitchKey>") >>> myProperty = AkamaiProperty("/Users/apadmana/.edgerc","test_bulkseach_update_1")Print Basic Information>>> myProperty.printPropertyInfo() Property Name: test_bulkseach_update_1 Contract Id: ctr_C-1IE2OHM Group Id: grp_163363 Active Staging Version: 18 Active Production Version: 18Create a new version>>> myProperty.createVersion(18) '78'Get rule Tree>>>myProperty.getRuleTree(18) {'accountId': 'act_B-C-1IE2OH8', 'contractId': 'ctr_C-1IE2OHM', 'groupId': 'grp_163363', 'propertyId': 'prp_605086', 'propertyName': 'test_bulkseach_update_1', 'propertyVersion': 18, 'etag': 'd0d28a6b71e665144955f7f7e1ff214933d119d7', 'rules':.....}Activate the 
config>>>myProperty.activateStaging(18,"testing activation",["[email protected]"]) TrueClass Definitionclass AkamaiProperty(): def __init__(self,edgercLocation, name, accountSwitchKey=None): self.name = name self.contractId = '' self.groupId = '' self.propertyId = '' self.stagingVersion = 0 self.productionVersion = 0 self.accountSwitchKey = '' def printPropertyInfo(self) def getStagingVersion(self) def getProductionVersion(self) def getRuleTree(self,version) def updateRuleTree(self,version,jsondata) def createVersion(self,baseVersion) def activateStaging(self,version,notes,email_list) def activateProduction(self,version,notes,email_list,peer_review_email,customer_email)
akamaikickstart
# Python Code ExamplesThis will guide you through the steps necessary to set up credentials and start playing with the sample code. Note that once you’ve set up credentials for one language, you don’t need to re-create them for another language. If you set up the credentials for python, php will use the same credentials.These instructions expect that you are in the examples/python subdirectory of the github repository.# Authentication and Provisioning The easiest way to walk through the needed provisioning and authentication to get your credentials is by following the instructions on [Authorizing your Client](https://developer.akamai.com/introduction/Prov_Creds.html) from the Getting started guide on our site. Once you have done this, you’ll be able to run the ‘diagnostic tools’ example scripts.## Credential File Creation You can get your credentials set up for use by the sample code by using the gen_edgerc.py command in the examples/python directory:`bash $ ./gen_edgerc.py `When you run gen_edgerc.py with no command line options, the script will create a ‘default’ section in your ~/.edgerc file. For examples other than diagnostic_tools.py you’ll need to pass the name of the appropriate section as an argument, for example this is how you’d set up ccu.py:`bash ./gen_edgerc.py ccu `You can find the correct name for the credentials section on the “section=” line in the example script. If you run the script again for a specific section (including ‘default’) it will overwrite the previous credentials with your new ones.## Diagnostic Tools - diagnostic_tools.py The first example code to test is the diagnostic_tools.py script. The credentials from the creation step give you permission to run the “dig” command from the API.` bash ./diagnostic_tools.py `This simple script runs the ‘locations’ call to find out where the Akamai servers are located. The API can run the ‘dig’ for you from any of these locations. 
Once it has done that, it grabs one at random and makes the dig call from there.By reviewing the code you can see how simple it is to make API calls.All of the sample code in the directory also supports –verbose to see the output on the screen, and/or –debug to see all of the HTTP traffic. These flags can help enormously in figuring out what’s going wrong or how it’s working.` bash ./diagnostic_tools.py--verbose./diagnostic_tools.py--debug`## CCU (Purge) - ccu.py We have a [blog post](https://community.akamai.com/community/developer/blog/2015/08/20/getting-started-with-the-v2-open-ccu-api) with instructions on getting set up with the CCU API. Prerequisites: ccu credentials and edit the filename to a valid file on your system
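For reference, a credentials file of the kind gen_edgerc.py writes is an INI file with one named section per credential set. The layout below is an illustrative sketch with placeholder values, not real credentials; the actual values come from the credential-creation flow described above:

```ini
; Illustrative ~/.edgerc sketch (placeholder values only).
; gen_edgerc.py with no arguments writes the 'default' section;
; passing a section name (e.g. 'ccu') writes that section instead.
[default]
client_secret = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
host = akab-xxxxxxxxxxxxxxxx-xxxxxxxxxxxxxxxx.luna.akamaiapis.net
access_token = akab-xxxxxxxxxxxxxxxx-xxxxxxxxxxxxxxxx
client_token = akab-xxxxxxxxxxxxxxxx-xxxxxxxxxxxxxxxx

[ccu]
client_secret = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
host = akab-xxxxxxxxxxxxxxxx-xxxxxxxxxxxxxxxx.luna.akamaiapis.net
access_token = akab-xxxxxxxxxxxxxxxx-xxxxxxxxxxxxxxxx
client_token = akab-xxxxxxxxxxxxxxxx-xxxxxxxxxxxxxxxx
```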
akamaiproperty
Python SDK for Akamai Property ManagerA Python SDK of Akamai Property Manager. The advantage of SDK is application developers need not know about the PAPI calls and their usage. Application developers can just focus on getting their work done on Property Manager configs programmatically using the objects of AkamaiProperty.CredentialsIn order to use this configuration, you need to:Set up your credential files as described in theauthorizationandcredentialssections of the getting started guide on developer.akamai.com.When working through this process you need to give grants for the property manager API and the User Admin API (if you will want to move properties). The section in your configuration file should be called 'papi'.OverviewThe advantage of Akamai Property class is application developers need not know about the PAPI calls and their usage. Application developers can just focus on getting their work done on Property Manager configs programmatically using the objects of AkamaiProperty.Install pip package available$ pip install akamaipropertyInstantiate the object.>>> from akamaiproperty import AkamaiProperty >>> myProperty = AkamaiProperty("/Users/apadmana/.edgerc","test_bulkseach_update_1","<accountSwitchKey>") >>> myProperty = AkamaiProperty("/Users/apadmana/.edgerc","test_bulkseach_update_1")Print Basic Information>>> myProperty.printPropertyInfo() Property Name: test_bulkseach_update_1 Contract Id: ctr_C-1IE2OHM Group Id: grp_163363 Active Staging Version: 18 Active Production Version: 18Create a new version>>> myProperty.createVersion(18) '78'Get rule Tree>>>myProperty.getRuleTree(18) {'accountId': 'act_B-C-1IE2OH8', 'contractId': 'ctr_C-1IE2OHM', 'groupId': 'grp_163363', 'propertyId': 'prp_605086', 'propertyName': 'test_bulkseach_update_1', 'propertyVersion': 18, 'etag': 'd0d28a6b71e665144955f7f7e1ff214933d119d7', 'rules':.....}Activate the config>>>myProperty.activateStaging(18,"testing activation",["[email protected]"]) True
akamai_purge_cache
Akamai-Purge-Cache: Interactive Script
Current Python script:
"""Module to invoke Akamai Fastpurge via simple CLI utility."""
import os

import click
from fastpurge import FastPurgeClient


@click.command()
@click.option(
    "paths",
    "--path",
    "-p",
    multiple=True,
    help="A single URL to Purge (This option is repeatable for additional URLs)",
)
@click.option(
    "--dryrun",
    "-d",
    is_flag=True,
    help="Just print the command and args that will be run and exit",
)
def mgpurge(paths: list[str], dryrun: bool) -> None:
    """Module to invoke Akamai Fastpurge via simple CLI utility.

    :param list[str] paths: List of paths to purge from Akamai cache.
    :param bool dryrun: Just print the command and args that will be run and exit
    """
    # Default: Credentials are read from ~/.edgerc
    # client = FastPurgeClient()
    # Environment: Credentials are read from environment variables
    client = FastPurgeClient(
        auth={
            "client_secret": os.environ["AKAMAI_DEFAULT_CLIENT_SECRET"],
            "host": os.environ["AKAMAI_DEFAULT_HOST"],
            "access_token": os.environ["AKAMAI_DEFAULT_ACCESS_TOKEN"],
            "client_token": os.environ["AKAMAI_DEFAULT_CLIENT_TOKEN"],
        }
    )
    if dryrun:
        print("These paths will be purged:")
        for path in paths:
            click.echo(message=path)
        return
    # Start purge of some URLs
    purge_cmd = client.purge_by_url(urls=paths)
    # purge is a Future; if we want to ensure the purge completed,
    # we can block on the result:
    result = purge_cmd.result()
    print("Purge completed:", result)


if __name__ == "__main__":
    # pylint: disable=no-value-for-parameter
    mgpurge()

TO-DO:
Document local installation
Document local run
Document building
Document publishing to PyPI
Document remote installation and usage
akamai-shared-cloudlets
Akamai shared cloudlets library
Purpose
This Python program implements the API requests that deal with Akamai Shared Cloudlets (for more information about Akamai cloudlets, see https://techdocs.akamai.com/cloudlets/reference/api). It could be used as a building block for any application using the 'shared cloudlets API' (such as the Akamai CLI https://github.com/akamai/cli or the Terraform Akamai provider https://registry.terraform.io/providers/akamai/akamai/latest/docs/guides/get_started_cloudlets).
Using it
Prerequisites
You need Akamai credentials. To get them, see the https://techdocs.akamai.com/developer/docs/set-up-authentication-credentials documentation. You also need Python 3.8+ (it should work with older versions, but 3.8 is the oldest one that I tested with).
Run
There are two basic ways to work with the app: import it and make it part of your own code, or use its limited 'CLI' capability (which doesn't provide all the requests, but may help you find some basic information anyway).
Getting it
It is recommended to use this app in a virtual environment (especially for your development needs). Then install the app from PyPI. The following command installs the app from TEST PyPI:
TEST PyPI: python3 -m pip install --index-url https://test.pypi.org/simple/ akamai-shared-cloudlets
PROD PyPI: python3 -m pip install akamai-shared-cloudlets
Poetry
To add the library to your own project as a dependency, if you happen to use Poetry to manage your dependencies:
poetry add akamai-shared-cloudlets
Using it
The next step would be to import it (example):
from akamai_shared_cloudlets import akamai_api_requests_abstractions as akamai_api
And finally you're ready to use the app:
print(akamai_api.list_cloudlets("~/.edgerc"))
The example above does not do very much, but it shows how to start using the app.
Using it as CLI
Issuing the following command:
cloudlets list-cloudlets
would produce this output (for example; it may be different in your case):
Sending request to Akamai...
[{'cloudletType': 'ER', 'cloudletName': 'EDGE_REDIRECTOR'}]
Help is provided when the cloudlets command is issued without any parameters, or via cloudlets --help
akami
No description available on PyPI.
akamodel
UNKNOWN
akande
Àkàndé
Àkàndé is an advanced voice assistant built in Python, leveraging OpenAI's GPT models for natural language understanding and response generation. Àkàndé has been enhanced to include a caching mechanism for efficient response retrieval and the ability to generate PDF summaries of interactions, making it ideal for both personal assistance and executive briefing purposes.
Features
Natural Language Understanding: utilizes OpenAI's GPT models to understand and generate human-like responses.
PDF Summary Generation: generates PDF summaries of voice interactions, including a question header, the AI-generated response, and an accompanying logo.
Caching Mechanism: implements a SQLite-based caching system to store and retrieve past queries and responses, reducing API calls and improving response times.
Voice Recognition: integrates with speech recognition libraries to support voice input.
Text-to-Speech: converts text responses into speech, providing an interactive voice-based user experience.
Setup
Prerequisites: Python 3.8+; Pipenv or virtualenv.
Installation
1. Clone the repository
git clone https://github.com/sebastienrousseau/akande
cd akande
2. Install dependencies
pipenv install  # If using pipenv
# or
python -m venv venv
source venv/bin/activate  # On Windows use `venv\Scripts\activate`
pip install -r requirements.txt
3. Set up environment variables
Copy .env.example to .env and fill in your OpenAI API key and other configurations.
OPENAI_API_KEY=xxxxxxxxxx
4. Run Àkàndé
pipenv run python -m akande  # If using pipenv
# or
python -m akande
Usage
After starting Àkàndé, simply follow the voice prompts to ask questions. Àkàndé will respond verbally and generate a PDF summary for each interaction in the specified output directory.
Contributing
Pull requests are welcome. See CONTRIBUTING.md for guidelines.
License
This project is licensed under the MIT license; see the LICENSE file for details.
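The SQLite-based cache described above (store past queries and responses, serve repeats without an API call) can be sketched roughly as follows. The table layout and the ResponseCache/answer names are illustrative guesses, not Àkàndé's actual schema or API:

```python
# Rough sketch of a SQLite-backed question/answer cache like the one
# described above. Schema and names are illustrative, not Àkàndé's.
import sqlite3

class ResponseCache:
    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS cache ("
            "  question TEXT PRIMARY KEY,"
            "  answer   TEXT NOT NULL)")

    def get(self, question):
        row = self.conn.execute(
            "SELECT answer FROM cache WHERE question = ?",
            (question,)).fetchone()
        return row[0] if row else None

    def put(self, question, answer):
        self.conn.execute(
            "INSERT OR REPLACE INTO cache (question, answer) VALUES (?, ?)",
            (question, answer))
        self.conn.commit()

def answer(cache, question, call_api):
    """Return a cached answer if present; otherwise call the API once."""
    cached = cache.get(question)
    if cached is not None:
        return cached
    fresh = call_api(question)
    cache.put(question, fresh)
    return fresh

cache = ResponseCache()
calls = []
fake_api = lambda q: calls.append(q) or "a stand-in GPT response"
print(answer(cache, "hello?", fake_api))  # API hit, result stored
print(answer(cache, "hello?", fake_api))  # served from cache
print(len(calls))                         # 1: only one API call was made
```

Keying the cache on the exact question text keeps lookups trivial; a real assistant might normalize the question first so trivial rephrasings also hit the cache.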
ak-androguard
No description available on PyPI.
akane
akane

A very simple, strongly typed library to easily create test cases for anything.

Info

This project is mostly purpose-built to create e2e integration tests for my CLI apps (like goup, for example). I wanted something lean which takes almost no effort to write tests with. Feel free to use it for your own projects, but don't expect a great developer experience.

If you are looking for proper testing libraries, feel free to take a look into the following resources:
- unittest
- Robot Framework
- pytest

Installation

pip install akane

or

pip install git+https://github.com/zekrotja/akane.git

Example

This example is also available in the examples directory.

import os

import akane
from akane.assertions import assert_eq
from akane.procedures import setup, test, teardown


@setup()
def init_env_vars():
    os.environ["FOO"] = "bar"


@test(name="environment variables")
def test_env():
    assert_eq("bar", os.environ["FOO"])


@test(name="environment in shell")
def test_cmd():
    res = akane.exec(("sh", "-c", "echo $FOO"))
    assert_eq("bar\n", res)


@test(name="failing test")
def fails():
    return (False, "whoops")


@teardown(name="environment variables")
def delete_env_vars():
    del os.environ["FOO"]


def main() -> int:
    return akane.run_all()


if __name__ == "__main__":
    exit(main())
akanekopy
Akaneko is both a SFW and NSFW wrapper, there's hentais for you perverts to use, however do understand that I'm the only one working on this, and I hand pick images to add, so you may get repeated images! Use it for your Discord Bot, your Self Made Console Waifu, or whatever it is :3

NOTE: This Readme.md is from the Official Repository.

Changelogs

v1.1.2
- Fixed Discord Bot Example

v1.1.1
- Added Succubus Function (NSFW)
- Removed Loli Function (ToS)

Example(s)

Python:

import akaneko

def function_name():
    # Get SFW Neko Images, uwu
    print("SFW Neko: " + akaneko.sfw.neko())
    # Get Lewd Neko (NSFW), owo
    print("Lewd Neko: " + akaneko.lewdNeko())
    # Lewd Bomb me onii-san~~
    print("Lewd Bomb: " + akaneko.lewdBomb(5))
    # Get other NSFW Images
    print("BDSM: " + akaneko.nsfw.bdsm())
    print("Maid: " + akaneko.nsfw.maid())
    print("Hentai: " + akaneko.nsfw.hentai())

# Call Your Function!
function_name()

Legacy Function(s)

Format: akaneko.module.function()

Examples:
akaneko.nsfw.lewdNeko()   # Example
akaneko.nsfw.lewdBomb(5)  # Meow, I'm Example 2

Function - Description
lewdNeko - NSFW Neko Girls (Cat Girls)
lewdBomb(n) - Sends (n) amount of lewds! :3

SFW Function(s)

Format: akaneko.module.function()

Examples:
akaneko.sfw.foxgirl()  # Awoo!~ Another example!
akaneko.sfw.neko()     # Meow! An Example!

Function - Description
neko - SFW Neko Girls (Cat Girls)
foxgirl - SFW Fox Girls (Thanks to @LamkasDev!)

NSFW Function(s)

Function - Description
ass - I know you like anime ass~ uwu
bdsm - If you don't know what it is, search it up
blowjob - Basically an image of a girl sucking on a sharp blade!
cum - Basically sticky white stuff that is usually milked from sharpies.
doujin - Sends a random doujin page imageURL!
feet - So you like smelly feet huh?
femdom - Female Domination?
foxgirl - Girls that are wannabe foxes, yes
gifs - Basically an animated image, so yes :3
glasses - Girls that wear glasses, uwu~
hentai - Sends a random vanilla hentai imageURL~
netorare - Wow, I won't even question your fetishes.
maid - Maids, Maid Uniforms, etc, you know what maids are :3
masturbation - Solo Queue in CSGO!
orgy - Group Lewd Acts
panties - I mean... just why? You like underwear?
pussy - The genitals of a female, or a cat, you give the meaning.
school - School Uniforms!~ Yatta~!
succubus - Spooky Succubus, oh I'm so scared~ Totally don't suck me~
tentacles - I'm sorry but, why do they look like intestines?
thighs - The top part of your legs, very hot, isn't it?
uglyBastard - The one thing most of us can all agree to hate :)
uniform - Military, Konbini, Work, Nurse Uniforms, etc!~ Sexy~
yuri - Girls on Girls, and Girls only! <3
zettaiRyouiki - That one part of the flesh being squeezed in thigh-highs~ <3

Wallpaper Function(s)

Function - Description
akaneko.sfw.mobileWallpapers() - Fetch a random SFW Wallpaper! (Mobile)
akaneko.sfw.wallpapers() - Fetch a random SFW Wallpaper! (Desktop)
akaneko.nsfw.mobileWallpapers() - Fetch a random NSFW Wallpaper! (Mobile)
akaneko.nsfw.wallpapers() - Fetch a random NSFW Wallpaper! (Desktop)

Discord Bot Example

import discord                    # Import the module
import akaneko                    # import the main module
from discord.ext import commands  # get commands from discord.ext

client = commands.Bot(command_prefix='[PREFIX HERE]')

@client.event                     # the function decorator
async def on_ready():             # on Ready event
    print(f"Ready as {client.user}")  # print the bot's tag when it's ready

@client.command()                 # Make an instance for the command
async def neko(ctx):              # Make the function and pass in `ctx` as the param
    print(akaneko.sfw.neko())

client.run("token")               # token here

Any Bugs?

Open an issue or DM Raphiel#8922.
akantu
Akantu: Swiss-Made Open-Source Finite-Element Library

Akantu means "a little element" in Kinyarwanda, a Bantu language. From now on it is also an open-source object-oriented library which has the ambition to be generic and efficient. Even though the code is written to be generic, Akantu's strengths are in solid mechanics models for fracture and contact simulations.

The full documentation can be found on ReadTheDocs.

Building Akantu

Dependencies

In order to compile Akantu, any compiler fully supporting C++14 should work. In addition some libraries are required:

- CMake (>= 3.5.1)
- Boost (pre-processor and Spirit)
- Eigen3 (if not present the build system will try to download it)

For the python interface:
- Python (>= 3 is recommended)
- pybind11 (if not present the build system will try to download it)

To run parallel simulations:
- MPI
- Scotch

To use the static or implicit dynamic solvers, at least one of the following libraries is needed:
- MUMPS (since this is usually compiled statically, you also need MUMPS's dependencies)
- PETSc

To compile the tests and examples:
- Gmsh
- google-test (if not present the build system will try to download it)

On .deb based systems:

> sudo apt install cmake libboost-dev gmsh libeigen3-dev
# For parallel
> sudo apt install mpi-default-dev libmumps-dev libscotch-dev
# For sequential
> sudo apt install libmumps-seq-dev

Using conda

This works only for sequential computation, since mumps from conda-forge is compiled without MPI support:

> conda create -n akantu
> conda activate akantu
> conda install boost cmake
> conda install -c conda-forge mumps

Using homebrew

> brew install gcc
> brew install [email protected]
> brew tap brewsci/num
> brew install brewsci-mumps --without-brewsci-parmetis

If it does not work, you can edit the url to http://graal.ens-lyon.fr/MUMPS/MUMPS_5.3.5.tar.gz using the command:

> brew edit brewsci/num

Configuring and compilation

Akantu is a CMake project, so to configure it, you can follow the usual way:

> cd akantu
> mkdir build
> cd build
> ccmake ..
  [ set the options that you need ]
> make
> make install

On Mac OS X with homebrew

You will need to specify the compiler explicitly:

> CC=gcc-12 CXX=g++-12 FC=gfortran-12 cmake ..

Considering that homebrew is installed in /opt/homebrew, define the location of the Scotch library path:

> cmake .. -DSCOTCH_LIBRARY="/opt/homebrew/lib/libscotch.dylib;/opt/homebrew/lib/libscotcherr.dylib;/opt/homebrew/lib/libscotcherrexit.dylib"

Specify the path to all MUMPS libraries:

> cmake .. -DMUMPS_DIR=/opt/homebrew/opt/brewsci-mumps

In case the above does not work, specify the MUMPS path manually using (e.g.):

> cmake .. -DMUMPS_LIBRARY_COMMON=/opt/homebrew/opt/brewsci-mumps/lib/libmumps_common.dylib

If compilation does not work, change the path of the failing libraries to the brew downloads in /opt/homebrew/.

Using the python interface

You can install Akantu using pip; this will install a pre-compiled version and works only on Linux machines for now:

> pip install akantu

You can then import the package in a python script as:

import akantu

The python API is similar to the C++ one. If you encounter any problem with the python interface, you are welcome to open a merge request or post an issue on GitLab.

Contributing

Contributing new features, bug fixes

Any contribution is welcome. We are trying to follow a gitflow workflow, so project developers can create branches named features/<name of my feature> or bugfixes/<name of the fix> directly in the main akantu repository. External fellows can fork the project. In both cases the modifications have to be submitted in the form of a merge request.

Asking for help, reporting issues

If you want to ask for help concerning Akantu's compilation, usage, or a problem with the code, do not hesitate to open an issue on GitLab. If you want to contribute and don't know where to start, you are also invited to open an issue.

Examples and Tutorials with the python interface

To help getting started, you can find examples with the source code in the examples sub-folder. If you just want to test the python examples without having to compile the whole project, you can use the akantu-python-examples.tgz tarball.

In addition to the examples, multiple tutorials using the python interface are available as notebooks with a pre-installed version of Akantu on Renku. The tutorials can be tested here:
ak-apkid
APKiD

APKiD gives you information about how an APK was made. It identifies many compilers, packers, obfuscators, and other weird stuff. It's PEiD for Android.

For more information on what this tool can be used for, check out:
- Android Compiler Fingerprinting
- Detecting Pirated and Malicious Android Apps with APKiD
- APKiD: PEiD for Android Apps

Installing

Installation is unfortunately a bit involved until a pull request is merged in a dependency. Here's how you do it:

git clone --recursive -b "v3.10.0" https://github.com/VirusTotal/yara-python.git /tmp/yara-python
cd /tmp/yara-python/yara
curl https://patch-diff.githubusercontent.com/raw/VirusTotal/yara/pull/1073.patch | git am
cd ..
python setup.py build --enable-dex
python setup.py install

Without this patch to Yara, the dexlib1 detection rule will fail, as will any rule relying on string sizes.

If this patch weren't needed, here's how you'd install. First, install yara-python with --enable-dex to compile Yara's DEX module:

# Don't use this method, for now.
#pip install --upgrade wheel
#pip wheel --wheel-dir=/tmp/yara-python --build-option="build" --build-option="--enable-dex" git+https://github.com/VirusTotal/[email protected]
#pip install --no-index --find-links=/tmp/yara-python yara-python

Finally, install APKiD:

pip install apkid

Docker

You can also run APKiD with Docker! Of course, this requires that you have git and Docker installed.

Here's how to use Docker:

git clone https://github.com/rednaga/APKiD
cd APKiD/
docker build . -t rednaga:apkid
docker/apkid.sh ~/reverse/targets/android/example/example.apk
[+] APKiD 2.1.0 :: from RedNaga :: rednaga.io
[*] example.apk!classes.dex
 |-> compiler : dx

Usage

usage: apkid [-h] [-v] [-t TIMEOUT] [-r] [--scan-depth SCAN_DEPTH] [--entry-max-scan-size ENTRY_MAX_SCAN_SIZE] [--typing {magic,filename,none}] [-j] [-o DIR] [FILE [FILE ...]]

APKiD - Android Application Identifier v2.1.0

positional arguments:
  FILE                  apk, dex, or directory

optional arguments:
  -h, --help            show this help message and exit
  -v, --verbose         log debug messages

scanning:
  -t TIMEOUT, --timeout TIMEOUT
                        Yara scan timeout (in seconds)
  -r, --recursive       recurse into subdirectories
  --scan-depth SCAN_DEPTH
                        how deep to go when scanning nested zips
  --entry-max-scan-size ENTRY_MAX_SCAN_SIZE
                        max zip entry size to scan in bytes, 0 = no limit
  --typing {magic,filename,none}
                        method to decide which files to scan

output:
  -j, --json            output scan results in JSON format
  -o DIR, --output-dir DIR
                        write individual results here (implies --json)

Submitting New Packers / Compilers / Obfuscators

If you come across an APK or DEX which APKiD does not recognize, please open a GitHub issue and tell us:
- what you think it is – obfuscated, packed, etc.
- the file hash (either MD5, SHA1, SHA256)

We are open to any type of concept you might have for "something interesting" to detect, so do not limit yourself solely to packers, compilers or obfuscators. If there is an interesting anti-disassembler, anti-vm, anti-* trick, please make an issue.

Pull requests are welcome. If you're submitting a new rule, be sure to include a file hash of the APK / DEX so we can check the rule.

License

This tool is available under a dual license: a commercial one suitable for closed source projects and a GPL license that can be used in open source software.

Depending on your needs, you must choose one of them and follow its policies. A detail of the policies and agreements for each license type is available in the LICENSE.COMMERCIAL and LICENSE.GPL files.

Hacking

If you want to install the latest version in order to make changes, develop your own rules, and so on, simply clone this repository, compile the rules, and install the package in editable mode:

git clone https://github.com/rednaga/APKiD
cd APKiD
./prep-release.py
pip install -e .[dev,test]

If the above doesn't work, due to permission errors dependent on your local machine and where Python has been installed, try specifying the --user flag. This is likely needed if you're not using a virtual environment:

pip install -e .[dev,test] --user

If you update any of the rules, be sure to run prep-release.py to recompile them.

For Maintainers

This section is for package maintainers.

To update the PyPI package:

./prep-release.py readme
rm -f dist/*
python setup.py sdist bdist_wheel
twine upload --repository-url https://upload.pypi.org/legacy/ dist/*

Update the generated README.rst until Pandoc learns how to translate Markdown with images that are links into reStructuredText:

.. image:: https://travis-ci.org/rednaga/APKiD.svg?branch=master
   :target: https://travis-ci.org/rednaga/APKiD
.. image:: https://img.shields.io/pypi/v/apkid.svg
   :target: https://pypi.python.org/pypi/apkid
.. image:: https://img.shields.io/pypi/pyversions/apkid.svg
   :target: https://pypi.python.org/pypi/apkid
.. image:: https://img.shields.io/pypi/format/apkid.svg
   :target: https://pypi.python.org/pypi/apkid
.. image:: https://img.shields.io/pypi/l/apkid.svg
   :target: https://pypi.python.org/pypi/apkid

For more information see Packaging Projects.
ak-apkverify
Apkverify

Jar signature / APK signature v2 verification in pure python (supports RSA, DSA, ECDSA). Requires asn1crypto.

- supports verification of jar signatures (APK signature v1)
- supports verification of APK signature v2
- supports RSA (md5/sha1/sha256/sha512)
- supports RSA+PSS (sha256/sha512)
- supports DSA (sha1/sha256/sha512)
- supports ECDSA (sha256/sha512)
- supports python2/python3
- no build step
- no openssl/cryptography/M2Crypto
- no binary files like so/pyd/dll/dylib

Basic usage (command line):

$ python -m apkverify --path "path.apk"

Read test.py for how to use the API:

#!/usr/bin/python
# -*- coding: utf-8 -*-
from __future__ import unicode_literals, print_function

import os
import sys
import zipfile

try:
    from .apkverify import ApkSignature
except (ValueError, ImportError):
    from apkverify import ApkSignature

if __name__ == '__main__':
    test_dir = os.path.join(os.path.abspath('.'), 'apksig')
    log = open(test_dir + '.py%d.txt' % (sys.version_info[0],), 'wb')
    for filename in os.listdir(test_dir):
        file_path = os.path.join(test_dir, filename)
        if not (os.path.isfile(file_path) and zipfile.is_zipfile(file_path)):
            continue
        print('=' * 79)
        print('File: {}'.format(file_path))
        log_verify = None
        try:
            a = ApkSignature(os.path.abspath(file_path))
            print(a.apkpath)
            signature_version = a.is_sigv2()
            v_auto = a.verify()   # auto check version
            v_ver1 = a.verify(1)  # force check version 1
            v_ver2 = a.verify(2)  # force check version 2
            print('Verify: {}, {}, {}, {}'.format(signature_version, v_auto, v_ver1, v_ver2))
            log_verify = v_ver1, v_ver2
            for line in a.errors:
                print('Error: {}'.format(line))
            all_certs = a.all_certs()
            sig_certs = a.get_certs()
            all_chain = a.get_chains()
            print(all_certs)
            print(sig_certs)
            print(all_chain)
            all_certs = a.all_certs(readable=True)
            sig_certs = a.get_certs(readable=True)
            all_chain = a.get_chains(readable=True)
            print(all_certs)
            print(sig_certs)
            print(all_chain)
            for one_chain in all_chain:  # signing info (usually only one)
                print('\t[chain]'.ljust(79, '-'))
                for i in range(0, len(one_chain)):  # certificate chain of the signature
                    cert_prt, cert_sub, cert_iss = one_chain[i]
                    print('\t\t[%2d] [certprt]' % i, cert_prt)
                    print('\t\t\t[subject]', cert_sub)
                    print('\t\t\t[ issuer]', cert_iss)
        except Exception as e:
            import logging
            logging.exception(e)
            print(e)
            log_verify = type(e)
        log.write(('%s\t%s\n' % (log_verify, filename)).encode('utf8'))
        log.flush()
    log.close()

'''
(False, False) empty-unsigned.apk
(False, False) golden-aligned-in.apk
(True, True) golden-aligned-out.apk
(True, False) golden-aligned-v1-out.apk
(True, True) golden-aligned-v1v2-out.apk
(False, True) golden-aligned-v2-out.apk
(False, False) golden-legacy-aligned-in.apk
(True, True) golden-legacy-aligned-out.apk
(True, False) golden-legacy-aligned-v1-out.apk
(True, True) golden-legacy-aligned-v1v2-out.apk
(False, True) golden-legacy-aligned-v2-out.apk
(True, True) golden-rsa-minSdkVersion-1-out.apk
(True, True) golden-rsa-minSdkVersion-18-out.apk
(True, True) golden-rsa-minSdkVersion-24-out.apk
(True, True) golden-rsa-out.apk
(False, False) golden-unaligned-in.apk
(True, True) golden-unaligned-out.apk
(True, False) golden-unaligned-v1-out.apk
(True, True) golden-unaligned-v1v2-out.apk
(False, True) golden-unaligned-v2-out.apk
(True, False) mismatched-compression-method.apk
(True, True) original.apk
(True, True) targetSandboxVersion-2.apk
(True, True) two-signers-second-signer-v2-broken.apk
(True, True) two-signers.apk
(False, False) unsigned-targetSandboxVersion-2.apk
(True, False) v1-only-empty.apk
(True, False) v1-only-max-sized-eocd-comment.apk
(True, False) v1-only-pkcs7-cert-bag-first-cert-not-used.apk
(True, False) v1-only-targetSandboxVersion-2.apk
(True, False) v1-only-two-signers.apk
(True, False) v1-only-with-cr-in-entry-name.apk
(True, False) v1-only-with-dsa-sha1-1.2.840.10040.4.1-1024.apk
(True, False) v1-only-with-dsa-sha1-1.2.840.10040.4.1-2048.apk
(True, False) v1-only-with-dsa-sha1-1.2.840.10040.4.1-3072.apk
(True, False) v1-only-with-dsa-sha1-1.2.840.10040.4.3-1024.apk
(True, False) v1-only-with-dsa-sha1-1.2.840.10040.4.3-2048.apk
(True, False) v1-only-with-dsa-sha1-1.2.840.10040.4.3-3072.apk
(True, False)
v1-only-with-dsa-sha224-1.2.840.10040.4.1-1024.apk(True, False) v1-only-with-dsa-sha224-1.2.840.10040.4.1-2048.apk(True, False) v1-only-with-dsa-sha224-1.2.840.10040.4.1-3072.apk(True, False) v1-only-with-dsa-sha224-2.16.840.1.101.3.4.3.1-1024.apk(True, False) v1-only-with-dsa-sha224-2.16.840.1.101.3.4.3.1-2048.apk(True, False) v1-only-with-dsa-sha224-2.16.840.1.101.3.4.3.1-3072.apk(True, False) v1-only-with-dsa-sha256-1.2.840.10040.4.1-1024.apk(True, False) v1-only-with-dsa-sha256-1.2.840.10040.4.1-2048.apk(True, False) v1-only-with-dsa-sha256-1.2.840.10040.4.1-3072.apk(True, False) v1-only-with-dsa-sha256-2.16.840.1.101.3.4.3.2-1024.apk(True, False) v1-only-with-dsa-sha256-2.16.840.1.101.3.4.3.2-2048.apk(True, False) v1-only-with-dsa-sha256-2.16.840.1.101.3.4.3.2-3072.apk(True, False) v1-only-with-dsa-sha384-2.16.840.1.101.3.4.3.3-1024.apk(True, False) v1-only-with-dsa-sha384-2.16.840.1.101.3.4.3.3-2048.apk(True, False) v1-only-with-dsa-sha384-2.16.840.1.101.3.4.3.3-3072.apk(True, False) v1-only-with-dsa-sha512-2.16.840.1.101.3.4.3.4-1024.apk(True, False) v1-only-with-dsa-sha512-2.16.840.1.101.3.4.3.4-2048.apk(True, False) v1-only-with-dsa-sha512-2.16.840.1.101.3.4.3.4-3072.apk(True, False) v1-only-with-ecdsa-sha1-1.2.840.10045.2.1-p256.apk(True, False) v1-only-with-ecdsa-sha1-1.2.840.10045.2.1-p384.apk(True, False) v1-only-with-ecdsa-sha1-1.2.840.10045.2.1-p521.apk(True, False) v1-only-with-ecdsa-sha1-1.2.840.10045.4.1-p256.apk(True, False) v1-only-with-ecdsa-sha1-1.2.840.10045.4.1-p384.apk(True, False) v1-only-with-ecdsa-sha1-1.2.840.10045.4.1-p521.apk(True, False) v1-only-with-ecdsa-sha224-1.2.840.10045.2.1-p256.apk(True, False) v1-only-with-ecdsa-sha224-1.2.840.10045.2.1-p384.apk(True, False) v1-only-with-ecdsa-sha224-1.2.840.10045.2.1-p521.apk(True, False) v1-only-with-ecdsa-sha224-1.2.840.10045.4.3.1-p256.apk(True, False) v1-only-with-ecdsa-sha224-1.2.840.10045.4.3.1-p384.apk(True, False) v1-only-with-ecdsa-sha224-1.2.840.10045.4.3.1-p521.apk(True, False) 
v1-only-with-ecdsa-sha256-1.2.840.10045.2.1-p256.apk(True, False) v1-only-with-ecdsa-sha256-1.2.840.10045.2.1-p384.apk(True, False) v1-only-with-ecdsa-sha256-1.2.840.10045.2.1-p521.apk(True, False) v1-only-with-ecdsa-sha256-1.2.840.10045.4.3.2-p256.apk(True, False) v1-only-with-ecdsa-sha256-1.2.840.10045.4.3.2-p384.apk(True, False) v1-only-with-ecdsa-sha256-1.2.840.10045.4.3.2-p521.apk(True, False) v1-only-with-ecdsa-sha384-1.2.840.10045.2.1-p256.apk(True, False) v1-only-with-ecdsa-sha384-1.2.840.10045.2.1-p384.apk(True, False) v1-only-with-ecdsa-sha384-1.2.840.10045.2.1-p521.apk(True, False) v1-only-with-ecdsa-sha384-1.2.840.10045.4.3.3-p256.apk(True, False) v1-only-with-ecdsa-sha384-1.2.840.10045.4.3.3-p384.apk(True, False) v1-only-with-ecdsa-sha384-1.2.840.10045.4.3.3-p521.apk(True, False) v1-only-with-ecdsa-sha512-1.2.840.10045.2.1-p256.apk(True, False) v1-only-with-ecdsa-sha512-1.2.840.10045.2.1-p384.apk(True, False) v1-only-with-ecdsa-sha512-1.2.840.10045.2.1-p521.apk(True, False) v1-only-with-ecdsa-sha512-1.2.840.10045.4.3.4-p256.apk(True, False) v1-only-with-ecdsa-sha512-1.2.840.10045.4.3.4-p384.apk(True, False) v1-only-with-ecdsa-sha512-1.2.840.10045.4.3.4-p521.apk(True, False) v1-only-with-lf-in-entry-name.apk(True, False) v1-only-with-nul-in-entry-name.apk(True, False) v1-only-with-rsa-1024-cert-not-der.apk(True, False) v1-only-with-rsa-1024-cert-not-der2.apk(True, False) v1-only-with-rsa-1024.apk(True, False) v1-only-with-rsa-pkcs1-md5-1.2.840.113549.1.1.1-1024.apk(True, False) v1-only-with-rsa-pkcs1-md5-1.2.840.113549.1.1.1-16384.apk(True, False) v1-only-with-rsa-pkcs1-md5-1.2.840.113549.1.1.1-2048.apk(True, False) v1-only-with-rsa-pkcs1-md5-1.2.840.113549.1.1.1-3072.apk(True, False) v1-only-with-rsa-pkcs1-md5-1.2.840.113549.1.1.1-4096.apk(True, False) v1-only-with-rsa-pkcs1-md5-1.2.840.113549.1.1.1-8192.apk(True, False) v1-only-with-rsa-pkcs1-md5-1.2.840.113549.1.1.4-1024.apk(True, False) v1-only-with-rsa-pkcs1-md5-1.2.840.113549.1.1.4-16384.apk(True, 
False) v1-only-with-rsa-pkcs1-md5-1.2.840.113549.1.1.4-2048.apk(True, False) v1-only-with-rsa-pkcs1-md5-1.2.840.113549.1.1.4-3072.apk(True, False) v1-only-with-rsa-pkcs1-md5-1.2.840.113549.1.1.4-4096.apk(True, False) v1-only-with-rsa-pkcs1-md5-1.2.840.113549.1.1.4-8192.apk(True, False) v1-only-with-rsa-pkcs1-sha1-1.2.840.113549.1.1.1-1024.apk(True, False) v1-only-with-rsa-pkcs1-sha1-1.2.840.113549.1.1.1-16384.apk(True, False) v1-only-with-rsa-pkcs1-sha1-1.2.840.113549.1.1.1-2048.apk(True, False) v1-only-with-rsa-pkcs1-sha1-1.2.840.113549.1.1.1-3072.apk(True, False) v1-only-with-rsa-pkcs1-sha1-1.2.840.113549.1.1.1-4096.apk(True, False) v1-only-with-rsa-pkcs1-sha1-1.2.840.113549.1.1.1-8192.apk(True, False) v1-only-with-rsa-pkcs1-sha1-1.2.840.113549.1.1.5-1024.apk(True, False) v1-only-with-rsa-pkcs1-sha1-1.2.840.113549.1.1.5-16384.apk(True, False) v1-only-with-rsa-pkcs1-sha1-1.2.840.113549.1.1.5-2048.apk(True, False) v1-only-with-rsa-pkcs1-sha1-1.2.840.113549.1.1.5-3072.apk(True, False) v1-only-with-rsa-pkcs1-sha1-1.2.840.113549.1.1.5-4096.apk(True, False) v1-only-with-rsa-pkcs1-sha1-1.2.840.113549.1.1.5-8192.apk(True, False) v1-only-with-rsa-pkcs1-sha224-1.2.840.113549.1.1.1-1024.apk(True, False) v1-only-with-rsa-pkcs1-sha224-1.2.840.113549.1.1.1-16384.apk(True, False) v1-only-with-rsa-pkcs1-sha224-1.2.840.113549.1.1.1-2048.apk(True, False) v1-only-with-rsa-pkcs1-sha224-1.2.840.113549.1.1.1-3072.apk(True, False) v1-only-with-rsa-pkcs1-sha224-1.2.840.113549.1.1.1-4096.apk(True, False) v1-only-with-rsa-pkcs1-sha224-1.2.840.113549.1.1.1-8192.apk(True, False) v1-only-with-rsa-pkcs1-sha224-1.2.840.113549.1.1.14-1024.apk(True, False) v1-only-with-rsa-pkcs1-sha224-1.2.840.113549.1.1.14-16384.apk(True, False) v1-only-with-rsa-pkcs1-sha224-1.2.840.113549.1.1.14-2048.apk(True, False) v1-only-with-rsa-pkcs1-sha224-1.2.840.113549.1.1.14-3072.apk(True, False) v1-only-with-rsa-pkcs1-sha224-1.2.840.113549.1.1.14-4096.apk(True, False) 
v1-only-with-rsa-pkcs1-sha224-1.2.840.113549.1.1.14-8192.apk(True, False) v1-only-with-rsa-pkcs1-sha256-1.2.840.113549.1.1.1-1024.apk(True, False) v1-only-with-rsa-pkcs1-sha256-1.2.840.113549.1.1.1-16384.apk(True, False) v1-only-with-rsa-pkcs1-sha256-1.2.840.113549.1.1.1-2048.apk(True, False) v1-only-with-rsa-pkcs1-sha256-1.2.840.113549.1.1.1-3072.apk(True, False) v1-only-with-rsa-pkcs1-sha256-1.2.840.113549.1.1.1-4096.apk(True, False) v1-only-with-rsa-pkcs1-sha256-1.2.840.113549.1.1.1-8192.apk(True, False) v1-only-with-rsa-pkcs1-sha256-1.2.840.113549.1.1.11-1024.apk(True, False) v1-only-with-rsa-pkcs1-sha256-1.2.840.113549.1.1.11-16384.apk(True, False) v1-only-with-rsa-pkcs1-sha256-1.2.840.113549.1.1.11-2048.apk(True, False) v1-only-with-rsa-pkcs1-sha256-1.2.840.113549.1.1.11-3072.apk(True, False) v1-only-with-rsa-pkcs1-sha256-1.2.840.113549.1.1.11-4096.apk(True, False) v1-only-with-rsa-pkcs1-sha256-1.2.840.113549.1.1.11-8192.apk(True, False) v1-only-with-rsa-pkcs1-sha384-1.2.840.113549.1.1.1-1024.apk(True, False) v1-only-with-rsa-pkcs1-sha384-1.2.840.113549.1.1.1-16384.apk(True, False) v1-only-with-rsa-pkcs1-sha384-1.2.840.113549.1.1.1-2048.apk(True, False) v1-only-with-rsa-pkcs1-sha384-1.2.840.113549.1.1.1-3072.apk(True, False) v1-only-with-rsa-pkcs1-sha384-1.2.840.113549.1.1.1-4096.apk(True, False) v1-only-with-rsa-pkcs1-sha384-1.2.840.113549.1.1.1-8192.apk(True, False) v1-only-with-rsa-pkcs1-sha384-1.2.840.113549.1.1.12-1024.apk(True, False) v1-only-with-rsa-pkcs1-sha384-1.2.840.113549.1.1.12-16384.apk(True, False) v1-only-with-rsa-pkcs1-sha384-1.2.840.113549.1.1.12-2048.apk(True, False) v1-only-with-rsa-pkcs1-sha384-1.2.840.113549.1.1.12-3072.apk(True, False) v1-only-with-rsa-pkcs1-sha384-1.2.840.113549.1.1.12-4096.apk(True, False) v1-only-with-rsa-pkcs1-sha384-1.2.840.113549.1.1.12-8192.apk(True, False) v1-only-with-rsa-pkcs1-sha512-1.2.840.113549.1.1.1-1024.apk(True, False) v1-only-with-rsa-pkcs1-sha512-1.2.840.113549.1.1.1-16384.apk(True, False) 
v1-only-with-rsa-pkcs1-sha512-1.2.840.113549.1.1.1-2048.apk(True, False) v1-only-with-rsa-pkcs1-sha512-1.2.840.113549.1.1.1-3072.apk(True, False) v1-only-with-rsa-pkcs1-sha512-1.2.840.113549.1.1.1-4096.apk(True, False) v1-only-with-rsa-pkcs1-sha512-1.2.840.113549.1.1.1-8192.apk(True, False) v1-only-with-rsa-pkcs1-sha512-1.2.840.113549.1.1.13-1024.apk(True, False) v1-only-with-rsa-pkcs1-sha512-1.2.840.113549.1.1.13-16384.apk(True, False) v1-only-with-rsa-pkcs1-sha512-1.2.840.113549.1.1.13-2048.apk(True, False) v1-only-with-rsa-pkcs1-sha512-1.2.840.113549.1.1.13-3072.apk(True, False) v1-only-with-rsa-pkcs1-sha512-1.2.840.113549.1.1.13-4096.apk(True, False) v1-only-with-rsa-pkcs1-sha512-1.2.840.113549.1.1.13-8192.apk(False, False) v1-only-with-signed-attrs-missing-content-type.apk(False, False) v1-only-with-signed-attrs-missing-digest.apk(False, False) v1-only-with-signed-attrs-multiple-good-digests.apk(True, False) v1-only-with-signed-attrs-signerInfo1-good-signerInfo2-good.apk(True, False) v1-only-with-signed-attrs-signerInfo1-missing-content-type-signerInfo2-good.apk(True, False) v1-only-with-signed-attrs-signerInfo1-missing-digest-signerInfo2-good.apk(True, False) v1-only-with-signed-attrs-signerInfo1-multiple-good-digests-signerInfo2-good.apk(True, False) v1-only-with-signed-attrs-signerInfo1-wrong-content-type-signerInfo2-good.apk(True, False) v1-only-with-signed-attrs-signerInfo1-wrong-digest-signerInfo2-good.apk(True, False) v1-only-with-signed-attrs-signerInfo1-wrong-order-signerInfo2-good.apk(True, False) v1-only-with-signed-attrs-signerInfo1-wrong-signature-signerInfo2-good.apk(False, False) v1-only-with-signed-attrs-wrong-content-type.apk(False, False) v1-only-with-signed-attrs-wrong-digest.apk(False, False) v1-only-with-signed-attrs-wrong-order.apk(False, False) v1-only-with-signed-attrs-wrong-signature.apk(False, False) v1-only-with-signed-attrs.apk(False, False) v1-sha1-sha256-manifest-and-sf-with-sha1-wrong-in-manifest.apk(False, False) 
v1-sha1-sha256-manifest-and-sf-with-sha1-wrong-in-sf.apk(False, False) v1-sha1-sha256-manifest-and-sf-with-sha256-wrong-in-manifest.apk(False, False) v1-sha1-sha256-manifest-and-sf-with-sha256-wrong-in-sf.apk(True, False) v1-sha1-sha256-manifest-and-sf.apk(True, False) v1-sha1-sha256-manifest-and-sha1-sf.apk(True, False) v1-with-apk-sig-block-but-without-apk-sig-scheme-v2-block.apk(False, False) v2-only-apk-sig-block-size-mismatch.apk(False, False) v2-only-cert-and-public-key-mismatch.apk<class 'zipfile.BadZipFile'> v2-only-garbage-between-cd-and-eocd.apk(False, True) v2-only-max-sized-eocd-comment.apk(False, True) v2-only-missing-classes.dex.apk(False, False) v2-only-no-certs-in-sig.apk(False, False) v2-only-signatures-and-digests-block-mismatch.apk(False, True) v2-only-targetSandboxVersion-2.apk(False, True) v2-only-targetSandboxVersion-3.apk<class 'zipfile.BadZipFile'> v2-only-truncated-cd.apk(False, True) v2-only-two-signers-second-signer-no-sig.apk(False, True) v2-only-two-signers-second-signer-no-supported-sig.apk(False, True) v2-only-two-signers.apk(False, True) v2-only-unknown-pair-in-apk-sig-block.apk(False, False) v2-only-with-dsa-sha256-1024-sig-does-not-verify.apk(False, True) v2-only-with-dsa-sha256-1024.apk(False, True) v2-only-with-dsa-sha256-2048.apk(False, True) v2-only-with-dsa-sha256-3072.apk(False, False) v2-only-with-ecdsa-sha256-p256-digest-mismatch.apk(False, False) v2-only-with-ecdsa-sha256-p256-sig-does-not-verify.apk(False, True) v2-only-with-ecdsa-sha256-p256.apk(False, True) v2-only-with-ecdsa-sha256-p384.apk(False, True) v2-only-with-ecdsa-sha256-p521.apk(False, True) v2-only-with-ecdsa-sha512-p256.apk(False, True) v2-only-with-ecdsa-sha512-p384.apk(False, True) v2-only-with-ecdsa-sha512-p521.apk(False, False) v2-only-with-ignorable-unsupported-sig-algs.apk(False, True) v2-only-with-rsa-pkcs1-sha256-1024-cert-not-der.apk(False, True) v2-only-with-rsa-pkcs1-sha256-1024.apk(False, True) v2-only-with-rsa-pkcs1-sha256-16384.apk(False, 
False) v2-only-with-rsa-pkcs1-sha256-2048-sig-does-not-verify.apk(False, True) v2-only-with-rsa-pkcs1-sha256-2048.apk(False, True) v2-only-with-rsa-pkcs1-sha256-3072.apk(False, True) v2-only-with-rsa-pkcs1-sha256-4096.apk(False, True) v2-only-with-rsa-pkcs1-sha256-8192.apk(False, True) v2-only-with-rsa-pkcs1-sha512-1024.apk(False, True) v2-only-with-rsa-pkcs1-sha512-16384.apk(False, True) v2-only-with-rsa-pkcs1-sha512-2048.apk(False, True) v2-only-with-rsa-pkcs1-sha512-3072.apk(False, False) v2-only-with-rsa-pkcs1-sha512-4096-digest-mismatch.apk(False, True) v2-only-with-rsa-pkcs1-sha512-4096.apk(False, True) v2-only-with-rsa-pkcs1-sha512-8192.apk(False, True) v2-only-with-rsa-pss-sha256-1024.apk(False, True) v2-only-with-rsa-pss-sha256-16384.apk(False, False) v2-only-with-rsa-pss-sha256-2048-sig-does-not-verify.apk(False, True) v2-only-with-rsa-pss-sha256-2048.apk(False, True) v2-only-with-rsa-pss-sha256-3072.apk(False, True) v2-only-with-rsa-pss-sha256-4096.apk(False, True) v2-only-with-rsa-pss-sha256-8192.apk(False, True) v2-only-with-rsa-pss-sha512-16384.apk(False, True) v2-only-with-rsa-pss-sha512-2048.apk(False, True) v2-only-with-rsa-pss-sha512-3072.apk(False, True) v2-only-with-rsa-pss-sha512-4096.apk(False, True) v2-only-with-rsa-pss-sha512-8192.apk(False, False) v2-only-wrong-apk-sig-block-magic.apk(True, False) v2-stripped-with-ignorable-signing-schemes.apk(True, False) v2-stripped.apk<class 'NotImplementedError'> weird-compression-method.apk'''
akapriori
Python implementation of the Apriori Algorithm.
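The description above is terse, so as a reference point here is a generic, self-contained sketch of the Apriori frequent-itemset algorithm itself. Note this illustrates the algorithm, not akapriori's actual API, which is not documented here:

```python
from itertools import chain


def apriori(transactions, min_support):
    """Return all itemsets whose support >= min_support (a fraction of transactions).

    Textbook Apriori: grow candidate itemsets level by level, keeping only
    those that remain frequent (every superset of an infrequent set is infrequent).
    """
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t) / n

    # Level 1: frequent single items.
    items = set(chain.from_iterable(transactions))
    frequent = {frozenset([i]) for i in items
                if support(frozenset([i])) >= min_support}
    result = set(frequent)

    # Level k: join frequent (k-1)-itemsets, then prune by support.
    k = 2
    while frequent:
        candidates = {a | b for a in frequent for b in frequent if len(a | b) == k}
        frequent = {c for c in candidates if support(c) >= min_support}
        result |= frequent
        k += 1
    return result


txns = [{"milk", "bread"}, {"milk", "eggs"}, {"milk", "bread", "eggs"}]
print(sorted(map(sorted, apriori(txns, min_support=2 / 3))))
```

With these three transactions, {milk, bread} and {milk, eggs} are frequent at support 2/3, while {bread, eggs} appears only once and is pruned.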
akapy
akapy
akaraike
Password Generator

A password generator module made by the awesome folks at Python Abia.

This is a simple open-source Python library that helps you generate passwords quickly and neatly. It can be integrated into any application that needs to generate passwords or secret keys.

Features

- Set the length of the key
- Choose the types of characters, or a mixture of characters, to use

Usage

from akaraike import PasswordGenerator

# Create an object of the PasswordGenerator class
pg = PasswordGenerator()

# Set length of characters
pg.set_charset_length(length=5)

'''
Set types of characters as a list.
Acceptable values are:
- lowers for lowercase alphabets
- uppers for UPPERCASE alphabets
- numbers for 0123456789
- specials for @$;:,._
'''
pg.set_charset_types(type=['numbers', 'specials'])

# Print the generated password
print(pg.generate_password())

Contributing to Akaraike

To install akaraike, along with the tools you need to develop and run tests, run the following in your virtualenv:

$ pip install -e .[dev]
akara-python
No description available on PyPI.
akari
Tag your anime photos effortlessly.
akari is a work-in-progress Python program to manage anime artwork.
- It uses iqdb to reverse-search images and tags your images automatically
- You can add new tags or remove tags manually
- You can select some tags and filter your images accordingly
- Has a built-in image viewer for fullscreen viewing

Requirements
- requests
- BeautifulSoup 4
- PyQt5

Installation
For a user installation, simply run:

pip3 install --user akari

Then add $HOME/.local/bin to your $PATH:

echo PATH=\"\$PATH:\$HOME/.local/bin\" >> $HOME/.bashrc
source $HOME/.bashrc

Alternatively, you can do a system-wide installation:

sudo pip3 install akari

Usage

usage: akari [-h] [-s /path/to/dir] [-g]

optional arguments:
  -h, --help            show this help message and exit
  -s /path/to/dir, --scan /path/to/dir
                        Scan directory for new images
  -g, --gui             Start the GUI
  -v, --version         Displays the version

TODO
akari-client
akari_client
A client library for using AKARI.

Installation

pip install akari_client

Note: the AKARI device itself must be fully set up before using akari_client. See the online documentation for details.

Getting Started: showing text on the display

```python
with AkariClient() as akari:
    m5 = akari.m5stack
    m5.set_display_text("Hello, world!")
```

For more samples and usage, see the online documentation.

Documentation
https://AkariGroup.github.io/docs
akari-dl
akari-dl downloads anime video files from direct-download websites based on user configuration, avoiding more cumbersome methods such as torrenting or downloading manually.
akari-proto
akari_proto
Protobuf definitions for using AKARI over gRPC. For details, see the online documentation and akari_client.
akarsu
akarsu is the New Generation Profiler based on PEP 669. The name of the project comes from the surname of a minstrel named Muhlis Akarsu, and means "stream".

Installation
akarsu can be installed by running pip install akarsu. It requires Python 3.12.0+ to run.

Usage

cat example.py

Output:

```python
def foo():
    x = 1
    isinstance(x, int)
    return x

def bar():
    foo()

bar()
```

akarsu -f example.py

Output:

Count   Event Type   Filename(function)
    1   PY_CALL      example.py(bar)
    1   PY_START     example.py(bar)
    1   PY_CALL      example.py(foo)
    1   PY_START     example.py(foo)
    1   C_CALL       example.py(<built-in function isinstance>)
    1   C_RETURN     example.py(foo)
    1   PY_RETURN    example.py(foo)
    1   PY_RETURN    example.py(bar)

Total number of events: 8
  PY_CALL = 2
  PY_START = 2
  PY_RETURN = 2
  C_CALL = 1
  C_RETURN = 1

If you want to show only the function calls in the output, you can use the -c or --calls argument.

akarsu -c -f example.py

Output:

Count   Event Type   Filename(function)
    1   PY_CALL      example.py(bar)
    1   PY_CALL      example.py(foo)
    1   C_CALL       example.py(<built-in function isinstance>)

Total number of events: 3
  PY_CALL = 2
  C_CALL = 1
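akarsu builds on PEP 669's sys.monitoring API, which is why it requires Python 3.12+. As a rough illustration of that mechanism — not of akarsu's internals — here is a minimal sketch that counts PY_START events while a function runs; the helper name and the use of the pre-defined PROFILER_ID tool slot are our own choices.

```python
import sys

counts = {}

def profile_py_start(fn):
    """Count PY_START events fired while running fn (requires Python 3.12+)."""
    mon = sys.monitoring
    tool = mon.PROFILER_ID  # one of the pre-defined tool id slots
    mon.use_tool_id(tool, "mini-profiler")

    def on_start(code, instruction_offset):
        # called whenever a Python function starts executing
        counts[code.co_name] = counts.get(code.co_name, 0) + 1

    mon.register_callback(tool, mon.events.PY_START, on_start)
    mon.set_events(tool, mon.events.PY_START)
    try:
        fn()
    finally:
        mon.set_events(tool, mon.events.NO_EVENTS)
        mon.free_tool_id(tool)

def foo():
    return isinstance(1, int)

def bar():
    foo()

if sys.version_info >= (3, 12):
    profile_py_start(bar)
    print(counts)
```

Note that isinstance does not show up here: it is a C call, which needs the separate C_CALL event that akarsu also tracks.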
akasa
About Laravel
Note: This repository contains the core code of the Laravel framework. If you want to build an application using Laravel, visit the main Laravel repository.
Laravel is a web application framework with expressive, elegant syntax. We believe development must be an enjoyable, creative experience to be truly fulfilling. Laravel attempts to take the pain out of development by easing common tasks used in the majority of web projects, such as:
- Simple, fast routing engine.
- Powerful dependency injection container.
- Multiple back-ends for session and cache storage.
- Database agnostic schema migrations.
- Robust background job processing.
- Real-time event broadcasting.
Laravel is accessible, yet powerful, providing tools needed for large, robust applications. A superb combination of simplicity, elegance, and innovation gives you a complete toolset required to build any application with which you are tasked.

Learning Laravel
Laravel has the most extensive and thorough documentation and video tutorial library of any modern web application framework. The Laravel documentation is in-depth and complete, making it a breeze to get started learning the framework.
You may also try the Laravel Bootcamp, where you will be guided through building a modern Laravel application from scratch.
If you're not in the mood to read, Laracasts contains over 1100 video tutorials covering a range of topics including Laravel, modern PHP, unit testing, JavaScript, and more. Boost the skill level of yourself and your entire team by digging into our comprehensive video library.

Contributing
Thank you for considering contributing to the Laravel framework! The contribution guide can be found in the Laravel documentation.

Code of Conduct
In order to ensure that the Laravel community is welcoming to all, please review and abide by the Code of Conduct.

Security Vulnerabilities
Please review our security policy on how to report security vulnerabilities.

License
The Laravel framework is open-sourced software licensed under the MIT license.
akasha
Akasha-Terminal
Python module for providing game information from different sources.
akasha-terminal
akasha
Akasha simplifies document-based Question Answering (QA) by harnessing the power of Large Language Models to accurately answer your queries while searching through your provided documents. It uses Retrieval Augmented Generation (RAG) to make the LLM generate correct information from documents.
With Akasha, you have the flexibility to choose from a variety of language models, embedding models, and search types. Adjusting these parameters is straightforward, allowing you to optimize your approach and discover the most effective methods for obtaining accurate answers from Large Language Models.
For the Chinese manual, please visit manual.

Installation
We recommend using Python 3.8 to run our akasha package. You can use Anaconda to create a virtual environment.

# create environment
conda create --name py3-8 python=3.8
activate py3-8

# install akasha
pip install akasha-terminal

API Keys

OPENAI
If you want to use OpenAI models or embeddings, go to openai to get the API key. You can either save OPENAI_API_KEY=your api key into a .env file in the current working directory, or set it as an environment variable using export in bash or os.environ in Python.

# set an environment variable
export OPENAI_API_KEY="your api key"

AZURE OPENAI
If you want to use Azure OpenAI, go to azureAI and get your own Language API base URL and key. Also, remember to deploy all the models in Azure OpenAI Studio; the deployment name should be the same as the model name.
Save OPENAI_API_KEY=your azure key, OPENAI_API_BASE=your Language API base url, OPENAI_API_TYPE=azure, OPENAI_API_VERSION=2023-05-15 into a .env file in the current working directory.
If you want to save both the OpenAI key and the Azure key at the same time, you can also use AZURE_API_KEY, AZURE_API_BASE, AZURE_API_TYPE, AZURE_API_VERSION:

## .env file
AZURE_API_KEY={your azure key}
AZURE_API_BASE={your Language API base url}
AZURE_API_TYPE=azure
AZURE_API_VERSION=2023-05-15

And now we can run akasha in Python:

```python
# PYTHON 3.8
import akasha

ak = akasha.Doc_QA(model="openai:gpt-3.5-turbo")
response = ak.get_response(dir_path, prompt)
```

LLAMA-2
If you want to use the original meta-llama model, you need to both register on huggingface to get an access token and request access on meta-llama. Remember, the account on Hugging Face and the email you use to request access to Meta-Llama must be the same, so that you can download models from Hugging Face once your account is approved.
You should see "Gated model: You have been granted access to this model" once your account is approved.
Again, you can either save HUGGINGFACEHUB_API_TOKEN=your api key into a .env file in the current working directory, or set it as an environment variable using export in bash or os.environ in Python. After you create the Doc_QA() class, you can still change the model you want when you call the function.

# set an environment variable
export HUGGINGFACEHUB_API_TOKEN="your api key"

```python
# PYTHON 3.8
import akasha

ak = akasha.Doc_QA()
response = ak.get_response(dir_path, prompt, model="hf:meta-llama/Llama-2-7b-chat-hf")
```

Example and Parameters

Basic get_response OpenAI example

```python
import akasha
import os

os.environ["OPENAI_API_KEY"] = "your openAI key"

dir_path = "doc/"
prompt = "「塞西莉亞花」的花語是什麼? 「失之交臂的感情」 「赤誠的心」 「浪子的真情」 「無法挽回的愛」"
ak = akasha.Doc_QA()
response = ak.get_response(dir_path, prompt)
print(response)
```

「塞西莉亞花」的花語為「浪子的真情」

Select different embeddings
Using the parameter "embeddings", you can choose different embedding models; the embedding model will be used to store documents into vector storage and to search relevant documents from the prompt. Default is openai:text-embedding-ada-002.
Currently supports openai, huggingface and tensorflowhub.

huggingface example

```python
ak = akasha.Doc_QA(embeddings="huggingface:all-MiniLM-L6-v2")
response = ak.get_response(dir_path, prompt)
```

To use huggingface embedding models, you can type huggingface:model_name or hf:model_name, for example, huggingface:all-MiniLM-L6-v2

Select different models
Using the parameter "model", you can choose different text generation models; default is openai:gpt-3.5-turbo.
Currently supports openai, llama-cpp, huggingface and remote.

1. openai example

```python
ak = akasha.Doc_QA()
ak.get_response(dir_path, prompt,
                embeddings="openai:text-embedding-ada-002",
                model="openai:gpt-3.5-turbo")
```

2. huggingface example

```python
ak = akasha.Doc_QA()
ak.get_response(dir_path, prompt,
                embeddings="huggingface:all-MiniLM-L6-v2",
                model="hf:meta-llama/Llama-2-13b-chat-hf")
```

To use a text generation model from huggingface, for example meta llama, you can type hf:meta-llama/Llama-2-13b-chat-hf

3. llama-cpp example
llama-cpp can use a quantized llama model and run on CPU, after you download or convert a llama-cpp model file using llama-cpp-python.

```python
ak = akasha.Doc_QA()
ak.get_response(dir_path, prompt,
                embeddings="huggingface:all-MiniLM-L6-v2",
                model="llama-cpu:model/llama-2-13b-chat.Q5_K_S.gguf")
```

For example, if the q5 model is in the "model/" directory, you can assign llama-cpu:model/llama-2-13b-chat.Q5_K_S.gguf to load the model.

```python
ak = akasha.Doc_QA()
ak.get_response(dir_path, prompt,
                embeddings="huggingface:all-MiniLM-L6-v2",
                model="llama-gpu:model/llama-2-3b-chat.Q5_K_S.gguf")
```

You can also combine GPU with CPU to run llama-cpp, using llama-gpu:model/llama-2-13b-chat.Q5_K_S.gguf

4.
remote server api example
If you deploy your own language model on another server using TGI (Text Generation Inference), you can use remote:{your LLM api url} to call the model.

```python
ak = akasha.Doc_QA()
ak.get_response(dir_path, prompt, model="remote:http://140.92.60.189:8081")
```

Select different search type
Using the parameter "search_type", you can choose different search methods to find similar documents; default is merge, which is the combination of mmr, svm and tfidf. Currently you can select merge, mmr, svm and tfidf.
- Max Marginal Relevance (mmr) selects similar documents by cosine similarity, but it also considers diversity, so it penalizes documents for closeness to already selected documents.
- Support Vector Machines (svm) uses the input prompt and the document vectors to train an svm model; after training, the svm can be used to score new vectors based on their similarity to the training data.
- Term Frequency–Inverse Document Frequency (tfidf) is a commonly used weighting technique in information retrieval and text mining. TF-IDF is a statistical method used to evaluate the importance of a term in a collection of documents, or in a corpus, with respect to one specific document in the collection.

```python
ak = akasha.Doc_QA(search_type="merge")
akasha.get_response(dir_path, prompt, search_type="mmr")
```

Some models you can use
Please note that for OpenAI models, you need to set the environment variable 'OPENAI_API_KEY', and for most Hugging Face models, a GPU is required to run the models.
However, for .gguf models, you can use a CPU to run them.

```python
openai_model = "openai:gpt-3.5-turbo"   # need environment variable "OPENAI_API_KEY"
huggingface_model = "hf:meta-llama/Llama-2-7b-chat-hf"  # need environment variable "HUGGINGFACEHUB_API_TOKEN" to download meta-llama model
quantized_ch_llama_model = "hf:FlagAlpha/Llama2-Chinese-13b-Chat-4bit"
taiwan_llama_gptq = "hf:weiren119/Taiwan-LLaMa-v1.0-4bits-GPTQ"
mistral = "hf:Mistral-7B-Instruct-v0.2"
mediatek_Breeze = "hf:MediaTek-Research/Breeze-7B-Instruct-64k-v0.1"

### If you want to use llama-cpp to run a model on cpu, you can download gguf versions of models
### from https://huggingface.co/TheBloke/Llama-2-7b-Chat-GGUF and
### from https://huggingface.co/TheBloke/CodeUp-Llama-2-13B-Chat-HF-GGUF ;
### the name behind "llama-gpu:" or "llama-cpu:" is the path of the downloaded .gguf file
llama_cpp_model = "llama-gpu:model/llama-2-13b-chat-hf.Q5_K_S.gguf"
llama_cpp_model = "llama-cpu:model/llama-2-7b-chat.Q5_K_S.gguf"
llama_cpp_chinese_alpaca = "llama-gpu:model/chinese-alpaca-2-7b.Q5_K_S.gguf"
llama_cpp_chinese_alpaca = "llama-cpu:model/chinese-alpaca-2-13b.Q5_K_M.gguf"
chatglm_model = "chatglm:THUDM/chatglm2-6b"
```

Some embeddings you can use
Please note that each embedding model has a different window size; texts over the max sequence length will be truncated and won't be represented in the embedding model.
Rerank_base and rerank_large are not embedding models; instead, they compare the query to each chunk of the documents and return scores that represent the similarity.
As a result, they offer higher accuracy compared to embedding models but may be slower.openai_emd="openai:text-embedding-ada-002"# need environment variable "OPENAI_API_KEY" # 8192 max seq lengthhuggingface_emd="hf:all-MiniLM-L6-v2"text2vec_ch_emd="hf:shibing624/text2vec-base-chinese"# 128 max seq lengthtext2vec_mul_emd="hf:shibing624/text2vec-base-multilingual"# 256 max seq lengthtext2vec_ch_para_emd="hf:shibing624/text2vec-base-chinese-paraphrase"# perform better for long text, 256 max seq lengthbge_en_emd="hf:BAAI/bge-base-en-v1.5"# 512 max seq lengthbge_ch_emd="hf:BAAI/bge-base-zh-v1.5"# 512 max seq lengthrerank_base="rerank:BAAI/bge-reranker-base"# 512 max seq lengthrerank_large="rerank:BAAI/bge-reranker-large"# 512 max seq lengthFunctionsUse chain-of-thought to solve complicated probleminstead of input one single prompt, you can input multiple small stop questions to get better answer.importakashaimportosos.environ["OPENAI_API_KEY"]="your openAI key"dir_path="mic/"queries2=["西門子自有工廠如何朝工業4.0 發展","詳細解釋「工業4.0 成熟度指數」發展路徑的六個成熟度","根據西門子自有工廠朝工業4.0發展,探討其各項工業4.0的成熟度指標"]ak=akasha.Doc_QA()response=ak.chain_of_thought(dir_path,queries2,search_type='svm')print(response)response 1: 西門子自有工廠朝工業4.0發展的方式包括以下幾個方面: 1. 數位化戰略:西門子提出數位化戰略,從工業4.0策略擬定到落地執行,為客戶提供一條龍服務。他們設計數位工廠原型 ,搭配OT、IT方案,並使用西門子的MindSphere工業物聯網平台,發展數據可視化和數據分析相關應用。 2. 跨領域合作:西門子近年積極與雲服務商、系統商等跨領域合作,推動智慧製造解決方案。此外,他們也與SAP進行ERP整合,專注於物聯網領域。 3. 虛實整合:西門子在中國大陸成都生產研發基地的案例中,從研發、生產、訂單管理、供應商管理到物流作業 ,實現了整條價值鏈的虛實整合。他們不斷提高配料、傳輸、檢測等流程的自動化程度。 總體而言,西門子通過數位化戰略、跨領域合作和虛實整合等方式,推動自有工廠朝工業4.0發 展。他們致力於提升生產效率和效能,並利用先進的技術和解決方案實現智慧工廠的建設。 response 2: 「工業4.0成熟度指數」的發展路徑劃分為六個成熟度,分別是電腦化、可連結、可視化、可分析、可預測和自適應。 1. 電腦化:這是工業4.0發展的起點,指企業開始使用計算機技 術,人員不再手動操作機械。然而,機械仍未聯網,各個IT系統仍各自獨立,資料尚未串聯。例如,企業的ERP系統與生產相關的系統獨立運作,訂單與產品品檢紀錄分散於兩套系統,導致 訂單無法回溯出現品質問題的環節。 2. 可連結:在這個成熟度階段,企業開始將各個IT系統進行連接,實現資料的串聯。這使得不同系統之間可以共享資料,提高資訊的流通效率。例 如,企業的ERP系統與生產相關的系統進行連接,訂單與產品品檢紀錄可以實現資料的回溯。 3. 
可視化:在這個成熟度階段,企業開始實現資料的可視化,將資料以圖形化或圖表化的方 式呈現,使得管理者可以直觀地了解企業的運營狀況。例如,企業可以使用數據儀表板或報表來呈現生產線的運行情況和產品的品質指標。 4. 可分析:在這個成熟度階段,企業開始進 行資料的分析,利用數據分析工具和算法來挖掘資料中的價值和洞察。這使得企業可以更深入地了解生產過程中的問題和潛在的改進空間。例如,企業可以使用數據分析工具來分析生產線的 效率和品質問題,並提出改進措施。 5. 可預測:在這個成熟度階段,企業開始利用資料分析的結果來進行預測和預測模型的建立。這使得企業可以預測生產過程中可能出現的問題,並 提前採取相應的措施。例如,企業可以利用預測模型來預測生產線的故障和產品的品質問題,並提前進行維護和調整。 6. 自適應:在這個成熟度階段,企業開始實現自動化和自適應能 力,使得生產過程可以根據實時的數據和環境變化進行調整和優化。這使得企業可以更靈活地應對市場需求和生產變化。例如,企業可以實現生產線的自動調整和產品的自動優化,以適應市 場需求的變化。 這六個成熟度階段代表了企業在工業4.0發展過程中的不同階段和能力水平,企業可以根據自身的情況和目標,逐步提升成熟度,實現工業4.0的目標。 response 3: 根據西門子自有工廠朝工業4.0發展的方式,可以探討其在工業4.0成熟度指標中的幾個方面: 1. 數位化戰略:西門子提出數位化戰略,從工業4.0策略擬定到落地執行提供一條龍服務 。這代表企業在工業4.0成熟度指標中已經達到了可連結和可視化的階段,並開始將數據應用於生產優化和資源利用。 2. 整合系統:西門子在廠內進行軟體間整合,包括PLM、ERP、MOM 、WMS和Automation五大系統的整合,使數據互聯互通。這代表企業在工業4.0成熟度指標中已經達到了可分析和可預測的階段,並能夠利用數據分析技術進行深入分析和預測。 3. 數據 應用:西門子利用自有的數位雙生軟體Tecnomatix,打造虛擬工廠,模擬生產狀況或監控實際生產狀況。這代表企業在工業4.0成熟度指標中已經達到了可分析和可預測的階段,並能夠利用 數據應用提供的資訊,優化生產設備和工序。 總的來說,根據西門子自有工廠朝工業4.0發展的方式,可以看出他們在工業4.0成熟度指標中已經達到了可連結、可視化、可分析和可預測,優化生產設備和工序。ask question from a single fileIf there's only a short single file document, you can useask_whole_fileto ask LLM with the whole document filenoted that the length of the document can not larger than the window size of the model.exampleimportakashaak=akasha.Doc_QA(search_type="merge",verbose=True,max_doc_len=15000,model="openai:gpt-4-32k",)response=ak.ask_whole_file(system_prompt="用列舉的方式描述"file_path="docs/mic/20230726_工業4_0發展重點與案例分析,以西門子、鴻海為例.pdf",prompt="工業4.0有什麼可以參考的標準或是架構嗎?")工業4.0的參考標準或架構主要有以下幾種:1.「工業4.0成熟度指數」:由德國國家工程院(Acatech)提出,將發展階段劃分為電腦化、可連結、可視化、可分析、可預測、自適應共六個成熟度,前項為後項發展基礎。2.「新加坡工業智慧指數」(SingaporeSmartIndustryReadinessIndex,SIRI):由新加坡政府提出,用於評估企業在工業4.0的發展程度。3.「工業4.0實施步驟方法論」:這是一種實施工業4.0的具體步驟,包括盤點公司內部待改善問題,分析現況與預期目標差異,以及規劃具體要改善的業務流程路線圖。directly offer information to ask questionIf you do not want to use any document file, you can useask_selffunction and input the information you need using parameterinfo,infocan be string or list of 
string.

example

```python
install_requires = [
    "pypdf",
    "langchain>=0.1.0",
    "chromadb==0.4.14",
    "openai==0.27",
    "tiktoken",
    "lark==1.1.7",
    "scikit-learn<1.3.0",
    "jieba==0.42.1",
    "sentence-transformers==2.2.2",
    "torch==2.0.1",
    "transformers>=4.33.4",
    "llama-cpp-python==0.2.6",
    "auto-gptq==0.3.1",
    "tqdm==4.65.0",
    "docx2txt==0.8",
    "rouge==1.0.1",
    "rouge-chinese==1.0.3",
    "bert-score==0.3.13",
    "click",
    "tokenizers>=0.13.3",
    "streamlit==1.28.2",
    "streamlit_option_menu==0.3.6",
]

ak = akasha.Doc_QA(
    verbose=True,
    max_doc_len=15000,
    model="openai:gpt-4",
)
response = ak.ask_self(prompt="langchain的套件版本?", info=install_requires)
```

langchain的套件版本是0.1.0或更高版本。 (The langchain package version is 0.1.0 or higher.)

### Arguments of Doc_QA class ###
"""
Args:
    **embeddings (str, optional)**: the embeddings used in query and vector storage. Defaults to "text-embedding-ada-002".
    **chunk_size (int, optional)**: chunk size of texts from documents. Defaults to 1000.
    **model (str, optional)**: llm model to use. Defaults to "gpt-3.5-turbo".
    **verbose (bool, optional)**: show log texts or not. Defaults to False.
    **threshold (float, optional)**: the similarity threshold of searching. Defaults to 0.2.
    **language (str, optional)**: the language of documents and prompt, used to make sure docs won't exceed the max token size of the llm input.
    **search_type (str, optional)**: search type to find similar documents from db, default 'merge'. Includes 'merge', 'mmr', 'svm', 'tfidf'; you can also use a custom search_type function, as long as its input is (query_embeds: np.array, docs_embeds: list[np.array], k: int, relevancy_threshold: float, log: dict) and its output is a list [index of selected documents].
    **record_exp (str, optional)**: use aiido to save running params and metrics to the remote mlflow if record_exp is not empty, and set record_exp as the experiment name. Default "".
    **system_prompt (str, optional)**: the system prompt that assigns special instructions to the llm model, so it will not be used in searching relevant documents.
    Defaults to "".
    **max_doc_len (int, optional)**: max document size of llm input. Defaults to 3000.
    **temperature (float, optional)**: temperature of llm model from 0.0 to 1.0. Defaults to 0.0.
    **use_chroma (bool, optional)**: use chroma db name instead of documents path to load data or not. Defaults to False.
    **use_rerank (bool, optional)**: use rerank model to re-rank the selected documents or not. Defaults to False.
    **ignore_check (bool, optional)**: speed up loading data if the chroma db already exists. Defaults to False.
"""

Save Logs
Each time you run any function from akasha, it saves logs that record the parameters of that run and the results. Each run has a timestamp; you can use {obj_name}.timestamp_list to check them, and use it to find the log of the run you want to see.
You can also save logs into a .txt file or a .json file.

```python
qa = akasha.Doc_QA(verbose=False, search_type="merge", max_doc_len=1500,
                   model="llama-gpu:model/chinese-alpaca-2-7b.Q5_K_S.gguf")
query1 = "五軸是什麼"
qa.get_response(doc_path="./doc/mic/", prompt=query1)
qa.get_response(doc_path="./doc/mic/", prompt=query1)

tp = qa.timestamp_list
print(tp)
## ["2023/09/26, 10:52:36", "2023/09/26, 10:59:49", "2023/09/26, 11:09:23"]

print(qa.logs[tp[-1]])
## {"fn_type":"get_response","search_type":"merge", "max_doc_len":1500, ..., "response": ...}

qa.save_logs(file_name="logs.json", file_type="json")
```

Use AiiDO to record experiment
If you want to record experiment metrics and results, you need to create a project on the AiiDO platform.
Once done, you will receive all the necessary parameters for automatically uploading the experiment.
Create a .env file in the same directory as your program, and paste all parameters.

.env file:

MINIO_URL=YOUR_MINIO_URL
MINIO_USER=YOUR_MINIO_USER
MINIO_PASSWORD=YOUR_MINIO_PASSWORD
TRACKING_SERVER_URI=YOUR_TRACKING_SERVER_URI

After you have created the .env file, you can use record_exp to set your experiment name, and it will automatically record experiment metrics and results to the mlflow server.

```python
import akasha
import os
from dotenv import load_dotenv

load_dotenv()
os.environ["OPENAI_API_KEY"] = "your openAI key"

dir_path = "doc/"
prompt = "「塞西莉亞花」的花語是什麼? 「失之交臂的感情」 「赤誠的心」 「浪子的真情」 「無法挽回的愛」"
exp_name = "exp_akasha_get_response"

ak = akasha.Doc_QA(record_exp=exp_name)
response = ak.get_response(dir_path, prompt)
```

In an experiment you assign, the run name is the combination of the embedding used, the search type and the model name.
You can also compare the responses from different models, search types and embeddings.

Auto Evaluation
To evaluate the performance of the current parameters, you can use the function auto_evaluation. First you need to build a question-set .txt file based on the documents you want to use. You can either generate a single-choice question file or an essay question file.
For a single-choice question file, every option and the correct answer are separated by a tab (\t), and each line is a question. For example (question_pvc.txt):

應回收廢塑膠容器材質種類不包含哪種? 聚丙烯(PP) 聚苯乙烯(PS) 聚氯乙烯(PVC) 低密度聚乙烯(LDPE) 4
庫存盤點包括庫存全盤作業及不定期抽盤作業,盤點計畫應包括下列項目不包含哪項? 盤點差異之處理 盤點清冊 各項物品存放區域配置圖 庫存全盤日期及參加盤點人員名單 1
以下何者不是環保署指定之公民營地磅機構?
中森加油站企業有限公司 台益地磅站 大眾地磅站 新福行 4

it will return the correct rate and tokens of the question set; details of each question are saved in the logs, or in the mlflow server if you turn on record_exp.

```python
import akasha.eval as eval
import os
from dotenv import load_dotenv

load_dotenv()
os.environ["OPENAI_API_KEY"] = "your openAI key"

dir_path = "doc/pvc/"
exp_name = "exp_akasha_auto_evaluation"

eva = eval.Model_Eval(question_style="single_choice", search_type='merge',
                      model="openai:gpt-3.5-turbo",
                      embeddings="openai:text-embedding-ada-002",
                      record_exp=exp_name)
print(eva.auto_evaluation("question_pvc.txt", dir_path))
## correct rate: 0.9, tokens: 3228 ##
```

For an essay question file, each question has "問題:" before it, and each reference answer has "答案:" before it. Each question is separated by two newlines (\n\n).

問題:根據文件中的訊息,智慧製造的複雜性已超越系統整合商的負荷程度,未來產業鏈中的角色將傾向朝共和共榮共創智慧製造商機,而非過往的單打獨鬥模式發展。請問為什麼供應商、電信商、軟體開發商、平台商、雲端服務供應商、系統整合商等角色會傾向朝共和共榮共創智慧製造商機的方向發展?
答案:因為智慧製造的複雜性已超越系統整合商的負荷程度,單一角色難以完成整個智慧製造的需求,而共和共榮共創的模式可以整合各方的優勢,共同創造智慧製造的商機。

問題:根據文件中提到的資訊技術商(IT)和營運技術商(OT),請列舉至少兩個邊緣運算產品或解決方案。
答案:根據文件中的資訊,NVIDIA的邊緣運算產品包括Jetson系列和EGX系列,而IBM的邊緣運算產品包括IBM Edge Application Manager和IBM Watson Anywhere。

Use llm to create questionset and evaluate the performance
If you prefer not to create your own question set to assess the performance of the current parameters, you can utilize eval.auto_create_questionset to automatically generate a question set along with reference answers. Subsequently, you can use eval.auto_evaluation to obtain metric scores such as Bert_score, Rouge, and LLM_score for an essay question set, and correct rate for a single-choice question set. These scores range from 0 to 1, with higher values indicating that the generated response closely matches the reference answers.
For example, the code creates a questionset text file 'mic_1.txt' with ten questions and reference answers; each question is randomly generated from the content segments of the given documents in the 'doc/mic/' directory.
Then you can use the questionset text file to evaluate the performance of the parameters you want to test.

```python
import akasha.eval as eval

eva = eval.Model_Eval(question_style="essay", search_type='merge',
                      model="openai:gpt-3.5-turbo",
                      embeddings="openai:text-embedding-ada-002",
                      record_exp="exp_mic_auto_questionset")
eva.auto_create_questionset(doc_path="doc/mic/", question_num=10,
                            output_file_path="questionset/mic_essay.txt")

bert_score, rouge, llm_score = eva.auto_evaluation(
    questionset_path="questionset/mic_essay.txt",
    doc_path="doc/mic/",
    question_style="essay",
    record_exp="exp_mic_auto_evaluation",
    search_type="svm")
# bert_score = 0.782
# rouge = 0.81
# llm_score = 0.393
```

Use different question types to test different abilities of the LLM
The question_type parameter offers four question types: fact, summary, irrelevant, compared; the default is fact.

```python
import akasha.eval as eval

eva = eval.Model_Eval(search_type='merge', question_type="irrelevant",
                      model="openai:gpt-3.5-turbo",
                      record_exp="exp_mic_auto_questionset")
eva.auto_create_questionset(doc_path="doc/mic/", question_num=10,
                            output_file_path="questionset/mic_irre.txt")

bert_score, rouge, llm_score = eva.auto_evaluation(
    questionset_path="questionset/mic_irre.txt",
    doc_path="doc/mic/",
    question_style="essay",
    record_exp="exp_mic_auto_evaluation",
    search_type="svm")
```

Assign a certain topic for the questionset
If you want to generate questions on a certain topic, you can use the create_topic_questionset function; it will use the topic input to find related texts in the documents and generate a question set.

```python
import akasha.eval as eval

eva = eval.Model_Eval(search_type='merge', question_type="irrelevant",
                      model="openai:gpt-3.5-turbo",
                      record_exp="exp_mic_auto_questionset")
eva.create_topic_questionset(doc_path="doc/mic/", topic="工業4.0", question_num=3,
                             output_file_path="questionset/mic_topic_irre.txt")

bert_score, rouge, llm_score = eva.auto_evaluation(
    questionset_path="questionset/mic_topic_irre.txt",
    doc_path="doc/mic/",
    question_style="essay",
    record_exp="exp_mic_auto_evaluation",
    search_type="svm")
```

Find Optimum
Combination
To test all available combinations and find the best parameters, you can use the function optimum_combination. You can give different embeddings, document chunk sizes, models, and document similarity search types, and the function will test all combinations to find the best one based on the given question set and documents.
Note that the best score combination is the combination with the highest correct rate, and the best cost-effective combination is the one that needs the fewest tokens to get a correct answer.

```python
import akasha.eval as eval
import os
from dotenv import load_dotenv

load_dotenv()
os.environ["OPENAI_API_KEY"] = "your openAI key"
os.environ["HUGGINGFACEHUB_API_TOKEN"] = "your huggingface key"

dir_path = "doc/pvc/"
exp_name = "exp_akasha_optimum_combination"
embeddings_list = ["hf:shibing624/text2vec-base-chinese",
                   "openai:text-embedding-ada-002"]
model_list = ["openai:gpt-3.5-turbo",
              "hf:FlagAlpha/Llama2-Chinese-13b-Chat-4bit",
              "hf:meta-llama/Llama-2-7b-chat-hf",
              "llama-gpu:model/llama-2-7b-chat.Q5_K_S.gguf",
              "llama-gpu:model/llama-2-13b-chat.Q5_K_S.gguf"]

eva = eval.Model_Eval(question_style="single_choice")
eva.optimum_combination("question_pvc.txt", dir_path,
                        embeddings_list=embeddings_list,
                        model_list=model_list,
                        chunk_size_list=[200, 400, 600],
                        search_type_list=["merge", "tfidf"],
                        record_exp=exp_name)
```

The result would look like below:

Best correct rate: 1.000
Best score combination:
  embeddings: openai:text-embedding-ada-002, chunk size: 400, model: openai:gpt-3.5-turbo, search type: merge
  embeddings: openai:text-embedding-ada-002, chunk size: 400, model: openai:gpt-3.5-turbo, search type: tfidf
Best cost-effective:
  embeddings: hf:shibing624/text2vec-base-chinese, chunk size: 400, model: openai:gpt-3.5-turbo, search type: tfidf

### Arguments of Model_Eval class ###
"""
Args:
    **embeddings (str, optional)**: the embeddings used in query and vector storage. Defaults to "text-embedding-ada-002".
    **chunk_size (int, optional)**: chunk size of texts from documents.
    Defaults to 1000.
    **model (str, optional)**: llm model to use. Defaults to "gpt-3.5-turbo".
    **verbose (bool, optional)**: show log texts or not. Defaults to False.
    **threshold (float, optional)**: the similarity threshold of searching. Defaults to 0.2.
    **language (str, optional)**: the language of documents and prompt, used to make sure docs won't exceed the max token size of the llm input.
    **search_type (str, optional)**: search type to find similar documents from db, default 'merge'. Includes 'merge', 'mmr', 'svm', 'tfidf'; you can also use a custom search_type function, as long as its input is (query_embeds: np.array, docs_embeds: list[np.array], k: int, relevancy_threshold: float, log: dict) and its output is a list [index of selected documents].
    **record_exp (str, optional)**: use aiido to save running params and metrics to the remote mlflow if record_exp is not empty, and set record_exp as the experiment name. Default "".
    **system_prompt (str, optional)**: the system prompt that assigns special instructions to the llm model, so it will not be used in searching relevant documents. Defaults to "".
    **max_doc_len (int, optional)**: max document size of llm input. Defaults to 3000.
    **temperature (float, optional)**: temperature of llm model from 0.0 to 1.0. Defaults to 0.0.
    **question_type (str, optional)**: the type of question you want to generate, "essay" or "single_choice". Defaults to "essay".
    **use_rerank (bool, optional)**: use rerank model to re-rank the selected documents or not. Defaults to False.
"""

File Summarization
To create a summary of a text file in various formats like .pdf, .txt, or .docx, you can use the summary.summarize_file function.
For example, the following code employs the map_reduce summary method to instruct the LLM to generate a summary of approximately 500 words.
There are two summary types, map_reduce and refine. map_reduce summarizes every text chunk and then uses all the summarized chunks to generate a final summary; refine summarizes each text chunk in turn, using the previous summary as a prompt for summarizing the next segment, to get a higher level of summary consistency.

```python
import akasha.summary as summary

sum = summary.Summary(chunk_size=1000, chunk_overlap=100)
sum.summarize_file(file_path="doc/mic/5軸工具機因應市場訴求改變的發展態勢.pdf",
                   summary_type="map_reduce", summary_len=500,
                   chunk_overlap=40)
```

### Arguments of Summary class ###
"""
Args:
    **chunk_size (int, optional)**: chunk size of texts from documents. Defaults to 1000.
    **chunk_overlap (int, optional)**: chunk overlap of texts from documents. Defaults to 40.
    **model (str, optional)**: llm model to use. Defaults to "gpt-3.5-turbo".
    **verbose (bool, optional)**: show log texts or not. Defaults to False.
    **threshold (float, optional)**: the similarity threshold of searching. Defaults to 0.2.
    **language (str, optional)**: the language of documents and prompt, used to make sure docs won't exceed the max token size of the llm input.
    **record_exp (str, optional)**: use aiido to save running params and metrics to the remote mlflow if record_exp is not empty, and set record_exp as the experiment name. Default "".
    **system_prompt (str, optional)**: the system prompt that assigns special instructions to the llm model, so it will not be used in searching relevant documents. Defaults to "".
    **max_doc_len (int, optional)**: max document length of llm input. Defaults to 3000.
    **temperature (float, optional)**: temperature of llm model from 0.0 to 1.0.
    Defaults to 0.0.
"""

Custom Search Type, Embeddings and Model
In case you want to use other search types, embeddings, or language models, you can provide your own functions as parameters for search_type, embeddings and model.

Custom Search Type
If you wish to devise your own method for identifying the most relevant documents, you can utilize your custom function as a parameter for search_type.
In the 'cust' function, we employ the Euclidean distance metric to identify the most relevant documents. It returns a list of indices representing the top k documents with distances between the query and document embeddings smaller than the specified threshold.
Here's a breakdown of the parameters:
- query_embeds: embeddings of the query (numpy array)
- docs_embeds: embeddings of all documents (list of numpy arrays representing document embeddings)
- k: number of most relevant documents to be selected (integer)
- relevancy_threshold: threshold for relevancy; if the distance between the query and a document is smaller than relevancy_threshold, the document is selected (float)
- log: a dictionary that can be used to record any additional information you desire
(dictionary)

```python
def cust(query_embeds, docs_embeds, k: int, relevancy_threshold: float, log: dict):
    from scipy.spatial.distance import euclidean
    import numpy as np

    distance = [[euclidean(query_embeds, docs_embeds[idx]), idx]
                for idx in range(len(docs_embeds))]
    distance = sorted(distance, key=lambda x: x[0])

    ## change dist if embeddings not between 0~1
    max_dist = 1
    while max_dist < distance[-1][0]:
        max_dist *= 10
        relevancy_threshold *= 10

    ## add log para
    log['dd'] = "miao"

    return [idx for dist, idx in distance[:k]
            if (max_dist - dist) >= relevancy_threshold]


doc_path = "./mic/"
prompt = "五軸是什麼?"

qa = akasha.Doc_QA(verbose=True, search_type=cust,
                   embeddings="hf:shibing624/text2vec-base-chinese")
qa.get_response(doc_path=doc_path, prompt=prompt)
```

Custom Embeddings
If you want to use other embeddings, you can put your own embeddings as a function and set it as the parameter of embeddings.
For example, in the 'test_embed' function, we use the SentenceTransformer model to generate embeddings for the given texts. You can directly use 'test_embed' as a parameter for embeddings and execute the get_response function.
Here's a breakdown of the parameters:
- texts: a list of texts to be embedded

```python
def test_embed(texts: list) -> list:
    from sentence_transformers import SentenceTransformer
    mdl = SentenceTransformer('BAAI/bge-large-zh-v1.5')
    embeds = mdl.encode(texts, normalize_embeddings=True)
    return embeds


doc_path = "./mic/"
prompt = "五軸是什麼?"

qa = akasha.Doc_QA(verbose=True, search_type="svm", embeddings=test_embed)
qa.get_response(doc_path=doc_path, prompt=prompt)
```

Custom Model
If you want to use other language models, you can put your own model as a function and set it as the parameter of model.
For example, in the 'test_model' function, we use the OpenAI model to generate a response for the given prompt.
You can directly use 'test_model' as a parameter formodeland execute the'get_response'function.Here's a breakdown of the parameters:prompt: A string representing the prompt for the language model.deftest_model(prompt:str):importopenaifromlangchain.chat_modelsimportChatOpenAIopenai.api_type="open_ai"model=ChatOpenAI(model="gpt-3.5-turbo",temperature=0)ret=model.predict(prompt)returnretdoc_path="./mic/"prompt="五軸是什麼?"qa=akasha.Doc_QA(verbose=True,search_type="svm",model=test_model)qa.get_response(doc_path=doc_path,prompt=prompt)Command Line InterfaceYou can also use akasha in command line, for example, you can usekeep-responsingto create a document QA model and keep asking different questions and get response based on the documents in the given -d directory.$akashakeep-responsing-d../doc/plc/-c400-k1Please input your question(type "exit()" to quit) : 應回收廢塑膠容器材質種類不包含哪種? 聚丙烯(PP) 聚苯乙烯(PS) 聚氯乙烯(PVC) 低密度聚乙烯(LDPE)Response: 應回收廢塑膠容器材質種類不包含低密度聚乙烯(LDPE)。Please input your question(type "exit()" to quit) : 所謂市盈率,是指每股市價除以每股盈餘,也就是股票的? 
本益比 帳面值比 派息 資金英國和德國等多個市場。然而,義大利、加拿大和澳洲並不在這些可交易的國家之列。Please input your question(type "exit()" to quit) : exit()Currently you can useget-response,keep-responsing,chain-of-thoughtandauto_create_questionsetandauto_evaluation.$akashakeep-responsing--help Usage:akashakeep-responsing[OPTIONS]Options:-d,--doc_pathTEXTdocumentdirectorypath,parseall.txt,.pdf,.docxfilesinthedirectory[required]-e,--embeddingsTEXTembeddingsforstoringthedocuments-c,--chunk_sizeINTEGERchunksizeforstoringthedocuments-m,--modelTEXTllmmodelforgeneratingtheresponse-ur--use_rerankBOOLusereranktosortthedocuments-t,--thresholdFLOATthresholdscoreforselectingtherelevantdocuments-l,--languageTEXTlanguageforthedocuments,defaultis'ch'forchinese-s,--search_typeTEXTsearchtypeforthedocuments,includemerge,svm,mmr,tfidf-sys,--system_promptTEXTsystempromptforthellmmodel-md,--max_doc_lenINTEGERmaxdocumentlengthforthellmmodelinput--helpShowthismessageandexit.akasha_uiIf you prefer running Akasha via a web page, we offer a Streamlit-based user interface.To start the application, use the following command:$akashauiYou should now be able to access the web page athttp://localhost:8501/.You can start by going to theSettings pageto configure your settings.The first option,Document Path, specifies the directory where you want the LLM to search for documents.You can either add document files and name the directory from theUpload Filespage or place the directory containing documents in the ./docs/ directory.You can download the models you want into model/ directory, and they will be added toLangauage Modeloption in the Setting page.The default setting is to use the OpenAI model and embeddings, so please remember to add your OpenAI API key on the left side.After you have finished setting up, you can start using Akasha.For example, you can instruct the language model with a query like '五軸是什麼,' and you can include a system prompt to specify how you want the model to answer in Chinese.It's important to note that the difference 
between a prompt and a system prompt is that the system prompt is not used for searching similar documents; it's more about defining the format or type of response you expect from the language model for a given prompt question.
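The scaling-and-threshold rule used in the custom `cust` search function above can be exercised in isolation with toy 2-D embeddings. This is a standalone sketch using only the stdlib; `select_top_k` is an illustrative name, not part of akasha:

```python
import math

def select_top_k(query_embeds, docs_embeds, k: int, relevancy_threshold: float):
    """Same selection rule as the custom search example: sort documents by
    Euclidean distance, rescale the threshold while distances exceed the
    current max, then keep the top-k documents that clear the threshold."""
    distance = sorted(
        (math.dist(query_embeds, d), idx) for idx, d in enumerate(docs_embeds)
    )
    max_dist = 1
    while max_dist < distance[-1][0]:
        max_dist *= 10
        relevancy_threshold *= 10
    return [idx for dist, idx in distance[:k] if (max_dist - dist) >= relevancy_threshold]

docs = [(0.1, 0.1), (0.9, 0.9), (0.2, 0.0)]
print(select_top_k((0.0, 0.0), docs, k=2, relevancy_threshold=0.2))  # [0, 2]
```

Note that when the largest distance exceeds 1, both the scale and the threshold are multiplied by 10 together, so the relative cut-off is preserved.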
akash-basic-calculator
Failed to fetch description. HTTP Status Code: 404
akash-cal
No description available on PyPI.
akash-calculator
A very basic calculator package.
akash-distributions
No description available on PyPI.
akashi
# Akashi Command Line Tool

Unofficial akashi command line tool. This command line tool can record your attendance from the command line.

## Requirements

- Python 3.5 or higher
- chromedriver

## Install

```console
$ pip install akashi
```

## Usage

```console
$ akashi --help
Usage: akashi [OPTIONS] COMMAND [ARGS]...

Options:
  --help  Show this message and exit.

Commands:
  attend  Attend to akashi
  leave   Leave to akashi
  login   Login to akashi
  logout  Logout to akashi
```

## License

MIT License.
akashianlibrary
Please don't hesitate to use it.

Change Log

0.0.1 (1/07/2022) - First release
akashi-cli
No description available on PyPI.
akashi-core
No description available on PyPI.
akashic-records
# akashic_records

`akashic_records` is a Python package that dynamically generates functions using OpenAI code completion based on what is imported and how it is used.

## Installation

To install the akashic_records package, you can use pip:

```console
pip install akashic_records
```

## Usage

The behaviour of the `akashic_records` package is based on *what* you import from it, and *how* you use it. The package generates functions on the fly based on the name you import and the way you call it.

```python
from akashic_records import quick_sort

arr = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
result = quick_sort(arr)
print(result)
# Output: [1, 1, 2, 3, 3, 4, 5, 5, 5, 6, 9]
```

The package will end up making requests to the OpenAI completions endpoint with the `code-davinci-002` model. The above code will end up generating this prompt:

```
def quick_sort(arr: list):
```

Note that the parameter name matches the name of the argument that was passed in. If a constant is passed in instead of an identifier, generic names such as `p0` and `p1` will be used. To have a useful name with a constant argument, use keyword arguments.

## Return type and docstrings

This package (ab)uses type hints to give more information to the completion process.

```python
from typing import Annotated
from akashic_records import merge_sort

unsorted_list = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
result: Annotated[list, """
Sorts the input list using the mergesort algorithm.

Parameters:
-----------
unsorted_list : list
    The input list to be sorted.

Returns:
--------
list
    The sorted list.
"""] = merge_sort(unsorted_list)
print(result)
# Output: [1, 1, 2, 3, 3, 4, 5, 5, 5, 6, 9]
```

Generates the following prompt:

```
def merge_sort(unsorted_list: list) -> list:
    """
    Sorts the input list using the mergesort algorithm.

    Parameters:
    -----------
    unsorted_list : list
        The input list to be sorted.

    Returns:
    --------
    list
        The sorted list.
    """
```

If you would like to include a docstring but not a return type, only use a string for the type annotation instead of using `typing.Annotated`.

## Decorator

If you don't like the somewhat magical mechanisms by which the above functionality works (extracting type hints and argument variable names), the package also supplies a decorator with the same functionality.

```python
from akashic_records import generate

@generate(n=1, temperature=0.1, max_tokens=256)
def merge_sort(unsorted_list: list) -> list:
    """
    Sorts the input list using the mergesort algorithm.

    Parameters:
    -----------
    unsorted_list : list
        The input list to be sorted.

    Returns:
    --------
    list
        The sorted list.
    """

print(merge_sort([3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]))
# Output: [1, 1, 2, 3, 3, 4, 5, 5, 5, 6, 9]
```

## How does the parameter name/type hint thing work?

The very neat `sorcery` package by Alex Hall.

## Options

The OpenAI completions endpoint has many options. Some of these are available for tweaking.

```python
from akashic_records import config

config.n = 3            # https://platform.openai.com/docs/api-reference/completions/create#completions/create-n
config.max_tokens = 512  # https://platform.openai.com/docs/api-reference/completions/create#completions/create-max_tokens
config.temperature = 0.2  # https://platform.openai.com/docs/api-reference/completions/create#completions/create-temperature
```

Some additional options are available to control the prompt generation process.

```python
from akashic_records import config

config.type_hint = True  # Set to False to disable all type hints in prompts

# The package isn't always able to get a working completion from the API.
# In the event that something goes wrong (syntax error in the generated code,
# runtime error trying to run it, etc), the package will try to generate new
# completions. The `attempts` value controls how many times it will try.
config.attempts = 5  # Set to -1 for unlimited tries.
```

## What's with the name?

The name `akashic_records` is inspired by the spiritual belief of the Akashic Records. In this belief, the Akashic Records are a repository of all universal events, thoughts, words, emotions and intent ever to have occurred in the past, present, or future, in terms of all entities and life forms. This seemed fitting for a package that in some sense contains the implementation of "every function".
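The prompt-construction step described above (function name, parameter names, type hints, docstring) can be imitated with the stdlib alone. This is a minimal sketch of the idea, not the package's actual implementation (which relies on `sorcery` to recover argument names at the call site):

```python
import inspect

def template(unsorted_list: list) -> list:
    """Sorts the input list."""

def build_prompt(name: str, func) -> str:
    """Assemble a code-completion prompt from a function's signature and
    docstring, mimicking the prompts akashic_records is described as sending."""
    sig = inspect.signature(func)
    params = ", ".join(
        p.name if p.annotation is inspect.Parameter.empty
        else f"{p.name}: {p.annotation.__name__}"
        for p in sig.parameters.values()
    )
    ret = "" if sig.return_annotation is inspect.Signature.empty \
        else f" -> {sig.return_annotation.__name__}"
    doc = inspect.getdoc(func)
    body = f'\n    """\n    {doc}\n    """' if doc else ""
    return f"def {name}({params}){ret}:{body}"

print(build_prompt("merge_sort", template))
```

Sending such a prompt to a code-completion model yields a candidate function body, which the package then executes and retries on failure.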
akashi-effects
No description available on PyPI.
akashi-engine
# Akashi

**A Next-Generation Video Editor**

Akashi is a next-generation video editor. You can edit your videos only by programs, without being bothered with complex GUIs. Akashi is still in the very early stages of development, and **not ready for production** in any sense.

## Example

```python
from akashi_core import ak, gl

@ak.entry()
def main():
    with ak.atom() as a1:
        with ak.lane() as _:
            ak.text('Sample Text').fit_to(a1).poly(
                lambda e, b, p: e(p.x) << e(p.x + 300) | e(p.y) << e(p.y + 300))

        with ak.lane() as _:
            @gl.entry(ak.frag)
            def blue_filter(buffer: ak.frag, color: gl.frag_color) -> None:
                color.b *= 2.0

            ak.video('./dog.mp4').ap(
                lambda h: h.duration(10).stretch(True),
                lambda h: h.frag(blue_filter))
```

## Installation

```console
pip install akashi-engine
```

## Requirements

### System Requirements

- Linux (tested on Debian 11)
- OpenGL core profile 4.2 or later
- Python 3.10 or later

### Runtime Dependencies

- FFmpeg 4.x
- SDL2, SDL_ttf, SDL_image
- Qt5

If you use Debian, you can install these dependencies with the command below.

```console
apt install ffmpeg qtbase5-dev libsdl2-2.0-0 libsdl2-image-2.0-0 libsdl2-ttf-2.0-0
```

Note: Windows support is planned for a future release.

## Getting Started

TBD

## Features

### Code Driven Development

Video editing in Akashi is completely code driven. You no longer need complex GUIs like timelines and inspectors.

### Rich and Fast Visual Effects by GPU Shaders

In Akashi, every step of the video editing process is dominated in Python. The Python code on visual effects is compiled to shaders by the JIT compiler. This gives Python a super power which shaders have: performance far beyond any other language and non-comparable expressions.

### Modern Python

Type hinting is mandatory in Akashi, and we require you to do so. Basically, all errors are detected before runtime, and you will have a nice Python IntelliSense experience. This helps you write your code more securely and easily.

### New-Fashioned User Interface

- Built-in Hot Reload
- Seamless Editing by ASP
- 🚧 Smart GUI

### Basic Video Editing Features

- Hardware Accelerated Video Playback (VA-API)
- Audio Playback (PulseAudio)
- Image Rendering
- Text Rendering (ttf/otf, outline, shadow, etc.)
- Basic 2D shapes (Rectangle, Circle, Round Rectangle, Triangle, Line, etc.)
- Rich Codec Backend (FFmpeg)
- Video Encoding

## Gallery

### Two Videos With Subtitles

https://user-images.githubusercontent.com/70841910/148137328-02665a2e-962a-4d82-9414-66fa94cc196e.mp4

```python
from akashi_core import ak, gl
from .config import akconfig

WIDTH = akconfig().video.resolution[0]
HEIGHT = akconfig().video.resolution[1]
VRES = list(map(lambda x: x // 2, akconfig().video.resolution))

def subtitle(msg: str, dur: float):
    ak.text(msg).ap(
        lambda h: h.pos(VRES[0], VRES[1] + 495),
        lambda h: h.duration(ak.sec(dur)),
        lambda h: h.fg(ak.Color.White, 40),
        lambda h: h.font_path('./myfont.ttf'),
    )

@ak.entry()
def main():
    with ak.atom() as a1:
        with ak.lane() as _:
            subtitle('Lorem ipsum dolor sit amet', 3)
            subtitle('At magnam natus ut mollitia reprehenderit', 3)
        with ak.lane() as _:
            ak.line(100).ap(
                lambda h: h.begin(0, VRES[1] + 500).end(VRES[0] * 2, VRES[1] + 500),
                lambda h: h.fit_to(a1),
            )
        with ak.lane() as _:
            ak.video('./blue_city.mp4').duration(3).pos(*VRES).stretch(True)
            ak.video('./cherry.mp4').duration(3).pos(*VRES).stretch(True)
```

### Circle Animation

https://user-images.githubusercontent.com/70841910/148137358-bd784005-84e2-48c7-9552-abeb638e73f1.mp4

```python
from akashi_core import ak, gl
from .config import akconfig
import random

random.seed(102)

WIDTH = akconfig().video.resolution[0]
HEIGHT = akconfig().video.resolution[1]
VRES = list(map(lambda x: x // 2, akconfig().video.resolution))

def random_radius() -> float:
    return random.choice([10, 20, 40, 80, 120])

def random_pos() -> tuple[int, int]:
    return (random.randrange(0, WIDTH), random.randrange(0, HEIGHT * 2))

def random_color() -> str:
    return ak.hsv(random.randrange(0, 360), 50, 100)

def circle_lane(radius: float, pos: tuple[int, int], color: str):
    with ak.lane() as _:
        @gl.entry(ak.poly)
        def fly(buffer: ak.poly, pos: gl.poly_pos) -> None:
            pos.y += buffer.time * 50

        ak.circle(radius).ap(
            lambda h: h.pos(*pos),
            lambda h: h.color(color),
            lambda h: h.poly(fly))

@ak.entry()
def main():
    with ak.atom() as a1:
        for i in range(100):
            circle_lane(random_radius(), random_pos(), random_color())
        with ak.lane() as _:
            ak.rect(WIDTH, HEIGHT).fit_to(a1).pos(*VRES).color(ak.Color.White)
```
akash-iq-library
No description available on PyPI.
akashIsAwesome
No description available on PyPI.
akashjeez
# akashjeez

akashjeez is a Python package to deliver a lot of useful services!

## Installation

Use the package manager pip to install akashjeez.

```console
pip install akashjeez
```

## Usage

```python
import akashjeez

## Check attributes and methods available in this package.
print(dir(akashjeez))  # returns list of attributes and methods of this module.

## Service: Greetings to User!
print(akashjeez.say_hello())            # returns 'Hello, World!'
print(akashjeez.say_hello("Everyone"))  # returns 'Hello, Everyone!'

## Service: Get Coordinates From Google for Input Location
# Syntax: >> google_place(location)
print(akashjeez.google_place('queensland chennai'))

## Service: Get LIVE CoronaVirus Stats From United States of America!
print(akashjeez.covid19_usa_stats())

## Service: Get LIVE CoronaVirus Stats From All Over Globe!
print(akashjeez.covid19_stats())

## Service: Get Cloud Compute Pricing from Public API.
# Syntax: >> cloud_compute_cost(provider, input_cpu, input_memory, input_region)
# Providers = alibaba, amazon, azure, google
# Input Regions = all, US, EU, Asia etc
print(akashjeez.cloud_compute_cost('azure', 2, 4, 'asia'))
print(akashjeez.cloud_compute_cost('amazon', 2, 4, 'us'))

## Service: Get Live & Forecast Weather Report Data for Any Location from Public API.
# Syntax: >> get_weather_data(city_name)
print(akashjeez.get_weather_data('chennai'))
print(akashjeez.get_weather_data('los angeles'))

## Service: Get Country Info like Country Code, Capital, ISO & Phone Code using Public API.
print(akashjeez.get_country_info())

## Service: Get New Comic Book Data using Public API.
print(akashjeez.comic_books_data())

## Service: Get Movie Information using Public API.
# Syntax: >> movie_search(movie_name)
print(akashjeez.movie_search('furious'))

## Service: Get a Random Fake User Data using Public API.
print(akashjeez.random_user_generator())

## Service: Get All Cars Makers & Manufacturers Data using Public API.
print(akashjeez.car_maker_manufacturers())

## Service: Get Latest Nobel Prize Data using Public API.
print(akashjeez.get_nobel_prize())

## Service: Get All File Formats Data using Public API.
print(akashjeez.file_formats())

## Service: Get Latest Open Trivia Q&A Data using Public API.
print(akashjeez.open_trivia())

## Service: Get Latest & Upcoming Movies Type & Name Data from BookMyShow.com
# Syntax: >> bookmyshow(city_name)
print(akashjeez.bookmyshow('chennai'))

## Service: Search Public DNS Info from http://dns.google.com
print(akashjeez.dns_search('akashjeez.herokuapp.com'))

## Service: Shuffle of Cards Randomly!
print(akashjeez.shuffle_cards())

## Service: Calculate Age by Passing Date of Birth as Input.
# Syntax: >> age_calculator(input_dob)
print(akashjeez.age_calculator('10-04-1993'))
```

## Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change. Please make sure to update tests as appropriate.

## License

MIT
akashjeezpy
Failed to fetch description. HTTP Status Code: 404
akash-package
No description available on PyPI.
akash-test1
ABC
akash-test2
ABC
akash-test3
ABC
akash-test4
def
akash-test5
def
akash-test7
ABC
akasht-invo
Failed to fetch description. HTTP Status Code: 404
akasj
No description available on PyPI.
aka-stats
# Aka Stats

Aka (赤 — red in Japanese) Stats. A unified module for keeping stats in Redis. The goal is to have an easy way to measure an application, and then expose these metrics through an HTTP API, either to process them in some web UI, or to expose them to Prometheus.

```python
from aka_stats import Stats, timer

with Stats() as stats:
    t = timer()
    ...
    stats("task_done", next(t).stat)
```

Or for asynchronous code:

```python
from aka_stats import Stats, timer

async def process_device(device_id: str):
    async with Stats() as stats:
        t = timer()
        ...
        stats("task_done", next(t).stat, extra_labels={"device_id": device_id})
```

## Installation

Add this package to your project:

```console
poetry add aka-stats
```

## Usage Guide

Check out the usage guide here: Usage.md

## Prometheus formatters

Information on how to write a formatter is here: PrometheusFormatter.md

## Optional Standalone HTTP API

Check out this guide here: Included HTTP API

## Pytest plugin

This module is also a pytest plugin, providing a fixture `mock_stats` which collects stats instead of writing them to Redis.

```python
def test_something(mock_stats):
    do_something()
    assert mock_stats[0] == (1612550961, "test", 1, None)
```

And the module with the function:

```python
def do_something():
    with Stats() as stats:
        stats("test", 1)
```
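The `mock_stats` fixture described above collects `(timestamp, key, value, labels)` tuples instead of writing to Redis. The same collector pattern can be sketched without Redis or pytest — a standalone illustration only; `MockStats` is a hypothetical name, not part of aka-stats:

```python
import time

class MockStats:
    """In-memory stand-in for the Stats context manager: records
    (timestamp, key, value, extra_labels) tuples instead of writing to Redis."""
    def __init__(self):
        self.records = []

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        return False

    def __call__(self, key, value, extra_labels=None):
        self.records.append((int(time.time()), key, value, extra_labels))

stats = MockStats()
with stats:
    stats("task_done", 1)
    stats("task_done", 2, extra_labels={"device_id": "dev-1"})

print(stats.records)
```

Injecting such an object in place of `Stats()` is what lets tests assert on recorded tuples without a running Redis.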
akatsuki
UNKNOWN
akatsuki-kafka
No description available on PyPI.
akatsuki-pp-py
# akatsuki-pp-py

Difficulty and performance calculation for all osu! modes.

This is a Python binding to the Rust library rosu-pp, which was bootstrapped through PyO3. Since all the heavy lifting is done by Rust, rosu-pp-py comes with very fast performance. Check out rosu-pp's README for more info.

## Exposed types

The library exposes the following classes:

- `Calculator`: Contains various parameters to calculate strains or map, difficulty, or performance attributes
- `Beatmap`: Contains a parsed beatmap
- `BeatmapAttributes`: Contains various attributes about the map itself
- `DifficultyAttributes`: Contains various attributes about the difficulty based on the mode
- `PerformanceAttributes`: Contains various attributes about the performance and difficulty based on the mode
- `Strains`: Contains strain values for each skill based on the mode

Additionally, the following error types are exposed:

- `ParseError`: Failed to parse a beatmap
- `KwargsError`: Invalid kwargs were provided

## How to use akatsuki-pp-py

The first step is to create a new `Beatmap` instance by providing appropriate kwargs. Either of the kwargs `path`, `content`, or `bytes` **must** be given. The kwargs `ar`, `cs`, `hp`, and `od` are optional. With the setters `set_ar`, `set_cs`, `set_hp`, and `set_od` you can specify custom attributes.

```python
map = Beatmap(path="/path/to/file.osu", ar=9.87)
map.set_od(1.23)

with open("/path/to/file.osu", "rb") as file:
    map = Beatmap(bytes=file.read())

with open("/path/to/file.osu") as file:
    map = Beatmap(content=file.read())
```

Next, you need to create an instance of `Calculator` by providing the appropriate kwargs again. Any of the following kwargs are allowed: `mode`, `mods`, `acc`, `n_geki`, `n_katu`, `n300`, `n100`, `n50`, `n_misses`, `combo`, `passed_objects`, `clock_rate`, and `difficulty`. Each of these also has a setter method, e.g. `set_n_misses`.

```python
calc = Calculator(mode=2, acc=98.76)
calc.set_mods(8 + 64)  # HDDT
```

The last step is to call any of the methods `map_attributes`, `difficulty`, `performance`, or `strains` on the calculator and provide them a `Beatmap`.

## Example

```python
from akatsuki_pp_py import Beatmap, Calculator

map = Beatmap(path="./maps/100.osu")
calc = Calculator(mods=8)

# Calculate an SS on HD
max_perf = calc.performance(map)

# The mods are still set to HD
calc.set_acc(99.11)
calc.set_n_misses(1)
calc.set_combo(200)

# A good way to speed up the calculation is to provide
# the difficulty attributes of a previous calculation
# so that they don't need to be recalculated.
# **Note** that this should only be done if neither
# the map, mode, mods, nor passed objects amount changed.
calc.set_difficulty(max_perf.difficulty)

curr_perf = calc.performance(map)
print(f'PP: {curr_perf.pp}/{max_perf.pp} | Stars: {max_perf.difficulty.stars}')

map_attrs = calc.map_attributes(map)
print(f'BPM: {map_attrs.bpm}')

strains = calc.strains(map)
print(f'Maximum aim strain: {max(strains.aim)}')
```

## Installing akatsuki-pp-py

Installing akatsuki-pp-py requires a supported version of Python and Rust. Once Python and Rust are ready to go, you can install the project with pip:

```console
$ pip install akatsuki-pp-py
```

or

```console
$ pip install git+https://github.com/osuAkatsuki/akatsuki-pp-py
```

## Learn More

- Rust documentation.
- PyO3 documentation.
- Python documentation.
akatsuki-proto
No description available on PyPI.
akaudit
Audit who has SSH access to your user homes via authorized_keys.
akbase2233
akbase2233 can encrypt and decrypt content using your own cipher table. Its main characteristic is that the encryption result is not unique: although the same content encrypts to different results each time, this does not affect normal decryption, so security is greatly improved. Crucially, you can design a private cipher table; as long as the table is not made public, the ciphertext is essentially undecipherable.
akbs
# AKBS: The **A**ll **K**nowing **B**uild **S**ystem for C, C++ and Assembly

## Requirements

- Python 3 or higher

## Installation

```console
# Linux
python3 -m pip install akbs

# Windows (UNSUPPORTED!)
python -m pip install akbs
```

## Usage

```console
python3 -m akbs
```

## Speed

To test the build system, I made a build script for basic_math_operations.

- Operating System: Debian Bullseye
- Host: Linux 5.10.102.1-microsoft-standard-WSL2
- Architecture: x86_64

| Build system      | Versions                       | Time   |
|-------------------|--------------------------------|--------|
| AKBS              | 1.0.5                          | 1.351s |
| CMake w/ Makefile | CMake 3.26.0, Make 4.3         | 3.360s |
| CMake w/ Ninja    | CMake 3.26.0, Ninja 1.11.1     | 2.750s |

The build.akbs file is as follows:

```
set(C_STD, 17)
set(CXX_STD, 17)
set(FILES, wildcard$(src/library/**/*.cpp) wildcard$(src/library/**/*.c))
set(OUTPUT_DIR, dist)
set(BUILD_DIR, build)
if(eq$($PLATFORM, POSIX))
    set(FILES, wildcard$(src/library/linux/*.asm) remove$($FILES, src/library/cross-platform/addp.c, src/library/cross-platform/multiply_whole.c))
endif
check_for(C, CXX, ASM_INTEL, SHARED, STATIC)
set(OUTPUT, libbasic_math_operations.so)
compile(SHARED, $FILES)
set(OUTPUT, libbasic_math_operations.a)
compile(STATIC, $FILES)
```

**Note:** The `remove$()` helper function is used because the `multiply_whole` and `addp` functions are already present in Assembly for Linux, not for Windows.

## How do you use AKBS?

By default, the build script is called `build.akbs`, similar to `Makefile` and `CMakeLists.txt`. However, you can use the `--file` option to specify a file.

To enable languages, you use the `check_for` function:

```
check_for(C, CXX, ASM_INTEL, ASM_ATT, STATIC, SHARED)
```

Right now, only these 4 languages (and a static and shared library linker) (3 if you count AT&T and Intel syntax Assembly as one language) are supported.

If you want to set the standard of C and C++, set the C_STD or CXX_STD variable:

```
set(C_STD, 17)
set(CXX_STD, 17)
```

To compile a list of files, use the compile function:

```
compile(SHARED/STATIC, src/a.c src/b.c src/c.c)
```

To print a statement you can use the print function, and to exit a program, use exit:

```
print($PLATFORM)
exit(1)
```

To use a variable, use `$VARIABLENAME`. The `$PLATFORM` variable comes predefined and is set to `os.name`.

For conditions, use if (else and else if are not implemented yet) and endif:

```
if(set$(PLATFORM))
    print($PLATFORM)
    if(eq$($PLATFORM, UNIX))
        print Yay, we're in UNIX land
    endif
endif
```

Also, there is a rudimentary pre-processor, with `%define`:

```
%define ifend endif
if(set$(PLATFORM))
ifend
```

Comments work by adding a semi-colon at the start of the line:

```
; these
; lines
; will
; be
; skipped
```

Comments only work from the start of the line.

There is also a list of helper functions:

| Function Name | Arguments            | Description                                                                     | Introduced |
|---------------|----------------------|---------------------------------------------------------------------------------|------------|
| wildcard$     | str1                 | Evaluates a list of space separated globs into a space separated list of files   | v1.0.0     |
| remove$       | str1, str2, str3...  | Removes str2 onwards from a space separated list of strings                      | v1.0.0     |
| replace$      | str1, str2, str3...  | Replaces str2,4,6,8... with str3,5,7,9... in str1                                | v1.0.3     |
| eq$           | arg1, arg2           | Checks if two strings are equal                                                  | v1.0.0     |
| neq$          | arg1, arg2           | Checks if two strings are unequal                                                | v1.0.3     |
| gt$           | arg1, arg2           | Checks if arg1 is greater than arg2                                              | v1.0.3     |
| lt$           | arg1, arg2           | Checks if arg1 is lesser than arg2                                               | v1.0.3     |
| gte$          | arg1, arg2           | Checks if arg1 is greater than or equal to arg2                                  | v1.0.3     |
| lte$          | arg1, arg2           | Checks if arg1 is lesser than or equal to arg2                                   | v1.0.3     |
| set$          | arg1                 | Checks if there is a variable with the name arg1                                 |            |
| notset$       | arg1                 | Checks if there is not a variable with the name arg1                             | v1.0.3     |
| and$          | arg1, arg2, arg3...  | Ands all the booleans                                                            | v1.0.4     |
| or$           | arg1, arg2, arg3...  | Ors all the booleans                                                             | v1.0.4     |
| not$          | arg1                 | Nots the boolean                                                                 | v1.0.4     |

A list of important variables:

| Variable           | Is Set | Description                                              | Introduced |
|--------------------|--------|----------------------------------------------------------|------------|
| PLATFORM           | Yes    | Equivalent of os.name                                    | v1.0.0     |
| C_COMPILER         | No     | C compiler location set by check_for                     | v1.0.0     |
| CXX_COMPILER       | No     | C++ compiler location set by check_for                   | v1.0.0     |
| ASM_INTEL_COMPILER | No     | Intel Assembly assembler location set by check_for       | v1.0.0     |
| ASM_ATT_COMPILER   | No     | AT&T Assembly assembler location set by check_for        | v1.0.0     |
| SHARED_COMPILER    | No     | Linker location for shared libraries set by check_for    | v1.0.0     |
| STATIC_COMPILER    | No     | Linker location for static libraries set by check_for    | v1.0.2     |
| OUTPUT             | No     | Output file generated by linking                         | v1.0.0     |
| C_STD              | No     | The C std used (just the number like 17, 11, etc.)       | v1.0.0     |
| CXX_STD            | No     | The C++ std used (just the number like 17, 11, etc.)     | v1.0.0     |
| BUILD_DIR          | No     | The directory to build the objects in                    | v1.0.1     |
| OUTPUT_DIR         | No     | The directory to output the finished objects in          | v1.0.1     |
| C_FLAGS            | No     | Flags passed to C compiler                               | v1.0.2     |
| CXX_FLAGS          | No     | Flags passed to C++ compiler                             | v1.0.2     |
| ASM_INTEL_FLAGS    | No     | Flags passed to Intel Assembly assembler                 | v1.0.2     |
| ASM_ATT_FLAGS      | No     | Flags passed to AT&T Assembly assembler                  | v1.0.2     |
| SHARED_FLAGS       | No     | Flags passed to shared library linker                    | v1.0.2     |
| STATIC_FLAGS       | No     | Flags passed to static library linker                    | v1.0.2     |

## How to clean the files

```console
python3 -m akbs --clean
```

## To-Do

- Build directory (milestone 1.0.1)
- Nested functions
- Windows support
- Optimization
- Subdirectories
- Ability to set compilers from set() and environment variables
- More if conditions
- replace$() helper function
- --clean command
- Documenting my code + Readable variables
- C_FLAGS, CXX_FLAGS, ASM...
- Cache install locations (milestone 1.0.2)
- Ability to set target architecture (Use CFLAGS and CXXFLAGS)
- Plugin Support
- Make print and exit a function, not a statement
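The list helpers in the table above treat a space-separated string as a list. Their documented semantics can be illustrated with a small Python sketch — an outside illustration of the behaviour described in the table, not AKBS's own code:

```python
def remove_(str1: str, *items: str) -> str:
    """remove$: drop str2 onwards from a space-separated list of strings."""
    return " ".join(tok for tok in str1.split() if tok not in items)

def replace_(str1: str, *pairs: str) -> str:
    """replace$: replace str2,4,6,... with str3,5,7,... in str1."""
    for old, new in zip(pairs[::2], pairs[1::2]):
        str1 = str1.replace(old, new)
    return str1

files = "a.c b.c addp.c multiply_whole.c"
print(remove_(files, "addp.c", "multiply_whole.c"))   # a.c b.c
print(replace_("src/a.c src/b.c", "src/", "build/"))  # build/a.c build/b.c
```

This mirrors the `remove$()` call in the example build.akbs, which strips the two C files whose Assembly versions already exist on Linux.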
ak_cache
# ak_cache

Cache any data for your python projects

View Demo · Documentation · Report Bug · Request Feature

## Table of Contents

1. About the Project
   - 1.1. Features
2. Getting Started
   - 2.1. Prerequisites
   - 2.2. Installation
3. Usage
4. Roadmap
5. License
6. Contact
7. Acknowledgements

## 1. About the Project

### 1.1. Features

- Read and write data to a cached pickle file

## 2. Getting Started

### 2.1. Prerequisites

### 2.2. Installation

Install my-project with flit:

```console
git clone https://github.com/rpakishore/ak_cache.git
cd ak_cache
pip install flit
flit install
```

Alternatively, you can use pip:

```console
pip install ak_cache
```

## 3. Usage

Use this space to tell a little more about your project and how it can be used. Show additional screenshots, code samples, demos or link to other resources.

```python
$ from ak_cache import Cache
$ cache_file = Cache(r'Path\to\Cache\file.pkl')
$ cache_file.write('This is a text')
$ cache_file.read()
'This is a text'
```

Encrypt your pickle file as below:

```python
$ cache_file = Cache(r'Path\to\Cache\encr_file.pkl', password="Strong_Password")
$ cache_file.write('This is an encrypted text')
$ cache_file.read()
'This is an encrypted text'
```

## 4. Roadmap

- Add encryption option to the cache file

## 5. License

See LICENSE.txt for more information.

## 6. Contact

Arun Kishore - @rpakishore

Project Link: https://github.com/rpakishore/

## 7. Acknowledgements

Use this section to mention useful resources and libraries that you have used in your projects.

- Awesome README Template
- Banner Maker
- Shields.io
- Carbon
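A pickle-backed read/write cache like the one described can be sketched in a few lines of stdlib Python. This is only an illustration of the idea; ak_cache's actual implementation and its password-based encryption are not shown, and `PickleCache` is a hypothetical name:

```python
import os
import pickle
import tempfile
from pathlib import Path

class PickleCache:
    """Minimal stand-in for a Cache(path) object: pickles any Python
    object to a file on write() and unpickles it on read()."""
    def __init__(self, path: str):
        self.path = Path(path)

    def write(self, data) -> None:
        with self.path.open("wb") as fh:
            pickle.dump(data, fh)

    def read(self):
        with self.path.open("rb") as fh:
            return pickle.load(fh)

tmp = os.path.join(tempfile.mkdtemp(), "file.pkl")
cache = PickleCache(tmp)
cache.write({"msg": "This is a text"})
print(cache.read())  # {'msg': 'This is a text'}
```

Because pickle round-trips arbitrary Python objects, the same two methods cover strings, dicts, dataframes, and so on.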
akcalculator
Failed to fetch description. HTTP Status Code: 404
akcli
No description available on PyPI.
akc-mamba
# AKC-MAMBA manuals

## I. Installation Instructions

### 1. Prerequisites

Before you begin, you should confirm that you have installed all the prerequisites below on the platform where you will be running AKC-Mamba.

#### a. Install pip3

If you have not installed pip3, use the following commands to install:

```console
curl "https://bootstrap.pypa.io/get-pip.py" -o "get-pip.py"
python3 get-pip.py --user
```

For checking the version:

```console
pip3 --version
```

### 2. Install AKC-Mamba

#### a. Install AKC-Mamba from pip package

You can use the following command:

```console
pip3 install akc-mamba
```

After installing successfully, you can get help with the command:

```console
mamba --help
```

#### b. Install and run from source code

Install the required Python3 modules with:

```console
pip3 install -r requirements.txt
```

Use akc-mamba via the python3 command:

```console
python3 mamba.py --help
```

### 3. Deploy and bootstrap network with CLI

#### a. Prepare environment

We can now use the Mamba tool to prepare the required helm and k8s components:

```console
mamba environment
```

After running this command, the program will ask you to fill in some of the most necessary information for creating a blockchain network:

- **Cluster name**: The name of the cluster network you created in the step "Setup an AWS EKS cluster". Default: `cluster-mamba-example`
- **Kubernetes type**: Currently akc-mamba supports Kubernetes of two types: `eks` and `minikube`. The default is `eks`.
- **EFS information**: After you have entered the Kubernetes type, mamba will automatically search your k8s network for information about EFS. If you had EFS installed before, the system will automatically update the config file located at `~/.akachain/akc-mamba/mamba/config/.env`. If not, you need to fill in the EFS SERVER information based on the installation step "Setup a Network File System". If the k8s type is `minikube`, you do not need to enter this information.

**Important Note:** You can check and update configuration parameters in `~/.akachain/akc-mamba/mamba/config/.env`; the file content is pretty much self-explained.

#### b. Deploy and bootstrap network

```console
mamba start
```

The `mamba start` command executes a series of sub commands that install various network components. For more information on each command for individual components, please refer to the help section:

```console
mamba --help
```

To terminate the network, just run:

```console
mamba terminate
```

## II. Development Guide

### 1. Project structure

Mamba makes use of Click, an elegant Python package for creating command line interfaces. The project structure is depicted in the tree below.

```
.
├── command_group_1
│   ├── commands.py
│   ├── __init__.py
├── utils
│   ├── __init__.py
│   ├── kube.py
├── settings
│   ├── settings.py
├── mamba.py
```

There are 4 main components:

- mamba.py: The bootstrap instance module of Mamba
- settings: Contains global variables that are shared across all sub modules
- command_group: Each command group is separated into its own directory.
- utils: Helper functions that must be initialized via settings.py

### 2. Coding Convention

Please follow PEP8 - Style Guide for Python Code. Another example can be found here.

There are several notes that differ from other languages:

- Function names should be lowercase, with words separated by underscores as necessary to improve readability. Camel case is for class names.

### 3. Logging instruction

A snake must know how to hiss ... or sometimes rattle. Normally we can just use echo to print out messages during execution. However:

- It is mandatory to `hiss` when there is an error.
- Also, a `rattle` is needed when a snake meets something ... at the beginning or at the end of an execution.

For more information about logging, please follow the standard convention in `mamba/utils/hiss.py`.
akcompress
This is a text compression library implemented on the lines of Huffman coding. The code base is implemented and maintained by Akshay Kumar Singh.

Procedure:

```console
pip3 install akcompress
```

```python
import akcompress
from akcompress import compress
from akcompress import expand
```

You are good to go. To access documentation, use the doc() method.

Change Log

0.0.1 (30/06/2021)

First Release
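Huffman coding, which this library is described as being built on, assigns shorter bit codes to more frequent symbols. A minimal self-contained sketch of the code-table construction (not akcompress's API):

```python
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict:
    """Build a Huffman code table: frequent symbols get shorter bit strings."""
    freq = Counter(text)
    if len(freq) == 1:  # degenerate single-symbol input
        return {ch: "0" for ch in freq}
    # Each heap entry: (frequency, unique tie-breaker, {symbol: code-so-far})
    heap = [(n, i, {ch: ""}) for i, (ch, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        n1, _, c1 = heapq.heappop(heap)
        n2, _, c2 = heapq.heappop(heap)
        # Prefix the two subtrees' codes with 0 and 1 respectively
        merged = {ch: "0" + code for ch, code in c1.items()}
        merged.update({ch: "1" + code for ch, code in c2.items()})
        heapq.heappush(heap, (n1 + n2, tie, merged))
        tie += 1
    return heap[0][2]

codes = huffman_codes("aaaabbc")
compressed = "".join(codes[ch] for ch in "aaaabbc")
print(codes, compressed)
```

Because the resulting codes are prefix-free, the bit stream can be decoded unambiguously by walking it against the same table.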
ak-construct
Construct is a powerful *declarative* parser (and builder) for binary data.

Instead of writing *imperative code* to parse a piece of data, you declaratively define a *data structure* that describes your data. As this data structure is not code, you can use it in one direction to *parse* data into Pythonic objects, and in the other direction, convert ("build") objects into binary data.

The library provides both simple, atomic constructs (such as integers of various sizes), as well as composite ones which allow you to form hierarchical structures of increasing complexity. Construct features *bit and byte granularity*, easy debugging and testing, an *easy-to-extend subclass system*, and lots of primitive constructs to make your work easier:

- Fields: raw bytes or numerical types
- Structs and Sequences: combine simpler constructs into more complex ones
- Adapters: change how data is represented
- Arrays/Ranges: duplicate constructs
- Meta-constructs: use the context (history) to compute the size of data
- If/Switch: branch the computational path based on the context
- On-demand (lazy) parsing: read only what you require
- Pointers: jump from here to there in the data stream

**Note**

Construct3 is a rewrite of Construct2; the two are incompatible, thus construct3 will be released as a *different package*. Construct 2.5 is the last release of the 2.x codebase.

Construct 2.5 drops the experimental text parsing support that was added in Construct 2.0; it was highly inefficient and I chose to concentrate on binary data.

## Example

A `PascalString` is a string prefixed by its length:

```python
>>> from construct import *
>>>
>>> PascalString = Struct("PascalString",
...     UBInt8("length"),
...     Bytes("data", lambda ctx: ctx.length),
... )
>>>
>>> PascalString.parse("\x05helloXXX")
Container({'length': 5, 'data': 'hello'})
>>> PascalString.build(Container(length = 6, data = "foobar"))
'\x06foobar'
```

Instead of specifying the length manually, let's use an adapter:

```python
>>> PascalString2 = ExprAdapter(PascalString,
...     encoder = lambda obj, ctx: Container(length = len(obj), data = obj),
...     decoder = lambda obj, ctx: obj.data
... )
>>> PascalString2.parse("\x05hello")
'hello'
>>> PascalString2.build("i'm a long string")
"\x11i'm a long string"
```

See more examples of file formats and network protocols in the repository.

## Resources

Construct's homepage is http://construct.readthedocs.org, where you can find all kinds of docs and resources. The library itself is developed on github; please use github issues to report bugs, and github pull-requests to send in patches. For general discussion or questions, please use the new discussion group.

## Requirements

Construct should run on any Python 2.5-3.5 or PyPy implementation. Its only requirement is six, which is used to overcome the differences between Python 2 and 3.
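For comparison, the same Pascal-string round-trip can be written imperatively with the stdlib `struct` module — this is exactly the kind of hand-written parse/build pair that the declarative `Struct` definition above replaces with a single description:

```python
import struct

def build_pascal_string(data: bytes) -> bytes:
    """Length-prefix the payload with one unsigned byte (format 'B')."""
    return struct.pack("B", len(data)) + data

def parse_pascal_string(raw: bytes) -> bytes:
    """Read the one-byte length, then slice out exactly that many bytes."""
    (length,) = struct.unpack_from("B", raw, 0)
    return raw[1:1 + length]

packed = build_pascal_string(b"hello")
print(packed)                                       # b'\x05hello'
print(parse_pascal_string(b"\x05helloXXX"))         # b'hello' (trailing bytes ignored)
```

Note how the imperative version needs two separate functions that must be kept in sync by hand, whereas the declarative definition yields both directions from one structure.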
akCore
No description available on PyPI.