package | package-description |
---|---|
aionewton | No description available on PyPI. |
aionextpay | AioNextPay: an async library for making requests to the https://nextpay.org purchase gateway. How to install: pip install aionextpay. How to use: First import NextPay from aionextpay:

from aionextpay import NextPay

Then create an instance of the NextPay class inside an async function, passing its parameters:

from aionextpay import NextPay

token = 'your_nextpay_token'
callback_uri = 'yourdomain.ir/verify'

async def func():
    amount = '10000'  # price of your product
    nextpay = NextPay(token, amount, callback_uri)

Then use the purchase function:

from aionextpay import NextPay

token = 'your_nextpay_token'
callback_uri = 'yourdomain.ir/verify'

async def func():
    amount = '10000'  # price of your product
    nextpay = NextPay(token, amount, callback_uri)
    trans_id = await nextpay.purchase(order_id)

Keep in mind that the purchase function takes keyword arguments, so read the docs. If everything goes well, you get a trans_id back from that function. P.S.: you have to create a payment gateway link with that trans_id and give it to the client, like this:

from aionextpay import NextPay

token = 'your_nextpay_token'
callback_uri = 'yourdomain.ir/verify'

async def func():
    amount = '10000'  # price of your product
    nextpay = NextPay(token, amount, callback_uri)
    trans_id = await nextpay.purchase(order_id)
link = f"https://nextpay.org/nx/gateway/payment/{trans_id}"When your client complete the purchase nextpay will request to the address you gave tocallback_urivariableWhen it does verify the purchase in your request handler like this :await nextpay.verify(trans_id)If everything goes good it will return True otherwise an exception will raiseYou can also refund the payment like this :await nextpay.refund(trans_id)If everything goes good it will return True otherwise an exception will raise |
aiongrok | UNKNOWN |
aionic | The package is an async library for the API of the Russian DNS registrar Ru-Center (a.k.a. NIC.RU). It provides classes for managing DNS services, zones and records. This project is based on: https://github.com/andr1an/nic-api

Installation. Using pip:

pip install aionic

Usage. Initialization: To start using the API, you should get a pair of OAuth application login and password from NIC.RU. Here is the registration page: https://www.nic.ru/manager/oauth.cgi?step=oauth.app_register

import asyncio
from nic_api import NICApi

def print_token(token: dict):
    print("Token:", token)

api = NICApi(
    client_id="---",
    client_secret="---",
    username="---/NIC-D",
    password="---",
    scope="GET:/dns-master/.+",
    token_updater=print_token,
)

# First you need to get a token
async def main():
    await api.get_token()

asyncio.run(main())

Get token: Call the get_token() method:

# First you need to get a token
async def main():
    await api.get_token()

asyncio.run(main())

Now you are ready to use the API. A token can be saved anywhere, for example to a file, using the token_updater callback. It can also be used for authorization: neither password nor username is required as long as the token is valid.

Viewing services and DNS zones: On nic.ru, DNS zones are located in "services":

api.services()

Usually there is one service per account. Let's view the available zones in the service MY_SERVICE:

async def main():
    await api.zones('MY_SERVICE')

asyncio.run(main())

Before starting a modification, make sure there are no uncommitted changes in the zone, because they would be applied on commit.

Getting DNS records: You have to specify both the service and the DNS zone name to view or modify a record:

async def main():
    await api.records('MY_SERVICE', 'example.com')

asyncio.run(main())

Creating a record: To add a record, create an instance of one of the nic_api.models.DNSRecord subclasses, e.g. ARecord:

import aionic.models as nic_models

record_www = nic_models.ARecord(name='www', a='8.8.8.8', ttl=3600)

Add this record to the zone and commit the changes:

async def main():
    await api.add_record(record_www, 'MY_SERVICE', 'example.com')
    await api.commit('MY_SERVICE', 'example.com')

asyncio.run(main())

Deleting a record: Every record in the zone has a unique ID, accessible via the DNSRecord.id property. Once you have the ID, pass it to the delete_record method:

async def main():
    await api.delete_record(10, 'MY_SERVICE', 'example.com')
    await api.commit('MY_SERVICE', 'example.com')

asyncio.run(main())

Do not forget to always commit the changes! |
aio_nirvana | This is a security placeholder package. If you want to claim this name for legitimate purposes, please contact us at [email protected]@yandex-team.ru |
aionlib | aionlib: library support for aion. Contents: Introduction, Installation, Todo, License. Introduction: aionlib is the official library support for aion. Installation: To install aionlib with pip3, type:

sudo pip3 install aionlib

If you want to install aionlib from GitHub, type:

sudo git clone https://github.com/blueShard-dev/aionlib
cd aionlib
sudo python3 setup.py install

Todo: tutorial for all classes / functions. License: This project is licensed under the Mozilla Public License 2.0 (MPL-2.0); see the LICENSE file for more details. |
aionmap | See on GitHub. |
aionostr | aionostr: an asyncio nostr client. Free software: BSD license. Documentation: https://aionostr.readthedocs.io.

Features: Retrieve anything from the nostr network, using one command:

$ aionostr get nprofile1qqsv0knzz56gtm8mrdjhjtreecl7dl8xa47caafkevfp67svwvhf9hcpz3mhxue69uhkgetnvd5x7mmvd9hxwtn4wvspak3h
$ aionostr get -v nevent1qqsxpnzhw2ddf2uplsxgc5ctr9h6t65qaalzvzf0hvljwrz8q64637spp3mhxue69uhkyunz9e5k75j6gxm
$ aionostr query -s -q '{"kinds": [1], "limit": 10}'
$ aionostr send --kind 1 --content test --private-key <privatekey>
$ aionostr mirror -r wss://source.relay -t wss://target.relay --verbose '{"kinds": [4]}'

Set environment variables:

NOSTR_RELAYS=wss://brb.io,wss://nostr.mom
NOSTR_KEY=`aionostr gen | head -1`

Credits: This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.

History:
0.19.0 (2023-03-07): Improved benchmark: aionostr bench. Optimized Event object.
0.18.0 (2023-03-03): Allow for pretty-printing events, or just printing the content. Enable authentication when mirroring.
0.17.0 (2023-03-03): Support naddr NIP-19 type.
0.16.0 (2023-02-11): Improve benchmark. Allow manager to authenticate.
0.11.0 (2023-01-30): Auto authenticate. Support 'note' NIP-19 type.
0.7.0-0.8.0 (2023-01-28): Support for nrelay type.
0.6.0 (2023-01-25): Implement reconnect.
0.5.0 (2023-01-25): Support NIP-21.
0.1.0 (2023-01-18): First release on PyPI.
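The README shows only the CLI; for use as a library, a minimal sketch is below, assuming the CLI verbs map to coroutines in the aionostr package (an assumption; the function name get_anything and the identifier placeholder are illustrative, so check the documentation for the real names):

import asyncio

import aionostr  # assumed to expose coroutine counterparts of the CLI verbs

async def main():
    # Hypothetical: fetch an event by its NIP-19 identifier from the
    # relays listed in the NOSTR_RELAYS environment variable.
    event = await aionostr.get_anything("nevent1...")
    print(event)

asyncio.run(main())

Exact names may differ; consult https://aionostr.readthedocs.io. |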
aionotify | aionotify is a simple, asyncio-based inotify library. Its use is quite simple:

import asyncio

import aionotify

# Setup the watcher
watcher = aionotify.Watcher()
watcher.watch(alias='logs', path='/var/log', flags=aionotify.Flags.MODIFY)

async def work():
    await watcher.setup()
    for _i in range(10):  # Pick the 10 first events
        event = await watcher.get_event()
        print(event)
    watcher.close()

asyncio.run(work())

Links: Code at https://github.com/rbarrois/aionotify; Package at https://pypi.python.org/pypi/aionotify/; Continuous integration at https://travis-ci.org/rbarrois/aionotify/

Events: An event is a simple object with a few attributes: name (the path of the modified file); flags (the modification flag; use aionotify.Flags.parse() to retrieve a list of individual values); alias (the alias of the watch triggering the event); cookie (for renames, this integer value links the "renamed from" and "renamed to" events).

Watches: aionotify uses a system of "watches", similar to inotify. A watch may have an alias; by default, it uses the path name:

watcher = aionotify.Watcher()
watcher.watch('/var/log', flags=aionotify.Flags.MODIFY)
# Similar to:
watcher.watch('/var/log', flags=aionotify.Flags.MODIFY, alias='/var/log')

A watch can be removed by using its alias:

watcher = aionotify.Watcher()
watcher.watch('/var/log', flags=aionotify.Flags.MODIFY)
watcher.unwatch('/var/log') |
aionotion | 📟 aionotion: a Python 3, asyncio-friendly library for Notion® Home Monitoring. aionotion is a Python 3, asyncio-friendly library for interacting with Notion home monitoring sensors. Contents: Installation, Python Versions, Usage, Contributing.

Installation:

pip install aionotion

Python Versions: aionotion is currently supported on Python 3.10, Python 3.11 and Python 3.12.

Usage:

import asyncio

from aiohttp import ClientSession

from aionotion import async_get_client_with_credentials

async def main() -> None:
    """Create the aiohttp session and run the example."""
    async with ClientSession() as session:
        client = await async_get_client_with_credentials(
            "<EMAIL>", "<PASSWORD>", session=session
        )

        # Get the UUID of the authenticated user:
        client.user_uuid
        # >>> xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx

        # Get the current refresh token of the authenticated user (BE CAREFUL):
        client.refresh_token
        # >>> abcde12345

        # Get all "households" associated with the account:
        systems = await client.system.async_all()
        # >>> [System(...), System(...), ...]

        # Get a system by ID:
        system = await client.system.async_get(12345)
        # >>> System(...)

        # Get all bridges associated with the account:
        bridges = await client.bridge.async_all()
        # >>> [Bridge(...), Bridge(...), ...]

        # Get a bridge by ID:
        bridge = await client.bridge.async_get(12345)
        # >>> Bridge(...)

        # Get all sensors:
        sensors = await client.sensor.async_all()
        # >>> [Sensor(...), Sensor(...), ...]

        # Get a sensor by ID:
        sensor = await client.sensor.async_get(12345)
        # >>> Sensor(...)

        # Get "listeners" (conditions that a sensor is monitoring) for all sensors:
        listeners = await client.listener.async_all()
        # >>> [Listener(...), Listener(...), ...]

        # Get all listener definitions supported by Notion:
        definitions = await client.listener.async_definitions()
        # >>> [ListenerDefinition(...), ListenerDefinition(...), ...]

        # Get user info:
        user_info = await client.user.async_info()
        # >>> User(...)

        # Get user preferences:
        user_preferences = await client.user.async_preferences()
        # >>> UserPreferences(...)

asyncio.run(main())

Using a Refresh Token: During the normal course of operations, aionotion will automatically maintain a refresh token and use it when needed. At times, you may wish to manage that token yourself (so that you can use it later); aionotion provides a few useful capabilities there.

Refresh Token Callbacks: aionotion allows implementers to define callbacks that get called when a new refresh token is generated. These callbacks accept a single string parameter (the refresh token):

import asyncio

from aiohttp import ClientSession

from aionotion import async_get_client_with_credentials

async def main() -> None:
    """Create the aiohttp session and run the example."""
    async with ClientSession() as session:
        client = await async_get_client_with_credentials(
            "<EMAIL>", "<PASSWORD>", session=session
        )

        def do_something_with_refresh_token(refresh_token: str) -> None:
            """Do something interesting."""
            pass

        # Attach the callback to the client:
        remove_callback = client.add_refresh_token_callback(
            do_something_with_refresh_token
        )

        # Later, if you want to remove the callback:
        remove_callback()

asyncio.run(main())

Getting a Client via a Refresh Token: All of the previous examples retrieved an authenticated client with async_get_client_with_credentials. However, implementers may also create an authenticated client by providing a previously retrieved user UUID and refresh token:

import asyncio

from aiohttp import ClientSession

from aionotion import async_get_client_with_refresh_token

async def main() -> None:
    """Create the aiohttp session and run the example."""
    async with ClientSession() as session:
        # Create a Notion API client:
        client = await async_get_client_with_refresh_token(
            "<USER UUID>", "<REFRESH TOKEN>", session=session
        )

        # Get to work...

asyncio.run(main())

Connection Pooling: By default, the library creates a new connection to Notion with each coroutine. If you are calling a large number of coroutines (or merely want to squeeze out every second of runtime savings possible), an aiohttp ClientSession can be used for connection pooling:

import asyncio

from aiohttp import ClientSession

from aionotion import async_get_client_with_credentials

async def main() -> None:
    """Create the aiohttp session and run the example."""
    async with ClientSession() as session:
        # Create a Notion API client:
        client = await async_get_client_with_credentials(
            "<EMAIL>", "<PASSWORD>", session=session
        )

        # Get to work...

asyncio.run(main())

Check out the examples, the tests, and the source files themselves for method signatures and more examples.

Contributing: Thanks to all of our contributors so far! Check for open features/bugs or initiate a discussion on one. Fork the repository. (Optional, but highly recommended) Create a virtual environment: python3 -m venv .venv. (Optional, but highly recommended) Enter the virtual environment: source ./.venv/bin/activate. Install the dev environment: script/setup. Code your new feature or bug fix on a new branch. Write tests that cover your new functionality. Run tests and ensure 100% code coverage: poetry run pytest --cov aionotion tests. Update README.md with any new documentation. Submit a pull request! |
aionova | aionova: a simple asyncio Python library to interact with the Anova Precision Cooker. Getting your cooker_id/cooker_secret: see the advice given here: https://github.com/bmedicke/anova.py/issues/1#issuecomment-446835744. Credits: based on @bmedicke's library https://github.com/bmedicke/anova.py
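The description ships no usage snippet; purely as an illustration, a hypothetical sketch follows (the AnovaCooker class name and its methods are invented here; consult the project README for the real API):

import asyncio

from aionova import AnovaCooker  # hypothetical import; see project README

async def main():
    # cooker_id / cooker_secret obtained per the linked GitHub issue.
    cooker = AnovaCooker('your_cooker_id', 'your_cooker_secret')  # hypothetical
    await cooker.update_state()  # hypothetical: fetch the current status
    print(cooker.state)

asyncio.run(main())

All names above are assumptions patterned on the upstream anova.py library. |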
aionowplaying | aionowplaying: a cross-platform Now Playing client.

Usage:

# Using pip
pip install aionowplaying
# Using poetry
poetry add aionowplaying

Documentation: TODO; see the tests for now.

Development:

poetry install
poetry run pytest -v

License: GPL-3.0 |
aiontai | Async wrapper for the nhentai API.

Installation:

$ pip install aiontai

How to use. Create a client:

import asyncio

from aiohttp import ClientSession

from aiontai import (
    NHentaiClient,
    NHentaiAPI,
    Conventer,
)

async def main() -> None:
    client_object = NHentaiClient(
        api=NHentaiAPI(
            ClientSession(),
        ),
        conventer=Conventer(),
    )

asyncio.run(main())

Or you can use injector, which will create the object itself (the next examples will use it):

import asyncio

from injector import Injector

from aiontai import (
    NHentaiClient,
    ClientModule,
)

async def main() -> None:
    injector = Injector(ClientModule())
    client_object = injector.get(NHentaiClient)

asyncio.run(main())

Example of using the client:

async def main() -> None:
    injector = Injector(ClientModule())
    client_object = injector.get(NHentaiClient)

    async with client_object as client:  # Will close the session itself
        doujin = await client.get_random_doujin()
        for page in doujin.images:
            print(page.url)
        print(doujin.to_json())

asyncio.run(main())

Example of using a proxy:

...
from injector import (
    provider,
    Injector,
    Module,
)

from aiohttp_proxy import ProxyConnector  # pip install aiohttp_proxy
...

class AiohttpProxyModule(Module):
    def __init__(self, proxi_url: str) -> None:
        self._proxi_url = proxi_url

    @provider
    def provide_client_session(self) -> ClientSession:
        connector = ProxyConnector.from_url(self._proxi_url)
        return ClientSession(connector=connector)

async def main() -> None:
    injector = Injector(
        modules=[
            ClientModule(),
            AiohttpProxyModule("http://user:[email protected]:1080"),
        ],
    )
    client_object = injector.get(NHentaiClient)

asyncio.run(main())

Example of using the low-level API:

async def main() -> None:
    injector = Injector(ClientModule())
    client_object = injector.get(NHentaiAPI)

    async with client_object as client:
        doujin = await client.get_random_doujin()
        # Returns: Dict[str, Any] from the API without loss of information
        print(doujin)

asyncio.run(main()) |
aiontutil | aiontutil: AIONT utility package.

Install. Method 1:

pip install aiontutil

Method 2:

git clone [email protected]:aiont-joelan/aiontutil.git
cd aiontutil/
make build
cd dist/
pip install aiontutil-x.y.z-py3-none-any.whl

Testing:

make test |
aionuki | aionuki: asynchronous Python library for interacting with Nuki locks and openers. Forked from pynuki; refactored to use aiohttp and asyncio. Supports automatic bridge discovery using nuki.io servers and interactive authentication without manually entering any token. Supports parsing callbacks and integrating the result into the object's data structure. Full support of the Nuki API Spec v1.12 (PDF here).

Installation:

pip install -U aionuki

Usage:

import asyncio

from aionuki import NukiBridge

async def main():
    bridges = await NukiBridge.discover()

    async with (bridges[0])(token=None) as br:
        print("Starting the interactive auth procedure.", br)

        if not br.token:
            print("Received token:", await br.auth())

        await br.connect()

        lock = (await br.locks)[0]
        await lock.lock()
        await lock.unlock()

loop = asyncio.get_event_loop()
loop.run_until_complete(main())

More info in the examples directory. |
aionursery | This library implements a Nursery object, similar to trio's Nursery, for asyncio.

async def child():
    ...

async def parent():
    async with aionursery.Nursery() as nursery:
        # Make two concurrent calls to child
        nursery.start_soon(child())
        nursery.start_soon(child())

Tasks form a tree: when you run your main coroutine (via asyncio.get_event_loop().run_until_complete or asyncio.run), this creates an initial task, and all your other tasks will be children, grandchildren, etc. of the main task. The body of the async with block acts like an initial task that's running inside the nursery, and then each call to nursery.start_soon adds another task that runs in parallel.

Keep in mind that: If any task inside the nursery raises an unhandled exception, then the nursery immediately cancels all the tasks inside the nursery. Since all of the tasks are running concurrently inside the async with block, the block does not exit until all tasks have completed. If you've used other concurrency frameworks, then you can think of it as: the de-indentation at the end of the async with automatically "joins" (waits for) all of the tasks in the nursery. Once all the tasks have finished, then:
* The nursery is marked as "closed", meaning that no new tasks can be started inside it.
* Any unhandled exceptions are re-raised inside the parent task. If there are multiple exceptions, then they're collected up into a single MultiError exception.

Since all tasks are descendants of the initial task, one consequence of this is that the parent can't finish until all tasks have finished. Please note that you can't reuse an already exited nursery: trying to re-open it again, or to start_soon more tasks in it, will raise NurseryClosed, as the sketch below illustrates.
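A minimal sketch of that closed-nursery behavior, assuming only the names documented above (Nursery, start_soon, NurseryClosed):

import asyncio

import aionursery

async def child():
    await asyncio.sleep(0)

async def parent():
    async with aionursery.Nursery() as nursery:
        nursery.start_soon(child())
    # The nursery has now exited and is closed.
    try:
        nursery.start_soon(child())  # the coroutine is never run
    except aionursery.NurseryClosed:
        print("cannot start tasks in an exited nursery")

asyncio.run(parent())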
Shielding some tasks from being cancelled: Sometimes, however, you need the opposite behavior: a child must execute no matter what exceptions are raised in other tasks. Imagine a payment transaction running in one task, and an SMS being sent in another. You certainly don't want an SMS sending error to cancel a payment transaction. For that, you can asyncio.shield your tasks before starting them in the nursery:

async def perform_payment():
    ...

async def send_sms():
    ...

async def parent():
    async with Nursery() as nursery:
        nursery.start_soon(asyncio.shield(perform_payment()))
        nursery.start_soon(send_sms())

Getting results from children: If your background tasks are not quite long-lived and return some useful values that you want to process, you can gather all tasks into a list and use asyncio.wait (or similar functions) as usual:

async def parent():
    async with Nursery() as nursery:
        task_foo = nursery.start_soon(foo())
        task_bar = nursery.start_soon(bar())
        results = await asyncio.wait([task_foo, task_bar])

If your background tasks are long-lived, you should use asyncio.Queue to pass objects between children and parent tasks:

async def child(queue):
    while True:
        data = await from_external_system()
        await queue.put(data)

async def parent():
    queue = asyncio.Queue()
    async with Nursery() as nursery:
        nursery.start_soon(child(queue))
        while some_condition():
            data = await queue.get()
            await do_stuff_with(data)

Integration with async_timeout: You can wrap a nursery in an async_timeout.timeout context manager. When the timeout happens, the whole nursery is cancelled:

from async_timeout import timeout

async def child():
    await asyncio.sleep(1000 * 1000)

async def parent():
    async with timeout(10):
        async with Nursery() as nursery:
            nursery.start_soon(child())
            await asyncio.sleep(1000 * 1000) |
aionuts | Aionuts. The main idea of the project is to use decorators in Python in place of if/else logic, to replace bulky blocks with something more suitable, and to speed things up through the use of asynchrony. The syntax and most of the commands are inherited from another, similar framework: aiogram. This is done so that porting/rewriting a bot from the Telegram API to the VK API requires changing only a few lines.
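No code ships with the description; purely as an illustration of the aiogram-style decorator pattern the text describes, a hypothetical sketch follows (Bot, Dispatcher and the handler decorator are invented names patterned on aiogram; the real aionuts API may differ):

from aionuts import Bot, Dispatcher  # hypothetical imports, aiogram-style

bot = Bot(token='vk_group_token')  # hypothetical constructor
dp = Dispatcher(bot)

# A decorator routes messages to handlers, replacing if/else chains.
@dp.message_handler(commands=['start'])  # hypothetical decorator
async def start(message):
    await message.answer('Hello from VK!')

Every name above is an assumption made for illustration only. |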
aionx | No description available on PyPI. |
aio-nxapi | Cisco NX-API asyncio Client. This repository contains a Cisco NX-API asyncio-based client that uses httpx as the underlying transport and lxml as the basis for handling XML. Note: This client does not support the NETCONF interface. WORK IN PROGRESS.

Quick Example: The following shows how to create a Device instance and run a list of commands. By default the Device instance will use HTTPS transport. The Device instance supports the following settings: host (the device hostname or IP address); username (the login user-name); password (the login password); proto (optional; choose either "https" or "http", defaults to "https"); port (optional; choose the protocol port to override the proto default).

The result of command execution is a list of CommandResults (namedtuple). The output field will be: an lxml.Element when the output format is 'xml'; a dict when the output format is 'json'; a str when the output format is 'text'.

from asyncnxapi import Device

username = 'dummy-user'
password = 'dummy-password'

async def run_test(host):
    dev = Device(host=host, creds=(username, password))
    res = await dev.exec(['show hostname', 'show version'], ofmt='json')
    for cmd in res:
        if not cmd.ok:
            print(f"{cmd.command} failed")
            continue
        # do something with cmd.output as dict since ofmt was 'json'

Limitations: Chunking is not currently supported. If anyone has need of this feature, please open an issue requesting support.

References: Cisco DevNet NX-API Reference: https://developer.cisco.com/site/cisco-nexus-nx-api-references/. Cisco platform-specific NX-API references: N3K systems, requires 7.0(3)I2(2) or later: https://www.cisco.com/c/en/us/td/docs/switches/datacenter/nexus3000/sw/programmability/7_x/b_Cisco_Nexus_3000_Series_NX-OS_Programmability_Guide_7x/b_Cisco_Nexus_3000_Series_NX-OS_Programmability_Guide_7x_chapter_010010.html. N5K systems, requires 7.3(0)N1(1) or later: https://www.cisco.com/c/en/us/td/docs/switches/datacenter/nexus5000/sw/programmability/guide/b_Cisco_Nexus_5K6K_Series_NX-OS_Programmability_Guide/nx_api.html#topic_D110A801F14F43F385A90DE14293BA46. N7K systems: https://www.cisco.com/c/en/us/td/docs/switches/datacenter/nexus7000/sw/programmability/guide/b_Cisco_Nexus_7000_Series_NX-OS_Programmability_Guide/b_Cisco_Nexus_7000_Series_NX-OS_Programmability_Guide_chapter_0101.html. N9K systems: https://www.cisco.com/c/en/us/td/docs/switches/datacenter/nexus9000/sw/6-x/programmability/guide/b_Cisco_Nexus_9000_Series_NX-OS_Programmability_Guide/b_Cisco_Nexus_9000_Series_NX-OS_Programmability_Guide_chapter_011.html |
aioodbc | aioodbc is a Python 3.7+ module that makes it possible to access ODBC databases with asyncio. It relies on the awesome pyodbc library and preserves the same look and feel. Internally aioodbc employs threads to avoid blocking the event loop; threads are not as bad as you think! Other drivers like motor use the same approach. aioodbc is fully compatible and tested with uvloop. Take a look at the test suite: all tests are executed with both the default event loop and uvloop.

Basic Example: aioodbc is based on pyodbc and provides the same api; you just need to use yield from conn.f() or await conn.f() instead of conn.f(). Properties are unchanged, so conn.prop is correct, as well as conn.prop = val.

import asyncio

import aioodbc

async def test_example():
    dsn = "Driver=SQLite;Database=sqlite.db"
    conn = await aioodbc.connect(dsn=dsn)

    cur = await conn.cursor()
    await cur.execute("SELECT 42 AS age;")
    rows = await cur.fetchall()
    print(rows)
    print(rows[0])
    print(rows[0].age)
    await cur.close()
    await conn.close()

asyncio.run(test_example())

Connection Pool: Connection pooling is ported from aiopg and relies on PEP 492 features:

import asyncio

import aioodbc

async def test_pool():
    dsn = "Driver=SQLite3;Database=sqlite.db"
    pool = await aioodbc.create_pool(dsn=dsn)

    async with pool.acquire() as conn:
        cur = await conn.cursor()
        await cur.execute("SELECT 42;")
        r = await cur.fetchall()
        print(r)
        await cur.close()
        await conn.close()
    pool.close()
    await pool.wait_closed()

asyncio.run(test_pool())

Context Managers: Pool, Connection and Cursor objects support the context management protocol:

import asyncio

import aioodbc

async def test_example():
    dsn = "Driver=SQLite;Database=sqlite.db"

    async with aioodbc.create_pool(dsn=dsn) as pool:
        async with pool.acquire() as conn:
            async with conn.cursor() as cur:
                await cur.execute("SELECT 42 AS age;")
                val = await cur.fetchone()
                print(val)
                print(val.age)

asyncio.run(test_example())

Installation: In a Linux environment pyodbc (hence aioodbc) requires the unixODBC library. You can install it using your package manager, for example:

$ sudo apt-get install unixodbc
$ sudo apt-get install unixodbc-dev

Then: pip install aioodbc

Run tests: To run tests locally without docker, install unixodbc and the sqlite driver:

$ sudo apt-get install unixodbc
$ sudo apt-get install libsqliteodbc

Create a virtualenv and install the package with requirements:

$ pip install -r requirements-dev.txt

Run tests, lints etc.:

$ make fmt
$ make lint
$ make test

Other SQL Drivers: aiopg, an asyncio client for PostgreSQL; aiomysql, an asyncio client for MySQL.

Requirements: Python 3.7+, pyodbc, uvloop (optional).

Changes:
0.5.0 (2023-10-28): Added support for Python 3.12. Bumped minimal supported version of pyodbc to 5.0.1. Dropped aiodocker-related testing to unlock Python 3.12.
0.4.1 (2023-10-28): Implemented cursor setinputsizes. Implemented cursor fetchval. Added more type annotations. Added autocommit setter for cursor.
0.4.0 (2023-03-16): Fixed compatibility with Python 3.9+. Removed usage of the explicit loop parameter. Added default read size parameter for cursor. Updated tests and CI scripts. Code base formatted with black.
0.3.3 (2019-07-05): Parameter echo passed properly in cursor #185. Close bad connections before returning back to pool #195.
0.3.2 (2018-08-04): Added basic documentation for after_created and ThreadPoolExecutor #176 (thanks @AlexHagerman). Cursor/connection context managers now roll back the transaction on error, otherwise commit if autocommit=False #178 (thanks @julianit).
0.3.1 (2018-03-23): Add after_create hook for connection configuration (thanks @lanfon72).
0.3.0 (2018-02-23): Added optional pool connection recycling #167 (thanks @drpoggi).
0.2.0 (2017-06-24): Fixed Cursor.execute returns a pyodbc.Cursor instead of itself #114. Fixed __aiter__ to not be awaitable for python>=3.5.2 #113. Tests now use aiodocker #106.
0.1.0 (2017-04-30): Fixed project version.
0.0.4 (2017-04-30): Improved mysql testing.
0.0.3 (2016-07-05): Dockerized tests; now we can add more DBs to tests using docker #15, #17, #19. Test suite executed with both default asyncio and uvloop #18.
0.0.2 (2016-01-01): Improved PEP 492 support. pool.get method removed, use acquire instead. Added tests against MySQL. Added a bunch of doc strings.
0.0.1 (2015-10-12): Initial release. |
aio-odoorpc | aio-odoorpc: an async Odoo RPC client. This package builds upon the lower-level aio-odoorpc-base, adding an AsyncOdooRPC/OdooRPC class as a thin layer that makes for a friendlier interface. AsyncOdooRPC is asynchronous code, OdooRPC is synchronous.

This package does not intend to implement all the functionality of other odoo rpc modules like ERPpeek and odoorpc. This is meant to be simpler but equally easy to work with, without trying to be too smart. One of the motivations of this package was to have an alternative to odoorpc, which started getting in my way. For instance, just instantiating a new object in odoorpc may result in a roundtrip to the remote Odoo server. These unnecessary RPC calls quickly add up and it becomes too difficult to develop fast software. Also, odoorpc offers no asynchronous interface, which is a huge lost opportunity for code that spends a lot of time waiting on blocking network calls.

Why use this instead of aio-odoorpc-base: With this interface you can instantiate an object once and then make simpler invocations of remote methods like login, read, search, search_read and search_count. With aio-odoorpc-base, you get only the lower-level execute_kw call and must pass a long list of parameters on every invocation. Also, aio-odoorpc lets you simulate the behavior of odoorpc by instantiating the AsyncOdooRPC class with a default_model_name, so that method calls do not need to pass a model_name. In this way, you can easily replace the usage of odoorpc with this object (I know because I migrated a lot of code away from odoorpc). Of course, if you are migrating from odoorpc you should take the opportunity to migrate to async code as well.

Limitations: Right now there are built-in helper functions only for getting info out (read, search, search_read, search_count); nothing to help with creating new records or updating field values. Those are coming soon.

Things to know about this module: Asyncio is a python3 thing, so no python2 support. Type hints are used everywhere. This package uses jsonrpc only (no xmlrpc). You need to manage the http client yourself; the code is tested with requests (sync), httpx (sync and async) and aiohttp (async); see 'Usage' for examples. I am willing to take patches and to add other contributors to this project; feel free to get in touch, the github page is the best place to interact with the project and the project's author. The synchronous version of the code is generated automatically from the asynchronous code, so at least for now the effort to maintain both is minimal. Both versions are unit tested.

Things to know about Odoo's API: The login() call is really only a lookup of the uid (an int) of the user given a database, username and password. If you are using this RPC client over and over in your code, maybe even calling from a stateless cloud service, you should consider finding out the user id (uid) of the user and passing the uid instead of the username to the constructor of AsyncOdooRPC. This way, you do not need to call the login() method after instantiating the class, saving an RPC call. The uid mentioned above is not a session-like id: it is really only the database id of the user and it never expires. There is really no 'login' step required to access the Odoo RPC API if you know the uid from the beginning. You will need the url of your Odoo's jsonrpc endpoint. Usually, you will just need to add 'jsonrpc' to your odoo url. Example: with the Odoo web GUI on 'https://acme.odoo.com', JSONRPC will be on 'https://acme.odoo.com/jsonrpc'. However, you may alternatively use one of the three helper methods offered by aio-odoorpc-base.helpers to build the correct url:

def build_odoo_base_url(
    *, host: str, port: Optional[int] = None, ssl: bool = True, base_url: str = ''
) -> str: ...

def build_odoo_jsonrpc_endpoint_url(
    *, host: str, port: Optional[int] = None, ssl: bool = True, base_url: str = '',
    custom_odoo_jsonrpc_suffix: Optional[str] = None
) -> str: ...

def odoo_base_url2jsonrpc_endpoint(
    odoo_base_url: str = '', custom_odoo_jsonrpc_suffix: Optional[str] = None
) -> str: ...

To import them use:

from aio_odoorpc_base.helpers import build_odoo_base_url, build_odoo_jsonrpc_endpoint_url, odoo_base_url2jsonrpc_endpoint

Examples:

build_odoo_base_url(host='acme.odoo.com', base_url='testing')
>>> https://acme.odoo.com/testing
build_odoo_jsonrpc_endpoint_url(host='acme.odoo.com', base_url='testing')
>>> https://acme.odoo.com/testing/jsonrpc
odoo_base_url2jsonrpc_endpoint('https://acme.odoo.com/testing')
>>> https://acme.odoo.com/testing/jsonrpc

Usage (note: check the tests folder for more examples): I will omit the event_loop logic; I assume that if you want an async module you already have that sorted out yourself or through a framework like FastAPI. All examples below could also be called using the synchronous OdooRPC object, but without the 'await' syntax.

from aio_odoorpc import AsyncOdooRPC
import httpx

# If the http_client you are using does not support a 'base_url' parameter like
# httpx does, you will need to pass the 'url_jsonrpc_endpoint' parameter when
# instantiating the AsyncOdooRPC object.
async with httpx.AsyncClient(base_url='https://acme.odoo.com/jsonrpc') as session:
    odoo = AsyncOdooRPC(database='acme_db', username_or_uid='demo',
                        password='demo', http_client=session)
    await odoo.login()

    try:
        # A default model name has not been set at instantiation time, so we should
        # pass the model_name on every invocation. Here it should throw an exception.
        await odoo.read()
    except RuntimeError:
        pass
    else:
        assert False, 'Expected an exception to be raised'

    # Now with model_name set it should work. Not passing a list of ids
    # turns the read call into a search_read call with an empty domain (so it
    # matches all). A missing 'fields' parameter means 'fetch all fields'.
    # Hence this call is very expensive: it fetches all fields for all
    # 'sale.order' records.
    all_orders_all_fields = await odoo.read(model_name='sale.order')

    # creates a copy of the odoo obj, setting default_model_name to 'sale.order'
    sale_order = odoo.new_for_model('sale.order')

    # now we do not need to pass model_name and it works!
    all_orders_all_fields2 = await sale_order.read()

    large_orders = await sale_order.search_read(
        domain=[['amount_total', '>', 10000]],
        fields=['partner_id', 'amount_total', 'date_order'])

Object instantiation: The AsyncOdooRPC/OdooRPC object takes these parameters: database: string, required. The name of the odoo database. username_or_uid: string or int, required. If you pass a username (a string), an invocation of the login() method will be required to fetch the uid (an int). The uid is what is really needed; if you know both username and uid, pass the uid and avoid a login() call, which costs a roundtrip to the Odoo server (see the sketch at the end of this entry). password: string, required. The user's password. Unfortunately, Odoo's jsonrpc API requires the password to be sent on every call; there is no session or token mechanism available as an alternative authentication method. http_client: an http client, optional. If an http client is not set, you will need to pass the http_client parameter on every method invocation (when available). Some http clients (e.g. httpx) let you create a session setting the appropriate url, as in async with httpx.AsyncClient(base_url='https://acme.odoo.com/jsonrpc') as session; if you do that on a supporting client, you do not need to pass the url, it is already set on the http_client. Otherwise, you will need to pass url_jsonrpc_endpoint to the constructor. url_jsonrpc_endpoint: string, optional. The url of your Odoo server's jsonrpc endpoint. You should always pass it unless your http_client already knows which url it should point to. You may use the aio-odoorpc-base helper methods build_odoo_base_url, build_odoo_jsonrpc_endpoint_url and odoo_base_url2jsonrpc_endpoint, described earlier in this README, to build this url. In short, the jsonrpc endpoint is your odoo instance's base url with a 'jsonrpc' suffix: https://acme.odoo.com/jsonrpc. default_model_name: str, optional. This parameter sets the default model_name for all method invocations that take an optional model_name parameter. If you set model_name on a method invocation, it overrides this default. When you have an instance of AsyncOdooRPC/OdooRPC, you can create a copy with a different default model_name by calling the method new_for_model('sale.order.line'). You can use this style to mimic odoorpc's way, where you can call search, read, search_read on a specific model:

odoo = AsyncOdooRPC(...)
sale_order = odoo.new_for_model('sale.order')
sale_order_line = odoo.new_for_model('sale.order.line')
partner = sale_order.new_for_model('partner')

Just remember that new_for_model is nothing special; it only sets a default model name on a
copy of an instance. Making copies of copies is perfectly ok.

Dependencies: This package depends on aio-odoorpc-base, which has no dependency itself.
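As promised above, a minimal sketch of skipping login() by passing a known uid; it assumes only the constructor parameters and helper methods documented in this entry (the uid value 2 is illustrative, and the search_count parameter names mirror the search_read example above, so verify them against the docs):

import asyncio

import httpx

from aio_odoorpc import AsyncOdooRPC

async def main():
    async with httpx.AsyncClient(base_url='https://acme.odoo.com/jsonrpc') as session:
        # Passing the uid (an int) instead of the username means no
        # login() roundtrip is needed before making calls.
        odoo = AsyncOdooRPC(database='acme_db', username_or_uid=2,
                            password='demo', http_client=session)
        count = await odoo.search_count(model_name='sale.order', domain=[])
        print(count)

asyncio.run(main())
 |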
aio-odoorpc-base | Base functions to pilot Odoo's jsonrpc API (aio-odoorpc-base).

Description: This python package implements a complete set of methods to access Odoo's external API (using jsonrpc rather than xmlrpc). It offers an almost-exact mirror of Odoo's external API; even parameter names are the same. It is 'almost-exact' because 'execute' is skipped in favor of 'execute_kw' only, and the
API methods from the 'db' service ('list', 'drop', 'dump', 'rename', 'restore') are here implemented with the names 'list_databases', 'drop_database', 'dump_database', 'rename_database' and 'restore_database' respectively.

The 'documentation' offered by this package is mostly in the form of proper type annotations, so that you have a better idea of what kind of data each API method expects. Other than that, developers are recommended to go study Odoo's external API by reading the source code at https://github.com/odoo/odoo/tree/master/odoo/service. The three API services 'object', 'common' and 'db' are implemented there in the files model.py, common.py and db.py respectively. In each of these python files, a 'dispatch' method is implemented for the service in question. The methods available on the external service api are usually those prefixed with 'exp_' in the method name, with the exception of the 'object' service, which only exposes 'execute' and 'execute_kw'.

All functions offered by this package are available in both async and sync versions.

Odoo's API methods implemented: about, authenticate, change_admin_password, create_database, db_exist, drop_database, dump_database, duplicate_database, execute_kw, list_countries, list_databases, list_lang, login, migrate_databases, rename_database, restore_database, server_version, version.

All methods take as their first 2 parameters: http_client: a callable or an instance of a compatible http client (it must implement a 'post' method that accepts a 'url' and a 'json' parameter; the packages 'requests', 'httpx' and 'aiohttp' are compatible). If http_client is a callable, it will be called with a dict as the post payload and must return a response object with a '.json()' method that may be synchronous or asynchronous (when using the async functions); it must return a dict or dict-like object representing the response (see the sketch below). url: the complete URL of your Odoo's jsonrpc endpoint, usually something like 'https://odoo.acme.com/jsonrpc' or 'https://odoo.acme.com:8443/jsonrpc'.

Remaining parameters on each method are those expected by Odoo's external API, with identical names as you will find in Odoo's source code. The method 'jsonrpc' is the low-level method in this package that actually does all the HTTP calls for all implemented methods.
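A minimal sketch of the callable form of http_client described above, assuming only the documented contract (called with the post payload as a dict, returns an object exposing .json() that yields a dict). How the url parameter interacts with a callable client is not specified here, so this sketch bakes the endpoint into the wrapper; the 'requests' package is used purely for illustration:

import requests

URL = 'https://odoo.acme.com/jsonrpc'  # illustrative endpoint

def http_client(payload: dict):
    # Posts the jsonrpc payload and returns the response object,
    # whose .json() method yields a dict, as the contract requires.
    return requests.post(URL, json=payload)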
If you want the sync methods you must import from 'aio_odoorpc_base.sync'. You may also use
'aio_odoorpc_base.aio' if you prefer to be explicit on whether you are importing sync or async code.aio-odoorpc: a higher-level APIIn practice, you may notice that 99% of the time you will be calling the 'execute_kw' method
which is what allows you to deal with Odoo's models, reading and writing actual business data
via the model methods 'search', 'read', 'search_read', 'search_count', 'write', 'create', etc.
While this package only offers you a bare 'execute_kw' method and a helper 'execute_kwargs',
the higher-level package 'aio-odoorpc' expands over this one adding higher-level objects and methods
(such as 'search', 'read', 'search_read', 'search_count', 'write', 'create', etc) to consume those
model methods through calls to 'execute_kw' external API method.No dependencies:No dependency is not a promise, just a preference. It may change in the future, but only if for very
good reason. Here, are free to use whatever HTTP Client library you want.I am willing to make modifications in the code in order to support other http client solutions,
just get in touch (use the project's github repository for that).While it would be easier if this package shipped with a specific http client dependency, it should be
noted that having the possibility to reuse HTTP sessions is a great opportunity to improve the
speed of your running code. Also, it is possible that your project is already using some http client
library and here you have the opportunity to use it.Remember that you must use an async http client library if you are going to use the async functions,
or use a synchronous http client library if you are going to use the sync function.Python HTTP Client packages known to be compatible:sync-only: 'requests'async-only: 'aiohttp'sync and async: 'httpx'Motivation:The package 'odoorpc' is the most used and better maintained package to let you easily consume Odoo's
external API. It has lots of functionality, good documentation, a large user base and was developed
by people that are very experienced with Odoo in general and big contributors to the Odoo Community.In other words, if you are taking your first steps and do not need an async interface now, start with
odoorpc.However, for my needs, once I was developing Odoo integrations that needed to make hundreds of calls
to the Odoo API to complete a single job, I began to sorely miss an async interface as well as more
control over the HTTP client used (I wished for HTTP2 support and connection polling/reuse).Also, as I understood Odoo's external API, it started to sound like 'odoorpc' was too big for a task
too simple. For instance, most of the time (like 99,99% of the time), you will be calling to a single
jsonrpc method called 'execute_kw'. It is the same call over and over just changing the payload which
itself is a simple json.So I decided to develop a new package myself, made it async-first and tryed to keep it as simple as
possible. Also, I decided to split it in two, a very simple base package (this one) with only methods
that mirror those in Odoo's external API and another one 'aio-odoorpc' that adds another layer to
implement Odoo's model methods like 'search', 'search_read', 'read', etc. as well as an object model
to instantiate a class once and then make simple method invocation with few parameters to access
what you need.Useful tips about Odoo's external API:The 'login' call is really only a lookup of the user_id (an int) of a user given a
database name, user/login name and password. If you are using this RPC client over and over in your
code, maybe even calling from a stateless cloud service, you should consider finding out the
user id (uid) of the user and pass the uid instead of the username to the constructor of AsyncOdooRPC.
This way, you do not need to call the login() RPC method to retrieve the uid, saving a RPC call;The uid mentioned above is not a session-like id. It is really only the database id of the user
and it never expires. There is really no 'login' or 'session initiation' step required to access
Odoo's external API if you know the uid from the beginning;Other things to know about this module:It ships will a good suite of tests that run against an OCA runbot instance;Asyncio is a python3 thing, so no python2 support;Type hints are used everywhere;This package uses jsonrpc only (no xmlrpc). There is a lack of async xmlrpc tooling and
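A minimal sketch of that tip applied to this package's own functions: calling execute_kw directly with a known uid, so no login() call is made first. It assumes only the signatures shown in the Usage example at the end of this entry; the uid value 2 is illustrative:

import asyncio

import httpx

from aio_odoorpc_base.aio import execute_kw
from aio_odoorpc_base.helpers import execute_kwargs

url = 'https://odoo.acme.com/jsonrpc'
uid = 2  # illustrative: looked up once, stored, and reused forever

async def main():
    async with httpx.AsyncClient() as client:
        kwargs = execute_kwargs(fields=['name'], limit=10)
        data = await execute_kw(http_client=client, url=url,
                                db='acme', uid=uid, password='demo',
                                obj='res.partner', method='search_read',
                                args=[], kw=kwargs)
        print(data)

asyncio.run(main())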
Other things to know about this module: It ships with a good suite of tests that run against an OCA runbot instance. Asyncio is a python3 thing, so no python2 support. Type hints are used everywhere. This package uses jsonrpc only (no xmlrpc): there is a lack of async xmlrpc tooling, and jsonrpc is considered the best RPC protocol in Odoo (faster, more widely used). The synchronous version of the code is generated automatically from the asynchronous code, so at least for now the effort to maintain both is minimal. I am willing to take patches and to add other contributors to this project; feel free to get in touch, the github page is the best place to interact with the project and the project's author. I only develop and run code in Linux environments; if you find a bug under another OS I am happy to take patches, but I will not myself spend time looking into these eventual bugs.

Usage: Ok, so let's start with some examples. I will omit the event_loop logic; I assume that if you want to use an async module you already have that sorted out yourself or through a framework like FastAPI. All examples below could also be called using the synchronous OdooRPC object, but without the 'await' syntax. I recommend that you check the tests folder for many more examples. Also, the codebase is very, very short; do refer to it as well.

from aio_odoorpc_base.aio import login, execute_kw
from aio_odoorpc_base.helpers import execute_kwargs
import httpx

url = 'https://odoo.acme.com/jsonrpc'

async with httpx.AsyncClient() as client:
    uid = await login(http_client=client, url=url, db='acme', login='demo', password='demo')
    kwargs = execute_kwargs(fields=['partner_id', 'date_order', 'amount_total'],
                            limit=1000, offset=0, order='amount_total DESC')
    data = await execute_kw(http_client=client,
                            url=url,
                            db='acme',
                            uid=uid,
                            password='demo',
                            obj='sale.order',
                            method='search_read',
                            args=[],
                            kw=kwargs) |
aiookru | aiookru is a python ok.ru API wrapper. The main features are: authorization (Authorization Code, Implicit Flow, Password Grant, Refresh Token) and REST API methods.

Usage: To use the ok.ru API you need a registered app and an ok.ru account. For more details, see the aiookru documentation.

Client application: Use ClientSession when the REST API is needed in: the client component of a client-server application; a standalone mobile/desktop application; i.e. when you embed your app's info (application key) in publicly available code.

from aiookru import ClientSession, API

session = ClientSession(app_id, app_key, access_token, session_secret_key)
api = API(session)

events = await api.events.get()
friends = await api.friends.get()

Pass the session_secret_key and access_token that were received after authorization. For more details, see the authorization instructions.

Server application: Use ServerSession when the REST API is needed in: the server component of a client-server application; requests from your servers.

from aiookru import ServerSession, API

session = ServerSession(app_id, app_key, app_secret_key, access_token)
api = API(session)

events = await api.events.get()
friends = await api.friends.get()

Pass the app_secret_key and access_token that was received after authorization. For more details, see the authorization instructions.

Installation:

pip install aiookru

or

python setup.py install

Supported Python Versions: Python 3.5, 3.6, 3.7 and 3.8 are supported.

Test: Run all tests:

python setup.py test

Run tests with PyTest:

python -m pytest [-k TEST_NAME]

License: aiookru is released under the BSD 2-Clause License. |
aio-omdb | aio-omdb: asynchronous and synchronous Python clients for OMDb (the Open Movie Database).

Usage:

from aio_omdb.client import AsyncOMDBClient, SyncOMDBClient

OMDB_API_KEY = '...'  # Get your key from OMDb

a_client = AsyncOMDBClient(api_key=OMDB_API_KEY)
s_client = SyncOMDBClient(api_key=OMDB_API_KEY)

# Clients provide the following methods:

# Get by IMDB ID
await a_client.get_by_id('tt1000252')
s_client.get_by_id('tt1000252')

# Get by exact title
await a_client.get_by_title('Rome, open city')
s_client.get_by_title('Rome, open city')

# Search title by a word or phrase
await a_client.search('Spock')
s_client.search('Spock')

The following exceptions may be raised: aio_omdb.exc.InvalidAPIKey, if an invalid API key is used; aio_omdb.exc.MovieNotFound, if no movie can be found in get_by_id or get_by_title.

Testing: Install the testing extras:

pip install -U -e .[testing]

Create a file .env in the project root and put your OMDb API key there:

OMDB_API_KEY=<your API key>

Run tests:

make test

Enjoy! |
aiooncue | Async Oncue: async for Oncue. Free software: Apache Software License 2.0. Documentation: https://aiooncue.readthedocs.io. Features: TODO. Credits: This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template. History: 0.1.0 (2021-02-09): First release on PyPI. |
aio-openapi | Asynchronous web middleware for aiohttp, for serving Rest APIs with an OpenAPI v3 specification and with optional PostgreSql database bindings. See the tutorial for a quick introduction. |
aioopenexchangerates | aioopenexchangerates: fetch rates from openexchangerates with aiohttp.

Installation: Install this via pip (or your favourite package manager):

pip install aioopenexchangerates

Usage:

import asyncio

from aioopenexchangerates import Client, OpenExchangeRatesError

async def main() -> None:
    """Run main."""
    async with Client("your_api_key") as client:
        try:
            result = await client.get_latest()
        except OpenExchangeRatesError as err:
            print(err)
        else:
            print(result)

if __name__ == "__main__":
    asyncio.run(main())

Credits: This package was created with Cookiecutter and the browniebroke/cookiecutter-pypackage project template. |
aioopenssl | aioopenssl provides an asyncio Transport which uses PyOpenSSL instead of the built-in ssl module.

The transport has two main advantages compared to the original: The TLS handshake can be deferred by passing use_starttls=True and later calling the starttls() coroutine method; this is useful for protocols with a STARTTLS feature. A coroutine can be called during the TLS handshake; this can be used to defer the certificate check to a later point, allowing e.g. getting user feedback before the starttls() method returns. This allows asking users for certificate trust without the application layer protocol interfering or starting to communicate with the unverified peer.

Note: Use this module at your own risk. It has lower test coverage than I'd like it to have; it has been exported from aioxmpp on request, where it undergoes implicit testing. If you find bugs, please report them. If possible, add regression tests while you're at it. If you find security-critical bugs, please follow the procedure announced in the aioxmpp readme.

Documentation: Official documentation can be built with sphinx and is available online on our servers.
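The description above ships no code; a minimal sketch of the deferred-handshake idea it describes follows. It assumes aioopenssl exposes a create_starttls_connection helper and that pyOpenSSL's TLS_METHOD is available; both are assumptions, so check the official documentation for the real entry point and signature:

import asyncio

import OpenSSL.SSL
import aioopenssl  # create_starttls_connection below is an assumed entry point

async def main():
    loop = asyncio.get_running_loop()
    # use_starttls=True defers the TLS handshake, as described above.
    transport, protocol = await aioopenssl.create_starttls_connection(
        loop, asyncio.Protocol, host="example.com", port=5222,
        use_starttls=True,
        ssl_context_factory=lambda t: OpenSSL.SSL.Context(OpenSSL.SSL.TLS_METHOD),
    )
    # ... speak the plaintext protocol, negotiate STARTTLS with the peer ...
    await transport.starttls()  # the TLS handshake happens here

asyncio.run(main())

Names and parameters above are assumptions based on the prose, not a verified API. |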
aioorm | version: 0.1.6; status: production; author: hsz; email: [email protected]

An asyncio interface for peewee, modeled after torpeewee.

Features: support for mysql and postgresql; database factory using database URLs; uses peewee's fields; ManyToManyField support; Shortcuts support; csv dump/load support; can use playhouse.postgres_ext.JSONField.

Install:

python -m pip install aioorm

Example: CRUD

from aioorm import AioModel, AioMySQLDatabase
from peewee import CharField, TextField, DateTimeField
from peewee import ForeignKeyField, PrimaryKeyField

db = AioMySQLDatabase('test', host='127.0.0.1', port=3306,
                      user='root', password='')

class User(AioModel):
    username = CharField()

    class Meta:
        database = db

class Blog(AioModel):
    user = ForeignKeyField(User)
    title = CharField(max_length=25)
    content = TextField(default='')
    pub_date = DateTimeField(null=True)
    pk = PrimaryKeyField()

    class Meta:
        database = db

# create connection pool
await db.connect(loop)

# count
await User.select().count()

# async iteration on select query
async for user in User.select():
    print(user)

# fetch all records as a list from a query in one pass
users = await User.select()

# insert
user = await User.create(username='kszucs')

# modify
user.username = 'krisztian'
await user.save()

# async iteration on blog set
[b.title async for b in user.blog_set.order_by(Blog.title)]

# close connection pool
await db.close()

# see more in the tests

Example: Many to many. Note that AioManyToManyField must be used instead of ManyToManyField.

from aioorm import AioManyToManyField

class User(AioModel):
    username = CharField(unique=True)

    class Meta:
        database = db

class Note(AioModel):
    text = TextField()
    users = AioManyToManyField(User)

    class Meta:
        database = db

NoteUserThrough = Note.users.get_through_model()

async for user in note.users:
    ...  # do something with the users

Currently the only limitation I'm aware of: immediate setting of an instance relation must be replaced with a method call:

# original, which is not supported
charlie.notes = [n2, n3]

# use instead
await charlie.notes.set([n2, n3])

Serializing: Converting to dict requires the asyncified version of model_to_dict:

from aioorm import model_to_dict

serialized = await model_to_dict(user)

Dump to csv: tables can be dumped to a csv file.

from aioorm.utils import aiodump_csv

query = User.select().order_by(User.id)
await aiodump_csv(query, str(filepath))

Documentation: Documentation on Readthedocs.

TODO: async dataset support; more tests.

Limitations: untested transactions; only supports mysql and postgresql.

Bug fix: fixed get and get_or_create's bug. |
aio-ormsql | aio_ormsql: a simple asynchronous MySQL ORM class.

Easy install:

pip3 install aio_ormsql

Required:

pip3 install asyncio aiomysql

Python 3.6+.

Example usage. Do imports:

from aio_ormsql.db import DataBase
from aio_ormsql.classes import Table, Column, WHERE

Where: DataBase is the class for working with MySQL; Table is the class for creating a new table; Column is the class for creating columns; WHERE is the class for creating where statements.

Follow the examples and see their output.

Creating a table:

tbl = Table('tests', Column('id', int), Column('tname', str))

Connecting to the database:

db = DataBase('admin', 'admin', 'tests')
await db.connect()

Simple WHERE statement:

where = WHERE(tbl.tname == 'admin')
print(where)
# Output:
# WHERE `tname`='admin'

Another WHERE statement:

where2 = WHERE((tbl.id >= 20) | (tbl.tname == 'admin'))
print(where2)
# Output:
# WHERE `id`>=20 OR `tname`='admin'

SELECT example:

statement = await db.select([tbl.id, tbl.tname], where=where, table=tbl, back=True)
print(statement)
# Output:
# SELECT DISTINCT `id`, `tname` FROM `tests` WHERE `tname`='admin'

SELECT example 2:

statement2 = await db.select([tbl.id, tbl.tname], False, where2, table=tbl, back=True)
print(statement2)
# Output:
# SELECT `id`, `tname` FROM `tests` WHERE `id`>=20 OR `tname`='admin'

INSERT example:

statement3 = await db.insert({tbl.id: 123, tbl.tname: 'Johan'}, tbl, back=True)
print(statement3)
# Output:
# INSERT INTO `tests` (`id`, `tname`) VALUES (123, 'Johan')

UPDATE example:

statement4 = await db.update({tbl.id: '123', tbl.tname: 'Admin'},
                             WHERE(tbl.tname == 'Johan'), tbl, back=True)
print(statement4)
# Output:
# UPDATE `tests` SET `id`=123, `tname`='Admin' WHERE `tname`='Johan'

Working with a large DB:

array = db.fetch_gen(await db.select([tbl.id, tbl.tname], back=True))

async for item in array:
    print('New pair:', item)
# And you can see row-by-row output

Closing the connection and waiting for tasks to complete:

await db.close() |
aiooss | Based on https://github.com/aliyun/aliyun-oss-python-sdk. For documentation, refer to the official docs; just add await before calling I/O interfaces, e.g. result = await bucket.get_object(...). To get the returned content: body = await result.resp.read().

Getting started:

# -*- coding: utf-8 -*-
import asyncio
import json

import aiooss

# Suppose that your bucket is in the Hangzhou region.
endpoint = 'http://oss-cn-hangzhou.aliyuncs.com'
auth = aiooss.Auth('<Your AccessKeyID>', '<Your AccessKeySecret>')

async def go(loop):
    async with aiooss.Bucket(auth, endpoint, '<your bucket name>') as bucket:
        # The object key in the bucket is story.txt
        key = 'story.txt'

        # Upload
        await bucket.put_object(key, 'Ali Baba is a happy youth.')

        # Upload
        data = dict(a=1, b=2)
        await bucket.put_object(key, json.dumps(data),
                                headers={'Content-Type': 'application/json'})

        # Download
        result = await bucket.get_object(key)
        print(result.headers)
        print(await result.resp.read())

        # Delete
        await bucket.delete_object(key)

        # Traverse all objects in the bucket
        async for object_info in aiooss.ObjectIterator(bucket):
            print(object_info.key)

loop = asyncio.get_event_loop()
loop.run_until_complete(go(loop)) |
aiooss2 | Aiooss2. Features: TODO. Requirements: TODO.

Installation: You can install Aiooss2 via pip from PyPI:

$ pip install aiooss2

Usage: TODO; a tentative sketch follows at the end of this entry.

Contributing: Contributions are very welcome. To learn more, see the Contributor Guide.

License: Distributed under the terms of the Apache 2.0 license, Aiooss2 is free and open source software.

Issues: If you encounter any problems, please file an issue along with a detailed description.
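Since the Usage section is still TODO, a tentative sketch, assuming aiooss2 mirrors the upstream oss2 SDK's Auth/Bucket API with awaitable I/O (an assumption; every name below may differ from the real API):

import asyncio

import aiooss2  # assumed to mirror oss2's Auth/Bucket interface

async def main():
    auth = aiooss2.Auth('<AccessKeyID>', '<AccessKeySecret>')  # assumed
    bucket = aiooss2.Bucket(auth, 'http://oss-cn-hangzhou.aliyuncs.com',
                            '<bucket name>')  # assumed
    await bucket.put_object('hello.txt', b'hello world')  # assumed awaitable
    result = await bucket.get_object('hello.txt')
    print(await result.read())

asyncio.run(main())
 |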
aio-osservaprezzi | aio_osservaprezzi: consume the REST API of http://osservaprezzi.mise.gov.it/ |
aiootp | aiootp - Asynchronous pseudo one-time pad based crypto and anonymity library.aiootpis an asynchronous library providing access to cryptographic
primatives and abstractions, transparently encrypted / decrypted file
I/O and databases, as well as powerful, pythonic utilities that
simplify data processing & cryptographic procedures in python code.
This library’s online, salt reuse / misuse resistant, tweakable AEAD cipher, calledChunky2048, is an implementation of thepseudo one-time pad. The
aim is to create a simple, standard, efficient implementation that’s
indistinguishable from the unbreakable one-time pad cipher; to give
users and applications access to user-friendly cryptographic tools; and,
to increase the overall security, privacy, and anonymity on the web, and
in the digital world. Users will findaiootpto be easy to write,
easy to read, and fun.Important Disclaimeraiootpis experimental software that works with Python 3.7+.
It’s a work in progress. The programming API could change with
future updates, and it isn’t bug free.aiootpprovides powerful
security tools and misc utilities that’re designed to be
developer-friendly and privacy preserving.
As a security tool,aiootpneeds to be tested and reviewed
extensively by the programming and cryptography communities to
ensure its implementations are sound. We provide no guarantees.
This software hasn’t yet been audited by third-party security
professionals.Quick Install$sudoapt-getinstallpython3-setuptoolspython3-pip$pip3install--user--upgradepiptypingaiootpRun Tests$cd~/aiootp/tests$coveragerun--sourceaiootp-mpytest-vvtest_aiootp.pyTable Of ContentsTransparently Encrypted DatabasesIdeal InitializationUser ProfilesTagsMetatagsBasic ManagementMirrorsPublic Cryptographic FunctionsEncrypt / DecryptHMACsChunky2048 CipherHigh-level FunctionsHigh-level GeneratorsPasscryptHashing & Verifying PassphrasesPasscrypt Algorithm OverviewX25519 & Ed25519X25519Ed25519ComprendeSynchronous GeneratorsAsynchronous GeneratorsModule OverviewFAQChangelogKnown IssuesTransparently Encrypted Databases…………..Table Of ContentsThe package’sAsyncDatabase&Databaseclasses are very powerful data persistence utilities. They automatically handle encryption & decryption of user data & metadata, providing a pythonic interface for storing & retrieving any bytes or JSON serializable objects. They’re designed to seamlessly bring encrypted bytes at rest to users as dynamic objects in use.Ideal Initialization………………………Table Of ContentsMake a new user key with a fast, cryptographically secure pseudo-random number generator. Then this strong 64-byte key can be used to create a database object.fromaiootpimportacsprng,AsyncDatabasekey=awaitacsprng()db=awaitAsyncDatabase(key)User Profiles…………………………….Table Of ContentsWith User Profiles, passphrases may be used instead to open a database. Often, passwords & passphrases contain very little entropy. So, they aren’t recommended for that reason. However, profiles provide a succinct way to use passphrases more safely. They do this by deriving strong keys from low entropy user input using the memory/cpu hard passcrypt algorithm, & a secret salt which is automatically generated & stored on the user’s filesystem.# Automatically convert any available user credentials into# cryptographic tokens which help to safely open databases ->db=awaitAsyncDatabase.agenerate_profile(b"server-url.com",# Here an unlimited number of bytes-type# arguments can be passed as additionalb"[email protected]",# optional credentials.username=b"username",passphrase=b"passphrase",salt=b"optional salt keyword argument",# Optional passcrypt configuration:mb=256,# The memory cost in Mibibytes (MiB)cpu=2,# The computational complexity & number of iterationscores=8,# How many parallel processes passcrypt will utilize)Tags…………………………………….Table Of ContentsData within databases are primarily organized by Tags. 
Tags ……………………………………. Table Of Contents

Data within databases are primarily organized by Tags. Tags are simply string labels, and the data stored under them can be any bytes or JSON serializable objects.

async with db:
    # Using bracketed assignment adds tags to the cache
    db["tag"] = {"data": "can be any JSON serializable object"}
    db["hobby"] = b"fash smasher"
    db["bitcoin"] = "0bb6eee10d2f8f45f8a"
    db["lawyer"] = {"#": "555-555-1000", "$": 13000.50}
    db["safehouses"] = ["Dublin Forgery", "NY Insurrection"]
    # Changes in the cache are saved to disk when the context closes.

# View an instance's tags ->
db.tags
>>> {'tag', 'hobby', 'bitcoin', 'lawyer', 'safehouses'}

# View the filenames that locate the data for each tag ->
db.filenames
>>> {'0z0l10btu_yd-n4quc8tsj9baqu8xmrxz87ix',
     '197ulmqmxg15lebm26zaahpqnabwr8sprojuh',
     '248piaop3j9tmcvqach60qk146mt5xu6kjc-u',
     '2enwc3crove2cnrx7ks963d8_se25k6cdn6q9',
     '5dm-60yspq8yhah4ywxcp52kztq_9toj0owm2'}

# There are various ways of working with tags ->
await db.aset_tag("new_tag", ["data", "goes", "here"])  # stored only in cache
await db.aquery_tag("new_tag")  # reads from disk if not in the cache
>>> ['data', 'goes', 'here']

tag_path = db.path / await db.afilename("new_tag")
"new_tag" in db
>>> True

tag_path.is_file()  # the tag is saved in the cache, not to disk yet
>>> False

await db.asave_tag("new_tag")
tag_path.is_file()  # now it's saved to disk
>>> True

# This removes the tag from cache, & any of its unsaved changes ->
await db.arollback_tag("new_tag")

# Or, the user can take the tag out of the database & the filesystem ->
await db.apop_tag("new_tag")
>>> ['data', 'goes', 'here']

"new_tag" in db
>>> False

tag_path.is_file()
>>> False

Access to data is open to the user, so care must be taken not to let external API calls touch the database without accounting for how that can go wrong.

Metatags ………………………………… Table Of Contents

Metatags are used to organize data by string names & domain separate cryptographic material. They are fully-fledged databases all on their own, with their own distinct key material too. They’re accessible from the parent through an attribute that’s added to the parent instance with the same name as the metatag. When the parent is saved, or deleted, then their descendants are also.

# Create a metatag database ->
molly = await db.ametatag("molly")

# They can contain their own sets of tags (and metatags) ->
molly["hobbies"] = ["skipping", "punching"]
molly["hobbies"].append("reading")

# The returned metatag & the reference in the parent are the same ->
assert molly["hobbies"] is db.molly["hobbies"]
assert isinstance(molly, AsyncDatabase)

# All of an instance's metatags are viewable ->
db.metatags
>>> {'molly'}

# Delete a metatag from an instance ->
await db.adelete_metatag("molly")
db.metatags
>>> set()

assert not hasattr(db, "molly")

Basic Management …………………………. Table Of Contents

There’s a few settings & public methods on databases for users to manage their instances & data. This includes general utilities for saving & deleting databases to & from the filesystem, as well as fine-grained controls for how data is handled.
# The path attribute is set within the instance's __init__ using a
# keyword-only argument. It's the directory where the instance will
# store all of its files.
db.path
>>> PosixPath('site-packages/aiootp/aiootp/databases')

# Write database changes to disk with transparent encryption ->
await db.asave_database()

# Entering the instance's context also saves data to disk ->
async with db:
    print("Saving to disk...")

# Delete a database from the filesystem ->
await db.adelete_database()

As databases grow in the number of tags, metatags & the size of data within, it becomes desirable to load data from them as needed, instead of all at once into the cache during initialization. This is why the preload boolean keyword-only argument is set to False by default.

# Let's create some test values to show the impact preloading has ->
async with (await AsyncDatabase(key, preload=True)) as db:
    db["favorite_foods"] = ["justice", "community"]
    await db.ametatag("exercise_routines")
    db.exercise_routines["gardening"] = {"days": ["monday", "wednesday"]}
    db.exercise_routines["swimming"] = {"days": ["thursday", "saturday"]}

# Again, preloading into the cache is toggled off by default ->
uncached_db = await AsyncDatabase(key)

# To retrieve elements, ``aquery_tag`` isn't necessary when
# preloading is used, since the tag is already in the cache ->
async with uncached_db:
    db["favorite_foods"]
    >>> ["justice", "community"]

    uncached_db["favorite_foods"]
    >>> None

    value = await uncached_db.aquery_tag("favorite_foods", cache=True)
    assert value == ["justice", "community"]
    assert uncached_db["favorite_foods"] == ["justice", "community"]

    # Metatags will be loaded, but their tags won't be ->
    assert type(uncached_db.exercise_routines) == AsyncDatabase

    uncached_db.exercise_routines["gardening"]
    >>> None

    await uncached_db.exercise_routines.aquery_tag("gardening", cache=True)
    >>> {"days": ["monday", "wednesday"]}

    uncached_db.exercise_routines["gardening"]
    >>> {"days": ["monday", "wednesday"]}

    # But, tags can also be queried without caching their values,
    value = await uncached_db.exercise_routines.aquery_tag("swimming")
    value
    >>> {"days": ["thursday", "saturday"]}

    uncached_db.exercise_routines["swimming"]
    >>> None

    # However, changes to mutable values won't be transmitted to the
    # database if they aren't retrieved from the cache ->
    value["days"].append("sunday")
    value
    >>> {"days": ["thursday", "saturday", "sunday"]}

    await uncached_db.exercise_routines.aquery_tag("swimming")
    >>> {"days": ["thursday", "saturday"]}

Mirrors …………………………………. Table Of Contents

Database mirrors allow users to make copies of all files within a database under new encryption keys. This is useful if users simply want to make backups, or if they’d like to update / change their database keys.

# A unique login key / credentials are needed to create a new
# database ->
new_key = await acsprng()
new_db = await AsyncDatabase(new_key)

# Mirroring an existing database is done like this ->
await new_db.amirror_database(db)
assert (
    await new_db.aquery_tag("favorite_foods")
    is await db.aquery_tag("favorite_foods")
)

# If the user is just updating their database keys, then the old
# database should be deleted ->
await db.adelete_database()

# Now, the new database can be saved to disk & given an appropriate
# name ->
async with new_db as db:
    pass

Public Cryptographic Functions …………….. Table Of Contents

Although databases handle encryption & decryption automatically, users may want to utilize their databases’ keys to do custom cryptographic procedures manually.
There are a few public functions available to users if they should want such functionality.

Encrypt / Decrypt ………………………… Table Of Contents

# Either JSON serializable or bytes-type data can be encrypted ->
json_plaintext = {"some": "JSON data can go here..."}
bytes_plaintext = b"some bytes plaintext goes here..."
token_plaintext = b"some token data goes here..."

json_ciphertext = await db.ajson_encrypt(json_plaintext)
bytes_ciphertext = await db.abytes_encrypt(bytes_plaintext)
token_ciphertext = await db.amake_token(token_plaintext)

# Those values can just as easily be decrypted ->
assert json_plaintext == await db.ajson_decrypt(json_ciphertext)
assert bytes_plaintext == await db.abytes_decrypt(bytes_ciphertext)
assert token_plaintext == await db.aread_token(token_ciphertext)

# Filenames may be added to classify ciphertexts. They also alter the
# key material used during encryption in such a way, that without the
# correct filename, the data cannot be decrypted ->
filename = "grocery-list"
groceries = ["carrots", "taytoes", "rice", "beans"]
ciphertext = await db.ajson_encrypt(groceries, filename=filename)

assert groceries == await db.ajson_decrypt(ciphertext, filename=filename)

await db.ajson_decrypt(ciphertext, filename="wrong filename")
>>> "InvalidSHMAC: Invalid StreamHMAC hash for the given ciphertext."

# Time-based expiration of ciphertexts is also available for all
# encrypted data this package produces ->
from aiootp.asynchs import asleep

await asleep(6)

await db.ajson_decrypt(json_ciphertext, ttl=1)
>>> "TimestampExpired: Timestamp expired by <5> seconds."

await db.abytes_decrypt(bytes_ciphertext, ttl=1)
>>> "TimestampExpired: Timestamp expired by <5> seconds."

await db.aread_token(token_ciphertext, ttl=1)
>>> "TimestampExpired: Timestamp expired by <5> seconds."

# The number of seconds that are exceeded may be helpful to know. In
# which case, this is how to retrieve that integer value ->
try:
    await db.abytes_decrypt(bytes_ciphertext, ttl=1)
except db.TimestampExpired as error:
    assert error.expired_by == 5

HMACs …………………………………… Table Of Contents

Besides encryption & decryption, databases can also be used to manually verify the authenticity of bytes-type data with HMACs.

# Creating an HMAC of some data with a database is done this way ->
data = b"validate this data!"
hmac = await db.amake_hmac(data)
await db.atest_hmac(hmac, data)  # Runs without incident

# Data that is not the same will be caught ->
altered_data = b"valiZate this data!"
await db.atest_hmac(hmac, altered_data)
>>> "InvalidHMAC: Invalid HMAC hash for the given data."

# Any number of bytes-type arguments can be run through the function,
# the collection of items is canonically encoded automagically ->
arbitrary_data = (b"uid_\x0f\x12", b"session_id_\xa1")
hmac = await db.amake_hmac(*arbitrary_data)
await db.atest_hmac(hmac, *arbitrary_data)  # Runs without incident

# Additional qualifying information can be specified with the ``aad``
# keyword argument ->
from time import time

timestamp = int(time()).to_bytes(8, "big")
hmac = await db.amake_hmac(*arbitrary_data, aad=timestamp)

await db.atest_hmac(hmac, *arbitrary_data)
>>> "InvalidHMAC: Invalid HMAC hash for the given data."

await db.atest_hmac(hmac, *arbitrary_data, aad=timestamp)  # Runs fine

# This is most helpful for domain separation of the HMAC outputs.
# Each distinct setting & purpose of the HMAC should be specified
# & NEVER MIXED ->
uuid = await db.amake_hmac(user_name, aad=b"uuid")
hmac = await db.amake_hmac(user_data, aad=b"data-authentication")

Chunky2048 Cipher ………………………… Table Of Contents

The Chunky2048 cipher is built from generators & SHA3-based key-derivation functions.
It’s designed to be easy to use, difficult to misuse & future-proof with large security margins.

High-level Functions …………………….. Table Of Contents

These premade recipes allow for the easiest usage of the cipher.

import aiootp

cipher = aiootp.Chunky2048(key)

# Symmetric encryption of JSON data ->
json_data = {"account": 33817, "names": ["queen b"], "id": None}
encrypted_json_data = cipher.json_encrypt(json_data, aad=b"demo")
decrypted_json_data = cipher.json_decrypt(
    encrypted_json_data, aad=b"demo", ttl=120
)
assert decrypted_json_data == json_data

# Symmetric encryption of binary data ->
binary_data = b"some plaintext data..."
encrypted_binary_data = cipher.bytes_encrypt(binary_data, aad=b"demo")
decrypted_binary_data = cipher.bytes_decrypt(
    encrypted_binary_data, aad=b"demo", ttl=30
)
assert decrypted_binary_data == binary_data

# encrypted URL-safe Base64 encoded tokens ->
token_data = b"some plaintext token data..."
encrypted_token_data = cipher.make_token(token_data, aad=b"demo")
decrypted_token_data = cipher.read_token(
    encrypted_token_data, aad=b"demo", ttl=3600
)
assert decrypted_token_data == token_data

High-level Generators …………………….. Table Of Contents

With these generators, the online nature of the Chunky2048 cipher can be utilized. This means that any arbitrary amount of data can be processed in streams of controllable, buffered chunks. These streaming interfaces automatically handle message padding & depadding, ciphertext validation & detection of out-of-order message blocks.

Encryption:

from aiootp import AsyncCipherStream

# Let's imagine we are serving some data over a network ->
receiver = SomeRemoteConnection(session).connect()

# This will manage encrypting a stream of data ->
stream = await AsyncCipherStream(key, aad=session.transcript)

# We'll have to send the salt & iv in some way ->
receiver.transmit(salt=stream.salt, iv=stream.iv)

# Now we can buffer the plaintext we are going to encrypt ->
for plaintext in receiver.upload.buffer(4 * stream.PACKETSIZE):
    await stream.abuffer(plaintext)

# The stream will now produce encrypted blocks of ciphertext
# as well as the block ID which authenticates each block ->
async for block_id, ciphertext in stream:
    # The receiver needs both the block ID & ciphertext ->
    receiver.send_packet(block_id + ciphertext)

# Once done with buffering-in the plaintext, the ``afinalize``
# method is called so the remaining encrypted data will be
# flushed out of the buffer to the user ->
async for block_id, ciphertext in stream.afinalize():
    receiver.send_packet(block_id + ciphertext)

# Here we can give an optional check of further authenticity,
# also cryptographically asserts the stream is finished ->
receiver.transmit(shmac=await stream.shmac.afinalize())

Decryption / Authentication:

from aiootp import AsyncDecipherStream

# Here let's imagine we'll be downloading some data ->
source = SomeRemoteConnection(session).connect()

# The key, salt, aad & iv must be the same for both parties ->
stream = await AsyncDecipherStream(
    key, salt=source.salt, aad=session.transcript, iv=source.iv
)

# The downloaded ciphertext will now be buffered & the stream
# object will produce the plaintext ->
for ciphertext in source.download.buffer(4 * stream.PACKETSIZE):
    # Here stream.shmac.InvalidBlockID is raised if an invalid or
    # out-of-order block is detected within the last 4 packets ->
    await stream.abuffer(ciphertext)

# If authentication succeeds, the plaintext is produced ->
async for plaintext in stream:
    yield plaintext

# After all the ciphertext is downloaded, ``afinalize`` is called
# to finish processing the stream & flush out the plaintext ->
async for plaintext in stream.afinalize():
    yield plaintext

# An optional check for further authenticity which also
# cryptographically asserts the stream is finished ->
await stream.shmac.afinalize()
await stream.shmac.atest_shmac(source.shmac)
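For quick local experimentation the two halves can be exercised end-to-end in one process. This is only a sketch: it assumes the synchronous CipherStream & DecipherStream classes mirror the async API demonstrated above, so treat the exact calls as assumptions rather than a definitive recipe.

# A local round-trip sketch (assumes the sync classes mirror the
# async API shown above) ->
from aiootp import CipherStream, DecipherStream, csprng

key = csprng()  # a fresh 64-byte key
encrypting = CipherStream(key, aad=b"demo")
encrypting.buffer(b"an arbitrary amount of plaintext data...")
packets = [
    block_id + ciphertext for block_id, ciphertext in encrypting.finalize()
]
shmac = encrypting.shmac.finalize()

decrypting = DecipherStream(
    key, salt=encrypting.salt, aad=b"demo", iv=encrypting.iv
)
for packet in packets:
    decrypting.buffer(packet)
plaintext = b"".join(decrypting.finalize())
decrypting.shmac.finalize()
decrypting.shmac.test_shmac(shmac)  # raises if the stream was tampered with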
Passcrypt ………………………… Table Of Contents

The Passcrypt algorithm is a data independent memory & computationally hard password-based key derivation function. It’s built from a single primitive, the SHAKE-128 extendable output function from the SHA-3 family. Its resource costs are measured by three parameters: mb, which represents an integer number of Mibibytes (MiB); cpu, which is a linear integer measure of computational complexity & the number of iterations of the algorithm over the memory cache; and cores, which is an integer which directly assigns the number of separate processes that will be pooled to complete the algorithm. The number of bytes of the output tag are decided by the integer tag_size parameter. And, the number of bytes of the automatically generated salt are decided by the integer salt_size parameter.
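A minimal sketch gathering those knobs in one place (only the parameter names come from the description above; whether tag_size & salt_size belong to the constructor or to the hashing methods is an assumption here):

from aiootp import Passcrypt

# 64 MiB memory cost, 1 iteration pass, 4 parallel processes,
# a 32-byte output tag & a 16-byte auto-generated salt ->
pcrypt = Passcrypt(mb=64, cpu=1, cores=4, tag_size=32, salt_size=16)
hashed = pcrypt.hash_passphrase(b"a strong passphrase", aad=b"demo-context")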
Hashing & Verifying Passphrases …………………….. Table Of Contents

By far, the dominating measure of difficulty for Passcrypt is determined by the mb Mibibyte memory cost. It’s recommended that increases to desired difficulty are first translated into higher mb values, where resource limitations of the machines executing the algorithm permit. If more difficulty is desired than can be obtained by increasing mb, then increases to the cpu parameter should be used. The higher this parameter is the less likely an adversary is to benefit from expending less than the intended memory cost, & increases the execution time & complexity of the algorithm. The final option that should be considered, if still more difficulty is desired, is to lower the cores parallelization parameter, which will just cause each execution to take longer to complete.

from aiootp import Passcrypt, hash_bytes

# The class accepts an optional (but recommended) static "pepper"
# which is applied as additional randomness to all hashes computed
# by the class. It's a secret random bytes value of any size that is
# expected to be stored somewhere inaccessible by the database which
# contains the hashed passphrases ->
with open(SECRET_PEPPER_PATH, "rb") as pepper_file:
    Passcrypt.PEPPER = pepper_file.read()

# when preparing to hash passphrases, it's a good idea to use any &
# all of the static data / credentials available which are specific
# to the context of the registration ->
APPLICATION = b"my-application-name"
PRODUCT = b"the-product-being-accessed-by-this-registration"
STATIC_CONTEXT = [APPLICATION, PRODUCT, PUBLIC_CERTIFICATE]

# If the same difficulty settings are going to be used for every
# hash, then a ``Passcrypt`` instance can be initialized to
# automatically pass those static settings ->
pcrypt = Passcrypt(mb=1024, cpu=2, cores=8)  # 1 GiB, 8 cores

# Now that the static credentials / settings are ready to go, we
# can start hashing any user information that arrives ->
username = form["username"].encode()
passphrase = form["passphrase"].encode()
email_address = form["email_address"].encode()

# The ``hash_bytes`` function can then be used to automatically
# encode then hash the multi-input data so as to prevent the chance
# of canonicalization (&/or length extension) attacks ->
aad = hash_bytes(*STATIC_CONTEXT, username, email_address)
hashed_passphrase = pcrypt.hash_passphrase(passphrase, aad=aad)

assert type(hashed_passphrase) is bytes
assert len(hashed_passphrase) == 38

# Later, a hashed passphrase can be used to authenticate a user ->
untrusted_username = form["username"].encode()
untrusted_passphrase = form["passphrase"].encode()
untrusted_email_address = form["email_address"].encode()

aad = hash_bytes(
    *STATIC_CONTEXT, untrusted_username, untrusted_email_address
)

try:
    pcrypt.verify(
        hashed_passphrase, untrusted_passphrase, aad=aad, ttl=3600
    )
except pcrypt.InvalidPassphrase as auth_fail:
    # If the passphrase does not hash to the same value as the
    # stored hash, then this exception is raised & can be handled
    # by the application ->
    app.post_mortem(error=auth_fail)
except pcrypt.TimestampExpired as registration_expired:
    # If the timestamp on the stored hash was created more than
    # ``ttl`` seconds before the current time, then this exception
    # is raised. This is helpful for automating registrations which
    # expire after a certain amount of time, which in this case was
    # 1 hour ->
    app.post_mortem(error=registration_expired)
else:
    # If no exception was raised, then the user has been authenticated
    # by their passphrase, username, email address & the context of
    # the registration ->
    app.login_user(username, email_address)

Passcrypt Algorithm Overview …………………….. Table Of Contents

By being secret-independent, Passcrypt is resistant to side-channel attacks. This implementation is also written in pure python. Significant attention was paid to design the algorithm so as to suffer minimally from the performance inefficiencies of python, since doing so would help to equalize the cost of computation between regular users & dedicated attackers with custom hardware / software.
Below is an outline of how an example execution works (the annotations of the original ASCII diagram, preserved here):

- The initial memory cache of roughly mb MiB is built very quickly using SHAKE-128. Each (row, column) coordinate holds one element of 168-bytes, where the number of columns == 2 * max([1, cpu // 2]) & the number of rows == ⌈1024 * 1024 * mb / (168 * columns)⌉.
- Each row is then hashed, & a new 168-byte digest overwrites the element at the current pointer, in an alternating sequence: first at an index that walks forward through the cache, then at that index’s reflection from the opposite end. This continues until the entire cache has been overwritten.
- A single shake_128 object (H) is used to do all of the hashing.
- Finally, the whole cache is quickly hashed cpu + 2 number of times. After each pass an 84-byte digest is inserted into the cache, ruling out hashing state cycles. Then a tag_size-byte tag is output: tag = H.digest(tag_size).
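A quick sketch of the cache geometry those formulas imply (pure arithmetic taken from the outline above, not library code):

from math import ceil

def passcrypt_geometry(mb: int, cpu: int) -> tuple:
    # formulas from the algorithm outline above
    columns = 2 * max([1, cpu // 2])
    rows = ceil(1024 * 1024 * mb / (168 * columns))
    return rows, columns

rows, columns = passcrypt_geometry(mb=256, cpu=2)
# rows * columns elements of 168-bytes each come to roughly mb MiB ->
assert abs(rows * columns * 168 - 256 * 1024 * 1024) < 168 * columns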
X25519 & Ed25519 …………………………. Table Of Contents

Asymmetric curve 25519 tools are available from these high-level interfaces over the cryptography package.

X25519 ………………………………….. Table Of Contents

Elliptic curve 25519 diffie-hellman exchange protocols.

from aiootp import X25519, DomainKDF, GUID, Domains

# Basic Elliptic Curve Diffie-Hellman ->
guid = GUID().new()
my_ecdhe_key = X25519().generate()
yield guid, my_ecdhe_key.public_bytes  # send this to Bob

raw_shared_secret = my_ecdhe_key.exchange(bobs_public_key)
shared_kdf = DomainKDF(  # Use this to create secret shared keys
    Domains.ECDHE,
    guid,
    bobs_public_key,
    my_ecdhe_key.public_bytes,
    key=raw_shared_secret,
)

# Triple ECDH Key Exchange client initialization ->
with ecdhe_key.dh3_client() as exchange:
    response = internet.post(exchange())
    exchange(response)

clients_kdf = exchange.result()

# Triple ECDH Key Exchange for a receiving peer ->
identity_key, ephemeral_key = client_public_keys = internet.receive()

server = ecdhe_key.dh3_server(identity_key, ephemeral_key)
with server as exchange:
    internet.post(exchange.exhaust())

servers_kdf = exchange.result()

# Success! Now both the client & server peers share an identical
# ``DomainKDF`` hashing object to create shared keys ->
assert (
    clients_kdf.sha3_512(context=b"test")
    == servers_kdf.sha3_512(context=b"test")
)

Ed25519 …………………………………. Table Of Contents

Edwards curve 25519 signing & verification.

from aiootp import Ed25519

# In a land, long ago ->
alices_key = Ed25519().generate()
internet.send(alices_key.public_bytes)

# Alice wants to sign a document so that Bob can prove she wrote it.
# So, Alice sends the public key bytes of the key she wants to
# associate with her identity, the document & the signature ->
document = b"DesignDocument.cad"
signed_document = alices_key.sign(document)
message = {
    "document": document,
    "signature": signed_document,
    "public_key": alices_key.public_bytes,
}
internet.send(message)

# In a land far away ->
alices_message = internet.receive()

# Bob sees the message from Alice! Bob already knows Alice's public
# key & he has reason to believe it is genuinely Alice's. So, he'll
# import Alice's known public key to verify the signed document ->
assert alices_message["public_key"] == alices_public_key
alice_verifier = Ed25519().import_public_key(alices_public_key)
alice_verifier.verify(
    alices_message["signature"], alices_message["document"]
)
internet.send(b"Beautiful work, Alice! Thanks ^u^")

The verification didn’t throw an exception! So, Bob knows the file was signed by Alice.

Comprende ……………………………….. Table Of Contents

This magic with generators is made simple with the comprehension decorator. It wraps them in Comprende objects with access to myriad data processing pipeline utilities right out of the box.

Synchronous Generators ……………………. Table Of Contents

from aiootp.gentools import comprehension

@comprehension()
def gen(x: int, y: int):
    z = yield x + y
    return x * y * z

# Drive the generator forward with a context manager ->
with gen(x=1, y=2) as example:
    z = 5

    # Calling the object will send ``None`` into the coroutine by default ->
    sum_of_x_y = example()
    assert sum_of_x_y == 3

    # Passing ``z`` will send it into the coroutine, cause it to reach the
    # return statement & exit the context manager ->
    example(z)

# The result returned from the generator is now available ->
product_of_x_y_z = example.result()
assert product_of_x_y_z == 10

# Here's another example ->
@comprehension()
def one_byte_numbers():
    for number in range(256):
        yield number

# Chained ``Comprende`` generators are excellent inline data processors ->
base64_data = one_byte_numbers().int_to_bytes(1).to_base64().list()
# This converted each number to bytes then base64 encoded them into a list.

# We can wrap other iterables to add functionality to them ->
@comprehension()
def unpack(iterable):
    for item in iterable:
        yield item

# This example just hashes each output then yields them
for digest in unpack(base64_data).sha3_256():
    print(digest)

Asynchronous Generators …………………… Table Of Contents

Async Comprende coroutines have almost exactly the same interface as synchronous ones.

from aiootp.asynchs import asleep
from aiootp.gentools import Comprende, comprehension

@comprehension()
async def gen(x: int, y: int):
    # Because having a return statement in an async generator is a
    # SyntaxError, the return value is expected to be passed into
    # Comprende.ReturnValue, and then raised to propagate upstream.
    # It's then available from the instance's ``aresult`` method ->
    z = yield x + y
    raise Comprende.ReturnValue(x * y * z)

# Drive the generator forward.
async with gen(x=1, y=2) as example:
    z = 5
    # Awaiting the ``__call__`` method will send ``None`` into the
    # coroutine by default ->
    sum_of_x_y = await example()
    assert sum_of_x_y == 3

    # Passing ``z`` will send it into the coroutine, cause it to reach the
    # raise statement which will exit the context manager gracefully ->
    await example(z)

# The result returned from the generator is now available ->
product_of_x_y_z = await example.aresult()
assert product_of_x_y_z == 10

# Let's see some other ways async generators mirror synchronous ones ->
@comprehension()
async def one_byte_numbers():
    # It's probably a good idea to pass control to the event loop at
    # least once or twice, even if async sleeping after each iteration
    # may be excessive when no real work is being demanded by range(256).
    # This consideration is more or less significant depending on the
    # expectations placed on this generator by the calling code.
    await asleep()
    for number in range(256):
        yield number
    await asleep()

# This is asynchronous data processing ->
base64_data = await one_byte_numbers().aint_to_bytes(1).ato_base64().alist()
# This converted each number to bytes then base64 encoded them into a list.

# We can wrap other iterables to add asynchronous functionality to them ->
@comprehension()
async def unpack(iterable):
    for item in iterable:
        yield item

# Want only the first twenty results? ->
async for digest in unpack(base64_data).asha3_256()[:20]:
    # Then you can slice the generator.
    print(digest)

# Users can slice generators to receive more complex output rules, like:
# Getting every second result starting from the 4th result to the 50th ->
async for result in unpack(base64_data)[3:50:2]:
    print(result)

# Although, negative slice numbers are not supported.

Comprende generators have loads of tooling for users to explore. Play around with it and take a look at the other chainable generator methods in aiootp.Comprende.lazy_generators.

Module Overview ………………………….. Table Of Contents

Here’s a quick overview of this package’s modules:

import aiootp

# Commonly used constants, datasets & functionality across all modules ->
aiootp.commons

# The basic utilities & abstractions of the package's architecture ->
aiootp.generics

# A collection of the package's generator utilities ->
aiootp.gentools

# This module is responsible for providing entropy to the package ->
aiootp.randoms

# The high & low level abstractions used to implement the Chunky2048 cipher ->
aiootp.ciphers

# The higher-level abstractions used to create / manage key material ->
aiootp.keygens

# Global async / concurrency functionalities & abstractions ->
aiootp.asynchs

FAQ …………………………………….. Table Of Contents

Q: What is the one-time pad?

A: It’s a cipher which provides an information theoretic guarantee of confidentiality. It’s typically thought to be too cumbersome a cipher for generalized application because it conveys strict, and well, cumbersome, requirements onto its users. The need for its keys to be at least as large as all the messages it’s ever used to encrypt is one such requirement. Our goal is to design a cipher which imitates the one-time pad through clever algorithms, in such a way as to minimize its inconveniences & still provide some form of information theoretic confidentiality guarantees or, at a minimum, be able to make non-trivial statements about its security against even computationally unbounded adversaries. In this effort, we’ve built what we hope to be a candidate cipher, which we’ve called Chunky2048.

Q: How fast is this ``Chunky2048`` cipher?

A: Well, because it relies on hashlib.shake_128 hashing to build key material streams, it’s rather efficient. It can process about 24 MB/s on a ~1.5 GHz core for both encrypting & decrypting. This is still slow relative to other stream ciphers, but this package is written in pure Python & without hardware optimizations. Using SHA3 ASICs, specific chipset instructions, or a lower-level language implementation, could make this algorithm competitively fast.
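A rough way to measure that throughput on your own machine, assuming the high-level Chunky2048 API shown earlier (numbers will vary with hardware & Python version):

import time
import aiootp

cipher = aiootp.Chunky2048(aiootp.csprng())
data = b"\x00" * (16 * 1024 * 1024)  # 16 MiB of demo plaintext

start = time.perf_counter()
ciphertext = cipher.bytes_encrypt(data)
elapsed = time.perf_counter() - start
print(f"~{len(data) / (1024 * 1024 * elapsed):.1f} MiB/s encrypting")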
Q: What size keys does the ``Chunky2048`` cipher use?

A: It’s been designed to work with any size of key >= 64 bytes.

Q: What’s up with the ``AsyncDatabase`` / ``Database``?

A: The idea is to create an intuitive, pythonic interface to a transparently encrypted and decrypted persistence tool that also cryptographically obscures metadata. It’s designed to persist raw bytes or JSON serializable data, which gives it native support for some of the most important basic python datatypes. It’s still a work in progress, albeit a very nifty one.

Q: Why are the modules transformed into ``Namespace`` objects?

A: We overwrite our modules in this package to have a more fine-grained control over what part of the package’s internal state is exposed to users & applications. The goal is to make it more difficult for users to inadvertently jeopardize their security tools, & minimize the attack surface available to adversaries. The Namespace class also makes it easier to coordinate and decide the library’s UI/UX across the package.

Changelog ……………………………….. Table Of Contents

Changes for version 0.22.1

Major Changes

- The top-level DomainKDF class’ hashing methods can now accept an arbitrary amount of additional data arguments which do not change the internal state of its objects.
- Switch the order of the internal raw guids with the node_number in the GUID class. This is intended to induce the most variability possible in output guids by interpreting the variable raw guids as more significant bits.

Minor Changes

- The default cpu cost for Passcrypt was lowered from 2 to 1.
- Ensured raw guid byte values used by GUID class are interpreted as big-endian integers.
- The top-level (a)csprng functions now don’t bother to convert a falsey, non-bytes, user-supplied entropy argument to bytes. Instead they just use a value from an internal entropy pool as additional entropy for that invocation of the function.
- Code clean-ups.
- Documentation fixes.
- Added tests for DomainKDF, GUID & SyntheticIV, & improved clarity of some existing tests.
- Packaging changes to create coherent wheel files.
- Explicitly declare use of big-endian encoding throughout the package.
- Conduct a more comprehensive addition of the package’s types to the Typing class.

Changes for version 0.22.0 (Major Rewrite: Backwards Incompatible)

Security Advisory:

The top-level (a)csprng functions were found to be unsafe in concurrent code, leading to the possibility of producing identical outputs from distinct calls if run in quick succession from concurrently running threads & coroutines. The classification of this vulnerability is severe because: 1) users should be able to expect the output of a 64-byte cryptographically secure pseudo-random number generator to always produce unique outputs; and, 2) much of the package utilizes them to produce cryptographic material. This vulnerability does not affect users of the library which are not running it in multiple concurrent threads or coroutines. The vulnerability has been patched & all users are highly encouraged to upgrade to v0.22.0+.

Major Changes

- Support for python 3.6 was dropped. The package now supports python versions 3.7+.
- Chunky2048: A new version of the cipher has been developed which
implements algorithms & interfaces that offer improvements in multiple
regards: smaller size overhead of ciphertexts, faster execution time
for large messages & large keys, more robust salt reuse/misuse resistance,
fewer aspects harming deniability & better domain separation.
Many of the changes are described here:

  The (a)bytes_keys generators were updated to use shake_128-based KDF objects instead of sha3_512, yielding 256-bytes on each iteration instead of 128, now requiring only a single iteration to produce a keystream key for each block, instead of two. This choice was made during the process of analyzing the use of the user’s encryption key to seed the seed_kdf on each iteration. We wanted to stop doing that essentially, because it slowed down the cipher too much when used with large keys. And because it seems like a bad idea to use the same key repeatedly while also not incorporating the uniqueness or entropy from the message’s salt, siv or aad.

  But still, we somehow wanted to come up with an idea which could efficiently & continually extract entropy from the user key if it did happen to be large. An answer came in the form of expanding on an earlier implemented idea which used the key multiple times to create unique seeds during initialization. In this case, however, instead of creating unique seeds with the single seed_kdf, each of the three KDFs & the MAC object used by the cipher will be given the whole key once at initialization, with proper domain separation, & including the message salt & aad (The siv can’t be used because its creation happens after initialization during encryption). This gives each of their (SHA3) 200-byte internal states independent access to the full entropy of the key.

  Then, the problem was that, by using sha3_512 internally, a maximum of 64-bytes of entropy could be communicated between KDFs at each round (and only 32-bytes from the StreamHMAC (shmac) object’s sha3_256 MAC). But the blocksize of each round is 256-bytes. So, the idea became to attempt to communicate more entropy between the KDFs & MAC each round than there exists possible messages in the message space of each round. It seems plausible, that by only assuming the independence of each of the KDFs / MAC & that they can indeed efficiently pass entropy to one another, that for large keys we could argue the relevant key space is that of the 800-byte internal state of the cipher at each round (which happens to be more than three times the size of the message space of each round). This is to say, we conjecture, that by efficiently communicating more entropy from independent sources than there exists possible messages, & in fact incorporating the entropy of each message block into the cipher’s state at the start of each round, that the entropy of the internal keyspace is continually being refreshed in a way which is negligibly distinguishable from using a fresh random key each round the length of the blocksize. This seems like at least a feasible way to begin the argument that it is possible to meaningfully relate the information theoretic security of the one-time pad to a pseudo one-time pad in a measurable way.

  Efficiently Pass Entropy: By this we mean, the rate of bits extracted from one state object, to the rate of bits of actual entropy absorbed by a receiving state object, up to its XORable state size, being different by only a negligible amount. Here, we can conservatively assume the limit of this efficiency is the XORable state size, since we know that in the ideal setting, XORing n uniform random bits with an unknown message of <= n bits is perfectly hiding, which implies perfectly efficient conveyance of entropy.
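Those rates can be checked concretely: hashlib exposes each SHA3 object’s rate as block_size, & shake_128 absorbs / squeezes more than twice the bytes per call to the internal f permutation than sha3_512 does.

import hashlib

# bytes absorbed / squeezed per call to Keccak's f permutation ->
assert hashlib.shake_128().block_size == 168
assert hashlib.sha3_512().block_size == 72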
  By using shake_128 as each of the cipher’s state objects, & its larger rate of 168-bytes, more than twice the number of bytes can be passed to & extracted from each, per round & per call to their internal f permutation, as compared with sha3_512.

  If they can efficiently pass entropy, then any secret state exposed by the left_kdf or right_kdf in the creation of ciphertext, can then be efficiently displaced by the introduction of new entropy from the other state objects. This follows from the theory that a finite sized pool of entropy which is already maximally filled with entropy, cannot incorporate more entropy without fundamentally erasing internal information. From this we arrived at the new design for Chunky2048. In this new design, the shmac feeds 168-bytes to the seed_kdf, the seed_kdf creates 336-bytes to feed 168-bytes each to the left_kdf & right_kdf, the left_kdf & right_kdf each produce 128-byte keys which XOR the 256-byte plaintext, then this ciphertext feeds the shmac & the cycle repeats.

  More work needs to be done to formalize these definitions & analyze their properties. We would be grateful for any help from those with expertise in formal proofs of security in tearing apart this design as we move closer to the first stable release of the package.
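A toy model of that round dataflow, built only from the byte counts quoted above with plain hashlib shake_128 objects (a mental aid, not the library’s actual implementation):

import hashlib

def toy_round(shmac, seed_kdf, left_kdf, right_kdf, block: bytes) -> bytes:
    seed_kdf.update(shmac.digest(168))   # the shmac feeds the seed_kdf
    seed = seed_kdf.digest(336)          # the seed_kdf feeds both output KDFs
    left_kdf.update(seed[:168])
    right_kdf.update(seed[168:])
    key = left_kdf.digest(128) + right_kdf.digest(128)  # 256-byte round key
    ciphertext = bytes(k ^ p for k, p in zip(key, block))
    shmac.update(ciphertext)             # the ciphertext feeds the shmac
    return ciphertext

kdfs = [hashlib.shake_128(tag) for tag in (b"mac", b"seed", b"left", b"right")]
ciphertext = toy_round(*kdfs, block=bytes(256))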
- The SyntheticIV class’ algorithm has been updated as a result of analyzing how we could improve the salt reuse / misuse resistance of the cipher without attesting to plaintext contents in the form of an siv attached to ciphertexts. This plaintext attestation worked counter to our goal of wanting to be able to say something non-trivial about the key-deniability of the cipher. It was noticed that the plaintext padding already incorporated an 8-byte timestamp (now reduced to 4-bytes) & 16-bytes of ephemeral randomness as part of the prepended inner-header, & that these values were not at all used to seed the cipher’s state during decryption. Instead a keyed-hash was calculated over the first block of plaintext during encryption to create the 24-byte siv. But, this is actually less effective at producing salt reuse / misuse resistance than using the timestamp & ephemeral randomness directly in seeding the seed_kdf, because the timestamp is a unique & global counter that does not suffer from collisions. This understanding came while also trying to find a good use for the initial primer_key generated by the keystream generator when sending in the first obligatory None value. In the previous version it was used to initialize the shmac, but now that the shmac would be initialized directly with the user key, it was searching for a use. So the idea was to pair them.

  The new 256-byte primer_key would be XORed with the 256-byte first block of plaintext to mask the inner-header. The unmasked inner-header & 148-bytes of the shmac’s digest will seed the keystream, & the freshly seeded keystream output would be truncated to XOR the part of the masked plaintext which doesn’t include the inner-header. There’s no need now to attach the siv to the ciphertext. Instead, during decryption, the decipher algorithm has access to the inner-header, because it has access to the primer_key & the masked inner-header. The actual plaintext contents of the first block are only accessible after unmasking the inner-header & seeding the keystream. This combination alone of protection from a timestamp & 16-bytes of randomness should give a salt reuse / misuse resistance of at least ~2 ^ 64 messages per second!

  However, even with this new scheme, it would still be problematic to repeat a combination of key, salt & aad, since it would leak the XORs of timestamp information. With all of this in mind, the new formulation would include a 16-byte salt & a newly introduced 16-byte iv, both of which are attached to ciphertexts. This is a header size reduction of 16-bytes, since prior salt & siv sizes were 24-bytes each. The difference between the salt & iv is that the salt is available for the user to choose, but the iv is always generated randomly. Since the iv isn’t dependent on message data the way that the siv was, it too can now be incorporated into all of the state objects during initialization. The iv ensures that even if a key, salt & aad tuple repeats, the timestamp is still protected. Below is a description of the procedure (the annotations of the original ASCII diagrams, preserved here):

  Algorithm Diagram: Encryption. The entire first block of padded plaintext consists of the inner-header (a 4-byte timestamp & a 16-byte siv-key) followed by 236-bytes of plaintext. The first 256-byte keystream key is XORed over the entire first block, producing the masked plaintext block (the masked inner-header & the 236-byte masked plaintext). Then siv = inner-header + shmac.digest(148) seeds the keystream, & keystream(siv)[10:246] is XORed over the 236-byte masked plaintext, producing the first block of ciphertext; the masked inner-header is kept as-is.

  Algorithm Diagram: Decryption. The same two steps run in reverse: the first 256-byte keystream key is XORed over the entire first block, unmasking it into the inner-header & the unmasked ciphertext. Then siv = inner-header + shmac.digest(148) seeds the keystream, & keystream(siv)[10:246] is XORed over the 236-byte unmasked ciphertext, recovering the first block of plaintext (the 4-byte timestamp, the 16-byte siv-key & the 236-bytes of plaintext).

- The Padding class has seen some changes. Firstly, the 8-byte timestamp in the inner-header was reduced to 4-bytes. Furthermore, to get the full 136 years out of the 4-byte timestamps, the epoch used to calculate them was changed to unix timestamp 1672531200 (Sun, 01 Jan 2023 00:00:00 UTC). This is the new default 0 date for the package’s timestamps. This saves some space & aims to provide fewer bits of confirmable attestation & correlation in proof games which simulate attacks on the key-deniability of the cipher. To explain: the plaintext padding includes random padding.
That padding is intended to leave an adversary which attempts to brute force a ciphertext’s encryption key, even with unbounded computational resources, in a state where it cannot decide with better accuracy than random chance between the exponentially large number of keys which create the same shmac tag (the variable keyspace is much larger than the 32-byte tag) with their accompanying exponentially large number of plausible plaintexts (any reasonable plaintext with any variable length random padding between 16 & 272 bytes), & the actual user key & plaintext.

  We also got rid of the use of a padding_key to indicate the end of a plaintext message. It used to be sliced off the primer_key, but the primer_key has a new use now. Also, the padding_key was another form of plaintext / key attestation harming deniability that we wanted to get rid of. Instead, a simpler method is now employed: The final byte of the final block of padded plaintext is a number which tells the decryptor exactly how many bytes of random padding were added to the plaintext to fill the block. This saves a lot of space, is simpler, minimizes unnecessary key attestation, & eliminates the need for the Padding class to know anything about user secrets in order to do the padding, which is an improvement all around.
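A quick sanity check of the quoted 136-year range (pure arithmetic, not library code):

EPOCH = 1672531200  # Sun, 01 Jan 2023 00:00:00 UTC, the package's new 0 date
max_offset = 256**4 - 1                     # largest 4-byte value, in seconds
print(max_offset / (365.2425 * 24 * 3600))  # ≈ 136.1 years of timestamp range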
- New (Async)CipherStream & (Async)DecipherStream classes were introduced which allow users to utilize the online nature of the Chunky2048 cipher, ciphering & deciphering data in bufferable chunks, without needing to know about or instantiate all of the low-level classes. They automatically handle the required plaintext padding, ciphertext authentication, & detection of out-of-order message blocks. This greatly simplifies the safe usage of Chunky2048 in online mode, provides robustness, & gets rid of the need for users to worry about the dangers of release of unverified plaintexts.

- The Passcrypt algorithm was redesigned to be data-independent, more efficiently achieve its security goals, & allow for more compact hashes which include its difficulty settings metadata. The kb parameter was changed to mb, & now measures Mibibytes (MiB). A new cores parallelization parameter was added, which indicates the number of parallel processes to use to complete the procedure. And the cpu parameter now measures the number of iterations over the memory cache that are done, as well as the computational complexity of the algorithm. Passcrypt now uses shake_128 instead of sha3_512 internally. This also allows for users to specify a tag_size number of bytes to produce as an output tag. A salt_size parameter can now also be supplied to the (a)hash_passphrase methods. The (a)hash_passphrase methods now produce raw-bytes outputs & the (a)hash_passphrase_raw & (a)verify_raw methods were removed. (a)verify methods now also accept range-type objects as mb_allowed, cpu_allowed, & cores_allowed keyword argument inputs. These range objects can be used to specify the exact amount of resources which the user allows for difficulty settings, which can mitigate adversarial (or unintentional) DOS attacks on machines doing hash verification.

- Type annotations were added to most of the library, including return types, which were completely neglected in prior versions. They are still not functioning with mypy, & are serving right now as documentation & auto-complete helpers.

- Many unnecessary, low-level or badly designed features, functions & classes were either deleted or pulled into private namespaces, along with major reorganization & cleanup of the codebase. The tangled mess of internal module imports was also cleaned up. The goal is to provide access to only the highest level, simplest, & safest by default interfaces which can actually help users in their data processing & cryptographic tasks. These changes aim to improve maintainability, readability, correctness & safety.

- New top-level (a)hash_bytes functions were added to the package, which accept an unlimited number of bytes-type inputs as positional arguments & automatically canonically encode all inputs before being hashed (which aims to prevent canonicalization attacks & length-extension attacks). A key keyword-only argument can also be supplied to optionally produce keyed hashes.
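For instance, a minimal keyed, canonical hash over several inputs (the key value here is only illustrative):

from aiootp import hash_bytes

digest = hash_bytes(
    b"uid_\x0f\x12", b"session_id_\xa1", key=b"an illustrative secret key"
)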
- A new top-level GUID class was added. It creates objects which produce variable length, obfuscated, pseudo-random bytes-type globally unique identifiers based on a user-defined integer node_number, a user-defined uniform bytes salt, a nanosecond timestamp, random entropy bytes & a 1-byte counter. The benefits of its novel design explained: 1) the namespace separation of user-defined salts (like name-based uuids); 2) guaranteed output uniqueness for all instances using the same salt & node_number which occur on a different nanosecond (like time-based uuids, but with higher precision); 3) guaranteed output uniqueness between all instances which use the same salt but a different node_number, even if produced on the same nanosecond; 4) guaranteed output uniqueness for any unique instance using the same salt & node_number if it produces 256 or fewer outputs every nanosecond; 5) probabilistic output uniqueness for any unique instance using the same salt & node_number if it produces >256 outputs per-nanosecond, exponentially proportional to the number of random entropy bytes (which in turn are proportional to the output size of the GUIDs); 6) output invertibility, meaning outputs can be unmasked & sorted according to timestamp, node_number & counter; 7) random-appearing outputs, with the marginal amount of privacy which can be afforded by obfuscated affine-group operations. Admittedly, point 7) still leaves much room for improvement, as the privacy of the design could instead be ensured by strong hardness assumptions given by other types of invertible permutations or group operations. The goal was to create something efficient (below 3µs per guid), which met the above criteria, & that produced output bit sequences which passed basic randomness tests. We’d be excited to accept pull requests which use strong invertible permutations or group operations that are also about as efficient, & that for n-byte declared output sizes, outputs do not repeat for fewer than ~256 ** n sequential input values.

- The top-level DomainKDF class now also creates KDF objects which automatically canonically encode all inputs.
- The X25519 protocols now return DomainKDF results instead of plain sha3_512 objects.
- The (Base)Comprende classes were greatly simplified, & the caching & messages features were removed.
- The top-level (a)mnemonic functions now return lists of bytes-type words, instead of str-type, & can now be used to quickly generate lists of randomly selected words without providing a (now optional) passphrase.
- The (Async)Database classes’ (a)generate_profile methods no longer require tokens to first be created by the user. That is now handled internally, & the external API accepts raw bytes inputs for credentials from the user.
- The PackageSigner & PackageVerifier now use sha384 for digests instead of sha512. The verifier now by default recomputes & verifies the digests of files from the filesystem, using the path keyword argument to the constructor as the root directory for the relative filepaths declared in the “checksums” entry of the signature summary.

Minor Changes

- A new Clock class was added to the generics.py module which provides a very intuitive API for handling time & timestamp functionalities for various time units.
- The test suite was reorganized, cleaned up & extended significantly, & now also utilizes pytest-asyncio to run async tests. This led to many found & fixed bugs in code that was not being tested. There’s still a substantial amount of tests that need to be written. We would greatly appreciate contributions which extend our test coverage.
- Many improvements to the correctness, completeness & aesthetic beauty of the code documentation with the addition of visual aides, diagrams & usage examples.
- A top-level report_security_issue function was added, which provides a terminal application for users to automatically encrypt security reports to us using our new X25519 public key.
- We lost access to our signing keys in encrypted drives which were damaged in flooding. So we decided to shred them & start fresh. Our new Ed25519 signing key is “70d1740f2a439da98243c43a4d7ef1cf993b87a75f3bb0851ae79de675af5b3b”. Contact us via email or twitter if you’d like to confirm that the key you are seeing is really ours.

Changes for version 0.21.1

Minor Changes

- Fix usage of the wrong package signing key.

Changes for version 0.21.0

Major Changes

Non-backwards compatible changes:

- Altered the Chunky2048 cipher’s key derivation to continuously extract
entropy from users’ main encryption key. The design goal of the cipher
is to be as close as possible to a one-time pad, but because we use
key derivations to mix together all the relevant values used by the
cipher, there’s a limited amount of entropy that can be extracted
from the main key no matter how large it is. The changes feed the
main key into the internal seed KDF multiple times when creating the
cipher’s initial seeds, & once on every iteration of the (a)bytes_keys generators.

- Merged two internal KDFs used by the cipher into the one seed KDF. This also now means that using the (a)update_key methods of the StreamHMAC class updates the KDF used to ratchet the encryption keystream.

- Use sha3_512 instead of sha3_256 for the StreamHMAC final HMAC & slice the first bytes designated by the package’s commons.py module. This allows the HMAC length to be specified & changed easily. It’s highly discouraged to use anything less than 32-bytes.
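Sketched with plain hashlib (the real class derives its MAC input quite differently; this only illustrates the configurable truncation):

import hashlib

HMAC_BYTES = 32  # configurable; anything smaller is highly discouraged
tag = hashlib.sha3_512(b"...authenticated stream state...").digest()[:HMAC_BYTES]
assert len(tag) == HMAC_BYTES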
Minor Changes

- Internal refactorings.
- Updates to tests.

Changes for version 0.20.7

Major Changes

- Changed the way the Padding.(a)end_padding methods calculate the required padding length. The change causes the methods to now assume that the plaintext has already been prepended with the start padding.
- The various test_* & verify_* functions/methods throughout the package have been changed to return None on successful validation instead of True, which more closely matches the convention for exception-raising validators.
- The default block_id length was changed from 16-bytes to 24-bytes.

Minor Changes

- Make the (a)end_padding methods of the Padding class assume the
supplied data has already been prepended with the start padding. This
better integrates with streams of plaintext (online usage).
- Small internal refactorings.
- Documentation fixes.

Changes for version 0.20.6

Major Changes

- The (Async)Database classes now support storing raw bytes type
tag entries! This is a huge boon to time/space efficiency when needing
to store large binary files, since they don’t need to be converted to
& from base64. This feature was made possible with only very minor
changes to the classes, & they’re fully backwards-compatible! Older
versions will not be able to handle raw bytes entries, but old JSON
serializable entries work the same way they did.

Minor Changes

- Docfixes.
- Small refactorings.
- Add new tests & make existing tests complete faster.
- Support empty strings to be passed to the (Async)Database constructors’ directory kwarg, signifying the current directory. Now None is
the only falsey value which triggers the constructors to use the default
database directory.
- Fixed a bug in the AsyncDatabase class’ aset_tag method, which
would throw an attribute error when passed the cache=False flag.
- Add Windows support to the CI tests.

Changes for version 0.20.5

Minor Changes

- Include the missing changelog entries for v0.20.4.

Changes for version 0.20.4

Major Changes

- Add python 3.10 support by copying the async_lru package’s main module
from their more up-to-date github repository instead of from PyPI.

Minor Changes

- Small refactorings & code cleanups.
- Documentation updates.
- Type-hinting updates.
- Cleanups to the package’s module API.
- Improve CI & extend to python 3.10.

Changes for version 0.20.3

Minor Changes

- Small refactorings.
- Documentation updates.
- Type-hinting updates.
- Additional tests.

Changes for version 0.20.2

Major Changes

- Changed the Padding class’ (a)check_timestamp methods to (a)test_timestamp, to better match the naming convention in the
rest of the package.
- Removed the (a)sum_sha3__(256/512) chainable generator methods from the Comprende class.
- Removed the os.urandom based functions in the randoms.py module.

Minor Changes

- Fixes & improvements to out of date documentation.
- Small fixes to type-hints.
- Small refactorings.
- Add (a)generate_key functions to the package & (Async)Keys classes.
- Fix some exception messages.

Changes for version 0.20.1

Minor Changes

- Small fixes & improvements to documentation.
- Small fixes & improvements to tests.
- Small fixes to type-hints.
- Small re-organization of source file contents.
- Small bug fixes.

Changes for version 0.20.0 (Backwards incompatible updates)

Major Changes

- The (a)json_(en/de)crypt & (a)bytes_(en/de)crypt functions & methods now only expect to work with bytes type ciphertext. And,
the low-level cipher generators expect iterables of bytes where they
used to expect iterables of integers.
- The pid keyword-only argument throughout the package was changed to aad to more clearly communicate its purpose as authenticated additional data.
- The key, salt & aad values throughout the package are now expected to be bytes type values.
- The key must now be at least 32-bytes for use within the Chunky2048 cipher & its interfaces.
- The salt, for use in the Chunky2048 cipher & its interfaces,
was decreased from needing to be 32-bytes to 24-bytes.
- The siv, for use in the Chunky2048 cipher & its interfaces, was increased from needing to be 16-bytes to 24-bytes.
- The new KeyAADBundle class was created as the primary interface for consuming key, salt, aad & siv values. This class’ objects are the only ones that are used to pass around these values in low-level Chunky2048 cipher functionalities. The higher-level cipher functions are the only public interfaces that still receive these key, salt, & aad values.
- The KeyAADBundle now manages the new initial key derivation of the Chunky2048 cipher. This new algorithm is much more efficient,
utilizing the output of the keystream’s first priming call instead of
throwing it away, removing the need for several other previously used
hashing calls.
- The bytes_keys & abytes_keys keystream generator algorithms were improved & made more efficient. They also now only receive bytes type coroutine values or None.
- The StreamHMAC algorithms were improved & made more efficient.
- The Chunky2048 class now creates instances that initialize, & whose
methods are callable, much more efficiently by reducing its previously
dynamic structure. It’s now reasonable to use these instances in code
that has strict performance requirements.TheKeys&AsyncKeysclasses were trimmed of all instance
behaviour. They are now strictly namespaces which contain static or
class methods. All instances of the word password throughout the package have been
replaced with the wordpassphrase. ThePasscryptclass now only
acceptsbytestypepassphrase&saltvalues. The returned
hashes are also now alwaysbytes.ThePadding&BytesIOclasses’ functionalities were made more
efficient & cleaned up their implementations. New PackageSigner & PackageVerifier classes were added to the keygens.py module to provide an intuitive API for users to sign their
own packages. This package now also uses these classes to sign itself.The newgentools.pymodule was created to organize the generator
utilities that were previously scattered throughout the package’s
top-level namespaces.The new_exceptions.pymodule was created to help organize the
exceptions raised throughout the package, improving readability
& maintainability.The new_typing.pymodule was added to assist in the long process
of adding functional type-hinting throughout the package. For now,
the type hints that have been added primarily function as documentation. A new Slots base class was added to the commons.py module to simplify the creation of more memory efficient & performant container classes. The new _containers.py module was made for such classes for use throughout the package. And, most classes throughout the package were given __slots__ attributes.
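A minimal sketch of the kind of container this enables; the subclass & its __init__ are illustrative, & the real base class may provide its own constructor::

    from aiootp.commons import Slots  # module location per these notes

    class Point(Slots):
        # declaring __slots__ avoids a per-instance __dict__,
        # saving memory & speeding up attribute access
        __slots__ = ("x", "y")

        def __init__(self, x: int, y: int) -> None:
            self.x = x
            self.y = y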
A new OpenNamespace class was added, which is a subclass of Namespace, with the only difference being that instances do not omit attributes
from their reprs. The new (a)bytes_are_equal functions, which are pointers to hmac.compare_digest from the standard library, have replaced the (a)time_safe_equality functions. The (a)sha_256(_hmac) & (a)sha_512(_hmac) functions have had their names changed to (a)sha3__256(_hmac) & (a)sha3__512(_hmac). This was done to communicate that they are actually SHA3 functions, but the double underscore is to keep them differentiable from the standard library's hashlib objects. They can now also return bytes instead of hex strings if their hex keyword argument is falsey.
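A small usage sketch, assuming the renamed functions are top-level exports & that the keyword arguments shown match the real signatures::

    import aiootp

    key = aiootp.generate_key()
    digest_hex = aiootp.sha3__256(b"some data", hex=True)   # hex string
    digest_raw = aiootp.sha3__256(b"some data", hex=False)  # raw bytes
    # the hmac variant additionally takes key material
    tag = aiootp.sha3__256_hmac(b"some data", key=key)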
The base functionality of the Comprende class was refactored out into a BaseComprende class. The chainable data processor generator methods remain in the Comprende class. Their endpoint methods (such as (a)list & (a)join) have also been changed so they don't cache results by default. The Passcrypt class' kb & hardness can now be set to values
independently from one another. The algorithm runs on the new(a)bytes_keyscoroutines, & a slightly more effective cache building
procedure.The databases classes now don’t preload their values by default. And,
various methods which work with tags & metatags have been given a cache keyword-only argument to toggle on/off the use of the cache for each operation. New method additions/changes to the database classes:

- (a)rollback_tag, (a)clear_cache, & a filenames property were added.
- (a)hmac was changed to (a)make_hmac, & now returns bytes hashes.
- (a)save was changed to (a)save_database.
- (a)query was changed to (a)query_tag.
- (a)set was changed to (a)set_tag.
- (a)pop was changed to (a)pop_tag.
- The tags, metatags & filenames properties now return sets instead of lists.
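A brief sketch of the renamed tag methods, assuming a Database opened directly from a key; tag names & values are hypothetical::

    import aiootp

    key = aiootp.generate_key()
    db = aiootp.Database(key)
    db.set_tag("note", "hello")     # formerly set
    value = db.query_tag("note")    # formerly query
    db.save_database()              # formerly save
    db.pop_tag("note")              # formerly pop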
The Ropake class has been removed from the package pending changes to the protocol & its implementation. The (a)generate_salt function now returns bytes type values,
& takes asizekeyword-only argument, with no default, that determines
the number of bytes returned between [8, 64].The(a)random_512&(a)random_256public functions can now cause
their underlying random number generators to fill their entropy pools
when either theroundsorrefreshkeyword arguments are specified.The following variables were removed from the package:(a)keys,(a)passcrypt,(a)seeder,(a)time_safe_equality,Datastream,bits,(a)seedrange,(a)build_tree,(a)customize_parameters,convert_class_method_to_member,convert_static_method_to_member,(a)xor,(a)padding_key,(a)prime_table,(a)unique_range_gen,(a)non_0_digits,(a)bytes_digits,(a)digits,(a)permute,(a)shuffle,(a)unshuffle,(a)create_namespace,
((a)depad_plaintext,(a)pad_plaintext& their generator forms.
Only the non-generator forms remain in thePaddingclass), (The(a)passcrypt,(a)uuids,(a)into_namespacemethods from the
database classes), (The(a)csprbgfunctions were removed & instead
the(a)csprngfunctions producebytestype values.)Thorough & deep refactorings of modules, classes & methods. Many methods
& functions were made private, cleaning up the APIs of the package,
focusing on bringing the highest-level functionalities to top level
namespaces accessible to users. Some purely private functionalities
were entirely moved to private namespaces not readily accessible to
users.Most of the constants which determine the functionalities throughout
the package were refactored out intocommons.py. This allows
for easy changes to protocols & data formats.Minor ChangesMany documentation improvements, fixes, trimmings & updates.Added aWeakEntropyclass to therandoms.pymodule.Changes for version 0.19.4Major ChangesCreated a privateEntropyDaemonclass to run a thread in the
background which feeds into & extracts entropy from some of the
package’s entropy pools. Also moved the separate private_cacheentropy pools from the parameters to the random number generators.
They’re now a single private_poolshared global that’s
asynchronously & continuously updated by the background daemon thread.Switched therandomportion of function names in therandoms.pymodule to readuniqueinstead. This was done to the functions which
are actually pseudo-random. This should give users a better idea of
which functions do what. The exception is that therandom_sleep&arandom_sleepfunctions have kept their names even though they
sleep a pseudo-randomly variable amount of time. Their names may
cause more confusion if they were either(a)unique_sleepor(a)urandom_sleep. Because they don’t useos.urandom& what
is aunique_sleep? When / if a better name is found these
function names will be updated as well.Minor ChangesVarious docstring / documentation fixes & refactorings.Changes for version 0.19.3Major ChangesRemovedascii_encipher,ascii_decipher,aascii_encipher&aascii_deciphergenerators from theChunky2048&Comprendeclasses, & the package. It was unnecessary, didn’t fit well with the
intended use of thePaddingclass, & users would be much better
served by converting their ascii to bytes to use thebytes_generators instead.Removed themap_encipher,map_decipher,amap_encipher&amap_deciphergenerators from theChunky2048&Comprendeclasses, & the package. They were not being used internally to the
package anymore, & their functionality, security & efficiency could
not be guaranteed to track well with the changes in the rest of the
library.Added domain specificity to theX25519protocols’ key derivations.Renamed the database classes’(a)encrypt&(a)decryptmethods
to(a)json_encrypt&(a)json_decryptfor clarity & consistency
with the rest of the package. Their signatures, as well as those in(a)bytes_encrypt&(a)bytes_decrypt, were also altered to
receive plaintext & ciphertext as their only positional arguments.
Thefilenameargument is now a keyword-only argument with a defaultNonevalue. This allows databases to be used more succinctly for
manual encryption & decryption by making the filename tweak optional.Therunskeyword argument for the functions inrandoms.pywas
renamed torounds. It seems more clear that it is controlling the
number of rounds that are internally run within the (a)random_number_generator functions when deriving new entropy.

Minor Changes

- Fixes to docstrings & tutorials. Rewrite & reorganization of the PREADME.rst & README.rst. More updates to the readmes are still
on the way.Slight fix to the Passcrypt docstring’s algorithm diagram.Moved the default passcrypt settings to variables in thePasscryptclass.Added the ability to send passcrypt settings into themnemonic&amnemoniccoroutines, which call the algorithm internally but
previously could only use the default settings.Some code cleanups & refactorings.Changes for version 0.19.2Minor ChangesMade the output lengths of thePaddingclass’ generator functions
uniform. When the footer padding on a stream of plaintext needs to
exceed the 256-byte blocksize (i.e. when the last unpadded plaintext
block's length L is 232 < L < 256), then another full block of
padding is produced. The generators now yield 256-byte blocks
consistently (except during depadding when the last block of plaintext
may be smaller than the blocksize), instead of sometimes producing a
final padded block which is 512 bytes.Changes for version 0.19.1Minor ChangesFixed a bug where database classes were evaluating as falsey when they
didn’t have any tags saved in them. They should be considered truthy
if they’re instantiated & ready to store data, even if they’re
currently empty & not saved to disk. This was reflected in their__bool__methods. The bug caused empty metatags not to be loaded
when an instance loads, even whenpreloadis toggledTrue.Removed the coroutine-receiving logic from thePaddingclass’Comprendegenerators. Since they buffer data, the received values
aren’t ever going to coincide with the correct iteration & will be
susceptible to bugs. Fixed a bug in the Padding class' Comprende generators which
cut iteration short because not enough data was available from the
underlying generators upfront. Now, if used correctly to pad/depad
chunks of plaintext 256 bytes at a time, then they work as expected.Theupdate,aupdate,update_key&aupdate_keymethods
in both theStreamHMAC&DomainKDFclasses now returnselfto allow inline updates.Addedacsprng&csprngfunction pointers to theChunky2048class.Updates to docstrings which didn’t get updated with info on the newsynthetic IVfeature.Some other docstring fixes.Some small code cleanups & refactorings.Changes for version 0.19.0Major ChangesSecurity Upgrade: The package’s cipher was changed to an online,
authenticated scheme with salt reuse / misuse resistance. This was
achieved through a few backwards incompatible techniques: A synthetic IV (SIV) is calculated from the keyed-hash of the first
256-byte block of plaintext. The SIV is then used to seed the
keystream generator, & is used to update the validator object. This
ensures that if the first block is unique, then the whole ciphertext
will be unique.A 16-byte ephemeral & random SIV-key is also prepended to the
first block of plaintext during message padding. Since this value
is also hashed to derive the SIV, this key gives a strong
guarantee that a given message will produce a globally unique
ciphertext.An 8-byte timestamp is prepended to the first block of plaintext
during padding. Timestamps are inherently sequential, they can be
verified by a user within some bounds, & can also be used to
mitigate replay attacks. Since it’s hashed to make the SIV, then
it helps make the entire ciphertext unique.After being updated with each block of ciphertext, the validator’s
current state is again fed into the keystream generator as a new
rotating seed. This mitigation is limited to ensuring only that
every following block of ciphertext to a block which is unique
will also be unique. More specifically this means that:ifallother mitigations fail to be unique, or are missing, then
the first block which is uniquewill appear the same, except
for the bits which have changed,but, all following blocks will
be randomized.This limitation could be avoided with a linear
expansion in the ciphertext size by generating an SIV for each
block of plaintext. This linear expansion is prohibitive as a
default setting, but the block level secrecy, even when all other
mitigations fail, is enticing. This option may be added in the
future as a type of padding mode on the plaintext.The SIV-key is by far the most important mitigation, as it isn’t
feasibly forgeable by an adversary, & therefore also protects against
attacks using encryption oracles. These changes can be found in theSyntheticIVclass, the (en/de)cipher & xor generators, & theStreamHMACclass in theciphers.pymodule. The padding
changes can also be found in the new Padding class in the generics.py module. The SIV is attached in the clear with ciphertexts & was designed to function with minimal user interaction. It needs only to be passed into the StreamHMAC class during decryption – during encryption it's automatically generated & stored in the StreamHMAC validator object's siv property attribute.
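A rough sketch of the siv's round trip; the constructor arguments follow the descriptions in these notes, but the exact StreamHMAC signature is an assumption::

    import os
    from aiootp import StreamHMAC  # top-level export assumed

    key, salt, pid = os.urandom(64), os.urandom(32), b"app v1"

    # encryption side: the validator generates the siv automatically
    enc_hmac = StreamHMAC(key, salt=salt, pid=pid)
    # ... run the (en)cipher generators with enc_hmac, then ship
    # enc_hmac.siv in the clear alongside the ciphertext ...
    siv = enc_hmac.siv

    # decryption side: hand the stored siv back to the validator
    dec_hmac = StreamHMAC(key, salt=salt, pid=pid, siv=siv)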
Security Patch: The internal sha3_512 kdfs to the akeys, keys, abytes_keys & bytes_keys keystream generators are now updated with 72 bytes of (64 key material + 8 padding), instead of just 64
bytes of key material. 72 bytes is thebitrateof thesha3_512object. This change causes the internal state of the object to be permuted
for each iteration update & before releasing a chunk of key material.
Frequency analysis of ciphertext bytes didn’t smooth out to the
cumulative distribution expected for all large ciphertexts prior to
this change. But after the change the distribution does normalize as
expected. This indicates that the key material streams were biased
away from random in a small but measurable way. Although, no
particular byte values seem to have been preferred by this bias, this
is a huge shortcoming with unknown potential impact on the strength
of the package’s cipher. This update is strongly recommended & is
backwards incompatible.This update gives a name to the package’s pseudo-one-time-pad cipher
implementation. It’s now calledChunky2048! TheOneTimePadclass’ name was updated toChunky2048to match the change.ThePreemptiveHMACValidationclass & its related logic in theStreamHMACclass was removed. The chaining of validator output
into the keystream makes running the validator over the ciphertext
separately or prior to the decryption process very costly. It would
either mean recalculating the full hash of the ciphertext a second
time to reproduce the correct outputs during each block, or a large
linear memory increase to hold all of its digests to be fed in some time after preemptive validation. It's much simpler to remove that functionality & potentially replace it with something else that fits the user's applications better. For instance, the current_digest & acurrent_digest methods can produce secure, 32-byte authentication tags at any arbitrary blocks throughout the cipher's runtime, which validate the ciphertext up to that point. Or, the next_block_id & anext_block_id methods, which are a more robust option, because each id they produce validates the next ciphertext block before updating the internal state of the validator. This acts as an automatic message ordering algorithm, & leaves the deciphering party's state unharmed by dropped packets or manipulated ciphertext.
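A sketch of the two validation styles named above, continuing the enc_hmac validator from the previous sketch; next_ciphertext_block is a hypothetical stand-in for a 256-byte block::

    # a 32-byte tag authenticating all ciphertext processed so far
    tag = enc_hmac.current_digest()

    # or derive an id which the receiving party checks against the
    # next ciphertext block before its validator's state is updated
    block_id = enc_hmac.next_block_id(next_ciphertext_block)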
The update_key & aupdate_key methods were also added to the StreamHMAC class. They allow the user to update the validator's internal key with new entropy or context information during its
runtime.TheComprendeclass now takes achainedkeyword-only argument
which flags an instance as a chained generator. This flag allows
instances to communicate up & down their generator chain using the
sharedNamespaceobject accessible by theirmessagesattribute.The chainableComprendegenerator functions had their internals
altered to allow them to receive, & pass down their chain, values
sent from a user using the standard coroutinesend&asendmethod syntax.Comprendeinstances no longer automatically reset themselves every
time they enter their context managers or when they are iterated over.
This makes their interface more closely imitate the behavior of
async/sync generator objects. To get them to reset, thearesetorresetmethods must be used. The message chaining introduced in
this update allows chains ofComprendeasync/sync generators to
inform each other when the user instructs one of them to reset.The standard library’shmacmodule is now used internally to thegenerics.pymodule’ssha_512_hmac,sha_256_hmac,asha_512_hmac&asha_256_hmacfunctions. They still allow any type of data to be
hashed, but also now default to hashingbytestype objects as
they are given.The newDomainsclass, found ingenerics.py, is now used to
encode constants into deterministic pseudo-random 8-byte values for
helping turn hash function outputs into domain-specific hashes. Its
use was included throughout the library. This method has an added
benefit with respect to this package's usage of SHA-3. That being, the bitrate for both sha3_512 & sha3_256 is (2 * 32 * k) + 8 bytes, where k = 1 for sha3_512 & k = 2 for sha3_256. This means that prepending an 8-byte domain string to their inputs also makes it more efficient to add some multiple of key material to make the input data precisely equal the bitrate. More info on domain-specific hashing can be found here.
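A sketch of the idea: an 8-byte domain tag is prepended to the hash input so identical data hashed for different purposes produces unrelated digests. The helper below is hypothetical; only the Domains class & its purpose come from these notes::

    import hashlib

    def domain_hash(domain: bytes, data: bytes) -> bytes:
        # domain stands in for the deterministic pseudo-random
        # 8-byte values the Domains class encodes from constants
        assert len(domain) == 8
        return hashlib.sha3_256(domain + data).digest()

    filename_digest = domain_hash(b"filename", b"some data")
    hmac_digest = domain_hash(b"hmac\x00\x00\x00\x00", b"some data")
    assert filename_digest != hmac_digest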
A new DomainKDF class in ciphers.py was added to create a more standard & secure method of key derivation to the library, which also incorporates domain separation. Its use was integrated throughout
theAsyncDatabase&Databaseclasses to mitigate any further
vulnerabilities of their internal key-derivation functions. The
database classes now also use bytes-type keys internally, instead
of hex strings.ThePasscryptclass now contains methods which create & validate
passcrypt hashes which have their settings & salt attached to them.
Instances can now also be created with persistent settings that are
automatically sent into instance methods.Minor ChangesMany fixes of docstrings, typos & tutorials.Many refactorings: name changes, extracted classes / functions,
reorderings & moves.Various code clean-ups, efficiency & usability improvements.Many constants used throughout the library were given names defined
in thecommons.pymodule.Removed extraneous functions throughout the library.The asymmetric key generation & exchange functions/protocols were
moved from theciphers.pymodule tokeygens.py.Add missing modules to the MANIFEST.rst file.Added aUniformPrimesclass to the__datasetsmodule for efficient
access to primes that aren’t either mostly 1 or 0 bits, as is the case for
theprimeshelper table. These primes are now used in theHasherclass’amask_byte_order&mask_byte_ordermethods.Thetime_safe_equality&atime_safe_equalitymethods are now
standalone functions available from thegenerics.pymodule.Addedreset_poolto theProcesses&Threadsclasses. Also
fixed a missing piece of logic in theirsubmitmethod.Added various conversion values & timing functions to theasynchs.pymodule.Themake_uuid&amake_uuidcoroutines had their names changed tomake_uuids&amake_uuids.Created a newDatastreamclass ingenerics.pyto handle buffering
& resizing iterable streams of data. It enables simplifying logic that
must happen some number of iterations before the end of a stream. It’s
utilized in thePaddingclass’ generator functions available as
chainableComprendemethods.Thedata&adatagenerators can now produce a precise number ofsize-lengthblocksas specified by a user. This gets rid of the
confusing usage of the oldstopkeyword-only argument, which stopped
a stream afterapproximatelysizenumber of elements.Improved the efficiency & safety of entropy production in therandoms.pymodule.Changes for version 0.18.1Major ChangesSecurity Patch: Deprecated & replaced an internal kdf for saving
database tags due to a vulnerability. If an adversary can get a user
to reveal the value returned by thehmacmethod when fed the tag
file’s filename & the salt used for that encrypted tag, then they
could deduce the decryption key for the tag. A version check was
added only for backwards compatibility & will be removed on the next
update. All databases should continue functioning as normal, though
all users are advised tore-save their databasesafter upgrading
so the new kdf can be used. This will not overwrite the old files,
so they’ll need to be deleted manually.Replaced usage of the asyncswitchcoroutine withasyncio.sleepbecause it was not allowing tasks to switch as it was designed to.
Many improvements were made related to this change to make the
package behave better in async contexts.Removed the private method in the database classes which held a
reference to the root salt. It’s now held in a private attribute.
This change simplifies the code a bit & allows instances to be
pickleable.Theatimeout&timeoutchainableComprendegenerator
methods can now stop the generators’ executions mid-iteration. They
run them in separate async tasks or thread pools, respectively, to
achieve this. The await_on & wait_on generators now restart their timeout
counters after every successful iteration that detected a new value
in theirqueue. Thedelaykeyword argument was changed toprobe_frequency, a keyword-only argument.Removed the package’s dependency on theaioitertoolspackage.Made thesympypackage an optional import. If any of its
functionalities are used by the user, the package is only then
imported & this is done automatically.Various streamlining efforts were made to the imports & entropy
initialization to reduce the package’s import & startup time.Minor ChangesFixes of various typos, docstrings & tutorials.Various cleanups, refactorings & efficiency improvements.Added new tests for detecting malformed or modified ciphertexts.Removed extraneous functions ingenerics.py.Add aUNIFORM_PRIME_512value to__datasets.pyfor use in theHasher.mask_byte_order&Hasher.amask_byte_ordermethods.
Those methods were also altered to produce more uniform looking
results. The returned masked values are now also 64 bytes by default.Added anautomate_key_usekeyword-only boolean argument to the init
for theOneTimePad,Keys&AsyncKeysclasses. It can be toggled to
stop the classes from overwriting class methods so they
automatically read the instance’s key attribute. This optionally
speeds up instantiation by an order of magnitude at the cost of
convenience. Fixed the asynchs.Threads class' wrongful use of a multiprocessing Manager.list object instead of a regular list. Changed the _delay keyword-only argument in Processes & Threads classes' methods to probe_frequency so users can specify how often
results will be checked for after firing off a process, thread, or
associated pool submission.Now theasubmit&submitmethods inProcesses&Threadscan accept keyword arguments.Addedagather&gathermethods to theThreads&Processesclasses. They receive any number of functions, &args&/orkwargsto
pass to those functions when submitting them to their associated
pools.Changed therunsuminstance IDs from hex strings to bytes & cleaned
up the instance caching & cleaning logic.Altered & made private theasalted_multiply&salted_multiplyfunctions in therandoms.pymodule.Started a new event loop specific to therandoms.pymodule which
should prevent theRuntimeErrorwhenrandom_number_generatoris called from within the user’s running event loop.Added aValueErrorcheck to the(a)cspr(b/n)gfunctions inrandoms.py. This will allow simultaneously running tasks to
request entropy from the function by returning a result from a
newly instantiated generator object.Added checks in the*_encipher&*_deciphergenerators to
help assure users correctly declare the mode for their StreamHMAC
validator instances.Fixed the__len__function in the database classes to count the
number of tags in the database & exclude their internal maintenance
files.TheTimeoutErrorraised after decrypting a ciphertext with an
expired timestamp now contains the seconds it has exceeded thettlin avalueattribute.The timestamp used to sign the package now displays the day of
signing instead of the second of signing.The(a)sum_sha_*&(a)sum_passcryptgenerators were altered to
reapply the suppliedsalton every iteration.Stabilized the usability of thestopkeyword-only argument in theadata&datagenerators. It now directly decides the total
number of elements in asequenceallowed to be yielded.Changes for version 0.18.0Major ChangesSecurity Patch: Rewrote the HMAC-like creation & authentication
process for all of the package’s ciphers. Now, the*_encipher&*_decipherComprendegenerators must be passed a validator
object to hash the ciphertext as it’s being created / decrypted.
The StreamHMAC class was created for this purpose. It's initialized
with the user’s long-term key, the ephemeral salt & the pid value.
The pid value can now effectively be used to validate additional data.
These changes force the package’s cipher to be used as an AEAD cipher.Security Patch: The package’s*_hmachash functions & theComprendeclass’ hash generators were rewritten to prepend salts & keys to data
prior to hashing instead of appending. This is better for several
important reasons, such as: reducing the amortizability of costs in
trying to brute-force hashes, & more closely following the reasoning
behind the HMAC spec even though sha3 has a different security profile.Algorithm Patch: Theakeys,keys,abytes_keys, &bytes_keysalgorithms have been patched to differentiate each iteration’s two
sha3_512 hashes from one another in perpetuity. They contained a design
flaw which would, if both sha3_512 objects landed upon the same
1600-bit internal state, then they would produce the same keystreams
from then on. This change is backwards incompatible. This flaw is
infeasible to exploit in practice, but since the package’s hashes &
ciphertext validations were already changing this release, there was
no reason to not fix this flaw so that it’s self-healing if they ever
do land on the same internal states.ThePasscryptclass & its algorithm were made more efficient to
better equalize the cost for users & adversaries, & to simplify the
algorithm. Any inefficiencies in an implementation would likely cause
the adversary to be able to construct optimized implementations to
put users at an even greater disadvantage at protecting their inputs
to the passcrypt algorithm. It used thesum_sha_256hash function
internally, & since it was also changing in a non-backwards
compatible way with this update, it was the best time to clean up
the implementation.Updated the package’s description & its docstrings that refer to
the package’s cipher as an implementation of the one-time-pad. It’s
not accurate since the package uses pseudo-random hash functions to
produce key material. Instead, the package’s goal is to create a
pseudo-one-time-pad that’s indistinguishable from a one-time-pad.
The OneTimePad class will keep its name for succinctness. New amake_token, make_token, aread_token & read_token class & instance methods were added to the OneTimePad class. These tokens are urlsafe base64 encoded, encrypted, authenticated & contain timestamps that can enforce a time-to-live for each token.
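A usage sketch; the OneTimePad constructor & the ttl keyword are assumptions drawn from the surrounding notes::

    import os
    import aiootp

    pad = aiootp.OneTimePad(os.urandom(64))   # instance holding the key
    token = pad.make_token(b"session data")   # urlsafe, authenticated
    data = pad.read_token(token, ttl=3600)    # rejects tokens > 1 hour old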
Non-backwards compatible changes to the database classes' filenames, encryption keys & HMACs. The *_hmac hash functions that the
databases rely on were changing with this update, so additionally the
filenames table used to encode the filenames was switched from theBASE_36_TABLEto theBASE_38_TABLE. Both tables are safe for
uri’s across all platforms, but the new table can encode information
slightly more efficiently.Major refactorings & signature changes across the package to make
passing keys & salts to*_hmacfunctions & theComprendeclass’ hash generators explicit.Removed theofkeyword argument from all of theComprendeclass’ generators. It was overly complicating the code, & was not
entirely clear or useful for settings outside of thetags&atagsgenerators.Removedpybase64from the package & its dependencies list. The
built-in pythonbase64module works just fine.Sorted theWORDS_LIST,ASCII_ALPHANUMERIC, &BASE_64_TABLEdatasets.Thesalt&asaltfunctions have been renamed togenerate_salt&agenerate_saltfor clarity’s sake, & to reduce naming
collisions.Added another redundancy to thearandom_number_generator&random_number_generatorfunctions. Now the async tasks it prepares
into a list are pseudo-randomly shuffled before being passed intoasyncio.gather.Minor ChangesAdded a logo image to the package.Separated the FAQ section fromPREADME.rst.Theprimes&bitsdatasets are now represented in hex in the
source code.Added aBASE_38_TABLEdataset to the package.The database classes now fill an ephemeral dictionary of filenames
that couldn’t be used to successfully load a tag file, available from
within the_corrupted_filesattribute.TheComprendeclass’acache_check&cache_checkcontext
manager methods are now calledaauto_cache&auto_cache.Added newbytes_count&abytes_countgenerators togenerics.pymodule which increment each iteration & yield the results as bytes.Removed theakeypair&keypairfunctions from the package.
Their successors are theasingle_use_key&single_use_keymethods
in theAsyncKeys&Keysclasses. The attempt is to clarify &
put constraints on the interface for creating a bundle of key
material that has a single-use-only salt attached, as well as the pid
value.Moved ciphertext encoding functions into theBytesIOclass from
the globalgenerics.pymodule.SplitPrimeGroupsinto two classes, one higher-level class by the
same name & aBasePrimeGroupsclass. The former also has some
added functionality for masking the order of bytes in a sequence
using a modular exponentiation. The Hasher class now has functionality added to mask the order
of a bytes sequence with a modular multiplication.Fixed the name of the project in the attribution lines in several
source files.Reconciled tests with the major changes in this release.The old identity key for the package that was signed by the gnupg
identity key was shredded & replaced with a new signed key.Several bug fixes to thesetup.pyautomated code signing.Changes for version 0.17.0Major ChangesSecurity Patch: The HMAC verifiers on ciphertexts did not include
thesaltorpidvalues when deriving the HMAC. This
associated data can therefore be changed to cause a party to
decrypt a past ciphertext with a salt or pid of an attacker’s
choosing. This is a critical vulnerability & it is highly recommended
all users update. The fix is to hash the ciphertext,salt&pidtogether & sending that hash into the validator to have
the HMAC created / tested. This change will cause all prior
ciphertexts to be marked invalid by the validator.Refactored the names of the Comprende cipher methods to better
communicate their intended use as lower level tools that cannot be
used on their own to obtain authenticated, CCA or CPA secure
encryption.Added more comprehensive tests forX25519&Ed25519classes,
as well as the protocols that utilize theX25519ecdh exchange.
Fixed some bugs in the process.X25519instances that contain a secret key now have access to
protocol methods which automatically pass their key in as a keyword
argument. This simplifies their usage further.Incorporated the newHasherclass into the package’s random
number generator to improve its entropy production.

Minor Changes

- Various fixes to typos, docstrings & tutorials.
- New tutorials & docs added.
- Changed the default table in BytesIO's json_to_ascii, ajson_to_ascii, ascii_to_json & aascii_to_json to the URL_SAFE_TABLE to facilitate the creation of urlsafe tokens.
- Removed all code in the Ropake class that was used to create a default
database to store a default salt for users. All of that functionality
is expected to be handled by the database classes’ token & profile
creation tools.Fixed bug in package signing script that called hex from a string.Updated the package signing script to include these metadata in the
signatures of the ephemeral keys: name of the package, version, the
date in seconds.Added metadata to thesetup.cfgfile.Make passcrypt objects available from thekeygensmodule.Add more consistent ability withinRopakeclass to specify a
time-to-live for protocol messages.Added check to make sure instances ofX25519&Ed25519are
not trying to import a new secret key once they already have one.
This won’t be allowed in favor of creating a new object for a new
secret key.Fixed bug in database classes’ bytes ciphers which called themselves
recursively instead of calling the global functions of the same name.Changes for version 0.16.0Major ChangesAllDatabase&AsyncDatabasefilenames have been converted to
base36 to aid in making the manifest files & the databases as a whole
more space efficient. These changes are not backwards compatible.More work was done to clean up the databases & make them more
efficient, as well as equalize the sizes of the database files to
mitigate leaking metadata about what they might contain.Added newX25519&Ed25519classes that greatly simplify the
usage of the cryptography module’s 25519 based tools. They also help
organize the codebase better – whereRopakewas holding onto
all of the asymmetric tooling even though those tools were not part
of the Ropake protocol.New base & helperAsymmetric25519&BaseEllipticCurveclasses
were added as well to facilitate the reorganization.Many methods inRopakewere turned private to simplify & clean up
the interface so its intended use as a protocol is more clear for users.Added the time-to-live functionality toRopakedecryption functions.
The TIMEOUT attribute on the class can also be changed to impose a global time-to-live for all Ropake ciphertexts. Removed all nc_ hash functions from the package / generics.py module. The Namespace class now has a keys method so that namespaces
can be unpacked using star-star syntax.Because of the ongoing failures of gnupg, we are moving away from
signing our packages with gnupg. Our new Ed25519 keys will be from
the cryptography package, & we’ll sign those with our gnupg key as a
secondary form of attestation. Our package signing will be automated
in the setup.py file & the methods we use will be transparent in the
code. The new signatures for each package version will be placed in
the fileSIGNATURES.txt.Minor ChangesMany fixes & additions to docstrings & tutorials.Massive refactorings, cleanups & typo fixes across the library,
especially in the database classes,Ropake& theciphersmodule.Added comprehensive functional tests for the Ropake class.AddedBASE_36_TABLEto thecommonsmodule.Fixed metadata issues in setup.py that caused upload issues to pypi.Thegenerate_profile,load_profile,agenerate_profile&aload_profiledatabase methods now accept arbitrary keyword arguments
that get passed into the database’s __init__ constructor.username&passwordare now required keyword-only arguments
to theagenerate_profile_tokens&generate_profile_tokensclassmethods.Theaload&loaddatabase methods now take amanifestkwarg
that when toggledTruewill also refresh the manifest file from
disk.Now when a database object is ordered to delete itself, the entirety
of the instance’s caches & attribute values are cleared & deleted.Filled out the references to strong key generators & protocols in thekeygensmodule.Changes for version 0.15.0Major ChangesSecurity Patch: The previous update left the default salt stored by
theRopakeclass on the user filesystem as an empty string for
new files that were created since theasalt&saltfunctions
were switched to producing 256-bit values instead of 512-bits. This
bug has now been fixed.An 8 byte timestamp is now prepended to each plaintext during the
padding step. The decryption functions now take a ttl kwarg which will measure & enforce a time-to-live for ciphertexts under threat of TimeoutError.
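A sketch of the ttl check; the json_(en/de)crypt signatures shown are assumptions::

    import os
    import aiootp

    key = os.urandom(64)
    ciphertext = aiootp.json_encrypt({"msg": "hi"}, key=key)
    # raises TimeoutError if the embedded timestamp is > 120 seconds old
    plaintext = aiootp.json_decrypt(ciphertext, key=key, ttl=120)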
Added a new profile feature to the database classes. This standardizes & simplifies the process for users to open databases using only low-entropy "profile" information such as username, password, *credentials & an optional salt a user may have access to. The new agenerate_profile_tokens, generate_profile_tokens, agenerate_profile, generate_profile, aprofile_exists, profile_exists, aload_profile, load_profile, adelete_profile & delete_profile functions are the public part of this new feature.
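A sketch of opening a database from credentials alone; beyond the keyword-only username & password noted elsewhere in this changelog, the exact classmethod signatures are assumptions::

    import aiootp

    tokens = aiootp.Database.generate_profile_tokens(
        "my-application",               # arbitrary extra credential
        username="a username",
        password="a strong passphrase",
    )
    db = aiootp.Database.generate_profile(tokens)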
Some more database class attributes have been turned private to clean up the api. Fixed a typo in the __exit__ method of the Database class which referenced
a method which had its name refactored, leading to a crash.Shifted the values in theprimesdictionary such that the key for
each element in the dictionary is the exclusive maximum of each prime
in that element. Ex: primes[512][-1].to_bytes(64, “big”) is now valid.
Whereas before, primes[512] was filled with primes that were 64 bytes
and 1 bit long, making them 65 byte primes. This changes some of the
values of constants in the package & therefore some values derived
from those constants.Slimmed down the number of elements in theprimes&bitsdictionaries, reducing the size of the package a great deal.primesnow contains two primes in each element, the first is the minimum
prime of that bit length, the latter the maximum.AddedURLSAFE_TABLEto the package.Madesalt&pid&ttlkeyword only arguments in key
generators & encryption / decryption functions, further tightening up
the api.Minor ChangesAddedthis_secondfunction toasynchsmodule for integer time.Addedapadding_key,padding_key,aplaintext_stream&plaintext_streamfunctions to theciphersmodule.Addedapadding_key,padding_keyto thekeygensmodule &AsyncKeys&Keysclasses.Addedaxi_mix,xi_mix,acheck_timestamp,check_timestamp,
to thegenericsmodule.Addedacsprbg,csprbg,asalt,salt,apadding_key,padding_key,aplaintext_stream&plaintext_streamfunctions
to OneTimePad class asstaticmethod& instance methods.Addedacheck_timestamp&check_timestampfunctions to theBytesIOclass.Addedadeniable_filename&deniable_filenameto thepathsmodule.Removed check for falsey data in encryption functions. Empty data is
& should be treated as valid plaintext.Various refactorings, docstring fixes & efficiency improvements.Added some new tests for database profiles.Changes for version 0.14.0Major ChangesSecurity patch: Theapad_bytes,pad_bytes,adepad_bytes&depad_bytesfunctions were changed internally to execute in a
more constant time. The variations were small for 256-byte buffers (the default), but can grow very wide with larger buffers. The salt in the package's encryption utilities is now used to derive the plaintext's padding, making each padding unique.
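A sketch of the pad/depad round trip; the import location & keyword are assumptions, while the 256-byte multiple & keyed padding come from these notes::

    import os
    from aiootp import pad_bytes, depad_bytes  # top-level exports assumed

    key = os.urandom(64)
    plaintext = b"some message"
    padded = pad_bytes(plaintext, key=key)   # multiple of 256 bytes
    assert len(padded) % 256 == 0
    assert depad_bytes(padded, key=key) == plaintext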
Unified the types of encodings the library's encryption functions utilize for producing ciphertext. This includes databases. They now
all use theLIST_ENCODING. This greatly increases the efficiency
of the databases’ encryption/decryption, save/load times. And this
encoding is more space efficient. This change is backwards
incompatible.TheLIST_ENCODINGspecification was also changed to produce
smaller ciphertexts. The salt is no longer encrypted & included as
the first 256 byte chunk of ciphertext. It is now packaged along with
ciphertext in the clear & is restricted to being a 256-bit hex
string.The interfaces for theDatabase&AsyncDatabasewere cleaned
up. Many attributes & functions that were not intended as the public
interface of the classes were made “private”. Also, the no longer
used utilities for encrypting & decrypting under the MAP_ENCODING
were removed.Updated theabytes_xor,bytes_xor,axor&xorgenerators
to shrink the size of theseedthat’s fed into thekeystream. This
allows the one-time-pad cipher to be more cpu efficient.

Minor Changes

- Fixed various typos, docstrings & tutorials that have not kept up
with the pace of changes.Various refactorings throughout.Theakeypair&keypairfunctions now produce aNamespacepopulated with a 512-bit hex key & a 256-bit hex salt to be more
consistent with their intended use-case with the one-time-pad cipher.Removedaencode_salt,encode_salt,adecode_salt&decode_saltfunctions since they are no longer used in conjunction
with LIST_ENCODING ciphertexts.Updated tests to recognize these changes.Gave theOneTimePadclass access to aBytesIOobject under a
newioattribute.Changes for version 0.13.0Major ChangesSecurity Patch:xor&axorfunctions that define the
one-time-pad cipher had a vulnerability fixed that can leak <1-bit of
plaintext. The issue was in the way keys were built, where the
multiplicative products of two key segments were xor’d together. This
lead to keys being slightly more likely to be positive integers,
meaning the final bit had a greater than 1/2 probability of being a0. The fix is accompanied with an overhaul of the one-time-pad
cipher which is more efficient, faster, & designed with a better
understanding of the way bytes are processed & represented. The key
chunks now do not, & must not, surpass 256 bytes & neither should
any chunk of plaintext output. Making each chunk deterministically
256 bytes allows for reversibly formatting ciphertext to & from
bytes-like strings. These changes are backwards incompatible with
prior versions of this package & are strongly recommended.Addedbytes_xor&abytes_xorfunctions which take in key
generators which produce key segments of type bytes instead of hex
strings.AsyncDatabase&Databasenow save files in bytes format,
making them much more efficient on disk space. They use the newBytesIOclass in thegenericsmodule to transparently convert
to & from json & bytes. This change is also not backwards compatible.Removedacipher,cipher,adecipher,decipher,aorganize_encryption_streams,organize_encryption_streams,aorganize_decryption_streams,organize_decryption_streams,aencrypt,encrypt,adecrypt,decrypt,asubkeys&subkeysgenerators from theciphersmodule & package to slim
down the code, remove repetition & focus on the cipher tools that
include hmac authentication.Removed deprecated diffie-hellman methods inRopakeclass.Removed the staticpower10dictionary from the package.The default secret salt for theRopakeclass is now derived from the
contents of a file that’s in the databases directory which is chmod’d to
0o000 unless needed.Madeaclient_message_key,client_message_key,aserver_message_key,
&server_message_keyRopakeclass methods to help distinguish
client-to-server & server-to-client message keys which prevents replay
attacks on the one-message ROPAKE protocol.Added protocol coroutines to theRopakeclass which allow for easily
engaging in 2DH & 3DH elliptic curve exchanges for servers & clients.Efficiency improvements to theaseeder&seedergenerator functions
in therandomsmodule. This affects theacsprng&csprngobjects
& all the areas in the library that utilize those objects.Changed the repr behavior ofComprendeinstances to redact all args &
kwargs by default to protect cryptographic material from unintentionally
being displayed on user systems. The repr can display full contents by
calling theenable_debuggingmethod of theDebugControlclass.All generator functions decorated withcomprehensionare now given
arootattribute. This allows direct access to the function without
needing to instantiate or run it as aComprendeobject. This saves
a good deal of cpu & time in the overhead that would otherwise be
incurred by the class. This is specifically more helpful in tight &/or
lower-level looping.Minor ChangesVarious refactorings across the library.Fixed various typos, bugs & inaccurate docstrings throughout the library.Addchown&chmodfunctions to theasynchs.aosmodule.Now makes newmultiprocessing.Managerobjects in theasynchs.Processes&asynchs.Threadsclasses to avoid errors that occur when using a stale
object whose socket connections are closed.ChangedRopakeclass’adb_login&db_loginmethods toadatabase_login_key&database_login_key. Also, fix a crash bug in
those methods.ChangedRopakeclass’aec25519_pub,ec25519_pub,aec25519_priv&ec25519_privmethods toaec25519_public_bytes,ec25519_public_bytes,aec25519_private_bytes&ec25519_private_bytes.Added low-level private methods toRopakeclass which do derivation
& querying of the default class key & salt.Behavior changes to theainverse_int&inverse_intfunctions in thegenericsmodule to allow handling bases represented instrorbytestype strings.Behavior & name changes to theabinary_tree&binary_treefunctions in thegenericsmodule toabuild_tree&build_tree. They now allow making
uniform trees of any width & depth, limited only by the memory in a
user’s machine.Provided newacsprbg&csprbgobjects to the library that return 512-bits
of cryptographically secure pseudo-randombytestype strings. They are
made by the newabytes_seeder&bytes_seedergenerators.Thecsprng,acsprng,csprbg&acsprbgobjects were
wrapped in functions that automatically restart the generators if they’re
stalled / interrupted during a call. This keeps the package from melting
down if it can no longer call the CSPRNGs for new entropy.Cleaned up & simplifiedtable_keyfunctions in thekeygensmodule.Changed the output ofasafe_symm_keypair&safe_symm_keypairfunctions
to contain bytes values not their hex-only representation. Also removed
these functions from the main imports of the package since they are slow
& their main contribution is callingarandom_number_generator&random_number_generatorto utilize a large entropy pool when starting
CSPRNGs.Added new values to thebitsdictionary.Addedapad_bytes,pad_bytes,adepad_bytes&depad_bytesfunctions which useshake_256to pad/depad plaintext bytes to & from
multiples of 256 bytes. They take in a key to create the padding.
This method is intended to also aid in protecting against padding
oracle attacks.Changes for version 0.12.0Major ChangesThe OPAKE protocol was renamed to ROPAKE, an acronym for Ratcheting
Opaque Password Authenticated Key Exchange. This change was necessary
since OPAKE is already a name for an existing PAKE protocol. This change
also means theOpakeclass name was changed toRopake.TheRopakeclass’ registration algorithm was slightly modified to
use the generated Curve25519shared_keyan extra time in the key
derivation process. This shouldn’t break any currently authenticated
sessions.Theasyncio_contextmanagerpackage is no longer a listed dependency
insetup.py. The main file from that package was copied over into the/aiootpdirectory in order to remove the piece of code that caused
warnings to crop up when return values were retrieved from async
generators. This change will put an end to this whack-a-mole process of
trying to stop the warnings with try blocks scattered about the codebase.Addedasave_tag,save_tag,asave_file&save_filemethods
to the database classes so that specific entries can be saved to disk
without having to save the entire database which is much more costly. The
manifest file isn’t saved to disk when these methods are used, so if a
tag file isn’t already saved in the database, then the saved files will
not be present in the manifest or in the cache upon subsequent loads of
the database. The saved file will still however be saved on the
filesystem, though unbeknownst to the database instance.TheNamespaceclass now redacts all obvious key material in instance
repr’s, which is any 64+ hex character string, or any number with 64+
decimal digits.Removed the experimental recursive value retrieval withinComprende’s__aexamine_sent_exceptions&__examine_sent_exceptionsmethods.
This change leads to more reliable & faster code, in exchange for an
unnecessary feature being removed.Bug fix of theauuids&uuidsmethods by editing the code in
theasyncio_contextmanagerdependency & using the patched package
instead of thecomprehensiondecorator for thearelay&relaymethods ofComprende. Their internal algorithms was also updated to
be simpler, but are incompatible with the outputs of past versions of
these methods.Minor ChangesVarious refactorings & documentation additions / modifications throughout
the library.Various small bug fixes.The shared keys derived from theRopakeprotocol are now returned in
aNamespaceobject instead of a raw dictionary, which allows the
values to be retrieved by dotted &/or bracketed lookup.Theatest_hmac&test_hmacalgorithms / methods were made more
efficient & were refactored. Now they callatime_safe_equality&time_safe_equalityinternally, which are new methods that can apply
the non-constant time but randomized timing comparisons on any pairs of
values.Changes for version 0.11.0Major ChangesThe Opake protocol was made greatly more efficient. This was done by
replacing the diffie-hellman verifiers with a hash & xor commit & reveal
system. Most hashing was made more efficient by using the quicker & smaller sha_512 function instead of nc_512, & by streamlining the protocol. The Opake.client & Opake.client_registration methods now take
an instantiated client database instead of client credentials which
improves security, efficiency & usability. This change reduces the amount
of exposure received by user passwords & other credentials. It also
simplifies usage of the protocol by only needing to carry around a
database instead of a slew of credentials, which is also faster, since
the credentials are passed through the cpu & memory hard passcrypt function every time to open the database.

Minor Changes

- Heavy refactorings & documentation additions / modifications of the Opake class. Removed the Opake.ainit_database & Opake.init_database methods, & made the salt default argument parameter in Opake.aclient_database, Opake.client_database, Opake.adb_login & Opake.db_login into a keyword only argument so any extra user defined credentials are able to be passed without specifying a salt.
- The decorators for the Comprende.arelay & Comprende.relay methods
were changed from@asyncio_contextmanager.async_contextmanagerto@comprehension()to stop that package from raising exceptions when
we retrieve return values from async generators.Changes for version 0.10.1Major ChangesAddedProcesses&Threadsclasses toasynchs.pywhich abstract
spawning & getting return values from async & sync functions intended to
be run in threads, processes or pools of the former types. This simplifies
& adds time control to usages of processes & threads throughout the
library.Reduced the effectiveness of timing analysis of the modular exponentiation
in theOpakeclass’ verifiers by making the process return values
only after discrete intervals of time. Timing attacks on that part of the
protocol may still be viable, but should be significantly reduced.Bug fix inComprendewhich should take care of warnings raised from
theaiocontextpackage when retrieving async generator values by
raisingUserWarningwithin them.Minor ChangesHeavy refactorings of theOpakeclass.Various refactorings & cleanups around the package.Further addreturn_exceptions=Trueflag to gather calls inciphers.py.Addedis_registration&is_authenticationwhich take a client
hello message that begin theOpakeprotocol, & returnFalseif
the message is not either a registration or authentication message,
respectively, & return"Maybe"otherwise, since these functions can’t
determine without running the protocol whether or not the message is
valid.Changes for version 0.10.0Major ChangesAdded a new oblivious, one-message, password authenticated key exchange
protocol class inaiootp.ciphers.Opake. It is a first attempt at the
protocol, which works rather well, but may be changed or cleaned up in a
future update.Added thecryptographypackage as a dependency for elliptic curve
25519 diffie-hellman key exchange in theOpakeprotocol.Fix buggy data processing functions ingenerics.pymodule.Addedsilentflag toAsyncDatabase&Databasemethods, which
allows their instances to finish initializing even if a file is missing
from the filesystem, normally causing aFileNotFoundError. This makes
trouble-shooting corrupted databases easier.Added newaiootp.paths.SecurePathfunction which returns the path to
a unique directory within the database’s default directory. The name of
the returned directory is a cryptographic value used to create & open the
default database used by theOpakeclass to store the cryptographic
salt that secures the class’ client passwords. It’s highly recommended
to override this default database by instantiating the Opake class with
a custom user-defined key. The instance doesn’t need to be saved, since
all the class’ methods are either class or static methods. The__init__method only changes the class’ default database to one opened with the
user-definedkey&/ordirectorykwargs, & should really only be
done once at the beginning of an application.Minor ChangesVarious refactorings & cleanups around the package.AddedComprendeclass feature to return the values from even the
generators within an instance’s arguments. This change better returns
values to the caller from chains ofComprendegenerators.Fixedcommons.BYTES_TABLEmissing values.Addedcommons.DH_PRIME_4096_BIT_GROUP_16&commons.DH_GENERATOR_4096_BIT_GROUP_16constants for use in theOpakeprotocol’s public key verifiers.Added other values to thecommons.pymodule.Added new very large no-collision hash functions to thegenerics.pymodule used to xor with diffie-hellman public keys in theOpakeclass.Added newwait_on&await_onComprendegenerators togenerics.pywhich waits for a queue or container to be populated & yields it whenever
it isn’t empty.Changes for version 0.9.3Major ChangesSpeed & efficiency improvements in theComprendeclass &azip.Minor ChangesVarious refactorings & code cleanups.Addedapop&popComprendegenerators to the library.Switched the default character table in theato_base,to_base,afrom_base, &from_basechainable generator methods from the 62
characterASCII_ALPHANUMERICtable, to the 95 characterASCII_TABLE.Made the digits generators inrandoms.pyautomatically create a new
cryptographically secure key if a key isn’t passed by a user.Some extra data processing functions added togenerics.py.Changes for version 0.9.2Major ChangesAddedpasscrypt&apasscryptinstance methods toOneTimePad,Keys, &AsyncKeysclasses. They produce password hashes that are
not just secured by the salt & passcrypt algorithm settings, but also by their main symmetric instance keys. This makes passwords infeasible to crack without also compromising the instance's 512-bit key.
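A sketch of the instance-bound hash; the positional password & salt follow the argument checks noted below, everything else is an assumption::

    import os
    import aiootp

    pad = aiootp.OneTimePad(os.urandom(64))   # instance's main symmetric key
    salt = os.urandom(32)
    # the result depends on the password, the salt, the passcrypt
    # settings & the instance's key
    pw_hash = pad.passcrypt("an example password", salt)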
Minor Changes

- Further improvements to the random number generator in randoms.py. Made its internals less sequential, thereby raising the bar of work needed
by an attacker to successfully carry out an order prediction attack.Added checks in thePasscryptclass to make sure both a salt &
password were passed into the algorithm.SwitchedPermissionErrorexceptions inPasscrypt._validate_argstoValueErrorto be more consistent with the rest of the class.Documentation updates / fixes.Changes for version 0.9.1Minor ChangesNow any falsey values for thesaltkeyword argument in the library’skeys,akeys,bytes_keys,abytes_keys,subkeys, &asubkeysinfinite keystream generators, & other functions around the
library, will cause them to generate a new cryptographically secure
pseudo-random value for the salt. It formerly only did this whensaltwasNone.Theseeder&aseedergenerators have been updated to introduce
512 new bits of entropy fromsecrets.token_byteson every iteration
to ensure that the CSPRNG will produce secure outputs even if its
internal state is somehow discovered. This also means that simply calling the CSPRNG is enough; there's no longer a strong reason to pass new entropy into it manually, except to add even more entropy as desired. Made size the last keyword argument in encrypt & aencrypt to better mirror the signatures for the rest of the library. Added token_bits & atoken_bits functions to randoms.py which
are renamings of secrets.randbits. Refactored & improved the security of randoms.py's random number
generator.Changes for version 0.9.0Major ChangesAdded hmac codes to ciphertext for the following functions:json_encrypt,ajson_encrypt,bytes_encrypt,abytes_encrypt,Database.encrypt&AsyncDatabase.aencrypt. This change greatly
increases the security of ciphertext by ensuring it hasn’t been modified
or tampered with maliciously. One-time pad ciphertext is malleable, so
without hmac validation it can be changed to successfully allow
decryption but return the wrong plaintext. These functions are the
highest level abstractions of the library for encryption/decryption,
which made them excellent targets for this important security update.
As well, it isn’t easily possible for the library to provide hmac codes
for generators that produce ciphertext, because the end of a stream of
ciphertext isn’t known until after the results have left the scope
of library code. So users will need to produce their own hmac codes for
generator ciphertext unless we find an elegant solution to this issue.
These functions now all return dictionaries with the associated hmac
stored in the"hmac"entry. The bytes functions formerly returned
lists, now their ciphertext is available from the"ciphertext"entry.
And, all database files will have an hmac attached to them now. These
changes were designed to still be compatible with old ciphertexts but
they’ll likely be made incompatible by the v0.11.x major release.Only truthy values are now validkeykeyword arguments in the
library’skeys,akeys,bytes_keys,abytes_keys,subkeys,
&asubkeysinfinite keystream generators. Also now seeding extra entropy
intocsprng&acsprngwhensaltis falsey within them.Only truthy values are now valid forpassword&saltarguments inapasscrypt,passcrypt& their variants.Minor ChangesUpdates to documentation &README.rsttutorials.Thekb,cpu, &hardnessarguments insum_passcrypt&asum_passcryptchainable generator methods were switched to keyword
only arguments.Changes for version 0.8.1Major ChangesAddedsum_passcrypt&asum_passcryptchainable generator methods
to the Comprende class. They cumulatively apply the passcrypt algorithm to each yielded value from an underlying generator, with the passcrypt'd prior yielded result used as a salt. This makes building proofs of work, memory & space-time out of iterations of the passcrypt algorithm very simple.
inAsyncDatabase&Databaseclasses.Thelengthkeyword argument in functions around the library was
changed tosizeto be consistent across the whole package, reducing
the cognitive burden of memorizing more than one name for the same concept.Various efficiency boosts.Edits toREADME.rst.Addedencode_salt,aencode_salt,decode_salt&adecode_saltfunctions to the library, which give access to the procedure used to
encrypt & decrypt the random salt which is often the first element
produced in one-time pad ciphertexts.Added cryptographically secure pseudo-random values as default keys in
encryption functions to safeguard against users accidentally encrypting
data without specifying a key. This way, such mistakes will produce
ciphertext with an unrecoverable key, instead of without a key at all.Changes for version 0.8.0Major ChangesFixtest_hmac,atest_hmacfunctions in the keys & database
classes. The new non-constant-time algorithm needs a random salt to be
added before doing the secondary hmac to prevent some potential exotic
forms of chosen plaintext/ciphertext attacks on the algorithm. The last
version of the algorithm should not be used.TheKeys&AsyncKeysinterfaces were overhauled to remove the
persistence of instance salts. They were intended to be updated by users
with thereset&aresetmethods, but that cannot be guaranteed
easily through the class, so it is an inappropriate interface since
reusing salts for encryption is completely insecure. The instances do
still maintain state of their main encryption key, & new stateful methods
for key generation, likemnemonic&table_key, have been added.
Thestate&astatemethods have been removed.GaveOneTimePadinstances new stateful methods from theciphers.pymodule &keygens.pykeys classes. Its instances now remember the main
symmetric key behind thekeyproperty & automatically pass it as a
keyword argument to the methods inOneTimePad.instance_methods.Minor ChangesUpdateCHANGES.rstfile with the updates that were not logged for
v0.7.1.BYTES_TABLEwas turned into a list so that the byte characters can
be retrieved instead of their ordinal numbers.Changes for version 0.7.1Major ChangesFix a mistake in the signatures ofpasscrypt&apasscrypt. The argskb,cpu&hardnesswere changed into keyword only arguments
to mitigate user mistakes, but the internal calls to those functions were
still using positional function calls, which broke the api. This issue
is now fixed.Changes for version 0.7.0Major ChangesReplaced usage of barerandommodule functions with usage of an
instance ofrandom.Randomto keep from messing with user’s settings
on that module.Finalized the algorithm for thepasscrypt&apasscryptfunctions.
The algorithm is now provably memory & cpu hard with a wide security
margin with adequate settings. The algorithm isn't likely to change with
upcoming versions unless a major flaw is found.The default value for thecpuargument inpasscrypt&apasscryptis now3& now directly determines how many hash iterations are done
for each element in the memory cache. This provides much more
responsiveness to users & increases the capacity to impact resource cost
with less tinkering.Switched theAsyncKeys.atest_hmac&Keys.test_hmacmethods to a
scheme which is not constant time, but which instead does not leak useful
information. It does this by not comparing the hmacs of the data, but of
a pair of secondary hmacs. The timing analysis itself is now dependent
on knowledge of the key, since any conclusions of such an analysis would
be unable to correlate its findings with any supplied hmac without it.Addedtest_hmac&atest_hmacto the database classes, & changed
their hmac algorithm fromsha3_512tosha3_256.Minor ChangesVarious code cleanups, refactorings & speedups.Several fixes to inaccurate documentation.Several fixes to inaccurate function signatures.Addedmnemonic&amnemonickey generators tokeygens.pywith
a wordlist 2048 entries long. A custom wordlist can also be passed in.Minor changes inComprendeto track down a bug in the functions that
use the asyncio_contextmanager package. It causes a warning when asking
async generators to return (not yield) values.Some refactoring ofrandom_number_generator&arandom_number_generator.Changes for version 0.6.0Major ChangesReplaced the usage ofos.urandomwithin the package withsecrets.token_bytesto be more reliable across platforms.Replaced several usages ofrandom.randrangewithinrandoms.pyto
calls tosecrets.token_byteswhich is faster & more secure. It
now also seeds therandommodule periodically prior to usage.Changed the internal cache sorting algorithm ofpasscrypt&apasscryptfunctions. The key function passed tolist.sort(key=key)now not only updates thehashlib.sha3_512proof object with
each element in the cache, but with its own current output. This change
is incompatible with previous versions of the functions. The key function
is also trimmed down of unnecessary value checking.The default value for thecpuargument inpasscrypt&apasscryptis now40_000. This is right at the edge of when the argument begins
impacting the cpu work needed to compute the password hash when thekbargument is the default of1024.Switched theAsyncKeys.atest_hmac&Keys.test_hmacmethods to a
constant time algorithm.Minor ChangesVarious code cleanups, refactorings & speedups.Added aconcurrent.futures.ThreadPoolExecutorinstance to theasynchsmodule for easily spinning off threads. It’s available underasynchs.thread_pool.Addedsort&asortchainable generator method to theComprendeclass. They support sorting by akeysorting function as well.Changed the name ofasynchs.executor_wrappertoasynchs.wrap_in_executor.Changed the name ofrandoms.non0_digit_stream,randoms.anon0_digit_stream,randoms.digit_stream&randoms.adigit_streamtorandoms.non_0_digits,randoms.anon_0_digits,randoms.digits&randoms.adigits.Several fixes to inaccurate documentation.apasscrypt&Passcrypt.anewnow use the synchronous version of the
algorithm internally because it’s faster & it doesn’t change the
parallelization properties of the function since it’s already run
automatically in another process.Addedshuffle,ashuffle,unshuffle, &aunshufflefunctions
torandoms.pythat reorder sequences pseudo-randomly based on theirkey&saltkeyword arguments.Fixed bugs inAsyncKeys&debuggers.py.Addeddebugger&adebuggerchainable generator methods to theComprendeclass which benchmarks & inspects running generators with
an inline syntax.Changes for version 0.5.1Major ChangesFixed a bug in the methodsauuids&uuidsof the database classes
that assigned to a variable within a closure that was nonlocal but which
wasn’t declared non-local. This caused an error which made the methods
unusable.Addedpasscrypt&apasscryptfunctions which are designed to be
tunably memory & cpu hard password-based key derivation function. It was
inspired by the scrypt protocol but internally uses the library’s tools.
It is a first attempt at the protocol, it’s internal details will likely
change in future updates.Addedbytes_keys&abytes_keysgenerators, which are just like
the library’skeysgenerator, except they yield the concatenatedsha3_512.digestinstead of thesha3_512.hexdigest.Added new chainable generator methods to theComprendeclass for
processing bytes, integers, & hex strings into one another.Minor ChangesVarious code cleanups.New tests added to the test suite forpasscrypt&apasscrypt.TheComprendeclass’alist&listmethods can now be passed
a boolean argument to return either amutablelist directly from the
lru_cache, or a copy of the cached list. This list is used by the
generator itself to yield its values, so wily magic can be done on the
list to mutate the underlying generator’s results.Changes for version 0.5.0Major ChangesAdded interfaces inDatabase&AsyncDatabaseto handle encrypting
& decrypting streams (Comprendegenerators) instead of just raw json
data. They’re methods calledencrypt_stream,decrypt_stream,aencrypt_stream, &adecrypt_stream.Changed the attribute_METATAGused byDatabase&AsyncDatabaseto name the metatags entry in the database. This name is smaller, cleaner
& is used to prevent naming collisions between user entered values & the
metadata the classes need to organize themselves internally. This change
will break databases from older versions keeping them from accessing their
metatag child databases.Added the methodsauuids&uuidstoAsyncDatabase&Databasewhich return coroutines that accept potentially sensitive identifiers &
turn them into saltedsizelength hashes distinguished by asalt& acategory.Minor ChangesVarious code & logic cleanups / speedups.Refactorings of theDatabase&AsyncDatabaseclasses.Various inaccurate docstrings fixed.Changes for version 0.4.0Major ChangesFixed bug inaiootp.abytes_encryptfunction which inaccurately called
a synchronousComprendeend-point method on the underlying async
generator, causing an exception and failure to function.Changed the procedures inakeys&keysthat generate their internal
key derivation functions. They’re now slightly faster to initialize &
more theoretically secure since each internal state is fed by a seed
which isn’t returned to the user. This encryption algorithm change is
incompatible with the encryption algorithms of past versions.Minor ChangesVarious code cleanups.Various inaccurate docstrings fixed.Keyword arguments inKeys().test_hmac&AsyncKeys().atest_hmachad their order switched to be slightly more friendly to use.Added documentation toREADME.rston the inner workings of the
one-time-pad algorithm’s implementation.MadeCompende.arandom_sleep&Compende.random_sleepchainable
generator methods.Changed theCompende.adelimit_resize&Compende.delimit_resizealgorithms to not yield inbetween two joined delimiters in a sequence
being resized.Changes for version 0.3.1Minor ChangesFixed bug where a static method inAsyncDatabase&Databasewas
wrongly labelled a class method causing a failure to initialize.Changes for version 0.3.0Major ChangesTheAsyncDatabase&Databasenow use the more secureafilename&filenamemethods to derive the hashmap name and encryption streams
from a user-defined tag internal to theiraencrypt/adecrypt/encrypt/decryptmethods, as well as, prior to them getting called.
This will break past versions of databases’ ability to open their files.The package now has built-in functions for using the one-time-pad
algorithm to encrypt & decrypt binary data instead of just strings
or integers. They are available inaiootp.abytes_encrypt,aiootp.abytes_decrypt,aiootp.bytes_encrypt&aiootp.bytes_decrypt.TheComprendeclass now has generators that do encryption & decryption
of binary data as well. They are available from anyComprendegenerator
by theabytes_encrypt,abytes_decrypt,bytes_encrypt&bytes_decryptchainable method calls.Minor ChangesFixed typos and inaccuracies in various docstrings.Added a__ui_coordination.pymodule to handle inserting functionality
from higher-level to lower-level modules and classes.Various code clean ups and redundancy eliminations.AsyncKeys&Keysclasses now only update theirself.saltkey
by default when theirareset&resetmethods are called. This
aligns more closely with their intended use.Addedarandom_sleep&random_sleepchainable methods to theComprendeclass which yields outputs of generators after a random
sleep for each iteration.Added several other chainable methods to theComprendeclass for
string & bytes data processing. They’re viewable inComprende.lazy_generators.Added new, initial tests to the test suite.Changes for version 0.2.0Major ChangesAdded ephemeral salts to theAsyncDatabase&Databasefile
encryption procedures. This is a major security fix, as re-encryption
of files with the same tag in a database with the same open key would
use the same streams of key material each time, breaking encryption if
two different versions of a tag file’s ciphertext stored to disk were
available to an adversary. The database methodsencrypt,decrypt,aencrypt&adecryptwill now produce and decipher true one-time
pad ciphertext with these ephemeral salts.Theaiootp.subkeys&aiootp.asubkeysgenerators were revamped
to use thekeys&akeysgenerators internally instead of using
their own, slower algorithm.AsyncDatabasefile deletion is now asynchronous by running thebuiltins.os.removefunction in an async thread executor. The
decorator which does the magic is available ataiootp.asynchs.executor_wrapper.Minor ChangesFix typos in__root_salt&__aroot_saltdocstrings. Also replaced
thehash(self)argument for theirlru_cache&alru_cachewith a secure hmac instead.addgi_frame,gi_running,gi_code,gi_yieldfrom,ag_frame,ag_running,ag_code&ag_awaitproperties toComprendeclass to mirror async/sync generators more closely.Removeajson_encrypt,ajson_decrypt,json_encrypt,json_decryptfunctions’ internal creation of dicts to contain the
plaintext. It was unnecessary & therefore wasteful.Fix docstrings inOneTimePadmethods mentioningparentkwarg which
is a reference to deleted, refactored code.Fix incorrect docstrings in databasesnamestream&anamestreammethods.AddedASYNC_GEN_THROWNconstant toComprendeclass to try to stop
an infrequent & difficult to debugRuntimeErrorwhen async generators
do not stop after receiving anathrow.Database tags are now fully loaded when they're copied using the methodsinto_namespace&ainto_namespace.Updated inaccurate docstrings inmap_encrypt,amap_encrypt,map_decrypt&amap_decryptOneTimePadmethods.Addedacustomize_parametersasync function toaiootp.genericsmodule.Various code clean ups.Changes for version 0.1.0Minor ChangesInitial version.Major ChangesInitial version.Known IssuesThe test suite for this software is under construction, & what tests
have been published are currently inadequate to the needs of
cryptography software.This package is currently in beta testing & active development,
meaning major changes are still possible when there are really good
reasons to do so. Contributions are welcome. Send us a message if
you spot a bug or security vulnerability:[email protected]@riseup.neted25519-key: 70d1740f2a439da98243c43a4d7ef1cf993b87a75f3bb0851ae79de675af5b3bx25519-key: 4457276dbcae91cc5b69f1aed4384b9eb6f933343bb44d9ed8a80e2ce438a450 |
aiooui | aioouiSource Code:https://github.com/bluetooth-devices/aioouiAsync OUI lookupsInstallationInstall this via pip (or your favourite package manager):pip install aioouiUsageStart by importing it:importaioouiContributors ✨Thanks goes to these wonderful people (emoji key):This project follows theall-contributorsspecification. Contributions of any kind welcome!CreditsThis package was created withCopierand thebrowniebroke/pypackage-templateproject template. |
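The usage section above stops at the import, so here is a minimal lookup sketch. The async_load() and get_vendor() names are assumptions inferred from the project's description of async OUI lookups, not confirmed API; check the project docs before relying on them:
import asyncio
import aiooui

async def main():
    # Load the OUI registry into memory first (assumed API).
    await aiooui.async_load()
    # Map an OUI / MAC prefix to its vendor name (assumed API).
    print(aiooui.get_vendor("00:1A:22"))

asyncio.run(main())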
aioouimeaux | # aioouimeauxOpen source control for Belkin WeMo devices* Free software: BSD license* Documentation: Soon at http://aioouimeaux.rtfd.org.## Features* Supports WeMo Switch, Light Switch, Insight Switch and Motion* Python API to interact with device at a low level using asyncio## About this libraryBased on a repository that can be found here: https://github.com/syphoxy/ouimeaux.gitThe original repository can be found here: https://github.com/iancmcc/ouimeauxThe library was modified to make use of asyncio.It has been forked here since it is a significant change. It has been renamed toclearly indicate the difference.## Installation```$ sudo pip3 install aioouimeaux```If you want to use a virtual environment```$ sudo pip3 install virtualenv$ mkdir ouimeaux-env$ virtualenv ouimeaux-env$ source ouimeaux-env/bin/activate$ cd ouimeaux-env$ pip3 install git+https://github.com/frawau/ouimeaux.git```At this point you should be able to use**Note:** Ensure that the `pip` and `virtualenv` command you use belongs to aPython 3 installation. On some systems, there are multiple versions of Pythoninstalled.You can try:```python3 -m aioouimeaux```and see something like:```Hit "Enter" to startUse Ctrl-C to quitMotion Motion status is now OffSwitch Test Switch 3 status is now OffSwitch Test Switch 1 status is now OnSwitch Test Switch 2 status is now OnMotion Motion status is now OffSelect Device:[1] Motion[2] Test Switch 1[3] Test Switch 2[4] Test Switch 3Your choice:2Select Function for Test Switch 1:[1] Power (0 or 1)[2] Get Home Id[3] Get MAC Address[4] Get Device Id[5] Get Serial Number[6] Explain[7] Function X (e.g. basicevent.GetHomeInfo see 'explain')[0] Back to device selection```## TroubleshootingOpen an issue and I'll try to help.aioouimeauxRelease 0.1.0b1 (Nov 20, 2017)+++++++++++++++++++++++++++++- Modified the code to use asyncio- suppressed use of gevent- suppressed use of requests- suppressed the signal framework (was using thread)- suppressed REST server and client application- uses aiohttp and aiohttp_wsgi- Renamed Environment to WeMoouimeaux==========History-------Release 0.8.0 (July 30, 2016)+++++++++++++++++++++++++++++- Randomize subscription ports to enable simultaneous ouimeaux scripts (thanks @bennytheshap)- Fix for WeMo LED Light support (thanks @sstangle73)- #32: Removed address cache, broke server out into optional feature- Fix for Maker state reporting (thanks @pavoni)- Filter by SSDP location, fixing case where multiple devices respond from the same IP (thanks @szakharchenko)- Fix Maker event handlers, which were being passed as bridges (thanks @maxlazarov)- Work around gevent-socketio bug by explicitly casting header value as string- Fix for inconsistent Light state (thanks @canduuk)- StateChange signals are now a separate class and do not fire if value is unchanged (thanks @esecules)- Python 3 support (thanks to @drock371)Release 0.7.9 (March 17, 2015)++++++++++++++++++++++++++++++- Command line support for WeMo LED Light (thanks @fritz-fritz)- Command line support for WeMo Maker (thanks @logjames)- Support for 2.0.0 firmware (thanks @fritz-fritz)- Bug fixesRelease 0.7.3 (August 10, 2014)++++++++++++++++++++++++++++++++- Fixed #18: Error when run as root- Fixed #26: Evict devices from cache when unreachable- Fixed #29: GetPower stopped working for Insight devices- Fixed #31: Add blink method on switches, include in REST API- Fixed #33, #37: Handle invalid devices without dying- Fixed #35: Require requests >= 2.3.0- Fixed #40: Retry requests in the event of failure- 
Fixed #47: Don't choke on invalid newlines in XML returned by switches(thanks to @fingon)Release 0.7.2 (January 28, 2014)++++++++++++++++++++++++++++++++- Fix a bug with using query parameters on /api/deviceRelease 0.7 (January 27, 2014)++++++++++++++++++++++++++++++- Added REST API- Added Web appRelease 0.6 (January 25, 2014)++++++++++++++++++++++++++++++++- Added signals framework- Fixed #16, #19, #22: Defensively resubscribe to events when device responds with an error- Fixed #15: Signals framework includes relevant device when sending signal- Refactored structure, added Sphinx docsRelease 0.5.3 (January 25, 2014)++++++++++++++++++++++++++++++++- Fixed #20: Allow timeout in environment.wait()- Fixed #21: Add Insight supportRelease 0.5.2 (November 23, 2013)+++++++++++++++++++++++++++++++++- Fixed #14: Indicate Connection:close header to avoid logging when WeMo sendsinvalid HTTP response.Release 0.5.1 (November 9, 2013)++++++++++++++++++++++++++++++++- Fixed #10: Updated subscriber listener to use more reliable method ofretrieving non-loopback IP address; updated docs to fix typo in listenerregistration example (thanks to @benhoyle, @francxk)- Fixed #11: Remove instancemethod objects before attempting to pickle devicesin the cache (thanks @piperde, @JonPenner, @tomtomau, @masilu77)Release 0.5 (October 14, 2013)+++++++++++++++++++++++++++++++- Added fuzzy matching of device name when searching/toggling from command line- Added ``status`` mode to print status for all devices- Added ``switch status`` mode to print status for specific device- Added flags for all command-line options- Fixed #9: Removed unused fcntl import that precluded Windows usage (thanks to@deepseven)Release 0.4.3 (August 31, 2013)+++++++++++++++++++++++++++++++- Used new method of obtaining local IP for discovery that is less likely toreturn loopback- Exit with failure and instructions for solution if loopback IP is used- Updated installation docs to include python-dev and pip instructions (patchby @fnaard)- Fixed README inclusion bug that occasionally broke installation via pip.- Added ``--debug`` option to enable debug logging to stdoutRelease 0.4 (August 17, 2013)+++++++++++++++++++++++++++++- Fixed #7: Added support for light switch devices (patch by nschrenk).- Fixed #6: Added "wemo clear" command to clear the device cache.Release 0.3 (May 25, 2013)++++++++++++++++++++++++++- Fixed #4: Added ability to specify ip:port for discovery server binding. Removeddocumentation describing need to disable SSDP service on Windows.- Fixed #5: Added device cache for faster results.- Added configuration file.- Added ability to configure aliases for devices to avoid quoting strings onthe command line.- Added 'toggle' command to command line switch control.Release 0.2 (April 21, 2013)++++++++++++++++++++++++++++++- Fixed #1: Added ability to subscribe to motion and switch state change events.- Added Windows installation details to README (patch by @brianpeiris)- Cleaned up UDP server lifecycle so rediscovery doesn't try to start it back up.Release 0.1 (February 2, 2013)++++++++++++++++++++++++++++++- Initial release.* First release on PyPI. |
aio-overpass | A client for theOverpass API, a read-only API that serves up custom selected
parts ofOpenStreetMapdata.The Overpass API is optimized for data consumers that need a few elements within
a glimpse or up to roughly 10 million elements in some minutes, both selected by
search criteria like location, type of objects, tag properties, proximity, or
combinations of them. To make use of it, you should familiarize yourself withOverpass QL, the query language used to select the elements that you want.ContentsFeaturesGetting StartedChoosing ExtrasBasic UsageExampleCoordinatesSee alsoAn overview of modules, classes and functions can be found in theAPI referenceThere are some notebooks to check out inexamples/The version history is available inCHANGELOG.mdDevelopers can find some instructions inCONTRIBUTING.mdThe Overpass APIrepository,
itsblog,
itsuser's manualand itsrelease notesOverpass Turboto prototype your queries in your browserFeaturesAsynchronous requests usingaiohttpParallel queries within rate limitsFault tolerance through a (customizable) retry strategyExtensionsTyped elements that simplify browsing result setsShapelygeometries for manipulation and analysisGeoJSONexportsSimplified querying and processing of public transportation routesDesign GoalsA small and stable set of core functionality.Good defaults for queries and retrying.Room for extensions that simplify querying and/or processing of spatial data
in specific problem domains.Sensible and spec-compliant GeoJSON exports for all objects that represent spatial features.Detailed documentation that supplements learning about OSM and the Overpass API.Getting Startedpip install aio-overpass
pip install aio-overpass[shapely,networkx,joblib]poetry add aio-overpass
poetry add aio-overpass[shapely,networkx,joblib]Choosing ExtrasThis library can be installed with a number of optional extras.Install no extras, if you're fine withdictresult sets.Install theshapelyextra, if you would like the convenience of typed OSM elements.
It is also useful if you are interested in elements' geometries,
and either already use Shapely, or want a simple way to exportGeoJSON.This includes theptmodule to make it easier to interact with public transportation routes.
Something seemingly trivial like listing the stops of a route can have unexpected pitfalls,
since stops can have multiple route members, and may have a range of different tags and roles.
This submodule will clean up the relation data for you.Install thenetworkxextra to enable thept_orderedmodule, if you want a route's path as a
simple line from A to B. It is hard to do this consistently, mainly because ways are not always
ordered, and stop positions might be missing. You can benefit from this submodule if you wish torender a route's path between any two stopsmeasure the route's travelled distance between any two stopsvalidate the order of ways in the relationcheck if the route relation has gapsInstall thejoblibextra to speed uppt_ordered.collect_ordered_routes(), which can benefit
greatly from parallelization.Basic UsageThere are three basic steps to fetch the spatial data you need:Formulate a queryEither write your own custom query, f.e.Query("node(5369192667); out;"),or use one of theQuerysubclasses, f.e.SingleRouteQuery(relation_id=1643324).Call the Overpass APIPrepare your client withclient = Client(user_agent=...).Useawait client.run_query(query)to fetch the result set.Collect resultsEither access the raw result dictionaries withquery.result_set,or use a collector, f.e.collect_elements(query)to get a list of typedElements.Collectors are often specific to queries -collect_routesrequires aRouteQuery,
for instance.ExampleResults as Dictionariesfromaio_overpassimportClient,Queryquery=Query("way(24981342); out geom;")client=Client()awaitclient.run_query(query)query.result_set[{"type":"way","id":24981342,# ..."tags":{"addr:city":"Hamburg","addr:country":"DE","addr:housename":"Elbphilharmonie",# ...},}]Results as Objectsfromaio_overpass.elementimportcollect_elementselems=collect_elements(query)elems[0].tags{"addr:city":"Hamburg","addr:country":"DE","addr:housename":"Elbphilharmonie",# ...}Results as GeoJSONimportjsonjson.dumps(elems[0].geojson,indent=4){"type":"Feature","geometry":{"type":"Polygon","coordinates":[[[9.9832434,53.5415472],...]]},"properties":{"id":24981342,"type":"way","tags":{"addr:city":"Hamburg","addr:country":"DE","addr:housename":"Elbphilharmonie",...},...},"bbox":[9.9832434,53.540877,9.984967453.5416212,]}CoordinatesGeographic point locations are expressed by latitude (lat) and longitude (lon) coordinates.Latitude is given as an angle that ranges from –90° at the south pole to 90° at the north pole,
with 0° at the Equator.Longitude is given as an angle ranging from 0° at the Prime Meridian (the line that divides the
globe into Eastern and Western hemispheres), to +180° eastward and −180° westward.lat/lonvalues arefloatsthat are exactly those degrees, just without the ° sign.This might help you remember which coordinate is which:If you think of a world map, usually it’s a rectangle.Thelongside (the largest side) is the longitude.Longitude is the x-axis, and latitude is the y-axis.Be wary of coordinate order:The Overpass API explicitly names the coordinates:{ "lat": 50.2726005, "lon": 10.9521885 }Shapely geometries returned by this library uselat/lonorder, which is the order
stated byISO 6709, and seems like the most common order.GeoJSON, on the other hand, useslon/latorder.OpenStreetMap uses theWGS84spatial reference system
used by the Global Positioning System (GPS).OpenStreetMap node coordinates have seven decimal places, which gives them centimetric precision.
However, the position accuracy of GPS data is onlyabout 10m.
A reasonable display accuracy could be five places, which is precise to1.1 metresat the equator.Spatial features that cross the 180th meridian areproblematic,
since you go from longitude180.0to-180.0.
Such features usually have their geometries split up, like thearea of Russia. |
aioowm | aioowm - Easy Async library for working with OpenWeatherMapWhat you need to use the library:PyCharmOpenWeatherMap TokenInstallation:pip install aioowm
pip install -U https://github.com/vladislavkovalskyi/aioowm/archive/master.zipLinks:Example:fromasyncioimportrunfromaioowmimportOWMweather=OWM(token="OpenWeatherMap Token",language="ru")asyncdefapp():result=awaitweather.get("Saint Petersburg")print(f"Город:{result.city.name}({result.city.country})\n"f"Температура:{result.weather.temperature.now}°C\n"f"Описание:{result.weather.description}\n"f"Скорость ветра:{result.weather.wind.speed}м/с")run(app())Output:Город: Санкт-Петербург (RU)
Температура: 2.12°C
Описание: Облачно
Скорость ветра: 1.34 м/с |
aiop4 | aiop4asyncio P4Runtime Python clientaiop4aiop4is anasyncioclient forP4Runtime. Breaking changes will likely happen until v1 is released.How to installpoetry add aiop4orpip install aiop4ExamplesL2 learning switch |
aiopagination | aiopaginationAboutaiopaginationis a library written using the aiogram library to help you create pagination using inline buttonsInfo:A sample to usefromaiogramimportexecutorfromaiogramimporttypesfromaiopagination.test.data.loaderimportdpfromaiopagination.widgets.aiokeyboardsimportbase_cd,pagination_cdfromaiopagination.widgets.aiopaginationimportPaginationsample_list=[(1,"Apple",'red'),(2,"Cucumber","green"),(3,"Melon","yellow"),(4,"Cherry","red"),(5,"Watermelon","green"),(6,"Banana","yellow"),(7,"Carrot","orange"),(8,"Kiwi","green"),(9,"Malina","red"),(10,"Apelsin","yellow"),(11,"Lemon","yellow"),(12,"Grape","black"),(13,"Carrot","red"),(14,"Potato","yellow"),(15,"Potato","yellow"),(16,"Banana","yellow"),(17,"Carrot","orange"),(18,"Kiwi","green"),(19,"Malina","red"),(20,"Apelsin","yellow"),(21,"Lemon","yellow"),(22,"Grape","black"),(23,"Carrot","red"),(24,"Potato","yellow")]# [email protected]_handler(commands=["start"])asyncdefbot_start(message:types.Message):pagination=Pagination(sample_list)awaitpagination.start_message(message=message)# select item and send to [email protected]_query_handler(base_cd.filter())asyncdefget_item_id(call:types.CallbackQuery,callback_data:dict):awaitcall.answer(cache_time=1)item_id=callback_data.get("item_id")pag=Pagination(sample_list)get_data=awaitpag.select_item(item_id=int(item_id))awaitcall.message.answer(get_data[1],parse_mode="HTML")# pagination [email protected]_query_handler(pagination_cd.filter())asyncdefshow_pagination(call:types.CallbackQuery,callback_data:dict):start=int(callback_data.get("start"))end=int(callback_data.get("end"))max_pages=int(callback_data.get("max_pages"))action=callback_data.get("action")pagination=Pagination(items=sample_list)ifaction=="prev":awaitpagination.prev(call=call,start=start,end=end,max_pages=max_pages)elifaction=="next":awaitpagination.next(call=call,start=start,end=end,max_pages=max_pages)else:awaitcall.answer(cache_time=1)awaitcall.message.edit_reply_markup()awaitcall.message.edit_text("Menga matn yuboring")if__name__=='__main__':executor.start_polling(dp,skip_updates=True) |
aiopaperscroll | No description available on PyPI. |
aio_parallel_tools | version: 0.0.1status: devauthor: huangsizheemail:[email protected] for creating asynchronous scripts easily.keywords: tools,asyncioFeatureTask poolActor and Actor ManagerExampleasyncwithAioFixedTaskPoolSimple()astask_pool:print(f"test pool size{task_pool.size}")print("test 4 task with pool size 3")awaitasyncio.gather(task_pool.submit(test,func_args=["c"]),task_pool.submit(test,func_args=["b"]),task_pool.submit(test,func_args=["a"]),task_pool.submit(test,func_args=["d"]))classPinger(AioActor):asyncdefreceive(self,message):print(message)try:awaitActorManager.get_actor("Ponger").Send('ping')exceptExceptionase:print(f"receive run error{e}")finally:awaitasyncio.sleep(0.5)Installpython-mpip install aio_parallel_toolsDocumenthttps://python-tools.github.io/aio_parallel_tools/Change Logversion 0.0.1created this project |
aioparrot | Asyncio based project to control Parrot drones.
Compatible devices: AR Drone 1 and 2. |
aiopath | 📁 Asyncpathlibfor Pythonaiopathis a complete implementation of Python'spathlibthat's compatible withasyncio,trio, and theasync/awaitsyntax.All I/O performed byaiopathis asynchronous andawaitable.Check out📂 app_pathsfor an example of library that usesaiopath, as well as thepycleanscript here.Use caseIf you're writing asynchronous Python code and want to take advantage ofpathlib's conveniences, but don't want to mix blocking andnon-blocking I/O, then you can reach foraiopath.For example, if you're writing an asyncweb scrapingscript, you might want to make several concurrent requests to websites and save the responses to persistent storage:fromasyncioimportrun,gatherfromaiohttpimportClientSessionfromaiopathimportAsyncPathasyncdefsave_page(url:str,name:str):path=AsyncPath(name)ifawaitpath.exists():returnasyncwithClientSession()assession,session.get(url)asresponse:content:bytes=awaitresponse.read()awaitpath.write_bytes(content)asyncdefmain():urls=['https://example.com','https://github.com/alexdelorenzo/aiopath','https://alexdelorenzo.dev','https://dupebot.firstbyte.dev']tasks=(save_page(url,f'{index}.html')forindex,urlinenumerate(urls))awaitgather(*tasks)run(main())If you usedpathlibinstead ofaiopath, tasks accessing the disk would block the event loop, and async tasks accessing the network would suspend until the event loop was unblocked.By usingaiopath, the script can access the network and disk concurrently.Implementationaiopathis a direct reimplementation ofCPython'spathlib.pyand shares some of its code.aiopath's class hierarchydirectly matches the one frompathlib, wherePathinherits fromPurePath,AsyncPathinherits fromAsyncPurePath, and so on.Withaiopath, methods that perform I/O are asynchronous and awaitable, and methods that perform I/O and return iterators inpathlibnow returnasync generators.aiopathgoes one step further, and wrapsos.scandir()andDirEntryto makeAsyncPath.glob()completely async.aiopathis typed with Pythontype annotations, and if using theaiofileback end, it takes advantage oflibaiofor async I/O on Linux.Usageaiopath's API directly matchespathlib, so check out the standard library documentation forPurePathandPath.Running examplesTo run the following examples with top-levelawaitexpressions,launch an asynchronous Python REPLusingpython3 -m asyncioor anIPython shell.You'll also need to installasynctempfilevia PyPI, like sopython3 -m pip install asynctempfile.ReplacingpathlibAll ofpathlib.Path's methods that perform synchronous I/O are reimplemented as asynchronous methods.PurePathmethods are not asynchronous because they don't perform I/O.frompathlibimportPathfromasynctempfileimportNamedTemporaryFilefromaiopathimportAsyncPathasyncwithNamedTemporaryFile()astemp:path=Path(temp.name)apath=AsyncPath(temp.name)# check existence## syncassertpath.exists()## asyncassertawaitapath.exists()# check if file## syncassertpath.is_file()## asyncassertawaitapath.is_file()# touchpath.touch()awaitapath.touch()# PurePath methods are not asyncassertpath.is_absolute()==apath.is_absolute()assertpath.as_uri()==apath.as_uri()# read and write texttext:str='example'awaitapath.write_text(text)assertawaitapath.read_text()==textassertnotpath.exists()assertnotawaitapath.exists()You can convertpathlib.Pathobjects toaiopath.AsyncPathobjects, and vice versa:frompathlibimportPathfromaiopathimportAsyncPathhome:Path=Path.home()ahome:AsyncPath=AsyncPath(home)path:Path=Path(ahome)assertisinstance(home,Path)assertisinstance(ahome,AsyncPath)assertisinstance(path,Path)# AsyncPath and Path objects can point to 
the same fileassertstr(home)==str(ahome)==str(path)# AsyncPath and Path objects are equivalentasserthome==ahomeAsyncPathis a subclass ofPathandPurePath, and a subclass ofAsyncPurePath:frompathlibimportPath,PurePathfromaiopathimportAsyncPath,AsyncPurePathassertissubclass(AsyncPath,Path)assertissubclass(AsyncPath,PurePath)assertissubclass(AsyncPath,AsyncPurePath)assertissubclass(AsyncPurePath,PurePath)path:AsyncPath=awaitAsyncPath.home()assertisinstance(path,Path)assertisinstance(path,PurePath)assertisinstance(path,AsyncPurePath)Check out the test files in thetestsdirectoryfor more examples of howaiopathcompares topathlib.Opening a fileYou can get an asynchronousfile-like object handleby usingasynchronous context managers.AsyncPath.open()'s async context manager yields ananyio.AsyncFileobject.fromasynctempfileimportNamedTemporaryFilefromaiopathimportAsyncPathtext:str='example'# you can access a file with async context managersasyncwithNamedTemporaryFile()astemp:path=AsyncPath(temp.name)asyncwithpath.open(mode='w')asfile:awaitfile.write(text)asyncwithpath.open(mode='r')asfile:result:str=awaitfile.read()assertresult==text# or you can use the read/write convenience methodsasyncwithNamedTemporaryFile()astemp:path=AsyncPath(temp.name)awaitpath.write_text(text)result:str=awaitpath.read_text()assertresult==textcontent:bytes=text.encode()awaitpath.write_bytes(content)result:bytes=awaitpath.read_bytes()assertresult==contentGlobbingaiopathimplementspathlibglobbingusing async I/O and async generators.fromaiopathimportAsyncPathhome:AsyncPath=awaitAsyncPath.home()asyncforpathinhome.glob('*'):assertisinstance(path,AsyncPath)print(path)downloads:AsyncPath=home/'Downloads'ifawaitdownloads.exists():# this might take a whilepaths:list[AsyncPath]=\[pathasyncforpathindownloads.glob('**/*')]InstallationDependenciesA POSIX compliant OS, or WindowsPython 3.7+requirements.txtPyPI$python3-mpipinstallaiopathPython 3.9 and olderaiopathfor Python 3.9 and older is available on PyPI under versions0.5.xand lower.Python 3.10 and neweraiopathfor Python 3.10 and newer is available on PyPI under versions0.6.xand higher.GitHubDownload a release archive for your Python version fromthe releases page.Then to install, run:$python3-mpipinstall-rrequirements.txt
$ python3 setup.py installPython 3.9 and olderaiopathfor Python 3.9 and older is developed on thePython-3.9branch.Python 3.10 and neweraiopathfor Python 3.10 and newer is developed on thePython-3.10branch.SupportWant to support this project andother open-source projectslike it?LicenseSeeLICENSE. If you'd like to use this project with a different license, please get in touch.CreditSeeCREDIT.md. |
aiopathlib | aiopathlib: Pathlib support for asyncioaiopathlibis written in Python, for handling local
disk files in asyncio applications.Based onaiofilesand just like pathlib, but use await.withopen('filename','w')asfp:fp.write('My file contents')text=awaitaiopathlib.AsyncPath('filename').read_text()print(text)'My file contents'content=awaitaiopathlib.AsyncPath(Path('filename')).read_bytes()print(content)b'My file contents'Asynchronous interface to create a folder.fromaiopathlibimportAsyncPathapath=AsyncPath('dirname/subpath')ifnotawaitapath.exists():awaitapath.mkdir(parents=True)Featuresa file API very similar to Python's standard packagepathlib, blocking APIsupport for buffered and unbuffered binary files, and buffered text filessupport forasync/await(:PEP:492) constructsInstallationTo install aiopathlib, simply:$ pip install aiopathlibUsageThese functions are awaitableread_textread_bytesread_jsonwrite_textwrite_byteswrite_jsonmkdirtouchexistsrenameunlinkrmdirremovestatlstatis_fileis_diris_symlinkis_fifois_mountis_block_deviceis_char_deviceis_socketExampleSome common use cases:
from aiopathlib import AsyncPath
filename = 'test.json'
ap = AsyncPath(filename)
p = Path(filename)
assert (await ap.exists()) == p.exists() == False
await ap.touch() # Create a empty file
assert (await ap.is_file()) == p.is_file() == True
assert (await ap.is_dir()) == p.is_dir() == False
assert (await ap.is_symlink()) == p.is_symlink() == False
for func in ('is_fifo', 'is_mount', 'is_block_device', 'is_char_device', 'is_socket'):
assert (await getattr(ap, func)()) == getattr(p, func)()
d = {'key': 'value'}
await ap.write_json(d) # == p.write_text(json.dumps(d))
assert (await ap.read_json()) == d # == json.loads(p.read_text())
assert (await ap.read_bytes()) == p.read_bytes() # b'{"key": "value"}'
assert (await ap.stat()) == p.stat()
assert (await ap.lstat()) == p.lstat()
ap = await ap.rename('test_dir') # == AsyncPath(p.rename('test_dir'))
await ap.remove() # == await ap.unlink() == p.unlink()
await ap.mkdir() # == p.mkdir()
# Synchronization functions
[Path(i) for i in ap.glob('*')] == list(p.glob('*'))
[Path(i) for i in ap.rglob('*')] == list(p.rglob('*'))
ap / 'filename' == ap.joinpath('filename') == AsyncPath(f'{ap}/filename')
str(AsyncPath('string-or-Path-or-AsyncPath')) == str(Path('string-or-Path-or-AsyncPath'))
ap.stem == p.stem
ap.suffix == p.suffix
Path(ap.with_name('xxx')) == p.with_name('xxx')
Path(ap.parent) == p.parent
Path(ap.resolve()) == p.resolve()
...History0.3.1 (2022-02-20)Return content size after write local fileUpgrade dependencies0.3.0 (2021-12-16)Support Python3.7Cleardev_requirements.txtto be only package name and version0.2.3 (2021-10-16)Maketouchpass test for py39.Remove support for pypy3 from docs.0.2.2 (2021-09-20)Maketouch/stat/is_file/... be awaitable.Usesuper().__new__for initial.0.2.0 (2021-08-29)MakeAsyncPathbe subclass ofpathlib.Path.Add github action to show test coverage.0.1.3 (2021-08-28)Add makefile.Test all functions.Fix rename method error.Support sync pathlib methods.0.1.0 (2021-06-14)Introduced a changelog.Publish at gitee.ContributingContributions are very welcome. |
aiopathy | AioPathy: an asynchronous Path interface for local and cloud bucket storageAioPathy is a python package (with type annotations) for working with Cloud Bucket storage providers using a pathlib interface. It provides an easy-to-use API bundled with a CLI app for basic file operations between local files and remote buckets. It enables a smooth developer experience by letting developers work against the local file system during development and only switch over to live APIs for deployment. It also makes converting bucket blobs into local files a snap with optional local file caching.🚀 QuickstartYou can installpathyfrom pip:pipinstallpathyThe package exports thePathyclass and utilities for configuring the bucket storage provider to use.frompathyimportPathy,use_fs# Use the local file-system for quicker developmentuse_fs()# Create a bucketPathy("gs://my_bucket").mkdir(exist_ok=True)# An excellent blobgreeting=Pathy(f"gs://my_bucket/greeting.txt")# But it doesn't exist yetassertnotgreeting.exists()# Create it by writing some textgreeting.write_text("Hello World!")# Now it existsassertgreeting.exists()# Delete itgreeting.unlink()# Now it doesn'tassertnotgreeting.exists()Supported CloudsThe table below details the supported cloud provider APIs.Cloud ServiceSupportInstall ExtrasGoogle Cloud Storage✅pip install pathy[gcs]Amazon S3✅pip install pathy[s3]Azure✅pip install pathy[azure]Google Cloud StorageGoogle recommends using a JSON credentials file, which you can specify by path:fromgoogle.oauth2importservice_accountfrompathyimportset_client_paramscredentials=service_account.Credentials.from_service_account_file("./my-creds.json")set_client_params("gs",credentials=credentials)Amazon S3S3 uses a JSON credentials file, which you can specify by path:frompathyimportset_client_paramsset_client_params("s3",key_id="YOUR_ACCESS_KEY_ID",key_secret="YOUR_ACCESS_SECRET")AzureAzure blob storage can be passed aconnection_string:frompathyimportset_client_paramsset_client_params("azure",connection_string="YOUR_CONNECTION_STRING")or aBlobServiceClientinstance:fromazure.storage.blobimportBlobServiceClientfrompathyimportset_client_paramsservice:BlobServiceClient=BlobServiceClient.from_connection_string("YOUR_CONNECTION_STRING")set_client_params("azure",service=service)Semantic VersioningBefore Pathy reaches v1.0 the project is not guaranteed to have a consistent API, which means that types and classes may move around or be removed. That said, we try to be predictable when it comes to breaking changes, so the project uses semantic versioning to help users avoid breakage.Specifically, new releases increase thepatchsemver component for new features and fixes, and theminorcomponent when there are breaking changes. 
If you don't know much about semver strings, they're usually formatted{major}.{minor}.{patch}so increasing thepatchcomponent means incrementing the last number.Consider a few examples:From VersionTo VersionChanges are Breaking0.2.00.2.1No0.3.20.3.6No0.3.10.3.17No0.2.20.3.0YesIf you are concerned about breaking changes, you can pin the version in your requirements so that it does not go beyond the current semverminorcomponent, for example if the current version was0.1.37:pathy>=0.1.37,<0.2.0🎛 APIPathyclassPathy(self,args,kwargs)Subclass ofpathlib.Paththat works with bucket APIs.existsmethodPathy.exists(self)->boolReturns True if the path points to an existing bucket, blob, or prefix.fluidclassmethodPathy.fluid(path_candidate:Union[str,Pathy,BasePath],)->Union[Pathy,BasePath]Infer either a Pathy or pathlib.Path from an input path or string.The returned type is a union of the potentialFluidPathtypes and will
type-check correctly against the minimum overlapping APIs of all the input
types.If you need to use specific implementation details of a type, "narrow" the
return of this function to the desired type, e.g.frompathyimportFluidPath,Pathyfluid_path:FluidPath=Pathy.fluid("gs://my_bucket/foo.txt")# Narrow the type to a specific classassertisinstance(fluid_path,Pathy),"must be Pathy"# Use a member specific to that classassertfluid_path.prefix=="foo.txt/"from_bucketclassmethodPathy.from_bucket(bucket_name:str,scheme:str='gs')->'Pathy'Initialize a Pathy from a bucket name. This helper adds a trailing slash and
the appropriate prefix.frompathyimportPathyassertstr(Pathy.from_bucket("one"))=="gs://one/"assertstr(Pathy.from_bucket("two"))=="gs://two/"globmethodPathy.glob(self:'Pathy',pattern:str,)->Generator[Pathy,NoneType,NoneType]Perform a glob match relative to this Pathy instance, yielding all matched
blobs.is_dirmethodPathy.is_dir(self:'Pathy')->boolDetermine if the path points to a bucket or a prefix of a given blob
in the bucket.Returns True if the path points to a bucket or a blob prefix.
Returns False if it points to a blob or the path doesn't exist.is_filemethodPathy.is_file(self:'Pathy')->boolDetermine if the path points to a blob in the bucket.Returns True if the path points to a blob.
Returns False if it points to a bucket or blob prefix, or if the path doesn’t
exist.iterdirmethodPathy.iterdir(self:'Pathy',)->Generator[Pathy,NoneType,NoneType]Iterate over the blobs found in the given bucket or blob prefix path.lsmethodPathy.ls(self:'Pathy')->Generator[BlobStat,NoneType,NoneType]List blob names with stat information under the given path.This is considerably faster than using iterdir if you also need
the stat information for the enumerated blobs.Yields BlobStat objects for each found blob.mkdirmethodPathy.mkdir(self,mode:int=511,parents:bool=False,exist_ok:bool=False,)->NoneCreate a bucket from the given path. Since bucket APIs only have implicit
folder structures (determined by the existence of a blob with an overlapping
prefix) this does nothing other than create buckets.If parents is False, the bucket will only be created if the path points to
exactly the bucket and nothing else. If parents is true the bucket will be
created even if the path points to a specific blob.The mode param is ignored.Raises FileExistsError if exist_ok is false and the bucket already exists.openmethodPathy.open(self:'Pathy',mode:str='r',buffering:int=8192,encoding:Optional[str]=None,errors:Optional[str]=None,newline:Optional[str]=None,)->IO[Any]Open the given blob for streaming. This delegates to thesmart_openlibrary that handles large file streaming for a number of bucket API
providers.ownermethodPathy.owner(self:'Pathy')->Optional[str]Returns the name of the user that owns the bucket or blob
this path points to. Returns None if the owner is unknown or
not supported by the bucket API provider.renamemethodPathy.rename(self:'Pathy',target:Union[str,pathlib.PurePath])->'Pathy'Rename this path to the given target.If the target exists and is a file, it will be replaced silently if the user
has permission.If path is a blob prefix, it will replace all the blobs with the same prefix
to match the target prefix.replacemethodPathy.replace(self:'Pathy',target:Union[str,pathlib.PurePath])->'Pathy'Renames this path to the given target.If target points to an existing path, it will be replaced.resolvemethodPathy.resolve(self,strict:bool=False)->'Pathy'Resolve the given path to remove any relative path specifiers.frompathyimportPathypath=Pathy("gs://my_bucket/folder/../blob")assertpath.resolve()==Pathy("gs://my_bucket/blob")rglobmethodPathy.rglob(self:'Pathy',pattern:str,)->Generator[Pathy,NoneType,NoneType]Perform a recursive glob match relative to this Pathy instance, yielding
all matched blobs. Imagine adding "**/" before a call to glob.rmdirmethodPathy.rmdir(self:'Pathy')->NoneRemoves this bucket or blob prefix. It must be empty.samefilemethodPathy.samefile(self:'Pathy',other_path:Union[str,bytes,int,pathlib.Path],)->boolDetermine if this path points to the same location as other_path.statmethodPathy.stat(self:'Pathy')->pathy.BlobStatReturns information about this bucket path.to_localclassmethodPathy.to_local(blob_path:Union[Pathy,str],recurse:bool=True,)->pathlib.PathDownload and cache either a blob or a set of blobs matching a prefix.The cache is sensitive to the file updated time, and downloads new blobs
as their updated timestamps change.touchmethodPathy.touch(self:'Pathy',mode:int=438,exist_ok:bool=True)->NoneCreate a blob at this path.If the blob already exists, the function succeeds if exist_ok is true
(and its modification time is updated to the current time), otherwise
FileExistsError is raised.BlobStatdataclassBlobStat(self,name:str,size:Optional[int],last_modified:Optional[int],)->NoneStat for a bucket itemuse_fsfunctionuse_fs(root:Optional[str,pathlib.Path,bool]=None,)->Optional[pathy.BucketClientFS]Use a path in the local file-system to store blobs and buckets.This is useful for development and testing situations, and for embedded
applications.get_fs_clientfunctionget_fs_client()->Optional[pathy.BucketClientFS]Get the file-system client (or None)use_fs_cachefunctionuse_fs_cache(root:Optional[str,pathlib.Path,bool]=None,)->Optional[pathlib.Path]Use a path in the local file-system to cache blobs and buckets.This is useful for when you want to avoid fetching large blobs multiple
times, or need to pass a local file path to a third-party library.get_fs_cachefunctionget_fs_cache()->Optional[pathlib.Path]Get the folder that holds file-system cached blobs and timestamps.set_client_paramsfunctionset_client_params(scheme:str,kwargs:Any)->NoneSpecify args to pass when instantiating a service-specific Client
object. This allows for passing credentials in whatever way your underlying
client library prefers.CLIPathy command line interface. (v0.5.2)Usage:$[OPTIONS]COMMAND[ARGS]...Options:--install-completion: Install completion for the current shell.--show-completion: Show completion for the current shell, to copy it or customize the installation.--help: Show this message and exit.Commands:cp: Copy a blob or folder of blobs from one...ls: List the blobs that exist at a given...mv: Move a blob or folder of blobs from one path...rm: Remove a blob or folder of blobs from a given...cpCopy a blob or folder of blobs from one bucket to another.Usage:$cp[OPTIONS]FROM_LOCATIONTO_LOCATIONArguments:FROM_LOCATION: [required]TO_LOCATION: [required]Options:--help: Show this message and exit.lsList the blobs that exist at a given location.Usage:$ls[OPTIONS]LOCATIONArguments:LOCATION: [required]Options:-l, --long: Print long style entries with updated time and size shown. [default: False]--help: Show this message and exit.mvMove a blob or folder of blobs from one path to another.Usage:$mv[OPTIONS]FROM_LOCATIONTO_LOCATIONArguments:FROM_LOCATION: [required]TO_LOCATION: [required]Options:--help: Show this message and exit.rmRemove a blob or folder of blobs from a given location.Usage:$rm[OPTIONS]LOCATIONArguments:LOCATION: [required]Options:-r, --recursive: Recursively remove files and folders. [default: False]-v, --verbose: Print removed files and folders. [default: False]--help: Show this message and exit.CreditsPathy is originally based on theS3Pathproject, which provides a Path interface for S3 buckets. |
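As a usage note for the caching helpers documented above: combining use_fs_cache() with Pathy.to_local() yields a plain pathlib.Path for code that only accepts local files. A minimal sketch (the bucket and blob names are hypothetical, and use_fs() keeps the whole demo on the local file-system):
import pathlib
from pathy import Pathy, use_fs, use_fs_cache

use_fs()        # back buckets with the local file-system for this demo
use_fs_cache()  # cache blobs fetched via to_local()

Pathy("gs://my_bucket").mkdir(exist_ok=True)
blob = Pathy("gs://my_bucket/data/report.txt")  # hypothetical blob
blob.write_text("hello")

local: pathlib.Path = Pathy.to_local(blob)  # downloads once, then reuses the cache
print(local.read_text())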
aiopay | An asynchronous library for the Payme API!!!Getting startedInstallation:$ pip install -U aiopayExample:fromasyncioimportget_event_loopfrompayme_uz.cardsimportPaymeSubscribeCardasyncdefmain():card_api=PaymeSubscribeCard(paycom_id='paycom_id',debug=True)# debug: True - test mode, False - production modedata=awaitcard_api.create(number='860006******6311',expire='0399',save=True)print(data)awaitcard_api.close()if__name__=='__main__':get_event_loop().run_until_complete(main())Result:{
"jsonrpc": "2.0",
"result": {
"card": {
"number": "860006******6311",
"expire": "03/99",
"token": "6308******5xUj",
"recurrent": true,
"verify": false,
"type": "22618"
}
}
} |
aiopayAPI | aiopayAPIAn asynchronous API for working with Payok.ioImportant linksPyPiDocumentationGitHubFeaturesGetting the balanceGetting transactionsCreating payouts (transfers)Payout methodsGetting payoutsCreating a payment linkPossible errors |
aiopaybear | paybear.io savvy.io async clientInstallpip install aiopaybear |
aiopaykassa | aiopaykassapaykassa.pro Api Python Async LibraryWrapper for Paykassa.pro API and SCI methodsAPI example:fromaiopaykassa.clientsimportPayKassaApiapi=PayKassaApi(api_id=<your_api_id>,api_key=<your_api_key>,shop=<your_merchant_id>)# test_mode=True for testingasyncdefprint_bitcoin_btc_balance():balance=awaitapi.get_shop_balance()print(balance.bitcoin_btc)defmain():loop=asyncio.get_event_loop()loop.run_until_complete(print_bitcoin_btc_balance())SCI example:importdecimalfromaiopaykassa.clientsimportPayKassaScifromaiopaykassa.typesimportNewOrdersci=PayKassaSci(sci_id=<your_merchant_id>,sci_key=<your_merchant_password>)# test_mode=True for testingasyncdefcreate_order_btc(order_id:int,amount:decimal.Decimal,comment:str)->NewOrder:new_order=awaitsci.create_order(order_id=order_id,amount=amount,currency=Currency.BTC,system=System.BITCOIN,comment=comment)returnnew_orderdefmain():loop=asyncio.get_event_loop()loop.run_until_complete(create_order_btc(1,decimal.Decimal(0.00001),"test")) |
aiopayok | AIOPayokpayok.io asynchronous python wrapperimportasynciofromaiopayokimportPayokpayok=Payok("API_ID","API_KEY")asyncdefmain()->None:print(awaitpayok.get_balance())asyncio.run(main())Installingpip install aiopayokResourcesCheck out the docs athttps://payok.io/cabinet/documentation/doc_main.phpto learn more about PayOk. |
aiopaypal | AiopaypalAsync Wrapper for Paypal's REST APISetup ⚙️$ pip install aiopaypalDependenciesaiohttpaiofilespyopensslUsageInitfromaiopaypalimportPaypalaiopaypal=Paypal(mode='live',client_id='client_id',client_secret='client_secret',)Create a user subscription1. Create a payment experience (Optional) (Do only once):payment_experience=awaitaiopaypal.post(url='/v1/payment-experience/web-profiles',json={'name':'Payment profile name','presentation':{'logo_image':'https://brand-logo.png','brand_name':'Brand Name'},'flow_config':{'landing_page_type':'Billing','user_action':'commit','return_uri_http_method':'GET'},'input_fields':{'no_shipping':1,# No shipping address (digital goods)},'temporary':False})2. Create a billing plan (Where you specify the details of your plan) (Do only once):billing_plan=awaitaiopaypal.post(url='/v1/payments/billing-plans',json={"name":'Name of the plan',"description":'Description of the plan',"type":"INFINITE","payment_definitions":[{"name":'Name of the payment',"cycles":"0","frequency":"MONTH","frequency_interval":"1","type":"REGULAR","amount":{"value":str(123),"currency":'usd'},}],"merchant_preferences":{"setup_fee":{"value":str(123),"currency":'usd'},"auto_bill_amount":"yes",# Default "NO",'return_url':'https://example.com/payment/success-callback','cancel_url':'https://example.com/payment/cancel-callback',"initial_fail_amount_action":"cancel",# Default CONTINUE"max_fail_attempts":"3","auto_bill_amount":"YES",}})3. Create webhooks to listen for subscription events (Do only once):hook_profile=awaitaiopaypal.post(url='/v1/notifications/webhooks',json={'url':'https://example.com/webhook/','event_types':[{'name':'BILLING.SUBSCRIPTION.CANCELLED'},{'name':'BILLING.SUBSCRIPTION.SUSPENDED'},{'name':'BILLING.SUBSCRIPTION.RE-ACTIVATED'},]})4. Create a billing agreement (Where you bind a user to the billing plan created at "2.") and execute it:asyncdefcreate_agreement():returnawaitaiopaypal.post(url='',json={'name':'Agreement name','description':'Agreement Description','start_date':(datetime.datetime.utcnow()+\datetime.timedelta(days=1)).isoformat()[:-7]+'Z',# The start date must be no less than 24 hours after the current date as the agreement can take up to 24 hours to activate.'plan':{'id':billing_plan['id']},'payer':{'payment_method':'paypal','payer_info':{'email':'[email protected]'}}})defget_execute_from_response(response):forlinkinresponse['links']:iflink['rel']=='execute':returnlink['href']4.1 Create an agreement:@app.route('/create-agreement')asyncdefcreate_agreement_view():billing_agreement=awaitcreate_agreement()returnmake_user_open(get_execute_from_response(billing_agreement))4.2 Activate on success:# Second step (user callback)@app.route('/success-callback',methods=['GET'])asyncdeffinalize_agreement(request):token=request.args.get('token')user_id=request['session']['user_id']active_agreement=awaitaiopaypal.post('/v1/payments/billing-agreements/{}/agreement-execute'.format(token),extra_headers={'Content-Type':'application/json'})ifactive_agreement['state'].lower()!='active'and\active_agreement['state'].lower()!='pending':pass# handle a failed payment hereelse:awaitstore_user_agreement_id(user_id,active_agreement['id'])activate_premium_product(user_id)return_to_user('Payment{}'.format(active_agreement['state']))5. 
Listen to agreement changes:@app.route('/webhook',methods=['POST','GET'])asyncdefhook(request):try:awaitaiopaypal.verify_from_headers(webhook_id=webhook['id'],# webhook response from "3."event_body=request.body.decode(),headers=headers)exceptPaypalErrorase:logger.exception(e)returnelse:event=request.jsonevent_type=event.get('event_type')agreement_id=event['resource']['id']ifevent_type=='BILLING.SUBSCRIPTION.SUSPENDED':logger.info('Billing agreement{}suspended'.format(agreement_id))awaitsuspend_agreement_by_agreement_id(agreement_id)elifevent_type=='BILLING.SUBSCRIPTION.CANCELLED':logger.info('Billing agreement{}cancelled'.format(agreement_id))awaitcancel_agreement_by_id(agreement_id)elifevent_type=='BILLING.SUBSCRIPTION.RE-ACTIVATED':logger.info('Agreement with ID:{}REACTIVATED'.format(agreement_id))awaitreactivate_agreement_by_id(agreement_id)elifevent_type=='PAYMENT.SALE.PENDING'or\event_type=='PAYMENT.ORDER.CREATED'or\event_type=='BILLING.SUBSCRIPTION.CREATED':logger.info('Payment/Subscription Created')else:logger.critical('Got unexpected event type{}'.format(event['resource']['id']))finally:# must return 200, else Paypal won't stop sendingreturnresponse.text('OK')Create a user payment... Figured it out? Help others and make a pull request :)Contact 📧I currently work as a freelance software devloper. Like my work and got a gig for me?Want to hire me fulltime? Send me an email @[email protected] me a coffee ☕Bitcoin:3NmywNKr1Lzo8gyNXFUnzvboziACpEa31zEthereum:0x1E1400C31Cd813685FE0f6D29E0F91c1Da4675aEBitcoin Cash:qqzn7rsav6hr3zqcp4829s48hvsvjat4zq7j42wkxdLitecoin:MB5M3cE3jE4E8NwGCWoFjLvGqjDqPyyEJpPaypal:https://paypal.me/omarryhan |
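For the unfinished "Create a user payment" section above, here is a hedged sketch of a one-off payment reusing the aiopaypal.post() helper shown earlier. It is not from the aiopaypal docs: the endpoint and payload follow PayPal's v1 REST payments API, and the amounts/URLs are placeholders.

# Hypothetical example — endpoint/payload per PayPal's v1 payments API
payment = await aiopaypal.post(
    url='/v1/payments/payment',
    json={
        'intent': 'sale',
        'payer': {'payment_method': 'paypal'},
        'transactions': [{
            'amount': {'total': '10.00', 'currency': 'USD'},
            'description': 'One-off purchase',
        }],
        'redirect_urls': {
            'return_url': 'https://example.com/payment/success-callback',
            'cancel_url': 'https://example.com/payment/cancel-callback',
        },
    },
)

# Redirect the user to the approval link, as with agreements above
for link in payment['links']:
    if link['rel'] == 'approval_url':
        approval_url = link['href']

After the user approves, the payment would be executed with a POST to /v1/payments/payment/{payment_id}/execute, mirroring the agreement-execute step in 4.2. |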
aiopaystack | aiopaystackAsynchronous Python library forPaystackInstallationpipinstallaiopaystackUsageAdd your paystack secret key as an environment variable as PAY_STACK_SECRET_KEYfrompaystackimportTransactionstrans=Transactions()# All parameters must be passed in as keywords. For both required and optional arguments.res=awaittrans.initialize(email="[email protected]",amount='5000')# Passing secret key as an argument# This replaces any key set in the environmentfrompaystackimportPaystackpaystack=Paystack(secret_key="paystack_secret_key")# to use one session for multiple request use the class as a context managerasyncwithTransactions()astrans:res=awaittrans.verify(reference="ref")# The response type for every request is a typed dict.fromtypingimportTypedDict,AnyResponse=TypedDict('Response',{'status_code':int,'status':bool,'message':str,'data':dict|Any})# Sample response{'status':True,'message':'Authorization URL created','data':{'authorization_url':'https://checkout.paystack.com/3521i62zf1i0ljl','access_code':'3521i62zf1i0ljl','reference':'2q16btxglw'},'status_code':200}## DOC Reference: <https://developers.paystack.co/v2.0/reference>### Static UseDon't forget to get your API key fromPaystackand assign to the variablePAYSTACK_SECRET_KEYPlease reference thedocsfolder for usage, |
aiopb | Async Pub/Sub. Free software: MIT license. Documentation: https://aiopb.readthedocs.io. Features: TODO. Credits: This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template. History: 0.1.0 (2019-02-26) — First release on PyPI. |
aiopcli | aiopcli
Installation
pip install aiopcli
Examples
# register a standalone container image
# the default api server port is 8080
docker save my-image | gzip > my-image.tgz
aiopcli add server --name=my-image --api-server=my-image.tgz \
--api-port=80 --liveness-endpoint=/health --readiness-endpoint=/health
# "servableId"が出力されます。例えば、 `s-z15uerp3mehdxg33`
aiopcli create --tag=test --servable=s-z15uerp3mehdxg33 --server-plan=basic
# if env id 321 is returned, then
aiopcli status test
aiopcli status 321
# both return the same result
# if the default endpoint accepts multipart/form-data
aiopcli predict test [email protected]
# if the servable accepts application/json
aiopcli predict 321 -d'{"inputs": {"array": [0, 4, 8]}}'
# call a different endpoint
aiopcli predict 321 --endpoint=receipt/read [email protected]
# list every env registered from this machine
aiopcli status
# delete an env
aiopcli delete test
# register a container image bundled with an inference server (tritonserver, TF Serving, etc.)
docker save api-server | gzip > api-server.tgz
docker save inference-server | gzip > inference-server.tgz
aiopcli add server --name=my-image \
--api-server=api-server.tgz \
--inference-server=inference-server.tgz \
--api-port=8000 \
--metrics-port=8002 \  # triton only
--liveness-endpoint=/health \
--readiness-endpoint=/health/ready
Usage guide
Create with the basic settings
aiopcli create --servable=<servable-id> --server-plan=<basic,standard,gpu_basic>
Create with a tag
aiopcli create --tag=<tag> --servable=<servable-id> --server-plan=<basic,standard,gpu_basic>
Delete an environment
aiopcli delete <tag or env_id>
Check the status of a single env
aiopcli status <tag or env_id>
Check the status of all envs created with the cli
aiopcli status
Run a command with a profile (host / API key) configured
aiopcli --profile=<profile> <command> <arg> ...
# or
aiopcli -p<profile> <command> <arg> ...
Run with host and apikey overridden
aiopcli --host=<host> --apikey=<apikey> <command> <arg> ...
# or
aiopcli --host=<host> -k<apikey> <command> <arg> ...
Register a custom docker image
# apiserver
aiopcli add server \
--name=single-custom \
--api-server=container.tgz \
--api-port=8000 \
--liveness-endpoint=/health \
--readiness-endpoint=/health
# apiserver & inferenceserver
aiopcli add server --name=double-custom --api-server=apiserver.tgz --inference-server=tritonserver.tgz
Configuration
Available environment variables
AIOP_CONFIG=~/.aiop
AIOP_LOG_LEVEL=INFO
AIOP_PROFILE=stg
AIOP_HOST=https://aiops.inside.ai
AIOP_APIKEY=ICMxJ0Z4PTtvbHE/ITd8Njk4RCgjcy5TL0E3b0YwRj83R2hXKTl8WFAiaGdpSU55fH0kd0IsOCJSZ1AwaUJuPVhWdFJvO1B0O09OQDtsOkVtPydKOnRaIUcqIm8ibFghWitiKTlxUVsqQWkkPG9lJFNbNyNrJzRoNTZzaTF7P2djMy9zKTg4JHZNMVEpQlBIayYkQTtRR2luOEIsXj1iO0JzRyJAdzBaVn1HbWNcc0k5X0JUO0tLeC1vdnRnNTVxLEJfbEEmR1lZNl97ZSZALl9FNnxDYSh+Q09WYHxDPEBqeWYhM1BUbDR5YEw0aCh3UlM6TnAxPmMhXzNnZ3YoYQ==
Config file
The format is toml.
Default location: ~/.aiop; can be set via the AIOP_CONFIG environment variable.

# default profile
default="stg"

# default apikey
apikey="ICMxJ0Z4PTtvbHE/ITd8Njk4RCgjcy5TL0E3b0YwRj83R2hXKTl8WFAiaGdpSU55fH0kd0IsOCJSZ1AwaUJuPVhWdFJvO1B0O09OQDtsOkVtPydKOnRaIUcqIm8ibFghWitiKTlxUVsqQWkkPG9lJFNbNyNrJzRoNTZzaTF7P2djMy9zKTg4JHZNMVEpQlBIayYkQTtRR2luOEIsXj1iO0JzRyJAdzBaVn1HbWNcc0k5X0JUO0tLeC1vdnRnNTVxLEJfbEEmR1lZNl97ZSZALl9FNnxDYSh+Q09WYHxDPEBqeWYhM1BUbDR5YEw0aCh3UlM6TnAxPmMhXzNnZ3YoYQ=="

# profiles
[profiles.stg]
log_level="INFO"

[profiles.prod]
apikey="QDd+VC55cy1tLV4rQXo2bSZ1OXsgOnx0UzUwbDpEUEQ/UXc3cihvPmtBWHBTWj1LT1w+RXY/aCksbCthVUZGdFUzd2d6e1IrRi5zUycxKlp9YFxEdjE0PXNAXEtGVyZhOC14WWtcXXcoWls6OScxJmlkTSwrTDttc0ouIzhFLEZGJ3xFJWhpI3lpeV1iJ24nSjsyICcgRzxEIi95cGF0eU96TmheaWcobEk+RVxGX01ZYz9jfk9cbThIRyUpaXpLdDklJCR5eTVjYzwyb3F6J2pqJEZbckViNG16PHQkK3xqdUtBSjpRY1UoYiQ1MHBHLitYazUzKD52aVddXzYsbA==" |
aio-pcvector | No description available on PyPI. |
aiopdhttp | No description available on PyPI. |
aiopeewee | AioPeewee — an asyncio interface for peewee, modeled after torpeewee.
Implemented database adapters:
[x] aiomysql
[ ] aiopg
[ ] sqlite
Currently 125 test cases have been ported from peewee — not all of them yet, but the number is constantly increasing.
Simple atomic operations (transactions) are also supported, but not well tested.
Install
pip install aiopeewee
Usage

from aiopeewee import AioModel, AioMySQLDatabase
from peewee import CharField, TextField, DateTimeField
from peewee import ForeignKeyField, PrimaryKeyField

db = AioMySQLDatabase('test', host='127.0.0.1', port=3306,
                      user='root', password='')


class User(AioModel):
    username = CharField()

    class Meta:
        database = db


class Blog(AioModel):
    user = ForeignKeyField(User)
    title = CharField(max_length=25)
    content = TextField(default='')
    pub_date = DateTimeField(null=True)
    pk = PrimaryKeyField()

    class Meta:
        database = db


# create connection pool
await db.connect(loop)

# count
await User.select().count()

# async iteration on select query
async for user in User.select():
    print(user)

# fetch all records as a list from a query in one pass
users = await User.select()

# insert
user = await User.create(username='kszucs')

# modify
user.username = 'krisztian'
await user.save()

# async iteration on blog set
[b.title async for b in user.blog_set.order_by(Blog.title)]

# close connection pool
await db.close()

# see more in the tests

ManyToMany
Note that AioManyToManyField must be used instead of ManyToManyField.

from aiopeewee import AioManyToManyField


class User(AioModel):
    username = CharField(unique=True)

    class Meta:
        database = db


class Note(AioModel):
    text = TextField()
    users = AioManyToManyField(User)

    class Meta:
        database = db


NoteUserThrough = Note.users.get_through_model()

async for user in note.users:
    # do something with the users

Currently the only limitation I'm aware of: immediate setting of an instance relation must be replaced with a method call:

# original, which is not supported
charlie.notes = [n2, n3]

# use instead
await charlie.notes.set([n2, n3])

Serializing
Converting to dict requires the asyncified version of model_to_dict:

from aiopeewee import model_to_dict

serialized = await model_to_dict(user) |
aio-peewee | DeprecationWarningThe package is deprecated.Please usepeewee-aioinstead.—aio-peewee– Peewee support for async frameworks (Asyncio,Trio,Curio)The library doesn’t make peewee work async, but allows you to use Peewee with
your asyncio-based libraries correctly.
Features
Task safety. The library keeps track of the connection state using Task-local
storage, making the Peewee Database object safe to use with multiple tasks
inside a loop.Async management of connections for Peewee Connections PoolContentsDeprecationWarningFeaturesRequirementsInstallationQuickStartUsageInitializationAsync ConnectConnection PoolingDatabase URLASGI MiddlewareCurioBug trackerContributingLicenseRequirementspython >= 3.8Installationaio-peeweeshould be installed using pip:pip install aio-peeweeQuickStartfromaiopeeweeimportdb_urldb=db_url.connect('postgres+async://locahost:5432/database')asyncdefmain(id=1):asyncwithdb:item=Model.get(Model.id==1)returnitem.nameUsageInitializationfromaiopeeweeimportPostgresqlDatabaseAsync,SqliteDatabaseAsync,MySQLDatabaseAsync,CockroachDatabaseAsyncdb=PostgresqlDatabaseAsync('my_app',user='app',password='db_password',host='10.1.0.8',port=3306)Async Connect# Manualasyncdefmain():awaitdb.connect_async()# ...awaitdb.close_async()# Context managerasyncdefmain():asyncwithdb:# ...Connection PoolingfromaiopeeweeimportPooledPostgresqlDatabaseAsync,PooledSqliteDatabaseAsync,PooledMySQLDatabaseAsync,PooledCockroachDatabaseAsyncdb=PooledPostgresqlDatabaseAsync('my_database',max_connections=8,stale_timeout=300,user='postgres')Database URLfromaiopeeweeimportdb_urldb0=db_url.connect('cockroachdb+async://localhost/db',**db_params)db1=db_url.connect('cockroachdb+pool+async://localhost/db',**db_params)db2=db_url.connect('mysql+async://localhost/db',**db_params)db3=db_url.connect('mysql+pool+async://localhost/db',**db_params)db4=db_url.connect('postgres+async://localhost/db',**db_params)db5=db_url.connect('postgres+pool+async://localhost/db',**db_params)db6=db_url.connect('sqlite+async://localhost/db',**db_params)db7=db_url.connect('sqlite+pool+async://localhost/db',**db_params)db8=db_url.connect('sqliteexc+async://localhost/db',**db_params)db9=db_url.connect('sqliteexc+pool+async://localhost/db',**db_params)ASGI Middlewareimportdatetimeasdtfromasgi_toolsimportAppfromaiopeeweeimportPeeweeASGIPluginimportpeeweeaspwdb=PeeweeASGIPlugin(url='sqlite+async:///db.sqlite')@db.registerclassVisit(pw.Model):created=pw.DateTimeField(default=dt.datetime.utcnow())address=pw.CharField()db.create_tables()app=App()@app.route('/')asyncdefvisits_json(request):"""Store the visit and load latest 10 visits."""Visit.create(address=request.client[0])return[{'id':v.id,'address':v.address,'timestamp':round(v.created.timestamp()),}forvinVisit.select().order_by(Visit.id.desc()).limit(10)]app=db.middleware(app)Curioaio-peeweeusescontextvarsto store db connections. So you have to
enablecontextvarsfor Curio:https://curio.readthedocs.io/en/latest/howto.html#how-do-you-use-contextvarsBug trackerIf you have any suggestions, bug reports or
annoyances please report them to the issue tracker
athttps://github.com/klen/aio-peewee/issuesContributingDevelopment of the project happens at:https://github.com/klen/aio-peeweeLicenseLicensed under aMIT license. |
aiopegelonline | aiopegelonline — asynchronous library to retrieve data from PEGELONLINE.
:warning: this is in an early development state :warning: breaking changes may occur at any time
Requirements
Python >= 3.9
aiohttp
Installation
pip install aiopegelonline
Examples
Get all available measurement stations

import asyncio
import aiohttp
from aiopegelonline import PegelOnline


async def main():
    async with aiohttp.ClientSession() as session:
        pegelonline = PegelOnline(session)
        stations = await pegelonline.async_get_all_stations()
        for uuid, station in stations.items():
            print(f"uuid: {uuid} name: {station.name}")


if __name__ == "__main__":
    asyncio.run(main())

Get current measurement

import asyncio
import aiohttp
from aiopegelonline import PegelOnline


async def main():
    async with aiohttp.ClientSession() as session:
        pegelonline = PegelOnline(session)
        measurements = await pegelonline.async_get_station_measurements("70272185-b2b3-4178-96b8-43bea330dcae")
        for name, data in measurements.as_dict().items():
            if data is None:
                print(f"{name} not provided by measurement station")
            else:
                print(f"{name}: {data.value} {data.uom}")


if __name__ == "__main__":
    asyncio.run(main())

References
PEGELONLINE api reference (German) |
aiopen | aiopenInstall:pip install aiopenAsync-openWhy not use aiofiles?Wanted more type annotationsaiofiles uses ye ole@coroutinedecorator -- aiopen uses python3.6+async/awaitaiopen is a callable module, so you can do:import aiopenasync with aiopen('afile.txt', 'w') as f: await f.write('some text!')async with aiopen('afile.txt', 'r') as f: content = await f.read()(Big shouts out to the aiofiles people, aiopen is entirely based off of aiofiles)Usage:Just import it! The module is also callable!importaiopenasyncwithaiopen('afile.txt','w')asf:awaitf.write('some text!')asyncwithaiopen('afile.txt','r')asf:content=awaitf.read()print(content) |
aiopenapi3 | aiopenapi3A PythonOpenAPI 3 Specificationclient and validator for Python 3.This project is a fork ofDorthu/openapi3.Featuresimplements …Swagger 2.0OpenAPI 3.0.3OpenAPI 3.1.0description document parsing viapydanticrecursive schemas (A.a -> A)request body model creation via pydanticpydantic compatible "format"-type coercion (e.g. datetime.interval)additionalProperties (limited to string-to-any dictionaries without properties)response body & header parsing via pydanticblocking and nonblocking (asyncio) interface viahttpxSOCKS5 via httpx_sockstests with pytest &fastapiproviding access to methods and arguments via the sad smiley ._. interfacePlugin Interface/api to modify description documents/requests/responses to adapt to non compliant servicesYAML type coercion hints for not well formatted description documentsDescription Document dependency downloads (using the WebLoader)loggingexport AIOPENAPI3_LOGGING_HANDLERS=debugto get /tmp/aiopenapi3-debug.logDocumentationAPI DocumentationRunning TestsThis project includes a test suite, run viapytest. To run the test suite,
ensure that you've installed the dependencies and then runpytestin the root
of this project.
PYTHONPATH=. pytest --cov=./ --cov-report=xml
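A minimal usage sketch (not taken from this README): it assumes the OpenAPI.load_sync entry point and the "sad smiley" ._. operation interface mentioned in the feature list above; the description-document URL and the findPetsByStatus operationId are illustrative only.

# Hedged sketch — substitute your own description document and operationId.
from aiopenapi3 import OpenAPI

api = OpenAPI.load_sync("https://petstore3.swagger.io/api/v3/openapi.json")

# operations are exposed by operationId via the ._. interface
pets = api._.findPetsByStatus(parameters={"status": "available"})
print(pets)

The nonblocking variant would use OpenAPI.load_async inside a coroutine. |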
ai-openchat | Download:https://pypi.org/project/ai-openchat/ChatExample #1 Chat:importasynciofromai_openchatimportModel,AsyncOpenAIasyncdefchat():openai_client=AsyncOpenAI(token='API-KEY')resp=awaitopenai_client.generate_message('Your request?',Model().chat())print(resp)if__name__=='__main__':asyncio.run(chat())Example #2 Movie to Emoji:importasynciofromai_openchatimportModel,AsyncOpenAIasyncdefmovie_to_emoji():openai_client=AsyncOpenAI(token='API-KEY')resp=awaitopenai_client.generate_message('Convert movie titles into emoji.\n\n''Back to the Future: 👨👴🚗🕒\n''Batman: 🤵🦇\n''Transformers: 🚗🤖\n''Star Wars:',Model().movie_to_emoji())print(resp)# ⭐️⚔️if__name__=='__main__':asyncio.run(movie_to_emoji())Example #3 Custom chatimportasynciofromai_openchatimportModel,AsyncOpenAIasyncdefimage_generator():custom_model=Model(model="code-davinci-002",temperature=0,max_tokens=100,top_p=1.0,frequency_penalty=0.5,presence_penalty=0.0,stop=["You:"])openai_client=AsyncOpenAI(token='API-KEY')resp=awaitopenai_client.generate_message('Hello!',custom_model)print(resp)if__name__=='__main__':asyncio.run(image_generator())ImageGenerate Imageimportasynciofromai_openchatimportImageModel,AsyncOpenAIasyncdefimage_generator():openai_client=AsyncOpenAI(token='API-KEY')resp=awaitopenai_client.generate_image('Captain America',ImageModel().image())print(resp)if__name__=='__main__':asyncio.run(image_generator())Generate custom Imageimportasynciofromai_openchatimportImageModel,AsyncOpenAIasyncdefimage_generator():custom_model=ImageModel(n=1,size="1024x1024")openai_client=AsyncOpenAI(token='API-KEY')resp=awaitopenai_client.generate_image('Captain America',custom_model)print(resp)if__name__=='__main__':asyncio.run(image_generator())This project is an attempt to make an asynchronous library for convenient OpenAI management.
You can check out the rest of the models here: https://beta.openai.com/examples. Technologies: Python >= 3.8; aiohttp >= 3.8 |
aiopening | UNKNOWN |
aio-periodic | No description available on PyPI. |
aio_periodic_task | UNKNOWN |
aiopes | Version 1 is not compatible with 0.1.0IndexInstallExampleDocumentationInstallPypi:pip3 install aiopesGit:pip3 install git+https://github.com/WardPearce/aiopes.gitExampleimportasyncioimportaiopesPES=aiopes.client(api_key="...")asyncdefexample():try:asyncfordata,serverinPES.servers():print(data.id)server_target=serverexceptaiopes.exceptions.InvalidAuthorization:print("Invalid API Key.")else:asyncforlocationinPES.locations():print(location.city)asyncforgroupinPES.mapgroups():print(group.name)formap_detailsingroup.maps():print(map_details.name)print(awaitPES.mods())print(awaitPES.plugins())print(awaitPES.tickrates())asyncforgamemodeinPES.gamemodes():print(gamemode.name)asyncforfileinPES.files():print(file.name)ifawaitPES.validate.settings(rcon="new_rcon"):print("Setting is valid")loop=asyncio.get_event_loop()loop.run_until_complete(example()) |
aiopexels | aiopexelsAn asynchronous API wrapper for the Pexels APINote:This library is still in development. There may be bugs and some things may not work as expected. Please be patient. |
aiopg | aiopg is a library for accessing a PostgreSQL database
from the asyncio (PEP-3156/tulip) framework. It wraps
asynchronous features of the Psycopg database driver.Exampleimportasyncioimportaiopgdsn='dbname=aiopg user=aiopg password=passwd host=127.0.0.1'asyncdefgo():pool=awaitaiopg.create_pool(dsn)asyncwithpool.acquire()asconn:asyncwithconn.cursor()ascur:awaitcur.execute("SELECT 1")ret=[]asyncforrowincur:ret.append(row)assertret==[(1,)]loop=asyncio.get_event_loop()loop.run_until_complete(go())Example of SQLAlchemy optional integrationimportasynciofromaiopg.saimportcreate_engineimportsqlalchemyassametadata=sa.MetaData()tbl=sa.Table('tbl',metadata,sa.Column('id',sa.Integer,primary_key=True),sa.Column('val',sa.String(255)))asyncdefcreate_table(engine):asyncwithengine.acquire()asconn:awaitconn.execute('DROP TABLE IF EXISTS tbl')awaitconn.execute('''CREATE TABLE tbl (
id serial PRIMARY KEY,
val varchar(255))''')asyncdefgo():asyncwithcreate_engine(user='aiopg',database='aiopg',host='127.0.0.1',password='passwd')asengine:asyncwithengine.acquire()asconn:awaitconn.execute(tbl.insert().values(val='abc'))asyncforrowinconn.execute(tbl.select()):print(row.id,row.val)loop=asyncio.get_event_loop()loop.run_until_complete(go())Please use:$ make testfor executing the project’s unittests.
See https://aiopg.readthedocs.io/en/stable/contributing.html for details
on how to set up your environment to run the tests.Changelog1.4.0 (2022-10-26)Add python 3.11 and drop python 3.6 support` #892 <https://github.com/aio-libs/aiopg/pull/892>`_1.3.5 (2022-09-25)Fix pool size limit check for unlimited pools#8881.3.4 (2022-06-30)1.3.4b3 (2022-06-29)1.3.4b2 (2022-06-29)1.3.4b1 (2022-06-29)Fix compatibility with SA 1.4.38#891Add py.typed marker#8781.3.3 (2021-11-01)Support async-timeout 4.0+1.3.2 (2021-10-07)1.3.2b2 (2021-10-07)Respect use_labels for select statement#8821.3.2b1 (2021-07-11)Fix compatibility with SQLAlchemy >= 1.4#8701.3.1 (2021-07-08)1.3.1b2 (2021-07-06)Suppress “Future exception was never retrieved”#8621.3.1b1 (2021-07-05)Fix ClosableQueue.get on cancellation, close it on Connection.close#8591.3.0 (2021-06-30)1.3.0b4 (2021-06-28)Fix “Unable to detect disconnect when using NOTIFY/LISTEN”#5591.3.0b3 (2021-04-03)Reformat using black#8141.3.0b2 (2021-04-02)Type annotations#8131.3.0b1 (2021-03-30)Raise ResourceClosedError if we try to open a cursor on a closed SAConnection#8111.3.0b0 (2021-03-25)Fix compatibility with SA 1.4 for IN statement#8061.2.1 (2021-03-23)Pop loop in connection init due to backward compatibility#8081.2.0b4 (2021-03-23)Set max supported sqlalchemy version#8051.2.0b3 (2021-03-22)Don’t run ROLLBACK when the connection is closed#778Multiple cursors support#8011.2.0b2 (2020-12-21)Fix IsolationLevel.read_committed and introduce IsolationLevel.default#770Fix python 3.8 warnings in tests#7711.2.0b1 (2020-12-16)Deprecate blocking connection.cancel() method#5701.2.0b0 (2020-12-15)Implement timeout on acquiring connection from pool#7661.1.0 (2020-12-10)1.1.0b2 (2020-12-09)Added missing slots to context managers#7631.1.0b1 (2020-12-07)Fix on_connect multiple call on acquire#552Fix python 3.8 warnings#622Bump minimum psycopg version to 2.8.4#754Fix Engine.release method to release connection in any way#7561.0.0 (2019-09-20)Removal of an asynchronous call in favor of issues # 550Big editing of documentation and minor bugs #5340.16.0 (2019-01-25)Fix select priority name#525Renamepsycopg2topsycopg2-binaryto fix deprecation warning#507Fix#189hstore when using ReadDictCursor#512close cannot be used while an asynchronous query is underway#452sqlalchemy adapter trx begin allow transaction_mode#4980.15.0 (2018-08-14)Support Python 3.7#4370.14.0 (2018-05-10)Addget_dialectfunc to have ability to passjson_serializer#4510.13.2 (2018-01-03)Fixed compatibility with SQLAlchemy 1.2.0#412Added support for transaction isolation levels#2190.13.1 (2017-09-10)Added connection poll recycling logic#3730.13.0 (2016-12-02)Addasync withsupport to.begin_nested()#208Fix connection.cancel()#212#223Raise informative error on unexpected connection closing#191Added support for python types columns issues#217Added support for default values in SA table issues#2060.12.0 (2016-10-09)Add an on_connect callback parameter to pool#141Fixed connection to work under both windows and posix based systems#1420.11.0 (2016-09-12)Immediately remove callbacks from a closed file descriptor#139Drop Python 3.3 support0.10.0 (2016-07-16)Refactor tests to use dockerized Postgres server#107Reduce default pool minsize to 1#106Explicitly enumerate packages in setup.py#85Remove expired connections from pool on acquire#116Don’t crash when Connection is GC’ed#124Use loop.create_future() if available0.9.2 (2016-01-31)Make pool.release return asyncio.Future, so we can wait on it in__aexit__#102Add support for uuid type#1030.9.1 (2016-01-17)Documentation update#1010.9.0 (2016-01-14)Add async context 
managers for transactions#91Support async iterator in ResultProxy#92Add async with for engine#900.8.0 (2015-12-31)Add PostgreSQL notification support#58Support pools with unlimited size#59Cancel current DB operation on asyncio timeout#66Add async with support for Pool, Connection, Cursor#880.7.0 (2015-04-22)Get rid of resource leak on connection failure.Report ResourceWarning on non-closed connections.Deprecate iteration protocol support in cursor and ResultProxy.Release sa connection to pool onconnection.close().0.6.0 (2015-02-03)Accept dict, list, tuple, named and positional parameters inSAConnection.execute()0.5.2 (2014-12-08)Minor release, fixes a bug that leaves connection in broken state
aftercursor.execute()failure.0.5.1 (2014-10-31)Fix a bug for processing transactions in line.0.5.0 (2014-10-31)Add .terminate() to Pool and EngineReimplement connection pool (now pool size cannot be greater than pool.maxsize)Add .close() and .wait_closed() to Pool and EngineAdd minsize, maxsize, size and freesize properties to sa.EngineSupportechoparameter for logging executed SQL commandsConnection.close() is not a coroutine (but we keep backward compatibility).0.4.1 (2014-10-02)make cursor iterableupdate docs0.4.0 (2014-10-02)add timeouts for database operations.Autoregister psycopg2 support for json data type.Support JSON in aiopg.saSupport ARRAY in aiopg.saAutoregister hstore support if present in connected DBSupport HSTORE in aiopg.sa0.3.2 (2014-07-07)change signature to cursor.execute(operation, parameters=None) to
follow psycopg2 convention.0.3.1 (2014-07-04)Forward arguments to cursor constructor for pooled connections.0.3.0 (2014-06-22)Allow executing SQLAlchemy DDL statements.Fix bug with race conditions on acquiring/releasing connections from pool.0.2.3 (2014-06-12)Fix bug in connection pool.0.2.2 (2014-06-07)Fix bug with passing parameters into SAConnection.execute when
executing raw SQL expression.0.2.1 (2014-05-08)Close connection with invalid transaction status on returning to pool.0.2.0 (2014-05-04)Implemented optional support for sqlalchemy functional sql layer.0.1.0 (2014-04-06)Implemented plain connections: connect, Connection, Cursor.Implemented database pools: create_pool and Pool. |
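A note on aiopg's LISTEN/NOTIFY support (added in 0.8.0 above) — a minimal sketch in the same style as the pool example, using the connection's notifies queue from aiopg's docs; the DSN and channel name are placeholders:

import asyncio

import aiopg

dsn = 'dbname=aiopg user=aiopg password=passwd host=127.0.0.1'


async def listen():
    pool = await aiopg.create_pool(dsn)
    async with pool.acquire() as conn:
        async with conn.cursor() as cur:
            await cur.execute("LISTEN channel")
            # notifications arrive on the connection's asyncio queue
            msg = await conn.notifies.get()
            print("received:", msg.channel, msg.payload)


asyncio.run(listen())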
aiopg8000 | NOTE:aiopg8000 (this project) is a fork of pg8000 to support asyncio.pg8000 (https://github.com/mfenniak/pg8000) is a Pure-Python interface to the PostgreSQL database engine. It is one of many PostgreSQL interfaces for the Python programming language. pg8000 is somewhat distinctive in that it is written entirely in Python and does not rely on any external libraries (such as a compiled python module, or PostgreSQL’s libpq library). pg8000 supports the standard Python DB-API version 2.0.pg8000’s name comes from the belief that it is probably about the 8000th PostgreSQL interface for Python. |
aiopg-listen | aiopg-listen — this library simplifies usage of listen/notify with aiopg:
Handles loss of a connection
Simplifies processing notifications from multiple channels
Sets up a timeout for receiving a notification
Allows receiving all notifications or only the last notification, depending on ListenPolicy

import asyncio
import aiopg
import aiopg_listen


async def handle_notifications(notification: aiopg_listen.NotificationOrTimeout) -> None:
    print(f"{notification} has been received")


listener = aiopg_listen.NotificationListener(aiopg_listen.connect_func())
listener_task = asyncio.create_task(
    listener.run(
        {"channel": handle_notifications},
        policy=aiopg_listen.ListenPolicy.LAST,
        notification_timeout=1,
    )
)

async with aiopg.connect() as connection, connection.cursor() as cursor:
    for i in range(42):
        await cursor.execute(f"NOTIFY simple, '{i}'")

v0.0.7 (2023-03-09)
Fix python 3.11 compatibility
v0.0.6 (2022-11-02)
Add python 3.11 support
v0.0.5 (2021-11-02)
Support async-timeout 4.0+
v0.0.4 (2021-09-08)
Reexport explicitly #18
v0.0.3 (2021-08-10)
Allow suppressing timeout by aiopg_listen.NO_TIMEOUT #9
Fix typing for python 3.8 #11
v0.0.2 (2021-07-25)
Add aiopg_listen.connect_func to simplify initialization #5
Rename consumer to listener #6
Do not crash if handler fails #7
v0.0.1 (2021-07-25)
A first version |
aiopg-sqlite | No description available on PyPI. |
aiopgx | aiopg is a library for accessing a PostgreSQL database
from the asyncio (PEP-3156/tulip) framework. It wraps
asynchronous features of the Psycopg database driver.
Example
import asyncio
from aiopg.pool import create_pool
dsn = 'dbname=jetty user=nick password=1234 host=localhost port=5432'
@asyncio.coroutine
def test_select():
pool = yield from create_pool(dsn)
with (yield from pool) as conn:
cur = yield from conn.cursor()
yield from cur.execute('SELECT 1')
ret = yield from cur.fetchone()
assert ret == (1,), ret
asyncio.get_event_loop().run_until_complete(test_select())Example of SQLAlchemy optional integrationimport asyncio
from aiopg.sa import create_engine
import sqlalchemy as sa
metadata = sa.MetaData()
tbl = sa.Table('tbl', metadata,
sa.Column('id', sa.Integer, primary_key=True),
sa.Column('val', sa.String(255)))
@asyncio.coroutine
def go():
engine = yield from create_engine(user='aiopg',
database='aiopg',
host='127.0.0.1',
password='passwd')
with (yield from engine) as conn:
yield from conn.execute(tbl.insert().values(val='abc'))
res = yield from conn.execute(tbl.select())
for row in res:
print(row.id, row.val)
asyncio.get_event_loop().run_until_complete(go())Please use:$ python3 runtests.pyfor executing project’s unittestsCHANGES0.8.0 (XXXX-XX-XX)Add PostgreSQL notification support #58Support pools with unlimited size #590.7.0 (2015-04-22)Get rid of resource leak on connection failure.Report ResourceWarning on non-closed connections.Deprecate iteration protocol support in cursor and ResultProxy.Release sa connection to pool onconnection.close().0.6.0 (2015-02-03)Accept dict, list, tuple, named and positional parameters inSAConnection.execute()0.5.2 (2014-12-08)Minor release, fixes a bug that leaves connection in broken state
aftercursor.execute()failure.0.5.1 (2014-10-31)Fix a bug for processing transactions in line.0.5.0 (2014-10-31)Add .terminate() to Pool and EngineReimplement connection pool (now pool size cannot be greater than pool.maxsize)Add .close() and .wait_closed() to Pool and EngineAdd minsize, maxsize, size and freesize properties to sa.EngineSupportechoparameter for logging executed SQL commandsConnection.close() is not a coroutine (but we keep backward compatibility).0.4.1 (2014-10-02)make cursor iterableupdate docs0.4.0 (2014-10-02)add timeouts for database operations.Autoregister psycopg2 support for json data type.Support JSON in aiopg.saSupport ARRAY in aiopg.saAutoregister hstore support if present in connected DBSupport HSTORE in aiopg.sa0.3.2 (2014-07-07)change signature to cursor.execute(operation, parameters=None) to
follow psycopg2 convention.0.3.1 (2014-07-04)Forward arguments to cursor constructor for pooled connections.0.3.0 (2014-06-22)Allow executing SQLAlchemy DDL statements.Fix bug with race conditions on acquiring/releasing connections from pool.0.2.3 (2014-06-12)Fix bug in connection pool.0.2.2 (2014-06-07)Fix bug with passing parameters into SAConnection.execute when
executing raw SQL expression.0.2.1 (2014-05-08)Close connection with invalid transaction status on returning to pool.0.2.0 (2014-05-04)Implemented optional support for sqlalchemy functional sql layer.0.1.0 (2014-04-06)Implemented plain connections: connect, Connection, Cursor.Implemented database pools: create_pool and Pool. |
aiophoenixdb | What isaiophoenixdbThis project is based on the Apache Software Foundation open source
project Apache-Phoenixdb project to transform the call implementation of
the Avatica protocol in the code from the original synchronous mode to
asynchronous call.Getting startedHow to installpipinstallaiophoenixdbHow to useQuery sampleimportaiophoenixdbimportasyncioPHOENIX_CONFIG={'url':'http://xxxxxxxxxx','user':'xxx','password':'xxx','database':'xxx'}asyncdefquery_test():conn=awaitaiophoenixdb.connect(**PHOENIX_CONFIG)asyncwithconn:asyncwithconn.cursor()asps:# need awaitawaitps.execute("SELECT * FROM xxx WHERE id = ?",parameters=("1",))res=awaitps.fetchone()print(res)# Throw the query coroutine into the event loop to runasyncio.get_event_loop().run_until_complete(query_test())Query with DictCursorimportaiophoenixdbimportasynciofromaiophoenixdb.cursorsimportDictCursorPHOENIX_CONFIG={'url':'http://xxxxxxxxxx','user':'xxx','password':'xxx','database':'xxx'}asyncdefquery_test():conn=awaitaiophoenixdb.connect(**PHOENIX_CONFIG)asyncwithconn:asyncwithconn.cursor(cursor_factory=DictCursor)asps:# need awaitawaitps.execute("SELECT * FROM xxx WHERE id = ?",parameters=("1",))res=awaitps.fetchone()print(res)# Throw the query coroutine into the event loop to runasyncio.get_event_loop().run_until_complete(query_test()) |
aiophotoprism | aiophotoprism — Asynchronous Python client for Photoprism.
Warning: the Photoprism API is not stable yet,
use at your own risk.
NOTE: The package is in active development. Not all features of the API are implemented.
Installation
pip install aiophotoprism
Usage

import asyncio
from aiophotoprism import Photoprism


async def main():
    async with Photoprism("username", "password") as client:
        # interact with the client here
        pass


if __name__ == "__main__":
    asyncio.run(main())

Photoprism
Photoprism is the entrypoint class. It acts as an async context manager and provides access to API endpoints.
Initialization

def __init__(
    self,
    username,                     # your username
    password,                     # your password
    url="http://127.0.0.1:2342",  # a base URL of the server, https://photoprism.example.com:443/something is also possible
    timeout=DEFAULT_TIMEOUT,      # timeout in seconds
    verify_ssl=True,              # perform SSL verification
    loop=None,                    # event loop
    session=None,                 # client session
):
    ...

Photos
Returns a list of photos.

await client.photos(count=50)

Config
Returns the Photoprism instance config.

await client.config()

Index
Forces the Photoprism instance to index photos. A complete scan is not supported.

await client.index()

License
MIT License
Copyright (c) 2021 Gleb Sinyavskiy
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE. |
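Putting aiophotoprism's documented endpoints together — a sketch that fills in the "interact with the client here" placeholder from the Usage section above with the photos/config/index calls documented there (credentials are placeholders):

import asyncio

from aiophotoprism import Photoprism


async def main():
    async with Photoprism("username", "password") as client:
        config = await client.config()          # instance config
        photos = await client.photos(count=50)  # list of photos
        await client.index()                    # force photo indexing
        print(config, photos)


asyncio.run(main())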
aio-pika | aio-pikaA wrapper aroundaiormqfor asyncio and humans.Check out the examples and the tutorial in thedocumentation.If you are a newcomer to RabbitMQ, please start with theadopted official RabbitMQ tutorial.NoteSince version5.0.0this library doesn’t usepikaas AMQP connector.
Versions below 5.0.0 contain or require pika’s source code. Note: version 7.0.0 has breaking API changes, see CHANGELOG.md
for migration hints.FeaturesCompletely asynchronous API.Object oriented API.Transparent auto-reconnects with complete state recovery withconnect_robust(e.g. declared queues or exchanges, consuming state and bindings).Python 3.7+ compatible.For python 3.5 users, aio-pika is available viaaio-pika<7.Transparentpublisher confirmssupport.Transactionssupport.Complete type-hints coverage.Installationpipinstallaio-pikaUsage exampleSimple consumer:importasyncioimportaio_pikaimportaio_pika.abcasyncdefmain(loop):# Connecting with the given parameters is also possible.# aio_pika.connect_robust(host="host", login="login", password="password")# You can only choose one option to create a connection, url or kw-based params.connection=awaitaio_pika.connect_robust("amqp://guest:[email protected]/",loop=loop)asyncwithconnection:queue_name="test_queue"# Creating channelchannel:aio_pika.abc.AbstractChannel=awaitconnection.channel()# Declaring queuequeue:aio_pika.abc.AbstractQueue=awaitchannel.declare_queue(queue_name,auto_delete=True)asyncwithqueue.iterator()asqueue_iter:# Cancel consuming after __aexit__asyncformessageinqueue_iter:asyncwithmessage.process():print(message.body)ifqueue.nameinmessage.body.decode():breakif__name__=="__main__":loop=asyncio.get_event_loop()loop.run_until_complete(main(loop))loop.close()Simple publisher:importasyncioimportaio_pikaimportaio_pika.abcasyncdefmain(loop):# Explicit type annotationconnection:aio_pika.RobustConnection=awaitaio_pika.connect_robust("amqp://guest:[email protected]/",loop=loop)routing_key="test_queue"channel:aio_pika.abc.AbstractChannel=awaitconnection.channel()awaitchannel.default_exchange.publish(aio_pika.Message(body='Hello{}'.format(routing_key).encode()),routing_key=routing_key)awaitconnection.close()if__name__=="__main__":loop=asyncio.get_event_loop()loop.run_until_complete(main(loop))loop.close()Get single message example:importasynciofromaio_pikaimportconnect_robust,Messageasyncdefmain(loop):connection=awaitconnect_robust("amqp://guest:[email protected]/",loop=loop)queue_name="test_queue"routing_key="test_queue"# Creating channelchannel=awaitconnection.channel()# Declaring exchangeexchange=awaitchannel.declare_exchange('direct',auto_delete=True)# Declaring queuequeue=awaitchannel.declare_queue(queue_name,auto_delete=True)# Binding queueawaitqueue.bind(exchange,routing_key)awaitexchange.publish(Message(bytes('Hello','utf-8'),content_type='text/plain',headers={'foo':'bar'}),routing_key)# Receiving messageincoming_message=awaitqueue.get(timeout=5)# Confirm messageawaitincoming_message.ack()awaitqueue.unbind(exchange,routing_key)awaitqueue.delete()awaitconnection.close()if__name__=="__main__":loop=asyncio.get_event_loop()loop.run_until_complete(main(loop))There are more examples and the RabbitMQ tutorial in thedocumentation.See alsoaiormqaiormqis a pure python AMQP client library. It is under the hood ofaio-pikaand might to be used when you really loving works with the protocol low level.
The following examples demonstrate the user API.
Simple consumer:

import asyncio
import aiormq


async def on_message(message):
    """
on_message doesn't necessarily have to be defined as async.
Here it is to show that it's possible.
"""print(f" [x] Received message{message!r}")print(f"Message body is:{message.body!r}")print("Before sleep!")awaitasyncio.sleep(5)# Represents async I/O operationsprint("After sleep!")asyncdefmain():# Perform connectionconnection=awaitaiormq.connect("amqp://guest:guest@localhost/")# Creating a channelchannel=awaitconnection.channel()# Declaring queuedeclare_ok=awaitchannel.queue_declare('helo')consume_ok=awaitchannel.basic_consume(declare_ok.queue,on_message,no_ack=True)loop=asyncio.get_event_loop()loop.run_until_complete(main())loop.run_forever()Simple publisher:importasynciofromtypingimportOptionalimportaiormqfromaiormq.abcimportDeliveredMessageMESSAGE:Optional[DeliveredMessage]=Noneasyncdefmain():globalMESSAGEbody=b'Hello World!'# Perform connectionconnection=awaitaiormq.connect("amqp://guest:guest@localhost//")# Creating a channelchannel=awaitconnection.channel()declare_ok=awaitchannel.queue_declare("hello",auto_delete=True)# Sending the messageawaitchannel.basic_publish(body,routing_key='hello')print(f" [x] Sent{body}")MESSAGE=awaitchannel.basic_get(declare_ok.queue)print(f" [x] Received message from{declare_ok.queue!r}")loop=asyncio.get_event_loop()loop.run_until_complete(main())assertMESSAGEisnotNoneassertMESSAGE.routing_key=="hello"assertMESSAGE.body==b'Hello World!'Thepatioand thepatio-rabbitmqPATIOis an acronym for Python Asynchronous Tasks for AsyncIO - an easily extensible library, for distributed task execution, like celery, only targeting asyncio as the main design approach.patio-rabbitmqprovides you with the ability to useRPC over RabbitMQservices with extremely simple implementation:frompatioimportRegistry,ThreadPoolExecutorfrompatio_rabbitmqimportRabbitMQBrokerrpc=Registry(project="patio-rabbitmq",auto_naming=False)@rpc("sum")defsum(*args):returnsum(args)asyncdefmain():asyncwithThreadPoolExecutor(rpc,max_workers=16)asexecutor:asyncwithRabbitMQBroker(executor,amqp_url="amqp://guest:guest@localhost/",)asbroker:awaitbroker.join()And the caller side might be written like this:importasynciofrompatioimportNullExecutor,Registryfrompatio_rabbitmqimportRabbitMQBrokerasyncdefmain():asyncwithNullExecutor(Registry(project="patio-rabbitmq"))asexecutor:asyncwithRabbitMQBroker(executor,amqp_url="amqp://guest:guest@localhost/",)asbroker:print(awaitasyncio.gather(*[broker.call("mul",i,i,timeout=1)foriinrange(10)]))Propan:fire:Propanis a powerful and easy-to-use Python framework for building event-driven applications that interact with any MQ Broker.If you need no deep dive intoRabbitMQdetails, you can use more high-levelPropaninterfaces:frompropanimportPropanApp,RabbitBrokerbroker=RabbitBroker("amqp://guest:guest@localhost:5672/")app=PropanApp(broker)@broker.handle("user")asyncdefuser_created(user_id:int):assertisinstance(user_id,int)returnf"user-{user_id}: created"@app.after_startupasyncdefpub_smth():assert(awaitbroker.publish(1,"user",callback=True))=="user-1: created"Also,Propanvalidates messages bypydantic, generates your projectAsyncAPIspec, tests application locally, RPC calls, and more.In fact, it is a high-level wrapper on top ofaio-pika, so you can use both of these libraries’ advantages at the same time.python-socketioSocket.IOis a transport protocol that enables real-time bidirectional event-based communication between clients (typically, though not always, web browsers) and a server. 
This package provides Python implementations of both, each with standard and asyncio variants.Also this package is suitable for building messaging services overRabbitMQviaaio-pikaadapter:importsocketiofromaiohttpimportwebsio=socketio.AsyncServer(client_manager=socketio.AsyncAioPikaManager())app=web.Application()sio.attach(app)@sio.eventasyncdefchat_message(sid,data):print("message ",data)if__name__=='__main__':web.run_app(app)And a client is able to callchat_messagethe following way:importasyncioimportsocketiosio=socketio.AsyncClient()asyncdefmain():awaitsio.connect('http://localhost:8080')awaitsio.emit('chat_message',{'response':'my response'})if__name__=='__main__':asyncio.run(main())Thetaskiqand thetaskiq-aio-pikaTaskiqis an asynchronous distributed task queue for python. The project takes inspiration from big projects such as Celery and Dramatiq. But taskiq can send and run both the sync and async functions.The library provides you withaio-pikabroker for running tasks too.fromtaskiq_aio_pikaimportAioPikaBrokerbroker=AioPikaBroker()@broker.taskasyncdeftest()->None:print("nothing")asyncdefmain():awaitbroker.startup()awaittest.kiq()RasaWith over 25 million downloads, Rasa Open Source is the most popular open source framework for building chat and voice-based AI assistants.WithRasa, you can build contextual assistants on:Facebook MessengerSlackGoogle HangoutsWebex TeamsMicrosoft Bot FrameworkRocket.ChatMattermostTelegramTwilioYour own custom conversational channels or voice assistants as:Alexa SkillsGoogle Home ActionsRasahelps you build contextual assistants capable of having layered conversations with lots of back-and-forth. In order for a human to have a meaningful exchange with a contextual assistant, the assistant needs to be able to use context to build on things that were previously discussed –Rasaenables you to build assistants that can do this in a scalable way.And it also usesaio-pikato interact withRabbitMQdeep inside!VersioningThis software followsSemantic VersioningFor contributorsSetting up development environmentClone the project:gitclonehttps://github.com/mosquito/aio-pika.gitcdaio-pikaCreate a new virtualenv foraio-pika:python3-mvenvenvsourceenv/bin/activateInstall all requirements foraio-pika:pipinstall-e'.[develop]'Running TestsNOTE: In order to run the tests locally you need to run a RabbitMQ instance with default user/password (guest/guest) and port (5672).The Makefile provides a command to run an appropriate RabbitMQ Docker image:makerabbitmqTo test just run:maketestEditing DocumentationTo iterate quickly on the documentation live in your browser, try:nox-sdocs--serveCreating Pull RequestsPlease feel free to create pull requests, but you should describe your use cases and add some examples.Changes should follow a few simple rules:When your changes break the public API, you must increase the major version.When your changes are safe for public API (e.g. added an argument with default value)You have to add test cases (seetests/folder)You must add docstringsFeel free to add yourself to“thank’s to” section |
aiopika-macrobase | No description available on PyPI. |
aio-pika-msgpack-rpc | aio-pika-msgpack-rpcRequires Python >= 3.7.Installationpip install aio-pika-msgpack-rpcExampleimportasyncioimportaio_pikafromaio_pika_msgpack_rpcimportMSGPackRPCasyncdefmain():client=awaitaio_pika.connect_robust('amqp://guest:guest@localhost:5672/')channel=awaitclient.channel()rpc=awaitMSGPackRPC.create(channel)# rpc callsresult=awaitrpc.call('method_name',kwargs={'test':'data'})asyncio.get_event_loop().run_until_complete(main()) |
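The aio-pika-msgpack-rpc example above only shows the calling side. Assuming MSGPackRPC follows aio-pika's RPC pattern, whose register() coroutine takes a method name and a callable (an assumption — the method and handler names here are illustrative), the serving side might look like:

import asyncio

import aio_pika
from aio_pika_msgpack_rpc import MSGPackRPC


async def method_name(*, test):
    # the handler receives the kwargs passed to rpc.call()
    return {'echo': test}


async def main():
    client = await aio_pika.connect_robust('amqp://guest:guest@localhost:5672/')
    channel = await client.channel()
    rpc = await MSGPackRPC.create(channel)
    await rpc.register('method_name', method_name, auto_delete=True)
    await asyncio.Future()  # serve forever


asyncio.run(main())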
aio-pika-rpc | aio-pika-msgpack-rpcaio-pika msgpack RPC |
aiopinboard | 📌 aiopinboard: A Python 3 Library for Pinboardaiopinboardis a Python3,asyncio-focused library for interacting with thePinboardAPI.InstallationPython VersionsAPI TokenUsageBookmarksTheBookmarkObjectGetting the Last Change DatetimeGetting BookmarksAdding a BookmarkDeleting a BookmarkTagsGetting TagsGetting Suggested TagsDeleting a TagRenaming a TagNotesTheNoteObjectGetting NotesContributingInstallationpipinstallaiopinboardPython Versionsaiopinboardis currently supported on:Python 3.10Python 3.11Python 3.12API TokenYou can retrieve your Pinboard API token viayour account's settings page.Usageaiopinboardendeavors to replicate all of the endpoints inthe Pinboard API documentationwith sane, usable responses.All API usage starts with creating anAPIobject that contains your Pinboard API token:importasynciofromaiopinboardimportAPIasyncdefmain()->None:api=API("<PINBOARD_API_TOKEN>")# do things!asyncio.run(main())BookmarksTheBookmarkObjectAPI endpoints that retrieve one or more bookmarks will returnBookmarkobjects, which
carry all of the expected properties of a bookmark:hash: the unique identifier of the bookmarkhref: the bookmark's URLtitle: the bookmark's titledescription: the bookmark's descriptionlast_modified: the UTC date the bookmark was last modifiedtags: a list of tags applied to the bookmarkunread: whether the bookmark is unreadshared: whether the bookmark is sharedGetting the Last Change DatetimeTo get the UTC datetime of the last "change" (bookmark added, updated, or deleted):importasynciofromaiopinboardimportAPIasyncdefmain()->None:"""Run!"""api=API("<PINBOARD_API_TOKEN>")last_change_dt=awaitapi.bookmark.async_get_last_change_datetime()# >>> datetime.datetime(2020, 9, 3, 13, 7, 19, tzinfo=<UTC>)asyncio.run(main())This method should be used to determine whether additional API calls should be made –
for example, if nothing has changed since the last time a request was made, the
implementing library can halt.Getting BookmarksTo get a bookmark by its URL:importasynciofromaiopinboardimportAPIasyncdefmain()->None:api=API("<PINBOARD_API_TOKEN>")awaitapi.bookmark.async_get_bookmark_by_url("https://my.com/bookmark")# >>> <Bookmark href="https://my.com/bookmark">asyncio.run(main())To get all bookmarksimportasynciofromaiopinboardimportAPIasyncdefmain()->None:api=API("<PINBOARD_API_TOKEN>")awaitapi.bookmark.async_get_all_bookmarks()# >>> [<Bookmark ...>, <Bookmark ...>]asyncio.run(main())You can specify several optional parameters while getting all bookmarks:tags: an optional list of tags to filter results bystart: the optional starting index to return (defaults to the start)results: the optional number of results (defaults to all)from_dt: the optional datetime to start fromto_dt: the optional datetime to end atTo get all bookmarks created on a certain date:importasynciofromdatetimeimportdatefromaiopinboardimportAPIasyncdefmain()->None:"""Run!"""api=API("<PINBOARD_API_TOKEN>")awaitapi.bookmark.async_get_bookmarks_by_date(date.today())# >>> [<Bookmark ...>, <Bookmark ...>]# Optionally filter the results with a list of tags – note that only bookmarks that# have all tags will be returned:awaitapi.bookmark.async_get_bookmarks_by_date(date.today(),tags=["tag1","tag2"])# >>> [<Bookmark ...>, <Bookmark ...>]asyncio.run(main())To get recent bookmarks:importasynciofromaiopinboardimportAPIasyncdefmain()->None:api=API("<PINBOARD_API_TOKEN>")awaitapi.bookmark.async_get_recent_bookmarks(count=10)# >>> [<Bookmark ...>, <Bookmark ...>]# Optionally filter the results with a list of tags – note that only bookmarks that# have all tags will be returned:awaitapi.bookmark.async_get_recent_bookmarks(count=20,tags=["tag1","tag2"])# >>> [<Bookmark ...>, <Bookmark ...>]asyncio.run(main())To get a summary of dates and how many bookmarks were created on those dates:importasynciofromaiopinboardimportAPIasyncdefmain()->None:api=API("<PINBOARD_API_TOKEN>")dates=awaitapi.bookmark.async_get_dates()# >>> {datetime.date(2020, 09, 05): 4, ...}asyncio.run(main())Adding a BookmarkTo add a bookmark:importasynciofromaiopinboardimportAPIasyncdefmain()->None:api=API("<PINBOARD_API_TOKEN>")awaitapi.bookmark.async_add_bookmark("https://my.com/bookmark","My New Bookmark")asyncio.run(main())You can specify several optional parameters while adding a bookmark:description: the optional description of the bookmarktags: an optional list of tags to assign to the bookmarkcreated_datetime: the optional creation datetime to use (defaults to now)replace: whether this should replace a bookmark with the same URLshared: whether this bookmark should be sharedtoread: whether this bookmark should be unreadDeleting a BookmarkTo delete a bookmark by its URL:importasynciofromaiopinboardimportAPIasyncdefmain()->None:api=API("<PINBOARD_API_TOKEN>")awaitapi.bookmark.async_delete_bookmark("https://my.com/bookmark")asyncio.run(main())TagsGetting TagsTo get all tags for an account (and a count of how often each tag is used):importasynciofromaiopinboardimportAPIasyncdefmain()->None:api=API("<PINBOARD_API_TOKEN>")awaitapi.tag.async_get_tags()# >>> {"tag1": 3, "tag2": 8}asyncio.run(main())Getting Suggested TagsTo get lists of popular (used by the community) and recommended (used by you) tags for a
particular URL:importasynciofromaiopinboardimportAPIasyncdefmain()->None:api=API("<PINBOARD_API_TOKEN>")awaitapi.bookmark.async_get_suggested_tags("https://my.com/bookmark")# >>> {"popular": ["tag1", "tag2"], "recommended": ["tag3"]}asyncio.run(main())Deleting a TagTo delete a tag:importasynciofromaiopinboardimportAPIasyncdefmain()->None:api=API("<PINBOARD_API_TOKEN>")awaitapi.tag.async_delete_tag("tag1")asyncio.run(main())Renaming a TagTo rename a tag:importasynciofromaiopinboardimportAPIasyncdefmain()->None:api=API("<PINBOARD_API_TOKEN>")awaitapi.tag.async_rename_tag("old-tag","new-tag")asyncio.run(main())NotesTheNoteObjectAPI endpoints that retrieve one or more notes will returnNoteobjects, which
carry all of the expected properties of a note:note_id: the unique IDtitle: the titlehash: the computed hashcreated_at: the UTC datetime the note was createdupdated_at: the UTC datetime the note was updatedlength: the lengthGetting NotesTo get all notes for an account:importasynciofromaiopinboardimportAPIasyncdefmain()->None:api=API("<PINBOARD_API_TOKEN>")awaitapi.note.async_get_notes()# >>> [<Note ...>, <Note ...>]asyncio.run(main())ContributingThanks to all ofour contributorsso far!Check for open features/bugsorinitiate a discussion on one.Fork the repository.(optional, but highly recommended) Create a virtual environment:python3 -m venv .venv(optional, but highly recommended) Enter the virtual environment:source ./.venv/bin/activateInstall the dev environment:script/setupCode your new feature or bug fix on a new branch.Write tests that cover your new functionality.Run tests and ensure 100% code coverage:poetry run pytest --cov aiopinboard testsUpdateREADME.mdwith any new documentation.Submit a pull request! |
aiopinecone | aiopineconeAn asynchronous Pinecone DB Client, completely unaffiliated with Pinecone or Pinecone Systems, Inc. |
aioping | No description available on PyPI. |
aio_ping | aio_pingAn asyncio-based python ping implementation using raw sockets.Compatible with Python 3.5 ff.Note that ICMP messages can only be sent from processes running as root (in Windows, you must run this script as ‘Administrator’).Original VersionMatthew Dixon Cowlescopyleft 1989-2016 by the python-ping team, seeAUTHORSfor more details.license: GNU GPL v2, seeLICENSEfor more details.Usageusage: ping [-h] [-w TIMEOUT] [-c COUNT] [-i INTERVAL] [-4] [-6]
[-I SOURCEINTF] [-s NUMDATABYTES] [-T] [-S SOURCEIP]
hostname
A pure python implementation of the ping protocol. *REQUIRES ROOT*
positional arguments:
hostname The address to attempt to ping.
optional arguments:
-h, --help show this help message and exit
-w TIMEOUT, --deadline TIMEOUT
The maximum amount of time to wait until ping times
out.
-c COUNT, --request_count COUNT
The number of attempts to make. Zero=infinite.
-i INTERVAL, --interval INTERVAL
Time between ping attempts
-4, --ipv4 Flag to use IPv4.
-6, --ipv6 Flag to use IPv6.
-I SOURCEINTF, --interface SOURCEINTF
Interface to use.
-s NUMDATABYTES, --packet_size NUMDATABYTES
Designate the amount of data to send per packet.
-T, --test_case Flag to run the default test case suite.
-S SOURCEIP, --source_address SOURCEIP
Source address from which ICMP Echo packets will be
sent.Using as lib# python3
>>> from aio_ping import ping
>>> ping('google.com', timeout=3000, count=3, delay=0.5)
True
>>> ping('google.com', timeout=3000, count=3, delay=0.5, verbose=True)
PYTHON PING google.com (216.58.212.46): 1300 data bytes
72 bytes from 216.58.212.46: icmp_seq=0 ttl=59 time=4.42 ms
72 bytes from 216.58.212.46: icmp_seq=1 ttl=59 time=4.70 ms
72 bytes from 216.58.212.46: icmp_seq=2 ttl=59 time=4.44 ms
72 bytes from 216.58.212.46: icmp_seq=3 ttl=59 time=4.47 ms
----216.58.212.46 PYTHON PING Statistics----
4 packets transmitted, 4 packets received, 0.0% packet loss
round-trip (ms) min/avg/max = 4.4/4.5/4.7
1
Async usage is via the Ping class, which can be used like this:

from aio_ping import Ping, VerbosePing
async def ping(hostname, verbose=True, handle_signals=False, **kw):
"""
Send @count ping to @hostname with the given @timeout
"""
ping = (VerbosePing if verbose else Ping)(verbose=verbose, **kw)
if handle_signals: ping.add_signal_handler()
await ping.init(hostname)
res = await ping.looped()
if verbose:
ping.print_stats()
ping.close()
return res

Contribute
Fork this repo on GitHub and send pull requests. Thank you.
Revision history
Links
Source code at GitHub: https://github.com/M-o-a-T/aioping
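To actually run the ping() coroutine defined above — a sketch assuming the Ping/VerbosePing API exactly as shown, with the keyword arguments from the REPL session forwarded to it:

import asyncio

loop = asyncio.get_event_loop()
res = loop.run_until_complete(ping("google.com", count=3, timeout=3000))
print(res) |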
aiopioneer | Python library for controlling a Pioneer AVR via its built-in API. Used by the pioneer_async integration for Home Assistant, which was inspired by the original Pioneer Home Assistant integration.
Tested on a VSX-930 (Main Zone and HDZone outputs).

Features

- Implemented in asyncio.
- Maintains a single continuous telnet session to the AVR, with automatic reconnect.
- Eliminates status polling where the AVR sends keepalive responses (on port 8102).
- Auto-detects Zones 1, 2, 3 and HDZONE.
- Automatically polls the AVR for source names, so you no longer need to code them manually in your config if your AVR supports their retrieval. Source names can also be set manually.
- Ignores specific zones, for AVRs that report phantom zones.
- Queries device parameters: MAC address, software version, model.
- Ability to set internal parameters to change the API functionality, eg. maximum volume, volume step change delta.
- Defaults for internal parameters can be changed via custom profiles based on the AVR model.
- Includes a workaround for AVRs with an initial volume set on the Main Zone (eg. VSX-930).
- Supports AVRs that do not support setting the volume level, by emulating it using up/down commands (eg. VSX-S510).
- Command line client for sending commands and testing.
- Supports all listening mode functions.
- Supports all video related functions.
- Supports panel and remote locking.
- Supports most AMP related functions.
- Supports all tone functions.
- Supports most zone power functions.
- Supports all zone input functions.
- Supports all zone volume and mute functions.
- Supports some basic tuner functions.

Params

A params object may be passed to the library that modifies its functionality. The default parameters listed below are for AVR models that do not match any custom profile. Custom profiles apply additional default parameters based on the model identifier retrieved from the AVR, and are defined in aiopioneer/param.py. If you need to modify parameters for the library to work with your AVR model, then please create a PR to add a custom profile for your AVR model, or log an issue containing your model number and the modified parameters, requesting a custom profile to be created. A short usage sketch follows the parameter table below.

NOTE: YAML syntax is used in the table below. Use Python equivalents (false -> False, true -> True, null -> None etc.) when calling the Python API directly, and JSON syntax if manually specifying via the Home Assistant integration.

| Name | Type | Default | Description |
| --- | --- | --- | --- |
| ignored_zones | list | [] | List of zones to ignore even if they are auto-discovered. Specify Zone IDs as strings: "1", "2", "3" and "Z". |
| command_delay | float | 0.1 | Insert a delay between sequential commands that are sent to the AVR. This appears to make the AVR behave more reliably during status polls. Increase this value if debug logging shows that your AVR times out between commands. |
| max_source_id | int | 60 | Maximum source ID that the source discovery queries. Reduce this if your AVR returns errors. |
| max_volume | int | 185 | Maximum volume for the Main Zone. |
| max_volume_zonex | int | 185 | Maximum volume for zones other than the Main Zone. |
| power_on_volume_bounce | bool | false | On some AVRs (eg. VSX-930) where a power-on volume is set, the initial volume is not reported by the AVR correctly until a volume change is made. This option enables a workaround that sends a volume up and down command to the AVR on power-on to correct the reported volume without affecting the power-on volume. |
| volume_step_only | bool | false | On some AVRs (eg. VSX-S510), setting the volume level is not supported natively by the API. This option emulates setting the volume level using volume up and down commands. |
| ignore_volume_check | bool | false | Don't check volume when determining whether a zone exists on the AVR. Useful for AVRs with an HDZone that passes through audio. |
| zone_1_sources | list | [] | (>0.4) Customises the available sources for use with Zone 1. Defaults to all available sources. |
| zone_2_sources | list | ["04", "06", "15", "26", "38", "53", "41", "44", "45", "17", "13", "05", "01", "02", "33", "46", "47", "99", "10"] | Customises the available sources for use with Zone 2 (some AVRs do not support all sources). |
| zone_3_sources | list | ["04", "06", "15", "26", "38", "53", "41", "44", "45", "17", "13", "05", "01", "02", "33", "46", "47", "99", "10"] | Customises the available sources for use with Zone 3 (some AVRs do not support all sources). |
| hdzone_sources | list | ["25", "04", "06", "10", "15", "19", "20", "21", "22", "23", "24", "34", "35", "26", "38", "53", "41", "44", "45", "17", "13", "33", "31", "46", "47", "48"] | Customises the available sources for use with HDZone (some AVRs do not support all sources). |
| hdzone_volume_requirements | list | ["13", "15", "05", "25"] | A list of sources that HDZone must be set to for volume control; some AVRs do not support HDZone volume at all (see ignore_volume_check above) and some only allow control of certain sources. |
| amplifier_speaker_system_modes | dict | .... | Customises the names of speaker system modes. Different generations of AVR will name zones slightly differently. For example, the SC-LX57 names speaker system mode 15 as 5.1ch Bi-Amp + ZONE2, however this can also be called 5.2ch Bi-Amp + HDZONE on newer AVRs. |
| disabled_amplifier_listening_modes | list | [] | A list of disabled listening modes / sound modes. All modes are enabled by default; some AVRs have definitions already to disable unsupported modes. If you try to change sound mode to a mode that has not been enabled, the AVR will return an error (usually E02). |
| video_resolution_modes | list | ['0', '1', '3', '4', '5', '6', '7', '8', '9'] | Sets the available video resolutions. Not all AVRs support the same resolution settings. This defaults to all of the latest resolutions from FY16. |
| mhl_source | string | null | Sets the MHL source ID. This is used for media controls. This information cannot be queried automatically. |
| enabled_functions | list | ["amp", "dsp", "tuner", "tone", "channels", "video", "system", "audio"] | Change the functions that are enabled by the API; adding more functions will increase the amount of time it takes to complete a full init and update. |
| disable_autoquery | bool | false | Setting to true will disable auto queries on init for all functions apart from basic functionality (power, source, volume and mute). If you only need those functions, you can set this to true. |
| am_frequency_step | int | null | Optional setting to configure the AM frequency step. If this is set to null, a function is queued to detect this information by stepping up and down the frequency when the tuner is first used while set to AM. |
| debug_listener | bool | false | Enables additional debug logging for the listener task. |
| debug_responder | bool | false | Enables additional debug logging for the responder task. |
| debug_updater | bool | false | Enables additional debug logging for the updater task. |
| debug_command | bool | false | Enables additional debug logging for commands sent and responses received. |
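To illustrate how a params object is passed in practice, here is a minimal, untested sketch. The host address is a placeholder, and the constructor signature and the connect/update/shutdown calls are assumptions based on the notes elsewhere in this README (see the 0.1 breaking change below), not a definitive reference:

    import asyncio

    from aiopioneer import PioneerAVR  # import path assumed

    async def main():
        # "192.168.1.120" is a placeholder; 8102 is the AVR API port noted above.
        # Keys use the Python equivalents described in the NOTE above.
        avr = PioneerAVR(
            "192.168.1.120",
            8102,
            params={"command_delay": 0.2, "ignored_zones": ["3"]},
        )
        await avr.connect()    # assumed: opens the telnet session to the AVR
        await avr.update()     # assumed: polls the AVR for its current state
        await avr.shutdown()   # assumed: closes the session cleanly

    asyncio.run(main())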
Command line interface (CLI) (>= 0.1.3, CLI arguments >= 0.3)

A very simple command line interface aiopioneer is available to connect to the AVR, send commands and receive responses. It can be used to test the capabilities of the library against your specific AVR. On Home Assistant, you can run the CLI when the pioneer_async Home Assistant integration has been installed. On Home Assistant Supervised or Container, start the CLI from within the HA container: docker exec -it homeassistant aiopioneer.

Invoke the CLI with the following arguments:

| Argument | Default | Description |
| --- | --- | --- |
| hostname | required | hostname for AVR connection |
| -p, --port | 8102 | port for AVR connection |
| +Q, --no-query-device-info | None | skip AVR device info query |
| +Z, --no-query-zones | None | skip AVR zone query |

The CLI accepts all API commands, as well as the following:

| Command | Argument | Description |
| --- | --- | --- |
| exit or quit | | Exit the CLI. |
| zone | zone | Change current zone to zone. |
| log_level | log_level | Change debug level to log_level. Valid log levels are: debug, info, warning, error, critical. |
| update | | Request update of AVR. An update is scheduled in the updater task if a scan interval is set; if it is not set, then the update is performed synchronously. |
| update_full | | Request a full update of AVR irrespective of when the previous update was performed. An update is scheduled in the updater task if a scan interval is set; if it is not set, then the update is performed synchronously. |
| query_device_info | | Query the AVR for device information. |
| query_zones | | Query the AVR for available zones. Ignore zones specified in parameter ignored_zones (list). |
| build_source_dict | | Query the sources from the AVR. |
| set_source_dict | sources (JSON) | Manually set the sources to sources. |
| get_source_list | | Return the current set of available source names that can be used with the select_source command. |
| get_params | | Return the currently active set of parameters. |
| get_user_params | | Return the currently active set of user parameters. |
| set_user_params | params (JSON) | Set the user parameters to params. |
| get_tone | | Returns the current AVR tone attributes. |
| get_amp | | Returns the current AVR amp attributes. |
| get_tuner | | Returns the current AVR tuner attributes. |
| get_channel_levels | | Returns the current AVR channel levels. |
| get_dsp | | Returns the current AVR DSP attributes. |
| get_video | | Returns the current AVR video parameters. |
| get_audio | | Returns the current AVR audio parameters. |
| get_system | | Returns the AVR system attributes. |
| debug_listener | state (bool) | Enable/disable the debug_listener parameter. |
| debug_responder | state (bool) | Enable/disable the debug_responder parameter. |
| debug_updater | state (bool) | Enable/disable the debug_updater parameter. |
| debug_command | state (bool) | Enable/disable the debug_command parameter. |
| set_scan_interval | scan_interval (float) | Set the scan interval to scan_interval. |
| get_scan_interval | | Return the current scan interval. |
| set_volume_level | volume_level (int) | Set the volume level for the current zone. |
| select_source | source_name | Set the input source for the current zone. |
| set_tuner_frequency | band frequency | Set the tuner band and (optionally) frequency. |
| send_raw_command | raw_command | Send the raw command raw_command to the AVR. |

NOTE: The CLI interface may change in the future, and should not be used in scripts. Use the Python API instead.

Source list

| ID | Default Name |
| --- | --- |
| 25 | BD |
| 04 | DVD |
| 06 | SAT/CBL |
| 15 | DVR/BDR |
| 19 | HDMI 1 |
| 20 | HDMI 2 |
| 21 | HDMI 3 |
| 22 | HDMI 4 |
| 23 | HDMI 5 |
| 24 | HDMI 6 |
| 34 | HDMI 7 |
| 26 | NETWORK (cyclic) |
| 38 | INTERNET RADIO |
| 53 | Spotify |
| 41 | PANDORA |
| 44 | MEDIA SERVER |
| 45 | FAVORITES |
| 17 | iPod/USB |
| 05 | TV |
| 01 | CD |
| 13 | USB-DAC |
| 02 | TUNER |
| 00 | PHONO |
| 12 | MULTI CH IN |
| 33 | BT AUDIO |
| 31 | HDMI (cyclic) |
| 46 | AirPlay (Information only) |
| 47 | DMR (Information only) |

Known issues and future plans

- Document PioneerAVR API

Breaking changes

- 0.4: zone_z_sources was renamed hdzone_sources for even more consistency.
- 0.3: zone_h_sources was renamed zone_z_sources for consistency.
- 0.2: volume_step_delta has been removed entirely. By default, a number of additional queries are sent at module startup to the AVR to gather amp, tuner and channel levels attributes. If your AVR does not handle these additional queries well, they can be disabled by setting parameter disable_autoquery to true.
- 0.1: PioneerAVR.__init__() no longer accepts command_delay, volume_workaround and volume_steps arguments. Configure these parameters using the equivalent PARAM_* keys in the params dict, passed in via the constructor or set via set_user_params().

References

- Home Assistant Pioneer integration: https://www.home-assistant.io/integrations/pioneer/
- Pioneer commands references: https://github.com/rwifall/pioneer-receiver-notes
- Another asyncio Pioneer HA component: https://github.com/realthk/asyncpioneer
- Pioneer IP and serial IO control documentation: https://www.pioneerelectronics.com/PUSA/Support/Home-Entertainment-Custom-Install/RS-232+&+IP+Codes/A+V+Receivers
aiopipe | aiopipe -- Multiprocess communication pipes for asyncio

This package wraps the os.pipe simplex communication pipe so it can be used as part of the non-blocking asyncio event loop. A duplex pipe is also provided, which allows reading and writing on both ends.

Simplex example

The following example opens a pipe with the write end in the child process and the read end in the parent process.

    >>> from multiprocessing import Process
    >>> import asyncio
    >>>
    >>> from aiopipe import aiopipe
    >>>
    >>> async def main():
    ...     rx, tx = aiopipe()
    ...
    ...     with tx.detach() as tx:
    ...         proc = Process(target=childproc, args=(tx,))
    ...         proc.start()
    ...
    ...     # The write end is now available in the child process
    ...     # and detached from the parent process.
    ...
    ...     async with rx.open() as rx:
    ...         msg = await rx.readline()
    ...
    ...     proc.join()
    ...     return msg
    >>>
    >>> def childproc(tx):
    ...     asyncio.run(childtask(tx))
    >>>
    >>> async def childtask(tx):
    ...     async with tx.open() as tx:
    ...         tx.write(b"hi from the child process\n")
    >>>
    >>> asyncio.run(main())
    b'hi from the child process\n'
    >>>

Duplex example

The following example shows a parent and child process sharing a duplex pipe to exchange messages.

    >>> from multiprocessing import Process
    >>> import asyncio
    >>>
    >>> from aiopipe import aioduplex
    >>>
    >>> async def main():
    ...     mainpipe, chpipe = aioduplex()
    ...
    ...     with chpipe.detach() as chpipe:
    ...         proc = Process(target=childproc, args=(chpipe,))
    ...         proc.start()
    ...
    ...     # The second pipe is now available in the child process
    ...     # and detached from the parent process.
    ...
    ...     async with mainpipe.open() as (rx, tx):
    ...         req = await rx.read(5)
    ...         tx.write(req + b" world\n")
    ...         msg = await rx.readline()
    ...
    ...     proc.join()
    ...     return msg
    >>>
    >>> def childproc(pipe):
    ...     asyncio.run(childtask(pipe))
    >>>
    >>> async def childtask(pipe):
    ...     async with pipe.open() as (rx, tx):
    ...         tx.write(b"hello")
    ...         rep = await rx.readline()
    ...         tx.write(rep.upper())
    >>>
    >>> asyncio.run(main())
    b'HELLO WORLD\n'
    >>>

Installation

This package requires Python 3.7+ and can be installed with pip:

    pip install aiopipe
aio-pipe | AIO-PIPE
========

Real asynchronous file operations with asyncio support.

Status
------

Development - BETA

Features
--------

* aio-pipe is a helper for POSIX pipes.

Code examples
-------------

Useful example:

.. code-block:: python

    import asyncio
    from aio_pipe import AsyncPIPE

    async def main(loop):
        p = AsyncPIPE(loop)
        for _ in range(1):
            await p.write(b"foo" * 1000)
            await p.read(3000)
        p.close()

    loop = asyncio.get_event_loop()
    loop.run_until_complete(main(loop))

Write and read with helpers:

.. code-block:: python
aiopipes | UNKNOWN |
aio-piston | Aio-piston

This is an unofficial API wrapper for the piston code execution engine.

Examples

    import aio_piston
    import asyncio

    async def main():
        async with await aio_piston.Piston() as piston:
            # execute the code
            out = await piston.execute('print("hello world")', language="python")
            out2 = await piston.execute(
                'print(input("what is your name"))',
                language="python",
                inputs="bob",
            )
            return out, out2

    out, out2 = asyncio.run(main())
    print(str(out))          # full output
    print(vars(out).keys())  # all the attributes of the response class

*OR* without a context manager:

    piston = await aio_piston.Piston()
    ...  # do stuff
    # at the end
    await piston.close()
aiopixel | WIP |
aiopixiv | UNKNOWN |
aiopjlink | aiopjlink

A modern Python asyncio PJLink library (Class I and Class II).

What is PJLink?

Most projectors that have RJ45 ports on the back can be controlled via PJLink. PJLink is a communication protocol and unified standard for operating and controlling data projectors via TCP/IP, regardless of manufacturer. PJLink consists of Class 1 commands and queries, as well as Class 2 notifications and extensions. Class 1 is the most common type of PJLink, and is used for basic commands such as power on/off, input selection, and adjusting volume. Class 2 is an extended version of the protocol that supports additional commands such as opening and closing the projector's lens cover, and is typically used by more sophisticated devices.

What is aiopjlink?

A Python library that uses asyncio to talk to one or more projectors connected to a network using the PJLink protocol. It has these advantages:

- ✅ Clean modern asyncio API
- ✅ High level API abstraction (eg. lamp.hours)
- ✅ Pure Python 3 implementation (no dependencies)
- ✅ Full suite of test cases
- ✅ Context managers for keeping track of connections and resources
- ✅ High quality error handling

Usage

Each "connection" to a projector is managed through a PJLink context manager. Once this is connected, you access the different functions through a high level API (e.g. conn.power.turn_off(), conn.lamps.hours(), conn.errors.query(), etc); a further sketch of these query calls appears at the end of this entry. For example, create a PJLink connection to the projector and issue commands:

    async with PJLink(address="192.168.1.120", password="secretpassword") as link:
        # Turn on the projector.
        await link.power.turn_on()

        # Wait a few seconds, then print out all the error information.
        await asyncio.sleep(5)
        print("errors = ", await link.errors.query())

        # Then wait a few seconds, then turn the projector off.
        await asyncio.sleep(5)
        await link.power.turn_off()

Development

We use the PDM package manager.

    pdm install --dev   # install all deps required to run and test the code
    pdm run lint        # check code quality
    pdm run test        # check all test cases run OK
    pdm publish         # Publish the project to PyPI

Other notes:

- There are more "pdm scripts" in the .toml file.
- Set the env variable AIOPJLINK_PRINT_DEBUG_COMMS to print debug comms to the console.

Roadmap

Pull requests with test cases are welcome. There are still some things to finish, including:

- Search Protocol (§3.2)
- Status Notification Protocol (§3.3)
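As a further illustration of the high-level API summarised above, the sketch below combines the lamp and error queries. The address and password are placeholders, and the exact return shapes of lamps.hours() and errors.query() are assumptions based on the attribute names this README mentions:

    import asyncio

    from aiopjlink import PJLink

    async def main():
        # Placeholder address and password.
        async with PJLink(address="192.168.1.120", password="secretpassword") as link:
            # Both calls are named in the README's API summary; their return
            # values are printed as-is since their shape is not documented here.
            print("lamp hours =", await link.lamps.hours())
            print("errors =", await link.errors.query())

    asyncio.run(main())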
aiopki | No description available on PyPI. |
aioplemmy | AIOPlemmy: a Python package for accessing the Lemmy API asynchronously

AIOPlemmy allows you to interact with any Lemmy instance using Python and the LemmyHttp API.

WARNING: Plemmy is still in development and needs testing!

Installation

For the most up-to-date version of AIOPlemmy, clone and install from the repository:

    git clone https://github.com/schicksal-hq/plemmy-aio
    cd plemmy
    pip3 install .

Basic usage

Interact with a Lemmy instance using the LemmyHttp object:

    import aiohttp
    import asyncio
    import orjson

    from aioplemmy import LemmyHttp, responses

    async def main():
        # Unlike in original Plemmy, Plemmy-AIO accepts an aiohttp session from
        # outside and relies on the developer to set the base_url parameter
        sess = aiohttp.ClientSession(
            base_url="https://lemmy.ml",
            json_serialize=lambda x: str(orjson.dumps(x)),
        )
        lemmy = LemmyHttp(client=sess, key=None)  # login anonymously (as guest, no key)

        # The API requests are async :3
        resp = await lemmy.get_community(name="anime")
        resp = responses.GetCommunityResponse(resp)
        print(resp)

    asyncio.run(main())

Logging in:

    import aiohttp
    import asyncio
    import orjson

    from aioplemmy import LemmyHttp, responses

    async def main():
        sess = aiohttp.ClientSession(
            base_url="https://lemmy.ml",
            json_serialize=lambda x: str(orjson.dumps(x)),
        )

        # Use anonymous access to log in
        key = await LemmyHttp(client=sess, key=None).login("[email protected]", "j12345678")

        # And then create the LemmyHttp instance you'll actually use.
        # Of course, you can (and should) reuse the same aiohttp session.
        lemmy = LemmyHttp(client=sess, key=key)

        resp = await lemmy.get_community(name="anime")
        resp = responses.GetCommunityResponse(resp)
        print(resp)

    asyncio.run(main())

Catching errors:

    import aiohttp
    import asyncio
    import orjson

    from aioplemmy import LemmyHttp, LemmyError, responses

    async def main():
        sess = aiohttp.ClientSession(
            base_url="https://lemmy.ml",
            json_serialize=lambda x: str(orjson.dumps(x)),
        )
        lemmy = LemmyHttp(client=sess, key=None)

        try:
            resp = await lemmy.get_community(name="nonexistingcommunity")
            resp = responses.GetCommunityResponse(resp)
            print(resp)
        except LemmyError as e:
            # The error code will be in the `error` property.
            # Keep in mind that this property will be set if and only if
            # the Lemmy API could generate a response.
            #
            # Unexpected I/O and HTTP errors will trigger LemmyError, but will
            # not set the property.
            print(e.error)  # should print COULDNT_FIND_COMMUNITY

    asyncio.run(main())

Full documentation and further API refinements are on their way, but in the meantime you should check out the source code and examples from upstream. The method names are essentially the same after all (see the hedged sketch below), and you're unlikely to spot any difference if you just pipe LemmyHttp results to Response objects. The important differences are:

- aioplemmy has its LemmyHttp methods all async
- ...and the methods return objects parsed from the JSON response, not the response itself
- ...and methods occasionally throw LemmyError

Reporting issues, making contributions, etc.

Pull requests and issues are welcome. You're also welcome to submit them to the upstream repository, I will pull the fixes from there :)
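Since the method names mirror upstream Plemmy, a logged-in client should be able to call the write endpoints the same way. The create_post call and its parameter names below are assumptions extrapolated from the upstream API, not a documented aioplemmy signature:

    import aiohttp
    import asyncio
    import orjson

    from aioplemmy import LemmyHttp

    async def main():
        sess = aiohttp.ClientSession(
            base_url="https://lemmy.ml",
            json_serialize=lambda x: str(orjson.dumps(x)),
        )
        key = await LemmyHttp(client=sess, key=None).login("[email protected]", "j12345678")
        lemmy = LemmyHttp(client=sess, key=key)

        # Hypothetical call: name and parameters assumed from upstream Plemmy.
        await lemmy.create_post(community_id=1234, name="Hello from aioplemmy")

        await sess.close()

    asyncio.run(main())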
aioplisio | aioplisio - Asynchronous wrapper for Plisio API

The module is under development; the withdraw methods are currently unavailable.

Installation 💾

Installation using the pip package manager:

    $ pip install aioplisio

Installation from GitHub (requires git):

    $ git clone https://github.com/Fsoky/aioplisio
    $ cd aioplisio
    $ python setup.py install

Or:

    $ pip install git+https://github.com/Fsoky/aioplisio

Additionally:

Register on the Plisio website and get an API key. Official API documentation: PLISIO API DOCS

Usage examples:

Template

    import asyncio
    from aioplisio import AIOPlisioClient

    async def main() -> None:
        async with AIOPlisioClient("API-KEY") as plisio:
            ...

    if __name__ == "__main__":
        asyncio.run(main())

Getting transactions

    async with AIOPlisioClient("YOUR API-KEY") as plisio:
        transactions = await plisio.get.transactions()  # You can pass txnID to search by it
        print(transactions.data)

Invoices (receipts)

    async with AIOPlisioClient("YOUR API-KEY") as plisio:
        invoice = await plisio.invoice.create(
            "ORDER-NAME",
            12345001,  # Order number
            amount=10,  # 10 USDT
            currency="USDT",  # Crypto
            source_currency="USD",  # Fiat
            expire_min=15,
        )
        print(f"Your invoice: {invoice.data.invoice_url}")

        transaction = await plisio.get.transactions(invoice.data.txn_id)
        print(transaction.data.status)
        ...

And also other methods.
aiopluggy | Please read the docs to learn more!

A definitive example

    import aiopluggy, asyncio

    hookspec = aiopluggy.HookspecMarker("myproject")
    hookimpl = aiopluggy.HookimplMarker("myproject")


    class MySpec(object):
        """A hook specification namespace."""

        @hookspec
        def myhook(self, arg1, arg2):
            """My special little hook that you can customize."""


    class Plugin_1(object):
        """A hook implementation namespace."""

        @hookimpl.asyncio
        async def myhook(self, arg1, arg2):
            print("inside Plugin_1.myhook()")
            return arg1 + arg2


    class Plugin_2(object):
        """A 2nd hook implementation namespace."""

        @hookimpl
        def myhook(self, arg1, arg2):
            print("inside Plugin_2.myhook()")
            return arg1 - arg2


    async def main():
        # create a manager and add the spec
        pm = aiopluggy.PluginManager("myproject")
        pm.register_specs(MySpec)

        # register plugins
        await pm.register(Plugin_1())
        await pm.register(Plugin_2())

        # call our `myhook` hook
        results = await pm.hook.myhook(arg1=1, arg2=2)
        print(results)


    asyncio.get_event_loop().run_until_complete(main())
aiopm | No description available on PyPI. |
aiopo | No description available on PyPI. |
aiopogo | UNKNOWN |
aiopokeapi | AioPokéApi

An asynchronous API wrapper for the PokéApi.

🗝️ Key Features

- Use of modern Python keywords: async and await.
- Every object is fully type hinted.
- Objects get cached, which increases speed and avoids unnecessary API requests (see the timing sketch at the end of this entry).

🌍 Documentation

AioPokéApi has a very minimal website, which you can find here. It also has some documentation.

☄️ Installation

    pip install aiopokeapi

⚙️ Didn't work? Depending on your Python installation, you might need to use one of the following:

- Python is not in PATH

      path/to/python.exe -m pip install aiopokeapi

- Python is in PATH but pip is not

      python -m pip install aiopokeapi

- Unix systems can use pip3/python3 commands

      pip3 install aiopokeapi
      python3 -m pip install aiopokeapi

- Using multiple Python versions

      py -m pip install aiopokeapi

🚀 Getting started

Aiopoke's goal is to be simple and easy to use:

    import asyncio
    import aiopoke

    async def main():
        client = aiopoke.AiopokeClient()

        ability = await client.get_ability(1)
        generation = await ability.generation.fetch()

        await client.close()

    asyncio.run(main())

Or even better, using a context manager:

    # in main()
    async with aiopoke.AiopokeClient() as client:
        ability = await client.get_ability(1)
        generation = await ability.generation.fetch()
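Because the feature list highlights transparent caching, one hedged way to observe it is to time two identical calls. This sketch assumes the cache is in-memory and automatic, as the README implies, and uses only the get_ability call shown above:

    import asyncio
    import time

    import aiopoke

    async def main():
        async with aiopoke.AiopokeClient() as client:
            start = time.perf_counter()
            await client.get_ability(1)   # first call goes to the API
            mid = time.perf_counter()
            await client.get_ability(1)   # repeat call should hit the cache
            end = time.perf_counter()
            print(f"first: {mid - start:.3f}s, second: {end - mid:.3f}s")

    asyncio.run(main())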
aiopollen | No description available on PyPI.