package (string, lengths 1-122)
package-description (string, lengths 0-1.3M)
aio-mc-rcon
Aio-MC-RCON: An asynchronous RCON client/wrapper written in Python for Minecraft Java Edition servers!
Installation: pip install -U aio-mc-rcon
Example Usage: see the examples folder.
Documentation:
class aiomcrcon.Client(host: str, port: int, password: str)
  Arguments:
    host: str - the hostname / IP of the server to connect to.
    port: int - the port of the server to connect to.
    password: str - the password to connect, can be found as the value under rcon.password in the server.properties file.
  Methods:
    connect(timeout: int = 2) - where timeout has a default value of 2 seconds.
    send_cmd(cmd: str, timeout: int = 2) - where cmd is the command to be executed on the server and timeout has a default value of 2 seconds.
    close() - closes the connection between the client and server.
exception aiomcrcon.RCONConnectionError - raised when the connection to the server fails.
exception aiomcrcon.IncorrectPasswordError - raised when the provided password/authentication is invalid.
exception aiomcrcon.ClientNotConnectedError - raised when the connect() method hasn't been called yet and commands cannot be sent.
aiomcstats
Aiomcstats: Get information about Minecraft servers. Asyncpixel is an open source asynchronous python wrapper for the hypixel api with 100% of all endpoints.
Docs · Feature request · Report a bug · Support: Discussions & Discord
Features:
- Asynchronous. Unlike other libraries, Asyncpixel is fully asynchronous. This makes it perfect to use in your next discord bot or powerful website without having to worry about blocking.
- 100% API coverage. Asyncpixel is currently the only python library with full coverage of the hypixel API, meaning that no endpoints are left untouched and out of your reach.
- Pydantic models. All models are checked and validated by Pydantic, meaning that the data is always in the correct format, perfect for you to use.
- Available on PyPI. Asyncpixel is available on PyPI, meaning no building from source; just use pip install asyncpixel to use it in your project.
Getting Started with Asyncpixel: To get started, check out the documentation, which lives at readthedocs.
Installation: Use the package manager pip or your favourite tool to install asyncpixel:
pip install asyncpixel
Example:

import asyncpixel
import asyncio

async def main():
    hypixel = asyncpixel.Hypixel("hypixel_api_key")
    profile = await hypixel.profile("405dcf08b80f4e23b97d943ad93d14fd")
    print(profile)
    await hypixel.close()

asyncio.run(main())

Code of Conduct: Obsidion-dev is dedicated to providing a welcoming, diverse, and harassment-free experience for everyone. We expect everyone in the Obsidion-dev community to abide by our Code of Conduct. Please read it.
Contributing to Asyncpixel: From opening a bug report to creating a pull request, every contribution is appreciated and welcomed. If you're planning to implement a new feature or change the library, please create an issue first; this way we can ensure your work is not in vain. Not sure where to start? A good place to start contributing are the Good first issues.
License: Asyncpixel is open-source. The library is licensed GPL v3.
Get in touch: If you have a question or would like to talk with other Asyncpixel users, please hop over to Github discussions or join our Discord server (Discord chatroom).
Contributors: Thanks goes to these wonderful people (emoji key): AjayACST 🚧, Alex 💻🐛, Damian Grzanka 💻. This project follows the all-contributors specification. Contributions of any kind welcome!
aiomealie
Asynchronous Python client for Mealie.
About: This package allows you to fetch data from your Mealie instance.
Installation:
pip install aiomealie
Changelog & Releases: This repository keeps a change log using GitHub's releases functionality. The format of the log is based on Keep a Changelog. Releases are based on Semantic Versioning and use the format MAJOR.MINOR.PATCH. In a nutshell, the version will be incremented based on the following: MAJOR: incompatible or major changes; MINOR: backwards-compatible new features and enhancements; PATCH: backwards-compatible bugfixes and package updates.
Contributing: This is an active open-source project. We are always open to people who want to use the code or contribute to it. We've set up a separate document for our contribution guidelines. Thank you for being involved! :heart_eyes:
Setting up development environment: This Python project is fully managed using the Poetry dependency manager, but it also relies on NodeJS for certain checks during development. You need at least: Python 3.11+, Poetry, NodeJS 12+ (including NPM). To install all packages, including all development requirements:
npm install
poetry install
As this repository uses the pre-commit framework, all changes are linted and tested with each commit. You can run all checks and tests manually, using the following command:
poetry run pre-commit run --all-files
To run just the Python tests:
poetry run pytest
Authors & contributors: The content is by Joost Lekkerkerker. For a full list of all authors and contributors, check the contributor's page.
License: MIT License. Copyright (c) 2024 Joost Lekkerkerker. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
aiomeasures
aiomeasures squatted: This is an empty package that raises an exception when imported. It exists to prevent malicious use of this PyPI name. To install aiomeasures, do so from the source repo: https://github.com/cookkkie/aiomeasures
aiomeasures-fork
This library allows you to send metrics to your Datadog or StatsD server. It works on Python >= 3.3 and relies on asyncio.
Installation:
python -m pip install aiomeasures
Usage:

from aiomeasures import Datadog

client = Datadog('udp://127.0.0.1:6789')
client.incr('foo')
client.decr('bar', tags={'one': 'two'})
with client.timer('baz'):
    # long process
    pass

The client will send metrics to the agent as soon as possible.
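Under the hood, StatsD clients like this one serialize each metric into a small plain-text line and ship it over UDP. The sketch below shows the StatsD/DogStatsD line format that calls such as incr, decr, and timer ultimately produce; format_metric is a hypothetical helper written for illustration, not part of aiomeasures:

```python
def format_metric(name, value, metric_type, tags=None):
    """Render one StatsD metric line; tags use the DogStatsD `|#k:v` suffix.

    `format_metric` is an illustrative helper, not an aiomeasures API.
    """
    line = f"{name}:{value}|{metric_type}"
    if tags:
        # DogStatsD extension: comma-separated key:value tags after `|#`
        line += "|#" + ",".join(f"{k}:{v}" for k, v in sorted(tags.items()))
    return line

print(format_metric("foo", 1, "c"))                        # counter increment
print(format_metric("bar", -1, "c", tags={"one": "two"}))  # tagged counter decrement
print(format_metric("baz", 320, "ms"))                     # timer, in milliseconds
```

Because the protocol is fire-and-forget UDP text, a client can emit these lines without awaiting a response, which is why such libraries add negligible latency.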
aiomediawiki
A simple asyncio mediawiki client.
aiomega
Failed to fetch description. HTTP Status Code: 404
aiomeilisearch
AioMeiliSearch: The MeiliSearch API asyncio client for Python.
Introduction: AioMeilisearch is an asynchronous python client for MeiliSearch. MeiliSearch is an open source, blazingly fast and hyper relevant search engine. For more information about its features, see its documentation.
Summary: MeiliSearch's API is a standard RESTful HTTP API, so it comes with SDKs for all kinds of programming languages. The official python client is called meilisearch-python; it does a good job, built on a great python http library called requests. async/await is a feature of python 3.5+, and in some cases we need to support it, so here we are. AioMeilisearch supports async/await and is written with aiohttp. Any python api in the MeiliSearch official documentation has an awaitable version in AioMeilisearch.
Feature: async/await
Requirements: python 3.6+, aiohttp
Installation:
pip3 install aiomeilisearch
Usage:
Index:

import aiomeilisearch

client = aiomeilisearch.Client('http://127.0.0.1:7700', 'masterKey')

async def create_index():
    # create an index, with primaryKey "id"
    # An index is where the documents are stored.
    return await client.create_index('movies', {'primaryKey': 'id'})

index = await create_index()

Documents:

async def add_documents():
    # add documents to the index
    documents = [
        {'id': 1, 'title': 'Carol', 'genres': ['Romance', 'Drama']},
        {'id': 2, 'title': 'Wonder Woman', 'genres': ['Action', 'Adventure']},
        {'id': 3, 'title': 'Life of Pi', 'genres': ['Adventure', 'Drama']},
        {'id': 4, 'title': 'Mad Max: Fury Road', 'genres': ['Adventure', 'Science Fiction']},
        {'id': 5, 'title': 'Moana', 'genres': ['Fantasy', 'Action']},
        {'id': 6, 'title': 'Philadelphia', 'genres': ['Drama']},
    ]
    await index.add_documents(documents)

Get a document:

await index.get_document(1)

Search documents:

await client.index("movies").search('飞天大盗')
await client.index("movies").search('The Great Gatsby', filter=["is_tv=True", ["year=1925", "year=2013"]])

Settings:

await client.index('movies').get_settings()
await client.index('movies').update_displayed_attributes(["id", 'title', 'year'])
await client.index('movies').update_filterable_attributes(["year", 'is_tv'])
await client.index('movies').update_searchable_attributes(['title', 'original_title'])

Demos: https://github.com/ziyoubaba/aiomeilisearch/tree/main/demos
Documentation: MeiliSearch documents all the python apis; don't forget 'await'. AioMeilisearch docs: maybe later...
License: Under MIT license.
Changelog: version 1.0.0, based on version v0.23.0 of MeiliSearch.
Contributions welcome.
Source code: The latest developer version is available in a GitHub repository: https://github.com/ziyoubaba/aiomeilisearch
aio-meilisearch
AIO_MEILISEARCH: Async wrapper over the Meilisearch REST API with type hints.
pip install aio_meilisearch
Usage:

from typing import TypedDict, List, Optional

import httpx
from aio_meilisearch import (
    MeiliSearch,
    MeiliConfig,
    Index,
    SearchResponse,
)

class MovieDict(TypedDict):
    id: str
    name: str
    genres: List[str]
    url: str
    year: int

http = httpx.AsyncClient()

meilisearch = MeiliSearch(
    meili_config=MeiliConfig(
        base_url='http://localhost:7700',
        private_key='PRIVATE_KEY',
        public_key='PUBLIC_KEY',
    ),
    http_client=http,
)

index: Index[MovieDict] = await meilisearch.create_index(name="movies", pk="id")

await index.update_settings({
    "searchableAttributes": ["name", "genres"],
    "displayedAttributes": ["name", "genres", "id", "url", "year"],
    "attributesForFaceting": ["genres", "year"],
})

movie_list: List[MovieDict] = [
    {
        "name": "Oblivion",
        "genres": ["action", "adventure", "sci-fi"],
        "id": "tt1483013",
        "url": "https://www.imdb.com/title/tt1483013/",
        "year": 2013,
    }
]

await index.documents.add_many(movie_list)

response: SearchResponse[MovieDict] = await index.documents.search(query="action")

Contributing: Prerequisites are poetry, nox and nox-poetry. Install them on your system:
pip install poetry nox nox-poetry
Run tests:
nox
aiomemcache
memcached client for asyncio. aiomemcache is a minimal, pure python client for memcached, kestrel, etc.
Requirements: Python >= 3.3; asyncio (https://pypi.python.org/pypi/asyncio/)
Getting started: The API looks very similar to the other memcache clients:

import aiomemcache

mc = aiomemcache.Client("127.0.0.1", 11211, connect_timeout=5)

yield from mc.set(b"some_key", b"Some value")
value = yield from mc.get(b"some_key")
yield from mc.delete(b"another_key")

Changes: 0.1 (02-04-2014): initial release.
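The get/set/delete surface above can be experimented with even without a running memcached server. The following is an illustrative in-memory stand-in (not part of aiomemcache) that mirrors that surface with modern await syntax; FakeMemcache is a hypothetical class written for this sketch:

```python
import asyncio

class FakeMemcache:
    """In-memory stand-in mirroring the get/set/delete calls shown above.

    Illustrative only: lets you prototype cache logic without memcached.
    """
    def __init__(self):
        self._data = {}

    async def set(self, key, value):
        self._data[key] = value

    async def get(self, key, default=None):
        return self._data.get(key, default)

    async def delete(self, key):
        self._data.pop(key, None)

async def main():
    mc = FakeMemcache()
    await mc.set(b"some_key", b"Some value")
    value = await mc.get(b"some_key")
    await mc.delete(b"another_key")  # deleting a missing key is a no-op here
    return value

print(asyncio.run(main()))
```

A stand-in like this is handy in unit tests, where spinning up a real memcached instance would be overkill.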
aiomemoize
Memoize asyncio Python calls. Invalidation is manual/explicit for each set of arguments, although exceptions raised are not cached. This can be used for coroutines, and functions that return a promise.
Installation:
pip install aiomemoize
Usage: For a coroutine whose arguments are hashable, you can create a memoized version by passing it to memoize. This returns a tuple of the memoized function, and a function to invalidate the cache on a per-item basis. For example, the below

import asyncio
from aiomemoize import memoize

async def main():
    memoized, invalidate = memoize(coro)
    results = await asyncio.gather(*[
        memoized('a'),
        memoized('a'),
        memoized('b'),
    ])
    print(results)

    invalidate('a')
    results = await asyncio.gather(*[
        memoized('a'),
        memoized('a'),
        memoized('b'),
    ])
    print(results)

    await memoized('a')

async def coro(value):
    print('Inside coro', value)
    await asyncio.sleep(1)
    return value

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
loop.close()

will output

Inside coro a
Inside coro b
['a', 'a', 'b']
Inside coro a
['a', 'a', 'b']
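The cache-with-explicit-invalidation behaviour described above can be approximated with the standard library alone. This is an illustrative sketch, not the library's implementation; notably it does not deduplicate concurrent in-flight calls the way the library's output above shows:

```python
import asyncio

def memoize_sketch(coro_fn):
    """Sketch of memoization with manual invalidation for async calls.

    Results are cached per argument tuple; exceptions propagate and are
    never cached, matching the behaviour described above.
    """
    cache = {}

    async def memoized(*args):
        if args not in cache:
            cache[args] = await coro_fn(*args)  # raises before caching on error
        return cache[args]

    def invalidate(*args):
        cache.pop(args, None)  # per-item, explicit invalidation

    return memoized, invalidate

calls = []

async def coro(value):
    calls.append(value)  # record each real execution
    return value

async def main():
    calls.clear()
    memoized, invalidate = memoize_sketch(coro)
    await asyncio.gather(memoized('a'), memoized('b'))
    await memoized('a')   # served from the cache, coro not re-run
    invalidate('a')
    await memoized('a')   # runs coro again after invalidation
    return calls

print(asyncio.run(main()))  # ['a', 'b', 'a']
```

The invalidate function closes over the same dict as memoized, which is why the library returns the pair as a tuple rather than exposing the cache directly.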
aiomemoizeconcurrent
Memoize concurrent asyncio Python coroutine calls. This offers short-lived memoization: for any given set of arguments, the cache lasts only for the length of a single call.
Installation:
pip install aiomemoizeconcurrent
Usage: For a coroutine whose arguments are hashable, you can create a memoized version by passing it to memoize_concurrent. Any concurrent calls to this version that have the same arguments will result in only a single run of the original coroutine. For example, creating 3 concurrent invocations of a coroutine where 2 of them have identical arguments

import asyncio
from aiomemoizeconcurrent import memoize_concurrent

async def main():
    memoized_coro = memoize_concurrent(coro)
    results = await asyncio.gather(*[
        memoized_coro('a'),
        memoized_coro('a'),
        memoized_coro('b'),
    ])
    print(results)

    await memoized_coro('a')

async def coro(value):
    print('Inside coro', value)
    await asyncio.sleep(1)
    return value

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
loop.close()

will only run coro twice, as shown by the output

Inside coro a
Inside coro b
['a', 'a', 'b']

Use cases: This can be used to memoize a function making calls to an API, especially if
- you expect many concurrent calls;
- identical concurrent calls are idempotent;
- there are enough such calls that are identical to justify such a caching layer.
It can also be used to avoid concurrency edge cases/race conditions with multiple tasks accessing shared resources. For example, multiple tasks may need to dynamically create shared UDP sockets. To ensure that this dynamic generation isn't called by multiple tasks at the same time for the same address, it can be wrapped with memoize_concurrent. The function memoize_concurrent works with both coroutines, and functions that return a future.
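The short-lived memoization described above can be sketched with the standard library: concurrent calls with the same arguments share one in-flight task, and the cache entry is dropped the moment that call completes. This is an illustrative sketch, not the library's own code:

```python
import asyncio

def memoize_concurrent_sketch(coro_fn):
    """Sketch of call-lifetime memoization for concurrent coroutine calls."""
    in_flight = {}

    async def runner(*args):
        try:
            return await coro_fn(*args)
        finally:
            del in_flight[args]  # cache lives only for the call's duration

    async def memoized(*args):
        if args not in in_flight:
            # first caller starts the task; later concurrent callers share it
            in_flight[args] = asyncio.ensure_future(runner(*args))
        return await in_flight[args]

    return memoized

calls = []

async def coro(value):
    calls.append(value)  # record how many times the real coroutine runs
    await asyncio.sleep(0.01)
    return value

async def main():
    calls.clear()
    memoized = memoize_concurrent_sketch(coro)
    results = await asyncio.gather(memoized('a'), memoized('a'), memoized('b'))
    await memoized('a')  # previous call has finished, so coro runs again
    return results

print(asyncio.run(main()))  # ['a', 'a', 'b']
```

Because awaiting an already-completed task simply returns its stored result, every concurrent caller gets the same value even though the dict entry is removed in the task's finally block.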
aiomemoizettl
Memoize asyncio Python calls with a per-result TTL.
Installation:
pip install aiomemoizettl
Usage: For a coroutine whose arguments are hashable, you can create a memoized version by passing it to memoize_ttl, along with a function that converts its return value to a TTL. For example, the below

import asyncio
from aiomemoizettl import memoize_ttl

async def main():
    memoized = memoize_ttl(coro, get_ttl=lambda result: result['ttl'])
    results = await asyncio.gather(*[
        memoized(1),
        memoized(2),
    ])

    await asyncio.sleep(1)
    results = await asyncio.gather(*[
        memoized(1),
        memoized(2),  # Will use the cached value of `coro(2)`
    ])

async def coro(value):
    print('Inside coro', value)
    return {'ttl': value, 'some-other': 'data'}

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
loop.close()

will output

Inside coro 1
Inside coro 2
Inside coro 1
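Per-result TTL memoization can be sketched in the standard library by storing an expiry timestamp next to each cached value, computed by the user-supplied get_ttl callback. An illustrative sketch under those assumptions (not the library's implementation):

```python
import asyncio
import time

def memoize_ttl_sketch(coro_fn, get_ttl):
    """Sketch of TTL memoization where each result chooses its own lifetime."""
    cache = {}  # args -> (expires_at, value)

    async def memoized(*args):
        now = time.monotonic()
        if args in cache and cache[args][0] > now:
            return cache[args][1]        # still fresh: serve from cache
        value = await coro_fn(*args)
        cache[args] = (now + get_ttl(value), value)  # expiry from the result
        return value

    return memoized

calls = []

async def coro(value):
    calls.append(value)  # record each real execution
    return {'ttl': value, 'some-other': 'data'}

async def main():
    calls.clear()
    memoized = memoize_ttl_sketch(coro, get_ttl=lambda result: result['ttl'])
    await asyncio.gather(memoized(0.05), memoized(10))
    await asyncio.sleep(0.1)
    # 0.05s TTL has expired, the 10s one has not:
    await asyncio.gather(memoized(0.05), memoized(10))
    return calls

print(asyncio.run(main()))  # [0.05, 10, 0.05]
```

Deriving the TTL from the result itself is what distinguishes this from a fixed-TTL cache; it suits APIs whose responses carry their own expiry, such as tokens with an expires_in field.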
aiomes
AIOMES is an asynchronous Python library, written with aiohttp and async_playwright, providing easy access to the MES (Moscow Electronic School) service.
Note: the library only works with a student profile.
Installation:
pip install aiomes
playwright install firefox
Library methods: login with username/password; login with a token; get the schedule; get the short schedule; get the vacation schedule; get homework; get marks; get final marks; get previous years' final marks; get school info; get the school canteen menu; get the school buffet menu; get attendance; get notifications; get the class ranking; get documents; get the subject list.
Methods:
Authorization with username and password:

from playwright.async_api import async_playwright
import asyncio
import aiomes

LOGIN = ...
PASSWORD = ...

async def main():
    async with async_playwright() as p:
        auth = await aiomes.AUTH(p)
        token = await auth.obtain_token(LOGIN, PASSWORD)
        if token == '2FA_NEEDED':
            sms_code = str(input())  # your own logic for obtaining the 2FA code
            token = await auth.proceed_2fa(sms_code)
        user = await aiomes.Client(token)

asyncio.run(main())

Authorization with a token:

import asyncio
import aiomes

TOKEN = ...

async def main():
    user = await aiomes.Client(TOKEN)

asyncio.run(main())

Get the schedule:

schedule = await user.get_schedule()
for subject in schedule:
    print(f"{subject.name}, {subject.start_time}-{subject.end_time}, room {subject.room_number}; {subject.marks}")

Get the short schedule:

today = date.today()
short_schedule = await user.get_schedule_short([today, today + timedelta(1), ...])
for subject in short_schedule:
    print(f"{subject.name}, {subject.start_time}-{subject.end_time}")

Get the vacation schedule:

periods_schedule = await user.get_periods_schedule()
for period in periods_schedule:
    print(f"{period.name}: {period.starts} - {period.ends}")

Get homework:

homework = await user.get_homeworks(from_date=date.today(), to_date=date.today())
for hw in homework:
    print(hw.hw_date.strftime('%d/%m'))
    print(f"{hw.subject_name}: {hw.description}; {hw.attached_files}, {hw.attached_tests}")
    print("-" * 15)

Get marks:

marks = await user.get_marks(from_date=date.today() - timedelta(7), to_date=date.today())
for mark in sorted(marks, key=lambda x: x.mark_date):
    print(f"{mark.subject_name}: {mark.value} [{mark.weight}] - {mark.reason}")

Get marks for a period:

period_marks = await user.get_period_marks(year_id=user.class_level, period_id=0)  # current class, first period
for per_mark in period_marks:
    print(f"{per_mark.subject_name} - {per_mark.average_mark}, {per_mark.final_mark}; {per_mark.marks}")

Get final marks for previous years:

past_marks = await user.get_past_final_marks(class_number=9)  # class number
for past_mark in past_marks:
    print(f"{past_mark.subject_name} - {past_mark.final_mark}")

Get school information:

school = await user.get_school_info()
print(school.name, school.address)
print(school.site, school.email)
print(school.principal)

Get the school canteen menu:

today = date.today()
menu = await user.get_menu(today)
for item in menu:
    print(f'{item.title} - {item.price}:')
    for meal in item.composition:
        print(f'- {meal.name}, {meal.ingredients}, {meal.calories}')
    print('-' * 35)

Get the school buffet menu:

today = date.today()
buffet_menu = await user.get_menu_buffet(today)
for item in buffet_menu:
    print(item.name, item.full_name, item.price, item.is_available)

Get attendance:

today = date.today()
attendance = await user.get_visits(from_date=today - timedelta(7), to_date=today)
for day in attendance:
    print(f"{day.visit_date}: {day.in_time}-{day.out_time}, {day.duration}")

Get all notifications:

notifications = await user.get_notifications()
for n in notifications:
    n_date = datetime.strftime(n.event_date, '%d/%m')
    print(f'{n_date}, {n.event_name} [{n.mark_value}], [{n.hw_description}]')

Get the class ranking:

today = date.today()
ranking = await user.get_class_rank(date_from=today - timedelta(7), date_to=today)
for day in ranking:
    print(f"{day.rank_date}, {day.place}")

Get the student's documents:

docs = await user.get_docs()
for doc in docs:
    print(f"{doc.type_id}/{doc.series}, {doc.number}/{doc.issuer}, {doc.issue_date}")

Get the subject list:

subjects = await user.get_subjects()
print(", ".join(subjects))

[Asynchronous Input Output Moscow Electronic School]
aio-message-handler
No description available on PyPI.
aiomessaging
No description available on PyPI.
aiomessenger
Home-page: https://github.com/stegben/aiomessenger
Author: cph
Author-email: [email protected]
License: UNKNOWN
Description: # aiomessenger
[![travis-ci][travis-image]][travis-url] [![pypi-version][pypi-image]][pypi-url] [![codecov][codecov-image]][codecov-url]
[travis-image]: https://travis-ci.org/stegben/aiomessenger.svg?branch=master
[travis-url]: https://travis-ci.org/stegben/aiomessenger
[pypi-image]: https://badge.fury.io/py/aiomessenger.svg
[pypi-url]: https://badge.fury.io/py/aiomessenger
[codecov-image]: https://codecov.io/gh/stegben/aiomessenger/branch/master/graph/badge.svg
[codecov-url]: https://codecov.io/gh/stegben/aiomessenger
Platform: UNKNOWN
Classifier: Programming Language :: Python :: 3.6
aiometer
aiometer is a concurrency scheduling library compatible with asyncio and trio and inspired by Trimeter. It makes it easier to execute lots of tasks concurrently while controlling concurrency limits (i.e. applying backpressure) and collecting results in a predictable manner.
Contents: Example, Features, Installation, Usage (Flow control, Running tasks), How To, API Reference, Contributing, License.
Example: Let's use HTTPX to make web requests concurrently... Try this code interactively using IPython.

>>> import asyncio
>>> import functools
>>> import random
>>> import aiometer
>>> import httpx
>>>
>>> client = httpx.AsyncClient()
>>>
>>> async def fetch(client, request):
...     response = await client.send(request)
...     # Simulate extra processing...
...     await asyncio.sleep(2 * random.random())
...     return response.json()["json"]
...
>>> requests = [
...     httpx.Request("POST", "https://httpbin.org/anything", json={"index": index})
...     for index in range(100)
... ]
...
>>> # Send requests, and process responses as they're made available:
>>> async with aiometer.amap(
...     functools.partial(fetch, client),
...     requests,
...     max_at_once=10,  # Limit maximum number of concurrently running tasks.
...     max_per_second=5,  # Limit request rate to not overload the server.
... ) as results:
...     async for data in results:
...         print(data)
...
{'index': 3}
{'index': 4}
{'index': 1}
{'index': 2}
{'index': 0}
...
>>> # Alternatively, fetch and aggregate responses into an (ordered) list...
>>> jobs = [functools.partial(fetch, client, request) for request in requests]
>>> results = await aiometer.run_all(jobs, max_at_once=10, max_per_second=5)
>>> results
[{'index': 0}, {'index': 1}, {'index': 2}, {'index': 3}, {'index': 4}, ...]

Installation: This project is in beta and maturing. Be sure to pin any dependencies to the latest minor.

pip install "aiometer==0.5.*"

Features: Concurrency management and throttling helpers; asyncio and trio support; fully type annotated; 100% test coverage.
Usage:
Flow control: The key highlight of aiometer is allowing you to apply flow control strategies in order to limit the degree of concurrency of your programs. There are two knobs you can play with to fine-tune concurrency:
- max_at_once: this is used to limit the maximum number of concurrently running tasks at any given time. (If you have 100 tasks and set max_at_once=10, then aiometer will ensure that no more than 10 run at the same time.)
- max_per_second: this option limits the number of tasks spawned per second. This is useful to not overload I/O resources, such as servers that may have a rate limiting policy in place.
Example usage:

>>> import asyncio
>>> import aiometer
>>> async def make_query(query):
...     await asyncio.sleep(0.05)  # Simulate a database request.
...
>>> queries = ['SELECT * from authors'] * 1000
>>> # Allow at most 5 queries to run concurrently at any given time:
>>> await aiometer.run_on_each(make_query, queries, max_at_once=5)
...
>>> # Make at most 10 queries per second:
>>> await aiometer.run_on_each(make_query, queries, max_per_second=10)
...
>>> # Run at most 10 concurrent jobs, spawning new ones at least every 5 seconds:
>>> async def job(id):
...     await asyncio.sleep(10)  # A very long task.
...
>>> await aiometer.run_on_each(job, range(100), max_at_once=10, max_per_second=0.2)

Running tasks: aiometer provides 4 different ways to run tasks concurrently in the form of 4 different run functions. Each function accepts all the options documented in Flow control, and runs tasks in a slightly different way, allowing to address a variety of use cases.
Here's a handy table for reference (see also the API Reference):

Entrypoint      Use case
run_on_each()   Execute async callbacks in any order.
run_all()       Return results as an ordered list.
amap()          Iterate over results as they become available.
run_any()       Return result of first completed function.

To illustrate the behavior of each run function, let's first set up a hello world async program:

>>> import asyncio
>>> import random
>>> from functools import partial
>>> import aiometer
>>>
>>> async def get_greeting(name):
...     await asyncio.sleep(random.random())  # Simulate I/O
...     return f"Hello, {name}"
...
>>> async def greet(name):
...     greeting = await get_greeting(name)
...     print(greeting)
...
>>> names = ["Robert", "Carmen", "Lucas"]

Let's start with run_on_each(). It executes an async function once for each item in a list passed as argument:

>>> await aiometer.run_on_each(greet, names)
'Hello, Robert!'
'Hello, Lucas!'
'Hello, Carmen!'

If we'd like to get the list of greetings in the same order as names, in a fashion similar to Promise.all(), we can use run_all():

>>> await aiometer.run_all([partial(get_greeting, name) for name in names])
['Hello, Robert', 'Hello, Carmen!', 'Hello, Lucas!']

amap() allows us to process each greeting as it becomes available (which means maintaining order is not guaranteed):

>>> async with aiometer.amap(get_greeting, names) as greetings:
...     async for greeting in greetings:
...         print(greeting)
'Hello, Lucas!'
'Hello, Robert!'
'Hello, Carmen!'

Lastly, run_any() can be used to run async functions until the first one completes, similarly to Promise.any():

>>> await aiometer.run_any([partial(get_greeting, name) for name in names])
'Hello, Carmen!'

As a last fun example, let's use amap() to implement a no-threads async version of sleep sort:

>>> import asyncio
>>> from functools import partial
>>> import aiometer
>>> numbers = [0.3, 0.1, 0.6, 0.2, 0.7, 0.5, 0.5, 0.2]
>>> async def process(n):
...     await asyncio.sleep(n)
...     return n
...
>>> async with aiometer.amap(process, numbers) as results:
...     sorted_numbers = [n async for n in results]
...
>>> sorted_numbers
[0.1, 0.2, 0.2, 0.3, 0.5, 0.5, 0.6, 0.7]

How To:
Multiple parametrized values in run_on_each and amap: run_on_each and amap only accept functions that accept a single positional argument (i.e. (Any) -> Awaitable). So if you have a function that is parametrized by multiple values, you should refactor it to match this form. This can generally be achieved like this:
1. Build a proxy container type (e.g. a namedtuple), e.g. T.
2. Refactor your function so that its signature is now (T) -> Awaitable.
3. Build a list of these proxy containers, and pass it to aiometer.
For example, assuming you have a function that processes X/Y coordinates...

async def process(x: float, y: float) -> None:
    ...

xs = list(range(100))
ys = list(range(100))

for x, y in zip(xs, ys):
    await process(x, y)

You could use it with amap by refactoring it like this:

from typing import NamedTuple

# Proxy container type:
class Point(NamedTuple):
    x: float
    y: float

# Rewrite to accept a proxy as a single positional argument:
async def process(point: Point) -> None:
    x = point.x
    y = point.y
    ...

xs = list(range(100))
ys = list(range(100))

# Build a list of proxy containers:
points = [Point(x, y) for x, y in zip(xs, ys)]

# Use it:
async with aiometer.amap(process, points) as results:
    ...

API Reference:
Common options:
- max_at_once (Optional, int): the maximum number of concurrently running tasks at any given time.
- max_per_second (Optional, int): the maximum number of tasks spawned per second.
aiometer.run_on_each()
Signature: async aiometer.run_on_each(async_fn, args, *, max_at_once=None, max_per_second=None) -> None
Concurrently run the equivalent of async_fn(arg) for arg in args. Does not return any value.
To get return values back, use aiometer.run_all().
aiometer.run_all()
Signature: async aiometer.run_all(async_fns, max_at_once=None, max_per_second=None) -> list
Concurrently run the async_fns functions, and return the list of results in the same order.
aiometer.amap()
Signature: async aiometer.amap(async_fn, args, max_at_once=None, max_per_second=None) -> async iterator
Concurrently run the equivalent of async_fn(arg) for arg in args, and return an async iterator that yields results as they become available.
aiometer.run_any()
Signature: async aiometer.run_any(async_fns, max_at_once=None, max_per_second=None) -> Any
Concurrently run the async_fns functions, and return the first available result.
Contributing: See CONTRIBUTING.md.
License: MIT
Changelog: All notable changes to this project will be documented in this file. The format is based on Keep a Changelog.
0.5.0 - 2023-12-11
Removed: Drop support for Python 3.7, as it has reached EOL. (Pull #44)
Added: Add official support for Python 3.12. (Pull #44) Add support for anyio 4. This allows catching exception groups using the native ExceptionGroup. On anyio 3.2+, anyio would throw its own ExceptionGroup type. Compatibility with anyio 3.2+ is retained. (Pull #43)
0.4.0 - 2023-01-18
Removed: Drop support for Python 3.6, which has reached EOL. (Pull #38)
Added: Add official support for Python 3.10 and 3.11. (Pull #38)
Fixed: Relax version requirements for typing_extensions and address mypy>=0.981 strict optional changes. (Pull #38)
0.3.0 - 2021-07-06
Changed: Update anyio dependency to v3 (previously v1). (Pull #25) NB: no API change, but dependency mismatches may occur. Be sure to port your codebase to anyio v3 before upgrading aiometer.
Added: Add support for Python 3.6 (installs the contextlib2 backport library there). (Pull #26) Officialize support for Python 3.9. (Pull #26)
0.2.1 - 2020-03-26
Fixed: Improve robustness of the max_per_second implementation by using the generic cell rate algorithm (GCRA) instead of leaky bucket. (Pull #5)
0.2.0 - 2020-03-22
Added: Add support for Python 3.7. (Pull #3)
0.1.0 - 2020-03-21
Added: Add run_on_each(), run_all(), amap() and run_any(), with max_at_once and max_per_second options. (Pull #1)
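The max_at_once knob described above amounts to capping concurrency, which can be sketched with nothing but an asyncio semaphore. This is an illustrative stdlib sketch of the concurrency cap only, not aiometer's implementation (aiometer is built on anyio and, per its changelog, implements max_per_second with GCRA rather than a semaphore):

```python
import asyncio

async def run_on_each_sketch(async_fn, args, max_at_once):
    """Run async_fn(arg) for every arg, never exceeding max_at_once at a time.

    Returns the observed peak concurrency so the cap can be verified.
    """
    semaphore = asyncio.Semaphore(max_at_once)
    running = 0
    peak = 0

    async def guarded(arg):
        nonlocal running, peak
        async with semaphore:        # blocks while max_at_once tasks hold it
            running += 1
            peak = max(peak, running)
            try:
                await async_fn(arg)
            finally:
                running -= 1

    await asyncio.gather(*(guarded(arg) for arg in args))
    return peak

async def make_query(query):
    await asyncio.sleep(0.01)  # simulate a database request

peak = asyncio.run(run_on_each_sketch(make_query, ['SELECT 1'] * 20, max_at_once=5))
print(peak)  # never exceeds 5
```

All 20 tasks are created up front, but the semaphore ensures only 5 bodies run at once; this is the backpressure idea the library generalizes with its run functions.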
aiometrics
No description available on PyPI.
aio-microservice
AIO Microservice: a library to create microservices. It can be installed with pip:
pip install aio-microservice
aiomigrate
aiomigrate: an asyncio raw SQL migration tool.
aiomigrator
Failed to fetch description. HTTP Status Code: 404
aiomihome
Asyncio version of https://github.com/Danielhiversen/PyXiaomiGateway
Description: The API only includes a thin layer, without full handling of devices or sensors in separate classes. The library consists of two classes:
- Service: manages the multicast socket interface to the Xiaomi bridge and connects to gateways. Includes a service for discovery of gateways.
- Gateway: handles the connection to the gateway.
The bridge needs to be in local mode, and a key needs to be provided to communicate with the gateway.
Usage: Create a new service object. The gateway config key is required to be able to communicate with the gateway. If discovery is used and only one gateway exists, only the key is needed; if there are several, provide sid or host. See the example folder for runnable code.
Connect to a gateway:
Run auto discovery:

gateways_config = [{"host": "10.0.4.104", "sid": "7811dcb07917", "port": 9898, "key": key}]
service = XiaomiService(gateways_config=gateways_config)

# Start the multicast socket listener
await service.listen()

# Run the auto discovery
gateways = await service.discover()
print("Number of gateways found: {}".format(len(gateways)))

# Get the first gateway
gateway = gateways[0]

Directly connect to a gateway:

service = XiaomiService()

# Start the multicast socket listener
await service.listen()

# Create a gateway connection
gateway = await service.add_gateway("10.0.4.104", 9898, "7811dcb07917", key)

Close connections:

await service.close()

Listeners:

async def heartbeat_callback(data):
    print("HEARTBEAT RECEIVED", data)

async def device_callback(data):
    print("Device data RECEIVED", data)

gateway.heartbeat_callback = heartbeat_callback
gateway.device_callback = device_callback

Interact with the gateway:
Print devices:

for device_type, devices in gateway.devices.items():
    print(device_type, len(devices))
    for device in devices:
        print("  ", device['model'])
        print(device)
        for value_key, value in device['data'].items():
            print("    ", value_key, value)

Set the gateway light:

print("Turn on light")

# Red
await gateway.set_color(255, 0, 0)
await asyncio.sleep(1)

# Green
await gateway.set_color(0, 255, 0)
await asyncio.sleep(1)

# Blue
await gateway.set_color(0, 0, 255)
await asyncio.sleep(1)

# Off
await gateway.turn_off_light()
aiomirai
Async Mirai SDK for Python is a Python SDK with async I/O for Mirai's Mirai API HTTP plugin. Inspired by CQHTTP Python Async SDK and Mirai Framework for Python.
Installation:
pip install aiomirai
Documentation: see https://aiomirai.solariar.tech
Acknowledgements: These projects are also great; visit their project pages and give them a star to encourage their development work, since without them there would be no aiomirai.
Special thanks to mamoe for these wonderful projects:
- mirai: i.e. mirai-core, a high-performance, highly extensible QQ protocol library, and also a great bot development framework!
- mirai-console: a plugin-based, extensible development platform built on mirai; most of our development work is done on it, and the development agility it brings deserves praise.
- mirai-api-http: the mirai-console plugin that provides the HTTP interface for that project; the source of everything.
Special thanks to NatriumLab for these wonderful projects:
- python-mirai: a Python interface for Mirai, a high-performance bot development framework driven by the OICQ (QQ) protocol; the inspiration for this project, parts of which (such as these acknowledgements) were referenced.
Special thanks to CQMOE for these wonderful projects:
- python-aiocqhttp: the async Python SDK for CoolQ's CQHTTP plugin; the main code and logic of this project referenced it to a certain extent.
Special thanks to Koishi.js for these wonderful projects:
- koishijs.github.io: parts of this project's documentation use Vue components from that project.
And finally... Long Live Mirai
aiomisc
Miscellaneous utils for asyncio.

As a programmer, you are no stranger to the challenges that come with building and maintaining software applications. One area that can be particularly difficult is designing the architecture of software that uses asynchronous I/O.

This is where aiomisc comes in. aiomisc is a Python library that provides a collection of utility functions and classes for working with asynchronous I/O in a more intuitive and efficient way. It is built on top of the asyncio library and is designed to make it easier for developers to write asynchronous code that is both reliable and scalable.

With aiomisc, you can take advantage of powerful features like worker pools, connection pools, the circuit breaker pattern, and retry mechanisms such as asyncbackoff and asyncretry to make your asyncio code more robust and easier to maintain. In this documentation, we'll take a closer look at what aiomisc has to offer and how it can help you streamline your asyncio service development.

Installation

Installation is possible in standard ways, such as PyPI or installation from a git repository directly.

Installing from PyPI:

pip3 install aiomisc

Installing from github.com:

# Using git tool
pip3 install git+https://github.com/aiokitchen/aiomisc.git

# Alternative way using http
pip3 install \
    https://github.com/aiokitchen/aiomisc/archive/refs/heads/master.zip

The package contains several extras and you can install additional dependencies if you specify them in this way.

With uvloop:

pip3 install "aiomisc[uvloop]"

With aiohttp:

pip3 install "aiomisc[aiohttp]"

Complete table of extras below:

example                        | description
pip install aiomisc[aiohttp]   | For running aiohttp applications
pip install aiomisc[asgi]      | For running ASGI applications
pip install aiomisc[carbon]    | Sending metrics to carbon (part of graphite)
pip install aiomisc[cron]      | Use croniter for scheduling tasks
pip install aiomisc[raven]     | Sending exceptions to sentry using raven
pip install aiomisc[rich]      | You might use rich for logging
pip install aiomisc[uvicorn]   | For running ASGI applications using uvicorn
pip install aiomisc[uvloop]    | Use uvloop as the default event loop

You can combine extras values by separating them with commas, for example:

pip3 install "aiomisc[aiohttp,cron,rich,uvloop]"

Quick Start

This section will cover how this library creates and uses the event loop and creates services. Of course, not everything can be covered here, but you can read about a lot in the Tutorial section, and you can always refer to the Modules and API reference sections for help.

Event-loop and entrypoint

Let's look at this simple example first:

import asyncio
import logging

import aiomisc

log = logging.getLogger(__name__)


async def main():
    log.info('Starting')
    await asyncio.sleep(3)
    log.info('Exiting')


if __name__ == '__main__':
    with aiomisc.entrypoint(log_level="info", log_format="color") as loop:
        loop.run_until_complete(main())

This code declares an asynchronous main() function that exits after 3 seconds. It would seem nothing interesting, but the whole point is in the entrypoint.

What does the entrypoint do? Seemingly not much: it creates an event loop and transfers control to the user.
However, under the hood, the logger is configured in a separate thread, a pool of threads is created, and services are started. But more on that later; there are no services in this example.

Alternatively, you can choose not to use an entrypoint: just create an event loop and set it as the default event loop for the current thread:

import asyncio
import aiomisc

# * Installs the uvloop event loop if it has been installed.
# * Creates and sets `aiomisc.thread_pool.ThreadPoolExecutor`
#   as the default executor
# * Sets the just-created event loop as the current event loop for this thread.
aiomisc.new_event_loop()


async def main():
    await asyncio.sleep(1)


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())

The example above is useful if your code already uses an implicitly created event loop: you will have to modify less code, just add aiomisc.new_event_loop() and all calls to asyncio.get_event_loop() will return the created instance.

However, you can do this with one call. The following example closes the implicitly created asyncio event loop and installs a new one:

import asyncio
import aiomisc


async def main():
    await asyncio.sleep(3)


if __name__ == '__main__':
    loop = aiomisc.new_event_loop()
    loop.run_until_complete(main())

Services

The main thing that an entrypoint does is start and gracefully stop services.

The service concept within this library means a class derived from the aiomisc.Service class, implementing the async def start(self) -> None: method and optionally the async def stop(self, exc: Optional[Exception]) -> None method.

Stopping a service does not necessarily mean the user pressing Ctrl+C; it's actually just exiting the entrypoint context manager.

The example below shows what your service might look like:

from aiomisc import entrypoint, Service


class MyService(Service):

    async def start(self):
        do_something_when_start()

    async def stop(self, exc):
        do_graceful_shutdown()


with entrypoint(MyService()) as loop:
    loop.run_forever()

The entrypoint can start as many instances of the service as it likes, and all of them will start concurrently.

There is also the case where the start method is the payload of the service; then there is no need to implement the stop method, since the running task with the start function will be cancelled at the stop stage. But in this case, you will have to notify the entrypoint that initialization of the service instance is complete and it can continue.

Like this:

import asyncio
from threading import Event

from aiomisc import entrypoint, Service

event = Event()


class MyService(Service):

    async def start(self):
        # Send signal to entrypoint to continue running
        self.start_event.set()
        await asyncio.sleep(3600)


with entrypoint(MyService()) as loop:
    assert event.is_set()

Note: the entrypoint passes control to the body of the context manager only after all service instances have started. As mentioned above, a start is considered to be the completion of the start method or the setting of the start event with self.start_event.set().

The whole power of this library is in the set of already implemented or abstract services, such as: AIOHTTPService, ASGIService, TCPServer, UDPServer, TCPClient, PeriodicService, CronService and so on.

Unfortunately, this section cannot cover all of them; please see the Tutorial section, where there are more examples and explanations, and of course you can always find an answer in the API reference or in the source code.
The authors have tried to make the source code as clear and simple as possible, so feel free to explore it.

Versioning

This software follows Semantic Versioning.

Summary: given a version number MAJOR.MINOR.PATCH, increment the:

- MAJOR version when you make incompatible API changes
- MINOR version when you add functionality in a backwards compatible manner
- PATCH version when you make backwards compatible bug fixes

Additional labels for pre-release and build metadata are available as extensions to the MAJOR.MINOR.PATCH format.

In this case, the package version is assigned automatically with poem-plugins: it uses the repository tag for the major and minor numbers, and a counter of the commits between the tag and the head of the branch for the patch number.

How to develop?

This project, like most open source projects, is developed by enthusiasts; you can join the development, submit issues, or send your merge requests.

In order to start developing in this repository, the following should be installed:

- Python 3.7+ as python3
- Poetry as poetry

For setting up the developer environment just execute:

# installing all dependencies
poetry install

# setting up pre-commit hooks
poetry run pre-commit install

# adding poem-plugins to the poetry
poetry self add poem-plugins
aiomisc-dependency
Dependency injection plugin for aiomisc, built with the aiodine library, supporting pytest-fixture-style dependency injection.

Table of contents

- Installation
- How to use
- Register dependency
- Use dependency
- Dependencies for dependencies
- loop built-in dependency
- LICENSE

Installation

Installing from pypi:

pip3 install aiomisc aiomisc-dependency

How to use

Register dependency

To register a dependency you can use the aiomisc_dependency.dependency decorator.

from aiomisc_dependency import dependency


@dependency
async def pg_engine():
    pg_engine = await create_engine(dsn=pg_url)
    yield pg_engine
    pg_engine.close()
    await pg_engine.wait_closed()

As you can see, a dependency can be an async generator function. Code after yield will be executed on teardown to correctly close the dependency. Coroutine functions, non-async functions and generators are also supported.

Use dependency

To use a dependency you need to add its name to the __dependencies__ property of every service which depends on it. Specified dependencies will be injected as the service's attributes on entrypoint startup.
If you need to map a dependency to a different name, use __dependencies_map__:

from contextlib import suppress
from types import MappingProxyType

import aiohttp

from aiomisc.service.aiohttp import AIOHTTPService


class HealthcheckService(AIOHTTPService):

    __dependencies__ = ('pg_engine',)

    async def create_application(self):
        app = aiohttp.web.Application()
        app.add_routes([aiohttp.web.get('/ping', self.healthcheck_handler)])
        return app

    async def healthcheck_handler(self, request):
        pg_status = False
        with suppress(Exception):
            async with self.pg_engine.acquire() as conn:
                await conn.execute('SELECT 1')
                pg_status = True

        return aiohttp.web.json_response(
            {'db': pg_status},
            status=(200 if pg_status else 500),
        )


class RESTService(AIOHTTPService):

    __dependencies__ = ('pg_engine',)

    ...


class AnotherRESTService(AIOHTTPService):

    __dependencies_map__ = MappingProxyType({'pg_engine': 'engine'})

    ...

If any required dependency is not found on entrypoint startup, a RuntimeError will be raised.

You can set a dependency manually by passing it as a keyword argument on service creation. This can be convenient in tests.

from unittest.mock import Mock


def test_rest_service():
    pg_engine_mock = Mock()
    service = RESTService(pg_engine=pg_engine_mock)
    ...

Dependencies for dependencies

You can use dependencies as arguments for other dependencies. Arguments will be injected automatically.

@dependency
async def pg_connection(pg_engine):
    async with pg_engine.acquire() as conn:
        yield conn

loop built-in dependency

The built-in loop dependency can be used if your dependency requires an event loop instance.

import aioredis


@dependency
async def redis_pool(loop):
    pool = await aioredis.create_pool(redis_url, loop=loop)
    yield pool
    pool.close()
    await pool.wait_closed()

LICENSE

MIT
aiomisc-entrypoint
Aiomisc Entrypoint

An alternative way to run an aiomisc entrypoint, with processors that add behavior to the start and stop events of the entrypoint, plus a custom query logger.

Basic usage

from aiomisc_entrypoint import Entrypoint

ep = Entrypoint()
ep.clear_environ()
ep.change_user()
ep.system_signals_listener()
ep.register_services_in_context()
ep.first_start_last_stop()

ep.run_forever()

Extended usage

from signal import SIGINT, SIGTERM, SIGKILL

from aiomisc import Service
from aiomisc_entrypoint import Entrypoint


class TestService(Service):
    async def start(self):
        ...


async def main():
    ...


services = (
    TestService(context_name='svc1'),
    TestService(context_name='svc2'),
)

ep = Entrypoint(*services)
ep.clear_environ(lambda x: x.startswith('APP_'))
ep.change_user('user')
ep.system_signals_listener(SIGINT, SIGTERM, SIGKILL)
ep.register_services_in_context()
ep.first_start_last_stop()

ep.run_until_complete(main())

Release Notes:

v1.0.1

- Fixed an error with setting the loop for asyncio.Event in SysSignalListener
aiomisc-pytest
aiomisc pytest plugin

This package contains a plugin for pytest.

Basic usage

Simple usage example:

async def test_sample(event_loop):
    f = event_loop.create_future()
    event_loop.call_soon(f.set_result, True)
    assert await f

Asynchronous fixture example:

import asyncio
import pytest


@pytest.fixture
async def my_fixture(loop):
    await asyncio.sleep(0)

    # Requires python 3.6+
    yield

In case you have to save an instance of an async fixture between tests, the wrong solution is just changing the fixture scope. Why wouldn't that work? Because, in the base scenario, the loop fixture creates a new event loop instance per test, which will be closed after test teardown. When you use an async fixture, any caller of asyncio.get_event_loop() will get the current event loop instance, which will be closed, and the next test will run in another event loop. So the solution is to redefine the loop fixture with the required scope, and a custom fixture with the required scope.

import asyncio
import pytest
from aiomisc import entrypoint


@pytest.fixture(scope='module')
def loop():
    with entrypoint() as loop:
        asyncio.set_event_loop(loop)
        yield loop


@pytest.fixture(scope='module')
async def sample_fixture(loop):
    yield 1


LOOP_ID = None


async def test_using_fixture(sample_fixture):
    global LOOP_ID
    LOOP_ID = id(asyncio.get_event_loop())
    assert sample_fixture == 1


async def test_not_using_fixture(loop):
    assert id(loop) == LOOP_ID

pytest markers

Package contains some useful markers for pytest:

- catch_loop_exceptions - uncaught event loop exceptions will fail the test.
- forbid_get_event_loop - forbids calling asyncio.get_event_loop during the test running.

@pytest.mark.forbid_get_event_loop
async def test_with_get_loop():
    def switch_context():
        loop = asyncio.get_event_loop()
        future = loop.create_future()
        loop.call_soon(future.set_result, True)
        return future

    with pytest.raises(Exception):
        await switch_context()


# Test will be failed
@pytest.mark.catch_loop_exceptions
async def test_with_errors(loop):
    async def fail():
        # switch context
        await asyncio.sleep(0)
        raise Exception()

    loop.create_task(fail())
    await asyncio.sleep(0.1)
    return

Passing default context

@pytest.fixture
def default_context():
    return {
        'foo': 'bar',
        'bar': 'foo',
    }

Testing services

Redefine the services fixture in your test module:

import aiomisc
import pytest


class SimpleService(aiomisc.Service):

    async def start(self) -> None:
        ...


@pytest.fixture
def services():
    return [SimpleService()]

Event loop policy overriding

import asyncio
import pytest
import tokio
import uvloop

policy_ids = ('uvloop', 'asyncio', 'tokio')
policies = (uvloop.EventLoopPolicy(),
            asyncio.DefaultEventLoopPolicy(),
            tokio.EventLoopPolicy())


@pytest.fixture(params=policies, ids=policy_ids)
def event_loop_policy(request):
    return request.param

Thread pool overriding

import pytest
from aiomisc.thread_pool import ThreadPoolExecutor
import concurrent.futures

thread_pool_ids = ('aiomisc pool', 'default pool')
thread_pool_implementation = (ThreadPoolExecutor,
                              concurrent.futures.ThreadPoolExecutor)


@pytest.fixture(params=thread_pool_implementation, ids=thread_pool_ids)
def thread_pool_executor(request):
    return request.param

entrypoint arguments

@pytest.fixture
def entrypoint_kwargs() -> dict:
    return dict(log_config=False)

aiohttp test client

import pytest
from myapp.services.rest import REST


@pytest.fixture
def rest_port(aiomisc_unused_port_factory):
    return aiomisc_unused_port_factory()


@pytest.fixture
def rest_service(rest_port):
    return REST(port=rest_port)


@pytest.fixture
def services(rest_service):
    return [rest_service]


@pytest.fixture
def api_client(rest_service, rest_port):
    test_srv = TestServer(
        app=rest_service.app,
        port=rest_port,
    )

    return TestClient(test_srv)

...

TCPProxy

A simple TCP proxy to emulate network problems.
Available as fixturetcp_proxyExamples:importasyncioimporttimeimportpytestimportaiomiscclassEchoServer(aiomisc.service.TCPServer):asyncdefhandle_client(self,reader:asyncio.StreamReader,writer:asyncio.StreamWriter):chunk=awaitreader.read(65534)whilechunk:writer.write(chunk)chunk=awaitreader.read(65534)writer.close()awaitwriter.wait_closed()@pytest.fixture()defserver_port(aiomisc_unused_port_factory)->int:returnaiomisc_unused_port_factory()@pytest.fixture()defservices(server_port,localhost):return[EchoServer(port=server_port,address=localhost)]@pytest.fixture()asyncdefproxy(tcp_proxy,localhost,server_port):asyncwithtcp_proxy(localhost,server_port)asproxy:yieldproxyasyncdeftest_proxy_client_close(proxy):reader,writer=awaitproxy.create_client()payload=b"Hello world"writer.write(payload)response=awaitasyncio.wait_for(reader.read(1024),timeout=1)assertresponse==payloadassertnotreader.at_eof()awaitproxy.disconnect_all()assertawaitasyncio.wait_for(reader.read(),timeout=1)==b""assertreader.at_eof()asyncdeftest_proxy_client_slow(proxy):read_delay=0.1write_delay=0.2# Emulation of asymmetric and slow ISPwithproxy.slowdown(read_delay,write_delay):reader,writer=awaitproxy.create_client()payload=b"Hello world"delta=-time.monotonic()writer.write(payload)awaitasyncio.wait_for(reader.read(1024),timeout=2)delta+=time.monotonic()assertdelta>=read_delay+write_delayasyncdeftest_proxy_client_with_processor(proxy):processed_request=b"Never say hello"# Patching protocol functionsproxy.set_content_processors(# Process data from client to serverlambda_:processed_request,# Process data from server to clientlambdachunk:chunk[::-1],)reader,writer=awaitproxy.create_client()writer.write(b'nevermind')response=awaitreader.read(16)assertresponse==processed_request[::-1]
aiomixcloud
Mixcloud API wrapper for Python and Async IOaiomixcloudis a wrapper library for theHTTP APIofMixcloud. It supports asynchronous operation viaasyncioand specifically theaiohttpframework.aiomixcloudtries to be abstract and independent of the API’s transient structure, meaning it is not tied to specific JSON fields and resource types. That is, when the API changes or expands, the library should be ready to handle it.InstallationThe following Python versions are supported:CPython: 3.6, 3.7, 3.8, 3.9PyPy: 3.5Install viapip:pipinstallaiomixcloudUsageYou can start usingaiomixcloudas simply as:fromaiomixcloudimportMixcloud# Inside your coroutine:asyncwithMixcloud()asmixcloud:cloudcast=awaitmixcloud.get('bob/cool-mix')# Data is available both as attributes and itemscloudcast.user.namecloudcast['pictures']['large']# Iterate over associated resourcesforcommentinawaitcloudcast.comments():comment.urlA variety of possibilities is enabled duringauthorized usage:# Inside your coroutine:asyncwithMixcloud(access_token=access_token)asmixcloud:# Follow a useruser=awaitmixcloud.get('alice')awaituser.follow()# Upload a cloudcastawaitmixcloud.upload('myshow.mp3','My Show',picture='myshow.jpg')For more details see theusage pageof thedocumentation.LicenseDistributed under theMIT License.
aiomixpanel
No description available on PyPI.
aiommgpio
aiommgpioMMAP Based GPIO & PWM for Raspberry Pi w/ asyncio support.Support ModelsRaspberry Pi 1 (BCM2835)Raspberry Pi 2 (BCM2836/BCM2837)Raspberry Pi 3 (BCM2837)Raspberry Pi 4 (BCM2711)UsageimportasynciofromaiommgpioimportRPiMMIO,PWM_MODE,GPIO_MODEasyncdefmain():mmio=RPiMMIO()pwm=awaitmmio.get_pwm(18,PWM_MODE.HARDWARE)gpio=awaitmmio.get_gpio(12,GPIO_MODE.OUTPUT)awaitpwm.set_frequency(25000)# 25KHz frequencyawaitpwm.set_duty(20000)# 20000 nanoseconds dutyawaitpwm.start()print(f'PWM Period:{pwm.period}')print(f'PWM Frequency:{pwm.frequency}')print(f'PWM Duty:{pwm.duty}')awaitgpio.write(True)awaitasyncio.sleep(5)awaitgpio.write(False)awaitpwm.cleanup()awaitgpio.cleanup()if__name__=='__main__':asyncio.run(main())Seesrc/aiommgpio/example.py.
aiommost
aiommostAsyncio Mattermost client. Useful to write bots.UsagefromaiommostimportMattermostClientclient=MattermostClient(host,token)# create direct channeluser=awaitclient.users.get_by_username('someuser')channel=awaitclient.channels.direct(user.uid,user.uid)Contributing GuideMain dependencies:httpxpydanticDeveloper dependencies:mypywemake-python-styleguidepytestInstall dependencies:makedev.installBefore push:$makelint&&maketest...
aiommy
There would be an aiommy description: something about how to use it and what it is for.

### Installation ###

pip install aiommy

### TODO: ###

- test installing
- refactoring
- safely managing the testing database
- make a test runner for explicitly setting the testing database and database driver
- license
- makefile
- travis
- separate dev dependencies
- responses
- validators
- views
- other extensions for peewee
aiomock
No description available on PyPI.
aiomodbus
No description available on PyPI.
aio-modbus-client
# aio modbus client

The main purpose is to create classes of devices connected via Modbus by describing their properties. It is not intended for transferring raw bytes over Modbus.

The library allows you to organize work with devices connected to a Modbus TCP server or to a serial port. It also allows for devices operating at different speeds and with different connection parameters on the same bus.

## Use

- Create your class inheriting from ModbusDevice. It is important to specify the static variable file in your class: file = <path to the register description>
- Create a JSON file with the description of the registers of your device
- To access the device, use an instance of your class and the package API.

See example: example/Wirenboard/TestWirenBoardDimmer.py

If someone likes the implementation, I will add documentation.

## async API

- read_param(param_id) - gets a device property value
- write_param(param_id, value) - writes the value to a property of the device
- is_device() - should return true if the device at the current address can be served by this class
- find_devices() - returns the list of addresses of these devices. The function calls is_device for each Modbus address.

## Licensing

This is published under the MIT License, see LICENSE for details.

Copyright (c) 2019 Mikhail Razgovorov
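The API above can be sketched as follows. This is a rough illustration only: `dimmer` stands for an instance of your ModbusDevice subclass (how instances are constructed is not shown in this README; see example/Wirenboard/TestWirenBoardDimmer.py for real code), and the 'level' param_id is a hypothetical name from your JSON register description.

```python
# Sketch only: `dimmer` is assumed to be an instance of your ModbusDevice
# subclass; 'level' is a hypothetical param_id from your JSON description.
async def demo(dimmer):
    # Serve the device only if one answers at the current address
    if await dimmer.is_device():
        level = await dimmer.read_param('level')       # read a property value
        await dimmer.write_param('level', level + 10)  # write a property value
```

The point of the design is that the byte-level Modbus traffic stays inside the library; application code only reads and writes named properties.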
aiomodelz
TensorChord Modelz Python SDK and CLImodelz-pywith aiohttpBasically,aioclient.pyimplements the async / aiohttp versions ofModelz*classes,andclient.pywraps around them withasyncio.run()calls.TensorChord Modelz Python SDK and CLIInstallationpipxpipCLI UsageExample UsageCLI InferencePython InterfaceDevelopInstallationpipxThis is the recommended installation method if you only want to use the CLI.$ pipx install aiomodelzpip$ pip install aiomodelzCLI Usage$modelz--help usage:modelz[-h]{inference,metrics,build}... modelzCLI positionalarguments:{inference,metrics,build}options:-h,--helpshowthishelpmessageandexitExample UsageCLI Inferenceecho"cute cat"|modelzinference$PROJECT--serdemsgpack--write-filecat.jpg--read-stdinPython Interface# use dotenv to load envfromdotenvimportload_dotenvload_dotenv()# example .env:# MODELZ_API_KEY=mzi-*****# MODELZ_HOST=https://{}.cloud.modelz.dev/ # use this if you're using the dev modelz cluster# MODELZ_SSL_VERIFY=0 # disable ssl verificationfrommodelzimportAioModelzClient,ModelzClient...Develop$ git clone https://github.com/tddschn/aiomodelz.git $ cd aiomodelz $ pdm install
aiomodernforms
Python: Async IO Modern Forms API ClientAsynchronous Python client for Modern Forms Fans.AboutThis package allows you to control and monitor Modern Forms fans programmatically. It is mainly created to allow third-party programs to automate the behavior of the Modern Forms fansInstallationpipinstallaiomodernformsUsage"""Asynchronous Python client for Async IO Modern Forms fan."""importasynciofromdatetimeimportdatetime,timedeltaimportaiomodernformsfromaiomodernforms.constimportLIGHT_POWER_ONasyncdefmain():"""Turn on the fan light."""asyncwithaiomodernforms.ModernFormsDevice("192.168.3.197")asfan:awaitfan.update()print(fan.status)awaitfan.light(on=LIGHT_POWER_ON,brightness=50,sleep=datetime.now()+timedelta(minutes=2),)print(fan.status)if__name__=="__main__":loop=asyncio.get_event_loop()loop.run_until_complete(main())
aiomodrinth
No description available on PyPI.
aiomoe
AioMoeFully asynchronous trace.moe API wrapperInstallationYou can install the stable version from PyPI:$ pip install aiomoeOr get it from github:$ pip install https://github.com/FeeeeK/aiomoe/archive/refs/heads/master.zipUsageGet info about your accountimportasynciofromaiomoeimportAioMoetm=AioMoe()# or AioMoe(token="xxxxxxxx")asyncdefmain():me=awaittm.me()print(me)print(f"Used quota:{me.quota_used}/{me.quota}")asyncio.run(main())The output will be like this:User(error=None, id='your ip', priority=0, concurrency=1, quota=1000, quota_used=0) Used quota: 0/1000Search animeimportasynciofromaiomoeimportAioMoetm=AioMoe()asyncdefmain():image="https://i.imgur.com/Xrb06w5.png"search_results=awaittm.search(file_source=image,anilist_info=True)print(search_results.result[0].anilist.title.romaji)# 'Steins;Gate 0'asyncio.run(main())You can pass a link to an image, bytes or file-like object (io.BytesIO)withopen("image.png","rb")asfile:search_results=awaittm.search(file)And use additional parameters such as:anilist_info - Return anAnilistobject instead of anilist idcut_borders - Cut out black borders from screenshotsanilist_id - Filter results by anilist idSee AlsoResponse objectstrace.moe API docstrace.moe API swagger docsContributingFork itCreate your feature branch (git checkout -b my-new-feature)Commit your changes (git commit -am 'Add some feature')Push to the branch (git push origin my-new-feature)Create new Pull RequestLicenseReleased under the MIT license.Copyright byFeeeeK.
aiomoex
Asyncio MOEX ISS API

An asyncio-based implementation of part of the queries to the MOEX Informational & Statistical Server.

Documentation

https://wlm1ke.github.io/aiomoex/

Key features

Several query functions are implemented for information about traded shares and their historical quotes; their results convert directly to a pandas.DataFrame.

The functions are built on a universal client that can issue arbitrary requests to MOEX ISS, so the list of available query functions can easily be extended. If you need additional functions, use Issues on GitHub and include a link to the request description:

- The full list of possible requests to MOEX ISS
- The official Developer's Guide with additional information

Why asyncio?

For many requests MOEX ISS returns data in chunks of 100 items, so retrieving the complete information requires additional requests to the server to load data from non-initial positions. For example, downloading quotes for all shares in all trading modes may take several tens of thousands of server requests.

Results of a small benchmark loading historical quotes in the TQBR mode for 35 and 277 (all traded) shares, using synchronous requests for comparison:

Request type | 35 shares  | 277 shares
asyncio      | 12.6 s     | 40.6 s
Synchronous  | 210.4 s    | 1436.9 s
Speed-up     | 16.7x      | 35.4x

Getting started

Installation

$ pip install aiomoex

Example of using the implemented queries

Quote history for SNGSP in the TQBR mode:

import asyncio
import aiohttp

import aiomoex
import pandas as pd


async def main():
    async with aiohttp.ClientSession() as session:
        data = await aiomoex.get_board_history(session, 'SNGSP')
        df = pd.DataFrame(data)
        df.set_index('TRADEDATE', inplace=True)
        print(df.head(), '\n')
        print(df.tail(), '\n')
        df.info()


asyncio.run(main())

           BOARDID  CLOSE    VOLUME         VALUE
TRADEDATE
2014-06-09    TQBR  27.48  12674200  3.484352e+08
2014-06-10    TQBR  27.55  14035900  3.856417e+08
2014-06-11    TQBR  28.15  27208800  7.602146e+08
2014-06-16    TQBR  28.27  68059900  1.913160e+09
2014-06-17    TQBR  28.20  22101600  6.292844e+08

           BOARDID   CLOSE    VOLUME         VALUE
TRADEDATE
2020-09-01    TQBR  37.245  15671200  5.824013e+08
2020-09-02    TQBR  37.535  34659700  1.296441e+09
2020-09-03    TQBR  36.955  28177000  1.049745e+09
2020-09-04    TQBR  36.915  21908000  8.076767e+08
2020-09-07    TQBR  37.200  13334400  4.955280e+08

<class 'pandas.core.frame.DataFrame'>
Index: 1573 entries, 2014-06-09 to 2020-09-07
Data columns (total 4 columns):
 #   Column   Non-Null Count  Dtype
---  ------   --------------  -----
 0   BOARDID  1573 non-null   object
 1   CLOSE    1573 non-null   float64
 2   VOLUME   1573 non-null   int64
 3   VALUE    1573 non-null   float64
dtypes: float64(2), int64(1), object(1)
memory usage: 61.4+ KB

Example of implementing a request with the client

The list of shares traded in the TQBR mode (request description):

import asyncio
import aiohttp

import aiomoex
import pandas as pd


async def main():
    request_url = ("https://iss.moex.com/iss/engines/stock/"
                   "markets/shares/boards/TQBR/securities.json")
    arguments = {"securities.columns": ("SECID,"
                                        "REGNUMBER,"
                                        "LOTSIZE,"
                                        "SHORTNAME")}

    async with aiohttp.ClientSession() as session:
        iss = aiomoex.ISSClient(session, request_url, arguments)
        data = await iss.get()
        df = pd.DataFrame(data["securities"])
        df.set_index("SECID", inplace=True)
        print(df.head(), "\n")
        print(df.tail(), "\n")
        df.info()


asyncio.run(main())

          REGNUMBER  LOTSIZE   SHORTNAME
SECID
ABRD   1-02-12500-A       10  АбрауДюрсо
AFKS   1-05-01669-A      100  Система ао
AFLT   1-01-00010-A       10    Аэрофлот
AGRO           None        1    AGRO-гдр
AKRN   1-03-00207-A        1       Акрон

          REGNUMBER  LOTSIZE   SHORTNAME
SECID
YNDX           None        1  Yandex clA
YRSB   1-01-50099-A       10     ТНСэнЯр
YRSBP  2-01-50099-A       10   ТНСэнЯр-п
ZILL   1-02-00036-A        1      ЗИЛ ао
ZVEZ   1-01-00169-D     1000   ЗВЕЗДА ао

<class 'pandas.core.frame.DataFrame'>
Index: 260 entries, ABRD to ZVEZ
Data columns (total 3 columns):
 #   Column     Non-Null Count  Dtype
---  ------     --------------  -----
 0   REGNUMBER  248 non-null    object
 1   LOTSIZE    260 non-null    int64
 2   SHORTNAME  260 non-null    object
dtypes: int64(1), object(2)
memory usage: 8.1+ KB
aiomojang
No description available on PyPI.
aiomon
aiomonTable of ContentsInstallationLicenseInstallationpip install aiomonDevelopmentRun tests with coverage:hatch run covLicenseaiomonis distributed under the terms of theMITlicense.
aiomonetaclient
Aio Moneta client

This is a client for the Moneta merchant API: https://www.moneta.ru/doc/MONETA.MerchantAPI.v2.ru.pdf

Installation

pip install aiomonetaclient

Usage

import asyncio
from aiomonetaclient import Client

if __name__ == '__main__':
    moneta_client = Client(
        username='<moneta username>',
        password='<moneta password>'
    )
    loop = asyncio.get_event_loop()
    profiles = loop.run_until_complete(
        # request FindProfileRequest
        moneta_client.find_profile_info('<phone without +>')
    )
    # print first profile attributes
    print(profiles.Envelope.Body.FindProfileInfoResponse.to_json(indent=4))
aiomoney
aiomoney: a simple asynchronous library for the YooMoney API

Application authorization

1. Register a new YooMoney application at https://yoomoney.ru/myservices/new (without ticking the OAuth2 checkbox!).
2. Get and copy the client_id after creating the application.
3. Create a request to obtain an API token. About application permissions:

import asyncio
from os import environ

from aiomoney import authorize_app


async def main():
    await authorize_app(
        client_id=environ.get("CLIENT_ID"),
        redirect_uri=environ.get("REDIRECT_URI"),
        app_permissions=[
            "account-info",
            "operation-history",
            "operation-details",
            "incoming-transfers",
            "payment-p2p",
            "payment-shop",
        ]
    )


if __name__ == "__main__":
    asyncio.run(main())

During the redirect to redirect_uri, a code= parameter will appear in the address bar. Copy its value and paste it into the console.

If authorization succeeds, your API token is shown in the console. Save it to an environment variable (recommended).

Getting basic account information

import asyncio

from aiomoney.types import AccountInfo, Operation, OperationDetails
from aiomoney.wallet import YooMoneyWallet


async def main():
    wallet = YooMoneyWallet(access_token="ACCESS_TOKEN")

    account_info: AccountInfo = await wallet.account_info
    operation_history: list[Operation] = await wallet.get_operation_history()
    operation_details: OperationDetails = await wallet.get_operation_details(operation_id="999")


if __name__ == "__main__":
    asyncio.run(main())

Creating a payment form and checking payment

import asyncio

from aiomoney.wallet import YooMoneyWallet, PaymentSource


async def main():
    wallet = YooMoneyWallet(access_token="ACCESS_TOKEN")

    payment_form = await wallet.create_payment_form(
        amount_rub=990,
        unique_label="myproject_second_unicorn",
        payment_source=PaymentSource.YOOMONEY_WALLET,
        success_redirect_url="https://t.me/fofmow (nonono =/)"
    )

    # check the payment by its label
    payment_is_completed: bool = await wallet.check_payment_on_successful(
        payment_form.payment_label
    )

    print(
        f"Payment link:\n{payment_form.link_for_customer}\n\n"
        f"Form paid: {'yes' if payment_is_completed else 'no'}"
    )


if __name__ == "__main__":
    asyncio.run(main())
aiomongo
No description available on PyPI.
aiomongodel
An asynchronous ODM similar toPyMODMon top ofMotoran asynchronous PythonMongoDBdriver. Works onPython 3.5and up. Some features such as asynchronous comprehensions require at leastPython 3.6.aiomongodelcan be used withasyncioas well as withTornado.Usage ofsessionrequires at least MongoDB version 4.0.InstallInstallaiomongodelusingpip:pip install aiomongodelDocumentationRead thedocs.Getting StartModelingTo create a model just create a new model class, inherit it fromaiomongodel.Documentclass, list all the model fields and place aMetaclass with model meta options. To create a subdocument, create a class with fields and inherit it fromaiomongodel.EmbeddedDocument.# models.pyfromdatetimeimportdatetimefrompymongoimportIndexModel,DESCENDINGfromaiomongodelimportDocument,EmbeddedDocumentfromaiomongodel.fieldsimport(StrField,BoolField,ListField,EmbDocField,RefField,SynonymField,IntField,FloatField,DateTimeField,ObjectIdField)classUser(Document):_id=StrField(regex=r'[a-zA-Z0-9_]{3, 20}')is_active=BoolField(default=True)posts=ListField(RefField('models.Post'),default=lambda:list())quote=StrField(required=False)# create a synonym fieldname=SynonymField(_id)classMeta:collection='users'classPost(Document):# _id field will be added automatically as# _id = ObjectIdField(defalut=lambda: ObjectId())title=StrField(allow_blank=False,max_length=50)body=StrField()created=DateTimeField(default=lambda:datetime.utcnow())views=IntField(default=0)rate=FloatField(default=0.0)author=RefField(User,mongo_name='user')comments=ListField(EmbDocField('models.Comment'),default=lambda:list())classMeta:collection='posts'indexes=[IndexModel([('created',DESCENDING)])]default_sort=[('created',DESCENDING)]classComment(EmbeddedDocument):_id=ObjectIdField(default=lambda:ObjectId())author=RefField(User)body=StrField()# `s` property of the fields can be used to get a mongodb string name# to use in queriesassertUser._id.s=='_id'assertUser.name.s=='_id'# name is 
synonymassertPost.title.s=='title'assertPost.author.s=='user'# field has mongo_nameassertPost.comments.body.s=='comments.body'# compound nameCRUDfrommotor.motor_asyncioimportAsyncIOMotorClientasyncdefgo(db):# create model's indexesawaitUser.q(db).create_indexes()# CREATE# create using save# Note: if do_insert=False (default) save performs a replace# with upsert=True, so it does not raise if _id already exists# in db but replace document with that _id.u=awaitUser(name='Alexandro').save(db,do_insert=True)assertu.name=='Alexandro'assertu._id=='Alexandro'assertu.is_activeisTrueassertu.posts==[]assertu.quoteisNone# using queryu=awaitUser.q(db).create(name='Ihor',is_active=False)# READ# get by idu=awaitUser.q(db).get('Alexandro')assertu.name=='Alexandro'# findusers=awaitUser.q(db).find({User.is_active.s:True}).to_list(10)assertlen(users)==2# using for loopusers=[]asyncforuserinUser.q(db).find({User.is_active.s:False}):users.append(user)assertlen(users)==1# in Python 3.6 an up use async comprehensionsusers=[userasyncforuserinUser.q(db).find({})]assertlen(users)==3# UPDATEu=awaitUser.q(db).get('Ihor')u.is_active=Trueawaitu.save(db)assert(awaitUser.q(db).get('Ihor')).is_activeisTrue# using update (without data validation)# object is reloaded from db after update.awaitu.update(db,{'$push':{User.posts.s:ObjectId()}})# DELETEu=awaitUser.q(db).get('Ihor')awaitu.delete(db)loop=asyncio.get_event_loop()client=AsyncIOMotorClient(io_loop=loop)db=client.aiomongodel_testloop.run_until_complete(go(db))ValidationUse model’svalidatemethod to validate model’s data. If there are any invalid data anaiomongodel.errors.ValidationErrorwill raise.NoteCreating model object or assigning it with invalid data does not raise errors! 
Be careful while saving model without validation.classModel(Document):name=StrField(max_length=7)value=IntField(gt=5,lte=13)data=FloatField()defgo():m=Model(name='xxx',value=10,data=1.6)# validate data# should not raise any errorm.validate()# invalid data# note that there are no errors while creating# model with invalid datainvalid=Model(name='too long string',value=0)try:invalid.validate()exceptaiomongodel.errors.ValidationErrorase:asserte.as_dict()=={'name':'length is greater than 7','value':'value should be greater than 5','data':'field is required'}# using translation - you can translate messages# to your language or modify themtranslation={"field is required":"This field is required","length is greater than{constraint}":("Length of the field ""is greater than ""{constraint}characters"),# see all error messages in ValidationError docs# for missed messages default messages will be used}asserte.as_dict(translation=translation)=={'name':'Length of the field is greater than 7 characters','value':'value should be greater than 5','data':'This field is required'}Queryingasyncdefgo(db):# find returns a cursorcursor=User.q(db).find({},{'_id':1}).skip(1).limit(2)asyncforuserincursor:print(user.name)assertuser.is_activeisNone# we used projection# find oneuser=awaitUser.q(db).find_one({User.name.s:'Alexandro'})assertuser.name=='Alexandro'# updateawaitUser.q(db).update_many({User.is_active.s:True},{'$set':{User.is_active.s:False}})# deleteawaitUser.q(db).delete_many({})Models InheritanceA hierarchy of models can be built by inheriting one model from another. Aaiomongodel.Documentclass should be somewhere in hierarchy for model adnaiomongodel.EmbeddedDocumentfor subdocuments. 
Note that fields are inherited but meta options are not.classMixin:value=IntField()classParent(Document):name=StrField()classChild(Mixin,Parent):# also has value and name fieldsrate=FloatField()classOtherChild(Child):# also has rate and name fieldsvalue=FloatField()# overwrite value field from MixinclassSubDoc(Mixin,EmbeddedDocument):# has value fieldpassModels Inheritance With Same CollectionclassMixin:is_active=BoolField(default=True)classUser(Mixin,Document):_id=StrField()role=StrField()name=SynonymField(_id)classMeta:collection='users'@classmethoddeffrom_mongo(cls,data):# create appropriate model when loading from dbifdata['role']=='customer':returnsuper(User,Customer).from_mongo(data)ifdata['role']=='admin':returnsuper(User,Admin).from_mongo(data)classCustomer(User):role=StrField(default='customer',choices=['customer'])# overwrite role fieldaddress=StrField()classMeta:collection='users'default_query={User.role.s:'customer'}classAdmin(User):role=StrField(default='admin',choices=['admin'])# overwrite role fieldrights=ListField(StrField(),default=lambda:list())classMeta:collection='users'default_query={User.role.s:'admin'}Transactionfrommotor.motor_asyncioimportAsyncIOMotorClientasyncdefgo(db):# create collection before using transactionawaitUser.create_collection(db)asyncwithawaitdb.client.start_session()assession:try:asyncwiths.start_transaction():# all statements that use session inside this block# will be executed in one transaction# pass session to QuerySetawaitUser.q(db,session=session).create(name='user')# note session param# pass session to QuerySet methodawaitUser.q(db).update_one({User.name.s:'user'},{'$set':{User.is_active.s:False}},session=session)# note session usageassertawaitUser.q(db,session).count_documents({User.name.s:'user'})==1# session could be used in document crud methodsu=awaitUser(name='user2').save(db,session=session)awaitu.delete(db,session=session)raiseException()# simulate error in transaction blockexceptException:# transaction was 
not committedassertawaitUser.q(db).count_documents({User.name.s:'user'})==0loop=asyncio.get_event_loop()client=AsyncIOMotorClient(io_loop=loop)db=client.aiomongodel_testloop.run_until_complete(go(db))LicenseThe library is licensed under MIT License.Changelog0.2.2 (2020-12-14)Bump version of motor for python 3.11 compatibility.Add tests workflow for GitHub Actions CI.0.2.1 (2020-12-14)Add verbose_name toFieldfor meta information.FixDecimalField’s issue to load field from float value.0.2.0 (2018-09-12)Move requirements to motor>=2.0.Removecountmethod fromMotorQuerySetCursor.Add session support toMotorQuerySetandDocument.Addcreate_collectionmethod toDocument.Fix__aiter__ofMotorQuerySetCursorfor python 3.7.Deprecatecountmethod ofMotorQuerySet.Deprecatecreatemethod ofDocument.0.1.0 (2017-05-19)The firstaiomongodelrelease.
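The querying examples above iterate a Motor cursor with `async for` and (on Python 3.6+) async comprehensions. The same pattern can be shown with a plain async generator standing in for the cursor — a self-contained sketch that needs no MongoDB; `fake_cursor` is a stand-in, not part of aiomongodel:

```python
import asyncio

async def fake_cursor(docs):
    # Stands in for MotorQuerySetCursor: an async iterable of documents.
    for doc in docs:
        await asyncio.sleep(0)  # yield control, as a real driver would on I/O
        yield doc

async def main():
    docs = [{"_id": "Alexandro", "is_active": True},
            {"_id": "Ihor", "is_active": False}]
    # async for loop, as in the find() examples
    active = []
    async for doc in fake_cursor(docs):
        if doc["is_active"]:
            active.append(doc["_id"])
    # Python 3.6+ async comprehension, as in `[user async for user in ...]`
    names = [doc["_id"] async for doc in fake_cursor(docs)]
    return active, names

active, names = asyncio.run(main())
print(active, names)  # ['Alexandro'] ['Alexandro', 'Ihor']
```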
aio-mongo-dm
aio-mongo-dmasynchronous lightweight ODM for MongoDB based onmotorSuitable Application EnvironmentThe goal of this package is to create an asynchronous, simple and intuitive ODM, which can be easily applied to the python asynchronous framework system. If you happen to like asynchronous framework very much and use mongodb as your data storage system.[Motor documentation][https://motor.readthedocs.io/en/stable/index.html]Installationpipinstallaio-mongo-dmQuick Startimportasynciofromdatetimeimportdatetimefromaio_mongo_dmimportDocumentclassUser(Document):# customize document field with schema data__schema__={'name':{'type':str,'default':'my_default_name','index':1},'age':{'type':int,'default':20,'index':-1},'sex':{'type':bool,'default':True},'createdAt':{'type':datetime,'index':-1},'updatedAt':{'type':datetime}}# customize class method@classmethodasyncdeffind_one_by_name(cls,user_name):returnawaitcls.find_one({'name':user_name})# hook method, after user.save()asyncdefafter_save(self):print('after_save User hook method')_db_url='mongodb://localhost:27000'asyncdefmain():# init whole document/collection with mongodb url and db_nameawaitDocument.init_db(url=_db_url,db_name='mytest')# create User instanceuser=User()assertuser.name=='my_default_name'assertuser.sexisTrue# _id not existassert'_id'notinusercount1=awaitUser.count({'age':{'$lg':10}})# save user object to db, then return _id (mongodb's unique index )awaituser.save()# _id existassert'_id'inusercount2=awaitUser.count({})assertcount2==count1+1print(f'User count={count2}')user_with_name=awaitUser.find_one_by_name('my_default_name')print(user_with_name,user_with_name.updatedAt)assertisinstance(user_with_name,User)cursor=User.find({'age':{'$gt':10}}).sort('age')fordocumentinawaitcursor.to_list(length=100):print(document)if__name__=='__main__':asyncio.run(main())Create Indeximportasynciofromdatetimeimportdatetimefromaio_mongo_dmimportDocumentclassUser(Document):# set DB name of this document__db_name__='mytest'# 
customize class with schema data__schema__={'name':{'type':str,'default':'my_default_name','index':1},'age':{'type':int,'default':20,'index':-1},'sex':{'type':bool,'default':True},'createdAt':{'type':datetime,'index':-1},'updatedAt':{'type':datetime}}# customize class method@classmethodasyncdeffind_one_by_name(cls,user_name):returnawaitcls.find_one({'name':user_name})# hook method, after user.save()asyncdefafter_save(self):print('after_save User hook method')_db_url='mongodb://localhost:27000'asyncdefmain():# init whole document/collection with mongodb url,with default db_name=testawaitDocument.init_db(url=_db_url)# 1、create single index respectively according to User.__schema__ index value, e.g. name 1, createdAt -1single_index_results=awaitUser.create_index()print('single_index_results',single_index_results)# 2、 create compound indexcompound_index_result=awaitUser.create_compound_index([('name',1),('createdAt',-1)])print('compound_index_result',compound_index_result)# 3、 get all index informationindex_information=awaitUser.get_index_infor()print('index_information',index_information)if__name__=='__main__':asyncio.run(main())TranscationNotice: transaction need replication set env, e.g. one Primary Server, one Secondary Server, one Arbiter Server. 
[Detail Configuration][https://www.mongodb.com/docs/v5.0/reference/configuration-options/]frombson.objectidimportObjectIdfromdatetimeimportdatetimeimportasynciofromaio_mongo_dmimportDocument,AioClient_db_url='mongodb://localhost:27000'classUser(Document):# customize class with schema data__schema__={'name':{'type':str,'default':'my_default_name','index':1},'age':{'type':int,'default':20,'index':-1},'sex':{'type':bool,'default':True},'createdAt':{'type':datetime,'index':-1},'updatedAt':{'type':datetime}}classPayOrder(Document):# customize class with schema data__schema__={'user_id':{'type':ObjectId},'total_fee':{'type':int},'status':{'type':str,'default':'Normal'},'createdAt':{'type':datetime,'index':-1},'updatedAt':{'type':datetime}}asyncdeftrancation_test(is_raise_exception):client=AioClient(client_name='__local__')asyncwithawaitclient.start_session()ass:asyncwiths.start_transaction():user=User()user.name='kavin'new_user=awaituser.save(session=s)pay_order=PayOrder()pay_order.user_id=user['_id']pay_order.total_fee=100pay_order.status='payed'new_order=awaitpay_order.save(session=s)assertnew_user['_id']==new_order['user_id']ifis_raise_exception:raiseException('trancation_test exception')asyncdefmain():# init whole document/collection with mongodb url and db_nameawaitDocument.init_db(url=_db_url,client_name='__local__',db_name='mytest')user_count1=awaitUser.count()new_order_count1=awaitPayOrder.count()# successful transactionawaittrancation_test(False)user_count2=awaitUser.count()new_order_count2=awaitPayOrder.count()# count +1assertuser_count2==user_count1+1assertnew_order_count2==new_order_count1+1try:# failed transactionawaittrancation_test(True)exceptExceptionase:assertstr(e)=='trancation_test exception'user_count3=awaitUser.count()new_order_count3=awaitPayOrder.count()# count nor changeassertuser_count3==user_count2assertnew_order_count3==new_order_count2print('trancation test ok.')if__name__=='__main__':asyncio.run(main())More ExampleFor more examples, please 
query the example folder.API ReferenceDocument__db_url__set db url of this document. you can set in sub-class Document or use Document class method init_db(url='mongodb://localhost:27017')default: 'mongodb://localhost:27017'__db_name__optional. Attribute for setting up the database. you can set in sub-class Document or use Document class method init_db(db_name='mytest')default: 'test'__collection_name__optional. Attribute for setting up the collection name. e.g. the class name is 'User', so default collection name is 'users' if not set.default: '{class name}.lower() + s'__schema__Set the initializing data for all objects in the collection when the object is initialized. Defined field default value and type will be checked .save(session=None)Coroutine. It saves the object in the database, attribute '_id' will be generated if successsession: ClientSession instance for transaction operationdelete()Coroutine. It remove an object from the database. If the object does not exist in the database, then theAioMongoDocumentDoesNotExistexception will be raised.refresh(session=None)Coroutine. Refresh the current object from the same object from the database. If the object does not exist in the database, then theAioMongoDocumentDoesNotExistexception will be raised.session: ClientSession instance for transaction operationpre_save()Hook Method. This method is called before the save() method. You can override this method in a subclass. If this method is not overridden, 'updatedAt' and 'createdAt' will be updated with datetime.now() by default if the field key defined inschema.after_save()Hook Method. This method is called after the save() method. You can override this method in a subclass.init_db(url: str = None, db_name: str = 'test', client_name: str = '__local__', io_loop: _UnixSelectorEventLoop = None, **kwargs) -> AioClientCoroutine Class Method. 
init mongodb method, create AioClient instance and set class attributeaio_clienturl: mongodb urldb_name: database nameclient_name: name for cache aio clientio_loop: asyncio event loopawaitDocument.init_db(url='mongodb://localhost:27000',db_name='mytest')create_instance(obj: object) -> Optional[_Document]Class Method. Create a document instance through an object, e.g. User.create_instance({'name': 'kavin', 'age': 30})obj: an object instancefind_by_id(oid: Union[str, ObjectId], session=None) -> Optional[_Document]Coroutine Class Method. document query based on document ID. e.g. User.find_by_id('xxxxxxx')oid: Document ID, str or ObjectIdsession: ClientSession instance for transaction operationdelete_by_id(oid: Union[str, ObjectId], session=None)Coroutine Class Method. delete document instance according to document ID. e.g. User.delete_by_id('xxxxxxx')oid: Document ID, str or ObjectIdsession: ClientSession instance for transaction operationfind(*args, session: Optional[ClientSession] = None,**kwargs) -> AsyncIOMotorCursorClass Method. Querying for More Than One Document, create a AsyncIOMotorCursor.cursor=User.find({'age':{'$gt':10}}).sort('age')fordocumentinawaitcursor.to_list(length=100):print(document)find_one(*args, session: Optional[ClientSession] = None, **kwargs) -> Optional[_Document]Coroutine Class Method. Getting a Single Document, return None if no matching document is found.doc=awaitUser.find_one({'age':{'$gt':10}}).sort('age')count(*filters: Mapping[str, Any], session: Optional[ClientSession] = None, **kwargs) -> intCoroutine Class Method. Count the number of documents in this collection.filters: A query document that selects which documents to count in the collection.session: ClientSession instance for transaction operationusers_num=awaitUser.count()users_name_num=awaitUser.count({'name':'kavin'})users_age_num=awaitUser.count({'age':{$gt:30}}})get_collection(db_name: str = None) -> AioCollectionClass Method. 
get aio collection through the specified db name, if db_name is not None. use attributedb_nameif db_name is None.db_name: database namecreate_index(session: Optional[ClientSession] = None) -> listCoroutine Class Method. create index on this collection. When defining document subclasses, index can be defined in schema, In this function, we will create all the previously defined default indexes one by one return index str listsession: ClientSession instance for transaction operationclassUser(Document):__schema__={'name':{'type':str,'default':'my_default_name','index':-1},'sex':{'type':bool},'age':{'type':int,'default':20,'index':1},'createdAt':{'type':datetime,'index':-1},'updatedAt':{'type':datetime}}res=awaitUser.create_index()assertres==['index_-1','age_1','createdAt_-1']create_compound_index(keys: Union[str, Sequence[Tuple[str, Union[int, str, Mapping[str, Any]]]]], session: Optional[ClientSession] = None) -> strCoroutine Class Method. create compound index on this collection.keys: list of index key and index valuesession: ClientSession instance for transaction operationclassUser(Document):__schema__={'name':{'type':str,'default':'my_default_name','index':-1},'sex':{'type':bool},'age':{'type':int,'default':20,'index':1},'createdAt':{'type':datetime,'index':-1},'updatedAt':{'type':datetime}}keys=[('name',1),('createdAt',-1)]res=awaitUser.create_compound_index(keys)assertres=='name_1_createdAt_-1'get_index_infor(session: Optional[ClientSession] = None) -> strCoroutine Class Method. Get information on this collection’s indexes.session: ClientSession instance for transaction operation
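Two of the rules documented above — the default collection name (`'{class name}.lower() + s'`) and the per-field defaults supplied by `__schema__` — can be sketched in plain Python. This is an illustration of the documented behavior, not aio-mongo-dm's actual implementation, and it never touches MongoDB:

```python
from datetime import datetime

def default_collection_name(cls):
    # Documented rule: class name lowercased, plus 's'.
    return cls.__name__.lower() + "s"

def apply_schema_defaults(schema):
    # Documented rule: fields with a 'default' entry are pre-filled on init.
    return {field: spec["default"]
            for field, spec in schema.items() if "default" in spec}

class User:
    __schema__ = {
        "name": {"type": str, "default": "my_default_name", "index": 1},
        "age": {"type": int, "default": 20, "index": -1},
        "sex": {"type": bool, "default": True},
        "createdAt": {"type": datetime, "index": -1},
    }

print(default_collection_name(User))           # users
print(apply_schema_defaults(User.__schema__))  # {'name': 'my_default_name', 'age': 20, 'sex': True}
```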
aiomongoengine
No description available on PyPI.
aiomonitor
aiomonitoraiomonitoris a module that adds monitor and cli capabilities forasyncioapplications. Idea and code were borrowed fromcurioproject. Task monitor that runs concurrently to theasyncioloop (or fast drop-in replacementuvloop) in a separate thread as result monitor will work even if the event loop is blocked for some reason.This library provides a python console usingaioconsolemodule. It is possible to execute asynchronous commands inside your running application. Extensible with you own commands, in the style of the standard library’scmdmoduleInstallationInstallation process is simple, just:$ pip install aiomonitorExampleMonitor has context manager interface:importaiomonitorasyncdefmain():loop=asyncio.get_running_loop()run_forever=loop.create_future()withaiomonitor.start_monitor(loop):awaitrun_forevertry:asyncio.run(main())exceptKeyboardInterrupt:passNow from separate terminal it is possible to connect to the application:$ telnet localhost 20101or the included python client:$ python -m aiomonitor.cliTutorialLet’s create a simpleaiohttpapplication, and see howaiomonitorcan be integrated with it.importasyncioimportaiomonitorfromaiohttpimportweb# Simple handler that returns response after 100sasyncdefsimple(request):print('Start sleeping')awaitasyncio.sleep(100)returnweb.Response(text="Simple answer")loop=asyncio.get_event_loop()# create application and register routeapp=web.Application()app.router.add_get('/simple',simple)# it is possible to pass a dictionary with local variables# to the python console environmenthost,port="localhost",8090locals_={"port":port,"host":host}# init monitor just before run_appwithaiomonitor.start_monitor(loop=loop,locals=locals_):# run application with built-in aiohttp run_app functionweb.run_app(app,port=port,host=host,loop=loop)Let’s save this code in filesimple_srv.py, so we can run it with the following command:$ python simple_srv.py ======== Running on http://localhost:8090 ======== (Press CTRL+C to quit)And now one can connect 
to a running application from a separate terminal, with thetelnetcommand, andaiomonitorwill immediately respond with prompt:$ telnet localhost 20101 Asyncio Monitor: 1 tasks running Type help for commands monitor >>>Now you can type commands, for instance,help:monitor >>> help Usage: help [OPTIONS] COMMAND [ARGS]... To see the usage of each command, run them with "--help" option. Commands: cancel Cancel an indicated task console Switch to async Python REPL exit (q,quit) Leave the monitor client session help (?,h) Show the list of commands ps (p) Show task table ps-terminated (pst,pt) List recently terminated/cancelled tasks signal Send a Unix signal stacktrace (st,stack) Print a stack trace from the event loop thread where (w) Show stack frames and the task creation chain of a task where-terminated (wt) Show stack frames and the termination/cancellation chain of a taskaiomonitoralso supports async python console inside a running event loop so you can explore the state of your application:monitor >>> console Python 3.10.7 (main, Sep 9 2022, 12:31:20) [Clang 13.1.6 (clang-1316.0.21.2.5)] on darwin Type "help", "copyright", "credits" or "license" for more information. --- This console is running in an asyncio event loop. It allows you to wait for coroutines using the 'await' syntax. Try: await asyncio.sleep(1, result=3) --- >>> await asyncio.sleep(1, result=3) 3 >>>To leave the console typeexit()or press Ctrl+D:>>> exit() ✓ The console session is closed. monitor >>>ExtensionAdditional console variablesYou may add more variables that can be directly referenced in theconsolecommand. Referthe console-variables example codeCustom console commandsaiomonitoris very easy to extend with your own console commands. 
Referthe extension example codeRequirementsPython3.8+ (3.10.7+ recommended)aioconsoleClickprompt_toolkituvloop(optional)CHANGES0.7.0 (2023-12-21)Overhauled the documentation (#393)Adopted ruff to replace black, flake8 and isort (#391)Added a new demo example to show various features of aiomonitor, especially using the GUI (also for PyCon APAC 2023 talk) (#385)Relaxed our direct dependnecy version range of aiohttp (“3.8.5 only” to “3.8.5 and higher”) to enable installation on Python 3.12 (#389)Updated the README example to conform with the latest API and convention (#383)0.6.0 (2023-08-27)Add the web-based monitoring user interface to list, inspect, and cancel running/terminated tasks, with refactoring the monitor business logic and presentation layers (termuiandwebui) (#84)Replace the default port numbers for the terminal UI, the web UI, and the console access (50101, 50201, 50102 -> 20101, 20102, 20103 respectively) (#374)Adopt towncrier to auto-generate the changelog (#375)0.5.0 (2023-07-21)Fix a regression in Python 3.10 due to #10 (#11)Support Python 3.11 properly by allowing the optional (nameandcontextkwargs passed toasyncio.create_task()in the hooked task factory function#10)Update development dependenciesSelective persistent termination logs (#9)Implement cancellation chain tracker (#8)Trigger auto-completion only when Tab is pressedSupport auto-completion of commands and arguments (#7)Add missing explicit dependency to ClickPromoteconsole_localsas public attrReimplement console command (#6)Migrate to Click-based command line interface (#5)Adopt (prompt_toolkitand support concurrent clients#4)Show the total number of tasks when executing (ps#3)Apply black, isort, mypy, flake8 and automate CI workflows using GitHub ActionsFix the task creation location in the ‘ps’ command outputRemove loop=loop from all asynchronous calls to support newer Python versions (#329)Added the task creation stack chain display to the ‘where’ command by setting a custom task factory 
(#1)

These are the backported changes from [aiomonitor-ng](https://github.com/achimnol/aiomonitor-ng). As the version bumps have gone far away in the fork, all those extra releases are squashed into the v0.5.0 release.

0.4.5 (2019-11-03)
- Fixed endless loop on EOF (thanks @apatrushev)

0.4.4 (2019-03-23)
- Simplified python console start/end #175
- Added python 3.7 compatibility #176

0.4.3 (2019-02-02)
- Reworked console server start/close logic #169

0.4.2 (2019-01-13)
- Fixed issue with type annotations from 0.4.1 release #164

0.4.1 (2019-01-10)
- Fixed Python 3.5 support #161 (thanks @bmerry)

0.4.0 (2019-01-04)
- Added support for custom commands #133 (thanks @yggdr)
- Fixed OptLocals being passed as the default value for "locals" #122 (thanks @agronholm)
- Added an API inspired by the standard library's cmd module #135 (thanks @yggdr)
- Correctly report the port running aioconsole #124 (thanks @bmerry)

0.3.1 (2018-07-03)
- Added the stacktrace command #120 (thanks @agronholm)

0.3.0 (2017-09-08)
- Added _locals_ parameter for passing environment to the python REPL

0.2.1 (2016-01-03)
- Fixed import in telnet cli in #12 (thanks @hellysmile)

0.2.0 (2016-01-01)
- Added basic documentation
- Most methods of the Monitor class are now not private API

0.1.0 (2016-12-14)
- Added missed LICENSE file
- Updated API, added start_monitor() function

0.0.3 (2016-12-11)
- Fixed README.rst

0.0.2 (2016-12-11)
- Tests are more stable now
- Added a simple tutorial to README.rst

0.0.1 (2016-12-10)
- Initial release.
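The introduction notes that the monitor runs in a separate thread, so it keeps working even if the event loop is blocked. A stdlib-only sketch of that idea (this is an illustration, not aiomonitor's code): a daemon thread samples the loop's task set while the loop thread is deliberately blocked.

```python
import asyncio
import threading
import time

def monitor_thread(loop, samples):
    # Runs in its own thread, so it stays responsive even while the
    # event loop thread is blocked -- the same idea aiomonitor relies on.
    for _ in range(5):
        samples.append(len(asyncio.all_tasks(loop)))
        time.sleep(0.05)

async def main(samples):
    loop = asyncio.get_running_loop()
    t = threading.Thread(target=monitor_thread, args=(loop, samples), daemon=True)
    t.start()
    time.sleep(0.3)  # deliberately block the event loop thread
    t.join(1.0)

samples = []
asyncio.run(main(samples))
print(samples)  # e.g. [1, 1, 1, 1, 1] -- sampled while the loop was blocked
```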
aiomonitor-ng
aiomonitor-ngaiomonitor-ngis a (temporary) fork ofaiomonitorwith support for Python 3.10+ and additional usability & debuggability improvements.aiomonitoris a module that adds monitor and cli capabilities forasyncioapplications. Idea and code were borrowed fromcurioproject. Task monitor that runs concurrently to theasyncioloop (or fast drop-in replacementuvloop) in a separate thread as result monitor will work even if the event loop is blocked for some reason.This library provides a python console usingaioconsolemodule. It is possible to execute asynchronous commands inside your running application. Extensible with you own commands, in the style of the standard library’scmdmoduleInstallationInstallation process is simple, just:$ pip install aiomonitor-ngExampleMonitor has context manager interface:importaiomonitorasyncdefmain():loop=asyncio.get_running_loop()run_forever=loop.create_future()withaiomonitor.start_monitor(loop):awaitrun_forevertry:asyncio.run(main())exceptKeyboardInterrupt:passNow from separate terminal it is possible to connect to the application:$ telnet localhost 50101or the included python client:$ python -m aiomonitor.cliTutorialLet’s create a simpleaiohttpapplication, and see howaiomonitorcan be integrated with it.importasyncioimportaiomonitorfromaiohttpimportweb# Simple handler that returns response after 100sasyncdefsimple(request):loop=request.app.loopprint('Start sleeping')awaitasyncio.sleep(100,loop=loop)returnweb.Response(text="Simple answer")loop=asyncio.get_event_loop()# create application and register routeapp=web.Application(loop=loop)app.router.add_get('/simple',simple)# it is possible to pass a dictionary with local variables# to the python console environmenthost,port="localhost",8090locals_={"port":port,"host":host}# init monitor just before run_appwithaiomonitor.start_monitor(loop=loop,locals=locals_):# run application with built-in aiohttp run_app functionweb.run_app(app,port=port,host=host)Let’s save this code in 
filesimple_srv.py, so we can run it with the following command:$ python simple_srv.py ======== Running on http://localhost:8090 ======== (Press CTRL+C to quit)And now one can connect to a running application from a separate terminal, with thetelnetcommand, andaiomonitorwill immediately respond with prompt:$ telnet localhost 50101 Asyncio Monitor: 1 tasks running Type help for commands monitor >>>Now you can type commands, for instance,help:monitor >>> help Usage: help [OPTIONS] COMMAND [ARGS]... To see the usage of each command, run them with "--help" option. Commands: cancel Cancel an indicated task console Switch to async Python REPL exit (q,quit) Leave the monitor client session help (?,h) Show the list of commands ps (p) Show task table ps-terminated (pst,pt) List recently terminated/cancelled tasks signal Send a Unix signal stacktrace (st,stack) Print a stack trace from the event loop thread where (w) Show stack frames and the task creation chain of a task where-terminated (wt) Show stack frames and the termination/cancellation chain of a taskaiomonitoralso supports async python console inside a running event loop so you can explore the state of your application:monitor >>> console Python 3.10.7 (main, Sep 9 2022, 12:31:20) [Clang 13.1.6 (clang-1316.0.21.2.5)] on darwin Type "help", "copyright", "credits" or "license" for more information. --- This console is running in an asyncio event loop. It allows you to wait for coroutines using the 'await' syntax. Try: await asyncio.sleep(1, result=3) --- >>> await asyncio.sleep(1, result=3) 3 >>>To leave the console typeexit()or press Ctrl+D:>>> exit() ✓ The console session is closed. monitor >>>ExtensionAdditional console variablesYou may add more variables that can be directly referenced in theconsolecommand. Referthe console-variables example codeCustom console commandsaiomonitoris very easy to extend with your own console commands. 
Referthe extension example codeRequirementsPython3.8+ (3.10.7+ recommended)aioconsoleClickprompt_toolkituvloop(optional)CHANGES0.7.1 (2023-04-16)Support Python 3.11 properly by allowing the optionalnameandcontextkwargs passed toasyncio.create_task()in the hooked task factory function (#10)Update development dependencies0.7.0 (2022-10-19)Selective persistent termination logs (#9)Implement cancellation chain tracker (#8)Trigger auto-completion only when Tab is pressedSupport auto-completion of commands and arguments (#7)Add missing explicit dependency to Click0.6.0 (2022-09-26)Promoteconsole_localsas public attrReimplement console command (#6)Migrate to Click-based command line interface (#5)Adoptprompt_toolkitand support concurrent clients (#4)Show the total number of tasks when executingps(#3)Apply black, isort, mypy, flake8 and automate CI workflows using GitHub Actions0.5.1 (2022-08-29)Fix the task creation location in the ‘ps’ command output0.5.0 (2022-08-26)Made it compatible with Python 3.10Added the task creation stack chain display to the ‘where’ command by setting a custom task factory (#1)Changed the ‘ps’ command view to be more concise and display many tasks in a better way (#2)0.4.5 (2019-11-03)Fixed endless loop on EOF (thanks @apatrushev)0.4.4 (2019-03-23)Simplified python console start end #175Added python 3.7 compatibility #1760.4.3 (2019-02-02)Reworked console server start/close logic #1690.4.2 (2019-01-13)Fixed issue with type annotations from 0.4.1 release #1640.4.1 (2019-01-10)Fixed Python 3.5 support #161 (thanks @bmerry)0.4.0 (2019-01-04)Added support for custom commands #133 (thanks @yggdr)Fixed OptLocals being passed as the default value for “locals” #122 (thanks @agronholm)Added an API inspired by the standard library’s cmd module #135 (thanks @yggdr)Correctly report the port running aioconsole #124 (thanks @bmerry)0.3.1 (2018-07-03)Added the stacktrace command #120 (thanks @agronholm)0.3.0 (2017-09-08)Added _locals_ parameter for passing 
environment to the python REPL

0.2.1 (2016-01-03)
- Fixed import in telnet cli in #12 (thanks @hellysmile)

0.2.0 (2016-01-01)
- Added basic documentation
- Most methods of the Monitor class are now not private API

0.1.0 (2016-12-14)
- Added missed LICENSE file
- Updated API, added start_monitor() function

0.0.3 (2016-12-11)
- Fixed README.rst

0.0.2 (2016-12-11)
- Tests are more stable now
- Added a simple tutorial to README.rst

0.0.1 (2016-12-10)
- Initial release.
aiomono
AIOMono (Alpha)

`aiomono` is a fully asynchronous library for the Monobank API, written in Python 3.8 with `asyncio`, `aiohttp`, and `pydantic`.

Setup

Get a token for your client from Monobank API. Install the latest version of `aiomono`:

```
pip install aiomono
```

Examples

We have 3 different classes for using the Monobank API:

- `MonoClient` is a simple base class for the others; it can only get currencies
- `PersonalMonoClient` - talks to the personal Monobank API
- `CorporateMonoClient` - talks to the corporate Monobank API (soon)

Simple `get_currency` request:

```python
import asyncio

from aiomono import MonoClient

mono_client = MonoClient()

async def main():
    async with mono_client as client:
        client_info = await client.get_currency()
        print(client_info)

asyncio.run(main())
```

`client_info` request:

```python
import asyncio

from aiomono import PersonalMonoClient

MONOBANK_API_TOKEN = 'your token'

async def main():
    try:
        mono_client = PersonalMonoClient(MONOBANK_API_TOKEN)
        client_info = await mono_client.client_info()
        print(f'User name {client_info.name} 😍')
    finally:
        await mono_client.close()

asyncio.run(main())
```

Resources: # TODO
aiomonobank
Asynchronous Python library for the monobank API

Setup

Get a token for your client from Monobank API. Install the latest version of `aiomonobank`:

```
pip install aiomonobank
```

Examples

We currently have 2 different classes for using the Monobank API:

- `MonoPublic` is a simple base class for the others; it can only get currencies
- `MonoPersonal` - talks to the personal Monobank API

`get_currency` request:

```python
import json
import asyncio

from aiomonobank import MonoPublic, types

async def main():
    async with MonoPublic() as mono_client:
        currencies: list[types.Currency] = await mono_client.get_currency()
        for currency in currencies:
            print(currency)

if __name__ == '__main__':
    asyncio.run(main())
```

`get_client_info` request:

```python
import asyncio

from aiomonobank import MonoPersonal

MONOBANK_API_TOKEN = 'your_token'

async def main():
    mono_client = MonoPersonal(MONOBANK_API_TOKEN)
    try:
        client_info = await mono_client.get_client_info()
        print(f"Client name: {client_info.name}")
        print(client_info)
    finally:
        await mono_client.close()

if __name__ == '__main__':
    asyncio.run(main())
```

`get_statement` request:

```python
import asyncio
from datetime import datetime, timedelta

from aiomonobank import MonoPersonal

MONOBANK_API_TOKEN = 'your_token'

async def main():
    mono_client = MonoPersonal(MONOBANK_API_TOKEN)
    try:
        transactions = await mono_client.get_statement(
            account_id='0',
            from_datetime=datetime.utcnow() - timedelta(days=3),
            to_datetime=datetime.utcnow() - timedelta(days=2))
        for transaction in transactions:
            print(transaction)
    finally:
        await mono_client.close()

if __name__ == '__main__':
    asyncio.run(main())
```

Resources:

- PyPI: aiomonobank
- Documentation: (soon)
aiomonobnk
Aiomonobnk

Async Python 3.11 Monobank API.

Introduction

Aiomonobnk is a Python lib for:

- MonoPay - Monobank acquiring
  - official docs: https://api.monobank.ua/docs/acquiring.html
  - lib docs: https://github.com/yeghorkikhai/mbnk/blob/master/docs/monopay_api.md

GitHub: https://github.com/yeghorkikhai/aiomonobnk

Installation

```
pip install aiomonobnk
```

Get started with the Monobank Open API:

```python
import os
import asyncio
from aiomonobnk import MonoPay

async def main():
    async_mono = MonoPay(token=os.getenv("MONOBANK_API_TOKEN"))
    invoice = await async_mono.create_invoice(...)

if __name__ == "__main__":
    asyncio.run(main())
```
aiomothr
aiomothr

Installation

```
pip install aiomothr
```

Usage

Basic example submitting a job request:

```python
from aiomothr import AsyncJobRequest

request = AsyncJobRequest(service="echo")
request.add_parameter(value="Hello MOTHR!")
result = await request.run_job()
print(result)
```

Submitting a job request using AsyncMothrClient. This allows you to reuse the client connection when making multiple requests:

```python
from mothrpy import AsyncJobRequest, AsyncMothrClient

client = AsyncMothrClient()

# Send one request
request = AsyncJobRequest(client=client, service="echo")
request.add_parameter(value="Hello MOTHR!")
result = await request.run_job()
print(result)

# Reuse the client in another request
request = AsyncJobRequest(client=client, service="echo")
request.add_parameter(value="Hello again MOTHR!")
result = await request.run_job()
print(result)
```

Submit concurrent job requests:

```python
import asyncio
from mothrpy import AsyncJobRequest, AsyncMothrClient

client = AsyncMothrClient()

request_a = AsyncJobRequest(client=client, service="echo")
request_a.add_parameter(value="Hello MOTHR!")
request_b = AsyncJobRequest(client=client, service="echo")
request_b.add_parameter(value="Hello again MOTHR!")

# Execute both requests concurrently
tasks = [request_a.run_job(), request_b.run_job()]
results = await asyncio.gather(*tasks)
for result in results:
    print(result)
```
aiomox
aiomoxClient library for interacting with MOX LT smart home devices
aiompd
Usage example:

```python
import asyncio

import aiompd

URLS = [
    "http://mega5.fast-serv.com:8134",
    "http://176.31.240.114:8326",
    "http://74.86.186.4:10042",
    "http://s14.myradiostream.com:4668",
]
PLAY_TIME = 10  # placeholder: the original value was lost to e-mail obfuscation in the source

@asyncio.coroutine
def nexter(mpc):
    yield from mpc.clear()
    for url in URLS:
        yield from mpc.add(url)
    for n in range(len(URLS)):
        yield from mpc.play(track=n)
        yield from asyncio.sleep(PLAY_TIME)

@asyncio.coroutine
def volumer(mpc):
    timeout = (len(URLS) * PLAY_TIME) / 200
    for volume in range(0, 101, 1):
        yield from mpc.set_volume(volume)
        yield from asyncio.sleep(timeout)
    for volume in range(100, -1, -1):
        yield from mpc.set_volume(volume)
        yield from asyncio.sleep(timeout)

def main():
    loop = asyncio.get_event_loop()
    mpc = loop.run_until_complete(aiompd.Client.make_connection())
    loop.run_until_complete(asyncio.wait([nexter(mpc), volumer(mpc)]))

if __name__ == '__main__':
    main()
```
aiompesa
aiompesa

A package for accessing the MPESA Daraja API from asyncio.

Usage

```python
import asyncio
from aiompesa import Mpesa

CONSUMER_KEY = "nF4OwB2XiuYZwmdMz3bovnzw2qMls1b7"
CONSUMER_SECRET = "biIImmaAX9dYD4Pw"

loop = asyncio.get_event_loop()
mpesa = Mpesa(True, CONSUMER_KEY, CONSUMER_SECRET)

token_response = loop.run_until_complete(mpesa.generate_token())
access_token = token_response.get("access_token", None)
expires_in = token_response.get("expires_in", None)
if access_token is None:
    print("Error: Wrong credentials used to get the access_token")
else:
    print(f"access_token = {access_token}, expires_in = {expires_in} secs")
```

Requirements

- Python 3.6+

Installation

```
$ pip install aiompesa
```

Motivation

- To learn a little more about asyncio and put it into practice.
- To develop an async wrapper for the Safaricom Daraja API.

Contribution

Follow the contribution guidelines.
aio-mpv-jsonipc
aio-mpv-jsonipc

Check the docstrings for documentation.

PyPI
aiomq
aiomq

Every client is a node; in essence there is no primary/secondary distinction from the server, only a difference in position, and therefore in the code each side runs. Data exchange between nodes is meant to take many forms. A progressive service for building asynchronous tasks, used to quickly construct async and scheduled tasks. Supported structures: queues, subscribe/listen, dispatch, and RPC. Includes a visual task-pool dashboard with online devices, manual triggering, task dispatch, and task reporting.

Requirements: Python 3.8.5 or later, aiohttp.

Version 0.2.0: basic event task registration and dispatch, a basic dashboard, multi-client support.

Suggestions and collaboration are welcome. Contact: [email protected]

From the asyncio documentation on protocol classes:

- class asyncio.Protocol: the base class for implementing streaming protocols (for use with e.g. TCP and SSL transports).
- class asyncio.DatagramProtocol: the base class for implementing datagram protocols (for use with e.g. UDP transports).
- class asyncio.SubprocessProtocol: the base class for implementing protocols communicating with child processes (through a set of unidirectional pipes).
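The asyncio protocol classes quoted above are straightforward to use directly. Here is a minimal, standard-library-only sketch of a streaming `asyncio.Protocol` (this illustrates the stdlib API, not aiomq's own interface):

```python
import asyncio

class EchoProtocol(asyncio.Protocol):
    """Minimal streaming protocol: writes every received chunk back to the peer."""

    def connection_made(self, transport):
        # Called once per connection; keep the transport to write replies.
        self.transport = transport

    def data_received(self, data):
        # Called with raw bytes as they arrive on the TCP stream.
        self.transport.write(data)

async def serve(host="127.0.0.1", port=8888):
    loop = asyncio.get_running_loop()
    # The event loop instantiates one EchoProtocol per incoming connection.
    server = await loop.create_server(EchoProtocol, host, port)
    async with server:
        await server.serve_forever()
```

Protocols never touch the socket directly; the event loop hands them a transport, which is what makes the same protocol class reusable over TCP, SSL, or Unix sockets.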
aiomql
Aiomql - Bot Building Framework and Asynchronous MetaTrader5 Library

Installation

```
pip install aiomql
```

Key Features

- Asynchronous Python library for MetaTrader5
- Asynchronous bot-building framework
- Build bots for trading in different financial markets using a bot factory
- Use thread-pool executors to run multiple strategies on multiple instruments concurrently
- Records and keeps track of trades and strategies in CSV files
- Helper classes for bot building; easy to use and extend
- Compatible with pandas-ta
- Sample pre-built strategies
- Manage trading periods using sessions
- Risk management
- Run multiple bots concurrently with different accounts from the same broker or different brokers

As an asynchronous MetaTrader5 library:

```python
import asyncio
from aiomql import MetaTrader

async def main():
    mt5 = MetaTrader()
    await mt5.initialize()
    await mt5.login(123456, '*******', 'Broker-Server')
    symbols = await mt5.symbols_get()
    print(symbols)

asyncio.run(main())
```

As a bot-building framework, using a sample strategy. The following code is a sample bot that uses the FingerTrap strategy from the library. It assumes you have a config file named aiomql.json in the same directory as the script, containing the login details for your account. It demonstrates the use of sessions and risk management. Sessions let you specify the trading period for a strategy and an action to perform when a session ends. Risk management lets you set the risk per trade and the risk-to-reward ratio. The trader class handles order placement and risk management; it is an attribute of the strategy class.

```python
from datetime import time
import logging

from aiomql import Bot, ForexSymbol, FingerTrap, Session, Sessions, RAM, SimpleTrader, TimeFrame

logging.basicConfig(level=logging.INFO)

def build_bot():
    bot = Bot()

    # create sessions for the strategies
    london = Session(name='London', start=8, end=time(hour=15, minute=30), on_end='close_all')
    new_york = Session(name='New York', start=13, end=time(hour=20, minute=30))
    tokyo = Session(name='Tokyo', start=23, end=time(hour=6, minute=30))

    # configure the parameters and the trader for a strategy
    params = {'trend_candles_count': 500, 'fast_period': 8, 'slow_period': 34, 'etf': TimeFrame.M5}
    gbpusd = ForexSymbol(name='GBPUSD')
    st1 = FingerTrap(symbol=gbpusd, params=params,
                     trader=SimpleTrader(symbol=gbpusd, ram=RAM(risk=0.05, risk_to_reward=2)),
                     sessions=Sessions(london, new_york))

    # use the defaults for the other strategies
    st2 = FingerTrap(symbol=ForexSymbol(name='AUDUSD'), sessions=Sessions(tokyo, new_york))
    st3 = FingerTrap(symbol=ForexSymbol(name='USDCAD'), sessions=Sessions(new_york))
    st4 = FingerTrap(symbol=ForexSymbol(name='USDJPY'), sessions=Sessions(tokyo))
    st5 = FingerTrap(symbol=ForexSymbol(name='EURGBP'), sessions=Sessions(london))

    # sessions are not required
    st6 = FingerTrap(symbol=ForexSymbol(name='EURUSD'))

    # add strategies to the bot
    bot.add_strategies([st1, st2, st3, st4, st5, st6])
    bot.execute()

# run the bot
build_bot()
```

API Documentation

See the API Documentation for more details.

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Support

Feeling generous, like the package, or want to see it become more mature? Consider supporting the project by buying me a coffee.
aiomqtt
The idiomatic asyncio MQTT client 🙌 (formerly known as asyncio-mqtt)

Write code like this:

Publish:

```python
async with Client("test.mosquitto.org") as client:
    await client.publish("humidity/outside", payload=0.38)
```

Subscribe:

```python
async with Client("test.mosquitto.org") as client:
    await client.subscribe("humidity/#")
    async for message in client.messages:
        print(message.payload)
```

aiomqtt combines the stability of the time-proven paho-mqtt library with an idiomatic asyncio interface:

- No more callbacks! 👍
- No more return codes (welcome to the MqttError)
- Graceful disconnection (forget about on_unsubscribe, on_disconnect, etc.)
- Supports MQTT versions 5.0, 3.1.1 and 3.1
- Fully type-hinted
- Did we mention no more callbacks?

Read the documentation at sbtinstruments.github.io/aiomqtt

Installation

aiomqtt can be installed via `pip install aiomqtt`. The only dependency is paho-mqtt.

If you can't wait for the latest version, you can install aiomqtt directly from GitHub with:

```
pip install git+https://github.com/sbtinstruments/aiomqtt
```

Note for Windows users

Since Python 3.8, the default asyncio event loop is the ProactorEventLoop. Said loop doesn't support the add_reader method that is required by aiomqtt. Please switch to an event loop that supports the add_reader method, such as the built-in SelectorEventLoop:

```python
# Change to the "Selector" event loop if platform is Windows
if sys.platform.lower() == "win32" or os.name.lower() == "nt":
    from asyncio import set_event_loop_policy, WindowsSelectorEventLoopPolicy
    set_event_loop_policy(WindowsSelectorEventLoopPolicy())

# Run your async application as usual
asyncio.run(main())
```

License

This project is licensed under the BSD 3-clause License.

Note that the underlying paho-mqtt library is dual-licensed. One of the licenses is the so-called Eclipse Distribution License v1.0. It is almost word-for-word identical to the BSD 3-clause License. The only differences are:

- One use of "COPYRIGHT OWNER" (EDL) instead of "COPYRIGHT HOLDER" (BSD)
- One use of "Eclipse Foundation, Inc." (EDL) instead of "copyright holder" (BSD)

Contributing

We're very happy about contributions to aiomqtt! ✨ You can get started by reading CONTRIBUTING.md.

Versioning

This project adheres to Semantic Versioning. Breaking changes will only occur in major X.0.0 releases.

Changelog

The changelog lives in CHANGELOG.md. It follows the principles of Keep a Changelog.

Related projects

Is aiomqtt not what you're looking for? There are a few other clients you can try:

- paho-mqtt: Synchronous client
- micropython-mqtt: Asynchronous client for microcontrollers in MicroPython
- gmqtt: Asynchronous client
- fastapi-mqtt: Asynchronous wrapper around gmqtt; simplifies integration with FastAPI
- amqtt: Asynchronous client; includes a broker
- trio-paho-mqtt: Asynchronous wrapper around paho-mqtt; based on trio instead of asyncio
aio-mqtt
About

Asynchronous MQTT client for protocol version 3.1.1.

Installation

Recommended way (via pip):

```
$ pip install aio-mqtt
```

Example

Simple echo server:

```python
import asyncio as aio
import logging
import typing as ty

import aio_mqtt

logger = logging.getLogger(__name__)


class EchoServer:
    def __init__(
            self,
            reconnection_interval: int = 10,
            loop: ty.Optional[aio.AbstractEventLoop] = None
    ) -> None:
        self._reconnection_interval = reconnection_interval
        self._loop = loop or aio.get_event_loop()
        self._client = aio_mqtt.Client(loop=self._loop)
        self._tasks = [
            self._loop.create_task(self._connect_forever()),
            self._loop.create_task(self._handle_messages())
        ]

    async def close(self) -> None:
        for task in self._tasks:
            if task.done():
                continue
            task.cancel()
            try:
                await task
            except aio.CancelledError:
                pass
        if self._client.is_connected():
            await self._client.disconnect()

    async def _handle_messages(self) -> None:
        async for message in self._client.delivered_messages('in'):
            while True:
                try:
                    await self._client.publish(
                        aio_mqtt.PublishableMessage(
                            topic_name='out',
                            payload=message.payload,
                            qos=aio_mqtt.QOSLevel.QOS_1
                        )
                    )
                except aio_mqtt.ConnectionClosedError as e:
                    logger.error("Connection closed", exc_info=e)
                    await self._client.wait_for_connect()
                    continue
                except Exception as e:
                    logger.error("Unhandled exception during echo message publishing", exc_info=e)
                break

    async def _connect_forever(self) -> None:
        while True:
            try:
                connect_result = await self._client.connect('localhost')
                logger.info("Connected")
                await self._client.subscribe(('in', aio_mqtt.QOSLevel.QOS_1))
                logger.info("Wait for network interruptions...")
                await connect_result.disconnect_reason
            except aio.CancelledError:
                raise
            except aio_mqtt.AccessRefusedError as e:
                logger.error("Access refused", exc_info=e)
            except aio_mqtt.ConnectionLostError as e:
                logger.error("Connection lost. Will retry in %d seconds",
                             self._reconnection_interval, exc_info=e)
                await aio.sleep(self._reconnection_interval, loop=self._loop)
            except aio_mqtt.ConnectionCloseForcedError as e:
                logger.error("Connection close forced", exc_info=e)
                return
            except Exception as e:
                logger.error("Unhandled exception during connecting", exc_info=e)
                return
            else:
                logger.info("Disconnected")
                return


if __name__ == '__main__':
    logging.basicConfig(level='DEBUG')
    loop = aio.new_event_loop()
    server = EchoServer(reconnection_interval=10, loop=loop)
    try:
        loop.run_forever()
    except KeyboardInterrupt:
        pass
    finally:
        loop.run_until_complete(server.close())
        loop.run_until_complete(loop.shutdown_asyncgens())
        loop.close()
```

License

Copyright 2019-2020 Not Just A Toy Corp.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
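The `_connect_forever` coroutine above is a reusable pattern: attempt to connect, wait for the disconnect reason, sleep, retry. A library-agnostic reduction of that loop (the function name, `interval`, and `max_attempts` parameters are my own, not part of aio-mqtt):

```python
import asyncio

async def connect_forever(connect, interval=10.0, max_attempts=None):
    """Retry `connect()` until it succeeds, sleeping `interval` seconds
    between failed attempts.

    `connect` is any zero-argument coroutine function; its result is
    returned on the first success. If `max_attempts` is set, the last
    exception is re-raised once the attempt budget is exhausted.
    """
    attempt = 0
    while True:
        attempt += 1
        try:
            return await connect()
        except asyncio.CancelledError:
            raise  # never swallow cancellation
        except Exception:
            if max_attempts is not None and attempt >= max_attempts:
                raise
            await asyncio.sleep(interval)
```

The echo server additionally distinguishes fatal errors (access refused, forced close) from transient ones; a production version of this helper would take a tuple of retryable exception types rather than catching bare `Exception`.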
aio-mqtt-mod
About

Asynchronous MQTT client for protocol version 3.1.1 (mod). Because the original repo is abandoned, this fork adds support for Python >= 3.10.

Installation

Recommended way (via pip):

```
$ pip install aio-mqtt-mod
```

Example

Simple echo server:

```python
import asyncio as aio
import logging
import typing as ty

import aio_mqtt

logger = logging.getLogger(__name__)


class EchoServer:
    def __init__(
            self,
            reconnection_interval: int = 10,
            loop: ty.Optional[aio.AbstractEventLoop] = None
    ) -> None:
        self._reconnection_interval = reconnection_interval
        self._loop = loop or aio.get_event_loop()
        self._client = aio_mqtt.Client(loop=self._loop)
        self._tasks = [
            self._loop.create_task(self._connect_forever()),
            self._loop.create_task(self._handle_messages())
        ]

    async def close(self) -> None:
        for task in self._tasks:
            if task.done():
                continue
            task.cancel()
            try:
                await task
            except aio.CancelledError:
                pass
        if self._client.is_connected():
            await self._client.disconnect()

    async def _handle_messages(self) -> None:
        async for message in self._client.delivered_messages('in'):
            while True:
                try:
                    await self._client.publish(
                        aio_mqtt.PublishableMessage(
                            topic_name='out',
                            payload=message.payload,
                            qos=aio_mqtt.QOSLevel.QOS_1
                        )
                    )
                except aio_mqtt.ConnectionClosedError as e:
                    logger.error("Connection closed", exc_info=e)
                    await self._client.wait_for_connect()
                    continue
                except Exception as e:
                    logger.error("Unhandled exception during echo message publishing", exc_info=e)
                break

    async def _connect_forever(self) -> None:
        while True:
            try:
                connect_result = await self._client.connect('localhost')
                logger.info("Connected")
                await self._client.subscribe(('in', aio_mqtt.QOSLevel.QOS_1))
                logger.info("Wait for network interruptions...")
                await connect_result.disconnect_reason
            except aio.CancelledError:
                raise
            except aio_mqtt.AccessRefusedError as e:
                logger.error("Access refused", exc_info=e)
            except aio_mqtt.ConnectionLostError as e:
                logger.error("Connection lost. Will retry in %d seconds",
                             self._reconnection_interval, exc_info=e)
                await aio.sleep(self._reconnection_interval, loop=self._loop)
            except aio_mqtt.ConnectionCloseForcedError as e:
                logger.error("Connection close forced", exc_info=e)
                return
            except Exception as e:
                logger.error("Unhandled exception during connecting", exc_info=e)
                return
            else:
                logger.info("Disconnected")
                return


if __name__ == '__main__':
    logging.basicConfig(level='DEBUG')
    loop = aio.new_event_loop()
    server = EchoServer(reconnection_interval=10, loop=loop)
    try:
        loop.run_forever()
    except KeyboardInterrupt:
        pass
    finally:
        loop.run_until_complete(server.close())
        loop.run_until_complete(loop.shutdown_asyncgens())
        loop.close()
```

License

Copyright 2019-2020 Not Just A Toy Corp.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
aiomsa
aiomsa

aiomsa is a Python 3.7+ framework built using asyncio. At its core, aiomsa provides a simple and standardized way to write xApps that can be deployed as microservices in Python.

Installation

aiomsa can be installed from PyPI:

```
pip install aiomsa
```

You can also get the latest code from GitHub:

```
poetry add git+https://github.com/facebookexternal/aiomsa
```

Getting Started

The following example shows how to use aiomsa to create a simple xApp for subscribing to the E2T service for a particular custom service model:

```python
import asyncio

import aiomsa
import aiomsa.abc
from onos_ric_sdk_py import E2Client, SDLClient

from .models import MyModel


async def run(e2: aiomsa.abc.E2Client, e2_node_id: str) -> None:
    subscription = await e2.subscribe(
        e2_node_id,
        service_model_name="my_model",
        service_model_version="v1",
        subscription_id="my_app-my_model-sub",
        trigger=bytes(MyModel(param="foo")),
        actions=[
            aiomsa.abc.RICAction(
                id=1,
                type=aiomsa.abc.RICActionType.REPORT,
                subsequent_action_type=aiomsa.abc.RICSubsequentActionType.CONTINUE,
                time_to_wait=aiomsa.abc.RICTimeToWait.ZERO,
            )
        ],
    )

    async for (_header, message) in subscription:
        print(message)


async def main() -> None:
    async with E2Client(app_id="my_app", e2t_endpoint="e2t:5150") as e2, SDLClient(
        topo_endpoint="topo:5150"
    ) as sdl:
        async for e2_node in sdl.watch_e2_connections():
            asyncio.create_task(run(e2, e2_node.id))


if __name__ == "__main__":
    aiomsa.run(main())
```
aiomsg
aiomsg

Pure-Python smart sockets (like ZMQ) for simpler networking.

Attribution: And1mu [CC BY-SA 4.0 (https://creativecommons.org/licenses/by-sa/4.0)]

Table of Contents

- Demo
- Inspiration
- Introduction
- Cookbook
  - Publish from either the bind or connect end
  - Distribute messages to a dynamically-scaled service (multiple instances)
  - Distribute messages from a 2-instance service to a dynamically-scaled one
  - Distribute messages from one dynamically-scaled service to another
  - Two dynamically-scaled services, with a scaled fan-in, fan-out proxy
  - Secure connections with mutual TLS
  - Crash course on setting up an SSLContext
- FAQ
  - Why do you spell Søcket like that?
  - I want to talk to the aiomsg Søcket with a different programming language
- Developer setup

Demo

Let's make two microservices; one will send the current time to the other. Here's the end that binds to a port (a.k.a. the "server"):

```python
import asyncio, time
from aiomsg import Søcket

async def main():
    async with Søcket() as sock:
        await sock.bind('127.0.0.1', 25000)
        while True:
            await sock.send(time.ctime().encode())
            await asyncio.sleep(1)

asyncio.run(main())
```

Running as a different process, here is the end that does the connecting (a.k.a. the "client"):

```python
import asyncio
from aiomsg import Søcket

async def main():
    async with Søcket() as sock:
        await sock.connect('127.0.0.1', 25000)
        async for msg in sock.messages():
            print(msg.decode())

asyncio.run(main())
```

Note that these are both complete, runnable programs, not fragments.

Looks a lot like conventional socket programming, except that these sockets have a few extra tricks. These are described in more detail further down in the rest of this document.

Inspiration

Looks a lot like ZeroMQ, yes? No? Well, if you don't know anything about ZeroMQ, that's fine too. The rest of this document will assume that you don't know anything about ZeroMQ. aiomsg is heavily influenced by ZeroMQ.

There are some differences; hopefully they make things simpler than zmq.
For one thing, aiomsg is pure Python, so no compilation step is required, and it relies only on the Python standard library (and that won't change).

Also, we don't have special kinds of socket pairs like ZeroMQ has. There is only the one Søcket class. The only role distinction you need to make between different socket instances is this: some sockets will bind and others will connect. This is the leaky part of the API that comes from the underlying BSD socket API. A bind socket will bind to a local interface and port. A connect socket must connect to a bind socket, which can be on the same machine or a remote machine. This is the only complicated bit. You must decide, in a distributed microservices architecture, which sockets must bind and which must connect. A useful heuristic is that the service which is more likely to require horizontal scaling should have the connect sockets. This is because the hostnames to which they will connect (these will be the bind sockets) will be long-lived.

Introduction

What you see above in the demo is pretty much a typical usage of network sockets. So what's special about aiomsg? These are the high-level features:

Messages, not streams: send and receive are message-based, not stream-based. Much easier! This does mean that if you want to transmit large amounts of data, you're going to have to break it up yourself, send the pieces, and put them back together on the other side.

Automatic reconnection: these sockets automatically reconnect. You don't have to write special code for it. If the bind end (a.k.a. "server") is restarted, the connecting end will automatically reconnect. This works in either direction. Try it! Run the demo code and kill one of the processes, then start it up again. The connection will get re-established.

Many connections on a single "socket": the bind end can receive multiple connections, but you do all your .send() and .recv() calls on a single object.
(No callback handlers or protocol objects.)

More impressive is that the connecting end is exactly the same: it can make outgoing connect() calls to multiple peers (bind sockets), and you make all your send() and recv() calls on a single object. This will be described in more detail further on in this document.

Message distribution patterns: receiving messages is pretty simple, new messages just show up (remember that messages from all connected peers come through the same call):

```python
async with Søcket() as sock:
    await sock.bind()
    async for msg in sock.messages():
        print(f"Received: {msg}")
```

However, when sending messages you have choices. The choices affect which peers get the message. The options are:

- Publish: every connected peer is sent a copy of the message.
- Round-robin: each connected peer is sent a unique message; the messages are distributed to each connection in a circular pattern.
- By peer identity: you can also send to a specific peer by using its identity directly.

The choice between pub-sub and round-robin must be made when creating the Søcket():

```python
from aiomsg import Søcket, SendMode

async with Søcket(send_mode=SendMode.PUBLISH) as sock:
    await sock.bind()
    async for msg in sock.messages():
        await sock.send(msg)
```

This example receives a message from any connected peer, and sends that same message to every connected peer (including the original sender). By changing PUBLISH to ROUNDROBIN, the message distribution pattern changes so that each "sent" message goes to only one connected peer.
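As an aside on the "Messages, not streams" feature above: since each send carries one discrete message, splitting and reassembling large payloads is left to the application. A minimal, library-agnostic sketch (the 64 KiB chunk size is an arbitrary choice, and ordered, complete delivery is assumed):

```python
def chunk_message(payload: bytes, chunk_size: int = 64 * 1024) -> list:
    """Split a large payload into fixed-size chunks for a message-based transport."""
    return [payload[i:i + chunk_size] for i in range(0, len(payload), chunk_size)]

def reassemble(chunks) -> bytes:
    """Recombine the chunks on the receiving side (assumes in-order, complete delivery)."""
    return b"".join(chunks)
```

Each chunk would be passed to sock.send() individually; if out-of-order delivery is possible (see the note on AT_LEAST_ONCE retries later), a real implementation would prefix each chunk with a sequence number before sending.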
The next "sent" message will go to a different connected peer, and so on.

For identity-based message sending, that's available any time, regardless of what you choose for the send_mode parameter; for example:

```python
import asyncio
from aiomsg import Søcket, SendMode

async def main():
    async with Søcket() as sock1, Søcket(send_mode=SendMode.PUBLISH) as sock2:
        await sock1.bind(port=25000)
        await sock2.bind(port=25001)
        while True:
            peer_id, message = await sock1.recv_identity()
            msg_id, _, data = message.partition(b"\x00")
            await sock2.send(data)
            await sock1.send(msg_id + b"\x00ok", identity=peer_id)

asyncio.run(main())
```

This example shows how you can receive messages on one socket (sock1, which could have thousands of connected peers), and relay those messages to thousands of other peers connected on a different socket (sock2). For this example, the send_mode of sock1 doesn't matter, because if identity is specified in the send() call, send_mode is ignored completely.

Oh, and the example above is a complete, runnable program, which is pretty amazing!

Built-in heartbeating: because ain't nobody got time to mess around with TCP keepalive settings. The heartbeating is internal and opaque to your application code. You won't even know it's happening, unless you enable debug logs. Heartbeats are sent only during periods of inactivity, so they won't interfere with your application messages.

In theory, you really shouldn't need heartbeating, because TCP is a very robust protocol; but in practice, various intermediate servers and routers sometimes do silly things to your connection if they think a connection has been idle for too long. So, automatic heartbeating is baked in to let all intermediate hops know you want the connection to stay up, and if the connection goes down, you will know much sooner than the standard TCP keepalive timeout duration (which can be very long!).

If either a heartbeat or a message isn't received within a specific timeframe, that connection is destroyed.
Whichever peer is making the connect() call will then automatically try to reconnect, as discussed earlier.

Built-in reliability choices: ah, so what do "reliability choices" mean, exactly? It turns out that it's quite hard to send messages in a reliable way. Or, stated another way, it's quite hard to avoid dropping messages: one side sends and the other side never gets the message.

aiomsg already buffers messages when being sent. Consider the following example:

```python
from aiomsg import Søcket, SendMode

async with Søcket(send_mode=SendMode.PUBLISH) as sock:
    await sock.bind()
    while True:
        await sock.send(b'123')
        await asyncio.sleep(1.0)
```

The server above will send the bytes b"123" to all connected peers; but what happens if there are no connected peers? In this case the message will be buffered internally until there is at least one connected peer, and when that happens, all buffered messages will immediately be sent. To be clear, you don't have to do anything extra. This is just the normal behaviour, and it works the same with the ROUNDROBIN send mode.

Message buffering happens whenever there are no connected peers available to receive a message. Sounds great, right? Unfortunately, this is not quite enough to prevent messages from getting lost. It is still easy to have your process killed immediately after sending data into a kernel socket buffer, but right before the bytes actually get transmitted. In other words, your code thinks the message got sent, but it didn't actually get sent.

The only real solution for adding robustness is to have peers reply to you saying that they received the message. Then, if you never receive this notification, you should assume that the message might not have been received, and send it again. aiomsg will do this for you (so again there is no work on your part), but you do have to turn it on.

This option is called the DeliveryGuarantee. The default option, which is just basic message buffering in the absence of any connected peers, is called DeliveryGuarantee.AT_MOST_ONCE.
It means, literally, that any "sent" message will be received by a connected peer no more than once (of course, it may also be received zero times, as described above).

The alternative is to set DeliveryGuarantee.AT_LEAST_ONCE, which enables the internal "retry" feature. It will be possible, under certain conditions, that any given message could be received more than once, depending on timing and situation. This is how the code looks if you enable it:

```python
from aiomsg import Søcket, SendMode, DeliveryGuarantee

async with Søcket(
        send_mode=SendMode.ROUNDROBIN,
        delivery_guarantee=DeliveryGuarantee.AT_LEAST_ONCE) as sock:
    await sock.bind()
    while True:
        await sock.send(b'123')
        await asyncio.sleep(1.0)
```

It's pretty much exactly the same as before, but we added the AT_LEAST_ONCE option. Note that AT_LEAST_ONCE does not work for the PUBLISH sending mode. (Would it make sense to enable?)

As a minor point, you should note that when AT_LEAST_ONCE is enabled, it does not mean that every send waits for acknowledgement before the next send. That would incur too much latency. Instead, there is a "reply checker" that runs on a timer, and if a reply hasn't been received for a particular message in a certain timeframe (5.0 seconds by default), that message will be sent again.

The connection may have gone down and back up within those 5 seconds, and there may be new messages buffered for sending before the retry send happens. In this case, the retry message will arrive after those buffered messages. This is a long way of saying that the way message reliability has been implemented can result in messages being received in a different order to what they were sent. In exchange for this, you get a lower overall latency, because sending new messages is not waiting on previous messages getting acknowledged.

- Pure Python, doesn't require a compiler
- Depends only on the Python standard library

Cookbook

The message distribution patterns are what make aiomsg powerful.
It is the way you connect up a whole bunch of microservices that brings the greatest leverage. We'll go through the different scenarios using a cookbook format.

In the code snippets that follow, you should assume that each snippet is a complete working program, except that some boilerplate is omitted. This is the basic template:

```python
import asyncio
from aiomsg import Søcket, SendMode, DeliveryGuarantee

<main() function>

asyncio.run(main())
```

Just substitute in the main() function from the snippets below to make the complete programs.

Publish from either the bind or connect end

The choice of "which peer should bind" is unaffected by the sending mode of the socket. Compare:

```python
# Publisher that binds
async def main():
    async with Søcket(send_mode=SendMode.PUBLISH) as sock:
        await sock.bind()
        while True:
            await sock.send(b'News!')
            await asyncio.sleep(1)
```

versus:

```python
# Publisher that connects
async def main():
    async with Søcket(send_mode=SendMode.PUBLISH) as sock:
        await sock.connect()
        while True:
            await sock.send(b'News!')
            await asyncio.sleep(1)
```

The same is true for the round-robin sending mode. You will usually choose the bind peer based on which service is least likely to require dynamic scaling. This means that the mental conception of socket peers as either a server or a client is not that useful.

Distribute messages to a dynamically-scaled service (multiple instances)

In this recipe, one service needs to send messages to another service that is horizontally scaled.

The trick here is that we don't want to use bind sockets on horizontally-scaled services, because other peers that need to make a connect call will need to know what hostname to use. Each instance in a horizontally-scaled service has a different IP address, and it becomes difficult to keep the "connect" side up to date about which peers are available. This can also change as the horizontally-scaled service increases or decreases the number of instances.
(In the ZeroMQ documentation, this is described as the Dynamic Discovery Problem.) aiomsg handles this very easily: just make sure that the dynamically-scaled service is making the connect calls.

This is the manually-scaled service (has a specific domain name):

```python
# jobcreator.py -> DNS for "jobcreator.com" should point to this machine.
async def main():
    async with Søcket(send_mode=SendMode.ROUNDROBIN) as sock:
        await sock.bind(hostname="0.0.0.0", port=25001)
        while True:
            await sock.send(b"job")
            await asyncio.sleep(1)
```

These are the downstream workers (don't need a domain name):

```python
# worker.py -> can be on any number of machines
async def main():
    async with Søcket() as sock:
        await sock.connect(hostname='jobcreator.com', port=25001)
        while True:
            work = await sock.recv()
            <do work>
```

With this code, after you start up jobcreator.py on the machine to which DNS resolves the domain name "jobcreator.com", you can start up multiple instances of worker.py on other machines, and work will get distributed among them. You can even change the number of worker instances dynamically, and everything will "just work", with the main instance distributing work out to all the connected workers in a circular pattern.

This core recipe provides a foundation on which many of the other recipes are built.

Distribute messages from a 2-instance service to a dynamically-scaled one

In this scenario, there are actually two instances of the job-creating service, not one. This would typically be done for reliability, and each instance would be placed in a different availability zone. Each instance will have a different domain name.

It turns out that the required setup follows directly from the previous one: you just add another connect call in the workers.

The manually-scaled service is as before, but you start one instance of jobcreator.py on machine "a.jobcreator.com", and start another on machine "b.jobcreator.com".
Obviously, it is DNS that is configured to point to the correct IP addresses of those machines (or you could use IP addresses directly, if these are internal services).

```python
# jobcreator.py -> Configure DNS to point to these instances
async def main():
    async with Søcket(send_mode=SendMode.ROUNDROBIN) as sock:
        await sock.bind(hostname="0.0.0.0", port=25001)
        while True:
            await sock.send(b"job")
            await asyncio.sleep(1)
```

As before, the downstream workers, but this time each worker makes multiple connect() calls; one to each job creator’s domain name:

```python
# worker.py -> can be on any number of machines
async def main():
    async with Søcket() as sock:
        await sock.connect(hostname='a.jobcreator.com', port=25001)
        await sock.connect(hostname='b.jobcreator.com', port=25001)
        while True:
            work = await sock.recv()
            <do work>
```

aiomsg will return work from the sock.recv() call above as it comes in from either job creation service. And as before, the number of worker instances can be dynamically scaled, up or down, and all the connection and reconnection logic will be handled internally.

Distribute messages from one dynamically-scaled service to another

If both services need to be dynamically-scaled, and can have varying numbers of instances at any time, we can no longer rely on having one end do the socket bind to a dedicated domain name. We really would like each to make connect() calls, as we’ve seen in previous examples. How to solve it? The answer is to create an intermediate proxy service that has two bind sockets, with long-lived domain names.
This is what will allow the other two dynamically-scaled services to have a dynamic number of instances. Here is the new job creator, whose name we change to dynamiccreator.py to reflect that it is now dynamically scalable:

```python
# dynamiccreator.py -> can be on any number of machines
async def main():
    async with Søcket(send_mode=SendMode.ROUNDROBIN) as sock:
        await sock.connect(hostname="proxy.jobcreator.com", port=25001)
        while True:
            await sock.send(b"job")
            await asyncio.sleep(1)
```

Note that our job creator above is now making a connect() call to proxy.jobcreator.com:25001 rather than binding to a local port. Let’s see what it’s connecting to. Here is the intermediate proxy service, which needs a dedicated domain name, and two ports allocated, one for each of the bind sockets:

```python
# proxy.py -> Set up DNS to point "proxy.jobcreator.com" to this instance
async def main():
    async with Søcket() as sock1, \
            Søcket(send_mode=SendMode.ROUNDROBIN) as sock2:
        await sock1.bind(hostname="0.0.0.0", port=25001)
        await sock2.bind(hostname="0.0.0.0", port=25002)
        while True:
            work = await sock1.recv()
            await sock2.send(work)
```

Note that sock1 is bound to port 25001; this is what our job creator is connecting to. The other socket, sock2, is bound to port 25002, and this is the one that our workers will be making their connect() calls to. Hopefully it’s clear in the code that work is being received from sock1 and being sent on to sock2. This is pretty much a feature-complete proxy service, and with only minor additions for error-handling it can be used for real work. For completeness, here are the downstream workers:

```python
# worker.py -> can be on any number of machines
async def main():
    async with Søcket() as sock:
        await sock.connect(hostname='proxy.jobcreator.com', port=25002)
        while True:
            work = await sock.recv()
            <do work>
```

Note that the workers are connecting to port 25002, as expected. You might be wondering: isn’t this just moving our performance problem to a different place?
If the proxy service is not scalable, then surely that becomes the “weakest link” in our system architecture? This is a pretty typical reaction, but there are a couple of reasons why it might not be as bad as you think:

- The proxy service is doing very, very little work. Thus, we expect it to suffer from performance problems only at a much higher scale compared to our other two services, which are likely to be doing more CPU-bound work (in real code, not my simple examples above).
- We could compile only the proxy service into faster low-level code using any number of tools such as Cython, C, C++, Rust, D and so on, in order to improve its performance, if necessary (this would require implementing the aiomsg protocols in that other language though). This allows us to retain the benefits of using a dynamic language like Python in the dynamically-scaled services where much greater business logic is captured (these can then be horizontally scaled quite easily to handle performance issues if necessary).
- Performance is not the only reason services are dynamically scaled. It is always a good idea, even in low-throughput services, to have multiple instances of a service running in different availability zones. Outages do happen, yes, even in your favourite cloud provider’s systems.
- A separate proxy service as shown above isolates a really complex problem and removes it from your business logic code. It might not be easy to appreciate how significant that is. As your dev team is rapidly iterating on business features, and redeploying new versions several times a day, the proxy service is unchanging, and doesn’t require redeployment.
In this sense, it plays a similar role to more traditional messaging systems like RabbitMQ and ActiveMQ. We can still run multiple instances of our proxy service using an earlier technique, as we’ll see in the next recipe.

Two dynamically-scaled services, with a scaled fan-in, fan-out proxy

This scenario is exactly like the previous one, except that we’re nervous about having only a single proxy service, since it is a single point of failure. Instead, we’re going to have 3 instances of the proxy service running in parallel. Let’s jump straight into code. The proxy code itself is actually unchanged from before. We just need to run more copies of it on different machines. Each machine will have a different domain name.

```python
# proxy.py -> unchanged from the previous recipe
async def main():
    async with Søcket() as sock1, \
            Søcket(send_mode=SendMode.ROUNDROBIN) as sock2:
        await sock1.bind(hostname="0.0.0.0", port=25001)
        await sock2.bind(hostname="0.0.0.0", port=25002)
        while True:
            work = await sock1.recv()
            await sock2.send(work)
```

For the other two dynamically-scaled services, we need to tell them all the domain names to connect to. We could set that up in an environment variable:

```shell
$ export PROXY_HOSTNAMES="px1.jobcreator.com;px2.jobcreator.com;px3.jobcreator.com"
```

Then, it’s really easy to modify our services to make use of that.
First, the dynamically-scaled job creator:

```python
# dynamiccreator.py -> can be on any number of machines
import os

async def main():
    async with Søcket(send_mode=SendMode.ROUNDROBIN) as sock:
        for proxy in os.environ['PROXY_HOSTNAMES'].split(";"):
            await sock.connect(hostname=proxy, port=25001)
        while True:
            await sock.send(b"job")
            await asyncio.sleep(1)
```

And the change for the worker code is identical (making sure the correct port is being used, 25002):

```python
# worker.py -> can be on any number of machines
import os

async def main():
    async with Søcket() as sock:
        for proxy in os.environ['PROXY_HOSTNAMES'].split(";"):
            await sock.connect(hostname=proxy, port=25002)
        while True:
            work = await sock.recv()
            <do work>
```

Three proxies, each running in a different availability zone, should be adequate for most common scenarios.

TODO: more scenarios involving identity (like ROUTER-DEALER)

Secure connections with mutual TLS

Secure connectivity is extremely important, even in an internal microservices infrastructure. From a design perspective, the single biggest positive impact that can be made on security is to make it easy for users to do the “right thing”. For this reason, aiomsg does nothing new at all. It uses the existing support for secure connectivity in the Python standard library, and uses the same APIs exactly as-is. All you have to do is create an SSLContext object, exactly as you normally would for conventional Python sockets, and pass that in.

Mutual TLS authentication (mTLS) is where the client verifies the server and the server verifies the client. In aiomsg, names like “client” and “server” are less useful, so let’s rather say that the connect socket verifies the target bind socket, and the bind socket also verifies the incoming connecting socket. It sounds complicated, but at a high level you just need to supply an SSLContext instance to the bind socket, and a different SSLContext instance to the connect socket (usually on a different computer).
The details are all stored in the SSLContext objects. Let’s first look at how that looks for a typical bind socket and connect socket:

```python
# bind end
import ssl
import asyncio, time
from aiomsg import Søcket

async def main():
    ctx = ssl.SSLContext(...)  # <--------- NEW!
    async with Søcket() as sock:
        await sock.bind('127.0.0.1', 25000, ssl_context=ctx)
        while True:
            await sock.send(time.ctime().encode())

asyncio.run(main())
```

```python
# connect end
import ssl
import asyncio
from aiomsg import Søcket

async def main():
    ctx = ssl.SSLContext(...)  # <--------- NEW!
    async with Søcket() as sock:
        await sock.connect('127.0.0.1', 25000, ssl_context=ctx)
        async for msg in sock.messages():
            print(msg.decode())

asyncio.run(main())
```

If you compare these two code snippets to what was shown in the Demo section, you’ll see it’s almost exactly the same, except that we’re passing a new ctx parameter into the respective bind() and connect() calls, which is an instance of SSLContext. So if you already know how to work with Python’s built-in SSLContext object, you can already create secure connections with aiomsg and there’s nothing more you need to learn.

Crash course on setting up an SSLContext

You might not know how to set up the SSLContext object. Here, I’ll give a crash course, but please remember that I am not a security expert, so make sure to ask an actual security expert to review your work if you’re working on a production system. The best way to create an SSLContext object is not with its constructor, but rather with a helper function called create_default_context(), which sets a lot of sensible defaults that you would otherwise have to do manually. So that’s how you get the context instance. You do have to specify whether the purpose of the context object is to verify a client or a server. Let’s have a look at that:

```python
# bind socket, or "server"
ctx: SSLContext = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
```

So here, above, we’re creating a context object for a bind socket. The purpose of the context is going to be to verify incoming client connections; that’s why the CLIENT_AUTH purpose was given.
As you might imagine, on the other end, i.e., the connect socket (or “client”), the purpose is going to be to verify the server:

```python
# connect socket, or "client"
ctx: SSLContext = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
```

Once you’ve created the context, the remaining parameters have the same meaning for both client and server. The way TLS works (the artist formerly known as SSL) is that each end of a connection has two pieces of information:

- A certificate (may be shared publicly)
- A key (MUST NOT BE SHARED! SECRET!)

When the two sockets establish a connection, they trade certificates, but do not trade keys. Anyway, let’s look at what you need to actually set in the code. We’ll start with the connect socket (client):

```python
# connect socket, or "client"
ctx: SSLContext = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
ctx.verify_mode = ssl.CERT_REQUIRED
ctx.check_hostname = True
ctx.load_verify_locations(<something that can verify the server cert>)
```

The above will let the client verify that the server it is connecting to is the correct one. When the socket connects, the server socket will send back a certificate, and the client checks that against one of those mysterious “verify locations”. For mutual TLS, the server also wants to check the client. What does it check? Well, the client must also provide a certificate back to the server. So that requires an additional line in the code block above:

```python
# connect socket, or "client"
ctx: SSLContext = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
ctx.verify_mode = ssl.CERT_REQUIRED
ctx.check_hostname = True
ctx.load_verify_locations(<something that can verify the server cert>)

# Client needs a pair of "cert" and "key"
ctx.load_cert_chain(certfile="client.cert", keyfile="client.key")
```

So that completes everything we need to do for the SSL context on the client side.
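As a quick, runnable sanity check of the client-side settings just described: create_default_context(Purpose.SERVER_AUTH) already enables certificate verification and hostname checking by default, which is exactly why it is preferred over the bare SSLContext() constructor. This sketch builds the context without needing any certificate files (the cert/key file names from the snippets above are only needed once the files exist):

```python
import ssl

# Client-side context, as in the "connect socket" snippets above.
ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)

# These two assignments from the snippets above are actually already
# the defaults for a SERVER_AUTH context:
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname is True

# Loading the client's own cert/key pair would follow, once the files
# exist (see the OpenSSL command further below for generating them):
# ctx.load_cert_chain(certfile="client.cert", keyfile="client.key")
```

In other words, for the plain (non-mutual) case the helper function has already done the client-side verification setup for you.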
On the server side, everything is almost exactly the same:

```python
# bind socket, or "server"
ctx: SSLContext = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ctx.verify_mode = ssl.CERT_REQUIRED
ctx.load_verify_locations(<something that can verify the client cert>)

# Server needs a pair of "cert" and "key"
ctx.load_cert_chain(certfile="server.cert", keyfile="server.key")
```

That describes everything you need to do to set up mutual TLS using SSLContext instances. There are a few loose ends to tie up though. Where do you get the certfile and keyfile from? And what is this mysterious “verify location”? The first question is easier. The cert and key can be generated using the OpenSSL command-line application:

```shell
$ openssl req -newkey rsa:2048 -nodes -keyout server.key \
    -x509 -days 365 -out server.cert \
    -subj '/C=GB/ST=Blah/L=Blah/O=Blah/OU=Blah/CN=example.com'
```

Running the above command will create two new files, server.cert and server.key; these are the ones you specify in the earlier code snippets. Generating these files for the client is exactly the same, but you use different names. You could also use Let’s Encrypt to generate the cert and key, in which case you don’t have to run the above commands. IF you use Let’s Encrypt, you’ve also solved the other problem of supplying a “verify location”, and in fact you won’t need to call load_verify_locations() in the client code at all. This is because there are a bunch of root certificate authorities that are provided with most operating systems, and Let’s Encrypt is one of those.

However, for the sake of argument, let’s say you want to make your own certificates and you don’t want to rely on system-provided root certificates at all; how to do the verification? Well, it turns out that a very simple solution is to just use the target certificate itself as the “verify location”.
For example, here is the client context again:

```python
# connect socket, or "client"
ctx: SSLContext = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
ctx.verify_mode = ssl.CERT_REQUIRED
ctx.check_hostname = True
ctx.load_verify_locations("server.cert")  # <--- Same one as the server

# Client needs a pair of "cert" and "key"
ctx.load_cert_chain(certfile="client.cert", keyfile="client.key")
```

and then in the server’s context, you could also use the client’s cert as the “verify location”:

```python
# bind socket, or "server"
ctx: SSLContext = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ctx.verify_mode = ssl.CERT_REQUIRED
ctx.load_verify_locations("client.cert")  # <--- Same as on client

# Server needs a pair of "cert" and "key"
ctx.load_cert_chain(certfile="server.cert", keyfile="server.key")
```

Obviously, the client code and the server code are running on different computers, and you need to make sure that the right files are on the right computers in the right places. There are a lot of ways to make this more sophisticated, but it’s probably a good idea to get the simple case working, as described above, before looking at the more complicated cases. A cool option is to make your own root certificate authority, which can be a standard “verify location” in all your microservices; then, when you make certs and keys for each microservice, you just have to “sign” them with the root key. This process is described in “Be your own certificate authority” by Moshe Zadka. Hope that helps!

FAQ

Why do you spell Søcket like that?

The slashed O is used in homage to ØMQ, a truly wonderful library that changed my thinking around what socket programming could be like.

I want to talk to the aiomsg Søcket with a different programming language

WARNING: This section is extremely provisional.
I haven’t fully nailed down the protocol yet. To make a clone of the Søcket in another language is probably a lot of work, but it’s actually not necessary to implement everything. You can talk to aiomsg sockets quite easily by implementing the simple protocol described below. It would be just like regular socket programming in your programming language. You just have to follow a few simple rules for the communication protocol. These are the rules:

1. Every payload in either direction shall be length-prefixed:

       message = [4-bytes big endian int32][payload]

2. Immediately after successfully opening a TCP connection, before doing anything else with your socket, you shall:

   - Send your identity, as a 16 byte unique identifier (a 16 byte UUID4 is perfect). Note that Rule 1 still applies, so this would look like

         identity_message = b'\x00\x00\x00\x10' + [16 bytes]

     (because the payload length, 16, is 0x10 in hex)

   - Receive the other peer’s identity (16 bytes). Remember Rule 1 still applies, so you’ll actually receive 20 bytes, and the first four will be the length of the payload, which will be 16 bytes for this message.

3. You shall periodically send a heartbeat message b"aiomsg-heartbeat". Every 5 seconds is good. If you receive such messages you can ignore them. If you don’t receive one (or an actual data message) within 15 seconds of the previous receipt, the connection is probably dead and you should kill it and/or reconnect.
Note that Rule 1 still applies, and because the length of this message is also 16 bytes, the message is ironically similar to the identity message:

    heartbeat_message = b'\x00\x00\x00\x10' + b'aiomsg-heartbeat'

After you’ve satisfied these rules, from that point on every message sent or received is a Rule 1 message, i.e., length-prefixed with 4 bytes for the length of the payload that follows. If you want to run a bind socket, and receive multiple connections from different aiomsg sockets, then the above rules apply to each separate connection. That’s it!

TODO: Discuss the protocol for AT_LEAST_ONCE mode, which is a bit messy at the moment.

Developer setup

Setup:

```shell
$ git clone https://github.com/cjrh/aiomsg
$ python -m venv venv
$ source venv/bin/activate   (or venv/Scripts/activate.bat on Windows)
$ pip install -e .[all]
```

Run the tests:

```shell
$ pytest
```

Create a new release:

```shell
$ bumpymcbumpface --push-git --push-pypi
```

The easiest way to obtain the bumpymcbumpface tool is to install it with pipx. Once installed and on your $PATH, the command above should work. NOTE: twine must be correctly configured to upload to pypi. If you don’t have rights to push to PyPI, but you do have rights to push to github, just omit the --push-pypi option in the command above. The command will automatically create the next git tag and push it.
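The wire-protocol rules described in the FAQ above can be exercised without a Søcket at all. Here is a small sketch using only the standard library (struct and uuid), checking that identity and heartbeat messages obey Rule 1; the helper names frame/unframe are just illustrative, not part of aiomsg:

```python
import struct
import uuid

HEARTBEAT = b"aiomsg-heartbeat"

def frame(payload: bytes) -> bytes:
    """Rule 1: prefix every payload with a 4-byte big-endian length."""
    return struct.pack(">I", len(payload)) + payload

def unframe(message: bytes) -> bytes:
    """Strip the 4-byte length prefix and sanity-check it."""
    (length,) = struct.unpack(">I", message[:4])
    payload = message[4:]
    if len(payload) != length:
        raise ValueError("truncated frame")
    return payload

# Rule 2: the identity message is a framed 16-byte unique id (UUID4).
identity_message = frame(uuid.uuid4().bytes)
assert identity_message[:4] == b"\x00\x00\x00\x10"  # 16 == 0x10
assert len(identity_message) == 20  # 4-byte prefix + 16-byte payload

# Rule 3: the heartbeat is also 16 bytes, so it is framed identically.
heartbeat_message = frame(HEARTBEAT)
assert heartbeat_message == b"\x00\x00\x00\x10" + HEARTBEAT
assert unframe(heartbeat_message) == HEARTBEAT
```

A foreign-language client would apply the same framing to every subsequent data message on the connection.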
aiomsgpack
aiomsgpack is a helper to write asyncio msgpack servers and clients.
aio-msgpack-rpc
aio-msgpack-rpc

Pure asyncio implementation of the MsgPack RPC.

- Requires Python >= 3.6.
- Uses the streams API provided by the asyncio standard library.
- Just a simple implementation of the RPC layer.

Installation

pip install aio-msgpack-rpc

Example

Server

```python
import asyncio
import aio_msgpack_rpc

# handlers can be defined on a class
# they can either be async or plain functions
class MyServicer:
    async def sum(self, x, y):
        print(f"sum: {x}, {y}")
        return x + y

    def notification(self, msg):
        print(f"notification: {msg}")

async def main():
    try:
        server = await asyncio.start_server(
            aio_msgpack_rpc.Server(MyServicer()),
            host="localhost", port=18002)
        while True:
            await asyncio.sleep(0.1)
    finally:
        server.close()

try:
    asyncio.get_event_loop().run_until_complete(main())
except KeyboardInterrupt:
    pass
```

Client

```python
import asyncio
import aio_msgpack_rpc

async def main():
    client = aio_msgpack_rpc.Client(
        *await asyncio.open_connection("localhost", 18002))

    # blocking rpc calls
    result = await client.call("sum", 1, 2)
    assert result == 3

    # one way notifications
    await client.notify("notification", "hello")

asyncio.get_event_loop().run_until_complete(main())
```

Benchmark

Some basic performance benchmarks against the official implementation on my development machine.

package            | call (QPS) | notify (QPS)
msgpack-rpc-python | 5414       | 11746
aio-msgpack-rpc    | 5781       | 86957
aio-multipart
multipart-stream
aiomultiprocess
aiomultiprocess

Take a modern Python codebase to the next level of performance.

On their own, AsyncIO and multiprocessing are useful, but limited: AsyncIO still can't exceed the speed of GIL, and multiprocessing only works on one task at a time. But together, they can fully realize their true potential.

aiomultiprocess presents a simple interface, while running a full AsyncIO event loop on each child process, enabling levels of concurrency never before seen in a Python application. Each child process can execute multiple coroutines at once, limited only by the workload and number of cores available.

Gathering tens of thousands of network requests in seconds is as easy as:

```python
async with Pool() as pool:
    results = await pool.map(<coroutine function>, <items>)
```

Install

aiomultiprocess requires Python 3.6 or newer. You can install it from PyPI:

$ pip3 install aiomultiprocess

Usage

Most of aiomultiprocess mimics the standard multiprocessing module whenever possible, while accounting for places that benefit from async functionality.

Running your asynchronous jobs on a pool of worker processes is easy:

```python
import asyncio
from aiohttp import request
from aiomultiprocess import Pool

async def get(url):
    async with request("GET", url) as response:
        return await response.text("utf-8")

async def main():
    urls = ["https://jreese.sh", ...]
    async with Pool() as pool:
        async for result in pool.map(get, urls):
            ...  # process result

if __name__ == '__main__':
    # Python 3.7
    asyncio.run(main())

    # Python 3.6
    # loop = asyncio.get_event_loop()
    # loop.run_until_complete(main())
```

Take a look at the User Guide for more details and examples. For further context, watch the PyCon US 2018 talk about aiomultiprocess, "Thinking Outside the GIL". Slides available at Speaker Deck.

License

aiomultiprocess is copyright John Reese, and licensed under the MIT license. I am providing code in this repository to you under an open source license. This is my personal repository; the license you receive to my code is from me and not from my employer. See the LICENSE file for details.
aiomultiprocessing
UNKNOWN
aiomultitask
aiomultitask
aiomusiccast
Companion library for MusicCast devices intended for the Home Assistant integration.

Setup

Requirements

- Python 3.8+

Installation

Install it directly into an activated virtual environment:

$ pip install aiomusiccast

or add it to your Poetry project:

$ poetry add aiomusiccast

Usage

After installation, the package can be imported:

$ python
>>> import aiomusiccast
>>> aiomusiccast.__version__
aiomygas
aiomygas

Asynchronous Python API for Мой Газ.

Installation

Use pip to install the library:

pip install aiomygas

Usage

```python
import asyncio
from pprint import pprint

import aiohttp

from aiomygas import SimpleMyGasAuth, MyGasApi


async def main(email: str, password: str) -> None:
    """Create the aiohttp session and run the example."""
    async with aiohttp.ClientSession() as session:
        auth = SimpleMyGasAuth(email, password, session)
        api = MyGasApi(auth)
        data = await api.async_get_accounts()
        pprint(data)


if __name__ == "__main__":
    _email = str(input("Email: "))
    _password = str(input("Password: "))
    asyncio.run(main(_email, _password))
```

This will return an accounts list that looks a little like this:

{"elsGroup":[{"els":{"id":1111111,"jntAccountNum":"111111111111","isFull":true,"alias":"Дом","address":null,"epd":{"id":9,"name":"ЕПД Ростов-на-Дону","typePaymentCode":"10031","ENABLE_PAPER_RECEIPT_EPD":1,"UNITED_PAY_INDICATION_EPD":0},"params":null,"paperReceiptSetting":{"value":1,"dateTime":"2023-09-08T19:00:42.421Z"}},"lspu":[{"id":22222222,"account":"222222222222","isFull":1,"alias":"","address":null,"provider":{"id":41,"name":"Межрегионгаз Ростов-на-Дону","exchangeType":{"id":1,"code":"EXCHANGE_TYPE_ONLINE","name":"Онлайн","description":"Получение, отправка и обновление информации происходит в рамках настроенного поставщиком времени (как правило, это от 30 сек.
до 30 минут)"},"setup":{"ACCOUNT_ATTACH_HINT":"Укажите номер лицевого счета поставщика газа (12 знаков), указанные на квитанции, без пробелов и запятых.\nПример: 370012345678","ALLOW_ACCESS_TYPE_CHARGE":"1","ALLOW_ACCESS_TYPE_COUNTER":"1","ALLOW_ACCESS_TYPE_PIN":"1","ALLOW_INDICATION_SEND":"1","ALLOW_INDICATION_SEND_LITE":"1","ALLOW_PAY":"1","ENABLE_PAYMENT_EXCHANGE":"1","ALLOW_PAY_APPLE":"0","ALLOW_PAY_GOOGLE":"0","ALLOW_PAY_SBP":"1","COUNTER_CHECK_DATE":"25","DAYS_BEFORE_CONTRACT_END":"60","DAYS_BEFORE_EQUIPMENT_CHECK":"60","ENABLE_AGREEMENT_SECTION":"1","ENABLE_APPLICATIONS_SECTION":"1","ENABLE_CALCULATION_SECTION":"1","ENABLE_INDICATION_SOURCE":"0","ENABLE_NOTIFICATION_DOCUMENT":"0","ENABLE_NOTIFICATION_EQUIPMENT":"1","ENABLE_PAYMENTS_SECTION":"1","ENABLE_PAYMENT_DETAILS_FULL":"1","ENABLE_PAYMENT_DETAILS_LITE":"0","ENABLE_PRINT_INVOICE":"1","ENABLE_PRIVILEGES_SECTION":"0","FULL_REQUEST_EMAIL":"[email protected]","IS_DEFAULT_FULL":"0","MAX_CONSUMPTION":"10000","MESSAGE_AFTER_CONTRACT_END":"Срок действия Вашего договора закончился. Требуется перезаключить договор","MESSAGE_AFTER_EQUIPMENT_CHECK":"Срок поверки прибора учета закончился. 
Показания будут отправляться, но не будут приняты к учету поставщиком услуг.","MESSAGE_BEFORE_CONTRACT_END":"Уважаемый абонент, срок действия Вашего договора заканчивается, пожалуйста не забудьте перезаключить его","MESSAGE_BEFORE_EQUIPMENT_CHECK":"Уважаемый абонент срок действия поверки Вашего прибора учета скоро заканчивается","SERVICE_UNAVAILABLE":"0","SUPPORT_EMAIL":"[email protected]","ENABLE_COUNTER_RATE":"1","ENABLE_PAPER_RECEIPT":"1","DUBLICATE_PAPER_RECEIPT":"1","ENABLE_NOTIFICATION_INDICATION":"0","MESSAGE_INDICATION_SECTION":"","SHOW_PAPER_RECEIPT_OFFER":"1","SHOW_NORMS_AND_RATES":"1","ALLOW_INDICATION_ZERO":"1","KKT_PAYMENT_METHOD_TYPE":"4","KKT_PAYMENT_SUBJECT_TYPE":"4","PAYMENT_MESSAGE":"","ENABLE_PAYMENT_MESSAGE":"0","ALLOW_AUTOPAY":"1","CHARGES_INTERVAL_MONTHS_NUMBER":"12","ALLOW_INDICATION_OVERLAP":"1","PROVIDER_ALLOW_OFFER_ELS":"1","ALLOW_INDICATION_CHECK_EXPIRED":"1","ALLOW_MIR_PAY":"1","GAS_COUNTER_TARIFF":"0","ENABLE_PRINT_EPD":null,"ALLOW_CREATE_AGREEMENT_TICKET":null,"DEPARTMET_EMAIL":null,"ALLOW_DOWNLOAD_CHARGES":null,"ALLOW_INDICATION_DATE_CHANGE":null,"ENABLE_EQUIPMENTS_DATE":null,"ENABLE_EQUIPMENTS_SERIAL":null,"ENABLE_ABONENT_FULLNAME":null,"ALLOW_TICKET_INSPECTOR_SEND":null,"RTP_TOPIC_ID":"","TICKET_SRC_PROVIDER_OPD":""}},"paperReceiptSetting":{"value":1,"dateTime":"2023-01-01T10:33:44.437Z"},"hasAutopay":false,"elsAvailable":null}],"lspuDublicate":[]},{"els":{"id":3333333,"jntAccountNum":"333333333333","isFull":true,"alias":"Офис","address":null,"epd":{"id":9,"name":"ЕПД Ростов-на-Дону","typePaymentCode":"10031","ENABLE_PAPER_RECEIPT_EPD":1,"UNITED_PAY_INDICATION_EPD":0},"params":null,"paperReceiptSetting":{"value":1,"dateTime":"2023-09-08T19:00:30.744Z"}},"lspu":[{"id":44444444,"account":"444444444444","isFull":1,"alias":"","address":null,"provider":{"id":41,"name":"Межрегионгаз Ростов-на-Дону","exchangeType":{"id":1,"code":"EXCHANGE_TYPE_ONLINE","name":"Онлайн","description":"Получение, отправка и обновление информации происходит в 
рамках настроенного поставщиком времени (как правило, это от 30 сек. до 30 минут)"},"setup":{"ACCOUNT_ATTACH_HINT":"Укажите номер лицевого счета поставщика газа (12 знаков), указанные на квитанции, без пробелов и запятых.\nПример: 370012345678","ALLOW_ACCESS_TYPE_CHARGE":"1","ALLOW_ACCESS_TYPE_COUNTER":"1","ALLOW_ACCESS_TYPE_PIN":"1","ALLOW_INDICATION_SEND":"1","ALLOW_INDICATION_SEND_LITE":"1","ALLOW_PAY":"1","ENABLE_PAYMENT_EXCHANGE":"1","ALLOW_PAY_APPLE":"0","ALLOW_PAY_GOOGLE":"0","ALLOW_PAY_SBP":"1","COUNTER_CHECK_DATE":"25","DAYS_BEFORE_CONTRACT_END":"60","DAYS_BEFORE_EQUIPMENT_CHECK":"60","ENABLE_AGREEMENT_SECTION":"1","ENABLE_APPLICATIONS_SECTION":"1","ENABLE_CALCULATION_SECTION":"1","ENABLE_INDICATION_SOURCE":"0","ENABLE_NOTIFICATION_DOCUMENT":"0","ENABLE_NOTIFICATION_EQUIPMENT":"1","ENABLE_PAYMENTS_SECTION":"1","ENABLE_PAYMENT_DETAILS_FULL":"1","ENABLE_PAYMENT_DETAILS_LITE":"0","ENABLE_PRINT_INVOICE":"1","ENABLE_PRIVILEGES_SECTION":"0","FULL_REQUEST_EMAIL":"[email protected]","IS_DEFAULT_FULL":"0","MAX_CONSUMPTION":"10000","MESSAGE_AFTER_CONTRACT_END":"Срок действия Вашего договора закончился. Требуется перезаключить договор","MESSAGE_AFTER_EQUIPMENT_CHECK":"Срок поверки прибора учета закончился. 
Показания будут отправляться, но не будут приняты к учету поставщиком услуг.","MESSAGE_BEFORE_CONTRACT_END":"Уважаемый абонент, срок действия Вашего договора заканчивается, пожалуйста не забудьте перезаключить его","MESSAGE_BEFORE_EQUIPMENT_CHECK":"Уважаемый абонент срок действия поверки Вашего прибора учета скоро заканчивается","SERVICE_UNAVAILABLE":"0","SUPPORT_EMAIL":"[email protected]","ENABLE_COUNTER_RATE":"1","ENABLE_PAPER_RECEIPT":"1","DUBLICATE_PAPER_RECEIPT":"1","ENABLE_NOTIFICATION_INDICATION":"0","MESSAGE_INDICATION_SECTION":"","SHOW_PAPER_RECEIPT_OFFER":"1","SHOW_NORMS_AND_RATES":"1","ALLOW_INDICATION_ZERO":"1","KKT_PAYMENT_METHOD_TYPE":"4","KKT_PAYMENT_SUBJECT_TYPE":"4","PAYMENT_MESSAGE":"","ENABLE_PAYMENT_MESSAGE":"0","ALLOW_AUTOPAY":"1","CHARGES_INTERVAL_MONTHS_NUMBER":"12","ALLOW_INDICATION_OVERLAP":"1","PROVIDER_ALLOW_OFFER_ELS":"1","ALLOW_INDICATION_CHECK_EXPIRED":"1","ALLOW_MIR_PAY":"1","GAS_COUNTER_TARIFF":"0","ENABLE_PRINT_EPD":null,"ALLOW_CREATE_AGREEMENT_TICKET":null,"DEPARTMET_EMAIL":null,"ALLOW_DOWNLOAD_CHARGES":null,"ALLOW_INDICATION_DATE_CHANGE":null,"ENABLE_EQUIPMENTS_DATE":null,"ENABLE_EQUIPMENTS_SERIAL":null,"ENABLE_ABONENT_FULLNAME":null,"ALLOW_TICKET_INSPECTOR_SEND":null,"RTP_TOPIC_ID":"","TICKET_SRC_PROVIDER_OPD":""}},"paperReceiptSetting":{"value":1,"dateTime":"2023-01-01T10:32:52.489Z"},"hasAutopay":false,"elsAvailable":null}],"lspuDublicate":[]}],"lspu":[]}Timeoutsaiomygas does not specify any timeouts for any requests. You will need to specify them in your own code. We recommend theasyncio.timeout:importasynciowithasyncio.timeout(10):data=awaitapi.async_get_accounts()
aiomyorm
aiomyorm is a simple and easy-to-use ORM framework, which has a similar interface to Django and fully supports asyncio.

Features

- Perfect support for asyncio and uvloop.
- Simple and easy-to-use API, similar to Django.
- Support for MySQL and SQLite.

Installation

pip install aiomyorm

Getting Started

```python
from aiomyorm import Model, IntField, StringField, SmallIntField, auto_increment
from aiomyorm import set_config, close_db_connection
import asyncio

set_config(engine='sqlite', db='test.db')

class Test(Model):
    __table__ = 'test'
    pk = IntField(primary_key=True, default=auto_increment())
    id = StringField(50)
    age = IntField(comment='the age of student.')
    birth_place = StringField(50, default='china')
    grade = SmallIntField()

async def go():
    insert_rows = await Test.insert(
        Test(pk=5000, age=18, birth_place='place1'),
        Test(pk=5001, age=21, birth_place='place2'),
        Test(pk=5002, age=19, birth_place='place3'))
    all = await Test.find()
    print('insert rows: ', insert_rows)
    for r in all:
        print(r)

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(Test.create_table())
    loop.run_until_complete(go())
    loop.run_until_complete(close_db_connection())
```

The results:

    to create table test.
    insert rows:  3
    pk:5000, id:, age:18, birth_place:place1, grade:0
    pk:5001, id:, age:21, birth_place:place2, grade:0
    pk:5002, id:, age:19, birth_place:place3, grade:0

For more usage, see the documentation: aiomyorm

Dependencies

- Python >= 3.5.3
- aiomysql (for MySQL)
- aiosqlite (for sqlite)

Tests

A simple test suite is included. It's better to run it in a virtual environment. First:

git clone git@github.com:yulansp/aiomyorm.git

then

pip install -r requirements.txt

Note that you must install MySQL and configure the user name and password in the tests/test_mysql/config.py file. Then:

make test

License

MIT
aiomysensors
aiomysensors

Python asyncio package to connect to MySensors gateways.

Installation

Install this via pip (or your favourite package manager):

pip install aiomysensors

Example

```python
"""Show a minimal example using aiomysensors."""
import asyncio

from aiomysensors import AIOMySensorsError, Gateway, SerialTransport


async def run_gateway() -> None:
    """Run a serial gateway."""
    port = "/dev/ttyACM0"
    baud = 115200
    transport = SerialTransport(port, baud)
    try:
        async with Gateway(transport) as gateway:
            async for message in gateway.listen():
                print("Message received:", message)
    except AIOMySensorsError as err:
        print("Error:", err)


if __name__ == "__main__":
    try:
        asyncio.run(run_gateway())
    except KeyboardInterrupt:
        pass
```

Command Line Interface

There's a CLI for testing purposes:

aiomysensors --debug serial-gateway -p /dev/ttyACM0

Credits

This package was created with Cookiecutter and the browniebroke/cookiecutter-pypackage project template.
aiomysimple
AioMySimple

AioMySimple is a simpler interface for AioMySQL.

```python
import aiomysimple

db = aiomysimple.Database(
    host="127.0.0.1", port=3306,
    user="root", password="", db="test_pymysql")
my_table = db.get_table("my_table", "id")

async for row in await my_table.search():
    print(row["my_key"])
    await row.update(my_key=row["my_key"] + 1)

result = await my_table.search()
print((await result[3])["my_key"])
```
aiomysql
aiomysql is a "driver" for accessing a MySQL database from the asyncio (PEP-3156/tulip) framework. It depends on and reuses most parts of PyMySQL. aiomysql tries to be like the awesome aiopg library and preserve the same api, look and feel. Internally aiomysql is a copy of PyMySQL with the underlying io calls switched to async (basically, yield from and asyncio.coroutine added in the proper places). The sqlalchemy support was ported from aiopg.

Documentation

https://aiomysql.readthedocs.io/

Basic Example

aiomysql is based on PyMySQL and provides the same api; you just need to use await conn.f() (or yield from conn.f()) instead of calling conn.f() for every method. Properties are unchanged, so conn.prop is correct, as is conn.prop = val.

```python
import asyncio
import aiomysql


async def test_example(loop):
    pool = await aiomysql.create_pool(host='127.0.0.1', port=3306,
                                      user='root', password='',
                                      db='mysql', loop=loop)
    async with pool.acquire() as conn:
        async with conn.cursor() as cur:
            await cur.execute("SELECT 42;")
            print(cur.description)
            (r,) = await cur.fetchone()
            assert r == 42
    pool.close()
    await pool.wait_closed()


loop = asyncio.get_event_loop()
loop.run_until_complete(test_example(loop))
```

Example of SQLAlchemy optional integration

Sqlalchemy support has been ported from aiopg, so the api should be very familiar to aiopg users:

```python
import asyncio
import sqlalchemy as sa

from aiomysql.sa import create_engine


metadata = sa.MetaData()

tbl = sa.Table('tbl', metadata,
               sa.Column('id', sa.Integer, primary_key=True),
               sa.Column('val', sa.String(255)))


async def go(loop):
    engine = await create_engine(user='root', db='test_pymysql',
                                 host='127.0.0.1', password='', loop=loop)
    async with engine.acquire() as conn:
        await conn.execute(tbl.insert().values(val='abc'))
        await conn.execute(tbl.insert().values(val='xyz'))
        async for row in conn.execute(tbl.select()):
            print(row.id, row.val)
    engine.close()
    await engine.wait_closed()


loop = asyncio.get_event_loop()
loop.run_until_complete(go(loop))
```

Requirements

- Python 3.7+
- PyMySQL

Changes

0.2.0 (2023-06-11)

- Bump minimal SQLAlchemy version to 1.3 #815
- Remove deprecated Pool.get #706
- Partially ported PyMySQL #304 #792: aiomysql now reraises the
original exception during connect() if it's not IOError, OSError or asyncio.TimeoutError; this was previously always raised as OperationalError.
- Fix debug log level with sha256_password authentication #863
- Modernized code with pyupgrade to Python 3.7+ syntax #930
- Removed tests for EoL MariaDB versions 10.3, 10.7 and 10.8, added tests for MariaDB 10.9, 10.10, 10.11 #932

0.1.1 (2022-05-08)

- Fix SSL connection handshake charset not respecting client configuration #776

0.1.0 (2022-04-11)

- Don't send sys.argv[0] as program_name to MySQL server by default #620
- Allow running process as anonymous uid #587
- Fix timed out MySQL 8.0 connections raising InternalError rather than OperationalError #660
- Fix timed out MySQL 8.0 connections being returned from Pool #660
- Ensure connections are properly closed before raising an OperationalError when the server connection is lost #660
- Ensure connections are properly closed before raising an InternalError when packet sequence numbers are out of sync #660
- Unix sockets are now internally considered secure, allowing sha256_password and caching_sha2_password auth methods to be used #695
- Test suite now also tests unix socket connections #696
- Fix SSCursor raising InternalError when last result was not fully retrieved #635
- Remove deprecated no_delay argument #702
- Support PyMySQL up to version 1.0.2 #643
- Bump minimal PyMySQL version to 1.0.0 #713
- Align % formatting in Cursor.executemany() with Cursor.execute(), literal % now need to be doubled in Cursor.executemany() #714
- Fixed unlimited Pool size not working, this is now working as documented by passing maxsize=0 to create_pool #119
- Added Pool.closed property as present in aiopg #463
- Fixed SQLAlchemy connection context iterator #410
- Fix error packet handling for SSCursor #428
- Required python version is now properly documented in python_requires instead of failing on setup.py execution #731
- Add rsa extras_require depending on PyMySQL[rsa] #557
- Migrate to PEP 517 build system #746
- Self-reported __version__ now returns version generated by setuptools-scm during build, otherwise 'unknown' #748
- Fix SSCursor raising query timeout error on wrong query #428

0.0.22 (2021-11-14)

- Support python 3.10 #505

0.0.21 (2020-11-26)

- Allow to use custom Cursor subclasses #374
- Fill Connection class with actual client version #388
- Fix legacy __aiter__ methods #403
- Fix & update docs #418 #437
- Ignore pyenv's .python-version file #424
- Replace asyncio.streams.IncompleteReadError with asyncio.IncompleteReadError #460 #454
- Add support for SQLAlchemy default parameters #455 #466
- Update dependencies #485
- Support Python 3.7 & 3.8 #493

0.0.20 (2018-12-19)

- Fixed connect_timeout #360
- Fixed support for SQLA executemany #324
- Fix the python 3.7 compatibility #357
- Fixed reuse connections when StreamReader has an exception #339
- Fixes warning when inserting binary strings #326

0.0.19 (2018-07-12)

- See v0.0.18

0.0.18 (2018-07-09)

- Updated to support latest PyMySQL changes.
- aiomysql now sends client connection info.
- MySQL 8+ support, including sha256_password and cached_sha2_password authentication plugins.
- Default max packet length sent to the server is no longer 1.
- Fixes issue where cursor.nextset can hang on query sets that raise errors.

0.0.17 (2018-07-06)

- Pinned version of PyMySQL

0.0.16 (2018-06-03)

- Added ability to execute precompiled sqlalchemy queries #294 (Thanks @vlanse)

0.0.15 (2018-05-20)

- Fixed handling of user-defined types for sqlalchemy #290
- Fix KeyError when server reports unknown collation #289

0.0.14 (2018-04-22)

- Fixed SSL connection finalization #282

0.0.13 (2018-04-19)

- Added SSL support #280 (Thanks @terrycain)
- Fixed __all__ in aiomysql/__init__ #270 (Thanks @matianjun1)
- Added docker fixtures #275 (Thanks @terrycain)

0.0.12 (2018-01-18)

- Fixed support for SQLAlchemy 1.2.0
- Fixed argument for cursor.execute in sa engine #239 (Thanks @NotSoSuper)

0.0.11 (2017-12-06)

- Fixed README formatting on pypi

0.0.10 (2017-12-06)

- Updated regular expressions to be compatible with pymysql #167 (Thanks @AlexLisovoy)
- Added connection recycling in the pool #216

0.0.9 (2016-09-14)

- Fixed AttributeError in _request_authentication function #104 (Thanks @ttlttl)
- Fixed legacy auth #105
- uvloop added to test suite #106
- Fixed bug with unicode in json field #107 (Thanks @methane)

0.0.8 (2016-08-24)

- Default min pool size reduced to 1 #80 (Thanks @Drizzt1991)
- Update to PyMySQL 0.7.5 #89
- Fixed connection cancellation in process of executing a query #79 (Thanks @Drizzt1991)

0.0.7 (2016-01-27)

- Fix for multiple results issue, ported from pymysql #52
- Fixed useless warning with no_delay option #55
- Added async/await support for Engine, SAConnection, Transaction #57
- pool.release returns future so we can wait on it in __aexit__ #60
- Update to PyMySQL 0.6.7

0.0.6 (2015-12-11)

- Fixed bug with SA rollback (Thanks @khlyestovillarion!)
- Fixed issue with default no_delay option (Thanks @khlyestovillarion!)

0.0.5 (2015-10-28)

- no_delay option is deprecated and True by default
- Add Cursor.mogrify() method
- Support for "LOAD LOCAL INFILE" query.
- Check connection inside pool, in case of timeout drop it, fixes #25
- Add support of python 3.5 features to pool, connection and cursor

0.0.4 (2015-05-23)

- Allow to call connection.wait_closed twice.
- Fixed sqlalchemy 1.0.0 support.
- Fix #11: Rename Connection.wait_closed() to .ensure_closed()
- Raise ResourceWarning on non-closed Connection
- Rename Connection.connect to _connect

0.0.3 (2015-03-10)

- Added support for PyMySQL up to 0.6.6.
- Ported improvements from PyMySQL.
- Added basic documentation.
- Fixed and added more examples.

0.0.2 (2015-02-17)

- Added MANIFEST.in.

0.0.1 (2015-02-17)

- Initial release.
- Implemented plain connections: connect, Connection, Cursor.
- Implemented database pools.
- Ported sqlalchemy optional support.
aiomysql-core
Aiomysql-Core

Simple framework for aiomysql.

Introduce

A simple package that makes aiomysql easy to use.

Document

click me

Installation

    pip install aiomysql-core

Simple uses

```python
import asyncio
import aiomysql
from aiomysql_core import AioMysqlCore


async def test_example(loop):
    pool = await aiomysql.create_pool(host='', port=3306, user='',
                                      password='', db='', loop=loop)
    core = AioMysqlCore(pool=pool)
    rows = await core.query('select * from users where uid=%s', 113)
    print(rows)
    rows = await core.gener('select * from users limit 100')
    async for row in rows:
        print(row)
    row = await core.get('select * from users where uid=%(uid)s', {'uid': 113})
    print(row)
    rowcount = await core.execute_rowcount('select * from users where uid=%(uid)s', {'uid': 113})
    print(rowcount)
    pool.close()
    await pool.wait_closed()


loop = asyncio.get_event_loop()
loop.run_until_complete(test_example(loop))
```

Simple SQLAlchemy uses

```python
import asyncio
from aiomysql.sa import create_engine
from aiomysql_core import AioMysqlAlchemyCore
from sqlalchemy import Column, Integer, String, MetaData, Table

metadata = MetaData()

Test = Table('test', metadata,
             Column('id', Integer, primary_key=True),
             Column('content', String(255), server_default=""))


async def test_example(loop):
    config = {'user': '', 'password': '', 'db': '', 'host': '',
              'port': 3306, 'autocommit': True, 'charset': 'utf8mb4'}
    engine = await create_engine(loop=loop, **config)
    core = AioMysqlAlchemyCore(engine=engine)
    # insert
    doc = {'content': 'insert'}
    clause = Test.insert().values(**doc)
    rowcount = await core.execute_rowcount(clause)
    print(rowcount)
    # search
    clause = Test.select().where(Test.c.id == 1).limit(1)
    row = await core.get(clause)
    print(row.id, row.content)
    clause = Test.select().where(Test.c.id > 1)
    rows = await core.query(clause)
    async for row in rows:
        print(row.id, row.content)
    # update
    doc = {'content': 'update'}
    clause = Test.update().values(**doc).where(Test.c.id == 1)
    rowcount = await core.execute_rowcount(clause)
    print(rowcount)
    # delete
    clause = Test.delete().where(Test.c.id == 1)
    rowcount = await core.execute_rowcount(clause)
    print(rowcount)
    await engine.wait_closed()


loop = asyncio.get_event_loop()
loop.run_until_complete(test_example(loop))
```
aiomysql-fork
A fork of aiomysql with support for Python 3.7 and later.
aioN26
# aioN26

Unofficial asynchronous N26-bank API implementation, based on https://app.swaggerhub.com/apis/Rots/N26/0.2#/transactions/get_smrt_transactions

I will document this better in the future, but here is a working example:

```python
import asyncio
from aioN26.api import Api
from pprint import pprint
import logging
import os
from dotenv import load_dotenv  # pip install python-dotenv

# We load local environment variables from file MYSECRET.env
# The file format is as follows:
# ------------MYSECRET.env-------------
# USERNAME=[email protected]
# PASSWORD=mysecretpassword
# DEVICE_TOKEN=yourgeneratedtoken
# -------------------------------------
# to generate the DEVICE_TOKEN, run this in a python3 console:
# >>> import uuid ; print(uuid.uuid4())
# and paste the result in the file

load_dotenv('MYSECRET.env')  # directory/file containing the environment variables

logging.basicConfig(level=logging.DEBUG)


async def main():
    async with Api(username=os.getenv('USERNAME'),
                   password=os.getenv('PASSWORD'),
                   device_token=os.getenv('DEVICE_TOKEN')) as api:
        print(api.get_device_token())
        print('\nget_me = \\')
        pprint(await api.get_me())
        print('\nget_me_statuses = \\')
        pprint(await api.get_me_statuses())
        print('\nget_addresses = \\')
        pprint(await api.get_addresses())
        print('\nget_barzahlen_check = \\')
        pprint(await api.get_barzahlen_check())
        print('\nget_spaces = \\')
        pprint(await api.get_spaces())
        print('\nget_accounts = \\')
        pprint(await api.get_accounts())
        print('\nget_settings_account_limits = \\')
        pprint(await api.get_settings_account_limits())
        print('\nset_settings_account_limits = \\')
        pprint(await api.set_settings_account_limits(500, 3000))
        print('\nget_smrt_categories = \\')
        pprint(await api.get_smrt_categories())
        print('\nget_smrt_transactions = \\')
        transactions = await api.get_smrt_transactions(from_time=1636030755256,
                                                       to_time=1636030755256)
        pprint(transactions)
        print('TOTAL', len(transactions))


asyncio.run(main())
```
aio-nacos
No description available on PyPI.
aio-nameko-proxy
aio-nameko-proxy

A standalone nameko rpc proxy for asyncio, plus wrappers for using the nameko rpc proxy with asynchronous web frameworks (Sanic, FastAPI).

This project is based on aio-pika and references the source code of the official nameko project and aio-pika.

install

    pip install aio-nameko-proxy

examples:

standalone AIOClusterRpcProxy

If you want most of your messages to be persistent (the default), set the delivery_mode parameter to DeliveryMode.PERSISTENT and call sw_dlm_call when you need to send a non-persistent message.

```python
import ssl
import asyncio
from aio_nameko_proxy import AIOClusterRpcProxy
from aio_pika import DeliveryMode

config = {
    "AMQP_URI": "amqp://guest:[email protected]:5672",  # Required
    "rpc_exchange": "nameko-rpc",
    "time_out": 30,
    "con_time_out": 5,
    "delivery_mode": DeliveryMode.PERSISTENT,
    "serializer": "my_serializer",
    "ACCEPT": ["pickle", "json", "my_serializer"],
    "SERIALIZERS": {
        "my_serializer": {
            "encoder": "my_slizer.dumps",
            "decoder": "my_slizer.loads",
            "content_type": "my-content-type",
            "content_encoding": "utf-8"
        }
    },
    # If SSL is configured, remember to change the URI to the TLS port,
    # eg: "amqps://guest:[email protected]:5671"
    "AMQP_SSL": {
        'ca_certs': 'certs/ca_certificate.pem',  # or 'cafile': 'certs/ca_certificate.pem',
        'certfile': 'certs/client_certificate.pem',
        'keyfile': 'certs/client_key.pem',
        'cert_reqs': ssl.CERT_REQUIRED
    }
}


async def run():
    async with AIOClusterRpcProxy(config) as rpc:
        # time_out: the timeout for waiting for the remote method result.
        # con_time_out: the timeout for connecting to the rabbitmq server,
        # binding the queue, consuming and so on.

        # persistent msg call
        result = await rpc.rpc_demo_service.normal_rpc("demo")
        reply_obj = await rpc.rpc_demo_service.normal_rpc.call_async("demo")
        result = await reply_obj.result()

        # non-persistent msg call
        result = await rpc.rpc_demo_service.normal_rpc.sw_dlm_call("demo")
        reply_obj = await rpc.rpc_demo_service.normal_rpc.sw_dlm_call_async("demo")
        result = await reply_obj.result()


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(run())
```

If you want most of your messages to be non-persistent (persistent is the default), set the delivery_mode parameter to DeliveryMode.NOT_PERSISTENT and call sw_dlm_call when you need to send a persistent message.

```python
import asyncio
from aio_nameko_proxy import AIOClusterRpcProxy
from aio_pika import DeliveryMode

config = {
    "AMQP_URI": "pyamqp://guest:[email protected]:5672",
    "rpc_exchange": "nameko-rpc",
    "time_out": 30,
    "con_time_out": 5,
    "delivery_mode": DeliveryMode.NOT_PERSISTENT
}


async def run():
    async with AIOClusterRpcProxy(config) as rpc:
        # non-persistent msg call
        result = await rpc.rpc_demo_service.normal_rpc("demo")
        reply_obj = await rpc.rpc_demo_service.normal_rpc.call_async("demo")
        result = await reply_obj.result()

        # persistent msg call
        result = await rpc.rpc_demo_service.normal_rpc.sw_dlm_call("demo")
        reply_obj = await rpc.rpc_demo_service.normal_rpc.sw_dlm_call_async("demo")
        result = await reply_obj.result()


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(run())
```

AIOPooledClusterRpcProxy

```python
import asyncio
from aio_nameko_proxy import AIOPooledClusterRpcProxy
from aio_pika import DeliveryMode

config = {
    "AMQP_URI": "pyamqp://guest:[email protected]:5672",
    "rpc_exchange": "nameko-rpc",
    "time_out": 30,
    "con_time_out": 5,
    "pool_size": 10,
    "initial_size": 2,
    "delivery_mode": DeliveryMode.NOT_PERSISTENT
}


async def run():
    async with AIOPooledClusterRpcProxy(config) as proxy_pool:
        async with proxy_pool.acquire() as rpc:
            result = await rpc.rpc_demo_service.normal_rpc("demo")


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(run())
```

Sanic Wrapper

```python
import ssl
from sanic import Sanic
from sanic.response import json
from aio_pika import DeliveryMode
from aio_nameko_proxy.wrappers import SanicNamekoClusterRpcProxy


class Config(object):
    # AMQP_URI: Required
    NAMEKO_AMQP_URI = "pyamqp://guest:[email protected]:5672"
    # rpc_exchange
    NAMEKO_RPC_EXCHANGE = "nameko-rpc"
    # pool_size
    NAMEKO_POOL_SIZE = 60
    # initial_size
    NAMEKO_INITIAL_SIZE = 60
    # time_out
    NAMEKO_TIME_OUT = 30
    # con_time_out
    NAMEKO_CON_TIME_OUT = 5
    # serializer
    NAMEKO_SERIALIZER = "json"
    # ACCEPT
    NAMEKO_ACCEPT = ["pickle", "json"]
    # SERIALIZERS: custom serializers
    NAMEKO_SERIALIZERS = {
        "my_serializer": {
            "encoder": "my_slizer.dumps",
            "decoder": "my_slizer.loads",
            "content_type": "my-content-type",
            "content_encoding": "utf-8"
        }
    }
    # AMQP_SSL: ssl configs
    NAMEKO_AMQP_SSL = {
        'ca_certs': 'certs/ca_certificate.pem',  # or 'cafile': 'certs/ca_certificate.pem',
        'certfile': 'certs/client_certificate.pem',
        'keyfile': 'certs/client_key.pem',
        'cert_reqs': ssl.CERT_REQUIRED
    }
    # delivery_mode
    NAMEKO_DELIVERY_MODE = DeliveryMode.PERSISTENT
    # other supported properties of aio-pika.Message; the key name format is
    # "NAMEKO_{}".format(property_name.upper())
    # ...


app = Sanic("App Name")
app.config.from_object(Config)

# rpc_cluster = SanicNamekoClusterRpcProxy(app)
# or
# from aio_nameko_proxy.wrappers import rpc_cluster  # contextvars required in py36
# SanicNamekoClusterRpcProxy(app)
# or
rpc_cluster = SanicNamekoClusterRpcProxy()
rpc_cluster.init_app(app)


@app.route("/")
async def test(request):
    rpc = await rpc_cluster.get_proxy()

    result = await rpc.rpc_demo_service.normal_rpc("demo")
    reply_obj = await rpc.rpc_demo_service.normal_rpc.call_async("demo")
    result = await reply_obj.result()

    result = await rpc.rpc_demo_service.normal_rpc.sw_dlm_call("demo")
    reply_obj = await rpc.rpc_demo_service.normal_rpc.sw_dlm_call_async("demo")
    result = await reply_obj.result()
    return json({"hello": "world"})


@app.websocket('/ws')
async def ws(request, ws):
    rpc = await rpc_cluster.get_proxy()
    for i in range(3):
        _ = await ws.recv()
        result = await rpc.rpc_demo_service.normal_rpc("demo")
        await ws.send(result)
    ws.close()
    # in websocket handlers, you should call remove() actively at the end
    rpc_cluster.remove()


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```

FastAPI Wrapper

```python
import ssl
from fastapi import FastAPI, WebSocket
from aio_pika import DeliveryMode
from pydantic import BaseSettings
from aio_nameko_proxy.wrappers import FastApiNamekoProxyMiddleware, rpc_cluster  # contextvars required in py36


class Settings(BaseSettings):
    # AMQP_URI: Required
    NAMEKO_AMQP_URI = "pyamqp://guest:[email protected]:5672"
    # rpc_exchange
    NAMEKO_RPC_EXCHANGE = "nameko-rpc"
    # pool_size
    NAMEKO_POOL_SIZE = 60
    # initial_size
    NAMEKO_INITIAL_SIZE = 60
    # time_out
    NAMEKO_TIME_OUT = 30
    # con_time_out
    NAMEKO_CON_TIME_OUT = 5
    # serializer
    NAMEKO_SERIALIZER = "json"
    # ACCEPT
    NAMEKO_ACCEPT = ["pickle", "json"]
    # SERIALIZERS: custom serializers
    NAMEKO_SERIALIZERS = {
        "my_serializer": {
            "encoder": "my_slizer.dumps",
            "decoder": "my_slizer.loads",
            "content_type": "my-content-type",
            "content_encoding": "utf-8"
        }
    }
    # AMQP_SSL: ssl configs
    NAMEKO_AMQP_SSL = {
        'ca_certs': 'certs/ca_certificate.pem',  # or 'cafile': 'certs/ca_certificate.pem',
        'certfile': 'certs/client_certificate.pem',
        'keyfile': 'certs/client_key.pem',
        'cert_reqs': ssl.CERT_REQUIRED
    }
    # delivery_mode
    NAMEKO_DELIVERY_MODE = DeliveryMode.PERSISTENT
    # other supported properties of aio-pika.Message; the key name format is
    # "NAMEKO_{}".format(property_name.upper())
    # ...


settings = Settings()

app = FastAPI()
app.add_middleware(FastApiNamekoProxyMiddleware, config=settings)


@app.get("/")
async def test():
    rpc = await rpc_cluster.get_proxy()

    result = await rpc.rpc_demo_service.normal_rpc("demo")
    reply_obj = await rpc.rpc_demo_service.normal_rpc.call_async("demo")
    result = await reply_obj.result()

    result = await rpc.rpc_demo_service.normal_rpc.sw_dlm_call("demo")
    reply_obj = await rpc.rpc_demo_service.normal_rpc.sw_dlm_call_async("demo")
    result = await reply_obj.result()
    return {"hello": "world"}


@app.websocket("/ws")
async def ws(ws: WebSocket):
    await ws.accept()
    rpc = await rpc_cluster.get_proxy()
    for i in range(3):
        _ = await ws.receive()
        result = await rpc.rpc_demo_service.normal_rpc("demo")
        await ws.send(result)
    ws.close()
    # in websocket handlers, you should call remove() actively at the end
    rpc_cluster.remove()
```
aio-nano
aio-nano

Overview

This library contains an asynchronous python RPC client for Nano nodes, allowing you to more easily develop on the Nano network with fully type annotated methods and responses.

Installation

PIP

    pip install aio-nano

Poetry

    poetry add aio-nano

Example Async HTTP RPC Call

```python
from aio_nano import Client
import asyncio


async def main():
    api_key = ...
    client = Client('https://mynano.ninja/api/node', {'Authorization': api_key})
    supply = await client.available_supply()
    print(supply)

asyncio.run(main())
```

Example Async WebSocket RPC Subscription

```python
import asyncio
from time import time
from aio_nano import WSClient


async def main():
    ws = await WSClient("ws://localhost:7078").connect()
    start = time() * 1000
    await ws.subscribe("confirmation", lambda x: print(x), ack=True)
    print(f"Acked in {time() * 1000 - start}ms")
    await asyncio.Future()

asyncio.run(main())
```
aionanoleaf
aioNanoleaf package

An async Python wrapper for the Nanoleaf API.

Installation

    pip install aionanoleaf

Usage

Import:

```python
from aionanoleaf import Nanoleaf
```

Create an aiohttp.ClientSession to make requests:

```python
from aiohttp import ClientSession
session = ClientSession()
```

Create a Nanoleaf instance:

```python
from aionanoleaf import Nanoleaf
light = Nanoleaf(session, "192.168.0.100")
```

Example

```python
from aiohttp import ClientSession
from asyncio import run

import aionanoleaf


async def main():
    async with ClientSession() as session:
        nanoleaf = aionanoleaf.Nanoleaf(session, "192.168.0.73")
        try:
            await nanoleaf.authorize()
        except aionanoleaf.Unauthorized as ex:
            print("Not authorizing new tokens:", ex)
            return
        await nanoleaf.turn_on()
        await nanoleaf.get_info()
        print("Brightness:", nanoleaf.brightness)
        await nanoleaf.deauthorize()

run(main())
```
aionap
aionap is a Python asyncio enabled REST client. It uses a similar API to slumber and shamelessly copies other parts of it.

Feel free to contribute via pull requests, issues or email messages.

QuickStart

Install aionap:

    $ pip install aionap

Install optional requirement:

    pip install pyyaml

Use it!

Usage

Get an API object and fetch a url/resource (e.g. https://demo.api-platform.com/books):

```python
import aionap

async with aionap.API('https://demo.api-platform.com') as api:
    response = await api.books.get()
```

For more, see the documentation, the test/test_demo_api.py file or the example directory.

Installation

aionap is available via PyPI, just install it as usual:

    $ pip install aionap

aionap requires Python >= 3.6.

[OPTIONAL] PyYaml (required for the yaml serializer):

    $ pip install pyyaml

Features

- Basic Auth support
- JSON, YAML serializers
- GET, POST, PUT, PATCH, DELETE of resources
- Good test coverage

TODO

- OAuth support
- Readthedocs API documentation (SSL_CERT_FILE)

Compatibility

- Python >= 3.6

Licence

BSD 2-Clause License

Authors and Contributors

Christian Assing (Main author)
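The slumber-style interface above maps attribute access onto URL segments, so api.books.get() issues a GET against /books. A stdlib-only sketch of that resource-chaining idea (this Resource class is illustrative, not aionap's actual implementation):

```python
class Resource:
    """Build a URL path from chained attribute access, slumber-style."""

    def __init__(self, base_url, segments=()):
        self._base_url = base_url.rstrip("/")
        self._segments = segments

    def __getattr__(self, name):
        # Each attribute lookup appends one path segment.
        return Resource(self._base_url, self._segments + (name,))

    def __call__(self, resource_id):
        # api.books(42) -> /books/42
        return Resource(self._base_url, self._segments + (str(resource_id),))

    @property
    def url(self):
        return "/".join((self._base_url,) + self._segments)


api = Resource("https://demo.api-platform.com")
print(api.books.url)      # https://demo.api-platform.com/books
print(api.books(42).url)  # https://demo.api-platform.com/books/42
```

In the real client, the final .get()/.post() call would turn the accumulated path into an HTTP request; here only the path-building half is shown.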
aionasa
aionasa

An async python wrapper for NASA open APIs (api.nasa.gov).

Disclaimer

This module is still in the development/testing phase. Bugs are still being worked out and breaking changes are common.

Current Progress: 5/17 APIs

- APOD: NASA Astronomy Picture of the Day
  - API: complete
  - CLI: complete
  - Documentation: complete
- InSight: Mars Weather Data
  - API: complete
  - Documentation: complete
- EPIC: Earth Polychromatic Imaging Camera
  - API: complete
  - Documentation: complete
- Asteroids-NeoWs: Near Earth Object Web Service
  - API: complete
  - Documentation: complete
- Exoplanet: NASA Exoplanet Database
  - API: complete
  - Documentation: complete

Installing

aionasa can be installed from pypi with the command:

    # Linux
    python3 -m pip install -U aionasa

    # Windows
    python -m pip install -U aionasa

To install the development version of the library directly from source:

    $ git clone https://github.com/nwunderly/aionasa
    $ cd aionasa
    $ python3 -m pip install -U .

Quickstart

We'll be using IPython because it supports await expressions directly from the console.

    $ pip install aionasa ipython
    $ ipython

```python
from aionasa import APOD, InSight

async with APOD() as apod:
    picture = await apod.get()

picture.url  # this will be the most recent APOD image URL.

async with InSight() as insight:
    data = await insight.get()

data  # this will be a dict containing the JSON data returned by the API.
```
aio-nasa-tle-loader
SpaceTrackApi client

Small async wrapper for the nasa-tle-loader package.

Requirements

- aiohttp >= 2.0.7
- nasa-tle-loader >= 1.0.0

Installing

    pip install aio-nasa-tle-loader

Getting started

To retrieve something from Space-Track:

```python
# -*- coding: utf-8 -*-
import asyncio
import json

from aio_nasa_tle_loader import AsyncNasaTLELoader


async def main(loop):
    async with AsyncNasaTLELoader(loop=loop) as loader:
        # Getting list of `nasa_tle_loader.TLE` (namedtuple-like) objects
        tle_list = await loader()

    # Print result as JSON
    print(json.dumps([tle.as_dict() for tle in tle_list[:3]], indent=2))


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main(loop))
```

Result:

    [
      {
        "EPOCH": "2017-05-17 13:16:58",
        "EPOCH_MICROSECONDS": "124064",
        "NORAD_CAT_ID": "25544",
        "TLE_LINE0": "ISS",
        "TLE_LINE1": "1 25544U 98067A 17137.55345051 .00016717 00000-0 10270-3 0 9004",
        "TLE_LINE2": "2 25544 51.6389 191.0057 0005051 169.7469 190.3787 15.54030000 16987"
      },
      {
        "EPOCH": "2017-05-17 22:32:35",
        "EPOCH_MICROSECONDS": "151072",
        "NORAD_CAT_ID": "25544",
        "TLE_LINE0": "ISS",
        "TLE_LINE1": "1 25544U 98067A 17137.93929573 .00016717 00000-0 10270-3 0 9014",
        "TLE_LINE2": "2 25544 51.6398 189.0848 0005258 166.2909 193.8387 15.53887043 17040"
      },
      {
        "EPOCH": "2017-05-18 01:37:47",
        "EPOCH_MICROSECONDS": "963136",
        "NORAD_CAT_ID": "25544",
        "TLE_LINE0": "ISS",
        "TLE_LINE1": "1 25544U 98067A 17138.06791624 .00016717 00000-0 10270-3 0 9024",
        "TLE_LINE2": "2 25544 51.6394 188.4430 0005111 170.0057 190.1198 15.53888284 17061"
      }
    ]

Source code

The latest developer version is available in a github repository: https://github.com/nkoshell/aio-nasa-tle-loader
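The TLE records in the result above encode their epoch in the fourth field of TLE_LINE1: a two-digit year followed by a fractional day of year. A small stdlib-only helper (not part of aio-nasa-tle-loader) shows how the EPOCH values in that JSON can be recovered from the raw line:

```python
from datetime import datetime, timedelta, timezone


def tle_epoch(tle_line1: str) -> datetime:
    """Decode the epoch (YYDDD.DDDDDDDD) from the fourth field of TLE line 1."""
    field = tle_line1.split()[3]
    year = int(field[:2])
    year += 2000 if year < 57 else 1900  # TLE convention: 57-99 -> 1957-1999
    day_of_year = float(field[2:])
    # Day 1.0 is Jan 1 00:00 UTC, so subtract one day before offsetting.
    return datetime(year, 1, 1, tzinfo=timezone.utc) + timedelta(days=day_of_year - 1)


line1 = "1 25544U 98067A 17137.55345051 .00016717 00000-0 10270-3 0 9004"
print(tle_epoch(line1).strftime("%Y-%m-%d %H:%M:%S"))  # 2017-05-17 13:16:58
```

For the first record above this reproduces the EPOCH field shown in the JSON ("2017-05-17 13:16:58").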
aionationstates
No description available on PyPI.
aiondao
No description available on PyPI.
aiondata
📊 AionDataAionData is a common data access layer designed for AI-driven drug discovery software. It provides a unified interface to access diverse biochemical databases.InstallationTo install AionData, ensure you have Python 3.10 or newer installed on your system. You can install AionData via pip:pipinstallaiondataDatasetsAionData provides access to the following datasets:BindingDB: A public, web-accessible database of measured binding affinities, focusing chiefly on the interactions of proteins considered to be drug-targets with small, drug-like molecules.MoleculeNet: An extensive collection of datasets curated to support and benchmark the development of machine learning models in the realm of drug discovery and chemical informatics. Covers a broad spectrum of molecular data including quantum mechanical properties, physical chemistry, biophysics, and physiological effects.Tox21: Features qualitative toxicity measurements for 12,000 compounds across 12 targets, used for toxicity prediction.ESOL: Contains water solubility data for 1,128 compounds, aiding in solubility prediction models.FreeSolv: Provides experimental and calculated hydration free energy for small molecules, crucial for understanding solvation.Lipophilicity: Includes experimental measurements of octanol/water distribution coefficients (logD) for 4,200 compounds.QM7: A dataset of 7,165 molecules with quantum mechanical properties computed using density functional theory (DFT).QM8: Features electronic spectra and excited state energies of over 20,000 small molecules computed with TD-DFT.QM9: Offers geometric, energetic, electronic, and thermodynamic properties of ~134k molecules computed with DFT.MUV: Datasets designed for the validation of virtual screening techniques, with about 93,000 compounds.HIV: Contains data on the ability of compounds to inhibit HIV replication, for binary classification tasks.BACE: Includes quantitative binding results for inhibitors of human beta-secretase 1, with both 
classification and regression tasks.BBBP: Features compounds with information on permeability properties across the Blood-Brain Barrier.SIDER: Contains information on marketed medicines and their recorded adverse drug reactions, for side effects prediction.ClinTox: Compares drugs approved by the FDA and those that failed clinical trials for toxicity reasons, for binary classification and toxicity prediction.LicenseAionData is licensed under the Apache License. See the LICENSE file for more details.
aionefit
Python asyncio library for (some) Bosch thermostats

aionefit is a Python library to control some Bosch thermostats using the Python asyncio framework. This is done with the Slixmpp library.

As there is no known way to talk directly to the thermostat, all communication has to be routed via cloud servers at Bosch.

Original code

This software is based on code of the following projects:

- https://github.com/robertklep/nefit-easy-client
- https://github.com/patvdleer/nefit-client-python

Supported hardware (source)

- Nefit Easy (Netherlands)
- Junkers Control CT100 (Belgium, Germany)
- Buderus Logamatic TC100 (Belgium)
- E.L.M. Touch (France)
- Worcester Wave (UK)
- Bosch Control CT-100 (Other)
aionefit-updated
Python asyncio library for (some) Bosch thermostats

aionefit is a Python library to control some Bosch thermostats using the Python asyncio framework. This is done with the Slixmpp library.

As there is no known way to talk directly to the thermostat, all communication has to be routed via cloud servers at Bosch.

Original code

This software is based on code of the following projects:

- https://github.com/robertklep/nefit-easy-client
- https://github.com/patvdleer/nefit-client-python

Supported hardware (source)

- Nefit Easy (Netherlands)
- Junkers Control CT100 (Belgium, Germany)
- Buderus Logamatic TC100 (Belgium)
- E.L.M. Touch (France)
- Worcester Wave (UK)
- Bosch Control CT-100 (Other)
aioneo4j
info: asyncio client for neo4j

Installation

    pip install aioneo4j

Usage

```python
import asyncio

from aioneo4j import Neo4j


async def go():
    async with Neo4j('http://neo4j:[email protected]:7474/') as neo4j:
        data = await neo4j.data()
        assert bool(data)


loop = asyncio.get_event_loop()
loop.run_until_complete(go())
loop.close()
```
aionet
Asynchronous multi-vendor library for interacting with network devices.

This is a fork of netdev, with refactored code and new features added.

Requires:

- Python >= 3.5
- asyncio
- AsyncSSH
- pyYAML

Supports:

- Cisco IOS
- Cisco IOS XE
- Cisco IOS XR
- Cisco ASA
- Cisco NX-OS
- HP Comware
- Fujitsu Blade Switches
- Mikrotik RouterOS
- Arista EOS
- Juniper JunOS
- Aruba AOS 6.X
- Aruba AOS 8.X
- Terminal

Features:

- SSH
- Telnet
- TextFSM

Examples:

Example of interacting with Cisco IOS devices:

```python
import asyncio
import aionet


async def task(device):
    async with aionet.ConnectionHandler(**device) as conn:
        out = await conn.send_command("show ver")
        print(out)
        commands = ["interface vlan2", "no shut"]
        out = await conn.send_config_set(commands)


async def run():
    dev1 = {'username': 'user',
            'password': 'pass',
            'device_type': 'cisco_ios',
            'ip': 'ip address'}
    dev2 = {'username': 'user',
            'password': 'pass',
            'device_type': 'cisco_ios',
            'ip': 'ip address'}
    devices = [dev1, dev2]
    tasks = [task(dev) for dev in devices]
    await asyncio.wait(tasks)


loop = asyncio.get_event_loop()
loop.run_until_complete(run())
```
aionetbox
AIO Netbox

An asyncio netbox library that conforms to any running netbox via its OpenAPI spec.

Installation

AIONetbox is distributed as a library intended to be included in other asyncio python projects. It has been developed on python 3.6+, though 3.8 is recommended.

    pip install aionetbox

Usage

```python
from aionetbox import AIONetbox

netbox = AIONetbox.from_openapi(
    url='http://localhost:8000',
    api_key='0123abcd'
)

sites = await netbox.dcim.dcim_sites_list()
my_site = await netbox.dcim.dcim_sites_read(id=2)
custom_field_sort = await netbox.dcim.dcim_regions_list(cf_sf_id='identifier')
```

Each module and method map to the swagger definition for netbox (/api/docs).
aio-net-events
aio-net-eventsaio-net-eventsis a Python library that provides asynchronous generators yielding events when the network configuration of the machine changes. Currently only network interface additions / removals and IP address additions / removals are supported; more events may be added later.Supports Windows, Linux and macOS at the moment.Requires Python >= 3.8.Works withasyncioandtrio.InstallationUse the package managerpipto installaio-net-events.pipinstallaio-net-eventsUsageContributingPull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.Please make sure to update tests as appropriate.LicenseMIT
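The description above mentions asynchronous generators of network events but its Usage section carries no snippet here. As a stdlib-only illustration of the pattern it describes, an async generator that a consumer iterates with `async for`, here is a sketch with a simulated event source (the function name and event tuples are hypothetical, not aio-net-events' API):

```python
import asyncio


async def network_events():
    """Simulated event source: yield (event, subject) tuples as changes occur."""
    changes = [
        ("interface_added", "eth1"),
        ("address_added", "192.0.2.10"),
        ("interface_removed", "eth1"),
    ]
    for event in changes:
        await asyncio.sleep(0)  # a real watcher would await an OS notification here
        yield event


async def main():
    # The consumer simply iterates; it never polls the OS itself.
    async for name, subject in network_events():
        print(f"{name}: {subject}")

asyncio.run(main())
```

The same consumer loop works unchanged whether the generator is backed by a simulation, asyncio, or trio, which is the appeal of exposing changes as an async iterator.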
aionetrunner
No description available on PyPI.
aionetworking
Various utilities for networking with asyncio
aionewsapi
aionewsapi

Asyncio client for interacting with newsapi.org

Installation instructions:

    pip install aionewsapi

Usage:

```python
from aionewsapi import NewsAPI

client = NewsAPI()
```

Note

aionewsapi is not affiliated with or endorsed by any of the web services it interacts with.