package (string, lengths 1–122)
package-description (string, lengths 0–1.3M)
aiobook
AioBookAioBook it is async framework for build messenger application in facebookInstallationUse the package managerpipto install aiobook.pipinstallaiobookUsageFacebook HandlerFacebook Handler can handle allfacebook webhook eventsYou can use decorators:fromaiobookimportFacebookHandlerhandler=FacebookHandler("<page_access_token","<verification_token>",skip_confirm_execution=True)@handler.handle_messageasyncdefhandle_message(event):print("receive your message{}".format(event.text))@handler.handle_postbackasyncdefhandle_postback(event):print("receive your postback with payload{}".format(event.postback))Or directly register handlers:asyncdefhandle_message(event):print("receive your message{}".format(event.text))asyncdefhandle_postback(event):print("receive your postback with payload{}".format(event.postback))handler.set_webhook_handler("message",handle_message)handler.set_webhook_handler("postback",handle_postback)To get list allowed and defined handlers:handler.get_allowed_handlers()handler.get_defined_handlers()Also you can set handler before_handle, and after_handle. It will be called before(or after) handle_event:@handler.before_handleasyncdeflog_message(event):logging.info("{}handled.".format(event.name))@handler.after_handleasyncdeflog_message(event):logging.info("{}handled.".format(event.name))To receive message you need register handler in HTTP Server:fromaiohttpimportwebfromaiobookimportFacebookHandlerhandler=FacebookHandler("<page_access_token","<verification_token>",skip_confirm_execution=True)app=web.Application()app.add_routes([web.get("<url_pattern>",handler.handle_get)])app.add_routes([web.post("<url_pattern>",handler.handle_post)])MessengerMessenger supportsSend API method. Facebook Handler included in Messenger.fromaiobookimportMessengermessenger=Messenger("<page_access_token","<verification_token>",skip_confirm_execution=True)@messenger.handler.handle_messageasyncdefhandle_message(event):awaitmessenger.send(event.sender_id,"Your message:{}".format(event.text))@messenger.handler.handle_postbackasyncdefhandle_postback(event):awaitmessenger.send(event.sender_id,"Your press button:{}".format(event.postback))messenger.sendAllow to send text or templates:awaitmessenger.send(event.sender_id,message,quick_replies=None,messaging_type=None,metadata=None,notification_type=None,tag=None)Allowed types for messenger.sendString message and templatesSupported TemplatesButtonTemplate, GenericTemplate, ListTemplate, OpenGraphTemplate, MediaTemplate.Supported ButtonsCallButton, GamePlayButton, LogInButton, LogOutButton, PostbackButton, UrlButtonOtherQuickReply, Element, MediaElement, OpenGraphElementfromaiobook.core.facebookimportQuickReplyfromaiobook.core.facebookimportElement,MediaElement,OpenGraphElementfromaiobook.core.facebookimportButtonTemplate,GenericTemplate,ListTemplate,OpenGraphTemplate,MediaTemplatefromaiobook.core.facebookimportCallButton,GamePlayButton,LogInButton,LogOutButton,PostbackButton,UrlButtonawaitmessenger.send(event.sender_id,ButtonTemplate('Hi, press buttons',buttons=[PostbackButton('test','test_payload'),UrlButton(title='test_rl',url='https://www.messenger.com')]))awaitmessenger.send(event.sender_id,GenericTemplate([Element('test',buttons=[PostbackButton('test','test_payload'),UrlButton(title='test_rl',url='https://www.messenger.com')]),Element('test2',image_url,'test2',buttons=[PostbackButton('test','test_payload'),UrlButton(title='test_rl',url='https://www.messenger.com')])]))messenger.get_user_profileAllows you to use a sender_id to retrieve user profile 
information:response=awaitmessenger.get_user_profile(event.sender_id,fields=("first_name","last_name"))Next fields aresupported.messenger.get_page_infoAllows you to retrieve your page information:response=awaitmessenger.get_page_info()messenger.imitate_typingDecorate func to imitate typing with defined timeout before answer. Included mark_seen, typing_on and typing_off [email protected][email protected]_typing(1)asyncdefhandle_postback(event):awaitmessenger.send(event.sender_id,"Your press button:{}".format(event.postback))Or you can use sender actions independently:messenger.mark_seenSender action to mark last message as readawaitmessenger.mark_seen(event.sender_id)messenger.typing_onSender action to turn typing indicators onawaitmessenger.typing_on(event.sender_id)messenger.typing_offSender action to turn typing indicators offawaitmessenger.typing_off(event.sender_id)AioBook AppAioBook it is small aiohttp wrapper that helps manage and deploy your messenger appfromaiobookimportAioBookAppfromaiobookimportMessengerapp=AioBookApp(port=3000)messenger=Messenger("<page_access_token","<verification_token>",skip_confirm_execution=True)app.register_messenger(messenger)app.start_bot()LicenseMIT
aiobooru
Aiobooru - A package for the Danbooru API using aiohttp

Use like:

    import asyncio

    import aiobooru

    async def run():
        async with aiobooru.Booru() as a:
            p = await a.post(1)
            await p.download('image.jpg')

    asyncio.run(run())
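Because Booru is an async context manager, the same calls can also be issued concurrently. The sketch below reuses only what the snippet above documents (Booru, post, download); the post IDs and output file names are made-up placeholders.

    import asyncio

    import aiobooru

    async def save_posts(post_ids):
        # Reuse one Booru session for all requests.
        async with aiobooru.Booru() as booru:
            # Fetch all posts concurrently.
            posts = await asyncio.gather(*(booru.post(pid) for pid in post_ids))
            for i, post in enumerate(posts):
                # Hypothetical output file names.
                await post.download(f'post_{i}.jpg')

    asyncio.run(save_posts([1, 2, 3]))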
aioboot
AIOBoot - Python asynchronous application framework, built on top of famous libraries.

Documentation: https://aioboot.readthedocs.io/en/latest/
aiobosest
About

Powered by the Bose® SoundTouch® Developers API.

Features

- Support GET and POST using the REST API
- WebSocket connection to receive updates from the system

Getting Started

A simple client example is provided in the examples/ directory.

Documentation

https://aiobosest.readthedocs.io/

Requirements

- Python >= 3.5
- aiohttp
- lxml
- zeroconf

License

aiobosest is offered under the GPLv3 license.

Source code

The latest version is available in a github repository: https://github.com/trunet/aiobosest
aiobot
No description available on PyPI.
aioboto3
Breaking changes for v11: The S3Transfer config passed into upload/download_file etc.. has been updated to that it matches what boto3 usesBreaking changes for v9: aioboto3.resource and aioboto3.client methods no longer exist, make a session then call session.client etc…This was done for various reasons but mainly that it prevents the default session living longer than it should as that breaks situations where eventloops are replaced.The .client and .resource functions must now be used as async context managers.Now that aiobotocore has reached version 1.0.1, a side effect of the work put in to fix various issues like bucket region redirection and supporting web assume role type credentials, the client must now be instantiated using a context manager, which by extension applies to the resource creator. You used to get away with callingres =aioboto3.resource('dynamodb')but that no longer works. If you really want to do that, you can dores = awaitaioboto3.resource('dynamodb').__aenter__()but you’ll need to remember to call__aexit__.There will most likely be some parts that dont work now which I’ve missed, just make an issue and we’ll get them resoved quickly.Creating service resources must also be async now, e.g.asyncdefmain():session=aioboto3.Session()asyncwithsession.resource("s3")ass3:bucket=awaits3.Bucket('mybucket')# <----------------asyncfors3_objectinbucket.objects.all():print(s3_object)Updating to aiobotocore 1.0.1 also brings with it support for running inside EKS as well as asyncifyingget_presigned_urlThis package is mostly just a wrapper combining the great work ofboto3andaiobotocore.aiobotocore allows you to use near enough all of the boto3 client commands in an async manner just by prefixing the command withawait.With aioboto3 you can now use the higher level APIs provided by boto3 in an asynchronous manner. Mainly I developed this as I wanted to use the boto3 dynamodb Table object in some async microservices.While all resources in boto3 should work I havent tested them all, so if what your after is not in the table below then try it out, if it works drop me an issue with a simple test case and I’ll add it to the table.ServicesStatusDynamoDB Service ResourceTested and workingDynamoDB TableTested and workingS3WorkingKinesisWorkingSSM Parameter StoreWorkingAthenaWorkingExampleSimple example of using aioboto3 to put items into a dynamodb tableimportasyncioimportaioboto3fromboto3.dynamodb.conditionsimportKeyasyncdefmain():session=aioboto3.Session()asyncwithsession.resource('dynamodb',region_name='eu-central-1')asdynamo_resource:table=awaitdynamo_resource.Table('test_table')awaittable.put_item(Item={'pk':'test1','col1':'some_data'})result=awaittable.query(KeyConditionExpression=Key('pk').eq('test1'))# Example batch writemore_items=[{'pk':'t2','col1':'c1'},\{'pk':'t3','col1':'c3'}]asyncwithtable.batch_writer()asbatch:foritem_inmore_items:awaitbatch.put_item(Item=item_)loop=asyncio.get_event_loop()loop.run_until_complete(main())# Outputs:# [{'col1': 'some_data', 'pk': 'test1'}]Things that either dont work or have been patchedAs this library literally wraps boto3, its inevitable that some things won’t magically be async.Fixed:s3_client.download_file*This is performed by the s3transfer module. – Patched with get_objects3_client.upload_file*This is performed by the s3transfer module. – Patched with custom multipart uploads3_client.copyThis is performed by the s3transfer module. 
– Patched to use get_object -> upload_fileobjectdynamodb_resource.Table.batch_writerThis now returns an async context manager which performs the same functionResource waiters - You can now await waiters which are part of resource objects, not just client waiters, e.g.await dynamodbtable.wait_until_exists()Resource object properties are normally autoloaded, now they are all co-routines and the metadata they come from will be loaded on first await and then cached thereafter.S3 Bucket.objects object now works and has been asyncified. Examples here -https://aioboto3.readthedocs.io/en/latest/usage.html#s3-resource-objectsAmazon S3 Client-Side EncryptionBoto3 doesn’t support AWS client-side encryption so until they do I’ve added basic support for it. Docs hereCSECSE requires the pythoncryptographylibrary so if you dopip install aioboto3[s3cse]that’ll also include cryptography.This library currently supports client-side encryption using KMS-Managed master keys performing envelope encryption using either AES/CBC/PKCS5Padding or preferably AES/GCM/NoPadding. The files generated are compatible with the Java Encryption SDK so I will assume they are compatible with the Ruby, PHP, Go and C++ libraries as well.Non-KMS managed keys are not yet supported but if you have use of that, raise an issue and i’ll look into it.DocumentationDocs are here -https://aioboto3.readthedocs.io/en/latest/Examples here -https://aioboto3.readthedocs.io/en/latest/usage.htmlFeaturesClosely mimics the usage of boto3.TodoMore examplesSet up docsLook into monkey-patching the aws xray sdk to be more async if it needs to be.CreditsThis package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template. It also makes use of theaiobotocoreandboto3libraries. All the credit goes to them, this is mainly a wrapper with some examples.
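For S3, the same Session/async-context-manager pattern shown above for DynamoDB applies. The following is a minimal sketch rather than an excerpt from the aioboto3 docs: the bucket name and key are placeholders, credentials are assumed to come from the environment, and the streaming-body usage mirrors the underlying aiobotocore behaviour.

    import asyncio

    import aioboto3

    async def main():
        session = aioboto3.Session()
        # .client() must be used as an async context manager (see above).
        async with session.client("s3") as s3:
            # Plain low-level calls just need an ``await`` in front of them.
            await s3.put_object(Bucket="my-bucket", Key="hello.txt", Body=b"hello world")
            resp = await s3.get_object(Bucket="my-bucket", Key="hello.txt")
            async with resp["Body"] as stream:
                print(await stream.read())

    asyncio.run(main())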
aiobotocore
Async client for amazon services usingbotocoreandaiohttp/asyncio.This library is a mostly full featured asynchronous version of botocore.Install$ pip install aiobotocoreBasic Exampleimportasynciofromaiobotocore.sessionimportget_sessionAWS_ACCESS_KEY_ID="xxx"AWS_SECRET_ACCESS_KEY="xxx"asyncdefgo():bucket='dataintake'filename='dummy.bin'folder='aiobotocore'key='{}/{}'.format(folder,filename)session=get_session()asyncwithsession.create_client('s3',region_name='us-west-2',aws_secret_access_key=AWS_SECRET_ACCESS_KEY,aws_access_key_id=AWS_ACCESS_KEY_ID)asclient:# upload object to amazon s3data=b'\x01'*1024resp=awaitclient.put_object(Bucket=bucket,Key=key,Body=data)print(resp)# getting s3 object properties of file we just uploadedresp=awaitclient.get_object_acl(Bucket=bucket,Key=key)print(resp)# get object from s3response=awaitclient.get_object(Bucket=bucket,Key=key)# this will ensure the connection is correctly re-used/closedasyncwithresponse['Body']asstream:assertawaitstream.read()==data# list s3 objects using paginatorpaginator=client.get_paginator('list_objects')asyncforresultinpaginator.paginate(Bucket=bucket,Prefix=folder):forcinresult.get('Contents',[]):print(c)# delete object from s3resp=awaitclient.delete_object(Bucket=bucket,Key=key)print(resp)loop=asyncio.get_event_loop()loop.run_until_complete(go())Context Manager ExamplesfromcontextlibimportAsyncExitStackfromaiobotocore.sessionimportAioSession# How to use in existing context managerclassManager:def__init__(self):self._exit_stack=AsyncExitStack()self._s3_client=Noneasyncdef__aenter__(self):session=AioSession()self._s3_client=awaitself._exit_stack.enter_async_context(session.create_client('s3'))asyncdef__aexit__(self,exc_type,exc_val,exc_tb):awaitself._exit_stack.__aexit__(exc_type,exc_val,exc_tb)# How to use with an external exit_stackasyncdefcreate_s3_client(session:AioSession,exit_stack:AsyncExitStack):# Create client and add cleanupclient=awaitexit_stack.enter_async_context(session.create_client('s3'))returnclientasyncdefnon_manager_example():session=AioSession()asyncwithAsyncExitStack()asexit_stack:s3_client=awaitcreate_s3_client(session,exit_stack)# do work with s3_clientSupported AWS ServicesThis is a non-exuastive list of what tests aiobotocore runs against AWS services. Not all methods are tested but we aim to test the majority of commonly used methods.ServiceStatusS3WorkingDynamoDBBasic methods testedSNSBasic methods testedSQSBasic methods testedCloudFormationStack creation testedKinesisBasic methods testedDue to the way boto3 is implemented, its highly likely that even if services are not listed above that you can take anyboto3.client(‘service’)and stickawaitinfront of methods to make them async, e.g.await client.list_named_queries()would asynchronous list all of the named Athena queries.If a service is not listed here and you could do with some tests or examples feel free to raise an issue.Run TestsThere are two set of tests, those that can be mocked throughmotorunning in docker, and those that require running against a personal amazon key. 
The CI only runs the moto tests.To run the moto tests:$ make mototestTo run the non-moto tests:Make sure you have development requirements installed and your amazon key and secret accessible via environment variables:$ pip install pip-tools $ pip-compile requirements-dev.txt $ pip-sync requirements-dev.txt $ export AWS_ACCESS_KEY_ID=xxx $ export AWS_SECRET_ACCESS_KEY=xxxExecute tests suite:$ make testEnable type checking and code completionInstalltypes-aiobotocorethat contains type annotations foraiobotocoreand all supportedbotocoreservices.# install aiobotocore type annotations # for ec2, s3, rds, lambda, sqs, dynamo and cloudformationpython-mpipinstall'types-aiobotocore[essential]'# or install annotations for services you usepython-mpipinstall'types-aiobotocore[acm,apigateway]'# Lite version does not provide session.create_client overloads # it is more RAM-friendly, but requires explicit type annotationspython-mpipinstall'types-aiobotocore-lite[essential]'Now you should be able to runPylance,pyright, ormypyfor type checking as well as code completion in your IDE.Fortypes-aiobotocore-litepackage use explicit type annotations:fromaiobotocore.sessionimportget_sessionfromtypes_aiobotocore_s3.clientimportS3Clientsession=get_session()asyncwithsession.create_client("s3")asclient:client:S3Client# type checking and code completion is now enabled for clientFull documentation fortypes-aiobotocorecan be found here:https://youtype.github.io/types_aiobotocore_docs/Mailing Listhttps://groups.google.com/forum/#!forum/aio-libsRequirementsPython3.8+aiohttpbotocoreawscli & boto3awscli and boto3 depend on a single version, or a narrow range of versions, of botocore. However, aiobotocore only supports a specific range of botocore versions. To ensure you install the latest version of awscli and boto3 that your specific combination or aiobotocore and botocore can support use:pip install -U 'aiobotocore[awscli,boto3]'If you only need awscli and not boto3 (or vice versa) you can just install one extra or the other.Changes2.12.0 (2024-02-28)bump botocore dependency specification2.11.2 (2024-02-02)bump botocore dependency specification2.11.1 (2024-01-25)bump botocore dependency specification2.11.0 (2024-01-19)send project-specificUser-AgentHTTP header #8532.10.0 (2024-01-18)bump botocore dependency specification2.9.1 (2024-01-17)fix race condition in S3 Express identity cache #10722.9.0 (2023-12-12)bump botocore dependency specification2.8.0 (2023-11-28)add AioStubber that returns AioAWSResponse()remove confusingaiobotocore.session.Sessionsymbolbump botocore dependency specification2.7.0 (2023-10-17)add support for Python 3.12drop more Python 3.7 support (EOL)relax botocore dependency specification2.6.0 (2023-08-11)bump aiohttp minimum version to 3.7.4.post0drop python 3.7 support (EOL)2.5.4 (2023-08-07)fix __aenter__ attribute error introduced in refresh bugfix (#1031)2.5.3 (2023-08-06)add more support for Python 3.11bump botocore to 1.31.17add waiter.wait returnfix SSO token refresh bug #10252.5.2 (2023-07-06)fix issue #10202.5.1 (2023-06-27)bump botocore to 1.29.1612.5.0 (2023-03-06)bump botocore to 1.29.76 (thanks @jakob-keller #999)2.4.2 (2022-12-22)fix retries (#988)2.4.1 (2022-11-28)Adds support for checksums in streamed request trailers (thanks @terrycain #962)2.4.0 (2022-08-25)bump botocore to 1.27.592.3.4 (2022-06-23)fix select_object_content2.3.3 (2022-06-07)fix connect timeout while getting IAM credsfix test files appearing in distribution package2.3.2 (2022-05-08)fix 3.6 testing and and actually fix 
3.6 support2.3.1 (2022-05-06)fix 3.6 supportAioConfig: allow keepalive_timeout to be None (thanks @dnlserrano #933)2.3.0 (2022-05-05)fix encoding issue by swapping to AioAWSResponse and AioAWSRequest to behave more like botocorefix exceptions mappings2.2.0 (2022-03-16)remove deprecated APIsbump to botocore 1.24.21re-enable retry of aiohttp.ClientPayloadError2.1.2 (2022-03-03)fix httpsession close call2.1.1 (2022-02-10)implement asynchronous non-blocking adaptive retry strategy2.1.0 (2021-12-14)bump to botocore 1.23.24fix aiohttp resolver config param #9062.0.1 (2021-11-25)revert accidental dupe of _register_s3_events #867 (thanks @eoghanmurray)Support customizing the aiohttp connector resolver class #893 (thanks @orf)fix timestream query #9022.0.0 (2021-11-02)bump to botocore 1.22.8turn off defaultAIOBOTOCORE_DEPRECATED_1_4_0_APISenv var to match botocore module. See notes in 1.4.0.1.4.2 (2021-09-03)Fix missing close() method on http session (thanks@terrycain)Fix for verify=False1.4.1 (2021-08-24)put backwards incompatible changes behindAIOBOTOCORE_DEPRECATED_1_4_0_APISenv var. This means that#876will not work unless this env var has been set to 0.1.4.0 (2021-08-20)fix retries via config#877remove AioSession and get_session top level names to matchbotocorechange exceptions raised to match those ofbotocore, seemappings1.3.3 (2021-07-12)fix AioJSONParser#8721.3.2 (2021-07-07)Bump tobotocoreto1.20.1061.3.1 (2021-06-11)TCPConnector: change deprecated ssl_context to sslfix non awaited generate presigned url calls#8681.3.0 (2021-04-09)Bump tobotocoreto1.20.49#8561.2.2 (2021-03-11)Await call to async method _load_creds_via_assume_role#858(thanks@puzza007)1.2.1 (2021-02-10)verify strings are now correctly passed to aiohttp.TCPConnector#851(thanks@FHTMitchell)1.2.0 (2021-01-11)bump botocore to1.19.52use passed in http_session_cls param to create_client#7971.1.2 (2020-10-07)fix AioPageIterator search method #831 (thanks@joseph-jones)1.1.1 (2020-08-31)fix s3 region redirect bug #8251.1.0 (2020-08-18)bump botocore to 1.17.441.0.7 (2020-06-04)fix generate_db_auth_token via #8161.0.6 (2020-06-04)revert __getattr__ fix as it breaks ddtrace1.0.5 (2020-06-03)Fixed AioSession.get_service_data emit call #811 via #812Fixed async __getattr__ #789 via #8031.0.4 (2020-04-15)Fixed S3 Presigned Post not being async1.0.3 (2020-04-09)Fixes typo when using credential process1.0.2 (2020-04-05)Disable Client.__getattr__ emit for now #7891.0.1 (2020-04-01)Fixed signing requests with explicit credentials1.0.0 (2020-03-31)API breaking: The result of create_client is now a required async context classCredential refresh should now workgenerate_presigned_url is now an async call along with other credential methodsCredentials.[access_key/secret_key/token] now raise NotImplementedError because they won’t call refresh like botocore. 
Instead should use get_frozen_credentials async methodBump botocore and extras0.12.0 (2020-02-23)Bump botocore and extrasDrop support for 3.5 given we are unable to test it with moto and it will soon be unsupportedRemove loop parameters for Python 3.8 complianceRemove deprecated AioPageIterator.next_page0.11.1 (2020-01-03)Fixed event streaming API calls like S3 Select.0.11.0 (2019-11-12)replace CaseInsensitiveDict with urllib3 equivalent #744 (thanks to inspiration from @craigmccarter and @kevchentw)bump botocore to 1.13.14fix for mismatched botocore method replacements0.10.4 (2019-10-24)Make AioBaseClient.close method async #724 (thanks @bsitruk)Bump awscli, boto3, botocore #735 (thanks @bbrendon)switch paginator to async_generator, add result_key_iters (deprecate next_page method)0.10.3 (2019-07-17)Bump botocore and extras0.10.2 (2019-02-11)Fix response-received emitted event #6820.10.1 (2019-02-08)Make tests pass with pytest 4.1 #669 (thanks @yan12125)Support Python 3.7 #671 (thanks to @yan12125)Update RTD build config #672 (thanks @willingc)Bump to botocore 1.12.91 #6790.10.0 (2018-12-09)Update to botocore 1.12.49 #639 (thanks @terrycain)0.9.4 (2018-08-08)Add ClientPayloadError as retryable exception0.9.3 (2018-07-16)Bring botocore up to date0.9.2 (2018-05-05)bump aiohttp requirement to fix read timeouts0.9.1 (2018-05-04)fix timeout bug introduced in last release0.9.0 (2018-06-01)bump aiohttp to 3.3.xremove unneeded set_socket_timeout0.8.0 (2018-05-07)Fix pagination #573 (thanks @adamrothman)Enabled several s3 tests via motoBring botocore up to date0.7.0 (2018-05-01)Just version bump0.6.1a0 (2018-05-01)bump to aiohttp 3.1.xswitch tests to Python 3.5+switch to native coroutinesfix non-streaming body timeout retries0.6.0 (2018-03-04)Upgrade to aiohttp>=3.0.0 #536 (thanks @Gr1N)0.5.3 (2018-02-23)Fixed waiters #523 (thanks @dalazx)fix conn_timeout #4850.5.2 (2017-12-06)Updated awscli dependency #4610.5.1 (2017-11-10)Disabled compressed response #4300.5.0 (2017-11-10)Fix error botocore error checking #190Update supported botocore requirement to: >=1.7.28, <=1.7.40Bump aiohttp requirement to support compressed responses correctly #2980.4.5 (2017-09-05)Added SQS examples and tests #336Changed requirements.txt structure #336bump to botocore 1.7.4Added DynamoDB examples and tests #3400.4.4 (2017-08-16)add the supported versions of boto3 to extras require #3240.4.3 (2017-07-05)add the supported versions of awscli to extras require #273 (thanks @graingert)0.4.2 (2017-07-03)update supported aiohttp requirement to: >=2.0.4, <=2.3.0update supported botocore requirement to: >=1.5.71, <=1.5.780.4.1 (2017-06-27)fix redirects #2680.4.0 (2017-06-19)update botocore requirement to: botocore>=1.5.34, <=1.5.70fix read_timeout due to #245implement set_socket_timeout0.3.3 (2017-05-22)switch to PEP 440 version parser to support ‘dev’ versions0.3.2 (2017-05-22)Fix botocore integrationProvisional fix for aiohttp 2.x stream supportupdate botocore requirement to: botocore>=1.5.34, <=1.5.520.3.1 (2017-04-18)Fixed Waiter support0.3.0 (2017-04-01)Added support for aiohttp>=2.0.4 (thanks @achimnol)update botocore requirement to: botocore>=1.5.0, <=1.5.330.2.3 (2017-03-22)update botocore requirement to: botocore>=1.5.0, <1.5.290.2.2 (2017-03-07)set aiobotocore.__all__ for * imports #121 (thanks @graingert)fix ETag in head_object response #1320.2.1 (2017-02-01)Normalize headers and handle redirection by botocore #115 (thanks @Fedorof)0.2.0 (2017-01-30)add support for proxies (thanks @jjonek)remove AioConfig verify_ssl 
connector_arg as this is handled by the create_client verify paramremove AioConfig limit connector_arg as this is now handled by by the Configmax_pool_connectionsproperty (note default is 10)0.1.1 (2017-01-16)botocore updated to version 1.5.00.1.0 (2017-01-12)Pass timeout to aiohttp.request to enforce read_timeout #86 (thanks @vharitonsky) (bumped up to next semantic version due to read_timeout enabling change)0.0.6 (2016-11-19)Added enforcement of plain response #57 (thanks @rymir)botocore updated to version 1.4.73 #74 (thanks @vas3k)0.0.5 (2016-06-01)Initial alpha release
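To illustrate one item from the changelog above (generate_presigned_url became an async call in 1.0.0), a presigned GET URL can be produced as follows. This is a sketch that reuses the bucket and key from the Basic Example; credentials are assumed to be configured in the environment.

    import asyncio

    from aiobotocore.session import get_session

    async def main():
        session = get_session()
        async with session.create_client("s3", region_name="us-west-2") as client:
            # Unlike plain botocore, this call must be awaited.
            url = await client.generate_presigned_url(
                "get_object",
                Params={"Bucket": "dataintake", "Key": "aiobotocore/dummy.bin"},
                ExpiresIn=3600,
            )
            print(url)

    asyncio.run(main())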
aio-botocore
The sole purpose of this fork is to release a version of [aiobotocore](https://github.com/aio-libs/aiobotocore) with relaxed constraints on the botocore and boto3 dependencies. This should let dependency-resolution algorithms work more efficiently with this library.

Hopefully any risks in relaxing the allowed botocore and boto3 versions are minimal. However, use at your own risk (i.e. rely on your own unit tests and test coverage to manage that risk).

If the original library works for your purposes, use it instead of this one. If changes to this library prove useful, some form of them might be integrated into the original project; if so, this library will hopefully cease to exist (or at least cease to be maintained in this form).

Install

$ pip install aio-botocore

The original library is installed with:

$ pip install aiobotocore
aiobotocore-botorange
Async client for amazon services usingbotocoreandaiohttp/asyncio.Main purpose of this library to support amazon s3 api, but other services should work (may be with minor fixes). For now we have tested only upload/download api for s3, other users report that SQS and Dynamo services work also. More tests coming soon.Install$ pip install aiobotocoreBasic ExampleimportasyncioimportaiobotocoreAWS_ACCESS_KEY_ID="xxx"AWS_SECRET_ACCESS_KEY="xxx"asyncdefgo(loop):bucket='dataintake'filename='dummy.bin'folder='aiobotocore'key='{}/{}'.format(folder,filename)session=aiobotocore.get_session(loop=loop)asyncwithsession.create_client('s3',region_name='us-west-2',aws_secret_access_key=AWS_SECRET_ACCESS_KEY,aws_access_key_id=AWS_ACCESS_KEY_ID)asclient:# upload object to amazon s3data=b'\x01'*1024resp=awaitclient.put_object(Bucket=bucket,Key=key,Body=data)print(resp)# getting s3 object properties of file we just uploadedresp=awaitclient.get_object_acl(Bucket=bucket,Key=key)print(resp)# delete object from s3resp=awaitclient.delete_object(Bucket=bucket,Key=key)print(resp)# list s3 objects using paginatorpaginator=client.get_paginator('list_objects')asyncforresultinpaginator.paginate(Bucket=bucket,Prefix=folder):forcinresult.get('Contents',[]):print(c)# get object from s3response=awaitclient.get_object(Bucket=bucket,key=key)# this will ensure the connection is correctly re-used/closedasyncwithresponse['Body']asstream:bytes=awaitstream.read()loop=asyncio.get_event_loop()loop.run_until_complete(go(loop))Run TestsMake sure you have development requirements installed and your amazon key and secret accessible via environment variables:$ cd aiobotocore $ export AWS_ACCESS_KEY_ID=xxx $ export AWS_SECRET_ACCESS_KEY=xxx $ pip install -Ur requirements-dev.txtExecute tests suite:$ py.test -v testsMailing Listhttps://groups.google.com/forum/#!forum/aio-libsRequirementsPython3.4+aiohttpbotocoreChanges0.2.3 (XXXX-XX-XX)0.2.2 (2017-03-07)set aiobotocore.__all__ for * imports #121 (thanks @graingert)fix ETag in head_object response #1320.2.1 (2017-02-01)Normalize headers and handle redirection by botocore #115 (thanks @Fedorof)0.2.0 (2017-01-30)add support for proxies (thanks @jjonek)remove AioConfig verify_ssl connector_arg as this is handled by the create_client verify paramremove AioConfig limit connector_arg as this is now handled by by the Configmax_pool_connectionsproperty (note default is 10)0.1.1 (2017-01-16)botocore updated to version 1.5.00.1.0 (2017-01-12)Pass timeout to aiohttp.request to enforce read_timeout #86 (thanks @vharitonsky) (bumped up to next semantic version due to read_timeout enabling change)0.0.6 (2016-11-19)Added enforcement of plain response #57 (thanks @rymir)botocore updated to version 1.4.73 #74 (thanks @vas3k)0.0.5 (2016-06-01)Initial alpha release
aiobotocore-chrisglass
Async client for amazon services usingbotocoreandaiohttp/asyncio.This library is a mostly full featured asynchronous version of botocore.Install$ pip install aiobotocoreBasic Exampleimportasynciofromaiobotocore.sessionimportget_sessionAWS_ACCESS_KEY_ID="xxx"AWS_SECRET_ACCESS_KEY="xxx"asyncdefgo():bucket='dataintake'filename='dummy.bin'folder='aiobotocore'key='{}/{}'.format(folder,filename)session=get_session()asyncwithsession.create_client('s3',region_name='us-west-2',aws_secret_access_key=AWS_SECRET_ACCESS_KEY,aws_access_key_id=AWS_ACCESS_KEY_ID)asclient:# upload object to amazon s3data=b'\x01'*1024resp=awaitclient.put_object(Bucket=bucket,Key=key,Body=data)print(resp)# getting s3 object properties of file we just uploadedresp=awaitclient.get_object_acl(Bucket=bucket,Key=key)print(resp)# get object from s3response=awaitclient.get_object(Bucket=bucket,Key=key)# this will ensure the connection is correctly re-used/closedasyncwithresponse['Body']asstream:assertawaitstream.read()==data# list s3 objects using paginatorpaginator=client.get_paginator('list_objects')asyncforresultinpaginator.paginate(Bucket=bucket,Prefix=folder):forcinresult.get('Contents',[]):print(c)# delete object from s3resp=awaitclient.delete_object(Bucket=bucket,Key=key)print(resp)loop=asyncio.get_event_loop()loop.run_until_complete(go())Context Manager ExamplesfromcontextlibimportAsyncExitStackfromaiobotocore.sessionimportAioSession# How to use in existing context managerclassManager:def__init__(self):self._exit_stack=AsyncExitStack()self._s3_client=Noneasyncdef__aenter__(self):session=AioSession()self._s3_client=awaitself._exit_stack.enter_async_context(session.create_client('s3'))asyncdef__aexit__(self,exc_type,exc_val,exc_tb):awaitself._exit_stack.__aexit__(exc_type,exc_val,exc_tb)# How to use with an external exit_stackasyncdefcreate_s3_client(session:AioSession,exit_stack:AsyncExitStack):# Create client and add cleanupclient=awaitexit_stack.enter_async_context(session.create_client('s3'))returnclientasyncdefnon_manager_example():session=AioSession()asyncwithAsyncExitStack()asexit_stack:s3_client=awaitcreate_s3_client(session,exit_stack)# do work with s3_clientSupported AWS ServicesThis is a non-exuastive list of what tests aiobotocore runs against AWS services. Not all methods are tested but we aim to test the majority of commonly used methods.ServiceStatusS3WorkingDynamoDBBasic methods testedSNSBasic methods testedSQSBasic methods testedCloudFormationStack creation testedKinesisBasic methods testedDue to the way boto3 is implemented, its highly likely that even if services are not listed above that you can take anyboto3.client(‘service’)and stickawaitinfront of methods to make them async, e.g.await client.list_named_queries()would asynchronous list all of the named Athena queries.If a service is not listed here and you could do with some tests or examples feel free to raise an issue.Run TestsMake sure you have development requirements installed and your amazon key and secret accessible via environment variables:$ cd aiobotocore $ export AWS_ACCESS_KEY_ID=xxx $ export AWS_SECRET_ACCESS_KEY=xxx $ pipenv sync --devExecute tests suite:$ py.test -v testsMailing Listhttps://groups.google.com/forum/#!forum/aio-libsRequirementsPython3.6+aiohttpbotocoreawscliawscli depends on a single version of botocore, however aiobotocore only supports a specific range of botocore versions. 
To ensure you install the latest version of awscli that your specific combination or aiobotocore and botocore can support use:pip install -U aiobotocore[awscli]Changes1.4.1 (2021-08-24)put backwards incompatible changes behindAIOBOTOCORE_DEPRECATED_1_4_0_APISenv var. This means that#876will not work unless this env var has been set to 0.1.4.0 (2021-08-20)fix retries via config#877remove AioSession and get_session top level names to matchbotocorechange exceptions raised to match those ofbotocore, seemappings1.3.3 (2021-07-12)fix AioJSONParser#8721.3.2 (2021-07-07)Bump tobotocoreto1.20.1061.3.1 (2021-06-11)TCPConnector: change deprecated ssl_context to sslfix non awaited generate presigned url calls#8681.3.0 (2021-04-09)Bump tobotocoreto1.20.49#8561.2.2 (2021-03-11)Await call to async method _load_creds_via_assume_role#858(thanks@puzza007)1.2.1 (2021-02-10)verify strings are now correctly passed to aiohttp.TCPConnector#851(thanks@FHTMitchell)1.2.0 (2021-01-11)bump botocore to1.19.52use passed in http_session_cls param to create_client#7971.1.2 (2020-10-07)fix AioPageIterator search method #831 (thanks@joseph-jones)1.1.1 (2020-08-31)fix s3 region redirect bug #8251.1.0 (2020-08-18)bump botocore to 1.17.441.0.7 (2020-06-04)fix generate_db_auth_token via #8161.0.6 (2020-06-04)revert __getattr__ fix as it breaks ddtrace1.0.5 (2020-06-03)Fixed AioSession.get_service_data emit call #811 via #812Fixed async __getattr__ #789 via #8031.0.4 (2020-04-15)Fixed S3 Presigned Post not being async1.0.3 (2020-04-09)Fixes typo when using credential process1.0.2 (2020-04-05)Disable Client.__getattr__ emit for now #7891.0.1 (2020-04-01)Fixed signing requests with explicit credentials1.0.0 (2020-03-31)API breaking: The result of create_client is now a required async context classCredential refresh should now workgenerate_presigned_url is now an async call along with other credential methodsCredentials.[access_key/secret_key/token] now raise NotImplementedError because they won’t call refresh like botocore. 
Instead should use get_frozen_credentials async methodBump botocore and extras0.12.0 (2020-02-23)Bump botocore and extrasDrop support for 3.5 given we are unable to test it with moto and it will soon be unsupportedRemove loop parameters for Python 3.8 complianceRemove deprecated AioPageIterator.next_page0.11.1 (2020-01-03)Fixed event streaming API calls like S3 Select.0.11.0 (2019-11-12)replace CaseInsensitiveDict with urllib3 equivalent #744 (thanks to inspiration from @craigmccarter and @kevchentw)bump botocore to 1.13.14fix for mismatched botocore method replacements0.10.4 (2019-10-24)Make AioBaseClient.close method async #724 (thanks @bsitruk)Bump awscli, boto3, botocore #735 (thanks @bbrendon)switch paginator to async_generator, add result_key_iters (deprecate next_page method)0.10.3 (2019-07-17)Bump botocore and extras0.10.2 (2019-02-11)Fix response-received emitted event #6820.10.1 (2019-02-08)Make tests pass with pytest 4.1 #669 (thanks @yan12125)Support Python 3.7 #671 (thanks to @yan12125)Update RTD build config #672 (thanks @willingc)Bump to botocore 1.12.91 #6790.10.0 (2018-12-09)Update to botocore 1.12.49 #639 (thanks @terrycain)0.9.4 (2018-08-08)Add ClientPayloadError as retryable exception0.9.3 (2018-07-16)Bring botocore up to date0.9.2 (2018-05-05)bump aiohttp requirement to fix read timeouts0.9.1 (2018-05-04)fix timeout bug introduced in last release0.9.0 (2018-06-01)bump aiohttp to 3.3.xremove unneeded set_socket_timeout0.8.0 (2018-05-07)Fix pagination #573 (thanks @adamrothman)Enabled several s3 tests via motoBring botocore up to date0.7.0 (2018-05-01)Just version bump0.6.1a0 (2018-05-01)bump to aiohttp 3.1.xswitch tests to Python 3.5+switch to native coroutinesfix non-streaming body timeout retries0.6.0 (2018-03-04)Upgrade to aiohttp>=3.0.0 #536 (thanks @Gr1N)0.5.3 (2018-02-23)Fixed waiters #523 (thanks @dalazx)fix conn_timeout #4850.5.2 (2017-12-06)Updated awscli dependency #4610.5.1 (2017-11-10)Disabled compressed response #4300.5.0 (2017-11-10)Fix error botocore error checking #190Update supported botocore requirement to: >=1.7.28, <=1.7.40Bump aiohttp requirement to support compressed responses correctly #2980.4.5 (2017-09-05)Added SQS examples and tests #336Changed requirements.txt structure #336bump to botocore 1.7.4Added DynamoDB examples and tests #3400.4.4 (2017-08-16)add the supported versions of boto3 to extras require #3240.4.3 (2017-07-05)add the supported versions of awscli to extras require #273 (thanks @graingert)0.4.2 (2017-07-03)update supported aiohttp requirement to: >=2.0.4, <=2.3.0update supported botocore requirement to: >=1.5.71, <=1.5.780.4.1 (2017-06-27)fix redirects #2680.4.0 (2017-06-19)update botocore requirement to: botocore>=1.5.34, <=1.5.70fix read_timeout due to #245implement set_socket_timeout0.3.3 (2017-05-22)switch to PEP 440 version parser to support ‘dev’ versions0.3.2 (2017-05-22)Fix botocore integrationProvisional fix for aiohttp 2.x stream supportupdate botocore requirement to: botocore>=1.5.34, <=1.5.520.3.1 (2017-04-18)Fixed Waiter support0.3.0 (2017-04-01)Added support for aiohttp>=2.0.4 (thanks @achimnol)update botocore requirement to: botocore>=1.5.0, <=1.5.330.2.3 (2017-03-22)update botocore requirement to: botocore>=1.5.0, <1.5.290.2.2 (2017-03-07)set aiobotocore.__all__ for * imports #121 (thanks @graingert)fix ETag in head_object response #1320.2.1 (2017-02-01)Normalize headers and handle redirection by botocore #115 (thanks @Fedorof)0.2.0 (2017-01-30)add support for proxies (thanks @jjonek)remove AioConfig verify_ssl 
connector_arg as this is handled by the create_client verify paramremove AioConfig limit connector_arg as this is now handled by by the Configmax_pool_connectionsproperty (note default is 10)0.1.1 (2017-01-16)botocore updated to version 1.5.00.1.0 (2017-01-12)Pass timeout to aiohttp.request to enforce read_timeout #86 (thanks @vharitonsky) (bumped up to next semantic version due to read_timeout enabling change)0.0.6 (2016-11-19)Added enforcement of plain response #57 (thanks @rymir)botocore updated to version 1.4.73 #74 (thanks @vas3k)0.0.5 (2016-06-01)Initial alpha release
aiobotocore-mirror
Async client for amazon services usingbotocoreandaiohttp/asyncio.Main purpose of this library to support amazon s3 api, but other services should work (but may be with minor fixes). For now we have tested only upload/download api for s3. More tests coming soon.Install$ pip install -e git+https://github.com/aio-libs/aiobotocore.git@master#egg=aiobotocoreBasic ExampleimportasyncioimportaiobotocoreAWS_ACCESS_KEY_ID="xxx"AWS_SECRET_ACCESS_KEY="xxx"@asyncio.coroutinedefgo(loop):bucket='dataintake'filename='dummy.bin'folder='aiobotocore'key='{}/{}'.format(folder,filename)session=aiobotocore.get_session(loop=loop)client=session.create_client('s3',region_name='us-west-2',aws_secret_access_key=AWS_SECRET_ACCESS_KEY,aws_access_key_id=AWS_ACCESS_KEY_ID)# upload object to amazon s3data=b'\x01'*1024resp=yield fromclient.put_object(Bucket=bucket,Key=key,Body=data)print(resp)# getting s3 object properties of file we just uploadedresp=yield fromclient.get_object_acl(Bucket=bucket,Key=key)print(resp)# delete object from s3resp=yield fromclient.delete_object(Bucket=bucket,Key=key)print(resp)loop=asyncio.get_event_loop()loop.run_until_complete(go(loop))Run TestsMake sure you have development requirements installed and your amazon key and secret accessible via environment variables:$ cd aiobotocore $ export AWS_ACCESS_KEY_ID=xxx $ export AWS_SECRET_ACCESS_KEY=xxx $ pip install -Ur requirements-dev.txtExecute tests suite:$ py.test -v testsRequirementsPython3.3+asyncioorPython3.4+aiohttpbotocoreChanges0.0.1 (xxxx-xx-xx)Initial release
aiobotocore-refreshable-credentials
aiobotocore-refreshable-credentials

Implements an aiobotocore.Session subclass for using aiobotocore with expiring credentials (IAM STS).

Usage

    import aiobotocore_refreshable_credentials

    session = aiobotocore_refreshable_credentials.get_session()
    async with session.create_client('rekognition') as client:
        ...

Python Versions Supported

3.8+
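Expanding the usage snippet above into a runnable sketch: the client returned by create_client behaves like a regular awaitable botocore client, so an ordinary service call such as Rekognition's detect_labels can simply be awaited. The bucket and object names below are placeholders, and detect_labels is only an illustrative choice, not something specific to this package.

    import asyncio

    import aiobotocore_refreshable_credentials

    async def main():
        # Credentials are refreshed automatically when the STS-issued ones expire.
        session = aiobotocore_refreshable_credentials.get_session()
        async with session.create_client('rekognition') as client:
            # Placeholder bucket/object; any service call can be awaited the same way.
            resp = await client.detect_labels(
                Image={'S3Object': {'Bucket': 'my-bucket', 'Name': 'photo.jpg'}},
                MaxLabels=5,
            )
            for label in resp['Labels']:
                print(label['Name'], label['Confidence'])

    asyncio.run(main())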
aiobotocore-stubs
No description available on PyPI.
aiobotstat
aiobotstat

Mini library for the BotStat API.

Current API version: 0.1

Install

Git:

    pip install git+https://github.com/viuipan/aiobotstat.git

PyPi:

    pip install aiobotstat

Example:

    import asyncio

    from aiobotstat import BotStatAPI

    BOT_TOKEN = ''
    FILE_PATH = ''
    BOT_USERNAME = ''

    api = BotStatAPI(BOT_TOKEN)

    async def main():
        bot_info = await api.get_bot_info(username=BOT_USERNAME)
        print(bot_info)

        task = await api.create_task(file=FILE_PATH)
        print(task)

        status = await api.get_task_status(task_id=task.id)
        print(status)

        cancel_result = await api.cancel_task(task_id=task.id)
        print(f'Cancel result is {cancel_result}')

    asyncio.run(main())
aiobottle
UNKNOWN
aiobp
Boilerplate for asyncio serviceThis module provides boilerplate for microservices written in asyncio:Runner with task reference handler and graceful shutdownConfiguration providerLogger with color supportimportasynciofromaiobpimportrunnerasyncdefmain():try:awaitasyncio.sleep(60)exceptasyncio.CancelledError:print('Saving data...')runner(main())More complex exampleimportasyncioimportaiohttp# just for exampleimportsysfromaiobpimportcreate_task,on_shutdown,runnerfromaiobp.configimportInvalidConfigFile,sys_argv_or_filenamesfromaiobp.config.confimportloaderfromaiobp.loggingimportLoggingConfig,add_devel_log_level,log,setup_loggingclassWorkerConfig:"""Your microservice worker configuration"""sleep:int=5classConfig:"""Put configurations together"""worker:WorkerConfiglog:LoggingConfigasyncdefworker(config:WorkerConfig,client_session:aiohttp.ClientSession)->int:"""Perform service work"""attempts=0try:asyncwithclient_session.get('http://python.org')asresp:assertresp.status==200log.debug('Page length%d',len(awaitresp.text()))attempts+=1awaitasyncio.sleep(config.sleep)exceptasyncio.CancelledError:log.info('Doing some shutdown work')awaitclient_session.post('http://localhost/service/attempts',data={'attempts':attempts})returnattemptsasyncdefservice(config:Config):"""Your microservice"""client_session=aiohttp.ClientSession()on_shutdown(client_session.close,after_tasks_cancel=True)create_task(worker(config.worker,client_session),'PythonFetcher')# you can do some monitoring, statistics collection, etc.# or just let the method finish and the runner will wait for Ctrl+C or killdefmain():"""Example microservice"""add_devel_log_level()try:config_filename=sys_argv_or_filenames('service.local.conf','service.conf')config=loader(Config,config_filename)exceptInvalidConfigFileaserror:print(f'Invalid configuration:{error}')sys.exit(1)setup_logging(config.log)log.info("Using config file:%s",config_filename)runner(service(config))if__name__=='__main__':main()
aiobravado
Aiobravado

About

Aiobravado is the asyncio version of the bravado library for use with the OpenAPI Specification (previously known as Swagger). aiobravado requires Python 3.5+ and allows you to use asynchronous programming when interacting with OpenAPI-enabled services. Here's the breakdown of bravado packages and their use case:

- bravado - Library to dynamically interact with OpenAPI/Swagger-enabled services. Supports Python 2.7+.
- fido - HTTP client to enable asynchronous network requests for bravado. Supports Python 2.7+. Depends on twisted. Spins up a separate thread to handle network requests.
- bravado-asyncio - asyncio-powered asynchronous HTTP client for bravado. Requires Python 3.5+. It is the default HTTP client for aiobravado, but can be used with bravado as well.
- aiobravado - asyncio-enabled library to dynamically interact with OpenAPI/Swagger-enabled services. Supports basically all of the features of bravado. Requires Python 3.5+. No additional threads are created.

Example Usage

    from aiobravado.client import SwaggerClient

    client = await SwaggerClient.from_url('http://petstore.swagger.io/v2/swagger.json')
    pet = await client.pet.getPetById(petId=42).result(timeout=5)

Documentation

More documentation is available at http://aiobravado.readthedocs.org

Installation

    # To install aiobravado
    $ pip install aiobravado

    # To install aiobravado with optional packages recommended by aiohttp
    $ pip install aiobravado[aiohttp_extras]

Development

Code is documented using Sphinx. virtualenv is recommended to keep dependencies and libraries isolated.

Setup

    # Run tests
    tox

    # Install git pre-commit hooks
    tox -e pre-commit install

Contributing

1. Fork it (http://github.com/sjaensch/aiobravado/fork)
2. Create your feature branch (git checkout -b my-new-feature)
3. Add your modifications
4. Add a short summary of your modifications to changelog.rst under Upcoming release. Add that entry at the top of the file if it's not there yet.
5. Commit your changes (git commit -m "Add some feature")
6. Push to the branch (git push origin my-new-feature)
7. Create a new Pull Request

License

Copyright (c) 2013, Digium, Inc. All rights reserved.
Copyright (c) 2014-2015, Yelp, Inc. All rights reserved.

Aiobravado is licensed with a BSD 3-Clause License.
aiobrawlstats
aiobrawlstats - Easy async library for working with the BrawlStars API

What you need to use the library:
- PyCharm
- BrawlStars Token

Installation:

    pip install -U aiobrawlstats
    pip install -U https://github.com/vladislavkovalskyi/aiobrawlstats/archive/master.zip
aiobreaker
aiobreaker

aiobreaker is a Python implementation of the Circuit Breaker pattern, described in Michael T. Nygard's book `Release It!`_.

Circuit breakers exist to allow one subsystem to fail without destroying the entire system. This is done by wrapping dangerous operations (typically integration points) with a component that can circumvent calls when the system is not healthy.

This project is a fork of pybreaker_ by Daniel Fernandes Martins that replaces tornado with native asyncio, originally so I could practice packaging and learn about that shiny new ``typing`` package.

.. _Release It!: https://pragprog.com/titles/mnee2/release-it-second-edition/
.. _pybreaker: https://github.com/danielfm/pybreaker

Features

- Configurable list of excluded exceptions (e.g. business exceptions)
- Configurable failure threshold and reset timeout
- Support for several event listeners per circuit breaker
- Can guard generator functions
- Functions and properties for easy monitoring and management
- asyncio support
- Optional redis backing
- Synchronous and asynchronous event listeners

Requirements

All you need is Python 3.6 or higher.

Installation

To install, simply download from pypi:

.. code:: bash

    pip install aiobreaker

Usage

The first step is to create an instance of CircuitBreaker for each integration point you want to protect against.

.. code:: python

    from datetime import timedelta

    from aiobreaker import CircuitBreaker

    # Used in database integration points
    db_breaker = CircuitBreaker(fail_max=5, reset_timeout=timedelta(seconds=60))

    @db_breaker
    async def outside_integration():
        """Hits the api"""
        ...

At that point, go ahead and get familiar with the documentation.
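To see the breaker in action, here is a minimal sketch (assuming only the constructor arguments shown above) that drives a deliberately failing coroutine until the circuit opens. The exact exception raised once the circuit is open is aiobreaker's own error type, so the example simply catches Exception and prints what it got.

.. code:: python

    import asyncio
    from datetime import timedelta

    from aiobreaker import CircuitBreaker

    breaker = CircuitBreaker(fail_max=3, reset_timeout=timedelta(seconds=30))

    @breaker
    async def flaky_call():
        # Stand-in for an unreliable integration point.
        raise RuntimeError("backend unavailable")

    async def main():
        for attempt in range(5):
            try:
                await flaky_call()
            except Exception as exc:
                # The first fail_max attempts surface the underlying RuntimeError;
                # after that the breaker is open and calls fail fast.
                print(f"attempt {attempt}: {type(exc).__name__}: {exc}")

    asyncio.run(main())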
aiobroadlink
aiobroadlink

Library to control various Broadlink devices using asyncio.

This software is based on the protocol description from Ipsum Domus (?). Details at https://blog.ipsumdomus.com/broadlink-smart-home-devices-complete-protocol-hack-bc0b4b397af1

This software is based on python-broadlink by Matthew Garrett. Details at https://github.com/mjg59/python-broadlink

Remote Control devices seem to be working alright (both IR and RF). RM4C are now supported. RM4 PRO are also supported. A1 devices also work. Provisioning works. Others will be tested when I get the relevant hardware.

Install with pip3. Be forewarned that aiobroadlink needs the 'cryptography' library. This library will be installed automatically, but for this to succeed you need to be able to compile things; to that effect you need a compiler and some header files. On Debian/Ubuntu distributions, this means you need the packages 'libffi-dev' and 'libssl-dev'.

You can run aiobroadlink or python3 -m aiobroadlink. If your IP address cannot be guessed, do aiobroadlink -i xxx.xxx.xxx.xxx, with xxx.xxx.xxx.xxx the IP address of the interface you want to use.

When learning commands, they will be saved in the file ~/.aiobroadlink.
aiobroker
No description available on PyPI.
aiobrultech-serial
What is aiobrultech-serial?

This library talks to devices from Brultech Research over their serial port, using siobrultech-protocols to decode the data.

Installation

    pip install aiobrultech-serial

Usage

    from aiobrultech_serial import connect

    async with connect("/dev/ttyUSB0") as connection:
        async for packet in connection.packets():
            print(f"{packet}")

Look at scripts/dump.py for a fuller example.

API Calls

This library also supports getting and setting information on the attached device. It supports all of the API calls available in siobrultech-protocols.

Development

Setup

    python3.11 -m venv .venv
    source .venv/bin/activate
    # Install Requirements
    pip install -r requirements.txt
    # Install Dev Requirements
    pip install -r requirements-dev.txt
    # One-Time Install of Commit Hooks
    pre-commit install

Testing

Tests are run with pytest.
aiobrute
AiobruteAiobrute is a tool for asynchronously testing password login on several protocols. It use the asyncio librairie instead of threads for testing password concurrently and efficiently.DISCLAIMER: This software is for educational purposes only. This software should not be used for illegal activity.The following modules are currently supported* http : test login for http protocol * ftp : test login for ftp protocol * ssh : test login for ssh protocol * mysql : test login for mysql protocolSome modules support multiple protocolModuleProtocolDescriptionhttphttp-formTesting html form authenticationhttpbasic-authTesting http basic authenticationhttpwp-xmlrpcTesting wordpress xml-rpc authenticationSome wordlists are also includedNameSizeDescriptionrockyou59187Shorter version of the popular rockyou wordlisthotmail8929Some Passwords from an old hotmail leakmyspace37120Some Passwords from an old myspace leakadobe90Some Passwords from an old adobe leakmostused200Most commonly used passwordsInstallation & UsageRun aiobrute with dockerdocker run -it --name aiobrute --rm blackice22/aiobrute <MODULE> <OPTIONS>Install aiobrute with pippip install aiobruteOutput ExamplesWhen no verbosity option are specified, a progress bar is displayed to the user with some statistics.aiobrute http -t http://localhost:8080/wp-login.php -u admin -m POST -p http-form -c 302 -f USER:log PASS:pwd ░█████╗░██╗░█████╗░██████╗░██████╗░██╗░░░██╗████████╗███████╗ ██╔══██╗██║██╔══██╗██╔══██╗██╔══██╗██║░░░██║╚══██╔══╝██╔════╝ ███████║██║██║░░██║██████╦╝██████╔╝██║░░░██║░░░██║░░░█████╗░░ ██╔══██║██║██║░░██║██╔══██╗██╔══██╗██║░░░██║░░░██║░░░██╔══╝░░ ██║░░██║██║╚█████╔╝██████╦╝██║░░██║╚██████╔╝░░░██║░░░███████╗ ╚═╝░░╚═╝╚═╝░╚════╝░╚═════╝░╚═╝░░╚═╝░╚═════╝░░░░╚═╝░░░╚══════╝ https://github.com/jylanglois/aiobrute version: [0.1.0 - alpha] [-] Loading data from the 'rockyou' build in wordlist Worker Type: http | Target: http://localhost:8080/wp-login.php | Workers: 15 | Wordlist Size: 59188 |█████████▏ | ▅▃▁ 13455/59188 [23%] in 18s (730.3/s, eta: 1:03)if verbosity options are specified, the status for each requests are printed in the console.2022-04-15 11:16:20,925 - [HTTP] [INFO] - method: [POST] - status: [200] - target: http://localhost:8080/wp-login.php - username: admin - password: 1234567 - (6 of 59188) - [worker 6] 2022-04-15 11:16:20,926 - [HTTP] [INFO] - method: [POST] - status: [200] - target: http://localhost:8080/wp-login.php - username: admin - password: daniel - (10 of 59188) - [worker 10] 2022-04-15 11:16:20,927 - [HTTP] [INFO] - method: [POST] - status: [200] - target: http://localhost:8080/wp-login.php - username: admin - password: 123456789 - (3 of 59188) - [worker 3] 2022-04-15 11:16:20,928 - [HTTP] [INFO] - method: [POST] - status: [200] - target: http://localhost:8080/wp-login.php - username: admin - password: abc123 - (8 of 59188) - [worker 8] 2022-04-15 11:16:20,928 - [HTTP] [INFO] - method: [POST] - status: [200] - target: http://localhost:8080/wp-login.php - username: admin - password: 12345 - (2 of 59188) - [worker 2] 2022-04-15 11:16:20,929 - [HTTP] [INFO] - method: [POST] - status: [200] - target: http://localhost:8080/wp-login.php - username: admin - password: nicole - (9 of 59188) - [worker 9] 2022-04-15 11:16:20,929 - [HTTP] [INFO] - method: [POST] - status: [200] - target: http://localhost:8080/wp-login.php - username: admin - password: 123456 - (1 of 59188) - [worker 1] 2022-04-15 11:16:20,930 - [HTTP] [INFO] - method: [POST] - status: [200] - target: http://localhost:8080/wp-login.php - username: admin 
- password: iloveyou - (4 of 59188) - [worker 4] 2022-04-15 11:16:20,930 - [HTTP] [INFO] - method: [POST] - status: [200] - target: http://localhost:8080/wp-login.php - username: admin - password: 12345678 - (7 of 59188) - [worker 7] 2022-04-15 11:16:20,931 - [HTTP] [INFO] - method: [POST] - status: [200] - target: http://localhost:8080/wp-login.php - username: admin - password: princess - (5 of 59188) - [worker 5]Usage ExamplesHTTP modules examplesTest http html login form and validate the candidate if302status code is returnedaiobrute http -t http://localhost:8080/wp-login.php -u admin -m POST -p http-form -c 302 -f USER:log PASS:pwdTest http html login with a csrf token and validate the candidate if302status code is returnedaiobrute http -t http://localhost:8080/admin/login/ -u root -m POST -p http-form -c 302 -f USER:user PASS:pwd CSRF:csrftokenTest http login with basic authentication and validate the candidate if401status code is not returnedaiobrute http -t http://localhost:8080/ -u admin -m GET -p basic-auth -c ^401Test wordpress xml-rpc login and validate the candidate if thefaultCodestring is not found in the responseaiobrute http -t http://localhost:8080/xmlrpc.php -u admin -m POST -p wp-xmlrpc -s '^faultCode'Other modules examplesTest ssh login with 5 concurrent worker and using themostusedbuilt-in wordlistaiobrute ssh -u admin -t localhost -w 5 -l mostusedContributingPull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.LicenseGNU GPLv3
aiobservable
AIObservable

A simple and efficient implementation of the observable pattern.

Introduction

What sets it apart is that it doesn't represent events as a combination of a name and arguments, but instead operates on classes.

Instead of using names like "on_connect", the library encourages the use of a "ConnectEvent" class which has the arguments as its attributes. Instead of listening to a meaningless name, observers use the event type (the class). When emitting an event we then use instances of the event class.

Apart from other benefits this especially helps with typing and eliminates the issue of having to know the function signature for each event, as the only argument is the event instance.

Using the built-in dataclasses module makes it easy to avoid writing boilerplate code for each event.

Example

    import asyncio
    import dataclasses

    import aiobservable

    @dataclasses.dataclass()
    class ConnectEvent:
        user_id: int
        user_name: str

    async def main():
        observable = aiobservable.Observable()

        def on_connect(event: ConnectEvent) -> None:
            print(f"{event.user_name} connected!")

        observable.on(ConnectEvent, on_connect)

        event = ConnectEvent(1, "Simon")
        # emit returns a future which resolves to None when all observers
        # are done handling the event
        await observable.emit(event)

    asyncio.run(main())

Installing

You can install the library from PyPI:

    pip install aiobservable
aiobsonrpc
aiobsonrpc

bsonrpc for asyncio. Python 3.5+.

Getting Started

Installing

    pip install aiobsonrpc

Example

Server

    import asyncio

    import aiobsonrpc

    @aiobsonrpc.service_class
    class EchoService(object):

        @aiobsonrpc.aio_rpc_request
        async def echo(self, _, data):
            await asyncio.sleep(1)
            return data

    async def on_connected(reader, writer):
        aiobsonrpc.JSONRpc(reader, writer, services=EchoService())

    if __name__ == '__main__':
        loop = asyncio.get_event_loop()
        server = asyncio.start_server(on_connected, '0.0.0.0', 6789, loop=loop)
        loop.create_task(server)
        loop.run_forever()

Client

    import asyncio

    import aiobsonrpc

    async def do_connect():
        reader, writer = await asyncio.open_connection('localhost', 6789, loop=loop)
        rpc = aiobsonrpc.JSONRpc(reader, writer)
        peer = rpc.get_peer_proxy(timeout=5)
        result = await peer.echo(123)
        print(result)  # 123

    if __name__ == '__main__':
        loop = asyncio.get_event_loop()
        loop.run_until_complete(do_connect())
aiobtclientapi
aiobtclientapi is an asynchronous API for communicating with BitTorrent clients. It is a high-level wrapper around aiobtclientrpc. All client-specific details are abstracted away as much as possible, but the individual quirks of each client can still be accessed if necessary.

Documentation: https://aiobtclientapi.readthedocs.io/
Repository: https://codeberg.org/plotski/aiobtclientapi
Package: https://pypi.org/project/aiobtclientapi/

License

GPLv3+
aiobtclientrpc
aiobtclientrpc provides low-level access to the RPC protocols of BitTorrent clients. It is supposed to be the basis for a high-level library. If you want to use this directly, you need to read the documentation or source code of each client.

Check out aiobtclientapi for a high-level API.

Documentation: https://aiobtclientrpc.readthedocs.io/
Repository: https://codeberg.org/plotski/aiobtclientrpc
Package: https://pypi.org/project/aiobtclientrpc/

Features

- Tunnel the client connection through a proxy (SOCKS5, SOCKS4, HTTP tunnel)
- Event handlers, e.g. when a torrent was added (Deluge only)
- (Re-)Connect automatically on any RPC method call and disconnect automatically when used as a context manager
- Keep track of the connection status and provide changes to a callback

Supported BitTorrent Clients

- Deluge
- qBittorrent
- Transmission (daemon)
- rTorrent

License

GPLv3+
aiobtcrpc
No description available on PyPI.
aio-btdht
Asyncio Bittorrent DHT ServerSimply async distributed hash table for Bittorrent network.Based onaio-KRPCandaio-UDP.Class hierarchy:UDPServer — low level async [U]ser [D]atagram [P]rotocol socket server;KRPCServer — async [K]ademlia [R]emote [P]rocedure [C]all protocol provider;DHT — implementation of Bittorrent [D]istributed [H]ash [T]able;MethodsDHT.__init__Arguments:local_id(int) — local node identifier in range [0..2^160).Optional arguments:upload_speed(int, default0) — outgoing traffic throttler;download_speed(int, default0) — incoming traffic throttler;recv_max_size(int, default256 * 1024) — socket reading buffer size.runLaunch servering.Arguments:host(str) — local host address, e.g.0.0.0.0;port(int) — local port for outgoing and incoming UDP traffic;loop(object, defaultNone) — asyncio eventloop, will create whileNoneand run forever.Result:None.asyncbootstrapMethod for initializing routing table.Arguments:initial_peers(list, example:[(ip: str, port: int), ...]) — list of router for initialize routing table;Result:None.asyncannounceAnnounce peer for specifiedinfo_hash.Arguemtns:info_hash(20 bytes) — binary form ofinfo_hash;port(int, defaultNone) — information for announce, value is DHT-server listen port whenNone.Result:None.async__getitem__Get peers for torrent byinfo_hash.Arguemtns:info_hash(20 bytes) — binary form ofinfo_hash;Result:Set((host: str, port: int), ...).ExampleimportasynciofromaiobtdhtimportDHTfromaioudpimportUDPServerasyncdefmain(loop):initial_nodes=[("67.215.246.10",6881),# router.bittorrent.com("87.98.162.88",6881),# dht.transmissionbt.com("82.221.103.244",6881)# router.utorrent.com]udp=UDPServer()udp.run("0.0.0.0",12346,loop=loop)dht=DHT(int("0x54A10C9B159FC0FBBF6A39029BCEF406904019E0",16),server=udp,loop=loop)print("bootstrap")awaitdht.bootstrap(initial_nodes)print("bootstrap done")print("search peers for Linux Mint torrent (8df9e68813c4232db0506c897ae4c210daa98250)")peers=awaitdht[bytes.fromhex("8df9e68813c4232db0506c897ae4c210daa98250")]print("peers:",peers)print("peer search for ECB3E22E1DC0AA078B48B7323AEBBA827AD9BD80")peers=awaitdht[bytes.fromhex("ECB3E22E1DC0AA078B48B7323AEBBA827AD9BD80")]print("peers:",peers)print("announce with port `2357`")awaitdht.announce(bytes.fromhex("ECB3E22E1DC0AA078B48B7323AEBBA827AD9BD80"),2357)print("announce done")print("search our own ip")peers=awaitdht[bytes.fromhex("ECB3E22E1DC0AA078B48B7323AEBBA827AD9BD80")]print("peers:",peers)if__name__=='__main__':loop=asyncio.get_event_loop()loop.run_until_complete(main(loop))loop.run_forever()Output:bootstrap bootstrap done search peers for Linux Mint torrent (8df9e68813c4232db0506c897ae4c210daa98250) peers: {('146.120.244.10', 1078), ('162.244.138.182', 42000), ('79.226.71.128', 16881), ('24.236.49.9', 41785), ('91.138.144.37', 1487), ('68.44.225.100', 8967), ('80.44.205.104', 6881), ('212.32.229.66', 1160), ('108.204.244.160', 51413), ('66.130.106.74', 49160), ('90.128.48.99', 1348), ('188.232.107.50', 61025), ('173.44.55.179', 8589), ('46.242.8.31', 51413), ('181.97.186.68', 51413), ('68.114.220.251', 49237), ('172.98.67.53', 31545), ('94.212.149.191', 16881), ('188.163.46.182', 31217), ('185.148.3.42', 58489), ('37.24.80.202', 6881), ('86.159.123.195', 51413), ('88.202.177.238', 41280), ('87.79.216.207', 50000), ('73.24.30.122', 35727), ('212.32.229.66', 1842), ('97.103.206.29', 53314), ('58.6.89.141', 1157), ('78.92.112.15', 60745), ('124.176.208.220', 51413), ('68.44.225.100', 5051), ('188.6.231.166', 51413), ('58.6.89.141', 1099), ('99.28.209.57', 29694), ('108.172.92.81', 55555), 
('46.107.192.126', 8999), ('212.232.79.196', 27886), ('24.236.49.9', 14100), ('84.28.172.201', 51413), ('188.195.3.8', 51413), ('92.110.18.80', 8999), ('91.153.229.129', 51413), ('5.196.71.175', 51413), ('185.21.217.21', 56031), ('93.77.142.76', 63193), ('145.130.138.96', 51413), ('73.14.170.195', 51413), ('79.121.6.189', 6318), ('78.142.23.65', 57171), ('101.112.156.11', 51413), ('195.242.156.241', 1194), ('92.247.152.29', 54217), ('81.171.9.199', 50781), ('75.108.45.243', 51413), ('172.98.67.53', 1842), ('68.44.225.100', 53849), ('81.171.9.199', 23542), ('37.214.48.240', 51413), ('72.177.3.209', 51413), ('93.203.236.159', 51413), ('95.154.19.186', 6881), ('50.39.164.144', 16881), ('88.223.24.14', 51413), ('213.188.38.130', 8999), ('85.3.104.53', 51413), ('173.69.6.72', 51413), ('46.166.190.136', 46825), ('45.231.194.211', 41481), ('77.50.127.80', 41446), ('5.39.88.54', 49420), ('78.42.236.30', 16881), ('5.227.15.139', 42068), ('31.17.12.154', 51413), ('41.188.115.45', 56763), ('46.246.123.24', 39574), ('144.172.126.68', 65516), ('72.14.181.193', 62904), ('90.128.48.99', 1347), ('90.92.181.147', 16881), ('184.175.133.66', 8999), ('185.45.195.161', 20064), ('188.243.178.111', 51413), ('139.47.117.179', 49967), ('88.138.252.41', 57837), ('91.237.164.182', 57310), ('199.204.160.119', 51413), ('50.39.198.212', 8999), ('183.89.65.240', 44808), ('90.154.5.75', 51413), ('68.44.225.100', 58836), ('89.165.156.40', 32435), ('23.227.192.103', 51413), ('217.10.117.5', 55179), ('83.149.46.57', 51413), ('217.120.102.36', 22449), ('27.255.16.95', 40001), ('68.44.225.100', 11790), ('98.142.213.39', 7401), ('38.141.32.249', 49222), ('71.38.191.134', 51413), ('68.44.225.100', 15234), ('24.236.49.9', 42805), ('74.205.141.130', 8999), ('46.158.232.118', 63782), ('66.78.249.53', 42065), ('78.239.83.41', 50505), ('89.134.45.63', 51413), ('89.202.42.62', 51413), ('198.48.184.203', 51157), ('172.98.67.53', 2623), ('109.104.17.83', 50000), ('80.128.42.85', 51413), ('82.253.238.71', 51413), ('81.171.9.199', 10128), ('68.44.225.100', 47320), ('88.172.132.57', 12464), ('84.57.138.103', 51413), ('68.44.225.100', 7158), ('23.82.10.32', 40779), ('5.189.164.178', 62924), ('86.171.122.135', 51413), ('91.226.253.69', 1039), ('188.32.24.204', 17691), ('79.121.6.189', 1060), ('68.44.225.100', 34255), ('68.44.225.100', 42549), ('91.227.46.177', 51413), ('87.253.244.34', 16881), ('73.64.230.136', 51413), ('203.184.52.1', 8999), ('66.78.249.53', 8999), ('109.197.193.160', 56172), ('109.252.2.116', 8197), ('151.252.157.50', 61213), ('37.53.35.117', 32852), ('74.140.192.66', 18954), ('71.197.139.141', 51413), ('173.179.36.233', 58948), ('75.174.143.174', 55265), ('70.56.193.245', 51413)} peer search for ECB3E22E1DC0AA078B48B7323AEBBA827AD9BD80 peers: {('192.168.10.10', 2357)} announce with port `2357` announce done search our own ip peers: {('192.168.10.10', 2357)}LinksAIO-KRPC;AIO-UDP;BEP 0005;DHT on Wikipedia;
aiobtname
No description available on PyPI.
aiobungie
aiobungieA statically typed, asynchronous API wrapper for building clients for Bungie's API in Python.InstallingCurrently Python 3.10, 3.11 and 3.12 are supported.Stable release.pipinstallaiobungieDevelopment via github master.pipinstallgit+https://github.com/nxtlo/aiobungie@masterQuick ExampleSeeExamples for advance usage.importaiobungieclient=aiobungie.Client('YOUR_API_KEY')asyncdefmain()->None:# Fetch a Destiny 2 character passing a component.# This includes Equipments, Inventory, Records, etc.asyncwithclient.rest:my_warlock=awaitclient.fetch_character(member_id=4611686018484639825,membership_type=aiobungie.MembershipType.STEAM,character_id=2305843009444904605,components=[aiobungie.ComponentType.CHARACTER_EQUIPMENT],)# Other components will be populated when passed to `components=[...]`# Otherwise will be `None`# Make sure the component is fetched.assertmy_warlock.equipmentisnotNone# Get the first item equipped on this character.item=my_warlock.equipment[0]print(item.hash,item.location)# Fetch this item, Note that this performs an HTTP request.# Alternatively, You can use the manifest here instead.# See examples folder for more information.item=awaitclient.fetch_inventory_item(item.hash)print(item.name,item.type_and_tier_name)# Prints: Izanagi's Burden Exotic Sniper Rifle# You can either run it using the client or just asyncio.run(main())client.run(main())RESTful clientsaiobungie also provides a stand-aloneRESTClient/RESTPoolwhich's whatClientbuilt on top of, These clients just provide a lower-level abstraction.A key note is that anyClientbased user can access theRESTClientinstance bound to it with.restproperty.Key FeaturesLower level, allows to read and deserialize the JSON objects yourself.RESTClients do not turn response payloads into one ofaiobungie.cratesobject.RESTful, You can use this as your REST API client in backend directly.BothManifestandOAuthmethods are usable directly.Exampleimportaiobungieimportasyncio# Single REST client connection.client=aiobungie.RESTClient("...")asyncdefmain()->None:asyncwithclient:# Download and open the JSON manifest.manifest=awaitclient.download_json_manifest(name="latest_manifest")withmanifest.open("r")asfile:data=file.read()# OAuth2 API. 2 simple methods for fetching and refreshing tokens.tokens=awaitclient.fetch_oauth2_tokens('code')refreshed_tokens=awaitclient.refresh_access_token(tokens.refresh_token)# Testing your own requests.response=awaitclient.static_request("GET",# Method."Destiny2/path/to/route",# Route.auth="optional_access_token",# If the method requires OAuth2.json={"some_key":"some_value"}# If you need to pass JSON data.)asyncio.run(main())Dependanciesaiohttpattrsbackports.datetime_fromisoformat, required forPython 3.10only.Featuresaiobungie features are extra dependencies that replaces the standard library with either faster/neater pkgs.speedupThis will include and usesorjsonorujsonas the defaultjsonparser. 
They provide faster json serialization and de-serialization than the standard Python JSON pkg.full: This will include all of the features above.For installing the specified feature, typepip install aiobungie[feature-name]ContributingPlease read thismanualRelated ProjectsIf you have used aiobungie and want to show your work, Feel free to Open a PR including it.Fated: A Discord BOT that uses aiobungie.Useful ResourcesDiscord Username:vfateaiobungie Documentation:Here.BungieAPI Discord:HereOfficial Bungie Documentation:HereBungie Developer Portal:HereAdditional informationIf you have any question you can either open a blank issue, open a new github discussion, or just tag me in BungieAPI discord server.
aioburst
aioburstA library to optimize the speed of rate limited async calls, by alternating between bursts and sleeps.UsageInstall the package using pip:pip install aioburstImport the limiter:from aioburst import aioburstInstantiate the limiter using thecreatemethod, setting thelimit(number of calls) andperiod(period over which the number of calls are restricted):limiter = AIOBurst.create(limit=10, period=1.0) async with limiter: ...The code above would allow 10 asynchronous entries (into the context manager) without any limit. Then it adds "sleepers" for the next calls. The sleeper tells the next entry when it can start. The 11th call gets the sleeper set by the 1st call that returned and waits until theperiodhas elapsed. This approach ensures that there are never more thanlimitsimultaneous calls but that the next call can start as soon as possible. The result is that in a sliding window ofperiodyou should see exactlylimitcalls as active, regardless of how fast or slow any individual call returns.You can also stack limiters:limiter1 = AIOBurst.create(limit=10, period=1.0) limiter2 = AIOBurst.create(limit=100, period=60.0) async with limiter1: async with limiter2: ...Use this for cases where an API has a two-level rate limit like 10 calls per second or 100 calls per minute---both limits will be respected. The stack is also idempotent, meaning that whichever way you stack the limiters, both limits will be respected:limiter1 = AIOBurst.create(limit=10, period=1.0) limiter2 = AIOBurst.create(limit=100, period=60.0) async with limiter1: async with limiter2: ... # The limit above will do the exact same thing as the limit below async with limiter2: async with limiter1: ...
aiobus
An event-bus application layer with Redis support. A simple ASYNC framework to integrate a DISTRIBUTED message bus into your application: utilizes Redis pub/sub features in the backend; fast to start and simple to configure; thread-safe and Python async-friendly thanks to coroutine contexts; fault tolerant if some Redis instances go down for a short time; scalable thanks to consistent hashing for mapping a topic to an instance, without a cluster setup.Installpip install aiobusUsageimportjsonimportasyncioimportdatetimefromaiobus.redisimportRedisBus...bus=RedisBus(servers=['192.168.100.1','192.168.100.2:6380'],max_pool_size=1000)...# Publisher Coroutineasyncdefpublisher():whileTrue:awaitbus.publish('my-topic',{'stamp':str(datetime.datetime.now())})awaitasyncio.sleep(0.1)...# Subscriber Coroutineasyncdefsubscriber():awaitbus.subscribe('my-topic')asyncformsginawaitbus.listen():print(json.dumps(msg,indent=2,sort_keys=True))...DemoSet up Redis instances on localhost with docker-compose up -d for demo purposes, then run/debug the demo script.Running testsBefore running the tests you should install dependencies: pip install -r requirements.txt and start the dependent services with the command: docker-compose up -d
aioca
aioca is an asynchronous EPICS Channel Access client for asyncio and Python using libca via ctypes.PyPIpip install aiocaSource codehttps://github.com/dls-controls/aiocaDocumentationhttps://dls-controls.github.io/aiocaChangeloghttps://github.com/dls-controls/aioca/blob/master/CHANGELOG.rstIt exposes a high level interface similar to the commandline tools:caget(pvs, ...) Returns a single snapshot of the current value of each PV. caput(pvs, values, ...) Writes values to one or more PVs. camonitor(pvs, callback, ...) Receive notification each time any of the listed PVs changes. connect(pvs, ...) Optionally can be used to establish PV connection before using the PV.Seehttps://dls-controls.github.io/aiocafor more detailed documentation.
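A minimal sketch of the interface summarized above. The PV name is a placeholder, and the monitor shutdown call is an assumption; only the caget/caput/camonitor names and their general shape come from the summary in this entry and the linked documentation.

    import asyncio
    from aioca import caget, caput, camonitor

    async def main():
        # Write a value to a PV, then read back a snapshot of it (PV name is a placeholder).
        await caput("SOME:PV", 42)
        value = await caget("SOME:PV")
        print("current value:", value)

        # Receive a notification each time the PV changes.
        def on_update(value):
            print("update:", value)

        monitor = camonitor("SOME:PV", on_update)
        await asyncio.sleep(10)  # let updates arrive for a while
        monitor.close()          # assumed way to stop the subscription

    asyncio.run(main())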
aiocache
Asyncio cache supporting multiple backends (memory, redis and memcached).This library aims for simplicity over specialization. All caches contain the same minimum interface which consists on the following functions:add: Only adds key/value if key does not exist.get: Retrieve value identified by key.set: Sets key/value.multi_get: Retrieves multiple key/values.multi_set: Sets multiple key/values.exists: Returns True if key exists False otherwise.increment: Increment the value stored in the given key.delete: Deletes key and returns number of deleted items.clear: Clears the items stored.raw: Executes the specified command using the underlying client.ContentsInstallingUsageHow does it workAmazing examplesDocumentationInstallingpip install aiocachepip install aiocache[redis]pip install aiocache[memcached]pip install aiocache[redis,memcached]pip install aiocache[msgpack]UsageUsing a cache is as simple as>>>importasyncio>>>fromaiocacheimportCache>>>cache=Cache(Cache.MEMORY)# Here you can also use Cache.REDIS and Cache.MEMCACHED, default is Cache.MEMORY>>>withasyncio.Runner()asrunner:>>>runner.run(cache.set('key','value'))True>>>runner.run(cache.get('key'))'value'Or as a decoratorimportasynciofromcollectionsimportnamedtuplefromaiocacheimportcached,Cachefromaiocache.serializersimportPickleSerializer# With this we can store python objects in backends like Redis!Result=namedtuple('Result',"content, status")@cached(ttl=10,cache=Cache.REDIS,key="key",serializer=PickleSerializer(),port=6379,namespace="main")asyncdefcached_call():print("Sleeping for three seconds zzzz.....")awaitasyncio.sleep(3)returnResult("content",200)asyncdefrun():awaitcached_call()awaitcached_call()awaitcached_call()cache=Cache(Cache.REDIS,endpoint="127.0.0.1",port=6379,namespace="main")awaitcache.delete("key")if__name__=="__main__":asyncio.run(run())The recommended approach to instantiate a new cache is using theCacheconstructor. However you can also instantiate directly usingaiocache.RedisCache,aiocache.SimpleMemoryCacheoraiocache.MemcachedCache.You can also setup cache aliases so its easy to reuse configurationsimportasynciofromaiocacheimportcaches# You can use either classes or strings for referencing classescaches.set_config({'default':{'cache':"aiocache.SimpleMemoryCache",'serializer':{'class':"aiocache.serializers.StringSerializer"}},'redis_alt':{'cache':"aiocache.RedisCache",'endpoint':"127.0.0.1",'port':6379,'timeout':1,'serializer':{'class':"aiocache.serializers.PickleSerializer"},'plugins':[{'class':"aiocache.plugins.HitMissRatioPlugin"},{'class':"aiocache.plugins.TimingPlugin"}]}})asyncdefdefault_cache():cache=caches.get('default')# This always returns the SAME instanceawaitcache.set("key","value")assertawaitcache.get("key")=="value"asyncdefalt_cache():cache=caches.create('redis_alt')# This creates a NEW instance on every callawaitcache.set("key","value")assertawaitcache.get("key")=="value"asyncdeftest_alias():awaitdefault_cache()awaitalt_cache()awaitcaches.get("redis_alt").delete("key")if__name__=="__main__":asyncio.run(test_alias())How does it workAiocache provides 3 main entities:backends: Allow you specify which backend you want to use for your cache. Currently supporting: SimpleMemoryCache, RedisCache usingredisand MemCache usingaiomcache.serializers: Serialize and deserialize the data between your code and the backends. This allows you to save any Python object into your cache. Currently supporting: StringSerializer, PickleSerializer, JsonSerializer, and MsgPackSerializer. 
But you can also build custom ones.plugins: Implement a hooks system that allows you to execute extra behavior before and after each command.If you are missing an implementation of a backend, serializer or plugin that you think could be interesting for the package, do not hesitate to open a new issue.Those 3 entities combine during some of the cache operations to apply the desired command (backend), data transformation (serializer) and pre/post hooks (plugins). To have a better vision of what happens, the documentation shows how the set function works in aiocache.Amazing examplesInexamples folderyou can check different use cases:Sanic, Aiohttp and TornadoPython object in RedisCustom serializer for compressing dataTimingPlugin and HitMissRatioPlugin demosUsing marshmallow as a serializerUsing cached decorator.Using multi_cached decorator.DocumentationUsageCachesSerializersPluginsConfigurationDecoratorsTestingExamples
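To make the three entities above concrete, here is a short sketch that combines a backend, a serializer and a plugin explicitly when building a cache. The serializer and plugin classes are the ones named in this entry; passing them as constructor arguments is the assumed pattern.

    import asyncio
    from aiocache import Cache
    from aiocache.serializers import JsonSerializer
    from aiocache.plugins import HitMissRatioPlugin

    async def main():
        # backend (memory) + serializer (JSON) + plugin (hit/miss ratio hooks)
        cache = Cache(Cache.MEMORY, serializer=JsonSerializer(), plugins=[HitMissRatioPlugin()])
        await cache.set("key", {"some": "value"})
        print(await cache.get("key"))

    asyncio.run(main())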
aiocached
aiocached is a simple package with the decorator cached to cache the results of ordinary and coroutine functions, with configurable TTL and None value support.I wrote a simple helper to cache results in one service because I found it easy to do. As soon as I needed the helper in another project, I realized that it should be in a separate package published on PyPI. Having found theaiocacheproject I was disappointed because it wasn't able to cacheNonevalues. So I had a reason to createaiocached.Table of contentsUsage examplesInstallationUsage examplesIn this examplefoo(1)will be run just once:importasynciofromaiocachedimportcached@cachedasyncdeffoo(n):awaitasyncio.sleep(n)asyncdefmain():awaitasyncio.gather(*[foo(1)for_inrange(1000)])asyncio.run(main())In this examplebar(1)will be run twice because of TTL:importasynciofromaiocachedimportcached@cached(ttl=2)asyncdefbar(n):awaitasyncio.sleep(n)asyncdefmain():awaitbar(1)awaitasyncio.sleep(2)awaitbar(1)asyncio.run(main())If you want to cache an ordinary function, you can do it as well. In this examplefoobar(1)will be run twice for the same reason as above:importtimefromaiocachedimportcached@cached(ttl=2)deffoobar(n):time.sleep(n)defmain():foobar(1)time.sleep(2)foobar(1)main()InstallationUse pip to install:$pipinstallaiocached
aiocacher
No description available on PyPI.
aiocaldav
aiocaldav: aiocaldav is a fork of the caldav project, starting from v0.5.0. It uses the aiohttp client library instead of the synchronous requests lib. It also targets only Python 3.6+ (six and older Python support removed).Drawbacks: no DigestAuth support for now.Bug corrections since caldav v0.5.0: the query syntax for todo lists without completed items was wrong; it was possible to complete an already completed task, now calling complete() on an already completed task does nothing (perhaps we should raise an error instead?); changed datetime output in cdav to match RFC 5545 (for timezones).Evolutions since caldav v0.5.0 (incompatible changes on top of 'asyncification'): package name changed from caldav to aiocaldav; Principal.calendar_home_set is no longer a property, it's now an async method. To set the prop, now use Principal._calendar_home_setter(url). To retrieve it, use await Principal.calendar_home_set().TestsTests use pytest and pytest_asyncio and need (by default) docker and docker-compose. Just run:# pytest .to launch the tests.
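A hypothetical sketch of the incompatible change described above. Only the calendar_home_set change is stated in this entry; the DAVClient/principal entry points and the awaitable client setup are assumptions borrowed from upstream caldav, and the URL and credentials are placeholders.

    import asyncio
    import aiocaldav  # assumed import name for the fork

    async def main():
        # Connection boilerplate assumed to mirror upstream caldav, with async methods.
        client = aiocaldav.DAVClient("https://example.com/dav/", username="user", password="secret")
        principal = await client.principal()
        # calendar_home_set is no longer a property: it must now be awaited.
        home_set = await principal.calendar_home_set()
        print(home_set)

    asyncio.run(main())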
aiocall
The missing call methods for pythonasyncio.I found that python asyncio is lacking some basic methods to invoke calls:call_soon_periodic(interval,callback,*args)call_later_periodic(delay,interval,callback,*args)This module provides these methods.Example:importasyncioimportaiocalldefdo_something(what):print('doing',what)timer=aiocall.call_later_periodic(0.5,1.0,do_something,'stuff')loop=asyncio.get_event_loop()loop.call_later(4.0,timer.cancel)loop.call_later(5.0,loop.stop)loop.run_forever()loop.close()Output:doing stuff doing stuff doing stuff doing stuff
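For completeness, the same pattern with the other listed method, call_soon_periodic, which takes no initial delay. This is a small sketch following the example above and the signature listed earlier; the interval and callback values are placeholders.

    import asyncio
    import aiocall

    def do_something(what):
        print('doing', what)

    # Same as the example above, but the periodic calls start immediately.
    timer = aiocall.call_soon_periodic(1.0, do_something, 'stuff')

    loop = asyncio.get_event_loop()
    loop.call_later(3.5, timer.cancel)
    loop.call_later(4.0, loop.stop)
    loop.run_forever()
    loop.close()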
aiocan
Failed to fetch description. HTTP Status Code: 404
aiocapsolver
Capsolver.com API for PythonCapSolverPython is an elegant API implementation of Capsolver.com for PythonFeatures:Async/Sync supportAll endpoints/inputs (soon)Error handlingInstallationpip install aiocapsolver(https://pypi.org/project/aiocapsolver/)Usagefromaiocapsolver.capsolverimportAsyncCapSolversolver=AsyncCapSolver(MyCapSolverApiKey)asyncdefmyasyncfunction():result=awaitsolver.solve_image_to_text("images/myimage.png")WIPCurrently only supports:async image captcha solving
aiocapsule
aiocapsuleA minimal package containing an AIOHTTP wrapper functionDesigned for asynchronous HTTP requests toa large, unknown number of different serversSupport for HTTP proxy and basic authenticationInstallpipinstallaiocapsuleHow to usefromaiocapsule.coreimportrequestA simple call returning JSON as a dict:awaitrequest('GET','https://api.coingecko.com/api/v3/ping'){'gecko_says': '(V3) To the Moon!'}Or a string of HTML:awaitrequest('GET','https://example.com/',text=True)'<!doctype html>\n<html>\n<head>\n <title>Example Domain</title>\n\n <meta charset="utf-8" />\n <meta http-equiv="Content-type" content="text/html; charset=utf-8" />\n <meta name="viewport" content="width=device-width, initial-scale=1" />\n <style type="text/css">\n body {\n background-color: #f0f0f2;\n margin: 0;\n padding: 0;\n font-family: -apple-system, system-ui, BlinkMacSystemFont, "Segoe UI", "Open Sans", "Helvetica Neue", Helvetica, Arial, sans-serif;\n \n }\n div {\n width: 600px;\n margin: 5em auto;\n padding: 2em;\n background-color: #fdfdff;\n border-radius: 0.5em;\n box-shadow: 2px 3px 7px 2px rgba(0,0,0,0.02);\n }\n a:link, a:visited {\n color: #38488f;\n text-decoration: none;\n }\n @media (max-width: 700px) {\n div {\n margin: 0 auto;\n width: auto;\n }\n }\n </style> \n</head>\n\n<body>\n<div>\n <h1>Example Domain</h1>\n <p>This domain is for use in illustrative examples in documents. You may use this\n domain in literature without prior coordination or asking for permission.</p>\n <p><a href="https://www.iana.org/domains/example">More information...</a></p>\n</div>\n</body>\n</html>\n'
aiocarbon
Client for feeding data to graphite.ExampleCounter example:importasyncioimportaiocarbonasyncdefmain(loop):aiocarbon.setup(host="127.0.0.1",port=2003,client_class=aiocarbon.TCPClient)for_inrange(1000):withaiocarbon.Counter("foo"):awaitasyncio.sleep(0.1)if__name__=="__main__":loop=asyncio.get_event_loop()loop.run_until_complete(main(loop))loop.close()
aiocasambi
Python library for controlling Casambi lights. An aio Python library for controlling Casambi via the Cloud API.Supported modesThese modes are implemented: on/off, brightness, color temperature, rgb, rgbw.Getting StartedRequest a developer API key from Casambi:https://developer.casambi.com/Set up a site in the Casambi app:http://support.casambi.com/support/solutions/articles/12000041325-how-to-create-a-siteInstallingInstall this library through pip:pip install aiocasambiAuthorsOlof Hellqvist - Initial workLicenseThis project is licensed under the MIT License - see the LICENSE.md file for detailsTest scriptAdd the credentials to casambi.yaml.example and rename the file to casambi.yamlBuild locally in a venv:python3 -m venv aiocasambi source ./aiocasambi/bin/activate ./aiocasambi/bin/pip3 install -r ./aiocasambi/requirements.txtDisclaimerThis library is neither affiliated with nor endorsed by Casambi.
aiocassandra
info:Simple threaded cassandra wrapper for asyncioInstallationpipinstallaiocassandraUsageimportasynciofromaiocassandraimportaiosessionfromcassandra.clusterimportClusterfromcassandra.queryimportSimpleStatement# connection is blocking callcluster=Cluster()# aiocassandra uses executor_threads to talk to cassndra driver# https://datastax.github.io/python-driver/api/cassandra/cluster.html?highlight=executor_threadssession=cluster.connect()asyncdefmain():# patches and adds `execute_future`, `execute_futures` and `prepare_future`# to `cassandra.cluster.Session`aiosession(session)# best way is to use cassandra prepared statements# https://cassandra-zone.com/prepared-statements/# https://datastax.github.io/python-driver/api/cassandra/cluster.html#cassandra.cluster.Session.prepare# try to create them once on application initquery=session.prepare('SELECT now() FROM system.local;')# if non-blocking prepared statements is really needed:query=awaitsession.prepare_future('SELECT now() FROM system.local;')print(awaitsession.execute_future(query))# pagination is also supportedquery='SELECT * FROM system.size_estimates;'statement=SimpleStatement(query,fetch_size=100)# don't miss *s* (execute_futureS)asyncwithsession.execute_futures(statement)aspaginator:asyncforrowinpaginator:print(row)loop=asyncio.get_event_loop()loop.run_until_complete(main())cluster.shutdown()loop.close()Python 3.5+ is requiredThanksThe library was donated byOcean S.A.Thanks to the company for contribution.
aiocast
aiocastCast videos to chromecast devices from the terminal.Install$ pip install aiocastUsage$ aiocast play video.mp4$ aiocast list-devices$ aiocast device-info "Home"$ aiocast play --help usage: aiocast play [-h] [-d DEVICE_NAME] [-p PORT] [-t TIMEOUT] [-i IDLE] [--local-ip LOCAL_IP] [--mimetype MIMETYPE] media Cast a video positional arguments: media Path to the video to cast. optional arguments: -h, --help show this help message and exit -d DEVICE_NAME, --device-name DEVICE_NAME The target cast device name (default: None) -p PORT, --port PORT Port of the local cast server (default: None) -t TIMEOUT, --timeout TIMEOUT Timeout after which the program will close if stuck on buffering. (default: 60) -i IDLE, --idle IDLE Time to stay idle after a stop or media ends (default: 2.5) --local-ip LOCAL_IP Local ip to use, otherwise get the first private ip available. (default: None) --mimetype MIMETYPE Set the mimetype of the media, otherwise will guess. (default: None)
aiocdp
AIOCDPThis library provides an async wrapper around the Chrome DevTools Protocol.AboutTo be the underlying engine for any projects using Chrome DevTools Protocol. This library should embody the following:Flexibility: Allow for any use case that the CDP supports.Minimal external dependencies: Opt for built-in python libraries where possible.Clean code: No hacks, workarounds, or spaghetti code.UsageStarting ChromeThe following will launch chrome through the command line.importasynciofromaiocdpimportChromeasyncdefsetup_chrome():chrome=Chrome.init(host="localhost",port=9222,)chrome.start()# run your logic hereif__name__=="__main__":asyncio.run(setup_chrome())NOTE:If you had chrome open previously, you may run into connection issues. If this is the case, close your tabs, run the script to relaunch chrome, and then reopen your tabs.Opening a tabimportasynciofromaiocdpimportChromeasyncdefsetup_tabs():chrome=Chrome.init(host="localhost",port=9222,)chrome.start()# opens a new tab. Return aiocdp.ITarget instance.target=chrome.open_tab("https://www.google.com")target=chrome.open_tab("https://www.yahoo.com")target=chrome.open_tab("https://www.github.com/amirdevstudio/aiocdp")# opens a new tab. The parameter is optionaltarget=chrome.open_tab()if__name__=="__main__":asyncio.run(setup_tabs())PackageYou can find the AIOCDP package published to pypi.org/project/aiocdpFor DevelopersScopeThis library is built to be "one and done" except for performance optimizations or design changes. This library should limit the dependencies it has on the CDP unless it's a core feature (opening sessions, etc.).DependenciesBuilt-indataclassesmodule for classesBuilt-intypingmodule for type hints, enum literals, and other goodiesBuilt-injsonmodule for JSON serializationBuilt-inasynciomodule for async functionalityExternalrequestsmodule for HTTP communicationExternalwebsocketsmodule for websocket communicationInternalsModule Organizationaiocdp- Root package. Contains core and utility modulesaiocdp.core- Core functionality for communicating with the Chrome DevTools Protocolaiocdp.core.chrome.Chrome-> Represents the Chrome instance / process.aiocdp.core.target.Target-> Represents a chrome devtools protocol target (Page, Frame, Worker, etc)aiocdp.core.connection.Connection-> Represents a websocket connection to a targetaiocdp.core.session.TargetSession-> Represents a session to a specific target.aiocdp.core.stream.EventStream-> Represents a stream of events from a connection.aiocdp.core.stream.EventStreamReader-> Readonly reader to an event stream.TODO:Setup proper typehints for setting subclasses. (use a type var in a shared module)documentationPublishing to PyPiUpdate the version insetup.pyand / orpyproject.tomlRunpython -m pip install build twineRunpython -m buildRuntwine check dist/*Runtwine upload dist/aiocdp-<MAJOR>.<MINOR>.<PATCH>*
aiocea
aiocea, an async cryptocurrency exchange API framework. aiocea is a cryptocurrency exchange API integration framework based on asyncio and aiohttp. It wraps the Binance exchange API (more exchanges will be supported in the future) and implements network request tasks that can be executed serially, concurrently, or in a mix of both. The framework has a clear logic.

Getting Started: a quick tour of the framework and its logic

An asynchronous network request is called a task (Task). The whole framework is task-driven and follows a standard workflow, while the design still leaves plenty of flexibility. Completing any particular crypto exchange API request involves the following steps:

1. Instantiate an exchange API

aiocea wraps the exchange RESTful API request methods in utils.py (currently only Binance is supported). The Binance exchange is represented by the class BinanceRestApi, which encapsulates the request methods used when calling the Binance RESTful API. Pass in your API_KEY and API_SECRET to create an exchange API request instance, as in the following code:

    from utils import BinanceRestApi
    API = BinanceRestApi(api_key="your api key here", api_secret="your api secret here")

BinanceRestApi defines a request method. Calling it to request a URL adds the parameters required by Binance to the request headers and query string, so that the data can be fetched.

2. Define tasks

In aiocea, Task subclasses, the GatherTask class and the SerialTask class are the task types recognized by the framework. We collectively call them the generic task type GenericTask. They are described one by one below. All GenericTask objects have a result attribute, which makes it easy to read the result after the task has finished.

Task and its subclasses

The Task class is defined in tasks.py. Any class that inherits from Task and defines the result of an asynchronous request in its async def __call__(self, *args, **kwargs) method is a concrete task class. For example, pinging the Binance COIN-M futures server:

    class CoinMPing(Task):
        '''Test server connectivity'''
        async def __call__(self, *args, **kwargs):
            response = await self.binance_API.\
                request(method="GET", url=BinanceRestApi.CoinM_BASE_URL + "/dapi/v1/ping")
            return response
        def __str__(self):
            return "ping服务器"

Task is an abstract class; it is the smallest unit of execution in the framework. It requires every subclass to implement async def __call__(self, *args, **kwargs), define the result of the asynchronous request and return it, which completes the definition of a concrete task. Every object of a Task subclass has the following attributes and methods:

result: holds the return value of the task after the asynchronous request succeeds, i.e. the value returned from async def __call__(). It must not be set manually; it is maintained by the framework.

binance_API: this attribute exists so that data can be fetched with different API_KEY/API_SECRET pairs at the same time. It is a BinanceRestApi instance, so when defining a Task subclass you can call its request method to perform the network request with a specific API. Its default value is inherited from the class variable Task.default_binance_API, so you can set the class variable with Task.set_default_binance_API() and every Task subclass instance created afterwards will inherit that value for its binance_API attribute. The following example shows how setting the class variable gives different Task subclass instances different binance_API attributes at initialization time:

    from tasks import Task
    from utils import BinanceRestApi
    from tasklists import CoinMPing
    # define two different API request interfaces
    API1 = BinanceRestApi(api_key="key_1", api_secret="screte_1")
    API2 = BinanceRestApi(api_key="key_2", api_secret="screte_2")
    # set the default BinanceRestApi used by new Task subclass instances
    Task.set_default_binance_API(API1)
    task1 = CoinMPing()  # CoinMPing is a predefined Task subclass
    print("task1:", task1.binance_API.api_key)  # output: key_1
    # change the default BinanceRestApi used by new Task subclass instances
    Task.set_default_binance_API(API2)
    task2 = CoinMPing()  # CoinMPing is a predefined Task subclass
    print("task2:", task2.binance_API.api_key)  # output: key_2

__str__: names this particular task. If it is not implemented, the method of the parent Task class is inherited.

aiocea already defines the common Binance data request tasks in tasklists.py; in most scenarios you can simply import and use them. The returned data is the raw response parsed from JSON.

GatherTask, the concurrent task class

GatherTask wraps several GenericTask instances into one concurrent task so that they can run concurrently. Note that GenericTask is the umbrella term for the three task types, so we can wrap not only several single tasks into one concurrent task but also several serial tasks. Instantiating a GatherTask is simple: just pass several GenericTask instances to the constructor, as in this example:

    from tasks import GatherTask
    from tasklists import CoinMPing, CoinMServerTime
    # ping the server and fetch the server time concurrently
    gathertask = GatherTask(CoinMPing(), CoinMServerTime())
    # show the tasks
    gathertask.show_tasks()
    # output:
    # 并发任务(2):
    # ping币安币本位服务器
    # 获取币安币本位服务器时间

The show_tasks() method displays the task queue of the current concurrent task. GatherTask has a result attribute, which is the list of the result attributes of all the GenericTask instances passed in.

SerialTask, the serial task class

SerialTask is similar to GatherTask. It wraps several GenericTask instances into one serial task so that they are guaranteed to run sequentially. Instantiating a SerialTask is simple: just pass several GenericTask instances to the constructor, as in this example:

    from tasks import SerialTask
    from tasklists import CoinMPing, CoinMServerTime
    # ping the server and fetch the server time sequentially
    serialtask = SerialTask(CoinMPing(), CoinMServerTime())
    # show the tasks
    serialtask.show_tasks()
    # output:
    # 串行任务(2):
    # ping币安币本位服务器
    # 获取币安币本位服务器时间

We can also use the show_tasks() method to display the task queue of the current serial task. SerialTask has a result attribute, which is the list of the result attributes of all the GenericTask instances passed in.

3. Define a task queue

In aiocea, tasks are executed through a queue model. The executor determines the execution order of the tasks according to the kind of task queue it is given. aiocea predefines the following three task queue models in taskoperator.py:

TaskQueue: a first-in, first-out task queue.
TaskStack: a first-in, last-out task stack.
TaskRing: a first-in, first-out circular task queue. Its constructor accepts two parameters, iter_num and iter_interval, which are the number of times the queue is looped over and the sleep time after each loop.

These three types are collectively called the generic task queue GenericTaskQueue. GenericTaskQueue, SerialTask and GatherTask all inherit from the class BaseMultiTaskOperate, which overloads __add__ and __sub__ so that tasks can be added or removed with the plus and minus signs. They all also have an add_tasks method for adding tasks and a show_tasks method for printing the current tasks. The following example shows how to build a task queue:

    from tasks import GatherTask, SerialTask
    from taskoperator import TaskRing
    from tasklists import *
    # define a circular task queue
    ring = TaskRing()
    # add tasks to the circular task queue
    ring += CoinMPing()  # Task 0
    ring += GatherTask(
        SerialTask(CoinMServerTime(), CoinMFetchTicker(symbol="BTCUSD_PERP", pair="BTCUSD")),
        SerialTask(CoinMServerTime(), CoinMFetchTicker(symbol="ETHUSD_PERP", pair="ETHUSD")))  # Task 1
    # show the current tasks
    ring.show_tasks()
    # output:
    # -------当前任务队列(2)-------
    # Task 0
    # ping币安币本位服务器
    # Task 1
    # 并发任务(2):
    # 串行任务(2):
    # 获取币安币本位服务器时间
    # 获取币本位合约:BTCUSD_PERP 实时价格
    # 串行任务(2):
    # 获取币安币本位服务器时间
    # 获取币本位合约:ETHUSD_PERP 实时价格
    #-----------------------

4. Create a task executor, add the task queue and get the results

In aiocea, the tasks in a queue are executed by a task executor, an object of the TaskOperater class defined in taskoperator.py. The executor runs the tasks of the added queue synchronously; after each task finishes, its result attribute changes from None to the result of the corresponding asynchronous request. The whole execution process works as follows:

Add an execution task queue q to the executor.
Repeatedly call q's __next__ method to get the current task to execute.
Run the task's asynchronous callable in the event loop, get the result and set task.result to that result.
If the task queue raises StopIteration, stop; otherwise repeat from step 2.

A TaskOperater object has the following methods and attributes:

taskqueue attribute: the executor's execution task queue.
add_queue method: adds an execution task queue. The argument must be a generic task queue (GenericTaskQueue) object and is assigned to the taskqueue attribute. The method returns the executor itself (self).
run method: runs the whole execution process. When it finishes, all result attributes of the tasks in the queue hold the returned results.
fishih_task_generator method: takes one parameter, exec_interval, the sleep time after each task of the queue is executed. It returns a generator that yields each task object as soon as step 3 is completed for it.
runSingleTask method: takes a GenericTask object, creates a TaskQueue as the executor's execution queue, adds the task to it and returns the executor itself (self).
repeatSingleTask method: takes a GenericTask object, creates a TaskRing as the executor's execution queue, adds the task to it and returns the executor itself (self).

That is the basic logic of the framework; for more usage and methods, refer to the API documentation that follows. The framework can in fact also be used for other everyday asynchronous request jobs.

Examples: usage examples

Fetch the server time

    from taskoperator import TaskQueue, TaskOperater
    from tasklists import *
    # set the default API for tasks
    Task.set_default_binance_API(BinanceRestApi(api_key="key_1", api_secret="screte_1"))
    # define a task
    task = CoinMServerTime()
    # define a task queue
    q = TaskQueue()
    # add the task to the queue
    q += task  # Task 0
    # define an executor and add the queue
    op = TaskOperater().add_queue(q)
    # run the tasks in the queue
    op.run()
    # read the task result
    print(task.result)

Fetch the ticker in a loop every second

    from taskoperator import TaskRing, TaskOperater
    from tasklists import *
    # set the default API for tasks
    Task.set_default_binance_API(BinanceRestApi(api_key="key_1", api_secret="screte_1"))
    # define a ticker task
    task = CoinMFetchTicker(symbol="BTCUSD_PERP", pair="BTCUSD")
    # define a circular task queue with a 1 second loop interval
    q = TaskRing(iter_interval=1)
    # add the task to the queue
    q += task  # Task 0
    # show the task queue
    q.show_tasks()
    # define an executor and add the queue
    op = TaskOperater().add_queue(q)
    # run the tasks in the queue
    for task in op.fishih_task_generator():
        print(task.result)
    # output:
    # -------当前任务队列(1)-------
    # Task 0
    # 获取币本位合约:BTCUSD_PERP 实时价格
    # -----------------------
    # [{'symbol': 'BTCUSD_PERP', 'ps': 'BTCUSD', 'price': '55163.4', 'time': 1618897014034}]
    # [{'symbol': 'BTCUSD_PERP', 'ps': 'BTCUSD', 'price': '55163.4', 'time': 1618897014034}]
    # [{'symbol': 'BTCUSD_PERP', 'ps': 'BTCUSD', 'price': '55163.4', 'time': 1618897014034}]
    # [{'symbol': 'BTCUSD_PERP', 'ps': 'BTCUSD', 'price': '55163.4', 'time': 1618897014034}]
    # ...

Fetch the server time and the ticker concurrently, looping every second

    from taskoperator import TaskRing, TaskOperater
    from tasks import GatherTask
    from tasklists import *
    # set the default API for tasks
    Task.set_default_binance_API(BinanceRestApi(api_key="key_1", api_secret="screte_1"))
    # define a circular task queue with a 1 second loop interval
    q = TaskRing(iter_interval=1)
    # add a concurrent task to the queue
    q += GatherTask(CoinMServerTime(), CoinMFetchTicker(symbol="BTCUSD_PERP", pair="BTCUSD"))
    # show the task queue
    q.show_tasks()
    for task in TaskOperater().add_queue(q).fishih_task_generator():
        print(task.result)
    # output:
    # -------当前任务队列(1)-------
    # Task 0
    # 并发任务(2):
    #  获取币安币本位服务器时间
    #  获取币本位合约:BTCUSD_PERP 实时价格
    # -----------------------
    # [{'serverTime': 1618897355627}, [{'symbol': 'BTCUSD_PERP', 'ps': 'BTCUSD', 'price': '54927.5', 'time': 1618897355247}]]
    # [{'serverTime': 1618897356024}, [{'symbol': 'BTCUSD_PERP', 'ps': 'BTCUSD', 'price': '54925.0', 'time': 1618897356011}]]
    # [{'serverTime': 1618897356404}, [{'symbol': 'BTCUSD_PERP', 'ps': 'BTCUSD', 'price': '54925.0', 'time': 1618897356011}]]
    # ...

Fetch klines serially

    from taskoperator import TaskQueue, TaskOperater
    from tasklists import CoinMFetchKline
    from utils import *
    from tasks import Task
    import time
    # set the default API for tasks
    Task.set_default_binance_API(BinanceRestApi(api_key="key_1", api_secret="screte_1"))
    # define a task queue
    q = TaskQueue()
    # fetch klines for two periods
    q += CoinMFetchKline(symbol="BTCUSD_PERP", interval="1m", limit=1, endTime=int(time.time()) * 1000)
    q += CoinMFetchKline(symbol="BTCUSD_PERP", interval="1m", limit=1, endTime=(int(time.time()) - 60) * 1000)
    q.show_tasks()
    for task in TaskOperater().add_queue(q).fishih_task_generator():
        print(task, "finish!")
        print(task.result)
    # output:
    # -------当前任务队列(2)-------
    # Task 0
    # 获取币本位合约:BTCUSD_PERP 1m周期 K线<None to 1618898005000>(limit 1)
    # Task 1
    # 获取币本位合约:BTCUSD_PERP 1m周期 K线<None to 1618897945000>(limit 1)
    # -----------------------
    # 获取币本位合约:BTCUSD_PERP 1m周期 K线<None to 1618898005000>(limit 1) finish!
    # [[1618897980000, '54963.2', '54975.9', '54963.2', '54975.8', '2662', 1618898039999, '4.84281221', 73, '2329', '4.23701807', '0']]
    # 获取币本位合约:BTCUSD_PERP 1m周期 K线<None to 1618897945000>(limit 1) finish!
    # [[1618897920000, '54881.1', '54963.2', '54881.1', '54963.1', '17844', 1618897979999, '32.49263090', 320, '15905', '28.96235089', '0']]

Fetch the klines of two symbols serially, with the two symbols handled concurrently

    from taskoperator import TaskQueue, TaskOperater
    from tasklists import CoinMFetchKline
    from utils import *
    from tasks import Task, SerialTask, GatherTask
    import time
    # set the default API for tasks
    Task.set_default_binance_API(BinanceRestApi(api_key="WULDU3TzwOSNR0vRL3zPyumx06ji4O9YKeDM0JpzQA5Gnv6m4RvZMolKK9w8N1qk", api_secret="Hc3kKfgi7ZnBgkGCQ3vWmkJEsIsLJwhs8wM5tvL594nH3TvOKCw9V6f97vZLCtF2"))
    # define a task queue
    q = TaskQueue()
    # run the kline fetches of the two symbols concurrently, two klines per symbol executed serially
    q += GatherTask(
        # the two kline tasks of the first symbol
        SerialTask(CoinMFetchKline(symbol="BTCUSD_PERP", interval="1m", limit=1, endTime=int(time.time()) * 1000),
                   CoinMFetchKline(symbol="BTCUSD_PERP", interval="1m", limit=1, endTime=int(time.time()) * 1000)),
        SerialTask(CoinMFetchKline(symbol="ETHUSD_PERP", interval="1m", limit=1, endTime=int(time.time()) * 1000),
                   CoinMFetchKline(symbol="ETHUSD_PERP", interval="1m", limit=1, endTime=int(time.time()) * 1000)))
    q.show_tasks()
    for task in TaskOperater().add_queue(q).fishih_task_generator():
        print("finish")
        BTC_Kline1 = task.result[0][0]
        BTC_Kline2 = task.result[0][1]
        ETH_Kline1 = task.result[1][0]
        ETH_Kline2 = task.result[1][1]
        print(BTC_Kline1)
        print(BTC_Kline2)
        print(ETH_Kline1)
        print(ETH_Kline2)
    # output:
    # -------当前任务队列(1)-------
    # Task 0
    # 并发任务(2):
    #  串行任务(2):
    #   获取币本位合约:BTCUSD_PERP 1m周期 K线<None to 1618898371000>(limit 1)
    #   获取币本位合约:BTCUSD_PERP 1m周期 K线<None to 1618898371000>(limit 1)
    #  串行任务(2):
    #   获取币本位合约:ETHUSD_PERP 1m周期 K线<None to 1618898371000>(limit 1)
    #   获取币本位合约:ETHUSD_PERP 1m周期 K线<None to 1618898371000>(limit 1)
    # -----------------------
    # finish
    # [[1618898340000, '54962.4', '55016.4', '54962.4', '55016.4', '15264', 1618898399999, '27.75471862', 150, '10905', '19.82837640', '0']]
    # [[1618898340000, '54962.4', '55016.4', '54962.4', '55016.4', '15284', 1618898399999, '27.79107142', 151, '10925', '19.86472920', '0']]
    # [[1618898340000, '2118.17', '2120.31', '2118.17', '2119.90', '47561', 1618898399999, '224.39426159', 109, '27294', '128.79773623', '0']]
    # [[1618898340000, '2118.17', '2120.31', '2118.17', '2119.90', '47561', 1618898399999, '224.39426159', 109, '27294', '128.79773623', '0']]
aio-celery
aio-celeryWhat is aio-celery?This project is an alternative independent asyncio implementation ofCelery.Quoting Celerydocumentation:Celery is written in Python, but the protocol can be implemented in any language.And aio-celery does exactly this, it (re)implementsCelery Message Protocol(in Python) in order to unlock access to asyncio tasks and workers.The most notable feature of aio-celery is that it does not depend on Celery codebase. It is written completely from scratch as a thin wrapper aroundaio-pika(which is an asyncronous RabbitMQ python driver) and it has no other dependencies (except forredis-pyfor result backend support, but this dependency is optional).There have been attempts to create asyncio Celery Pools before, andcelery-pool-asynciois one such example, but its implementation, due to convoluted structure of the original Celery codebase, is (by necessity) full of monkeypatching and other fragile techniques. This fragility was apparently thereasonwhy this library became incompatible with Celery version 5.Celery project itself clearlystruggleswith implementing Asyncio Coroutine support, constantly delaying this feature due to apparent architectural difficulties.This project was created in an attempt to solve the same problem but using the opposite approach. It implements only a limited (but still usable — that is the whole point) subset of Celery functionality without relying on Celery code at all — the goal is to mimic the basic wire protocol and to support a subset of Celery API minimally required for running and manipulating tasks.FeaturesWhat is supported:Basic tasks API:@app.taskdecorator,delayandapply_asynctask methods,AsyncResultclass etc.Everything is asyncio-friendly and awaitableAsyncronous Celery worker that is started from the command lineRouting and publishing options such ascountdown,eta,queue,priority, etc.Task retriesOnly RabbitMQ as a message brokerOnly Redis as a result backendImportant design decisions for aio-celery:Complete feature parity with upstream Celery project is not the goalThe parts that are implemented mimic original Celery API as close as possible, down to class and attribute namesThe codebase of this project is kept as simple and as concise, it strives to be easy to understand and reason aboutThe codebase is maintained to be as small as possible – the less code, the fewer bugsExternal dependencies are kept to a minimum for the same purposeThis project must not at any point have celery as its external dependencyInstallationInstall usingpip:pipinstallaio-celeryIf you intend to use Redis result backend for storing task results, run this command:pipinstallaio-celery[redis]UsageDefineCeleryapplication instance and register a task:# hello.pyimportasynciofromaio_celeryimportCeleryapp=Celery()@app.task(name="add-two-numbers")asyncdefadd(a,b):awaitasyncio.sleep(5)returna+bThen run worker:$aio_celeryworkerhello:appQueue some tasks:# publish.pyimportasynciofromhelloimportadd,appasyncdefpublish():asyncwithapp.setup():tasks=[add.delay(n,n)forninrange(50000)]awaitasyncio.gather(*tasks)asyncio.run(publish())$python3publish.pyThe last script concurrently publishes 50000 messages to RabbitMQ. 
It takes about 8 seconds to finish, with gives average publishing rate of about 6000 messages per second.Using Redis Result Backendimportasynciofromaio_celeryimportCeleryfromaio_celery.exceptionsimportTimeoutErrorapp=Celery()app.conf.update(result_backend="redis://localhost:6379",)@app.task(name="do-something")asyncdeffoo(x,y,z):awaitasyncio.sleep(5)returnx+y-zasyncdefmain():asyncwithapp.setup():result=awaitfoo.delay(1,2,3)try:value=awaitresult.get(timeout=10)exceptTimeoutError:print("Result is not ready after 10 seconds")else:print("Result is",value)if__name__=="__main__":asyncio.run(main())Adding contextimportcontextlibimportasyncpgfromaio_celeryimportCeleryapp=Celery()@app.define_app_context@contextlib.asynccontextmanagerasyncdefsetup_context():asyncwithasyncpg.create_pool("postgresql://localhost:5432",max_size=10)aspool:yield{"pool":pool}@app.taskasyncdefget_postgres_version():asyncwithapp.context["pool"].acquire()asconn:version=awaitconn.fetchval("SELECT version()")returnversionRetriesimportrandomfromaio_celeryimportCeleryapp=Celery()@app.task(name="add-two-numbers",bind=True,max_retries=3)asyncdefadd(self,a,b):ifrandom.random()>0.25:# Sends task to queue and raises `aio_celery.exception.Retry` exception.awaitself.retry(countdown=2)Priorities and QueuesSupport forRabbitMQ Message Priorities:importasynciofromaio_celeryimportCeleryapp=Celery()app.conf.update(task_default_priority=5,# global default for all taskstask_default_queue="queue-a",# global default for all taskstask_queue_max_priority=10,# sets `x-max-priority` argument for RabbitMQ Queue)@app.task(name="add-two-numbers",priority=6,# per task default (overrides global default)queue="queue-b",# per task default (overrider global default))asyncdefadd(a,b):awaitasyncio.sleep(3)returna+basyncdefmain():asyncwithapp.setup():awaitadd.apply_async(args=(2,3),priority=7,# overrides all defaultsqueue="queue-c",# overrides all defaults)if__name__=="__main__":asyncio.run(main())See alsoRabbitMQ documentationon priorities.Send unregistered task by nameimportasynciofromaio_celeryimportCeleryapp=Celery()app.conf.update(result_backend="redis://localhost:6379",)asyncdefmain():asyncwithapp.setup():result=awaitapp.send_task("add-two-numbers",args=(3,4),queue="high-priority",countdown=30,)print(awaitresult.get(timeout=5))if__name__=="__main__":asyncio.run(main())Register tasks using@shared_taskdecoratorAnalogous to original Celeryfeature, the@shared_taskdecorator lets you create tasks without having any concrete app instance:fromaio_celeryimportCelery,shared_task@shared_taskasyncdefadd(a,b):returna+bapp=Celery()# `add` task is already registered on `app` instanceReferencesSimilar Projectshttps://github.com/cr0hn/aiotaskshttps://github.com/the-wondersmith/celery-aio-poolhttps://github.com/kai3341/celery-pool-asyncioInspirationhttps://github.com/taskiq-python/taskiqRelevant Discussionshttps://github.com/celery/celery/issues/3884https://github.com/celery/celery/issues/7874https://github.com/anomaly/lab-python-server/issues/21https://github.com/anomaly/lab-python-server/issues/32
aiocells
aiocellsis a package that provides tools for synchronous and asynchronous execution of nodes in a dependency graph.Contents:ExamplesDevelopment InstallationExamplesHello worldHere is the code for thefirst demo.#!/usr/bin/env python3importaiocellsdefhello_world():print("Hello, world!")defmain():graph=aiocells.DependencyGraph()# The node can be any callable, in this case a function.graph.add_node(hello_world)aiocells.compute_sequential(graph)This issynchronousgraph computation. There is only one node in the graph. It is a function that prints a message. Synchronous nodes must becallable.Defining ordering constraintsHere isdemo 4. It shows how edges between nodes are defined:#!/usr/bin/env python3importtimeimportaiocellsdefmain():graph=aiocells.DependencyGraph()# 'add_node' always returns the node that has just been added, in this# case the lambda functions. We will use this below to define precedence# relationshipsprint_sleeping=graph.add_node(lambda:print("Sleeping..."))sleep=graph.add_node(lambda:time.sleep(2))print_woke_up=graph.add_node(lambda:print("Woke up!"))print("Define the precedence relationships...")graph.add_precedence(print_sleeping,sleep)graph.add_precedence(sleep,print_woke_up)# Now, after we've defined the precedence relationships, we use the# simplest computer to compute the graph. The nodes will be called in# an order that is consistent with the precedence relationships.# Specifically, the nodes are executed in topological order.aiocells.compute_sequential(graph)In this case, there are three nodes. After the nodes are added, we define precedence relationships between them. When the graph is computed, it is done so in a way that honours the precedence relationships.Asynchronous nodesBelow is the code fordemo_5. Note the use ofasyncio.sleep,functools.partialandaiocells.async_compute_sequential.#!/usr/bin/env python3importasynciofromfunctoolsimportpartialimportaiocells# This example demonstrates graph nodes that are coroutines. We use# a different computer; one that know how to deal with coroutines.defmain():graph=aiocells.DependencyGraph()# First, we add a lambda functionbefore_sleep=graph.add_node(lambda:print("Sleeping..."))# Second, we create a coroutine function using functools.partial. This# is the closest we can get to a lambda for an async functionsleep_2=partial(asyncio.sleep,2)# Finally, another lambda functionwake_up=graph.add_node(lambda:print("Woke up!"))# Here, 'sleep' will implicitly be added to the graph because it is# part of the precedence relationshipgraph.add_precedence(before_sleep,sleep_2)graph.add_precedence(sleep_2,wake_up)# Here, we use the `async_compute_sequential`, which, like# `compute_sequential`, call the nodes in a topologically correct sequence.# However, whereas `compute_sequential` only supports vanilla callables,# `async_compute_sequential` additionally supports coroutine functions,# as defined by `inspect.iscoroutinefunction`. However, the execution is# still sequential. Each coroutine function is executed using 'await' and# must complete before the next node is executed. The function# `async_compute_sequential` is a coroutine and must be awaited. 
Here,# we simply pass it to `asyncio.run`.asyncio.run(aiocells.async_compute_sequential(graph))Concurrent computationdemo 6is a an example of graph thatcouldbe computed concurrently but is not due to the use ifasync_compute_sequential.importasynciofromfunctoolsimportpartialimportaiocellsdefcreate_graph(stopwatch):graph=aiocells.DependencyGraph()# The method to start the stopwatchstart_stopwatch=stopwatch.start# Two sleeps. Note that they are asyncio.sleepsleep_1=partial(asyncio.sleep,1)sleep_2=partial(asyncio.sleep,2)# The method to stop the stopwatchstop_stopwatch=stopwatch.stop# Start the stopwatch before the first sleepgraph.add_precedence(start_stopwatch,sleep_1)# Stop the stopwatch after the first sleepgraph.add_precedence(sleep_1,stop_stopwatch)# Start the stopwatch before the second sleepgraph.add_precedence(start_stopwatch,sleep_2)# Stop the stopwatch after the second sleepgraph.add_precedence(sleep_2,stop_stopwatch)# Note that there is no precedence relationship between the two# sleeps.returngraphdefmain():stopwatch=aiocells.Stopwatch()graph=create_graph(stopwatch)# Even though the graph is a diamond (the sleeps do no depend on each# other and _could_ be executed concurrenty, `async_compute_sequential`# does not support concurrent execution. Thus, the execution time is# about 3 seconds, the sum of the two sleeps.print("Two async sleeps computed sequentially.")print("Total time should take about 3 seconds...")asyncio.run(aiocells.async_compute_sequential(graph))print("Computation with `async_compute_sequential` took"f"{stopwatch.elapsed_time()}")demo_7is the same graph as above but computed concurrently withasync_compute_concurrent.#!/usr/bin/env python3importasyncioimportaiocellsimportaiocells.demo_6asdemo_6defmain():stopwatch=aiocells.Stopwatch()graph=demo_6.create_graph(stopwatch)# Here, we run the same graph as the previous demo but we use# 'async_compute_concurrent' which will run the two sleeps concurrently.# Thus, the execution time will be around 2 seconds, the maximum of# the two sleeps.print("Running previous demo's graph concurrently.")print("Total execution time should be about 2 seconds...")asyncio.run(aiocells.async_compute_concurrent(graph))print("Computation with `async_compute_concurrent` took"f"{stopwatch.elapsed_time()}")Development InstallationThere is aMakefilein the repository. The default target will initialise a virtual environment, install dependencies into that environment and then test the code. It requiresPython 3.8,virtualenvandpipto be installed. If those are missing, it will print suggestions on how to address the problem.$makeActivating the virtual environment and running the demosThe default make target will generate a file calledactivate_aiocells. To activate the virtual environment:$sourceactivate_aiocellsOnce you've done that, you should have the following command available:$aiocellsdemo-1Tab completionactivate_aiocellswill enable tab completion foraiocells:$aiocells<TAB>Editable installationThe package will be installed in the virutal environment usingpip --editable. This means that modifications to the code will be immediately available.To test this, try modifyingsrc/aiocells/demo_1.pyto print a different message. You should be able to immediately run the demo and see the new message:$aiocellsdemo-1
aiocertstream
aiocertstreamAn async client to connect to certstream in Python.InstallationWindows:pip install aiocertstreamorpy -3 -m pip install aiocertstream*Nix:pip3 install aiocertstreamExample usageFor this example we'll just print out the cert index:fromaiocertstreamimportClientclient=Client()@client.listenasyncdefmy_handler(event:dict)->None:print(event["data"]["cert_index"])client.run()
aiocflib
No description available on PyPI.
aiocfscrape
A simple async Python module to bypass Cloudflare's anti-bot page. Based on aiohttp ClientSession. Solution was inherited fromcfscrapemodule.You could use it eg. with Python 3 andasynciofor concurrent crawling of web resources protected with CloudFlare.Table of ContentsInstallationBasic UsageDependenciesLicenseChangelog1.0.0 (2020-03-25)0.0.9 (2019-03-21)0.0.8 (2019-03-16)0.0.7 (2019-01-10)0.0.6 (2018-06-19)0.0.5 (2018-06-19)0.0.4 (2018-03-06)0.0.3 (2017-11-21)0.0.2 (2016-08-26)0.0.1 (2016-08-26)InstallationInstall with pippipinstallaiocfscrapeBasic Usageaiocfscrape is a aiohttp.ClientSession wrapper. Soaiohttp client referencecan be used as the base.To make simple get request do the following:importasynciofromaiocfscrapeimportCloudflareScraperasyncdeftest_open_page(url):asyncwithCloudflareScraper()assession:asyncwithsession.get(url)asresp:returnawaitresp.text()if__name__=='__main__':asyncio.run(test_open_page('<your url>'))DependenciesPython3.5.3+aiohttp>=3.1.3, <4ajs2pyLicenseaiocfscrape is offered under the MIT license.Changelog1.0.0 (2020-03-25)Update anti-bot integration to latest cfscrape version [Nachtalb]Constrain aiohttp to <4a because ofhttps://github.com/aio-libs/aiohttp/pull/3933[Nachtalb]Update package information [Nachtalb]0.0.9 (2019-03-21)Update anit-bot integration to latest cfscrape version [gaardiolor]0.0.8 (2019-03-16)Update anit-bot integration to latest cfscrape version [gaardiolor]0.0.7 (2019-01-10)Fix error when host does not return a “Server” header [pavlodvornikov]Update anit-bot integration to latest cfscrape version [gaardiolor]0.0.6 (2018-06-19)Update readme [pavlodvornikov]0.0.5 (2018-06-19)Fix aiohttp version 3 compatibility [ape364]Bump js2py [ape364]Update anti-bot integration to latest cfscrape version [dteh]0.0.4 (2018-03-06)Update anti-bot integration to latest cfscrape version [slazarov]0.0.3 (2017-11-21)Fix AttributError [ape364]0.0.2 (2016-08-26)Support timeout inside request [pavlodvornikov]Update package information [pavlodvornikov]Remove obsolete files [pavlodvornikov]0.0.1 (2016-08-26)Initial implementation
aioch
aiochaiochis a library for accessing a ClickHouse database over native interface from the asyncio. It wraps features ofclickhouse-driverfor asynchronous usage.InstallationThe package can be installed usingpip:pipinstallaiochTo install from source:gitclonehttps://github.com/mymarilyn/aiochcdaioch pythonsetup.pyinstallUsagefromdatetimeimportdatetimeimportasynciofromaiochimportClientasyncdefexec_progress():client=Client('localhost')progress=awaitclient.execute_with_progress('LONG AND COMPLICATED QUERY')timeout=20started_at=datetime.now()asyncfornum_rows,total_rowsinprogress:done=num_rows/total_rowsiftotal_rowselsetotal_rowsnow=datetime.now()# Cancel query if it takes more than 20 seconds to process 50% of rows.if(now-started_at).total_seconds()>timeoutanddone<0.5:awaitclient.cancel()breakelse:rv=awaitprogress.get_result()print(rv)asyncdefexec_no_progress():client=Client('localhost')rv=awaitclient.execute('LONG AND COMPLICATED QUERY')print(rv)loop=asyncio.get_event_loop()loop.run_until_complete(asyncio.wait([exec_progress(),exec_no_progress()]))For more information seeclickhouse-driverusage examples.Parametersexecutor- instance of custom Executor, if not supplied default executor will be usedloop- asyncio compatible event loopOther parameters are passing to wrapped clickhouse-driver's Client.Licenseaioch is distributed under theMIT license.
aio-ch
aiochaiochis a library for accessing a ClickHouse database over native interface from the asyncio. It wraps features ofclickhouse-driverfor asynchronous usage.InstallationThe package can be installed usingpip:pipinstallaio_chUsagefromdatetimeimportdatetimeimportasynciofromaiochimportClientasyncdefexec_progress():client=Client('localhost')progress=awaitclient.execute_with_progress('LONG AND COMPLICATED QUERY')timeout=20started_at=datetime.now()asyncfornum_rows,total_rowsinprogress:done=num_rows/total_rowsiftotal_rowselsetotal_rowsnow=datetime.now()# Cancel query if it takes more than 20 seconds to process 50% of rows.if(now-started_at).total_seconds()>timeoutanddone<0.5:awaitclient.cancel()breakelse:rv=awaitprogress.get_result()print(rv)asyncdefexec_no_progress():client=Client('localhost')rv=awaitclient.execute('LONG AND COMPLICATED QUERY')print(rv)loop=asyncio.get_event_loop()loop.run_until_complete(asyncio.wait([exec_progress(),exec_no_progress()]))For more information seeclickhouse-driverusage examples.Parametersexecutor- instance of custom Executor, if not supplied default executor will be usedloop- asyncio compatible event loopOther parameters are passing to wrapped clickhouse-driver's Client.Licenseaioch is distributed under theMIT license.
aioch2
aioch2aioch2is a library for accessing a ClickHouse database over native interface from the asyncio. It wraps features ofclickhouse-driverfor asynchronous usage.InstallationThe package can be installed usingpip:pipinstallaioch2Usagefromdatetimeimportdatetimeimportasynciofromaioch2importClientasyncdefexec_progress():client=Client('localhost')progress=awaitclient.execute_with_progress('LONG AND COMPLICATED QUERY')timeout=20started_at=datetime.now()asyncfornum_rows,total_rowsinprogress:done=num_rows/total_rowsiftotal_rowselsetotal_rowsnow=datetime.now()# Cancel query if it takes more than 20 seconds to process 50% of rows.if(now-started_at).total_seconds()>timeoutanddone<0.5:awaitclient.cancel()breakelse:rv=awaitprogress.get_result()print(rv)asyncdefexec_no_progress():client=Client('localhost')rv=awaitclient.execute('LONG AND COMPLICATED QUERY')print(rv)loop=asyncio.get_event_loop()loop.run_until_complete(asyncio.wait([exec_progress(),exec_no_progress()]))For more information seeclickhouse-driverusage examples.Parametersexecutor- instance of custom Executor, if not supplied default executor will be usedloop- asyncio compatible event loopOther parameters are passing to wrapped clickhouse-driver's Client.Licenseaioch2 is distributed under theMIT license.
aiochan
aiochanAiochan is a library written to bring the wonderful idiom ofCSP-styleconcurrency to python. The implementation is based on the battle-tested Clojure librarycore.async, while the API is carefully crafted to feel as pythonic as possible.Why?Doing concurrency in Python was painfulasyncio sometimes feels too low-levelI am constantly missing capabilities fromgolangandcore.asyncIt is much easier to portcore.asyncto Python than to port all thosewonderfulpythonpackagesto some other language.What am I getting?PythonicAPIthat includes everything you'd need for CSP-style concurrency programmingWorks seamlessly with existing asyncio-based librariesFullytestedFullydocumentedGuaranteed to work with Python 3.5.2 or above and PyPy 3.5 or aboveDepends only on python's core libraries, zero external dependenciesProven, efficient implementation based on Clojure's battle-testedcore.asyncFamiliar semantics for users ofgolang's channels and Clojure's core.async channelsFlexible implementation that does not depend on the inner workings of asyncio at allPermissivelylicensedAbeginner-friendly tutorialto get newcomers onboard as quickly as possibleHow to install?pip3installaiochanHow to use?Read thebeginner-friendly tutorialthat starts from the basics. Or if you are already experienced withgolangor Clojure'score.async, start with thequick introductionand then dive into theAPI documentation.I want to try it firstThequick introductionand thebeginner-friendly tutorialcan both be run in jupyter notebooks, online in binders if you want (just look forat the top of each tutorial).ExamplesIn addition to theintroductionand thetutorial, we have thecomplete set of examplesfrom Rob Pike'sGo concurrency patternstranslated into aiochan. Also, here is asolutionto the classicaldining philosophers problem.I still don't know how to use itWe are just starting out, but we will try to answer aiochan-related questions onstackoverflowas quickly as possible.I found a bugFile anissue, or if you think you can solve it, a pull request is even better.Do you use it in production? For what use cases?aiochanis definitely not a toy and we do use it in production, mainly in the two following scenarios:Complex data-flow in routing. We integrate aiochan with an asyncio-based web server. This should be easy to understand.Data-preparation piplelines. We prepare and pre-process data to feed into our machine learning algorithms as fast as possible so that our algorithms spend no time waiting for data to come in, but no faster than necessary so that we don't have a memory explosion due to data coming in faster than they can be consumed. For this we make heavy use ofparallel_pipeandparallel_pipe_unordered. Currently we are not aware of any other library that can completely satisfy this need of ours.What's up with the logo?It is our 'hello world' example:importaiochanasacasyncdefblue_python(c):whileTrue:# do some hard workproduct="a product made by the blue python"awaitc.put(product)asyncdefyellow_python(c):whileTrue:result=awaitc.get()# use result to do amazing thingsprint("A yellow python has received",result)asyncdefmain():c=ac.Chan()for_inrange(3):ac.go(blue_python(c))for_inrange(3):ac.go(yellow_python(c))in other words, it is a 3-fan-in on top of a 3-fan-out. If you run it, you will have an endless stream ofA yellow python has received a product made by the blue python.If you have no idea what this is, read thetutorial.
aiochannel
aiochannel - AsyncIO ChannelChannel concept for asyncio.Installpip install aiochannelChangelogChangelogUsageBasicsChannelhas a very similar API toasyncio.Queue. The key difference is that a channel is only considered "done" when it has been both closed and drained, so calling.join()on a channel will wait for it to be both closed and drained (UnlikeQueuewhich will return from.join()once the queue is empty).NOTE:Closing a channel is permanent. You cannot open it again.importasynciofromaiochannelimportChannel# ...asyncdefmain():# A Channel takes a max queue size and an loop# both optional. loop is not recommended as# in asyncio is phasing out explicitly passed event-loopmy_channel:Channel[str]=Channel(100)# You add items to the channel withawaitmy_channel.put("my item")# Note that this can throw ChannelClosed if the channel# is closed, during the attempt at adding the item# to the channel. Also note that .put() will block until# it can successfully add the item.# Retrieving is done withmy_item=awaitmy_channel.get()# Note that this can also throw ChannelClosed if the# channel is closed before or during retrival.# .get() will block until an item can be retrieved.# Note that this requires someone else to close and drain# the channel.# Lastly, you can close a channel with `my_channel.close()`# In this example, the event-loop call this asynchronouslyasyncio.get_event_loop().call_later(0.1,my_channel.close)# You can wait for the channel to be closed and drained:awaitmy_channel.join()# Every call to .put() after .close() will fail with# a ChannelClosed.# you can check if a channel is marked for closing withifmy_channel.closed():print("Channel is closed")asyncio.run(main())Like theasyncio.Queueyou can also call non-async get and put:# non-async version of putmy_channel.put_nowait(item)# This either returns None,# or raises ChannelClosed or ChannelFull# non-async version of getmy_channel.get_nowait()# This either returns the next item from the channel# or raises ChannelEmpty or ChannelClosed# (Note that ChannelClosed emplies that the channel# is empty, but also that is will never fill again)As of0.2.0Channelalso implements the async iterator protocol. You can now useasync forto iterate over the channel until it closes, without having to deal withChannelClosedexceptions.# the channel might contain data hereasyncforiteminchannel:print(item)# the channel is closed and empty herewhich is functionally equivalent towhileTrue:try:data=yield fromchannel.get()exceptChannelClosed:break# process data here
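To tie the pieces above together, here is a small producer/consumer sketch that uses only the calls shown in this section (put, close and async iteration); the channel size and item count are arbitrary.

```python
import asyncio

from aiochannel import Channel


async def producer(channel: Channel) -> None:
    for i in range(5):
        await channel.put(i)   # blocks if the channel is full
    channel.close()            # no more items will be accepted


async def consumer(channel: Channel) -> None:
    # Iteration ends automatically once the channel is closed and drained.
    async for item in channel:
        print("got", item)


async def main() -> None:
    channel: Channel[int] = Channel(2)
    await asyncio.gather(producer(channel), consumer(channel))


asyncio.run(main())
```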
aiochat
Server Usageimportaiochatimportaiohttp.webclassAgent(aiochat.ServerAgent):@aiochat.useasyncdefjoin(self,*args,delimit=','):returndelimit.join(map(str,args))app=aiohttp.web.Application()routes=aiohttp.web.RouteTableDef()@routes.get('/connect')asyncdefhandle(request):websocket=aiohttp.web.WebSocketResponse()awaitwebsocket.prepare(request)agent=Agent()awaitagent.start(websocket)value='eggs and bacon and salad'# remote callresult=awaitagent.split(value,delimit=' and ')print('result:',result)# wait until disconnectionawaitagent.wait()returnwebsocketapp.router.add_routes(routes)aiohttp.web.run_app(app)Client UsageimportaiochatimportaiohttpimportasyncioclassAgent(aiochat.ClientAgent):@aiochat.useasyncdefsplit(self,value,delimit=','):returnvalue.split(delimit)loop=asyncio.get_event_loop()url='http://localhost:8080/connect'asyncdefmain():session=aiohttp.ClientSession(loop=loop)asyncdefconnect():whilenotsession.closed:try:websocket=awaitsession.ws_connect(url)exceptaiohttp.ClientError:awaitasyncio.sleep(0.5)else:breakreturnwebsocketagent=Agent(connect)awaitagent.start()values=('crooked man','mile','sixpence','stile')# remote callresult=awaitagent.join(*values,delimit=' ')print('result:',result)# disconnectawaitagent.stop()awaitsession.close()coroutine=main()loop.run_until_complete(coroutine)DetailsClients will attempt to auto-reconnect until told to stop.Method names have to follow python function name limitations.The reconnection protocol reserves thehelloalertmethods.Implementation reserves thebindwaitstartstopmethods.Annotations are not considered; schema checking should be done manually.Keyword arguments cannot be passed in a positional manner and vice versa.WebSockets should not be used outside of Agent context while connected.Installingpython3 -m pip install aiochat
aioChatbase
aioChatbaseis a library forChatbase Generic Message APIwritten in Python 3.6 withasyncioandaiohttp. It helps to integrate Chatbase with your chatbot.How to installpython3.6 -m pip install aioChatbaseHow to useImport ChatbasefromaiochatbaseimportChatbaseCreate cb instancecb=Chatbase(API_KEY,BOT_PLATFORM)Register handled messageawaitcb.register_message(user_id='123456',intent='start')Register non-handled messageawaitcb.register_message(user_id='123456',intent='unknown message',not_handled=True)Register url clickawaitcb.register_click(url='google.com')Close instance on your app shutdownawaitcb.close()ExamplesCheck more examples at the/examplesfolderWikiFeel free to read ourWiki
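The snippets above show individual calls; a minimal sketch of putting them together inside one coroutine might look like this (API_KEY and BOT_PLATFORM are placeholders for your own values):

```python
import asyncio

from aiochatbase import Chatbase


async def main():
    # Placeholders: substitute your real Chatbase API key and platform name.
    cb = Chatbase("API_KEY", "BOT_PLATFORM")

    # Handled and non-handled messages, as in the snippets above.
    await cb.register_message(user_id='123456', intent='start')
    await cb.register_message(user_id='123456', intent='unknown message', not_handled=True)

    # A URL click.
    await cb.register_click(url='google.com')

    # Close the instance on app shutdown.
    await cb.close()


asyncio.run(main())
```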
aiochclient
aiochclientAn async http(s) ClickHouse client for python 3.6+ supporting type conversion in both directions, streaming, lazy decoding on select queries, and a fully typed interface.Table of ContentsInstallationQuick StartDocumentationType ConversionConnection Pool SettingsNotes on SpeedInstallationYou can use it with eitheraiohttporhttpxhttp connectors.To use withaiohttpinstall it with command:> pip install aiochclient[aiohttp]Oraiochclient[aiohttp-speedups]to install with extra speedups.To use withhttpxinstall it with command:> pip install aiochclient[httpx]Oraiochclient[httpx-speedups]to install with extra speedups.Installing with[*-speedups]adds the following:cChardetforaiohttpspeedupaiodnsforaiohttpspeedupciso8601for ultra-fast datetime parsing while decoding data from ClickHouse foraiohttpandhttpx.Additionally the installation process attempts to use Cython for a speed boost (roughly 30% faster).Quick StartConnecting to ClickHouseaiochclientneedsaiohttp.ClientSessionorhttpx.AsyncClientto connect to ClickHouse:fromaiochclientimportChClientfromaiohttpimportClientSessionasyncdefmain():asyncwithClientSession()ass:client=ChClient(s)assertawaitclient.is_alive()# returns True if connection is OkQuerying the databaseawaitclient.execute("CREATE TABLE t (a UInt8, b Tuple(Date, Nullable(Float32))) ENGINE = Memory")For INSERT queries you can pass values as*args. Values should be iterables:awaitclient.execute("INSERT INTO t VALUES",(1,(dt.date(2018,9,7),None)),(2,(dt.date(2018,9,8),3.14)),)For fetching all rows at once use thefetchmethod:all_rows=awaitclient.fetch("SELECT * FROM t")For fetching first row from result use thefetchrowmethod:row=awaitclient.fetchrow("SELECT * FROM t WHERE a=1")assertrow[0]==1assertrow["b"]==(dt.date(2018,9,7),None)You can also usefetchvalmethod, which returns first value of the first row from query result:val=awaitclient.fetchval("SELECT b FROM t WHERE a=2")assertval==(dt.date(2018,9,8),3.14)With async iteration on the query results stream you can fetch multiple rows without loading them all into memory at once:asyncforrowinclient.iterate("SELECT number, number*2 FROM system.numbers LIMIT 10000"):assertrow[0]*2==row[1]Usefetch/fetchrow/fetchval/iteratefor SELECT queries andexecuteor any of last for INSERT and all another queries.Working with query resultsAll fetch queries return rows as lightweight, memory efficient objects.Before v1.0.0rows were only returned as tuples.All rows have a full mapping interface, where you can get fields by names or indexes:row=awaitclient.fetchrow("SELECT a, b FROM t WHERE a=1")assertrow["a"]==1assertrow[0]==1assertrow[:]==(1,(dt.date(2018,9,8),3.14))assertlist(row.keys())==["a","b"]assertlist(row.values())==[1,(dt.date(2018,9,8),3.14)]DocumentationTo check out theapi docs, visit thereadthedocs site..Type Conversionaiochclientautomatically converts types from ClickHouse to python types and vice-versa.ClickHouse typePython typeBoolboolUInt8intUInt16intUInt32intUInt64intUInt128intUInt256intInt8intInt16intInt32intInt64intInt128intInt256intFloat32floatFloat64floatStringstrFixedStringstrEnum8strEnum16strDatedatetime.dateDateTimedatetime.datetimeDateTime64datetime.datetimeDecimaldecimal.DecimalDecimal32decimal.DecimalDecimal64decimal.DecimalDecimal128decimal.DecimalIPv4ipaddress.IPv4AddressIPv6ipaddress.IPv6AddressUUIDuuid.UUIDNothingNoneTuple(T1, T2, ...)Tuple[T1, T2, ...]Array(T)List[T]Nullable(T)NoneorTLowCardinality(T)TMap(T1, T2)Dict[T1, T2]Nested(T1, T2, ...)List[Tuple[T1, T2, ...], Tuple[T1, T2, ...]]Connection Pool 
Settings: aiochclient uses the aiohttp.TCPConnector to determine pool size. By default, the pool limit is 100 open connections. Notes on Speed: It is highly recommended to use uvloop and to install aiochclient with speedups for the best performance. Some recent benchmarks on our machines, without parallelization: 180k-220k rows/sec on SELECT; 50k-80k rows/sec on INSERT. Note: these benchmarks are system dependent.
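As an illustration of the pool setting described above, a larger limit can be configured through aiohttp's TCPConnector and passed to the session given to ChClient; the limit value here is arbitrary.

```python
import asyncio

from aiochclient import ChClient
from aiohttp import ClientSession, TCPConnector


async def main():
    # The pool size comes from aiohttp's TCPConnector; raise `limit`
    # if you need more than the default 100 concurrent connections.
    connector = TCPConnector(limit=200)
    async with ClientSession(connector=connector) as session:
        client = ChClient(session)
        print(await client.is_alive())


asyncio.run(main())
```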
aiocheck
aiocheckA python asyncio host health checker using native ping commands.Example:pip install aiocheck aiocheck 10.20.30.40 10.20.30.50 10.20.30.60stdout:########### # Running # ########### Addresses: ['10.20.30.50', '10.20.30.40', '10.20.30.60'] Press CTRL+C to exitaiocheck_log.csv:address, alive, timestamp 10.20.30.60, False, 2020-06-22 17:35:40.398753 10.20.30.40, False, 2020-06-22 17:35:40.398729 10.20.30.50, False, 2020-06-22 17:35:40.398660For further details visit theDocumentation.InstallUsing pippip install aiocheck aiocheck localhostUsing binary from GitHubgit clone https://github.com/kruserr/aiocheck.git cd aiocheck ./bin/aiocheck.exe localhostFor further install instructions visit theDocumentation.DevelopOpen in VS Codegit clone https://github.com/kruserr/aiocheck.git python -m pip install --upgrade pip setuptools wheel pytest tox twine pyinstaller cd aiocheck python -m pip install -e . code .Run VS Code TasksCTRL+SHIFT+BorCTRL+P>Tasks: Run TaskFor further developing instructions visit theDocumentation.
aiochorm
Async ClickHouse ORM. More information is available on GitHub: https://github.com/qvp/aiochorm
aiochris
aiochrisChRISPython client library built onaiohttp(async HTTP client) andpyserde(dataclassesdeserializer).InstallationRequires Python 3.10 or 3.11.pipinstallaiochris# orpoetryaddaiochrisFor convenience, container images are also provided.dockerpullghcr.io/fnndsc/aiochris:0.3.0Quick ExampleimportasynciofromaiochrisimportChrisClientasyncdefreadme_example():chris=awaitChrisClient.from_login(username='chris',password='chris1234',url='https://cube.chrisproject.org/api/v1/')dircopy=awaitchris.search_plugins(name_exact='pl-brainmgz',version='2.0.3').get_only()plinst=awaitdircopy.create_instance(compute_resource_name='host')feed=awaitplinst.get_feed()awaitfeed.set(name="hello, aiochris!")awaitchris.close()# do not forget to clean up!asyncio.run(readme_example())Documentation LinksClient documentation:https://fnndsc.github.io/aiochrisDeveloper documentation:https://github.com/FNNDSC/aiochris/wiki
aiochroma
AIOChromaAIOChromais an API wrapper for communication with Razer Chroma devices.Up till now, it is mostly used for thecustom Chroma integration for Home Assistant. But you are welcome to use it for your purposes, as well as suggest new features which you would like to use.A short presentation of the features can be found in thisYouTube video.InstallationInstallation of the latest release is available from PyPI:pip install aiochromaUsageThis section is still under development.Supported devicesThis list provides only the models tested by me or other users.Some of the devices might be in the group which you would not expect. This is not related to the integration but to the Chroma SDK.GroupDevicesChromalinkChroma Addressable RGB Controller(link*)Mousepads:Goliathus Extended Chroma(link)Services:AuraConnectHeadsetKraken 7.1 V2(link),Kraken X USB(link)KeyboardBlackWidow Chroma(link),BlackWidow Elite(link),BlackWidow V3 Pro(link)Cynosa Chroma(link)Huntsman Elite(link),Huntsman V2 Analog(link)KeypadTartarus V2(link)MouseBasilisk(link)DeathAdder V2 Pro(link)Mamba Tournament Edition(link)Viper Ultimate(link) (+Mouse Dock) (link)MousepadBase Station V2 Chroma(link)Firefly V2(link)Mouse Bungee V3 Chroma(link)* As an Amazon Associate I earn from qualifying purchases. Not like I ever got anything yet (:Support the developmentThis library is a free-time project. If you like it, you can support me by buying a coffee.
aiochrome
# aiochrome[![Build Status](https://travis-ci.org/fate0/aiochrome.svg?branch=master)](https://travis-ci.org/fate0/aiochrome)[![Codecov](https://img.shields.io/codecov/c/github/fate0/aiochrome.svg)](https://codecov.io/gh/fate0/aiochrome)[![Updates](https://pyup.io/repos/github/fate0/aiochrome/shield.svg)](https://pyup.io/repos/github/fate0/aiochrome/)[![PyPI](https://img.shields.io/pypi/v/aiochrome.svg)](https://pypi.python.org/pypi/aiochrome)[![PyPI](https://img.shields.io/pypi/pyversions/aiochrome.svg)](https://github.com/fate0/aiochrome)A Python Package for the Google Chrome Dev Protocol, [more document](https://fate0.github.io/aiochrome/)## Table of Contents* [Installation](#installation)* [Setup Chrome](#setup-chrome)* [Getting Started](#getting-started)* [Tab management](#tab-management)* [Debug](#debug)* [Examples](#examples)* [Ref](#ref)## InstallationTo install aiochrome, simply:```$ pip install -U aiochrome```or from GitHub:```$ pip install -U git+https://github.com/fate0/aiochrome.git```or from source:```$ python setup.py install```## Setup Chromesimply:```$ google-chrome --remote-debugging-port=9222```or headless mode (chrome version >= 59):```$ google-chrome --headless --disable-gpu --remote-debugging-port=9222```or use docker:```$ docker pull fate0/headless-chrome$ docker run -it --rm --cap-add=SYS_ADMIN -p9222:9222 fate0/headless-chrome```## Getting Started``` pythonimport asyncioimport aiochromeasync def main():# create a browser instancebrowser = aiochrome.Browser(url="http://127.0.0.1:9222")# create a tabtab = await browser.new_tab()# register callback if you wantasync def request_will_be_sent(**kwargs):print("loading: %s" % kwargs.get('request').get('url'))tab.Network.requestWillBeSent = request_will_be_sent# start the tabawait tab.start()# call methodawait tab.Network.enable()# call method with timeoutawait tab.Page.navigate(url="https://github.com/fate0/aiochrome", _timeout=5)# wait for loadingawait tab.wait(5)# stop the tab (stop handle events and stop recv message from chrome)await tab.stop()# close tabawait browser.close_tab(tab)loop = asyncio.get_event_loop()try:loop.run_until_complete(main())finally:loop.close()```or (alternate syntax)``` pythonimport asyncioimport aiochromeasync def main():browser = aiochrome.Browser(url="http://127.0.0.1:9222")tab = await browser.new_tab()async def request_will_be_sent(**kwargs):print("loading: %s" % kwargs.get('request').get('url'))tab.set_listener("Network.requestWillBeSent", request_will_be_sent)await tab.start()await tab.call_method("Network.enable")await tab.call_method("Page.navigate", url="https://github.com/fate0/aiochrome", _timeout=5)await tab.wait(5)await tab.stop()await browser.close_tab(tab)loop = asyncio.get_event_loop()try:loop.run_until_complete(main())finally:loop.close()```more methods or events could be found in[Chrome DevTools Protocol](https://chromedevtools.github.io/devtools-protocol/tot/)## Debugset DEBUG env variable:![aiochrome_with_debug_env](https://raw.githubusercontent.com/fate0/aiochrome/master/docs/images/aiochrome_with_debug_env.png)## Tab managementrun `aiochrome -h` for more infoexample:![aiochrome_tab_management](https://raw.githubusercontent.com/fate0/aiochrome/master/docs/images/aiochrome_tab_management.png)## Examplesplease see the [examples](http://github.com/fate0/aiochrome/blob/master/examples) directory for more examples## Ref* [chrome-remote-interface](https://github.com/cyrus-and/chrome-remote-interface/)* [Chrome DevTools 
Protocol](https://chromedevtools.github.io/devtools-protocol/tot/)
aiochsa
Clickhouse Python/asyncio library for use with SQLAlchemy coreExampleimportaiochsaimportsqlalchemyassatable=sa.Table('test',sa.MetaData(),sa.Column('id',sa.Integer),sa.Column('name',sa.String),)asyncwithaiochsa.connect('clickhouse://127.0.0.1:8123')asconn:awaitconn.execute(table.insert(),[{'id':1,'name':'Alice'},{'id':2,'name':'Bob'},],)rows=awaitconn.fetch(table.select())To addFINALmodifier usewith_hint(table, 'FINAL')(seeSQLAlchemy docs for details).Configure logging to show SQL:logging.getLogger('aiochsa.client.SQL').setLevel(logging.DEBUG)Custom type convertersHere is an example of installing converter for ClickHouse’sDateTimetype that requires and returns timezone-aware Python’sdatetimeobject and stores it as UTC:fromdatetimeimportdatetimeimportaiochsafromaiochsa.typesimportDateTimeUTCType,TypeRegistrytypes=TypeRegistry()types.register(DateTimeUTCType,['DateTime'],datetime)conn=aiochsa.connect(dsn,types=types)Change logSeeCHANGELOG.DevelopmentPrerequizites: Python (use pyenv to manage multiple versions), pip, tox, coverage, docker, docker-compose.Running tests:# Run whole tests matrix:tox# Run test with specific Python version only:tox-epy38# Test with specific Clickhouse version:tox-epy38----clickhouse-version=21.2.2.8# Run specified test(s):tox-epy38--tests/test_execute.py::test_aggregate_function
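Building on the FINAL note above, a sketch of adding the modifier to a query against the example table might look like this:

```python
import aiochsa
import sqlalchemy as sa

table = sa.Table(
    'test', sa.MetaData(),
    sa.Column('id', sa.Integer),
    sa.Column('name', sa.String),
)


async def fetch_final():
    async with aiochsa.connect('clickhouse://127.0.0.1:8123') as conn:
        # Append the ClickHouse FINAL modifier to the generated SELECT.
        query = table.select().with_hint(table, 'FINAL')
        return await conn.fetch(query)
```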
aioci
No description available on PyPI.
aiocian
aiocianDescriptionUnofficial library for interaction withCianContentsRelease Notes0.0.1Getting StartedInstallation from pipInstallation from GitHubQuick StartRelease NotesVersion 0.0.1Created libraryAdd simple searchGetting StartedInstallation from pipFor installation botovod library from pip you should have pip with python (prefer python3.6 or later)pipinstallaiocianInstallation from GitHubTo basic installation from GitHub repository you should have git, python3 (prefer python3.6 or later), pip (optionally) in your systemgitclonehttps://github.com/OlegYurchik/aiocian.gitcdaiocian pipinstall.orgitclonehttps://github.com/OlegYurchik/aiocian.gitcdaiocian pythonsetup.pyinstallQuick StartAfter installation, you can use the library in your code. Below is a sneak example of using the libraryfromaiocianimportBUY,CianClient,FLAT,SPBimportasyncioasyncdefmain():asyncwithCianClient()asclient:search=client.search(region=SPB,action=BUY,places=(FLAT,))asyncforresultinsearch:print(result.url)loop=asyncio.get_event_loop()loop.run_until_complete(main)loop.close()
aiocircuitbreaker
aiocircuitbreakerThis is an async Python implementation of thecircuitbreakerlibrary.InstallationThe project is available on PyPI. Simply run:$ pip install aiocircuitbreakerUsageThis is the simplest example. Just decorate a async function with the@circuitdecorator:from aiocircuitbreaker import circuit @circuit async def external_call(): ...This decorator sets up a circuit breaker with the default settings. The circuit breaker:monitors the function execution and counts failuresresets the failure count after every successful execution (while it is closed)opens and prevents further executions after 5 subsequent failuresswitches to half-open and allows one test-execution after 30 seconds recovery timeoutcloses if the test-execution succeededconsiders all raised exceptions (based on classException) as an expected failureis named “external_call” - the name of the function it decoratesWhat doesfailuremean?Afailureis a raised exception, which was not caught during the function call. By default, the circuit breaker listens for all exceptions based on the classException. That means, that all exceptions raised during the function call are considered as an “expected failure” and will increase the failure count.Get specific about the expected failureIt is important, to beas specific as possible, when defining the expected exception. The main purpose of a circuit breaker is to protect your distributed system from a cascading failure. That means, you probably want to open the circuit breaker only, if the integration point on the other end is unavailable. So e.g. if there is anConnectionErroror a requestTimeout.If you are e.g. using the requests library (http://docs.python-requests.org/) for making HTTP calls, itsRequestExceptionclass would be a great choice for theexpected_exceptionparameter.All recognized exceptions will be re-raised anyway, but the goal is, to let the circuit breaker only recognize those exceptions which are related to the communication to your integration point.ConfigurationThe following configuration options can be adjusted via decorator parameters. For example:from aiocircuitbreaker import circuit @circuit(failure_threshold=10, expected_exception=ConnectionError) async def external_call(): ...failure thresholdBy default, the circuit breaker opens after 5 subsequent failures. You can adjust this value with thefailure_thresholdparameter.recovery timeoutBy default, the circuit breaker stays open for 30 seconds to allow the integration point to recover. You can adjust this value with therecovery_timeoutparameter.expected exceptionBy default, the circuit breaker listens for all exceptions which are based on theExceptionclass. You can adjust this with theexpected_exceptionparameter. It can be either an exception class or a tuple of exception classes.nameBy default, the circuit breaker name is empty string. You can adjust the name with parametername.fallback functionBy default, the circuit breaker will raise aCircuitBreakerexception when the circuit is opened. You can instead specify a function (async function) to be called when the circuit is opened. 
This function can be specified with thefallback_functionparameter and will be called with the same parameters as the decorated function would be.Advanced UsageIf you apply circuit breakers to a couple of functions and you always set specific options other than the default values, you can extend theCircuitBreakerclass and create your own circuit breaker subclass instead:from aiocircuitbreaker import CircuitBreaker class MyCircuitBreaker(CircuitBreaker): FAILURE_THRESHOLD = 10 RECOVERY_TIMEOUT = 60 EXPECTED_EXCEPTION = RequestExceptionNow you have two options to apply your circuit breaker to a function. As an Object directly:@MyCircuitBreaker() async def external_call(): ...Please note, that the circuit breaker class has to be initialized, you have to use a class instance as decorator (@MyCircuitBreaker()), not the class itself (@MyCircuitBreaker).Or via the decorator proxy:@circuit(cls=MyCircuitBreaker) async def external_call(): ...
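As a sketch of the fallback_function option described above (the argument values and thresholds are illustrative only):

```python
from aiocircuitbreaker import circuit


async def fallback(*args, **kwargs):
    # Called instead of raising the CircuitBreaker exception while the
    # circuit is open; it receives the same arguments the decorated
    # function would have received.
    return None


@circuit(failure_threshold=3, fallback_function=fallback)
async def external_call(endpoint: str):
    ...
```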
aioclamd
aioclamdThis package is an asynchronous version of the pleasant packagepython-clamd. It has the same external API, only all methods are coroutines and all communication is handled asynchronously using theasyncioframework.TheClamdAsyncClientconnects to aClamAVantivirus instance and scans files and data for malicious threats. This package does not bundle ClamAV in any way, so a running instance of theclamddeamon is required.Installationpip install aioclamdUsageTo scan a file (on the system where ClamAV is installed):importasynciofromaioclamdimportClamdAsyncClientasyncdefmain(host,port):clamd=ClamdAsyncClient(host,port)print(awaitclamd.scan('/etc/clamav/clamd.conf'))asyncio.run(main("127.0.0.1",3310))# Output:# {'/etc/clamav/clamd.conf': ('OK', None)}To scan a data stream:importasyncioimportbase64fromioimportBytesIOfromaioclamdimportClamdAsyncClientEICAR=BytesIO(base64.b64decode(b"WDVPIVAlQEFQWzRcUFpYNTQoUF4pN0NDKTd9JEVJQ0FSLVNU"b"QU5EQVJELUFOVElWSVJVUy1URVNU\nLUZJTEUhJEgrSCo=\n"))asyncdefmain(host,port):clamd=ClamdAsyncClient(host,port)print(awaitclamd.instream(EICAR))asyncio.run(main("127.0.0.1",3310))# Output:# {'stream': ('FOUND', 'Win.Test.EICAR_HDB-1')}DevelopmentA local instance ofClamAVcan be had with Docker:dockerrun-p3310:3310--rmclamav/clamav
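Since each scan is awaitable, several paths can be checked concurrently with asyncio.gather. A small sketch (the file paths are examples only):

```python
import asyncio

from aioclamd import ClamdAsyncClient


async def scan_many(host, port, paths):
    clamd = ClamdAsyncClient(host, port)
    # One scan coroutine per path; the results are merged into one dict.
    results = await asyncio.gather(*(clamd.scan(p) for p in paths))
    return dict(pair for result in results for pair in result.items())


if __name__ == "__main__":
    files = ["/etc/clamav/clamd.conf", "/etc/clamav/freshclam.conf"]  # example paths
    print(asyncio.run(scan_many("127.0.0.1", 3310, files)))
```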
aioclaude-api
Claude AI-API ( Unofficial )This project provides an unofficial API for Claude AI, allowing users to access and interact with Claude AI .Current Version == 1.0.12Table of contentsUse CasesPrerequisitesInstallationUsageList All ConversationsSend MessageSend Message with attachmentDelete ConversationChat Conversation HistoryCreate New ChatReset All ConversationsRename ChatDisclaimerUse Cases1. Python Console ChatBot ( Check in usecases folder for sample console chatbot ) 2. Discord Chatbot 3. Many more can be done....PrerequisitesTo use this API, you need to have the following:Python installed on your system requests library installedpipinstallrequestsInstallationTo use the Claude AI Unofficial API, you can either clone the GitHub repository or directly download the Python file.Terminal :pip install aioclaude-apiorClone the repository:git clone https://github.com/AmoreForever/Claude-API.gitUsageImport the claude_api module in your Python script:from claude_api import ClientNext, you need to create an instance of the Client class by providing your Claude AI cookie:You can get cookie from the browser's developer tools network tab ( see for any claude.ai requests check out cookie ,copy whole value ) or storage tab ( You can find cookie of claude.ai ,there will be four values )(Checkout below image for the format of cookie ,It is Better to Use from network tab to grab cookie easily )cookie = os.environ.get('cookie') claude_api = Client(cookie)List All ConversationsTo list all the conversation Id's you had with Claude , you can use the list_all_conversations method:conversations = claude_api.list_all_conversations() for conversation in conversations: conversation_id = conversation['uuid'] print(conversation_id)Send MessageTo send a message to Claude, you can use the send_message method. You need to provide the prompt and the conversation ID:prompt = "Hello, Claude!" conversation_id = "<conversation_id>" or claude_api.create_new_chat()['uuid'] response = claude_api.send_message(prompt, conversation_id) print(response)Send Message with attachmentYou can send any type of attachment to claude to get responses using attachment argument in send_message(). Note: Claude currently supports only some file types.prompt = "Hey,Summarize me this document.!" 
conversation_id = "<conversation_id>" or claude_api.create_new_chat()['uuid'] response = claude_api.send_message(prompt, conversation_id,attachment="path/to/file.pdf") print(response)Delete ConversationTo delete a conversation, you can use the delete_conversation method:conversation_id = "<conversation_id>" deleted = claude_api.delete_conversation(conversation_id) if deleted: print("Conversation deleted successfully") else: print("Failed to delete conversation")Chat Conversation HistoryTo get the chat conversation history, you can use the chat_conversation_history method:conversation_id = "<conversation_id>" history = claude_api.chat_conversation_history(conversation_id) print(history)Create New ChatTo create a new chat conversation (id), you can use the create_new_chat method:new_chat = claude_api.create_new_chat() conversation_id = new_chat['uuid'] print(conversation_id)Reset All ConversationsTo reset all conversations, you can use the reset_all method:reset = claude_api.reset_all() if reset: print("All conversations reset successfully") else: print("Failed to reset conversations")Rename ChatTo rename a chat conversation, you can use the rename_chat method:conversation_id = "<conversation_id>" title = "New Chat Title" renamed = claude_api.rename_chat(title, conversation_id) if renamed: print("Chat conversation renamed successfully") else: print("Failed to rename chat conversation")DisclaimerThis project provides an unofficial API for Claude AI and is not affiliated with or endorsed by Claude AI or Anthropic. Use it at your own risk.Please refer to the official Claude AI documentation[https://claude.ai/docs] for more information on how to use Claude AI.
aiocleverbot
aiocleverbotAn async python wrapper for cleverbot. Does not require API KEYInstallationpip install aiocleverbotUsagefromaiocleverbotimportcleverbot# Without contextresponse=awaitcleverbot("Hello")print(response)# With context# Please note that context should include messages sent to Cleverbot as well as the responsesresponse=awaitcleverbot("Bad",["hi","How are you?"])print(response)
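Note that the snippet above uses await outside of a coroutine; a runnable variant wrapped in asyncio.run might look like this:

```python
import asyncio

from aiocleverbot import cleverbot


async def main():
    # Single exchange, no context.
    print(await cleverbot("Hello"))

    # With context: messages sent to Cleverbot alternating with its replies.
    context = ["hi", "How are you?"]
    print(await cleverbot("Bad", context))


asyncio.run(main())
```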
aiocli
Async cli client/commander frameworkaiocli is a Python library for simple and lightweight async console runner.Full compatibility with argparse module and highly inspired by aiohttp module.InstallationUse the package managerpipto install aiocli.pipinstallaiocliDocumentationVisitaiocli docs.UsagefromloggingimportgetLogger,Logger,StreamHandlerfromosimportgetenvfromaiocli.commanderimportrun_app,Application,Depends,Stateapp=Application(state={'envs':{'LOGGER_NAME':str(getenv('LOGGER_NAME','example_app')),'LOGGER_LEVEL':str(getenv('LOGGER_LEVEL','INFO')),}})def_get_logger(state:State)->Logger:logger=getLogger(state.get('envs')['LOGGER_NAME'])logger.setLevel(state.get('envs')['LOGGER_LEVEL'])handler=StreamHandler()logger.addHandler(handler)[email protected](name='greet:to',positionals=[('name',{'default':'World!'})])asyncdefhandle_greeting(name:str,logger:Logger=Depends(_get_logger))->int:logger.info(f'Hello{name}')[email protected](name='div',optionals=[('--a',{'type':float}),('--b',{'type':float})])asyncdefhandle_division(a:float,b:float,logger:Logger=Depends(_get_logger))->int:try:logger.info(f'Result{a}/{b}={(a/b)}')return0exceptBaseExceptionaserr:logger.error(f'Error:{err}')return1# python3 main.py <command> <positionals> <optionals>if__name__=='__main__':run_app(app)RequirementsPython >= 3.7ContributingPull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.Please make sure to update tests as appropriate.LicenseMIT
aioclickhouse
No description available on PyPI.
aio-clickhouse
aiochaiochis a library for accessing a ClickHouse database over native interface from the asyncio. It wraps features ofclickhouse-driverfor asynchronous usage.InstallationThe package can be installed usingpip:pipinstallaioclickhouseUsagefromdatetimeimportdatetimeimportasynciofromaiochimportClientasyncdefexec_progress():client=Client('localhost')progress=awaitclient.execute_with_progress('LONG AND COMPLICATED QUERY')timeout=20started_at=datetime.now()asyncfornum_rows,total_rowsinprogress:done=num_rows/total_rowsiftotal_rowselsetotal_rowsnow=datetime.now()# Cancel query if it takes more than 20 seconds to process 50% of rows.if(now-started_at).total_seconds()>timeoutanddone<0.5:awaitclient.cancel()breakelse:rv=awaitprogress.get_result()print(rv)asyncdefexec_no_progress():client=Client('localhost')rv=awaitclient.execute('LONG AND COMPLICATED QUERY')print(rv)loop=asyncio.get_event_loop()loop.run_until_complete(asyncio.wait([exec_progress(),exec_no_progress()]))For more information seeclickhouse-driverusage examples.Parametersexecutor- instance of custom Executor, if not supplied default executor will be usedloop- asyncio compatible event loopOther parameters are passing to wrapped clickhouse-driver's Client.Licenseaioch is distributed under theMIT license.
aioclient
aioclientInstallationpipinstallaioclientUsageimportaioclientasyncdefget_example():status,headers,body=awaitaioclient.get('https://www.example.com/')print(body)Changelogv0.1.0GET requests returnstatus, headers, bodytuplesv0.2.0Support OPTIONS, HEAD, POST, PUT, PATCH, and DELETE requestsDeserialize text/xml responses as XML ElementTreev0.2.1Fix project description
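The snippet above defines the coroutine but never runs it; one way to execute it is with asyncio.run:

```python
import asyncio

import aioclient


async def get_example():
    # Each verb returns a (status, headers, body) tuple.
    status, headers, body = await aioclient.get('https://www.example.com/')
    print(status, body[:80])


asyncio.run(get_example())
```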
aio-client
AIO-client. The aio-client package is intended for simple and fast integration of a user Django project with the SMEV 3 interaction subsystem (AIO). Change history:
1.8.5 (2023-12-28): (EDUCLLG-8117) Changed the aio-server example in the sample config and added sending a signal via robust_sender when debug mode is enabled.
1.8.4 (2023-10-05): (EDUSMEVTLS-524) Improved admin-panel search by the value of the "Request business data" field (the body field in the database). (EDUCLLG-7980) Added README.md, switched the CHANGELOG format to md.
1.8.3 (2023-08-25): (EDUSMEVTLS-512) Improved message uniqueness by origin_message_id.
1.8.2 (2022-09-12): (EDUSMEVTLS-405) Implemented an origin_message_id uniqueness check. (EDUSCHL-17922) Changed the ordering in aio_client.provider.api.get_requests so that earlier requests are processed first.
1.8.1 (2022-04-22): Added error logging when sending a request in the post_request function. Fixed a bug where, when receiving requests and responses, records not returned from the functions could be marked as "sent".
1.8.0 (2021-07-01): Received messages are now deleted with a single DELETE request to AIO (for aio_server >= 1.4.0). Added encoding declarations to migration files. The consumer MessageID is changed only if the message was created more than a day ago. Added encoding.
1.7.1 (2021-06-17): The timeout is applied to GET requests only; a timeout error sets the request status to Error.
1.7.0 (2021-05-21): Renamed the "Package Status" field. Fixed plural model names. Changed the field display order in the admin panel. Fixes to setting the "Message Status" when sending a request. Profiled the query that fetches Requests for re-sending. Added a timeout setting (in seconds) for requests to AIO (default 1 second).
1.6.1 (2021-04-25): Changes for working with EDS 2.12+.
1.6.0 (2020-12-28): Changes for working with ESh under Python 3.7.
1.5.2 (2020-12-23): Raised the maximum version of the requests dependency to 2.25.
1.5.1 (2020-12-03): Fixed backward compatibility with 1.4.x versions. Fixed the names of asynchronous periodic tasks. Added a function that hands responses to the consumer for processing. Added a function that marks a consumer response as a processing error. Added logging of the HTTP response from aio_server when an error occurs.
1.5.0 (2020-11-25): Added storing and receiving of the Error Code and Error Description fields in "Consumer. SMEV Response". Fixed config reading for periodic asynchronous tasks. Added indexes for the origin_message_id fields of the base models.
1.4.4 (2020-11-12): Added indexes on the message_id and state fields of the base models. Fixed the yadic configuration. Fixed the pyyaml unsafe-loader warning. Style fixes (fab src).
1.4.3 (2020-03-02): Added bulk status change of messages and filtering by status for the "Consumer. SMEV Response" model. Added the ability to configure expiry_date, the time after which an unsent message is assigned the error status. Fixed slow loading of objects in the admin panel caused by the "Request log" field. Added the Response Status field to the "Requests to SMEV", "SMEV responses to requests" and "SMEV Response" registries.
1.4.2 (2019-08-20): Fixed message_id generation.
1.4.1 (2019-08-16): Fixed model descriptions for Providers and Consumers.
1.4.0 (2019-04-12): Removed the dependency on the pip API.
1.3.0 (2019-03-06): Fixed sending messages to Sentry. Fixed saving requests to the Provider. Added several schedule types for periodic tasks. Added sending a signal when receiving data from the AIO system has finished.
1.1.1 (2018-10-30): Fixed the django.core.exceptions.FieldError error for Django < 1.10.
1.1.0 (2018-10-18): Added re-sending of POST requests on transport errors, and sending other errors to Sentry. If a POST request has not been sent within a day of its creation, it is assigned the error status. When the DEBUG_MODE setting is enabled, is_test_message=True is set on outgoing messages. When a consumer POST request is re-sent, a new message_id is assigned. In the admin panel, a "Re-send messages in the error status" action was added for POST Requests, for messages for which. Improved configuration of dependent objects for the client from user applications. Improved Python 3 compatibility. Added tests for compatibility with different Python and Django versions.
1.0.0 (2018-10-10): Version 1.0.0 released.
aio-clients
aiohttp clientWhat is the difference from aiohttp.Client?It is simpler and as a RequestsInstall beta:pipinstallaio-clientsExample:Base reqeust:importasynciofromaio_clientsimportHttp,Optionsasyncdefmain():r=awaitHttp().get('https://google.com',o=Options(is_json=False,is_close_session=True))print(f'code={r.code}body={r.body}')asyncio.run(main())Async reqeustimportasyncioimportaiohttpfromaio_clientsimportHttp,Optionsasyncdefon_request_start(session,trace_config_ctx,params):print("Starting request")asyncdefon_request_end(session,trace_config_ctx,params):print("Ending request")asyncdefmain():trace_config=aiohttp.TraceConfig()trace_config.on_request_start.append(on_request_start)trace_config.on_request_end.append(on_request_end)http=Http(host='https://google.com/search',option=Options(trace_config=trace_config,is_json=False),)r=awaitasyncio.gather(http.get(q_params={'q':'test'}),http.get(q_params={'q':'hello_world'}),http.get(q_params={'q':'ping'}),)print(f'status code={[i.codeforiinr]}body={[i.bodyforiinr]}')awaithttp.close()asyncio.run(main())Multipart reqeust:importasynciofromaio_clientsimportHttp,Optionsfromaio_clients.multipartimportEasy,Form,File,Writerasyncdefmain():withEasy('form-data')asform:form.add_form(Form(key='chat_id',value=12345123))form.add_form(Form(key='audio',value='hello world'))form.add_form(File(key='file',value=b'hello world file',file_name='test.py'))r=awaitHttp(option=Options(is_close_session=True,is_json=False)).post('http://localhost:8081',form=form,)writer=Writer()awaitform.write(writer)print(f'code={r.code}body={r.body}')print(f'full body:\n{writer.buffer.decode()}')asyncio.run(main())
aiocloud
Installing: python3 -m pip install aiocloud. Disclaimer: This library is very incomplete and created for a single project! That being said, I will probably get to finish it sometime soon <3
aiocloudflare
Inspired by the officalpython-cloudflarelibrary developed byCloudflare. This project is created to be compatible withasynciofor non-blocking IO.For sync code, it is recommanded to usepython-cloudflareviapip installpython-cloudflareas it is used by hundreds and offically maintained by Cloudflare. This ensure that APIs are always updated according to Cloudflare API release.NOTE:This library is in Beta, this means fixes and updates are still going on every second. Do not use it in Production unless you have tested on the API route specific to your use case and that would be at your own risk.Having said that, do submit an issue if you encounter any bug so we can move away from the Alpha stage sooner.Featuresasync http API call using modern http libraryhttpx.Autocompletion on IDE.Fully type hinted.Feature Roadmapto support cert tokento support sync API clientThese are some alternative use cases that are not in the top of my priority now as I have not received any request for. If you are interested, you may want to submit a pull request to contribute some of these features.RequirementsPython3.9+InstallationYou can installaiocloudflareviapipfromPyPI:$pipinstallaiocloudflareUsagefromaiocloudflareimportCloudflareasyncdefget_zone():asyncwithCloudflare()ascf:response=awaitcf.zones.get()Unlike the officalpython-cloudflarelibrary,aiocloudflaredoes not parse and handle http responses.So the awaited response object will have to be handled just as any http request, response pattern. theResponseobject is the same ashttpx’sResponse.fromaiocloudflareimportCloudflareasyncdefget_zone():asyncwithCloudflare()ascf:response=awaitcf.zones.get()# check status codeifresponse.status_code==200:# get json dataresp_json=response.json()# Cloudflare API typically store results in a ``result`` key.returnresp_json["result"]else:# to get texture data from responseprint(response.text)Full configuration can be done usingConfig()class.fromaioCloudflareimportCloudflare,Configconfig=Config(email="[email protected]",token="<secret>")# for demo only, do not hardcode secretsasyncdefget_zone():asyncwithCloudflare(config=config)ascf:result=awaitcf.zones.get()Configuration can also be stored in a.envfile for a “global” configuration without needing to create aConfig()class. Keys available are:CF_API_EMAIL="" CF_API_KEY="" CF_API_CERTKEY="" CF_API_URL="" DEBUG=false CF_PROFILE="" USER_AGENT=""Advance UsageYou may wish to wrapCloudflare()into you own class for customised settings or requirements. To do that, just provide a__aenter__()and__aexit__()method to your class like so.classMyCfClient:def__init__(self):self._config=Config(email="[email protected]",token="<secret>")# for demo only, do not hardcode secretsasyncdef__aenter__(self):self._client=Cloudflare(config=self._config)returnselfasyncdef__aexit__(self,exc_type,exc_value,traceback):awaitself._client.aclose()Then you can call your own class with async context manager.asyncwithMyCfClient()asown_class:awaitown_class.zones.get()ContributingContributions are very welcome. To learn more, see theContributor Guide.LicenseDistributed under the terms of theMIT license,aioCloudflareis free and open source software.IssuesIf you encounter any problems, pleasefile an issuealong with a detailed description.
aiocloudlibrary
aiocloudlibraryAsynchronous library to communicate with the Cloud Library APICloud Library API Example ScriptThis example script demonstrates how to utilize theCloudLibraryClientclass from theaiocloudlibrarymodule to interact with the Cloud Library API. It showcases various functionalities available within the client class to retrieve different types of data from the API.UsageInstallation:Ensure theaiocloudlibrarymodule is installed. If not, install it usingpip install aiocloudlibrary.Configuration:Replace the placeholder details (barcode/email, PIN/password, library ID) within the script with actual values.Functionality Displayed:current(): Fetches the current patron items.history(): Retrieves the patron's borrowing history.holds(): Retrieves the patron's holds.saved(): Retrieves the patron's saved items.featured(): Retrieves the patron's featured items.email(): Retrieves the patron's email settings.notifications(): Retrieves patron notifications (unread or archived).Execution:Run the script, and it will sequentially call these functions, displaying the retrieved data for each function.Important NotesReplace placeholder values (barcode, PIN, library ID) with your actual Cloud Library account details.Thenotifications()function demonstrates both fetching unread notifications and archiving specific notifications.Make sure you have appropriate access rights and permissions for the Cloud Library API endpoints being accessed.Feel free to modify the script as needed, adding error handling, logging, or additional functionalities based on your requirements and the Cloud Library API's capabilities."""Example Script: Accessing Cloud Library API with aiocloudlibraryThis script demonstrates fetching current patron items, borrowing history, holds, and saved itemsfrom the Cloud Library API using the aiocloudlibrary library.Make sure to replace '[email protected]', 'xxxxxxxx', and 'xxxxx' with actual credentials and library details.Requirements:- aiocloudlibrary library (installed via pip)Usage:Run the script and observe the output displaying various data from the Cloud Library API."""fromaiocloudlibraryimportCloudLibraryClientimportasyncioimportjsonasyncdefmain():client=CloudLibraryClient(barcode="[email protected]",# Replace with actual barcode/emailpin="xxxxxxxx",# Replace with actual PIN/passwordlibrary="xxxxx"# Replace with actual library ID)try:current=awaitclient.current()print(f"Current Patron Items:{json.dumps(current,indent=2)}")history=awaitclient.history()print(f"Patron's Borrowing History:{json.dumps(history,indent=2)}")holds=awaitclient.holds()print(f"Patron's Holds:{json.dumps(holds,indent=2)}")saved_items=awaitclient.saved()print(f"Patron's Saved Items:{json.dumps(saved_items,indent=2)}")featured_items=awaitclient.featured()print(f"Patron's Featured Items:{json.dumps(featured_items,indent=2)}")email_settings=awaitclient.email()print(f"Patron's Email Settings:{json.dumps(email_settings,indent=2)}")unread_notifications=awaitclient.notifications()print(f"Unread Notifications:{json.dumps(unread_notifications,indent=2)}")notification_ids=["notification_id_1","notification_id_2"]# Replace with actual notification IDsarchived_notifications=awaitclient.notifications(unread="false",notification_id_to_archive=notification_ids)print(f"Archived Notifications:{json.dumps(archived_notifications,indent=2)}")finally:awaitclient.close_session()asyncio.run(main())
aiocloudpayments
aiocloudpaymentsPython AsyncCloudPaymentsLibraryClient Basic Usage Examplefrom datetime import date from aiocloudpayments import AioCpClient async def main(): client = AioCpClient(PUBLIC_ID, API_SECRET) await client.test() await client.charge_card( amount=10, currency="RUB", invoice_id="1234567", ip_address="123.123.123.123", description="Payment for goods in example.com", account_id="user_x", name="CARDHOLDER NAME", card_cryptogram_packet="01492500008719030128SMfLeYdKp5dSQVIiO5l6ZCJiPdel4uDjdFTTz1UnXY+3QaZcNOW8lmXg0H670MclS4lI+qLkujKF4pR5Ri+T/E04Ufq3t5ntMUVLuZ998DLm+OVHV7FxIGR7snckpg47A73v7/y88Q5dxxvVZtDVi0qCcJAiZrgKLyLCqypnMfhjsgCEPF6d4OMzkgNQiynZvKysI2q+xc9cL0+CMmQTUPytnxX52k9qLNZ55cnE8kuLvqSK+TOG7Fz03moGcVvbb9XTg1oTDL4pl9rgkG3XvvTJOwol3JDxL1i6x+VpaRxpLJg0Zd9/9xRJOBMGmwAxo8/xyvGuAj85sxLJL6fA==", payer=Person( first_name="Test", last_name="Test", middle_name="Test", birth=date(1998, 1, 16), address="12A, 123", street="Test Avenue", city="LosTestels, City of Test Angels", country="Testland", phone="+1 111 11 11", post_code="101011010" ) ) await client.disconnect()AiohttpDispatcher Basic Usage Examplefrom aiocloudpayments import AiohttpDispatcher, Result from aiocloudpayments.types import PayNotification, CancelNotification, CheckNotification CERT_FILE = "cert.pem" CERT_FILE = "pkey.pem" def main(): dp = AiohttpDispatcher() # router is not needed here, but I am just showing the logic router = Router() # register with router @router.cancel(lambda n: 5 > n.amount > 1) async def foo(notification: CancelNotification): print(f"{notification=}") # return {"result": 0} return Result.OK # register with router @router.pay(lambda n: n.amount <= 1) async def foo(notification: PayNotification): print(f"{notification.amount=}") # return {"result": 0} return Result.OK # register with router @router.check() async def foo(notification: CheckNotification): print(f"{notification.amount=}") # return {"result": 12} return Result.WRONG_AMOUNT # register with dp @dp.cancel(lambda n: n.amount > 5) async def foo(notification: CancelNotification): print(f"{notification.amount=}, > 5") # if you don't return anything, Result.OK is assumed dp.include_router(router) ssl_context = SSLContext() ssl_context.load_cert_chain(CERT_FILE, KEY_FILE) dp.run_app( AioCpClient(PUBLIC_ID, API_SECRET), "/test", pay_path="/pay", cancel_path="/cancel", ssl_context=ssl_context, check_hmac=False # disable hmac check, only use in development environments )architecture inspired byaiogram
aiocloudstack
CloudStackAIO: a very thin Python CloudStack client using asyncio.
aioclustermanager
No description available on PyPI.
aiocmd
No description available on PyPI.
aiocmdline
aiocmdlinePython has a builtin module that provides a line-oriented command interpreter. However, the builtin module is difficult to use with async Python code. This module provides a ready-to-use class that simplifies the implementation of regular and async commands.Dependencies:aioreadlineExampleclass MyCmdline(AIOCmdline): def do_quit(self): self.stop_cmdloop() async def do_sleep(self, arg): await asyncio.sleep(int(arg)) print("sleep done") mycmdline = MyCmdline(prompt="mycmd> ", history=True) mycmdline.cmdloop()
aiocoap
The aiocoap package is an implementation of CoAP, theConstrained Application Protocol.It is written in Python 3 using itsnative asynciomethods to facilitate concurrent operations while maintaining an easy to use interface.UsageFor how to use the aiocoap library, have a look at theguidedtour, or at theexamplesandtoolsprovided.A full reference is available in theAPI documentation.All examples can be run directly from a source code copy. If you prefer to install it, the usual Python mechanisms apply (seeinstallation).Features / StandardsThis library supports the following standards in full or partially:RFC7252(CoAP): Supported for clients and servers. Multicast is supported on the server side, and partially for clients. DTLS is supported but experimental, and lacking some security properties. No caching is done inside the library.RFC7641(Observe): Basic support for clients and servers. Reordering, re-registration, and active cancellation are missing.RFC7959(Blockwise): Supported both for atomic and random access.RFC8323(TCP, WebSockets): Supports CoAP over TCP, TLS, and WebSockets (both over HTTP and HTTPS). The TLS parts are server-certificate only; preshared, raw public keys and client certificates are not supported yet.RFC7967(No-Response): Supported.RFC8132(PATCH/FETCH): Types and codes known, FETCH observation supported.RFC9176: A standalone resource directory server is provided along with a library function to register at one. They lack support for groups and security considerations, and are generally rather simplistic.RFC8613(OSCORE): Full support client-side; protected servers can be implemented based on it but are not automatic yet.draft-ietf-core-oscore-groupcomm-17(Group OSCORE): Supported for both group and pairwise mode in groups that are fully known. (The lack of an implemented joining or persistence mechanism makes this impractical for anything but experimentation.)If something described by one of the standards but not implemented, it is considered a bug; please file at thegithub issue tracker. (If it’s not on the list or in the excluded items, file a wishlist item at the same location).DependenciesBasic aiocoap works out of the box onPython3.7 or newer (also works onPyPy3). For full support (DTLS, OSCORE and link-format handling) follow theinstallationinstructions as these require additional libraries.aiocoap provides different network backends for different platforms. The most featureful backend is available for Linux, but most operations work on BSDs, Windows and macOS as well. See theFAQfor more details.If your library depends on aiocoap, it should pick the required extras (as perinstallation) and declare a dependency likeaiocoap[linkheader,oscore] >= 0.4b2.Developmentaiocoap tries to stay close toPEP8recommendations and general best practice, and should thus be easy to contribute to.Bugs (ranging from “design goal” and “wishlist” to typos) are currently tracked in thegithub issue tracker. Pull requests are welcome there; if you start working on larger changes, please coordinate on the issue tracker.Documentation is built usingsphinxwith./setup.py build_sphinx; hacks used there are described in./doc/README.doc.Unit tests are implemented in the./tests/directory and easiest run usingtox(though still available through./setup.py testfor the time being); complete test coverage is aimed for, but not yet complete (and might never be, as the error handling for pathological network partners is hard to trigger with a library designed not to misbehave). 
The tests are regularly run at the CI suite at gitlab, from where coverage reports are available.
Relevant URLs
https://github.com/chrysn/aiocoap: This is where the latest source code can be found, and bugs can be reported. Generally, this serves as the project web site.
http://aiocoap.readthedocs.org/: Online documentation built from the sources.
http://coap.technology/: Further general information on CoAP, the standard documents involved, and other implementations and tools available.
Licensing
aiocoap is published under the MIT License, and follows the best practice of reuse.software. Files in aiocoap/util/vendored/ may have different (but compatible and OSI approved) licenses. When using aiocoap for a publication, please cite it according to the output of ./setup.py cite [--bibtex].
Copyright Christian Amsüss and the aiocoap contributors. aiocoap was originally based on txThings by Maciej Wasilak. The full list of aiocoap contributors can be obtained from the version control history.
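To round off the overview, here is a minimal client sketch following the request pattern from aiocoap's documentation; the coap://coap.me/hello URI is only a placeholder for your own resource.

import asyncio
import aiocoap

async def main():
    # one client context can be reused for many requests
    protocol = await aiocoap.Context.create_client_context()
    # placeholder URI; point this at your own CoAP resource
    request = aiocoap.Message(code=aiocoap.GET, uri="coap://coap.me/hello")
    response = await protocol.request(request).response
    print(response.code, response.payload.decode(errors="replace"))

asyncio.run(main())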
aiocodeforces
No description available on PyPI.
aiocodingame
Use the codingame module instead. Install that module:
pip install codingame[async]
To create an asynchronous client:

import asyncio
import codingame

async def main():
    client = codingame.Client(is_async=True)

    # if you want to log in
    await client.login("[email protected]", "password")

    # get a codingamer
    codingamer = await client.get_codingamer("username")
    print(codingamer.pseudo)

    # get the global leaderboard
    global_leaderboard = await client.get_global_leaderboard()
    # print the pseudo of the top codingamer
    print(global_leaderboard.users[0].pseudo)

asyncio.run(main())
aiocogeo
aiocogeoInstallationpip install aiocogeo # With S3 filesystem pip install aiocogeo[s3]UsageCOGs are opened using theCOGReaderasynchronous context manager:fromaiocogeoimportCOGReaderasyncwithCOGReader("http://cog.tif")ascog:...Several filesystems are supported:HTTP/HTTPS(http://,https://)S3(s3://)File(/)MetadataGenerating arasterio-style profilefor the COG:asyncwithCOGReader("https://async-cog-reader-test-data.s3.amazonaws.com/lzw_cog.tif")ascog:print(cog.profile)>>>{'driver':'GTiff','width':10280,'height':12190,'count':3,'dtype':'uint8','transform':Affine(0.6,0.0,367188.0,0.0,-0.6,3777102.0),'blockxsize':512,'blockysize':512,'compress':'lzw','interleave':'pixel','crs':'EPSG:26911','tiled':True,'photometric':'rgb'}Lower Level MetadataA COG is composed of several IFDs, each with many TIFF tags:fromaiocogeo.ifdimportIFDfromaiocogeo.tagimportTagasyncwithCOGReader("https://async-cog-reader-test-data.s3.amazonaws.com/lzw_cog.tif")ascog:forifdincog:assertisinstance(ifd,IFD)fortaginifd:assertisinstance(tag,Tag)Each IFD contains more granular metadata about the image than what is included in the profile. For example, finding the tilesize for each IFD:asyncwithCOGReader("https://async-cog-reader-test-data.s3.amazonaws.com/lzw_cog.tif")ascog:forifdincog:print(ifd.TileWidth.value,ifd.TileHeight.value)>>>512512128128128128128128128128128128More advanced use cases may need access to tag-level metadata:asyncwithCOGReader("https://async-cog-reader-test-data.s3.amazonaws.com/lzw_cog.tif")ascog:first_ifd=cog.ifds[0]assertfirst_ifd.tag_count==24fortaginfirst_ifd:print(tag)>>>Tag(code=258,name='BitsPerSample',tag_type=TagType(format='H',size=2),count=3,length=6,value=(8,8,8))Tag(code=259,name='Compression',tag_type=TagType(format='H',size=2),count=1,length=2,value=5)Tag(code=257,name='ImageHeight',tag_type=TagType(format='H',size=2),count=1,length=2,value=12190)Tag(code=256,name='ImageWidth',tag_type=TagType(format='H',size=2),count=1,length=2,value=10280)...Image DataThe reader also has methods for reading internal image tiles and performing partial reads. 
Currently only jpeg, lzw, deflate, packbits, and webp compressions are supported.Image TilesReading the top left tile of an image at native resolution:asyncwithCOGReader("https://async-cog-reader-test-data.s3.amazonaws.com/webp_cog.tif")ascog:x=y=z=0tile=awaitcog.get_tile(x,y,z)ifd=cog.ifds[z]asserttile.shape==(ifd.bands,ifd.TileHeight.value,ifd.TileWidth.value)Partial ReadYou can read a portion of the image by specifying a bounding box in the native crs of the image and an output shape:asyncwithCOGReader("https://async-cog-reader-test-data.s3.amazonaws.com/webp_cog.tif")ascog:assertcog.epsg==26911partial_data=awaitcog.read(bounds=(368461,3770591,368796,3770921),shape=(512,512))Internal MasksIf the COG has an internal mask, the returned array will be a masked array:importnumpyasnpasyncwithCOGReader("https://async-cog-reader-test-data.s3.amazonaws.com/naip_image_masked.tif")ascog:assertcog.is_maskedtile=awaitcog.get_tile(0,0,0)assertnp.ma.is_masked(tile)ConfigurationConfiguration options are exposed through environment variables:INGESTED_BYTES_AT_OPEN- defines the number of bytes in the first GET request at file opening (defaults to 16KB)HEADER_CHUNK_SIZE- chunk size used to read header (defaults to 16KB)ENABLE_BLOCK_CACHE- determines if image blocks are cached in memory (defaults to TRUE)ENABLE_HEADER_CACHE- determines if COG headers are cached in memory (defaults to TRUE)HTTP_MERGE_CONSECUTIVE_RANGES- determines if consecutive ranges are merged into a single request (defaults to FALSE)BOUNDLESS_READ- determines if internal tiles outside the bounds of the IFD are read (defaults to TRUE)BOUNDLESS_READ_FILL_VALUE- determines the value used to fill boundless reads (defaults to 0)LOG_LEVEL- determines the log level used by the package (defaults to ERROR)VERBOSE_LOGS- enables verbose logging, designed for use whenLOG_LEVEL=DEBUG(defaults to FALSE)AWS_REQUEST_PAYER- set torequesterto enable reading from S3 RequesterPays buckets.Refer toaiocogeo/config.pyfor more details about configuration options.CLI$ aiocogeo --help Usage: aiocogeo [OPTIONS] COMMAND [ARGS]... Options: --install-completion [bash|zsh|fish|powershell|pwsh] Install completion for the specified shell. --show-completion [bash|zsh|fish|powershell|pwsh] Show completion for the specified shell, to copy it or customize the installation. --help Show this message and exit. Commands: create-tms Create OGC TileMatrixSet. info Read COG metadata.
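As a small illustration of the configuration options listed above, the sketch below sets two of the documented environment variables before opening a file. It assumes the variables are read when the reader is used, and the URL is a placeholder.

import asyncio
import os

# configuration is picked up from environment variables (see the list above);
# the values here are only examples
os.environ["INGESTED_BYTES_AT_OPEN"] = "32768"        # read 32KB at open instead of the 16KB default
os.environ["HTTP_MERGE_CONSECUTIVE_RANGES"] = "TRUE"  # merge consecutive range requests

from aiocogeo import COGReader  # imported after setting the environment

async def main():
    # placeholder URL; any of the supported filesystems (http, s3, local) works
    async with COGReader("https://example.com/some_cog.tif") as cog:
        print(cog.profile)

asyncio.run(main())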
aiocogeo-tiler
aiocogeo tiler
aiocoingecko
Asynchronous CoinGecko API wrapper: a Python 3 wrapper around the CoinGecko API (V3).
Features: 100% API implementation and full Pythonic documentation.
Installation (PyPI):
pip install aiocoingecko
Usage:

import asyncio
from aiocoingecko import AsyncCoinGeckoAPISession

async def main():
    async with AsyncCoinGeckoAPISession() as cg:
        print(await cg.ping())

asyncio.run(main())
aiocombiner
No description available on PyPI.
aiocomelit
aiocomelit
Python library to control Comelit Simplehome.
Installation
Install this via pip (or your favourite package manager):
pip install aiocomelit
Contributors ✨
Thanks goes to these wonderful people (emoji key). This project follows the all-contributors specification. Contributions of any kind welcome!
Credits
This package was created with Copier and the browniebroke/pypackage-template project template.
aiocometd
aiocometdaiocometd is aCometDclient built usingasyncio, implementing theBayeuxprotocol.CometDis a scalable WebSocket and HTTP based event and message routing bus.CometDmakes use of WebSocket and HTTP push technologies known asCometto provide low-latency data from the server to browsers and client applications.FeaturesSupported transports:long-pollingwebsocketAutomatic reconnection after network failuresExtensionsUsageimportasynciofromaiocometdimportClientasyncdefchat():nickname="John"# connect to the serverasyncwithClient("http://example.com/cometd")asclient:# subscribe to channels to receive chat messages and# notifications about new membersawaitclient.subscribe("/chat/demo")awaitclient.subscribe("/members/demo")# send initial messageawaitclient.publish("/chat/demo",{"user":nickname,"membership":"join","chat":nickname+" has joined"})# add the user to the chat room's membersawaitclient.publish("/service/members",{"user":nickname,"room":"/chat/demo"})# listen for incoming messagesasyncformessageinclient:ifmessage["channel"]=="/chat/demo":data=message["data"]print(f"{data['user']}:{data['chat']}")if__name__=="__main__":loop=asyncio.get_event_loop()loop.run_until_complete(chat())For more detailed usage examples take a look at thecommand line chat exampleor for a more complex example with a GUI check out theaiocometd-chat-demo.Documentationhttps://aiocometd.readthedocs.io/Changelog0.4.5 (2019-03-14)Fix connection issues when used with reverse proxy servers with cookie based sticky sessions0.4.4 (2019-02-26)Refactor the websocket transport implementation to use a single connection per client0.4.3 (2019-02-12)Fix reconnection issue on Salesforce Streaming API0.4.2 (2019-01-15)Fix the handling of invalid websocket transport responsesFix the handling of failed subscription responses0.4.1 (2019-01-04)Add documentation links0.4.0 (2019-01-04)Add type hintsAdd integration tests0.3.1 (2018-06-15)Fix premature request timeout issue0.3.0 (2018-05-04)Enable the usage of third party JSON librariesFix detection and recovery from network failures0.2.3 (2018-04-24)Fix RST rendering issues0.2.2 (2018-04-24)Fix documentation typosImprove examplesReorganise documentation0.2.1 (2018-04-21)Add PyPI badge to README0.2.0 (2018-04-21)Supported transports:long-pollingwebsocketAutomatic reconnection after network failuresExtensions
aiocometd-ng
aiocometdaiocometd is aCometDclient built usingasyncio, implementing theBayeuxprotocol.CometDis a scalable WebSocket and HTTP based event and message routing bus.CometDmakes use of WebSocket and HTTP push technologies known asCometto provide low-latency data from the server to browsers and client applications.FeaturesSupported transports:long-pollingwebsocketAutomatic reconnection after network failuresExtensionsUsageimportasynciofromaiocometdimportClientasyncdefchat():nickname="John"# connect to the serverasyncwithClient("http://example.com/cometd")asclient:# subscribe to channels to receive chat messages and# notifications about new membersawaitclient.subscribe("/chat/demo")awaitclient.subscribe("/members/demo")# send initial messageawaitclient.publish("/chat/demo",{"user":nickname,"membership":"join","chat":nickname+" has joined"})# add the user to the chat room's membersawaitclient.publish("/service/members",{"user":nickname,"room":"/chat/demo"})# listen for incoming messagesasyncformessageinclient:ifmessage["channel"]=="/chat/demo":data=message["data"]print(f"{data['user']}:{data['chat']}")if__name__=="__main__":loop=asyncio.get_event_loop()loop.run_until_complete(chat())For more detailed usage examples take a look at thecommand line chat exampleor for a more complex example with a GUI check out theaiocometd-chat-demo.Documentationhttps://aiocometd.readthedocs.io/Changelog0.4.5 (2019-03-14)Fix connection issues when used with reverse proxy servers with cookie based sticky sessions0.4.4 (2019-02-26)Refactor the websocket transport implementation to use a single connection per client0.4.3 (2019-02-12)Fix reconnection issue on Salesforce Streaming API0.4.2 (2019-01-15)Fix the handling of invalid websocket transport responsesFix the handling of failed subscription responses0.4.1 (2019-01-04)Add documentation links0.4.0 (2019-01-04)Add type hintsAdd integration tests0.3.1 (2018-06-15)Fix premature request timeout issue0.3.0 (2018-05-04)Enable the usage of third party JSON librariesFix detection and recovery from network failures0.2.3 (2018-04-24)Fix RST rendering issues0.2.2 (2018-04-24)Fix documentation typosImprove examplesReorganise documentation0.2.1 (2018-04-21)Add PyPI badge to README0.2.0 (2018-04-21)Supported transports:long-pollingwebsocketAutomatic reconnection after network failuresExtensions
aiocometd-noloop
A CometD client, updated for Python 3.10.
About the Project
aiocometd-noloop is an updated version of the aiocometd Python package, which is a Python client for CometD. This package is updated for compatibility with Python 3.10, allowing it to be used with more modern Python features and projects.
Getting Started
aiocometd-noloop is available on PyPI. To use it in your project, run:
pip install aiocometd-noloop
Basic Usage
See the original repository for usage examples and any further details.
License
Distributed under the MIT License. See LICENSE for further details.
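Since the package is described as an updated aiocometd, a usage sketch would presumably mirror the upstream client API shown under the aiocometd entry above; the module name, endpoint and channel below are assumptions, not taken from this package's documentation.

import asyncio
from aiocometd_noloop import Client  # module name assumed; it may still import as aiocometd

async def main():
    # placeholder CometD endpoint
    async with Client("http://example.com/cometd") as client:
        await client.subscribe("/chat/demo")
        async for message in client:
            print(message["channel"], message["data"])

asyncio.run(main())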
aiocomfoconnect
aiocomfoconnectaiocomfoconnectis an asyncio Python 3 library for communicating with a Zehnder ComfoAir Q350/450/600 ventilation system. It's the successor ofcomfoconnect.It's compatible with Python 3.10 and higher.Installationpip3installaiocomfoconnectCLI Usage$python-maiocomfoconnect--help $python-maiocomfoconnectdiscover $python-maiocomfoconnectregister--host192.168.1.213 $python-maiocomfoconnectset-speedaway--host192.168.1.213 $python-maiocomfoconnectset-speedlow--host192.168.1.213 $python-maiocomfoconnectset-modeauto--host192.168.1.213 $python-maiocomfoconnectset-speedmedium--host192.168.1.213 $python-maiocomfoconnectset-speedhigh--host192.168.1.213 $python-maiocomfoconnectshow-sensors--host192.168.1.213 $python-maiocomfoconnectshow-sensor276--host192.168.1.213 $python-maiocomfoconnectshow-sensor276--host192.168.1.213-f $python-maiocomfoconnectget-property--host192.168.1.2131189# Unit 0x01, SubUnit 0x01, Property 0x08, Type STRING. See PROTOCOL-RMI.mdAvailable methodsasync connect(): Connect to the bridge.async disconnect(): Disconnect from the bridge.async register_sensor(sensor): Register a sensor.async deregister_sensor(sensor): Deregister a sensor.async get_mode(): Get the ventilation mode.async set_mode(mode): Set the ventilation mode. (auto / manual)async get_speed(): Get the ventilation speed.async set_speed(speed): Set the ventilation speed. (away / low / medium / high)async get_bypass(): Get the bypass mode.async set_bypass(mode, timeout=-1): Set the bypass mode. (auto / on / off)async get_balance_mode(): Get the balance mode.async set_balance_mode(mode, timeout=-1): Set the balance mode. (balance / supply only / exhaust only)async get_boost(): Get the boost mode.async set_boost(mode, timeout=-1): Set the boost mode. (boolean)async get_away(): Get the away mode.async set_away(mode, timeout=-1): Set the away mode. (boolean)async get_temperature_profile(): Get the temperature profile.async set_temperature_profile(profile): Set the temperature profile. (warm / normal / cool)async get_sensor_ventmode_temperature_passive(): Get the sensor based ventilation passive temperature control setting.async set_sensor_ventmode_temperature_passive(mode): Set the sensor based ventilation passive temperature control setting. (auto / on / off)async get_sensor_ventmode_humidity_comfort(): Get the sensor based ventilation humidity comfort setting.async set_sensor_ventmode_humidity_comfort(mode): Set the sensor based ventilation humidity comfort setting. (auto / on / off)async get_sensor_ventmode_humidity_protection(): Get the sensor based ventilation humidity protection setting.async set_sensor_ventmode_humidity_protection(mode): Set the sensor based ventilation humidity protection setting. 
(auto / on / off)Low-level APIasync cmd_start_session(): Start a session.async cmd_close_session(): Close a session.async cmd_list_registered_apps(): List registered apps.async cmd_register_app(uuid, device_name, pin): Register an app.async cmd_deregister_app(uuid): Deregister an app.async cmd_version_request(): Request the bridge's version.async cmd_time_request(): Request the bridge's time.async cmd_rmi_request(message, node_id): Send a RMI request.async cmd_rpdo_request(pdid, type, zone, timeout): Send a RPDO request.async cmd_keepalive(): Send a keepalive message.ExamplesDiscovery of ComfoConnect LAN C Bridgesimportasynciofromaiocomfoconnectimportdiscover_bridgesasyncdefmain():""" ComfoConnect LAN C Bridge discovery example."""# Discover all ComfoConnect LAN C Bridges on the subnet.bridges=awaitdiscover_bridges()print(bridges)if__name__=="__main__":asyncio.run(main())Basic ExampleimportasynciofromaiocomfoconnectimportComfoConnectfromaiocomfoconnect.constimportVentilationSpeedfromaiocomfoconnect.sensorsimportSENSORSasyncdefmain(local_uuid,host,uuid):""" Basic example."""defsensor_callback(sensor,value):""" Print sensor updates. """print(f"{sensor.name}={value}")# Connect to the Bridgecomfoconnect=ComfoConnect(host,uuid,sensor_callback=sensor_callback)awaitcomfoconnect.connect(local_uuid)# Register all sensorsforkeyinSENSORS:awaitcomfoconnect.register_sensor(SENSORS[key])# Set speed to LOWawaitcomfoconnect.set_speed(VentilationSpeed.LOW)# Wait 2 minutes so we can see some sensor updatesawaitasyncio.sleep(120)# Disconnect from the bridgeawaitcomfoconnect.disconnect()if__name__=="__main__":asyncio.run(main(local_uuid='00000000000000000000000000001337',host='192.168.1.20',uuid='00000000000000000000000000000055'))# Replace with your bridge's IP and UUIDDevelopment NotesProtocol DocumentationComfoConnect LAN C ProtocolPDO SensorsRMI commandsDecode network trafficYou can use thescripts/decode_pcap.pyfile to decode network traffic between the Mobile App and the ComfoConnect LAN C. Make sure that the first TCP session in the capture is the connection between the bridge and the app. It's therefore recommended to start the capture before you open the app.$sudotcpdump-iany-s0-w/tmp/capture.pcaptcpandport56747$python3script/decode_pcap.py/tmp/capture.pcapGenerate zehnder_pb2.py filepip3installgrpcio-tools python3-mgrpc_tools.protoc-Iprotobufprotobuf/nanopb.proto--python_out=aiocomfoconnect/protobuf python3-mgrpc_tools.protoc-Iprotobufprotobuf/zehnder.proto--python_out=aiocomfoconnect/protobuf
aioconcurrency
aioconcurrency
Run a coroutine with each item in an iterable, concurrently.
Install
pip install aioconcurrency
Usage example

import asyncio
import aioconcurrency

items = [1, 2, 3, 4]

async def f(item):
    return item * 2

async def main():
    await aioconcurrency.map(items, f, concurrency=2)  # Returns [2, 4, 6, 8]

    async for result in aioconcurrency.each(items, f, concurrency=2):
        print(result)  # Prints 2 4 6 8 in random order

asyncio.run(main())

Api
aioconcurrency.map
Runs the given coroutine concurrently with each item in an iterable. The list of the return values will be ordered as if run serially.
items: An iterable object.
coro: Coroutine to feed each item to.
optional concurrency: Number of concurrent runs of coro. Defaults to aioconcurrency.Infinite.
optional executor: Can be an instance of ThreadPoolExecutor.
optional loop: The asyncio event loop that will be used.
aioconcurrency.each
Runs the given coroutine concurrently with each item in an iterable. Returns a generator that may be used to iterate over the return values. The generator yields values as soon as they are available.
items: An iterable object. If an asyncio.Queue is passed then .each will read from it indefinitely.
coro: Coroutine to feed each item to.
optional concurrency: Number of concurrent runs of coro. Defaults to aioconcurrency.Infinite.
optional executor: Can be an instance of ThreadPoolExecutor.
optional loop: The asyncio event loop that will be used.
optional discard_results: If truthy, discard the return value of coro. Defaults to false.
property wait(): Coroutine. May be used to wait until all items have been processed.
property processed_count: The number of items that have been processed so far.
property cancel(): Cancels all runs of coro.
Tests
pytest .
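The sketch below illustrates the wait(), processed_count and discard_results options documented above; it is a reading of the documented parameters and properties, not a verified recipe, so behaviour should be checked against the package itself.

import asyncio
import aioconcurrency

async def save(item):
    # pretend to persist the item somewhere
    await asyncio.sleep(0.1)

async def main():
    items = ["a", "b", "c", "d"]
    # discard_results=True because only the side effects matter here
    runner = aioconcurrency.each(items, save, concurrency=2, discard_results=True)
    await runner.wait()            # wait until all items have been processed
    print(runner.processed_count)  # expected: 4

asyncio.run(main())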