package | package-description |
---|---|
aiosql | SQL is code.
Write it, version control it, comment it, and run it using files.
Writing your SQL code in Python programs as strings doesn't allow you to easily
reuse it in SQL GUIs or CLI tools like psql.
With aiosql you can organize your SQL statements in .sql files and load them
into your Python application as methods to call, without losing the ability to
use them as you would any other SQL file. This project supports standard PEP 249 and asyncio-based drivers for SQLite (sqlite3, aiosqlite, apsw), PostgreSQL (psycopg (3), psycopg2, pg8000, pygresql, asyncpg), MySQL (PyMySQL, mysqlclient, mysql-connector), MariaDB (mariadb) and DuckDB (duckdb),
out of the box.
Note that some detailed feature support may vary depending on the underlying driver
and database engine's actual capabilities. This module is an implementation of Kris Jenkins' yesql Clojure library for the Python ecosystem.
Extensions to support other database drivers can be written by you!
See: Database Driver Adapters.
Feel free to submit a pull request!

Usage

Install from PyPI, for instance by running pip install aiosql. Then write parametric SQL queries in a file and execute them from Python methods,
e.g. this greetings.sql file:

-- name: get_all_greetings
-- Get all the greetings in the database
select greeting_id, greeting
  from greetings
 order by 1;

-- name: get_user_by_username^
-- Get a user from the database using a named parameter
select user_id, username, name
  from users
 where username = :username;

This example has an imaginary SQLite database with greetings and users.
It prints greetings in various languages to the user and showcases the basic
feature of being able to load queries from a SQL file and call them by name
in Python code. You can use aiosql to load the queries in this file for use in your Python
application:

import aiosql
import sqlite3

queries = aiosql.from_path("greetings.sql", "sqlite3")

with sqlite3.connect("greetings.db") as conn:
    user = queries.get_user_by_username(conn, username="willvaughn")
    # user: (1, "willvaughn", "William")

    for _, greeting in queries.get_all_greetings(conn):
        # scan [(1, "Hi"), (2, "Aloha"), (3, "Hola"), …]
        print(f"{greeting}, {user[2]}!")
        # Hi, William!
        # Aloha, William!
        # …

Or even in an asynchronous way, with two SQL queries running in parallel
using aiosqlite and asyncio:

import asyncio
import aiosql
import aiosqlite

queries = aiosql.from_path("greetings.sql", "aiosqlite")

async def main():
    async with aiosqlite.connect("greetings.db") as conn:
        # Parallel queries!
        greetings, user = await asyncio.gather(
            queries.get_all_greetings(conn),
            queries.get_user_by_username(conn, username="willvaughn"),
        )

        for _, greeting in greetings:
            print(f"{greeting}, {user[2]}!")

asyncio.run(main())

It may seem inconvenient to provide a connection on each call.
You may have a look at the AnoDB DB class, which wraps both a database connection and query functions in one
connection-like extended object, including managing automatic reconnection if
needed.

Why you might want to use this

- You think SQL is pretty good, and writing SQL is an important part of your applications.
- You don't want to write your SQL in strings intermixed with your Python code.
- You're not using an ORM like SQLAlchemy or Django, with large (100k-line) code footprints vs about 800 lines for aiosql, and you don't need to.
- You want to be able to reuse your SQL in other contexts, loading it into psql or other database tools.

Why you might NOT want to use this

- You're looking for an ORM.
- You aren't comfortable writing SQL code.
- You don't have anything in your application that requires complicated SQL beyond basic CRUD operations.
- Dynamically loaded objects built at runtime really bother you. |
aiosqlalchemy-miniorm | # Asynchronous SQLAlchemy Object Relational Mapper.This is an ORM for accessing SQLAlchemy using asyncio. Working on top of SQLAlchemy Core.It presents a method of associating user-defined Python classes with database tables, and instances of those classes (objects) with rows in their corresponding tables.## UsageInitialization:from sqlalchemy import MetaData, Integer, String, DateTimefrom aiopg.sa import create_enginefrom aiosqlalchemy_miniorm import RowModel, RowModelDeclarativeMeta, BaseModelManagermetadata = MetaData()BaseModel = declarative_base(metadata=metadata, cls=RowModel, metaclass=RowModelDeclarativeMeta)async def setup():metadata.bind = await create_engine(**database_settings)class MyEntityManager(BaseModelManager):async def get_with_products(self):return await self.get_items(where_list=[(MyEntity.c.num_products > 0)])class MyEntity(BaseModel):__tablename__ = 'my_entity'__model_manager_class__ = MyEntityManagerid = Column(Integer, primary_key=True)name = Column(String(100), nullable=False)num_products = Column(Integer)created_at = Column(DateTime(), server_default=text('now()'), nullable=False)Query:objects = await MyEntity.objects.get_instances(where_list=[(MyEntity.c.name == 'foo')],order_by=['name', '-num_products'])num_objects = await MyEntity.objects.count(where_list=[(MyEntity.c.name == 'foo'), (MyEntity.c.num_products > 3)])or (low-level):objects = await MyEntity.objects \.set_sql(MyEntity.table.select()) \.where([(MyEntity.c.name == 'foo')]) \.limit(10) \.fetchall()Management:record = await MyEntity.objects.insert(name='bar',num_products=0,)await record.update(name='baz')await record.delete()Transactions:async with MyEntity.objects.transaction() as my_entity_objects:await my_entity_objects.insert(name='bar', num_products=0)await my_entity_objects.delete([(MyEntity.c.name == 'foo')]) |
aiosqlembic | Aiosqlembic, migrations welcome! Aiosqlembic aims at running database migrations powered by the awesome aiosql. It's inspired by goose, for those coming from Go. It is in development and likely to break. Documentation: it's here. Usage: run aiosqlembic --help |
aiosqlite | aiosqlite provides a friendly, async interface to sqlite databases. It replicates the standard sqlite3 module, but with async versions
of all the standard connection and cursor methods, plus context managers for
automatically closing connections and cursors:

async with aiosqlite.connect(...) as db:
    await db.execute("INSERT INTO some_table ...")
    await db.commit()

    async with db.execute("SELECT * FROM some_table") as cursor:
        async for row in cursor:
            ...

It can also be used in the traditional, procedural manner:

db = await aiosqlite.connect(...)
cursor = await db.execute('SELECT * FROM some_table')
row = await cursor.fetchone()
rows = await cursor.fetchall()
await cursor.close()
await db.close()

aiosqlite also replicates most of the advanced features of sqlite3:

async with aiosqlite.connect(...) as db:
    db.row_factory = aiosqlite.Row
    async with db.execute('SELECT * FROM some_table') as cursor:
        async for row in cursor:
            value = row['column']

    await db.execute('INSERT INTO foo some_table')
    assert db.total_changes > 0
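Because aiosqlite mirrors the standard module's API, other familiar sqlite3 calls are available as coroutines as well. A minimal sketch (assuming executemany keeps its usual sqlite3 signature, per the "async versions of all the standard connection and cursor methods" note above; the file and table names are placeholders):

```python
import asyncio
import aiosqlite

async def bulk_insert() -> None:
    async with aiosqlite.connect("example.db") as db:
        await db.execute("CREATE TABLE IF NOT EXISTS points (x INTEGER, y INTEGER)")
        # Like its sqlite3 counterpart, executemany takes a sequence of parameter tuples.
        await db.executemany("INSERT INTO points VALUES (?, ?)", [(1, 2), (3, 4)])
        await db.commit()

asyncio.run(bulk_insert())
```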
Install

aiosqlite is compatible with Python 3.8 and newer.
You can install it from PyPI:

$ pip install aiosqlite

Details

aiosqlite allows interaction with SQLite databases on the main AsyncIO event
loop without blocking execution of other coroutines while waiting for queries
or data fetches. It does this by using a single, shared thread per connection.
This thread executes all actions within a shared request queue to prevent
overlapping actions. Connection objects are proxies to the real connections, contain the shared
execution thread, and provide context managers to handle automatically closing
connections. Cursors are similarly proxies to the real cursors, and provide
async iterators to query results.

License

aiosqlite is copyright Amethyst Reese, and licensed under the
MIT license. I am providing code in this repository to you under an open source
license. This is my personal repository; the license you receive to my code
is from me and not from my employer. See the LICENSE file for details. |
aiosqlite3 | No description available on PyPI. |
aiosqlite-custom | aiosqlite provides a friendly, async interface to sqlite databases.It replicates the standardsqlite3module, but with async versions
of all the standard connection and cursor methods, plus context managers for
automatically closing connections and cursors:asyncwithaiosqlite.connect(...)asdb:awaitdb.execute("INSERT INTO some_table ...")awaitdb.commit()asyncwithdb.execute("SELECT * FROM some_table")ascursor:asyncforrowincursor:...It can also be used in the traditional, procedural manner:db=awaitaiosqlite.connect(...)cursor=awaitdb.execute('SELECT * FROM some_table')row=awaitcursor.fetchone()rows=awaitcursor.fetchall()awaitcursor.close()awaitdb.close()aiosqlite also replicates most of the advanced features ofsqlite3:asyncwithaiosqlite.connect(...)asdb:db.row_factory=aiosqlite.Rowasyncwithdb.execute('SELECT * FROM some_table')ascursor:asyncforrowincursor:value=row['column']awaitdb.execute('INSERT INTO foo some_table')assertdb.total_changes>0Installaiosqlite is compatible with Python 3.8 and newer.
You can install it from PyPI:$pipinstallaiosqliteDetailsaiosqlite allows interaction with SQLite databases on the main AsyncIO event
loop without blocking execution of other coroutines while waiting for queries
or data fetches. It does this by using a single, shared thread per connection.
This thread executes all actions within a shared request queue to prevent
overlapping actions.Connection objects are proxies to the real connections, contain the shared
execution thread, and provide context managers to handle automatically closing
connections. Cursors are similarly proxies to the real cursors, and provide
async iterators to query results.Licenseaiosqlite is copyrightAmethyst Reese, and licensed under the
MIT license. I am providing code in this repository to you under an open source
license. This is my personal repository; the license you receive to my code
is from me and not from my employer. See theLICENSEfile for details. |
aiosqlitedict | Python wrapper for sqlite3 and aiosqlite.

Main Features:

- Easy conversion between a sqlite table and a Python dictionary, and vice-versa.
- Execute SQL queries.
- Get the values of a certain column in a Python list.
- Delete from your table.
- Convert your JSON file into a SQL database table.
- Order your list with parameters like order_by, limit, etc.
- Choose any number of columns for your dict, which makes it faster for your dict to load instead of selecting all.

Installation

py -m pip install -U aiosqlitedict

Usage

aiosqlitedict is used to import a SQLite3 table as a Python dictionary.
In this example we have a database file namedds_data.dbthis database has a table namedds_salariesNow to create an instance of this table in python we do the following>>>fromaiosqlitedict.databaseimportConnect>>>ds_salaries=Connect("ds_data.db","ds_salaries","id")now we can get rows of this table.>>>asyncdefsome_func():...>>>user_0=awaitds_salaries.to_dict(0,"job_title","salary")# to get `job_title` and `salary` of user with id 0>>>print(user_0){'job_title':'Data Scientist','salary':70000}>>>user_0=awaitds_salaries.to_dict(0,"*")# to get all columns of user with id 0>>>print(user_0){'id':0,'work_year':2020,'experience_level':'MI','employment_type':'FT','job_title':'Data Scientist','salary':70000,'salary_currency':'EUR','salary_in_usd':79833,'employee_residence':'DE','remote_ratio':0,'company_location':'DE','company_size':'L'}now lets do some operations on our data>>>user_0=awaitds_salaries.to_dict(0,"job_title","salary")>>>user_0["salary"]+=676# increase user 0's salary>>>print(user_0["salary"])70676# getting top 5 rows by salaries>>>salaries=awaitds_salaries.select("salary",limit=5,ascending=False)>>>print(salaries)[70000,260000,85000,20000,150000]# to get "job_title" but order with salaries>>>best_jobs=awaitds_salaries.select("job_title",order_by="salary",limit=5,ascending=False)>>>print(best_jobs)['Data Scientist','Data Scientist','BI Data Analyst','ML Engineer','ML Engineer']# We can do the same task by executing a query>>>best_jobs_2=awaitds_salaries.execute("SELECT job_title FROM ds_salaries ORDER BY salary DESC LIMIT 5")>>>print(best_jobs_2)[('Data Scientist',),('Data Scientist',),('BI Data Analyst',),('ML Engineer',),('ML Engineer',)]# to get job_titles that includes the title "scientist" without duplicates>>>scientists=awaitds_salaries.select("job_title",like="scientist",distinct=True)>>>print(scientists)['Data Scientist','Machine Learning Scientist','Lead Data Scientist','Research Scientist','AI Scientist','Principal Data Scientist','Applied Data Scientist','Applied Machine Learning Scientist','Staff Data Scientist']# to get all users' salary that have the title "ML Engineer" using a query>>>ML_Engineers=awaitds_salaries.execute("SELECT salary FROM ds_salaries WHERE job_title = 'ML Engineer'")>>>print(ML_Engineers)[(14000,),(270000,),(7000000,),(8500000,),(256000,),(20000,)]# to get the highest salaries>>>high_salaries=awaitds_salaries.select("salary",between=(10000000,40000000))# between 30M and 40M salary>>>print(sorted(high_salaries,reverse=True))[30400000,11000000,11000000]# but what if we want to know their ids? here order_by is best used>>>high_salaries2=awaitds_salaries.select("salary",order_by="salary",limit=3,ascending=False)# same task with different method>>>print(high_salaries2)[30400000,11000000,11000000]>>>high_salaries3=awaitds_salaries.select("id",order_by="salary",limit=3,ascending=False)# id of richest to poorest>>>print(high_salaries3)[177,7,102]:warning: Warning: Connect.select method is vulnerable to SQL injection.Lets say you want to delete a certain user>>>awaitds_salaries.delete(5)# removing user with id 5 from the table.finally updating our SQLite table>>>awaitds_salaries.to_sql(0,user_0)# Saving user 0's data to the tableContributingPull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.Please make sure to update tests as appropriate.LicensePlease notice that
this package is built on top of aiosqlite. MIT |
aiosqlite-fork | No description available on PyPI. |
aiosql-mysql | aiosql-mysqlTable of ContentsAbout The ProjectGetting StartedUsageContributingLicenseContactThanksAbout The Projectaiosql-mysql is a database adaptor intended to allow the use ofasyncmywithaiosql.Warning:This project is in early developement. The PyMySQL adaptor works but is not fully tested. AsyncMy is not implimented and working at this time, please check back later.Getting StartedFor information about cloning and dev setup see:ContributingUsageThis is example is adapted from aiosql's readme.users.sql-- name: get-user-by-username^SELECT*FROMusersWHEREusername=:username;-- name: create_users#CREATETABLEusers(useridINTNOTNULLAUTO_INCREMENTPRIMARYKEY,usernameVARCHAR(100),firstnameVARCHAR(100),lastnameVARCHAR(100));-- name: insert_bob!INSERTINTOusers(username,firstname,lastname)VALUES('bob','bob','smith');Blocking executionIndexing a document adds it to or updates it in the search store.importaiosqlimportpymysqlfromaiosql_mysqlimportPyMySQLAdaptorconn=pymysql.connect(host="127.0.0.1",port=3306,user="root",password="password",database="ExampleDb",cursorclass=pymysql.cursors.DictCursor,)queries=aiosql.from_path("./users.sql",PyMySQLAdaptor)queries.create_users(conn)queries.insert_bob(conn)result=queries.get_user_by_username(conn,username="bob")print(result)# {'userid': 1, 'username': 'bob', 'firstname': 'bob', 'lastname': 'smith'}For a more detailed and complete version of the above example seepymysql_example.py&users.sqlAsync executionimportaiosqlimportasyncmyfromaiosql_mysqlimportAsyncMySQLAdapterqueries=aiosql.from_path("./users.sql",AsyncMySQLAdapter)asyncdefmain():conn=awaitasyncmy.connect(host="127.0.0.1",port=3306,user="root",password="password",database="ExampleDb",)awaitqueries.create_users(conn)awaitqueries.insert_user(conn,user_name='sbob',first_name='Bob',last_name='Smith')result=awaitqueries.get_user_by_username(conn,username="sbob")print(result)if__name__=="__main__":importasyncioasyncio.run(main())# {'userid': 1, 'username': 'sbob', 'firstname': 'Bob', 'lastname': 'Smith'}For a more detailed and complete version of the above example seeasyncmy_example.py&users.sqlContributingSee theopen issuesfor a list of proposed features (and known issues).Contributions are what make the open source community such an amazing place to be learn, inspire, and create. Any contributions you make aregreatly appreciated.Fork the ProjectCreate your Feature Branch (git checkout -b feature/AmazingFeature)execute: py.test --cov-report xml:cov.xml --covCommit your Changes (git commit -m 'Add some AmazingFeature')Push to the Branch (git push origin feature/AmazingFeature)Open a Pull RequestCloning / Development setupClone the repo and installgitclonehttps://github.com/kajuberdut/aiosql-mysql.gitcdaiosql-mysql
pipenv install --dev

Run tests

pipenv shell
ward

For more about pipenv see: Pipenv Github

License

Distributed under the MIT. See LICENSE for more information.

Contact

Patrick Shechet - [email protected] - Project Link: https://github.com/kajuberdut/aiosql-mysql

Thanks

This library would be pointless without:
- Will Vaughn, creator of aiosql
- The other contributors to aiosql
- The PyMySQL Team
- Long2Ice, creator of asyncmy
- The aiomysql team whose work makes asyncmy possible
- Many, many others in the Python and Open Source communities |
aiosqs | aiosqs: a Python asynchronous and lightweight SQS client. The goal of this library is to provide fast and optimal access to SQS
for Python projects, e.g. when you need a high-load queue consumer or high-load queue producer written in Python. Supports Python versions 3.8, 3.9, 3.10, 3.11, 3.12. Supported and tested Amazon-like SQS providers: Amazon, VK Cloud. Why aiosqs? The main problem of botocore and aiobotocore is huge memory and CPU consumption.
Alsoaiobotocoreitself is a transition ofbotocoreto async interface without any optimizations.Related issues:https://github.com/boto/boto3/issues/1670https://github.com/aio-libs/aiobotocore/issues/940https://github.com/aio-libs/aiobotocore/issues/970InstallationInstall package:pipinstallaiosqsUsageCreate a client:fromaiosqsimportSQSClientclient=SQSClient(aws_access_key_id="access_key_id",aws_secret_access_key="secret_access_key",region_name="us-west-2",host="sqs.us-west-2.amazonaws.com",)Receive the queue url by queue name:response=awaitclient.get_queue_url(queue_name=queue_name)queue_url=response["QueueUrl"]Send a message to the queue:response=awaitclient.send_message(queue_url=queue_url,message_body=json.dumps({"demo":1,"key":"value"}),delay_seconds=0,)print(response)Receive a message from the queue:response=awaitclient.receive_message(queue_url=queue_url,max_number_of_messages=1,visibility_timeout=30,)ifresponse:print(response)receipt_handle=response[0]["ReceiptHandle"]Delete the message from the queue:awaitclient.delete_message(queue_url=queue_url,receipt_handle=receipt_handle,)Close the client at the end:awaitclient.close()Another option is to useSQSClientas an async context manager. No need to callclosemanually in this case. Example:fromaiosqsimportSQSClientasyncwithSQSClient(aws_access_key_id="access_key_id",aws_secret_access_key="secret_access_key",region_name="us-west-2",host="sqs.us-west-2.amazonaws.com",)asclient:response=awaitclient.get_queue_url(queue_name="dev_orders")queue_url=response["QueueUrl"]print(queue_url) |
aiossdb | # aiossdbaiossdb is a library for accessing an ssdb database from AsyncIO[](https://coveralls.io/github/Microndgt/aiossdb?branch=master)Requirements------------- Python 3.6+DONE and TODO-------------- [x] base async ssdb connection- [x] async ssdb parser- [x] async ssdb connection pool- [x] easy using ssdb async client- [x] tests- [ ] detailed docs- [ ] suppress ReplyError as a choice- [ ] releasing...- [ ] and more...Quick Start------------ ClientClient will create a connection pool, each time you execute the command will be from the available connection pool to get the connection, and then execute the command, and then releaseClient会创建一个连接池,在每次执行命令的时候都会去从可用连接池中拿到连接,然后执行命令,然后释放```loop = asyncio.new_event_loop()asyncio.set_event_loop(loop)async def just_look():c = Client(loop=loop)await c.set('a', 1)res = await c.get('a')print(res)await c.close()return resloop.run_until_complete(just_look())loop.close()```- ConnectionPool```import asynciofrom aiossdb import create_poolloop = asyncio.get_event_loop()async def connect_tcp():pool = await create_pool(('localhost', 8888), loop=loop, minsize=5, maxsize=10)# Use the direct implementation of the command pool# 使用pool直接执行命令await pool.execute('set', 'a', 2)val = await pool.execute('hget', 'hash_name', 'hash_key')print(val)# Use the pool to get the connection# 使用pool获取连接conn, addr = await pool.get_connection()await conn.execute('set', 'a', 2)val = await conn.execute('hget', 'hash_name', 'hash_key')print(val)# Get the final connection to be released# 获取的连接最后一定要releaseawait pool.release(conn)pool.close()await pool.wait_closed()loop.run_until_complete(connect_tcp())loop.close()```If you request a non-existent key, a `ReplyError` will be raised and the type of error may be: `not_found`, `error`, `fail`, `client_error`如果获取不存在的键等情况会引发`ReplyError`, 错误类型可能有: `not_found`, `error`, `fail`, `client_error````try:val = await conn.execute('hget', 'hash_name', 'hash_key')except ReplyError as e:print("Error type: {}".format(e.etype))print("Executed command: {}".format(e.command))```- Connection```import asynciofrom aiossdb import create_connection, ReplyErrorloop = asyncio.get_event_loop()async def connect_tcp():conn = await create_connection(('localhost', 8888), loop=loop)await conn.execute('set', 'a', 2)val = await conn.execute('hget', 'hash_name', 'hash_key')print(val)conn.close()await conn.wait_closed()loop.run_until_complete(connect_tcp())loop.close()```Exceptions----------- SSDBError- ConnectionClosedError- ReplyError- ProtocolError- PoolClosedErrorNOTES------ The preliminary test shows that `aiossdb` is 25 times fast than [pyssdb](https://github.com/ifduyue/pyssdb)Contributor===========Kevin Du--------- Email: `[email protected]`- Site: `http://skyrover.me` |
aiossechat | aio-sse-chatSpecial python aio sse client module especially for parsing server-sent events (SSE) response from LLM.Modified fromaiohttp-sse-clientWhy need this?Normal SSE packages will not get correct value from streaming LLM response(in case you have not escape\nto\\n) since it will not parse the response correctly. This module will parse the response correctly and return the correct value.Also, LLM request usually need to submit aPOSTrequest while most current aio sse modules choose to raise error when submit aPOSTrequest. Though it is not a good practice to usePOSTrequest to get a streaming response, but it helps a lot for simplifying the code.Installationpipinstallaio-sse-chatUsageCreate your aiohttp session and useaiosseclientto wrap the session to do request.# fastapi [email protected]('/sse')# support all http methodsasyncdefsse_endpoint(data:dict):asyncdeff():foriinrange(10):yield'\n'awaitasyncio.sleep(0.2)returnEventSourceResponse(f())################### client sideimportaiohttpfromaiossechatimportaiosseclientasyncforeventinaiosseclient(url=some_url,method='post',json=some_data):print(data,end='',flush=True)# can get single `'\n'` correctly |
aiosseclient | Asynchronous Server Side Events (SSE) Client. Similar to sseclient and sseclient-py, a tiny package for supporting Server Side Events (SSE) with py3.9 asyncio and aiohttp.

Install it with this: pip3 install aiosseclient

Sample code (read more):

import asyncio
import aiohttp
from aiosseclient import aiosseclient

async def main():
    async for event in aiosseclient('https://stream.wikimedia.org/v2/stream/recentchange'):
        print(event)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())

Alternative libraries

There are different libraries, apparently inspired by this library initially, that currently
handle session close better using a different API:
https://github.com/rtfol/aiohttp-sse-client
https://github.com/JelleZijlstra/aiohttp-sse-client2 |
aio-stack | No description available on PyPI. |
aiostalk | aiostalk is a small and shameless Python client library for communicating
with the beanstalkd work queue. It is based on (and requires) another library called greenstalk by Justin Mayhew.

Getting Started

Presuming beanstalkd is running on localhost at the standard port.

>>> import asyncio
>>> import aiostalk
>>>
>>> async def main():
...     client = aiostalk.Client(('127.0.0.1', 11300))
...     await client.connect()
...     job_id = await client.put('hello')
...     print(job_id)
...     job = await client.reserve()
...     print(job.id)
...     print(job.body)
...     await client.delete(job)
...     await client.close()
>>>
>>> asyncio.run(main())
1
1
hello

Using the Client as an asyncio context manager is also supported.
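A minimal sketch of the context-manager form mentioned above (the put/reserve/delete calls come from the example; that entering the context connects and exiting closes the connection is an assumption, not confirmed by the text):

```python
import asyncio
import aiostalk

async def main():
    # Assumed: __aenter__ connects and __aexit__ closes, as is conventional.
    async with aiostalk.Client(('127.0.0.1', 11300)) as client:
        job_id = await client.put('hello')
        job = await client.reserve()
        print(job_id, job.body)
        await client.delete(job)

asyncio.run(main())
```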
Documentation

Please see the greenstalk docs at Read the Docs. |
aio-standalone | aio-standalone: standalone application server framework using asyncio.

Installation

Grab it off PyPI:

pip install aio-standalone

Or grab it off GitHub:

git clone https://github.com/kblin/aio-standalone.git
cd aio-standalone
pip install .

License

All code is available under the Apache License version 2, see the LICENSE file for details. |
aiostat | This is a security placeholder package.
If you want to claim this name for legitimate purposes,
please contact us [email protected]@yandex-team.ru |
aiostaticmap | Static MapA small, python-based library for creating map images with lines and markers.Examplem=StaticMap(300,400,10)m.add_line(Line(((13.4,52.5),(2.3,48.9)),'blue',3))image=awaitm.render()image.save('map.png')This will create a 300px x 400px map with a blue line drawn from Berlin to Paris.InstallationStaticMap is a small library, all it takes is python and two python packages:Pillowandrequest. Install staticmap via:pipinstallaiostaticmapUsageCreate a new map instance:m=StaticMap(width,height,padding_x,padding_y,url_template,tile_size)parameterdescriptionwidthwidth of the image in pixelsheightheight of the image in pixelspadding_x(optional) minimum distance in pixel between map features (lines, markers) and map borderpadding_y(optional) minimum distance in pixel between map features (lines, markers) and map borderurl_template(optional) the tile server URL for the map base layer, e.g.http://a.tile.osm.org/{z}/{x}/{y}.pngtile_size(optional) tile size in pixel, usually 256Add a line:line=Line(coordinates,color,width))m.add_line(line)parameterdescriptioncoordinatea sequence of lon/lat pairscolora color definition Pillowsupportswidththe stroke width of the line in pixelsimplifywhether to simplify coordinates, looks less shaky, default is trueAdd a map circle marker:marker=CircleMarker(coordinate,color,width))m.add_marker(marker)parameterdescriptioncoordinatea lon/lat pair: e.g.(120.1, 47.3)colora color definition Pillowsupportswidthdiameter of marker in pixelAdd a polygon:polygon=Polygon(coordinates,fill_color,outline_color,simplify)m.add_polygon(polygon)parameterdescriptioncoordinatea lon/lat pair: e.g.[[9.628, 47.144], [9.531, 47.270], [9.468, 47.057], [9.623, 47.050], [9.628, 47.144]]fill_colora color definition Pillowsupportsoutline_colora color definition Pillowsupportssimplifywhether to simplify coordinates, looks less shaky, default is trueSamplesShow Position on MapfromsrcimportStaticMap,CircleMarkerm=StaticMap(200,200,url_template='http://a.tile.osm.org/{z}/{x}/{y}.png')marker_outline=CircleMarker((10,47),'white',18)marker=CircleMarker((10,47),'#0036FF',12)m.add_marker(marker_outline)m.add_marker(marker)image=awaitm.render(zoom=5)image.save('marker.png')Show Ferry ConnectionfromsrcimportStaticMap,Linem=StaticMap(200,200,80)coordinates=[[12.422,45.427],[13.749,44.885]]line_outline=Line(coordinates,'white',6)line=Line(coordinates,'#D2322D',4)m.add_line(line_outline)m.add_line(line)image=awaitm.render()image.save('ferry.png')Show Icon MarkerfromsrcimportStaticMap,IconMarkerm=StaticMap(240,240,80)icon_flag=IconMarker((6.63204,45.85378),'./samples/icon-flag.png',12,32)icon_factory=IconMarker((6.6015,45.8485),'./samples/icon-factory.png',18,18)m.add_marker(icon_flag)m.add_marker(icon_factory)image=awaitm.render()image.save('icons.png')LicenceStaticMap is open source and licensed under Apache License, Version 2.0The map samples on this page are made withOSMdata, ©OpenStreetMapcontributors |
aiostatsd | UNKNOWN |
aio-statsd | aio_statsd: an asyncio-based client for sending metrics to StatsD, Graphite.carbon, TelegrafStatsD and DogStatsD.

Installation

pip install aio_statsd

Usage

Client

Create a connection and send a gauge metric.
aiostatsd client will automatically send messages in the background when the loop is runningimportasynciofromaio_statsdimportStatsdClientloop=asyncio.get_event_loop()client=StatsdClient()loop.run_until_complete(client.connect())client.gauge('test.key',1)loop.run_forever()Use context managerimportasynciofromaio_statsdimportStatsdClientasyncdefmain():asyncwithStatsdClient()asclient:client.gauge('test.key',1)loop=asyncio.get_event_loop()loop.run_until_complete(main())Client paramhost: default value 'localhost', Statsd Server ipport: default value 8125, Statsd Server portprotocol: default value ProtocolFlag.udp, Transport Layer Prrotocol, Select Tcp:ProtocolFlag.udpor Udp:ProtocolFlag.tcptimeout: default value 0, send msg timeout, if timeout==0, not enable timeoutdebug: default value False, enable debugclose_timeout: default value 9, Within a few seconds after the client is closed, continue to send messages which in the queuecreate_timeout: default value 9, Create connection timeoutmax_len: default value 10000, deque lengthsample_rate(Use in StatsD Client, DogStatsD Client): default value 1, use sample rate in Statsd or DogStatsDsend metricimportasynciofromaio_statsdimportStatsdClientasyncdefmain():asyncwithStatsdClient()asclient:client.gauge('test.key',1)client.counter('test.key',1)client.sets('test.key',1)client.timer('test.key',1)withclient.timeit('test'):pass# run your code# all metric support sample rateclient.gauge('test1.key',1,sample_rate=0.5)# mutli metric support(not support sample rate, the sample rate will always be set to 1)fromaio_statsdimportStatsdProtocolmetric=StatsdProtocol()metric.gauge('test2.key',1)metric.sets('test2.key',1)client.send_statsd(metric)loop=asyncio.get_event_loop()loop.run_until_complete(main())Other ClientGraphite(carbon)importasynciofromaio_statsdimportGraphiteClientloop=asyncio.get_event_loop()client=GraphiteClient()loop.run_until_complete(client.connect())client.send_graphite('test.key',1)# Multiple clients timestamp interval synchronizationloop.run_forever()DogStatsDNote: Not tested in productionimportasynciofromaio_statsdimportDogStatsdClientasyncdefmain():asyncwithDogStatsdClient()asclient:client.gauge('test.key',1)client.distribution('test.key',1)client.increment('test.key',1)client.histogram('test.key',1)client.timer('test.key',1)withclient.timeit('test'):pass# run your code# all metric support sample rate and DogStatsD tagclient.gauge('test1.key',1,sample_rate=0.5,tag_dict={'tag':'tag1'})# mutli metric support(# DogStatsdProtocol will store the message in its own queue and# DogStatsDClient traverses to read DogStatsdProtocol's message and send it# )fromaio_statsdimportDogStatsdProtocolmetric=DogStatsdProtocol()metric.gauge('test2.key',1,tag_dict={'tag':'tag1'})metric.histogram('test2.key',1)client.send_dog_statsd(metric,sample_rate=0.5)loop=asyncio.get_event_loop()loop.run_until_complete(main())TelegrafStatsdNote: Not tested in productionimportasynciofromaio_statsdimportTelegrafStatsdClientasyncdefmain():asyncwithTelegrafStatsdClient()asclient:client.gauge('test.key',1)client.distribution('test.key',1)client.increment('test.key',1)client.histogram('test.key',1)client.timer('test.key',1)withclient.timeit('test'):pass# run your code# all metric support sample rate and TelegrafStatsd tagclient.gauge('test1.key',1,sample_rate=0.5,tag_dict={'tag':'tag1'})# mutli metric support(# TelegrafStatsdProtocol will store the message in its own queue and# TelegrafStatsDClient traverses to read TelegrafStatsdProtocol's message and send it# 
)fromaio_statsdimportTelegrafStatsdProtocolmetric=TelegrafStatsdProtocol()metric.gauge('test2.key',1,tag_dict={'tag':'tag1'})metric.histogram('test2.key',1)client.send_telegraf_statsd(metric,sample_rate=0.5)loop=asyncio.get_event_loop()loop.run_until_complete(main())TelegrafNote: Not tested in productionimportasynciofromaio_statsdimportTelegrafClientasyncdefmain():asyncwithTelegrafClient()asclient:client.send_telegraf('test.key',{"field1":100},user_server_time=True)Use in web frameworksfast_tools example |
aio-stdout | aio_stdout

Asynchronous Input Output - Stdout

The purpose of this package is to provide asynchronous variants of the builtin input and print functions. print is known to be relatively slow compared to other operations. input is even slower because it has to wait for user input. While these slow IO operations are being run, code using asyncio should be able to continuously run.

PIP Installing

For Unix/macOS:

python3 -m pip install aio-stdout

For Windows:

py -m pip install aio-stdout

ainput and aprint

With aio_stdout, the aio_stdout.ainput and aio_stdout.aprint functions provide easy-to-use functionality with organized behaviour.

import asyncio
from aio_stdout import ainput, aprint

async def countdown(n: int) -> None:
    """Count down from `n`, taking `n` seconds to run."""
    for i in range(n, 0, -1):
        await aprint(i)
        await asyncio.sleep(1)

async def get_name() -> str:
    """Ask the user for their name."""
    name = await ainput("What is your name? ")
    await aprint(f"Your name is {name}.")
    return name

async def main() -> None:
    await asyncio.gather(countdown(15), get_name())

if __name__ == "__main__":
    asyncio.run(main())

Example output:

15
What is your name? Jane
14
13
12
11
10
9
8
Your name is Jane.
7
6
5
4
3
2
1Notice that while the prompt"What is your name? "is being waited for, thecountdowncontinues toaprintin the background, without becoming blocked. Thecountdowndoes not, however, display its results until theainputis completed. Instead it waits for theainputto finish before flushing out all of the queued messages.It is worth noting that with naive threading, a normal attempt to useprintwhile waiting on aninputleads to overlapping messages. Fixing this behavior requires a lot more work than should be needed to use a simpleprintorinputfunction, which is why this package exists. To remedy this problem, queues are used to store messages until they are ready to be printed.IO LocksAlthough the asynchronization behaviors ofainputandaprintare nice, sometimes we want to be able to synchronize our messages even more. IO locks provide a way to group messages together, locking the globalaio_stdoutqueues until it finishes or yields access.importasynciofromaio_stdoutimportIOLock,ainput,aprintasyncdefcountdown(n:int)->None:"""Count down from `n`, taking `n` seconds to run."""asyncwithIOLock(n=5)aslock:foriinrange(n,0,-1):awaitlock.aprint(i)awaitasyncio.sleep(1)asyncdefget_name()->str:"""Ask the user for their name."""asyncwithIOLock()aslock:name=awaitlock.ainput("What is your name? ")awaitlock.aprint(f"Your name is{name}.")returnnameasyncdefmain()->None:awaitasyncio.gather(countdown(15),get_name())if__name__=="__main__":asyncio.run(main())Let's try the example again now using the new locks:15
14
13
12
11
What is your name? Jane
Your name is Jane.
10
9
8
7
6
5
4
3
2
1Notice that this time thecountdowndoes not immediately yield to theget_name. Instead, it runs 5 messages before yielding control over toget_name. Now, after thelock.ainputfinishes, it does not yield tocountdown. Instead, it runs its ownlock.aprintfirst. In the meantime,countdowncontinues to run in the background and flushes all of its buffered messages afterwards.FlushingSince messages may be delayed, it is possible for your asynchronous code to finish running before all messages are displayed, producing confusing results. As such, the best recommended practice is to flush frommainbefore terminating.fromaio_stdoutimportflush@flushasyncdefmain()->None:...Final ExampleCombining all best practices, the final example should look something like this:importasynciofromaio_stdoutimportIOLock,ainput,aprint,flushasyncdefcountdown(n:int)->None:"""Count down from `n`, taking `n` seconds to run."""foriinrange(n,0,-1):awaitaprint(i)awaitasyncio.sleep(1)asyncdefget_name()->str:"""Ask the user for their name."""asyncwithIOLock()aslock:name=awaitlock.ainput("What is your name? ")awaitlock.aprint(f"Your name is{name}.")returnname@flushasyncdefmain()->None:awaitasyncio.gather(countdown(15),get_name())if__name__=="__main__":asyncio.run(main())Common GotchasUsinginputorprintinstead ofainputandaprintwill push a message immediately to the console, potentially conflicting withainputoraprint.Usingainputoraprintinstead oflock.ainputandlock.aprintmay producedeadlockdue to having to wait for the lock to release. As such, thelockis equipped with a defaulttimeoutlimit of 10 seconds to avoid deadlock and explain to users this potential problem. |
aiosteady | aiosteady is an MIT licensed library, written in Python, for rate limiting
in asyncio applications using Redis and the aioredis library. aiosteady currently implements the leaky bucket algorithm in a very efficient way.

max_capacity = 10    # The bucket can contain up to 10 drops, starts with 0
drop_recharge = 5.0  # 5 seconds between drop recharges.

throttler = Throttler(aioredis, max_capacity, drop_recharge)

# consume() returns information about success, the current bucket level,
# how long until the next drop recharges, etc.
res = await throttler.consume(f'user:{user_id}')

Installation

To install aiosteady, simply:

$ pip install aiosteady

Usage

The leaky bucket algorithm follows a simple model.

- A single bucket contains a number of drops, called the bucket level. Buckets start with zero drops.
- Buckets have a maximum capacity of drops.
- Each use of the bucket (consumption) inserts one or more drops into the bucket, up until the maximum capacity. If the bucket would overflow, the consumption fails.
- One drop leaks out every drop_recharge seconds, freeing space in the bucket for a new drop to be put into it.
- The bucket may also be manually drained.
- In addition to making the consumption fail, full buckets can optionally be configured to block further attempts to consume for a period.

Create an instance of aiosteady.leakybucket.Throttler, giving it an instance
of an aioredis client and rate limiting parameters (the maximum bucket
capacity, the number of seconds it takes for a drop to leak out, and an
optional blocking duration).

A Throttler supports two operations: consuming and peeking.

await Throttler.consume("a_key") ("consume" because it consumes bucket resources)
attempts to put the given number of drops (default 1) into the bucket at the
given key. It returns an instance of aiosteady.leakybucket.ThrottleResult,
with fields for:

- success: a boolean, describing whether the consumption was successful
- level: an integer, describing the new level of the bucket
- until_next_drop: a float, describing the number of seconds left until the next drop regenerates
- blocked_for: an optional float, if blocking is being used and the bucket is blocked, the number of seconds until the block expires

If the number of drops given is negative, drops are instead removed from the bucket. The bucket may not go below zero drops.

await Throttler.peek("a_key") returns the same ThrottleResult but without attempting to consume any drops.
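A small usage sketch tying the two calls to the ThrottleResult fields listed above (it assumes the fields are plain attributes and reuses the constructor arguments from the first example; like that example, it is a fragment meant to run inside a coroutine):

```python
throttler = Throttler(aioredis, 10, 5.0)  # capacity of 10 drops, one drop recharges every 5s

res = await throttler.consume("user:42")
if res.success:
    print(f"allowed, bucket level is now {res.level}")
else:
    # blocked_for is only set when blocking is configured and the bucket is blocked
    wait = res.blocked_for or res.until_next_drop
    print(f"rejected, retry in {wait:.1f}s")

# peek() reports the same information without consuming any drops
state = await throttler.peek("user:42")
print(state.level, state.until_next_drop)
```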
Both operations are implemented using a single Redis call, using Lua scripting.

Changelog

22.1.0 (UNRELEASED)
- Switch to CalVer.
- Add Python 3.10 support.
- Add support for recharging the bucket (removing existing drops).
- Switch the main branch name from master to main.

0.2.1 (2021-05-12)
- Improve the attrs dependency specification, since attrs uses CalVer.

0.2.0 (2021-04-08)
- Use the Redis evalsha instead of eval, for efficiency.

0.1.0 (2021-03-07)
- Initial release.

Credits

The Lua Redis script for atomic leaky bucket has been taken and heavily adapted from the Prorate project. |
aiosteam | aiosteam: an asynchronous steam API wrapper written in Python.

Installation

py -m pip install aiosteam

Todo

- Documentation
- More functionality
- Examples |
aiosteamist | Async Steamist: control Steamist steam systems.

Installation

Install this via pip (or your favourite package manager): pip install aiosteamist

Contributors ✨

Thanks goes to these wonderful people (emoji key). This project follows the all-contributors specification. Contributions of any kind welcome!

Credits

This package was created with Cookiecutter and the browniebroke/cookiecutter-pypackage project template. |
aiosteampy | AIOSTEAMPY. Previously this library was a soft fork of bukson/steampy, created only to
provide asynchronous methods and proxy support.
But now it is a standalone project, created mostly for Steam trading purposes.

[!IMPORTANT]
See the full documentation here 📖

Installation

pip install aiosteampy
pipenv install aiosteampy
poetry add aiosteampy

The project has an extra, a currencies converter, with
target dependency aiosteampy[converter]. For instance: poetry add aiosteampy[converter]

[!TIP]
The aiohttp docs recommend installing speedups (aiodns, cchardet, ...)

AIOSTEAMPY uses aiohttp underneath to do asynchronous requests to Steam servers,
with modern async/await syntax. Generally, the project is inspired most
by DoctorMcKay/node-steamcommunity.

Key features

- Stateless: the main idea was a low/middle-layer API wrapper for some Steam services and methods like market, trade offers, confirmations, steamguard, etc. But if you want to cache your entities' data (listings, confirmations, ...) there are some methods to help.
- Declarative: there are models for almost every kind of data.
- Typed: for editor support, most things are typed.
- Short: I really tried to fit the methods most important for Steam trading.

What can I do with this

- Operate with Steam trade offers in any manner.
- Sell and buy items on the market. Place and cancel orders.
- Log in through Steam to 3rd-party sites.
- Fetch data from the market.
- Manipulate many accounts, with proxies for each session.
- Store and load cookies to stay logged in.
- Convert market prices into different currencies.

What I can't do

- Chat (at least for now).
- Get apps, packages.
- Anything that needs a connection to CM.
- Interact with game servers (inspect CS2 (ex. CSGO) items, ...).
- Edit profile, social interaction (groups, clans).
- Handle entities' (listings, items, trade offers) lifecycle easily if you need to store them.

Tests 🧪

Read the test documentation 📖

Contribution 💛

There are no rules or requirements to contribute. Feedback, suggestions and more are welcome.
I will be very grateful for help getting things right.

Credits

- bukson/steampy
- DoctorMcKay/node-steamcommunity
- Identifying Steam items
- Revadike/InternalSteamWebAPI
- Gobot1234/steam.py
- Steam Market id's storage repo
- steamapi.xpaw.me
- Steam Exchange Rate Tracker |
aios-test | No description available on PyPI. |
aiostomp | AiostompSimple asyncio stomp 1.1 client for python 3.6.Heavely inspired ontorstomp.Installwith pip:pipinstallaiostompUsageimportsysimportloggingimportasynciofromaiostompimportAioStomplogging.basicConfig(format="%(asctime)s-%(filename)s:%(lineno)d- ""%(levelname)s-%(message)s",level='DEBUG')asyncdefrun():client=AioStomp('localhost',61613,error_handler=report_error)client.subscribe('/queue/channel',handler=on_message)awaitclient.connect()client.send('/queue/channel',body=u'Thanks',headers={})asyncdefon_message(frame,message):print('on_message:',message)returnTrueasyncdefreport_error(error):print('report_error:',error)defmain(args):loop=asyncio.get_event_loop()loop.run_until_complete(run())loop.run_forever()if__name__=='__main__':main(sys.argv)DevelopmentWith empty virtualenv for this project, run this command:makesetupand run all tests =)maketestContributingFork, patch, test, and send a pull request. |
aiostorage | Interface for performing common object storage operations asynchronously. The aim is to support multiple object storage providers, e.g. Google Cloud, Backblaze, etc. |
aiostorage-orm | AioStorageORM (CyberPhysics)Установкаpipinstallaiostorage-ormЗависимостиredis-pynest-asyncioБазовый пример использования (все примеры,базовый пример)Импорт классовimportredis.asyncioasredisfromaiostorage_ormimportAIOStorageORMfromaiostorage_ormimportAIORedisORMfromaiostorage_ormimportAIORedisItemfromaiostorage_ormimportOperationResultОпределить модельclassExampleItem(AIORedisItem):"""Атрибуты объекта с указанием типа данных(в процессе сбора данных из БД приводится тип)"""date_time:intany_value:floatclassMeta:"""Системный префикс записи в RedisКлючи указанные в префиксе обязательны дляпередачи в момент создания экземпляра"""table="subsystem.{subsystem_id}.tag.{tag_id}"Установить подключение ORM можно двумя способамиПередать данные для подключения непосредственно в ORMorm:AIOStorageORM=AIORedisORM(host="localhost",port=8379,db=1)awaitorm.init()Создать подключение redis.Redis и передать его в конструкторredis:redis.Redis=redis.Redis(host="localhost",port=8379,db=1)orm:AIOStorageORM=AIORedisORM(client=redis)awaitorm.init()Добавление/редактирование записи (ключами записи являются параметры, указанные в Meta.table модели)Создать объект на основе моделиexample_item:ExampleItem=ExampleItem(subsystem_id=3,tag_id=15,date_time=100,any_value=17.)Выполнить вставку можно несколькими способамиИспользовать метод save() созданного экземпляраoperation_result:OperationResult=awaitexample_item.save()Использовать метод save() AIOStorageOrmoperation_result:OperationResult=awaitorm.save(item=example_item)Использоватьгрупповуювставку записей (пример групповой вставки)operation_result:OperationResult=awaitorm.bulk_create(items=[example_item1,example_item2])Выборка данных из БДдля выборки необходимо передать аргументы для параметров, которые используются в Meta.tabletable="subsystem.{subsystem_id}.tag.{tag_id}"^^, напримерexample_items:ExampleItem=awaitexampleitem.get(subsystem_id=3,tag_id=15)Использование нескольких подключений (пример)для использования нескольких подключений необходимо в метод AIOStorageItem.using(db_instance=...) передать
подготовленное соединение с БД Redis, напримерredis_another:redis.Redis=redis.Redis(host="localhost",port=8379,db=17)...result_of_operation:OperationResult=awaitexample_item.using(db_instance=redis_another).save()Поиск по списку значений (пример)для поиска записей по параметру, находящемуся в списке значений, необходимо параметр дополнить суффиксом __in, в
который необходимо передать список искомых значенийgetted_items:list[ExampleItem]=awaitExampleItem.filter(subsystem_id__in=[21,23],tag_id=15)Поиск по предварительно подготовленному объекту (пример)для поиска записи указанным образом, необходимо создать объект с параметрами, необходимыми для поиска и передать
его в метод AIORedisORM.getitem:ExampleItem=ExampleItem(subsystem_id=1,tag_id=15)item_by_object:ExampleItem|None=awaitExampleItem.get(_item=item)Поиск по предварительно подготовленным объектам (пример)для поиска записи указанным образом, необходимо создать объекты с параметрами, необходимыми для поиска и передать
их списком в метод AIORedisORM.filteritems:list[ExampleItem]=[ExampleItem(subsystem_id=1,tag_id=15),ExampleItem(subsystem_id=2,tag_id=16),]item_by_objects:list[ExampleItem]=awaitExampleItem.filter(_items=items)Удаление одного объекта (пример)example_item:ExampleItem=ExampleItem(subsystem_id=3,tag_id=15)result_of_operation:OperationResult=awaitexample_item.delete()Удаление нескольких объектов одновременно (пример)result_of_operation:OperationResult=awaitorm.bulk_delete(items=example_items)Добавление объектов с ограниченным временем жизни (пример)classExampleItem(AIORedisItem):# Атрибуты объекта с указанием типа данных (в процессе сбора данных из БД приводится тип)date_time:intany_value:strclassMeta:# Системный префикс записи в Redis# Ключи указанные в префиксе обязательны для передачи в момент создания экземпляраtable="subsystem.{subsystem_id}.tag.{tag_id}"# Время жизни объекта в базе данныхttl=10...example_item:ExampleItem=ExampleItem(subsystem_id=3,tag_id=15,date_time=100,any_value=17.)result_of_operation:OperationResult=awaitexample_item.save()...example_items:list[ExampleItem]=[]foriinrange(100):subsystem_id:int=i%10example_item:ExampleItem=ExampleItem(subsystem_id=subsystem_id,another_key_value=i,tag_id=10+(15*random.randint(0,1)),date_time=i*100,any_value=random.random()*10,)example_items.append(example_item)result_of_operation:OperationResult=awaitorm.bulk_create(items=example_items)Добавление одной записи во фрейм (пример)classExampleItem(AIORedisItem):# Атрибуты объекта с указанием типа данных (в процессе сбора данных из БД приводится тип)date_time:intany_value:strclassMeta:# Системный префикс записи в Redis# Ключи указанные в префиксе обязательны для передачи в момент создания экземпляраtable="subsystem.{subsystem_id}.tag.{tag_id}"ttl=10# Время жизни объекта в базе данныхframe_size=3# Размер frame'а...result_of_operation:OperationResult=awaitorm.frame.add(item_or_items=example_item)Групповое добавление записей во фрейм (пример)записи могут быть разнородными (должны являться наследником AIORedisItem, но при этом они могут быть определены
различными друг от друга классами)...result_of_operation:OperationResult=awaitorm.frame.add(item_or_items=[example_item,example_item_2])Сбор данных из фрейма (пример)данные из фрейма можно получить только списком (list[ExampleItem])получение данных из фрейма ограничивается агрументами start_index и end_index (включительно, т.е. самый старый элемент
get(ExampleItem(), 0, 0), самый последний добавленный get(ExampleItem(), -1, -1))...result_of_operation:OperationResult=awaitorm.frame.get(item=example_item)Запуск примеровpython-mvenvvenvsource./venv/bin/activatepipinstallredis# Базовый простой примерPYTHONPATH="${PYTHONPATH}:."pythonexamples/redis_1_single.py# Пример групповой вставки (bulk)PYTHONPATH="${PYTHONPATH}:."pythonexamples/redis_2_bulk_multiple.py# Пример использования нескольких подключенийPYTHONPATH="${PYTHONPATH}:."pythonexamples/redis_3_using_multiple_connections.py# Пример поиска по списку значенийPYTHONPATH="${PYTHONPATH}:."pythonexamples/redis_4_values_in_list.py# Пример поиска по переданному подготовленному экземпляруPYTHONPATH="${PYTHONPATH}:."pythonexamples/redis_5_find_by_object.py# Пример удаления объектовPYTHONPATH="${PYTHONPATH}:."pythonexamples/redis_6_delete_item.py# Пример добавления объектов с ограниченным временем жизниPYTHONPATH="${PYTHONPATH}:."pythonexamples/redis_7_ttl.py# Пример работы с frame'амиPYTHONPATH="${PYTHONPATH}:."pythonexamples/redis_8_frame.py |
aiostore | No description available on PyPI. |
aiostp | asyncio client library for stpmex.com

STP – Async Python client library for stpmex.com

Installation

pip install aiostp

Authentication

The preferred way to configure the credentials for the client is to set the STP_PRIVATE_LOCATION and STP_KEY_PASSPHRASE environment variables. The client library will automatically configure itself based on the values of those variables. STP_PRIVATE_LOCATION can be the path to the key file, or the private key itself.
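For example, the environment-variable route might look like the sketch below (the values are placeholders; exactly when aiostp reads the variables is assumed, not stated in the text, so they are set before the library is imported or used):

```python
import os

# Placeholder values - substitute your own key location (or key contents) and passphrase.
os.environ["STP_PRIVATE_LOCATION"] = "/path/to/stp_private_key.pem"
os.environ["STP_KEY_PASSPHRASE"] = "my-passphrase"

import aiostp  # per the docs above, the client configures itself from these variables
```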
To configure manually:

import aiostp

aiostp.configure(priv_key='PKxxxx', priv_key_passphrase='yyyyyy') |
aiostratum-proxy | **`aiostratum_proxy`** is a Stratum Protocol proxy (ie cryptocurrency/mining) built using Python3. It was built to be a modern, code-concise, **extensible**, and fast replacement for existing aging Stratum Protocol proxy solutions & implementations.* Requires Python 3.5 or greater (built with `asyncio` using `async`/`await` syntax)* Extensible: easily implement new coin/algorithm 'stratum-like' protocols as dynamically-loaded, external Python3 modules (via config file)* Can run multiple proxies at the same time (ie. mine different coins on different pools)* Each proxy supports up to 65536 miner connections (per pool connection), each mining a separate nonce space (dependent on miner support)* Supports plaintext and secure connections (ie. TLS/SSL) for both incoming miner connections and outgoing pool connections* Plain socket transport only (ie. not JSONRPC over HTTP, HTTP Push, or HTTP Poll); this is the defacto standard in the cryptocurrency space#### DonationsI built this on my own time, outside of my day job. If you find this mining proxy useful, donate to help continue development. **I value my time.*** **BTC**: 1BS4QYAFiya4tsjyvHeY945biKnDj6bRA4* **LTC**: LTN1LPGnJjHMKq4DcuQYifQgLfT4Phmn9d* **ETH**: 0x4B005e68D323bdABD8eeD1D415117Ff1B57b3EC5* **BTCP**: b1CACK65UTwzmHGw2VvyozdPsRLMb8utGLg* **ZCL**: t1eoqTqyatJzL2rErW83waZLQHZLKuipMbi#### InstallationInstallation is simple:```pip install aiostratum-proxy```However, for an isolated and more robust installation, you should consider using Python virtual environments:```# this will create a directory 'containing' the Python3 virtual environmentpython3 -m venv aiostratum_proxycd aiostratum_proxy# this will install the aiostratum-proxy packagebin/pip install aiostratum-proxy# verify the installation by checking the package versionbin/aiostratum-proxy --version```#### UsageInstallation creates a new command-line shortcut called `aiostratum-proxy`; it has built-in command-line help, just run:```bin/aiostratum-proxy --help```A config file is needed for the proxy to run; you can generate one:```bin/aiostratum-proxy --generate-config > proxy-config.yaml```Open and edit the generated `proxy-config.yaml` in a text editor. The config file's syntax is YAML ([here's a good guide to YAML](https://github.com/Animosity/CraftIRC/wiki/Complete-idiot's-introduction-to-yaml)).To run `aiostratum-proxy`, pass it your edited config file:```bin/aiostratum-proxy --config proxy-config.yaml```#### Supported Algorithms/Coins`aiostratum-proxy` was designed to be modular and extensible when it comes to coin and algorithm support. This is done via miner+pool protocol module pairs (more on this below).Current support includes:* any coin based on Equihash (ZCash, ZClassic, Bitcoin Gold, Bitcoin Private, etc):* miner module: `aiostratum_proxy.protocols.equihash.EquihashWorkerProtocol`* pool module: `aiostratum_proxy.protocols.equihash.EquihashPoolProtocol`* (**EXPERIMENTAL/UNTESTED**/probably not working just yet) Bitcoin (and related coins):* miner module: `aiostratum_proxy.protocols.stratum.StratumWorkerProtocol`* pool module: `aiostratum_proxy.protocols.stratum.StratumPoolProtocol`As you can see, it is possible for a protocol implementation (ie. both the worker & pool sides) to support multiple coins, assuming they share some common ancestry or have heavily borrowed technical decisions.#### Add Support for New Coins & AlgorithmsThe terms 'Stratum' and 'Stratum Protocol' are used broadly (perhaps too much so) in the cryptocurrency ecosystem. 
The concept behind the Stratum protocol started with the specific desire to improve the efficiency of Bitcoin mining pools.In time, the rise of altcoins demanded a similar approach to managing the communications between miners and pools. For whatever reason, most altcoins have tweaked the original Stratum spec to their needs, borrowing and learning from prior mistakes.To add support for a new coin or algorithm, there are two options:1. Use an existing protocol implementation and tweak; note that this means if there are future changes, it may have cascading impacts (see the Equihash protocol pair as an example, it is based off of the Stratum protocol pair)1. Create a new protocol pairing by implementing both `BasePoolProtocol` (to handle connections to pools) and `BaseWorkerProtocol` (to handle incoming miner connections)For example, if you were implementing Monero support:1. Create a new Python module with the Monero 'worker' and 'pool' protocol class implementations1. Add the new Monero worker/pool classes to your proxy config file1. You will need to ensure your Python module is visible within `PYTHONPATH` to use it within your proxy YAML config file1. **Consider [submitting it as a pull request](https://github.com/wetblanketcc/aiostratum_proxy/pulls) to `aiostratum_proxy`!** If so, you would place the new module alongside the existing Equihash implementation at `aiostratum_proxy.protocols.monero`#### Future ConsiderationsCommunity involvement is appreciated. [Code review](https://github.com/wetblanketcc/aiostratum_proxy), [pull requests for bug fixes & new protocols](https://github.com/wetblanketcc/aiostratum_proxy/pulls), [reporting issues](https://github.com/wetblanketcc/aiostratum_proxy/issues), spreading the word - all appreciated.##### TODO:Community feedback on the following is appreciated.* More coin/algo support* Complete `mining.set_extranonce` support* Consider additional authentication improvements (currently miners aren't authenticated)* authenticate miners locally within proxy via config* authenticate miners via passthru to pool; would require per-pool mappings of username/password for fallback pools in config?* Consider immediate reply to miner share submissions instead of waiting for pool response* HAProxy `PROXY` protocol support#### Related & Informative Links1. [Stratum Protocol Specification](https://slushpool.com/help/manual/stratum-protocol)1. [Stratum Protocol Specification Draft](https://docs.google.com/document/d/17zHy1SUlhgtCMbypO8cHgpWH73V5iUQKk_0rWvMqSNs/edit?hl=en_US)1. [Stratum Protocol Bitcoin Wiki Page](https://en.bitcoin.it/wiki/Stratum_mining_protocol)1. [Original Bitcointalk.org Stratum Announcement](https://bitcointalk.org/index.php?topic=108533.0)1. [Follow-on Bitcointalk.org Stratum Discussion](https://bitcointalk.org/index.php?topic=557991.0) |
aio-strawberry-sqlalchemy-mapper | This fork is a heavily modified version ofstrawberry-sqlalchemy-mapperwith the following additions/changes:Implements relay pagination (usingthis sqlakeyset fork)Fully asyncUses SQLAlchemy 2.0 stylePydantic integration (generated types can also be pydantic models)strawberry-sqlalchemy-mapperStrawberry-sqlalchemy-mapper is the simplest way to implement autogenerated strawberry types for columns and relationships in SQLAlchemy models.Instead of manually listing every column and relationship in a SQLAlchemy model, strawberry-sqlalchemy-mapper
lets you decorate a class declaration and it will automatically generate the necessary strawberry fields
for all columns and relationships (subject to the limitations below) in the given model.Native support for most of SQLAlchemy's most common types.Extensible to arbitrary custom SQLAlchemy types.Automatic batching of queries, avoiding N+1 queries when getting relationshipsSupport for SQLAlchemy >=1.4.xLightweight and fast.Getting Startedstrawberry-sqlalchemy-mapper is available onPyPipip install strawberry-sqlalchemy-mapperFirst, define your sqlalchemy model:# models.pyfromsqlalchemyimportColumn,Integer,Stringfromsqlalchemy.ext.declarativeimportdeclarative_baseBase=declarative_base()classEmployee(Base):__tablename__='employee'id=Column(UUID,primary_key=True)name=Column(String,nullable=False)password_hash=Column(String,nullable=False)department_id=Column(UUID,ForeignKey('department.id'))department=relationship('Department',back_populates='employees')classDepartment(Base):__tablename__="department"id=Column(UUID,primary_key=True)name=Column(String,nullable=False)employees=relationship('Employee',back_populates='department')Next, decorate a type withstrawberry_sqlalchemy_mapper.type()to register it as a strawberry type for the given SQLAlchemy model.
This will automatically add fields for the model's columns, relationships, association proxies,
and hybrid properties. For example:# elsewhere# ...fromstrawberry_sqlalchemy_mapperimportStrawberrySQLAlchemyMapperstrawberry_sqlalchemy_mapper=StrawberrySQLAlchemyMapper()@strawberry_sqlalchemy_mapper.type(models.Employee)classEmployee:__exclude__=["password_hash"]@strawberry_sqlalchemy_mapper.type(models.Department)classDepartment:[email protected]:@strawberry.fielddefdepartments(self):returndb.session.scalars(select(models.Department)).all()# context is expected to have an instance of StrawberrySQLAlchemyLoaderclassCustomGraphQLView(GraphQLView):defget_context(self):return{"sqlalchemy_loader":StrawberrySQLAlchemyLoader(bind=YOUR_SESSION),}# call finalize() before using the schema:# (note that models that are related to models that are in the schema# are automatically mapped at this stage -- e.g., Department is mapped# because employee.department is a relationshp to Department)strawberry_sqlalchemy_mapper.finalize()# only needed if you have polymorphic typesadditional_types=list(strawberry_sqlalchemy_mapper.mapped_types.values())schema=strawberry.Schema(query=Query,mutation=Mutation,extensions=extensions,types=additional_types,)# You can now query, e.g.:"""query {departments {idnameemployees {edge {node {idnamedepartment {# Just an example of nested relationshipsidname}}}}}}"""LimitationsSQLAlchemy Models -> Strawberry Types and Interfaces are expected to have a consistent
(customizable) naming convention. These can be configured by passingmodel_to_type_nameandmodel_to_interface_namewhen constructing the mapper (see the configuration sketch at the end of this entry).Natively supports the following SQLAlchemy types:Integer:int,Float:float,BigInteger:int,Numeric:Decimal,DateTime:datetime,Date:date,Time:time,String:str,Text:str,Boolean:bool,Unicode:str,UnicodeText:str,SmallInteger:int,SQLAlchemyUUID:uuid.UUID,VARCHAR:str,ARRAY[T]:List[T]# PostgreSQL arrayEnum:(the Python enum it is mapped to, decorated with @strawberry.enum)Additional types can be supported by passingextra_sqlalchemy_type_to_strawberry_type_map,
although support forTypeDecoratortypes is untested.Association proxies are expected to be of the formassociation_proxy('relationship1', 'relationship2'),
i.e., both properties are expected to be relationships.Roots of polymorphic hierarchiesare supported, but are also expected to be registered viastrawberry_sqlalchemy_mapper.interface(), and its concrete type and
its descendants are expected to inherit from the interface:classBook(Model):id=Column(UUID,primary_key=True)classNovel(Book):passclassShortStory(Book):pass# in another filestrawberry_sqlalchemy_mapper=StrawberrySQLAlchemyMapper()@strawberry_sqlalchemy_mapper.interface(models.Book)classBookInterface:pass@strawberry_sqlalchemy_mapper.type(models.Book)classBook:pass@strawberry_sqlalchemy_mapper.type(models.Novel)classNovel:pass@strawberry_sqlalchemy_mapper.type(models.ShortStory)classShortStory:passContributingWe encourage you to contribute to strawberry-sqlalchemy-mapper! Any contributions you make are greatly appreciated.If you have a suggestion that would make this better, please fork the repo and create a pull request. Don't forget to give the project a star! Thanks again!Fork the ProjectCreate your Feature Branch (git checkout -b feature)Commit your Changes (git commit -m 'Add some feature')Push to the Branch (git push origin feature)Open a Pull RequestPrerequisitesThis project usespre-commit_, please make sure to install it before making any
changes::pip install pre-commit
cd strawberry-sqlalchemy-mapper
pre-commit installIt is a good idea to update the hooks to the latest version::pre-commit autoupdateDon't forget to tell your contributors to also install and use pre-commit.Installationpipinstall-rrequirements.txtTestpytest⚖️ LICENSEMIT ©strawberry-sqlalchemy-mapper |
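The configuration sketch referenced in the limitations above: the keyword argument names model_to_type_name and model_to_interface_name come from the text, but the assumption that each one accepts a callable receiving the SQLAlchemy model class is mine and may not match the real signature.

```python
# Hypothetical sketch: the kwarg names are documented above, but passing callables
# that turn a model class into a GraphQL type/interface name is an assumption.
from strawberry_sqlalchemy_mapper import StrawberrySQLAlchemyMapper

strawberry_sqlalchemy_mapper = StrawberrySQLAlchemyMapper(
    model_to_type_name=lambda model: f"{model.__name__}Type",
    model_to_interface_name=lambda model: f"{model.__name__}Interface",
)
```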
aiostream | Generator-based operators for asynchronous iterationSynopsisaiostreamprovides a collection of stream operators that can be combined to create
asynchronous pipelines of operations.It can be seen as an asynchronous version ofitertools, although some aspects are slightly different.
Essentially, all the provided operators return a unified interface called a stream.
A stream is an enhanced asynchronous iterable providing the following features:Operator pipe-lining- using pipe symbol|Repeatability- every iteration creates a different iteratorSafe iteration context- usingasync withand thestreammethodSimplified execution- get the last element from a stream usingawaitSlicing and indexing- using square brackets[]Concatenation- using addition symbol+RequirementsThe stream operators rely heavily on asynchronous generators (PEP 525):python >= 3.6Stream operatorsThestream operatorsare separated in 7 categories:creationiterate,preserve,just,call,empty,throw,never,repeat,count,rangetransformationmap,enumerate,starmap,cycle,chunksselectiontake,takelast,skip,skiplast,getitem,filter,until,takewhile,dropwhilecombinationmap,zip,merge,chain,ziplatestaggregationaccumulate,reduce,listadvancedconcat,flatten,switch,concatmap,flatmap,switchmaptimingspaceout,timeout,delaymiscellaneousaction,printDemonstrationThe following example demonstrates most of the streams capabilities:importasynciofromaiostreamimportstream,pipeasyncdefmain():# Create a counting stream with a 0.2 seconds intervalxs=stream.count(interval=0.2)# Operators can be piped using '|'ys=xs|pipe.map(lambdax:x**2)# Streams can be slicedzs=ys[1:10:2]# Use a stream context for proper resource managementasyncwithzs.stream()asstreamer:# Asynchronous iterationasyncforzinstreamer:# Print 1, 9, 25, 49 and 81print('->',z)# Streams can be awaited and return the last valueprint('9² = ',awaitzs)# Streams can run several timesprint('9² = ',awaitzs)# Streams can be concatenatedone_two_three=stream.just(1)+stream.range(2,4)# Print [1, 2, 3]print(awaitstream.list(one_two_three))# Run main coroutineloop=asyncio.get_event_loop()loop.run_until_complete(main())loop.close()More examples are available in theexample sectionof the documentation.ContactVincent Michel:[email protected] |
aio.stream | Stream utils for aiohttp and aiofiles. |
aiostun | Async STUN client for PythonKey FeaturesSupport RFC3489Transports UDP, TCP and TLSIPv4 and IPv6 supportSupport RFC5389Support RFC5780Support RFC8489InstallationThis module can be installed frompypiwebsitepipinstallaiostunGetting your mapped addressimportaiostunimportasyncioasyncdefmain():asyncwithaiostun.Client(host='openrelay.metered.ca',port=443,ipproto=aiostun.TLS)asstunc:mapped_addr=awaitstunc.get_mapped_address()print(mapped_addr){'family':'IPv4','port':38778,'ip':'xx.xx.xx.xx'}asyncio.run(main())Default constants for family:aiostun.IP4(default)aiostun.IP6Default constants for IP protocol:aiostun.UDP(default)aiostun.TCPaiostun.TLSThe default remote port is3478with a timeout connection of2 seconds.For developersRunning all test units.python3-munittestdiscovertests/-v |
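A second, lightly hedged variant of the example above: the same mapped-address query over plain TCP, reusing only the documented Client arguments, the aiostun.TCP constant and the documented default port 3478 (the host below is a placeholder).

```python
import asyncio
import aiostun

async def main():
    # Same lookup as above, but over TCP on the documented default port 3478.
    # "stun.example.org" is a placeholder; substitute a real STUN server.
    async with aiostun.Client(host="stun.example.org", port=3478, ipproto=aiostun.TCP) as stunc:
        mapped_addr = await stunc.get_mapped_address()
        print(mapped_addr)

asyncio.run(main())
```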
aiosu | Simple and fast asynchronous osu! API v1 and v2 library with various utilities.FeaturesSupport for modern async syntax (async with)Support for API v1 and API v2Rate limit handlingUtilities for osu! related calculationsEasy to useInstallingPython 3.9 or higher is requiredTo install the library, simply run the following commands# Linux/macOSpython3-mpipinstall-Uaiosu# Windowspy-3-mpipinstall-UaiosuTo install the development version, do the following:$gitclonehttps://github.com/NiceAesth/aiosu$cdaiosu$python3-mpipinstall-U.API v1 Exampleimportaiosuimportasyncioasyncdefmain():# async with syntaxasyncwithaiosu.v1.Client("osu api token")asclient:user=awaitclient.get_user(7782553)# regular syntaxclient=aiosu.v1.Client("osu api token")user=awaitclient.get_user(7782553)awaitclient.aclose()if__name__=="__main__":asyncio.run(main())API v2 Exampleimportaiosuimportasyncioimportdatetimeasyncdefmain():token=aiosu.models.OAuthToken.model_validate(json_token_from_api)# ortoken=aiosu.models.OAuthToken(access_token="access token",refresh_token="refresh token",expires_on=datetime.datetime.utcnow()+datetime.timedelta(days=1),# can also be string)# async with syntaxasyncwithaiosu.v2.Client(client_secret="secret",client_id=1000,token=token)asclient:user=awaitclient.get_me()# regular syntaxclient=aiosu.v2.Client(client_secret="secret",client_id=1000,token=token)user=awaitclient.get_me()awaitclient.aclose()if__name__=="__main__":asyncio.run(main())You can find more examples in the examples directory.ContributingPlease read theCONTRIBUTING.rstto learn how to contribute to aiosu!Acknowledgmentsdiscord.pyfor README formattingosu!Akatsukifor performance and accuracy utils |
aiosubprocess | UsageReleaseDevelopmentaiosubprocessAzero-dependencyasync subprocess that keeps on getting stdout and stderr.How to useExample 1: Hello WorldA classic Hello World. It printsHello World!in the shell and
redirects the stdout toprint().importasynciofromaiosubprocessimportProcessasyncio.get_event_loop().run_until_complete(Process("echo Hello World!",stdout=print).shell())$>pythonex1_minimal.py[AIOSubprocess-0]HelloWorld!
Processfinishedwithexitcode0Example 2: Two ProcessesOne process writes to a file, and a second process logs the content of the file
in real time.importasynciofromaiosubprocessimportProcessloop=asyncio.get_event_loop()reader=Process("""for i in {1..5}doecho "Hello $i World" > tempfile.logsleep 1done""",loop=loop,name="Writer",)writer=Process("timeout --foreground 10s tail -f tempfile.log",loop=loop,name="Reader",expected_returncode=124,# Because timeout is expected)awaitable_reader=reader.shell()awaitable_writer=writer.shell()gathered=asyncio.gather(awaitable_reader,awaitable_writer,loop=loop)asyncio.get_event_loop().run_until_complete(gathered)assertgathered.result()==[True,True]Which does exactly this:Why?There are many scenario where we need to keep an eye on
a subprocess output. If we want to do so in realtime
(and redirect it to logs or a GUI), the boilerplate is
tedious.The other solution is to wait for the subprocess to
exit and read the stdout/stderr afterwards.This library implements this boilerplate, so you don't have to. |
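For comparison, this is roughly the plain-asyncio boilerplate that the library wraps for you: spawning a shell command and forwarding its output line by line as it appears. The snippet uses only the standard library, not aiosubprocess itself.

```python
import asyncio

async def run_and_stream(cmd: str) -> int:
    # Plain asyncio: spawn the command and forward merged stdout/stderr in real time.
    proc = await asyncio.create_subprocess_shell(
        cmd,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.STDOUT,
    )
    assert proc.stdout is not None
    async for line in proc.stdout:
        print(line.decode().rstrip())
    return await proc.wait()

asyncio.run(run_and_stream("echo Hello World!"))
```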
aio.subprocess | Subprocess utils for asyncio. |
aiosubpub | AioSubPubAsync pub sub implementation.Inspired by someone else whose name I cannot find anymore. If you see your code (I did some improvements on it I think) please let me know and I am happy to give you credit.Installationpip install aiosubpubUsageimportaiosubpubimportasyncioloop=asyncio.get_event_loop()# create a channela_channel=aiosubpub.Channel()# subscribe to the channel using a callback.defcall_back(data):print(data)subscription=loop.create_task(a_channel.subscribe(call_back))# Publish a message.a_channel.publish("a message")subscription.un_subscribe()# Without callback:subscription=a_channel.get_subscription()asyncdef_custom_subscriber():withsubscriptionassub:result=awaitsub.get()print(result)a_channel.publish("a message")result=await_custom_subscriber()changelog1.0.10Addget_latestto the channel. |
aiosumma | No description available on PyPI. |
aiosumma-wheel | No description available on PyPI. |
aiosupabase | aiosupabaseUnofficial Asyncronous Python Client for SupabaseLatest Version:FeaturesUnified Asyncronous and Syncronous Python Client forSupabaseSupports Python 3.6+Strongly Typed withPydanticUtilizes Environment Variables for ConfigurationAPIsBoth async and sync Apis are available for the followingAuthPostgrestStorageRealtimeFunctionsInstallation# Install from PyPIpipinstallaiosupabase# Install from sourcepipinstallgit+https://github.com/trisongz/aiosupabase.gitUsageExample UsageimportasynciofromaiosupabaseimportSupabasefromaiosupabase.utilsimportlogger"""Environment Vars that map to Supabase.configure:all vars are prefixed with SUPABASE_SUPABASE_URL (url): str | Supabase URLSUPABASE_KEY (key): str | API KeySUPABASE_DEBUG_ENABLED (debug_enabled): bool - defaults to FalseSUPABASE_CLIENT_SCHEMA (client_schema): str - defaults to 'public'SUPABASE_HEADERS (headers): Dict - defaults to {}SUPABASE_AUTO_REFRESH_TOKENS (auto_refresh_tokens): bool - defaults to TrueSUPABASE_PERSIST_SESSION (persist_session): bool - defaults to TrueSUPABASE_REALTIME_CONFIG (realtime_config): Dict - defaults to NoneSUPABASE_TIMEOUT (timeout): int - defaults to 5 [DEFAULT_POSTGREST_CLIENT_TIMEOUT]SUPABASE_COOKIE_OPTIONS (cookie_options): Dict - defaults to NoneSUPABASE_REPLACE_DEFAULT_HEADERS (replace_default_headers): bool - defaults to False"""Supabase.configure(url='...',key="...",debug_enabled=True,)asyncdefasync_fetch_table(table:str="profiles",select:str="*"):# Async fetch# note that table is `atable` for asyncreturnawaitSupabase.atable(table).select(select).execute()deffetch_table(table:str="profiles",select:str="*"):# Sync fetchreturnSupabase.table(table).select(select).execute()asyncdefasync_fetch_users():# Async `ListUsers`# note that most async methods are prefixed with# `async_`returnawaitSupabase.auth.async_list_users()deffetch_users():# Sync `ListUsers`# note that most async methods are prefixed withreturnSupabase.auth.list_users()asyncdefrun_test():# Async fetchasync_data=awaitasync_fetch_table()logger.info(f"async_data:{async_data}")async_users=awaitasync_fetch_users()logger.info(f"async_users:{async_users}")# Sync fetchsync_data=fetch_table()logger.info(f"sync_data:{sync_data}")sync_users=fetch_users()logger.info(f"sync_users:{sync_users}")asyncio.run(run_test()) |
aiosvkmimer | aiosvkmimerAsyncio library for SVK Mimerhttps://mimer.svk.se/UNDER DEVELOPMENT!!!Example#!/usr/bin/env python3
import asyncio
import logging
import pprint
from aiosvkmimer.client import Mimer
from prettytable import PrettyTable
# settings
AVAILABLE_KW = 8
LOGGING_LEVEL = logging.INFO
PERIOD_FROM = '2023-09-01'
PERIOD_TO = '2023-09-01'
# configure logging
logging.basicConfig(
format='%(asctime)s.%(msecs)03d %(levelname)s %(module)s - %(funcName)s: %(message)s',
datefmt='%Y-%m-%d %H:%M:%S',
level=LOGGING_LEVEL
)
def nice_price_output(prices, prices_sum, description):
table = PrettyTable()
table.field_names = ['Date', f'{description} Price (SEK)']
table.align = 'l'
table.padding_width = 2
for date, price in prices.items():
table.add_row([date, price])
table.add_row(['Total', prices_sum])
print(table)
async def main():
mimer = Mimer(
kw_available=AVAILABLE_KW
)
await mimer.fetch(
period_from=PERIOD_FROM,
period_to=PERIOD_TO
)
prices_fcr_n = mimer.get_fcr_n_prices()
prices_fcr_d_up = mimer.get_fcr_d_up_prices()
prices_fcr_d_down = mimer.get_fcr_d_down_prices()
nice_price_output(
prices = prices_fcr_n,
prices_sum = mimer.get_sum_prices(prices_fcr_n),
description = 'FCR-N'
)
nice_price_output(
prices = prices_fcr_d_up,
prices_sum = mimer.get_sum_prices(prices_fcr_d_up),
description = 'FCR-D UP'
)
nice_price_output(
prices = prices_fcr_d_down,
prices_sum = mimer.get_sum_prices(prices_fcr_d_down),
description = 'FCR-D DOWN'
)
if __name__ == '__main__':
asyncio.run(main())Output+-----------------------+----------------------+
| Date | FCR-N Price (SEK) |
+-----------------------+----------------------+
| 2023-09-01 00:00:00 | 5.486731094065 |
| 2023-09-01 01:00:00 | 5.473278975271923 |
| 2023-09-01 02:00:00 | 5.483046953399573 |
| 2023-09-01 03:00:00 | 6.024919011768675 |
| 2023-09-01 04:00:00 | 6.104584800746637 |
| 2023-09-01 05:00:00 | 5.986810670706285 |
| 2023-09-01 06:00:00 | 5.294082266137055 |
| 2023-09-01 07:00:00 | 5.916679569335204 |
| 2023-09-01 08:00:00 | 5.313394297072511 |
| 2023-09-01 09:00:00 | 5.348831037548244 |
| 2023-09-01 10:00:00 | 5.9203287697025315 |
| 2023-09-01 11:00:00 | 5.135825881194346 |
| 2023-09-01 12:00:00 | 5.444676166159138 |
| 2023-09-01 13:00:00 | 5.870019916245568 |
| 2023-09-01 14:00:00 | 5.850853828046824 |
| 2023-09-01 15:00:00 | 5.273275331522941 |
| 2023-09-01 16:00:00 | 5.797683169549086 |
| 2023-09-01 17:00:00 | 5.319454888932867 |
| 2023-09-01 18:00:00 | 5.611263498677803 |
| 2023-09-01 19:00:00 | 5.239250943858182 |
| 2023-09-01 20:00:00 | 5.93201670243 |
| 2023-09-01 21:00:00 | 5.210278111126205 |
| 2023-09-01 22:00:00 | 5.133704096332981 |
| 2023-09-01 23:00:00 | 5.356158378649376 |
| Total | 133.52714835847897 |
+-----------------------+----------------------+
+-----------------------+------------------------+
| Date | FCR-D UP Price (SEK) |
+-----------------------+------------------------+
| 2023-09-01 00:00:00 | 3.311632998865711 |
| 2023-09-01 01:00:00 | 3.288935554512015 |
| 2023-09-01 02:00:00 | 3.213295847960331 |
| 2023-09-01 03:00:00 | 3.3367902114671084 |
| 2023-09-01 04:00:00 | 3.398391009694043 |
| 2023-09-01 05:00:00 | 3.3220856146399993 |
| 2023-09-01 06:00:00 | 3.57437562248 |
| 2023-09-01 07:00:00 | 3.3276878273599997 |
| 2023-09-01 08:00:00 | 3.5360363853152434 |
| 2023-09-01 09:00:00 | 3.3858881034742017 |
| 2023-09-01 10:00:00 | 3.4942032289633813 |
| 2023-09-01 11:00:00 | 3.4234401975693007 |
| 2023-09-01 12:00:00 | 3.398380667428867 |
| 2023-09-01 13:00:00 | 3.4784308005587947 |
| 2023-09-01 14:00:00 | 3.467791909074881 |
| 2023-09-01 15:00:00 | 3.356563749603953 |
| 2023-09-01 16:00:00 | 3.4768104629727423 |
| 2023-09-01 17:00:00 | 3.3993578147653962 |
| 2023-09-01 18:00:00 | 3.4107899648525573 |
| 2023-09-01 19:00:00 | 3.399232440138542 |
| 2023-09-01 20:00:00 | 3.26405194036575 |
| 2023-09-01 21:00:00 | 3.3716430111448683 |
| 2023-09-01 22:00:00 | 3.287736021089064 |
| 2023-09-01 23:00:00 | 3.3028224861553075 |
| Total | 81.22637387045205 |
+-----------------------+------------------------+
+-----------------------+--------------------------+
| Date | FCR-D DOWN Price (SEK) |
+-----------------------+--------------------------+
| 2023-09-01 00:00:00 | 6.320488106056495 |
| 2023-09-01 01:00:00 | 6.112997699823686 |
| 2023-09-01 02:00:00 | 6.134507849844338 |
| 2023-09-01 03:00:00 | 6.499827573714409 |
| 2023-09-01 04:00:00 | 6.844614022955501 |
| 2023-09-01 05:00:00 | 6.503989107457241 |
| 2023-09-01 06:00:00 | 5.819638910411214 |
| 2023-09-01 07:00:00 | 5.3365174089495655 |
| 2023-09-01 08:00:00 | 5.884750472026569 |
| 2023-09-01 09:00:00 | 5.607872638348425 |
| 2023-09-01 10:00:00 | 6.010793589290964 |
| 2023-09-01 11:00:00 | 5.860842833901714 |
| 2023-09-01 12:00:00 | 5.903521302210824 |
| 2023-09-01 13:00:00 | 5.986418718780103 |
| 2023-09-01 14:00:00 | 5.638328690026532 |
| 2023-09-01 15:00:00 | 5.50365199491099 |
| 2023-09-01 16:00:00 | 6.00273359520028 |
| 2023-09-01 17:00:00 | 5.799405263827196 |
| 2023-09-01 18:00:00 | 6.062401602087815 |
| 2023-09-01 19:00:00 | 5.780730776740605 |
| 2023-09-01 20:00:00 | 6.2497822237014375 |
| 2023-09-01 21:00:00 | 5.658725413145366 |
| 2023-09-01 22:00:00 | 5.317705775310582 |
| 2023-09-01 23:00:00 | 6.1910944685587 |
| Total | 143.03134003728056 |
+-----------------------+--------------------------+ |
aio-swagger | No description available on PyPI. |
aioswagger11 | Aboutaioswagger11 is an asyncio-compatible clone of swagger.py, capable of
understanding Swagger 1.1 definitions (only).As swagger has been renamed to OpenAPI which by now has version 3.0
(and has an actual specification – unlike Swagger 1.1) this library is
(mostly) only usable with Asterisk, which still uses Swagger 1.1
declarations.Aioswagger11 supports a WebSocket extension, allowing a WebSocket to
be documented, and auto-generated WebSocket client code.from swagger.py:Swagger.py is a Python library for usingSwaggerdefined APIs.Swagger itself is best described on the Swagger home page:Swagger is a specification and complete framework implementation for
describing, producing, consuming, and visualizing RESTful web
services.TheSwagger
specificationdefines
how APIs may be described using Swagger.UsageInstall the latest release from PyPI.$ sudo pip install aioswagger11Or install from source using thesetup.pyscript.$ sudo ./setup.py installAPIaioswagger11 will dynamically build an object model from a Swagger-enabled
RESTful API.Here is a simple example using theAsterisk REST
Interface#!/usr/bin/env python3importjsonimportasyncioimportaiohttpfromaioswagger11.clientimportSwaggerClientfromaioswagger11.http_clientimportAsynchronousHttpClienthttp_client=AsynchronousHttpClient()http_client.set_api_key('localhost','hey:peekaboo')asyncdefrun(ari,msg_json):channelId=msg_json['channel']['id']awaitari.channels.answer(channelId=channelId)awaitari.channels.play(channelId=channelId,media='sound:hello-world')# In a real program you should wait for the PlaybackFinished event insteadawaitasyncio.sleep(3)awaitari.channels.continueInDialplan(channelId=channelId)asyncdefmain():ari=SwaggerClient("http://localhost:8088/ari/api-docs/resources.json",http_client=http_client)ws=ari.events.eventWebsocket(app='hello')asyncformsg_strinws:ifmsg.typein{aiohttp.WSMsgType.CLOSED,aiohttp.WSMsgType.CLOSING}:breakelifmsg.type!=aiohttp.WSMsgType.TEXT:continue# ignoremsg_json=json.loads(msg_str)ifmsg_json['type']=='StasisStart':asyncio.ensure_future(run(ari,msg_json))if__name__=="__main__":loop=asyncio.get_event_loop()loop.run_until_complete(main())Data modelThe data model presented by theswagger_modelmodule is nearly
identical to the original Swagger API resource listing and API
declaration. This means that if you add extra custom metadata to your
docs (such as a_authoror_copyrightfield), they will carry
forward into the object model. I recommend prefixing custom fields with
an underscore, to avoid collisions with future versions of Swagger.There are a few meaningful differences.Resource listingThefileandbase_dirfields have been added, referencing the
original.jsonfile.The objects in aresource_listing’sapiarray contain a
fieldapi_declaration, which is the processed result from the
referenced API doc.API declarationAfilefield has been added, referencing the original.jsonfile.DevelopmentThe code is documented usingSphinx, which
allowsIntelliJ IDEAto do a better job at inferring types for autocompletion.To keep things isolated, I also recommend installing (and using)virtualenv.$ sudo pip install virtualenv
$ mkdir -p ~/virtualenv
$ virtualenv ~/virtualenv/swagger
$ . ~/virtualenv/swagger/bin/activateSetuptoolsis used for
building.Pytestis used
for unit testing, with thecoverageplugin installed to
generate code coverage reports. Pass--with-coverageto generate
the code coverage report. HTML versions of the reports are put incover/index.html.$ ./setup.py develop # prep for development (install deps, launchers, etc.)
$ ./setup.py pytest # run unit tests
$ ./setup.py bdist_egg # build distributableTestingSimply runpython3 setup.py pytest.Note that testing this module requires a version of httpretty that’s been
fixed to work with aiohttp.LicenseCopyright (c) 2013, Digium, Inc.
Copyright (c) 2018, Matthias Urlichsaioswagger11 is licensed with aBSD 3-Clause
License.The current author humbly requests that you share any further bug fixes or
enhancements to this code. |
aioswitchbee | pySwitchbeeA Python module library to controlSwitchBeesmart home devices.Example code usage:fromasyncioimportget_event_loopfromaiohttpimportClientSession,ClientTimeout,TCPConnectorfromswitchbee.apiimportCentralUnitAPIfromswitchbee.deviceimportApiStateCommand,DeviceTypeasyncdefmain():session=ClientSession(connector=TCPConnector(ssl=False),timeout=ClientTimeout(total=3),)cu=CentralUnitAPI("192.168.50.2","user","pass",session)awaitcu.connect()print(f"Central Unit Name:{cu.name}")print(f"Central Unit MAC:{cu.mac}")print(f"Central Unit Version:{cu.version}")devices=awaitcu.devicesfordeviceindevices:# set the dimmer lights to 50% brightnessifdevice.type==DeviceType.Dimmer:print("Discovered Dimmer device called{device.name}"" current brightness is{device.brigt}")awaitcu.set_state(device.id,50)# set the shutters position to 30% openedifdevice.type==DeviceType.Shutter:print("Discovered Shutter device called{device.name}"" current position is{device.position}")awaitcu.set_state(device.id,30)# turn off switchesifdevice.type==DeviceType.Switch:print("Discovered Switch device called{device.name}"" current state is{device.state}")awaitcu.set_state(device.id,ApiStateCommand.OFF)# set timer switch on for 10 minutesifdevice.type==DeviceType.TimedPower:print("Discovered Timed Power device called{device.name}"" current state is{device.state}with{device.minutes_left}""minutes left until shutdown")awaitcu.set_state(device.id,10)session.close()if__name__=="__main__":get_event_loop().run_until_complete(main())exit()Using the CLI tool:Alternatively, it is possible to controlSwitchBeedevices using the cli toolswitchbee_cli.pyas following:To list devices that currently on:python switchbee_cli.py -i 192.168.50.2 -u USERNAME -p PASSWORD get_states --only-on'_state': 'ON',
'hardware': <HardwareType.Switch: 'DIMMABLE_SWITCH'>,
'id': 311,
'name': 'Ceiling',
'type': <DeviceType.Switch: 'SWITCH'>,
'zone': 'Outdoo Storage'}
{ '_state': 'ON',
'hardware': <HardwareType.Switch: 'DIMMABLE_SWITCH'>,
'id': 142,
'name': 'Spotlights',
'type': <DeviceType.Switch: 'SWITCH'>,
'zone': 'Porch'}To set shutter with device id 392 position 50%:python switchbee_cli.py -i 192.168.50.2 -u USERNAME -p PASSWORD set_state --device-id 392 --state 50To turn on Power Timed Switch with device id 450 for 30 minutes:python switchbee_cli.py -i 192.168.50.2 -u USERNAME -p PASSWORD set_state --device-id 450 --state 30To turn off light with device id 122:python switchbee_cli.py -i 192.168.50.2 -u USERNAME -p PASSWORD set_state --device-id 122 --state OFF |
aioswitchbotmeter | No description available on PyPI. |
aioswitcher | Help WantedAioswitcher project is looking for maintainers and contributors!For various reasons, I can only keep maintaining this project as far as dependency bumps and publishing.As for new features and the occasional bug support, these will require other maintainers/contributors.If that's you - please feel free to ping me and I will do all I can to make the onboarding process easy.Switcher Python IntegrationPyPi module integrating with variousSwitcherdevices.Check out thewiki pagesfor a list of supported devices.pipinstallaioswitcherDocumentationWikiContributingExample UsageState Bridgeasyncdefprint_devices(delay):defon_device_found_callback(device):# a switcher device will broadcast a state message approximately every 4 secondsprint(asdict(device))asyncwithSwitcherBridge(on_device_found_callback):awaitasyncio.sleep(delay)# run the bridge for 60 secondsasyncio.run(print_devices(60))Power Plug APIasyncdefcontrol_power_plug(device_ip,device_id,device_key):# for connecting to a device we need its id, login key and ip addressasyncwithSwitcherType1Api(device_ip,device_id,device_key)asapi:# get the device current stateawaitapi.get_state()# turn the device onawaitapi.control_device(Command.ON)# turn the device offawaitapi.control_device(Command.OFF)# set the device name to 'my new name'awaitapi.set_device_name("my new name")asyncio.run(control_power_plug("111.222.11.22","ab1c2d","00"))Water Heater APIasyncdefcontrol_water_heater(device_ip,device_id,device_key):# for connecting to a device we need its id, login key and ip addressasyncwithSwitcherType1Api(device_ip,device_id,device_key)asapi:# get the device current stateawaitapi.get_state()# turn the device on for 15 minutesawaitapi.control_device(Command.ON,15)# turn the device offawaitapi.control_device(Command.OFF)# set the device name to 'my new name'awaitapi.set_device_name("my new name")# configure the device for 02:30 auto shutdownawaitapi.set_auto_shutdown(timedelta(hours=2,minutes=30))# get the schedules from the deviceawaitapi.get_schedules()# delete and existing schedule with id 1awaitapi.delete_schedule("1")# create a new recurring schedule for 13:00-14:30# executing on sunday and fridayawaitapi.create_schedule("13:00","14:30",{Days.SUNDAY,Days.FRIDAY})asyncio.run(control_water_heater("111.222.11.22","ab1c2d","00"))Runner APIasyncdefcontrol_runner(device_ip,device_id,device_key):# for connecting to a device we need its id, login key and ip addressasyncwithSwitcherType2Api(device_ip,device_id,device_key)asapi:# get the device current stateawaitapi.get_shutter_state()# open the shutter to 30%awaitapi.set_position(30)# stop the shutter if currently rollingawaitapi.stop()asyncio.run(control_runner("111.222.11.22","ab1c2d","00"))Breeze APIasyncdefcontrol_breeze(device_ip,device_id,device_key,remote_manager,remote_id):# for connecting to a device we need its id, login key and ip addressasyncwithSwitcherType2Api(device_ip,device_id,device_key)asapi:# get the device current stateawaitapi.get_breeze_state()# initialize the Breeze RemoteManager and get the remoteremote=remote_manager.get_remote(remote_id)# prepare a control command that turns on the Breeze# set to 24 degree (Celsius) cooling with vertical swing# send command to the deviceawaitapi.control_breeze_device(remote,DeviceState.ON,ThermostatMode.COOL,24,ThermostatFanLevel.MEDIUM,ThermostatSwing.ON,)# create the remote manager outside the context for 
# re-using
remote_manager = SwitcherBreezeRemoteManager()
asyncio.run(control_breeze("111.222.11.22", "ab1c2d", "00", remote_manager, "DLK65863"))
Command Line Helper Scriptsdiscover_devices.pycan discover devices and their
states.control_device.pycan control a device.DisclaimerThis isNOTan official module and it isNOTofficially supported by the vendor.That said, thanks are in order to all the people atSwitcherfor their cooperation and general support.ContributorsThanks goes to these wonderful people (emoji key):Aviad Golan🔣Dolev Ben Aharon📖Fabian Affolter💻Itzik Ephraim💻Kesav890📖Liad Avraham💻Or Bin💻Shai rod🔣Shay Levy💻🤔YogevBokobza💻⚠️🚧dmatik📝🤔📓jafar-atili💻📖 |
aiosyncapi | No description available on PyPI. |
aiosyncthing | aiosyncthingAsynchronous Python client for theSyncthingREST API.Inspired bypython-syncthing,
some snippets were copied frompython-fumisNOTE: The package is in active development.Not all features of the API are implemented.Installationpip install aiosyncthingUsageimportasynciofromaiosyncthingimportSyncthingasyncdefmain():asyncwithSyncthing("API Key")asclient:# interact with the client herepassif__name__=="__main__":asyncio.run(main())SyncthingSyncthing is the entrypoint class, it acts as an async context manager and provides access to endpoint namespaces.Initializationdef__init__(self,api_key,# your API Keyurl="http://127.0.0.1:8384",# A base URL of the server, https://syncthing.example.com:443/something is also possibletimeout=DEFAULT_TIMEOUT,# Timeout in secondsverify_ssl=True,# Perform SSL verificationloop=None,# event loopsession=None# client session,)...In case if the api_key is invalid,aiosyncthing.exceptions.SyncthingErrorwill be raised on attempt to perform any request exceptclient.system.ping(), this one only raisesaiosyncthing.exceptions.PingError.System namespaceProvides access to theSystem EndpointspingReturns none if ping is successful or raisessyncthing.exceptions.PingErrorawaitclient.system.ping()configReturns a dict with the server config or raisessyncthing.exceptions.SyncthingErrorawaitclient.system.config()statusReturns a dict with the server status or raisessyncthing.exceptions.SyncthingErrorawaitclient.system.status()versionReturns a dict with the server version or raisessyncthing.exceptions.SyncthingErrorawaitclient.system.version()pausePauses synchronization with all devices or with the selected device or raisessyncthing.exceptions.SyncthingError,
if the passed device is unknown to the server,syncthing.exceptions.UnknownDeviceErrorwill be raised. Always returnsNoneawaitclient.system.pause()# pause allawaitclient.system.pause(device_id)# eg: 'MTLMICV-YE72URC-NF4LBO3-2LVPTFZ-LLCZHEZ-2F3OEJS-R6CWZVE-7VXHFQA"resumeResumes synchronization with all devices or with a selected device or raisessyncthing.exceptions.SyncthingError,
if the passed device is unknown to the server,syncthing.exceptions.UnknownDeviceErrorwill be raised. Always returnsNoneawaitclient.system.resume()# resume allawaitclient.system.resume(device_id)# eg: 'MTLMICV-YE72URC-NF4LBO3-2LVPTFZ-LLCZHEZ-2F3OEJS-R6CWZVE-7VXHFQA"Database namespaceProvides access to theDatabase EndpointsstatusReturns a dict with the folder status or raisessyncthing.exceptions.SyncthingError. If the folder id is unknown to
the server,syncthing.exceptions.UnknownFolderErrorwill be raised.awaitclient.database.status(folder_id)# eg: 'GXWxf-3zgnU'Events namespaceProvides access to theEvents EndpointslistenIs an async generator function that listens to theEvent API, yields events one by one and hides the complexity of long polling.
Raisessyncthing.exceptions.SyncthingErrorin case of error, handles timeouts internally and reconnects to the
endpoint.asyncforeventinclient.events.listen():print(event)last_seen_idReturns the id of the last received event of the previous batch.asyncforeventinclient.events.listen():ifevents.last_seen_id==0:continue# skip first batch because it's historical dataLicenseMIT LicenseCopyright (c) 2020 Gleb SinyavskiyPermission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE. |
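Putting the pieces documented above together, a small sketch that pauses a single device and handles the listed exceptions; the method and exception names are the ones described above, and the device id is a placeholder.

```python
import asyncio
from aiosyncthing import Syncthing
from aiosyncthing.exceptions import SyncthingError, UnknownDeviceError

async def main():
    async with Syncthing("API Key") as client:
        try:
            # Pause synchronization with one device (placeholder id).
            await client.system.pause("MTLMICV-YE72URC-NF4LBO3-2LVPTFZ-LLCZHEZ-2F3OEJS-R6CWZVE-7VXHFQA")
        except UnknownDeviceError:
            print("The server does not know this device id")
        except SyncthingError as error:
            print(f"Request failed: {error}")

asyncio.run(main())
```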
aiosysbus | aiosysbusManage your Livebox in PythonEasily manage your Livebox in Python.
Check your config, configure your dhcp, disable your wifi, monitor your LAN activity and many others, on LAN or remotely.aiosysbus is a Python library for the Livebox v3.This project is based on stilllman/aiofreepybox, which provides the same features as aiofreepybox in a synchronous manner.WARNINGVersion 1.0.0 and above makes all these calls asynchronously.
It breaks compatibility with code written for previous versions.InstallUse the PIP package manager$pipinstallaiosysbusOr manually download and install the latest version from github$gitclonehttps://github.com/cyr-ius/aiosysbus.git
$pythonsetup.pyinstallGet startedaiosysbus < 1.0.0# Import the aiosysbus package.fromaiosysbusimportAIOSysbusasyncdefreboot()# Instantiate the Sysbus class using default options.lvbx=AIOSysbus('192.168.1.1','80','xxxxxx')# Connect to the livebox with default options.lvbx.connect()# Do something useful, rebooting your livebox for example.lvbx.system.reboot()# Properly close the session.lvbx.close()aiosysbus >= 1.0.0importasyncioimportloggingasyncdefasync_main()->None:# Instantiate the Sysbus class using default options.api=AIOSysbus(username=xxxx,password=xxxx,host=HOST)# Connect to the livebox.awaitapi.async_connect()# Query exampleparameters={"parameters":{"expression":{"wifi":"wifi && .Active==False"}}}devices=awaitapi.devices.async_get_devices(parameters)awaitapi.async_close()if__name__=="__main__":loop=asyncio.get_event_loop()loop.run_until_complete(async_main())Have a look at theexample.pyfor a more complete overview.Notes on HTTPSNot implemented |
aiot | ## Python SDK for the MobiFone AIoT platformThis is an example project demonstrating how to publish a Python module to PyPI.## InstallationRun the following to install:`python pip install aiot `## Developing aiotTo install aiot, along with the tools you need to develop and run tests, run the following in your virtualenv:`bash $ pip install -e .[dev] ` |
aio-taginfo | A typed async client for thetaginfoAPI, a system for finding and aggregating
information aboutOpenStreetMap'stags, and making it browsable and searchable.This library makes use ofaiohttpfor requests, andPydanticfor parsing
and validating the responses.ContentsRationaleUsageExampleEndpointsSee alsoAn overview of modules, classes and functions can be found in theAPI referenceThe version history is available inCHANGELOG.mdThetaginfo website,
itsAPI documentation,
and itsOSM Wiki pageRationaleOpenStreetMap uses tags to add meaning to geographic objects. There is no fixed
list of those tags. New tags can be invented and used as needed. Everybody can
come up with a new tag and add it to new or existing objects. This makes
OpenStreetMap enormously flexible, but sometimes also a bit hard to work with.Whether you are contributing to OSM or using the OSM data, there are always
questions like: What tags do people use for feature X? What tags can I use for
feature Y so that it appears properly on the map? Is the tag Z described on the
wiki actually in use and where?Taginfo helps you by showing statistics about which tags are actually in the
database, how many people use those tags, where they are used and so on. It also
gets information about tags from the wiki and from other places. Taginfo tries
to bring together all information about tags to help you understand how they are
used and what they mean.Taginfo has an API that lets you access the contents of its databases in several
ways. The API is used internally for the web user interface and can also be used
by anybody who wants to integrate taginfo data into their websites or
applications.UsageThe API is intended for the use of the OpenStreetMap community. Do not use it
for other services. If you are not sure, ask on the mailing list (see below).Always use a sensible User-agent header with enough information that we can
contact you if there is a problem.The server running the taginfo API does not have unlimited resources. Please use
the API responsibly. Do not create huge amounts of requests to get the whole
database or large chunks of it, instead use thedatabase downloadsprovided.
If you are using the API and you find it is slow, you are probably overusing it.If you are using the taginfo API it is recommended that you join thetaginfo-dev mailing list. Updates to the API will be announced there and this
is also the right place for your questions.The data available through taginfo is licenced underODbL,
the same license as the OpenStreetMap data.OpenStreetMap®is open data, licensed under theOpen Data Commons Open Database License(ODbL)
by theOpenStreetMap Foundation(OSMF).You are free to copy, distribute, transmit and adapt our data, as long as you
credit OpenStreetMap and its contributors. If you alter or build upon our data,
you may distribute the result only under the same licence. The fulllegal codeexplains your rights and responsibilities.ExampleHere is an example of an API request using this library:fromaio_taginfoimportkey_overview# either use a temporary session…response:Response[KeyOverview]=awaitkey_overview(key="amenity")# …or provide your ownasyncwithaiohttp.ClientSession()assession:response:Response[KeyOverview]=awaitkey_overview(key="amenity",session=session)EndpointsThis library is early in development and most endpoints are still missing.EndpointSchema/api/4/key/chronologymultiple/api/4/key/combinationspaginated✅/api/4/key/distribution/nodesimage/api/4/key/distribution/waysimage✅/api/4/key/overviewsingle✅/api/4/key/prevalent_valuesmultiple/api/4/key/projectspaginated✅/api/4/key/similarpaginated/api/4/key/statsmultiple/api/4/key/valuespaginated/api/4/key/wiki_pagesmultiple/api/4/keys/allpaginated/api/4/keys/similarpaginated/api/4/keys/wiki_pagespaginated/api/4/keys/without_wiki_pagepaginated/api/4/project/iconimage/api/4/project/tagspaginated/api/4/projects/allpaginated/api/4/projects/keyspaginated/api/4/projects/tagspaginated/api/4/relation/projectspaginated/api/4/relation/rolespaginated/api/4/relation/statsmultiple/api/4/relation/wiki_pagesmultiple/api/4/relations/allpaginated/api/4/search/by_key_and_valuepaginated/api/4/search/by_keywordpaginated/api/4/search/by_rolepaginated/api/4/search/by_valuepaginated✅/api/4/site/config/geodistributionother/api/4/site/infoother/api/4/site/sourcesother/api/4/tag/chronologymultiple/api/4/tag/combinationspaginated/api/4/tag/distribution/nodesimage/api/4/tag/distribution/waysimage/api/4/tag/overviewsingle/api/4/tag/projectspaginated/api/4/tag/statsmultiple/api/4/tag/wiki_pagesmultiple/api/4/tags/listmultiple/api/4/tags/popularpaginated/api/4/unicode/charactersmultiple/api/4/wiki/languagesmultiple |
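A slightly expanded, hedged version of the example above, wrapped in an entry point and with a descriptive User-Agent header as the usage guidelines request; it reuses only the key_overview call and session argument shown above.

```python
import asyncio
import aiohttp
from aio_taginfo import key_overview

async def main():
    # Reuse one session for several requests and identify yourself, per the API guidelines.
    headers = {"User-Agent": "my-osm-tool/0.1 (contact: you@example.org)"}
    async with aiohttp.ClientSession(headers=headers) as session:
        response = await key_overview(key="amenity", session=session)
        print(response)  # a typed, Pydantic-validated response object

asyncio.run(main())
```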
aiotaika | aiotaikaTaika asynchronous client library for PythonThis library provides a very intuitive API for developers to easily create Python applications for Taika Tech's Spatial User Interface (SUI). With SUI, you can program actions to your physical environment, which are triggered based on location, orientation, and/or gesture data coming from a Taika Ring or Taika Tag.You can find aiotaika's documentation fromRead the Docs.Basic ExamplesFor more examples, seeexamplesfolder of the repository.Callback ExampleIn this example, we subscribe to move events and gesture events of ring ID 3.
In the callback function, we track the latest position and when a gesture event
happens, we print out which gesture was made with the latest location data.With callbacks one can register one or more callbacks to a single callback function.
However, the Event type must be a parent class of all incoming event objects in that
callback function. Here RingGestureEvent and RingMoveEvent inherit from RingEvent.
import asyncio
from aiotaika import TaikaClient
from aiotaika.events import EventType
from aiotaika.ring import RingEvent, RingGestureEvent, RingMoveEvent  # RingEvent assumed to live alongside the event classes it parents

last_position = None  # most recently seen position; the original declared this as a Vector3

async def my_callback(event: RingEvent) -> None:
    global last_position  # update the module-level variable from inside the callback
    if isinstance(event, RingMoveEvent):
        last_position = event.position
    elif isinstance(event, RingGestureEvent):
        print(f"{event.gesture} in position {last_position}")

async def main() -> None:
    async with TaikaClient(
        host="127.0.0.1", username="root", password=""
    ) as taika:
        rings = taika.rings
        for key, ring in rings.items():
            print("{}: {}".format(key, ring.metadata))
        await rings[3].register_event_cb(EventType.RING_MOVE_EVT, my_callback)
        await rings[3].register_event_cb(EventType.RING_GESTURE_EVT, my_callback)
        await asyncio.sleep(5)

try:
    asyncio.run(main())
except KeyboardInterrupt:
    pass
Asynchronous Generator Example (no callbacks!)This example shows simply how all the incoming events can be handled via the events AsyncGenerator of the TaikaClient class. In this example, we simply print out a
ring's name and position when a RingMoveEvent happens.
import asyncio
from aiotaika import TaikaClient
from aiotaika.ring import RingMoveEvent

async def main() -> None:
    async with TaikaClient(
        host="127.0.0.1", username="root", password=""
    ) as taika:
        async with taika.events() as events:
            async for event in events:
                if isinstance(event, RingMoveEvent):
                    print(f"Ring {event.metadata.name} position:")
                    print(f"x: {event.position.x}, z: {event.position.z}")
                    print(f"height: {event.position.y}")

try:
    asyncio.run(main())
except KeyboardInterrupt:
    pass
Requirements:HardwareTaika Development KitSoftwareNote: these should be automatically satisfied ifaiotaikais installed viapip.You can find precise version requirements frompyproject.tomlPythonasyncio-mqttaiomysql |
aiotailf | No description available on PyPI. |
aiotailwind | aiotailwindasyncio library to interact withTailwinddevices using theirlocal JSON API.Local Control KeyIn order to use the local control API, it's necessary to obtain your Local Control Key (a 6-digit code), which you then provide when initialising anAuthinstance.To obtain your Local Control Key, ensure that you've updated your Tailwind iQ3 to the v9.95 firmware or later, and then:VisitTailwind WebLog in with your Tailwind accountClick "Local Control Key" in the top menu (red box in screenshot)If no key is displayed, click the "Create new local command key" buttonEnter the 6-digit code (green box in screenshot) into the integration configurationSupported DevicesThis library has been developed and tested against a Tailwind iQ3 garage door opener controller. A rudimentary attempt to support the Light device referenced in the API documentation has been made, but without access to a physical device, it has not been possible to test whether this works.DemoSeedemo.pyfor a basic demo. |
aiotaipit | aioTaipitAsynchronous Python API forTaipit cloud meters.InstallationUse pip to install the library:pip install aiotaipitUsageimportasynciofrompprintimportpprintimportaiohttpfromaiotaipitimportSimpleTaipitAuth,TaipitApiasyncdefmain(username:str,password:str)->None:"""Create the aiohttp session and run the example."""asyncwithaiohttp.ClientSession()assession:auth=SimpleTaipitAuth(username,password,session)api=TaipitApi(auth)meters=awaitapi.async_get_meters()pprint(meters)if__name__=="__main__":_username="<YOUR_USER_NAME>"_password="<YOUR_PASSWORD>"asyncio.run(main(_username,_password))TheSimpleTaipitAuthclient also accept custom client ID and secret (this can be found by sniffing the client).This will return a price object that looks a little like this:[{'address':'Санкт-Петербург, Ворошилова, 2','category':0,'ecometerdata':{'P_aver':0.21986280758339,'P_averSmall':0.15261778589793,'P_averSmall_':109.88480584651,'P_aver_':158.30122146004,'P_aver_TF1':False,'P_aver_TF2':False,'P_aver_TF31':False,'P_aver_TF32':False,'P_aver_TF33':False,'P_norm':0.0066666666666667,'currentTS':1671485359,'ecoStatus':None,'lastReading':{'energy_a':1004.85,'energy_t1_a':794.45,'energy_t2_a':210.4,'energy_t3_a':0,'ts_tz':1671483628,'value':0.02},'meterCategory':0,'time':1671485359,'timezone':3,'trend':-48.41641561353,'trendTF1':False,'trendTF2':False},'id':2147485997,'isLowDataFreq':False,'isOwner':False,'isVirtual':0,'metername':'НЕВА МТ 114 (Wi-Fi) (22001110)','owner':{'peopleNumber':None,'type':0,'typeCode':'person'},'serialNumber':'22001110','usericopath':'/uploads/user/photo/3edba895933a54540fbdb88614f24f480a9eeb68.png','username':'Компания Тайпит','waterHot':False},{'address':'Санкт-Петербург, Ворошилова, 2','category':0,'ecometerdata':{'P_aver':0.25422232030182,'P_averSmall':0.2494024938596,'P_averSmall_':179.56979557891,'P_aver_':183.04007061731,'P_aver_TF1':False,'P_aver_TF2':False,'P_aver_TF31':False,'P_aver_TF32':False,'P_aver_TF33':False,'P_norm':0,'currentTS':1671485359,'ecoStatus':None,'lastReading':{'energy_a':11595.62,'energy_t1_a':10420.94,'energy_t2_a':1174.68,'energy_t3_a':0,'ts_tz':1671483641,'value':0},'meterCategory':0,'time':1671485359,'timezone':3,'trend':-3.4702750384005,'trendTF1':False,'trendTF2':False},'id':2147485996,'isLowDataFreq':False,'isOwner':False,'isVirtual':0,'metername':'НЕВА МТ 114 (Wi-Fi) (22001114)','owner':{'peopleNumber':None,'type':0,'typeCode':'person'},'serialNumber':'22001114','usericopath':'/uploads/user/photo/3edba895933a54540fbdb88614f24f480a9eeb68.png','username':'Компания Тайпит','waterHot':False}]Timeoutsaiotaipit does not specify any timeouts for any requests. You will need to specify them in your own code. We recommend thetimeoutfromasynciopackage:importasynciowithasyncio.timeout(10):all_readings=awaitapi.async_get_meters() |
aiotankerkoenig | aiotankerkoenigAsynchronous Python client for tankerkoenig.de.AboutThis package allows you to fetch data from tankerkoenig.de.InstallationpipinstallaiotankerkoenigUsageimportasynciofromaiotankerkoenigimportTankerkoenigasyncdefmain()->None:"""Run the example."""asyncwithTankerkoenig(api_key="abc123")astk:station_details=awaittk.station_details("12345678-1234-1234-1234-123456789012")print(f"Name:{station_details.name}")if__name__=="__main__":asyncio.run(main())Changelog & ReleasesThis repository keeps a change log usingGitHub's releasesfunctionality. The format of the log is based onKeep a Changelog.Releases are based onSemantic Versioning, and use the format
ofMAJOR.MINOR.PATCH. In a nutshell, the version will be incremented
based on the following:MAJOR: Incompatible or major changes.MINOR: Backwards-compatible new features and enhancements.PATCH: Backwards-compatible bugfixes and package updates.ContributingThis is an active open-source project. I am always open to people who want to
use the code or contribute to it.Thank you for being involved! :heart_eyes:Setting up development environmentThis Python project is fully managed using thePoetrydependency manager. But also relies on the use of NodeJS for certain checks during development.You need at least:Python 3.11+PoetryNodeJS 20+ (including NPM)To install all packages, including all development requirements:npminstall
poetryinstallAs this repository uses thepre-commitframework, all changes
are linted and tested with each commit. You can run all checks and tests
manually, using the following command:poetryrunpre-commitrun--all-filesTo run just the Python tests:poetryrunpytestAuthors & contributorsThe content is byJan-Philipp Benecke.For a full list of all authors and contributors,
checkthe contributor's page.LicenseMIT LicenseCopyright (c) 2024 Jan-Philipp BeneckePermission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE. |
aiotapioca-wrapper | AioTapioca-WrapperIt's an asynchronous fork of thetapioca-wrapperlibrary.Tapioca helps you generate Python clients for APIs.
APIs wrapped by Tapioca are explorable and follow a simple interaction pattern that works uniformly so developers don't need to learn how to use a new coding interface/style for each service API.DocumentationFull documentation hosted byreadthedocs.FlavoursYou can find the full list of available tapioca clientshere.To create new flavours, refer toBuilding a wrapperin the documentation. There is also acookiecutter templateto help bootstraping new API clients.Other resourcesContributorsChangelogBlog post explaining the basics about Tapioca |
aiotapioca-yandex-metrika | Python client for allAPI Yandex MetrikaIt's an asynchronous fork oftapi-yandex-metrikalibrary.Installationpip install aiotapioca-yandex-metrikaDocumentationManagement APIReports APILogs APIDependenciesaiohttpaiotapioca-wrapperHelpAndrey IlinAuthorPavel MaksimovGood luck friend! Star the repo ;)Удачи тебе, друг! Поставь звездочку ;)Copyright (c) Pavel Maksimov.The author of this modification Andrey Ilin. |
aiotarantool | Connector required tarantool version 1.6:$ pip install aiotarantoolTry it example:importasyncioimportaiotarantoolcnt=0asyncdefinsert_job(tnt):globalcntforitinrange(2500):cnt+=1r=awaittnt.insert("tester",(cnt,cnt))loop=asyncio.get_event_loop()tnt=aiotarantool.connect("127.0.0.1",3301)tasks=[loop.create_task(insert_job(tnt))for_inrange(40)]loop.run_until_complete(asyncio.wait(tasks))loop.run_until_complete(tnt.close())loop.close()Under this scheme the aiotarantool driver makes a smaller number of read/write tarantool socket.See benchmark results time for insert/select/delete 100K tuples on 1.5KBytes:calltarantoolaiotarantoolinsert35.93804712.701088select24.38974812.746204delete35.22451513.905095 |
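To complement the insert example above, a hedged sketch of reading some of those tuples back: select and delete appear in the benchmark table, but the call signature used here is an assumption modelled on the synchronous tarantool driver rather than documented aiotarantool API.

```python
import asyncio
import aiotarantool

async def read_job(tnt):
    # Signature assumed to mirror the synchronous tarantool driver: space name + key.
    for key in range(1, 11):
        result = await tnt.select("tester", key)
        print(result)

loop = asyncio.get_event_loop()
tnt = aiotarantool.connect("127.0.0.1", 3301)
loop.run_until_complete(read_job(tnt))
loop.run_until_complete(tnt.close())
loop.close()
```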
aiotarantool_queue | Bindings require tarantool version 1.6 and aiotarantool connector:$ pip install aiotarantool_queue aiotarantoolTry it example:importasyncioimportaiotarantool_queueimportrandom@asyncio.coroutinedefput_job(queue):fortube_namein("tube1","tube2","tube3"):tube=queue.tube(tube_name)task=yield fromtube.put({"task_data":random.random()})@asyncio.coroutinedeftake_job(tube):whileTrue:task=yield fromtube.take(5)ifnottask:breakprint(task.data)yield fromtask.ack()loop=asyncio.get_event_loop()queue=aiotarantool_queue.Queue("127.0.0.1",3301)put_tasks=[asyncio.async(put_job(queue))for_inrange(20)]take_tasks=[asyncio.async(take_job(queue.tube(tube_name)))fortube_namein("tube1","tube2","tube3")]loop.run_until_complete(asyncio.wait(put_tasks+take_tasks))loop.run_until_complete(queue.close())loop.close()This code makes it easy to develop your application to work with queue. |
aiotarfile | aiotarfileStream-based tarball processing, but like, async.
A thin set of wrappers over the Rust crateasync-tar.Install withpip install aiotarfile.Consult docstrings, type annotations, and tab-completion for usage. |
aio-task | aio-taskSimple and reliable asynchronous tasks manager that is asyncio friendly.Key FeaturesA simple worker interface to register coroutines as tasks.A simple broker interface to produce and fetch tasks.Broker and worker(s) can be setup in a single program avoiding external service dependencies (by using dummies queue and cache).Task is not lost if worker crash during processing it, it's kept in the queue and re-processed until a worker acknowledge it.Task exceptions are not lost: you will retrieve them in the task's result.Support rabbitmq, redis and sentinel.Easily hackable to add new queuing/caching servicesGetting StartedUsedocker-compose -f examples/docker-compose.yml upto bring up a rabbitmq and a redis to run this example.Installpip install aio-taskWorker → run tasksimportasynciofromaio_taskimportWorkerasyncdefaddition(a,b):""" Task example. """returna+basyncdefstart_worker():rabbitmq_config={"url":"amqp://guest:guest@localhost:5672","routing_key":"tasks_queue"}redis_config={"address":"redis://localhost"}worker=awaitWorker.create("rabbitmq",rabbitmq_config,"redis",redis_config)worker.register_handler(addition)awaitworker.start()returnworkerloop=asyncio.get_event_loop()worker=loop.run_until_complete(start_worker())try:loop.run_forever()exceptKeyboardInterrupt:loop.run_until_complete(worker.close())# gracefull shutdownloop.close()Broker → produce tasksimportasynciofromaio_taskimportBrokerasyncdefsample_addition():# setup brokerrabbitmq_config={"url":"amqp://guest:guest@localhost:5672","routing_key":"tasks_queue"}redis_config={"address":"redis://localhost"}broker=awaitBroker.create("rabbitmq",rabbitmq_config,"redis",redis_config)# produce tasktask_id=awaitbroker.create_task("addition",{"a":1,"b":2})awaitasyncio.sleep(0.1)# fetch tasktask=awaitbroker.get_task(task_id)print(task)awaitbroker.close()# graceful shutdownloop=asyncio.get_event_loop()loop.run_until_complete(sample_addition())loop.run_until_complete(broker.close())💡 More examples in examples/ !Run testsunit testspip install -e .[test]
pytest -xvs tests/unitintegration testspip install -e .[test]
docker-compose -f tests/integration/compose/docker-compose.yml up -d
IP_HOST=localhost pytest -xvs tests/integration |
aio-task-bound-context | AIO Task Bound ContextContext manager that provides a means for context to be set, and retrieved
in Python AsyncIO.What???Okay so for a concrete example, think of how Flask handles the current request:fromflaskimportrequestThis import, called from anywhere, will import the current request being
handled. This is made possible in a way similar to this:request=Nonedefget_request():returnrequestdefset_request(value):globalrequestrequest=valueWhen the HTTP server gets a request, it will callset_request, then anywhere
in the code another function can callget_requestto get the value.Here's the kicker: This is not possible with AIO, because multiple tasks may
be running at once, so there are multiple values forrequest, rather than
just a single value. Imagine the same piece of code being used in AIO:importasyncioasaioasyncdefhandle_request(request):set_request(request)# generate the responseawaitaio.sleep(1)assertget_request()==request# will failset_request(None)aio.get_event_loop().run_until_complete(aio.gather(handle_request('value 1'),handle_request('value 2'),))Obviously, this is going to be problematic.The answeraio_task_bound_contextattaches a stack of the current context values to the
currentTask, as well as tracking the parent tasks so that their context
can be inherited:importasyncioasaiofromaio_task_bound_contextimportset_task_factory,TaskBoundContextclassRequestContext(TaskBoundContext):def__init__(self,request):self.request=requestdefget_value(self):returnself.requestasyncdefhandle_request(request):withRequestContext(request):# generate the responseawaitaio.sleep(1)assertRequestContext.current()==request# will succeedloop=aio.get_event_loop()set_task_factory(loop=loop)loop.run_until_complete(aio.gather(handle_request('value 1'),handle_request('value 2'),))ExamplesNote that all these examples will work in async tasks, which is what makes
them more special than a simple context manager. They are all simple examples
outside an async environment, but don't be fooled by the hidden complexity.To start off, we need to replace the default task factory inasynciowith
a wrapper to add extra details to tasks. Assume this has been executed before
all examples:importasyncioasaiofromaio_task_bound_contextimportcreate_task_factory,TaskBoundContextloop=aio.get_event_loop()loop.set_task_factory(create_task_factory(loop=loop))With noget_valuefunction defined, the "value" is theTaskBoundContextitself, so you can setup values in the__init__function if you just want
to pass around a set of values.classExampleContext(TaskBoundContext):def__init__(self,*args,**kwargs):super().__init__()self.args=argsself.kwargs=kwargswithExampleContext('an arg',key='in kwargs'):assertExampleContext.current().args==('an arg',)assertExampleContext.current().kwargs=={'key':'in kwargs'}The "as value" of the context manager is the value returned fromget_value.
will push/pop their values onto/off of the stack of contexts.classExampleContext(TaskBoundContext):def__init__(self,value):super().__init__()self.value=valuedefget_value(self):returnself.valuewithExampleContext('first'):assertExampleContext.current()=='first'withExampleContext('second'):assertExampleContext.current()=='second'assertExampleContext.current()=='first'Theget_valuefunction can accept a single argument, which is the current
value of the stack.importloggingclassLoggerContext(TaskBoundContext):def__init__(self,suffix):super().__init__()self.suffix=suffixdefget_value(self,current):ifcurrentisNone:returnlogging.getLogger(self.suffix)else:returncurrent.getChild(self.suffix)TestingPython 3.5+ is supported. To run tests across all environments, we usepyenv, and some quickvirtualenvinvocations (yes, we could also usetox).To run the tests, just run./tests_runall.shwhich will install relevant
Python versions if not already installed, create virtualenvs for them, and
runtests.py.To run tests manually, simply./test.py.LicenseCopyright 2018 Ricky CookPermission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. |
aiotask-context | No description available on PyPI. |
aiotaskpool | aiotaskpool |
aiotaskq | aiotaskqA simple asynchronous task queueMotivationPopular asynchronous worker library likeCelerydoesn't support asyncio and is hard to use for advanced usage.aiotaskqaims to help users compose tasks in a very native async-await manner.Plus, it is also fully-typed for better productivity and correctness.Give it a try and let us know if you like it. For questions or feedback feel to file issues on this repository.Example UsageInstall aiotaskqpython-mpipinstall--upgradepip
pipinstallaiotaskqDefine a simple app like the following:tree.
.
└──app└──app.pyWhereapp.pycontains the following:[email protected]_task(b:int)->int:# Some task with high cpu usagedef_naive_fib(n:int)->int:ifn<=2:return1return_naive_fib(n-1)+_naive_fib(n-2)return_naive_fib(b)asyncdefmain():async_result=awaitsome_task.apply_async(42)sync_result=some_task(42)assertasync_result==sync_resultprint(f"sync_result == async_result == 165580141. Awesome!")if__name__=="__main__":loop=asyncio.get_event_loop()loop.run_until_complete(main())Start redisdockerrun--publish127.0.0.1:6379:6379redisIn a different terminal, start the aiotaskq workerpython-maiotaskqworkerapp.appThen in another different terminal, run your apppython./app.py# Output: sync_result == async_result == 165580141. Awesome!Advanced usage exampleLet's say we want to compose a workflow where we want to break up some of the tasks and run them in parallel:|-- task_2 --> |
|-- task_2 --> | | task_3 --> |
START -> task_1 --> |-- task_2 --> | --> | task_3 --> | --> task_4 --> FINISH
|-- task_2 --> | | task_3 --> |
|-- task_2 --> |Usingcelerywe might end up with thisfromceleryimportCeleryapp=Celery()@app.taskdeftask_1(*args,**kwargs):[email protected]_2(*args,**kwargs):[email protected]_3(*args,**kwargs):[email protected]_4(*args,**kwargs):passif__name__=="__main__":step_1=task_1.si(some_arg="a")step_2=[task_2.si(some_arg=f"{i}")foriinrange(5)]step_3=[task_3.si(some_arg=f"{i}")foriinrange(3)]step_4=task_4.si(some_arg="b")workflow=chord(header=step_1,body=chord(header=step_2,body=chord(header=step_3,body=step_4,),),)output=workflow.apply_async().get()print(output)Usingaiotaskqwe may end up with the following:importasynciofromaiotaskqimporttask@taskdeftask_1(*args,**kwargs):pass@taskdeftask_2(*args,**kwargs):pass@taskdeftask_3(*args,**kwargs):pass@taskdeftask_4(*args,**kwargs):pass# So far the same as celery# And now the workflow is just native python, and you're free# to use any `asyncio` library of your choice to help with composing# your workflow e.g. `trio` to handle more advanced scenarios like# error propagation, task cancellation etc.if__name__=="__main__":step_1=task_1.apply_async()step_2=asyncio.gather(task_2.apply_async(arg=f"{i}"foriinrange(5)))step_3=asyncio.gather(task_3.apply_async(arg=f"{i}"foriinrange(3)))step_4=task_4.apply_async()workflow=[step_1,step_2,step_3,step_4]output=awaitasyncio.gather(workflow)print(output)InstallpipinstallaiotaskqDevelopmentsource./activate.shTestsIn another terminal./docker.shIn the main terminalsource./activate.sh
./test.shLinksPYPI |
aiotasks | aiotasks========*aiotasks: Celery-like task manager for the new AsyncIO Python module*Code | https://github.com/cr0hn/aiotasks---- | ----------------------------------------------Issues | https://github.com/cr0hn/aiotasks/issues/Python version | Python 3.5 and aboveWhat's aiotasks---------------aiotasks is an asynchronous task queue/job queue based on distributed message passing, built on the Python asyncio framework. It is based on the Celery Task Queue ideas, but focuses on performance and on being non-blocking and event-driven.What's new?-----------This aiotasks version adds a lot of new features and fixes:Version 1.0.0+++++++++++++- First version releasedYou can read the entire list in the CHANGELOG file.Installation------------Simple++++++Installing aiotasks is easy:```$ python3.5 -m pip install aiotasks```With extra performance++++++++++++++++++++++Aiotasks also includes some optional dependencies that add extra performance but require a slightly different installation, because they (usually) depend on C extensions.To install the tool with extra performance you must do:```$ python3.5 -m pip install 'aiotasks[performance]'```**Remember that aiotasks only runs in Python 3.5 and above**.Quick start-----------You can display the inline help by writing:From cloned project+++++++++++++++++++```bashpython aiotasks.py -h```From pip installation+++++++++++++++++++++```bashaiotasks -h``` |
aio.tasks | Utils for managing concurrent asyncio tasks.You can use theconcurrentasync generator to run asyncio tasks
concurrently.It works much likeasyncio.as_completed, but with a couple of differences.coroscan be anyiterablesincluding sync/asyncgeneratorslimitcan be supplied to specify the maximum number of concurrent tasksSettinglimitto-1will make all tasks run concurrently.The defaultlimitisnumber of cores + 4to a maximum of32. This
(somewhat arbitrarily) reflects the default for asyncio’sThreadPoolExecutor.For network tasks it might make sense to set the concurrencylimitlower
than the default, if, for example, opening many concurrent connections will
trigger rate-limiting or soak bandwidth.If an error is raised while trying to iterate the provided coroutines, the
error is wrapped in anConcurrentIteratorErrorand is raised immediately.In this case, no further handling occurs, andyield_exceptionshas no
effect.Any errors raised while trying to create or run tasks are wrapped inConcurrentError.Any errors raised during task execution are wrapped inConcurrentExecutionError.If you specifyyield_exceptionsasTruethen the wrapped errors will be
yielded in the results.Ifyield_exceptionsis False (the default), then the wrapped error will
be raised immediately.If you use any kind ofGeneratororAsyncGeneratorto produce the
awaitables, andyield_exceptionsisFalse, in the event that an error
occurs, it is your responsibility tocloseremaining awaitables that you
might have created, but which have not already been fired.This utility is useful for concurrency of io-bound (as opposed to cpu-bound)
tasks.UsageLets first create a coroutine that waits for a random amount of time,
and then returns its id and how long it waited.>>>importrandom>>>asyncdeftask_to_run(task_id):...print(f"{task_id}starting")...wait=random.random()*5...awaitasyncio.sleep(wait)...returntask_id,waitNext lets create an async generator that yields 10 of the coroutines.Note that the coroutines are not awaited, they will be created as tasks.>>>defprovider():...fortask_idinrange(0,10):...yieldtask_to_run(task_id)Finally, lets create an function to asynchronously iterate the results, and
fire it with the generator.As we limit the concurrency to 3, the first 3 jobs start, and as the first
returns, the next one fires.This continues until all have completed.>>>importasyncio>>>fromaio.tasksimportconcurrent>>>asyncdefrun(coros):...asyncfor(task_id,wait)inconcurrent(coros,limit=3):...print(f"{task_id}waited{wait}")>>>asyncio.run(run(provider()))0 starting
1 starting
2 starting
... waited ...
3 starting
... waited ...
...
... waited ... |
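To make the yield_exceptions behaviour of aio.tasks described above concrete, here is a minimal sketch; the flaky coroutine and the printed messages are invented for illustration, while concurrent (and its import path), limit and yield_exceptions are taken from the description above.

import asyncio

from aio.tasks import concurrent


async def flaky(task_id):
    # every third task fails, to show how wrapped errors are yielded
    if task_id % 3 == 0:
        raise ValueError(f"task {task_id} failed")
    return task_id


async def run():
    coros = (flaky(i) for i in range(9))
    # with yield_exceptions=True the wrapped errors are yielded as results
    # instead of being raised immediately
    async for result in concurrent(coros, limit=3, yield_exceptions=True):
        if isinstance(result, Exception):
            print(f"failed: {result}")
        else:
            print(f"ok: {result}")


asyncio.run(run())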
aiotba | aiotbayet another wrapper for The Blue Alliance's API except this one usesasynciobecause it magically makes everything
faster, right?also because there's an overcomplicated type hinting system so there's autocomplete on everything (except for the season
specific data structures, those are all just dicts lol and nobody cares about themmostof the time)exampleimportasynciofromaiotbaimportTBASessionasyncdefmain():ses=TBASession("tba apiv3 key here")poofs=awaitses.team(254)print(poofs.nickname)asyncio.run(main())this lib follows closely to the endpoints ofAPIv3and should cover just
about all of them except for thesimpleendpointsinstallationpip install aiotbanotesall of this is on a provisional basis and large parts of the api could change at a moment's notice. this isn't "stable"
yet so to speak. |
aiot-client | No description available on PyPI. |
aiotcloud | No description available on PyPI. |
aio-tcpserver | version: 0.0.3status: devauthor: hszemail:[email protected] tcp sever for runningasyncio.Protocol. The inspiration comes from sanic,only support python 3.6+keywords:tcp-server,asyncioFeaturewith multiple workerfor asyncio.Protocolcan be closed by ctrl+C in windowswith hooksExampleserver.pyimportasyncioimporttimefromaio_tcpserverimporttcp_server,listener@listener("before_server_start")asyncdefbeat(loop):print(time.time())print("ping")classEchoServerClientProtocol(asyncio.Protocol):defconnection_made(self,transport):peername=transport.get_extra_info('peername')print('Connection from{}'.format(peername))self.transport=transportdefdata_received(self,data):message=data.decode()print('Data received:{!r}'.format(message))print('Send:{!r}'.format(message))self.transport.write(data)print('Close the client socket')self.transport.close()defmain():tcp_server(EchoServerClientProtocol,worker=3)if__name__=='__main__':main()client.pyimportasyncioclassEchoClientProtocol(asyncio.Protocol):def__init__(self,message,loop):self.message=messageself.loop=loopdefconnection_made(self,transport):transport.write(self.message.encode())print('Data sent:{!r}'.format(self.message))defdata_received(self,data):print('Data received:{!r}'.format(data.decode()))defconnection_lost(self,exc):print('The server closed the connection')print('Stop the event loop')self.loop.stop()loop=asyncio.get_event_loop()message='Hello World!'coro=loop.create_connection(lambda:EchoClientProtocol(message,loop),'127.0.0.1',5000)loop.run_until_complete(coro)loop.run_forever()loop.close()Installpython-mpip installaio-tcpserverDocumentationDocumentation on Readthedocs. |
aiotdk | http://github.com/aylak-github/aiotdk/blob/main/README.md |
aiotdlib | aiotdlib - Python asyncio Telegram client based onTDLibThis wrapper targets
TDLib v1.8.14 (958fed6e8e440afe87b57c98216a5c8d3f3caed8)This package includes prebuilt TDLib binaries for macOS (arm64) and Debian Bullseye (amd64).
You can use your own binary by passinglibrary_pathargument toClientclass constructor. Make sure it's built
fromthis commit. Compatibility with
other versions of library is not guaranteed.FeaturesAll types and functions are generated automatically
fromtl schemaAll types and functions come with validation and good IDE type hinting (thanks
toPydantic)A set of high-level API methods which makes work with tdlib much simplerRequirementsPython 3.9+Get yourapi_idandapi_hash. Read more
inTelegram docsInstallationPyPIpipinstallaiotdlibor if you usePoetrypoetryaddaiotdlibExamplesBase exampleimportasyncioimportloggingfromaiotdlibimportClientAPI_ID=123456API_HASH=""PHONE_NUMBER=""asyncdefmain():client=Client(api_id=API_ID,api_hash=API_HASH,phone_number=PHONE_NUMBER)asyncwithclient:me=awaitclient.api.get_me()logging.info(f"Successfully logged in as{me.json()}")if__name__=='__main__':logging.basicConfig(level=logging.INFO)asyncio.run(main())Any parameter of Client class could be also set via environment variables.importasyncioimportloggingfromaiotdlibimportClientasyncdefmain():asyncwithClient()asclient:me=awaitclient.api.get_me()logging.info(f"Successfully logged in as{me.json()}")if__name__=='__main__':logging.basicConfig(level=logging.INFO)asyncio.run(main())and run it like this:exportAIOTDLIB_API_ID=123456exportAIOTDLIB_API_HASH=<my_api_hash>exportAIOTDLIB_BOT_TOKEN=<my_bot_token>
pythonmain.pyEvents handlersimportasyncioimportloggingfromaiotdlibimportClientfromaiotdlib.apiimportAPI,BaseObject,UpdateNewMessageAPI_ID=123456API_HASH=""PHONE_NUMBER=""asyncdefon_update_new_message(client:Client,update:UpdateNewMessage):chat_id=update.message.chat_id# api field of client instance contains all TDLib functions, for example get_chatchat=awaitclient.api.get_chat(chat_id)logging.info(f'Message received in chat{chat.title}')asyncdefany_event_handler(client:Client,update:BaseObject):logging.info(f'Event of type{update.ID}received')asyncdefmain():client=Client(api_id=API_ID,api_hash=API_HASH,phone_number=PHONE_NUMBER)# Registering event handler for 'updateNewMessage' event# You can register many handlers for certain event typeclient.add_event_handler(on_update_new_message,update_type=API.Types.UPDATE_NEW_MESSAGE)# You can register handler for special event type "*".# It will be called for each received eventclient.add_event_handler(any_event_handler,update_type=API.Types.ANY)asyncwithclient:# idle() will run client until it's stoppedawaitclient.idle()if__name__=='__main__':logging.basicConfig(level=logging.INFO)asyncio.run(main())Bot command handlerimportloggingfromaiotdlibimportClientfromaiotdlib.apiimportUpdateNewMessageAPI_ID=123456API_HASH=""BOT_TOKEN=""bot=Client(api_id=API_ID,api_hash=API_HASH,bot_token=BOT_TOKEN)# Note: bot_command_handler method is universal and can be used directly or as decorator# Registering handler for '/help' [email protected]_command_handler(command='help')asyncdefon_help_command(client:Client,update:UpdateNewMessage):# Each command handler registered with this method will update update.EXTRA field# with command related data: {'bot_command': 'help', 'bot_command_args': []}awaitclient.send_text(update.message.chat_id,"I will help you!")asyncdefon_start_command(client:Client,update:UpdateNewMessage):# So this will print "{'bot_command': 'help', 'bot_command_args': []}"print(update.EXTRA)awaitclient.send_text(update.message.chat_id,"Have a good day! 
:)")asyncdefon_custom_command(client:Client,update:UpdateNewMessage):# So when you send a message "/custom 1 2 3 test"# So this will print "{'bot_command': 'custom', 'bot_command_args': ['1', '2', '3', 'test']}"print(update.EXTRA)if__name__=='__main__':logging.basicConfig(level=logging.INFO)# Registering handler for '/start' commandbot.bot_command_handler(on_start_command,command='start')bot.bot_command_handler(on_custom_command,command='custom')bot.run()ProxyimportasyncioimportloggingfromaiotdlibimportClient,ClientProxySettings,ClientProxyTypeAPI_ID=123456API_HASH=""PHONE_NUMBER=""asyncdefmain():client=Client(api_id=API_ID,api_hash=API_HASH,phone_number=PHONE_NUMBER,proxy_settings=ClientProxySettings(host="10.0.0.1",port=3333,type=ClientProxyType.SOCKS5,username="aiotdlib",password="somepassword",))asyncwithclient:awaitclient.idle()if__name__=='__main__':logging.basicConfig(level=logging.INFO)asyncio.run(main())MiddlewaresimportasyncioimportloggingfromaiotdlibimportClient,HandlerCallablefromaiotdlib.apiimportAPI,BaseObject,UpdateNewMessageAPI_ID=12345API_HASH=""PHONE_NUMBER=""asyncdefsome_pre_updates_work(event:BaseObject):logging.info(f"Before call all update handlers for event{event.ID}")asyncdefsome_post_updates_work(event:BaseObject):logging.info(f"After call all update handlers for event{event.ID}")# Note that call_next argument would always be passed as keyword argument,# so it should be called "call_next" only.asyncdefmy_middleware(client:Client,event:BaseObject,*,call_next:HandlerCallable):# Middlewares useful for opening database connections for exampleawaitsome_pre_updates_work(event)try:awaitcall_next(client,event)finally:awaitsome_post_updates_work(event)asyncdefon_update_new_message(client:Client,update:UpdateNewMessage):logging.info('on_update_new_message handler called')asyncdefmain():client=Client(api_id=API_ID,api_hash=API_HASH,phone_number=PHONE_NUMBER)client.add_event_handler(on_update_new_message,update_type=API.Types.UPDATE_NEW_MESSAGE)# Registering middleware.# Note that middleware would be called for EVERY EVENT.# Don't use them for long-running tasks as it could be heavy performance hit# You can add as much middlewares as you want.# They would be called in order you've added themclient.add_middleware(my_middleware)asyncwithclient:awaitclient.idle()if__name__=='__main__':logging.basicConfig(level=logging.INFO)asyncio.run(main())LICENSEThis project is licensed under the terms of theMITlicense. |
aiotelebot | aioTelebot FrameworkUnofficial Telegram Bot SDK for Python 3.5+Telegram bot framework using Python asyncio |
aiotelegraf | aiotelegrafAn asyncio-base client for sending metrics toTelegraf.Implementation based onpytelegrafpackage.Installation$pipinstallaiotelegrafUsageimportasyncioimportaiotelegrafasyncdefmain():client=aiotelegraf.Client(host='0.0.0.0',port=8089,tags={'my_global_tag_1':'value_1','my_global_tag_2':'value_2',})awaitclient.connect()client.metric('my_metric_1','value_1',tags={'my_tag_1':'value_1',})awaitclient.close()asyncio.run(main())ContributingTo work on theaiotelegrafcodebase, you'll want to clone the project locally and install the required dependencies viapoetry:[email protected]:Gr1N/aiotelegraf.git
$makeinstallTo run tests and linters use command below:$makelint&&maketestIf you want to run only tests or linters you can explicitly specify which test environment you want to run, e.g.:$makelint-blackLicenseaiotelegrafis licensed under the MIT license. See the license file for details. |
aiotelegram | Tiny asyncio-based telegram bot-api wrapper library.Reasonsaiotgis a framework, not a library, and has no proxy support.Raw api calls translation is better for understanding and will not
break if the telegram api changes.snake_caseFeaturesSimple as the telegram api is.Works with any json provider (aiohttp(default),aiorequests, etc.)snake_caseapi converted to telegramcamelCase.Pollingoffsethandled for you viaget_updatesmethod.Timeout between requests handled automatically (viapausekeyword-only argument).Source code isshort and simple.Installationpython -m pip install aiotelegram |
aioTelegramBot | UNKNOWN |
aio-telegram-bot | aio-telegram-botAn asynchronous framework for building your own Telegram Bot overAPI.Installationaio-telegram-botrequires Python 3.5.3+ and is available on PyPI:$ pip install aio-telegram-bot*Compatible with PyPy3.5-6.0.0+ExamplesPolling exampleimportasyncioimportosfromaiotelegrambotimportBot,Client,Content,Messagefromaiotelegrambot.rulesimportContainsasyncdefhi(message:Message):awaitmessage.send_message("Hello!",True)asyncdefrun(bot:Bot):awaitbot.initialize()whileTrue:awaitasyncio.sleep(1)if__name__=="__main__":loop=asyncio.get_event_loop()client=Client(os.environ["TELEGRAM_BOT_TOKEN"])bot=Bot(client)bot.add_handler(hi,content_type=Content.TEXT,rule=Contains("hi"))try:loop.run_until_complete(run(bot))exceptKeyboardInterrupt:loop.run_until_complete(bot.close())loop.run_until_complete(bot.client.close())finally:loop.close()Running:$ export TELEGRAM_BOT_TOKEN=12345678:replace-me-with-real-token
$ python3 polling.pyWebhook exampleExample of how to generate ssl certificate:openssl req -x509 -sha256 -nodes -days 365 -newkey rsa:2048 -keyout domain_srv.key -out domain_srv.crtimportargparseimportjsonimportosimportsslfromaiohttpimportwebfromasync_generatorimportasync_generator,yield_fromaiotelegrambotimportBot,Client,Content,Handlers,Messagefromaiotelegrambot.rulesimportContainshandlers=Handlers()TOKEN=os.environ["TELEGRAM_BOT_TOKEN"]HOST=os.environ["TELEGRAM_BOT_HOST"]PORT=8443parser=argparse.ArgumentParser()parser.add_argument("files",metavar="N",type=str,nargs="+")SSL_PUBLIC_KEY,SSL_PRIVATE_KEY=parser.parse_args()[email protected](content_type=Content.TEXT,rule=Contains("hi"))asyncdefhi(message:Message):awaitmessage.send_message("Hello!")asyncdefwebhook_handle(request):bot=request.app["bot"]data=awaitrequest.text()awaitbot.process_update(json.loads(data))returnweb.Response()@async_generatorasyncdefinit_bot(app:web.Application):bot=Bot(Client(TOKEN),handlers)awaitbot.initialize(webhook=True)awaitbot.client.set_webhook("https://{}:{}/{}".format(HOST,PORT,TOKEN),certificate=SSL_PUBLIC_KEY)app["bot"]=botawaityield_()awaitbot.client.delete_webhook()awaitbot.close()awaitbot.client.close()app=web.Application()app.router.add_post("/{}".format(TOKEN),webhook_handle)app.cleanup_ctx.extend([init_bot])context=ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)context.load_cert_chain(SSL_PUBLIC_KEY,SSL_PRIVATE_KEY)web.run_app(app,host="0.0.0.0",port=PORT,ssl_context=context)Running:$ export TELEGRAM_BOT_TOKEN=12345678:replace-me-with-real-token
$ export TELEGRAM_BOT_HOST=real.host.com
$ python3 webhook.py domain_srv.crt domain_srv.keyLicenseaio-telegram-botis offered under the MIT license. |
aio-telegram-client | aiohttp telegram clientAn HTTP client for Telegram bots, and nothing more.
Use it with my Telegram bot SDK. |
aio-telegram-log-handler | Aio-telegram-logA basic library for sending logs to Telegram through the Python logging module (as a handler), but doing it asynchronously.UsageJust set the telegram handler in your logging settings...
"handlers": {
"default": {
"formatter": "default",
"level": "DEBUG",
"class": "logging.StreamHandler",
"stream": "ext://sys.stdout",
},
"telegram": {
"class": "tghandler.handler.TelegramLoggingHandler",
"token": "input your token",
"chat_ids": [0, 1],
"level": "ERROR",
"formatter": "default",
},
},
... |
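A minimal sketch of loading such a configuration with the standard library's logging.config.dictConfig; the formatter and root-logger entries here are assumptions added to make the snippet self-contained, while the handler class path, token and chat_ids keys come from the settings shown above.

import logging
import logging.config

LOGGING = {
    "version": 1,
    "formatters": {
        "default": {"format": "%(asctime)s %(levelname)s %(name)s %(message)s"},
    },
    "handlers": {
        "telegram": {
            "class": "tghandler.handler.TelegramLoggingHandler",
            "token": "input your token",
            "chat_ids": [0, 1],
            "level": "ERROR",
            "formatter": "default",
        },
    },
    # assumption: route error records from the root logger to the telegram handler
    "root": {"level": "ERROR", "handlers": ["telegram"]},
}

logging.config.dictConfig(LOGGING)
logging.getLogger(__name__).error("Something went wrong")  # delivered to the configured chats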
aiotelegraph | aiotelegraphDescriptionThis is simple async library for interaction with Telegra.ph APIContentsRelease Notes0.1.0Getting StartedInstallation from PipInstallation from GitHubQuick StartUser GuideDocumentationEntitiesMethodsHelp the authorContribute repoDonateRelease NotesVersion 0.1.0Init libraryGetting StartedInstallation from pipFor installation aiotelegraph library from pip you should have pip with python (prefer python3.6 or
later)pipinstallaiotelegraphInstallation from GitHubTo basic installation from GitHub repository you should have git, python3 (prefer python3.6 or
later), pip (optionally) in your systemgitclonehttps://github.com/OlegYurchik/aiotelegraph.gitcdaiotelegraph
pipinstall.orgitclonehttps://github.com/OlegYurchik/aiotelegraph.gitcdaiotelegraph
pythonsetup.pyinstallQuick StartAfter installation, you can use the library in your code. Below is a quick example of using the
library.importasynciofromaiotelegraphimportNodeElement,TelegraphClientasyncdefmain():client=TelegraphClient()awaitclient.create_account(short_name="ShortName",author_name="AuthorName",author_url="AuthorUrl")content=[NodeElement(text="Hello"),NodeElement(text="Neighbour"),]page=awaitclient.create_page(title="Greeting",content=content,return_content=True)loop=asyncio.get_event_loop()loop.run_until_complete(main()) |
aio-telegraph | aio-telegraphAsync library for telegra.ph APICover all API methods forDecember 19 2016.Installationpip install aio-telegraphUsageUsage example are available atexamples.py |
aiotempfile | aiotempfileOverviewProvides asynchronous temporary files.InstallationFrompypi.org$ pip install aiotempfileFrom source code$gitclonehttps://github.com/crashvb/aiotempfile
$cdaiotempfile
$virtualenvenv
$sourceenv/bin/activate
$python-mpipinstall--editable.[dev]UsageThis implementation is a derivation ofaiofilesand functions the same way.importaiotempfileasyncwithaiotempfile.open()asfile:file.write(b"data")If the context manager is not used, files will need be explicitly closed; otherwise, they will only be removed during the interepreter teardown.importaiotempfilefile=awaitaiotempfile.open()file.write(b"data")file.close()Environment VariablesVariableDefault ValueDescriptionAIOTEMPFILE_DEBUGAdds additional debug logging.DevelopmentSource Control |
aiotense | Project: aiotenseLicense: Apache 2.0About: Time Processing ToolOS: IndependentPython: 3.9+Typing: TypedTopic: UtilitiesDocumentation·Report Bug·Request FeatureTable of ContentsAbout The ProjectWelcomeGetting StartedWith PyPiWith PoetryUsageBuilt-in basicsReconfiguring existing settingsAdding new settingsFAQExamplesContributingLicenseContactAcknowledgmentsAbout The ProjectWelcomeHave you ever needed to convert, for example, the string "1d1minute 2 sec"
to the number of seconds or a datetime.timedelta object?No? Then advise us to your friends :) And if you really need our tool - let's move on!Getting startedWith PyPi$pip3installaiotenseWith Poetry$poetryaddaiotenseUsageBuilt-in basicimportasyncioimportdatetimefromaiotenseimportTenseParsertime_string="1d2minutes 5 sec"# <-- Digit parser -->digit_parser=TenseParser(TenseParser.DIGIT)digit_value=asyncio.run(digit_parser.parse(time_string))# <-- Assertions -->assertdigit_value==86525# <-- Timedelta parser -->delta_parser=TenseParser(TenseParser.TIMEDELTA)delta_value=asyncio.run(delta_parser.parse(time_string))# <-- Assertions -->assertisinstance(delta_value,datetime.timedelta)assertstr(delta_value)=="1 day, 0:02:05"Reconfiguring existing settingsimportasynciofromaiotenseimportTenseParser,from_tense_file_sourceconfig_emulation="""[model.Tense]multiplier = 2 # each unit of time will be multiplied by 2# !!! Note: If the multiplier is <= 0, then the parsers will# not work correctly. In this case, a warning will be sent to the console.[units.Minute]duration = 120 # Why not?...aliases = my_minute, my_minutes, my_min, my_mins"""parser=TenseParser(TenseParser.TIMEDELTA,config=from_tense_file_source(config_emulation),)delta_value=asyncio.run(parser.parse("1 my_min 10my_mins 9 my_minutes"))# <-- Assertions -->assertstr(delta_value)=="1:20:00"# (each 120 * 2)Adding new settingsimportasynciofromaiotenseimportTenseParser,from_tense_file_sourceconfig_emulation="""[model.Tense] # This header is required.[virtual]duration = exp(year * 10)aliases = decade, dec, decs, decades"""parser=TenseParser(TenseParser.TIMEDELTA,config=from_tense_file_source(config_emulation),)delta_value=asyncio.run(parser.parse("1year 10 decades5 seconds"))# <-- Assertions -->assertstr(delta_value)=="36865 days, 0:00:05"FAQBut what if you need to parse a string like: "1day and 10 minutes + 5 seconds"?
Let's see:>>>importasyncio>>>fromaiotenseimportTenseParser>>>complex_string="1day and 10 minutes + 5 seconds">>>parser=TenseParser(TenseParser.TIMEDELTA)>>>asyncio.run(parser.parse(complex_string))'0:00:05'Wait... What? 5 second? But there are days and minutes...It's okay, you're using flexible aiotense! This problem is solved in two ways:You write your own time_resolver and pass itChoose an existing one from aiotense.resolversLet's demonstrate!
I will use the second option, since the built-in time resolvers in aiotense are suitable for me.>>>importasyncio>>>fromaiotenseimportTenseParser,resolvers>>>complex_string="1day and 10 minutes + 5 seconds">>>parser=TenseParser(TenseParser.TIMEDELTA,time_resolver=resolvers.smart_resolver)>>>asyncio.run(parser.parse(complex_string))'1 day, 0:10:05'Well, that's better!aiotense.application.resolvers.smart_resolver()is also case insensitive!>>>importasyncio>>>fromaiotenseimportTenseParser,resolvers>>>complex_string="1DAY and 10 MINUTES + 5 SECONDS">>>parser=TenseParser(TenseParser.TIMEDELTA,time_resolver=resolvers.smart_resolver)>>>asyncio.run(parser.parse(complex_string))'1 day, 0:10:05'Examples.If you think that this is where the possibilities of aiotense ends, then you are wrong!
The possibilities of aiotense are too many for a README, so I suggest you move on to viewing
the usage examples here:Aiotense ExamplesContributingContributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make aregreatly appreciated.If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement".
Don't forget to give the project a star! Thanks again!Fork the ProjectCreate your Feature Branch (git checkout -b feature/AmazingFeature)Commit your Changes (git commit -m 'Add some AmazingFeature')Push to the Branch (git push origin feature/AmazingFeature)Open a Pull RequestLicenseDistributed under the Apache 2.0 License. SeeLICENSEfor more information.ContactAcknowledgmentsChoose an Open Source LicenseImg ShieldsGitHub PagesPythonPython CommunityMkDocsMkDocs Material |
aiotest | What is Aiotest?Aiotest is an easy to use, scriptable and scalable performance testing tool.You define the behaviour of your users in Python asyncio code, instead of being stuck in a UI or restrictive domain specific language.This makes Aiotest infinitely expandable and very developer friendly.To start using Aiotest, go toInstallationInstall the packagepip install aiotestRun aiotestaiotest -f aiotestfile.pyfromaiotestimportAsyncHttpUser,LoadUserShape,loggerclassTestUser(AsyncHttpUser):host="https://uat.taobao.com"token=Noneasyncdefon_start(self):url="/login"data={"username":"admin","password":"123456"}asyncwithself.session.post(url=url,data=data)asresp:data=awaitresp.json()self.token=data["token"]asyncdeftest_search(self):url="/search"hearders={"Authorization":self.token}data={"keyword":"F22"}asyncwithself.session.post(url=url,hearders=hearders,json=data)asresp:data=awaitresp.json()asyncdeftest_personal_info(self):url="/personalInfo"asyncwithself.session.get(url=url,hearders=hearders)asresp:data=awaitresp.json()FeaturesWrite test scenarios inpython asyncioIf you want your users to loop, perform some conditional behaviour or do some calculations, you just use the asyncio programming constructs provided by Python.
Aiotest runs every user inside its own task (an asyncio task). This enables you to write your tests like normal (async) Python code instead of having to use callbacks or some other mechanism.
Because your scenarios are "just python" you can use your regular IDE, and version control your tests as regular codeDistributed and scalable - supports hundreds of thousands of concurrent usersAiotest makes it easy to run load tests distributed over multiple machines.
It is asyncio-based (usingpython asyncio), which makes it possible for a single process to handle many thousands of concurrent users.
While there may be other tools that are capable of doing more requests per second on a given hardware, the low overhead of each Aiotest user makes it very suitable for testing highly concurrent workloads.Command-based UIIt can also be run without the UI, making it easy to use for CI/CD testing.Can test any systemEven though Aiotest primarily works with web sites/services, it can be used to test almost any system or protocol. Justwrite a client for what you want to testPrometheus-Grafana-based Data collection and presentationUse Prometheus to collect test data and Grafana to present itAutomatic collection of test cases(Reference pytest)An automatic collection of use cases similar to pytest, subclasses of User and LoadUserShape must start or end with Test(eg: TestUser,TestShape...), and api coroutines to be tested must also start or end with test(eg: test_search, test_personal_info...)Multiple user classes, flexible setting of test scenariosA aiotestfile can have multiple user classes at the same time, set different execution weights through the user class attribute weight, and flexibly set various test scenarios, example, shopping mall ordering scene, a user class simulating direct placing an order, and a user class simulating shopping cart placing an order.Custom load shapesSometimes a completely custom shaped load test is required that cannot be achieved by simply setting or changing the user count and spawn rate. For example, you might want to generate a load spike or ramp up and down at custom times. By using a LoadUserShape class you have full control over the user count and spawn rate at all times.Serial, parallel execution of api coroutinesEach user (a user class instance) acquiesce executes the test api coroutine serial from top to bottom(eg: when placing an order in the mall, the rear interface must wait for the return data from the front interface before it can be executed);You can override the self.start() method of the user class to execute the api coroutine to be tested in parallel(eg: api do not need to wait for the return data of other apis, and can be executed in parallel)AuthorsHewei githubmail:[email protected] source licensed under the MIT license (see LICENSE file for details).Express one's thanksAiotest is a rewrite of locust (based on python asyncio) that drops the TaskSet class, sets the API to be tested only through the User class, drops the Stats class, collects test data through Prometheus, drops the Web class, and presents test data through Grafanalocust.io |
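As a rough illustration of the multiple-user-class scenario aiotest describes above (direct orders versus shopping-cart orders), here is a hypothetical sketch; the endpoints and payloads are made up, and only AsyncHttpUser, the weight attribute, the Test class prefix and the test_ method prefix come from the description, so the actual library behaviour may differ.

from aiotest import AsyncHttpUser


class TestDirectOrderUser(AsyncHttpUser):
    host = "https://uat.taobao.com"
    weight = 3  # three out of every four simulated users order directly

    async def test_direct_order(self):
        # hypothetical endpoint and payload
        async with self.session.post(url="/order", json={"sku": "F22"}) as resp:
            await resp.json()


class TestCartOrderUser(AsyncHttpUser):
    host = "https://uat.taobao.com"
    weight = 1  # the rest go through the shopping cart first

    async def test_cart_order(self):
        # hypothetical two-step cart flow
        async with self.session.post(url="/cart", json={"sku": "F22"}) as resp:
            await resp.json()
        async with self.session.post(url="/cart/checkout") as resp:
            await resp.json()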
aio.testing | aio.testingTest utils for theaioasyncio frameworkBuild statusInstallationRequires python >= 3.4Install with:pipinstallaio.testingAio testing provides 2 decorators for running asyncio testsaio.testing.run_until_complete:creates a test loopcalls the test with loop.run_until_completeaio.testing.run_forever:creates a test loopcalls test using loop.run_foreverwaits for number of seconds specified in “timeout” (default = 1)if test returns a callable, calls it as a coroutinewaits for number of seconds specified in “sleep” (default = 0)@aio.testing.run_until_completeaio.testing provides a method decorator for running asyncio-based testsimportunittestimportasyncioimportaio.testingclassMyTestCase(unittest.TestCase):@aio.testing.run_until_completedeftest_example():yield fromasyncio.sleep(2)self.assertTrue(True)Prior to the test running asyncio.get_new_loop() is called and set using asyncio.set_event_loop().On completion of the test asyncio.set_event_loop() is again called with the original event [email protected]_foreverIf your code needs to test long-running tasks, you can use the @aio.testing.run_forever decorator.The @aio.testing.run_forever decorator uses loop.run_forever to run the test.Any setup required can be done in the body of the test function which can optionally return a test callbackThe callback is wrapped in a coroutine, and called after 1 secondimportunittestimportasyncioimportaio.testingclassMyFutureTestCase(unittest.TestCase):@aio.testing.run_foreverdeftest_example():yield fromasyncio.sleep(2)defcallback_test(self):yield fromasyncio.sleep(2)self.assertTrue(True)# this function is called 1 second after being returnedreturncallback_testAs with aio.testing.run_until_complete, the test is run in a separate [email protected]_forever with timeoutYou can specify how many seconds to waitbeforerunning the callback tests by setting the timeout valueimportunittestimportasyncioimportaio.testingclassMyFutureTestCase(unittest.TestCase):@aio.testing.run_forever(timeout=10)deftest_example():yield fromasyncio.sleep(2)defcallback_test(self):yield fromasyncio.sleep(2)self.assertTrue(True)# this function is called 10 seconds after being [email protected]_forever with sleepSometimes a test needs to wait for some time after services have been stopped and the test loop has been destroyed.You can specify how many seconds to waitafterrunning the callback tests by setting the sleep valueimportunittestimportasyncioimportaio.testingclassMyFutureTestCase(unittest.TestCase):@aio.testing.run_forever(sleep=10)deftest_example():yield fromasyncio.sleep(2)defcallback_test(self):yield fromasyncio.sleep(2)self.assertTrue(True)returncallback_testaio.testing usageaio.testing.run_until_completeLets create a test>>> import asyncio
>>> import aio.testing>>> @aio.testing.run_until_complete
... def run_test(parent_loop):
... yield from asyncio.sleep(1)
...
... print(asyncio.get_event_loop() != parent_loop)And lets check that the test loop is not the same as the current one>>> loop_before_test = asyncio.get_event_loop()
>>> run_test(loop_before_test)
TrueAfter the test has run we have the original event loop back>>> asyncio.get_event_loop() == loop_before_test
TrueWe can raise an error in the test>>> @aio.testing.run_until_complete
... def run_test():
... assert(True == False)>>> try:
... run_test()
... except Exception as e:
... print(repr(e))
AssertionError()aio.testing.run_foreverLets create a future test>>> import asyncio>>> @aio.testing.run_forever
... def run_test(parent_loop):
... yield from asyncio.sleep(1)
...
... print(asyncio.get_event_loop() != parent_loop)Just like with aio.testing.run_until_complete, the test is run in a separate loop>>> loop_before_test = asyncio.get_event_loop()
>>> run_test(loop_before_test)
TrueAnd again, after the test has run we have the original event loop back>>> asyncio.get_event_loop() == loop_before_test
TrueIf the test returns a callable, its called 1 second later.The test_callback runs in the same loop as the test>>> @aio.testing.run_forever
... def run_test():
... test_loop = asyncio.get_event_loop()
...
... @asyncio.coroutine
... def test_callback():
... print(
... asyncio.get_event_loop() == test_loop)
...
... return test_callback>>> run_test()
TrueThe test_callback is always wrapped in asyncio.coroutine if its not one already>>> @aio.testing.run_forever
... def run_test():
...
... def test_callback():
... yield from asyncio.sleep(1)
... print("test_callback is always wrapped in a coroutine!")
...
... return test_callback>>> run_test()
test_callback is always wrapped in a coroutine!We can raise an error in the test>>> @aio.testing.run_forever
... def run_test():
... assert(True == False)>>> try:
... run_test()
... except Exception as e:
... print(repr(e))
AssertionError()And we can raise an error in the test callback>>> @aio.testing.run_forever
... def run_test():
...
... def test_callback():
... assert(True == False)
...
... return test_callback>>> try:
... run_test()
... except Exception as e:
... print(repr(e))
AssertionError()By default the test_callback is called 1 second after being returned>>> import time>>> @aio.testing.run_forever
... def run_test():
... test_run_at = int(time.time())
...
... return lambda: (
... print("callback called %s second(s) after test" % (
... int(time.time()) - test_run_at)))>>> run_test()
callback called 1 second(s) after testYou can set the amount of time to wait before calling the test_callback by setting the “timeout” argument in the decorator>>> import time>>> @aio.testing.run_forever(timeout=3)
... def run_test():
... test_run_at = int(time.time())
...
... return lambda: print(
... "callback called %s second(s) after test" % (
... int(time.time()) - test_run_at))>>> run_test()
callback called 3 second(s) after testYou can also set the amount of time to wait after the test has completely finished, by setting the “sleep” argument on the decorator>>> @aio.testing.run_forever(sleep=3)
... def run_test(test_time):
... return lambda: (
... test_time.__setitem__('completed_at', int(time.time())))>>> test_time = {}
>>> run_test(test_time)>>> print("test waited %s second(s) after completing" % (
... int(time.time()) - test_time['completed_at']))
test waited 3 second(s) after completing |
aiotestspeed | AIO Speedtestis a library written in Python to perform speed tests asynchronously and programmatically.This project was made based on the existingSpeedtestfrom which we shared several code snippets, what I did were few modifications to work asynchronously.Basic Usageimport asyncio
from aiotestspeed.aio import Speedtest
units = ('bit', 1)
async def main():
s: Speedtest = await Speedtest()
await s.get_best_server()
await s.download()
await s.upload()
print('Ping: %s ms\nDownload: %0.2f M%s/s\nUpload: %0.2f M%s/s' %
(s.results.ping,
(s.results.download / 1000.0 / 1000.0) / units[1],
units[0],
(s.results.upload / 1000.0 / 1000.0) / units[1],
units[0]))
loop = asyncio.get_event_loop()
loop.run_until_complete(main())
loop.close()class Speedtest(aiobject):
async def __init__(self, config=None, source_address=None, timeout=10, secure=False):
...
@property
async def best(self) -> dict:
...
async def get_config(self) -> dict:
"""Download the speedtest.net configuration and return only the data
we are interested in
"""
...
async def get_servers(self, servers: list = None, exclude: list = None) -> list:
"""Retrieve a the list of speedtest.net servers, optionally filtered
to servers matching those specified in the ``servers`` argument
"""
...
async def set_mini_server(self, server: str) -> list:
"""Instead of querying for a list of servers, set a link to a
speedtest mini server
"""
...
async def get_closest_servers(self, limit: int = 5) -> None:
"""Limit servers to the closest speedtest.net servers based on
geographic distance.
"""
...
async def get_best_server(self, servers=None) -> dict:
"""Perform a speedtest.net "ping" to determine which speedtest.net
server has the lowest latency.
Args:
servers ([type], optional): [description]. Defaults to None.
Raises:
SpeedtestBestServerFailure: [description]
Returns:
dict: [description]
"""
...
async def download(self, callback=do_nothing) -> int:
"""Test download speed against speedtest.net
Args:
callback ([type], optional): [description]. Defaults to do_nothing.
Returns:
int: [description]
"""
...
async def upload(self, callback=do_nothing, pre_allocate: bool = True) -> int:
"""Test upload speed against speedtest.net
Args:
callback ([type], optional): [description]. Defaults to do_nothing.
pre_allocate (bool, optional): [description]. Defaults to True.
Returns:
int: [description]
"""
...Changelog |
aiotext | All-in-one Text CleanerThis package was created to speed up the process of cleaning text for natural language processing and machine learning. The package does the following:Converts all text to lowercaseExpands contractions usingpycontractionstrained on the glove-twitter-100 word2vec training set (optional)Removes text in brackets. Matches "()","[]", or "{}" (optional)Combines concatenations (turns "georgetown-louisville" into "georgetown louisville" or "georgetownlousivelle"). Matches all types of hyphens.Splits sentences on punctuation using algorithm defined inthis stackoverflow post.Tokenizes sentences.Lemmatizes tokens using NLTK WordNetLemmatizer and a lookup table between Penn Bank tags and Word Net.Installation$ pip3 install aiotext
$ pip3 install git+https://github.com/EricWiener/pycontractionsPlease note that pycontractions is specified as a dependency and will download from PyPi and work, but the branch I linked to above has multiple improvements.Usage:Options:OptionDefaultDescriptionexpand_contractionsTrueIf true, contractions will be expanded (it's -> it is). This takes a long time. Especially the first time you run it.strip_text_in_bracketsFalseIf true removes text in brackets. If false the brackets will be removed, but text inside will remain.combine_concatenationsFalseIf false replaces hyphen with space (george-louis -> george louis). If true just removes hyphen (george-louis -> georgelouis).w2v_pathNonePath to word2vec binary.api_key"word2vec-google-news-300"w2v_path will be given preference over api_key. If no valid binary is found at the path, the api will download the key specified. If no key is specified, the Google News vector will be used.fromaiotextimportCleanertext="Call me Ishmael. Some years ago—never mind how long precisely—having "text+="little or no money in my purse, and nothing particular to interest me "text+="on shore, I thought I would sail about a little and see the watery part "text+="of the world. It is a way I have of driving off the spleen and "text+="regulating the circulation."# Initialize cleanercleaner=Cleaner(expand_contractions=True)assertcleaner.clean(text)==[['call','me','ishmael'],['some','year','ago','never','mind','how','long','precisely','have','little','or','no','money','in','my','purse','and','nothing','particular','to','interest','me','on','shore','i','think','i','would','sail','about','a','little','and','see','the','watery','part','of','the','world'],['it','be','a','way','i','have','of','drive','off','the','spleen','and','regulate','the','circulation'],]NotesPlease note you might have to manually quit and reattempt to run the program the first time you run it if it gets stuck after downloading the contractions dataset.Wordnet is used to lemmatize based on the parts of speech given by Penn Bank. Since Wordnet is limited in the number of options (eg. no pronouns), some words will not be processed. This is done to preserve the root word. For instance, "us" Wordnet will convert "us" to "u". In order to avoid this, "us" will not be passed into the lemmatizer.You may need to run the following ifwordnetorpunktis not foundpython3>>importnltk>>nltk.download('wordnet')>>nltk.download('punkt')Change log1.0.0: Initial release1.0.1: Corrected handling of sentences without punctuation and brackets1.0.2: Added modified contraction expander download. Also made changes to solveissuewith NLTK lemmatizer.1.0.3: Added options for specifying word2vec model to use for contraction expansion1.0.4: Minor syntax error1.0.5: Changed passing of arguments, updated README, improved tokenization, and changed order of parsing to tag POS before cleaning text. |
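Going back to the aiotext options table above, here is a small sketch combining the other documented flags; the sample sentence is made up, and only the option names and the clean() method come from the description.

from aiotext import Cleaner

# skip contraction expansion, drop bracketed text, and join hyphenated words
cleaner = Cleaner(
    expand_contractions=False,
    strip_text_in_brackets=True,
    combine_concatenations=True,
)

sentences = cleaner.clean("The georgetown-louisville report [draft] covers the results.")
print(sentences)  # a list of tokenized, lemmatized sentences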
aiotf | aiotfAsyncio-basedTensorflow ServingPredictionFeaturesasyncio:better use of your cpu idle timepep8 compliant:following best code standardshigh-performance prediction:we useaio-grpcExampleimportaiotfasyncdefmake_prediction(model_name:str,data):asyncwithaiotf.AsyncTensorflowServing('localhost:9000')asclient:predictions=awaitclient.predict(model_name,data)You can find more examples in theexamples/subdirectory.Installation$pipinstallaiotfLicenseMIT |
aiotfm | aiotfmaiotfm is an asynchronous Client implementation ofTransformicethat allows developers to make bots easily.
It uses an API endpoint to get the keys needed to connect to the game.
aiotfm is based onTransFromagewhich use threads instead of coroutines.If you prefer Lua over Python then checkout theLua versionmade by@Lautenschlager-idJoin theFifty Shades of Luadiscordserver to discuss about this API and to receive special support.Keys EndpointThis API depends on anendpointthat gives you access to the Transformice encryption keys.To use it you will need a token which you can get byapplying through this form. See below to know the names of Transfromage managers who handle the token system.Tocutoeltuco@discord=>Tocu#0018212634414021214209;Blank3495@discord=> #8737436703225140346881;Bolodefchoco@discord=>Lautenschlager#2555285878295759814656.Advantages3 times faster than TransFromageCompatible with discord.pyFasterAsynchronousSpeedTransFromage takes around 13 seconds to be connected to the community platform while aiotfm takes less than 4 seconds.
Those results can vary depending on your computer and your internet connection.InstallationYou can install aiotfm using pip:pip install aiotfmTo have a more up to date package, you have to clone this repository and install it manually:gitclonehttps://github.com/Athesdrake/aiotfmcdaiotfm
python3-mpipinstall.Requirementsaiotfm require python 3.7 or higher andaiohttp.Python 3.6Python 3.6 support is not guaranteed sincev1.4.3as Python 3.6 has reached EOF.Python 3.5You can still use aiotfm with Python 3.5.3 or higher by cloning the repository and remove the sugar syntax of Python 3.6.
These changes are the typed variables and fstrings.
Due to a major update in the asynchronous stuff of Python 3.5.3, aiotfm is not compatible with the previous versions of Python.UpdateTo update aiotfm, use the following command:pip install -U aiotfmExampleimportaiotfmbot=aiotfm.Client()@bot.eventasyncdefon_ready():print('Connected to the community platform.')bot.run("api_tfmid","api_token","username","password",encrypted=False,room="start_room")A more complete example.DocumentationYou can find the documentation of aiotfmhere.AboutYou can have more information about TransFromage in thisthread. |
aiotg | Asynchronous Python API for building Telegram bots, featuring:Easy and declarative APIHassle-free setup - no need for SSL certificates or static IPBuilt-in support for analytics via chatbase.comAutomatic handling of Telegram API throttling or timeoutsInstall it with pip:pipinstallaiotgThen you can create a new bot in few lines:fromaiotgimportBot,Chatbot=Bot(api_token="...")@bot.command(r"/echo (.+)")defecho(chat:Chat,match):returnchat.reply(match.group(1))bot.run()Now run it with a proper API_TOKEN and it should reply to /echo commands.NoteType annotations are not required but will help your editor/IDE to provide code completion.The example above looks like a normal synchronous code but it actually returns a coroutine.
If you want to make an external request (and that’s what bots usually do) just use aiohttp and async/await syntax:importaiohttpfromaiotgimportBot,Chatbot=Bot(api_token="...")@bot.command("bitcoin")asyncdefbitcoin(chat:Chat,match):url="https://apiv2.bitcoinaverage.com/indices/global/ticker/BTCUSD"asyncwithaiohttp.ClientSession()assession:response=awaitsession.get(url)info=awaitresponse.json()awaitchat.send_text(str(info["averages"]["day"]))bot.run()But what if you just want to write a quick integration and don’t need to provide a conversational interface? We’ve got you covered!
You can send messages (or any other media) by constructing a Chat object with user_id or channel name. We even saved you some extra keystrokes by providing handy Channel constructors:...channel=bot.channel("@yourchannel")private=bot.private("1111111")asyncdefgreeter():awaitchannel.send_text("Hello from channel!")awaitprivate.send_text("Why not greet personally?")...ExamplesAsync IOSend imagePost to channelWebhooks modeSender idFor a real world example, take a look atWhatisBotorMusic Catalog Bot.For more information on how to use the project, see the project’sdocumentation. |
aiotgbot | Key FeaturesAsyncio andaiohttpbasedAllTelegram Bot APItypes and methods supportedBot API rate limit supportBoth long polling and webhooks supportedFully type annotated (PEP 484)Installationaiotgbot is available on PyPI. Use pip to install it:pipinstallaiotgbotRequirementsPython >= 3.8aiohttpaiojobsmsgspecbackofffrozenlistaiofreqlimityarlUsing aiotgbotfromtypingimportAsyncIteratorfromaiotgbotimport(Bot,BotUpdate,HandlerTable,PollBot,PrivateChatFilter,Runner)fromaiotgbot.storage_memoryimportMemoryStoragehandlers=HandlerTable()@handlers.message(filters=[PrivateChatFilter()])asyncdefreply_private_message(bot:Bot,update:BotUpdate)->None:assertupdate.messageisnotNonename=(f'{update.message.chat.first_name}'f'{update.message.chat.last_name}')awaitbot.send_message(update.message.chat.id,f'Hello,{name}!')asyncdefrun_context(runner:Runner)->AsyncIterator[None]:storage=MemoryStorage()awaitstorage.connect()handlers.freeze()bot=PollBot(runner['token'],handlers,storage)awaitbot.start()yieldawaitbot.stop()awaitstorage.close()defmain()->None:runner=Runner(run_context)runner['token']='some:token'runner.run()if__name__=='__main__':main() |
aio-tg-bot | No description available on PyPI. |
aiotgnotifier | aiotgnotifierНотификация через мессенджер telegram |
aiotgsdk | aiotgsdk |
aiothingy | aiothingyAsynchronous Python library for interacting with the Nordic Thingy52 over Bluetooth. |