package: string, lengths 1 to 122
package-description: string, lengths 0 to 1.3M
aidockermon
Monitor system load of the server running the nvidia/cuda docker containers.

Features
- sysinfo: system static info
- sysload: system cpu/memory load
- gpu: nvidia gpu load
- disk: disk load
- containers: load of the containers that are based on the nvidia/cuda image

Prerequisite

Python >= 3

Installation

```
pip install aidockermon
```

Or use setuptools:

```
python setup.py install
```

Usage

```
$ aidockermon -h
usage: aidockermon [-h] [-v] {query,create-esindex,delete-esindex} ...

optional arguments:
  -h, --help            show this help message and exit
  -v, --version         show program's version number and exit

command:
  {query,create-esindex,delete-esindex}
    query               Query system info, log them via syslog protocol
    create-esindex      Create elasticsearch index
    delete-esindex      Delete elasticsearch index

$ aidockermon query -h
usage: aidockermon query [-h] [-l] [-r REPEAT] [-f FILTERS [FILTERS ...]] type

positional arguments:
  type                  info type: sysinfo, sysload, gpu, disk, containers

optional arguments:
  -h, --help            show this help message and exit
  -l, --stdout          Print pretty json to console instead of sending a log
  -r REPEAT, --repeat REPEAT
                        n/i repeat n times every i seconds
  -f FILTERS [FILTERS ...], --filters FILTERS [FILTERS ...]
                        Filter the disk paths for disk type; filter the
                        container names for containers type
```

For example:

Show sysinfo:

```
$ aidockermon query -l sysinfo
{"gpu": {"gpu_num": 2, "driver_version": "410.104", "cuda_version": "10.0"}, "mem_tot": 67405533184, "kernel": "4.4.0-142-generic", "cpu_num": 12, "docker": {"version": "18.09.3"}, "system": "Linux"}
```

Show sys load:

```
$ aidockermon query -l sysload
{"mem_free": 11866185728, "mem_used": 8023793664, "cpu_perc": 57.1, "mem_perc": 12.8, "mem_avail": 58803163136, "mem_tot": 67405533184}
```

Show gpu load:

```
$ aidockermon query -l gpu
{"mem_tot": 11177, "gpu_temperature": 76.0, "mem_free": 1047, "mem_used": 10130, "gpu_perc": 98.0, "gpu_id": 0, "mem_perc": 46.0}
{"mem_tot": 11178, "gpu_temperature": 66.0, "mem_free": 3737, "mem_used": 7441, "gpu_perc": 95.0, "gpu_id": 1, "mem_perc": 44.0}
```

Show disk usage:

```
$ aidockermon query disk -l -f /
{"path": "/", "device": "/dev/nvme0n1p3", "total": 250702176256, "used": 21078355968, "free": 216865271808, "percent": 8.9}

$ aidockermon query disk -l -f / /disk
{"path": "/", "device": "/dev/nvme0n1p3", "total": 250702176256, "used": 21078355968, "free": 216865271808, "percent": 8.9}
{"path": "/disk", "device": "/dev/sda1", "total": 1968874311680, "used": 1551374692352, "free": 317462949888, "percent": 83.0}
```

Show containers' load. Note that app_name is read from the environment variable APP_NAME, which is a short description of the training program:

```
$ aidockermon query containers -l -f DianAI
{"proc_name": "python3 test_run.py", "app_name": "test program", "pid": 13540, "container": "DianAI", "started_time": 1554698236, "running_time": 9343, "mem_used": 9757}
{"proc_name": "python train.py", "app_name": "", "pid": 15721, "container": "DianAI", "started_time": 1554698236, "running_time": 19343, "mem_used": 1497}
{"mem_limit": 67481047040, "net_output": 47863240948, "block_read": 1327175626752, "net_input": 18802869033, "mem_perc": 14.637655604461704, "block_write": 132278439936, "name": "DianAI", "cpu_perc": 0.0, "mem_used": 9877643264}
```

Config

```yaml
debug: false
log:
  version: 1
  # This is the default level, which could be ignored.
  # CRITICAL = 50, FATAL = CRITICAL, ERROR = 40, WARNING = 30, WARN = WARNING,
  # INFO = 20, DEBUG = 10, NOTSET = 0
  #level: 20
  disable_existing_loggers: false
  formatters:
    simple:
      format: '%(levelname)s %(message)s'
    monitor:
      format: '%(message)s'
  filters:
    require_debug_true:
      (): 'aidockermon.handlers.RequireDebugTrue'
  handlers:
    console:
      level: DEBUG
      class: logging.StreamHandler
      formatter: simple
      filters: [require_debug_true]
    monitor:
      level: INFO
      class: rfc5424logging.handler.Rfc5424SysLogHandler
      address: [127.0.0.1, 1514]
      enterprise_id: 1
  loggers:
    runtime:
      handlers: [console]
      level: DEBUG
      propagate: false
    monitor:
      handlers: [monitor, console]
      level: INFO
      propagate: false
```

This is the default config, which should be located at /etc/aidockermon/config.yml.

You can modify the address value to specify the logging target:
- address: [127.0.0.1, 1514]: UDP to 127.0.0.1:1514
- address: /var/log/aidockermon: unix domain datagram socket

If you add a socktype argument, you can specify whether to use UDP or TCP as the transport protocol:
- socktype: 1: TCP
- socktype: 2: UDP

Enable TLS/SSL:

```yaml
tls_enable: true
tls_verify: true
tls_ca_bundle: /path/to/ca-bundle.pem
```

Set debug to true to see message output in the console.

Cronjob

```
sudo cp etc/cron.d/aidockermon /etc/cron.d
sudo systemctl restart cron
```

syslog-ng

Use syslog-ng to collect the logs and send them to elasticsearch for later use, such as visualization with kibana.

```
cp etc/syslog-ng/syslog-ng.conf /etc/syslog-ng/
sudo systemctl restart syslog-ng
```

Sample config:

```
@version: 3.20

destination d_elastic {
  elasticsearch2(
    index("syslog-ng")
    type("${.SDATA.meta.type}")
    flush-limit("0")
    cluster("es-syslog-ng")
    cluster-url("http://localhost:9200")
    client-mode("http")
    client-lib-dir(/usr/share/elasticsearch/lib)
    template("${MESSAGE}\n")
  );
};

source s_python {
  #unix-dgram("/var/log/aidockermon");
  syslog(ip(127.0.0.1) port(1514) transport("udp") flags(no-parse));
};

log {
  source(s_python);
  parser { syslog-parser(flags(syslog-protocol)); };
  destination(d_elastic);
};
```

Modify it to specify the elasticsearch server and the log source's port and protocol.
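For quick ad-hoc checks it can also be handy to consume the -l JSON output from a script instead of shipping it through syslog. A minimal sketch, assuming aidockermon is on PATH and prints one JSON object per line as in the examples above:

```python
import json
import subprocess

def query(info_type):
    """Run `aidockermon query -l <type>` and parse each JSON line."""
    out = subprocess.run(
        ["aidockermon", "query", "-l", info_type],
        capture_output=True, text=True, check=True,
    ).stdout
    return [json.loads(line) for line in out.splitlines() if line.strip()]

for gpu in query("gpu"):
    print(f"gpu {gpu['gpu_id']}: {gpu['gpu_perc']}% util, "
          f"{gpu['mem_used']}/{gpu['mem_tot']} MiB")
```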
aido-client
Hello
aidoc-transit
Failed to fetch description. HTTP Status Code: 404
aidol
No description available on PyPI.
aidol3
No description available on PyPI.
aidon
No description available on PyPI.
aido-protocols
No description available on PyPI.
aido-protocols-daffy
No description available on PyPI.
aido-protocols-daffy-aido4
No description available on PyPI.
aidops-utility
Example Package. This is a simple example package. You can use GitHub-flavored Markdown to write your content.
aidotpy
ai.py

A single-file Python script that interacts with the ChatGPT API in the command line.

Features:
- Use shortcuts to access predefined prompts
- Highlight code in output
- Support one-shot queries and conversations
- Use special commands like !set to control the behavior when chatting

Install

Just copy the script to a folder in $PATH, like /usr/local/bin. You can also rename it to ai to get rid of the .py extension. Here's a command that installs the script directly into your system:

```
curl https://raw.githubusercontent.com/reorx/ai.py/master/ai.py -o /usr/local/bin/ai && chmod +x /usr/local/bin/ai
```

You can also install it with pip or pipx:

```
pip install aidotpy
```

Usage

Paste your OpenAI API key to ~/.ai_py_config.json, or set it in the AI_PY_API_KEY environment variable:

```
echo '{"api_key": "<Your API key>"}' > ~/.ai_py_config.json
```

For detailed usage of the script, please read the description of ./ai.py -h:

```
usage: ai [-h] [-s SYSTEM] [-c] [--history HISTORY] [-w] [-v] [-t] [-d]
          [--version]
          [PROMPT]

A simple CLI for ChatGPT API

positional arguments:
  PROMPT                your prompt, leave it empty to run REPL. you can use
                        @ to load prompt from ~/.ai_py_prompts.json

options:
  -h, --help            show this help message and exit
  -s SYSTEM, --system SYSTEM
                        system message to use at the beginning of the
                        conversation. if starts with @, the message will be
                        located through ~/.ai_py_prompts.json
  -c, --conversation    enable conversation, which means all the messages
                        will be sent to the API, not just the last one. This
                        is only useful to REPL
  --history HISTORY     load the history from a JSON file.
  -w, --write-history   write new messages to --history file after each chat.
  -v, --verbose         verbose mode, show execution info and role in the
                        message
  -t, --show-tokens     show a breakdown of the tokens used in the prompt and
                        in the response
  -d, --debug           debug mode, enable logging
  --version             show program's version number and exit
```

One-off query

Pass the prompt as the first argument:

```
./ai.py 'hello world'
```

You can also pass the prompt through a pipe (|):

```
head README.md | ./ai.py 'Proofreading the following text:'
```

REPL

Run without argument for a Read-eval-print loop:

```
./ai.py
```

By default only the last message and the system message are sent to the API. If you want it to remember all the context (i.e. send all the messages in each chat), add the -c argument to enable conversation:

```
./ai.py -c
```

System message

You can pass a system message to define the behavior for the assistant:

```
./ai.py -s 'You are a proofreader' 'its nice know you'
```

You can also save predefined system messages in ~/.ai_py_prompts.json and refer to them with @ at the beginning; this is covered in the next section.

Prompt shortcuts

You can predefine prompts in ~/.ai_py_prompts.json and refer to them by using @ as a prefix. This works for both system messages and user messages.

Suppose your ~/.ai_py_prompts.json looks like this:

```json
{
  "system": {
    "cli": "As a technology assistant with expertise in command line, answer questions in simple and short words for users who have a high-level background. Provide only one example, and explain as less as possible."
  },
  "user": {
    "native": "Paraphrase the following sentences to make it more native:\n",
    "revise": "Revise the following sentences to make them more clear concise and coherent:\n"
  }
}
```

Then you can use the cli prompt shortcut in a system message:

```
./ai.py -s @cli
```

and use the native or revise prompt shortcut in a user message:

```
./ai.py '@native its nice know you'
It's great to get to know you.
```

Verbose mode

Add -v to print the role name and parameters used in the API call.

Special commands

You can use special commands to control the behavior of the script when running in REPL. Here's a list of available commands:

- !set <key> <value>: set a key-value pair in the config; available keys are:
  - verbose: set to True or False, e.g. !set verbose True
  - conversation: set to True or False, e.g. !set conversation True
  - system: set the system message, e.g. !set system you are a poet, !set system @cli
  - params: set the parameters for the ChatGPT API, e.g. !set params temperature 0.5
  - model: set the model to use, e.g. !set model gpt-4
- !info: print the execution info
- !write-history: write current messages to the history file, e.g. !write-history history.json
aido-utils-daffy
No description available on PyPI.
aido-utils-daffy-aido4
No description available on PyPI.
aid-push
Push notifications for APNS (iOS) and GCM (Android). It provides an HTTP/2 network protocol based APNS client.

Quickstart

Install using pip:

```
pip install aid-push
```

Whether using APNS or GCM, pushjack provides clients for each. It also provides an HTTP/2 based APNS client.

APNS (using certificates)

Send notifications using the APNSClient class:

```python
from aid_push import APNSClient

client = APNSClient(certificate='<path/to/certificate.pem>',
                    default_error_timeout=10,
                    default_expiration_offset=2592000,
                    default_batch_size=100,
                    default_retries=5)

token = '<device token>'
alert = 'Hello world.'

# Send to single device.
# NOTE: Keyword arguments are optional.
res = client.send(token,
                  alert,
                  badge='badge count',
                  sound='sound to play',
                  category='category',
                  content_available=True,
                  title='Title',
                  title_loc_key='t_loc_key',
                  title_loc_args='t_loc_args',
                  action_loc_key='a_loc_key',
                  loc_key='loc_key',
                  launch_image='path/to/image.jpg',
                  extra={'custom': 'data'})

# Send to multiple devices by passing a list of tokens
# (`options` stands for the keyword arguments shown above).
client.send([token], alert, **options)
```

Access response data:

```python
res.tokens        # List of all tokens sent.
res.errors        # List of errors as APNSServerError objects.
res.token_errors  # Dict mapping errors as token => APNSServerError object.
```

Override defaults for error_timeout, expiration_offset, and batch_size:

```python
import time

client.send(token,
            alert,
            expiration=int(time.time() + 604800),
            error_timeout=5,
            batch_size=200)
```

Send a low priority message:

```python
# The default is low_priority == False
client.send(token, alert, low_priority=True)
```

Get expired tokens:

```python
expired_tokens = client.get_expired_tokens()
```

Close APNS connection:

```python
client.close()
```

For the APNS sandbox, use APNSSandboxClient instead:

```python
from aid_push import APNSSandboxClient
```

APNS (using auth tokens)

Send notifications using the APNSHTTP2Client class:

```python
from aid_push import APNSAuthToken, APNSHTTP2Client

token = APNSAuthToken(token="my_key",
                      team_id="my_team_id",
                      key_id="my_key_id")
client = APNSHTTP2Client(token=token,
                         bundle_id='my_bundle_id')
response = client.send_message(device_id="my_device_id",
                               message="message",
                               content_available=True,
                               title="title")
```

Close APNS connection:

```python
client.conn.close()
```

For the APNS sandbox, use APNSHTTP2SandboxClient instead:

```python
from aid_push import APNSHTTP2SandboxClient
```

GCM

Send notifications using the GCMClient class:

```python
from aid_push import GCMClient

client = GCMClient(api_key='<api-key>')

registration_id = '<registration id>'
alert = 'Hello world.'
notification = {'title': 'Title', 'body': 'Body', 'icon': 'icon'}

# Send to single device.
# NOTE: Keyword arguments are optional.
res = client.send(registration_id,
                  alert,
                  notification=notification,
                  collapse_key='collapse_key',
                  delay_while_idle=True,
                  time_to_live=604800)

# Send to multiple devices by passing a list of ids.
client.send([registration_id], alert, **options)
```

Alert can also be a dictionary with data fields:

```python
alert = {'message': 'Hello world', 'custom_field': 'Custom Data'}
```

Alert can also contain the notification payload:

```python
alert = {'message': 'Hello world', 'notification': notification}
```

Send a low priority message:

```python
# The default is low_priority == False
client.send(registration_id, alert, low_priority=True)
```

Access response data:

```python
res.responses         # List of requests.Response objects from GCM Server.
res.messages          # List of messages sent.
res.registration_ids  # List of registration ids sent.
res.data              # List of server response data from GCM.
res.successes         # List of successful registration ids.
res.failures          # List of failed registration ids.
res.errors            # List of exceptions.
res.canonical_ids     # List of canonical ids (registration ids that have changed).
```

Changelog

v2.1.4 (14-04-2021)
- apns: Fix HTTP2 error handling
- Remove nonfunctional parameters from APNSError calls

v2.1.3 (08-04-2021)
- apns: Fix HTTP2 error handling
- Remove errors and token_errors from APNSResponse

v2.1.2 (07-04-2021)
- Import APNSAuthTokenFile to __init__.py.

v2.1.1 (07-04-2021)
- apns: HTTP2 classes return an instance of APNSResponse
- apns: Fix an exception while trying to decode a string token

v2.1.0 (28-02-2020)
- Import APNSAuthToken to __init__.py.

v1.5.0 (2018-07-29)
- gcm: Use FCM URL instead of deprecated GCM URL. Thanks Lukas Anzinger!

v1.4.1 (2018-06-18)
- apns: Remove restriction on token length due to incorrect assumption about tokens always being 64 characters long.

v1.4.0 (2017-11-09)
- apns: Add exceptions APNSProtocolError and APNSTimeoutError. Thanks Jakub Kleň!
- apns: Add retry mechanism to APNSClient.send. Thanks Jakub Kleň!
  - Add default_retries argument to APNSClient initialization. Defaults to 5.
  - Add retries argument to APNSClient.send. By default will use APNSClient.default_retries unless explicitly passed in.
  - If unable to send after retries, an APNSTimeoutError will be raised.
- apns: Fix bug in bulk APNSClient.send that resulted in an off-by-one error for message identifier in returned errors. Thanks Jakub Kleň!
- apns: Add max payload truncation option to APNSClient.send. Thanks Jakub Kleň!
  - Add default_max_payload_length argument to APNSClient initialization. Defaults to 0, which disables the max payload length check.
  - Add max_payload_length argument to APNSClient.send. By default will use APNSClient.default_max_payload_length unless explicitly passed in.
  - When max_payload_length is set, messages will be truncated to fit within the length restriction by trimming the "message" text and appending it with "...".

v1.3.0 (2017-03-11)
- apns: Optimize reading from APNS Feedback so that the number of bytes read is based on header and token lengths.
- apns: Explicitly close connection to APNS Feedback service after reading data.
- apns: Add support for mutable-content field (Apple Notification Service Extension) via mutable_content argument to APNSClient.send(). Thanks Ahmed Khedr!
- apns: Add support for thread-id field (group identifier in Notification Center) via thread_id argument to APNSClient.send(). Thanks Ahmed Khedr!

v1.2.1 (2015-12-14)
- apns: Fix implementation of empty APNS notifications and allow notifications with {"aps": {}} to be sent. Thanks Julius Seporaitis!

v1.2.0 (2015-12-04)
- gcm: Add support for priority field to GCM messages via low_priority keyword argument. Default behavior is for all messages to be "high" priority. This is the opposite of GCM messages but mirrors the behavior in the APNS module where the default priority is "high".

v1.1.0 (2015-10-22)
- gcm: Add support for notification field to GCM messages.
- gcm: Replace registration_ids field with to field when sending to a single recipient since registration_ids field has been deprecated for single recipients.

v1.0.1 (2015-05-07)
- gcm: Fix incorrect authorization header in GCM client. Thanks Brad Montgomery!

v1.0.0 (2015-04-28)
- apns: Add APNSSandboxClient for sending notifications to APNS sandbox server.
- apns: Add message attribute to APNSResponse.
- pushjack: Add internal logging.
- apns: Fix APNS error checking to properly handle reading when no data returned.
- apns: Make APNS sending stop during iteration if a fatal error is received from APNS server (e.g. invalid topic, invalid payload size, etc).
- apns/gcm: Make APNS and GCM clients maintain an active connection to server.
- apns: Make APNS always return APNSResponse object instead of only raising APNSSendError when errors encountered. (breaking change)
- apns/gcm: Remove APNS/GCM module send functions and only support client interfaces. (breaking change)
- apns: Remove config argument from APNSClient and use individual method parameters as mapped below instead: (breaking change)
  - APNS_ERROR_TIMEOUT => default_error_timeout
  - APNS_DEFAULT_EXPIRATION_OFFSET => default_expiration_offset
  - APNS_DEFAULT_BATCH_SIZE => default_batch_size
- gcm: Remove config argument from GCMClient and use individual method parameters as mapped below instead: (breaking change)
  - GCM_API_KEY => api_key
- pushjack: Remove pushjack.clients module. (breaking change)
- pushjack: Remove pushjack.config module. (breaking change)
- gcm: Rename GCMResponse.payloads to GCMResponse.messages. (breaking change)

v0.5.0 (2015-04-22)
- apns: Add new APNS configuration value APNS_DEFAULT_BATCH_SIZE and set to 100.
- apns: Add batch_size parameter to APNS send that can be used to override APNS_DEFAULT_BATCH_SIZE.
- apns: Make APNS send batch multiple notifications into a single payload. Previously, individual socket writes were performed for each token. Now, socket writes are batched based on either the APNS_DEFAULT_BATCH_SIZE configuration value or the batch_size function argument value.
- apns: Make APNS send resume sending from after the failed token when an error response is received.
- apns: Make APNS send raise an APNSSendError when one or more error responses received. APNSSendError contains an aggregation of errors, all tokens attempted, failed tokens, and successful tokens. (breaking change)
- apns: Replace priority argument to APNS send with low_priority=False. (breaking change)

v0.4.0 (2015-04-15)
- apns: Improve error handling in APNS so that errors aren't missed.
- apns: Improve handling of APNS socket connection during bulk sending so that connection is re-established when lost.
- apns: Make APNS socket read/writes non-blocking.
- apns: Make APNS socket frame packing easier to grok.
- apns/gcm: Remove APNS and GCM send_bulk function. Modify send to support bulk notifications. (breaking change)
- apns: Remove APNS_MAX_NOTIFICATION_SIZE as config option.
- gcm: Remove GCM_MAX_RECIPIENTS as config option.
- gcm: Remove request argument from GCM send function. (breaking change)
- apns: Remove sock argument from APNS send function. (breaking change)
- gcm: Return namedtuple for GCM canonical ids.
- apns: Return namedtuple class for APNS expired tokens.

v0.3.0 (2015-04-01)
- gcm: Add restricted_package_name and dry_run fields to GCM sending.
- gcm: Add exceptions for all GCM server error responses.
- apns: Make apns.get_expired_tokens and APNSClient.get_expired_tokens accept an optional sock argument to provide a custom socket connection.
- apns: Raise APNSAuthError instead of APNSError if certificate file cannot be read.
- apns: Raise APNSInvalidPayloadSizeError instead of APNSDataOverflow. (breaking change)
- apns: Raise APNSInvalidTokenError instead of APNSError.
- gcm: Raise GCMAuthError if GCM_API_KEY is not set.
- pushjack: Rename several function parameters: (breaking change)
  - gcm: alert to data
  - gcm: token/tokens to registration_id/registration_ids
  - gcm: Dispatcher/dispatcher to GCMRequest/request
  - Clients: registration_id to device_id
- gcm: Return GCMResponse object for GCMClient.send/send_bulk. (breaking change)
- gcm: Return requests.Response object(s) for gcm.send/send_bulk. (breaking change)

v0.2.2 (2015-03-30)
- apns: Fix payload key assignments for title-loc, title-loc-args, and launch-image. Previously, '_' was used in place of '-'.

v0.2.1 (2015-03-28)
- apns: Fix incorrect variable reference in apns.receive_feedback.

v0.2.0 (2015-03-28)
- pushjack: Fix handling of config in clients when config is a class object and subclass of Config.
- apns: Make apns.send/send_bulk accept additional alert fields: title, title-loc, title-loc-args, and launch-image.
- gcm: Make gcm.send/send_bulk raise a GCMError exception if GCM_API_KEY is not set.
- gcm: Make gcm payload creation cast data to dict if it isn't passed in as one. Original value of data is then set to {'message': data}. (breaking change)
- gcm: Make gcm payload creation not set defaults for optional keyword arguments. (breaking change)

v0.1.0 (2015-03-26)
- pushjack: Rename pushjack.settings module to pushjack.config. (breaking change)
- apns/gcm: Allow config settings overrides to be passed into create_gcm_config, create_apns_config, and create_apns_sandbox_config.
- pushjack: Override Config's update() method with a custom method that functions similarly to from_object() except that it accepts a dict instead.

v0.0.1 (2015-03-25)
- First release.

License

The MIT License (MIT)

Copyright (c) 2015 Derrick Gilland

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
ai-dq-module
DQAI (Data Quality Artificial Intelligence)

This code provides a Python class called DQAI that uses the OpenAI Chat API to analyze a dataset and generate data quality rules specific to the data.

Usage
1. Install the necessary dependencies.
2. Set up your OpenAI API key or use the provided default key.
3. Prepare your dataset in a suitable format (e.g., CSV).
4. Instantiate the DQAI class.
5. Invoke the invoke_from_dataset method, passing the dataset as input.
6. The code will generate Python code based on the dataset and execute it.
7. The generated rules and the results will be saved in the current directory as "generated_code.py" and "rulesapplication.csv", respectively.
8. The generated rules can be obtained by calling the _get_rules_from_file method.

Note: Make sure to modify the file path (path variable) in the provided code to match your dataset's location.

Example:

```python
import pandas as pd
from dqai import DQAI

# Read the dataset from a CSV file
path = "path/to/your/dataset.csv"
data = pd.read_csv(path)

# Instantiate DQAI and generate data quality rules
dqai = DQAI()
result = dqai.invoke_from_dataset(data)

# Access the generated rules and results
rules = result["0"]
results_df = result["1"]
```
ai-drf-core
Overview

This package contains various useful helper functions for the Django REST framework.

ATTENTION: This package is deprecated. All functionality will live on in the ai-django-core package. Please install it with the drf extension enabled. More details in the changelog.

Installation
- Add a requirement to your requirements.txt: ai-drf-core
- Add the module to INSTALLED_APPS: ai-drf
- Run migrations

Contribute
- Clone the project locally
- Create a new branch for your feature
- Change the dependency in your requirements.txt to a local (editable) one that points to your local file system: -e /Users/myuser/workspace/ai-drf-core
- Ensure the code passes the tests
- Run: python setup.py develop
- Create a pull request

Publish to PyPI
- Run: python setup.py sdist upload
- If you run into trouble, please create a file in your home directory ~/.pypirc:

```
[distutils]
index-servers = pypi

[pypi]
repository: https://upload.pypi.org/legacy/
username:
password:
```

Tests
- Check coverage: pytest --cov=ai-drf-core
- Run tests: pytest
aidrin
Failed to fetch description. HTTP Status Code: 404
aidriver
No description available on PyPI.
aids
AIDScrapper

Based off a small script made by an anonymous user. Now it's a full-featured client which supports not only AID but also Holo. Alternatively, it can transform AID scenario objects into .scenario files for compatibility with the different AI Dynamic Storytelling platforms, especially NAI. The json files can be transformed into .html for better readability, although that's not the main purpose of this tool; you should upload the stories to one of the supported websites to use them.

Requirements

To install all requirements, use the following snippet after installing Python on your machine:

```
pip install -r requirements.txt
```

Or just use:

```
pip install aids
```

Usage

Once the whole thing is installed, you can manage it directly from the console. Use:

```
aids help
```

for a list of commands, or:

```
aids-windows.bat help
```

if you use Windows.

You can also use the package directly via:

```
python3 -m aids help
```

or outside the package directory with:

```
python aids/manage.py help
```

Testing

If you feel like messing around with the code, make sure none of the tests are failing using the test command. It uses pytest if you have it installed, or unittest as a fallback.
aidsinfo
A Python wrapper for the AIDSinfo drug information API.

AIDSinfo documentation: http://www.aidsinfo.nih.gov/Other/rss.aspx

Usage

```python
>>> from aidsinfo import DrugInfo
>>> info = DrugInfo()
>>> info.search('abacavir')
{'abacavir': {'data': 'here'}}
>>> info.search('combivir')
{'combivir': {'data': 'here'}}
>>> # You can also get back just the XML data.
... xml_data = info.search('combivir', output_format=None)
```

Copyright

Copyright (c) 2011 Code for America Laboratories. See LICENSE for details.
aidt
AI Data Transformers

This repo contains a library that uses LLMs for data transformations on values in Pandas or Spark dataframes.

The core value prop is that the library exposes simple building blocks for writing transformations that use an LLM to perform some computation over the data as given by a prompt. In essence, instead of defining the data transformation logic in detailed code, we replace the logic with an LLM, i.e. we instruct the LLM to take an instruction, process the data, and produce the desired output.

For example, consider the problem of needing to parse dates out of raw text which can be in any format: unix timestamp, UTC timestamp, textual description, in Japanese, etc. Traditionally, data engineers need to write well-tested transformations for ever-increasing subsets of cases on highly filtered data. But you can do the same with just an LLM call with the instruction: "parse out time and write it out as a UTC timestamp".

The library exposes interfaces for both API-based models and open source models. In particular, one can take a possibly quantized model and use that. The library takes care of optimally running it, the worker distribution, and serialization.

Low-level functions: let Pandas etc. handle it
- Map, Reduce, Serialize

Composition functions: let users define them
- Don't define composition abstractions; let users handle it.

Text-level functions (a usage sketch follows below):

```
transform(x, transform_instruction)
structify(x, model: PydanticBaseModel, structify_instruction=None)
classify(x, c, labels=[], classification_instruction=None)
score_in_range(x, range_start, range_end, scoring_instruction=None)
extract(x, types=[])
```

Optimizations
- Compile transformations: for a sequence of transformations, if you compile and serialize, under the hood the models will reuse the KV-cache for the input.
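As a usage sketch for the text-level functions above, applied to the timestamp example from the introduction. The import path and exact call pattern are assumptions (the description only names the functions), so treat this as illustrative rather than the library's verified API:

```python
import pandas as pd

from aidt import transform  # import path assumed, not confirmed by the docs

df = pd.DataFrame(
    {"raw_time": ["1700000000", "2023-11-14T22:13:20Z", "Tuesday at noon, Tokyo time"]}
)

# One LLM-backed call replaces a pile of hand-written parsing branches.
df["event_time_utc"] = df["raw_time"].map(
    lambda x: transform(x, "parse out time and write it out as a UTC timestamp")
)
```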
aidu
Artificial Intelligence, Do

License

Apache Version 2.0

```
Copyright 2019 Alexander W. Wong

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```
aidungeon
Placeholder for AIDungeon 2! Coming soon...
aidungeonapi
AIDungeonAPI

UNOFFICIAL API to interface with AI Dungeon.

Example

```python
import asyncio

from aidungeonapi import AIDungeonClient

async def callback(result):
    print(result)

async def blocking_task():
    while True:
        await asyncio.sleep(10)

async def main():
    aidc = await AIDungeonClient(debug=True)
    adventure = await aidc.connect_to_public_adventure('51e86616-507f-49f7-b07d-9a58b3261781')
    await adventure.register_actions_callback(callback)
    await asyncio.create_task(blocking_task())

asyncio.run(main())
```
ai-dungeon-cli
AI Dungeon CLI

This is basically a cli client to play.aidungeon.io. It allows playing AI Dungeon 2 inside a terminal.

I primarily did this to play the game on a DEC VT320 hardware terminal for a more faithful experience. For more context, read the accompanying blog post.

Installation

pip:

```
$ python3 -m pip install ai-dungeon-cli
```

Or for an unstable release from the source code:

```
$ python3 -m pip install .
```

Arch Linux

Package is on AUR. Using trizen:

```
$ trizen -S ai-dungeon-cli-git
```

Old school way:

```
$ git clone https://aur.archlinux.org/ai-dungeon-cli-git.git
$ cd ai-dungeon-cli-git
$ makepkg -si
```

Playing

Unless specified, all user inputs are considered Do actions.

Quoted input entries are automatically interpreted as Say actions, e.g.:

```
> "Hey dragon! You didn't invite me to the latest BBQ party!"
```

To be explicit about the action type, prefix your input with a command: /do, /say, /story or /remember. For example, the previous Say prompt could also be written:

```
> /say Hey dragon! You didn't invite me to the latest BBQ party!
```

To quit, either press Ctrl-C, Ctrl-D or type in the special /quit command.

Running

In any case, you first need to create a configuration file.

Installed:

```
$ ai-dungeon-cli
```

From source, with a conda env (assuming you're using anaconda):

```
$ cd ai-dungeon-cli
$ conda env create
$ conda activate ai-dungeon-cli-env
$ ./ai_dungeon_cli/__init__.py
```

With a virtualenv:

```
$ cd ai-dungeon-cli
$ virtualenv -p $(command -v python3) ai-dungeon-cli-venv
$ source ai-dungeon-cli-venv/bin/activate
$ python3 -m pip install -r requirements.txt
$ ./ai_dungeon_cli/__init__.py
```

Please note that all those examples use a virtual env in order to not mess up the main Python env on your system.

Configuration (optional)

Several things can be tuned by resorting to a config file. Create a file config.yml either:
- in your home folder: $HOME/.config/ai-dungeon-cli/config.yml
- in the same folder as the sources: ./ai-dungeon-cli/ai_dungeon_cli/config.yml

Authentication

By default, if no authentication configuration is provided, an anonymous session is created.

ai-dungeon-cli supports 2 ways to configure user authentication. Either provide a couple of credentials in conf:

```yaml
email: '<MY-USER-EMAIL>'
password: '<MY-USER-PASSWORD>'
```

Or sniff an Authentication Token and use it directly:

```yaml
auth_token: '<MY-AUTH-TOKEN>'
```

To get this token, you need to first login in a web browser to play.aidungeon.io. Then you can find the token either in your browser localStorage or in the content of the connection_init message of the websocket communication (first sent message). Either way, developer tools (F12) is your friend.

Slow Typing Animation

By default, responses are printed to the screen instantly. To enable a fun "slow" typing animation, use:

```yaml
slow_typing_effect: True
```

Prompt

The default user prompt is '> '. You can customize it, e.g.:

```yaml
prompt: 'me: '
```

Command-line arguments

All configuration options are mapped to command-line arguments. Additionally, some features (such as multi-player support) are only available through those arguments. The list of all arguments can be retrieved by calling ai-dungeon-cli with either -h or --help.

- Authentication: use either the --auth-token <token> or --email <email> --password <password> arguments to authenticate.
- Slow Typing Animation: just append --slow-typing to your execution call to enable this fancy effect.
- Prompt: the custom prompt can be set with --prompt '<prompt>'.
- Multi-player: to join an existing multi-player adventure, use arguments --adventure <public-adventure-id> --name <character-name>.
- Debug: to enable debug mode and see the responses from the play.aidungeon.io API, use --debug. This option is mainly useful for developers.

Dependencies

Please have a look at requirements.txt.

Limitations and future improvements

Right now, the code is over-optimistic: we don't catch cleanly when the backend is down. A better user experience could be achieved with the use of the curses library. For now the /revert and /alter special actions are not supported. It would also be nice to add support for browsing other players' stories (Explore menu).

Implementation details

We fall back to a pure ASCII version of the splash logo if we detect an incompatible locale / terminal type.

Support

As you might have heard, hosting AI Dungeon costs a lot of money. This cli client relies on the same infrastructure as the online version (play.aidungeon.io). So don't hesitate to help support the hosting fees to keep the game up and running.

Author

Jordan Besly @p3r7 (blog).

Contributors & acknowledgements

Major contributions:
- Idan Gur @idangur: OOP rewrite of game logic
- Alberto Oporto Ames @otreblan: packaging, submission to AUR
- @jgb95: slow typing effect
- Alexander Batyrgariev @sasha00123: help on porting to new websocket/gql-based version of the API

Minor contributions:
- Robert Davis @bdavs: pip requirements
- @Jezza: suggested login using creds

Code for slow typing effect inspired by this message from Magnus Lycka on the Python Tutor mailing list.

Similar projects:
- sasha00123/ai-dungeon-bot, a bot for Telegram & VK, written in Python
- SoptikHa2/aidungeon2-cli, written in Rust (unmaintained)
aidutils
aidutils
aie
it is i.

```python
from aie import i
print(i)  # i >>>>>>> j
```
aiearth-core
AIEARTH CORE. Read More.
aiearth-data
AIEarth Engine Python SDK. Visit the AIEarth main page for more information.

AIEarth Data Quickstart

Important: to guarantee data transfer speed, it is recommended to use this inside the platform-hosted Jupyter Notebook environment.

Overview

The AIEarth Data SDK lets users access the platform's public data and their own private data through the standard STAC interface. By specifying a dataset's unique STAC ID together with an offset and size, you can fetch slices of the data.

Installation

Contact the platform (DingTalk user group: 32152986) to obtain the Python module installer aiearth_data-1.0.0-py3-none-any.whl. Open a terminal in the development environment and install the module:

```
pip install '/path/to/aiearth_data-1.0.0-py3-none-any.whl'
```

Then open a notebook in the development environment to write code.

Example: query public data and fetch the full BGR bands

```python
# Initialize and authenticate
from datetime import datetime

import numpy as np

from aiearth import core
from aiearth.data.search import opendata_client
from aiearth.data.loader import DataLoader
from aiearth.data.stac_id import OpenStacId

# Authenticate
core.Authenticate()

# Get the STAC client
client_opendata = opendata_client()

# Query Sentinel-2 data over an area and time range, sorted by cloud cover ascending
bbox = [116.30526075575888, 39.856226715750836, 116.45625485992359, 39.96534081463834]
search_req = client_opendata.search(
    collections=['SENTINEL_MSIL2A'],
    bbox=bbox,
    sortby=['+eo:cloud_cover'],
    datetime=(datetime(2022, 1, 1), datetime(2022, 3, 1)),
)

# Take the first result
item = list(search_req.items())[0]
print(item.id)

# Read all data for bands B4, B3, B2 in 2048x2048 blocks
dataloader = DataLoader(OpenStacId(item.id))
dataloader.load()

width, height = item.properties.get("proj:shape")
print(width, height)

buffer_width, buffer_height = 2048, 2048
width_grid = int(width / buffer_width) + 1
height_grid = int(height / buffer_height) + 1

img = np.ndarray(shape=(width, height, 3))
for idx, band_name in enumerate(("B4", "B3", "B2")):
    channel = np.ndarray(shape=(width, height))
    for h in range(height_grid):
        for w in range(width_grid):
            x_offset = buffer_width * w
            y_offset = buffer_height * h
            this_buffer_width = buffer_width if x_offset + buffer_width <= width else width - x_offset
            this_buffer_height = buffer_height if y_offset + buffer_height <= height else height - y_offset
            block = dataloader.block(x_offset, y_offset, this_buffer_width, this_buffer_height, band_name)
            channel[x_offset:x_offset + this_buffer_width, y_offset:y_offset + this_buffer_height] = block
    img[:, :, idx] = channel
```

Example: query personal data and fetch the full B1 band

```python
# Import the aiearth base package and authenticate
from aiearth import core
core.Authenticate()

# Import aiearth-data and other dependencies
from datetime import datetime

import numpy as np

from aiearth.data.search import PersonalQueryBuilder, personaldata_client
from aiearth.data.loader import DataLoader
from aiearth.data.stac_id import PersonalStacId

# Query personal data whose name contains "gaofen1_2m_5000",
# uploaded/produced between 2023-06-01 and 2023-06-15
query = (
    PersonalQueryBuilder()
    .and_name_contains("gaofen1_2m_5000")
    .and_upload_datetime_between(datetime(2023, 6, 1), datetime(2023, 6, 15))
    .build()
)
client = personaldata_client()
search_req = client.search(query=query)
items = list(search_req.items())

# Take the first image and inspect its properties
item = items[0]
item.properties
# properties content:
# {'meta:bandCount': 3, 'meta:dataType': 'Byte',
#  'description': '超分_gaofen1_2m_5000', 'title': '超分_gaofen1_2m_5000',
#  'proj:bbox': [113.14763797458761, 29.51906661147954, 113.24763840038761, 29.619067037279542],
#  'proj:epsg': -1, 'datetime': '2023-06-14T16:38:27.056Z', 'proj:shape': [13915, 13915],
#  'proj:transform': [113.14763797458761, 7.18652e-06, 0.0, 29.619067037279542, 0.0, -7.18652e-06],
#  'meta:resX': 0.7999997333291894, 'meta:resY': 0.799999733329161,
#  'aie:band_names': ['B1', 'B2', 'B3'], 'meta:uploadDatetime': '2023-06-14T16:38:27.056Z'}

# Loop to fetch the whole B1 band
width, height = item.properties.get("proj:shape")
img = np.ndarray(shape=(width, height))
buffer_width, buffer_height = 2048, 2048
width_grid = int(width / buffer_width) + 1
height_grid = int(height / buffer_height) + 1

dataloader = DataLoader(PersonalStacId(item.id))
dataloader.load()
for h_idx in range(height_grid):
    for w_idx in range(width_grid):
        x_offset = buffer_width * w_idx
        y_offset = buffer_height * h_idx
        this_buffer_width = buffer_width if x_offset + buffer_width <= width else width - x_offset
        this_buffer_height = buffer_height if y_offset + buffer_height <= height else height - y_offset
        block = dataloader.block(x_offset, y_offset, this_buffer_width, this_buffer_height, "B1")
        img[x_offset:x_offset + this_buffer_width, y_offset:y_offset + this_buffer_height] = block
```
aiearth-deeplearning
Failed to fetch description. HTTP Status Code: 404
aiearth-engine
AIEarth Engine Python SDK. Visit the AIEarth main page for more information, and see the Quickstart guide to get started.
aiearth-openapi
AIEarth OpenAPI
aiearth-predict
AI Earth Predict is a development kit that combines spatio-temporal data with AI models. It is a unified compute library for large-scale remote sensing model inference, and can also be used to process remote sensing data in parallel. It supports running in local mode as well as deploying models to the AI Earth geoscience cloud platform. See the AI Earth Predict GitHub.
ai-economist
Foundation: An Economic Simulation Framework

This repo contains an implementation of Foundation, a framework for flexible, modular, and composable environments that model socio-economic behaviors and dynamics in a society with both agents and governments.

Foundation provides a Gym-style API:
- reset: resets the environment's state and returns the observation.
- step: advances the environment by one timestep, and returns the tuple (observation, reward, done, info).

This simulation can be used in conjunction with reinforcement learning to learn optimal economic policies, as detailed in the following papers:

- The AI Economist: Improving Equality and Productivity with AI-Driven Tax Policies. Stephan Zheng, Alexander Trott, Sunil Srinivasa, Nikhil Naik, Melvin Gruesbeck, David C. Parkes, Richard Socher.
- The AI Economist: Optimal Economic Policy Design via Two-level Deep Reinforcement Learning. Stephan Zheng, Alexander Trott, Sunil Srinivasa, David C. Parkes, Richard Socher.
- Building a Foundation for Data-Driven, Interpretable, and Robust Policy Design using the AI Economist. Alexander Trott, Sunil Srinivasa, Douwe van der Wal, Sebastien Haneuse, Stephan Zheng.

If you use this code in your research, please cite us using this BibTeX entry:

```bibtex
@misc{2004.13332,
  Author = {Stephan Zheng, Alexander Trott, Sunil Srinivasa, Nikhil Naik,
            Melvin Gruesbeck, David C. Parkes, Richard Socher},
  Title = {The AI Economist: Improving Equality and Productivity with AI-Driven Tax Policies},
  Year = {2020},
  Eprint = {arXiv:2004.13332},
}
```

For more information and context, check out:
- The AI Economist website
- Blog: The AI Economist: Improving Equality and Productivity with AI-Driven Tax Policies
- Blog: The AI Economist moonshot
- Blog: The AI Economist web demo of the COVID-19 case study
- Web demo: The AI Economist ethical review of AI policy design and COVID-19 case study

Simulation Cards: Ethics Review and Intended Use

Please see our Simulation Card for a review of the intended use and ethical review of our framework. Please see our COVID-19 Simulation Card for a review of the ethical aspects of the pandemic simulation (and as fitted for COVID-19).

Join us on Slack

If you're interested in extending this framework, discussing machine learning for economics, and collaborating on research projects: join our Slack channel aieconomist.slack.com using this invite link, or email us @ [email protected].

Installation Instructions

To get started, you'll need to have Python 3.7+ installed.

Using pip

Simply use the Python package manager:

```
pip install ai-economist
```

Installing from Source

1. Clone this repository to your local machine:

```
git clone www.github.com/salesforce/ai-economist
```

2. Create a new conda environment (named "ai-economist" below; replace with anything else) and activate it:

```
conda create --name ai-economist python=3.7 --yes
conda activate ai-economist
```

3. Either

a) Edit the PYTHONPATH to include the ai-economist directory:

```
export PYTHONPATH=<local path to ai-economist>:$PYTHONPATH
```

OR

b) Install as an editable Python package:

```
cd ai-economist
pip install -e .
```

Useful tip: for quick access, add the following to your ~/.bashrc or ~/.bash_profile:

```
alias aiecon="conda activate ai-economist; cd <local path to ai-economist>"
```

You can then simply run aiecon once to activate the conda environment.

Testing your Install

To test your installation, try running:

```
conda activate ai-economist
python -c "import ai_economist"
```

Getting Started

To familiarize yourself with Foundation, check out the tutorials in the tutorials folder. You can run these notebooks interactively in your browser on Google Colab.

Multi-Agent Simulations
- economic_simulation_basic (try this on Colab): shows how to interact with and visualize the simulation.
- economic_simulation_advanced (try this on Colab): explains how Foundation is built up using composable and flexible building blocks.
- optimal_taxation_theory_and_simulation (try this on Colab): demonstrates how economic simulations can be used to study the problem of optimal taxation.
- covid19_and_economic_simulation (try this on Colab): introduces a simulation of the COVID-19 pandemic and economy that can be used to study different health and economic policies.

Multi-Agent Training
- multi_agent_gpu_training_with_warp_drive (try this on Colab): introduces our multi-agent reinforcement learning framework WarpDrive, which we then use to train the COVID-19 and economic simulation.
- multi_agent_training_with_rllib (try this on Colab): shows how to perform distributed multi-agent reinforcement learning with RLlib.
- two_level_curriculum_training_with_rllib: describes how to implement two-level curriculum training with RLlib.

To run these notebooks locally, you need Jupyter. See https://jupyter.readthedocs.io/en/latest/install.html for installation instructions and https://jupyter-notebook.readthedocs.io/en/stable/ for examples of how to work with Jupyter.

Structure of the Code

The simulation is located in the ai_economist/foundation folder. The code repository is organized into the following components:

- base: contains base classes that can be extended to define Agents, Components and Scenarios.
- agents: Agents represent economic actors in the environment. Currently, we have mobile Agents (representing workers) and a social planner (representing a government).
- entities: endogenous and exogenous components of the environment. Endogenous entities include labor, while exogenous entities include landmarks (such as Water and Grass) and collectible Resources (such as Wood and Stone).
- components: Components are used to add some particular dynamics to an environment. They also add action spaces that define how Agents can interact with the environment via the Component.
- scenarios: Scenarios compose Components to define the dynamics of the world. A Scenario also computes rewards and exposes states for visualization.

The datasets (including the real-world data on COVID-19) are located in the ai_economist/datasets folder.

Releases and Contributing

Please let us know if you encounter any bugs by filing a GitHub issue. We appreciate all your contributions. If you plan to contribute new Components, Scenarios, Entities, or anything else, please see our contribution guidelines.

Changelog

For the complete release history, see CHANGELOG.md.

License

Foundation and the AI Economist are released under the BSD-3 License.
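Since Foundation exposes a Gym-style reset/step API, a basic interaction loop looks roughly like the sketch below. It follows the style of the tutorial notebooks, but the exact env_config keys and the foundation.make_env_instance call should be treated as assumptions that may need adjusting for your installed version, and the random actions stand in for a trained policy:

```python
import numpy as np
from ai_economist import foundation

# A pared-down scenario config in the style of the basic tutorial (assumed keys).
env_config = {
    "scenario_name": "layout_from_file/simple_wood_and_stone",
    "components": [
        ("Build", {}),
        ("ContinuousDoubleAuction", {"max_num_orders": 5}),
        ("Gather", {}),
    ],
    "env_layout_file": "quadrant_25x25_20each_30clump.txt",
    "n_agents": 4,
    "world_size": [25, 25],
    "episode_length": 1000,
    "multi_action_mode_agents": False,
    "multi_action_mode_planner": False,
}

env = foundation.make_env_instance(**env_config)
obs = env.reset()

def sample_random_action(agent, mask):
    # Pick a random unmasked action (single-action mode assumed above).
    return np.random.choice(np.arange(agent.action_spaces), p=mask / mask.sum())

for _ in range(100):
    actions = {
        a.idx: sample_random_action(a, obs[a.idx]["action_mask"])
        for a in env.all_agents
    }
    obs, rew, done, info = env.step(actions)
    if done["__all__"]:
        break
```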
ai-edge
No description available on PyPI.
ai-edge-explorer
No description available on PyPI.
ai-edge-toolbox
No description available on PyPI.
ai-edge-torch
No description available on PyPI.
aiee
No description available on PyPI.
aiei
Install AIEI

Versions are recommended, not required:
- python >= 3.6
- torch >= 1.6.0 (pytorch amp & tensorboard), torchvision

Install locally:

```
pip install -v -e .
```

With building ops:

```
AIEI=0 pip install -v -e .
```

Change variables: base_config.py -> TORCH_MODEL_HOME, config.py -> data_path

TODO
- merge multi-scale_pad and get_scale_pad
- add user-defined path_ckpt
ai-einblick-prompt
ai-einblick-prompt empowers JupyterLab users with a domain-specific AI agent that can generate, modify, and fix code for data science workflows. Einblick's extension is the easiest and only context-aware way to augment the data workflow with generative AI.

Einblick Prompt AI

Einblick is the AI-native data notebook that can write and fix code, create beautiful charts, build models, and much more. This is made possible via our domain-specific AI agent, Einblick Prompt. Visit our homepage to learn more about Einblick.

Usage

Watch a quick video tutorial.

Quick Start
1. Start JupyterLab and install ai-einblick-prompt via the Extension Manager in the left panel.
2. Read in a dataset.
3. Click the Einblick logo icon in the top-right side of any Python cell, and select "Generate."
4. Ask Einblick Prompt AI to write code.

Example Prompts
- "Create a box plot of col_3."
- "Filter for cat_1, cat_2, and cat_3."
- "Create a scatter plot of col_1 vs. col_2, color by col_4."

Keyboard shortcut
- Command (⌘) + K / Ctrl + K: toggle prompt widget on active cell.

Commands

The following commands are executable from the JupyterLab command palette (Command (⌘) + Shift + C / Ctrl + Shift + C):
- Einblick AI: Prompt: toggle prompt widget on active cell.
- Einblick AI: Generate: toggle "Generate" prompt on active cell to create new code for the cell.
- Einblick AI: Fix: toggle "Fix" prompt on active cell to fix errors in the cell.
- Einblick AI: Modify: toggle "Modify" prompt on active cell to change existing code in the cell.

Installation Steps

Requirements
- JupyterLab >= 4.0.0

Install

Method 1: search ai-einblick-prompt in JupyterLab's Extension Manager, and click Install.

Method 2: execute:

```
pip install ai-einblick-prompt
```

Uninstall

To remove the extension, execute:

```
pip uninstall ai-einblick-prompt
```
aie-ipyleaflet
A Jupyter widget for dynamic Leaflet maps
aiembassy
No description available on PyPI.
ai-embedder
AI Link Embedder

Introduction

This abomination of code was created to solve my fiancé's issue with migrating Adobe Illustrator files to a different machine. It comes with absolutely no guarantee, but it seems to work. Good luck.

Requirements

Running this script requires:
- Windows OS (macOS is not supported for now)
- Adobe Illustrator installed

Run

Usage:

```
usage: AI Link Embedder [-h] [-r] [-o] [-d DEST] [-p PREFIX] [-s SUFFIX] dictionary

This script takes .ai files and saves them with linked files embedded

positional arguments:
  dictionary            Directory with input .ai files

options:
  -h, --help            show this help message and exit
  -r, --recursive       Scan for .ai files recursively
  -o, --override        Override resulting .ai if it exists
  -d DEST, --dest DEST  Directory where resulting .ai files will be saved to
  -p PREFIX, --prefix PREFIX
  -s SUFFIX, --suffix SUFFIX
```

Example:

```
python -m ai_embedder
```
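Putting the documented flags together, a fuller invocation might look like the following; the paths and the suffix value are illustrative, only the flags come from the usage text above:

```
python -m ai_embedder -r -o -d C:\output -s _embedded C:\projects\illustrator
```

This would recursively scan C:\projects\illustrator for .ai files, embed their linked files, and save the results with an _embedded suffix into C:\output, overriding any existing outputs.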
ai-emotion
Representations of emotion for use in AI systems.

Theoretical Overview

Simple Emotional Representations

There are several ways of representing emotion available in this package.

Simple Vectoral Representation

An emotion is given as a vector in a 3D space. Each dimension can have a value from -1 to 1 inclusive, and the dimensions are:
- valence (unpleasant-pleasant): how positive or negative an emotion is. For example, happiness has high positive valence, while sadness has high negative valence.
- arousal (deactivated-activated): the level of activation or energy associated with the emotion. For instance, excitement is a high-arousal emotion, while calmness is a low-arousal emotion.
- control (submissive-dominant): the degree of control or influence a person feels they have in a given emotional state. Anger and pride are high positive, while helplessness and fear are high negative.

Plutchik's Wheel of Emotions

Plutchik identifies eight primary emotions. These primary emotions can be combined to form secondary emotions, which can be combined to form tertiary emotions.

In our package, we represent an emotional state as an 8D vector with one dimension for each primary emotion. Furthermore, the length of the vector can range from 0 to 1 inclusive, representing the intensity of the emotion. The eight primary emotions are: joy, sadness, trust, disgust, fear, anger, surprise, anticipation.

Physiological Representation

With inspiration from theories like James-Lange, Cannon-Bard, and Schachter-Singer, we can describe an emotional state using a physiological vector. Each dimension can range from 0 to 1 inclusive, representing the intensity of the physiological response. Here our dimensions are: heart_rate, breathing_rate, hair_raised, blood_pressure, body_temperature, muscle_tension, pupil_dilation, gi_blood_flow, amygdala_blood_flow, prefrontal_cortex_blood_flow, muscle_blood_flow, genitalia_blood_flow, smile_muscle_activity, brow_furrow_activity, lip_tightening, throat_tightness, mouth_dryness, voice_pitch_raising, speech_rate, cortisol_level, adrenaline_level, oxytocin_level.

Conversion Between Representations

Conversion between representations requires using an LLM, because the emotional interpretation of a physiological reaction is context-dependent. Note that converting between different representations is also very inexact, and you will very likely not get the same vector back from doing an inverse operation.

Complex Emotional Representations

Complex emotions may not be easily representable by a single vector. For example, a person may feel exhausted at work and want to watch some TV, but they may also know that watching TV would make them feel anxious and guilty. Simultaneously they may also feel a sense of pride from working on their project.

In this package we represent complex emotions in the following ways:
- Conditional Emotions: the person in this example knows that if they watch TV, they will feel a certain way. This is a prediction about how their emotional state will change in response to a certain action. Thus we provide a transition function on an emotional representation object, allowing the emotion to change in relation to a context and an action.
- Combined Emotions: the physiological representation is a unitary representation, and is a single vector regardless of how complicated a person's emotional state is. The other representations are the sum of weighted vectors.

Examples

This is a simple example of how to use the package. See the examples/ folder for more complex examples.

```python
import os
from typing import cast

from dotenv import load_dotenv

from ai_emotion.complex_emotion import ComplexEmotion
from ai_emotion.conversion import to_plutchik
from ai_emotion.simple_emotion import VectoralEmotion
from ai_emotion.transition import transition

vectoral_emotion = VectoralEmotion(
    valence=0.1,
    arousal=0.2,
    control=0.8,
)

load_dotenv()

converted_plutchik_emotion = to_plutchik(
    vectoral_emotion,
    openai_api_key=cast(str, os.getenv("OPENAI_API_KEY")),
    context="Getting prepared for a presentation.",
)

# Weighted combination of the raw and converted representations.
complex_emotion = ComplexEmotion([
    (0.7, vectoral_emotion),
    (0.3, converted_plutchik_emotion),
])

later_complex_emotion = transition(
    complex_emotion,
    openai_api_key=cast(str, os.getenv("OPENAI_API_KEY")),
    context="Started my presentation.",
)
```
aiengine
AIEngine is a next generation interactive/programmable Python/Ruby/Java/Lua packet inspection engine with capabilities of learning without any human intervention, NIDS (Network Intrusion Detection System) functionality, DNS domain classification, network collector, network forensics and many others.

AIEngine also helps network/security professionals to identify traffic and develop signatures for use on NIDS, firewalls, traffic classifiers and so on.

The main functionalities of AIEngine are:
- Support for interacting/programming with the user while the engine is running.
- Support for PCRE JIT for regex matching.
- Support for regex graphs (complex detection patterns).
- Support for five types of NetworkStacks (lan, mobile, lan6, virtual and oflow).
- Support for Sets and Bloom filters for IP searches.
- Support for Linux, FreeBSD and macOS operating systems.
- Support for HTTP, DNS and SSL domain matching.
- Support for banned domains and hosts for HTTP, DNS, SMTP and SSL.
- Frequency analysis for unknown traffic and auto-regex generation.
- Generation of Yara signatures.
- Easy integration with databases (MySQL, Redis, Cassandra, Hadoop, etc.) for data correlation.
- Easy integration with other packet engines (Netfilter).
- Support for memory clean caches to refresh stored memory information.
- Support for detecting DDoS at the network/application layer.
- Support for rejecting TCP/UDP connections.
- Support for network forensics in real time.
- Supports protocols such as Bitcoin, CoAP, DHCP, DNS, GPRS, GRE, HTTP, ICMPv4/ICMPv6, IMAP, IPv4/v6, Modbus, MPLS, MQTT, Netbios, NTP, OpenFlow, POP, Quic, RTP, SIP, SMTP, SSDP, SSL, TCP, UDP, VLAN, VXLAN.
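The description lists capabilities but no code, so here is a minimal sketch of how the engine is typically driven from Python, recalled from the project's published examples; treat the pyaiengine module name, StackLan, and PacketDispatcher as assumptions rather than a verified API:

```python
import pyaiengine  # Python bindings shipped with AIEngine (per project examples)

# One of the five stack types listed above, for plain LAN traffic.
st = pyaiengine.StackLan()

# Size the TCP/UDP flow caches before attaching the stack to traffic.
st.tcp_flows = 327680
st.udp_flows = 163840

# Open a network device (or a pcap file path) and run the engine.
with pyaiengine.PacketDispatcher("eth0") as pd:
    pd.stack = st
    pd.run()
```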
ai.engine
serial.engine
ai_engine_new
Name: AI-Engine

Description: Kandinsky-Dream-Image-API is a reverse API that provides a simple and efficient way to generate images using AI from Kandinsky or Dream.ai.

Download:

```
pip install ai_engine
```

Also, make sure you have Python version 3.8 or higher installed. If you use pydantic in your projects, you will not need to change your usual version; the library works with both major versions (from 1.12.10 to 2.3.0). The library is small and easy to extend. Poetry was used for development.

The documentation will be written on the wiki pages; please help fill it in.

Functionality:
- Image generation using models from Kandinsky and Dream.ai
- A simple and intuitive API for interaction
- The ability to configure image generation parameters (resolution, style, etc.)
aieon
No description available on PyPI.
aie-sdk
AIEarth Engine Python SDK. Visit the AIEarth main page for more information, and see the Quickstart guide to get started.
aie-secrets
No description available on PyPI.
ai-eval-flow-post-processor
Failed to fetch description. HTTP Status Code: 404
ai-executor
1. Goal

Using AI's code-generation ability, let Python execute methods that don't exist! Strengthen coding skills and validate the feasibility of the idea.

2. References:
- https://www.bilibili.com/video/BV1na4y1K7H5/?spm_id_from=333.880.my_history.page.click
- https://www.bilibili.com/video/BV1ET41187m9/?spm_id_from=333.880.my_history.page.click

3. End result
- Nonexistent methods can be executed; take care to name methods sensibly.
- Methods that run correctly can be saved, avoiding repeated generation.
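The package doesn't document its API, but the trick it describes (executing a method that doesn't exist by asking an LLM to write it) is usually built on __getattr__. The sketch below is a hypothetical illustration of the idea, not the package's actual implementation; generate_code_with_llm is a stand-in you would back with a real model call:

```python
def generate_code_with_llm(prompt: str) -> str:
    """Stand-in for a real model call that returns Python source."""
    raise NotImplementedError("plug in your LLM client here")

class AIExecutor:
    """Resolve calls to methods that don't exist by generating them on the fly."""

    def __init__(self):
        self._cache = {}  # keep working implementations to avoid regeneration

    def __getattr__(self, name):
        # Invoked only when normal attribute lookup fails,
        # i.e. exactly for the "nonexistent methods" described above.
        if name not in self._cache:
            source = generate_code_with_llm(
                f"Write a Python function named {name}; infer its purpose from the name."
            )
            namespace = {}
            exec(source, namespace)  # executing model output: review before trusting
            self._cache[name] = namespace[name]
        return self._cache[name]

# With a sensible method name, the model knows what to generate:
#   AIExecutor().sort_list_descending([3, 1, 2])
```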
aiextractor
No description available on PyPI.
aiextras
No description available on PyPI.
ai-eye
Info

ai_eye.py 2018-05-25

Author: Zhao Mingming <[email protected]>
Copyright: This module has been placed in the public domain.
Version: 0.0.7

Functions:
- has_closed_eye: eye's open degree

How To Use This Module

When you use pip install ldm==0.0.2:

```python
from ldm import landmarks
from ai_eye import has_closed_eye
from skimage import io

imagepath = "closed_eye/10.jfif"
img = io.imread(imagepath)
ldl, helptxt = landmarks(img)
print(helptxt)
for ld in ldl:
    print(has_closed_eye(ld))
```

When you use pip install ldm==0.0.4:

```python
import ldm
from ai_eye import has_closed_eye
from skimage import io

imagepath = "closed_eye/10.jfif"
ldmer = ldm.LDM()
img = io.imread(imagepath)
ldl, facel, helptxt = ldmer.landmarks(img)
print(helptxt)
for ld in ldl:
    print(has_closed_eye(ld))
```
aif360
The AI Fairness 360 toolkit is an open-source library to help detect and mitigate bias in machine learning models. The AI Fairness 360 Python package includes a comprehensive set of metrics for datasets and models to test for biases, explanations for these metrics, and algorithms to mitigate bias in datasets and models.We have developed the package with extensibility in mind. This library is still in development. We encourage the contribution of your datasets, metrics, explainers, and debiasing algorithms.
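To make the description concrete, here is a small sketch of computing one dataset metric with aif360. The class and method names (BinaryLabelDataset, BinaryLabelDatasetMetric) follow the library's documented API, but the toy data is invented for illustration:

```python
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric

# Toy data: 'sex' is the protected attribute (1 = privileged group),
# 'label' is the outcome (1 = favorable).
df = pd.DataFrame({
    "sex":   [1, 1, 1, 1, 0, 0, 0, 0],
    "age":   [45, 32, 50, 28, 41, 36, 29, 53],
    "label": [1, 1, 1, 0, 1, 0, 0, 0],
})

dataset = BinaryLabelDataset(
    df=df,
    label_names=["label"],
    protected_attribute_names=["sex"],
    favorable_label=1,
    unfavorable_label=0,
)

metric = BinaryLabelDatasetMetric(
    dataset,
    privileged_groups=[{"sex": 1}],
    unprivileged_groups=[{"sex": 0}],
)

# Ratio of favorable-outcome rates between groups; 1.0 means parity.
print("Disparate impact:", metric.disparate_impact())
print("Statistical parity difference:", metric.statistical_parity_difference())
```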
aif360-fork2
This is a temporary fork for personal use. It will be deleted in a while
aiface
aiface
ai-face
# ai_face

version: 0.0.1

Analyze the face attributes in the image, like this:

```python
# pip install ldm
import ldm
# pip install ai_face
from ai_face import *
from skimage import io

imagepath = "closed_eye/10.jfif"
ldmer = ldm.LDM()
img = io.imread(imagepath)
ldl, facel, helptxt = ldmer.landmarks(img)
print(helptxt)

print('face_num:')
print(face_num(facel))

print('face_area_rate:')
print(face_area_rate(img, facel))

print('face_center_degree:')
print(face_center_degree(img, facel))

print('face_direction:')
print(face_direction(facel))

print('face_feature:')
print(face_feature(img, facel))

print('face_compare:')
feature1 = face_feature(img, facel)
feature2 = face_feature(img, facel)
print(compare2facefeature(feature1, feature2))
```

20180524 anjiang
aifactory
No description available on PyPI.
aifactory-alpha
No description available on PyPI.
aifactory-beta
No description available on PyPI.
aifactory-tools-find-buildout-root
No description available on PyPI.
aifail
AiFail: Python library for retrying OpenAI/GCP API calls. Installation: pip install aifail. Usage: to get started, simply wrap your functions with @retry_if, specifying the condition to retry on. https://github.com/GooeyAI/aifail/blob/a540c05a2a9436c0b6b1caab8ed823387999d5f9/examples/basic_openai.py#L5 https://github.com/GooeyAI/aifail/blob/a540c05a2a9436c0b6b1caab8ed823387999d5f9/examples/basic_openai.py#L13-L30 Custom logic: you can use this with anything that needs retrying, e.g. Google Sheets:

def sheets_api_should_retry(e: Exception) -> bool:
    return isinstance(e, HttpError) and (
        e.resp.status in (408, 429) or e.resp.status > 500
    )

@retry_if(sheets_api_should_retry)
def update_cell(spreadsheet_id: str, row: int, col: int, value: str):
    get_spreadsheet_service().values().update(
        spreadsheetId=spreadsheet_id,
        range=f"{col_i2a(col)}{row}:{col_i2a(col)}{row}",
        body={"values": [[value]]},
        valueInputOption="RAW",
    ).execute()

Advanced usage: this library is used by GooeyAI in production to handle thousands of API calls every day. To save costs and handle rate limits, you can intelligently specify quick fallbacks (e.g. Azure OpenAI): https://github.com/GooeyAI/aifail/blob/a540c05a2a9436c0b6b1caab8ed823387999d5f9/examples/azure_openai_fallback.py#L7-L16 https://github.com/GooeyAI/aifail/blob/a540c05a2a9436c0b6b1caab8ed823387999d5f9/examples/azure_openai_fallback.py#L19-L35

Traceable errors: AiFail comes with a built-in logger and outputs complete stack traces tracing the error back to the original call site.

# python examples/azure_openai_fallback.py
2023-11-18 04:36:01.364 | WARNING | aifail.aifail:try_all:63 - [2/2] tyring next fn, prev_exc=NotFoundError("Error code: 404 - {'error': {'code': 'DeploymentNotFound', 'message': 'The API deployment for this resource does not exist. If you created the deployment within the last 5 minutes, please wait a moment and try again.'}}")
2023-11-18 04:36:01.778 | WARNING | aifail.aifail:wrapper:98 - [1/1] captured error, retry_delay=0.4117675457681431s, exc=NotFoundError("Error code: 404 - {'error': {'message': 'The model `gpt-4-x` does not exist', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}")
2023-11-18 04:36:02.483 | WARNING | aifail.aifail:try_all:63 - [2/2] tyring next fn, prev_exc=NotFoundError("Error code: 404 - {'error': {'code': 'DeploymentNotFound', 'message': 'The API deployment for this resource does not exist. If you created the deployment within the last 5 minutes, please wait a moment and try again.'}}")
2023-11-18 04:36:04.093 | WARNING | aifail.aifail:wrapper:98 - [2/1] captured error, retry_delay=0.9974197744911488s, exc=NotFoundError("Error code: 404 - {'error': {'message': 'The model `gpt-4-x` does not exist', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}")

Traceback (most recent call last):
  File "/Users/dev/Projects/dara/aifail/aifail/aifail.py", line 65, in try_all
    return fn()
  File "/Users/dev/Projects/dara/aifail/examples/azure_openai_fallback.py", line 28, in <lambda>
    lambda: azure_client.chat.completions.create(
...
openai.NotFoundError: Error code: 404 - {'error': {'code': 'DeploymentNotFound', 'message': 'The API deployment for this resource does not exist. If you created the deployment within the last 5 minutes, please wait a moment and try again.'}}

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/dev/Projects/dara/aifail/aifail/aifail.py", line 86, in wrapper
    return fn(*args, **kwargs)
  File "/Users/dev/Projects/dara/aifail/examples/azure_openai_fallback.py", line 26, in chad_gpt4
    response = try_all(
  File "/Users/dev/Projects/dara/aifail/aifail/aifail.py", line 69, in try_all
    raise prev_exc
  File "/Users/dev/Projects/dara/aifail/aifail/aifail.py", line 65, in try_all
    return fn()
  File "/Users/dev/Projects/dara/aifail/examples/azure_openai_fallback.py", line 34, in <lambda>
    lambda: openai_client.chat.completions.create(
...
openai.NotFoundError: Error code: 404 - {'error': {'message': 'The model `gpt-4-x` does not exist', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/dev/Projects/dara/aifail/aifail/aifail.py", line 65, in try_all
    return fn()
  File "/Users/dev/Projects/dara/aifail/examples/azure_openai_fallback.py", line 28, in <lambda>
    lambda: azure_client.chat.completions.create(
...
openai.NotFoundError: Error code: 404 - {'error': {'code': 'DeploymentNotFound', 'message': 'The API deployment for this resource does not exist. If you created the deployment within the last 5 minutes, please wait a moment and try again.'}}

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/dev/Projects/dara/aifail/examples/azure_openai_fallback.py", line 43, in <module>
    chad_gpt4(
  File "/Users/dev/Projects/dara/aifail/aifail/aifail.py", line 102, in wrapper
    raise prev_exc
  File "/Users/dev/Projects/dara/aifail/aifail/aifail.py", line 86, in wrapper
    return fn(*args, **kwargs)
  File "/Users/dev/Projects/dara/aifail/examples/azure_openai_fallback.py", line 26, in chad_gpt4
    response = try_all(
  File "/Users/dev/Projects/dara/aifail/aifail/aifail.py", line 69, in try_all
    raise prev_exc
  File "/Users/dev/Projects/dara/aifail/aifail/aifail.py", line 65, in try_all
    return fn()
  File "/Users/dev/Projects/dara/aifail/examples/azure_openai_fallback.py", line 34, in <lambda>
    lambda: openai_client.chat.completions.create(
...
openai.NotFoundError: Error code: 404 - {'error': {'message': 'The model `gpt-4-x` does not exist', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

Process finished with exit code 1

Sentry, too, will capture the entire retry loop.
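For readers who cannot follow the GitHub links above, here is a minimal self-contained sketch of the documented @retry_if decorator; the predicate and the flaky function are illustrative, mirroring the Sheets example:

import random
from aifail import retry_if

def should_retry(e: Exception) -> bool:
    # retry only transient-looking failures
    return isinstance(e, ConnectionError)

@retry_if(should_retry)
def flaky_call() -> str:
    if random.random() < 0.5:
        raise ConnectionError("transient network error")
    return "ok"

print(flaky_call())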
aifashion-sdk
AIFashion python SDK. AF Open API. This project is a wrapper around the HTTP API provided by AIFashion®, intended to give developers a concise and efficient SDK. It automatically handles authentication, access-token retrieval and token renewal when the token is about to expire, and exposes each API feature as a method on a single object, which is very convenient. ID & Secret: first visit the AIFashion® console, register and log in, then create an application to obtain its ID and Secret. Installation: you can install this SDK with pip: pip install aifashion_sdk. Usage: import the aifashion package and supply the ID and Secret:

import aifashion

client_id = 'XXXXX'
client_secret = 'YYYYY'
af = aifashion.AIFashionFunctions(client_id=client_id, client_secret=client_secret)

Alternatively, store the ID & Secret in a YAML file and pass the filename via the client_filename parameter:

client_id: XXXXX
client_secret: YYYYY

import aifashion

client_filename = 'ZZZZ.yml'
af = aifashion.AIFashionFunctions(client_filename=client_filename)

You can then call the functions on the af object to obtain recognition results; a typical call looks like this:

# if image_url is given
image_url = 'some url here'
af.a_function(image_url=image_url)

# if image_filename is given
image_filename = 'some filename here'
af.a_function(image_fname=image_filename)

# if image_base64 string is given
image_base64 = 'some base64 here'
af.a_function(image_base64=image_base64)

e.g.:

>>> af.fahsion_tagging(image_url='https://images.aifashion.com/01/0f/c7/010fc7622b1b75f7d580137636f25d88.jpg')
{'image': {'height': 640, 'id': '010fc7622b1b75f7d580137636f25d88', 'url': 'https://images.aifashion.com/01/0f/c7/010fc7622b1b75f7d580137636f25d88.jpg', 'width': 640}, 'object': {'category': '箱包', 'category_id': 501, 'confidence': 0.9646803140640259, 'region': {'x1': 0.07128814697265624, 'x2': 0.9111869812011719, 'y1': 0.041831283569335936, 'y2': 0.9469302368164062}}, 'request_id': '855be4d7-7896-4428-bcf1-507a89a835c5#359895', 'tags': [{'key': '风格', 'values': [{'confidence': 0.26829038772796976, 'value': '韩版'}]}, {'key': '款式', 'values': [{'confidence': 0.7964576377930703, 'value': '单肩包'}]}, {'key': '大小', 'values': [{'confidence': 0.5306223626645841, 'value': '小'}]}, {'key': '图案', 'values': [{'confidence': 0.44190941484801455, 'value': '纯色'}]}, {'key': '适用场景', 'values': [{'confidence': 0.959326524265762, 'value': '休闲'}]}, {'key': '闭合方式', 'values': [{'confidence': 0.6383202888420646, 'value': '拉链'}]}, {'key': '形状', 'values': [{'confidence': 0.44122126961424496, 'value': '横款方形'}]}, {'key': '性别', 'values': [{'confidence': 1.0, 'value': '通用'}]}, {'key': '二级类别', 'values': [{'confidence': 0.81, 'value': '挎包'}]}]}

>>> af.color_analysis(image_fname="dress.jpeg")
{'colors': [{'name_en': 'AntiqueWhite', 'name_zh': '浅橙色', 'percentage': 0.1646513949747629, 'rgb': '#F6EBDD'}, {'name_en': 'DimGray', 'name_zh': '黑灰色', 'percentage': 0.388586906280622, 'rgb': '#5B575A'}, {'name_en': 'Salmon', 'name_zh': '西瓜红', 'percentage': 0.16732301662136928, 'rgb': '#E8696D'}, {'name_en': 'DarkSalmon', 'name_zh': '西瓜红', 'percentage': 0.23548634607206909, 'rgb': '#EF9E97'}, {'name_en': 'SaddleBrown', 'name_zh': '深棕色', 'percentage': 0.04395233605117681, 'rgb': '#5B231C'}], 'image': {'height': 640, 'id': '010fc7622b1b75f7d580137636f25d88', 'url': 'https://images.aifashion.com/01/0f/c7/010fc7622b1b75f7d580137636f25d88.jpg', 'width': 640}, 'object': {'category': '短裙', 'category_id': 301, 'confidence': 0.8859304785728455, 'region': {'x1': 0.06238516420125961, 'x2': 0.9350795149803162, 'y1': 0.019997864961624146, 'y2': 0.9653385281562805}}, 'request_id': '855be4d7-7896-4428-bcf1-507a89a835c5#359896'}

>>> af.clothes_detect(image_fname="red_dress.jpeg")
{'image': {'height': 541, 'id': '2755c68dfc1b44301219bc9f399b3148', 'url': 'https://images.aifashion.com/27/55/c6/2755c68dfc1b44301219bc9f399b3148.jpg', 'width': 500}, 'objects': [{'category': '连衣裙', 'category_id': 302, 'confidence': 0.9999842643737793, 'region': {'x1': 0.15614866256713866, 'x2': 0.8775624847412109, 'y1': 0.22497901348686994, 'y2': 0.9921972458030277}}], 'request_id': '5ec403e2-d5f1-4dc3-aa51-0109c6e43b56#363567'}
aifeynman
Quick Start. Installation: it is strongly recommended to set up a fresh virtual environment by typing:

virtualenv -p python3 feyn
source feyn/bin/activate

First install numpy with pip install numpy. The 'aifeynman' package is available on PyPI and can be installed with pip install aifeynman. Note that for now, AI Feynman is supported only on Linux and Mac environments. First example: move into a clean directory and run the following Python commands:

import aifeynman

aifeynman.get_demos("example_data")  # Download examples from server
aifeynman.run_aifeynman("./example_data/", "example1.txt", 60, "14ops.txt", polyfit_deg=3, NN_epochs=500)

This example will get solved in about 10-30 minutes depending on what computer you have and whether you have a GPU. Here 'example1.txt' contains the data table to perform symbolic regression on, with columns separated by spaces, commas or tabs. The other parameters control the search: here the brute-force module tries combinations of the 14 basic operations in '14ops.txt' for up to 60 seconds, polynomial fits are tried up to degree 3, and the interpolating neural network is trained for up to 500 epochs.

AI-Feynman: this code is an improved implementation of "AI Feynman: a Physics-Inspired Method for Symbolic Regression", Silviu-Marian Udrescu and Max Tegmark (2019) [Science Advances], and "AI Feynman 2.0: Pareto-optimal symbolic regression exploiting graph modularity", Udrescu S.M. et al. (2020) [arXiv]. Please check this Medium article for a more detailed explanation of how to get the code running. In order to get started, run compile.sh to compile the Fortran files used by the brute-force code. ai_feynman_example.py contains an example of running the code on some examples (found in the example_data directory). The examples correspond to equations I.8.14, I.10.7 and I.50.26 in Table 4 of the paper. More data files on which the code can be tested can be found in the Feynman Symbolic Regression Database. The main function of the code, called by the user, has the following parameters: pathdir - path to the directory containing the data file; filename - the name of the file containing the data; BF_try_time - time limit for each brute-force call (set by default to 60 seconds); BF_ops_file_type - file containing the symbols to be used in the brute-force code (set by default to "14ops.txt"); polyfit_deg - maximum degree of the polynomial tried by the polynomial fit routine (set by default to 4); NN_epochs - number of epochs for the training (set by default to 4000); vars_name - names of the variables appearing in the equation (including the name of the output variable), passed as a list of strings with the variables in the same order as they appear in the data file; test_percentage - percentage of the input data to be kept aside and used as the test set. The data file to be analyzed should be a text file with each column containing the numerical values of each (dependent and independent) variable. The solution file will be saved in the directory called "results" under the name solution_{filename}.
The solution file will contain several rows (corresponding to each point on the Pareto frontier), each row showing: the mean base-2 logarithm of the error of the discovered equation applied to the input data (this can be thought of as the average error in bits); the cumulative base-2 logarithm of the error of the discovered equation applied to the input data (this can be thought of as the cumulative error in bits); the complexity of the discovered equation (in bits); the error of the discovered equation applied to the input data; and the symbolic expression of the discovered equation. If test_percentage is different from zero, one more number is added at the beginning of each row, showing the error of the discovered equation on the test set. ai_feynman_terminal_example.py allows calling the aiFeynman function from the command line (e.g. python ai_feynman_terminal_example.py --pathdir=../example_data/ --filename=example1.txt). Use python ai_feynman_terminal_example.py --help to display all the available parameters that can be passed to the function. Citation: if you compare with, build on, or use aspects of the AI Feynman work, please cite the following:

@article{udrescu2020ai,
  title={AI Feynman: A physics-inspired method for symbolic regression},
  author={Udrescu, Silviu-Marian and Tegmark, Max},
  journal={Science Advances},
  volume={6},
  number={16},
  pages={eaay2631},
  year={2020},
  publisher={American Association for the Advancement of Science}
}
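Putting the documented parameters together, a fuller call might look like the sketch below; the directory, file and variable names are placeholders, and the values are only illustrative defaults from the parameter list above:

import aifeynman

# Run symbolic regression on my_data.txt, holding out 20% of rows as a test set.
aifeynman.run_aifeynman(
    "./my_data_dir/",              # pathdir: directory containing the data file
    "my_data.txt",                 # filename: data table, columns = variables
    60,                            # BF_try_time: brute-force time limit per call (s)
    "14ops.txt",                   # BF_ops_file_type: operations for brute force
    polyfit_deg=4,                 # maximum polynomial degree to try
    NN_epochs=4000,                # neural-network training epochs
    vars_name=["x0", "x1", "y"],   # variable names, in file order, output included
    test_percentage=20,            # percent of data held out for testing
)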
aiffel7
aiffel7_library. Try it out: install the package with pip install --upgrade aiffel7 and check usage in the example folder, which contains documentation for the implemented functions. Contributing: via pull request, add the file to be included in the package to the aiffel7 folder, and put contributor information at the top of the code (date may be omitted):

# Contributor: [contributor name]
# Date: [YYYY-MM-DD]
# Description: [brief description of the change]

In the description it helps to explain in detail where it is used, what benefit it brings, how it works, and so on. Write usage instructions in the example folder. Questions: create a new issue under Issues; file requests for things that would be good to have in the library, or let us know if something in the library is broken.
aiffel-korean-tokenizer
No description available on PyPI.
aiff-kit
No description available on PyPI.
aifig
AI Figures. Purpose: AIFIG is a Python library for generating figures of machine learning models. The library allows you to generate figures such as the following, which may be useful in presentations, papers, etc. AIFIG is a refactored version of some of my personal code. Functionality will naturally be limited and not suited to every use; I encourage anyone who is interested to contribute additional features. If you use AIFIG in a paper, you can cite the library like this (bibtex):

@misc{aifig,
  author = {Sigve Rokenes},
  title = {AI-FIG},
  year = {2019},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/evgiz/aifig}}
}

Install: AI-FIG library with svg export:

pip install aifig

If you need to export as png or pdf:

pip install svglib

Usage. Simple example:

# Import library
import aifig

# Create new figure; title and author are optional
my_figure = aifig.figure("Figure 1", "Sigve Rokenes")

# Figures consist of graphs (eg. each network in a model)
my_graph = aifig.graph("gen")

# Graphs contain elements (inputs, outputs, layers)
my_graph.add(aifig.dense("input", 16))
my_graph.add(aifig.dense("hidden_1", 64))
my_graph.add(aifig.dense("hidden_2", 128))
my_graph.add(aifig.dense("hidden_3", 64))
my_graph.add(aifig.dense("output", 1))
my_graph.add(aifig.arrow("prediction"))

# Add the graph to the figure at position (0,0)
my_figure.add(my_graph, 0, 0)

# Save the figure
my_figure.save_png("my_figure.png", scale=1)
my_figure.save_svg("my_figure.svg")
my_figure.save_pdf("my_figure.pdf")

The above code generates this figure. Multi-graph example (GAN model):

import aifig

figure = aifig.figure()

# Define generator network
generator_elements = [
    aifig.dense("noise_vector", 128, comment="norm_dist", simple=True),
    aifig.conv("tconv_1", 48, comment="5x5"),
    aifig.conv("tconv_2", 32, comment="5x5"),
    aifig.conv("tconv_3", 8, comment="5x5"),
    aifig.conv("tconv_4", 3, comment="5x5"),
    aifig.image("gen_result", comment="(fake image)")
]

# Define discriminator network
discriminator_elements = [
    aifig.image("image_input", comment="real/fake"),
    aifig.conv("conv_1", 16, comment="5x5"),
    aifig.pool("max_pool"),
    aifig.conv("conv_2", 32, comment="5x5"),
    aifig.pool("max_pool"),
    aifig.conv("conv_3", 48, comment="5x5"),
    aifig.dense("dense_1", 64),
    aifig.dense("output", 1),
    aifig.arrow("prediction", comment="log prob")
]

# Create graphs with elements
gen_graph = aifig.graph("gen", generator_elements)
dsc_graph = aifig.graph("dsc", discriminator_elements)
dat_graph = aifig.graph("dat", [aifig.image("real_image", comment="(dataset)")])

# Add graphs to figure
figure.add(gen_graph, 0, 0)
figure.add(dat_graph, 1, 0)
figure.add(dsc_graph, 0, 1)

# Connect inputs to discriminator network
figure.connect("gen", "dsc")
figure.connect("dat", "dsc")

# Save figure as png
figure.save_png("gan.png")

This code generates the following figure. API: a figure consists of one or more graphs. These graphs are placed in a grid using figure.add(graph, x, y). You can add elements to graphs using mygraph.add(element), and you can connect graphs with arrows using figure.connect("graph_name1", "graph_name2").
Finally, to save a figure, use my_figure.save_svg("fig.svg") or the variants for different formats.

# ===================== #
#  Figure               #
# ===================== #
# title    figure title
# author   figure author
my_figure = aifig.figure()

# figure.add
# graph    graph to add
# x        x position in grid
# y        y position in grid
my_figure.add(graph, 0, 0)

# figure.connect
# from      name of first graph
# to        name of second graph
# position  grid position of arrow; use this if different arrows overlap
# offset    arrow offset in units, useful to distinguish different arrows at the same position
my_figure.connect("graph1", "graph2")

# figure.save (path)
# path   file path to save to
# scale  upscale (png only)
# debug  enable debug draw mode
my_figure.save_png("my_figure.png", scale=1)
my_figure.save_svg("my_figure.svg")
my_figure.save_pdf("my_figure.pdf")

# ===================== #
#  Graph                #
# ===================== #
# name      (required)
# elements  [list of elements]
# spacing   (between elements, default 32)
my_graph = aifig.graph("graph_name")
my_graph.add(element)

# ===================== #
#  Layer elements       #
# ===================== #
# label       text label, use None to hide
# size        size of layer (nodes, filters)
# comment     additional comment text
# size_label  set to False to hide size label
# simple      (dense only) set True to render as simple rectangle
dense = aifig.dense()  # Dense (fully connected)
conv = aifig.conv()    # Convolutional layer

# ===================== #
#  Simple elements      #
# ===================== #
# label    text label, use None to hide
# comment  additional comment text
pool = aifig.pool()    # Pooling layer
image = aifig.image()  # Image (usually input)
arrow = aifig.arrow()  # Arrow

# ===================== #
#  Special elements     #
# ===================== #
# width  width of padding (use negative to reduce)
padding = aifig.padding(10)

Dependencies: svgwrite; svglib (only to save as pdf/png); reportlab (only to save as pdf/png).
aifix
No description available on PyPI.
aiflow
AI Flow. Introduction: AI Flow, which offers various reusable operators & processing units for AI modeling, helps AI engineers write less, reuse more, and integrate easily. Install: pip install aiflow. Concepts: Operators vs. Units. Ideally, we agree that: an Operator contains many units and is integrated into airflow for building non-realtime processing workflows; a Unit is a small calculation unit, which could be a function or just a simple piece of modeling logic, and it can be picked up like a brick to build an operator. Besides, it can be reused anywhere for realtime calculation (see the concept sketch at the end of this entry). Classes: Operators: MongoToCSVOperator, Elastic2CSVOperator, RegExLabellingOperator. Units: Doc2VecUnit, Doc2MatUnit. Tests & Examples: Example: Use Units to Build Your Castle; Example: Working with Airflow. In the tests/docker/ folder, we provide examples of how to use aiflow with airflow. It is a docker image, so you can simply copy it and start using it! In the project root directory, run this command first: docker-compose up --build aiflow. Then open localhost:8080 in your browser to see all the examples aiflow provides! Note: both the default username & password are admin. Enjoy! Contribution
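Concept sketch for the Operator/Unit split described above. The class names and method signatures here are entirely hypothetical (this description does not document aiflow's actual interfaces); it only illustrates the idea of composing small reusable units into an operator.

# Hypothetical sketch -- not aiflow's real API.
class Unit:
    """A small, reusable calculation unit."""
    def process(self, record: dict) -> dict:
        raise NotImplementedError

class LowercaseUnit(Unit):
    def process(self, record: dict) -> dict:
        record["text"] = record["text"].lower()
        return record

class TokenizeUnit(Unit):
    def process(self, record: dict) -> dict:
        record["tokens"] = record["text"].split()
        return record

class Operator:
    """An operator chains many units; in aiflow this would plug into airflow."""
    def __init__(self, units):
        self.units = units

    def run(self, record: dict) -> dict:
        for unit in self.units:
            record = unit.process(record)
        return record

op = Operator([LowercaseUnit(), TokenizeUnit()])
print(op.run({"text": "Write Less, Reuse More"}))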
ai-flow
No description available on PyPI.
ai-flow-nightly
No description available on PyPI.
aiflows
🤖🌊 aiFlows embodies the Flows abstraction (arXiv) and greatly simplifies the design and implementation of complex (work)Flows involving humans, AI systems, and tools. It enables: 🧩 Modularity: Flows can be stacked like LEGO blocks into arbitrarily nested structures, with the complexity hidden behind a message-based interface. 🤝 Reusability: Flows can be shared publicly on the FlowVerse, and readily downloaded and reused as part of different Flows. 🔀 Concurrency: being consistent with the Actor model of concurrent computation, Flows are concurrency friendly – a necessary feature for a multi-agent future.

Flows in a Nutshell: the framework is centered around Flows and messages. Flows represent the fundamental building block of computation. They are independent, self-contained, goal-driven entities able to complete a semantically meaningful unit of work. To exchange information, Flows communicate via a standardized message-based interface. Messages can be of any type the recipient Flow can process. The Flows framework exemplified: the first column depicts examples of tools (notably, in the Flows framework, AI systems correspond to tools). The second column depicts Atomic Flows, effectively minimal wrappers around tools, constructed from the example tools. The third column depicts examples of Composite Flows, which define structured interactions between Atomic or Composite Flows. The fourth column illustrates a specific Composite competitive-coding Flow like those used in the experiments in the paper. The fifth column outlines the structure of a hypothetical Flow defining a meta-reasoning process that could support autonomous behavior.

FlowVerse in a Nutshell: the FlowVerse is a repository of Flows (powered by the 🤗 HuggingFace hub) created and shared by our community for everyone to use! With aiFlows, Flows can be readily downloaded, used, extended, or composed into novel, more complex Flows. For instance, sharing a Flow that uses only API-based tools (tools subsume models in the Flows abstraction) is as simple as sharing a config file (e.g., here is the AutoGPT Flow on FlowVerse). For those using ChatGPT, you could think of them as completely customizable open-source GPTs(++). The FlowVerse is continuously growing. To explore the currently available Flows, check out the 🤲│flow-sharing forum on the Discord server. Additionally, the Tutorials and Detailed Examples in the Getting Started section cover some of the Flows we provide in more detail (e.g., the ChatAtomicFlow and QA, VisionAtomicFlow and VisualQA, ReAct and ReAct with human feedback, AutoGPT, etc.).

Why should I use aiFlows? AI is set to revolutionize the way we work. Our mission is to support AI researchers and to allow them to seamlessly share advancements with practitioners. This will establish a feedback loop, guiding progress toward beneficial directions while ensuring that everyone can freely access and benefit from the next-generation AI tools. As a researcher, you will benefit from: the ability to design, implement, and study arbitrarily complex interactions; complete control and customizability (e.g., the tools, the specific Flows and the information they have access to, the choice of models and their deployment, etc.); the ability to readily reproduce, reuse, or build on top of Flows shared on the FlowVerse and to systematically study them across different settings (the infrastructure in the cc_flows repository could be a useful starting point for future studies); and the ability to readily make your work accessible to practitioners and other researchers and to access their feedback. As a practitioner, you will benefit from: the ability to design and implement arbitrarily complex interactions; complete control and customizability (e.g., the tools, the specific Flows and the information they have access to, the choice of models and their deployment, etc.); the ability to readily reuse or build on top of Flows shared on the FlowVerse; and direct access to any advancements in the field. To develop the next-generation AI tools and at the same time maximally benefit from them, developers and researchers need to have complete control over their workflows – aiFlows strives to empower you to make each Flow your own! See the contribute section for more information.

Installation: the library requires Python 3.10+. To install the library, run the following command:

pip install aiflows

Other installation options: install the bleeding-edge version:

git clone [email protected]:epfl-dlab/aiflows.git
cd aiflows
pip install -e .

Getting Started. Quick start (🕓 5 min): here you'll see how you can run inference with your first question-answering Flow, and how you can trivially switch between vastly different question-answering Flows thanks to the modular abstraction and the FlowVerse! Tutorial (🕓 20 min): in this tutorial, we introduce you to the library's features through a walkthrough of how to build useful Flows of gradually increasing complexity. Starting from a vanilla QA Flow, we'll first extend it to a ReAct Flow, then ReAct with human feedback, and finish the tutorial with a version of AutoGPT! Developer's Guide (🕓 10 min): we are constantly optimizing our Flow development workflow (pun intended :). In this short guide, we share our best tips so that you don't have to learn the hard way. Detailed Examples: many of the recently proposed prompting and collaboration strategies involving tools, humans, and AI models are, in essence, specific Flows (see the figure below).
In the link above, you'll find a detailed walkthrough of how to build some representative workflows.

Contribute: as mentioned above, our goal is to make Flows a community-driven project that will benefit researchers and developers alike (see the "Why should I use aiFlows?" section), and to achieve this goal we need your help. You can become a part of the project in a few ways: contribute to the aiFlows codebase, which will directly improve the library and benefit everyone using it; contribute to the FlowVerse: by making your work accessible to everyone, others might improve your work and build on it, or you can build on others' work; use the library in your creative projects, push it to its limits, and share your feedback: the proof of the pudding is in the eating, and the best way to identify promising directions, as well as important missing features, is by experimenting; and last but not least, ⭐ the repository and 📣 share aiFlows with your friends and colleagues; spread the word ❤️. We will support the community in the best way we can, but also lead by example. In the coming weeks, we will share: a roadmap for the library (FlowViz; FlowStudio; improved flexibility, developer experience, and support for concurrency, etc. – feedback and help would be greatly appreciated!); write-ups outlining features, ideas, and our long-term vision for Flows – we encourage you to pick up any of these and start working on them in whatever way you see fit; and a version of JARVIS – your fully customizable open-source version of ChatGPT+(++), which we will continue building in public! We hope that this excites you as much as it excites us, and that JARVIS will become one of those useful projects that constantly push the boundaries of what's possible with Flows. We have tried to find a way for anyone to benefit by contributing to the project. The Contribution Guide describes our envisioned workflows in more detail (we would love to hear your feedback on this – the Discord server already has a channel for it :). In a nutshell, this is just the beginning, and we have a long way to go. Stay tuned, and let's work on a great (open-source) AI future together!

Contributors: made with contrib.rocks.

Citation: to reference the 🤖🌊 aiFlows library, please cite the paper "Flows: Building Blocks of Reasoning and Collaborating AI":

@misc{josifoski2023flows,
  title={Flows: Building Blocks of Reasoning and Collaborating AI},
  author={Martin Josifoski and Lars Klein and Maxime Peyrard and Yifei Li and Saibo Geng and Julian Paul Schnitzler and Yuxing Yao and Jiheng Wei and Debjit Paul and Robert West},
  year={2023},
  eprint={2308.01285},
  archivePrefix={arXiv},
  primaryClass={cs.AI}
}
aiflux
No description available on PyPI.
aiforce
AI-Force: a machine learning toolbox. Install: you can install aiforce with pip from a terminal window with the following command:

pip install aiforce

How to use: Fill me in please! Don't forget code examples:

1+1
# output: 2
ai-forge
Failed to fetch description. HTTP Status Code: 404
aiforges
aiforge
aiformat
No description available on PyPI.
aiforthechurch
AI for the Church. Modern LLMs are rooted in secular value systems that are often misaligned with religious organisations. This PyPI package allows anyone to train and deploy doctrinally correct LLMs based on Llama 2. Effectively, we are aligning models to a set of values. Model fine-tuning:

from aiforthechurch import align_llama2

doctrinal_dataset = "/path/to/csv"
align_llama2(doctrinal_dataset)

aiforthechurch is integrated with HuggingFace such that the aligned model will be automatically pushed to your HuggingFace repo of choice at the end of training. At aiforthechurch.org we provide tools for generating doctrinal datasets; a few examples are available at huggingface.co/AiForTheChurch. Model inference and deployment: we implemented an inference API in the same format as OpenAI's:

import aiforthechurch
aiforthechurch.Completion.create(denomination="catholic", message="Does Jesus love me?")

There is also an asynchronous streaming API; just set stream=True. You can also use our code to create your own inference server with the models you train, by editing the MODELS dictionary in gloohack/deployment/prod_models.py with a denomination and a path to your model on HuggingFace. Start the server by running the following command on your machine: python gloohack/deployment/inference.py. Model training requirements: if you wish to train your models using this repo, you will need access to a machine with over 16GB of GPU memory and 30GB of RAM. The full model weights for Llama2-7B amount to almost 30GB, but we use parameter-efficient fine-tuning (PEFT) with LoRA to save memory and avoid catastrophic forgetting during the fine-tuning procedure. References: we leaned heavily on the open-source libraries transformers, peft, and bitsandbytes for this project. Dettmers, Tim, Mike Lewis, Younes Belkada, and Luke Zettlemoyer. 2022. "LLM.int8(): 8-bit Matrix Multiplication for Transformers at Scale." arXiv preprint arXiv:2208.07339. Hu, Edward J., Yelong Shen, Phillip Wallis, Zeyuan Allen-Zhu, Yuanzhi Li, Shean Wang, Lu Wang, and Weizhu Chen. 2021. "LoRA: Low-Rank Adaptation of Large Language Models." arXiv preprint arXiv:2106.09685.
ai-forward
中文 | English. OpenAI Forward: an OpenAI API forwarding service. The fastest way to deploy OpenAI API forwarding. Features | Deployment Guide | Applications | Configuration Options | Chat Logs. This project addresses the problem that some regions cannot access OpenAI directly: deploy the service on a (cloud) server that can reach the OpenAI API normally, and forward OpenAI requests through it, i.e. set up a reverse proxy. Multiple OpenAI API keys can be combined into a round-robin pool, and custom API keys can be issued for secondary distribution. Long-term proxy addresses run by this project: https://api.openai-forward.com https://cloudflare.worker.openai-forward.com https://cloudflare.page.openai-forward.com https://vercel.openai-forward.com https://render.openai-forward.com https://railway.openai-forward.com

Features. Basic: forwards all OpenAI endpoints; supports streaming responses; supports custom forwarding route prefixes; Docker deployment; pip install deployment; one-click Railway deployment; one-click Render deployment; Cloudflare deployment; one-click Vercel deployment. Advanced: multiple OpenAI API keys forming a round-robin pool; custom forward API keys (see advanced configuration); streaming responses; chat logs; multi-endpoint forwarding.

Deployment guide: 👉 see the deployment documentation. The following deployment options are provided. With an overseas VPS: pip install deployment or Docker deployment (https://api.openai-forward.com). Free options without a VPS: Railway deployment (https://railway.openai-forward.com) and one-click Render deployment (https://render.openai-forward.com). The deployments below provide plain forwarding only: one-click Vercel deployment (https://vercel.openai-forward.com) and Cloudflare deployment (https://cloudflare.page.openai-forward.com).

Applications. Chat application: build your own chatgpt service based on the open-source project ChatGPT-Next-Web, replacing BASE_URL in the docker start command with the address of your own proxy service:

docker run -d \
  -p 3000:3000 \
  -e OPENAI_API_KEY="sk-******" \
  -e BASE_URL="https://api.openai-forward.com" \
  -e CODE="******" \
  yidadaa/chatgpt-next-web

Use in code. Python:

import openai
+ openai.api_base = "https://api.openai-forward.com/v1"
openai.api_key = "sk-******"

More examples. JS/TS:

import { Configuration } from "openai";
const configuration = new Configuration({
+ basePath: "https://api.openai-forward.com/v1",
  apiKey: "sk-******",
});

gpt-3.5-turbo:

curl https://api.openai-forward.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-******" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

Image Generation (DALL-E):

curl --location 'https://api.openai-forward.com/v1/images/generations' \
  --header 'Authorization: Bearer sk-******' \
  --header 'Content-Type: application/json' \
  --data '{
    "prompt": "A photo of a cat",
    "n": 1,
    "size": "512x512"
  }'

Configuration options. Settings can be supplied in two ways: as arguments to the aifd run command (e.g. --port=8000), or as environment variables. The command-line arguments can be listed with aifd run --help. aifd run options: --port, service port, default 8000; --workers, number of worker processes, default 1; --openai_base_url, same as OPENAI_BASE_URL; --openai_route_prefix, same as OPENAI_ROUTE_PREFIX; --api_key, same as OPENAI_API_KEY; --forward_key, same as FORWARD_KEY; --extra_base_url, same as EXTRA_BASE_URL; --extra_route_prefix, same as EXTRA_ROUTE_PREFIX; --log_chat, same as LOG_CHAT, default False. Environment variables (also read from a .env file in the working directory): OPENAI_BASE_URL, the upstream OpenAI API address, default https://api.openai.com; OPENAI_ROUTE_PREFIX, route prefix for the OpenAI-format endpoints, default /; OPENAI_API_KEY, default OpenAI API key(s), multiple keys starting with sk- separated by commas, no default; FORWARD_KEY, key(s) callers may use in place of the OpenAI API key, multiple forward keys separated by commas, no default (note: if OPENAI_API_KEY is set but FORWARD_KEY is not, clients need no key at all when calling, so for security reasons leaving FORWARD_KEY empty is not recommended); EXTRA_BASE_URL, address of an extra forwarded service, no default; EXTRA_ROUTE_PREFIX, route prefix of the extra forwarded service, no default; LOG_CHAT, whether to log chat content, default false.

Advanced configuration. Set the OpenAI api_key to a custom forward key: this requires configuring both OPENAI_API_KEY and FORWARD_KEY, for example:

OPENAI_API_KEY=sk-*******
FORWARD_KEY=fk-******  # the fk- token here is defined by us

With FORWARD_KEY set to fk-******, clients only need to set their OPENAI_API_KEY to our custom fk-****** when calling the service. The benefit is that when using third-party applications that require an OPENAI_API_KEY, we can hand out the custom key fk-****** without worrying about the real OPENAI_API_KEY being leaked, and fk-****** can be distributed to others. Example:

curl https://api.openai-forward.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer fk-******" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

Python:

import openai
+ openai.api_base = "https://api.openai-forward.com/v1"
- openai.api_key = "sk-******"
+ openai.api_key = "fk-******"

Web application:

docker run -d \
  -p 3000:3000 \
  -e OPENAI_API_KEY="fk-******" \
  -e BASE_URL="https://api.openai-forward.com" \
  -e CODE="<your password>" \
  yidadaa/chatgpt-next-web

Multi-route forwarding: services at different addresses can be forwarded to different routes on the same port; essentially any service can be forwarded. See .env.example for examples.

Chat logs: chat content is not logged by default; to enable logging, set the environment variable LOG_CHAT=true. Logs are saved under Log/chat in the current directory, in this format:

{'messages': [{'user': 'hi'}], 'model': 'gpt-3.5-turbo', 'forwarded-for': '', 'uid': '467a17ec-bf39-4b65-9ebd-e722b3bdd5c3', 'datetime': '2023-07-18 14:01:21'}
{'assistant': 'Hello there! How can I assist you today?', 'uid': '467a17ec-bf39-4b65-9ebd-e722b3bdd5c3'}
{'messages': [{'user': 'Hello!'}], 'model': 'gpt-3.5-turbo', 'forwarded-for': '', 'uid': 'f844d156-e747-4887-aef8-e40d977b5ee7', 'datetime': '2023-07-18 14:01:23'}
{'assistant': 'Hi there! How can I assist you today?', 'uid': 'f844d156-e747-4887-aef8-e40d977b5ee7'}

Convert them to JSON format with aifd convert, which produces chat.json:

[{"datetime": "2023-07-18 14:01:21", "forwarded-for": "", "model": "gpt-3.5-turbo", "messages": [{"user": "hi"}], "assistant": "Hello there! How can I assist you today?"},
 {"datetime": "2023-07-18 14:01:23", "forwarded-for": "", "model": "gpt-3.5-turbo", "messages": [{"user": "Hello!"}], "assistant": "Hi there! How can I assist you today?"}]

Backer and Sponsor. License: OpenAI-Forward is licensed under the MIT license.
aifrenz
AI Frenz: an open-source machine learning library
ai.fri3d
No description available on PyPI.
aifrruflowmeter
What is AifrruFlowMeter: AifrruFlowMeter is an open-source network traffic flow generator that builds biflows from pcap files and extracts features from these flows. It can be used to generate bidirectional flows, where the first packet determines the forward (source to destination) and backward (destination to source) directions, so the statistical time-related features can be calculated separately in the forward and backward directions. Additional functionality includes selecting features from the list of existing features, adding new features, and controlling the duration of the flow timeout. NOTE: TCP flows are usually terminated upon connection teardown (by a FIN packet), while UDP flows are terminated by a flow timeout. The flow timeout value can be assigned arbitrarily by the individual scheme, e.g., 600 seconds for both TCP and UDP.
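To make the biflow and flow-timeout behaviour concrete, here is a small self-contained sketch of the grouping logic described above. It is illustrative Python only, not AifrruFlowMeter's actual implementation (which reads pcap files); the packet tuples are assumed inputs.

def group_biflows(packets, timeout=600.0):
    """Group (ts, src, sport, dst, dport, proto) packets into bidirectional flows.

    The first packet of a flow fixes the forward direction, and an idle gap
    longer than `timeout` seconds starts a new flow, as described above.
    """
    flows = {}   # (canonical_key, instance_no) -> {"fwd": [ts, ...], "bwd": [ts, ...]}
    state = {}   # canonical_key -> (forward_5tuple, last_ts, instance_no)
    for ts, src, sport, dst, dport, proto in packets:
        five = (src, sport, dst, dport, proto)
        rev = (dst, dport, src, sport, proto)
        key = min(five, rev)                 # direction-independent flow identity
        fwd, last, n = state.get(key, (five, ts, 0))
        if ts - last > timeout:              # flow timeout: start a new flow instance
            n, fwd = n + 1, five             # this packet re-fixes "forward"
        state[key] = (fwd, ts, n)
        bucket = flows.setdefault((key, n), {"fwd": [], "bwd": []})
        bucket["fwd" if five == fwd else "bwd"].append(ts)
    return flows

# Two packets of one conversation: the reply lands in the backward direction.
pkts = [(0.0, "10.0.0.1", 1234, "10.0.0.2", 80, "TCP"),
        (0.1, "10.0.0.2", 80, "10.0.0.1", 1234, "TCP")]
print(group_biflows(pkts))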
aifs
AI Filesystem: local semantic search over folders. Why didn't this exist?

pip install aifs
pip install unstructured[all-docs]  # If you want to parse all doc types. Includes large packages!

from aifs import search

search("How does AI Filesystem work?", path="/path/to/folder")
search("It's not unlike how Spotlight works.")  # Path defaults to CWD

How it works: running aifs.search will chunk and embed all nested supported files (.txt, .py, .sh, .docx, .pptx, .jpg, .png, .eml, .html, and .pdf) in path. It will then store these embeddings into an _.aifs file in path. By storing the index, you only have to chunk/embed once. This makes semantic search very fast after the first time you search a path. If a file has changed or been added, aifs.search will update or add those chunks. We still need to handle file deletions (we welcome PRs). In detail: if a folder hasn't been indexed, we first use unstructured to parse and chunk every file in the path. Then we use chroma to embed the chunks locally and save them to an _.aifs file in path. Finally, chroma is used again to semantically search the embeddings. If an _.aifs file is found in a directory, it uses that instead of indexing again; if some files have been updated, it will re-index those. Goals: we should always have SOTA parsing and chunking; the logic for this should be swapped out as new methods arise. Chunking should be semantic — as in, python and markdown files should have different chunking algorithms based on the expected content of those filetypes. Who has this solution? For parsing, I think Unstructured is the best of the best. Is this true? We should always have SOTA embedding. If a better local embedding model is found, we should automatically download and use it. I think Chroma will always do this (is this true?) so we depend on Chroma. This project should stay minimally scoped — we want aifs to be the best local semantic search in the universe. Why? We built this to let open-interpreter quickly semantically search files/folders.
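The index-once pattern described above can be illustrated directly with chroma, which aifs uses under the hood. This sketch is not aifs' internal code; the path and collection names are made up.

import chromadb

# Persist the index on disk so chunking/embedding happens only once --
# the same idea as aifs' _.aifs file.
client = chromadb.PersistentClient(path="./_demo_index")
collection = client.get_or_create_collection("docs")

chunks = ["aifs stores embeddings on disk next to the data.",
          "Semantic search is fast after the first indexing run."]
collection.add(documents=chunks, ids=[f"chunk-{i}" for i in range(len(chunks))])

# Later (or in another process): query without re-embedding the corpus.
results = collection.query(query_texts=["How does AI Filesystem work?"], n_results=1)
print(results["documents"])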
aifs-nni
简体中文. NNI (Neural Network Intelligence) is a lightweight but powerful toolkit to help users automate Feature Engineering, Neural Architecture Search, Hyperparameter Tuning and Model Compression. The tool manages automated machine learning (AutoML) experiments, and dispatches and runs the experiments' trial jobs generated by tuning algorithms to search for the best neural architecture and/or hyper-parameters in different training environments like Local Machine, Remote Servers, OpenPAI, Kubeflow, FrameworkController on K8S (AKS etc.), DLWorkspace (aka. DLTS), AML (Azure Machine Learning) and other cloud options. Who should consider using NNI: those who want to try different AutoML algorithms in their training code/model; those who want to run AutoML trial jobs in different environments to speed up search; researchers and data scientists who want to easily implement and experiment with new AutoML algorithms, be it a hyperparameter tuning algorithm, a neural architecture search algorithm or a model compression algorithm; and ML platform owners who want to support AutoML in their platform. NNI v1.9 has been released!

NNI capabilities in a glance: NNI provides a command-line tool as well as a user-friendly WebUI to manage training experiments. With the extensible API, you can customize your own AutoML algorithms and training services. To make things easy for new users, NNI also provides a set of built-in state-of-the-art AutoML algorithms and out-of-the-box support for popular training platforms. The following summarizes the current NNI capabilities; we are gradually adding new capabilities and we'd love to have your contribution. Frameworks & libraries: built-in supported frameworks: PyTorch, Keras, TensorFlow, MXNet, Caffe2, more...; supported libraries: Scikit-learn, XGBoost, LightGBM, more...; examples: MNIST-pytorch, MNIST-tensorflow, MNIST-keras, Auto-gbdt, Cifar10-pytorch, Scikit-learn, EfficientNet, Kernel Tuning, more... Algorithms: hyperparameter tuning: exhaustive search (Random Search, Grid Search, Batch), heuristic search (Naïve Evolution, Anneal, Hyperband, PBT), Bayesian optimization (BOHB, TPE, SMAC, Metis Tuner, GP Tuner), RL-based (PPO Tuner); neural architecture search: ENAS, DARTS, P-DARTS, CDARTS, SPOS, ProxylessNAS, Network Morphism, TextNAS; model compression: pruning (AGP Pruner, Slim Pruner, FPGM Pruner, NetAdapt Pruner, SimulatedAnnealing Pruner, ADMM Pruner, AutoCompress Pruner) and quantization (QAT Quantizer, DoReFa Quantizer); feature engineering (beta): GradientFeatureSelector, GBDTSelector; early stop algorithms: Median Stop, Curve Fitting. Training services: Local Machine, Remote Servers, AML (Azure Machine Learning), and Kubernetes-based services: OpenPAI, Kubeflow, FrameworkController on K8S (AKS etc.), DLWorkspace (aka. DLTS). References: Python API, NNI Annotation, Supported OS; Customize Tuner, Customize Assessor, Install Customized Algorithms as Builtin Tuners/Assessors/Advisors; Support TrainingService, Implement TrainingService.

Installation: NNI supports and is tested on Ubuntu >= 16.04, macOS >= 10.14.1, and Windows 10 >= 1809. Simply run the following pip install in an environment that has python 64-bit >= 3.6. Linux or macOS: python3 -m pip install --upgrade nni. Windows: python -m pip install --upgrade nni. If you want to try the latest code, please install NNI from source code. For detailed system requirements of NNI, please refer to the docs for Linux & macOS and for Windows. Note: if there is any privilege issue, add --user to install NNI in the user directory. Currently, NNI on Windows supports local, remote and pai mode. Anaconda or Miniconda is highly recommended for installing NNI on Windows. If there is any error like "Segmentation fault", please refer to the FAQ.
For FAQs on Windows, please refer to NNI on Windows. Verify installation: the following example is built on TensorFlow 1.x; make sure TensorFlow 1.x is used when running it. Download the examples by cloning the source code: git clone -b v1.9 https://github.com/Microsoft/nni.git. Run the MNIST example. Linux or macOS: nnictl create --config nni/examples/trials/mnist-tfv1/config.yml. Windows: nnictl create --config nni\examples\trials\mnist-tfv1\config_windows.yml. Wait for the message "INFO: Successfully started experiment!" in the command line; this message indicates that your experiment has been successfully started. You can explore the experiment using the Web UI url.

INFO: Starting restful server...
INFO: Successfully started Restful server!
INFO: Setting local config...
INFO: Successfully set local config!
INFO: Starting experiment...
INFO: Successfully started experiment!
-----------------------------------------------------------------------
The experiment id is egchD4qy
The Web UI urls are: http://223.255.255.1:8080 http://127.0.0.1:8080
-----------------------------------------------------------------------
You can use these commands to get more information about the experiment
-----------------------------------------------------------------------
commands                     description
1. nnictl experiment show    show the information of experiments
2. nnictl trial ls           list all of trial jobs
3. nnictl top                monitor the status of running experiments
4. nnictl log stderr         show stderr log content
5. nnictl log stdout         show stdout log content
6. nnictl stop               stop an experiment
7. nnictl trial kill         kill a trial job by id
8. nnictl --help             get help information about nnictl
-----------------------------------------------------------------------

Open the Web UI url in your browser; you can view detailed information about the experiment and all the submitted trial jobs as shown below. Here are more Web UI pages.

Documentation: to learn what NNI is, read the NNI Overview. To get familiar with how to use NNI, read the documentation. To get started and install NNI on your system, please refer to Install NNI.

Contributing: this project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com. When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA. This project has adopted the Microsoft Open Source Code of Conduct.
For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments. After getting familiar with the contribution agreements, you are ready to create your first PR =); follow the NNI developer tutorials to get started. We recommend new contributors start with simple issues: 'good first issue' or 'help-wanted'. NNI developer environment installation tutorial; How to debug: if you have any questions on usage, review the FAQ first; if there are no relevant issues or answers to your question, try contacting the NNI dev team and users on Gitter or file an issue on GitHub. Customize your own Tuner; Implement a customized TrainingService; Implement a new NAS trainer on NNI; Customize your own Advisor.

External repositories and references: with the authors' permission, we list a set of NNI usage examples and relevant articles. External repositories: run ENAS with NNI; Automatic Feature Engineering with NNI; Hyperparameter Tuning for Matrix Factorization with NNI; scikit-nni, hyper-parameter search for scikit-learn pipelines using NNI. Relevant articles: Hyper Parameter Optimization Comparison; Neural Architecture Search Comparison; Parallelizing a Sequential Algorithm TPE; Automatically tuning SVD with NNI; Automatically tuning SPTAG with NNI; Find thy hyper-parameters for scikit-learn pipelines using Microsoft NNI; Blog (in Chinese) - AutoML tools (Advisor, NNI and Google Vizier) comparison by @gaocegege, the summary-and-analysis section of the design and implementation of kubeflow/katib; Blog (in Chinese) - A summary of NNI's new capabilities in 2019 by @squirrelsc.

Feedback: file an issue on GitHub; ask a question with NNI tags on Stack Overflow; discuss on the NNI Gitter; join IM discussion groups: Gitter or WeChat.

Related projects: targeting openness and advancing state-of-the-art technology, Microsoft Research (MSR) has also released a few other open-source projects. OpenPAI: an open-source platform that provides complete AI model training and resource management capabilities; it is easy to extend and supports on-premise, cloud and hybrid environments at various scales. FrameworkController: an open-source general-purpose Kubernetes Pod Controller that orchestrates all kinds of applications on Kubernetes with a single controller. MMdnn: a comprehensive, cross-framework solution to convert, visualize and diagnose deep neural network models; the "MM" in MMdnn stands for model management and "dnn" is an acronym for deep neural network. SPTAG: Space Partition Tree And Graph (SPTAG) is an open-source library for large-scale vector approximate nearest-neighbor search scenarios. We encourage researchers and students to leverage these projects to accelerate AI development and research.

License: the entire codebase is under the MIT license.
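As a sketch of what a trial script looks like from the NNI side: nni.get_next_parameter and the report calls below are NNI's documented trial API, the script is normally launched through nnictl with a search-space config, and the training function here is only a stand-in.

import nni

def train_one_epoch(lr: float, epoch: int) -> float:
    # Stand-in for real training code; returns a fake accuracy.
    return min(0.5 + 0.04 * epoch + lr, 1.0)

params = nni.get_next_parameter()        # hyperparameters chosen by the tuner
lr = params.get("learning_rate", 0.01)   # fall back to a default when run standalone

acc = 0.0
for epoch in range(10):
    acc = train_one_epoch(lr, epoch)
    nni.report_intermediate_result(acc)  # lets assessors early-stop bad trials

nni.report_final_result(acc)             # the metric the tuner optimizes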
aifunc
No description available on PyPI.
ai-functions
Python AI Functions: a simple library that converts Python functions into a JSON-schema description of those functions, suitable for use with AI libraries. Installation: pip install ai-functions. Usage, for example:

from typing import Annotated
from ai_functions import get_openai_functions, execute_function

def search_web(query: Annotated[str, "google formatted keywords to search for"]):
    """Search the web"""

print(openai_function_dict([search_web]))

Also, if you get a response.function_call from openai, you can use execute_function:

openai_function_execute([search_web], function_call)

Or, if you have the name and arguments split out already:

function_execute([search_web], name, arguments)

Finally, if you want a container to handle this:

from ai_functions import AIFunctions

container = AIFunctions([search_web, add_calendar_entry])
functions = container.openai_dict()
subset_functions = container.openai_dict(["search_web"])
container.execute("search_web", {"query": "top web hosting sites"})
container.openai_execute({"name": "search_web", "arguments": "{\"query\":\"top web hosting sites\"}"})

What stuff does this handle? Converts your annotated schema into an appropriate prompt. Handles converting arguments to JSON if they are specified as a string. Auto-casts arguments to the right types if they aren't right. Raises errors that AI engines understand if returned as a function response, instead of errors with poor descriptions. Async execute: if a loop is provided to the AIFunctions constructor, or to any execute call, it will be used to schedule a coroutine. Async versions of execute are available; prefix all calls with async_. Some fine print: if you want a parameter that is still seen as "valid" but isn't part of the schema, you can annotate it with None as the description. But this is really an "enforcement" thing and might not belong in this library. Dealing with context is a beast in chat apps, so more work here might be helpful. For example, I use meta-functions that unlock others to prevent context bloat. Might put that in another lib soon, or put it here.
aifunctools
Functools for OpenAI GPT FunctionsOpenAI just released GPT Functions:https://openai.com/blog/function-calling-and-other-api-updatesBut it's really cumbersome to make the actual function signature. And manually parse results and chain the calls:curlhttps://api.openai.com/v1/chat/completions-u:$OPENAI_API_KEY-H'Content-Type: application/json'-d'{"model": "gpt-3.5-turbo-0613","messages": [{"role": "user", "content": "What is the weather like in Boston?"}],"functions": [{"name": "get_current_weather","description": "Get the current weather in a given location","parameters": {"type": "object","properties": {"location": {"type": "string","description": "The city and state, e.g. San Francisco, CA"},"unit": {"type": "string","enum": ["celsius", "fahrenheit"]}},"required": ["location"]}}]}'Use aifunctools insteadInstead, aifunctools automatically parses available python type annotations and docstrings so you don't have to do all of the manual work:fromaifunctools.openai_funcsimportcomplete_with_functionsdefget_current_weather(location:str,unit:str)->dict:"""Get the current weather in a given location:param location: The city and state, e.g. San Francisco, CA:param unit: Either "celsius" or "fahrenheit":return: the current temperature, unit, and a description"""return{"temperature":22,"unit":"celsius","description":"Sunny"}resp=complete_with_functions("What is the weather like in Boston?",get_current_weather)# The response should contain: "The weather in Boston is currently sunny with a temperature of 22 degrees Celsius."What's nextOpenAPI SpecsPython functions are great, but what would really be great is to parse openapi specs from remote services. Imagine where every service provider makes their api spec available and agents can just crawl these APIs and use aifunctools to chain them together.Richer typesFor this hack we only support the basic string / number / boolean types. By making this more flexible we can get much richer behavior.Error handlingI'm sure there are lots of cases where incomplete docstrings and type signatures will cause errors. We can make these more robust (or use some reasonable defaults).
aifund
aifund: AIfund is a Python-based toolkit for quantitative fund analysis. Version updates: pip install --upgrade aifund
aify
🚀 aify

Build your AI-native application in seconds.

Home | Documentation | Feedback

🛠️ AI-native application framework and runtime. Simply write a YAML file.
🤖 Ready-to-use AI chatbot UI.

Dependencies

- microsoft/guidance as the core prompt engine
- Uvicorn, Starlette, FastAPI as the server

Features

- Models: the LLM/transformer models supported by guidance
- Memory storage: local file / Google Cloud Datastore / user-defined
- Embeddings: OpenAI / user-defined
- Vector storage and search: local CSV files, Pandas DataFrame and NumPy in memory / user-defined
- Deployment: local / Google Cloud App Engine
- UI: chatbot web UI
- API: RESTful API / Python

Getting started

Welcome to aify, the AI-native application framework and runtime that lets you ship AI applications in seconds! With aify, you can build and deploy AI-powered applications from a simple YAML file. This guide walks you through creating your first AI application.

Installation

Make sure you have the following prerequisites installed on your system:

- Python 3.8 or higher
- The pip package manager

Then install aify:

```shell
pip install aify
```

Create your first app

Prepare a directory for your applications:

```shell
mkdir ./apps
```

Now start the aify service, then open http://localhost:2000 in a browser, and aify will greet you:

```shell
aify run ./apps
```

For now this is just a blank application that doesn't do anything yet. Next, we will create a chatbot.

Creating a YAML file

aify uses a YAML file to define your AI application. This file contains all the necessary configuration and settings. Here is an example of a basic YAML file:

```yaml
title: Chatbot
model:
  vendor: openai
  name: gpt-3.5-turbo
  params:
    api_key: <YOUR_OPENAI_API_KEY>
prompt: |
  {{#system~}}
  You are a helpful and terse assistant.
  {{~/system}}

  {{#each (memory.read program_name session_id n=3)}}
  {{~#if this.role == 'user'}}
  {{#user~}}
  {{this.content}}
  {{~/user}}
  {{/if~}}
  {{~#if this.role == 'assistant'}}
  {{#assistant~}}
  {{this.content}}
  {{~/assistant}}
  {{/if~}}
  {{~/each}}

  {{#user~}}
  {{prompt}}
  {{memory.save program_name session_id 'user' prompt}}
  {{~/user}}

  {{#assistant~}}
  {{gen 'answer' temperature=0 max_tokens=2000}}
  {{memory.save program_name session_id 'assistant' answer}}
  {{~/assistant}}
variables:
  - name: prompt
    type: input
  - name: answer
    type: output
```

A few notes on this YAML file:

- The title is the name of the application.
- The model section defines the AI model used by the application and the runtime parameters the model requires.
- The prompt section drives the application's execution. aify uses Microsoft's guidance package to run the AI program; guidance supports chain-of-thought-style execution. Because guidance uses the Handlebars template system, this section is in fact a Handlebars template.
- The prompt section contains helper functions that let the AI model change its runtime behavior dynamically, enabling more complex functionality. These functions are built into aify, but you can also write your own helper functions in Python for specific tasks.
- "system", "user", and "assistant" define the roles in an LLM-based chat task.
- memory.read and memory.save are built-in aify helper functions for loading and saving the conversation history between users and the AI.
- each and if are flow-control statements provided by Handlebars.
- gen is the guidance function that marks an LLM generation step.
- The variables section defines the application's input and output variables, which external systems use to access AI-generated data through an API.

Play with your AI app

Go back to your browser and refresh the page. You will see the application you just created, and you can have conversations with it, much like ChatGPT.

aify is not a chatbot

Although aify provides a chatbot interface, its main purpose is not to replace ChatGPT or compete with conversation applications. The chatbot UI exists mainly for convenient debugging of AI applications, though you can certainly use it as a chatbot day to day. aify's main goal is to provide an efficient framework for developing and deploying AI applications. If you want to build complex AI applications of your own, focus on the APIs and extension mechanisms aify provides.

📝 More examples: https://github.com/shellc/aify/tree/main/examples

Webui screenshot
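As a sketch of how an external system might consume the input/output variables over the REST API: the route and payload shape below are assumptions for illustration only, not aify's documented API — check the project documentation for the real endpoints.

```python
import requests

BASE = "http://localhost:2000"  # the local aify service started above

# Hypothetical route and payload, for illustration only.
resp = requests.post(
    f"{BASE}/api/apps/chatbot/run",
    json={"session_id": "demo", "prompt": "What is the weather like in Boston?"},
)
# 'answer' is the output variable declared in the YAML above.
print(resp.json().get("answer"))
```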
aiga-django-amqp
A Django-specific package for communicating with RabbitMQ.

Quick start

Add "aiga_amqp" to INSTALLED_APPS in settings.py:

```python
INSTALLED_APPS = [
    ...
    'aiga_amqp',
]
```

A.1 Message Sender

Example code for sending a message:

```python
from aiga_amqp.core import send_queue

send_queue('channel name', message='this is a message')
```

A.2 Competing Consumer

The following steps consume a message queue in a "competing" (round-robin) fashion:

1. In the app folder of your choice, add a consumer.py file.
2. In consumer.py, create a function to handle messages, as in the example below:

```python
def consumer(channel, method, properties, body):
    print('I just did something important here ...')
```

3. In apps.py, override the ready method as in the following example to run consumer.py:

```python
from django.apps import AppConfig


class ExampleConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'example'

    def ready(self) -> None:
        from aiga_amqp.core import AMQPListener
        from .consumer import consumer

        listener = AMQPListener()
        listener.consume(consumer, channel_name='your channel name')
```

4. Run Django.

B.1 Publish Message

Coming soon.

B.2 Subscribe Message

Coming soon.

Settings variables

Several variables can be set in settings.py; their default values are shown below:

```python
AIGA_AMQP = {
    'HOST': 'localhost',    # RabbitMQ host address
    'PORT': 5672,           # RabbitMQ port
    'CREDENTIAL': False,    # set to True when using USERNAME and PASSWORD
    'USERNAME': None,       # username for accessing RabbitMQ
    'PASSWORD': None,       # password for accessing RabbitMQ
    'HEARTBEAT': 600,
    'TIMEOUT': 300,
}
```
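As a small usage sketch building on the documented send_queue call: the view name, queue name, and payload below are placeholders, not part of the package.

```python
from django.http import JsonResponse

from aiga_amqp.core import send_queue


def notify(request):
    # 'notifications' is a placeholder channel name — use your own queue.
    send_queue('notifications', message='order #42 has shipped')
    return JsonResponse({'queued': True})
```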
aigar
Gym env that replicates agar.io

How to install

cd to this repo, then run:

```shell
pip install -e .
```

Or simply:

```shell
pip install aigar
```

How to use

Simply import aigar and import gym. The following easy environments are available:

- "AigarPellet-v0" — you control a single cell; the goal is to collect as many pellets as quickly as possible.
- "AigarGreedy1-v0" — you control a single cell, and another cell is controlled by a simple greedy heuristic; collect as many pellets as quickly as possible and eat the opponent as often as you can.

Many more options are available by following the naming scheme "Aigar[Pellet|Greedy[1|2|5]][Grid][Split][Eject]-v0". The number behind "Greedy" sets the number of greedy bots. With "Grid", a simplified, lower-dimensional observation space is used (not based on pixels). With "Split", the player cell can split itself, just as in agar.io. With "Eject", the player cell can eject some mass, just as in agar.io.

Observation space

By default, the observation space is an RGB image of size (900, 900, 3).

If the "Grid" option is used, e.g. "AigarPelletGrid-v0", an easier observation space applies: (11, 11, 3) for the "Pellet" variants or (11, 11, 4) for the "Greedy" variants. The first two dimensions give the size of the grid and the last dimension the number of grids. The first grid holds the pellet mass per grid cell; the second holds the combined mass of every player cell that is at least partially inside a grid cell; the third marks the playing-field boundary, with each grid cell receiving a value between 0 and 1 depending on how much of it lies outside the playing field. The additional grid in the Greedy variants holds the combined mass of every opponent cell (regardless of which opponent) that is at least partially inside that grid cell. A single player or opponent cell can therefore count toward multiple grid cells.

Action space

The action space consists of two dimensions: the x and y location of the cursor. Each is limited to the range 0-1, mapping over the whole space of possible cursor positions.

If the "Split" option is used, the action space grows by one continuous dimension: a value above 0.5 means the player cells split, otherwise they do not. The "Eject" option works the same way, but for ejecting. "Eject" cannot be selected without "Split"; there is no "AigarPelletEject-v0", only "AigarPelletSplitEject-v0".
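A minimal random-rollout sketch based on the environment names and action layout described above, assuming the classic gym step API (obs, reward, done, info):

```python
import gym
import numpy as np

import aigar  # importing aigar registers the Aigar environments with gym

env = gym.make("AigarPellet-v0")
obs = env.reset()  # default observation: (900, 900, 3) RGB image

done, total_reward = False, 0.0
while not done:
    # Two continuous actions: cursor (x, y), each in [0, 1].
    action = np.random.uniform(0.0, 1.0, size=2)
    obs, reward, done, info = env.step(action)
    total_reward += reward

print("episode return:", total_reward)
```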
ai-gateway
No description available on PyPI.
aigbook
No description available on PyPI.
aigc
🔥 aigc 🔥

Install

```shell
pip install -U aigc
```

Docs

Usage

```python
import aigc
```

TODO

History

0.0.0 (2023-02-15): First release on PyPI.
aigc-evals
torch_training: https://github.com/ssbuild/aigc_evals.git
aigc-zoo
torch_training: https://github.com/ssbuild/aigc_zoo.git