package (string, lengths 1 to 122) | package-description (string, lengths 0 to 1.3M)
---|---|
aishell
|
AiShell 🤖: a simple Python tool that connects to OpenAI's ChatGPT and executes the returned results. If you are interested in projects like this, please check out AiShell's brother project, YGK-a. YGK-a is a ChatGPT client for your terminal that also supports unix/linux pipelines.

Key Features 💡
Interact with your computer using natural language.
Automatically executes the command from the response of ChatGPT.
Good for complex tasks like handling Git and extracting tar files.
No need to search StackOverflow for commands; AiShell has got you covered.
AiShell simplifies the process of setting up and retrieving tokens or API keys. With AiShell, you don't have to worry about the technical details. Simply install AiShell, execute it, and you're ready to go!

Prerequisites 📚
Python 3.9+
ChatGPT account (or OpenAI account)

Getting Started 🚀
To begin using AiShell, start by installing it with pip:
pip install aishell
Once you've installed AiShell, you can start using it right away. For example, to print "Hello World" using AiShell, enter the following command:
aishell 'print Hello World'

Advanced Settings 🛠
By default, AiShell is configured to use the reverse-engineered ChatGPT client and retrieve login information from your browser, so you don't need to configure anything to use AiShell. However, if you want to use different models with an OpenAI API key, you can configure it as follows:
Create an account on OpenAI.
Go to https://platform.openai.com/account/api-keys and copy your API key.
Modify or create the ~/.ygka_openai_config.json file as follows:
{..., "language_model": "official_chatgpt", "openai_api_key": "<your OpenAI API key>"}
Here you can add your OpenAI API key. This will enable AiShell to use the official ChatGPT API and your API key when executing commands.

Contributions 💬
Feel free to contribute to AiShell by adding more functionality or fixing bugs.
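The config file above can also be assembled programmatically. The sketch below is not part of AiShell; the key names come from the JSON shown above, and it prints the contents instead of writing to ~/.ygka_openai_config.json so it has no side effects:

```python
import json
from pathlib import Path

# Key names taken from the README's example config; values are placeholders.
config = {
    "language_model": "official_chatgpt",
    "openai_api_key": "<your OpenAI API key>",
}

config_path = Path.home() / ".ygka_openai_config.json"
# Printed rather than written, so the sketch has no side effects:
print(f"Contents for {config_path}:")
print(json.dumps(config, indent=2))
```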
|
ai-shell
|
ai_shell: an OpenAI-centric shell for giving safe, chat-optimized filesystem access to an Assistant as a "tool". Even if you trust the bot to run bash directly on your machine or docker container, standard tools will run up your bill with excess tokens in the reply, or a command generates too few tokens and the bot doesn't know what is going on. This is an alternative to code_interpreter, tools running code in a docker container locally, or tools running arbitrary shell code locally.

Installation
pip install ai_shell

Usage
See these full examples. As long as the OPENAI_API_KEY environment variable is set, you can run these examples. Pylint bot will attempt to fix python code lint issues. Test writer bot will attempt to write unit tests for python code. Tool tester bot tries out tools to see if they basically work. To execute demo bots, run these commands and follow the initialization instructions if needed. They all expect to manipulate python code in a /src/ folder.
python -m ai_shell.demo_bots.docs_writer_bot
python -m ai_shell.demo_bots.pylint_bot
python -m ai_shell.demo_bots.test_writer_bot
python -m ai_shell.demo_bots.tool_tester_bot
python -m ai_shell.demo_bots.todo_bot

This is the python interface to the tools, i.e. how you're expected to wire up a tool to your bot:

import ai_shell

cat = ai_shell.CatTool(".")
print(cat.cat(["file.py"]))
print(cat.cat_markdown(["file.py"]))

ls = ai_shell.LsTool(".")
print(ls.ls("docs"))
print(ls.ls_markdown("docs"))

This is the smallest example to illustrate basic capabilities, also see here:

import asyncio
import ai_shell

async def main():
    def static_keep_going(toolkit: ai_shell.ToolKit):
        usage = toolkit.get_tool_usage_for("ls")
        if usage["count"] > 0:
            return "Great job! You've used ls. Summarize in paragraph form and we're done."
        return (
            "You haven't used the ls tool yet. Do you have access to the ls tool? If"
            " there is a problem report it to the report_text tool to end the session."
        )

    # Creates temporary bots
    bot = ai_shell.TaskBot(
        ai_shell.Config(),
        name="Folder inspection bot.",
        bot_instructions="Run the ls tool and tell me what you see.",
        model="gpt-3.5-turbo-1106",
        dialog_logger_md=ai_shell.DialogLoggerWithMarkdown("./tmp"),
    )
    await bot.initialize()
    the_ask = f"""You are in the './' folder. You do not need to guess the pwd, it is './'.
Run ls and tell me what you see in paragraph format."""
    await bot.basic_tool_loop(
        the_ask=the_ask,
        root_folder="./src",
        tool_names=["ls", "report_text"],
        keep_going_prompt=static_keep_going,
    )

if __name__ == "__main__":
    asyncio.run(main())

This is the cli interface, which is intended for testing, not for bot usage:
ais cat_markdown --file-paths pyproject.toml

Features in Brief
Many cli-like tool interfaces, such as ls, cat, grep, head, tail, and git.
OpenAI glue for all cli tools.
UX with a bot in mind.
Security with a mischievous, but not especially malicious, bot in mind.
Bot (Assistant) boilerplate help.
Support for bots doing one-shot tool use and goal-function-driven tool use.
Bots have extensibility points.
TODO: plugin system for tools.

Analogues supported today
Directories: ls, find
Files: cat, grep, head, tail
Editing: sed, ed, edlin, patch, replace, insert, rewrite, write new
Data: cut
Other: pycat, token counter, git
Tasking: todo

N.b. every file is read and written as utf-8 strings.

Prior Art
ai_shell draws inspiration from various command-line interface (CLI) tools and shell environments, integrating features from traditional shells with OpenAI's language models. It is designed to provide an easy and secure interface for AI-assisted file system interactions, keeping in mind both usability and safety.

Documentation
Features, Design, Use Cases, TODO, API docs (pdoc3 style).
|
aiShelModule
|
No description available on PyPI.
|
aishield
|
AIShield Python Integration Package. AIShield provides a Python convenience package to allow users to seamlessly integrate AIShield Vulnerability Assessment and Defense capabilities into their AI development workflows. Users will receive assessment reports, sample attack vectors, and a threat-informed defense model with telemetry connection to SIEM/SOAR, such as Splunk and Microsoft Sentinel.

Requirements
Requires Python >= 3.6 and pip >= 19.0.

Installation
$ pip install aishield

Details
Check out the Quick Start Example here. More reference implementations, tutorials, samples, and documentation of AIShield can be found on our Github Repository.

Pre-requisites:
The AIShield API should be white-listed, or proxy settings must be appropriately configured for the AIShield API to be called.
A valid AIShield API subscription plan and authentication keys. For details regarding subscription, please visit the Subscription Page or reach out to sales at [email protected].

Product Features:
Model Extraction attack Vulnerability Analysis and Threat-informed Defense Generation with relevant report artifacts for Image & Tabular Classification.
Supported for models trained on Tensorflow (Tensorflow >= 2.5.0 and <= 2.9.1).
Supported input model file formats: .h5, .pyc.
Assessment report formats available: PDF, XML, JSON, TXT.

More about AIShield
Website: https://www.boschaishield.com/
Email: [email protected]

Version History
0.1.5: Added vulnerability analysis for image segmentation: model extraction attack. Updated to be compatible with the latest AIShield API version. Also, api_key no longer needs to be provided explicitly for analysis; it will be generated from org_id, and policies are consumed accordingly.
0.1.4: Added vulnerability analysis for time series forecasting: model extraction attack.
0.1.3: Updated to be compatible with the latest AIShield API version. Added vulnerability analysis for tabular classification: model evasion attack.
0.1.2: Added vulnerability analysis for image classification: model evasion & model poisoning attacks. Added vulnerability analysis for tabular classification: model extraction attack.
0.1.1: Updated to be compatible with the latest AIShield API version.
0.1.0: Initial version. Added vulnerability analysis for the model extraction attack for the image_classification task. Prepare the vulnerability configs and send the model for analysis to the AIShield API. This will generate vulnerability analysis reports, threat-informed defense generation with SIEM/SOAR telemetry enabled, defense reports, and sample attack data artifacts.
|
aishvarya
|
No description available on PyPI.
|
aisim
|
AISim ‒ Simulations for light-pulse atom interferometry. AISim is a Python package for simulating light-pulse atom interferometers. It uses dedicated objects to model the laser beams, the atomic ensemble, and the detection system, and stores experimental parameters in a neat way. After you define these objects, you can use built-in propagators to simulate internal and external degrees of freedom of cold atoms.

Installation
The latest tagged release can be installed via pip with
pip install aisim
or via conda with
conda install -c conda-forge aisim
Alternatively, if you plan to make changes to the code, use
git clone https://github.com/bleykauf/aisim.git
cd aisim
python setup.py develop

Usage
For basic usage and code reference, see the documentation.

Examples
Some examples are provided in the form of Jupyter notebooks:
Effect of wavefront aberrations in atom interferometry
Rabi oscillations with a Gaussian beam and thermal atoms
Multiport atom interferometer

Contributing
Contributions are very welcome. If you want to help, check out our contributions guide.

Authors
Bastian Leykauf (https://github.com/bleykauf)
Sascha Vowe (https://github.com/savowe)

License
AISim ‒ Simulations for light-pulse atom interferometry. Copyright © 2023 B. Leykauf, S. Vowe. This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see https://www.gnu.org/licenses/.
|
aisimplekit
|
# aisimplekit
Simple lib for various machine learning and AI tasks.

# Installation
`python3.7 -m pip install aisimplekit`

# Contributions
Feel free to fork and add new features to this lib. Hope it helps ;)
|
aisi-od-training
|
This is a dummy package intended to prevent any dependency confusion attacks against our internal Python packages at SBB. For now this is the only foolproof way to prevent some dependency confusion attacks until the following PEP has been implemented: PEP-708. For any questions regarding this package, feel free to reach out to [email protected].
|
aisi-pip-skeleton
|
This is a dummy package intended to prevent any dependency confusion attacks against our internal Python packages at SBB. For now this is the only foolproof way to prevent some dependency confusion attacks until the following PEP has been implemented: PEP-708. For any questions regarding this package, feel free to reach out to [email protected].
|
aisi-pip-utils
|
This is a dummy package intended to prevent any dependency confusion attacks against our internal Python packages at SBB. For now this is the only foolproof way to prevent some dependency confusion attacks until the following PEP has been implemented: PEP-708. For any questions regarding this package, feel free to reach out to [email protected].
|
aisi-skeleton
|
This is a dummy package intended to prevent any dependency confusion attacks against our internal Python packages at SBB. For now this is the only foolproof way to prevent some dependency confusion attacks until the following PEP has been implemented: PEP-708. For any questions regarding this package, feel free to reach out to [email protected].
|
aisi-trackoffset-estimator
|
This is a dummy package intended to prevent any dependency confusion attacks against our internal Python packages at SBB. For now this is the only foolproof way to prevent some dependency confusion attacks until the following PEP has been implemented: PEP-708. For any questions regarding this package, feel free to reach out to [email protected].
|
aisi-utils
|
This is a dummy package intended to prevent any dependency confusion attacks against our internal Python packages at SBB. For now this is the only foolproof way to prevent some dependency confusion attacks until the following PEP has been implemented: PEP-708. For any questions regarding this package, feel free to reach out to [email protected].
|
aislab
|
No description available on PyPI.
|
aislib
|
No description available on PyPI.
|
ais-libpythonpro
|
libpythonpro
A module to exemplify building Python projects in the PyTools course. Python 3 is supported.

To install:
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements-dev.txt

To check code quality:
flake8
|
aisling-connector
|
Binance Public API Connector Python. This is a lightweight library that works as a connector to the Binance public API.

Supported APIs:
/api/*
/sapi/*
Spot Websocket Market Stream
Spot User Data Stream
Inclusion of test cases and examples
Customizable base URL, request timeout and HTTP proxy
Response metadata can be displayed

Installation
pip install binance-connector

Documentation
https://binance-connector.readthedocs.io

RESTful APIs
Usage examples:

from binance.spot import Spot

client = Spot()
print(client.time())

client = Spot(key='<api_key>', secret='<api_secret>')

# Get account information
print(client.account())

# Post a new order
params = {
    'symbol': 'BTCUSDT',
    'side': 'SELL',
    'type': 'LIMIT',
    'timeInForce': 'GTC',
    'quantity': 0.002,
    'price': 9500
}
response = client.new_order(**params)
print(response)

Please see the examples folder to check for more endpoints.

Testnet
The Spot Testnet is available; it can be used to test /api/* endpoints. /sapi/* endpoints are not available. No UI. Steps to set up a testnet API key: https://dev.binance.vision/t/99
To use testnet:

from binance.spot import Spot as Client

client = Client(base_url='https://testnet.binance.vision')
print(client.time())

Base URL
If base_url is not provided, it defaults to api.binance.com. It's recommended to pass in the base_url parameter, even in production, as Binance provides alternative URLs in case of performance issues:
https://api1.binance.com
https://api2.binance.com
https://api3.binance.com

Optional parameters
PEP8 suggests lowercase with words separated by underscores, but for this connector, the methods' optional parameters should follow their exact naming as in the API documentation.

# Recognised parameter name
response = client.cancel_oco_order('BTCUSDT', orderListId=1)

# Unrecognised parameter name
response = client.cancel_oco_order('BTCUSDT', order_list_id=1)

RecvWindow parameter
An additional parameter, recvWindow, is available for endpoints requiring a signature. It defaults to 5000 (milliseconds) and can be any value lower than 60000 (milliseconds). Anything beyond the limit will result in an error response from the Binance server.

from binance.spot import Spot as Client

client = Client(key, secret)
response = client.get_order('BTCUSDT', orderId=11, recvWindow=10000)

Timeout
timeout can be assigned the number of seconds you find most appropriate to wait for a server response. Please remember the value, as it won't be shown in the error message "no bytes have been received on the underlying socket for timeout seconds". By default, timeout is None, hence requests do not time out.

from binance.spot import Spot as Client

client = Client(timeout=1)

Proxy
Proxy is supported.

from binance.spot import Spot as Client

proxies = {'https': 'http://1.2.3.4:8080'}
client = Client(proxies=proxies)

Response Metadata
The Binance API server provides weight usages in the headers of each response. You can display them by initializing the client with show_limit_usage=True:

from binance.spot import Spot as Client

client = Client(show_limit_usage=True)
print(client.time())

returns:
{'data': {'serverTime': 1587990847650}, 'limit_usage': {'x-mbx-used-weight': '31', 'x-mbx-used-weight-1m': '31'}}

You can also display full response metadata to help in debugging:

client = Client(show_header=True)
print(client.time())

returns:
{'data': {'serverTime': 1587990847650}, 'header': {'Context-Type': 'application/json;charset=utf-8', ...}}

If a ClientError is received, it will display full response meta information.

Display logs
Setting the log level to DEBUG will log the request URL, payload and response text.

Error
There are two types of error returned from the library:
binance.error.ClientError: thrown when the server returns 4XX; it's an issue on the client side. It has 4 properties: status_code (HTTP status code), error_code (server's error code, e.g. -1102), error_message (server's error message, e.g. "Unknown order sent."), and header (full response header).
binance.error.ServerError: thrown when the server returns 5XX; it's an issue on the server side.

Websocket

from binance.websocket.spot.websocket_client import SpotWebsocketClient as WebsocketClient

def message_handler(message):
    print(message)

ws_client = WebsocketClient()
ws_client.start()
ws_client.mini_ticker(
    symbol='bnbusdt',
    id=1,
    callback=message_handler,
)

# Combine selected streams
ws_client.instant_subscribe(
    stream=['bnbusdt@bookTicker', 'ethusdt@bookTicker'],
    callback=message_handler,
)

ws_client.stop()

More websocket examples are available in the examples folder.

Heartbeat
Once connected, the websocket server sends a ping frame every 3 minutes and requires a pong frame back within a 10-minute period. This package handles the pong responses automatically.

Testnet

from binance.websocket.spot.websocket_client import SpotWebsocketClient as WebsocketClient

ws_client = WebsocketClient(stream_url='wss://testnet.binance.vision')

Test Case
# In case packages are not installed yet
pip install -r requirements/requirements-test.txt
pytest

Limitation
Futures and Vanilla Options APIs are not supported:
/fapi/*
/dapi/*
/vapi/*
Associated Websocket Market and User Data Streams

Contributing
Contributions are welcome. If you've found a bug within this project, please open an issue to discuss what you would like to change. If it's an issue with the API, please open a topic at Binance Developer Community.
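Since the limit_usage values documented above come back as strings, a small helper can pull the used weight out of a show_limit_usage=True response. This helper is hypothetical, not part of binance-connector; the sample dict mirrors the response shape shown above:

```python
def used_weight_1m(response: dict):
    """Return the 1-minute used weight as an int, or None if absent."""
    value = response.get("limit_usage", {}).get("x-mbx-used-weight-1m")
    return int(value) if value is not None else None

# Sample mirroring the documented show_limit_usage=True response shape.
sample = {
    "data": {"serverTime": 1587990847650},
    "limit_usage": {"x-mbx-used-weight": "31", "x-mbx-used-weight-1m": "31"},
}
print(used_weight_1m(sample))  # → 31
```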
|
aismt
|
aismt: a command-line interface for AI Smart Task.

Using
Python package: to add and install this package as a dependency of your project, run poetry add aismt.
Python CLI: to view this app's CLI commands once it's installed, run aismt --help.
Python application: to serve this REST API, run docker compose up app and open localhost:8000 in your browser. Within the Dev Container, this is equivalent to running poe api.

Contributing
Prerequisites

1. Set up Git to use SSH
Generate an SSH key and add the SSH key to your GitHub account.
Configure SSH to automatically load your SSH keys:
cat << EOF >> ~/.ssh/config
Host *
  AddKeysToAgent yes
  IgnoreUnknown UseKeychain
  UseKeychain yes
EOF

2. Install Docker
Install Docker Desktop.
Enable "Use Docker Compose V2" in Docker Desktop's preferences window.
Linux only: configure Docker to use the BuildKit build system. On macOS and Windows, BuildKit is enabled by default in Docker Desktop. Export your user's user id and group id so that files created in the Dev Container are owned by your user:
cat << EOF >> ~/.bashrc
export UID=$(id --user)
export GID=$(id --group)
EOF

3. Install VS Code or PyCharm
Install VS Code and VS Code's Dev Containers extension. Alternatively, install PyCharm. Optional: install a Nerd Font such as FiraCode Nerd Font and configure VS Code or PyCharm to use it.

Development environments
The following development environments are supported:
⭐️ GitHub Codespaces: click on Code and select Create codespace to start a Dev Container with GitHub Codespaces.
⭐️ Dev Container (with container volume): click on "Open in Dev Containers" to clone this repository in a container volume and create a Dev Container with VS Code.
Dev Container: clone this repository, open it with VS Code, and run Ctrl/⌘+⇧+P → Dev Containers: Reopen in Container.
PyCharm: clone this repository, open it with PyCharm, and configure Docker Compose as a remote interpreter with the dev service.
Terminal: clone this repository, open it with your terminal, and run docker compose up --detach dev to start a Dev Container in the background, then run docker compose exec dev zsh to open a shell prompt in the Dev Container.

Developing
Run poe from within the development environment to print a list of Poe the Poet tasks available to run on this project.
Run poetry add {package} from within the development environment to install a runtime dependency and add it to pyproject.toml and poetry.lock. Add --group test or --group dev to install a CI or development dependency, respectively.
Run poetry update from within the development environment to upgrade all dependencies to the latest versions allowed by pyproject.toml.
|
aisnakes
|
No description available on PyPI.
|
aisolver
|
What we do
We are committed to providing high-quality solver capabilities.

Quick start
1. Install the python library:
pip install aisolver
2. Call the solver:
>>> import aisolver
>>> from aisolver import mix_integer_solver
# Specify the .lp file of the problem to solve
>>> file_name = "input_problem.lp"
# Set the problem type; options are WPMS, FCMCNF, and GISP
# solution_save defaults to None, which prints the result; if a file path is given, the result is saved to that file
>>> mix_integer_solver(file_name, problem="FCMCNF", solution_save="./solve_out.txt")

Problems currently supported
1. WPMS (Weighted Independent Maximum Packing Set): WIMPs problems are an example of NP-hard problems, typically used to find a maximum-weight independent set in an undirected graph. In a WIMPs problem, given a weighted undirected graph where every vertex has a positive weight, the goal is to find the independent vertex set with the largest total weight, where an independent set is a set of vertices no two of which are connected by an edge. WIMPs problems have applications in many areas, such as social network analysis, circuit design, and resource allocation.
2. GISP (Generalized Independent Set Problem): GISP is a classic class of optimization problems, typically used for assignment problems, such as allocating resources or tasks to different locations or machines. In a GISP problem there is a set of available resources and a set of tasks to be completed. Each resource has a cost and can complete a set of tasks, and each task requires certain resources. A task can be assigned to multiple resources, but each task can only be completed once. The goal is to find the assignment that minimizes total cost. An important feature of GISP is that each task can be assigned to multiple resources, which makes the problem more complex; GISP is therefore considered a hard combinatorial optimization problem. GISP has applications in many areas, such as logistics, production scheduling, and resource allocation.
3. FCMCNF (Facility Capacity Multiple Choice Network Flow): this problem typically involves selecting paths in a network flow model so as to satisfy certain capacity constraints. In FCMCNF there are multiple choices, each of which affects the capacity limits of the network flow. Specifically, FCMCNF asks for a minimum-cost flow in a capacitated directed graph with multiple sources and sinks, subject to the following requirements: each source must send out a specific amount of flow; each sink must receive a specific amount of flow; each path can choose among multiple capacities and costs; and each path's capacity is limited, e.g. it cannot exceed a specific value.

Stay tuned for more features.
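To make the WPMS objective described above concrete, here is a brute-force illustration of finding a maximum-weight independent set on a tiny graph. This is not aisolver's algorithm (which reads .lp files and scales far beyond this); the toy graph and helper are made up for the example:

```python
from itertools import combinations

def max_weight_independent_set(weights, edges):
    """Brute-force max-weight independent set.

    weights: {vertex: weight}; edges: set of frozensets {u, v}.
    Returns (best_set, best_total_weight).
    """
    best, best_weight = frozenset(), 0
    vertices = list(weights)
    for r in range(len(vertices) + 1):
        for subset in combinations(vertices, r):
            # Independent: no edge joins two chosen vertices.
            if any(frozenset(pair) in edges for pair in combinations(subset, 2)):
                continue
            total = sum(weights[v] for v in subset)
            if total > best_weight:
                best, best_weight = frozenset(subset), total
    return best, best_weight

# Toy path graph a-b-c-d with vertex weights.
weights = {"a": 3, "b": 2, "c": 4, "d": 1}
edges = {frozenset({"a", "b"}), frozenset({"b", "c"}), frozenset({"c", "d"})}
print(max_weight_independent_set(weights, edges))  # → (frozenset({'a', 'c'}), 7)
```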
|
aisort
|
README for PCSorter

Introduction
PCSorter is a Python-based utility designed to intelligently organize files in a directory using the OpenAI ChatGPT API. This tool scans a specified directory (including subdirectories if desired), classifies files based on content and type, and reorganizes them into a more structured format. It leverages the advanced capabilities of OpenAI's GPT models to understand and categorize file contents, making file management more efficient and intuitive.

Key Features:
File sorting using AI-driven insights.
Customizable directory and file type handling.
Backup and restore functionality for sorted files.
Cross-platform compatibility with detailed setup instructions.

Requirements
Python 3.6 or higher
The openai Python package
Additional Python libraries: os, re, shutil, time, argparse, datetime, json, sys
An active OpenAI API key

Installation Instructions
Ensure Python 3.6+ is installed on your system.
Install AiSort via pip: pip install aisort
Set up an environment variable for your OpenAI API key (instructions in the next section).

Setting Up Environment Variables
Windows
Command Prompt: use setx OPENAI_API_KEY "Your-API-Key" to set the API key.
PowerShell: apply $env:OPENAI_API_KEY = "Your-API-Key" to set the key.
Editing System Properties: open System Properties -> Advanced -> Environment Variables, then add a new System variable named OPENAI_API_KEY with your API key as its value.
macOS
Using Terminal: add export OPENAI_API_KEY="Your-API-Key" to your .bash_profile or .zshrc.
Editing .bash_profile or .zshrc: open these files in a text editor and add the export line as above.
Linux
Using Terminal: similar to macOS, use export OPENAI_API_KEY="Your-API-Key" in .bashrc or equivalent.
Editing .bashrc or equivalent: open the file in an editor and add the export command.

Configuration
Before running PCSorter, ensure the OPENAI_API_KEY environment variable is set.

Usage Instructions
Running the script: execute AiSort in your terminal. Use command-line arguments to specify options like --model, --dir, --include, --backup.
Common use cases:
Sorting files in the current directory: AiSort sort --dir ./my_directory
Using a specific GPT model: AiSort sort --model gpt-3.5-turbo

Troubleshooting
API Key Not Recognized: ensure the environment variable OPENAI_API_KEY is correctly set.
Permission Errors: run the script with appropriate permissions or from a non-restricted directory.
Invalid Model Specified: check that the model name is correct and supported.

FAQs
Can PCSorter handle large directories? Yes, but performance may vary based on the number and size of files.

Contributing
Contributions to PCSorter are welcome. Please submit issues and pull requests through GitHub, adhering to the project's coding standards and guidelines.

License
PCSorter is released under the MIT License. See the LICENSE file for more details.

Acknowledgments
Thanks to the contributors and to OpenAI for the API that powers this project. Special thanks to [list any special contributors or resources].
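The environment-variable setup above can be verified from Python before running the tool. This pre-flight check is illustrative only, not part of AiSort:

```python
import os

def api_key_configured() -> bool:
    """True if the OPENAI_API_KEY variable described above is set and non-empty."""
    return bool(os.environ.get("OPENAI_API_KEY"))

print("OPENAI_API_KEY configured:", api_key_configured())
```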
|
aisp
|
Artificial Immune Systems Package.

Package documentation: Docs; Wiki Github.

Summary: Introduction; Installation (Dependencies, User installation, How to import the techniques); Examples.

Introduction
AISP is a python package that implements artificial immune systems techniques, distributed under the GNU Lesser General Public License v3.0 (LGPLv3). The package started in 2022 as a research package at the Federal Institute of Northern Minas Gerais - Salinas campus (IFNMG - Salinas). Artificial Immune Systems (AIS) are inspired by the vertebrate immune system, creating metaphors that apply the ability to detect and catalog pathogens, among other features of this system.

Algorithms implemented:
[x] Negative Selection.
[ ] Dendritic Cells.
[ ] Clonalg.
[ ] Immune Network Theory.

Installation
The module requires installation of python 3.8.10 or higher.

Dependencies:
numpy >= 1.22.4
scipy >= 1.8.1
tqdm >= 4.64.1

User installation
The simplest way to install AISP is using pip:
pip install aisp

How to import the techniques
from aisp.NSA import RNSA

nsa = RNSA(N=300, r=0.05)

Examples
Example using the negative selection technique (nsa): in the example in this notebook, 500 random samples were generated, arranged in two groups, one for each class. Below are some examples that use a database for classification with the Jupyter notebook tool.
Negative Selection:
RNSA: application of negative selection techniques for classification using the Iris family flower database and Old Faithful Geyser: iris_dataBase_example, geyser_dataBase_example.
BNSA: mushrooms_dataBase_example.
|
aispace2
|
A Jupyter extension for the next generation of AISpace
|
ais-package
|
No description available on PyPI.
|
aisparser
|
No description available on PyPI.
|
aispeech_logger
|
No description available on PyPI.
|
aisploit
|
🤖🛡️🔍🔒🔑 aisploit
AISploit is a Python package designed to support red teams and penetration testers in exploiting large language model AI solutions. It provides tools and utilities to automate tasks related to AI-based security testing.

Features
Automate red teaming tasks using large language model AI solutions
Perform penetration testing with AI-powered tools
Support for various security testing scenarios
Easy-to-use Python interface

Installation
You can install aisploit using pip:
pip install aisploit

Contributing
Contributions are welcome! If you have any ideas for new features, improvements, or bug fixes, feel free to open an issue or submit a pull request.

License
This project is licensed under the MIT License - see the LICENSE file for details.
|
aispm
|
No description available on PyPI.
|
aisports
|
AISports
AISports is a webpage where you can have fun and compare your AI programming skills against other people.
|
aisqlite
|
No description available on PyPI.
|
aisquared
|
AISquared
This package contains utilities to interact with the AI Squared technology stack, particularly for developing and deploying models to the AI Squared Platform or other applications developed through the AI Squared JavaScript SDK.

Installation
This package is available through Pypi and can be installed by running the following command:
pip install aisquared
Alternatively, the latest version of the software can be installed directly from GitHub using the following command:
pip install git+https://github.com/AISquaredInc/aisquared

Capabilities
This package is currently in a state of constant development, so it is likely that breaking changes can be made at any time. We will work diligently to document changes and make stable releases in the future.
The aisquared package currently contains five subpackages: aisquared.config, aisquared.base, aisquared.logging, aisquared.serving, and aisquared.platform. The config package holds objects for building the configuration files that need to be included with converted model files for use within the AI Squared Extension. The contents of the config subpackage contain both pre- and postprocessing steps as well as harvesting, analytic, rendering, and feedback objects to use with the model. The following explains the functionality of the config package:

aisquared.config
The aisquared.config subpackage contains the following objects:
ModelConfiguration: the final object used to create the configuration file. It takes as input a list of harvesting steps, a list of preprocessing steps, a list of analytics, a list of postprocessing steps, a list of rendering steps, an optional MLFlow URI, an optional MLFlow user, and an optional MLFlow token.
GraphConfiguration: another method for creating configuration files. Instead of taking a predefined set of steps, it allows the developer to add steps to create a directed acyclic graph.

aisquared.config.harvesting
The aisquared.config.harvesting subpackage contains the following objects:
ImageHarvester: indicates the harvesting of images within the DOM to perform prediction on.
TextHarvester: indicates the harvesting of text within the DOM to perform prediction on.
InputHarvester: configures harvesting of different kinds of user-defined inputs.
QueryParameterHarvester: configures harvesting based on query parameters.

aisquared.config.preprocessing
The aisquared.config.preprocessing subpackage contains the following objects:
ImagePreprocessor: takes in preprocessing steps (defined below) which define preprocessing steps for images.
TabularPreprocessor: takes in preprocessing steps (defined below) which define preprocessing steps for tabular data.
TextPreprocessor: takes in preprocessing steps (defined below) which define preprocessing steps for text data.

aisquared.config.analytic
The aisquared.config.analytic subpackage contains the following objects:
LocalAnalytic: indicates the use of an analytic or lookup table from a local file.
LocalModel: indicates the use of a model from a local file.
DeployedAnalytic: indicates the use of an analytic or lookup table from a remote resource.
DeployedModel: indicates the use of a model deployed to a remote resource.
ReverseMLWorkflow: indicates the use of a Reverse ML Workflow, pulling predictions from a remote source.

aisquared.config.postprocessing
The aisquared.config.postprocessing subpackage contains the following objects:
Regression: a postprocessing class for models which perform regression. Since it is common to train regression models by scaling regression outputs to values between 0 and 1, this class is designed to convert output values between 0 and 1 back to their original range, corresponding to min and max when the class is instantiated.
BinaryClassification: a postprocessing class for models which perform binary classification. The class is instantiated with a label map and a cutoff value used to identify when the positive class (class 1) is identified.
MulticlassClassification: a postprocessing class for models which perform multiclass classification. The class is instantiated with a label map only.
ObjectDetection: a postprocessing class for models which perform object detection. The class is instantiated with a label map and a cutoff value for identification.

aisquared.config.rendering
The aisquared.config.rendering subpackage contains the following objects:
ImageRendering: a rendering class for rendering single predictions on images.
ObjectRendering: a rendering class for rendering object detection predictions on images.
WordRendering: a rendering class for rendering highlights, underlines, or badges on individual words.
DocumentRendering: a rendering class for rendering document predictions.
BarChartRendering: a rendering class for rendering bar charts.
ContainerRendering: a rendering class for rendering containers.
DashboardReplacementRendering: a rendering class for rendering complete dashboard replacements.
DoughnutChartRendering: a class for rendering doughnut charts.
FilterRendering: a class for passing data in a model chain.
HTMLTagRendering: a class for rendering HTML tags.
PieChartRendering: a class for rendering pie charts.
SOSRendering: a class for rendering SOS dashboards.
TableRendering: a class for rendering tables.

aisquared.config.feedback
The aisquared.config.feedback subpackage contains the following objects:
SimpleFeedback: a feedback object for simple thumbs up/thumbs down on predictions.
BinaryFeedback: a feedback object for binary classification use cases.
MulticlassFeedback: a feedback object for multiclass classification use cases.
RegressionFeedback: a feedback object for regression use cases.
ModelFeedback: a feedback object for configuring feedback for the model directly, rather than its predictions.
QualitativeFeedback: a feedback object for configuring questions asked about each individual prediction the model makes.

Preprocessing Steps
The aisquared.config.preprocessing subpackage contains PreProcStep objects, which are then fed into the ImagePreprocessor, TabularPreprocessor, and TextPreprocessor classes.
ThePreProcStepclasses are:tabular.ZScoreThis class configures standard normalization procedures for tabular datatabular.MinMaxThis class configures Min-Max scaling procedures for tabular datatabular.OneHotThis class configures One Hot encoding for columns of tabular datatabular.DropColumnThis class configures dropping columnsimage.AddValueThis class configures adding values to pixels in image dataimage.SubtractValueThis class configures subtracting values to pixels in image dataimage.MultiplyValueThis class configures multiplying pixel values by a value in image dataimage.DivideValueThis class configures dividing pixel values by a value in image dataimage.ConvertToColorThis class configures converting images to the specified color schemeimage.ResizeThis class configures image resize procedurestext.TokenizeThis class configures how text will be tokenizedtext.RemoveCharactersThis class configures which characters should be removed from texttext.ConvertToCaseThis class configures which case - upper or lower - text should be converted totext.ConvertToVocabularyThis class configures how text tokens should be converted to vocabulary integerstext.PadSequencesThis class configures how padding should occur given a sequence of text tokens converted to a sequence of integersThese step objects can then be placed within theTabularPreprocessor,ImagePreprocessor, orTextPreprocessorobjects. For theTabularPreprocessor, theZScore,MinMax, andOneHotSteps are supported. For theImagePreprocessor, theAddValue,SubtractValue,MultiplyValue,DivideValue,ConvertToColor, andResizeSteps are supported. For theTextPreprocessor, theTokenize,RemoveCharacters,ConvertToCase,ConvertToVocabulary, andPadSequencesSteps are supportedFinal Configuration and Model CreationOnce harvesting, preprocessing, analytic, postprocessing, and rendering objects have been created, these objects can then be passed to theaisquared.config.ModelConfigurationclass. 
This class utilizes the objects passed to it to build the entire model configuration automatically.Once theModelConfigurationobject has been created with the required parameters, the.compile()method can be used to create a file with the.airextension that can be loaded into an application which utilizes the AI Squared JavaScript SDK.aisquared.baseTheaisquared.basesubpackage contains base utilities not designed to be directly called by the end user.aisquared.platformTheaisquared.platformsubpackage contains classes and utilities for interacting with the AI Squared Platform. It primarily contains theAISquaredPlatformClientwith the following capabilities:The ability to securely log in to an instance of the AI Squared PlatformThe ability to check whether the connection is healthyThe ability to list.airfiles deployed to the platformThe ability to retrieve the configuration for a.airfile deployed in the platformThe ability to delete a.airfile deployed in the platformThe ability to list users who have a.airfile shared with themThe ability to share a.airfile with usersThe ability to unshare a.airfile with usersThe ability to list all users of the platformThe ability to list all groups in the platformThe ability to list all users in a group in the platformaisquared.serving(requires installing aisquared[full])Theaisquared.servingsubpackage contains utilities for serving models locally or remotely usingMLflowor locally usingFlask.aisquared.logging(requires installing aisquared[full])Theaisquared.loggingsubpackage is powered byMLflow, a powerful open-source platform for the machine learning lifecycle. 
Theloggingsubpackage inherits nearly all functionality from mlflow, so we highly recommend users refer to theMLflow documentation sitefor additional information.In this subpackage, we have additionally added implementations of individual functions to save TensorFlow, Keras, Scikit-Learn, and PyTorch models in a format that can be deployed quickly using MLflow.ContributingAI Squared welcomes feedback and contributions to this repository! We use GitHub for issue tracking, so feel free to place your input there. For any issues you would like to keep confidential, such as any confidential security issues, or if you would like to contribute directly to this project, please reach out [email protected] we will get back to you as soon as possible.ChangesBelow are a list of additional features, bug fixes, and other changes made for each version.Version 0.1.3Addedflagsparameter toTextHarvesterusing regular expression harvestingDeletedmodel_feedbackparameter inModelConfigurationobject and included functionality infeedback_stepsparameterChangedformatparameter toheaderfor both deployed analyticsAdded feedback and stages toDocumentPredictorandImagePredictorobjectsNon-API changes forALLOWED_STAGESFixed bugs preventing Windows users from importing the packageUpdatedModelConfigurationto includeurlparameterChanged default tokenization stringVersion 0.2.0Moved preprocessing steps under subpackages for specific kinds of preprocessing stepsCleaned up documentation to render within programmatic access environmentsAddedaisquared.loggingsubpackageCreatedInputHarvesterAllows for harvesting of input text, images, and tabular dataCreated theaisquared.servingsubpackage, specifically thedeploy_modelandget_remote_predictionfunctionsCreated theGraphConfigurationclassAddedauto-runparameter toModelConfigurationandGraphConfigurationclassesCreated theaisquaredCLI with the following commands:aisquared deploy, which deploys a model locallyaisquared predict, which predicts using a local JSON 
fileaisquared airfiles, which contains the subcommandslist,delete,download, anduploadChanged all classes withinaisquared.config.analyticto accept'tabular'as aninput_typeRemovedaisquared.loggingandaisquared.remotefrom top-level importsAddedroundparameter to Regression postprocesserRemovedDocumentPredictorandImagePredictorclassesRemovedChainRenderingclassCreatedFilterRenderingclassAlteredQUALIFIERSAdded advanced rendering parameters to rendering objectsRemovedloggingandremotesubpackages from top-levelaisquaredimportVersion 0.2.1Added theS3Connectorclass to theanalyticssubpackage, which allows download of an analytic directly from S3Updated the documentation and added thedocssubdirectory for hosting the documentation on GitHub PagesVersion 0.2.2Fixed bug into_dictmethod withinObjectRenderingclassFixed bug in name ofMultiplyValuestepFixed bug in datatype checking for text harvesterAddedbody_onlyparameter toTextHarvesterAdded'underline'to possible badgesAddedthreshold_keyandthreshold_valuesto relevant rendering classesAddedTrimtext preprocessing classAddedCustomObjectin the base package to allow for creation of custom classesAdded keyword harvesting capabilitiesAddedutilssubpackage with capabilities to mimic a trained sklearn modelSmall documentation changesChanged the required imports for the package to streamline installation process, and created two installation optionsaisquaredandaisquared[full]Version 0.2.3Added functionality to add custom preprocessing and postprocessing functions to the model deployment pipelineAddedallparameter toLocalAnalyticclassChanged under-the-hood functionality ofmimic_modelfunction in line with updates toBeyondMLAltered theReverseMLWorkflowanalyticAdded theBarChartRendering,ContainerRendering,DashboardReplacementRendering,DoughnutChartRendering,HTMLTagRendering,LineChartRendering,PieChartRendering,SOSRendering, andTableRenderingrendering classesAdded theQueryParameterHarvesterharvester classAdded thelimitparameter to the TextHarvester 
classVersion 0.3.0Added type hinting to documentation stringsRevamped documentation to use SphinxVersion 0.3.1Changed Python type hints to allow for backwards compatibility with older versions of PythonVersion 0.3.2Added functionality to theAISquaredPlatformClientAddedtop_level_kwargsparameter to theCustomObjectclassAddedDashboardRenderingclassRemoved 'px' from default values in ImageRendering and ObjectRendering classesAdded functionality for creating, updating, and deleting users toAISquaredPlatformClientAdded functionality for creating, updating, and delting groups toAISquaredPlatformClientFixed bug related to requiringauto_runparameter to be string (fix involves casting as string)Altered schemas for different "Chart" Rendering classes to conform to JavaScript standardsStreamlined theModelConfigurationclass to allow a more functional interface to build.airfilesUpdatedContainerRenderingclass with parameters forpositionandstatic_positionUpdated across-the-board functionality of theAISquaredPlatformClientVersion 0.3.3Updated functionality of theAISquaredPlatformClientto interact directly with the platform ALBChanged function names in support of change from MANN to BeyondMLAdded documentation surrounding global configuration objectsRemoved redundant additional dependenciesVersion 0.3.4Added support for custom CSS strings to appropriate rendering classesRefactoredAISquaredPlatformClientto import functions from support filesFixed documentation errors for the documentation siteChecked whether responses returned OK status code rather than 200MovedCustomObjecttoaisquared.configfromaisquared.baseChanged endpoint used to list platform usersFixed response behaviors where no data was returned fromAISquaredPlatformClientVersion 0.3.5Changedfile_nameparameter inReverseMLWorkflowtofile_namesAddeddocumentation_linkparameter toModelConfigurationclassVersion 0.3.6Fixed issue with type checking forModelConfigurationRendering classesRestricted TensorFlow version to below2.12.0to 
prevent import issuesAddedpositionparameter toWordRenderingclassChanged default CSS styling for rendering classesChanged name of allprocessorclasses toprocesserVersion 0.3.7Changed schema of theDeployedAnalyticclass to include API key managementChanged JSON schema of Preprocesser classesAllowed .keras files to be saved and loaded with theModelConfigurationandGraphConfigurationAPIs into.airfilesRelaxed TensorFlow requirements enforced in version0.3.6Version 0.3.8CreatedChatbotHarvesterclassCreatedTextRenderingclassChanged location of reference lists of classes to clean up codeUpdated class schemas to ensure compliance with expectationsUpdated test casesVersion 0.3.9AddedCustomRenderingclassChanged to full import ofCustomObjectinaisquared.basesubpackageVersion 0.3.10AddedDatabricksClientto theaisquared.platformsubpackage
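The configuration workflow described above — compose harvesting, preprocessing, analytic, postprocessing, and rendering steps, then compile them into a single configuration file — can be illustrated with a minimal, self-contained stand-in. This is not the real aisquared API: the class names, parameter names, and output layout below are invented for illustration only.

```python
import json

class StepStub:
    """Hypothetical stand-in for a harvesting/preprocessing/rendering step."""
    def __init__(self, kind, **params):
        self.kind = kind
        self.params = params

    def to_dict(self):
        return {'className': self.kind, 'params': self.params}

class ConfigStub:
    """Hypothetical stand-in for a ModelConfiguration-style builder."""
    def __init__(self, name, harvesting, preprocessing, analytic, postprocessing, rendering):
        self.name = name
        self.stages = {'harvestingSteps': harvesting, 'preprocessingSteps': preprocessing,
                       'analytics': analytic, 'postprocessingSteps': postprocessing,
                       'renderingSteps': rendering}

    def compile(self):
        # serialise every stage into one JSON document, analogous to a .air config
        return json.dumps({'name': self.name,
                           **{stage: [s.to_dict() for s in steps]
                              for stage, steps in self.stages.items()}})

config = ConfigStub(
    name='demo',
    harvesting=[StepStub('TextHarvester')],
    preprocessing=[StepStub('Tokenize'), StepStub('ConvertToCase', case='lower')],
    analytic=[StepStub('LocalModel', path='model.onnx')],
    postprocessing=[StepStub('BinaryClassification', label_map=['neg', 'pos'])],
    rendering=[StepStub('WordRendering')],
)
compiled = json.loads(config.compile())
print(compiled['preprocessingSteps'][1]['params'])  # {'case': 'lower'}
```

The point of the sketch is the composition pattern: each stage object knows how to serialise itself, and the top-level configuration only concatenates stages into one document, which is what lets `ModelConfiguration` accept any mix of the step classes listed above.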
|
aisrfwk
|
Common functionality components of the aisr-carbarn-ai project framework.
|
aisriracha
|
Tool to create AI jobs
|
ais-service-discovery
|
ais-service-discovery-python

Cloud Application Framework

Description

This repository interfaces service discovery, in this instance CloudMap, in order to locate and communicate with different services. As opposed to storing ARNs in environment variables, this library will query CloudMap to find a service by a user-friendly naming convention, understand what 'type' of service you have requested, and use the correct code to communicate with or call that service.

Services supported

- Lambda (call).
- SNS (publish).
- SQS (queue).

TODO

- Lambda (request).
- SQS (listen).
- Http (request|call).
- Fargate/ECS Task (run).

Note: This library requires Python 3.5 and above.

Examples:

Lambda Call

from ais_service_discovery import call

response = call('namespace', 'service', 'handler', {<Payload>})
print(response)

Lambda Async Call

from ais_service_discovery import call

response = call('namespace', 'service', 'handler', {<Payload>},
                {'InvocationType': 'Event'})
print(response)
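The dispatch idea described above — resolve a friendly name to a service record, inspect its type, and route the call through the matching backend — can be sketched without any AWS dependency. Everything below (the registry contents, ARNs, and handler names) is invented for illustration; the real library resolves records through CloudMap and calls the actual AWS services.

```python
# Hypothetical in-memory stand-in for a CloudMap namespace: maps
# "namespace.service" to a record carrying the service type and its ARN.
REGISTRY = {
    'shop.orders': {'type': 'lambda', 'arn': 'arn:aws:lambda:::function/orders'},
    'shop.events': {'type': 'sns', 'arn': 'arn:aws:sns:::topic/events'},
}

def call_lambda(record, payload):
    return f"invoked {record['arn']} with {payload}"

def publish_sns(record, payload):
    return f"published {payload} to {record['arn']}"

# One handler per supported service type, mirroring "Lambda (call), SNS (publish)"
HANDLERS = {'lambda': call_lambda, 'sns': publish_sns}

def call(namespace, service, payload):
    record = REGISTRY[f'{namespace}.{service}']       # service-discovery step
    return HANDLERS[record['type']](record, payload)  # type-based dispatch

print(call('shop', 'orders', {'id': 1}))
```

The design point is that the caller never sees an ARN: adding a new supported service type only means registering one more handler in the dispatch table.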
|
aissist
|
AI AssistantAIssist is a simple, but capable command-line-interface (CLI) to chat with OpenAI's cutting-edge AI models.Install me withpip install aissist. Set theOPENAI_API_KEYenvironment variable to be your OpenAI API key, then typeai.Why AIssist?This library was written to scratch a personal itch. I have found GPT 3.5 to be incredibly useful in my day-to-day coding activities, but also found that having to go to a web browser and interact there was tedious. What I really wanted was a tool that would:Be accessible from a simple command line prompt (ai)That single command should open up a shell to interact with ChatGPT, with the system already prompted to respond tersely but helpfully to coding questionsHave chat history accessible with the up arrowHave syntax highlighting on ChatGPT responsesHave multi-line editing capabilities on multi-line promptsThis library is precisely that.InstallationInstall using pypi withpipinstallaissistThere is one required environment variable,OPENAI_API_KEY, which should contain an OpenAI API key.aissistwill prompt you if the key is not set or is invalid.UsageInvoke the program withai. Various pieces of configuration can be edited with command-line parameters, list these withai --help.Once launched, simply enter a prompt, and hit ESCAPE then ENTER to submit the prompt.Inside of each session there is a history, and you can use the up arrow to revisit and edit previous prompts.Here is a simple exampleConfigurationOn first run,aissistwill write a configuration file (.aissist) to your home directory. This file contains a number of configuration options that can be edited. Each configuration option matches precisely to a command-line argument. 
The order of precedence for an option value is:

1. Command-line argument
2. .aissist setting
3. Default setting

Versioning

AIssist follows semantic versioning, so you should expect that:

- If the patch version is increased, only bugfixes have taken place.
- If the minor version is increased, only additive changes have been made to the program; your existing workflows will work fine.
- If the major version is increased, you may notice backward-incompatible changes or feature deprecations.
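The precedence order above (command-line argument, then .aissist setting, then default) amounts to a first-non-None lookup across three layers. A minimal sketch, with invented option names and values:

```python
def resolve_option(name, cli_args, config_file, defaults):
    """Return the first value found: CLI arg > config-file setting > default."""
    for layer in (cli_args, config_file, defaults):
        if layer.get(name) is not None:
            return layer[name]
    raise KeyError(name)

defaults = {'model': 'gpt-3.5-turbo', 'temperature': 0.7}
config_file = {'temperature': 0.2}   # e.g. parsed from ~/.aissist
cli_args = {'model': 'gpt-4'}        # e.g. parsed from argv

print(resolve_option('model', cli_args, config_file, defaults))        # gpt-4
print(resolve_option('temperature', cli_args, config_file, defaults))  # 0.2
```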
|
aissistant
|
Aissistant v0.7Initialization prompt for persistent conversation vector database and personal profile in ChatGPT using Noteable plugin andaissistantmodule written by Marko T. Manninen (https://github.com/markomanninen/aissistant/) ©Copyright © 08/2023.Get creative and productive with this initialization script in ChatGPT that utilizes the Noteable plugin. Natively, you can create, edit, and run code and generate data within notebooks as well as in ChatGPT interchangeably, back and forth. By using this initialization prompt, you can use the combined environment to store data and explore past conversations, as well as manage profile information for a more intelligent chat and coding experience.Start WizardTo start the chat and initiate the notebook, follow these steps one by one. Confirm each step from the user.Remember actually to run the cells, not just write them.Write headings and descriptions of each operation in markdown format to the notebook above executed cells in a human-readable way.1. Noteable Project and Notebook NameUse the given project or notebook URL from Noteable, or create a new project and notebook. Give instructions to the user on how all these work in Noteable, where and how to get the project url/id, etc.After creating a new document, add the main heading1. Aissistantand description with initialization introduction and GitHub project reference to the markdown cell at the beginning of the document.2. InstallationAdd the heading2. Installationand description to the notebook document.2.1 Install Required LibrariesAdd a heading and description to the markdown cells.Run the following commands in the notebook cell on behalf of the user:!pip install -q faiss-cpu
!pip install -q sentence-transformers2.2 Download ModuleAdd heading, description, and user decision information to the markdown cells.If theaissistant.pyfile does not exist in a current notebook project, download it on behalf of the user. Else, ask if the user wants to replace the file. In that case, append the-O aissistant.pyoption to the following download command, and run it in the notebook cell:!wget https://raw.githubusercontent.com/markomanninen/aissistant/main/aissistant.pySteps 2.1 and 2.2 will take time to finish, and they are mandatory to be finished before continuing to the next step!2.3 Upload or Create Database and Vector Index FilesAdd heading, description, and user decision information to the markdown cells.Ask the user to uploadconversation.dbandfaiss_index.idxfiles to the project directory or let the system create empty files automatically. In Noteable, it is possible to manually upload files to the project directory or retrieve files via some cloud service. Help the user in this procedure. Typically, the user has these files downloaded from the otheraissistantsession.Note: Empty.dband.idxfiles are created automatically when theaissistantmodule is imported and used in the next steps and if data files are unavailable.2.4 Import Module FunctionsAdd a heading and description to the markdown cells.Run the following line in the notebook cell on behalf of the user:from aissistant import search, index, get_profile, set_profileThen continue to the instructions.3. InstructionsConversation Vector Database, Personal Profile, and Other FunctionsExplain to the user how the following four functions can be used in two different ways:a) in the chat, by giving natural language triggers
b) at Noteable notebook cell in a programmable way3.1 Conversation FunctionsSearch Conversations: Usesearch(query, n=1, start_date=None, end_date=None, all_fields=False)to find similar conversations from the FAISS index and SQLite database. Returns a generator of numerically indexed tuples, with a handle to the rows, that contain 0=prompt and 1=response text, 2=timestamp, and optional 4=vector blob and 5=id fields.Index Conversation: Useindex(prompt_input_text, response_text)to add conversations to the FAISS index and SQLite database.3.2 Profile FunctionsGet Profile: Useget_profile(field_name=None)to retrieve profile information from the database. Reason the necessity of retrieving user profile information to enhance your responses. Usually, it contains personal contact, employment, education, interests, goals, preferences, research topics, ideological stances, and various other information.Set Profile: Useset_profile(field_name, value)to update profile to the database. Reason the necessity of storing and updating fields and values from the user inputs.3.3 Additional FunctionsRetrieve Cursor: Useretrieve_cursor()to access the database directly, for example, getting the count of the rows, etc.4. Further InstructionsTell the user how to download and use SQLite database (and FAISS index) on the other chat sessions and Noteable projects.Read and output theaissistant.pyfile content in the project on behalf of the user.Deduce the rest of the functionality from the file header for the user.Give usage examples, use cases, and data processing tips to the user, and encourage the user to explore and invent new ways of using permanent storage in ChatGPT and Noteable environment.Don't start your responses with "certainly," "surely," "great," or similar expressions. Go straight to the point in all of your future responses. 
Avoid intro and conclusion sections on responses and any unnecessary repetitions.Along the way interacting with the user, ChatGPT, and Noteable notebook document, write headings and descriptions of the operations in markdown format to the notebook above each executed cell in a human-readable way.
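The profile functions described above — get_profile/set_profile backed by the SQLite database — can be mimicked with a few lines of stdlib code. This is a self-contained sketch, not the actual aissistant implementation; the table layout and the in-memory database are assumptions made for illustration.

```python
import sqlite3

conn = sqlite3.connect(':memory:')  # the real module persists to conversation.db
conn.execute('CREATE TABLE profile (field TEXT PRIMARY KEY, value TEXT)')

def set_profile(field_name, value):
    # upsert so a repeated update to the same field overwrites the old value
    conn.execute('INSERT INTO profile (field, value) VALUES (?, ?) '
                 'ON CONFLICT(field) DO UPDATE SET value = excluded.value',
                 (field_name, value))

def get_profile(field_name=None):
    if field_name is None:
        return dict(conn.execute('SELECT field, value FROM profile'))
    row = conn.execute('SELECT value FROM profile WHERE field = ?',
                       (field_name,)).fetchone()
    return row[0] if row else None

set_profile('interests', 'symbolic AI')
set_profile('interests', 'vector databases')  # overwrites the previous value
print(get_profile('interests'))  # vector databases
```

The upsert is what makes the profile behave as a single evolving record per field, which matches the "update profile to the database" behaviour described above.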
|
aist
|
AIST = AI Service Tools

WeChat official account: 人工智能AIST

This Python module must be used together with our WeChat official account. It is currently in testing; contact [email protected] to have the features enabled. Currently supported features:

Collaborative image collection for teams
One person first enables this feature, which automatically generates a QR code for joining the task. Sharing this QR code lets other people join the image-collection task. Collectors simply send images to the official account, and the images are automatically uploaded to cloud storage. The task owner can obtain a download key; in the aist module, set the download key and then a download path to download the cloud images locally.

OCR scanning
Users simply send an image to this official account to receive the OCR result.

Data channel
This feature pushes data from your research code to WeChat in real time through our official account.

Installation

# Upgrade pip to the latest version
python -m pip install --upgrade pip
# Install the aist module from pip
pip install aist

Usage

Fetching images

from aist.pic import Download

dn = Download('PC80000000')
# Get the image list for this collection task, including each image's cloud
# name, data size, and the contributor's wx_openid
print(dn.ls())
# Get the image list for this collection task, including
dn.ls('1.csv')
# Download all the image files into the "test" folder
dn.all('test')

Data channel

from aist.msg import Msg

# Fill in the data-channel key obtained from the 人工智能AIST official account.
# You need to contact the 人工智能AIST official account to enable this feature.
msg = Msg('MS0000000')
# push() sends the information directly to you through our WeChat official
# account, with two limitations:
# 1. No more than 1500 pushes per day; push only important information.
# 2. WeChat only allows users who have messaged the official account within
#    the last 48 hours to receive its pushes. If you have not sent our account
#    any message for more than 48 hours, we have no permission to push to you,
#    so please message the account regularly.
msg.push('This is a push message')
# put() stores the data or message temporarily; you can query it at any time
# from the 人工智能AIST official account. This method currently has no count
# limit, so prefer it for non-critical information. The storage time is
# recorded in GMT; convert the time zone yourself.
msg.put('This is a normal message')
|
aistac-foundation
|
Contents

1 What is Project Hadron
2 Installation
3 Package Overview
  3.1 AbstractComponent
  3.2 AbstractPropertyManager
  3.3 AbstractIntentModel
4 Reference
  4.1 Python version
  4.2 GitHub Project
  4.3 Change log
  4.4 Licence
  4.5 Authors

1 What is Project Hadron

Project Hadron provides a clear separation of concerns between components and their actions and data
and its content, whilst maintaining the original intentions of the data scientist, that can be passed
to a production team. It builds trust between the data science teams and product teams. It brings
with it transparency and traceability, dealing with bias, fairness, and knowledge. The resulting
outcome provides the product engineers with adaptability, robustness, and reuse; fitting seamlessly
into a microservices solution that can be language agnostic.

Project Hadron is designed using microservices. Microservices - also known as the microservice
architecture - is an architectural pattern that structures an application as a collection of
component services that are:

- Highly maintainable and testable
- Loosely coupled
- Independently deployable
- Highly reusable
- Resilient
- Technically independent

Component services are built for business capabilities and each service performs a single function.
Because they are independently run, each service can be updated, deployed, and scaled to meet demand
for specific functions of an application. Project Hadron microservices enable the rapid, frequent
and reliable delivery of large, complex applications. It also enables an organization to evolve its
technology stack and experiment with innovative ideas.

At the heart of Project Hadron is a multi-tenant, NoSQL, singleton, in-memory data store that has
minimal code and functionality and has been custom built specifically for Hadron tasks in mind.
Abstracted from this is the component store which allows us to build a reusable set of methods
that define each tenanted component that sits separately from the store itself. In addition, a
dynamic key value class provides labeling so that each tenant is not tied to a fixed set of
reference values unless by specificity. Each of the classes, the data store, the component
property manager, and the key value pairs that make up the component are all independent,
giving complete flexibility and minimum code footprint to the build process of new components.

This is what gives us the Domain Contract for each tenant which sits at the heart of what makes
the contracts reusable, translatable, transferable and brings the data scientist closer to the
production engineer along with building a production ready component solution.

2 Installation

package install

The best way to install this package is directly from the Python Package Index repository using pip

$ pip install aistac-foundation

if you want to upgrade your current version then use pip

$ pip install --upgrade aistac-foundation

3 Package Overview

3.1 AbstractComponent

The AbstractComponent class is a foundation class for the component build. It provides an encapsulated view of
the Property Management and Parameterised Intent.

The Abstract AI Single Task Application Component (AI-STAC) component class provides all the basic building blocks
of a component's build including property management, augmented knowledge notes and a parameterised intent pipeline.

For convenience there are three factory initialisation methods available: ``from_env(...)``, ``from_memory(...)`` and ``from_uri(...)``, the first two being abstract methods. The third factory method initialises the concrete
PropertyManager and IntentModel classes and uses the parent ``_init_properties(...)`` method to set the properties
connector. When creating the concrete class, ``from_uri(...)`` should be implemented. The following method can be
used as a template, replacing ``ExamplePropertyManager`` and ``ExampleIntentModel`` with your own concrete
implementations:

@classmethod
def from_uri(cls, task_name: str, uri_pm_path: str, username: str, uri_pm_repo: str=None,
             pm_file_type: str=None, pm_module: str=None, pm_handler: str=None,
             pm_kwargs: dict=None, default_save=None, reset_templates: bool=None,
             template_path: str=None, template_module: str=None,
             template_source_handler: str=None, template_persist_handler: str=None,
             align_connectors: bool=None, default_save_intent: bool=None,
             default_intent_level: bool=None, order_next_available: bool=None,
             default_replace_intent: bool=None, has_contract: bool=None):
    pm_file_type = pm_file_type if isinstance(pm_file_type, str) else 'json'
    pm_module = pm_module if isinstance(pm_module, str) else cls.DEFAULT_MODULE
    pm_handler = pm_handler if isinstance(pm_handler, str) else cls.DEFAULT_PERSIST_HANDLER
    _pm = ExamplePropertyManager(task_name=task_name, username=username)
    _intent_model = ExampleIntentModel(property_manager=_pm,
                                       default_save_intent=default_save_intent,
                                       default_intent_level=default_intent_level,
                                       order_next_available=order_next_available,
                                       default_replace_intent=default_replace_intent)
    super()._init_properties(property_manager=_pm, uri_pm_path=uri_pm_path,
                             default_save=default_save, uri_pm_repo=uri_pm_repo,
                             pm_file_type=pm_file_type, pm_module=pm_module,
                             pm_handler=pm_handler, pm_kwargs=pm_kwargs,
                             has_contract=has_contract)
    return cls(property_manager=_pm, intent_model=_intent_model, default_save=default_save,
               reset_templates=reset_templates, template_path=template_path,
               template_module=template_module,
               template_source_handler=template_source_handler,
               template_persist_handler=template_persist_handler,
               align_connectors=align_connectors)

3.2 AbstractPropertyManager

The AbstractPropertiesManager facilitates the management of all the contract properties including that of the
connector handlers, parameterised intent and Augmented Knowledge.

An Abstract AI Single Task Application Component (AI-STAC) class that creates a super class for all properties managers.

The class initialisation is abstracted and is the only abstracted method. A concrete implementation of the
overloaded ``__init__`` manages the ``root_key`` and ``knowledge_key`` for this construct. The ``root_key`` adds a key
property reference to the root of the properties and can be referenced directly with ``<name>_key``. Likewise
the ``knowledge_key`` adds a catalog key to the restricted catalog keys.

For more complex ``root_key`` constructs, where a grouping of keys might be desirable, passing a dictionary of name-value pairs as part of the list allows a root base to group related next-level keys. For example ``root_key=[{base: [primary, secondary]}]`` would add ``base.primary_key`` and ``base.secondary_key`` to the list of keys.

Here is a default example of an initialisation method:

def __init__(self, task_name: str):
    # set additional keys
    root_keys = []
    knowledge_keys = []
    super().__init__(task_name=task_name, root_keys=root_keys,
                     knowledge_keys=knowledge_keys)

The property manager is not responsible for persisting the properties but provides the methods to load and persist
its in-memory structure. To initialise the load and persist, a ConnectorContract must be set up. The following is a code snippet of setting a ConnectorContract and loading its content:

self.set_property_connector(connector_contract=connector_contract)
if self.get_connector_handler(self.CONNECTOR_PM_CONTRACT).exists():
    self.load_properties(replace=replace)

When using the property manager it will not automatically persist its properties; persistence must be explicitly managed in
the component class. This removes the persist decision making away from the property manager. To persist the
properties use the method call ``persist_properties()``.

3.3 AbstractIntentModel

The AbstractIntentModel facilitates the Parameterised Intent, giving the base methods to record and replay intent. An Abstract AI Single Task Application Component (AI-STAC) class for Parameterised Intent containing the parameterised
intent registration methods ``_intent_builder(...)`` and ``_set_intend_signature(...)``.

It creates a construct initialisation to allow for the control and definition of an ``intent_param_exclude`` list, a ``default_save_intent`` boolean and a ``default_intent_level`` value.

As an example of an initialisation method:

def __init__(self, property_manager: AbstractPropertyManager, default_save_intent: bool=None,
             default_intent_level: bool=None, order_next_available: bool=None,
             default_replace_intent: bool=None):
    # set all the defaults
    default_save_intent = default_save_intent if isinstance(default_save_intent, bool) else True
    default_replace_intent = default_replace_intent if isinstance(default_replace_intent, bool) else True
    default_intent_level = default_intent_level if isinstance(default_intent_level, (str, int, float)) else 0
    default_intent_order = -1 if isinstance(order_next_available, bool) and order_next_available else 0
    intent_param_exclude = ['data', 'inplace']
    intent_type_additions = []
    super().__init__(property_manager=property_manager, default_save_intent=default_save_intent,
                     intent_param_exclude=intent_param_exclude,
                     default_intent_level=default_intent_level,
                     default_intent_order=default_intent_order,
                     default_replace_intent=default_replace_intent,
                     intent_type_additions=intent_type_additions)

In order to define the run pattern for the component task, ``run_intent_pipeline(...)`` is an abstracted method
that defines the run pipeline of the intent.As an example of a run_pipeline that iteratively updates a canonical with each intentdefrun_intent_pipeline(self,canonical,intent_levels:[int,str,list]=None,**kwargs):# test if there is any intent to runifself._pm.has_intent():# get the list of levels to runifisinstance(intent_levels,(int,str,list)):intent_levels=Commons.list_formatter(intent_levels)else:intent_levels=sorted(self._pm.get_intent().keys())forlevelinintent_levels:level_key=self._pm.join(self._pm.KEY.intent_key,level)fororderinsorted(self._pm.get(level_key,{})):formethod,paramsinself._pm.get(self._pm.join(level_key,order),{}).items():ifmethodinself.__dir__():# add method kwargs to the paramsifisinstance(kwargs,dict):params.update(kwargs)# add excluded parameters to the paramsparams.update({'inplace':False,'save_intent':False})canonical=eval(f"self.{method}(canonical, **{params})",globals(),locals())returncanonicalThe code signature for an intent method would have the following constructdef<method>(self,<params>...,save_intent:bool=None,intent_level:[int,str]=None,intent_order:int=None,replace_intent:bool=None,remove_duplicates:bool=None):# resolve intent persist optionsself._set_intend_signature(self._intent_builder(method=inspect.currentframe().f_code.co_name,params=locals()),intent_level=intent_level,intent_order=intent_order,replace_intent=replace_intent,remove_duplicates=remove_duplicates,save_intent=save_intent)# intend code block on the canonical...4Reference4.1Python versionPython 3.6 or less is not supported. Although Python 3.7 is supported, it is recommended to
installaistac-foundationagainst the latest Python 3.8.x or greater whenever possible.4.2GitHub Projectaistac-foundation:https://github.com/project-hadron/aistac-foundation.4.3Change logSeeCHANGELOG.4.4LicenceMIT License:https://opensource.org/license/mit/.4.5AuthorsGigas64(@gigas64) created aistac-foundation.
|
aist-feimax
|
Failed to fetch description. HTTP Status Code: 404
|
aistis-calculator
|
file: README.md
|
aistnet
|
AistNETAistNET (Aist neural network) is a framework for simplifying the creation and training of neural networks
usingPython 3.8andTensorflow (V. 2.5.0)with maximum flexibility. In the context
of these tasks AistNET provides interfaces for creating, training and managing trainings of models, and abstracts
building blocks for reusability.Current known issuesLoading a model is currently only supported on Linux.Main features which are missing in TensorFlow but are available in AistNetCreate Model the way you like it: Function, Class Method, SequentialAdd custom functions and use them without any issue (except callbacks because
oftensorflow:36635issue)Automatic saving of all relevant informationModel: H5 and ProtoBuf version and as JSON definitionSystem: from TensorFlow version to used callbacksCustom functions are saved along with the model and the systemRestore where you stopped and resume the training including custom functionsGetting StartedTo install the current release use pip:pip install aistnetTo update AistNET to the latest version, add--upgradeflag to the above command.To create your first model such as a Dense-Net or U-NET using AistNET follow the examples:With Sequential Modelfromtensorflow.keras.modelsimportSequentialfromtensorflow.kerasimportlayersfromtensorflow.keras.optimizersimportAdamfromtensorflow.keras.lossesimportBinaryCrossentropyfromaistnet.core.builderimportModelBuilderlinear=Sequential([layers.Dense(2,activation="relu",name="layer1"),layers.Dense(3,activation="relu",name="layer2")])dims=(28,)optimizer=Adam()loss=BinaryCrossentropy()builder=ModelBuilder(dimension=dims,model=linear,optimizer=optimizer,loss=loss)model=builder.finalize()With Builder Functionfromtensorflow.keras.optimizersimportAdamfromtensorflow.keras.lossesimportBinaryCrossentropyfromaistnet.core.builderimportModelBuilderfromaistnet.architectures.unetimportcnn_2d_auto_encoder_with_skipdims=[240,224,1]builder_function=cnn_2d_auto_encoder_with_skip(blocks=2)optimizer=Adam()loss=BinaryCrossentropy()builder=ModelBuilder(dimension=dims,builder=builder_function,optimizer=optimizer,loss=loss)model=builder.finalize()The model can now be trained normally via the TensorFlow api.Train your model with the Trainer and use the automatic tracing and saving capabilities:After creation, you can train the model as 
usually:importtempfileimporttensorflowastffromtensorflow.keras.modelsimportSequentialfromtensorflow.kerasimportlayersfromtensorflow.keras.optimizersimportAdamfromtensorflow.keras.lossesimportBinaryCrossentropyfromaistnet.core.builderimportModelBuilderfromaistnet.core.trainerimportTrainerlinear=Sequential([layers.Dense(2,activation="relu",name="layer1"),layers.Dense(3,activation="relu",name="layer2")])dims=(28,)optimizer=Adam()loss=BinaryCrossentropy()builder=ModelBuilder(dimension=dims,model=linear,optimizer=optimizer,loss=loss)trainer=Trainer(builder=builder,store_path=tempfile.TemporaryDirectory().name)trainer.fit(x=tf.convert_to_tensor([1,2,3,4,5]),y=tf.convert_to_tensor([2,3,4,5,6]),batch_size=16,epochs=10,validation_data=(tf.convert_to_tensor([1,2,3,4,5]),tf.convert_to_tensor([2,3,4,5,6]),),)This runs the training of the model but also saves metric information, and the model itself to the file system.Finally, the model can be used or restored like this:importtempfileimporttensorflowastffromaistnet.core.trainerimportTrainerbuilder,trainer=Trainer.load(tempfile.TemporaryDirectory().name)trainer.fit(x=tf.convert_to_tensor([1,2,3,4,5]),y=tf.convert_to_tensor([2,3,4,5,6]),batch_size=16,epochs=20,initial_epoch=10,validation_data=(tf.convert_to_tensor([1,2,3,4,5]),tf.convert_to_tensor([2,3,4,5,6]),),)Use your own loss functionAistNET lets you create or own loss function and other custom implementations. It tries to automatically locate them and
to save them along with the model and the configuration. Furthermore, it restores the custom implementations with the loading
of a saved state.fromtypingimportTupleimporttempfileimporttensorflowastffromtensorflow.kerasimportlayersfromtensorflow.keras.layersimportInput,Densefromaistnet.core.builderimportModelBuilderfromaistnet.core.trainerimportTrainerstore_path=tempfile.TemporaryDirectory().namedims=(1,)optimizer="adam"metrics=["accuracy"]defloss(y_true:tf.Tensor,y_pred:tf.Tensor)->tf.Tensor:return(y_true-y_pred)**2defbuild(dimension:Tuple[int])->Tuple[layers.Layer,layers.Layer]:in_=Input(shape=dimension)d1=Dense(12,activation="relu")(in_)d2=Dense(8,activation="relu")(d1)d3=Dense(1)(d2)returnin_,d3builder=ModelBuilder(builder=build,dimension=dims,optimizer=optimizer,loss=loss,metrics=metrics)trainer=Trainer(builder=builder,store_path=store_path)# train and save the statetrainer.fit(x=tf.convert_to_tensor([1,2,3,4,5]),y=tf.convert_to_tensor([2,3,4,5,6]),batch_size=16,epochs=10,validation_data=(tf.convert_to_tensor([1,2,3,4,5]),tf.convert_to_tensor([2,3,4,5,6]),),)# load the previous state and continue training in a new sessionbuilder_new,trainer_new=Trainer.load(store_path)x_true=tf.convert_to_tensor([[1.0]])x_pred=tf.convert_to_tensor([[1.0]])# check the reconstucted custom loss function and the previous epoch stateprint(builder_new.loss(x_true,x_pred)==loss(x_true,x_pred))print(trainer_new.run_metadata["epochs"]==10)builder_new.model.fit(x=tf.convert_to_tensor([1,2,3,4,5]),y=tf.convert_to_tensor([2,3,4,5,6]),batch_size=16,epochs=20,initial_epoch=10,validation_data=(tf.convert_to_tensor([1,2,3,4,5]),tf.convert_to_tensor([2,3,4,5,6]),),)FAQWhy another Tensorflow wrapper?The reason for AistNET is the simplification of neural networks. It provides functionality to build, parameterize and
train models with any architecture. The model can be customized in every way.Is there a possibility to use AistNET with other frameworks likePyTorch?Not currently; AistNET only supports TensorFlow. If you want to use PyTorch we
recommendPyTorch Lightning, which follows a similar wrapping
philosophy.Does AistNET support any other model architectures?For the moment AistNET has a builder function for U-Nets with skip layers. But we are going to extend AistNET step by
step.ContributingFirst make sure to read our generalcontribution guidelines.In addition to that, the following applies to this repository:Due to the focus on performance, dependencies (except as bridges to other loggers) are not allowed. If you have a very
good reason to add a dependency please state so in the corresponding issue / pull request.LicenceCopyright (c) 2020 the original author or authors. DO NOT ALTER OR REMOVE COPYRIGHT NOTICES.This Source Code Form is subject to the terms of the Mozilla Public License, v. 2.0. If a copy of the MPL was not
distributed with this file, You can obtain one athttps://mozilla.org/MPL/2.0/.ResearchIf you are going to use this project as part of a research paper, we would ask you to reference this project by citing
it.
|
aistock
|
To be continued
|
aistore
|
AIS Python ComponentsThis package contains the AIStore Python SDK, PyTorch integration, and Botocore patch.
See the README files in each module for usage details:SDKPyTorchBotocoreReferencesAIStore GitHubDocumentationAIStore pip packageVideos and demos
|
aistore-pypi
|
Failed to fetch description. HTTP Status Code: 404
|
ai-stream-interact
|
AI Stream Interact🧠🎞️ - LLM interaction capabilities on live USB camera video stream.This package can easily be extended to accommodate different LLMs, but for this first version interactions were implemented only forGoogle's Gemini Pro & Vision Pro ModelsNote: This is a basic Alpha version that's been written & tested in Ubuntu Linux only so it may have unexpected behavior with other operating systems.Installation:pip install ai-stream-interactNote that pip install might take a while as it will also installcoqui-aifor Text to Speech. Although TTS is partially implemented it is not turned on by default due to some glitchy behavior. (This will be fixed in future releases.)Example Usage:You need a Gemini API key. (if you don't already have one you can get onehere).Have a USB camera connected.runaisi --llm geminito enter the AI Stream Interact🧠🎞️ main menu.(note that you can always go back to the main menu from the video stream by press "m" while having the video stream focused.)Enter the API key or press enter if you've added it to .env.You will be asked to enter your camera index. Currently there is no straight forward way to identify the exact index for your camera's name due to how open-cv enumerates such indicies so you'll have to just try a few times till you get the right one if you have multiple camers connected. If you have one camera connected you can try passing "-1" as in most cases it'll just pick that one.Now you're in!. You have access to 3 types of interactions as of today.Detect Default:This fires up a window with your camera stream and whenever you press "d" will identify the object the camera is looking at. 
(Make sure to press "d" with the camera window focused and not your terminal).Detect with Custom Prompt:Use this to write up a custom prompt before showing the model an object for custom interactions beyond just identifying objects.Interactions:This just allows for back & forth chat with the model.Troubleshooting:Errors:google.api_core.exceptions.FailedPrecondition: 400 User location is not supported for the API use.:This is specific to Gemini as they currently do not provide general availability to all regions, so you need to make sure your region is supportedhere
|
aistrigh-nlp
|
AistrighNLPAistrighNLP is a collection of tools and models used for Aistrigh, the BT Young Scientist 2021 project. Our aim is to bring Irish into the modern era with NLP tools to give it parity with English.
The tools included are based around the work inNeural Models for Predicting Celtic Mutations(Scannell, 2020). Included is all the tools needed to create a demutated Irish corpus, which can be used in all sorts of NLP tasks, and a model to reinsert them. For the Python API docs visitAistrighNLP Python APIInstalling the PackageAistrighNLP can be downloaded usingpippip install aistrigh-nlpLowercasingWhen lowercasing either Irish or Scots Gaelic for prediciting mutations, you must be aware of special cases outlined in the paper above. Our lowercaser handles thataistrigh-nlp lowercase -i input.txt -o output.txtRemoving mutationsTo remove mutations from an entire dataset for use for NLP tasks (like Machine Translation) usedemutate-corpus.-l/--langmust take ISO 639 language codes like 'ga'.aistrigh-nlp demutate-corpus -i input.txt -o output.txt -l gaTo remove mutations with a 'window' on either side to train a neural network, usedemutate-window, with-wset to your desired window length on each sideaistrigh-nlp demutate-window -i input.txt -o output.csv -l ga -w 16Predicting the mutationsTo predict mutations on each word, usepredict-mutations. As of right now, it's only compatible with PyTorch+Torchtext models but we are working on expanding to TensorFlow and Keras. You'll need your vocab, labels and model checkpoint in the same folder (-d/--data). 
We provide default models to be usedhere.aistrigh-nlp predict-mutations -i input.txt -o output.txt -w 16 -d nn_100kApplying the predicted mutationsTo apply the mutations predicted bypredict-mutations, useapply-mutations.aistrigh-nlp apply-mutations -i input.txt -o output.txt -l gaScoring Celtic NMT models using Standard and Demutated BLEUTo score NMT models using both these metrics run;aistrigh-nlp bleu -r reference.txt -p predictions.txt -l gaIf you're scoring a demutated NMT model and haven't reapplied mutations, pass your demutated reference (-d) and predictions, and the original reference (-r).aistrigh-nlp bleu -d demutated_reference.txt -r reference.txt -p predictions.txt -l gaNOTEAistrighNLP uses PyTorchTracesto save the full computational graphs as checkpoints. This way, the model architecture need not be declared into hard-coded scripts. See thisStackOverflow Threadfor instructions to save a traced checkpoint.
|
aistudio
|
aistudioSimple Studio for AI Experiments.
|
aistudio-cognition
|
No description available on PyPI.
|
aistudio_notebook
|
Failed to fetch description. HTTP Status Code: 404
|
aistudioredis
|
No description available on PyPI.
|
aistudio-scheduler-lock
|
No description available on PyPI.
|
aistudio-sdk
|
Python SdkA Python SDK that wraps the Yiyan (ERNIE Bot) API, supporting AIStudio developers who build Yiyan-based projects/applications/plugins.
It also wraps the AIStudio model hub API, supporting AIStudio developers who work with the model hub.Quick startHow to build, install and runpythonsetup.pybdist_wheel
pipuninstall-yaistudio-sdk
pipinstalloutput/dist/aistudio_sdk-0.1.7-py3-none-any.whlTestingHow to run automated testsContributingPatch contribution process and quality requirementsVersion informationVersion information and change history for this project can be viewed here.Maintainersownersxiangyiqing([email protected])committersxiangyiqing([email protected])linyichong([email protected])suoyi([email protected])Discussion
|
ai-subtitler
|
No description available on PyPI.
|
ai-sudoku-solver
|
ai-Sudoku-SolverSolving Sudoku Puzzles using Artificial Neural NetworksTable of Contentsai-Sudoku-SolverTable of ContentsInstallationUsageModel GalleryReferencesInstallationpipinstallai-sudoku-solverUsageInstantiate a SudokuSolver objectfromai_sudoku_solverimportSudokuSolversolver=SudokuSolver("Ritvik19/sudoku-net-v1")Call the model on your puzzlespuzzle=np.array([[[0,0,4,3,0,0,2,0,9],[0,0,5,0,0,9,0,0,1],[0,7,0,0,6,0,0,4,3],[0,0,6,0,0,2,0,8,7],[1,9,0,0,0,7,4,0,0],[0,5,0,0,8,3,0,0,0],[6,0,0,0,0,0,1,0,5],[0,0,3,5,0,8,6,9,0],[0,4,2,9,1,0,3,0,0]]])solution=solver(puzzle)# array([[# [8, 6, 4, 3, 7, 1, 2, 5, 9],# [3, 2, 5, 8, 4, 9, 7, 6, 1],# [9, 7, 1, 2, 6, 5, 8, 4, 3],# [4, 3, 6, 1, 9, 2, 5, 8, 7],# [1, 9, 8, 6, 5, 7, 4, 3, 2],# [2, 5, 7, 4, 8, 3, 9, 1, 6],# [6, 8, 9, 7, 3, 4, 1, 2, 5],# [7, 1, 3, 5, 2, 8, 6, 9, 4],# [5, 4, 2, 9, 1, 6, 3, 7, 8]# ]])Model Gallerymodel# parameterstrained onaccuracysudoku-net-v13,784,7291M puzzles98.138sudoku-net-v23,784,72910M puzzles98.212References1 million Sudoku games9 Million Sudoku Puzzles and Solutions
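A solution returned by the model can be sanity-checked independently. The following validator is not part of ai-sudoku-solver; it is a minimal pure-Python sketch that checks every row, column and 3x3 box of a 9x9 grid contains the digits 1-9 exactly once, using the solved grid shown above:

```python
def is_valid_solution(grid):
    """Return True if the 9x9 grid is a solved Sudoku: every row,
    column and 3x3 box must contain the digits 1..9 exactly once."""
    digits = set(range(1, 10))
    rows = [set(row) for row in grid]
    cols = [set(grid[r][c] for r in range(9)) for c in range(9)]
    boxes = [
        set(grid[r][c]
            for r in range(br, br + 3)
            for c in range(bc, bc + 3))
        for br in (0, 3, 6) for bc in (0, 3, 6)
    ]
    # A set of 9 distinct digits 1..9 equals `digits`; duplicates shrink the set
    return all(group == digits for group in rows + cols + boxes)

# Solved grid taken from the README example above
solution = [
    [8, 6, 4, 3, 7, 1, 2, 5, 9],
    [3, 2, 5, 8, 4, 9, 7, 6, 1],
    [9, 7, 1, 2, 6, 5, 8, 4, 3],
    [4, 3, 6, 1, 9, 2, 5, 8, 7],
    [1, 9, 8, 6, 5, 7, 4, 3, 2],
    [2, 5, 7, 4, 8, 3, 9, 1, 6],
    [6, 8, 9, 7, 3, 4, 1, 2, 5],
    [7, 1, 3, 5, 2, 8, 6, 9, 4],
    [5, 4, 2, 9, 1, 6, 3, 7, 8],
]
print(is_valid_solution(solution))  # True
```

For a numpy array returned by the solver, `solution[0].tolist()` gives the nested-list form this sketch expects.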
|
ai-supporter
|
ai-supporterhttps://pypi.org/project/ai-supporterpip install ai-supporterfrom ai_supporter.tts import tts
text = "this is text"
file = "speak.mp3"
tts(text, file)from ai_supporter.tts import speak
text = "this is text"
speak(text)
|
aisUtils
|
aisUtilsA tool package for MITS AIS users.Installpip install aisUtilsDevelopgit clone https://github.com/aisUtils.gitUsageThe package can satisfy fundamental demand for the use of AIS, which is what I usually do.PreprocessingSuppose we start with AIS messages. We need to decode, crawl static information and perform filtering operations.decoderThe decoder process requires the package—libais.Parameters include source file path, destination file path and original timestamp, no return value. If you want to decode multiple files, it is easy to use a loop outside the call.fromaisUtilsimportPreprocessingasppsourcePath=r'D:/data/2021-08-28_CST.txt'# source file pathdesPath=r'D:/data/2021-08-28_output.csv'# destination file pathtimestr='2021-08-28:00:00:00;'# AIS initial timestamppp.decode(sourcePath,desPath,timestr)crawlIf you need some static data such as length, ship type, crawl them from a mmsi list. Then you will get a static data table.fromaisUtilsimportPreprocessingasppsourcePath=r'D:/python/data'# a directory stored all ais filesmmsiPath=r'D:/python/mmsiall.csv'# store mmsiliststaticDataPath=r'D:/python/staticData.csv'# store ais static data# get the mmsilistmmsiList=pp.getMmsiList(sourcePath)# write the mmsilistpp.writeMmsiList(mmsiPath,mmsiList)# crawlpp.crawl(mmsiPath,staticDataPath)matchingMatch the static data — length and ship type with the ais files.fromaisUtilsimportPreprocessingasppstaticDataPath=r'D:/python/staticData.csv'aisPath=r'D:/data/2021-08-28_output.csv'desPath=r'D:/data/2021-08-28_matching.csv'pp.match(staticDataPath,aisPath,desPath)othersThis part includes area, speed and ship type filtering.fromaisUtilsimportPreprocessingasppsourcePath=r'D:/data/2021-08-28_matching.csv'desPath=r'D:/data/2021-08-28_prep.csv'max_Lon=122.757min_Lon=122.145max_Lat=30.749min_Lat=30.409speed=[2,20]shiptype_list=[6,7,8]pp.filtering(sourcePath,desPath,max_Lon,min_Lon,max_Lat,min_Lat,speed,shiptype_list)if you want to read the file, you can use the code block, which 
returns an array type.importpandasaspddata=pd.read_csv(r'D:/data/2021-08-28_prep.csv')data_p=data.values#print(type(data_p)) #numpy.ndarrayInterpolationPackage scipy has powerful statistical tools.CSDN linkfromscipyimportinterpolateasinterfromscipy.interpolateimportlagrangeimportnumpyasnp# sort the data according to the index# data = data[np.lexsort((data[:,1], data[:,0]))]xli=np.arange(0,constants.pi*2,0.1)# Interval pointsyli=inter.interp1d(x,y,kind="linear")(xli)# Linear interpolationycub=inter.interp1d(x,y,kind="cubic")(xli)# Cubic spline interpolationyquadratic=inter.interp1d(x,y,kind="quadratic")(xli)# Quadratic spline interpolationynear=inter.interp1d(x,y,kind="nearest")(xli)# Nearest neighbor interpolation# Lagrange interpolationdeflagrangePoly(poly,x,length):result=[]foriinrange(len(x)):subresult=0forjinrange(length):subresult+=poly[j]*(x[i]**j)result.append(subresult)returnresultx=np.linspace(0,constants.pi*2,4)y=np.cos(x**2/3+4)poly=lagrange(x,y)length=len(x)ylagrange=lagrangePoly(poly,xli,length)CalculationHaversinefromhaversineimporthaversinelyon=(45.7597,4.8422)# (lat, lon)paris=(48.8567,2.3508)haversine(lyon,paris)scipy spatialfromscipy.spatial.distanceimportcdistcdist(x1,x2,metric='euclidean')# Euclidean distancecdist(x1,x2,metric='minkowski')cdist(x1,x2,metric='jaccard')ProjectSome code for projects, including section flow statistics, ship type statistics and track point drawing.DescriptionAlthough I have created many functions in the package according to my own ideas, there are still many possible requirements that can be improved later. I am tired. Thanks to senior fellow apprentice Chunshen for supporting the development of the crawl function.
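For reference, the Haversine great-circle distance used in the Calculation section can also be computed with only the Python standard library. This is a sketch, not part of aisUtils; the Earth radius of 6371 km is the usual spherical approximation:

```python
import math

def haversine_km(p1, p2, radius_km=6371.0):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1 = map(math.radians, p1)
    lat2, lon2 = map(math.radians, p2)
    dlat = lat2 - lat1
    dlon = lon2 - lon1
    # Haversine formula: a is the square of half the chord length
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

lyon = (45.7597, 4.8422)   # (lat, lon)
paris = (48.8567, 2.3508)
print(haversine_km(lyon, paris))  # ~392.2 km, matching the haversine package
```

Results differ from the haversine package only in the fourth decimal place, since that package uses a slightly different mean Earth radius.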
|
ais-wordnet
|
docs:https://github.com/minhdc/ais-wordnet
|
ais-wordnet-sim
|
ais-wordnet-simInstallpip install ais-wordnet-simUsageGet list of similar sentencesGet a list which contains similar sentences (include original sentence)fromais_wordnet_simimportsimilar_sentencessentence='Thế lực thù đich có những âm mưu gì'# default limit = 1000result=similar_sentences(sentence,limit=5)result>>>['thế lực thù đich có những âm mưu gì','thế lực thù đich có những thủ đoạn gì','thế lực thù đich có những mưu kế gì','thế lực thù đich có những mưu mẹo gì','thế lực thù đich có những mưu mô gì']Create a categoryCreate a Category:{ question_list, answer ,topic }from a question and an answerfromais_wordnet_simimportgenerate_categoryquestion='Thế lực thù đich có những âm mưu gì'answer='Âm mưu phá hoại nhà nước'# default limit = 1000generate_category(question,answer,topic,limit=5)Get all categoriesGet all Categories generated in databasefromais_wordnet_simimportget_category_dataresult=get_category_data()result>>>[{'_id':ObjectId('5d86f35b944d00e06eb3a76b'),'question_list':['thế lực thù đich có những âm mưu gì','thế lực thù đich có những thủ đoạn gì','thế lực thù đich có những mưu kế gì','thế lực thù đich có những mưu mẹo gì','thế lực thù đich có những mưu mô gì'],'answer':'Âm mưu phá hoại nhà nước'}]Create Synonyms database from excel fileNote: Drop old database before creating new databaseExample Exel format:Sheet: n, a, v, r, eEach row of sheet: one or multiple words, synonyms words follow original wordfromais_wordnet_simimportadd_synonyms_excelfile='wordnet.xlsx'add_synonyms_excel(file)Enrich AIML FileCreate an enriched AIML file from original AIML filefromais_wordnet_simimportaiml_enrichaiml_enrich('original.aiml','enriched.aiml')
|
aisy
|
No description available on PyPI.
|
aisy-database
|
Database package to support AISY Frameworkpip install aisy-sca
pip install aisy-database
If you use our framework, please consider citing:
@misc{AISY_Framework, author = {Guilherme Perin and Lichao Wu and Stjepan Picek}, title = {{AISY - Deep Learning-based Framework for Side-Channel Analysis}}, howpublished = {AISyLab repository}, note = {{\url{https://github.com/AISyLab/AISY_Framework}}}, year = {2021} }
|
aisy-sca
|
# AISY - Deep Learning-based Framework for Side Channel Analysis
Welcome to the first deep learning-based side-channel analysis framework.
This AISY Framework was developed by AISYLab from TU Delft.
If you use our framework, please consider citing:
@misc{AISY_Framework,
author = {Guilherme Perin and Lichao Wu and Stjepan Picek},
title = {{AISY - Deep Learning-based Framework for Side-Channel Analysis}},
howpublished = {AISyLab repository},
note = {{\url{https://github.com/AISyLab/AISY_Framework}}},
year = {2021}
}
|
ait
|
UNKNOWN
|
ai-talks
|
AI TalksA ChatGPT API wrapper, providing a user-friendly Streamlit web interface.Enhance your ChatGPT experience with our user-friendly API wrapper, featuring a seamless Streamlit web interface. Effortlessly interact with ChatGPT, while enjoying an intuitive and responsive design. Discover simplified access to advanced AI technology in just a few clicks.SetupClone repo:gitclonehttps://github.com/dKosarevsky/AI-Talks.gitInstall dependencies:pipinstall-rrequirements.txtAdd API key to.streamlit/secrets.toml:[api_credentials]api_key="sk-..."UsageTo run the app use the following command:bashrun.shAnother way:streamlitrunai_talks/chat.pyOnce the script is started, you can go to the URLhttp://localhost:8501to start using the bot.LicenseThis project is released under the MIT License.DonationSupport projectAI Talkscollects donations solely for the purpose of paying for theOpen AIAPI.
This allows AI chat access to be provided to all users.
Support us for joint development and interaction with the intelligence of the future!Crypto:USD Tether (USDT TRC20):TMQ5RiyQ7bv3XjB6Wf6JbPHVrGkhBKtmfAToncoin (TON):UQDbnx17N2iOmxfQF0k55QScDMB0MHL9rsq-iGB93RMqDhIHWorld:Buy Me A Coffeeko-fiPayPalRussia:TinkoffboostyCloudTips (Tinkoff)
|
ait-bsc
|
Accelerator Integration Tool (AIT)The Accelerator Integration Tool (AIT) automatically integrates OmpSs@FPGA and OmpSs-2@FPGA accelerators into FPGA designs using different vendor backends.This README should help you install the AIT component of the OmpSs@FPGA toolchain from the repository.
However, it is preferable to use the pre-built Docker image with the latest stable toolchain.
They are available atOmpSs@FPGA pre-built Docker imagesandOmpSs-2@FPGA pre-built Docker images.Moreover, there are pre-built SD images for the currently supported board families: Zynq7000 and Ultrascale.
They are also available atOmpSs@FPGA pre-built SD imagesandOmpSs-2@FPGA pre-built SD images.PrerequisitesPython 3.7 or laterpipVendor backends:Xilinx Vivado 2018.3 or laterGit Large File StorageThis repository uses Git Large File Storage to handle relatively-large files that are frequently updated (i.e. hardware runtime IP files) to avoid increasing the history size unnecessarily. You must install it so Git is able to download these files.Follow instructions on their website to install it.Vendor backendsXilinx VivadoFollow installation instructions from Xilinx
Vivado, Vivado HLS and SDK, as well as device support for the devices you're working with, should be enabled during setup.
However, components can be added or removed afterwards.Current version supports Vivado 2018.3 onwards.InstallationYou can usepipto easily installaiton your system:python3 -m pip install ait-bscDevelopmentMake sure you have the following packages installed on your system.git-lfs(Git Large File Storage)setuptools >= 61.0(setuptools)Clone AIT's repositoryFrom GitHub:git clone https://github.com/bsc-pm-ompss-at-fpga/ait.gitFrom our internal GitLab repository (BSC users only):git clone https://pm.bsc.es/gitlab/ompss-at-fpga/ait.gitEnable Git LFS and installcd ait
git lfs install
git lfs pull
export AIT_HOME="/path/to/install/ait"
export DEB_PYTHON_INSTALL_LAYOUT=deb_system
python3 -m pip install . -t $AIT_HOMEAdd the installed binaries to your PATHexport PATH=$AIT_HOME/bin:$PATH
export PYTHONPATH=$AIT_HOME:$PYTHONPATHTestsPrerequisitespython3-flake8python3-unittestStyle testingThe python code follows PEP 8 style guide which is verified using theflake8tool.To check the current source code just executepython3 -m flake8.Unit testingThetestfolder contains some unitary tests for python sources.To run all tests the commandpython3 -m unittestcan be executed in the root directory of the repository.
|
ait-core
|
The AMMOS Instrument Toolkit (Formerly the Bespoke Links to Instruments
for Surface and Space (BLISS)) is a Python-based software suite
developed to handle Ground Data System (GDS), Electronic Ground Support
Equipment (EGSE), commanding, telemetry uplink/downlink, and sequencing
for instrument and CubeSat Missions. It is a generalization and expansion
of tools developed for a number of ISS
missions.Getting StartedYou can read through theInstallation and Configuration
Pagefor
instructions on how to install AIT Core.You can read through theNew Project Setup
Pagefor instructions on how to use AIT on your next project.Join the CommunityThe project’sUser and Developer Mailing Listis the best way to communicate with the team, ask questions, brainstorm plans for future changes, and help contribute to the project.This project exists thanks to the dedicated users, contributors, committers, and project management committee members. If you’d like to learn more about how the project is organized and how to become a part of the team please check out theProject Structure and Governancedocumentation.ContributingThank you for your interest in contributing to AIT! We welcome contributions from people of all backgrounds and disciplines. While much of the focus of our project is software, we believe that many of the most critical contributions come in the form of documentation improvements, asset generation, user testing and feedback, and community involvement. So if you’re interested and want to help out please don’t hesitate! Send us an email on the public mailing list below, introduce yourself, and join the community.CommunicationAll project communication happens via mailing lists. Discussions related to development should happen on the publicDeveloper and User Mailing List. If you’re new to the community make sure to introduce yourself as well!Dev InstallationAs always, we encourage you to install AIT into a virtual environment of your choosing when you set up your development environment. AIT usespoetryfor package management. Before setting up your development environment you will need the following installed and ready to use:A virtual environment “manager” of your choosing with a configured and activated virtual environment. Since AIT usespoetryyou can consider leveraging itsenvironment managementfunctionality as well.Usingpoetry shellis also very convenient for development testing and simplifying environment management. 
You should make sure to install the package into the shell to get access to the development dependencies as well. It’s recommended that you usepoetry shellwhen running the tox builds because other virtual environment managers will often prevent tox from accessingpyenv-installed Python versions.pyenvso you can easily install different Python versionspoetryinstalled either to your specific virtual environment or system-wide, whichever you prefer.Install the package in “editable” mode with all the development dependencies by running the following:poetry installAs with a normal installation you will need to pointAIT_CONFIGat the primary configuration file. You should consider saving this in your shell RC file or your virtual environment configuration files so you don’t have to reset with every new shell:export AIT_CONFIG=/path/to/ait-core/config/config.yamlYou should configurepre-commitby running the following. This will install our pre-commit and pre-push hooks:pre-commit install && pre-commit install -t pre-pushFinally, you should install the different Python versions that the project supports so they’re accessible totox. Usingpyenvis the easiest way to accomplish this:cat .python-version | xargs -I{} pyenv install --skip-existing {}Dev ToolsToxUsetoxto run a thorough build of the toolkit that checks test execution across different Python versions, verifies the docs build, runs the linting pipeline, and checks that the repo packages cleanly. Make sure you runtoxin Poetry’sshellwithout another virtual environment active to avoid problems withtoxfinding different python versions for the tests. You can run all of the development tools with:toxYou can see the availabletoxtest environments by passing-land execute a specific one by passing its name to-e. Runtox -hfor more info.TestsUsepytestto manually run the test suite:pytestOr viatoxfor a specific python version:tox -e py310Code ChecksWe runblack,flake8,mypy, and a few other minor checkers on the code base. 
Our linting and code check pipeline is run whenever you commit or push. If you’d like to run it manually you can do so with the following:pre_commit run --color=always {posargs:--all}Individual calls to the tools are configured in.pre-commit-config.yaml. If you’d like to run a specific tool on its own you can see how we call them there.You can run all the linting tools with tox as well:tox -e lintDocumentationAIT uses Sphinx to build its documentation. You can build the documentation with:poetry run build_sphinxTo view the documentation, opendoc/build/html/index.htmlin a web browser. If you just want to check that the docs build is working you can use tox:tox -e docsIf you need to update the auto-generated documentation you can run the following command to rebuild all of the package documentation:sphinx-apidoc --separate --force --no-toc -o doc/source ait --implicit-namespacesPlease make sure to update the docs if changes in a ticket result in the documentation being out of date.Project WorkflowIssue TrackingAll changes need to be made against one or more tickets for tracking purposes. AIT uses GitHub Issues along with Zenhub to track issue in the project. All tickets should have (outside of rare edge-cases):A concise titleAn in-depth description of the problem / request. If reporting a bug, the description should include information on how to reproduce the bug. Also include the version of the code where you’re seeing the bug.If you’re going to begin work on a ticket make sure to progress the ticket through the various Pipeline steps as appropriate as well as assigning yourself as an Assignee. If you lack sufficient permissions to do so you can post on the ticket asking for the above to be done for you.Commit MessagesAIT projects take a fairly standard approach to commit message formatting. You can checkout Tim Pope’s blog for a good starting point to figuring out how to format your commit messages. 
All commit messages should reference a ticket in their title / summary line:

Issue #248 - Show an example commit message title

This makes sure that tickets are updated on GitHub with references to commits that are related to them.

Commits should always be atomic. Keep solutions isolated whenever possible. Filler commits such as "clean up white space" or "fix typo" should be rebased out before making a pull request. Please ensure your commit history is clean and meaningful!

Code Formatting and Style

AIT makes a best-effort attempt at sticking with PEP-8 conventions. This is enforced automatically by black and checked by flake8. You should run the pre-commit linting pipeline on any changes you make.

Testing

We do our best to make sure that all of our changes are tested. If you're fixing a bug you should always have an accompanying unit test to ensure we don't regress!

Check the Developer Tips section below for information on running each repository's test suite.

Pull Requests and Feature Branches

All changes should be isolated to a feature branch that links to a ticket. The standard across AIT projects is to use issue-### for branch names where ### is the issue number found on GitHub.

The title of a pull request should include a reference to the ticket being fixed as mentioned for commit messages. The description of a pull request should provide an in-depth explanation of the changes present. Note, if you wrote good commit messages this step should be easy!

Any tickets that are resolved by the pull request should be referenced with GitHub's syntax for closing out tickets. Assuming the above ticket we would have the following in a pull request description:

Resolve #248

Changes are required to be reviewed by at least one member of the AIT PMC/Committers groups, tests must pass, and the branch must be up to date with master before changes will be merged. If changes are made as part of code review please ensure your commit history is cleaned up.
|
ait-dsn
|
AIT DSN provides APIs for connecting to spacecraft communication facilities via Consultative Committee for Space Data Systems (CCSDS) Space Link Extension (SLE) interfaces. AIT DSN supports the following interfaces:

- Return All Frames (RAF) interface (v4 and v5)
- Return Channel Frames (RCF) interface (v4 and v5)
- Forward Communications Link Transmission Unit (CLTU) interface (v4 and v5)
- CCSDS File Delivery Protocol Class 1 interface

Interface specifications are available in the CCSDS Blue Books.

Getting Started

You can read through the Installation Page for instructions on how to install AIT DSN.

Join the Community

The project's User and Developer Mailing List is the best way to communicate with the team, ask questions, brainstorm plans for future changes, and help contribute to the project.

This project exists thanks to the dedicated users, contributors, committers, and project management committee members. If you'd like to learn more about how the project is organized and how to become a part of the team please check out the Project Structure and Governance documentation.

Contributing

For information on how to contribute please see the AIT Contributing Guide.
|
ai-team-hwlloworld
|
No description available on PyPI.
|
ai-team-template-test
|
No description available on PyPI.
|
aitech
|
No description available on PyPI.
|
aitemplates
|
aitemplatesaitemplates is a Python package designed to simplify and streamline your work with the OpenAI API. It provides automatic function calling, helper classes, Python typing support, error checking, and a usage meter to help manage API costs. Additionally, aitemplates offers built-in examples of using ChromaDB and tools for efficient prompt engineering with OpenAI.FeaturesFunction Integration: Quick and easy integration with OpenAI functions for automatic execution and error handling.Error Checking: Automatically catch and handle errors during API calls.Usage Meter: Keep track of your OpenAI API usage with a built-in metering system.ChromaDB Integration: Work directly with ChromaDB from the aitemplates interface.Asynchronous Chat Completions: Use the-asncflag to run asynchronous chat completions. The built-inprint_everyoption prints every time a completion finishes in parallel. If you'd like to maintain the order of your completions, pass in thekeep_orderboolean asTrue.Prompt Engineering Examples: Get started quickly with included examples of prompt engineering techniques.Python Typing Support: Enjoy the benefits of Python's dynamic typing system while using OpenAI API.InstallationYou can install aitemplates directly from PyPI:pipinstallaitemplatesTo get the latest version of the package, you can also clone the repository and install the package by runningpip install -e .in the repository directory.Creating Jupyter Notebook TemplatesCreate Jupyter notebook templates for prompt engineering with the following command:aitemplatesname_of_notebookInclude a Chroma database in the template with the-dbflag:aitemplatesname_of_notebook-dbFor asynchronous chat completions, add the-asncflag:aitemplatesname_of_notebook-db-asncFor chat completions with function boilerplate add the-funcflag (async do not support functions):aitemplatesname_of_notebook-db-funcDocumentationThe package includes example notebooks, which provide comprehensive guides and demonstrations of the 
functionalities provided by aitemplates. To access these notebooks, access or clone the repository athttps://github.com/SilenNaihin/ai_templatesand navigate to the/notebooksdirectory.Here are the available notebooks:oai_examples.ipynb: Provides examples for the OpenAI API including functions.chroma_examples.ipynb: Demonstrates usage of ChromaDB.prompt_engineering_example.ipynb: A comprehensive guide on prompt engineering, including various techniques, usage of ChromaDB and the OpenAI library together.RequirementsBe sure to have a.envfile with theOPENAI_API_KEYdefined in the root of your project or your api key defined in some way for OpenAI. AWS env variables are for ChromaDB deploymenthttps://docs.trychroma.com/deploymentThis library doesn't currently support text models liketext-davinci-003. Streaming and logit_bias parameters are also not supportedContributingI welcome and appreciate contributions to aitemplates. If you'd like to contribute a feature, please submit a pull request. For bugs, please open a new issue, and I'll address it promptly.Future?https://twitter.com/ItakGol/status/1668336193270865921Support for other LLMsSupport for other dbs
|
ai-tensor
|
No description available on PyPI.
|
aiter
|
aiter -- Asynchronous Iterator Patterns

PEP 525 describes asynchronous iterators, a merging of iterators with async functionality. Python 3.6 makes legal constructs such as

async for event in peer.event_iterator:
    await process_event(event)

which is a huge improvement over using asyncio.Queue objects, which have no built-in way to determine "end-of-stream" conditions.

This module implements some patterns useful for Python asynchronous iterators. Documentation is available on readthedocs.io. A tutorial is available.

CAVEAT: This project is still in its infancy, and I reserve the right to rename things and cause other breaking changes.
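The end-of-stream behaviour described above can be demonstrated with a plain PEP 525 async generator from the standard library alone; this is an illustrative sketch (the events generator and names are invented here, not part of aiter's API):

```python
import asyncio

async def events():
    # A PEP 525 async generator: iteration simply ends when the
    # generator returns, so no sentinel "end-of-stream" value is
    # needed the way it would be with asyncio.Queue.
    for i in range(3):
        await asyncio.sleep(0)
        yield f"event-{i}"

async def consume():
    received = []
    async for event in events():
        received.append(event)
    return received

received = asyncio.run(consume())
print(received)  # ['event-0', 'event-1', 'event-2']
```

The `async for` loop exits on its own when the generator is exhausted, which is exactly the property a queue-based design lacks.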
|
ai-terminal
|
MistralTerminal - chatbot in your Linux terminalOverviewMistralTerminal is a command-line interface for interacting with MistralAI. It allows users to send questions directly from the terminal and receive concise answers from MistralAI. This script is designed for simplicity and ease of use. It keeps the conversation history, so you can continue interact with it without giving the context every time. The input can be multiline. If the script suggest code blocks, you can easily copy them in the clipboard. It also supports colored output for enhanced readability.LicenseLicense: BSD 3 clauseAuthorV.A. Yastrebov, CNRS, MINES Paris - PSL, France, Jan 2024.External ContributorsBasile MarchandRequirementsAn API key for MistralAI, check onmistral.aiInstallationSet your MistralAI API key in your environment:exportMISTRAL_API_KEY='your_api_key_here'From Github sourcesgitclonehttps://github.com/vyastreb/ai-terminal.gitcdai-terminalInstall required dependencies and theaicommand using :pipinstall.From PyPiComming SoonOptions--model/-m: Sets the MistralAI model to be used (mistral-tiny, mistral-small or mistral-medium). Default is 'mistral-tiny'.--temp/-T: Sets the temperature for the AI's responses. Default is 0.2.--tokens/-t: Sets the maximum number of tokens in the response. Default is 2.--verbose/-v: If set, prints the question or the whole history.--no-chat/-n: If set, does not keep the discussion in memory.--unit-test/-u: Unit tests.--help/-h: Displays the help message and usage instructions.The default model is 'mistral-tiny', which is a smaller model that is faster to load and run. The default temperature is 0.5, which is a good value for most questions. The default token count is 350, which is a reasonable length for most answers. The verbose option is useful for checking whether your questions and history were correctly parsed by the script.Configuration fileAll parameters could be defined in the script or in the config file~/.mistralai/config.json.
If the config file is not found, the script uses the default parameters.
If the config file exists, its parameters are used and those defined in the script are ignored.
Options passed on the command line take precedence over the config file.

Example of a config file:

{
"model": "mistral-tiny",
"max_memory": 31,
"max_tokens": 1000,
"waitingTime": 180,
"max_line_length": 80,
"temperature": 0.5
}

Usage

ai [options]

This command starts a prompt line > where your question can be written.

Examples:

ai
> How to convert jpg to png in linux?

ai --temp 0.0
> What is the meaning of life?

ai --model mistral-tiny --temp 0.8 --tokens 5000
> What is the best (according to Parisians) cheese in France?

ai -m mistral-small -T 0.8 -t 500 -v
> What is the visible EM spectrum?

Features

- Can be run from anywhere in the terminal
- Supports multi-line input
- Remembers past questions
- If one code block is shown, it is automatically stored in the clipboard
- If several code blocks are shown, it suggests storing the one you want in the clipboard
- Keeps the conversation history in a local file for a (user-defined) time
- Colored output for enhanced readability
- Adjustable parameters for model, temperature, and token count
- Supports multi-line responses with automatic line wrapping

Notes

Ensure your terminal supports ANSI color codes for the best experience.
The history of the last messages (31 by default) is stored in ~/.mistralai/history.txt for 3 minutes by default. You can change the number of stored messages and the retention time in the script or in the config file: max_memory and waitingTime.
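The precedence described above (built-in defaults, then the config file, then explicit command-line options) can be sketched as follows. load_settings is a hypothetical helper for illustration, not part of MistralTerminal; the default values are the ones stated in this README:

```python
import json
import os

DEFAULTS = {"model": "mistral-tiny", "temperature": 0.5, "max_tokens": 350}

def load_settings(cli_args, config_path="~/.mistralai/config.json"):
    # Defaults are overridden by the config file, which is in turn
    # overridden by any options given on the command line.
    settings = dict(DEFAULTS)
    path = os.path.expanduser(config_path)
    if os.path.exists(path):
        with open(path) as f:
            settings.update(json.load(f))
    # Only CLI options the user actually passed (non-None) win.
    settings.update({k: v for k, v in cli_args.items() if v is not None})
    return settings

# With no config file present, only explicit CLI options override defaults:
print(load_settings({"temperature": 0.8, "model": None},
                    config_path="./no_such_config.json"))
```

This mirrors the documented behaviour: the file replaces the in-script defaults, but a flag such as --temp always wins.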
|
aiter-timeouts
|
aiter-timeouts

Timeout functionality for asynchronous iterators. Supports timeouts on the total time to exhaust an asynchronous iterator, or the time it takes for any given step.

Supports Python 3.7+.

Example

Just wrap your async iterator in a call to with_timeout, like so:

import asyncio

from aiter_timeouts import with_timeout

async def async_iter():
    for i in range(10):
        await asyncio.sleep(0.5)
        yield i

try:
    async for val in with_timeout(async_iter(), timeout=6, timeout_per_step=1):
        ...
except IterationTimeoutError as e:
    print(f"step {e.step} took too long")
except IteratorTimeoutError as e:
    print(f"only reached step {e.step} before timing out")

License

MIT License
Copyright (c) 2023 Elias Gabriel
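A per-step timeout of this kind can be sketched with plain asyncio.wait_for. This illustrates the idea only; it is not the library's implementation (which raises its own IterationTimeoutError rather than asyncio.TimeoutError):

```python
import asyncio

async def with_step_timeout(aiterable, timeout_per_step):
    # Re-yield items from an async iterator, failing if any single
    # __anext__() call takes longer than timeout_per_step seconds.
    it = aiterable.__aiter__()
    while True:
        try:
            item = await asyncio.wait_for(it.__anext__(), timeout_per_step)
        except StopAsyncIteration:
            return
        yield item

async def fast_iter():
    for i in range(3):
        await asyncio.sleep(0.01)
        yield i

async def main():
    return [v async for v in with_step_timeout(fast_iter(), timeout_per_step=1.0)]

values = asyncio.run(main())
print(values)  # [0, 1, 2]
```

A step that overruns would surface as asyncio.TimeoutError here; the library wraps that in richer exceptions carrying the step number.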
|
aitertools
|
Async versions of the itertools features.

Example Usage

The behaviour of each feature in this project is designed to match, one-to-one,
the behaviour of each feature in the Python itertools module. The primary
difference of this package is that all features can additionally consume
async functions and async iterables in addition to the sync versions. All
functions also return async iterables.

Wrapping Sync Code

When working with sync iterables in an async world you may want to wrap things
in an async interface for interoperability with other async tools. For example,
a typical sync call to chain might look like:

import itertools

iter1 = (1, 2, 3, 4)
iter2 = (5, 6, 7, 8)
iter3 = (9, 10, 11, 12)
for value in itertools.chain(iter1, iter2, iter3):
    print(value)

The above would output the numbers 1 - 12 in the order shown. However, in the async
world not all iterables are tuples and lists. The same thing can be
accomplished with the async chain:

iter1 = (1, 2, 3, 4)
iter2 = (5, 6, 7, 8)
iter3 = (9, 10, 11, 12)
async for value in aitertools.chain(iter1, iter2, iter3):
    print(value)

The above behaves exactly as the sync version except that it exposes the async
iterable interface and works with async def.

Mixing Sync and Async

All functions in the aitertools package accept both sync and async iterables.
This allows for the combination of the two when needed:

iter1 = (1, 2, 3, 4)
iter2 = some_async_iter()  # Async resolves to 5, 6, 7, 8
async for value in aitertools.chain(iter1, iter2):
    print(value)

Ensuring Async Interfaces

This package also provides a handful of tools for working with the async
interfaces:

aiter: Async function that counters iter. This function, when given an
async iterable, will return an async iterator. When given a sync
iterable it will wrap it and return an async iterator.

anext: Async function that counters next. This function calls the
__anext__() method of the iterator and returns the value. It,
like the next method, can optionally return a default value when
the iterator is empty.

alist: Async function that counters list. For use cases like
list(iterable) this function allows for await alist(iterable).

atuple: Async function that counters tuple. Similar to alist above.

Development Roadmap

The current release of this package is only missing the permutations,
combinations, and combinations_with_replacement features. The next release
should contain these missing features.

Additionally, several features from functools and the available globals are
slated for addition. For example, the filter, map, and reduce features
are good fits for this package and already have at least one itertools
counterpart added.

Testing

All tests are stored in the '/tests' subdirectory. All tests are expected to
pass for Python 3.5 and above. To run the tests, create a virtualenv and install
the test-requirements.txt list. After that, running the tox command will launch
the test suite.

License

Copyright 2015 Kevin Conway

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an “AS IS” BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

Contributing

Firstly, if you're putting in a patch then thank you! Here are some tips for
getting your patch merged:

Style

As long as the code passes the PEP8 and PyFlakes gates then the style is
acceptable.

Docs

The PEP257 gate will check that all public methods have docstrings. If you're
adding additional features from the itertools module try to preserve the
original docstrings if possible. Sometimes the original docs don't fit well
with the PEP257 format. In those cases it is OK to modify the docstring to fit.
If you're adding something new, like a helper function, try out the
napoleon style of docstrings.

Tests

Make sure the patch passes all the tests. If you're adding a new feature from
itertools be sure to add some of the original, standard library tests. The
tests used to validate the Python itertools module are found in the
'/Lib/test/test_itertools.py' file in the Python source. The original tests are
organized into large blocks of tests based on features. As much as possible,
break the tests up into individual units. Check out the existing tests for this
project for inspiration if needed.

If you're adding something totally new then make sure to throw in a few tests.
If you're fixing a bug then definitely add at least one test to prevent
regressions.
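The helper functions this package describes (aiter, alist, and friends) can be approximated with standard-library asyncio alone. The following is an illustrative sketch of the semantics, not the package's actual implementation; the _sketch names are invented here:

```python
import asyncio

async def aiter_sketch(iterable):
    # Counter to iter(): expose a plain sync iterable through the
    # async-iterator protocol so it can be consumed with `async for`.
    for item in iterable:
        yield item

async def alist_sketch(aiterable):
    # Counter to list(): collect an async iterable into a list,
    # enabling the `await alist(iterable)` usage described above.
    return [item async for item in aiterable]

collected = asyncio.run(alist_sketch(aiter_sketch((1, 2, 3))))
print(collected)  # [1, 2, 3]
```

Wrapping the sync tuple first is what lets the async collector treat sync and async inputs uniformly, which is the core idea behind mixing the two.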
|
aitest-cli
|
aitestThe aitest Command Line Interface is a unified tool to manage your aitest services.InstallingTo install this CLI tool you can run the below command for Linuxpip3 install aitest-cliTo install this CLI tool you can run the below command for windowspip install aitest-cliHow to get configuration information1) To get User Identifier follow below steps:
visit aitest -> settings -> copy User Identifier.
2) To get the Client ID and Client secret ID you need to send a request via email to this email id [email protected] . You will receive the Client ID and Client secret ID via email.How to useTo see help text, you can run:
aitest --help
aitest <command> --help
aitest <command> <subcommand> --help
ex:
1) If you want help to understand aitest CLI, you can run the following command:
Input :
aitest --help
output:
Usage: aitest [OPTIONS] COMMAND [ARGS]...
The aitest Command Line Interface is a unified tool to manage your aitest services.
To see help text, you can run:
aitest --help
aitest <command> --help
aitest <command> <subcommand> --help
Options:
--help Show this message and exit.
Commands:
configure If this command is run with no arguments, you will be prompted...
run If this command is run with testrun id as an argument, aitest...
status If this command is run with testrun id as an argument, aitest...
2) If you want to know how configure command works , you can run the following command:
Input:
aitest configure --help
Output:
aitest configure [OPTIONS]
If this command is run with no arguments, you will be prompted for configuration values such as your aitest User ID, Client ID and Client Secret ID.If your configure file
does not exist (the default location is ~/.aitest/configure), the aitest
CLI will create it for you.To keep an existing value, hit enter when
prompted for the value.
To save the configurations , you can run below command:
aitest configure
Options:
--help Show this message and exit.
--env This option sets the AITEST environment; by default it is set to PROD.
3) How to run aitest configure command?
Input:
aitest configure [-e optional]
Output:
Enter aiTest User Identifier :
Enter Client ID :
Enter Client Secret ID :Note : To access the aitest services using CLI commands you need to run aitest configure command first.4) How to run aitest run command?:
Input:
aitest run -id 69cd8eb8-9700-11ed-bdc9-3e71e7127aff -p test-password -w 5
Output:
Test created successfully
Test Name : Test 1
Testrun ID : 47782fb0-9711-11ed-809d-62cd909dddc7
Testrun is in progress; the result will be displayed once it completes.
test status : completed
+--------------+-----------------+--------------------------------------+--------+------------+
| browser_name | browser_version | test run result id | status | time taken |
+--------------+-----------------+--------------------------------------+--------+------------+
| firefox | 104 | 47c53f08-9711-11ed-b277-c6d8e5da78fe | fail | 0.001 |
| firefox | 104 | 47d350e8-9711-11ed-b277-c6d8e5da78fe | fail | 0.001 |
| firefox | 104 | 47deb2b2-9711-11ed-b277-c6d8e5da78fe | fail | 0.001 |
| firefox | 104 | 47eca5fc-9711-11ed-b277-c6d8e5da78fe | fail | 0.001 |
| firefox | 104 | 47fa16ec-9711-11ed-b277-c6d8e5da78fe | fail | 0.002 |
+--------------+-----------------+--------------------------------------+--------+------------+
Note: Enter the git password only if you used a git URL when creating the test; otherwise there is no need to enter a git password.
5) How to run aitest status command?
Input :
aitest status -id 47782fb0-9711-11ed-809d-62cd909dddc7
Output:
test status : completed
+--------------+-----------------+--------------------------------------+--------+------------+
| browser_name | browser_version | test run result id | status | time taken |
+--------------+-----------------+--------------------------------------+--------+------------+
| firefox | 104 | 47c53f08-9711-11ed-b277-c6d8e5da78fe | fail | 0.001 |
| firefox | 104 | 47d350e8-9711-11ed-b277-c6d8e5da78fe | fail | 0.001 |
| firefox | 104 | 47deb2b2-9711-11ed-b277-c6d8e5da78fe | fail | 0.001 |
| firefox | 104 | 47eca5fc-9711-11ed-b277-c6d8e5da78fe | fail | 0.001 |
| firefox | 104 | 47fa16ec-9711-11ed-b277-c6d8e5da78fe | fail | 0.002 |
+--------------+-----------------+--------------------------------------+--------+------------+
|
ai-test-craft
|
AI Test CraftAutomated Unit Test Generation for PythonAITestCraft is a Python package that assists developers in generating unit tests for their code. It leverages OpenAI's GPT models to create test cases based on the structure and requirements of your codebase.Features:Automated generation of unit tests from JSON configuration.Support for Python 3.10 and other versions.Allows for additional comments to guide test case creation.Sequential file processing for context-aware test generation.Prerequisites:To use AITestCraft, you need an OpenAI API key. To obtain it, follow these steps:Sign up for an account atOpenAI.Navigate to the API section and follow the instructions to generate your API key.Before running AITestCraft, ensure that yourOPENAI_API_KEYenvironment variable is set with your OpenAI API key. Alternatively, you can specify a custom environment variable name for your OpenAI API key using the command-line interface.Installation:Install AITestCraft using pip:pipinstallaitestcraftUsage:First, set theOPENAI_API_KEYenvironment variable or a custom one:For Unix-based systems (Linux/Mac):exportOPENAI_API_KEY='your_api_key_here'For Windows:set OPENAI_API_KEY=your_api_key_hereAlternatively, use a custom environment variable:exportMY_ENV_VAR_NAME='your_api_key_here'To generate unit tests, create a JSON configuration file namedto-test.json:{"language":"python","language_version":"3.11","model":"gpt-3.5-turbo","overwrite":"never","additional_comments":["Use functions for each testcase and not unittest.TestCase"],"files":[{"code":"aitestcraft/conf_representation.py","test":"tests/test_conf_representation.py"},{"code":"aitestcraft/validators.py","test":"tests/test_validators.py"},{"code":"aitestcraft/ai_generator.py","test":"tests/test_ai_generator.py"}]}Then run the package with:aitestcraftto-test.jsonOr if you're using a custom environment variable for the API key:pythonaitestgen.py--open-ai-env-varMY_ENV_VAR_NAME./to-test.jsonConfiguration Fields:language: Programming 
language used.

language_version: Version of the programming language.

model: OpenAI model used for generating tests.

overwrite: ["never"/"always"] If set to "never", existing test files will not be overwritten.

additional_comments: Optional, global comments for test generation.

files: A list of objects representing the source code and the test files.

In the files, if a comment starts with AI-TEST, it indicates a message for the AI to include specific demands for the next line, like so:

# AI-TEST: don't test this condition

Note:

The generated tests may require minor adjustments to fit the exact needs of your project.
Order the files in to-test.json from the most standalone files to those with the most dependencies. This helps to provide context that can improve the quality of the generated tests.

Disclaimer:

The test generation is not guaranteed to be perfect and might need adjustments to work seamlessly with your codebase.
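The "overwrite: never" policy amounts to a simple existence check before each test file is written. The following is a hypothetical sketch of that decision, not part of the AITestCraft API:

```python
import os

def should_write(test_path, overwrite):
    # With "never", an existing test file is left untouched;
    # with "always", it is regenerated and overwritten.
    if overwrite == "never" and os.path.exists(test_path):
        return False
    return True

print(should_write("no_such_dir/no_such_test.py", "never"))  # True
```

Under this policy a missing test file is always generated, while re-running the tool with "never" preserves any hand-edited tests.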
|
ai-test-designer
|
AI Test Automation Designer Using Robot Framework.
|
aitester
|
No description available on PyPI.
|
aitestflow
|
# aitestflow
Flexible and powerful UI and API automation library for Python, with database analysis support.
|
aitestgen
|
AITestGen

Python library to generate unit tests using the OpenAI API

Installation

Create a virtual environment (Optional)

python -m venv venv
source venv/bin/activate  # venv/Scripts/activate on Windows

Install the package (requires Python ^3.11)

pip install aitestgen

Add OpenAI API Key (Required)

export OPENAI_API_KEY=<my_api_key>  # or add the variable in your .env file

Example

Create a python file and add the following code:

from aitestgen.autotest import autotest

@autotest()
def sum(num1: float, num2: float) -> float:
    """This function is responsible for calculating the sum of two numbers"""
    return num1 + num2

@autotest()
def mul(num1: float, num2: float) -> float:
    """This function is responsible for calculating the multiplication of two numbers"""
    return num1 * num2

Important: To improve results, add type annotations and documentation to the function.

Generate tests:

Run the following command (change the filename)

aitestgen generate --inputfile src/operations.py

And then you will get the following result in a .py file created in a tests folder:

Result

# Function: mul - test1:
def test_mul_positive_numbers():
    assert mul(5.5, 2) == 11.0

# Function: mul - test2:
def test_mul_negative_numbers():
    assert mul(-3, 4) == -12.0

# Function: mul - test3:
def test_mul_zero():
    assert mul(0, 10) == 0.0

# Function: sum - test1:
def test_sum_positive_numbers():
    assert sum(3.5, 2.5) == 6.0

# Function: sum - test2:
def test_sum_negative_numbers():
    assert sum(-3.5, -2.5) == -6.0

# Function: sum - test3:
def test_sum_mixed_numbers():
    assert sum(3.5, -2.5) == 1.0
|
aitext
|
No description available on PyPI.
|
aitextgen
|
aitextgenA robust Python tool for text-based AI training and generation usingOpenAI'sGPT-2andEleutherAI'sGPT Neo/GPT-3architecture.aitextgen is a Python package that leveragesPyTorch,Hugging Face Transformersandpytorch-lightningwith specific optimizations for text generation using GPT-2, plusmanyadded features. It is the successor totextgenrnnandgpt-2-simple, taking the best of both packages:Finetunes on a pretrained 124M/355M/774M GPT-2 model from OpenAI or a 125M/350M GPT Neo model from EleutherAI...or create your own GPT-2/GPT Neo model + tokenizer and train from scratch!Generates text faster than gpt-2-simple and with better memory efficiency!With Transformers, aitextgen preserves compatibility with the base package, allowing you to use the model for other NLP tasks, download custom GPT-2 models from the HuggingFace model repository, and upload your own models! Also, it uses the includedgenerate()function to allow a massive amount of control over the generated text.With pytorch-lightning, aitextgen trains models not just on CPUs and GPUs, but alsomultipleGPUs and (eventually) TPUs! 
It also includes a pretty training progress bar, with the ability to add optional loggers.The input dataset is its own object, allowing you to not only easily encode megabytes of data in seconds, cache, and compress it on a local computer before transporting to a remote server, but you are able tomergedatasets without biasing the resulting dataset, orcross-trainon multiple datasets to create blended output.You can read more about aitextgenin the documentation!DemoYou can play with aitextgenfor freewith powerful GPUs using these Colaboratory Notebooks!Finetune OpenAI's 124M GPT-2 model (or GPT Neo) on your own dataset (GPU)Train a GPT-2 model + tokenizer from scratch (GPU)You can also play with customRedditandHacker Newsdemo models on your own PC.Installationaitextgen can be installedfrom PyPI:pip3installaitextgenQuick ExamplesHere's how you can quickly test out aitextgen on your own computer, even if you don't have a GPU!For generating text from a pretrained GPT-2 model:fromaitextgenimportaitextgen# Without any parameters, aitextgen() will download, cache, and load the 124M GPT-2 "small" modelai=aitextgen()ai.generate()ai.generate(n=3,max_length=100)ai.generate(n=3,prompt="I believe in unicorns because",max_length=100)ai.generate_to_file(n=10,prompt="I believe in unicorns because",max_length=100,temperature=1.2)You can also generate from the command line:aitextgengenerate
aitextgengenerate--prompt"I believe in unicorns because"--to_fileFalseWant to train your own mini GPT-2 model on your own computer? You can follow alongin this Jupyter Notebookor, download thistext file of Shakespeare's plays, cd to that directory in a Terminal, open up apython3console and go:fromaitextgen.TokenDatasetimportTokenDatasetfromaitextgen.tokenizersimporttrain_tokenizerfromaitextgen.utilsimportGPT2ConfigCPUfromaitextgenimportaitextgen# The name of the downloaded Shakespeare text for trainingfile_name="input.txt"# Train a custom BPE Tokenizer on the downloaded text# This will save one file: `aitextgen.tokenizer.json`, which contains the# information needed to rebuild the tokenizer.train_tokenizer(file_name)tokenizer_file="aitextgen.tokenizer.json"# GPT2ConfigCPU is a mini variant of GPT-2 optimized for CPU-training# e.g. the # of input tokens here is 64 vs. 1024 for base GPT-2.config=GPT2ConfigCPU()# Instantiate aitextgen using the created tokenizer and configai=aitextgen(tokenizer_file=tokenizer_file,config=config)# You can build datasets for training by creating TokenDatasets,# which automatically processes the dataset with the appropriate size.data=TokenDataset(file_name,tokenizer_file=tokenizer_file,block_size=64)# Train the model! It will save pytorch_model.bin periodically and after completion to the `trained_model` folder.# On a 2020 8-core iMac, this took ~25 minutes to run.ai.train(data,batch_size=8,num_steps=50000,generate_every=5000,save_every=5000)# Generate text from it!ai.generate(10,prompt="ROMEO:")# With your trained model, you can reload the model at any time by# providing the folder containing the pytorch_model.bin model weights + the config, and providing the tokenizer.ai2=aitextgen(model_folder="trained_model",tokenizer_file="aitextgen.tokenizer.json")ai2.generate(10,prompt="ROMEO:")Want to run aitextgen and finetune GPT-2? 
Use the Colab notebooks in the Demos section, orfollow the documentationto get more information and learn some helpful tips!Known IssuesTPUs cannot be used to train a model: although youcantrain an aitextgen model on TPUs by settingn_tpu_cores=8in an appropriate runtime, and the training loss indeed does decrease, there are a number of miscellaneous blocking problems. [Tracking GitHub Issue]Upcoming FeaturesThe current release (v0.5.X) of aitextgenis considered to be a beta, targeting the most common use cases. The Notebooks and examples written so far are tested to work, but more fleshing out of the docs/use cases will be done over the next few months in addition to fixing the known issues noted above.The next versions of aitextgen (and one of the reasons I made this package in the first place) will have native support forschema-based generation. (Seethis repofor a rough proof-of-concept.)Additionally, I plan to develop an aitextgenSaaSto allow anyone to run aitextgen in the cloud and build APIs/Twitter+Slack+Discord bots with just a few clicks. (The primary constraint is compute cost; if any venture capitalists are interested in funding the development of such a service, let me know.)I've listed more tentative features in theUPCOMINGdocument.Ethicsaitextgen is a tool primarily intended to help facilitate creative content. It is not a tool intended to deceive. Although parody accounts are an obvious use case for this package, make sure you areas upfront as possiblewith the methodology of the text you create. This includes:State that the text was generated using aitextgen and/or a GPT-2 model architecture. 
(A link to this repo would be a bonus!)If parodying a person, explicitly state that it is a parody, and reference who it is parodying.If the generated text is human-curated, or if it's unsupervised random output.Indicating who is maintaining/curating the AI-generated text.Make a good-faith effort to remove overfit output from the generated text that matches the input text verbatim.It's fun to anthropomorphise the nameless "AI" as an abstract genius, but part of the reason I made aitextgen (and all my previous text-generation projects) is to make the technology more accessible and accurately demonstrate both its promise, and its limitations.Any AI text generation projects that are deliberately deceptive may be disavowed.Maintainer/CreatorMax Woolf (@minimaxir)Max's open-source projects are supported by hisPatreonandGitHub Sponsors. If you found this project helpful, any monetary contributions to the Patreon are appreciated and will be put to good creative use.LicenseMIT
|
aitextgenAws
|
aitextgen-aws

aitextgen-aws is a wrapper based on the work by Max Woolf, edited for distributed computing on AWS, since the current version of aitextgen does not allow setting the parameter for PyTorch Lightning's distributed modes.

A robust Python tool for text-based AI training and generation using OpenAI's GPT-2 and EleutherAI's GPT Neo/GPT-3 architecture.

aitextgen is a Python package that leverages PyTorch, Hugging Face Transformers and pytorch-lightning with specific optimizations for text generation using GPT-2, plus many added features. It is the successor to textgenrnn and gpt-2-simple, taking the best of both packages:

- Finetunes on a pretrained 124M/355M/774M GPT-2 model from OpenAI or a 125M/350M GPT Neo model from EleutherAI... or create your own GPT-2/GPT Neo model + tokenizer and train from scratch!
- Generates text faster than gpt-2-simple and with better memory efficiency!
- With Transformers, aitextgen preserves compatibility with the base package, allowing you to use the model for other NLP tasks, download custom GPT-2 models from the HuggingFace model repository, and upload your own models! Also, it uses the included generate() function to allow a massive amount of control over the generated text.
- With pytorch-lightning, aitextgen trains models not just on CPUs and GPUs, but also multiple GPUs and (eventually) TPUs! It also includes a pretty training progress bar, with the ability to add optional loggers.
- The input dataset is its own object, allowing you to not only easily encode megabytes of data in seconds, cache, and compress it on a local computer before transporting to a remote server, but you are able to merge datasets without biasing the resulting dataset, or cross-train on multiple datasets to create blended output.

You can read more about aitextgen in the documentation!

Demo

You can play with aitextgen for free with powerful GPUs using these Colaboratory Notebooks!

- Finetune OpenAI's 124M GPT-2 model (or GPT Neo) on your own dataset (GPU)
- Train a GPT-2 model + tokenizer from scratch (GPU)

You can also play with custom Reddit and Hacker News demo models on your own PC.

Installation

aitextgen can be installed from PyPI:

pip3 install aitextgen

Quick Examples

Here's how you can quickly test out aitextgen on your own computer, even if you don't have a GPU!

For generating text from a pretrained GPT-2 model:

from aitextgen import aitextgen

# Without any parameters, aitextgen() will download, cache, and load the 124M GPT-2 "small" model
ai = aitextgen()

ai.generate()
ai.generate(n=3, max_length=100)
ai.generate(n=3, prompt="I believe in unicorns because", max_length=100)
ai.generate_to_file(n=10, prompt="I believe in unicorns because", max_length=100, temperature=1.2)

You can also generate from the command line:

aitextgen generate
aitextgen generate --prompt "I believe in unicorns because" --to_file False

Want to train your own mini GPT-2 model on your own computer? You can follow along in this Jupyter Notebook or, download this text file of Shakespeare's plays, cd to that directory in a Terminal, open up a python3 console and go:

from aitextgen.TokenDataset import TokenDataset
from aitextgen.tokenizers import train_tokenizer
from aitextgen.utils import GPT2ConfigCPU
from aitextgen import aitextgen

# The name of the downloaded Shakespeare text for training
file_name = "input.txt"

# Train a custom BPE Tokenizer on the downloaded text
# This will save one file: `aitextgen.tokenizer.json`, which contains the
# information needed to rebuild the tokenizer.
train_tokenizer(file_name)
tokenizer_file = "aitextgen.tokenizer.json"

# GPT2ConfigCPU is a mini variant of GPT-2 optimized for CPU-training
# e.g. the # of input tokens here is 64 vs. 1024 for base GPT-2.
config = GPT2ConfigCPU()

# Instantiate aitextgen using the created tokenizer and config
ai = aitextgen(tokenizer_file=tokenizer_file, config=config)

# You can build datasets for training by creating TokenDatasets,
# which automatically processes the dataset with the appropriate size.
data = TokenDataset(file_name, tokenizer_file=tokenizer_file, block_size=64)

# Train the model! It will save pytorch_model.bin periodically and after completion
# to the `trained_model` folder. On a 2020 8-core iMac, this took ~25 minutes to run.
ai.train(data, batch_size=8, num_steps=50000, generate_every=5000, save_every=5000)

# Generate text from it!
ai.generate(10, prompt="ROMEO:")

# With your trained model, you can reload the model at any time by providing the folder
# containing the pytorch_model.bin model weights + the config, and providing the tokenizer.
ai2 = aitextgen(model_folder="trained_model", tokenizer_file="aitextgen.tokenizer.json")
ai2.generate(10, prompt="ROMEO:")

Want to run aitextgen and finetune GPT-2?
Use the Colab notebooks in the Demos section, or follow the documentation to get more information and learn some helpful tips!

Known Issues

TPUs cannot be used to train a model: although you can train an aitextgen model on TPUs by setting n_tpu_cores=8 in an appropriate runtime, and the training loss indeed does decrease, there are a number of miscellaneous blocking problems. [Tracking GitHub Issue]

Upcoming Features

The current release (v0.5.X) of aitextgen is considered to be a beta, targeting the most common use cases. The Notebooks and examples written so far are tested to work, but more fleshing out of the docs/use cases will be done over the next few months in addition to fixing the known issues noted above.

The next versions of aitextgen (and one of the reasons I made this package in the first place) will have native support for schema-based generation. (See this repo for a rough proof-of-concept.)

Additionally, I plan to develop an aitextgen SaaS to allow anyone to run aitextgen in the cloud and build APIs/Twitter+Slack+Discord bots with just a few clicks. (The primary constraint is compute cost; if any venture capitalists are interested in funding the development of such a service, let me know.)

I've listed more tentative features in the UPCOMING document.

Ethics

aitextgen is a tool primarily intended to help facilitate creative content. It is not a tool intended to deceive. Although parody accounts are an obvious use case for this package, make sure you are as upfront as possible with the methodology of the text you create. This includes:

- State that the text was generated using aitextgen and/or a GPT-2 model architecture. (A link to this repo would be a bonus!)
- If parodying a person, explicitly state that it is a parody, and reference who it is parodying.
- State whether the generated text is human-curated or unsupervised random output.
- Indicate who is maintaining/curating the AI-generated text.
- Make a good-faith effort to remove overfit output from the generated text that matches the input text verbatim.

It's fun to anthropomorphise the nameless "AI" as an abstract genius, but part of the reason I made aitextgen (and all my previous text-generation projects) is to make the technology more accessible and accurately demonstrate both its promise, and its limitations. Any AI text generation projects that are deliberately deceptive may be disavowed.

Maintainer/Creator

Max Woolf (@minimaxir)

Max's open-source projects are supported by his Patreon and GitHub Sponsors. If you found this project helpful, any monetary contributions to the Patreon are appreciated and will be put to good creative use.

License

MIT
|
ai-text-to-sql
|
AI-Text-to-SQL: Transforming Text Queries into SQL Magic ✨

Have you ever wished you could simply speak or type your queries in plain English, and poof! They magically turn into perfectly formatted SQL queries? Well, say hello to ai_text_to_sql, the enchanting Python package that brings your dream to life!

Unleashing the Magic 🌟

With pip

Installing AI-Text-to-SQL is as easy as pip-pip-pip:

pip install ai_text_to_sql

Casting the Spell 🪄

To summon the power of AI-Text-to-SQL, follow these mystical steps:

1. Import the Data Connector: Begin by importing the required data connector of your choice. It's like selecting the perfect wand for your SQL sorcery!
2. Import the LLM Connector: Next, import the LLM (Language Learning Model) connector. This vital ingredient enhances the mystical abilities of ai_text_to_sql, enabling it to comprehend your text queries like a seasoned SQL wizard.
3. Instantiate TextToSQL: Now, it's time to weave your magic! Create an instance of TextToSQL by passing in the objects of the previously instantiated data connector and LLM connector. This mystical union forms the very heart of ai_text_to_sql, enabling it to interpret your words and craft elegant SQL incantations.

Spellbinding Example Usage 🎩

To witness the enchanting powers of AI-Text-to-SQL in action, behold the following example usage with the SQLite data connector and OpenAI LLM connector:

from ai_text_to_sql.data_connectors import SQLiteConnector
from ai_text_to_sql.llm_connectors import OpenAIConnector
from ai_text_to_sql.text_to_sql import TextToSQL

# Prepare your spell ingredients
sqlite_connector = SQLiteConnector(database='chinook.db')
openai_connector = OpenAIConnector(api_key='YOUR_OPENAI_API_KEY')

# Weave the enchantment 🧙♂️✨
text_to_sql = TextToSQL(sqlite_connector, openai_connector)

# Utter your magical incantation 🗣️✨
text_query = "Find all the tracks written by AC/DC, including the track name, album title, and the artist name. Sort the results alphabetically by track name."

# Witness the spell's transformation 🔮✨
sql_query = text_to_sql.convert_text_to_sql(text_query)

# Unleash the magic upon the database directly 💾✨
results = text_to_sql.query(text_query)

# Store the results in a DataFrame for further sorcery 📊✨
df = text_to_sql.query_df(text_query)

The Realm of Compatible Databases 🌐🏰

Within the enchanted realm of AI-Text-to-SQL, a variety of databases stand ready to be harmoniously united with your mystical text queries. Our sorcery extends its reach to the following realms of data:

- SQLite: ✅
- PostgreSQL: ✅
- MySQL: ✅
- MariaDB: ✅
- MS SQL Server: ✅
- Oracle: 🔜
- MS Access: 🔜
- Firebird: 🔜
- IBM Db2: 🔜

Is your preferred database missing from our list? Don't worry! Suggestions for new data connectors are always welcome. Feel free to create an issue on the GitHub repository and maybe even submit a pull request!

The Fellowship of Language Learning Models 🧠📚

In the mystical realm of AI-Text-to-SQL, an esteemed fellowship of Language Learning Models (LLMs) awaits to join forces with your magical text queries. Our sorcery encompasses the wisdom of the following LLM allies:

- OpenAI: ✅
- Bard: 🔜
- HuggingFace: 🔜

Excited about the potential of additional LLMs? If you have recommendations for new LLM connectors to integrate into AI-Text-to-SQL, please create an issue on the GitHub repository to share your ideas and take steps to contribute to the project!

Contributing 🤝

Thank you for your interest in contributing to AI-Text-to-SQL! Contributions from the community are highly appreciated.

For detailed instructions on how to contribute to the project, please refer to the CONTRIBUTING.md file. It provides guidelines on reporting bugs, suggesting new features, making code improvements, and more.

Your contributions are valuable, and together, let's collaborate to enhance AI-Text-to-SQL and make it even more magical!

License 📜

AI-Text-to-SQL is released under the GNU General Public License v3.0 (GPL-3.0). This means that you are free to use, modify, and distribute this package in compliance with the terms outlined in the license.

Embrace the spirit of open-source collaboration and together let's propel the world of text-to-SQL transformation forward!
|
aitg
|
No description available on PyPI.
|
aitg-doctools
|
No description available on PyPI.
|
aitg-host
|
No description available on PyPI.
|
ait-gui
|
AMMOS Instrument Toolkit (AIT) GUI

AIT GUI provides a framework for building a custom website for realtime telemetry monitoring, commanding, and other MOS operations with only minimal HTML and configuration file changes. Built on top of the AIT Core libraries, AIT GUI provides web-development friendly access to the underlying telemetry, commanding, EVR, system logging, and other AIT Core functions in addition to a suite of pre-built UI Components and a REST API to handle common MOS use cases.

Getting Started

You can read through the Installation Page for instructions on how to install AIT GUI.

Browser Requirements

AIT GUI requires one of the following browsers:

- Firefox version 52.0 or newer
- Chrome version 57.0 or newer

Safari v10.1 or newer will likely work as well but compatibility is not regularly checked or guaranteed.

Join the Community

The project's User and Developer Mailing List is the best way to communicate with the team, ask questions, brainstorm plans for future changes, and help contribute to the project.

This project exists thanks to the dedicated users, contributors, committers, and project management committee members. If you'd like to learn more about how the project is organized and how to become a part of the team please check out the Project Structure and Governance documentation.

Contributing

For information on how to contribute please see the AIT Contributing Guide
|
aithamza-regression-model
|
Example regression model package from Train In Data.
|
aithermal
|
No description available on PyPI.
|
aithon
|
# AITHON
AITHON is a library for use in AI competitions.

# Copyright
Copyright (c) 2021 SeungBaek Hong
|
ai-thumbnail
|
ai-thumbnail
|
aitk
|
aitk: Artificial Intelligence Toolkit

This collection contains two things: an open source set of Python tools, and a set of computational essays for exploring Artificial Intelligence, Machine Learning, and Robotics. This is a collaborative effort started by the authors, building on almost a century of collective experience in education and research.

The code and essays are designed to require as few computing resources as necessary, while still allowing readers to experience first-hand the topics covered.

Authors

- Douglas Blank, Emeritus Professor of Computer Science, Bryn Mawr College; Head of Research at Comet.ml
- Jim Marshall, Professor in the Computer Science Department at Sarah Lawrence College
- Lisa Meeden, Professor in the Computer Science Department at Swarthmore College

Contributors

Please feel free to contribute to this collection: https://github.com/ArtificialIntelligenceToolkit/aitk

- Your Name Here

Computational Essays

Each computational essay is described at Computational Essays.

Python tools

aitk is a virtual Python package containing the following modules:

- aitk: top level virtual package; install this to get all of the following
- aitk.robots: Python package for exploring simulated mobile robots, with cameras and sensors
- aitk.algorithms: Python package for exploring algorithms
- aitk.networks: Python package for constructing and visualizing Keras deep learning models
- aitk.utils: Python package for common utilities

Python Installation

We recommend using miniconda for running Jupyter Notebooks locally on your computer. However, you can also skip this and run the Computational Essays on other services, such as Google's Colab. To use miniconda:

1. First install miniconda
2. Next, activate your base environment: source ~/miniconda/bin/activate
3. Create a Python 3.8 conda environment: conda create --name py38 python=3.8
4. Activate it: conda activate py38

You only need to do step 1 once. To get out of conda, back to your regular system:

1. conda deactivate (will get out of py38)
2. conda deactivate (will get out of the base environment)

Software Installation

After activating your conda environment:

- pip install "aitk.robots[jupyter]" (installs all of the aitk.robots requirements to run in Jupyter Lab 3.0)
- pip install pandas tensorflow numpy matplotlib tqdm ipycanvas (some things you might want)

Jupyter Installation

If you want to work in notebooks and jupyter lab:

- pip install jupyterlab
- jupyter labextension install @jupyter-widgets/jupyterlab-manager ipycanvas
- jupyter lab starts it up, opens browser window

AITK Community

For questions and comments, please use https://github.com/ArtificialIntelligenceToolkit/aitk/discussions/
|
aitk.algorithms
|
aitk.algorithms

General Algorithms for Artificial Intelligence
|
aitkens
|
aitkens

Aitken's delta-squared series acceleration method

J. M. F. Tsang ([email protected])

Usage

Given an object xs that can be turned into a one-dimensional numpy array, run:

from aitkens import accelerate

accelerate(xs)

Example: Iterates of $\sqrt{2}$

This example, which is given on the Wikipedia article [1], is actually a poor example since the original iterates converge quadratically, rather than linearly. The accelerated sequence's terms tend to overshoot the true values.

from itertools import accumulate
from aitkens import accelerate

iterates = list(accumulate(range(5), lambda x, _: 0.5 * (x + 2 / x), initial=1))
acc = accelerate(iterates)

References

[1] https://en.wikipedia.org/wiki/Aitken%27s_delta-squared_process

Licence

This work is licensed under a Creative Commons Attribution 4.0 International License.
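For intuition, the delta-squared transformation itself fits in a few lines of numpy. This is a simplified sketch rather than the package's actual implementation, and the function name aitken_accelerate is invented here:

```python
import numpy as np

# Aitken's delta-squared process (sketch, not the aitkens package source):
# accelerated_n = x_n - (delta x_n)^2 / (delta^2 x_n)
def aitken_accelerate(xs):
    xs = np.asarray(xs, dtype=float)
    dx = np.diff(xs)         # forward differences, delta x_n
    d2x = np.diff(xs, n=2)   # second differences, delta^2 x_n
    return xs[:-2] - dx[:-1] ** 2 / d2x

# Iterates of x -> 0.5*x + 1, which converge linearly to the limit 2:
print(aitken_accelerate([0.0, 1.0, 1.5, 1.75, 1.875]))  # [2. 2. 2.]
```

For a sequence whose error shrinks geometrically, as in this fixed-point iteration, the transformation recovers the limit exactly; for the quadratically convergent $\sqrt{2}$ iterates above it is less flattering, as noted.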
|
aitkhelp
|
No description available on PyPI.
|
aitkit
|
No description available on PyPI.
|
aitk.keras
|
aitk.keras

An implementation of the main Keras API with the layers in numpy.

UNDER DEVELOPMENT

Why?

- useful to explain deep learning
- can be used where tensorflow is not available (eg, JupyterLite)

Features

- supports Keras's Sequential and functional APIs
- alternative dataset downloader for JupyterLite

Examples:

# Classic XOR
from aitk.keras.layers import Input, Dense
from aitk.keras.models import Sequential

inputs = [[0, 0], [0, 1], [1, 0], [1, 1]]
targets = [[0], [1], [1], [0]]
epochs = 1000  # any suitable number of training epochs

model = Sequential()
model.add(Input(2, name="input"))
model.add(Dense(8, activation="tanh", name="hidden"))
model.add(Dense(1, activation="sigmoid", name="output"))
model.compile(optimizer="adam", loss="mse")

outputs = model.predict(inputs)
model.fit(inputs, targets, epochs=epochs, verbose=0, shuffle=False)

See the notebook directory for additional examples. See also the examples in the tests folder.

Development

- implement shuffle
- report metrics to logs/history
- probably lots of edge cases are broken
- see "FIXME" items in code

To run the tests:

$ pytest -vvv tests

Please feel free to report issues and make Pull Requests!

References

Low-level numpy code based on numpy_ml.
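The "layers in numpy" idea can be made concrete: a Dense layer's forward pass is just a matrix multiply, a bias add, and an elementwise activation. The sketch below is illustrative only; dense_forward and the random weights are assumptions for the example, not aitk.keras internals:

```python
import numpy as np

# Minimal sketch of what a numpy-backed Dense layer computes:
# outputs = activation(inputs @ weights + bias)
def dense_forward(inputs, weights, bias, activation=np.tanh):
    return activation(inputs @ weights + bias)

# The XOR inputs from the example above, pushed through a hypothetical
# 2-in / 8-out hidden layer with random weights:
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
W = rng.normal(size=(2, 8))
b = np.zeros(8)
hidden = dense_forward(X, W, b)
print(hidden.shape)  # (4, 8): one 8-unit activation vector per input row
```

In a trained network the weights and bias would come from backpropagation rather than a random generator; the forward computation is the same.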
|
aitk.networks
|
aitk.networks

A Keras model wrapper with visualizations
|