package | package-description
---|---
aliyun-python-sdk-xspace
|
aliyun-python-sdk-xspace
This is the xspace module of Aliyun Python SDK. Aliyun Python SDK is the official software development kit. It makes it easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions 2.6.5 and greater.
Documentation: Please visit http://develop.aliyun.com/sdk/python
|
aliyun-python-sdk-xtrace
|
Aliyun Python SDK is the official software development kit. It makes it easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions 2.6.5 and greater.
Documentation: Please visit http://develop.aliyun.com/sdk/python
|
aliyun-python-sdk-yundun
|
aliyun-python-sdk-yundun
This is the yundun module of Aliyun Python SDK. Aliyun Python SDK is the official software development kit. It makes it easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions 2.6.5 and greater.
Documentation: Please visit http://develop.aliyun.com/sdk/python
|
aliyun-python-sdk-yundun-ds
|
aliyun-python-sdk-yundun-ds
This is the yundun-ds module of Aliyun Python SDK. Aliyun Python SDK is the official software development kit. It makes it easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions 2.6.5 and greater.
Documentation: Please visit http://develop.aliyun.com/sdk/python
|
aliyun-python-sdk-yundun-test
|
aliyun-python-sdk-yundun
This is the yundun module of Aliyun Python SDK. Aliyun Python SDK is the official software development kit. It makes it easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions 2.6.5 and greater.
Documentation: Please visit http://develop.aliyun.com/sdk/python
|
aliyun-python-sdk-zhuque
|
aliyun-python-sdk-zhuque
This is the zhuque module of Aliyun Python SDK. Aliyun Python SDK is the official software development kit. It makes it easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions 2.6.5 and greater.
Documentation: Please visit http://develop.aliyun.com/sdk/python
|
aliyun-rds-bkp
|
Aliyun RDS Backup Tool

A tool that, according to a custom configuration, downloads database backups (both full and binlog backups) from Aliyun RDS (currently MySQL only) to local storage, with scheduled cleanup of expired backup files.

Install

pip install aliyun-rds-bkp

Configuration

The configuration file is in JSON format.

{
    "AccessKeyId": "AccessKeyID Provided by Aliyun RDS",
    "AccessKeySecret": "AccessKeySecret Provided by Aliyun RDS",
    "Regions": [
        {
            "RegionID": "cn-hangzhou",
            "DBInstances": [
                {
                    "DBInstanceId": "rm-XXXXXXXXXXXXXXXXXX",
                    "LastFullBackup": {"BackupEndTime": "2019-03-16 05:30:00"},
                    "LastBinlogBackup": {"BackupEndTime": "2019-03-16 05:30:00"},
                    "BackupRetentionDays": 21,
                    "Schedule": {
                        "FullBackup": {"Plan": "* * * * 2,4,6"},
                        "BinlogBackup": {"Plan": "* * * * *"}
                    }
                }
            ]
        }
    ],
    "BackupHome": "Path/to/Backup/Directory",
    "FailedDownloads": "Path/to/Failed Downloads/Directory",
    "ErrorLog": "Path/to/Error/Log",
    "MailConfig": {
        "SMTPServer": "Your SMTP Server",
        "SMTPLogin": "Account to Login SMTP Server",
        "SMTPPassword": "Password to login SMTP Server",
        "SMTPPort": 25,
        "TTLS": false,
        "From": "email_from",
        "To": ["email_1", "email_2"],
        "Cc": ["email_cc"],
        "Subject": "E-Mail Subject"
    }
}

Parameters

AccessKeyId: the AccessKeyId provided by Aliyun
AccessKeySecret: the AccessKeySecret provided by Aliyun
RegionID: see https://help.aliyun.com/document_detail/40654.html
DBInstanceId: the RDS instance ID
BackupEndTime: UTC end time of the previous backup, used for incremental downloads; format YYYY-MM-DD HH:MI:SS
BackupRetentionDays: number of days to keep backups
Plan: backup schedule (local time). Five space-separated fields: the minute, hour, day of month, month, and day of week that trigger a backup. 1 means Monday and 7 means Sunday.
BackupHome: root directory for backup files; subdirectories are created automatically as RegionID -> InstanceID -> Year -> Month -> Day
ErrorLog: path of the error log file
SMTPServer: mail server address, used to send backup success/failure notification emails
SMTPLogin: mail server login account
SMTPPassword: mail server login password
SMTPPort: mail server port
TTLS: whether the mail server uses TTLS, true or false
From: sender account
To: list of recipient accounts
Cc: list of CC accounts
Subject: subject of the notification emails

Usage

Write driver scripts:

import os
from aliyunrdsbkp.mysql_backup import MySQLBackup

"""Daily backup"""
if __name__ == '__main__':
    dir_path = os.path.dirname(os.path.realpath(__file__))
    config_file = os.path.join(dir_path, 'config/settings.json')  # path to the config file
    mysql_backup = MySQLBackup(config_file)
    mysql_backup.backup()

import os
from aliyunrdsbkp.retry_downloader import RetryDownloader

"""Retry failed downloads"""
if __name__ == '__main__':
    dir_path = os.path.dirname(os.path.realpath(__file__))
    config_file = os.path.join(dir_path, 'config/settings.json')  # path to the config file
    retry_downloader = RetryDownloader(config_file)
    retry_downloader.run()

Schedule the scripts above with crontab on Linux or Task Scheduler on Windows.
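The five-field Plan format above can be illustrated with a small sketch. This is not part of aliyun-rds-bkp; it is a hypothetical matcher written to show how a plan like "* * * * 2,4,6" selects Tuesday (2), Thursday (4), and Saturday (6):

```python
def field_matches(field: str, value: int) -> bool:
    """Return True if one cron-style field (e.g. '*' or '2,4,6') matches value."""
    if field == "*":
        return True
    return value in {int(part) for part in field.split(",")}

def plan_matches(plan: str, minute: int, hour: int, dom: int,
                 month: int, dow: int) -> bool:
    """Match a 5-field Plan: minute, hour, day of month, month, day of week."""
    fields = plan.split()
    return all(field_matches(f, v)
               for f, v in zip(fields, (minute, hour, dom, month, dow)))

# "* * * * 2,4,6" fires only when the day of week is 2, 4, or 6.
print(plan_matches("* * * * 2,4,6", 30, 5, 16, 3, 2))  # True  (Tuesday)
print(plan_matches("* * * * 2,4,6", 30, 5, 16, 3, 1))  # False (Monday)
```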
|
aliyun-ros-cli
|
Prepare

Requires Python 2.7 and aliyun-python-sdk-ros.

Config

When you use the ros command for the first time, the ros-cli will create a default configuration where ros is installed.

[ACCESS]
ACCESS_KEY_ID = YOUR_KEY_ID
ACCESS_KEY_SECRET = YOUR_KEY_SECRET
REGION_ID = YOUR_REGION
[OTHER]
JSON_INDENT = 2
DEBUG = False

Please use ros set-userdata to set your default configuration. You can also input the region when using the ros cli. In many cases, the default configuration will be used if you don't specify the region. Set DEBUG to True to read more output.

Install

pip install aliyun-python-sdk-ros
pip install aliyun-ros-cli

Tab Completion

Tab completion is supported in bash. Put the ros_completion file at /etc/bash_completion.d/ and run source /etc/bash_completion.d/ros_completion

ros_completion content:

#! /usr/bin/bash
#
# put this file at `/etc/bash_completion.d/` and run `source /etc/bash_completion.d/ros_completion`
#
# Copyright (c) 2017 Aliyun.com All right reserved. This software is the
# confidential and proprietary information of Aliyun.com ("Confidential
# Information"). You shall not disclose such Confidential Information and shall
# use it only in accordance with the terms of the license agreement you entered
# into with Aliyun.com .
#
# created by quming on 07/24/2017
_ros()
{
# cur prev ros_father
local cur prev opts_top opts_cmds
COMPREPLY=()
cur="${COMP_WORDS[COMP_CWORD]}"
prev="${COMP_WORDS[COMP_CWORD-1]}"
opts_top="-h --help --json --config --region-id"
opts_cmds="abandon-stack \
create-stack \
delete-stack \
describe-stack describe-resource \
get-template \
list-stacks list-resources list-regions list-events \
preview-stack \
resource-type resource-type-detail resource-type-template \
set-userdata \
update-stack \
validate-template"
local opt_abandon_stack="-h --help --region-id --stack-name --stack-id"
local opt_create_stack="-h --help --region-id --stack-name \
--template-url --parameters --disable-rollback \
--timeout-in-minutes"
local opt_delete_stack="-h --help --region-id --stack-name --stack-id"
local opt_describe_stack="-h --help --stack-name --stack-id"
local opt_describe_resource="-h --help --stack-name --stack-id --resource-name"
local opt_get_template="-h --help --stack-name --stack-id"
local opt_list_stacks="-h --help --stack-name --stack-id --region-id \
--status --page-number --page-size"
local opt_list_resources="-h --help --stack-name --stack-id"
local opt_list_regions="-h --help"
local opt_list_events="-h --help --stack-name --stack-id --resource-status \
--resource-name --resource-type --page-number --page-size"
local opt_preview_stack="-h --help --region-id --stack-name --stack-id \
--template-url --parameters --disable-rollback \
--timeout-in-minutes"
local opt_resource_type="-h --help --status"
local opt_resource_type_detail="-h --help --name"
local opt_resource_type_template="-h --help --name"
local opt_set_userdata="-h --help --key-id --key-secret --json --region-id"
local opt_update_stack="-h --help --region-id --stack-name --stack-id \
--template-url --parameters --disable-rollback \
--timeout-in-minutes"
local opt_validate_template="-h --help --template-url"
# if [ -z "${cur}" ]; then
if [ "${prev}"x = "ros"x ]; then
ros_father=""
fi
if [[ ${opts_cmds} = *${prev}* ]]; then
ros_father=${prev}
fi
# echo "["${cur}"]["${prev}"]["${ros_father}"]"
case "${ros_father}" in
abandon-stack)
COMPREPLY=($(compgen -W "${opt_abandon_stack}" -- ${cur}))
return 0
;;
create-stack)
COMPREPLY=($(compgen -W "${opt_create_stack}" -- ${cur}))
return 0
;;
delete-stack)
COMPREPLY=($(compgen -W "${opt_delete_stack}" -- ${cur}))
return 0
;;
describe-stack)
COMPREPLY=($(compgen -W "${opt_describe_stack}" -- ${cur}))
return 0
;;
describe-resource)
COMPREPLY=($(compgen -W "${opt_describe_resource}" -- ${cur}))
return 0
;;
get-template)
COMPREPLY=($(compgen -W "${opt_get_template}" -- ${cur}))
return 0
;;
list-stacks)
COMPREPLY=($(compgen -W "${opt_list_stacks}" -- ${cur}))
return 0
;;
list-resources)
COMPREPLY=($(compgen -W "${opt_list_resources}" -- ${cur}))
return 0
;;
list-regions)
COMPREPLY=($(compgen -W "${opt_list_regions}" -- ${cur}))
return 0
;;
list-events)
COMPREPLY=($(compgen -W "${opt_list_events}" -- ${cur}))
return 0
;;
preview-stack)
COMPREPLY=($(compgen -W "${opt_preview_stack}" -- ${cur}))
return 0
;;
resource-type)
COMPREPLY=($(compgen -W "${opt_resource_type}" -- ${cur}))
return 0
;;
resource-type-detail)
COMPREPLY=($(compgen -W "${opt_resource_type_detail}" -- ${cur}))
return 0
;;
resource-type-template)
COMPREPLY=($(compgen -W "${opt_resource_type_template}" -- ${cur}))
return 0
;;
set-userdata)
COMPREPLY=($(compgen -W "${opt_set_userdata}" -- ${cur}))
return 0
;;
update-stack)
COMPREPLY=($(compgen -W "${opt_update_stack}" -- ${cur}))
return 0
;;
validate-template)
COMPREPLY=($(compgen -W "${opt_validate_template}" -- ${cur}))
return 0
;;
*)
if [[ ${cur} == -* ]] ; then
COMPREPLY=($(compgen -W "${opts_top}" -- ${cur}))
return 0
else
COMPREPLY=($(compgen -W "${opts_cmds}" -- ${cur}))
return 0
fi
;;
esac
}
complete -F _ros ros

Help

If you want more details, please visit ROS API.

Top Class Commands

$ ros -h
usage: ros [-h] [--config CONFIG_FILE] [--json] [--region-id REGION_ID] ...
optional arguments:
-h, --help show this help message and exit
--config CONFIG_FILE Location of config file
--json Print results as JSON format
--region-id REGION_ID
Region ID, if not set, use config file's field
commands:
set-userdata Set default Aliyun access info
create-stack Creates a stack as specified in the template
delete-stack Deletes the specified stack
update-stack Update a stack as specified in the template
preview-stack Preview a stack as specified in the template
abandon-stack Abandon the specified stack
list-stacks Returns the summary information for stacks whose
status matches the specified StackStatusFilter
describe-stack Returns the description for the specified stack
list-resources Returns descriptions of all resources of the specified
stack
describe-resource Returns a description of the specified resource in the
specified stack
resource-type Returns types of resources
resource-type-detail
Returns detail of the specific resource type
resource-type-template
Returns template of the specific resource type
get-template Returns the template body for a specified stack
validate-template Validates a specified template
list-regions Returns all regions available
list-events Returns all stack related events for a specified stack
in reverse chronological order

Commands on stacks

Create stack

$ ros create-stack -h
usage: ros create-stack [-h] [--region-id REGION_ID] --stack-name STACK_NAME
--template-url TEMPLATE_URL [--parameters PARAMETERS]
[--disable-rollback DISABLE_ROLLBACK]
[--timeout-in-minutes TIMEOUT_IN_MINUTES]
optional arguments:
-h, --help show this help message and exit
--region-id REGION_ID
The region that is associated with the stack
--stack-name STACK_NAME
The name that is associated with the stack
--template-url TEMPLATE_URL
Location of file containing the template body
--parameters PARAMETERS
A list of Parameter structures that specify input
parameters for the stack. Syntax: key=value,key=value
--disable-rollback DISABLE_ROLLBACK
Set to true to disable rollback of the stack if stack
creation failed
--timeout-in-minutes TIMEOUT_IN_MINUTES
The amount of time that can pass before the stack
status becomes CREATE_FAILED

Delete stack

$ ros delete-stack -h
usage: ros delete-stack [-h] --region-id REGION_ID --stack-name STACK_NAME
--stack-id STACK_ID
optional arguments:
-h, --help show this help message and exit
--region-id REGION_ID
The region that is associated with the stack
--stack-name STACK_NAME
The name that is associated with the stack
--stack-id STACK_ID The id that is associated with the stack

Update stack

$ ros update-stack -h
usage: ros update-stack [-h] --region-id REGION_ID --stack-name STACK_NAME
--stack-id STACK_ID --template-url TEMPLATE_URL
[--parameters PARAMETERS]
[--disable-rollback DISABLE_ROLLBACK]
[--timeout-in-minutes TIMEOUT_IN_MINUTES]
optional arguments:
-h, --help show this help message and exit
--region-id REGION_ID
The region that is associated with the stack
--stack-name STACK_NAME
The name that is associated with the stack
--stack-id STACK_ID The id that is associated with the stack
--template-url TEMPLATE_URL
Location of file containing the template body
--parameters PARAMETERS
A list of Parameter structures that specify input
parameters for the stack. Syntax: key=value,key=value
--disable-rollback DISABLE_ROLLBACK
Set to true to disable rollback of the stack if stack
creation failed
--timeout-in-minutes TIMEOUT_IN_MINUTES
The amount of time that can pass before the stack
status becomes CREATE_FAILED

Preview stack

$ ros preview-stack -h
usage: ros preview-stack [-h] [--region-id REGION_ID] --stack-name STACK_NAME
--template-url TEMPLATE_URL [--parameters PARAMETERS]
[--disable-rollback DISABLE_ROLLBACK]
[--timeout-in-minutes TIMEOUT_IN_MINUTES]
optional arguments:
-h, --help show this help message and exit
--region-id REGION_ID
The region that is associated with the stack
--stack-name STACK_NAME
The name that is associated with the stack
--template-url TEMPLATE_URL
Location of file containing the template body
--parameters PARAMETERS
A list of Parameter structures that specify input
parameters for the stack. Syntax: key=value,key=value
--disable-rollback DISABLE_ROLLBACK
Set to true to disable rollback of the stack if stack
creation failed
--timeout-in-minutes TIMEOUT_IN_MINUTES
The amount of time that can pass before the stack
status becomes CREATE_FAILED

Abandon stack

$ ros abandon-stack -h
usage: ros abandon-stack [-h] --region-id REGION_ID --stack-name STACK_NAME
--stack-id STACK_ID
optional arguments:
-h, --help show this help message and exit
--region-id REGION_ID
The region that is associated with the stack
--stack-name STACK_NAME
The name that is associated with the stack
--stack-id STACK_ID The id that is associated with the stack

List stacks

$ ros list-stacks -h
usage: ros list-stacks [-h] [--stack-name STACK_NAME] [--stack-id STACK_ID]
[--status {CREATE_COMPLETE,CREATE_FAILED,CREATE_IN_PROGRESS,DELETE_COMPLETE,DELETE_FAILED,DELETE_IN_PROGRESS,ROLLBACK_COMPLETE,ROLLBACK_FAILED,ROLLBACK_IN_PROGRESS}]
[--region-id REGION_ID] [--page-number PAGE_NUMBER]
[--page-size PAGE_SIZE]
optional arguments:
-h, --help show this help message and exit
--stack-name STACK_NAME
The name that is associated with the stack
--stack-id STACK_ID The id that is associated with the stack
--status {CREATE_COMPLETE,CREATE_FAILED,CREATE_IN_PROGRESS,DELETE_COMPLETE,DELETE_FAILED,DELETE_IN_PROGRESS,ROLLBACK_COMPLETE,ROLLBACK_FAILED,ROLLBACK_IN_PROGRESS}
status of stacks
--region-id REGION_ID
The region of stacks
--page-number PAGE_NUMBER
The page number of stack lists, start from 1, default
1
--page-size PAGE_SIZE
Lines each page, max 100, default 10

Describe stack

$ ros describe-stack -h
usage: ros describe-stack [-h] --stack-name STACK_NAME --stack-id STACK_ID
optional arguments:
-h, --help show this help message and exit
--stack-name STACK_NAME
The name that is associated with the stack
--stack-id STACK_ID The id that is associated with the stack

Commands on resources

List resources

$ ros list-resources -h
usage: ros list-resources [-h] --stack-name STACK_NAME --stack-id STACK_ID
optional arguments:
-h, --help show this help message and exit
--stack-name STACK_NAME
The name of stack
--stack-id STACK_ID The id of stack

Describe resource

$ ros describe-resource -h
usage: ros describe-resource [-h] --stack-name STACK_NAME --stack-id STACK_ID
--resource-name RESOURCE_NAME
optional arguments:
-h, --help show this help message and exit
--stack-name STACK_NAME
The name of stack
--stack-id STACK_ID The id of stack
--resource-name RESOURCE_NAME
The name of resource

Resource type

$ ros resource-type -h
usage: ros resource-type [-h]
[--status {UNKNOWN,SUPPORTED,DEPRECATED,UNSUPPORTED,HIDDEN}]
optional arguments:
-h, --help show this help message and exit
--status {UNKNOWN,SUPPORTED,DEPRECATED,UNSUPPORTED,HIDDEN}
The status of resource

Resource type detail

$ ros resource-type-detail -h
usage: ros resource-type-detail [-h] --name NAME
optional arguments:
-h, --help show this help message and exit
--name NAME The name of resource

Resource type template

$ ros resource-type-template -h
usage: ros resource-type-template [-h] --name NAME
optional arguments:
-h, --help show this help message and exit
--name NAME The name of resource

Commands on template

Get template

$ ros get-template -h
usage: ros get-template [-h] --stack-name STACK_NAME --stack-id STACK_ID
optional arguments:
-h, --help show this help message and exit
--stack-name STACK_NAME
The name that is associated with the stack
--stack-id STACK_ID The id that is associated with the stack

Validate template

$ ros validate-template -h
usage: ros validate-template [-h] --template-url TEMPLATE_URL
optional arguments:
-h, --help show this help message and exit
--template-url TEMPLATE_URL
Location of file containing the template body

Other commands

List regions

Lists all regions; needs no parameters.

$ ros list-regions -h
usage: ros list-regions [-h]
optional arguments:
-h, --help show this help message and exit

List events

$ ros list-events -h
usage: ros list-events [-h] --stack-name STACK_NAME --stack-id STACK_ID
[--resource-status {COMPLETE,FAILED,IN_PROGRESS}]
[--resource-name RESOURCE_NAME]
[--resource-type RESOURCE_TYPE]
[--page-number PAGE_NUMBER] [--page-size PAGE_SIZE]
optional arguments:
-h, --help show this help message and exit
--stack-name STACK_NAME
The name that is associated with the stack
--stack-id STACK_ID The id that is associated with the stack
--resource-status {COMPLETE,FAILED,IN_PROGRESS}
status of resources: COMPLETE\FAILED\IN_PROGRESS
--resource-name RESOURCE_NAME
The name of resources
--resource-type RESOURCE_TYPE
The type of resources
--page-number PAGE_NUMBER
The page number of stack lists, start from 1, default
1
--page-size PAGE_SIZE
Lines each page, max 100, default 10

Set userdata

$ ros set-userdata -h
usage: ros set-userdata [-h] --key-id KEY_ID --key-secret KEY_SECRET
--region-id REGION_ID [--json-ident JSON_IDENT]
[--debug {False,True}]
optional arguments:
-h, --help show this help message and exit
--key-id KEY_ID The default Aliyun access key id
--key-secret KEY_SECRET
The default Aliyun access key secret
--region-id REGION_ID
The default region
--json-ident JSON_IDENT
The default json indent when output in json format
--debug {False,True} Whether to print debug info
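The default configuration shown under Config is plain INI, so it can be read and written with Python's configparser. A minimal sketch, assuming placeholder values and an illustrative file name (ros-cli's real config path depends on where ros is installed):

```python
import configparser

# Build the default configuration ros-cli describes (values are placeholders).
config = configparser.ConfigParser()
config["ACCESS"] = {
    "ACCESS_KEY_ID": "YOUR_KEY_ID",
    "ACCESS_KEY_SECRET": "YOUR_KEY_SECRET",
    "REGION_ID": "YOUR_REGION",
}
config["OTHER"] = {"JSON_INDENT": "2", "DEBUG": "False"}

with open("ros_config.ini", "w") as f:
    config.write(f)

# Read it back, as the CLI would when --region-id is not given.
loaded = configparser.ConfigParser()
loaded.read("ros_config.ini")
print(loaded["ACCESS"]["REGION_ID"])        # YOUR_REGION
print(loaded["OTHER"].getboolean("DEBUG"))  # False
```

Note that configparser option names are case-insensitive, so the upper-case keys in the file resolve normally.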
|
aliyun-sdk1
|
Failed to fetch description. HTTP Status Code: 404
|
aliyun-sdk12
|
Failed to fetch description. HTTP Status Code: 404
|
aliyunsdkcms
|
Aliyun Python SDK is the official software development kit. It makes it easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions 2.6.5 and greater.
Documentation: Please visit http://develop.aliyun.com/sdk/python
|
aliyun-sdk-common-managed-credentials-provider
|
No description available on PyPI.
|
aliyunsdkcore
|
This is the core module of Aliyun Python SDK. Aliyun Python SDK is the official software development kit. It makes it easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions 3.0 and greater.
Documentation: Please visit https://github.com/duangy/aliyunsdkcore
|
aliyunsdkeci
|
aliyun-python-sdk-eci
This is the eci module of Aliyun Python SDK. Aliyun Python SDK is the official software development kit. It makes it easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions 2.6.5 and greater.
Documentation: Please visit http://develop.aliyun.com/sdk/python
|
aliyun-secret-manager-client
|
The Aliyun Secrets Manager Client for Python enables Python developers to easily work with Aliyun KMS Secrets.

Read this in other languages: 简体中文
<https://github.com/aliyun/aliyun-secretsmanager-client-python/blob/master/README.zh-cn.rst>

Aliyun Secrets Manager Client: Homepage | Issues | Release

License: Apache License 2.0

Features

Provides quick integration capability to obtain secret information
Provides an Alibaba secrets cache (memory cache or encrypted file cache)
Provides disaster tolerance via secrets with the same secret name and secret data in different regions
Provides a default backoff strategy and user-defined backoff strategies

Requirements

Python 2.7, 3.5, 3.6, 3.7, 3.8, 3.9

Install

Install the official release version through PIP (taking Linux as an example):

$ pip install aliyun-secret-manager-client

You can also install the unzipped installer package directly:

$ sudo python setup.py install

Sample Code

Ordinary User Sample Code

Build a Secrets Manager Client from system environment variables (see the system environment variable settings for details):

from alibaba_cloud_secretsmanager_client.secret_manager_cache_client_builder import SecretManagerCacheClientBuilder

if __name__ == '__main__':
    secret_cache_client = SecretManagerCacheClientBuilder.new_client()
    secret_info = secret_cache_client.get_secret_info("#secretName#")
    print(secret_info.__dict__)

Build a Secrets Manager Client from the given parameters (accessKey, accessSecret, regionId, etc.):

import os
from alibaba_cloud_secretsmanager_client.secret_manager_cache_client_builder import SecretManagerCacheClientBuilder
from alibaba_cloud_secretsmanager_client.service.default_secret_manager_client_builder import DefaultSecretManagerClientBuilder

if __name__ == '__main__':
    secret_cache_client = SecretManagerCacheClientBuilder.new_cache_client_builder(
        DefaultSecretManagerClientBuilder.standard()
        .with_access_key(os.getenv("#accessKeyId#"), os.getenv("#accessKeySecret#"))
        .with_region("#regionId#").build()) \
        .build()
    secret_info = secret_cache_client.get_secret_info("#secretName#")
    print(secret_info.__dict__)

Particular User Sample Code

Use custom parameters or a customized implementation:

import os
from alibaba_cloud_secretsmanager_client.secret_manager_cache_client_builder import SecretManagerCacheClientBuilder
from alibaba_cloud_secretsmanager_client.cache.file_cache_secret_store_strategy import FileCacheSecretStoreStrategy
from alibaba_cloud_secretsmanager_client.service.default_secret_manager_client_builder import DefaultSecretManagerClientBuilder
from alibaba_cloud_secretsmanager_client.service.default_refresh_secret_strategy import DefaultRefreshSecretStrategy
from alibaba_cloud_secretsmanager_client.service.full_jitter_back_off_strategy import FullJitterBackoffStrategy

if __name__ == '__main__':
    secret_cache_client = SecretManagerCacheClientBuilder \
        .new_cache_client_builder(
            DefaultSecretManagerClientBuilder.standard()
            .with_access_key(os.getenv("#accessKeyId#"), os.getenv("#accessKeySecret#"))
            .with_back_off_strategy(FullJitterBackoffStrategy(3, 2000, 10000))
            .with_region("#regionId#").build()) \
        .with_cache_secret_strategy(FileCacheSecretStoreStrategy("#cacheSecretPath#", True, "#salt#")) \
        .with_refresh_secret_strategy(DefaultRefreshSecretStrategy("#ttlName#")) \
        .with_cache_stage("#stage#") \
        .with_secret_ttl("#secretName#", 1 * 60 * 1000) \
        .build()
    secret_info = secret_cache_client.get_secret_info("#secretName#")
    print(secret_info.__dict__)
|
aliyun-sls-logger
|
aliyun-sls-logger

aliyun-sls-logger is an implementation of asynchronous Aliyun logging designed to integrate with asynchronous code. This library provides no additional features beyond logging.

Installation

To install aliyun-sls-logger, simply use pip:

pip install aliyun-sls-logger

Usage

To use aliyun-sls-logger, create an instance of the QueuedLogHandler class and add it to your logger. The QueuedLogHandler requires the following parameters:

access_key_id: The access key ID for your Aliyun account.
access_key_secret: The access key secret for your Aliyun account.
endpoint: The endpoint for your Aliyun SLS service.
project: The name of the project to log to.
logstore: The name of the logstore to log to.

Once you have created an instance of the QueuedLogHandler, you can add it to your logger like any other handler:

from aiologger import Logger
from aiologger.levels import LogLevel
from aliyun_sls_logger.logger_handler import QueuedLogHandler

logger = Logger(name=__name__, level=LogLevel.INFO)
handler = QueuedLogHandler(
    access_key_id='your_access_key_id',
    access_key='your_access_key_secret',
    end_point='your_endpoint',
    project='your_project',
    log_store='your_logstore',
    extract_json=True,
    extract_json_prefix='test_')
logger.add_handler(handler)

You can then use the logger as usual:

await logger.info('This is a log message.')
|
aliyun-sms
|
A third-party Python SDK for Aliyun SMS. Python: >= 3.6
|
aliyun-sms-code
|
aliyun_sms_code

Documentation

Send mobile verification codes using the Aliyun SMS service.

First make sure you get your access key, and set your access key ID and access key secret as the environment variables ALIBABA_CLOUD_ACCESS_KEY_ID and ALIBABA_CLOUD_ACCESS_KEY_SECRET.
See "Create an AccessKey" (创建AccessKey).

Then activate the Aliyun Short Message Service (SMS), submit your sign and template SMS, and wait for the audit results. After approval, a sign name and template code will be given to you.
In the same way, set the sign name and template code as the environment variables SIGN_NAME and TEMPLATE_CODE.
See "SMS service" (短信服务).

Install

Run pip install aliyun_sms_code
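Since aliyun_sms_code reads its credentials and template settings from the environment, a minimal sketch of preparing those variables before calling the package (all values below are placeholders, not real credentials):

```python
import os

# Placeholders; substitute your real Aliyun credentials and SMS settings.
os.environ["ALIBABA_CLOUD_ACCESS_KEY_ID"] = "your-access-key-id"
os.environ["ALIBABA_CLOUD_ACCESS_KEY_SECRET"] = "your-access-key-secret"
os.environ["SIGN_NAME"] = "your-sign-name"
os.environ["TEMPLATE_CODE"] = "your-template-code"

# Verify everything the package expects is present before using it.
required = ("ALIBABA_CLOUD_ACCESS_KEY_ID", "ALIBABA_CLOUD_ACCESS_KEY_SECRET",
            "SIGN_NAME", "TEMPLATE_CODE")
missing = [name for name in required if not os.environ.get(name)]
print(missing)  # []
```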
|
aliyun-SMS-py3.6
|
No description available on PyPI.
|
aliyunstoreplugin
|
Aliyun OSS store plugin for MLflow

This repository provides an MLflow plugin that allows users to use Aliyun OSS as the artifact store for MLflow.

Implementation overview

aliyunstoreplugin: this package includes the AliyunOssArtifactRepository class that is used to read and write artifacts from Aliyun OSS storage.

The setup.py file defines entrypoints that tell MLflow to automatically associate oss URIs with the AliyunOssArtifactRepository implementation when the aliyunstoreplugin library is installed. The entrypoints are configured as follows:

entry_points={
    "mlflow.artifact_repository": [
        "oss=aliyunstoreplugin.store.artifact.aliyun_oss_artifact_repo:AliyunOssArtifactRepository"
    ]
},

Usage

To store artifacts in Aliyun OSS Storage, specify a URI of the form oss://<bucket>/<path>.
This plugin expects Aliyun Storage access credentials in the MLFLOW_OSS_ENDPOINT_URL, MLFLOW_OSS_KEY_ID and MLFLOW_OSS_KEY_SECRET environment variables, so you must set these variables on both your client application and your MLflow tracking server. Finally, you must run pip install oss2 separately (on both your client and the server) to access Aliyun OSS Storage.
|
aliyunswarm
|
Cluster access point address: see
https://help.aliyun.com/document_detail/26063.html?spm=a2c4g.11186623.6.768.81bb2d05Wmhak0

Test

from aliyunswarm import *
if __name__ == "__main__":
swarm_api = SwarmApi("https://host:port/",
"/path/ca.pem",
"/path/cert.pem",
"/path/key.pem")
################################################
# applications api
################################################
# query all applications
swarm_api.query_applications()
# query an application
swarm_api.query_applications('swarmtest')
# create an application
template = open('./compose/compose_ngx.yaml', 'r').read()
swarm_api.create_application(template, 'swarmtest', 'swarm test')
# stop an application
swarm_api.stop_application('swarmtest')
# start an application
swarm_api.start_application('swarmtest')
# kill an application
swarm_api.kill_application('swarmtest')
# delete an application
swarm_api.delete_application('swarmtest')
# redeploy an application
swarm_api.redeploy_application('swarmtest')
# update an application
template = open('./compose/compose_ngx.yaml', 'r').read()
swarm_api.update_application(template, 'swarmtest', 'swarmtest 2', version='2.0')
################################################
# services api
################################################
# query services
swarm_api.query_services('api')
# query the service of a given application
swarm_api.query_service('swarmtest', 'api')
# start the service of a given application
swarm_api.start_service('swarmtest', 'api')
# stop the service of a given application
swarm_api.stop_service('swarmtest', 'api')
# kill the service of a given application
swarm_api.kill_service('swarmtest', 'api')
|
aliyun-swarm-sdk
|
# create_application
"""
data = {
    "name": "nginx",
    "description": "This is a test application",
    'template': 'version: \'2\'\r\nservices: \r\n nginx4:\r\n volumes:\r\n - /data/servicesLog:/data/servicesLog\r\n image: "xxxx"\r\n restart: always\r\n environment:\r\n - msjvm="2048"\r\n - mxjvm="2048"\r\n - env_tag=prod\r\n command: ["/bin/sh","/usr/local/run.sh"]\r\n labels:\r\n aliyun.routing.port_8080: nginx4;http://nginx4.xxxx.com\r\n aliyun.routing.session_sticky: false\r\n aliyun.global: true',
    "version": "1.0",
}
Swarm_Object.create_application('https://xxxxxxxxxxx:20180', 'ca.pem', 'cert.pem', 'key.pem', data)
"""

# update_application
"""
data = {
    "description": "This is a test application",
    'template': 'version: \'2\'\r\nservices: \r\n nginx4:\r\n volumes:\r\n - /data/servicesLog:/data/servicesLog\r\n image: "docker.wanshifu.com:5000/worker"\r\n restart: always\r\n environment:\r\n - msjvm="2048"\r\n - mxjvm="2048"\r\n command: ["/bin/sh","/usr/local/run.sh"]\r\n labels:\r\n aliyun.routing.port_8080: nginx4;http://nginx4.wanshifu.com\r\n aliyun.routing.session_sticky: false\r\n aliyun.global: true',
    "version": "4.0",
}
Swarm_Object.update_application('https://xxxxxxxxxxx:20180', 'ca.pem', 'cert.pem', 'key.pem', data, 'nginx')
"""

# update_application (blue-green)
"""
data = {
    "update_method": "blue-green",
    "description": "This is a test application",
    'template': 'version: \'2\'\r\nservices: \r\n nginx8:\r\n volumes:\r\n - /data/servicesLog:/data/servicesLog\r\n image: "docker.wanshifu.com:5000/worker"\r\n restart: always\r\n environment:\r\n - msjvm="2048"\r\n - mxjvm="2048"\r\n command: ["/bin/sh","/usr/local/run.sh"]\r\n labels:\r\n aliyun.routing.port_8080: nginx4;http://nginx4.wanshifu.com\r\n aliyun.routing.session_sticky: false\r\n aliyun.global: true',
    "version": "5.0",
}
Swarm_Object.update_application('https://xxxxxxxxxxx:20180', 'ca.pem', 'cert.pem', 'key.pem', data, 'nginx')
"""

# update_confirmation
# Swarm_Object.update_confirmation('https://xxxxxxxxxxx:20180', 'ca.pem', 'cert.pem', 'key.pem', 'nginx')

# redeploy_application
# Swarm_Object.redeploy_application('https://xxxxxxxxxxx:20180', 'ca.pem', 'cert.pem', 'key.pem', 'nginx')

# stop_application
# Swarm_Object.stop_application('https://xxxxxxxxxxx:20180', 'ca.pem', 'cert.pem', 'key.pem', 'nginx')

# start_application
# Swarm_Object.start_application('https://xxxxxxxxxxx:20180', 'ca.pem', 'cert.pem', 'key.pem', 'nginx')

# kill_application
# Swarm_Object.kill_application('https://xxxxxxxxxxx:20180', 'ca.pem', 'cert.pem', 'key.pem', 'nginx')

# delete_application
# Swarm_Object.delete_application('https://xxxxxxxxxxx:201801', 'ca.pem', 'cert.pem', 'key.pem', 'nginx')
|
aliyun-table
|
aliyun-table

Operations on Aliyun table store. Currently supported:

[ ] query all data
[x] table search: phrase query, prefix query, term query, range query
[x] insert data
[x] update data

Known bugs:

[ ] when a query returns more than 100 rows, omitting a sort may make paging impossible.

CHANGELOG

v0.1.2 (2020-03-19)

put_row and update_row now return the primary key information as a dict.

Install

pip install --upgrade aliyun-table

After installation, run in an interactive session:

>>> from aliyun_table import TableClient
>>>

If no error is raised, the installation succeeded.

Initialization

The following variables can be set as environment variables:

OTS_END_POINT
OTS_ACCESS_KEY_ID
OTS_ACCESS_KEY_SECRET

from aliyun_table import TableClient
table_cli = TableClient()

Insert and update data

table_cli = TableClient(
    instance_name='instance name',
    end_point='endpoint',
    access_key_id='access_key_id',
    access_key_secret='access_key_secret')

data = {
    'pk1': '123',
    # pk2 is an auto-increment column and must be passed as None.
    'pk2': None,
    'col1': 'test',
    'article': 'This is a test article.'
}
pk_dict = table_cli.put_row(
    table_name='table_name',
    pk_list=['primary_key1', 'primary_key2'],
    data=data)
print(pk_dict)

Reference

class TableClient(object):
    def __init__(self, table_name, instance_name, end_point=None,
                 access_key_id=None, access_key_secret=None):
        ...

    def show_index(self, index_name='filter'):
        ...

    def put_row(self, table_name, pk_list, data):
        """Write a row; on success returns the consumed CU.

        :param pk_list [list]: primary key name list, e.g. ['pk1', 'pk2']
        :param data [dict]: data dict, including the primary keys.
        """
        ...

    def update_row(self, pk_list, data):
        """Update a row.

        :param pk_list [list]: primary key name list, e.g. ['pk1', 'pk2']
        :param data [dict]: data dict, including the primary keys.
        """
        ...

    def query(self, table_name, must_query_list=[], must_not_query_list=[],
              should_query_list=[], get_total_count=False, sort_list=None,
              index_name='filter', column_to_get=None, limit=None):
        """First version of the AND query; builds an Aliyun query from the
        user-supplied conditions.

        :param must_query_list [list]: conditions that must match; if not given, all data is queried
        :param must_not_query_list [list]: conditions that must not match (rows matching them are excluded)
        :param should_query_list [list]: conditions of which at least one should match
        :param get_total_count [bool]: whether to fetch the total count of matches
        :param sort_list [list]: column sort list, as a list; no sorting by default
        :param limit [int]: maximum number of rows to return
        :return: an iterator over the matched rows; the result differs with the value of get_total_count
        """
        ...
|
aliyun-voice
|
Python version (a Go version is also available).
Install
pip install aliyun_voice
TTS (speech synthesis service)
Voice(ACCESS_ID, ACCESS_KEY): Aliyun authentication.
from aliyun_voice.voice import Voice
auth = Voice(ALIYUNACCESSID, ALIYUNACCESSKEY)
auth.get_voice(text, **tts_params): returns the voice file as a byte array, or raises an error.
auth.save_voice(text, dist, **tts_params): saves the voice file to the directory dist, or raises an error.
auth.tts_params: sets the voice-file attributes; see: https://help.aliyun.com/document_detail/52793.html?spm=5176.doc30422.6.587.Z6Muvv
Test
python -m unittest discover -v
|
aliyyunpdf
|
No description available on PyPI.
|
aljex
|
Failed to fetch description. HTTP Status Code: 404
|
aljp-messagemedia-rest-api
|
Australia's Leading Messaging Solutions for Business and Enterprise.
|
aljpy
|
No description available on PyPI.
|
aljson
|
Convert a SqlAlchemy query object to a dict (json)
Install
pip install aljson
Usage
from aljson import BaseMixin

# The Sqlalchemy model
class Parent(Base, BaseMixin):
    __tablename__ = 'parent'
    id = sa.Column(sa.Integer, primary_key=True, unique=True)
    name = sa.Column(sa.String(64))

# query the Parent model
result = session.query(Parent).first()
print(result.to_json())
Full example
from sqlalchemy.orm import sessionmaker, relationship, backref
from sqlalchemy.ext.declarative import declarative_base
from aljson import BaseMixin
import sqlalchemy as sa

Base = declarative_base()

class Parent(Base, BaseMixin):
    __tablename__ = 'parent'
    id = sa.Column(sa.Integer, primary_key=True, unique=True)
    name = sa.Column(sa.String(64))

class Child(Base, BaseMixin):
    __tablename__ = 'child'
    id = sa.Column(sa.Integer, primary_key=True, unique=True)
    name = sa.Column(sa.String(64))
    parent_id = sa.Column(sa.Integer, sa.ForeignKey('parent.id'))
    parent = relationship("Parent")

# Create an engine that stores data in the local directory's
# sqlalchemy_example.db file.
engine = sa.create_engine('sqlite:///my_database.sqlite')

# Create all tables in the engine. This is equivalent to "Create Table"
# statements in raw SQL.
Base.metadata.create_all(engine)
DBSession = sessionmaker(bind=engine)
session = DBSession()

# Create a new parent and a child
new_parent = Parent()
new_parent.name = "parent_1"
new_child = Child()
new_child.name = "child_1"
new_child.parent = new_parent
session.add(new_parent)
session.add(new_child)
session.commit()

# Search for a row
query_result = session.query(Child).first()

# And you can call .to_json
print(query_result.to_json())
# The result should be like this:
# {'id': 1, 'name': 'child_1', 'parent_id': 1, 'parent': {'id': 1, 'name': 'parent_1'}}
|
alkali
|
alkali is a simple database engine. If you’re currently using a list of
dicts then you should come take a look. alkali’s main goal is to allow
the developer to easily control the on disk format.
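alkali's own API isn't shown in this blurb, so rather than guess it, here is a stdlib-only sketch of the hand-rolled pattern the project targets: a list of dicts whose on-disk format (plain JSON here) you manage yourself.

```python
import json
import tempfile

# The "list of dicts" pattern alkali aims to replace: the on-disk
# format (pretty-printed JSON here) is chosen and managed by hand.
books = [
    {"title": "Dune", "year": 1965},
    {"title": "Hyperion", "year": 1989},
]

with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as fh:
    json.dump(books, fh, indent=2)   # save in the format you picked

with open(fh.name) as fh2:           # load it back
    restored = json.load(fh2)

print(restored == books)
# → True
```

See alkali's own documentation for the model API that takes over this bookkeeping while keeping the on-disk format under your control.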
|
alkaline
|
No description available on PyPI.
|
alkana
|
alkana
A tool to get the katakana reading of an alphabetical string.
Installing
Python 3 or higher is required.
From pip
python3 -m pip install -U alkana
Clone from GitHub
$ git clone https://github.com/cod-sushi/alkana.py
$ cd alkana.py
$ python3 -m pip install -U .
Example
Python import
import alkana
print(alkana.get_kana("Hello"))
print(alkana.get_kana("World"))
print(alkana.get_kana("abcdefg"))
Output of the above example is:
ハロー
ワールド
None
If the reading does not exist, returns None.
Command line
$ alkana hello
ハロー
$ alkana world
ワールド
$ alkana abcdefg
If the reading does not exist, there is no output.
External file
Alkana can extend the dictionary.
sample.csv
hogehoge,ホゲホゲ
piyopiyo,ピヨピヨ
...
import alkana
alkana.add_external_data('./sample.csv')
print(alkana.get_kana('hogehoge'))  # ホゲホゲ
Copyrights
Alphabetical word - katakana dictionary's data is from bep-eng.dic.
Bilingual Emacspeak Project
(c) 1999-2002 Bilingual Emacspeak Project
License
GPLv2
|
alkana.py
|
alkana.py
A tool to get the katakana reading of an alphabetical string.
Installing
Python 3 or higher is required.
From pip
python3 -m pip install -U alkana.py
Clone from GitHub
$ git clone https://github.com/cod-sushi/alkana.py
$ cd alkana.py
$ python3 -m pip install -U .
Example
Python import
Quick example:
import alkana
print(alkana.get_kana("Hello"))
print(alkana.get_kana("World"))
print(alkana.get_kana("abcdefg"))
Output of the above example is:
ハロー
ワールド
None
If the reading does not exist, returns None.
Command line
$ alkana hello
ハロー
$ alkana world
ワールド
$ alkana abcdefg
If the reading does not exist, there is no output.
Copyrights
Alphabetical word - katakana dictionary's data is from bep-eng.dic.
Bilingual Emacspeak Project
(c) 1999-2002 Bilingual Emacspeak Project
License
GPLv2
|
alkanet-picard
|
UNKNOWN
|
alkemy-workflow
|
No description available on PyPI.
|
alkesqlite3
|
AlkSqlite3
Install
pip install alksqlite3
Author
Guanjie Wang
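The description gives no API details, so the snippet below is plain stdlib sqlite3, shown only to illustrate the kind of boilerplate a convenience wrapper like this usually targets; none of these calls are AlkSqlite3's.

```python
import sqlite3

# Plain stdlib sqlite3: connect, create, insert, query.
# (Illustrative only; AlkSqlite3's actual interface is undocumented.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE authors (name TEXT)")
conn.execute("INSERT INTO authors VALUES (?)", ("Guanjie Wang",))
rows = conn.execute("SELECT name FROM authors").fetchall()
print(rows)
# → [('Guanjie Wang',)]
conn.close()
```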
|
alkey
|
# Alkey[Alkey][] is a [Redis][] backed tool for generating cache keys that implicitlyupdate / invalidate when [SQLAlchemy][] model instances change, e.g.:from alkey.cache import get_cache_key_generatorkey_generator = get_cache_key_generator()# The `cache_key` will be invalidated when `instance1` or `instance2` change.cache_key = key_generator(instance1, instance2)It can be used by any [SQLAlchemy][] application that has access to [Redis][].Plus it has (optional) integration with the [Pyramid][] framework:`config.include` the package and generate keys using, e.g.:cache_key = request.cache_key(request.context)## How it Works[Alkey][] works by binding to the SQLAlchemy session's [before_flush][] and[after_commit][] events to maintain a unique token, in Redis, against everymodel instance. As long as the model instance has a unique `id` property, thistoken will change whenever the instance is updated or deleted. In addition,Alkey maintains a global write token and a token against each database table.You can use these to generate cache keys that invalidate:* when an *instance* changes* when a *table* changes; or* when *anything* changesThe main algorithm is to record instances as changed when they're flushed tothe db in the session's new, dirty or deleted lists (identifiers in the format`alkey:tablename#row_id`, e.g.: `alkey:users#1`, are stored in a Redis set).Then, when the session's transaction is committed, the tokens for each recordedinstance (plus their table and the global write token) are updated. This meansthat a cache key that contains the tokens will miss, causing the cached valueto be regenerated.New tokens are generated when instances are looked up that are not alreadyin the cache. So keys will always be invalidated if you lose / flush yourRedis data.> Note also that changes recorded during a transaction that'ssubsequently rolled back will be discarded (i.e.: the tokens will not be updated)*unless* the rolled-back transaction is a sub-transaction. 
In that case — ifyour application code explicitly uses sub-transactions — rollbacks may leadto unnecessary cache-misses.## Configuring a Redis Client[Alkey][] looks in the `os.environ` (i.e.: you need to provide[environment variables][]) for a values to configure a [redis client][]:* `REDIS_URL`: a connection string including any authenticaton information, e.g.:`redis://username:password@hostname:port`* `REDIS_DB`: defaults to `0`* `REDIS_MAX_CONNECTIONS`: the maximum number of connections for the client'sconnection pool (defaults to not set)## Binding to Session EventsUse the `alkey.events.bind` function, e.g.:from alkey import eventsfrom myapp import Session # the sqlalchemy session you're usingevents.bind(Session)## Generating Cache KeysYou can then instantiate an `alkey.cache.CacheKeyGenerator` and call it withany of the following types as positional arguments to generate a cache key:* SQLAlchemy model instances* model instance identifiers in the format `alkey:tablename#row_id`* SQLAlchemy model classes* model class identifiers in the format `alkey:tablename#*`* the `alkey.constants.GLOBAL_WRITE_TOKEN`, which has the value `alkey:*#*`* arbitrary values that can be coerced to a unicode stringE.g. 
using the `alkey.cache.get_cache_key_generator` factory to instantiate:from alkey.cache import get_cache_key_generatorkey_generator = get_cache_key_generator()cache_key = key_generator(instance, 'alkey:users#1', 1, 'foo', {'bar': 'baz'})Or, for example, imagine you have a `users` table, of which `user` is an instancewith an `id` of `1`:# Invalidate when this user changes.cache_key = key_generator(user)cache_key = key_generator('alkey:users#1')# Invalidate when any user is inserted, updated or deleted.cache_key = key_generator(user.__class__)cache_key = key_generator('alkey:users#*')# Invalidate when any instance of any type is inserted, updated or deleted.cache_key = key_generator('alkey:*#*')Or you can directly get the instance token with `alkey.cache.get_token`, e.g.:from alkey.cache import get_tokenfrom alkey.client import get_redis_clientredis_client = get_redis_client()token = get_token(redis_client, user)token = get_token(redis_client, 'alkey:users#1')## Pyramid IntegrationIf you're writing a [Pyramid][] application, you can bind to the session eventsby just including the package:config.include('alkey')This will, by default, use the [pyramid_basemodel][] threadlocal scoped session.To use a different session class, provide a dotted path to it as the`alkey.session_cls` in your .ini settings, e.g.:alkey.session_cls=myapp.model.SessionAn appropriately configured `alkey.cache.CacheKeyGenerator` instance will thenbe available as ``request.cache_key``, e.g:key = request.cache_key(instance1, instance2, 'arbitrary string')Or e.g.: in a [Mako template][]:<%page cached=True, cache_key=${request.cache_key(1, self.uri, instance)} />## Tests[Alkey][] has been developed and tested against Python2.7. To run the tests,install `mock`, `nose` and `coverage` and either hack the `setUp` method in`alkey.tests:IntegrationTest` or have a Redis db available at`redis://localhost:6379`. 
Then, e.g.:$ nosetests alkey --with-doctest --with-coverage --cover-tests --cover-package alkey..........................Name Stmts Miss Cover Missing------------------------------------------------alkey 11 0 100%alkey.cache 74 0 100%alkey.client 73 0 100%alkey.constants 6 0 100%alkey.events 12 0 100%alkey.handle 76 0 100%alkey.interfaces 6 0 100%alkey.tests 184 0 100%alkey.utils 30 0 100%------------------------------------------------TOTAL 472 0 100%----------------------------------------------------------------------Ran 26 tests in 0.566sOK[alkey]: http://github.com/thruflo/alkey[Redis]: http://redis.io[SQLAlchemy]: http://www.sqlalchemy.org/[redis client]: https://github.com/andymccurdy/redis-py[before_flush]: http://docs.sqlalchemy.org/ru/latest/orm/events.html#sqlalchemy.orm.events.SessionEvents.before_flush[after_commit]: http://docs.sqlalchemy.org/ru/latest/orm/events.html#sqlalchemy.orm.events.SessionEvents.after_commit[Pyramid]: http://docs.pylonsproject.org/projects/pyramid/en/latest[Mako template]: http://www.makotemplates.org/[pyramid_basemodel]: http://github.com/thruflo/pyramid_basemodel[environment variables]: http://blog.akash.im/per-project-environment-variables-with-forema[Heroku addons]: https://www.google.co.uk/search?q=Heroku+addons+redis
|
alkh
|
alkh [al-khwarizmi]
Algorithmic python debugging
1. Convert your debugger stack to jupyter notebook
2. Focus on code pathways using local web application
Installation
pip install alkh
Convert your debugger stack to jupyter notebook
API
function name: take_it_offline
description: create jupyter notebook based on the program stack
parameters:
notebook_dir_path: Optional[str] = None, directory path to save the notebook in
levels: Optional[int] = 1, number of program stack layers to put in notebook
Usage
import alkh
alkh.take_it_offline('path-of-notebooks-directory')
or
alkh.take_it_offline('path-of-notebooks-directory', levels=2)
or
bash:
export ALKH_NOTEBOOKS_PATH='path-to-notebooks-directory'
python:
import alkh
alkh.take_it_offline()
or
alkh.take_it_offline(levels=2)
Usage flow example
Stop at a breakpoint within PyCharm
Use the Console to run code within the debugger
Run: import alkh
Run: alkh.take_it_offline('path-of-notebooks-directory')
Start Jupyter
Run the notebook
Focus on code pathways using local web application
API
function name: analyze
description: launches web application to analyze code pathways
parameters: None
Usage
import alkh
alkh.analyze()
Usage flow example
Add two lines to the top of the file
Run the file
Analyze your code
|
alkira
|
No description available on PyPI.
|
alkira-sdk
|
No description available on PyPI.
|
alkivi-config-manager
|
Python config-manager used at AlkiviPackageExampleWrite a conf like[default]; general configuration: default endpointendpoint=dev[dev]; configuration specific to 'dev' endpointenv=dev[prod]; configuration specific to 'prod' endpointenv=prodfromalkivi.configimportConfigManagerconfig=ConfigManager('test')# This will look for several files, in order# 1. Current working directory: ``./test.conf``# 2. Current user's home directory ``~/.test.conf``# 3. System wide configuration ``/etc/test.conf``# Then find the endpointendpoint=config.get('default',endpoint)# Or use a specific oneendpoint='prod'# And thenenv=config.get(endpoint,'env')ParametersTestsTesting is set up usingpytestand coverage is
handled with the pytest-cov plugin. Run your tests with py.test in the root directory. Coverage is run by default and is set in the pytest.ini file. To see
an html output of coverage openhtmlcov/index.htmlafter running the
tests.TODOTravis CIThere is a.travis.ymlfile that is set up to run your tests for
python 2.7 and python 3.2, should you choose to use it.TODO
|
alkivi-google-client
|
Google python client used at AlkiviPackageExamplefromalkivi.googleimportclientasgoogleimportloggingscope='https://www.googleapis.com/auth/admin.directory.user.readonly'# Using default configurationgoogle_client=google.Client(scopes=[scope])# Using specific endpointgoogle_client=google.Client(endpoint='account2')# Get directory client for Admin SDK apiimpersonate='[email protected]'directory_client=google_client.get_directory_client(impersonate)# Get a gmail client for gmail APIgmail_client=google_client.get_gmail_client()CredentialsCredentials are fetched from, in priority order: - ./google.conf (script
directory) - $HOME/.google.conf - /etc/google.confExample[default]; general configuration: default endpointendpoint=account1[account1]; configuration specific to 'account1' endpoint; using can be; - service: for Service Account; - oauth: for OAuth authentificationusing=service; for Service Accountservice_account_key=/path/to_your_service_key.json[account2]; other account configurationusing=oauth; for OAuthclient_id=your_client_idclient_secret=your_client_secretrefresh_token=your_refresh_tokenTestsTesting is set up usingpytestand coverage is
handled with the pytest-cov plugin. Run your tests with py.test in the root directory. Coverage is run by default and is set in the pytest.ini file. To see
an html output of coverage openhtmlcov/index.htmlafter running the
tests.TODOTravis CIThere is a.travis.ymlfile that is set up to run your tests for
python 2.7 and python 3.2, should you choose to use it.TODO
|
alkivi-logger
|
Python logger used at AlkiviPackageExampleimportloggingfromalkivi.loggerimportLogger## Define Logger#logger=Logger(min_log_level_to_mail=None,min_log_level_to_save=logging.DEBUG,min_log_level_to_print=logging.DEBUG,min_log_level_to_syslog=None,emails=['[email protected]'],use_root_logger=False)# If set to True will use root_logger## All log level, from bottom to top#logger.debug('This is a debug comment')logger.info('This is an info comment')logger.warning('This is a warning comment')logger.error('This is an error comment')logger.critical('This is a critical comment')try:1/0exceptExceptionase:logger.exception('This is an exception comment')pass## You can adjust log level on the fly#logger.set_min_level_to_mail(logging.WARNING)logger.set_min_level_to_save(logging.WARNING)## You can use loops#logger.new_loop_logger()foriinrange(0,11):logger.new_iteration(prefix='i=%i'%(i))logger.debug("We are now prefixing all logger")ifi==9:logger.debug("Lets do another loop")logger.new_loop_logger()forjinrange(0,5):logger.new_iteration(prefix='j=%i'%(j))logger.debug("Alkivi pow@")# Dont forget to close logger or shit will happenlogger.del_loop_logger()# Bonus point : if emailing is set, only send email for the loop we have# errorifi==10:logger.critical("We shall receive only mail for last loop")logger.del_loop_logger()logger.debug('We now remove an loop, thus a prefix')ParametersTestsTesting is set up usingpytestand coverage is
handled with the pytest-cov plugin. Run your tests with py.test in the root directory. Coverage is run by default and is set in the pytest.ini file. To see
an html output of coverage openhtmlcov/index.htmlafter running the
tests.TODOTravis CIThere is a.travis.ymlfile that is set up to run your tests for
python 2.7 and python 3.2, should you choose to use it.TODO
|
alkivi-odoo-client
|
Odoo python client used at Alkivi. Based on odoorpc, to which we add additional functions.
Package
Example
from alkivi.odoo import client as odoo
# Using default configuration
client = odoo.Client()
# Using specific endpoint
client = odoo.Client(endpoint='prod')
# TODO
Credentials
Credentials are fetched from, in priority order: - ./odoo.conf (script
directory) - $HOME/.odoo.conf - /etc/odoo.confExample[default]; general configuration: default endpointendpoint=dev[dev]; configuration specific to 'dev' endpointprotocol=jsonrpc+sslport=443url=odoo.domainversion=8.0db=odooDatabaseuser=pdooUserpassword=AweSomePasswOrd[prod]; other configurationTestsTesting is set up usingpytestand coverage is
handled with the pytest-cov plugin. Run your tests with py.test in the root directory. Coverage is run by default and is set in the pytest.ini file. To see
an html output of coverage openhtmlcov/index.htmlafter running the
tests.TODOTravis CIThere is a.travis.ymlfile that is set up to run your tests for
python 2.7 and python 3.2, should you choose to use it.TODO
|
alkompy
|
Python bindings for alkomp, a GPGPU library written in Rust for performing compute operations.
pip3 install alkompy
At this time, the Python interface is designed to work specifically with numpy ndarrays. This means you can quickly send a numpy array to a GPU with data_gpu = device.to_device(my_np_array) and run a computation using device.call(...). to_device returns an object that records the memory location of a GPU buffer, as well as its shape and type. To retrieve the contents of the buffer: device.get(data_gpu). The get function returns a numpy array with the same shape as my_np_array.
Build from source
git clone https://github.com/RustyBamboo/alkomp && cd alkomp
pip3 install -r requirements-dev.txt
python3 setup.py develop --user
python3 test/test.py
|
alkymi
|
alkymi ⚗️Alkymi is a pure Python (3.7+) library for describing and executing tasks and pipelines with built-in caching and
conditional evaluation based on checksums.Alkymi is easy to install, simple to use, and has very few dependencies outside of Python's standard library. The code
is cross-platform, and allows you to write your pipelines once and deploy to multiple operating systems (tested on
Linux, Windows and Mac).Documentation, including a quickstart guide, is providedhere.FeaturesEasily define complex data pipelines as decorated Python functionsThis allows you to run linting, type checking, etc. on your data pipelinesReturn values are automatically cached to disk, regardless of typeEfficiently checks if pipeline is up-to-dateChecks if external files have changed, bound functions have changed or if pipeline dependencies have changedNo domain specific language (DSL) or CLI tool, just regular PythonSupports caching and conditional evaluation in Jupyter NotebooksCross-platform - works on Linux, Windows and MacExpose recipes as a command-line interface (CLI) using alkymi'sLabtypeSample UsageFor examples of how to use alkymi, see thequickstart guide.Example code:[email protected]()deflong_running_task()->np.ndarray:# Perform expensive computation here ...hard_to_compute_result=np.array([42])# Return value will be automatically cached to diskreturnhard_to_compute_resultresult=long_running_task.brew()# == np.ndarray([42])Or one of the examples, e.g.MNIST.InstallationInstall via pip:pipinstall--useralkymiOr see theInstallation page.TestingAfter installing, you can run the test suite (use thelint,coverageandtype_checkrecipes to perform those
actions):python3labfile.pybrewtestLicensealkymi is licensed under The MIT License as found in the LICENSE.md fileUpcoming FeaturesThe following features are being considered for future implementation:Type annotations propagated from bound functions to recipesSupport for call/type checking all recipes (e.g. by adding acheckcommand toLab)Cache maintenance functionalityKnown Issuesalkymi currently doesn't check custom objects for altered external files when computing cleanliness (e.g.MyClasshas aself._some_paththat points to a file somewhere outside alkymi's internal cache)alk.foreach()currently only supports enumerable inputs of typeListorDictRecipes markedtransientwill always be dirty, and thus always require reevaluation. This functionality should be
replaced by a proper means of creating recipes that don't cache outputs, but only run when needed to provide inputs for
downstream recipes
|
all2graph
|
all2graph is an industrial-grade, highly automated deep-learning solution for business modeling, mainly used for modeling form data, relational data and time-series data.
Install the latest version:
pip install all2graph
To install the GPU build, first install the matching CUDA versions of pytorch and dgl by following their instructions.
For tutorials, see tutorials
|
all2vec
|
UNKNOWN
|
all4scripts-zfullio
|
No description available on PyPI.
|
allabaster
|
All these fancy sidebars and footers were stripped down, as well
as html head and body tags. Now the theme generates not complete html
pages, but html fragments. Its intended usage is to insert these
fragments into other html pages.CSS code was modified to decrease the chance of interference with
css rules of the main site.Breadcrumbs were added to simplify navigation.WarningThe rest of the page is not reworked yet, and copied from Alabaster.Alabaster is a visually (c)lean, responsive, configurable theme for theSphinxdocumentation system. It is Python 2+3 compatible.It began as a third-party theme, and is still maintained separately, but as of
Sphinx 1.3, Alabaster is an install-time dependency of Sphinx and is selected
as the default theme.Live examples of this theme can be seen onthis project’s own website,paramiko.org,fabfile.organdpyinvoke.org.For more documentation, please seehttp://alabaster.readthedocs.io.NoteYou can install thedevelopment versionviapip installalabaster==dev.
|
allabolag
|
This is a scraper for collecting data from allabolag.se. It has no formal relationship with the site.It is written and maintained forNewsworthy, but could possibly come in handy for other people as well.InstallingpipinstallallabolagExample usagefromallabolagimportCompanycompany=Company("559071-2807")# show all available data about the company in a raw...print(company.raw_data)# ...or cleaned formatprint(company.data)And you can iterate the list of recent liquidations.fromallabolagimportiter_liquidated_companiesforcompanyiniter_liquidated_companies(until="2019-06-01"):print(company)DevelopingTo run tests:python3-mpytestDeploymentTo deploy a new version to PyPi:Update Changelog below.Update version insetup.pyBuild:python3 setup.py sdist bdist_wheelUpload:python3 -m twine upload dist/allabolag-X.Y.X*…assuming you have Twine installed (pip install twine) and configured.Changelog0.1.7Bug fix: Add encoding for Python 2.70.1.6Fixes bug when company has remark about Svensk Handels Varningslistan0.1.5Make Python 2.7 compatible.0.1.4Updating _iter_liquidate_companies to handle rebuilt site.0.1.3Bug fixes0.1.0First version
|
all-against-all
|
Each item of a list against all others$pipinstallall-against-allfromall_against_allimportall_against_allimportoperatorlist_=[1,2,3,4]e=all_against_all(func=operator.add,iterable=list_,ignore_exceptions=True,skip_own=True)e=list(e)foreeine:print(ee)# [((1, 2), 3), ((1, 3), 4), ((1, 4), 5)]# [((2, 1), 3), ((2, 3), 5), ((2, 4), 6)]# [((3, 1), 4), ((3, 2), 5), ((3, 4), 7)]# [((4, 1), 5), ((4, 2), 6), ((4, 3), 7)]e=all_against_all(func=operator.add,iterable=list_,ignore_exceptions=True,skip_own=False)e=list(e)foreeine:print(ee)# [((1, 2), 3), ((1, 3), 4), ((1, 4), 5)]# [((2, 1), 3), ((2, 3), 5), ((2, 4), 6)]# [((3, 1), 4), ((3, 2), 5), ((3, 4), 7)]# [((4, 1), 5), ((4, 2), 6), ((4, 3), 7)]# [((1, 1), 2), ((1, 2), 3), ((1, 3), 4), ((1, 4), 5)]# [((2, 1), 3), ((2, 2), 4), ((2, 3), 5), ((2, 4), 6)]# [((3, 1), 4), ((3, 2), 5), ((3, 3), 6), ((3, 4), 7)]# [((4, 1), 5), ((4, 2), 6), ((4, 3), 7), ((4, 4), 8)]
|
allah
|
Here you can receive the blessing of Allah and get a chance to go to heaven 🤗
|
all-ai
|
No description available on PyPI.
|
allak
|
allak
A CLI tool to produce an IPv6 address with the host portion being a modified EUI-64.
Installation
Python version: >=3.11
Using pip
Inside a virtual environment
pip install allak
allak-cli --help
Without a virtual environment
pip install --user allak
allak-cli --help
Usage
Package functionality is provided by an entrypoint allak-cli.
allak-cli [OPTIONS] PREFIX MAC
Given IPv6 PREFIX of prefix-length 64 and MAC address returns IPv6 host
address with interface identifier as modified EUI-64.
Options:
  --version  Show the version and exit.
  --help     Show this message and exit.
Meta
Allak is a mountain in Sweden, a village in Altai, a train station in Korea.
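The usage text above names the algorithm: the interface identifier is the MAC's modified EUI-64 (RFC 4291: split the MAC in half, insert ff:fe, flip the universal/local bit). This is not allak's source, just a stdlib sketch of that construction; the function name is hypothetical.

```python
import ipaddress

def modified_eui64(prefix: str, mac: str) -> ipaddress.IPv6Address:
    """Combine a /64 prefix with a MAC-derived modified EUI-64 IID."""
    octets = [int(part, 16) for part in mac.replace("-", ":").split(":")]
    eui = octets[:3] + [0xFF, 0xFE] + octets[3:]  # insert ff:fe in the middle
    eui[0] ^= 0x02                                # flip the universal/local bit
    iid = int.from_bytes(bytes(eui), "big")
    net = ipaddress.IPv6Network(prefix, strict=False)
    return net.network_address + iid

print(modified_eui64("2001:db8::/64", "00:25:96:12:34:56"))
# → 2001:db8::225:96ff:fe12:3456
```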
|
allalgorithms
|
The All ▲lgorithms Python library
[`python.allalgorithms.com`](https://python.allalgorithms.com)
Why?
Why not 😂
Clean and focused
Actively maintained
Because All Algorithms should be easy to use in Python
Read the detailed documentation at python.allalgorithms.com or see the docs directory on Github. See Tree.
Install
pip install allalgorithms
Usage Example
from allalgorithms.searches import binary_search

arr = [-2, 1, 2, 7, 10, 77]
print(binary_search(arr, 7))  # -> 3
print(binary_search(arr, 3))  # -> None
Tree
Searches
Binary Search
Sorting
Merge Sort
Related
javascript-lib: All ▲lgorithms Javascript library
Maintainers
Carlos Abraham
License
MIT License © Carlos Abraham
|
all-algorithms
|
No description available on PyPI.
|
allan
|
Failed to fetch description. HTTP Status Code: 404
|
allanbot
|
No description available on PyPI.
|
allanc-sphinx
|
Custom Sphinx themes for the documentation ofmy projects.Themesyeen: Slight modification of theTriAx Corptheme.Defaults to blue colours and sidebar on the left.Field lists now use themainlightcoloroption for the background colour for headers, rather than a hardcoded blue colour.Optionally adds a Github “Octocat” banner to the repository (created byTim Holman).UsageTo use the theme, specifyallanc_sphinx[yeen]as a dependency, and in yourconf.pyfile:html_theme = 'yeen'If you want to add the Octocat banner, you just need to define the link you want to go to - for example:html_theme_options = {'github_url': 'https://github.com/the-allanc/allanc_sphinx'}
|
allan-tool
|
Sample p and t functions. You can use them like:
from allan_tool import *
p('helloworld')
p(t())
I still like you very much, like the wind gone thousands of miles, do not ask the return date.
|
allauth-jinja
|
allauth-jinjaThe complete set ofdjango-allauthtemplates rewritten in jinja.Installation & Usage[not published yet]Add allauth_jinja to your installed apps above allauth, e.g.INSTALLED_APPS = [
...
    "allauth_jinja",
    "allauth",
...
]In your templates settings for jinja you must have at least the following{
"APP_DIRS": True,
"OPTIONS": {
"match_extension": None,
"app_dirname": "jinja2",
"undefined": "jinja2.Undefined",
"context_processors": [
"django.template.context_processors.request",
"django.contrib.messages.context_processors.messages",
...
],
"globals": {
...,
"user_display": "allauth_jinja.account.templatetags.account.user_display",
...
}
}
}Depending on what templates you use you'll need to add the relevant global functions.
|
allauth-no-signup-tim
|
TIM login enable/disable adapter for allauth-djangoAdds allauth adapters to disable/enable login in django
|
allauth-socialaccount-provider-keycloak
|
allauth-socialaccount-provider-keycloakinstallpip install allauth-socialaccount-provider-keycloakconfigureINSTALLED_APPS=[...'django.contrib.sites',...#'allauth','allauth.account','allauth.socialaccount','allauth_socialaccount_provider_keycloak',...]AUTHENTICATION_BACKENDS=('django.contrib.auth.backends.ModelBackend','allauth.account.auth_backends.AuthenticationBackend',)SOCIALACCOUNT_PROVIDERS={'keycloak':{'KEYCLOAK_URL':"https://sso.grafcan.es/auth/realms/demo",}}
|
allauth-tim
|
TIM social login for django-allauthThis package adds TIM login support for django-allauth library.
|
allauth-watchdog-id
|
allauth-watchdog_idA django-allauth provider for Watchdog ID.Free software: MIT licenseDocumentation:https://allauth-watchdog-id.readthedocs.io.FeaturesProvides integration of Django withid.siecobywatelska.plthroughdjango-allauth.CreditsThis package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template.History0.1.0 (2016-12-03)First release on PyPI.
|
allay
|
No description available on PyPI.
|
all-badge
|
All Badge
|
allbaro-chunilpathfinder
|
Allbaro
A Python library for scraping the Allbaro system of the Korea Environment Corporation.
>>> import datetime
>>> from allbaro import Allbaro
>>>
>>> allbaro = Allbaro()
>>>
>>> # Log in to the Allbaro system
>>> allbaro.authenticate('Allbaro system ID', 'Allbaro system password')
>>> # Check handover-document progress: query handover-document progress information
>>> start_date = datetime.date(2023, 5, 1)
>>> end_date = datetime.date(2023, 5, 31)
>>>
>>> result_list = allbaro.handover_process_list(start_date, end_date)
|
allbluepy
|
No description available on PyPI.
|
allcasts
|
allcasts 📻 🗃
A Python package for downloading all available episodes from a podcast RSS feed. Useful for making private archives of your favourite podcasts.
Installation
pip install allcasts
Usage
Command Line: Interactive Mode
From your terminal run allcasts, which will kindly ask you for an RSS feed's URL and download all available episodes for that podcast.
$ allcasts
==================================================================
Welcome to the AllCasts App!
==================================================================
Please enter the URL of the podcast feed you want to download: https://atp.fm/rss
Please enter the directory you want to download the podcast to [leave blank for current dir]:
Downloading all podcasts from https://atp.fm/rss to /Users/lewis/Documents/Python-Projects/allcasts
Downloading https://traffic.libsyn.com/atpfm/atp464.mp3
[............................................................................................]
🎧 Downloaded 464: Monks at Drafting Tables
Downloading https://traffic.libsyn.com/atpfm/atp463.mp3
[............................................................................................]
🎧 Downloaded 463: No Indication of Progress
Downloading https://traffic.libsyn.com/atpfm/atp462.mp3
[............................................................................................]
🎧 Downloaded 462: Xcode X
Downloading https://traffic.libsyn.com/atpfm/atp461.mp3
[...........]
Command Line: Arguments
Allcasts supports a variety of command line arguments. To display the help message below, use allcasts -h
usage: allcasts.py [-h] [-d <DIRECTORY>] -f <URL> [-s <NUMBER>] [-e <NUMBER>] [-a] [-n <NUMBER>]
A friendly command line podcast downloader - supports downloading entire feeds, individual episodes, and a range of episodes
optional arguments:
-h, --help show this help message and exit
-d <DIRECTORY>, --directory <DIRECTORY>
the directory to save the podcast episodes
-f <URL>, --feed <URL>
the url of the podcast feed
-s <NUMBER>, --start <NUMBER>
the number of the first episode to download
-e <NUMBER>, --end <NUMBER>
the number of the last episode to download
-a, --all download all episodes
-n <NUMBER>, --number <NUMBER>
download a specific episode
Example Commands:
Download episodes 100 to 120
allcasts -f "https://atp.fm/rss" -s 100 -e 120
Download all episodes of a podcast
allcasts -f "https://atp.fm/rss" -a
Download episode 200
allcasts -f "https://atp.fm/rss" -n 200
As a Python module
allcasts is a Python module that can be imported and used in your own Python code too!
from allcasts import AllCasts
AllCasts.download_all('https://atp.fm/rss', '/Users/lewis/Documents/Python-Projects/allcasts')
Limitations
Private Patreon RSS feeds are not currently supported due to their strange DRM measures.
Todo
Add support for downloading multiple podcasts at once.
Add support for command line arguments.
Add itunes API support to search for podcasts and select the correct feed.
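Under the hood, a feed downloader like this walks the feed's item elements and collects each enclosure URL. The sketch below is not allcasts' actual code, just a stdlib illustration of that core step, run against a made-up fixture feed:

```python
import xml.etree.ElementTree as ET

def enclosure_urls(rss_xml):
    """Return the enclosure URL of every <item> in an RSS feed."""
    root = ET.fromstring(rss_xml)
    return [enc.get("url") for enc in root.iter("enclosure") if enc.get("url")]

feed = """<rss><channel>
  <item><title>Ep 1</title><enclosure url="https://example.com/ep1.mp3" type="audio/mpeg"/></item>
  <item><title>Ep 2</title><enclosure url="https://example.com/ep2.mp3" type="audio/mpeg"/></item>
</channel></rss>"""

print(enclosure_urls(feed))
# → ['https://example.com/ep1.mp3', 'https://example.com/ep2.mp3']
```

A real run would first fetch the feed XML over HTTP and then download each URL to disk.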
|
allcities
|
allcities
A Python library to work with all the cities of the world with a population of at least 1000 inhabitants.

Note
This library was whipped up in an afternoon when I got a little carried away when I really needed something much simpler. It is not fully tested and will need revisiting/cleanup.

Installation

pip install allcities

Usage example
Usage of this library is quite simple.

from allcities import cities

results = cities.filter(name='Los Angeles')
for result in results:
    print(result)

cities is a set-like object that contains objects that represent cities. The above code will output:

<Santa Rosa los Angeles, 11, MX>
<Los Angeles, 10, MX>
<Los Angeles, CA, US>
<Los Angeles, 25, MX>
<Lake Los Angeles, CA, US>
<East Los Angeles, CA, US>
<Los Angeles, 13, PH>

You can chain/combine filters as follows:

results = cities.filter(name='Los Angeles').filter(country_code='US')
results2 = cities.filter(name='Los Angeles', country_code='US')
print(results == results2)
for result in results:
    print(result)

gives you

True
<Los Angeles, CA, US>
<East Los Angeles, CA, US>
<Lake Los Angeles, CA, US>

You can also filter on numeric properties. The syntax to do so is a comparison operator <, <=, ==, !=, >=, > followed by a numeric value.

results = cities.filter(elevation='>1000')
results2 = cities.filter(elevation='>1000').filter(elevation='<1500')
print(results)
print(results2)

Gives you

<CitySet (1339)>
<CitySet (795)>

Each city object has properties that can be accessed normally or filtered on. You can also export a dictionary with the .dict property.

pprint.pprint(city_object.dict)

Here is the resulting dict:

{'admin1_code': 'CA',
 'admin2_code': '037',
 'alternatenames': ['East Los Angeles', 'Este de Los Angeles',
                    'Este de Los Ángeles', 'Ist Los Andzeles',
                    'Orienta Losangeleso', 'Orienta Losanĝeleso',
                    'dong luo shan ji', 'iseuteuloseuaenjelleseu',
                    'ista lasa enjelsa', 'isutorosanzerusu',
                    'Ист Лос Анџелес', 'इस्ट लस एन्जेल्स',
                    'イーストロサンゼルス', '东洛杉矶', '이스트로스앤젤레스'],
 'asciiname': 'East Los Angeles',
 'country_code': 'US',
 'dem': 63,
 'elevation': 61,
 'feature_class': 'P',
 'feature_code': 'PPL',
 'geonameid': 5344994,
 'latitude': 34.0239,
 'longitude': -118.17202,
 'modification_date': '2011-05-14',
 'name': 'East Los Angeles',
 'population': 126496,
 'timezone': 'America/Los_Angeles'}

License
This project is licensed under the MIT License - see the LICENSE.md file for details

Release History
1.0.0
Initial Release

Contributing
1. Fork it (https://github.com/Jonchun/allcities/fork)
2. Create your feature branch (git checkout -b feature/fooBar)
3. Commit your changes (git commit -am 'Add some fooBar')
4. Push to the branch (git push origin feature/fooBar)
5. Create a new Pull Request
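The comparison-operator filter syntax can be emulated in a few lines of standard Python; a sketch of how such a filter string might be interpreted (a hypothetical helper, not the library's actual implementation):

```python
import operator

# Map comparison tokens to operator functions. Two-character tokens come
# first so that '<=' is matched before '<'.
_OPS = [('<=', operator.le), ('>=', operator.ge), ('==', operator.eq),
        ('!=', operator.ne), ('<', operator.lt), ('>', operator.gt)]

def parse_numeric_filter(spec):
    """Turn a string like '>1000' into a predicate on numbers."""
    for token, fn in _OPS:
        if spec.startswith(token):
            threshold = float(spec[len(token):])
            return lambda value: fn(value, threshold)
    raise ValueError(f"no comparison operator in {spec!r}")

keep_high = parse_numeric_filter('>1000')
print(keep_high(1339))  # True
print(keep_high(795))   # False
```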
|
all-clip
|
all_clip
Load any clip model with a standardized interface

Install

pip install all_clip

Python examples

from all_clip import load_clip
import torch
from PIL import Image
import pathlib

model, preprocess, tokenizer = load_clip("open_clip:ViT-B-32/laion2b_s34b_b79k", device="cpu", use_jit=False)

image = preprocess(Image.open(str(pathlib.Path(__file__).parent.resolve()) + "/CLIP.png")).unsqueeze(0)
text = tokenizer(["a diagram", "a dog", "a cat"])

with torch.no_grad(), torch.cuda.amp.autocast():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    text_probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)

print("Label probs:", text_probs)  # prints: [[1., 0., 0.]]

Checkout these examples to call this as a lib: example.py

API
This module exposes a single function load_clip:

clip_model: CLIP model to load (default ViT-B/32). See below supported models section.
use_jit: uses jit for the clip model (default True)
warmup_batch_size: warmup batch size (default 1)
clip_cache_path: cache path for clip (default None)
device: device (default None)

Related projects
clip-retrieval to use clip for inference, and retrieval
open_clip to train clip models
CLIP_benchmark to evaluate clip models

Supported models

OpenAI
Specify the model as "ViT-B-32"

Openclip
"open_clip:ViT-B-32/laion2b_s34b_b79k" to use the open_clip

HF CLIP
"hf_clip:patrickjohncyh/fashion-clip" to use the hugging face

Deepsparse backend
DeepSparse is an inference runtime for fast sparse model inference on CPUs. There is a backend available within clip-retrieval by installing it with pip install deepsparse-nightly[clip], and specifying a clip_model with a prepended "nm:", such as "nm:neuralmagic/CLIP-ViT-B-32-256x256-DataComp-s34B-b86K-quant-ds" or "nm:mgoin/CLIP-ViT-B-32-laion2b_s34b_b79k-ds".

Japanese clip
japanese-clip provides some models for japanese.
For example one is ja_clip:rinna/japanese-clip-vit-b-16

For development
Either locally, or in gitpod (do export PIP_USER=false there)

Setup a virtualenv:

python3 -m venv .env
source .env/bin/activate
pip install -e .

to run tests:

pip install -r requirements-test.txt

then

make lint
make test

You can use make black to reformat the code.

python -m pytest -x -s -v tests -k "ja_clip" to run a specific test
|
allconnect
|
No description available on PyPI.
|
allcopol
|
AllCoPol
AllCoPol is a collection of tools for the analysis of polyploids that allows
to infer ancestral allele combinations as well as corresponding subgenome phylogenies.

Installation
AllCoPol is hosted at the Python Package Index (PyPI), so it can be easily
installed via pip:

python3 -m pip install allcopol

To run PhyloNet, Java 1.7 or later has to be installed.

Contained tools

allcopol
This is the main tool of the package implementing heuristic optimization of
ancestral allele combinations. It requires at least four arguments,
specifying the input files (-A,-G), the number of supplied gene trees per marker
(-S), and the path to a PhyloNet jar file (-P),
which can be obtained from https://bioinfocs.rice.edu/phylonet (newest tested version: 3.8.0).
Besides, the tabu tenure (-t) and the number of iterations (-i) are crucial
parameters, which have to be tuned for proper optimization.

The allele mapping file (-A) is a tab-delimited text file with one line
per accession and four columns: accession, taxon, allele IDs (comma separated),
and ploidy level.
The gene tree file (-G) consists of newick strings supplied as one tree per line.
For the trees, which are assumed to be rooted, topologies are sufficient while
edge lengths, support values, etc. will be ignored. Multiple gene trees per
marker can be supplied as consecutive lines in the tree file, e.g.<tree1 for marker1>
<tree2 for marker1>
<tree3 for marker1>
<tree1 for marker2>
<tree2 for marker2>
<tree3 for marker2>
...Because the program cannot know from the input file which trees belong to the
same marker, the -S option has to be set correctly (for the example above: -S 3).

To get a complete list of program options, type allcopol --help

Minimal example:

mapping.txt (input):
acc1 sp1 A_1,A_2 2
acc2 sp1 B_1,B_2 2
acc3 sp2 C_1,C_2 2
acc4 sp2 D_1,D_2 2
acc5 sp3 E_1,E_2 2
acc6 sp4 F_1,F_2 2
acc7 sp5 G_1,G_2,H_1,H_2 4

trees.nw (input):
(((E_2,(B_1,C_2)),(G_1,F_2)),((H_1,H_2),((A_2,A_1),((G_2,D_2),(F_1,((C_1,B_2),(D_1,E_1)))))));
((((F_2,(G_2,(G_1,F_1))),(A_1,A_2)),((((C_1,C_2),B_1),B_2),(((D_2,D_1),E_1),E_2))),(H_1,H_2));
((((G_2,((F_1,F_2),G_1)),A_1),((H_2,H_1),(D_1,E_1))),(((D_2,E_2),(C_2,(B_2,(C_1,B_1)))),A_2));
(((B_2,(C_2,B_1)),(H_2,H_1)),((A_2,A_1),(((G_2,G_1),(F_2,F_1)),((E_1,D_1),((D_2,E_2),C_1)))));
(((A_1,A_2),(H_2,H_1)),(((E_1,D_1),((F_1,F_2),(G_2,G_1))),(((E_2,D_2),(B_1,(C_2,B_2))),C_1)));

command:
allcopol -A mapping.txt -G trees.nw -S 1 -P PhyloNet_3.8.0.jar -t 5 -i 20

Setting the tabu tenure to zero and using reinitialization (-u), it is also
possible to perform random restart hillclimbing instead of tabu search.
While this avoids extensive parameter tuning, it usually requires a higher
number of iterations to obtain satisfactory solutions:

allcopol -A mapping.txt -G trees.nw -S 1 -P PhyloNet_3.8.0.jar -t 0 -u 1 -i 100

If runtime is limiting, the number of evaluated solutions per iteration can be
limited via the -s option. Note that this may be at the expense of a lower final
solution quality.

create_indfile
This script takes a number of allele mapping strings (one per line, obtained
by multiple runs ofallcopolbased on the same* input files) as input and
prints a matrix representation of the inferred allele partitions. The latter can
be used as input for Clumpp or align_clusters.

* The used gene trees may vary, but the underlying markers and their order in
the tree files must be identical.

Example:

mappings.txt (input):
sp2:C_1,C_2;sp3:D_1,D_2;sp1___01:B_1_m0,B_2_m0,B_1_m1,B_2_m1;sp1___02:A_1_m0,A_2_m0,A_1_m1,A_2_m1
sp2:C_1,C_2;sp3:D_1,D_2;sp1___01:A_1_m0,B_2_m0,B_1_m1,B_2_m1;sp1___02:A_2_m0,A_1_m1,A_2_m1,B_1_m0
sp2:C_1,C_2;sp3:D_1,D_2;sp1___01:A_1_m0,A_2_m0,A_1_m1,A_2_m1;sp1___02:B_1_m0,B_2_m0,B_1_m1,B_2_m1
sp2:C_1,C_2;sp3:D_1,D_2;sp1___01:A_1_m0,A_2_m0,A_1_m1,B_2_m1;sp1___02:B_1_m0,B_2_m0,B_1_m1,A_2_m1

command:
create_indfile mappings.txt sp1 > example.indfile

The second argument sp1 is the name of the polyploid taxon, whose
pseudo-diploid ancestors have been inferred.

example.indfile (output):
1 1 (x) 1 : 0 1
2 2 (x) 1 : 0 1
3 3 (x) 1 : 0 1
4 4 (x) 1 : 0 1
5 5 (x) 1 : 1 0
6 6 (x) 1 : 1 0
7 7 (x) 1 : 1 0
8 8 (x) 1 : 1 0
1 1 (x) 1 : 1 0
2 2 (x) 1 : 0 1
...

align_clusters
This tool can be used to match clusters (pseudo-diploids) among multiple
reconstructions. To avoid getting stuck in a local optimum, a tabu list is
applied, whose size can be specified via the -t option - unlike allcopol,
the heuristic used for this step seems to be relatively robust.
-n sets the total number of optimization iterations.

Applied to the matrix representation written above, the command

align_clusters -n 50 -t 2 example.indfile

creates the output file example.permutations:

2 1
2 1
1 2
1 2

and a second file containing the averaged cluster coefficients
(example.clustering).

relabel_trees
Using the output of align_clusters, the species trees obtained by multiple
runs of allcopol can be relabeled to mitigate label switching.

relabel_trees expects three arguments: a file containing the inferred species
trees, the name of the analyzed polyploid taxon and a permutation file as
written by align_clusters.

Example:

sp_trees.nw (input):
(sp3,(sp1___02,(sp1___01,sp2)));
(sp3,(sp1___02,(sp1___01,sp2)));
(sp3,(sp1___01,(sp1___02,sp2)));
(sp3,(sp1___01,(sp1___02,sp2)));

The command

relabel_trees sp_trees.nw sp1 example.permutations

yields

(sp3,(sp1_P1,(sp1_P2,sp2)));
(sp3,(sp1_P1,(sp1_P2,sp2)));
(sp3,(sp1_P1,(sp1_P2,sp2)));
(sp3,(sp1_P1,(sp1_P2,sp2)));

Now that the pseudo-diploids are labeled according to their homology,
conventional consensus methods can be applied to the trees.
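The allele mapping format described above (tab-delimited: accession, taxon, comma-separated allele IDs, ploidy level) is easy to read with the standard library; a minimal sketch of such a parser (illustrative only, not AllCoPol's actual reader):

```python
# Parse one line of the tab-delimited allele mapping format described above:
# accession, taxon, comma-separated allele IDs, ploidy level.
def parse_mapping_line(line):
    accession, taxon, alleles, ploidy = line.rstrip("\n").split("\t")
    return {
        "accession": accession,
        "taxon": taxon,
        "alleles": alleles.split(","),
        "ploidy": int(ploidy),
    }

# The tetraploid accession from the minimal example:
record = parse_mapping_line("acc7\tsp5\tG_1,G_2,H_1,H_2\t4")
print(record["alleles"])  # ['G_1', 'G_2', 'H_1', 'H_2']
```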
|
allcountries
|
All Countries
This is a simple package to help make country names and country codes available in your application.
|
allcountry
|
No description available on PyPI.
|
allcreate-login
|
Failed to fetch description. HTTP Status Code: 404
|
allcrypt
|
Allcrypt Encryption/Decryption
Allcrypt is a Python application for file and message encryption/decryption using Fernet symmetric key cryptography.

Features
Encrypt and decrypt files securely
Encrypt and decrypt messages
Generate and manage encryption keys on a USB drive

Installation
Install the package using pip:

pip install allcrypt

Run the Allcrypt GUI:

allcrypt

Usage

Encrypt a Message
1. Launch the Allcrypt GUI.
2. Enter the path to the USB drive in the provided field.
3. Enter the message or encrypted bytes in the text box.
4. Click the "Encrypt Message" button.
5. Follow any prompts to enter the password for key encryption.

Decrypt a Message
1. Launch the Allcrypt GUI.
2. Enter the path to the USB drive in the provided field.
3. Enter the encrypted message in the text box.
4. Click the "Decrypt Message" button.
5. Follow any prompts to enter the password for key decryption.

Encrypt a File
1. Launch the Allcrypt GUI.
2. Enter the path to the USB drive in the provided field.
3. Click the "Encrypt File" button.
4. Choose the source file to encrypt.
5. Choose the destination file for the encrypted output.
6. Optionally, check the "Compress Files" and "Shred Original File" checkboxes.
7. Click the "Encrypt File" button.

Decrypt a File
1. Launch the Allcrypt GUI.
2. Enter the path to the USB drive in the provided field.
3. Click the "Decrypt File" button.
4. Choose the source file to decrypt.
5. Choose the destination file for the decrypted output.
6. Optionally, check the "Shred Original File" checkbox.
7. Click the "Decrypt File" button.

Generate a New Key
1. Launch the Allcrypt GUI.
2. Enter the path to the USB drive in the provided field.
3. Click the "Generate New Key for this USB" button.
4. Enter a password for key encryption when prompted.
5. The new key will be generated and saved on the USB drive.

License
This project is licensed under the MIT License - see the LICENSE file for details.

About Me:
My name is Pranav
Visit my Github
To look at source code on github and raise issues: Allcrypt Source code
|
allcuisines
|
No description available on PyPI.
|
alldata
|
This is a package with which you can extract images, text, and tables, all from one package.

Free software: BSD 2-Clause License

Installation

pip install alldata

You can also install the in-development version with:

pip install git+ssh://git@https://github.com/shehrozkapoor/alldata.git/shehrozkapoor/python-alldata.git@master

Documentation
https://python-alldata.readthedocs.io/

Development
To run all the tests run:

tox

Note, to combine the coverage data from all the tox environments run:

Windows
set PYTEST_ADDOPTS=--cov-append
tox

Other
PYTEST_ADDOPTS=--cov-append tox

Changelog
0.0.0 (2020-12-13)
First release on PyPI.
|
alldatetime
|
alldatetimeIntroductionThis is a lightweight Python library for representing time and dates, with no restrictions on the year range. (Unlike the datetime library in CPython, which has a year range from 1 to 9999.) It also supports fuzzy dates, allowing for representation of uncertain times (i.e., you can specify only the year, month, or day). For example, it can represent dates like 1912, or March 1912, etc. Some parts of the code are referenced from CPython's datetime implementation.Please NOTE that this library currently does not support time zones currently.InstallationTo install this library, run the following command:pip install alldatetimeTypes Overviewalldatetime.alldatetime.alldate: A class used to represent dates.alldatetime.alldatetime.alltime: A class used to represent time.alldatetime.alldatetime.alldatetime: A class used to represent date and time.alldatetime.alldatetime.alldateperiod: Used to represent a time interval by specifying a start time and an end time.alldatetime.fuzzydatetime.fuzzydate: Used to represent a fuzzy date, such as the year 1950, or June 1950, etc.alldatetime.fuzzydatetime.fuzzydateperiod: Used to represent a fuzzy date range, such as from 1950 to 1980, or from June 1950 to September 1950, etc.alldatealldateis used to represent dates by specifying year, month and day.Methods and Constructor__init__(self, year: int, month: int, day: int)Constructor of classalldate.year: The year of the date. No limitation. Use negative numbers to represent years before the Common Era (BC).month: The month of the date. Ranging from 1 to 12. A ValueError will be raised if month is out of range.day: The day of the date. The range starts from 1 and goes up to the number of days in the specified month. 
A ValueError will be raised if month is out of range.classmethodfromtimestamp(cls, timestamp: int)Return an instance ofalldatecorresponding to the POSIX timestamp.timestamp: POSIX timestamp.Returns: An instance ofalldatecorresponding to the POSIX timestamp.Example usage:fromalldatetime.alldatetimeimportalldatead=alldate.fromtimestamp(-111553804800)print(ad)# -1566-01-01fromordinal(cls, n: int)Return an instance ofalldatefrom a modified version of proleptic Gregorian ordinal. Ordinal number 0 represents January 1, 1 AD not 1.n: Ordinal number. Ordinal number 0 represents January 1, 1 AD.Returns: An instance ofalldatecorresponding to the ordinal number.Example usage:fromalldatetime.alldatetimeimportalldatedate=alldate.fromordinal(0)print(date)# 0001-01-01date=alldate.fromordinal(-1)print(date)# -0001-12-31toordinal(self)Returns: A modified version of proleptic Gregorian ordinal. Ordinal number 0 represents January 1, 1 AD not 1.Example usage:fromalldatetime.alldatetimeimportalldatedate=alldate(1,1,1)ordinal=date.toordinal()print(ordinal)# 0date=alldate(-1,12,31)ordinal=date.toordinal()print(ordinal)# -1weekday(self) -> intReturn day of the week, where Monday == 0 ... Sunday == 6.Returns: Return day of the week, where Monday == 0 ... Sunday == 6.Propertiesyear: The year of the date.month: The month of the date.day: The day of the date.timestampThe POSIX timestamp of the beginning of the date.alltimealltimeis used to represent a time by specifying hour, minute, second and millisecond.Methods and Constructor__init__(self, hour, minute, second, microsecond=0)Constructor of classalltime.hour: The hour of the time raning from 0 to 23. A ValueError will be raised if month is out of range.minute: The minute of the time ranging from 0 to 59. A ValueError will be raised if month is out of range.second: The second of the time ranging from 0 to 59. A ValueError will be raised if month is out of range.microsecond: The microsecond of the time raning from 0 to 999999. 
A ValueError will be raised if month is out of range.Propertieshour: The hour of the time.minute: The minute of the time.second: The second of the time.microsecond: The microsecond of the time.alldatetimealldatetimeis used to represent a date time.Methods and Constructor__init__(self, year: int, month: int, day: int, hour: int=0, minute: int=0, second: int=0, microsecond: int=0)Constructor of classalldatetime.year: The year of the date. No limitation. Use negative numbers to represent years before the Common Era (BC).month: The month of the date. Ranging from 1 to 12. A ValueError will be raised if month is out of range.day: The day of the date. The range starts from 1 and goes up to the number of days in the specified month. A ValueError will be raised if month is out of range.hour: The hour of the time raning from 0 to 23. A ValueError will be raised if month is out of range.minute: The minute of the time ranging from 0 to 59. A ValueError will be raised if month is out of range.second: The second of the time ranging from 0 to 59. A ValueError will be raised if month is out of range.microsecond: The microsecond of the time raning from 0 to 999999. 
A ValueError will be raised if month is out of range.classmethodfromtimestamp(cls, timestamp: int)Return an instance ofalldatetimecorresponding to the POSIX timestamp.timestamp: POSIX timestamp.Returns: An instance ofalldatetimecorresponding to the POSIX timestamp.Example usage:fromalldatetime.alldatetimeimportalldatetimedt=alldatetime.fromtimestamp(-111553854830)print(dt)# -1567-12-31 10:06:10dt=alldatetime.fromtimestamp(0)print(dt)# 1970-01-01 00:00:00date(self) -> alldateReturn analldateinstance representing the date part of the date time.Returns: Analldateinstance representing the date part of the date time.time(self) -> alltimeReturn analltimeinstance representing the time part of the date time.Returns: Analltimeinstance representing the time part of the date time.classmethodstrptime(cls, date_string: str, format: str)Class methodstrptimecreates an alldatetime object from a string representing a date and time and a corresponding format string.date_string: A string representing a date and time.format: Corresponding format string.Returns: An instance ofalldatetimeparsed from the date and time string based on the format string.Example usage:fromalldatetime.alldatetimeimportalldatetimealldatetime.strptime("5000-01-08 08:30:15 BC","%Y-%m-%d%H:%M:%S")# alldatetime(-5000, 1, 8, 8, 30, 15)alldatetime.strptime("5000/01/08 08:30:15 BC","%Y/%m/%d%H:%M:%S")# alldatetime(-5000, 1, 8, 8, 30, 15)alldatetime.strptime("2000-01-08 08:30:15 AD","%Y-%m-%d%H:%M:%S")# alldatetime(2000, 1, 8, 8, 30, 15)alldatetime.strptime("2000-01-08 08:30:15","%Y-%m-%d%H:%M:%S")# alldatetime(2000, 1, 8, 8, 30, 15)weekday(self) -> intReturn day of the week, where Monday == 0 ... Sunday == 6.Returns: Return day of the week, where Monday == 0 ... 
Sunday == 6.Propertiesyear: The year of the date.month: The month of the date.day: The day of the date.hour: The hour of the time.minute: The minute of the time.second: The second of the time.microsecond: The microsecond of the time.timestampThe POSIX timestamp of the date time.alldateperiodalldateperiodis used to represent a date period, consisting of a start date and an end date, forming an open-closed interval.Methods and Constructor__init__(self, start_date: alldate, end_date: alldate)Constructor of classalldateperiod.start_date: The start date of the date period.end_date: The end date of the date period.overlap_with(self, other)Check whether the date period overlaps with another date period.other: An instance ofalldateperiod.Returns: Whether the date period overlaps with the date period passed.Example usage:fromalldatetime.alldatetimeimportalldate,alldateperiodperiod=alldateperiod(alldate(2023,12,1),alldate(2023,12,5))other_period=alldateperiod(alldate(2023,10,10),alldate(2023,12,3))overlap=period.overlap_with(other_period)# Truecover(self, date: alldate) -> boolCheck whether the date period covers a date.date: An instance ofalldate.Returns: Whether the date period covers the date passed.Example usage:fromalldatetime.alldatetimeimportalldate,alldateperiodperiod=alldateperiod(alldate(2023,12,1),alldate(2023,12,5))date=alldate(2023,12,4)period.cover(date)# Truedate=alldate(2023,12,5)period.cover(date)# FalsePropertiesstart_date: The start date of the date period.end_date: The end date of the date period.PrecisionPrecisionis used to indicate how precise afuzzydateis. It consists of two parts: num and unit.Methods and Constructor__init__(self, num: int, unit: PrecisionUnit)num: Required. A number representing precision.unit: Required. 
Enum PrecisionUnit: Year, Month, Day.Propertiesnum: A number representing precision.unit: Enum PrecisionUnit: Year, Month, Dayfuzzydatefuzzydaterepresents an imprecise time, unlikealldatewhich represents a specific date.alldaterequires specifying the exact year, month, and day, whereasfuzzydateonly needs the year specified; the month and day can be left unspecified.Methods and Constructor__init__(self, year: int = None, month: int = None, day: int = None, precision: Precision = None, forward_precision: Precision = None, backward_precision: Precision = None)Constructor of classfuzzydate.year: Required. The year of the fuzzy date. Use negative numbers to represent years before the Common Era (BC).month: Optional. The month of the fuzzy date.day: Optional. The day of the fuzzy date.precision: Optional. The precision of the fuzzy date for both forward and backward.forward_precision: Optional. The forward precision of the fuzzy date. If not present, precision will be used.backward_precision: Optional. The forward precision of the fuzzy date. 
If not present, precision will be used.Note that, if no precision was passed,fuzzydatewill infer the precision from the arguments of year, month and day.to_alldateperiod(self) -> alldateperiodConvert thefuzzydatetoalldateperiod.Returns: An instance ofalldateperiodrepresenting the range of thefuzzydate.Example usage:fromalldatetime.fuzzydatetimeimportfuzzydatefdate=fuzzydate(1987)fdate.to_alldateperiod()# 1987-01-01 -> 1988-01-01fdate=fuzzydate(1987,1)fdate.to_alldateperiod()# 1987-01-01 -> 1987-01-31fdate=fuzzydate(1987,1,1)fdate.to_alldateperiod()# 1987-01-01 -> 1987-01-02to_alldateperiod_timestamps(self) -> tuple[float, float]Convert thefuzzydateto a tuple of two timestamps.Returns: A tuple of two floats, the first float is the timestamp of start date, and the second float is the timestamp of end date.Example usage:fromalldatetime.fuzzydatetimeimportfuzzydatefdate=fuzzydate(1987)fdate.to_alldateperiod_timestamps()# 536457600.0 -> 567993600.0fdate=fuzzydate(1987,1)fdate.to_alldateperiod_timestamps()# 536457600.0 -> 539049600.0fdate=fuzzydate(1987,1,1)fdate.to_alldateperiod_timestamps()# 536457600.0 -> 536544000.0overlap_with(self, other) -> boolCheck whether thefuzzydateoverlaps with anotherfuzzydate.other: An instance offuzzydate.Returns: Whether thefuzzydateoverlaps with thefuzzydatepassed.Example usage:fromalldatetime.fuzzydatetimeimportfuzzydatefdate1,fdate2=fuzzydate(1987),fuzzydate(1988)fdate1.overlap_with(fdate2)# Falsefdate1,fdate2=fuzzydate(1987),fuzzydate(1987,1)fdate1.overlap_with(fdate2)# Truefdate1,fdate2=fuzzydate(1987),fuzzydate(1987,12,31)fdate1.overlap_with(fdate2)# TruePropertiesyear: The year of the fuzzy date.month: The month of the fuzzy date.day: The day of the fuzzy date.anchor: The anchor date of the fuzzy date. It is an instance ofalldateinitiated with the year, month and day. 
If month or day is absent, 1 is used.forward_precision: The forward precision.backward_precision: The backward precision.fuzzydateperiodfuzzydateperiodis used to represent a date period, consisting of a fuzzy start date and an fuzzy end date, forming an open-closed interval.Methods and Constructor__init__(self, start_date: fuzzydate, end_date: fuzzydate)Constructor of classfuzzydateperiod.start_date: The fuzzy start date of the date period.end_date: The fuzzy end date of the date period.cover(self, date: fuzzydate) -> boolCheck whether the date period covers a fuzzy date.date: An instance offuzzydate.Returns: Whether the date period covers the fuzzy date passed.Example usage:fromalldatetime.fuzzydatetimeimportfuzzydate,fuzzydateperiodperiod=fuzzydateperiod(fuzzydate(202,1),fuzzydate(202,5))period.cover(fuzzydate(202,3))# Trueperiod.cover(fuzzydate(202,1))# Trueperiod.cover(fuzzydate(202,6))# False
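Within CPython's supported year range, the "modified" ordinal described above is simply the standard library ordinal shifted by one (datetime.date.toordinal() returns 1 for 0001-01-01, while alldate.toordinal() returns 0). A quick stdlib check of that offset, without needing alldatetime installed:

```python
from datetime import date

# CPython's proleptic Gregorian ordinal starts at 1 for 0001-01-01, so
# alldate's modified ordinal is this value minus one (for in-range years).
def modified_ordinal(d):
    return d.toordinal() - 1

print(modified_ordinal(date(1, 1, 1)))  # 0
```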
|
alldebrid.py
|
No description available on PyPI.
|
alldevutils
|
No description available on PyPI.
|
all-distributions
|
No description available on PyPI.
|
alldists
|
No description available on PyPI.
|
all-downloader
|
No description available on PyPI.
|
allegedb
|
No description available on PyPI.
|
allegro5
|
ALLEGRO 5 OFFICIAL DISTRIBUTION
Allegro is a cross-platform library mainly aimed at video game and multimedia programming. It handles common, low-level tasks such as creating windows, accepting user input, loading data, drawing images, playing sounds, etc. and generally abstracting away the underlying platform...

This is the official allegro 5 wrapper in python. It was built from the official github repo using cmake.

Tested on windows 10 with a 32-bit python interpreter. (It is preferable to use a 32-bit interpreter because 64-bit hasn't been tested yet.)

Demo script

import os
os.environ['ALLEGRO5_DLL'] = r'path\to\allegro5\dlls'
os.environ['ALLEGRO5_VERSION'] = '5.0.10'  # this is just an example

from allegro5 import *
al_run_demo()

The demo game is hosted here

Documentation
You can find official documentation here
|
allegroai
|
Failed to fetch description. HTTP Status Code: 404
|
allegroai-api
|
Failed to fetch description. HTTP Status Code: 404
|
allegroai-config
|
Failed to fetch description. HTTP Status Code: 404
|
allegro-pl-rest-api
|
https://developer.allegro.pl/about
|
allegrordf
|
UNKNOWN
|
allein_zu_haus
|
This pacakge implements aNeedleman–Wunschglobal alignment withaffine gap penalties, taking into account the confidence in each base pair of the read sequence.The classic Needleman-Wunsch algorithm finds the optimal global match assuming all read nucleotides have been identified with certainty. In practice,NGS (next generation sequencing)identifies nucleotides with varying levels of qualities, and popular formats, e.g.,FASTQare expressely built for describing these qualities. This package modifies the algorithm to take qualities into account (seemathematical_rationalebelow).This package is designed to be used in conjunction with a read aligner such asBowtie2orGEM. Ideally, these tools would be configured to take basepair quality into account when searching for hits. Since this is currently not the case, one can use the allein_zu_haus package in order to realign reads to reference sequences at the positions reported by the read aligner, and extract more accurate alignment scores andCIGARstrings.The package handles ambiguous base codes by weighing the different options according to the base priors provided. Read sequences must be drawn from the {A, C, G, T, N} alphabet, but reference sequences may contain any basepairs defined by the IUPAC nucleotide ambiguity code. 
Quality values are ignored for ambiguous read basepairs.Usage example:Minimal Exampleimportallein_zu_hausimportnumpyasnpmax_read_len=100# Max length of read to be alignedmax_ref_len=2*max_read_len# Max length of reference subsequence to be globally aligned (where start position is determined by read aligner output)mismatch_penalty=4# Penalty for mismatched nucleotidesgap_open=6# Gep opening penaltygep_extend=3# Gap extension penaltyaligner=allein_zu_haus.Aligner(max_read_len,max_ref_len,mismatch_penalty,gap_open,gap_extend)read=np.array(['A','C','G','T','A'],dtype=bytes)# Read sequence, given as numpy array, dtype=bytesref=np.array(['A','G','G','T','A'],dtype=bytes)# Reference subsequence to be aligned, given as numpy array, dtype=bytesread_bp_probs=np.array([0.9,0.99,0.8,0.99,0.99])# Confidence in each read basepair. See below on how to extract such values from read quality stringsbase_probs=np.array([0.25]*4)# Basepair prior probabilities. Here, assuming uniform distribution on nucleotides.# See below example for more biologically relevant priorsmax_gaps=2# Maximal number of gaps allowed. Use small values to improve run timescore,cigar=aligner.match(read,read_bp_probs,base_probs,ref,max_gaps)#Aligner returns alignment score and CIGAR stringSetting Up Priors & QualitiesOne way of extracting more meaningful basepair priors is by using the nucleotide frequencies observed in reference sequences. Assuming _ref_seq is a string variable holding the reference genome of interest this can be done by:defget_priors():return[(k,v/float(len(_ref_seq)))for(k,v)incollections.Counter(_ref_seq).items()]Naturally, more intricate priors can easily be designed.Basepair probabilities cab be extracted from quality strings reported in FASTQ files / read aligner output using this formula:\begin{equation*}
1 - 10^\frac{-q}{10}
\end{equation*}

where q is the ascii value of the quality character - 33 (assuming qualities are given in Phred+33)

def get_read_bp_probs(read_quality_string):
    # read_quality_string: FASTQ quality string as reported in FASTQ file or read aligner output
    return 1 - np.power(10, -np.array([ord(e) - 33 for e in read_quality_string]) / 10.)

As you can see, the returned array holds per basepair probabilities (1 - P_error). For example, for quality string '??CII' the output will be [0.999, 0.999, 0.9996, 0.9999, 0.9999]

Aligning multiple reads
This package is optimized for multiple alignments, e.g., when iterating over the results of GEM or a FASTQ file, and checking the score of each result relative to some corresponding reference subsequence. For this reason, to use it, first create an object with the parameters relevant to all matches.
Since reads usually have similar lengths, and since read aligners provide the position in the reference sequence to which the read was aligned, both length of read and reference sequence can be easily bound.importallein_zu_hausimportnumpyasnpmax_read_len=100max_ref_len=2*max_read_lenmismatch_penalty=4gap_open=6gep_extend=3aligner=allein_zu_haus.Aligner(max_read_len,max_ref_len,mismatch_penalty,gap_open,gap_extend)Then use the aligner repeatedly for each match, providing only the match specific parameters:forread,read_quality_string,ref,max_gapsin...:# read, ref should be of type np.array with dtype=bytesread_bp_probs=get_read_bp_probs(read_quality_string)# Function for exatracting basepair confidence from FASTQ quality strings. See example above.base_probs=get_priors()# Function for computing priors from reference and/or read sequences. See example abovescore,cigar=aligner.match(read,read_bp_probs,base_probs,ref,max_gaps)# multiple calls to aligner, with relevant sequences and priorsMathematical RationaleSuppose we wish to find the optimal match between a readDand a referenceF. Unfortunately, we cannot observeDdirectly, and instead only seeD’, which is the sequence outputted by some imperfectsequencing process. Say that at some point in the alignment algorithm we consider whether a nucleotide fromD’matches a nucleotide fromF. Define:bD’is the nucleotide reported by the sequencing process.bDis the true (unknown) uncleotide.bRis the reference nucleotide.The classic algorithm would assign the penalty\begin{equation*}
\mbox{penalty}(b_{D'}, b_R)
\end{equation*}whereas the correct penalty should be\begin{equation*}
\mbox{penalty}(b_{D}, b_R) \simeq
\sum_b \left[ P\left( B_D = b | B_{D'} = b_{D'} \right) \cdot \mbox{penalty}(b, b_R) \right]
\end{equation*}ByBayes’ Theorem,\begin{equation*}
P\left( B_D = b | B_{D'} = b_{D'} \right)
=
\frac
{
P\left( B_{D'} = b_{D'} | B_D = b \right)
\cdot
P\left( B_D = b \right)
}
{
\sum_{k = 'A', 'C', 'G', 'T'}
P\left( B_{D'} = b_{D'} | B_D = k \right)
\cdot
P\left( B_D = k \right)
}
\end{equation*}For evaluating these terms, note that\begin{equation*}
P\left( B_D = b \right)
\end{equation*}is the prior over the nucleotides (which must be given by the user), and\begin{equation*}
P\left( B_{D'} = b_{D'} | B_D = b \right)
=
\begin{cases}
1 - P_{\mbox{err}} ,& \text{if } b_{D'} = b, \\
\frac{P_{\mbox{err}}}{3}, & \text{otherwise}
\end{cases}
\end{equation*}

where Perr is the probability for error determined by the reported quality for this nucleotide.

Issues
Feel free to open tickets at https://bitbucket.org/taliraveh/allein_zu_haus/issues.
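The posterior above can be computed directly; a minimal sketch in plain Python, assuming a uniform prior over {A, C, G, T} and an error probability derived from a Phred+33 quality character as in the formulas earlier:

```python
def phred33_error(c):
    """P_err for a Phred+33 quality character: 10^(-(ord(c) - 33)/10)."""
    return 10 ** (-(ord(c) - 33) / 10)

def posterior(observed, p_err, prior):
    """P(B_D = b | B_D' = observed) for each base b, via Bayes' theorem.
    The likelihood is 1 - P_err if b equals the observed base, else P_err/3."""
    bases = "ACGT"
    likelihood = {b: (1 - p_err) if b == observed else p_err / 3 for b in bases}
    norm = sum(likelihood[b] * prior[b] for b in bases)
    return {b: likelihood[b] * prior[b] / norm for b in bases}

uniform = {b: 0.25 for b in "ACGT"}  # flat prior over nucleotides
post = posterior("A", phred33_error("?"), uniform)
print(round(post["A"], 3))  # 0.999
```

With a uniform prior, the posterior for the observed base reduces to 1 - P_err, which makes a handy sanity check; a non-uniform prior (e.g. reference nucleotide frequencies, as suggested above) shifts the mass accordingly.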
|
allelicimbalance
|
allelicimbalance
Allelic imbalance utilities
|
allEmoji
|
allEmoji
Simply use emojis as class attributes in Python.

This is just one of my hobby projects; nothing much useful for others.
So, if you're using VS Code or any other editor that gives you attribute suggestions, it'll come in handy. Otherwise, use the emoji library, which is more advanced.

pip install allEmoji
|
allen
|
UNKNOWN
|
allen1989
|
allen1989
Formalisms from the Allen et al. 1989 paper

History
creation (2021-10-25)
First release on PyPI.
|
allenact
|
AllenAct is a modular and flexible learning framework designed with a focus on the unique requirements of Embodied-AI research.
|