package (string, lengths 1 to 122)
| package-description (string, lengths 0 to 1.3M)
|
---|---|
antchain-mytc
|
English | 简体中文. Ant Chain MYTC SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-mytc. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-nftc
|
English | 简体中文. Ant Chain NFTC SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-nftc. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-nftx
|
English | 简体中文. Ant Chain NFTX SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-nftx. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-notification
|
English | 简体中文. Ant Chain NOTIFICATION SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain_sdk_notification. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-op
|
English | 简体中文. Ant Chain OP SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain_sdk_op. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-opaiot
|
English | 简体中文. Ant Chain OPAIOT SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-opaiot. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-opinternational
|
English | 简体中文. Ant Chain OPINTERNATIONAL SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-opinternational. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-osp
|
English | 简体中文. Ant Chain OSP SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-osp. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-partner
|
English | 简体中文. Ant Chain PARTNER SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-partner. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-pcc
|
English | 简体中文. Ant Chain PCC SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain_sdk_pcc. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-propertychain
|
English | 简体中文. Ant Chain PROPERTYCHAIN SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain_sdk_propertychain. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-rcsmart
|
English | 简体中文. Ant Chain RCSMART SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-rcsmart. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-realperson
|
English | 简体中文. Ant Chain REALPERSON SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-realperson. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-risk
|
English | [简体中文](README-CN.md). Ant Chain risk SDK for Python. Installation (install with pip; the Python SDK uses the common package management tool pip, see the [pip user guide](https://pip.pypa.io/en/stable/installing/ "pip User Guide") if pip is not installed): pip install antchain_sdk_risk-test. Issues: [Opening an Issue](https://github.com/alipay/antchain-openapi-prod-sdk/issues/new); issues not conforming to the guidelines may be closed immediately. Changelog: detailed changes for each release are documented in the [release notes](./ChangeLog.md). References: [Latest Release](https://github.com/alipay/antchain-openapi-prod-sdk/tree/master/python). License: [Apache-2.0](http://www.apache.org/licenses/LICENSE-2.0). Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-risknet
|
English | 简体中文. Ant Chain RISKNET SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-risknet. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-riskplus
|
English | 简体中文. Ant Chain RISKPLUS SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-riskplus. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-rms
|
English | 简体中文. Ant Chain RMS SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain_sdk_rms. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-saas
|
English | 简体中文. Ant Chain SAAS SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-saas. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-saastest16
|
English | 简体中文. Ant Chain SaasTest16 SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-saastest16. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-saastest17
|
English | 简体中文. Ant Chain SaasTest17 SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-saastest17. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-saastest6
|
English | 简体中文. Ant Chain SaasTest6 SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-saastest6. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-securitytech
|
English | 简体中文. Ant Chain SECURITYTECH SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-securitytech. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-shuziwuliu
|
English | 简体中文. Ant Chain SHUZIWULIU SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-shuziwuliu. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-smartaccount
|
English | 简体中文. Ant Chain SMARTACCOUNT SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-smartaccount. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-sp
|
English | 简体中文. Ant Chain SP SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain_sdk_sp. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-stlr
|
English | 简体中文. Ant Chain STLR SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-stlr. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-tam
|
English | 简体中文. Ant Chain TAM SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-tam. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-tax
|
English | 简体中文. Ant Chain TAX SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-tax. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-tdm
|
English | 简体中文. Ant Chain TDM SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain_sdk_tdm. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-testdjbouttr
|
English | 简体中文. Ant Chain TESTDJBOUTTR SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain_sdk_testdjbouttr. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-tftus
|
English | 简体中文. Ant Chain TFTUS SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain_sdk_tftus. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-trade
|
English | 简体中文. Ant Chain TRADE SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-trade. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-trdemo
|
English | 简体中文. Ant Chain TRDEMO SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain_sdk_trdemo. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-twc
|
English | 简体中文. Ant Chain TWC SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-twc. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-twoe
|
English | 简体中文. Ant Chain TWOE SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain_sdk_twoe. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-windward
|
English | 简体中文. Ant Chain WINDWARD SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-windward. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-yunqing
|
English | 简体中文. Ant Chain YUNQING SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-yunqing. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-yuqing
|
English | 简体中文. Ant Chain YUQING SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-yuqing. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-zjlm
|
English | 简体中文. Ant Chain ZJLM SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain_sdk_zjlm. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
antchain-zolozfaceverify
|
English | 简体中文. Ant Chain ZOLOZFACEVERIFY SDK for Python. Requirements: Python >= 3.6. Installation (install with pip; the Python SDK uses the common package management tool pip, see the pip user guide if pip is not installed): pip install antchain-zolozfaceverify. Issues: open an issue; issues not conforming to the guidelines may be closed immediately. Usage: quick examples. Changelog: detailed changes for each release are documented in the release notes. References: latest release. License: Apache-2.0. Copyright (c) 2009-present, Alibaba Cloud. All rights reserved.
|
ant-cli
|
No description available on PyPI.
|
antco
|
No description available on PyPI.
|
ant-colony
|
Copyright (c) 2017 Jerzy Pawlikowski. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
Description: Ant Colony System (build status: https://travis-ci.org/jurekpawlikowski/ant-colony). Python implementation of the Ant Colony System.
Development: install the requirements (pip install -r requirements.txt) and run the tests (pytest --pep8).
Usage: from ant_colony.graph import Node, Graph; nodes = [Node(0, 0), Node(0, 1), Node(1, 1), Node(1, 0)]; graph = Graph(nodes); path, distance = graph.find_shortest_path()
Platform: UNKNOWN
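The usage snippet above can be read as a tiny end-to-end script; here is a minimal sketch assuming the ant_colony.graph module exposes Node and Graph exactly as quoted in the description:

```python
# Minimal sketch based on the usage quoted above; assumes `ant-colony` is installed
# and that Graph.find_shortest_path() returns a (path, distance) pair as described.
from ant_colony.graph import Node, Graph

# Four nodes forming a unit square.
nodes = [Node(0, 0), Node(0, 1), Node(1, 1), Node(1, 0)]
graph = Graph(nodes)

path, distance = graph.find_shortest_path()
print(path, distance)
```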
|
ant-datrie
|
datrie: super-fast, efficiently stored Trie for Python (2.x and 3.x).
Uses libdatrie. Installation: pip install datrie. Usage: create a new trie capable of storing items with lower-case ascii keys:>>> import string
>>> import datrie
>>> trie = datrie.Trie(string.ascii_lowercase)
The trie variable is a dict-like object that can have unicode keys of
certain ranges and Python objects as values. In addition to implementing the mapping interface, tries facilitate
finding the items for a given prefix, and vice versa, finding the
items whose keys are prefixes of a given string. As a common special
case, finding the longest-prefix item is also supported.
Warning: for efficiency you must define allowed character range(s) while
creating the trie. datrie doesn’t check if keys are in allowed
ranges at runtime, so be careful! Invalid keys are OK at lookup time
but values won’t be stored correctly for such keys.
Add some values to it (datrie keys must be unicode; the examples
are for Python 2.x):>>> trie[u'foo'] = 5
>>> trie[u'foobar'] = 10
>>> trie[u'bar'] = 'bar value'
>>> trie.setdefault(u'foobar', 15)
10
Check if u’foo’ is in the trie:>>> u'foo' in trie
True
Get a value:>>> trie[u'foo']
5
Find all prefixes of a word:>>> trie.prefixes(u'foobarbaz')
[u'foo', u'foobar']
>>> trie.prefix_items(u'foobarbaz')
[(u'foo', 5), (u'foobar', 10)]
>>> trie.iter_prefixes(u'foobarbaz')
<generator object ...>
>>> trie.iter_prefix_items(u'foobarbaz')
<generator object ...>Find the longest prefix of a word:>>> trie.longest_prefix(u'foo')
u'foo'
>>> trie.longest_prefix(u'foobarbaz')
u'foobar'
>>> trie.longest_prefix(u'gaz')
KeyError: u'gaz'
>>> trie.longest_prefix(u'gaz', default=u'vasia')
u'vasia'
>>> trie.longest_prefix_item(u'foobarbaz')
(u'foobar', 10)Check if the trie has keys with a given prefix:>>> trie.has_keys_with_prefix(u'fo')
True
>>> trie.has_keys_with_prefix(u'FO')
FalseGet all items with a given prefix from a trie:>>> trie.keys(u'fo')
[u'foo', u'foobar']
>>> trie.items(u'ba')
[(u'bar', 'bar value')]
>>> trie.values(u'foob')
[10]Get all suffixes of certain word starting with a given prefix from a trie:>>> trie.suffixes()
[u'pro', u'producer', u'producers', u'product', u'production', u'productivity', u'prof']
>>> trie.suffixes(u'prod')
[u'ucer', u'ucers', u'uct', u'uction', u'uctivity']Save & load a trie (values must be picklable):>>> trie.save('my.trie')
>>> trie2 = datrie.Trie.load('my.trie')
Trie and BaseTrie: there are two Trie classes in the datrie package: datrie.Trie and datrie.BaseTrie. datrie.BaseTrie is slightly faster and uses less
memory but it can store only integer numbers -2147483648 <= x <= 2147483647. datrie.Trie is a bit slower but can store any Python object as a value. If you don’t need values, or integer values are OK, then use datrie.BaseTrie:
import datrie
import string
trie = datrie.BaseTrie(string.ascii_lowercase)
Custom iteration: if the built-in trie methods don’t fit, you can use datrie.State and datrie.Iterator to implement custom traversal. Note: if you use datrie.BaseTrie you need datrie.BaseState and datrie.BaseIterator for custom traversal. For example, let’s find all suffixes of 'fo' for our trie and get
the values:>>> state = datrie.State(trie)
>>> state.walk(u'foo')
>>> it = datrie.Iterator(state)
>>> while it.next():
... print(it.key())
... print(it.data())
o
5
obar
10
Performance: performance is measured for datrie.Trie against Python’s dict with
100k unique unicode words (English and Russian) as keys and ‘1’ numbers
as values. datrie.Trie uses about 5M memory for 100k words; Python’s dict
uses about 22M for this according to my unscientific tests. This trie implementation is 2-6 times slower than python’s dict
on __getitem__. Benchmark results (macbook air i5 1.8GHz,
“1.000M ops/sec” == “1 000 000 operations per second”):
Python 2.6:
dict __getitem__: 7.107M ops/sec
trie __getitem__: 2.478M ops/sec
Python 2.7:
dict __getitem__: 6.550M ops/sec
trie __getitem__: 2.474M ops/sec
Python 3.2:
dict __getitem__: 8.185M ops/sec
trie __getitem__: 2.684M ops/sec
Python 3.3:
dict __getitem__: 7.050M ops/sec
trie __getitem__: 2.755M ops/secLooking for prefixes of a given word is almost as fast as__getitem__(results are for Python 3.3):trie.iter_prefix_items (hits): 0.461M ops/sec
trie.prefix_items (hits): 0.743M ops/sec
trie.prefix_items loop (hits): 0.629M ops/sec
trie.iter_prefixes (hits): 0.759M ops/sec
trie.iter_prefixes (misses): 1.538M ops/sec
trie.iter_prefixes (mixed): 1.359M ops/sec
trie.has_keys_with_prefix (hits): 1.896M ops/sec
trie.has_keys_with_prefix (misses): 2.590M ops/sec
trie.longest_prefix (hits): 1.710M ops/sec
trie.longest_prefix (misses): 1.506M ops/sec
trie.longest_prefix (mixed): 1.520M ops/sec
trie.longest_prefix_item (hits): 1.276M ops/sec
trie.longest_prefix_item (misses): 1.292M ops/sec
trie.longest_prefix_item (mixed): 1.379M ops/secLooking for all words starting with a given prefix is mostly limited
by overall result count (this can be improved in future because a
lot of time is spent decoding strings from utf_32_le to Python’s
unicode):trie.items(prefix="xxx"), avg_len(res)==415: 0.609K ops/sec
trie.keys(prefix="xxx"), avg_len(res)==415: 0.642K ops/sec
trie.values(prefix="xxx"), avg_len(res)==415: 4.974K ops/sec
trie.items(prefix="xxxxx"), avg_len(res)==17: 14.781K ops/sec
trie.keys(prefix="xxxxx"), avg_len(res)==17: 15.766K ops/sec
trie.values(prefix="xxxxx"), avg_len(res)==17: 96.456K ops/sec
trie.items(prefix="xxxxxxxx"), avg_len(res)==3: 75.165K ops/sec
trie.keys(prefix="xxxxxxxx"), avg_len(res)==3: 77.225K ops/sec
trie.values(prefix="xxxxxxxx"), avg_len(res)==3: 320.755K ops/sec
trie.items(prefix="xxxxx..xx"), avg_len(res)==1.4: 173.591K ops/sec
trie.keys(prefix="xxxxx..xx"), avg_len(res)==1.4: 180.678K ops/sec
trie.values(prefix="xxxxx..xx"), avg_len(res)==1.4: 503.392K ops/sec
trie.items(prefix="xxx"), NON_EXISTING: 2023.647K ops/sec
trie.keys(prefix="xxx"), NON_EXISTING: 1976.928K ops/sec
trie.values(prefix="xxx"), NON_EXISTING: 2060.372K ops/secRandom insert time is very slow compared to dict, this is the limitation
of double-array tries; updates are quite fast. If you want to build a trie,
consider sorting keys before the insertion:dict __setitem__ (updates): 6.497M ops/sec
trie __setitem__ (updates): 2.633M ops/sec
dict __setitem__ (inserts, random): 5.808M ops/sec
trie __setitem__ (inserts, random): 0.053M ops/sec
dict __setitem__ (inserts, sorted): 5.749M ops/sec
trie __setitem__ (inserts, sorted): 0.624M ops/sec
dict setdefault (updates): 3.455M ops/sec
trie setdefault (updates): 1.910M ops/sec
dict setdefault (inserts): 3.466M ops/sec
trie setdefault (inserts): 0.053M ops/secOther results (note thatlen(trie)is currently implemented
using trie traversal):dict __contains__ (hits): 6.801M ops/sec
trie __contains__ (hits): 2.816M ops/sec
dict __contains__ (misses): 5.470M ops/sec
trie __contains__ (misses): 4.224M ops/sec
dict __len__: 334336.269 ops/sec
trie __len__: 22.900 ops/sec
dict values(): 406.507 ops/sec
trie values(): 20.864 ops/sec
dict keys(): 189.298 ops/sec
trie keys(): 2.773 ops/sec
dict items(): 48.734 ops/sec
trie items(): 2.611 ops/sec
Please take these benchmark results with a grain of salt; this is a very simple benchmark and may not cover your use case.
Current Limitations: keys must be unicode (no implicit conversion for byte strings under Python 2.x, sorry); there are no iterator versions of keys/values/items (this is not implemented yet); it is painfully slow and maybe buggy under pypy; the library is not tested with narrow Python builds.
Contributing: development happens at github: https://github.com/pytries/datrie. Feel free to submit ideas, bugs, pull requests.
Running tests and benchmarks: make sure tox is installed and run $ tox from the source checkout. Tests should pass under Python 2.7 and 3.4+. $ tox -c tox-bench.ini runs benchmarks. If you’ve changed anything in the source code then make sure cython is installed and run $ update_c.sh before each tox command. Please note that benchmarks are not included in the release tar.gz’s because benchmark data is large and this saves a lot of bandwidth; use source checkouts from github or bitbucket for the benchmarks.
Authors & Contributors: see https://github.com/pytries/datrie/graphs/contributors. This module is based on the libdatrie C library by Theppitak Karoonboonyanan and is inspired by the fast_trie Ruby bindings, the PyTrie pure Python implementation and the Tree::Trie Perl implementation; some docs and API ideas are borrowed from these projects.
License: licensed under LGPL v2.1.
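For reference, the basic operations documented in the usage section above can be combined into one short, self-contained script; this is a minimal sketch based only on the calls shown there (datrie.Trie, prefix queries, longest_prefix, save/load), written in Python 3 syntax:

```python
# Minimal sketch of the datrie usage documented above.
import string
import datrie

# Keys are restricted to lower-case ASCII, as required when creating the trie.
trie = datrie.Trie(string.ascii_lowercase)
trie['foo'] = 5
trie['foobar'] = 10
trie['bar'] = 'bar value'

assert 'foo' in trie
assert trie['foo'] == 5

# Prefix queries described in the usage section.
print(trie.prefixes('foobarbaz'))          # ['foo', 'foobar']
print(trie.longest_prefix('foobarbaz'))    # 'foobar'
print(trie.keys('fo'))                     # ['foo', 'foobar']
print(trie.has_keys_with_prefix('fo'))     # True

# Save and reload (values must be picklable).
trie.save('my.trie')
trie2 = datrie.Trie.load('my.trie')
assert trie2['foobar'] == 10
```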
CHANGES
=======
0.8.4 (2022-03-30): forked from pytries/datrie; renamed ant-datrie; fix: decode string based on byteorder of system; adjust TrieBase load.
0.8.2 (2020-03-25): future-proof Python support by making cython a build-time dependency and removing cython-generated C files from the repo (and sdist); fix collections.abc.MutableMapping import; CI and test updates; adjust library name to unbreak some linkers.
0.8.1 (skipped): this version was intentionally skipped.
0.8 (2019-07-03): Python 3.7 compatibility; extension is rebuilt with Cython 0.29.11; Trie.get function; Python 2.6 and 3.3 support is dropped; removed patch to libdatrie which is no longer required; testing and CI fixes.
0.7.1 (2016-03-12): updated the bundled C library to version 0.2.9; implemented Trie.__len__ in terms of trie_enumerate; rebuilt Cython wrapper with Cython 0.23.4; changed Trie to implement collections.abc.MutableMapping; fixed Trie pickling, which segfaulted on Python 2.X.
0.7 (2014-02-18): bundled libdatrie C library is updated to version 0.2.8; new .suffixes() method (thanks Ahmed T. Youssef); wrapper is rebuilt with Cython 0.20.1.
0.6.1 (2013-09-21): fixed build for Visual Studio (thanks Gabi Davar).
0.6 (2013-07-09): datrie is rebuilt with Cython 0.19.1; iter_prefix_values, prefix_values and longest_prefix_value methods for datrie.BaseTrie and datrie.Trie (thanks Jared Suttles).
0.5.1 (2013-01-30): recently introduced memory leak in longest_prefix and longest_prefix_item is fixed.
0.5 (2013-01-29): longest_prefix and longest_prefix_item methods are fixed; datrie is rebuilt with Cython 0.18; misleading benchmark results in README are fixed; State._walk is renamed to State.walk_char.
0.4.2 (2012-09-02): update to latest libdatrie; this makes the .keys() method a bit slower but removes a key-length limitation.
0.4.1 (2012-07-29): cPickle is used for saving/loading datrie.Trie if it is available.
0.4 (2012-07-27): libdatrie improvements and bugfixes, including C iterator API support; custom iteration support using datrie.State and datrie.Iterator; speed improvements: __length__, keys, values and items methods should be up to 2x faster; keys longer than 32768 are not supported in this release.
0.3 (2012-07-21): there are no new features or speed improvements in this release; datrie.new is deprecated, use datrie.Trie with the same arguments; small test & benchmark improvements.
0.2 (2012-07-16): datrie.Trie items can have any Python object as a value (Trie from 0.1.x becomes datrie.BaseTrie); longest_prefix and longest_prefix_items are fixed; save & load are rewritten; setdefault method.
0.1.1 (2012-07-13): Windows support (upstream libdatrie changes are merged); license is changed from LGPL v3 to LGPL v2.1 to match the libdatrie license.
0.1 (2012-07-12): initial release.
|
anteater
|
Anteater - CI/CD Gate Check Framework. Description: Anteater is an open framework to prevent the unwanted merging of nominated strings,
filenames, binaries, deprecated functions, staging environment code / credentials,
etc. Anything that can be specified with regular expression syntax can be
sniffed out by anteater. You tell anteater exactly what you don't want to get merged, and anteater looks
after the rest. If anteater finds something, it exits with a non-zero code which in turn fails
the build of your CI tool, with the idea that it would prevent a pull request from
merging. Any false positives are easily negated by using the
same RegExp framework to cancel out the false match. Entire projects may also be scanned, using a recursive directory walk. With a few simple steps it can be easily implemented into a CI / CD workflow
with tooling such as Travis CI, CircleCI, Gitlab CI/CD and Jenkins. It is currently used in the Linux Foundation project 'OPNFV' as a means to provide automated security checks at gate, but as shown in the
examples below, it can be used for other scenarios. Anteater also integrates with the Virus Total API, so any binaries,
public IP addresses or URLs found by anteater will be sent to the Virus Total
API and a report will be returned. If any object is reported as malicious,
it will fail the CI build job. Example content is provided for those unsure of what to start with, and it's
encouraged and welcomed to share any Anteater filter strings you find useful.
Why would I want to use this? Anteater has many uses, and can easily be bent to cover your own specific needs. First, as mentioned, it can be set up to block strings and files with a
potential security impact or risk. This could include private keys, a shell
history, aws credentials etc. It is especially useful at ensuring that elements used in a staging /
development environment don't find their way into a production environment. Let's take a look at some examples:
apprun:
regex: app\.run\s*\(.*debug.*=.*True.*\)
desc: "Running flask in debug mode could potentially leak sensitive data"The above will match code where a flask server is set to running in debug modeapp.run(host='0.0.0.0' port=80 debug=true), which can be typical to a
developers enviroment and mistakenly staged into production.For a rails app, this could be:regex: \<%=.*debug.*%>Even more simple, look for the following in most logging frameworks:regex: log\.debugNeed to stop developers mistakenly adding a private key?private_key:
regex: -----BEGIN\sRSA\sPRIVATE\sKEY----
desc: "This looks like it could be a private key"How about credential files that would cause a job loss if ever leaked into
production? Anteater works with file names too.For Example:jenkins\.plugins\.publish_over_ssh\.BapSshPublisherPlugin\.xmlOr even..- \.pypirc
- \.gem\/credentials
- aws_access_key_id
- aws_secret_access_key
- LocalSettings\.php
If your app has its own custom secrets / config file, then it's very easy to
add your own regular expressions. Everything is set using YAML formatting,
so there is no need to change anteater's code.
Deprecated functions, classes etc. Another use is for when a project deprecates an old function, yet developers
might still make pull requests using the old function naming:
depreciated_function:
regex: depreciated_function\(.*\)
desc: This function was depreciated in release X, use Y function.
Or perhaps stopping people from using 1.x versions of a framework: <script.src.*="https:\/\/ajax\.googleapis\.com\/ajax\/libs\/angularjs\/1.*<\/script>
What if I get false positives? Easy, you set a RegExp to stop the match, kind of like RegExp'ception. Let's say we want to stop use of MD5:
md245:
regex: md[245]
desc: "Insecure hashing algorithm"
This then incorrectly gets matched to the following: mystring = int(amd500) * 4
We set a specific ignore RegEx, so it matches and then is unmatched by the
ignore entry: mystring.=.int\(amd500\).*
Yet other instances of MD5 continue to get flagged.
Binaries: With anteater, if you pass the argument --binaries, any binary found
causes a build failure on the originating pull request. It is not until a
sha256 checksum is set within anteater's YAML ignore files, that the build is
allowed to pass. This means you can block people from checking in compiled objects, images, PDFs
etc. that may have an unknown origin, or tampering with the existing binary files. An example:
$ anteater --binaries --project myproj --patchset /tmp/patch
Non Whitelisted Binary file: /folder/to/repo/images/pal.png
Please submit patch with this hash: 3aeae9c71e82942e2f34341e9185b14b7cca9142d53f8724bb8e9531a73de8b2
Let's enter the hash::
binaries:
images/pal.png:
- 3aeae9c71e82942e2f34341e9185b14b7cca9142d53f8724bb8e9531a73de8b2
Run the job again::
$ anteater --binaries --project myproj --patchset /tmp/patch
Found matching file hash for: /folder/to/repo/images/pal.png
This way we can be sure binaries are not tampered with, by means of a failed
cryptographic signature / checksum. Any binaries not having a sha256 checksum will also be sent to the Virus Total
API for scanning.
Virus Total API: If the following flags (combined or individually) --ips, --urls, --binaries are used, anteater will perform a lookup to the Virus Total API. IP addresses will have their DNS history checked for any previous or present connection
with known blacklisted domains marked as malicious or containing malware. URLs will be checked for any previous or present connection with known blacklisted domains
marked as malicious or containing malware. As mentioned, binaries will be sent to Virus Total and verified as clean / infected. For more details and in-depth documentation, please visit readthedocs. Last of all, if you do use anteater, I would love to know (twitter: @decodebytes)
and pull requests / issues are welcome!
Contribute: Contributions are welcome. Please make a pull request in a new branch, and not master: git checkout -b mypatch; git push origin mypatch. Unit tests and PEP8 checks are in tox, so simply run the tox command before
pushing your code. If your patch fixes an issue, please paste the issue url into the commit
message.
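To make the matching model described above concrete, here is a small illustrative Python sketch (not anteater's actual implementation) that applies a block pattern and an ignore pattern from a YAML-style rule set, using the flask-debug and md[245] regexes quoted in this description:

```python
# Illustrative sketch only: shows how a block regex plus an ignore regex
# (as described above) combine; this is not anteater's real code.
import re

rules = [
    {"regex": r"app\.run\s*\(.*debug.*=.*True.*\)",
     "desc": "Running flask in debug mode could potentially leak sensitive data"},
    {"regex": r"md[245]", "desc": "Insecure hashing algorithm"},
]
# Project-specific ignore patterns cancel out false positives.
ignores = [r"mystring.=.int\(amd500\).*"]

def gate_check(line: str) -> list:
    """Return the descriptions of every rule the line violates."""
    if any(re.search(p, line) for p in ignores):
        return []
    return [r["desc"] for r in rules if re.search(r["regex"], line)]

print(gate_check("app.run(host='0.0.0.0', port=80, debug=True)"))  # flask rule fires
print(gate_check("mystring = int(amd500) * 4"))                    # ignored false positive
```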
|
anteater-sh
|
For more information, see GitHub: Github-For-More-INFO.
|
antee-c-header-macro-generator
|
c_header_macro_generator: generate the name of the macro used in a C/CPP header file according to the header file name.
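The description does not show the package's API, but the idea it describes (deriving a macro name from a header file name, as is done for include guards) can be illustrated with a small standalone sketch; header_macro_name below is a hypothetical helper, not this package's function:

```python
# Hypothetical illustration of the idea described above (not this package's API):
# derive an include-guard style macro name from a header file name.
import re

def header_macro_name(filename: str) -> str:
    """Map e.g. 'my-widget.h' to 'MY_WIDGET_H'."""
    return re.sub(r"[^0-9A-Za-z]", "_", filename).upper()

print(header_macro_name("my-widget.h"))   # MY_WIDGET_H
print(header_macro_name("foo/bar.hpp"))   # FOO_BAR_HPP
```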
|
antelope
|
Antelope: Antelope is an ANTLR-based YAML 1.2 parser. Python implementation.
WIP
|
antelope-background
|
background: Background LCI implementation including Tarjan Ordering. This is kept as a separate repo because it is the only place numpy/scipy is required. The
idea is to enable people to run LCI/A computations without having the background data on their
machine or having to perform matrix construction and inversion (i.e. only using foreground
computations, like GaBi does).
Partial Ordering: The default implementation performs an ordering of the LCI database using Tarjan's algorithm
for detecting strongly-connected components (see Partial Ordering of Life Cycle Inventory
Databases). It performs the ordering, and then builds and stores a static LCI database (A and B matrices). This code is a bit convoluted, but it works. (Muttered question from the audience) No, it isn't tested. Tests have been performed (and passed). (indistinct grumbling) I know. I'm sorry.
Installing: Installation should be straightforward -- lxml is required here to access a local copy of ecoinvent.
user@host$ pip install antelope_background lxml
Setting up a catalog with ecoinvent data:
>>> from antelope_core import LcCatalog
>>> from antelope_core.data_sources.ecoinvent import EcoinventConfig
>>> cat = LcCatalog('/home/user/my_catalog')
Loading JSON data from /home/b/my_catalog/reference-quantities.json:
local.qdb: /home/b/my_catalog/reference-quantities.json
local.qdb: /data/GitHub/lca-tools/lcatools/qdb/data/elcd_reference_quantities.json
25 new quantity entities added (25 total)
6 new flow entities added (6 total)
>>> ec = EcoinventConfig('/path/to/ecoinvent')
>>> for res in ec.make_resources('local.ecoinvent.3.7.1.cutoff'):
cat.add_resource(res)
>>> cat.show_interfaces()
local.ecoinvent.3.7.1.cutoff [basic, exchange]
local.qdb [basic, index, quantity]
>>>
When the background is installed, new interface methods are available for catalog queries. In
order to access them, the background matrix must be constructed, which is done through
traversal of the LCI network using Tarjan's algorithm. This is triggered automatically
any time you request a background interface method. But it can also be triggered explicitly:
>>> q = cat.query('local.ecoinvent.3.7.1.cutoff')
>>> q.check_bg()
... # several minutes pass
Loaded 17400 processes (t=158.06 s)
Loaded 17495 processes (t=158.69 s)
20 new quantity entities added (20 total)
5333 new flow entities added (5333 total)
17495 new process entities added (17495 total)
...
Creating flat background
...
True
>>> cat.show_interfaces()
local.ecoinvent.3.7.1.cutoff [basic, exchange]
local.ecoinvent.3.7.1.cutoff.index.20210205 [background, basic, index]
local.qdb [basic, index, quantity]
>>>
The check_bg() route is slow because it requires indexing the database and traversing all exchanges,
both of which require loading all XML files. Fortunately, if the two steps are done during
the same python session, then the inventory remains in memory and each file only has to be
loaded once. Once the background matrix and index are created, the XML files do not need to be individually
loaded except to access details about a specific process. Now that the background interface exists, background queries can be conducted.
Contributing: Please do!
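Condensing the session above into a single script gives the following sketch, using only the calls shown in this description; the catalog path, ecoinvent path and origin name are placeholders:

```python
# Sketch of the workflow shown above; paths and origin names are placeholders.
from antelope_core import LcCatalog
from antelope_core.data_sources.ecoinvent import EcoinventConfig

cat = LcCatalog('/home/user/my_catalog')

# Register ecoinvent resources from a local EcoSpold copy.
ec = EcoinventConfig('/path/to/ecoinvent')
for res in ec.make_resources('local.ecoinvent.3.7.1.cutoff'):
    cat.add_resource(res)

# Trigger the Tarjan ordering / background matrix construction explicitly.
q = cat.query('local.ecoinvent.3.7.1.cutoff')
q.check_bg()           # slow the first time: indexes and traverses all exchanges
cat.show_interfaces()  # the origin now exposes a background interface
```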
|
antelope-core
|
core: Antelope Catalog - reference implementation. This repository provides code that enables access to different forms of life cycle
inventory and impact assessment data, ideally from both local and remote sources. It
allows you to view and index data sources, inspect their contents, and perform
exchange relation queries, quantity relation queries, and LCIA computations. At present, the Antelope Catalog relies on local LCA data on your machine, just like other
LCA software. However, the plan is to remove this requirement by off-loading
computing requirements to the cloud.
Additional Packages: The software for constructing and inverting background matrices, which requires SciPy, is in a separate repository called antelope_background. The idea is that
these computations can be performed remotely, allowing lightweight clients to run
without scientific computing software (other than python). However, at the moment
this is not yet available. The antelope_foreground package allows
users to construct and compute product models that use a mixture of data sources. Please visit and install these packages to access and test these functions.
Quick Start
1. Configure a local catalog: antelope_core is on PyPI -- note the optional dependency if you want to access datasets
in XML formats (ILCD, EcoSpoldV1, EcoSpoldV2):
user@host$ pip install antelope_core[XML]
Antelope stores its content in a catalog --- for automated unit testing, this should be
specified in an environment variable:
user@host$ export ANTELOPE_CATALOG_ROOT=/path/to/where/you/want/catalog
Once that's done, the catalog can be "seeded" with a core set of free tools by running the
local configuration unit test. This is a bit tricky because unit tests are not usually
designed to be run on distributed code, so it requires a bit of a hack to specify the
location of the installed package (note that if you are using a virtual environment, your
site-packages directory is inside that virtual environment):
user@host$ python -m unittest discover -s /path/to/your/site-packages -p test_aa_local.py
That will install: two different USLCI implementations (both somewhat stale), and the TRACI 2.1
LCIA methodology.
2. Start Running: You are now ready to perform LCIA calculations:
user@host$ python3
>>> from antelope_core import LcCatalog
>>> from antelope import enum # a simple "enumerate-and-show items" for interactive use
If you have defined your catalog root in your environment, you can import it:
>>> from antelope_core.catalog.catalog_root import CATALOG_ROOT
>>> cat = LcCatalog(CATALOG_ROOT)
Loading JSON data from /path/to/your/catalog/reference-quantities.json:
local.qdb: /path/to/your/catalog/reference-quantities.json
local.qdb: /data/GitHub/lca-tools/lcatools/qdb/data/elcd_reference_quantities.json
25 new quantity entities added (25 total)
6 new flow entities added (6 total)
Else, you can make it anything you want:
>>> cat = LcCatalog('/path/to/anywhere')
Loading JSON data from /path/to/anywhere/reference-quantities.json:
local.qdb: /path/to/anywhere/reference-quantities.json
local.qdb: /data/GitHub/lca-tools/lcatools/qdb/data/elcd_reference_quantities.json
25 new quantity entities added (25 total)
6 new flow entities added (6 total)
You then interact with the catalog by making queries to specific data sources:
>>> cat.show_interfaces() # output shown after running `test_aa_local`
lcia.ipcc.2007.traci21 [basic, index, quantity]
local.lcia.traci.2.1 [basic, index, quantity]
local.qdb [basic, index, quantity]
local.uslci.ecospold [basic, exchange, quantity]
local.uslci.olca [basic, exchange, quantity]
>>> lcias = enum(cat.query('local.lcia.traci.2.1').lcia_methods())
local.lcia.traci.2.1: /data/LCI/TRACI/traci_2_1_2014_dec_10_0.xlsx
Loading workbook /data/LCI/TRACI/traci_2_1_2014_dec_10_0.xlsx
Applying stored configuration
Applying context hint local.lcia.traci.2.1:air => to air
Applying context hint local.lcia.traci.2.1:water => to water
Applying configuration to Traci21Factors with 11 entities at /data/LCI/TRACI/traci_2_1_2014_dec_10_0.xlsx
Missing canonical quantity-- adding to LciaDb
registering local.lcia.traci.2.1/Acidification Air
[00] [local.lcia.traci.2.1] Acidification Air [kg SO2 eq] [LCIA]
Missing canonical quantity-- adding to LciaDb
registering local.lcia.traci.2.1/Ecotoxicity, freshwater
[01] [local.lcia.traci.2.1] Ecotoxicity, freshwater [CTUeco] [LCIA]
Missing canonical quantity-- adding to LciaDb
registering local.lcia.traci.2.1/Eutrophication Air
[02] [local.lcia.traci.2.1] Eutrophication Air [kg N eq] [LCIA]
Missing canonical quantity-- adding to LciaDb
registering local.lcia.traci.2.1/Eutrophication Water
[03] [local.lcia.traci.2.1] Eutrophication Water [kg N eq] [LCIA]
Missing canonical quantity-- adding to LciaDb
registering local.lcia.traci.2.1/Global Warming Air
[04] [local.lcia.traci.2.1] Global Warming Air [kg CO2 eq] [LCIA]
Missing canonical quantity-- adding to LciaDb
registering local.lcia.traci.2.1/Human Health Particulates Air
[05] [local.lcia.traci.2.1] Human Health Particulates Air [PM2.5 eq] [LCIA]
Missing canonical quantity-- adding to LciaDb
registering local.lcia.traci.2.1/Human health toxicity, cancer
[06] [local.lcia.traci.2.1] Human health toxicity, cancer [CTUcancer] [LCIA]
Missing canonical quantity-- adding to LciaDb
registering local.lcia.traci.2.1/Human health toxicity, non-cancer
[07] [local.lcia.traci.2.1] Human health toxicity, non-cancer [CTUnoncancer] [LCIA]
Missing canonical quantity-- adding to LciaDb
registering local.lcia.traci.2.1/Ozone Depletion Air
[08] [local.lcia.traci.2.1] Ozone Depletion Air [kg CFC-11 eq] [LCIA]
Missing canonical quantity-- adding to LciaDb
registering local.lcia.traci.2.1/Smog Air
[09] [local.lcia.traci.2.1] Smog Air [kg O3 eq] [LCIA]
>>> lcias[3].show()
QuantityRef catalog reference (Eutrophication Water)
origin: local.lcia.traci.2.1
UUID: f07dbefc-a5a0-3380-92fb-4c5c8a82fabb
Name: Eutrophication Water
Comment:
==Local Fields==
Indicator: kg N eq
local_Name: Eutrophication Water
local_Comment:
local_UnitConversion: {'kg N eq': 1.0}
local_Method: TRACI 2.1
local_Category: Eutrophication Water
local_Indicator: kg N eq
>>> _=enum(lcias[3].factors())
Imported 14 factors for [local.lcia.traci.2.1] Eutrophication Water [kg N eq] [LCIA]
[00] 7.29 [GLO] [kg N eq / kg] local.lcia.traci.2.1/phosphorus: water (Eutrophication Water [kg N eq] [LCIA])
[01] 3.19 [GLO] [kg N eq / kg] local.lcia.traci.2.1/phosphorus pentoxide: water (Eutrophication Water [kg N eq] [LCIA])
[02] 2.38 [GLO] [kg N eq / kg] local.lcia.traci.2.1/phosphate: water (Eutrophication Water [kg N eq] [LCIA])
[03] 2.31 [GLO] [kg N eq / kg] local.lcia.traci.2.1/phosphoric acid: water (Eutrophication Water [kg N eq] [LCIA])
[04] 0.986 [GLO] [kg N eq / kg] local.lcia.traci.2.1/nitrogen: water (Eutrophication Water [kg N eq] [LCIA])
[05] 0.779 [GLO] [kg N eq / kg] local.lcia.traci.2.1/ammonium: water (Eutrophication Water [kg N eq] [LCIA])
[06] 0.779 [GLO] [kg N eq / kg] local.lcia.traci.2.1/ammonia: water (Eutrophication Water [kg N eq] [LCIA])
[07] 0.451 [GLO] [kg N eq / kg] local.lcia.traci.2.1/nitric oxide: water (Eutrophication Water [kg N eq] [LCIA])
[08] 0.291 [GLO] [kg N eq / kg] local.lcia.traci.2.1/nitrogen dioxide: water (Eutrophication Water [kg N eq] [LCIA])
[09] 0.291 [GLO] [kg N eq / kg] local.lcia.traci.2.1/nitrogen oxides: water (Eutrophication Water [kg N eq] [LCIA])
[10] 0.237 [GLO] [kg N eq / kg] local.lcia.traci.2.1/nitrate: water (Eutrophication Water [kg N eq] [LCIA])
[11] 0.227 [GLO] [kg N eq / kg] local.lcia.traci.2.1/nitric acid: water (Eutrophication Water [kg N eq] [LCIA])
[12] 0.05 [GLO] [kg N eq / kg] local.lcia.traci.2.1/biological oxygen demand: water (Eutrophication Water [kg N eq] [LCIA])
[13] 0.05 [GLO] [kg N eq / kg] local.lcia.traci.2.1/chemical oxygen demand: water (Eutrophication Water [kg N eq] [LCIA])
>>>
Specific objects, whose IDs are known, can be retrieved by ID:
>>> p = cat.query('local.uslci.olca').get('ba5df01a-626b-35b8-859f-f1df42dd54a0')
...
>>> p.show()
ProcessRef catalog reference (ba5df01a-626b-35b8-859f-f1df42dd54a0)
origin: local.uslci.olca
UUID: ba5df01a-626b-35b8-859f-f1df42dd54a0
Name: Polyethylene, low density, resin, at plant, CTR
Comment:
==Local Fields==
SpatialScope: RNA
TemporalScope: {'begin': '2002-01-01-05:00', 'end': '2003-01-01-05:00'}
Classifications: ['Chemical Manufacturing', 'All Other Basic Organic Chemical Manufacturing']
>>> rxs = enum(p.references())
[00] [ Polyethylene, low density, resin, at plant, CTR [RNA] ]*==> 1 (kg) Polyethylene, low density, resin, at plant, CTR
[01] [ Polyethylene, low density, resin, at plant, CTR [RNA] ]*==> 0.429 (MJ) Recovered energy, for Polyethylene, low density, resin, at plant, CTR
>>>
LCIA can be computed for process inventories (note, however, that without antelope_background it is not
possible to compute LCI results. In this case the cradle-to-resin dataset is already an LCI). Again,
to do that, please visit / install antelope_background.
>>> res = lcias[3].do_lcia(p.inventory(rxs[0]))
...
>>> res.show_details()
[local.lcia.traci.2.1] Eutrophication Water [kg N eq] [LCIA] kg N eq
------------------------------------------------------------
[local.uslci.olca] Polyethylene, low density, resin, at plant, CTR [RNA]:
1.14e-05 = 0.05 x 0.000228 [GLO] local.lcia.traci.2.1/chemical oxygen demand, water, unspecified
5.89e-06 = 0.779 x 7.55e-06 [GLO] local.lcia.traci.2.1/ammonia, water, unspecified
2.85e-06 = 0.05 x 5.7e-05 [GLO] local.lcia.traci.2.1/biological oxygen demand, water, unspecified
7.29e-07 = 7.29 x 1e-07 [GLO] local.lcia.traci.2.1/phosphorus, water, unspecified
7.62e-08 = 0.986 x 7.73e-08 [GLO] local.lcia.traci.2.1/nitrogen, water, unspecified
2.42e-08 = 0.779 x 3.1e-08 [GLO] local.lcia.traci.2.1/ammonium, water, unspecified
2.1e-05 [local.lcia.traci.2.1] Eutrophication Water [kg N eq] [LCIA]
>>>
Search requires an index to be created:
>>> q = cat.query('local.uslci.olca')
>>> _=enum(q.processes(Name='polyethylene'))
---------------------------------------------------------------------------
IndexRequired Traceback (most recent call last)
...
IndexRequired: itype index required for attribute processes | ()
>>> cat.index_ref(q.origin)
...
'local.uslci.olca.index.20210205'
>>> _=enum(q.processes(Name='polyethylene'))
[00] [local.uslci.olca] Polyethylene, low density, resin, at plant [RNA]
[01] [local.uslci.olca] Polyethylene, linear low density, resin, at plant [RNA]
[02] [local.uslci.olca] Polyethylene terephthalate, resin, at plant [RNA]
[03] [local.uslci.olca] Polyethylene, linear low density, resin, at plant, CTR [RNA]
[04] [local.uslci.olca] Polyethylene, low density, resin, at plant, CTR [RNA]
[05] [local.uslci.olca] Polyethylene, high density, resin, at plant, CTR [RNA]
[06] [local.uslci.olca] Polyethylene, high density, resin, at plant [RNA]
[07] [local.uslci.olca] Polyethylene terephthalate, resin, at plant, CTR [RNA]
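The search hits are the same kind of process references as the get()-by-ID result shown earlier, so they can be inspected the same way. A hedged sketch, this time keeping the enumerated results instead of discarding them with _=:
>>> hits = enum(q.processes(Name='polyethylene'))
...
>>> hits[4].show()   # entry [04] above: Polyethylene, low density, resin, at plant, CTR [RNA]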
>>>
Installing Ecoinvent
If you have an ecoinvent license, you can install it in your catalog by first downloading
the 7z files that contain the EcoSpold datasets and storing them on your system.
You will need to create a folder for ecoinvent, and then create a subfolder for each version
(say, '3.7.1'), and put the 7z files in that.
user@host$ mkdir -p /path/to/Ecoinvent/3.7.1
The 7z files unfortunately need to be extracted before they can be loaded. After you are done
you should have something that looks like this:
user@host$ ls /path/to/Ecoinvent/3.7.1
'ecoinvent 3.7.1_cutoff_ecoSpold02'  'ecoinvent 3.7.1_cutoff_ecoSpold02.7z'
user@host$
After that, you can set up ecoinvent in your catalog from within python:
>>> from antelope_core.data_sources.ecoinvent import EcoinventConfig
>>> ec = EcoinventConfig('/path/to/Ecoinvent')
>>> _=enum(ec.origins)
[00] local.ecoinvent.3.7.1.cutoff
>>> ec.register_all_resources(cat)
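As with USLCI above, a newly registered origin must still be indexed before it can be searched; a hedged sketch, assuming cat.index_ref accepts the origin string exactly as in the earlier example (this step can take several minutes):
>>> cat.index_ref('local.ecoinvent.3.7.1.cutoff')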
>>>Again, you will need to index the resources before being able to search through them- this takes
several minutes. This is why we are working on a remote solution for this problem.Warning: if you want to do Ecoinvent LCI as well, you will needantelope_background-- please
visit that page.ContributingFork, open an issue, whatever.
|
antelope-foreground
|
foregroundAn interface and implementation for building and analyzing foreground models
|
antelope-interface
|
antelopeStandard Interface and reference framework for LCATheantelopepackage is an interface specification for accessing LCA data resources, as described in [JIE submitted]. It should be subclassed by implementations that wish to expose data using this uniform interface. A reference implementation, including a stand-alone LCA computing tool, exists...DocumentationSee the following articles:Antelope Design Principlesfor documentation of this repository.Entity Specificationand nomenclature.Return Typeswhich arereferencesto entities.See Alsoantelope_coreThe reference implementation including local data source management.antelope_backgroundUsed for partial ordering of databases and construction and inversion of matricesantelope_foregroundUsed for building foreground models
|
antelopy
|
antelopyDocumentation:https://antelopy.stuckatsixpm.comSource Code:https://github.com/stuckatsixpm/antelopyantelopyserializes transaction data for Antelope blockchains, integrating with existing Python packages for ease of use.In the Antelope's Leap 3.1 release, theabi_json_to_binendpoint was deprecated to ensure the integrity of transaction data. However, available options for interacting with Antelope chains with Python all rely on this endpoint for the serialization step.antelopyis designed to handle this process in a non-intrusive way, minimizing changes users ofaioeosandeospyneed to make.Key FeaturesSerialize transaction data in preparation for transactionRead ABIs from the blockchain or file, with the option to save for reuseIntegration wrappers aroundaioeosandeospyantelopysupports the following libraries.Antelope LibrarySupport StatusUsage GuideRepositoryaioeos✔ Fully integratedLinkLinkeospy✔ Fully integrated[^1]LinkLinkpyntelope✘ Waiting for dependency updateN/ALinkBasic Usage:InstallationpipinstallantelopyUsageVisit ourdocumentationfor various examples of how to useantelopy!Supportantelopyis possible thanks to funding from:
|
antenna
|
No description available on PyPI.
|
antennae
|
No description available on PyPI.
|
antenna-intensity-modeler
|
Create near-field plots of parabolic dish antennas.Free software: GNU General Public License v3Documentation:https://wboxx1.github.io/antenna-intensity-modelerInstallation:$pipinstallantenna-intensity-modelerFeaturesTODOCreditsThis package was created withCookiecutterand thewboxx1/cookiecutter-pypackageproject template.
|
antenna-lib
|
Antenna Lib
An open source project for antenna design and simulation in Python. Placeholder for further documentation.
|
antenna-optimizer
|
This project can optimize antennas using genetic algorithms. It uses
mypgapyPython wrapper forPGApack, the parallel genetic algorithm
library, originally by David Levine at Argonne National Laboratory and
currently maintained by me. It also usesPyNEC, the Python wrapper for
NEC2++, the C++ version of theNumerical Electromagnetics Code.Originally this started out with a low-gain two-element antenna where
the driven element is a folded dipole. One of the requirements for that
antenna was that it should have 50 Ω impedance and at least some
forward gain. Optimizing this antenna by hand soon turned out to be
tedious and I started experimenting with optimization by a genetic
algorithm.You can find the original experiments infolded.necandfolded2.nec. These are input files to the command-line NEC programs.
You can either use thenec2ccommand-line program which produces an
output file that can be viewed withxnecviewor you can use
graphicalxnec2cprogram. All run under Linux and are included in
the Debian Linux distribution. The.necfilesmayalso be usable
with other NEC versions on other operating systems but I have not tried.For optimizing the two-element Yagi-Uda you can use the command-line
toolfolded_antennawhere the optimizer is implemented infolded.py. Later a 3-element antenna with the same principles was
added infolded_3ele.pycallable with the command-line toolfolded_3ele_antenna.Theantenna_model.pyfactors out the common parts of the various
antennas.Thehb9cv.pywith the command-line toolhb9cv_antennamodels the
well-known HB9CV antenna but only very crudely. NEC2 isn’t really suited
for modelling the phasing stubs of that antenna because it doesn’t like
parallel wires that are too close. I mainly did this for comparing some
of the antennas resulting from optimization with the well-known
performance of the HB9CV.In the filelogper.pywith the command-line toollogper_antennayou can find a 9-element log-periodic antenna. It can currently not be
optimized, the performance of the real antenna is better than the
results obtained with NEC, so I didn’t implement an optimizer for it
yet.All the antenna programs take an action as mandatory argument. The
action is typically eitheroptimizefor running the optimizer ornecoutfor creating a.necfile which can then be fed to one of
the nec programs mentioned above. When running the optimizer it makes
sense to experiment with different random seeds, each random seed will
usually produce a different antenna. In addition there are some
experimental actions,frgainprints the forward and backward gains
(in dBi) for the lowest, the middle, and the highest frequencies and the
VSWR for those. Thegainaction visualizes the 3D antenna gain
pattern and theswraction visualizes the VSWR over the given
frequency range. Note that both, thegainand theswraction
compute the antenna data over the whole frequency range using NEC and
that may take some time.The output of the optimizer is text (usually redirected to a file) that
prints the evaluation, the VSWR, maximum gain, and forward/backward
ratio of the best antenna for every 10th generation of the genetic
algorithm. In addition the command-line options to create that antenna
are printed. When the genetic algorithm doesn’t make any more progress,
the search terminates and the data of the best evaluation are given. An
example of the last lines of such a text is as follows. The data is from
one of the best 2-element antennas and was obtained with the random seed 26
of an earlier version of the program:The Best Evaluation: 2.886437e+02.
The Best String:
-r 0.0364 -d 0.0444 -l 0.1704 -4 0.1075
VSWR: [1.7901433511443068, 1.1495780609982815, 1.7995760521232753]
GMAX: 6.69913175227, RMAX: -3.03663376703
Cache hits: 5670/9243 61.34%
Eval: 288.64
[ 101011001100101001111000000010011 ]This tells us the evaluation (which is meaningful only to the
genetic algorithm); the genetic algorithm maximizes this value.
The command-line options after the lineThe Best String:can be used
to create a.necfile for that antenna. The antenna in the example
has a voltage standing wave ratio of < 1.8 at the band ends and
around 1.15 in the middle of the band (the 70cm band from 430 to
440 MHz in that case). The forward gain (in the middle of the band)
is 6.7 dBi. The RMAX value is the (maximum) backward gain (in a 30
degree area in the back). So the F/B ratio of that antenna is:6.7 dB - -3.0 dB = 9.7 dBThe last line of the text output contains the genetic representation of
that antenna.
The.necfile for the antenna above which was optimized with an early
version of this package can be created with the command:folded_antenna -r 0.0364 -d 0.0444 -l 0.1704 -4 0.1075 necout > folded-opt.necThe command-line options specify the radius of the folded dipole, the
distance of the reflector from the folded dipole, the (half) length of
the reflector, and the (half) length of the straight part of the folded
dipole, respectively.According to NEC it has a standing wave ratio (VSWR) of < 1.8 from
430-440 MHz, a forward gain of > 6.5 dBi over the whole
frequency range and a Forward/Back Ratio of 8-11 dB.If you want to implement an optimizer for your own antenna, look at the
filefolded.py: You need to implement a class that defines the
geometry of the new antenna and an optimizer class that initializes the
gene ranges and implements acompute_antennamethod that returns an
instance of your antenna class with the parameters obtained from the
given gene. All lengths in the models are metric (in meters) as is the
default in NEC.A recent addition to this package involves modelling of coax cables.
This uses information from an old article by Frank Witt[1]to derive
everything necessary to model a transmission linewithloss from the
manufacturer cable data. The command-line tool for using these coax
models is namedcoaxmodel. Again this command has several
sub-commands:loss: This displays the fitted loss-curves from the manufacturer
data against the curve-fit algorithm used, you can see how much
difference in dB the fitted curve has to the loss at certain
frequencies given by the manufacturer data.matchcomputes the impedance at the load and input (depending on
which was given as an input the other is computed), the matched and
total loss (the matched loss is the loss in the cable if the load is
perfectly matched, the total loss is the sum of the matched loss and
the additional loss due to reflections), the SWR and data for various
stub-matches to get to the cable impedance Z0.resonatorcomputes the resistance and Q-factor of a coax resonator
at the given frequency. A resonator is a piece of cable that either
has a short-circuit or an open-circuit at the far end. The sub-command
computes resonators for quarter and half wave at the given frequency.stubcomputes the length, Q-factor, and resulting impedance as
well as the inductance or capacitance of a stub for a given impedance
and frequency. The impedance by default is -100j Ohm and can be
changed with the -x option. The stub with the shortest resulting
length is chosen, so for a negative reactance an open stub is chosen
while for a positive reactance a closed stub is chosen.For all these sub-commands you can specify the frequency, length of
cable and impedance (either at the load or at the end of the cable) to
be used in computing the results. You can specify complex impedances as
a python complex number in the format a+bj, e.g. 50-500j.Finally thetransmission_lineprogram can optimize the stub-matching
for a transmission line using NEC. By default a lossless line is assumed.
Also by default a closed stub at the closest possible position is
searched.Transmission lines can be modelled by NEC with theTL(transmission
line) card. But NEC can also model arbitrary (symmetric, passive)
networks with theNT(network) card. We use this (and the code incoaxmodel.py) to model a real cablewith lossfor stub matching.
It is instructive to compare the values for stub-matching obtained
analytically fromcoaxmodelwith the values obtained from an
optimization with the genetic algorithm bytransmission_line. Note
that the coaxmodel takes frequencies in Hz while transmission_line (which
uses NEC) accepts frequencies in MHz. So we match a complex impedance of
75+15j Ω for example:coaxmodel -c sytronic_RG_58_CU -f 435e6 -z 75+15j matchThis yields a stub of length 8.007 cm attached 7.888 cm from the load when
matching with a closed stub. When optimizing withtransmission_line:transmission_line -c sytronic_RG_58_CU -f 435 -z 75+15j optimizewe get 8.016cm for the stub length and 7.6cm for the distance of the
stub from the load. We can visualize this over a given frequency range
by either producing NEC output:transmission_line -c sytronic_RG_58_CU -f 435 -z 75+15j -i 50 \
-l 0.0816 -d 0.076 --frqstart=430 --frqend=440 necout > x1.nec
transmission_line -c sytronic_RG_58_CU -f 435 -z 75+15j -i 50 \
-l 0.08007 -d 0.07888 --frqstart=430 --frqend=440 necout > x2.necAnd then compute the NEC model and display it with:nec2c -i x1.nec > x1.out
nec2c -i x2.nec > x2.out
xnecview x1.out
xnecview x2.outOr directly display the VSWR curves with:transmission_line -c sytronic_RG_58_CU -f 435 -z 75+15j -i 50 \
-l 0.0816 -d 0.076 --frqstart=430 --frqend=440 swr
transmission_line -c sytronic_RG_58_CU -f 435 -z 75+15j -i 50 \
-l 0.08007 -d 0.07888 --frqstart=430 --frqend=440 swrBoth are close enough, the SWR is below 1.1 over the whole frequency
range given. Note that this can change drastically if load impedances
with a higher VSWR are matched.Also note that the NEC files produced in the example above have a different
NEC networkfor each frequency. This is because NEC models networks
using anadmittance matrixwhich is frequency dependent.This means the sequence of twoNTcards, aTLcard, aFRcard and aRPcard are repeated for
each frequency. Here the twoNTcards define the network of the
cable from the load to the stub and the stub itself while theTLcard defines the length of a lossless transmission line from the stub to
the source. TheFRcard specifies a single frequency and theRPcard defines a radiation pattern and triggers computation. This format
is perfectly valid NEC code, but certain programs (like the popularxnec2c) cannot deal with this format and display only a single
frequency.[1]Frank Witt. Transmission line properties from manufacturer’s
data. In R. Dean Straw, editor, The ARRL Antenna Compendium, volume 6,
pages 179–183. American Radio Relay League (ARRL), 1999.ChangesVersion 0.3: Multi-objective optimizationSwitch to pyproject.toml instead of setup.pyAdd multi-objective optimizationAllow NSGA-III for multi-objective optimizationAllow to model groundMultiple frequency rangesAllow to use average gain when optimizing. Needs a bug-fix in pynechttps://github.com/tmolteno/necpp/pull/73Add epsilon constrained optimization
This allows to better find areas with good gain even if constraining
the solutions to low SWRVersion 0.2: More cable dataFix setup to correctly specify dependenciesAdd more cable data the following command will list supported cable
types:coaxmodel --helpVersion 0.1: Initial Release
|
antenny-cdk
|
antenny-cdkantenny-cdk is a cloud development kit construct library for Antenny. It provides a way to integrate Antenny into your cdk infrastructure.Installationnpmnpminstallantenny-cdk--savepippipinstallantenny-cdknugetdotnetaddpackageAntenny.CdkUsageTo create a subscription in your aws-cdk project:constantenny=require('antenny-cdk');constsub=newantenny.Subscription(this,'Sub',{apiKey:'{api-key}',subscription:{name:'example-subscription',customerId:'{customerId}',region:'{aws-region}',resource:{protocol:'ws',url:'wss://example.com'},endpoint:{protocol:'http',url:'https://example.com'}}});There is also a real world example included in oursample-app.
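The construct is also published for Python (see the pip install above); a translation of the JavaScript snippet is sketched below. The module name, the snake_case api_key argument and the use of a plain dict for the subscription follow common jsii conventions and are assumptions, not taken from the project's documentation:
from aws_cdk import core          # CDK v1 style Stack base class
import antenny_cdk                # assumed Python module name for the antenny-cdk package


class SubscriptionStack(core.Stack):
    def __init__(self, scope, construct_id, **kwargs):
        super().__init__(scope, construct_id, **kwargs)
        # field names inside the dict are copied verbatim from the JavaScript example above
        antenny_cdk.Subscription(self, 'Sub',
            api_key='{api-key}',
            subscription={
                'name': 'example-subscription',
                'customerId': '{customerId}',
                'region': '{aws-region}',
                'resource': {'protocol': 'ws', 'url': 'wss://example.com'},
                'endpoint': {'protocol': 'http', 'url': 'https://example.com'},
            })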
|
antenny-py
|
This is an api that allows you to interact with the Antenny platform. It allows you to manage your clients and subscriptions. # noqa: E501
|
antevents
|
AntEvents is a (Python3) framework for building IOT event processing
dataflows. The goal of this framework is to support the
creation of robust IoT systems from reusable components. These systems must
account for noisy/missing sensor data, distributed computation, and the
need for local (near the data source) processing.AntEvents is pure Python (3.4 or later). The packaged distribution
(e.g. on PyPi) only includes the core Python code. The source repository athttps://github.com/mpi-sws-rse/antevents-pythoncontains the core Python
code plus the documentation, examples, and tests. There is also a port
of AntEvents for micropython available in the source repo.
|
antfarm
|
UNKNOWN
|
antfin
|
No description available on PyPI.
|
antfs
|
Usage:Select file paths matched the specifiedant path pattern:>>> ds = AntPatternDirectoryScanner("foo/bar", "foo/**/*.txt")
... for filename in ds.scan():
... print(filename)
...Copy matched files into target directory:>>> ds = AntPatternDirectoryScanner("foo/bar", "data/**/*.txt")
... ds.copy("target/dir")Installation:(A): Installation with pipumask 022
sudo pip3 install antfs
# Upgrading:
sudo pip3 install antfs --upgrade(B): Installation from sourcesumask 022
git clone https://github.com/Softmotions/antfs.git
cd ./antfs
sudo python3 ./setup.py install
|
antgen
|
ANTgen - the AMBAL-based NILM Trace generatorThis tool generates synthetic macroscopic load signatures for their use in conjunction
with NILM (load disaggregation) tools. By default, it runs in scripted mode (i.e., with
no graphical user interface) and processes an input configuration file into a set of CSV
output files containing power consumption values and the timestamps of their occurrence,
as well as a file summarizing the events that have occurred during the simulation).If you find this tool useful and use it (or parts of it), we ask you to cite the following work in your publications:@inproceedings{reinhardt20benchmarking,author={Andreas Reinhardt and Christoph Klemenjak},title={How does Load Disaggregation Performance Depend on Data Characteristics? Insights from a Benchmarking Study},booktitle={Proceedings of the 11th ACM International Conference on Future Energy Systems (e-Energy)},year={2020}}RequirementsANTgen relies on a small number of Python libraries to fully function. Install them by typing:pip3install-rrequirements.txtNote: On Windows systems without a C/C++ compiler suite installed, the installation of package requirements may fail,
reporting that the "Microsoft Visual C++ Build Tools" are missing. To continue,download and install
them, then re-run the above command.ANTgen can show an overview plot of its generated data ifmatplotlibis installed.
So unless you plan to use ANTgen to create data on a headless server, we recommend the installation
of this library as well.
pip3 install matplotlib   # (optional, install only if you want to see your data plotted)
Usage
python3 antgen.py [-o DESTINATION] [-w] [-s SEED] [-a] [-m MAPFILE] [-d DAYS] [-n NOISECONFIG] [-v] [-p] configfile
Mandatory argument:
configfile: The configuration file to process (see next section for format)
Optional arguments:
-o <dir> specifies the folder in which the output files shall be saved (output/ by default)
-w overwrite output files if they already exist
-s <123123123> define the seed value for the random number generator
-a randomly pick an appliance model each time the appliance is being operated (if unset, all operations of appliances of the same type will be exact replicas of each other)
-m <mapping.conf> defines the mappings between appliance types and their AMBAL models (can be overwritten individually in the [devices] section of the configuration file)
-n <C123> defines if noise shall be added to the aggregate signal. One letter (C=constant, G=Gaussian) followed by the amplitude in Watt
-d <123> overrides the number of days for which to generate data (can also be given in the [GENERAL] section of the configuration file)
-p opens a graphical interface to plot the resulting traces after their generation (requires matplotlib)
-v makes ANTgen verbose and outputs more status information
Configuration
All ANTgen configuration files are expected to be present in TOML format.
The core configuration file must feature at least the sections[GENERAL]and[users].
Optionally, specific appliance models to be used can be placed in the[devices]section.
If[devices]is not part of the configuration file, a fallback mapping between
appliance names and the corresponding model dictionary must be provided by means
of the-moption, such as-m mapping.conf. This mapping file must contain a section
named[devices]to be processed correctly.In the[GENERAL]section, the configuration needs to be provided with anameand
the number ofdaysfor which data shall be generated. Optionally, theseedvalue for the random number generator can be specified to ensure a repeatable
trace generation. When no seed is provided, the random number generator will
initialize itself based on the current system time. If the number of days is neither
specified here nor on the command line (using-d), ANTgen will generate one day
worth of data only.The[users]section contains key-value pairs of user handles (only used for debugging)
and the corresponding user models (see next section for details). In order to allow
household base loads (refrigerator, etc) to run unattended, they should be added
as another (virtual) user, e.g.,Home.An (optional)[devices]section can be added to list the appliance modeldirectoriesto
consider, indexed by thecapitalizedappliance type. It is crucial to keep
the structure of the path intact, i.e., use exactly two levels of hierarchy, with
the first-level subdirectory indicating the type of modeled appliance, and
the second-level subdirectory referring to the individual handle from which
the data was extracted. The keys for each entry must be upper-case and reflect
the type of appliance that is being referred to. All AMBAL models must be located
in theappliances/subdirectory.An example configuration is shown as follows:[GENERAL]
name = Sample configuration
days = 4
seed = 12345
[users]
Home = baseload.conf
Jack = STUDENT/student_simple.conf
[devices]
COOKINGSTOVE = COOKINGSTOVE/dev_D33097
TV = TV/dev_B80E51This configuration file will create synthetic load signature data for four days, seeding
the random number generator with the value 12345. There are two users in the generated data,
one going by the handle "Jack" and following the daily routines specified inusers/STUDENT/student_simple.conf.
The second "user" is present to model the household base load, as defined inusers/baseload.conf.UsersTo create realistic models, ANTgen relies on user models. All user models are stored in
theusers/subdirectory, or subdirectories thereof. User models are stored in TOML
format, and must feature the[GENERAL]and[presence]sections, as well as one section
for each user activity that should be modeled (these ones must start with the stringactivity_).In the[GENERAL]section, the user model must be provided with aname, which is
also reflected in the graphical user interface and the per-user power consumption output
file.The[presence]section contains key-value pairs of weekdays ("monday" through "sunday";
all in lower-case) and and the corresponding presence times. Times are specified in
24hr notation (from 00:00-24:00); multiple time ranges can be concatenated using commas.
These time frames indicate when a user can start/perform an activity.All activity sections must start with theactivity_tag, followed by a unique
identifier (hint: this makes it easy to remove an activity temporarily by making it aninactivity_). User activities are modeled separately (see below); the link between is
created by specifying the file name of the activity configuration using themodelentry.
Thedaily_runsvalue states the average number of repetitions of this activity
throughout each day. There is no guarantee the activity will be scheduledexactlythis
often during each simulated day. Lastly, activity occurrences can be time-limited by
specifying the hours during which the activity can take place for each day of the week,
using the same notation as for thepresencetag above.An example user model is shown as follows:[GENERAL]
name = Lucas Lazybone
[presence]
monday = 00:00-08:30, 14:00-24:00
sunday = 00:00-24:00
[activity_breakfast]
model = KITCHEN/cooking_quick.conf
daily_runs = 1
monday = 07:30-08:15
sunday = 08:30-09:15This configuration models a user who is only at home on mondays and sundays, and cooks
breakfast once on both days at some (randomly determined) time in the specified time intervals.ActivitiesActivities are modeled as state machines, to be executed by the users. All activity
models must be stored in theactivities/subdirectory, or subdirectories thereof.
Activity models also use the TOML format.As follows, find some notes on the used nomenclature and some general guidelines
for activity definitions:The only entry the[GENERAL]section of each activity model must contain is thenamefield. Enter a descriptive name of the activity, which will also constitute
the corresponding file name for the power data when written to an output file.Specify the types of appliances the activity requires in the[devices]section. The tool
will try to find matches for all entries listed there, so any unused leftover entries
can make the synthesis fail. Use unique numeric keys for listing the devices required. They
will be later referred to in the state machine (see below).Activities are modeled in the form of state machines. Each operational state is
specified in the[sequence]table.
All states must be assigned a numeric identifier (the state machine starts in state0) and
require the specification of the following fields, which are entered in the
form of a comma-separated list:A short name (primarily for debugging purposes)The minimum and maximum duration (use '0' to fall back to the underlying appliance
model's default value). If a value greater than 0 is provided, the underlying appliance's activity
will be scaled linearly in time, i.e., the durations of all its elements will be
stretched/compressed to meet the requested overall duration.A flag whether the user must be present for a particular state to take place. This
ensures that an activity is only scheduled when the user is actually at home when needed.A flag whether the state must run to completion before the state machine will progress
to the next step. If this value is set to 'false', a delay of just 5-10 seconds is introduced
before moving on to the next state, and the appliance continues to operate in the background.The ID of the device to operate in this state (as per the key specified in the[devices]section of the activity configuration file).The next state(s) into which the appliance operation can move, as well as the probability
of the transition there. The state machine will advance into state A with the probability
specified, and into the state B with the converse probability. Using the same value for states A
and B, or a probability of 1.0 effectively makes this a linear flow with no variation possible.
Referring to an undefined state in the state machine will terminate the state machine's flow.The state model representation also allows for unattended operation of devices (simply set
the flag whether it involves the user to false). Similarly, states that involve the user but no appliance (e.g.
eating) can be modeled by setting the appliance ID to an undefined value (e.g., 0), but specifing a non-zero duration.There is no need to add a "start" state, yet an initial state with a duration of 0 seconds
can be added for the sake of better readability. Likewise, a state relying on an undefined
appliance (e.g., '0') with non-zero duration can act as a delay in-between states.An example activity model for vacuuming the apartment is shown as follows. It assigns ID 1 to the
VACUUMCLEANER appliance, and runs this appliance as long (or short) as stored in the appliance model.
The user must be present both during the start of the activity (involves_user) as well as throughout
its operation (run_to_completion). After one room has been vaccumed, the user rests for 5-10 minutes,
before vacuuming another room (at 20% probability) or stopping the activity (at 80% probability).[GENERAL]
name = vacuuming
[devices]
1=VACUUMCLEANER
[sequence]
# state ID, min_dur, max_dur, involves_user, run_to_completion, dev, prob_for_a, state_a, state_b
0=vacuum, 0, 0, true, true, 1, 1.0, 1, 1
1=rest, 300, 600, true, true, 0, 0.2, 0, 2Appliance modelsANTgen uses the AMBAL format for its appliance models (i.e., XML files). A sample set
of models is provided in theappliances/subdirectory of this repository.
Newly extracted models can simply be copied into this directory.User interfaceWhen executed with the-poption, a graphical user interface is brought up after the
trace generation has completed. The user interface shows traces for total power demand
as well as the demand of power per user, power per activity, and power per appliance.
To use this feature, thematplotliblibrary must be installed.By clicking on the colored lines in the legend boxes (nottheir textual labels),
the visibility of individual traces can be toggled from the view.Getting started with a little exampleThe distribution of ANTgen ships with a few user, activity, and appliance models.
Run the following command to create a synthetic trace for one user and a constantly
running refrigerator, for the duration of 10 days.python3antgen.py-mmapping.confdefault.confDuring its execution, ANTgen will output some logging information, an excerpt
of which is shown as follows. Most of it should be self-explanatory.root [I] ANTgen started using 'default.conf' on 14-05-2020 at 14:16:25
root [I] Output files will be stored in ./output
...
UserModel [I] User model successfully created for 'Household base load' (1 activity)
UserModel [I] User model successfully created for 'Grumpy Grandma' (5 activities)
root [I] ********************************************************************************
...
UserModel [I] Generating load signature(s) for activity 1/1 (fridge) for 10 days...
ActivityModel [I] Synthesis of 'fridge operation' done: 241 scheduled, 5 didn't fit
UserModel [I] Generating load signature(s) for activity 1/5 (vacuum) for 10 days...
ActivityModel [I] Synthesis of 'vacuuming' done: 8 runs scheduled
UserModel [I] Generating load signature(s) for activity 2/5 (dishwashing) for 10 days...
ActivityModel [I] Synthesis of 'dishwasher operation' done: 1 runs scheduled
UserModel [I] Generating load signature(s) for activity 3/5 (tv) for 10 days...
ActivityModel [I] Synthesis of 'watching TV' done: 13 runs scheduled
UserModel [I] Generating load signature(s) for activity 4/5 (ironing) for 10 days...
ActivityModel [I] Synthesis of 'ironing clothes' done: 4 runs scheduled
UserModel [I] Generating load signature(s) for activity 5/5 (laundry) for 10 days...
ActivityModel [I] Synthesis of 'washing a load of laundry' done: 3 runs scheduled
root [I] Synthesis completed in 30.926 seconds
...
root [I] ********************************************************************************
root [I] Trace duration (days) : 10
root [I] First weekday : friday
root [I] # active devices : 12
root [I] # appliance operations : 553
root [I] ---------------------------------------
root [I] VACUUMCLEANER #runs : 8
root [I] WASHINGMACHINE #runs : 3
root [I] IRON #runs : 4
root [I] REFRIGERATOR #runs : 241
root [I] DISHWASHER #runs : 1
root [I] TV #runs : 26
root [I] ---------------------------------------
root [I] Max. appl. concurrency : 2
root [I] Random seed : 1234567890
root [I] Added noise : noneIf you havematplotlibinstalled, ANTgen can also provide a plot of the synthesis results.
Simply invoke it with-pon the command line:python3antgen.py-mmapping.conf-pdefault.confANTgen features two ways to make the output data a little harder to disaggregate.
First, adding noise to the aggregate signal is possible by invoking ANTgen with the-noption. For example,-n G200will add 200 Watts of Gaussian noise (with
a standard deviation of one tenth of the amplitude, i.e., 20W) to the aggregate signal.
Second, you can use the-aswitch to alternate the used appliance model for each activity.
While all refrigerator cycles followed the exact same power consumption pattern in above
diagram, a random model for the given appliance will be selected in this case (from the directoryappliances/REFRIGERATOR/dev20111228/) for each operation of the refrigerator.python3antgen.py-mmapping.conf-nG200-a-d5-pdefault.confOther configuration files (including the ones that were used to create the synthetic data for
the aforementioned ACM e-Energy 2020 publication) are located in thetestcases/directory.Copyright noticeCopyright (C) 2019-2020 Andreas [email protected], TU ClausthalPermission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
|
antgo
|
Target
Antgo is a machine learning experiment management platform, which is integrated deeply with MLTalker.
Antgo provides a one-stop environment for model development, deployment, analysis, auto-optimization and management.
Installation
(RECOMMENDED) use docker: see the docker environment.
Install from pip: pip install antgo
Install from source:
git clone https://github.com/jianzfb/antgo.git
cd antgo
pip install -r requirements.txt
python setup.py build_ext install
Register
Register in MLTalker. All user experiment records will be managed by MLTalker in the user's personal page.
Quick Example
1.step create mvp code (cifar10 classification task)
antgo create mvp --name=cifar10
2.step start training process
python3 ./cifar10/main.py --exp=cifar10 --gpu-id=0 --process=train
3.step check training log in ./output/cifar10/output/checkpoint
4.step export onnx model
python3 ./cifar10/main.py --exp=cifar10 --checkpoint=./output/cifar10/output/checkpoint/epoch_1500.pth --process=export
|
antgrid-server
|
antgrid
antgrid pypi
Interface
server_start: register the server
baichuan_start: start the Baichuan model
chatglm_start: start the ChatGLM model
llama2_start: start the Llama 2 model
sd_start: start the Stable Diffusion model
|
anthem
|
Anthem is a tool to help scripting Odoo instances for automated setup,
upgrades, testing and more.It should be an alternative to the other tools likeoerpscenario.Make your own songsWriting your songs is as easy as creating a Python Package. The
songs functions called by anthem must have a positionalctxargument.ctxis essentially the execution context - you can accessctx.envfrom
it, which is an Odoo environment instance that you should be pretty much familiar with.## songs/install.py
def setup_company(ctx):
""" Setup company """
company = ctx.env.ref('base.main_company')
company.name = 'My Company'
def main(ctx):
setup_company(ctx)LogsA song can display some logs when executed [email protected],[email protected]_company(ctx):""" Setting up company """company=ctx.env.ref('base.main_company')withctx.log('Changing name'):company.name='My Company'ctx.log_line('Name changed')withctx.log('Loading a logo'):company.logo=b64encode(LOGO_CONTENT)ctx.log_line('Logo changed')The decorator on the function will display the first line of the docstring.
Both the decorator and the context manager will show the timing of the
execution. The upper example gives:Execute your songsUse the command lineanthem. Provided your songs andopenerpare in thePYTHONPATH:Anthem will execute the functionmainof the modulesongs.installwith
actxinitialized with an Odooenv.Instead of using-cfor the command line, you can export the environment
variableOPENERP_SERVERwith the path of the configuration file.In order to haveopenerpin thePYTHONPATH, you might install it as a
package withpip install-eor directly modify thePYTHONPATH.In order to have yoursongsin thePYTHONPATH, the better is to make a
Python package out of them.TestingDependenciesTo run the tests, you must have Postgresql running, with accesses for your user
(or you will have to modifytests/config/odoo.cfgwith your database
username and password).Run the testsTo runanthem’s tests, it is a good idea to do aneditableinstall of it
in a virtualenv. You must also prepare the environment by installing odoo packages.Odoo 9.0 (Python 2):Odoo 10.0 (Python 2):Odoo 11.0 (Python 3):If need be, you can drop the test database with (adapt the version):These steps will download the nightly release of Odoo install it as a package
then install a database, so tests can be run against it (and that’s also why it
is important to use a virtualenv!)When callingpytest, you have to define theOPENERP_SERVERenvironment
variable with the configuration file for the Odoo database that will be used
for the tests.LyricsLyrics are predefined snippets written for the most commonly used cases, like:Loading data: read (load) a data file (CSV format is supported at the moment)Provide XMLIDs for recordsUpserting a record: essentially search for the record and update it with
given values, or create it in case it isn’t there yetUninstalling a module(s)Updating module configuration: pre-defining a set of settings for a particular
module (or set of modules)Loading dataThere’s an ability to supply data in a handy CSV format - Anthem is just able to
parse and load those.load_csvmethod is meant to be the main entrypoint for
doing so:ParamDescriptionctxAnthem context instancemodelOdoo model name or model klass fromctx.envpathabsolute or relative path to CSV file.
If a relative path is given you must provide a value forODOO_DATA_PATHin your environment
or set--odoo-data-pathoption.headerwhitelist of CSV columns to loadheader_excludeblacklist of CSV columns to ignorefmtparamskeyword params forcsv_unireaderCSV format is similar to that of an Odoo export format, namely:
* it should contain a set of field names in a header
* each consecutive row defines a set of values to use to create records on a given modelRecordsThis section is dedicated to methods that operate on records.Provide XMLIDs for recordsThis is as simple as callinganthem.records.add_xmlidwith a record as a
first parameter and a desired XMLID as a second.E.g., you have a very specialres.partnerrecordfoo:fromanthem.recordsimportadd_xmlid[...]@anthem.logdefadd_xmlid_to_foo(ctx):"""Make Jhony Foo great again."""foo=ctx.env['res.partner'].create({'name':'Jhony','lastname':'Foo',})add_xmlid(foo,'__setup__.res_partner_foo_jhony')From now on, Jhony could be referred to asctx.env.ref('__setup__.res_partner_foo_jhony').Upserting a record“Upsert”is a commonly used term that basically stands for UPDATE or INSERT.
Anthem features a facility that is capable of executing that kind of operations
on Odoo databases. There is a method calledanthem.records.create_or_updatethat relies on the model, a set of values and a record XMLID.If your goal is to create the record in the first place as well as provide an
XMLID, as was shown in a previous section,create_or_updatedoes just what
you need.Examplefromanthem.recordsimportcreate_or_update[...]@anthem.logdefcreate_partner_foo(ctx):"""Ensure that Jhony Foo is known to our Company."""create_or_update(ctx,model='res.partner',xmlid='__setup__.res_partner_foo_jhony',values={'name':'Jhony','lastname':'Foo',})Upon calling, it would:Try to fetch the record by a given XMLIDIf the record was found:Update it with the given values (callrecord.update(values)on it)Otherwise:Create a record with given values (callmodel.create(values))Provide an XMLID to it (usinganthem.records.add_xmlid)In any case: return that record backModulesThis section is dedicated to methods that operate on modules.Uninstalling a module(s)Sometimes you just need some particular module to be gone from your instance(s)
and you’d like it done programmatically, without having to reach for each
instance, search for it and hit the“Uninstall”button. Anthem can do the
job for you: you can simply call ananthem.lyrics.modules.uninstallwith a
list of module names that you won’t use anymore.Example (given that there are modulesfooandbarthat you want gone):fromanthem.lyrics.modulesimportuninstall[...]@anthem.logdefuninstall_foo(ctx):"""Get rid of legacy `foo` and `bar`."""uninstall(ctx,['foo','bar'])Updating translations on module(s)In a similar fashion, sometimes you need to update translations on a set of
modules -anthem.lyrics.modules.update_translationsis there for you :wink:Example is similar to the previous case - just call the different method instead.Updating module configurationBy using this feature, you’re able to preconfigure your module setup via Anthem
song: you’ll just need a straight idea what needs to be done, an instance of a
configuration settings model for your module (model name will do as well) and a
mapping (in a form of Python dictionary) of technical configuration names with
desired values.Here’s a brief example ofsalemodule configuration:fromanthem.lyricsimportsettings[...]@anthem.logdefdefine_sale_settings(ctx):"""Configure `sale` module."""model=ctx.env['sale.config.settings']# it's okay to use 'sale.config.settings' as a string thoughmodel='sale.config.settings'settings(ctx,model,{'default_invoice_policy':'delivery',...:...,'et':'cetera',})Be advised: settings onchange are not triggered by this function.Usage within MarabuntaAnthem andMarabuntaare powerful
when combined: you can call a set of songs inside Marabunta’s migration steps
using following syntax:...-version:10.0.1.0.0operations:pre:-anthem songs.upgrade.your_pre_song::mainpost:-anthem songs.upgrade.your_post_song::mainBy using this approach, you possess the power of full-pledged OdooEnvironmentinstance initialized on a live database while performing a
regular upgrade powered by Marabunta.Let’s say that you have to enable multicompany with inter-company transactions
on a migration to next version, lets say, 10.0.1.1.0. In this case, you’ll need
a song to back this up on a Python side first:# songs.upgrade.upgrade_10_0_1_1_0.pyfromanthem.lyricsimportsettings[...]@anthem.logdefenable_multicompany(ctx):"""Set up multicompany."""settings(ctx,'base.config.settings',{# enable multicompany as it is'group_light_multi_company':True,# enable inter-company transactions'module_inter_company_rules':True,})[...]@anthem.logdefmain(ctx):enable_multicompany(ctx)And then you’ll need to call it on a migration step:...-version:10.0.1.1.0operations:post:-anthem songs.upgrade.upgrade_10_0_1_1_0::mainBoom! Enjoy your new multicompany settings.That’s all, folks!Thanks for reading. Happy hacking and enjoy your songwriting skills!Release HistoryUnreleasedFeaturesBugfixesImprovementsDocumentationBuild0.14.0 (2023-05-16)BugfixesFix Update_translation function and update black version 22.3.0Pin version of Setuptools < 58Fix environment initialization for Odoo 15Fixadd_xmlidfor Odoo 15ImprovementsEnable Travis-CI tests for Odoo 14 and Odoo 15Add: nuke_translations to allow to remove already existing translations0.13.0 (2019-08-29)FeaturesBREAKING: Change defaultoverwritevalue forlyrics.modules.update_translationsto FalseSupport odoo saas versionsBugfixesMakelyrics.modules.update_translationsOdoo >= 11.0 compatible0.12.2 (2019-06-21)ImprovementsAdd ‘tracking_disable=True’ as default context to load CSVs
(avoid creating ‘mail.message’ records and speed up the import process)BuildPackaging: build universal wheels0.12.1 (2018-11-09)DocumentationImprove API docsBuildThe lib is now automaticaly published to Pypi by Travis when a tag is added0.12.0 (2018-03-19)FeaturesAdd a new option--odoo-data-pathor env. variableODOO_DATA_PATH.Thelyrics.loaders.load_csvmethod now accepts a relative path appended to the
new option “odoo data path”. Absolute paths are still allowed.Bugfixeslyrics.loaders.update_translationsis now deprecated as it was a duplicate fromlyrics.modules.update_translations0.11.0 (2017-12-22)FeaturesMake it Python 3 and Odoo 11 compatibleBuildSwitch to unicodecsv instead of custom code to handle thatFix the flapping tests setup. Removed tox which was provoking that for some reason.Add a lint check in build0.10.0 (2017-09-19)BugfixesDisable Odoo’s xmlrpc portBuildAdd ‘build-release.sh’ script with commands to build and upload the dist files0.9.0 (2017-08-21)FeaturesNew lyrics: modules.update_translations to update translations from po filesLyrics ‘uninstall’ has been moved from uninstaller.uninstall to modules.uninstall,
previous path is still working for backward compatibilityNew lyrics context manager ‘records.switch_company’0.8.0 (2017-07-24)FeaturesNew lyrics: Define settings like being in the interfaceAdd CSV Loading columns control (columns whitelist and blacklist)BugfixesFix error when loading CSV with no rows0.7.0 (2017-04-28)ImprovementsSplit CSV loaders in functions to be able to get rows from a CSV or to load
rows, enabling to modify the rows before loading them for instancecreate_or_update lyrics accepts now a model so we can change its env (user,
context, …)New lyrics to uninstall module0.6.0 (2017-01-18)FeaturesCSV loaders can be used with a model in order to pass a contextBugfixesFix tests by installing eggs from odoo/requirements.txt0.5.0 (2016-10-12)FeaturesSupport Odoo 10Allow to specify the encoding of an imported file, default is utf8Bugfixes‘records.add_xmlid’ lyrics do no longer fail when it already exists0.4.0 (2016-08-19)FeaturesNew lyrics: CSV loaders from path or streamNewctx.log_lineto print a line respecting the current indentationImprovementsAdd tests for the existing lyricsBuildFinally green builds!0.3.0 (2016-07-26)FeaturesAdd –quiet modeFixesEncode the logged strings to the default encoding or utf8Allow to use Ctrl-c to stop anthem.Set openerp’s loglevel to ERROR, its logs clutter anthem’s own outputs0.2.0 (2016-07-22)FeaturesAbility to log descriptions and timings in songs with the
context managerContext.logand the decoratoranthem.log.from anthem import log
@log
def setup_company(ctx):
""" Setup company """
# do stuff
with ctx.log('other stuff'):
# do other stuff
@log
def load_data(ctx):
""" Load data """
# load
@log
def main(ctx):
setup_company(ctx)
load_data(ctx)If we run anthem onmain, we will get:running... main
running... Setup company
running... other stuff
other stuff: 0.850s
Setup company: 1.100s
running... Load data
Load data: 2.900s
main: 4.000s0.1.3 (2016-07-07)FixesCorrect lyric to create or update a record0.1.2 (2016-07-07)Add a lyric to create a xmlidAdd a lyric to create or update a record0.1.1 (2016-06-23)Fixed crash on non-editable install.0.1.0 (2016-06-23)Initial release.
|
anthemav
|
This is a Python package to interface withAnthemAVM and MRX receivers and
processors. It uses the asyncio library to maintain an object-based
connection to the network port of the receiver with supporting methods
and properties to poll and adjust the receiver settings.This package was created primarily to support an anthemav media_player
platform for theHome Assistantautomation platform but it is structured to be general-purpose and
should be usable for other applications as well.ImportantThis package will maintain a persistant connection to the network
control port which will prevent any other application from communicating
with the receiver. This includes the Anthem iOS and Android remote
control app as well as the ARC-2 room calibration software. You will
need to disable any application that is using the library in order to
run those other applications.RequirementsPython 3.6 or newer with asyncioAn Anthem MRX or AVM receiver or processorKnown IssuesThis has only been tested with an MRXx20 series receiver, although
the Anthem protocol was largely unchanged from the MRXx10 series. It
should work with the older units, but I’d appreciate feedback or pull
requests if you encounter problems. It will definitely not work with
the original MRXx00 units or the D2v models.Only Zone 1 is currently supported. If you have other zones
configured, this library will not allow you to inspect or control
them. This is not an intractable problem, I just chose not to address
that nuance in this initial release. It’s certainly feasible to add
support but I am not settled on how that should be exposed in the
internal API of the package.I skipped over a lot of the more esoteric settings that are available
(like toggling Dolby Volume on each input). If I passed over a
setting that’s really important to you, please let me know and I’ll
be happy to add support for it. Eventually I intend to cover the full
scope of the Anthem API, but you know how it goes.InstallationYou can, of course, just install the most recent release of this package
usingpip. This will download the most recent version fromPyPIand install it to your
host.pip install anthemavIf you want to grab the development code, you can also clone this
git repository and install from local sources:cd python-anthemav
pip install .And, as you probably expect, you can live the developer’s life by
working with the live repo and edit to your heart’s content:cd python-anthemav
pip install -e .TestingThe package installs a command-line tool which will connect to your
receiver, power it up, and then monitor all activity and changes that
take place. The code for this console monitor is inanthemav/tools.pyand you can invoke it by simply running this at
the command line with the appropriate IP and port number that matches
your receiver and its configured port:anthemav_monitor --host 10.0.0.100 --port 14999Helpful Commandssudo tcpflow -c port 14999Interesting LinksProject HomeAPI Documentation for Anthem Network Protocol (Excel Spreadsheet):MRX-x20 and AVM-60MRX-x40, AVM-70 and AVM-90MDX-16 and MDX-8Pictures of catsCreditsThis package was written by David McNett.https://github.com/nuggethttps://keybase.io/nuggetThis package is maintained by Alex Henryhttps://github.com/hyralexHow can you help?First and foremost, you can help by forking this project and coding.
Features, bug fixes, documentation, and sample code will all add
tremendously to the quality of this project.If you have a feature you’d love to see added to the project but you
don’t think that you’re able to do the work, someone is probably
happy to perform the directed development in the form of a bug or
feature bounty.If you’re anxious for a feature but it’s not actually worth money to
you, please open an issue here on Github describing the problem or
limitation. If you never ask, it’ll never happenIf you just want to thank me for the work I’ve already done, I’m
happy to accept your thanks, gratitude, pizza, or bitcoin. My bitcoin
wallet address can be onKeybaseor
you can send me a donation viaPayPal.Or, if you’re not comfortable sending me money directly, I’ll be
nearly as thrilled (really) if you donate tothe
ACLU,EFF, orEPICand let me know that you did.
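To round out the description above, here is a rough usage sketch of the object-based connection. The Connection.create coroutine and its update_callback argument mirror how the Home Assistant integration drives this library, but treat the exact names as assumptions and check anthemav/tools.py for the authoritative example:
import asyncio
import anthemav

async def main():
    # called whenever the receiver reports a change (assumed callback signature: one message argument)
    def on_update(message):
        print('AVR update:', message)

    conn = await anthemav.Connection.create(
        host='10.0.0.100', port=14999, update_callback=on_update)
    await asyncio.sleep(10)   # let the connection run and report state changes for a bit
    conn.close()

asyncio.run(main())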
|
anthemav-hyralex
|
This is a Python package to interface withAnthemAVM and MRX receivers and
processors. It uses the asyncio library to maintain an object-based
connection to the network port of the receiver with supporting methods
and properties to poll and adjust the receiver settings.This package was created primarily to support an anthemav media_player
platform for theHome Assistantautomation platform but it is structured to be general-purpose and
should be usable for other applications as well.ImportantThis package will maintain a persistant connection to the network
control port which will prevent any other application from communicating
with the receiver. This includes the Anthem iOS and Android remote
control app as well as the ARC-2 room calibration software. You will
need to disable any application that is using the library in order to
run those other applications.RequirementsPython 3.6 or newer with asyncioAn Anthem MRX or AVM receiver or processorKnown IssuesThis has only been tested with an MRXx20 series receiver, although
the Anthem protocol was largely unchanged from the MRXx10 series. It
should work with the older units, but I’d appreciate feedback or pull
requests if you encounter problems. It will definitely not work with
the original MRXx00 units or the D2v models.Only Zone 1 is currently supported. If you have other zones
configured, this library will not allow you to inspect or control
them. This is not an intractable problem, I just chose not to address
that nuance in this initial release. It’s certainly feasible to add
support but I am not settled on how that should be exposed in the
internal API of the package.I skipped over a lot of the more esoteric settings that are available
(like toggling Dolby Volume on each input). If I passed over a
setting that’s really important to you, please let me know and I’ll
be happy to add support for it. Eventually I intend to cover the full
scope of the Anthem API, but you know how it goes.InstallationYou can, of course, just install the most recent release of this package
usingpip. This will download the most recent version fromPyPIand install it to your
host.pip install anthemavIf you want to grab the development code, you can also clone this
git repository and install from local sources:cd python-anthemav
pip install .And, as you probably expect, you can live the developer’s life by
working with the live repo and edit to your heart’s content:cd python-anthemav
pip install -e .TestingThe package installs a command-line tool which will connect to your
receiver, power it up, and then monitor all activity and changes that
take place. The code for this console monitor is inanthemav/tools.pyand you can invoke it by simply running this at
the command line with the appropriate IP and port number that matches
your receiver and its configured port:anthemav_monitor --host 10.0.0.100 --port 14999Helpful Commandssudo tcpflow -c port 14999Interesting LinksProject HomeAPI Documentation for Anthem Network
Protocol(Excel Spreadsheet)Pictures of catsCreditsThis package was written by David McNett.https://github.com/nuggethttps://keybase.io/nuggetHow can you help?First and foremost, you can help by forking this project and coding.
Features, bug fixes, documentation, and sample code will all add
tremendously to the quality of this project.If you have a feature you’d love to see added to the project but you
don’t think that you’re able to do the work, someone is probably
happy to perform the directed development in the form of a bug or
feature bounty.If you’re anxious for a feature but it’s not actually worth money to
you, please open an issue here on Github describing the problem or
limitation. If you never ask, it’ll never happenIf you just want to thank me for the work I’ve already done, I’m
happy to accept your thanks, gratitude, pizza, or bitcoin. My bitcoin
wallet address can be onKeybaseor
you can send me a donation viaPayPal.Or, if you’re not comfortable sending me money directly, I’ll be
nearly as thrilled (really) if you donate tothe
ACLU,EFF, orEPICand let me know that you did.
|
anthemav-serial
|
No description available on PyPI.
|
anthe-official
|
AntheThis is the official repository for the articleLess is More!
A slim architecture for optimal language translation. Anthe is an architecture
that improves on the Transformer performance with much fewer parameters.To run the experiments run thetrain.pyfile. If you want to activate the Transformer architecture, pass the
argument--comments=sameemb_projectoutput. If you want to activate the Anthe architecture, pass the argument--comments=geglu_gateattention_hsoftpos:2_tcffn:.005_tcpreatt:.07_tclength:2. By default it will use
the WMT14 dataset. If you want to use the WMT17 add the following text to the comments argument:--comments=..._lpair:cs-en, where the available
language pairs are cs-en, de-en, fi-en, lv-en, ru-en, tr-en, zh-en.You can install it as a package withpip install anthe-official.Layers AvailableThe following layers are available for the Anthe architecture, only in TensorFlow 2.10.0 for now.
You can access the Anthe architecture, the AntheEncoderBlock and the AntheDecoderBlock, like so:fromanthe_official.neural_models_tfimportAnthe,AntheEncoderBlock,AntheDecoderBlockmodel=Anthe(inputs_vocab_size,target_vocab_size,encoder_count,decoder_count,attention_head_count,d_model,d_point_wise_ff,dropout_prob)encoder_block=AntheEncoderBlock(attention_head_count,d_model,d_point_wise_ff,dropout_prob)decoder_block=AntheDecoderBlock(attention_head_count,d_model,d_point_wise_ff,dropout_prob)In the article we develop other layers that are part of the Anthe architecture, but might be of interest
on their own.
The TC versions of the Dense, Conv1D and Embedding,
and the SoftPOS and the HSoftPOS, can be accessed like so:fromanthe_official.neural_models_tfimport*tc_dense=TCDense(d_model,length=3,ratio=.2)tc_conv1d=TCConv1D(filters,kernel_size,tc_length=3,ratio=.2)tc_embedding=TCEmbedding(input_dim,output_dim,tc_length=3,ratio=.2)soft_pos=SoftPOS(add_units,n_subpos=add_units,repeat_subpos=1)hsoft_pos=HSoftPOS(vocab_size,embed_dim)AcknowledgementsWe thankstrutive07for his implementation of the Transformer and
WMT14 task, which we used as a starting point for our code.
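To make the constructor shown above concrete, here is a minimal sketch that instantiates the Anthe model. The hyperparameter values are purely illustrative (they are not the settings from the paper), and it assumes only what the snippet above shows: the package is installed and Anthe takes these eight positional arguments. Since the layers target TensorFlow 2.10.0, the model is presumably a tf.keras model, so the usual compile/fit workflow should apply.

```python
# Illustrative sketch only: the values below are made up for the example,
# not the hyperparameters reported in the paper.
from anthe_official.neural_models_tf import Anthe

model = Anthe(
    8000,   # inputs_vocab_size
    8000,   # target_vocab_size
    2,      # encoder_count
    2,      # decoder_count
    4,      # attention_head_count
    128,    # d_model
    256,    # d_point_wise_ff
    0.1,    # dropout_prob
)

# Assuming Anthe is a tf.keras.Model subclass, standard Keras training applies:
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```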
|
anthill
|
No description available on PyPI.
|
anthill-admin
|
No description available on PyPI.
|
anthill-blog
|
No description available on PyPI.
|
anthill-common
|
No description available on PyPI.
|
anthill-config
|
No description available on PyPI.
|
anthill.customexport
|
IntroductionThis package adds a new tab to theportal_skins/customfolder named “FS-Export”.
Using this tab you can export all scripts and other items you have customized
to the filesystem with one click. It now also supports the
portal_view_customizations folder.Even in times of views, content providers and viewlets there are cases where
you have stuff in the custom folder and struggle with exporting it to the
filesystem.InstallationExtend your buildout with anthill.customexportMake sure your instance knows about it (eggs=, zcml=)Rerun buildoutGo to portal_skins/custom and then click on the new tabTested withPlone 2.5.x and 3.xChangelog0.3 - 2010-11-30Fixed exporter for Plone 4 environments
[spamsch]0.2 - 2009-09-01Fixed code to make export from portal_view_customiztions possible [spamsch]0.1 - 2009-08-14Initial release after repackaging
|
anthill-discovery
|
No description available on PyPI.
|
anthill-dlc
|
No description available on PyPI.
|
anthill-environment
|
No description available on PyPI.
|
anthill-event
|
No description available on PyPI.
|
anthill.exampletheme
|
IntroductionThis package provides a public theme for Plone. It isNOTusable with
plain Plone. It is intended to be used in conjunction with theanthill.skinnerpackage. Look there for more information.Keep in mind: this theme relies on the idea that for anonymous users you want
a flexible theme that is easy to set up, while for yourself or content
editors you want to unleash the full power of Plone by leaving the interface
as is. This paradigm is enough for most Plone websites imho.InstallationInclude anthill.exampletheme in your buildoutMake sure to have it included in the instance zcml and eggs optionsAlso don’t forget to include anthill.skinner, anthill.tal.macrorenderer and
z3c.autoincludeGo to portal_quickinstaller and install anthill.skinnerThen install anthill.examplethemeClick on “Show Preview” on the upper right corner in PloneDependenciesz3c.autoincludeanthill.skinner >= 0.2anthill.tal.macrorendererHow does it look like?How to create a public skin?I’m currently creating a paster template that creates all the files and dirs needed to
start right away. This template is based on plone3_theme and has no big
differences (but important and subtle ones).But before invoking paster you should have a look at this
example in order to see how easy it is to create your own layout. I will
describe the steps needed in order to build a public view:I assume you already have a ready-to-use design. I took mine from
freecsstemplates.org. I got images, one css file and the index.html.Create a package (e.g. my.theme) usingpaster create-tplone3_theme my.themeCopy all images to browser/imagesCopy your css file to browser/stylesheets/main.cssCopy main_template.pt from this package
(skins/anthill_exampletheme_custom_templates) to
skins/my_theme_custom_templatesCopy publicmenu_levels.pt from this package
(skins/anthill_exampletheme_custom_templates) to
skins/my_theme_custom_templatesFire up your favourite editor and open
skins/my_theme_custom_templates/main_templateDelete all stuff in the body tagCopy the contents of your index.html between body tagsLook at how this package uses menu structures and insert menu call at
correct places. Also have a look at publicmenu_levels.pt where menu
definitions live.Do not forget to include the “Back to Plone” link!Now make sure that your profiles/default/skins.xml matches the one from this
package. You do need to change the contents in skin-path to be based onpublicviewand notPlone Default. You also need to make sure that the
first layer (my_theme_custom_templates) has attribute
insert-before=”anthill_skinner_templates”Now you should be ready to start with your new skin. Make sure to install
anthill.skinner first and then install your package. Go to Plone interface and
then you should have a new link on the upper right that reads “Show Preview”.
Click it, and if you did everything right you should see your shiny layout.I think you will agree that this is very easy and will take you at most an
hour to have every design ready. How long will it take to have this design on
top of Plone w/o publicview? Don’t know - not my problem :)Feedback to the authorChangelog0.2 - 2009-08-09Fixed some setuptools issues and added more doc [spamsch]0.1 - 2009-08-08Initial release
|
anthill-exec
|
No description available on PyPI.
|
anthill-game-controller
|
No description available on PyPI.
|
anthill-game-master
|
No description available on PyPI.
|
anthill-leaderboard
|
No description available on PyPI.
|
anthill-login
|
No description available on PyPI.
|
anthill-message
|
No description available on PyPI.
|
anthill-modules.core
|
No description available on PyPI.
|
anthill-profile
|
No description available on PyPI.
|
anthill-promo
|
No description available on PyPI.
|
anthill-PyMySQL
|
PyMySQLTable of ContentsRequirementsInstallationDocumentationExampleResourcesLicenseThis package contains a pure-Python MySQL client library, based onPEP 249.Most public APIs are compatible with mysqlclient and MySQLdb.NOTE: PyMySQL doesn’t support low level APIs_mysqlprovides likedata_seek,store_result, anduse_result. You should use high level APIs defined inPEP 249.
But some APIs likeautocommitandpingare supported becausePEP 249doesn’t cover
their use case (a short sketch appears at the end of this description).RequirementsPython – one of the following:CPython: 2.7 and >= 3.5PyPy: Latest versionMySQL Server – one of the following:MySQL>= 5.5MariaDB>= 5.5InstallationPackage is uploaded onPyPI.You can install it with pip:$ python3 -m pip install PyMySQLTo use “sha256_password” or “caching_sha2_password” for authentication,
you need to install additional dependency:$ python3 -m pip install PyMySQL[rsa]DocumentationDocumentation is available online:https://pymysql.readthedocs.io/For support, please refer to theStackOverflow.ExampleThe following examples make use of a simple tableCREATETABLE`users`(`id`int(11)NOTNULLAUTO_INCREMENT,`email`varchar(255)COLLATEutf8_binNOTNULL,`password`varchar(255)COLLATEutf8_binNOTNULL,PRIMARYKEY(`id`))ENGINE=InnoDBDEFAULTCHARSET=utf8COLLATE=utf8_binAUTO_INCREMENT=1;importpymysql.cursors# Connect to the databaseconnection=pymysql.connect(host='localhost',user='user',password='passwd',db='db',charset='utf8mb4',cursorclass=pymysql.cursors.DictCursor)try:withconnection.cursor()ascursor:# Create a new recordsql="INSERT INTO `users` (`email`, `password`) VALUES (%s,%s)"cursor.execute(sql,('[email protected]','very-secret'))# connection is not autocommit by default. So you must commit to save# your changes.connection.commit()withconnection.cursor()ascursor:# Read a single recordsql="SELECT `id`, `password` FROM `users` WHERE `email`=%s"cursor.execute(sql,('[email protected]',))result=cursor.fetchone()print(result)finally:connection.close()This example will print:{'password':'very-secret','id':1}ResourcesDB-API 2.0:https://www.python.org/dev/peps/pep-0249/MySQL Reference Manuals:https://dev.mysql.com/doc/MySQL client/server protocol:https://dev.mysql.com/doc/internals/en/client-server-protocol.html“Connector” channel in MySQL Community Slack:https://lefred.be/mysql-community-on-slack/PyMySQL mailing list:https://groups.google.com/forum/#!forum/pymysql-usersLicensePyMySQL is released under the MIT License. See LICENSE for more information.
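The short sketch below complements the example above by illustrating the autocommit and ping connection APIs mentioned at the top of this description; the connection parameters are placeholders.

```python
import pymysql

# Placeholder credentials; adjust to your environment.
connection = pymysql.connect(host='localhost', user='user',
                             password='passwd', db='db')

# Commit each statement immediately instead of waiting for connection.commit().
connection.autocommit(True)

# Verify the server is still reachable, reconnecting if the link was dropped.
connection.ping(reconnect=True)

with connection.cursor() as cursor:
    cursor.execute("SELECT VERSION()")
    print(cursor.fetchone())

connection.close()
```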
|
anthill.querytool
|
IntroductionThis package provides a complete user interface forAdvancedQueryby
Dieter Maurer. It enables you to use a powerful language to search for
content. It also provides functionality to save parametrized and conditional
queries for later use (predefined queries). Look at the examples for more information.InstallationPut anthill.querytool in eggs= and zcml=Make sure that AdvancedQuery is installed (works for Plone 3.x)Example queryA query could look like that:And(
Eq('SearchableText', '$text'),
~Generic('path', {'query':'Members', 'level':-1}),
[[if($allowed_types)]]
In('portal_types', $allowed_types),
[[endif]]
[[ifnot($allowed_types)]]
In('portal_types', ['Folder', 'Document']),
[[endif]]
Ge('start_date', TODAY)
)Here you see that you can parametrize queries (variable expansion enabled
using $), you can also use defined constants (currently only one, called TODAY,
where TODAY=DateTime()), and you can put conditionals in your queries.Conditionals are a powerful way to enable or disable certain parts of your
query. Theifstatement checks if a given parameter exists. You can also
replaceifwithifnot, which only activates the given part if the
parameter is not set.You can save this query and call it later on like that:context.query_tool.executePredefinedQuery('contentsearch', text='Test*', allowed_types=['Folder', ])ExtensionsThis release adds some additional query operators (defined in
SearchOperators.py). The following operators are currently available:Countcounts items in search results. (e.g. Count(Eq(‘SearchableText’, ‘moses’)) )Sumcomputes sum over int result (e.g. Sum(Ge(‘commentcount’, 10)) )Avgcomputes the average over an int result (e.g. Avg(Ge(‘userviews’, 1)) )Look at SearchOperators.py for examples on how to create your own operators.Submit queryPredefined queriesThanksMarkus Reinsch for coding the predecessor of this packageDieter Maurer for his great implementation of AdvancedQueryChangelog0.2 - 2009/08/21Revamped documentation [spamsch]0.1 - 2009/08/20Initial release
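For orientation, the sketch below shows roughly what the predefined 'contentsearch' query above boils down to when it is composed directly with AdvancedQuery and evaluated against the portal catalog. It assumes a standard Plone setup with AdvancedQuery installed; the helper function and its parameter values are illustrative, not part of this package's API.

```python
# Illustrative sketch: the predefined query above, written directly against
# AdvancedQuery instead of going through the query tool.
from DateTime import DateTime
from Products.AdvancedQuery import And, Eq, Generic, Ge, In
from Products.CMFCore.utils import getToolByName

def content_search(context, text, allowed_types=None):
    catalog = getToolByName(context, 'portal_catalog')
    query = And(
        Eq('SearchableText', text),
        ~Generic('path', {'query': 'Members', 'level': -1}),
        # Mirrors the [[if]]/[[ifnot]] conditional in the predefined query.
        In('portal_types', allowed_types or ['Folder', 'Document']),
        Ge('start_date', DateTime()),   # the TODAY constant
    )
    return catalog.evalAdvancedQuery(query)
```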
|
anthill-report
|
No description available on PyPI.
|
anthill.skinner
|
IntroductionThis package provides functionality to ease skinning of Plone.
It is built around the idea that you shouldn’t have to adapt many Plone
templates but can instead take any layout you want and put it on top of Plone.That means that all editing is done using the Plone skin, but for anonymous users
(or users not having the correct permission) another skin is shown. This works
based on rules described below. No url switching or iframe magic needed.It resemblescollective.skinnybut instead of imposing customized templates
for each and every content type this package tries to reuse already
existing views and templates. You also won’t need a special server
configuration to redirect to ++skin++ or such. With some coding it is also
possible to use this for community sites because peoplecanlog in and
will see either the well-known Plone interface (if the permission is set) or the public skin.
Also there is no need to hack around to prevent plone templates fromleakingbecause youwantto display all plone related templates as is.Full Example provided here:http://pypi.python.org/pypi/anthill.examplethemeInstallationInclude anthill.skinner in your buildout.cfgMake sure to also include z3c.autoincludeRerun buildoutRestart your Zope instanceGo to portal_quickinstaller and install anthill.skinnerATT: Make sure to restart Zope - this is because of a handler only being
evaluated on startupThere should be a new link “Show Preview” on the bottomCreation of a theme (simple way)Create a new folder custom_public in portal_skinsInclude this folder in portal_skins/manage_propertiesForm in publicviewCustomize anthill_skinner_templates/main_template to custom_publicPut images and CSS also to this folderFor a more elaborate example look at anthill.examplethemeDependenciesz3c.autoincludeanthill.tal.macrorendererTested withPlone 4.x (for Plone 3.x use version <0.7)ProsNo need to understand the complex plone template logicNo need to write a new handbook for editors - take any recent plone book
because the edit skin stays the sameLess work when updating to a new Plone version because you didn’t touch
much of the templatesAlmost no limitations for your theme/design that could be imposed by the
fact that you need to include all the edit functionality (tabs, …) into
your themeBy not having to fiddle with Plone inner logic/templates you save
a lot of timeConsEditors have no in-place editing - although you can change to the edit view
on every context there’s one more click neededIncluding Plone portlets into your theme is a little more complexSimilar packagescollective.skinnycollective.editskinswitcherRulesInstead of using URL-based rules, this package uses simple rules suitable for
most deployments. If you don’t like these rules then you can easily overwrite
them.Rules to show public skin are as follows (order matters):User is anonymousUser is authenticated but has not the correct permission (anthill: View CMS)User is authenticated, has the correct permission but activated previewThere is a request variable named anthill.skinner.previewAll rules can be found inbrowser/handling#mustDisplayPublicSkin.Overwrite rulesYou can overwrite these rules by defining an adapter. Please keep in mind that if
you overwrite rules then you need to overwrite all rules!configure.zcml:<adapter
for="anthill.skinner.interfaces.ISkinHandler"
provides="anthill.skinner.interfaces.IRuleOverwrite"
factory="your.product.publicview.RuleMaker"
/>publicview.py:from zope.interface import implements
from anthill.skinner.interfaces import IRuleOverwrite

class RuleMaker:
    implements(IRuleOverwrite)
    def __init__(self, context):
        self.context = context
    def mustDisplayPublicView(self, context, request):
        # True means the public skin is shown for this request
        return True
How to create your own skinIn order to create your own skin first take a look at the very simple example
included in this package. It shows you how to define your menu and how content
will be displayed.Please be aware that it is intended not to load any of the CSS or
JavaScript coming with Plone.You can then create your own theme based on anthill.skinner by simply using the
same skin and layer for your resources. Useanthill.skinner.interfaces.IPublicSkinLayeras the layer andpublicviewas the skin name you’re putting your stuff into.ThanksDevelopers of collective.skinnyPlone communitybanality design & communication for funding this (all anthill.* packages)This package is part of the anthill.* ecosystem that powers many websites all
around the world - all being built on top of this package (originally for
Plone 2.x).Changelog0.8 (2010-11-30)Added possibility to select language for menu items
[spamsch]Plone 4 compatibility
[spamsch]0.6 (2009-11-27)Many more fixes to utility methods
[spamsch]Fixed renderer to use context for macro rendering
[spamsch]Fixed navtree to always use folderish context
[spamsch]Now taking review state into account for display
[spamsch]Added method to check if context is portal root
[spamsch]Fixed severe bug in user auth where skinner would never
switch into cms mode even if user has authenticated
[spamsch]0.5 - 2009-10-28Fixed severe bug in main_template always showing error [spamsch]0.4 - 2009-08-14Fixed skinner to be only active where installed using portal_quickinstaller [spamsch]0.3 - 2009-08-09Fixed z3c.autoinclude inclusion in configure.zcml [spamsch]0.2 - 2009-08-08Added menu generator for public views to ease menu separation and usage [spamsch]Fixed preview activation and deactivation link to use the current context
and stick to the folder [spamsch]0.1 - 2009-06-08Initial release
|
anthill-social
|
No description available on PyPI.
|
anthill-static
|
No description available on PyPI.
|
anthill-store
|
No description available on PyPI.
|
anthill.tal.macrorenderer
|
IntroductionThis package renders macros from a given page template using pure Python.Sometimes you may want to use page templates as code libraries where for each
functionality you have one macro. Calling macros is no problem using ZPT
use-macro, but how do you call macros from pure Python code and also pass
parameters? Because there does not seem to be an obvious solution to this problem
(especially the parameters part) this package was created.Render macro with namemacronamefrom a given page template:>>> from anthill.tal.macrorenderer import MacroRenderer
>>> template = ViewPageTemplateFile('template.pt')
>>> renderer = MacroRenderer(template, 'macroname')
>>> print renderer(data={'option1' : 42})Sometimes you get an exception about not enough context being provided to the
renderer (or for prior versions a TypeError).A fix is easy: Simply add acontext=self.contextto the MacroRenderer call:>>> renderer = MacroRenderer(template, 'macroname', context=self.context)Changelog0.2.1 (2009-08-24)Added fix forhttp://mail.zope.org/pipermail/zope3-dev/2007-April/022266.htmlbecause this can happen when calling macros with not enough context [spamsch]0.2 (2009-08-24)Fix for missing context (TypeError exceptions) [spamsch]0.1 (2009-08-08)Initial release
|