package | package-description
---|---
alphabacktest
|
alphabacktest

DISCLAIMER: The results this backtesting software might produce may not be accurate or reliable, and do not constitute any evidence of the profitability of an algorithmic trading strategy. The results are indicative and might not be appropriate for trading purposes. Therefore, the creator does not bear any responsibility for any losses anyone might incur as a result of using this software. Please be fully informed about the risks and costs associated with trading the financial markets, as it is one of the riskiest investment forms possible.

Description

alphabacktest is a library that aims at bringing algorithmic trading to all Python programmers via a very simple set of methods that allow backtesting any trading strategy anyone can come up with, allowing external functions or modules.

You can find below an example of the structure to use when working with alphabacktest. The module is designed to be inherited by a new class created by the user, which then calls the methods within that class. Backtest() contains the engine that runs the backtest chronologically and calls strategy() at each point in the data. The variables passed to that function are lists with the previous quotes at each point for all the prices (high, close, ...) and volumes, except for dtime, which is a string representing the current data point.

from alphabacktest import Backtest

class mStrategy(Backtest):
    ''' Always call super().__init__() '''
    def __init__(self):
        super().__init__(ticker="AMZN")

    ''' You can choose the parameter names '''
    def strategy(self, _open, close, high, low, vol, dtime):
        ''' Fill in your strategy '''
        if not self.has_positions():
            q = int(self.free_balance / close[-1])
            self.long_order(security=self.symbol, amount=q, dtime=dtime, price=close[-1])

if __name__ == '__main__':
    mStrategy()

As seen, the usage is pretty straightforward and does not require much effort to import or work with, giving the user freedom to apply any strategy, from those based on technical indicators to AI, through many different tools.

Regarding the data source (explained under Usage), getting the proper data is sometimes rather difficult, especially when one is looking for tight timeframes (1m, 5m, 15m, ...). This data is not usually free, so this module gives the chance to either get the data from Yahoo Finance or from a CSV file that the user keeps locally, whether purchased or self-harvested.

Installation

You can find all releases in PyPI.

$ pip install alphabacktest

Requirements

Python 3.6+

Dependencies

As some of the features provided by the module are meant to be optional, installing alphabacktest does not pull in TA-Lib or Dash. The package therefore comes without these modules, which need to be installed by the user.

- TA-Lib. As there are sometimes difficulties installing this module depending on the OS and IDE configuration, there is an option for the backtesting engine not to calculate any technical indicator (as all are based on TA-Lib), so in that case you don't necessarily need to have it installed. If you want to enable the calculation of technical indicators, you will need the module, which you can find here.
- Dash. alphabacktest uses Dash to plot and display the results. However, this option can also be deactivated. If you do want it, you can find it here.

Usage

The super().__init__() call is highly important as it defines exactly the settings of the backtest.
It is customisable to an extent and allows multiple features to be deactivated or included.

super().__init__(sym="", initial_time="first", final_time="last",
                 dateformat="%Y-%m-%d", file_path="", ticker=None,
                 indicators=True, slippage=0.0001, leverage=1, fees=0.005/100,
                 capital=20000, save_results=True, save_path=os.getcwd(),
                 plot_results=True)

Data source

The data can either be pulled from Yahoo Finance via pandas_datareader, or imported from a local file.

From Yahoo Finance

In the first case, sym and file_path need to be left as they are, while ticker needs to be filled with the security symbol (according to the symbols Yahoo Finance uses). The dateformat of Yahoo Finance is the default format.

super().__init__(ticker="AAPL")

From a local CSV file

In the second case, the initialization varies a little: the sym, file_path and dateformat parameters need to be specified, and ticker must not be specified. It is important that the dateformat is specified correctly, otherwise an error will be raised by the datetime module.

super().__init__(sym="SP500", dateformat="%Y-%m-%d%H:%M:%S",
                 file_path="yourCSVfiledirectory.csv")

However, the requirements for the file are very tight. The engine imports the data and assigns the values "Datetime", "Open", "High", "Low", "Close", "Volume" to the columns, in that order. The Datetime column is kept as a string and set as the index, while the Open-Volume columns are converted to float. Finally, the separator needs to be ','. If your CSV file does not follow this format, you can either transform it externally or provide your own data (which is recommended).

From a given DataFrame

The data format must be the same as described above, meaning "Datetime" (str), "Open" (float), "High" (float), "Low" (float), "Close" (float), "Volume" (float), in this order. Datetime must be the DataFrame's index.

Example:

data = pd.read_csv('csv/file/path.csv')
''' Data treatment '''
data = data.set_index("Datetime")
data.loc[:, 'Open':'Close'] = data.loc[:, 'Open':'Close'].astype(float)

class myStrategy(Backtest):
    def __init__(self, data):
        super().__init__(sym='BTCUSD', data=data,
                         initial_time="04/01/2020 01:00:00",
                         dateformat="%d/%m/%Y %H:%M:%S")

myStrategy(data)

Data example:

Open High Low Close Volume
Datetime
01/04/2007 17:15:00 3746.71 3850.91 3707.23 3843.52 4324200990.0
01/04/2007 17:30:00 3849.22 3947.98 3817.41 3943.41 5244856835.0
01/04/2007 17:45:00 3931.05 3935.69 3826.22 3836.74 4530215218.0
01/04/2007 18:00:00 3832.04 3865.93 3783.85 3857.72 4847965467.0
01/04/2007 18:15:00 3851.97 3904.90 4093.30 3836.90 5137609823.0
... ... ... ... ... ...
[m rows x n columns]

Timeperiod

All the data available

If the user aims to backtest the strategy over all the points in the data, the parameters initial_time and final_time should not be modified; they are already set to the very beginning and the last point respectively.

Specific timeframes

If the user wants to backtest certain scenarios, initial_time and final_time need to be modified accordingly, always following the format specified in dateformat.

super().__init__(sym="GOLD", data=your_data,
                 initial_time="01/01/2015 00:00:00",
                 final_time="01/01/2020 23:55:00",
                 dateformat="%Y-%m-%d%H:%M:%S")

Extras

Technical Indicators

The technical indicators that come with the alphabacktest module are the SMA, EMA, RSI, Bollinger Bands and MACD. However, if anyone wants to change this, the indicators() method can be redefined.

from bcktclasses_cy import Backtest

class mStrategy(Backtest):
    def __init__(self):
        super().__init__(ticker="AMZN")

    def strategy(self, _open, close, high, low, vol, dtime):
        ''' Your strategy '''

    def indicators(self, close):
        from talib import RSI, BBANDS, MACD, EMA, SMA
        # ...
        ''' Define your indicators '''

if __name__ == '__main__':
    mStrategy()

Otherwise, if the user does not wish to use any of them, the indicators calculation can be deactivated by setting indicators=False in the super().__init__() declaration.

Trading environment conditions

In order to simulate real brokerage activity, orders are not placed and executed right away. Instead, they first pass through a check and are placed in the following period. This adds more realism and corresponds to a realistic, pessimistic scenario.

Moreover, the parameters the engine requires to reproduce this behaviour are set by default, but the user can change them and adapt them to the conditions set by their broker. The attributes are the following.

- slippage. This parameter refers to the difference between the price at which the order is placed and the price at which the trade is executed (CFI). Although slippage can in reality be zero, positive or negative, in alphabacktest it is considered to always play against the interests of the trader.
- leverage. Investopedia describes leverage as follows: "Leverage refers to the use of debt (borrowed funds) to amplify returns from an investment or project." In this module, leverage is a multiplier that refers to the n times your capital is increased by taking on debt. Leverage can range from 1 (meaning no debt) up to 400 depending on the financial product (again, thanks Investopedia). Nonetheless, this number is usually set by your broker.
- fees. This refers to the commissions the broker charges per unit of capital invested, meaning it depends on the amount the trader allocates to the trade.
- capital. Total initial amount of liquidity the account is provided with. It is not considered to be in any specific currency, just the currency the security is traded in. Therefore, the backtest does not consider fluctuations due to the evolution of the FX markets.

Results treatment

The default configuration saves the results as CSV files inside a folder named backtest_results that the engine creates in the cwd, and runs a Dash app where the results are plotted and the statistics presented. In case the user wants to deactivate any of these features, the initialisation allows it by setting save_results or plot_results (bool) to False.
Moreover, if the user wants to change the directory where the results are stored, save_path (str) is the parameter to customize.

Example:

super().__init__(sym="", initial_time="first", final_time="last",
                 dateformat="%Y-%m-%d", file_path="", ticker=None,
                 indicators=True, slippage=0.0001, leverage=1, fees=0.005/100,
                 capital=20000, save_results=True,
                 save_path="your/preferred/directory", plot_results=False)

Class attributes

The attributes of the inherited class are the following.

self.user_positions

Pandas DataFrame containing the open positions.

Example:

Security OPrice ODate CPrice CDate Amount PNL Performance
btoHxGN SP500 3653.884575 30/11/2020 20:30:00 - - -20 4.037615 0.110502self.closed_positionsPandas DataFrame containing the history of the closed positions.Example:Security OPrice ODate CPrice CDate Amount PNL Performance
l7h7eRx SP500 3517.898175 12/10/2020 10:30:00 3482.75 14/10/2020 11:00:00 -20 699.445602 19.882486
eO9fL5Q SP500 3444.844450 15/10/2020 04:00:00 3476.75 15/10/2020 14:30:00 20 634.666156 18.423652
sIWnkka SP500 3503.649600 16/10/2020 09:45:00 3473.50 16/10/2020 15:00:00 -20 599.488350 17.110397
XihBLba SP500 3423.092275 19/10/2020 14:00:00 3450.00 20/10/2020 05:30:00 20 534.731408 15.621297
Dm2U5jQ SP500 3418.591825 26/10/2020 02:45:00 3396.25 26/10/2020 10:00:00 20 -450.255092 -13.170777
EAtb5UL SP500 3388.088775 26/10/2020 10:30:00 3361.75 26/10/2020 12:30:00 20 -530.163589 -15.647866
3ua0S1x SP500 3386.088575 26/10/2020 13:30:00 3360.00 27/10/2020 17:15:00 20 -525.157589 -15.509269
...self.user_portfolioPandas DataFrame with the representation of the assets in the user's portfolio and the total value of their wallet.Example:Security Amount Value
0 SP500 -20 -73070.0self.tradesPandas DataFrame with the information on the executed trades.Example:Security Type Datetime Price Amount
lfdXIFo SP500 Sell 12/10/2020 10:15:00 3516.00 1
6sp7cHv SP500 Close#l7h7eRx 14/10/2020 10:45:00 3499.25 -20
JXKYklr SP500 Buy 15/10/2020 03:45:00 3444.25 1
32Bc0fR SP500 Close#eO9fL5Q 15/10/2020 14:15:00 3470.00 20
oP3w0Ha SP500 Sell 16/10/2020 09:30:00 3507.00 1
XPTZCZE SP500 Close#sIWnkka 16/10/2020 14:45:00 3484.00 -20
izB77xS SP500 Buy 19/10/2020 13:45:00 3432.25 1
4qGx5Ld SP500 Close#XihBLba 20/10/2020 05:15:00 3443.50 20
...self.ordersPandas DataFrame with the information of all orders either if they are placed or not.Example:Security Type Datetime Price Amount Status
DsN3JXP SP500 Sell 12/10/2020 10:15:00 3516.00 1 Executed
Ztfajo7 SP500 Close#l7h7eRx 14/10/2020 10:45:00 3499.25 -20 Executed
VOiZwri SP500 Buy 15/10/2020 03:45:00 3444.25 1 Executed
2hUedWm SP500 Close#eO9fL5Q 15/10/2020 14:15:00 3470.00 20 Executed
O2j9Sjf SP500 Sell 16/10/2020 09:30:00 3507.00 1 Executed
lStGqRL SP500 Close#sIWnkka 16/10/2020 14:45:00 3484.00 -20 Executed
9lcGAFG SP500 Buy 19/10/2020 13:45:00 3432.25 1 Executed
X86B0s7 SP500 Close#XihBLba 20/10/2020 05:15:00 3443.50 20 Executed
...

self.free_balance

Represents the free margin the account has.

Class methods

The methods of the inherited class are the following.

self.long_order(security, amount, dtime, price)

Sends a long (buy) order to the broker. The parameters are:
- security: str. Name of the security to be traded; the engine needs it to account for the traded asset.
- amount: int. Quantity of contracts the order aims at. The minimum quantity is 1 and only whole (integer) numbers are accepted.
- dtime: str. Time at which the long order is placed, in the specified dateformat.
- price: float. Price at which the order is aimed. This price has no influence, but is useful for further analysis of the results of one's trades.

self.short_order(security, amount, dtime, price)

Sends a short (sell) order to the broker. The parameters are:
- security: str. Name of the security to be traded; the engine needs it to account for the traded asset.
- amount: int. Quantity of contracts the order aims at. The minimum quantity is 1 and only whole (integer) numbers are accepted.
- dtime: str. Time at which the short order is placed, in the specified dateformat.
- price: float. Price at which the order is aimed. This price has no influence, but is useful for further analysis of the results of one's trades.

self.closing_order(p_id, dtime, price)

Sends a closing order to the broker. The parameters are:
- p_id: str. The position ID, generated randomly once a position starts after the trade is executed. It is placed at the index of the positions DataFrame and allows the selection of a particular position.
- dtime: str. Time at which the closing order is placed, in the specified dateformat.
- price: float. Price at which the order is aimed. This price has no influence, but is useful for further analysis of the results of one's trades.

self.has_positions()

Returns a bool indicating whether there are open contracts: True if there are open contracts and False if there is no open position. Takes no parameters.

self.get_positions(security, _open=True)

Returns a pandas DataFrame with the positions related to the specified security. The parameters are:
- security: str. Name of the security of the positions.
- _open: bool. Refers to the state of the positions to be returned. If the method is called with _open=True, the currently open positions are returned; if it is called with _open=False, the method returns all positions regardless of their state.

self.get_long_positions(security, _open=True)

Returns a pandas DataFrame with only long positions. The parameters are:
- security: str. Name of the security of the positions.
- _open: bool. Refers to the state of the positions to be returned. If the method is called with _open=True, the currently open positions are returned; if it is called with _open=False, the method returns all positions regardless of their state.

self.get_short_positions(security, _open=True)

Returns a pandas DataFrame with only short positions. The parameters are:
- security: str. Name of the security of the positions.
- _open: bool. Refers to the state of the positions to be returned. If the method is called with _open=True, the currently open positions are returned; if it is called with _open=False, the method returns all positions regardless of their state.

Results

The results are written to CSV files, and a Dash app summarising the strategy performance runs on the local server http://127.0.0.1:port/, as the logs report.

Features

TO DO:
- Multiple assets (threading)

Credits

This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.

.. _Cookiecutter: https://github.com/audreyr/cookiecutter
.. _audreyr/cookiecutter-pypackage: https://github.com/audreyr/cookiecutter-pypackage
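As a complement to the class methods documented above, here is a minimal sketch of a strategy that both opens and closes positions. It is illustrative only: the exact shape of the positions DataFrame (position IDs in the index, an OPrice column) is assumed from the self.user_positions example and the closing_order description above, not a verified API.

from alphabacktest import Backtest

class CloseAfterGainStrategy(Backtest):
    ''' Illustrative sketch: open one long position, close it after a ~1% gain. '''
    def __init__(self):
        super().__init__(ticker="AMZN")

    def strategy(self, _open, close, high, low, vol, dtime):
        if not self.has_positions():
            # Open a single long position with the available free margin
            q = int(self.free_balance / close[-1])
            if q >= 1:
                self.long_order(security=self.symbol, amount=q,
                                dtime=dtime, price=close[-1])
        else:
            # Assumption: get_positions() returns a DataFrame indexed by position ID
            # with an 'OPrice' column, as in the self.user_positions example above.
            open_positions = self.get_positions(self.symbol, _open=True)
            for p_id, pos in open_positions.iterrows():
                if close[-1] >= pos["OPrice"] * 1.01:
                    self.closing_order(p_id, dtime=dtime, price=close[-1])

if __name__ == '__main__':
    CloseAfterGainStrategy()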
|
alphabase
|
AlphaBase

AlphaBase provides all basic Python functionalities for the AlphaPept
ecosystem from the Mann Labs at the Max Planck Institute of
Biochemistry and the University of
Copenhagen. To enable
all hyperlinks in this document, please view it at GitHub. For documentation,
please see readthedocs.

Contents: About, License, Installation (Pip installer, Developer installer), Usage, Troubleshooting, Citations, How to contribute, Changelog

About

The infrastructure package of the AlphaX ecosystem for MS proteomics. It was first published with AlphaPeptDeep, see Citations.

Packages built upon AlphaBase

- AlphaPeptDeep: deep learning framework for proteomics.
- AlphaRaw: raw data reader for different vendors.
- AlphaDIA: DIA search engine.
- PeptDeep-HLA: personalized HLA-binding peptide prediction.
- AlphaViz: visualization for MS-based proteomics.
- AlphaQuant: quantification for MS-based proteomics.

Citations

Wen-Feng Zeng, Xie-Xuan Zhou, Sander Willems, Constantin Ammar, Maria Wahle, Isabell Bludau, Eugenia Voytik, Maximillian T. Strauss & Matthias Mann. AlphaPeptDeep: a modular deep learning framework to predict peptide properties for proteomics. Nat Commun 13, 7238 (2022). https://doi.org/10.1038/s41467-022-34904-3

License

AlphaBase was developed by the Mann Labs at the Max Planck Institute of Biochemistry and the University of Copenhagen and is
freely available with an Apache License. External Python
packages (available in the requirements folder) have
their own licenses, which can be consulted on their respective websites.

Installation

AlphaBase can be installed and used on all major operating systems
(Windows, macOS and Linux). There are two different types of
installation possible:

- Pip installer: Choose this installation if you want to use
AlphaBase as a Python package in an existing Python 3.8 environment
(e.g. a Jupyter notebook).
- Developer installer: Choose this installation if you
are familiar with conda and
Python. This installation allows access to all available features of
AlphaBase and even allows you to modify its source code directly.
Generally, the developer version of AlphaBase outperforms the
precompiled versions which makes this the installation of choice for
high-throughput experiments.

Pip

AlphaBase can be installed in an existing Python 3.8 environment with a
single bash command. This bash command can also be run directly
from within a Jupyter notebook by prepending it with a !:

pip install alphabase

Installing AlphaBase like this avoids conflicts when integrating it in
other tools, as this does not enforce strict versioning of dependencies.
However, if new versions of dependencies are released, they are not
guaranteed to be fully compatible with AlphaBase. While this should only
occur in rare cases where dependencies are not backwards compatible, you
can always force AlphaBase to use dependency versions which are known to
be compatible with:

pip install "alphabase[stable]"

NOTE: You might need to run pip install -U pip before installing
AlphaBase like this. Also note the double quotes ".

For those who are really adventurous, it is also possible to directly
install any branch (e.g. @development) with any extras
(e.g. #egg=alphabase[stable,development-stable]) from GitHub with e.g.

pip install "git+https://github.com/MannLabs/alphabase.git@development#egg=alphabase[stable,development-stable]"

Developer

AlphaBase can also be installed in editable (i.e. developer) mode with a
few bash commands. This allows you to fully customize the software and
even modify the source code to your specific needs. When an editable
Python package is installed, its source code is stored in a transparent
location of your choice. While optional, it is advised to first (create
and) navigate to e.g. a general software folder:

mkdir ~/folder/where/to/install/software
cd ~/folder/where/to/install/software

The following commands assume you do not perform any additional cd
commands anymore.

Next, download the AlphaBase repository from GitHub either directly or
with a git command. This creates a new AlphaBase subfolder in your
current directory.

git clone https://github.com/MannLabs/alphabase.git

For any Python package, it is highly recommended to use a separate
conda virtual environment, as
otherwise dependency conflicts can occur with already existing
packages.

conda create --name alphabase python=3.9 -y
conda activate alphabase

Finally, AlphaBase and all its dependencies need to be
installed. To take advantage of all features and allow development (with
the -e flag), this is best done by also installing the development
dependencies instead of only
the core dependencies:

pip install -e "./alphabase[development]"

By default this installs loose dependencies (no explicit versioning),
although it is also possible to use stable dependencies
(e.g. pip install -e "./alphabase[stable,development-stable]").

By using the editable flag -e, all modifications to the AlphaBase
source code folder are directly reflected when running
AlphaBase. Note that the AlphaBase folder cannot be moved and/or renamed
if an editable version is installed. In case of confusion, you can
always retrieve the location of any Python module with e.g. the command
import module followed by module.__file__.

Usage

TODO

Troubleshooting

In case of issues, check out the following:

- Issues: Try a few
different search terms to find out if a similar problem has been
encountered before.
- Discussions:
Check if your problem or feature request has been discussed before.

How to contribute

If you like this software, you can give us a star to boost our
visibility! All direct contributions are also welcome. Feel free to post
a new issue or clone the
repository and create a pull
request with a new branch.
For an even more interactive participation, check out the discussions and the Contributors License Agreement.

Changelog

See HISTORY.md for a full overview of the changes made
in each version.
|
alphabases
|
No description available on PyPI.
|
alphabet
|
alphabet uses various methods to recognize text.

Installation

The easiest way to install the package is via pip:

$ pip install alphabet

Usage

Obfuscation

from alphabet import alphabet

key = "foobar"
s = alphabet.alphabet("python")
print(s)
> python
t = s.obfuscate(key)
print(bytes(t, 'utf-8'))
> b'\x16\x16\x1b\n\x0e\x1c'
print(t.obfuscate(key))
> python

Identify a string

from alphabet import alphabet

alphabet('%!').identify()
> 'PostScript document text'
alphabet('import os').identify()
> 'Python'
alphabet('<div>foobar</div>').identify()
> 'XML'
alphabet('Привет').identify()
> 'ru'
|
alphabet2kana
|
alphabet2kana

Convert English alphabet to Katakana.

The Japanese (Katakana) readings of the alphabet follow UniDic and the Wikipedia article 英語アルファベット (English alphabet). In particular, Z is rendered as ゼット (zetto).

Installation

pip install alphabet2kana

Usage

from alphabet2kana import a2k

a2k("ABC")                                # "エービーシー"
a2k("Alphabetと日本語")                    # "エーエルピーエイチエービーイーティーと日本語"
a2k("Alphabetと日本語", delimiter="・")    # "エー・エル・ピー・エイチ・エー・ビー・イー・ティーと日本語"
a2k('k8s', delimiter='・', numeral=True)  # "ケー・エイト・エス"

Only half-width characters are supported.
Full-width alphabet characters should first be converted to half-width with tools such as mojimoji or jaconv.
|
alphabet5-ping
|
copy credentials
|
alphabeta
|
alphabeta is a collection of tools for A/B testing in Python.
|
alphabet-detector
|
UNKNOWN
|
alphabetic-number
|
# alphabetic-number
Convert a number to alphabetical form.
|
alphabetic-simple
|
Django template tag for building an alphabetical index.

Link to repository: https://github.com/Arpaso/alphabetic-simple

Builds an alphabetic index to navigate through a collection sorted by first letter.
Supports english and russian groups of alphabets.

Usage

view.py:

from django.views.generic.list_detail import object_list
from alphabetic.utils import alphabetic_setup
from .models import MyModel

def myview(request):
    ...
    queryset = MyModel.objects.all()
    return object_list(request, alphabetic_setup(request, queryset, 'last_name'), template_name=template)

template.html:

{% show_alphabetic_filter %}

alphabetic_setup(request, queryset, 'last_name') - returns the queryset sorted in alphabetical order by the first letter of
the attribute name, e.g. last_name or whatever attribute of the model you specified.

show_alphabetic_filter - template tag that shows a clickable alphabet in the template. Clicking on a letter will produce a GET request to the current URL with the tail ?firstletter=X,
where X is the clicked letter.

Written by the development team of the Arpaso company: http://arpaso.com
|
alphabetic-timestamp
|
Alphabetic Timestamp

This is a small Python package which encodes a standard timestamp into a shorter form using alphabetic symbols.

Installation

pip install alphabetic-timestamp

Description

The user can choose between two bases for coding, in other words two different lists of symbols for encoding:

- Base36
- Base62

Example of coding

This example shows the coding of the datetime 2020-01-01 20:20:20.002000, i.e. the timestamp 1577906420.002.

Time Units     Base36     Base62
seconds        q3g0dw     1IMJxy
deciseconds    78yg3uw    hdRlpu
centiseconds   20hkh2kw   2MeBs6Q
milliseconds   k4voqpsy   rMm2x6q

Symbols of Base 36

import alphabetic_timestamp as ats
print(ats.base36.symbols)
>>> 0123456789abcdefghijklmnopqrstuvwxyz

Symbols of Base 62

import alphabetic_timestamp as ats
print(ats.base62.symbols)
>>> 0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ

Interface

The example shows the interface of base36. The interface is the same for base62.

import datetime
import alphabetic_timestamp as ats

dt = datetime.datetime.now()
ts = dt.timestamp()

ats.base36.now(time_unit=ats.TimeUnit.seconds)
ats.base36.from_datetime(dt, time_unit=ats.TimeUnit.seconds)
ats.base36.from_timestamp(ts, time_unit=ats.TimeUnit.seconds)
ats.base36.to_datetime("q67vhw", time_unit=ats.TimeUnit.seconds, time_zone=None)
ats.base36.to_timestamp("q67vhw", time_unit=ats.TimeUnit.seconds)

Note: The string "q67vhw" is only an example of an encoded timestamp.

Examples

This package is compatible with Python 2.7 and Python 3.4+. However, these examples are written in Python 3.6.

Encode & Print

import alphabetic_timestamp as ats
print(ats.base36.now())  # Current DT: 2020-02-24 18:34:44.349162
>>> q67vhw

Encode & Decode

import datetime
import alphabetic_timestamp as ats

# DATETIME -> ALPHABETIC_TIMESTAMP -> DATETIME
dt = datetime.datetime.now()
alphabetic_ts_36 = ats.base36.from_datetime(dt)
decoded_ts_36 = ats.base36.to_datetime(alphabetic_ts_36)
alphabetic_ts_62 = ats.base62.from_datetime(dt)
decoded_ts_62 = ats.base62.to_datetime(alphabetic_ts_62)

# TIMESTAMP -> ALPHABETIC_TIMESTAMP -> TIMESTAMP
ts = dt.timestamp()
alphabetic_ts_36 = ats.base36.from_timestamp(ts)
decoded_ts_36 = ats.base36.to_timestamp(alphabetic_ts_36)
alphabetic_ts_62 = ats.base62.from_timestamp(ts)
decoded_ts_62 = ats.base62.to_timestamp(alphabetic_ts_62)

Time units

import datetime
import alphabetic_timestamp as ats

dt = datetime.datetime.now()

# Set time unit for current timestamp
now36_ts = ats.base36.now()  # default: ats.TimeUnit.seconds
now36_ts = ats.base36.now(time_unit=ats.TimeUnit.seconds)

# Set time unit for specific datetime and timestamp
alphabetic_ts_36 = ats.base36.from_datetime(dt, time_unit=ats.TimeUnit.seconds)
alphabetic_ts_36 = ats.base36.from_timestamp(dt.timestamp(), time_unit=ats.TimeUnit.seconds)

# Examples of available time units
now36_ts = ats.base36.now(time_unit=ats.TimeUnit.seconds)
now36_ts = ats.base36.now(time_unit=ats.TimeUnit.deciseconds)
now36_ts = ats.base36.now(time_unit=ats.TimeUnit.centiseconds)
now36_ts = ats.base36.now(time_unit=ats.TimeUnit.milliseconds)

Possible Issue

There is a possible issue caused by a bug in the standard datetime module.
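As a footnote to the Example of coding section above: the encoding is simply the integer timestamp written in the chosen base. A minimal sketch (not the package's own code, just the underlying idea) reproduces the seconds row of the table:

def to_base(n, symbols):
    """Write non-negative integer n using the given symbol string, most significant digit first."""
    if n == 0:
        return symbols[0]
    base = len(symbols)
    digits = []
    while n:
        n, rem = divmod(n, base)
        digits.append(symbols[rem])
    return "".join(reversed(digits))

symbols36 = "0123456789abcdefghijklmnopqrstuvwxyz"
print(to_base(1577906420, symbols36))  # 'q3g0dw', matching the seconds/Base36 cell above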
|
alphabetize
|
Alphabetize

Alphabetize finds grouped lines of variables within your files and orders them alphabetically. This is useful for cleaning up codebases.

The organisation priority is:
- independent variables
- dependent variables

The variable 'a' is ordered last as it depends on the other variables.

b = 10
c = 20
d = 40
a = b + c + d

UPPERCASE and lowercase variables are separated when ordered.

A = 10
B = 20
C = 30
a = 10
b = 20
c = 30

Installation

pip install alphabetize

Usage

alphabetize myfile.py
alphabetize path/to/myfile.py

The provided argument can either be the relative path or absolute path to a Python file.

Examples

Example 1 - single use

Consider the following Python script (unordered_code.py):

import datetime
from time import time
# First Variable Block
c_variable = 60
A_variable = 10
a_variable = 40
B_variable = 20
b_variable = 50
C_variable = 30
class TestClass:
def __init__(self):
self.c_variable = 30
self.a_variable = 10
self.b_variable = 20
def test_function():
c_variable = time()
a_variable = 10
b_variable = datetime
a_list = [a_variable, b_variable, c_variable]
bb_variable = 20
aa_variable = 10
cc_variable = aa_variable + bb_variable
return a_list, cc_variable

Calling:

alphabetize unordered_code.py

Results in the following output:

import datetime
from time import time
# First Variable Block
A_variable = 10
B_variable = 20
C_variable = 30
a_variable = 40
b_variable = 50
c_variable = 60
class TestClass:
def __init__(self):
self.a_variable = 10
self.b_variable = 20
self.c_variable = 30
def test_function():
a_variable = 10
b_variable = datetime
c_variable = time()
a_list = [a_variable, b_variable, c_variable]
aa_variable = 10
bb_variable = 20
cc_variable = aa_variable + bb_variable
return a_list, cc_variable

Example 2 - multiple uses

Depending on the variable names found within grouped lines, alphabetize can be called multiple times to further reorder the grouped lines of variables. This particularly comes into play when independent and dependent variables are mixed within the same grouped-lines block.

Consider the following Python script (unordered_code_multi.py):

def test_function():
c_variable = 30
a_variable = 10
b_variable = 20
list = [a_variable, b_variable, c_variable]
bb_variable = 20
aa_variable = 10
cc_variable = aa_variable + bb_variable
return list, cc_variable

Calling alphabetize unordered_code_multi.py for the first time produces:

def test_function():
a_variable = 10
b_variable = 20
c_variable = 30
aa_variable = 10
bb_variable = 20
list = [a_variable, b_variable, c_variable]
cc_variable = aa_variable + bb_variable
return list, cc_variable

Then calling alphabetize unordered_code_multi.py a second time produces a further ordered file:

def test_function():
a_variable = 10
aa_variable = 10
b_variable = 20
bb_variable = 20
c_variable = 30
cc_variable = aa_variable + bb_variable
list = [a_variable, b_variable, c_variable]
return list, cc_variable

Recommended Running

When using alphabetize it is recommended that you lint and format your files in the following order:

- flake8, which is a wrapper around the tools Pyflakes, pycodestyle and Ned Batchelder's McCabe script
- vulture, which finds unused code in Python programs
- alphabetize, which finds and orders variables within files

It is recommended to run alphabetize a second time to catch any remaining dependent variables, and then a further flake8 run to ensure your file is formatted and adheres to PEP 8 correctly.
|
alphabetize-codeowners
|
No description available on PyPI.
|
alphabets
|
Alphabet Module by Aditya Shrivastava [version 0.1]. Copyright (c) 2020 Aditya Shrivastava. All rights reserved. Visit myprogramarchives.blogspot.com for more information
|
alphabetsoup
|
al·pha·bet soup
/ˈalfəˌbet so͞op/
noun, INFORMAL
incomprehensible or confusing language, typically containing many abbreviations or symbols.

alphabetsoup fixes problems with protein FASTA files:

Problem                    | Fix
---------------------------|----------------------
Unknown character          | Replaces with 'X'
Stop/ambiguous at ends     | Trims
Stop/ambiguous in middle   | Optionally fragments
Too short                  | Deletes record
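To make the table concrete, here is a minimal sketch of the kinds of clean-ups listed above, written against plain Python strings rather than the package's actual FASTA-handling internals; the function name, the minimum length and the character sets are illustrative assumptions only.

# Illustrative sketch -- not alphabetsoup's actual implementation.
AMBIGUOUS = set("*XBZJUO")           # assumption: stop '*' plus ambiguous residue codes
VALID = set("ACDEFGHIKLMNPQRSTVWY")  # the 20 standard amino acids

def clean_protein_sequence(seq, min_len=10):
    """Apply the table's fixes to one protein sequence, or return None to delete the record."""
    # Unknown character -> replace with 'X'
    seq = "".join(c if c in VALID or c in AMBIGUOUS else "X" for c in seq.upper())
    # Stop/ambiguous at ends -> trim
    seq = seq.strip("".join(AMBIGUOUS))
    # (the optional fragmenting on mid-sequence stops is omitted for brevity)
    # Too short -> delete record
    return seq if len(seq) >= min_len else None

print(clean_protein_sequence("**MKT?LLILAVV*"))  # -> 'MKTXLLILAVV'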
|
alphabet-soup-lambert
|
Alphabet Soup

You have been contracted by a newspaper and tasked with providing an answer key to their word search for the
Sunday print. The newspaper's word search is a traditional game consisting of a grid of characters in which a selection
of words has been hidden. You are provided with the list of words that have been hidden and must find the words within
the grid of characters.

Requirements

Load a character grid with scrambled words embedded within it and a word list of the words to find. The following
conditions apply:
- Within the grid of characters, the words may appear vertical, horizontal or diagonal.
- Within the grid of characters, the words may appear forwards or backwards.
- Words that have spaces in them will not include spaces when hidden in the grid of characters.

Execution

When approaching this problem, I first made a list of what I needed: a way to read and store the words and the grid from
the given file. I also needed to write functions that could search through the grid as well as a way to pass the
retrieved locations around. I decided a dictionary would be best for this so that when printing the output to the
console, a for-loop could just run through it. I also created a list to store the words and a 2D list to store the
grid.Analyze is where the file is read and parsed. Word_find is the function that is used to create the dictionary and to
call each of the search functions. When running through the searches, I wanted to create a separate function for each
direction so that the code would be cleaner and easier to test. Vertical and horizontal are the two most straightforward
functions- based on row and column. The diagonal searches were mostly trial and error using print statements to see
which parts of the grid were being accessed. Using join on the list seemed the easiest way to search for a word. The
locations list of tuples in each diagonal method made it easier to track the locations of the elements based on where
they were located in the current string.When testing, I created analyze as a separate function so that tests could be run without requesting the
input file. I did do some research and attempted to make it work with a mock input file. That is something that I am
going to continue to work with so that I can learn it. As a workaround, I created an output text file that analyze can
write to for comparison with the answer files while running the tests. There is a test for each direction that tests
words going forward and backward within the grid. There is also a test that has words going in every direction to
ensure that all the functions work together.

Sample Data

The following was used as one of the sample input and output datasets.

Input

10x10
H A S S T A S T U I
S E Y B S P L E A T
B K R N A E J A U E
I V G D F K T S B N
G O O D E Y I S K L
W N O E L D N I I T
M M S L T R K X Q L
Q U E E N J W K S A
S A D T I V I T C A
P I R X T O V I P Q
TINT
HERDED
GOOD
LIST
LEAP
SPA
QUEEN
TEAS
GOOSE
KNIT
SEEK
TEAS

Output

TINT 9:4 6:4
TEAS 0:7 3:7
SEEK 6:2 3:5
QUEEN 7:0 7:4
LIST 6:9 3:6
HERDED 0:0 5:5
GOOSE 3:2 7:2
LEAP 6:3 9:0
GOOD 4:0 4:3
SPA 0:6 2:4
KNIT 6:6 3:6
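To illustrate the row-based (horizontal) search described in the Execution section, here is a minimal sketch; the function and variable names are illustrative and not the project's actual code. It reports matches as (row, col) start/end pairs, mirroring the row:col format of the output above.

def find_horizontal(grid, word):
    """Search each row, forwards and backwards, for `word`.
    Returns ((start_row, start_col), (end_row, end_col)) or None."""
    for r, row in enumerate(grid):
        line = "".join(row)
        idx = line.find(word)
        if idx != -1:                      # forwards match
            return (r, idx), (r, idx + len(word) - 1)
        idx = line.find(word[::-1])
        if idx != -1:                      # backwards match: starts where the reversed word ends
            return (r, idx + len(word) - 1), (r, idx)
    return None

grid = [row.split() for row in ["G O O D E Y I S K L", "Q U E E N J W K S A"]]
print(find_horizontal(grid, "GOOD"))   # ((0, 0), (0, 3))
print(find_horizontal(grid, "QUEEN"))  # ((1, 0), (1, 4))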
|
alphabetss
|
No description available on PyPI.
|
alphabet-thief
|
alphabet-thief

"Hi, I am an alphabet thief. I steal every Latin alphabet from A to Z."

Requirements

Python >= 3.7

Installation

pip install alphabet_thief

Usage

import alphabet_thief

text = 'Hello, how are you?'
result = alphabet_thief.steal(text)
print(f'"{result}"')
" , ?"

text = 'Hello, how are you?'
result = alphabet_thief.replace(text, 'a')
print(f'"{result}"')
"aaaaa, aaa aaa aaa?"

text = 'Trời mưa thì phải ở nhà.'
result = alphabet_thief.steal(text)
print(f'"{result}"')
" ̛̀ ̛ ̀ ̉ ̛̉ ̀."

text = 'Trời mưa thì phải ở nhà.'
result = alphabet_thief.replace(text, 'a')
print(f'"{result}"')
"aaà̛a aa̛a aaà aaảa ả̛ aaà."
|
alphabhta
|
alphabhta

Utilities for a build system.

Installation

$ pip install alphabhta

License

MIT
|
alphabot
|
Open source Python bot to chat with Slack and, eventually, other platforms. Inspired by Hubot. Alphabot is written in Tornado, combining the power of Python with the speed of coroutines.

Installation

virtualenv .venv
make build

Running the bot

Until this is packaged as a pip package, this is the way to start the bot:

export PYTHONPATH=$(pwd)
export SLACK_TOKEN=xoxb-YourToken
python alphabot/app.py
|
alphabox
|
No description available on PyPI.
|
alpha-bpc
|
Alpha_BPC

Python package for the Alpha version of the Binary Patch Convolution framework.

- Currently only supports MNIST and F-MNIST, using GPUs for convolution.
- Call help(BPC) for more information about the class and its inputs.
- Take a look at mnist_bpc.py and fmnist_bpc.py in the GitHub repository for examples of how to use the framework.

GitHub: https://github.com/amirarfan/alpha_bpc/

History

0.1.2 - First release for Thesis
|
alpha-build-core
|
AlphaBuildAlphaBuild is a simple monorepo build tool based on Make with out-of-the-box support for numerous tools in the
Python, Bash, Jupyter Notebooks, Markdown, YAML ecosystems and with a strong focus on extensibility.The way AlphaBuild works draws inspiration heavily from monorepo build tools such as Pants, Bazel, Buck.
It can run at once multiple linters, formatters, type checkers, hermetic packers, testing frameworks, virtual
environment managers etc.PlatformsUsageGoals - what tools we runTargets - what files we run the tools onAlphaBuild structureIDE IntegrationExample monorepo running AlphaBuildCommon admin actionsInstallationCI/CD SetupUpgradeChange goal definitionsAdd goalsUpdate PYTHONPATHSee/Change tools configThird party environmentsNested MakefilesGenerate requirements.txt for each sub-projectGenerate setup.py for each sub-projectMarkdown badgeOut-of-the-box tools by languageHigh-level comparison with Pants, Bazel, Pre-commit,
MakefilesLimitationsDetailed comparison: Make vs Pre-commit vs Tox/Nox vs Bazel vs PantsPlatformsAlphaBuild works on Linux distributions, MacOS, WSL and Windows with Git Bash.MakeandBashare AlphaBuild's only pre-requisites (Note: Bash with GNU utilities, not BSD utilities).Linux, WSLSince Linux is awesome and WSL follows its footsteps, AlphaBuild should just work there.MacOSAlphaBuild relies heavily on the GNU version offindandegrep, so, if you are running an OS which,
by default, uses BSD rather than GNU (pointing fingers to MacOS here, you may need tobrew installfindutilsandgrep, see:https://xenodium.com/gnu-find-on-macos/)Also, AlphaBuild may not work if you have ancient versions ofMake/Bash, so try upgrading them if some commands
don't seem to work. Macs typically come equipped with ancient versions ofBash.WindowsOn Windows,MakeandBashare not supported out-of-the-box, so it is recommended to use AlphaBuild on Windows within
Git Bash. Note that Git Bash does not come with Make pre-installed. Once you downloaded and installed Git Bash
runbuild-support/git-bash-integration/install_make.shrunning Git Bash as administrator. Alternatively, you may getMakefromconda.UsageUsually to format, lint, type-check, test, package, ... the code, one needs to run a bunch of commands in the terminal,
setting the right flags and the right parameters. This Make-based build system helps with running these commands with
a very simple interface:make <goal> <optional-targets>wheregoal = what tools we runtargets = over which files we run these tools.Goals - what tools we runGoals mean what command line tools to run. This build system can run one more tools at once as follows:Singleindividualtoole.g.make mypy,make flake8,make isortMultiple tools for aspecific languagee.g.make fmt-pyruns all Python formatters e.g.isort,black,docformatter,flynt,autoflakee.g.make fmt-shrunsshfmte.g.make lint-pyruns Python all linters and all formatters in "verification" mode, that isflake8+pylint+
check whether the code is already formatted withisort,black,docformatter,flynt,autoflakemake fmt-md,make lint-yml,test-sh,test-py... work similarlyMultiple tools formultiple languagese.g.make fmtruns all formatters for all supported languages (Python, Bash, Markdown, YAML, ...)e.g.make lintruns all linters for all supported languagese.g.make testruns all test suites for all supported languagesIt is possible to run multiple goals at once likemake lint test. In addition, it is very easy to change the meaning
of goals that run more than one command since they are very simply defined in Make based on other goals. For example,
one can remove theshfmtfrom bash linting simply by doing the below:# Beforelint-sh:shellcheckshfmt-check# where shellcheck and shfmt-check run the respective commands# Afterlint-sh:shellcheckPer-tool config files (e.g.mypy.ini,pyproject.toml) are typically places inbuild-support/<language>/tools-config/.Targets - what files we run the tools onWe have seen that Make gives us the power to run multiple terminal commands effortlessly. Using a Makefile like
described above is standard practice in many projects, typically running the different tools over all their files.
However, as projects grow, the need to run these tools at different granularities (e.g. in a specific directory,
over a given file, on the diff between two branches, since we last committed etc). This is where targets come into play.With default targetsThe default targets per-language are defined at the top of theMakefilein language specific variables
e.g.ONPY=py-project1/ py-project2/ script.pyandONSH=scripts/.make lintruns:all Python linters on all directories (in the$ONPY) that contain Python/stub files.all notebook linters on all directories (in$ONNB) that contain.ipynbfiles.all Bash linters (shellcheck) on all directories (in$ONSH) that contain Bash files.a Haskell linter (hlint) on all directories (in$ONHS) that contain Haskell files.a YAML linter (yamllint) on all directories (in$ONYML) that contain YAML files.make lint,make fmt -j1,make type-checkwork similarlyremember that the$(ONPY),$(ONSH), ... variables are defined at the top of the Makefile and represent the
default locations where AlphaBuild searches for files certain languages.With specific targetsTo specify manually the files/directories you want to run your tools on, AlphaBuild leverages the "on" syntax:file:make lint on=app_iqor/server.pyruns all Python linters on the file, same
asmake lint-py on=app_iqor/server.pydirectory:make lint on=lib_py_utilsruns a bunch of linters on the directory, in this case, same
asmake lint-py on=lib_py_utils/.files/directories:make lint on="lib_py_utils app_iqor/server.py"runs a bunch of linters on both targets.globs:make lint on=lib_*aliases:make fmt on=iqorwhere at the top of the Makefileiqor=app_iqor/iqor app_iqor/test_iqor, this is the
same asmake fmt on=app_iqor/iqor app_iqor/test_iqorbecauseiqoris an alias forapp_iqor/iqor app_iqor/test_iqor. Even though this example is simplistic, it is useful to alias combinations of
multiple files/directories. It is recommended to set aliases as constants in the Makefile even though environment
variables would also work.same formake fmt,make test,make type-check.With git revision targetsmake fmt -j1 since=masterruns all formatters on the diff between the current branch and master.make fmt -j1 since=HEAD~1runs all formatters on all files that changed since "2 commits ago".make lint since=--cachedruns all linters on all files that are "git added".all goals that support the "on" syntax also support the "since" syntaxMixed "on" and "since"One can use the "on" and "since" syntaxes at the same time. For example:make lint on=my_dir/ since=HEAD~2will run all linters on all files inmy_dir/that changed since "3 commits ago".ConstraintsDifferent languages may have different goals, for example Python can be packaged hermetically with Shiv, while Bash
obviously can't.The following goals must support the "on" and "since" syntax and ensure that they are only run if there are any targets
for the language they target:formatlinttype-checktestIf you want to learn more about the API of a specific goal, check the source code.AlphaBuild structureMakefile:AlphaBuild's entry point, this is where all components come together.3rdparty/:Files required to build environments of 3rd party dependencies (e.g. requirements.txt files,
package.json or lock files)build-support/:Makefile library inspired by Pants/Bazel to run linters, formatters, test frameworks, type
checkers, packers etc. on a variety of languages (Python, Jupyter Notebooks, Bash, Haskell, YAML, Markdown)The flags used per-tool (e.g. setting the paths to config files) can be found inbuild-support/alpha-build/config/<lang>.mkThe core part of AlphaBuild lives inbuild-support/alpha-build/core/, this comprises the build-system backboneresolver.mkand recipes to run lots of readily-available tools. This should be the same for all monorepos that
use AlphaBuild.By convention, repo-specific custom goals go inbuild-support/alpha-build/extensions/following the examples incore.build-support/<other-programming-lang-than-make>/contain things like config files for each tool and other files
required for your custom AlphaBuild goals.IDE IntegrationPyCharm / IntelliJWindows: On Windows, it is advised to set Git Bash as the default "Terminal". In "settings" search for "terminal",
go to "Tools → Terminal" and set "Shell Path" to something likeC:\Program Files\Git\bin\bash.exe.PYTHONPATH: If you are writing Python, please mark the directories for each project as "Sources Roots", such that
PyCharm discovers your imports.AlphaBuild hotkeys: It is easy to use AlphaBuild with hotkeys. For example, mappingAlt+Ftomake fmt -j1 on=<current-file>andAlt+Ltomake lint on=<current-file>. To set this up, follow the example inbuild-support/ide-integration/windows/to set up an external tool. Unix systems would be similar. Next, just map the
new external tools to your hotkeys.Common admin actionsInstallationUse thiscookiecuttertemplate:https://github.com/cristianmatache/cookiecutter-alpha-build-polyrepo-py.
Feel free to extend the project such that it becomes a monorepo.UpgradeTo upgrade an existing installation if new tools are added or changes are made to the target resolution infrastructure,
one would simply need to replace thebuild-support/alpha-build/coredirectory. To do that please run:pipinstallalpha-build-core--targettmp/
tar-xvftmp/alpha_build_core.tar.gz
rm-rftmp/CI/CD setupSince all CI/CD pipelines essentially rely on running some scripts in a certain order, AlphaBuild can be called
directly from any CI/CD pipeline regardless of CI/CD technology provider. AlphaBuild helps with ensuring that both the
CI pipelines and developers run the exact same commands. Since, one can easily select targets within the repo, setting
pipelines on a per sub-project basis, is effortless with AlphaBuild.Change goal definitionsLet's say, for example, you don't want to runpylintas part of your python linting. You would simply go to theMakefileand change the definition of thelint-pygoal to not includepylint.Add goalsThe goals that are available out of the box are found inbuild-support/alpha-build/core/<language>/.
You can extend/replace the core goals for new languages and/or tools by writing.mkcode inbuild-support/alpha-build/extensions/<language>/following the examples inbuild-support/alpha-build/core/.
For example,https://github.com/cristianmatache/workspaceextends AlphaBuild with Prometheus and Alertmanager goals.Update PYTHONPATHThe PYTHONPATH is set at the top of theMakefile. For example, to add new directories to
the PYTHONPATH (i.e. to mark them as sources roots) setPY_SOURCES_ROOTSat the top of the Makefile.See/Change tools configLet's say you want to change the waymypyis configured to exclude some directory from checking. Then head tobuild-support/alpha-build/config/python.mkcheck what is the path to themypyconfig file, go there and update it.
All other tools work similarly.Third party environmentsExact reproduction of the default environment:The recipes to fully replicate the default environment
(mostly usingpip,condaandnpm) are found inbuild-support/alpha-build/core/<langugage>/setup.mk, where they
use dependency files and lock files that can be found in3rdparty/. In practice, runmake env-default-replicateinside a conda environment. Also make sure you also havenpminstalled becausemarkdownlintandbatsbash
testing framework come fromnpm(if you don't need them no need to worry aboutnpmjust exclude themarkdownenvironment rule from the pre-requisites ofenv-default-replicate)Create/Upgrade/Edit default environment:If you want to edit the default environment, for example to add,
remove, constrain packages edit therequirements.txtnot theconstraints.txtfile (in3rdparty/).
Theconstraints.txtis only used for reproducibility. If you just want to upgrade your third party dependencies
there is no need to temper with therequirements.txtfiles. Then runmake env-default-upgradeand check the lock
files back into git.Add a new environment:To add a new environment, first add the dependency files (e.g.requirements.txt) in3rdparty/<new-env-name>, add a new goal inbuild-support/alpha-build/extensions. For environment management over
time, we strongly encourage maintaining the approach split between creation/upgrade/edit and exact reproduction of
environments.Nested MakefilesSupposing you want to use a different config file forblackfor a project in your monorepo. You would have 2 options:
change the config file globally frombuild-support/alpha-build/config/python.mk(this would affect other projects) or create
anotherMakefilein your specific project if you just want different settings for your little project (whether this
is a good or a bad idea is more of a philosophical debate, I would argue that globally consistent config files are
preferable, but I acknowledge that this may be needed sometimes).
So, to have nestedMakefiles that work with different config files:# root/Makefileblack.%:# Add this goal to be able to delegate to inner Makefile-s$(MAKE)-C$(subst.,/,$*)custom-blackfmt-py:blackblack.my-proj# Add your custom "black" goals here# root/my-proj/Makefileinclude build-support/alpha-build/core/python/format.mkcustom-black:$(evaltargets:=$(onpy))$(MAKE)blacktargets="$(on)"BLACK_FLAGS="-S --config my-proj/pyproject.toml"This waymake fmt-pyat the root would call the regularblackrule but will also delegate to the inner Makefilecustom-blackrule to run the same tool differently. Don't forget to excludemy-proj/from black in the outer
Makefile, otherwise black would be run twice (once from the outside Makefile and again from the inner one).Generate requirements.txt for each sub-projectRunmake reqs-py.Generate setup.py for each sub-projectRunbuild-support/python/packaging/generate_pip_install_files.pyMarkdown badgeIf you like AlphaBuild, wear the Markdown badge on your repo:[
](https://github.com/cristianmatache/alpha-build)Out-of-the-box tools by languageAlphaBuild has recipes to run the following tools. However, if you don't use some of them you don't need to have them
installed in your environments. For example, let's say you don't usebanditthen you don't need to havebanditinstalled in your environment provided that you are not using thebanditgoal per-se or as part of a composite goal
likelint-pyorlint.Click to expand and see the supported tools!- Python:
- Setup: `pip` / `conda`
- Type-check: `mypy`
- Test: `pytest` (along with `doctest` and other plugins)
- Format + Lint: `black`, `docformatter`, `isort`, `autoflake`, `flynt`, `pre-commit`, `pyupgrade`
- Lint only: `flake8`, `pylint`, `bandit`, `pydocstyle` (along with plugins like `darglint`, `flake8-bugbear`)
- Package: `pipreqs`, `shiv`
- Jupyter:
- Setup: `pip`
- Format + Lint: `jupyterblack`, `nbstripout`
- Lint only: `flake8-nb`
- Bash:
- Setup: `npm` and `conda`
- Test: `bats` (bash testing: `bats-core`, `bats-assert`, `bats-support`)
- Format + Lint: `shfmt`
- Lint only: `shellcheck`
- Haskell:
- Lint: `hlint`
- YAML:
- Setup: `pip`
- Lint: `yamllint`, `prettier`Markdown:Setup:npmFormat + Lint:markdownlint,prettierHTML + CSS:Setup:npmFormat + Lint:prettierJavaScript:Setup:npmFormat + Lint:prettierTypeScript:Setup:npmFormat + Lint:prettierreStructuredText Text (.rst)Lint:rstcheckSwift:Format:SwiftLint,swift-formatLint:SwiftLint,swift-formatKotlin:Format:ktlintLint:ktlintIt is very easy to extend this list with another tool, just following the existing examples.Example monorepo running AlphaBuildTo see AlphaBuild at work in a real-world example checkhttps://github.com/cristianmatache/workspaceout.
Workspace extends AlphaBuild with support for Prometheus, Alertmanager and Grafana.High-level comparison with Pants, Bazel, Pre-commit and traditional MakefilesModern build tools like Pants or Bazel work similarly to AlphaBuild in terms of goals and targets, but they also add
a caching layer on previous results of running the goals. While they come equipped with heavy machinery to support
enormous scale projects, they also come with some restrictions and specialized maintenance and contribution requirements.For example, Pants which, in my opinion, is the most suitable modern build tool for Python doesn't allow building
environments with arbitrary package managers (e.g. conda, mamba), does not work on Windows, prohibits inconsistent
environments (which is good but sometimes simply impossible in practice), does not yet support multiple environments.
Bazel, requires maintaining the dependencies between Python files twice, once as "imports" in the Python files
(the normal thing to do) and twice in some specificBUILDfiles that must be placed in each directory (by contrast
Pants features autodiscovery). Maintaining the same dependencies in two places is quite draining. Of course, these tools
come with benefits like (remote) caching, incrementality and out-of-the-box support for hermetic packaging (e.g. PEXes),
remote execution etc. Moreover, playing with some new command line tools, or new programming languages / types of files
(e.g. Jupyter Notebooks, Markdown, YAML) may be challenging with these frameworks. The Pants community is very welcoming
and supportive towards incorporating new tools, so it would be good to give Pants a try first. However, if any of the
mentioned shortcomings is a hard requirement, Make seems like a good and robust alternative in the meanwhile which
withstood the test of time in so many settings. AlphaBuild's strengths are its flexibility, simplicity, transparency and
tooling richness. One can quickly hack/add a new tool, see the commands that run under the hood and does not need to
worry about BUILD files or the config language.Since AlphaBuild is essentially a script manager (Python, Bash, Perl, anything) enhanced with advanced
target/file/directory selection, AlphaBuild would allow an incremental adoption of large-scale build tools like Pants.
For example, in the main Makefile, one could do:# Makefilelint-with-pants:$(evalon:=::)# Default value if the user does not specify "on" in the terminal command like: make goal on=../pantslint$(on)lint:lint-mdlint-nblint-ymllint-with-pantssuch that running a command like the below would delegate most of the work to Pants while using AlphaBuild's core or
custom capabilities not yet available in Pants (e.g. linting notebooks, markdown or YAML files).makelinton=my-dir/Barepre-commitand typical usages of Make work exceptionally well on small projects, but they don't really scale
well to multi-projects monorepos. The build system proposed here, already incorporatespre-commitand is obviously
compatible with any existing Makefiles. This approach simply takes the idea of advanced target selection and ports it
over to classical techniques like pre-commit and Make.LimitationsSince AlphaBuild is essentially a small-repo tool (Python Makefile) adapted to work on larger codebases (through target
selection), there is a point from where it will no longer be able to scale up. Fortunately, that point is quite far away
from medium-sized repos/teams.In addition, AlphaBuild requires that the commands it builds are shorter thangetconf ARG_MAXcharacters.Detailed comparison: AlphaBuild vs Make vs Pre-commit vs Tox/Nox vs Bazel vs PantsMost small/medium popular Python open source projects use Make, Pre-commit and/or Tox/Nox with a crushing majority
for formatting, linting and/or testing and/or publishing. For the same purposes, fewer projects simply use a bunch
of Bash scripts or monorepo-style build tools like Bazel (or Pants, Buck, Please).
Make, pre-commit, nox/tox work pretty well together in the same repo and are often used so.Note 1:Every time we talk about Pants we talk about Pants V2, which is a fundamentally different product from
Pants V1. Even Twitter gave up on Pants V1 and is moving to Bazel (seehttps://twitter.com/jin_/status/1255133781876330497).Note 2:IMHO Pants (V2) is the best (but not yet perfect) build tool for large monorepos out there and has
great potential.Pros and ConsClick to expand and see the pros and cons of each tool!Tox/Nox:Pros:Good to run the same commands in multiple environmentsCross-platformCons:Does not scale to large reposPython-onlyNotes: can be called from within Make, but can also call Make and pre-commit commands
(see:https://tox.wiki/en/latest/config.html#conf-allowlist_externals)Pre-commit:Pros:Great to manage git hooks (as the name implies)Can run over all files or over a specific set of files but not directories
(slightly more advanced target selection)Sort of incrementalCons:Does not work on Windows (docs say it doesn't but in fact it does partly) [EDIT this is no longer the case.]Typically, very much geared towards format and lint.Notes: can be called from within MakeMake:Pros:Very flexible -> can essentially cover any tools/language (AlphaBuild is just Make with several built-in tools
and target selection) and any way of fetching dependencies, multiple 3rd party environments etc.Very transparent -> easy to supportCross-platformSupports nested MakefilesCons:More scalable than pre-commit/tox/nox but not as scalable or hermetic as Bazel/Pants/Buck/Please
(seehttps://github.com/thought-machine/please#why-please-and-not-makefor details).Notes: can run pre-commit, tox/nox but can also be run from tox/nox (not from pre-commit though)Bazel:Pros:Great for large scale projects (incremental, DAG, remote caching/execution)HermeticCons:Need to maintain special BUILD files in every directory in which all dependencies are written again
(once imported in the code, twice in the BUILD files)Low tool coverage for Python/Jupyter ecosystemSupport for 3rd party Python environments was not great (not sure if it is still the case)Has more sophisticated support needs (dedicated engineers and/or tuned CI infra)Notes: can be called from within MakePants V2:Pros:Great for large scale projects (incremental, DAG, remote caching/execution)Dependencies are auto-discovered in BUILD filesHermeticCons:Environment support: no conda, no multiple environments, no arbitrary ways to create environments, no inconsistent
envsNo support for Windows (only for WSL)Tool/language support could be better (Pants's support for Python is better than Bazel's though)Has more sophisticated support needs (e.g. dedicated engineers and/or tuned CI infra)Does not readily support the equivalent of nested MakefilesNotes:Can be called from within MakeStill need to have BUILD files in every directory (much easier to work with than in Bazel)Users by toolClick to expand and see who uses each tool!Nox, Tox, Pre-commitThese tools are simply extremely popular.Bazelraypytorch (Make for Python, Bash and CMake; in parallel with Bazel for other languages)selenium (with Rake i.e. ruby make) -> example BUILD filehttps://github.com/SeleniumHQ/selenium/blob/trunk/py/BUILD.bazeltensorflowkerasprotobufjax (with pre-commit)Pantshttps://www.pantsbuild.org/page/who-uses-pantsMakeDatapandas (in parallel with pre-commit)vaexpydanticpandera (in parallel with pre-commit, Make also runs nox)MLpytorch (Make for Python, Bash and CMake; in parallel with Bazel for other languages)xgboostDetermined AI (example of a more scalable Make-based infrastructure with nested Makefiles)VisualizationseabornDistributedcelery (in parallel with tox and pre-commit)Stream processingfaust (Make also runs pre-commit)Webrequestsgunicorn (in parallel with tox)sentry-python (Make also runs tox)AsyncIO (Aio-libs)aiohttp (Make also runs pre-commit)yarlaiomysqlaioredis-py (Make also runs pre-commit)aiopgDevOpsansible (in parallel with tox)pytest-testinfra (in parallel with tox)Documentationsphinx: the home repo (in parallel with tox)everything that uses sphinx for documentationPackagingpoetry (in parallel with tox and pre-commit)pyyaml (in parallel with tox)colorama (in parallel with tox)wrapt (Make also runs tox)pydata bottleneckfacebook researchmephistodemucsdiffqdoraapachesuperset (in parallel with tox and pre-commit)IBMlale (in parallel with pre-commit)compliance-trestle (Make also runs pre-commit)redditbaseplate.pybaseplate.py-upgradercqlmapperOther companies: AWS, Lyft, Microsoft, GoCardless, HewlettPackard
|
alpha-build-git-bash-utils
|
Bash Utils for Git Bash on WindowsDescriptionUtils that don't come with Git Bash by default, likeGNU Make,rsyncandzstd.InstructionsGet the code in a temporary location as a one-off:cd<your-repo-root>
pipinstallalpha-build-git-bash-utils--targettmp/Unpack the code:tar-xvftmp/alpha_build_git_bash_utils.tar.gzThis will put the code in<repo-root>/build-support/git-bash-integration/.Remove the temporary download location:rm-rftmp/Run Git Bash as administrator and install the relevant utils.# from repo root./build-support/git-bash-integration/install_<utility>.sh
|
alpha-build-lite-py
|
AlphaBuildAlphaBuild is a simple monorepo build tool based on Make with out-of-the-box support for numerous tools in the
Python, Bash, Jupyter Notebooks, Markdown and YAML ecosystems, and with a strong focus on extensibility.The way AlphaBuild works draws heavily on monorepo build tools such as Pants, Bazel and Buck.
It can run many tools at once: linters, formatters, type checkers, hermetic packers, testing frameworks, virtual
environment managers etc.PlatformsUsageGoals - what tools we runTargets - what files we run the tools onAlphaBuild structureIDE IntegrationExample monorepo running AlphaBuildCommon admin actionsInstallationCI/CD SetupUpgradeChange goal definitionsAdd goalsUpdate PYTHONPATHSee/Change tools configThird party environmentsNested MakefilesGenerate requirements.txt for each sub-projectGenerate setup.py for each sub-projectMarkdown badgeOut-of-the-box tools by languageHigh-level comparison with Pants, Bazel, Pre-commit,
MakefilesLimitationsDetailed comparison: Make vs Pre-commit vs Tox/Nox vs Bazel vs PantsPlatformsAlphaBuild works on Linux distributions, MacOS, WSL and Windows with Git Bash.MakeandBashare AlphaBuild's only pre-requisites (Note: Bash with GNU utilities, not BSD utilities).Linux, WSLSince Linux is awesome and WSL follows in its footsteps, AlphaBuild should just work there.MacOSAlphaBuild relies heavily on the GNU version offindandegrep, so if you are running an OS which,
by default, uses BSD rather than GNU utilities (pointing fingers at MacOS here), you may need tobrew installfindutilsandgrep(see:https://xenodium.com/gnu-find-on-macos/).Also, AlphaBuild may not work if you have ancient versions ofMake/Bash, so try upgrading them if some commands
don't seem to work. Macs typically come equipped with ancient versions ofBash.WindowsOn Windows,MakeandBashare not supported out-of-the-box, so it is recommended to use AlphaBuild on Windows within
Git Bash. Note that Git Bash does not come with Make pre-installed. Once you have downloaded and installed Git Bash,
runbuild-support/git-bash-integration/install_make.shwhile running Git Bash as administrator. Alternatively, you may getMakefromconda.UsageUsually, to format, lint, type-check, test, package, ... the code, one needs to run a bunch of commands in the terminal,
setting the right flags and the right parameters. This Make-based build system helps with running these commands with
a very simple interface:make <goal> <optional-targets>wheregoal = what tools we runtargets = over which files we run these tools.Goals - what tools we runGoals mean what command line tools to run. This build system can run one or more tools at once as follows:Singleindividualtoole.g.make mypy,make flake8,make isortMultiple tools for aspecific languagee.g.make fmt-pyruns all Python formatters e.g.isort,black,docformatter,flynt,autoflakee.g.make fmt-shrunsshfmte.g.make lint-pyruns all Python linters and all formatters in "verification" mode, that isflake8+pylint+
check whether the code is already formatted withisort,black,docformatter,flynt,autoflakemake fmt-md,make lint-yml,test-sh,test-py... work similarlyMultiple tools formultiple languagese.g.make fmtruns all formatters for all supported languages (Python, Bash, Markdown, YAML, ...)e.g.make lintruns all linters for all supported languagese.g.make testruns all test suites for all supported languagesIt is possible to run multiple goals at once likemake lint test. In addition, it is very easy to change the meaning
of goals that run more than one command since they are very simply defined in Make based on other goals. For example,
one can remove theshfmtfrom bash linting simply by doing the below:# Beforelint-sh:shellcheckshfmt-check# where shellcheck and shfmt-check run the respective commands# Afterlint-sh:shellcheckPer-tool config files (e.g.mypy.ini,pyproject.toml) are typically placed inbuild-support/<language>/tools-config/.Targets - what files we run the tools onWe have seen that Make gives us the power to run multiple terminal commands effortlessly. Using a Makefile like
described above is standard practice in many projects, typically running the different tools over all their files.
However, as projects grow, so does the need to run these tools at different granularities (e.g. in a specific directory,
over a given file, on the diff between two branches, since we last committed etc). This is where targets come into play.With default targetsThe default targets per-language are defined at the top of theMakefilein language specific variables
e.g.ONPY=py-project1/ py-project2/ script.pyandONSH=scripts/.make lintruns:all Python linters on all directories (in the$ONPY) that contain Python/stub files.all notebook linters on all directories (in$ONNB) that contain.ipynbfiles.all Bash linters (shellcheck) on all directories (in$ONSH) that contain Bash files.a Haskell linter (hlint) on all directories (in$ONHS) that contain Haskell files.a YAML linter (yamllint) on all directories (in$ONYML) that contain YAML files.make lint,make fmt -j1,make type-checkwork similarlyremember that the$(ONPY),$(ONSH), ... variables are defined at the top of the Makefile and represent the
default locations where AlphaBuild searches for files of certain languages.With specific targetsTo manually specify the files/directories you want to run your tools on, AlphaBuild leverages the "on" syntax:file:make lint on=app_iqor/server.pyruns all Python linters on the file, same
asmake lint-py on=app_iqor/server.pydirectory:make lint on=lib_py_utilsruns a bunch of linters on the directory, in this case, same
asmake lint-py on=lib_py_utils/.files/directories:make lint on="lib_py_utils app_iqor/server.py"runs a bunch of linters on both targets.globs:make lint on=lib_*aliases:make fmt on=iqorwhere at the top of the Makefileiqor=app_iqor/iqor app_iqor/test_iqor, this is the
same asmake fmt on=app_iqor/iqor app_iqor/test_iqorbecauseiqoris an alias forapp_iqor/iqor app_iqor/test_iqor. Even though this example is simplistic, it is useful to alias combinations of
multiple files/directories. It is recommended to set aliases as constants in the Makefile even though environment
variables would also work.same formake fmt,make test,make type-check.With git revision targetsmake fmt -j1 since=masterruns all formatters on the diff between the current branch and master.make fmt -j1 since=HEAD~1runs all formatters on all files that changed since "2 commits ago".make lint since=--cachedruns all linters on all files that are "git added".all goals that support the "on" syntax also support the "since" syntaxMixed "on" and "since"One can use the "on" and "since" syntaxes at the same time. For example:make lint on=my_dir/ since=HEAD~2will run all linters on all files inmy_dir/that changed since "3 commits ago".ConstraintsDifferent languages may have different goals, for example Python can be packaged hermetically with Shiv, while Bash
obviously can't.The following goals must support the "on" and "since" syntax and ensure that they are only run if there are any targets
for the language they target:formatlinttype-checktestIf you want to learn more about the API of a specific goal, check the source code.AlphaBuild structureMakefile:AlphaBuild's entry point, this is where all components come together.3rdparty/:Files required to build environments of 3rd party dependencies (e.g. requirements.txt files,
package.json or lock files)build-support/:Makefile library inspired by Pants/Bazel to run linters, formatters, test frameworks, type
checkers, packers etc. on a variety of languages (Python, Jupyter Notebooks, Bash, Haskell, YAML, Markdown)The flags used per-tool (e.g. setting the paths to config files) can be found inbuild-support/alpha-build/config/<lang>.mkThe core part of AlphaBuild lives inbuild-support/alpha-build/core/, this comprises the build-system backboneresolver.mkand recipes to run lots of readily-available tools. This should be the same for all monorepos that
use AlphaBuild.By convention, repo-specific custom goals go inbuild-support/alpha-build/extensions/following the examples incore.build-support/<other-programming-lang-than-make>/contain things like config files for each tool and other files
required for your custom AlphaBuild goals.IDE IntegrationPyCharm / IntelliJWindows: On Windows, it is advised to set Git Bash as the default "Terminal". In "settings" search for "terminal",
go to "Tools → Terminal" and set "Shell Path" to something likeC:\Program Files\Git\bin\bash.exe.PYTHONPATH: If you are writing Python, please mark the directories for each project as "Sources Roots", such that
PyCharm discovers your imports.AlphaBuild hotkeys: It is easy to use AlphaBuild with hotkeys. For example, mappingAlt+Ftomake fmt -j1 on=<current-file>andAlt+Ltomake lint on=<current-file>. To set this up, follow the example inbuild-support/ide-integration/windows/to set up an external tool. Unix systems would be similar. Next, just map the
new external tools to your hotkeys.Common admin actionsInstallationUse thiscookiecuttertemplate:https://github.com/cristianmatache/cookiecutter-alpha-build-polyrepo-py.
Feel free to extend the project such that it becomes a monorepo.UpgradeTo upgrade an existing installation if new tools are added or changes are made to the target resolution infrastructure,
one would simply need to replace thebuild-support/alpha-build/coredirectory. To do that please run:pipinstallalpha-build-core--targettmp/
tar-xvftmp/alpha_build_core.tar.gz
rm-rftmp/CI/CD setupSince all CI/CD pipelines essentially rely on running some scripts in a certain order, AlphaBuild can be called
directly from any CI/CD pipeline regardless of CI/CD technology provider. AlphaBuild helps with ensuring that both the
CI pipelines and developers run the exact same commands. Since one can easily select targets within the repo, setting up
pipelines on a per-sub-project basis is effortless with AlphaBuild.Change goal definitionsLet's say, for example, you don't want to runpylintas part of your python linting. You would simply go to theMakefileand change the definition of thelint-pygoal to not includepylint.Add goalsThe goals that are available out of the box are found inbuild-support/alpha-build/core/<language>/.
You can extend/replace the core goals for new languages and/or tools by writing.mkcode inbuild-support/alpha-build/extensions/<language>/following the examples inbuild-support/alpha-build/core/.
For example,https://github.com/cristianmatache/workspaceextends AlphaBuild with Prometheus and Alertmanager goals.Update PYTHONPATHThe PYTHONPATH is set at the top of theMakefile. For example, to add new directories to
the PYTHONPATH (i.e. to mark them as sources roots), setPY_SOURCES_ROOTSat the top of the Makefile.See/Change tools configLet's say you want to change the waymypyis configured to exclude some directory from checking. Then head tobuild-support/alpha-build/config/python.mk, check what the path to themypyconfig file is, go there and update it.
All other tools work similarly.Third party environmentsExact reproduction of the default environment:The recipes to fully replicate the default environment
(mostly usingpip,condaandnpm) are found inbuild-support/alpha-build/core/<language>/setup.mk, where they
use dependency files and lock files that can be found in3rdparty/. In practice, runmake env-default-replicateinside a conda environment. Also make sure you havenpminstalled becausemarkdownlintand thebatsbash
testing framework come fromnpm(if you don't need them, there is no need to worry aboutnpm; just exclude themarkdownenvironment rule from the pre-requisites ofenv-default-replicate).Create/Upgrade/Edit default environment:If you want to edit the default environment, for example to add,
remove or constrain packages, edit therequirements.txt, not theconstraints.txtfile (in3rdparty/).
Theconstraints.txtis only used for reproducibility. If you just want to upgrade your third party dependencies
there is no need to tamper with therequirements.txtfiles. Then runmake env-default-upgradeand check the lock
files back into git.Add a new environment:To add a new environment, first add the dependency files (e.g.requirements.txt) in3rdparty/<new-env-name>, then add a new goal inbuild-support/alpha-build/extensions. For environment management over
time, we strongly encourage keeping the split between creation/upgrade/edit and exact reproduction of
environments.Nested MakefilesSuppose you want to use a different config file forblackfor a project in your monorepo. You have two options:
change the config file globally frombuild-support/alpha-build/config/python.mk(this would affect other projects) or create
anotherMakefilein your specific project if you just want different settings for your little project (whether this
is a good or a bad idea is more of a philosophical debate, I would argue that globally consistent config files are
preferable, but I acknowledge that this may be needed sometimes).
So, to have nestedMakefiles that work with different config files:# root/Makefileblack.%:# Add this goal to be able to delegate to inner Makefile-s$(MAKE)-C$(subst.,/,$*)custom-blackfmt-py:blackblack.my-proj# Add your custom "black" goals here# root/my-proj/Makefileinclude build-support/alpha-build/core/python/format.mkcustom-black:$(evaltargets:=$(onpy))$(MAKE)blacktargets="$(on)"BLACK_FLAGS="-S --config my-proj/pyproject.toml"This waymake fmt-pyat the root would call the regularblackrule but will also delegate to the inner Makefilecustom-blackrule to run the same tool differently. Don't forget to excludemy-proj/from black in the outer
Makefile, otherwise black would be run twice (once from the outside Makefile and again from the inner one).Generate requirements.txt for each sub-projectRunmake reqs-py.Generate setup.py for each sub-projectRunbuild-support/python/packaging/generate_pip_install_files.pyMarkdown badgeIf you like AlphaBuild, wear the Markdown badge on your repo:[AlphaBuild](https://github.com/cristianmatache/alpha-build)Out-of-the-box tools by languageAlphaBuild has recipes to run the following tools. However, if you don't use some of them you don't need to have them
installed in your environments. For example, let's say you don't usebanditthen you don't need to havebanditinstalled in your environment provided that you are not using thebanditgoal per-se or as part of a composite goal
likelint-pyorlint.Click to expand and see the supported tools!Python:Setup:pip/condaType-check:mypyTest:pytest(along withdoctestand other plugins)Format + Lint:black,docformatter,isort,autoflake,flynt,pre-commit,pyupgradeLint only:flake8,pylint,bandit,pydocstyle(along with plugins likedarglint,flake8-bugbear)Package:pipreqs,shivJupyter:Setup:pipFormat + Lint:jupyterblack,nbstripoutLint only:flake8-nbBash:Setup:npmandcondaTest:bats(bash testing:bats-core,bats-assert,bats-support)Format + Lint:shfmtLint only:shellcheckHaskell:Lint:hlintYAML:Setup:pipLint:yamllint,prettierMarkdown:Setup:npmFormat + Lint:markdownlint,prettierHTML + CSS:Setup:npmFormat + Lint:prettierJavaScript:Setup:npmFormat + Lint:prettierTypeScript:Setup:npmFormat + Lint:prettierreStructuredText Text (.rst)Lint:rstcheckSwift:Format:SwiftLint,swift-formatLint:SwiftLint,swift-formatKotlin:Format:ktlintLint:ktlintIt is very easy to extend this list with another tool, just following the existing examples.Example monorepo running AlphaBuildTo see AlphaBuild at work in a real-world example checkhttps://github.com/cristianmatache/workspaceout.
Workspace extends AlphaBuild with support for Prometheus, Alertmanager and Grafana.High-level comparison with Pants, Bazel, Pre-commit and traditional MakefilesModern build tools like Pants or Bazel work similarly to AlphaBuild in terms of goals and targets, but they also add
a caching layer on previous results of running the goals. While they come equipped with heavy machinery to support
enormous scale projects, they also come with some restrictions and specialized maintenance and contribution requirements.For example, Pants, which in my opinion is the most suitable modern build tool for Python, doesn't allow building
environments with arbitrary package managers (e.g. conda, mamba), does not work on Windows, prohibits inconsistent
environments (which is good but sometimes simply impossible in practice), does not yet support multiple environments.
Bazel requires maintaining the dependencies between Python files twice, once as "imports" in the Python files
(the normal thing to do) and again in some specificBUILDfiles that must be placed in each directory (by contrast,
Pants features autodiscovery). Maintaining the same dependencies in two places is quite draining. Of course, these tools
come with benefits like (remote) caching, incrementality and out-of-the-box support for hermetic packaging (e.g. PEXes),
remote execution etc. Moreover, playing with some new command line tools, or new programming languages / types of files
(e.g. Jupyter Notebooks, Markdown, YAML) may be challenging with these frameworks. The Pants community is very welcoming
and supportive towards incorporating new tools, so it would be good to give Pants a try first. However, if any of the
mentioned shortcomings is a hard requirement, Make seems like a good and robust alternative in the meantime, one which has
withstood the test of time in so many settings. AlphaBuild's strengths are its flexibility, simplicity, transparency and
tooling richness. One can quickly hack or add a new tool, see the commands that run under the hood, and does not need to
worry about BUILD files or the config language.Since AlphaBuild is essentially a script manager (Python, Bash, Perl, anything) enhanced with advanced
target/file/directory selection, it allows an incremental adoption of large-scale build tools like Pants.
For example, in the main Makefile, one could do:# Makefilelint-with-pants:$(evalon:=::)# Default value if the user does not specify "on" in the terminal command like: make goal on=../pantslint$(on)lint:lint-mdlint-nblint-ymllint-with-pantssuch that running a command like the below would delegate most of the work to Pants while using AlphaBuild's core or
custom capabilities not yet available in Pants (e.g. linting notebooks, markdown or YAML files).makelinton=my-dir/Barepre-commitand typical usages of Make work exceptionally well on small projects, but they don't really scale
well to multi-project monorepos. The build system proposed here already incorporatespre-commitand is obviously
compatible with any existing Makefiles. This approach simply takes the idea of advanced target selection and ports it
over to classical techniques like pre-commit and Make.LimitationsSince AlphaBuild is essentially a small-repo tool (Python Makefile) adapted to work on larger codebases (through target
selection), there is a point beyond which it will no longer be able to scale up. Fortunately, that point is quite far away
from medium-sized repos/teams.In addition, AlphaBuild requires that the commands it builds are shorter thangetconf ARG_MAXcharacters.Detailed comparison: AlphaBuild vs Make vs Pre-commit vs Tox/Nox vs Bazel vs PantsMost small/medium popular Python open source projects use Make, Pre-commit and/or Tox/Nox, with a crushing majority using them
for formatting, linting, testing and/or publishing. For the same purposes, fewer projects simply use a bunch
of Bash scripts or monorepo-style build tools like Bazel (or Pants, Buck, Please).
Make, pre-commit and nox/tox work pretty well together in the same repo and are often used in combination.Note 1:Whenever we talk about Pants we mean Pants V2, which is a fundamentally different product from
Pants V1. Even Twitter gave up on Pants V1 and is moving to Bazel (seehttps://twitter.com/jin_/status/1255133781876330497).Note 2:IMHO Pants (V2) is the best (but not yet perfect) build tool for large monorepos out there and has
great potential.Pros and ConsClick to expand and see the pros and cons of each tool!Tox/Nox:Pros:Good to run the same commands in multiple environmentsCross-platformCons:Does not scale to large reposPython-onlyNotes: can be called from within Make, but can also call Make and pre-commit commands
(see:https://tox.wiki/en/latest/config.html#conf-allowlist_externals)Pre-commit:Pros:Great to manage git hooks (as the name implies)Can run over all files or over a specific set of files but not directories
(slightly more advanced target selection)Sort of incrementalCons:Does not work on Windows (docs say it doesn't but in fact it does partly) [EDIT this is no longer the case.]Typically, very much geared towards format and lint.Notes: can be called from within MakeMake:Pros:Very flexible -> can essentially cover any tools/language (AlphaBuild is just Make with several built-in tools
and target selection) and any way of fetching dependencies, multiple 3rd party environments etc.Very transparent -> easy to supportCross-platformSupports nested MakefilesCons:More scalable than pre-commit/tox/nox but not as scalable or hermetic as Bazel/Pants/Buck/Please
(seehttps://github.com/thought-machine/please#why-please-and-not-makefor details).Notes: can run pre-commit, tox/nox but can also be run from tox/nox (not from pre-commit though)Bazel:Pros:Great for large scale projects (incremental, DAG, remote caching/execution)HermeticCons:Need to maintain special BUILD files in every directory in which all dependencies are written again
(once imported in the code, twice in the BUILD files)Low tool coverage for Python/Jupyter ecosystemSupport for 3rd party Python environments was not great (not sure if it is still the case)Has more sophisticated support needs (dedicated engineers and/or tuned CI infra)Notes: can be called from within MakePants V2:Pros:Great for large scale projects (incremental, DAG, remote caching/execution)Dependencies are auto-discovered in BUILD filesHermeticCons:Environment support: no conda, no multiple environments, no arbitrary ways to create environments, no inconsistent
envsNo support for Windows (only for WSL)Tool/language support could be better (Pants's support for Python is better than Bazel's though)Has more sophisticated support needs (e.g. dedicated engineers and/or tuned CI infra)Does not readily support the equivalent of nested MakefilesNotes:Can be called from within MakeStill need to have BUILD files in every directory (much easier to work with than in Bazel)Users by toolClick to expand and see who uses each tool!Nox, Tox, Pre-commitThese tools are simply extremely popular.Bazelraypytorch (Make for Python, Bash and CMake; in parallel with Bazel for other languages)selenium (with Rake i.e. ruby make) -> example BUILD filehttps://github.com/SeleniumHQ/selenium/blob/trunk/py/BUILD.bazeltensorflowkerasprotobufjax (with pre-commit)Pantshttps://www.pantsbuild.org/page/who-uses-pantsMakeDatapandas (in parallel with pre-commit)vaexpydanticpandera (in parallel with pre-commit, Make also runs nox)MLpytorch (Make for Python, Bash and CMake; in parallel with Bazel for other languages)xgboostDetermined AI (example of a more scalable Make-based infrastructure with nested Makefiles)VisualizationseabornDistributedcelery (in parallel with tox and pre-commit)Stream processingfaust (Make also runs pre-commit)Webrequestsgunicorn (in parallel with tox)sentry-python (Make also runs tox)AsyncIO (Aio-libs)aiohttp (Make also runs pre-commit)yarlaiomysqlaioredis-py (Make also runs pre-commit)aiopgDevOpsansible (in parallel with tox)pytest-testinfra (in parallel with tox)Documentationsphinx: the home repo (in parallel with tox)everything that uses sphinx for documentationPackagingpoetry (in parallel with tox and pre-commit)pyyaml (in parallel with tox)colorama (in parallel with tox)wrapt (Make also runs tox)pydata bottleneckfacebook researchmephistodemucsdiffqdoraapachesuperset (in parallel with tox and pre-commit)IBMlale (in parallel with pre-commit)compliance-trestle (Make also runs pre-commit)redditbaseplate.pybaseplate.py-upgradercqlmapperOther companies: AWS, Lyft, Microsoft, GoCardless, HewlettPackard
|
alphacast
|
This is the Alphacast Python Library.
|
alphacast-library
|
This is the first version of the Alphacast Python Library.
|
alphaclops-core
|
alphaclops is a Python library for writing, manipulating, and optimizing quantum
circuits and running them against quantum computers and simulators.This module isalphaclops-core, which contains everything you’d need to write quantum algorithms for NISQ devices and run them on the built-in alphaclops simulators.
In order to run algorithms on a given quantum hardware platform, you’ll have to install the right alphaclops module as well.InstallationTo install the stable version of onlyalphaclops-core, usepip install alphaclops-core.
To install the pre-release version of onlyalphaclops-core, usepip install alphaclops-core --pre.To get all the optional modules installed as well, you’ll have to usepip install alphaclopsorpip install alphaclops --prefor the pre-release version.
|
alphaclopsv1
|
alphaclops is a Python library for writing, manipulating, and optimizing quantum
circuits and running them against quantum computers and simulators.This module isalphaclops-core, which contains everything you’d need to write quantum algorithms for NISQ devices and run them on the built-in alphaclops simulators.
In order to run algorithms on a given quantum hardware platform, you’ll have to install the right alphaclops module as well.InstallationTo install the stable version of onlyalphaclops-core, usepip install alphaclops-core.
To install the pre-release version of onlyalphaclops-core, usepip install alphaclops-core --pre.To get all the optional modules installed as well, you’ll have to usepip install alphaclopsorpip install alphaclops --prefor the pre-release version.
|
alphacoders
|
Alphacoderspip install alphacoders
python3 -m alphacoders -h
|
alpha-compiler-vk
|
No description available on PyPI.
|
alphaconf
|
AlphaConfA small library to ease writing parameterized scripts.
The goal is to execute a single script and be able to overwrite the parameters
easily.
The configuration is based onOmegaConf.
Optionally, loading from toml or usingpydanticis possible.To run multiple related tasks, there is an integration withinvoke.
If you need something more complex, like running multiple instances of the
script, take a look athydra-coreor use another script
to launch multiple instances.Demo and applicationTo run an application, you need...# myapp.pyimportalphaconfimportlogging# define the default values and helpersalphaconf.setup_configuration({"server.url":"http://default",},{"server.url":"The URL to show here",})defmain():log=logging.getLogger()log.info('server.url:',alphaconf.get('server.url'))log.info('has server.user:',alphaconf.get('server.user',bool,default=False))if__name__=='__main__':alphaconf.cli.run(main)Invoking:pythonmyapp.pyserver.url=http://github.comDuring aninteractive session, you can set the application in the current
context.# import other modulesimportalphaconf.interactivealphaconf.interactive.mount()alphaconf.interactive.load_configuration_file('path')Check theDEMOfor more examples.How the configuration is loadedWhen running a program, first dotenv is used to load environment variables
from a.envfile - this is optional.Then configuration is built from:default configurations defined using (alphaconf.setup_configuration)applicationkey is generatedPYTHON_ALPHACONFenvironment variable may contain a path to loadconfiguration files from configuration directories (using application name)environment variables based on key prefixes,
except "BASE" and "PYTHON";if you have a configuration key "abc", all environment variables starting
with "ABC_" will be loaded, for example "ABC_HELLO=a" would set "abc.hello=a"key-values from the program argumentsFinally, the configuration is fully resolved and logging is configured.Configuration templates and resolversConfiguration values are resolved byOmegaConf.
Some of the resolvers (standard and custom):${oc.env:USER,me}: resolve the environment variable USER
with a default value "me"${oc.select:config_path}: resolve to another configuration value${read_text:file_path}: read text contents of a file asstr${read_bytes:file_path}: read contents of a file asbytes${read_strip:file_path}: read text contents of a file as strip spacesTheoc.selectis used to build multiple templates for configurations
by providing base configurations.
An argument--select key=templateis a shortcut forkey=${oc.select:base.key.template}.
So,logging: ${oc.select:base.logging.default}resolves to the configuration
dict defined in base.logging.default and you can select it using--select logging=default.Configuration values and integrationsTyped-configurationYou can useOmegaConfwithpydantictogettyped values.classMyConf(pydantic.BaseModel):value:int=0defbuild(self):# use as a factory pattern to create more complex objects# for example, a connection to the databasereturnself.value*2# setup the configurationalphaconf.setup_configuration(MyConf,prefix='a')# read the valuealphaconf.get('a',MyConf)v=alphaconf.get(MyConf)# because it's registered as a typeSecretsWhen showing the configuration, by default configuration keys which are
secrets, keys or passwords will be masked.
You can read values or passwords from files, by using the template${read_strip:/path_to_file}or, more securely, read the file in the codealphaconf.get('secret_file', Path).read_text().strip().Inject parametersWe can inject default values to functions from the configuration.
Either one by one, where we can map a factory function or a configuration key.
Or inject all automatically based on the parameter name.fromalphaconf.injectimportinject,inject_auto@inject('name','application.name')@inject_auto(ignore={'name'})defmain(name:str,example=None):pass# similar todefmain(name:str=None,example=None):ifnameisNone:name=alphaconf.get('application.name',str)ifexampleisNone:example=alphaconf.get('example',default=example)...Invoke integrationJust add the lines below to parameterize invoke.
Note that the argument parsing to overwrite configuration will work only
when the script is directly called.importalphaconf.invokens=alphaconf.invoke.collection(globals())alphaconf.setup_configuration({'backup':'all'})alphaconf.invoke.run(__name__,ns)Way to 1.0Run a specific functionalphaconf my.module.main:
find functions and inject argsInstall completions for bashalphaconf --install-autocompletion
|
alphaconfig
|
AlphaConfigIntroductionAlphaConfig is easy to read and easy to use, and is designed for configuration.InstallationYou can install AlphaConfig with pip.pip install AlphaConfigAPIsinit(src_dict: dict=None, path2yaml: os.PathLike=None, path2json: os.PathLike=None, read_only: bool=False, **kwargs)Info:
Initialize an AlphaConfig instance.
Args:
src_dict (dict): Create an AlphaConfig instance from python builtin's dict, default is None.
path2yaml (os.PathLike): Create an AlphaConfig instance from a yaml file, default is None.
path2json (os.PathLike): Create an AlphaConfig instance from a json file, default is None.
read_only (bool): Set the state of AlphaConfig instance, modification is allowed only if its value is False.
kwargs (key-value pairs): Create attribute-value pairs from key-value pairs.ExamplefromalphaconfigimportAlphaConfigtest_dict={"attr_1":1,"attr_2":{"attr_2_1":[2,3],"attr_2_2":"this is attr_2_2",}}configs=AlphaConfig(test_dict,attr_3="value_3")print(configs)# * ATTRIBUTES *# - attr_1: 1# - attr_2:# - attr_2_1: [2, 3]# - attr_2_2: this is attr_2_2# - attr_3: value_3is_read_only()Info:
Allows the user to check whether an instance is read-only or not.
Returns:
(bool): True if it is read only else False.cvt2dict()Info:
Convert an AlphaConfig instance to a python builtin's dict.
Returns:
(dict)Examplestest_dict={"attr_1":1,"attr_2":{"attr_2_1":[2,3],"attr_2_2":"this is attr_2_2",}}configs=AlphaConfig(test_dict,attr_3="value_3")print(configs.cvt2dict())# {'attr_1': 1, 'attr_2': {'attr_2_1': [2, 3], 'attr_2_2': 'this is attr_2_2'}, 'attr_3': 'value_3'}cvt_state(read_only: bool=None)Info:
Convert the readable state according to the given arg.
Args:
read_only (bool): Set the readable state, if no value is given, revert the state.
Returns:
(bool): The final readable state.keys()Info:
Get all user-defined attributes. This method acts like a python builtin's dict.
Returns:
(dict_keys)values()Info:
Get all user-defined values of the corresponding keys. This method acts like a python builtin's dict.
Returns:
(dict_values)items()Info:
Get all user-defined key-value pairs. This method acts like a python builtin's dict.
Returns:
(dict_items)iter()AlphaConfig supportsiter.Exampletest_dict={"attr_1":1,"attr_2":{"attr_2_1":[2,3],"attr_2_2":"this is attr_2_2",}}config=AlphaConfig(test_dict)foritinconfig:print(it)# ('attr_1', 1)# ('attr_2', {'attr_2_1': [2, 3], 'attr_2_2': 'this is attr_2_2'})copy and deepcopyAlphaConfig supports copy and deepcopy; just use the builtincopymodule, as shown in the sketch below.
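A minimal sketch of that copy/deepcopy support, reusing the AlphaConfig constructor and cvt2dict() shown above; the exact semantics of the shallow copy are an assumption, so check the package if they matter to you.

import copy

from alphaconfig import AlphaConfig

configs = AlphaConfig({"attr_1": 1, "attr_2": {"attr_2_1": [2, 3]}})

# Both builtin copy helpers work directly on an AlphaConfig instance.
shallow = copy.copy(configs)
deep = copy.deepcopy(configs)

# The deep copy carries the same content but is an independent object.
print(deep.cvt2dict() == configs.cvt2dict())  # True
print(deep is configs)                        # False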
|
alpha-cord
|
No description available on PyPI.
|
alphacore
|
Core statistical functions for alpha
|
alphacpc
|
Example Package
|
alphacsc
|
This is a library to perform shift-invariantsparse dictionary learning, also known as
convolutional sparse coding (CSC), on time-series data.
It includes a number of different models:univariate CSCmultivariate CSCmultivariate CSC with a rank-1 constraint[1]univariate CSC with an alpha-stable distribution[2]A mathematical description of these models is availablein the documentation.InstallationTo install this package, the easiest way is usingpip. It will install this
package and its dependencies. Thesetup.pydepends onnumpyandcythonfor the installation so it is advised to install them beforehand. To
install this package, please run one of the two commands:(Latest stable version)pip install alphacsc(Development version)pip install git+https://github.com/alphacsc/alphacsc.git#egg=alphacsc(Dicodile backend)pip install numpy cython
pip install alphacsc[dicodile]To use the dicodile backend, do not forget to set theMPI_HOSTFILEenvironment
variable.If you do not have admin privileges on the computer, use the--userflag
withpip. To upgrade, use the--upgradeflag provided bypip.To check if everything worked fine, you can run:python -c 'import alphacsc'and it should not give any error messages.QuickstartHere is an example to present briefly the API:importnumpyasnpimportmatplotlib.pyplotaspltfromalphacscimportBatchCDL# Define the different dimensions of the problemn_atoms=10n_times_atom=50n_channels=5n_trials=10n_times=1000# Generate a random set of signalsX=np.random.randn(n_trials,n_channels,n_times)# Learn a dictionary with batch algorithm and rank1 constraints.cdl=BatchCDL(n_atoms,n_times_atom,rank1=True)cdl.fit(X)# Display the learned atomsfig,axes=plt.subplots(n_atoms,2,num="Dictionary")forkinrange(n_atoms):axes[k,0].plot(cdl.u_hat_[k])axes[k,1].plot(cdl.v_hat_[k])axes[0,0].set_title("Spatial map")axes[0,1].set_title("Temporal map")foraxinaxes.ravel():ax.set_xticklabels([])ax.set_yticklabels([])plt.show()Dicodile backendAlphaCSC can use adicodile-based backend to perform sparse encoding in parallel.To install dicodile, runpip install alphacsc[dicodile].Known OpenMPI issuesWhen self-installing OpenMPI (for instance to rundicodileon a single machine, or for continuous integration), running thedicodilesolver might end up causing a deadlock (no output for a long time). It is often due to communication issue between the workers. This issue can often be solved by disabling Docker-related virtual NICs, for instance by runningexportOMPI_MCA_btl_tcp_if_exclude="docker0".Bug reportsUse thegithub issue trackerto report bugs.Cite our workIf you use this code in your project, please consider citing our work:[1]Dupré La Tour, T., Moreau, T., Jas, M., & Gramfort, A. (2018).Multivariate Convolutional Sparse Coding for Electromagnetic Brain Signals. Advances in Neural Information
Processing Systems (NIPS).[2]Jas, M., Dupré La Tour, T., Şimşekli, U., & Gramfort, A. (2017).Learning
the Morphology of Brain Signals Using Alpha-Stable Convolutional Sparse Coding.
Advances in Neural Information Processing Systems (NIPS), pages 1099–1108.
|
alphacube
|
No description available on PyPI.
|
alphad3m
|
AlphaD3M is an AutoML system that automatically searches for models and derives end-to-end pipelines that read,
pre-process the data, and train the model. AlphaD3M leverages recent advances in deep reinforcement learning and is
able to adapt to different application domains and problems through incremental learning.AlphaD3M provides data scientists and data engineers the flexibility to address complex problems by leveraging the
Python ecosystem, including open-source libraries and tools, support for collaboration, and infrastructure that enables
transparency and reproducibility.This repository is part of New York University's implementation of theData Driven Discovery project (D3M).Documentation is available here.
|
alphad3m-containers
|
AlphaD3M is an AutoML system that automatically searches for models and derives end-to-end pipelines that read,
pre-process the data, and train the model. AlphaD3M leverages recent advances in deep reinforcement learning and is
able to adapt to different application domains and problems through incremental learning.AlphaD3M provides data scientists and data engineers the flexibility to address complex problems by leveraging the
Python ecosystem, including open-source libraries and tools, support for collaboration, and infrastructure that enables
transparency and reproducibility.This repository is part of New York University's implementation of theData Driven Discovery project (D3M).Documentation is available here.
|
alpha-data-py
|
alpha-data-pyA client to fetch data from the alphaticks databaseInstallationYou can install alpha-data-py using the following command:pipinstallalpha-data-pyUsageIn order to use the library, you need an API key from alphaticks. Go toalphaticks.io,
Account -> Licenses -> AlphaData and click onAdd credentials.Now that you have your credentials ready, you can create a client, fetch a security, and fetch the data you need for
that security.importdatetimefromadataimportClient,FREQ_1Mc=Client(API_KEY,API_SECRET)secs=c.get_securities()forsinsecs:ifs.symbol=="BTCUSDT"ands.exchange=="fbinance":sec=sstart=datetime.datetime(2022,1,1)end=datetime.datetime(2022,1,5)it=c.get_historical_ohlcv(sec,FREQ_1M,start,end)whileit.next():print(it.o,it.h,it.l,it.c,it.v)
|
alphadb
|
AlphaDBA toolset for MySQL database versioning.Still in alpha stageAlphaDB is currently inbetastage. Breaking changes should be expected.Table of ContentsDocumentationInstallationInstall using PIPUsageExceptionsLicenseDocumentationVisit theofficial documentationInstallationInstall usingPIPpip install alphadbNote thatpiprefers to the Python 3 package manager. In an environment where Python 2 is also present the correct command may bepip3.UsageImport AlphaDBfromalphadbimportAlphaDBConnect to a database.db=AlphaDB()db.connect(host="localhost",user="user",password="password",database="database")Make sure the database is empty, back it up if necessary. If the database is not empty, you can use thevacatemethod.
Note that this function will erase ALL data in the database and there is no way to get it back. For extra safety the argumentconfirm=Trueis required for the function to run.db.vacate(confirm=True)The database is now ready to be initialized. Theinitmethod will create theadb_conftable. This holds configuration data for the database.db.init()Now we update the database. For this we need to give it a structure. The database version information is a JSON structure formatted as such:database_version_source={"name":"mydb",## Database name, does not have to, but is advised to match the actual database name"version":[## List containing database versions{"_id":"0.1.0",## Database version"createtable":{## Object containing tables to be created,"customers":{## Object key will be used as table name"primary_key":"id","name":{## Object key will be used as column name"type":"VARCHAR",## Data type"length":100,## Date max length,},"id":{"type":"INT","a_i":True}},}},{"_id":"1.0.0","createtable":{"orders":{"primary_key":"id","id":{"type":"INT","a_i":True},"date":{"type":"DATETIME",},"note":{"type":"TEXT","null":True}}}}]}Then call theupdatemethod.db.update(version_source=database_version_source)ExceptionsNoConnectionTheNoConnectionexception is thrown when a method is called while no database connection is active.DBNotInitializedTheDBNotInitializedexception is thrown when the database is not yet initialized.Database.init()## Will initialize the database and thus resolve the errorDBTemplateNoMatchTheDBTemplateNoMatchexception is thrown when de database was previously updated using another version source.
On initialization, a tableadb_confis created. In this table the columntemplateis used to save the version source template name. Make sure it matches.LicenseGPL-3.0 LICENSE
|
alpha-dbw
|
### Alpha_pyThis repository contains the API of the low-level vehicle control system [Alpha](alpha.starline.ru), developed as part of the [OSCAR](https://gitlab.com/starline/oscar) project, for self-driving vehicles.#### Installation from PyPI` pip3 install--useralpha_py `#### Installation from source` git clonehttps://gitlab.com/starline/alpha_py.git&& cd alpha_py pip3 install--user-e. `#### Usage` import alpha vehicle =alpha.Vehicle("/dev/ttyACM0")vehicle.drive() vehicle.steer(20) vehicle.move(10) vehicle.manual() vehicle.led_blink() vehicle.emergency_stop() vehicle.recover() vehicle.left_turn_signal() vehicle.right_turn_signal() vehicle.emergency_signals() vehicle.turn_off_signals() vehicle.get_vehicle_speed() `
|
alphadetector
|
No description available on PyPI.
|
alphadev
|
AlphaDevAlphaDev is an AI model based on the AlphaZero/MuZero Reinforcement Learning architecture. It's designed to optimize assembly code using a set of assembly instructions and a cost function which takes into account both correctness and performance.Usagepip install alphadevArchitectureAlphaDev consists of:Representation Network:f_repthat outputs a latent representationhtof the stateSt.Prediction Network:f_predthat predicts the expected return (the value)vˆtand a policyπˆtfrom a given latent state.Dynamics Network:f_dynthat predicts the next latent statehtk+1and rewardrˆtk+1resulting from a transition.How AlphaDev WorksOn reaching a new state, AlphaDev encodes the state into a latent representation using the representation network. The dynamics and prediction networks are used to simulate several trajectories that fill out a search tree by sampling state transitions.The actions are selected using a strategy that balances exploration (trying new actions) and exploitation (progressing further down the subtree of the current best action).Finally, the predicted policy is trained to match the visit counts of the MCTS policy in an attempt to distil the search procedure into a policy that will disregard nodes that are not promising.Potential Use CasesAlphaDev, due to its general architecture, could potentially be adapted to solve a wide variety of optimization problems. Here are a few examples:Route Optimization: For logistics companies, optimizing the routes of their fleet can result in significant cost savings. AlphaDev could be used to learn the optimal routes based on a variety of factors such as traffic, distance, and number of stops.Job Scheduling: In computing, job scheduling is a key issue. AlphaDev could be used to learn the optimal schedule that maximizes the usage of computational resources and minimizes job completion time.Stock Portfolio Optimization: AlphaDev could be used to learn the optimal mix of stocks to maximize return and minimize risk, given the current market conditions.Game Playing: Similar to its ancestor AlphaZero, AlphaDev could potentially be used to master a wide variety of games, by learning the optimal strategies.Drug Discovery: AlphaDev could be used to find the optimal chemical structure for a new drug that maximizes efficacy and minimizes side effects.UsageAssemblyGame This represents the Assembly Game RL environment. The state of the RL environment contains the current program and the state of memory and registers. Doing a step in this environment is equivalent to adding a new assembly instruction to the program (see the step method). The reward is a combination of correctness and latency reward after executing the assembly program over an input distribution. For simplicity of the overall algorithm we are not including the assembly runner, but assembly execution can be delegated to an external library (e.g. AsmJit).AlphaDevConfig contains the main hyperparameters used for the AlphaDev agent. This includes configuration of AlphaZero, MCTS, and underlying networks.play_game contains the logic to run an AlphaDev game. This include the MCTS procedure and the storage of the game.RepresentationNet and PredictionNet contain the implementation the networks used in the AlphaZero algorithm. 
It uses a MultiQuery Transformer to represent assembly instructionsFuture WorkFuture adaptations of AlphaDev could implement different learning algorithms or optimization techniques for specific domains or problem areas.ContributingPlease readCONTRIBUTING.mdfor details on our code of conduct, and the process for submitting pull requests to us.LicenseThis project is licensed under the MIT License - see theLICENSE.mdfile for details.AcknowledgmentsWe appreciate the efforts of the researchers and developers who contributed to the development of the AlphaZero/MuZero architectures on which AlphaDev is based.RoadmapAdd jax-based multi query attention:MultiQueryAttentionBlockaddResBlockV2add utils, terminal, is_correct, legal_actions
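To make the search loop described under "How AlphaDev Works" more concrete, here is a hedged, self-contained Python sketch of an AlphaZero/MuZero-style Monte Carlo tree search built around the three networks named above (f_rep, f_pred, f_dyn). The Node class, the UCB constant, the toy network stand-ins and all other names are illustrative assumptions, not AlphaDev's actual implementation.

import math
import random

class Node:
    # One node per simulated latent state; visit counts form the search policy.
    def __init__(self, latent, prior):
        self.latent = latent
        self.prior = prior
        self.visits = 0
        self.value_sum = 0.0
        self.children = {}  # action -> Node

    def value(self):
        return self.value_sum / self.visits if self.visits else 0.0

def ucb(parent, child, c_puct=1.5):
    # Balance exploitation (observed value) against exploration (prior / visit counts).
    return child.value() + c_puct * child.prior * math.sqrt(parent.visits) / (1 + child.visits)

def mcts(root_state, f_rep, f_dyn, f_pred, actions, n_sims=50):
    root = Node(f_rep(root_state), prior=1.0)
    for _ in range(n_sims):
        node, path = root, [root]
        # Selection: follow the highest-scoring child until reaching a leaf.
        while node.children:
            _, node = max(node.children.items(), key=lambda kv: ucb(path[-1], kv[1]))
            path.append(node)
        # Expansion + evaluation: the prediction net gives a value and a policy prior,
        # the dynamics net gives the latent state reached by each candidate action.
        value, policy = f_pred(node.latent)
        for a in actions:
            next_latent, _reward = f_dyn(node.latent, a)
            node.children[a] = Node(next_latent, policy.get(a, 1.0 / len(actions)))
        # Backup: propagate the predicted value along the visited path.
        for n in path:
            n.visits += 1
            n.value_sum += value
    # Root visit counts act as the improved policy the network is later trained to match.
    return {a: child.visits for a, child in root.children.items()}

# Toy stand-ins so the sketch runs end to end (the real networks are learned).
f_rep = lambda state: tuple(state)
f_pred = lambda latent: (random.random(), {})
f_dyn = lambda latent, action: (latent + (action,), 0.0)

print(mcts(root_state=[], f_rep=f_rep, f_dyn=f_dyn, f_pred=f_pred, actions=[0, 1, 2]))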
|
alphadict
|
UNKNOWN
|
alphaDigger
|
No description available on PyPI.
|
alphadoc
|
alphadocAutomatic docstring generator and style guide that supports a number of specified conventions for documentation in Python.FeaturesAuto-generates docstrings with a customizable template for functions.Automatically fixes the code according to the standard PEP-8 style convention for python.Support for common and widely used docstrings formats such as Numpy, Google, ReStructured Text and Epytext (Javadoc)InstallationUsing pip:$ pip install alphadocUsagealphadoc takes your filename and format type as arguments$ alphadoc <filename> --d <doc_format>Seealphadoc --helpfor more command-line switches and options!Options :Usage: alphadoc [OPTIONS] FILENAME
Automatic docstring generator and style guide that supports a
number of specified conventions for formatting as well as
documentation in Python.
Options:
-d, --doc_format TEXT Specified format for docstrings from Options-
ReST : For ReStructured Text (default); Epytext
: For Epytext (Javadoc); Google : For Google-
Style ; Numpydoc : For Numpydoc
--help Show this message and exit.Example :Before alphadocimportastimportsysdeftop_level_functions(body):return(fforfinbodyifisinstance(f,ast.FunctionDef))defparse_ast(filename):withopen(filename,"rt")asfile:returnast.parse(file.read(),filename=filename)defget_func(filename):tree=parse_ast(filename)func_list=[]forfuncintop_level_functions(tree.body):func_list.append(func.name)returnfunc_listAfter alphadocDocstring format:ReStructured Text(default)importastimportsysdeftop_level_functions(body):"""This is reST style.:param param1: this is a first param:param param2: this is a second param:returns: this is a description of what is returned:raises keyError: raises an exception"""return(fforfinbodyifisinstance(f,ast.FunctionDef))defparse_ast(filename):"""This is reST style.:param param1: this is a first param:param param2: this is a second param:returns: this is a description of what is returned:raises keyError: raises an exception"""withopen(filename,"rt")asfile:returnast.parse(file.read(),filename=filename)defget_func(filename):"""This is reST style.:param param1: this is a first param:param param2: this is a second param:returns: this is a description of what is returned:raises keyError: raises an exception"""tree=parse_ast(filename)func_list=[]forfuncintop_level_functions(tree.body):func_list.append(func.name)returnfunc_listEpytext (Javadoc)importastimportsysdeftop_level_functions(body):"""This is javadoc style.@param param1: this is a first param@param param2: this is a second param@return: this is a description of what is returned@raise keyError: raises an exception"""return(fforfinbodyifisinstance(f,ast.FunctionDef))defparse_ast(filename):"""This is javadoc style.@param param1: this is a first param@param param2: this is a second param@return: this is a description of what is returned@raise keyError: raises an exception"""withopen(filename,"rt")asfile:returnast.parse(file.read(),filename=filename)defget_func(filename):"""This is javadoc style.@param param1: this is a first param@param param2: this is a second param@return: this is a description of what is returned@raise keyError: raises an exception"""tree=parse_ast(filename)func_list=[]forfuncintop_level_functions(tree.body):func_list.append(func.name)returnfunc_listGoogleimportastimportsysdeftop_level_functions(body):"""This is an example of Google style.Args:param1: This is the first param.param2: This is a second param.Returns:This is a description of what is returned.Raises:KeyError: Raises an exception."""return(fforfinbodyifisinstance(f,ast.FunctionDef))defparse_ast(filename):"""This is an example of Google style.Args:param1: This is the first param.param2: This is a second param.Returns:This is a description of what is returned.Raises:KeyError: Raises an exception."""withopen(filename,"rt")asfile:returnast.parse(file.read(),filename=filename)defget_func(filename):"""This is an example of Google style.Args:param1: This is the first param.param2: This is a second param.Returns:This is a description of what is returned.Raises:KeyError: Raises an exception."""tree=parse_ast(filename)func_list=[]forfuncintop_level_functions(tree.body):func_list.append(func.name)returnfunc_listNumpydocimportastimportsysdeftop_level_functions(body):"""Numpydoc description of a kindof very exhautive numpydoc format docstring.Parameters----------first : array_likethe 1st param name `first`second :the 2nd paramthird : {'value', 'other'}, optionalthe 3rd param, by default 'value'Returns-------stringa value in a stringRaises------KeyErrorwhen a key 
errorOtherErrorwhen an other error"""return(fforfinbodyifisinstance(f,ast.FunctionDef))defparse_ast(filename):"""Numpydoc description of a kindof very exhautive numpydoc format docstring.Parameters----------first : array_likethe 1st param name `first`second :the 2nd paramthird : {'value', 'other'}, optionalthe 3rd param, by default 'value'Returns-------stringa value in a stringRaises------KeyErrorwhen a key errorOtherErrorwhen an other error"""withopen(filename,"rt")asfile:returnast.parse(file.read(),filename=filename)defget_func(filename):"""Numpydoc description of a kindof very exhautive numpydoc format docstring.Parameters----------first : array_likethe 1st param name `first`second :the 2nd paramthird : {'value', 'other'}, optionalthe 3rd param, by default 'value'Returns-------stringa value in a stringRaises------KeyErrorwhen a key errorOtherErrorwhen an other error"""tree=parse_ast(filename)func_list=[]forfuncintop_level_functions(tree.body):func_list.append(func.name)returnfunc_listReferenceshttp://daouzli.com/blog/docstring.htmlContributingalphadoc is fully Open-Source and open for contributions! We request you to respect our contribution guidelines as defined in ourCODE_OF_CONDUCT.mdandCONTRIBUTING.md
|
alpha-dom
|
AlphaDom (v0.1.4)Playing Dominion with Deep Reinforcement Learning.ReferencesAlphaZeroNotesThoughtsAlphaStarNotesThoughtsDeep RL for Imperfect Information GamesNotesThoughts
|
alpha-eigen
|
alpha-eigenUsing dynamic mode decomposition to calculate alpha-eigenvalues for time-dependent radiation transport problems.InstallThese codes are intended to continue work from the paper McClarren, R. G. (2019). Calculating Time Eigenvalues of the Neutron Transport Equation with Dynamic Mode Decomposition. Nuclear Science and Engineering, 193(8), 854–867.http://doi.org/10.1080/00295639.2018.1565014
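For readers unfamiliar with the technique, below is a generic, hedged NumPy sketch of dynamic mode decomposition; it is not this package's API. Given equally spaced solution snapshots, the eigenvalues mu of the reduced one-step operator give time (alpha) eigenvalues as log(mu)/dt. The function name, the rank truncation and the synthetic data are illustrative assumptions.

import numpy as np

def dmd_alpha_eigenvalues(snapshots, dt, rank=2):
    # snapshots: (state_dim, n_times) matrix of equally spaced solution states.
    X, X_next = snapshots[:, :-1], snapshots[:, 1:]
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    r = min(rank, len(s))
    U, s, Vh = U[:, :r], s[:r], Vh[:r, :]
    # Reduced linear operator that advances the state by one time step.
    A_tilde = U.conj().T @ X_next @ Vh.conj().T @ np.diag(1.0 / s)
    mu = np.linalg.eigvals(A_tilde)
    # Continuous-time (alpha) eigenvalues of the assumed dynamics x' = A x.
    return np.log(mu.astype(complex)) / dt

# Synthetic demo: two decaying spatial modes with rates -0.5 and -2.0.
t = np.linspace(0.0, 5.0, 201)
x = np.linspace(0.0, 1.0, 64)[:, None]
data = np.exp(-0.5 * t) * np.sin(np.pi * x) + 0.3 * np.exp(-2.0 * t) * np.sin(2 * np.pi * x)
alphas = dmd_alpha_eigenvalues(data, dt=t[1] - t[0], rank=2)
print(np.sort_complex(alphas))  # real parts close to -2.0 and -0.5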
|
alphaess
|
alphaessThis Python library logs in to cloud.alphaess.com and retrieves data on your Alpha ESS inverter, photovoltaic panels, and battery if you have one.UsageCreate a new Alpha ESS instance, log in, retrieve a list of Alpha ESS systems and request energy statistics of one of those Alpha ESS systems.APICurrently this package uses an API that I reverse engineered from the Alpha ESS web app. This is an internal API subject to change at any time by Alpha ESS.NoteAs of AlphaCloud V5.0.0.2 (2022-10-16) Alpha ESS introduced anti-crawl efforts to make retrieving data more difficult for packages like this one.To be good internet citizens, it is advised that your polling frequency for any AlphaCloud endpoints is 10 seconds at a minimum.MethodsThere are four public methods in this module:authenticate(username, password)- attempts to authenticate to the Alpha ESS API with a username and password combination, returns True or False depending on successful authentication or notgetdata()- having successfully authenticated, attempts to get statistical energy data on all registered Alpha ESS systems - will return None if there are issues retrieving data from the Alpha ESS API.setbatterycharge(serial, enabled, cp1start, cp1end, cp2start, cp2end, chargestopsoc)- having successfully authenticated, sets battery grid charging settings for the SN.setbatterydischarge(serial, enabled, dp1start, dp1end, dp2start, dp2end, dischargecutoffsoc)- having successfully authenticated, sets battery discharge settings for the SN.
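A hedged usage sketch built only from the four methods listed above. The import path, the client class name and the asyncio style are assumptions made for illustration; check the package source for the exact API before relying on them.

import asyncio

from alphaess.alphaess import alphaess  # assumed module/class name

async def main():
    client = alphaess()
    # authenticate() returns True or False depending on whether the login succeeded.
    if not await client.authenticate("username", "password"):
        print("Authentication failed")
        return
    # getdata() returns statistical energy data for all registered systems, or None.
    data = await client.getdata()
    if data is None:
        print("Could not retrieve data from the Alpha ESS API")
        return
    for system in data:
        print(system)

asyncio.run(main())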
|
alphaess-modbus
|
AlphaESS ModBus readerAsync Python 3 library to read ModBus from an AlphaESS inverter. Tested and assumes using a Raspberry Pi as the host for RTU.Usesasynciominimalmodbusfor ModBus/RS485 RTU communication.
Usespymodbysfor Modbus TCP communication.Seealphaess_collectorwhich uses this library to store values in MySQL.Compatible with RTU:DeviceBaudTestedSMILE59600✅SMILE-B39600SMILE-T109600Storion T3019200Hardware (RTU)⚠️⚠️ This worked for me, so do at your own risk!! ⚠️⚠️More information (and pictures) in theNotessection below.Use the inverter menu to enable modbus in slave mode.Snip one end of an ethernet cable off and connect (may vary):Blue/white to RS485 ABlue to RS485 BRS485 RX to GPIO 15RS485 TX to GPIO 14Enable serial port on Raspberry Pi withraspi-config.Connect other end of ethernet cable to the inverter CAN port.Quick startPIPInstall with:python3-mpipinstallalphaess-modbusCheckoutexample.pyorexample-tcp.pyto get startedCloneClone repo and runexample.py:[email protected]:SorX14/alphaess_modbus.gitcd./alphaess_modbus
./example.py[Sun,20Nov202221:36:54]INFO[example.py.main:27]PV:0WGRID:1078WUSE:1078WBattery:0WDone! 🎉ArchitectureThis library concentrates on reading data, butwritingis possible.Uses a JSON definition file containing all the ModBus registers and how to parse them - lookup the register you want from thePDFand request it using the reader functions below.For example, to get the capacity of your installed system, find the item in the PDF:Copy the name -PV Capacity of Grid Inverter- and request withawait reader.get_value("PV Capacity of Grid Inverter")DefinitionsAn excerpt fromregisters.json:{"name":"pv2_current","address":1058,"hex":"0x0422","type":"register","signed":false,"decimals":1,"units":"A"},which would be used when called with:awaitreader.get_value("PV2 current")# or await reader.get_value("pv2_current")It will read register0x0422, process the result as unsigned, divide it by 10, and optionally addAas the units.The default JSON file was created withalphaess_pdf_parser. You can override the default JSON file withReader(json_file=location)Reading valuesReader()Create a new RTU readerimportasynciofromalphaess_modbusimportReaderasyncdefmain():reader:Reader=Reader()definition=awaitreader.get_definition("pv2_voltage")print(definition)asyncio.run(main())Optionally change the defaults with:decimalAddress=85serial='/dev/serial0'debug=Falsebaud=9600json_file=Noneformatter=NoneReaderTCP()Create a new TCP readerimportasynciofromalphaess_modbusimportReaderTCPasyncdefmain():reader:ReaderTCP=ReaderTCP(ip="192.168.1.100",port=502)definition=awaitreader.get_definition("pv2_voltage")print(definition)asyncio.run(main())Optionally change the defaults with:ip=Noneport=502slave_id=int(0x55)json_file=Noneformatter=Noneget_value(name) -> intRequests a value from the inverter.grid=awaitreader.get_value("total_active_power_grid_meter")print(grid)# 1234Prints the current grid usage as an integer.get_units(name) -> strGet units (if any) for a register name.grid_units=awaitreader.get_units("total_active_power_grid_meter")print(grid_units)# Wget_formatted_value(name, use_formatter=True)Same asget_value()but returns a string with units. If aformatteris defined for the register, a different return type is possible.grid=awaitreader.get_formatted_value("total_active_power_grid_meter")print(grid)# 1234WSetuse_formattertoFalseto prevent a formatter from being invoked.get_definition(name) -> dictGet the JSON entry for an item. Useful if you're trying towritea register.item=awaitreader.get_definition("inverter_power_total")print(item)# {'name': 'inverter_power_total', 'address': 1036, 'hex': '0x040C', 'type': 'long', 'signed': True, 'decimals': 0, 'units': 'W'}FormattersSome registers are special and not just simple numbers - they could contain ASCII, hex-encoded numbers or another format.For example,0x0809Local IPreturns 4 bytes of the current IP, e.g.0xC0,0xA8,0x01,0x01(192.168.1.1).To help, there is a built-in formatter which will be invoked when calling.get_formatted_value()e.g:ip=awaitreader.get_formatted_value("Local IP")print(ip)# 192.168.0.1Not all registers have a formatter, and you might have a preference on how the value is returned (e.g. time-format). 
To help with this, you can pass aformattertoReader()and override or add to the default:classmy_custom_formatter:deflocal_ip(self,val)->str:bytes=val.to_bytes(4,byteorder='big')returnf"IP of device:{int(bytes[0])}-{int(bytes[1])}-{int(bytes[2])}-{int(bytes[3])}"reader:Reader=Reader(formatter=my_customer_formatter)local_ip=awaitreader.get_formatted_value("local_ip")print(local_ip)# IP of device: 192 - 168 - 0 - 0Each formatting function is based on the conformed name of a register. You can find the conformed name of a register by searchingregisters.jsonor by usingawait reader.get_definition(name)Writing values☠️ ModBus gives full control of the inverter. There are device-level protections in place but be very careful ☠️This library is intended to read values, but you can get a reference to theinternal ModBus librarywithreader.instrument:# Using internal reference to read a valueread=awaitreader.instrument.read_long(int(0x0021),3,False)print(read)# Untested, but should set system languageawaitreader.instrument.write_register(int(0x071D),1,0)Read the library docs for what to do next:https://minimalmodbus.readthedocs.io/en/stable/Use theAlphaESS manualfor how each register works.NotesDefinitionsWhilemy parsing scriptdid its best, there are likely to be many faults and missing entries. I only need a few basic registers so haven't tested them all.Some registers are longer than the default 4 bytes and won't work- you'll have to use the internal reader instead.PR's are welcome 🙂Registers always returning 0There are a lot of registers, but they might not all be relevant depending on your system setup. For example, the PV meter section is useless if your AlphaESS is in DC mode.Error handlingI've had the connection break a few times while testing, make sure you handle reconnecting correctly.example.pywill output the full exception should one happen.My TCP setupSome of the more recent AlphaESS inverters have this out of the box, but mine didn't. The original RTU setup was to bridge this gap.Eventually, I purchased aWaveShare RS485 TO POE Ethernet Converterbut I'm sure there are alternatives. You want something that converts a RTU device to TCP.The WaveShare one is powered by PoE, it was simple to unplug my RTU setup and put this in its place.Added a small piece of DIN rail next to my inverter and gave the converter a static IP.My RTU setupI used am5stamp RS485 modulewith a digital isolator and DC/DC isolator.Installed in an enclosure with a PoE adapter to power the Pi and provide connectivity.Enabled ModBus interface on the inverter. You'll need the service password, mine was set to the default of1111.Then connected to the CAN port.Credit and thanksSpecial thanks go tohttps://github.com/CharlesGillanders/alphaesswhere I originally started
playing around with my PV system. Their project uses the AlphaESS dashboard backend API to unofficially get inverter values from the cloud.Invaluable resource for discussing with other users. Highly
recommend readinghttps://github.com/CharlesGillanders/alphaess/issues/9which ended up with
AlphaESS creating an official API to retrieve data -https://github.com/alphaess-developer/alphacloud_open_apiAnother great resource ishttps://github.com/dxoverdy/Alpha2MQTTwhich uses a ESP8266 instead
of a Raspberry PI to communicate with the inverter - again - highly recommended.https://github.com/scanapi/scanapifor 'helping' with github actions (I used their workflow actions as templates for this project).
|
alphaessopenapi
|
alphaessThis Python library uses the Alpha ESS Open API to retrieve data on your Alpha ESS inverter, photovoltaic panels, and battery if you have one. This library is principally intended for use by my Home Assistant integration [https://github.com/CharlesGillanders/homeassistant-alphaESS]How to use1. Sign up for an open API accountRegister athttps://open.alphaess.com/for a (free) account to get your Developer ID (AppID) and Developer Secret (AppSecret).Once registered, add your battery/inverter to the developer account via the web UI.NoteTo be good internet citizens, it is advised that your polling frequency for any AlphaCloud endpoints are 10 seconds at a minimum.MethodsThere are public methods in this module that duplicate the AlphaESS OpenAPI and provide wrappers forhttps://openapi.alphaess.com/api/getEssListhttps://openapi.alphaess.com/api/getLastPowerDatahttps://openapi.alphaess.com/api/getOneDayPowerBySnhttps://openapi.alphaess.com/api/getOneDateEnergyBySnhttps://openapi.alphaess.com/api/getChargeConfigInfohttps://openapi.alphaess.com/api/updateChargeConfigInfohttps://openapi.alphaess.com/api/getDisChargeConfigInfohttps://openapi.alphaess.com/api/updateDisChargeConfigInfoAll of the above are documented athttps://open.alphaess.com/developmentManagement/apiList(Registration required)getdata() - Attempts to get statistical energy data for use in Home Assistant for all registered Alpha ESS systems - will return None if there are issues retrieving data from the Alpha ESS API.authenticate - Attempts to usehttps://openapi.alphaess.com/api/getEssListto validate authentication to the ALpha ESS API - will return True or False.setbatterycharge (serial, enabled, dp1start, dp1end, dp2start, dp2end, chargecutoffsoc)Parameters:chargecutoffsoc(float) % to stop charging from the grid atenabled(bool) True to charge from the grid, False do notdp1start(datetime.time) The start time of charging period 1 (the minutes must be one of :00, :15, :30, :45)dp1end(datetime.time) The end time of charging period 1 (the minutes must be one of :00, :15, :30, :45)dp2start(datetime.time) The start time of charging period 2 (the minutes must be one of :00, :15, :30, :45)dp2end(datetime.time) The end time of charging period 2 (the minutes must be one of :00, :15, :30, :45)serial(str) The serial number of the battery/inverter.setbatterydischarge (serial, enabled, dp1start, dp1end, dp2start, dp2end, dischargecutoffsoc)Parameters:dischargecutoffsoc(float) % to stop discharging from the battery atenabled(bool) True to discharge from the battery, False do notdp1start(datetime.time) The start time of charging period 1 (the minutes must be one of :00, :15, :30, :45)dp1end(datetime.time) The end time of charging period 1 (the minutes must be one of :00, :15, :30, :45)dp2start(datetime.time) The start time of charging period 2 (the minutes must be one of :00, :15, :30, :45)dp2end(datetime.time) The end time of charging period 2 (the minutes must be one of :00, :15, :30, :45)serial(str) The serial number of the battery/inverter.
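A sketch of how these wrappers might be called; the constructor taking the Developer ID (AppID) and Developer Secret (AppSecret), the no-argument authenticate call and the serial number used below are assumptions for illustration only:
import asyncio
import datetime
from alphaess import alphaess   # assumed import path for the Open API wrapper

async def main():
    client = alphaess.alphaess("my-app-id", "my-app-secret")   # assumed constructor signature
    if not await client.authenticate():
        raise SystemExit("Open API authentication failed")
    print(await client.getdata())                              # per-system statistics, or None on error
    # enable grid charging from 01:00 to 05:00 and stop at 90% SOC (serial is hypothetical)
    await client.setbatterycharge("AL2002321010043", True,
                                  datetime.time(1, 0), datetime.time(5, 0),
                                  datetime.time(0, 0), datetime.time(0, 0), 90)

asyncio.run(main())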
|
alpha-factory
|
This program automatically generates alpha factors and filters
relatively good factors using back-testing methods. Time-consuming parts
are optimized withnumbapackage.Dependenciespython >= 3.5pandas >= 0.22.0numpy >= 1.14.0RNWS >= 0.2.1numba >= 0.38.0single_factor_model>=0.3.0IPython 5.1.0empyricalalphalensNote: It is best to use the latest version ofllvmlitein order to
makenumbawork properly. Otherwise it may cause a kernel-dies
situation.Exampleload packages and read in datafromalpha_factoryimportgenerator_class,get_memory_use_pct,cleanfromRNWSimportreadimportnumpyasnpimportpandasaspdstart=20180101end=20180331factor_path='.'frame_path='.'df=pd.read_csv(frame_path+'/frames.csv')## read in datare=read.read_df('./re',file_pattern='re',start=start,end=end)cap=read.read_df('./cap',file_pattern='cap',header=0,dat_col='cap',start=start,end=end)open_price,close,vwap,adj,high,low,volume,sus=read.read_df('./mkt_data',file_pattern='mkt',start=start,end=end,header=0,dat_col=['open','close','vwap','adjfactor','high','low','volume','sus'])ind1,ind2,ind3=read.read_df('./ind',file_pattern='ind',start=start,end=end,header=0,dat_col=['level1','level2','level3'])inx_weight=read.read_df('./ZZ800_weight','Stk_ZZ800',start=start,end=end,header=None,inx_col=1,dat_col=3)Note:framescontains columns as:df_name,equation,dependency,type, wheretypeincludesdf,cap,group. In this caseframes.csvhavedf_name:re,cap,open_price,close,vwap,high,low,volume,ind1,ind2,ind3.You can also read data by usingpd.read_csvdirectly depending on
how you store your data.start to generateparms={'re':close.mul(adj).pct_change(),'cap':cap,'open_price':open_price,'close':close,'vwap':vwap,'high':high,'low':low,'volume':volume,'ind1':ind1,'ind2':ind2,'ind3':ind3}withgenerator_class(df,factor_path,**parms)asgen:gen.generator(batch_size=3,name_start='a')gen.generator(batch_size=3,name_start='a')gen.output_df(path=frame_path+'/frames_new.csv')continue to generate with existing frames and factorswithgenerator_class(df,factor_path,**parms)asgen:gen.reload_df(path=frame_path+'/frames_new.csv')gen.reload_factors(align=True)clean()foriinrange(5):gen.generator(batch_size=2,name_start='a')print('step%dmemory usage:\t%.1f%%\n'%(i,get_memory_use_pct()))ifget_memory_use_pct()>80:breakgen.output_df(path=frame_path+'/frames_new2.csv')Note: It is very important toalignall factors and initial
dataframes before generating.you can also choose how to store your factors by settingstore_methodbacktesting with stratified sampling approach and ic-ir meansure after generationdata_box_param={'ind':ind1,'price':vwap*adjfactor,'sus':sus,'ind_weight':inx_weight,'path':'./databox'}back_test_param={'sharpe_ratio_thresh':3,'n':5,'out_path':'.','back_end':'loky','n_jobs':6,'detail_root_path':None,'double_side_cost':0.003,'rf':0.03}icir_param={'ir_thresh':0.4,'out_path':'.','back_end':'loky','n_jobs':6}withgenerator_class(df,factor_path,**parms)asgen:foriinrange(5):gen.generator(batch_size=2,name_start='a')gen.output_df(path=frame_path+'/frames_new.csv')gen.getOrCreate_databox(**data_box_param)gen.back_test(**back_test_param)gen.icir(**icir_param)clean()ifget_memory_use_pct()>90:print('Memory exceeded')breakTo temporarily save (and reload) factor data you can usecreate_tmp_memoryandreload_tmp_memorymethods. This is usually
used beforeback_testandicirto release more memory for
parallel running.generate script of factorsfromalpha_factoryimportwrite_fileimportpandasaspddf2=pd.read_csv(frame_path+'/frames_new.csv')write_file(df2,'script.py')locate a factorfromalpha_factory.utiliseimportget_factor_pathfactor_name='a0'path=get_factor_path(factor_path,factor_name)only whenstorage_method='byTime'use your own functionsTo use your own functions you need to append your code in classfunctionsfrombasic_functions.pyin the sourse file and also
append the corresponding names infunctions.csvfromdatafile
in the sourse file.After that you can setdebug=Trueingeneratorfunction to check
if there is any bug from all those functions. If indeed there is, a new
embeded ipython would be activated to help you find out what is going on
in the loop.
|
alphafed
|
AlphaMed
简体中文 | English
AlphaMed is a decentralized federated learning solution built on blockchain technology. It is designed to let medical institutions carry out joint modelling across multiple organisations while keeping their medical data private and secure. Institutions can train models on their local nodes and share encrypted parameters with the aggregation node under an anonymous identity, making federated learning safer and more trustworthy. Compared with traditional federated learning, the AlphaMed platform not only ensures that only legitimate, permissioned participants can join the network, but also lets nodes take part in joint modelling anonymously. In addition, the blockchain consensus algorithm ensures that the nodes in the network reach consistent decisions, so malicious participants and attacks such as data poisoning are rejected, giving federated learning stronger security guarantees. Throughout the federated learning process every participant is bound by smart contracts, and all events and operations are recorded on the blockchain's distributed ledger, where they are traceable and auditable, greatly improving the security and privacy protection of joint machine learning.
Get started: build your first federated learning task
|
alpha-filter
|
and greateralpha-filterWhen parsing, sometimes it is necessary to reduce the number of requests to the server, for example, our script collects links from pagination to the product every day, and then parses each product separately. But what to do if some time ago we already parsed these goods, why do it twice. alpha-filter will help filter out those ads that have already been read, and will return only new ones.Getting startingpipinstallalpha-filterUsagefromalphafilterimportfilter_ads,mark_as_processed,is_processed>>>first_parsing_urls=["https://www.example.com/1","https://www.example.com/2"]>>>new,old=filter_ads(first_parsing_urls)>>>new["https://www.example.com/1","https://www.example.com/2"]>>>old[]second_parsing_urls=first_parsing_urls# second parsing same with first>>>new,old=filter_ads(second_parsing_urls)>>>new[]>>>old[]>>>third_parsing_urls=["https://www.example.com/2","https://www.example.com/3"]>>>new,old=filter_ads(third_parsing_urls)>>>new["https://www.example.com/3"]>>>old["https://www.example.com/1"]Also you can mark your urls for some purposes>>>urls_for_mark=["https://www.example.com/2","https://www.example.com/3"]>>>mark_as_processed(urls_for_mark)>>>is_processed("https://www.example.com/2")True>>>is_processed("https://www.example.com/4")FalseIt uses a fast sqlite database to store urls. The database file ('ads.db') will be created in the root directoryWarning!!! this package has no protection against sql injection, do not use it for the external interface
|
alphafold
|
No description available on PyPI.
|
alphafold2mmopt
|
No description available on PyPI.
|
alphafold2-pytorch
|
No description available on PyPI.
|
alphafold-colabfold
|
No description available on PyPI.
|
alphafold-kagglefold
|
No description available on PyPI.
|
alphafoldmodel
|
AlphafoldModelis a package to parse Alphafold PDB structures and PAE Matrices into interactive Python objects.The package contains the following classes:AlphafoldModel: Parses the Alphafold model PDB file alongside its JSON PAE matrix into an interactive Python object. It allows declarative queries for a model's local PAE and plDDT metrics.ModelPDB:ModelPDBis the base class that carries the PDB parsing functionalities as well as residue-based nearest-neighbour search methods.ModelPAE:ModelPAEis the base class that carries the PAE parsing functionalities forAlphafoldModel.LoadedKDTree:LoadedKDTreeis a wrapper class which provides an interface over thescipy.spatial.KDTreeclass to instead store and return arbitrary objects with coordinate information.Explore the source file in:src/AlphafoldModel/alphafoldmodelInstallingpip install alphafoldmodel
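As a rough illustration of the LoadedKDTree idea only (this is not the package's actual class or API), a thin wrapper around scipy.spatial.KDTree can store arbitrary objects next to their coordinates and hand the objects back from a radius query:
import numpy as np
from scipy.spatial import KDTree

class ObjectKDTree:
    # toy 'loaded' KD-tree: queries return the stored objects, not bare indices
    def __init__(self, objects, coords):
        self._objects = list(objects)
        self._tree = KDTree(np.asarray(coords, dtype=float))

    def within_radius(self, point, radius):
        idx = self._tree.query_ball_point(point, r=radius)
        return [self._objects[i] for i in idx]

# usage with hypothetical residue records carrying CA coordinates
residues = [{"resi": 10, "plddt": 91.2}, {"resi": 11, "plddt": 88.7}]
coords = [(1.0, 2.0, 3.0), (4.5, 2.1, 3.3)]
print(ObjectKDTree(residues, coords).within_radius((1.2, 2.0, 3.1), radius=2.0))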
|
alpha-galaxy
|
No description available on PyPI.
|
alphagens
|
AlphaGensAn agent-environment based backtesting framework.Event-Driven backtestingclassBuyAndHold(BaseStrategy):def__init__(self,engine,broker,account):super().__init__(engine,broker,account)defbefore_trading_end(self):ifself.current_date==self.trade_dates[0]:self.account.order_target_pct_to(pd.Series(1,index=["000001"]))quick startclassBuyAndHold(BaseAgent):def__init__(self,data):super().__init__(data)self.i=-1deftake_action(self,state,date):self.buffer.append(state)self.i+=1ifself.i==0:target_positions=np.array([0.2foriinrange(5)])returntarget_positionselse:returnNO_ACTIONwhileTrue:try:print(f"current date is{env.current_date}")action=agent.take_action(state,env.current_date)next_state,reward,truncated,terminated,info=env.step(action)state=next_statereturns.append(reward)exceptIndexError:break
|
alphagot
|
alphagotnft test
|
alphagradient
|
DocumentationFor now, official documentation is hosted onnathanheidacker.github.ioIntroductionAlphaGradientis a package for creating and backtesting financial algorithms in native python. AlphaGradient implements asset and portfolio-analagous datastructures that interact in intuitive ways, and provides a framework for developing algorithms that utilize these objects in highly parallelized backtests. AlphaGradient is built on top of the industry's most widely adopted libraries, and requires (comparatively) little technical knowhow and time investment to pick up.To be able to use AlphaGradient effectively, you must know...basic financereallybasic pythonWhere other libraries might require you to be a programmer first and and a certified quantitative financial analyst second, AlphaGradient is truly geared towards hobbyist algorithmic traders who want to play around with ideas for financial algorithms. Within minutes, you can have a fully backtestable algorithm using nothing but basic python.AlphaGradient is to the algorithmic trader as console gaming is to the gamer; it gets the job done and comes at a bargain--batteries included. If you have even a passing interest in finance or algorithmic trading, this package makes it as easy as possible to get involved without having a background in computer science.InstallationThe source code is currently hosted on GitHub at:https://github.com/nathanheidacker/alphagradientBinary installers for the latest released version are available at thePython
Package Index (PyPI)# using pip
pip install alphagradientDependenciespandasnumpyaenumyfinancetqdmpathosLicenseAlphaGradient is licensed under theApache LicenseAbout Me and Contact InfoI'm a student from Northwestern University in Evanston, Illinois, studying cognitive science and artificial intelligence. I'm currently seeking employment. If you're interested in my work, feel free to contact me atthis email
|
alphagram
|
alphagram <= advanced pyrogramLess FloodWaitsFaster API CallsCallables Made Easy
|
alphaGrinder
|
No description available on PyPI.
|
alphahome
|
No description available on PyPI.
|
alpha-homora-v2
|
Alpha-Homora-V2-PythonA Python3.9+ package that wraps Alpha Homora V2 positions to simplify interaction with their smart contracts in your Python projects.Current FeaturesRewards Value | Position Value | Debt & Leverage Ratio | Pool Info | Current APYHarvest Rewards | Close Position | Add & Remove LiquidityCurrent Supported NetworksAvalancheEthereum(WIP)Table of ContentsInstallationUsageAvalancheUninstallationRoadmapContributionInstallationThis package is set up to be installed using thepippackage manager.Ensure that you have Python 3.9+ installed. If not, you can downloadhere. The syntax is dependent on features added in this recent version.Install the package using pip:pipinstall--upgradealpha-homora-v2After install, the package will be available to you in your local Python environment asalpha-homora-v2When updates are made to the package, the version will automatically be incremented so that in order to get the newest version on your end, you can simply use the same installation command and your pip will detect and update to the newest version.UsageHow to use the package:Avalanche:Import the AvalanchePosition class into your Python script:fromalpha_homora_v2importAvalanchePositionCreating anAvalanchePositioninstance requires the following:Your position ID (an integer)This ID should match your position on Alpha Homora, without the "#"your public wallet key(Optional)your private wallet keyYour private key is required to sign transactional methodsOnce you've gathered all of these variables, you can create the position instance like this example below:position=AvalanchePosition(position_id=11049,owner_wallet_address="0x...",owner_private_key="123abc456efg789hij...")# <- Optional - see step 4Alternatively, get all open positions (AvalanchePositionobjects) by owner wallet address:fromalpha_homora_v2.positionimportget_avax_positions_by_ownerpositions=get_avax_positions_by_owner(owner_address="owner_wallet_address",owner_private_key="owner_private_key",# <- Optional)# NOTE: Passing the private key is optional, but required if you want to use transactional methods on the returned AvalanchePosition object(s).Use your position instance(s) to interact with the Alpha Homora V2 position smart contracts on the network:Transactional Methods:Return aTransactionReceiptobject upon successPrivate wallet keyrequiredfor use to sign transactionsSee the documentation in theAvalanchePositionclass for function parameters.# Add liquidity to the LPposition.add(params)# Remove liquidity from the LPposition.remove(params)# Harvest available rewards:position.harvest()# Close the position:position.close()Informational MethodsReturn JSON dataPrivate wallet keynot requiredfor useSeeexamples/position_info.ipynbfor output examples.# Get position value (equity, debt, and position value):position.get_position_value()# Get value of harvestable rewards:position.get_rewards_value()# Get current debt ratio:position.get_debt_ratio()# Get the current leverage ratio:position.get_leverage_ratio()# Get current pool APYposition.get_current_apy()# Get underlying tokens and LP for the pool:position.get_pool_tokens()# Get the debt of each token in the position (token, debt_uint256, debt_token, debt_usd):position.get_token_debts()# Alternatively, get the debt of a single token:position.get_token_debts(token_address)# Get all token borrow rates from CREAM:position.get_cream_borrow_rates()# get LP pool info:position.poolUninstallation:Uninstall the package like any other Python package using the pip uninstall 
command:pipuninstallalpha-homora-v2Features:Avalanche:Get all open positions by owner wallet addressHarvest Position RewardsClose PositionGet position value of equity and debtGet current debt ratioGet outstanding rewards value in native rewards token and USDAdd LiquidityRemove LiquidityGet aggregate pool APY (incl. borrowAPY)Ethereum:Get all open positions by owner wallet addressHarvest Position RewardsClose PositionGet position value of equity and debtGet current debt ratioGet outstanding rewards value in native rewards token and USDAdd LiquidityRemove LiquidityGet aggregate pool APY (incl. borrowAPY)Contribution:Contributions are welcome! Please read thecontribution guidelinesto learn more about how to contribute to this project.
|
alphahutch
|
Common packages used for developing in Alpha Hutch.
|
alphai
|
AlphAIAlphAI is a high-level open-source Python toolkit designed for efficient AI development and in-depth GPU profiling. Supporting popular tensor libraries likePyTorchandJax, it optimizes developer operations on GPU servers and integrates seamlessly withAmerican Data Science Labs, offering robust control over remote Jupyter Lab servers and environment runtimes.FeaturesGPU Profiling and Analytics: Advanced GPU profiling capabilities to maximize resource efficiency and performance.Benchmarking Tools: Pythonic, easy-to-use tools for evaluating and comparing model performance.Remote Jupyter Lab Integration: Programmatic management of remote Jupyter Lab servers for enhanced productivity.Local Tensor Model Support: Streamlines the integration and management of tensor models from providers like Hugging Face.Tensor Engine Compatibility: Fully compatible with PyTorch, with upcoming support for Jax and TensorFlow.Quick StartInstallationInstall AlphAI easily using pip:pipinstallalphai# If you'd like to install torch in a Linux machine with CUDA-driverspipinstallalphai[torch]Authentication Pre-requisitesAlthough not strictly required to use the computational functions of the alphai package, it is recommended to create an account atAmerican Data Scienceand generate an API key to make use of your two free remote Jupyter Lab servers.You don't need an API key to use the GPU profiling, benchmarking, and generate modules.Basic UsageHere's a quick example to get started with AlphAI:fromalphaiimportAlphAI# Initialize AlphAIaai=AlphAI(api_key=os.environ.get("ALPHAI_API_KEY"),)# Start remote Jupyter Lab serversaai.start_server()# Upload to your server's file systemaai.upload("./main.py")# Start python kernel and run code remotelycode="print('Hello world!')"aai.run_code(code)Documentation and Detailed UsageFor more documentation and detailed instructions on how to use AlphAI's various features, please refer to ourDocumentation.Working with Tensor ModelsGuidance on integrating and leveraging tensor models.GPU Profiling and AnalyticsComprehensive features for GPU profiling and analytics.Integration with American Data Science LabsDiscover the benefits of integrating AlphAI with American Data Science Labs.System RequirementsPython 3.9+PyTorch (recommnended) or Jax (limited support)Linux OS i.e. Ubuntu 18.04+ContributingWe welcome contributions! Please see ourContribution Guidelinesfor more information.LicenseAlphAI is released under theApache 2.0license.Support and ContactFor support or inquiries about enterprise solutions, contact us [email protected].
|
alphai-api
|
AlphAI Application Programming InterfaceCall its functions with a running instance of AlphAI to control it
|
alpha-id-py
|
Python library that allows you to generate short alphanumeric strings from integers. It can be useful for creating compact, unique, and obfuscated identifiers.
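The package's own interface is not documented here, so the snippet below only illustrates the underlying idea of mapping integers to compact alphanumeric strings (base-62 encoding); it is not alpha-id-py's API:
ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"

def encode(n: int) -> str:
    # convert a non-negative integer to a short base-62 string
    if n == 0:
        return ALPHABET[0]
    chars = []
    while n:
        n, rem = divmod(n, 62)
        chars.append(ALPHABET[rem])
    return "".join(reversed(chars))

def decode(s: str) -> int:
    # inverse of encode()
    n = 0
    for ch in s:
        n = n * 62 + ALPHABET.index(ch)
    return n

assert decode(encode(123456789)) == 123456789   # encode(123456789) == "8m0Kx"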
|
alphainsight-jc6102
|
alphainsight_jc6102a Python package designed to facilitate financial data analysis using the Alpha Vantage API.Installation$pipinstallalphainsight_jc6102UsagePlease see Final_Project_Vignettet.ipynb to have more understanding of usage.ContributingInterested in contributing? Check out the contributing guidelines. Please note that this project is released with a Code of Conduct. By contributing to this project, you agree to abide by its terms.Licensealphainsight_jc6102was created by Jieyuan Chen. It is licensed under the terms of the MIT license.Creditsalphainsight_jc6102was created withcookiecutterand thepy-pkgs-cookiecuttertemplate.
|
alphainspect
|
AlphaInspect
A single-factor analysis tool modelled on alphalens.
Installation
pip install -i https://pypi.org/simple --upgrade alphainspect # official index
pip install -i https://pypi.tuna.tsinghua.edu.cn/simple --upgrade alphainspect # mirror inside China
Usage
Prepare the data by running data/prepare_data.py.
date, asset: the two required fields.
factor: the factor values, placed at the time the factor is observed.
forward return: forward returns; after they are computed they must be shifted back to the starting position.
Why shift(-n) the returns rather than shift(n) the factor? With multi-period returns, shifting the factor would mean shifting each factor several times, and since there are usually hundreds or thousands of factors the workload of shifting them all is huge, whereas there are far fewer return series. Projects such as expr_codegen and polars_ta are recommended.
Run the examples/demo1.py example to pop up a simple chart.
Run the examples/demo2.py example to pop up the full set of charts.
Run the examples/demo3.py example to produce HTML reports in parallel with multiple processes.
Run the examples/demo4.py example for event charts.
Selected charts
For how cumulative returns are computed, see cum_returns.md.
Differences between alphainspect and alphalens
forward_returns are not computed automatically. alphalens takes periods=(1, 5, 10) and internally computes the returns for holding 1, 5 and 10 periods; with alphainspect the user generates them externally, so the same factor can be compared under different trading schemes, for example:
RETURN_OC_1: enter at the T+1 open, exit at the T+1 close
RETURN_CC_1: enter at the T+0 open, exit at the T+1 close
RETURN_OO_1: enter at the T+1 open, exit at the T+2 open
RETURN_OO_5: enter at the T+1 open, exit at the T+6 open
No winsorization, standardization or industry neutralization is performed. alphalens' parameters take real effort to understand, and a beginner whose performance falls short has to dig into the source code to find out why; with alphainspect the user can precompute everything externally (e.g. F1_ORG, F1_ZS, F1_NEUT) and then pass the different factors in separately for comparison.
Capital is allocated with equal weights only. alphalens offers factor weighting, long/short and other settings; alphainspect provides only equal weighting, which keeps the implementation simple.
Returns are computed differently. alphalens geometrically averages multi-period simple returns into one period and then compounds them; alphainspect takes the user-supplied one-period simple returns, holds or rebalances as required, obtains the new equity and iterates onward, which is more precise.
What alphainspect and alphalens have in common
The data is organised the same way: both use long tables, keep the factor unshifted, compute the returns and shift them back to align with the time the factor was produced.
The cumulative-return model is similar: a factor is produced every period and held for more than one period, with the capital split into several slices that enter separately.
Slippage and fees are ignored. Single factors are meant to be combined into multi-factor models, and it would be a pity to drop some of them because of fees and slippage; fees should only be considered in the backtest stage after the factors have been combined.
Development
git clone https://github.com/wukan1986/alphainspect.git
cd alphainspect
pip install -e .
|
alphaiq-sdk
|
AlphaIQ Python SDKTo get access to the API,sign up here.Welcome to the AlphaIQ API! We offer Quantitative Linguistic Risk Indicators that enable investors to uncover hidden risks in forward-looking statements from management.To learn more about AlphaIQ,read about us.Review thePrivacy PolicyandTerms of Serviceon our website.InstallationRequirements.Python 2.7 and 3.4+Installation via Pippipinstallalphaiq-sdkThen import the package:importalphaiq_sdkInstallation via GitHubpipinstallgit+https://github.com/alphaiq-ai/python-sdk.git(you may need to runpipwith root permission:sudo pip install git+https://github.com/alphaiq-ai/python-sdk.git)Then import the package:importalphaiq_sdkInstallation via SetuptoolsInstall viaSetuptools.pythonsetup.pyinstall--user(orsudo python setup.py installto install the package for all users)Then import the package:importalphaiq_sdkGetting StartedIt is advised to setup a.envfile the store credentials. Documentation can be foundhere. To use the.envfile to store credentials, install thepython-dotenvpackage with pip:pipinstallpython-dotenvAn example of the contents of the.envfile are shown below:[email protected]
PASSWORD=VGhpcyBpcyBteSBwYXNzd29yZCBlbmNvZGVkIHRvIEJhc2U2NCBmb3JtYXQ=Please follow theinstallation procedureand then run the following to retrieve your bearer token for authentication to other API routes:importosfromdotenvimportload_dotenvimportalphaiq_sdkfromalphaiq_sdk.restimportApiException# Load the environment variables from the .env fileload_dotenv()EMAIL=os.getenv('EMAIL')PASSWORD=os.getenv('PASSWORD')# Define the API configuration, client object and API instanceconfiguration=alphaiq_sdk.Configuration(host='https://data.app.alphaiq.ai/api/v1')withalphaiq_sdk.ApiClient(configuration)asapi_client:# Make an instance of the API classapi_instance=alphaiq_sdk.InvestmentResearchersApi(api_client)# Define the values needed to authenticate to the APIcontent_type='application/json'# str |inline_object=alphaiq_sdk.InlineObject(email=EMAIL,password=PASSWORD)try:# Authenticate using your credentialsapi_response=api_instance.auth_gettoken_post(content_type=content_type,inline_object=inline_object)exceptApiExceptionase:# Log an exception if it occursprint("Exception when calling the API:%s\n"%e)# Extract your bearer token for authentication to other API pathsid_token=api_response.data.id_token# Add the bearer token to the configuration for authenticating other routessetattr(configuration,'access_token',id_token)Documentation for API EndpointsAll URIs are relative tohttps://data.app.alphaiq.ai/api/v1ClassMethodHTTP requestDescriptionInvestmentResearchersApiauth_gettoken_postPOST/auth/gettokenGetTokenInvestmentResearchersApicompany_compass_report_ticker_getGET/company/compass/report/{ticker}CompassReportPDFInvestmentResearchersApicompany_mapping_company_to_security_getGET/company-mapping/company-to-securityCompanyToSecurityInvestmentResearchersApicompany_spindex_get_latest_spindex_factors_getGET/company-spindex/getLatestSpindexFactorsGetLatestSpindexFactorsInvestmentResearchersApicompany_spindex_get_latest_spindex_overall_risk_getGET/company-spindex/getLatestSpindexOverallRiskGetLatestSpindexOverallRiskInvestmentResearchersApicompany_spindex_get_timeseries_spindex_factors_getGET/company-spindex/getTimeseriesSpindexFactorsGetTimeseriesSpindexFactorsInvestmentResearchersApicompany_spindex_get_timeseries_spindex_overall_risk_getGET/company-spindex/getTimeseriesSpindexOverallRiskGetTimeseriesSpindexOverallRiskInvestmentResearchersApicompany_spinsights_report_ticker_getGET/company/spinsights/report/{ticker}SpinsightsReportPDFInvestmentResearchersApifactor_library_compass_questions_getGET/factor-library/compass-questionsGetCompassQuestionsInvestmentResearchersApifactor_library_spindex_factors_getGET/factor-library/spindex-factorsGetSpindexFactorsInvestmentResearchersApigenerative_company_compass_report_content_ticker_getGET/generative/company/compass/reportContent/{ticker}GetCompassReportContentInvestmentResearchersApigenerative_company_question_answer_ticker_getGET/generative/company/questionAnswer/{ticker}GetCompassExplorerQuestionAnswerInvestmentResearchersApigenerative_company_spinsights_explorer_ticker_getGET/generative/company/spinsights/explorer/{ticker}GetSpinsightsExplorerInvestmentResearchersApigenerative_company_spinsights_report_content_ticker_getGET/generative/company/spinsights/reportContent/{ticker}GetSpinsightsReportContentDocumentation For 
ModelsCategoryInlineObjectInlineObject1InlineObject2InlineObject3InlineObject4InlineResponse200InlineResponse2001InlineResponse20010InlineResponse20010DataInlineResponse20011InlineResponse20011DataInlineResponse20011DataLvl2IndustriesWithLatestAvgOverallriskInlineResponse20011DataLvl3IndustriesWithLatestAvgOverallriskInlineResponse20012InlineResponse20012ConsumerProductsAndServicesInlineResponse20012DataInlineResponse20012EnergyInlineResponse20012FinancialsInlineResponse20012FoodInlineResponse20012HealthcareInlineResponse20012IndustrialsInlineResponse20012InformationInlineResponse20012InformationToolsInlineResponse20013InlineResponse20013DataInlineResponse20014InlineResponse20014DataInlineResponse20015InlineResponse20015DataInlineResponse20016InlineResponse20016Company1InlineResponse20016DataInlineResponse20017InlineResponse20017DataInlineResponse20018InlineResponse20018DataInlineResponse20018DataChevronCorpCVXInlineResponse20019InlineResponse2001DataInlineResponse2002InlineResponse20020InlineResponse20020DataInlineResponse20021InlineResponse20021DataInlineResponse20022InlineResponse20022DataInlineResponse20023InlineResponse20023DataInlineResponse20024InlineResponse20024DataInlineResponse20024DataSpinsightsContentInlineResponse20025InlineResponse20025DataInlineResponse20025DataCompassContentInlineResponse20026InlineResponse20026DataInlineResponse20026DataQuestionAnswerInlineResponse20027InlineResponse20027DataInlineResponse20028InlineResponse20028DataInlineResponse20029InlineResponse20029DataInlineResponse2002DataInlineResponse2002DataQuestionContextInlineResponse2002DataQuestionsInlineResponse2003InlineResponse2003DataInlineResponse2003DataSpinsightsExplorerInlineResponse2004InlineResponse2005InlineResponse2005DataInlineResponse2006InlineResponse2007InlineResponse2007DataInlineResponse2007DataHighRiskCompaniesInlineResponse2008InlineResponse2008DataInlineResponse2008DataFinancialsInlineResponse2008DataFinancialsDrillDownIndustriesDetailsInlineResponse2008DataFinancialsRealEstateInlineResponse2008DataFinancialsRealEstateDrillDownIndustriesDetailsInlineResponse2008DataFinancialsRealEstateRealEstateRentalInlineResponse2008DataFinancialsRealEstateRealEstateRentalHighRiskCompaniesInlineResponse2008DataIndustriesDetailsInlineResponse2009InlineResponse2009DataInlineResponse2009DataHighriskIndustriesInlineResponse200DataInlineResponse405InlineResponse405ErrorsPetTag
|
alphaj-krxcralwer
|
A KRX Crawler
|
alphaj-krx-crawler
|
A KRX Crawler
|
alphaj-naver-stock-cralwer
|
A Naver Stock Crawler
|
alpha-kentaurus
|
Not what you're looking for. Ah ah ah, you didn't say the magic word
|
alpha-kentaurus-macos
|
Not what you're looking for. Ah ah ah, you didn't say the magic word
|
alpha-kentaurus-stone
|
Not what you're looking for. Ah ah ah, you didn't say the magic word
|
alphalens
|
No description available on PyPI.
|
alphalens-eqi
|
No description available on PyPI.
|
alphalens-qa
|
No description available on PyPI.
|
alphalens-reloaded
|
Alphalens is a Python library for performance analysis of predictive
(alpha) stock factors. Alphalens works great with theZiplineopen source backtesting library, andPyfoliowhich provides performance and risk analysis of financial portfolios.The main function of Alphalens is to surface the most relevant statistics and plots about an alpha factor, including:Returns AnalysisInformation Coefficient AnalysisTurnover AnalysisGrouped AnalysisGetting startedWith a signal and pricing data creating a factor "tear sheet" is a two step process:importalphalens# Ingest and format datafactor_data=alphalens.utils.get_clean_factor_and_forward_returns(my_factor,pricing,quantiles=5,groupby=ticker_sector,groupby_labels=sector_names)# Run analysisalphalens.tears.create_full_tear_sheet(factor_data)Learn moreCheck out theexample notebooksfor more on how to read and use the factor tear sheet.InstallationInstall with pip:pip install alphalens-reloadedInstall with conda:conda install -c ml4t alphalens-reloadedInstall from the master branch of Alphalens repository (development code):pip install git+https://github.com/stefan-jansen/alphalens-reloadedAlphalens depends on:matplotlibnumpypandasscipyseabornstatsmodelsUsageA good way to get started is to run the examples in aJupyter notebook.To get set up with an example, you can:Run a Jupyter notebook server via:jupyternotebookFrom the notebook list page(usually found athttp://localhost:8888/), navigate over to the examples directory, and open any file with a .ipynb extension.Execute the code in a notebook cell by clicking on it and hitting Shift+Enter.Questions?If you find a bug, feel free to open an issue on ourgithub tracker.ContributeIf you want to contribute, a great place to start would be thehelp-wanted issues.CreditsAndrew CampbellJames ChristopherThomas WieckiJonathan LarkinJessica Stauth ([email protected])Taso PetridisFor a full list of contributors see thecontributors page.Example Tear SheetsExample factor courtesy ofExtractAlphaPeformance Metrics TablesReturns Tear SheetInformation Coefficient Tear SheetSector Tear Sheet
|
alphalens-tej
|
Alphalens is a Python library for performance analysis of predictive
(alpha) stock factors. Alphalens works great with theZiplineopen source backtesting library, andPyfoliowhich provides performance and risk analysis of financial portfolios.The main function of Alphalens is to surface the most relevant statistics and plots about an alpha factor, including:Returns AnalysisInformation Coefficient AnalysisTurnover AnalysisGrouped AnalysisGetting startedWith a signal and pricing data creating a factor "tear sheet" is a two step process:importalphalens# Ingest and format datafactor_data=alphalens.utils.get_clean_factor_and_forward_returns(my_factor,pricing,quantiles=5,groupby=ticker_sector,groupby_labels=sector_names)# Run analysisalphalens.tears.create_full_tear_sheet(factor_data)Learn moreCheck out theexample notebooksfor more on how to read and use the factor tear sheet.InstallationInstall with pip:pip install alphalens-reloadedInstall with conda:conda install -c ml4t alphalens-reloadedInstall from the master branch of Alphalens repository (development code):pip install git+https://github.com/stefan-jansen/alphalens-reloadedAlphalens depends on:matplotlibnumpypandasscipyseabornstatsmodelsUsageA good way to get started is to run the examples in aJupyter notebook.To get set up with an example, you can:Run a Jupyter notebook server via:jupyternotebookFrom the notebook list page(usually found athttp://localhost:8888/), navigate over to the examples directory, and open any file with a .ipynb extension.Execute the code in a notebook cell by clicking on it and hitting Shift+Enter.Questions?If you find a bug, feel free to open an issue on ourgithub tracker.ContributeIf you want to contribute, a great place to start would be thehelp-wanted issues.CreditsAndrew CampbellJames ChristopherThomas WieckiJonathan LarkinJessica Stauth ([email protected])Taso PetridisFor a full list of contributors see thecontributors page.Example Tear SheetsExample factor courtesy ofExtractAlphaPeformance Metrics TablesReturns Tear SheetInformation Coefficient Tear SheetSector Tear Sheet
|
alphalib
|
Welcome to alphalibA library for your daily data engineering and data science routines.This file will become your README and also the index of your documentation.InstallpipinstallalphalibHow to useIngest from Excel toaccountstable in PostgreSQL# Ingest from Excel to `accounts` table in PostgreSQLexcel_source=file_sources.get(FileSource.Excel,file_path="data/accounts.xlsx")config={'host':'localhost','port':5432,'db':'testdb','user':'user1','password':'userpwd'}pgsql_target=db_targets.get(DatabaseTarget.PostgreSQL,**config)ingest(excel_source,pgsql_target,'accounts')2020-12-12 21:26:59,943 INFO(): {'user_id': INTEGER(), 'username': VARCHAR(length=50), 'password': VARCHAR(length=50), 'email': VARCHAR(length=255), 'created_on': TIMESTAMP(), 'last_login': TIMESTAMP()}
Total records in data/accounts.xlsx - 100
user_id - 100
username - 100
password - 100
email - 100
created_on - 1
last_login - 1Ingest from CSV toaccountstable in MySQL# Ingest from CSV to `accounts` table in MySQLcsv_source=file_sources.get(FileSource.CSV,file_path="data/accounts.csv")config={'host':'localhost','port':3306,'db':'testdb','user':'user1','password':'userpwd'}mysql_target=db_targets.get(DatabaseTarget.MySQL,**config)ingest(csv_source,mysql_target,'accounts')2020-12-12 21:35:29,017 INFO(): {'user_id': INTEGER(), 'username': VARCHAR(length=50), 'password': VARCHAR(length=50), 'email': VARCHAR(length=255), 'created_on': TIMESTAMP(), 'last_login': TIMESTAMP()}
Total records in data/accounts.csv - 100
user_id - 100
username - 100
password - 100
email - 100
created_on - 1
last_login - 1
|
alphalogic-api
|
Alphalogic APIThe official library to develop the Alphalogic system adapters with Python2.Documentation
|
alphalogic-api3
|
Alphalogic APIThe official library to develop the Alphalogic system adapters with Python3.Documentation
|
alphamap
|
AlphaMapA python-based library that enables the exploration of proteomic datasets on the peptide level.AboutAlphaMap is a tool for peptide level MS data exploration. You can load and inspect MS data analyzed byAlphaPept, DIA-NN, MaxQuant, Spectronaut or FragPipe. Uploaded data is processed and formatted for visual inspection of the sequence coverage of any selected protein and its identified post-translational modifications (PTMs). UniProt information is available to directly annotate sequence regions of interest such as protein domains, secondary structures, sequence variants, known PTMs, etc. Additionally, users can select proteases to further evaluate the distribution of proteolytic cleavage sites across a protein sequence. The functionality of AlphaMap can be accessed via an intuitive graphical user interface or - more flexibly - as a Python package that allows its integration into common analysis workflows for data visualization.LicenseAlphaMap was developed by theMann Labs at the Max Planck Institute of Biochemistryand is freely available with anApache License.InstallationAlphaMap can be installed and used on Windows and MacOS.
There are three different types of installation possible:One-click GUI installer:Choose this installation if you only want the GUI and/or keep things as simple as possible.Pip installer:Choose this installation if you want to use AlphaMap as a Python package in an existing Python 3.8 environment (e.g. a Jupyter notebook). If needed, the GUI can be installed with pip as well.Developer installer:Choose this installation if you are familiar with CLI tools,condaand Python. This installation allows access to all available features of AlphaMap and even allows to modify its source code directly.One-click GUIThe GUI of AlphaMap is a completely stand-alone tool that requires no knowledge of Python. Click on one of the links below to download the latest release for:WindowsMacOSIMPORTANT: Please refer to theGUI manualfor detailed instructions on the installation, troubleshooting and usage of the stand-alone AlphaMap GUI.IMPORTANT: The one-click-installers on macOS and Windows requireat least macOS Catalina (10.15) or higherandWindows 10respectively. For Windows, a system update might be necessary in case older versions do not work. To prevent installation errors onWindows, we recommenduninstalling the previous AlphaMap version before installing a new one.PipAlphaMap can be installed in an existing Python 3.8 environment with a singlebashcommand.Thisbashcommand can also be run directly from within a Jupyter notebook by prepending it with a!.pipinstallalphamapWhen a new version of AlphaMap becomes available, the old version can easily be upgraded by running e.g. the command again with an additional--upgradeflag:pipinstallalphamap--upgradeNOTE: When installing withpip, UniProt information is not included. Upon first usage of a specific Organism, its information will be automatically downloaded from UniProt.DeveloperAlphaMap can also be installed in editable (i.e. developer) mode with a fewbashcommands. This allows to fully customize the software and even modify the source code to your specific needs. When an editable Python package is installed, its source code is stored in a transparent location of your choice. While optional, it is advised to first (create and) navigate to e.g. a general software folder:mkdir~/folder/where/to/install/softwarecd~/folder/where/to/install/softwareNext, download the AlphaMap repository from GitHub either directly or with agitcommand. This creates a new AlphaMap subfolder in your current directory.gitclonehttps://github.com/MannLabs/alphamap.gitcdalphamapFor any Python package, it is highly recommended to use aconda virtual environment. AlphaMap can either be installed in a new conda environment or in an already existing environment.Note that dependency conflicts can occur with already existing packages in the latter case! Once a conda environment is activated, AlphaMap and all itsdependenciesneed to be installed.condacreate-nalphamappython=3.8-y
condaactivatealphamap
pipinstall-e.By using the editable flag-e, all modifications to the AlphaMapsource code folderare directly reflected when running AlphaMap. Note that the AlphaMap folder cannot be moved and/or renamed if an editable version is installed.When using Jupyter notebooks and multiple conda environments direcly from the terminal, it is recommended toconda install nb_conda_kernelsin the conda base environment. Hereafter, running ajupyter notebookfrom the conda base environment should have apython [conda env: alphamap]kernel available, in addition to all other conda kernels in which the commandconda install ipykernelwas run.Test dataAlphaMap has direct data import options for AlphaPept, DIA-NN, MaxQuant, Spectronaut and FragPipe.AlphaPeptAlphaMap takes theresults.csvfile from AlphaPept as input format. An example is available fordownload here.DIA-NNAlphaMap takes the peptide-level output .tsv file from DIA-NN as input format. An example is available fordownload here.MaxQuantAlphaMap takes theevidence.txtfile from MaxQuant as input format. A reduced example file is available fordownload here.SpectronautAlphaMap takes Spectronaut results exported in normal long format (.csv or .tsv) as input. Necessary columns include:PEP.AllOccuringProteinAccessionsEG.ModifiedSequenceR.FileNameTo ensure proper formatting of the Spectronaut output, an export scheme is available fordownload here.A reduced example file is also available fordownload here.FragPipeThere are two options to visualize data analyzed by FragPipe:Upload individual"peptide.tsv"files for single MS runs. A reduced example file is available fordownload here.Upload the"combined_peptide.tsv"file with the joint information about peptides identified in all runs (there is an option to select the experiment(s)). Be aware that the combined_peptide.tsv does not provide information about PTM localization. PTMs are therefore not shown for this option. A reduced example file is available fordownload here.UsageThere are two ways to use AlphaMap:GUI:This allows to interactively import and visualize the data.Python:This allows to access data and explore it interactively with custom code.NOTE: The first time you use a fresh installation of AlphaMap, it is often quite slow because some functions might still need compilation on your local operating system and architecture. Subsequent use should be a lot faster.GUIPlease refer to theGUI manualfor detailed instructions on the installation and usage of the stand-alone AlphaMap GUI.If the GUI was not installed through a one-click GUI installer, it can be activated with the followingbashcommand:alphamapNote that this needs to be prepended with a!when you want to run this from within a Jupyter notebook. When the command is run directly from the command-line, make sure you use the right environment (activate it with e.g.conda activate alphamapor set an alias to the binary executable).Python and Jupyter notebooksAlphaMap can be imported as a Python package into any Python script or notebook with the commandimport alphamap.A Jupyter notebook tutorial'Workflow.ipynb'is available to demonstrate how to load AlphaMap as python module and hot to visualize data interactively. 
When running locally it provides interactive plots, which are not rendered on GitHub.AlphaMap includes fasta files and UniProt annotations for: 'Human', 'Mouse', 'Rat', 'Cow', 'Zebrafish', 'Drosophila', 'Caenorhabditis elegans', 'Slime mold', 'Arabidopsis thaliana', 'Rice', 'Escherichia coli', 'Bacillus subtilis', 'Saccharomyces cerevisiae', 'SARS-COV' and 'SARS-COV-2'. If additional organisms are of interest, corresponding .fasta files and sequence annotations can be downloaded directly from UniProt. A Jupyter notebook tutorial'Uniprot_preprocessing.ipynb'shows how to load and format a UniProt annotation file.
|
alphamed-federated
|
AlphaMed
AlphaMed is a decentralized federated learning solution built on blockchain technology. It is designed to let medical institutions carry out joint modelling across multiple organisations while keeping their medical data private and secure. Institutions can train models on their local nodes and share encrypted parameters with the aggregation node under an anonymous identity, making federated learning safer and more trustworthy. Compared with traditional federated learning, the AlphaMed platform not only ensures that only legitimate, permissioned participants can join the network, but also lets nodes take part in joint modelling anonymously. In addition, the blockchain consensus algorithm ensures that the nodes in the network reach consistent decisions, so malicious participants and attacks such as data poisoning are rejected, giving federated learning stronger security guarantees. Throughout the federated learning process every participant is bound by smart contracts, and all events and operations are recorded on the blockchain's distributed ledger, where they are traceable and auditable, greatly improving the security and privacy protection of joint machine learning.
Get started: build your first federated learning task
Build heterogeneous federated learning tasks across data sources with different structures
If the algorithms preinstalled on the AlphaMed platform still cannot meet your business needs, you can design your own federated learning algorithm and run it on top of the AlphaMed platform. This shows how to define a custom federated learning algorithm and actually execute the federated learning task through the platform. To help developers debug their own code, the platform also provides a simulation environment that mimics the real runtime environment, including the federated learning runtime, on the local node.
Besides supporting algorithm engineers who develop federated learning models, the AlphaMed platform also supports pre-trained models for model users. Compared with traditional pre-trained models, the pre-trained model support on the AlphaMed platform is more powerful and easier to use. On the platform you can not only find and download a pre-trained model you like, but also fine-tune and deploy it with your private data so that it fits your own business data better. The platform ships with a number of pre-trained models and also lets third-party developers build and upload their own. This shows how to design your own pre-trained model.
Project layout
src/alphafed
├── auto_ml                 AutoML module
│   └── cvat                CVAT tooling
├── contractor              contract messaging tools
├── data_channel            data transfer tools
├── docs                    documentation
│   ├── auto_ml             AutoML docs
│   ├── customized_scheduler  custom scheduler docs
│   ├── fed_avg             FedAvg horizontal federated learning docs
│   ├── hetero_nn           HeteroNN heterogeneous federated learning docs
│   ├── mock                simulation and debugging docs
│   └── tutorial            tutorial examples covering all main features
├── examples                test scripts / example code
├── fed_avg                 FedAvg horizontal federated learning module
├── hetero_nn               HeteroNN heterogeneous federated learning module
│   └── psi                 private set intersection module
├── secure                  security tools
├── fs.py                   filesystem utilities
├── loggers.py              logging utilities
├── mock.py                 simulation and debugging utilities
└── utils.py                other utilities
|
alphaMic
|
Failed to fetch description. HTTP Status Code: 404
|
alphamini
|
AlphaMini PythonSDKDocuments:ChineseEnglish
|
alphaml
|
No description available on PyPI.
|
alpha-nest
|
No description available on PyPI.
|
alphanet
|
AlphaNetA Recurrent Neural Network For Predicting Stock PricesAlphaNetV2Below is the structure of AlphaNetV2input: (batch_size, history time steps, features)
stride = 5
input -> expand features -> BN -> LSTM -> BN -> Dense(linear)AlphaNetV3Below is the structure of AlphaNetV3input: (batch_size, history time steps, features)
stride = 5
+-> expand features -> BN -> GRU -> BN -+
input --| stride = 10 |- concat -> Dense(linear)
+-> expand features -> BN -> GRU -> BN -+InstallationEither clone this repository or just use pypi:pip install alphanet.The pypi project is here:alphanet.ExampleStep 0: import alphanetfromalphanetimportAlphaNetV3,load_modelfromalphanet.dataimportTrainValData,TimeSeriesDatafromalphanet.metricsimportUpDownAccuracyStep 1: build data# read datadf=pd.read_csv("some_data.csv")# compute label (future return)df_future_return=here_you_compute_it_by_your_selfdf=df_future_return.merge(df,how="inner",left_on=["date","security_code"],right_on=["date","security_code"])# create an empty liststock_data_list=[]# put each stock into the list using TimeSeriesData() classsecurity_codes=df["security_code"].unique()forcodeinsecurity_codes:table_part=df.loc[df["security_code"]==code,:]stock_data_list.append(TimeSeriesData(dates=table_part["date"].values,# date columndata=table_part.iloc[:,3:].values,# data columnslabels=table_part["future_10_cum_return"].values))# label column# put stock list into TrainValData() class, specify dataset lengthstrain_val_data=TrainValData(time_series_list=stock_data_list,train_length=1200,# 1200 trading days for trainingvalidate_length=150,# 150 trading days for validationhistory_length=30,# each input contains 30 days of historysample_step=2,# jump to days forward for each samplingtrain_val_gap=10# leave a 10-day gap between training and validationStep 2: get datasets from desired period# get one training period that start from 20110131train,val,dates_info=train_val_data.get(20110131,order="by_date")print(dates_info)Step 3: compile the model and start training# get an AlphaNetV3 instancemodel=AlphaNetV3(l2=0.001,dropout=0.0)# you may use UpDownAccuracy() here to evaluate performancemodel.compile(metrics=[tf.keras.metrics.RootMeanSquaredError(),UpDownAccuracy()]# trainmodel.fit(train.batch(500).cache(),validation_data=val.batch(500).cache(),epochs=100)Step 4: save and loadsaving# save model by save methodmodel.save("path_to_your_model")# or just save weightsmodel.save_weights("path_to_your_weights")loading# load entire model using load_model() from alphanet modulemodel=load_model("path_to_your_model")# only load weights by first creating a model instancemodel=AlphaNetV3(l2=0.001,dropout=0.0)model.load_weights("path_to_your_weights")Note: onlyalphanet.load_model(filename)recognizes customUpDownAccuracy.
If you do not useUpDownAccuracy,
you canalsousetf.keras.models.load_model(filename).DocumentationFor detailed documentation, go toalphanet documentation.For implementation details, go toalphanet source folder.One Little CaveatThe model expands features quadratically.
So, if you have 5 features, it will be expanded to more than 50 features (for AlphaNetV3),
and if you have 10 features, it will be expanded to more than 200 features.
Therefore, do not put too many features inside.One More Notealphanet.datamodule is completely independent fromalphanetmodule,
and can be a useful tool for training any timeseries neural network.
|
alphanetworks-obs-py
|
OBS Python SDK built by alphanetworks mx team
|
alphaneural
|
Machine Learning EvxThis is a simplified version of thealphaneuralpackage, used to generate buy and sell signals for crypto and conventional stock markets based on the above article on Medium.InstallationInstall alphaneural withpython3 -m pip install alphaneuralUsageIn your Python script simply import the module and use it as follows:from alphaneural.alphaneural import alpha_param
print(alpha_param(df,'mem'))The above method takes OHLCV data and an option specifying the source as 'file' (a file path) or 'mem' (an in-memory variable). This will result in a single parameter named alpha.Testing an entire dataframeTesting a dataframe for correct buy and sell signals is as simple as applying a lambda function as follows:import pandas as pd
from alphaneural import alpha_param
df = pd.read_csv('../../../path/to_your.csv')
def getEnterSignal(data,src):
alpha = alpha_param(data,f'{src}')
return alpha
mainsig = getEnterSignal(df,'mem')
df['sig'] = df['alpha'].apply(lambda x: 1 if x < mainsig else 0)Alphaneural can be applied to a file as follows:from alphaneural import alpha_param
alpha = alpha_param('../../../path/to_your.csv','file')WarningThis is not financial advice. Alphaneural is still in its preliminary stages. Use it at your own risk.
|